A Selective Overview of Variable Selection in High Dimensional Feature Space
Fan, Jianqing
2010-01-01
High dimensional statistical problems arise from diverse fields of scientific research and technological development. Variable selection plays a pivotal role in contemporary statistical learning and scientific discoveries. The traditional idea of best subset selection methods, which can be regarded as a specific form of penalized likelihood, is computationally too expensive for many modern statistical applications. Other forms of penalized likelihood methods have been successfully developed over the last decade to cope with high dimensionality. They have been widely applied for simultaneously selecting important variables and estimating their effects in high dimensional statistical inference. In this article, we present a brief account of the recent developments of theory, methods, and implementations for high dimensional variable selection. Questions of what limits of dimensionality such methods can handle, what role penalty functions play, and what their statistical properties are rapidly drive the advances of the field. The properties of non-concave penalized likelihood and its roles in high dimensional statistical modeling are emphasized. We also review some recent advances in ultra-high dimensional variable selection, with emphasis on independence screening and two-scale methods. PMID:21572976
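As a concrete illustration (not from the article itself), the simplest member of the penalized-likelihood family the review surveys is the ℓ1-penalized least-squares problem, whose coordinate-descent update reduces to a soft-thresholding step. This is what sets coefficients exactly to zero and thereby performs variable selection. A minimal pure-Python sketch, under the standardized-design setup:

```python
def soft_threshold(z, lam):
    """Closed-form solution of min_b 0.5*(b - z)**2 + lam*|b|."""
    if z > lam:
        return z - lam
    if z < -lam:
        return z + lam
    return 0.0

def lasso_cd(X, y, lam, n_iter=100):
    """Coordinate-descent lasso: minimize (1/2n)*||y - Xb||^2 + lam*||b||_1.
    Each coordinate update is a soft-thresholding step, which shrinks small
    coefficients to exactly zero (variable selection)."""
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            # partial residual with feature j's contribution removed
            r = [y[i] - sum(beta[k] * X[i][k] for k in range(p) if k != j)
                 for i in range(n)]
            z = sum(X[i][j] * r[i] for i in range(n)) / n
            denom = sum(X[i][j] ** 2 for i in range(n)) / n
            beta[j] = soft_threshold(z, lam) / denom
    return beta
```

On an orthogonal toy design, the penalty shrinks the relevant coefficient and zeroes out the irrelevant one, which is the selection behavior the abstract refers to.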
High throughput selection of antibiotic-resistant transgenic Arabidopsis plants.
Nagashima, Yukihiro; Koiwa, Hisashi
2017-05-15
Kanamycin resistance is the most frequently used antibiotic-resistance marker for Arabidopsis transformation; however, this method frequently allows escape of untransformed plants, particularly at high seedling density during selection. Here we developed a robust high-density selection method using top agar for Arabidopsis thaliana. Top agar effectively suppressed the growth of untransformed wild-type plants on selection media at high density. Survival of the transformed plants during selection was confirmed by production of green true leaves and expression of a firefly luciferase reporter gene. The top agar method allowed selection from a large number of seeds in Arabidopsis transformation. Copyright © 2017 Elsevier Inc. All rights reserved.
Scheduler for multiprocessor system switch with selective pairing
Gara, Alan; Gschwind, Michael Karl; Salapura, Valentina
2015-01-06
System, method and computer program product for scheduling threads in a multiprocessing system with selective pairing of processor cores for increased processing reliability. A selective pairing facility is provided that selectively connects, i.e., pairs, multiple microprocessor or processor cores to provide one highly reliable thread (or thread group). The method configures the selective pairing facility to use checking to provide one highly reliable thread for high-reliability operation and to allocate threads to the corresponding paired processor cores when those threads indicate a need for hardware checking. The method also configures the selective pairing facility to provide multiple independent cores and to allocate threads to the corresponding processor cores when those threads indicate inherent resilience.
NASA Astrophysics Data System (ADS)
Song, Yunquan; Lin, Lu; Jian, Ling
2016-07-01
The single-index varying-coefficient model is an important mathematical tool for modeling nonlinear phenomena in science and engineering. In this paper, we develop a variable selection method for high-dimensional single-index varying-coefficient models using a shrinkage idea. The proposed procedure can simultaneously select significant nonparametric and parametric components. Under defined regularity conditions, with appropriate selection of tuning parameters, the consistency of the variable selection procedure and the oracle property of the estimators are established. Moreover, owing to the robustness of the check loss function to outliers in finite samples, our proposed variable selection method is more robust than those based on the least squares criterion. Finally, the method is illustrated with numerical simulations.
A Selective Review of Group Selection in High-Dimensional Models
Huang, Jian; Breheny, Patrick; Ma, Shuangge
2013-01-01
Grouping structures arise naturally in many statistical modeling problems. Several methods have been proposed for variable selection that respect grouping structure in variables. Examples include the group LASSO and several concave group selection methods. In this article, we give a selective review of group selection concerning methodological developments, theoretical properties and computational algorithms. We pay particular attention to group selection methods involving concave penalties. We address both group selection and bi-level selection methods. We describe several applications of these methods in nonparametric additive models, semiparametric regression, seemingly unrelated regressions, genomic data analysis and genome wide association studies. We also highlight some issues that require further study. PMID:24174707
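A hedged sketch (not taken from the review) of the key operation behind the group LASSO mentioned above: the block-wise soft-thresholding operator, which shrinks a whole group of coefficients together and sets the entire group to zero when its norm falls below the penalty level:

```python
import math

def group_soft_threshold(z, lam):
    """Block soft-thresholding: shrinks a whole coefficient group toward
    zero, setting it exactly to zero when its Euclidean norm is <= lam.
    This is what makes the group LASSO select variables group-wise."""
    norm = math.sqrt(sum(v * v for v in z))
    if norm <= lam:
        return [0.0] * len(z)
    scale = 1.0 - lam / norm
    return [scale * v for v in z]
```

Concave group penalties such as those the review emphasizes replace the constant shrinkage amount `lam` with one that tapers off for large-norm groups, reducing estimation bias.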
NASA Technical Reports Server (NTRS)
Yun, Hee-Mann (Inventor); DiCarlo, James A. (Inventor)
2014-01-01
Methods are disclosed for producing architectural preforms and high-temperature composite structures containing high-strength ceramic fibers with reduced preforming stresses within each fiber, with an in-situ grown coating on each fiber surface, with reduced boron within the bulk of each fiber, and with improved tensile creep and rupture resistance properties for each fiber. The methods include the steps of preparing an original sample of a preform formed from a pre-selected high-strength silicon carbide ceramic fiber type, placing the original sample in a processing furnace under a pre-selected preforming stress state and thermally treating the sample in the processing furnace at a pre-selected processing temperature and hold time in a processing gas having a pre-selected composition, pressure, and flow rate. For the high-temperature composite structures, the method includes additional steps of depositing a thin interphase coating on the surface of each fiber and forming a ceramic or carbon-based matrix within the sample.
Flow “Fine” Synthesis: High Yielding and Selective Organic Synthesis by Flow Methods
2015-01-01
The concept of flow “fine” synthesis, that is, high yielding and selective organic synthesis by flow methods, is described. Some examples of flow “fine” synthesis of natural products and APIs are discussed. Flow methods have several advantages over batch methods in terms of environmental compatibility, efficiency, and safety. However, synthesis by flow methods is more difficult than synthesis by batch methods. Indeed, it has been considered that flow methods can be applied to the production of simple gases but that they are difficult to apply to the synthesis of complex molecules such as natural products and APIs. Therefore, organic synthesis of such complex molecules has been conducted by batch methods. On the other hand, syntheses and reactions that attain high yields and high selectivities by flow methods are increasingly reported. Flow methods are leading candidates for the next generation of manufacturing methods that can mitigate environmental concerns toward a sustainable society. PMID:26337828
Method for selective CMP of polysilicon
NASA Technical Reports Server (NTRS)
Babu, Suryadevara V. (Inventor); Natarajan, Anita (Inventor); Hegde, Sharath (Inventor)
2010-01-01
A method of removing polysilicon in preference to silicon dioxide and/or silicon nitride by chemical mechanical polishing. The method removes polysilicon from a surface at a high removal rate while maintaining a high selectivity of polysilicon to silicon dioxide and/or a polysilicon to silicon nitride. The method is particularly suitable for use in the fabrication of MEMS devices.
Methods for producing silicon carbide architectural preforms
NASA Technical Reports Server (NTRS)
DiCarlo, James A. (Inventor); Yun, Hee (Inventor)
2010-01-01
Methods are disclosed for producing architectural preforms and high-temperature composite structures containing high-strength ceramic fibers with reduced preforming stresses within each fiber, with an in-situ grown coating on each fiber surface, with reduced boron within the bulk of each fiber, and with improved tensile creep and rupture resistance properties for each fiber. The methods include the steps of preparing an original sample of a preform formed from a pre-selected high-strength silicon carbide ceramic fiber type, placing the original sample in a processing furnace under a pre-selected preforming stress state and thermally treating the sample in the processing furnace at a pre-selected processing temperature and hold time in a processing gas having a pre-selected composition, pressure, and flow rate. For the high-temperature composite structures, the method includes additional steps of depositing a thin interphase coating on the surface of each fiber and forming a ceramic or carbon-based matrix within the sample.
Ihmaid, Saleh K; Ahmed, Hany E A; Zayed, Mohamed F; Abadleh, Mohammed M
2016-01-30
The main step in a successful drug discovery pipeline is the identification of small potent compounds that selectively bind to the target of interest with high affinity. However, there is still a shortage of efficient and accurate computational methods with the capability to study and hence predict compound selectivity properties. In this work, we propose an affordable machine learning method to perform compound selectivity classification and prediction. For this purpose, we collected compounds with reported activity and built a selectivity database formed of 153 cathepsin K and S inhibitors that are considered of medicinal interest. This database has three compound sets: two selective ones (K/S and S/K) and one non-selective one (KS). We subjected this database to the selectivity classification tool 'Emergent Self-Organizing Maps' to explore its capability to differentiate inhibitors selective for one cathepsin over the other. The method exhibited good clustering performance for selective ligands with high accuracy (up to 100%). Among the possibilities, BAPs and MACCS molecular structural fingerprints were used for the classification. The results exhibited the ability of the method for structure-selectivity relationship interpretation, and selectivity markers were identified for the design of further novel inhibitors with high activity and target selectivity.
A New Direction of Cancer Classification: Positive Effect of Low-Ranking MicroRNAs.
Li, Feifei; Piao, Minghao; Piao, Yongjun; Li, Meijing; Ryu, Keun Ho
2014-10-01
Many studies based on microRNA (miRNA) expression profiles have shown a new aspect of cancer classification. Because one characteristic of miRNA expression data is its high dimensionality, feature selection methods have been used to facilitate dimensionality reduction. These feature selection methods share one shortcoming: they consider only the cases where the feature-to-class relationship is 1:1 or n:1. However, because one miRNA may influence more than one type of cancer, such miRNAs tend to be ranked low by traditional feature selection methods and are removed most of the time. Given the limited number of miRNAs, low-ranking miRNAs are also important to cancer classification. We considered both high- and low-ranking features to cover all cases (1:1, n:1, 1:n, and m:n) in cancer classification. First, we used the correlation-based feature selection method to select the high-ranking miRNAs, and chose the support vector machine, Bayes network, decision tree, k-nearest-neighbor, and logistic classifiers to construct cancer classifiers. Then, we chose the Chi-square test, information gain, gain ratio, and Pearson's correlation feature selection methods to build the m:n feature subset, and used the selected miRNAs for cancer classification. The low-ranking miRNA expression profiles achieved higher classification accuracy compared with using only the high-ranking miRNAs from traditional feature selection methods. Our results demonstrate that the m:n feature subset confirms the positive effect of low-ranking miRNAs in cancer classification.
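To illustrate one of the filter rankings the study combines (the Chi-square test), here is a minimal sketch for a binarized feature and binary class label; the function name and the 2x2 setting are illustrative assumptions, not code from the paper:

```python
def chi2_score(x, y):
    """Chi-square statistic between a binary feature x and binary label y,
    computed from the 2x2 contingency table of observed vs. expected counts.
    A larger score means stronger feature-class association, so features
    can be ranked by this value."""
    n = len(x)
    table = {(a, b): 0 for a in (0, 1) for b in (0, 1)}
    for a, b in zip(x, y):
        table[(a, b)] += 1
    score = 0.0
    for a in (0, 1):
        for b in (0, 1):
            row = table[(a, 0)] + table[(a, 1)]   # feature marginal
            col = table[(0, b)] + table[(1, b)]   # class marginal
            expected = row * col / n
            if expected > 0:
                score += (table[(a, b)] - expected) ** 2 / expected
    return score
```

A feature perfectly aligned with the label scores n; an independent feature scores 0, and would be ranked low, which is exactly the behavior the abstract argues can discard useful m:n miRNAs.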
Pao's Selection Method for Quality Papers and the Subsequent Use of Medical Literature
ERIC Educational Resources Information Center
Boyce, Bert; Primov, Karen
1977-01-01
Pao's "quality filter" selection method is re-examined as to its effectiveness in selecting papers that not only are of use to medical educators but to researchers as well. It is concluded that the method does provide the librarian with a tool for forming a highly selective bibliography in a particular medical literature without need for…
A Way to Select Electrical Sheets of the Segment Stator Core Motors.
NASA Astrophysics Data System (ADS)
Enomoto, Yuji; Kitamura, Masashi; Sakai, Toshihiko; Ohara, Kouichiro
The segment stator core, high-density winding coil, and high-energy-product permanent magnet are indispensable technologies in the development of compact, high-efficiency motors. The conventional design method for the segment stator core mostly depended on the designer's experience in selecting a suitable electromagnetic material, far from an optimized design. Therefore, we have developed a novel design method for selecting a suitable electromagnetic material based on the correlation between material characteristics and motor performance. It enables the selection of a suitable electromagnetic material that will meet the motor specification.
NASA Astrophysics Data System (ADS)
Miura, Hitoshi
The development of compact separation and recovery methods using selective ion-exchange techniques is very important for reprocessing and for the treatment of high-level liquid wastes (HLLWs) in the nuclear back-end field. The selective nuclide separation techniques are effective for the volume reduction of wastes and the utilization of valuable nuclides, and are expected to support the construction of an advanced nuclear fuel cycle system and the rationalization of waste treatment. In order to accomplish selective nuclide separation, the design and synthesis of novel adsorbents are essential for the development of compact and precise separation processes. The present paper deals with the preparation of highly functional and selective hybrid microcapsules enclosing nano-adsorbents in alginate gel polymer matrices by sol-gel methods, their characterization, and the clarification of their selective adsorption properties by batch and column methods. The selective separation of Cs, Pd and Re in real HLLW was further accomplished by using the novel microcapsules, and an advanced nuclide separation system was proposed by combining selective processes using microcapsules.
This research was conducted in cooperation with EPA Region 4 in Athens, GA to develop a method to analyze selected pyrethroid pesticides using Reverse Phase-High Pressure Liquid Chromatography (HPLC). This HPLC method will aid researchers in separating and identifying these py...
Feature weight estimation for gene selection: a local hyperlinear learning approach
2014-01-01
Background Modeling high-dimensional data involving thousands of variables is particularly important for gene expression profiling experiments; nevertheless, it remains a challenging task. One of the challenges is to implement an effective method for selecting a small set of relevant genes buried in high-dimensional irrelevant noise. RELIEF is a popular and widely used approach for feature selection owing to its low computational cost and high accuracy. However, RELIEF-based methods suffer from instability, especially in the presence of noisy and/or high-dimensional outliers. Results We propose an innovative feature weighting algorithm, called LHR, to select informative genes from highly noisy data. LHR is based on RELIEF for feature weighting using classical margin maximization. The key idea of LHR is to estimate the feature weights through local approximation rather than global measurement, which is typically used in existing methods. The weights obtained by our method are very robust to the degradation caused by noisy features, even in very high dimensions. To demonstrate the performance of our method, extensive experiments involving classification tests have been carried out on both synthetic and real microarray benchmark datasets by combining the proposed technique with standard classifiers, including the support vector machine (SVM), k-nearest neighbor (KNN), hyperplane k-nearest neighbor (HKNN), linear discriminant analysis (LDA) and naive Bayes (NB). Conclusion Experiments on both synthetic and real-world datasets demonstrate the superior performance of the proposed feature selection method combined with supervised learning in three aspects: 1) high classification accuracy, 2) excellent robustness to noise and 3) good stability across various classification algorithms. PMID:24625071
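For context (this is a sketch of the classical RELIEF algorithm that LHR builds on, not the authors' LHR code): each sample's nearest hit (same class) and nearest miss (other class) update the feature weights, rewarding features that differ across classes but agree within a class:

```python
def relief(X, y):
    """Basic RELIEF feature weighting (Kira & Rendell style sketch):
    for every sample, find its nearest hit and nearest miss under
    Manhattan distance and update each feature's weight by
    |x_j - miss_j| - |x_j - hit_j|, averaged over samples."""
    n, p = len(X), len(X[0])
    w = [0.0] * p

    def dist(a, b):
        return sum(abs(a[j] - b[j]) for j in range(p))

    for i in range(n):
        hits = [k for k in range(n) if k != i and y[k] == y[i]]
        misses = [k for k in range(n) if y[k] != y[i]]
        h = min(hits, key=lambda k: dist(X[i], X[k]))
        m = min(misses, key=lambda k: dist(X[i], X[k]))
        for j in range(p):
            w[j] += abs(X[i][j] - X[m][j]) - abs(X[i][j] - X[h][j])
    return [v / n for v in w]
```

The instability the abstract describes comes from these nearest-neighbor lookups: a single noisy sample can flip a nearest hit or miss, which is what LHR's local-approximation weighting is designed to mitigate.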
Improved Sparse Multi-Class SVM and Its Application for Gene Selection in Cancer Classification
Huang, Lingkang; Zhang, Hao Helen; Zeng, Zhao-Bang; Bushel, Pierre R.
2013-01-01
Background Microarray techniques provide promising tools for cancer diagnosis using gene expression profiles. However, molecular diagnosis based on high-throughput platforms presents great challenges due to the overwhelming number of variables versus the small sample size and the complex nature of multi-type tumors. Support vector machines (SVMs) have shown superior performance in cancer classification due to their ability to handle high dimensional low sample size data. The multi-class SVM algorithm of Crammer and Singer provides a natural framework for multi-class learning. Despite its effective performance, the procedure utilizes all variables without selection. In this paper, we propose to improve the procedure by imposing shrinkage penalties in learning to enforce solution sparsity. Results The original multi-class SVM of Crammer and Singer is effective for multi-class classification but does not conduct variable selection. We improved the method by introducing soft-thresholding type penalties to incorporate variable selection into multi-class classification for high dimensional data. The new methods were applied to simulated data and two cancer gene expression data sets. The results demonstrate that the new methods can select a small number of genes for building accurate multi-class classification rules. Furthermore, the important genes selected by the methods overlap significantly, suggesting general agreement among different variable selection schemes. Conclusions High accuracy and sparsity make the new methods attractive for cancer diagnostics with gene expression data and defining targets of therapeutic intervention. Availability: The source MATLAB code is available from http://math.arizona.edu/~hzhang/software.html. PMID:23966761
Evaluating candidate reactions to selection practices using organisational justice theory.
Patterson, Fiona; Zibarras, Lara; Carr, Victoria; Irish, Bill; Gregory, Simon
2011-03-01
This study aimed to examine candidate reactions to selection practices in postgraduate medical training using organisational justice theory. We carried out three independent cross-sectional studies using samples from three consecutive annual recruitment rounds. Data were gathered from candidates applying for entry into UK general practice (GP) training during 2007, 2008 and 2009. Participants completed an evaluation questionnaire immediately after the short-listing stage and after the selection centre (interview) stage. Participants were doctors applying for GP training in the UK. Main outcome measures were participants' evaluations of the selection methods and perceptions of the overall fairness of each selection stage (short-listing and selection centre). A total of 23,855 evaluation questionnaires were completed (6893 in 2007, 10,497 in 2008 and 6465 in 2009). Absolute levels of perceived fairness of all the selection methods at both the short-listing and selection centre stages were consistently high over the three years. Similarly, all selection methods were considered to be job-related by candidates. However, in general, candidates considered the selection centre stage to be significantly fairer than the short-listing stage. Of all the selection methods, the simulated patient consultation completed at the selection centre stage was rated as the most job-relevant. This is the first study to use a model of organisational justice theory to evaluate candidate reactions during selection into postgraduate specialty training. The high-fidelity selection methods are consistently viewed as more job-relevant and fairer by candidates. This has important implications for the design of recruitment systems for all specialties and, potentially, for medical school admissions. Using this approach, recruiters can systematically compare perceptions of the fairness and job relevance of various selection methods. © Blackwell Publishing Ltd 2011.
Ensemble of sparse classifiers for high-dimensional biological data.
Kim, Sunghan; Scalzo, Fabien; Telesca, Donatello; Hu, Xiao
2015-01-01
Biological data are often high in dimension while the number of samples is small. In such cases, the performance of classification can be improved by reducing the dimension of the data, which is referred to as feature selection. Recently, a novel feature selection method was proposed that utilises the sparsity of high-dimensional biological data, where a small subset of features accounts for most of the variance of the dataset. In this study we propose a new classification method for high-dimensional biological data that performs both feature selection and classification within a single framework. Our proposed method utilises a sparse linear solution technique and the bootstrap aggregating algorithm. We tested its performance on four public mass spectrometry cancer datasets and compared it with two conventional classification techniques, Support Vector Machines and Adaptive Boosting. The results demonstrate that our proposed method performs more accurate classification across various cancer datasets than those conventional classification techniques.
ERIC Educational Resources Information Center
Bumstead, Stacey
2012-01-01
The purpose of this mixed methods study was to examine select novice teachers' perceived knowledge of high-quality reading instruction, explore the extent that select novice teachers implemented high-quality reading instruction into their own classrooms, and to investigate any factors that explain the similarities and differences between…
The cross-validated AUC for MCP-logistic regression with high-dimensional data.
Jiang, Dingfeng; Huang, Jian; Zhang, Ying
2013-10-01
We propose a cross-validated area under the receiver operating characteristic (ROC) curve (CV-AUC) criterion for tuning parameter selection for penalized methods in sparse, high-dimensional logistic regression models. We use this criterion in combination with the minimax concave penalty (MCP) method for variable selection. The CV-AUC criterion is specifically designed for optimizing the classification performance for binary outcome data. To implement the proposed approach, we derive an efficient coordinate descent algorithm to compute the MCP-logistic regression solution surface. Simulation studies are conducted to evaluate the finite sample performance of the proposed method and to compare it with existing methods, including the Akaike information criterion (AIC), Bayesian information criterion (BIC) and extended BIC (EBIC). The model selected based on the CV-AUC criterion tends to have a larger predictive AUC and smaller classification error than those with tuning parameters selected using the AIC, BIC or EBIC. We illustrate the application of MCP-logistic regression with the CV-AUC criterion on three microarray datasets from studies that attempt to identify genes related to cancers. Our simulation studies and data examples demonstrate that CV-AUC is an attractive method for tuning parameter selection for penalized methods in high-dimensional logistic regression models.
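The AUC at the core of the CV-AUC criterion can be computed directly from the Mann-Whitney rank identity; a minimal sketch (illustrative, not from the paper) — the criterion itself would average this quantity over held-out folds for each candidate tuning parameter and pick the maximizer:

```python
def auc(scores, labels):
    """AUC via the Mann-Whitney identity: the probability that a randomly
    chosen positive case is scored above a randomly chosen negative case,
    with ties counting half."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

Unlike AIC/BIC-type criteria, this score depends only on the ranking of the predicted probabilities, which is why it targets classification performance rather than likelihood fit.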
Selecting long-term care facilities with high use of acute hospitalisations: issues and options
2014-01-01
Background This paper considers approaches to the question “Which long-term care facilities have residents with high use of acute hospitalisations?” It compares four methods of identifying long-term care facilities with high use of acute hospitalisations by demonstrating four selection methods, identifies key factors to be resolved when deciding which methods to employ, and discusses their appropriateness for different research questions. Methods OPAL was a census-type survey of aged care facilities and residents in Auckland, New Zealand, in 2008. It collected information about facility management and resident demographics, needs and care. Survey records (149 aged care facilities, 6271 residents) were linked to hospital and mortality records routinely assembled by health authorities. The main ranking endpoint was acute hospitalisations for diagnoses that were classified as potentially avoidable. Facilities were ranked using 1) simple event counts per person, 2) event rates per year of resident follow-up, 3) statistical model of rates using four predictors, and 4) change in ranks between methods 2) and 3). A generalized mixed model was used for Method 3 to handle the clustered nature of the data. Results 3048 potentially avoidable hospitalisations were observed during 22 months’ follow-up. The same “top ten” facilities were selected by Methods 1 and 2. The statistical model (Method 3), predicting rates from resident and facility characteristics, ranked facilities differently than these two simple methods. The change-in-ranks method identified a very different set of “top ten” facilities. All methods showed a continuum of use, with no clear distinction between facilities with higher use. Conclusion Choice of selection method should depend upon the purpose of selection. To monitor performance during a period of change, a recent simple rate, count per resident, or even count per bed, may suffice. 
To find high-use facilities regardless of resident needs, recent history of admissions is highly predictive. To target a few high-use facilities that have high rates after considering facility and resident characteristics, model residuals or a large increase in rank may be preferable. PMID:25052433
Selective biosensing of Staphylococcus aureus using chitosan quantum dots
NASA Astrophysics Data System (ADS)
Abdelhamid, Hani Nasser; Wu, Hui-Fen
2018-01-01
Selective biosensing of Staphylococcus aureus (S. aureus) using chitosan-modified quantum dots (CTS@CdS QDs) in the presence of hydrogen peroxide is reported. The method is based on the intrinsic catalase-positive activity of S. aureus. CTS@CdS quantum dots provide high dispersion in aqueous media with high fluorescence emission. Staphylococcus aureus causes a selective quenching of the fluorescence emission of CTS@CdS QDs in the presence of H2O2 compared to other pathogens such as Escherichia coli and Pseudomonas aeruginosa. The intrinsic enzymatic character of S. aureus (catalase positive) offers selective and fast biosensing. The present method is highly selective for catalase-positive species and requires no expensive reagents such as antibodies, aptamers or microbeads. It could be extended to other catalase-positive species.
Dominating Scale-Free Networks Using Generalized Probabilistic Methods
Molnár, F.; Derzsy, N.; Czabarka, É.; Székely, L.; Szymanski, B. K.; Korniss, G.
2014-01-01
We study ensemble-based graph-theoretical methods aiming to approximate the size of the minimum dominating set (MDS) in scale-free networks. We analyze both analytical upper bounds of dominating sets and numerical realizations for applications. We propose two novel probabilistic dominating set selection strategies that are applicable to heterogeneous networks. One of them obtains the smallest probabilistic dominating set and also outperforms the deterministic degree-ranked method. We show that a degree-dependent probabilistic selection method becomes optimal in its deterministic limit. In addition, we also find the precise limit where selecting high-degree nodes exclusively becomes inefficient for network domination. We validate our results on several real-world networks, and provide highly accurate analytical estimates for our methods. PMID:25200937
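A hedged sketch of a probabilistic dominating-set selection in the spirit described above; the degree-proportional probabilities and the greedy patch-up step are illustrative assumptions, not the authors' exact strategies:

```python
import random

def probabilistic_dominating_set(adj, prob, seed=0):
    """Probabilistic selection: include each node v independently with
    probability prob[v], then patch up coverage so the result is always a
    valid dominating set (every node is selected or adjacent to one)."""
    rng = random.Random(seed)
    selected = {v for v in adj if rng.random() < prob[v]}
    for v in adj:
        if v not in selected and not any(u in selected for u in adj[v]):
            selected.add(v)
    return selected

# degree-proportional probabilities on a small star-plus-tail graph
adj = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0, 4], 4: [3]}
prob = {v: len(adj[v]) / len(adj) for v in adj}
ds = probabilistic_dominating_set(adj, prob, seed=1)
```

Biasing `prob` toward high-degree nodes mirrors the degree-dependent selection the abstract analyzes; the abstract's finding is that purely degree-ranked (deterministic) selection is not always optimal.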
Method for production of sorghum hybrids with selected flowering times
Mullet, John E.; Rooney, William L.
2016-08-30
Methods and composition for the production of sorghum hybrids with selected and different flowering times are provided. In accordance with the invention, a substantially continual and high-yield harvest of sorghum is provided. Improved methods of seed production are also provided.
Aptamers and methods for their in vitro selection and uses thereof
Doyle, Sharon A [Walnut Creek, CA; Murphy, Michael B [Severna Park, MD
2008-02-12
The present method is an improved in vitro selection protocol that relies on magnetic separations for DNA aptamer production and that is relatively easy and scalable without the need for expensive robotics. The ability of aptamers selected by this method to recognize and bind their target protein with high affinity and specificity is described, and their uses in a number of assays are detailed. Specific TTF1 and His6 aptamers were selected using the method described and shown to be useful for enzyme-linked assays, Western blots, and affinity purification.
Aptamers and methods for their in vitro selection and uses thereof
Doyle, Sharon A [Walnut Creek, CA; Murphy, Michael B [Severna Park, MD
2012-01-31
The present method is an improved in vitro selection protocol that relies on magnetic separations for DNA aptamer production and that is relatively easy and scalable without the need for expensive robotics. The ability of aptamers selected by this method to recognize and bind their target protein with high affinity and specificity is described, and their uses in a number of assays are detailed. Specific TTF1 and His6 aptamers were selected using the method described and shown to be useful for enzyme-linked assays, Western blots, and affinity purification.
Khurana, Rajneet Kaur; Rao, Satish; Beg, Sarwar; Katare, O.P.; Singh, Bhupinder
2016-01-01
The present work aims at the systematic development of a simple, rapid and highly sensitive densitometry-based thin-layer chromatographic method for the quantification of mangiferin in bioanalytical samples. Initially, the quality target method profile was defined and critical analytical attributes (CAAs) earmarked, namely, retardation factor (Rf), peak height, capacity factor, theoretical plates and separation number. Face-centered cubic design was selected for optimization of volume loaded and plate dimensions as the critical method parameters selected from screening studies employing D-optimal and Plackett–Burman design studies, followed by evaluating their effect on the CAAs. The mobile phase containing a mixture of ethyl acetate : acetic acid : formic acid : water in a 7 : 1 : 1 : 1 (v/v/v/v) ratio was finally selected as the optimized solvent for apt chromatographic separation of mangiferin at 262 nm with Rf 0.68 ± 0.02 and all other parameters within the acceptance limits. Method validation studies revealed high linearity in the concentration range of 50–800 ng/band for mangiferin. The developed method showed high accuracy, precision, ruggedness, robustness, specificity, sensitivity, selectivity and recovery. In a nutshell, the bioanalytical method for analysis of mangiferin in plasma revealed the presence of well-resolved peaks and high recovery of mangiferin. PMID:26912808
Feature Grouping and Selection Over an Undirected Graph.
Yang, Sen; Yuan, Lei; Lai, Ying-Cheng; Shen, Xiaotong; Wonka, Peter; Ye, Jieping
2012-01-01
High-dimensional regression/classification continues to be an important and challenging problem, especially when features are highly correlated. Feature selection, combined with additional structure information on the features, has been considered promising in promoting regression/classification performance. Graph-guided fused lasso (GFlasso) has recently been proposed to facilitate feature selection and graph structure exploitation when features exhibit certain graph structures. However, the formulation in GFlasso relies on pairwise sample correlations to perform feature grouping, which could introduce additional estimation bias. In this paper, we propose three new feature grouping and selection methods to resolve this issue. The first method employs a convex function to penalize the pairwise ℓ∞ norm of connected regression/classification coefficients, achieving simultaneous feature grouping and selection. The second method improves the first by utilizing a non-convex function to reduce the estimation bias. The third extends the second by using a truncated ℓ1 regularization to further reduce the estimation bias. The proposed methods combine feature grouping and feature selection to enhance estimation accuracy. We employ the alternating direction method of multipliers (ADMM) and difference of convex functions (DC) programming to solve the proposed formulations. Our experimental results on synthetic data and two real datasets demonstrate the effectiveness of the proposed methods.
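The first method's convex grouping penalty can be sketched in a few lines. The function names below are ours, and the paper minimizes this objective with ADMM rather than merely evaluating it as done here:

```python
import numpy as np

def pairwise_linf_penalty(beta, edges, lam=1.0):
    """Convex grouping penalty: lam * sum over graph edges (i, j) of
    max(|beta_i|, |beta_j|).  Encourages connected coefficients to share
    magnitude (grouping) while still allowing exact zeros (selection)."""
    return lam * sum(max(abs(beta[i]), abs(beta[j])) for i, j in edges)

def penalized_loss(beta, X, y, edges, lam=1.0):
    """Least-squares loss plus the grouping penalty; this is the kind of
    objective the paper optimizes (here we only evaluate it)."""
    resid = y - X @ beta
    return 0.5 * float(resid @ resid) + pairwise_linf_penalty(beta, edges, lam)

# Two connected features with equal magnitude pay one max() per edge.
beta = np.array([1.0, 1.0, 0.0])
edges = [(0, 1), (1, 2)]
print(pairwise_linf_penalty(beta, edges))  # 2.0
```

Note how the ℓ∞ term is flat in the smaller coefficient of each edge, which is what produces exact grouping at the optimum.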
Attallah, Omneya; Karthikesalingam, Alan; Holt, Peter Je; Thompson, Matthew M; Sayers, Rob; Bown, Matthew J; Choke, Eddie C; Ma, Xianghong
2017-11-01
Feature selection is essential in the medical domain; however, the process is complicated by the presence of censoring, the distinguishing characteristic of survival analysis. Most survival feature selection methods are based on Cox's proportional hazards model, even though machine learning classifiers are often preferred. Such classifiers are less commonly employed in survival analysis because censoring prevents them from being applied directly to survival data. Among the few works that have employed machine learning classifiers, the partial logistic artificial neural network with automatic relevance determination is a well-known method that handles censoring and performs feature selection for survival data. However, it depends on data replication to handle censoring, which leads to unbalanced and biased prediction results, especially in highly censored data. Other methods cannot deal with high censoring. Therefore, in this article, a new hybrid feature selection method is proposed that offers a solution to high levels of censoring. It combines support vector machine, neural network, and K-nearest neighbor classifiers using simple majority voting and a new weighted majority voting method based on a survival metric to construct a multiple classifier system. The new hybrid feature selection process uses the multiple classifier system as a wrapper method and merges it with an iterated feature ranking filter method to further reduce features. Two endovascular aortic repair datasets containing 91% censored patients, collected from two centers, were used to construct a multicenter study to evaluate the performance of the proposed approach. The results showed that the proposed technique outperformed individual classifiers and variable selection methods based on Cox's model, such as the Akaike and Bayesian information criteria and the least absolute shrinkage and selection operator, in p values of the log-rank test, sensitivity, and concordance index. This indicates that the proposed classifier is more powerful in correctly predicting the risk of re-intervention, enabling doctors to select patients' future follow-up plans.
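The two voting combiners at the heart of the multiple classifier system can be sketched as follows. This is a generic illustration in which the weights are supplied as inputs, whereas the article derives them from a survival metric:

```python
import numpy as np

def simple_majority_vote(preds):
    """Combine binary (0/1) predictions from several classifiers by
    simple majority; preds has shape (n_classifiers, n_samples)."""
    preds = np.asarray(preds)
    return (preds.sum(axis=0) > preds.shape[0] / 2).astype(int)

def weighted_majority_vote(preds, weights):
    """Each classifier's vote is scaled by a performance weight; the
    combined label is 1 when the weighted votes exceed half the total
    weight."""
    preds = np.asarray(preds, dtype=float)
    w = np.asarray(weights, dtype=float)
    return (w @ preds > w.sum() / 2).astype(int)

# A high-weight classifier can override the unweighted majority:
preds = [[1, 1], [0, 0], [0, 1]]
print(simple_majority_vote(preds))                      # [0 1]
print(weighted_majority_vote(preds, [0.6, 0.3, 0.1]))   # [1 1]
```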
A computational method for selecting short peptide sequences for inorganic material binding.
Nayebi, Niloofar; Cetinel, Sibel; Omar, Sara Ibrahim; Tuszynski, Jack A; Montemagno, Carlo
2017-11-01
Discovering or designing biofunctionalized materials with improved quality highly depends on the ability to manipulate and control the peptide-inorganic interaction. Various peptides can be used as assemblers, synthesizers, and linkers in the material syntheses. In another context, specific and selective material-binding peptides can be used as recognition blocks in mining applications. In this study, we propose a new in silico method to select short 4-mer peptides with high affinity and selectivity for a given target material. This method is illustrated with the calcite (104) surface as an example, which has been experimentally validated. A calcite binding peptide can play an important role in our understanding of biomineralization. A practical aspect of calcite is a need for it to be selectively depressed in mining sites. © 2017 Wiley Periodicals, Inc.
Effects of advanced selection methods on sperm quality and ART outcome.
Yetunde, I; Vasiliki, M
2013-10-01
In assisted reproductive technology (ART), the role of spermatozoa has evolved over the years. In the past, early methods of selecting sperm for ART focused only on selecting motile and morphologically normal-appearing sperm. It has become evident that these methods are inefficient in identifying the most suitable sperm for fertilization. Novel methods have thus been created to identify highly motile, morphologically normal, viable non-apoptotic spermatozoa with intact membranes and high DNA integrity for use in ART. These advanced methods of selection utilize our knowledge of unique characteristics of sperm, such as sperm surface charge, the presence of hyaluronic acid binding sites on sperm, sperm ultramorphology, markers of apoptosis and zona pellucida binding on sperm. These methods have shown promise in improving ART outcomes. Future developments may include Raman spectroscopy, confocal light absorption and scattering spectroscopic microscopy, and polarization microscopy. While these novel techniques have potential, they come with a cost burden, and further studies are required to demonstrate their impact on ART outcomes. Furthermore, clinicians and human reproductive scientists need to continue to gather knowledge about human fertilization and determine the most physiological methods of sperm selection.
Du, Wei; Sun, Min; Guo, Pengqi; Chang, Chun; Fu, Qiang
2018-09-01
Nowadays, the abuse of antibiotics in aquaculture has generated considerable problems for food safety. Therefore, it is imperative to develop a simple and selective method for monitoring illegal use of antibiotics in aquatic products. In this study, a method combining molecularly imprinted membrane (MIM) extraction and liquid chromatography was developed for the selective analysis of cloxacillin in shrimp samples. The MIMs were synthesized by UV photopolymerization and characterized by scanning electron microscopy, Fourier transform infrared spectroscopy, thermogravimetric analysis and a swelling test. The results showed that the MIMs exhibited excellent permselectivity, high adsorption capacity and a fast adsorption rate for cloxacillin. Finally, the method was used to determine cloxacillin in shrimp samples, with good accuracy and acceptable relative standard deviation values for precision. The proposed method is a promising alternative for the selective analysis of cloxacillin in shrimp samples, owing to its easy operation and excellent selectivity. Copyright © 2018. Published by Elsevier Ltd.
An adaptive band selection method for dimension reduction of hyper-spectral remote sensing image
NASA Astrophysics Data System (ADS)
Yu, Zhijie; Yu, Hui; Wang, Chen-sheng
2014-11-01
Hyper-spectral remote sensing data are acquired by imaging the same area at multiple wavelengths, and a dataset normally consists of hundreds of band-images. Hyper-spectral images provide not only spatial information but also high-resolution spectral information, and they have been widely used in environment monitoring, mineral investigation and military reconnaissance. However, because of the correspondingly large data volume, hyper-spectral images are very difficult to transmit and store, and dimension reduction techniques are needed to resolve this problem. Because of the high correlation and high redundancy among hyper-spectral bands, dimension reduction is a feasible way to compress the data volume. This paper proposes a novel band selection-based dimension reduction method which can adaptively select the bands that contain more information and detail. The proposed method is based on principal component analysis (PCA): it computes an index for every band, and the indexes obtained are then ranked in descending order of magnitude. Based on a threshold, the system can then adaptively and reasonably select bands. The proposed method overcomes the shortcomings of transform-based dimension reduction methods and prevents the original spectral information from being lost. The performance of the proposed method has been validated in several experiments. The experimental results show that the proposed algorithm can reduce the dimensions of a hyper-spectral image with little information loss by adaptively selecting band images.
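The abstract does not give the exact per-band index, so the following is only a generic sketch of PCA-based band scoring with adaptive threshold selection; the function name and scoring rule are our assumptions:

```python
import numpy as np

def select_bands_pca(cube, threshold=0.9):
    """Score each band by its variance-weighted squared loadings on the
    principal components, rank the scores in descending order, and keep
    bands until the cumulative normalized score passes `threshold`.
    `cube` is (n_pixels, n_bands)."""
    X = cube - cube.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)            # eigenvalues ascending
    vals, vecs = vals[::-1], vecs[:, ::-1]      # reorder to descending
    scores = (vecs ** 2 * vals).sum(axis=1)     # per-band information index
    order = np.argsort(scores)[::-1]
    cum = np.cumsum(scores[order]) / scores.sum()
    keep = order[: int(np.searchsorted(cum, threshold)) + 1]
    return np.sort(keep)

# One dominant band carries almost all the variance here:
cube = np.array([[0., 0., 0.], [10., 1., .1], [20., 2., .2], [30., 3., .3]])
print(select_bands_pca(cube, 0.9))  # [0]
```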
Variance Component Selection With Applications to Microbiome Taxonomic Data.
Zhai, Jing; Kim, Juhyun; Knox, Kenneth S; Twigg, Homer L; Zhou, Hua; Zhou, Jin J
2018-01-01
High-throughput sequencing technology has enabled population-based studies of the role of the human microbiome in disease etiology and exposure response. Microbiome data are summarized as counts or composition of the bacterial taxa at different taxonomic levels. An important problem is to identify the bacterial taxa that are associated with a response. One method is to test the association of a specific taxon with phenotypes in a linear mixed effects model, which incorporates phylogenetic information among bacterial communities. Another type of approach considers all taxa in a joint model and achieves selection via penalization, which ignores phylogenetic information. In this paper, we consider regression analysis that treats bacterial taxa at different levels as multiple random effects. For each taxon, a kernel matrix is calculated based on distance measures in the phylogenetic tree and acts as one variance component in the joint model. Taxonomic selection is then achieved by the lasso (least absolute shrinkage and selection operator) penalty on the variance components. Our method integrates biological information into the variable selection problem and greatly improves selection accuracy. Simulation studies demonstrate the superiority of our method over existing methods such as the group lasso. Finally, we apply our method to a longitudinal microbiome study of Human Immunodeficiency Virus (HIV) infected patients. We implement our method in the high-performance computing language Julia. Software and detailed documentation are freely available at https://github.com/JingZhai63/VCselection.
High-Sensitivity Spectrophotometry.
ERIC Educational Resources Information Center
Harris, T. D.
1982-01-01
Selected high-sensitivity spectrophotometric methods are examined, and comparisons are made of their relative strengths and weaknesses and the circumstances for which each can best be applied. Methods include long path cells, noise reduction, laser intracavity absorption, thermocouple calorimetry, photoacoustic methods, and thermo-optical methods.
Xu, Rengyi; Mesaros, Clementina; Weng, Liwei; Snyder, Nathaniel W; Vachani, Anil; Blair, Ian A; Hwang, Wei-Ting
2017-07-01
We compared three statistical methods for selecting a panel of serum lipid biomarkers for mesothelioma and asbestos exposure. Serum samples from mesothelioma patients, asbestos-exposed subjects and controls (40 per group) were analyzed. Three variable selection methods were considered: top-ranked predictors from a univariate model, stepwise selection and the least absolute shrinkage and selection operator. Cross-validated area under the receiver operating characteristic curve was used to compare prediction performance, and lipids with a high cross-validated area under the curve were identified. A lipid with a mass-to-charge ratio of 372.31 was selected by all three methods when comparing mesothelioma versus control. Lipids with mass-to-charge ratios of 1464.80 and 329.21 were selected by two models for asbestos exposure versus control. The different methods selected a similar set of serum lipids, and combining candidate biomarkers can improve prediction.
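The univariate screening strategy can be illustrated with a rank-based AUC computed via the Mann-Whitney statistic. Function names are ours, and the actual study used cross-validated AUC over fitted models rather than this raw per-feature score:

```python
import numpy as np

def auc_rank(x, y):
    """Area under the ROC curve for one candidate biomarker via the
    Mann-Whitney rank statistic; y is 0/1 (control/case)."""
    x, y = np.asarray(x, float), np.asarray(y)
    pos, neg = x[y == 1], x[y == 0]
    gt = (pos[:, None] > neg[None, :]).sum()     # case > control pairs
    eq = (pos[:, None] == neg[None, :]).sum()    # ties count half
    return (gt + 0.5 * eq) / (len(pos) * len(neg))

def top_ranked(features, y, k=2):
    """Univariate screening: rank features by |AUC - 0.5| and return the
    column indices of the top k."""
    F = np.asarray(features, float)
    scores = [abs(auc_rank(F[:, j], y) - 0.5) for j in range(F.shape[1])]
    return sorted(np.argsort(scores)[::-1][:k].tolist())

# Column 0 separates the groups perfectly; column 1 is uninformative.
features = np.array([[1., 4.], [2., 1.], [3., 3.], [4., 2.]])
y = np.array([0, 0, 1, 1])
print(auc_rank(features[:, 0], y))   # 1.0
print(top_ranked(features, y, k=1))  # [0]
```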
Biomimetic membranes and methods of making biomimetic membranes
Rempe, Susan; Brinker, Jeffrey C.; Rogers, David Michael; Jiang, Ying-Bing; Yang, Shaorong
2016-11-08
The present disclosure is directed to biomimetic membranes and methods of manufacturing such membranes that include structural features that mimic the structures of cellular membrane channels and produce membrane designs capable of high selectivity and high permeability or adsorptivity. The membrane structure, material and chemistry can be selected to perform liquid separations, gas separation and capture, ion transport and adsorption for a variety of applications.
Novel selection methods for DNA-encoded chemical libraries
Chan, Alix I.; McGregor, Lynn M.; Liu, David R.
2015-01-01
Driven by the need for new compounds to serve as biological probes and leads for therapeutic development and the growing accessibility of DNA technologies including high-throughput sequencing, many academic and industrial groups have begun to use DNA-encoded chemical libraries as a source of bioactive small molecules. In this review, we describe the technologies that have enabled the selection of compounds with desired activities from these libraries. These methods exploit the sensitivity of in vitro selection coupled with DNA amplification to overcome some of the limitations and costs associated with conventional screening methods. In addition, we highlight newer techniques with the potential to be applied to the high-throughput evaluation of DNA-encoded chemical libraries. PMID:25723146
Bougaran, Gaël; Rouxel, Catherine; Dubois, Nolwenn; Kaas, Raymond; Grouas, Sophie; Lukomska, Ewa; Le Coz, Jean-René; Cadoret, Jean-Paul
2012-11-01
Microalgae offer a high potential for energetic lipid storage as well as high growth rates. They are therefore considered promising candidates for biofuel production, with the selection of high lipid-producing strains a major objective in projects on the development of this technology. We developed a mutation-selection method aimed at increasing microalgae neutral lipid productivity. A two-step method, based on UVc irradiation followed by flow cytometry selection, was applied to a set of strains that had an initial high lipid content, and improvement was assessed by means of Nile-red fluorescence measurements. The method was first tested on Isochrysis affinis galbana (T-Iso). Following a first round of mutation-selection, the total fatty acid content had not increased significantly, being 262 ± 21 mgTFA (gC)-1 for the wild type (WT) and 269 ± 49 mgTFA (gC)-1 for the selected population (S1M1). Conversely, fatty acid distribution among the lipid classes was affected by the process, resulting in a 20% increase for the fatty acids in the neutral lipids and a 40% decrease in the phospholipids. After a second mutation-selection step (S2M2), the total fatty acid content reached 409 ± 64 mgTFA (gC)-1 with a fatty acid distribution similar to the S1M1 population. Growth rate remained unaffected by the process, resulting in an 80% increase in neutral lipid productivity. Copyright © 2012 Wiley Periodicals, Inc.
Selective high-affinity polydentate ligands and methods of making such
DOE Office of Scientific and Technical Information (OSTI.GOV)
Denardo, Sally J.; Denardo, Gerald L.; Balhorn, Rodney L.
This invention provides novel polydentate selective high affinity ligands (SHALs) that can be used in a variety of applications in a manner analogous to the use of antibodies. SHALs typically comprise a multiplicity of ligands that each bind different regions on the target molecule. The ligands are joined directly or through a linker thereby forming a polydentate moiety that typically binds the target molecule with high selectivity and avidity.
Selective high-affinity polydentate ligands and methods of making such
DeNardo, Sally; DeNardo, Gerald; Balhorn, Rodney
2013-09-17
This invention provides polydentate selective high affinity ligands (SHALs) that can be used in a variety of applications in a manner analogous to the use of antibodies. SHALs typically comprise a multiplicity of ligands that each binds different regions on the target molecule. The ligands are joined directly or through a linker thereby forming a polydentate moiety that typically binds the target molecule with high selectivity and avidity.
Selective high affinity polydentate ligands and methods of making such
DeNardo, Sally; DeNardo, Gerald; Balhorn, Rodney
2010-02-16
This invention provides novel polydentate selective high affinity ligands (SHALs) that can be used in a variety of applications in a manner analogous to the use of antibodies. SHALs typically comprise a multiplicity of ligands that each bind different regions on the target molecule. The ligands are joined directly or through a linker thereby forming a polydentate moiety that typically binds the target molecule with high selectivity and avidity.
NASA Astrophysics Data System (ADS)
Wang, J.; Feng, B.
2016-12-01
Impervious surface area (ISA) has long been studied as an important input into moisture flux models. In general, ISA impedes groundwater recharge, increases stormflow/flood frequency, and alters in-stream and riparian habitats. Urban areas are among the richest ISA environments, and urban ISA mapping assists flood prevention and urban planning. Hyperspectral imagery (HI), with its ability to detect subtle spectral signatures, is an ideal candidate for urban ISA mapping. Mapping ISA from HI involves endmember (EM) selection, and the high spatial and spectral heterogeneity of the urban environment makes this task difficult: a compromise is needed between the degree of automation and the representativeness of the method. This study tested one manual and two semi-automatic EM selection strategies. The manual and the first semi-automatic methods have been widely used in EM selection; the second semi-automatic method is newer and had previously been proposed only for moderate-spatial-resolution satellite imagery. The manual method visually selected EM candidates from eight landcover types in the original image. The first semi-automatic method chose EM candidates using a threshold over the pixel purity index (PPI) map. The second semi-automatic method used the triangle shape of the HI scatter plot in the n-dimensional visualizer to identify the V-I-S (vegetation-impervious surface-soil) EM candidates: the pixels located at the triangle vertices. The initial EM candidates from the three methods were further refined by three indexes (EM average RMSE, minimum average spectral angle, and count-based EM selection), generating three spectral libraries that were used to classify the test image with the spectral angle mapper. The overall accuracies of the resulting classifications are 85% for the manual method, 81% for the PPI method, and 87% for the V-I-S method. The V-I-S EM selection method thus performs best in this study, which demonstrates its value not only for moderate-spatial-resolution satellite images but also for increasingly accessible high-spatial-resolution airborne images. This semi-automatic EM selection method can be adopted for a wide range of remote sensing images and provide ISA maps for hydrologic analysis.
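The spectral angle mapper used for the final classification reduces to the angle between a pixel spectrum and each library endmember; a minimal sketch (function names are ours):

```python
import numpy as np

def spectral_angle(pixel, endmember):
    """Spectral angle (radians) between a pixel spectrum and a library
    endmember; a smaller angle means a better match, and the measure is
    insensitive to overall brightness (scaling)."""
    p, e = np.asarray(pixel, float), np.asarray(endmember, float)
    cos = p @ e / (np.linalg.norm(p) * np.linalg.norm(e))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

def sam_classify(pixel, library):
    """Assign the pixel the label of the closest endmember in the
    spectral library (dict mapping label -> spectrum)."""
    return min(library, key=lambda lab: spectral_angle(pixel, library[lab]))

# A pixel that is a scaled copy of the 'veg' endmember maps to 'veg'.
library = {"veg": [0.1, 0.6], "imp": [0.5, 0.5]}
print(sam_classify([0.2, 1.2], library))  # veg
```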
Multiprocessor switch with selective pairing
Gara, Alan; Gschwind, Michael K; Salapura, Valentina
2014-03-11
System, method and computer program product for a multiprocessing system to offer selective pairing of processor cores for increased processing reliability. A selective pairing facility is provided that selectively connects, i.e., pairs, multiple microprocessor or processor cores to provide one highly reliable thread (or thread group). Each pair of microprocessor or processor cores that provides one highly reliable thread connects with system components such as a memory "nest" (or memory hierarchy), an optional system controller, an optional interrupt controller, and optional I/O or peripheral devices. The memory nest is attached to the selective pairing facility via a switch or a bus.
Frequency-Wavenumber (FK)-Based Data Selection in High-Frequency Passive Surface Wave Survey
NASA Astrophysics Data System (ADS)
Cheng, Feng; Xia, Jianghai; Xu, Zongbo; Hu, Yue; Mi, Binbin
2018-04-01
Passive surface wave methods have gained much attention from geophysical and civil engineering communities because of the limited application of traditional seismic surveys in highly populated urban areas. Considering that they can provide high-frequency phase velocity information up to several tens of Hz, the active surface wave survey would be omitted and the amount of field work could be dramatically reduced. However, the measured dispersion energy image in the passive surface wave survey would usually be polluted by a type of "crossed" artifacts at high frequencies. It is common in the bidirectional noise distribution case with a linear receiver array deployed along roads or railways. We review several frequently used passive surface wave methods and derive the underlying physics for the existence of the "crossed" artifacts. We prove that the "crossed" artifacts would cross the true surface wave energy at fixed points in the f-v domain and propose an FK-based data selection technique to attenuate the artifacts in order to retrieve the high-frequency information. Numerical tests further demonstrate the existence of the "crossed" artifacts and indicate that the well-known wave field separation method, the FK filter, does not work for the selection of directional noise data. Real-world applications manifest the feasibility of the proposed FK-based technique to improve passive surface wave methods by a priori data selection. Finally, we discuss the applicability of our approach.
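A minimal sketch of FK-domain directional masking, assuming a real-valued shot gather sampled on a regular time-space grid. The sign convention linking f-k quadrants to propagation direction is an assumption of this sketch, and the paper's actual technique selects data before dispersion imaging rather than simply filtering:

```python
import numpy as np

def fk_directional_mask(gather, dt, dx, keep="positive"):
    """Transform a shot gather (n_t, n_x) to the frequency-wavenumber
    domain with a 2-D FFT and zero out one propagation direction.
    Under the conventions chosen here, bins where f and k share a sign
    are taken to represent one direction (an assumption)."""
    FK = np.fft.fft2(gather)
    f = np.fft.fftfreq(gather.shape[0], d=dt)[:, None]
    k = np.fft.fftfreq(gather.shape[1], d=dx)[None, :]
    sign = 1.0 if keep == "positive" else -1.0
    mask = (sign * f * k) <= 0        # keep one quadrant pair (+ zero lines)
    # The mask is symmetric under (f, k) -> (-f, -k), so the inverse
    # transform of a real gather stays real up to round-off.
    return np.fft.ifft2(FK * mask).real
```

Because the mask removes roughly half of the spectral bins, the filtered gather of any broadband record carries strictly less energy than the input, which is a cheap sanity check on the implementation.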
Genomic selection in plant breeding
USDA-ARS?s Scientific Manuscript database
Genomic selection (GS) is a method to predict the genetic value of selection candidates based on the genomic estimated breeding value (GEBV) predicted from high-density markers positioned throughout the genome. Unlike marker-assisted selection, the GEBV is based on all markers including both minor ...
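As a hedged illustration of how a GEBV can be computed from all markers at once, here is an rrBLUP-style ridge regression sketch; this is not the database's own pipeline, and the function name and shrinkage parameter are our assumptions:

```python
import numpy as np

def gebv_ridge(M_train, y_train, M_new, lam=1e-8):
    """Genomic estimated breeding values via ridge regression on marker
    genotypes.  M_* are (n_individuals, n_markers) matrices of marker
    codes; every marker contributes a small shrunken effect, unlike
    marker-assisted selection, which uses only a few large ones."""
    Xc = M_train - M_train.mean(axis=0)            # center markers
    yc = y_train - y_train.mean()                  # center phenotypes
    p = M_train.shape[1]
    # solve (X'X + lam I) u = X'y for the vector of marker effects u
    u = np.linalg.solve(Xc.T @ Xc + lam * np.eye(p), Xc.T @ yc)
    return (M_new - M_train.mean(axis=0)) @ u + y_train.mean()

# Toy single-marker example: phenotype is twice the marker dosage.
M_train = np.array([[0.], [2.]])
y_train = np.array([0., 4.])
print(gebv_ridge(M_train, y_train, np.array([[1.], [2.]])))  # ~[2. 4.]
```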
NASA Astrophysics Data System (ADS)
Balachandra, Anagi Manjula
Membrane-based separations are attractive in industrial processes because of their low energy costs and simple operation. However, low permeabilities often make membrane processes uneconomical. Since flux is inversely proportional to membrane thickness, composite membranes consisting of ultrathin, selective skins on highly permeable supports are required to simultaneously achieve high throughput and high selectivity. However, the synthesis of defect-free skins with thicknesses less than 50 nm is difficult, and thus flux is often limited. Layer-by-layer deposition of oppositely charged polyelectrolytes on porous supports is an attractive method to synthesize ultrathin ion-separation membranes with high flux and high selectivity. The ion-transport selectivity of multilayer polyelectrolyte membranes (MPMs) is primarily due to Donnan exclusion; therefore an increase in fixed charge density should yield high selectivity. However, control over charge density in MPMs is difficult because charges on polycations are electrostatically compensated by charges on polyanions, and the net charge in the bulk of these films is small. To overcome this problem, we introduced a templating method to create ion-exchange sites in the bulk of the membrane. This strategy involves alternating deposition of a Cu2+-poly(acrylic acid) complex and poly(allylamine hydrochloride) on a porous alumina support followed by removal of Cu2+ and deprotonation to yield free -COO- ion-exchange sites. Diffusion dialysis studies showed that the Cl-/SO42- selectivity of Cu2+-templated membranes is 4-fold higher than that of membranes prepared in the absence of Cu2+. Post-deposition cross-linking of these membranes by heat-induced amide bond formation further increased Cl-/SO42- selectivity to values as high as 600. Room-temperature, surface-initiated atom transfer radical polymerization (ATRP) provides another convenient method for formation of ultrathin polymer skins.
This process involves attachment of polymerization initiators to a porous alumina support and subsequent polymerization from these initiators. Because ATRP is a controlled polymerization technique, it yields well-defined polymer films with low polydispersity indices (narrow molecular weight distributions). Additionally, this method is attractive because film thickness can be easily controlled by adjusting polymerization time. Gas-permeability data showed that grafted poly(ethylene glycol dimethacrylate) membranes have a CO2/CH4 selectivity of 20, whereas poly(2-hydroxyethyl methacrylate) (PHEMA) films grown from a surface have negligible selectivity. However, derivatization of PHEMA with pentadecafluorooctanoyl chloride increases the solubility of CO2 in the membrane and results in a CO2/CH4 selectivity of 9. Although composite PHEMA membranes have no significant gas-transport selectivity, diffusion dialysis studies with PHEMA membranes showed moderate ion-transport selectivities. Cross-linking of PHEMA membranes by reaction with succinyl chloride greatly enhanced anion-transport selectivities while maintaining reasonable flux. The selectivities of these systems demonstrate that alternating polyelectrolyte deposition and surface-initiated ATRP are indeed capable of forming ultrathin, defect-free membrane skins that can potentially be modified for specific separations.
Feature Selection for Classification of Polar Regions Using a Fuzzy Expert System
NASA Technical Reports Server (NTRS)
Penaloza, Mauel A.; Welch, Ronald M.
1996-01-01
Labeling, feature selection, and the choice of classifier are critical elements for the classification of scenes and for image understanding. This study examines several methods for feature selection in polar regions, including the use of a fuzzy logic-based expert system for further refinement of a set of selected features. Six Advanced Very High Resolution Radiometer (AVHRR) Local Area Coverage (LAC) arctic scenes are classified into nine classes: water, snow/ice, ice cloud, land, thin stratus, stratus over water, cumulus over water, textured snow over water, and snow-covered mountains. Sixty-seven spectral and textural features are computed and analyzed by the feature selection algorithms. The divergence, histogram analysis, and discriminant analysis approaches are intercompared for their effectiveness in feature selection. The fuzzy expert system method is used not only to determine the effectiveness of each approach in classifying polar scenes, but also to further reduce the features to a more optimal set. For each selection method, features are ranked from best to worst and the best half of the features are selected; rules using these selected features are then defined. The results of running the fuzzy expert system with these rules show that the divergence method produces the best set of features: not only does it produce the highest classification accuracy, it also has the lowest computation requirements. A reduction of the set of features produced by the divergence method using the fuzzy expert system results in an overall classification accuracy of over 95%. However, this increase in accuracy comes at a high computational cost.
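The divergence criterion for a single feature is classically computed as the symmetric Kullback-Leibler divergence between class-conditional Gaussians. The study does not print its formula, so the univariate variant below is an assumption, as are the function names:

```python
import numpy as np

def gaussian_divergence(x1, x2):
    """Symmetric KL divergence between univariate Gaussians fitted to
    feature samples from two classes: larger values mean the feature
    separates the classes better."""
    m1, m2 = np.mean(x1), np.mean(x2)
    v1, v2 = np.var(x1), np.var(x2)
    d2 = (m1 - m2) ** 2
    return 0.5 * ((v1 + d2) / v2 + (v2 + d2) / v1) - 1.0

def best_half(class1, class2):
    """Rank feature columns by divergence and keep the best half, as in
    the selection procedure described above."""
    A, B = np.asarray(class1, float), np.asarray(class2, float)
    scores = [gaussian_divergence(A[:, j], B[:, j]) for j in range(A.shape[1])]
    order = np.argsort(scores)[::-1]
    return sorted(order[: max(1, len(scores) // 2)].tolist())

# Column 0 differs strongly between classes; column 1 is identical.
f1 = np.array([[0., 5.], [1., 6.], [2., 7.]])
f2 = np.array([[10., 5.], [11., 6.], [12., 7.]])
print(best_half(f1, f2))  # [0]
```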
Vast Volatility Matrix Estimation using High Frequency Data for Portfolio Selection*
Fan, Jianqing; Li, Yingying; Yu, Ke
2012-01-01
Portfolio allocation with a gross-exposure constraint is an effective method for increasing the efficiency and stability of portfolio selection among a vast pool of assets, as demonstrated in Fan et al. (2011). The required high-dimensional volatility matrix can be estimated using high-frequency financial data. This enables us to better adapt to the local volatilities and local correlations among a vast number of assets and to increase significantly the sample size for estimating the volatility matrix. This paper studies volatility matrix estimation using high-dimensional high-frequency data from the perspective of portfolio selection. Specifically, we propose the use of "pairwise-refresh time" and "all-refresh time" methods based on the concept of "refresh time" proposed by Barndorff-Nielsen et al. (2008) for estimation of the vast covariance matrix and compare their merits in portfolio selection. We establish concentration inequalities for the estimates, which guarantee desirable properties of the estimated volatility matrix in vast asset allocation with gross-exposure constraints. Extensive numerical studies are made via carefully designed simulations. Compared with methods based on low-frequency daily data, our methods can capture the most recent trend of time-varying volatility and correlation, and hence provide more accurate guidance for portfolio allocation in the next time period. The advantage of using high-frequency data is significant in our simulation and empirical studies, which consist of 50 simulated assets and the 30 constituent stocks of the Dow Jones Industrial Average index. PMID:23264708
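The "refresh time" scheme underlying both estimators can be sketched for two assets: each refresh time is the first instant at which every asset has traded at least once since the previous refresh. A minimal sketch (function name is ours):

```python
def refresh_times(times_a, times_b):
    """Refresh times for two assets, after Barndorff-Nielsen et al.'s
    scheme as used in the pairwise-refresh method: starting from the
    previous refresh, advance each asset to its next observation and
    take the later of the two.  times_* are sorted timestamp lists."""
    refreshed, ia, ib = [], 0, 0
    last = float("-inf")
    while ia < len(times_a) and ib < len(times_b):
        # advance each asset past observations at or before `last`
        while ia < len(times_a) and times_a[ia] <= last:
            ia += 1
        while ib < len(times_b) and times_b[ib] <= last:
            ib += 1
        if ia == len(times_a) or ib == len(times_b):
            break
        last = max(times_a[ia], times_b[ib])
        refreshed.append(last)
    return refreshed

# Asset A trades at 1, 2, 5, 9; asset B at 2, 3, 6, 7:
print(refresh_times([1, 2, 5, 9], [2, 3, 6, 7]))  # [2, 5, 9]
```

Synchronizing on these times before computing realized covariance is what lets the estimator use asynchronous tick data; the pairwise variant applies this to each asset pair separately to retain more observations.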
Mao, Yong; Zhou, Xiao-Bo; Pi, Dao-Ying; Sun, You-Xian; Wong, Stephen T C
2005-10-01
In microarray-based cancer classification, gene selection is an important issue owing to the large number of variables and small number of samples, as well as its non-linearity. It is difficult to obtain satisfactory results with conventional linear statistical methods. Recursive feature elimination based on support vector machines (SVM RFE) is an effective algorithm for gene selection and cancer classification, which are integrated into a consistent framework. In this paper, we propose a new method to select the parameters of the aforementioned algorithm implemented with Gaussian kernel SVMs, using a genetic algorithm to search for a pair of optimal parameters, as a better alternative to the common practice of selecting the apparently best parameters. Fast implementation issues for this method are also discussed for pragmatic reasons. The proposed method was tested on two representative datasets, for hereditary breast cancer and acute leukaemia. The experimental results indicate that the proposed method performs well in selecting genes and achieves high classification accuracies with these genes.
Chevassus, Bernard; Quillet, Edwige; Krieg, Francine; Hollebecq, Marie-Gwénola; Mambrini, Muriel; Fauré, André; Labbé, Laurent; Hiseux, Jean-Pierre; Vandeputte, Marc
2004-01-01
Growth rate is the main breeding goal of fish breeders, but individual selection has often shown poor responses in fish species. The PROSPER method was developed to overcome possible factors that may contribute to this low success, using (1) a variable base population and a high number of breeders (Ne > 100), (2) selection within groups with low non-genetic effects and (3) repeated growth challenges. Using calculations, we show that individual selection within groups, with appropriate management of maternal effects, can be superior to mass selection as soon as the maternal effect ratio exceeds 0.15, when heritability is 0.25. Practically, brown trout were selected on length at the age of one year with the PROSPER method. The genetic gain was evaluated against an unselected control line. After four generations, the mean response per generation in length at one year was 6.2% of the control mean, while the mean correlated response in weight was 21.5% of the control mean per generation. At the 4th generation, selected fish also appeared to be leaner than control fish when compared at the same size, and the response on weight was maximal (≈130% of the control mean) between 386 and 470 days post fertilisation. This high response is promising; however, the key points of the method have to be investigated in more detail. PMID:15496285
Selective epitaxy using the GILD process
Weiner, Kurt H.
1992-01-01
The present invention comprises a method of selective epitaxy on a semiconductor substrate. The present invention provides a method of selectively forming high quality, thin GeSi layers in a silicon circuit, and a method for fabricating smaller semiconductor chips with a greater yield (more error-free chips) at a lower cost. The method comprises forming an upper layer over a substrate, and depositing a reflectivity mask which is then removed over selected sections. Using a laser to melt the unmasked sections of the upper layer, the semiconductor material in the upper layer is heated and diffused into the substrate semiconductor material. By varying the amount of laser radiation, the epitaxial layer is formed to a controlled depth which may be very thin. When cooled, a single crystal epitaxial layer is formed over the patterned substrate. The present invention provides the ability to selectively grow layers of mixed semiconductors over patterned substrates, such as a layer of Ge(x)Si(1-x) grown over silicon. Such a process may be used to manufacture small transistors that have a narrow base, heavy doping, and high gain. The narrowness allows a faster transistor, and the heavy doping reduces the resistance of the narrow layer. The process does not require high temperature annealing; therefore materials such as aluminum can be used. Furthermore, the process may be used to fabricate diodes that have a high reverse breakdown voltage and a low reverse leakage current.
Rashev, Svetoslav; Moule, David C; Rashev, Vladimir
2012-11-01
We perform converged high precision variational calculations to determine the frequencies of a large number of vibrational levels in S(0) D(2)CO, extending from low to very high excess vibrational energies. For the calculations we use our specific vibrational method (recently employed for studies on H(2)CO), consisting of a combination of a search/selection algorithm and a Lanczos iteration procedure. Using the same method we perform large scale converged calculations on the vibrational level spectral structure and fragmentation at selected highly excited overtone states, up to excess vibrational energies of ∼17,000 cm(-1), in order to study the characteristics of intramolecular vibrational redistribution (IVR), vibrational level density and mode selectivity. Copyright © 2012 Elsevier B.V. All rights reserved.
Methods, apparatus and system for selective duplication of subtasks
Andrade Costa, Carlos H.; Cher, Chen-Yong; Park, Yoonho; Rosenburg, Bryan S.; Ryu, Kyung D.
2016-03-29
A method for selective duplication of subtasks in a high-performance computing system includes: monitoring a health status of one or more nodes in a high-performance computing system, where one or more subtasks of a parallel task execute on the one or more nodes; identifying one or more nodes as having a likelihood of failure which exceeds a first prescribed threshold; selectively duplicating the one or more subtasks that execute on the one or more nodes having a likelihood of failure which exceeds the first prescribed threshold; and notifying a messaging library that one or more subtasks were duplicated.
Method for construction of bacterial strains with increased succinic acid production
Donnelly, Mark I.; Sanville-Millard, Cynthia; Chatterjee, Ranjini
2000-01-01
A fermentation process for producing succinic acid is provided comprising selecting a bacterial strain that does not produce succinic acid in high yield, disrupting the normal regulation of sugar metabolism of said bacterial strain, and combining the mutant bacterial strain and selected sugar in anaerobic conditions to facilitate production of succinic acid. Also provided is a method for changing low yield succinic acid producing bacteria to high yield succinic acid producing bacteria comprising selecting a bacterial strain having a phosphotransferase system and altering the phosphotransferase system so as to allow the bacterial strain to simultaneously metabolize different sugars.
2012-01-01
Background Discovering new biomarkers plays a great role in improving early diagnosis of hepatocellular carcinoma (HCC). The experimental determination of biomarkers requires a lot of time and money. This motivates the use of in-silico prediction of biomarkers to reduce the number of experiments required for detecting new ones. This is achieved by extracting the most representative genes in microarrays of HCC. Results In this work, we provide a method for extracting the differentially expressed genes, the up-regulated ones, that can be considered candidate biomarkers in high throughput microarrays of HCC. We examine the power of several gene selection methods (such as Pearson’s correlation coefficient, Cosine coefficient, Euclidean distance, Mutual information and Entropy with different estimators) in selecting informative genes. A biological interpretation of the highly ranked genes is done using the KEGG (Kyoto Encyclopedia of Genes and Genomes) pathways, ENTREZ and DAVID (Database for Annotation, Visualization, and Integrated Discovery) databases. The top ten genes selected using Pearson’s correlation coefficient and Cosine coefficient contained six genes that have been implicated in the genesis of cancer (often multiple cancers) in previous studies. A smaller number of genes was obtained by the other methods (4 genes using Mutual information, 3 genes using Euclidean distance and only one gene using Entropy). A better result was obtained by a hybrid approach based on intersecting the highly ranked genes in the output of all investigated methods. This hybrid combination yielded seven genes (2 genes for HCC and 5 genes in different types of cancer) in the top ten genes of the list of intersected genes. Conclusions To strengthen the effectiveness of the univariate selection methods, we propose a hybrid approach that intersects several of these methods in a cascaded manner.
This approach surpasses all of the univariate selection methods used individually, according to the biological interpretation and the examination of gene expression signal profiles. PMID:22867264
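The hybrid intersection step described above can be sketched as follows. The gene symbols and rankings here are made-up placeholders for illustration, not results from the study:

```python
def hybrid_intersection(rankings, top_n=10):
    """Intersect the top-n genes from several univariate rankings.

    `rankings` maps method name -> list of genes ordered best-first.
    A gene survives only if every method places it in its top n.
    """
    top_sets = [set(r[:top_n]) for r in rankings.values()]
    return set.intersection(*top_sets)

# Hypothetical per-method rankings (best first).
rankings = {
    "pearson": ["AFP", "GPC3", "TP53", "MYC"],
    "cosine": ["GPC3", "AFP", "KRT19", "TP53"],
    "mutual_info": ["TP53", "AFP", "GPC3", "EGFR"],
}
print(sorted(hybrid_intersection(rankings, top_n=3)))  # ['AFP', 'GPC3']
```

Genes agreed upon by every filter are the ones the abstract treats as the strongest biomarker candidates.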
Li, Jinyan; Fong, Simon; Wong, Raymond K; Millham, Richard; Wong, Kelvin K L
2017-06-28
Due to the high-dimensional characteristics of the dataset, we propose a new method based on the Wolf Search Algorithm (WSA) for optimising the feature selection problem. The proposed approach uses the natural strategy established by Charles Darwin; that is, 'It is not the strongest of the species that survives, but the most adaptable'. This means that in the evolution of a swarm, the elitists are motivated to quickly obtain more and better resources. The memory function helps the proposed method avoid repeat searches for the worst position in order to enhance the effectiveness of the search, while the binary strategy simplifies the feature selection problem into a similar problem of function optimisation. Furthermore, the wrapper strategy gathers these strengthened wolves with an extreme learning machine classifier to find a sub-dataset with a reasonable number of features that offers the maximum correctness of global classification models. The experimental results from the six public high-dimensional bioinformatics datasets tested demonstrate that the proposed method can outperform some conventional feature selection methods by up to 29% in classification accuracy, and outperform previous WSAs by up to 99.81% in computational time.
NASA Astrophysics Data System (ADS)
Liang, Lijiao; Zhen, Shujun; Huang, Chengzhi
2017-02-01
In this paper, a highly selective method is presented for the colorimetric determination of melamine using uracil 5′-triphosphate sodium modified gold nanoparticles (UTP-Au NPs). Specific hydrogen-bonding interaction between the uracil base (U) and melamine results in the aggregation of the Au NPs, displaying variations in localized surface plasmon resonance (LSPR) features such as a color change from red to blue and enhanced localized surface plasmon resonance light scattering (LSPR-LS) signals. Accordingly, the concentration of melamine can be quantified by the naked eye or by a spectrometric method. This method is simple, inexpensive, environmentally friendly and highly selective, and it has been successfully used for the detection of melamine in pretreated liquid milk products with high recoveries.
Novel selection methods for DNA-encoded chemical libraries.
Chan, Alix I; McGregor, Lynn M; Liu, David R
2015-06-01
Driven by the need for new compounds to serve as biological probes and leads for therapeutic development and the growing accessibility of DNA technologies including high-throughput sequencing, many academic and industrial groups have begun to use DNA-encoded chemical libraries as a source of bioactive small molecules. In this review, we describe the technologies that have enabled the selection of compounds with desired activities from these libraries. These methods exploit the sensitivity of in vitro selection coupled with DNA amplification to overcome some of the limitations and costs associated with conventional screening methods. In addition, we highlight newer techniques with the potential to be applied to the high-throughput evaluation of DNA-encoded chemical libraries. Copyright © 2015 Elsevier Ltd. All rights reserved.
Xu, Jiucheng; Mu, Huiyu; Wang, Yun; Huang, Fangzhou
2018-01-01
The selection of feature genes with high recognition ability from the gene expression profiles has gained great significance in biology. However, most of the existing methods have a high time complexity and poor classification performance. Motivated by this, an effective feature selection method, called supervised locally linear embedding and Spearman's rank correlation coefficient (SLLE-SC2), is proposed which is based on the concept of locally linear embedding and correlation coefficient algorithms. Supervised locally linear embedding takes into account class label information and improves the classification performance. Furthermore, Spearman's rank correlation coefficient is used to remove the coexpression genes. The experiment results obtained on four public tumor microarray datasets illustrate that our method is valid and feasible. PMID:29666661
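The Spearman-based redundancy filter (the second stage of SLLE-SC2) can be sketched as below. The tie-free rank computation, the greedy keep-or-drop order, and the 0.9 threshold are our simplifying assumptions, not details from the paper:

```python
def rank(values):
    """Ranks 1..n of a sequence (assumes no ties, for brevity)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for pos, i in enumerate(order):
        r[i] = pos + 1
    return r

def spearman(x, y):
    """Spearman's rank correlation via the classic d^2 formula."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

def drop_coexpressed(genes, expr, threshold=0.9):
    """Greedily keep a gene only if |rho| with every kept gene is low."""
    kept = []
    for g in genes:
        if all(abs(spearman(expr[g], expr[k])) < threshold for k in kept):
            kept.append(g)
    return kept

# Toy expression profiles: g2 is monotone in g1 (rho = 1), so it is dropped.
expr = {
    "g1": [1.0, 2.0, 3.0, 4.0],
    "g2": [2.1, 4.0, 6.2, 8.5],
    "g3": [4.0, 1.0, 3.0, 2.0],
}
print(drop_coexpressed(["g1", "g2", "g3"], expr))  # ['g1', 'g3']
```

In practice a tie-aware implementation (e.g. average ranks) would replace the simplified `rank` above.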
ERIC Educational Resources Information Center
Hsiao, Hsi-Chi; Lee, Ming-Chao; Tu, Ya-Ling
2013-01-01
Deregulation has formed the primary core of education reform in Taiwan in the past decade. The principal selection system was one of the specific recommendations in the deregulation of education. The method of designation of senior high school principals has changed from being "appointed" to being "selected." The issue as to…
Zeng, Xueqiang; Luo, Gang
2017-12-01
Machine learning is broadly used for clinical data analysis. Before training a model, a machine learning algorithm must be selected. Also, the values of one or more model parameters termed hyper-parameters must be set. Selecting algorithms and hyper-parameter values requires advanced machine learning knowledge and many labor-intensive manual iterations. To lower the bar to machine learning, miscellaneous automatic selection methods for algorithms and/or hyper-parameter values have been proposed. Existing automatic selection methods are inefficient on large data sets. This poses a challenge for using machine learning in the clinical big data era. To address the challenge, this paper presents progressive sampling-based Bayesian optimization, an efficient and automatic selection method for both algorithms and hyper-parameter values. We report an implementation of the method. We show that compared to a state-of-the-art automatic selection method, our method can significantly reduce search time, classification error rate, and standard deviation of error rate due to randomization. This is major progress towards enabling fast turnaround in identifying high-quality solutions required by many machine learning-based clinical data analysis tasks.
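The progressive-sampling idea can be illustrated with a toy successive-halving loop: evaluate many candidate configurations on a small training sample, discard the weaker half, and re-evaluate the survivors on progressively larger samples. This is a generic sketch, not the authors' Bayesian-optimization implementation; `train_eval` and the schedule constants are our own:

```python
def progressive_search(train_eval, configs, start=100, full=10000, factor=4):
    """Progressively grow the training sample, keeping only the
    better-performing half of the configurations at each stage.

    `train_eval(config, n)` returns an error estimate after training
    on n examples; smaller is better.
    """
    n = start
    while len(configs) > 1 and n < full:
        scored = sorted(configs, key=lambda c: train_eval(c, n))
        configs = scored[: max(1, len(scored) // 2)]  # halve the pool
        n = min(full, n * factor)                      # grow the sample
    return configs[0]

# Toy objective: configs are regularization strengths; the error shrinks
# toward |c - 0.3| as the sample size n grows.
best = progressive_search(
    lambda c, n: abs(c - 0.3) + 1.0 / n,
    configs=[0.01, 0.1, 0.3, 1.0, 3.0],
)
print(best)  # 0.3
```

The payoff is that expensive full-data training is reserved for the few configurations that already look promising on cheap subsamples.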
Bacteriophage vehicles for phage display: biology, mechanism, and application.
Ebrahimizadeh, Walead; Rajabibazl, Masoumeh
2014-08-01
The phage display technique is a powerful tool for the selection of various biological agents. This technique allows the construction of large libraries from the antibody repertoire of different hosts and provides a fast and high-throughput selection method. Specific antibodies can be isolated based on distinctive characteristics from a library consisting of millions of members. These features have made phage display technology the preferred method for antibody selection and engineering. There are several phage display methods available, and each has its unique merits and applications. Selection of an appropriate display technique requires basic knowledge of the available methods and their mechanisms. In this review, we describe different phage display techniques, available bacteriophage vehicles, and their mechanisms.
Kramer, S; Blaschke, G
2001-02-10
A sensitive high-performance liquid chromatographic method has been developed for the determination of the beta2-selective adrenergic agonist fenoterol in human plasma. To improve the sensitivity of the method, fenoterol was derivatized with N-(chloroformyl)-carbazole prior to HPLC analysis, yielding highly fluorescent derivatives. The assay involves protein precipitation with acetonitrile and liquid-liquid extraction of fenoterol from plasma with isobutanol under alkaline conditions, followed by derivatization with N-(chloroformyl)-carbazole. Reversed-phase liquid chromatographic determination of the fenoterol derivative was performed using a column-switching system consisting of a LiChrospher 100 RP 18 and a LiChrospher RP-Select B column with acetonitrile, methanol and water as the mobile phase. The limit of quantitation in human plasma was 376 pg fenoterol/ml. The method was successfully applied to the assay of fenoterol in patient plasma.
Algamal, Z Y; Lee, M H
2017-01-01
A high-dimensional quantitative structure-activity relationship (QSAR) classification model typically contains a large number of irrelevant and redundant descriptors. In this paper, a new design of descriptor selection for the QSAR classification model estimation method is proposed by adding a new weight inside L1-norm. The experimental results of classifying the anti-hepatitis C virus activity of thiourea derivatives demonstrate that the proposed descriptor selection method in the QSAR classification model performs effectively and competitively compared with other existing penalized methods in terms of classification performance on both the training and the testing datasets. Moreover, it is noteworthy that the results obtained in terms of stability test and applicability domain provide a robust QSAR classification model. It is evident from the results that the developed QSAR classification model could conceivably be employed for further high-dimensional QSAR classification studies.
Li, Juan; Jiang, Yue; Fan, Qi; Chen, Yang; Wu, Ruanqi
2014-05-05
This paper establishes a high-throughput and highly selective method to determine the impurity oxidized glutathione (GSSG) and the radial tensile strength (RTS) of reduced glutathione (GSH) tablets based on near infrared (NIR) spectroscopy and partial least squares (PLS). In order to build and evaluate the calibration models, the NIR diffuse reflectance spectra (DRS) and transmittance spectra (TS) of 330 GSH tablets were accurately measured using optimized parameter values. For analyzing the GSSG or RTS of GSH tablets, the NIR-DRS or NIR-TS were selected, subdivided reasonably into calibration and prediction sets, and processed appropriately with chemometric techniques. After selecting spectral sub-ranges and discarding spectral outliers, the PLS calibration models were built and the factor numbers were optimized. Then, the PLS models were evaluated by the root mean square errors of calibration (RMSEC), cross-validation (RMSECV) and prediction (RMSEP), and by the correlation coefficients of calibration (R(c)) and prediction (R(p)). The results indicate that the proposed models have good performance. It is thus clear that NIR-PLS can simultaneously, selectively, nondestructively and rapidly analyze the GSSG and RTS of GSH tablets, even though the content of the GSSG impurity was quite low while that of the GSH active pharmaceutical ingredient (API) was quite high. This strategy can be an important complement to the common NIR methods used in the on-line analysis of APIs in pharmaceutical preparations, and this work expands NIR applications to high-throughput and extraordinarily selective analysis. Copyright © 2014 Elsevier B.V. All rights reserved.
Arendt, Cassandra S.; Ri, Keirei; Yates, Phillip A.; Ullman, Buddy
2007-01-01
We describe an efficient method for generating highly functional membrane proteins with variant amino acids at defined positions that couples a modified site-saturation strategy with functional genetic selection. We applied this method to the production of a cysteine-less variant of the Crithidia fasciculata inosine-guanosine permease CfNT2, in order to facilitate biochemical studies using thiol-specific modifying reagents. Of ten endogenous cysteine residues in CfNT2, two cannot be replaced with serine or alanine without loss of function. High-quality single- and double-mutant libraries were produced by combining a previously reported site-saturation mutagenesis scheme based on the Quikchange method with a novel gel purification step that effectively eliminated template DNA from the products. Following selection for functional complementation in S. cerevisiae cells auxotrophic for purines, several highly functional non-cysteine substitutions were efficiently identified at each desired position, allowing the construction of cysteine-less variants of CfNT2 that retained wild-type affinity for inosine. This combination of an improved site-saturation mutagenesis technique and positive genetic selection provides a simple and efficient means to identify functional and perhaps unexpected amino acid variants at a desired position. PMID:17481563
Using Deep Learning for Compound Selectivity Prediction.
Zhang, Ruisheng; Li, Juan; Lu, Jingjing; Hu, Rongjing; Yuan, Yongna; Zhao, Zhili
2016-01-01
Compound selectivity prediction plays an important role in identifying potential compounds that bind to the target of interest with high affinity. However, there is still a shortage of efficient and accurate computational approaches to analyze and predict compound selectivity. In this paper, we propose two methods to improve compound selectivity prediction. We employ an improved multitask learning method in Neural Networks (NNs), which not only incorporates both activity and selectivity for other targets, but also uses a probabilistic classifier with logistic regression. We further improve compound selectivity prediction by using multitask learning in Deep Belief Networks (DBNs), which can build a distributed representation model and improve the generalization of the shared tasks. In addition, we assign different weights to the auxiliary tasks that are related to the primary selectivity prediction task. In contrast to other related work, our methods greatly improve the accuracy of compound selectivity prediction; in particular, using multitask learning in DBNs with modified weights obtains the best performance.
A Ranking Approach to Genomic Selection.
Blondel, Mathieu; Onogi, Akio; Iwata, Hiroyoshi; Ueda, Naonori
2015-01-01
Genomic selection (GS) is a recent selective breeding method which uses predictive models based on whole-genome molecular markers. Until now, existing studies formulated GS as the problem of modeling an individual's breeding value for a particular trait of interest, i.e., as a regression problem. To assess predictive accuracy of the model, the Pearson correlation between observed and predicted trait values was used. In this paper, we propose to formulate GS as the problem of ranking individuals according to their breeding value. Our proposed framework allows us to employ machine learning methods for ranking which had previously not been considered in the GS literature. To assess ranking accuracy of a model, we introduce a new measure originating from the information retrieval literature called normalized discounted cumulative gain (NDCG). NDCG rewards more strongly models which assign a high rank to individuals with high breeding value. Therefore, NDCG reflects a prerequisite objective in selective breeding: accurate selection of individuals with high breeding value. We conducted a comparison of 10 existing regression methods and 3 new ranking methods on 6 datasets, consisting of 4 plant species and 25 traits. Our experimental results suggest that tree-based ensemble methods including McRank, Random Forests and Gradient Boosting Regression Trees achieve excellent ranking accuracy. RKHS regression and RankSVM also achieve good accuracy when used with an RBF kernel. Traditional regression methods such as Bayesian lasso, wBSR and BayesC were found less suitable for ranking. Pearson correlation was found to correlate poorly with NDCG. Our study suggests two important messages. First, ranking methods are a promising research direction in GS. Second, NDCG can be a useful evaluation measure for GS.
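The NDCG measure introduced above can be sketched as follows. Note that some formulations use the gain 2^rel - 1 rather than the raw trait value; the variant below uses the raw breeding value as the gain, so treat it as one common convention rather than the paper's exact definition:

```python
import math

def ndcg(predicted_scores, true_values, k=None):
    """Normalized discounted cumulative gain.

    Individuals are ranked by predicted score; each rank position
    contributes its true breeding value, discounted logarithmically.
    """
    k = k or len(true_values)
    order = sorted(range(len(true_values)),
                   key=lambda i: predicted_scores[i], reverse=True)
    dcg = sum(true_values[i] / math.log2(pos + 2)
              for pos, i in enumerate(order[:k]))
    ideal = sorted(true_values, reverse=True)
    idcg = sum(v / math.log2(pos + 2) for pos, v in enumerate(ideal[:k]))
    return dcg / idcg

# A perfect ranking of breeding values gives NDCG = 1.
print(ndcg([0.9, 0.5, 0.1], [3.0, 2.0, 1.0]))  # 1.0
```

Because the discount decays with rank, a model is rewarded most for placing high-breeding-value individuals at the very top, which is exactly the objective in selective breeding.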
Kushnir, Mark M; Nelson, Gordon J; Frank, Elizabeth L; Rockwood, Alan L
2016-01-01
Measurement of methylmalonic acid (MMA) plays an important role in the diagnosis of vitamin B12 deficiency. Vitamin B12 is an essential cofactor for the enzymatic carbon rearrangement of methylmalonyl-CoA (MMA-CoA) to succinyl-CoA (SA-CoA), and the lack of vitamin B12 leads to elevated concentrations of MMA. The presence of succinic acid (SA) complicates the analysis because the mass spectra of MMA and SA are indistinguishable when analyzed in negative ion mode, and the peaks are difficult to resolve chromatographically. We developed a method for the selective analysis of MMA that exploits the significant difference in the fragmentation patterns of the di-butyl derivatives of the isomers MMA and SA in a tandem mass spectrometer when analyzed in positive ion mode. The tandem mass spectra of the di-butyl derivatives of MMA and SA are very distinct; this allows selective analysis of MMA in the presence of SA. The instrumental analysis is performed using liquid chromatography-tandem mass spectrometry (LC-MS/MS) in positive ion mode, which, in combination with selective extraction of acidic compounds, is highly selective for organic acids with multiple carboxyl groups (dicarboxylic, tricarboxylic, etc.). In this method organic acids with a single carboxyl group are virtually undetectable in the mass spectrometer; the only organic acid, other than MMA, that is detected by this method is its isomer, SA. Quantitative measurement of MMA in this method is performed using a deconvolution algorithm, which mathematically resolves the signal corresponding to MMA and does not require chromatographic resolution of the MMA and SA peaks. Because of its high selectivity, the method utilizes isocratic chromatographic separation; reconditioning and re-equilibration of the chromatographic column between injections is unnecessary. These features of the method allow high-throughput analysis of MMA with an analysis cycle time of 1 min.
SNP selection and classification of genome-wide SNP data using stratified sampling random forests.
Wu, Qingyao; Ye, Yunming; Liu, Yang; Ng, Michael K
2012-09-01
For high dimensional genome-wide association (GWA) case-control data of complex disease, there is usually a large portion of single-nucleotide polymorphisms (SNPs) that are irrelevant to the disease. A simple random sampling method in random forests that uses the default mtry parameter to choose the feature subspace will select too many subspaces without informative SNPs. An exhaustive search for an optimal mtry is often required in order to include useful and relevant SNPs and get rid of the vast number of non-informative SNPs. However, it is too time-consuming and not favorable in GWA for high-dimensional data. The main aim of this paper is to propose a stratified sampling method for feature subspace selection to generate decision trees in a random forest for GWA high-dimensional data. Our idea is to design an equal-width discretization scheme for informativeness to divide SNPs into multiple groups. In feature subspace selection, we randomly select the same number of SNPs from each group and combine them to form a subspace to generate a decision tree. This stratified sampling procedure ensures that each subspace contains enough useful SNPs, avoids the very high computational cost of an exhaustive search for an optimal mtry, and maintains the randomness of a random forest. We employ two genome-wide SNP data sets (Parkinson case-control data comprising 408 803 SNPs and Alzheimer case-control data comprising 380 157 SNPs) to demonstrate that the proposed stratified sampling method is effective, and that it can generate a better random forest with higher accuracy and a lower error bound than those produced by Breiman's random forest generation method. For the Parkinson data, we also show some interesting genes identified by the method, which may be associated with neurological disorders and warrant further biological investigation.
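The stratified subspace draw described above can be sketched as below: SNPs are binned into equal-width informativeness groups, and the same number of SNPs is sampled from each group to form one tree's feature subspace. The per-SNP informativeness scores, group count, and names are illustrative assumptions:

```python
import random

def stratified_subspace(snp_scores, n_groups=3, per_group=2, seed=None):
    """Draw a feature subspace by sampling equally from equal-width
    bins of the SNPs' informativeness scores."""
    rng = random.Random(seed)
    lo, hi = min(snp_scores.values()), max(snp_scores.values())
    width = (hi - lo) / n_groups or 1.0
    groups = [[] for _ in range(n_groups)]
    for snp, score in snp_scores.items():
        idx = min(int((score - lo) / width), n_groups - 1)
        groups[idx].append(snp)
    # Sample the same number of SNPs from every informativeness stratum.
    subspace = []
    for g in groups:
        subspace += rng.sample(g, min(per_group, len(g)))
    return subspace

scores = {"s1": 0.0, "s2": 0.1, "s3": 0.5, "s4": 0.6, "s5": 0.9, "s6": 1.0}
print(stratified_subspace(scores, n_groups=3, per_group=1, seed=0))
```

Each subspace is then handed to a standard decision-tree learner; repeating the draw per tree preserves the forest's randomness while guaranteeing some informative SNPs in every tree.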
Single image super-resolution reconstruction algorithm based on edge selection
NASA Astrophysics Data System (ADS)
Zhang, Yaolan; Liu, Yijun
2017-05-01
Super-resolution (SR) has become increasingly important because it can generate high-quality high-resolution (HR) images from low-resolution (LR) input images. At present, much work concentrates on developing sophisticated image priors to improve image quality, while paying much less attention to estimating and incorporating the blur model, which can also affect the reconstruction results. We present a new reconstruction method based on edge selection. This method takes full account of the factors that affect blur kernel estimation and accurately estimates the blur process. When compared with state-of-the-art methods, our method achieves comparable performance.
Liu, Vincent; Song, Yong-Ak; Han, Jongyoon
2010-06-07
In this paper, we report a novel method for fabricating ion-selective membranes in poly(dimethylsiloxane) (PDMS)/glass-based microfluidic preconcentrators. Based on the concept of capillary valves, this fabrication method involves filling a lithographically patterned junction between two microchannels with an ion-selective material such as Nafion resin; subsequent curing results in a high aspect-ratio membrane for use in electrokinetic sample preconcentration. To demonstrate the concentration performance of this high-aspect-ratio, ion-selective membrane, we integrated the preconcentrator with a surface-based immunoassay for R-Phycoerythrin (RPE). Using a 1x PBS buffer system, the preconcentrator-enhanced immunoassay showed an approximately 100x improvement in sensitivity within 30 min. This is the first time that an electrokinetic microfluidic preconcentrator based on ion concentration polarization (ICP) has been used in high ionic strength buffer solutions to enhance the sensitivity of a surface-based immunoassay.
Hacker, David E; Hoinka, Jan; Iqbal, Emil S; Przytycka, Teresa M; Hartman, Matthew C T
2017-03-17
Highly constrained peptides such as the knotted peptide natural products are promising medicinal agents because of their impressive biostability and potent activity. Yet, libraries of highly constrained peptides are challenging to prepare. Here, we present a method which utilizes two robust, orthogonal chemical steps to create highly constrained bicyclic peptide libraries. This technology was optimized to be compatible with in vitro selections by mRNA display. We performed side-by-side monocyclic and bicyclic selections against a model protein (streptavidin). Both selections resulted in peptides with mid-nanomolar affinity, and the bicyclic selection yielded a peptide with remarkable protease resistance.
NASA Astrophysics Data System (ADS)
Bobrovnikov, S. M.; Gorlov, E. V.; Zharkov, V. I.
2018-05-01
A technique for increasing the selectivity of the method of detecting high-energy materials (HEMs) based on laser fragmentation of HEM molecules with subsequent laser excitation of fluorescence of the characteristic NO fragments from the first vibrational level of the ground state is suggested.
Selective oxoanion separation using a tripodal ligand
Custelcean, Radu; Moyer, Bruce A.; Rajbanshi, Arbin
2016-02-16
The present invention relates to urea-functionalized crystalline capsules self-assembled by sodium or potassium cation coordination and by hydrogen-bonding water bridges to selectively encapsulate tetrahedral divalent oxoanions from highly competitive aqueous alkaline solutions and methods using this system for selective anion separations from industrial solutions. The method involves competitive crystallizations using a tripodal tris(urea) functionalized ligand and, in particular, provides a viable approach to sulfate separation from nuclear wastes.
The Cross-Entropy Based Multi-Filter Ensemble Method for Gene Selection.
Sun, Yingqiang; Lu, Chengbo; Li, Xiaobo
2018-05-17
Gene expression profiles are characterized by high dimensionality, small sample sizes, and continuous-valued measurements, which makes classifying tumor samples from expression data a considerable challenge. This paper proposes a cross-entropy based multi-filter ensemble (CEMFE) method for microarray data classification. First, multiple filters are applied to the microarray data to obtain several pre-selected feature subsets, each with a different classification ability. The top N genes in the ranking of each subset are merged to form a new data set. Second, a cross-entropy criterion is used to remove redundant features from this set. Finally, a wrapper based on forward feature selection chooses the best feature subset. The experimental results show that the proposed method is more efficient than other gene selection methods and achieves higher classification accuracy with fewer characteristic genes.
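The pipeline described in this abstract (several filter rankings, a merged top-N pool, redundancy removal, then a forward-selection wrapper) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the two filter scores, the correlation-based redundancy criterion (a stand-in for the paper's cross-entropy step), and the leave-one-out nearest-centroid wrapper are all illustrative choices.

```python
import random
from statistics import mean, pstdev

random.seed(0)
n_genes = 12
y = [0] * 10 + [1] * 10
# Toy expression matrix: genes 0-2 are informative (shifted mean in class 1).
X = [[random.gauss(1.5 if (label == 1 and g < 3) else 0.0, 1.0)
      for g in range(n_genes)] for label in y]

def t_score(g):                      # filter 1: class-separation score
    a = [X[i][g] for i in range(len(y)) if y[i] == 0]
    b = [X[i][g] for i in range(len(y)) if y[i] == 1]
    return abs(mean(a) - mean(b)) / ((pstdev(a) + pstdev(b)) or 1e-9)

def range_score(g):                  # filter 2: crude spread score
    col = [row[g] for row in X]
    return max(col) - min(col)

# Step 1: merge the top-N genes of each filter ranking into one pool.
N = 4
pool = set(sorted(range(n_genes), key=t_score, reverse=True)[:N])
pool |= set(sorted(range(n_genes), key=range_score, reverse=True)[:N])

# Step 2: drop redundant genes (correlation proxy for the cross-entropy step).
def corr(g, h):
    a = [row[g] for row in X]; b = [row[h] for row in X]
    ma, mb = mean(a), mean(b)
    num = sum((x - ma) * (z - mb) for x, z in zip(a, b))
    den = (sum((x - ma) ** 2 for x in a) * sum((z - mb) ** 2 for z in b)) ** 0.5
    return num / den if den else 0.0

kept = []
for g in sorted(pool, key=t_score, reverse=True):
    if all(abs(corr(g, h)) < 0.9 for h in kept):
        kept.append(g)

# Step 3: forward-selection wrapper scored by leave-one-out nearest-centroid.
def loo_accuracy(feats):
    hits = 0
    for i in range(len(y)):
        cents = {c: [mean(X[j][g] for j in range(len(y)) if j != i and y[j] == c)
                     for g in feats] for c in (0, 1)}
        d = {c: sum((X[i][g] - cents[c][k]) ** 2 for k, g in enumerate(feats))
             for c in (0, 1)}
        hits += (min(d, key=d.get) == y[i])
    return hits / len(y)

selected, best = [], 0.0
while True:
    cand = max((g for g in kept if g not in selected),
               key=lambda g: loo_accuracy(selected + [g]), default=None)
    if cand is None or loo_accuracy(selected + [cand]) <= best:
        break
    selected.append(cand)
    best = loo_accuracy(selected)

print(selected, round(best, 2))
```

On this toy data the wrapper typically settles on a small subset drawn from the informative genes.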
Geng, Zhigeng; Wang, Sijian; Yu, Menggang; Monahan, Patrick O.; Champion, Victoria; Wahba, Grace
2017-01-01
In many scientific and engineering applications, covariates are naturally grouped. When group structures are available among covariates, people are usually interested in identifying both important groups and important variables within the selected groups. Among existing successful group variable selection methods, some fail to conduct within-group selection. Others are able to conduct both group and within-group selection, but the corresponding objective functions are non-convex, and such non-convexity may require extra numerical effort. In this article, we propose a novel Log-Exp-Sum (LES) penalty for group variable selection. The LES penalty is strictly convex. It can identify important groups as well as select important variables within the group. We develop an efficient group-level coordinate descent algorithm to fit the model. We also derive non-asymptotic error bounds and asymptotic group selection consistency for our method in the high-dimensional setting where the number of covariates can be much larger than the sample size. Numerical results demonstrate the good performance of our method in both variable selection and prediction. We applied the proposed method to an American Cancer Society breast cancer survivor dataset. The findings are clinically meaningful and may help design intervention programs to improve the quality of life of breast cancer survivors. PMID:25257196
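The key property claimed for the LES penalty is strict convexity while still acting at the within-group level. A minimal numerical sketch, assuming one plausible log-exp-sum form for a single group, P(beta) = lam * log(sum_j exp(|beta_j|)) (the paper's exact parameterization may differ), checks the midpoint convexity inequality on random points:

```python
import math
import random

def les_penalty(beta, lam=1.0):
    # Assumed log-exp-sum form for one group; the published LES penalty
    # may include additional scaling constants.
    return lam * math.log(sum(math.exp(abs(b)) for b in beta))

# log-sum-exp is convex and coordinatewise nondecreasing, and |b| is convex,
# so the composition is convex; verify the midpoint inequality numerically.
random.seed(1)
for _ in range(1000):
    x = [random.uniform(-3, 3) for _ in range(5)]
    y = [random.uniform(-3, 3) for _ in range(5)]
    mid = [(a + b) / 2 for a, b in zip(x, y)]
    assert les_penalty(mid) <= (les_penalty(x) + les_penalty(y)) / 2 + 1e-12
print("midpoint convexity holds on 1000 random pairs")
```

Unlike a group-lasso norm, this form keeps growing as mass concentrates on one coordinate, which is what lets it discriminate among variables inside a group.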
Nasir, Muhammad; Attique Khan, Muhammad; Sharif, Muhammad; Lali, Ikram Ullah; Saba, Tanzila; Iqbal, Tassawar
2018-02-21
Melanoma is the deadliest type of skin cancer, with the highest mortality rate; however, excision at an early stage implies a high survival rate, so early diagnosis is essential. Customary diagnostic methods are costly and cumbersome because they require experienced experts and a highly equipped environment. Recent advancements in computerized solutions for these diagnoses are highly promising, with improved accuracy and efficiency. In this article, we propose a method for the classification of melanoma and benign skin lesions. Our approach integrates preprocessing, lesion segmentation, feature extraction, feature selection, and classification. Preprocessing performs hair removal by DullRazor, and lesion texture and color information are utilized to enhance the lesion contrast. In lesion segmentation, a hybrid technique is implemented and the results are fused using the additive law of probability. A serial-based method is subsequently applied to extract and fuse traits such as color, texture, and HOG (shape) features. The fused features are then selected by a novel Boltzmann entropy method. Finally, the selected features are classified by a Support Vector Machine. The proposed method is evaluated on the publicly available data set PH2. Our approach has provided promising results of sensitivity 97.7%, specificity 96.7%, accuracy 97.5%, and F-score 97.5%, which are significantly better than the results of existing methods on the same data set. The proposed method thus detects and classifies melanoma significantly better than existing methods. © 2018 Wiley Periodicals, Inc.
Colorimetric and fluorescent detection of hydrazine with high sensitivity and excellent selectivity
NASA Astrophysics Data System (ADS)
Shi, Bingjie; Qi, Sujie; Yu, Mingming; Liu, Chunxia; Li, Zhanxian; Wei, Liuhe; Ni, Zhonghai
2018-01-01
It is critical to develop probes for rapid, selective, and sensitive detection of highly toxic hydrazine in both environmental and biological science. In this work, a novel colorimetric and off-on fluorescent probe was synthesized under mild conditions for rapid recognition of hydrazine, with excellent selectivity over various other species, including biological species, metal ions, and anions. The linear quantification range was 1.5 × 10^-4 M to 3.2 × 10^-3 M for both the colorimetric and fluorescent methods, with a detection limit as low as 46.2 μM.
NASA Astrophysics Data System (ADS)
Chen, Dongyue; Lin, Jianhui; Li, Yanping
2018-06-01
Complementary ensemble empirical mode decomposition (CEEMD) was developed to address the mode-mixing problem of the empirical mode decomposition (EMD) method. Compared with ensemble empirical mode decomposition (EEMD), CEEMD reduces residue noise in the signal reconstruction. Both CEEMD and EEMD need a sufficiently large ensemble number to reduce the residue noise, which makes them computationally expensive. Moreover, the selection of intrinsic mode functions (IMFs) for further analysis usually depends on experience. A modified CEEMD method and an IMF evaluation index are proposed with the aim of reducing the computational cost and selecting IMFs automatically. A simulated signal and in-service high-speed train gearbox vibration signals are employed to validate the proposed method. The results demonstrate that the modified CEEMD can decompose the signal efficiently at a lower computational cost, and that the IMF evaluation index can select the meaningful IMFs automatically.
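The complementary-pair idea behind CEEMD can be illustrated with a toy decomposer: for each ensemble member, one noise realization is both added and subtracted, both signals are decomposed, and the components are averaged over the whole ensemble. The two-band "decomposer" below is a crude moving-average stand-in for real EMD sifting, used only to keep the sketch self-contained; the exact cancellation of the added noise in the reconstruction is the point being demonstrated.

```python
import math
import random

def crude_decompose(x, w=5):
    """Stand-in for EMD: split into a moving-average trend and a detail."""
    n = len(x)
    trend = [sum(x[max(0, i - w):i + w + 1]) / len(x[max(0, i - w):i + w + 1])
             for i in range(n)]
    detail = [a - b for a, b in zip(x, trend)]
    return detail, trend            # "IMF" and residue

def ceemd(x, ensemble=50, eps=0.2, seed=0):
    rng = random.Random(seed)
    n = len(x)
    acc_d, acc_t = [0.0] * n, [0.0] * n
    for _ in range(ensemble):
        noise = [rng.gauss(0, eps) for _ in range(n)]
        for sign in (+1, -1):       # complementary noise pair
            xs = [a + sign * b for a, b in zip(x, noise)]
            d, t = crude_decompose(xs)
            acc_d = [u + v for u, v in zip(acc_d, d)]
            acc_t = [u + v for u, v in zip(acc_t, t)]
    m = 2 * ensemble
    return [v / m for v in acc_d], [v / m for v in acc_t]

x = [math.sin(0.3 * i) + 0.5 * math.sin(2.0 * i) for i in range(200)]
detail, trend = ceemd(x)
recon = [a + b for a, b in zip(detail, trend)]
err = max(abs(a - b) for a, b in zip(recon, x))
print("max reconstruction error:", err)   # paired noise cancels in the sum
assert err < 1e-9
```

Because (x + n) and (x - n) appear in pairs, the injected noise cancels exactly in the averaged reconstruction, which is the residue-noise advantage CEEMD has over plain EEMD.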
NASA Astrophysics Data System (ADS)
Viironen, K.; Marín-Franch, A.; López-Sanjuan, C.; Varela, J.; Chaves-Montero, J.; Cristóbal-Hornillos, D.; Molino, A.; Fernández-Soto, A.; Vilella-Rojo, G.; Ascaso, B.; Cenarro, A. J.; Cerviño, M.; Cepa, J.; Ederoclite, A.; Márquez, I.; Masegosa, J.; Moles, M.; Oteo, I.; Pović, M.; Aguerri, J. A. L.; Alfaro, E.; Aparicio-Villegas, T.; Benítez, N.; Broadhurst, T.; Cabrera-Caño, J.; Castander, J. F.; Del Olmo, A.; González Delgado, R. M.; Husillos, C.; Infante, L.; Martínez, V. J.; Perea, J.; Prada, F.; Quintana, J. M.
2015-04-01
Context. Most observational results on the high redshift restframe UV-bright galaxies are based on samples pinpointed using the so-called dropout technique or Ly-α selection. However, the availability of multifilter data now allows the dropout selections to be replaced by direct methods based on photometric redshifts. In this paper we present the methodology to select and study the population of high redshift galaxies in the ALHAMBRA survey data. Aims: Our aim is to develop a less biased methodology than the traditional dropout technique to study the high redshift galaxies in ALHAMBRA and other multifilter data. Thanks to the wide area ALHAMBRA covers, we especially aim at contributing to the study of the brightest, least frequent, high redshift galaxies. Methods: The methodology is based on redshift probability distribution functions (zPDFs). It is shown how a clean galaxy sample can be obtained by selecting the galaxies with high integrated probability of being within a given redshift interval. However, reaching both a complete and clean sample with this method is challenging. Hence, a method to derive statistical properties by summing the zPDFs of all the galaxies in the redshift bin of interest is introduced. Results: Using this methodology we derive the galaxy rest frame UV number counts in five redshift bins centred at z = 2.5,3.0,3.5,4.0, and 4.5, being complete up to the limiting magnitude at mUV(AB) = 24, where mUV refers to the first ALHAMBRA filter redwards of the Ly-α line. With the wide field ALHAMBRA data we especially contribute to the study of the brightest ends of these counts, accurately sampling the surface densities down to mUV(AB) = 21-22. Conclusions: We show that using the zPDFs it is easy to select a very clean sample of high redshift galaxies. 
We also show that it is better to do statistical analysis of the properties of galaxies using a probabilistic approach, which takes into account both the incompleteness and contamination issues in a natural way. Based on observations collected at the German-Spanish Astronomical Center, Calar Alto, jointly operated by the Max-Planck-Institut für Astronomie (MPIA) at Heidelberg and the Instituto de Astrofísica de Andalucía (CSIC).
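The two uses of the zPDFs described above can be sketched as: (i) integrate each galaxy's zPDF over the target interval and keep objects whose integrated probability exceeds a threshold (a clean sample), and (ii) sum the zPDF mass in the interval over all galaxies to get a statistical count that carries the incompleteness and contamination naturally. The grid integration, Gaussian toy zPDFs, and the 0.9 threshold below are illustrative choices only, not the survey's actual values.

```python
import math

zgrid = [0.01 * i for i in range(701)]          # z = 0 .. 7, step 0.01

def gaussian_pdf(mu, sigma):
    pdf = [math.exp(-0.5 * ((z - mu) / sigma) ** 2) for z in zgrid]
    s = sum(pdf) * 0.01                          # normalize on the grid
    return [p / s for p in pdf]

def prob_in(pdf, zlo, zhi):
    """Integrated probability of the zPDF inside [zlo, zhi)."""
    return sum(p for z, p in zip(zgrid, pdf) if zlo <= z < zhi) * 0.01

# Toy catalogue: narrow PDFs mimic secure photo-z's, broad ones uncertain.
catalogue = [("a", gaussian_pdf(2.5, 0.05)),
             ("b", gaussian_pdf(2.6, 0.60)),
             ("c", gaussian_pdf(0.4, 0.05))]

zlo, zhi = 2.25, 2.75
# (i) Clean sample: high integrated probability of lying in the bin.
clean = [gid for gid, pdf in catalogue if prob_in(pdf, zlo, zhi) > 0.9]
# (ii) Statistical count: sum fractional contributions from every galaxy.
stat_count = sum(prob_in(pdf, zlo, zhi) for _, pdf in catalogue)
print(clean, round(stat_count, 2))
```

Galaxy "b" fails the clean cut but still contributes its fractional probability to the statistical count, which is exactly how the summed-zPDF approach avoids discarding uncertain objects.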
Quantitative analysis of benzodiazepines in vitreous humor by high-performance liquid chromatography
Bazmi, Elham; Behnoush, Behnam; Akhgari, Maryam; Bahmanabadi, Leila
2016-01-01
Objective: Benzodiazepines are frequently screened drugs in emergency toxicology, drugs-of-abuse testing, and forensic cases. As variations in benzodiazepine concentrations in biological samples during bleeding, postmortem changes, and redistribution can bias forensic medicine examinations, selecting a suitable sample and a validated, accurate method is essential for the quantitative analysis of this major drug category. The aim of this study was to develop a valid method for the determination of four benzodiazepines (flurazepam, lorazepam, alprazolam, and diazepam) in vitreous humor using liquid–liquid extraction and high-performance liquid chromatography. Methods: Sample preparation was carried out using liquid–liquid extraction with n-hexane:ethyl acetate and subsequent detection by a high-performance liquid chromatography method coupled to a diode array detector. The method was applied to quantify benzodiazepines in 21 authentic vitreous humor samples. A linear curve for each drug was obtained within the range of 30–3000 ng/mL with a coefficient of correlation higher than 0.99. Results: The limits of detection and quantitation were 30 and 100 ng/mL, respectively, for all four drugs. The method showed appropriate intra- and inter-day precision (coefficient of variation < 10%). Benzodiazepine recoveries were estimated to be over 80%. The method showed high selectivity; no additional peaks due to interfering substances were observed in the samples. Conclusion: The present method is selective, sensitive, accurate, and precise for the quantitative analysis of benzodiazepines in vitreous humor samples in the forensic toxicology laboratory. PMID:27635251
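For context, limits of detection and quantitation in validations like this one are commonly estimated from the calibration curve as LOD = 3.3·σ/S and LOQ = 10·σ/S, where σ is the residual standard deviation and S the slope. The sketch below uses made-up calibration points, not data from this study:

```python
# Least-squares calibration line and ICH-style LOD/LOQ estimates.
conc   = [30, 100, 300, 1000, 3000]          # ng/mL (hypothetical levels)
signal = [0.9, 3.1, 9.2, 30.5, 91.0]         # detector response (made up)

n = len(conc)
mx, my = sum(conc) / n, sum(signal) / n
sxx = sum((x - mx) ** 2 for x in conc)
slope = sum((x - mx) * (y - my) for x, y in zip(conc, signal)) / sxx
intercept = my - slope * mx

# Residual standard deviation of the regression (n - 2 degrees of freedom).
resid = [y - (slope * x + intercept) for x, y in zip(conc, signal)]
sigma = (sum(r * r for r in resid) / (n - 2)) ** 0.5

lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(f"slope={slope:.4f}  LOD={lod:.1f} ng/mL  LOQ={loq:.1f} ng/mL")
```

By construction LOQ/LOD = 10/3.3, so the two limits always move together when the calibration noise or sensitivity changes.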
Method For Manufacturing Articles For High Temperature Use, And Articles Made Therewith
Wang, Hongyu; Mitchell, David Joseph; Lau, Yuk-Chiu; Henry, Arnold Thomas
2006-02-28
A method for manufacturing an article for use in a high-temperature environment, and an article for use in such an environment, are presented. The method comprises providing a substrate; selecting a desired vertical crack density for a protective coating to be deposited on the substrate; providing a powder, wherein the powder has a size range selected to provide a coating having the desired vertical crack density; and applying a thermal-sprayed coating to the substrate, the coating having the desired vertical crack density, wherein the powder is used as a raw material for the coating.
Method For Manufacturing Articles For High Temperature Use, And Articles Made Therewith
Wang, Hongyu; Mitchell, David Joseph; Lau, Yuk-Chiu; Henry, Arnold Thomas
2005-03-15
A method for manufacturing an article for use in a high-temperature environment, and an article for use in such an environment, are presented. The method comprises providing a substrate; selecting a desired vertical crack density for a protective coating to be deposited on the substrate; providing a powder, wherein the powder has a size range selected to provide a coating having the desired vertical crack density; and applying a thermal-sprayed coating to the substrate, the coating having the desired vertical crack density, wherein the powder is used as a raw material for the coating.
Engineering a growth sensor to select intracellular antibodies in the cytosol of mammalian cells.
Nguyen, Thuy Duong; Takasuka, Hitoshi; Kaku, Yoshihiro; Inoue, Satoshi; Nagamune, Teruyuki; Kawahara, Masahiro
2017-07-01
Intracellular antibodies (intrabodies) are expected to function as therapeutics as well as tools for elucidating the in vivo function of proteins. In this study, we propose a novel intrabody selection method in the cytosol of mammalian cells that utilizes a growth signal induced by the interaction of the target antigen with an scFv-c-kit growth sensor. Here, we apply this method for the first time to select specific intrabodies against rabies virus nucleoprotein (RV-N). We successfully select antigen-specific intrabodies from a naïve synthetic library using phage panning followed by our growth sensor-based intracellular selection method, demonstrating the feasibility of the method. Additionally, we succeed in improving the response of the growth sensor by re-engineering the linker region of its construct. Collectively, the described selection method utilizing a growth sensor may become a highly efficient platform for the selection of functional intrabodies. Copyright © 2017 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
Wang, Haoping; Kang, Tiantian; Wang, Xiaoju; Feng, Liheng
2018-07-01
A simple Schiff base comprising tris(2-aminoethyl)amine and salicylaldehyde was designed and synthesized in a one-step reaction. Although this compound has poor selectivity for metal ions in acetonitrile, it shows highly selective and sensitive detection of Zn(II) ions when the solvent polarity (the volume ratio of CH3CN/H2O) is adjusted. In other words, this work provides a facile way to transform a fluorescent probe from poor to excellent selectivity. The binding mode of the probe with Zn(II) was verified by ¹H NMR and MS assays. The stoichiometric ratio of the probe with Zn(II) is 1:1 (mole), which matches the Job-plot assay. The detection limit of the probe for Zn(II) is as low as 1 × 10^-8 mol/L. The electrochemical properties of the probe bound to Zn(II) were investigated by cyclic voltammetry, and the results agreed with theoretical calculations performed with the Gaussian 09 software. The probe could be applied to Zn(II) detection in practical samples and biological systems. The main contribution of this work lies in providing a very simple way to transform the selectivity of poorly selective probes; it is a simple, easy, and low-cost route to highly selective fluorescence probes. Copyright © 2018 Elsevier B.V. All rights reserved.
Elastic SCAD as a novel penalization method for SVM classification tasks in high-dimensional data.
Becker, Natalia; Toedt, Grischa; Lichter, Peter; Benner, Axel
2011-05-09
Classification and variable selection play an important role in knowledge discovery in high-dimensional data. Although Support Vector Machine (SVM) algorithms are among the most powerful classification and prediction methods, with a wide range of scientific applications, the SVM does not include automatic feature selection, and therefore a number of feature selection procedures have been developed. Regularisation approaches extend the SVM to a feature selection method in a flexible way using penalty functions like LASSO, SCAD and Elastic Net. We propose a novel penalty function for SVM classification tasks, Elastic SCAD, a combination of SCAD and ridge penalties which overcomes the limitations of each penalty alone. Since SVM models are extremely sensitive to the choice of tuning parameters, we adopted an interval search algorithm, which finds a globally optimal solution more rapidly and precisely than a fixed grid search. Feature selection methods with combined penalties (Elastic Net and Elastic SCAD SVMs) are more robust to a change of model complexity than methods using single penalties. Our simulation study showed that the Elastic SCAD SVM outperformed the LASSO (L1) and SCAD SVMs. Moreover, the Elastic SCAD SVM provided sparser classifiers, in terms of the median number of features selected, than the Elastic Net SVM, and often predicted better than Elastic Net in terms of misclassification error. Finally, we applied the penalization methods described above to four publicly available breast cancer data sets. The Elastic SCAD SVM was the only method providing robust classifiers in both sparse and non-sparse situations. The proposed Elastic SCAD SVM algorithm provides the advantages of the SCAD penalty while avoiding its sparsity limitations for non-sparse data.
We were the first to demonstrate that the integration of the interval search algorithm and penalized SVM classification techniques provides fast solutions for the optimization of tuning parameters. The penalized SVM classification algorithms, as well as fixed grid and interval search for finding appropriate tuning parameters, are implemented in our freely available R package 'penalizedSVM'. We conclude that the Elastic SCAD SVM is a flexible and robust tool for classification and feature selection tasks in high-dimensional data such as microarray data sets.
Elastic SCAD as a novel penalization method for SVM classification tasks in high-dimensional data
2011-01-01
Background Classification and variable selection play an important role in knowledge discovery in high-dimensional data. Although Support Vector Machine (SVM) algorithms are among the most powerful classification and prediction methods with a wide range of scientific applications, the SVM does not include automatic feature selection and therefore a number of feature selection procedures have been developed. Regularisation approaches extend SVM to a feature selection method in a flexible way using penalty functions like LASSO, SCAD and Elastic Net. We propose a novel penalty function for SVM classification tasks, Elastic SCAD, a combination of SCAD and ridge penalties which overcomes the limitations of each penalty alone. Since SVM models are extremely sensitive to the choice of tuning parameters, we adopted an interval search algorithm, which in comparison to a fixed grid search finds rapidly and more precisely a global optimal solution. Results Feature selection methods with combined penalties (Elastic Net and Elastic SCAD SVMs) are more robust to a change of the model complexity than methods using single penalties. Our simulation study showed that Elastic SCAD SVM outperformed LASSO (L1) and SCAD SVMs. Moreover, Elastic SCAD SVM provided sparser classifiers in terms of median number of features selected than Elastic Net SVM and often better predicted than Elastic Net in terms of misclassification error. Finally, we applied the penalization methods described above on four publicly available breast cancer data sets. Elastic SCAD SVM was the only method providing robust classifiers in sparse and non-sparse situations. Conclusions The proposed Elastic SCAD SVM algorithm provides the advantages of the SCAD penalty and at the same time avoids sparsity limitations for non-sparse data. 
We were first to demonstrate that the integration of the interval search algorithm and penalized SVM classification techniques provides fast solutions on the optimization of tuning parameters. The penalized SVM classification algorithms as well as fixed grid and interval search for finding appropriate tuning parameters were implemented in our freely available R package 'penalizedSVM'. We conclude that the Elastic SCAD SVM is a flexible and robust tool for classification and feature selection tasks for high-dimensional data such as microarray data sets. PMID:21554689
Brock, Guy N; Shaffer, John R; Blakesley, Richard E; Lotz, Meredith J; Tseng, George C
2008-01-10
Gene expression data frequently contain missing values; however, most downstream analyses for microarray experiments require complete data. In the literature, many methods have been proposed to estimate missing values via the correlation patterns within the gene expression matrix. Each method has its own advantages, but the specific conditions under which each method is preferred remain largely unclear. In this report we describe an extensive evaluation of eight current imputation methods on multiple types of microarray experiments, including time series, multiple exposures, and multiple exposures × time series data. We then introduce two complementary selection schemes for determining the most appropriate imputation method for any given data set. We found that the optimal imputation algorithms (LSA, LLS, and BPCA) are all highly competitive with each other, and that no method is uniformly superior in all the data sets we examined. The success of each method can also depend on the underlying "complexity" of the expression data, where we take complexity to indicate the difficulty in mapping the gene expression matrix to a lower-dimensional subspace. We developed an entropy measure to quantify the complexity of expression matrices and found that, by incorporating this information, the entropy-based selection (EBS) scheme is useful for selecting an appropriate imputation algorithm. We further propose a simulation-based self-training selection (STS) scheme. This technique has been used previously for microarray data imputation, but for different purposes. The scheme selects the optimal or near-optimal method with high accuracy but at an increased computational cost. Our findings provide insight into the problem of which imputation method is optimal for a given data set. Three top-performing methods (LSA, LLS and BPCA) are competitive with each other.
Global-based imputation methods (PLS, SVD, BPCA) performed better on microarray data with lower complexity, while neighbour-based methods (KNN, OLS, LSA, LLS) performed better on data with higher complexity. We also found that the EBS and STS schemes serve as complementary and effective tools for selecting the optimal imputation algorithm.
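An entropy measure of matrix complexity of the kind referenced above can be illustrated with a normalized singular-value spectrum: with p_i = s_i^2 / Σ s_j^2, the quantity e = -(1/log n) Σ p_i log p_i lies in [0, 1], near 0 when one component dominates (low complexity, easy to map to a low-dimensional subspace) and near 1 when variance is spread evenly (high complexity). The exact normalization used by the authors may differ; this is a sketch of the idea only.

```python
import math

def spectrum_entropy(svals):
    """Entropy of a singular-value spectrum, normalized to [0, 1]."""
    total = sum(s * s for s in svals)
    p = [s * s / total for s in svals]      # fraction of variance per component
    h = -sum(q * math.log(q) for q in p if q > 0)
    return h / math.log(len(svals))

# One dominant component -> low complexity; flat spectrum -> high complexity.
low  = spectrum_entropy([10.0, 0.1, 0.1, 0.1])
high = spectrum_entropy([1.0, 1.0, 1.0, 1.0])
print(round(low, 3), round(high, 3))
```

In practice the singular values would come from an SVD of the (centered) expression matrix; here they are given directly to keep the example dependency-free.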
Contraceptive practices of women with epilepsy: Findings of the epilepsy birth control registry.
Herzog, Andrew G; Mandle, Hannah B; Cahill, Kaitlyn E; Fowler, Kristen M; Hauser, W Allen; Davis, Anne R
2016-04-01
To report the contraceptive practices of women with epilepsy (WWE) in the community, predictors of highly effective contraception use, and reasons WWE provide for the selection of a particular method. These cross-sectional data come from the Epilepsy Birth Control Registry (EBCR) web-based survey regarding the contraceptive practices of 1,144 WWE in the community, ages 18-47 years. We report demographic, epilepsy, and antiepileptic drug (AED) characteristics as well as contraceptive use. We determined the frequency of use of highly effective contraception use, that is, methods with failure rate <10%/year, and conducted binary logistic regression analysis to determine predictors of highly effective contraception use. We report frequencies of WWE who consult various health care providers regarding the selection of a method and the reasons cited for selection. Of the 796 WWE at risk of unintended pregnancy, 69.7% use what is generally considered to be highly effective contraception (hormonal, intrauterine device [IUD], tubal, vasectomy). Efficacy in WWE, especially for the 46.6% who use hormonal contraception, remains to be proven. Significant predictors of highly effective contraception use are insurance (insured 71.6% vs. noninsured 56.0%), race/ethnicity (Caucasian 71.3% vs. minority 51.0%), and age (38-47, 77.5%; 28-37, 71.8%; 18-27, 67.0%). Of the 87.2% who have a neurologist, only 25.4% consult them regarding selection of a method, although AED interaction is cited as the top reason for selection. The EBCR web-based survey is the first large-scale study of the contraceptive practices of WWE in the community. The findings suggest a need for the development of evidence-based guidelines that address the efficacy and safety of contraceptive methods in this special population, and for greater discourse between neurologists and WWE regarding contraception. Wiley Periodicals, Inc. © 2016 International League Against Epilepsy.
Shi, Lei; Wan, Youchuan; Gao, Xianjun
2018-01-01
In object-based image analysis of high-resolution images, the number of features can reach hundreds, so it is necessary to perform feature reduction prior to classification. In this paper, a feature selection method based on the combination of a genetic algorithm (GA) and tabu search (TS) is presented. The proposed GATS method aims to reduce the premature convergence of the GA by the use of TS. A prematurity index is first defined to judge the convergence situation during the search. When premature convergence does take place, an improved mutation operator is executed, in which TS is performed on individuals with higher fitness values. As for the other individuals with lower fitness values, mutation with a higher probability is carried out. Experiments using the proposed GATS feature selection method and three other methods, a standard GA, the multistart TS method, and ReliefF, were conducted on WorldView-2 and QuickBird images. The experimental results showed that the proposed method outperforms the other methods in terms of the final classification accuracy. PMID:29581721
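A rough skeleton of the GA-with-tabu-mutation idea is sketched below, assuming a prematurity index based on the population's fitness spread; the paper's actual index, operators, and parameters are not reproduced, and the feature-selection fitness is a synthetic stand-in. When the fitness spread collapses, the elite individuals get a tabu-style local search over single-bit neighbors, while the rest mutate with a raised probability.

```python
import random

random.seed(2)
N_FEAT = 16
GOOD = {0, 3, 5, 8}               # hypothetical informative features

def fitness(mask):
    hits = sum(1 for g in GOOD if mask[g])
    return hits - 0.1 * sum(mask)  # reward good picks, penalize subset size

def neighbors(mask):
    for i in range(N_FEAT):
        m = mask[:]
        m[i] ^= 1
        yield m

def tabu_search(mask, steps=10):
    tabu, best, cur = {tuple(mask)}, mask, mask
    for _ in range(steps):
        cand = [m for m in neighbors(cur) if tuple(m) not in tabu]
        if not cand:
            break
        cur = max(cand, key=fitness)        # best non-tabu move, even if worse
        tabu.add(tuple(cur))
        if fitness(cur) > fitness(best):
            best = cur
    return best

pop = [[random.randint(0, 1) for _ in range(N_FEAT)] for _ in range(20)]
for gen in range(30):
    pop.sort(key=fitness, reverse=True)
    fits = [fitness(m) for m in pop]
    premature = (fits[0] - fits[-1]) < 0.5  # toy prematurity index
    half = len(pop) // 2
    new = pop[:2]                           # elitism
    while len(new) < len(pop):
        a, b = random.sample(pop[:half], 2)
        cut = random.randrange(1, N_FEAT)
        child = a[:cut] + b[cut:]
        rate = 0.3 if premature else 0.05   # raise mutation when premature
        child = [bit ^ (random.random() < rate) for bit in child]
        new.append(child)
    if premature:                           # tabu-refine the elite instead
        new[:2] = [tabu_search(m) for m in new[:2]]
    pop = new

best = max(pop, key=fitness)
print(sorted(i for i, b in enumerate(best) if b), round(fitness(best), 2))
```

The division of labor mirrors the abstract: TS exploits around high-fitness individuals, while heavier mutation re-diversifies the low-fitness ones.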
2015-01-01
Background Investigations into novel biomarkers using omics techniques generate large amounts of data. Due to their size and numbers of attributes, these data are suitable for analysis with machine learning methods. A key component of typical machine learning pipelines for omics data is feature selection, which is used to reduce the raw high-dimensional data to a tractable number of features. Feature selection needs to balance the objective of using as few features as possible with that of maintaining high predictive power. This balance is crucial when the goal of data analysis is the identification of highly accurate but small panels of biomarkers with potential clinical utility. In this paper we propose a heuristic for the selection of very small feature subsets, via an iterative feature elimination process that is guided by rule-based machine learning, called RGIFE (Rule-guided Iterative Feature Elimination). We use this heuristic to identify putative biomarkers of osteoarthritis (OA), articular cartilage degradation, and synovial inflammation, using both proteomic and transcriptomic datasets. Results and discussion Our RGIFE heuristic increased the classification accuracies achieved on all datasets relative to using no feature selection, and performed well in a comparison with other feature selection methods. Using this method the datasets were reduced to a smaller number of genes or proteins, including those known to be relevant to OA, cartilage degradation and joint inflammation. The results show the RGIFE feature reduction method to be suitable for analysing both proteomic and transcriptomic data. Methods that generate large 'omics' datasets are increasingly being used in the area of rheumatology. Conclusions Feature reduction methods are advantageous for the analysis of omics data in the field of rheumatology, as the application of such techniques is likely to result in improvements in diagnosis, treatment and drug discovery. PMID:25923811
Self-adaptive method for high frequency multi-channel analysis of surface wave method
USDA-ARS?s Scientific Manuscript database
When the high frequency multi-channel analysis of surface waves (MASW) method is conducted to explore soil properties in the vadose zone, existing rules for selecting the near offset and spread lengths cannot satisfy the requirements of planar dominant Rayleigh waves for all frequencies of interest ...
Zhang, Yu; Wu, Jianxin; Cai, Jianfei
2016-05-01
In large-scale visual recognition and image retrieval tasks, feature vectors, such as Fisher vector (FV) or the vector of locally aggregated descriptors (VLAD), have achieved state-of-the-art results. However, the combination of the large numbers of examples and high-dimensional vectors necessitates dimensionality reduction, in order to reduce its storage and CPU costs to a reasonable range. In spite of the popularity of various feature compression methods, this paper shows that the feature (dimension) selection is a better choice for high-dimensional FV/VLAD than the feature (dimension) compression methods, e.g., product quantization. We show that strong correlation among the feature dimensions in the FV and the VLAD may not exist, which renders feature selection a natural choice. We also show that, many dimensions in FV/VLAD are noise. Throwing them away using feature selection is better than compressing them and useful dimensions altogether using feature compression methods. To choose features, we propose an efficient importance sorting algorithm considering both the supervised and unsupervised cases, for visual recognition and image retrieval, respectively. Combining with the 1-bit quantization, feature selection has achieved both higher accuracy and less computational cost than feature compression methods, such as product quantization, on the FV and the VLAD image representations.
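The combination described (importance-sort the dimensions, keep the top ones, then 1-bit quantize) can be sketched as below. The supervised importance score here is a simple between-class separation over within-class spread per dimension; the paper's actual criterion is not reproduced, and the data are synthetic rather than real FV/VLAD vectors.

```python
import random

random.seed(3)
D = 64                       # toy "FV/VLAD" dimensionality
INFORMATIVE = set(range(8))  # dims that actually separate the classes

def sample(label):
    base = 1.0 if label == 1 else -1.0
    return [random.gauss(base if d in INFORMATIVE else 0.0, 1.0)
            for d in range(D)]

data = [(sample(y), y) for y in [0, 1] * 50]

def importance(d):           # per-dimension class-separation score
    a = [v[d] for v, y in data if y == 0]
    b = [v[d] for v, y in data if y == 1]
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    spread = sum((x - ma) ** 2 for x in a) + sum((x - mb) ** 2 for x in b)
    return abs(ma - mb) / (spread / len(data)) ** 0.5

# Importance sorting: keep the top-k dimensions, discard the noisy rest.
k = 8
keep = sorted(range(D), key=importance, reverse=True)[:k]

# 1-bit quantization of the kept dimensions: sign only.
def compress(vec):
    return tuple(1 if vec[d] > 0 else 0 for d in keep)

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

# Leave-one-out nearest-neighbour classification in the compressed space.
codes = [(compress(v), y) for v, y in data]
hits = 0
for i, (c, y) in enumerate(codes):
    nn = min((codes[j] for j in range(len(codes)) if j != i),
             key=lambda e: hamming(e[0], c))
    hits += (nn[1] == y)
acc = hits / len(codes)
print("kept dims:", sorted(keep))
print("1-bit LOO accuracy:", acc)
print("storage: %d bits/vector vs %d floats" % (k, D))
```

Throwing away the noisy dimensions before quantizing, rather than compressing all of them together, is exactly the argument the abstract makes in favor of selection over compression.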
Salehi, Simin; Rasoul-Amini, Sara; Adib, Noushin; Shekarchi, Maryam
2016-08-01
In this study, a novel method is described for the selective quantitation of domperidone in biological matrices, applying molecularly imprinted polymers (MIPs) as a sample clean-up procedure with high-performance liquid chromatography coupled to a fluorescence detector. MIPs were synthesized with chloroform as the porogen, ethylene glycol dimethacrylate as the crosslinker, methacrylic acid as the monomer, and domperidone as the template molecule. The new imprinted polymer was used as a molecular sorbent for the separation of domperidone from serum. The molecular recognition properties, binding capacity, and selectivity of the MIPs were determined. The results demonstrated exceptional affinity for domperidone in biological fluids. The domperidone analytical method using MIPs was verified according to validation parameters such as selectivity, linearity (5–80 ng/mL, r² = 0.9977), precision, and accuracy (10–40 ng/mL; intra-day = 1.7–5.1%, inter-day = 4.5–5.9%, accuracy 89.07–98.9%). The limits of detection (LOD) and quantitation (LOQ) for domperidone were 0.0279 and 0.092 ng/mL, respectively. The simplicity and suitable validation parameters make this a highly valuable selective bioequivalence method for domperidone analysis in human serum. Copyright © 2016 Elsevier B.V. All rights reserved.
Composite membranes for fluid separations
Blume, Ingo; Peinemann, Klaus-Viktor; Pinnau, Ingo; Wijmans, Johannes G.
1992-01-01
A method for designing and making composite membranes having a microporous support membrane coated with a permselective layer. The method involves calculating the minimum thickness of the permselective layer such that the selectivity of the composite membrane is close to the intrinsic selectivity of the permselective layer. The invention also provides high performance membranes with optimized properties.
Composite membranes for fluid separations
Blume, Ingo; Peinemann, Klaus-Viktor; Pinnau, Ingo; Wijmans, Johannes G.
1991-01-01
A method for designing and making composite membranes having a microporous support membrane coated with a permselective layer. The method involves calculating the minimum thickness of the permselective layer such that the selectivity of the composite membrane is close to the intrinsic selectivity of the permselective layer. The invention also provides high performance membranes with optimized properties.
Composite membranes for fluid separations
Blume, Ingo; Peinemann, Klaus-Viktor; Pinnau, Ingo; Wijmans, Johannes G.
1990-01-01
A method for designing and making composite membranes having a microporous support membrane coated with a permselective layer. The method involves calculating the minimum thickness of the permselective layer such that the selectivity of the composite membrane is close to the intrinsic selectivity of the permselective layer. The invention also provides high performance membranes with optimized properties.
Recent Development in Chemical Depolymerization of Lignin: A Review
Wang, Hai; Tucker, Melvin; Ji, Yun
2013-01-01
This article reviews recent developments in the chemical depolymerization of lignin. Five types of treatment are discussed: base-catalyzed, acid-catalyzed, metal-catalyzed, ionic liquid-assisted, and supercritical fluid-assisted lignin depolymerization. The methods employed in this research are described, and the important results are noted. Generally, base-catalyzed and acid-catalyzed methods were straightforward, but their selectivity was low. The severe reaction conditions (high pressure, high temperature, and extreme pH) required specially designed reactors, which led to high facility and handling costs. Ionic liquid- and supercritical fluid-assisted lignin depolymerization had high selectivity, but the high costs of ionic liquid recycling and supercritical fluid facilities limited their application to commercial-scale biomass treatment. Metal-catalyzed depolymerization had great advantages because of its high selectivity toward certain monomeric compounds and much milder reaction conditions than base-catalyzed or acid-catalyzed depolymerization. It would be a great contribution to lignin conversion if appropriate catalysts were synthesized.
Detection of lead(II) ions with a DNAzyme and isothermal strand displacement signal amplification.
Li, Wenying; Yang, Yue; Chen, Jian; Zhang, Qingfeng; Wang, Yan; Wang, Fangyuan; Yu, Cong
2014-03-15
A DNAzyme-based method for the sensitive and selective quantification of lead(II) ions has been developed. A DNAzyme that requires Pb(2+) for activation was selected. An RNA-containing DNA substrate was cleaved by the DNAzyme in the presence of Pb(2+). The 2',3'-cyclic phosphate of the cleaved 5'-part of the substrate was efficiently removed by Exonuclease III. The remaining part of the single-stranded DNA (9 or 13 bases long) was subsequently used as the primer for the strand displacement amplification reaction (SDAR). The method is highly sensitive: 200 pM lead(II) could be easily detected. A number of interfering ions were tested, and the sensor showed good selectivity. Underground water samples were also tested, which demonstrated the feasibility of the current approach for real sample applications. Our method could plausibly be used to develop new DNAzyme- or aptazyme-based sensing methods for the quantification of other target analytes with high sensitivity and selectivity. © 2013 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Kariuki, Patrick N. K.; Bush, Elizabeth Danielle
2008-01-01
The purpose of this study was to examine the effects of Total Physical Response by Storytelling and the traditional teaching method on foreign language learning in a selected high school. The sample consisted of 30 students who were randomly selected and randomly assigned to experimental and control groups. The experimental group was taught using Total…
Scalable Production Method for Graphene Oxide Water Vapor Separation Membranes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fifield, Leonard S.; Shin, Yongsoon; Liu, Wei
Membranes for selective water vapor separation were assembled from graphene oxide suspension using techniques compatible with high-volume industrial production. The large-diameter graphene oxide flake suspensions were synthesized from graphite materials via relatively efficient chemical oxidation steps, with attention paid to maintaining flake size and achieving high graphene oxide concentrations. Graphene oxide membranes produced using scalable casting methods exhibited water vapor flux and water/nitrogen selectivity performance meeting or exceeding that of membranes produced using vacuum-assisted laboratory techniques. (PNNL-SA-117497)
Zhang, Xiaohua Douglas; Yang, Xiting Cindy; Chung, Namjin; Gates, Adam; Stec, Erica; Kunapuli, Priya; Holder, Dan J; Ferrer, Marc; Espeseth, Amy S
2006-04-01
RNA interference (RNAi) high-throughput screening (HTS) experiments carried out using large (>5000 short interfering [si]RNA) libraries generate a huge amount of data. In order to use these data to identify the most effective siRNAs tested, it is critical to adopt and develop appropriate statistical methods. To address the questions in hit selection of RNAi HTS, we proposed a quartile-based method which is robust to outliers, true hits, and nonsymmetrical data. We compared it with the more traditional tests, mean +/- k standard deviation (SD) and median +/- k median absolute deviation (MAD). The results suggested that the quartile-based method selected more hits than mean +/- k SD under the same preset error rate. The number of hits selected by median +/- k MAD was close to that by the quartile-based method. Further analysis suggested that the quartile-based method had the greatest power in detecting true hits, especially weak or moderate true hits. Our investigation also suggested that platewise analysis (determining effective siRNAs on a plate-by-plate basis) can adjust for systematic errors in different plates, while an experimentwise analysis, in which effective siRNAs are identified in an analysis of the entire experiment, cannot. However, experimentwise analysis may detect a cluster of true positive hits placed together in one or several plates, while platewise analysis may not. To display hit selection results, we designed a specific figure called a plate-well series plot. We thus suggest the following strategy for hit selection in RNAi HTS experiments. First, choose the quartile-based method, or median +/- k MAD, for identifying effective siRNAs. Second, perform the chosen method experimentwise on transformed/normalized data, such as percentage inhibition, to check the possibility of hit clusters. If a cluster of selected hits is observed, repeat the analysis based on untransformed data to determine whether the cluster is due to an artifact in the data.
If no clusters of hits are observed, select hits by performing platewise analysis on transformed data. Third, adopt the plate-well series plot to visualize both the data and the hit selection results, as well as to check for artifacts.
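The three hit-selection rules compared in this abstract can be sketched as lower-side cutoffs on an inhibition-type readout. This is a minimal illustration on synthetic plate data; the k and c values and the planted "hits" are assumptions for the demo, not the paper's settings.

```python
import numpy as np

def hit_cutoffs(x, k=3.0, c=1.5):
    """Lower-side hit-selection thresholds (illustrative).

    Returns three cutoffs below which a well would be called a hit:
    mean - k*SD, median - k*MAD, and the quartile-based Q1 - c*IQR.
    The latter two are robust to outliers, which inflate the SD.
    """
    mean_sd = x.mean() - k * x.std(ddof=1)
    mad = 1.4826 * np.median(np.abs(x - np.median(x)))  # scaled to match SD for normal data
    med_mad = np.median(x) - k * mad
    q1, q3 = np.percentile(x, [25, 75])
    quartile = q1 - c * (q3 - q1)
    return mean_sd, med_mad, quartile

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0, 1, 380), rng.normal(-6, 1, 4)])  # 4 strong "hits"
t_sd, t_mad, t_q = hit_cutoffs(x)
print(int((x < t_q).sum()))
```

Because the strong hits drag the mean down and inflate the SD, the mean - k*SD rule tends to miss weak hits that the robust cutoffs still catch, which is the paper's central point.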
Variable Selection through Correlation Sifting
NASA Astrophysics Data System (ADS)
Huang, Jim C.; Jojic, Nebojsa
Many applications of computational biology require a variable selection procedure to sift through a large number of input variables and select some smaller number that influence a target variable of interest. For example, in virology, only some small number of viral protein fragments influence the nature of the immune response during viral infection. Due to the large number of variables to be considered, a brute-force search for the subset of variables is in general intractable. To approximate this, methods based on ℓ1-regularized linear regression have been proposed and have been found to be particularly successful. It is well understood however that such methods fail to choose the correct subset of variables if these are highly correlated with other "decoy" variables. We present a method for sifting through sets of highly correlated variables which leads to higher accuracy in selecting the correct variables. The main innovation is a filtering step that reduces correlations among variables to be selected, making the ℓ1-regularization effective for datasets on which many methods for variable selection fail. The filtering step changes both the values of the predictor variables and output values by projections onto components obtained through a computationally-inexpensive principal components analysis. In this paper we demonstrate the usefulness of our method on synthetic datasets and on novel applications in virology. These include HIV viral load analysis based on patients' HIV sequences and immune types, as well as the analysis of seasonal variation in influenza death rates based on the regions of the influenza genome that undergo diversifying selection in the previous season.
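The filtering step described above can be sketched as projecting the top principal components out of both the predictors and the response before the L1 fit. This is an illustrative reconstruction of the idea, not the authors' code; the number of removed components, the lasso penalty, and the synthetic data are made-up values.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Lasso

def sift_then_lasso(X, y, n_remove=1, alpha=0.01):
    """Correlation-sifting-style filtering before L1 regression (illustrative).

    Projects the top principal components out of both predictors and response,
    reducing the shared correlation that makes the lasso pick "decoy"
    variables, then fits an L1-regularized model on the residuals.
    """
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    top = PCA(n_components=n_remove).fit(Xc).components_   # (n_remove, p) directions
    scores = Xc @ top.T                                    # sample coordinates on top PCs
    X_f = Xc - scores @ top                                # predictors, top PCs removed
    beta, *_ = np.linalg.lstsq(scores, yc, rcond=None)
    y_f = yc - scores @ beta                               # response, same directions removed
    return Lasso(alpha=alpha).fit(X_f, y_f)

rng = np.random.default_rng(7)
z = rng.normal(size=(60, 1))                # shared factor correlating all predictors
X = z + 0.3 * rng.normal(size=(60, 8))
y = X[:, 0] - X[:, 1] + 0.1 * rng.normal(size=60)
model = sift_then_lasso(X, y)
print(model.coef_.round(2))
```

With the shared factor removed, the lasso is left to fit the idiosyncratic parts of the predictors, where the true variables are no longer masked by their correlated decoys.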
Genomic selection in plant breeding.
Newell, Mark A; Jannink, Jean-Luc
2014-01-01
Genomic selection (GS) is a method to predict the genetic value of selection candidates based on the genomic estimated breeding value (GEBV) predicted from high-density markers positioned throughout the genome. Unlike marker-assisted selection, the GEBV is based on all markers including both minor and major marker effects. Thus, the GEBV may capture more of the genetic variation for the particular trait under selection.
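A common way to compute GEBVs in this spirit is ridge regression on all markers simultaneously (an RR-BLUP-style sketch, which shrinks every marker effect rather than selecting a few). The genotype coding, sample sizes, and shrinkage parameter below are illustrative assumptions, not values from the article.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(2)
n_train, n_cand, p = 200, 50, 1000
M = rng.integers(0, 3, size=(n_train + n_cand, p)).astype(float)  # 0/1/2 genotype codes
true_effects = rng.normal(0, 0.05, p)                             # many small marker effects
y_train = M[:n_train] @ true_effects + rng.normal(0, 1, n_train)  # phenotypes of trained lines

model = Ridge(alpha=100.0).fit(M[:n_train], y_train)  # shrink all marker effects jointly
gebv = model.predict(M[n_train:])                     # GEBVs of selection candidates
ranked = np.argsort(gebv)[::-1]                       # candidates ranked best-first
print(ranked[:5])
```

Because every marker contributes, minor-effect loci that marker-assisted selection would discard still feed into the predicted breeding value, which is the point the abstract makes.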
Dickel, Timo; Plaß, Wolfgang R; Lippert, Wayne; Lang, Johannes; Yavor, Mikhail I; Geissel, Hans; Scheidenberger, Christoph
2017-06-01
A novel method for (ultra-)high-resolution spatial mass separation in time-of-flight mass spectrometers is presented. Ions are injected into a time-of-flight analyzer from a radio frequency (rf) trap, dispersed in time-of-flight according to their mass-to-charge ratios and then re-trapped dynamically in the same rf trap. This re-trapping technique is highly mass-selective and after sufficiently long flight times can provide even isobaric separation. A theoretical treatment of the method is presented and the conditions for optimum performance of the method are derived. The method has been implemented in a multiple-reflection time-of-flight mass spectrometer and mass separation powers (FWHM) in excess of 70,000, and re-trapping efficiencies of up to 35% have been obtained for the protonated molecular ion of caffeine. The isobars glutamine and lysine (relative mass difference of 1/4000) have been separated after a flight time of 0.2 ms only. Higher mass separation powers can be achieved using longer flight times. The method will have important applications, including isobar separation in nuclear physics and (ultra-)high-resolution precursor ion selection in multiple-stage tandem mass spectrometry.
NASA Astrophysics Data System (ADS)
Dickel, Timo; Plaß, Wolfgang R.; Lippert, Wayne; Lang, Johannes; Yavor, Mikhail I.; Geissel, Hans; Scheidenberger, Christoph
2017-06-01
A novel method for (ultra-)high-resolution spatial mass separation in time-of-flight mass spectrometers is presented. Ions are injected into a time-of-flight analyzer from a radio frequency (rf) trap, dispersed in time-of-flight according to their mass-to-charge ratios and then re-trapped dynamically in the same rf trap. This re-trapping technique is highly mass-selective and after sufficiently long flight times can provide even isobaric separation. A theoretical treatment of the method is presented and the conditions for optimum performance of the method are derived. The method has been implemented in a multiple-reflection time-of-flight mass spectrometer and mass separation powers (FWHM) in excess of 70,000, and re-trapping efficiencies of up to 35% have been obtained for the protonated molecular ion of caffeine. The isobars glutamine and lysine (relative mass difference of 1/4000) have been separated after a flight time of 0.2 ms only. Higher mass separation powers can be achieved using longer flight times. The method will have important applications, including isobar separation in nuclear physics and (ultra-)high-resolution precursor ion selection in multiple-stage tandem mass spectrometry.
Massey, Andrew J
2018-01-01
Determining and understanding drug target engagement is critical for drug discovery. This can be challenging within living cells as selective readouts are often unavailable. Here we describe a novel method for measuring target engagement in living cells based on the principle of altered protein thermal stabilization/destabilization in response to ligand binding. This assay (HCIF-CETSA) utilizes high content, high throughput single cell immunofluorescent detection to determine target protein levels following heating of adherent cells in a 96 well plate format. We have used target engagement of Chk1 by potent small molecule inhibitors to validate the assay. Target engagement measured by this method was subsequently compared to target engagement measured by two alternative methods (autophosphorylation and CETSA). The HCIF-CETSA method appeared robust, and a good correlation was observed between target engagement measured by this method and by CETSA for the selective Chk1 inhibitor V158411. However, these EC50 values were 23- and 12-fold greater than the autophosphorylation IC50. The described method is therefore a valuable advance on the CETSA method, allowing the high throughput determination of target engagement in adherent cells.
NetProt: Complex-based Feature Selection.
Goh, Wilson Wen Bin; Wong, Limsoon
2017-08-04
Protein complex-based feature selection (PCBFS) provides unparalleled reproducibility with high phenotypic relevance on proteomics data. Currently, there are five PCBFS paradigms, but not all representative methods have been implemented or made readily available. To allow general users to take advantage of these methods, we developed the R-package NetProt, which provides implementations of representative feature-selection methods. NetProt also provides methods for generating simulated differential data and generating pseudocomplexes for complex-based performance benchmarking. The NetProt open source R package is available for download from https://github.com/gohwils/NetProt/releases/ , and online documentation is available at http://rpubs.com/gohwils/204259 .
Kheiri, Ahmed; Keedwell, Ed
2017-01-01
Operations research is a well-established field that uses computational systems to support decisions in business and public life. Good solutions to operations research problems can make a large difference to the efficient running of businesses and organisations and so the field often searches for new methods to improve these solutions. The high school timetabling problem is an example of an operations research problem and is a challenging task which requires assigning events and resources to time slots subject to a set of constraints. In this article, a new sequence-based selection hyper-heuristic is presented that produces excellent results on a suite of high school timetabling problems. In this study, we present an easy-to-implement, easy-to-maintain, and effective sequence-based selection hyper-heuristic to solve high school timetabling problems using a benchmark of unified real-world instances collected from different countries. We show that with sequence-based methods, it is possible to discover new best known solutions for a number of the problems in the timetabling domain. Through this investigation, the usefulness of sequence-based selection hyper-heuristics has been demonstrated and the capability of these methods has been shown to exceed the state of the art.
Kusumaningrum, Dewi; Lee, Hoonsoo; Lohumi, Santosh; Mo, Changyeun; Kim, Moon S; Cho, Byoung-Kwan
2018-03-01
The viability of seeds is important for determining their quality. A high-quality seed is one with a high capability of germination, which is necessary to ensure high productivity. Hence, developing technology for the detection of seed viability is a high priority in agriculture. Fourier transform near-infrared (FT-NIR) spectroscopy is one of the most popular vibrational spectroscopy techniques. This study aims to use FT-NIR spectroscopy to determine the viability of soybean seeds. Viable seeds and artificially aged seeds (as non-viable soybeans) were used in this research. The FT-NIR spectra of soybean seeds were collected and analysed using a partial least-squares discriminant analysis (PLS-DA) to classify viable and non-viable soybean seeds. Moreover, the variable importance in projection (VIP) method for variable selection combined with the PLS-DA was employed. The most effective wavelengths were selected by the VIP method, which selected 146 optimal variables from the full set of 1557 variables. The results demonstrated that the FT-NIR spectral analysis with the PLS-DA method, using either all variables or the selected variables, showed good performance based on the high value of prediction accuracy for soybean viability, with an accuracy close to 100%. Hence, FT-NIR techniques with a chemometric analysis have the potential for rapidly measuring soybean seed viability. © 2017 Society of Chemical Industry.
NASA Astrophysics Data System (ADS)
Feng, Ximeng; Li, Gang; Yu, Haixia; Wang, Shaohui; Yi, Xiaoqing; Lin, Ling
2018-03-01
Noninvasive blood component analysis by spectroscopy has been a hotspot in biomedical engineering in recent years. Dynamic spectrum provides an excellent idea for noninvasive blood component measurement, but studies have been limited to the application of broadband light sources and high-resolution spectroscopy instruments. In order to remove redundant information, a more effective wavelength selection method is presented in this paper. In contrast to many common wavelength selection methods, this method is based on the sensing mechanism, which is clearly understood and can effectively avoid noise from the acquisition system. The spectral difference coefficient was theoretically proved to have guiding significance for wavelength selection. After theoretical analysis, the multi-band spectral difference coefficient-wavelength selection method combined with the dynamic spectrum was proposed. An experimental analysis based on clinical trial data from 200 volunteers was conducted to illustrate the effectiveness of this method. The extreme learning machine was used to develop the calibration models between the dynamic spectrum data and hemoglobin concentration. The experimental results show that the prediction precision of hemoglobin concentration using the multi-band spectral difference coefficient-wavelength selection method is higher compared with other methods.
Ultra-High Density Holographic Memory Module with Solid-State Architecture
NASA Technical Reports Server (NTRS)
Markov, Vladimir B.
2000-01-01
NASA's terrestrial, space, and deep-space missions require technology that allows storing, retrieving, and processing a large volume of information. Holographic memory offers high-density data storage with parallel access and high throughput. Several methods exist for data multiplexing based on the fundamental principles of volume hologram selectivity. We recently demonstrated that spatial (amplitude-phase) encoding of the reference wave (SERW) looks promising as a way to increase the storage density. The SERW hologram offers a selectivity mechanism different from the traditional ones: spatial de-correlation between the recorded and reconstruction fields. In this report we present the experimental results of the SERW-hologram memory module with solid-state architecture, which is of particular interest for space operations.
James M. Guldin
2011-01-01
The selection method applied in shade-intolerant pine stands in the southern United States has been shown to be an effective method of uneven-aged silviculture, but it is becoming less frequently practiced for a variety of reasons. Economically, the high value of standing timber puts fully stocked uneven-aged pine stands at risk of liquidation if the timberland is sold...
Rational Methods for the Selection of Diverse Screening Compounds
Huggins, David J.; Venkitaraman, Ashok R.; Spring, David R.
2016-01-01
Traditionally a pursuit of large pharmaceutical companies, high-throughput screening assays are becoming increasingly common within academic and government laboratories. This shift has been instrumental in enabling projects that have not been commercially viable, such as chemical probe discovery and screening against high risk targets. Once an assay has been prepared and validated, it must be fed with screening compounds. Crafting a successful collection of small molecules for screening poses a significant challenge. An optimized collection will minimize false positives whilst maximizing hit rates of compounds that are amenable to lead generation and optimization. Without due consideration of the relevant protein targets and the downstream screening assays, compound filtering and selection can fail to explore the great extent of chemical diversity and eschew valuable novelty. Herein, we discuss the different factors to be considered and methods that may be employed when assembling a structurally diverse compound screening collection. Rational methods for selecting diverse chemical libraries are essential for their effective use in high-throughput screens. PMID:21261294
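One widely used rational method for assembling a structurally diverse subset is greedy MaxMin selection on a pairwise-distance matrix: repeatedly add the candidate whose nearest already-picked neighbour is farthest away. The sketch below uses toy 2-D points in place of real fingerprint (e.g. Tanimoto) distances, as an assumed stand-in.

```python
import numpy as np

def maxmin_pick(D, k, start=0):
    """Greedy MaxMin diversity selection on a pairwise-distance matrix.

    D[i, j] is the distance between candidates i and j; the routine keeps
    adding the candidate that is most distant from everything picked so far.
    """
    picked = [start]
    min_dist = D[start].copy()                  # distance to nearest picked compound
    for _ in range(k - 1):
        nxt = int(np.argmax(min_dist))          # most isolated remaining candidate
        picked.append(nxt)
        min_dist = np.minimum(min_dist, D[nxt])
    return picked

rng = np.random.default_rng(4)
pts = rng.random((50, 2))                       # toy 2-D stand-in for chemical space
D = np.linalg.norm(pts[:, None] - pts[None], axis=-1)
subset = maxmin_pick(D, 8)
print(subset)
```

Already-picked compounds have zero distance to themselves, so they are never re-selected; the algorithm therefore spreads the k picks across the occupied region of chemical space rather than clustering them.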
2013-01-01
Background High resolution melting analysis (HRM) is a rapid and cost-effective technique for the characterisation of PCR amplicons. Because the reverse genetics of segmented influenza A viruses allows the generation of numerous influenza A virus reassortants within a short time, methods for the rapid selection of the correct recombinants are very useful. Methods PCR primer pairs covering the single nucleotide polymorphism (SNP) positions of two different influenza A H5N1 strains were designed. Reassortants of the two different H5N1 isolates were used as a model to prove the suitability of HRM for the selection of the correct recombinants. Furthermore, two different cycler instruments were compared. Results Both cycler instruments generated comparable average melting peaks, which allowed the easy identification and selection of the correct cloned segments or reassorted viruses. Conclusions HRM is a highly suitable method for the rapid and precise characterisation of cloned influenza A genomes. PMID:24028349
Statistical Analysis of Big Data on Pharmacogenomics
Fan, Jianqing; Liu, Han
2013-01-01
This paper discusses statistical methods for estimating complex correlation structure from large pharmacogenomic datasets. We selectively review several prominent statistical methods for estimating large covariance matrix for understanding correlation structure, inverse covariance matrix for network modeling, large-scale simultaneous tests for selecting significantly differently expressed genes and proteins and genetic markers for complex diseases, and high dimensional variable selection for identifying important molecules for understanding molecule mechanisms in pharmacogenomics. Their applications to gene network estimation and biomarker selection are used to illustrate the methodological power. Several new challenges of Big data analysis, including complex data distribution, missing data, measurement error, spurious correlation, endogeneity, and the need for robust statistical methods, are also discussed. PMID:23602905
Zhang, Bo; Cohen, Joanna E; OʼConnor, Shawn
2014-01-01
Selection of priority groups is important for health interventions; however, no quantitative method has been developed for it. Our aim was to develop a quantitative method to support the process of selecting priority groups for public health interventions based on both high risk and population health burden. We performed a secondary data analysis of the 2010 Canadian Community Health Survey (Canadian population; survey respondents). We identified priority groups for 3 diseases: heart disease, stroke, and chronic lower respiratory diseases. Three measures (prevalence, population counts, and adjusted odds ratios [ORs]) were calculated for subpopulations (sociodemographic characteristics and other risk factors). A Priority Group Index (PGI) was calculated by summing the rank scores of these 3 measures. Of the 30 priority groups identified by the PGI (10 for each of the 3 disease outcomes), 7 were identified on the basis of high prevalence only, 5 based on population count only, 3 based on high OR only, and the remainder based on combinations of these. The identified priority groups were all in line with the literature as risk factors for the 3 diseases, such as elderly people for heart disease and stroke and those with low income for chronic lower respiratory diseases. The PGI was thus able to balance both high-risk and population-burden approaches in selecting priority groups, and thus it would address health inequities as well as disease burden in the overall population. The PGI is a quantitative method to select priority groups for public health interventions; it has the potential to enhance the effective use of limited public resources.
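The rank-sum construction of the Priority Group Index can be sketched as follows. The subgroup names and all numbers below are invented for illustration; they are not taken from the survey data.

```python
import pandas as pd

# Illustrative Priority Group Index: sum of the rank scores of prevalence,
# population count, and adjusted odds ratio (toy numbers, not survey data).
groups = pd.DataFrame({
    "group":      ["age 65+", "low income", "smokers", "rural"],
    "prevalence": [0.12, 0.07, 0.09, 0.05],              # disease prevalence in subgroup
    "count":      [900_000, 400_000, 650_000, 200_000],  # affected population size
    "odds_ratio": [2.1, 1.8, 2.5, 1.2],                  # adjusted OR vs. the rest
})

# Higher value on each measure -> higher rank score; the PGI is the rank sum.
for col in ["prevalence", "count", "odds_ratio"]:
    groups[col + "_rank"] = groups[col].rank()
groups["PGI"] = groups[[c for c in groups.columns if c.endswith("_rank")]].sum(axis=1)
priority = groups.sort_values("PGI", ascending=False)["group"].tolist()
print(priority)  # ['age 65+', 'smokers', 'low income', 'rural']
```

Because each measure enters only through its rank, a group can score highly by being large (population burden) even when its odds ratio is modest, which is how the index balances the two approaches.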
Sparse High Dimensional Models in Economics
Fan, Jianqing; Lv, Jinchi; Qi, Lei
2010-01-01
This paper reviews the literature on sparse high dimensional models and discusses some applications in economics and finance. Recent developments of theory, methods, and implementations in penalized least squares and penalized likelihood methods are highlighted. These variable selection methods are proved to be effective in high dimensional sparse modeling. The limits of dimensionality that regularization methods can handle, the role of penalty functions, and their statistical properties are detailed. Some recent advances in ultra-high dimensional sparse modeling are also briefly discussed. PMID:22022635
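The penalized least squares idea reviewed above can be illustrated with an L1 (lasso) fit on a sparse design with more variables than observations. The dimensions, sparsity level, noise scale, and support threshold below are made-up values for the demo.

```python
import numpy as np
from sklearn.linear_model import LassoCV

# Sparse high-dimensional regression: p > n, only 3 of 200 coefficients
# are nonzero, and a cross-validated L1 penalty recovers the sparse support.
rng = np.random.default_rng(6)
n, p = 100, 200
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[[0, 1, 2]] = [3.0, -2.0, 1.5]          # the true sparse signal
y = X @ beta + rng.normal(0, 0.5, n)

fit = LassoCV(cv=5).fit(X, y)                # penalty strength chosen by CV
support = np.flatnonzero(np.abs(fit.coef_) > 0.1)
print(support)
```

The lasso's L1 penalty sets most coefficients exactly to zero, performing variable selection and estimation simultaneously; the review's point is that other penalties (e.g. SCAD-type non-concave ones) refine this same mechanism.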
Mallik, Rangan; Wa, Chunling; Hage, David S.
2008-01-01
Two techniques were developed for the immobilization of proteins and other ligands to silica through sulfhydryl groups. These methods made use of maleimide-activated silica (the SMCC method) or iodoacetyl-activated silica (the SIA method). The resulting supports were tested for use in high-performance affinity chromatography by employing human serum albumin (HSA) as a model protein. Studies with normal and iodoacetamide-modified HSA indicated that these methods had a high selectivity for sulfhydryl groups on this protein, which accounted for the coupling of 77–81% of this protein to maleimide- or iodoacetyl-activated silica. These supports were also evaluated in terms of their total protein content, binding capacity, specific activity, non-specific binding, stability and chiral selectivity for several test solutes. HSA columns prepared using maleimide-activated silica gave the best overall results for these properties when compared to HSA that had been immobilized to silica through the Schiff base method (i.e., an amine-based coupling technique). A key advantage of the supports developed in this work is that they offer the potential of giving greater site-selective immobilization and ligand activity than amine-based coupling methods. These features make these supports attractive in the development of protein columns for such applications as the study of biological interactions and chiral separations. PMID:17297940
Magnetically-focusing biochip structures for high-speed active biosensing with improved selectivity.
Yoo, Haneul; Lee, Dong Jun; Kim, Daesan; Park, Juhun; Chen, Xing; Hong, Seunghun
2018-06-29
We report a magnetically-focusing biochip structure enabling a single-layered magnetic trap-and-release cycle for biosensors with improved detection speed and selectivity. Here, magnetic beads functionalized with specific receptor molecules were utilized to trap target molecules in a solution and transport them actively to and away from the sensor surfaces, to enhance the detection speed and reduce non-specific binding, respectively. Using our method, we demonstrated high-speed detection of IL-13 antigens, improving the detection speed by more than an order of magnitude. Furthermore, the release step in our method was found to reduce non-specific binding and improve the selectivity and sensitivity of biosensors. This method is a simple but powerful strategy and should open up various applications such as ultra-fast biosensors for point-of-care services.
Magnetically-focusing biochip structures for high-speed active biosensing with improved selectivity
NASA Astrophysics Data System (ADS)
Yoo, Haneul; Lee, Dong Jun; Kim, Daesan; Park, Juhun; Chen, Xing; Hong, Seunghun
2018-06-01
We report a magnetically-focusing biochip structure enabling a single-layered magnetic trap-and-release cycle for biosensors with improved detection speed and selectivity. Here, magnetic beads functionalized with specific receptor molecules were utilized to trap target molecules in a solution and transport them actively to and away from the sensor surfaces, to enhance the detection speed and reduce non-specific binding, respectively. Using our method, we demonstrated high-speed detection of IL-13 antigens, improving the detection speed by more than an order of magnitude. Furthermore, the release step in our method was found to reduce non-specific binding and improve the selectivity and sensitivity of biosensors. This method is a simple but powerful strategy and should open up various applications such as ultra-fast biosensors for point-of-care services.
Automatic allograft bone selection through band registration and its application to distal femur.
Zhang, Yu; Qiu, Lei; Li, Fengzan; Zhang, Qing; Zhang, Li; Niu, Xiaohui
2017-09-01
Clinical reports suggest that large bone defects can be effectively restored by allograft bone transplantation, in which allograft bone selection plays an important role. Moreover, there is a huge demand for automatic allograft bone selection methods, as they could greatly improve the management efficiency of large bone banks. Although several automatic methods have been presented to select the most suitable allograft bone from a massive allograft bone bank, these methods still suffer from inaccuracy. In this paper, we propose an effective allograft bone selection method that does not use the contralateral bones. First, the allograft bone is globally aligned to the recipient bone by surface registration. Then, the global alignment is further refined through band registration. The band, defined as the recipient points within the lifted and lowered cutting planes, can involve more of the local structure of the defected segment. Therefore, our method achieves robust alignment and high registration accuracy between the allograft and the recipient. Moreover, the existing contour method and surface method can be unified into one framework under our method by adjusting the lift and lower distances of the cutting planes. Finally, our method has been validated on a database of distal femurs. The experimental results indicate that our method outperforms the surface method and the contour method.
NASA Astrophysics Data System (ADS)
Yang, Yannan; Yu, Meihua; Song, Hao; Wang, Yue; Yu, Chengzhong
2015-07-01
Well-dispersed mesoporous hollow silica-fullerene nanoparticles with particle sizes of ~50 nm have been successfully prepared by incorporating fullerene molecules into the silica framework followed by a selective etching method. The fabricated fluorescent silica-fullerene composite with high porosity demonstrates excellent performance in combined chemo/photodynamic therapy. Electronic supplementary information (ESI) available. See DOI: 10.1039/c5nr02769a
Advances in metaheuristics for gene selection and classification of microarray data.
Duval, Béatrice; Hao, Jin-Kao
2010-01-01
Gene selection aims at identifying a (small) subset of informative genes from the initial data in order to obtain high predictive accuracy for classification. Gene selection can be considered as a combinatorial search problem and thus be conveniently handled with optimization methods. In this article, we summarize some recent developments of using metaheuristic-based methods within an embedded approach for gene selection. In particular, we put forward the importance and usefulness of integrating problem-specific knowledge into the search operators of such a method. To illustrate the point, we explain how ranking coefficients of a linear classifier such as support vector machine (SVM) can be profitably used to reinforce the search efficiency of Local Search and Evolutionary Search metaheuristic algorithms for gene selection and classification.
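As a concrete illustration of how a linear classifier's ranking coefficients can guide a search operator, here is a minimal sketch. The data are invented, a ridge-regularized linear classifier stands in for the SVM, and the elimination loop is the simplest possible search rather than any of the metaheuristics reviewed in the article.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy expression data: 40 samples x 12 genes; genes 0 and 1 carry the class signal.
y = np.repeat([-1.0, 1.0], 20)
X = rng.normal(size=(40, 12))
X[:, 0] += 2.0 * y
X[:, 1] -= 2.0 * y

def ranking_coefficients(X, y, lam=1e-2):
    """|w| of a ridge-regularized linear classifier (stand-in for SVM weights)."""
    w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
    return np.abs(w)

# Classifier-informed search operator: repeatedly drop the gene with the
# smallest ranking coefficient (recursive elimination).
selected = list(range(X.shape[1]))
while len(selected) > 2:
    scores = ranking_coefficients(X[:, selected], y)
    selected.pop(int(np.argmin(scores)))
```

In a metaheuristic, the same coefficients would instead bias the neighborhood moves of a local search or the mutation operator of an evolutionary algorithm.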
Speech Emotion Feature Selection Method Based on Contribution Analysis Algorithm of Neural Network
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang Xiaojia; Mao Qirong; Zhan Yongzhao
There are many emotion features. If all of them are used to recognize emotions, redundant features may exist; moreover, recognition results are unsatisfactory and the cost of feature extraction is high. In this paper, a method for selecting speech emotion features based on the contribution analysis algorithm of neural networks (NN) is presented. The emotion features are selected from the 95 extracted features using the contribution analysis algorithm. Cluster analysis is applied to assess the effectiveness of the selected features, and the feature-extraction time is evaluated. Finally, the 24 selected emotion features are used to recognize six speech emotions. The experiments show that this method can improve the recognition rate and reduce the feature-extraction time.
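One widely used way to compute such input contributions for a one-hidden-layer network is Garson-style weight partitioning. The sketch below is illustrative only: the weight matrices are invented rather than trained, and the paper's exact contribution formula may differ.

```python
import numpy as np

def contributions(W_ih, w_ho):
    """Garson-style contribution analysis for a one-hidden-layer network:
    each input's share of the input->hidden weights, weighted by |hidden->output|."""
    share = np.abs(W_ih) / np.abs(W_ih).sum(axis=0, keepdims=True)
    contrib = share @ np.abs(w_ho)
    return contrib / contrib.sum()          # fractions that sum to 1

# Illustrative (untrained) weights: input 0 dominates every hidden unit.
W_ih = np.array([[ 5.0, -4.0,  6.0, -5.0],
                 [ 0.5,  0.2, -0.3,  0.1],
                 [-0.4,  0.3,  0.2, -0.2],
                 [ 0.1, -0.5,  0.4,  0.3],
                 [ 0.2,  0.1, -0.1, -0.4]])
w_ho = np.array([1.0, -2.0, 0.5, 1.5])
scores = contributions(W_ih, w_ho)
keep = np.argsort(scores)[::-1][:2]         # retain the highest-contribution features
```

In the paper's setting, `keep` would hold the 24 features retained out of the 95 extracted ones.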
Liang, Ja-Der; Ping, Xiao-Ou; Tseng, Yi-Ju; Huang, Guan-Tarn; Lai, Feipei; Yang, Pei-Ming
2014-12-01
Recurrence of hepatocellular carcinoma (HCC) is an important issue despite effective treatments with tumor eradication. Identification of patients at high risk for recurrence may enable more efficacious screening and detection of tumor recurrence. The aim of this study was to develop recurrence predictive models for HCC patients who received radiofrequency ablation (RFA) treatment. From January 2007 to December 2009, 83 newly diagnosed HCC patients receiving RFA as their first treatment were enrolled. Five feature selection methods, including genetic algorithm (GA), simulated annealing (SA), random forests (RF), and hybrid methods (GA+RF and SA+RF), were utilized for selecting an important subset of features from a total of 16 clinical features. These feature selection methods were combined with a support vector machine (SVM) to develop predictive models with better performance. Five-fold cross-validation was used to train and test the SVM models. The developed SVM-based predictive models with hybrid feature selection methods and 5-fold cross-validation achieved average sensitivity, specificity, accuracy, positive predictive value, negative predictive value, and area under the ROC curve of 67%, 86%, 82%, 69%, 90%, and 0.69, respectively. The SVM-derived predictive model can identify patients at high risk of recurrence, who should be closely followed up after complete RFA treatment. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
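The general recipe here — wrapping a classifier's cross-validated accuracy inside a stochastic search over feature subsets — can be sketched as follows. This is a hedged illustration, not the study's pipeline: the data are simulated, a nearest-centroid classifier stands in for the SVM, and the simulated annealing schedule is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy clinical data: 80 patients x 16 features; features 0-2 carry the signal.
y = rng.integers(0, 2, size=80)
X = rng.normal(size=(80, 16))
X[:, :3] += 1.5 * y[:, None]

def cv_accuracy(mask, folds=5):
    """5-fold CV accuracy of a nearest-centroid classifier (SVM stand-in)."""
    idx = np.arange(len(y))
    accs = []
    for f in range(folds):
        te = idx % folds == f
        Xtr, ytr, Xte = X[~te][:, mask], y[~te], X[te][:, mask]
        c0, c1 = Xtr[ytr == 0].mean(0), Xtr[ytr == 1].mean(0)
        pred = np.linalg.norm(Xte - c1, axis=1) < np.linalg.norm(Xte - c0, axis=1)
        accs.append((pred == y[te]).mean())
    return float(np.mean(accs))

# Simulated annealing over feature subsets: flip one feature in or out per step.
mask = rng.random(16) < 0.5
best, best_mask, T = cv_accuracy(mask), mask.copy(), 0.2
for _ in range(300):
    cand = mask.copy()
    cand[rng.integers(16)] ^= True
    if not cand.any():
        continue
    gain = cv_accuracy(cand) - cv_accuracy(mask)
    if gain > 0 or rng.random() < np.exp(gain / T):
        mask = cand
        if cv_accuracy(mask) > best:
            best, best_mask = cv_accuracy(mask), mask.copy()
    T *= 0.99
```

A GA variant would keep a population of masks and recombine them; the hybrid GA+RF/SA+RF methods additionally use a forest's importance scores to seed or bias the search.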
Cheng, Qiang; Zhou, Hongbo; Cheng, Jie
2011-06-01
Selecting features for multiclass classification is a critically important task for pattern recognition and machine learning applications. Especially challenging is selecting an optimal subset of features from high-dimensional data, which typically have many more variables than observations and contain significant noise, missing components, or outliers. Existing methods either cannot handle high-dimensional data efficiently or scalably, or can only obtain a local optimum instead of the global optimum. Toward selecting the globally optimal subset of features efficiently, we introduce a new selector, which we call the Fisher-Markov selector, to identify those features that are the most useful in describing essential differences among the possible groups. In particular, in this paper we present a way to represent essential discriminating characteristics together with sparsity as an optimization objective. With properly identified measures for sparseness and discriminativeness in possibly high-dimensional settings, we take a systematic approach to optimizing these measures and choosing the best feature subset. We use Markov random field optimization techniques to solve the formulated objective functions for simultaneous feature selection. Our results are noncombinatorial, and they can achieve the exact global optimum of the objective function for some special kernels. The method is fast; in particular, it can be linear in the number of features and quadratic in the number of observations. We apply our procedure to a variety of real-world data, including a mid-dimensional optical handwritten digit data set and high-dimensional microarray gene expression data sets. The effectiveness of our method is confirmed by experimental results.
From a model selection viewpoint in pattern recognition, our procedure shows that the most discriminating subset of variables can be selected by solving a very simple unconstrained objective function, which in fact has an explicit expression.
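The paper's exact objective is solved with Markov random field techniques; as a much simpler illustration of scoring discriminativeness across multiple classes, here is the classical Fisher score on simulated data (the data and feature indices are invented, and this is not the Fisher-Markov objective itself).

```python
import numpy as np

def fisher_scores(X, y):
    """Classical Fisher score per feature: between-class scatter over within-class scatter."""
    mu = X.mean(axis=0)
    num = np.zeros(X.shape[1])
    den = np.zeros(X.shape[1])
    for c in np.unique(y):
        Xc = X[y == c]
        num += len(Xc) * (Xc.mean(axis=0) - mu) ** 2   # between-class scatter
        den += len(Xc) * Xc.var(axis=0)                # within-class scatter
    return num / (den + 1e-12)

rng = np.random.default_rng(3)
y = np.repeat([0, 1, 2], 30)                 # three classes (multiclass setting)
X = rng.normal(size=(90, 200))               # many more features than observations
X[:, 5] += y                                 # informative: mean shifts with class
X[:, 17] += 2 * (y == 1)                     # informative: class 1 stands apart
top2 = np.argsort(fisher_scores(X, y))[::-1][:2]
```

Thresholding such a score gives a simple unconstrained selection rule; the Fisher-Markov selector additionally couples it with a sparsity term and optimizes both jointly.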
ERIC Educational Resources Information Center
Labby, Sandra A.
2010-01-01
Purpose: The purpose of this study was to determine the relationship among principals' emotional intelligence skills, school accountability ratings, and selected demographic factors. Method: The sample was comprised of Texas public school principals from elementary, middle school/junior high, and high schools and their school accountability…
Morph-X-Select: Morphology-based tissue aptamer selection for ovarian cancer biomarker discovery
Wang, Hongyu; Li, Xin; Volk, David E.; Lokesh, Ganesh L.-R.; Elizondo-Riojas, Miguel-Angel; Li, Li; Nick, Alpa M.; Sood, Anil K.; Rosenblatt, Kevin P.; Gorenstein, David G.
2016-01-01
High affinity aptamer-based biomarker discovery has the advantage of simultaneously discovering an aptamer affinity reagent and its target biomarker protein. Here, we demonstrate a morphology-based tissue aptamer selection method that enables us to use tissue sections from individual patients and identify high-affinity aptamers and their associated target proteins in a systematic and accurate way. We created a combinatorial DNA aptamer library that has been modified with thiophosphate substitutions of the phosphate ester backbone at selected 5′dA positions for enhanced nuclease resistance and targeting. Based on morphological assessment, we used image-directed laser microdissection (LMD) to dissect regions of interest bound with the thioaptamer (TA) library and further identified target proteins for the selected TAs. We have successfully identified and characterized the lead candidate TA, V5, as a vimentin-specific sequence that has shown specific binding to tumor vasculature of human ovarian tissue and human microvascular endothelial cells. This new Morph-X-Select method allows us to select high-affinity aptamers and their associated target proteins in a specific and accurate way, and could be used for personalized biomarker discovery to improve medical decision-making and to facilitate the development of targeted therapies to achieve more favorable outcomes. PMID:27839510
Selective suppression of high-order harmonics within phase-matched spectral regions.
Lerner, Gavriel; Diskin, Tzvi; Neufeld, Ofer; Kfir, Ofer; Cohen, Oren
2017-04-01
Phase matching in high-harmonic generation leads to enhancement of multiple harmonics. It is sometimes desired to control the spectral structure within the phase-matched spectral region. We propose a scheme for selective suppression of high-order harmonics within the phase-matched spectral region while weakly influencing the other harmonics. The method is based on addition of phase-mismatched segments within a phase-matched medium. We demonstrate the method numerically in two examples. First, we show that one phase-mismatched segment can significantly suppress harmonic orders 9, 15, and 21. Second, we show that two phase-mismatched segments can efficiently suppress circularly polarized harmonics with one helicity over the other when driven by a bi-circular field. The new method may be useful for various applications, including the generation of highly helical bright attosecond pulses.
Dynamic video encryption algorithm for H.264/AVC based on a spatiotemporal chaos system.
Xu, Hui; Tong, Xiao-Jun; Zhang, Miao; Wang, Zhu; Li, Ling-Hao
2016-06-01
Video encryption schemes mostly employ selective encryption to encrypt the important and sensitive parts of the video information, aiming to ensure real-time performance and encryption efficiency. Classic block ciphers are not applicable to video encryption due to their high computational overhead. In this paper, we propose an encryption selection control module that dynamically encrypts video syntax elements under the control of a chaotic pseudorandom sequence. A novel spatiotemporal chaos system and binarization method are used to generate a key stream for encrypting the chosen syntax elements. The proposed scheme enhances resistance against attacks through the dynamic encryption process and a high-security stream cipher. Experimental results show that the proposed method exhibits high security and high efficiency with little effect on the compression ratio and time cost.
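The XOR-with-a-binarized-chaotic-orbit idea can be sketched as follows. This is only a toy: a plain logistic map stands in for the paper's spatiotemporal chaos system, the key `x0 = 0.37` is arbitrary, and the "syntax elements" are invented byte strings.

```python
def logistic_keystream(x0, nbytes, r=3.99, burn=100):
    """Keystream from a binarized logistic-map orbit (a simple stand-in for the
    paper's spatiotemporal chaos system and binarization method)."""
    x = x0
    for _ in range(burn):              # discard the transient
        x = r * x * (1 - x)
    out = bytearray()
    for _ in range(nbytes):
        b = 0
        for _ in range(8):             # one bit per iteration, thresholded at 0.5
            x = r * x * (1 - x)
            b = (b << 1) | (x > 0.5)
        out.append(b)
    return bytes(out)

def encrypt_selected(elements, x0=0.37):
    """XOR only the elements flagged sensitive; copy the rest through unchanged."""
    n = sum(len(v) for flag, v in elements if flag)
    ks = iter(logistic_keystream(x0, n))
    return [(flag, bytes(b ^ next(ks) for b in v)) if flag else (flag, v)
            for flag, v in elements]

# Hypothetical H.264 syntax elements: (sensitive?, payload bytes).
elements = [(True, b"MVD0"), (False, b"header"), (True, b"QP")]
enc = encrypt_selected(elements)
dec = encrypt_selected(enc)            # XOR with the same keystream decrypts
```

Because only flagged elements pass through the stream cipher, the bitstream structure and compression ratio are largely preserved, which is the point of selective encryption.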
NASA Astrophysics Data System (ADS)
Ma, Yan; Yao, Jinxia; Gu, Chao; Chen, Yufeng; Yang, Yi; Zou, Lida
2017-05-01
With the formation of the electric big data environment, more and more big data analyses are emerging. Complicated data analyses for equipment condition assessment involve many join operations, which are time-consuming. To save time, materialized views are commonly used: part of the common and critical join results is placed on external storage, avoiding frequent join operations. In this paper we propose methods for selecting and placing materialized views that reduce the query time for electric transmission and transformation equipment and maximize the profits of service providers. In the selection method we design a way to compute the value of non-leaf nodes based on an MVPP structure chart. In the placement method we use relevance weights to place the selected materialized views, which helps reduce network transmission time. Our experiments show that the proposed selection and placement methods achieve high throughput and good optimization of query time for electric transmission and transformation equipment.
An Approach for Selecting a Theoretical Framework for the Evaluation of Training Programs
ERIC Educational Resources Information Center
Tasca, Jorge Eduardo; Ensslin, Leonardo; Ensslin, Sandra Rolim; Alves, Maria Bernardete Martins
2010-01-01
Purpose: This research paper proposes a method for selecting references related to a research topic, and seeks to exemplify it for the case of a study evaluating training programs. The method is designed to identify references with high academic relevance in databases accessed via the internet, using a bibliometric analysis to sift the selected…
Collective feature selection to identify crucial epistatic variants.
Verma, Shefali S; Lucas, Anastasia; Zhang, Xinyuan; Veturi, Yogasudha; Dudek, Scott; Li, Binglan; Li, Ruowang; Urbanowicz, Ryan; Moore, Jason H; Kim, Dokyoon; Ritchie, Marylyn D
2018-01-01
Machine learning methods have gained popularity and practicality in identifying linear and non-linear effects of variants associated with complex diseases/traits. Detection of epistatic interactions still remains a challenge due to the large number of features and relatively small sample size as input, leading to the so-called "short fat data" problem. The efficiency of machine learning methods can be increased by limiting the number of input features, so it is very important to perform variable selection before searching for epistasis. Many methods have been evaluated and proposed for feature selection, but no single method works best in all scenarios. We therefore propose a collective feature selection approach that selects features in the "union" of the best-performing methods, and we evaluate it in two separate simulation analyses. We explored various parametric, non-parametric, and data mining approaches to perform feature selection, choosing the top-performing methods and taking the union of the resulting variables, based on a user-defined percentage of variants selected from each method, forward to downstream analysis. Our simulation analysis shows that non-parametric data mining approaches, such as MDR, may work best under one simulation criterion for high effect size (penetrance) datasets, while non-parametric methods designed for feature selection, such as Ranger and gradient boosting, work best under other simulation criteria. Thus, a collective approach proves more beneficial for selecting variables with epistatic effects, including in low effect size datasets and across different genetic architectures.
Following this, we applied our proposed collective feature selection approach to select the top 1% of variables to identify potential interacting variables associated with Body Mass Index (BMI) in ~ 44,000 samples obtained from Geisinger's MyCode Community Health Initiative (on behalf of DiscovEHR collaboration). In this study, we were able to show that selecting variables using a collective feature selection approach could help in selecting true positive epistatic variables more frequently than applying any single method for feature selection via simulation studies. We were able to demonstrate the effectiveness of collective feature selection along with a comparison of many methods in our simulation analysis. We also applied our method to identify non-linear networks associated with obesity.
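The union step itself is simple. Here is a minimal sketch with two hypothetical ranking methods — a two-sample t-statistic and a Pearson correlation, which are stand-ins rather than the methods evaluated in the paper — on simulated genotype-like data.

```python
import numpy as np

rng = np.random.default_rng(5)
y = rng.integers(0, 2, 100)                  # binary phenotype
X = rng.normal(size=(100, 50))               # 50 candidate variants
X[:, 0] += 1.5 * y                           # two truly associated variants
X[:, 1] -= 1.5 * y

def t_scores(X, y):
    """Ranking method 1: absolute two-sample t-statistic per variant."""
    d = X[y == 1].mean(0) - X[y == 0].mean(0)
    s = np.sqrt(X[y == 1].var(0) / (y == 1).sum() + X[y == 0].var(0) / (y == 0).sum())
    return np.abs(d / s)

def corr_scores(X, y):
    """Ranking method 2: absolute Pearson correlation with the phenotype."""
    yc = y - y.mean()
    return np.abs((X - X.mean(0)).T @ yc / (len(y) * X.std(0) * yc.std()))

def top_pct(scores, pct=0.05):
    """A user-defined percentage of variants from one method."""
    k = max(1, int(len(scores) * pct))
    return set(np.argsort(scores)[::-1][:k].tolist())

# Collective selection: union of each method's top 5%.
union = top_pct(t_scores(X, y)) | top_pct(corr_scores(X, y))
```

The union keeps any variant that at least one method ranks highly, which is why the collective approach recovers true epistatic variables that individual methods miss.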
Liu, Liang; Cooper, Tamara; Eldi, Preethi; Garcia-Valtanen, Pablo; Diener, Kerrilyn R; Howley, Paul M; Hayball, John D
2017-04-01
Recombinant vaccinia viruses (rVACVs) are promising antigen-delivery systems for vaccine development that are also useful as research tools. Two common methods for selection during construction of rVACV clones are (i) co-insertion of drug resistance or reporter protein genes, which requires the use of additional selection drugs or detection methods, and (ii) dominant host-range selection. The latter uses VACV variants rendered replication-incompetent in host cell lines by the deletion of host-range genes. Replicative ability is restored by co-insertion of the host-range genes, providing for dominant selection of the recombinant viruses. Here, we describe a new method for the construction of rVACVs using the cowpox CP77 protein and unmodified VACV as the starting material. Our selection system will expand the range of tools available for positive selection of rVACV during vector construction, and it offers substantially higher fidelity than approaches based on selection for drug resistance.
NASA Astrophysics Data System (ADS)
Stas, Michiel; Dong, Qinghan; Heremans, Stien; Zhang, Beier; Van Orshoven, Jos
2016-08-01
This paper compares two machine learning techniques to predict regional winter wheat yields. The models, based on Boosted Regression Trees (BRT) and Support Vector Machines (SVM), are constructed from Normalized Difference Vegetation Indices (NDVI) derived from low resolution SPOT VEGETATION satellite imagery. Three types of NDVI-related predictors were used: Single NDVI, Incremental NDVI and Targeted NDVI. BRT and SVM were first used to select features with high relevance for predicting the yield. Although the exact selections differed between the prefectures, certain periods with high influence scores for multiple prefectures could be identified. The same period of high influence, stretching from March to June, was detected by both machine learning methods. After feature selection, BRT and SVM models were applied to the subset of selected features for actual yield forecasting. Both machine learning methods returned very low prediction errors, with BRT slightly but consistently outperforming SVM.
Guo, Xinyu; Dominick, Kelli C; Minai, Ali A; Li, Hailong; Erickson, Craig A; Lu, Long J
2017-01-01
The whole-brain functional connectivity (FC) pattern obtained from resting-state functional magnetic resonance imaging data is commonly applied to study neuropsychiatric conditions such as autism spectrum disorder (ASD) using different machine learning models. Recent studies indicate that both hyper- and hypo-aberrant ASD-associated FCs are widely distributed throughout the entire brain rather than confined to a few specific brain regions. Deep neural networks (DNN) with multiple hidden layers have shown the ability to systematically extract lower-to-higher level information from high dimensional data across a series of hidden layers, significantly improving classification accuracy for such data. In this study, a DNN with a novel feature selection method (DNN-FS) is developed for high dimensional whole-brain resting-state FC pattern classification of ASD patients vs. typical development (TD) controls. The feature selection method helps the DNN generate low dimensional, high-quality representations of the whole-brain FC patterns by selecting features with high discriminating power from multiple trained sparse auto-encoders. For comparison, a DNN without the feature selection method (DNN-woFS) is developed, and both are tested with different architectures (i.e., different numbers of hidden layers/nodes). Results show that the best classification accuracy of 86.36% is generated by the DNN-FS approach with 3 hidden layers and 150 hidden nodes (3/150). Remarkably, DNN-FS outperforms DNN-woFS for all architectures studied. The most significant accuracy improvement was 9.09% with the 3/150 architecture. The method also outperforms other feature selection methods, e.g., the two-sample t-test and the elastic net. In addition to improving the classification accuracy, a Fisher's score-based biomarker identification method based on the DNN is also developed and used to identify 32 FCs related to ASD.
These FCs come from or cross different pre-defined brain networks including the default-mode, cingulo-opercular, frontal-parietal, and cerebellum networks. Thirteen of them are statistically significant between the ASD and TD groups (two-sample t-test, p < 0.05) while 19 of them are not. The relationship between the statistically significant FCs and the corresponding ASD behavioral symptoms is discussed based on the literature and clinicians' expert knowledge. A possible reason why the other 19 FCs are not statistically significant is also provided.
Jin, Yulong; Huang, Yanyan; Liu, Guoquan; Zhao, Rui
2013-09-21
A novel quartz crystal microbalance (QCM) sensor for rapid, highly selective and sensitive detection of copper ions was developed. As a signal amplifier, gold nanoparticles (Au NPs) were self-assembled onto the surface of the sensor. A simple dip-and-dry method enabled the whole detection procedure to be accomplished within 20 min. High selectivity of the sensor towards copper ions was demonstrated in both individual and coexisting assays with interfering ions. The gold nanoparticle mediated amplification allowed a detection limit down to 3.1 μM. With good repeatability and regenerability, the QCM sensor was also applied to the analysis of copper contamination in drinking water. This work provides a flexible method for fabricating QCM sensors for the analysis of important small molecules in environmental and biological samples.
VanOrder, Tonya; Robbins, Wayne; Zemper, Eric
2017-04-01
Competition for postdoctoral training positions is at an all-time high, and residency program directors continue to have little direction when it comes to structuring an effective interview process. To examine whether a relationship existed between interview methods used and program director satisfaction with resident selection decisions and whether programs that used methods designed to assess candidate personal characteristics were more satisfied with their decisions. Residency directors from the Statewide Campus System at the Michigan State University College of Osteopathic Medicine were invited to complete a 20-item survey regarding their recent interview methods and proportion of resident selections later regretted. Data analyses examined relationships between interview methods used, frequency of personal characteristics evaluated, and subsequent satisfaction with selected residents. Of the 186 program director surveys distributed, 83 (44.6%) were returned, representing 11 clinical specialty areas. In total, 69 responses (83.1%) were from programs accredited by the American Osteopathic Association only, and 14 (16.9%) were from programs accredited dually by the American Osteopathic Association and Accreditation Council for Graduate Medical Education. The most frequent interview method reported was faculty or peer resident interview. No statistically significant correlational relationships were found between type of interview methods used and subsequent satisfaction with selected residents, either within or across clinical specialties. Although program directors rated ethical behavior/honesty as the most highly prioritized characteristic in residents, 27 (32.5%) reported using a specific interview method to assess this trait. Program directors reported later regrets concerning nearly 1 of every 12 resident selection decisions. 
The perceived success of an osteopathic residency program's interview process does not appear to be related to methods used and is not distinctively different from that of programs dually accredited. The findings suggest that it may not be realistic to aim for standardization of a common set of best interview methods or ideal personal characteristics for all programs. Each residency program's optimal interview process is likely unique, more dependent on analyzing why some resident selections are regretted and developing an interview process designed to assess for specific desirable and unwanted characteristics.
Analysis of Information Content in High-Spectral Resolution Sounders using Subset Selection Analysis
NASA Technical Reports Server (NTRS)
Velez-Reyes, Miguel; Joiner, Joanna
1998-01-01
In this paper, we summarize the results of the sensitivity analysis and data reduction carried out to determine the information content of AIRS and IASI channels. The analysis and data reduction were based on subset selection techniques developed in the linear algebra and statistics communities to study linear dependencies in high dimensional data sets. We applied the subset selection method to study dependency among channels by studying the dependency among their weighting functions. We also applied the technique to study the information provided by the different levels into which the atmosphere is discretized for retrievals and analysis. Results from the method correlate well with intuition in many respects and point to possible modifications of band selection in sensor design and of the number and location of levels in the analysis process.
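One standard subset-selection tool from that linear algebra literature is column pivoting, as in rank-revealing QR: repeatedly pick the column (channel) least explained by those already chosen. Below is a greedy sketch on synthetic "weighting functions" — the Gaussian shapes and the near-dependent channels are invented for illustration, not taken from AIRS or IASI.

```python
import numpy as np

def greedy_subset(A, k):
    """Greedy column pivoting (as in rank-revealing QR): repeatedly pick the column
    with the largest norm after projecting out the columns already chosen."""
    R = A.astype(float).copy()
    chosen = []
    for _ in range(k):
        j = int(np.argmax(np.linalg.norm(R, axis=0)))
        chosen.append(j)
        q = R[:, j] / np.linalg.norm(R[:, j])
        R = R - np.outer(q, q @ R)      # deflate: remove the chosen direction
    return chosen

rng = np.random.default_rng(2)
levels = np.linspace(0, 1, 60)
# Synthetic weighting functions: channels 0-2 peak at distinct levels;
# channels 3-4 are near-linear combinations of the first two (plus tiny noise).
W = np.column_stack([
    np.exp(-((levels - 0.2) ** 2) / 0.01),
    np.exp(-((levels - 0.5) ** 2) / 0.01),
    np.exp(-((levels - 0.8) ** 2) / 0.01),
])
W = np.column_stack([W,
                     W[:, 0] + 1e-3 * rng.normal(size=60),
                     W[:, 0] + W[:, 1] + 1e-3 * rng.normal(size=60)])
picked = greedy_subset(W, 3)
```

The pivot order surfaces one channel per independent direction, which is exactly the dependency structure the analysis above exploits for band and level selection.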
Fluorescent Nanomaterials for the Development of Latent Fingerprints in Forensic Sciences
Li, Ming; Yu, Aoyang; Zhu, Ye
2018-01-01
This review presents an overview of the application of latent fingerprint development techniques in forensic sciences. Traditional development methods such as powder dusting, cyanoacrylate fuming, chemical methods, and the small particle reagent method are increasingly limited by drawbacks such as low contrast, sensitivity, and selectivity, as well as high toxicity. Recently, much attention has been paid to the use of fluorescent nanomaterials, including quantum dots (QDs) and rare earth upconversion fluorescent nanomaterials (UCNMs), due to their unique optical and chemical properties. Thus, this review lays emphasis on latent fingerprint development based on QDs and UCNMs. Compared to development by traditional methods, the new methods using fluorescent nanomaterials can achieve high contrast, sensitivity, and selectivity while showing reduced toxicity. Overall, this review provides a systematic overview of such methods. PMID:29657570
Corrêa, A M; Pereira, M I S; de Abreu, H K A; Sharon, T; de Melo, C L P; Ito, M A; Teodoro, P E; Bhering, L L
2016-10-17
The common bean, Phaseolus vulgaris, is predominantly grown on small farms and lacks accurate genotype recommendations for specific micro-regions in Brazil. This contributes to a low national average yield. The aim of this study was to use the harmonic mean of the relative performance of genetic values (HMRPGV) and the centroid method for selecting common bean genotypes with high yield, adaptability, and stability for the Cerrado/Pantanal ecotone region in Brazil. We evaluated 11 common bean genotypes in three trials carried out in the dry season in Aquidauana in 2013, 2014, and 2015. A likelihood ratio test detected a significant genotype × year interaction, contributing 54% of the total phenotypic variation in grain yield. The three genotypes selected by the joint analysis of genotypic values across all years (Carioca Precoce, BRS Notável, and CNFC 15875) were the same as those recommended by the HMRPGV method. Using the centroid method, genotypes BRS Notável and CNFC 15875 were considered ideal genotypes based on their high stability in unfavorable environments and high responsiveness to environmental improvement. We identified a high association between the adaptability and stability methods used in this study. However, the centroid method provided a more accurate and precise recommendation of the behavior of the evaluated genotypes.
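The HMRPGV criterion has a standard formulation: each genotype's value is expressed relative to the environment (trial) mean, and the harmonic mean across environments rewards both performance and stability. A sketch with hypothetical yields follows; the actual study would use BLUP-derived genotypic values from mixed models, not raw means.

```python
import numpy as np

def hmrpgv(gv):
    """Harmonic mean of the relative performance of genotypic values.
    gv: (genotypes x environments) matrix of genotypic values (assumed > 0)."""
    rel = gv / gv.mean(axis=0, keepdims=True)      # performance relative to each trial mean
    return gv.shape[1] / (1.0 / rel).sum(axis=1)   # harmonic mean across trials

# Hypothetical grain yields (t/ha) for 4 genotypes over 3 dry-season trials.
gv = np.array([[2.1, 1.9, 2.0],    # high and stable
               [2.6, 1.0, 2.5],    # higher mean but collapses in one trial
               [1.5, 1.6, 1.4],    # uniformly low
               [2.0, 2.1, 2.2]])   # high and stable
vals = hmrpgv(gv)
ranking = np.argsort(vals)[::-1]
```

Note how the harmonic mean ranks the stable genotype 0 above genotype 1, even though genotype 1 has the higher arithmetic mean yield: that penalty for instability is the point of the method.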
Thiry, Arnauld A.; Chavez Dulanto, Perla N.; Reynolds, Matthew P.; Davies, William J.
2016-01-01
The need to accelerate the selection of crop genotypes that are both resistant to and productive under abiotic stress is heightened by global warming and the increasing demand for food from a growing world population. In this paper, we propose a new method for evaluating wheat genotypes in terms of their resilience to stress and their production capacity. The method quantifies the components of a new index related to yield under abiotic stress, based on previously developed stress indices, namely the stress susceptibility index, the stress tolerance index, the mean production index, the geometric mean production index, and the tolerance index, which were created originally to evaluate drought adaptation. The method, based on a scoring scale, offers simple visualization and identification of resilient, productive, and/or contrasting genotypes according to grain yield. This new selection method could help breeders and researchers by defining clear and strong criteria to identify genotypes with high resilience and high productivity, and by providing a clear visualization of contrasts in grain yield production under stress. It is also expected that this methodology will reduce the time required for first selection and the number of first-selected genotypes for further evaluation by breeders, and provide a basis for appropriate comparisons of genotypes that would help reveal the biology behind high stress productivity of crops. PMID:27677299
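The five underlying indices have widely used textbook definitions (Fischer-Maurer SSI, Fernandez STI, plus MP, GMP, and TOL); the paper's combined scoring index is built on top of them and is not reproduced here. A sketch with invented yields:

```python
import numpy as np

def stress_indices(yp, ys):
    """Classical drought-adaptation indices underlying the paper's new index.
    yp: yields under non-stress; ys: yields under stress (one entry per genotype)."""
    si = 1 - ys.mean() / yp.mean()            # stress intensity of the trial
    return {
        "SSI": (1 - ys / yp) / si,            # stress susceptibility (lower = more resistant)
        "STI": yp * ys / yp.mean() ** 2,      # stress tolerance (higher = better)
        "MP":  (yp + ys) / 2,                 # mean productivity
        "GMP": np.sqrt(yp * ys),              # geometric mean productivity
        "TOL": yp - ys,                       # tolerance (yield loss under stress)
    }

# Hypothetical grain yields (t/ha) for three genotypes.
yp = np.array([6.0, 5.0, 4.0])
ys = np.array([4.5, 4.5, 2.0])
idx = stress_indices(yp, ys)
```

The scoring scale described above would then convert each index into a rank-based score per genotype, so that resilience (SSI, TOL) and productivity (STI, MP, GMP) can be read off together.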
Selection of Construction Methods: A Knowledge-Based Approach
Skibniewski, Miroslaw
2013-01-01
The appropriate selection of construction methods to be used during the execution of a construction project is a major determinant of high productivity, but sometimes this selection process is performed without the care and systematic approach it deserves, with negative consequences. This paper proposes a knowledge management approach that enables the intelligent use of corporate experience and information to help improve the selection of construction methods for a project. A knowledge-based system to support this decision-making process is then proposed and described. To define and design the system, semistructured interviews were conducted within three construction companies with the purpose of studying how the method-selection process is carried out in practice and the knowledge associated with it. A prototype of a Construction Methods Knowledge System (CMKS) was developed and then validated with construction industry professionals. In conclusion, the CMKS was perceived as a valuable tool for construction method selection, helping companies to generate a corporate memory on this issue and reducing both the reliance on individual knowledge and the subjectivity of the decision-making process. These benefits favor better performance of construction projects. PMID:24453925
Recursive feature selection with significant variables of support vectors.
Tsai, Chen-An; Huang, Chien-Hsun; Chang, Ching-Wei; Chen, Chun-Houh
2012-01-01
The development of DNA microarrays enables researchers to screen thousands of genes simultaneously and helps determine high- and low-expression genes in normal and disease tissues. Selecting relevant genes for cancer classification is an important issue. Most gene selection methods use univariate ranking criteria with an arbitrarily chosen threshold to select genes. However, the parameter setting may not be compatible with the selected classification algorithm. In this paper, we propose a new gene selection method (SVM-t) based on t-statistics embedded in a support vector machine. We compared its performance to two similar SVM-based methods: SVM recursive feature elimination (SVMRFE) and recursive support vector machine (RSVM). The three methods were compared through extensive simulation experiments and analyses of two published microarray datasets. In the simulation experiments, we found that the proposed method is more robust in selecting informative genes than SVMRFE and RSVM, and capable of attaining good classification performance when the variations of informative and noninformative genes are different. In the analysis of the two microarray datasets, the proposed method identifies fewer genes with good prediction accuracy, compared to SVMRFE and RSVM.
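The ranking statistic at the heart of SVM-t is the familiar two-sample t-statistic computed per gene. Here is a minimal sketch of that screening step on simulated data — without the SVM embedding described in the paper, and with invented gene indices.

```python
import numpy as np

def welch_t(X, y):
    """Two-sample Welch t-statistic per gene (the ranking statistic used by SVM-t)."""
    a, b = X[y == 0], X[y == 1]
    num = a.mean(0) - b.mean(0)
    den = np.sqrt(a.var(0, ddof=1) / len(a) + b.var(0, ddof=1) / len(b))
    return num / den

rng = np.random.default_rng(11)
y = np.repeat([0, 1], 25)                    # 25 normal vs 25 disease samples
X = rng.normal(size=(50, 1000))              # 1000 genes, mostly noise
X[y == 1, 3] += 2.0                          # up-regulated in disease tissue
X[y == 1, 42] -= 1.5                         # down-regulated in disease tissue
top = np.argsort(np.abs(welch_t(X, y)))[::-1][:5]
```

SVM-t's contribution is to fold this statistic into the SVM itself rather than ranking once and applying an arbitrary cutoff, which is what ties the selection to the classifier's performance.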
Rauniyar, Navin
2015-01-01
The parallel reaction monitoring (PRM) assay has emerged as an alternative method of targeted quantification. The PRM assay is performed in a high resolution and high mass accuracy mode on a mass spectrometer. This review presents the features that make PRM a highly specific and selective method for targeted quantification using quadrupole-Orbitrap hybrid instruments. In addition, this review discusses the label-based and label-free methods of quantification that can be performed with the targeted approach. PMID:26633379
Jager, N G L; Rosing, H; Linn, S C; Schellens, J H M; Beijnen, J H
2012-06-01
The antiestrogenic effect of tamoxifen is mainly attributable to the active metabolites endoxifen and 4-hydroxytamoxifen. This effect is assumed to be concentration-dependent, and quantitative analysis of tamoxifen and its metabolites for clinical studies and therapeutic drug monitoring is therefore increasing. We investigated the large discrepancies in reported mean endoxifen and 4-hydroxytamoxifen concentrations. Two published LC-MS/MS methods were used to analyse a set of 75 serum samples from patients treated with tamoxifen. The method of Teunissen et al. (J Chrom B, 879:1677-1685, 2011) separates endoxifen and 4-hydroxytamoxifen from other tamoxifen metabolites with similar masses and fragmentation patterns. The second method, published by Gjerde et al. (J Chrom A, 1082:6-14, 2005), however, lacks selectivity, resulting in a factor 2-3 overestimation of the endoxifen and 4-hydroxytamoxifen levels, respectively. We emphasize the use of highly selective LC-MS/MS methods for the quantification of tamoxifen and its metabolites in biological samples.
[Effect of space flight on yield of Monascus purpureus].
Yin, Hong; Xie, Shen-yi; Zhang, Guang-ming; Xie, Shen-meng
2003-10-01
To select a high lovastatin-producing microbial strain by space flight, Monascus purpureus was carried into space aboard the recoverable spaceship "Shenzhou 3". After the flight, the strain was rejuvenated, isolated, and screened, and the lovastatin content produced in solid fermentation was examined. Mutants with high lovastatin productivity were obtained, and a series of tests showed that the acquired character of the mutants was stable. Space flight is an effective method for the selection of fine strains.
Prediction-Oriented Marker Selection (PROMISE): With Application to High-Dimensional Regression.
Kim, Soyeon; Baladandayuthapani, Veerabhadran; Lee, J Jack
2017-06-01
In personalized medicine, biomarkers are used to select therapies with the highest likelihood of success based on an individual patient's biomarker/genomic profile. Two goals are to choose important biomarkers that accurately predict treatment outcomes and to cull unimportant biomarkers to reduce the cost of biological and clinical verification. These goals are challenging due to the high dimensionality of genomic data. Variable selection methods based on penalized regression (e.g., the lasso and elastic net) have yielded promising results. However, selecting the right amount of penalization is critical to achieving both goals simultaneously. Standard approaches based on cross-validation (CV) typically provide high prediction accuracy with high true positive rates, but at the cost of too many false positives. Alternatively, stability selection (SS) controls the number of false positives, but at the cost of yielding too few true positives. To circumvent these issues, we propose prediction-oriented marker selection (PROMISE), which combines SS with CV to capture the advantages of both methods. Applying PROMISE with the lasso and elastic net in data analysis shows that, compared to CV, PROMISE produces sparse solutions, few false positives, and small type I + type II error, and maintains good prediction accuracy, with only a marginal decrease in true positive rates. Compared to SS, PROMISE offers better prediction accuracy and higher true positive rates. In summary, PROMISE can be applied in many fields to select regularization parameters when the goals are to minimize false positives and maximize prediction accuracy.
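The interplay of stability selection and cross-validation described above can be illustrated with a toy stability-selection sketch. Note this is a hedged illustration, not the paper's algorithm: a simple correlation screen stands in for the lasso/elastic net base selector, and the data, screening threshold (which CV would tune in PROMISE), and stability cutoff are made-up assumptions.

```python
import numpy as np

def selection_frequencies(X, y, threshold, n_subsamples=50, seed=0):
    """Fraction of half-subsamples in which each feature passes the base
    selector. A simple |correlation| >= threshold screen stands in for
    the lasso used in the paper."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    counts = np.zeros(p)
    for _ in range(n_subsamples):
        idx = rng.choice(n, size=n // 2, replace=False)
        Xs, ys = X[idx], y[idx]
        Xc = (Xs - Xs.mean(0)) / (Xs.std(0) + 1e-12)   # standardize features
        yc = (ys - ys.mean()) / (ys.std() + 1e-12)      # standardize response
        corr = np.abs(Xc.T @ yc) / len(ys)              # marginal correlations
        counts += corr >= threshold                     # base-selector hits
    return counts / n_subsamples

# Toy data: features 0 and 1 carry the signal, the other 18 are noise.
rng = np.random.default_rng(42)
X = rng.standard_normal((200, 20))
y = X[:, 0] + X[:, 1] + 0.5 * rng.standard_normal(200)

freq = selection_frequencies(X, y, threshold=0.4)
selected = np.where(freq >= 0.6)[0]   # stability cutoff on selection frequency
```

Noise features rarely pass the screen on any half-subsample, so their selection frequencies stay near zero while the true signals approach one, which is the false-positive control that SS contributes.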
da Silva, Pedro Henrique Reis; Diniz, Melina Luiza Vieira; Pianetti, Gerson Antônio; da Costa César, Isabela; Ribeiro E Silva, Maria Elisa Scarpelli; de Souza Freitas, Roberto Fernando; de Sousa, Ricardo Geraldo; Fernandes, Christian
2018-07-01
Lumefantrine is the first-choice treatment for uncomplicated falciparum malaria. Recent findings of resistance to lumefantrine have brought attention to the importance of therapeutic monitoring, since exposure to subtherapeutic doses of antimalarials after administration is a major cause of selection of resistant parasites. Therefore, this study focused on the development of innovative, selective, less expensive and stable molecularly imprinted polymers (MIPs) for solid-phase extraction (SPE) of lumefantrine from human plasma for use in drug monitoring. Polymers were synthesized by precipitation polymerization, and chemometric tools (Box-Behnken design and response surface methodology) were employed for rational optimization of the synthetic parameters. Optimum conditions were achieved with 2-vinylpyridine as monomer, ethylene glycol dimethacrylate as crosslinker and toluene as porogen, at a molar ratio of 1:6:30 of template/monomer/crosslinker, with azo-bisisobutyronitrile as initiator at 65 °C. The MIP obtained was characterized and exhibited high thermal stability, adequate surface morphology and porosity, and high binding properties, with high affinity (adsorption capacity of 977.83 μg g⁻¹) and selectivity (imprinting factor of 2.44; selectivity factor of 1.48 and selectivity constant of 1.44 compared with halofantrine). Doehlert matrix and fractional designs were satisfactorily used for the development and optimization of a MISPE-HPLC-UV method for the determination of lumefantrine. The method fulfilled all validation parameters, with recoveries ranging from 83.68% to 85.42%, and was applied for quantitation of the drug in plasma from two healthy volunteers, with results of 1407.89 and 1271.35 ng mL⁻¹, respectively. Therefore, the MISPE-HPLC-UV method optimized through chemometrics provided a rapid, highly selective, less expensive and reproducible approach for lumefantrine drug monitoring. Copyright © 2018 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Beach, Daniel G.
2017-08-01
Paralytic shellfish toxins (PSTs) are neurotoxins produced by dinoflagellates and cyanobacteria that cause paralytic shellfish poisoning in humans. PST quantitation by LC-MS is challenging because of their high polarity, lability as gas-phase ions, and large number of potentially interfering analogues. Differential mobility spectrometry (DMS) has the potential to improve the performance of LC-MS methods for PSTs in terms of selectivity and limits of detection. This work describes a comprehensive investigation of the separation of 16 regulated PSTs by DMS and the development of highly selective LC-DMS-MS methods for PST quantitation. The effects of all DMS parameters on the separation of PSTs from one another were first investigated in detail. The labile nature of 11α-gonyautoxin epimers gave unique insight into fragmentation of labile analytes before, during, and after the DMS analyzer. Two sets of DMS parameters were identified that either optimized the resolution of PSTs from one another or transmitted them at a limited number of compensation voltage (CV) values corresponding to structural subclasses. These were used to develop multidimensional LC-DMS-MS/MS methods using existing HILIC-MS/MS parameters. In both cases, improved selectivity was observed when using DMS, and the quantitative capabilities of a rapid UPLC-DMS-MS/MS method were evaluated. Limits of detection of the developed method were similar to those without DMS, and differences were highly analyte-dependent. Analysis of shellfish matrix reference materials showed good agreement with established methods. The developed methods will be useful in cases where specific matrix interferences are encountered in the LC-MS/MS analysis of PSTs in complex biological samples.
Concave 1-norm group selection
Jiang, Dingfeng; Huang, Jian
2015-01-01
Grouping structures arise naturally in many high-dimensional problems. Incorporating such information can improve model fitting and variable selection. Existing group selection methods, such as the group lasso, require correct group membership. In practice, however, it can be difficult to correctly specify the group membership of all variables. Thus, it is important to develop group selection methods that are robust against group mis-specification. It is also desirable in many applications to select groups as well as individual variables. We propose a class of concave ℓ1-norm group penalties that is robust to grouping structure and can perform bi-level selection. A coordinate descent algorithm is developed to compute solutions of the proposed group selection method. Theoretical convergence of the algorithm is proved under certain regularity conditions. Comparison with other methods suggests the proposed method is the most robust approach under membership mis-specification. Simulation studies and a real data application indicate that the concave ℓ1-norm group selection approach achieves better control of false discovery rates. An R package grppenalty implementing the proposed method is available at CRAN. PMID:25417206
Liu, Dongkui; Lu, Xing; Yang, Yiwen; Zhai, Yunyun; Zhang, Jian; Li, Lei
2018-05-04
Acute myocardial infarction (AMI) is one of the leading risks to global health, so rapid, accurate early diagnosis of AMI is highly critical. Human cardiac troponin I (cTnI) is regarded as a gold-standard biomarker for AMI due to its excellent selectivity. In this work, a novel fluorescent aptasensor based on a graphene oxide (GO) platform was developed for the highly sensitive and selective detection of cTnI. GO binds the fluorescently labeled anti-cTnI aptamer and quenches its fluorescence. In the presence of cTnI, the aptamer leaves the GO surface and binds cTnI because of their strong affinity, restoring the aptamer's fluorescence. This fluorescence-enhanced detection is highly sensitive and selective for cTnI. The method exhibited good analytical performance, with a linear dynamic range of 0.10-6.0 ng/mL and a low detection limit of 0.07 ng/mL (S/N = 3). The aptasensor also exhibited high selectivity toward cTnI compared with other interfering proteins. The proposed method may be a useful tool for cTnI determination in human serum. Graphical abstract: A novel fluorescent aptasensor for the highly sensitive and selective detection of cardiac troponin I based on a graphene oxide platform.
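The quoted detection limit follows the standard S/N = 3 convention, LOD = 3σ(blank)/slope of the calibration curve. A minimal sketch with made-up calibration numbers (not the paper's data):

```python
import numpy as np

# Hypothetical calibration standards (ng/mL) and sensor responses (a.u.).
conc = np.array([0.0, 1.0, 2.0, 4.0, 6.0])
signal = 2.0 * conc + 0.1            # idealized linear response
slope, intercept = np.polyfit(conc, signal, 1)

sigma_blank = 0.05                   # std. dev. of repeated blank measurements
lod = 3.0 * sigma_blank / slope      # S/N = 3 criterion
print(round(lod, 3))                 # → 0.075
```

With these illustrative numbers the limit of detection comes out to 0.075 ng/mL; the same arithmetic with the sensor's real blank noise and calibration slope yields the 0.07 ng/mL figure quoted in the abstract.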
NASA Astrophysics Data System (ADS)
Fujita, Yusuke; Mitani, Yoshihiro; Hamamoto, Yoshihiko; Segawa, Makoto; Terai, Shuji; Sakaida, Isao
2017-03-01
Ultrasound imaging is a popular, non-invasive tool for diagnosing liver disease. Cirrhosis is a chronic liver disease that can advance to liver cancer; early detection and appropriate treatment are crucial to prevent liver cancer. However, ultrasound image analysis is very challenging because of the low signal-to-noise ratio of ultrasound images. To achieve high classification performance, the selection of training regions of interest (ROIs), which strongly affects classification accuracy, is very important. The purpose of our study is cirrhosis detection with high accuracy from liver ultrasound images. In our previous work, training ROI selection by MILBoost and multiple-ROI classification based on the product rule were proposed to achieve high classification performance. In this article, we propose a self-training method to select training ROIs effectively. Evaluation experiments assessed the effect of self-training using both manually and automatically selected ROIs. The results show that self-training on manually selected ROIs achieved higher classification performance than the other approaches, including our previous methods. Manual ROI definition and sample selection are thus important for improving classification accuracy in cirrhosis detection from ultrasound images.
Takahashi, Mayumi; Wu, Xiwei; Ho, Michelle; Chomchan, Pritsana; Rossi, John J; Burnett, John C; Zhou, Jiehua
2016-09-22
The systematic evolution of ligands by exponential enrichment (SELEX) technique is a powerful and effective aptamer-selection procedure. However, modifications to the process can dramatically improve selection efficiency and aptamer performance. For example, droplet digital PCR (ddPCR) has recently been incorporated into SELEX selection protocols to putatively reduce the propagation of byproducts and avoid the selection bias that results from differences in the PCR efficiency of sequences within the random library. However, a detailed, parallel comparison of the efficacy of conventional solution PCR versus the ddPCR modification in the RNA aptamer-selection process is needed to understand effects on overall SELEX performance. In the present study, we took advantage of powerful high throughput sequencing technology and bioinformatics analysis coupled with SELEX (HT-SELEX) to thoroughly investigate the effects of the initial library and PCR method on RNA aptamer identification. Our analysis revealed that distinct "biased sequences" and nucleotide compositions existed in the initial, unselected libraries purchased from two different manufacturers, and that the fate of the "biased sequences" was target-dependent during selection. Our comparison of solution PCR- and ddPCR-driven HT-SELEX demonstrated that the PCR method affected not only the nucleotide composition of the enriched sequences but also the overall SELEX efficiency and aptamer efficacy.
Teodoro, P E; Bhering, L L; Costa, R D; Rocha, R B; Laviola, B G
2016-08-19
The aim of this study was to estimate genetic parameters via mixed models and simultaneously select Jatropha progenies grown in three regions of Brazil that combine high adaptability and stability. From a previous phenotypic selection, three progeny tests were installed in 2008 in the municipalities of Planaltina-DF (Midwest), Nova Porteirinha-MG (Southeast), and Pelotas-RS (South). We evaluated 18 half-sib families in a randomized block design with three replications. Genetic parameters were estimated using restricted maximum likelihood/best linear unbiased prediction. Selection was based on the harmonic mean of the relative performance of genetic values method under three strategies: 1) performance in each environment (with interaction effect); 2) performance in the mean environment (without interaction effect); and 3) simultaneous selection for grain yield, stability, and adaptability. The accuracy obtained (91%) reveals excellent experimental quality and, consequently, safety and credibility in the selection of superior progenies for grain yield. The gain from selecting the best five progenies was more than 20%, regardless of the selection strategy. Thus, based on the three selection strategies used in this study, progenies 4, 11, and 3 (selected in all environments and in the mean environment, and by the adaptability and phenotypic stability methods) are the most suitable for growing in the three regions evaluated.
SU-E-J-128: Two-Stage Atlas Selection in Multi-Atlas-Based Image Segmentation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao, T; Ruan, D
2015-06-15
Purpose: In the new era of big data, multi-atlas-based image segmentation is challenged by heterogeneous atlas quality and the high computational burden of extensive atlas collections, demanding efficient identification of the most relevant atlases. This study aims to develop a two-stage atlas selection scheme to achieve computational economy with a performance guarantee. Methods: We develop a low-cost fusion set selection scheme by introducing a preliminary selection that trims the full atlas collection into an augmented subset, alleviating the need for extensive full-fledged registrations. More specifically, fusion set selection is performed in two successive steps: preliminary selection and refinement. An augmented subset is first roughly selected from the whole atlas collection with a simple registration scheme and the corresponding preliminary relevance metric; the augmented subset is then refined to the desired fusion set size using full-fledged registration and the associated relevance metric. The main novelty of this work is the introduction of an inference model relating the preliminary and refined relevance metrics, from which the augmented subset size is rigorously derived to ensure that the desired atlases survive the preliminary selection with high probability. Results: The performance and complexity of the proposed two-stage atlas selection method were assessed using a collection of 30 prostate MR images. It achieved segmentation accuracy comparable to the conventional one-stage method with full-fledged registration, but significantly reduced computation time to 1/3 (from 30.82 to 11.04 min per segmentation). Compared with an alternative one-stage cost-saving approach, the proposed scheme yielded superior performance, with mean and median DSC of (0.83, 0.85) versus (0.74, 0.78). Conclusion: This work has developed a model-guided two-stage atlas selection scheme that achieves significant cost reduction while guaranteeing high segmentation accuracy. The benefit in both complexity and performance is expected to be most pronounced with large-scale heterogeneous data.
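The trim-then-refine logic of the two-stage scheme can be sketched as follows. This is a hedged illustration: the scores and subset sizes are hypothetical, and the expensive score function merely stands in for a full-fledged registration metric.

```python
import numpy as np

def two_stage_select(cheap_scores, expensive_score_fn, m_augment, k_fuse):
    """Stage 1: trim the atlas pool to an augmented subset of size m_augment
    using a cheap preliminary relevance score. Stage 2: rank only the
    survivors with the expensive metric and keep the top k_fuse."""
    pool = np.argsort(cheap_scores)[::-1][:m_augment]    # preliminary trim
    refined = sorted(pool, key=expensive_score_fn, reverse=True)
    return [int(i) for i in refined[:k_fuse]]            # final fusion set

# Toy pool of 5 atlases; here the expensive metric agrees with the cheap one,
# so the two-stage result matches an exhaustive full-cost ranking while only
# 3 of 5 atlases ever incur the expensive evaluation.
cheap = [0.1, 0.9, 0.5, 0.7, 0.3]
fusion_set = two_stage_select(cheap, lambda i: cheap[i], m_augment=3, k_fuse=2)
print(fusion_set)  # → [1, 3]
```

The paper's contribution is choosing `m_augment` from an inference model linking the two metrics so the true top atlases survive stage 1 with high probability; here it is simply fixed by hand.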
ERIC Educational Resources Information Center
Owusu, K. A.; Monney, K. A.; Appiah, J. Y.; Wilmot, E. M.
2010-01-01
This study investigated the comparative efficiency of computer-assisted instruction (CAI) and the conventional teaching method in biology for senior high school students. A science class was selected in each of two randomly selected schools. The pretest-posttest nonequivalent quasi-experimental design was used. The students in the experimental group…
Reddy, Leleti Rajender; Kotturi, Sharadsrikar; Waman, Yogesh; Patel, Chirag; Patwa, Aditya; Shenoy, Rajesh
2018-06-06
A highly regio- and diastereo-selective ortho-lithiation/addition of anisoles to N-tert-butanesulfinyl imines resulting in the selective formation of chiral α-branched amines is described. This method is also efficient for highly regioselective benzylic lithiation of o-methylanisoles, followed by diastereoselective addition to N-tert-butanesulfinyl imines.
Method Of Signal Amplification In Multi-Chromophore Luminescence Sensors
Levitsky, Igor A.; Krivoshlykov, Sergei G.
2004-02-03
A fluorescence-based method for highly sensitive and selective detection of analyte molecules is proposed. The method employs energy transfer between two or more fluorescent chromophores in a carefully selected polymer matrix. In one preferred embodiment, signal amplification has been achieved in the fluorescent sensing of dimethyl methylphosphonate (DMMP) using two dyes, 3-aminofluoranthene (AM) and Nile Red (NR), in a hydrogen-bond-acidic polymer matrix. The selected polymer matrix quenches the fluorescence of both dyes and shifts dye emission and absorption spectra relative to more inert matrices. Upon DMMP sorption, the AM fluorescence shifts to the red while the NR absorption shifts to the blue, resulting in better band overlap and increased energy transfer between chromophores. In another preferred embodiment, the sensitive material is incorporated into an optical fiber system, enabling efficient excitation of the dye and collection of the fluorescent signal from the sensitive material at the remote end of the system. The proposed method can be applied to multichromophore luminescence sensor systems incorporating N chromophores, leading to N-fold signal amplification and improved selectivity. The method can be used in all applications where highly sensitive detection of basic gases, such as dimethyl methylphosphonate (DMMP), Sarin, Soman and other chemical warfare agents having basic properties, is required, including environmental monitoring, the chemical industry and medicine.
Selected-zone dark-field electron microscopy.
NASA Technical Reports Server (NTRS)
Heinemann, K.; Poppa, H.
1972-01-01
Description of a new method which makes it possible to reduce drastically the resolution-limiting influence of chromatic aberration, and thus to obtain high-quality images, by selecting the image-forming electrons that have passed through a small annular zone of an objective lens. In addition, the manufacture of special objective-lens aperture diaphragms that are needed for this method is also described.
Hong, Bor-Cherng; Dange, Nitin S; Yen, Po-Jen; Lee, Gene-Hsiang; Liao, Ju-Hsiou
2012-10-19
A new method has been developed for the enantioselective synthesis of highly functionalized hydropentalenes bearing up to four stereogenic centers with high stereoselectivity (up to 99% ee). This process combines an enantioselective organocatalytic anti-selective Michael addition with a highly efficient one-pot reduction/lactonization/Pauson-Khand reaction sequence. The structures and absolute configurations of the products were confirmed by X-ray analysis.
Plasma spraying method for forming diamond and diamond-like coatings
Holcombe, C.E.; Seals, R.D.; Price, R.E.
1997-06-03
A method and composition are disclosed for the deposition of a thick layer of diamond or diamond-like material. The method includes high-temperature processing wherein a selected composition including at least glassy carbon is heated in a direct-current plasma arc device to a selected temperature above its softening point, in an inert atmosphere, and is propelled onto a selected substrate, where it is quickly quenched. The softened or molten composition crystallizes on the substrate to form a thick deposition layer comprising at least a diamond or diamond-like material. The selected composition includes at least glassy carbon as a primary constituent and may include at least one secondary constituent. Preferably, the secondary constituents are selected from the group consisting of diamond powder, boron carbide (B4C) powder, and mixtures thereof. 9 figs.
Major, Kevin J; Poutous, Menelaos K; Ewing, Kenneth J; Dunnill, Kevin F; Sanghera, Jasbinder S; Aggarwal, Ishwar D
2015-09-01
Optical filter-based chemical sensing techniques provide a new avenue for developing low-cost infrared sensors. These methods utilize multiple infrared optical filters to selectively measure different response functions for various chemicals, dependent on each chemical's infrared absorption. Rather than identifying distinct spectral features, which can then be used to determine the identity of a target chemical, optical filter-based approaches rely on measuring differences in the ensemble response between a given filter set and specific chemicals of interest. The results of such methods are therefore highly dependent on the original choice of optical filters, which dictates the selectivity, sensitivity, and stability of any filter-based sensing method. Recently, a method has been developed that utilizes unique detection vector operations defined by optical multifilter responses to discriminate between volatile chemical vapors. This method, comparative-discrimination spectral detection (CDSD), employs broadband optical filters to selectively discriminate between chemicals with highly overlapping infrared absorption spectra. CDSD has been shown to correctly distinguish between similar chemicals in the carbon-hydrogen stretch region of the infrared absorption spectrum, 2800-3100 cm⁻¹. A key challenge to this approach is determining which optical filter sets achieve the greatest discrimination between target chemicals. Previous studies used empirical approaches to select the optical filter set; however, this is insufficient to determine the optimum selectivity between strongly overlapping chemical spectra. Here we present a numerical approach to systematically study the effects of filter positioning and bandwidth on a number of three-chemical systems. We describe how both the filter properties and the chemicals in each set affect the CDSD results and subsequent discrimination.
These results demonstrate the importance of choosing the proper filter set and chemicals for comparative discrimination, in order to identify the target chemical of interest in the presence of closely matched chemical interferents. These findings are an integral step in the development of experimental prototype sensors, which will utilize CDSD.
Yang, Ziheng; Zhu, Tianqi
2018-02-20
The Bayesian method is noted to produce spuriously high posterior probabilities for phylogenetic trees in analysis of large datasets, but the precise reasons for this overconfidence are unknown. In general, the performance of Bayesian selection of misspecified models is poorly understood, even though this is of great scientific interest since models are never true in real data analysis. Here we characterize the asymptotic behavior of Bayesian model selection and show that when the competing models are equally wrong, Bayesian model selection exhibits surprising and polarized behaviors in large datasets, supporting one model with full force while rejecting the others. If one model is slightly less wrong than the other, the less wrong model will eventually win when the amount of data increases, but the method may become overconfident before it becomes reliable. We suggest that this extreme behavior may be a major factor for the spuriously high posterior probabilities for evolutionary trees. The philosophical implications of our results to the application of Bayesian model selection to evaluate opposing scientific hypotheses are yet to be explored, as are the behaviors of non-Bayesian methods in similar situations.
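The polarized behavior described above can be reproduced in a toy Bernoulli setting. This is a hedged illustration, not the paper's analysis: both candidate models are wrong about the true heads probability, with one slightly less wrong, and the specific probabilities and sample size are made-up assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.random(20000) < 0.5           # true heads probability 0.5

def log_lik(p, flips):
    """Bernoulli log-likelihood of a fixed-parameter model."""
    heads = flips.sum()
    return heads * np.log(p) + (len(flips) - heads) * np.log(1 - p)

# Model A (p = 0.48) is slightly less wrong than model B (p = 0.60);
# with equal prior probabilities the posterior of A is a logistic
# transform of the log Bayes factor.
log_bf = log_lik(0.48, data) - log_lik(0.60, data)
post_a = 1.0 / (1.0 + np.exp(-log_bf))
print(post_a)
```

With 20,000 observations the log Bayes factor grows linearly in n while its fluctuations grow only like √n, so the posterior for the less-wrong model is driven to essentially 1 — the "full force" support the abstract describes, long before a modest sample would make such confidence warranted.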
Moon, Jihea; Kim, Giyoung; Park, Saet Byeol; Lim, Jongguk; Mo, Changyeun
2015-01-01
Whole-cell Systemic Evolution of Ligands by Exponential enrichment (SELEX) is the process by which aptamers specific to target cells are developed. Aptamers selected by whole-cell SELEX have high affinity and specificity for bacterial surface molecules and live bacterial targets. To identify DNA aptamers specific to Staphylococcus aureus, we applied our rapid whole-cell SELEX method to a single-stranded ssDNA library. To improve the specificity and selectivity of the aptamers, we designed, selected, and developed two categories of aptamers that were selected by two kinds of whole-cell SELEX, by mixing and combining FACS analysis and a counter-SELEX process. Using this approach, we have developed a biosensor system that employs a high affinity aptamer for detection of target bacteria. FAM-labeled aptamer sequences with high binding to S. aureus, as determined by fluorescence spectroscopic analysis, were identified, and aptamer A14, selected by the basic whole-cell SELEX using a once-off FACS analysis, and which had a high binding affinity and specificity, was chosen. The binding assay was evaluated using FACS analysis. Our study demonstrated the development of a set of whole-cell SELEX derived aptamers specific to S. aureus; this approach can be used in the identification of other bacteria. PMID:25884791
Moon, Jihea; Kim, Giyoung; Park, Saet Byeol; Lim, Jongguk; Mo, Changyeun
2015-04-15
Whole-cell Systemic Evolution of Ligands by Exponential enrichment (SELEX) is the process by which aptamers specific to target cells are developed. Aptamers selected by whole-cell SELEX have high affinity and specificity for bacterial surface molecules and live bacterial targets. To identify DNA aptamers specific to Staphylococcus aureus, we applied our rapid whole-cell SELEX method to a single-stranded ssDNA library. To improve the specificity and selectivity of the aptamers, we designed, selected, and developed two categories of aptamers that were selected by two kinds of whole-cell SELEX, by mixing and combining FACS analysis and a counter-SELEX process. Using this approach, we have developed a biosensor system that employs a high affinity aptamer for detection of target bacteria. FAM-labeled aptamer sequences with high binding to S. aureus, as determined by fluorescence spectroscopic analysis, were identified, and aptamer A14, selected by the basic whole-cell SELEX using a once-off FACS analysis, and which had a high binding affinity and specificity, was chosen. The binding assay was evaluated using FACS analysis. Our study demonstrated the development of a set of whole-cell SELEX derived aptamers specific to S. aureus; this approach can be used in the identification of other bacteria.
Smits, Niels; van der Ark, L Andries; Conijn, Judith M
2017-11-02
Two important goals when using questionnaires are (a) measurement: the questionnaire is constructed to assign numerical values that accurately represent the test taker's attribute, and (b) prediction: the questionnaire is constructed to give an accurate forecast of an external criterion. Construction methods aimed at measurement prescribe that items should be reliable; in practice, this leads to questionnaires with high inter-item correlations. By contrast, construction methods aimed at prediction typically prescribe that items have a high correlation with the criterion and low inter-item correlations. The latter approach has often been said to produce a paradox concerning the relation between reliability and validity [1-3], because it is often assumed that good measurement is a prerequisite of good prediction. We answer four questions: (1) Why are measurement-based methods suboptimal for questionnaires that are used for prediction? (2) How should one construct a questionnaire that is used for prediction? (3) Do questionnaire-construction methods that optimize measurement and prediction lead to the selection of different items? (4) Is it possible to construct a questionnaire that can be used for both measurement and prediction? An empirical data set consisting of the scores of 242 respondents on questionnaire items measuring mental health is used to select items by two methods: one that optimizes the predictive value of the scale (i.e., its ability to forecast a clinical diagnosis), and one that optimizes the reliability of the scale. We show that the two methods select different sets of items and that a scale constructed to meet one goal does not show optimal performance with respect to the other. The answers are as follows: (1) Because measurement-based methods tend to maximize inter-item correlations, which reduces predictive validity.
(2) By selecting items that correlate strongly with the criterion and weakly with the remaining items. (3) Yes; these methods may lead to different item selections. (4) For a single questionnaire: yes, but this is problematic because reliability cannot then be estimated accurately. For a test battery: yes, but it is very costly. Implications for the construction of patient-reported outcome questionnaires are discussed.
A Band Selection Method for High-Precision Registration of Hyperspectral Images
NASA Astrophysics Data System (ADS)
Yang, H.; Li, X.
2018-04-01
During the registration of hyperspectral images with high-spatial-resolution images, the large number of bands in a hyperspectral image makes it difficult to select bands with good registration performance, and poorly matching bands can reduce matching speed and accuracy. To solve this problem, an algorithm based on Cramér-Rao lower bound (CRLB) theory is proposed in this paper to select good matching bands. The algorithm applies CRLB theory to the study of registration accuracy and selects good matching bands using CRLB parameters. Experiments show that the algorithm can choose good matching bands and provide better data for the registration of hyperspectral images with high-spatial-resolution images.
A probabilistic and multi-objective analysis of lexicase selection and ε-lexicase selection.
Cava, William La; Helmuth, Thomas; Spector, Lee; Moore, Jason H
2018-05-10
Lexicase selection is a parent selection method that considers training cases individually, rather than in aggregate, when performing parent selection. Whereas previous work has demonstrated the ability of lexicase selection to solve difficult problems in program synthesis and symbolic regression, the central goal of this paper is to develop the theoretical underpinnings that explain its performance. To this end, we derive an analytical formula that gives the expected probabilities of selection under lexicase selection, given a population and its behavior. In addition, we expand upon the relation of lexicase selection to many-objective optimization methods to describe the behavior of lexicase selection, which is to select individuals on the boundaries of Pareto fronts in high-dimensional space. We show analytically why lexicase selection performs more poorly for certain sizes of population and training cases, and show why it has been shown to perform more poorly in continuous error spaces. To address this last concern, we propose new variants of ε-lexicase selection, a method that modifies the pass condition in lexicase selection to allow near-elite individuals to pass cases, thereby improving selection performance with continuous errors. We show that ε-lexicase outperforms several diversity-maintenance strategies on a number of real-world and synthetic regression problems.
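The selection procedure the abstract analyzes can be sketched in a few lines. The following is a minimal illustration of (ε-)lexicase selection, not the authors' implementation; the data layout (`errors[i][t]` as the error of individual `i` on training case `t`) is an assumption made for the sketch:

```python
import random

def lexicase_select(population, errors, epsilon=0.0):
    """Return one parent chosen by (epsilon-)lexicase selection.

    population -- list of individual indices
    errors     -- errors[i][t] is individual i's error on training case t
    epsilon    -- tolerance; 0.0 gives standard lexicase selection
    """
    candidates = list(population)
    cases = list(range(len(errors[candidates[0]])))
    random.shuffle(cases)  # cases are considered one at a time, in random order
    for t in cases:
        best = min(errors[i][t] for i in candidates)
        # keep only individuals within epsilon of the elite error on this case
        candidates = [i for i in candidates if errors[i][t] <= best + epsilon]
        if len(candidates) == 1:
            return candidates[0]
    return random.choice(candidates)  # remaining ties are broken at random
```

With `epsilon > 0`, near-elite individuals survive each filtering step, which is the modification the paper proposes for continuous error spaces.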
NASA Astrophysics Data System (ADS)
Di, Yue; Jin, Yi; Jiang, Hong-liang; Zhai, Chao
2013-09-01
Due to the particularities of high-speed flow, a temperature measurement system should be non-intrusive and non-contact and should have high time resolution. Traditional measurement methods cannot meet these requirements, but a method based on tunable diode laser absorption spectroscopy (TDLAS) can. When near-infrared light of a specific frequency passes through the medium to be measured, it is absorbed by water vapor molecules, and the transmitted light intensity is detected by a detector. The temperature of the water vapor, which is also the temperature of the high-speed flow, can then be obtained accurately from the Beer-Lambert law. This paper focuses on the absorption spectroscopy method for high-speed flow temperature measurement in the range 250-500 K. First, a spectral line selection method for low-temperature measurement of high-speed flow is discussed. Selected absorption lines should be isolated and have high peak absorption within 250-500 K, while interference from other lines should be avoided, so that high measurement accuracy can be obtained. According to the near-infrared absorption spectrum of water vapor, four absorption lines near 1395 nm and 1409 nm are selected. Second, a system for measuring the temperature of water vapor in high-speed flow is established. Room temperature was measured by two methods, direct absorption spectroscopy (DAS) and wavelength modulation spectroscopy (WMS); the results show that the system can realize on-line temperature measurement with a measurement error of about 3%. Finally, the system will be used for temperature measurement of the high-speed flow in a shock tunnel, and its measurement feasibility is analyzed.
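The temperature-inference step can be illustrated with the standard two-line ratio form that follows from the Beer-Lambert law. This is a generic sketch, not the paper's code; the reference ratio `R0`, the lower-state energy difference `dE`, and the reference temperature `T0` are illustrative inputs that would in practice come from a spectral database such as HITRAN:

```python
import math

C2 = 1.4388  # second radiation constant hc/k_B, in cm*K

def two_line_temperature(R, R0, dE, T0=296.0):
    """Infer gas temperature from the ratio of two line absorbances.

    The Beer-Lambert absorbance of each line scales with its line
    strength S_i(T), so the ratio R = A1/A2 of two lines whose
    lower-state energies differ by dE (in cm^-1) obeys
        ln(R/R0) = -C2 * dE * (1/T - 1/T0),
    where R0 is the same ratio at the reference temperature T0.
    """
    inv_T = 1.0 / T0 - math.log(R / R0) / (C2 * dE)
    return 1.0 / inv_T
```

Inverting the ratio rather than a single absorbance cancels the (often unknown) path length and mole fraction, which is why two-line thermometry is standard in TDLAS work.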
NASA Astrophysics Data System (ADS)
Zhang, Lufeng; Du, Jianxiu
2016-04-01
The development of a highly selective and sensitive method for iron(III) detection is of great importance from both human-health and environmental points of view. We herein report a simple, selective and sensitive colorimetric method for the detection of Fe(III) at the submicromolar level with 3,3′,5,5′-tetramethylbenzidine (TMB) as a chromogenic probe. It was observed that Fe(III) could directly oxidize TMB to form a blue solution without adding any extra oxidants. The reaction has a stoichiometric ratio of 1:1 (Fe(III)/TMB) as determined by a molar ratio method. The resulting color change can be perceived by the naked eye or monitored via the absorbance change at 652 nm. The method allowed the measurement of Fe(III) in the range 1.0 × 10⁻⁷ to 1.5 × 10⁻⁴ mol L⁻¹ with a detection limit of 5.5 × 10⁻⁸ mol L⁻¹. The relative standard deviation was 0.9% for eleven replicate measurements of a 2.5 × 10⁻⁵ mol L⁻¹ Fe(III) solution. The chemistry showed high selectivity for Fe(III) over other common cations. The practicality of the method was evaluated by the determination of Fe in milk samples; good consistency was obtained between the results of this method and atomic absorption spectrophotometry, as indicated by statistical analysis.
GWASinlps: Nonlocal prior based iterative SNP selection tool for genome-wide association studies.
Sanyal, Nilotpal; Lo, Min-Tzu; Kauppi, Karolina; Djurovic, Srdjan; Andreassen, Ole A; Johnson, Valen E; Chen, Chi-Hua
2018-06-19
Multiple-marker analysis of genome-wide association study (GWAS) data has gained ample attention in recent years. However, because of the ultra-high dimensionality of GWAS data, such analysis is challenging. Frequently used penalized regression methods often lead to a large number of false positives, whereas Bayesian methods are computationally very expensive. Motivated to ameliorate these issues simultaneously, we consider the novel approach of using nonlocal priors in an iterative variable selection framework. We develop a variable selection method, named GWASinlps (iterative nonlocal prior-based selection for GWAS), that combines, in an iterative variable selection framework, the computational efficiency of the screen-and-select approach based on some association learning and the parsimonious uncertainty quantification provided by the use of nonlocal priors. The hallmark of our method is the introduction of a 'structured screen-and-select' strategy, which considers hierarchical screening based not only on response-predictor associations but also on response-response associations, and concatenates variable selection within that hierarchy. Extensive simulation studies with SNPs having realistic linkage disequilibrium structures demonstrate the advantages of our computationally efficient method compared to several frequentist and Bayesian variable selection methods, in terms of true positive rate, false discovery rate, mean squared error, and effect size estimation error. Further, we provide an empirical power analysis useful for study design. Finally, a real GWAS data application was considered with human height as the phenotype. An R package implementing the GWASinlps method is available at https://cran.r-project.org/web/packages/GWASinlps/index.html. Supplementary data are available at Bioinformatics online.
2013-01-01
Background: Gene expression data can be of great help in developing efficient cancer diagnosis and classification platforms. Recently, many researchers have analyzed gene expression data using diverse computational intelligence methods to select a small subset of informative genes for cancer classification. Many computational methods face difficulties in selecting small subsets because of the small number of samples compared to the huge number of genes (high dimension), irrelevant genes, and noisy genes. Methods: We propose an enhanced binary particle swarm optimization to select small subsets of informative genes, which is significant for cancer classification. Particle speed, a rule, and a modified sigmoid function are introduced in the proposed method to increase the probability that the bits in a particle's position are zero. The method was empirically applied to a suite of ten well-known benchmark gene expression data sets. Results: The performance of the proposed method proved superior to previous related works, including the conventional version of binary particle swarm optimization (BPSO), in terms of classification accuracy and the number of selected genes. The proposed method also requires lower computational time compared to BPSO. PMID:23617960
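The bit-update idea, biasing bits toward zero so that fewer genes are selected, can be sketched as follows. The `bias` threshold here is a hypothetical stand-in for the paper's modified sigmoid rule, shown only to illustrate the mechanism:

```python
import math
import random

def sigmoid(v):
    """Standard BPSO transfer function mapping velocity to [0, 1]."""
    return 1.0 / (1.0 + math.exp(-v))

def update_bit(velocity, bias=0.6):
    """One binary-PSO bit update, biased toward zero.

    In standard BPSO the bit is set to 1 when sigmoid(velocity) exceeds
    a uniform random draw; requiring it to also exceed `bias` raises the
    probability of zeros, i.e. fewer selected genes. The `bias`
    threshold is an illustrative device, not the published rule.
    """
    return 1 if sigmoid(velocity) > max(random.random(), bias) else 0
```

Applying such an update to every bit of every particle's position shrinks the candidate gene subsets over iterations, which is the effect the abstract attributes to its modified sigmoid.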
Kavakiotis, Ioannis; Samaras, Patroklos; Triantafyllidis, Alexandros; Vlahavas, Ioannis
2017-11-01
Single nucleotide polymorphisms (SNPs) are nowadays becoming the marker of choice for biological analyses, with a wide range of applications of great medical, biological, economic and environmental interest. Classification tasks, i.e. the assignment of individuals to groups of origin based on their (multi-locus) genotypes, are performed in many fields, such as forensic investigations and discrimination between wild and/or farmed populations. These tasks should be performed with a small number of loci, for computational as well as biological reasons. Thus, feature selection should precede classification, especially for SNP datasets, where the number of features can amount to hundreds of thousands or millions. In this paper, we present a novel data mining approach, called FIFS (Frequent Item Feature Selection), based on the use of frequent items for selection of the most informative markers from population genomic data. It is a modular method consisting of two main components. The first identifies the most frequent and unique genotypes for each sampled population. The second selects the most appropriate among them, in order to create the informative SNP subsets to be returned. The proposed method (FIFS) was tested on a real dataset comprising a comprehensive coverage of pig breed types present in Britain. This dataset consisted of 446 individuals divided into 14 sub-populations, genotyped at 59,436 SNPs. Our method outperforms the state-of-the-art and baseline methods in every case. More specifically, our method surpassed the assignment accuracy threshold of 95% needing only half the number of SNPs selected by other methods (FIFS: 28 SNPs, Delta: 70 SNPs, Pairwise FST: 70 SNPs, In: 100 SNPs). CONCLUSION: Our approach successfully deals with the problem of informative marker selection in high-dimensional genomic datasets.
It offers better results compared to existing approaches and can aid biologists in selecting the most informative markers with maximum discrimination power for optimization of cost-effective panels with applications related to e.g. species identification, wildlife management, and forensics. Copyright © 2017 Elsevier Ltd. All rights reserved.
Protein and Antibody Engineering by Phage Display
Frei, J.C.; Lai, J.R.
2017-01-01
Phage display is an in vitro selection technique that allows for the rapid isolation of proteins with desired properties including increased affinity, specificity, stability, and new enzymatic activity. The power of phage display relies on the phenotype-to-genotype linkage of the protein of interest displayed on the phage surface with the encoding DNA packaged within the phage particle, which allows for selective enrichment of library pools and high-throughput screening of resulting clones. As an in vitro method, the conditions of the binding selection can be tightly controlled. Due to the high-throughput nature, rapidity, and ease of use, phage display is an excellent technological platform for engineering antibody or proteins with enhanced properties. Here, we describe methods for synthesis, selection, and screening of phage libraries with particular emphasis on designing humanizing antibody libraries and combinatorial scanning mutagenesis libraries. We conclude with a brief section on troubleshooting for all stages of the phage display process. PMID:27586328
Assembly of ordered contigs of cosmids selected with YACs of human chromosome 13
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fischer, S.G.; Cayanis, E.; Boukhgalter, B.
1994-06-01
The authors have developed an efficient method for assembling ordered cosmid contigs aligned to mega-YACs and midi-YACs (average insert sizes of 1.0 and 0.35 Mb, respectively) and used this general method to initiate high-resolution physical mapping of human chromosome 13 (Chr 13). Chr 13-enriched midi-YAC (mYAC) and mega-YAC (MYAC) sublibraries were obtained from corresponding CEPH total human YAC libraries by selecting colonies with inter-Alu PCR probes derived from Chr 13 monochromosomal cell hybrid DNA. These sublibraries were arrayed on filters at high density. In this approach, the MYAC 13 sublibrary is screened by hybridization with cytogenetically assigned Chr 13 DNA probes to select one or a small subset of MYACs. Inter-Alu PCR products from each mYAC are then hybridized to the MYAC and mYAC sublibraries to identify overlapping YACs and to an arrayed Chr 13-specific cosmid library to select corresponding cosmids. The set of selected cosmids, gridded on filters at high density, is hybridized with inter-Alu PCR products from each of the overlapping YACs to identify subsets of cosmids and also with riboprobes from each cosmid of the arrayed set ("cosmid matrix cross-hybridization"). From these data, cosmid contigs are assembled by a specifically designed computer program. Application of this method generates cosmid contigs spanning the length of a MYAC with few gaps. To provide a high-resolution map, ends of cosmids are sequenced at preselected sites to position densely spaced sequence-tagged sites. 33 refs., 7 figs., 1 tab.
Prothmann, Jens; Sun, Mingzhe; Spégel, Peter; Sandahl, Margareta; Turner, Charlotta
2017-12-01
The conversion of lignin to potentially high-value low-molecular-weight compounds often results in complex mixtures of monomeric and oligomeric compounds. In this study, a method for the quantitative and qualitative analysis of 40 lignin-derived compounds using ultra-high-performance supercritical fluid chromatography coupled to quadrupole time-of-flight mass spectrometry (UHPSFC/QTOF-MS) has been developed. Seven different columns were explored for maximum selectivity. Makeup solvent composition and ion source settings were optimised using a D-optimal design of experiments (DoE). Differently processed lignin samples were analysed and used for the method validation. The new UHPSFC/QTOF-MS method showed good separation of the 40 compounds within only a 6-min retention time, and of these, 36 showed high ionisation efficiency in negative electrospray ionisation mode.
NASA Astrophysics Data System (ADS)
Yusop, Hanafi M.; Ghazali, M. F.; Yusof, M. F. M.; Remli, M. A. Pi; Kamarulzaman, M. H.
2017-10-01
In recent studies, the analysis of pressure transient signals has been seen as an accurate and low-cost method for leak and feature detection in water distribution systems. Transient phenomena occur due to sudden changes in a fluid's propagation in pipeline systems, caused by rapid pressure and flow fluctuations from events such as rapidly closing or opening valves or pump failure. In this paper, the feasibility of the Hilbert-Huang transform (HHT) in analysing pressure transient signals is presented and discussed. HHT decomposes a signal into intrinsic mode functions (IMFs). However, a drawback of HHT is the difficulty of selecting the IMF suitable for the next post-processing step, the Hilbert transform (HT). This paper shows that applying an integrated kurtosis-based algorithm for a z-filter technique (I-Kaz) to the kurtosis ratio (I-Kaz-kurtosis) enables automatic selection of the IMF that should be used. The technique is demonstrated on a 57.90-meter medium-density polyethylene (MDPE) pipe installed with a single artificial leak. The analysis results using the I-Kaz-kurtosis ratio confirmed that the method can be used for automatic selection of the IMF even when the noise level ratio of the signal is low. Therefore, the I-Kaz-kurtosis ratio method is recommended as a means to implement an automatic IMF selection technique for HHT analysis.
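The automatic-selection idea can be sketched with a plain kurtosis ratio. The full I-Kaz coefficient is more involved, so the ratio of each IMF's kurtosis to the raw signal's kurtosis is used here as a simplified stand-in, assuming the IMFs have already been obtained from an empirical mode decomposition:

```python
import numpy as np
from scipy.stats import kurtosis

def select_imf(imfs, signal):
    """Return the index of the IMF with the largest kurtosis ratio.

    The ratio of each IMF's kurtosis to that of the raw signal serves
    as a simplified proxy for the I-Kaz-kurtosis criterion: impulsive
    (leak-like) content inflates kurtosis, so the IMF that concentrates
    that content wins.
    """
    k_signal = kurtosis(signal, fisher=False)  # Pearson kurtosis (normal = 3)
    ratios = [kurtosis(imf, fisher=False) / k_signal for imf in imfs]
    return int(np.argmax(ratios))
```

The selected IMF would then be passed to the Hilbert transform (e.g. `scipy.signal.hilbert`) for envelope and instantaneous-frequency analysis.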
Wolc, Anna; Stricker, Chris; Arango, Jesus; Settar, Petek; Fulton, Janet E; O'Sullivan, Neil P; Preisinger, Rudolf; Habier, David; Fernando, Rohan; Garrick, Dorian J; Lamont, Susan J; Dekkers, Jack C M
2011-01-21
Genomic selection involves breeding value estimation of selection candidates based on high-density SNP genotypes. To quantify the potential benefit of genomic selection, accuracies of estimated breeding values (EBV) obtained with different methods using pedigree or high-density SNP genotypes were evaluated and compared in a commercial layer chicken breeding line. The following traits were analyzed: egg production, egg weight, egg color, shell strength, age at sexual maturity, body weight, albumen height, and yolk weight. Predictions appropriate for early or late selection were compared. A total of 2,708 birds were genotyped for 23,356 segregating SNP, including 1,563 females with records. Phenotypes on relatives without genotypes were incorporated in the analysis (in total 13,049 production records). The data were analyzed with a Reduced Animal Model using a relationship matrix based on pedigree data or on marker genotypes and with a Bayesian method using model averaging. Using a validation set that consisted of individuals from the generation following training, these methods were compared by correlating EBV with phenotypes corrected for fixed effects, selecting the top 30 individuals based on EBV and evaluating their mean phenotype, and by regressing phenotypes on EBV. Using high-density SNP genotypes increased accuracies of EBV up to two-fold for selection at an early age and by up to 88% for selection at a later age. Accuracy increases at an early age can be mostly attributed to improved estimates of parental EBV for shell quality and egg production, while for other egg quality traits it is mostly due to improved estimates of Mendelian sampling effects. A relatively small number of markers was sufficient to explain most of the genetic variation for egg weight and body weight.
Lin, Wei; Feng, Rui; Li, Hongzhe
2014-01-01
In genetical genomics studies, it is important to jointly analyze gene expression data and genetic variants in exploring their associations with complex traits, where the dimensionality of gene expressions and genetic variants can both be much larger than the sample size. Motivated by such modern applications, we consider the problem of variable selection and estimation in high-dimensional sparse instrumental variables models. To overcome the difficulty of high dimensionality and unknown optimal instruments, we propose a two-stage regularization framework for identifying and estimating important covariate effects while selecting and estimating optimal instruments. The methodology extends the classical two-stage least squares estimator to high dimensions by exploiting sparsity using sparsity-inducing penalty functions in both stages. The resulting procedure is efficiently implemented by coordinate descent optimization. For the representative L1 regularization and a class of concave regularization methods, we establish estimation, prediction, and model selection properties of the two-stage regularized estimators in the high-dimensional setting where the dimensionalities of covariates and instruments are both allowed to grow exponentially with the sample size. The practical performance of the proposed method is evaluated by simulation studies and its usefulness is illustrated by an analysis of mouse obesity data. Supplementary materials for this article are available online. PMID:26392642
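A minimal sketch of the two-stage idea with L1 penalties in both stages, using scikit-learn's `Lasso`. The penalty levels are illustrative, and this is only the L1 special case of the paper's framework, which also covers concave penalties:

```python
import numpy as np
from sklearn.linear_model import Lasso

def two_stage_lasso(Z, X, y, alpha1=0.05, alpha2=0.05):
    """Two-stage regularized instrumental-variables sketch.

    Stage 1: a lasso of each covariate on the instruments Z keeps the
             fitted (instrument-explained) part of X, selecting useful
             instruments implicitly via the L1 penalty.
    Stage 2: a lasso of y on the fitted covariates selects and
             estimates the important covariate effects.
    """
    X_hat = np.column_stack([
        Lasso(alpha=alpha1).fit(Z, X[:, j]).predict(Z)
        for j in range(X.shape[1])
    ])
    return Lasso(alpha=alpha2).fit(X_hat, y).coef_
```

Replacing `X` with its instrument-explained part in stage 2 is what removes the endogeneity bias, exactly as in classical two-stage least squares.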
Faint blue objects at high Galactic latitude. V - Palomar Schmidt field centered on selected area 71
NASA Technical Reports Server (NTRS)
Usher, Peter D.; Mitchell, Kenneth J.; Warnock, Archibald, III
1988-01-01
Starlike objects with both blue and ultraviolet excess have been selected from a Palomar 1.2 m Schmidt field centered on Kapteyn selected area 71. The method of selection is that used in the previous papers of this series, but modified to account for the differential reddening that occurs across the field. The color classes, color subclasses, positions, and magnitudes of the selected objects are listed.
Conformal coating of highly structured surfaces
Ginley, David S.; Perkins, John; Berry, Joseph; Gennett, Thomas
2012-12-11
Method of applying a conformal coating to a highly structured substrate and devices made by the disclosed methods are disclosed. An example method includes the deposition of a substantially contiguous layer of a material upon a highly structured surface within a deposition process chamber. The highly structured surface may be associated with a substrate or another layer deposited on a substrate. The method includes depositing a material having an amorphous structure on the highly structured surface at a deposition pressure of equal to or less than about 3 mTorr. The method may also include removing a portion of the amorphous material deposited on selected surfaces and depositing additional amorphous material on the highly structured surface.
Analysis of Environmental Contamination resulting from ...
Catastrophic incidents can generate a large number of samples with analytically diverse types including forensic, clinical, environmental, food, and others. Environmental samples include water, wastewater, soil, air, urban building and infrastructure materials, and surface residue. Such samples may arise not only from contamination from the incident but also from the multitude of activities surrounding the response to the incident, including decontamination. This document summarizes a range of activities to help build laboratory capability in preparation for analysis following a catastrophic incident, including selection and development of fit-for-purpose analytical methods for chemical, biological, and radiological contaminants. Fit-for-purpose methods are those which have been selected to meet project specific data quality objectives. For example, methods could be fit for screening contamination in the early phases of investigation of contamination incidents because they are rapid and easily implemented, but those same methods may not be fit for the purpose of remediating the environment to safe levels when a more sensitive method is required. While the exact data quality objectives defining fitness-for-purpose can vary with each incident, a governing principle of the method selection and development process for environmental remediation and recovery is based on achieving high throughput while maintaining high quality analytical results. This paper illu
Chen, Minyong; Shi, Xiaofeng; Duke, Rebecca M.; Ruse, Cristian I.; Dai, Nan; Taron, Christopher H.; Samuelson, James C.
2017-01-01
A method for selective and comprehensive enrichment of N-linked glycopeptides was developed to facilitate detection of micro-heterogeneity of N-glycosylation. The method takes advantage of the inherent properties of Fbs1, which functions within the ubiquitin-mediated degradation system to recognize the common core pentasaccharide motif (Man3GlcNAc2) of N-linked glycoproteins. We show that Fbs1 is able to bind diverse types of N-linked glycomolecules; however, wild-type Fbs1 preferentially binds high-mannose-containing glycans. We identified Fbs1 variants through mutagenesis and plasmid display selection, which possess higher affinity and improved recovery of complex N-glycomolecules. In particular, we demonstrate that the Fbs1 GYR variant may be employed for substantially unbiased enrichment of N-linked glycopeptides from human serum. Most importantly, this highly efficient N-glycopeptide enrichment method enables the simultaneous determination of N-glycan composition and N-glycosites with a deeper coverage (compared to lectin enrichment) and improves large-scale N-glycoproteomics studies due to greatly reduced sample complexity. PMID:28534482
NASA Astrophysics Data System (ADS)
Chen, Jie; Brissette, François P.; Lucas-Picher, Philippe
2016-11-01
Given the ever increasing number of climate change simulations being carried out, it has become impractical to use all of them to cover the uncertainty of climate change impacts. Various methods have been proposed to optimally select subsets of a large ensemble of climate simulations for impact studies. However, the behaviour of optimally-selected subsets of climate simulations for climate change impacts is unknown, since the transfer process from climate projections to the impact study world is usually highly non-linear. Consequently, this study investigates the transferability of optimally-selected subsets of climate simulations in the case of hydrological impacts. Two different methods were used for the optimal selection of subsets of climate scenarios, and both were found to be capable of adequately representing the spread of selected climate model variables contained in the original large ensemble. However, in both cases, the optimal subsets had limited transferability to hydrological impacts. To capture a similar variability in the impact model world, many more simulations have to be used than those that are needed to simply cover variability from the climate model variables' perspective. Overall, both optimal subset selection methods were better than random selection when small subsets were selected from a large ensemble for impact studies. However, as the number of selected simulations increased, random selection often performed better than the two optimal methods. To ensure adequate uncertainty coverage, the results of this study imply that selecting as many climate change simulations as possible is the best avenue. Where this was not possible, the two optimal methods were found to perform adequately.
A New Integrated Threshold Selection Methodology for Spatial Forecast Verification of Extreme Events
NASA Astrophysics Data System (ADS)
Kholodovsky, V.
2017-12-01
Extreme weather and climate events such as heavy precipitation, heat waves and strong winds can cause extensive damage to society in terms of human lives and financial losses. As the climate changes, it is important to understand how extreme weather events may change as a result. Climate and statistical models are often independently used to model those phenomena. To better assess the performance of the climate models, a variety of spatial forecast verification methods have been developed. However, spatial verification metrics that are widely used in comparing mean states, in most cases, do not have an adequate theoretical justification to benchmark extreme weather events. We propose a new integrated threshold selection methodology for spatial forecast verification of extreme events that couples existing pattern recognition indices with high threshold choices. This integrated approach has three main steps: 1) dimension reduction; 2) geometric domain mapping; and 3) threshold clustering. We apply this approach to an observed precipitation dataset over CONUS. The results are evaluated by displaying the threshold distribution seasonally, monthly and annually. The method offers the user the flexibility of selecting a high threshold that is linked to desired geometrical properties. The proposed high-threshold methodology could either complement existing spatial verification methods, where threshold selection is arbitrary, or be directly applicable in extreme value theory.
Adaptive Modeling Procedure Selection by Data Perturbation.
Zhang, Yongli; Shen, Xiaotong
2015-10-01
Many procedures have been developed to deal with the high-dimensional problem that is emerging in various business and economics areas. To evaluate and compare these procedures, modeling uncertainty caused by model selection and parameter estimation has to be assessed and integrated into a modeling process. To do this, a data perturbation method estimates the modeling uncertainty inherited in a selection process by perturbing the data. Critical to data perturbation is the size of perturbation, as the perturbed data should resemble the original dataset. To account for the modeling uncertainty, we derive the optimal size of perturbation, which adapts to the data, the model space, and other relevant factors in the context of linear regression. On this basis, we develop an adaptive data-perturbation method that, unlike its nonadaptive counterpart, performs well in different situations. This leads to a data-adaptive model selection method. Both theoretical and numerical analysis suggest that the data-adaptive model selection method adapts to distinct situations in that it yields consistent model selection and optimal prediction, without knowing which situation exists a priori. The proposed method is applied to real data from the commodity market and outperforms its competitors in terms of price forecasting accuracy.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Zhixin, E-mail: czx@fzu.edu.cn; Instrumental Measurement and Analysis Center, Fuzhou University, Fuzhou 350002; Xu, Jingjing
Hexagonal ZnIn₂S₄ samples have been synthesized by a solvothermal method. Their properties have been determined by X-ray diffraction, ultraviolet-visible diffuse reflectance spectra, field emission scanning electron microscopy, nitrogen adsorption-desorption and X-ray photoelectron spectra. These results demonstrate that the ethanol solvent has a significant influence on the morphology and the optical and electronic nature of the marigold-like ZnIn₂S₄ microspheres. The visible-light photocatalytic activities of the ZnIn₂S₄ have been evaluated by the selective oxidation of benzyl alcohol to benzaldehyde using molecular oxygen as the oxidant. The results show that 100% conversion along with >99% selectivity is reached over ZnIn₂S₄ prepared in ethanol solvent under visible-light irradiation (λ > 420 nm) for 2 h, but only 58% conversion and 57% yield are reached over ZnIn₂S₄ prepared in aqueous solvent. A possible mechanism of the high photocatalytic activity for selective oxidation of benzyl alcohol over ZnIn₂S₄ is proposed and discussed. - Graphical abstract: Marigold-like ZnIn₂S₄ microspheres were synthesized by a solvothermal method. The high visible-light photocatalytic activities of ZnIn₂S₄ were evaluated by selective oxidation of benzyl alcohol to benzaldehyde under mild conditions. - Highlights: • Marigold-like ZnIn₂S₄ microspheres were synthesized by a solvothermal method. • The solvent has a remarkable influence on the morphology and properties of the samples. • ZnIn₂S₄ is applied for the first time to the selective oxidation of benzyl alcohol. • ZnIn₂S₄ shows high photocatalytic activity for the selective oxidation of benzyl alcohol.
Automatic selection of arterial input function using tri-exponential models
NASA Astrophysics Data System (ADS)
Yao, Jianhua; Chen, Jeremy; Castro, Marcelo; Thomasson, David
2009-02-01
Dynamic Contrast Enhanced MRI (DCE-MRI) is one method for drug and tumor assessment. Selecting a consistent arterial input function (AIF) is necessary to calculate tissue and tumor pharmacokinetic parameters in DCE-MRI. This paper presents an automatic and robust method to select the AIF. The first stage is artery detection and segmentation, where knowledge about artery structure and the dynamic signal intensity temporal properties of DCE-MRI is employed. The second stage is AIF model fitting and selection. A tri-exponential model is fitted for every candidate AIF using the Levenberg-Marquardt method, and the best-fitted AIF is selected. Our method has been applied to DCE-MRIs of four different body parts: breast, brain, liver and prostate. The success rate in artery segmentation over 19 cases was 89.6% ± 15.9%. The pharmacokinetic parameters computed from the automatically selected AIFs are highly correlated with those from manually determined AIFs (R2=0.946, P(T<=t)=0.09). Our imaging-based tri-exponential AIF model demonstrated significant improvement over a previously proposed bi-exponential model.
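The model-fitting stage can be sketched with SciPy, whose `curve_fit` defaults to the Levenberg-Marquardt algorithm for unconstrained problems, matching the method named in the abstract. The starting guess `p0` and the residual-based ranking of candidates are illustrative assumptions, not the authors' implementation:

```python
import numpy as np
from scipy.optimize import curve_fit

def tri_exp(t, a1, m1, a2, m2, a3, m3):
    """Tri-exponential model of the arterial input function."""
    return a1 * np.exp(-m1 * t) + a2 * np.exp(-m2 * t) + a3 * np.exp(-m3 * t)

def fit_aif(t, conc, p0=(1.0, 1.0, 1.0, 0.1, 1.0, 0.01)):
    """Fit one candidate AIF; return parameters and sum of squared residuals."""
    popt, _ = curve_fit(tri_exp, t, conc, p0=p0, maxfev=20000)
    residual = float(np.sum((tri_exp(t, *popt) - conc) ** 2))
    return popt, residual

def select_best_aif(t, candidates, p0=(1.0, 1.0, 1.0, 0.1, 1.0, 0.01)):
    """Among candidate AIF curves, keep the one the model fits best."""
    fits = [fit_aif(t, c, p0) for c in candidates]
    best = min(range(len(fits)), key=lambda i: fits[i][1])
    return best, fits[best][0]
```

Tri-exponential fits are ill-conditioned, so in practice the starting guess matters; a candidate whose best fit still leaves a large residual is discarded in favor of a cleaner curve.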
Thermophotovoltaic energy conversion using photonic bandgap selective emitters
Gee, James M.; Lin, Shawn-Yu; Fleming, James G.; Moreno, James B.
2003-06-24
A method for thermophotovoltaic generation of electricity comprises heating a metallic photonic crystal to provide selective emission of radiation that is matched to the peak spectral response of a photovoltaic cell that converts the radiation to electricity. The use of a refractory metal, such as tungsten, for the photonic crystal enables high temperature operation for high radiant flux and high dielectric contrast for a full 3D photonic bandgap, preferable for efficient thermophotovoltaic energy conversion.
USDA-ARS's Scientific Manuscript database
A high-resolution GC/MS with Selected Ion Monitoring (SIM) method focusing on the characterization and quantitative analysis of ginkgolic acids (GAs) in Ginkgo biloba L. plant materials, extracts and commercial products was developed and validated. The method involved sample extraction with (1:1) meth...
News video story segmentation method using fusion of audio-visual features
NASA Astrophysics Data System (ADS)
Wen, Jun; Wu, Ling-da; Zeng, Pu; Luan, Xi-dao; Xie, Yu-xiang
2007-11-01
News story segmentation is an important aspect of news video analysis. This paper presents a method for news video story segmentation. Unlike prior works, which are based on visual feature transforms, the proposed technique uses audio features as a baseline and fuses visual features with them to refine the results. First, it selects silence clips as audio candidate points, and selects shot boundaries and anchor shots as two kinds of visual candidate points. Then it uses the audio candidates as cues and develops a fusion method that effectively uses the diverse types of visual candidates to refine the audio candidates and obtain story boundaries. Experimental results show that this method has high efficiency and adaptability to different kinds of news video.
ERIC Educational Resources Information Center
Mokher, Christine; Cavalluzzo, Linda
2011-01-01
This presentation focuses on the quasi-experimental methods used to select comparison schools for an evaluation of a federal Investing in Innovation (i3) validation grant. The Northeast Tennessee College and Career Ready Consortium (NETCO) consists of 29 high schools participating in a five-year program to expand students' access to rigorous…
Mendes, M P; Ramalho, M A P; Abreu, A F B
2012-04-10
The objective of this study was to compare the BLUP selection method with different selection strategies in the F(2:4) generation and to assess the efficiency of this method for the early choice of the best common bean (Phaseolus vulgaris) lines. Fifty-one F(2:4) progenies were produced from a cross between the CVIII8511 x RP-26 lines. A randomized block design was used with 20 replications and one-plant field plots. Character data on plant architecture and grain yield were obtained, and the sum of the standardized variables was then estimated for simultaneous selection of both traits. Analysis was carried out by mixed models (BLUP) and by the least squares method to compare different selection strategies, namely mass selection, stratified mass selection, and between- and within-progeny selection. The progenies selected by BLUP were assessed in advanced generations, always selecting the greatest and smallest sums of the standardized variables. Analyses by the least squares method and the BLUP procedure ranked the progenies in the same way. The coincidence of the individuals identified by BLUP and by between- and within-progeny selection was high, and was of greatest magnitude when BLUP was compared with mass selection. Although BLUP is the best estimator of genotypic value, its efficiency in the response to long-term selection is not different from that of the other methods, because it is also unable to predict the future effect of the progeny × environment interaction. It was inferred that selection success will always depend on the most accurate possible progeny assessment and on using alternatives to reduce the progeny × environment interaction effect.
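The simultaneous selection criterion mentioned above, the sum of the standardized variables for the two traits, can be sketched in a few lines. The trait values below are invented for illustration; only the standardize-and-sum idea comes from the abstract.

```python
# Hedged sketch: rank progenies on the sum of z-scores across traits.
import numpy as np

def standardized_sum(traits):
    """traits: (n_progenies, n_traits) array -> selection index per progeny."""
    z = (traits - traits.mean(axis=0)) / traits.std(axis=0, ddof=1)
    return z.sum(axis=1)

# Four hypothetical progenies scored for plant architecture and grain yield.
traits = np.array([[7.0, 2400.0],
                   [5.0, 2100.0],
                   [8.0, 2600.0],
                   [6.0, 2300.0]])
index = standardized_sum(traits)
best = int(np.argmax(index))   # progeny with the greatest standardized sum
print(best)
```

Standardizing first prevents the trait with the larger scale (yield, here) from dominating the index.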
Robust gene selection methods using weighting schemes for microarray data analysis.
Kang, Suyeon; Song, Jongwoo
2017-09-02
A common task in microarray data analysis is to identify informative genes that are differentially expressed between two different states. Owing to the high-dimensional nature of microarray data, identification of significant genes has been essential in analyzing the data. However, the performances of many gene selection techniques are highly dependent on the experimental conditions, such as the presence of measurement error or a limited number of sample replicates. We have proposed new filter-based gene selection techniques, by applying a simple modification to significance analysis of microarrays (SAM). To prove the effectiveness of the proposed method, we considered a series of synthetic datasets with different noise levels and sample sizes along with two real datasets. The following findings were made. First, our proposed methods outperform conventional methods for all simulation set-ups. In particular, our methods are much better when the given data are noisy and sample size is small. They showed relatively robust performance regardless of noise level and sample size, whereas the performance of SAM became significantly worse as the noise level became high or sample size decreased. When sufficient sample replicates were available, SAM and our methods showed similar performance. Finally, our proposed methods are competitive with traditional methods in classification tasks for microarrays. The results of simulation study and real data analysis have demonstrated that our proposed methods are effective for detecting significant genes and classification tasks, especially when the given data are noisy or have few sample replicates. By employing weighting schemes, we can obtain robust and reliable results for microarray data analysis.
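For context, the baseline the paper modifies is SAM's relative-difference statistic, in which a small positive constant stabilizes genes with tiny variance. The sketch below shows that baseline statistic only; the paper's specific weighting schemes are not reproduced, and the data are synthetic.

```python
# Hedged sketch of a SAM-style statistic for two-class gene ranking.
import numpy as np

def sam_statistic(x, y, s0=0.1):
    """x, y: (n_genes, n_samples) expression matrices for the two states."""
    nx, ny = x.shape[1], y.shape[1]
    diff = x.mean(axis=1) - y.mean(axis=1)
    pooled = ((x.var(axis=1, ddof=1) * (nx - 1) + y.var(axis=1, ddof=1) * (ny - 1))
              / (nx + ny - 2))
    s = np.sqrt(pooled * (1.0 / nx + 1.0 / ny))
    return diff / (s + s0)        # s0 damps genes with near-zero variance

rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, (100, 5))
y = rng.normal(0.0, 1.0, (100, 5))
x[:10] += 3.0                      # first 10 genes truly differential
d = sam_statistic(x, y)
top10 = set(np.argsort(-np.abs(d))[:10])
print(len(top10 & set(range(10))))
```

With only 5 replicates per state, the `s0` fudge factor is what keeps low-variance noise genes from dominating the ranking, which is the regime the paper's weighting schemes target.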
Ates, Hatice Ceren; Ozgur, Ebru; Kulah, Haluk
2018-03-23
Methods for isolation and quantification of circulating tumor cells (CTCs) are attracting more attention every day, as the data for their unprecedented clinical utility continue to grow. However, the challenge is that CTCs are extremely rare (as low as 1 in a billion of blood cells) and a highly sensitive and specific technology is required to isolate CTCs from blood cells. Methods utilizing microfluidic systems for immunoaffinity-based CTC capture are preferred, especially when purity is the prime requirement. However, antibody immobilization strategy significantly affects the efficiency of such systems. In this study, two covalent and two bioaffinity antibody immobilization methods were assessed with respect to their CTC capture efficiency and selectivity, using an anti-epithelial cell adhesion molecule (EpCAM) as the capture antibody. Surface functionalization was realized on plain SiO 2 surfaces, as well as in microfluidic channels. Surfaces functionalized with different antibody immobilization methods are physically and chemically characterized at each step of functionalization. MCF-7 breast cancer and CCRF-CEM acute lymphoblastic leukemia cell lines were used as EpCAM positive and negative cell models, respectively, to assess CTC capture efficiency and selectivity. Comparisons reveal that bioaffinity based antibody immobilization involving streptavidin attachment with glutaraldehyde linker gave the highest cell capture efficiency. On the other hand, a covalent antibody immobilization method involving direct antibody binding by N-(3-dimethylaminopropyl)-N'-ethylcarbodiimide hydrochloride (EDC)-N-hydroxysuccinimide (NHS) reaction was found to be more time and cost efficient with a similar cell capture efficiency. All methods provided very high selectivity for CTCs with EpCAM expression. It was also demonstrated that antibody immobilization via EDC-NHS reaction in a microfluidic channel leads to high capture efficiency and selectivity.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Honorio, J.; Goldstein, R.
We propose a simple, well-grounded classification technique suited for group classification on brain fMRI data sets that have high dimensionality, a small number of subjects, high noise level, high subject variability, imperfect registration, and subtle cognitive effects. We propose threshold-split region as a new feature selection method and majority vote as the classification technique. Our method does not require a predefined set of regions of interest. We use averages across sessions, only one feature per experimental condition, a feature independence assumption, and simple classifiers. The seemingly counter-intuitive approach of using a simple design is supported by signal processing and statistical theory. Experimental results in two block design data sets that capture brain function under distinct monetary rewards for cocaine addicted and control subjects show that our method exhibits increased generalization accuracy compared to commonly used feature selection and classification techniques.
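The majority-vote classification step mentioned above is generic and can be sketched in a few lines; the labels below are invented for illustration and this is not the authors' code.

```python
# Hedged sketch: combine weak per-feature classifiers by majority vote.
from collections import Counter

def majority_vote(predictions):
    """predictions: list of labels emitted by individual weak classifiers."""
    return Counter(predictions).most_common(1)[0][0]

votes = ["control", "cocaine", "control", "control", "cocaine"]
print(majority_vote(votes))
```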
Estimate of within population incremental selection through branch imbalance in lineage trees
Liberman, Gilad; Benichou, Jennifer I.C.; Maman, Yaakov; Glanville, Jacob; Alter, Idan; Louzoun, Yoram
2016-01-01
Incremental selection within a population, defined as limited fitness changes following mutation, is an important aspect of many evolutionary processes. Strongly advantageous or deleterious mutations are detected using the synonymous to non-synonymous mutation ratio. However, there are currently no precise methods to estimate incremental selection. Here we provide such a detailed method for the first time and show its precision in multiple cases of micro-evolution. The proposed method is a novel mixed lineage tree/sequence based method to detect within-population selection as defined by the effect of mutations on the average number of offspring. Specifically, we propose to measure the log of the ratio between the number of leaves in lineage tree branches following synonymous and non-synonymous mutations. The method requires a sufficiently large number of sequences and of independent mutations, and it assumes that all mutations are independent events. It does not require a baseline model and is practically unaffected by sampling biases. We show the method's wide applicability by testing it on multiple cases of micro-evolution. We show that it can detect genes and inter-genic regions using the selection rate and detect selection pressures in viral proteins and in the immune response to pathogens. PMID:26586802
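The proposed measure, the log ratio of leaf counts below non-synonymous versus synonymous mutations, can be illustrated on a toy lineage tree. The nested-dict tree encoding and the "NS"/"S" labels are assumptions made for this sketch.

```python
# Hedged toy sketch of the log leaf-count ratio from the abstract.
import math

def count_leaves(node):
    children = node.get("children", [])
    return 1 if not children else sum(count_leaves(c) for c in children)

def incremental_selection_score(mutated_subtrees):
    """mutated_subtrees: list of (mutation_type, subtree) pairs."""
    ns = sum(count_leaves(t) for m, t in mutated_subtrees if m == "NS")
    s = sum(count_leaves(t) for m, t in mutated_subtrees if m == "S")
    return math.log(ns / s)

# Non-synonymous mutations lead to 5 leaves in total, synonymous to 5 as well,
# so the score is log(1) = 0: no evidence of incremental selection.
leaf = {}
mutated = [
    ("NS", {"children": [leaf, leaf, leaf]}),
    ("NS", {"children": [leaf, leaf]}),
    ("S",  {"children": [leaf, leaf, leaf, leaf, leaf]}),
]
print(incremental_selection_score(mutated))
```

A positive score (more leaves below non-synonymous mutations) would suggest those mutations increase the average number of offspring.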
Detection of selective sweeps in structured populations: a comparison of recent methods.
Vatsiou, Alexandra I; Bazin, Eric; Gaggiotti, Oscar E
2016-01-01
Identifying genomic regions targeted by positive selection has been a long-standing interest of evolutionary biologists. This objective was difficult to achieve until the recent emergence of next-generation sequencing, which is fostering the development of large-scale catalogues of genetic variation for an increasing number of species. Several statistical methods have been recently developed to analyse these rich data sets, but there is still a poor understanding of the conditions under which these methods produce reliable results. This study aims at filling this gap by assessing the performance of genome-scan methods that consider explicitly the physical linkage among SNPs surrounding a selected variant. Our study compares the performance of seven recent methods for the detection of selective sweeps (iHS, nSL, EHHST, xp-EHH, XP-EHHST, XPCLR and hapFLK). We use an individual-based simulation approach to investigate the power and accuracy of these methods under a wide range of population models with both hard and soft sweeps. Our results indicate that XPCLR and hapFLK perform best and can detect soft sweeps under simple population structure scenarios if the migration rate is low. All methods perform poorly with moderate-to-high migration rates, or with weak selection, and very poorly under a hierarchical population structure. Finally, no single method is able to detect both starting and nearly completed selective sweeps. However, combining several methods (XPCLR or hapFLK with iHS or nSL) can greatly increase the power to pinpoint the selected region. © 2015 John Wiley & Sons Ltd.
A target recognition method for maritime surveillance radars based on hybrid ensemble selection
NASA Astrophysics Data System (ADS)
Fan, Xueman; Hu, Shengliang; He, Jingbo
2017-11-01
In order to improve the generalisation ability of the maritime surveillance radar, a novel ensemble selection technique, termed Optimisation and Dynamic Selection (ODS), is proposed. During the optimisation phase, the non-dominated sorting genetic algorithm II for multi-objective optimisation is used to find the Pareto front, i.e. a set of ensembles of classifiers representing different tradeoffs between the classification error and diversity. During the dynamic selection phase, the meta-learning method is used to predict whether a candidate ensemble is competent enough to classify a query instance based on three different aspects, namely, feature space, decision space and the extent of consensus. The classification performance and time complexity of ODS are compared against nine other ensemble methods using a self-built full polarimetric high resolution range profile data-set. The experimental results clearly show the effectiveness of ODS. In addition, the influence of the selection of diversity measures is studied concurrently.
Plasma spraying method for forming diamond and diamond-like coatings
Holcombe, Cressie E.; Seals, Roland D.; Price, R. Eugene
1997-01-01
A method and composition for the deposition of a thick layer (10) of diamond or diamond-like material. The method includes high temperature processing wherein a selected composition (12) including at least glassy carbon is heated in a direct current plasma arc device to a selected temperature above the softening point, in an inert atmosphere, and is propelled onto a selected substrate (20), where it is quickly quenched. The softened or molten composition (18) crystallizes on the substrate (20) to form a thick deposition layer (10) comprising at least a diamond or diamond-like material. The selected composition (12) includes at least glassy carbon as a primary constituent (14) and may include at least one secondary constituent (16). Preferably, the secondary constituents (16) are selected from the group consisting of diamond powder, boron carbide (B4C) powder and mixtures thereof.
Generative model selection using a scalable and size-independent complex network classifier
NASA Astrophysics Data System (ADS)
Motallebi, Sadegh; Aliakbary, Sadegh; Habibi, Jafar
2013-12-01
Real networks exhibit nontrivial topological features, such as heavy-tailed degree distribution, high clustering, and small-worldness. Researchers have developed several generative models for synthesizing artificial networks that are structurally similar to real networks. An important research problem is to identify the generative model that best fits a target network. In this paper, we investigate this problem and our goal is to select the model that is able to generate graphs similar to a given network instance. By means of generating synthetic networks with seven outstanding generative models, we have utilized machine learning methods to develop a decision tree for model selection. Our proposed method, which is named "Generative Model Selection for Complex Networks," outperforms existing methods with respect to accuracy, scalability, and size-independence.
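The core idea, training a decision tree on topological feature vectors extracted from synthetic networks and then classifying a query network, can be sketched as follows. The two model names, the feature choices, and all numeric values are invented; scikit-learn is assumed to be available.

```python
# Hedged sketch: decision-tree model selection over invented network features.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(42)
# Fake [mean clustering coefficient, degree-distribution exponent] features
# for networks generated by two hypothetical generative models.
scale_free = np.column_stack([rng.normal(0.03, 0.01, 50), rng.normal(3.0, 0.2, 50)])
small_world = np.column_stack([rng.normal(0.40, 0.05, 50), rng.normal(5.5, 0.5, 50)])
X = np.vstack([scale_free, small_world])
y = ["scale-free"] * 50 + ["small-world"] * 50

clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
query = np.array([[0.35, 5.0]])   # high clustering: looks small-world
print(clf.predict(query)[0])
```

In the paper the features would be topology measures of the query network, and the tree's leaf identifies the best-fitting generative model.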
Genetic subdivision and candidate genes under selection in North American grey wolves.
Schweizer, Rena M; vonHoldt, Bridgett M; Harrigan, Ryan; Knowles, James C; Musiani, Marco; Coltman, David; Novembre, John; Wayne, Robert K
2016-01-01
Previous genetic studies of the highly mobile grey wolf (Canis lupus) found population structure that coincides with habitat and phenotype differences. We hypothesized that these ecologically distinct populations (ecotypes) should exhibit signatures of selection in genes related to morphology, coat colour and metabolism. To test these predictions, we quantified population structure related to habitat using a genotyping array to assess variation in 42 036 single-nucleotide polymorphisms (SNPs) in 111 North American grey wolves. Using these SNP data and individual-level measurements of 12 environmental variables, we identified six ecotypes: West Forest, Boreal Forest, Arctic, High Arctic, British Columbia and Atlantic Forest. Next, we explored signals of selection across these wolf ecotypes through the use of three complementary methods to detect selection: FST/haplotype homozygosity bivariate percentiles, bayescan, and environmentally correlated directional selection with bayenv. Across all methods, we found consistent signals of selection on genes related to morphology, coat coloration and metabolism, as predicted, as well as to vision and hearing. In several high-ranking candidate genes, including LEPR, TYR and SLC14A2, we found variation in allele frequencies that follows environmental changes in temperature and precipitation, a result that is consistent with local adaptation rather than genetic drift. Our findings show that local adaptation can occur despite gene flow in a highly mobile species and can be detected through a moderately dense genomic scan. These patterns of local adaptation revealed by SNP genotyping likely reflect high fidelity to natal habitats of dispersing wolves, strong ecological divergence among habitats, and moderate levels of linkage in the wolf genome. © 2015 John Wiley & Sons Ltd.
Chen, LiQin; Wang, Hui; Xu, Zhen; Zhang, QiuYue; Liu, Jia; Shen, Jun; Zhang, WanQi
2018-08-03
In the present study, we developed a simple and high-throughput solid phase extraction (SPE) procedure for selective extraction of catecholamines (CAs) in urine samples. The SPE adsorbents were electrospun composite fibers functionalized with 4-carboxybenzo-18-crown-6 ether modified XAD resin and polystyrene, which were packed into 96-well columns and used for high-throughput selective extraction of CAs in healthy human urine samples. Moreover, the extraction efficiency of packed-fiber SPE (PFSPE) was examined by high performance liquid chromatography coupled with fluorescence detector. The parameters affecting the extraction efficiency and impurity removal efficiency were optimized, and good linearity ranging from 0.5 to 400 ng/mL was obtained with a low limit of detection (LOD, 0.2-0.5 ng/mL) and a good repeatability (2.7%-3.7%, n = 6). The extraction recoveries of three CAs ranged from 70.5% to 119.5%. Furthermore, stable and reliable results obtained by the fluorescence detector were superior to those obtained by the electrochemical detector. Collectively, PFSPE coupled with 96-well columns was a simple, rapid, selective, high-throughput and cost-efficient method, and the proposed method could be applied in clinical chemistry. Copyright © 2018 Elsevier B.V. All rights reserved.
Tan, York Kiat; Allen, John C; Lye, Weng Kit; Conaghan, Philip G; Chew, Li-Ching; Thumboo, Julian
2017-05-01
The aim of the study is to compare the responsiveness of two joint inflammation scoring systems (dichotomous scoring (DS) versus semi-quantitative scoring (SQS)) using novel individualized ultrasound joint selection methods and existing ultrasound joint selection methods. Responsiveness measured by the standardized response means (SRMs) using the DS and the SQS system (for both the novel and existing ultrasound joint selection methods) was derived using the baseline and the 3-month total inflammatory scores from 20 rheumatoid arthritis patients. The relative SRM gain ratios (SRM-Gains) for both scoring system (DS and SQS) comparing the novel to the existing methods were computed. Both scoring systems (DS and SQS) demonstrated substantial SRM-Gains (ranged from 3.31 to 5.67 for the DS system and ranged from 1.82 to 3.26 for the SQS system). The SRMs using the novel methods ranged from 0.94 to 1.36 for the DS system and ranged from 0.89 to 1.11 for the SQS system. The SRMs using the existing methods ranged from 0.24 to 0.32 for the DS system and ranged from 0.34 to 0.49 for the SQS system. The DS system appears to achieve high responsiveness comparable to SQS for the novel individualized ultrasound joint selection methods.
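The standardized response mean used above is simply the mean of the baseline-to-3-month change divided by the standard deviation of that change. The sketch below illustrates the computation on invented inflammatory scores; the sign is negative when scores fall with treatment, and responsiveness is judged by the magnitude.

```python
# Hedged sketch: standardized response mean (SRM) on invented scores.
import numpy as np

def srm(baseline, follow_up):
    """Mean change divided by the standard deviation of the change."""
    change = np.asarray(follow_up, dtype=float) - np.asarray(baseline, dtype=float)
    return change.mean() / change.std(ddof=1)

baseline = [12, 15, 9, 14, 11]   # hypothetical total inflammatory scores
month3 = [7, 9, 6, 8, 7]         # hypothetical scores at 3 months
print(round(srm(baseline, month3), 2))
```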
A novel feature ranking method for prediction of cancer stages using proteomics data
Saghapour, Ehsan; Sehhati, Mohammadreza
2017-01-01
Proteomic analysis of cancer stages has provided new opportunities for the development of novel, highly sensitive diagnostic tools that help early detection of cancer. This paper introduces a new feature ranking approach called FRMT. FRMT is based on the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) method, which selects the most discriminative proteins from proteomics data for cancer staging. In this approach, the outcomes of 10 feature selection techniques were combined by the TOPSIS method to select the final discriminative proteins from seven different proteomic databases of protein expression profiles. In the proposed workflow, the feature selection methods and protein expressions are considered as the criteria and alternatives in TOPSIS, respectively. The proposed method was tested with seven different classifier models in a 10-fold cross-validation procedure repeated 30 times on the seven cancer datasets. The obtained results demonstrate the higher stability and superior classification performance of the method in comparison with other methods, and it is less sensitive to the applied classifier. Moreover, the final selected proteins are informative and have potential for application in real medical practice. PMID:28934234
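TOPSIS itself is a standard multi-criteria ranking procedure, and the aggregation step described above can be sketched compactly: alternatives (proteins) are scored by their relative closeness to an ideal solution across several criteria (the outputs of different feature selection techniques). The scores below are invented, and all criteria are treated as benefit criteria (higher is better); this is not the authors' implementation.

```python
# Hedged sketch: TOPSIS closeness scores for ranking alternatives.
import numpy as np

def topsis(scores, weights=None):
    """scores: (n_alternatives, n_criteria); returns closeness in [0, 1]."""
    m = np.asarray(scores, dtype=float)
    m = m / np.linalg.norm(m, axis=0)          # vector-normalize each criterion
    if weights is not None:
        m = m * np.asarray(weights)
    ideal, anti = m.max(axis=0), m.min(axis=0) # ideal and anti-ideal solutions
    d_pos = np.linalg.norm(m - ideal, axis=1)
    d_neg = np.linalg.norm(m - anti, axis=1)
    return d_neg / (d_pos + d_neg)             # closeness to the ideal

# Three hypothetical proteins scored by three feature selection techniques.
scores = [[0.9, 0.8, 0.7],
          [0.2, 0.3, 0.1],
          [0.6, 0.9, 0.8]]
closeness = topsis(scores)
print(int(np.argmax(closeness)))   # index of the top-ranked protein
```

In FRMT the criteria would be the 10 feature selection techniques and the alternatives the candidate proteins, with the closeness score giving the final ranking.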
NASA Astrophysics Data System (ADS)
Hu, Yan-Yan; Li, Dong-Sheng
2016-01-01
Hyperspectral images (HSI) consist of many closely spaced bands that carry most of the object information. Due to their high dimensionality and volume, it is hard to obtain satisfactory classification performance. To reduce the dimensionality of HSI data in preparation for high classification accuracy, we propose combining a band selection method based on artificial immune systems (AIS) with a hybrid-kernel support vector machine (SVM-HK) algorithm. After comparing different kernels for hyperspectral analysis, the approach mixes a radial basis function kernel (RBF-K) with a sigmoid kernel (Sig-K) and applies the optimized hybrid kernel in SVM classifiers. The SVM-HK algorithm is then used to guide the band selection of an improved version of the AIS, which is composed of clonal selection and elite antibody mutation and includes an evaluation process with an optional index factor (OIF). Classification experiments on a San Diego Naval Base scene acquired by AVIRIS show that the method efficiently removes band redundancy while outperforming the traditional SVM classifier.
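The hybrid-kernel idea above, a weighted mix of an RBF and a sigmoid kernel, can be sketched by passing a custom Gram-matrix callable to scikit-learn's SVC. The mixing weight, kernel parameters and synthetic "spectral" data are illustrative assumptions, not the paper's settings.

```python
# Hedged sketch: SVM with a hybrid RBF + sigmoid kernel via a custom callable.
import numpy as np
from sklearn.svm import SVC

def hybrid_kernel(X, Y, gamma=0.5, alpha=0.01, c0=0.0, w=0.7):
    """Weighted mix of an RBF Gram matrix and a sigmoid (tanh) Gram matrix."""
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    rbf = np.exp(-gamma * sq)
    sig = np.tanh(alpha * (X @ Y.T) + c0)
    return w * rbf + (1.0 - w) * sig

# Two well-separated synthetic "spectral" classes.
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0, 0.3, (20, 4)), rng.normal(2, 0.3, (20, 4))])
y = np.array([0] * 20 + [1] * 20)
clf = SVC(kernel=hybrid_kernel).fit(X, y)
print(clf.score(X, y))
```

Note the sigmoid kernel is not positive semi-definite in general, which is one reason to keep the RBF weight dominant in a mix like this.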
Acoustic microscope surface inspection system and method
Khuri-Yakub, Butrus T.; Parent, Philippe; Reinholdtsen, Paul A.
1991-01-01
An acoustic microscope surface inspection system and method in which pulses of high frequency electrical energy are applied to a transducer which forms and focuses acoustic energy onto a selected location on the surface of an object, receives energy from the location, and generates electrical pulses. The phase of the high frequency electrical signal pulses is stepped with respect to the phase of a reference signal at said location. An output signal is generated which is indicative of the surface at said selected location. The object is scanned to provide output signals representative of the surface at a plurality of surface locations.
Highly Multiplexed RNA Aptamer Selection using a Microplate-based Microcolumn Device.
Reinholt, Sarah J; Ozer, Abdullah; Lis, John T; Craighead, Harold G
2016-07-19
We describe a multiplexed RNA aptamer selection to 19 different targets simultaneously using a microcolumn-based device, MEDUSA (Microplate-based Enrichment Device Used for the Selection of Aptamers), as well as a modified selection process that significantly reduces the time and reagents needed for selections. We exploited MEDUSA's reconfigurable design between parallel and serially-connected microcolumns to enable the use of just 2 aliquots of starting library, and its 96-well microplate compatibility to enable the continued use of high-throughput techniques in downstream processes. Our modified selection protocol allowed us to perform the equivalent of a 10-cycle selection in the time it takes for 4 traditional selection cycles. Several aptamers were discovered with nanomolar dissociation constants. Furthermore, aptamers were identified that not only bound with high affinity, but also acted as inhibitors to significantly reduce the activity of their target protein, mouse decapping exoribonuclease (DXO). The aptamers resisted DXO's exoribonuclease activity, and in studies monitoring DXO's degradation of a 30-nucleotide substrate, less than 1 μM of aptamer demonstrated significant inhibition of DXO activity. This aptamer selection method using MEDUSA helps to overcome some of the major challenges with traditional aptamer selections, and provides a platform for high-throughput selections that lends itself to process automation.
Chen, Yifei; Sun, Yuxing; Han, Bing-Qing
2015-01-01
Protein interaction article classification is a text classification task in the biological domain to determine which articles describe protein-protein interactions. Since the feature space in text classification is high-dimensional, feature selection is widely used for reducing the dimensionality of features to speed up computation without sacrificing classification performance. Many existing feature selection methods are based on the statistical measures of document frequency and term frequency. One potential drawback of these methods is that they treat features separately. Hence, we first design a similarity measure between the context information to take word co-occurrences and phrase chunks around the features into account. Then we introduce the similarity of context information into the importance measure of the features, substituting for document and term frequency. On this basis, we propose new context similarity-based feature selection methods. Their performance is evaluated on two protein interaction article collections and compared against the frequency-based methods. The experimental results reveal that the context similarity-based methods perform better in terms of the F1 measure and the dimension reduction rate. Benefiting from the context information surrounding the features, the proposed methods can select distinctive features effectively for protein interaction article classification.
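One loose way to picture the context-similarity idea is to compare a candidate feature's co-occurrence profile across the two classes: a feature whose surrounding context differs strongly between interaction and non-interaction articles is more distinctive. The vocabulary, counts, and the specific 1 − cosine score below are illustrative assumptions, not the paper's measure.

```python
# Hedged toy sketch: score a feature by context dissimilarity across classes.
import math

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v)))

# Co-occurrence counts of a candidate feature word with a small context
# vocabulary, tallied separately in the two article classes.
context_pos = [12, 3, 7, 0]   # e.g. counts for ["protein", "cell", "complex", "survey"]
context_neg = [2, 8, 1, 9]
score = 1.0 - cosine(context_pos, context_neg)  # dissimilar contexts => distinctive
print(round(score, 2))
```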
Design of Highly Selective Gas Sensors via Physicochemical Modification of Oxide Nanowires: Overview
Woo, Hyung-Sik; Na, Chan Woong; Lee, Jong-Heun
2016-01-01
Strategies for the enhancement of gas sensing properties, and specifically the improvement of gas selectivity of metal oxide semiconductor nanowire (NW) networks grown by chemical vapor deposition and thermal evaporation, are reviewed. Highly crystalline NWs grown by vapor-phase routes have various advantages, and thus have been applied in the field of gas sensors over the years. In particular, n-type NWs such as SnO2, ZnO, and In2O3 are widely studied because of their simple synthetic preparation and high gas response. However, due to their usually high responses to C2H5OH and NO2, the selective detection of other harmful and toxic gases using oxide NWs remains a challenging issue. Various strategies—such as doping/loading of noble metals, decorating/doping of catalytic metal oxides, and the formation of core–shell structures—have been explored to enhance gas selectivity and sensitivity, and are discussed herein. Additional methods such as the transformation of n-type into p-type NWs and the formation of catalyst-doped hierarchical structures by branch growth have also proven to be promising for the enhancement of gas selectivity. Accordingly, the physicochemical modification of oxide NWs via various methods provides new strategies to achieve the selective detection of a specific gas, and after further investigations, this approach could pave a new way in the field of NW-based semiconductor-type gas sensors. PMID:27657076
Guo, Xinyu; Dominick, Kelli C.; Minai, Ali A.; Li, Hailong; Erickson, Craig A.; Lu, Long J.
2017-01-01
The whole-brain functional connectivity (FC) pattern obtained from resting-state functional magnetic resonance imaging data are commonly applied to study neuropsychiatric conditions such as autism spectrum disorder (ASD) by using different machine learning models. Recent studies indicate that both hyper- and hypo- aberrant ASD-associated FCs were widely distributed throughout the entire brain rather than only in some specific brain regions. Deep neural networks (DNN) with multiple hidden layers have shown the ability to systematically extract lower-to-higher level information from high dimensional data across a series of neural hidden layers, significantly improving classification accuracy for such data. In this study, a DNN with a novel feature selection method (DNN-FS) is developed for the high dimensional whole-brain resting-state FC pattern classification of ASD patients vs. typical development (TD) controls. The feature selection method is able to help the DNN generate low dimensional high-quality representations of the whole-brain FC patterns by selecting features with high discriminating power from multiple trained sparse auto-encoders. For the comparison, a DNN without the feature selection method (DNN-woFS) is developed, and both of them are tested with different architectures (i.e., with different numbers of hidden layers/nodes). Results show that the best classification accuracy of 86.36% is generated by the DNN-FS approach with 3 hidden layers and 150 hidden nodes (3/150). Remarkably, DNN-FS outperforms DNN-woFS for all architectures studied. The most significant accuracy improvement was 9.09% with the 3/150 architecture. The method also outperforms other feature selection methods, e.g., two sample t-test and elastic net. In addition to improving the classification accuracy, a Fisher's score-based biomarker identification method based on the DNN is also developed, and used to identify 32 FCs related to ASD. 
These FCs come from or cross different pre-defined brain networks including the default-mode, cingulo-opercular, frontal-parietal, and cerebellum networks. Thirteen of them are statistically significant between the ASD and TD groups (two sample t-test p < 0.05) while 19 of them are not. The relationship between the statistically significant FCs and the corresponding ASD behavior symptoms is discussed based on the literature and clinicians' expert knowledge. A potential reason for obtaining the 19 FCs that are not statistically significant is also provided. PMID:28871217
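The two-sample t-test baseline that the DNN feature selection is compared against can be sketched directly: rank functional-connectivity features by p-value and keep the smallest. The data below are synthetic, with a handful of features given a real group difference.

```python
# Hedged sketch: t-test feature selection over synthetic FC features.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n_fc = 200                                   # number of FC features
asd = rng.normal(0.0, 1.0, (30, n_fc))       # 30 simulated ASD subjects
td = rng.normal(0.0, 1.0, (30, n_fc))        # 30 simulated TD controls
asd[:, :5] += 2.0                            # 5 FCs with a real group difference

_, p = stats.ttest_ind(asd, td, axis=0)      # two-sample t-test per feature
selected = np.argsort(p)[:5]                 # keep the 5 smallest p-values
print(sorted(int(i) for i in selected))
```

Unlike the DNN-FS approach, this baseline scores each FC independently and cannot exploit multivariate structure, which is one motivation for the learned feature selection.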
One-step selection of Vaccinia virus-binding DNA aptamers by MonoLEX
Nitsche, Andreas; Kurth, Andreas; Dunkhorst, Anna; Pänke, Oliver; Sielaff, Hendrik; Junge, Wolfgang; Muth, Doreen; Scheller, Frieder; Stöcklein, Walter; Dahmen, Claudia; Pauli, Georg; Kage, Andreas
2007-01-01
Background As a new class of therapeutic and diagnostic reagents, RNA and DNA aptamers were identified more than fifteen years ago as molecules binding numerous small compounds, proteins and, rarely, even complete pathogen particles. Most aptamers were isolated from complex libraries of synthetic nucleic acids by a process termed SELEX, based on several selection and amplification steps. Here we report the application of a new one-step selection method (MonoLEX) to acquire high-affinity DNA aptamers binding Vaccinia virus, used as a model organism for complex target structures. Results The selection against complete Vaccinia virus particles resulted in a 64-base DNA aptamer specifically binding to orthopoxviruses, as validated by dot blot analysis, Surface Plasmon Resonance, Fluorescence Correlation Spectroscopy and real-time PCR following an aptamer blotting assay. The same oligonucleotide showed the ability to inhibit in vitro infection by Vaccinia virus and other orthopoxviruses in a concentration-dependent manner. Conclusion The MonoLEX method is a straightforward procedure, as demonstrated here by the identification of a high-affinity DNA aptamer binding Vaccinia virus. MonoLEX comprises a single affinity chromatography step, followed by physical segmentation of the affinity resin and a single final PCR amplification step of bound aptamers. This procedure therefore improves the selection of high-affinity aptamers by reducing the competition between aptamers of different affinities during the PCR step, indicating an advantage for the single-round MonoLEX method. PMID:17697378
Eating and Exercising: Nebraska Adolescents' Attitudes and Behaviors. Technical Report 25.
ERIC Educational Resources Information Center
Newman, Ian M.
This report describes selected eating and exercise patterns among a sample of 2,237 Nebraska youth in grades 9-12 selected from a random sample of 24 junior and senior high schools. The eating patterns reported cover food selection, body image, weight management, and weight loss methods. The exercise patterns relate to the frequency of…
Nanowire field-effect transistors for gas sensor applications
NASA Astrophysics Data System (ADS)
Constantinou, Marios
Sensing BTEX (benzene, toluene, ethylbenzene, xylene) pollutants is of utmost importance for reducing health risks and ensuring public safety. The lack of sensitivity and selectivity of current gas sensors and the limited number of available technologies in the field of BTEX sensing raise the demand for high-performance gas sensors for BTEX applications. The scope of this thesis is the fabrication and characterisation of high-quality field-effect transistors (FETs), with functionalised silicon nanowires (SiNWs), for the selective sensing of benzene vs. other BTEX gases. This research addresses three main challenges in SiNW FET sensor development: i) controllable and reproducible assembly of high-quality SiNWs for FET sensor devices using dielectrophoresis (DEP); ii) near-complete elimination of the harmful hysteresis effect in the SiNW FET current-voltage characteristics induced by surface states, using DMF solvent; iii) selective sensing of benzene, with sensitivity down to the ppb range, using calix[4]arene derivatives. It is experimentally demonstrated that frequency-controlled DEP is a powerful tool for the selection and collection of semiconducting SiNWs with superior electrical and morphological properties from a poly-disperse as-synthesised NW population. The DEP assembly method also enables controllable and reproducible fabrication of high-quality NW-based FETs. The results highlight the advantage of performing DEP at high signal frequencies (5-20 MHz) to selectively assemble only the high-quality NWs that can respond to such high DEP frequencies. The SiNW FETs with NWs collected at high DEP frequencies have high mobility (≈50 cm² V⁻¹ s⁻¹), low sub-threshold swing (≈1.26 V/decade), high on-current (up to 3 mA) and high on/off ratio (10⁶-10⁷).
The DEP NW selection is also demonstrated using an industrially scalable method, allowing the NW response characteristics at different DEP frequencies to be established within a very short time window of about 60 seconds. The choice of solvent for dispersing the SiNWs for the DEP process has a dramatic impact on their surface trap states, with DMF acting as a mild oxidising agent on the NW surface shell. This surface-state passivation technique resulted in the fabrication of high-quality, hysteresis-free NW FET transducers for sensor applications. Finally, a proof-of-concept SiNW FET transducer decorated with calix[4]arene-derivative gas receptors exhibits selective detection of benzene vs. other BTEX gases at concentrations up to 30 ppm and down to sub-ppm benzene concentrations. The demonstrated NW sensors are low-power and compact, and can therefore be easily mounted on a mobile device, providing instantaneous determination of hazardous gases in the surrounding atmosphere. The methodologies developed in this thesis have high potential to enable a breakthrough in low-cost, selective gas sensors, which can be fabricated in line with printed and flexible electronics approaches.
A modified estimation distribution algorithm based on extreme elitism.
Gao, Shujun; de Silva, Clarence W
2016-12-01
An existing estimation distribution algorithm (EDA) with a univariate marginal Gaussian model was improved by designing and incorporating an extreme elitism selection method. This selection method highlights the effect of a few top solutions in the evolution, helping the EDA form a primary evolution direction and obtain a fast convergence rate. At the same time, this selection preserves population diversity, helping the EDA avoid premature convergence. The modified EDA was then tested on benchmark low-dimensional and high-dimensional optimization problems to illustrate the gains from extreme elitism selection. In addition, the no-free-lunch theorem was considered in analyzing the effect of the new selection on EDAs. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
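The extreme-elitism update described above can be sketched generically: a few top-ranked solutions receive steep rank weights when the univariate Gaussian model is refit. This is an illustrative reconstruction, not the authors' implementation; the rank weights, population sizes and sphere objective below are all assumptions.

```python
import numpy as np

def eda_extreme_elitism(f, dim, pop=100, top=10, iters=150, seed=0):
    """Univariate-Gaussian EDA sketch with a rank-weighted ("extreme elitism")
    model update that lets the few best solutions dominate the search direction."""
    rng = np.random.default_rng(seed)
    mu, sigma = np.full(dim, 3.0), np.full(dim, 2.0)   # deliberately off-optimum start
    best_x, best_f = mu.copy(), f(mu)
    w = 1.0 / (1.0 + np.arange(top))                   # steep rank weights: best counts most
    w /= w.sum()
    for _ in range(iters):
        X = rng.normal(mu, sigma, size=(pop, dim))     # sample from the current model
        vals = np.array([f(x) for x in X])
        order = np.argsort(vals)
        if vals[order[0]] < best_f:                    # track the best-so-far solution
            best_f, best_x = float(vals[order[0]]), X[order[0]].copy()
        elite = X[order[:top]]                         # refit the Gaussian to weighted elites
        mu = w @ elite
        sigma = np.sqrt(w @ (elite - mu) ** 2) + 1e-9
    return best_x, best_f

sphere = lambda x: float(np.sum(x ** 2))               # toy benchmark objective
best_x, best_f = eda_extreme_elitism(sphere, dim=5)
```

In practice a variance floor or milder weights may be needed to avoid premature variance collapse, which is exactly the diversity concern the abstract raises.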
NASA Astrophysics Data System (ADS)
Riad, Safaa M.; El-Rahman, Mohamed K. Abd; Fawaz, Esraa M.; Shehata, Mostafa A.
2015-06-01
Three sensitive, selective, and precise stability-indicating spectrophotometric methods for the determination of the X-ray contrast agent diatrizoate sodium (DTA), in the presence of its acidic degradation product (the highly cytotoxic 3,5-diamino metabolite) and in pharmaceutical formulation, were developed and validated. The first method is ratio difference, the second is the bivariate method, and the third is the dual-wavelength method. The calibration curves for the three proposed methods are linear over a concentration range of 2-24 μg/mL. The selectivity of the proposed methods was tested using laboratory-prepared mixtures. The proposed methods have been successfully applied to the analysis of DTA in pharmaceutical dosage forms without interference from other dosage-form additives. The results were statistically compared with the official US pharmacopeial method. No significant difference in either accuracy or precision was observed.
Leng, Pei-Qiang; Zhao, Feng-Lan; Yin, Bin-Cheng; Ye, Bang-Ce
2015-05-21
We developed a novel colorimetric method for the rapid detection of biogenic amines based on arylalkylamine N-acetyltransferase (aaNAT). The proposed method offers distinct advantages, including simple handling, high speed, low cost, and good sensitivity and selectivity.
A Fast, Open EEG Classification Framework Based on Feature Compression and Channel Ranking
Han, Jiuqi; Zhao, Yuwei; Sun, Hongji; Chen, Jiayun; Ke, Ang; Xu, Gesen; Zhang, Hualiang; Zhou, Jin; Wang, Changyong
2018-01-01
Superior feature extraction, channel selection and classification methods are essential for designing electroencephalography (EEG) classification frameworks. However, the performance of most frameworks is limited by improper channel selection methods and overly specialized designs, leading to high computational complexity, non-convergent procedures and limited extensibility. In this paper, to remedy these drawbacks, we propose a fast, open EEG classification framework built around EEG feature compression, low-dimensional representation, and convergent iterative channel ranking. First, to reduce complexity, we use data clustering to compress the EEG features channel-wise, packing the high-dimensional EEG signal and endowing each channel with a numerical signature. Second, to provide easy access to alternative superior methods, we structurally represent each EEG trial as a feature vector with its corresponding numerical signatures. Thus, the recorded signals of many trials shrink to a low-dimensional structural matrix compatible with most pattern recognition methods. Third, a series of effective iterative feature selection approaches with theoretical convergence guarantees is introduced to rank the EEG channels and remove redundant ones, further accelerating the EEG classification process and ensuring its stability. Finally, a classical linear discriminant analysis (LDA) model is employed to classify a single EEG trial with the selected channels. Experimental results on two real-world brain-computer interface (BCI) competition datasets demonstrate the promising performance of the proposed framework over state-of-the-art methods. PMID:29713262
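The channel-ranking step can be illustrated with a simple Fisher-score criterion, a common stand-in for the framework's more elaborate convergent iterative ranking; the two-class toy data and the informative channel index below are assumptions.

```python
import numpy as np

def fisher_scores(X, y):
    """Per-channel Fisher criterion: between-class separation over within-class
    variance. X is trials x channels; y holds binary class labels."""
    scores = np.empty(X.shape[1])
    for c in range(X.shape[1]):
        a, b = X[y == 0, c], X[y == 1, c]
        scores[c] = (a.mean() - b.mean()) ** 2 / (a.var() + b.var() + 1e-12)
    return scores

rng = np.random.default_rng(1)
n_trials = 200
y = rng.integers(0, 2, n_trials)
X = rng.normal(size=(n_trials, 8))                # 8 hypothetical EEG channels
X[:, 3] += 2.0 * y                                # channel 3 carries class information
rank = np.argsort(fisher_scores(X, y))[::-1]      # best channel first
```

Dropping the lowest-ranked channels before classification is the same pruning idea the framework applies iteratively with convergence guarantees.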
Akanno, E C; Schenkel, F S; Sargolzaei, M; Friendship, R M; Robinson, J A B
2014-10-01
Genetic improvement of pigs in tropical developing countries has focused on imported exotic populations, which have been subjected to intensive selection with attendant high population-wide linkage disequilibrium (LD). Presently, indigenous pig populations with limited selection and low LD are being considered for improvement. Given that the infrastructure for genetic improvement using conventional BLUP selection methods is lacking, a genome-wide selection (GS) program has been proposed for developing countries. A simulation study was conducted to evaluate the option of using a 60 K SNP panel and the observed amount of LD in the exotic and indigenous pig populations. Several scenarios were evaluated, including different sizes and structures of training and validation populations, different selection methods, and the long-term accuracy of GS in different population/breeding structures and traits. The training set included a previously selected exotic population, an unselected indigenous population and their crossbreds. Traits studied included number born alive (NBA), average daily gain (ADG) and back fat thickness (BFT). The ridge regression method was used to train the prediction model. The results showed that accuracies of genomic breeding values (GBVs) in the range of 0.30 (NBA) to 0.86 (BFT) in the validation population can be expected if high-density marker panels are utilized. The GS method improved the accuracy of breeding values over the pedigree-based approach for traits with low heritability and in young animals with no performance data. A crossbred training population performed better than purebreds when validation was in populations with a similar or different structure from the training set. Genome-wide selection holds promise for genetic improvement of pigs in the tropics. © 2014 Blackwell Verlag GmbH.
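Ridge-regression training of genomic breeding values, the method named in the abstract, can be sketched on simulated genotypes. The marker count, QTL number, heritability and ridge parameter below are arbitrary assumptions, not the study's settings.

```python
import numpy as np

rng = np.random.default_rng(0)
n_train, n_val, m = 300, 100, 200                     # animals and SNP markers
M = rng.integers(0, 3, size=(n_train + n_val, m)).astype(float)  # 0/1/2 genotype codes
M -= M.mean(axis=0)                                   # centre marker columns
beta = np.zeros(m)
qtl = rng.choice(m, 20, replace=False)
beta[qtl] = rng.normal(0.0, 1.0, 20)                  # 20 causal markers
g = M @ beta                                          # true breeding values
y = g + rng.normal(0.0, g.std(), len(g))              # phenotypes with h2 ~ 0.5

Mtr, ytr = M[:n_train], y[:n_train]
lam = 10.0                                            # ridge shrinkage parameter
bhat = np.linalg.solve(Mtr.T @ Mtr + lam * np.eye(m), Mtr.T @ (ytr - ytr.mean()))
gbv = M[n_train:] @ bhat                              # predicted GBVs, validation animals
accuracy = float(np.corrcoef(gbv, g[n_train:])[0, 1])
```

The accuracy reported in GS studies is exactly this correlation between predicted GBVs and true breeding values in a validation population.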
Endoscopic methods in the treatment of early-stage esophageal cancer
2014-01-01
Most patients with early esophageal cancer restricted to the mucosa may be offered endoscopic therapy, which is similarly effective to esophagectomy but less invasive and less expensive. Selection of the appropriate treatment and therapy method should be performed at a specialized center with adequate facilities. Selecting an endoscopic treatment method for high-grade dysplasia and early-stage esophageal adenocarcinoma requires that tumor infiltration be restricted to the mucosa and that there be no neighboring lymph node metastasis. In squamous cell carcinoma, this treatment method is accepted for tumors invading only up to the lamina propria of the mucosa (m2). Tumors treated endoscopically should be well or moderately differentiated and should not invade lymphatic or blood vessels. When selecting endoscopic treatments for these lesions, a combination of endoscopic resection and endoscopic ablation methods should be considered. PMID:25097676
NASA Astrophysics Data System (ADS)
Sitnikov, Dmitri G.; Monnin, Cian S.; Vuckovic, Dajana
2016-12-01
The comparison of extraction methods for global metabolomics is usually performed in biofluids only and focuses on metabolite coverage and method repeatability. This limits our detailed understanding of extraction parameters such as recovery and matrix effects and prevents side-by-side comparison of different sample preparation strategies. To address this gap in knowledge, seven solvent-based and solid-phase extraction methods were systematically evaluated using standard analytes spiked into both buffer and human plasma. We compared recovery, coverage, repeatability, matrix effects, selectivity and orthogonality of all methods tested for the non-lipid metabolome in combination with reversed-phase and mixed-mode liquid chromatography-mass spectrometry (LC-MS) analysis. Our results confirmed the wide selectivity and excellent precision of solvent precipitations, but revealed their high susceptibility to matrix effects. Using all seven methods showed high overlap and redundancy, which resulted in metabolite coverage increases of 34-80%, depending on the LC-MS method employed, compared to the best single extraction protocol (methanol/ethanol precipitation), despite a 7× increase in MS analysis time and sample consumption. The methods most orthogonal to methanol-based precipitation were ion-exchange solid-phase extraction and liquid-liquid extraction using methyl tert-butyl ether. Our results help facilitate rational design and selection of sample preparation methods and internal standards for global metabolomics.
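The recovery and matrix-effect figures discussed above reduce to simple ratios of peak areas. One common convention (a pre-/post-extraction spike scheme) is sketched below; the function names and area values are invented for illustration and conventions differ between labs.

```python
def recovery_pct(pre_spike_area: float, blank_area: float, neat_area: float) -> float:
    """Absolute recovery: analyte spiked into matrix BEFORE extraction,
    corrected for endogenous background, relative to a neat standard."""
    return 100.0 * (pre_spike_area - blank_area) / neat_area

def matrix_effect_pct(post_spike_area: float, neat_area: float) -> float:
    """Matrix effect: analyte spiked AFTER extraction relative to a neat
    standard. Below 100 % indicates suppression, above 100 % enhancement."""
    return 100.0 * post_spike_area / neat_area

# hypothetical peak areas for one metabolite
rec = recovery_pct(pre_spike_area=9_000.0, blank_area=1_000.0, neat_area=10_000.0)
me = matrix_effect_pct(post_spike_area=7_500.0, neat_area=10_000.0)
```

In this invented example the recovery is 80 % and the matrix effect 75 %, i.e. noticeable ion suppression of the kind the abstract attributes to solvent precipitations.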
Stiffler, Michael A; Subramanian, Subu K; Salinas, Victor H; Ranganathan, Rama
2016-07-03
Site-directed mutagenesis has long been used to interrogate protein structure, function and evolution. Recent advances in massively parallel sequencing technology have opened up the possibility of assessing the functional or fitness effects of large numbers of mutations simultaneously. Here, we present a protocol for experimentally determining the effects of all possible single amino acid mutations in a protein of interest using high-throughput sequencing, with the 263-amino-acid antibiotic resistance enzyme TEM-1 β-lactamase as an example. In this approach, a whole-protein saturation mutagenesis library is constructed by site-directed mutagenic PCR, randomizing each position individually to all possible amino acids. The library is then transformed into bacteria and selected for the ability to confer resistance to β-lactam antibiotics. The fitness effect of each mutation is then determined by deep sequencing of the library before and after selection. Importantly, this protocol introduces methods that maximize sequencing read depth and permit simultaneous selection of the entire mutation library, by mixing adjacent positions into groups whose length is accommodated by the high-throughput sequencing read length and by using orthogonal primers to barcode each group. Representative results using this protocol are provided by assessing the fitness effects of all single amino acid mutations in TEM-1 at a clinically relevant dosage of ampicillin. The method should be easily extendable to other proteins for which a high-throughput selection assay is in place.
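The "deep sequencing before and after selection" step reduces to a per-mutant log-enrichment calculation, usually normalized to wild type. A minimal sketch with invented counts and a 0.5 pseudocount follows; the protocol's exact normalization may differ.

```python
import numpy as np

def mutation_fitness(counts_before, counts_after, wt_before, wt_after, pseudo=0.5):
    """Per-mutant log enrichment relative to wild type (a common deep
    mutational scanning metric; a pseudocount guards against zero counts)."""
    f_mut = np.log((np.asarray(counts_after) + pseudo)
                   / (np.asarray(counts_before) + pseudo))
    f_wt = np.log((wt_after + pseudo) / (wt_before + pseudo))
    return f_mut - f_wt

before = np.array([100, 100, 100])
after = np.array([100, 10, 1000])        # neutral, deleterious, beneficial mutant
fitness = mutation_fitness(before, after, wt_before=100, wt_after=100)
```

Mutants depleted by the antibiotic selection come out negative, enriched mutants positive, and mutants tracking wild type near zero.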
Du, Wei; Zhang, Bilin; Guo, Pengqi; Chen, Guoning; Chang, Chun; Fu, Qiang
2018-03-15
Dexamethasone-imprinted polymers were fabricated by reversible addition-fragmentation chain transfer polymerization on the surface of magnetic nanoparticles under mild polymerization conditions; they exhibited a narrow polydispersity and high selectivity for dexamethasone extraction. The dexamethasone-imprinted polymers were characterized by scanning electron microscopy, transmission electron microscopy, Fourier transform infrared spectroscopy, X-ray diffraction, energy-dispersive spectrometry, and vibrating sample magnetometry. The adsorption performance was evaluated by static adsorption, kinetic adsorption and selectivity tests. The results confirmed the successful construction of an imprinted polymer layer on the surface of the magnetic nanoparticles, which provides high adsorption capacity, fast mass transfer, specific molecular recognition, and simple magnetic separation. Combined with high-performance liquid chromatography, the molecularly imprinted polymers were used as magnetic extraction sorbents for the rapid and selective extraction and determination of dexamethasone in skincare cosmetic samples, with accuracies for the spiked samples ranging from 93.8 to 97.6%. The relative standard deviations were less than 2.7%. The limits of detection and quantification were 0.05 and 0.20 μg/mL, respectively. The developed method is simple, fast and highly selective and could be a promising method for dexamethasone monitoring in cosmetic products. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Li, Jing; Hong, Wenxue
2014-12-01
Feature extraction and feature selection are important issues in pattern recognition. Based on the geometric algebra representation of vectors, a new feature extraction method using the blade coefficients of geometric algebra is proposed in this study. At the same time, an improved differential evolution (DE) feature selection method is proposed to address the resulting high dimensionality. Simple linear discriminant analysis was used as the classifier. The 10-fold cross-validation (10 CV) classification accuracy on a public breast cancer biomedical dataset exceeded 96%, superior to that of the original features and a traditional feature extraction method.
Justice, N. B.; Sczesnak, A.; Hazen, T. C.; ...
2017-08-04
A central goal of microbial ecology is to identify and quantify the forces that lead to observed population distributions and dynamics. However, these forces, which include environmental selection, dispersal, and organism interactions, are often difficult to assess in natural environments. Here we present a method that links microbial community structures with selective and stochastic forces through highly replicated subsampling and enrichment of a single environmental inoculum. Specifically, groundwater from a well-studied natural aquifer was serially diluted and inoculated into nearly 1,000 aerobic and anaerobic nitrate-reducing cultures, and the final community structures were evaluated with 16S rRNA gene amplicon sequencing. We analyzed the frequency and abundance of individual operational taxonomic units (OTUs) to understand how probabilistic immigration, relative fitness differences, environmental factors, and organismal interactions contributed to divergent distributions of community structures. We further used a most probable number (MPN) method to estimate the natural condition-dependent cultivable abundance of each of the nearly 400 OTUs cultivated in our study and to infer the relative fitness of each. Additionally, we infer condition-specific organism interactions and discuss how this high-replicate culturing approach is essential in dissecting the interplay between overlapping ecological forces and taxon-specific attributes that underpin microbial community assembly. IMPORTANCE Through highly replicated culturing, in which inocula are subsampled from a single environmental sample, we empirically determine how selective forces, interspecific interactions, relative fitness, and probabilistic dispersal shape bacterial communities.
These methods offer a novel approach to untangle not only interspecific interactions but also taxon-specific fitness differences that manifest across different cultivation conditions and lead to the selection and enrichment of specific organisms. Additionally, we provide a method for estimating the number of cultivable units of each OTU in the original sample through the MPN approach.
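The MPN estimate referenced above is a maximum-likelihood calculation over a dilution series under a Poisson model for cell counts. The grid-search sketch below uses the classic 5-tube, 3-dilution water-testing design as an example, not the study's actual layout.

```python
import numpy as np

def mpn_mle(volumes_ml, n_tubes, n_positive):
    """Grid-search maximum-likelihood MPN (organisms per mL) for a dilution
    series, assuming Poisson-distributed cells in each inoculated tube."""
    lam = np.logspace(-3, 4, 20000)                    # candidate concentrations
    loglik = np.zeros_like(lam)
    for v, n, p in zip(volumes_ml, n_tubes, n_positive):
        p_pos = 1.0 - np.exp(-lam * v)                 # P(a tube turns positive)
        loglik += p * np.log(p_pos + 1e-300) + (n - p) * (-lam * v)
    return float(lam[np.argmax(loglik)])

# classic 5-tube series at 10, 1 and 0.1 mL with a 5-3-0 positive pattern
est = mpn_mle([10.0, 1.0, 0.1], [5, 5, 5], [5, 3, 0])
```

Standard MPN tables list approximately 79 organisms per 100 mL (about 0.8/mL) for this 5-3-0 pattern, which the grid-search MLE reproduces to within rounding.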
Li, Jin; Tran, Maggie; Siwabessy, Justy
2016-01-01
Spatially continuous predictions of seabed hardness are important baseline environmental information for sustainable management of Australia's marine jurisdiction. Seabed hardness is often inferred from multibeam backscatter data with unknown accuracy, and can be inferred from underwater video footage at only a limited number of locations. In this study, we classified the seabed into four classes based on two new seabed hardness classification schemes (i.e., hard90 and hard70). We developed optimal predictive models of seabed hardness using random forest (RF), based on the point data of hardness classes and spatially continuous multibeam data. Five feature selection (FS) methods, namely variable importance (VI), averaged variable importance (AVI), knowledge-informed AVI (KIAVI), Boruta and regularized RF (RRF), were tested based on predictive accuracy. The effects of highly correlated, important and unimportant predictors on the accuracy of the RF predictive models were examined. Finally, spatial predictions generated using the most accurate models were visually examined and analysed. This study confirmed that: 1) hard90 and hard70 are effective seabed hardness classification schemes; 2) seabed hardness of four classes can be predicted with a high degree of accuracy; 3) the typical approach of pre-selecting predictive variables by excluding highly correlated variables needs to be re-examined; 4) identifying the important and unimportant predictors provides useful guidelines for further improving predictive models; 5) FS methods select the most accurate predictive model(s) rather than the most parsimonious ones, and AVI and Boruta are recommended for future studies; and 6) RF is an effective modelling method with high predictive accuracy for multi-level categorical data and can be applied to 'small p and large n' problems in environmental sciences.
Additionally, automated computational programs for AVI need to be developed to increase its computational efficiency, and caution should be taken when applying filter FS methods to select predictive models. PMID:26890307
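Finding 5) above, that FS should be judged by model accuracy rather than parsimony, can be illustrated with a toy backward-elimination wrapper that keeps whichever subset scored best along the way. A nearest-centroid classifier stands in for the random forest here, and the simulated data are assumptions.

```python
import numpy as np

def nearest_centroid_acc(X, y, cols):
    """Training accuracy of a nearest-centroid classifier restricted to the
    given columns (a toy stand-in for a random forest's accuracy estimate)."""
    Xs = X[:, cols]
    c0, c1 = Xs[y == 0].mean(axis=0), Xs[y == 1].mean(axis=0)
    pred = np.linalg.norm(Xs - c1, axis=1) < np.linalg.norm(Xs - c0, axis=1)
    return float(np.mean(pred == y))

def backward_select(X, y):
    """Greedy backward elimination returning the most ACCURATE subset seen,
    not the smallest one."""
    cols = list(range(X.shape[1]))
    best_cols, best_acc = cols[:], nearest_centroid_acc(X, y, cols)
    while len(cols) > 1:
        # drop the column whose removal hurts accuracy least
        accs = [nearest_centroid_acc(X, y, [c for c in cols if c != j]) for j in cols]
        cols.remove(cols[int(np.argmax(accs))])
        acc = nearest_centroid_acc(X, y, cols)
        if acc >= best_acc:
            best_cols, best_acc = cols[:], acc
    return best_cols, best_acc

rng = np.random.default_rng(7)
n = 300
y = rng.integers(0, 2, n)
X = rng.normal(size=(n, 6))
X[:, 0] += 1.5 * y                       # two informative predictors,
X[:, 1] += 1.5 * y                       # four pure-noise predictors
best_cols, best_acc = backward_select(X, y)
```

The wrapper typically retains both informative predictors while pruning noise, because it optimizes accuracy rather than stopping at the first small subset.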
Integrative Analysis of High-throughput Cancer Studies with Contrasted Penalization
Shi, Xingjie; Liu, Jin; Huang, Jian; Zhou, Yong; Shia, BenChang; Ma, Shuangge
2015-01-01
In cancer studies with high-throughput genetic and genomic measurements, integrative analysis provides a way to effectively pool and analyze heterogeneous raw data from multiple independent studies, and it outperforms "classic" meta-analysis and single-dataset analysis. When marker selection is of interest, the genetic basis of multiple datasets can be described using either the homogeneity model or the heterogeneity model. In this study, we consider marker selection under the heterogeneity model, which includes the homogeneity model as a special case and can be more flexible. Penalization methods have been developed in the literature for marker selection. This study advances beyond the published work by introducing contrast penalties, which can accommodate the within- and across-dataset structures of covariates/regression coefficients and, by doing so, further improve marker selection performance. Specifically, we develop a penalization method that accommodates the across-dataset structures by smoothing over regression coefficients. An effective iterative algorithm, which calls an inner coordinate descent iteration, is developed. Simulations show that the proposed method outperforms the benchmark with more accurate marker identification. The analysis of breast cancer and lung cancer prognosis studies with gene expression measurements shows that the proposed method identifies genes different from those found using the benchmark and has better prediction performance. PMID:24395534
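The contrast penalty that smooths coefficients across datasets can be illustrated in its simplest quadratic form for two studies. The paper combines such smoothing with a sparsity penalty and an iterative coordinate-descent algorithm; the closed-form ridge version below is a deliberate simplification with invented data.

```python
import numpy as np

def fused_ridge(X1, y1, X2, y2, lam_fuse, lam_ridge=1e-3):
    """Two-study ridge fit with a quadratic contrast penalty
    lam_fuse * ||beta1 - beta2||^2 smoothing coefficients across datasets."""
    p = X1.shape[1]
    I = np.eye(p)
    A = np.block([[X1.T @ X1 + (lam_fuse + lam_ridge) * I, -lam_fuse * I],
                  [-lam_fuse * I, X2.T @ X2 + (lam_fuse + lam_ridge) * I]])
    b = np.concatenate([X1.T @ y1, X2.T @ y2])
    beta = np.linalg.solve(A, b)
    return beta[:p], beta[p:]

rng = np.random.default_rng(0)
p = 10
beta_true = rng.normal(size=p)                        # shared genetic basis
X1, X2 = rng.normal(size=(80, p)), rng.normal(size=(80, p))
y1 = X1 @ beta_true + rng.normal(size=80)
y2 = X2 @ beta_true + rng.normal(size=80)
b1_loose, b2_loose = fused_ridge(X1, y1, X2, y2, lam_fuse=0.0)
b1_tight, b2_tight = fused_ridge(X1, y1, X2, y2, lam_fuse=1e6)
```

As lam_fuse grows, the two coefficient vectors are pulled together, which is the "smoothing over regression coefficients" idea in the abstract; the heterogeneity model corresponds to small or moderate lam_fuse.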
He, Maofang; Wang, Chaozhan; Wei, Yinmao
2016-01-15
In this paper, iminodiacetic acid-Cu(II)-functionalized Fe3O4@SiO2 magnetic nanoparticles were prepared and used as new adsorbents for the magnetic solid-phase extraction (MSPE) of six monoamine neurotransmitters (MNTs) from rabbit plasma. The selective enrichment of MNTs at pH 5.0 was driven by the specific coordination interaction between the amino groups of the MNTs and the immobilized Cu(II). The weakly acidic extraction condition avoided oxidation of the MNTs, and thus facilitated operation and ensured higher recoveries. Under optimal conditions, the recoveries of the six MNTs from rabbit plasma were in the range of 83.9-109.4%, with RSDs of 2.0-10.0%. When the Cu(II)-immobilized MSPE was coupled with high-performance liquid chromatography and fluorescence detection, the method exhibited lower detection limits than previously reported methods and was successfully used to determine the endogenous MNTs in rabbit plasma. The proposed method has potential application for the determination of MNTs in biological samples. In addition, exploiting coordination interactions to improve selectivity might open another route to the selective enrichment of small alkaloids from complex samples. Copyright © 2015 Elsevier B.V. All rights reserved.
A new mosaic method for three-dimensional surface
NASA Astrophysics Data System (ADS)
Yuan, Yun; Zhu, Zhaokun; Ding, Yongjun
2011-08-01
Three-dimensional (3-D) data mosaicking is an indispensable step in surface measurement and digital terrain map generation. To address the problem of mosaicking local unorganized point clouds with only coarse registration and many mismatched points, a new RANSAC-based mosaic method for 3-D surfaces is proposed. Each iteration of the method proceeds sequentially through random sampling with an additional shape constraint, data normalization of the point clouds, absolute orientation, data denormalization of the point clouds, and inlier counting. After N random sample trials, the largest consensus set is selected, and finally the model is re-estimated using all the points in the selected subset. The minimal subset consists of three non-collinear points forming a triangle. The shape of the triangle is considered during random sample selection to make the selection reasonable. A new coordinate-system transformation algorithm presented in this paper is used to avoid singularity. The whole rotation between the two coordinate systems is solved by two successive rotations expressed as Euler angle vectors, each with an explicit physical meaning. Both simulated and real data are used to demonstrate the correctness and validity of the mosaic method. The method has good noise immunity owing to its robust estimation, and achieves high accuracy because the shape constraint is added to the random sampling and data normalization is added to the absolute orientation. The method is applicable to high-precision measurement of 3-D surfaces as well as 3-D terrain mosaicking.
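A compact version of the pipeline, random 3-point samples with a triangle shape check, absolute orientation, and inlier counting, is sketched below. The Kabsch/SVD solution stands in for the paper's two-step Euler-angle orientation, data normalization and the final all-inlier re-estimation are omitted for brevity, and the thresholds are assumptions.

```python
import numpy as np

def rigid_fit(P, Q):
    """Least-squares rotation R and translation t with Q ~= P @ R.T + t,
    via the Kabsch/SVD absolute-orientation solution."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))             # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cq - R @ cp

def ransac_align(P, Q, iters=500, tol=0.05, seed=0):
    """RANSAC over minimal 3-point samples with a triangle shape constraint."""
    rng = np.random.default_rng(seed)
    best_R, best_t, best_n = None, None, -1
    for _ in range(iters):
        idx = rng.choice(len(P), 3, replace=False)
        a, b, c = P[idx]
        if np.linalg.norm(np.cross(b - a, c - a)) < 1e-6:
            continue                                   # reject near-collinear triangles
        R, t = rigid_fit(P[idx], Q[idx])
        n_in = int(np.sum(np.linalg.norm(P @ R.T + t - Q, axis=1) < tol))
        if n_in > best_n:                              # keep largest consensus set
            best_R, best_t, best_n = R, t, n_in
    return best_R, best_t

# synthetic check: known rotation/translation plus 30% gross outliers
rng = np.random.default_rng(1)
cth, sth = np.cos(0.4), np.sin(0.4)
R_true = np.array([[cth, -sth, 0.0], [sth, cth, 0.0], [0.0, 0.0, 1.0]])
t_true = np.array([1.0, -2.0, 0.5])
P = rng.normal(size=(100, 3))
Q = P @ R_true.T + t_true
Q[:30] += rng.normal(0.0, 5.0, size=(30, 3))          # mismatched correspondences
R_est, t_est = ransac_align(P, Q)
```

With 70 % inliers and 500 trials, the chance of never drawing an all-inlier triple is negligible, so the consensus model recovers the true transform despite the gross mismatches.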
Evaluation of variable selection methods for random forests and omics data sets.
Degenhardt, Frauke; Seifert, Stephan; Szymczak, Silke
2017-10-16
Machine learning methods, and in particular random forests, are promising approaches for prediction based on high dimensional omics data sets. They provide variable importance measures to rank predictors according to their predictive power. If building a prediction model is the main goal of a study, often a minimal set of variables with good prediction performance is selected. However, if the objective is the identification of involved variables to find active networks and pathways, approaches that aim to select all relevant variables should be preferred. We evaluated several variable selection procedures based on simulated data as well as publicly available experimental methylation and gene expression data. Our comparison included the Boruta algorithm, the Vita method, recurrent relative variable importance, a permutation approach and its parametric variant (Altmann), as well as recursive feature elimination (RFE). In our simulation studies, Boruta was the most powerful approach, followed closely by the Vita method. Both approaches demonstrated similar stability in variable selection, while Vita was the most robust approach under a pure null model without any predictor variables related to the outcome. In the analysis of the different experimental data sets, Vita demonstrated slightly better stability in variable selection and was less computationally intensive than Boruta. In conclusion, we recommend the Boruta and Vita approaches for the analysis of high-dimensional data sets. Vita is considerably faster than Boruta and thus more suitable for large data sets, but only Boruta can also be applied in low-dimensional settings. © The Author 2017. Published by Oxford University Press.
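Boruta's core idea can be sketched compactly: augment the data with permuted "shadow" copies of every feature, fit a random forest, and keep features whose importance repeatedly beats the best shadow. The sketch below is a simplified majority-vote version (the real Boruta package applies a statistical test over many iterations), assuming scikit-learn:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def shadow_select(X, y, n_rounds=10, rng=np.random.default_rng(0)):
    """Boruta-style all-relevant selection sketch: a feature is kept if
    its random-forest importance beats the best shuffled 'shadow'
    feature in at least half of the rounds."""
    n, p = X.shape
    hits = np.zeros(p)
    for _ in range(n_rounds):
        shadows = rng.permuted(X, axis=0)   # shuffle each column independently
        rf = RandomForestClassifier(n_estimators=200, random_state=0)
        rf.fit(np.hstack([X, shadows]), y)
        imp = rf.feature_importances_
        hits += imp[:p] > imp[p:].max()     # beat the best shadow this round?
    return hits / n_rounds >= 0.5
```

A feature carrying real signal wins against the shadows in essentially every round, while noise features win only by chance.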
Stabilizing l1-norm prediction models by supervised feature grouping.
Kamkar, Iman; Gupta, Sunil Kumar; Phung, Dinh; Venkatesh, Svetha
2016-02-01
Emerging Electronic Medical Records (EMRs) have transformed modern healthcare. These records have great potential to be used for building clinical prediction models. However, a problem in using them is their high dimensionality. Since much of the information may not be relevant for prediction, the underlying complexity of the prediction models may not be high. A popular way to deal with this problem is to employ feature selection. Lasso and l1-norm based feature selection methods have shown promising results. However, in the presence of correlated features, these methods select features that change considerably with small changes in the data. This prevents clinicians from obtaining a stable feature set, which is crucial for clinical decision making. Grouping correlated variables together can improve the stability of feature selection; however, such a grouping is usually not known and needs to be estimated for optimal performance. Addressing this problem, we propose a new model that can simultaneously learn the grouping of correlated features and perform stable feature selection. We formulate the model as a constrained optimization problem and provide an efficient solution with guaranteed convergence. Our experiments with both synthetic and real-world datasets show that the proposed model is significantly more stable than Lasso and many existing state-of-the-art shrinkage and classification methods. We further show that in terms of prediction performance, the proposed method consistently outperforms Lasso and other baselines. Our model can be used for selecting stable risk factors for a variety of healthcare problems, and so can assist clinicians toward accurate decision making. Copyright © 2015 Elsevier Inc. All rights reserved.
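The instability described above is easy to reproduce: with a few nearly identical predictors, the lasso picks among them more or less arbitrarily on each bootstrap resample, while pre-grouping the correlated block into one averaged feature yields a stable selection. The averaging step below is a crude stand-in for the learned grouping the authors propose, not their model; a minimal scikit-learn sketch:

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, reps = 200, 30
z = rng.normal(size=(n, 1))
# three highly correlated copies of one signal, plus three noise features
X = np.hstack([z + 0.05 * rng.normal(size=(n, 3)),
               rng.normal(size=(n, 3))])
y = z.ravel() + 0.5 * rng.normal(size=n)

def selected(X, y, idx):
    """Support of a lasso fit on a bootstrap resample."""
    m = Lasso(alpha=0.1).fit(X[idx], y[idx])
    return m.coef_ != 0

# selection across bootstrap resamples: which copy wins varies from run to run
picks = np.array([selected(X, y, rng.integers(0, n, n)) for _ in range(reps)])

# grouping the correlated block into its average stabilizes the selection
Xg = np.hstack([X[:, :3].mean(axis=1, keepdims=True), X[:, 3:]])
picks_g = np.array([selected(Xg, y, rng.integers(0, n, n)) for _ in range(reps)])
```

Inspecting `picks.mean(axis=0)` shows the per-feature selection frequencies; the grouped feature in `picks_g` is selected in every resample.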
Wang, Liqun; Cardenas, Roberto Bravo; Watson, Clifford
2017-09-08
CDC's Division of Laboratory Sciences developed and validated a new method for the simultaneous detection and measurement of 11 sugars, alditols and humectants in tobacco products. The method uses isotope dilution ultra high performance liquid chromatography coupled with tandem mass spectrometry (UHPLC-MS/MS) and has demonstrated high sensitivity, selectivity, throughput and accuracy, with recoveries ranging from 90% to 113%, limits of detection ranging from 0.0002 to 0.0045 μg/mL and coefficients of variation (CV%) ranging from 1.4 to 14%. Calibration curves for all analytes were linear, with R² values greater than 0.995. Quantification of tobacco components is necessary to characterize tobacco product components and their potential effects on consumer appeal, smoke chemistry and toxicology, and to potentially help distinguish tobacco product categories. The researchers analyzed a variety of tobacco products (e.g., cigarettes, little cigars, cigarillos) using the new method and documented differences in the abundance of selected analytes among product categories. Specifically, differences were detected in levels of selected sugars found in little cigars and cigarettes, which could help address appeal potential and have utility when product category is unknown, unclear, or miscategorized. Copyright © 2017. Published by Elsevier B.V.
A Comparison of Two Methods of Teaching Molecular Architecture to High School Chemistry Students.
ERIC Educational Resources Information Center
Halsted, Douglas Alan
This investigation explored the question of how high school chemistry students best learn three-dimensional molecular, ionic, and metallic structures in CHEM Study (Freeman, 1963). The experimenter compared the achievement, attitude, and instructional preferences of 110 randomly selected students taught by two different methods: (1) student…
NASA Astrophysics Data System (ADS)
Orito, N.; Umekage, S.; Sato, K.; Kawauchi, S.; Tanaka, H.; Sakai, E.; Tanaka, T.; Kikuchi, Y.
2012-03-01
We have developed a modified SELEX (systematic evolution of ligands by exponential enrichment) method to obtain RNA aptamers with high affinity to C-reactive protein (CRP). CRP is a clinical biomarker present in plasma, the level of which increases in response to infections and noninfectious inflammation. The CRP level is also an important prognostic indicator in patients with several syndromes. At present, CRP content in blood is measured immunochemically using antibodies. To develop a more sensitive method using RNA aptamers, we have attempted to obtain high-affinity RNA aptamers to CRP. We succeeded in obtaining an RNA aptamer with high affinity to CRP using a CRP-immobilized Sepharose column and pre-elution procedure. Pre-elution is a method that removes the weak binding portion from a selected RNA population by washing for a short time with buffer containing CRP. By surface plasmon-resonance (SPR) analysis, the affinity constant of this aptamer for CRP was calculated to be KD = 2.25×10⁻⁹ M. The secondary structure, contact sites with CRP protein, and application of this aptamer will be described.
NASA Astrophysics Data System (ADS)
Hu, Shuo; Yang, Guangxin; Jiang, Hong; Liu, Yefei; Chen, Rizhi
2018-03-01
Selective phenol hydrogenation is a green and sustainable technology for producing cyclohexanone. This work focused on investigating the role of the catalyst reduction method in the liquid-phase hydrogenation of phenol to cyclohexanone over Pd@CN (N-doped porous carbon). A series of reduction methods including flowing hydrogen reduction, in-situ reaction reduction and liquid-phase reduction were designed and performed. The results highlighted that the reduction method significantly affected the catalytic performance of Pd@CN in the liquid-phase hydrogenation of phenol to cyclohexanone, and that liquid-phase reduction with the addition of an appropriate amount of phenol was highly efficient in improving the catalytic activity of Pd@CN. The influence mechanism was explored by a series of characterizations. The results of TEM, XPS and CO chemisorption confirmed that the reduction method mainly affected the size, surface composition and dispersion of Pd in the CN material. The addition of phenol during the liquid-phase reduction could inhibit the aggregation of Pd NPs and promote the reduction of Pd²⁺, thereby improving the catalytic activity of Pd@CN. This work should aid the development of high-performance Pd@CN catalysts for selective phenol hydrogenation.
Construction of human antibody gene libraries and selection of antibodies by phage display.
Frenzel, André; Kügler, Jonas; Wilke, Sonja; Schirrmann, Thomas; Hust, Michael
2014-01-01
Antibody phage display is the most commonly used in vitro selection technology and has yielded thousands of useful antibodies for research, diagnostics, and therapy. The prerequisite for successful generation and development of human recombinant antibodies using phage display is the construction of a high-quality antibody gene library. Here, we describe the methods for the construction of human immune and naive scFv gene libraries. The success also depends on the panning strategy for the selection of binders from these libraries. In this article, we describe a panning strategy that is high-throughput compatible and allows parallel selection in microtiter plates.
NASA Astrophysics Data System (ADS)
Iwasaki, Ryosuke; Nagaoka, Ryo; Yoshizawa, Shin; Umemura, Shin-ichiro
2018-07-01
Acoustic cavitation bubbles are known to enhance the heating effect in high-intensity focused ultrasound (HIFU) treatment. The detection of cavitation bubbles with high sensitivity and selectivity is required to predict the therapeutic and side effects of cavitation, and ensure the efficacy and safety of the treatment. A pulse inversion (PI) technique has been widely used for imaging microbubbles through enhancing the second-harmonic component of echo signals. However, it has difficulty in separating the nonlinear response of microbubbles from that due to nonlinear propagation. In this study, a triplet pulse (3P) method was investigated to specifically image cavitation bubbles by extracting the 1.5th fractional harmonic component. The proposed 3P method depicted cavitation bubbles with a contrast ratio significantly higher than those in conventional imaging methods with and without PI. The results suggest that the 3P method is effective for specifically detecting microbubbles in cavitation-enhanced HIFU treatment.
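The selectivity of such a triplet-pulse scheme rests on phase cycling: summing echoes from three transmits phased 0, 2π/3 and 4π/3 multiplies an nth-harmonic component by Σₖ e^{i n k 2π/3}, which vanishes for n = 1 and n = 2 but not for n = 1.5. The abstract does not give the authors' exact pulse design, so the toy numpy model below (with arbitrary illustrative harmonic amplitudes) only demonstrates this cancellation principle:

```python
import numpy as np

fs, f0 = 50e6, 2e6                   # assumed sampling rate and fundamental
t = np.arange(0, 20e-6, 1 / fs)

def echo(phase, a2=0.3, a15=0.1):
    """Toy bubble echo: fundamental + 2nd harmonic + 1.5th fractional
    harmonic, each inheriting the transmit phase scaled by its order."""
    x = 2 * np.pi * f0 * t + phase
    return np.cos(x) + a2 * np.cos(2 * x) + a15 * np.cos(1.5 * x)

# triplet-pulse (3P) summation over transmit phases 0, 2pi/3, 4pi/3
three_p = sum(echo(k * 2 * np.pi / 3) for k in range(3))

spec = np.abs(np.fft.rfft(three_p))
freqs = np.fft.rfftfreq(len(t), 1 / fs)

def amp(f):
    """Spectral magnitude at the bin nearest a target frequency."""
    return spec[np.argmin(np.abs(freqs - f))]
```

In the summed signal the fundamental (2 MHz) and second harmonic (4 MHz) cancel to numerical precision, while the 1.5th fractional harmonic (3 MHz) survives at its single-pulse amplitude.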
Cheng, Yu-Huei
2014-12-01
Specific primers play an important role in polymerase chain reaction (PCR) experiments, so it is essential to find specific primers of outstanding quality. Unfortunately, many PCR constraints must be inspected simultaneously, which makes specific primer selection difficult and time-consuming. This paper introduces a novel computational intelligence-based method, Teaching-Learning-Based Optimisation (TLBO), to select specific and feasible primers. Selections were performed for specified PCR product lengths of 150-300 bp and 500-800 bp with three melting temperature formulae: Wallace's formula, Bolton and McCarthy's formula and SantaLucia's formula. The authors calculated the optimal frequency to estimate the quality of primer selection based on a total of 500 runs for 50 random nucleotide sequences of 'Homo species' retrieved from the National Center for Biotechnology Information. The method was then fairly compared with the genetic algorithm (GA) and memetic algorithm (MA) for primer selection in the literature. The results show that the method easily found suitable primers satisfying the set primer constraints and performed better than the GA and the MA. Furthermore, the method was also compared with the commonly used tool Primer3 in terms of method type, primer presentation, parameter settings, speed and memory usage. In conclusion, it is an interesting primer selection method and a valuable tool for automatic high-throughput analysis. In the future, the use of the primers in the wet lab needs to be validated carefully to increase the reliability of the method.
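Teaching-Learning-Based Optimisation itself is simple to state: a "teacher" phase pulls the population toward the current best solution relative to the class mean, and a "learner" phase lets random pairs of learners improve from each other. The sketch below is a generic continuous minimisation version of the metaheuristic, not the authors' primer-specific encoding or constraint handling:

```python
import numpy as np

def tlbo(f, bounds, n_pop=20, n_iter=100, rng=np.random.default_rng(0)):
    """Teaching-Learning-Based Optimisation (minimisation sketch)."""
    lo, hi = bounds
    dim = len(lo)
    pop = rng.uniform(lo, hi, (n_pop, dim))
    fit = np.apply_along_axis(f, 1, pop)
    for _ in range(n_iter):
        teacher = pop[fit.argmin()].copy()
        mean = pop.mean(axis=0)
        for i in range(n_pop):
            # teacher phase: move toward the teacher, away from the mean
            Tf = rng.integers(1, 3)                 # teaching factor in {1, 2}
            new = np.clip(pop[i] + rng.random(dim) * (teacher - Tf * mean), lo, hi)
            if f(new) < fit[i]:
                pop[i], fit[i] = new, f(new)
            # learner phase: learn from a random classmate
            j = rng.integers(n_pop)
            step = pop[i] - pop[j] if fit[i] < fit[j] else pop[j] - pop[i]
            new = np.clip(pop[i] + rng.random(dim) * step, lo, hi)
            if f(new) < fit[i]:
                pop[i], fit[i] = new, f(new)
    return pop[fit.argmin()], fit.min()
```

For primer design, `f` would score a candidate primer against the PCR constraints (length, melting temperature, GC content, etc.); here any smooth test function illustrates the dynamics.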
Protein scaffolds for selective enrichment of metal ions
He, Chuan; Zhou, Lu; Bosscher, Michael
2016-02-09
Polypeptides having high affinity for the uranyl ion are provided. Methods for binding uranyl using such proteins are likewise provided and can be used, for example, in methods for uranium purification or removal.
Diagnostic for two-mode variable valve activation device
Fedewa, Andrew M
2014-01-07
A method is provided for diagnosing a multi-mode valve train device which selectively provides high lift and low lift to a combustion valve of an internal combustion engine having a camshaft phaser actuated by an electric motor. The method includes applying a variable electric current to the electric motor to achieve a desired camshaft phaser operational mode and commanding the multi-mode valve train device to a desired valve train device operational mode selected from a high lift mode and a low lift mode. The method also includes monitoring the variable electric current and calculating a first characteristic of the monitored current. The method also includes comparing the calculated first characteristic against a predetermined value of the first characteristic measured when the multi-mode valve train device is known to be in the desired valve train device operational mode.
Automated selection of trabecular bone regions in knee radiographs.
Podsiadlo, P; Wolski, M; Stachowiak, G W
2008-05-01
Osteoarthritic (OA) changes in knee joints can be assessed by analyzing the structure of trabecular bone (TB) in the tibia. This analysis is performed on TB regions selected manually by a human operator on x-ray images. Manual selection is time-consuming, tedious, and expensive. Even if a radiologist expert or highly trained person is available to select regions, high inter- and intraobserver variabilities are still possible. A fully automated image segmentation method was, therefore, developed to select the bone regions for numerical analyses of changes in bone structures. The newly developed method consists of image preprocessing, delineation of cortical bone plates (active shape model), and location of regions of interest (ROIs). The method was trained on an independent set of 40 x-ray images. Automatically selected regions were compared to the "gold standard" that contains ROIs selected manually by a radiologist expert on 132 x-ray images. All images were acquired from subjects locked in a standardized standing position using a radiography rig. The size of each ROI is 12.8 × 12.8 mm. The automated method showed good agreement with the gold standard [similarity index (SI) = 0.83 (medial) and 0.81 (lateral); offset = [-1.78, 1.27] × [-0.65, 0.26] mm (medial) and [-2.15, 1.59] × [-0.58, 0.52] mm (lateral)]. Bland and Altman plots were constructed for fractal signatures, and changes of fractal dimensions (FD) with region offsets between the gold standard and automatically selected regions were calculated. The plots showed random scatter and the 95% confidence intervals were (-0.006, 0.008) and (-0.001, 0.011). The changes of FDs with region offsets were less than 0.035. Previous studies showed that differences in FDs between non-OA and OA bone regions were greater than 0.05. ROIs were also selected by a second radiologist and then evaluated.
Results indicated that the newly developed method could replace a human operator and produces bone regions with an accuracy that is sufficient for fractal analyses of bone texture.
solGS: a web-based tool for genomic selection
USDA-ARS's Scientific Manuscript database
Genomic selection (GS) promises to improve accuracy in estimating breeding values and genetic gain for quantitative traits compared to traditional breeding methods. Its reliance on high-throughput genome-wide markers and statistical complexity, however, is a serious challenge in data management, ana...
A multi-frequency iterative imaging method for discontinuous inverse medium problem
NASA Astrophysics Data System (ADS)
Zhang, Lei; Feng, Lixin
2018-06-01
The inverse medium problem with a discontinuous refractive index is a challenging class of inverse problem. We employ primal-dual theory and fast solution of integral equations to propose a new iterative imaging method. The selection criterion for the regularization parameter is given by generalized cross-validation. Based on multi-frequency measurements of the scattered field, a recursive linearization algorithm is presented, proceeding from low to high frequency. We also discuss the initial guess selection strategy using semi-analytical approaches. Numerical experiments are presented to show the effectiveness of the proposed method.
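For Tikhonov-regularized linear problems, the generalized cross-validation rule mentioned above has a convenient closed form: GCV(λ) = n‖(I − A(λ))y‖² / tr(I − A(λ))², with influence matrix A(λ) = X(XᵀX + λI)⁻¹Xᵀ. The discrete SVD-based sketch below is a generic stand-in for the paper's integral-equation setting:

```python
import numpy as np

def gcv_lambda(X, y, lams):
    """Pick a Tikhonov regularisation parameter by generalised
    cross-validation. The SVD is computed once and shared by all
    candidate values of lambda."""
    U, s, _ = np.linalg.svd(X, full_matrices=False)
    Uty = U.T @ y
    n = len(y)
    best, best_lam = np.inf, None
    for lam in lams:
        filt = s**2 / (s**2 + lam)            # Tikhonov filter factors
        resid = y - U @ (filt * Uty)          # (I - A(lam)) y
        score = n * (resid @ resid) / (n - filt.sum()) ** 2
        if score < best:
            best, best_lam = score, lam
    return best_lam
```

On an ill-conditioned problem with noisy data, the GCV choice regularizes away the noise amplification that an under-regularized solution suffers.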
Kyriakis, Efstathios; Psomopoulos, Constantinos; Kokkotis, Panagiotis; Bourtsalas, Athanasios; Themelis, Nikolaos
2017-06-23
This study develops an algorithm that presents a step-by-step method for selecting the location and size of a waste-to-energy facility targeting maximum energy output, while also considering the basic obstacle, which is in many cases the gate fee. Various parameters were identified and evaluated in order to formulate the proposed decision making method in the form of an algorithm. The principal simulation input is the amount of municipal solid waste (MSW) available for incineration, which, along with its net calorific value, is the most important factor for the feasibility of the plant. Moreover, the research focuses both on the parameters that could increase energy production and on those that affect the R1 energy efficiency factor. The final gate fee is estimated through an economic analysis of the entire project, investigating both the expenses and the revenues expected from the selected site and the outputs of the facility. A number of common revenue methods were included in the algorithm at this point. The developed algorithm has been validated using three case studies in Greece: Athens, Thessaloniki, and Central Greece, where the cities of Larisa and Volos were selected for the application of the proposed decision making tool. These case studies were selected based on a previous publication by two of the authors, in which these areas were examined. Results reveal that the development of a "solid" methodological approach to selecting the site and size of a waste-to-energy (WtE) facility is feasible. However, maximization of the energy efficiency factor R1 requires high utilization factors, while minimization of the final gate fee requires a high R1 and high metals recovery from the bottom ash, as well as economic exploitation of recovered raw materials, if any.
Information Gain Based Dimensionality Selection for Classifying Text Documents
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dumidu Wijayasekara; Milos Manic; Miles McQueen
2013-06-01
Selecting the optimal dimensions for various knowledge extraction applications is an essential component of data mining. Dimensionality selection techniques are utilized in classification applications to increase the classification accuracy and reduce the computational complexity. In text classification, where the dimensionality of the dataset is extremely high, dimensionality selection is even more important. This paper presents a novel genetic algorithm based methodology for dimensionality selection in text mining applications that utilizes information gain. The presented methodology uses the information gain of each dimension to change the mutation probability of chromosomes dynamically. Since the information gain is calculated a priori, the computational complexity is not affected. The presented method was tested on a specific text classification problem and compared with conventional genetic algorithm based dimensionality selection. The results show an improvement of 3% in the true positives and 1.6% in the true negatives over conventional dimensionality selection methods.
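The two ingredients, per-dimension information gain and an IG-dependent mutation probability, can be sketched as follows. The abstract does not specify how the presented methodology maps information gain to mutation rates, so the linear weighting below (high-gain dimensions mutate less) is an illustrative assumption:

```python
import numpy as np

def information_gain(X, y):
    """Information gain of binary features w.r.t. class labels:
    IG(j) = H(y) - H(y | X_j)."""
    def H(p):
        p = p[p > 0]
        return -(p * np.log2(p)).sum()
    _, counts = np.unique(y, return_counts=True)
    Hy = H(counts / len(y))
    ig = np.empty(X.shape[1])
    for j in range(X.shape[1]):
        cond = 0.0
        for v in (0, 1):
            mask = X[:, j] == v
            if mask.any():
                _, c = np.unique(y[mask], return_counts=True)
                cond += mask.mean() * H(c / mask.sum())
        ig[j] = Hy - cond
    return ig

def mutate(chrom, ig, base_rate=0.05, rng=np.random.default_rng(0)):
    """Dynamic mutation for a feature-subset chromosome: dimensions with
    low information gain flip more often, high-gain dimensions are
    protected (assumed weighting, computed a priori from ig)."""
    w = 1 - ig / (ig.max() + 1e-12)          # high IG -> low mutation probability
    flip = rng.random(len(chrom)) < base_rate * (0.5 + w)
    return np.where(flip, 1 - chrom, chrom)
```

Because the gains are precomputed once, the per-generation cost of the GA is unchanged, which is the point made in the abstract.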
2-DE combined with two-layer feature selection accurately establishes the origin of oolong tea.
Chien, Han-Ju; Chu, Yen-Wei; Chen, Chi-Wei; Juang, Yu-Min; Chien, Min-Wei; Liu, Chih-Wei; Wu, Chia-Chang; Tzen, Jason T C; Lai, Chien-Chen
2016-11-15
Taiwan is known for its high quality oolong tea. Because of high consumer demand, some tea manufacturers mix lower quality leaves with genuine Taiwan oolong tea in order to increase profits. Robust scientific methods are, therefore, needed to verify the origin and quality of tea leaves. In this study, we investigated whether two-dimensional gel electrophoresis (2-DE) and nanoscale liquid chromatography/tandem mass spectrometry (nano-LC/MS/MS) coupled with a two-layer feature selection mechanism comprising information gain attribute evaluation (IGAE) and support vector machine feature selection (SVM-FS) are useful in identifying characteristic proteins that can be used as markers of the original source of oolong tea. Samples in this study included oolong tea leaves from 23 different sources. We found that our method had an accuracy of 95.5% in correctly identifying the origin of the leaves. Overall, our method is a novel approach for determining the origin of oolong tea leaves. Copyright © 2016 Elsevier Ltd. All rights reserved.
Zhang, Chuncheng; Song, Sutao; Wen, Xiaotong; Yao, Li; Long, Zhiying
2015-04-30
Feature selection plays an important role in improving the classification accuracy of multivariate classification techniques in the context of fMRI-based decoding due to the "few samples and large features" nature of functional magnetic resonance imaging (fMRI) data. Recently, several sparse representation methods have been applied to the voxel selection of fMRI data. Despite the low computational efficiency of the sparse representation methods, they still displayed promise for applications that select features from fMRI data. In this study, we proposed the Laplacian smoothed L0 norm (LSL0) approach for feature selection of fMRI data. Based on the fast sparse decomposition using the smoothed L0 norm (SL0) (Mohimani, 2007), the LSL0 method uses the Laplacian function to approximate the L0 norm of the sources. Results on simulated and real fMRI data demonstrated the feasibility and robustness of LSL0 for sparse source estimation and feature selection. Simulated results indicated that LSL0 produced more accurate source estimation than SL0 at high noise levels. The classification accuracy using voxels selected by LSL0 was higher than that using SL0 in both the simulated and real fMRI experiments. Moreover, both LSL0 and SL0 showed higher classification accuracy and required less time than ICA and the t-test for fMRI decoding. In summary, LSL0 outperformed SL0 in sparse source estimation at high noise levels and in feature selection, and both LSL0 and SL0 performed better than ICA and the t-test for feature selection. Copyright © 2015 Elsevier B.V. All rights reserved.
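The SL0 baseline that LSL0 modifies approximates the L0 norm with a smooth Gaussian surrogate and maximizes the smoothed sparsity measure over the solution set of the underdetermined system, for a decreasing sequence of smoothing widths σ. The sketch below follows the published SL0 iteration; LSL0 would swap the Gaussian exp(−s²/2σ²) for a Laplacian-shaped surrogate, a variant not reproduced here:

```python
import numpy as np

def sl0(A, x, sigma_min=1e-4, sigma_decay=0.5, mu=2.0, inner=3):
    """Smoothed-L0 sparse recovery (after Mohimani et al.): ascend
    sum(exp(-s^2 / (2 sigma^2))) subject to A s = x, while shrinking
    sigma toward zero."""
    A_pinv = np.linalg.pinv(A)
    s = A_pinv @ x                          # minimum-L2-norm feasible start
    sigma = 2 * np.max(np.abs(s))
    while sigma > sigma_min:
        for _ in range(inner):
            delta = s * np.exp(-s**2 / (2 * sigma**2))
            s = s - mu * delta              # ascend the smoothed sparsity measure
            s = s - A_pinv @ (A @ s - x)    # project back onto A s = x
        sigma *= sigma_decay
    return s
```

In the voxel-selection setting of the abstract, the recovered sparse coefficients indicate which features (voxels) carry signal.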
Generative model selection using a scalable and size-independent complex network classifier
DOE Office of Scientific and Technical Information (OSTI.GOV)
Motallebi, Sadegh, E-mail: motallebi@ce.sharif.edu; Aliakbary, Sadegh, E-mail: aliakbary@ce.sharif.edu; Habibi, Jafar, E-mail: jhabibi@sharif.edu
2013-12-15
Real networks exhibit nontrivial topological features, such as heavy-tailed degree distributions, high clustering, and small-worldness. Researchers have developed several generative models for synthesizing artificial networks that are structurally similar to real networks. An important research problem is to identify the generative model that best fits a target network. In this paper, we investigate this problem and our goal is to select the model that is able to generate graphs similar to a given network instance. By generating synthetic networks with seven outstanding generative models, we have utilized machine learning methods to develop a decision tree for model selection. Our proposed method, named "Generative Model Selection for Complex Networks," outperforms existing methods with respect to accuracy, scalability, and size-independence.
Yu, Sheng; Liao, Katherine P; Shaw, Stanley Y; Gainer, Vivian S; Churchill, Susanne E; Szolovits, Peter; Murphy, Shawn N; Kohane, Isaac S; Cai, Tianxi
2015-09-01
Analysis of narrative (text) data from electronic health records (EHRs) can improve population-scale phenotyping for clinical and genetic research. Currently, selection of text features for phenotyping algorithms is slow and laborious, requiring extensive and iterative involvement by domain experts. This paper introduces a method to develop phenotyping algorithms in an unbiased manner by automatically extracting and selecting informative features, which can be comparable to expert-curated ones in classification accuracy. Comprehensive medical concepts were collected from publicly available knowledge sources in an automated, unbiased fashion. Natural language processing (NLP) revealed the occurrence patterns of these concepts in EHR narrative notes, which enabled selection of informative features for phenotype classification. When combined with additional codified features, a penalized logistic regression model was trained to classify the target phenotype. The authors applied this method to develop algorithms to identify patients with rheumatoid arthritis (RA) and coronary artery disease (CAD) cases among those with RA from a large multi-institutional EHR. The areas under the receiver operating characteristic curves (AUC) for classifying RA and CAD using models trained with automated features were 0.951 and 0.929, respectively, compared with AUCs of 0.938 and 0.929 for models trained with expert-curated features. Models trained with NLP text features selected through an unbiased, automated procedure achieved comparable or slightly higher accuracy than those trained with expert-curated features. The majority of the selected model features were interpretable. The proposed automated feature extraction method, generating highly accurate phenotyping algorithms with improved efficiency, is a significant step toward high-throughput phenotyping. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.
All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Grane, Camilla
2018-01-01
Highly automated driving will change drivers' behavioural patterns. Traditional methods used for assessing manual driving will only be applicable to the parts of human-automation interaction where the driver intervenes, such as hand-over and take-over situations. Therefore, driver behaviour assessment will need to adapt to the new driving scenarios. This paper aims at simplifying the process of selecting appropriate assessment methods. Thirty-five papers were reviewed to examine potential and relevant methods. The review showed that many studies still rely on traditional driving assessment methods. A new method, the Failure-GAM²E model, intended to aid assessment selection when planning a study, is proposed and exemplified in the paper. Failure-GAM²E includes a systematic step-by-step procedure defining the situation, failures (Failure), goals (G), actions (A), subjective methods (M), objective methods (M) and equipment (E). The use of Failure-GAM²E in a study example resulted in a well-reasoned assessment plan, a new way of measuring trust through feet movements and a proposed Optimal Risk Management Model. Failure-GAM²E and the Optimal Risk Management Model are believed to support the planning process for research studies in the field of human-automation interaction. Copyright © 2017 Elsevier Ltd. All rights reserved.
Selective production of chemicals from biomass pyrolysis over metal chlorides supported on zeolite.
Leng, Shuai; Wang, Xinde; Cai, Qiuxia; Ma, Fengyun; Liu, Yue'e; Wang, Jianguo
2013-12-01
Direct biomass conversion into chemicals remains a great challenge because of the complexity of the compounds; hence, this process has attracted less attention than conversion into fuel. In this study, we propose a simple one-step method for converting bagasse into furfural (FF) and acetic acid (AC). In this method, bagasse pyrolysis over ZnCl2/HZSM-5 achieved a high combined FF and AC yield (58.10%) and an FF/AC ratio of 1.01, with a very low yield of medium-boiling-point components. In contrast, bagasse pyrolysis using HZSM-5 alone or ZnCl2 alone still left large amounts of medium-boiling-point or high-boiling-point components. The synergistic effect of HZSM-5 and ZnCl2, which combines pyrolysis, zeolite cracking, and Lewis acid-selective catalysis, results in highly efficient bagasse conversion into FF and AC. Our study therefore provides a novel, simple method for directly converting biomass into useful chemicals in high yield. Copyright © 2013 Elsevier Ltd. All rights reserved.
The method of selecting an integrated development territory for the high-rise unique constructions
NASA Astrophysics Data System (ADS)
Sheina, Svetlana; Shevtsova, Elina; Sukhinin, Alexander; Priss, Elena
2018-03-01
On the basis of data provided by the Department of Architecture and Urban Planning of the city of Rostov-on-Don, the problem of choosing a territory for integrated development that is best suited for the construction of high-rise and unique buildings is solved. The objective of the study was to develop a methodology for selecting such an area and to implement the proposed method on an example evaluation of four integrated development territories. Along with standard indicators of complex evaluation, the developed method considers additional indicators that assess a territory from the standpoint of high-rise unique construction. The final result of the study is a functional-priority ranking of the areas that takes into account the construction of both residential and public-business objects of unique high-rise construction. Use of the developed methodology will allow investors and customers to assess the investment attractiveness of a future unique construction project on a proposed site.
Zeng, Qiong; Jia, Yan-Wei; Xu, Pei-Li; Xiao, Meng-Wei; Liu, Yi-Ming; Peng, Shu-Lin; Liao, Xun
2015-12-01
A facile and highly efficient magnetic solid-phase extraction method has been developed for Z-ligustilide, the major therapeutic agent in Angelica sinensis. The solid-phase adsorbent material used was prepared by conjugating carbon nanotubes with magnetic Fe3O4 nanoparticles via a hydrothermal reaction. The magnetic material showed a high affinity toward Z-ligustilide due to the π-π stacking interaction between the carbon nanotubes and Z-ligustilide, allowing quick and selective extraction of Z-ligustilide from complex sample matrices. Factors influencing the magnetic solid-phase extraction, such as the amount of added adsorbent, adsorption and desorption time, and desorption solvent, were investigated. Due to its high extraction efficiency, this method proved highly useful for sample cleanup/enrichment in quantitative high-performance liquid chromatography analysis. The proposed method had a linear calibration curve (R² = 0.9983) over the concentration range of 4 ng/mL to 200 μg/mL Z-ligustilide. The accuracy of the method was determined by the recovery, which ranged from 92.07 to 104.02%, with relative standard deviations below 4.51%. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Enhanced HTS hit selection via a local hit rate analysis.
Posner, Bruce A; Xi, Hualin; Mills, James E J
2009-10-01
The postprocessing of high-throughput screening (HTS) results is complicated by the occurrence of false positives (inactive compounds misidentified as active by the primary screen) and false negatives (active compounds misidentified as inactive by the primary screen). An activity cutoff is frequently used to select "active" compounds from HTS data; however, this approach is insensitive to both false positives and false negatives. An alternative method that can minimize the occurrence of these artifacts will increase the efficiency of hit selection and therefore of lead discovery. In this work, rather than merely using the activity of a given compound, we look at the presence and absence of activity among all compounds in its "chemical space neighborhood" to assign a degree of confidence in its activity. We demonstrate that this local hit rate (LHR) analysis method outperforms hit selection based on ranking by primary screen activity values across ten diverse high-throughput screens, spanning both cell-based and biochemical assay formats of varying biology and robustness. On average, the local hit rate analysis method was approximately 2.3-fold and approximately 1.3-fold more effective in identifying active compounds and active chemical series, respectively, than selection based on primary activity alone. Moreover, when applied to finding false negatives, this method was 2.3-fold better than ranking by primary activity alone. In most cases, novel hit series were identified that would otherwise have been missed. Additional uses of and observations regarding this HTS analysis approach are also discussed.
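The local hit rate is straightforward to compute from compound descriptors and primary activity calls: for each compound, take the fraction of actives among its k nearest neighbours in chemical space. The sketch below uses scikit-learn with Euclidean distance on a generic descriptor matrix; the paper's actual fingerprints and neighbourhood definition may differ:

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def local_hit_rate(descriptors, active, k=10):
    """Fraction of primary-screen actives among each compound's k
    nearest neighbours in descriptor space (the compound itself is
    excluded from its own neighbourhood)."""
    nn = NearestNeighbors(n_neighbors=k + 1).fit(descriptors)
    _, idx = nn.kneighbors(descriptors)
    return active[idx[:, 1:]].mean(axis=1)
```

Ranking by this score instead of raw activity flags likely false negatives (inactive calls sitting inside an active cluster score high) and likely false positives (active calls isolated in an inactive region score low).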
Discrete Biogeography Based Optimization for Feature Selection in Molecular Signatures.
Liu, Bo; Tian, Meihong; Zhang, Chunhua; Li, Xiangtao
2015-04-01
Biomarker discovery from high-dimensional data is a complex task in the development of efficient cancer diagnosis and classification. However, these data are usually redundant and noisy, and only a subset of them present distinct profiles for different classes of samples. Thus, selecting highly discriminative genes from gene expression data has become increasingly interesting in the field of bioinformatics. In this paper, a discrete biogeography based optimization is proposed to select a good subset of informative genes relevant to the classification. In the proposed algorithm, firstly, the Fisher-Markov selector is used to choose a fixed number of genes. Secondly, to make biogeography based optimization suitable for the feature selection problem, a discrete migration model and a discrete mutation model are proposed to balance the exploration and exploitation abilities. Then, discrete biogeography based optimization (DBBO) is obtained by integrating the discrete migration and mutation models. Finally, DBBO is used for feature selection, and three classifiers are evaluated with 10-fold cross-validation. To show the effectiveness and efficiency of the algorithm, it is tested on four breast cancer benchmark datasets. Compared with the genetic algorithm, particle swarm optimization, the differential evolution algorithm, and hybrid biogeography based optimization, experimental results demonstrate that the proposed method is better than, or at least comparable with, previous methods from the literature in terms of the quality of the solutions obtained. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
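The discrete migration and mutation operators can be sketched roughly as below, on binary feature-selection masks ("habitats"); the immigration schedule, elitism, rates, and donor-selection rule are illustrative assumptions, not the authors' exact models:

```python
# Rough sketch of discrete migration/mutation for binary feature masks.
# All rates and the donor rule here are illustrative assumptions.
import random

def migrate(population, fitness, immigration=0.5):
    """Worse habitats (lower fitness) immigrate feature bits from better
    ones; the best habitat is kept unchanged (elitism)."""
    ranked = sorted(range(len(population)), key=lambda i: -fitness[i])
    new_pop = [list(h) for h in population]
    for rank, i in enumerate(ranked[1:], start=1):
        lam = immigration * (rank + 1) / len(ranked)  # worse rank -> more immigration
        for bit in range(len(population[i])):
            if random.random() < lam:
                donor = ranked[random.randrange(rank)]  # any better-ranked habitat
                new_pop[i][bit] = population[donor][bit]
    return new_pop

def mutate(population, rate=0.02):
    """Flip feature bits with a small probability to keep exploration."""
    for h in population:
        for bit in range(len(h)):
            if random.random() < rate:
                h[bit] = 1 - h[bit]
    return population
```

Iterating migrate/mutate with a classifier's cross-validated accuracy as the fitness of each mask gives the overall DBBO-style search loop.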
Hadoux, Xavier; Kumar, Dinesh Kant; Sarossy, Marc G; Roger, Jean-Michel; Gorretta, Nathalie
2016-05-19
Visible and near-infrared (Vis-NIR) spectra are generated by the combination of numerous low-resolution features. Spectral variables are thus highly correlated, which can cause problems for selecting the most appropriate ones for a given application. Decomposition bases such as Fourier or wavelet generally help highlight important spectral features, but are by nature constrained to have both positive and negative components. In addition to complicating the interpretability of the selected features, this impedes their use for application-dedicated sensors. In this paper we propose a new method for feature selection: Application-Dedicated Selection of Filters (ADSF). This method relaxes the shape constraint by enabling the selection of any type of user-defined custom feature. By considering only relevant features, based on the underlying nature of the data, high regularization of the final model can be obtained, even in the small-sample-size context often encountered in spectroscopic applications. For larger scale deployment of application-dedicated sensors, these predefined feature constraints can lead to application-specific optical filters, e.g., lowpass, highpass, bandpass or bandstop filters with positive-only coefficients. In a similar fashion to Partial Least Squares, ADSF successively selects features using covariance maximization and deflates their influence using orthogonal projection, in order to optimally tune the selection to the data with limited redundancy. ADSF is well suited for spectroscopic data as it can deal with large numbers of highly correlated variables in supervised learning, even with many correlated responses. Copyright © 2016 Elsevier B.V. All rights reserved.
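The select-and-deflate loop can be sketched in the spirit of the description above; the filter bank, names, and deflation details are assumptions, not the authors' implementation:

```python
# Sketch of an ADSF-style greedy loop: pick the candidate filter whose
# score maximizes covariance with the response, then deflate X so the
# next pick adds non-redundant information. Illustrative only.
import numpy as np

def adsf_select(X, y, filters, n_select=3):
    """Greedily pick filters f whose scores t = X @ f maximize |cov(t, y)|,
    deflating X by orthogonal projection after each pick."""
    Xd = X - X.mean(0)
    yd = y - y.mean()
    chosen = []
    for _ in range(n_select):
        covs = [abs(np.cov(Xd @ f, yd)[0, 1]) for f in filters]
        best = int(np.argmax(covs))
        chosen.append(best)
        t = (Xd @ filters[best])[:, None]       # score of selected filter
        Xd = Xd - t @ (t.T @ Xd) / (t.T @ t)    # project out its influence
    return chosen
```

Here each candidate filter would be a predefined non-negative shape (e.g., a boxcar bandpass over a wavelength window), which is what makes the selected features realizable as physical optical filters.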
Biological evaluation of certain substituted hydantoins and benzalhydantoins against microbes
NASA Astrophysics Data System (ADS)
Hidayat, Ika-Wiani; Thu, Yee Yee; Black, David St. C.; Read, Roger W.
2016-02-01
Twenty-three synthetic (thio)hydantoins and benzalhydantoins were evaluated for antimicrobial activity against Candida albicans, Malassezia furfur, Escherichia coli and Staphylococcus aureus, by the paper disc diffusion method. 3-n-butyl-4'-nitrobenzalhydantoin showed very high activity against E. coli and high selectivity with respect to the other microorganisms, while 3-n-butyl-2'-bromo-4',5'-dimethoxybenzal hydantoin demonstrated very high selectivity in its activity against M. furfur and S. aureus. These compounds show the most promise as drug lead compounds.
Chen, H F; Dong, X C; Zen, B S; Gao, K; Yuan, S G; Panaye, A; Doucet, J P; Fan, B T
2003-08-01
An efficient virtual and rational drug design method is presented. It combines virtual bioactive compound generation with 3D-QSAR modeling and docking. Using this method, it is possible to generate a large number of highly diverse molecules and identify virtual active lead compounds. The method was validated on a set of anti-tumor drugs. Under the constraints of the pharmacophore obtained by DISCO (implemented in SYBYL 6.8), 97 virtual bioactive compounds were generated, and their anti-tumor activities were predicted by CoMFA. Eight structures with high activity were selected and screened by the 3D-QSAR model. The most active generated structure was further investigated by modifying its structure to increase the activity. A comparative docking study with the telomeric receptor was carried out, and the results showed that the generated structures could form more stable complexes with the receptor than the reference compound selected from experimental data. This investigation showed that the proposed method is a feasible approach to rational drug design with high screening efficiency.
Dietrich, Stefan; Floegel, Anna; Troll, Martina; Kühn, Tilman; Rathmann, Wolfgang; Peters, Anette; Sookthai, Disorn; von Bergen, Martin; Kaaks, Rudolf; Adamski, Jerzy; Prehn, Cornelia; Boeing, Heiner; Schulze, Matthias B; Illig, Thomas; Pischon, Tobias; Knüppel, Sven; Wang-Sattler, Rui; Drogan, Dagmar
2016-10-01
The application of metabolomics in prospective cohort studies is statistically challenging. Given the importance of appropriate statistical methods for the selection of disease-associated metabolites in highly correlated complex data, we combined random survival forest (RSF) with an automated backward elimination procedure that addresses such issues. Our RSF approach is illustrated with data from the European Prospective Investigation into Cancer and Nutrition (EPIC)-Potsdam study, with concentrations of 127 serum metabolites as exposure variables and time to development of type 2 diabetes mellitus (T2D) as outcome variable. An analysis of this data set using Cox regression with a stepwise selection method was recently published. The methodological comparison of RSF and Cox regression was replicated in two independent cohorts. Finally, the R code for implementing the metabolite selection procedure in the RSF syntax is provided. The application of the RSF approach in EPIC-Potsdam resulted in the identification of 16 incident T2D-associated metabolites, which slightly improved prediction of T2D when used in addition to traditional T2D risk factors and also when used together with classical biomarkers. The identified metabolites partly agreed with previous findings using Cox regression, though RSF selected a higher number of highly correlated metabolites. The RSF method appears to be a promising approach for the identification of disease-associated variables in complex data with time to event as outcome. The demonstrated RSF approach provides findings comparable to those of the commonly used Cox regression, but also addresses the problem of multicollinearity and is suitable for high-dimensional data. © The Author 2016; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association.
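The paper supplies R code for the RSF-based elimination; as an analogy only, the generic backward-elimination skeleton might look like this in Python, where the ranking function is a hypothetical stand-in for RSF variable importance:

```python
# Generic backward-elimination skeleton (an analogy to the RSF procedure;
# fit_and_rank is a hypothetical stand-in for model fitting + importance).
def backward_eliminate(fit_and_rank, variables, drop_frac=0.2, min_vars=2):
    """Repeatedly fit a model, rank variables by importance (most important
    first), and drop the least important fraction until min_vars remain."""
    vars_left = list(variables)
    while len(vars_left) > min_vars:
        ranked = fit_and_rank(vars_left)              # most important first
        n_drop = max(1, int(drop_frac * len(vars_left)))
        vars_left = ranked[:len(vars_left) - n_drop]
    return vars_left
```

In the RSF setting, `fit_and_rank` would refit the survival forest on the current variable set and return the metabolites ordered by their importance for predicting time to T2D.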
Zhao, Ying-Yong; Zhao, Ye; Zhang, Yong-Min; Lin, Rui-Chao; Sun, Wen-Ji
2009-06-01
Polyporus umbellatus is a widely used anti-aldosteronic diuretic in Traditional Chinese Medicine (TCM). A new, sensitive and selective high-performance liquid chromatography method with fluorescence detection (HPLC-FLD) and a high-performance liquid chromatography-atmospheric pressure chemical ionization tandem mass spectrometry (HPLC-APCI-MS/MS) method for quantitative and qualitative determination of ergosta-4,6,8(14),22-tetraen-3-one (ergone), the main diuretic component, were developed for quality control of P. umbellatus crude drug. The ergone in the ethanolic extract of P. umbellatus was unambiguously characterized by HPLC-APCI-MS/MS, and further confirmed by comparison with a standard compound. Trace ergone was detected by the sensitive and selective HPLC-FLD. Linearity (r2 > 0.9998) and recoveries at low, medium and high concentrations (100.5%, 100.2% and 100.4%) were consistent with the experimental criteria. The limit of detection (LOD) of ergone was around 0.2 microg/mL. Our results indicated that the content of ergone in P. umbellatus varied significantly from habitat to habitat, with contents ranging from 2.13 +/- 0.02 to 59.17 +/- 0.05 microg/g. Comparison among the HPLC-FLD, HPLC-UV and HPLC-APCI-MS/MS methods demonstrated that HPLC-FLD and HPLC-APCI-MS/MS gave similar quantitative results for the selected herb samples, while HPLC-UV gave lower quantitative results than the other two. The established HPLC-FLD method has the advantages of being rapid, simple, selective and sensitive, and could be used for routine analysis of P. umbellatus crude drug.
Yang, Mingxing; Li, Xiumin; Li, Zhibin; Ou, Zhimin; Liu, Ming; Liu, Suhuan; Li, Xuejun; Yang, Shuyu
2013-01-01
DNA microarray analysis is characterized by obtaining a large number of gene variables from a small number of observations. Cluster analysis is widely used to analyze DNA microarray data to make classification and diagnosis of disease. Because there are so many irrelevant and insignificant genes in a dataset, a feature selection approach must be employed in data analysis. The performance of cluster analysis of this high-throughput data depends on whether the feature selection approach chooses the most relevant genes associated with disease classes. Here we proposed a new method using multiple Orthogonal Partial Least Squares-Discriminant Analysis (mOPLS-DA) models and S-plots to select the most relevant genes to conduct three-class disease classification and prediction. We tested our method using Golub's leukemia microarray data. For three classes with subtypes, we proposed hierarchical orthogonal partial least squares-discriminant analysis (OPLS-DA) models and S-plots to select features for two main classes and their subtypes. For three classes in parallel, we employed three OPLS-DA models and S-plots to choose marker genes for each class. The power of feature selection to classify and predict three-class disease was evaluated using cluster analysis. Further, the general performance of our method was tested using four public datasets and compared with those of four other feature selection methods. The results revealed that our method effectively selected the most relevant features for disease classification and prediction, and its performance was better than that of the other methods.
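The S-plot quantities underlying this kind of marker selection can be computed as below, assuming the predictive score vector t of an (O)PLS-DA model is already available; this is a generic sketch, not the authors' code:

```python
# Sketch of the two S-plot axes used for marker-gene selection: per
# variable, covariance (magnitude) and correlation (reliability) with
# the predictive score vector t. t is assumed to come from a fitted
# (O)PLS-DA model; this sketch does not fit that model itself.
import numpy as np

def s_plot(X, t):
    """Return (p_cov, p_corr) for each column of X against scores t."""
    Xc = X - X.mean(0)
    tc = t - t.mean()
    p_cov = Xc.T @ tc / (len(t) - 1)
    p_corr = p_cov / (Xc.std(0, ddof=1) * tc.std(ddof=1))
    return p_cov, p_corr
```

Variables falling at the extremes of both axes (high |covariance| and |correlation| close to 1) are the candidates selected as class markers.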
A survey of variable selection methods in two Chinese epidemiology journals
2010-01-01
Background Although much has been written on developing better procedures for variable selection, there is little research on how it is practiced in actual studies. This review surveys the variable selection methods reported in two high-ranking Chinese epidemiology journals. Methods Articles published in 2004, 2006, and 2008 in the Chinese Journal of Epidemiology and the Chinese Journal of Preventive Medicine were reviewed. Five categories of methods were identified whereby variables were selected using: A - bivariate analyses; B - multivariable analysis (e.g., stepwise or individual significance testing of model coefficients); C - bivariate analyses first, followed by multivariable analysis; D - bivariate analyses or multivariable analysis; and E - other criteria such as prior knowledge or personal judgment. Results Among the 287 articles that reported using variable selection methods, 6%, 26%, 30%, 21%, and 17% were in categories A through E, respectively. One hundred sixty-three studies selected variables using bivariate analyses, 80% (130/163) via multiple significance testing at the 5% alpha level. Of the 219 multivariable analyses, 97 (44%) used stepwise procedures, 89 (41%) tested individual regression coefficients, but 33 (15%) did not mention how variables were selected. Sixty percent (58/97) of the stepwise routines also did not specify the algorithm and/or significance levels. Conclusions The variable selection methods reported in the two journals were limited in variety, and details were often missing. Many studies still relied on problematic techniques like stepwise procedures and/or multiple testing of bivariate associations at the 0.05 alpha level. These deficiencies should be rectified to safeguard the scientific validity of articles published in Chinese epidemiology journals. PMID:20920252
Diagnosis of Chronic Kidney Disease Based on Support Vector Machine by Feature Selection Methods.
Polat, Huseyin; Danaei Mehr, Homay; Cetin, Aydin
2017-04-01
As Chronic Kidney Disease progresses slowly, early detection and effective treatment are the only cure to reduce the mortality rate. Machine learning techniques are gaining significance in medical diagnosis because of their classification ability with high accuracy rates. The accuracy of classification algorithms depends on the use of correct feature selection algorithms to reduce the dimension of datasets. In this study, the Support Vector Machine classification algorithm was used to diagnose Chronic Kidney Disease. To diagnose the disease, two essential types of feature selection methods, namely wrapper and filter approaches, were chosen to reduce the dimension of the Chronic Kidney Disease dataset. In the wrapper approach, the classifier subset evaluator with the greedy stepwise search engine and the wrapper subset evaluator with the Best First search engine were used. In the filter approach, the correlation feature selection subset evaluator with the greedy stepwise search engine and the filtered subset evaluator with the Best First search engine were used. The results showed that the Support Vector Machine classifier using the filtered subset evaluator with the Best First search engine had the highest accuracy rate (98.5%) in the diagnosis of Chronic Kidney Disease compared with the other selected methods.
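A filter-then-classify pipeline in this spirit might be sketched with scikit-learn; note that SelectKBest with an ANOVA F-score is a stand-in, not Weka's correlation-based or filtered subset evaluators, and the data here are synthetic rather than the CKD dataset:

```python
# Sketch of filter-style feature selection followed by an SVM classifier.
# SelectKBest/f_classif and the synthetic data are illustrative stand-ins
# for the Weka evaluators and the CKD dataset used in the study.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(0)
y = rng.integers(0, 2, 200)
X = rng.normal(size=(200, 24))
X[:, 0] += 2.5 * y  # one informative feature among 24

# Reduce to the 5 highest-scoring features, then classify with an RBF SVM.
pipe = make_pipeline(SelectKBest(f_classif, k=5), SVC(kernel="rbf"))
acc = cross_val_score(pipe, X, y, cv=5).mean()
print(f"5-fold CV accuracy: {acc:.2f}")
```

Putting the selector inside the pipeline ensures the filter is refit on each cross-validation fold, avoiding the selection bias that leaks in when features are chosen on the full dataset first.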
Welch, Leslie; Dong, Xiao; Hewitt, Daniel; Irwin, Michelle; McCarty, Luke; Tsai, Christina; Baginski, Tomasz
2018-06-02
Free thiol content, and its consistency, is one of the product quality attributes of interest during technical development of manufactured recombinant monoclonal antibodies (mAbs). We describe a new, mid/high-throughput reversed-phase high-performance liquid chromatography (RP-HPLC) method, coupled with derivatization of free thiols, for the determination of total free thiol content in an E. coli-expressed therapeutic monovalent monoclonal antibody mAb1. Initial selection of the derivatization reagent used a hydrophobicity-tailored approach. Maleimide-based thiol-reactive reagents with varying degrees of hydrophobicity were assessed to identify and select one that provided adequate chromatographic resolution and robust quantitation of free thiol-containing mAb1 forms. The method relies on covalent derivatization of free thiols in denatured mAb1 with N-tert-butylmaleimide (NtBM) label, followed by RP-HPLC separation with UV-based quantitation of native (disulfide containing) and labeled (free thiol containing) forms. The method demonstrated good specificity, precision, linearity, accuracy and robustness. Accuracy of the method, for samples with a wide range of free thiol content, was demonstrated using admixtures as well as by comparison to an orthogonal LC-MS peptide mapping method with isotope tagging of free thiols. The developed method has a facile workflow which fits well into both R&D characterization and quality control (QC) testing environments. The hydrophobicity-tailored approach to the selection of the free thiol derivatization reagent is easily applied to the rapid development of free thiol quantitation methods for full-length recombinant antibodies. Copyright © 2018 Elsevier B.V. All rights reserved.
Robust Ground Target Detection by SAR and IR Sensor Fusion Using Adaboost-Based Feature Selection
Kim, Sungho; Song, Woo-Jin; Kim, So-Hyun
2016-01-01
Long-range ground targets are difficult to detect in a noisy cluttered environment using either synthetic aperture radar (SAR) images or infrared (IR) images. SAR-based detectors can provide a high detection rate with a high false alarm rate to background scatter noise. IR-based approaches can detect hot targets but are affected strongly by the weather conditions. This paper proposes a novel target detection method by decision-level SAR and IR fusion using an Adaboost-based machine learning scheme to achieve a high detection rate and low false alarm rate. The proposed method consists of individual detection, registration, and fusion architecture. This paper presents a single framework of a SAR and IR target detection method using modified Boolean map visual theory (modBMVT) and feature-selection based fusion. Previous methods applied different algorithms to detect SAR and IR targets because of the different physical image characteristics. One method that is optimized for IR target detection produces unsuccessful results in SAR target detection. This study examined the image characteristics and proposed a unified SAR and IR target detection method by inserting a median local average filter (MLAF, pre-filter) and an asymmetric morphological closing filter (AMCF, post-filter) into the BMVT. The original BMVT was optimized to detect small infrared targets. The proposed modBMVT can remove the thermal and scatter noise by the MLAF and detect extended targets by attaching the AMCF after the BMVT. Heterogeneous SAR and IR images were registered automatically using the proposed RANdom SAmple Region Consensus (RANSARC)-based homography optimization after a brute-force correspondence search using the detected target centers and regions. The final targets were detected by feature-selection based sensor fusion using Adaboost. 
The proposed method showed good SAR and IR target detection performance through feature selection-based decision fusion on a synthetic database generated by OKTAL-SE. PMID:27447635
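The final decision-level fusion step might be sketched as follows, with synthetic stand-ins for the per-candidate SAR and IR detector features (nothing here reproduces the OKTAL-SE database or the modBMVT pipeline):

```python
# Sketch of Adaboost-based decision-level fusion: per-candidate features
# from the SAR and IR detectors are concatenated, and AdaBoost both
# weights the informative ones and yields the fused target/clutter
# decision. All data and feature shifts below are synthetic assumptions.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 400
y = rng.integers(0, 2, n)                      # 1 = target, 0 = clutter
sar_feats = rng.normal(size=(n, 3)) + y[:, None] * np.array([1.2, 0.0, 0.8])
ir_feats = rng.normal(size=(n, 2)) + y[:, None] * np.array([0.0, 1.5])
X = np.hstack([sar_feats, ir_feats])           # fused feature vector

Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
clf = AdaBoostClassifier(n_estimators=50, random_state=0).fit(Xtr, ytr)
print(f"fused detection accuracy: {clf.score(Xte, yte):.2f}")
```

Because each boosting round picks the single most discriminative weak learner, the ensemble's chosen features double as an implicit feature selection across the two sensor modalities, which is the role Adaboost plays in the paper.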
Kim, Won-Geun; Song, Hyerin; Kim, Chuntae; Moon, Jong-Sik; Kim, Kyujung; Lee, Seung-Wuk; Oh, Jin-Woo
2016-11-15
Here, we describe a highly sensitive and selective surface plasmon resonance (SPR) sensor system that utilizes the self-assembly of genetically engineered M13 bacteriophage. About 2700 copies of the genetically expressed peptide give the M13 phage-based SPR sensor superior selectivity and sensitivity. Furthermore, the sensitivity of the sensor was enhanced by aligning the receptor matrix in a specific direction. Incorporation of a specific binding peptide (His-Pro-Gln: HPQ) gives the M13 bacteriophage high selectivity for streptavidin. Our M13 phage-based SPR sensor takes advantage of the simplicity of self-assembly compared with relatively complex photolithography techniques or chemical conjugations. Additionally, the designed structure composed of functionalized M13 bacteriophage can simultaneously improve the sensitivity and selectivity of the SPR sensor. By taking advantage of genetic engineering and self-assembly, we propose a simple method for fabricating a novel M13 phage-based SPR sensor system with high sensitivity and high selectivity. Copyright © 2016 Elsevier B.V. All rights reserved.
Classification Influence of Features on Given Emotions and Its Application in Feature Selection
NASA Astrophysics Data System (ADS)
Xing, Yin; Chen, Chuang; Liu, Li-Long
2018-04-01
In order to address the large amount of redundant data in high-dimensional speech emotion features, we analyze the extracted speech emotion features in depth and select the better ones. Firstly, a given emotion is classified using each feature individually. Secondly, the features are ranked by recognition rate in descending order. Then, the optimal threshold on the features is determined by a recognition-rate criterion. Finally, the better features are obtained. When applied to the Berlin and Chinese emotional datasets, the experimental results show that this feature selection method outperforms the other traditional methods.
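The ranking-and-threshold procedure might be sketched as follows; the single-feature nearest-class-mean classifier and the 0.6 threshold are illustrative assumptions, not the paper's exact criterion:

```python
# Sketch of per-feature screening: classify with each feature alone,
# rank features by recognition rate, keep those above a threshold.
# The nearest-class-mean rule and 0.6 cutoff are illustrative choices.
import numpy as np

def single_feature_rate(x, y):
    """Recognition rate when classifying by distance to per-class means."""
    classes = np.unique(y)
    means = np.array([x[y == c].mean() for c in classes])
    pred = classes[np.abs(x[:, None] - means[None, :]).argmin(1)]
    return (pred == y).mean()

def select_features(X, y, threshold=0.6):
    """Rank features by individual recognition rate (descending) and
    return those at or above the threshold."""
    rates = np.array([single_feature_rate(X[:, j], y) for j in range(X.shape[1])])
    order = np.argsort(-rates)
    return [j for j in order if rates[j] >= threshold]
```

Features whose individual recognition rate falls below the threshold are treated as redundant and dropped before the final emotion classifier is trained.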
NASA Technical Reports Server (NTRS)
Chubb, Donald L.; Flood, Dennis J.; Lowe, Roland A.
1993-01-01
Thermophotovoltaic (TPV) systems are attractive possibilities for direct thermal-to-electric energy conversion, but have typically required the use of blackbody radiators operating at high temperatures. Recent advances in both the understanding and performance of solid rare-earth oxide selective emitters make possible the use of TPV at temperatures as low as 1200 K. Both selective emitter and filter system TPV systems are feasible. However, requirements on the filter system are severe in order to attain high efficiency. A thin film of a rare-earth oxide is one method for producing an efficient, rugged selective emitter. An efficiency of 0.14 and a power density of 9.2 W/kg at 1200 K are calculated for a hypothetical thin-film neodymia (Nd2O3) selective emitter TPV system that uses radioisotope decay as the thermal energy source.
Method for producing highly conformal transparent conducting oxides
Elam, Jeffrey W.; Mane, Anil U.
2016-07-26
A method for forming a transparent conducting oxide product layer. The method includes the use of precursors, such as tetrakis-(dimethylamino) tin and trimethyl indium, and the selected use of dopants, such as SnO and ZnO, to obtain the desired optical, electrical and structural properties for a highly conformal layer coating on a substrate. Ozone was also used as a reactive gas, enabling rapid production of the desired product layer.
Methods and devices for high-throughput dielectrophoretic concentration
Simmons, Blake A.; Cummings, Eric B.; Fiechtner, Gregory J.; Fintschenko, Yolanda; McGraw, Gregory J.; Salmi, Allen
2010-02-23
Disclosed herein are methods and devices for assaying and concentrating analytes in a fluid sample using dielectrophoresis. As disclosed, the methods and devices utilize substrates having a plurality of pores through which analytes can be selectively prevented from passing, or inhibited, on application of an appropriate electric field waveform. The pores of the substrate produce nonuniform electric field having local extrema located near the pores. These nonuniform fields drive dielectrophoresis, which produces the inhibition. Arrangements of electrodes and porous substrates support continuous, bulk, multi-dimensional, and staged selective concentration.
Potent and selective mediators of cholesterol efflux
Bielicki, John K; Johansson, Jan
2015-03-24
The present invention provides a family of non-naturally occurring polypeptides having cholesterol efflux activity that parallels that of full-length apolipoproteins (e.g., Apo AI and Apo E), and having high selectivity for ABCA1 that parallels that of full-length apolipoproteins. The invention also provides compositions comprising such polypeptides, methods of identifying, screening and synthesizing such polypeptides, and methods of treating, preventing or diagnosing diseases and disorders associated with dyslipidemia, hypercholesterolemia and inflammation.
Tribological synthesis method for producing low-friction surface film coating
Ajayi, Oyelayo O.; Lorenzo-Martin, Maria De La; Fenske, George R.
2016-10-25
An article and method of manufacture of a low-friction tribological film on a substrate. The article includes a steel or ceramic substrate that has been tribologically processed with a lubricant containing selected additives; the additives, temperature, load and time of processing can be selectively controlled to bias formation of a film on the substrate, the film being an amorphous structure exhibiting highly advantageous low-friction properties.
Laser Journal (Selected Articles),
1982-09-10
Selected contents include: a room-temperature CO2 branch-selection laser with a lifetime already exceeding 6500 hours, and possibly longer; a high-power, long-life HeCd laser (Qu Shipu); and a method of plating single-crystal gold film in a vacuum by the foreign-material extension method, in which mica is first used as the substrate.
Protein and Antibody Engineering by Phage Display.
Frei, J C; Lai, J R
2016-01-01
Phage display is an in vitro selection technique that allows for the rapid isolation of proteins with desired properties including increased affinity, specificity, stability, and new enzymatic activity. The power of phage display relies on the phenotype-to-genotype linkage of the protein of interest displayed on the phage surface with the encoding DNA packaged within the phage particle, which allows for selective enrichment of library pools and high-throughput screening of resulting clones. As an in vitro method, the conditions of the binding selection can be tightly controlled. Due to the high-throughput nature, rapidity, and ease of use, phage display is an excellent technological platform for engineering antibody or proteins with enhanced properties. Here, we describe methods for synthesis, selection, and screening of phage libraries with particular emphasis on designing humanizing antibody libraries and combinatorial scanning mutagenesis libraries. We conclude with a brief section on troubleshooting for all stages of the phage display process. © 2016 Elsevier Inc. All rights reserved.
Acoustic microscope surface inspection system and method
Khuri-Yakub, B.T.; Parent, P.; Reinholdtsen, P.A.
1991-02-26
An acoustic microscope surface inspection system and method are described in which pulses of high frequency electrical energy are applied to a transducer which forms and focuses acoustic energy onto a selected location on the surface of an object, receives energy from the location, and generates electrical pulses. The phase of the high frequency electrical signal pulses is stepped with respect to the phase of a reference signal at said location. An output signal is generated which is indicative of the surface at said selected location. The object is scanned to provide output signals representative of the surface at a plurality of surface locations. 7 figures.
A proposed method for world weightlifting championships team selection.
Chiu, Loren Z F
2009-08-01
The caliber of competitors at the World Weightlifting Championships (WWC) has increased greatly over the past 20 years. As the WWC are the primary qualifiers for Olympic slots (1996 to present), it is imperative for a nation to select team members who will finish with a high placing and score team points. Previous selection methods were based on a simple percentage system. Analysis of the results from the 2006 and 2007 WWC indicates a curvilinear trend in each weight class, suggesting a simple percentage system will not maximize the number of team points earned. To maximize team points, weightlifters should be selected based on their potential to finish in the top 25. A 5-tier ranking system is proposed that should ensure the athletes with the greatest potential to score team points are selected.
Morell, Montse; Espargaro, Alba; Aviles, Francesc Xavier; Ventura, Salvador
2008-01-01
We present a high-throughput approach to study weak protein-protein interactions by coupling bimolecular fluorescent complementation (BiFC) to flow cytometry (FC). In BiFC, the interaction partners (bait and prey) are fused to two rationally designed fragments of a fluorescent protein, which recovers its function upon the binding of the interacting proteins. For weak protein-protein interactions, the detected fluorescence is proportional to the interaction strength, thereby allowing in vivo discrimination between closely related binders with different affinity for the bait protein. FC provides a method for high-speed multiparametric data acquisition and analysis; the assay is simple, thousands of cells can be analyzed in seconds and, if required, selected using fluorescence-activated cell sorting (FACS). The combination of both methods (BiFC-FC) provides a technically straightforward, fast and highly sensitive method to validate weak protein interactions and to screen and identify optimal ligands in biologically synthesized libraries. Once plasmids encoding the protein fusions have been obtained, the evaluation of a specific interaction, the generation of a library and selection of active partners using BiFC-FC can be accomplished in 5 weeks.
Catalyzed CO2-transport membrane on high surface area inorganic support
Liu, Wei
2014-05-06
Disclosed are membranes and methods for making the same, which membranes provide improved permeability, stability, and cost-effective manufacturability for separating CO2 from gas streams such as flue gas streams. High CO2 permeation flux is achieved by immobilizing an ultra-thin, optionally catalyzed fluid layer onto a meso-porous modification layer on a thin, porous inorganic substrate such as a porous metallic substrate. The CO2-selective liquid fluid blocks non-selective pores, and allows for selective absorption of CO2 from gas mixtures such as flue gas mixtures and subsequent transport to the permeation side of the membrane. Carbon dioxide permeance levels are on the order of 1.0×10⁻⁶ mol/(m²·s·Pa) or better. Methods for making such membranes allow commercial-scale membrane manufacturing at highly cost-effective rates when compared to conventional commercial-scale CO2 separation processes and equipment for the same, and such membranes are operable on an industrial use scale.
Mesoporous structured MIPs@CDs fluorescence sensor for highly sensitive detection of TNT.
Xu, Shoufang; Lu, Hongzhi
2016-11-15
A facile strategy was developed to prepare a mesoporous structured molecularly imprinted polymer-capped carbon dots (M-MIPs@CDs) fluorescence sensor for highly sensitive and selective determination of TNT. Using amino-CDs directly as the "functional monomer" for imprinting simplifies the imprinting process and provides good accessibility to the recognition sites. The as-prepared M-MIPs@CDs sensor, which uses periodic mesoporous silica as the imprinting matrix and amino-CDs directly as the "functional monomer", exhibited excellent selectivity and sensitivity toward TNT, with a detection limit of 17 nM. The sensor could be recycled 10 times without an obvious decrease in efficiency. The feasibility of the developed method for real samples was successfully evaluated through the analysis of TNT in soil and water samples, with satisfactory recoveries of 88.6-95.7%. The method proposed in this work proved to be a convenient and practical way to prepare highly sensitive and selective fluorescence MIPs@CDs sensors. Copyright © 2016 Elsevier B.V. All rights reserved.
The establishment of insulin resistance model in FL83B and L6 cell
NASA Astrophysics Data System (ADS)
Liu, Lanlan; Han, Jizhong; Li, Haoran; Liu, Mengmeng; Zeng, Bin
2017-10-01
Insulin resistance (IR) models of mouse liver epithelial (FL83B) and rat myoblast (L6) cells were induced by three kinds of inducers: dexamethasone, high insulin and high glucose. The purpose was to select the optimal insulin resistance model and thereby provide a simple and reliable IR cell model for studying the pathogenesis of IR and for developing IR-improving drugs and functional foods. The MTT method was used for toxicity screening of the three compounds to select safe and suitable concentrations. We then applied a glucose oxidase-peroxidase (GOD-POD) assay to FL83B and L6 cells treated with dexamethasone, high insulin and high glucose. Results suggested that FL83B cells developed insulin resistance after dexamethasone induction (0.25 μM), and that L6 cells developed insulin resistance after combined high-glucose (30 mM) and dexamethasone (0.25 μM) induction.
A genome-wide scan for signatures of selection in Chinese indigenous and commercial pig breeds.
Yang, Songbai; Li, Xiuling; Li, Kui; Fan, Bin; Tang, Zhonglin
2014-01-15
Modern breeding and artificial selection play critical roles in pig domestication and shape the genetic variation of different breeds. China has many indigenous pig breeds whose morphological characteristics and production performance differ from those of foreign commercial pig breeds. However, the signatures of selection on genes underlying the economic traits that distinguish Chinese indigenous from commercial pigs remain poorly understood. We identified footprints of positive selection at the whole-genome level using 44,652 SNPs genotyped in six Chinese indigenous pig breeds, one developed breed and two commercial breeds. An empirical genome-wide distribution of Fst (F-statistics) was constructed based on estimations of Fst for each SNP across these nine breeds. We detected selection at the genome level using the high-Fst outlier method and found that 81 candidate genes showed strong evidence of positive selection. Furthermore, network analyses showed that the genes displaying evidence of positive selection were mainly involved in the development of tissues and organs and in the immune response. In addition, we calculated the pairwise Fst between Chinese indigenous and commercial breeds (CHN vs. EURO) and between Northern and Southern Chinese indigenous breeds (Northern vs. Southern). The IGF1R and ESR1 genes showed evidence of positive selection in the CHN vs. EURO and Northern vs. Southern comparisons, respectively. In this study, we identified for the first time the genomic regions showing evidence of selection between Chinese indigenous and commercial pig breeds using the high-Fst outlier method. These regions were found to be involved in the development of tissues and organs, the immune response, growth and litter size. The results of this study provide new insights into the genetic variation and domestication of pigs.
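The high-Fst outlier scan described above can be sketched as follows. This is an illustrative implementation only: it assumes a basic Fst estimator, Var(p)/[p̄(1−p̄)], and an empirical percentile cutoff; the study's exact estimator and threshold may differ.

```python
import numpy as np

def fst_per_snp(freqs):
    """Simplified per-SNP Fst across populations.

    freqs: array of shape (n_pops, n_snps) holding alternate-allele
    frequencies. Uses Fst = Var(p) / (p_bar * (1 - p_bar)); monomorphic
    SNPs (denominator zero) are assigned Fst = 0.
    """
    p_bar = freqs.mean(axis=0)
    var_p = freqs.var(axis=0)
    denom = p_bar * (1.0 - p_bar)
    with np.errstate(invalid="ignore", divide="ignore"):
        fst = np.where(denom > 0, var_p / np.where(denom > 0, denom, 1.0), 0.0)
    return fst

def high_fst_outliers(fst, top_percent=1.0):
    """Indices of SNPs in the empirical top `top_percent` of the Fst distribution."""
    cutoff = np.percentile(fst, 100.0 - top_percent)
    return np.flatnonzero(fst >= cutoff)
```

A SNP strongly differentiated between breeds (e.g. frequencies 0.1 vs. 0.9) scores high, while SNPs with equal or fixed frequencies score zero.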
An opinion formation based binary optimization approach for feature selection
NASA Astrophysics Data System (ADS)
Hamedmoghadam, Homayoun; Jalili, Mahdi; Yu, Xinghuo
2018-02-01
This paper proposes a novel optimization method based on opinion formation in complex network systems. The proposed technique mimics human-human interaction mechanisms using a mathematical model derived from the social sciences. Our method encodes a subset of selected features as the opinion of an artificial agent and simulates the opinion formation process among a population of agents to solve the feature selection problem. The agents interact over an underlying interaction network structure and reach consensus in their opinions while finding better solutions to the problem. A number of mechanisms are employed to avoid getting trapped in local minima. We compare the performance of the proposed method with a number of classical population-based optimization methods and a state-of-the-art opinion formation based method. Our experiments on a number of high dimensional datasets show that the proposed algorithm outperforms the others.
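A toy sketch of the opinion-formation idea, under assumed dynamics: the paper's actual social-influence model, interaction network and anti-stagnation mechanisms are more elaborate. Here `fitness` stands in for a score (e.g. classifier accuracy) of the selected feature subset.

```python
import random

def opinion_feature_selection(fitness, n_features, n_agents=20,
                              n_rounds=50, copy_prob=0.2, seed=0):
    """Toy opinion-formation search over binary feature masks.

    Each agent's 'opinion' is a 0/1 mask over features. In each round a
    random pair interacts: the lower-fitness agent copies each bit of the
    better agent with probability `copy_prob` (a stand-in for the paper's
    social-influence dynamics), plus a rare random flip to keep diversity.
    """
    rng = random.Random(seed)
    agents = [[rng.randint(0, 1) for _ in range(n_features)]
              for _ in range(n_agents)]
    best = list(max(agents, key=fitness))    # best opinion seen so far
    for _ in range(n_rounds):
        a, b = rng.sample(range(n_agents), 2)
        if fitness(agents[a]) < fitness(agents[b]):
            a, b = b, a                      # agent a is now the better one
        for i in range(n_features):
            if rng.random() < copy_prob:
                agents[b][i] = agents[a][i]  # influence toward consensus
        j = rng.randrange(n_features)        # rare mutation vs. local minima
        agents[b][j] ^= rng.random() < 0.05
        cand = max(agents, key=fitness)
        if fitness(cand) > fitness(best):
            best = list(cand)
    return best
```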
Evaluation of isolation methods for pathogenic Yersinia enterocolitica from pig intestinal content.
Laukkanen, R; Hakkinen, M; Lundén, J; Fredriksson-Ahomaa, M; Johansson, T; Korkeala, H
2010-03-01
The aim of this study was to evaluate the efficiency of four isolation methods for the detection of pathogenic Yersinia enterocolitica from pig intestinal content. The four methods comprised 15 isolation steps using selective enrichments (irgasan-ticarcillin-potassium chlorate and modified Rappaport broth) and mildly selective enrichments at 4 or 25 degrees C. Salmonella-Shigella-desoxycholate-calcium chloride agar and cefsulodin-irgasan-novobiocin agar were used as plating media. The most sensitive method detected 78% (53/68) of the positive samples. Individual isolation steps using cold enrichment, either as the only enrichment or as a pre-enrichment step followed by further selective enrichment, showed the highest sensitivities (55-66%). All isolation methods resulted in high numbers of suspect colonies not confirmed as pathogenic Y. enterocolitica. Cold enrichment should be used in the detection of pathogenic Y. enterocolitica from pig intestinal contents, and more than one parallel isolation step is needed. The study shows that the detected prevalence of Y. enterocolitica in pig intestinal contents varies greatly depending on the isolation method used. More selective and sensitive isolation methods need to be developed for pathogenic Y. enterocolitica.
State recovery and lockstep execution restart in a system with multiprocessor pairing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gara, Alan; Gschwind, Michael K; Salapura, Valentina
System, method and computer program product for a multiprocessing system to offer selective pairing of processor cores for increased processing reliability. A selective pairing facility is provided that selectively connects, i.e., pairs, multiple microprocessor or processor cores to provide one highly reliable thread (or thread group). Each pair of cores providing one highly reliable thread connects with system components such as a memory "nest" (or memory hierarchy), an optional system controller, an optional interrupt controller, and optional I/O or peripheral devices. The memory nest is attached to the selective pairing facility via a switch or a bus. Each selectively paired processor core includes a transactional execution facility, wherein the system is configured to enable processor rollback to a previous state and to reinitialize lockstep execution in order to recover when an incorrect execution has been detected by the selective pairing facility.
Fuzzy System-Based Target Selection for a NIR Camera-Based Gaze Tracker
Naqvi, Rizwan Ali; Arsalan, Muhammad; Park, Kang Ryoung
2017-01-01
Gaze-based interaction (GBI) techniques have been a popular subject of research in the last few decades. Among other applications, GBI can be used by persons with disabilities to perform everyday tasks and as a game interface, and it can play a pivotal role in the human computer interface (HCI) field. While gaze tracking systems have shown high accuracy in GBI, detecting a user's gaze for target selection is a challenging problem that needs to be considered while using a gaze detection system. Past research has used blinking of the eyes as well as dwell time-based methods for this purpose, but these techniques are either inconvenient for the user or require a long time for target selection. Therefore, in this paper, we propose a method for fuzzy system-based target selection for near-infrared (NIR) camera-based gaze trackers. The results of the experiments, together with usability tests and on-screen keyboard use, show that the proposed method outperforms previous methods. PMID:28420114
Swiderska, Zaneta; Korzynska, Anna; Markiewicz, Tomasz; Lorent, Malgorzata; Zak, Jakub; Wesolowska, Anna; Roszkowiak, Lukasz; Slodkowska, Janina; Grala, Bartlomiej
2015-01-01
Background. This paper presents a study concerning hot-spot selection in the assessment of whole slide images of tissue sections collected from meningioma patients. The samples were immunohistochemically stained to determine the Ki-67/MIB-1 proliferation index used for prognosis and treatment planning. Objective. Observer performance was examined by comparing the results of the proposed method of automatic hot-spot selection in whole slide images, the results of traditional scoring under a microscope, and the results of a pathologist's manual hot-spot selection. Methods. Ki-67 index results obtained by optical scoring under a microscope, by quantification software applied to hot spots selected by two pathologists (once and three times, respectively), and by the same software applied to hot spots selected by the proposed automatic methods were compared using Kendall's tau-b statistics. Results. The results show intra- and interobserver agreement. The agreement between Ki-67 scoring with manual and automatic hot-spot selection is high, while the agreement between Ki-67 scoring in whole slide images and traditional microscopic examination is lower. Conclusions. The agreement observed for the three scoring methods shows that automating area selection is an effective tool for supporting physicians and for increasing the reliability of Ki-67 scoring in meningioma.
Efficient experimental design of high-fidelity three-qubit quantum gates via genetic programming
NASA Astrophysics Data System (ADS)
Devra, Amit; Prabhu, Prithviraj; Singh, Harpreet; Arvind; Dorai, Kavita
2018-03-01
We have designed efficient quantum circuits for the three-qubit Toffoli (controlled-controlled-NOT) and the Fredkin (controlled-SWAP) gate, optimized via genetic programming methods. The gates thus obtained were experimentally implemented on a three-qubit NMR quantum information processor, with a high fidelity. Toffoli and Fredkin gates in conjunction with the single-qubit Hadamard gates form a universal gate set for quantum computing and are an essential component of several quantum algorithms. Genetic algorithms are stochastic search algorithms based on the logic of natural selection and biological genetics and have been widely used for quantum information processing applications. We devised a new selection mechanism within the genetic algorithm framework to select individuals from a population. We call this mechanism the "Luck-Choose" mechanism and were able to achieve faster convergence to a solution using this mechanism, as compared to existing selection mechanisms. The optimization was performed under the constraint that the experimentally implemented pulses are of short duration and can be implemented with high fidelity. We demonstrate the advantage of our pulse sequences by comparing our results with existing experimental schemes and other numerical optimization methods.
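For context, a standard fitness-proportionate (roulette-wheel) selection step is sketched below; the paper's custom "Luck-Choose" mechanism, which it reports converges faster, is a different rule and is not reproduced here.

```python
import random

def roulette_select(population, fitness, rng=random.Random(0)):
    """Standard fitness-proportionate (roulette-wheel) selection.

    Picks one individual with probability proportional to its (positive)
    fitness: a point is drawn uniformly on [0, total fitness] and the
    individual whose cumulative-fitness segment contains it is returned.
    """
    scores = [fitness(ind) for ind in population]
    total = sum(scores)
    pick = rng.uniform(0.0, total)
    running = 0.0
    for ind, score in zip(population, scores):
        running += score
        if running >= pick:
            return ind
    return population[-1]   # guard against floating-point round-off
```

Over many draws, an individual with ten times the fitness of another is selected roughly ten times as often.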
Zhang, Xiao-Chao; Wei, Zhen-Wei; Gong, Xiao-Yun; Si, Xing-Yu; Zhao, Yao-Yao; Yang, Cheng-Dui; Zhang, Si-Chun; Zhang, Xin-Rong
2016-04-29
Integrating droplet-based microfluidics with mass spectrometry is essential to high-throughput and multiple analysis of single cells. Nevertheless, matrix effects such as the interference of culture medium and intracellular components influence the sensitivity and the accuracy of results in single-cell analysis. To resolve this problem, we developed a method that integrated droplet-based microextraction with single-cell mass spectrometry. Specific extraction solvent was used to selectively obtain intracellular components of interest and remove interference of other components. Using this method, UDP-Glc-NAc, GSH, GSSG, AMP, ADP and ATP were successfully detected in single MCF-7 cells. We also applied the method to study the change of unicellular metabolites in the biological process of dysfunctional oxidative phosphorylation. The method could not only realize matrix-free, selective and sensitive detection of metabolites in single cells, but also have the capability for reliable and high-throughput single-cell analysis.
A kinetic method for the determination of thiourea by its catalytic effect in micellar media
NASA Astrophysics Data System (ADS)
Abbasi, Shahryar; Khani, Hossein; Gholivand, Mohammad Bagher; Naghipour, Ali; Farmany, Abbas; Abbasi, Freshteh
2009-03-01
A highly sensitive, selective and simple kinetic method was developed for the determination of trace levels of thiourea, based on its catalytic effect on the oxidation of janus green in phosphoric acid media in the presence of Triton X-100 surfactant, without any separation or pre-concentration steps. The reaction was monitored spectrophotometrically by tracing the formation of the green-colored oxidized product of janus green at 617 nm within 15 min of mixing the reagents. The effect of several factors on the reaction rate was investigated. Following the recommended procedure, thiourea could be determined with a linear calibration graph over the 0.03-10.00 μg/ml range. The detection limit of the proposed method is 0.02 μg/ml. Most foreign species do not interfere with the determination. The high sensitivity and selectivity of the proposed method allowed its successful application to fruit juice and industrial waste water.
Tracking metastatic breast cancer: the future of biology in biosensors.
Lim, Y C; Wiegmans, A P
2016-04-01
Circulating tumour cells associated with breast cancer (brCTCs) represent cells that have the capability to establish aggressive secondary metastatic tumours. The isolation and characterization of CTCs from blood in a single device is the future of oncology diagnosis and treatment. Methods of CTC enrichment have primarily relied on simple biological interactions with bimodal outcomes: either high purity with low cell numbers, or low purity with high background. In this review, we discuss the advances in microfluidics that have allowed the use of more complex selection criteria and biological methods to identify CTC populations. We also discuss a potential new method of selection based on the response of the oncogenic DNA repair pathways within brCTCs. This method would give insight not only into the oncogenic signalling at play but also into the chemoresistance mechanisms that could guide future therapeutic intervention at any stage of disease progression.
Variable screening via quantile partial correlation
Ma, Shujie; Tsai, Chih-Ling
2016-01-01
In quantile linear regression with ultra-high dimensional data, we propose an algorithm for screening all candidate variables and subsequently selecting relevant predictors. Specifically, we first employ quantile partial correlation for screening, and then we apply the extended Bayesian information criterion (EBIC) for best subset selection. Our proposed method can successfully select predictors when the variables are highly correlated, and it can also identify variables that make a contribution to the conditional quantiles but are marginally uncorrelated or only weakly correlated with the response. Theoretical results show that the proposed algorithm can yield the sure screening set. By controlling the false selection rate, model selection consistency can be achieved theoretically. In practice, we propose using EBIC for best subset selection so that the resulting model is screening consistent. Simulation studies demonstrate that the proposed algorithm performs well, and an empirical example is presented. PMID:28943683
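The screening step can be sketched with the marginal quantile correlation of Li, Li and Tsai (2015); note this is an assumption for illustration, since the paper screens with the *partial* version, which additionally adjusts for conditioning covariates.

```python
import numpy as np

def quantile_correlation(x, y, tau=0.5):
    """Marginal quantile correlation:
    qcor_tau(y, x) = cov(psi_tau(y - Q_tau(y)), x) / sqrt((tau - tau^2) var(x)),
    where psi_tau(u) = tau - 1{u < 0} and Q_tau(y) is the tau-quantile of y."""
    q = np.quantile(y, tau)
    psi = tau - ((y - q) < 0).astype(float)
    cov = np.mean(psi * (x - x.mean()))
    return cov / np.sqrt((tau - tau**2) * x.var())

def screen_by_qcor(X, y, d, tau=0.5):
    """Keep the indices of the d predictors with the largest |qcor|."""
    scores = np.array([abs(quantile_correlation(X[:, j], y, tau))
                       for j in range(X.shape[1])])
    return np.argsort(scores)[::-1][:d]
```

After screening down to d predictors, best subset selection with EBIC would be run on the retained columns only.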
Magnuson, Matthew; Campisano, Romy; Griggs, John; Fitz-James, Schatzi; Hall, Kathy; Mapp, Latisha; Mullins, Marissa; Nichols, Tonya; Shah, Sanjiv; Silvestri, Erin; Smith, Terry; Willison, Stuart; Ernst, Hiba
2014-11-01
Catastrophic incidents can generate a large number of samples of analytically diverse types, including forensic, clinical, environmental, food, and others. Environmental samples include water, wastewater, soil, air, urban building and infrastructure materials, and surface residue. Such samples may arise not only from contamination from the incident but also from the multitude of activities surrounding the response to the incident, including decontamination. This document summarizes a range of activities to help build laboratory capability in preparation for sample analysis following a catastrophic incident, including selection and development of fit-for-purpose analytical methods for chemical, biological, and radiological contaminants. Fit-for-purpose methods are those which have been selected to meet project specific data quality objectives. For example, methods could be fit for screening contamination in the early phases of investigation of contamination incidents because they are rapid and easily implemented, but those same methods may not be fit for the purpose of remediating the environment to acceptable levels when a more sensitive method is required. While the exact data quality objectives defining fitness-for-purpose can vary with each incident, a governing principle of the method selection and development process for environmental remediation and recovery is based on achieving high throughput while maintaining high quality analytical results. This paper illustrates the result of applying this principle, in the form of a compendium of analytical methods for contaminants of interest. The compendium is based on experience with actual incidents, where appropriate and available. This paper also discusses efforts aimed at adaptation of existing methods to increase fitness-for-purpose and development of innovative methods when necessary. The contaminants of interest are primarily those potentially released through catastrophes resulting from malicious activity. 
However, the same techniques discussed could also have application to catastrophes resulting from other incidents, such as natural disasters or industrial accidents. Further, the high sample throughput enabled by the techniques discussed could be employed for conventional environmental studies and compliance monitoring, potentially decreasing costs and/or increasing the quantity of data available to decision-makers. Published by Elsevier Ltd.
Near-optimal experimental design for model selection in systems biology.
Busetto, Alberto Giovanni; Hauser, Alain; Krummenacher, Gabriel; Sunnåker, Mikael; Dimopoulos, Sotiris; Ong, Cheng Soon; Stelling, Jörg; Buhmann, Joachim M
2013-10-15
Biological systems are understood through iterations of modeling and experimentation. Not all experiments, however, are equally valuable for predictive modeling. This study introduces an efficient method for experimental design aimed at selecting dynamical models from data. Motivated by biological applications, the method enables the design of crucial experiments: it determines a highly informative selection of measurement readouts and time points. We demonstrate formal guarantees of design efficiency on the basis of previous results. By reducing our task to the setting of graphical models, we prove that the method finds a near-optimal design selection with a polynomial number of evaluations. Moreover, the method exhibits the best polynomial-complexity constant approximation factor, unless P = NP. We measure the performance of the method in comparison with established alternatives, such as ensemble non-centrality, on example models of different complexity. Efficient design accelerates the loop between modeling and experimentation: it enables the inference of complex mechanisms, such as those controlling central metabolic operation. Toolbox 'NearOED' available with source code under GPL on the Machine Learning Open Source Software Web site (mloss.org).
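The flavour of guarantee the paper builds on comes from greedy maximization of monotone (approximately) submodular information objectives, which carries the classic (1 − 1/e) approximation factor. A generic greedy design loop is sketched below; this is not the paper's NearOED toolbox, and `gain` is an assumed set function (e.g. coverage or information gain over candidate readouts and time points).

```python
def greedy_design(candidates, gain, budget):
    """Greedy experimental-design selection.

    Repeatedly adds the candidate measurement with the largest marginal
    gain until `budget` measurements are chosen. For monotone submodular
    `gain`, the result is within (1 - 1/e) of the optimal set's value.
    """
    chosen = []
    remaining = list(candidates)
    for _ in range(budget):
        best = max(remaining,
                   key=lambda c: gain(chosen + [c]) - gain(chosen))
        chosen.append(best)
        remaining.remove(best)
    return chosen
```

With a coverage-style objective, for example, the loop first picks the candidate covering the most items, then the one adding the most new items, and so on.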
Adaptive Optics Image Restoration Based on Frame Selection and Multi-frame Blind Deconvolution
NASA Astrophysics Data System (ADS)
Tian, Yu; Rao, Chang-hui; Wei, Kai
Restricted by observational conditions and hardware, adaptive optics can only partially correct optical images blurred by atmospheric turbulence. A post-processing method based on frame selection and multi-frame blind deconvolution is proposed for the restoration of high-resolution adaptive optics images. By frame selection we mean that a subset of the degraded (blurred) images is first selected to participate in the iterative blind deconvolution calculation, which requires no a priori knowledge and uses only a positivity constraint. This method has been applied to the restoration of stellar images observed by the 61-element adaptive optics system installed on the Yunnan Observatory 1.2 m telescope. The experimental results indicate that the method can effectively compensate for the residual errors the adaptive optics system leaves in the image, and the restored image can reach diffraction-limited quality.
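The frame-selection stage can be illustrated with a simple sharpness ranking. Gradient energy is used here as an assumed quality metric; the abstract does not specify the exact selection criterion.

```python
import numpy as np

def select_frames(frames, keep_fraction=0.3):
    """Rank short-exposure frames by a sharpness metric and keep the
    best fraction for the multi-frame blind-deconvolution stage.

    Sharpness here is mean gradient energy, one common choice: blurred
    frames have weaker intensity gradients and score lower.
    """
    def sharpness(img):
        gy, gx = np.gradient(img.astype(float))
        return float(np.mean(gx**2 + gy**2))

    scores = [sharpness(f) for f in frames]
    n_keep = max(1, int(round(keep_fraction * len(frames))))
    order = np.argsort(scores)[::-1]        # best (sharpest) first
    return [frames[i] for i in order[:n_keep]]
```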
Garcia-Allende, P Beatriz; Mirapeix, Jesus; Conde, Olga M; Cobo, Adolfo; Lopez-Higuera, Jose M
2009-01-01
Plasma optical spectroscopy is widely employed in on-line welding diagnostics. Determining the plasma electron temperature, which is typically selected as the output monitoring parameter, requires the identification of atomic emission lines. As a consequence, additional processing stages are needed, with a direct impact on the real-time performance of the technique. The line-to-continuum method is a feasible alternative spectroscopic approach and is particularly interesting in terms of its computational efficiency. However, the monitoring signal depends strongly on the chosen emission line. In this paper, a feature selection methodology is proposed to resolve the uncertainty in selecting the optimum spectral band, which allows the line-to-continuum method to be employed for on-line welding diagnostics. Field tests have been conducted to demonstrate the feasibility of the solution.
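The line-to-continuum ratio itself is straightforward to compute once an emission line and a nearby line-free continuum band are chosen. A sketch follows; the band positions here are illustrative, and choosing the optimal band automatically is the paper's actual contribution.

```python
import numpy as np

def line_to_continuum(wavelengths, intensities, line_wl, line_halfwidth,
                      cont_band):
    """Line-to-continuum ratio for a chosen emission line.

    Mean intensity within `line_halfwidth` of `line_wl`, divided by the
    mean continuum level in the line-free band `cont_band = (lo, hi)`.
    """
    wl = np.asarray(wavelengths, dtype=float)
    spec = np.asarray(intensities, dtype=float)
    in_line = np.abs(wl - line_wl) <= line_halfwidth
    in_cont = (wl >= cont_band[0]) & (wl <= cont_band[1])
    return spec[in_line].mean() / spec[in_cont].mean()
```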
[Production of marker-free plants expressing the gene of the hepatitis B virus surface antigen].
Rukavtsova, E B; Gaiazova, A R; Chebotareva, E N; Bur'ianova, Ia I
2009-08-01
The pBM plasmid, carrying the gene for the hepatitis B virus surface antigen (HBsAg) and free of any selection markers for antibiotic or herbicide resistance, was constructed for the genetic transformation of plants. A method for screening transformed plant seedlings on nonselective media was developed. An enzyme immunoassay, which provides highly sensitive detection of HBsAg in plant extracts, was used to select transgenic plants carrying the HBsAg gene among the regenerants produced. Tobacco and tomato transgenic lines synthesizing this antigen at a level of 0.01-0.05% of total soluble protein were obtained. The achieved level of HBsAg synthesis is sufficient for preclinical trials of the produced plants as a safe, new-generation edible vaccine. The developed selection method can be used to produce safe plants free of selection markers.
Gao, Zhao; Wang, Libing; Su, Rongxin; Huang, Renliang; Qi, Wei; He, Zhimin
2015-08-15
We herein report a facile, one-step pyrolysis synthesis of photoluminescent carbon dots (CDs) using citric acid as the carbon source and lysine as the surface passivation reagent. The as-prepared CDs show narrow size distribution, excellent blue fluorescence and good photo-stability and water dispersivity. The fluorescence of the CDs was found to be effectively quenched by ferric (Fe(III)) ions with high selectivity via a photo-induced electron transfer (PET) process. Upon addition of phytic acid (PA) to the CDs/Fe(III) complex dispersion, the fluorescence of the CDs was significantly recovered, arising from the release of Fe(III) ions from the CDs/Fe(III) complex because PA has a higher affinity for Fe(III) ions compared to CDs. Furthermore, we developed an "off-on" fluorescence assay method for the detection of phytic acid using CDs/Fe(III) as a fluorescent probe. This probe enables the selective detection of PA with a linear range of 0.68-18.69 μM and a limit of detection (signal-to-noise ratio is 3) of 0.36 μM. The assay method demonstrates high selectivity, repeatability, stability and recovery ratio in the detection of the standard and real PA samples. We believe that the facile operation, low-cost, high sensitivity and selectivity render this CD-based "off-on" fluorescent probe an ideal sensing platform for the detection of PA. Copyright © 2015 Elsevier B.V. All rights reserved.
High Flyers: Glorious Past, Gloomy Present, Any Future?
ERIC Educational Resources Information Center
Baruch, Yehuda; Peiperl, Maury
1997-01-01
New types of psychological contracts, in which promotions cannot be guaranteed or expected, are emerging. Former methods of selecting and assessing high-potential candidates are obsolete, and human resource management practices must change. (SK)
Liu, Aiming; Liu, Quan; Ai, Qingsong; Xie, Yi; Chen, Anqi
2017-01-01
Motor Imagery (MI) electroencephalography (EEG) is widely studied for its non-invasiveness, easy availability, portability, and high temporal resolution. In MI EEG signal processing, the high dimensionality of the features presents a research challenge. It is necessary to eliminate redundant features, which not only create an additional overhead in managing the space complexity, but also might include outliers, thereby reducing classification accuracy. The firefly algorithm (FA) can adaptively select the best subset of features and improve classification accuracy. However, the FA is easily entrapped in a local optimum. To solve this problem, this paper proposes a method that combines the firefly algorithm and learning automata (LA) to optimize feature selection for motor imagery EEG. We employed a combination of common spatial pattern (CSP) and local characteristic-scale decomposition (LCD) algorithms to obtain a high dimensional feature set, and classified it using the spectral regression discriminant analysis (SRDA) classifier. Both data from the fourth Brain-Computer Interface Competition and real-time data acquired in our designed experiments were used to validate the proposed method. Compared with genetic and adaptive weight particle swarm optimization algorithms, the experimental results show that our proposed method effectively eliminates redundant features and improves the classification accuracy of MI EEG signals. In addition, a real-time brain-computer interface system was implemented to verify the feasibility of applying our proposed methods in practical brain-computer interface systems. PMID:29117100
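A minimal binary firefly algorithm for feature selection might look like the following. The sigmoid binarisation and parameter values are common defaults assumed here for illustration, and the learning-automata component the paper adds to escape local optima is not reproduced.

```python
import math
import random

def binary_firefly_fs(fitness, n_features, n_fireflies=10,
                      n_iter=30, beta0=1.0, gamma=1.0, alpha=0.2, seed=0):
    """Minimal binary firefly algorithm for feature selection.

    Real-valued positions are thresholded through a sigmoid to give a
    0/1 feature mask. Dimmer fireflies move toward brighter (fitter)
    ones with attractiveness decaying in squared distance, plus a small
    random step.
    """
    rng = random.Random(seed)
    pos = [[rng.uniform(-2, 2) for _ in range(n_features)]
           for _ in range(n_fireflies)]

    def mask(p):                       # sigmoid binarisation of a position
        return [1 if 1.0 / (1.0 + math.exp(-x)) > 0.5 else 0 for x in p]

    best_mask, best_fit = None, float("-inf")
    for _ in range(n_iter):
        bright = [fitness(mask(p)) for p in pos]
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if bright[j] > bright[i]:          # move i toward brighter j
                    r2 = sum((a - b) ** 2 for a, b in zip(pos[i], pos[j]))
                    beta = beta0 * math.exp(-gamma * r2)
                    pos[i] = [a + beta * (b - a) + alpha * rng.uniform(-0.5, 0.5)
                              for a, b in zip(pos[i], pos[j])]
        for p in pos:                              # track best mask seen
            f = fitness(mask(p))
            if f > best_fit:
                best_fit, best_mask = f, mask(p)
    return best_mask
```

In practice `fitness` would be cross-validated classification accuracy of, e.g., an SRDA classifier on the masked CSP/LCD features, possibly penalised by subset size.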
Isolation and Characterization of a Novel, Highly Selective Astaxanthin-Producing Marine Bacterium.
Asker, Dalal
2017-10-18
A high-throughput screening approach for astaxanthin-producing bacteria led to the discovery of a novel, highly selective astaxanthin-producing marine bacterium (strain N-5). Phylogenetic analysis based on the partial 16S rRNA gene and phenotypic metabolic testing indicated that it belongs to the genus Brevundimonas; therefore, it was designated Brevundimonas sp. strain N-5. HPLC-DAD and HPLC-MS methods were used to identify and quantify the carotenoids produced by strain N-5. Culture conditions, including media, shaking, and time, had significant effects on cell growth and carotenoid production, including astaxanthin. Total carotenoids reached ∼601.2 μg g⁻¹ dry cells, including a remarkable amount (364.6 μg g⁻¹ dry cells) of optically pure astaxanthin (3S,3'S) isomer, with high selectivity (∼60.6%) under medium aeration conditions. Notably, increasing the culture aeration enhanced astaxanthin production up to 85% of total carotenoids. This is the first report describing a natural, highly selective astaxanthin-producing marine bacterium.
Hussain, Shah; Güzel, Yüksel; Schönbichler, Stefan A; Rainer, Matthias; Huck, Christian W; Bonn, Günther K
2013-09-01
Thionins are cysteine-rich, biologically active small (∼5 kDa) and basic proteins occurring ubiquitously in the plant kingdom. This study describes an efficient solid-phase extraction (SPE) method for the selective isolation of these pharmacologically active proteins. Hollow-monolithic extraction tips based on poly(styrene-co-divinylbenzene) with embedded zirconium silicate nano-powder were designed, which showed an excellent selectivity for sulphur-rich proteins owing to strong co-ordination between zirconium and the sulphur atoms from the thiol-group of cysteine. The sorbent provides a combination of strong hydrophobic and electrostatic interactions which may help in targeted separation of certain classes of proteins in a complex mixture based upon the binding strength of different proteins. European mistletoe, wheat and barley samples were used for selective isolation of viscotoxins, purothionins and hordothionins, respectively. The enriched fractions were subjected to analysis by matrix-assisted laser desorption/ionisation-time-of-flight mass spectrometer to prove the selectivity of the SPE method towards thionins. For peptide mass-fingerprint analysis, tryptic digests of SPE eluates were examined. Reversed-phase high-performance liquid chromatography hyphenated to diode-array detection was employed for the purification of individual isoforms. The developed method was found to be highly specific for the isolation and purification of thionins.
Khatua, Snehadrinarayan; Choi, Shin Hei; Lee, Junseong; Huh, Jung Oh; Do, Youngkyu; Churchill, David G
2009-03-02
Fluorescent dinuclear chiral zinc complexes were synthesized in a "one-pot" method in which the lysine-based Schiff base ligand was generated in situ. This complex acts as a highly sensitive and selective fluorescent ON-OFF probe for Cu(2+) in water at physiological pH. Other metal ions such as Hg(2+), Cd(2+), and Pb(2+) gave little fluorescence change.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Economou, Demetre J.
As microelectronic device features continue to shrink approaching atomic dimensions, control of the ion energy distribution on the substrate during plasma etching and deposition becomes increasingly critical. The ion energy should be high enough to drive ion-assisted etching, but not too high to cause substrate damage or loss of selectivity. In many cases, a nearly monoenergetic ion energy distribution (IED) is desired to achieve highly selective etching. In this work, the author briefly reviews: (1) the fundamentals of development of the ion energy distribution in the sheath and (2) methods to control the IED on plasma electrodes. Such methods include the application of “tailored” voltage waveforms on an electrode in continuous wave plasmas, or the application of synchronous bias on a “boundary electrode” during a specified time window in the afterglow of pulsed plasmas.
Current strategies with 1-stage prosthetic breast reconstruction
2015-01-01
Background: 1-stage prosthetic breast reconstruction is gaining traction as a preferred method of breast reconstruction in select patients who undergo mastectomy for cancer or prevention. Methods: Critical elements of the procedure, including patient selection, technique, surgical judgment, and postoperative care, were reviewed. Results: Outcome series reveal that in properly selected patients, direct-to-implant (DTI) reconstruction has similarly low rates of complications and high rates of patient satisfaction compared to traditional 2-stage reconstruction. Conclusions: 1-stage prosthetic breast reconstruction may be the procedure of choice in select patients undergoing mastectomy. Advantages include the potential for the entire reconstructive process to be completed in one surgery, the quick return to normal activities, and the lack of donor site morbidity. PMID:26005643
NASA Astrophysics Data System (ADS)
Yavari, Somayeh; Valadan Zoej, Mohammad Javad; Salehi, Bahram
2018-05-01
The procedure of selecting an optimum number and best distribution of ground control information is important for reaching accurate and robust registration results. This paper proposes a new general procedure based on a Genetic Algorithm (GA) which is applicable to all kinds of features (point, line, and areal); however, linear features, due to their unique characteristics, are of interest in this investigation. The method is called the Optimum number of Well-Distributed ground control Information Selection (OWDIS) procedure. Using this method, a population of binary chromosomes is randomly initialized. Ones indicate the presence of a pair of conjugate lines as a GCL, and zeros their absence. The chromosome length is set equal to the number of all conjugate lines. For each chromosome, the unknown parameters of a proper mathematical model can be calculated using the selected GCLs (ones in the chromosome). A limited number of Check Points (CPs) are then used to evaluate the Root Mean Square Error (RMSE) of each chromosome as its fitness value. The procedure continues until a stopping criterion is reached. The number and position of ones in the best chromosome indicate the selected GCLs among all conjugate lines. To evaluate the proposed method, GeoEye and Ikonos images over different areas of Iran are used. Compared with conventional methods that use all conjugate lines as GCLs, the proposed method in a traditional RFM shows a five-fold accuracy improvement (pixel-level accuracy), demonstrating its strength. To prevent over-parametrization errors in a traditional RFM due to the selection of a high number of improper correlated terms, an optimized line-based RFM is also proposed.
The results show the superiority of the combination of the proposed OWDIS method with an optimized line-based RFM in terms of increasing the accuracy to better than 0.7 pixel, reliability, and reducing systematic errors. These results also demonstrate the high potential of linear features as reliable control features to reach sub-pixel accuracy in registration applications.
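The GA machinery described above (binary chromosomes over candidate control features, check-point RMSE as fitness) can be sketched on a toy 2D registration problem. All specifics are assumptions: an affine model replaces the RFM, synthetic noisy control points replace conjugate lines, and plain truncation selection with one-point crossover is used.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic registration problem: an affine map observed through noisy
# candidate control pairs plus clean check points for fitness evaluation.
A_true = np.array([[1.01, 0.02], [-0.03, 0.99]])
t_true = np.array([5.0, -3.0])
n_ctrl, n_check = 30, 10
src = rng.uniform(0, 100, size=(n_ctrl + n_check, 2))
dst = src @ A_true.T + t_true
dst[:n_ctrl] += rng.normal(0, 0.5, size=(n_ctrl, 2))   # noisy candidate GCPs
ctrl_src, ctrl_dst = src[:n_ctrl], dst[:n_ctrl]
chk_src, chk_dst = src[n_ctrl:], dst[n_ctrl:]

def rmse(mask):
    """Fit an affine model on the selected control pairs, score on check points."""
    idx = np.flatnonzero(mask)
    if idx.size < 3:                       # affine needs >= 3 non-collinear pairs
        return np.inf
    M = np.hstack([ctrl_src[idx], np.ones((idx.size, 1))])
    coef, *_ = np.linalg.lstsq(M, ctrl_dst[idx], rcond=None)
    pred = np.hstack([chk_src, np.ones((n_check, 1))]) @ coef
    return float(np.sqrt(np.mean((pred - chk_dst) ** 2)))

def ga_select(pop_size=30, n_gen=40, p_mut=0.05):
    pop = rng.integers(0, 2, size=(pop_size, n_ctrl))
    for _ in range(n_gen):
        err = np.array([rmse(ch) for ch in pop])
        elite = pop[np.argsort(err)[: pop_size // 2]]        # truncation selection
        children = elite[rng.integers(0, len(elite), pop_size - len(elite))].copy()
        mates = elite[rng.integers(0, len(elite), len(children))]
        cut = rng.integers(1, n_ctrl, size=len(children))
        for k, c in enumerate(cut):                          # one-point crossover
            children[k, c:] = mates[k, c:]
        flip = rng.random(children.shape) < p_mut            # bit-flip mutation
        children[flip] ^= 1
        pop = np.vstack([elite, children])
    err = np.array([rmse(ch) for ch in pop])
    best = int(np.argmin(err))
    return pop[best], err[best]

mask, best_rmse = ga_select()
print(f"{int(mask.sum())} control pairs selected, check-point RMSE = {best_rmse:.3f}")
```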
Pinter, Stephen Z; Kim, Dae-Ro; Hague, M Nicole; Chambers, Ann F; MacDonald, Ian C; Lacefield, James C
2014-08-01
Flow quantification with high-frequency (>20 MHz) power Doppler ultrasound can be performed objectively using the wall-filter selection curve (WFSC) method to select the cutoff velocity that yields a best-estimate color pixel density (CPD). An in vivo video microscopy system (IVVM) is combined with high-frequency power Doppler ultrasound to provide a method for validation of CPD measurements based on WFSCs in mouse testicular vessels. The ultrasound and IVVM systems are instrumented so that the mouse remains on the same imaging platform when switching between the two modalities. In vivo video microscopy provides gold-standard measurements of vascular diameter to validate power Doppler CPD estimates. Measurements in four image planes from three mice exhibit wide variation in the optimal cutoff velocity and indicate that a predetermined cutoff velocity setting can introduce significant errors in studies intended to quantify vascularity. Consistent with previously published flow-phantom data, in vivo WFSCs exhibited three characteristic regions and detectable plateaus. Selection of a cutoff velocity at the right end of the plateau yielded a CPD close to the gold-standard vascular volume fraction estimated using IVVM. An investigator can implement the WFSC method to help adapt cutoff velocity to current blood flow conditions and thereby improve the accuracy of power Doppler for quantitative microvascular imaging. Copyright © 2014 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.
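The key step above, picking the cutoff velocity at the right end of the plateau of the wall-filter selection curve, can be illustrated on a synthetic CPD-versus-cutoff curve. The curve shape and tolerances below are invented for illustration; the three regions (clutter suppression, plateau, signal loss) follow the description in the abstract.

```python
import numpy as np

# Synthetic wall-filter selection curve: colour pixel density (CPD) versus
# cutoff velocity, with a clutter region, a plateau, and a signal-loss region.
cutoff = np.linspace(0.5, 20, 40)                     # illustrative units
cpd = np.where(cutoff < 5, 60 - 6 * cutoff,           # clutter suppressed quickly
      np.where(cutoff < 12, 30 - 0.2 * (cutoff - 5),  # plateau: true vascular signal
               28.6 - 3 * (cutoff - 12)))             # flow signal being removed

def plateau_right_end(v, d, slope_tol=0.5):
    """Return the cutoff at the right end of the longest near-flat run."""
    slopes = np.abs(np.gradient(d, v))
    flat = slopes < slope_tol
    best_len, best_end, run = 0, 0, 0
    for i, f in enumerate(flat):
        run = run + 1 if f else 0
        if run > best_len:
            best_len, best_end = run, i   # track the end of the longest flat run
    return v[best_end]

v_star = plateau_right_end(cutoff, cpd)
print(f"selected cutoff velocity: {v_star:.2f}")   # right end of the plateau
```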
A Primer for Model Selection: The Decisive Role of Model Complexity
NASA Astrophysics Data System (ADS)
Höge, Marvin; Wöhling, Thomas; Nowak, Wolfgang
2018-03-01
Selecting a "best" model among several competing candidate models poses an often encountered problem in water resources modeling (and other disciplines which employ models). For a modeler, the best model fulfills a certain purpose best (e.g., flood prediction), which is typically assessed by comparing model simulations to data (e.g., stream flow). Model selection methods find the "best" trade-off between good fit with data and model complexity. In this context, the interpretations of model complexity implied by different model selection methods are crucial, because they represent different underlying goals of modeling. Over the last decades, numerous model selection criteria have been proposed, but modelers who primarily want to apply a model selection criterion often face a lack of guidance for choosing the right criterion that matches their goal. We propose a classification scheme for model selection criteria that helps to find the right criterion for a specific goal, i.e., which employs the correct complexity interpretation. We identify four model selection classes which seek to achieve high predictive density, low predictive error, high model probability, or shortest compression of data. These goals can be achieved by following either nonconsistent or consistent model selection and by either incorporating a Bayesian parameter prior or not. We allocate commonly used criteria to these four classes, analyze how they represent model complexity and what this means for the model selection task. Finally, we provide guidance on choosing the right type of criteria for specific model selection tasks. (A quick guide through all key points is given at the end of the introduction.)
How To Teach "Dirty" Books in High School.
ERIC Educational Resources Information Center
O'Malley, William J.
1967-01-01
Today's self-centered, utopian attitudes toward sexual experience compel teachers to avoid both overcaution and over-indulgence in selecting controversial books for classroom use. One method of selection is to rank books in a gradual progression from those requiring little literary and sexual sophistication in the reader to those requiring much…
Well-characterized and standardized methods are the foundation upon which monitoring of regulated and unregulated contaminants in drinking water are based. To obtain reliable, high quality data for trace analysis of contaminants, these methods must be rugged, selective and sensit...
Variable-mesh method of solving differential equations
NASA Technical Reports Server (NTRS)
Van Wyk, R.
1969-01-01
Multistep predictor-corrector method for numerical solution of ordinary differential equations retains high local accuracy and convergence properties. In addition, the method was developed in a form conducive to the generation of effective criteria for the selection of subsequent step sizes in step-by-step solution of differential equations.
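A one-step cousin of the scheme above illustrates the predictor-corrector idea with step-size selection. This is a deliberately simplified sketch, not the report's multistep method: an Euler predictor, a trapezoidal corrector, and the predictor-corrector discrepancy as the local-error proxy that drives step halving and doubling.

```python
import math

def pc_solve(f, t0, y0, t_end, h0=0.1, tol=1e-5):
    """Heun-type predictor-corrector with discrepancy-based step selection."""
    t, y, h = t0, y0, h0
    while t < t_end:
        h = min(h, t_end - t)
        y_pred = y + h * f(t, y)                             # explicit Euler predictor
        y_corr = y + 0.5 * h * (f(t, y) + f(t + h, y_pred))  # trapezoidal corrector
        err = abs(y_corr - y_pred) / 2                       # local-error proxy
        if err > tol and h > 1e-6:
            h *= 0.5                                         # reject: halve the step
            continue
        t, y = t + h, y_corr                                 # accept the step
        if err < tol / 4:
            h *= 2.0                                         # error small: grow the step
    return y

# y' = -y, y(0) = 1  ->  y(1) = e^{-1}
approx = pc_solve(lambda t, y: -y, 0.0, 1.0, 1.0)
print(approx, math.exp(-1))
```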
Resampling method for applying density-dependent habitat selection theory to wildlife surveys.
Tardy, Olivia; Massé, Ariane; Pelletier, Fanie; Fortin, Daniel
2015-01-01
Isodar theory can be used to evaluate fitness consequences of density-dependent habitat selection by animals. A typical habitat isodar is a regression curve plotting competitor densities in two adjacent habitats when individual fitness is equal. Despite the increasing use of habitat isodars, their application remains largely limited to areas composed of pairs of adjacent habitats that are defined a priori. We developed a resampling method that uses data from wildlife surveys to build isodars in heterogeneous landscapes without having to predefine habitat types. The method consists in randomly placing blocks over the survey area and dividing those blocks in two adjacent sub-blocks of the same size. Animal abundance is then estimated within the two sub-blocks. This process is done 100 times. Different functional forms of isodars can be investigated by relating animal abundance and differences in habitat features between sub-blocks. We applied this method to abundance data of raccoons and striped skunks, two of the main hosts of rabies virus in North America. Habitat selection by raccoons and striped skunks depended on both conspecific abundance and the difference in landscape composition and structure between sub-blocks. When conspecific abundance was low, raccoons and striped skunks favored areas with relatively high proportions of forests and anthropogenic features, respectively. Under high conspecific abundance, however, both species preferred areas with rather large corn-forest edge densities and corn field proportions. Based on random sampling techniques, we provide a robust method that is applicable to a broad range of species, including medium- to large-sized mammals with high mobility. The method is sufficiently flexible to incorporate multiple environmental covariates that can reflect key requirements of the focal species. 
We thus illustrate how isodar theory can be used with wildlife surveys to assess density-dependent habitat selection over large geographic extents.
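The block-placement resampling step described above can be sketched on a synthetic landscape. Everything here is an assumption made for illustration: a forest-cover map, animals placed proportionally to cover, and a simple slope of abundance difference on habitat difference standing in for a fitted isodar (density dependence itself is left out to keep the sketch minimal).

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic landscape: a forest-proportion map and animal locations whose
# density increases with forest cover (a stand-in for the survey data).
size = 100
forest = np.clip(rng.normal(0.5, 0.2, size=(size, size)), 0, 1)
n_animals = 4000
cells = rng.choice(size * size, size=n_animals, p=forest.ravel() / forest.sum())
ay, ax = np.divmod(cells, size)

def paired_subblocks(block=20, n_rep=200):
    """Randomly place blocks, split each into two adjacent sub-blocks, and
    record (abundance, habitat) for both halves, as in the resampling step.
    (The paper uses 100 placements; more are used here for a stabler toy fit.)"""
    rows = []
    for _ in range(n_rep):
        r = rng.integers(0, size - block)
        c = rng.integers(0, size - 2 * block)
        for half in (0, 1):
            c0 = c + half * block
            inside = ((ay >= r) & (ay < r + block) &
                      (ax >= c0) & (ax < c0 + block))
            rows.append((inside.sum(),
                         forest[r:r + block, c0:c0 + block].mean()))
    return np.array(rows, dtype=float).reshape(n_rep, 2, 2)

pairs = paired_subblocks()
dN = pairs[:, 0, 0] - pairs[:, 1, 0]   # abundance difference between halves
dH = pairs[:, 0, 1] - pairs[:, 1, 1]   # forest-proportion difference
slope = np.polyfit(dH, dN, 1)[0]
print(f"abundance difference per unit forest difference: {slope:.1f}")
```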
Nouri, Dorra; Lucas, Yves; Treuillet, Sylvie
2016-12-01
Hyperspectral imaging is an emerging technology recently introduced in medical applications, as it provides a powerful tool for noninvasive tissue characterization. In this context, a new system was designed to be easily integrated in the operating room in order to detect anatomical tissues hardly noticed by the surgeon's naked eye. Our LCTF-based spectral imaging system operates over the visible, near- and middle-infrared spectral ranges (400-1700 nm). It is dedicated to enhancing critical biological tissues such as the ureter and the facial nerve. We aim to find the three most relevant bands to create an RGB image to display during the intervention with maximal contrast between the target tissue and its surroundings. A comparative study is carried out between band selection methods and band transformation methods, and combined band selection methods are proposed. All methods are compared using different evaluation criteria. Experimental results show that the proposed combined band selection methods provide the best performance, with rich information, high tissue separability, and short computational time. These methods yield significant discrimination between biological tissues. We developed a hyperspectral imaging system to enhance the visualization of certain biological tissues. The proposed methods provided an acceptable trade-off between the evaluation criteria, especially in the SWIR spectral band, which outperforms the naked eye's capacities.
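A minimal version of the band-selection task above is an exhaustive search over band triples scored by class separability. The data, band count, and Fisher-style criterion are all toy assumptions; the paper's actual criteria and spectral ranges differ.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(4)

# Toy hyperspectral data: 8 bands, two tissue classes that differ only in
# bands 2, 5, and 6; the goal is the most discriminative 3-band subset.
n_bands, n_pix = 8, 500
target = rng.normal(0.0, 1.0, size=(n_pix, n_bands))
target[:, [2, 5, 6]] += 2.0               # target tissue reflects more here
background = rng.normal(0.0, 1.0, size=(n_pix, n_bands))

def separability(bands):
    """Fisher-style ratio between the two classes on the chosen bands."""
    a, b = target[:, list(bands)], background[:, list(bands)]
    num = np.sum((a.mean(0) - b.mean(0)) ** 2)   # between-class scatter
    den = np.sum(a.var(0) + b.var(0))            # within-class scatter
    return num / den

best = max(combinations(range(n_bands), 3), key=separability)
print("selected bands for the RGB display:", best)
```

Exhaustive search is feasible only for small band counts; for hundreds of LCTF bands one would switch to the greedy or combined strategies the paper compares.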
Conditional screening for ultra-high dimensional covariates with survival outcomes
Hong, Hyokyoung G.; Li, Yi
2017-01-01
Identifying important biomarkers that are predictive of cancer patients' prognosis is key to gaining better insights into the biological influences on the disease and has become a critical component of precision medicine. The emergence of large-scale biomedical survival studies, which typically involve an excessive number of biomarkers, has created a high demand for efficient screening tools for selecting predictive biomarkers. The vast number of biomarkers defies any existing variable selection method via regularization. The recently developed variable screening methods, though powerful in many practical settings, fail to incorporate prior information on the importance of each biomarker and are less powerful in detecting marginally weak yet jointly important signals. We propose a new conditional screening method for survival outcome data that computes the marginal contribution of each biomarker given a priori known biological information. This is based on the premise that some biomarkers are known to be associated with disease outcomes a priori. Our method possesses sure screening properties and a vanishing false selection rate. The utility of the proposal is further confirmed with extensive simulation studies and analysis of a diffuse large B-cell lymphoma dataset. We are pleased to dedicate this work to Jack Kalbfleisch, who has made instrumental contributions to the development of modern methods of analyzing survival data. PMID:27933468
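The core idea, ranking each candidate by its contribution conditional on a priori known covariates, can be sketched in a simplified setting. Major assumptions: a continuous outcome replaces the survival time (no censoring, no Cox model), and the conditional utility is the R-squared gain from adding one candidate to the known set. The toy data include a marginally weak but jointly important signal, the case the abstract highlights.

```python
import numpy as np

rng = np.random.default_rng(5)

# X0 is known a priori to matter; X1 is marginally weak (it nearly cancels
# X0 in the outcome) but jointly important; X2 is a moderate signal.
n, p = 300, 50
X = rng.normal(size=(n, p))
X[:, 1] = -0.9 * X[:, 0] + 0.44 * rng.normal(size=n)   # corr(X0, X1) ~ -0.9
y = X[:, 0] + X[:, 1] + 0.5 * X[:, 2] + rng.normal(0, 0.5, n)
known = [0]                                            # prior information

def conditional_utility(j):
    """R^2 gain from adding X_j to a model holding the known covariates."""
    base = np.column_stack([np.ones(n), X[:, known]])
    full = np.column_stack([base, X[:, j]])
    rss = lambda M: np.sum((y - M @ np.linalg.lstsq(M, y, rcond=None)[0]) ** 2)
    return (rss(base) - rss(full)) / np.sum((y - y.mean()) ** 2)

scores = np.array([conditional_utility(j) if j not in known else -np.inf
                   for j in range(p)])
top = np.argsort(scores)[::-1][:5]
print("top conditional candidates:", top)   # X1 surfaces despite weak marginal signal
```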
Alternative evaluation metrics for risk adjustment methods.
Park, Sungchul; Basu, Anirban
2018-06-01
Risk adjustment is instituted to counter risk selection by accurately equating payments with expected expenditures. Traditional risk-adjustment methods are designed to estimate accurate payments at the group level. However, this generates residual risks at the individual level, especially for high-expenditure individuals, thereby inducing health plans to avoid those with high residual risks. To identify an optimal risk-adjustment method, we perform a comprehensive comparison of prediction accuracies at the group level, at the tail distributions, and at the individual level across 19 estimators: 9 parametric regression, 7 machine learning, and 3 distributional estimators. Using the 2013-2014 MarketScan database, we find that no one estimator performs best in all prediction accuracies. Generally, machine learning and distribution-based estimators achieve higher group-level prediction accuracy than parametric regression estimators. However, parametric regression estimators show higher tail distribution prediction accuracy and individual-level prediction accuracy, especially at the tails of the distribution. This suggests that there is a trade-off in selecting an appropriate risk-adjustment method between estimating accurate payments at the group level and lower residual risks at the individual level. Our results indicate that an optimal method cannot be determined solely on the basis of statistical metrics but rather needs to account for simulating plans' risk selective behaviors. Copyright © 2018 John Wiley & Sons, Ltd.
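The group-versus-individual (and tail) tension described above can be demonstrated with two simple estimators on simulated skewed expenditures. This is a sketch under invented assumptions: a single risk score, log-normal spending, raw-scale OLS as the group-level workhorse, and log-scale OLS with Duan's smearing retransformation as a stand-in for the distributional estimators.

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy expenditure data: a risk score drives spending with a heavy right
# tail (log-normal errors), mimicking skewed health expenditures.
n = 5000
risk = rng.normal(size=n)
spend = np.exp(1.0 + 0.8 * risk + rng.normal(0, 1.0, n))

X = np.column_stack([np.ones(n), risk])
# Estimator A: OLS on the raw dollar scale.
beta = np.linalg.lstsq(X, spend, rcond=None)[0]
pred_ols = X @ beta
# Estimator B: OLS on log dollars with a smearing retransformation.
gamma = np.linalg.lstsq(X, np.log(spend), rcond=None)[0]
smear = np.mean(np.exp(np.log(spend) - X @ gamma))   # Duan's smearing factor
pred_log = np.exp(X @ gamma) * smear

def metrics(pred):
    group = abs(pred.mean() - spend.mean()) / spend.mean()  # group-level bias
    tail = np.argsort(spend)[-n // 100:]                    # top 1% spenders
    tail_ratio = pred[tail].sum() / spend[tail].sum()       # tail predictive ratio
    mae = np.mean(np.abs(pred - spend))                     # individual level
    return group, tail_ratio, mae

g_ols, t_ols, m_ols = metrics(pred_ols)
g_log, t_log, m_log = metrics(pred_log)
print(f"raw OLS: group bias {g_ols:.4f}, top-1% ratio {t_ols:.2f}, MAE {m_ols:.1f}")
print(f"log OLS: group bias {g_log:.4f}, top-1% ratio {t_log:.2f}, MAE {m_log:.1f}")
```

Raw OLS is unbiased at the group mean by construction yet badly underpredicts the top 1%, which is precisely the residual risk the paper argues invites risk selection.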
On the exact solutions of high order wave equations of KdV type (I)
NASA Astrophysics Data System (ADS)
Bulut, Hasan; Pandir, Yusuf; Baskonus, Haci Mehmet
2014-12-01
In this paper, by means of a proper transformation and symbolic computation, we study high order wave equations of KdV type (I). We obtain a classification of exact solutions, containing soliton, rational, trigonometric, and elliptic function solutions, by using the extended trial equation method. The motivation of this paper is thus to utilize the extended trial equation method to explore new solutions of the high order wave equation of KdV type (I). The method is confirmed by applying it to selected nonlinear equations of this kind.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sabyrov, Kairat; Musselwhite, Nathan; Melaet, Gérôme
As the impact of acids on catalytically driven chemical transformations is tremendous, fundamental understanding of catalytically relevant factors is essential for the design of more efficient solid acid catalysts. In this work, we employed a post-synthetic doping method to synthesize a highly selective hydroisomerization catalyst and to demonstrate the effect of acid strength and density, catalyst microstructure, and platinum nanoparticle size on the reaction rate and selectivity. Aluminum doped mesoporous silica catalyzed gas-phase n-hexadecane isomerization with remarkably high selectivity to monobranched isomers (~95%), producing a substantially higher amount of isomers than traditional zeolite catalysts. Mildly acidic sites generated by post-synthetic aluminum grafting were found to be the main reason for its high selectivity. The flexibility of the post-synthetic doping method enabled us to systematically explore the effect of the acid site density on the reaction rate and selectivity, which has been extremely difficult to achieve with zeolite catalysts. We found that a higher density of Brønsted acid sites leads to higher cracking of n-hexadecane presumably due to an increased surface residence time. Furthermore, regardless of pore size and microstructure, hydroisomerization turnover frequency linearly increased as a function of Brønsted acid site density. In addition to strength and density of acid sites, platinum nanoparticle size affected catalytic activity and selectivity. The smallest platinum nanoparticles produced the most effective bifunctional catalyst presumably because of higher percolation into aluminum doped mesoporous silica, generating more 'intimate' metallic and acidic sites. Finally, the aluminum doped silica catalyst was shown to retain its remarkable selectivity towards isomers even at increased reaction conversions.
2017-01-01
An Ensemble Successive Project Algorithm for Liquor Detection Using Near Infrared Sensor.
Qu, Fangfang; Ren, Dong; Wang, Jihua; Zhang, Zhong; Lu, Na; Meng, Lei
2016-01-11
Spectral analysis based on near infrared (NIR) sensors is a powerful tool for complex information processing and high precision recognition, and it has been widely applied to quality analysis and online inspection of agricultural products. This paper proposes a new method to address the instability of the successive projections algorithm (SPA) with small sample sizes, as well as the lack of association between the selected variables and the analyte. The proposed method is a bootstrap ensemble SPA method (EBSPA) based on a variable evaluation index (EI) for variable selection, and it is applied to the quantitative prediction of alcohol concentration in liquor using a NIR sensor. In the experiment, the proposed EBSPA is tested with three kinds of modeling methods to assess their performance. In addition, the proposed EBSPA combined with partial least squares is compared with other state-of-the-art variable selection methods. The results show that the proposed method overcomes the defects of SPA and has the best generalization performance and stability. Furthermore, the physical meaning of the variables selected from the near infrared sensor data is clear, and the method effectively reduces the number of variables while improving prediction accuracy.
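The classic SPA step that EBSPA builds on is a greedy search for minimally collinear columns. The sketch below shows plain SPA on invented smooth "spectra" (no bootstrap ensemble or evaluation index): at each step it keeps the variable with the largest norm after projecting out those already selected.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy "spectra": 60 collinear wavelength variables built from 3 smooth
# latent bands, so adjacent variables are nearly redundant.
n_samples, n_vars = 40, 60
centers = np.array([10, 30, 50])
mix = np.exp(-0.5 * ((np.arange(n_vars)[None, :] - centers[:, None]) / 8.0) ** 2)
latent = rng.normal(size=(n_samples, 3))
Xcal = latent @ mix + 0.05 * rng.normal(size=(n_samples, n_vars))

def spa(X, k, start=0):
    """Classic successive projections: greedily add the column with the
    largest norm after projecting out the columns already chosen."""
    P = X.copy()
    selected = [start]
    for _ in range(k - 1):
        v = P[:, selected[-1]]
        P = P - np.outer(v, v @ P) / (v @ v)   # orthogonal-complement projection
        norms = np.linalg.norm(P, axis=0)
        norms[selected] = -1.0                 # never reselect a column
        selected.append(int(np.argmax(norms)))
    return selected

subset = spa(Xcal, k=3)
cond_spa = np.linalg.cond(Xcal[:, subset])
cond_adj = np.linalg.cond(Xcal[:, [0, 1, 2]])   # three adjacent, collinear bands
print("SPA-selected variables:", subset)
print(f"cond(SPA subset) = {cond_spa:.1f} vs cond(adjacent) = {cond_adj:.1f}")
```

The much smaller condition number of the SPA subset is the point of the algorithm; EBSPA then stabilizes this greedy choice by bootstrapping and scoring candidates with the evaluation index.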
Fast Video Encryption Using the H.264 Error Propagation Property for Smart Mobile Devices
Chung, Yongwha; Lee, Sungju; Jeon, Taewoong; Park, Daihee
2015-01-01
In transmitting video data securely over Video Sensor Networks (VSNs), since mobile handheld devices have limited resources in terms of processor clock speed and battery size, it is necessary to develop an efficient method to encrypt video data to meet the increasing demand for secure connections. Selective encryption methods can reduce the amount of computation needed while satisfying high-level security requirements. This is achieved by selecting an important part of the video data and encrypting it. In this paper, to ensure format compliance and security, we propose a special encryption method for H.264, which encrypts only the DC/ACs of I-macroblocks and the motion vectors of P-macroblocks. In particular, the proposed new selective encryption method exploits the error propagation property in an H.264 decoder and improves the collective performance by analyzing the tradeoff between the visual security level and the processing speed compared to typical selective encryption methods (i.e., I-frame, P-frame encryption, and combined I-/P-frame encryption). Experimental results show that the proposed method can significantly reduce the encryption workload without any significant degradation of visual security. PMID:25850068
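The principle above, encrypting only the perceptually important byte spans while leaving the rest of the bitstream parseable, can be shown without a real H.264 parser. Everything here is a toy stand-in: an invented "frame" layout, and a SHA-256 counter keystream in place of a proper cipher such as AES-CTR.

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    """Toy SHA-256-counter keystream (stand-in for AES-CTR)."""
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def selective_encrypt(frame: bytes, spans, key: bytes) -> bytes:
    """XOR-encrypt only the given (offset, length) spans, leaving the rest
    of the bitstream untouched so the container stays format-compliant."""
    buf = bytearray(frame)
    for off, length in spans:
        ks = keystream(key + off.to_bytes(4, "big"), length)  # span-unique stream
        for i in range(length):
            buf[off + i] ^= ks[i]
    return bytes(buf)

# Toy "frame": header + coefficient block + trailer; only the coefficient
# span (analogous to I-macroblock DC/ACs) is encrypted.
frame = b"HDR" + bytes(range(16)) + b"TRL"
spans = [(3, 16)]
key = b"demo-key"
enc = selective_encrypt(frame, spans, key)
dec = selective_encrypt(enc, spans, key)   # XOR stream: same call decrypts
print("header/trailer intact:", enc[:3] == b"HDR" and enc[-3:] == b"TRL")
print("round trip ok:", dec == frame)
```

Because only the selected spans are touched, the computational saving scales with the fraction of bytes deemed important, which is the trade-off the paper quantifies against visual security.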
A Study of Quasar Selection in the Supernova Fields of the Dark Energy Survey
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tie, S. S.; Martini, P.; Mudd, D.
In this paper, we present a study of quasar selection using the supernova fields of the Dark Energy Survey (DES). We used a quasar catalog from an overlapping portion of the SDSS Stripe 82 region to quantify the completeness and efficiency of selection methods involving color, probabilistic modeling, variability, and combinations of color/probabilistic modeling with variability. In all cases, we considered only objects that appear as point sources in the DES images. We examine color selection methods based on the Wide-field Infrared Survey Explorer (WISE) mid-IR W1-W2 color, a mixture of WISE and DES colors (g - i and i - W1), and a mixture of Vista Hemisphere Survey and DES colors (g - i and i - K). For probabilistic quasar selection, we used XDQSO, an algorithm that employs an empirical multi-wavelength flux model of quasars to assign quasar probabilities. Our variability selection uses the multi-band χ²-probability that sources are constant in the DES Year 1 griz-band light curves. The completeness and efficiency are calculated relative to an underlying sample of point sources that are detected in the required selection bands and pass our data quality and photometric error cuts. We conduct our analyses at two magnitude limits, i < 19.8 mag and i < 22 mag. For the subset of sources with W1 and W2 detections, the W1-W2 color or XDQSOz method combined with variability gives the highest completenesses of >85% for both i-band magnitude limits and efficiencies of >80% to the bright limit and >60% to the faint limit; however, the giW1 and giW1+variability methods give the highest quasar surface densities. The XDQSOz method and combinations of W1W2/giW1/XDQSOz with variability are among the better selection methods when both high completeness and high efficiency are desired. We also present the OzDES Quasar Catalog of 1263 spectroscopically confirmed quasars from three years of OzDES observation in the 30 deg² of the DES supernova fields. Finally, the catalog includes quasars with redshifts up to z ~ 4 and brighter than i = 22 mag, although the catalog is not complete up to this magnitude limit.
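The variability branch of the selection, a χ²-test of light curves against the constant-source model, together with the completeness/efficiency bookkeeping, can be sketched on simulated photometry. The light-curve parameters, variability amplitude, and the reduced-χ² threshold are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(8)

# Toy light curves: constant stars vs. variable quasars, each observed
# 20 times with 0.02 mag photometric errors.
n_src, n_epoch, sigma = 500, 20, 0.02
is_qso = rng.random(n_src) < 0.2
mags = rng.uniform(18, 20, n_src)[:, None] + rng.normal(0, sigma, (n_src, n_epoch))
mags[is_qso] += rng.normal(0, 0.1, (is_qso.sum(), n_epoch))  # quasar variability

def chi2_constant(m, err):
    """Chi-square of each light curve against its best-fit constant model."""
    mean = m.mean(axis=1, keepdims=True)
    return ((m - mean) ** 2 / err ** 2).sum(axis=1)

chi2 = chi2_constant(mags, sigma)
variable = chi2 / (n_epoch - 1) > 3.0   # reject constancy (illustrative threshold)

completeness = variable[is_qso].mean()  # fraction of true quasars selected
efficiency = is_qso[variable].mean()    # fraction of selected that are quasars
print(f"completeness {completeness:.2f}, efficiency {efficiency:.2f}")
```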
2017-02-15
NASA Astrophysics Data System (ADS)
Nikolić, G. S.; Žerajić, S.; Cakić, M.
2011-10-01
Multivariate calibration is a powerful mathematical tool that can be applied in analytical chemistry when analytical signals are highly overlapped. A method based on partial least squares regression is proposed for the simultaneous spectrophotometric determination of adrenergic vasoconstrictors in a decongestive solution containing two active components: phenylephrine hydrochloride and tramazoline hydrochloride. These sympathomimetic agents are frequently associated in pharmaceutical formulations against the common cold. The proposed method, which is simple and rapid, offers the advantages of sensitivity and a wide range of determinations without the need for extraction of the vasoconstrictors. In order to minimize the number of factors necessary to obtain the calibration matrix by multivariate calibration, different parameters were evaluated. Adequate selection of the spectral regions proved to be important for the number of factors. In order to simultaneously quantify both hydrochlorides among the excipients, the spectral region between 250 and 290 nm was selected. Recoveries for the vasoconstrictors were 98-101%. The developed method was applied to the assay of two decongestive pharmaceutical preparations.
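The PLS calibration idea above, resolving two heavily overlapped spectra without extraction, can be sketched with a minimal NIPALS PLS1 on simulated mixtures. The band shapes, noise level, and concentrations are invented; one PLS1 model is fit per analyte.

```python
import numpy as np

rng = np.random.default_rng(9)

# Toy mixture spectra: two analytes with heavily overlapping Gaussian
# bands (mimicking the 250-290 nm region) plus a little noise.
wl = np.linspace(250, 290, 81)
s1 = np.exp(-0.5 * ((wl - 268) / 6) ** 2)   # analyte 1 pure-component band
s2 = np.exp(-0.5 * ((wl - 273) / 6) ** 2)   # analyte 2 pure-component band
n_cal = 30
C = rng.uniform(0.2, 1.0, size=(n_cal, 2))  # calibration concentrations
A = C @ np.vstack([s1, s2]) + 0.002 * rng.normal(size=(n_cal, wl.size))

def pls1_fit(X, y, n_comp):
    """Minimal NIPALS PLS1: regression vector plus centering terms."""
    Xc, yc = X - X.mean(0), y - y.mean()
    W, P, q = [], [], []
    for _ in range(n_comp):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)              # weight from X-y covariance
        t = Xc @ w                          # scores
        p = Xc.T @ t / (t @ t)              # loadings
        qa = yc @ t / (t @ t)
        Xc = Xc - np.outer(t, p)            # deflate X and y
        yc = yc - qa * t
        W.append(w); P.append(p); q.append(qa)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    B = W @ np.linalg.solve(P.T @ W, q)     # regression vector
    return B, X.mean(0), y.mean()

# one PLS1 model per analyte, then predict a fresh mixture
true_c = np.array([0.7, 0.4])
test_a = true_c @ np.vstack([s1, s2]) + 0.002 * rng.normal(size=wl.size)
preds = []
for k in (0, 1):
    B, xm, ym = pls1_fit(A, C[:, k], n_comp=3)
    preds.append(float((test_a - xm) @ B + ym))
print("true:", true_c, "predicted:", np.round(preds, 3))
```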
Zhang, Honghai -Hai; Bonnesen, Peter V.; Hong, Kunlun
2015-07-13
A facile method for introducing one or more deuterium atoms onto an aromatic nucleus via Br/D exchange with high functional-group tolerance and high incorporation efficiency is disclosed. Deuterium-labeled aryl chlorides and aryl borates, which could be used as substrates in cross-coupling reactions to construct more complicated deuterium-labeled compounds, can also be synthesized by this method.
Laser-induced selective copper plating of polypropylene surface
NASA Astrophysics Data System (ADS)
Ratautas, K.; Gedvilas, M.; Stankevičienė, I.; Jagminienė, A.; Norkus, E.; Li Pira, N.; Sinopoli, S.; Emanuele, U.; Račiukaitis, G.
2016-03-01
Laser writing for selective plating of electro-conductive lines for electronics has several significant advantages over conventional printed circuit board technology. Firstly, this method is faster and cheaper at the prototyping stage. Secondly, material consumption is reduced, because it works selectively. However, the biggest merit of this method is its potential to produce moulded interconnect devices, enabling the creation of electronics on complex 3D surfaces and thus saving space, materials and production costs. There are two basic techniques of laser writing for selective plating on plastics: laser-induced selective activation (LISA) and laser direct structuring (LDS). In the LISA method, pure plastics without any dopant (filler) can be used. In the LDS method, special fillers are mixed into the polymer matrix. These fillers are activated during the laser writing process, and, in the next processing step, the laser-modified area can be selectively plated with metals. In this work, both methods of laser writing for the selective plating of polymers were investigated and compared. For the LDS approach, a new material, polypropylene with carbon-based additives, was tested using picosecond and nanosecond laser pulses. Different laser processing parameters (laser pulse energy, scanning speed, number of scans, pulse duration, wavelength and overlap of scanned lines) were applied in order to find the optimal activation regime. Areal selectivity tests showed a high plating resolution: the narrowest copper-plated line was less than 23 μm wide. Finally, our material was applied to a prototype electronic circuit board on a 2D surface.
Automatic peak selection by a Benjamini-Hochberg-based algorithm.
Abbas, Ahmed; Kong, Xin-Bing; Liu, Zhi; Jing, Bing-Yi; Gao, Xin
2013-01-01
A common issue in bioinformatics is that computational methods often generate a large number of predictions sorted according to certain confidence scores. A key problem is then determining how many predictions must be selected to include most of the true predictions while maintaining reasonably high precision. In nuclear magnetic resonance (NMR)-based protein structure determination, for instance, computational peak picking methods are becoming more and more common, although expert knowledge remains the method of choice for determining how many peaks among thousands of candidates should be taken into consideration to capture the true peaks. Here, we propose a Benjamini-Hochberg (B-H)-based approach that automatically selects the number of peaks. We formulate the peak selection problem as a multiple testing problem. Given a candidate peak list sorted by either volumes or intensities, we first convert the peaks into p-values and then apply the B-H-based algorithm to automatically select the number of peaks. The proposed approach is tested on the state-of-the-art peak picking methods, including WaVPeak [1] and PICKY [2]. Compared with the traditional fixed-number-based approach, our approach returns significantly more true peaks. For instance, by combining WaVPeak or PICKY with the proposed method, the missing peak rates are on average reduced by 20% and 26%, respectively, in a benchmark set of 32 spectra extracted from eight proteins. The consensus of the B-H-selected peaks from both WaVPeak and PICKY achieves 88% recall and 83% precision, which significantly outperforms each individual method and the consensus method without the B-H algorithm. The proposed method can be used as a standard procedure for any peak picking method and straightforwardly applied to other prediction selection problems in bioinformatics.
The source code, documentation and example data of the proposed method is available at http://sfb.kaust.edu.sa/pages/software.aspx.
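The selection step itself is the standard Benjamini-Hochberg procedure applied to the candidate list: take the largest k such that the k-th smallest p-value is at most (k/m)·α. A self-contained sketch, with a made-up toy p-value list rather than real NMR peak data:

```python
def benjamini_hochberg_select(pvalues, alpha=0.05):
    """Return indices of candidates accepted by the B-H procedure.

    pvalues: p-values of candidate peaks (e.g. converted from peak
    volumes or intensities), in any order. Finds the largest k such
    that the k-th smallest p-value is <= (k/m) * alpha, and returns
    the indices of those k candidates.
    """
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])  # ascending p-value
    k = 0
    for rank, idx in enumerate(order, start=1):
        if pvalues[idx] <= rank / m * alpha:
            k = rank                                    # keep the largest passing rank
    return order[:k]

# Toy list: five convincing peaks followed by five noise candidates.
pvals = [0.001, 0.002, 0.003, 0.004, 0.005, 0.2, 0.4, 0.6, 0.8, 0.9]
selected = benjamini_hochberg_select(pvals, alpha=0.05)
```

Unlike a fixed top-N cutoff, the number of selected peaks here adapts to the p-value distribution of each spectrum.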
Ringach, Dario L.; Hawken, Michael J.; Shapley, Robert M.
2011-01-01
One of the functions of the cerebral cortex is to increase selectivity for stimulus features. Learning more about the mechanisms of increased cortical selectivity is important for understanding how the cortex works. Up to now, studies in multiple cortical areas have reported that suppressive mechanisms are involved in feature selectivity. However, the magnitude of the contribution of suppression to tuning selectivity has not yet been determined. We use orientation selectivity in macaque primary visual cortex, V1, as an archetypal example of cortical feature selectivity and develop a method to estimate the magnitude of the contribution of suppression to orientation selectivity. The results show that untuned suppression, one form of cortical suppression, decreases the orthogonal-to-preferred response ratio (O/P ratio) of V1 cells from an average of 0.38 to 0.26. Untuned suppression has an especially large effect on orientation selectivity for highly selective cells (O/P < 0.2). Therefore, untuned suppression is crucial for the generation of highly orientation-selective cells in V1. PMID:22049440
High School Voter Registration.
ERIC Educational Resources Information Center
Institute for Political/Legal Education, Sewell, NJ.
Methods for conducting peer voter registration of high school students cover establishing a permanent voter registration committee and identifying and registering eligible students. The permanent voter registration committee, made up of student body representatives, class representatives, and selected teachers, guarantees comprehensive…
NASA Astrophysics Data System (ADS)
Park, Keun; Lee, Sang-Ik
2010-03-01
High-frequency induction is an efficient, non-contact means of heating the surface of an injection mold through electromagnetic induction. Because the procedure allows for the rapid heating and cooling of mold surfaces, it has been recently applied to the injection molding of thin-walled parts or micro/nano-structures. The present study proposes a localized heating method involving the selective use of mold materials to enhance the heating efficiency of high-frequency induction heating. For localized induction heating, a composite injection mold of ferromagnetic material and paramagnetic material is used. The feasibility of the proposed heating method is investigated through numerical analyses in terms of its heating efficiency for localized mold surfaces and in terms of the structural safety of the composite mold. The moldability of high aspect ratio micro-features is then experimentally compared under a variety of induction heating conditions.
Mohamad Hanapi, Nor Suhaila; Sanagi, Mohd Marsin; Ismail, Abd Khamim; Wan Ibrahim, Wan Aini; Saim, Nor'ashikin; Wan Ibrahim, Wan Nazihah
2017-03-01
The aim of this study was to investigate and apply a supported ionic liquid membrane (SILM) in two-phase micro-electrodriven membrane extraction combined with high performance liquid chromatography-ultraviolet detection (HPLC-UV) for pre-concentration and determination of three selected antidepressant drugs in water samples. A thin agarose film impregnated with 1-hexyl-3-methylimidazolium hexafluorophosphate, [C6MIM][PF6], was prepared and used as a supported ionic liquid membrane between the aqueous sample solution and the acceptor phase for extraction of imipramine, amitriptyline and chlorpromazine. Under the optimized extraction conditions, the method provided good linearity in the range of 1.0-1000 μg L⁻¹, good coefficients of determination (r² = 0.9974-0.9992) and low limits of detection (0.1-0.4 μg L⁻¹). The method showed high enrichment factors in the range of 110-150 and high relative recoveries in the ranges of 88.2-111.4% and 90.9-107.0% for river water and tap water samples, respectively, with RSDs of ≤7.6% (n=3). This method was successfully applied to the determination of the drugs in river and tap water samples. It is envisaged that the SILM improved the perm-selectivity by providing a pathway for targeted analytes, which resulted in rapid extraction with a high degree of selectivity and a high enrichment factor. Copyright © 2017 Elsevier B.V. All rights reserved.
Wavelength band selection method for multispectral target detection.
Karlholm, Jörgen; Renhorn, Ingmar
2002-11-10
A framework is proposed for the selection of wavelength bands for multispectral sensors by use of hyperspectral reference data. Using results from detection theory, we derive a cost function that is minimized by a set of spectral bands optimal in terms of detection performance for discrimination between a class of small rare targets and clutter with known spectral distribution. The method may be used, e.g., in the design of multispectral infrared search-and-track and electro-optical missile warning sensors, where a low false-alarm rate and a high detection probability for small targets against a clutter background are of critical importance, but the required high frame rate prevents the use of hyperspectral sensors.
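One hedged way to picture such a band-selection scheme is a greedy search that maximizes a matched-filter-style detection statistic under the clutter covariance. The statistic d = sᵀC⁻¹s and the toy data below are illustrative stand-ins, not the paper's actual cost function or sensor data.

```python
import numpy as np

def greedy_band_selection(target, clutter_mean, clutter_cov, n_bands):
    """Greedily pick spectral bands maximizing d = s^T C^{-1} s, where
    s is the target-minus-clutter signature and C the clutter covariance
    restricted to the chosen bands (a hypothetical stand-in cost)."""
    s = np.asarray(target) - np.asarray(clutter_mean)
    chosen, remaining = [], list(range(len(s)))
    for _ in range(n_bands):
        best, best_score = None, -np.inf
        for b in remaining:
            idx = chosen + [b]
            C = clutter_cov[np.ix_(idx, idx)]
            score = s[idx] @ np.linalg.solve(C, s[idx])  # detection statistic
            if score > best_score:
                best, best_score = b, score
        chosen.append(best)
        remaining.remove(best)
    return sorted(chosen)

# Toy hyperspectral scene: 6 bands, target differs from clutter in bands 1 and 4.
rng = np.random.default_rng(2)
clutter = rng.standard_normal((500, 6))
target_sig = clutter.mean(0).copy()
target_sig[[1, 4]] += 3.0
bands = greedy_band_selection(target_sig, clutter.mean(0), np.cov(clutter.T), 2)
```

The greedy loop recovers exactly the two bands where the target signature departs from the clutter.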
Electron beam enhanced surface modification for making highly resolved structures
Pitts, John R.
1986-01-01
A method for forming high resolution submicron structures on a substrate is provided by direct writing with a submicron electron beam in a partial pressure of a selected gas phase characterized by the ability to dissociate under the beam into a stable gaseous leaving group and a reactant fragment that combines with the substrate material under beam energy to form at least a surface compound. Variations of the method provide semiconductor device regions on doped silicon substrates, interconnect lines between active sites, three dimensional electronic chip structures, electron beam and optical read mass storage devices that may include color differentiated data areas, and resist areas for use with selective etching techniques.
Electron beam enhanced surface modification for making highly resolved structures
Pitts, J.R.
1984-10-10
A method for forming high resolution submicron structures on a substrate is provided by direct writing with a submicron electron beam in a partial pressure of a selected gas phase characterized by the ability to dissociate under the beam into a stable gaseous leaving group and a reactant fragment that combines with the substrate material under beam energy to form at least a surface compound. Variations of the method provide semiconductor device regions on doped silicon substrates, interconnect lines between active sites, three dimensional electronic chip structures, electron beam and optical read mass storage devices that may include color differentiated data areas, and resist areas for use with selective etching techniques.
Ribot, Emeline J.; Wecker, Didier; Trotier, Aurélien J.; Dallaudière, Benjamin; Lefrançois, William; Thiaudière, Eric; Franconi, Jean-Michel; Miraux, Sylvain
2015-01-01
Introduction: The purpose of this paper is to develop an easy method to generate both fat-signal-free and banding-artifact-free 3D balanced Steady State Free Precession (bSSFP) images at high magnetic field. Methods: In order to suppress fat signal and bSSFP banding artifacts, two or four images were acquired with the excitation frequency of the water-selective binomial radiofrequency pulse set on resonance or shifted by a maximum of 3/4TR. Mice and human volunteers were imaged at 7T and 3T, respectively, to perform whole-body and musculoskeletal imaging. "Sum-of-Squares" reconstruction was performed, combined or not with parallel imaging. Results: The frequency selectivity of 1-2-3-2-1 or 1-3-3-1 binomial pulses was preserved after (3/4TR) frequency shifting. Consequently, whole-body small-animal 3D imaging was performed at 7T and enabled visualization of small structures within adipose tissue such as lymph nodes. In parallel, this method allowed 3D musculoskeletal imaging in humans with high spatial resolution at 3T. The combination with parallel imaging allowed the acquisition of knee images with ~500 μm resolution in less than 2 min. In addition, ankles, full head coverage and legs of volunteers were imaged, demonstrating the applicability of the method to large FOVs. Conclusion: This robust method can be applied in small animals and humans at high magnetic fields. The high SNR and tissue contrast obtained in short acquisition times make the bSSFP sequence suitable for several preclinical and clinical applications. PMID:26426849
ERIC Educational Resources Information Center
Price, Jay R.
This study sought information about selective service rejection in Delaware, specifically rejectee characteristics, reasons for rejection, and the high rejection rate in Delaware. The basic design was a modified case study method in which a sample of individual records were examined. Differences between this sample and national samples were tested…
Visual analytics in cheminformatics: user-supervised descriptor selection for QSAR methods.
Martínez, María Jimena; Ponzoni, Ignacio; Díaz, Mónica F; Vazquez, Gustavo E; Soto, Axel J
2015-01-01
The design of QSAR/QSPR models is a challenging problem, where the selection of the most relevant descriptors constitutes a key step of the process. Several feature selection methods that address this step concentrate on statistical associations among descriptors and target properties, whereas chemical knowledge is left out of the analysis. For this reason, the interpretability and generality of the QSAR/QSPR models obtained by these feature selection methods are drastically affected. Therefore, an approach for integrating domain experts' knowledge into the selection process is needed to increase confidence in the final set of descriptors. In this paper we propose a software tool, Visual and Interactive DEscriptor ANalysis (VIDEAN), that combines statistical methods with interactive visualizations for choosing a set of descriptors for predicting a target property. Domain expertise can be added to the feature selection process by means of an interactive visual exploration of the data, aided by statistical tools and metrics based on information theory. Coordinated visual representations capture different relationships and interactions among descriptors, target properties and candidate subsets of descriptors. The competencies of the proposed software were assessed through different scenarios. These scenarios reveal how an expert can use the tool to choose one subset of descriptors from a group of candidate subsets, or to modify existing descriptor subsets and even incorporate new descriptors according to his or her own knowledge of the target property. The reported experiences showed the suitability of our software for selecting sets of descriptors with low cardinality, high interpretability, low redundancy and high statistical performance in a visual exploratory way.
Therefore, it is possible to conclude that the resulting tool allows the integration of a chemist's expertise into the descriptor selection process with low cognitive effort, in contrast to the alternative of an ad-hoc manual analysis of the selected descriptors. Graphical abstract: VIDEAN allows the visual analysis of candidate subsets of descriptors for QSAR/QSPR. In the two panels on the top, users can interactively explore numerical correlations as well as co-occurrences in the candidate subsets through two interactive graphs.
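The kind of statistical metrics such a tool can surface, descriptor-target relevance and descriptor-descriptor redundancy, can be sketched as follows. The descriptors here are synthetic, and VIDEAN's actual metrics and interface are not reproduced; this only shows one plausible relevance/redundancy computation based on mutual information and correlation.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(3)
n = 400
# Hypothetical descriptor matrix: d0 drives the property, d1 nearly
# duplicates d0 (redundant), d2 is pure noise.
d0 = rng.standard_normal(n)
d1 = d0 + 0.05 * rng.standard_normal(n)
d2 = rng.standard_normal(n)
X = np.column_stack([d0, d1, d2])
y = 2.0 * d0 + 0.1 * rng.standard_normal(n)       # target property

# Relevance: mutual information between each descriptor and the target.
mi = mutual_info_regression(X, y, random_state=0)

# Redundancy: absolute pairwise correlation among descriptors.
corr = np.abs(np.corrcoef(X.T))
redundant_pairs = [(i, j) for i in range(3) for j in range(i + 1, 3)
                   if corr[i, j] > 0.9]
```

An expert inspecting these two views would keep one of the redundant pair (d0, d1), drop the noise descriptor d2, and end with a small, interpretable subset.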
NASA Technical Reports Server (NTRS)
Haftka, R. T.; Adelman, H. M.
1984-01-01
Orbiting spacecraft such as large space antennas have to maintain a highly accurate shape to operate satisfactorily. Such structures require active and passive controls to maintain an accurate shape under a variety of disturbances. Methods for the optimum placement of control actuators for correcting static deformations are described. In particular, attention is focused on the case where control locations have to be selected from a large set of available sites, so that integer programming methods are called for. The effectiveness of three heuristic techniques for obtaining a near-optimal site selection is compared. In addition, efficient reanalysis techniques for the rapid assessment of control effectiveness are presented. Two examples are used to demonstrate the methods: a simple beam structure and a 55-m space-truss parabolic antenna.
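A simple greedy heuristic of the kind compared in such studies can be sketched as follows. The influence matrix and disturbance are synthetic, and this is not the paper's specific algorithm: it just illustrates adding, one site at a time, the actuator that most reduces the least-squares residual of the corrected deformation.

```python
import numpy as np

def greedy_actuator_placement(Phi, w, k):
    """Greedy heuristic: pick k actuator sites among the columns of the
    influence matrix Phi (column j = structural deformation per unit
    input at candidate site j). At each step, add the site that most
    reduces the least-squares residual of correcting disturbance w."""
    chosen, remaining = [], list(range(Phi.shape[1]))
    for _ in range(k):
        best, best_res = None, np.inf
        for j in remaining:
            A = Phi[:, chosen + [j]]
            u, *_ = np.linalg.lstsq(A, w, rcond=None)  # optimal control inputs
            res = np.linalg.norm(w - A @ u)            # residual deformation
            if res < best_res:
                best, best_res = j, res
        chosen.append(best)
        remaining.remove(best)
    return chosen, best_res

# Toy structure: 50 DOF, 10 candidate sites; the disturbance is exactly
# correctable by actuators at sites 2 and 7.
rng = np.random.default_rng(4)
Phi = rng.standard_normal((50, 10))
w = Phi[:, [2, 7]] @ np.array([1.0, -0.5])
sites, residual = greedy_actuator_placement(Phi, w, 2)
```

Greedy selection is one of the cheap alternatives to exact integer programming when the candidate set is large; reanalysis tricks in the paper speed up the inner least-squares step.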
Molecularly imprinted polymer for analysis of trace atrazine herbicide in water.
Kueseng, Pamornrat; Noir, Mathieu L; Mattiasson, Bo; Thavarungkul, Panote; Kanatharana, Proespichaya
2009-11-01
A molecularly imprinted polymer (MIP) for atrazine was synthesized by a non-covalent method. The binding capacity of the MIP was 1.00 mg g⁻¹ polymer. The selectivity and recovery were investigated with various pesticides commonly found in the environment, with chemical structures both similar to and different from that of atrazine. The competitive recognition between atrazine and structurally similar compounds was evaluated, and the system provided the highest recovery and selectivity for atrazine, while low recovery and selectivity were obtained for the other compounds. The MIP also gave the highest recovery compared with a non-imprinted polymer (NIP), a commercial C18 sorbent and a granular activated carbon (GAC) sorbent. The method provided high recoveries ranging from 94 to 99% at two spiked levels, with relative standard deviations of less than 2%. The detection limit of the method was 80 ng L⁻¹. This method was successfully applied to the analysis of environmental water samples.
Zhao, Yu-Xiang; Chou, Chien-Hsing
2016-01-01
In this study, a new feature selection algorithm, the neighborhood-relationship feature selection (NRFS) algorithm, is proposed for identifying rat electroencephalogram signals and recognizing Chinese characters. In these two applications, dependent relationships exist between feature vectors and their neighboring feature vectors, and the NRFS algorithm was designed to exploit this structure. Under the NRFS algorithm, unselected feature vectors have a high priority of being added to the feature subset if their neighboring feature vectors have been selected. Conversely, selected feature vectors have a high priority of being eliminated if their neighboring feature vectors are not selected. In the experiments conducted in this study, the NRFS algorithm was compared with two other feature selection algorithms. The experimental results indicated that the NRFS algorithm can extract the crucial frequency bands for identifying rat vigilance states and the crucial character regions for recognizing Chinese characters. PMID:27314346
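A minimal sketch of the neighborhood idea, boosting the priority of candidates adjacent to already-selected features, could look like this. The scoring, the index-adjacency notion of "neighbor", and the bonus weight are invented for illustration; the published NRFS algorithm is more elaborate.

```python
def nrfs_select(scores, n_select, neighbor_bonus=0.5):
    """Toy neighborhood-aware selection: features are added greedily by
    relevance score, with a bonus when an adjacent feature (index +/- 1)
    is already in the subset, so selection favors contiguous
    bands/regions. The bonus weight is a made-up parameter."""
    selected = []
    candidates = set(range(len(scores)))
    while len(selected) < n_select:
        def priority(i):
            bonus = neighbor_bonus * sum(1 for j in (i - 1, i + 1) if j in selected)
            return scores[i] + bonus
        best = max(candidates, key=priority)
        selected.append(best)
        candidates.remove(best)
    return sorted(selected)

# Toy scores: a contiguous informative band around indices 3-5 plus an
# isolated high-scoring spike at index 8.
scores = [0.1, 0.1, 0.2, 0.9, 0.6, 0.55, 0.1, 0.1, 0.65, 0.1]
chosen = nrfs_select(scores, 3)
```

A plain top-N cut on `scores` would pick the isolated index 8 over index 5; the neighbor bonus instead yields the contiguous band 3-5, mirroring the frequency-band and character-region behavior described in the abstract.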
Synthesis of nanosized sodium titanates
Hobbs, David T.; Taylor-Pashow, Kathryn M. L.; Elvington, Mark C.
2015-09-29
Methods directed to the synthesis and peroxide modification of nanosized monosodium titanate are described. The methods include the combination of reactants at low concentration in a solution containing a nonionic surfactant. The nanosized monosodium titanate can exhibit high selectivity for sorbing various metallic ions.
MOLECULAR DIVERSITY OF DRINKING WATER MICROBIAL COMMUNITIES: A PHYLOGENETIC APPROACH
Culture-based methods are traditionally used to determine the microbiological quality of drinking water, even though these methods are highly selective and tend to underestimate the densities and diversity of bacterial populations inhabiting distribution systems. In order to better under...
Analysis and feasibility of asphalt pavement performance-based specifications for WisDOT.
DOT National Transportation Integrated Search
2016-12-25
A literature review of the most recent methods used for effective characterization of asphalt mixtures resulted in selecting a set of test methods for measuring mixture resistance to rutting and moisture damage at high temperature, fatigue cracking at inte...
Kiesewetter, André; Menstell, Peter; Peeck, Lars H; Stein, Andreas
2016-11-01
Rapid development of chromatographic processes relies on effective high-throughput screening (HTS) methods. This article describes the development of pseudo-linear gradient elution for resin selectivity screening using RoboColumns®. It gives guidelines for the implementation of this HTS method on a Tecan Freedom EVO® robotic platform, addressing fundamental aspects of scale down and liquid handling. The creation of a flexible script for buffer preparation and column operation plus efficient data processing provided the basis for this work. Based on the concept of discretization, linear gradient elution was transformed into multistep gradients. The impact of column size, flow rate, multistep gradient design, and fractionation scheme on separation efficiency was systematically investigated, using a ternary model protein mixture. We identified key parameters and defined optimal settings for effective column performance. For proof of concept, we examined the selectivity of several cation exchange resins using various buffer conditions. The final protocol enabled a clear differentiation of resin selectivity on miniature chromatography column (MCC) scale. Distinct differences in separation behavior of individual resins and the influence of buffer conditions could be demonstrated. Results obtained with the robotic platform were representative and consistent with data generated on a conventional chromatography system. A study on antibody monomer/high molecular weight separation comparing MCC and lab scale under higher loading conditions provided evidence of the applicability of the miniaturized approach to practically relevant feedstocks with challenging separation tasks as well as of the predictive quality for larger scale.
A comparison of varying degrees of robotic method complexity with corresponding effort (analysis time and labware consumption) and output quality highlights tradeoffs to select a method appropriate for a given separation challenge or analytical constraints. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:1503-1519, 2016. © 2016 American Institute of Chemical Engineers.
Magiera, Sylwia; Baranowska, Irena; Lautenszleger, Anna
2015-01-01
A simple and rapid ultra-high performance liquid chromatographic (UHPLC) method coupled with an ultraviolet detector (UV) has been developed and validated for the separation and determination of 14 major flavonoids ((±)-catechin, (-)-epicatechin, glycitin, (-)-epicatechin gallate, rutin, quercitrin, hesperidine, neohesperidine, daidzein, glycitein, quercetin, genistein, hesperetin, and biochanin A) in herbal dietary supplements. The flavonoids have been separated on a Chromolith Fast Gradient Monolithic RP-18e column utilizing a mobile phase composed of 0.05% trifluoroacetic acid in water and acetonitrile in gradient elution mode. Under these conditions, flavonoids were separated in a 5 min run. The selectivity of the developed UHPLC-UV method was confirmed by comparison with ultra-high-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) analysis. The validation parameters such as linearity, sensitivity, precision, and accuracy were found to be highly satisfactory. The optimized method was applied to determination of flavonoids in different dietary supplements. Additionally, the developed HPLC-UV method combined with 2,2-diphenyl-1-picrylhydrazyl radical (DPPH) assay was used in the evaluation of antioxidant activity of the selected flavonoids. Copyright © 2014 Elsevier B.V. All rights reserved.
Selecting the most appropriate time points to profile in high-throughput studies
Kleyman, Michael; Sefer, Emre; Nicola, Teodora; Espinoza, Celia; Chhabra, Divya; Hagood, James S; Kaminski, Naftali; Ambalavanan, Namasivayam; Bar-Joseph, Ziv
2017-01-01
Biological systems are increasingly being studied by high throughput profiling of molecular data over time. Determining the set of time points to sample in studies that profile several different types of molecular data is still challenging. Here we present the Time Point Selection (TPS) method that solves this combinatorial problem in a principled and practical way. TPS utilizes expression data from a small set of genes sampled at a high rate. As we show by applying TPS to study mouse lung development, the points selected by TPS can be used to reconstruct an accurate representation for the expression values of the non-selected points. Further, even though the selection is only based on gene expression, these points are also appropriate for representing a much larger set of protein, miRNA and DNA methylation changes over time. TPS can thus serve as a key design strategy for high throughput time series experiments. Supporting website: www.sb.cs.cmu.edu/TPS. DOI: http://dx.doi.org/10.7554/eLife.18541.001 PMID:28124972
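The underlying combinatorial idea, picking the subset of time points from which the full trajectories can best be reconstructed, can be sketched with a greedy loop. Linear interpolation and the toy trajectories are our simplifications; the actual TPS method differs in its reconstruction and search details.

```python
import numpy as np

def select_time_points(t, X, k):
    """Greedy sketch of TPS-style time-point selection: starting from the
    endpoints, repeatedly add the time point whose inclusion most reduces
    the error of linearly interpolating every gene's trajectory from the
    selected points back onto the full grid. Requires k >= 3."""
    chosen = [0, len(t) - 1]                       # endpoints are always kept
    while len(chosen) < k:
        best, best_err = None, np.inf
        for c in range(len(t)):
            if c in chosen:
                continue
            idx = sorted(chosen + [c])
            # Reconstruct each gene from the candidate subset of points.
            recon = np.vstack([np.interp(t, t[idx], x[idx]) for x in X])
            err = np.mean((recon - X) ** 2)
            if err < best_err:
                best, best_err = c, err
        chosen.append(best)
    return sorted(chosen), best_err

# Toy trajectories sampled densely: three genes sharing a bend at t = 0.3.
t = np.linspace(0, 1, 21)
X = np.vstack([np.minimum(t / 0.3, 1.0) * a for a in (1.0, 2.0, -1.5)])
points, err = select_time_points(t, X, 3)
```

Because all three trajectories bend at the same time, the greedy loop spends its one free point exactly there (index 6, t = 0.3), which is the behavior the abstract describes: a few well-chosen points represent many trajectories at once.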
High-Dimensional Heteroscedastic Regression with an Application to eQTL Data Analysis
Daye, Z. John; Chen, Jinbo; Li, Hongzhe
2011-01-01
Summary: We consider the problem of high-dimensional regression under non-constant error variances. Despite being a common phenomenon in biological applications, heteroscedasticity has, so far, been largely ignored in high-dimensional analysis of genomic data sets. We propose a new methodology that allows non-constant error variances for high-dimensional estimation and model selection. Our method incorporates heteroscedasticity by simultaneously modeling both the mean and variance components via a novel doubly regularized approach. Extensive Monte Carlo simulations indicate that our proposed procedure can result in better estimation and variable selection than existing methods when heteroscedasticity arises from the presence of predictors explaining error variances and from outliers. Further, we demonstrate the presence of heteroscedasticity in, and apply our method to, an expression quantitative trait loci (eQTL) study of 112 yeast segregants. The new procedure can automatically account for heteroscedasticity in identifying the eQTLs that are associated with gene expression variations and leads to smaller prediction errors. These results demonstrate the importance of considering heteroscedasticity in eQTL data analysis. PMID:22547833
Falasca, Sara; Petruzziello, Filomena; Kretz, Robert; Rainer, Gregor; Zhang, Xiaozhe
2012-06-08
Endogenous quaternary ammonium compounds are involved in various physiological processes in the central nervous system. In the present study, eleven quaternary ammonium compounds, including acetylcholine, choline, carnitine, acetylcarnitine and seven other acylcarnitines of low polarity, were analyzed from brain extracts using a two-dimensional capillary liquid chromatography-Fourier transform mass spectrometry method. To deal with their large differences in hydrophobicity, tandem coupling of reversed-phase and hydrophilic interaction chromatography columns was used to separate all the targeted quaternary ammonium compounds. Using high-accuracy mass spectrometry in selected ion monitoring mode, all the compounds could be detected from each brain sample with high selectivity. The developed method was applied for the relative quantification of these quaternary ammonium compounds in three different brain regions of tree shrews: prefrontal cortex, striatum, and hippocampus. The comparative analysis showed that quaternary ammonium compounds were differentially distributed across the three brain areas. The analytical method proved to be highly sensitive and reliable for simultaneous determination of all the targeted analytes from brain samples. Copyright © 2012 Elsevier B.V. All rights reserved.
Zhang, XiaoQing; Feng, Ye; Yao, QiongQiong; He, Fengjiao
2017-12-15
A rapid and accurate detection method for Mycobacterium tuberculosis (M. tuberculosis) is essential for effectively treating tuberculosis. However, current detection methods cannot meet these clinical requirements because they are slow or of low specificity. Consequently, a new highly specific ssDNA aptamer against the M. tuberculosis reference strain H37Rv was selected using the whole-cell systematic evolution of ligands by exponential enrichment technique. The selected aptamer was used to construct a fast and highly specific H37Rv sensor. The probe was produced by immobilizing the thiol-modified aptamer on an Au interdigital electrode (Au-IDE) of a multichannel series piezoelectric quartz crystal (MSPQC) through Au-S bonding, and then single-walled carbon nanotubes (SWCNTs) were bound to the aptamer by π-π stacking. SWCNTs were used as a signal indicator because of their considerable difference in conductivity compared with H37Rv. When H37Rv is present, it displaces the SWCNTs because it binds to the aptamer much more strongly than the SWCNTs do. The replacement of SWCNTs by H37Rv results in a large change in electrical properties, which is detected by the MSPQC. The proposed sensor is highly selective and can distinguish H37Rv from Mycobacterium smegmatis (M. smegmatis) and the Bacillus Calmette-Guerin vaccine (BCG). The detection time was 70 min and the detection limit was 100 cfu/mL. Compared with conventional methods, this new SWCNT/aptamer/Au-IDE MSPQC H37Rv sensor is specific, rapid, and sensitive, and it holds great potential for the early detection of H37Rv in clinical diagnosis. Copyright © 2017 Elsevier B.V. All rights reserved.
Proactive AP Selection Method Considering the Radio Interference Environment
NASA Astrophysics Data System (ADS)
Taenaka, Yuzo; Kashihara, Shigeru; Tsukamoto, Kazuya; Yamaguchi, Suguru; Oie, Yuji
In the near future, wireless local area networks (WLANs) will overlap to provide continuous coverage over a wide area. In such ubiquitous WLANs, a mobile node (MN) moving freely between multiple access points (APs) requires not only permanent access to the Internet but also continuous communication quality during handover. In order to satisfy these requirements, an MN needs to (1) select an AP with better performance and (2) execute a handover seamlessly. To satisfy requirement (2), we proposed a seamless handover method in a previous study. Moreover, in order to achieve (1), the Received Signal Strength Indicator (RSSI) is usually employed to measure wireless link quality in a WLAN system. However, in a real environment, especially if APs are densely situated, it is difficult to always select an AP with better performance based on only the RSSI. This is because the RSSI alone cannot detect the degradation of communication quality due to radio interference. Moreover, it is important that AP selection is completed only on an MN, because we can assume that, in ubiquitous WLANs, various organizations or operators will manage APs. Hence, we cannot modify the APs for AP selection. To overcome these difficulties, in the present paper, we propose and implement a proactive AP selection method considering wireless link condition based on the number of frame retransmissions in addition to the RSSI. In the evaluation, we show that the proposed AP selection method can appropriately select an AP with good wireless link quality, i.e., high RSSI and low radio interference.
Iron-aluminum alloys having high room-temperature ductility and method for making same
Sikka, Vinod K.; McKamey, Claudette G.
1993-01-01
Iron-aluminum alloys having selectable room-temperature ductilities of greater than 20%, high resistance to oxidation and sulfidation, resistance to pitting and corrosion in aqueous solutions, and possessing relatively high yield and ultimate tensile strengths are described. These alloys comprise 8 to 9.5% aluminum, up to 7% chromium, up to 4% molybdenum, up to 0.05% carbon, up to 0.5% of a carbide former such as zirconium, up to 0.1% yttrium, and the balance iron. These alloys in wrought form are annealed at a selected temperature in the range of about 700 °C to about 1100 °C for providing the alloys with selected room-temperature ductilities in the range of 20 to about 29%.
An Efficient Adaptive Window Size Selection Method for Improving Spectrogram Visualization.
Nisar, Shibli; Khan, Omar Usman; Tariq, Muhammad
2016-01-01
Short Time Fourier Transform (STFT) is an important technique for the time-frequency analysis of a time varying signal. The basic approach behind it involves the application of a Fast Fourier Transform (FFT) to a signal multiplied with an appropriate window function with fixed resolution. The selection of an appropriate window size is difficult when no background information about the input signal is known. In this paper, a novel empirical model is proposed that adaptively adjusts the window size for a narrow-band signal using a spectrum sensing technique. For wide-band signals, where a fixed time-frequency resolution is undesirable, the approach adopts the constant Q transform (CQT). Unlike the STFT, the CQT provides a varying time-frequency resolution. This results in a high spectral resolution at low frequencies and high temporal resolution at high frequencies. In this paper, a simple but effective switching framework is provided between both STFT and CQT. The proposed method also allows for the dynamic construction of a filter bank according to user-defined parameters. This helps in reducing redundant entries in the filter bank. Results obtained from the proposed method not only improve the spectrogram visualization but also reduce the computation cost, achieving appropriate window length selection 87.71% of the time.
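As a minimal illustration of the fixed-resolution tradeoff the abstract above addresses (not the authors' adaptive model), a plain-numpy STFT sketch; the test signal, sampling rate, and window sizes are hypothetical:

```python
import numpy as np

def stft_spectrogram(x, window_size, hop):
    """Fixed-resolution STFT: Hann-windowed FFT magnitudes per frame."""
    window = np.hanning(window_size)
    frames = []
    for start in range(0, len(x) - window_size + 1, hop):
        segment = x[start:start + window_size] * window
        frames.append(np.abs(np.fft.rfft(segment)))
    return np.array(frames)  # shape: (n_frames, window_size // 2 + 1)

# A short window localizes time but blurs frequency; a long window does the opposite.
fs = 1000
t = np.arange(0, 1, 1 / fs)
x = np.sin(2 * np.pi * 50 * t)          # 50 Hz tone
narrow = stft_spectrogram(x, 64, 32)    # coarse frequency bins (~15.6 Hz wide)
wide = stft_spectrogram(x, 256, 128)    # finer frequency bins (~3.9 Hz wide)
```

With a fixed window, one of these resolutions must be sacrificed; the paper's switching framework and the CQT are ways around exactly this constraint.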
Nemkov, Travis; D'Alessandro, Angelo; Hansen, Kirk C.
2015-01-01
Amino acid analysis is a powerful bioanalytical technique for many biomedical research endeavors, including cancer, emergency medicine, nutrition and neuroscience research. In the present study, we present a three minute analytical method for underivatized amino acid analysis that employs ultra-high performance liquid chromatography and high resolution quadrupole orbitrap mass spectrometry. This method has demonstrated linearity (mM to nM range), reproducibility (intra-day<5%, inter-day<20%), sensitivity (low fmol) and selectivity. Here, we illustrate the rapidity and accuracy of the method through comparison with conventional liquid chromatography-mass spectrometry methods. We further demonstrate the robustness and sensitivity of this method on a diverse range of biological matrices. Using this method we were able to selectively discriminate murine pancreatic cancer cells with and without knocked down expression of Hypoxia Inducible Factor 1α; plasma, lymph and bronchioalveolar lavage fluid samples from control versus hemorrhaged rats; and muscle tissue samples harvested from rats subjected to both low fat and high fat diets. Furthermore, we were able to exploit the sensitivity of the method to detect and quantify the release of glutamate from sparsely isolated murine taste buds. Spiked in light or heavy standards (13C6-arginine, 13C6-lysine, 13C515N2-glutamine) or xenometabolites were used to determine coefficient of variations, confirm linearity of relative quantitation in four different matrices, and overcome matrix effects for absolute quantitation. The presented method enables high-throughput analysis of low abundance samples requiring only one percent of the material extracted from 100,000 cells, 10 μl of biological fluid, or 2 mg of muscle tissue. PMID:26058356
Star-shaped Polymers through Simple Wavelength-Selective Free-Radical Photopolymerization.
Eibel, Anna; Fast, David E; Sattelkow, Jürgen; Zalibera, Michal; Wang, Jieping; Huber, Alex; Müller, Georgina; Neshchadin, Dmytro; Dietliker, Kurt; Plank, Harald; Grützmacher, Hansjörg; Gescheidt, Georg
2017-11-06
Star-shaped polymers represent highly desired materials in nanotechnology and life sciences, including biomedical applications (e.g., diagnostic imaging, tissue engineering, and targeted drug delivery). Herein, we report a straightforward synthesis of wavelength-selective multifunctional photoinitiators (PIs) that contain a bisacylphosphane oxide (BAPO) group and an α-hydroxy ketone moiety within one molecule. By using three different wavelengths, these photoactive groups can be selectively addressed and activated, thereby allowing the synthesis of ABC-type miktoarm star polymers through a simple, highly selective, and robust free-radical polymerization method. The photochemistry of these new initiators and the feasibility of this concept were investigated in unprecedented detail by using various spectroscopic techniques. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Yang, Ming; Peng, Zhihui; Ning, Yi; Chen, Yongzhe; Zhou, Qin; Deng, Le
2013-05-22
In this paper, a panel of single-stranded DNA aptamers with high affinity and specificity against Salmonella Paratyphi A was selected from an enriched oligonucleotide pool by a whole-cell-Systematic Evolution of Ligands by Exponential Enrichment (SELEX) procedure, during which four other Salmonella serovars were used as counter-selection targets. It was determined through a fluorescence assay that the selected aptamers had high binding ability and specificity to this pathogen. The dissociation constants of these aptamers were in the nanomolar range, and aptamer Apt22 with the lowest Kd (47 ± 3 nM) was used in cell imaging experiments. To detect this bacterium with high specificity and cost efficiency, a novel detection method was also constructed based on the noncovalent self-assembly of single-walled carbon nanotubes (SWNTs) and DNAzyme-labeled aptamer detection probes. The amounts of target bacteria could be quantified by exploiting chemoluminescence intensity changes at 420 nm, and the detection limit of the method was 10³ cfu/mL. This study demonstrated the applicability of Salmonella-specific aptamers and their potential for use in the detection of Salmonella in food, clinical and environmental samples.
Mir, Mònica; Lugo, Roberto; Tahirbegi, Islam Bogachan; Samitier, Josep
2014-01-01
Poly(vinylchloride) (PVC) is the most common polymer matrix used in the fabrication of ion-selective electrodes (ISEs). However, the surfaces of PVC-based sensors have been reported to show membrane instability. In an attempt to overcome this limitation, here we developed two alternative methods for the preparation of highly stable and robust ion-selective sensors. These platforms are based on the selective electropolymerization of poly(3,4-ethylenedioxythiophene) (PEDOT), where the sulfur atoms contained in the polymer covalently interact with the gold electrode, also permitting controlled selective attachment on a miniaturized electrode in an array format. This platform sensor was improved with the crosslinking of the membrane compounds with poly(ethyleneglycol) diglycidyl ether (PEG), thus also increasing the biocompatibility of the sensor. The resulting ISE membranes showed faster signal stabilization of the sensor response compared with that of the PVC matrix and also better reproducibility and stability, thus making these platforms highly suitable candidates for the manufacture of robust implantable sensors. PMID:24999717
Yomogida, Yohei; Tanaka, Takeshi; Zhang, Minfang; Yudasaka, Masako; Wei, Xiaojun; Kataura, Hiromichi
2016-01-01
Single-chirality, single-wall carbon nanotubes are desired due to their inherent physical properties and performance characteristics. Here, we demonstrate a chromatographic separation method based on a newly discovered chirality-selective affinity between carbon nanotubes and a gel containing a mixture of the surfactants. In this system, two different selectivities are found: chiral-angle selectivity and diameter selectivity. Since the chirality of nanotubes is determined by the chiral angle and diameter, combining these independent selectivities leads to high-resolution single-chirality separation with milligram-scale throughput and high purity. Furthermore, we present efficient vascular imaging of mice using separated single-chirality (9,4) nanotubes. Due to efficient absorption and emission, blood vessels can be recognized even with the use of ∼100-fold lower injected dose than the reported value for pristine nanotubes. Thus, 1 day of separation provides material for up to 15,000 imaging experiments, which is acceptable for industrial use. PMID:27350127
Wang, Dongqing; Zhang, Xu; Gao, Xiaoping; Chen, Xiang; Zhou, Ping
2016-01-01
This study presents wavelet packet feature assessment of neural control information in paretic upper limb muscles of stroke survivors for myoelectric pattern recognition, taking advantage of high-resolution time-frequency representations of surface electromyogram (EMG) signals. On this basis, a novel channel selection method was developed by combining the Fisher's class separability index and the sequential feedforward selection analyses, in order to determine a small number of appropriate EMG channels from original high-density EMG electrode array. The advantages of the wavelet packet features and the channel selection analyses were further illustrated by comparing with previous conventional approaches, in terms of classification performance when identifying 20 functional arm/hand movements implemented by 12 stroke survivors. This study offers a practical approach including paretic EMG feature extraction and channel selection that enables active myoelectric control of multiple degrees of freedom with paretic muscles. All these efforts will facilitate upper limb dexterity restoration and improved stroke rehabilitation.
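The Fisher class separability index used for channel ranking in the abstract above can be sketched in isolation (synthetic two-class channel data; the wavelet packet features and the sequential feedforward selection step are omitted):

```python
import numpy as np

def fisher_score(x_class1, x_class2):
    """Two-class Fisher separability of a single channel:
    squared mean difference over the sum of class variances."""
    m1, m2 = x_class1.mean(), x_class2.mean()
    v1, v2 = x_class1.var(), x_class2.var()
    return (m1 - m2) ** 2 / (v1 + v2 + 1e-12)

rng = np.random.default_rng(0)
# 4 hypothetical EMG channels, 100 trials per movement class;
# only channel 2 responds differently between the two movements.
c1 = rng.normal(0.0, 1.0, size=(100, 4))
c2 = rng.normal(0.0, 1.0, size=(100, 4))
c2[:, 2] += 3.0
scores = [fisher_score(c1[:, j], c2[:, j]) for j in range(4)]
best = int(np.argmax(scores))  # the discriminative channel
```

Ranking channels by this score, then greedily adding channels while classification accuracy improves, is the general shape of the combined Fisher-index/sequential-selection approach the abstract describes.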
Unbiased feature selection in learning random forests for high-dimensional data.
Nguyen, Thanh-Tung; Huang, Joshua Zhexue; Nguyen, Thuy Thi
2015-01-01
Random forests (RFs) have been widely used as a powerful classification method. However, with the randomization in both bagging samples and feature selection, the trees in the forest tend to select uninformative features for node splitting, which degrades the accuracy of RFs on high-dimensional data. Moreover, RFs are biased in the feature selection process, favoring multivalued features. Aiming at debiasing feature selection in RFs, we propose a new RF algorithm, called xRF, to select good features in learning RFs for high-dimensional data. We first remove the uninformative features using p-value assessment, and the subset of unbiased features is then selected based on some statistical measures. This feature subset is then partitioned into two subsets. A feature weighting sampling technique is used to sample features from these two subsets for building trees. This approach generates more accurate trees while reducing the dimensionality and the amount of data needed for learning RFs. An extensive set of experiments has been conducted on 47 high-dimensional real-world datasets, including image datasets. The experimental results show that RFs with the proposed approach outperform existing random forests in both accuracy and AUC.
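The first step of the approach above (screening out uninformative features before tree building) can be sketched with per-feature two-sample t statistics as a stand-in for the paper's p-value assessment; the data are synthetic and this is not the xRF algorithm itself:

```python
import numpy as np

def t_statistics(X, y):
    """Absolute per-feature two-sample t statistic (unequal-variance form)."""
    a, b = X[y == 0], X[y == 1]
    num = a.mean(axis=0) - b.mean(axis=0)
    den = np.sqrt(a.var(axis=0, ddof=1) / len(a) + b.var(axis=0, ddof=1) / len(b))
    return np.abs(num / den)

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 50))      # 200 samples, 50 features
y = np.repeat([0, 1], 100)
X[y == 1, :5] += 2.0                # only the first 5 features carry class signal
keep = np.argsort(t_statistics(X, y))[::-1][:5]  # informative feature indices
```

Trees grown only on features surviving such a screen avoid splitting on pure noise, which is the failure mode the abstract attributes to standard RFs in high dimensions.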
Methods for the guideline-based development of quality indicators--a systematic review
2012-01-01
Background: Quality indicators (QIs) are used in many healthcare settings to measure, compare, and improve quality of care. For the efficient development of high-quality QIs, rigorous, approved, and evidence-based development methods are needed. Clinical practice guidelines are a suitable source to derive QIs from, but no gold standard for guideline-based QI development exists. This review aims to identify, describe, and compare methodological approaches to guideline-based QI development. Methods: We systematically searched medical literature databases (Medline, EMBASE, and CINAHL) and grey literature. Two researchers selected publications reporting methodological approaches to guideline-based QI development. In order to describe and compare methodological approaches used in these publications, we extracted detailed information on common steps of guideline-based QI development (topic selection, guideline selection, extraction of recommendations, QI selection, practice test, and implementation) to predesigned extraction tables. Results: From 8,697 hits in the database search and several grey literature documents, we selected 48 relevant references. The studies were of heterogeneous type and quality. We found no randomized controlled trial or other studies comparing the ability of different methodological approaches to guideline-based development to generate high-quality QIs. The relevant publications featured a wide variety of methodological approaches to guideline-based QI development, especially regarding guideline selection and extraction of recommendations. Only a few studies reported patient involvement. Conclusions: Further research is needed to determine which elements of the methodological approaches identified, described, and compared in this review are best suited to constitute a gold standard for guideline-based QI development. For this research, we provide a comprehensive groundwork. PMID:22436067
Microplate Bioassay for Determining Substrate Selectivity of "Candida rugosa" Lipase
ERIC Educational Resources Information Center
Wang, Shi-zhen; Fang, Bai-shan
2012-01-01
Substrate selectivity of "Candida rugosa" lipase was tested using "p"-nitrophenyl esters of increasing chain length (C[subscript 1], C[subscript 7], C[subscript 15]) using the high-throughput screening method. A fast and easy 96-well microplate bioassay was developed to help students learn and practice biotechnological specificity screening. The…
A Selected Bibliography on International Education.
ERIC Educational Resources Information Center
Foreign Policy Association, New York, NY.
This unannotated bibliography is divided into four major sections; 1) General Background Readings for Teachers; 2) Approaches and Methods; 3) Materials for the Classroom; and, 4) Sources of Information and Materials. It offers a highly selective list of items which provide wide coverage of the field. Included are items on foreign policy, war and…
Selective nanoscale growth of lattice mismatched materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Seung-Chang; Brueck, Steven R. J.
Exemplary embodiments provide materials and methods of forming high-quality semiconductor devices using lattice-mismatched materials. In one embodiment, a composite film including one or more substantially-single-particle-thick nanoparticle layers can be deposited over a substrate as a nanoscale selective growth mask for epitaxially growing lattice-mismatched materials over the substrate.
Qiao, Jindong; Wang, Mingyu; Yan, Hongyuan; Yang, Gengliang
2014-04-02
A new magnetic dummy molecularly imprinted dispersive solid-phase extraction (MAG-MIM-dSPE) coupled with gas chromatography-FID was developed for selective determination of phthalates in plastic bottled beverages. The new magnetic dummy molecularly imprinted microspheres (MAG-MIM) using diisononyl phthalate as a template mimic were synthesized by coprecipitation coupled with aqueous suspension polymerization and were successfully applied as the adsorbents for MAG-MIM-dSPE to extract and isolate five phthalates from plastic bottled beverages. Validation experiments showed that the MAG-MIM-dSPE method had good linearity at 0.0040-0.40 μg/mL (0.9991-0.9998), good precision (3.1-6.9%), and high recovery (89.5-101.3%), and limits of detection were obtained in a range of 0.53-1.2 μg/L. The presented MAG-MIM-dSPE method combines the quick separation of magnetic particles, special selectivity of MIM, and high extraction efficiency of dSPE, which could potentially be applied to selective screening of phthalates in beverage products.
Quantification of six herbicide metabolites in human urine.
Norrgran, Jessica; Bravo, Roberto; Bishop, Amanda M; Restrepo, Paula; Whitehead, Ralph D; Needham, Larry L; Barr, Dana B
2006-01-18
We developed a sensitive, selective and precise method for measuring herbicide metabolites in human urine. Our method uses automated liquid delivery of internal standards and acetate buffer and a mixed polarity polymeric phase solid phase extraction of a 2 mL urine sample. The concentrated eluate is analyzed using high-performance liquid chromatography-tandem mass spectrometry. Isotope dilution calibration is used for quantification of all analytes. The limits of detection of our method range from 0.036 to 0.075 ng/mL. The within- and between-day variation in pooled quality control samples range from 2.5 to 9.0% and from 3.2 to 16%, respectively, for all analytes at concentrations ranging from 0.6 to 12 ng/mL. Precision was similar with samples fortified with 0.1 and 0.25 ng/mL that were analyzed in each run. We validated our selective method against a less selective method used previously in our laboratory by analyzing human specimens using both methods. The methods produced results that were in agreement, with no significant bias observed.
Takahashi, Hiro; Honda, Hiroyuki
2006-07-01
Considering the recent advances in and the benefits of DNA microarray technologies, many gene filtering approaches have been employed for the diagnosis and prognosis of diseases. In our previous study, we developed a new filtering method, namely, the projective adaptive resonance theory (PART) filtering method. This method was effective in subclass discrimination. In the PART algorithm, the genes with a low variance in gene expression in either class, not both classes, were selected as important genes for modeling. Based on this concept, we developed novel simple filtering methods such as modified signal-to-noise (S2N') in the present study. The discrimination model constructed using these methods showed higher accuracy with higher reproducibility as compared with many conventional filtering methods, including the t-test, S2N, NSC and SAM. The reproducibility of prediction was evaluated based on the correlation between the sets of U-test p-values on randomly divided datasets. With respect to leukemia, lymphoma and breast cancer, the correlation was high; a correlation higher by >0.13 was obtained by the model constructed using <50 genes selected by S2N'. The improvement was greater when fewer genes were used, and higher correlation was observed than when the t-test, NSC and SAM were used. These results suggest that these modified methods, such as S2N', have high potential to function as new methods for marker gene selection in cancer diagnosis using DNA microarray data. Software is available upon request.
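The exact modification in S2N' is not specified in the abstract above; the classical signal-to-noise score it builds on can be sketched with synthetic expression data (genes ranked by absolute score, highest first):

```python
import numpy as np

def s2n(expr_class1, expr_class2):
    """Classical signal-to-noise ratio per gene: (mu1 - mu2) / (sd1 + sd2)."""
    m1, m2 = expr_class1.mean(axis=0), expr_class2.mean(axis=0)
    s1, s2 = expr_class1.std(axis=0, ddof=1), expr_class2.std(axis=0, ddof=1)
    return (m1 - m2) / (s1 + s2)

rng = np.random.default_rng(2)
healthy = rng.normal(5.0, 1.0, size=(30, 100))  # 30 samples x 100 genes
tumor = rng.normal(5.0, 1.0, size=(30, 100))
tumor[:, 0] += 4.0                              # gene 0 up-regulated in tumor
ranked = np.argsort(np.abs(s2n(healthy, tumor)))[::-1]  # marker candidates first
```

A discrimination model is then built on only the top-ranked genes, which is the filtering setup the abstract compares against the t-test, NSC, and SAM.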
Yanagawa, T; Tokudome, S
1990-01-01
We developed methods to assess cancer risks based on screening tests. These methods estimate the size of the high-risk group adjusted for the characteristics of the screening tests, and estimate the incidence rates of cancer among the high-risk group adjusted for the characteristics of the tests. A method was also developed for selecting the cut-off point of a screening test. Finally, the methods were applied to estimate the risk of adult T-cell leukemia/lymphoma. PMID:2269244
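The abstract above does not specify the criterion behind its cut-off selection method; one standard choice, maximizing Youden's J (sensitivity + specificity - 1), can be sketched with synthetic test scores as an illustration only:

```python
import numpy as np

def youden_cutoff(scores_diseased, scores_healthy, candidates):
    """Pick the cutoff maximizing sensitivity + specificity - 1 (Youden's J)."""
    best_c, best_j = None, -1.0
    for c in candidates:
        sens = np.mean(scores_diseased >= c)   # true positive rate at cutoff c
        spec = np.mean(scores_healthy < c)     # true negative rate at cutoff c
        j = sens + spec - 1.0
        if j > best_j:
            best_c, best_j = c, j
    return best_c, best_j

rng = np.random.default_rng(3)
healthy = rng.normal(0.0, 1.0, 500)    # hypothetical screening-test scores
diseased = rng.normal(2.0, 1.0, 500)
cutoff, j = youden_cutoff(diseased, healthy, np.linspace(-2, 4, 121))
```

For two equal-variance normal score distributions the optimal cutoff sits midway between the class means, which the sketch recovers up to sampling noise.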
Thermal behavior in single track during selective laser melting of AlSi10Mg powder
NASA Astrophysics Data System (ADS)
Wei, Pei; Wei, Zhengying; Chen, Zhen; He, Yuyang; Du, Jun
2017-09-01
A three-dimensional model was developed to simulate the radiation heat transfer in the AlSi10Mg packed bed. The volume of fluid method (VOF) was used to capture the free surface during selective laser melting (SLM). A randomly packed powder bed was obtained using the discrete element method (DEM) in Particle Flow Code (PFC). The proposed model has demonstrated a high potential to simulate the SLM process with high accuracy. In this paper, the effect of the laser scanning speed and laser power on the thermodynamic behavior of the molten pool was investigated numerically. The results show that the temperature gradient and the resultant surface tension gradient between the center and the edge of the molten pool increase with decreasing the scanning speed or increasing the laser power, thereby intensifying the Marangoni flow and attendant turbulence within the molten pool. However, at a relatively high scanning speed, a significant instability may be generated in the molten pool. The perturbation and instability in the molten pool during SLM may result in an irregularly shaped track.
High temperature composites. Status and future directions
NASA Technical Reports Server (NTRS)
Signorelli, R. A.
1982-01-01
A summary of research investigations of manufacturing methods, fabrication methods, and testing of high temperature composites for use in gas turbine engines is presented. Ceramic/ceramic, ceramic/metal, and metal/metal composites are considered. Directional solidification of superalloys and eutectic alloys, fiber reinforced metal and ceramic composites, ceramic fibers and whiskers, refractory coatings, metal fiber/metal composites, matrix metal selection, and the preparation of test specimens are discussed.
Wu, Guorong; Kim, Minjeong; Wang, Qian; Munsell, Brent C.
2015-01-01
Feature selection is a critical step in deformable image registration. In particular, selecting the most discriminative features that accurately and concisely describe complex morphological patterns in image patches improves correspondence detection, which in turn improves image registration accuracy. Furthermore, since more and more imaging modalities are being invented to better identify morphological changes in medical imaging data, the development of deformable image registration method that scales well to new image modalities or new image applications with little to no human intervention would have a significant impact on the medical image analysis community. To address these concerns, a learning-based image registration framework is proposed that uses deep learning to discover compact and highly discriminative features upon observed imaging data. Specifically, the proposed feature selection method uses a convolutional stacked auto-encoder to identify intrinsic deep feature representations in image patches. Since deep learning is an unsupervised learning method, no ground truth label knowledge is required. This makes the proposed feature selection method more flexible to new imaging modalities since feature representations can be directly learned from the observed imaging data in a very short amount of time. Using the LONI and ADNI imaging datasets, image registration performance was compared to two existing state-of-the-art deformable image registration methods that use handcrafted features. To demonstrate the scalability of the proposed image registration framework, image registration experiments were conducted on 7.0-tesla brain MR images. In all experiments, the results showed the new image registration framework consistently demonstrated more accurate registration results when compared to state-of-the-art. PMID:26552069
Wu, Guorong; Kim, Minjeong; Wang, Qian; Munsell, Brent C; Shen, Dinggang
2016-07-01
Feature selection is a critical step in deformable image registration. In particular, selecting the most discriminative features that accurately and concisely describe complex morphological patterns in image patches improves correspondence detection, which in turn improves image registration accuracy. Furthermore, since more and more imaging modalities are being invented to better identify morphological changes in medical imaging data, the development of deformable image registration method that scales well to new image modalities or new image applications with little to no human intervention would have a significant impact on the medical image analysis community. To address these concerns, a learning-based image registration framework is proposed that uses deep learning to discover compact and highly discriminative features upon observed imaging data. Specifically, the proposed feature selection method uses a convolutional stacked autoencoder to identify intrinsic deep feature representations in image patches. Since deep learning is an unsupervised learning method, no ground truth label knowledge is required. This makes the proposed feature selection method more flexible to new imaging modalities since feature representations can be directly learned from the observed imaging data in a very short amount of time. Using the LONI and ADNI imaging datasets, image registration performance was compared to two existing state-of-the-art deformable image registration methods that use handcrafted features. To demonstrate the scalability of the proposed image registration framework, image registration experiments were conducted on 7.0-T brain MR images. In all experiments, the results showed that the new image registration framework consistently demonstrated more accurate registration results when compared to state of the art.
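The abstract above learns patch features with a convolutional stacked auto-encoder; as a loose, minimal analogue (a single dense layer rather than a convolutional stack, with random data standing in for image patches), a numpy sketch of unsupervised auto-encoder training, where the hidden activations play the role of learned features:

```python
import numpy as np

def train_autoencoder(X, n_hidden, lr=0.1, epochs=200, seed=0):
    """One-hidden-layer auto-encoder (sigmoid encoder, linear decoder)
    trained by batch gradient descent on squared reconstruction error."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W1 = rng.normal(0, 0.1, (d, n_hidden))   # encoder weights
    W2 = rng.normal(0, 0.1, (n_hidden, d))   # decoder weights
    for _ in range(epochs):
        H = 1.0 / (1.0 + np.exp(-X @ W1))    # hidden features
        Xhat = H @ W2                        # reconstruction
        err = Xhat - X
        gW2 = H.T @ err / n                  # decoder gradient
        gH = err @ W2.T * H * (1 - H)        # backprop through sigmoid
        gW1 = X.T @ gH / n                   # encoder gradient
        W1 -= lr * gW1
        W2 -= lr * gW2
    return W1, W2

rng = np.random.default_rng(1)
X = rng.normal(size=(256, 16))               # stand-in for flattened patches
W1, W2 = train_autoencoder(X, n_hidden=8)
H = 1.0 / (1.0 + np.exp(-X @ W1))
final_err = np.mean((H @ W2 - X) ** 2)
init_err = np.mean(X ** 2)                   # error of an all-zero reconstruction
```

No labels are used anywhere in training, which is the property the abstract highlights: features can be relearned directly from a new modality's data.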
Lara-Capi, Cynthia; Lingström, Peter; Lai, Gianfranco; Cocco, Fabio; Simark-Mattsson, Charlotte; Campus, Guglielmo
2017-01-01
Objectives: This article aimed to evaluate: (a) the agreement between a near-infrared light transillumination device and clinical and radiographic examinations in caries lesion detection and (b) the reliability of images captured by the transillumination device. Methods: Two calibrated examiners evaluated the caries status in premolars and molars on 52 randomly selected subjects by comparing the transillumination device with a clinical examination for the occlusal surfaces and by comparing the transillumination device with a radiographic examination (bitewing radiographs) for the approximal surfaces. Forty-eight trained dental hygienists evaluated and reevaluated 30 randomly selected images 1-month later. Results: A high concordance between transillumination method and clinical examination (kappa = 0.99) was detected for occlusal caries lesions, while for approximal surfaces, the transillumination device identified a higher number of lesions with respect to bitewing (kappa = 0.91). At the dentinal level, the two methods identified the same number of caries lesions (kappa = 1), whereas more approximal lesions were recorded using the transillumination device in the enamel (kappa = 0.24). The intraexaminer reliability was substantial/almost perfect in 59.4% of the participants. Conclusions: The transillumination method showed a high concordance compared with traditional methods (clinical examination and bitewing radiographs). Caries detection reliability using the transillumination device images showed a high intraexaminer agreement. Transillumination showed to be a reliable method and as effective as traditional methods in caries detection. PMID:28191797
Graphene oxide membranes with high permeability and selectivity for dehumidification of air
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shin, Yongsoon; Liu, Wei; Schwenzer, Birgit
Hierarchically stacked 2D graphene oxide (GO) membranes are a fascinating and promising new class of materials with the potential for radically improved water vapor/gas separation with excellent selectivity and high permeability. This paper details dehumidification results from flowing gas mixtures through free-standing GO membrane samples prepared by a casting method. The first demonstrated use of free-standing GO membranes for water vapor separation reveals outstanding water vapor permeability and H2O/N2 selectivity. Free-standing GO membranes exhibit extremely high water vapor permeability of 1.82 × 10⁵ Barrer and a water vapor permeance of 1.01 × 10⁻⁵ mol/(m²·s·Pa), while the nitrogen permeability was below the system's detection limit, yielding a selectivity >10⁴ in 80% relative humidity (RH) air at 30.8 °C. The results show great potential for a range of energy conversion and environmental applications.
Colmenares, Juan C; Magdziarz, Agnieszka; Bielejewska, Anna
2011-12-01
Glucose was oxidized in the presence of powdered TiO₂ photocatalysts synthesized by an ultrasound-promoted sol-gel method. The catalysts were more selective towards glucaric acid, gluconic acid and arabitol (total selectivity approx. 70%) than the most popular photocatalyst, Degussa P-25. The photocatalytic systems worked at mild reaction conditions: 30°C, atmospheric pressure and very short reaction time (e.g. 5 min). Such relatively good selectivity towards high-valued molecules is attributed to the physico-chemical properties (e.g. high specific surface area, nanostructured anatase phase, and visible light absorption) of the novel TiO₂ materials and the reaction conditions. The TiO₂ photocatalysts have potential for water purification and energy production and for use in the pharmaceutical, food, perfume and fuel industries. Copyright © 2011 Elsevier Ltd. All rights reserved.
Micoli, F; Adamo, R; Proietti, D; Gavini, M; Romano, M R; MacLennan, C A; Costantino, P; Berti, F
2013-11-15
A method for meningococcal X (MenX) polysaccharide quantification by high-performance anion-exchange chromatography with pulsed amperometric detection (HPAEC-PAD) is described. The polysaccharide is hydrolyzed by strong acidic treatment, and the peak of glucosamine-4-phosphate (4P-GlcN) is detected and measured after chromatography. In the selected conditions of hydrolysis, 4P-GlcN is the prevalent species formed, with GlcN detected for less than 5% in moles. As standard for the analysis, the monomeric unit of MenX polysaccharide, N-acetylglucosamine-4-phosphate (4P-GlcNAc), was used. This method for MenX quantification is highly selective and sensitive, and it constitutes an important analytical tool for the development of a conjugate vaccine against MenX. Copyright © 2013 Elsevier Inc. All rights reserved.
High purity silica reflective heat shield development
NASA Technical Reports Server (NTRS)
Nachtscheim, P. R.; Blome, J. C.
1976-01-01
A hyperpure vitreous silica material is being developed for use as a reflective and ablative heat shield for planetary entry. Various purity grades and forms of raw materials were evaluated along with various processing methods. Slip casting of high purity grain was selected as the best processing method, resulting in a highly reflective material in the wavelength bands of interest (the visible and ultraviolet regions). The selected material was characterized with respect to optical, mechanical and physical properties using a limited number of specimens. The process has been scaled up to produce a one-half scale heat shield (18 in. dia.) (45.72 cm) for a Jupiter entry vehicle. This work is now being extended to improve the structural safety factor of the heat shield by making hyperpure silica material tougher through the addition of silica fibers.
High performance hydrophobic solvent, carbon dioxide capture
Nulwala, Hunaid; Luebke, David
2017-05-09
Methods and compositions useful, for example, for physical solvent carbon capture. A method comprising: contacting at least one first composition comprising carbon dioxide with at least one second composition to at least partially dissolve the carbon dioxide of the first composition in the second composition, wherein the second composition comprises at least one siloxane compound which is covalently modified with at least one non-siloxane group comprising at least one heteroatom. Polydimethylsiloxane (PDMS) materials and ethylene-glycol based materials have high carbon dioxide solubility but suffer from various problems. PDMS is hydrophobic but suffers from low selectivity. Ethylene-glycol based systems have good solubility and selectivity, but suffer from high affinity to water. Solvents were developed which keep the desired combinations of properties, and result in a simplified, overall process for carbon dioxide removal from a mixed gas stream.
NASA Astrophysics Data System (ADS)
Yuen, Kevin Kam Fung
2009-10-01
The most appropriate prioritization method is still one of the unsettled issues of the Analytic Hierarchy Process (AHP), although many studies have been made and applied. Interestingly, many AHP applications rely solely on Saaty's Eigenvector method, even though many studies have found that this method may produce rank reversals and have proposed various prioritization methods as alternatives. Some of these methods have been proved to be better than the Eigenvector method; however, they seem not to have attracted the attention of researchers. In this paper, eight important prioritization methods are reviewed. A Mixed Prioritization Operators Strategy (MPOS) is developed to select a vector which is prioritized by the most appropriate prioritization operator. To verify this new method, a case study of high school selection is revisited using the proposed method. The contribution is that MPOS is useful for solving prioritization problems in the AHP.
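For readers unfamiliar with the baseline being criticized, Saaty's Eigenvector method takes the priority vector as the principal eigenvector of the pairwise comparison matrix. A minimal sketch via power iteration follows; the 3x3 comparison matrix is an example (perfectly consistent, a_ij = w_i/w_j), not data from the paper.

```python
def priority_vector(A, iters=100):
    """Principal eigenvector of a pairwise comparison matrix,
    normalized to sum to 1, via simple power iteration."""
    n = len(A)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]
    return w

# Example: alternative 1 is twice as preferred as 2, four times as 3.
A = [[1.0,  2.0, 4.0],
     [0.5,  1.0, 2.0],
     [0.25, 0.5, 1.0]]
w = priority_vector(A)   # approximately [4/7, 2/7, 1/7]
```

For a perfectly consistent matrix like this one every prioritization operator agrees; the rank reversals discussed above arise only for inconsistent matrices.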
Cieslak, Wendy; Pap, Kathleen; Bunch, Dustin R; Reineks, Edmunds; Jackson, Raymond; Steinle, Roxanne; Wang, Sihe
2013-02-01
Chromium (Cr), a trace metal element, is implicated in diabetes and cardiovascular disease. A hypochromic state has been associated with poor blood glucose control and unfavorable lipid metabolism. Sensitive and accurate measurement of blood chromium is very important to assess chromium nutritional status. However, interferents in biological matrices and contamination make sensitive analysis challenging. The primary goal of this study was to develop a highly sensitive method for quantification of total Cr in whole blood by inductively coupled plasma mass spectrometry (ICP-MS) and to validate the reference interval in a local healthy population. This method was developed on an ICP-MS with a collision/reaction cell. Interference was minimized using both kinetic energy discrimination between the quadrupole and hexapole and a selective collision gas (helium). The reference interval was validated in whole blood samples (n=51) collected in trace-element-free EDTA tubes from healthy adults (12 males, 39 females), aged 19-64 years (38.8±12.6), after a minimum of 8 h fasting. Blood samples were aliquoted into cryogenic vials and stored at -70 °C until analysis. The assay linearity was 3.42 to 1446.59 nmol/L with an accuracy of 87.7 to 99.8%. The high sensitivity was achieved by minimization of interference through selective kinetic energy discrimination and selective collision using helium. The reference interval for total Cr using a non-parametric method was verified to be 3.92 to 7.48 nmol/L. This validated ICP-MS methodology is highly sensitive and selective for measuring total Cr in whole blood. Copyright © 2012 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
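A sketch of what a non-parametric reference interval is: the central 95% of a healthy-population sample, i.e. the 2.5th and 97.5th percentiles taken directly from ranked values with no distributional assumption. The data below are simulated in roughly the reported nmol/L range, not the study's measurements, and the nearest-rank percentile is one of several accepted conventions.

```python
import math
import random

def percentile(values, p):
    """Nearest-rank percentile, 0 < p <= 1."""
    v = sorted(values)
    k = max(1, math.ceil(p * len(v)))
    return v[k - 1]

def reference_interval(values):
    """Central 95% interval (2.5th and 97.5th percentiles)."""
    return percentile(values, 0.025), percentile(values, 0.975)

random.seed(0)
# Simulated blood-Cr values (nmol/L), hypothetical healthy cohort:
data = [random.uniform(3.5, 8.0) for _ in range(200)]
lo, hi = reference_interval(data)
```

Note that with only n=51 subjects, as in the study, the extreme percentiles are estimated from very few ranked observations, which is why guidelines often recommend validating rather than re-deriving an interval at that sample size.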
Wang, Yong; Wu, Qiao-Feng; Chen, Chen; Wu, Ling-Yun; Yan, Xian-Zhong; Yu, Shu-Guang; Zhang, Xiang-Sun; Liang, Fan-Rong
2012-01-01
Background Acupuncture has been practiced in China for thousands of years as part of Traditional Chinese Medicine (TCM) and has gradually been accepted in western countries as an alternative or complementary treatment. However, the underlying mechanism of acupuncture, especially whether there exists any difference between various acupoints, remains largely unknown, which hinders its widespread use. Results In this study, we develop a novel Linear Programming based Feature Selection method (LPFS) to understand the mechanism of the acupuncture effect, at the molecular level, by revealing the metabolite biomarkers for acupuncture treatment. Specifically, we generate and investigate the high-throughput metabolic profiles of acupuncture treatment at several acupoints in humans. To select the subsets of metabolites that best characterize the acupuncture effect for each meridian point, an optimization model is proposed to identify biomarkers from high-dimensional metabolic data from case and control samples. Importantly, we use the nearest centroid as the prototype to simultaneously minimize the number of selected features and the leave-one-out cross-validation error of the classifier. We compared the performance of LPFS to several state-of-the-art methods, such as SVM recursive feature elimination (SVM-RFE) and the sparse multinomial logistic regression approach (SMLR). We find that our LPFS method tends to reveal a small set of metabolites with small standard deviation and large shifts, which exactly meets our requirements for a good biomarker. Biologically, several metabolite biomarkers for acupuncture treatment are revealed and serve as candidates for further mechanism investigation. Also, biomarkers derived from five meridian points, Zusanli (ST36), Liangmen (ST21), Juliao (ST3), Yanglingquan (GB34), and Weizhong (BL40), are compared for their similarity and difference, which provides evidence for the specificity of acupoints.
Conclusions Our results demonstrate that metabolic profiling might be a promising method to investigate the molecular mechanism of acupuncture. Compared with other existing methods, LPFS shows better performance in selecting a small set of key molecules. In addition, LPFS is a general methodology and can be applied to other high-dimensional data analyses, for example, cancer genomics. PMID:23046877
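The nearest-centroid prototype and leave-one-out cross-validation error that LPFS minimizes can be sketched directly; the linear-programming formulation itself is not reproduced here, and the two-feature case/control data below are a toy example.

```python
def centroid(rows):
    """Component-wise mean of a list of feature vectors."""
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def nearest_centroid_predict(x, cents):
    """Assign x to the class whose centroid is closest (squared Euclidean)."""
    dist = lambda a, b: sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(cents, key=lambda lab: dist(x, cents[lab]))

def loocv_error(X, y):
    """Leave-one-out error rate of the nearest-centroid classifier."""
    errors = 0
    for i in range(len(X)):
        Xtr, ytr = X[:i] + X[i + 1:], y[:i] + y[i + 1:]
        cents = {lab: centroid([x for x, l in zip(Xtr, ytr) if l == lab])
                 for lab in set(ytr)}
        if nearest_centroid_predict(X[i], cents) != y[i]:
            errors += 1
    return errors / len(X)

# Toy metabolite profiles: two well-separated classes.
X = [[0.0, 0.1], [0.2, 0.0], [0.1, 0.2],
     [5.0, 5.1], [5.2, 4.9], [4.9, 5.0]]
y = ["control", "control", "control", "case", "case", "case"]
err = loocv_error(X, y)
```

LPFS searches over feature subsets to make this error small while also keeping the subset small; the code above is only the inner scoring step.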
Roelofs, Jeffrey; Peters, Madelon L; Vlaeyen, Johan W S
2003-06-01
The present study assessed, by means of a modified Stroop paradigm, whether highly fearful patients with chronic low back pain pay selective attention to words related to movement and injury. Two groups of patients (High Fear and Low Fear) were included based on their scores on the Tampa Scale of Kinesiophobia (TSK), a measure of fear of movement or (re)injury. A control group was recruited by means of advertisement in a local newspaper. Repeated-measures analysis of variance was conducted to examine whether highly fearful pain patients pay more selective attention to movement and injury words, compared to patients with low pain-related fear and controls. The results from the present study do not support the proposition that highly fearful patients with chronic low back pain selectively pay attention to words related to injury and movement. Limitations of the modified Stroop paradigm are discussed as well as the need for the application of alternative methods such as the dot-probe paradigm.
Subnanometer and nanometer catalysts, method for preparing size-selected catalysts
Vajda, Stefan; Pellin, Michael J.; Elam, Jeffrey W. [Elmhurst, IL]; Marshall, Christopher L. [Naperville, IL]; Winans, Randall A. [Downers Grove, IL]; Meiwes-Broer, Karl-Heinz [Roggentin, GR]
2012-04-03
Highly uniform cluster-based nanocatalysts supported on technologically relevant supports were synthesized for reactions of top industrial relevance. The Pt-cluster based catalysts outperformed the very best reported ODHP catalyst in both activity (by up to two orders of magnitude higher turn-over frequencies) and in selectivity. The results clearly demonstrate that highly dispersed ultra-small Pt clusters precisely localized on high-surface-area supports can lead to affordable new catalysts for highly efficient and economic propene production, including considerably simplified separation of the final product. The combined GISAXS-mass spectrometry provides an excellent tool to monitor the evolution of the size and shape of nanocatalysts in action under realistic conditions. Also provided are sub-nanometer gold and sub-nanometer to few-nm size-selected silver catalysts which possess size-dependent tunable catalytic properties in the epoxidation of alkenes. The invented size-selected cluster deposition provides a unique tool to tune material properties in an atom-by-atom fashion, which can be stabilized by protective overcoats.
Subnanometer and nanometer catalysts, method for preparing size-selected catalysts
Vajda, Stefan [Lisle, IL]; Pellin, Michael J. [Naperville, IL]; Elam, Jeffrey W. [Elmhurst, IL]; Marshall, Christopher L. [Naperville, IL]; Winans, Randall A. [Downers Grove, IL]; Meiwes-Broer, Karl-Heinz [Roggentin, GR]
2012-03-27
Highly uniform cluster-based nanocatalysts supported on technologically relevant supports were synthesized for reactions of top industrial relevance. The Pt-cluster based catalysts outperformed the very best reported ODHP catalyst in both activity (by up to two orders of magnitude higher turn-over frequencies) and in selectivity. The results clearly demonstrate that highly dispersed ultra-small Pt clusters precisely localized on high-surface-area supports can lead to affordable new catalysts for highly efficient and economic propene production, including considerably simplified separation of the final product. The combined GISAXS-mass spectrometry provides an excellent tool to monitor the evolution of the size and shape of nanocatalysts in action under realistic conditions. Also provided are sub-nanometer gold and sub-nanometer to few-nm size-selected silver catalysts which possess size-dependent tunable catalytic properties in the epoxidation of alkenes. The invented size-selected cluster deposition provides a unique tool to tune material properties in an atom-by-atom fashion, which can be stabilized by protective overcoats.
OPTIMAL TIME-SERIES SELECTION OF QUASARS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Butler, Nathaniel R.; Bloom, Joshua S.
2011-03-15
We present a novel method for the optimal selection of quasars using time-series observations in a single photometric bandpass. Utilizing the damped random walk model of Kelly et al., we parameterize the ensemble quasar structure function in Sloan Stripe 82 as a function of observed brightness. The ensemble model fit can then be evaluated rigorously for and calibrated with individual light curves with no parameter fitting. This yields a classification in two statistics, one describing the fit confidence and the other describing the probability of a false alarm, which can be tuned, a priori, to achieve high quasar detection fractions (99% completeness with default cuts), given an acceptable rate of false alarms. We establish the typical rate of false alarms due to known variable stars as <~3% (high purity). Applying the classification, we increase the sample of potential quasars relative to those known in Stripe 82 by as much as 29%, and by nearly a factor of two in the redshift range 2.5 < z < 3, where selection by color is extremely inefficient. This represents 1875 new quasars in a 290 deg^2 field. The observed rates of both quasars and stars agree well with the model predictions, with >99% of quasars exhibiting the expected variability profile. We discuss the utility of the method at high redshift and in the regime of noisy and sparse data. Our time-series selection complements well independent selection based on quasar colors and has strong potential for identifying high-redshift quasars for Baryon Acoustic Oscillations and other cosmology studies in the LSST era.
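The damped random walk underlying the selection has a simple closed-form structure function, SF(dt) = SF_inf * sqrt(1 - exp(-dt/tau)): variability grows with time lag and saturates at SF_inf. A minimal sketch follows; the parameter values are illustrative, not the Stripe 82 ensemble fit.

```python
import math

def drw_sf(dt_days, sf_inf_mag=0.2, tau_days=300.0):
    """Expected rms magnitude difference between two epochs separated by
    dt_days, under a damped random walk with asymptotic amplitude
    sf_inf_mag and damping timescale tau_days (illustrative values)."""
    return sf_inf_mag * math.sqrt(1.0 - math.exp(-dt_days / tau_days))

short = drw_sf(10.0)     # short baseline: little variability
long_ = drw_sf(3000.0)   # long baseline: saturates near SF_inf
```

Classification then amounts to asking whether an observed light curve's variability follows this rising-then-saturating profile (quasar-like) or not (most variable stars), which is what the two fit statistics in the abstract quantify.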
Zeng, Yun; Liu, Gang; Ma, Ying; Chen, Xiaoyuan; Ito, Yoichiro
2012-01-01
A new series of organic-high ionic strength aqueous two-phase solvent systems was designed for separation of highly polar compounds by spiral high-speed counter-current chromatography. A total of 21 solvent systems composed of 1-butanol-ethanol-saturated ammonium sulfate-water at various volume ratios are arranged according to an increasing order of polarity. Selection of the two-phase solvent system for a single compound or a multiple-sample mixture can be achieved by two steps of partition coefficient measurements using a graphic method. The capability of the method is demonstrated by optimization of the partition coefficient for seven highly polar samples including tartrazine (K=0.77), tryptophan (K=1.00), methyl green (K=0.93), tyrosine (K=0.81), metanephrine (K=0.89), tyramine (K=0.98), and normetanephrine (K=0.96). Three sulfonic acid components in D&C Green No. 8 were successfully separated by HSCCC using the graphic selection of the two-phase solvent system. PMID:23467197
Can We Train Machine Learning Methods to Outperform the High-dimensional Propensity Score Algorithm?
Karim, Mohammad Ehsanul; Pang, Menglan; Platt, Robert W
2018-03-01
The use of retrospective health care claims datasets is frequently criticized for the lack of complete information on potential confounders. Utilizing patients' health status-related information from claims datasets as surrogates or proxies for mismeasured and unobserved confounders, the high-dimensional propensity score algorithm enables us to reduce bias. Using a previously published cohort study of post-myocardial infarction statin use (1998-2012), we compare the performance of the algorithm with a number of popular machine learning approaches for confounder selection in high-dimensional covariate spaces: random forest, least absolute shrinkage and selection operator, and elastic net. Our results suggest that, when the data analysis is done with epidemiologic principles in mind, machine learning methods perform as well as the high-dimensional propensity score algorithm. Using a plasmode framework that mimicked the empirical data, we also showed that a hybrid of machine learning and high-dimensional propensity score algorithms generally performs slightly better than both in terms of mean squared error, when a bias-based analysis is used.
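The covariate-prioritization step at the heart of the high-dimensional propensity score algorithm ranks candidate proxies by the Bross bias multiplier, computed from each proxy's prevalence among exposed and unexposed patients and its association with the outcome. A sketch with invented toy numbers:

```python
import math

def bross_bias(pc1, pc0, rr_cd):
    """Multiplicative confounding bias of a binary proxy covariate.
    pc1/pc0: prevalence among exposed/unexposed; rr_cd: covariate-outcome
    relative risk (Bross formula)."""
    return (pc1 * (rr_cd - 1.0) + 1.0) / (pc0 * (rr_cd - 1.0) + 1.0)

# Hypothetical proxies: (prev. exposed, prev. unexposed, outcome RR)
candidates = {
    "proxy_a": (0.30, 0.10, 3.0),
    "proxy_b": (0.20, 0.18, 1.2),
    "proxy_c": (0.05, 0.50, 2.0),
}
# Rank by |log bias|, so strong confounding in either direction ranks high.
ranked = sorted(candidates,
                key=lambda k: abs(math.log(bross_bias(*candidates[k]))),
                reverse=True)
```

The machine learning alternatives compared in the study (lasso, elastic net, random forest) replace this explicit ranking with regularized or tree-based variable selection inside the propensity model.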
Sub-Selective Quantization for Learning Binary Codes in Large-Scale Image Search.
Li, Yeqing; Liu, Wei; Huang, Junzhou
2018-06-01
Recently with the explosive growth of visual content on the Internet, large-scale image search has attracted intensive attention. It has been shown that mapping high-dimensional image descriptors to compact binary codes can lead to considerable efficiency gains in both storage and performing similarity computation of images. However, most existing methods still suffer from expensive training devoted to large-scale binary code learning. To address this issue, we propose a sub-selection based matrix manipulation algorithm, which can significantly reduce the computational cost of code learning. As case studies, we apply the sub-selection algorithm to several popular quantization techniques including cases using linear and nonlinear mappings. Crucially, we can justify the resulting sub-selective quantization by proving its theoretic properties. Extensive experiments are carried out on three image benchmarks with up to one million samples, corroborating the efficacy of the sub-selective quantization method in terms of image retrieval.
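The core idea above, learning the quantizer from a sub-selection of rows rather than the full data matrix, can be sketched with a deliberately simple binarizer (per-dimension median thresholds). This is a simplification for illustration, not the paper's matrix-manipulation algorithm, and the data are synthetic.

```python
import random

def median(xs):
    s = sorted(xs)
    return s[len(s) // 2]

def learn_thresholds(X, sample_size, seed=0):
    """Sub-selection step: fit per-dimension thresholds on a small random
    subset of rows, cutting training cost versus using all of X."""
    rng = random.Random(seed)
    sub = rng.sample(X, min(sample_size, len(X)))
    d = len(X[0])
    return [median([row[j] for row in sub]) for j in range(d)]

def encode(x, thresholds):
    """Map one descriptor to a binary code, bit per dimension."""
    return tuple(int(xi > t) for xi, t in zip(x, thresholds))

random.seed(1)
X = [[random.gauss(0, 1) for _ in range(8)] for _ in range(1000)]
thr = learn_thresholds(X, sample_size=100)   # trained on 10% of the rows
codes = [encode(x, thr) for x in X]          # 8-bit codes for all rows
```

The paper's contribution is showing (with theoretical guarantees) that such sub-selection loses little quality for much stronger quantizers, including rotation-based ones, where training on the full matrix is the expensive step.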
Crystallization and doping of amorphous silicon on low temperature plastic
Kaschmitter, James L.; Truher, Joel B.; Weiner, Kurt H.; Sigmon, Thomas W.
1994-01-01
A method or process of crystallizing and doping amorphous silicon (a-Si) on a low-temperature plastic substrate using a short pulsed high energy source in a selected environment, without heat propagation and build-up in the substrate. The pulsed energy processing of the a-Si in a selected environment, such as BF3 and PF5, will form a doped micro-crystalline or poly-crystalline silicon (pc-Si) region or junction point with improved mobilities, lifetimes and drift and diffusion lengths and with reduced resistivity. The advantage of this method or process is that it provides for high energy materials processing on low cost, low temperature, transparent plastic substrates. Using pulsed laser processing a high (>900 °C), localized processing temperature can be achieved in thin films, with little accompanying temperature rise in the substrate, since substrate temperatures do not exceed 180 °C for more than a few microseconds. This method enables use of plastics incapable of withstanding sustained processing temperatures (higher than 180 °C) but which are much lower cost, have high tolerance to ultraviolet light, have high strength and good transparency, compared to higher temperature plastics such as polyimide.
Crystallization and doping of amorphous silicon on low temperature plastic
Kaschmitter, J.L.; Truher, J.B.; Weiner, K.H.; Sigmon, T.W.
1994-09-13
A method or process of crystallizing and doping amorphous silicon (a-Si) on a low-temperature plastic substrate using a short pulsed high energy source in a selected environment, without heat propagation and build-up in the substrate is disclosed. The pulsed energy processing of the a-Si in a selected environment, such as BF3 and PF5, will form a doped micro-crystalline or poly-crystalline silicon (pc-Si) region or junction point with improved mobilities, lifetimes and drift and diffusion lengths and with reduced resistivity. The advantage of this method or process is that it provides for high energy materials processing on low cost, low temperature, transparent plastic substrates. Using pulsed laser processing a high (>900 °C), localized processing temperature can be achieved in thin films, with little accompanying temperature rise in the substrate, since substrate temperatures do not exceed 180 °C for more than a few microseconds. This method enables use of plastics incapable of withstanding sustained processing temperatures (higher than 180 °C) but which are much lower cost, have high tolerance to ultraviolet light, have high strength and good transparency, compared to higher temperature plastics such as polyimide.
Szulfer, Jarosław; Plenis, Alina; Bączek, Tomasz
2014-06-13
This paper focuses on the application of a column classification system developed at the Katholieke Universiteit Leuven for the characterization of the physicochemical properties of core-shell and ultra-high performance liquid chromatographic stationary phases, followed by verification of the reliability of the obtained column classification in pharmaceutical practice. In the study, 7 stationary phases produced with core-shell technology and 18 ultra-high performance liquid chromatographic columns were chromatographically tested, and ranking lists were built from the FKUL values calculated against two selected reference columns. In the column performance test, an analysis of alfuzosin in the presence of related substances was carried out using the brands of the stationary phases with the highest ranking positions. Next, a system suitability test as described by the European Pharmacopoeia monograph was performed. Moreover, a study was also performed to achieve a purposeful shortening of the analysis time of the compounds of interest using the selected stationary phases. Finally, it was checked whether methods using core-shell and ultra-high performance liquid chromatographic columns can be an interesting alternative to the high-performance liquid chromatographic method for the analysis of alfuzosin in pharmaceutical practice. Copyright © 2014 Elsevier B.V. All rights reserved.
Fundoplication for laryngopharyngeal reflux despite preoperative dysphagia.
Falk, G L; Van der Wall, H; Burton, L; Falk, M G; O'Donnell, H; Vivian, S J
2017-03-01
INTRODUCTION Fundoplication for laryngopharyngeal disease with oesophageal dysmotility has led to mixed outcomes. In the presence of preoperative dysphagia and oesophageal dysmotility, this procedure has raised particular concern. METHODS This paper describes a consecutive series of laryngopharyngeal reflux (LPR) patients with a high frequency of dysmotility. Patients were selected for surgery with 24-hour dual-channel pH monitoring, oesophageal manometry and standardised reflux scintigraphy. RESULTS Following careful patient selection, 33 patients underwent laparoscopic fundoplication. Surgery had high efficacy in symptom control and there was no adverse dysphagia. CONCLUSIONS Evidence of proximal reflux can identify a group of patients likely to achieve good results from fundoplication for atypical symptoms.
Wang, Shuaiqun; Aorigele; Kong, Wei; Zeng, Weiming; Hong, Xiaomin
2016-01-01
Gene expression data composed of thousands of genes play an important role in classification platforms and disease diagnosis. Hence, it is vital to select a small subset of salient features from a large number of gene expression data. Lately, many researchers have devoted themselves to feature selection using diverse computational intelligence methods. However, in the process of selecting informative genes, many computational methods face difficulties in selecting small subsets for cancer classification due to the huge number of genes (high dimensionality) compared to the small number of samples, noisy genes, and irrelevant genes. In this paper, we propose a new hybrid algorithm, HICATS, incorporating the imperialist competition algorithm (ICA), which performs global search, and tabu search (TS), which conducts fine-tuned search. In order to verify the performance of the proposed algorithm HICATS, we have tested it on 10 well-known benchmark gene expression classification datasets with dimensions varying from 2308 to 12600. The performance of our proposed method proved superior to other related works, including the conventional version of the binary optimization algorithm, in terms of classification accuracy and the number of selected genes. PMID:27579323
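The tabu search half of HICATS can be sketched as a local search over gene subsets with a short memory of recent moves. The fitness function below is a stand-in that rewards a known "informative" set and penalizes subset size (mimicking the accuracy-vs-gene-count trade-off); it is not a real classifier, and the gene indices are invented.

```python
def fitness(subset, informative=frozenset({1, 4, 7})):
    """Toy objective: +1 per informative gene, -0.2 per gene selected."""
    return len(subset & informative) - 0.2 * len(subset)

def tabu_search(n_genes, iters=50, tabu_len=5):
    """Flip one gene in/out per step; recently flipped genes are tabu."""
    current = set()
    best, best_f = set(current), fitness(current)
    tabu = []
    for _ in range(iters):
        moves = [g for g in range(n_genes) if g not in tabu]
        g = max(moves, key=lambda g: fitness(current ^ {g}))
        current = current ^ {g}          # apply the best non-tabu flip
        tabu = (tabu + [g])[-tabu_len:]  # remember it for tabu_len steps
        if fitness(current) > best_f:
            best, best_f = set(current), fitness(current)
    return best

best = tabu_search(10)   # recovers the informative gene set {1, 4, 7}
```

In the full algorithm, ICA provides the global exploration across "empires" of candidate subsets, and a search like the above refines each candidate; the fitness would be classification accuracy minus a gene-count penalty.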
Parameter Optimization for Selected Correlation Analysis of Intracranial Pathophysiology.
Faltermeier, Rupert; Proescholdt, Martin A; Bele, Sylvia; Brawanski, Alexander
2015-01-01
Recently we proposed a mathematical tool set, called selected correlation analysis, that reliably detects positive and negative correlations between arterial blood pressure (ABP) and intracranial pressure (ICP). Such correlations are associated with severe impairment of cerebral autoregulation and intracranial compliance, as predicted by a mathematical model. The time-resolved selected correlation analysis is based on a windowing technique combined with Fourier-based coherence calculations and therefore depends on several parameters. For real-time application of this method in an ICU, it is essential to tune this mathematical tool for high sensitivity and distinct reliability. In this study, we introduce a method to optimize the parameters of the selected correlation analysis by correlating an index, called selected correlation positive (SCP), with the outcome of the patients as represented by the Glasgow Outcome Scale (GOS). For that purpose, the data of twenty-five patients were used to calculate the SCP value for each patient and for a multitude of feasible parameter sets of the selected correlation analysis. It could be shown that an optimized set of parameters is able to improve the sensitivity of the method by a factor greater than four in comparison to our first analyses. PMID:26693250
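The windowing step the parameters control can be sketched as follows: the ABP and ICP signals are cut into overlapping windows and a per-window correlation statistic is computed. For brevity, a plain Pearson correlation stands in for the paper's Fourier-based coherence, and the signals are synthetic; window and step lengths are two of the tunable parameters discussed above.

```python
import math

def pearson(a, b):
    """Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb)

def windowed_correlation(abp, icp, win=50, step=25):
    """Correlation per overlapping window; win/step are tunable parameters."""
    return [pearson(abp[i:i + win], icp[i:i + win])
            for i in range(0, len(abp) - win + 1, step)]

# Synthetic signals in which ICP passively follows ABP, the pattern
# associated above with impaired autoregulation:
abp = [math.sin(0.1 * t) for t in range(300)]
icp = [0.5 * x + 0.1 for x in abp]
corrs = windowed_correlation(abp, icp)   # sustained positive windows
```

An index like SCP then summarizes how many windows show such significant positive correlation, and the optimization in the paper tunes the windowing/coherence parameters so that this index best separates patient outcomes.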
Zhang, Hongkai; Torkamani, Ali; Jones, Teresa M; Ruiz, Diana I; Pons, Jaume; Lerner, Richard A
2011-08-16
Use of large combinatorial antibody libraries and next-generation sequencing of nucleic acids are two of the most powerful methods in modern molecular biology. The libraries are screened using the principles of evolutionary selection, albeit in real time, to enrich for members with a particular phenotype. This selective process necessarily results in the loss of information about less-fit molecules. On the other hand, sequencing of the library, by itself, gives information that is mostly unrelated to phenotype. If the two methods could be combined, the full potential of very large molecular libraries could be realized. Here we report the implementation of a phenotype-information-phenotype cycle that integrates information and gene recovery. After selection for phage-encoded antibodies that bind to targets expressed on the surface of Escherichia coli, the information content of the selected pool is obtained by pyrosequencing. Sequences that encode specific antibodies are identified by a bioinformatic analysis and recovered by a stringent affinity method that is uniquely suited for gene isolation from a highly degenerate collection of nucleic acids. This approach can be generalized for selection of antibodies against targets that are present as minor components of complex systems.
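The bioinformatic step described above, identifying antibody sequences enriched by selection before recovering their genes, reduces to comparing clone frequencies in the selected pool against the naive library. A minimal sketch with toy read lists (the clone names and fold-change threshold are invented; the paper's actual pipeline analyzed pyrosequencing reads of CDR regions):

```python
from collections import Counter

def enriched_clones(selected_reads, naive_reads, min_fold=10.0):
    """Flag sequences whose frequency rose at least min_fold from the
    naive library to the selected pool (pseudo-count avoids div-by-zero
    for clones unseen in the naive sample)."""
    sel, naive = Counter(selected_reads), Counter(naive_reads)
    n_sel, n_naive = len(selected_reads), len(naive_reads)
    hits = []
    for seq, count in sel.items():
        f_sel = count / n_sel
        f_naive = (naive.get(seq, 0) + 1) / (n_naive + 1)
        if f_sel / f_naive >= min_fold:
            hits.append(seq)
    return hits

# Toy data: a uniform naive library, one clone enriched by selection.
naive = ["cdr_a", "cdr_b", "cdr_c", "cdr_d"] * 25
selected = ["cdr_b"] * 60 + ["cdr_a"] * 5 + ["cdr_d"] * 35
hits = enriched_clones(selected, naive, min_fold=2.0)
```

The flagged sequences are then the ones retrieved from the degenerate pool by the stringent affinity method the abstract describes.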
2010-01-01
Background Cluster analysis, and in particular hierarchical clustering, is widely used to extract information from gene expression data. The aim is to discover new classes, or sub-classes, of either individuals or genes. Performing a cluster analysis commonly involves decisions on how to handle missing values, standardize the data, and select genes. In addition, pre-processing, involving various types of filtration and normalization procedures, can affect the ability to discover biologically relevant classes. Here we consider cluster analysis in a broad sense and perform a comprehensive evaluation that covers several aspects of cluster analyses, including normalization. Results We evaluated 2780 cluster analysis methods on seven publicly available 2-channel microarray data sets with common reference designs. Each cluster analysis method differed in data normalization (5 normalizations were considered), missing value imputation (2), standardization of data (2), gene selection (19) or clustering method (11). The cluster analyses were evaluated using known classes, such as cancer types, and the adjusted Rand index. The performances of the different analyses vary between the data sets and it is difficult to give general recommendations. However, normalization, gene selection and clustering method are all variables that have a significant impact on performance. In particular, gene selection is important and it is generally necessary to include a relatively large number of genes in order to get good performance. Selecting genes with high standard deviation or using principal component analysis are shown to be the preferred gene selection methods. Hierarchical clustering using Ward's method, k-means clustering and Mclust are the clustering methods considered in this paper that achieve the highest adjusted Rand index.
Normalization can have a significant positive impact on the ability to cluster individuals, and there are indications that background correction is preferable, in particular if the gene selection is successful. However, this is an area that needs to be studied further in order to draw any general conclusions. Conclusions The choice of cluster analysis, and in particular gene selection, has a large impact on the ability to cluster individuals correctly based on expression profiles. Normalization has a positive effect, but the relative performance of different normalizations is an area that needs more research. In summary, although clustering, gene selection and normalization are considered standard methods in bioinformatics, our comprehensive analysis shows that selecting the right methods, and the right combinations of methods, is far from trivial and that much is still unexplored in what is considered to be the most basic analysis of genomic data. PMID:20937082
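The adjusted Rand index used as the evaluation criterion above is computed from the contingency table of two labelings, corrected for chance agreement. A minimal pure-Python sketch (illustrative only, not the study's code):

```python
from math import comb
from collections import Counter

def adjusted_rand_index(labels_true, labels_pred):
    """Chance-corrected agreement between two partitions (1.0 = identical)."""
    n = len(labels_true)
    pairs = Counter(zip(labels_true, labels_pred))   # contingency counts n_ij
    rows = Counter(labels_true)                      # row sums a_i
    cols = Counter(labels_pred)                      # column sums b_j
    index = sum(comb(v, 2) for v in pairs.values())
    row_sum = sum(comb(v, 2) for v in rows.values())
    col_sum = sum(comb(v, 2) for v in cols.values())
    expected = row_sum * col_sum / comb(n, 2)
    max_index = (row_sum + col_sum) / 2
    return (index - expected) / (max_index - expected)
```

The index is invariant to relabeling, so two clusterings that agree perfectly score 1.0 even if the cluster labels differ.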
Rossotti, Martín; Tabares, Sofía; Alfaya, Lucía; Leizagoyen, Carmen; Moron, Gabriel; González-Sapienza, Gualberto
2015-01-01
BACKGROUND Owing to their minimal size, high production yield, versatility and robustness, the recombinant variable domains (nanobodies) of camelid single-chain antibodies are valued affinity reagents for research, diagnostic, and therapeutic applications. While their preparation against purified antigens is straightforward, the generation of nanobodies to difficult targets such as multi-pass or complex membrane cell receptors remains challenging. Here we devised a platform for high throughput identification of nanobodies to cell receptors based on the use of a biotin handle. METHODS Using a biotin-acceptor peptide tag, the in vivo biotinylation of nanobodies in 96-well culture blocks was optimized, allowing their parallel analysis by flow cytometry and ELISA, and their direct use for pull-down/MS target identification. RESULTS The potential of this strategy was demonstrated by the selection and characterization of panels of nanobodies to the Mac-1 (CD11b/CD18), MHC II and mouse Ly-5 leukocyte common antigen (CD45) receptors, from a VHH library obtained from a llama immunized with mouse bone marrow derived dendritic cells. By switching the addition of biotin on and off, the method also allowed the epitope binning of the selected nanobodies directly on cells. CONCLUSIONS This strategy streamlines the selection of potent nanobodies to complex antigens, and the selected nanobodies constitute ready-to-use biotinylated reagents. GENERAL SIGNIFICANCE This method will accelerate the discovery of nanobodies to cell membrane receptors, which comprise the largest group of drug and analytical targets. PMID:25819371
Selection and maturation of antibodies by phage display through fusion to pIX.
Tornetta, Mark; Reddy, Ramachandra; Wheeler, John C
2012-09-01
Antibody discovery and optimization by M13 phage display have evolved significantly over the past twenty years. Multiple methods of antibody display and selection have been developed - direct display on pIII, or indirect display through a cysteine disulfide linkage or a coiled-coil adapter protein. Here we describe display of Fab libraries on the smaller pIX protein at the opposite end of the virion and its application to the discovery of novel antibodies from naive libraries. Antibody selection based on pIX-mediated display produces results comparable to other in vitro methods and uses an efficient direct infection of antigen-bound phages, eliminating any chemical dissociation step(s). Additionally, some evidence suggests that pIX-mediated display can be more efficient than pIII-mediated display in affinity selections. Functional assessment of phage-derived antibodies can be hindered by insufficient affinities or lack of epitopic diversity. Here we describe an approach to sorting primary hits from our Fab phage libraries into epitope bins and subsequent high-throughput maturation of clones to isolate epitope- and sequence-diverse panels of high affinity binders. The Octet biosensor was used to examine Fab binding in a facile, label-free format and to determine epitope competition groups. A receptor extracellular domain and a chemokine were subjected to this method of binning and affinity maturation. Parental clones demonstrated improvements in affinity from 1-100 nM to 10-500 pM. Copyright © 2012 Elsevier Inc. All rights reserved.
Torres, F E; Teodoro, P E; Rodrigues, E V; Santos, A; Corrêa, A M; Ceccon, G
2016-04-29
The aim of this study was to select erect cowpea (Vigna unguiculata L.) genotypes simultaneously for high adaptability, stability, and grain yield in Mato Grosso do Sul, Brazil, using mixed models. We conducted six trials of different cowpea genotypes in 2005 and 2006 in Aquidauana, Chapadão do Sul, Dourados, and Primavera do Leste. The experimental design was randomized complete blocks with four replications and 20 genotypes. Genetic parameters were estimated by restricted maximum likelihood/best linear unbiased prediction, and selection was based on the harmonic mean of the relative performance of genetic values method using three strategies: selection based on the predicted breeding value, considering the mean performance of the genotypes across all environments (no interaction effect); selection based on performance in each environment (with an interaction effect); and simultaneous selection for grain yield, stability, and adaptability. The MNC99542F-5 and MNC99-537F-4 genotypes could be grown in various environments, as they exhibited high grain yield, adaptability, and stability. The average heritability of the genotypes was moderate to high and the selective accuracy was 82%, indicating excellent potential for selection.
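The harmonic-mean criterion used above rewards genotypes that perform well relative to the environment mean in every environment; because the harmonic mean penalizes low values, an unstable genotype scores below an equally productive stable one. A simplified sketch of the ranking step (the data layout and function name are illustrative, not from the study):

```python
def hmrpgv_ranking(values_by_env):
    """Rank genotypes by the harmonic mean of relative performance.

    values_by_env maps environment -> {genotype: predicted genotypic value}.
    """
    envs = list(values_by_env)
    genotypes = list(next(iter(values_by_env.values())))
    scores = {}
    for g in genotypes:
        rel = []
        for e in envs:
            env_mean = sum(values_by_env[e].values()) / len(values_by_env[e])
            rel.append(values_by_env[e][g] / env_mean)    # relative performance
        scores[g] = len(rel) / sum(1.0 / r for r in rel)  # harmonic mean
    return sorted(scores.items(), key=lambda kv: -kv[1])
```

For example, a genotype at exactly the environment mean everywhere scores 1.0, while one alternating above and below the mean with the same arithmetic average scores lower.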
METHOD OF MAKING METAL BONDED CARBON BODIES
Goeddel, W.V.; Simnad, M.T.
1961-09-26
A method of producing carbon bodies having high structural strength and low permeability is described. The method comprises mixing less than 10 wt.% of a diffusional bonding material selected from the group consisting of zirconium, niobium, molybdenum, titanium, nickel, chromium, silicon, and decomposable compounds thereof with finely divided particles of carbon or graphite. While being maintained at a mechanical pressure over 3,000 psi, the mixture is then heated uniformly to a temperature of 1500 deg C or higher, usually for less than one hour. The resulting carbon bodies have a low diffusion constant, high dimensional stability, and high mechanical strength.
Improved Adaptive LSB Steganography Based on Chaos and Genetic Algorithm
NASA Astrophysics Data System (ADS)
Yu, Lifang; Zhao, Yao; Ni, Rongrong; Li, Ting
2010-12-01
We propose a novel steganographic method for JPEG images with high performance. First, we propose improved adaptive LSB steganography, which can achieve high capacity while preserving the first-order statistics. Second, in order to minimize visual degradation of the stego image, we shuffle the bit order of the message based on a chaotic map whose parameters are selected by a genetic algorithm. Shuffling the message's bit order provides a new way to improve the performance of steganography. Experimental results show that our method outperforms classical steganographic methods in image quality, while preserving characteristics of the histogram and providing high capacity.
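The chaos-based bit shuffling described above can be illustrated with a logistic map: the sort order of a chaotic sequence defines a permutation of the message bits, and a receiver holding the same parameters can invert it. This is a minimal sketch with assumed parameter values; the paper's genetic-algorithm search over the chaos parameters is omitted:

```python
def logistic_permutation(n, x0=0.7, r=3.99):
    """Permutation of range(n) from the sort order of a logistic-map sequence."""
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)        # chaotic iteration x_{k+1} = r*x_k*(1-x_k)
        xs.append(x)
    return sorted(range(n), key=xs.__getitem__)

def shuffle_bits(bits, perm):
    return [bits[p] for p in perm]

def unshuffle_bits(shuffled, perm):
    out = [None] * len(shuffled)
    for i, p in enumerate(perm):
        out[p] = shuffled[i]
    return out
```

Because the permutation is fully determined by (x0, r), only those two numbers need to be shared as the shuffling key.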
Wan, Jian; Chen, Yi-Chieh; Morris, A Julian; Thennadil, Suresh N
2017-07-01
Near-infrared (NIR) spectroscopy is being widely used in various fields ranging from pharmaceutics to the food industry for analyzing chemical and physical properties of the substances concerned. Its advantages over other analytical techniques include available physical interpretation of spectral data, nondestructive nature and high speed of measurements, and little or no need for sample preparation. The successful application of NIR spectroscopy relies on three main aspects: pre-processing of spectral data to eliminate nonlinear variations due to temperature, light scattering effects and many others; selection of those wavelengths that contribute useful information; and identification of suitable calibration models using linear/nonlinear regression. Several methods have been developed for each of these three aspects and many comparative studies of different methods exist for an individual aspect or some combinations. However, there is still a lack of comparative studies of the interactions among these three aspects, which can shed light on what role each aspect plays in the calibration and how to combine various methods of each aspect to obtain the best calibration model. This paper aims to provide such a comparative study based on four benchmark data sets using three typical pre-processing methods, namely, orthogonal signal correction (OSC), extended multiplicative signal correction (EMSC) and optical path-length estimation and correction (OPLEC); two existing wavelength selection methods, namely, stepwise forward selection (SFS) and genetic algorithm optimization combined with partial least squares regression for spectral data (GAPLSSP); and four popular regression methods, namely, partial least squares (PLS), least absolute shrinkage and selection operator (LASSO), least squares support vector machine (LS-SVM), and Gaussian process regression (GPR).
The comparative study indicates that, in general, pre-processing of spectral data can play a significant role in the calibration while wavelength selection plays a marginal role and the combination of certain pre-processing, wavelength selection, and nonlinear regression methods can achieve superior performance over traditional linear regression-based calibration.
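The interaction study described above amounts to enumerating combinations of pre-processing, wavelength selection, and regression and scoring each pipeline on held-out data. The toy sketch below mimics that design with stand-in methods (SNV-style row standardization, correlation-based wavelength selection, univariate least squares) on synthetic "spectra"; it is illustrative only and does not reproduce the paper's methods or data:

```python
import itertools
import random

def make_data(n=60, p=20, seed=1):
    """Synthetic spectra: y depends on wavelength 5; a multiplicative
    scatter-like effect corrupts the raw intensities."""
    rng = random.Random(seed)
    X, y = [], []
    for _ in range(n):
        base = [rng.gauss(0, 1) for _ in range(p)]
        scale = rng.uniform(0.5, 2.0)                 # scatter effect
        X.append([scale * (v + 3.0) for v in base])
        y.append(2.0 * base[5] + rng.gauss(0, 0.1))
    return X, y

def pre_none(X):
    return [row[:] for row in X]

def pre_snv(X):
    """SNV-style row standardization removes the multiplicative effect."""
    out = []
    for row in X:
        m = sum(row) / len(row)
        s = (sum((v - m) ** 2 for v in row) / len(row)) ** 0.5 or 1.0
        out.append([(v - m) / s for v in row])
    return out

def sel_first(X, y):
    return 0                                          # naive: first wavelength

def sel_corr(X, y):
    """Pick the wavelength most correlated with y on the training set."""
    n, my = len(y), sum(y) / len(y)
    def score(j):
        xs = [row[j] for row in X]
        mx = sum(xs) / n
        cov = sum((xs[i] - mx) * (y[i] - my) for i in range(n))
        vx = sum((v - mx) ** 2 for v in xs) ** 0.5
        vy = sum((v - my) ** 2 for v in y) ** 0.5
        return abs(cov / (vx * vy)) if vx and vy else 0.0
    return max(range(len(X[0])), key=score)

def fit_predict(xtr, ytr, xte):
    """Univariate least squares on the selected wavelength."""
    n, mx, my = len(xtr), sum(xtr) / len(xtr), sum(ytr) / len(ytr)
    b = sum((xtr[i] - mx) * (ytr[i] - my) for i in range(n)) / \
        sum((v - mx) ** 2 for v in xtr)
    a = my - b * mx
    return [a + b * v for v in xte]

def evaluate(preproc, selector, X, y, split=40):
    """Hold-out MSE of one (pre-processing, selection) pipeline."""
    Z = preproc(X)
    j = selector(Z[:split], y[:split])
    pred = fit_predict([r[j] for r in Z[:split]], y[:split],
                       [r[j] for r in Z[split:]])
    return sum((p - t) ** 2 for p, t in zip(pred, y[split:])) / (len(y) - split)

X, y = make_data()
results = {(p.__name__, s.__name__): evaluate(p, s, X, y)
           for p, s in itertools.product([pre_none, pre_snv],
                                         [sel_first, sel_corr])}
best = min(results, key=results.get)
```

On this synthetic data, the pipeline combining scatter correction with informed wavelength selection should outperform the naive one, mirroring the paper's point that the aspects interact.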
Laurie, Cathy C.; Chasalow, Scott D.; LeDeaux, John R.; McCarroll, Robert; Bush, David; Hauge, Brian; Lai, Chaoqiang; Clark, Darryl; Rocheford, Torbert R.; Dudley, John W.
2004-01-01
In one of the longest-running experiments in biology, researchers at the University of Illinois have selected for altered composition of the maize kernel since 1896. Here we use an association study to infer the genetic basis of dramatic changes that occurred in response to selection for changes in oil concentration. The study population was produced by a cross between the high- and low-selection lines at generation 70, followed by 10 generations of random mating and the derivation of 500 lines by selfing. These lines were genotyped for 488 genetic markers and the oil concentration was evaluated in replicated field trials. Three methods of analysis were tested in simulations for ability to detect quantitative trait loci (QTL). The most effective method was model selection in multiple regression. This method detected ∼50 QTL accounting for ∼50% of the genetic variance, suggesting that >50 QTL are involved. The QTL effect estimates are small and largely additive. About 20% of the QTL have negative effects (i.e., not predicted by the parental difference), which is consistent with hitchhiking and small population size during selection. The large number of QTL detected accounts for the smooth and sustained response to selection throughout the twentieth century. PMID:15611182
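Model selection in multiple regression, the most effective QTL-detection method in the study above, can be sketched as forward stepwise selection: repeatedly add the marker that most reduces the residual sum of squares, stopping when the improvement becomes marginal. A pure-Python illustration (the 5% stopping rule is an assumption for the sketch, not the authors' criterion):

```python
def solve(A, b):
    """Gauss-Jordan elimination with partial pivoting (small dense systems)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c and M[c][c] != 0.0:
                f = M[r][c] / M[c][c]
                M[r] = [M[r][k] - f * M[c][k] for k in range(n + 1)]
    return [M[i][n] / M[i][i] for i in range(n)]

def ols_rss(X, y, cols):
    """Residual sum of squares of OLS on the chosen columns plus intercept."""
    n = len(y)
    Z = [[1.0] + [X[i][j] for j in cols] for i in range(n)]
    k = len(cols) + 1
    A = [[sum(Z[i][a] * Z[i][b] for i in range(n)) for b in range(k)]
         for a in range(k)]
    c = [sum(Z[i][a] * y[i] for i in range(n)) for a in range(k)]
    beta = solve(A, c)
    return sum((y[i] - sum(beta[j] * Z[i][j] for j in range(k))) ** 2
               for i in range(n))

def forward_select(X, y, max_terms):
    """Greedily add the marker that most reduces the RSS."""
    chosen, remaining = [], list(range(len(X[0])))
    mean_y = sum(y) / len(y)
    rss = sum((v - mean_y) ** 2 for v in y)
    for _ in range(max_terms):
        best = min(remaining, key=lambda j: ols_rss(X, y, chosen + [j]))
        new_rss = ols_rss(X, y, chosen + [best])
        if new_rss > 0.95 * rss:          # assumed stopping rule: <5% gain
            break
        chosen.append(best)
        remaining.remove(best)
        rss = new_rss
    return chosen
```

In practice QTL studies replace the ad hoc threshold with a permutation-based or information-criterion stopping rule, but the greedy search itself is the same.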
Seifertová, Marta; Čechová, Eliška; Llansola, Marta; Felipo, Vicente; Vykoukalová, Martina; Kočan, Anton
2017-10-01
We developed a simple analytical method for the simultaneous determination of representatives of various groups of neurotoxic insecticides (carbaryl, chlorpyrifos, cypermethrin, α-endosulfan and β-endosulfan, and their metabolite endosulfan sulfate) in limited amounts of animal tissues containing different amounts of lipids. Selected tissues (rodent fat, liver, and brain) were extracted in a special in-house-designed mini-extractor constructed on the basis of the Soxhlet and Twisselmann extractors. A dried tissue sample placed in a small cartridge was extracted, while the nascent extract was simultaneously filtered through a layer of sodium sulfate. The extraction was followed by combined clean-up, including gel permeation chromatography (in case of high lipid content), ultrasonication, and solid-phase extraction chromatography using C18 on silica and aluminum oxide. Gas chromatography coupled with high-resolution mass spectrometry was used for analyte separation, detection, and quantification. Average recoveries for individual insecticides ranged from 82 to 111%. Expanded measurement uncertainties were generally lower than 35%. The developed method was successfully applied to rat tissue samples obtained from an animal model dealing with insecticide exposure during brain development. This method may also be applied to the analytical treatment of small amounts of various types of animal and human tissue samples. A significant advantage of this method is high sample throughput due to the simultaneous treatment of many samples. Graphical abstract: Optimized workflow for the determination of selected insecticides in small amounts of animal tissue, including the newly developed mini-extractor.
Chen, Kai; Lynen, Frédéric; De Beer, Maarten; Hitzel, Laure; Ferguson, Paul; Hanna-Brown, Melissa; Sandra, Pat
2010-11-12
Stationary phase optimized selectivity liquid chromatography (SOSLC) is a promising technique to optimize the selectivity of a given separation by using a combination of different stationary phases. Previous work has shown that SOSLC offers excellent possibilities for method development, especially after the recent modification towards linear gradient SOSLC. The present work is aimed at developing and extending the SOSLC approach towards selectivity optimization and method development for green chromatography. Contrary to current LC practices, a green mobile phase (water/ethanol/formic acid) is hereby preselected and the composition of the stationary phase is optimized under a given gradient profile to obtain baseline resolution of all target solutes in the shortest possible analysis time. With the algorithm adapted to the high viscosity property of ethanol, the principle is illustrated with a fast, full baseline resolution for a randomly selected mixture composed of sulphonamides, xanthine alkaloids and steroids. Copyright © 2010 Elsevier B.V. All rights reserved.
Lei, Ting; Pochorovski, Igor; Bao, Zhenan
2017-04-18
Electronics that are soft, conformal, and stretchable are highly desirable for wearable electronics, prosthetics, and robotics. Among the various available electronic materials, single walled carbon nanotubes (SWNTs) and their networks have exhibited high mechanical flexibility and stretchability, along with electrical performance comparable to traditional rigid materials, e.g. polysilicon and metal oxides. Unfortunately, SWNTs produced en masse contain a mixture of semiconducting (s-) and metallic (m-) SWNTs, rendering them unsuitable for electronic applications. Moreover, the poor solubility of SWNTs requires the introduction of insulating surfactants to properly disperse them into individual tubes for device fabrication. Compared to other SWNT dispersion and separation methods, e.g., DNA wrapping, density gradient ultracentrifugation, and gel chromatography, polymer wrapping can selectively disperse s-SWNTs with high selectivity (>99.7%), high concentration (>0.1 mg/mL), and high yield (>20%). In addition, this method only requires simple sonication and centrifuge equipment, with processing times as short as 1 h. Despite these advantages, the polymer wrapping method still faces two major issues: (i) The purified s-SWNTs usually retain a substantial amount of polymer on their surface even after thorough rinsing. The low conductivity of the residual polymers impedes charge transport in SWNT networks. (ii) Conjugated polymers used for SWNT wrapping are expensive. Their prices ($100-1000/g) are comparable to or even higher than those of SWNTs ($10-300/g). These conjugated polymers represent a large portion of the overall separation cost. In this Account, we summarize recent progress in polymer design for selective dispersion and separation of SWNTs. We focus particularly on removable and/or recyclable polymers that enable low-cost and scalable separation methods.
First, different separation methods are compared to show the advantages of the polymer wrapping methods. Specifically, we compare different characterization methods used for purity evaluation. For s-SWNTs with high purity, i.e., >99%, short-channel (smaller than SWNT length) electrical measurement is more reliable than optical methods. Second, possible sorting mechanisms and molecular design strategies are discussed. Polymer parameters such as backbone design and side chain engineering affect the polymer-SWNT interactions, leading to different dispersion concentrations and selectivities. To address the above-mentioned limiting factors of polymer contamination and cost, we describe two important polymer removal and recycling approaches: (i) changing the polymer wrapping conformation to release SWNTs; (ii) depolymerization of the conjugated polymer into small molecular units that have less affinity toward SWNTs. These methods allow the removal and recycling of the wrapping polymers, thus providing low-cost and clean s-SWNTs. Third, we discuss various applications of polymer-sorted s-SWNTs, including flexible/stretchable thin-film transistors, thermoelectric devices, and solar cells. In these applications, polymer-sorted s-SWNTs and their networks have exhibited good processability, attractive mechanical properties, and high electrical performance. An increasing number of studies have shown that the removable polymer approaches can completely remove polymer residues in SWNT networks and lead to enhanced charge carrier mobility, higher conductivity, and better heterojunction interfaces.
Method and solvent composition for regenerating an ion exchange resin
Even, William R.; Irvin, David J.; Irvin, Jennifer A.; Tarver, Edward E.; Brown, Gilbert M.; Wang, James C. F.
2002-01-01
A method and composition for removing perchlorate from a highly selective ion exchange resin is disclosed. The disclosed approach comprises treating the resin in a solution of super critical or liquid carbon dioxide and one or more quaternary ammonium chloride surfactant compounds.
Dip coated TiO2 nanostructured thin film: synthesis and application
NASA Astrophysics Data System (ADS)
Vanaraja, Manoj; Muthukrishnan, Karthika; Boomadevi, Shanmugam; Karn, Rakesh Kumar; Singh, Vijay; Singh, Pramod K.; Pandiyan, Krishnamoorthy
2016-02-01
TiO2 thin film was fabricated by the dip coating method using titanium(IV) chloride as precursor and sodium carboxymethyl cellulose as thickening as well as capping agent. Structural and morphological features of the TiO2 thin film were characterized by X-ray diffraction and field emission scanning electron microscopy, respectively. Crystallinity of the film was confirmed by the high-intensity peak at the (101) plane, and its average crystallite size was found to be 28 nm. The ethanol-sensing properties of the TiO2 thin film were studied by the chemiresistive method. Furthermore, various gases were tested in order to verify the selectivity of the sensor. Among the several gases, the fabricated TiO2 sensor showed very high selectivity towards ethanol at room temperature.
Two-photon excitation cross-section in light and intermediate atoms
NASA Technical Reports Server (NTRS)
Omidvar, K.
1980-01-01
The method of explicit summation over the intermediate states is used along with LS coupling to derive an expression for two-photon absorption cross section in light and intermediate atoms in terms of integrals over radial wave functions. Two selection rules, one exact and one approximate, are also derived. In evaluating the radial integrals, for low-lying levels, the Hartree-Fock wave functions, and for high-lying levels, hydrogenic wave functions obtained by the quantum defect method are used. A relationship between the cross section and the oscillator strengths is derived. Cross sections due to selected transitions in nitrogen, oxygen, and chlorine are given. The expression for the cross section is useful in calculating the two-photon absorption in light and intermediate atoms.
Ma, Xin; Guo, Jing; Sun, Xiao
2015-01-01
The prediction of RNA-binding proteins is one of the most challenging problems in computational biology. Although some studies have investigated this problem, the accuracy of prediction is still not sufficient. In this study, a highly accurate method was developed to predict RNA-binding proteins from amino acid sequences using random forests with the minimum redundancy maximum relevance (mRMR) method, followed by incremental feature selection (IFS). We incorporated conjoint triad features and three novel features: binding propensity (BP), nonbinding propensity (NBP), and evolutionary information combined with physicochemical properties (EIPP). The results showed that these novel features play important roles in improving the performance of the predictor. Using the mRMR-IFS method, our predictor achieved the best performance (86.62% accuracy and a 0.737 Matthews correlation coefficient). The high prediction accuracy suggests that our method can be a useful approach to identify RNA-binding proteins from sequence information.
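The incremental feature selection (IFS) step described above ranks features by a relevance score and then grows the feature set one feature at a time, keeping the subset size that maximizes predictive accuracy. The simplified sketch below substitutes a t-like relevance score and a nearest-centroid classifier for the paper's mRMR ranking and random forest, so it shows only the shape of the procedure:

```python
def relevance_ranking(X, y):
    """Rank features by a t-like separation score between two classes."""
    scores = []
    for j in range(len(X[0])):
        a = [X[i][j] for i in range(len(y)) if y[i] == 0]
        b = [X[i][j] for i in range(len(y)) if y[i] == 1]
        ma, mb = sum(a) / len(a), sum(b) / len(b)
        va = sum((v - ma) ** 2 for v in a) / len(a)
        vb = sum((v - mb) ** 2 for v in b) / len(b)
        scores.append(abs(ma - mb) / ((va + vb) ** 0.5 + 1e-12))
    return sorted(range(len(scores)), key=lambda j: -scores[j])

def nearest_centroid_accuracy(X, y, cols, split):
    """Train a nearest-centroid classifier on X[:split], test on the rest."""
    centroids = {}
    for c in (0, 1):
        idx = [i for i in range(split) if y[i] == c]
        centroids[c] = [sum(X[i][j] for i in idx) / len(idx) for j in cols]
    correct = 0
    for i in range(split, len(y)):
        d = {c: sum((X[i][j] - centroids[c][k]) ** 2
                    for k, j in enumerate(cols)) for c in (0, 1)}
        correct += min(d, key=d.get) == y[i]
    return correct / (len(y) - split)

def incremental_feature_selection(X, y, split):
    """Grow the top-ranked feature set one at a time; keep the best size."""
    ranked = relevance_ranking(X[:split], y[:split])
    best_k, best_acc = 1, 0.0
    for k in range(1, len(ranked) + 1):
        acc = nearest_centroid_accuracy(X, y, ranked[:k], split)
        if acc > best_acc:
            best_k, best_acc = k, acc
    return ranked[:best_k], best_acc
```

The key property IFS exploits is that accuracy typically rises as informative features are added and falls again once noisy ones enter, so scanning subset sizes finds the sweet spot.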
Hydrogen production by high-temperature water splitting using electron-conducting membranes
Lee, Tae H.; Wang, Shuangyan; Dorris, Stephen E.; Balachandran, Uthamalingam
2004-04-27
A device and method for separating water into hydrogen and oxygen is disclosed. A first substantially gas-impervious solid electron-conducting membrane for selectively passing hydrogen is provided and spaced from a second substantially gas-impervious solid electron-conducting membrane for selectively passing oxygen. When steam is passed between the two membranes at dissociation temperatures, the hydrogen from the dissociation of steam selectively and continuously passes through the first membrane and oxygen selectively and continuously passes through the second membrane, thereby continuously driving the dissociation of steam to produce hydrogen and oxygen.
Composite patterning devices for soft lithography
Rogers, John A.; Menard, Etienne
2007-03-27
The present invention provides methods, devices and device components for fabricating patterns on substrate surfaces, particularly patterns comprising structures having microsized and/or nanosized features of selected lengths in one, two or three dimensions. The present invention provides composite patterning devices comprising a plurality of polymer layers each having selected mechanical properties, such as Young's Modulus and flexural rigidity, selected physical dimensions, such as thickness, surface area and relief pattern dimensions, and selected thermal properties, such as coefficients of thermal expansion, to provide high resolution patterning on a variety of substrate surfaces and surface morphologies.
Effects of Comprehensive, Multiple High-Risk Behaviors Prevention Program on High School Students
ERIC Educational Resources Information Center
Collier, Crystal
2013-01-01
The purpose of this mixed methods study was to examine the effect of a multiple high-risk behaviors prevention program applied comprehensively throughout an entire school-system involving universal, selective, and indicated levels of students at a local private high school during a 4-year period. The prevention program was created based upon the…
Recent trends in SELEX technique and its application to food safety monitoring
Mei, Zhanlong; Yao, Li; Wang, Xin; Zheng, Lei; Liu, Jian; Liu, Guodong; Peng, Chifang; Chen, Wei
2014-01-01
The method referred to as "systematic evolution of ligands by exponential enrichment" (SELEX) was introduced in 1990 and has since become an important tool for the identification and screening of aptamers. Such nucleic acids can recognize and bind to their corresponding targets (analytes) with high selectivity and affinity, and aptamers have therefore become attractive alternatives to traditional antibodies, not least because they are much more stable. Meanwhile, they have found numerous applications in different fields, including food quality and safety monitoring. This review first gives an introduction to the selection process and the evolution of SELEX, then covers applications of aptamers in the surveillance of food safety (with subsections on absorptiometric, electrochemical, fluorescent and other methods), and then gives conclusions and perspectives. The SELEX method excels by its in vitro operation, high throughput and ease of use. This review contains 86 references. PMID:25419005
Asymmetric organic-inorganic hybrid membrane formation via block copolymer-nanoparticle co-assembly.
Gu, Yibei; Dorin, Rachel M; Wiesner, Ulrich
2013-01-01
A facile method for forming asymmetric organic-inorganic hybrid membranes for selective separation applications is developed. This approach combines co-assembly of block copolymer (BCP) and inorganic nanoparticles (NPs) with non-solvent induced phase separation. The method is successfully applied to two distinct molar mass BCPs with different fractions of titanium dioxide (TiO2) NPs. The resulting hybrid membranes exhibit structural asymmetry with a thin nanoporous surface layer on top of a macroporous fingerlike support layer. Key parameters that dictate membrane surface morphology include the fraction of inorganics used and the length of time allowed for surface layer development. The resulting membranes exhibit both good selectivity and high permeability (3200 ± 500 Lm(-2) h(-1) bar(-1)). This fast and straightforward synthesis method for asymmetric hybrid membranes provides a new self-assembly platform upon which multifunctional and high-performance organic-inorganic hybrid membranes can be formed.
Olfat, A M; Karimi, A N; Parsapajouh, D
2007-04-01
A biological agar-block method was developed that allowed wood samples to be evaluated and monitored in terms of colonization and development of decay by the basidiomycete fungus Coriolus versicolor and to be directly classified based on mean mass loss. In this research, the in vitro decay of five commercial woods by Coriolus versicolor was studied by the agar-block method. The selected wood samples were Abies alba, Populus alba, Fagus orientalis, Platanus orientalis and Ulmus glabra. The results demonstrated the strongest resistance in Ulmus glabra and the lowest resistance in Fagus orientalis; the mass losses were 16.8 and 42.4%, respectively. There was also a high correlation between mass loss and apparent damage. Biological evaluation of wood biodegradation will therefore be of high priority when selecting wood types for various applications.
Bruno, Sergio N F; Cardoso, Carlos R; Maciel, Márcia Mosca A; Vokac, Lidmila; da Silva Junior, Ademário I
2014-09-15
High-pressure liquid chromatography with ultra-violet detection (HPLC-UV) is one of the most commonly used methods to identify and quantify saccharin in non-alcoholic beverages. However, due to the wide variety of interfering UV spectra in saccharin-containing beverage matrices, the same method cannot be used to measure this analyte accurately. We have developed a new, highly effective method to identify and quantify saccharin using HPLC with fluorescence detection (HPLC-FLD). The excitation wavelength (250 nm) and emission wavelength (440 nm) chosen increased selectivity for all matrices and ensured few changes were required in the mobile phase or other parameters. The presence of saccharin in non-diet beverages - a fraud commonly used to replace more expensive sucrose - was confirmed by comparing coincident peaks as well as the emission spectra of standards and samples. Copyright © 2014 Elsevier Ltd. All rights reserved.
Petrarca, Mateus Henrique; Ccanccapa-Cartagena, Alexander; Masiá, Ana; Godoy, Helena Teixeira; Picó, Yolanda
2017-05-12
A new selective and sensitive liquid chromatography triple quadrupole mass spectrometry method was developed for the simultaneous analysis of natural pyrethrin and synthetic pyrethroid residues in baby food. In this study, two sample preparation methods based on ultrasound-assisted dispersive liquid-liquid microextraction (UA-DLLME) and salting-out assisted liquid-liquid extraction (SALLE) were optimized and then compared on performance criteria. Appropriate linearity in solvent and matrix-based calibrations, and suitable recoveries (75-120%) and precision (RSD values ≤16%), were achieved for the selected analytes by either sample preparation procedure. Both methods provided the analytical selectivity required for the monitoring of the insecticides in fruit-, cereal- and milk-based baby foods. SALLE, recognized for its cost-effectiveness and simple and fast execution, provided a lower enrichment factor; consequently, higher limits of quantification (LOQs) were obtained, some of them too high to meet the strict legislation on baby food. Nonetheless, the combination of ultrasound and DLLME resulted in a high-sample-throughput and environmentally friendly method whose LOQs were lower than the default maximum residue limit (MRL) of 10 μg kg-1 set by the European Community for baby foods. In the commercial baby foods analyzed, cyhalothrin and etofenprox were detected in different samples, demonstrating the suitability of the proposed method for baby food control. Copyright © 2017 Elsevier B.V. All rights reserved.
Frikha-Gargouri, Olfa; Ben Abdallah, Dorra; Bhar, Ilhem; Tounsi, Slim
2017-01-01
This study aimed to improve the screening method for the selection of Bacillus biocontrol agents against crown gall disease. The relationship between strain biocontrol ability and in vitro studied traits was investigated to identify the most important factors to be considered for the selection of effective biocontrol agents. In fact, the previous selection procedure, relying only on in vitro antibacterial activity, was shown to be unsuitable in some cases. A direct plant-protection strategy was performed to screen the 32 Bacillus biocontrol agent candidates. Moreover, potential in vitro biocontrol traits were investigated, including biofilm formation, motility, hemolytic activity, detection of lipopeptide biosynthetic genes (sfp, ituC and bmyB) and production of antibacterial compounds. The obtained results indicated high correlations of the efficiency of the biocontrol with the reduction of gall weight (p = 0.000) and the antibacterial activity in vitro (p = 0.000). Moreover, there were strong correlations of the efficiency of the biocontrol (p = 0.004) and the reduction in gall weight (p = 0.000) with the presence of the bmyB gene. This gene directs the synthesis of the lipopeptide bacillomycin, belonging to the iturinic family of lipopeptides. These results were also confirmed by the two-way hierarchical cluster analysis and the correspondence analysis, showing the relatedness of these four variables. According to the obtained results, a new screening procedure for Bacillus biocontrol agents against crown gall disease could be advanced, consisting of a two-step selection procedure. The first step consists of selecting strains with high antibacterial activity in vitro or those harbouring the bmyB gene; further selection then has to be performed on tomato plants in vivo. Moreover, based on the results of the biocontrol assay, five potent strains exhibiting high biocontrol abilities were selected. They were identified as Bacillus subtilis or Bacillus amyloliquefaciens.
These strains were found to produce either surfactin or both surfactin and iturin lipopeptides. In conclusion, our study presents a new and effective method to evaluate the biocontrol ability of antagonistic Bacillus strains against crown gall disease that could increase the efficiency of the screening of biocontrol agents. Besides, the selected strains could be used as novel biocontrol agents against pathogenic Agrobacterium tumefaciens strains.
Frikha-Gargouri, Olfa; Ben Abdallah, Dorra; Bhar, Ilhem; Tounsi, Slim
2017-01-01
This study aimed to improve the screening method for the selection of Bacillus biocontrol agents against crown gall disease. The relationship between strain biocontrol ability and in vitro traits was investigated to identify the most important factors to consider when selecting effective biocontrol agents. The previous selection procedure, relying only on in vitro antibacterial activity, had been shown to be unsuitable in some cases. A direct plant-protection strategy was used to screen 32 Bacillus biocontrol agent candidates. In addition, potential in vitro biocontrol traits were investigated, including biofilm formation, motility, hemolytic activity, detection of lipopeptide biosynthetic genes (sfp, ituC and bmyB) and production of antibacterial compounds. The results indicated high correlations of biocontrol efficiency with the reduction of gall weight (p = 0.000) and with in vitro antibacterial activity (p = 0.000). There were also strong correlations of biocontrol efficiency (p = 0.004) and gall weight reduction (p = 0.000) with the presence of the bmyB gene. This gene directs the synthesis of the lipopeptide bacillomycin, which belongs to the iturinic family of lipopeptides. These results were confirmed by two-way hierarchical cluster analysis and correspondence analysis, which showed the relatedness of these four variables. Based on these results, a new two-step screening procedure for Bacillus biocontrol agents against crown gall disease can be proposed. The first step consists of selecting strains with high in vitro antibacterial activity or strains harbouring the bmyB gene. Further selection is then performed on tomato plants in vivo. Based on the results of the biocontrol assay, five potent strains exhibiting high biocontrol abilities were selected. They were identified as Bacillus subtilis or Bacillus amyloliquefaciens.
These strains were found to produce either surfactin alone or surfactin and iturin lipopeptides. In conclusion, our study presents a new and effective method to evaluate the biocontrol ability of antagonistic Bacillus strains against crown gall disease, which could increase the efficiency of screening for biocontrol agents. The selected strains could also be used as novel biocontrol agents against pathogenic Agrobacterium tumefaciens strains. PMID:28855909
Purely Structural Protein Scoring Functions Using Support Vector Machine and Ensemble Learning.
Mirzaei, Shokoufeh; Sidi, Tomer; Keasar, Chen; Crivelli, Silvia
2016-08-24
The function of a protein is determined by its structure, which creates a need for efficient methods of protein structure determination to advance scientific and medical research. Because current experimental structure determination methods carry a high price tag, computational predictions are highly desirable. Given a protein sequence, computational methods produce numerous 3D structures known as decoys. However, selecting the best-quality decoys is challenging because end users can handle only a few of them. Therefore, scoring functions are central to decoy selection. They combine measurable features into a single-number indicator of decoy quality. Unfortunately, current scoring functions do not consistently select the best decoys. Machine learning techniques offer great potential to improve decoy scoring. This paper presents two machine-learning-based scoring functions that predict the quality of protein structures, i.e., the similarity between the predicted structure and the experimental one without knowing the latter. We use different metrics to compare these scoring functions against three state-of-the-art scores. This is a first attempt at comparing different scoring functions using the same non-redundant dataset for training and testing and the same features. The results show that adding informative features may be more significant than the method used.
Automatic blood vessel based-liver segmentation using the portal phase abdominal CT
NASA Astrophysics Data System (ADS)
Maklad, Ahmed S.; Matsuhiro, Mikio; Suzuki, Hidenobu; Kawata, Yoshiki; Niki, Noboru; Shimada, Mitsuo; Iinuma, Gen
2018-02-01
Liver segmentation is the basis for computer-based planning of hepatic surgical interventions, and automatic segmentation of the liver is highly important in the diagnosis and analysis of hepatic diseases. Blood vessel (BV) information has shown high performance in liver segmentation. In our previous work, we developed a semi-automatic method that segments the liver from portal phase abdominal CT images in two stages. The first stage was interactive segmentation of abdominal blood vessels (ABVs) and subsequent classification into hepatic (HBVs) and non-hepatic (non-HBVs). This stage required five interactions: a selective threshold for bone segmentation, two seed points for kidney segmentation, selection of the inferior vena cava (IVC) entrance to start ABV segmentation, identification of the portal vein (PV) entrance to the liver, and the IVC exit for classifying HBVs from other ABVs (non-HBVs). The second stage is automatic segmentation of the liver based on the segmented ABVs as described in [4]. Toward full automation of our method, we developed a method [5] that segments ABVs automatically, removing the first three interactions. In this paper, we propose full automation of the classification of ABVs into HBVs and non-HBVs and, consequently, full automation of the liver segmentation proposed in [4]. Results illustrate that the method is effective at segmenting the liver from portal phase abdominal CT images.
Safo, Sandra E; Li, Shuzhao; Long, Qi
2018-03-01
Integrative analysis of high dimensional omics data is becoming increasingly popular. At the same time, incorporating known functional relationships among variables in analysis of omics data has been shown to help elucidate underlying mechanisms for complex diseases. In this article, our goal is to assess association between transcriptomic and metabolomic data from a Predictive Health Institute (PHI) study that includes healthy adults at a high risk of developing cardiovascular diseases. Adopting a strategy that is both data-driven and knowledge-based, we develop statistical methods for sparse canonical correlation analysis (CCA) with incorporation of known biological information. Our proposed methods use prior network structural information among genes and among metabolites to guide selection of relevant genes and metabolites in sparse CCA, providing insight on the molecular underpinning of cardiovascular disease. Our simulations demonstrate that the structured sparse CCA methods outperform several existing sparse CCA methods in selecting relevant genes and metabolites when structural information is informative and are robust to mis-specified structural information. Our analysis of the PHI study reveals that a number of gene and metabolic pathways including some known to be associated with cardiovascular diseases are enriched in the set of genes and metabolites selected by our proposed approach. © 2017, The International Biometric Society.
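As a rough illustration of the sparse CCA machinery described above, the following sketch implements plain sparse CCA by alternating soft-thresholded power iterations on the cross-covariance matrix. It omits the paper's network-structure penalty, and the penalty levels and variable names are assumptions:

```python
import numpy as np

def soft_threshold(v, t):
    # elementwise soft-thresholding, the usual L1 proximal step
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def sparse_cca(X, Y, lx=0.1, ly=0.1, n_iter=100):
    """Sparse canonical vectors via alternating soft-thresholded
    power iterations on the cross-covariance C = X'Y / n.
    No graph penalty here; the structured variant additionally
    uses network information among genes and metabolites."""
    C = X.T @ Y / X.shape[0]
    u = np.ones(C.shape[0]) / np.sqrt(C.shape[0])
    v = np.ones(C.shape[1]) / np.sqrt(C.shape[1])
    for _ in range(n_iter):
        u = soft_threshold(C @ v, lx)
        if np.linalg.norm(u) > 0:
            u /= np.linalg.norm(u)
        v = soft_threshold(C.T @ u, ly)
        if np.linalg.norm(v) > 0:
            v /= np.linalg.norm(v)
    return u, v
```

In the structured variant, the soft-thresholding step would be replaced by a proximal step that also penalizes differences between coefficients of genes or metabolites connected in the prior network.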
A systematic review of stakeholder views of selection methods for medical school admission.
Kelly, M E; Patterson, F; O'Flynn, S; Mulligan, J; Murphy, A W
2018-06-15
The purpose of this paper is to systematically review the literature on stakeholder views of selection methods for medical school admissions. An electronic search of nine databases was conducted between January 2000 and July 2014. Two reviewers independently assessed all titles (n = 1017) and retained abstracts (n = 233) for relevance. The methodological quality of quantitative papers was assessed using the MERSQI instrument. The overall quality of evidence in this field was low, and evidence was synthesised in a narrative review. Applicants support interviews and multiple mini interviews (MMIs). There is emerging evidence that situational judgement tests (SJTs) and selection centres (SCs) are also well regarded, but aptitude tests less so. Selectors endorse the use of interviews in general and MMIs in particular, judging them to be fair, relevant and appropriate, with emerging evidence of similarly positive reactions to SCs. Aptitude tests and academic records were valued in decisions about whom to call to interview. Medical students prefer interview-based selection to cognitive aptitude tests and are unconvinced about the transparency and veracity of written applications. Perceptions of organisational justice, which describe views of fairness in organisational processes, appear to be highly influential on stakeholders' views of the acceptability of selection methods. In particular, procedural justice (perceived fairness of selection tools in terms of job relevance and characteristics of the test) and distributive justice (perceived fairness of selection outcomes in terms of equal opportunity and equity) appear to be important considerations when deciding on the acceptability of selection methods. There were significant gaps with respect to both key stakeholder groups and the range of selection tools assessed.
Notwithstanding the observed limitations in the quality of research in this field, there appears to be broad concordance of views on the various selection methods across the diverse stakeholder groups. This review highlights the need for better standards, more appropriate methodologies and a broader scope of stakeholder research.
Computational structural mechanics methods research using an evolving framework
NASA Technical Reports Server (NTRS)
Knight, N. F., Jr.; Lotts, C. G.; Gillian, R. E.
1990-01-01
Advanced structural analysis and computational methods that exploit high-performance computers are being developed in a computational structural mechanics research activity sponsored by the NASA Langley Research Center. These new methods are developed in an evolving framework and applied to representative complex structural analysis problems from the aerospace industry. An overview of the methods development environment is presented, and methods research areas are described. Selected application studies are also summarized.
Discriminative dictionary learning for abdominal multi-organ segmentation.
Tong, Tong; Wolz, Robin; Wang, Zehan; Gao, Qinquan; Misawa, Kazunari; Fujiwara, Michitaka; Mori, Kensaku; Hajnal, Joseph V; Rueckert, Daniel
2015-07-01
An automated segmentation method is presented for multi-organ segmentation in abdominal CT images. Dictionary learning and sparse coding techniques are used in the proposed method to generate target-specific priors for segmentation. The method simultaneously learns dictionaries, which have reconstructive power, and classifiers, which have discriminative ability, from a set of selected atlases. Based on the learnt dictionaries and classifiers, probabilistic atlases are then generated to provide priors for the segmentation of unseen target images. The final segmentation is obtained by applying a post-processing step based on a graph-cuts method. In addition, this paper proposes a voxel-wise local atlas selection strategy to deal with high inter-subject variation in abdominal CT images. The segmentation performance of the proposed method with different atlas selection strategies is also compared. Our proposed method has been evaluated on a database of 150 abdominal CT images and achieves a promising segmentation performance with Dice overlap values of 94.9%, 93.6%, 71.1%, and 92.5% for liver, kidneys, pancreas, and spleen, respectively. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
Bossi, Rossana; Rastogi, Suresh C; Bernard, Guillaume; Gimenez-Arnau, Elena; Johansen, Jeanne D; Lepoittevin, Jean-Pierre; Menné, Torkil
2004-05-01
This paper describes a validated liquid chromatographic-tandem mass spectrometric method for quantitative analysis of the potential oak moss allergens atranol and chloroatranol in perfumes and similar products. The method employs LC-MS-MS with electrospray ionization (ESI) in negative mode. The compounds are analysed by selective reaction monitoring (SRM) of 2 or 3 ions for each compound in order to obtain high selectivity and sensitivity. The method has been validated for the following parameters: linearity; repeatability; recovery; limit of detection; and limit of quantification. The limits of detection achieved by this method, 5.0 ng/mL and 2.4 ng/mL for atranol and chloroatranol, respectively, allowed identification of these compounds at concentrations below those causing allergic skin reactions in oak-moss-sensitive patients. The recovery of chloroatranol from spiked perfumes was 96 ± 4%. Low recoveries (49 ± 5%) were observed for atranol in spiked perfumes, indicating ion suppression caused by matrix components. The method has been applied to the analysis of 10 randomly selected perfumes and similar products.
Wei, Zuofu; Pan, Youzhi; Li, Lu; Huang, Yuyang; Qi, Xiaolin; Luo, Meng; Zu, Yuangang; Fu, Yujie
2014-11-01
A method based on matrix solid-phase dispersion extraction followed by ultra high performance liquid chromatography with tandem mass spectrometry is presented for the extraction and determination of phenolic compounds in Equisetum palustre. This method combines the high efficiency of matrix solid-phase dispersion extraction with the rapidity, sensitivity, and accuracy of ultra high performance liquid chromatography with tandem mass spectrometry. The influential parameters of the matrix solid-phase dispersion extraction were investigated and optimized. The optimized conditions were as follows: silica gel was selected as the dispersing sorbent, the ratio of silica gel to sample was 2:1 (400/200 mg), and 8 mL of 80% methanol was used as the elution solvent. Furthermore, a fast and sensitive ultra high performance liquid chromatography with tandem mass spectrometry method was developed for the determination of nine phenolic compounds in E. palustre. This method was carried out in less than 6 min and exhibited satisfactory linearity, precision, and recovery. Compared with ultrasound-assisted extraction, the proposed matrix solid-phase dispersion procedure possessed higher extraction efficiency and was more convenient and time saving, with reduced sample and solvent requirements. All these results suggest that the developed method represents an excellent alternative for the extraction and determination of active components in plant matrices. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Monitoring visitor use in backcountry and wilderness: a review of methods
Steven J. Hollenhorst; Steven A. Whisman; Alan W. Ewert
1992-01-01
Obtaining accurate and usable visitor counts in backcountry and wilderness settings continues to be problematic for resource managers because use of these areas is dispersed and costs can be prohibitively high. An overview of the available methods for obtaining reliable data on recreation use levels is provided. Monitoring methods were compared and selection criteria...
Zhang, Xiaodong; Chen, Xiaokai; Kai, Siqi; Wang, Hong-Yin; Yang, Jingjing; Wu, Fu-Gen; Chen, Zhan
2015-03-17
A simple and highly efficient method for dopamine (DA) detection using water-soluble silicon nanoparticles (SiNPs) is reported. The SiNPs, with a high quantum yield of 23.6%, were synthesized using a one-pot microwave-assisted method. The fluorescence quenching capability of a variety of molecules on the synthesized SiNPs was tested; only DA molecules were found to quench the fluorescence of these SiNPs effectively. Therefore, this quenching effect can be used to selectively detect DA. All other molecules tested showed little interference with dopamine detection, including ascorbic acid, which commonly exists in cells and can potentially affect DA detection. The ratio of the fluorescence intensity difference between the quenched and unquenched cases to the fluorescence intensity without quenching (ΔI/I) was observed to be linearly proportional to the DA concentration in the range from 0.005 to 10.0 μM, with a detection limit of 0.3 nM (S/N = 3). To the best of our knowledge, this is the lowest detection limit for DA reported so far. The fluorescence quenching mechanism is attributed to energy transfer from the SiNPs to oxidized dopamine molecules through Förster resonance energy transfer. The reported method of SiNP synthesis is simple and cheap, making this sensitive and selective DA detection approach practical for many applications.
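The ΔI/I readout described above amounts to a linear calibration. A minimal sketch follows; the slope, intercept, and data points below are invented for illustration, and only the 0.005-10.0 μM linear range comes from the abstract:

```python
import numpy as np

def fit_quenching_calibration(conc_uM, dI_over_I):
    # least-squares line dI/I = a * [DA] + b over the linear range
    a, b = np.polyfit(conc_uM, dI_over_I, 1)
    return a, b

def concentration_from_signal(dI_over_I, a, b):
    # invert the calibration line to read out an unknown [DA]
    return (dI_over_I - b) / a
```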
High School Biology Students' Transfer of the Concept of Natural Selection: A Mixed-Methods Approach
ERIC Educational Resources Information Center
Pugh, Kevin J.; Koskey, Kristin L. K.; Linnenbrink-Garcia, Lisa
2014-01-01
The concept of natural selection serves as a foundation for understanding diverse biological concepts and has broad applicability to other domains. However, we know little about students' abilities to transfer (i.e. apply to a new context or use generatively) this concept and the relation between students' conceptual understanding and transfer…
NASA Astrophysics Data System (ADS)
Ghoudelbourk, Sihem.; Dib, D.; Meghni, B.; Zouli, M.
2017-02-01
This paper deals with a control strategy for multilevel converters in photovoltaic systems integrated into distribution grids. The objective of the proposed work is to design multilevel inverters for solar energy applications that reduce the total harmonic distortion (THD) and improve power quality. The multilevel inverter power structure plays a vital role in every aspect of the power system: it is easier to produce a high-power, high-voltage inverter with a multilevel structure. Multilevel inverter topologies have several advantages, such as high output voltage, lower THD and reduced voltage ratings of the power semiconductor switching devices. The proposed control strategy implements selective harmonic elimination (SHE) modulation for eleven levels. SHE is an important and efficient strategy for eliminating selected harmonics by judicious selection of the firing angles of the inverter, and it removes the need for expensive low-pass filters in the system. Previous research assumed constant and equal DC sources with invariant behavior; this work extends earlier results to varying DC sources, which are typical of lead-acid batteries used in PV systems. This study also investigates methods to minimize the total harmonic distortion of the synthesized multilevel waveform and to help balance the battery voltage. The harmonic elimination method was used to eliminate selected lower dominant harmonics resulting from the inverter switching action.
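To make the SHE idea concrete, here is a sketch for a simplified two-angle (five-level) case: fix the fundamental and null the 5th harmonic. The eleven-level inverter in the paper solves a larger system of the same form; the initial guesses and modulation value here are assumptions:

```python
import numpy as np
from scipy.optimize import fsolve

def she_angles(m, harmonic=5):
    """Solve for two firing angles (radians) of a five-level waveform:
    set the fundamental (cos a1 + cos a2 = m) and null the selected
    harmonic (cos(h*a1) + cos(h*a2) = 0). A two-angle simplification;
    an eleven-level inverter uses five angles and four nulled harmonics."""
    def equations(angles):
        a1, a2 = angles
        return [np.cos(a1) + np.cos(a2) - m,
                np.cos(harmonic * a1) + np.cos(harmonic * a2)]
    a1, a2 = fsolve(equations, [0.3, 1.0])  # initial guesses are assumptions
    return a1, a2
```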
Highly selective and sensitive fluorescent paper sensor for nitroaromatic explosive detection.
Ma, Yingxin; Li, Hao; Peng, Shan; Wang, Leyu
2012-10-02
Rapid, sensitive, and selective detection of explosives such as 2,4,6-trinitrotoluene (TNT) and 2,4,6-trinitrophenol (TNP), especially using a facile paper sensor, is in high demand for homeland security and public safety. Although many strategies have been successfully developed for the detection of TNT, it is not easy to differentiate the influence of TNP, and few methods have been demonstrated for the selective detection of TNP. In this work, via a facile and versatile method, 8-hydroxyquinoline aluminum (Alq(3))-based bluish green fluorescent composite nanospheres were synthesized through self-assembly under vigorous stirring and ultrasonic treatment. These polymer-coated nanocomposites are not only water-stable but also highly luminescent. Based on the dramatic and selective fluorescence quenching of the nanocomposites upon adding TNP to the aqueous solution, a sensitive and robust platform was developed for visual detection of TNP in a mixture of nitroaromatics including TNT, 2,4-dinitrotoluene (DNT), and nitrobenzene (NB). Meanwhile, the fluorescence intensity is proportional to the concentration of TNP in the range of 0.05-7.0 μg/mL, with a 3σ limit of detection of 32.3 ng/mL. By handwriting or fingerprinting with TNP solution as ink on filter paper soaked with the fluorescent nanocomposites, the bluish green fluorescence was instantly and dramatically quenched, leaving dark patterns on the paper. Therefore, a convenient and rapid paper sensor for TNP-selective detection was fabricated.
Hash Bit Selection for Nearest Neighbor Search.
Xianglong Liu; Junfeng He; Shih-Fu Chang
2017-11-01
To overcome the barrier of storage and computation when dealing with gigantic-scale data sets, compact hashing has been studied extensively to approximate the nearest neighbor search. Despite the recent advances, critical design issues remain open in how to select the right features, hashing algorithms, and/or parameter settings. In this paper, we address these by posing an optimal hash bit selection problem, in which an optimal subset of hash bits is selected from a pool of candidate bits generated by different features, algorithms, or parameters. Inspired by the optimization criteria used in existing hashing algorithms, we adopt bit reliability and complementarity as the selection criteria, which can be carefully tailored for hashing performance in different tasks. The bit selection solution is then discovered by finding the best tradeoff between search accuracy and time using a modified dynamic programming method. To further reduce the computational complexity, we employ the pairwise relationships among hash bits to approximate the high-order independence property, and formulate the problem as an efficient quadratic programming method that is theoretically equivalent to the normalized dominant set problem in a vertex- and edge-weighted graph. Extensive large-scale experiments have been conducted under several important application scenarios of hash techniques, where our bit selection framework achieves superior performance over both naive selection methods and state-of-the-art hashing algorithms, with significant relative accuracy gains ranging from 10% to 50%.
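A greedy simplification of the bit selection idea, trading per-bit reliability against pairwise redundancy with the already-selected bits; the paper's actual solver uses dynamic and quadratic programming, and the scoring rule and names here are illustrative:

```python
import numpy as np

def select_bits(reliability, redundancy, k, lam=0.5):
    """Greedily pick k hash bits, trading off per-bit reliability
    against redundancy with already-selected bits.

    reliability: (n,) array, higher = more reliable bit
    redundancy:  (n, n) symmetric array, higher = more redundant pair
    lam:         weight of the redundancy penalty
    """
    n = len(reliability)
    selected = []
    remaining = set(range(n))
    for _ in range(k):
        best, best_score = None, -np.inf
        for i in remaining:
            penalty = sum(redundancy[i, j] for j in selected)
            score = reliability[i] - lam * penalty
            if score > best_score:
                best, best_score = i, score
        selected.append(best)
        remaining.remove(best)
    return selected
```

Greedy selection only approximates the complementarity objective; the paper's quadratic-programming formulation scores whole subsets jointly.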
Efficient feature selection using a hybrid algorithm for the task of epileptic seizure detection
NASA Astrophysics Data System (ADS)
Lai, Kee Huong; Zainuddin, Zarita; Ong, Pauline
2014-07-01
Feature selection is a very important aspect of machine learning. It entails the search for an optimal subset from a very large data set with a high dimensional feature space. Apart from eliminating redundant features and reducing computational cost, a good selection of features also leads to higher prediction and classification accuracy. In this paper, an efficient feature selection technique is introduced for the task of epileptic seizure detection. The raw data are electroencephalography (EEG) signals. Using the discrete wavelet transform, the biomedical signals were decomposed into several sets of wavelet coefficients. To reduce the dimension of these wavelet coefficients, a feature selection method that combines the strengths of both filter and wrapper methods is proposed. Principal component analysis (PCA) is used as the filter method. As the wrapper method, the evolutionary harmony search (HS) algorithm is employed. This metaheuristic aims at finding the best discriminating set of features from the original data. The obtained features were then used as input for an automated classifier, namely wavelet neural networks (WNNs). The WNN model was trained to perform a binary classification task, that is, to determine whether a given EEG signal was normal or epileptic. For comparison purposes, different sets of features were also used as input. Simulation results showed that the WNNs that used the features chosen by the hybrid algorithm achieved the highest overall classification accuracy.
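A sketch of the filter-plus-wrapper combination described above, with PCA as the filter and, for brevity, plain random search standing in for harmony search and a nearest-centroid classifier standing in for the WNN; all names and parameters are assumptions:

```python
import numpy as np

def pca_filter(X, n_components):
    # filter step: center and project onto the top principal components
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

def centroid_accuracy(X, y):
    # toy evaluator: training accuracy of a nearest-centroid classifier
    classes = np.unique(y)
    centroids = np.stack([X[y == c].mean(axis=0) for c in classes])
    d = ((X[:, None, :] - centroids[None]) ** 2).sum(axis=2)
    return float((classes[d.argmin(axis=1)] == y).mean())

def wrapper_search(X, y, n_iter=200, seed=0):
    # stochastic wrapper step: score random feature masks, keep the best
    # (harmony search would refine a memory of good masks instead)
    rng = np.random.default_rng(seed)
    best_mask, best_acc = None, -1.0
    for _ in range(n_iter):
        mask = rng.random(X.shape[1]) < 0.5
        if not mask.any():
            continue
        acc = centroid_accuracy(X[:, mask], y)
        if acc > best_acc:
            best_mask, best_acc = mask, acc
    return best_mask, best_acc
```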
Gao, JianZhao; Tao, Xue-Wen; Zhao, Jia; Feng, Yuan-Ming; Cai, Yu-Dong; Zhang, Ning
2017-01-01
Lysine acetylation, one type of post-translational modification (PTM), plays key roles in cellular regulation and can be involved in a variety of human diseases. However, it is often costly and time-consuming to identify lysine acetylation sites with traditional experimental approaches. Therefore, effective computational methods should be developed to predict acetylation sites. In this study, we developed a position-specific method for epsilon lysine acetylation site prediction. Sequences of acetylated proteins were retrieved from the UniProt database. Various kinds of features, such as position specific scoring matrix (PSSM), amino acid factors (AAF), and disorder, were incorporated. A feature selection method based on mRMR (Maximum Relevance Minimum Redundancy) and IFS (Incremental Feature Selection) was employed. Finally, 319 optimal features were selected from a total of 541 features. Using the 319 optimal features to encode peptides, a predictor was constructed based on dagging. As a result, an accuracy of 69.56% with an MCC of 0.2792 was achieved. Analysis of the optimal features suggested some important factors determining lysine acetylation sites and provided insights into the mechanism of lysine acetylation, offering guidance for experimental validation. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
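The IFS step described above can be sketched as follows: given features already ranked (for example, by mRMR), evaluate growing prefixes of the ranking and keep the best-scoring prefix. The evaluation function is supplied by the caller, and the names are illustrative:

```python
import numpy as np

def incremental_feature_selection(X, y, ranking, evaluate):
    """IFS: score growing prefixes of a precomputed feature ranking
    and return the best-scoring prefix along with its score.

    ranking:  list of column indices, best-ranked first (e.g., from mRMR)
    evaluate: callable (X_subset, y) -> score, higher is better
    """
    best_k, best_score = 0, -np.inf
    for k in range(1, len(ranking) + 1):
        score = evaluate(X[:, ranking[:k]], y)
        if score > best_score:
            best_k, best_score = k, score
    return ranking[:best_k], best_score
```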
Pappula, Nagaraju; Kodali, Balaji; Datla, Peda Varma
2018-04-15
A highly selective and fast liquid chromatography-tandem mass spectrometry (LC-MS/MS) method was developed and validated for the simultaneous determination of tadalafil (TDL) and finasteride (FNS) in human plasma, and was successfully applied to the analysis of TDL and FNS samples in a clinical study. The method was validated as per USFDA (United States Food and Drug Administration), EMA (European Medicines Agency), and ANVISA (Agência Nacional de Vigilância Sanitária, Brazil) bioanalytical method validation guidelines. Glyburide (GLB) was used as a common internal standard (ISTD) for both analytes. The selected multiple reaction monitoring (MRM) transitions for mass spectrometric analysis were m/z 390.2/268.2, m/z 373.3/305.4 and m/z 494.2/369.1 for TDL, FNS and the ISTD, respectively. The extraction of the analytes and ISTD was accomplished by a simple solid phase extraction (SPE) procedure. Rapid analysis was achieved on a Zorbax Eclipse C18 column (50 × 4.6 mm, 5 μm). The calibration ranges for TDL and FNS were 5-800 ng/ml and 0.2-30 ng/ml, respectively. The linearity, recovery and matrix effect results of the method were acceptable; accuracy was in the range of 92.9%-106.4% and precision was good, with %CV less than 8.1%. Copyright © 2018 Elsevier B.V. All rights reserved.
Data-driven confounder selection via Markov and Bayesian networks.
Häggström, Jenny
2018-06-01
To estimate a causal effect on an outcome without bias, unconfoundedness is often assumed. If there is sufficient knowledge of the underlying causal structure, existing confounder selection criteria can be used to select subsets of the observed pretreatment covariates, X, sufficient for unconfoundedness, if such subsets exist. Here, estimation of these target subsets is considered when the underlying causal structure is unknown. The proposed method is to model the causal structure by a probabilistic graphical model, for example, a Markov or Bayesian network, estimate this graph from observed data and select the target subsets given the estimated graph. The approach is evaluated by simulation, both in a high-dimensional setting where unconfoundedness holds given X and in a setting where unconfoundedness only holds given subsets of X. Several common target subsets are investigated and the selected subsets are compared with respect to accuracy in estimating the average causal effect. The proposed method is implemented with existing software that can easily handle high-dimensional data, in terms of large samples and large numbers of covariates. The results from the simulation study show that, if unconfoundedness holds given X, this approach is very successful in selecting the target subsets, outperforming alternative approaches based on random forests and LASSO, and that the subset estimating the target subset containing all causes of the outcome yields the smallest MSE in average causal effect estimation. © 2017, The International Biometric Society.
Yu, Fang; Chen, Ming-Hui; Kuo, Lynn; Talbott, Heather; Davis, John S
2015-08-07
Recently, Bayesian methods have become more popular for analyzing high dimensional gene expression data, as they allow information to be borrowed across different genes and provide powerful estimators for evaluating gene expression levels. It is crucial to develop a simple but efficient gene selection algorithm for detecting differentially expressed (DE) genes based on the Bayesian estimators. In this paper, by extending the two-criterion idea of Chen et al. (Chen M-H, Ibrahim JG, Chi Y-Y. A new class of mixture models for differential gene expression in DNA microarray data. J Stat Plan Inference. 2008;138:387-404), we propose two new gene selection algorithms for general Bayesian models, which we name confident difference criterion methods. One is based on the standardized differences between two mean expression values among genes; the other adds the differences between two variances. The proposed confident difference criterion methods first evaluate the posterior probability that a gene has different expression levels between competitive samples and then declare a gene to be DE if the posterior probability is large. The theoretical connection between the first proposed method, based on the means, and the Bayes factor approach proposed by Yu et al. (Yu F, Chen M-H, Kuo L. Detecting differentially expressed genes using calibrated Bayes factors. Statistica Sinica. 2008;18:783-802) is established under the normal-normal model with equal variances between two samples. The empirical performance of the proposed methods is examined and compared to that of several existing methods via simulations. The results from these simulation studies show that the proposed confident difference criterion methods outperform the existing methods when comparing gene expressions across different conditions for both microarray studies and sequence-based high-throughput studies. A real dataset is used to further demonstrate the proposed methodology.
In the real data application, the confident difference criterion methods successfully identified more clinically important DE genes than the other methods. The confident difference criterion method proposed in this paper provides a new efficient approach for both microarray studies and sequence-based high-throughput studies to identify differentially expressed genes.
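A Monte Carlo sketch of the means-based confident difference rule described above: given posterior draws for the two mean expression levels and a scale, declare a gene DE when the posterior probability of a large standardized difference exceeds a cutoff. The threshold values and the sampling-based form are assumptions:

```python
import numpy as np

def confident_difference(mu1_draws, mu2_draws, sigma_draws, c=1.0, cutoff=0.95):
    """Means-based confident difference rule from posterior draws:
    estimate P(|mu1 - mu2| / sigma > c | data) by Monte Carlo and
    declare the gene DE when that probability exceeds the cutoff."""
    z = np.abs(np.asarray(mu1_draws) - np.asarray(mu2_draws)) / np.asarray(sigma_draws)
    prob = float((z > c).mean())
    return prob, prob > cutoff
```

The variance-based variant in the paper adds a second criterion on the difference between the two variances; the same Monte Carlo pattern applies.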
Tian, Xin; Xin, Mingyuan; Luo, Jian; Liu, Mingyao; Jiang, Zhenran
2017-02-01
The selection of relevant genes for breast cancer metastasis is critical for the treatment and prognosis of cancer patients. Although much effort has been devoted to gene selection procedures using different statistical analysis methods or computational techniques, the interpretation of the variables in the resulting survival models has been limited so far. This article proposes a new Random Forest (RF)-based algorithm to identify important variables highly related to breast cancer metastasis, based on the importance scores of two variable selection algorithms: the mean decrease Gini (MDG) criterion of Random Forest and the GeneRank algorithm with protein-protein interaction (PPI) information. The new gene selection algorithm is called PPIRF. The improved prediction accuracy illustrates the reliability and high interpretability of the gene list selected by the PPIRF approach.
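A sketch of the two ingredients combined in PPIRF: a GeneRank-style diffusion of expression evidence over the PPI network, and a blend with the Random Forest mean-decrease-Gini scores. The normalization and blending rule here are assumptions, not the paper's exact algorithm:

```python
import numpy as np

def generank(adjacency, expr_score, d=0.5, n_iter=200):
    """GeneRank-style diffusion: like PageRank, but the restart
    distribution is the per-gene expression evidence."""
    deg = adjacency.sum(axis=1)
    deg = np.where(deg == 0, 1.0, deg)
    W = adjacency / deg[:, None]            # row-normalized PPI network
    base = (1 - d) * expr_score / expr_score.sum()
    r = expr_score / expr_score.sum()
    for _ in range(n_iter):
        r = base + d * W.T @ r              # diffuse rank along edges
    return r

def ppirf_importance(mdg, gr, alpha=0.5):
    # blend min-max-normalized Gini importance with GeneRank scores
    # (the exact combination rule is an assumption)
    norm = lambda s: (s - s.min()) / (s.max() - s.min() + 1e-12)
    return alpha * norm(np.asarray(mdg)) + (1 - alpha) * norm(np.asarray(gr))
```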
Takada, Kenzo
2013-01-01
The current method of antibody production is mainly the hybridoma method, in which mice are immunized with an excess amount of antigen for a short period to promote activation and proliferation of B-lymphocytes producing the antibodies of interest. Because of the excess antigen, B-lymphocytes producing low-affinity antibodies are also activated. In contrast, human blood B-lymphocytes are activated through natural immune reactions, such as the reaction to infection. B-lymphocytes are stimulated repeatedly with a small amount of antigen, so only those producing high-affinity antibodies are activated. Consequently, lymphocytes producing high-affinity antibodies accumulate in human blood. Therefore, human lymphocytes are an excellent source of high-affinity antibodies. Evec, Inc. has established a unique method to produce high-affinity antibodies from human lymphocytes using Epstein-Barr virus (EBV), which induces the proliferation of B-lymphocytes. The method first induces the proliferation of B-lymphocytes from human blood using EBV, and then isolates those producing the antibodies of interest. The key features of the Evec technique are: 1) development of a lymphocyte library consisting of 150 donors' lymphocytes, from which donors suited to develop the antibodies of interest can be selected in 4 days; and 2) development of a sorting method and a cell microarray method for selecting lymphocyte clones producing the target antibodies. Licensing agreements have been concluded with European and Japanese pharmaceutical companies for two types of antibody. This paper describes Evec's antibody technology and experience in license negotiations with major pharmaceutical companies.
Method for preparing high temperature superconductor
Balachandran, Uthamalingam; Chudzik, Michael P.
2002-01-01
A method of depositing a biaxially textured metal oxide on a substrate defining a plane, in which metal oxide atoms are vaporized from a source to form a plume of metal oxide atoms. Atoms in the plume disposed at a selected angle in a predetermined range of angles to the plane of the substrate are allowed to contact the substrate, while atoms outside the selected angle are prevented from reaching the substrate. The preferred range of angles is 40°-70° and the preferred angle is 60°±5°. A moving substrate is disclosed.
Materials, methods and devices to detect and quantify water vapor concentrations in an atmosphere
Allendorf, Mark D; Robinson, Alex L
2014-12-09
We have demonstrated that a surface acoustic wave (SAW) sensor coated with a nanoporous framework material (NFM) film can perform ultrasensitive water vapor detection at concentrations in air from 0.05 to 12,000 ppmv at 1 atmosphere pressure. The method is extendable to other MEMS-based sensors, such as microcantilevers, or to quartz crystal microbalance sensors. We identify a specific NFM that provides high sensitivity and selectivity to water vapor. However, our approach is generalizable to detection of other species using NFM to provide sensitivity and selectivity.
2014-06-01
high-throughput method has utility for evaluating a diversity of natural materials with unknown complex odor blends that can then be down-selected for further...
Marateb, Hamid Reza; Mansourian, Marjan; Adibi, Peyman; Farina, Dario
2014-01-01
Background: Selecting the correct statistical test and data mining method depends highly on the measurement scale of the data, the type of variables, and the purpose of the analysis. Different measurement scales are studied in detail, and statistical comparison, modeling, and data mining methods are examined using several medical examples. We present two clustering examples with ordinal variables, a more challenging variable type to analyze, using the Wisconsin Breast Cancer Data (WBCD). Ordinal-to-interval scale conversion example: a breast cancer database of nine 10-level ordinal variables for 683 patients was analyzed by two ordinal-scale clustering methods. The performance of the clustering methods was assessed by comparison with the gold standard groups of malignant and benign cases that had been identified by clinical tests. Results: The sensitivity and accuracy of the two clustering methods were 98% and 96%, respectively. Their specificity was comparable. Conclusion: By using a clustering algorithm appropriate to the measurement scale of the variables in the study, high performance is attained. Moreover, descriptive and inferential statistics, as well as the modeling approach, must be selected based on the scale of the variables. PMID:24672565
Lunar-base construction equipment and methods evaluation
NASA Technical Reports Server (NTRS)
Boles, Walter W.; Ashley, David B.; Tucker, Richard L.
1993-01-01
A process for evaluating lunar-base construction equipment and methods concepts is presented. The process is driven by the need for more quantitative, systematic, and logical methods for assessing further research and development requirements in an area where uncertainties are high, dependence upon terrestrial heuristics is questionable, and quantitative methods are seldom applied. Decision theory concepts are used in determining the value of accurate information and the process is structured as a construction-equipment-and-methods selection methodology. Total construction-related, earth-launch mass is the measure of merit chosen for mathematical modeling purposes. The work is based upon the scope of the lunar base as described in the National Aeronautics and Space Administration's Office of Exploration's 'Exploration Studies Technical Report, FY 1989 Status'. Nine sets of conceptually designed construction equipment are selected as alternative concepts. It is concluded that the evaluation process is well suited for assisting in the establishment of research agendas in an approach that is first broad, with a low level of detail, followed by more-detailed investigations into areas that are identified as critical due to high degrees of uncertainty and sensitivity.
NASA Astrophysics Data System (ADS)
Ren, Guoyan; Li, Bafang; Zhao, Xue; Zhuang, Yongliang; Yan, Mingyan; Hou, Hu; Zhang, Xiukun; Chen, Li
2009-03-01
In order to select an optimum extraction method for the target glycoprotein (TGP) from jellyfish (Rhopilema esculentum) oral-arms, a high performance liquid chromatography (HPLC) assay for the determination of the TGP was developed. Purified target glycoprotein was taken as a standard glycoprotein. The results showed that the calibration curve for peak area plotted against TGP concentration was linear (r = 0.9984, y = 4.5895x + 47.601) over concentrations ranging from 50 to 400 mg L-1. The mean extraction recovery was 97.84% (CV 2.60%). The fractions containing TGP were isolated from jellyfish (R. esculentum) oral-arms by four extraction methods: 1) water extraction (WE), 2) phosphate buffer solution (PBS) extraction (PE), 3) ultrasound-assisted water extraction (UA-WE), and 4) ultrasound-assisted PBS extraction (UA-PE). The lyophilized extract was dissolved in Milli-Q water and analyzed directly on a short TSK-GEL G4000PWXL (7.8 mm × 300 mm) column. Our results indicated that the UA-PE method was the optimum extraction method as selected by HPLC.
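The reported calibration line can be inverted to turn a measured peak area into a TGP concentration. A minimal sketch using the published coefficients (the function name is hypothetical, and the inversion is valid only inside the reported linear range):

```python
def tgp_concentration(peak_area, slope=4.5895, intercept=47.601):
    # Invert the reported calibration line y = 4.5895*x + 47.601,
    # where y is peak area and x is TGP concentration in mg L-1.
    # Valid only inside the reported linear range (50-400 mg L-1).
    return (peak_area - intercept) / slope

area = 4.5895 * 200 + 47.601   # peak area of a 200 mg L-1 standard
conc = tgp_concentration(area)
print(round(conc, 3))  # 200.0
```

A measured value could additionally be corrected by the reported mean recovery (97.84%) if absolute amounts are required.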
Remote sensing image ship target detection method based on visual attention model
NASA Astrophysics Data System (ADS)
Sun, Yuejiao; Lei, Wuhu; Ren, Xiaodong
2017-11-01
Traditional methods of detecting ship targets in remote sensing images mostly use a sliding window to search the whole image exhaustively. However, the target usually occupies only a small fraction of the image, so this approach has high computational complexity for large-format visible image data. The bottom-up selective attention mechanism can selectively allocate computing resources according to visual stimuli, thus improving computational efficiency and reducing the difficulty of analysis. In view of this, a method of ship target detection in remote sensing images based on a visual attention model is proposed in this paper. The experimental results show that the proposed method reduces computational complexity while improving detection accuracy, and improves the detection efficiency of ship targets in remote sensing images.
Perceptions of the Community College of High School Counselors and Advisors
ERIC Educational Resources Information Center
Mitkos, Yvonne M.; Bragg, Debra D.
2008-01-01
Using the case study method, this research investigated how the community college is perceived by high school counselors and advisors. The research considered how high school counselors' and advisors' perceptions of the community college were informed by selected school leaders, faculty, and students, and it explored how those perceptions were…
Using Ensemble Decisions and Active Selection to Improve Low-Cost Labeling for Multi-View Data
NASA Technical Reports Server (NTRS)
Rebbapragada, Umaa; Wagstaff, Kiri L.
2011-01-01
This paper seeks to improve low-cost labeling in terms of training set reliability (the fraction of correctly labeled training items) and test set performance for multi-view learning methods. Co-training is a popular multiview learning method that combines high-confidence example selection with low-cost (self) labeling. However, co-training with certain base learning algorithms significantly reduces training set reliability, causing an associated drop in prediction accuracy. We propose the use of ensemble labeling to improve reliability in such cases. We also discuss and show promising results on combining low-cost ensemble labeling with active (low-confidence) example selection. We unify these example selection and labeling strategies under collaborative learning, a family of techniques for multi-view learning that we are developing for distributed, sensor-network environments.
Catalyst and method for production of methylamines
Klier, Kamil; Herman, Richard G.; Vedage, Gamini A.
1987-01-01
This invention relates to an improved catalyst and method for the selective production of methylamines. More particularly, it is concerned with the preparation of stable highly active catalysts for producing methylamines by a catalytic reaction of ammonia or substituted amines and binary synthesis gas (CO+H.sub.2).
Wilkins, Ruth; Flegal, Farrah; Knoll, Joan H.M.; Rogan, Peter K.
2017-01-01
Accurate digital image analysis of abnormal microscopic structures relies on high quality images and on minimizing the rates of false positive (FP) and negative objects in images. Cytogenetic biodosimetry detects dicentric chromosomes (DCs) that arise from exposure to ionizing radiation, and determines radiation dose received based on DC frequency. Improvements in automated DC recognition increase the accuracy of dose estimates by reclassifying FP DCs as monocentric chromosomes or chromosome fragments. We also present image segmentation methods to rank high quality digital metaphase images and eliminate suboptimal metaphase cells. A set of chromosome morphology segmentation methods selectively filtered out FP DCs arising primarily from sister chromatid separation, chromosome fragmentation, and cellular debris. This reduced FPs by an average of 55% and was highly specific to these abnormal structures (≥97.7%) in three samples. Additional filters selectively removed images with incomplete, highly overlapped, or missing metaphase cells, or with poor overall chromosome morphologies that increased FP rates. Image selection is optimized and FP DCs are minimized by combining multiple feature based segmentation filters and a novel image sorting procedure based on the known distribution of chromosome lengths. Applying the same image segmentation filtering procedures to both calibration and test samples reduced the average dose estimation error from 0.4 Gy to <0.2 Gy, obviating the need to first manually review these images. This reliable and scalable solution enables batch processing for multiple samples of unknown dose, and meets current requirements for triage radiation biodosimetry of high quality metaphase cell preparations. PMID:29026522
Online feature selection with streaming features.
Wu, Xindong; Yu, Kui; Ding, Wei; Wang, Hao; Zhu, Xingquan
2013-05-01
We propose a new online feature selection framework for applications with streaming features where the knowledge of the full feature space is unknown in advance. We define streaming features as features that flow in one by one over time whereas the number of training examples remains fixed. This is in contrast with traditional online learning methods that only deal with sequentially added observations, with little attention being paid to streaming features. The critical challenges for Online Streaming Feature Selection (OSFS) include 1) the continuous growth of feature volumes over time, 2) a large feature space, possibly of unknown or infinite size, and 3) the unavailability of the entire feature set before learning starts. In the paper, we present a novel Online Streaming Feature Selection method to select strongly relevant and nonredundant features on the fly. An efficient Fast-OSFS algorithm is proposed to improve feature selection performance. The proposed algorithms are evaluated extensively on high-dimensional datasets and also with a real-world case study on impact crater detection. Experimental results demonstrate that the algorithms achieve better compactness and higher prediction accuracy than existing streaming feature selection algorithms.
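The arrival-time logic of streaming feature selection can be sketched in a few lines. Note this is a simplified stand-in: the actual OSFS algorithm uses conditional independence tests for relevance and redundancy, whereas the sketch below substitutes plain correlation thresholds, and both thresholds are illustrative assumptions.

```python
import numpy as np

def stream_select(features, y, rel_thresh=0.3, red_thresh=0.9):
    """Simplified stand-in for online streaming feature selection: as each
    feature arrives, keep it if it correlates with the target (relevance)
    and is not nearly collinear with an already-kept feature (redundancy).
    The real OSFS uses conditional independence tests instead."""
    selected = []
    for idx, f in enumerate(features):
        if abs(np.corrcoef(f, y)[0, 1]) < rel_thresh:
            continue  # weakly relevant: discard on arrival
        if any(abs(np.corrcoef(f, features[j])[0, 1]) > red_thresh
               for j in selected):
            continue  # redundant with a feature kept earlier
        selected.append(idx)
    return selected

rng = np.random.default_rng(1)
n = 200
x0 = rng.normal(size=n)
y = x0 + 0.1 * rng.normal(size=n)
feats = [x0,                              # relevant, kept
         x0 + 0.01 * rng.normal(size=n),  # near-duplicate, dropped as redundant
         rng.normal(size=n)]              # noise, dropped as irrelevant
print(stream_select(feats, y))  # [0]
```

The one-pass structure is the point: each feature is judged on arrival, without ever materializing the full feature space.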
Selectivity Mechanism of ATP-Competitive Inhibitors for PKB and PKA.
Wu, Ke; Pang, Jingzhi; Song, Dong; Zhu, Ying; Wu, Congwen; Shao, Tianqu; Chen, Haifeng
2015-07-01
Protein kinase B (PKB) acts as a central node on the PI3K kinase pathway. Constitutive activation and overexpression of PKB have been implicated in various cancers. However, protein kinase A (PKA), which shares high homology with PKB, is essential for metabolic regulation. Therefore, specifically targeting PKB is a crucial strategy in antitumor drug design and development. Here, we revealed the selectivity mechanism of PKB inhibitors using molecular dynamics simulation and 3D-QSAR methods. Selective inhibitors of PKB could form more hydrogen bonds and hydrophobic contacts with PKB than with PKA. This explains why the selective inhibitor M128 is more potent against PKB than against PKA. Then, 3D-QSAR models were constructed for these selective inhibitors and evaluated with test set compounds. Comparison of the 3D-QSAR models for PKB inhibitors and PKA inhibitors reveals possible ways to improve the selectivity of inhibitors. These models can be used to design new chemical entities and make quantitative predictions of specific selective inhibitors before resorting to in vitro and in vivo experiments. © 2014 John Wiley & Sons A/S.
Entropy-Based Search Algorithm for Experimental Design
NASA Astrophysics Data System (ADS)
Malakar, N. K.; Knuth, K. H.
2011-03-01
The scientific method relies on the iterated processes of inference and inquiry. The inference phase consists of selecting the most probable models based on the available data, whereas the inquiry phase consists of using what is known about the models to select the most relevant experiment. Optimizing inquiry involves searching the parameterized space of experiments to select the experiment that promises, on average, to be maximally informative. In the case where it is important to learn about each of the model parameters, the relevance of an experiment is quantified by the Shannon entropy of the distribution of experimental outcomes predicted by a probable set of models. If the set of potential experiments is described by many parameters, we must search this high-dimensional entropy space. Brute force search methods will be slow and computationally expensive. We present an entropy-based search algorithm, called nested entropy sampling, to select the most informative experiment for efficient experimental design. This algorithm is inspired by Skilling's nested sampling algorithm used in inference and borrows the concept of a rising threshold while a set of experiment samples is maintained. We demonstrate that this algorithm not only selects highly relevant experiments but also is more efficient than brute force search. Such entropic search techniques promise to greatly benefit autonomous experimental design.
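The entropy criterion described above can be sketched directly: score each candidate experiment by the Shannon entropy of the outcome distribution averaged over a set of probable models, and pick the arg-max. This illustrates only the scoring step, not the nested-sampling search itself; the toy models and function names are assumptions.

```python
import numpy as np

def shannon_entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def most_informative(experiments, models, predict):
    """Score each candidate experiment by the Shannon entropy of its predicted
    outcome distribution under equally weighted probable models; return the
    arg-max. The nested-sampling search of the paper is not reproduced here."""
    best, best_h = None, -1.0
    for e in experiments:
        # Mixture of each model's predicted outcome distribution.
        p = np.mean([predict(m, e) for m in models], axis=0)
        h = shannon_entropy(np.asarray(p))
        if h > best_h:
            best, best_h = e, h
    return best, best_h

# Toy setup: two models agree at experiment 0 but disagree at experiment 1.
def predict(model, x):
    return model[x]  # model maps experiment -> binary outcome distribution

m1 = {0: np.array([0.9, 0.1]), 1: np.array([0.95, 0.05])}
m2 = {0: np.array([0.9, 0.1]), 1: np.array([0.05, 0.95])}
exp, h = most_informative([0, 1], [m1, m2], predict)
print(exp)  # 1 -- the experiment where the models disagree is most informative
```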
Temporal variation and scale in movement-based resource selection functions
Hooten, M.B.; Hanks, E.M.; Johnson, D.S.; Alldredge, M.W.
2013-01-01
A common population characteristic of interest in animal ecology studies pertains to the selection of resources. That is, given the resources available to animals, what do they ultimately choose to use? A variety of statistical approaches have been employed to examine this question and each has advantages and disadvantages with respect to the form of available data and the properties of estimators given model assumptions. A wealth of high resolution telemetry data are now being collected to study animal population movement and space use and these data present both challenges and opportunities for statistical inference. We summarize traditional methods for resource selection and then describe several extensions to deal with measurement uncertainty and an explicit movement process that exists in studies involving high-resolution telemetry data. Our approach uses a correlated random walk movement model to obtain temporally varying use and availability distributions that are employed in a weighted distribution context to estimate selection coefficients. The temporally varying coefficients are then weighted by their contribution to selection and combined to provide inference at the population level. The result is an intuitive and accessible statistical procedure that uses readily available software and is computationally feasible for large datasets. These methods are demonstrated using data collected as part of a large-scale mountain lion monitoring study in Colorado, USA.
González-Techera, A.; Umpiérrez-Failache, M.; Cardozo, S.; Obal, G.; Pritsch, O.; Last, J. A.; Gee, S. J.; Hammock, B. D.; González-Sapienza, G.
2010-01-01
The use of phage display peptide libraries allows rapid isolation of peptide ligands for any target selector molecule. However, due to differences in peptide expression and the heterogeneity of the phage preparations, there is no easy way to compare the binding properties of the selected clones, which acts as a major “bottleneck” of the technology. Here, we present the development of a new type of library that allows rapid comparison of the relative affinity of the selected peptides in a high-throughput screening format. As a model system, a phage display peptide library constructed on a phagemid vector that contains the bacterial alkaline phosphatase gene (BAP) was selected with an antiherbicide antibody. Due to the intrinsic switching capacity of the library, the selected peptides were transferred “en masse” from the phage coat protein to BAP. This was coupled to an optimized affinity ELISA in which normalized amounts of the peptide–BAP fusion allow direct comparison of the binding properties of hundreds of peptide ligands. The system was validated by surface plasmon resonance experiments using synthetic peptides, showing that the method discriminates among the affinities of the peptides within 3 orders of magnitude. In addition, the peptide–BAP protein can find direct application as a tracer reagent. PMID:18393454
Targeted Proteomic Quantification on Quadrupole-Orbitrap Mass Spectrometer*
Gallien, Sebastien; Duriez, Elodie; Crone, Catharina; Kellmann, Markus; Moehring, Thomas; Domon, Bruno
2012-01-01
There is an immediate need for improved methods to systematically and precisely quantify large sets of peptides in complex biological samples. To date, protein quantification in biological samples has been routinely performed on triple quadrupole instruments operated in selected reaction monitoring mode (SRM), and two major challenges remain. Firstly, the number of peptides to be included in one survey experiment needs to be increased to routinely reach several hundred, and secondly, the degree of selectivity should be improved so as to reliably discriminate the targeted analytes from background interferences. High resolution and accurate mass (HR/AM) analysis on the recently developed Q-Exactive mass spectrometer can potentially address these issues. This instrument presents a unique configuration: it consists of an orbitrap mass analyzer equipped with a quadrupole mass filter as the front-end for precursor ion mass selection. This configuration enables new quantitative methods based on HR/AM measurements, including targeted analysis in MS mode (single ion monitoring) and in MS/MS mode (parallel reaction monitoring). The ability of the quadrupole to select a restricted m/z range allows one to overcome the dynamic range limitations associated with trapping devices, and the MS/MS mode provides an additional stage of selectivity. When applied to targeted protein quantification in urine samples and benchmarked with the reference SRM technique, the quadrupole-orbitrap instrument exhibits similar or better performance in terms of selectivity, dynamic range, and sensitivity. This high performance is further enhanced by leveraging the multiplexing capability of the instrument to design novel acquisition methods and apply them to large targeted proteomic studies for the first time, as demonstrated on 770 tryptic yeast peptides analyzed in one 60-min experiment.
The increased quality of quadrupole-orbitrap data has the potential to improve existing protein quantification methods in complex samples and address the pressing demand of systems biology or biomarker evaluation studies. PMID:22962056
Two-stage atlas subset selection in multi-atlas based image segmentation.
Zhao, Tingting; Ruan, Dan
2015-06-01
Fast growing access to large databases and cloud stored data presents a unique opportunity for multi-atlas based image segmentation, and also presents challenges in heterogeneous atlas quality and computation burden. This work aims to develop a novel two-stage method tailored to the special needs that arise with a large atlas collection of varied quality, so that high-accuracy segmentation can be achieved at low computational cost. An atlas subset selection scheme is proposed to substitute a significant portion of the computationally expensive full-fledged registration in the conventional scheme with a low-cost alternative. More specifically, the authors introduce a two-stage atlas subset selection method. In the first stage, an augmented subset is obtained based on a low-cost registration configuration and a preliminary relevance metric; in the second stage, the subset is further narrowed down to a fusion set of desired size, based on full-fledged registration and a refined relevance metric. An inference model is developed to characterize the relationship between the preliminary and refined relevance metrics, and a proper augmented subset size is derived to ensure that the desired atlases survive the preliminary selection with high probability. The performance of the proposed scheme has been assessed with cross validation based on two clinical datasets consisting of manually segmented prostate and brain magnetic resonance images, respectively. The proposed scheme demonstrates end-to-end segmentation performance comparable to the conventional single-stage selection method, but with significant computation reduction. Compared with the alternative computation reduction method, the proposed scheme improves the mean and median Dice similarity coefficient values from (0.74, 0.78) to (0.83, 0.85) and from (0.82, 0.84) to (0.95, 0.95) for prostate and corpus callosum segmentation, respectively, with statistical significance.
The authors have developed a novel two-stage atlas subset selection scheme for multi-atlas based segmentation. It achieves good segmentation accuracy with significantly reduced computation cost, making it a suitable configuration in the presence of extensive heterogeneous atlases.
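The two-stage structure can be sketched with synthetic similarity scores standing in for registration quality. This is a hedged toy model: the cheap surrogate, the noise level, and the subset sizes are assumptions, and the paper's actual relevance metrics come from registration, not random numbers.

```python
import numpy as np

def two_stage_select(cheap_sim, expensive_sim_fn, augment_size, fusion_size):
    """Sketch of two-stage atlas subset selection: a low-cost similarity
    metric prunes the full atlas pool to an augmented subset, and the
    expensive metric (standing in for full-fledged registration) is computed
    only on that subset to pick the final fusion set."""
    order = np.argsort(cheap_sim)[::-1]
    augmented = order[:augment_size]                       # stage 1: cheap ranking
    refined = {i: expensive_sim_fn(i) for i in augmented}  # stage 2: costly metric
    fusion = sorted(refined, key=refined.get, reverse=True)[:fusion_size]
    return list(augmented), fusion

rng = np.random.default_rng(2)
true_sim = rng.random(50)                        # "true" atlas relevance
cheap = true_sim + 0.05 * rng.normal(size=50)    # noisy low-cost surrogate
aug, fusion = two_stage_select(cheap, lambda i: true_sim[i], 15, 5)
# Only 15 expensive evaluations instead of 50; because the surrogate is
# informative, the best atlases survive stage 1 with high probability.
print(len(fusion))  # 5
```

The augmented-subset size plays the role the paper derives from its inference model: it is chosen large enough that the truly relevant atlases survive the cheap preliminary ranking.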
Oriented modulation for watermarking in direct binary search halftone images.
Guo, Jing-Ming; Su, Chang-Cheng; Liu, Yun-Fu; Lee, Hua; Lee, Jiann-Der
2012-09-01
In this paper, a halftoning-based watermarking method is presented. This method enables high pixel-depth watermark embedding while maintaining high image quality. The technique is capable of embedding watermarks with pixel depths up to 3 bits without causing prominent degradation to the image quality. To achieve high image quality, the parallel oriented high-efficiency direct binary search (DBS) halftoning is selected to be integrated with the proposed orientation modulation (OM) method. The OM method utilizes different halftone texture orientations to carry different watermark data. In the decoder, least-mean-square-trained filters are applied for feature extraction from watermarked images in the frequency domain, and the naïve Bayes classifier is used to analyze the extracted features and ultimately to decode the watermark data. Experimental results show that the DBS-based OM encoding method maintains a high degree of image quality and achieves the processing efficiency and robustness required for printing applications.
Xue, Hongqi; Wu, Shuang; Wu, Yichao; Ramirez Idarraga, Juan C; Wu, Hulin
2018-05-02
Mechanism-driven low-dimensional ordinary differential equation (ODE) models are often used to model viral dynamics at cellular levels and epidemics of infectious diseases. However, low-dimensional mechanism-based ODE models are limited for modeling infectious diseases at molecular levels such as transcriptomic or proteomic levels, which is critical to understand pathogenesis of diseases. Although linear ODE models have been proposed for gene regulatory networks (GRNs), nonlinear regulations are common in GRNs. The reconstruction of large-scale nonlinear networks from time-course gene expression data remains an unresolved issue. Here, we use high-dimensional nonlinear additive ODEs to model GRNs and propose a 4-step procedure to efficiently perform variable selection for nonlinear ODEs. To tackle the challenge of high dimensionality, we couple the 2-stage smoothing-based estimation method for ODEs and a nonlinear independence screening method to perform variable selection for the nonlinear ODE models. We have shown that our method possesses the sure screening property and it can handle problems with non-polynomial dimensionality. Numerical performance of the proposed method is illustrated with simulated data and a real data example for identifying the dynamic GRN of Saccharomyces cerevisiae. Copyright © 2018 John Wiley & Sons, Ltd.
An enzyme-mediated protein-fragment complementation assay for substrate screening of sortase A.
Li, Ning; Yu, Zheng; Ji, Qun; Sun, Jingying; Liu, Xiao; Du, Mingjuan; Zhang, Wei
2017-04-29
Enzyme-mediated protein conjugation has gained great attention recently due to the remarkable site-selectivity and mild reaction conditions afforded by the nature of the enzyme. Among the various enzymes reported, sortase A from Staphylococcus aureus (SaSrtA) is the most popular due to its selectivity and well-demonstrated applications. Position scanning has been widely applied to understand enzyme substrate specificity, but the low throughput of chemical synthesis of peptide substrates and of analytical methods (HPLC, LC-ESI-MS) has been the major hurdle to fully decoding enzyme substrate profiles. We have developed a simple high-throughput substrate profiling method to reveal novel substrates of SaSrtA 7M, a widely used hyperactive peptide ligase, using a modified protein-fragment complementation assay (PCA). A small library targeting the LPATG motif recognized by SaSrtA 7M was generated and screened against proteins carrying an N-terminal glycine. Using this method, we have confirmed all currently known substrates of the enzyme and, moreover, identified some previously unknown substrates with varying activities. The method provides an easy, fast and highly sensitive way to determine the substrate profile of a peptide ligase in a high-throughput manner. Copyright © 2017 Elsevier Inc. All rights reserved.
Hu, Bo; Zhao, Yang; Zhu, Hai-Zhou; Yu, Shu-Hong
2011-04-26
Thiol-containing biomolecules show strong affinity for noble metal nanostructures and can not only stably protect them but also control the self-assembly process of these special nanostructures. A highly selective and sensitive chromogenic detection method has been designed for low and high molecular weight thiol-containing biomolecules, including cysteine, glutathione, dithiothreitol, and bovine serum albumin, using a new type of carbonaceous nanosphere loaded with silver nanoparticles (Ag NPs) as the carrier. This strategy relies upon the place-exchange process between the reporter dyes on the surface of the Ag NPs and the thiol groups of thiol-containing biomolecules. The concentration of biomolecules can be determined by monitoring the fluorescence intensity of the reporter dyes dispersed in solution. This new chromogenic assay method can selectively detect these biomolecules in the presence of various other amino acids and monosaccharides, and can sensitively detect thiol-containing biomolecules of different molecular weights, including proteins.
NASA Astrophysics Data System (ADS)
Lestari, A. W.; Rustam, Z.
2017-07-01
In the last decade, breast cancer has become a focus of worldwide attention, as this disease is one of the leading causes of death among women. Therefore, correct precautions and treatment are necessary. In previous studies, the Fuzzy Kernel K-Medoid algorithm has been used for multi-class data. This paper proposes an algorithm to classify high dimensional breast cancer data using Fuzzy Possibilistic C-Means (FPCM) and a new method based on clustering analysis using Normed Kernel Function-Based Fuzzy Possibilistic C-Means (NKFPCM). The objective of this paper is to obtain the best accuracy in the classification of breast cancer data. In order to improve the accuracy of the two methods, candidate features are evaluated using feature selection, where the Laplacian Score is used. The results compare the accuracy and running time of FPCM and NKFPCM with and without feature selection.
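The Laplacian Score used here for feature selection can be sketched in one common formulation: build a k-nearest-neighbor graph with heat-kernel weights, and score each feature by how well it preserves that local structure (lower is better). The choices of t, k, and the toy two-cluster data below are assumptions for illustration.

```python
import numpy as np

def laplacian_score(X, t=1.0, k=5):
    """Laplacian Score for unsupervised feature ranking: features that best
    preserve local manifold structure get LOWER scores. Heat-kernel weights
    on a k-NN graph; t and k are tuning choices."""
    n, d = X.shape
    # Pairwise squared distances and k-NN heat-kernel affinity matrix.
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    S = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(sq[i])[1:k + 1]     # skip self at index 0
        S[i, nbrs] = np.exp(-sq[i, nbrs] / t)
    S = np.maximum(S, S.T)                    # symmetrize the graph
    D = np.diag(S.sum(1))
    L = D - S
    ones = np.ones(n)
    scores = np.empty(d)
    for r in range(d):
        f = X[:, r]
        # Center f by its degree-weighted mean, then take the Rayleigh-style ratio.
        f_t = f - (f @ D @ ones) / (ones @ D @ ones) * ones
        scores[r] = (f_t @ L @ f_t) / (f_t @ D @ f_t)
    return scores

rng = np.random.default_rng(3)
# Two clusters separated along feature 0; feature 1 is pure noise.
X = np.vstack([rng.normal([0, 0], [0.1, 1.0], (30, 2)),
               rng.normal([10, 0], [0.1, 1.0], (30, 2))])
s = laplacian_score(X)
print(s[0] < s[1])  # True: the cluster-separating feature scores lower
```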
Selecting AGN through Variability in SN Datasets
NASA Astrophysics Data System (ADS)
Boutsia, K.; Leibundgut, B.; Trevese, D.; Vagnetti, F.
2010-07-01
Variability is a main property of Active Galactic Nuclei (AGN), and it was adopted as a selection criterion using multi-epoch surveys conducted for the detection of supernovae (SNe). We have used two SN datasets. First, we selected the AXAF field of the STRESS project, centered on the Chandra Deep Field South where, besides the deep X-ray surveys, various optical catalogs also exist. Our method yielded 132 variable AGN candidates. We then extended our method to include the dataset of the ESSENCE project, which has been active for 6 years, producing high quality light curves in the R and I bands. We obtained a sample of ~4800 variable sources, down to R=22, in the whole 12 deg² ESSENCE field. Among them, a subsample of ~500 high priority AGN candidates was created using the shape of the structure function as a secondary criterion. In a pilot spectroscopic run we have confirmed the AGN nature of nearly all of our candidates.
Lorenz, Daniel A; Song, James M; Garner, Amanda L
2015-01-21
MicroRNAs (miRNA) play critical roles in human development and disease. As such, the targeting of miRNAs is considered attractive as a novel therapeutic strategy. A major bottleneck toward this goal, however, has been the identification of small molecule probes that are specific for select RNAs and methods that will facilitate such discovery efforts. Using pre-microRNAs as proof-of-concept, herein we report a conceptually new and innovative approach for assaying RNA-small molecule interactions. Through this platform assay technology, which we term catalytic enzyme-linked click chemistry assay or cat-ELCCA, we have designed a method that can be implemented in high throughput, is virtually free of false readouts, and is general for all nucleic acids. Through cat-ELCCA, we envision the discovery of selective small molecule ligands for disease-relevant miRNAs to promote the field of RNA-targeted drug discovery and further our understanding of the role of miRNAs in cellular biology.
Priority arbitration mechanism
Garmire, Derrick L [Kingston, NY; Herring, Jay R [Poughkeepsie, NY; Stunkel, Craig B [Bethel, CT
2007-03-06
A method is provided for selecting a data source for transmission on one of several logical (virtual) lanes embodied in a single physical connection. Lanes are assigned to either a high priority class or a low priority class. One of six conditions is employed to determine when re-arbitration of lane priorities is desired. When this occurs, a next source for transmission is selected based on the specification of the maximum number of high priority packets that can be sent after a lower priority transmission has been interrupted. Alternatively, a next source for transmission is selected based on the specification of the maximum number of high priority packets that can be sent while a lower priority packet is waiting. If initialized correctly, the arbiter keeps all of the packets of a high priority transmission contiguous, while allowing lower priority packets to be interrupted by the higher priority packets, but not to the point of starvation of the lower priority packets.
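The second rule above (cap the number of high priority packets sent while a lower priority packet waits) can be sketched as a toy arbiter. This is an illustrative reading of the abstract, not the patented design; all names are hypothetical.

```python
from collections import deque

class PriorityArbiter:
    """Toy sketch of the anti-starvation rule: at most `cap` high-priority
    packets may be sent while a lower-priority packet is waiting."""

    def __init__(self, max_high_while_low_waits):
        self.cap = max_high_while_low_waits
        self.high = deque()
        self.low = deque()
        self.high_sent = 0   # high packets sent since a low packet began waiting

    def select(self):
        # Serve high priority unless the cap is reached with a low packet waiting.
        if self.high and (not self.low or self.high_sent < self.cap):
            self.high_sent += 1
            return self.high.popleft()
        if self.low:
            self.high_sent = 0   # low packet served; reset the counter
            return self.low.popleft()
        return None
```

With a cap of 2, queued packets h1..h4 and l1 are served as h1, h2, l1, h3, h4: the low priority packet is delayed but never starved.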
NASA Astrophysics Data System (ADS)
Viboonratanasri, Duangkamon; Pabchanda, Suwat; Prompinit, Panida
2018-05-01
In this study, a simple, rapid and relatively less toxic method based on rhodamine 6G dye adsorption on hydrogen-form Y-type zeolite was demonstrated for highly selective nitrite detection. The adsorption behavior was described by a Langmuir isotherm, and the adsorption process reached equilibrium within a minute. The developed test papers, characterized by a fluorescence technique, display high sensing performance with a wide working range (0.04-20.0 mg L⁻¹) and high selectivity. The test papers show good reproducibility, with a relative standard deviation (RSD) of 7% for five replicate determinations of 3 mg L⁻¹ nitrite. The nitrite concentration determined using the test paper agreed with ion chromatography within a 95% confidence level. The test papers offer advantages in terms of low cost and practical usage, making them a promising candidate for nitrite sensing in environmental samples, food, and fertilizers.
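The Langmuir description mentioned above can be checked with a simple linearized fit: rearranging q = q_max·K·C/(1 + K·C) gives C/q = C/q_max + 1/(K·q_max), a straight line in C. The sketch below is a generic illustration with synthetic data, not the authors' analysis.

```python
import numpy as np

def fit_langmuir(C, q):
    """Linearized Langmuir fit: regress C/q on C; the slope gives 1/q_max
    and the intercept gives 1/(K * q_max)."""
    slope, intercept = np.polyfit(C, C / q, 1)
    q_max = 1.0 / slope
    K = slope / intercept
    return q_max, K

# Hypothetical equilibrium data: concentration C (mg/L) vs. uptake q (mg/g)
C = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])
q = 25.0 * 0.8 * C / (1.0 + 0.8 * C)   # synthetic data with q_max=25, K=0.8
q_max, K = fit_langmuir(C, q)          # recovers q_max ≈ 25, K ≈ 0.8
```

In practice one would fit measured (C, q) pairs and judge the Langmuir description by the linearity of C/q versus C.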
NASA Astrophysics Data System (ADS)
Spaans, K.; Hooper, A. J.
2017-12-01
The short revisit time and high data acquisition rates of current satellites have resulted in increased interest in the development of deformation monitoring and rapid disaster response capability, using InSAR. Fast, efficient data processing methodologies are required to deliver the timely results necessary for this, and also to limit computing resources required to process the large quantities of data being acquired. Contrary to volcano or earthquake applications, urban monitoring requires high resolution processing, in order to differentiate movements between buildings, or between buildings and the surrounding land. Here we present Rapid time series InSAR (RapidSAR), a method that can efficiently update high resolution time series of interferograms, and demonstrate its effectiveness over urban areas. The RapidSAR method estimates the coherence of pixels on an interferogram-by-interferogram basis. This allows for rapid ingestion of newly acquired images without the need to reprocess the earlier acquired part of the time series. The coherence estimate is based on ensembles of neighbouring pixels with similar amplitude behaviour through time, which are identified on an initial set of interferograms, and need be re-evaluated only occasionally. By taking into account scattering properties of points during coherence estimation, a high quality coherence estimate is achieved, allowing point selection at full resolution. The individual point selection maximizes the amount of information that can be extracted from each interferogram, as no selection compromise has to be reached between high and low coherence interferograms. In other words, points do not have to be coherent throughout the time series to contribute to the deformation time series. We demonstrate the effectiveness of our method over urban areas in the UK. 
We show how the algorithm successfully extracts high density time series from full resolution Sentinel-1 interferograms, distinguishing clearly between buildings and surrounding vegetation or streets. The fact that new interferograms can be processed separately from the remainder of the time series helps manage the high data volumes, both in space and time, generated by current missions.
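The ensemble coherence estimate at the heart of the point selection can be sketched as follows: for a group of pixels with similar amplitude behaviour, the sample coherence between two co-registered complex (SLC) images is the magnitude of the normalized coherent sum. A minimal generic illustration, not the RapidSAR code.

```python
import numpy as np

def ensemble_coherence(s1, s2):
    """Sample coherence magnitude over an ensemble of statistically
    similar pixels drawn from two co-registered complex SAR images."""
    num = np.abs(np.sum(s1 * np.conj(s2)))
    den = np.sqrt(np.sum(np.abs(s1) ** 2) * np.sum(np.abs(s2) ** 2))
    return num / den
```

Identical ensembles give a coherence of 1, while ensembles whose interferometric phase is random give values near 0; thresholding this estimate per interferogram is what allows points to contribute even when they are not coherent throughout the whole time series.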
Gan, Haijiao; Xu, Hui
2018-05-30
In this work, an innovative magnetic aptamer adsorbent (Fe₃O₄-aptamer MNPs) was synthesized for the selective extraction of 8-hydroxy-2'-deoxyguanosine (8-OHdG). Amino-functionalized Fe₃O₄ was crosslinked with the 8-OHdG aptamer by glutaraldehyde and packed into a stainless steel tube as the sorbent for magnetic solid phase extraction (MSPE). After selective extraction by the aptamer adsorbent, the adsorbed 8-OHdG was desorbed dynamically and analyzed online by high performance liquid chromatography-mass spectrometry (HPLC-MS). The synthesized sorbent presented outstanding features, including specific selectivity, high enrichment capacity, stability and biocompatibility. Moreover, the proposed MSPE-HPLC-MS method integrates the adsorption and desorption operations, greatly simplifying the analysis process and reducing human error. Compared with offline MSPE, a sensitivity enhancement of 800 times was obtained for the online method. Experimental parameters such as the amount of sorbent, sample flow rate and sample volume were optimized systematically. Under the optimal conditions, a low limit of detection (0.01 ng mL⁻¹, S/N = 3), limit of quantification (0.03 ng mL⁻¹, S/N = 10) and wide linear range with a satisfactory correlation coefficient (R² ≥ 0.9992) were obtained. The recoveries of 8-OHdG in urine samples varied from 82% to 116%. All these results revealed that the method is simple, rapid, selective, sensitive and automated, and it could become a practical approach for the selective determination of trace 8-OHdG in complex urinary samples. Copyright © 2017 Elsevier B.V. All rights reserved.
Lucci, Paolo; Moret, Sabrina; Bettin, Sara; Conte, Lanfranco
2017-01-01
The aim of this work was to evaluate the use of a molecularly imprinted polymer as a selective solid-phase extraction sorbent for the clean-up and pre-concentration of patulin from apple-based food products. Ultra high pressure liquid chromatography coupled to ultraviolet absorbance detection was used for the analysis of patulin. The molecularly imprinted polymer was applied, for the first time, to the determination of patulin in apple juice, puree and jam samples spiked within the maximum levels specified by Commission Regulation No. 1881/2006. High recoveries (>77%) were obtained. The method was validated and found to be linear in the range 2-100 μg/kg with correlation coefficients greater than 0.965 and repeatability relative standard deviation below 11% in all cases. Compared with dispersive solid-phase extraction (QuEChERS method) and octadecyl sorbent, the molecularly imprinted polymer showed higher recoveries and selectivity for patulin. The application of the Affinisep molecularly imprinted polymer as a selective sorbent material for detection of patulin fulfilled the method performance criteria required by Commission Regulation No. 401/2006, demonstrating the suitability of the technique for the control of patulin at low ppb levels in different apple-based foods such as juice, puree and jam samples. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bastiaens, L.; Springael, D.; Wattiau, P.
Two different procedures were compared to isolate polycyclic aromatic hydrocarbon (PAH)-utilizing bacteria from PAH-contaminated soil and sludge samples, i.e., (i) shaken enrichment cultures in liquid mineral medium in which PAHs were supplied as crystals and (ii) a new method in which PAH degraders were enriched on and recovered from hydrophobic membranes containing sorbed PAHs. Both techniques were successful, but selected from the same source different bacterial strains able to grow on PAHs as the sole source of carbon and energy. The liquid enrichment mainly selected for Sphingomonas spp., whereas the membrane method exclusively led to the selection of Mycobacterium spp. Furthermore, in separate membrane enrichment set-ups with different membrane types, three repetitive extragenic palindromic PCR-related Mycobacterium strains were recovered. The new Mycobacterium isolates were strongly hydrophobic and displayed the capacity to adhere strongly to different surfaces. One strain, Mycobacterium sp. LB501T, displayed an unusual combination of high adhesion efficiency and an extremely high negative charge. This strain may represent a new bacterial species as suggested by 16S rRNA gene sequence analysis. These results indicate that the provision of hydrophobic sorbents containing sorbed PAHs in the enrichment procedure discriminated in favor of certain bacterial characteristics. The new isolation method is appropriate to select for adherent PAH-degrading bacteria, which might be useful to biodegrade sorbed PAHs in soils and sludge.
Bastiaens, Leen; Springael, Dirk; Wattiau, Pierre; Harms, Hauke; deWachter, Rupert; Verachtert, Hubert; Diels, Ludo
2000-01-01
Two different procedures were compared to isolate polycyclic aromatic hydrocarbon (PAH)-utilizing bacteria from PAH-contaminated soil and sludge samples, i.e., (i) shaken enrichment cultures in liquid mineral medium in which PAHs were supplied as crystals and (ii) a new method in which PAH degraders were enriched on and recovered from hydrophobic membranes containing sorbed PAHs. Both techniques were successful, but selected from the same source different bacterial strains able to grow on PAHs as the sole source of carbon and energy. The liquid enrichment mainly selected for Sphingomonas spp., whereas the membrane method exclusively led to the selection of Mycobacterium spp. Furthermore, in separate membrane enrichment set-ups with different membrane types, three repetitive extragenic palindromic PCR-related Mycobacterium strains were recovered. The new Mycobacterium isolates were strongly hydrophobic and displayed the capacity to adhere strongly to different surfaces. One strain, Mycobacterium sp. LB501T, displayed an unusual combination of high adhesion efficiency and an extremely high negative charge. This strain may represent a new bacterial species as suggested by 16S rRNA gene sequence analysis. These results indicate that the provision of hydrophobic sorbents containing sorbed PAHs in the enrichment procedure discriminated in favor of certain bacterial characteristics. The new isolation method is appropriate to select for adherent PAH-degrading bacteria, which might be useful to biodegrade sorbed PAHs in soils and sludge. PMID:10788347
Coser, S M; Motoike, S Y; Corrêa, T R; Pires, T P; Resende, M D V
2016-10-17
Macaw palm (Acrocomia aculeata) is a promising species for use in biofuel production, and establishing breeding programs is important for the development of commercial plantations. The aim of the present study was to analyze genetic diversity, verify correlations between traits, estimate genetic parameters, and select different accessions of A. aculeata in the Macaw Palm Germplasm Bank located at Universidade Federal de Viçosa, to develop a breeding program for this species. Accessions were selected based on precocity (PREC), total spathe (TS), diameter at breast height (DBH), height of the first spathe (HFS), and canopy area (CA). The traits were evaluated in 52 accessions during the 2012/2013 season and analyzed by restricted maximum likelihood/best linear unbiased prediction (REML/BLUP) procedures. Genetic diversity analysis resulted in the formation of four groups by Tocher's clustering method. The correlation analysis showed that indirect and early selection was possible for the traits PREC and DBH. Estimated genetic parameters reinforced the genetic variability verified by cluster analysis. Narrow-sense heritability was classified as moderate (PREC, TS, and CA) to high (HFS and DBH), indicating strong genetic control of the traits and the potential for genetic gains by selection. Accuracy values were classified as moderate (PREC and CA) to high (TS, HFS, and DBH), reinforcing the success of the selection process. Selection of accessions for PREC, TS, and HFS by the rank-average method permits selection gains of over 100%, emphasizing the potential of these accessions in breeding programs for obtaining superior genotypes for commercial plantations.
Enhancing Breast Cancer Recurrence Algorithms Through Selective Use of Medical Record Data
Chubak, Jessica; Johnson, Lisa; Castillo, Adrienne; Weltzien, Erin; Caan, Bette J.
2016-01-01
Abstract Background: The utility of data-based algorithms in research has been questioned because of errors in identification of cancer recurrences. We adapted previously published breast cancer recurrence algorithms, selectively using medical record (MR) data to improve classification. Methods: We evaluated second breast cancer event (SBCE) and recurrence-specific algorithms previously published by Chubak and colleagues in 1535 women from the Life After Cancer Epidemiology (LACE) cohort and 225 women from the Women’s Health Initiative cohort, and compared classification statistics to published values. We also sought to improve classification with minimal MR examination. We selected pairs of algorithms—one with high sensitivity/high positive predictive value (PPV) and another with high specificity/high PPV—using MR information to resolve discrepancies between the algorithms and properly classify events based on review; we called this “triangulation.” Finally, in LACE, we compared associations between breast cancer survival risk factors and recurrence using MR data, single Chubak algorithms, and triangulation. Results: The SBCE algorithms performed well in identifying SBCE and recurrences. Recurrence-specific algorithms performed more poorly than published values, except for the high-specificity/high-PPV algorithm, which performed well. The triangulation method (sensitivity = 81.3%, specificity = 99.7%, PPV = 98.1%, NPV = 96.5%) improved recurrence classification over the two single algorithms (sensitivity = 57.1%, specificity = 95.5%, PPV = 71.3%, NPV = 91.9%; and sensitivity = 74.6%, specificity = 97.3%, PPV = 84.7%, NPV = 95.1%), with only 10.6% of records requiring MR review. Triangulation performed well in survival risk factor analyses compared with analyses using MR-identified recurrences. Conclusions: Use of multiple recurrence algorithms in administrative data, in combination with selective examination of MR data, may improve recurrence data quality and reduce research costs. PMID:26582243
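The classification statistics quoted above follow directly from confusion-matrix counts; a minimal helper for reference, with made-up counts (not the study's data):

```python
def classification_stats(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),   # true recurrences correctly flagged
        "specificity": tn / (tn + fp),   # non-recurrences correctly cleared
        "ppv": tp / (tp + fp),           # flagged cases that are true recurrences
        "npv": tn / (tn + fn),           # cleared cases truly recurrence-free
    }

# Hypothetical counts: 100 true recurrences, 1000 non-recurrences
stats = classification_stats(tp=81, fp=3, fn=19, tn=997)
```

Note the asymmetry the triangulation exploits: with rare outcomes, even a small false-positive count depresses PPV, so pairing a high-sensitivity algorithm with a high-specificity one and reviewing only their disagreements concentrates MR effort where it changes the classification.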
Establishing an efficient way to utilize the drought resistance germplasm population in wheat.
Wang, Jiancheng; Guan, Yajing; Wang, Yang; Zhu, Liwei; Wang, Qitian; Hu, Qijuan; Hu, Jin
2013-01-01
Drought resistance breeding provides a hopeful way to improve yield and quality of wheat in arid and semiarid regions. Constructing core collection is an efficient way to evaluate and utilize drought-resistant germplasm resources in wheat. In the present research, 1,683 wheat varieties were divided into five germplasm groups (high resistant, HR; resistant, R; moderate resistant, MR; susceptible, S; and high susceptible, HS). The least distance stepwise sampling (LDSS) method was adopted to select core accessions. Six commonly used genetic distances (Euclidean distance, Euclid; Standardized Euclidean distance, Seuclid; Mahalanobis distance, Mahal; Manhattan distance, Manhat; Cosine distance, Cosine; and Correlation distance, Correlation) were used to assess genetic distances among accessions. Unweighted pair-group average (UPGMA) method was used to perform hierarchical cluster analysis. Coincidence rate of range (CR) and variable rate of coefficient of variation (VR) were adopted to evaluate the representativeness of the core collection. A method for selecting the ideal constructing strategy was suggested in the present research. A wheat core collection for the drought resistance breeding programs was constructed by the strategy selected in the present research. The principal component analysis showed that the genetic diversity was well preserved in that core collection.
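Several of the genetic distances named above have simple closed forms; the sketch below illustrates a few of them on a trait matrix (rows = accessions, columns = traits). This is a generic illustration of the distance measures, not the LDSS implementation.

```python
import numpy as np

def pairwise_distances(X, metric):
    """Pairwise distances between accessions (rows of X) for some of the
    metrics named in the text; a minimal sketch, not the authors' code."""
    diffs = X[:, None, :] - X[None, :, :]
    if metric == "euclid":       # Euclidean
        return np.sqrt((diffs ** 2).sum(-1))
    if metric == "manhat":       # Manhattan (city-block)
        return np.abs(diffs).sum(-1)
    if metric == "cosine":       # Cosine distance = 1 - cosine similarity
        norms = np.linalg.norm(X, axis=1)
        return 1.0 - (X @ X.T) / np.outer(norms, norms)
    if metric == "mahal":        # Mahalanobis, using the sample covariance
        VI = np.linalg.inv(np.cov(X, rowvar=False))
        return np.sqrt(np.einsum("ijk,kl,ijl->ij", diffs, VI, diffs))
    raise ValueError(f"unknown metric: {metric}")
```

The resulting distance matrix would then feed a hierarchical clustering step such as UPGMA; in practice traits are usually standardized first so that no single trait dominates the distances.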
Zhao, Zhiyong; Liu, Na; Yang, Lingchen; Deng, Yifeng; Wang, Jianhua; Song, Suquan; Lin, Shanhai; Wu, Aibo; Zhou, Zhenlei; Hou, Jiafa
2015-09-01
Mycotoxins have the potential to enter the human food chain through carry-over of contaminants from feed into animal-derived products. The objective of the study was to develop a reliable and sensitive method for the analysis of 30 mycotoxins in animal feed and animal-derived food (meat, edible animal tissues, and milk) using liquid chromatography-tandem mass spectrometry (LC-MS/MS). In the study, three extraction procedures, as well as various cleanup procedures, were evaluated to select the most suitable sample preparation procedure for different sample matrices. In addition, timed and highly selective reaction monitoring on LC-MS/MS was used to filter out isobaric matrix interferences. The performance characteristics (linearity, sensitivity, recovery, precision, and specificity) of the method were determined according to Commission Decision 2002/657/EC and 401/2006/EC. The established method was successfully applied to screening of mycotoxins in animal feed and animal-derived food. The results indicated that mycotoxin contamination in feed directly influenced the presence of mycotoxin in animal-derived food. Graphical abstract Multi-mycotoxin analysis of animal feed and animal-derived food using LC-MS/MS.
Exploration of complex visual feature spaces for object perception
Leeds, Daniel D.; Pyles, John A.; Tarr, Michael J.
2014-01-01
The mid- and high-level visual properties supporting object perception in the ventral visual pathway are poorly understood. In the absence of well-specified theory, many groups have adopted a data-driven approach in which they progressively interrogate neural units to establish each unit's selectivity. Such methods are challenging in that they require search through a wide space of feature models and stimuli using a limited number of samples. To more rapidly identify higher-level features underlying human cortical object perception, we implemented a novel functional magnetic resonance imaging method in which visual stimuli are selected in real-time based on BOLD responses to recently shown stimuli. This work was inspired by earlier primate physiology work, in which neural selectivity for mid-level features in IT was characterized using a simple parametric approach (Hung et al., 2012). To extend such work to human neuroimaging, we used natural and synthetic object stimuli embedded in feature spaces constructed on the basis of the complex visual properties of the objects themselves. During fMRI scanning, we employed a real-time search method to control continuous stimulus selection within each image space. This search was designed to maximize neural responses across a pre-determined 1 cm³ brain region within ventral cortex. To assess the value of this method for understanding object encoding, we examined both the behavior of the method itself and the complex visual properties the method identified as reliably activating selected brain regions. We observed: (1) Regions selective for both holistic and component object features and for a variety of surface properties; (2) Object stimulus pairs near one another in feature space that produce responses at the opposite extremes of the measured activity range.
Together, these results suggest that real-time fMRI methods may yield more widely informative measures of selectivity within the broad classes of visual features associated with cortical object representation. PMID:25309408
A picture's worth a thousand words: a food-selection observational method.
Carins, Julia E; Rundle-Thiele, Sharyn R; Parkinson, Joy E
2016-05-04
Issue addressed: Methods are needed to accurately measure and describe behaviour so that social marketers and other behaviour change researchers can gain consumer insights before designing behaviour change strategies and so, in time, they can measure the impact of strategies or interventions when implemented. This paper describes a photographic method developed to meet these needs. Methods: Direct observation and photographic methods were developed and used to capture food-selection behaviour and examine those selections according to their healthfulness. Four meals (two lunches and two dinners) were observed at a workplace buffet-style cafeteria over a 1-week period. The healthfulness of individual meals was assessed using a classification scheme developed for the present study and based on the Australian Dietary Guidelines. Results: Approximately 27% of meals (n = 168) were photographed. Agreement was high between raters classifying dishes using the scheme, as well as between researchers when coding photographs. The subset of photographs was representative of patterns observed in the entire dining room. Diners chose main dishes in line with the proportions presented, but in opposition to the proportions presented for side dishes. Conclusions: The present study developed a rigorous observational method to investigate food choice behaviour. The comprehensive food classification scheme produced consistent classifications of foods. The photographic data collection method was found to be robust and accurate. Combining the two observation methods allows researchers and/or practitioners to accurately measure and interpret food selections. Consumer insights gained suggest that, in this setting, increasing the availability of green (healthful) offerings for main dishes would assist in improving healthfulness, whereas other strategies (e.g. promotion) may be needed for side dishes. 
So what?: Visual observation methods that accurately measure and interpret food-selection behaviour provide both insight for those developing healthy eating interventions and a means to evaluate the effect of implemented interventions on food selection.
Takeno, Shinya; Bamba, Takeshi; Nakazawa, Yoshihisa; Fukusaki, Eiichiro; Okazawa, Atsushi; Kobayashi, Akio
2008-04-01
Commercial development of trans-1,4-polyisoprene from Eucommia ulmoides Oliver (EU-rubber) requires specific knowledge on selection of high-rubber-content lines and establishment of agronomic cultivation methods for achieving maximum EU-rubber yield. The development can be facilitated by high-throughput and highly sensitive analytical techniques for EU-rubber extraction and quantification. In this paper, we described an efficient EU-rubber extraction method, and validated that the accuracy was equivalent to that of the conventional Soxhlet extraction method. We also described a highly sensitive quantification method for EU-rubber by Fourier transform infrared spectroscopy (FT-IR) and pyrolysis-gas chromatography/mass spectrometry (PyGC/MS). We successfully applied the extraction/quantification method for study of seasonal changes in EU-rubber content and molecular weight distribution.
Amperometric Carbon Fiber Nitrite Microsensor for In Situ Biofilm Monitoring
A highly selective needle type solid state amperometric nitrite microsensor based on direct nitrite oxidation on carbon fiber was developed using a simplified fabrication method. The microsensor’s tip diameter was approximately 7 µm, providing a high spatial resolution of at lea...