Sample records for extraction technique based

  1. State of the art of environmentally friendly sample preparation approaches for determination of PBDEs and metabolites in environmental and biological samples: A critical review.

    PubMed

    Berton, Paula; Lana, Nerina B; Ríos, Juan M; García-Reyes, Juan F; Altamirano, Jorgelina C

    2016-01-28

    Green chemistry principles for developing methodologies have gained attention in analytical chemistry in recent decades. A growing number of analytical techniques have been proposed for determination of persistent organic pollutants in environmental and biological samples. In this light, the current review aims to present state-of-the-art sample preparation approaches based on green analytical principles proposed for the determination of polybrominated diphenyl ethers (PBDEs) and metabolites (OH-PBDEs and MeO-PBDEs) in environmental and biological samples. Approaches to lower the solvent consumption and accelerate the extraction, such as pressurized liquid extraction, microwave-assisted extraction, and ultrasound-assisted extraction, are discussed in this review. Special attention is paid to miniaturized sample preparation methodologies and strategies proposed to reduce organic solvent consumption. Additionally, extraction techniques based on alternative solvents (surfactants, supercritical fluids, or ionic liquids) are also discussed in this work, even though they are scarcely used for determination of PBDEs. In addition to liquid-based extraction techniques, solid-based analytical techniques are also addressed. The development of greener, faster and simpler sample preparation approaches has increased in recent years (2003-2013). Among green extraction techniques, those based on the liquid phase predominate over those based on the solid phase (71% vs. 29%, respectively). For solid samples, solvent-assisted extraction techniques are preferred for leaching of PBDEs, and liquid phase microextraction techniques are mostly used for liquid samples. Likewise, green characteristics of the instrumental analysis used after the extraction and clean-up steps are briefly discussed. Copyright © 2015 Elsevier B.V. All rights reserved.

  2. Developing a hybrid dictionary-based bio-entity recognition technique.

    PubMed

    Song, Min; Yu, Hwanjo; Han, Wook-Shin

    2015-01-01

    Bio-entity extraction is a pivotal component for information extraction from biomedical literature. Dictionary-based bio-entity extraction is the first generation of Named Entity Recognition (NER) techniques. This paper presents a hybrid dictionary-based bio-entity extraction technique. The approach expands the bio-entity dictionary by combining different data sources and improves the recall rate through the shortest path edit distance algorithm. In addition, the proposed technique adopts text mining techniques in the merging stage of similar entities, such as Part of Speech (POS) expansion, stemming, and the exploitation of contextual cues, to further improve the performance. The experimental results show that the proposed technique achieves the best, or at least equivalent, performance in F-measure among the compared configurations built from GENIA, MeSH, UMLS, and combinations of these three resources. The results imply that the performance of dictionary-based extraction techniques is largely influenced by the information resources used to build the dictionary. In addition, the edit distance algorithm shows steady performance in precision with the three different dictionaries, whereas the context-only technique achieves high-end performance in recall with the three different dictionaries.
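
    The edit-distance-tolerant dictionary lookup at the core of such a system can be sketched in a few lines. The Python fragment below is a minimal illustration under stated assumptions: the dictionary entries, the normalization, and the relative-distance threshold are invented for the example and do not reproduce the GENIA/MeSH/UMLS resources or the shortest-path edit distance variant the authors describe.

```python
# Minimal sketch of dictionary-based entity lookup with an edit-distance
# tolerance. Dictionary contents and the threshold are illustrative only.

def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,          # deletion
                            curr[j - 1] + 1,      # insertion
                            prev[j - 1] + cost))  # substitution
        prev = curr
    return prev[-1]

# Hypothetical bio-entity dictionary built from several resources.
DICTIONARY = {"interleukin-2", "p53", "tumor necrosis factor alpha"}

def match(mention: str, max_relative_distance: float = 0.2):
    """Return dictionary entries whose normalised edit distance is small."""
    mention_norm = mention.lower().strip()
    hits = []
    for entry in DICTIONARY:
        d = edit_distance(mention_norm, entry.lower())
        if d / max(len(entry), 1) <= max_relative_distance:
            hits.append((entry, d))
    return sorted(hits, key=lambda x: x[1])

print(match("Interleukin 2"))   # tolerates the missing hyphen
```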

  3. Developing a hybrid dictionary-based bio-entity recognition technique

    PubMed Central

    2015-01-01

    Background Bio-entity extraction is a pivotal component for information extraction from biomedical literature. Dictionary-based bio-entity extraction is the first generation of Named Entity Recognition (NER) techniques. Methods This paper presents a hybrid dictionary-based bio-entity extraction technique. The approach expands the bio-entity dictionary by combining different data sources and improves the recall rate through the shortest path edit distance algorithm. In addition, the proposed technique adopts text mining techniques in the merging stage of similar entities, such as Part of Speech (POS) expansion, stemming, and the exploitation of contextual cues, to further improve the performance. Results The experimental results show that the proposed technique achieves the best, or at least equivalent, performance in F-measure among the compared configurations built from GENIA, MeSH, UMLS, and combinations of these three resources. Conclusions The results imply that the performance of dictionary-based extraction techniques is largely influenced by the information resources used to build the dictionary. In addition, the edit distance algorithm shows steady performance in precision with the three different dictionaries, whereas the context-only technique achieves high-end performance in recall with the three different dictionaries. PMID:26043907

  4. Ionic liquid-based ultrasonic/microwave-assisted extraction combined with UPLC-MS-MS for the determination of tannins in Galla chinensis.

    PubMed

    Lu, Chunxia; Wang, Hongxin; Lv, Wenping; Ma, Chaoyang; Lou, Zaixiang; Xie, Jun; Liu, Bo

    2012-01-01

    Ionic liquids were used as extraction solvents and applied to the extraction of tannins from Galla chinensis in the simultaneous ultrasonic- and microwave-assisted extraction (UMAE) technique. Several parameters of UMAE were optimised, and the results were compared with those of conventional extraction techniques. Under optimal conditions, the content of tannins was 630.2 ± 12.1 mg g⁻¹. Compared with conventional heat-reflux extraction, maceration extraction, and regular ultrasound- and microwave-assisted extraction, the proposed approach exhibited higher efficiency (enhanced by 11.7-22.0%) and a shorter extraction time (reduced from 6 h to 1 min). The tannins were then identified by ultraperformance liquid chromatography tandem mass spectrometry. This study suggests that ionic liquid-based UMAE is an efficient, rapid, simple and green sample preparation technique.

  5. Intercomparison of Lab-Based Soil Water Extraction Methods for Stable Water Isotope Analysis

    NASA Astrophysics Data System (ADS)

    Pratt, D.; Orlowski, N.; McDonnell, J.

    2016-12-01

    The effect of pore water extraction technique on resultant isotopic signature is poorly understood. Here we present results of an intercomparison of five common lab-based soil water extraction techniques: high pressure mechanical squeezing, centrifugation, direct vapor equilibration, microwave extraction, and cryogenic extraction. We applied five extraction methods to two physicochemically different standard soil types (silty sand and clayey loam) that were oven-dried and rewetted with water of known isotopic composition at three different gravimetric water contents (8, 20, and 30%). We tested the null hypothesis that all extraction techniques would provide the same isotopic result independent of soil type and water content. Our results showed that the extraction technique had a significant effect on the soil water isotopic composition. Each method exhibited deviations from spiked reference water, with soil type and water content showing a secondary effect. Cryogenic extraction showed the largest deviations from the reference water, whereas mechanical squeezing and centrifugation provided the closest match to the reference water for both soil types. We also compared results for each extraction technique that produced liquid water on both an OA-ICOS and an IRMS; differences between them were negligible.

  6. Soil solution extraction techniques for microbial ecotoxicity testing: a comparative evaluation.

    PubMed

    Tiensing, T; Preston, S; Strachan, N; Paton, G I

    2001-02-01

    The suitability of two different techniques (centrifugation and Rhizon sampler) for obtaining the interstitial pore water of soil (soil solution), integral to the ecotoxicity assessment of metal contaminated soil, was investigated by combining chemical analyses and a luminescence-based microbial biosensor. Two different techniques, centrifugation and Rhizon sampler, were used to extract the soil solution from Insch (a loamy sand) and Boyndie (a sandy loam) soils, which had been amended with different concentrations of Zn and Cd. The concentrations of dissolved organic carbon (DOC), major anions (F⁻, Cl⁻, NO₃⁻, SO₄²⁻) and major cations (K⁺, Mg²⁺, Ca²⁺) in the soil solutions varied depending on the extraction technique used. Overall, the concentrations of Zn and Cd were significantly higher in the soil solution extracted using the centrifugation technique compared with that extracted using the Rhizon sampler technique. Furthermore, the differences observed between the two extraction techniques depended on the type of soil from which the solution was being extracted. The luminescence-based biosensor Escherichia coli HB101 pUCD607 was shown to respond to the free metal concentrations in the soil solutions and showed that different toxicities were associated with each soil, depending on the technique used to extract the soil solution. This study highlights the need to characterise the type of extraction technique used to obtain the soil solution for ecotoxicity testing in order that a representative ecotoxicity assessment can be carried out.

  7. Review of online coupling of sample preparation techniques with liquid chromatography.

    PubMed

    Pan, Jialiang; Zhang, Chengjiang; Zhang, Zhuomin; Li, Gongke

    2014-03-07

    Sample preparation is still considered the bottleneck of the whole analytical procedure, and efforts have been directed towards automation, improvement of sensitivity and accuracy, and reduced consumption of organic solvents. Development of online sample preparation (SP) techniques coupled with liquid chromatography (LC) is a promising way to achieve these goals, and it has attracted great attention. This article reviews the recent advances in online SP-LC techniques. Various online SP techniques have been described and summarized, including solid-phase-based extraction, liquid-phase-based extraction assisted with membrane, microwave assisted extraction, ultrasonic assisted extraction, accelerated solvent extraction and supercritical fluids extraction. In particular, the coupling approaches of online SP-LC systems and the corresponding interfaces have been discussed and reviewed in detail, such as online injector, autosampler combined with transport unit, desorption chamber and column switching. Typical applications of the online SP-LC techniques have been summarized. Finally, the problems and expected trends in this field are discussed in order to encourage the further development of online SP-LC techniques. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. Comparing the Efficiency of Two Different Extraction Techniques in Removal of Maxillary Third Molars: A Randomized Controlled Trial.

    PubMed

    Edward, Joseph; Aziz, Mubarak A; Madhu Usha, Arjun; Narayanan, Jyothi K

    2017-12-01

    Extractions are routine procedures in dental surgery. Traditional extraction techniques use a combination of severing the periodontal attachment, luxation with an elevator, and removal with forceps. A new technique for extraction of maxillary third molars is introduced in this study, the Joedds technique, which is compared with the conventional technique. One hundred patients were included in the study and divided into two groups by simple random sampling. In one group the conventional technique of maxillary third molar extraction was used, and in the second the Joedds technique was used. Statistical analysis was carried out with Student's t test. Analysis of the 100 patients showed that the novel Joedds technique caused minimal trauma to surrounding tissues and fewer tuberosity and root fractures, and the time taken for extraction was <2 min compared with the other group of patients. This novel technique has proved to be better than the conventional third molar extraction technique, with minimal complications, provided that cases are properly selected and the correct technique is used.

  9. High Throughput Immunomagnetic Scavenging Technique for ...

    EPA Pesticide Factsheets

    This article describes a novel immunomagnetic scavenging (IMSc) technique for extracting cholinesterase inhibitors from aqueous matrixes using biological targeting and antibody-based extraction.

  10. Green bio-oil extraction for oil crops

    NASA Astrophysics Data System (ADS)

    Zainab, H.; Nurfatirah, N.; Norfaezah, A.; Othman, H.

    2016-06-01

    The move towards a green bio-oil extraction technique is highlighted in this paper. The commonly practised organic solvent oil extraction technique could be replaced with a modified microwave extraction. Jatropha seeds (Jatropha curcas) were used to extract bio-oil. Clean samples were heated in an oven at 110 °C for 24 hours to remove moisture content and ground to obtain a particle size smaller than 500 μm. Extraction was carried out at different extraction times (15, 30, 45, 60 and 120 min) to determine oil yield. The bio-oil yield obtained from the microwave-assisted extraction system at 90 minutes was 36%, while that from Soxhlet extraction for 6 hours was 42%. The microwave-assisted extraction (MAE) system could enhance the yield of bio-oil compared to Soxhlet extraction. The MAE system is rapid, using only water as solvent, which makes it a nonhazardous, environmentally friendly technique compared to the Soxhlet extraction (SE) method using hexane as solvent. Thus, this is a green technique of bio-oil extraction using only water as extractant. Bio-oil extraction from the pyrolysis of empty fruit bunch (EFB), a biomass waste from the oil palm crop, was enhanced using a biocatalyst derived from seashell waste. Oil yield for non-catalytic extraction was 43.8%, while with addition of the seashell-based biocatalyst it was 44.6%. The pH of the bio-oil increased from 3.5 to 4.3. The viscosity of the bio-oil obtained by catalytic means increased from 20.5 to 37.8 cP. A rapid and environmentally friendly extraction technique is preferable to enhance bio-oil yield. The microwave-assisted approach is a green, rapid and environmentally friendly extraction technique for the production of bio-oil from oil-bearing crops.

  11. Comparison of extraction techniques and modeling of accelerated solvent extraction for the authentication of natural vanilla flavors.

    PubMed

    Cicchetti, Esmeralda; Chaintreau, Alain

    2009-06-01

    Accelerated solvent extraction (ASE) of vanilla beans has been optimized using ethanol as a solvent. A theoretical model is proposed to account for this multistep extraction. This allows the determination, for the first time, of the total amount of analytes initially present in the beans and thus the calculation of recoveries using ASE or any other extraction technique. As a result, ASE and Soxhlet extractions have been determined to be efficient methods, whereas recoveries are modest for maceration techniques and depend on the solvent used. Because industrial extracts are obtained by many different procedures, including maceration in various solvents, authenticating vanilla extracts using quantitative ratios between the amounts of vanilla flavor constituents appears to be unreliable. When authentication techniques based on isotopic ratios are used, ASE is a valid sample preparation technique because it does not induce isotopic fractionation.
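
    The multistep-extraction idea lends itself to a small worked sketch. The fragment below assumes each ASE cycle removes a constant fraction of the analyte remaining in the beans, so per-step yields follow a geometric series whose fitted sum estimates the total amount initially present; both the functional form and the numbers are illustrative assumptions, not the model published in the paper.

```python
# Sketch: estimate total analyte content from successive extraction steps,
# assuming each step removes the same fraction f of what remains (geometric
# decay). Both the model form and the data below are illustrative.
import numpy as np
from scipy.optimize import curve_fit

def step_yield(n, m0, f):
    """Yield collected in step n (1-based) if a fraction f is removed per step."""
    return m0 * f * (1.0 - f) ** (n - 1)

steps = np.array([1, 2, 3, 4])
yields = np.array([12.0, 4.1, 1.4, 0.5])   # hypothetical mg of analyte per step

(m0_hat, f_hat), _ = curve_fit(step_yield, steps, yields, p0=(20.0, 0.5))

recovered = yields.sum()
print(f"estimated total in beans: {m0_hat:.1f} mg, per-step fraction: {f_hat:.2f}")
print(f"recovery after {len(steps)} steps: {100 * recovered / m0_hat:.1f}%")
```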

  12. Dispersive Solid Phase Extraction for the Analysis of Veterinary Drugs Applied to Food Samples: A Review

    PubMed Central

    Islas, Gabriela; Hernandez, Prisciliano

    2017-01-01

    To achieve analytical success, it is necessary to develop thorough clean-up procedures to extract analytes from the matrix. Dispersive solid phase extraction (DSPE) has been used as a pretreatment technique for the analysis of several compounds. This technique is based on the dispersion of a solid sorbent in liquid samples for the extraction, isolation, and clean-up of different analytes from complex matrices. DSPE has found a wide range of applications in several fields, and it is considered to be a selective, robust, and versatile technique. The applications of dispersive techniques in the analysis of veterinary drugs in different matrices involve magnetic sorbents, molecularly imprinted polymers, carbon-based nanomaterials, and the Quick, Easy, Cheap, Effective, Rugged, and Safe (QuEChERS) method. Techniques based on DSPE permit minimization of additional steps such as precipitation, centrifugation, and filtration, which decreases the manipulation of the sample. In this review, we describe the main procedures used for synthesis, characterization, and application of this pretreatment technique and how it has been applied to food analysis. PMID:29181027

  13. Modeling and prediction of extraction profile for microwave-assisted extraction based on absorbed microwave energy.

    PubMed

    Chan, Chung-Hung; Yusoff, Rozita; Ngoh, Gek-Cheng

    2013-09-01

    A modeling technique based on absorbed microwave energy was proposed to model microwave-assisted extraction (MAE) of antioxidant compounds from cocoa (Theobroma cacao L.) leaves. By adapting a suitable extraction model on the basis of the microwave energy absorbed during extraction, the model can be developed to predict the extraction profile of MAE at various microwave irradiation powers (100-600 W) and solvent loadings (100-300 ml). Verification with experimental data confirmed that the prediction was accurate in capturing the extraction profile of MAE (R-square value greater than 0.87). In addition, the predicted yields from the model showed good agreement with the experimental results, with less than 10% deviation observed. Furthermore, suitable extraction times to ensure high extraction yield at various MAE conditions can be estimated based on absorbed microwave energy. The estimation is feasible as more than 85% of active compounds can be extracted when compared with the conventional extraction technique. Copyright © 2013 Elsevier Ltd. All rights reserved.
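
    One simple way to express an extraction profile as a function of absorbed microwave energy is a first-order (exponential-rise) yield model with absorbed energy, rather than time, as the independent variable. The sketch below fits such a curve and then back-calculates the time needed to reach a target yield at a given absorbed power; the functional form, the data, and the 85% target are assumptions for illustration only and are not taken from the paper.

```python
# Sketch: fit yield vs. absorbed microwave energy with a first-order model,
# then estimate the extraction time needed to reach 85% of the maximum yield
# for a given absorbed power. Model form and data are illustrative only.
import numpy as np
from scipy.optimize import curve_fit

def yield_vs_energy(e_abs, y_max, k):
    """First-order approach to a plateau as absorbed energy e_abs (kJ) grows."""
    return y_max * (1.0 - np.exp(-k * e_abs))

e_abs = np.array([5.0, 10.0, 20.0, 40.0, 80.0])    # hypothetical absorbed energy, kJ
yields = np.array([8.2, 14.1, 21.5, 27.0, 29.4])   # hypothetical mg/g extracted

(y_max, k), _ = curve_fit(yield_vs_energy, e_abs, yields, p0=(30.0, 0.05))

target = 0.85 * y_max                   # aim for 85% of the plateau yield
e_needed = -np.log(1.0 - 0.85) / k      # invert the model for the energy needed
p_abs_kw = 0.3                          # hypothetical absorbed power, kW (kJ/s)
print(f"time to reach {target:.1f} mg/g at {p_abs_kw} kW: {e_needed / p_abs_kw:.0f} s")
```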

  14. Combining active learning and semi-supervised learning techniques to extract protein interaction sentences.

    PubMed

    Song, Min; Yu, Hwanjo; Han, Wook-Shin

    2011-11-24

    Protein-protein interaction (PPI) extraction has been a focal point of much biomedical research and of database curation tools. Both active learning (AL) and semi-supervised SVMs have recently been applied to extract PPIs automatically. In this paper, we explore combining AL with semi-supervised learning (SSL) to improve the performance of the PPI task. We propose a novel PPI extraction technique called PPISpotter that combines Deterministic Annealing-based SSL with an AL technique to extract protein-protein interactions. In addition, we extract a comprehensive set of features from MEDLINE records by Natural Language Processing (NLP) techniques, which further improve the SVM classifiers. In our feature selection technique, syntactic, semantic, and lexical properties of text are incorporated into feature selection, which boosts the system performance significantly. By conducting experiments with three different PPI corpora, we show that PPISpotter is superior to the other techniques incorporated into semi-supervised SVMs, such as Random Sampling, Clustering, and Transductive SVMs, in terms of precision, recall, and F-measure. Our system is a novel, state-of-the-art technique for efficiently extracting protein-protein interaction pairs.
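
    The interplay of active learning and semi-supervised learning can be illustrated with a compact sketch. Below, a simple self-training loop with uncertainty sampling on a linear SVM stands in for the deterministic annealing SSL used by PPISpotter; the sentences, features, and number of rounds are hypothetical placeholders.

```python
# Sketch of combining active learning (uncertainty sampling) with a simple
# self-training step, standing in for the deterministic-annealing SSL used by
# PPISpotter. Data, features, and thresholds are hypothetical.
import numpy as np
from scipy.sparse import vstack
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC

labeled = ["ProteinA binds ProteinB in vitro", "The weather was sunny today"]
labels = np.array([1, 0])                       # 1 = PPI sentence, 0 = not
unlabeled = ["ProteinC interacts with ProteinD",
             "The committee met on Tuesday",
             "Phosphorylation of X by kinase Y"]

vec = TfidfVectorizer()
X_lab = vec.fit_transform(labeled)
X_unl = vec.transform(unlabeled)
pool = list(range(len(unlabeled)))              # indices still unlabeled

for _ in range(2):                              # a few AL/SSL rounds
    clf = LinearSVC(max_iter=10000).fit(X_lab, labels)
    margins = np.abs(clf.decision_function(X_unl[pool]))
    # Active learning: the least confident sentence would go to an annotator.
    print("would ask annotator about:", unlabeled[pool[int(np.argmin(margins))]])
    # Self-training: adopt the most confident prediction as a pseudo-label.
    best = pool[int(np.argmax(margins))]
    pseudo = clf.predict(X_unl[best])
    X_lab = vstack([X_lab, X_unl[best]])
    labels = np.append(labels, pseudo)
    pool.remove(best)

print("final classifier trained on", X_lab.shape[0], "sentences")
```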

  15. High-throughput immunomagnetic scavenging technique for quantitative analysis of live VX nerve agent in water, hamburger, and soil matrixes.

    PubMed

    Knaack, Jennifer S; Zhou, Yingtao; Abney, Carter W; Prezioso, Samantha M; Magnuson, Matthew; Evans, Ronald; Jakubowski, Edward M; Hardy, Katelyn; Johnson, Rudolph C

    2012-11-20

    We have developed a novel immunomagnetic scavenging technique for extracting cholinesterase inhibitors from aqueous matrixes using biological targeting and antibody-based extraction. The technique was characterized using the organophosphorus nerve agent VX. The limit of detection for VX in high-performance liquid chromatography (HPLC)-grade water, defined as the lowest calibrator concentration, was 25 pg/mL in a small, 500 μL sample. The method was characterized over the course of 22 sample sets containing calibrators, blanks, and quality control samples. Method precision, expressed as the mean relative standard deviation, was less than 9.2% for all calibrators. Quality control sample accuracy was 102% and 100% of the mean for VX spiked into HPLC-grade water at concentrations of 2.0 and 0.25 ng/mL, respectively. This method was successfully applied to aqueous extracts from soil, hamburger, and finished tap water spiked with VX. Recovery was 65%, 81%, and 100% from these matrixes, respectively. Biologically based extractions of organophosphorus compounds represent a new technique for sample extraction that provides an increase in extraction specificity and sensitivity.

  16. Critical Evaluation of Soil Pore Water Extraction Methods on a Natural Soil

    NASA Astrophysics Data System (ADS)

    Orlowski, Natalie; Pratt, Dyan; Breuer, Lutz; McDonnell, Jeffrey

    2017-04-01

    Soil pore water extraction is an important component in ecohydrological studies for the measurement of δ2H and δ18O. The effect of pore water extraction technique on resultant isotopic signature is poorly understood. Here we present results of an intercomparison of commonly applied lab-based soil water extraction techniques on a natural soil: high pressure mechanical squeezing, centrifugation, direct vapor equilibration, microwave extraction, and two types of cryogenic extraction systems. We applied these extraction methods to a natural summer-dry (gravimetric water contents ranging from 8% to 15%) glacio-lacustrine, moderately fine textured clayey soil, excavated in 10 cm sampling increments to a depth of 1 meter. Isotope results were analyzed via OA-ICOS and compared for each extraction technique that produced liquid water. From our previous intercomparison study among the same extraction techniques but with standard soils, we discovered that extraction methods are not comparable. We therefore tested the null hypothesis that all extraction techniques would be able to replicate, in a comparable manner, the natural evaporation front occurring in a summer-dry soil. Our results showed that the extraction technique utilized had a significant effect on the soil water isotopic composition. High pressure mechanical squeezing and vapor equilibration techniques produced similar results with similarly sloped evaporation lines. Due to the nature of soil properties and dryness, centrifugation was unsuccessful in obtaining pore water for isotopic analysis. Both tested cryogenic extraction systems produced results similar to each other, along a similarly sloped evaporation line, but dissimilar with depth.

  17. Water-based gas purge microsyringe extraction coupled with liquid chromatography for determination of alkylphenols from sea food Laminaria japonica Aresh.

    PubMed

    Yang, Cui; Zhao, Jinhua; Wang, Juan; Yu, Hongling; Piao, Xiangfan; Li, Donghao

    2013-07-26

    A novel organic solvent-free mode of gas purge microsyringe extraction, termed water-based gas purge microsyringe extraction, was developed. This technique can directly extract target compounds in wet samples without any drying process. Parameters affecting the extraction efficiency were investigated. Under optimal extraction conditions, the recoveries of alkylphenols were between 87.6 and 105.8%, and reproducibility was between 5.2 and 12.1%. The technique was also used to determine six kinds of alkylphenols (APs) from samples of Laminaria japonica Aresh. OP and NP were detected in all the samples, at concentrations ranging from 26.0 to 54.5 ng g⁻¹ and from 45.0 to 180.4 ng g⁻¹, respectively. 4-n-Butylphenol was detected in only one sample, and its concentration was very low. Other APs were not detected in L. japonica Aresh samples. The experimental results demonstrated that the technique is fast, simple, non-polluting, allows for quantitative extraction, and does not require a drying process for wet samples. Since only an aqueous solution and a conventional microsyringe were used, this technique proved affordable, efficient, and convenient for the extraction of volatile and semivolatile ionizable compounds. Copyright © 2013 Elsevier B.V. All rights reserved.

  18. A novel murmur-based heart sound feature extraction technique using envelope-morphological analysis

    NASA Astrophysics Data System (ADS)

    Yao, Hao-Dong; Ma, Jia-Li; Fu, Bin-Bin; Wang, Hai-Yang; Dong, Ming-Chui

    2015-07-01

    Auscultation of heart sound (HS) signals has served for centuries as an important primary approach to diagnosing cardiovascular diseases (CVDs). Confronting the intrinsic drawbacks of traditional HS auscultation, computer-aided automatic HS auscultation based on feature extraction techniques has witnessed explosive development. Yet, most existing HS feature extraction methods adopt acoustic or time-frequency features which exhibit a poor relationship with diagnostic information, thus restricting the performance of further interpretation and analysis. Tackling such a bottleneck problem, this paper proposes a novel murmur-based HS feature extraction method, since murmurs contain massive pathological information and are regarded as the first indications of pathological changes of the heart valves. Using the discrete wavelet transform (DWT) and the Shannon envelope, the envelope-morphological characteristics of murmurs are obtained and three features are extracted accordingly. Validated by discriminating normal HS from five types of abnormal HS signals using the extracted features, the proposed method provides an attractive candidate for automatic HS auscultation.
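
    The general pipeline, wavelet filtering followed by a Shannon energy envelope and a few envelope-morphology descriptors, can be sketched as follows. The wavelet choice, the retained band, the smoothing window, and the descriptors below are assumptions for illustration and are not the authors' exact features.

```python
# Sketch: Shannon-energy envelope of a wavelet-filtered heart-sound signal and
# a few simple envelope-morphology descriptors. Wavelet, level, and the chosen
# descriptors are illustrative assumptions.
import numpy as np
import pywt

fs = 2000                                   # Hz, assumed sampling rate
t = np.arange(0, 2.0, 1.0 / fs)
hs = np.sin(2 * np.pi * 60 * t) * np.exp(-((t % 1.0) - 0.1) ** 2 / 0.001)  # toy HS

# Keep one mid-frequency detail band via the discrete wavelet transform.
coeffs = pywt.wavedec(hs, "db6", level=5)
keep = [np.zeros_like(c) for c in coeffs]
keep[2] = coeffs[2]                         # retain a single detail band only
band = pywt.waverec(keep, "db6")[: len(hs)]

# Shannon energy envelope: -x^2 * log(x^2) on the normalised band, then smooth.
x = band / (np.max(np.abs(band)) + 1e-12)
se = -x ** 2 * np.log(x ** 2 + 1e-12)
envelope = np.convolve(se, np.ones(64) / 64, mode="same")

# Simple envelope-morphology features.
features = {
    "peak": float(envelope.max()),
    "mean": float(envelope.mean()),
    "duration_above_half_peak_s": float(np.sum(envelope > 0.5 * envelope.max()) / fs),
}
print(features)
```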

  19. Recent development of feature extraction and classification multispectral/hyperspectral images: a systematic literature review

    NASA Astrophysics Data System (ADS)

    Setiyoko, A.; Dharma, I. G. W. S.; Haryanto, T.

    2017-01-01

    Multispectral and hyperspectral data acquired from satellite sensors have the ability to detect various objects on the Earth, ranging from low-scale to high-scale modeling. These data are increasingly being used to produce geospatial information for rapid analysis by running feature extraction or classification processes. Applying the most suited model for this data mining is still challenging because there are issues regarding accuracy and computational cost. The aim of this research is to develop a better understanding of object feature extraction and classification applied to satellite images by systematically reviewing related recent research projects. The method used in this research is based on the PRISMA statement. After deriving important points from trusted sources, pixel-based and texture-based feature extraction techniques emerge as promising techniques for further analysis in the recent development of feature extraction and classification.
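
    As a toy illustration of the texture-based feature extraction the review points to, gray-level co-occurrence matrix (GLCM) statistics can be computed per band or per segment and passed to a classifier. The synthetic band, the GLCM parameters, and the assumption of scikit-image 0.19 or newer (graycomatrix spelling) are illustrative choices, not prescriptions from the review.

```python
# Toy illustration of texture-based feature extraction (GLCM statistics) on a
# single image band, the kind of technique the review identifies as promising.
# The synthetic band and GLCM parameters are illustrative; assumes
# scikit-image >= 0.19 (graycomatrix / graycoprops spelling).
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(0)
band = rng.integers(0, 64, size=(128, 128), dtype=np.uint8)  # stand-in for one band

glcm = graycomatrix(band,
                    distances=[1, 2],
                    angles=[0, np.pi / 2],
                    levels=64,
                    symmetric=True,
                    normed=True)

features = {
    "contrast": graycoprops(glcm, "contrast").mean(),
    "homogeneity": graycoprops(glcm, "homogeneity").mean(),
    "correlation": graycoprops(glcm, "correlation").mean(),
}
print(features)   # would be fed into a per-pixel or per-segment classifier
```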

  20. Histogram of gradient and binarized statistical image features of wavelet subband-based palmprint features extraction

    NASA Astrophysics Data System (ADS)

    Attallah, Bilal; Serir, Amina; Chahir, Youssef; Boudjelal, Abdelwahhab

    2017-11-01

    Palmprint recognition systems are dependent on feature extraction. A method of feature extraction using higher discrimination information was developed to characterize palmprint images. In this method, two individual feature extraction techniques are applied to a discrete wavelet transform of a palmprint image, and their outputs are fused. The two techniques used in the fusion are the histogram of gradient and the binarized statistical image features. They are then evaluated using an extreme learning machine classifier before feature selection based on principal component analysis. Three palmprint databases, the Hong Kong Polytechnic University (PolyU) Multispectral Palmprint Database, Hong Kong PolyU Palmprint Database II, and the Delhi Touchless (IIDT) Palmprint Database, are used in this study. The study shows that our method effectively identifies and verifies palmprints and outperforms other methods based on feature extraction.
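
    The fusion idea can be sketched compactly: compute HOG on a wavelet subband of the palmprint, concatenate it with a second descriptor, and reduce the result with PCA. In the fragment below a plain intensity histogram stands in for BSIF (which requires pre-learned filters), and the images and parameters are synthetic placeholders rather than the paper's settings.

```python
# Sketch of the fusion idea: HOG features computed on a wavelet subband of a
# palmprint image, concatenated with a simple intensity-histogram descriptor
# (standing in for BSIF, which requires pre-learned filters), then reduced by
# PCA. Image data and all parameters are illustrative.
import numpy as np
import pywt
from skimage.feature import hog
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
palms = rng.random((20, 128, 128))            # stand-in for 20 palmprint images

def describe(img):
    cA, (cH, cV, cD) = pywt.dwt2(img, "db2")  # one-level wavelet decomposition
    h = hog(cA, orientations=9, pixels_per_cell=(8, 8),
            cells_per_block=(2, 2), feature_vector=True)
    hist, _ = np.histogram(cA, bins=32, density=True)  # crude BSIF stand-in
    return np.concatenate([h, hist])

X = np.vstack([describe(p) for p in palms])
X_reduced = PCA(n_components=10).fit_transform(X)
print(X_reduced.shape)                        # features ready for a classifier
```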

  1. A two-step electrodialysis method for DNA purification from polluted metallic environmental samples.

    PubMed

    Rodríguez-Mejía, José Luis; Martínez-Anaya, Claudia; Folch-Mallol, Jorge Luis; Dantán-González, Edgar

    2008-08-01

    Extracting DNA from samples of polluted environments using standard methods often results in low yields of poor-quality material unsuited to subsequent manipulation and analysis by molecular biological techniques. Here, we report a novel two-step electrodialysis-based method for the extraction of DNA from environmental samples. This technique permits the rapid and efficient isolation of high-quality DNA based on its acidic nature, and without the requirement for phenol-chloroform-isoamyl alcohol cleanup and ethanol precipitation steps. Subsequent PCR, endonuclease restriction, and cloning reactions were successfully performed utilizing DNA obtained by electrodialysis, whereas some or all of these techniques failed using DNA extracted with two alternative methods. We also show that this technique is applicable to the purification of DNA from a range of polluted and nonpolluted samples.

  2. Machine learning based sample extraction for automatic speech recognition using dialectal Assamese speech.

    PubMed

    Agarwalla, Swapna; Sarma, Kandarpa Kumar

    2016-06-01

    Automatic Speaker Recognition (ASR) and related issues are continuously evolving as inseparable elements of Human Computer Interaction (HCI). With the assimilation of emerging concepts like big data and the Internet of Things (IoT) as extended elements of HCI, ASR techniques are found to be passing through a paradigm shift. Of late, learning-based techniques have started to receive greater attention from research communities related to ASR, owing to the fact that the former possess a natural ability to mimic biological behavior, which aids ASR modeling and processing. The current learning-based ASR techniques are found to be evolving further with the incorporation of big data and IoT-like concepts. Here, in this paper, we report certain approaches based on machine learning (ML) used for the extraction of relevant samples from the big data space and apply them to ASR using certain soft computing techniques for Assamese speech with dialectal variations. A class of ML techniques comprising the basic Artificial Neural Network (ANN) in feedforward (FF) form and the Deep Neural Network (DNN), using raw speech, extracted features, and frequency-domain forms, is considered. The Multi Layer Perceptron (MLP) is configured with inputs in several forms to learn class information obtained using clustering and manual labeling. DNNs are also used to extract specific sentence types. Initially, from a large storage, relevant samples are selected and assimilated. Next, a few conventional methods are used for feature extraction of a few selected types. The features comprise both spectral and prosodic types. These are applied to Recurrent Neural Network (RNN) and Fully Focused Time Delay Neural Network (FFTDNN) structures to evaluate their performance in recognizing mood, dialect, speaker and gender variations in dialectal Assamese speech. The system is tested under several background noise conditions by considering the recognition rates (obtained using confusion matrices and manually) and computation time. It is found that the proposed ML-based sentence extraction techniques and the composite feature set used with the RNN as classifier outperform all other approaches. Using the ANN in FF form as a feature extractor, the performance of the system is evaluated and a comparison is made. Experimental results show that the application of big data samples has enhanced the learning of the ASR system. Further, the ANN-based sample and feature extraction techniques are found to be efficient enough to enable the application of ML techniques in big data aspects as part of ASR systems. Copyright © 2015 Elsevier Ltd. All rights reserved.
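
    A compact sketch of the kind of composite spectral-plus-prosodic feature vector described above, fed to a small neural classifier, is given below. It uses librosa and scikit-learn with synthetic audio and made-up class labels; the paper's RNN and FFTDNN models, and its big-data sample selection, are not reproduced.

```python
# Sketch: composite spectral (MFCC) + prosodic-style (energy, zero-crossing)
# features for short utterances, fed to a small MLP classifier. Synthetic audio
# and labels are placeholders; the RNN/FFTDNN models of the paper are not
# reproduced here.
import numpy as np
import librosa
from sklearn.neural_network import MLPClassifier

sr = 16000
rng = np.random.default_rng(2)

def features(sig):
    mfcc = librosa.feature.mfcc(y=sig, sr=sr, n_mfcc=13).mean(axis=1)
    rms = librosa.feature.rms(y=sig).mean()
    zcr = librosa.feature.zero_crossing_rate(sig).mean()
    return np.concatenate([mfcc, [rms, zcr]])

# Two made-up "dialect" classes built from noisy tones of different pitch.
utterances, labels = [], []
for k, f0 in enumerate([120.0, 220.0]):
    for _ in range(10):
        t = np.arange(0, 1.0, 1.0 / sr)
        sig = np.sin(2 * np.pi * f0 * t) + 0.3 * rng.standard_normal(t.size)
        utterances.append(features(sig.astype(np.float32)))
        labels.append(k)

X, y = np.vstack(utterances), np.array(labels)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```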

  3. Ultrasound assisted extraction of food and natural products. Mechanisms, techniques, combinations, protocols and applications. A review.

    PubMed

    Chemat, Farid; Rombaut, Natacha; Sicaire, Anne-Gaëlle; Meullemiestre, Alice; Fabiano-Tixier, Anne-Sylvie; Abert-Vian, Maryline

    2017-01-01

    This review presents a complete picture of current knowledge on ultrasound-assisted extraction (UAE) in food ingredients and products, nutraceutics, cosmetic, pharmaceutical and bioenergy applications. It provides the necessary theoretical background and some details about extraction by ultrasound, the techniques and their combinations, the mechanisms (fragmentation, erosion, capillarity, detexturation, and sonoporation), applications from laboratory to industry, security, and environmental impacts. In addition, the ultrasound extraction procedures and the important parameters influencing its performance are also included, together with the advantages and the drawbacks of each UAE technique. Ultrasound-assisted extraction is a research topic which affects several fields of modern plant-based chemistry. All the reported applications have shown that ultrasound-assisted extraction is a green and economically viable alternative to conventional techniques for food and natural products. The main benefits are decreases in extraction and processing time, in the amount of energy and solvents used, in unit operations, and in CO2 emissions. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Opto-electronic characterization of third-generation solar cells.

    PubMed

    Neukom, Martin; Züfle, Simon; Jenatsch, Sandra; Ruhstaller, Beat

    2018-01-01

    We present an overview of opto-electronic characterization techniques for solar cells including light-induced charge extraction by linearly increasing voltage, impedance spectroscopy, transient photovoltage, charge extraction and more. Guidelines for the interpretation of experimental results are derived based on charge drift-diffusion simulations of solar cells with common performance limitations. It is investigated how nonidealities like charge injection barriers, traps and low mobilities among others manifest themselves in each of the studied cell characterization techniques. Moreover, comprehensive parameter extraction for an organic bulk-heterojunction solar cell comprising PCDTBT:PC70BM is demonstrated. The simulations reproduce measured results of 9 different experimental techniques. Parameter correlation is minimized due to the combination of various techniques. Thereby a route to comprehensive and accurate parameter extraction is identified.

  5. A histogram-based technique for rapid vector extraction from PIV photographs

    NASA Technical Reports Server (NTRS)

    Humphreys, William M., Jr.

    1991-01-01

    A new analysis technique, performed totally in the image plane, is proposed which rapidly extracts all available vectors from individual interrogation regions on PIV photographs. The technique avoids the need for using Fourier transforms with the associated computational burden. The data acquisition and analysis procedure is described, and results of a preliminary simulation study to evaluate the accuracy of the technique are presented. Recently obtained PIV photographs are analyzed.

  6. Ambient Mass Spectrometry Imaging Using Direct Liquid Extraction Techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laskin, Julia; Lanekoff, Ingela

    2015-11-13

    Mass spectrometry imaging (MSI) is a powerful analytical technique that enables label-free spatial localization and identification of molecules in complex samples.1-4 MSI applications range from forensics5 to clinical research6 and from understanding microbial communication7-8 to imaging biomolecules in tissues.1, 9-10 Recently, MSI protocols have been reviewed.11 Ambient ionization techniques enable direct analysis of complex samples under atmospheric pressure without special sample pretreatment.3, 12-16 In fact, in ambient ionization mass spectrometry, sample processing (e.g., extraction, dilution, preconcentration, or desorption) occurs during the analysis.17 This substantially speeds up analysis and eliminates any possible effects of sample preparation on the localization of molecules in the sample.3, 8, 12-14, 18-20 Venter and co-workers have classified ambient ionization techniques into three major categories based on the sample processing steps involved: 1) liquid extraction techniques, in which analyte molecules are removed from the sample and extracted into a solvent prior to ionization; 2) desorption techniques capable of generating free ions directly from substrates; and 3) desorption techniques that produce larger particles subsequently captured by an electrospray plume and ionized.17 This review focuses on localized analysis and ambient imaging of complex samples using a subset of ambient ionization methods broadly defined as “liquid extraction techniques” based on the classification introduced by Venter and co-workers.17 Specifically, we include techniques where analyte molecules are desorbed from solid or liquid samples using charged droplet bombardment, liquid extraction, physisorption, chemisorption, mechanical force, laser ablation, or laser capture microdissection. Analyte extraction is followed by soft ionization that generates ions corresponding to intact species. Some of the key advantages of liquid extraction techniques include the ease of operation, ability to analyze samples in their native environments, speed of analysis, and ability to tune the extraction solvent composition to a problem at hand. For example, solvent composition may be optimized for efficient extraction of different classes of analytes from the sample or for quantification or online derivatization through reactive analysis. In this review, we will: 1) introduce individual liquid extraction techniques capable of localized analysis and imaging, 2) describe approaches for quantitative MSI experiments free of matrix effects, 3) discuss advantages of reactive analysis for MSI experiments, and 4) highlight selected applications (published between 2012 and 2015) that focus on imaging and spatial profiling of molecules in complex biological and environmental samples.

  7. Computer-Aided Diagnosis System for Alzheimer's Disease Using Different Discrete Transform Techniques.

    PubMed

    Dessouky, Mohamed M; Elrashidy, Mohamed A; Taha, Taha E; Abdelkader, Hatem M

    2016-05-01

    Discrete transform techniques such as the discrete cosine transform (DCT), discrete sine transform (DST), discrete wavelet transform (DWT), and mel-scale frequency cepstral coefficients (MFCCs) are powerful feature extraction techniques. This article presents a proposed computer-aided diagnosis (CAD) system for extracting the most effective and significant features of Alzheimer's disease (AD) using these different discrete transform techniques and the MFCC technique. A linear support vector machine has been used as the classifier in this article. Experimental results show that the proposed CAD system using the MFCC technique for AD recognition greatly improves system performance with a small number of significant extracted features, as compared with the CAD systems based on DCT, DST, DWT, and hybrid combinations of the different transform techniques. © The Author(s) 2015.
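
    The transform-then-classify idea can be illustrated briefly: take a 2D discrete transform of each image, keep a low-frequency block of coefficients as the feature vector, and train a linear SVM. The sketch below uses the DCT on synthetic images; the block size, data, and evaluation are illustrative assumptions rather than the article's protocol.

```python
# Sketch of the transform-domain feature idea: take a 2D DCT of each image,
# keep a low-frequency block of coefficients as the feature vector, and train a
# linear SVM. Synthetic images, block size, and labels are illustrative only.
import numpy as np
from scipy.fft import dctn
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)

def dct_features(img, block=8):
    coeffs = dctn(img, norm="ortho")
    return coeffs[:block, :block].ravel()      # low-frequency coefficients only

# Two synthetic classes differing in coarse spatial structure.
imgs, labels = [], []
for label in (0, 1):
    for _ in range(20):
        base = np.zeros((64, 64))
        base[: 32 + 16 * label, :] = 1.0       # class-dependent pattern
        imgs.append(dct_features(base + 0.2 * rng.standard_normal((64, 64))))
        labels.append(label)

X, y = np.vstack(imgs), np.array(labels)
scores = cross_val_score(LinearSVC(max_iter=5000), X, y, cv=5)
print("cross-validated accuracy:", scores.mean())
```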

  8. Utilizing uncoded consultation notes from electronic medical records for predictive modeling of colorectal cancer.

    PubMed

    Hoogendoorn, Mark; Szolovits, Peter; Moons, Leon M G; Numans, Mattijs E

    2016-05-01

    Machine learning techniques can be used to extract predictive models for diseases from electronic medical records (EMRs). However, the nature of EMRs makes it difficult to apply off-the-shelf machine learning techniques while still exploiting the rich content of the EMRs. In this paper, we explore the usage of a range of natural language processing (NLP) techniques to extract valuable predictors from uncoded consultation notes and study whether they can help to improve predictive performance. We study a number of existing techniques for the extraction of predictors from the consultation notes, namely a bag-of-words based approach and topic modeling. In addition, we develop a dedicated technique to match the uncoded consultation notes with a medical ontology. We apply these techniques as an extension to an existing pipeline to extract predictors from EMRs. We evaluate them in the context of predictive modeling for colorectal cancer (CRC), a disease known to be difficult to diagnose before performing an endoscopy. Our results show that we are able to extract useful information from the consultation notes. The predictive performance of the ontology-based extraction method moves significantly beyond the benchmark of age and gender alone (area under the receiver operating characteristic curve (AUC) of 0.870 versus 0.831). We also observe more accurate predictive models by adding features derived from processing the consultation notes compared to solely using coded data (AUC of 0.896 versus 0.882), although the difference is not significant. The extracted features from the notes are shown to be equally predictive (i.e. there is no significant difference in performance) compared to the coded data of the consultations. It is possible to extract useful predictors from uncoded consultation notes that improve predictive performance. Techniques linking text to concepts in medical ontologies to derive these predictors are shown to perform best for predicting CRC in our EMR dataset. Copyright © 2016 Elsevier B.V. All rights reserved.
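
    A toy version of the bag-of-words pipeline compared in the paper looks as follows: vectorize the free-text notes, train a classifier, and report AUC. The notes, labels, and classifier choice below are placeholders, and the ontology-matching technique that performed best is not shown.

```python
# Toy bag-of-words pipeline of the kind compared in the paper: vectorize free-
# text consultation notes, train a classifier, and report AUC. Notes, labels,
# and classifier choice are placeholders; the ontology-matching step is omitted.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

notes = [
    "rectal bleeding and change in bowel habit for two months",
    "routine hypertension follow-up, no new complaints",
    "weight loss, anaemia, positive faecal occult blood test",
    "ankle sprain after fall, swelling and pain",
    "abdominal pain, family history of colorectal cancer",
    "repeat prescription for asthma inhaler",
] * 5                               # repeated so cross-validation has enough rows
labels = [1, 0, 1, 0, 1, 0] * 5     # 1 = CRC diagnosed within follow-up (made up)

X = CountVectorizer(ngram_range=(1, 2), min_df=1).fit_transform(notes)
auc = cross_val_score(LogisticRegression(max_iter=1000), X, labels,
                      cv=5, scoring="roc_auc")
print("mean AUC over folds:", auc.mean())
```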

  9. Opto-electronic characterization of third-generation solar cells

    PubMed Central

    Jenatsch, Sandra

    2018-01-01

    Abstract We present an overview of opto-electronic characterization techniques for solar cells including light-induced charge extraction by linearly increasing voltage, impedance spectroscopy, transient photovoltage, charge extraction and more. Guidelines for the interpretation of experimental results are derived based on charge drift-diffusion simulations of solar cells with common performance limitations. It is investigated how nonidealities like charge injection barriers, traps and low mobilities among others manifest themselves in each of the studied cell characterization techniques. Moreover, comprehensive parameter extraction for an organic bulk-heterojunction solar cell comprising PCDTBT:PC70BM is demonstrated. The simulations reproduce measured results of 9 different experimental techniques. Parameter correlation is minimized due to the combination of various techniques. Thereby a route to comprehensive and accurate parameter extraction is identified. PMID:29707069

  10. Enzyme assisted extraction of biomolecules as an approach to novel extraction technology: A review.

    PubMed

    Nadar, Shamraja S; Rao, Priyanka; Rathod, Virendra K

    2018-06-01

    Interest in the development of techniques for extracting biomolecules from various natural sources has increased in recent years due to their potential applications, particularly for food and nutraceutical purposes. The presence of polysaccharides such as hemicelluloses, starch, and pectin inside the cell wall reduces the extraction efficiency of conventional extraction techniques. Conventional techniques also suffer from low extraction yields, time inefficiency and inferior extract quality due to traces of organic solvents present in them. Hence, there is a need for green and novel extraction methods to recover biomolecules. The present review provides a holistic insight into various aspects related to enzyme-aided extraction. Applications of enzymes in the recovery of various biomolecules such as polyphenols, oils, polysaccharides, flavours and colorants have been highlighted. Additionally, the employment of hyphenated extraction technologies can overcome some of the major drawbacks of enzyme-based extraction, such as longer extraction times and excessive use of solvents. This review also covers hyphenated intensification techniques that couple conventional methods with ultrasound, microwave, high pressure and supercritical carbon dioxide. The last section gives an insight into the application of enzyme immobilization as a strategy for large-scale extraction. Immobilization of enzymes on magnetic nanoparticles can be employed to enhance the operational performance of the system by allowing multiple uses of expensive enzymes, making them industrially and economically feasible. Copyright © 2018 Elsevier Ltd. All rights reserved.

  11. Automation of static and dynamic non-dispersive liquid phase microextraction. Part 1: Approaches based on extractant drop-, plug-, film- and microflow-formation.

    PubMed

    Alexovič, Michal; Horstkotte, Burkhard; Solich, Petr; Sabo, Ján

    2016-02-04

    Simplicity, effectiveness, swiftness, and environmental friendliness - these are the typical requirements for the state of the art development of green analytical techniques. Liquid phase microextraction (LPME) stands for a family of elegant sample pretreatment and analyte preconcentration techniques preserving these principles in numerous applications. By using only fractions of solvent and sample compared to classical liquid-liquid extraction, the extraction kinetics, the preconcentration factor, and the cost efficiency can be increased. Moreover, significant improvements can be made by automation, which is still a hot topic in analytical chemistry. This review surveys comprehensively and in two parts the developments of automation of non-dispersive LPME methodologies performed in static and dynamic modes. Their advantages and limitations and the reported analytical performances are discussed and put into perspective with the corresponding manual procedures. The automation strategies, techniques, and their operation advantages as well as their potentials are further described and discussed. In this first part, an introduction to LPME and their static and dynamic operation modes as well as their automation methodologies is given. The LPME techniques are classified according to the different approaches of protection of the extraction solvent using either a tip-like (needle/tube/rod) support (drop-based approaches), a wall support (film-based approaches), or microfluidic devices. In the second part, the LPME techniques based on porous supports for the extraction solvent such as membranes and porous media are overviewed. An outlook on future demands and perspectives in this promising area of analytical chemistry is finally given. Copyright © 2015 Elsevier B.V. All rights reserved.

  12. Dual-wavelength phase-shifting digital holography selectively extracting wavelength information from wavelength-multiplexed holograms.

    PubMed

    Tahara, Tatsuki; Mori, Ryota; Kikunaga, Shuhei; Arai, Yasuhiko; Takaki, Yasuhiro

    2015-06-15

    Dual-wavelength phase-shifting digital holography that selectively extracts wavelength information from five wavelength-multiplexed holograms is presented. Specific phase shifts for respective wavelengths are introduced to remove the crosstalk components and extract only the object wave at the desired wavelength from the holograms. Object waves in multiple wavelengths are selectively extracted by utilizing 2π ambiguity and the subtraction procedures based on phase-shifting interferometry. Numerical results show the validity of the proposed technique. The proposed technique is also experimentally demonstrated.

  13. Evaluation by latent class analysis of a magnetic capture based DNA extraction followed by real-time qPCR as a new diagnostic method for detection of Echinococcus multilocularis in definitive hosts.

    PubMed

    Maas, Miriam; van Roon, Annika; Dam-Deisz, Cecile; Opsteegh, Marieke; Massolo, Alessandro; Deksne, Gunita; Teunis, Peter; van der Giessen, Joke

    2016-10-30

    A new method, based on a magnetic capture based DNA extraction followed by qPCR, was developed for the detection of the zoonotic parasite Echinococcus multilocularis in definitive hosts. Latent class analysis was used to compare this new method with the currently used phenol-chloroform DNA extraction followed by single tube nested PCR. In total, 60 red foxes and coyotes from three different locations were tested with both molecular methods and the sedimentation and counting technique (SCT) or intestinal scraping technique (IST). Though based on a limited number of samples, it could be established that the magnetic capture based DNA extraction followed by qPCR showed similar sensitivity and specificity as the currently used phenol-chloroform DNA extraction followed by single tube nested PCR. All methods have a high specificity as shown by Bayesian latent class analysis. Both molecular assays have higher sensitivities than the combined SCT and IST, though the uncertainties in sensitivity estimates were wide for all assays tested. The magnetic capture based DNA extraction followed by qPCR has the advantage of not requiring hazardous chemicals like the phenol-chloroform DNA extraction followed by single tube nested PCR. This supports the replacement of the phenol-chloroform DNA extraction followed by single tube nested PCR by the magnetic capture based DNA extraction followed by qPCR for molecular detection of E. multilocularis in definitive hosts. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. Automated Solid Phase Extraction (SPE) LC/NMR Applied to the Structural Analysis of Extractable Compounds from a Pharmaceutical Packaging Material of Construction.

    PubMed

    Norwood, Daniel L; Mullis, James O; Davis, Mark; Pennino, Scott; Egert, Thomas; Gonnella, Nina C

    2013-01-01

    The structural analysis (i.e., identification) of organic chemical entities leached into drug product formulations has traditionally been accomplished with techniques involving the combination of chromatography with mass spectrometry. These include gas chromatography/mass spectrometry (GC/MS) for volatile and semi-volatile compounds, and various forms of liquid chromatography/mass spectrometry (LC/MS or HPLC/MS) for semi-volatile and relatively non-volatile compounds. GC/MS and LC/MS techniques are complementary for structural analysis of leachables and potentially leachable organic compounds produced via laboratory extraction of pharmaceutical container closure/delivery system components and corresponding materials of construction. Both hyphenated analytical techniques possess the separating capability, compound specific detection attributes, and sensitivity required to effectively analyze complex mixtures of trace level organic compounds. However, hyphenated techniques based on mass spectrometry are limited by the inability to determine complete bond connectivity, the inability to distinguish between many types of structural isomers, and the inability to unambiguously determine aromatic substitution patterns. Nuclear magnetic resonance spectroscopy (NMR) does not have these limitations; hence it can serve as a complement to mass spectrometry. However, NMR technology is inherently insensitive and its ability to interface with chromatography has been historically challenging. This article describes the application of NMR coupled with liquid chromatography and automated solid phase extraction (SPE-LC/NMR) to the structural analysis of extractable organic compounds from a pharmaceutical packaging material of construction. The SPE-LC/NMR technology combined with micro-cryoprobe technology afforded the sensitivity and sample mass required for full structure elucidation. Optimization of the SPE-LC/NMR analytical method was achieved using a series of model compounds representing the chemical diversity of extractables. This study demonstrates the complementary nature of SPE-LC/NMR with LC/MS for this particular pharmaceutical application. The identification of impurities leached into drugs from the components and materials associated with pharmaceutical containers, packaging components, and materials has historically been done using laboratory techniques based on the combination of chromatography with mass spectrometry. Such analytical techniques are widely recognized as having the selectivity and sensitivity required to separate the complex mixtures of impurities often encountered in such identification studies, including both the identification of leachable impurities as well as potential leachable impurities produced by laboratory extraction of packaging components and materials. However, while mass spectrometry-based analytical techniques have limitations for this application, newer analytical techniques based on the combination of chromatography with nuclear magnetic resonance spectroscopy provide an added dimension of structural definition. This article describes the development, optimization, and application of an analytical technique based on the combination of chromatography and nuclear magnetic resonance spectroscopy to the identification of potential leachable impurities from a pharmaceutical packaging material. The complementary nature of the analytical techniques for this particular pharmaceutical application is demonstrated.

  15. Usability-driven pruning of large ontologies: the case of SNOMED CT.

    PubMed

    López-García, Pablo; Boeker, Martin; Illarramendi, Arantza; Schulz, Stefan

    2012-06-01

    To study ontology modularization techniques when applied to SNOMED CT in a scenario in which no previous corpus of information exists and to examine if frequency-based filtering using MEDLINE can reduce subset size without discarding relevant concepts. Subsets were first extracted using four graph-traversal heuristics and one logic-based technique, and were subsequently filtered with frequency information from MEDLINE. Twenty manually coded discharge summaries from cardiology patients were used as signatures and test sets. The coverage, size, and precision of extracted subsets were measured. Graph-traversal heuristics provided high coverage (71-96% of terms in the test sets of discharge summaries) at the expense of subset size (17-51% of the size of SNOMED CT). Pre-computed subsets and logic-based techniques extracted small subsets (1%), but coverage was limited (24-55%). Filtering reduced the size of large subsets to 10% while still providing 80% coverage. Extracting subsets to annotate discharge summaries is challenging when no previous corpus exists. Ontology modularization provides valuable techniques, but the resulting modules grow as signatures spread across subhierarchies, yielding a very low precision. Graph-traversal strategies and frequency data from an authoritative source can prune large biomedical ontologies and produce useful subsets that still exhibit acceptable coverage. However, a clinical corpus closer to the specific use case is preferred when available.
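
    The two-stage idea, graph-traversal module extraction followed by frequency-based filtering, can be sketched on a toy ontology. The mini is-a graph, the corpus counts, and the threshold below are invented for illustration; real SNOMED CT and MEDLINE frequency data are vastly larger.

```python
# Toy sketch of the two-stage idea: (1) graph-traversal module extraction by
# collecting all ancestors of the signature concepts, (2) frequency-based
# filtering against corpus counts. The mini-ontology, counts, and threshold are
# invented for illustration.
from collections import deque

# child -> parents (is-a edges) in a hypothetical mini-ontology
IS_A = {
    "myocardial infarction": ["ischemic heart disease"],
    "ischemic heart disease": ["heart disease"],
    "heart disease": ["disorder of cardiovascular system"],
    "disorder of cardiovascular system": ["clinical finding"],
    "rare congenital anomaly X": ["clinical finding"],
}

def module_for(signature):
    """Collect the signature concepts plus all of their ancestors."""
    module, queue = set(signature), deque(signature)
    while queue:
        for parent in IS_A.get(queue.popleft(), []):
            if parent not in module:
                module.add(parent)
                queue.append(parent)
    return module

# Hypothetical MEDLINE-style occurrence counts used for filtering.
FREQ = {"myocardial infarction": 90000, "ischemic heart disease": 40000,
        "heart disease": 120000, "disorder of cardiovascular system": 800,
        "clinical finding": 50, "rare congenital anomaly X": 2}

subset = module_for({"myocardial infarction"})
filtered = {c for c in subset if FREQ.get(c, 0) >= 500}
print("module:", sorted(subset))
print("after frequency filtering:", sorted(filtered))
```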

  16. Use of partial dissolution techniques in geochemical exploration

    USGS Publications Warehouse

    Chao, T.T.

    1984-01-01

    Application of partial dissolution techniques to geochemical exploration has advanced from an early empirical approach to an approach based on sound geochemical principles. This advance assures a prominent future position for the use of these techniques in geochemical exploration for concealed mineral deposits. Partial dissolution techniques are classified as single dissolution or sequential multiple dissolution depending on the number of steps taken in the procedure, or as "nonselective" extraction and as "selective" extraction in terms of the relative specificity of the extraction. The choice of dissolution techniques for use in geochemical exploration is dictated by the geology of the area, the type and degree of weathering, and the expected chemical forms of the ore and of the pathfinding elements. Case histories have illustrated many instances where partial dissolution techniques exhibit advantages over conventional methods of chemical analysis used in geochemical exploration. © 1984.

  17. The limit of the film extraction technique for annular two-phase flow in a small tube

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Helm, D.E.; Lopez de Bertodano, M.; Beus, S.G.

    1999-07-01

    The limit of the liquid film extraction technique was identified in air-water and Freon-113 annular two-phase flow loops. The purpose of this research is to find the limit of the entrainment rate correlation obtained by Lopez de Bertodano et al. (1998). The film extraction technique involves the suction of the liquid film through a porous tube and has been widely used to obtain annular flow entrainment and entrainment rate data. In these experiments there are two extraction probes. After the first extraction the entrained droplets in the gas core deposit on the tube wall. A new liquid film develops entirely from liquid deposition and a second liquid film extraction is performed. While it is assumed that the entire liquid film is removed after the first extraction unit, this is not true for high liquid flows. At high liquid film flows the interfacial structure of the film becomes frothy. Then the entire liquid film cannot be removed at the first extraction unit, but continues on and is extracted at the second extraction unit. A simple model to characterize the limit of the extraction technique was obtained based on the hypothesis that the transition occurs due to a change in the wave structure. The resulting dimensionless correlation agrees with the data.

  19. Prostate cancer detection using machine learning techniques by employing combination of features extracting strategies.

    PubMed

    Hussain, Lal; Ahmed, Adeel; Saeed, Sharjil; Rathore, Saima; Awan, Imtiaz Ahmed; Shah, Saeed Arif; Majid, Abdul; Idris, Adnan; Awan, Anees Ahmed

    2018-02-06

    Prostate cancer is the second leading cause of cancer deaths among men. Early detection can effectively reduce the mortality caused by prostate cancer. The high resolution and multiresolution nature of prostate MRI requires proper diagnostic systems and tools. In the past, researchers developed computer-aided diagnosis (CAD) systems that help the radiologist to detect abnormalities. In this research paper, we have employed machine learning techniques, namely a Bayesian approach, support vector machine (SVM) kernels (polynomial, radial basis function (RBF) and Gaussian) and decision trees, for detecting prostate cancer. Moreover, different feature extraction strategies are proposed to improve the detection performance. The feature extraction strategies are based on texture, morphological, scale-invariant feature transform (SIFT), and elliptic Fourier descriptor (EFDs) features. The performance was evaluated based on single features as well as combinations of features using machine learning classification techniques. Cross validation (jack-knife k-fold) was performed, and performance was evaluated in terms of the receiver operating characteristic (ROC) curve, specificity, sensitivity, positive predictive value (PPV), negative predictive value (NPV), and false positive rate (FPR). Based on the single feature extraction strategies, the SVM Gaussian kernel gives the highest accuracy of 98.34% with an AUC of 0.999. Using combinations of feature extraction strategies, the SVM Gaussian kernel with texture + morphological and EFDs + morphological features gives the highest accuracy of 99.71% and an AUC of 1.00.
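
    As a rough illustration of the classification step described above, the sketch below combines two precomputed feature sets and scores an RBF (Gaussian) kernel SVM with k-fold cross-validation using scikit-learn. The feature arrays, sizes, and parameters are placeholders, not the authors' data or configuration.

      # Minimal sketch: feature-level combination plus an RBF-kernel SVM with k-fold CV.
      import numpy as np
      from sklearn.svm import SVC
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.model_selection import cross_val_score

      # Hypothetical inputs: rows are MRI cases, columns are extracted features.
      texture = np.random.rand(100, 20)      # stand-in for texture descriptors
      morpho = np.random.rand(100, 10)       # stand-in for morphological descriptors
      labels = np.random.randint(0, 2, 100)  # 1 = cancer, 0 = benign

      X = np.hstack([texture, morpho])       # combination of feature extraction strategies
      clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", gamma="scale"))
      scores = cross_val_score(clf, X, labels, cv=10)  # jack-knife style k-fold
      print("mean CV accuracy:", scores.mean())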

  20. Successful Treatment of Postpeak Stage Patients with Class II Division 1 Malocclusion Using Non-extraction and Multiloop Edgewise Archwire Therapy: A Report on 16 Cases

    PubMed Central

    Liu, Jun; Zou, Ling; Zhao, Zhi-he; Welburn, Neala; Yang, Pu; Tang, Tian; Li, Yu

    2009-01-01

    Aim To determine cephalometrically the mechanism of the treatment effects of the non-extraction and multiloop edgewise archwire (MEAW) technique on postpeak Class II Division 1 patients. Methodology In this retrospective study, 16 postpeak Class II Division 1 patients successfully corrected using a non-extraction and MEAW technique were cephalometrically evaluated and compared with 16 matched control subjects treated using an extraction technique. Using CorelDRAW® software, standardized digital cephalograms pre- and post-active treatment were traced and a reference grid was set up. The superimpositions were based on the cranial base, the mandibular and the maxillary regions, and skeletal and dental changes were measured. Changes following treatment were evaluated using the paired-sample t-test. Student's t-test for unpaired samples was used to assess the differences in changes between the MEAW and the extraction control groups. Results The correction of the molar relationships comprised 54% skeletal change (mainly the advancement of the mandible) and 46% dental change. Correction of the anterior teeth relationships comprised 30% skeletal change and 70% dental change. Conclusion The MEAW technique can produce the desired vertical and sagittal movement of the tooth segment and then effectively stimulate mandibular advancement by utilizing the residual growth potential of the condyle. PMID:20690424

  1. Sequential ultrasound-microwave assisted acid extraction (UMAE) of pectin from pomelo peels.

    PubMed

    Liew, Shan Qin; Ngoh, Gek Cheng; Yusoff, Rozita; Teoh, Wen Hui

    2016-12-01

    This study aims to optimize sequential ultrasound-microwave assisted extraction (UMAE) on pomelo peel using citric acid. The effects of pH, sonication time, microwave power and irradiation time on the yield and the degree of esterification (DE) of pectin were investigated. Under the optimized conditions of pH 1.80, 27.52 min sonication followed by 6.40 min microwave irradiation at 643.44 W, the yield and the DE value of the pectin obtained were 38.00% and 56.88%, respectively. Based upon the optimized UMAE conditions, pectin from microwave-ultrasound assisted extraction (MUAE), ultrasound assisted extraction (UAE) and microwave assisted extraction (MAE) was also studied. The yield of pectin from UMAE was higher than that from all other techniques, in the order UMAE > MUAE > MAE > UAE. The galacturonic acid content of pectin obtained from the combined extraction techniques was higher than that obtained from the sole extraction techniques, and the pectin gels produced by the various techniques exhibited pseudoplastic behaviour. The morphological structures of pectin extracted from MUAE and MAE closely resemble each other. The pectin extracted by UMAE, with a smaller and more regular surface, differs greatly from that of UAE. This substantiated the highest pectin yield of 36.33% from UMAE and further signified the compatibility and potential of the combined techniques in pectin extraction. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. Image fusion for visualization of hepatic vasculature and tumors

    NASA Astrophysics Data System (ADS)

    Chou, Jin-Shin; Chen, Shiuh-Yung J.; Sudakoff, Gary S.; Hoffmann, Kenneth R.; Chen, Chin-Tu; Dachman, Abraham H.

    1995-05-01

    We have developed segmentation and simultaneous display techniques to facilitate the visualization of the three-dimensional spatial relationships between organ structures and organ vasculature. We concentrate on the visualization of the liver based on spiral computed tomography images. Surface-based 3-D rendering and maximum intensity projection (MIP) algorithms are used for data visualization. To extract the liver in the series of images accurately and efficiently, we have developed a user-friendly interactive program with deformable-model segmentation. Surface rendering techniques are used to visualize the extracted structures; adjacent contours are aligned and fitted with a Bezier surface to yield a smooth surface. Visualization of the vascular structures, portal and hepatic veins, is achieved by applying a MIP technique to the extracted liver volume. To integrate the extracted structures, the surface-rendered and MIP images are aligned, and a color table is designed for simultaneous display of the combined liver/tumor and vasculature images. By combining the 3-D surface rendering and MIP techniques, portal veins, hepatic veins, and hepatic tumor can be inspected simultaneously and their spatial relationships can be more easily perceived. The proposed technique will be useful for visualization of both hepatic neoplasm and vasculature in surgical planning for tumor resection or living-donor liver transplantation.

  3. Unsupervised Extraction of Diagnosis Codes from EMRs Using Knowledge-Based and Extractive Text Summarization Techniques

    PubMed Central

    Kavuluru, Ramakanth; Han, Sifei; Harris, Daniel

    2017-01-01

    Diagnosis codes are extracted from medical records for billing and reimbursement and for secondary uses such as quality control and cohort identification. In the US, these codes come from the standard terminology ICD-9-CM derived from the international classification of diseases (ICD). ICD-9 codes are generally extracted by trained human coders by reading all artifacts available in a patient’s medical record following specific coding guidelines. To assist coders in this manual process, this paper proposes an unsupervised ensemble approach to automatically extract ICD-9 diagnosis codes from textual narratives included in electronic medical records (EMRs). Earlier attempts on automatic extraction focused on individual documents such as radiology reports and discharge summaries. Here we use a more realistic dataset and extract ICD-9 codes from EMRs of 1000 inpatient visits at the University of Kentucky Medical Center. Using named entity recognition (NER), graph-based concept-mapping of medical concepts, and extractive text summarization techniques, we achieve an example based average recall of 0.42 with average precision 0.47; compared with a baseline of using only NER, we notice a 12% improvement in recall with the graph-based approach and a 7% improvement in precision using the extractive text summarization approach. Although diagnosis codes are complex concepts often expressed in text with significant long range non-local dependencies, our present work shows the potential of unsupervised methods in extracting a portion of codes. As such, our findings are especially relevant for code extraction tasks where obtaining large amounts of training data is difficult. PMID:28748227

  4. Extraction of intracellular protein from Chlorella pyrenoidosa using a combination of ethanol soaking, enzyme digest, ultrasonication and homogenization techniques.

    PubMed

    Zhang, Ruilin; Chen, Jian; Zhang, Xuewu

    2018-01-01

    Due to the rigid cell wall of Chlorella species, it is still challenging to effectively extract significant amounts of protein. Numerous methods based on biological, mechanical and chemical approaches have been used for the extraction of intracellular protein from microalgae. In this study, based on a comparison of different extraction methods, a new protocol was established to maximize the amount of protein extracted, which involved ethanol soaking, enzymatic digestion, ultrasonication and homogenization techniques. Under the optimized conditions, 72.4% of the protein was extracted from the microalga Chlorella pyrenoidosa, which should contribute to the research and development of Chlorella protein in functional food and medicine. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. [Evoked potentials extraction based on cross-talk resistant adaptive noise cancellation].

    PubMed

    Zeng, Qingning; Li, Ling; Liu, Qinghua; Yao, Dezhong

    2004-06-01

    As evoked potentials are much lower in amplitude than the ongoing EEG, many trigger-related signals are needed by the common averaging technique to enable the extraction of single-trial evoked potentials (EP). How to acquire EP from fewer evocations is therefore an important research problem. This paper proposes a cross-talk resistant adaptive noise cancellation method to extract EP. Together with the use of filtering and the common averaging technique, the present method needs far fewer evocations to acquire EP signals. According to the simulation experiment, it needs only several evocations, or even a single evocation, to obtain EP signals of good quality.
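
    A minimal sketch of the underlying adaptive noise cancellation idea is given below, using a plain LMS filter in Python/NumPy; the cross-talk resistant extension described in the abstract is not reproduced, and the signal names and parameters are illustrative.

      import numpy as np

      def lms_cancel(primary, reference, n_taps=16, mu=0.01):
          """Subtract an adaptively estimated noise component from the primary channel."""
          primary = np.asarray(primary, dtype=float)      # EP + on-going EEG "noise"
          reference = np.asarray(reference, dtype=float)  # noise-only reference channel
          w = np.zeros(n_taps)
          cleaned = np.zeros_like(primary)
          for n in range(n_taps, len(primary)):
              x = reference[n - n_taps:n][::-1]   # most recent reference samples
              cleaned[n] = primary[n] - w @ x     # error signal = EP estimate
              w += 2 * mu * cleaned[n] * x        # LMS weight update
          return cleaned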

  6. Application of gas chromatography to analysis of spirit-based alcoholic beverages.

    PubMed

    Wiśniewska, Paulina; Śliwińska, Magdalena; Dymerski, Tomasz; Wardencki, Waldemar; Namieśnik, Jacek

    2015-01-01

    Spirit-based beverages are alcoholic drinks; their production processes are dependent on the type and origin of raw materials. The composition of this complex matrix is difficult to analyze, and scientists commonly choose gas chromatography techniques for this reason. With a wide selection of extraction methods and detectors it is possible to provide qualitative and quantitative analysis for many chemical compounds with various functional groups. This article describes different types of gas chromatography techniques and their most commonly used associated extraction techniques (e.g., LLE, SPME, SPE, SFE, and SBME) and detectors (MS, TOFMS, FID, ECD, NPD, AED, O or EPD). Additionally, brief characteristics of internationally popular spirit-based beverages and application of gas chromatography to the analysis of selected alcoholic drinks are presented.

  7. Applications of derivatization reactions to trace organic compounds during sample preparation based on pressurized liquid extraction.

    PubMed

    Carro, Antonia M; González, Paula; Lorenzo, Rosa A

    2013-06-28

    Pressurized liquid extraction (PLE) is an exhaustive technique used for the extraction of analytes from solid samples. Temperature, pressure, solvent type and volume, and the addition of other reagents notably influence the efficiency of the extraction. The analytical applications of this technique can be improved by coupling with appropriate derivatization reactions. The aim of this review is to discuss the recent applications of the sequential combination of PLE with derivatization and the approaches that involve simultaneous extraction and in situ derivatization. The potential of the latest developments to the trace analysis of environmental, food and biological samples is also analyzed. Copyright © 2013 Elsevier B.V. All rights reserved.

  8. Evaluation of Keyphrase Extraction Algorithm and Tiling Process for a Document/Resource Recommender within E-Learning Environments

    ERIC Educational Resources Information Center

    Mangina, Eleni; Kilbride, John

    2008-01-01

    The research presented in this paper is an examination of the applicability of IUI techniques in an online e-learning environment. In particular we make use of user modeling techniques, information retrieval and extraction mechanisms and collaborative filtering methods. The domains of e-learning, web-based training and instruction and intelligent…

  9. Pharmacovigilance from social media: mining adverse drug reaction mentions using sequence labeling with word embedding cluster features.

    PubMed

    Nikfarjam, Azadeh; Sarker, Abeed; O'Connor, Karen; Ginn, Rachel; Gonzalez, Graciela

    2015-05-01

    Social media is becoming increasingly popular as a platform for sharing personal health-related information. This information can be utilized for public health monitoring tasks, particularly for pharmacovigilance, via the use of natural language processing (NLP) techniques. However, the language in social media is highly informal, and user-expressed medical concepts are often nontechnical, descriptive, and challenging to extract. There has been limited progress in addressing these challenges, and thus far, advanced machine learning-based NLP techniques have been underutilized. Our objective is to design a machine learning-based approach to extract mentions of adverse drug reactions (ADRs) from highly informal text in social media. We introduce ADRMine, a machine learning-based concept extraction system that uses conditional random fields (CRFs). ADRMine utilizes a variety of features, including a novel feature for modeling words' semantic similarities. The similarities are modeled by clustering words based on unsupervised, pretrained word representation vectors (embeddings) generated from unlabeled user posts in social media using a deep learning technique. ADRMine outperforms several strong baseline systems in the ADR extraction task by achieving an F-measure of 0.82. Feature analysis demonstrates that the proposed word cluster features significantly improve extraction performance. It is possible to extract complex medical concepts, with relatively high performance, from informal, user-generated content. Our approach is particularly scalable, suitable for social media mining, as it relies on large volumes of unlabeled data, thus diminishing the need for large, annotated training data sets. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.
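
    The word-cluster feature idea described above can be sketched as follows: pretrained word vectors are grouped with k-means and a token's cluster id is added as a categorical feature (e.g., for a CRF). The tiny vocabulary and random vectors below stand in for real pretrained embeddings; this is not the ADRMine implementation.

      import numpy as np
      from sklearn.cluster import KMeans

      vocab = ["nausea", "headache", "dizzy", "aspirin", "ibuprofen"]   # hypothetical vocabulary
      vectors = np.random.rand(len(vocab), 50)   # stand-in for pretrained word embeddings

      km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(vectors)
      cluster_of = dict(zip(vocab, km.labels_))

      def token_features(token):
          # the cluster id generalizes over semantically similar words unseen in training
          return {"lower": token.lower(),
                  "cluster": str(cluster_of.get(token.lower(), -1))}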

  10. Modern extraction techniques and their impact on the pharmacological profile of Serenoa repens extracts for the treatment of lower urinary tract symptoms

    PubMed Central

    2014-01-01

    Background Bioactive compounds from plants (e.g., Serenoa repens) are often used in medicine in the treatment of several pathologies, among which is benign prostatic hyperplasia (BPH) associated with lower urinary tract symptoms (LUTS). Discussion There are different techniques of extraction, also used in combination, with the aim of enhancing the amount of the target molecules, saving time and reducing solvent waste. However, the qualitative and quantitative composition of the bioactives depends on the extractive process, so products recovered from the same plant under different brands differ in clinical efficacy (there is no product interchangeability among different commercial brands). Summary In this review, we report on several recent extraction techniques and their impact on the composition/biological activity of S. repens-based available products. PMID:25112532

  11. Combining Feature Extraction Methods to Assist the Diagnosis of Alzheimer's Disease.

    PubMed

    Segovia, F; Górriz, J M; Ramírez, J; Phillips, C

    2016-01-01

    Neuroimaging data such as (18)F-FDG PET are widely used to assist the diagnosis of Alzheimer's disease (AD). Looking for regions with hypoperfusion/hypometabolism, clinicians may predict or corroborate the diagnosis of the patients. Modern computer-aided diagnosis (CAD) systems based on the statistical analysis of whole neuroimages are more accurate than classical systems based on quantifying the uptake of some predefined regions of interest (ROIs). In addition, these new systems allow determining new ROIs and take advantage of the huge amount of information comprised in neuroimaging data. A major branch of modern CAD systems for AD is based on multivariate techniques, which analyse a neuroimage as a whole, considering not only the voxel intensities but also the relations among them. In order to deal with the vast dimensionality of the data, a number of feature extraction methods have been successfully applied. In this work, we propose a CAD system based on the combination of several feature extraction techniques. First, some commonly used feature extraction methods based on the analysis of the variance (such as principal component analysis), on the factorization of the data (such as non-negative matrix factorization) and on classical magnitudes (such as Haralick features) were simultaneously applied to the original data. These feature sets were then combined by means of two different combination approaches: i) using a single classifier and a multiple kernel learning approach, and ii) using an ensemble of classifiers and selecting the final decision by majority voting. The proposed approach was evaluated using a labelled neuroimaging database along with a cross-validation scheme. In conclusion, the proposed CAD system performed better than approaches using only one feature extraction technique. We also provide a fair comparison (using the same database) of the selected feature extraction methods.
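
    Combination approach ii) can be sketched with scikit-learn as below: two feature extraction methods feed separate classifiers whose decisions are merged by majority vote. The synthetic data, component counts and classifiers are placeholders, not the authors' configuration.

      import numpy as np
      from sklearn.decomposition import PCA, NMF
      from sklearn.svm import SVC
      from sklearn.pipeline import make_pipeline
      from sklearn.ensemble import VotingClassifier

      X = np.abs(np.random.rand(80, 500))   # stand-in for flattened, non-negative voxel intensities
      y = np.random.randint(0, 2, 80)       # AD vs control labels

      ensemble = VotingClassifier(
          estimators=[("pca_svm", make_pipeline(PCA(n_components=10), SVC())),
                      ("nmf_svm", make_pipeline(NMF(n_components=10, max_iter=500), SVC()))],
          voting="hard")                    # majority voting over per-feature-set classifiers
      ensemble.fit(X, y)
      print(ensemble.score(X, y))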

  12. Extraction of CT dose information from DICOM metadata: automated Matlab-based approach.

    PubMed

    Dave, Jaydev K; Gingold, Eric L

    2013-01-01

    The purpose of this study was to extract exposure parameters and dose-relevant indexes of CT examinations from information embedded in DICOM metadata. DICOM dose report files were identified and retrieved from a PACS. An automated software program was used to extract exposure-relevant information from the structured elements of the DICOM metadata in these files. Extracting information from DICOM metadata eliminated potential errors inherent in techniques based on optical character recognition, yielding 100% accuracy.
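
    The same metadata-first idea can be sketched in Python with pydicom (the authors describe an automated MATLAB-based tool, which is not reproduced here); the file path and the particular attributes queried are illustrative.

      import pydicom

      def dose_fields(path):
          # read only the header; attribute keywords are standard DICOM names,
          # and any element missing from the file simply returns None
          ds = pydicom.dcmread(path, stop_before_pixels=True)
          return {kw: ds.get(kw) for kw in ("KVP", "XRayTubeCurrent", "Exposure", "CTDIvol")}

      # example (hypothetical file): print(dose_fields("ct_dose_report.dcm"))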

  13. Droplet-Based Segregation and Extraction of Concentrated Samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buie, C R; Buckley, P; Hamilton, J

    2007-02-23

    Microfluidic analysis often requires sample concentration and separation techniques to isolate and detect analytes of interest. Complex or scarce samples may also require an orthogonal separation and detection method or off-chip analysis to confirm results. To perform these additional steps, the concentrated sample plug must be extracted from the primary microfluidic channel with minimal sample loss and dilution. We investigated two extraction techniques: injection of immiscible fluid droplets into the sample stream ("capping") and injection of the sample into an immiscible fluid stream ("extraction"). From our results we conclude that capping is the more effective partitioning technique. Furthermore, this functionality enables additional off-chip post-processing procedures such as DNA/RNA microarray analysis, real-time polymerase chain reaction (RT-PCR), and culture growth to validate chip performance.

  14. PKDE4J: Entity and relation extraction for public knowledge discovery.

    PubMed

    Song, Min; Kim, Won Chul; Lee, Dahee; Heo, Go Eun; Kang, Keun Young

    2015-10-01

    Due to an enormous number of scientific publications that cannot be handled manually, there is a rising interest in text-mining techniques for automated information extraction, especially in the biomedical field. Such techniques provide effective means of information search, knowledge discovery, and hypothesis generation. Most previous studies have primarily focused on the design and performance improvement of either named entity recognition or relation extraction. In this paper, we present PKDE4J, a comprehensive text-mining system that integrates dictionary-based entity extraction and rule-based relation extraction in a highly flexible and extensible framework. Starting with the Stanford CoreNLP, we developed the system to cope with multiple types of entities and relations. The system also has fairly good performance in terms of accuracy as well as the ability to configure text-processing components. We demonstrate its competitive performance by evaluating it on many corpora and found that it surpasses existing systems with average F-measures of 85% for entity extraction and 81% for relation extraction. Copyright © 2015 Elsevier Inc. All rights reserved.
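
    A toy illustration of the dictionary-plus-rule idea (not the PKDE4J implementation, which builds on Stanford CoreNLP): entities are matched against a small dictionary, and a relation is emitted when a trigger word co-occurs with two entities. The dictionary entries and the trigger word are hypothetical.

      import re

      entity_dict = {"aspirin": "Drug", "cyclooxygenase": "Protein"}   # hypothetical entries

      def extract_entities(sentence):
          found = []
          for term, etype in entity_dict.items():
              for m in re.finditer(r"\b%s\b" % re.escape(term), sentence, re.I):
                  found.append((m.group(), etype))
          return found

      def extract_relations(sentence, trigger="inhibits"):
          # simple rule: trigger word plus at least two dictionary entities
          ents = extract_entities(sentence)
          if trigger in sentence.lower() and len(ents) >= 2:
              return [(ents[0][0], trigger, ents[1][0])]
          return []

      print(extract_relations("Aspirin inhibits cyclooxygenase."))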

  15. Classification of Two Class Motor Imagery Tasks Using Hybrid GA-PSO Based K-Means Clustering.

    PubMed

    Suraj; Tiwari, Purnendu; Ghosh, Subhojit; Sinha, Rakesh Kumar

    2015-01-01

    Transferring the brain-computer interface (BCI) from laboratory conditions to real-world applications requires the BCI to be applied asynchronously, without any time constraint. The high level of dynamism in the electroencephalogram (EEG) signal motivates the use of evolutionary algorithms (EAs). Motivated by these two facts, in this work a hybrid GA-PSO based K-means clustering technique has been used to distinguish two-class motor imagery (MI) tasks. The proposed hybrid GA-PSO based K-means clustering is found to outperform genetic algorithm (GA) and particle swarm optimization (PSO) based K-means clustering techniques in terms of both accuracy and execution time. The shorter execution time of the hybrid GA-PSO technique makes it suitable for real-time BCI applications. Time-frequency representation (TFR) techniques have been used to extract the features of the signal under investigation; the feature vector is formed from TFR-based features relying on the concept of event-related synchronization (ERS) and desynchronization (ERD).

  16. Classification of Two Class Motor Imagery Tasks Using Hybrid GA-PSO Based K-Means Clustering

    PubMed Central

    Suraj; Tiwari, Purnendu; Ghosh, Subhojit; Sinha, Rakesh Kumar

    2015-01-01

    Transferring the brain-computer interface (BCI) from laboratory conditions to real-world applications requires the BCI to be applied asynchronously, without any time constraint. The high level of dynamism in the electroencephalogram (EEG) signal motivates the use of evolutionary algorithms (EAs). Motivated by these two facts, in this work a hybrid GA-PSO based K-means clustering technique has been used to distinguish two-class motor imagery (MI) tasks. The proposed hybrid GA-PSO based K-means clustering is found to outperform genetic algorithm (GA) and particle swarm optimization (PSO) based K-means clustering techniques in terms of both accuracy and execution time. The shorter execution time of the hybrid GA-PSO technique makes it suitable for real-time BCI applications. Time-frequency representation (TFR) techniques have been used to extract the features of the signal under investigation; the feature vector is formed from TFR-based features relying on the concept of event-related synchronization (ERS) and desynchronization (ERD). PMID:25972896
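
    The clustering step alone can be sketched as below, with plain k-means on crude per-channel band-power features; the GA-PSO hybrid used in the paper to seed the cluster centres is omitted, and the synthetic data merely illustrate the array shapes involved.

      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(0)
      trials = rng.normal(size=(40, 2, 512))   # 40 MI trials, 2 EEG channels, 512 samples

      # crude stand-in for ERD/ERS band-power features: per-channel signal variance
      features = trials.var(axis=2)

      km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(features)
      print(km.labels_)                        # cluster id per motor-imagery trial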

  17. Application of a Tenax Model to Assess Bioavailability of Polychlorinated Biphenyls in Field Sediments

    EPA Science Inventory

    Recent literature has shown that bioavailability-based techniques, such as Tenax extraction, can estimate sediment exposure to benthos. In a previous study by the authors,Tenax extraction was used to create and validate a literature-based Tenax model to predict oligochaete bioac...

  18. Recovery of 1,3-, 2,3-dichloropropenes, 1,2-dibromo-3-chloropropane, and o-, p-dichlorobenzenes from fatty and non-fat foodstuffs by liquid extraction technique.

    PubMed

    Daft, J L

    1990-01-01

    Food samples including fatty, non-fatty, grain-based, and nongrain-based types were fortified with the following five nematocides and fumigants: 1,3-dichloropropene, 2,3-dichloropropene, 1,2-dibromo-3-chloropropane, o-dichlorobenzene, and p-dichlorobenzene. Then, depending on sample consistency and type, the samples were diluted in, or extracted with organic solvent such as isooctane. A few of the high-fat extracts were passed through Florisil to remove excess fat or endogenous interferences. Analysis of the initial or cleaned up extracts was done by gas chromatography (GC) at 90 degrees C. The dichloropropenes were determined on 20% OV-101 columns with electron-capture and Hall electroconductivity detectors. The dichlorobenzenes and 1,2-dibromo-3-chloropropane, which elute beyond 30 min on the above columns, were determined on 5%-loaded columns using the same detectors. All five analytes were recovered from these techniques. Mean analyte recovery following a direct dilution or extraction was 83%, and following the Florisil cleanup step, was 52%. In 1986, a fumigant survey of about 200 foodstuffs by using this overall technique gave no findings of the five compounds studied here.

  19. Stroke-model-based character extraction from gray-level document images.

    PubMed

    Ye, X; Cheriet, M; Suen, C Y

    2001-01-01

    Global gray-level thresholding techniques such as Otsu's method, and local gray-level thresholding techniques such as edge-based segmentation or the adaptive thresholding method are powerful in extracting character objects from simple or slowly varying backgrounds. However, they are found to be insufficient when the backgrounds include sharply varying contours or fonts in different sizes. A stroke-model is proposed to depict the local features of character objects as double-edges in a predefined size. This model enables us to detect thin connected components selectively, while ignoring relatively large backgrounds that appear complex. Meanwhile, since the stroke width restriction is fully factored in, the proposed technique can be used to extract characters in predefined font sizes. To process large volumes of documents efficiently, a hybrid method is proposed for character extraction from various backgrounds. Using the measurement of class separability to differentiate images with simple backgrounds from those with complex backgrounds, the hybrid method can process documents with different backgrounds by applying the appropriate methods. Experiments on extracting handwriting from a check image, as well as machine-printed characters from scene images demonstrate the effectiveness of the proposed model.
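
    The baseline techniques named in the abstract (global Otsu thresholding and local adaptive thresholding) can be reproduced in a few lines with OpenCV, as sketched below; the stroke-model double-edge detector itself is not implemented here, and the input file name is a placeholder.

      import cv2

      gray = cv2.imread("document.png", cv2.IMREAD_GRAYSCALE)   # hypothetical input image

      # global thresholding (Otsu) vs local adaptive thresholding over 31x31 neighbourhoods
      _, otsu = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
      adaptive = cv2.adaptiveThreshold(gray, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
                                       cv2.THRESH_BINARY, 31, 10)
      cv2.imwrite("otsu.png", otsu)
      cv2.imwrite("adaptive.png", adaptive)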

  20. Information Extraction Using Controlled English to Support Knowledge-Sharing and Decision-Making

    DTIC Science & Technology

    2012-06-01

    ...terminology or language variants. CE-based information extraction will greatly facilitate the processes in the cognitive and social domains that enable forces... A processor is run to turn the atomic CE into a more “stylistically felicitous” CE, using techniques such as aggregating all information about an entity.

  1. Postoperative Refractive Errors Following Pediatric Cataract Extraction with Intraocular Lens Implantation.

    PubMed

    Indaram, Maanasa; VanderVeen, Deborah K

    2018-01-01

    Advances in surgical techniques allow implantation of intraocular lenses (IOL) with cataract extraction, even in young children. However, there are several challenges unique to the pediatric population that result in greater degrees of postoperative refractive error compared to adults. Literature review of the techniques and outcomes of pediatric cataract surgery with IOL implantation. Pediatric cataract surgery is associated with several sources of postoperative refractive error. These include planned refractive error based on age or fellow eye status, loss of accommodation, and unexpected refractive errors due to inaccuracies in biometry technique, use of IOL power formulas based on adult normative values, and late refractive changes due to unpredictable eye growth. Several factors can preclude the achievement of optimal refractive status following pediatric cataract extraction with IOL implantation. There is a need for new technology to reduce postoperative refractive surprises and address refractive adjustment in a growing eye.

  2. Usability-driven pruning of large ontologies: the case of SNOMED CT

    PubMed Central

    Boeker, Martin; Illarramendi, Arantza; Schulz, Stefan

    2012-01-01

    Objectives To study ontology modularization techniques when applied to SNOMED CT in a scenario in which no previous corpus of information exists and to examine if frequency-based filtering using MEDLINE can reduce subset size without discarding relevant concepts. Materials and Methods Subsets were first extracted using four graph-traversal heuristics and one logic-based technique, and were subsequently filtered with frequency information from MEDLINE. Twenty manually coded discharge summaries from cardiology patients were used as signatures and test sets. The coverage, size, and precision of extracted subsets were measured. Results Graph-traversal heuristics provided high coverage (71–96% of terms in the test sets of discharge summaries) at the expense of subset size (17–51% of the size of SNOMED CT). Pre-computed subsets and logic-based techniques extracted small subsets (1%), but coverage was limited (24–55%). Filtering reduced the size of large subsets to 10% while still providing 80% coverage. Discussion Extracting subsets to annotate discharge summaries is challenging when no previous corpus exists. Ontology modularization provides valuable techniques, but the resulting modules grow as signatures spread across subhierarchies, yielding a very low precision. Conclusion Graph-traversal strategies and frequency data from an authoritative source can prune large biomedical ontologies and produce useful subsets that still exhibit acceptable coverage. However, a clinical corpus closer to the specific use case is preferred when available. PMID:22268217

  3. Extraction of Urban Trees from Integrated Airborne Based Digital Image and LIDAR Point Cloud Datasets - Initial Results

    NASA Astrophysics Data System (ADS)

    Dogon-yaro, M. A.; Kumar, P.; Rahman, A. Abdul; Buyuksalih, G.

    2016-10-01

    Timely and accurate acquisition of information on the condition and structural changes of urban trees serves as a tool for decision makers to better appreciate urban ecosystems and their numerous values, which are critical to building up strategies for sustainable development. The conventional techniques used for extracting tree features include ground surveying and interpretation of aerial photography. However, these techniques are associated with constraints such as labour-intensive field work, high cost, and the influence of weather conditions and topographical cover, which can be overcome by means of integrated airborne-based LiDAR and very high resolution digital image datasets. This study presents a semi-automated approach for extracting urban trees from integrated airborne-based LiDAR and multispectral digital image datasets over Istanbul city, Turkey. The scheme includes the detection and extraction of shadow-free vegetation features based on the spectral properties of the digital images, using shadow index and NDVI techniques, and the automated extraction of 3D information about vegetation features from the integrated processing of the shadow-free vegetation image and the LiDAR point cloud datasets. The performance of the developed algorithms shows promising results as an automated and cost-effective approach to estimating and delineating 3D information of urban trees. The research also showed that integrated datasets are a suitable technology and a viable source of information for city managers to use in urban tree management.
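
    The spectral step of such a workflow can be sketched in Python as below: NDVI from co-registered red and near-infrared bands combined with a simple brightness-based shadow mask. The thresholds and the shadow criterion are illustrative stand-ins, not the study's shadow index.

      import numpy as np

      def vegetation_mask(red, nir, ndvi_thresh=0.3, shadow_thresh=0.05):
          """Return a boolean mask of shadow-free vegetation pixels from two band rasters."""
          ndvi = (nir - red) / (nir + red + 1e-9)        # normalized difference vegetation index
          shadow = (nir + red) / 2.0 < shadow_thresh     # crude stand-in for a shadow index
          return (ndvi > ndvi_thresh) & ~shadow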

  4. Evaluation and comparison of FTA card and CTAB DNA extraction methods for non-agricultural taxa.

    PubMed

    Siegel, Chloe S; Stevenson, Florence O; Zimmer, Elizabeth A

    2017-02-01

    An efficient, effective DNA extraction method is necessary for comprehensive analysis of plant genomes. This study analyzed the quality of DNA obtained using paper FTA cards prepared directly in the field when compared to the more traditional cetyltrimethylammonium bromide (CTAB)-based extraction methods from silica-dried samples. DNA was extracted using FTA cards according to the manufacturer's protocol. In parallel, CTAB-based extractions were done using the automated AutoGen DNA isolation system. DNA quality for both methods was determined for 15 non-agricultural species collected in situ, by gel separation, spectrophotometry, fluorometry, and successful amplification and sequencing of nuclear and chloroplast gene markers. The FTA card extraction method yielded less concentrated, but also less fragmented samples than the CTAB-based technique. The card-extracted samples provided DNA that could be successfully amplified and sequenced. The FTA cards are also useful because the collected samples do not require refrigeration, extensive laboratory expertise, or as many hazardous chemicals as extractions using the CTAB-based technique. The relative success of the FTA card method in our study suggested that this method could be a valuable tool for studies in plant population genetics and conservation biology that may involve screening of hundreds of individual plants. The FTA cards, like the silica gel samples, do not contain plant material capable of propagation, and therefore do not require permits from the U.S. Department of Agriculture (USDA) Animal and Plant Health Inspection Service (APHIS) for transportation.

  5. Support Vector Machine Based on Adaptive Acceleration Particle Swarm Optimization

    PubMed Central

    Abdulameer, Mohammed Hasan; Othman, Zulaiha Ali

    2014-01-01

    Existing face recognition methods utilize particle swarm optimizer (PSO) and opposition based particle swarm optimizer (OPSO) to optimize the parameters of SVM. However, the utilization of random values in the velocity calculation decreases the performance of these techniques; that is, during the velocity computation, we normally use random values for the acceleration coefficients and this creates randomness in the solution. To address this problem, an adaptive acceleration particle swarm optimization (AAPSO) technique is proposed. To evaluate our proposed method, we employ both face and iris recognition based on AAPSO with SVM (AAPSO-SVM). In the face and iris recognition systems, performance is evaluated using two human face databases, YALE and CASIA, and the UBiris dataset. In this method, we initially perform feature extraction and then recognition on the extracted features. In the recognition process, the extracted features are used for SVM training and testing. During the training and testing, the SVM parameters are optimized with the AAPSO technique, and in AAPSO, the acceleration coefficients are computed using the particle fitness values. The parameters in SVM, which are optimized by AAPSO, perform efficiently for both face and iris recognition. A comparative analysis between our proposed AAPSO-SVM and the PSO-SVM technique is presented. PMID:24790584
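
    The randomness criticized in the abstract enters through the r1 and r2 terms of the canonical PSO velocity update, sketched below in LaTeX; AAPSO, per the abstract, instead derives the acceleration coefficients from particle fitness values (that modified update is not reproduced here).

      v_i^{t+1} = w\,v_i^{t} + c_1 r_1\left(p_i - x_i^{t}\right) + c_2 r_2\left(g - x_i^{t}\right),
      \qquad x_i^{t+1} = x_i^{t} + v_i^{t+1}

    Here p_i is the particle's personal best, g the global best, c_1 and c_2 the acceleration coefficients, and r_1, r_2 are drawn uniformly from [0, 1].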

  6. Electroencephalogram-based decoding cognitive states using convolutional neural network and likelihood ratio based score fusion.

    PubMed

    Zafar, Raheel; Dass, Sarat C; Malik, Aamir Saeed

    2017-01-01

    Electroencephalogram (EEG)-based decoding of human brain activity is challenging, owing to the low spatial resolution of EEG. However, EEG is an important technique, especially for brain-computer interface applications. In this study, a novel algorithm is proposed to decode brain activity associated with different types of images. In this hybrid algorithm, a convolutional neural network is modified for the extraction of features, a t-test is used for the selection of significant features, and likelihood ratio-based score fusion is used for the prediction of brain activity. The proposed algorithm takes input data from multichannel EEG time-series, which is also known as multivariate pattern analysis. Comprehensive analysis was conducted using data from 30 participants. The results from the proposed method are compared with currently recognized feature extraction and classification/prediction techniques. The wavelet transform-support vector machine method is the most popular feature extraction and prediction method currently in use; it showed an accuracy of 65.7%. The proposed method, however, predicts the novel data with an improved accuracy of 79.9%. In conclusion, the proposed algorithm outperformed the current feature extraction and prediction methods.

  7. Effects of moisture content in cigar tobacco on nicotine extraction. Similarity between soxhlet and focused open-vessel microwave-assisted techniques.

    PubMed

    Ng, Lay-Keow; Hupé, Michel

    2003-09-05

    The effects of tobacco moisture on nicotine yield were investigated in this study. Soxhlet and microwave-assisted techniques were used to extract nicotine from cigar fillers of varying moisture contents (5-20%), using a polar (methanol) and a non-polar (isooctane) solvent. The extracts were analyzed by a gas chromatograph equipped with a flame-ionization detector. For both extraction techniques, higher nicotine yields were consistently obtained with methanol than with isooctane from the same samples. Solubility of nicotine salts in methanol but not in isooctane is the major cause of this observation. Moreover, pronounced effects of the tobacco moisture content on extraction efficiency were observed with isooctane but not with methanol. For microwave assisted extraction (MAE) with isooctane, nicotine yield increased from 3 to 70% as the moisture level in tobacco was raised from 3 to 13%, and leveled off thereafter. Similar observations were made with Soxhlet extraction. While MAE results were rationalized by the known cell-rupture process, a mechanism based on the interaction between the solvents and the structural components of the plant cells has been proposed to account for the observations made with Soxhlet extraction.

  8. Pulse echo and combined resonance techniques: a full set of LGT acoustic wave constants and temperature coefficients.

    PubMed

    Sturtevant, Blake T; Davulis, Peter M; da Cunha, Mauricio Pereira

    2009-04-01

    This work reports on the determination of langatate elastic and piezoelectric constants and their associated temperature coefficients employing 2 independent methods, the pulse echo overlap (PEO) and a combined resonance technique (CRT), to measure bulk acoustic wave (BAW) phase velocities. Details on the measurement techniques are provided and discussed, including the analysis of the couplant material used in the PEO technique to couple the signal to the sample, which was shown to be an order of magnitude more relevant than the experimental errors involved in the data extraction. At room temperature, elastic and piezoelectric constants were extracted by the PEO and the CRT methods and showed results consistent to within a few percent for the elastic constants. Both the raw acquired data and the optimized constants, based on minimization routines applied to all the modes involved in the measurements, are provided and discussed. Comparison of the elastic constants and their temperature behavior with the literature reveals the recent efforts toward the consistent growth and characterization of LGT, in spite of significant variations (between 1 and 30%) among the constants extracted by different groups at room temperature. The density, dielectric permittivity constants, and respective temperature coefficients used in this work have also been independently determined based on samples from the same crystal boule. The temperature behavior of the BAW modes was extracted using the CRT technique, which has the advantage of not relying on temperature-dependent acoustic couplants. Finally, the extracted temperature coefficients for the elastic and piezoelectric constants between room temperature and 120 degrees C are reported and discussed in this work.

  9. Simplified multiple headspace extraction gas chromatographic technique for determination of monomer solubility in water.

    PubMed

    Chai, X S; Schork, F J; DeCinque, Anthony

    2005-04-08

    This paper reports an improved headspace gas chromatographic (GC) technique for the determination of monomer solubilities in water. The method is based on a multiple headspace extraction GC technique developed previously [X.S. Chai, Q.X. Hou, F.J. Schork, J. Appl. Polym. Sci., in press], but with a major modification of the calibration technique. As a result, only a few iterations of headspace extraction and GC measurement are required, which avoids "exhaustive" headspace extraction and thus reduces the experimental time for each analysis. For highly insoluble monomers, effort must be made to minimize adsorption in the headspace sampling channel, transport conduit and capillary column by using a higher operating temperature and a short capillary column in the headspace sampler and GC system. For highly water-soluble monomers, a new calibration method is proposed. The combination of these modifications results in a method that is simple, rapid and automated. While the current focus of the authors is on the determination of monomer solubility in aqueous solutions, the method should be applicable to the determination of the solubility of any organic compound in water.
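
    Multiple headspace extraction rests on the fact that successive extractions remove a constant fraction of the analyte, so peak areas decay geometrically; the standard relation is sketched below in LaTeX (the paper's modified calibration is not reproduced, and q denotes the per-step decay constant).

      A_i = A_1\,e^{-q\,(i-1)}, \qquad \sum_{i=1}^{\infty} A_i = \frac{A_1}{1 - e^{-q}}

    In practice q is estimated from the slope of ln(A_i) versus i over a few extraction steps, so the total peak area, and hence the analyte amount, can be obtained without exhaustively extracting the vial.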

  10. A non-invasive technique for rapid extraction of DNA from fish scales.

    PubMed

    Kumar, Ravindra; Singh, Poonam Jayant; Nagpure, N S; Kushwaha, Basdeo; Srivastava, S K; Lakra, W S

    2007-11-01

    DNA markers are being increasingly used in studies related to population genetics and conservation biology of endangered species. DNA isolation for such studies requires a source of biological material that is easy to collect, non-bulky and reliable. Further, sampling strategies based on non-invasive procedures are desirable, especially for endangered fish species. In view of the above, a rapid DNA extraction method from fish scales has been developed with the use of a modified lysis buffer; the procedure takes about 2 h. This methodology is non-invasive, less expensive and reproducible, with high efficiency of DNA recovery. The DNA extracted by this technique has been found suitable for restriction enzyme digestion and PCR amplification. Therefore, the present DNA extraction procedure can be used as an alternative technique in population genetic studies pertaining to endangered fish species. The technique was also found equally effective for DNA isolation from fresh, dried and ethanol-preserved scales.

  11. FEX: A Knowledge-Based System For Planimetric Feature Extraction

    NASA Astrophysics Data System (ADS)

    Zelek, John S.

    1988-10-01

    Topographical planimetric features include natural surfaces (rivers, lakes) and man-made surfaces (roads, railways, bridges). In conventional planimetric feature extraction, a photointerpreter manually interprets and extracts features from imagery on a stereoplotter. Visual planimetric feature extraction is a very labour intensive operation. The advantages of automating feature extraction include: time and labour savings; accuracy improvements; and planimetric data consistency. FEX (Feature EXtraction) combines techniques from image processing, remote sensing and artificial intelligence for automatic feature extraction. The feature extraction process co-ordinates the information and knowledge in a hierarchical data structure. The system simulates the reasoning of a photointerpreter in determining the planimetric features. Present efforts have concentrated on the extraction of road-like features in SPOT imagery. Keywords: Remote Sensing, Artificial Intelligence (AI), SPOT, image understanding, knowledge base, apars.

  12. Comparison of different methods for extraction and purification of human Papillomavirus (HPV) DNA from serum samples

    NASA Astrophysics Data System (ADS)

    Azizah, N.; Hashim, U.; Nadzirah, Sh.; Arshad, M. K. Md; Ruslinda, A. R.; Gopinath, Subash C. B.

    2017-03-01

    The sensitivity and reliability of PCR for diagnostic and research purposes require efficient, unbiased procedures for the extraction and purification of nucleic acids. One of the major limitations of PCR-based tests is inhibition of the amplification process by substances present in clinical samples. This study compares different techniques for the extraction and purification of viral DNA from serum samples in terms of recovery efficiency (yield of DNA), purity of the extracted DNA, and rate of inhibition. The best extraction methods for serum samples were the phenol/chloroform method and the silica gel extraction procedure. Considering DNA purity, extraction by the phenol/chloroform method produced the most satisfactory results for serum samples compared with the silica gel procedure. The presence of inhibitors was overcome by all DNA extraction methods in serum samples, as evidenced by semiquantitative PCR amplification.

  13. Comparing deep learning and concept extraction based methods for patient phenotyping from clinical narratives.

    PubMed

    Gehrmann, Sebastian; Dernoncourt, Franck; Li, Yeran; Carlson, Eric T; Wu, Joy T; Welt, Jonathan; Foote, John; Moseley, Edward T; Grant, David W; Tyler, Patrick D; Celi, Leo A

    2018-01-01

    In the secondary analysis of electronic health records, a crucial task is to correctly identify the patient cohort under investigation. In many cases, the most valuable and relevant information for an accurate classification of medical conditions exists only in clinical narratives. Therefore, it is necessary to use natural language processing (NLP) techniques to extract and evaluate these narratives. The most commonly used approach to this problem relies on extracting a number of clinician-defined medical concepts from text and using machine learning techniques to identify whether a particular patient has a certain condition. However, recent advances in deep learning and NLP enable models to learn a rich representation of (medical) language. Convolutional neural networks (CNNs) for text classification can augment the existing techniques by leveraging the representation of language to learn which phrases in a text are relevant for a given medical condition. In this work, we compare concept extraction based methods with CNNs and other commonly used models in NLP in ten phenotyping tasks using 1,610 discharge summaries from the MIMIC-III database. We show that CNNs outperform concept extraction based methods in almost all of the tasks, with an improvement of up to 26 percentage points in F1-score and up to 7 percentage points in area under the ROC curve (AUC). We additionally assess the interpretability of both approaches by presenting and evaluating methods that calculate and extract the most salient phrases for a prediction. The results indicate that CNNs are a valid alternative to existing approaches in patient phenotyping and cohort identification, and should be further investigated. Moreover, the deep learning approach presented in this paper can be used to assist clinicians during chart review or to support the extraction of billing codes from text by identifying and highlighting relevant phrases for various medical conditions.
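
    A minimal CNN text classifier of the kind compared above can be sketched with Keras as follows; the vocabulary size, sequence length and layer sizes are illustrative, not the configuration used in the paper.

      import tensorflow as tf

      vocab_size, seq_len = 20000, 500   # hypothetical tokenizer settings
      model = tf.keras.Sequential([
          tf.keras.layers.Input(shape=(seq_len,), dtype="int32"),
          tf.keras.layers.Embedding(vocab_size, 100),          # learned word representation
          tf.keras.layers.Conv1D(128, 5, activation="relu"),   # phrase (n-gram) detectors
          tf.keras.layers.GlobalMaxPooling1D(),                # most salient phrase response
          tf.keras.layers.Dense(1, activation="sigmoid"),      # phenotype present / absent
      ])
      model.compile(optimizer="adam", loss="binary_crossentropy",
                    metrics=[tf.keras.metrics.AUC()])
      model.summary()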

  14. Evaluation of two membrane-based microextraction techniques for the determination of endocrine disruptors in aqueous samples by HPLC with diode array detection.

    PubMed

    Luiz Oenning, Anderson; Lopes, Daniela; Neves Dias, Adriana; Merib, Josias; Carasek, Eduardo

    2017-11-01

    In this study, the viability of two membrane-based microextraction techniques for the determination of endocrine disruptors by high-performance liquid chromatography with diode array detection was evaluated: hollow fiber microporous membrane liquid-liquid extraction and hollow-fiber-supported dispersive liquid-liquid microextraction. The extraction efficiencies obtained for methylparaben, ethylparaben, bisphenol A, benzophenone, and 2-ethylhexyl-4-methoxycinnamate from aqueous matrices obtained using both approaches were compared and showed that hollow fiber microporous membrane liquid-liquid extraction exhibited higher extraction efficiency for most of the compounds studied. Therefore, a detailed optimization of the extraction procedure was carried out with this technique. The optimization of the extraction conditions and liquid desorption were performed by univariate analysis. The optimal conditions for the method were supported liquid membrane with 1-octanol for 10 s, sample pH 7, addition of 15% w/v of NaCl, extraction time of 30 min, and liquid desorption in 150 μL of acetonitrile/methanol (50:50 v/v) for 5 min. The linear correlation coefficients were higher than 0.9936. The limits of detection were 0.5-4.6 μg/L and the limits of quantification were 2-16 μg/L. The analyte relative recoveries were 67-116%, and the relative standard deviations were less than 15.5%. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Application of Ionic Liquids in the Microwave-Assisted Extraction of Proanthocyanidins from Larix gmelini Bark

    PubMed Central

    Yang, Lei; Sun, Xiaowei; Yang, Fengjian; Zhao, Chunjian; Zhang, Lin; Zu, Yuangang

    2012-01-01

    Ionic liquid-based microwave-assisted extraction (ILMAE) was successfully applied to the extraction of proanthocyanidins from Larix gmelini bark. In this work, in order to evaluate the performance of ionic liquids in the microwave-assisted extraction process, a series of 1-alkyl-3-methylimidazolium ionic liquids with different cations and anions were evaluated for extraction yield, and 1-butyl-3-methylimidazolium bromide was selected as the optimal solvent. In addition, the ILMAE procedure for the proanthocyanidins was optimized and compared with other conventional extraction techniques. Under the optimized conditions, a satisfactory extraction yield of the proanthocyanidins was obtained. Relative to other methods, the proposed approach provided a higher extraction yield and lower energy consumption. The Larix gmelini bark samples before and after extraction were analyzed by thermogravimetric analysis and Fourier-transform infrared spectroscopy, and characterized by scanning electron microscopy. The results showed that the ILMAE method is a simple and efficient technique for sample preparation. PMID:22606036

  16. A highly selective dispersive liquid-liquid microextraction approach based on the unique fluorous affinity for the extraction and detection of per- and polyfluoroalkyl substances coupled with high performance liquid chromatography tandem-mass spectrometry.

    PubMed

    Wang, Juan; Shi, Yali; Cai, Yaqi

    2018-04-06

    In the present study, a highly selective fluorous affinity-based dispersive liquid-liquid microextraction (DLLME) technique was developed for the extraction and analysis of per- and polyfluoroalkyl substances (PFASs) followed by high performance liquid chromatography tandem-mass spectrometry. Perfluoro-tert-butanol with multiple C-F bonds was chosen as the extraction solvent, which was injected into the aqueous samples with a dispersive solvent (acetonitrile) in a 120:800 (μL, v/v) mixture for PFASs enrichment. The fluorous affinity-based extraction mechanism was confirmed by the significantly higher extraction recoveries for PFASs containing multiple fluorine atoms than those for compounds with fewer or no fluorine atoms. The extraction recoveries of medium and long-chain PFASs (CF2 > 5) exceeded 70%, except perfluoroheptanoic acid, while those of short-chain PFASs were lower than 50%, implying that the proposed DLLME may not be suitable for their extraction due to weak fluorous affinity. This highly fluoroselective DLLME technique can greatly decrease the matrix effect that occurs in mass spectrometry detection when applied to the analysis of urine samples. Under the optimum conditions, the relative recoveries of PFASs with CF2 > 5 ranged from 80.6-121.4% for tap water, river water and urine samples spiked with concentrations of 10, 50 and 100 ng/L. The method limits of quantification for PFASs in water and urine samples were in the range of 0.6-8.7 ng/L. Furthermore, comparable concentrations of PFASs were obtained via DLLME and solid-phase extraction, confirming that the developed DLLME technique is a promising method for the extraction of PFASs in real samples. Copyright © 2018 Elsevier B.V. All rights reserved.

  17. A New Data Representation Based on Training Data Characteristics to Extract Drug Name Entity in Medical Text

    PubMed Central

    Basaruddin, T.

    2016-01-01

    One essential task in information extraction from the medical corpus is drug name recognition. Compared with text sources from other domains, medical text mining poses more challenges, for example, more unstructured text, the fast-growing addition of new terms, a wide range of name variations for the same drug, the lack of labeled dataset sources and external knowledge, and the multiple token representations for a single drug name. Although many approaches have been proposed to tackle the task, some still suffer from poor F-score performance (less than 0.75). This paper presents a new treatment in data representation techniques to overcome some of those challenges. We propose three data representation techniques based on the characteristics of word distribution and word similarities as a result of word embedding training. The first technique is evaluated with the standard NN model, that is, MLP. The second technique involves two deep network classifiers, that is, DBN and SAE. The third technique represents the sentence as a sequence that is evaluated with a recurrent NN model, that is, LSTM. In extracting the drug name entities, the third technique gives the best F-score performance compared to the state of the art, with an average F-score of 0.8645. PMID:27843447
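
    As a rough illustration of the third, sequence-based representation (not the authors' implementation), the sketch below tags each token of a sentence with a bidirectional LSTM over pre-trained word-embedding features; the vocabulary size, tag set, embedding matrix and Keras/TensorFlow toolkit are all placeholder assumptions.

        # Minimal sketch: BiLSTM sequence labelling of drug-name tags over
        # pre-trained word embeddings (all data below is synthetic).
        import numpy as np
        import tensorflow as tf
        from tensorflow.keras import layers, models

        VOCAB_SIZE, EMB_DIM, MAX_LEN, N_TAGS = 20000, 100, 50, 3  # tags: O / B-DRUG / I-DRUG

        # Hypothetical pre-trained embedding matrix (rows indexed by token id).
        embedding_matrix = np.random.rand(VOCAB_SIZE, EMB_DIM).astype("float32")

        model = models.Sequential([
            layers.Embedding(
                VOCAB_SIZE, EMB_DIM, trainable=False,
                embeddings_initializer=tf.keras.initializers.Constant(embedding_matrix)),
            layers.Bidirectional(layers.LSTM(64, return_sequences=True)),
            layers.TimeDistributed(layers.Dense(N_TAGS, activation="softmax")),
        ])
        model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

        # x: token-id sequences; y: per-token tag ids (both synthetic here).
        x = np.random.randint(0, VOCAB_SIZE, size=(32, MAX_LEN))
        y = np.random.randint(0, N_TAGS, size=(32, MAX_LEN))
        model.fit(x, y, epochs=1, verbose=0)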

  18. Systematic comparison of static and dynamic headspace sampling techniques for gas chromatography.

    PubMed

    Kremser, Andreas; Jochmann, Maik A; Schmidt, Torsten C

    2016-09-01

    Six automated, headspace-based sample preparation techniques were used to extract volatile analytes from water with the goal of establishing a systematic comparison between commonly available instrumental alternatives. To that end, these six techniques were used in conjunction with the same gas chromatography instrument for analysis of a common set of volatile organic compound (VOC) analytes. The methods were thereby divided into three classes: static sampling (by syringe or loop), static enrichment (SPME and PAL SPME Arrow), and dynamic enrichment (ITEX and trap sampling). For PAL SPME Arrow, different sorption phase materials were also included in the evaluation. To enable an effective comparison, method detection limits (MDLs), relative standard deviations (RSDs), and extraction yields were determined and are discussed for all techniques. While static sampling techniques exhibited sufficient extraction yields (approx. 10-20%) to be reliably used down to approx. 100 ng L⁻¹, enrichment techniques displayed extraction yields of up to 80%, resulting in MDLs down to the picogram per liter range. RSDs for all techniques were below 27%. The choice of one of the different instrumental modes of operation (the aforementioned classes) was thereby the most influential parameter in terms of extraction yields and MDLs. Individual methods inside each class showed smaller deviations, and the least influence was observed when evaluating different sorption phase materials for the individual enrichment techniques. The option of selecting specialized sorption phase materials may, however, be more important when analyzing analytes with different properties such as high polarity or the capability of specific molecular interactions. Graphical abstract: PAL SPME Arrow during the extraction of volatile analytes from the headspace of an aqueous sample.

  19. Computer-aided screening system for cervical precancerous cells based on field emission scanning electron microscopy and energy dispersive x-ray images and spectra

    NASA Astrophysics Data System (ADS)

    Jusman, Yessi; Ng, Siew-Cheok; Hasikin, Khairunnisa; Kurnia, Rahmadi; Osman, Noor Azuan Bin Abu; Teoh, Kean Hooi

    2016-10-01

    The capability of field emission scanning electron microscopy and energy dispersive x-ray spectroscopy (FE-SEM/EDX) to scan material structures at the microlevel and characterize the material with its elemental properties has inspired this research, which has developed an FE-SEM/EDX-based cervical cancer screening system. The developed computer-aided screening system consisted of two parts: automatic feature extraction and classification. For automatic feature extraction, an algorithm for extracting the discriminant features of FE-SEM/EDX images and spectra of cervical cells was introduced. The system automatically extracted two types of features based on FE-SEM/EDX images and FE-SEM/EDX spectra. Textural features were extracted from the FE-SEM/EDX image using a gray level co-occurrence matrix technique, while the FE-SEM/EDX spectra features were calculated based on peak heights and corrected areas under the peaks. A discriminant analysis technique was employed to predict the cervical precancerous stage into three classes: normal, low-grade squamous intraepithelial lesion (LSIL), and high-grade squamous intraepithelial lesion (HSIL). The capability of the developed screening system was tested using 700 FE-SEM/EDX spectra (300 normal, 200 LSIL, and 200 HSIL cases). The accuracy, sensitivity, and specificity were 98.2%, 99.0%, and 98.0%, respectively.
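
    As a loose sketch of the texture branch of such a pipeline (not the authors' code), the example below computes gray level co-occurrence matrix descriptors for a set of images and feeds them to a linear discriminant classifier with three classes; the images, labels, and the scikit-image/scikit-learn toolkit are assumptions made for illustration.

        # Minimal sketch: GLCM texture features followed by discriminant analysis.
        import numpy as np
        from skimage.feature import graycomatrix, graycoprops   # scikit-image >= 0.19
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        def glcm_features(image_8bit):
            """Co-occurrence texture descriptors for one grey-level image."""
            glcm = graycomatrix(image_8bit, distances=[1], angles=[0, np.pi / 2],
                                levels=256, symmetric=True, normed=True)
            props = ("contrast", "homogeneity", "energy", "correlation")
            return np.hstack([graycoprops(glcm, p).ravel() for p in props])

        # Placeholder data standing in for FE-SEM images and their class labels.
        rng = np.random.default_rng(0)
        images = rng.integers(0, 256, size=(30, 64, 64), dtype=np.uint8)
        labels = rng.integers(0, 3, size=30)      # 0 = normal, 1 = LSIL, 2 = HSIL

        X = np.array([glcm_features(img) for img in images])
        clf = LinearDiscriminantAnalysis().fit(X, labels)
        print(clf.predict(X[:5]))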

  20. Evaluation and comparison of FTA card and CTAB DNA extraction methods for non-agricultural taxa

    PubMed Central

    Siegel, Chloe S.; Stevenson, Florence O.; Zimmer, Elizabeth A.

    2017-01-01

    Premise of the study: An efficient, effective DNA extraction method is necessary for comprehensive analysis of plant genomes. This study analyzed the quality of DNA obtained using paper FTA cards prepared directly in the field when compared to the more traditional cetyltrimethylammonium bromide (CTAB)–based extraction methods from silica-dried samples. Methods: DNA was extracted using FTA cards according to the manufacturer’s protocol. In parallel, CTAB-based extractions were done using the automated AutoGen DNA isolation system. DNA quality for both methods was determined for 15 non-agricultural species collected in situ, by gel separation, spectrophotometry, fluorometry, and successful amplification and sequencing of nuclear and chloroplast gene markers. Results: The FTA card extraction method yielded less concentrated, but also less fragmented samples than the CTAB-based technique. The card-extracted samples provided DNA that could be successfully amplified and sequenced. The FTA cards are also useful because the collected samples do not require refrigeration, extensive laboratory expertise, or as many hazardous chemicals as extractions using the CTAB-based technique. Discussion: The relative success of the FTA card method in our study suggested that this method could be a valuable tool for studies in plant population genetics and conservation biology that may involve screening of hundreds of individual plants. The FTA cards, like the silica gel samples, do not contain plant material capable of propagation, and therefore do not require permits from the U.S. Department of Agriculture (USDA) Animal and Plant Health Inspection Service (APHIS) for transportation. PMID:28224056

  1. Sample-space-based feature extraction and class preserving projection for gene expression data.

    PubMed

    Wang, Wenjun

    2013-01-01

    In order to overcome the problems of high computational complexity and serious matrix singularity encountered when using Principal Component Analysis (PCA) and Fisher's Linear Discriminant Analysis (LDA) for feature extraction in high-dimensional data, sample-space-based feature extraction is presented, which transforms the computation procedure of feature extraction from gene space to sample space by representing the optimal transformation vector as a weighted sum of samples. The technique is used in the implementation of PCA, LDA, and Class Preserving Projection (CPP), a newly proposed method for discriminant feature extraction, and the experimental results on gene expression data demonstrate the effectiveness of the method.
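
    A minimal sketch of the underlying idea for the PCA case is given below (synthetic data, not the authors' code): the eigen-decomposition is carried out on the small sample-by-sample Gram matrix, and each gene-space projection direction is recovered as a weighted sum of samples.

        # Minimal sketch: PCA computed in sample space instead of gene space.
        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.normal(size=(40, 5000))            # 40 samples, 5000 genes
        Xc = X - X.mean(axis=0)

        G = Xc @ Xc.T                              # 40 x 40 Gram matrix (sample space)
        evals, A = np.linalg.eigh(G)               # columns of A: per-sample weights
        order = np.argsort(evals)[::-1]
        evals, A = evals[order], A[:, order]

        k = 5
        W = Xc.T @ A[:, :k] / np.sqrt(evals[:k])   # gene-space directions (unit norm)
        scores = Xc @ W                            # low-dimensional representation

        # Same projections as ordinary PCA, but only a 40 x 40 eigenproblem was solved.
        print(scores.shape)                        # (40, 5)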

  2. Glioma grading using cell nuclei morphologic features in digital pathology images

    NASA Astrophysics Data System (ADS)

    Reza, Syed M. S.; Iftekharuddin, Khan M.

    2016-03-01

    This work proposes a computationally efficient cell nuclei morphologic feature analysis technique to characterize brain gliomas in tissue slide images. In this work, our contributions are two-fold: 1) obtain an optimized cell nuclei segmentation method based on the pros and cons of the existing techniques in the literature, 2) extract representative features by k-means clustering of nuclei morphologic features, including area, perimeter, eccentricity, and major axis length. This clustering-based representative feature extraction avoids shortcomings of extensive tile [1] [2] and nuclear score [3] based methods for brain glioma grading in pathology images. A multilayer perceptron (MLP) is used to classify the extracted features into two tumor types: glioblastoma multiforme (GBM) and low grade glioma (LGG). Quantitative scores such as precision, recall, and accuracy are obtained using 66 clinical patients' images from The Cancer Genome Atlas (TCGA) [4] dataset. An average accuracy of ~94% from 10-fold cross-validation confirms the efficacy of the proposed method.
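
    The representative-feature step can be sketched as follows (synthetic nuclei measurements, placeholder labels, and scikit-learn are assumptions; this is not the authors' implementation): each slide is summarized by the k-means cluster centres of its per-nucleus morphologic features, and the flattened centres are classified with an MLP.

        # Minimal sketch: k-means cluster centres of nuclei features as slide-level input to an MLP.
        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.neural_network import MLPClassifier

        def slide_feature(nuclei, k=4, seed=0):
            """nuclei: (n_nuclei, 4) array of area, perimeter, eccentricity, major axis length."""
            km = KMeans(n_clusters=k, n_init=10, random_state=seed).fit(nuclei)
            centres = km.cluster_centers_[np.argsort(km.cluster_centers_[:, 0])]  # stable order
            return centres.ravel()

        rng = np.random.default_rng(0)
        slides = [rng.normal(size=(int(rng.integers(200, 400)), 4)) for _ in range(20)]
        X = np.array([slide_feature(s) for s in slides])
        y = rng.integers(0, 2, size=20)            # placeholder labels: 0 = LGG, 1 = GBM

        clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500).fit(X, y)
        print(clf.score(X, y))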

  3. Artificially intelligent recognition of Arabic speaker using voice print-based local features

    NASA Astrophysics Data System (ADS)

    Mahmood, Awais; Alsulaiman, Mansour; Muhammad, Ghulam; Akram, Sheeraz

    2016-11-01

    Local features for any pattern recognition system are based on information extracted locally. In this paper, a local feature extraction technique was developed. The feature was extracted by taking the moving average along the diagonal directions of the time-frequency plane. This feature captured the time-frequency events, producing a unique pattern for each speaker that can be viewed as a voice print of the speaker. Hence, we refer to this technique as a voice print-based local feature. The proposed feature was compared to other features, including the mel-frequency cepstral coefficient (MFCC), for speaker recognition using two different databases. One of the databases used in the comparison is a subset of an LDC database that consisted of two short sentences uttered by 182 speakers. The proposed feature attained a 98.35% recognition rate compared to 96.7% for MFCC using the LDC subset.
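
    A rough sketch of such a diagonal moving-average feature is shown below (random signal, assumed window and spectrogram settings; not the authors' implementation): a time-frequency representation is computed and each of its diagonals, in both directions, is summarized with a sliding-window average.

        # Minimal sketch: moving averages along the diagonals of a time-frequency plane.
        import numpy as np
        from scipy.signal import spectrogram

        def diagonal_moving_average(tf_plane, win=3):
            """Sliding-window average over every diagonal of the time-frequency matrix."""
            rows, cols = tf_plane.shape
            feats = []
            for d in range(-rows + 1, cols):                 # every diagonal offset
                diag = np.diagonal(tf_plane, offset=d)
                if diag.size >= win:
                    kernel = np.ones(win) / win
                    feats.append(np.convolve(diag, kernel, mode="valid").mean())
            return np.array(feats)

        fs = 16000
        signal = np.random.randn(fs)                          # stand-in for one utterance
        f, t, Sxx = spectrogram(signal, fs=fs, nperseg=256)
        logS = np.log(Sxx + 1e-10)
        feature = np.concatenate([
            diagonal_moving_average(logS),                    # one diagonal direction
            diagonal_moving_average(np.fliplr(logS)),         # the opposite direction
        ])
        print(feature.shape)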

  4. Extraction of α-humulene-enriched oil from clove using ultrasound-assisted supercritical carbon dioxide extraction and studies of its fictitious solubility.

    PubMed

    Wei, Ming-Chi; Xiao, Jianbo; Yang, Yu-Chiao

    2016-11-01

    Clove buds are used as a spice and food flavoring. In this study, clove oil and α-humulene were extracted from cloves using supercritical carbon dioxide extraction with and without ultrasound assistance (USC-CO2 and SC-CO2, respectively) at different temperatures (32-50°C) and pressures (9.0-25.0 MPa). The results of these extractions were compared with those of heat reflux extraction and steam distillation methods conducted in parallel. The extracts obtained using these four techniques were analyzed using gas chromatography and gas chromatography/mass spectrometry (GC/MS). The results demonstrated that the USC-CO2 extraction procedure can extract clove oil and α-humulene from clove buds with better yields and shorter extraction times than conventional extraction techniques while utilizing less severe operating parameters. Furthermore, the experimental fictitious solubility data obtained using the dynamic method were well correlated with density-based models, including the Chrastil model, the Bartle model, and the Kumar and Johnston model. Copyright © 2016 Elsevier Ltd. All rights reserved.
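
    As a small illustration of the density-based correlation step (the data points below are invented, not taken from the study), the Chrastil model ln S = k ln ρ + a/T + b can be fitted to solubility measurements by linear least squares:

        # Minimal sketch: least-squares fit of the Chrastil density-based solubility model.
        import numpy as np

        # Hypothetical measurements: CO2 density (kg/m3), temperature (K), solubility (g/L).
        rho = np.array([630.0, 700.0, 780.0, 840.0, 880.0])
        T = np.array([313.0, 313.0, 323.0, 323.0, 313.0])
        S = np.array([1.2, 1.9, 2.4, 3.6, 4.1])

        A = np.column_stack([np.log(rho), 1.0 / T, np.ones_like(rho)])
        k, a, b = np.linalg.lstsq(A, np.log(S), rcond=None)[0]
        print(f"association number k = {k:.2f}, a = {a:.1f}, b = {b:.2f}")

        S_pred = rho**k * np.exp(a / T + b)        # back-calculated solubility curve
        print(np.round(S_pred, 2))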

  5. Biomedical named entity extraction: some issues of corpus compatibilities.

    PubMed

    Ekbal, Asif; Saha, Sriparna; Sikdar, Utpal Kumar

    2013-01-01

    Named Entity (NE) extraction is one of the most fundamental and important tasks in biomedical information extraction. It involves the identification of certain entities from text and their classification into predefined categories. In the biomedical community, there is as yet no general consensus regarding named entity (NE) annotation; thus, it is very difficult to compare the existing systems due to corpus incompatibilities. This problem also prevents us from exploiting the advantages of using different corpora together. In our present work we address the issues of corpus compatibilities and use a single objective optimization (SOO) based classifier ensemble technique that uses the search capability of a genetic algorithm (GA) for NE extraction in biomedicine. We hypothesize that the reliability of predictions of each classifier differs among the various output classes. We use Conditional Random Field (CRF) and Support Vector Machine (SVM) frameworks to build a number of models depending upon the various representations of the set of features and/or feature templates. It is to be noted that we tried to extract the features without using any deep domain knowledge and/or resources. In order to assess the challenges of corpus compatibilities, we experiment with the different benchmark datasets and their various combinations. Comparison with existing approaches proves the efficacy of the used technique. The GA based ensemble achieves around 2% performance improvement over the individual classifiers. Degradation in performance on the integrated corpus clearly shows the difficulties of the task. In summary, our ensemble based approach attains state-of-the-art performance levels for entity extraction in three different kinds of biomedical datasets. The possible reasons behind the better performance of our approach are (i) the use of a variety of rich features, as described in the subsection "Features for named entity extraction"; and (ii) the use of a GA based classifier ensemble technique to combine the outputs of multiple classifiers.

  6. Radar fall detection using principal component analysis

    NASA Astrophysics Data System (ADS)

    Jokanovic, Branka; Amin, Moeness; Ahmad, Fauzia; Boashash, Boualem

    2016-05-01

    Falls are a major cause of fatal and nonfatal injuries in people aged 65 years and older. Radar has the potential to become one of the leading technologies for fall detection, thereby enabling the elderly to live independently. Existing techniques for fall detection using radar are based on manual feature extraction and require significant parameter tuning in order to provide successful detections. In this paper, we employ principal component analysis for fall detection, wherein eigen images of observed motions are employed for classification. Using real data, we demonstrate that the PCA-based technique provides a performance improvement over conventional feature extraction methods.
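
    A minimal sketch of this eigen-image idea is given below (synthetic radar snapshots, placeholder labels, and a linear SVM chosen here only for illustration; not the authors' implementation): each motion image is flattened, projected onto the leading principal components, and classified in that reduced space.

        # Minimal sketch: PCA eigen-image features for fall vs. non-fall classification.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        images = rng.normal(size=(60, 32, 32))     # placeholder time-frequency snapshots
        labels = rng.integers(0, 2, size=60)       # 1 = fall, 0 = other motion

        X = images.reshape(len(images), -1)        # flatten each image to a vector
        pca = PCA(n_components=10).fit(X)          # principal components = eigen images
        Z = pca.transform(X)                       # projection onto the eigen images

        clf = SVC(kernel="linear").fit(Z, labels)
        print(clf.score(Z, labels))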

  7. Spatial-spectral preprocessing for endmember extraction on GPU's

    NASA Astrophysics Data System (ADS)

    Jimenez, Luis I.; Plaza, Javier; Plaza, Antonio; Li, Jun

    2016-10-01

    Spectral unmixing is focused on the identification of spectrally pure signatures, called endmembers, and their corresponding abundances in each pixel of a hyperspectral image. While mainly focused on the spectral information contained in the hyperspectral images, endmember extraction techniques have recently included spatial information to achieve more accurate results. Several algorithms have been developed for automatic or semi-automatic identification of endmembers using spatial and spectral information, including spectral-spatial endmember extraction (SSEE) where, within a preprocessing step in the technique, both sources of information are extracted from the hyperspectral image and equally used for this purpose. Previous works have implemented the SSEE technique in four main steps: 1) local eigenvector calculation in each sub-region into which the original hyperspectral image is divided; 2) computation of the maximum and minimum projections of all eigenvectors over the entire hyperspectral image in order to obtain a set of candidate pixels; 3) expansion and averaging of the signatures of the candidate set; 4) ranking based on the spectral angle distance (SAD). The result of this method is a list of candidate signatures from which the endmembers can be extracted using various spectral-based techniques, such as orthogonal subspace projection (OSP), vertex component analysis (VCA) or N-FINDR. Considering the large volume of data and the complexity of the calculations, there is a need for efficient implementations. Latest-generation hardware accelerators such as commodity graphics processing units (GPUs) offer a good chance for improving the computational performance in this context. In this paper, we develop two different implementations of the SSEE algorithm using GPUs. Both are based on the eigenvector computation within each sub-region of the first step, one using singular value decomposition (SVD) and the other using principal component analysis (PCA). Based on our experiments with hyperspectral data sets, high computational performance is observed in both cases.
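
    The first two SSEE steps can be sketched roughly as follows (random data cube, assumed tile size and component count, scikit-learn PCA standing in for the local eigenvector computation; not the paper's GPU code): local eigenvectors are computed per sub-region, the whole image is projected onto them, and pixels with extreme projections are collected as candidates.

        # Minimal sketch: per-sub-region eigenvectors and candidate pixel selection (SSEE steps 1-2).
        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(0)
        cube = rng.normal(size=(64, 64, 50))       # rows x cols x bands (placeholder cube)
        rows, cols, bands = cube.shape
        pixels = cube.reshape(-1, bands)

        tile, n_vec, candidates = 16, 3, set()
        for r in range(0, rows, tile):
            for c in range(0, cols, tile):
                block = cube[r:r + tile, c:c + tile].reshape(-1, bands)
                eigvecs = PCA(n_components=n_vec).fit(block).components_  # local eigenvectors
                proj = pixels @ eigvecs.T                                  # project the whole image
                candidates.update(np.argmax(proj, axis=0))
                candidates.update(np.argmin(proj, axis=0))

        print(f"{len(candidates)} candidate pixels for endmember extraction")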

  8. The use of carrier RNA to enhance DNA extraction from microfluidic-based silica monoliths.

    PubMed

    Shaw, Kirsty J; Thain, Lauren; Docker, Peter T; Dyer, Charlotte E; Greenman, John; Greenway, Gillian M; Haswell, Stephen J

    2009-10-12

    DNA extraction was carried out on silica-based monoliths within a microfluidic device. Solid-phase DNA extraction methodology was applied in which the DNA binds to silica in the presence of a chaotropic salt, such as guanidine hydrochloride, and is eluted in a low ionic strength solution, such as water. The addition of poly-A carrier RNA to the chaotropic salt solution resulted in a marked increase in the effective amount of DNA that could be recovered (25 ng) compared to the absence of RNA (5 ng) using the silica-based monolith. These findings confirm that techniques utilising nucleic acid carrier molecules can enhance DNA extraction methodologies in microfluidic applications.

  9. Electroencephalogram-based decoding cognitive states using convolutional neural network and likelihood ratio based score fusion

    PubMed Central

    2017-01-01

    Electroencephalogram (EEG)-based decoding of human brain activity is challenging, owing to the low spatial resolution of EEG. However, EEG is an important technique, especially for brain–computer interface applications. In this study, a novel algorithm is proposed to decode brain activity associated with different types of images. In this hybrid algorithm, a convolutional neural network is modified for the extraction of features, a t-test is used for the selection of significant features, and likelihood ratio-based score fusion is used for the prediction of brain activity. The proposed algorithm takes input data from multichannel EEG time-series, which is also known as multivariate pattern analysis. Comprehensive analysis was conducted using data from 30 participants. The results from the proposed method are compared with currently recognized feature extraction and classification/prediction techniques. The wavelet transform-support vector machine method is the most popular feature extraction and prediction method currently in use; it showed an accuracy of 65.7%. The proposed method, however, predicts the novel data with an improved accuracy of 79.9%. In conclusion, the proposed algorithm outperformed the current feature extraction and prediction methods. PMID:28558002
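
    The two stages that follow the CNN can be sketched as below (synthetic feature vectors standing in for CNN outputs, Gaussian class-conditional densities assumed for the likelihood ratio; not the authors' implementation):

        # Minimal sketch: t-test feature selection and Gaussian log-likelihood-ratio score fusion.
        import numpy as np
        from scipy.stats import ttest_ind, norm

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 128))            # placeholder CNN feature vectors
        y = rng.integers(0, 2, size=200)           # two stimulus classes

        # Keep features whose class means differ significantly (two-sample t-test).
        _, p = ttest_ind(X[y == 0], X[y == 1], axis=0)
        Xs = X[:, p < 0.05]

        # Per-class Gaussian fits -> log-likelihood ratio summed over the kept features.
        mu0, sd0 = Xs[y == 0].mean(0), Xs[y == 0].std(0) + 1e-9
        mu1, sd1 = Xs[y == 1].mean(0), Xs[y == 1].std(0) + 1e-9
        llr = (norm.logpdf(Xs, mu1, sd1) - norm.logpdf(Xs, mu0, sd0)).sum(axis=1)
        pred = (llr > 0).astype(int)
        print("training accuracy:", (pred == y).mean())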

  10. Comparison of extraction techniques of robenidine from poultry feed samples.

    PubMed

    Wilga, Joanna; Kot-Wasik, Agata; Namieśnik, Jacek

    2007-10-31

    In this paper, the effectiveness of six different commonly applied extraction techniques for the determination of robenidine in poultry feed has been compared. The sample preparation techniques included shaking, Soxhlet, Soxtec, ultrasonically assisted extraction, microwave-assisted extraction and accelerated solvent extraction. Comparison of these techniques was made with respect to extraction recovery, temperature and time, reproducibility and solvent consumption. Every single extract was subjected to clean-up using an aluminium oxide column (a Pasteur pipette filled with 1 g of aluminium oxide), from which robenidine was eluted with 10 mL of methanol. The eluate from the clean-up column was collected in a volumetric flask and finally analysed by HPLC-DAD-MS. In general, all extraction techniques were capable of isolating robenidine from poultry feed, but the recovery obtained using modern extraction techniques was higher than that obtained using conventional techniques. In particular, accelerated solvent extraction was superior to the other techniques, which highlights the advantages of this sample preparation technique. However, in routine analysis, shaking and ultrasonically assisted extraction are still the preferred methods for the extraction of robenidine and other coccidiostatics.

  11. Small incision lenticule extraction (SMILE) in the correction of myopic astigmatism: outcomes and limitations - an update.

    PubMed

    Alió Del Barrio, Jorge L; Vargas, Verónica; Al-Shymali, Olena; Alió, Jorge L

    2017-01-01

    Small Incision Lenticule Extraction (SMILE) is a flap-free intrastromal technique for the correction of myopia and myopic astigmatism. To date, this technique lacks automated centration and cyclotorsion control, so several concerns have been raised regarding its capability to correct moderate or high levels of astigmatism. The objective of this paper is to review the reported SMILE outcomes for the correction of myopic astigmatism associated with a cylinder over 0.75 D, and to compare them with the outcomes reported for excimer laser-based corneal refractive surgery techniques. A total of five studies clearly reporting SMILE astigmatic outcomes were identified. SMILE shows acceptable outcomes for the correction of myopic astigmatism, although a general agreement exists about the superiority of the excimer laser-based techniques for low to moderate levels of astigmatism. Manual correction of static cyclotorsion should be adopted for any SMILE astigmatic correction over 0.75 D.

  12. Cryogenic Cathode Cooling Techniques for Improved SABRE Extraction Ion Diode Li Beam Generation

    NASA Astrophysics Data System (ADS)

    Hanson, D. L.; Johnston, R. R.; Cuneo, M. E.; Menge, P. R.; Fowler, W. E.; Armijo, J.; Nielsen, D. S.; Petmecky, D.

    1997-11-01

    We are developing techniques for cryogenic cooling of the SABRE extraction ion diode cathode that, combined with source cleaning, should improve the purity and brightness of Li beams for ICF light ion fusion. By liquid helium (LHe) cathode cooling, we have been able to maintain A-K gap base pressures in the range of 5-7×10⁻⁸ Torr for about 45 minutes. These base pressures extend the monolayer formation time for the worst beam contaminants (H2 and water vapor) to 10-100 s or longer, which should allow the accelerator to be fired without significant Li source recontamination. This technique is compatible with He glow discharge cleaning, laser cleaning, and in situ Li deposition. We are also developing techniques for Ti-gettering of H2 and for cryogenic cooling of cathode electrodes to delay cathode plasma expansion.

  13. Fetal Electrocardiogram Extraction and Analysis Using Adaptive Noise Cancellation and Wavelet Transformation Techniques.

    PubMed

    Sutha, P; Jayanthi, V E

    2017-12-08

    Birth defect-related deaths are mainly due to congenital heart defects. In the early stages of pregnancy, fetal problems can be identified by gathering information about the fetus, helping to avoid stillbirths. The gold standard for monitoring fetal health status, cardiotocography (CTG), cannot be used for long-duration, continuous monitoring. There is a need for continuous, long-duration monitoring of fetal ECG signals using portable devices to follow the progressive health status of the fetus. The non-invasive method of electrocardiogram recording is one of the best ways to diagnose fetal cardiac problems, compared with invasive methods. Monitoring of the fECG requires the development of miniaturized hardware and efficient signal processing algorithms to extract the fECG embedded in the maternal ECG. This paper discusses a prototype hardware developed to monitor and record the raw maternal ECG signal containing the fECG, and a signal processing algorithm to extract the fetal electrocardiogram signal. We propose two signal processing methods: the first is based on the Least Mean Square (LMS) adaptive noise cancellation technique, and the other is based on the wavelet transformation technique. A prototype hardware was designed and developed to acquire the raw ECG signal containing the maternal and fetal ECG, the signal processing techniques were used to eliminate the noise and extract the fetal ECG, and the fetal heart rate variability was studied. Both methods were evaluated with signals acquired from a fetal ECG simulator, from the PhysioNet database, and from a subject. Both methods are evaluated by finding the heart rate and its variability, the amplitude spectrum, and the mean value of the extracted fetal ECG. The accuracy, sensitivity, and positive predictive value of the fetal QRS detection technique are also determined. The adaptive filtering technique uses the sign-sign LMS algorithm, and the wavelet technique uses the Daubechies wavelet, employed along with denoising techniques, for the extraction of the fetal electrocardiogram. Both methods have good sensitivity and accuracy: for the adaptive method the sensitivity is 96.83 and the accuracy 89.87, while for the wavelet method the sensitivity is 95.97 and the accuracy 88.5. Additionally, time domain parameters from the plot of the heart rate variability of the mother and fetus are analyzed.
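
    A bare-bones version of the adaptive noise cancellation idea is sketched below (synthetic sinusoidal waveforms standing in for the maternal and fetal ECGs, assumed filter length and step size; not the authors' implementation): a sign-sign LMS filter driven by the maternal reference is subtracted from the abdominal signal, and the residual approximates the fetal ECG.

        # Minimal sketch: sign-sign LMS adaptive noise cancellation for fetal ECG extraction.
        import numpy as np

        def sign_sign_lms(primary, reference, taps=16, mu=1e-3):
            w = np.zeros(taps)
            out = np.zeros_like(primary)
            for n in range(taps, len(primary)):
                x = reference[n - taps:n][::-1]    # most recent reference samples
                e = primary[n] - np.dot(w, x)      # residual = estimated fetal component
                w += mu * np.sign(e) * np.sign(x)  # sign-sign weight update
                out[n] = e
            return out

        fs = 500
        t = np.arange(0, 10, 1 / fs)
        maternal = np.sin(2 * np.pi * 1.2 * t)             # placeholder maternal waveform
        fetal = 0.2 * np.sin(2 * np.pi * 2.3 * t)          # placeholder fetal waveform
        abdominal = 0.8 * maternal + fetal + 0.05 * np.random.randn(t.size)

        fetal_estimate = sign_sign_lms(abdominal, maternal)
        print(np.corrcoef(fetal_estimate[1000:], fetal[1000:])[0, 1])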

  14. Method of simultaneous stir bar sorptive extraction of phenethylamines and THC metabolite from urine.

    PubMed

    Goto, Yoshiyuki; Takeda, Shiho; Araki, Toshinori; Fuchigami, Takayuki

    2011-10-01

    Stir bar sorptive extraction is a technique used for extracting target substances from various aqueous matrixes such as environmental water, food, and biological samples. This type of extraction is carried out by rotating a coated stir bar in the sample solution. In particular, the Twister bar is a commercial stir bar that is coated with polydimethylsiloxane (PDMS) and used to perform sorptive extraction. In this study, we developed a method for the simultaneous detection of amphetamine, methamphetamine, 3,4-methylenedioxyamphetamine, 3,4-methylenedioxymethamphetamine, and a Δ(9)-tetrahydrocannabinol (THC) metabolite in human urine. For extracting the target analytes, the Twister bar was simply stirred in the sample in the presence of a derivatizing agent. Using this technique, phenethylamines and the acidic THC metabolite can be simultaneously extracted from human urine. This method also enables the extraction of trace amounts of these substances with good reproducibility and high selectivity. The proposed method offers many advantages over other extraction-based approaches and is therefore well suited for screening psychoactive substances in urine specimens.

  15. Recent Application of Solid Phase Based Techniques for Extraction and Preconcentration of Cyanotoxins in Environmental Matrices.

    PubMed

    Mashile, Geaneth Pertunia; Nomngongo, Philiswa N

    2017-03-04

    Cyanotoxins are toxic and are found in eutrophic, municipal, and residential water supplies. For this reason, their occurrence in drinking water systems has become a global concern. Therefore, monitoring, control, risk assessment, and prevention of these contaminants in the environmental bodies are important subjects associated with public health. Thus, rapid, sensitive, selective, simple, and accurate analytical methods for the identification and determination of cyanotoxins are required. In this paper, the sampling methodologies and applications of solid phase-based sample preparation methods for the determination of cyanotoxins in environmental matrices are reviewed. The sample preparation techniques mainly include solid phase micro-extraction (SPME), solid phase extraction (SPE), and solid phase adsorption toxin tracking technology (SPATT). In addition, advantages and disadvantages and future prospects of these methods have been discussed.

  16. Recent developments and future trends in solid phase microextraction techniques towards green analytical chemistry.

    PubMed

    Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek

    2013-12-20

    Solid phase microextraction techniques find increasing applications in the sample preparation step before chromatographic determination of analytes in samples with a complex composition. These techniques allow for integrating several operations, such as sample collection, extraction, analyte enrichment above the detection limit of a given measuring instrument, and the isolation of analytes from the sample matrix. In this work, information about novel methodological and instrumental solutions in relation to different variants of solid phase extraction techniques, solid-phase microextraction (SPME), stir bar sorptive extraction (SBSE) and magnetic solid phase extraction (MSPE) is presented, including practical applications of these techniques and a critical discussion of their advantages and disadvantages. The proposed solutions fulfill the requirements resulting from the concept of sustainable development, and specifically from the implementation of green chemistry principles in analytical laboratories. Therefore, particular attention was paid to the description of possible uses of novel, selective stationary phases in extraction techniques, inter alia, polymeric ionic liquids, carbon nanotubes, and silica- and carbon-based sorbents. The methodological solutions, together with properly matched sampling devices for collecting analytes from samples with varying matrix composition, enable us to reduce the number of errors during sample preparation prior to chromatographic analysis as well as to limit the negative impact of this analytical step on the natural environment and the health of laboratory employees. Copyright © 2013 Elsevier B.V. All rights reserved.

  17. Foundational Performance Analyses of Pressure Gain Combustion Thermodynamic Benefits for Gas Turbines

    NASA Technical Reports Server (NTRS)

    Paxson, Daniel E.; Kaemming, Thomas A.

    2012-01-01

    A methodology is described whereby the work extracted by a turbine exposed to the fundamentally nonuniform flowfield from a representative pressure gain combustor (PGC) may be assessed. The method uses an idealized constant volume cycle, often referred to as an Atkinson or Humphrey cycle, to model the PGC. Output from this model is used as input to a scalable turbine efficiency function (i.e., a map), which in turn allows for the calculation of useful work throughout the cycle. Integration over the entire cycle yields mass-averaged work extraction. The unsteady turbine work extraction is compared to steady work extraction calculations based on various averaging techniques for characterizing the combustor exit pressure and temperature. It is found that averages associated with momentum flux (as opposed to entropy or kinetic energy) provide the best match. This result suggests that momentum-based averaging is the most appropriate figure-of-merit to use as a PGC performance metric. Using the mass-averaged work extraction methodology, it is also found that the design turbine pressure ratio for maximum work extraction is significantly higher than that for a turbine fed by a constant pressure combustor with similar inlet conditions and equivalence ratio. Limited results are presented whereby the constant volume cycle is replaced by output from a detonation-based PGC simulation. The results in terms of averaging techniques and design pressure ratio are similar.
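
    As a highly simplified illustration of mass-averaged work extraction over one unsteady cycle (the blowdown pressure trace, gas properties and fixed turbine efficiency below are assumptions; the turbine map and PGC model of the paper are omitted):

        # Minimal sketch: mass-flow-weighted average of instantaneous isentropic turbine work.
        import numpy as np

        gamma, cp, eta = 1.4, 1004.5, 0.9          # air properties, fixed turbine efficiency
        p_out = 1.0e5                              # turbine exit pressure, Pa

        t = np.linspace(0.0, 1.0, 500)             # one cycle (normalized time)
        p_in = 1.0e5 * (1.0 + 4.0 * np.exp(-5.0 * t))      # assumed decaying blowdown pressure
        T_in = 1200.0 * (p_in / p_in.max()) ** ((gamma - 1) / gamma)
        mdot = 1.0 * p_in / p_in.max()             # mass flow assumed to follow pressure

        w_inst = eta * cp * T_in * (1.0 - (p_out / p_in) ** ((gamma - 1) / gamma))
        w_mass_avg = np.sum(w_inst * mdot) / np.sum(mdot)   # uniform time samples
        print(f"mass-averaged specific work: {w_mass_avg / 1000:.1f} kJ/kg")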

  18. Photovoltaic panel extraction from very high-resolution aerial imagery using region-line primitive association analysis and template matching

    NASA Astrophysics Data System (ADS)

    Wang, Min; Cui, Qi; Sun, Yujie; Wang, Qiao

    2018-07-01

    In object-based image analysis (OBIA), object classification performance is jointly determined by image segmentation, sample or rule setting, and classifiers. Typically, as a crucial step in obtaining object primitives, image segmentation quality significantly influences subsequent feature extraction and analyses. By contrast, template matching extracts specific objects from images and prevents shape defects caused by image segmentation. However, creating or editing templates is tedious and sometimes results in incomplete or inaccurate templates. In this study, we combine OBIA and template matching techniques to address these problems and aim for accurate photovoltaic panel (PVP) extraction from very high-resolution (VHR) aerial imagery. The proposed method is based on the previously proposed region-line primitive association framework, in which complementary information between region (segment) and line (straight line) primitives is utilized to achieve a more powerful performance than routine OBIA. Several novel concepts, including the mutual fitting ratio and best-fitting template based on region-line primitive association analyses, are proposed. An automatic template generation and matching method for PVP extraction from VHR imagery is designed for concept and model validation. Results show that the proposed method can successfully extract PVPs without any user-specified matching template or training sample. High user independence and accuracy are the main characteristics of the proposed method in comparison with routine OBIA and template matching techniques.

  19. Silica nanoparticle based techniques for extraction, detection, and degradation of pesticides.

    PubMed

    Bapat, Gandhali; Labade, Chaitali; Chaudhari, Amol; Zinjarde, Smita

    2016-11-01

    Silica nanoparticles (SiNPs) find applications in the fields of drug delivery, catalysis, immobilization and sensing. Their synthesis can be mediated in a facile manner and they display broad compatibility and stability. Their existence in the form of spheres, wires and sheets renders them suitable for varied purposes. This review summarizes the use of silica nanostructures in developing techniques for the extraction, detection and degradation of pesticides. Silica nanostructures, on account of their sorbent properties, porous nature and increased surface area, allow effective extraction of pesticides. They can be modified (with ionic liquids, silanes or amines), coated with molecularly imprinted polymers or magnetized to improve the extraction of pesticides. Moreover, they can be altered to increase their sensitivity and stability. In addition to the analysis of pesticides by sophisticated techniques such as high-performance liquid chromatography or gas chromatography, simple detection methods based on silica nanoparticles are also proving to be effective. Electrochemical and optical detection based on enzymes (acetylcholinesterase and organophosphate hydrolase) or antibodies have been developed. Pesticide sensors dependent on fluorescence, chemiluminescence or surface-enhanced Raman spectroscopic responses are also SiNP based. Moreover, degradative enzymes (organophosphate hydrolases, carboxyesterases and laccases) and bacterial cells that produce recombinant enzymes have been immobilized on SiNPs for mediating pesticide degradation. After immobilization, these systems show increased stability and improved degradation. SiNPs are significant in developing systems for the effective extraction, detection and degradation of pesticides. Their chemically inert nature and amenability to surface modification make them popular tools for fabricating devices for 'on-site' applications. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. Macromolecular structural changes in bituminous coals during extraction and solubilization. Annual technical progress report, September 1, 1980-August 31, 1981

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peppas, N.A.; Hill-Lievense, M.E.; Hooker, D.T. II

    1981-01-01

    Seven coal samples ranging from a lignite with 69.95% carbon to an anthracite with 94.17% carbon on a dry mineral matter-free (dmmf) basis were extracted with pyridine at its reflux temperature for two weeks. The coal matrices obtained were subjected to two degradation techniques, the Sternberg reductive alkylation technique and the Miyake alkylation technique. Gel permeation chromatographic analysis of pyridine-extracted liquids of the alkylated coal showed average molecular weights smaller than those of the original coal extracts. Electron impact mass spectrometry was used to obtain the mass spectra of these alkylated coal samples. Based on investigation of the recurring pattern of the peaks in the mass spectra of these products, it was concluded that a cluster size of 126 to 130 is characteristic of the crosslinked structure of the coal studied. In addition, several chemical compounds in the range of m/e 78-191 were identified.

  1. An Electronic Engineering Curriculum Design Based on Concept-Mapping Techniques

    ERIC Educational Resources Information Center

    Toral, S. L.; Martinez-Torres, M. R.; Barrero, F.; Gallardo, S.; Duran, M. J.

    2007-01-01

    Curriculum design is a concern in European Universities as they face the forthcoming European Higher Education Area (EHEA). This process can be eased by the use of scientific tools such as Concept-Mapping Techniques (CMT) that extract and organize the most relevant information from experts' experience using statistics techniques, and helps a…

  2. Research on feature extraction techniques of Hainan Li brocade pattern

    NASA Astrophysics Data System (ADS)

    Zhou, Yuping; Chen, Fuqiang; Zhou, Yuhua

    2016-03-01

    Hainan Li brocade skills have been listed as world intangible cultural heritage; therefore, research on Hainan Li brocade patterns plays an important role in the inheritance of Li brocade culture. In this paper, the meaning of Li brocade patterns was analyzed and shape feature extraction techniques for original Li brocade patterns were advanced, based on a contour tracking algorithm. First, edge detection was performed on the design patterns; then the morphological closing operation was used to smooth the image, and finally contour tracking was used to extract the outer contours of the Li brocade patterns. The extracted contour features were processed by means of morphology, and digital characteristics of the contours were obtained by invariant moments. At last, different Li brocade design patterns are briefly analyzed according to their digital characteristics. The results showed that this pattern shape extraction method for Li brocade is feasible and effective.
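
    A compact sketch of this processing chain (on a synthetic binary motif, with OpenCV chosen here as one possible toolkit; not the authors' code) is:

        # Minimal sketch: edge detection, morphological closing, outer-contour tracking, invariant moments.
        import cv2                                 # OpenCV >= 4
        import numpy as np

        # Placeholder pattern standing in for a digitized Li brocade motif.
        img = np.zeros((200, 200), dtype=np.uint8)
        cv2.rectangle(img, (40, 60), (160, 140), 255, -1)
        cv2.circle(img, (100, 100), 25, 0, -1)

        edges = cv2.Canny(img, 50, 150)                            # edge detection
        kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
        closed = cv2.morphologyEx(edges, cv2.MORPH_CLOSE, kernel)  # smooth the edge image
        contours, _ = cv2.findContours(closed, cv2.RETR_EXTERNAL,  # outer contours only
                                       cv2.CHAIN_APPROX_SIMPLE)

        for c in contours:
            hu = cv2.HuMoments(cv2.moments(c)).ravel()             # invariant moments
            print(np.round(-np.sign(hu) * np.log10(np.abs(hu) + 1e-30), 2))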

  3. A Biomechanical Modeling Guided CBCT Estimation Technique

    PubMed Central

    Zhang, You; Tehrani, Joubin Nasehi; Wang, Jing

    2017-01-01

    Two-dimensional-to-three-dimensional (2D-3D) deformation has emerged as a new technique to estimate cone-beam computed tomography (CBCT) images. The technique is based on deforming a prior high-quality 3D CT/CBCT image to form a new CBCT image, guided by limited-view 2D projections. The accuracy of this intensity-based technique, however, is often limited in low-contrast image regions with subtle intensity differences. The solved deformation vector fields (DVFs) can also be biomechanically unrealistic. To address these problems, we have developed a biomechanical modeling guided CBCT estimation technique (Bio-CBCT-est) by combining 2D-3D deformation with finite element analysis (FEA)-based biomechanical modeling of anatomical structures. Specifically, Bio-CBCT-est first extracts the 2D-3D deformation-generated displacement vectors at the high-contrast anatomical structure boundaries. The extracted surface deformation fields are subsequently used as the boundary conditions to drive structure-based FEA to correct and fine-tune the overall deformation fields, especially those at low-contrast regions within the structure. The resulting FEA-corrected deformation fields are then fed back into 2D-3D deformation to form an iterative loop, combining the benefits of intensity-based deformation and biomechanical modeling for CBCT estimation. Using eleven lung cancer patient cases, the accuracy of the Bio-CBCT-est technique has been compared to that of the 2D-3D deformation technique and the traditional CBCT reconstruction techniques. The accuracy was evaluated in the image domain, and also in the DVF domain through clinician-tracked lung landmarks. PMID:27831866

  4. Visual reconciliation of alternative similarity spaces in climate modeling

    Treesearch

    J Poco; A Dasgupta; Y Wei; William Hargrove; C.R. Schwalm; D.N. Huntzinger; R Cook; E Bertini; C.T. Silva

    2015-01-01

    Visual data analysis often requires grouping of data objects based on their similarity. In many application domains researchers use algorithms and techniques like clustering and multidimensional scaling to extract groupings from data. While extracting these groups using a single similarity criteria is relatively straightforward, comparing alternative criteria poses...

  5. Novel Features for Brain-Computer Interfaces

    PubMed Central

    Woon, W. L.; Cichocki, A.

    2007-01-01

    While conventional approaches of BCI feature extraction are based on the power spectrum, we have tried using nonlinear features for classifying BCI data. In this paper, we report our test results and findings, which indicate that the proposed method is a potentially useful addition to current feature extraction techniques. PMID:18364991

  6. Pattern-Based Extraction of Argumentation from the Scientific Literature

    ERIC Educational Resources Information Center

    White, Elizabeth K.

    2010-01-01

    As the number of publications in the biomedical field continues its exponential increase, techniques for automatically summarizing information from this body of literature have become more diverse. In addition, the targets of summarization have become more subtle; initial work focused on extracting the factual assertions from full-text papers,…

  7. Method for improving accuracy in full evaporation headspace analysis.

    PubMed

    Xie, Wei-Qi; Chai, Xin-Sheng

    2017-05-01

    We report a new headspace analytical method in which multiple headspace extraction is incorporated with the full evaporation technique. The pressure uncertainty caused by changes in the solid content of the samples has a great impact on measurement accuracy in conventional full evaporation headspace analysis. The results (using ethanol solution as the model sample) showed that the present technique is effective in minimizing this problem. The proposed full evaporation multiple headspace extraction analysis technique is also automated and practical, and could greatly broaden the applications of full-evaporation-based headspace analysis. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Effectiveness of high-throughput miniaturized sorbent- and solid phase microextraction techniques combined with gas chromatography-mass spectrometry analysis for a rapid screening of volatile and semi-volatile composition of wines--a comparative study.

    PubMed

    Mendes, Berta; Gonçalves, João; Câmara, José S

    2012-01-15

    In this study the feasibility of different extraction procedures was evaluated in order to test their potential for the extraction of volatile (VOCs) and semi-volatile constituents (SVOCs) from wines. In this sense, and before they could be analysed by gas chromatography-quadrupole first stage mass spectrometry (GC-qMS), three different high-throughput miniaturized (ad)sorptive extraction techniques, based on solid phase extraction (SPE), microextraction by packed sorbents (MEPS) and solid phase microextraction (SPME), were studied for the first time together for the extraction step. To achieve the most complete volatile and semi-volatile signature, distinct SPE (LiChrolut EN, Poropak Q, Styrene-Divinylbenzene and Amberlite XAD-2) and MEPS (C(2), C(8), C(18), Silica and M1 (mixed C(8)-SCX)) sorbent materials, and different SPME fibre coatings (PA, PDMS, PEG, DVB/CAR/PDMS, PDMS/DVB, and CAR/PDMS), were tested and compared. All the extraction techniques were followed by GC-qMS analysis, which allowed the identification of up to 103 VOCs and SVOCs, distributed by distinct chemical families: higher alcohols, esters, fatty acids, carbonyl compounds and furan compounds. Mass spectra, standard compounds and retention indices were used for identification purposes. The SPE technique, using LiChrolut EN as sorbent (SPE(LiChrolut EN)), was the most efficient method, allowing the identification of 78 VOCs and SVOCs, 63 and 19 more than the MEPS and SPME techniques, respectively. In the MEPS technique the best results in terms of number of extractable/identified compounds and total peak areas of the volatile and semi-volatile fraction were obtained using the C(8) resin, whereas DVB/CAR/PDMS was revealed to be the most efficient SPME coating to extract VOCs and SVOCs from Bual wine. Diethyl malate (18.8±3.2%) was the main component found in wine SPE(LiChrolut EN) extracts, followed by ethyl succinate (13.5±5.3%), 3-methyl-1-butanol (13.2±1.7%), and 2-phenylethanol (11.2±9.9%), while in the SPME(DVB/CAR/PDMS) technique 3-methyl-1-butanol (43.3±0.6%), followed by diethyl succinate (18.9±1.6%) and 2-furfural (10.4±0.4%), are the major compounds. The major VOCs and SVOCs isolated by MEPS(C8) were 3-methyl-1-butanol (26.8±0.6% of the wine total volatile fraction), diethyl succinate (24.9±0.8%), and diethyl malate (16.3±0.9%). Regardless of the extraction technique, the highest extraction efficiency corresponds to esters and higher alcohols and the lowest to fatty acids. Despite some drawbacks associated with the SPE procedure, such as the use of organic solvents and the time-consuming and tedious sampling procedure, SPE(LiChrolut EN) was revealed to be the most effective technique, allowing the extraction of a higher number of compounds (78) than the other extraction techniques studied. Copyright © 2011 Elsevier B.V. All rights reserved.

  9. Tenax extraction as a simple approach to improve environmental risk assessments.

    PubMed

    Harwood, Amanda D; Nutile, Samuel A; Landrum, Peter F; Lydy, Michael J

    2015-07-01

    It is well documented that using exhaustive chemical extractions is not an effective means of assessing exposure of hydrophobic organic compounds in sediments and that bioavailability-based techniques are an improvement over traditional methods. One technique that has shown special promise as a method for assessing the bioavailability of hydrophobic organic compounds in sediment is the use of Tenax-extractable concentrations. A 6-h or 24-h single-point Tenax-extractable concentration correlates to both bioaccumulation and toxicity. This method has demonstrated effectiveness for several hydrophobic organic compounds in various organisms under both field and laboratory conditions. In addition, a Tenax bioaccumulation model was developed for multiple compounds relating 24-h Tenax-extractable concentrations to oligochaete tissue concentrations exposed in both the laboratory and field. This model has demonstrated predictive capacity for additional compounds and species. Use of Tenax-extractable concentrations to estimate exposure is rapid, simple, straightforward, and relatively inexpensive, as well as accurate. Therefore, this method would be an invaluable tool if implemented in risk assessments. © 2015 SETAC.

  10. Distribution and uptake dynamics of mercury in leaves of common deciduous tree species in Minnesota, U.S.A.

    Treesearch

    Aicam Laacouri; Edward A. Nater; Randall K. Kolka

    2013-01-01

    A sequential extraction technique for compartmentalizing mercury (Hg) in leaves was developed based on a water extraction of Hg from the leaf surface followed by a solvent extraction of the cuticle. The bulk of leaf Hg was found in the tissue compartment (90-96%) with lesser amounts in the surface and cuticle compartments. Total leaf concentrations of Hg varied among...

  11. Automating the generation of lexical patterns for processing free text in clinical documents.

    PubMed

    Meng, Frank; Morioka, Craig

    2015-09-01

    Many tasks in natural language processing utilize lexical pattern-matching techniques, including information extraction (IE), negation identification, and syntactic parsing. However, it is generally difficult to derive patterns that achieve acceptable levels of recall while also remaining highly precise. We present a multiple sequence alignment (MSA)-based technique that automatically generates patterns, thereby leveraging language usage to determine the context of words that influence a given target. MSAs capture the commonalities among word sequences and are able to reveal areas of linguistic stability and variation. In this way, MSAs provide a systematic approach to generating lexical patterns that are generalizable, which will both increase recall levels and maintain high levels of precision. The MSA-generated patterns exhibited consistent F1, F0.5, and F2 scores compared to two baseline techniques for IE across four different tasks. Both baseline techniques performed well for some tasks and less well for others, but MSA was found to consistently perform at a high level for all four tasks. The performance of MSA on the four extraction tasks indicates the method's versatility. The results show that the MSA-based patterns are able to handle the extraction of individual data elements as well as relations between two concepts without the need for large amounts of manual intervention. We presented an MSA-based framework for generating lexical patterns that showed consistently high levels of both performance and recall over four different extraction tasks when compared to baseline methods. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
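
    The core idea can be illustrated with a toy pairwise case (invented sentences, difflib alignment standing in for a full multiple sequence alignment; not the authors' implementation): shared tokens are kept as stable context and variable regions collapse into wildcard slots.

        # Minimal sketch: deriving a generalized lexical pattern from an alignment of two token sequences.
        from difflib import SequenceMatcher

        def align_to_pattern(tokens_a, tokens_b, wildcard="(*)"):
            pattern = []
            for op, i1, i2, j1, j2 in SequenceMatcher(None, tokens_a, tokens_b).get_opcodes():
                if op == "equal":
                    pattern.extend(tokens_a[i1:i2])      # stable context words
                elif not pattern or pattern[-1] != wildcard:
                    pattern.append(wildcard)             # variable region -> one wildcard slot
            return pattern

        a = "the patient was started on 40 mg of lisinopril daily".split()
        b = "the patient was started on metformin twice daily".split()
        print(" ".join(align_to_pattern(a, b)))
        # expected: the patient was started on (*) daily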

  12. [Analysis of triterpenoids in Ganoderma lucidum by microwave-assisted continuous extraction].

    PubMed

    Lu, Yan-fang; An, Jing; Jiang, Ye

    2015-04-01

    For further improving the extraction efficiency of microwave extraction, a microwave-assisted continuous extraction (MACE) device has been designed and utilized. The characteristics and extraction efficiency of MACE have also been studied by comparison with traditional methods. The method was validated by the analysis of the triterpenoids in Ganoderma lucidum. The extraction conditions of MACE were: 95% ethanol as solvent, microwave power 200 W and radiation time 14.5 min (5 cycles). The extraction results were subsequently compared with traditional heat reflux extraction (HRE), Soxhlet extraction (SE), ultrasonic extraction (UE) as well as conventional microwave extraction (ME). For triterpenoids, the two microwave-based methods (ME and MACE) were in general capable of finishing the extraction in 10 and 14.5 min, respectively, while the other methods consumed 60 min or even more than 100 min. Additionally, ME produced extraction results comparable to the classical HRE and higher extraction yields than both SE and UE, but a notably lower extraction yield than MACE. More importantly, the purity of the crude extract obtained by MACE is far better than with the other methods. MACE effectively combines the advantages of microwave extraction and Soxhlet extraction, thus enabling a more complete extraction of the analytes of TCMs in comparison with ME, and therefore makes the analytical result more accurate. It provides a novel, highly efficient, rapid and reliable pretreatment technique for the analysis of TCMs, and it could potentially be extended to ingredient preparation or extraction techniques for TCMs.

  13. Extending the spectrum of DNA sequences retrieved from ancient bones and teeth

    PubMed Central

    Glocke, Isabelle; Meyer, Matthias

    2017-01-01

    The number of DNA fragments surviving in ancient bones and teeth is known to decrease with fragment length. Recent genetic analyses of Middle Pleistocene remains have shown that the recovery of extremely short fragments can prove critical for successful retrieval of sequence information from particularly degraded ancient biological material. Current sample preparation techniques, however, are not optimized to recover DNA sequences from fragments shorter than ∼35 base pairs (bp). Here, we show that much shorter DNA fragments are present in ancient skeletal remains but lost during DNA extraction. We present a refined silica-based DNA extraction method that not only enables efficient recovery of molecules as short as 25 bp but also doubles the yield of sequences from longer fragments due to improved recovery of molecules with single-strand breaks. Furthermore, we present strategies for monitoring inefficiencies in library preparation that may result from co-extraction of inhibitory substances during DNA extraction. The combination of DNA extraction and library preparation techniques described here substantially increases the yield of DNA sequences from ancient remains and provides access to a yet unexploited source of highly degraded DNA fragments. Our work may thus open the door for genetic analyses on even older material. PMID:28408382

  14. Gas flow headspace liquid phase microextraction.

    PubMed

    Yang, Cui; Qiu, Jinxue; Ren, Chunyan; Piao, Xiangfan; Li, Xifeng; Wu, Xue; Li, Donghao

    2009-11-06

    There is a trend towards the use of enrichment techniques such as microextraction in the analysis of trace chemicals. Based on the theory of ideal gases, the theory of gas chromatography and the original headspace liquid phase microextraction (HS-LPME) technique, a simple gas flow headspace liquid phase microextraction (GF-HS-LPME) technique has been developed, in which the extracting gas phase volume is increased using a gas flow. The system is an open system, where an inert gas containing the target compounds flows continuously through a special gas outlet channel (D = 1.8 mm), and the target compounds are trapped on a solvent microdrop (2.4 μL) hanging on the microsyringe tip; as a result, a high enrichment factor is obtained. The parameters affecting the enrichment factor, such as the gas flow rate, the position of the microdrop, the diameter of the gas outlet channel, the temperatures of the extracting solvent and of the sample, and the extraction time, were systematically optimized for four types of polycyclic aromatic hydrocarbons. The results were compared with results obtained from HS-LPME. Under the optimized conditions (where the extraction time and the volume of the extracting sample vial were fixed at 20 min and 10 mL, respectively), detection limits (S/N = 3) were approximately a factor of 4 lower than those for the original HS-LPME technique. The method was validated by comparison of the GF-HS-LPME and HS-LPME techniques using data for PAHs from environmental sediment samples.

  15. A new strategy for accelerated extraction of target compounds using molecularly imprinted polymer particles embedded in a paper-based disk.

    PubMed

    Zarejousheghani, Mashaalah; Schrader, Steffi; Möder, Monika; Schmidt, Matthias; Borsdorf, Helko

    2018-03-01

    In this study, a general, simple and inexpensive method is introduced for the preparation of a paper-based selective disk-type solid phase extraction (SPE) technique, appropriate for fast and high-throughput monitoring of target compounds. An ion exchange molecularly imprinted polymer (MIP) was synthesized for the extraction and analysis of acesulfame, an anthropogenic water quality marker. Acesulfame imprinting was used as an example for demonstrating the benefits of a nanosized, swellable MIP extraction sorbent integrated in an on-site compatible concept for water quality monitoring. Compared with an 8 mL standard SPE cartridge, the paper-based MIP disk (47 mm ø) format allowed (1) high sample flow rates of up to 30 mL·min⁻¹ without losing extraction efficiency, (2) extracting sample volumes of up to 500 mL in much shorter times than with standard SPE, (3) the reuse of the disks (up to 3 times more than the SPE cartridge) due to high robustness and efficient post-cleaning, and (4) reducing the sampling time from 100 minutes (using the standard SPE format) to about 2 minutes with the MIP paper disk for a 50 mL water sample. Different parameters such as cellulose fiber/polymer ratios, sample volume, sample flow-rate, washing, and elution conditions were evaluated and optimized. Using the developed extraction technique with high-performance liquid chromatography-tandem mass spectrometry (HPLC-MS-MS) analysis, a new protocol was established that provides detection and quantification limits of 0.015 μg·L⁻¹ and 0.05 μg·L⁻¹, respectively. The developed paper disks were used in-field for the selective extraction of target compounds and transferred to the laboratory for further analysis. Copyright © 2017 John Wiley & Sons, Ltd.

  16. Collaborative Filtering Based on Sequential Extraction of User-Item Clusters

    NASA Astrophysics Data System (ADS)

    Honda, Katsuhiro; Notsu, Akira; Ichihashi, Hidetomo

    Collaborative filtering is a computational realization of “word-of-mouth” in a network community, in which the items preferred by “neighbors” are recommended. This paper proposes a new item-selection model for extracting user-item clusters from rectangular relation matrices, in which mutual relations between users and items are denoted in an alternating process of “liking or not”. A technique for sequential co-cluster extraction from rectangular relational data is given by combining a structural balancing-based user-item clustering method with a sequential fuzzy cluster extraction approach. The technique is then applied to the collaborative filtering problem, in which some items may be shared by several user clusters.
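
    A minimal sketch of the sequential co-cluster idea described above, assuming a 0/1 user-item relation matrix: one dense user-item block is grown greedily, removed from the matrix, and the step is repeated. The density threshold `tau` and the greedy membership update are illustrative simplifications, not the structural-balancing or fuzzy formulation of the paper.

    ```python
    import numpy as np

    def extract_cocluster(R, tau=0.6, max_iter=50):
        """Greedily grow one user-item co-cluster from a 0/1 relation matrix R."""
        # seed with the user and item having the most positive relations
        users = {int(np.argmax(R.sum(axis=1)))}
        items = {int(np.argmax(R.sum(axis=0)))}
        for _ in range(max_iter):
            u_idx, i_idx = sorted(users), sorted(items)
            # admit users/items whose density over the current block exceeds tau
            new_users = {u for u in range(R.shape[0]) if R[u, i_idx].mean() >= tau}
            new_items = {i for i in range(R.shape[1]) if R[u_idx, i].mean() >= tau}
            if new_users == users and new_items == items:
                break
            users, items = new_users or users, new_items or items
        return users, items

    def sequential_coclustering(R, n_clusters=3, tau=0.6):
        """Extract co-clusters one at a time, masking out relations already explained."""
        R = R.astype(float).copy()
        clusters = []
        for _ in range(n_clusters):
            users, items = extract_cocluster(R, tau)
            clusters.append((users, items))
            R[np.ix_(sorted(users), sorted(items))] = 0.0   # remove the extracted block
        return clusters
    ```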

  17. Evaluation of Method-Specific Extraction Variability for the Measurement of Fatty Acids in a Candidate Infant/Adult Nutritional Formula Reference Material.

    PubMed

    Place, Benjamin J

    2017-05-01

    To address community needs, the National Institute of Standards and Technology has developed a candidate Standard Reference Material (SRM) for infant/adult nutritional formula based on milk and whey protein concentrates with isolated soy protein, designated SRM 1869 Infant/Adult Nutritional Formula. One major component of this candidate SRM is the fatty acid content. In this study, multiple extraction techniques were evaluated to quantify the fatty acids in this new material. Extraction methods based on lipid extraction followed by transesterification resulted in lower mass fraction values for all fatty acids than methods utilizing in situ transesterification followed by fatty acid methyl ester extraction (ISTE). An ISTE method, based on the identified optimal parameters, was used to determine the fatty acid content of the new infant/adult nutritional formula reference material.

  18. Comparison of DGT with traditional extraction methods for assessing arsenic bioavailability to Brassica chinensis in different soils.

    PubMed

    Dai, Yunchao; Nasir, Mubasher; Zhang, Yulin; Gao, Jiakai; Lv, Yamin; Lv, Jialong

    2018-01-01

    Several predictive models and methods have been used to assess heavy metal bioavailability, but there is no universally accepted approach for evaluating the bioavailability of arsenic (As) in soil. The technique of diffusive gradients in thin-films (DGT) is a promising tool, but there is considerable debate about its suitability. The DGT method was compared with traditional chemical extraction techniques (soil solution, NaHCO3, NH4Cl, HCl, and total As) for estimating As bioavailability in soil, based on a greenhouse experiment using Brassica chinensis grown in soils from 15 provinces in China. In addition, we assessed whether these methods are independent of soil properties. The correlations between plant and soil As concentrations measured with the traditional extraction techniques were pH- and iron oxide (Feox)-dependent, indicating that these methods are influenced by soil properties. In contrast, DGT measurements were independent of soil properties and also showed a better correlation coefficient than the traditional techniques. Thus, the DGT technique is superior to traditional techniques and should be preferred for evaluating As bioavailability in different types of soil. Copyright © 2017 Elsevier Ltd. All rights reserved.
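
    For context, DGT-available concentrations are conventionally computed from the analyte mass accumulated on the binding gel during deployment; a standard form of the relation (symbols follow common DGT usage and are not taken from this abstract) is:

    ```latex
    % M: accumulated As mass on the binding gel, \Delta g: diffusive layer thickness,
    % D: diffusion coefficient of As in the gel, A: exposure window area, t: deployment time
    C_{\mathrm{DGT}} = \frac{M \, \Delta g}{D \, A \, t}
    ```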

  19. Nonredundant sparse feature extraction using autoencoders with receptive fields clustering.

    PubMed

    Ayinde, Babajide O; Zurada, Jacek M

    2017-09-01

    This paper proposes new techniques for data representation in the context of deep learning using agglomerative clustering. Existing autoencoder-based data representation techniques tend to produce duplicative encoding and decoding receptive fields in layered autoencoders, leading to the extraction of similar features and thus to filtering redundancy. We propose a way to address this problem and show that such redundancy can be eliminated. This yields smaller networks and produces unique receptive fields that extract distinct features. It is also shown that autoencoders with nonnegativity constraints on weights extract fewer redundant features than conventional sparse autoencoders. The concept is illustrated using a conventional sparse autoencoder and nonnegativity-constrained autoencoders on MNIST digit recognition, the NORB normalized-uniform object dataset and the Yale face dataset. Copyright © 2017 Elsevier Ltd. All rights reserved.
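
    A minimal sketch of the redundancy-removal step implied above: agglomeratively cluster the encoder's receptive fields (weight rows) by cosine distance and keep one representative per cluster. The threshold and distance metric are assumptions for illustration, not the paper's exact procedure.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import pdist

    def prune_duplicate_receptive_fields(W, distance_threshold=0.3):
        """W: (n_hidden, n_inputs) encoder weight matrix, one row per receptive field.
        Cluster similar fields (cosine distance) and keep one representative each."""
        d = pdist(W, metric="cosine")                      # pairwise field dissimilarity
        Z = linkage(d, method="average")                   # agglomerative clustering
        labels = fcluster(Z, t=distance_threshold, criterion="distance")
        keep = [np.where(labels == c)[0][0] for c in np.unique(labels)]  # one per cluster
        return W[keep], np.array(keep)
    ```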

  20. Pattern recognition of satellite cloud imagery for improved weather prediction

    NASA Technical Reports Server (NTRS)

    Gautier, Catherine; Somerville, Richard C. J.; Volfson, Leonid B.

    1986-01-01

    The major accomplishment was the successful development of a method for extracting time derivative information from geostationary meteorological satellite imagery. This research is a proof-of-concept study which demonstrates the feasibility of using pattern recognition techniques and a statistical cloud classification method to estimate time rate of change of large-scale meteorological fields from remote sensing data. The cloud classification methodology is based on typical shape function analysis of parameter sets characterizing the cloud fields. The three specific technical objectives, all of which were successfully achieved, are as follows: develop and test a cloud classification technique based on pattern recognition methods, suitable for the analysis of visible and infrared geostationary satellite VISSR imagery; develop and test a methodology for intercomparing successive images using the cloud classification technique, so as to obtain estimates of the time rate of change of meteorological fields; and implement this technique in a testbed system incorporating an interactive graphics terminal to determine the feasibility of extracting time derivative information suitable for comparison with numerical weather prediction products.

  1. Classification by diagnosing all absorption features (CDAF) for the most abundant minerals in airborne hyperspectral images

    NASA Astrophysics Data System (ADS)

    Mobasheri, Mohammad Reza; Ghamary-Asl, Mohsen

    2011-12-01

    Imaging through hyperspectral technology is a powerful tool that can be used to spectrally identify and spatially map materials based on their specific absorption characteristics in the electromagnetic spectrum. A robust method called Tetracorder has shown its effectiveness at material identification and mapping, using a set of algorithms within an expert-system decision-making framework. In this study, using some stages of Tetracorder, a technique called classification by diagnosing all absorption features (CDAF) is introduced. This technique makes it possible to assign the most abundant mineral in each pixel to a class with high accuracy. The technique is based on the derivation of information from reflectance spectra of the image. This is done by extracting the spectral absorption features of each mineral from its laboratory-measured reflectance spectrum and comparing them with those extracted from the pixels in the image. The CDAF technique was applied to an AVIRIS image, where the results show an overall accuracy of better than 96%.
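
    A simplified sketch of the feature-matching step behind a CDAF-style classification: each absorption feature is continuum-removed over a wavelength window, and the pixel's feature shape is correlated with each library mineral's. The linear continuum and correlation score are common simplifications of the Tetracorder-style fit, and the window definitions (`feature_windows`) are hypothetical inputs.

    ```python
    import numpy as np

    def continuum_removed(wl, refl, lo, hi):
        """Remove a linear continuum over the feature window [lo, hi] (wavelengths in nm)."""
        m = (wl >= lo) & (wl <= hi)
        w, r = wl[m], refl[m]
        continuum = np.interp(w, [w[0], w[-1]], [r[0], r[-1]])  # straight line between endpoints
        return w, r / continuum

    def feature_match_score(wl, pixel, library, lo, hi):
        """Correlate the continuum-removed absorption feature of a pixel with a library spectrum."""
        _, p = continuum_removed(wl, pixel, lo, hi)
        _, l = continuum_removed(wl, library, lo, hi)
        return float(np.corrcoef(p, l)[0, 1])

    def classify_pixel(wl, pixel, library_spectra, feature_windows):
        """Assign the mineral whose summed feature-fit score is highest (a CDAF-style rule)."""
        scores = {name: sum(feature_match_score(wl, pixel, spec, lo, hi)
                            for lo, hi in feature_windows[name])
                  for name, spec in library_spectra.items()}
        return max(scores, key=scores.get)
    ```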

  2. Continuous nucleus extraction by optically-induced cell lysis on a batch-type microfluidic platform.

    PubMed

    Huang, Shih-Hsuan; Hung, Lien-Yu; Lee, Gwo-Bin

    2016-04-21

    The extraction of a cell's nucleus is an essential technique required for a number of procedures, such as disease diagnosis, genetic replication, and animal cloning. However, existing nucleus extraction techniques are relatively inefficient and labor-intensive. Therefore, this study presents an innovative, microfluidics-based approach featuring optically-induced cell lysis (OICL) for nucleus extraction and collection in an automatic format. In comparison to previous micro-devices designed for nucleus extraction, the new OICL device designed herein is superior in terms of flexibility, selectivity, and efficiency. To facilitate this OICL module for continuous nucleus extraction, we further integrated an optically-induced dielectrophoresis (ODEP) module with the OICL device within the microfluidic chip. This on-chip integration circumvents the need for highly trained personnel and expensive, cumbersome equipment. Specifically, this microfluidic system automates four steps by 1) automatically focusing and transporting cells, 2) releasing the nuclei on the OICL module, 3) isolating the nuclei on the ODEP module, and 4) collecting the nuclei in the outlet chamber. The efficiency of cell membrane lysis and the ODEP nucleus separation was measured to be 78.04 ± 5.70% and 80.90 ± 5.98%, respectively, leading to an overall nucleus extraction efficiency of 58.21 ± 2.21%. These results demonstrate that this microfluidics-based system can successfully perform nucleus extraction, and the integrated platform is therefore promising in cell fusion technology with the goal of achieving genetic replication, or even animal cloning, in the near future.

  3. Charge transport and recombination in bulk heterojunction solar cells studied by the photoinduced charge extraction in linearly increasing voltage technique

    NASA Astrophysics Data System (ADS)

    Mozer, A. J.; Sariciftci, N. S.; Lutsen, L.; Vanderzande, D.; Österbacka, R.; Westerling, M.; Juška, G.

    2005-03-01

    Charge carrier mobility and recombination in a bulk heterojunction solar cell based on a mixture of poly[2-methoxy-5-(3,7-dimethyloctyloxy)-phenylene vinylene] (MDMO-PPV) and 1-(3-methoxycarbonyl)propyl-1-phenyl-(6,6)-C61 (PCBM) have been studied using the novel technique of photoinduced charge carrier extraction in a linearly increasing voltage (Photo-CELIV). In this technique, charge carriers are photogenerated by a short laser flash and extracted under a reverse bias voltage ramp after an adjustable delay time (tdel). The Photo-CELIV mobility at room temperature is found to be μ = 2×10⁻⁴ cm² V⁻¹ s⁻¹, which is almost independent of charge carrier density but slightly dependent on tdel. Furthermore, determination of the charge carrier lifetime and demonstration of an electric-field-dependent mobility are presented.
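
    The mobility value quoted above is typically obtained from the CELIV extraction transient using the commonly cited relation below (symbols follow standard CELIV usage and are not taken from this abstract):

    ```latex
    % d: active layer thickness, A: voltage ramp rate (V/s), t_max: time of the
    % extraction-current maximum, \Delta j: extraction peak height, j(0): displacement current step
    \mu \approx \frac{2 d^{2}}{3 A \, t_{\max}^{2} \left[ 1 + 0.36 \, \Delta j / j(0) \right]}
    ```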

  4. Solventless and solvent-minimized sample preparation techniques for determining currently used pesticides in water samples: a review.

    PubMed

    Tankiewicz, Maciej; Fenik, Jolanta; Biziuk, Marek

    2011-10-30

    The intensification of agriculture means that increasing amounts of toxic organic and inorganic compounds are entering the environment. The pesticides generally applied nowadays are regarded as some of the most dangerous contaminants of the environment. Their presence in the environment, especially in water, is hazardous because it makes human beings more susceptible to disease. For these reasons, it is essential to monitor pesticide residues in the environment with the aid of all accessible analytical methods. The analysis of samples for the presence of pesticides is problematic because of the laborious and time-consuming operations involved in preparing samples for analysis, which themselves may be a source of additional contamination and error. To date, it has been standard practice to use large quantities of organic solvents in the sample preparation process; but as these solvents are themselves hazardous, solventless and solvent-minimized techniques are coming into use. This paper discusses the sample preparation techniques most commonly used over the last 15 years for monitoring organophosphorus and organonitrogen pesticide residues in water samples. Furthermore, a significant trend in sample preparation, in accordance with the principles of 'Green Chemistry', is the simplification, miniaturization and automation of analytical techniques. In view of this, several novel techniques are being developed in order to shorten the analysis, increase sample throughput and improve the quality and sensitivity of analytical methods. The paper describes extraction techniques requiring the use of solvents - liquid-liquid extraction (LLE) and its modifications, membrane extraction techniques, hollow fibre-protected two-phase solvent microextraction, liquid phase microextraction based on the solidification of a floating organic drop (LPME-SFO), solid-phase extraction (SPE) and single-drop microextraction (SDME) - as well as solvent-free techniques - solid phase microextraction (SPME) and stir bar sorptive extraction (SBSE). The advantages and drawbacks of these techniques are also discussed, and some solutions to their limitations are proposed. Copyright © 2011 Elsevier B.V. All rights reserved.

  5. USING CARBOHYDRATES AS MOLECULAR MARKERS TO DETERMINE THE CONTRIBUTION OF AGRICULTURAL SOIL TO AMBIENT FINE AND COARSE PM

    EPA Science Inventory

    Project research optimized the quantification technique for carbohydrates that also allows quantification of other non-polar molecular markers based on using an isotopically labeled internal standard (D-glucose-1,2,3,4,5,6,6-d7) to monitor extraction efficiency, extraction usi...

  6. Flow-based analysis using microfluidics-chemiluminescence systems.

    PubMed

    Al Lawati, Haider A J

    2013-01-01

    This review will discuss various approaches and techniques in which analysis using microfluidics-chemiluminescence systems (MF-CL) has been reported. A variety of applications is examined, including environmental, pharmaceutical, biological, food and herbal analysis. Reported uses of CL reagents, sample introduction techniques, sample pretreatment methods, CL signal enhancement and detection systems are discussed. A hydrodynamic pumping system is predominantly used for these applications. However, several reports are available in which electro-osmotic (EO) pumping has been implemented. Various sample pretreatment methods have been used, including liquid-liquid extraction, solid-phase extraction and molecularly imprinted polymers. A wide range of innovative techniques has been reported for CL signal enhancement. Most of these techniques are based on enhancement of the mixing process in the microfluidic channels, which leads to enhancement of the CL signal. However, other techniques are also reported, such as mirror reaction, liquid core waveguide, on-line pre-derivatization and the use of an opaque white chip with a thin transparent seal. Photodetectors are the most commonly used detectors; however, other detection systems have also been used, including integrated electrochemiluminescence (ECL) and organic photodiodes (OPDs). Copyright © 2012 John Wiley & Sons, Ltd.

  7. A methodology for automated CPA extraction using liver biopsy image analysis and machine learning techniques.

    PubMed

    Tsipouras, Markos G; Giannakeas, Nikolaos; Tzallas, Alexandros T; Tsianou, Zoe E; Manousou, Pinelopi; Hall, Andrew; Tsoulos, Ioannis; Tsianos, Epameinondas

    2017-03-01

    Collagen proportional area (CPA) extraction in liver biopsy images provides the degree of fibrosis expansion in liver tissue, which is the most characteristic histological alteration in hepatitis C virus (HCV) infection. Assessment of the fibrotic tissue is currently based on semiquantitative staging scores such as Ishak and Metavir. Since its introduction as a fibrotic tissue assessment technique, CPA calculation based on image analysis techniques has proven to be more accurate than semiquantitative scores. However, CPA has yet to reach everyday clinical practice, since the lack of standardized and robust methods for computerized image analysis for CPA assessment has proven to be a major limitation. The current work introduces a three-stage fully automated methodology for CPA extraction based on machine learning techniques. Specifically, clustering algorithms have been employed for background-tissue separation, as well as for fibrosis detection in liver tissue regions, in the first and the third stage of the methodology, respectively. Due to the existence of several types of tissue regions in the image (such as blood clots, muscle tissue, structural collagen, etc.), classification algorithms have been employed to identify liver tissue regions and exclude all other non-liver tissue regions from CPA computation. For the evaluation of the methodology, 79 liver biopsy images were employed, yielding a 1.31% mean absolute CPA error with a concordance correlation coefficient of 0.923. The proposed methodology is designed to (i) avoid manual threshold-based and region selection processes, widely used in similar approaches presented in the literature, and (ii) minimize CPA calculation time. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
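
    A toy sketch of the three-stage pipeline described above, using k-means for both the background-tissue separation and the fibrosis (collagen) detection; the classification stage that excludes non-liver regions is omitted, and the stain-colour heuristic is an assumption for illustration only.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    def collagen_proportional_area(rgb_image):
        """Toy CPA pipeline: (1) separate tissue from background, (2) [classifier for
        non-liver regions omitted], (3) separate fibrotic (collagen) pixels inside tissue."""
        pixels = rgb_image.reshape(-1, 3).astype(float)

        # Stage 1: 2-means on colour; the brighter cluster is taken as background.
        km_bg = KMeans(n_clusters=2, n_init=10, random_state=0).fit(pixels)
        bright = np.argmax(km_bg.cluster_centers_.mean(axis=1))
        tissue_mask = km_bg.labels_ != bright

        # Stage 3: 2-means on colour within tissue; the collagen cluster is assumed to be
        # the redder one for a picrosirius-type stain -- an illustrative heuristic only.
        tissue_px = pixels[tissue_mask]
        km_fib = KMeans(n_clusters=2, n_init=10, random_state=0).fit(tissue_px)
        redness = km_fib.cluster_centers_[:, 0] - km_fib.cluster_centers_[:, 1]
        collagen_pixels = np.sum(km_fib.labels_ == np.argmax(redness))

        return 100.0 * collagen_pixels / tissue_px.shape[0]   # CPA in percent
    ```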

  8. Development of a quantitative intracranial vascular features extraction tool on 3D MRA using semiautomated open-curve active contour vessel tracing.

    PubMed

    Chen, Li; Mossa-Basha, Mahmud; Balu, Niranjan; Canton, Gador; Sun, Jie; Pimentel, Kristi; Hatsukami, Thomas S; Hwang, Jenq-Neng; Yuan, Chun

    2018-06-01

    To develop a quantitative intracranial artery measurement technique to extract comprehensive artery features from time-of-flight MR angiography (MRA). By semiautomatically tracing arteries based on an open-curve active contour model in a graphical user interface, 12 basic morphometric features and 16 basic intensity features for each artery were identified. Arteries were then classified as one of 24 types using prediction from a probability model. Based on the anatomical structures, features were integrated within 34 vascular groups to give regional features of vascular trees. Eight 3D MRA acquisitions with intracranial atherosclerosis were assessed to validate this technique. Arterial tracings were validated by an experienced neuroradiologist who checked agreement at bifurcation and stenosis locations. This technique achieved 94% sensitivity and 85% positive predictive value (PPV) for bifurcations, and 85% sensitivity and PPV for stenoses. Up to 1,456 features, such as length, volume, and averaged signal intensity for each artery and vascular group, could be extracted from each of the MRA images to comprehensively reflect the characteristics, distribution, and connectivity of the arteries. The length of the M1 segment of the middle cerebral artery extracted by this technique was compared with reviewer-measured results, and the intraclass correlation coefficient was 0.97. A semiautomated quantitative method to trace, label, and measure intracranial arteries from 3D MRA was developed and validated. This technique can be used to facilitate quantitative intracranial vascular research, such as studying cerebrovascular adaptation to aging and disease conditions. Magn Reson Med 79:3229-3238, 2018. © 2017 International Society for Magnetic Resonance in Medicine.

  9. Robust breathing signal extraction from cone beam CT projections based on adaptive and global optimization techniques

    PubMed Central

    Chao, Ming; Wei, Jie; Li, Tianfang; Yuan, Yading; Rosenzweig, Kenneth E; Lo, Yeh-Chi

    2017-01-01

    We present a study of extracting respiratory signals from cone beam computed tomography (CBCT) projections within the framework of the Amsterdam Shroud (AS) technique. Acquired prior to the radiotherapy treatment, CBCT projections were preprocessed for contrast enhancement by converting the original intensity images to attenuation images, from which the AS image was created. An adaptive robust z-normalization filtering was applied to further augment the weak oscillating structures locally. From the enhanced AS image, the respiratory signal was extracted using a two-step optimization approach to effectively reveal the large-scale regularity of the breathing signals. CBCT projection images from five patients acquired with the Varian On-Board Imager on the Clinac iX System Linear Accelerator (Varian Medical Systems, Palo Alto, CA) were employed to assess the proposed technique. Stable breathing signals could be reliably extracted using the proposed algorithm. Reference waveforms obtained using an air bellows belt (Philips Medical Systems, Cleveland, OH) were exported and compared to the AS-based signals. The average error between the estimated breaths per minute (bpm) and the reference waveform bpm for the enrolled patients was as low as −0.07, with a standard deviation of 1.58. The new algorithm outperformed the original AS technique for all patients by 8.5% to 30%. The impact of gantry rotation on the breathing signal was assessed with data acquired with a Quasar phantom (Modus Medical Devices Inc., London, Canada) and found to be minimal with respect to the signal frequency. The new technique developed in this work will provide a practical solution to rendering markerless breathing signals using CBCT projections for thoracic and abdominal patients. PMID:27008349
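
    A minimal sketch of the Amsterdam Shroud construction and a crude respiratory surrogate, under the usual simplifications: each attenuation projection is collapsed laterally into one column, the columns are stacked over time, and each row is z-normalized. The two-step global optimization used in the paper is replaced here by a simple intensity-weighted centroid.

    ```python
    import numpy as np

    def amsterdam_shroud(projections):
        """projections: (n_frames, rows, cols) attenuation images.
        Collapse each projection laterally to one column and stack the columns over time."""
        shroud = projections.sum(axis=2).T          # shape: (rows, n_frames)
        # z-normalize each row over time to enhance weak oscillating structures
        mu = shroud.mean(axis=1, keepdims=True)
        sigma = shroud.std(axis=1, keepdims=True) + 1e-9
        return (shroud - mu) / sigma

    def breathing_trace(shroud):
        """Crude respiratory surrogate: intensity-weighted row centroid of each time column."""
        rows = np.arange(shroud.shape[0])[:, None]
        weights = shroud - shroud.min()
        return (weights * rows).sum(axis=0) / weights.sum(axis=0)
    ```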

  10. PREDOSE: A Semantic Web Platform for Drug Abuse Epidemiology using Social Media

    PubMed Central

    Cameron, Delroy; Smith, Gary A.; Daniulaityte, Raminta; Sheth, Amit P.; Dave, Drashti; Chen, Lu; Anand, Gaurish; Carlson, Robert; Watkins, Kera Z.; Falck, Russel

    2013-01-01

    Objectives The role of social media in biomedical knowledge mining, including clinical, medical and healthcare informatics, prescription drug abuse epidemiology and drug pharmacology, has become increasingly significant in recent years. Social media offers opportunities for people to share opinions and experiences freely in online communities, which may contribute information beyond the knowledge of domain professionals. This paper describes the development of a novel Semantic Web platform called PREDOSE (PREscription Drug abuse Online Surveillance and Epidemiology), which is designed to facilitate the epidemiologic study of prescription (and related) drug abuse practices using social media. PREDOSE uses web forum posts and domain knowledge, modeled in a manually created Drug Abuse Ontology (DAO) (pronounced dow), to facilitate the extraction of semantic information from User Generated Content (UGC). A combination of lexical, pattern-based and semantics-based techniques is used together with the domain knowledge to extract fine-grained semantic information from UGC. In a previous study, PREDOSE was used to obtain the datasets from which new knowledge in drug abuse research was derived. Here, we report on various platform enhancements, including an updated DAO, new components for relationship and triple extraction, and tools for content analysis, trend detection and emerging patterns exploration, which enhance the capabilities of the PREDOSE platform. Given these enhancements, PREDOSE is now more equipped to impact drug abuse research by alleviating traditional labor-intensive content analysis tasks. Methods Using custom web crawlers that scrape UGC from publicly available web forums, PREDOSE first automates the collection of web-based social media content for subsequent semantic annotation. The annotation scheme is modeled in the DAO, and includes domain specific knowledge such as prescription (and related) drugs, methods of preparation, side effects, routes of administration, etc. The DAO is also used to help recognize three types of data, namely: 1) entities, 2) relationships and 3) triples. PREDOSE then uses a combination of lexical and semantic-based techniques to extract entities and relationships from the scraped content, and a top-down approach for triple extraction that uses patterns expressed in the DAO. In addition, PREDOSE uses publicly available lexicons to identify initial sentiment expressions in text, and then a probabilistic optimization algorithm (from related research) to extract the final sentiment expressions. Together, these techniques enable the capture of fine-grained semantic information from UGC, and querying, search, trend analysis and overall content analysis of social media related to prescription drug abuse. Moreover, extracted data are also made available to domain experts for the creation of training and test sets for use in evaluation and refinements in information extraction techniques. Results A recent evaluation of the information extraction techniques applied in the PREDOSE platform indicates 85% precision and 72% recall in entity identification, on a manually created gold standard dataset. In another study, PREDOSE achieved 36% precision in relationship identification and 33% precision in triple extraction, through manual evaluation by domain experts. Given the complexity of the relationship and triple extraction tasks and the abstruse nature of social media texts, we interpret these as favorable initial results. 
Extracted semantic information is currently in use in an online discovery support system, by prescription drug abuse researchers at the Center for Interventions, Treatment and Addictions Research (CITAR) at Wright State University. Conclusion A comprehensive platform for entity, relationship, triple and sentiment extraction from such abstruse texts has never been developed for drug abuse research. PREDOSE has already demonstrated the importance of mining social media by providing data from which new findings in drug abuse research were uncovered. Given the recent platform enhancements, including the refined DAO, components for relationship and triple extraction, and tools for content, trend and emerging pattern analysis, it is expected that PREDOSE will play a significant role in advancing drug abuse epidemiology in future. PMID:23892295
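
    A schematic illustration (not PREDOSE's implementation) of how a lexical pass over forum text plus an ontology-style lexicon can yield entities and simple triples; the terms and the relation name below are hypothetical stand-ins for DAO content.

    ```python
    import re

    # Hypothetical mini-lexicon standing in for Drug Abuse Ontology (DAO) terms.
    LEXICON = {
        "buprenorphine": "prescription_drug",
        "suboxone": "prescription_drug",
        "snorting": "route_of_administration",
        "withdrawal": "side_effect",
    }

    def extract_entities(post):
        """Lexical pass: longest-match lookup of lexicon terms in a forum post."""
        found = []
        for term, etype in LEXICON.items():
            for m in re.finditer(r"\b" + re.escape(term) + r"\b", post.lower()):
                found.append({"term": term, "type": etype, "span": m.span()})
        return found

    def extract_triples(entities):
        """Pattern pass: pair a drug with a co-occurring route of administration."""
        drugs = [e for e in entities if e["type"] == "prescription_drug"]
        routes = [e for e in entities if e["type"] == "route_of_administration"]
        return [(d["term"], "administered_via", r["term"]) for d in drugs for r in routes]
    ```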

  11. WREP: A wavelet-based technique for extracting the red edge position from reflectance spectra for estimating leaf and canopy chlorophyll contents of cereal crops

    NASA Astrophysics Data System (ADS)

    Li, Dong; Cheng, Tao; Zhou, Kai; Zheng, Hengbiao; Yao, Xia; Tian, Yongchao; Zhu, Yan; Cao, Weixing

    2017-07-01

    Red edge position (REP), defined as the wavelength of the inflexion point in the red edge region (680-760 nm) of the reflectance spectrum, has been widely used to estimate foliar chlorophyll content from reflectance spectra. A number of techniques have been developed for REP extraction in the past three decades, but most of them require data-specific parameterization, and the consistency of their performance from leaf to canopy levels remains poorly understood. In this study, we propose a new technique (WREP) to extract REPs based on the application of the continuous wavelet transform to reflectance spectra. The REP is determined by the zero-crossing wavelength in the red edge region of a wavelet-transformed spectrum for a number of scales of wavelet decomposition. The new technique is simple to implement and requires no parameterization from the user as long as continuous wavelet transforms are applied to reflectance spectra. Its performance was evaluated for estimating leaf chlorophyll content (LCC) and canopy chlorophyll content (CCC) of cereal crops (i.e. rice and wheat) and compared with traditional techniques including linear interpolation, linear extrapolation, polynomial fitting and inverted Gaussian fitting. Our results demonstrated that WREP obtained the best estimation accuracy for both LCC and CCC compared to the traditional techniques. High scales of wavelet decomposition were favorable for the estimation of CCC and low scales for the estimation of LCC. The difference in optimal scale reveals the underlying mechanism of signature transfer from leaf to canopy levels. In addition, crop-specific models were required for the estimation of CCC over the full range. However, a common model could be built with the REPs extracted with Scale 5 of the WREP technique for wheat and rice crops when CCC was less than 2 g/m² (R² = 0.73, RMSE = 0.26 g/m²). This insensitivity of WREP to crop type indicates the potential for aerial mapping of chlorophyll content across growing seasons of cereal crops. The new REP extraction technique provides new insight for understanding the spectral changes in the red edge region in response to chlorophyll variation from leaf to canopy levels.
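
    A minimal sketch of the WREP idea, assuming a Ricker ("Mexican hat") wavelet and a single decomposition scale: transform the reflectance spectrum, then locate the zero-crossing of the wavelet coefficients between 680 and 760 nm. The wavelet choice, the scale, and the plain convolution used for the single-scale transform are assumptions for illustration.

    ```python
    import numpy as np

    def ricker(points, a):
        """Ricker ('Mexican hat') wavelet with width parameter a, sampled at `points` points."""
        x = np.arange(points) - (points - 1) / 2.0
        return (1 - (x / a) ** 2) * np.exp(-0.5 * (x / a) ** 2)

    def wrep_red_edge_position(wavelengths, reflectance, scale=5):
        """Estimate the red edge position (nm) as the zero-crossing of a single-scale
        continuous wavelet transform of the reflectance spectrum within 680-760 nm."""
        wavelet = ricker(min(10 * scale, len(reflectance)), scale)
        coeffs = np.convolve(reflectance, wavelet, mode="same")   # single-scale CWT
        band = (wavelengths >= 680) & (wavelengths <= 760)
        wl, c = wavelengths[band], coeffs[band]
        crossings = np.where(np.diff(np.sign(c)) != 0)[0]
        if crossings.size == 0:
            return np.nan                      # no zero-crossing found in the red edge
        i = crossings[0]
        # linear interpolation between the two samples bracketing the zero-crossing
        return wl[i] - c[i] * (wl[i + 1] - wl[i]) / (c[i + 1] - c[i])
    ```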

  12. Evaluating bis(2-ethylhexyl) methanediphosphonic acid (H 2DEH[MDP]) based polymer ligand film (PLF) for plutonium and uranium extraction

    DOE PAGES

    Rim, Jung H.; Armenta, Claudine E.; Gonzales, Edward R.; ...

    2015-09-12

    This paper describes a new analyte extraction medium called polymer ligand film (PLF) that was developed to rapidly extract radionuclides. PLF is a polymer medium with ligands incorporated in its matrix that selectively and quickly extracts analytes. The main focus of the new technique is to shorten and simplify the procedure for chemically isolating radionuclides for determination through alpha spectroscopy. The PLF system was effective for plutonium and uranium extraction. The PLF was capable of co-extracting or selectively extracting plutonium over uranium depending on the PLF composition. As a result, the PLF and electrodeposited samples had similar alpha spectra resolutions.

  13. Interrogating Bronchoalveolar Lavage Samples via Exclusion-Based Analyte Extraction.

    PubMed

    Tokar, Jacob J; Warrick, Jay W; Guckenberger, David J; Sperger, Jamie M; Lang, Joshua M; Ferguson, J Scott; Beebe, David J

    2017-06-01

    Although average survival rates for lung cancer have improved, earlier and better diagnosis remains a priority. One promising approach to assisting earlier and safer diagnosis of lung lesions is bronchoalveolar lavage (BAL), which provides a sample of lung tissue as well as proteins and immune cells from the vicinity of the lesion, yet diagnostic sensitivity remains a challenge. Reproducible isolation of lung epithelia and multianalyte extraction have the potential to improve diagnostic sensitivity and provide new information for developing personalized therapeutic approaches. We present the use of a recently developed exclusion-based, solid-phase-extraction technique called SLIDE (Sliding Lid for Immobilized Droplet Extraction) to facilitate analysis of BAL samples. We developed a SLIDE protocol for lung epithelial cell extraction and biomarker staining of patient BALs, testing both EpCAM and Trop2 as capture antigens. We characterized captured cells using TTF1 and p40 as immunostaining biomarkers of adenocarcinoma and squamous cell carcinoma, respectively. We achieved up to 90% (EpCAM) and 84% (Trop2) extraction efficiency of representative tumor cell lines. We then used the platform to process two patient BAL samples in parallel within the same sample plate to demonstrate feasibility and observed that Trop2-based extraction potentially extracts more target cells than EpCAM-based extraction.

  14. A comparison of two colorimetric assays, based upon Lowry and Bradford techniques, to estimate total protein in soil extracts.

    PubMed

    Redmile-Gordon, M A; Armenise, E; White, R P; Hirsch, P R; Goulding, K W T

    2013-12-01

    Soil extracts usually contain large quantities of dissolved humified organic material, typically reflected by a high polyphenolic content. Since polyphenols seriously confound the quantification of extracted protein, minimising this interference is important to ensure measurements are representative. Although the Bradford colorimetric assay is used routinely in soil science for rapid quantification of protein in soil extracts, it has several limitations. We therefore investigated an alternative colorimetric technique based on the Lowry assay (frequently used to measure protein and humic substances as distinct pools in microbial biofilms). The accuracies of the Bradford assay and a modified Lowry microplate method were compared in factorial combination. Protein was quantified in soil extracts (extracted with citrate), including standard additions of model protein (BSA) and polyphenol (Sigma H1675-2). Using the Lowry microplate assay described, no interfering effects of citrate were detected even at concentrations up to 5 times greater than are typically used to extract soil protein. Moreover, the Bradford assay was found to be highly susceptible to two simultaneous and confounding artefacts: 1) the colour development due to added protein was greatly inhibited by the polyphenol concentration, and 2) substantial colour development was caused directly by the polyphenol addition. In contrast, the Lowry method enabled a distinction between colour development of protein and non-protein origin, providing a more accurate quantitative analysis. These results suggest that the modified Lowry method is a more suitable measure of extracted protein (defined by standard equivalents) because it is less confounded by the high polyphenolic content that is so typical of soil extracts.

  15. The extraction and chromatographic determination of the essential oils from Ocimum basilicum L. by different techniques

    NASA Astrophysics Data System (ADS)

    Loredana Soran, Maria; Codruta Cobzac, Simona; Varodi, Codruta; Lung, Ildiko; Surducan, Emanoil; Surducan, Vasile

    2009-08-01

    Three different techniques (maceration, sonication and microwave-assisted extraction) were used for the extraction of essential oils from Ocimum basilicum L. The extracts were analyzed by TLC/HPTLC, and fingerprint information was obtained. GC-FID was used to characterize the extraction efficiency and to identify the terpenic bioactive compounds. The most efficient extraction technique was maceration, followed by microwave and ultrasound extraction. The best extraction solvent system was ethyl ether + ethanol (1:1, v/v). The main compounds identified in the Ocimum basilicum L. extracts were: α- and β-pinene (mixture), limonene, citronellol, and geraniol.

  16. Highly Effective DNA Extraction Method for Nuclear Short Tandem Repeat Testing of Skeletal Remains from Mass Graves

    PubMed Central

    Davoren, Jon; Vanek, Daniel; Konjhodzić, Rijad; Crews, John; Huffine, Edwin; Parsons, Thomas J.

    2007-01-01

    Aim To quantitatively compare a silica extraction method with a commonly used phenol/chloroform extraction method for DNA analysis of specimens exhumed from mass graves. Methods DNA was extracted from twenty randomly chosen femur samples using the International Commission on Missing Persons (ICMP) silica method, based on the Qiagen Blood Maxi Kit, and compared with DNA extracted by the standard phenol/chloroform-based method. The efficacy of the extraction methods was compared by real-time polymerase chain reaction (PCR), to measure DNA quantity and the presence of inhibitors, and by amplification with the PowerPlex 16 (PP16) multiplex nuclear short tandem repeat (STR) kit. Results DNA quantification showed that the silica-based method extracted on average 1.94 ng of DNA per gram of bone (range 0.25-9.58 ng/g), compared with only 0.68 ng/g extracted by the organic method (range 0.0016-4.4880 ng/g). Inhibition tests showed that, on average, there were significantly lower levels of PCR inhibitors in DNA isolated by the organic method. When amplified with PP16, all samples extracted by the silica-based method produced full 16-locus profiles, while only 75% of the DNA extracts obtained by the organic technique yielded 16-locus profiles. Conclusions The silica-based extraction method gave better results in nuclear STR typing from degraded bone samples than a commonly used phenol/chloroform method. PMID:17696302

  17. Methods for automatically analyzing humpback song units.

    PubMed

    Rickwood, Peter; Taylor, Andrew

    2008-03-01

    This paper presents mathematical techniques for automatically extracting and analyzing bioacoustic signals. Automatic techniques are described for isolation of target signals from background noise, extraction of features from target signals and unsupervised classification (clustering) of the target signals based on these features. The only user-provided input, other than raw sound, is an initial set of signal-processing and control parameters. Of particular note is that the number of signal categories is determined automatically. The techniques, applied to hydrophone recordings of humpback whales (Megaptera novaeangliae), produce promising initial results, suggesting that they may be of use in automated analysis not only of humpbacks, but possibly also in other bioacoustic settings where automated analysis is desirable.
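
    A toy version of such a pipeline: an energy-threshold noise gate isolates candidate units, two simple spectral features are computed per unit, and the number of unit categories is chosen automatically with a silhouette criterion. The feature set and thresholds are illustrative assumptions, not those used in the paper.

    ```python
    import numpy as np
    from scipy import signal
    from sklearn.cluster import KMeans
    from sklearn.metrics import silhouette_score

    def detect_units(audio, fs, frame=2048, hop=512, rel_threshold=4.0):
        """Return (start, end) sample indices of frames whose energy exceeds
        rel_threshold times the median frame energy (a crude noise gate)."""
        frames = np.lib.stride_tricks.sliding_window_view(audio, frame)[::hop]
        active = (frames ** 2).mean(axis=1) > rel_threshold * np.median((frames ** 2).mean(axis=1))
        starts = np.where(np.diff(active.astype(int)) == 1)[0] + 1
        ends = np.where(np.diff(active.astype(int)) == -1)[0] + 1
        return [(s * hop, e * hop + frame) for s, e in zip(starts, ends)]

    def unit_features(audio, fs, unit):
        """Two illustrative features per unit: duration and spectral centroid."""
        seg = audio[unit[0]:unit[1]]
        f, Pxx = signal.welch(seg, fs=fs, nperseg=min(1024, len(seg)))
        return [len(seg) / fs, (f * Pxx).sum() / Pxx.sum()]

    def cluster_units(features, k_max=8):
        """Pick the number of unit categories automatically via the silhouette score."""
        X = np.asarray(features)
        best = None
        for k in range(2, min(k_max, len(X) - 1) + 1):
            labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
            score = silhouette_score(X, labels)
            if best is None or score > best[0]:
                best = (score, k, labels)
        return best[1], best[2]   # chosen number of categories, cluster labels
    ```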

  18. Recognition of Similar Shaped Handwritten Marathi Characters Using Artificial Neural Network

    NASA Astrophysics Data System (ADS)

    Jane, Archana P.; Pund, Mukesh A.

    2012-03-01

    The growing need for handwritten Marathi character recognition in Indian offices such as passport and railway offices has made it a vital area of research. Similarly shaped characters are more prone to misclassification. In this paper a novel method is provided to recognize handwritten Marathi characters based on feature extraction and an adaptive smoothing technique. Feature selection methods avoid unnecessary patterns in an image, whereas the adaptive smoothing technique produces smooth character shapes. Combining both approaches leads to better results. Previous studies show that no single technique achieves 100% accuracy in the handwritten character recognition area. The proposed approach of combining adaptive smoothing and feature extraction gives better results (approximately 75-100%) and the expected outcomes.

  19. Space debris tracking based on fuzzy running Gaussian average adaptive particle filter track-before-detect algorithm

    NASA Astrophysics Data System (ADS)

    Torteeka, Peerapong; Gao, Peng-Qi; Shen, Ming; Guo, Xiao-Zhang; Yang, Da-Tao; Yu, Huan-Huan; Zhou, Wei-Ping; Zhao, You

    2017-02-01

    Although tracking with a passive optical telescope is a powerful technique for space debris observation, it is limited by its sensitivity to dynamic background noise. Traditionally, in the field of astronomy, static background subtraction based on a median image technique has been used to extract moving space objects prior to the tracking operation, as this is computationally efficient. The main disadvantage of this technique is that it is not robust to variable illumination conditions. In this article, we propose an approach for tracking small and dim space debris in the context of a dynamic background via one of the optical telescopes that is part of the space surveillance network project, named the Asia-Pacific ground-based Optical Space Observation System or APOSOS. The approach combines a fuzzy running Gaussian average for robust moving-object extraction with dim-target tracking using a particle-filter-based track-before-detect method. The performance of the proposed algorithm is experimentally evaluated, and the results show that the scheme achieves a satisfactory level of accuracy for space debris tracking.
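
    For illustration, a plain (non-fuzzy) per-pixel running Gaussian average background model of the kind the paper builds on; the fuzzy membership weighting and the particle-filter track-before-detect stage are not reproduced here, and the parameter values are assumptions.

    ```python
    import numpy as np

    class RunningGaussianBackground:
        """Per-pixel running Gaussian background model for moving-object extraction."""
        def __init__(self, first_frame, alpha=0.05, k=2.5):
            self.mean = first_frame.astype(float)
            self.var = np.full_like(self.mean, 25.0)   # initial per-pixel variance guess
            self.alpha, self.k = alpha, k

        def apply(self, frame):
            """Return a boolean foreground mask and update the background model."""
            frame = frame.astype(float)
            diff = frame - self.mean
            foreground = np.abs(diff) > self.k * np.sqrt(self.var)
            # update the model only where the pixel is judged to be background
            upd = ~foreground
            self.mean[upd] += self.alpha * diff[upd]
            self.var[upd] = (1 - self.alpha) * self.var[upd] + self.alpha * diff[upd] ** 2
            return foreground
    ```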

  20. Rapid non-enzymatic extraction method for isolating PCR-quality camelpox virus DNA from skin.

    PubMed

    Yousif, A Ausama; Al-Naeem, A Abdelmohsen; Al-Ali, M Ahmad

    2010-10-01

    Molecular diagnostic investigations of orthopoxvirus (OPV) infections are performed using a variety of clinical samples including skin lesions, tissues from internal organs, blood and secretions. Skin samples are particularly convenient for rapid diagnosis and molecular epidemiological investigations of camelpox virus (CMLV). Classical extraction procedures and commercial spin-column-based kits are time consuming, relatively expensive, and require multiple extraction and purification steps in addition to proteinase K digestion. A rapid non-enzymatic procedure for extracting CMLV DNA from dried scabs or pox lesions was developed to overcome some of the limitations of the available DNA extraction techniques. The procedure requires as little as 10 mg of tissue and produces highly purified DNA [OD(260)/OD(280) ratios between 1.47 and 1.79] with concentrations ranging from 6.5 to 16 µg/mL. The extracted CMLV DNA was proven suitable for virus-specific qualitative and semi-quantitative PCR applications. Compared to spin-column and conventional viral DNA extraction techniques, the two-step extraction procedure saves money and time, and retains the potential for automation without compromising CMLV PCR sensitivity. Copyright (c) 2010 Elsevier B.V. All rights reserved.

  1. Effect of three extraction techniques on submitochondrial particle and Microtox bioassays for airborne particulate matter.

    PubMed

    Torres-Pérez, Mónica I; Jiménez-Velez, Braulio D; Mansilla-Rivera, Imar; Rodríguez-Sierra, Carlos J

    2005-03-01

    The effect of three extraction techniques (Soxhlet, ultrasound and microwave-assisted extraction) on the toxicity of organic extracts, as measured by submitochondrial particle (SMP) and Microtox assays, was compared for three sources of airborne particulate matter (APM). The extraction technique influenced the toxicity response of the APM extracts, and this influence depended on the bioassay method and the APM sample source. APM extracts obtained by microwave-assisted extraction (MAE) were similar to or more toxic than those from the conventional extraction techniques of Soxhlet and ultrasound, thus providing an alternative extraction method. The microwave extraction technique has the advantages of using less solvent, requiring less extraction time, and being able to extract twelve samples simultaneously. The order of APM toxicity was generally urban dust > diesel dust > PM10 (particles with diameter < 10 µm), reflecting the different chemical compositions of the samples. This study is the first to report the suitability of two standard in-vitro bioassays for the future toxicological characterization of APM collected from Puerto Rico, with the SMP assay generally showing better sensitivity than the well-known Microtox bioassay.

  2. Adaptive video-based vehicle classification technique for monitoring traffic.

    DOT National Transportation Integrated Search

    2015-08-01

    This report presents a methodology for extracting two vehicle features, vehicle length and number of axles, in order to classify vehicles from video based on the Federal Highway Administration (FHWA)'s recommended vehicle classification scheme....

  3. Determination of parabens using two microextraction methods coupled with capillary liquid chromatography-UV detection.

    PubMed

    Chen, Chen-Wen; Hsu, Wen-Chan; Lu, Ya-Chen; Weng, Jing-Ru; Feng, Chia-Hsien

    2018-02-15

    Parabens are common preservatives and environmental hormones. As such, possible detrimental health effects could be amplified through their widespread use in foods, cosmetics, and pharmaceutical products. Thus, the determination of parabens in such products is of particular importance. This study explored vortex-assisted dispersive liquid-liquid microextraction techniques based on the solidification of a floating organic drop (VA-DLLME-SFO) and salt-assisted cloud point extraction (SA-CPE) for paraben extraction. Microanalysis was performed using a capillary liquid chromatography-ultraviolet detection system. These techniques were modified successfully to determine four parabens in 19 commercial products. The regression equations of these parabens exhibited good linearity (r² = 0.998, 0.1-10 μg/mL), good precision (RSD < 5%) and accuracy (RE < 5%), reduced reagent consumption and reaction times (< 6 min), and excellent sample versatility. VA-DLLME-SFO was also particularly convenient due to the use of a solidified extract. Thus, the VA-DLLME-SFO technique was better suited to the extraction of parabens from complex matrices. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Separation techniques for the clean-up of radioactive mixed waste for ICP-AES/ICP-MS analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swafford, A.M.; Keller, J.M.

    1993-03-17

    Two separation techniques were investigated for the clean-up of typical radioactive mixed waste samples requiring elemental analysis by Inductively Coupled Plasma-Atomic Emission Spectroscopy (ICP-AES) or Inductively Coupled Plasma-Mass Spectrometry (ICP-MS). These measurements frequently involve regulatory or compliance criteria which include the determination of elements on the EPA Target Analyte List (TAL). These samples usually consist of both an aqueous phase and a solid phase which is mostly an inorganic sludge. Frequently, samples taken from the waste tanks contain high levels of uranium and thorium, which can cause spectral interferences in ICP-AES or ICP-MS analysis. The removal of these interferences is necessary to determine the presence of the EPA TAL elements in the sample. Two clean-up methods were studied on simulated aqueous waste samples containing the EPA TAL elements. The first method studied was a classical procedure based upon liquid-liquid extraction using tri-n-octylphosphine oxide (TOPO) dissolved in cyclohexane. The second method investigated was based on more recently developed extraction chromatography techniques, specifically the use of a commercially available Eichrom TRU·Spec™ column. Literature on these two methods indicates the efficient removal of uranium and thorium from properly prepared samples and provides considerable qualitative information on the extraction behavior of many other elements. However, there is a lack of quantitative data on the extraction behavior of elements on the EPA Target Analyte List. Experimental studies on these two methods consisted of determining whether any of the analytes were extracted by these methods and the recoveries obtained. Both methods produced similar results; the EPA target analytes were only slightly extracted or not extracted at all. Advantages and disadvantages of each method were evaluated and found to be comparable.

  5. Evaluation of current techniques for isolation of chars as natural adsorbents

    USGS Publications Warehouse

    Chun, Y.; Sheng, G.; Chiou, C.T.

    2004-01-01

    Chars in soils or sediments may potentially influence the soil/sediment sorption behavior. Current techniques for the isolation of black carbon, including chars, often rely on acid demineralization, base extraction, and chemical oxidation to remove salts and minerals, humic acid, and refractory kerogen, respectively. Little is known about the potential effects of these chemical processes on the char surface and adsorptive properties. This study examined the effects of acid demineralization, base extraction, and acidic Cr₂O₇²⁻ oxidation on the surface areas, surface acidity, and benzene adsorption characteristics of laboratory-produced pinewood and wheat-residue chars, pure or mixed with soils, and a commercial activated carbon. Demineralization resulted in a small reduction in the char surface area, whereas base extraction showed no obvious effect. Neither demineralization nor base extraction caused an appreciable variation in benzene adsorption and, presumably, the char surface properties. By contrast, the Cr₂O₇²⁻ oxidation caused a >31% reduction in char surface area. The Boehm titration, supplemented by FTIR spectra, indicated that the surface acidity of oxidized chars increased by a factor of between 2.3 and 12 compared to non-oxidized chars. Benzene adsorption with the oxidized chars was lower than that with the non-oxidized chars by a factor of >8.9; both the decrease in char surface area and the increase in char surface acidity contributed to the reduction in char adsorptive power. Although the Cr₂O₇²⁻ oxidation effectively removes resistant kerogen, it is not well suited for the isolation of chars as contaminant adsorbents because of its destructive nature. Alternative nondestructive techniques that preserve the char surface properties and effectively remove kerogen must be sought.

  6. Neural network explanation using inversion.

    PubMed

    Saad, Emad W; Wunsch, Donald C

    2007-01-01

    An important drawback of many artificial neural networks (ANN) is their lack of explanation capability [Andrews, R., Diederich, J., & Tickle, A. B. (1996). A survey and critique of techniques for extracting rules from trained artificial neural networks. Knowledge-Based Systems, 8, 373-389]. This paper starts with a survey of algorithms which attempt to explain the ANN output. We then present HYPINV, a new explanation algorithm which relies on network inversion, i.e., calculating the ANN input which produces a desired output. HYPINV is a pedagogical algorithm that extracts rules in the form of hyperplanes. It is able to generate rules with arbitrarily desired fidelity, maintaining a fidelity-complexity tradeoff. To our knowledge, HYPINV is the only pedagogical rule extraction method that extracts hyperplane rules from continuous- or binary-attribute neural networks. Different network inversion techniques, involving gradient descent as well as an evolutionary algorithm, are presented. An information-theoretic treatment of rule extraction is presented. HYPINV is applied to example synthetic problems and to a real aerospace problem, and compared with similar algorithms using benchmark problems.
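
    A minimal sketch of the inversion step on which such methods rely: gradient descent on the input (with fixed weights) until the network output matches a desired target. A finite-difference gradient is used here so the sketch works with any black-box forward function; HYPINV's hyperplane-rule construction itself is not shown.

    ```python
    import numpy as np

    def invert_network(forward, x0, target, lr=0.1, steps=500, eps=1e-4):
        """Find an input x such that forward(x) ~= target, by numerical gradient
        descent on the squared output error (network weights stay fixed)."""
        x = x0.astype(float).copy()
        for _ in range(steps):
            grad = np.zeros_like(x)
            base = np.sum((forward(x) - target) ** 2)
            for i in range(x.size):             # finite-difference gradient w.r.t. the input
                xp = x.copy()
                xp.flat[i] += eps
                grad.flat[i] = (np.sum((forward(xp) - target) ** 2) - base) / eps
            x -= lr * grad
        return x
    ```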

  7. Evaluation of Airborne l- Band Multi-Baseline Pol-Insar for dem Extraction Beneath Forest Canopy

    NASA Astrophysics Data System (ADS)

    Li, W. M.; Chen, E. X.; Li, Z. Y.; Jiang, C.; Jia, Y.

    2018-04-01

    DEM beneath forest canopy is difficult to extract with optical stereo pairs, InSAR and Pol-InSAR techniques. Tomographic SAR (TomoSAR), based on different penetration depths and view angles, can reflect both the vertical structure and the ground structure. This paper aims at evaluating the possibility of using TomoSAR for underlying DEM extraction. Airborne L-band repeat-pass Pol-InSAR data collected in the BioSAR 2008 campaign were applied to reconstruct the 3D structure of the forest. The sum of Kronecker products and an algebraic synthesis algorithm were used to extract the ground structure, and a phase linking algorithm was applied to estimate the ground phase. The Goldstein branch-cut approach was then used to unwrap the phases and estimate the underlying DEM. The average difference between the extracted underlying DEM and the Lidar DEM is about 3.39 m in our test site. The result indicates that underlying DEM estimation is possible with the airborne L-band repeat-pass TomoSAR technique.

  8. A Wave Diagnostics in Geophysics: Algorithmic Extraction of Atmosphere Disturbance Modes

    NASA Astrophysics Data System (ADS)

    Leble, S.; Vereshchagin, S.

    2018-04-01

    The problem of diagnostics in geophysics is discussed and a proposal based on the dynamic projecting operators technique is formulated. The general exposition is demonstrated by an example of a symbolic algorithm for the wave and entropy modes in an exponentially stratified atmosphere. The novel technique is developed as a discrete version of the evolution operator and the corresponding projectors via the discrete Fourier transformation. Its explicit realization for directed modes in an exponential one-dimensional atmosphere is presented via the corresponding projection operators in discrete form, i.e., as matrices with a prescribed action on arrays formed from observation tables. A simulation based on oppositely directed (upward and downward) wave train solutions is performed and the extraction of the modes from a mixture is illustrated.

  9. HMMBinder: DNA-Binding Protein Prediction Using HMM Profile Based Features.

    PubMed

    Zaman, Rianon; Chowdhury, Shahana Yasmin; Rashid, Mahmood A; Sharma, Alok; Dehzangi, Abdollah; Shatabda, Swakkhar

    2017-01-01

    DNA-binding proteins often play an important role in various processes within the cell. Over the last decade, a wide range of classification algorithms and feature extraction techniques have been used to solve the problem of predicting them. In this paper, we propose a novel DNA-binding protein prediction method called HMMBinder. HMMBinder uses monogram and bigram features extracted from the HMM profiles of the protein sequences. To the best of our knowledge, this is the first application of HMM profile-based features to the DNA-binding protein prediction problem. We applied Support Vector Machines (SVM) as the classification technique in HMMBinder. Our method was tested on standard benchmark datasets. We experimentally show that our method outperforms the state-of-the-art methods found in the literature.
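
    A minimal sketch of monogram/bigram feature extraction from an HMM profile followed by an SVM, assuming the profile is an L x 20 matrix of emission probabilities; the profile layout, normalization and SVM settings are illustrative assumptions, not the paper's exact setup.

    ```python
    import numpy as np
    from sklearn.svm import SVC

    def hmm_profile_features(profile):
        """profile: (L, 20) HMM emission-probability matrix for a protein of length L.
        Monogram: column means (20 values); bigram: normalized sum of outer products of
        consecutive rows, flattened (400 values)."""
        L = profile.shape[0]
        monogram = profile.mean(axis=0)
        bigram = sum(np.outer(profile[i], profile[i + 1]) for i in range(L - 1)) / max(L - 1, 1)
        return np.concatenate([monogram, np.asarray(bigram).ravel()])   # 420-dimensional vector

    def train_hmmbinder_like(profiles, labels):
        """Illustrative training on assumed pre-computed profiles and labels (1 = DNA-binding)."""
        X = np.vstack([hmm_profile_features(p) for p in profiles])
        return SVC(kernel="rbf", C=1.0, gamma="scale").fit(X, labels)
    ```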

  10. PREDOSE: a semantic web platform for drug abuse epidemiology using social media.

    PubMed

    Cameron, Delroy; Smith, Gary A; Daniulaityte, Raminta; Sheth, Amit P; Dave, Drashti; Chen, Lu; Anand, Gaurish; Carlson, Robert; Watkins, Kera Z; Falck, Russel

    2013-12-01

    The role of social media in biomedical knowledge mining, including clinical, medical and healthcare informatics, prescription drug abuse epidemiology and drug pharmacology, has become increasingly significant in recent years. Social media offers opportunities for people to share opinions and experiences freely in online communities, which may contribute information beyond the knowledge of domain professionals. This paper describes the development of a novel semantic web platform called PREDOSE (PREscription Drug abuse Online Surveillance and Epidemiology), which is designed to facilitate the epidemiologic study of prescription (and related) drug abuse practices using social media. PREDOSE uses web forum posts and domain knowledge, modeled in a manually created Drug Abuse Ontology (DAO--pronounced dow), to facilitate the extraction of semantic information from User Generated Content (UGC), through combination of lexical, pattern-based and semantics-based techniques. In a previous study, PREDOSE was used to obtain the datasets from which new knowledge in drug abuse research was derived. Here, we report on various platform enhancements, including an updated DAO, new components for relationship and triple extraction, and tools for content analysis, trend detection and emerging patterns exploration, which enhance the capabilities of the PREDOSE platform. Given these enhancements, PREDOSE is now more equipped to impact drug abuse research by alleviating traditional labor-intensive content analysis tasks. Using custom web crawlers that scrape UGC from publicly available web forums, PREDOSE first automates the collection of web-based social media content for subsequent semantic annotation. The annotation scheme is modeled in the DAO, and includes domain specific knowledge such as prescription (and related) drugs, methods of preparation, side effects, and routes of administration. The DAO is also used to help recognize three types of data, namely: (1) entities, (2) relationships and (3) triples. PREDOSE then uses a combination of lexical and semantic-based techniques to extract entities and relationships from the scraped content, and a top-down approach for triple extraction that uses patterns expressed in the DAO. In addition, PREDOSE uses publicly available lexicons to identify initial sentiment expressions in text, and then a probabilistic optimization algorithm (from related research) to extract the final sentiment expressions. Together, these techniques enable the capture of fine-grained semantic information, which facilitate search, trend analysis and overall content analysis using social media on prescription drug abuse. Moreover, extracted data are also made available to domain experts for the creation of training and test sets for use in evaluation and refinements in information extraction techniques. A recent evaluation of the information extraction techniques applied in the PREDOSE platform indicates 85% precision and 72% recall in entity identification, on a manually created gold standard dataset. In another study, PREDOSE achieved 36% precision in relationship identification and 33% precision in triple extraction, through manual evaluation by domain experts. Given the complexity of the relationship and triple extraction tasks and the abstruse nature of social media texts, we interpret these as favorable initial results. 
Extracted semantic information is currently in use in an online discovery support system, by prescription drug abuse researchers at the Center for Interventions, Treatment and Addictions Research (CITAR) at Wright State University. A comprehensive platform for entity, relationship, triple and sentiment extraction from such abstruse texts has never been developed for drug abuse research. PREDOSE has already demonstrated the importance of mining social media by providing data from which new findings in drug abuse research were uncovered. Given the recent platform enhancements, including the refined DAO, components for relationship and triple extraction, and tools for content, trend and emerging pattern analysis, it is expected that PREDOSE will play a significant role in advancing drug abuse epidemiology in future. Copyright © 2013 Elsevier Inc. All rights reserved.

  11. Introduction to the JASIST Special Topic Issue on Web Retrieval and Mining: A Machine Learning Perspective.

    ERIC Educational Resources Information Center

    Chen, Hsinchun

    2003-01-01

    Discusses information retrieval techniques used on the World Wide Web. Topics include machine learning in information extraction; relevance feedback; information filtering and recommendation; text classification and text clustering; Web mining, based on data mining techniques; hyperlink structure; and Web size. (LRW)

  12. Rapid Nucleic Acid Extraction and Purification Using a Miniature Ultrasonic Technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Branch, Darren W.; Vreeland, Erika C.; McClain, Jamie L.

    Miniature ultrasonic lysis for biological sample preparation is a promising technique for efficient and rapid extraction of nucleic acids and proteins from a wide variety of biological sources. Acoustic methods achieve rapid, unbiased, and efficacious disruption of cellular membranes while avoiding the use of harsh chemicals and enzymes, which interfere with detection assays. In this work, a miniature acoustic nucleic acid extraction system is presented. Using a miniature bulk acoustic wave (BAW) transducer array based on 36° Y-cut lithium niobate, acoustic waves were coupled into disposable laminate-based microfluidic cartridges. To verify the lysing effectiveness, the amount of liberated ATP and the cell viability were measured and compared to untreated samples. The relationship between input power, energy dose, flow-rate, and lysing efficiency was determined. DNA was purified on-chip using three approaches implemented in the cartridges: a silica-based sol-gel silica-bead filled microchannel, nucleic acid binding magnetic beads, and Nafion-coated electrodes. Using E. coli, the lysing dose defined as ATP released per joule was 2.2× greater, releasing 6.1× more ATP for the miniature BAW array compared to a bench-top acoustic lysis system. An electric field-based nucleic acid purification approach using Nafion films yielded an extraction efficiency of 69.2% in 10 min for 50 µL samples.

  13. Rapid Nucleic Acid Extraction and Purification Using a Miniature Ultrasonic Technique

    DOE PAGES

    Branch, Darren W.; Vreeland, Erika C.; McClain, Jamie L.; ...

    2017-07-21

    Miniature ultrasonic lysis for biological sample preparation is a promising technique for efficient and rapid extraction of nucleic acids and proteins from a wide variety of biological sources. Acoustic methods achieve rapid, unbiased, and efficacious disruption of cellular membranes while avoiding the use of harsh chemicals and enzymes, which interfere with detection assays. In this work, a miniature acoustic nucleic acid extraction system is presented. Using a miniature bulk acoustic wave (BAW) transducer array based on 36° Y-cut lithium niobate, acoustic waves were coupled into disposable laminate-based microfluidic cartridges. To verify the lysing effectiveness, the amount of liberated ATP and the cell viability were measured and compared to untreated samples. The relationship between input power, energy dose, flow-rate, and lysing efficiency was determined. DNA was purified on-chip using three approaches implemented in the cartridges: a silica-based sol-gel silica-bead filled microchannel, nucleic acid binding magnetic beads, and Nafion-coated electrodes. Using E. coli, the lysing dose defined as ATP released per joule was 2.2× greater, releasing 6.1× more ATP for the miniature BAW array compared to a bench-top acoustic lysis system. An electric field-based nucleic acid purification approach using Nafion films yielded an extraction efficiency of 69.2% in 10 min for 50 µL samples.

  14. Combining Natural Language Processing and Statistical Text Mining: A Study of Specialized versus Common Languages

    ERIC Educational Resources Information Center

    Jarman, Jay

    2011-01-01

    This dissertation focuses on developing and evaluating hybrid approaches for analyzing free-form text in the medical domain. This research draws on natural language processing (NLP) techniques that are used to parse and extract concepts based on a controlled vocabulary. Once important concepts are extracted, additional machine learning algorithms,…

  15. Profiling of poorly stratified smoky atmospheres with scanning lidar

    Treesearch

    Vladimir Kovalev; Cyle Wold; Alexander Petkov; Wei Min Hao

    2012-01-01

    A multiangle data-processing technique is considered that is based on using the signal measured at zenith (or close to zenith) as the core source for extracting information about the vertical atmospheric aerosol loading. The multiangle signals are used as auxiliary data to extract the vertical transmittance profile from the zenith signal. Simulated and experimental...

  16. Text Detection, Tracking and Recognition in Video: A Comprehensive Survey.

    PubMed

    Yin, Xu-Cheng; Zuo, Ze-Yu; Tian, Shu; Liu, Cheng-Lin

    2016-04-14

    Intelligent analysis of video data is currently in wide demand because video is a major source of sensory data in our lives. Text is a prominent and direct source of information in video, while recent surveys of text detection and recognition in imagery [1], [2] focus mainly on text extraction from scene images. This paper presents a comprehensive survey of text detection, tracking and recognition in video with three major contributions. First, a generic framework is proposed for video text extraction that uniformly describes detection, tracking, recognition, and their relations and interactions. Second, within this framework, a variety of methods, systems and evaluation protocols of video text extraction are summarized, compared, and analyzed. Existing text tracking techniques, tracking-based detection, and recognition techniques are specifically highlighted. Third, related applications, prominent challenges, and future directions for video text extraction (especially from scene videos and web videos) are also thoroughly discussed.

  17. Ionic-liquid-based ultrasound/microwave-assisted extraction of 2,4-dihydroxy-7-methoxy-1,4-benzoxazin-3-one and 6-methoxy-benzoxazolin-2-one from maize (Zea mays L.) seedlings.

    PubMed

    Li, Chunying; Lu, Zhicheng; Zhao, Chunjian; Yang, Lei; Fu, Yujie; Shi, Kunming; He, Xin; Li, Zhao; Zu, Yuangang

    2015-01-01

    We evaluated an ionic-liquid-based ultrasound/microwave-assisted extraction method for the extraction of 2,4-dihydroxy-7-methoxy-1,4-benzoxazin-3-one and 6-methoxy-benzoxazolin-2-one from etiolated maize seedlings. We performed single-factor and central composite rotatable design experiments to optimize the most important parameters influencing this technique. The best results were obtained using 1.00 M 1-octyl-3-methylimidazolium bromide as the extraction solvent, a 50°C extraction temperature, a 20:1 liquid/solid ratio (mL/g), a 21 min treatment time, 590 W microwave power, and 50 W fixed ultrasonic power. We performed a comparison between ionic-liquid-based ultrasound/microwave-assisted extraction and conventional homogenized extraction. Extraction yields of 2,4-dihydroxy-7-methoxy-1,4-benzoxazin-3-one and 6-methoxy-benzoxazolin-2-one by the ionic-liquid-based ultrasound/microwave-assisted extraction method were 1.392 ± 0.051 and 0.205 ± 0.008 mg/g, respectively, which were correspondingly 1.46- and 1.32-fold higher than those obtained by conventional homogenized extraction. All the results show that the ionic-liquid-based ultrasound/microwave-assisted extraction method is therefore an efficient and credible method for the extraction of 2,4-dihydroxy-7-methoxy-1,4-benzoxazin-3-one and 6-methoxy-benzoxazolin-2-one from maize seedlings. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Parallel Key Frame Extraction for Surveillance Video Service in a Smart City.

    PubMed

    Zheng, Ran; Yao, Chuanwei; Jin, Hai; Zhu, Lei; Zhang, Qin; Deng, Wei

    2015-01-01

    Surveillance video service (SVS) is one of the most important services provided in a smart city. Efficient surveillance video analysis techniques are essential for making full use of SVS, and key frame extraction is a simple yet effective technique to achieve this goal. In surveillance video applications, key frames are typically used to summarize important video content, so it is essential to extract key frames accurately and efficiently. A novel approach is proposed to extract key frames from traffic surveillance videos based on GPUs (graphics processing units) to ensure high efficiency and accuracy. For the determination of key frames, motion is a more salient feature in presenting actions or events, especially in surveillance videos. The motion feature is extracted on the GPU to reduce running time. It is also smoothed to reduce noise, and the frames with local maxima of motion information are selected as the final key frames. The experimental results show that this approach can extract key frames more accurately and efficiently compared with several other methods.
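
    A minimal CPU-only sketch of the motion-based key frame selection idea described above (the paper's GPU implementation and exact motion feature are not reproduced): frame differencing gives a motion curve, which is smoothed and whose local maxima are taken as key frames. The synthetic clip and window size are assumptions.

```python
import numpy as np

def extract_key_frames(frames, window=5):
    """Pick key frames at local maxima of a smoothed frame-difference motion curve.

    frames: array of shape (n_frames, height, width), grayscale.
    Returns indices of selected key frames.
    """
    frames = frames.astype(np.float32)
    # Motion feature: mean absolute difference between consecutive frames.
    motion = np.abs(np.diff(frames, axis=0)).mean(axis=(1, 2))
    # Smooth with a moving average to reduce noise.
    kernel = np.ones(window) / window
    smoothed = np.convolve(motion, kernel, mode="same")
    # Local maxima of the smoothed motion curve are taken as key frames.
    return [i + 1 for i in range(1, len(smoothed) - 1)
            if smoothed[i] > smoothed[i - 1] and smoothed[i] >= smoothed[i + 1]]

# Synthetic demo: 60 noisy frames with a burst of motion around frame 30.
rng = np.random.default_rng(0)
clip = rng.integers(0, 20, size=(60, 48, 64)).astype(np.float32)
clip[30:33] += 100  # simulate a sudden event
print(extract_key_frames(clip))
```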

  19. Validation assessment of shoreline extraction on medium resolution satellite image

    NASA Astrophysics Data System (ADS)

    Manaf, Syaifulnizam Abd; Mustapha, Norwati; Sulaiman, Md Nasir; Husin, Nor Azura; Shafri, Helmi Zulhaidi Mohd

    2017-10-01

    Monitoring coastal zones provides information about their condition, such as erosion or accretion. Moreover, monitoring the shorelines can help measure the severity of such conditions. Such measurement can be performed accurately by using Earth observation satellite images rather than traditional ground surveys. To date, shorelines can be extracted from satellite images with a high degree of accuracy by using satellite image classification techniques based on machine learning to identify the land and water classes of the shorelines. In this study, the shorelines extracted by 11 classifiers were validated against a reference shoreline provided by the local authority. Specifically, the validation assessment was performed to examine the difference between the extracted shorelines and the reference shoreline. The research findings showed that the linear SVM was the most effective image classification technique, as evidenced by the lowest mean distance between the extracted shoreline and the reference shoreline. Furthermore, the findings showed that the accuracy of the extracted shoreline was not directly proportional to the accuracy of the image classification.
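
    One simple way to quantify the validation step described above is the mean distance from extracted shoreline vertices to the nearest reference vertices; the sketch below, with hypothetical coordinates, assumes this nearest-vertex metric, which may differ from the exact distance measure used in the study.

```python
import numpy as np
from scipy.spatial import cKDTree

def mean_shoreline_distance(extracted_xy, reference_xy):
    """Mean distance from each extracted shoreline vertex to the nearest
    reference shoreline vertex (a simple proxy for positional accuracy)."""
    tree = cKDTree(reference_xy)
    distances, _ = tree.query(extracted_xy)
    return distances.mean()

# Toy example with hypothetical easting/northing vertices (metres).
reference = np.array([[x, 100.0] for x in range(0, 500, 10)])
extracted = reference + np.random.default_rng(1).normal(0, 5, reference.shape)
print(f"mean offset: {mean_shoreline_distance(extracted, reference):.2f} m")
```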

  20. Extending metabolome coverage for untargeted metabolite profiling of adherent cultured hepatic cells.

    PubMed

    García-Cañaveras, Juan Carlos; López, Silvia; Castell, José Vicente; Donato, M Teresa; Lahoz, Agustín

    2016-02-01

    MS-based metabolite profiling of adherent mammalian cells comprises several challenging steps such as metabolism quenching, cell detachment, cell disruption, metabolome extraction, and metabolite measurement. In LC-MS, the final metabolome coverage is strongly determined by the separation technique and the MS conditions used. The human liver-derived cell line HepG2 was chosen as the adherent mammalian cell model to evaluate the performance of several commonly used procedures in both sample processing and LC-MS analysis. In a first phase, metabolite extraction and sample analysis were optimized in a combined manner. To this end, the extraction abilities of five different solvents (or combinations) were assessed by comparing the number and the levels of the metabolites contained in each extract. Three different chromatographic methods were selected for metabolite separation: a HILIC-based method designed to separate polar metabolites, and two RP-based methods focused on lipidome and wide-ranging metabolite detection, respectively. With regard to metabolite measurement, a Q-ToF instrument operating in both ESI (+) and ESI (-) was used for unbiased extract analysis. Once metabolite extraction and analysis conditions were set up, the influence of cell harvesting on metabolome coverage was also evaluated. To this end, different protocols for cell detachment (trypsinization or scraping) and metabolism quenching were compared. This study confirmed the inconvenience of trypsinization as a harvesting technique, and the importance of using complementary extraction solvents to extend metabolome coverage, minimizing interferences and maximizing detection, thanks to the use of dedicated analytical conditions through the combination of HILIC and RP separations. The proposed workflow allowed the detection of over 300 identified metabolites from highly polar compounds to a wide range of lipids.

  1. Validated determination of losartan and valsartan in human plasma by stir bar sorptive extraction based on acrylate monolithic polymer, liquid chromatographic analysis and experimental design methodology.

    PubMed

    Babarahimi, Vida; Talebpour, Zahra; Haghighi, Farideh; Adib, Nuoshin; Vahidi, Hamed

    2018-05-10

    In our previous work, a new monolithic coating based on vinylpyrrolidone-ethylene glycol dimethacrylate polymer was introduced for stir bar sorptive extraction. The formulation of the prepared vinylpyrrolidone-ethylene glycol dimethacrylate monolithic polymer was optimized and the satisfactory quality of the prepared coated stir bar was demonstrated. In this work, the prepared stir bar was utilized in combination with ultrasound-assisted liquid desorption, followed by high-performance liquid chromatography with ultraviolet detection for the simultaneous determination of losartan (LOS) and valsartan (VAS) in human plasma samples. In a comparison study, the prepared stir bar exhibited much higher extraction efficiency than two commercial stir bars (polydimethylsiloxane and polyacrylate) for both target compounds. In order to improve the desorption efficiency of LOS and VAS, the best values of the parameters affecting the desorption step were selected systematically. The parameters affecting the extraction step were optimized using a Box-Behnken design. Under the optimum conditions, the analytical performance of the proposed method displayed excellent linear dynamic ranges for LOS (24-1000 ng mL-1) and VAS (91-1000 ng mL-1), with correlation coefficients of 0.9998 and 0.9971 and detection limits of 7 and 27 ng mL-1, respectively. The intra- and inter-day recovery ranged from 98 to 117%, and the relative standard deviations were less than 8%. Finally, the proposed technique was successfully applied to the analysis of LOS and VAS at their therapeutic levels in a volunteer patient plasma sample. The obtained results were confirmed using liquid chromatography-mass spectrometry. The proposed technique was more rapid than previously reported stir bar sorptive extraction techniques based on monolithic coatings, and exhibited lower detection limits in comparison with similar methods for the determination of LOS and VAS in biological fluids. The results demonstrated that the lower selectivity of UV detection, in comparison with MS detection, was compensated for by appropriate sample preparation through the proposed extraction method, which eliminates as many interfering compounds as possible. Copyright © 2018 Elsevier B.V. All rights reserved.

  2. The research of edge extraction and target recognition based on inherent feature of objects

    NASA Astrophysics Data System (ADS)

    Xie, Yu-chan; Lin, Yu-chi; Huang, Yin-guo

    2008-03-01

    Current research on computer vision often needs specific techniques for particular problems. Little use has been made of high-level aspects of computer vision, such as three-dimensional (3D) object recognition, that are appropriate for large classes of problems and situations. In particular, high-level vision often focuses mainly on the extraction of symbolic descriptions, and pays little attention to the speed of processing. In order to extract and recognize targets intelligently and rapidly, in this paper we developed a new 3D target recognition method based on the inherent features of objects, in which a cuboid was taken as the model. On the basis of an analysis of the cuboid's natural contour and grey-level distribution characteristics, an overall fuzzy evaluation technique was utilized to recognize and segment the target. The Hough transform was then used to extract and match the model's main edges, and the target edges were finally reconstructed using stereo techniques. There are three major contributions in this paper. Firstly, the corresponding relations between the parameters of the cuboid model's straight edge lines in the image domain and in the transform domain were summarized. Using these relations, needless computation and searching in the Hough transform processing can be greatly reduced and the efficiency improved. Secondly, since prior knowledge about the geometry of the cuboid's contour is already available, the intersections of the extracted component edges are taken, and the geometry of candidate edge matches is assessed based on these intersections rather than on the extracted edges themselves. The outlines are therefore enhanced and the noise is suppressed. Finally, a 3-D target recognition method is proposed. Compared with other recognition methods, this new method has a quick response time and can be achieved with high-level computer vision. The method presented here can be widely used in vision-guided techniques to strengthen their intelligence and generalization, and can also play an important role in object tracking, port AGVs, and robotics. The results of simulation experiments and theoretical analysis demonstrate that the proposed method can suppress noise effectively, extract target edges robustly, and meet real-time needs. Theoretical analysis and experiments show the method is reasonable and efficient.
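
    As a minimal, hypothetical sketch of the edge-extraction stage described above (not the full 3D recognition pipeline), the snippet below applies Canny edge detection and the probabilistic Hough transform to a synthetic image of a single rectangular face; all parameter values are illustrative.

```python
import cv2
import numpy as np

# Synthetic grey-level image containing one rectangular face of a box-like object.
img = np.zeros((200, 300), dtype=np.uint8)
cv2.rectangle(img, (60, 50), (240, 160), 255, 2)

# Edge map, then probabilistic Hough transform to extract straight edge segments.
edges = cv2.Canny(img, 50, 150)
lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=60,
                        minLineLength=40, maxLineGap=5)

if lines is not None:
    for x1, y1, x2, y2 in lines[:, 0]:
        angle = np.degrees(np.arctan2(y2 - y1, x2 - x1))
        print(f"segment ({x1},{y1})-({x2},{y2}), angle {angle:.1f} deg")
```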

  3. Evaluation of needle trap micro-extraction and solid-phase micro-extraction: Obtaining comprehensive information on volatile emissions from in vitro cultures.

    PubMed

    Oertel, Peter; Bergmann, Andreas; Fischer, Sina; Trefz, Phillip; Küntzel, Anne; Reinhold, Petra; Köhler, Heike; Schubert, Jochen K; Miekisch, Wolfram

    2018-05-14

    Volatile organic compounds (VOCs) emitted from in vitro cultures may reveal information on species and metabolism. Owing to low nmol L-1 concentration ranges, pre-concentration techniques are required for gas chromatography-mass spectrometry (GC-MS) based analyses. This study was intended to compare the efficiency of established micro-extraction techniques - solid-phase micro-extraction (SPME) and needle-trap micro-extraction (NTME) - for the analysis of complex VOC patterns. For SPME, a 75 μm Carboxen®/polydimethylsiloxane fiber was used. The NTME needle was packed with divinylbenzene, Carbopack X and Carboxen 1000. The headspace was sampled bi-directionally. Seventy-two VOCs were calibrated by reference standard mixtures in the range of 0.041-62.24 nmol L-1 by means of GC-MS. Both pre-concentration methods were applied to profile VOCs from cultures of Mycobacterium avium ssp. paratuberculosis. Limits of detection ranged from 0.004 to 3.93 nmol L-1 (median = 0.030 nmol L-1) for NTME and from 0.001 to 5.684 nmol L-1 (median = 0.043 nmol L-1) for SPME. NTME showed advantages in assessing polar compounds such as alcohols. SPME showed advantages in reproducibility but disadvantages in sensitivity for N-containing compounds. Micro-extraction techniques such as SPME and NTME are well suited for trace VOC profiling over cultures if the limitations of each technique are taken into account. Copyright © 2018 John Wiley & Sons, Ltd.

  4. Impact of JPEG2000 compression on endmember extraction and unmixing of remotely sensed hyperspectral data

    NASA Astrophysics Data System (ADS)

    Martin, Gabriel; Gonzalez-Ruiz, Vicente; Plaza, Antonio; Ortiz, Juan P.; Garcia, Inmaculada

    2010-07-01

    Lossy hyperspectral image compression has received considerable interest in recent years due to the extremely high dimensionality of the data. However, the impact of lossy compression on spectral unmixing techniques has not been widely studied. These techniques characterize mixed pixels (resulting from insufficient spatial resolution) in terms of a suitable combination of spectrally pure substances (called endmembers) weighted by their estimated fractional abundances. This paper focuses on the impact of JPEG2000-based lossy compression of hyperspectral images on the quality of the endmembers extracted by different algorithms. The three considered algorithms are the orthogonal subspace projection (OSP), which uses only spectral information, and the automatic morphological endmember extraction (AMEE) and spatial spectral endmember extraction (SSEE), which integrate both spatial and spectral information in the search for endmembers. The impact of compression on the resulting abundance estimation based on the endmembers derived by the different methods is also evaluated. Experiments are conducted using a hyperspectral data set collected by the NASA Jet Propulsion Laboratory over the Cuprite mining district in Nevada. The experimental results are quantitatively analyzed using reference information available from the U.S. Geological Survey, resulting in recommendations to specialists interested in applying endmember extraction and unmixing algorithms to compressed hyperspectral data.

  5. An ultrasensitive chemiluminescence immunoassay of chloramphenicol based on gold nanoparticles and magnetic beads.

    PubMed

    Tao, Xiaoqi; Jiang, Haiyang; Yu, Xuezhi; Zhu, Jinghui; Wang, Xia; Wang, Zhanhui; Niu, Lanlan; Wu, Xiaoping; Shen, Jianzhong

    2013-05-01

    A competitive, direct, chemiluminescent immunoassay based on a magnetic bead (MB) separation and gold nanoparticle (AuNP) labelling technique to detect chloramphenicol (CAP) has been developed. Horseradish peroxidase (HRP)-labelled anti-CAP monoclonal antibody conjugated with AuNPs and antigen-immobilized MBs were prepared. After optimization of the immunocomplex MB parameters, the IC50 values of the chemiluminescence magnetic nanoparticle immunoassay (CL-MBs-nano-immunoassay) were 0.017 µg L(-1) for extraction method I and 0.17 µg L(-1) for extraction method II. The immunoassay with the two extraction methods was applied to detect CAP in milk. Comparison of the two extraction methods showed that extraction method I offered better sensitivity, about 10 times higher than that of extraction method II, while extraction method II was simpler to operate and suitable for high-throughput screening. The recoveries were 86.7-98.0% (extraction method I) and 80.0-103.0% (extraction method II), and the coefficients of variation (CVs) were all <15%. The satisfactory recoveries with both extraction methods and the high correlation with a traditional ELISA kit in the milk system confirmed that the AuNP-based immunomagnetic assay has promising potential for rapid field screening in trace CAP analysis. Copyright © 2013 John Wiley & Sons, Ltd.

  6. RNA extraction from self-assembling peptide hydrogels to allow qPCR analysis of encapsulated cells.

    PubMed

    Burgess, Kyle A; Workman, Victoria L; Elsawy, Mohamed A; Miller, Aline F; Oceandy, Delvac; Saiani, Alberto

    2018-01-01

    Self-assembling peptide hydrogels offer a novel 3-dimensional platform for many applications in cell culture and tissue engineering but are not compatible with current methods of RNA isolation, owing to interactions between RNA and the biomaterial. This study investigates two techniques, based on two different extraction principles (solution-based extraction and direct solid-state binding of RNA), for extracting RNA from cells encapsulated in four β-sheet-forming self-assembling peptide hydrogels with varying net positive charge. RNA-peptide fibril interactions, rather than RNA-peptide molecular complexing, were found to interfere with the extraction process, resulting in low yields. A column-based approach relying on RNA-specific binding was shown to be more suited to extracting RNA with higher purity from these peptide hydrogels, owing to its reliance on strong specific RNA binding interactions which compete directly with RNA-peptide fibril interactions. To reduce the amount of fibrils present and improve RNA yields, a broad-spectrum enzyme solution (pronase) was used to partially digest the hydrogels before RNA extraction. This pre-treatment was shown to significantly increase the yield of RNA extracted, allowing downstream RT-qPCR to be performed.

  7. Monolithic methacrylate packed 96-tips for high throughput bioanalysis.

    PubMed

    Altun, Zeki; Skoglund, Christina; Abdel-Rehim, Mohamed

    2010-04-16

    In the pharmaceutical industry the growing number of samples to be analyzed requires high-throughput and fully automated analytical techniques. Commonly used sample-preparation methods are solid-phase extraction (SPE), liquid-liquid extraction (LLE) and protein precipitation. In this paper we discuss a new SPE-based sample-preparation technique for high-throughput drug extraction developed and used by our group. This new sample-preparation method is based on a monolithic methacrylate polymer as the packing sorbent for a 96-tip robotic device. Using this device a 96-well plate could be handled in 2-4 min. The key aspect of the monolithic phase is that monolithic material can offer both good binding capacity and low back-pressure properties compared to, e.g., silica phases. The present paper demonstrates the successful application of monolithic 96-tips and LC-MS/MS to the sample preparation of busulphan, roscovitine, metoprolol, pindolol and local anaesthetics from human plasma samples and cyclophosphamide from mouse blood samples. Copyright 2009 Elsevier B.V. All rights reserved.

  8. PLAN2L: a web tool for integrated text mining and literature-based bioentity relation extraction.

    PubMed

    Krallinger, Martin; Rodriguez-Penagos, Carlos; Tendulkar, Ashish; Valencia, Alfonso

    2009-07-01

    There is an increasing interest in using literature mining techniques to complement information extracted from annotation databases or generated by bioinformatics applications. Here we present PLAN2L, a web-based online search system that integrates text mining and information extraction techniques to access systematically information useful for analyzing genetic, cellular and molecular aspects of the plant model organism Arabidopsis thaliana. Our system facilitates a more efficient retrieval of information relevant to heterogeneous biological topics, from implications in biological relationships at the level of protein interactions and gene regulation, to sub-cellular locations of gene products and associations to cellular and developmental processes, i.e. cell cycle, flowering, root, leaf and seed development. Beyond single entities, predefined pairs of entities can also be provided as queries, for which literature-derived relations together with textual evidence are returned. PLAN2L does not require registration and is freely accessible at http://zope.bioinfo.cnio.es/plan2l.

  9. Singular value decomposition based feature extraction technique for physiological signal analysis.

    PubMed

    Chang, Cheng-Ding; Wang, Chien-Chih; Jiang, Bernard C

    2012-06-01

    Multiscale entropy (MSE) is one of the popular techniques to calculate and describe the complexity of physiological signals. Many studies use this approach to detect changes in the physiological conditions of the human body. However, MSE results are easily affected by noise and trends, leading to incorrect estimation of MSE values. In this paper, singular value decomposition (SVD) is adopted in place of MSE to extract the features of physiological signals, and a support vector machine (SVM) is used to classify the different physiological states. A test data set from the PhysioNet website was used, and the classification results showed that using SVD to extract features of the physiological signal could attain a classification accuracy rate of 89.157%, which is higher than that obtained using MSE values (71.084%). The results show the proposed analysis procedure is effective and appropriate for distinguishing different physiological states. This promising result could be used as a reference for doctors in the diagnosis of congestive heart failure (CHF).
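
    A rough sketch of the SVD-feature plus SVM idea on synthetic signals (not the PhysioNet data): each signal is folded into a segment matrix, its leading singular values form the feature vector, and an RBF SVM is cross-validated. Segment count, number of singular values kept, and the two-class data are all assumptions.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def svd_features(signal, n_segments=20, k=5):
    """Reshape a 1-D signal into a segment matrix and keep its top-k singular
    values as a compact feature vector."""
    seg_len = len(signal) // n_segments
    matrix = signal[: n_segments * seg_len].reshape(n_segments, seg_len)
    s = np.linalg.svd(matrix, compute_uv=False)
    return s[:k]

# Synthetic two-class data standing in for two physiological states.
rng = np.random.default_rng(0)
X, y = [], []
for label, scale in [(0, 1.0), (1, 2.5)]:
    for _ in range(40):
        sig = rng.normal(0, scale, 1000) + np.sin(np.linspace(0, 20, 1000))
        X.append(svd_features(sig))
        y.append(label)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
print("CV accuracy:", cross_val_score(clf, np.array(X), y, cv=5).mean())
```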

  10. Discovery of Newer Therapeutic Leads for Prostate Cancer

    DTIC Science & Technology

    2009-06-01

    promising plant extracts and then prepare large-scale quantities of the plant extracts using supercritical fluid extraction techniques and use this...quantities of the plant extracts using supercritical fluid extraction techniques. Large scale plant collections were conducted for 14 of the top 20...material for bioassay-guided fractionation of the biologically active constituents using modern chromatography techniques. The chemical structures of

  11. Method 1664: N-hexane extractable material (hem) and silica gel treated n-hexane extractable material (SGT-HEM) by extraction and gravimetry (oil and grease and total petroleum hydrocarbons), April 1995

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    Method 1664 was developed by the United States Environmental Protection Agency Office of Science and Technology to replace previously used gravimetric procedures that employed Freon-113, a Class I CFC, as the extraction solvent for the determination of oil and grease and petroleum hydrocarbons. Method 1664 is a performance-based method applicable to aqueous matrices that requires the use of n-hexane as the extraction solvent and gravimetry as the determinative technique. In addition, QC procedures designed to monitor precision and accuracy have been incorporated into Method 1664.

  12. Application of the windowed-Fourier-transform-based fringe analysis technique for investigating temperature and concentration fields in fluids.

    PubMed

    Mohanan, Sharika; Srivastava, Atul

    2014-04-10

    The present work is concerned with the development and application of a novel fringe analysis technique based on the principles of the windowed-Fourier-transform (WFT) for the determination of temperature and concentration fields from interferometric images for a range of heat and mass transfer applications. Based on the extent of the noise level associated with the experimental data, the technique has been coupled with two different phase unwrapping methods: the Itoh algorithm and the quality guided phase unwrapping technique for phase extraction. In order to generate the experimental data, a range of experiments have been carried out which include cooling of a vertical flat plate in free convection conditions, combustion of mono-propellant flames, and growth of organic as well as inorganic crystals from their aqueous solutions. The flat plate and combustion experiments are modeled as heat transfer applications wherein the interest is to determine the whole-field temperature distribution. Aqueous-solution-based crystal growth experiments are performed to simulate the mass transfer phenomena, and the interest is to determine the two-dimensional solute concentration field around the growing crystal. A Mach-Zehnder interferometer has been employed to record the path-integrated quantity of interest (temperature and/or concentration) in the form of interferometric images in the experiments. The potential of the WFT method has also been demonstrated on numerically simulated phase data for varying noise levels, and the accuracy in phase extraction has been quantified in terms of root mean square errors. Three levels of noise, i.e., 0%, 10%, and 20%, have been considered. Results of the present study show that the WFT technique allows an accurate extraction of phase values that can subsequently be converted into two-dimensional temperature and/or concentration distribution fields. Moreover, since WFT is a local processing technique, speckle patterns and the inherent noise in the interferometric data do not affect the resultant phase values. Brief comparisons of the accuracy of the WFT with other standard techniques such as conventional Fourier-filtering methods are also presented.

  13. Analytical methods for determination of mycotoxins: a review.

    PubMed

    Turner, Nicholas W; Subrahmanyam, Sreenath; Piletsky, Sergey A

    2009-01-26

    Mycotoxins are small (MW approximately 700), toxic chemical products formed as secondary metabolites by a few fungal species that readily colonise crops and contaminate them with toxins in the field or after harvest. Ochratoxins and Aflatoxins are mycotoxins of major significance and hence there has been significant research on a broad range of analytical and detection techniques that could be useful and practical. Due to the variety of structures of these toxins, it is impossible to use one standard technique for analysis and/or detection. Practical requirements for high-sensitivity analysis and the need for a specialist laboratory setting create challenges for routine analysis. Several existing analytical techniques, which offer flexible and broad-based methods of analysis and in some cases detection, have been discussed in this manuscript. There are a number of methods used, of which many are lab-based, but to our knowledge there seems to be no single technique that stands out above the rest, although analytical liquid chromatography, commonly linked with mass spectrometry, is likely to be popular. This review manuscript discusses (a) sample pre-treatment methods such as liquid-liquid extraction (LLE), supercritical fluid extraction (SFE), solid phase extraction (SPE), (b) separation methods such as thin-layer chromatography (TLC), high performance liquid chromatography (HPLC), gas chromatography (GC), and capillary electrophoresis (CE) and (c) others such as ELISA. Furthermore, current trends, advantages and disadvantages, and future prospects of these methods are discussed.

  14. Epileptic seizure detection in EEG signal using machine learning techniques.

    PubMed

    Jaiswal, Abeg Kumar; Banka, Haider

    2018-03-01

    Epilepsy is a well-known nervous system disorder characterized by seizures. Electroencephalograms (EEGs), which capture brain neural activity, can detect epilepsy. Traditional methods for analyzing an EEG signal for epileptic seizure detection are time-consuming. Recently, several automated seizure detection frameworks using machine learning techniques have been proposed to replace these traditional methods. The two basic steps involved in machine learning are feature extraction and classification. Feature extraction reduces the input pattern space by keeping informative features and the classifier assigns the appropriate class label. In this paper, we propose two effective approaches involving subpattern based PCA (SpPCA) and cross-subpattern correlation-based PCA (SubXPCA) with Support Vector Machine (SVM) for automated seizure detection in EEG signals. Feature extraction was performed using SpPCA and SubXPCA. Both techniques explore the subpattern correlation of EEG signals, which helps in the decision-making process. SVM is used for classification of seizure and non-seizure EEG signals. The SVM was trained with a radial basis function kernel. All the experiments have been carried out on the benchmark epilepsy EEG dataset. The entire dataset consists of 500 EEG signals recorded under different scenarios. Seven different experimental cases for classification have been conducted. The classification accuracy was evaluated using tenfold cross validation. The classification results of the proposed approaches have been compared with the results of some existing techniques reported in the literature to establish the claim.
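
    A hedged sketch of subpattern PCA features feeding an RBF SVM with tenfold cross-validation, on synthetic stand-in data rather than the benchmark EEG set; sub-pattern count and component numbers are illustrative, and in a rigorous evaluation the PCA would be fitted inside each fold.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def subpattern_pca_features(X, n_sub=4, n_components=3):
    """Split each signal into n_sub equal sub-patterns, run PCA on each
    sub-pattern block separately, and concatenate the projections."""
    n, d = X.shape
    sub_len = d // n_sub
    blocks = []
    for i in range(n_sub):
        block = X[:, i * sub_len:(i + 1) * sub_len]
        # Note: fitting PCA on all data here is a simplification for the sketch.
        blocks.append(PCA(n_components=n_components).fit_transform(block))
    return np.hstack(blocks)

# Synthetic stand-in for seizure vs. non-seizure EEG epochs (not the benchmark set).
rng = np.random.default_rng(42)
seizure = rng.normal(0, 3, (100, 400)) + np.sin(np.linspace(0, 50, 400))
normal = rng.normal(0, 1, (100, 400))
X = np.vstack([seizure, normal])
y = np.array([1] * 100 + [0] * 100)

features = subpattern_pca_features(X)
svm = SVC(kernel="rbf", gamma="scale")
print("10-fold CV accuracy:", cross_val_score(svm, features, y, cv=10).mean())
```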

  15. Automation of static and dynamic non-dispersive liquid phase microextraction. Part 2: Approaches based on impregnated membranes and porous supports.

    PubMed

    Alexovič, Michal; Horstkotte, Burkhard; Solich, Petr; Sabo, Ján

    2016-02-11

    A critical overview on automation of modern liquid phase microextraction (LPME) approaches based on the liquid impregnation of porous sorbents and membranes is presented. It is the continuation of part 1, in which non-dispersive LPME techniques based on the use of the extraction phase (EP) in the form of a drop, plug, film, or microflow were surveyed. Compared to the approaches described in part 1, porous materials provide an improved support for the EP. At the same time, they enlarge the contact surface of the EP and reduce the risk of its loss through incident flow or components of the surrounding matrix. Solvent-impregnated membranes or hollow fibres are further ideally suited for analyte extraction with simultaneous or subsequent back-extraction. Their use can therefore improve procedural robustness and reproducibility, and "opens the door" to new operation modes and fields of application. However, additional work and time are required for membrane replacement and renewed impregnation. Automation of porous support-based and membrane-based approaches plays an important role in achieving better reliability, speed, and reproducibility compared to manual assays. Automated renewal of the extraction solvent and coupling of sample pretreatment with the detection instrumentation can be named as examples. The different LPME methodologies using impregnated membranes and porous supports for the extraction phase, the different strategies for their automation, and their analytical applications are comprehensively described and discussed in this part. Finally, an outlook on future demands and perspectives of LPME techniques from both parts as a promising area in the field of sample pretreatment is given. Copyright © 2015 Elsevier B.V. All rights reserved.

  16. Optimized microwave-assisted extraction of 6-gingerol from Zingiber officinale Roscoe and evaluation of antioxidant activity in vitro.

    PubMed

    Liu, Wei; Zhou, Chun-Li; Zhao, Jing; Chen, Dong; Li, Quan-Hong

    2014-01-01

    6-Gingerol is one of the most pharmacologically active and abundant components in ginger, with a wide array of biochemical and pharmacologic activities. In recent years, the application of microwave-assisted extraction (MAE) for obtaining bioactive compounds from plant materials has attracted tremendous research interest and shown great potential. In this study, an efficient microwave-assisted extraction (MAE) technique was developed to extract 6-gingerol from ginger. The extraction efficiency of MAE was also compared with conventional extraction techniques. Fresh gingers (Zingiber officinale Roscoe) were harvested at commercial maturity (originally from Laiwu, Shandong, China). In single-factor experiments for the recovery of 6-gingerol, proper ranges of the liquid-to-solid ratio, ethanol proportion, microwave power, and extraction time were determined. Based on the values obtained in the single-factor experiments, a Box-Behnken design (BBD) was applied to determine the best combination of extraction variables for the yield of 6-gingerol. The optimum extraction conditions were as follows: microwave power 528 W, liquid-to-solid ratio 26 mL·g(-1), extraction time 31 s and ethanol proportion 78%. Furthermore, MAE extracted more 6-gingerol and total polyphenols than conventional methods, including maceration (MAC), stirring extraction (SE), heat reflux extraction (HRE) and ultrasound-assisted extraction (UAE), and the extracts showed higher antioxidant capacity. Microwave-assisted extraction showed obvious advantages in terms of high extraction efficiency and antioxidant activity of the extract within the shortest extraction time. Scanning electron microscopy (SEM) images of ginger powder materials after the different extractions were obtained to provide visual evidence of the disruption effect. To the best of our knowledge, this is the first report on the use of MAE to extract 6-gingerol from ginger, which could be referenced for the extraction of other active compounds from herbal plants.
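
    To illustrate the Box-Behnken optimization step in general terms (the factor ranges and yields below are invented, not the study's data), a second-order response surface can be fitted to the design points and its predicted optimum located on a grid:

```python
import numpy as np
from itertools import product
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# Coded Box-Behnken points for three factors: each pair of factors at +/-1 with
# the third held at 0, plus three centre replicates.
coded = []
for i, j in [(0, 1), (0, 2), (1, 2)]:
    for a, b in product([-1, 1], repeat=2):
        p = [0.0, 0.0, 0.0]
        p[i], p[j] = a, b
        coded.append(p)
coded += [[0.0, 0.0, 0.0]] * 3
coded = np.array(coded)

# Hypothetical factor ranges: microwave power 400-600 W, liquid/solid 20-30 mL/g,
# extraction time 20-40 s (ethanol proportion held fixed for this sketch).
centre, half_width = np.array([500.0, 25.0, 30.0]), np.array([100.0, 5.0, 10.0])
runs = centre + coded * half_width

# Illustrative yields (mg/g) peaking near the middle of the region, with noise.
rng = np.random.default_rng(3)
yields = 2.7 - 0.15 * (coded ** 2).sum(axis=1) + rng.normal(0, 0.02, len(runs))

# Fit the usual second-order response surface and locate its predicted optimum.
poly = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(poly.fit_transform(runs), yields)
grid = np.array(list(product(*[np.linspace(c - h, c + h, 21)
                               for c, h in zip(centre, half_width)])))
pred = model.predict(poly.transform(grid))
print("predicted optimum (W, mL/g, s):", grid[pred.argmax()], "->", round(pred.max(), 2))
```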

  17. Environmentally Sound Timber Extracting Techniques for Small Tree Harvesting

    Treesearch

    Lihai Wang

    1999-01-01

    Because of the large areas disturbed and the great deal of energy consumed during its operations, introducing or applying appropriate timber extraction techniques could significantly reduce the impact of timber extraction on the forest environment while maintaining reasonable operating costs. Four environmentally sound timber extraction techniques for small tree harvesting...

  18. n-SIFT: n-dimensional scale invariant feature transform.

    PubMed

    Cheung, Warren; Hamarneh, Ghassan

    2009-09-01

    We propose the n-dimensional scale invariant feature transform (n-SIFT) method for extracting and matching salient features from scalar images of arbitrary dimensionality, and compare this method's performance to other related features. The proposed features extend the concepts used for 2-D scalar images in the computer vision SIFT technique for extracting and matching distinctive scale invariant features. We apply the features to images of arbitrary dimensionality through the use of hyperspherical coordinates for gradients and multidimensional histograms to create the feature vectors. We analyze the performance of a fully automated multimodal medical image matching technique based on these features, and successfully apply the technique to determine accurate feature point correspondence between pairs of 3-D MRI images and dynamic 3D + time CT data.
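
    For reference, the 2-D SIFT extraction-and-matching procedure that n-SIFT generalizes can be sketched with OpenCV (SIFT_create requires OpenCV 4.4 or later, or the contrib build); the image content and ratio-test threshold are illustrative, not the paper's medical data or settings.

```python
import cv2
import numpy as np

# Two synthetic views of the same pattern: the second is a shifted copy of the first.
img1 = np.zeros((240, 320), dtype=np.uint8)
cv2.circle(img1, (80, 60), 25, 200, -1)
cv2.rectangle(img1, (150, 100), (220, 180), 120, -1)
cv2.circle(img1, (260, 200), 15, 255, -1)
img1 = cv2.GaussianBlur(img1, (7, 7), 2)
img2 = np.roll(img1, shift=(12, 20), axis=(0, 1))

# Classic 2-D SIFT keypoints and descriptors.
sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Brute-force matching with Lowe's ratio test to keep only distinctive matches.
good = []
if des1 is not None and des2 is not None and len(des2) >= 2:
    for pair in cv2.BFMatcher().knnMatch(des1, des2, k=2):
        if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance:
            good.append(pair[0])
print(len(kp1), "and", len(kp2), "keypoints,", len(good), "ratio-test matches")
```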

  19. Wavelet-based techniques for the gamma-ray sky

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDermott, Samuel D.; Fox, Patrick J.; Cholis, Ilias

    2016-07-01

    Here, we demonstrate how the image analysis technique of wavelet decomposition can be applied to the gamma-ray sky to separate emission on different angular scales. New structures on scales that differ from the scales of the conventional astrophysical foreground and background uncertainties can be robustly extracted, allowing a model-independent characterization with no presumption of exact signal morphology. As a test case, we generate mock gamma-ray data to demonstrate our ability to extract extended signals without assuming a fixed spatial template. For some point source luminosity functions, our technique also allows us to differentiate a diffuse signal in gamma-rays from dark matter annihilation and extended gamma-ray point source populations in a data-driven way.
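
    A toy sketch of the scale-separation idea using PyWavelets on a mock sky map; the wavelet family, decomposition level, and source model are assumptions, not the analysis choices of the paper.

```python
import numpy as np
import pywt

# Mock "sky map": smooth large-scale background plus a compact extended source.
rng = np.random.default_rng(7)
y, x = np.mgrid[0:128, 0:128]
background = np.exp(-((y - 64) ** 2) / (2 * 40 ** 2))                      # broad gradient
source = 0.8 * np.exp(-((y - 40) ** 2 + (x - 90) ** 2) / (2 * 4 ** 2))     # ~4-pixel blob
sky = background + source + rng.normal(0, 0.02, (128, 128))

# 2-D multilevel wavelet decomposition separates emission by angular scale.
coeffs = pywt.wavedec2(sky, wavelet="haar", level=4)

# Keep only the fine-scale detail coefficients (drop the coarse approximation)
# to isolate small-scale structure such as the compact source.
coeffs_fine = [np.zeros_like(coeffs[0])] + list(coeffs[1:])
small_scale = pywt.waverec2(coeffs_fine, wavelet="haar")

ys, xs = np.unravel_index(np.argmax(small_scale), small_scale.shape)
print("brightest small-scale feature near pixel:", (ys, xs))
```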

  20. Overview of existing algorithms for emotion classification. Uncertainties in evaluations of accuracies.

    NASA Astrophysics Data System (ADS)

    Avetisyan, H.; Bruna, O.; Holub, J.

    2016-11-01

    Numerous techniques and algorithms are dedicated to extracting emotions from input data. In our investigation it was found that emotion-detection approaches can be classified into the following three types: keyword-based/lexicon-based, learning-based, and hybrid. The most commonly used techniques, such as the keyword-spotting method, Support Vector Machines, the Naïve Bayes Classifier, the Hidden Markov Model and hybrid algorithms, have impressive results in this sphere and can reach more than 90% detection accuracy.
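
    The first two families mentioned above can be illustrated side by side on a tiny, invented corpus: a keyword-spotting lexicon and a bag-of-words Naive Bayes classifier (scikit-learn). The cue words and texts are purely illustrative.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Tiny illustrative corpus; real emotion corpora are far larger.
texts = [
    "I am so happy and excited about this", "what a wonderful, joyful day",
    "this makes me furious and angry", "I hate how irritating this is",
    "I feel sad and lonely tonight", "such a depressing, miserable result",
]
labels = ["joy", "joy", "anger", "anger", "sadness", "sadness"]

# Lexical keyword spotting: flag any emotion whose cue words appear in the text.
CUES = {"joy": {"happy", "joyful", "excited"},
        "anger": {"angry", "furious", "hate"},
        "sadness": {"sad", "lonely", "depressing"}}

def keyword_spot(text):
    words = set(text.lower().split())
    return [emotion for emotion, cues in CUES.items() if words & cues] or ["unknown"]

# Learning-based alternative: bag-of-words Naive Bayes classifier.
model = make_pipeline(CountVectorizer(), MultinomialNB()).fit(texts, labels)

sample = "I am angry about the sad news"
print("keyword spotting:", keyword_spot(sample))
print("Naive Bayes:", model.predict([sample])[0])
```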

  1. Multifunctional picoliter droplet manipulation platform and its application in single cell analysis.

    PubMed

    Gu, Shu-Qing; Zhang, Yun-Xia; Zhu, Ying; Du, Wen-Bin; Yao, Bo; Fang, Qun

    2011-10-01

    We developed an automated and multifunctional microfluidic platform based on DropLab to perform flexible generation and complex manipulations of picoliter-scale droplets. Multiple manipulations including precise droplet generation, sequential reagent merging, and multistep solid-phase extraction for picoliter-scale droplets could be achieved on the present platform. The system precision in generating picoliter-scale droplets was significantly improved by minimizing the thermo-induced fluctuation of flow rate. A novel droplet fusion technique based on the difference of droplet interfacial tensions was developed without the need for special microchannel networks or external devices. It enabled sequential addition of reagents to droplets on demand for multistep reactions. We also developed an effective picoliter-scale droplet splitting technique with magnetic actuation. The difficulty in phase separation of magnetic beads from picoliter-scale droplets due to the high interfacial tension was overcome using ferromagnetic particles to carry the magnetic beads through the phase interface. With this technique, multistep solid-phase extraction was achieved among picoliter-scale droplets. The present platform had the ability to perform complex multistep manipulations on picoliter-scale droplets, which is particularly required for single cell analysis. Its utility and potential in single cell analysis were preliminarily demonstrated in achieving high-efficiency single-cell encapsulation, enzyme activity assay at the single cell level, and especially, single cell DNA purification based on solid-phase extraction.

  2. Component-Oriented Behavior Extraction for Autonomic System Design

    NASA Technical Reports Server (NTRS)

    Bakera, Marco; Wagner, Christian; Margaria,Tiziana; Hinchey, Mike; Vassev, Emil; Steffen, Bernhard

    2009-01-01

    Rich and multifaceted domain specific specification languages like the Autonomic System Specification Language (ASSL) help to design reliable systems with self-healing capabilities. The GEAR game-based Model Checker has been used successfully to investigate properties of the ESA Exo- Mars Rover in depth. We show here how to enable GEAR s game-based verification techniques for ASSL via systematic model extraction from a behavioral subset of the language, and illustrate it on a description of the Voyager II space mission.

  3. Patch-based image reconstruction for PET using prior-image derived dictionaries

    NASA Astrophysics Data System (ADS)

    Tahaei, Marzieh S.; Reader, Andrew J.

    2016-09-01

    In PET image reconstruction, regularization is often needed to reduce the noise in the resulting images. Patch-based image processing techniques have recently been successfully used for regularization in medical image reconstruction through a penalized likelihood framework. Re-parameterization within reconstruction is another powerful regularization technique in which the object in the scanner is re-parameterized using coefficients for spatially-extensive basis vectors. In this work, a method for extracting patch-based basis vectors from the subject’s MR image is proposed. The coefficients for these basis vectors are then estimated using the conventional MLEM algorithm. Furthermore, using the alternating direction method of multipliers, an algorithm for optimizing the Poisson log-likelihood while imposing sparsity on the parameters is also proposed. This novel method is then utilized to find sparse coefficients for the patch-based basis vectors extracted from the MR image. The results indicate the superiority of the proposed methods to patch-based regularization using the penalized likelihood framework.
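
    For orientation, the conventional MLEM update that such re-parameterized reconstructions build on can be sketched on a toy system matrix; the patch-based MR-derived basis, the ADMM step, and the sparsity constraint of the paper are not shown here.

```python
import numpy as np

def mlem(system_matrix, sinogram, n_iters=50):
    """Plain MLEM: lambda <- lambda / (A^T 1) * A^T (y / (A lambda))."""
    A = system_matrix
    sens = A.T @ np.ones(A.shape[0])          # sensitivity term A^T 1
    lam = np.ones(A.shape[1])                 # coefficient/image estimate
    for _ in range(n_iters):
        expected = A @ lam                    # forward projection
        ratio = sinogram / np.maximum(expected, 1e-12)
        lam *= (A.T @ ratio) / np.maximum(sens, 1e-12)
    return lam

# Toy problem: 40 detector bins, 25 basis coefficients, Poisson-like data.
rng = np.random.default_rng(0)
A = rng.random((40, 25))
true_coeffs = rng.random(25) * 5
y = rng.poisson(A @ true_coeffs).astype(float)
est = mlem(A, y)
print("correlation with truth:", np.corrcoef(est, true_coeffs)[0, 1].round(3))
```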

  4. Machine vision extracted plant movement for early detection of plant water stress.

    PubMed

    Kacira, M; Ling, P P; Short, T H

    2002-01-01

    A methodology was established for early, non-contact, and quantitative detection of plant water stress with machine-vision-extracted plant features. The top-projected canopy area (TPCA) of the plants was extracted from plant images using image-processing techniques. Water-stress-induced plant movement was decoupled from plant diurnal movement and plant growth using the coefficient of relative variation of TPCA (CRV(TPCA)), which was found to be an effective marker for water stress detection. The threshold value of CRV(TPCA) as an indicator of water stress was determined by a parametric approach. The effectiveness of the sensing technique was evaluated against the timing of stress detection by an operator. The results of this study suggested that plant water stress detection using projected-canopy-area-based features of the plants is feasible.
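
    A minimal sketch of the TPCA/CRV(TPCA) idea with synthetic numbers: canopy area is approximated by thresholded pixel counts, and the coefficient of relative variation over a sliding window flags stress-induced movement. The threshold values and window length are hypothetical, not those determined in the study.

```python
import numpy as np

def top_projected_canopy_area(image, threshold=60):
    """Approximate TPCA as the number of pixels classified as plant in a
    top-view grayscale image (a simple global threshold stands in for the
    paper's image-processing pipeline)."""
    return int((image > threshold).sum())

def coefficient_of_relative_variation(values):
    """CRV = standard deviation / mean, computed here over a sliding window of
    TPCA measurements to flag stress-induced canopy movement."""
    values = np.asarray(values, dtype=float)
    return values.std() / values.mean()

# Synthetic TPCA series: stable canopy, then wilting (shrinking, fluctuating area).
rng = np.random.default_rng(0)
tpca = np.concatenate([12000 + rng.normal(0, 50, 48),
                       np.linspace(12000, 9000, 48) + rng.normal(0, 400, 48)])

window, crv_threshold = 12, 0.02   # hypothetical values; tuned per experiment
for t in range(window, len(tpca)):
    crv = coefficient_of_relative_variation(tpca[t - window:t])
    if crv > crv_threshold:
        print(f"possible water stress flagged at sample {t} (CRV={crv:.3f})")
        break
```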

  5. An evaluation of object-oriented image analysis techniques to identify motorized vehicle effects in semi-arid to arid ecosystems of the American West

    USGS Publications Warehouse

    Mladinich, C.

    2010-01-01

    Human disturbance is a leading ecosystem stressor. Human-induced modifications include transportation networks, areal disturbances due to resource extraction, and recreation activities. High-resolution imagery and object-oriented classification rather than pixel-based techniques have successfully identified roads, buildings, and other anthropogenic features. Three commercial, automated feature-extraction software packages (Visual Learning Systems' Feature Analyst, ENVI Feature Extraction, and Definiens Developer) were evaluated by comparing their ability to effectively detect the disturbed surface patterns from motorized vehicle traffic. Each package achieved overall accuracies in the 70% range, demonstrating the potential to map the surface patterns. The Definiens classification was more consistent and statistically valid. Copyright ?? 2010 by Bellwether Publishing, Ltd. All rights reserved.

  6. The Best Extraction Technique for Kaempferol and Quercetin Isolation from Guava Leaves (Psidium guajava)

    NASA Astrophysics Data System (ADS)

    Batubara, I.; Suparto, I. H.; Wulandari, N. S.

    2017-03-01

    Guava leaves contain various biologically active compounds, such as kaempferol and quercetin, which have anticancer activity. Twelve extraction techniques were compared to determine the best technique for isolating kaempferol and quercetin from guava leaves. The toxicity of the extracts was tested against Artemia salina larvae. All extracts were toxic (LC50 values less than 1000 ppm) except the extract from direct Soxhlet extraction of guava leaves and the extracts from sonication and Soxhlet extraction using n-hexane. The extracts with high total phenol and total flavonoid contents, low tannin content, and intense spots on the thin-layer chromatogram were selected for high-performance liquid chromatography analysis. Direct sonication of guava leaves was chosen as the best extraction technique, with kaempferol and quercetin contents of 0.02% and 2.15%, respectively. In addition to the high kaempferol and quercetin content, direct sonication was chosen for its shortest extraction time, fewer impurities and high toxicity.

  7. Determination of Hypochlorite in Bleaching Products with Flower Extracts to Demonstrate the Principles of Flow Injection Analysis

    ERIC Educational Resources Information Center

    Ramos, Luiz Antonio; Prieto, Katia Roberta; Carvalheiro, Eder Tadeu Gomes; Carvalheiro, Carla Cristina Schmitt

    2005-01-01

    The use of crude flower extracts to demonstrate the principles of analytical chemistry automation was performed, with a flow injection analysis (FIA) procedure developed to determine hypochlorite in household bleaching products. FIA comprises a group of techniques based on injection of a liquid sample into a moving, nonsegmented carrier stream of a…

  8. LISA Framework for Enhancing Gravitational Wave Signal Extraction Techniques

    NASA Technical Reports Server (NTRS)

    Thompson, David E.; Thirumalainambi, Rajkumar

    2006-01-01

    This paper describes the development of a Framework for benchmarking and comparing signal-extraction and noise-interference-removal methods that are applicable to interferometric Gravitational Wave detector systems. The primary use is towards comparing signal and noise extraction techniques at LISA frequencies from multiple (possibly confused) gravitational wave sources. The Framework includes extensive hybrid learning/classification algorithms, as well as post-processing regularization methods, and is based on a unique plug-and-play (component) architecture. Published methods for signal extraction and interference removal at LISA frequencies are being encoded, as well as multiple source noise models, so that the stiffness of the GW sensitivity space can be explored under each combination of methods. Furthermore, synthetic datasets and source models can be created and imported into the Framework, and specific degraded numerical experiments can be run to test the flexibility of the analysis methods. The Framework also supports the use of existing LISA testbeds, synthetic data systems, and simulators through plug-ins and wrappers, thus preserving those legacy codes and systems intact. Because of the component-based architecture, all selected procedures can be registered or de-registered at run-time, and are completely reusable, reconfigurable, and modular.

  9. Application of pseudo-template molecularly imprinted polymers by atom transfer radical polymerization to the solid-phase extraction of pyrethroids.

    PubMed

    Zhang, Ming; He, Juan; Shen, Yanzheng; He, Weiye; Li, Yuanyuan; Zhao, Dongxin; Zhang, Shusheng

    2018-02-01

    A polymer-based adsorption medium with molecular recognition ability for homologs of pyrethroids was prepared by atom transfer radical polymerization using a fragment imprinting technique. A phenyl ether-biphenyl eutectic was utilized as the pseudo-template molecule, and the prepared adsorption medium was evaluated by solid-phase extraction and gas chromatography. The selectivity of the medium for pyrethroids was evaluated by using it as solid-phase extraction packing, with gas chromatography for detection. The results demonstrated that the adsorption amounts of bifenthrin, fenpropathrin, permethrin, cypermethrin, fenvalerate, Dursban and pentachloronitrobenzene on the molecularly imprinted polymers were 2.32, 2.12, 2.18, 2.20, 2.30, 1.30 and 1.40 mg g-1, respectively, while those on the non-imprinted polymers were 1.20, 1.13, 1.25, 1.05, 1.20, 1.23 and 1.32 mg g-1, respectively. The rebinding test based on the molecularly imprinted solid-phase extraction column technique showed recoveries of 88.5-106.2% for a honey sample spiked with the seven insecticides, with relative standard deviations of 2.38-5.63%. Finally, the method was successfully applied to the analysis of pyrethroids in a honey sample. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Fabrication of a Dipole-assisted Solid Phase Extraction Microchip for Trace Metal Analysis in Water Samples

    PubMed Central

    Chen, Ping-Hung; Chen, Shun-Niang; Tseng, Sheng-Hao; Deng, Ming-Jay; Lin, Yang-Wei; Sun, Yuh-Chang

    2016-01-01

    This paper describes a fabrication protocol for a dipole-assisted solid phase extraction (SPE) microchip available for trace metal analysis in water samples. A brief overview of the evolution of chip-based SPE techniques is provided. This is followed by an introduction to specific polymeric materials and their role in SPE. To develop an innovative dipole-assisted SPE technique, a chlorine (Cl)-containing SPE functionality was implanted into a poly(methyl methacrylate) (PMMA) microchip. Herein, diverse analytical techniques including contact angle analysis, Raman spectroscopic analysis, and laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS) analysis were employed to validate the utility of the implantation protocol of the C-Cl moieties on the PMMA. The analytical results of the X-ray absorption near-edge structure (XANES) analysis also demonstrated the feasibility of the Cl-containing PMMA used as an extraction medium by virtue of the dipole-ion interactions between the highly electronegative C-Cl moieties and the positively charged metal ions. PMID:27584954

  11. Biometric sample extraction using Mahalanobis distance in Cardioid based graph using electrocardiogram signals.

    PubMed

    Sidek, Khairul; Khali, Ibrahim

    2012-01-01

    In this paper, a person identification mechanism implemented with a Cardioid-based graph using the electrocardiogram (ECG) is presented. The Cardioid-based graph has given reasonably good classification accuracy in differentiating between individuals. However, the current feature extraction method using Euclidean distance could be further improved by using Mahalanobis distance measurement, producing extracted coefficients that take into account the correlations of the data set. Identification is then done by applying these extracted features to a Radial Basis Function Network. A total of 30 ECG recordings from the MIT-BIH Normal Sinus Rhythm database (NSRDB) and the MIT-BIH Arrhythmia database (MITDB) were used for development and evaluation purposes. Our experimental results suggest that the proposed feature extraction method has significantly increased the classification performance of subjects in both databases, with accuracy rising from 97.50% to 99.80% in NSRDB and from 96.50% to 99.40% in MITDB. High sensitivity, specificity and positive predictive values of 99.17%, 99.91% and 99.23% for NSRDB and 99.30%, 99.90% and 99.40% for MITDB also validate the proposed method. This result also indicates that the right feature extraction technique plays a vital role in determining the persistency of the classification accuracy for the Cardioid-based person identification mechanism.
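
    A hedged sketch of Mahalanobis-distance feature extraction on synthetic per-beat coefficients; the data are invented stand-ins for Cardioid-graph features, and an RBF-kernel SVM is used here in place of the Radial Basis Function Network described above.

```python
import numpy as np
from scipy.spatial.distance import mahalanobis
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def mahalanobis_features(beats, reference_beats):
    """Distance of each feature vector to the reference set's mean, using the
    inverse covariance so that feature correlations are respected."""
    mu = reference_beats.mean(axis=0)
    vi = np.linalg.pinv(np.cov(reference_beats, rowvar=False))
    return np.array([[mahalanobis(b, mu, vi)] for b in beats])

# Synthetic stand-in for per-beat Cardioid-graph coefficients of two subjects.
rng = np.random.default_rng(5)
subj_a = rng.multivariate_normal([0, 0, 0], np.diag([1, 2, 0.5]), 80)
subj_b = rng.multivariate_normal([1.5, -1, 0.8], np.diag([1, 2, 0.5]), 80)

# Distances are measured against subject A's reference distribution.
X = np.vstack([mahalanobis_features(subj_a, subj_a),
               mahalanobis_features(subj_b, subj_a)])
y = np.array([0] * 80 + [1] * 80)
print("CV accuracy:", cross_val_score(SVC(kernel="rbf"), X, y, cv=5).mean().round(3))
```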

  12. Extraction of decision rules via imprecise probabilities

    NASA Astrophysics Data System (ADS)

    Abellán, Joaquín; López, Griselda; Garach, Laura; Castellano, Javier G.

    2017-05-01

    Data analysis techniques can be applied to discover important relations among features. This is the main objective of the Information Root Node Variation (IRNV) technique, a new method to extract knowledge from data via decision trees. The decision trees used by the original method were built using classic split criteria. The performance of new split criteria based on imprecise probabilities and uncertainty measures, called credal split criteria, differs significantly from that obtained using the classic criteria. This paper extends the IRNV method using two credal split criteria: one based on a mathematical parametric model, and the other based on a non-parametric model. The performance of the method is analyzed in a case study of traffic accident data to identify patterns related to the severity of an accident. We found that a larger number of rules is generated, significantly supplementing the information obtained using the classic split criteria.
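
    For readers unfamiliar with rule extraction from decision trees, the sketch below shows the basic idea with scikit-learn's classic split criteria; the credal (imprecise-probability) criteria discussed above would require a custom splitter and are not reproduced here.

        from sklearn.datasets import load_iris
        from sklearn.tree import DecisionTreeClassifier, export_text

        # Fit a small tree with a classic criterion; each root-to-leaf path is a rule.
        data = load_iris()
        tree = DecisionTreeClassifier(criterion="entropy", max_depth=3, random_state=0)
        tree.fit(data.data, data.target)
        print(export_text(tree, feature_names=list(data.feature_names)))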

  13. EVALUATION OF ANALYTICAL METHODS FOR DETERMINING PESTICIDES IN BABY FOOD

    EPA Science Inventory

    Three extraction methods and two detection techniques for determining pesticides in baby food were evaluated. The extraction techniques examined were supercritical fluid extraction (SFE), enhanced solvent extraction (ESE), and solid phase extraction (SPE). The detection techni...

  14. Ultrahigh pressure extraction of bioactive compounds from plants-A review.

    PubMed

    Xi, Jun

    2017-04-13

    Extraction of bioactive compounds from plants is one of the most important research areas for the pharmaceutical and food industries. Conventional extraction techniques are usually associated with longer extraction times, lower yields, higher organic solvent consumption, and poor extraction efficiency. A novel extraction technique, ultrahigh pressure extraction, has been developed for the extraction of bioactive compounds from plants in order to shorten the extraction time, decrease solvent consumption, increase extraction yields, and enhance the quality of extracts. The mild processing temperature of ultrahigh pressure extraction may lead to an enhanced extraction of thermolabile bioactive ingredients. A critical review is conducted to introduce the different aspects of ultrahigh pressure extraction of plant bioactive compounds, including principles and mechanisms, the important parameters influencing its performance, comparison of ultrahigh pressure extraction with other extraction techniques, and its advantages and disadvantages. Future opportunities for ultrahigh pressure extraction are also discussed.

  15. Image processing and analysis using neural networks for optometry area

    NASA Astrophysics Data System (ADS)

    Netto, Antonio V.; Ferreira de Oliveira, Maria C.

    2002-11-01

    In this work we describe the framework of a functional system for processing and analyzing images of the human eye acquired by the Hartmann-Shack (HS) technique, in order to extract information to formulate a diagnosis of eye refractive errors (astigmatism, hypermetropia and myopia). The analysis is to be carried out using an Artificial Intelligence system based on Neural Nets, Fuzzy Logic and Classifier Combination. The major goal is to establish the basis of a new technology to effectively measure ocular refractive errors, based on methods alternative to those adopted in current patented systems. Moreover, analysis of images acquired with the Hartmann-Shack technique may enable the extraction of additional information on the health of an eye under examination from the same image used to detect refractive errors.

  16. Visibility enhancement of color images using Type-II fuzzy membership function

    NASA Astrophysics Data System (ADS)

    Singh, Harmandeep; Khehra, Baljit Singh

    2018-04-01

    Images taken in poor environmental conditions suffer from reduced visibility and hidden detail. Therefore, image enhancement techniques are necessary for improving the significant details of these images. An extensive review has shown that histogram-based enhancement techniques greatly suffer from over- and under-enhancement issues, while fuzzy-based enhancement techniques suffer from over- and under-saturated pixels. In this paper, a novel Type-II fuzzy-based image enhancement technique is proposed for improving the visibility of images. The Type-II fuzzy logic can automatically extract the local atmospheric light and roughly eliminate the atmospheric veil during local detail enhancement. The proposed technique has been evaluated on 10 well-known weather-degraded color images and is also compared with four well-known existing image enhancement techniques. The experimental results reveal that the proposed technique outperforms the others regarding visible edge ratio, color gradients and number of saturated pixels.

  17. Fast and effective characterization of 3D region of interest in medical image data

    NASA Astrophysics Data System (ADS)

    Kontos, Despina; Megalooikonomou, Vasileios

    2004-05-01

    We propose a framework for detecting, characterizing and classifying spatial Regions of Interest (ROIs) in medical images, such as tumors and lesions in MRI or activation regions in fMRI. A necessary step prior to classification is efficient extraction of discriminative features. For this purpose, we apply a characterization technique especially designed for spatial ROIs. The main idea of this technique is to extract a k-dimensional feature vector using concentric spheres in 3D (or circles in 2D) radiating out of the ROI's center of mass. These vectors form characterization signatures that can be used to represent the initial ROIs. We focus on classifying fMRI ROIs obtained from a study that explores neuroanatomical correlates of semantic processing in Alzheimer's disease (AD). We detect a ROI highly associated with AD and apply the feature extraction technique with different experimental settings. We seek to distinguish control from patient samples. We study how classification can be performed using the extracted signatures as well as how different experimental parameters affect classification accuracy. The obtained classification accuracy ranged from 82% to 87% (based on the selected ROI) suggesting that the proposed classification framework can be potentially useful in supporting medical decision-making.
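
    A minimal sketch of the concentric-geometry signature described above, written for the 2D case (the function name and the binary-mask input are illustrative; the detection and classification stages are not shown):

        import numpy as np

        def concentric_signature(roi_mask, k=16):
            """k-dimensional signature: fraction of ROI pixels inside each of k
            concentric circles radiating out of the ROI's centre of mass."""
            ys, xs = np.nonzero(roi_mask)
            cy, cx = ys.mean(), xs.mean()                  # centre of mass
            r = np.hypot(ys - cy, xs - cx)                 # pixel distances from centre
            radii = np.linspace(r.max() / k, r.max(), k)   # k equally spaced radii
            return np.array([(r <= rad).mean() for rad in radii])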

  18. Application of ionic liquids based enzyme-assisted extraction of chlorogenic acid from Eucommia ulmoides leaves.

    PubMed

    Liu, Tingting; Sui, Xiaoyu; Li, Li; Zhang, Jie; Liang, Xin; Li, Wenjing; Zhang, Honglian; Fu, Shuang

    2016-01-15

    A new approach for ionic liquid based enzyme-assisted extraction (ILEAE) of chlorogenic acid (CGA) from Eucommia ulmoides is presented, in which enzyme pretreatment is used in ionic liquid aqueous media to enhance the extraction yield. For this purpose, the solubility of CGA and the activity of cellulase were investigated in eight 1-alkyl-3-methylimidazolium ionic liquids. Cellulase in 0.5 M [C6mim]Br aqueous solution was found to provide better performance in extraction. The factors of the ILEAE procedure, including extraction time, extraction phase pH, extraction temperature and enzyme concentration, were investigated. Moreover, the newly developed approach offered advantages in terms of yield and efficiency compared with other conventional extraction techniques. Scanning electron microscopy of plant samples indicated that the cellulase-treated cell wall in ionic liquid solution was more amenable to extraction, leading to more efficient extraction by reducing the mass-transfer barrier. The proposed ILEAE method enables a continuous process for enzyme-assisted extraction comprising enzyme incubation and solvent extraction. In this research, we propose a novel view of enzyme-assisted extraction of plant active components that concentrates not only on enzyme-facilitated cell wall degradation but also on improving the poor permeability of ionic liquid solutions. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. Dispersive liquid-liquid microextraction based on solidification of floating organic droplet followed by high-performance liquid chromatography with ultraviolet detection and liquid chromatography-tandem mass spectrometry for the determination of triclosan and 2,4-dichlorophenol in water samples.

    PubMed

    Zheng, Cao; Zhao, Jing; Bao, Peng; Gao, Jin; He, Jin

    2011-06-24

    A novel, simple and efficient dispersive liquid-liquid microextraction based on solidification of a floating organic droplet (DLLME-SFO) technique coupled with high-performance liquid chromatography with ultraviolet detection (HPLC-UV) and liquid chromatography-tandem mass spectrometry (LC-MS/MS) was developed for the determination of triclosan and its degradation product 2,4-dichlorophenol in real water samples. The extraction solvent used in this work has low density, low volatility, low toxicity and a melting point close to room temperature, so the extractant droplets can be collected easily by solidifying them at a lower temperature. Parameters that affect the extraction efficiency, including the type and volume of the extraction and dispersive solvents, salt effect, pH and extraction time, were investigated and optimized in a 5 mL sample system by HPLC-UV. Under the optimum conditions (extraction solvent: 12 μL of 1-dodecanol; dispersive solvent: 300 μL of acetonitrile; sample pH: 6.0; extraction time: 1 min), the limits of detection (LODs) of the pretreatment method combined with LC-MS/MS were in the range of 0.002-0.02 μg L(-1), which are lower than or comparable with other reported approaches applied to the determination of the same compounds. Wide linearities, good precision and satisfactory relative recoveries were also obtained. The proposed technique was successfully applied to determine triclosan and 2,4-dichlorophenol in real water samples. Copyright © 2011 Elsevier B.V. All rights reserved.

  20. Automated extraction of knowledge for model-based diagnostics

    NASA Technical Reports Server (NTRS)

    Gonzalez, Avelino J.; Myler, Harley R.; Towhidnejad, Massood; Mckenzie, Frederic D.; Kladke, Robin R.

    1990-01-01

    The concept of accessing computer aided design (CAD) databases and extracting a process model automatically is investigated as a possible source for the generation of knowledge bases for model-based reasoning systems. The resulting system, referred to as automated knowledge generation (AKG), uses an object-oriented programming structure and constraint techniques, as well as an internal database of component descriptions, to generate a frame-based structure that describes the model. The procedure has been designed to be general enough to be easily coupled to CAD systems that feature a database capable of providing label and connectivity data for the drawn system. The AKG system is capable of defining knowledge bases in the formats required by various model-based reasoning tools.

  1. Imaging nanoscale lattice variations by machine learning of x-ray diffraction microscopy data

    DOE PAGES

    Laanait, Nouamane; Zhang, Zhan; Schlepütz, Christian M.

    2016-08-09

    In this paper, we present a novel methodology based on machine learning to extract lattice variations in crystalline materials, at the nanoscale, from an x-ray Bragg diffraction-based imaging technique. By employing a full-field microscopy setup, we capture real space images of materials, with imaging contrast determined solely by the x-ray diffracted signal. The data sets that emanate from this imaging technique are a hybrid of real space information (image spatial support) and reciprocal lattice space information (image contrast), and are intrinsically multidimensional (5D). By a judicious application of established unsupervised machine learning techniques and multivariate analysis to this multidimensional data cube, we show how to extract features that can be ascribed physical interpretations in terms of common structural distortions, such as lattice tilts and dislocation arrays. Finally, we demonstrate this 'big data' approach to x-ray diffraction microscopy by identifying structural defects present in an epitaxial ferroelectric thin-film of lead zirconate titanate.

  3. An efficient scheme for automatic web pages categorization using the support vector machine

    NASA Astrophysics Data System (ADS)

    Bhalla, Vinod Kumar; Kumar, Neeraj

    2016-07-01

    In the past few years, with the evolution of the Internet and related technologies, the number of Internet users has grown exponentially. These users demand access to relevant web pages within a fraction of a second. To achieve this goal, an efficient categorization of web page contents is required. Manual categorization of these billions of web pages with high accuracy is a challenging task. Most of the existing techniques reported in the literature are semi-automatic, and a high level of accuracy cannot be achieved using them. To address this, this paper proposes automatic categorization of web pages into domain categories. The proposed scheme is based on the identification of specific and relevant features of the web pages. In the proposed scheme, features are first extracted and evaluated, and the feature set is then filtered for categorization of domain web pages. A feature extraction tool based on the HTML document object model of the web page is developed. Feature extraction and weight assignment are based on a collection of domain-specific keywords developed by considering various domain pages. Moreover, the keyword list is reduced on the basis of keyword IDs, and stemming of keywords and tag text is applied to achieve higher accuracy. An extensive feature set is generated to develop a robust classification technique. The proposed scheme was evaluated using a machine learning method, combining feature extraction and statistical analysis with a support vector machine kernel as the classification tool. The results obtained confirm the effectiveness of the proposed scheme in terms of its accuracy on different categories of web pages.
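
    A toy sketch of the keyword-count and SVM pipeline outlined above; the keyword list, stemming and HTML-DOM parsing of the actual scheme are only stood in for, and all names are illustrative:

        import re
        import numpy as np
        from sklearn.svm import SVC

        DOMAIN_KEYWORDS = ["tutorial", "price", "paper", "recipe"]   # assumed keyword list

        def keyword_features(page_text):
            """Count domain-specific keywords in the visible text of a page."""
            tokens = re.findall(r"[a-z]+", page_text.lower())
            return np.array([tokens.count(k) for k in DOMAIN_KEYWORDS], dtype=float)

        # hypothetical training on labelled pages, classification with an SVM kernel:
        # X = np.vstack([keyword_features(p) for p in pages])
        # clf = SVC(kernel="rbf").fit(X, labels)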

  4. An improved discriminative filter bank selection approach for motor imagery EEG signal classification using mutual information.

    PubMed

    Kumar, Shiu; Sharma, Alok; Tsunoda, Tatsuhiko

    2017-12-28

    Common spatial pattern (CSP) has been an effective technique for feature extraction in electroencephalography (EEG) based brain computer interfaces (BCIs). However, motor imagery EEG signal feature extraction using CSP depends to a great extent on the selection of the frequency bands. In this study, we propose a mutual information based frequency band selection approach. The idea of the proposed method is to utilize the information from all available channels to effectively select the most discriminative filter banks. CSP features are extracted from multiple overlapping sub-bands. An additional sub-band is introduced that covers the wide frequency band (7-30 Hz), and two different types of features are extracted using CSP and common spatio-spectral pattern techniques, respectively. Mutual information is then computed from the extracted features of each of these bands, and the top filter banks are selected for further processing. Linear discriminant analysis is applied to the features extracted from each of the filter banks. The scores are fused together, and classification is done using a support vector machine. The proposed method is evaluated using BCI Competition III dataset IVa, BCI Competition IV dataset I and BCI Competition IV dataset IIb, and it outperformed all other competing methods, achieving the lowest misclassification rate and the highest kappa coefficient on all three datasets. By introducing a wide sub-band and using mutual information for selecting the most discriminative sub-bands, the proposed method improves motor imagery EEG signal classification.
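
    A hedged sketch of the band-selection step described above (CSP extraction and the LDA/SVM stages are omitted; `csp_by_band` is a hypothetical list of per-band feature matrices):

        import numpy as np
        from sklearn.feature_selection import mutual_info_classif

        def select_filter_banks(csp_by_band, labels, top_k=4):
            """Rank sub-bands by the mutual information between their CSP features
            and the class labels, keeping the top_k most discriminative bands."""
            scores = np.array([mutual_info_classif(f, labels, random_state=0).sum()
                               for f in csp_by_band])
            return np.argsort(scores)[::-1][:top_k], scores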

  5. Heart Sound Biometric System Based on Marginal Spectrum Analysis

    PubMed Central

    Zhao, Zhidong; Shen, Qinqin; Ren, Fangqin

    2013-01-01

    This work presents a heart sound biometric system based on marginal spectrum analysis, which is a new feature extraction technique for identification purposes. This heart sound identification system is comprised of signal acquisition, pre-processing, feature extraction, training, and identification. Experiments on the selection of the optimal values for the system parameters are conducted. The results indicate that the new spectrum coefficients result in a significant increase in the recognition rate of 94.40% compared with that of the traditional Fourier spectrum (84.32%) based on a database of 280 heart sounds from 40 participants. PMID:23429515

  6. Classification of forensic autopsy reports through conceptual graph-based document representation model.

    PubMed

    Mujtaba, Ghulam; Shuib, Liyana; Raj, Ram Gopal; Rajandram, Retnagowri; Shaikh, Khairunisa; Al-Garadi, Mohammed Ali

    2018-06-01

    Text categorization has been used extensively in recent years to classify plain-text clinical reports. This study employs text categorization techniques for the classification of open narrative forensic autopsy reports. One of the key steps in text classification is document representation, in which a clinical report is transformed into a format that is suitable for classification. The traditional document representation technique for text categorization is the bag-of-words (BoW) technique. In this study, the traditional BoW technique proved ineffective in classifying forensic autopsy reports because it merely extracts frequent, but not necessarily discriminative, features from clinical reports. Moreover, this technique fails to capture word inversion, as well as word-level synonymy and polysemy, when classifying autopsy reports. Hence, the BoW technique suffers from low accuracy and low robustness unless it is improved with contextual and application-specific information. To overcome these limitations, this research develops an effective conceptual graph-based document representation (CGDR) technique to classify 1500 forensic autopsy reports covering four (4) manners of death (MoD) and sixteen (16) causes of death (CoD). Term-based and Systematized Nomenclature of Medicine-Clinical Terms (SNOMED CT) based conceptual features were extracted and represented through graphs. These features were then used to train a two-level text classifier: the first level predicts the MoD, and the second level predicts the CoD using the proposed conceptual graph-based document representation. To demonstrate the significance of the proposed technique, its results were compared with those of six (6) state-of-the-art document representation techniques. Lastly, this study compared the effects of one-level and two-level classification on the experimental results. The experimental results indicated that the CGDR technique achieved a 12% to 15% improvement in accuracy compared with fully automated document representation baseline techniques, and two-level classification obtained better results than one-level classification. The promising results of the proposed conceptual graph-based document representation technique suggest that pathologists can adopt the proposed system as a basis for a second opinion, supporting them in effectively determining the CoD. Copyright © 2018 Elsevier Inc. All rights reserved.

  7. A High-Speed Vision-Based Sensor for Dynamic Vibration Analysis Using Fast Motion Extraction Algorithms.

    PubMed

    Zhang, Dashan; Guo, Jie; Lei, Xiujun; Zhu, Changan

    2016-04-22

    The development of image sensors and optics enables the application of vision-based techniques to the non-contact dynamic vibration analysis of large-scale structures. As an emerging technology, a vision-based approach allows for remote measurement and does not add any mass to the measured object, in contrast with traditional contact measurements. In this study, a high-speed vision-based sensor system is developed to extract structure vibration signals in real time. A fast motion extraction algorithm is required for this system because the maximum sampling frequency of the charge-coupled device (CCD) sensor can reach up to 1000 Hz. Two efficient subpixel-level motion extraction algorithms, namely the modified Taylor approximation refinement algorithm and the localization refinement algorithm, are integrated into the proposed vision sensor. Quantitative analysis shows that both modified algorithms are at least five times faster than conventional upsampled cross-correlation approaches and achieve satisfactory error performance. The practicability of the developed sensor is evaluated in a laboratory experiment and a field test. Experimental results indicate that the developed high-speed vision-based sensor system can extract accurate dynamic structure vibration signals by tracking either artificial targets or natural features.
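
    The sketch below illustrates generic subpixel motion extraction by cross-correlation with a parabolic peak refinement; it is not the modified Taylor approximation refinement or localization refinement algorithms of the paper, and it assumes the correlation peak does not lie on the image border.

        import numpy as np
        from scipy.signal import correlate2d

        def subpixel_shift(ref, cur):
            """Estimate the (dy, dx) displacement of frame `cur` relative to `ref`."""
            c = correlate2d(cur - cur.mean(), ref - ref.mean(), mode='same')
            py, px = np.unravel_index(np.argmax(c), c.shape)

            def parabolic(cm, c0, cp):                 # 1-D parabolic peak interpolation
                denom = cm - 2 * c0 + cp
                return 0.0 if denom == 0 else 0.5 * (cm - cp) / denom

            dy = py + parabolic(c[py - 1, px], c[py, px], c[py + 1, px]) - ref.shape[0] // 2
            dx = px + parabolic(c[py, px - 1], c[py, px], c[py, px + 1]) - ref.shape[1] // 2
            return dy, dx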

  8. Improving KPCA Online Extraction by Orthonormalization in the Feature Space.

    PubMed

    Souza Filho, Joao B O; Diniz, Paulo S R

    2018-04-01

    Recently, some online kernel principal component analysis (KPCA) techniques based on the generalized Hebbian algorithm (GHA) were proposed for use in large data sets, defining kernel components using concise dictionaries automatically extracted from data. This brief proposes two new online KPCA extraction algorithms, exploiting orthogonalized versions of the GHA rule. In both cases, the orthogonalization of kernel components is achieved by adding some low-complexity steps to the kernel Hebbian algorithm, thus not substantially affecting the computational cost of the algorithm. Results show improved convergence speed and accuracy of the components extracted by the proposed methods, as compared with state-of-the-art online KPCA extraction algorithms.
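
    For context, the sketch below shows one update of the plain (linear) generalized Hebbian algorithm on which the kernelized, dictionary-based variants above build; the kernel mapping, dictionary construction and the orthonormalization steps proposed in the brief are not reproduced.

        import numpy as np

        def gha_step(W, x, lr=1e-3):
            """One Sanger-rule (GHA) update for online PCA.
            W: (n_components, n_features) current estimates; x: one centred sample."""
            y = W @ x                                      # component outputs
            # Hebbian term minus deflation by already-extracted components
            W += lr * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
            return W

        # periodic re-orthonormalization (the idea exploited above), e.g. via QR:
        # Q, _ = np.linalg.qr(W.T); W = Q.T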

  9. Novel capsule phase microextraction in combination with liquid chromatography-tandem mass spectrometry for determining personal care products in environmental water.

    PubMed

    Lakade, Sameer S; Borrull, Francesc; Furton, Kenneth G; Kabir, Abuzar; Marcé, Rosa Maria; Fontanals, Núria

    2018-05-01

    A novel sample preparation technique named capsule phase microextraction (CPME) is presented here. The technique utilizes a miniaturized microextraction capsule (MEC) as the extraction medium. The MEC consists of two conjoined porous tubular polypropylene membranes, one of which encapsulates the sorbent through sol-gel technology, while the other encapsulates a magnetic metal rod. As such, the MEC integrates both the extraction and stirring mechanisms into a single device. The aim of this article is to demonstrate the application potential of CPME as a sample preparation technique for the extraction of a group of personal care products (PCPs) from water matrices. Among the different sol-gel sorbent materials evaluated (UCON®, poly(caprolactone-dimethylsiloxane-caprolactone) (PCAP-DMS-CAP) and Carbowax 20M (CW-20M)), CW-20M MEC demonstrated the best extraction performance for the selected PCPs. The extraction conditions for sol-gel CW-20M MEC were optimized, including sample pH, stirring speed, addition of salt, extraction time, sample volume, liquid desorption solvent, and desorption time. Under the optimal conditions, sol-gel CW-20M MEC provided recoveries ranging between 47 and 90% for all analytes, except for ethylparaben, which showed a recovery of 26%. A method based on CPME with sol-gel CW-20M followed by liquid chromatography-tandem mass spectrometry was developed and validated for the extraction of PCPs from river water and effluent wastewater samples. When analyzing different environmental samples, some analytes such as 2,4-dihydroxybenzophenone, 2,2-dihydroxy-4-4 methoxybenzophenone and 3-benzophenone were found at low ng L(-1) levels.

  10. A novel fatty-acid-based in-tube dispersive liquid-liquid microextraction technique for the rapid determination of nonylphenol and 4-tert-octylphenol in aqueous samples using high-performance liquid chromatography-ultraviolet detection.

    PubMed

    Shih, Hou-Kuang; Shu, Ting-Yun; Ponnusamy, Vinoth Kumar; Jen, Jen-Fon

    2015-01-07

    In this study, a novel fatty-acid-based in-tube dispersive liquid-liquid microextraction (FA-IT-DLLME) technique is proposed for the first time and is developed as a simple, rapid and eco-friendly sample extraction method for the determination of alkylphenols in aqueous samples using high-performance liquid chromatography-ultraviolet detection (HPLC-UV). In this extraction method, medium-chain saturated fatty acids were investigated as a pH-dependent phase because they acted as either anionic surfactants or neutral extraction solvents based on the acid-base reaction caused solely by the adjustment of the pH of the solution. A specially designed home-made glass extraction tube with a built-in scaled capillary tube was utilized as the phase-separation device for the FA-IT-DLLME to collect and measure the separated extractant phase for analysis. Nonylphenol (NP) and 4-tert-octylphenol (4-tOP) were chosen as model analytes. The parameters influencing the FA-IT-DLLME were thoroughly investigated and optimized. Under the optimal conditions, the detector responses of NP and 4-tOP were linear in the concentration ranges of 5-4000 μg L(-1), with correlation coefficients of 0.9990 and 0.9996 for NP and 4-tOP, respectively. The limits of detection based on a signal-to-noise ratio of 3 were 0.7 and 0.5 μg L(-1), and the enrichment factors were 195 and 143 for NP and 4-tOP, respectively. The applicability of the developed method was demonstrated for the analysis of alkylphenols in environmental wastewater samples, and the recoveries ranged from 92.9 to 107.1%. The extraction process required less than 4 min and utilized only acids, alkalis, and fatty acids to achieve the extraction. The results demonstrated that the presented FA-IT-DLLME approach is highly cost-effective, simple, rapid and environmentally friendly in its sample preparation. Copyright © 2014 Elsevier B.V. All rights reserved.

  11. Natural dye extract of lawsonia inermis seed as photo sensitizer for titanium dioxide based dye sensitized solar cells.

    PubMed

    Ananth, S; Vivek, P; Arumanayagam, T; Murugakoothan, P

    2014-07-15

    A natural dye extract of Lawsonia inermis seed was used as a photosensitizer to fabricate titanium dioxide nanoparticle based dye sensitized solar cells. Pure titanium dioxide (TiO2) nanoparticles in the anatase phase were synthesized by the sol-gel technique, and pre-dye-treated TiO2 nanoparticles were synthesized using a modified sol-gel technique in which the lawsone-pigment-rich natural dye was mixed in during the synthesis itself. This pre-dye treatment yielded colored TiO2 nanoparticles with uniform adsorption of the natural dye, reduced agglomeration, less dye aggregation and improved morphology. The pure and pre-dye-treated TiO2 nanoparticles were subjected to structural, optical, spectral and morphological studies. Dye sensitized solar cells (DSSCs) fabricated using the pre-dye-treated and pure TiO2 nanoparticles sensitized by the natural dye extract of Lawsonia inermis seed showed promising solar light-to-electricity conversion efficiencies of 1.47% and 1%, respectively. The pre-dye-treated TiO2 based DSSC thus showed an efficiency improvement of 47% compared with the conventional DSSC. Copyright © 2014 Elsevier B.V. All rights reserved.

  12. On the radiated EMI current extraction of dc transmission line based on corona current statistical measurements

    NASA Astrophysics Data System (ADS)

    Yi, Yong; Chen, Zhengying; Wang, Liming

    2018-05-01

    Corona-originated discharge of DC transmission lines is the main cause of the radiated electromagnetic interference (EMI) field in the vicinity of the lines. A joint time-frequency analysis technique is proposed to extract the radiated EMI current (excitation current) of DC corona based on statistical measurements of the corona current. A reduced-scale experimental platform was set up to measure the statistical distributions of current waveform parameters of an aluminum conductor steel reinforced (ACSR) line. Based on the measured results, the peak, root-mean-square and average values of the 0.5 MHz radiated EMI current, with 9 kHz and 200 Hz bandwidths, were calculated by the proposed technique and validated against the conventional excitation function method. Radio interference (RI) was calculated from the radiated EMI current, and a wire-to-plate platform was built to check the validity of the RI computation results. The reasons for the deviation between computations and measurements were analyzed in detail.

  13. Image Processing for Planetary Limb/Terminator Extraction

    NASA Technical Reports Server (NTRS)

    Udomkesmalee, S.; Zhu, D. Q.; Chu, C. -C.

    1995-01-01

    A novel image segmentation technique for extracting the limb and terminator of planetary bodies is proposed. Conventional edge-based histogramming approaches are used to trace object boundaries. The limb and terminator bifurcation is achieved by locating the harmonized segment in the two equations representing the 2-D parameterized boundary curve. Real planetary images from Voyager 1 and 2 served as representative test cases to verify the proposed methodology.

  14. Vision-based weld pool boundary extraction and width measurement during keyhole fiber laser welding

    NASA Astrophysics Data System (ADS)

    Luo, Masiyang; Shin, Yung C.

    2015-01-01

    In keyhole fiber laser welding processes, the weld pool behavior is essential in determining welding quality. To better observe and control the welding process, accurate extraction of the weld pool boundary as well as its width is required. This work presents a weld pool edge detection technique based on an off-axial green illumination laser and a coaxial image capturing system that consists of a CMOS camera and optical filters. To cope with differences in image quality, a complete edge detection algorithm is developed based on a local-maximum-greyness-gradient search approach and linear interpolation. The extracted weld pool geometry and width are validated against actual weld width measurements and predictions of a numerical multi-phase model.

  15. Kruskal-Wallis-based computationally efficient feature selection for face recognition.

    PubMed

    Ali Khan, Sajid; Hussain, Ayyaz; Basit, Abdul; Akram, Sheeraz

    2014-01-01

    Face recognition and its applications attain much importance in today's technological world. Most of the existing work uses frontal face images for classification; however, these techniques fail when applied to real-world face images. The proposed technique effectively extracts the prominent facial features. Many of the extracted features are redundant and do not contribute to representing the face, so a computationally efficient Kruskal-Wallis-based algorithm is used to select the more discriminative face features. The selected features are then passed to the classification step, in which different classifiers are ensembled to enhance the recognition accuracy, since a single classifier is unable to achieve high accuracy. Experiments are performed on standard face database images and the results are compared with existing techniques.
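
    A brief sketch of the Kruskal-Wallis-based selection step named in the title (the feature extraction and classifier ensemble stages are not shown; array names are illustrative):

        import numpy as np
        from scipy.stats import kruskal

        def kruskal_select(features, labels, top_k=50):
            """Rank features by the Kruskal-Wallis H statistic across subject classes
            and keep the indices of the top_k most discriminative ones."""
            classes = np.unique(labels)
            scores = [kruskal(*[features[labels == c, j] for c in classes])[0]
                      for j in range(features.shape[1])]
            return np.argsort(scores)[::-1][:top_k]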

  16. SPECTROSCOPIC ONLINE MONITORING FOR PROCESS CONTROL AND SAFEGUARDING OF RADIOCHEMICAL STREAMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bryan, Samuel A.; Levitskaia, Tatiana G.

    2013-09-29

    There is a renewed interest worldwide to promote the use of nuclear power and close the nuclear fuel cycle. The long term successful use of nuclear power is critically dependent upon adequate and safe processing and disposition of the used nuclear fuel. Liquid-liquid extraction is a separation technique commonly employed for the processing of the dissolved used nuclear fuel. The instrumentation used to monitor these processes must be robust, require little or no maintenance, and be able to withstand harsh environments such as high radiation fields and aggressive chemical matrices. This paper summarizes application of the absorption and vibrational spectroscopic techniques supplemented by physicochemical measurements for radiochemical process monitoring. In this context, our team experimentally assessed the potential of Raman and spectrophotometric techniques for online real-time monitoring of the U(VI)/nitrate ion/nitric acid and Pu(IV)/Np(V)/Nd(III), respectively, in solutions relevant to spent fuel reprocessing. These techniques demonstrate robust performance in the repetitive batch measurements of each analyte in a wide concentration range using simulant and commercial dissolved spent fuel solutions. Spectroscopic measurements served as training sets for the multivariate data analysis to obtain partial least squares predictive models, which were validated using on-line centrifugal contactor extraction tests. Satisfactory prediction of the analyte concentrations in these preliminary experiments warrants further development of the spectroscopy-based methods for radiochemical process control and safeguarding. Additionally, the ability to identify material intentionally diverted from a liquid-liquid extraction contactor system was successfully tested using on-line process monitoring as a means to detect the amount of material diverted. A chemical diversion and its detection in a liquid-liquid extraction scheme were demonstrated using a centrifugal contactor system operating with the simulant PUREX extraction system of Nd(NO3)3/nitric acid aqueous phase and TBP/n-dodecane organic phase. During a continuous extraction experiment, a portion of the feed from a counter-current extraction system was diverted while the spectroscopic on-line process monitoring system was simultaneously measuring the feed, raffinate and organic product streams. The amount observed to be diverted by on-line spectroscopic process monitoring was in excellent agreement with values based on the known mass of sample directly taken (diverted) from the system feed solution.

  17. Improving EEG-Based Motor Imagery Classification for Real-Time Applications Using the QSA Method.

    PubMed

    Batres-Mendoza, Patricia; Ibarra-Manzano, Mario A; Guerra-Hernandez, Erick I; Almanza-Ojeda, Dora L; Montoro-Sanjose, Carlos R; Romero-Troncoso, Rene J; Rostro-Gonzalez, Horacio

    2017-01-01

    We present an improvement to the quaternion-based signal analysis (QSA) technique to extract electroencephalography (EEG) signal features with a view to developing real-time applications, particularly in motor imagery (IM) cognitive processes. The proposed methodology (iQSA, improved QSA) extracts features such as the average, variance, homogeneity, and contrast of EEG signals related to motor imagery in a more efficient manner (i.e., by reducing the number of samples needed to classify the signal and improving the classification percentage) compared to the original QSA technique. Specifically, we can sample the signal in variable time periods (from 0.5 s to 3 s, in half-a-second intervals) to determine the relationship between the number of samples and their effectiveness in classifying signals. In addition, to strengthen the classification process a number of boosting-technique-based decision trees were implemented. The results show an 82.30% accuracy rate for 0.5 s samples and 73.16% for 3 s samples. This is a significant improvement compared to the original QSA technique that offered results from 33.31% to 40.82% without sampling window and from 33.44% to 41.07% with sampling window, respectively. We can thus conclude that iQSA is better suited to develop real-time applications.

  19. Sparse alignment for robust tensor learning.

    PubMed

    Lai, Zhihui; Wong, Wai Keung; Xu, Yong; Zhao, Cairong; Sun, Mingming

    2014-10-01

    Multilinear/tensor extensions of manifold learning based algorithms have been widely used in computer vision and pattern recognition. This paper first provides a systematic analysis of the multilinear extensions for the most popular methods by using alignment techniques, thereby obtaining a general tensor alignment framework. From this framework, it is easy to show that the manifold learning based tensor learning methods are intrinsically different from the alignment techniques. Based on the alignment framework, a robust tensor learning method called sparse tensor alignment (STA) is then proposed for unsupervised tensor feature extraction. Different from the existing tensor learning methods, L1- and L2-norms are introduced to enhance the robustness in the alignment step of the STA. The advantage of the proposed technique is that the difficulty in selecting the size of the local neighborhood can be avoided in the manifold learning based tensor feature extraction algorithms. Although STA is an unsupervised learning method, the sparsity encodes the discriminative information in the alignment step and provides the robustness of STA. Extensive experiments on the well-known image databases as well as action and hand gesture databases by encoding object images as tensors demonstrate that the proposed STA algorithm gives the most competitive performance when compared with the tensor-based unsupervised learning methods.

  20. Microemulsion-based lycopene extraction: Effect of surfactants, co-surfactants and pretreatments.

    PubMed

    Amiri-Rigi, Atefeh; Abbasi, Soleiman

    2016-04-15

    Lycopene is a potent antioxidant that has received extensive attention recently. Due to the challenges encountered with current methods of lycopene extraction using hazardous solvents, industry calls for a greener, safer and more efficient process. The main purpose of the present study was the application of a microemulsion technique to extract lycopene from tomato pomace. In this respect, the effects of eight different surfactants, four different co-surfactants, and ultrasound and enzyme pretreatments on lycopene extraction efficiency were examined. Experimental results revealed that the combination of ultrasound and enzyme pretreatments, saponin as a natural surfactant, and glycerol as a co-surfactant, in the bicontinuous region of the microemulsion, gave the optimal experimental conditions, resulting in a microemulsion containing 409.68±0.68 μg/g lycopene. The high lycopene concentration achieved indicates that the microemulsion technique, using a low-cost natural surfactant, could be promising for a simple and safe separation of lycopene from tomato pomace and possibly from tomato industrial wastes. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Neural network-based brain tissue segmentation in MR images using extracted features from intraframe coding in H.264

    NASA Astrophysics Data System (ADS)

    Jafari, Mehdi; Kasaei, Shohreh

    2012-01-01

    Automatic brain tissue segmentation is a crucial task in diagnosis and treatment based on medical images. This paper presents a new algorithm to segment different brain tissues, such as white matter (WM), gray matter (GM), cerebral spinal fluid (CSF), background (BKG), and tumor tissues. The proposed technique uses modified intraframe coding from H.264/AVC for feature extraction. Extracted features are then fed to an artificial back propagation neural network (BPN) classifier to assign each block to its appropriate class. Since the newest coding standard, H.264/AVC, has the highest compression ratio, it decreases the dimension of the extracted features and thus yields a more accurate classifier with low computational complexity. The performance of the BPN classifier is evaluated in terms of classification accuracy and computational complexity. The results show that the proposed technique is more robust and effective, with low computational complexity, compared to other recent works.

  3. Automatic Feature Extraction from Planetary Images

    NASA Technical Reports Server (NTRS)

    Troglio, Giulia; Le Moigne, Jacqueline; Benediktsson, Jon A.; Moser, Gabriele; Serpico, Sebastiano B.

    2010-01-01

    With the launch of several planetary missions in the last decade, a large number of planetary images has already been acquired and many more will become available for analysis in the coming years. The image data need to be analyzed, preferably by automatic processing techniques, because of the huge amount of data. Although many automatic feature extraction methods have been proposed and utilized for Earth remote sensing images, these methods are not always applicable to planetary data, which often present low contrast and uneven illumination. Different methods have already been presented for crater extraction from planetary images, but the detection of other types of planetary features has not been addressed yet. Here, we propose a new unsupervised method for the extraction of different features from the surface of the analyzed planet, based on the combination of several image processing techniques, including watershed segmentation and the generalized Hough transform. The method has many applications, among which is image registration, and it can be applied to arbitrary planetary images.

  4. The role of chemometrics in single and sequential extraction assays: a review. Part II. Cluster analysis, multiple linear regression, mixture resolution, experimental design and other techniques.

    PubMed

    Giacomino, Agnese; Abollino, Ornella; Malandrino, Mery; Mentasti, Edoardo

    2011-03-04

    Single and sequential extraction procedures are used for studying element mobility and availability in solid matrices, like soils, sediments, sludge, and airborne particulate matter. In the first part of this review we reported an overview on these procedures and described the applications of chemometric uni- and bivariate techniques and of multivariate pattern recognition techniques based on variable reduction to the experimental results obtained. The second part of the review deals with the use of chemometrics not only for the visualization and interpretation of data, but also for the investigation of the effects of experimental conditions on the response, the optimization of their values and the calculation of element fractionation. We will describe the principles of the multivariate chemometric techniques considered, the aims for which they were applied and the key findings obtained. The following topics will be critically addressed: pattern recognition by cluster analysis (CA), linear discriminant analysis (LDA) and other less common techniques; modelling by multiple linear regression (MLR); investigation of spatial distribution of variables by geostatistics; calculation of fractionation patterns by a mixture resolution method (Chemometric Identification of Substrates and Element Distributions, CISED); optimization and characterization of extraction procedures by experimental design; other multivariate techniques less commonly applied. Copyright © 2010 Elsevier B.V. All rights reserved.

  5. Mathematical morphology-based shape feature analysis for Chinese character recognition systems

    NASA Astrophysics Data System (ADS)

    Pai, Tun-Wen; Shyu, Keh-Hwa; Chen, Ling-Fan; Tai, Gwo-Chin

    1995-04-01

    This paper proposes an efficient technique for shape feature extraction based on mathematical morphology theory. A new shape complexity index for preclassification in machine-printed Chinese character recognition (CCR) is also proposed. For characters represented in different fonts and sizes or in a low-resolution environment, a more stable local feature such as shape structure is preferred for character recognition. Morphological valley extraction filters are applied to extract the protrusive strokes from the four sides of an input Chinese character. The number of extracted local strokes reflects the shape complexity of each side, and these shape features are encoded as corresponding shape complexity indices. Based on the shape complexity index, the character database can be classified into 16 groups prior to the recognition procedure. Associating shape feature analysis with an existing recognition system reclaims several characters from the misrecognized character sets and results in an average improvement of 3.3% in recognition rate. In addition to enhancing recognition performance, the extracted stroke information can be further analyzed and classified by stroke type. The combination of extracted strokes from each side therefore provides a means for database clustering based on radical or subword components, which is one of the best solutions for recognizing high-complexity character sets such as Chinese, comprising more than 200 categories and more than 13,000 characters.

  6. Trace-fiber color discrimination by electrospray ionization mass spectrometry: a tool for the analysis of dyes extracted from submillimeter nylon fibers.

    PubMed

    Tuinman, Albert A; Lewis, Linda A; Lewis, Samuel A

    2003-06-01

    The application of electrospray ionization mass spectrometry (ESI-MS) to trace-fiber color analysis is explored using acidic dyes commonly employed to color nylon-based fibers, as well as extracts from dyed nylon fibers. Qualitative information about constituent dyes and quantitative information about the relative amounts of those dyes present on a single fiber become readily available using this technique. Sample requirements for establishing the color identity of different samples (i.e., comparative trace-fiber analysis) are shown to be submillimeter. Absolute verification of dye mixture identity (beyond the comparison of molecular weights derived from ESI-MS) can be obtained by expanding the technique to include tandem mass spectrometry (ESI-MS/MS). For dyes of unknown origin, ESI-MS/MS analyses may offer insights into the chemical structure of the compound, information not available from chromatographic techniques alone. This research demonstrates that ESI-MS is viable as a sensitive technique for distinguishing dye constituents extracted from a minute amount of trace-fiber evidence. A protocol is suggested to establish or refute the proposition that two fibers, one of which is available only in minute quantity, are of the same origin.

  7. Spectral features based tea garden extraction from digital orthophoto maps

    NASA Astrophysics Data System (ADS)

    Jamil, Akhtar; Bayram, Bulent; Kucuk, Turgay; Zafer Seker, Dursun

    2018-05-01

    The advancements in photogrammetry and remote sensing technologies have made it possible to extract useful tangible information from data, which plays a pivotal role in various applications such as the management and monitoring of forests and agricultural lands. This study aimed to evaluate the effectiveness of spectral signatures for the extraction of tea gardens from 1:5000 scaled digital orthophoto maps obtained from Rize city in Turkey. First, the normalized difference vegetation index (NDVI) was derived from the input images to suppress non-vegetation areas; NDVI values less than zero were discarded and the output image was normalized to the range 0-255. Individual pixels were then mapped into meaningful objects using a global region-growing technique. The resulting image was filtered and smoothed to reduce the impact of noise. Furthermore, geometrical constraints were applied to remove small objects (less than 500 pixels), followed by a morphological opening operator to enhance the results. These objects served as building blocks for further image analysis. Finally, in the classification stage, a range of spectral values was empirically calculated for each band and applied to the candidate objects to extract tea gardens. For accuracy assessment, we employed an area-based similarity metric, overlapping the obtained tea garden boundaries with boundaries manually digitized by photogrammetry experts. The overall accuracy of the proposed method was 89% for tea gardens over 10 sample orthophoto maps. We conclude that exploiting spectral signatures using object-based analysis is an effective technique for the extraction of a dominant tree species from digital orthophoto maps.
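
    A minimal sketch of the NDVI stage described above (the band arrays and 0-255 rescaling follow the text; the region growing, filtering, morphology and spectral-rule classification stages are not shown):

        import numpy as np

        def ndvi_vegetation_layer(red, nir):
            """Compute NDVI, discard values below zero and rescale to 0-255."""
            red, nir = red.astype(float), nir.astype(float)
            ndvi = (nir - red) / np.clip(nir + red, 1e-6, None)   # avoid division by zero
            ndvi[ndvi < 0] = np.nan                               # suppress non-vegetation
            lo, hi = np.nanmin(ndvi), np.nanmax(ndvi)
            return np.nan_to_num(255 * (ndvi - lo) / (hi - lo)).astype(np.uint8)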

  8. Sample preparation for the analysis of isoflavones from soybeans and soy foods.

    PubMed

    Rostagno, M A; Villares, A; Guillamón, E; García-Lafuente, A; Martínez, J A

    2009-01-02

    This manuscript provides a review of the actual state and the most recent advances as well as current trends and future prospects in sample preparation and analysis for the quantification of isoflavones from soybeans and soy foods. Individual steps of the procedures used in sample preparation, including sample conservation, extraction techniques and methods, and post-extraction treatment procedures are discussed. The most commonly used methods for extraction of isoflavones with both conventional and "modern" techniques are examined in detail. These modern techniques include ultrasound-assisted extraction, pressurized liquid extraction, supercritical fluid extraction and microwave-assisted extraction. Other aspects such as stability during extraction and analysis by high performance liquid chromatography are also covered.

  9. Palmprint verification using Lagrangian decomposition and invariant interest points

    NASA Astrophysics Data System (ADS)

    Gupta, P.; Rattani, A.; Kisku, D. R.; Hwang, C. J.; Sing, J. K.

    2011-06-01

    This paper presents a palmprint based verification system using SIFT features and a Lagrangian network graph technique. We employ SIFT for feature extraction from palmprint images, where the region of interest (ROI), extracted from the wide palm texture at the preprocessing stage, is considered for invariant interest point extraction. Finally, identity is established by finding a permutation matrix for a pair of reference and probe palm graphs drawn on the extracted SIFT features; the permutation matrix is used to minimize the distance between the two graphs. The proposed system has been tested on the CASIA and IITK palmprint databases, and experimental results reveal the effectiveness and robustness of the system.
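
    A short sketch of the SIFT stage using OpenCV (assuming a recent opencv-python build that exposes SIFT); the Lagrangian graph matching via a permutation matrix is not reproduced, and the ratio test shown here is only a generic way to keep distinctive correspondences:

        import cv2

        def palm_sift_matches(roi1, roi2, ratio=0.75):
            """Detect SIFT interest points on two palmprint ROIs and match descriptors."""
            sift = cv2.SIFT_create()
            kp1, des1 = sift.detectAndCompute(roi1, None)
            kp2, des2 = sift.detectAndCompute(roi2, None)
            matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des1, des2, k=2)
            good = [m for m, n in matches if m.distance < ratio * n.distance]
            return kp1, kp2, good

        # hypothetical usage with grayscale ROI images:
        # kp1, kp2, good = palm_sift_matches(cv2.imread('ref.png', 0), cv2.imread('probe.png', 0))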

  10. Chemical Processing of Non-Crop Plants for Jet Fuel Blends Production

    NASA Technical Reports Server (NTRS)

    Kulis, M. J.; Hepp, A. F.; McDowell, M.; Ribita, D.

    2009-01-01

    The use of biofuels has been gaining popularity over the past few years due to their ability to reduce dependence on fossil fuels. As a renewable energy source, biofuels can be a viable option for sustaining long-term energy needs if they are managed efficiently. We describe our initial efforts to exploit algae, halophytes and other non-crop plants to produce synthetics for fuel blends that can potentially be used as fuels for aviation and non-aerospace applications. Our efforts have been dedicated to crafting efficient extraction and refining processes in order to extract constituents from the plant materials, with the ultimate goal of determining the feasibility of producing biomass-based jet fuel from the refined extract. Two extraction methods have been developed, based on comminution processes and liquid-solid extraction techniques. Refining procedures such as chlorophyll removal and transesterification of triglycerides have been performed. Gas chromatography in tandem with mass spectrometry is currently being utilized to qualitatively determine the individual components of the refined extract. We also briefly discuss and compare alternative methods to extract fuel-blending agents from alternative biofuel sources.

  11. Essential Oil Variability and Biological Activities of Tetraclinis articulata (Vahl) Mast. Wood According to the Extraction Time.

    PubMed

    Djouahri, Abderrahmane; Saka, Boualem; Boudarene, Lynda; Baaliouamer, Aoumeur

    2016-12-01

    In the present work, the hydrodistillation (HD) and microwave-assisted hydrodistillation (MAHD) kinetics of the essential oil (EO) extracted from Tetraclinis articulata (Vahl) Mast. wood were studied in order to assess the impact of extraction time and technique on chemical composition and biological activities. Gas chromatography (GC) and GC/mass spectrometry analyses showed significant differences between the extracted EOs, with each compound family and individual component showing specific kinetics with respect to extraction time and technique, especially the major components: camphene, linalool, cedrol, carvacrol and α-acorenol. Furthermore, our findings showed high variability in both antioxidant and anti-inflammatory activities, with each activity depending specifically on extraction time and technique. The highlighted variability reflects the strong impact of extraction time and technique on chemical composition and biological activities, leading to the conclusion that EOs to be investigated should be selected carefully according to extraction time and technique, in order to isolate the bioactive components or to obtain the best EO quality in terms of biological activities and preventive effects in food. © 2016 Wiley-VHCA AG, Zurich, Switzerland.

  12. Diesel Engine Valve Clearance Fault Diagnosis Based on Features Extraction Techniques and FastICA-SVM

    NASA Astrophysics Data System (ADS)

    Jing, Ya-Bing; Liu, Chang-Wen; Bi, Feng-Rong; Bi, Xiao-Yang; Wang, Xia; Shao, Kang

    2017-07-01

    Vibration-based techniques are rarely applied directly to diesel engine fault diagnosis, because the surface vibration signals of diesel engines exhibit complex non-stationary and nonlinear time-varying features. To investigate diesel engine fault diagnosis, the fractal correlation dimension, wavelet energy and wavelet entropy (features reflecting the fractal and energy characteristics of engine faults) are extracted from the decomposed signals obtained by analyzing vibration acceleration signals measured on the cylinder head in seven different states of the valve train. An intelligent fault detector, FastICA-SVM, is applied for diesel engine fault diagnosis and classification. The results demonstrate that FastICA-SVM achieves higher classification accuracy and better generalization performance in small-sample recognition. Moreover, when the fractal correlation dimension, wavelet energy and wavelet entropy are used as input vectors to the FastICA-SVM classifier, excellent classification results are obtained. The proposed methodology improves the accuracy of feature extraction and of diesel engine fault diagnosis.
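
    A minimal sketch of wavelet energy/entropy feature extraction followed by a FastICA-SVM classifier, using PyWavelets and scikit-learn on synthetic signals; the wavelet family, decomposition level and the synthetic data are assumptions, and the fractal correlation dimension feature is omitted.

```python
import numpy as np
import pywt
from sklearn.decomposition import FastICA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def wavelet_energy_entropy(signal, wavelet="db4", level=4):
    """Relative wavelet energy of each decomposition band plus the wavelet entropy."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    energies = np.array([np.sum(c ** 2) for c in coeffs])
    rel = energies / energies.sum()
    entropy = -np.sum(rel * np.log(rel + 1e-12))
    return np.concatenate([rel, [entropy]])

# placeholder vibration data: 7 valve-train states, 30 synthetic signals each
rng = np.random.default_rng(0)
n = np.arange(2048)
X = np.array([wavelet_energy_entropy(np.sin(2 * np.pi * (0.01 + 0.02 * s) * n)
                                     + 0.5 * rng.standard_normal(n.size))
              for s in range(7) for _ in range(30)])
y = np.repeat(np.arange(7), 30)

clf = make_pipeline(StandardScaler(), FastICA(n_components=5, random_state=0),
                    SVC(kernel="rbf"))
clf.fit(X, y)
```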

  13. Coherent vorticity extraction in resistive drift-wave turbulence: Comparison of orthogonal wavelets versus proper orthogonal decomposition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Futatani, S.; Bos, W.J.T.; Del-Castillo-Negrete, Diego B

    2011-01-01

    We assess two techniques for extracting coherent vortices out of turbulent flows: the wavelet based Coherent Vorticity Extraction (CVE) and the Proper Orthogonal Decomposition (POD). The former decomposes the flow field into an orthogonal wavelet representation and subsequent thresholding of the coefficients allows one to split the flow into organized coherent vortices with non-Gaussian statistics and an incoherent random part which is structureless. POD is based on the singular value decomposition and decomposes the flow into basis functions which are optimal with respect to the retained energy for the ensemble average. Both techniques are applied to direct numerical simulation data of two-dimensional drift-wave turbulence governed by the Hasegawa-Wakatani equation, considering two limit cases: the quasi-hydrodynamic and the quasi-adiabatic regimes. The results are compared in terms of compression rate, retained energy, retained enstrophy and retained radial flux, together with the enstrophy spectrum and higher order statistics. (c) 2010 Published by Elsevier Masson SAS on behalf of Academie des sciences.
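
    For the POD side of the comparison, the snapshot modes can be obtained directly from a singular value decomposition; the NumPy sketch below, on placeholder data, retains the leading modes that capture 99% of the energy. A wavelet-based CVE counterpart would instead threshold the coefficients of an orthogonal wavelet transform (for example with PyWavelets).

```python
import numpy as np

# rows are time snapshots, columns are grid points of the flattened vorticity field
rng = np.random.default_rng(1)
snapshots = rng.standard_normal((200, 4096))          # placeholder data

X = snapshots - snapshots.mean(axis=0)                # remove the ensemble mean
U, s, Vt = np.linalg.svd(X, full_matrices=False)      # POD modes are the rows of Vt
energy = s ** 2 / np.sum(s ** 2)
r = int(np.searchsorted(np.cumsum(energy), 0.99)) + 1 # modes retaining 99% of the energy
coherent_part = U[:, :r] @ np.diag(s[:r]) @ Vt[:r]    # reconstruction from leading modes
print(f"{r} modes retained out of {len(s)}")
```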

  14. Vision-based system identification technique for building structures using a motion capture system

    NASA Astrophysics Data System (ADS)

    Oh, Byung Kwan; Hwang, Jin Woo; Kim, Yousok; Cho, Tongjun; Park, Hyo Seon

    2015-11-01

    This paper presents a new vision-based system identification (SI) technique for building structures by using a motion capture system (MCS). The MCS, with its outstanding capabilities for dynamic response measurement, can provide gage-free measurements of vibrations through the convenient installation of multiple markers. In this technique, the dynamic characteristics (natural frequencies, mode shapes, and damping ratios) of building structures are extracted from the dynamic displacement responses measured by the MCS, by converting the MCS displacements to accelerations and then conducting SI by frequency domain decomposition (FDD). A free vibration experiment on a three-story shear frame was conducted to validate the proposed technique. The SI results from the conventional accelerometer-based method were compared with those from the proposed technique and showed good agreement, which confirms the validity and applicability of the proposed vision-based SI technique for building structures. Furthermore, SI performed by applying the MCS-measured displacements directly to FDD yielded results identical to those of the conventional SI method.
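
    A minimal sketch of the frequency domain decomposition step, assuming the MCS displacements have already been converted to multi-channel acceleration signals: the cross-spectral density matrix is assembled with SciPy and its first singular value is inspected for modal peaks. The array shapes, window length and synthetic signals are assumptions.

```python
import numpy as np
from scipy import signal

def fdd_first_singular_value(acc, fs, nperseg=1024):
    """Frequency Domain Decomposition sketch: assemble the cross-spectral density
    matrix at each frequency line and return its first singular value, whose peaks
    indicate the natural frequencies."""
    n_ch = acc.shape[0]
    f, _ = signal.csd(acc[0], acc[0], fs=fs, nperseg=nperseg)
    G = np.zeros((len(f), n_ch, n_ch), dtype=complex)
    for i in range(n_ch):
        for j in range(n_ch):
            _, G[:, i, j] = signal.csd(acc[i], acc[j], fs=fs, nperseg=nperseg)
    s1 = np.array([np.linalg.svd(G[k], compute_uv=False)[0] for k in range(len(f))])
    return f, s1

# synthetic 3-channel acceleration record sharing a 2.5 Hz mode
fs = 100.0
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(0)
acc = np.vstack([np.sin(2 * np.pi * 2.5 * t + p) + 0.3 * rng.standard_normal(t.size)
                 for p in (0.0, 0.4, 0.8)])
f, s1 = fdd_first_singular_value(acc, fs)
print("peak near", f[np.argmax(s1)], "Hz")
```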

  15. On-line coupling of supercritical fluid extraction and chromatographic techniques.

    PubMed

    Sánchez-Camargo, Andrea Del Pilar; Parada-Alfonso, Fabián; Ibáñez, Elena; Cifuentes, Alejandro

    2017-01-01

    This review summarizes and discusses recent advances and applications of on-line supercritical fluid extraction coupled to liquid chromatography, gas chromatography, and supercritical fluid chromatography. Supercritical fluids, owing to their exceptional physical properties, provide unique opportunities not only during the extraction step but also in the separation process. Although supercritical fluid extraction is especially suitable for the recovery of non-polar organic compounds, the technique can also be successfully applied to the extraction of polar analytes with the aid of modifiers. The supercritical fluid extraction process can be performed following "off-line" or "on-line" approaches, and their main features are contrasted herein. In addition, the parameters affecting the supercritical fluid extraction process are explained, and a "decision tree" is presented for the first time in this review as a guide for method development. The general principles (instrumental and methodological) of the different on-line couplings of supercritical fluid extraction with chromatographic techniques are described, and the advantages and shortcomings of supercritical fluid extraction as a hyphenated technique are discussed. Finally, an update of the most recent applications (from 2005 onwards) of these couplings is also presented. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Extraction and Determination of Quercetin from Ginkgo biloba by DESs-Based Polymer Monolithic Cartridge.

    PubMed

    Wang, Xiaoqin; Li, Guizhen; Ho Row, Kyung

    2017-09-01

    Deep eutectic solvents (DES) were formed from choline chloride (ChCl). A DES-modified polymer monolith (DES-M), a molecularly templated polymer monolith, and a non-DES monolith without a molecular template (non-DES-M) were synthesized by an identical process. These polymer materials were characterized by field emission scanning electron microscopy and Fourier transform infrared spectroscopy. The selective adsorption properties of the polymers were assessed by an adsorption capacity experiment and solid-phase extraction (SPE). The optimized extraction procedure was as follows: ultrasonic time of 30 min, ethanol as solvent, and a liquid-to-material ratio of 20 mL g-1. Under these conditions, the amount of quercetin extracted from Ginkgo biloba was 290.8 mg g-1. The purification of G. biloba was achieved by the SPE process. Based on the results, DES-based monolithic cartridges can be used for simple and efficient extraction and as a pre-concentration technique for the purification of bioactive compounds or drugs in aqueous environments with high affinity and selectivity. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  17. Development of Total Reflection X-ray fluorescence spectrometry quantitative methodologies for elemental characterization of building materials and their degradation products

    NASA Astrophysics Data System (ADS)

    García-Florentino, Cristina; Maguregui, Maite; Marguí, Eva; Torrent, Laura; Queralt, Ignasi; Madariaga, Juan Manuel

    2018-05-01

    In this work, a Total Reflection X-ray fluorescence (TXRF) spectrometry based quantitative methodology is proposed for the elemental characterization of liquid extracts and solids belonging to old building materials and their degradation products from an early-20th-century building of high historic and cultural value in Getxo (Basque Country, northern Spain). This quantification strategy can be considered faster than traditional Energy or Wavelength Dispersive X-ray fluorescence (ED-XRF and WD-XRF) spectrometry based methodologies or other techniques such as Inductively Coupled Plasma Mass Spectrometry (ICP-MS). In particular, two kinds of liquid extracts were analysed: (i) water-soluble extracts from different mortars and (ii) acid extracts from mortars, black crusts, and calcium carbonate formations. To avoid the acid extraction step of the materials and their degradation products, direct TXRF measurement of powdered solid suspensions in water was also studied. With this aim, parameters such as the deposition volume and the measuring time were studied for each kind of sample. Depending on the quantified element, the limits of detection achieved with the TXRF quantitative methodologies for liquid extracts and solids were around 0.01-1.2 and 2-200 mg/L, respectively. The quantification of K, Ca, Ti, Mn, Fe, Zn, Rb, Sr, Sn and Pb in the liquid extracts proved to be a faster alternative to more classical quantification techniques (e.g. ICP-MS), and accurate enough to obtain information about the composition of the acid-soluble part of the materials and their degradation products. Regarding the solid samples measured as suspensions, it was difficult to obtain stable and reproducible suspensions, which affected the accuracy of the results. To cope with this problem, correction factors based on the quantitative results obtained using ED-XRF were calculated to improve the accuracy of the TXRF results.

  18. Green extraction of grape skin phenolics by using deep eutectic solvents.

    PubMed

    Cvjetko Bubalo, Marina; Ćurko, Natka; Tomašević, Marina; Kovačević Ganić, Karin; Radojčić Redovniković, Ivana

    2016-06-01

    Conventional extraction techniques for plant phenolics are usually associated with high organic solvent consumption and long extraction times. In order to establish an environmentally friendly extraction method for grape skin phenolics, deep eutectic solvents (DES), as a green alternative to conventional solvents, coupled with highly efficient microwave-assisted and ultrasound-assisted extraction methods (MAE and UAE, respectively), have been considered. Initially, a screening of five different DES for the proposed extraction was performed, and a choline chloride-based DES containing oxalic acid as the hydrogen bond donor, with 25% water, was selected as the most promising, resulting in more effective extraction of grape skin phenolic compounds than conventional solvents. Additionally, in our study, UAE proved to be the best extraction method, with extraction efficiency superior to both MAE and the conventional extraction method. The knowledge acquired in this study will contribute to further DES implementation in the extraction of biologically active compounds from various plant sources. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Features Extraction of Flotation Froth Images and BP Neural Network Soft-Sensor Model of Concentrate Grade Optimized by Shuffled Cuckoo Searching Algorithm

    PubMed Central

    Wang, Jie-sheng; Han, Shuang; Shen, Na-na; Li, Shu-xia

    2014-01-01

    To meet the forecasting requirements for key technology indicators in the flotation process, a BP neural network soft-sensor model based on feature extraction from flotation froth images and optimized by a shuffled cuckoo search algorithm is proposed. Using digital image processing techniques, the color features in the HSI color space, texture features based on the gray-level co-occurrence matrix, and shape characteristics based on geometric theory are extracted from the flotation froth images as the input variables of the proposed soft-sensor model. The isometric mapping method is then used to reduce the input dimension, network size, and learning time of the BP neural network. Finally, a shuffled cuckoo search algorithm is adopted to optimize the BP neural network soft-sensor model. Simulation results show that the model has better generalization and prediction accuracy. PMID:25133210
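
    A rough sketch of the colour and texture part of such feature extraction, using scikit-image; HSV is used here as a stand-in for the HSI colour space, and the shape descriptors, Isomap reduction and BP network stages are not shown. The placeholder image and parameter choices are assumptions.

```python
import numpy as np
from skimage.color import rgb2gray, rgb2hsv
from skimage.feature import graycomatrix, graycoprops   # scikit-image >= 0.19 spelling
from skimage.util import img_as_ubyte

def froth_features(rgb_image):
    """Mean HSV colour plus four GLCM texture statistics for one froth image."""
    hsv = rgb2hsv(rgb_image)
    color_feats = hsv.reshape(-1, 3).mean(axis=0)

    gray = img_as_ubyte(rgb2gray(rgb_image))
    glcm = graycomatrix(gray, distances=[1],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=256, symmetric=True, normed=True)
    texture_feats = [graycoprops(glcm, p).mean()
                     for p in ("contrast", "homogeneity", "energy", "correlation")]
    return np.concatenate([color_feats, texture_feats])

rng = np.random.default_rng(0)
demo = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)   # placeholder froth image
print(froth_features(demo))
```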

  20. Main Road Extraction from ZY-3 Grayscale Imagery Based on Directional Mathematical Morphology and VGI Prior Knowledge in Urban Areas

    PubMed Central

    Liu, Bo; Wu, Huayi; Wang, Yandong; Liu, Wenming

    2015-01-01

    Main road features extracted from remotely sensed imagery play an important role in many civilian and military applications, such as updating Geographic Information System (GIS) databases, urban structure analysis, spatial data matching and road navigation. Current methods for road feature extraction from high-resolution imagery are typically based on threshold value segmentation; it is difficult, however, to completely separate road features from the background. We present a new method for extracting main roads from high-resolution grayscale imagery based on directional mathematical morphology and prior knowledge obtained from the Volunteered Geographic Information in OpenStreetMap. The two salient steps in this strategy are: (1) using directional mathematical morphology to enhance the contrast between roads and non-roads; (2) using OpenStreetMap roads as prior knowledge to segment the remotely sensed imagery. Experiments were conducted on two ZiYuan-3 images and one QuickBird high-resolution grayscale image to compare the proposed method with other commonly used techniques for road feature extraction. The results demonstrate the validity and superior performance of the proposed method for urban main road feature extraction. PMID:26397832
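
    A sketch of the directional mathematical morphology step with OpenCV: the grayscale image is opened with thin line-shaped structuring elements at several orientations and the per-pixel maximum response is kept, which enhances elongated bright structures such as roads. Kernel length and number of orientations are assumptions, and the OpenStreetMap-guided segmentation stage is not shown.

```python
import cv2
import numpy as np

def line_kernel(length, angle_deg):
    """Thin line-shaped structuring element rotated to the given orientation."""
    k = np.zeros((length, length), np.uint8)
    k[length // 2, :] = 1
    center = ((length - 1) / 2.0, (length - 1) / 2.0)
    M = cv2.getRotationMatrix2D(center, angle_deg, 1.0)
    return cv2.warpAffine(k, M, (length, length), flags=cv2.INTER_NEAREST)

def directional_opening(gray, length=31, n_angles=12):
    """Maximum of morphological openings over several orientations, which keeps
    elongated bright structures (road-like ribbons) and suppresses the rest."""
    responses = [cv2.morphologyEx(gray, cv2.MORPH_OPEN, line_kernel(length, a))
                 for a in np.linspace(0, 180, n_angles, endpoint=False)]
    return np.max(responses, axis=0)

# usage (the grayscale scene is assumed to be loaded elsewhere):
# enhanced = directional_opening(cv2.imread("scene.tif", cv2.IMREAD_GRAYSCALE))
```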

  1. A Fault Alarm and Diagnosis Method Based on Sensitive Parameters and Support Vector Machine

    NASA Astrophysics Data System (ADS)

    Zhang, Jinjie; Yao, Ziyun; Lv, Zhiquan; Zhu, Qunxiong; Xu, Fengtian; Jiang, Zhinong

    2015-08-01

    The extraction of fault features and the diagnostic techniques for reciprocating compressors are currently among the most active research topics in reciprocating machinery fault diagnosis. A large number of feature extraction and classification methods have been applied in this area, but practical fault alarm performance and diagnostic accuracy have not been effectively improved, so developing feature extraction and classification methods that meet the requirements of fault alarm and automatic diagnosis in practical engineering is an urgent task. This paper presents the typical mechanical faults of reciprocating compressors and uses data from an existing online monitoring system to extract 15 fault feature parameters in total. The sensitive relationships between the faults and the feature parameters are clarified using the distance evaluation technique, and the characteristic parameters sensitive to different faults are obtained. On this basis, a method based on fault feature parameters and a support vector machine (SVM) is developed and applied to practical fault diagnosis. Improved early fault warning capability is demonstrated by experiments and practical fault cases, and automatic classification of the fault alarm data with the SVM achieves good diagnostic accuracy.
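
    The sketch below pairs a simplified distance-evaluation criterion (ratio of between-class to within-class scatter per feature) with an SVM classifier on synthetic data; the criterion is only a plain stand-in for the paper's distance evaluation technique, and the dataset and parameter choices are assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def distance_evaluation(X, y):
    """Simplified criterion: between-class scatter over mean within-class scatter,
    computed per feature (larger values indicate more fault-sensitive parameters)."""
    classes = np.unique(y)
    class_means = np.array([X[y == c].mean(axis=0) for c in classes])
    between = class_means.var(axis=0)
    within = np.mean([X[y == c].var(axis=0) for c in classes], axis=0)
    return between / (within + 1e-12)

# synthetic stand-in for the 15 monitored feature parameters
X, y = make_classification(n_samples=300, n_features=15, n_informative=6, n_classes=3,
                           n_clusters_per_class=1, random_state=0)
top = np.argsort(distance_evaluation(X, y))[::-1][:6]     # keep the most sensitive ones
X_tr, X_te, y_tr, y_te = train_test_split(X[:, top], y, random_state=0)
clf = SVC(kernel="rbf", C=10).fit(X_tr, y_tr)
print("accuracy:", clf.score(X_te, y_te))
```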

  2. Detection of leachables and cytotoxicity after exposure to methacrylate- and epoxy-based root canal sealers in vitro.

    PubMed

    Lodienė, Greta; Kopperud, Hilde M; Ørstavik, Dag; Bruzell, Ellen M

    2013-10-01

    Root canal sealing materials may have toxic potential in vitro depending on the cell line, cytotoxicity assay, material chemistry, and degree of polymer curing. The aims of the present study were to detect leaching components from epoxy- or methacrylate-based root canal sealers and to investigate the degree of cytotoxicity after exposure to extracts from these materials. Qualitative determination of substances released from the materials was performed by gas- and liquid chromatography/mass spectrometry. Submandibular salivary gland acinar cell death (apoptosis/necrosis) was determined using a fluorescence staining/microscopy technique. The major leachable monomer from the epoxy-based material was bisphenol-A diglycidyl ether (BADGE), whereas leachables from the methacrylate-based materials were mainly triethylene glycol dimethacrylate (TEGDMA), urethane dimethacrylate (UDMA), hydroxyethyl methacrylate (HEMA), and polyethyleneglycol dimethacrylate (PEGDMA). Exposure to diluted extracts of cured methacrylate-based materials caused a postexposure time-dependent increase in cell death. This effect was not demonstrated as a result of exposure to undiluted extract of cured epoxy-based material. Extracts of all fresh materials induced apoptosis significantly, but at lower dilutions of the epoxy- than the methacrylate-based materials. The degree of leaching, determined from the relative chromatogram peak heights of eluates from the methacrylate-based sealer materials, corresponded with the degree of cell death induced by extracts of these materials. © 2013 Eur J Oral Sci.

  3. Auto-Context Convolutional Neural Network (Auto-Net) for Brain Extraction in Magnetic Resonance Imaging.

    PubMed

    Mohseni Salehi, Seyed Sadegh; Erdogmus, Deniz; Gholipour, Ali

    2017-11-01

    Brain extraction or whole brain segmentation is an important first step in many of the neuroimage analysis pipelines. The accuracy and the robustness of brain extraction, therefore, are crucial for the accuracy of the entire brain analysis process. The state-of-the-art brain extraction techniques rely heavily on the accuracy of alignment or registration between brain atlases and query brain anatomy, and/or make assumptions about the image geometry, and therefore have limited success when these assumptions do not hold or image registration fails. With the aim of designing an accurate, learning-based, geometry-independent, and registration-free brain extraction tool, in this paper, we present a technique based on an auto-context convolutional neural network (CNN), in which intrinsic local and global image features are learned through 2-D patches of different window sizes. We consider two different architectures: 1) a voxelwise approach based on three parallel 2-D convolutional pathways for three different directions (axial, coronal, and sagittal) that implicitly learn 3-D image information without the need for computationally expensive 3-D convolutions and 2) a fully convolutional network based on the U-net architecture. Posterior probability maps generated by the networks are used iteratively as context information along with the original image patches to learn the local shape and connectedness of the brain to extract it from non-brain tissue. The brain extraction results we have obtained from our CNNs are superior to the recently reported results in the literature on two publicly available benchmark data sets, namely, LPBA40 and OASIS, in which we obtained the Dice overlap coefficients of 97.73% and 97.62%, respectively. Significant improvement was achieved via our auto-context algorithm. Furthermore, we evaluated the performance of our algorithm in the challenging problem of extracting arbitrarily oriented fetal brains in reconstructed fetal brain magnetic resonance imaging (MRI) data sets. In this application, our voxelwise auto-context CNN performed much better than the other methods (Dice coefficient: 95.97%), where the other methods performed poorly due to the non-standard orientation and geometry of the fetal brain in MRI. Through training, our method can provide accurate brain extraction in challenging applications. This, in turn, may reduce the problems associated with image registration in segmentation tasks.
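
    A toy PyTorch sketch of the auto-context idea only: a small fully convolutional network takes the image plus the posterior map from the previous iteration as an extra channel, and the posterior is fed back for a few iterations. The architecture, sizes and random input are assumptions and bear no relation to the paper's actual networks.

```python
import torch
import torch.nn as nn

class TinyContextNet(nn.Module):
    """Toy fully convolutional network: image channels plus one posterior channel in,
    a brain-probability map out. A sketch of the auto-context loop only."""
    def __init__(self, image_channels=1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(image_channels + 1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 1), nn.Sigmoid(),
        )

    def forward(self, image, posterior):
        return self.net(torch.cat([image, posterior], dim=1))

model = TinyContextNet()
image = torch.randn(1, 1, 128, 128)              # one 2-D slice (placeholder data)
posterior = torch.full((1, 1, 128, 128), 0.5)    # flat prior at the first iteration
with torch.no_grad():
    for _ in range(3):                           # auto-context: feed the posterior back in
        posterior = model(image, posterior)
```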

  4. Data base management system configuration specification. [computer storage devices

    NASA Technical Reports Server (NTRS)

    Neiers, J. W.

    1979-01-01

    The functional requirements and the configuration of the data base management system are described. Techniques and technology which will enable more efficient and timely transfer of useful data from the sensor to the user, extraction of information by the user, and exchange of information among the users are demonstrated.

  5. Extraction of convective cloud parameters from Doppler Weather Radar MAX(Z) product using Image Processing Technique

    NASA Astrophysics Data System (ADS)

    Arunachalam, M. S.; Puli, Anil; Anuradha, B.

    2016-07-01

    In the present work, a technique for continuous extraction of convective cloud optical information and reflectivity (MAX(Z), in dBZ) from the Doppler Weather Radar (DWR) located at the Indian Meteorological Department, Chennai, has been developed in MATLAB, using online retrieval to produce time series data. Reflectivity measurements for locations within the DWR range, a circular disc of 250 km radius, can be retrieved using this technique. It gives both time series reflectivity for a point location and Range Time Intensity (RTI) maps of reflectivity for the corresponding location. The Graphical User Interface (GUI) developed for the cloud reflectivity is user friendly; it also provides convective cloud optical information such as cloud base height (CBH), cloud top height (CTH) and cloud optical depth (COD). This technique is also applicable to retrieving other DWR products such as Plan Position Indicator (Z, in dBZ), Plan Position Indicator (Z, in dBZ)-Close Range, Volume Velocity Processing (V, in knots), Plan Position Indicator (V, in m/s), Surface Rainfall Intensity (SRI, mm/hr), and 24-hr Precipitation Accumulation (PAC) at 0300 UTC. Keywords: reflectivity, cloud top height, cloud base, cloud optical depth

  6. Following the Social Media: Aspect Evolution of Online Discussion

    NASA Astrophysics Data System (ADS)

    Tang, Xuning; Yang, Christopher C.

    Due to the advances in Internet and Web 2.0 technologies, it is easy to extract thousands of threads about a topic of interest from an online forum, but it is nontrivial to capture the blueprint of the different aspects (i.e., subtopics or facets) associated with the topic. To better understand and analyze a forum discussion on a given topic, it is important to uncover the evolution relationships (temporal dependencies) between different topic aspects, i.e., how the discussion topic is evolving. Traditional Topic Detection and Tracking (TDT) techniques usually organize topics in a flat structure, which does not represent the evolution relationships between topic aspects. In addition, because forum messages are short and sparse, content-based TDT techniques struggle to identify evolution relationships. The contributions of this paper are twofold. We formally define a topic aspect evolution graph modeling framework, and we propose to utilize social network information, content similarity and temporal proximity to model evolution relationships between topic aspects. The experimental results show that, by incorporating social network information, our technique significantly outperforms the content-based technique in the task of extracting evolution relationships between topic aspects.
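
    A small sketch of how content similarity and temporal proximity might be combined into an evolution score between candidate aspects (TF-IDF cosine plus an exponential time decay), using scikit-learn; the example aspects, decay constant and weighting are assumptions, and the social-network evidence used in the paper is not included.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

aspects = ["battery life complaints for the new phone",
           "battery replacement program announced",
           "screen cracking issues reported by users"]
timestamps = np.array([0.0, 2.0, 10.0])            # days since the first aspect appeared

tfidf = TfidfVectorizer().fit_transform(aspects)
content_sim = cosine_similarity(tfidf)

tau = 3.0                                          # temporal decay constant (days)
time_prox = np.exp(-np.abs(timestamps[:, None] - timestamps[None, :]) / tau)

alpha = 0.6                                        # weight between content and time evidence
evolution_score = alpha * content_sim + (1 - alpha) * time_prox
# an edge i -> j (i earlier than j) is kept when evolution_score[i, j] exceeds a threshold
print(evolution_score)
```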

  7. Antioxidant activity of Cynara scolymus L. and Cynara cardunculus L. extracts obtained by different extraction techniques.

    PubMed

    Kollia, Eleni; Markaki, Panagiota; Zoumpoulakis, Panagiotis; Proestos, Charalampos

    2017-05-01

    Extracts of different parts (heads, bracts and stems) of Cynara cardunculus L. (cardoon) and Cynara scolymus L. (globe artichoke), obtained by two different extraction techniques (Ultrasound-Assisted Extraction (UAE) and classical extraction (CE)), were examined and compared for their total phenolic content (TPC) and their antioxidant activity. Moreover, infusions of the plants' parts were also analysed and compared to the aforementioned samples. Results showed that the cardoon heads extract obtained by Ultrasound-Assisted Extraction displayed the highest TPC value (1.57 mg Gallic Acid Equivalents (GAE) g-1 fresh weight (fw)), the highest DPPH • scavenging activity (IC50: 0.91 mg ml-1) and the highest ABTS •+ radical scavenging capacity (2.08 mg Trolox Equivalents (TE) g-1 fw) compared to the infusions and other extracts studied. Moreover, the Ultrasound-Assisted Extraction technique proved to be more appropriate and effective for the extraction of antiradical and phenolic compounds.

  8. Isolation of quercetin from the methanolic extract of Lagerstroemia speciosa by HPLC technique, its cytotoxicity against MCF-7 cells and photocatalytic activity.

    PubMed

    Sai Saraswathi, V; Saravanan, D; Santhakumar, K

    2017-06-01

    The flavonoids present in the leaves of Lagerstroemia speciosa were extracted, characterized by spectral methods, and studied for cytotoxic activity against MCF-7 cell lines and photocatalytic activity against an azo dye. Direct and sequential Soxhlet extraction was performed, and the concentrated crude extract was subjected to high performance liquid chromatography. The yield of the isolated compound (MEI-quercetin) from the leaves of L. speciosa was 1.8 g from the methanolic extract. Phytochemical analysis was carried out, and the Rf value of the isolated flavonoid was found to be 3.59. The isolated compound was characterized by infrared spectroscopy, NMR and mass spectrometry. Based on this characterization, the structure was elucidated as quercetin, a flavonoid. The isolated compound showed significant in vitro cytotoxic activity against MCF-7 cell lines at 500 μg/ml when compared to the crude extract. Among the various concentrations tested (25, 50, 100, 250, and 500 μg/ml), the effect on cell viability was pronounced at the higher concentrations and was also compared with that of the control. This is the first report that the isolated flavonoid shows photocatalytic activity against the azo dye methyl orange. The dye degradation was monitored by UV-Vis spectrophotometry. The isolated compound achieved 91.66% dye degradation, compared with 82.47% for the crude extract, at 160 min. The photocatalytic degradation of the MO dye under UV irradiation was thus investigated over the compound isolated from L. speciosa, and we expect that this photocatalytic technique may be used for wastewater treatment in the near future. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Comparison of solvent/derivatization agent systems for determination of extractable toluene diisocyanate from flexible polyurethane foam.

    PubMed

    Vangronsveld, Erik; Berckmans, Steven; Spence, Mark

    2013-06-01

    Flexible polyurethane foam (FPF) is produced from the reaction of toluene diisocyanate (TDI) and polyols. Limited and conflicting results exist in the literature concerning the presence of unreacted TDI remaining in FPF as determined by various solvent extraction and analysis techniques. This study reports investigations into the effect of several solvent/derivatization agent combinations on extractable TDI results and suggests a preferred method. The suggested preferred method employs a syringe-based multiple extraction of foam samples with a toluene solution of 1-(2-methoxyphenyl)-piperazine. Extracts are analyzed by liquid chromatography using an ion trap mass spectrometry detection technique. Detection limits of the method are ~10ng TDI g(-1) foam (10 ppb, w/w) for each TDI isomer (i.e. 2,4-TDI and 2,6-TDI). The method was evaluated by a three-laboratory interlaboratory comparison using two representative foam samples. The total extractable TDI results found by the three labs for the two foams were in good agreement (relative standard deviation of the mean of 30-40%). The method has utility as a basis for comparing FPFs, but the interpretation of extractable TDI results using any solvent as the true value for 'free' or 'unreacted' TDI in the foam is problematic, as demonstrated by the difference in the extracted TDI results from the different extraction systems studied. Further, a consideration of polyurethane foam chemistry raises the possibility that extractable TDI may result from decomposition of parts of the foam structure (e.g. dimers, biurets, and allophanates) by the extraction system.

  10. Techniques for noise removal and registration of TIMS data

    USGS Publications Warehouse

    Hummer-Miller, S.

    1990-01-01

    Extracting subtle differences from highly correlated thermal infrared aircraft data is possible with appropriate noise filters, constructed and applied in the spatial frequency domain. This paper discusses a heuristic approach to designing noise filters for removing high- and low-spatial frequency striping and banding. Techniques for registering thermal infrared aircraft data to a topographic base using Thematic Mapper data are presented. The noise removal and registration techniques are applied to TIMS thermal infrared aircraft data. -Author
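
    A minimal sketch of a spatial-frequency-domain notch filter for horizontal striping, in the spirit of the filters described here: the 2-D FFT band corresponding to purely horizontal stripes is zeroed while the lowest frequencies are preserved. The band widths and synthetic scene are assumptions, and a real filter would be tuned to the observed noise pattern.

```python
import numpy as np

def remove_horizontal_striping(img, notch_half_width=2, keep_low=4):
    """Zero the narrow band of the 2-D spectrum that carries purely horizontal
    stripes (near-zero horizontal frequency), while preserving the lowest frequencies."""
    F = np.fft.fftshift(np.fft.fft2(img.astype(float)))
    rows, cols = F.shape
    cr, cc = rows // 2, cols // 2
    mask = np.ones((rows, cols))
    mask[:, cc - notch_half_width:cc + notch_half_width + 1] = 0.0
    mask[cr - keep_low:cr + keep_low + 1,
         cc - notch_half_width:cc + notch_half_width + 1] = 1.0
    return np.real(np.fft.ifft2(np.fft.ifftshift(F * mask)))

# synthetic example: a smooth scene plus horizontal stripes
y, x = np.mgrid[0:256, 0:256]
scene = np.sin(x / 40.0) + 0.5 * np.sin(2 * np.pi * y / 8.0)   # second term is the striping
cleaned = remove_horizontal_striping(scene)
```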

  11. CARES: Completely Automated Robust Edge Snapper for carotid ultrasound IMT measurement on a multi-institutional database of 300 images: a two stage system combining an intensity-based feature approach with first order absolute moments

    NASA Astrophysics Data System (ADS)

    Molinari, Filippo; Acharya, Rajendra; Zeng, Guang; Suri, Jasjit S.

    2011-03-01

    The carotid intima-media thickness (IMT) is the most widely used marker for the progression of atherosclerosis and the onset of cardiovascular disease. Computer-aided measurements improve accuracy, but usually require user interaction. In this paper we characterize a new and completely automated technique for carotid segmentation and IMT measurement based on the merits of two previously developed techniques. We use an integrated approach of intelligent image feature extraction and line fitting for automatically locating the carotid artery in the image frame, followed by wall interface extraction based on a Gaussian edge operator. We call our system CARES. We validated CARES on a multi-institutional database of 300 carotid ultrasound images. IMT measurement bias was 0.032 +/- 0.141 mm, better than that of other automated techniques and comparable to that of user-driven methodologies. CARES processed 96% of the images, leading to a figure of merit of 95.7%. CARES ensures complete automation and high accuracy in IMT measurement; hence it could be a suitable clinical tool for processing large datasets in multicenter studies involving atherosclerosis.

  12. Accurate Learning with Few Atlases (ALFA): an algorithm for MRI neonatal brain extraction and comparison with 11 publicly available methods.

    PubMed

    Serag, Ahmed; Blesa, Manuel; Moore, Emma J; Pataky, Rozalia; Sparrow, Sarah A; Wilkinson, A G; Macnaught, Gillian; Semple, Scott I; Boardman, James P

    2016-03-24

    Accurate whole-brain segmentation, or brain extraction, of magnetic resonance imaging (MRI) is a critical first step in most neuroimage analysis pipelines. The majority of brain extraction algorithms have been developed and evaluated for adult data and their validity for neonatal brain extraction, which presents age-specific challenges for this task, has not been established. We developed a novel method for brain extraction of multi-modal neonatal brain MR images, named ALFA (Accurate Learning with Few Atlases). The method uses a new sparsity-based atlas selection strategy that requires a very limited number of atlases 'uniformly' distributed in the low-dimensional data space, combined with a machine learning based label fusion technique. The performance of the method for brain extraction from multi-modal data of 50 newborns is evaluated and compared with results obtained using eleven publicly available brain extraction methods. ALFA outperformed the eleven compared methods providing robust and accurate brain extraction results across different modalities. As ALFA can learn from partially labelled datasets, it can be used to segment large-scale datasets efficiently. ALFA could also be applied to other imaging modalities and other stages across the life course.

  13. FacetGist: Collective Extraction of Document Facets in Large Technical Corpora.

    PubMed

    Siddiqui, Tarique; Ren, Xiang; Parameswaran, Aditya; Han, Jiawei

    2016-10-01

    Given the large volume of technical documents available, it is crucial to automatically organize and categorize these documents to be able to understand and extract value from them. Towards this end, we introduce a new research problem called Facet Extraction. Given a collection of technical documents, the goal of Facet Extraction is to automatically label each document with a set of concepts for the key facets ( e.g. , application, technique, evaluation metrics, and dataset) that people may be interested in. Facet Extraction has numerous applications, including document summarization, literature search, patent search and business intelligence. The major challenge in performing Facet Extraction arises from multiple sources: concept extraction, concept to facet matching, and facet disambiguation. To tackle these challenges, we develop FacetGist, a framework for facet extraction. Facet Extraction involves constructing a graph-based heterogeneous network to capture information available across multiple local sentence-level features, as well as global context features. We then formulate a joint optimization problem, and propose an efficient algorithm for graph-based label propagation to estimate the facet of each concept mention. Experimental results on technical corpora from two domains demonstrate that Facet Extraction can lead to an improvement of over 25% in both precision and recall over competing schemes.
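
    Facet assignment via graph-based label propagation can be prototyped with scikit-learn's semi-supervised LabelSpreading, as in the sketch below on synthetic concept-mention features; the data, kernel and parameters are assumptions, and the heterogeneous network construction of FacetGist is not reproduced.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.semi_supervised import LabelSpreading

# synthetic concept-mention features with 4 facet labels, 90% of them unlabelled
X, y = make_classification(n_samples=400, n_features=20, n_informative=8,
                           n_classes=4, n_clusters_per_class=1, random_state=0)
rng = np.random.default_rng(0)
y_partial = y.copy()
y_partial[rng.random(len(y)) < 0.9] = -1          # -1 marks an unlabelled mention

model = LabelSpreading(kernel="knn", n_neighbors=7).fit(X, y_partial)
mask = y_partial == -1
print("accuracy on unlabelled mentions:", (model.transduction_[mask] == y[mask]).mean())
```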

  14. Rare Earth Extraction from NdFeB Magnet Using a Closed-Loop Acid Process.

    PubMed

    Kitagawa, Jiro; Uemura, Ryohei

    2017-08-14

    There is considerable interest in extraction of rare earth elements from NdFeB magnets to enable recycling of these elements. In practical extraction methods using wet processes, the acid waste solution discharge is a problem that must be resolved to reduce the environmental impact of the process. Here, we present an encouraging demonstration of rare earth element extraction from a NdFeB magnet using a closed-loop hydrochloric acid (HCl)-based process. The extraction method is based on corrosion of the magnet in a pretreatment stage and a subsequent ionic liquid technique for Fe extraction from the HCl solution. The rare earth elements are then precipitated using oxalic acid. Triple extraction has been conducted and the recovery ratio of the rare earth elements from the solution is approximately 50% for each extraction process, as compared to almost 100% recovery when using a one-shot extraction process without the ionic liquid but with sufficient oxalic acid. Despite its reduced extraction efficiency, the proposed method with its small number of procedures at almost room temperature is still highly advantageous in terms of both cost and environmental friendliness. This study represents an initial step towards realization of a closed-loop acid process for recycling of rare earth elements.

  15. FacetGist: Collective Extraction of Document Facets in Large Technical Corpora

    PubMed Central

    Siddiqui, Tarique; Ren, Xiang; Parameswaran, Aditya; Han, Jiawei

    2017-01-01

    Given the large volume of technical documents available, it is crucial to automatically organize and categorize these documents to be able to understand and extract value from them. Towards this end, we introduce a new research problem called Facet Extraction. Given a collection of technical documents, the goal of Facet Extraction is to automatically label each document with a set of concepts for the key facets (e.g., application, technique, evaluation metrics, and dataset) that people may be interested in. Facet Extraction has numerous applications, including document summarization, literature search, patent search and business intelligence. The major challenge in performing Facet Extraction arises from multiple sources: concept extraction, concept to facet matching, and facet disambiguation. To tackle these challenges, we develop FacetGist, a framework for facet extraction. Facet Extraction involves constructing a graph-based heterogeneous network to capture information available across multiple local sentence-level features, as well as global context features. We then formulate a joint optimization problem, and propose an efficient algorithm for graph-based label propagation to estimate the facet of each concept mention. Experimental results on technical corpora from two domains demonstrate that Facet Extraction can lead to an improvement of over 25% in both precision and recall over competing schemes. PMID:28210517

  16. The extraction of spot signal in Shack-Hartmann wavefront sensor based on sparse representation

    NASA Astrophysics Data System (ADS)

    Zhang, Yanyan; Xu, Wentao; Chen, Suting; Ge, Junxiang; Wan, Fayu

    2016-07-01

    Several techniques have been used with Shack-Hartmann wavefront sensors to determine the local wavefront gradient across each lenslet. However, the centroid error of a Shack-Hartmann wavefront sensor is relatively large because of the skylight background and detector noise. In this paper, we introduce a new method based on sparse representation to extract the target signal from the background and the noise. First, an over-complete dictionary of the spot signal is constructed based on a two-dimensional Gaussian model. Then the Shack-Hartmann image is divided into sub-blocks, and the coefficients of each block are computed in the over-complete dictionary. Since the coefficients of the noise and the target differ greatly, the target is extracted by applying a threshold to the coefficients. Experimental results show that the target can be extracted well and that the deviation, RMS and PV of the centroid are all smaller than with the threshold-subtraction method.
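
    A compact sketch of the sparse-representation idea: an over-complete dictionary of 2-D Gaussian atoms is built for a sub-aperture patch, the patch is sparsely coded with orthogonal matching pursuit, and small coefficients are discarded as noise/background. Patch size, spot width and the threshold are assumptions.

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

def gaussian_atom(size, cx, cy, sigma):
    """Flattened, unit-norm 2-D Gaussian centred at (cx, cy)."""
    y, x = np.mgrid[0:size, 0:size]
    g = np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * sigma ** 2))
    return (g / np.linalg.norm(g)).ravel()

size, sigma = 16, 1.5
D = np.stack([gaussian_atom(size, cx, cy, sigma)
              for cx in range(size) for cy in range(size)], axis=1)   # dictionary

# synthetic sub-aperture: one spot plus background offset and noise
rng = np.random.default_rng(0)
patch = (5 * gaussian_atom(size, 9.0, 6.0, sigma)
         + 0.2 + 0.05 * rng.standard_normal(size * size))

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=5).fit(D, patch)
coef = omp.coef_
keep = np.abs(coef) > 0.5 * np.abs(coef).max()        # drop small (noise) coefficients
spot = (D[:, keep] @ coef[keep]).reshape(size, size)  # reconstructed spot for centroiding
```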

  17. Recent Advances in On-Line Methods Based on Extraction for Speciation Analysis of Chromium in Environmental Matrices.

    PubMed

    Trzonkowska, Laura; Leśniewska, Barbara; Godlewska-Żyłkiewicz, Beata

    2016-07-03

    The biological activity of Cr(III) and Cr(VI) species, their chemical behavior, and their toxic effects are dissimilar. The speciation analysis of Cr(III) and Cr(VI) in environmental matrices is therefore of great importance, and much research has been devoted to this area. This review presents recent developments in the on-line speciation analysis of chromium in such samples. Flow systems have proved to be excellent tools for automation of sample pretreatment, separation/preconcentration of chromium species, and their detection by various instrumental techniques. The analytical strategies used in chromium speciation analysis discussed in this review are divided into categories based on selective extraction/separation of chromium species on solid sorbents and liquid-liquid extraction of chromium species. The most popular strategy is that based on solid-phase extraction; therefore, this review shows the potential of novel materials designed and used for selective binding of chromium species. The progress in miniaturization of measurement systems is also presented.

  18. LexValueSets: An Approach for Context-Driven Value Sets Extraction

    PubMed Central

    Pathak, Jyotishman; Jiang, Guoqian; Dwarkanath, Sridhar O.; Buntrock, James D.; Chute, Christopher G.

    2008-01-01

    The ability to model, share and re-use value sets across multiple medical information systems is an important requirement. However, generating value sets semi-automatically from a terminology service is still an unresolved issue, in part due to the lack of linkage to the clinical context patterns that provide the constraints for defining a concept domain and for invoking value set extraction. Towards this goal, we develop and evaluate an approach for context-driven automatic value set extraction based on a formal terminology model. The crux of the technique is to identify and define context patterns from various domains of discourse and leverage them for value set extraction using two complementary ideas based on (i) local terms provided by Subject Matter Experts (extensional) and (ii) the semantic definition of the concepts in coding schemes (intensional). A prototype was implemented based on SNOMED CT rendered in the LexGrid terminology model, and a preliminary evaluation is presented. PMID:18998955

  19. Assessment of Carbon- and Metal-Based Nanoparticle DNA Damage with Microfluidic Electrophoretic Separation Technology.

    PubMed

    Schrand, Amanda M; Powell, Thomas; Robertson, Tiffany; Hussain, Saber M

    2015-02-01

    In this study, we examined the feasibility of extracting DNA from whole cell lysates exposed to nanoparticles using two different methodologies for evaluation of fragmentation with microfluidic electrophoretic separation. Human lung macrophages were exposed to five different carbon- and metal-based nanoparticles at two different time points (2 h, 24 h) and two different doses (5 µg/ml, 100 µg/ml). The primary difference in the banding patterns after 2 h of nanoparticle exposure is more DNA fragmentation at the higher NP concentration when examining cells exposed to nanoparticles of the same composition. However, higher doses of carbon and silver nanoparticles at both short and long dosing periods can contribute to erroneous or incomplete data with this technique. Also comparing DNA isolation methodologies, we recommend the centrifugation extraction technique, which provides more consistent banding patterns in the control samples compared to the spooling technique. Here we demonstrate that multi-walled carbon nanotubes, 15 nm silver nanoparticles and the positive control cadmium oxide cause similar DNA fragmentation at the short time point of 2 h with the centrifugation extraction technique. Therefore, the results of these studies contribute to elucidating the relationship between nanoparticle physicochemical properties and DNA fragmentation results while providing the pros and cons of altering the DNA isolation methodology. Overall, this technique provides a high throughput way to analyze subcellular alterations in DNA profiles of cells exposed to nanomaterials to aid in understanding the consequences of exposure and mechanistic effects. Future studies in microfluidic electrophoretic separation technologies should be investigated to determine the utility of protein or other assays applicable to cellular systems exposed to nanoparticles.

  20. Dry Socket Etiology, Diagnosis, and Clinical Treatment Techniques.

    PubMed

    Mamoun, John

    2018-04-01

    Dry socket, also termed fibrinolytic osteitis or alveolar osteitis, is a complication of tooth exodontia. A dry socket lesion is a post-extraction socket that exhibits exposed bone that is not covered by a blood clot or healing epithelium and exists inside or around the perimeter of the socket or alveolus for days after the extraction procedure. This article describes dry socket lesions; reviews the basic clinical techniques of treating different manifestations of dry socket lesions; and shows how microscope level loupe magnification of 6× to 8× or greater, combined with co-axial illumination or a dental operating microscope, facilitate more precise treatment of dry socket lesions. The author examines the scientific validity of the proposed causes of dry socket lesions (such as bacteria, inflammation, fibrinolysis, or traumatic extractions) and the scientific validity of different terminologies used to describe dry socket lesions. This article also presents an alternative model of what causes dry socket lesions, based on evidence from dental literature. Although the clinical techniques for treating dry socket lesions seem empirically correct, more evidence is required to determine the causes of dry socket lesions.

  1. Dry Socket Etiology, Diagnosis, and Clinical Treatment Techniques

    PubMed Central

    2018-01-01

    Dry socket, also termed fibrinolytic osteitis or alveolar osteitis, is a complication of tooth exodontia. A dry socket lesion is a post-extraction socket that exhibits exposed bone that is not covered by a blood clot or healing epithelium and exists inside or around the perimeter of the socket or alveolus for days after the extraction procedure. This article describes dry socket lesions; reviews the basic clinical techniques of treating different manifestations of dry socket lesions; and shows how microscope level loupe magnification of 6× to 8× or greater, combined with co-axial illumination or a dental operating microscope, facilitate more precise treatment of dry socket lesions. The author examines the scientific validity of the proposed causes of dry socket lesions (such as bacteria, inflammation, fibrinolysis, or traumatic extractions) and the scientific validity of different terminologies used to describe dry socket lesions. This article also presents an alternative model of what causes dry socket lesions, based on evidence from dental literature. Although the clinical techniques for treating dry socket lesions seem empirically correct, more evidence is required to determine the causes of dry socket lesions. PMID:29732309

  2. Enriching a document collection by integrating information extraction and PDF annotation

    NASA Astrophysics Data System (ADS)

    Powley, Brett; Dale, Robert; Anisimoff, Ilya

    2009-01-01

    Modern digital libraries offer all the hyperlinking possibilities of the World Wide Web: when a reader finds a citation of interest, in many cases she can now click on a link to be taken to the cited work. This paper presents work aimed at providing the same ease of navigation for legacy PDF document collections that were created before the possibility of integrating hyperlinks into documents was ever considered. To achieve our goal, we need to carry out two tasks: first, we need to identify and link citations and references in the text with high reliability; and second, we need the ability to determine physical PDF page locations for these elements. We demonstrate the use of a high-accuracy citation extraction algorithm which significantly improves on earlier reported techniques, and a technique for integrating PDF processing with a conventional text-stream based information extraction pipeline. We demonstrate these techniques in the context of a particular document collection, this being the ACL Anthology; but the same approach can be applied to other document sets.
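
    As a much simplified illustration of the citation-identification step, the regular expression below finds parenthetical author-year citations in plain text extracted from a PDF; it is only a toy pattern under assumed citation formatting, while real citation extraction, as in the paper, handles many more citation styles, multi-citation groups and reference matching.

```python
import re

# rough pattern for parenthetical author-year citations, e.g. "(Smith et al., 2005)"
CITATION = re.compile(
    r"\(([A-Z][A-Za-z\-]+(?: et al\.| and [A-Z][A-Za-z\-]+)?),\s*(\d{4}[a-z]?)\)")

def find_citations(text):
    """Return (author string, year, character span) for each candidate citation."""
    return [(m.group(1), m.group(2), m.span()) for m in CITATION.finditer(text)]

print(find_citations("As noted previously (Smith et al., 2005), anchoring links requires "
                     "locating the citation on the PDF page."))
```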

  3. Depth estimation of features in video frames with improved feature matching technique using Kinect sensor

    NASA Astrophysics Data System (ADS)

    Sharma, Kajal; Moon, Inkyu; Kim, Sung Gaun

    2012-10-01

    Estimating depth has long been a major issue in the fields of computer vision and robotics. The Kinect sensor's active sensing strategy provides high-frame-rate depth maps and can recognize user gestures and human pose. This paper presents a technique to estimate the depth of features extracted from video frames, along with an improved feature-matching method. In this paper, we used the Kinect camera developed by Microsoft, which captures color and depth images for further processing. Feature detection and selection is an important task for robot navigation. Many feature-matching techniques have been proposed previously; this paper proposes improved feature matching between successive video frames using a neural network methodology in order to reduce the computation time of feature matching. The extracted features are invariant to image scale and rotation, and different experiments were conducted to evaluate the performance of feature matching between successive video frames. The extracted features are assigned depth values based on the Kinect data, which the robot can use to determine its navigation path and for obstacle detection applications.
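
    A rough OpenCV sketch of matching features between two successive frames and attaching each matched keypoint a depth value from the Kinect depth map; ORB is used here instead of the paper's scale/rotation-invariant features and neural-network matcher, and the frame and depth variables are assumed to be already loaded and aligned.

```python
import cv2
import numpy as np

def matched_feature_depths(frame1, frame2, depth1, n_best=50):
    """Match ORB features between two consecutive frames and attach the Kinect
    depth (from the depth map aligned with frame1) to each matched keypoint."""
    orb = cv2.ORB_create(1000)
    kp1, des1 = orb.detectAndCompute(frame1, None)
    kp2, des2 = orb.detectAndCompute(frame2, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:n_best]

    results = []
    for m in matches:
        u, v = map(int, kp1[m.queryIdx].pt)           # pixel location in frame1
        results.append((kp1[m.queryIdx].pt, kp2[m.trainIdx].pt, float(depth1[v, u])))
    return results                                     # (pt1, pt2, depth) triples

# usage (frames and the aligned depth map are assumed to be loaded elsewhere):
# triples = matched_feature_depths(gray1, gray2, depth1)
```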

  4. Magnetic headspace adsorptive extraction of chlorobenzenes prior to thermal desorption gas chromatography-mass spectrometry.

    PubMed

    Vidal, Lorena; Ahmadi, Mazaher; Fernández, Elena; Madrakian, Tayyebeh; Canals, Antonio

    2017-06-08

    This study presents a new, user-friendly, cost-effective and portable headspace solid-phase extraction technique based on graphene oxide decorated with iron oxide magnetic nanoparticles as sorbent, located on one end of a small neodymium magnet. Hence, the new headspace solid-phase extraction technique has been called Magnetic Headspace Adsorptive Extraction (Mag-HSAE). In order to assess Mag-HSAE technique applicability to model analytes, some chlorobenzenes were extracted from water samples prior to gas chromatography-mass spectrometry determination. A multivariate approach was employed to optimize the experimental parameters affecting Mag-HSAE. The method was evaluated under optimized extraction conditions (i.e., sample volume, 20 mL; extraction time, 30 min; sorbent amount, 10 mg; stirring speed, 1500 rpm, and ionic strength, non-significant), obtaining a linear response from 0.5 to 100 ng L-1 for 1,3-DCB, 1,4-DCB, 1,2-DCB, 1,3,5-TCB, 1,2,4-TCB and 1,2,3-TCB; from 0.5 to 75 ng L-1 for 1,2,4,5-TeCB, and PeCB; and from 1 to 75 ng L-1 for 1,2,3,4-TeCB. The repeatability of the proposed method was evaluated at 10 ng L-1 and 50 ng L-1 spiking levels, and coefficients of variation ranged between 1.5 and 9.5% (n = 5). Limits of detection values were found between 93 and 301 pg L-1. Finally, tap, mineral and effluent water were selected as real water samples to assess method applicability. Relative recoveries varied between 86 and 110% showing negligible matrix effects. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. A Novel Feature Extraction Method with Feature Selection to Identify Golgi-Resident Protein Types from Imbalanced Data

    PubMed Central

    Yang, Runtao; Zhang, Chengjin; Gao, Rui; Zhang, Lina

    2016-01-01

    The Golgi Apparatus (GA) is a major collection and dispatch station for numerous proteins destined for secretion, plasma membranes and lysosomes. The dysfunction of GA proteins can result in neurodegenerative diseases. Therefore, accurate identification of protein subGolgi localizations may assist in drug development and understanding the mechanisms of the GA involved in various cellular processes. In this paper, a new computational method is proposed for identifying cis-Golgi proteins from trans-Golgi proteins. Based on the concept of Common Spatial Patterns (CSP), a novel feature extraction technique is developed to extract evolutionary information from protein sequences. To deal with the imbalanced benchmark dataset, the Synthetic Minority Over-sampling Technique (SMOTE) is adopted. A feature selection method called Random Forest-Recursive Feature Elimination (RF-RFE) is employed to search the optimal features from the CSP based features and g-gap dipeptide composition. Based on the optimal features, a Random Forest (RF) module is used to distinguish cis-Golgi proteins from trans-Golgi proteins. Through the jackknife cross-validation, the proposed method achieves a promising performance with a sensitivity of 0.889, a specificity of 0.880, an accuracy of 0.885, and a Matthew’s Correlation Coefficient (MCC) of 0.765, which remarkably outperforms previous methods. Moreover, when tested on a common independent dataset, our method also achieves a significantly improved performance. These results highlight the promising performance of the proposed method to identify Golgi-resident protein types. Furthermore, the CSP based feature extraction method may provide guidelines for protein function predictions. PMID:26861308
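
    A sketch of the imbalance handling, feature selection and classification stages using imbalanced-learn and scikit-learn on synthetic data; the CSP-based feature extraction itself is not shown, and the dataset and all parameter choices are assumptions.

```python
import numpy as np
from imblearn.over_sampling import SMOTE
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE
from sklearn.model_selection import train_test_split

# imbalanced two-class data standing in for cis- vs trans-Golgi feature vectors
X, y = make_classification(n_samples=400, n_features=50, n_informative=10,
                           weights=[0.8, 0.2], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

X_bal, y_bal = SMOTE(random_state=0).fit_resample(X_tr, y_tr)        # oversample minority class

selector = RFE(RandomForestClassifier(n_estimators=200, random_state=0),
               n_features_to_select=15).fit(X_bal, y_bal)            # RF-RFE feature selection

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(selector.transform(X_bal), y_bal)
print("accuracy:", clf.score(selector.transform(X_te), y_te))
```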

  6. Can cloud point-based enrichment, preservation, and detection methods help to bridge gaps in aquatic nanometrology?

    PubMed

    Duester, Lars; Fabricius, Anne-Lena; Jakobtorweihen, Sven; Philippe, Allan; Weigl, Florian; Wimmer, Andreas; Schuster, Michael; Nazar, Muhammad Faizan

    2016-11-01

    Coacervate-based techniques are intensively used in environmental analytical chemistry to enrich and extract different kinds of analytes. Most methods focus on the total content or the speciation of inorganic and organic substances. Size fractionation is less commonly addressed. Within coacervate-based techniques, cloud point extraction (CPE) is characterized by a phase separation of non-ionic surfactants dispersed in an aqueous solution when the respective cloud point temperature is exceeded. In this context, the feature article raises the following question: May CPE in future studies serve as a key tool (i) to enrich and extract nanoparticles (NPs) from complex environmental matrices prior to analyses and (ii) to preserve the colloidal status of unstable environmental samples? With respect to engineered NPs, a significant gap between environmental concentrations and size- and element-specific analytical capabilities is still visible. CPE may support efforts to overcome this "concentration gap" via the analyte enrichment. In addition, most environmental colloidal systems are known to be unstable, dynamic, and sensitive to changes of the environmental conditions during sampling and sample preparation. This delivers a so far unsolved "sample preparation dilemma" in the analytical process. The authors are of the opinion that CPE-based methods have the potential to preserve the colloidal status of these instable samples. Focusing on NPs, this feature article aims to support the discussion on the creation of a convention called the "CPE extractable fraction" by connecting current knowledge on CPE mechanisms and on available applications, via the uncertainties visible and modeling approaches available, with potential future benefits from CPE protocols.

  7. A new technique for collection, concentration and determination of gaseous tropospheric formaldehyde

    NASA Astrophysics Data System (ADS)

    Cofer, Wesley R.; Edahl, Robert A.

    This article describes an improved technique for making in situ measurements of gaseous tropospheric formaldehyde (CH2O). The new technique is based on nebulization/reflux principles that have proved very effective in quantitatively scrubbing water soluble trace gases (e.g. CH2O) into aqueous mediums, which are subsequently analyzed. Atmospheric formaldehyde extractions and analyses have been performed with the nebulization/reflux concentrator using an acidified dinitrophenylhydrazine solution that indicate that quantitative analysis of CH2O at global background levels (~0.1 ppbv) is feasible with 20-min extractions. Analysis of CH2O, once concentrated, is accomplished using high performance liquid chromatography (HPLC) with ultraviolet photometric detection. The CH2O-hydrazone derivative, produced by the reaction of 2,4-dinitrophenylhydrazine in H2SO4 acidified aqueous solution, is detected as CH2O.

  8. A new technique for collection, concentration and determination of gaseous tropospheric formaldehyde

    NASA Technical Reports Server (NTRS)

    Cofer, W. R., III; Edahl, R. A., Jr.

    1986-01-01

    This article describes an improved technique for making in situ measurements of gaseous tropospheric formaldehyde (CH2O). The new technique is based on nebulization/reflux principles that have proved very effective in quantitatively scrubbing water soluble trace gases (e.g., CH2O) into aqueous mediums, which are subsequently analyzed. Atmospheric formaldehyde extractions and analyses have been performed with the nebulization/reflux concentrator using an acidified dinitrophenylhydrazine solution that indicate that quantitative analysis of CH2O at global background levels (about 0.1 ppbv) is feasible with 20-min extractions. Analysis of CH2O, once concentrated, is accomplished using high performance liquid chromatography with ultraviolet photometric detection. The CH2O-hydrazone derivative, produced by the reaction of 2,4-dinitrophenylhydrazine in H2SO4 acidified aqueous solution, is detected as CH2O.

  9. Antifungal activity of extracts from Piper aduncum leaves prepared by different solvents and extraction techniques against dermatophytes Trichophyton rubrum and Trichophyton interdigitale.

    PubMed

    Santos, Maximillan Leite; Magalhães, Chaiana Froés; da Rosa, Marcelo Barcellos; de Assis Santos, Daniel; Brasileiro, Beatriz Gonçalves; de Carvalho, Leandro Machado; da Silva, Marcelo Barreto; Zani, Carlos Leomar; de Siqueira, Ezequias Pessoa; Peres, Rodrigo Loreto; Andrade, Anderson Assunção

    2013-12-01

    The effects of different solvents and extraction techniques upon the phytochemical profile and anti-Trichophyton activity of extracts from Piper aduncum leaves were evaluated. The extract prepared by maceration with ethanol showed a higher content of sesquiterpenes and greater antifungal activity. This extract may be useful as an alternative treatment for dermatophytosis.

  10. Antifungal activity of extracts from Piper aduncum leaves prepared by different solvents and extraction techniques against dermatophytes Trichophyton rubrum and Trichophyton interdigitale

    PubMed Central

    Santos, Maximillan Leite; Magalhães, Chaiana Froés; da Rosa, Marcelo Barcellos; de Assis Santos, Daniel; Brasileiro, Beatriz Gonçalves; de Carvalho, Leandro Machado; da Silva, Marcelo Barreto; Zani, Carlos Leomar; de Siqueira, Ezequias Pessoa; Peres, Rodrigo Loreto; Andrade, Anderson Assunção

    2013-01-01

    The effects of different solvents and extraction techniques upon the phytochemical profile and anti-Trichophyton activity of extracts from Piper aduncum leaves were evaluated. The extract prepared by maceration with ethanol showed a higher content of sesquiterpenes and greater antifungal activity. This extract may be useful as an alternative treatment for dermatophytosis. PMID:24688522

  11. Robust watermark technique using masking and Hermite transform.

    PubMed

    Coronel, Sandra L Gomez; Ramírez, Boris Escalante; Mosqueda, Marco A Acevedo

    2016-01-01

    The following paper evaluates a watermark algorithm designed for digital images by using a perceptive mask and a normalization process, thus preventing detection by the human eye as well as ensuring robustness against common processing and geometric attacks. The Hermite transform is employed because it allows a perfect reconstruction of the image while incorporating human visual system properties; moreover, it is based on derivatives of Gaussian functions. The applied watermark represents information about the proprietor of the digital image. The extraction process is blind, because it does not require the original image. The following metrics were utilized in the evaluation of the algorithm: peak signal-to-noise ratio, the structural similarity index average, the normalized cross-correlation, and bit error rate. Several watermark extraction tests were performed against geometric and common processing attacks, which allowed us to identify how many bits of the watermark can be modified while still permitting adequate extraction.
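
    The robustness metrics listed above have simple closed forms. As a rough illustration (not the authors' implementation), the sketch below computes the peak signal-to-noise ratio and bit error rate for hypothetical 8-bit grayscale images and watermark bit strings:

    ```python
    import numpy as np

    def psnr(original, processed, peak=255.0):
        """Peak signal-to-noise ratio between two 8-bit grayscale images."""
        mse = np.mean((original.astype(float) - processed.astype(float)) ** 2)
        return np.inf if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

    def bit_error_rate(embedded_bits, extracted_bits):
        """Fraction of watermark bits that changed after an attack."""
        embedded_bits = np.asarray(embedded_bits, dtype=int)
        extracted_bits = np.asarray(extracted_bits, dtype=int)
        return np.mean(embedded_bits != extracted_bits)

    # Hypothetical example: a clean cover image vs. a lightly distorted copy.
    rng = np.random.default_rng(0)
    cover = rng.integers(0, 256, size=(256, 256))
    attacked = np.clip(cover + rng.normal(0, 2, size=cover.shape), 0, 255)
    print(f"PSNR: {psnr(cover, attacked):.1f} dB")
    print("BER :", bit_error_rate([1, 0, 1, 1], [1, 0, 0, 1]))
    ```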

  12. Intelligent Traffic Quantification System

    NASA Astrophysics Data System (ADS)

    Mohanty, Anita; Bhanja, Urmila; Mahapatra, Sudipta

    2017-08-01

    Currently, city traffic monitoring and control is a major issue in almost all cities worldwide. The vehicular ad hoc network (VANET) technique is an efficient tool to minimize this problem. Usually, different types of on-board sensors are installed in vehicles to generate messages characterized by different vehicle parameters. In this work, an intelligent system based on a fuzzy clustering technique is developed to reduce the number of individual messages by extracting important features from the messages of a vehicle. The proposed fuzzy clustering technique therefore reduces the traffic load of the network, and it also quantifies and reduces congestion.
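
    The abstract does not give the clustering details, so the following is only a generic fuzzy c-means sketch that could group vehicle-message feature vectors into a few traffic states; the feature choice (speed, headway) and all parameter values are illustrative assumptions:

    ```python
    import numpy as np

    def fuzzy_c_means(X, n_clusters=3, m=2.0, n_iter=100, seed=0):
        """Standard fuzzy c-means: returns cluster centers and membership matrix U."""
        rng = np.random.default_rng(seed)
        U = rng.random((X.shape[0], n_clusters))
        U /= U.sum(axis=1, keepdims=True)            # memberships sum to 1 per sample
        for _ in range(n_iter):
            Um = U ** m
            centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
            dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
            U = 1.0 / (dist ** (2 / (m - 1)))
            U /= U.sum(axis=1, keepdims=True)
        return centers, U

    # Hypothetical message features: [speed (km/h), headway (m)] per vehicle report.
    X = np.array([[12, 5], [10, 4], [55, 40], [60, 45], [30, 20], [32, 22]], float)
    centers, U = fuzzy_c_means(X, n_clusters=3)
    print("cluster centers:\n", centers)
    print("dominant cluster per message:", U.argmax(axis=1))
    ```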

  13. A comparison of essential oils obtained from lavandin via different extraction processes: Ultrasound, microwave, turbohydrodistillation, steam and hydrodistillation.

    PubMed

    Périno-Issartier, Sandrine; Ginies, Christian; Cravotto, Giancarlo; Chemat, Farid

    2013-08-30

    A total of eight extraction techniques, ranging from conventional methods (hydrodistillation (HD), steam distillation (SD), and turbohydrodistillation (THD)) through innovative techniques (ultrasound-assisted extraction (US-SD)) to microwave-assisted extraction techniques (in situ microwave-generated hydrodistillation (ISMH), microwave steam distillation (MSD), microwave hydrodiffusion and gravity (MHG), and microwave steam diffusion (MSDf)), were used to extract essential oil from lavandin flowers, and their results were compared. Extraction time, yield, essential oil composition and sensory analysis were considered as the principal terms of comparison. The essential oils extracted using the more innovative processes were quantitatively (yield) and qualitatively (aromatic profile) similar to those obtained by the conventional techniques. The best results were obtained with the microwave hydrodiffusion and gravity (MHG) method, which reduced the extraction time (30 min against 220 min for SD) with no difference in essential oil yield or sensory perception. Copyright © 2013 Elsevier B.V. All rights reserved.

  14. Object-oriented feature extraction approach for mapping supraglacial debris in Schirmacher Oasis using very high-resolution satellite data

    NASA Astrophysics Data System (ADS)

    Jawak, Shridhar D.; Jadhav, Ajay; Luis, Alvarinho J.

    2016-05-01

    Supraglacial debris was mapped in the Schirmacher Oasis, east Antarctica, by using WorldView-2 (WV-2) high-resolution optical remote sensing data consisting of 8-band calibrated, Gram-Schmidt (GS)-sharpened and atmospherically corrected WV-2 imagery. This study is a preliminary attempt to develop an object-oriented rule set to extract supraglacial debris for the Antarctic region using 8-spectral-band imagery. Supraglacial debris was manually digitized from the satellite imagery to generate the ground reference data. Several trials were performed using a few existing traditional pixel-based classification techniques and color-texture-based object-oriented classification methods to extract supraglacial debris over a small domain of the study area. Multi-level segmentation and attributes such as scale, shape, size, and compactness, along with spectral information from the data, were used for developing the rule set. A quantitative error analysis was carried out against the manually digitized reference data to test the practicability of our approach relative to the traditional pixel-based methods. Our results indicate that the OBIA-based approach (overall accuracy: 93%) for extracting supraglacial debris performed better than all the traditional pixel-based methods (overall accuracy: 80-85%). The present attempt provides an improved, comprehensive method for semiautomatic feature extraction in supraglacial environments and a new direction in cryospheric research.

  15. Vision-Based Finger Detection, Tracking, and Event Identification Techniques for Multi-Touch Sensing and Display Systems

    PubMed Central

    Chen, Yen-Lin; Liang, Wen-Yew; Chiang, Chuan-Yen; Hsieh, Tung-Ju; Lee, Da-Cheng; Yuan, Shyan-Ming; Chang, Yang-Lang

    2011-01-01

    This study presents efficient vision-based finger detection, tracking, and event identification techniques and a low-cost hardware framework for multi-touch sensing and display applications. The proposed approach uses a fast bright-blob segmentation process based on automatic multilevel histogram thresholding to extract the pixels of touch blobs obtained from scattered infrared lights captured by a video camera. The advantage of this automatic multilevel thresholding approach is its robustness and adaptability when dealing with various ambient lighting conditions and spurious infrared noise. To extract the connected components of these touch blobs, a connected-component analysis procedure is applied to the bright pixels acquired by the previous stage. After extracting the touch blobs from each of the captured image frames, a blob tracking and event recognition process analyzes the spatial and temporal information of these touch blobs from consecutive frames to determine the possible touch events and actions performed by users. This process also refines the detection results and corrects for errors and occlusions caused by noise during the blob extraction process. The proposed blob tracking and touch event recognition process includes two phases. First, the blob tracking phase associates the motion correspondence of blobs in succeeding frames by analyzing their spatial and temporal features. The touch event recognition process can then identify meaningful touch events based on the motion information of touch blobs, such as finger moving, rotating, pressing, hovering, and clicking actions. Experimental results demonstrate that the proposed vision-based finger detection, tracking, and event identification system is feasible and effective for multi-touch sensing applications in various operational environments and conditions. PMID:22163990
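
    A minimal sketch of the bright-blob segmentation and connected-component steps described above, using Otsu thresholding from scikit-image as a stand-in for the paper's automatic multilevel histogram thresholding; the synthetic infrared frame and the area threshold are hypothetical:

    ```python
    import numpy as np
    from skimage.filters import threshold_otsu
    from skimage.measure import label, regionprops

    # Hypothetical infrared frame with two bright touch blobs on a dark background.
    frame = np.zeros((120, 160))
    frame[30:40, 40:50] = 200    # blob 1
    frame[80:95, 100:115] = 180  # blob 2
    frame += np.random.default_rng(1).normal(0, 5, frame.shape)

    # 1) segment bright pixels (Otsu stands in for the multilevel histogram threshold)
    mask = frame > threshold_otsu(frame)

    # 2) connected-component analysis: one label per candidate touch blob
    labels = label(mask)
    for blob in regionprops(labels):
        if blob.area < 20:          # reject spurious infrared noise
            continue
        cy, cx = blob.centroid
        print(f"touch blob at ({cx:.1f}, {cy:.1f}), area {blob.area} px")
    ```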

  16. Comparison of methods for extracting kafirin proteins from sorghum distillers dried grains with solubles.

    PubMed

    Wang, Ying; Tilley, Michael; Bean, Scott; Sun, X Susan; Wang, Donghai

    2009-09-23

    Use of coproducts generated during fermentation is important to the overall economics of biofuel production. The main coproduct from grain-based ethanol production is distillers dried grains with solubles (DDGS). High in protein, DDGS is a potential source of protein for many bioindustrial applications such as adhesives and resins. The objective of this research was to characterize the composition as well as chemical and physical properties of kafirin proteins from sorghum DDGS with various extraction methods including use of acetic acid, HCl-ethanol and NaOH-ethanol under reducing conditions. Extraction conditions affected purity and thermal properties of the extracted kafirin proteins. Extraction yields of 44.2, 24.2, and 56.8% were achieved by using acetic acid, HCl-ethanol and NaOH-ethanol, respectively. Acetic acid and NaOH-ethanol produced protein with higher purity than kafirins extracted with the HCl-ethanol protocol. The acetic acid extraction protocol produced protein with the highest purity, 98.9%. Several techniques were used to evaluate structural, molecular and thermal properties of kafirin extracts. FTIR showed alpha-helix dominated in all three samples, with only a small portion of beta-sheet present. Electrophoresis results showed alpha-1 and alpha-2 bands and beta-kafirins were present in all three extracts. Glass transition peaks of the extracts were shown by DSC to be approximately 230 degrees C. Kafirin degraded at 270-290 degrees C. Size exclusion chromatography revealed that the acetic acid and HCl-ethanol based extraction methods tended to extract more high molecular weight protein than the NaOH-ethanol based method. Reversed phase high-performance liquid chromatography showed that the gamma kafirins were found only in extracts from the NaOH-ethanol extraction method.

  17. Bio-dispersive liquid liquid microextraction based on nano rhamnolipid aggregates combined with magnetic solid phase extraction using Fe3O4@PPy magnetic nanoparticles for the determination of methamphetamine in human urine.

    PubMed

    Haeri, Seyed Ammar; Abbasi, Shahryar; Sajjadifar, Sami

    2017-09-15

    In the present investigation, extraction and preconcentration of methamphetamine in human urine samples was carried out using a novel bio-dispersive liquid liquid microextraction (Bio-DLLME) technique coupled with magnetic solid phase extraction (MSPE). Bio-DLLME is a kind of microextraction technique based on nano-materials, which have potential capabilities in many application fields. Bio-DLLME is based on the use of a binary system consisting of methanol and a nano rhamnolipid biosurfactant. Use of this binary mixture is ecologically acceptable due to its specificity, biocompatibility and biodegradable nature. The potential of nano rhamnolipid biosurfactants as biological agents in the extraction of organic compounds has been investigated in recent years. They are able to partition at oil/water interfaces and reduce the interfacial tension in order to increase the solubility of hydrocarbons. The properties of the prepared Fe3O4@PPy magnetic nanoparticles were characterized using Fourier transform infrared spectroscopy and X-ray diffraction methods. The influences of the experimental parameters on the quantitative recovery of the analyte were investigated. Under optimized conditions, the enrichment factor was 310 and the calibration graph was linear in the methamphetamine concentration range from 1 to 60 μg L-1, with a correlation coefficient of 0.9998. The relative standard deviation for six replicate measurements was 5.2%. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. New developments of a knowledge based system (VEG) for inferring vegetation characteristics

    NASA Technical Reports Server (NTRS)

    Kimes, D. S.; Harrison, P. A.; Harrison, P. R.

    1992-01-01

    An extraction technique for inferring physical and biological surface properties of vegetation using nadir and/or directional reflectance data as input has been developed. A knowledge-based system (VEG) accepts spectral data of an unknown target as input, determines the best strategy for inferring the desired vegetation characteristic, applies the strategy to the target data, and provides a rigorous estimate of the accuracy of the inference. Progress in developing the system is presented. VEG combines methods from remote sensing and artificial intelligence, and integrates input spectral measurements with diverse knowledge bases. VEG has been developed to (1) infer spectral hemispherical reflectance from any combination of nadir and/or off-nadir view angles; (2) test and develop new extraction techniques on an internal spectral database; (3) browse, plot, or analyze directional reflectance data in the system's spectral database; (4) discriminate between user-defined vegetation classes using spectral and directional reflectance relationships; and (5) infer unknown view angles from known view angles (known as view angle extension).

  19. [The determination of molecular sulphur in Matsesta mineral water and its analog Novonukutskaya mineral water].

    PubMed

    Khutorianskiĭ, V A; Smirnov, A I; Matveev, D A

    2014-01-01

    The method of microcolumn reversed phase high performance liquid chromatography (rp-HPLC) was employed to determine the content of elemental sulphur in mineral waters. The study involved the analysis of samples of the sulphide-containing mineral waters Novonukutskaya and Matsesta prepared by the solid phase extraction technique. Based on these data, the authors discuss the origin and the circulation of sulphur in the hydrogen sulphide sources. The elution conditions selected in this study ensured the high-resolution separation of the octasulphur peak from the peaks of allotropic components of the extract, whereas the dual-wavelength detection technique made it possible to identify the peaks of molecular sulphur.

  20. Comparison on extraction yield of sennoside A and sennoside B from senna (Cassia angustifolia) using conventional and non conventional extraction techniques and their quantification using a validated HPLC-PDA detection method.

    PubMed

    Dhanani, Tushar; Singh, Raghuraj; Reddy, Nagaraja; Trivedi, A; Kumar, Satyanshu

    2017-05-01

    Senna is an important medicinal plant and is used in many Ayurvedic formulations. Dianthraquinone glucosides are the main bioactive phytochemicals present in the leaves and pods of senna. The extraction efficiency, in terms of yield and composition, of senna extracts prepared using both conventional (cold percolation at room temperature and refluxing) and non-conventional (ultrasound- and microwave-assisted solvent extraction as well as supercritical fluid extraction) techniques was compared in the present study. In addition, a rapid reverse-phase HPLC-PDA detection method was developed and validated for the simultaneous determination of sennoside A and sennoside B in the different extracts of senna leaves. Ultrasound- and microwave-assisted solvent extraction techniques were more effective in terms of yield and composition of the extracts than the cold percolation at room temperature and refluxing methods of extraction.

  1. Piezoelectric extraction of ECG signal

    NASA Astrophysics Data System (ADS)

    Ahmad, Mahmoud Al

    2016-11-01

    The monitoring and early detection of abnormalities or variations in cardiac cycle functionality are very critical practices and have a significant impact on the prevention of heart diseases and their associated complications. Currently, in the field of biomedical engineering, there is a growing need for devices capable of measuring and monitoring a wide range of cardiac cycle parameters continuously, effectively and on a real-time basis using easily accessible and reusable probes. In this paper, the generation and extraction of the corresponding ECG signal using a piezoelectric transducer as an alternative to conventional ECG electrodes is discussed. The piezoelectric transducer picks up the vibrations from the heart beats and converts them into electrical output signals. To this end, piezoelectric and signal processing techniques were employed to extract the ECG-corresponding signal from the piezoelectric output voltage signal. The measured electrode-based and the extracted piezoelectric-based ECG traces corroborate each other well; their peak amplitudes and locations are well aligned with each other.
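
    The paper does not list the signal processing steps; one plausible chain, shown below as a hedged sketch, is a zero-phase band-pass filter followed by peak detection (SciPy), used here to compare peak locations between a reference ECG-like trace and a synthetic piezoelectric trace. The signals, sampling rate, and thresholds are placeholders:

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt, find_peaks

    fs = 500.0  # Hz, hypothetical sampling rate

    def bandpass(x, lo=0.5, hi=40.0, fs=fs, order=4):
        """Zero-phase band-pass filter covering the typical ECG band."""
        b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
        return filtfilt(b, a, x)

    # Synthetic stand-ins: a reference ECG-like spike train and a noisier piezo voltage.
    t = np.arange(0, 10, 1 / fs)
    beats = (np.mod(t, 0.8) < 0.02).astype(float)            # one beat every 0.8 s
    ecg_ref = beats + 0.05 * np.random.default_rng(2).normal(size=t.size)
    piezo = np.convolve(beats, np.hanning(25), mode="same") \
            + 0.1 * np.random.default_rng(3).normal(size=t.size)

    piezo_ecg = bandpass(piezo)
    peaks_ref, _ = find_peaks(ecg_ref, height=0.5, distance=int(0.4 * fs))
    peaks_piezo, _ = find_peaks(piezo_ecg, height=0.2, distance=int(0.4 * fs))
    print("reference peak times :", t[peaks_ref][:5])
    print("piezo-derived peaks  :", t[peaks_piezo][:5])
    ```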

  2. Automatic extraction of blocks from 3D point clouds of fractured rock

    NASA Astrophysics Data System (ADS)

    Chen, Na; Kemeny, John; Jiang, Qinghui; Pan, Zhiwen

    2017-12-01

    This paper presents a new method for extracting blocks and calculating block size automatically from rock surface 3D point clouds. Block size is an important rock mass characteristic and forms the basis for several rock mass classification schemes. The proposed method consists of four steps: 1) the automatic extraction of discontinuities using an improved RANSAC shape detection method, 2) the calculation of discontinuity intersections based on plane geometry, 3) the extraction of block candidates based on three discontinuities intersecting one another to form corners, and 4) the identification of "true" blocks using an improved Floodfill algorithm. The calculated block sizes were compared with manual measurements in two case studies, one with fabricated cardboard blocks and the other from an actual rock mass outcrop. The results demonstrate that the proposed method is accurate and overcomes the inaccuracies, safety hazards, and biases of traditional techniques.
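
    Step 1 relies on RANSAC plane detection. The sketch below fits a single plane to a synthetic (x, y, z) point cloud; the full method would iterate this to extract several discontinuities, and the thresholds and synthetic data are assumptions:

    ```python
    import numpy as np

    def ransac_plane(points, n_iter=500, threshold=0.02, seed=0):
        """Fit one plane (n·x + d = 0) to a 3-D point cloud with RANSAC."""
        rng = np.random.default_rng(seed)
        best_inliers = np.zeros(len(points), dtype=bool)
        best_model = None
        for _ in range(n_iter):
            sample = points[rng.choice(len(points), 3, replace=False)]
            normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
            norm = np.linalg.norm(normal)
            if norm < 1e-9:                      # degenerate (collinear) sample
                continue
            normal /= norm
            d = -normal @ sample[0]
            inliers = np.abs(points @ normal + d) < threshold
            if inliers.sum() > best_inliers.sum():
                best_inliers, best_model = inliers, (normal, d)
        return best_model, best_inliers

    # Hypothetical outcrop patch: a noisy plane plus scattered outliers.
    rng = np.random.default_rng(1)
    xy = rng.uniform(-1, 1, (400, 2))
    plane_pts = np.c_[xy, 0.3 * xy[:, 0] - 0.2 * xy[:, 1] + rng.normal(0, 0.005, 400)]
    cloud = np.vstack([plane_pts, rng.uniform(-1, 1, (50, 3))])

    model, inliers = ransac_plane(cloud)
    print("plane normal:", np.round(model[0], 3), " inliers:", inliers.sum())
    ```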

  3. Sensing cocaine in saliva with attenuated total reflection infrared (ATR-IR) spectroscopy combined with a one-step extraction method

    NASA Astrophysics Data System (ADS)

    Hans, Kerstin M.-C.; Gianella, Michele; Sigrist, Markus W.

    2012-03-01

    On-site drug tests have gained importance, e.g., for protecting society from impaired drivers. Since today's drug tests mostly give only positive/negative results, there is a great need for a reliable, portable and preferably quantitative drug test. In the IrSens project we aim to bridge this gap with the development of an optical sensor platform based on infrared spectroscopy, focusing on cocaine detection in saliva. We combine a one-step extraction method, a sample-drying technique and infrared attenuated total reflection (ATR) spectroscopy. As a first step we have developed an extraction technique that allows us to extract cocaine from saliva into an almost infrared-transparent solvent and to record ATR spectra with a commercially available Fourier transform infrared spectrometer. To the best of our knowledge, this is the first time that such a simple and easy-to-use one-step extraction method has been used to transfer cocaine from saliva into an organic solvent and detect it quantitatively. With this new method we are able to reach a limit of detection of around 10 μg/ml. This extraction method could also be applied to wastewater monitoring and to controlling the caffeine content in beverages.

  4. Sensitive determination of polycyclic aromatic hydrocarbons in water samples by HPLC coupled with SPE based on graphene functionalized with triethoxysilane.

    PubMed

    Huang, Ke-Jing; Li, Jing; Liu, Yan-Ming; Wang, Lan

    2013-02-01

    The graphene functionalized with (3-aminopropyl) triethoxysilane was synthesized by a simple hydrothermal reaction and applied as SPE sorbents to extract trace polycyclic aromatic hydrocarbons (PAHs) from environmental water samples. These sorbents possess high adsorption capacity and extraction efficiency due to strong adsorption ability of carbon materials and large specific surface area of nanoparticles, and only 10 mg of sorbents are required to extract PAHs from 100 mL water samples. Several condition parameters, such as eluent and its volume, adsorbent amount, sample volume, sample pH, and sample flow rate, were optimized to achieve good sensitivity and precision. Under the optimized extraction conditions, the method showed good linearity in the range of 1-100 μg/L, repeatability of the extraction (the RSDs were between 1.8 and 2.9%, n = 6), and satisfactory detection limits of 0.029-0.1 μg/L. The recoveries of PAHs spiked in environmental water samples ranged from 84.6 to 109.5%. All these results demonstrated that this new SPE technique was a viable alternative to conventional enrichment techniques for the extraction and analysis of PAHs in complex samples. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Quaternion-Based Signal Analysis for Motor Imagery Classification from Electroencephalographic Signals.

    PubMed

    Batres-Mendoza, Patricia; Montoro-Sanjose, Carlos R; Guerra-Hernandez, Erick I; Almanza-Ojeda, Dora L; Rostro-Gonzalez, Horacio; Romero-Troncoso, Rene J; Ibarra-Manzano, Mario A

    2016-03-05

    Quaternions can be used as an alternative to model the fundamental patterns of electroencephalographic (EEG) signals in the time domain. Thus, this article presents a new quaternion-based technique known as quaternion-based signal analysis (QSA) to represent EEG signals obtained using a brain-computer interface (BCI) device to detect and interpret cognitive activity. This quaternion-based signal analysis technique can extract features to represent brain activity related to motor imagery accurately in various mental states. Experimental tests in which users were shown visual graphical cues related to left and right movements were used to collect BCI-recorded signals. These signals were then classified using decision trees (DT), support vector machine (SVM) and k-nearest neighbor (KNN) techniques. The quantitative analysis of the classifiers demonstrates that this technique can be used as an alternative in the EEG-signal modeling phase to identify mental states.

  6. Quaternion-Based Signal Analysis for Motor Imagery Classification from Electroencephalographic Signals

    PubMed Central

    Batres-Mendoza, Patricia; Montoro-Sanjose, Carlos R.; Guerra-Hernandez, Erick I.; Almanza-Ojeda, Dora L.; Rostro-Gonzalez, Horacio; Romero-Troncoso, Rene J.; Ibarra-Manzano, Mario A.

    2016-01-01

    Quaternions can be used as an alternative to model the fundamental patterns of electroencephalographic (EEG) signals in the time domain. Thus, this article presents a new quaternion-based technique known as quaternion-based signal analysis (QSA) to represent EEG signals obtained using a brain-computer interface (BCI) device to detect and interpret cognitive activity. This quaternion-based signal analysis technique can extract features to represent brain activity related to motor imagery accurately in various mental states. Experimental tests in which users were shown visual graphical cues related to left and right movements were used to collect BCI-recorded signals. These signals were then classified using decision trees (DT), support vector machine (SVM) and k-nearest neighbor (KNN) techniques. The quantitative analysis of the classifiers demonstrates that this technique can be used as an alternative in the EEG-signal modeling phase to identify mental states. PMID:26959029
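
    The QSA idea maps four EEG channels onto the components of a quaternion sequence and derives features from it. The sketch below is a simplified illustration of that representation; the channel-to-component assignment and the chosen statistics are assumptions, not the authors' exact formulation:

    ```python
    import numpy as np

    def qsa_features(eeg_4ch):
        """Map 4 EEG channels to quaternions q = w + xi + yj + zk and
        summarize each epoch with simple magnitude/angle statistics."""
        w, x, y, z = eeg_4ch                       # shape: (4, n_samples)
        norm = np.sqrt(w**2 + x**2 + y**2 + z**2)  # quaternion magnitude per sample
        vec = np.sqrt(x**2 + y**2 + z**2)          # magnitude of the vector part
        angle = 2 * np.arctan2(vec, w)             # rotation angle per sample
        return np.array([norm.mean(), norm.std(), angle.mean(), angle.std()])

    # Hypothetical 1-second epoch of 4-channel EEG at 128 Hz.
    epoch = np.random.default_rng(0).normal(size=(4, 128))
    print("QSA feature vector:", np.round(qsa_features(epoch), 3))
    ```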

  7. Comparative Approach of MRI-Based Brain Tumor Segmentation and Classification Using Genetic Algorithm.

    PubMed

    Bahadure, Nilesh Bhaskarrao; Ray, Arun Kumar; Thethi, Har Pal

    2018-01-17

    The detection of a brain tumor and its classification from modern imaging modalities is a primary concern, but it is time-consuming and tedious work when performed manually by radiologists or clinical supervisors. The accuracy of detection and classification of tumor stages performed by radiologists depends only on their experience, so computer-aided technology is very important for improving diagnostic accuracy. In this study, to improve the performance of tumor detection, we investigated a comparative approach using different segmentation techniques and selected the best one by comparing their segmentation scores. Further, to improve the classification accuracy, a genetic algorithm is employed for the automatic classification of the tumor stage. The classification decision is supported by extracting relevant features and by area calculation. The experimental results of the proposed technique were evaluated and validated for performance and quality analysis on magnetic resonance brain images, based on segmentation score, accuracy, sensitivity, specificity, and the dice similarity index coefficient. The experimental results achieved 92.03% accuracy, 91.42% specificity, 92.36% sensitivity, and an average segmentation score between 0.82 and 0.93, demonstrating the effectiveness of the proposed technique for identifying normal and abnormal tissues from brain MR images. The experimental results also gave an average dice similarity index coefficient of 93.79%, which indicates good overlap between the automatically extracted tumor regions and the tumor regions manually extracted by radiologists.
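
    The dice similarity index used in the evaluation has the closed form DSC = 2|A ∩ B| / (|A| + |B|). A minimal sketch over hypothetical binary segmentation masks:

    ```python
    import numpy as np

    def dice_coefficient(mask_a, mask_b):
        """Dice similarity: 2*|A ∩ B| / (|A| + |B|) for binary masks."""
        a = np.asarray(mask_a, dtype=bool)
        b = np.asarray(mask_b, dtype=bool)
        denom = a.sum() + b.sum()
        return 1.0 if denom == 0 else 2.0 * np.logical_and(a, b).sum() / denom

    # Hypothetical automated vs. manually delineated tumor regions.
    auto = np.zeros((64, 64), bool);   auto[20:40, 20:40] = True
    manual = np.zeros((64, 64), bool); manual[22:42, 18:38] = True
    print(f"Dice similarity index: {dice_coefficient(auto, manual):.3f}")
    ```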

  8. Review of in situ derivatization techniques for enhanced bioanalysis using liquid chromatography with mass spectrometry.

    PubMed

    Baghdady, Yehia Z; Schug, Kevin A

    2016-01-01

    Accurate and specific analysis of target molecules in complex biological matrices remains a significant challenge, especially when ultra-trace detection limits are required. Liquid chromatography with mass spectrometry is often the method of choice for bioanalysis. Conventional sample preparation and clean-up methods prior to the analysis of biological fluids such as liquid-liquid extraction, solid-phase extraction, or protein precipitation are time-consuming, tedious, and can negatively affect target recovery and detection sensitivity. An alternative or complementary strategy is the use of an off-line or on-line in situ derivatization technique. In situ derivatization can be incorporated to directly derivatize target analytes in their native biological matrices, without any prior sample clean-up methods, to substitute or even enhance the extraction and preconcentration efficiency of these traditional sample preparation methods. Designed appropriately, it can reduce the number of sample preparation steps necessary prior to analysis. Moreover, in situ derivatization can be used to enhance the performance of the developed liquid chromatography with mass spectrometry-based bioanalysis methods regarding stability, chromatographic separation, selectivity, and ionization efficiency. This review presents an overview of the commonly used in situ derivatization techniques coupled to liquid chromatography with mass spectrometry-based bioanalysis to guide and to stimulate future research. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. Development of a nucleic Acid extraction procedure for simultaneous recovery of DNA and RNA from diverse microbes in water.

    PubMed

    Hill, Vincent R; Narayanan, Jothikumar; Gallen, Rachel R; Ferdinand, Karen L; Cromeans, Theresa; Vinjé, Jan

    2015-05-26

    Drinking and environmental water samples contain a diverse array of constituents that can interfere with molecular testing techniques, especially when large volumes of water are concentrated to the small volumes needed for effective molecular analysis. In this study, a suite of enteric viruses, bacteria, and protozoan parasites were seeded into concentrated source water and finished drinking water samples, in order to investigate the relative performance of nucleic acid extraction techniques for molecular testing. Real-time PCR and reverse transcription-PCR crossing threshold (CT) values were used as the metrics for evaluating relative performance. Experimental results were used to develop a guanidinium isothiocyanate-based lysis buffer (UNEX buffer) that enabled effective simultaneous extraction and recovery of DNA and RNA from the suite of study microbes. Procedures for bead beating, nucleic acid purification, and PCR facilitation were also developed and integrated in the protocol. The final lysis buffer and sample preparation procedure was found to be effective for a panel of drinking water and source water concentrates when compared to commercial nucleic acid extraction kits. The UNEX buffer-based extraction protocol enabled PCR detection of six study microbes, in 100 L finished water samples from four drinking water treatment facilities, within three CT values (i.e., within 90% difference) of the reagent-grade water control. The results from this study indicate that this newly formulated lysis buffer and sample preparation procedure can be useful for standardized molecular testing of drinking and environmental waters.

  10. Development of a Nucleic Acid Extraction Procedure for Simultaneous Recovery of DNA and RNA from Diverse Microbes in Water

    PubMed Central

    Hill, Vincent R.; Narayanan, Jothikumar; Gallen, Rachel R.; Ferdinand, Karen L.; Cromeans, Theresa; Vinjé, Jan

    2015-01-01

    Drinking and environmental water samples contain a diverse array of constituents that can interfere with molecular testing techniques, especially when large volumes of water are concentrated to the small volumes needed for effective molecular analysis. In this study, a suite of enteric viruses, bacteria, and protozoan parasites were seeded into concentrated source water and finished drinking water samples, in order to investigate the relative performance of nucleic acid extraction techniques for molecular testing. Real-time PCR and reverse transcription-PCR crossing threshold (CT) values were used as the metrics for evaluating relative performance. Experimental results were used to develop a guanidinium isothiocyanate-based lysis buffer (UNEX buffer) that enabled effective simultaneous extraction and recovery of DNA and RNA from the suite of study microbes. Procedures for bead beating, nucleic acid purification, and PCR facilitation were also developed and integrated in the protocol. The final lysis buffer and sample preparation procedure was found to be effective for a panel of drinking water and source water concentrates when compared to commercial nucleic acid extraction kits. The UNEX buffer-based extraction protocol enabled PCR detection of six study microbes, in 100 L finished water samples from four drinking water treatment facilities, within three CT values (i.e., within 90% difference) of the reagent-grade water control. The results from this study indicate that this newly formulated lysis buffer and sample preparation procedure can be useful for standardized molecular testing of drinking and environmental waters. PMID:26016775

  11. The extraction of essential oil from patchouli leaves (Pogostemon cablin Benth) using microwave hydrodistillation and solvent-free microwave extraction methods

    NASA Astrophysics Data System (ADS)

    Putri, D. K. Y.; Kusuma, H. S.; Syahputra, M. E.; Parasandi, D.; Mahfud, M.

    2017-12-01

    The patchouli plant (Pogostemon cablin Benth) is one of the most important essential oil-producing plants, contributing more than 50% of Indonesia's total essential oil exports. However, the extraction of patchouli oil in Indonesia generally still uses conventional methods that require an enormous amount of energy, high solvent usage, and long extraction times. Therefore, in this study, patchouli oil extraction was carried out using microwave hydrodistillation and solvent-free microwave extraction methods. Based on this research, it is known that the extraction of patchouli oil using the microwave hydrodistillation method with a longer extraction time (240 min) produced a patchouli oil yield only 1.2 times greater than the solvent-free microwave extraction method, which required a shorter extraction time (120 min). In terms of electricity consumption and environmental impact, the solvent-free microwave extraction method showed smaller values than the microwave hydrodistillation method. It is concluded that solvent-free microwave extraction is a suitable new green technique for patchouli oil extraction.

  12. Analysis of Technique to Extract Data from the Web for Improved Performance

    NASA Astrophysics Data System (ADS)

    Gupta, Neena; Singh, Manish

    2010-11-01

    The World Wide Web is rapidly guiding the world into an amazing new electronic world, where everyone can publish anything in electronic form and extract almost all the information. Extraction of information from semi-structured or unstructured documents, such as web pages, is a useful yet complex task. Data extraction, which is important for many applications, automatically extracts records from HTML files. Ontologies can achieve a high degree of accuracy in data extraction. We analyze a method for data extraction, OBDE (Ontology-Based Data Extraction), which automatically extracts the query result records from the web with the help of agents. OBDE first constructs an ontology for a domain according to information matching between the query interfaces and query result pages from different web sites within the same domain. Then, the constructed domain ontology is used during data extraction to identify the query result section in a query result page and to align and label the data values in the extracted records. The ontology-assisted data extraction method is fully automatic and overcomes many of the deficiencies of current automatic data extraction methods.
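
    As a toy illustration of the labeling step such a system performs, the sketch below uses a tiny hand-written "ontology" (attribute names with cue patterns) to align and label values from an HTML result snippet; the domain, patterns, and snippet are all hypothetical and far simpler than the agent-based ontology construction described above:

    ```python
    import re

    # Hypothetical mini-ontology for a "used cars" domain: attribute -> cue pattern.
    ontology = {
        "price":   re.compile(r"\$\s?([\d,]+)"),
        "mileage": re.compile(r"([\d,]+)\s?(?:km|miles)", re.I),
        "year":    re.compile(r"\b(19|20)\d{2}\b"),
    }

    html_record = "<li>2014 Sedan - 85,000 km - $7,450</li>"
    text = re.sub(r"<[^>]+>", " ", html_record)        # strip tags

    labeled = {}
    for attribute, pattern in ontology.items():
        match = pattern.search(text)
        if match:
            labeled[attribute] = match.group(0)

    print(labeled)   # {'price': '$7,450', 'mileage': '85,000 km', 'year': '2014'}
    ```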

  13. DNA Extraction from Museum Specimens of Parasitic Hymenoptera

    PubMed Central

    Andersen, Jeremy C.; Mills, Nicholas J.

    2012-01-01

    At the same time that molecular researchers are improving techniques to extract DNA from museum specimens, this increased demand for access to museum specimens has created tension between the need to preserve specimens for maintaining collections and morphological research and the desire to conduct molecular analyses. To address these concerns, we examined the suitability of non-invasive DNA extraction techniques on three species of parasitic Hymenoptera (Braconidae), and tested the effects of body size (parasitoid species), age (time since collection), and DNA concentration from each extract on the probability of amplifying meaningful fragments of two commonly used genetic loci. We found that age was a significant factor for determining the probability of success for sequencing both 28S and COI fragments. While the size of the braconid parasitoids significantly affected the total amount of extracted DNA, neither size nor DNA concentration were significant factors for the amplification of either gene region. We also tested several primer combinations of various lengths, but were unable to amplify fragments longer than ∼150 base pairs. These short fragments of 28S and COI were however sufficient for species identification, and for the discovery of within species genetic variation. PMID:23077493

  14. Parameter Extraction Method for the Electrical Model of a Silicon Photomultiplier

    NASA Astrophysics Data System (ADS)

    Licciulli, Francesco; Marzocca, Cristoforo

    2016-10-01

    The availability of an effective electrical model, able to accurately reproduce the signals generated by a Silicon Photo-Multiplier coupled to the front-end electronics, is mandatory when the performance of a detection system based on this kind of detector has to be evaluated by means of reliable simulations. We propose a complete extraction procedure able to provide the whole set of the parameters involved in a well-known model of the detector, which includes the substrate ohmic resistance. The technique achieves a very good quality of fit between the simulation results provided by the model and experimental data, thanks to accurate discrimination between the quenching and substrate resistances, which results in a realistic set of extracted parameters. The extraction procedure has been applied to a commercial device considering a wide range of different conditions in terms of input resistance of the front-end electronics and interconnection parasitics. In all the considered situations, very good correspondence has been found between simulations and measurements, especially with regard to the leading edge of the current pulses generated by the detector, which strongly affects the timing performance of the detection system, thus confirming the effectiveness of the model and the associated parameter extraction technique.
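
    As a simplified stand-in for the fitting step of such an extraction procedure, the sketch below fits a two-exponential SiPM-like current pulse to a noisy waveform with scipy.optimize.curve_fit; the pulse shape and parameter names are illustrative assumptions, not the paper's full electrical model (which also includes the substrate resistance):

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def sipm_pulse(t, amplitude, tau_rise, tau_fall, t0):
        """Simplified SiPM current pulse: fast rise, slow recharge decay."""
        s = np.clip(t - t0, 0.0, None)
        return amplitude * (np.exp(-s / tau_fall) - np.exp(-s / tau_rise))

    # Hypothetical measured waveform (ns, arbitrary units) with noise.
    t = np.linspace(0, 200, 1000)
    true_pulse = sipm_pulse(t, 1.0, 2.0, 40.0, 20.0)
    measured = true_pulse + np.random.default_rng(0).normal(0, 0.01, t.size)

    popt, _ = curve_fit(sipm_pulse, t, measured, p0=[0.8, 1.0, 30.0, 15.0])
    print("amplitude, tau_rise, tau_fall, t0 =", np.round(popt, 2))
    ```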

  15. A highly efficient bead extraction technique with low bead number for digital microfluidic immunoassay

    PubMed Central

    Tsai, Po-Yen; Lee, I-Chin; Hsu, Hsin-Yun; Huang, Hong-Yuan; Fan, Shih-Kang; Liu, Cheng-Hsien

    2016-01-01

    Here, we describe a technique to manipulate a low number of beads to achieve high washing efficiency with zero bead loss in the washing process of a digital microfluidic (DMF) immunoassay. Previously, two magnetic bead extraction methods were reported for the DMF platform: (1) a single-side electrowetting method and (2) a double-side electrowetting method. The first approach could provide high washing efficiency, but it required a large number of beads. The second approach could reduce the required number of beads, but it was inefficient where multiple washes were required. More importantly, bead loss during the washing process was unavoidable in both methods. Here, an improved double-side electrowetting method is proposed for bead extraction by utilizing a series of unequal electrodes. It is shown that, with a proper electrode size ratio, only one wash step is required to achieve a 98% washing rate without any bead loss with fewer than 100 beads in a droplet. This allows only about 25 magnetic beads to be used in the DMF immunoassay, effectively increasing the number of captured analytes on each bead. In our human soluble tumor necrosis factor receptor I (sTNF-RI) model immunoassay, the experimental results show that, compared to our previous results without the proposed bead extraction technique, the immunoassay with a low bead number significantly enhances the fluorescence signal to provide a better limit of detection (3.14 pg/ml) with smaller reagent volumes (200 nl) and shorter analysis time (<1 h). This improved bead extraction technique not only can be used in the DMF immunoassay but also has great potential to be used in any other bead-based DMF systems for different applications. PMID:26858807

  16. Annual Quality Assurance Conference Abstracts by Barbara Marshik

    EPA Pesticide Factsheets

    25th Annual Quality Assurance Conference. Abstracts: "Material and Process Conditions for Successful Use of Extractive Sampling Techniques and Certification Methods" and "Errors in the Analysis of NMHC and VOCs in CNG-Based Engine Emissions" by Barbara Marshik

  17. A review on "A Novel Technique for Image Steganography Based on Block-DCT and Huffman Encoding"

    NASA Astrophysics Data System (ADS)

    Das, Rig; Tuithung, Themrichon

    2013-03-01

    This paper reviews the embedding and extraction algorithm proposed by A. Nag, S. Biswas, D. Sarkar and P. P. Sarkar in "A Novel Technique for Image Steganography based on Block-DCT and Huffman Encoding" (International Journal of Computer Science and Information Technology, Volume 2, Number 3, June 2010) [3] and shows that extraction of the secret image is not possible for the algorithm proposed in [3]. An 8-bit cover image is divided into disjoint blocks, and a two-dimensional discrete cosine transform (2-D DCT) is performed on each of the blocks. Huffman encoding is performed on an 8-bit secret image, and each bit of the Huffman-encoded bit stream is embedded in the frequency domain by altering the LSB of the DCT coefficients of the cover image blocks. The Huffman-encoded bit stream and Huffman table
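
    For reference, the basic embedding operation reviewed here, altering the LSB of a chosen DCT coefficient in each 8x8 block, can be sketched as follows. This illustrates the general block-DCT/LSB idea with SciPy's DCT and is not the exact algorithm of [3]; the coefficient position and payload bits are arbitrary:

    ```python
    import numpy as np
    from scipy.fftpack import dct, idct

    def dct2(block):  return dct(dct(block, axis=0, norm="ortho"), axis=1, norm="ortho")
    def idct2(block): return idct(idct(block, axis=0, norm="ortho"), axis=1, norm="ortho")

    def embed_bits(cover, bits, coeff=(4, 3)):
        """Hide one bit per 8x8 block in the LSB of a rounded mid-frequency DCT coefficient."""
        stego = cover.astype(float).copy()
        k = 0
        for r in range(0, cover.shape[0] - 7, 8):
            for c in range(0, cover.shape[1] - 7, 8):
                if k >= len(bits):
                    return stego
                block = dct2(stego[r:r+8, c:c+8])
                value = int(round(block[coeff]))
                block[coeff] = (value & ~1) | bits[k]      # overwrite the LSB
                stego[r:r+8, c:c+8] = idct2(block)
                k += 1
        return stego

    cover = np.random.default_rng(0).integers(0, 256, (64, 64)).astype(float)
    bits = [1, 0, 1, 1, 0, 0, 1, 0]                        # e.g. a Huffman-coded payload
    stego = embed_bits(cover, bits)
    print("max pixel change:", np.abs(stego - cover).max())
    ```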

  18. Exploitation of immunofluorescence for the quantification and characterization of small numbers of Pasteuria endospores.

    PubMed

    Costa, Sofia R; Kerry, Brian R; Bardgett, Richard D; Davies, Keith G

    2006-12-01

    The Pasteuria group of endospore-forming bacteria has been studied as a biocontrol agent of plant-parasitic nematodes. Techniques have been developed for its detection and quantification in soil samples, and these mainly focus on observations of endospore attachment to nematodes. Characterization of Pasteuria populations has recently been performed with DNA-based techniques, which usually require the extraction of large numbers of spores. We describe a simple immunological method for the quantification and characterization of Pasteuria populations. Bayesian statistics were used to determine an extraction efficiency of 43% and a threshold of detection of 210 endospores g⁻¹ sand. This provided a robust means of estimating numbers of endospores in small-volume samples from a natural system. Based on visual assessment of endospore fluorescence, a quantitative method was developed to characterize endospore populations, which were shown to vary according to their host.

  19. Morphology filter bank for extracting nodular and linear patterns in medical images.

    PubMed

    Hashimoto, Ryutaro; Uchiyama, Yoshikazu; Uchimura, Keiichi; Koutaki, Gou; Inoue, Tomoki

    2017-04-01

    Using image processing to extract nodular or linear shadows is a key technique in computer-aided diagnosis schemes. This study proposes a new method for extracting nodular and linear patterns of various sizes in medical images. We have developed a morphology filter bank that creates multiresolution representations of an image. The analysis bank of this filter bank produces nodular and linear patterns at each resolution level. The synthesis bank can then be used to perfectly reconstruct the original image from these decomposed patterns. Our proposed method shows better performance, based on a quantitative evaluation using a synthesized image, than a conventional method based on a Hessian matrix, which is often used to enhance nodular and linear patterns. In addition, experiments show that our method can be applied to the following: (1) microcalcifications of various sizes in mammograms can be extracted, (2) blood vessels of various sizes in retinal fundus images can be extracted, and (3) thoracic CT images can be reconstructed while removing normal vessels. Our proposed method is useful for extracting nodular and linear shadows or removing normal structures in medical images.
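
    One way to see the nodular/linear separation idea: grayscale openings with a disk-shaped element preserve nodular (blob-like) patterns, while openings with line elements at several orientations preserve linear ones. The sketch below, using scikit-image, is a single-scale simplification and not the authors' multiresolution filter bank; the structuring-element sizes and test image are assumptions:

    ```python
    import numpy as np
    from skimage.morphology import opening, disk

    def nodular_and_linear(image, length=9, radius=4):
        """Grayscale openings: a disk keeps nodular patterns, oriented lines keep linear ones."""
        line_footprints = [
            np.ones((1, length), np.uint8),            # horizontal
            np.ones((length, 1), np.uint8),            # vertical
            np.eye(length, dtype=np.uint8),            # diagonal
            np.fliplr(np.eye(length, dtype=np.uint8))  # anti-diagonal
        ]
        nodular = opening(image, disk(radius))
        linear = np.max([opening(image, fp) for fp in line_footprints], axis=0)
        return nodular, linear

    # Hypothetical image: one bright nodule and one bright vessel-like line.
    img = np.zeros((80, 80))
    yy, xx = np.mgrid[:80, :80]
    img += 200 * np.exp(-((yy - 25)**2 + (xx - 25)**2) / 20)   # nodule
    img[55:57, 10:70] = 180                                     # linear shadow

    nodular, linear = nodular_and_linear(img)
    print("nodule response:", nodular[25, 25].round(1))
    print("vessel response:", linear[56, 40].round(1))
    ```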

  20. A comparative study of conventional and supercritical fluid extraction methods for the recovery of secondary metabolites from Syzygium campanulatum Korth.

    PubMed

    Memon, Abdul Hakeem; Hamil, Mohammad Shahrul Ridzuan; Laghari, Madeeha; Rithwan, Fahim; Zhari, Salman; Saeed, Mohammed Ali Ahmed; Ismail, Zhari; Majid, Amin Malik Shah Abdul

    2016-09-01

    Syzygium campanulatum Korth is a plant, which is a rich source of secondary metabolites (especially flavanones, chalcone, and triterpenoids). In our present study, three conventional solvent extraction (CSE) techniques and supercritical fluid extraction (SFE) techniques were performed to achieve a maximum recovery of two flavanones, chalcone, and two triterpenoids from S. campanulatum leaves. Furthermore, a Box-Behnken design was constructed for the SFE technique using pressure, temperature, and particle size as independent variables, and yields of crude extract, individual and total secondary metabolites as the dependent variables. In the CSE procedure, twenty extracts were produced using ten different solvents and three techniques (maceration, soxhletion, and reflux). An enriched extract of five secondary metabolites was collected using n-hexane:methanol (1:1) soxhletion. Using food-grade ethanol as a modifier, the SFE methods produced a higher recovery (25.5%‒84.9%) of selected secondary metabolites as compared to the CSE techniques (0.92%‒66.00%).

  1. A comparative study of conventional and supercritical fluid extraction methods for the recovery of secondary metabolites from Syzygium campanulatum Korth#

    PubMed Central

    Memon, Abdul Hakeem; Hamil, Mohammad Shahrul Ridzuan; Laghari, Madeeha; Rithwan, Fahim; Zhari, Salman; Saeed, Mohammed Ali Ahmed; Ismail, Zhari; Majid, Amin Malik Shah Abdul

    2016-01-01

    Syzygium campanulatum Korth is a plant, which is a rich source of secondary metabolites (especially flavanones, chalcone, and triterpenoids). In our present study, three conventional solvent extraction (CSE) techniques and supercritical fluid extraction (SFE) techniques were performed to achieve a maximum recovery of two flavanones, chalcone, and two triterpenoids from S. campanulatum leaves. Furthermore, a Box-Behnken design was constructed for the SFE technique using pressure, temperature, and particle size as independent variables, and yields of crude extract, individual and total secondary metabolites as the dependent variables. In the CSE procedure, twenty extracts were produced using ten different solvents and three techniques (maceration, soxhletion, and reflux). An enriched extract of five secondary metabolites was collected using n-hexane:methanol (1:1) soxhletion. Using food-grade ethanol as a modifier, the SFE methods produced a higher recovery (25.5%‒84.9%) of selected secondary metabolites as compared to the CSE techniques (0.92%‒66.00%). PMID:27604860

  2. Extraction of kiwi seed oil: Soxhlet versus four different non-conventional techniques.

    PubMed

    Cravotto, Giancarlo; Bicchi, Carlo; Mantegna, Stefano; Binello, Arianna; Tomao, Valerie; Chemat, Farid

    2011-06-01

    Kiwi seed oil has a nutritionally interesting fatty acid profile but a rather low oxidative stability, which requires careful extraction procedures and adequate packaging and storage. For these reasons, and with the aim of achieving process intensification with shorter extraction times, lower energy consumption and higher yields, four different non-conventional techniques were evaluated. Kiwi seeds were extracted in hexane using classic Soxhlet extraction as well as power ultrasound (US), microwaves (MWs; closed vessel) and MW-integrated Soxhlet extraction. Supercritical CO₂ was also employed and compared to the other techniques in terms of yield, extraction time, fatty acid profile and organoleptic properties. All these non-conventional techniques are fast, effective and safe. A sensory evaluation test showed the presence of off-flavours in oil samples extracted by Soxhlet and US, an indicator of partial degradation.

  3. Automated generation of individually customized visualizations of diagnosis-specific medical information using novel techniques of information extraction

    NASA Astrophysics Data System (ADS)

    Chen, Andrew A.; Meng, Frank; Morioka, Craig A.; Churchill, Bernard M.; Kangarloo, Hooshang

    2005-04-01

    Managing pediatric patients with neurogenic bladder (NGB) involves regular laboratory, imaging, and physiologic testing. Using input from domain experts and current literature, we identified specific data points from these tests to develop the concept of an electronic disease vector for NGB. An information extraction engine was used to extract the desired data elements from free-text and semi-structured documents retrieved from the patient's medical record. Finally, a Java-based presentation engine created graphical visualizations of the extracted data. After precision, recall, and timing evaluation, we conclude that these tools may enable clinically useful, automatically generated, and diagnosis-specific visualizations of patient data, potentially improving compliance and ultimately, outcomes.

  4. Forest Road Identification and Extraction Through Advanced Log Matching Techniques

    NASA Astrophysics Data System (ADS)

    Zhang, W.; Hu, B.; Quist, L.

    2017-10-01

    A novel algorithm for forest road identification and extraction was developed. The algorithm utilized a Laplacian of Gaussian (LoG) filter and slope calculation on high-resolution multispectral imagery and LiDAR data, respectively, to extract both primary and secondary road segments in the forest area. The proposed method used road shape features to extract the road segments, which were further processed as objects with orientation preserved. The road network was generated after post-processing with tensor voting. The proposed method was tested on the Hearst forest, located in central Ontario, Canada. Based on visual examination against manually digitized roads, the majority of roads in the test area were identified and extracted by the process.
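
    A minimal sketch of the LoG filtering step on imagery, using SciPy's gaussian_laplace and a simple percentile threshold to flag candidate road pixels; the filter scale, threshold, and synthetic image are assumptions, and the paper additionally uses LiDAR slope and tensor voting:

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_laplace

    # Hypothetical single-band image patch with a bright, narrow road-like stripe.
    img = np.zeros((100, 100))
    img[:, 48:52] = 1.0
    img += np.random.default_rng(0).normal(0, 0.05, img.shape)

    # Laplacian of Gaussian: strong negative response on bright linear features.
    log_response = gaussian_laplace(img, sigma=2.0)
    road_candidates = log_response < np.percentile(log_response, 5)

    rows, cols = np.nonzero(road_candidates)
    print("candidate road pixels:", road_candidates.sum(),
          "mean column:", cols.mean().round(1))
    ```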

  5. Automatic Extraction of Planetary Image Features

    NASA Technical Reports Server (NTRS)

    Troglio, G.; LeMoigne, J.; Moser, G.; Serpico, S. B.; Benediktsson, J. A.

    2009-01-01

    With the launch of several Lunar missions such as the Lunar Reconnaissance Orbiter (LRO) and Chandrayaan-1, a large number of Lunar images will be acquired and will need to be analyzed. Although many automatic feature extraction methods have been proposed and utilized for Earth remote sensing images, these methods are not always applicable to Lunar data, which often present low contrast and uneven illumination characteristics. In this paper, we propose a new method for the extraction of Lunar features (that can be generalized to other planetary images), based on the combination of several image processing techniques, a watershed segmentation and the generalized Hough Transform. This feature extraction has many applications, among which is image registration.
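
    The combination named above, watershed segmentation plus a Hough transform for circular features such as craters, can be sketched with scikit-image as follows; the synthetic crater image and all parameter values are hypothetical:

    ```python
    import numpy as np
    from scipy import ndimage as ndi
    from skimage.draw import disk as draw_disk
    from skimage.feature import canny, peak_local_max
    from skimage.segmentation import watershed
    from skimage.transform import hough_circle, hough_circle_peaks

    # Hypothetical low-contrast "planetary" image with two crater-like features.
    img = np.zeros((120, 120))
    for center, r in [((40, 40), 15), ((85, 75), 20)]:
        rr, cc = draw_disk(center, r, shape=img.shape)
        img[rr, cc] = 0.4
    img += np.random.default_rng(0).normal(0, 0.02, img.shape)

    # Watershed segmentation seeded by distance-transform maxima.
    mask = img > 0.2
    distance = ndi.distance_transform_edt(mask)
    peaks = peak_local_max(distance, min_distance=10, labels=mask)
    markers = np.zeros_like(img, dtype=int)
    markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
    segments = watershed(-distance, markers, mask=mask)
    print("watershed regions:", segments.max())

    # Hough transform on edges to recover circular (crater) rims.
    edges = canny(img, sigma=2.0)
    radii = np.arange(10, 30, 2)
    accum = hough_circle(edges, radii)
    _, cx, cy, found_r = hough_circle_peaks(accum, radii, total_num_peaks=2)
    print("detected crater centers:", list(zip(cx, cy)), "radii:", list(found_r))
    ```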

  6. A new adaptive algorithm for automated feature extraction in exponentially damped signals for health monitoring of smart structures

    NASA Astrophysics Data System (ADS)

    Qarib, Hossein; Adeli, Hojjat

    2015-12-01

    In this paper the authors introduce a new adaptive signal processing technique for feature extraction and parameter estimation in noisy exponentially damped signals. The iterative 3-stage method is based on the adroit integration of the strengths of parametric and nonparametric methods such as multiple signal classification, matrix pencil, and empirical mode decomposition algorithms. The first stage is a new adaptive filtration or noise removal scheme. The second stage is a hybrid parametric-nonparametric signal parameter estimation technique based on an output-only system identification technique. The third stage is optimization of the estimated parameters using a combination of the primal-dual path-following interior point algorithm and a genetic algorithm. The methodology is evaluated using a synthetic signal and a signal obtained experimentally from transverse vibrations of a steel cantilever beam. The method is successful in estimating the frequencies accurately. Further, it estimates the damping exponents. The proposed adaptive filtration method does not include any frequency domain manipulation. Consequently, the time domain signal is not affected as a result of frequency domain and inverse transformations.

  7. Comparison of two novel in-syringe dispersive liquid-liquid microextraction techniques for the determination of iodide in water samples using spectrophotometry.

    PubMed

    Kaykhaii, Massoud; Sargazi, Mona

    2014-01-01

    Two new, rapid methodologies have been developed and applied successfully for the determination of trace levels of iodide in real water samples. Both techniques are based on a combination of in-syringe dispersive liquid-liquid microextraction (IS-DLLME) and micro-volume UV-Vis spectrophotometry. In the first technique, iodide is oxidized with nitrous acid to the colorless anion ICl2- at a high concentration of hydrochloric acid. Rhodamine B is added and, by means of one-step IS-DLLME, the ion-pair formed is extracted into toluene and measured spectrophotometrically. Acetone is used as the dispersive solvent. The second method is based on the IS-DLLME microextraction of iodide as the colored iodide/1,10-phenanthroline-iron(II) chelate cation ion-pair into nitrobenzene. Methanol was selected as the dispersive solvent. Optimal conditions for iodide extraction were determined for both approaches. The methods are compared in terms of analytical parameters such as precision, accuracy, speed and limit of detection. Both methods were successfully applied to determining iodide in tap and river water samples. Copyright © 2013 Elsevier B.V. All rights reserved.

  8. Methodologies for extraction of dissolved inorganic carbon for stable carbon isotope studies : evaluation and alternatives

    USGS Publications Warehouse

    Hassan, Afifa Afifi

    1982-01-01

    The gas evolution and the strontium carbonate precipitation techniques to extract dissolved inorganic carbon (DIC) for stable carbon isotope analysis were investigated. Theoretical considerations, involving thermodynamic calculations and computer simulation, pointed out several possible sources of error in delta carbon-13 measurements of the DIC and demonstrated the need for experimental evaluation of the magnitude of the error. An alternative analytical technique, equilibration with an out-gassed vapor phase, is proposed. The experimental studies revealed that the delta carbon-13 of the DIC extracted from a 0.01 molar NaHCO3 solution agreed within 0.1 per mil for the precipitation technique, with an increase of only 0.27 per mil in that extracted by the gas evolution technique. The efficiency of extraction of DIC decreased with sulfate concentration in the precipitation technique but was independent of sulfate concentration in the gas evolution technique. Both the precipitation and gas evolution techniques were found to be satisfactory for extraction of DIC from different kinds of natural water for stable carbon isotope analysis, provided appropriate precautions are observed in handling the samples. For example, it was found that diffusion of atmospheric carbon dioxide does alter the delta carbon-13 of samples contained in polyethylene bottles; filtration and drying in the air change the delta carbon-13 of the precipitate in the precipitation technique; and hot manganese dioxide purification changes the delta carbon-13 of carbon dioxide. (USGS)

  9. FIR: An Effective Scheme for Extracting Useful Metadata from Social Media.

    PubMed

    Chen, Long-Sheng; Lin, Zue-Cheng; Chang, Jing-Rong

    2015-11-01

    Recently, the use of social media for health information exchange has been expanding among patients, physicians, and other health care professionals. In medical areas, social media allows non-experts to access, interpret, and generate medical information for their own care and the care of others. Researchers have paid much attention to social media in medical education, patient-pharmacist communication, adverse drug reaction detection, the impact of social media on medicine and healthcare, and so on. However, relatively few papers discuss how to effectively extract useful knowledge from the huge volume of textual comments in social media. Therefore, this study proposes a Fuzzy adaptive resonance theory network based Information Retrieval (FIR) scheme, combining a Fuzzy adaptive resonance theory (ART) network, Latent Semantic Indexing (LSI), and association rule (AR) discovery, to extract knowledge from social media. In our FIR scheme, a Fuzzy ART network is first employed to segment comments. Next, for each customer segment, we use the LSI technique to retrieve important keywords. Then, in order to make the extracted keywords understandable, association rule mining is applied to organize these keywords into metadata. The extracted voices of customers are then transformed into design needs using Quality Function Deployment (QFD) for further decision making. Unlike conventional information retrieval techniques, which acquire too many keywords to get key points, our FIR scheme can extract understandable metadata from social media.
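
    The LSI stage can be illustrated with a TF-IDF matrix factorized by truncated SVD (scikit-learn); the toy comments and the choice of two latent dimensions are illustrative assumptions, and the Fuzzy ART segmentation and association-rule steps are omitted:

    ```python
    import numpy as np
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import TruncatedSVD

    # Hypothetical patient comments from a health-related social media board.
    comments = [
        "nurse was friendly but the waiting time was far too long",
        "long waiting time at the pharmacy, staff were helpful",
        "the new drug caused mild headache and nausea",
        "headache and nausea after taking the medication",
    ]

    tfidf = TfidfVectorizer(stop_words="english")
    X = tfidf.fit_transform(comments)

    # Latent Semantic Indexing: project comments onto 2 latent topics.
    lsi = TruncatedSVD(n_components=2, random_state=0)
    topic_space = lsi.fit_transform(X)

    terms = np.array(tfidf.get_feature_names_out())
    for i, component in enumerate(lsi.components_):
        top = terms[np.argsort(component)[::-1][:3]]
        print(f"topic {i}: keywords {list(top)}")
    print("comment-topic coordinates:\n", topic_space.round(2))
    ```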

  10. Single-Rooted Extraction Sockets: Classification and Treatment Protocol.

    PubMed

    El Chaar, Edgar; Oshman, Sarah; Fallah Abed, Pooria

    2016-09-01

    Clinicians have many treatment techniques from which to choose when extracting a failing tooth and replacing it with an implant-supported restoration, and successful management of the extraction socket during the course of tooth replacement is necessary to achieve predictable and esthetic outcomes. This article presents a straightforward, yet thorough, classification for extraction sockets of single-rooted teeth and provides guidance to clinicians in the selection of appropriate and predictable treatment. The presented classification of extraction sockets for single-rooted teeth focuses on the topography of the extraction socket, while the protocol for treatment of each socket type factors in the shape of the remaining bone, the biotype, and the location of the socket, whether in the mandible or maxilla. This system is based on the biologic foundations of wound healing and can help guide clinicians to successful treatment outcomes.

  11. Experimental validation of a numerical 3-D finite model applied to wind turbines design under vibration constraints: TREVISE platform

    NASA Astrophysics Data System (ADS)

    Sellami, Takwa; Jelassi, Sana; Darcherif, Abdel Moumen; Berriri, Hanen; Mimouni, Med Faouzi

    2018-04-01

    With the advancement of wind turbines towards complex structures, the need for trustworthy structural models has become more apparent. Hence, the vibration characteristics of the wind turbine components, like the blades and the tower, have to be extracted under vibration constraints. Although extracting the modal properties of blades is a simple task, calculating precise modal data for the whole wind turbine coupled to its tower/foundation remains challenging. In this framework, this paper focuses on the investigation of the structural modeling approach of modern commercial micro-turbines. Thus, the structural model of a complex commercial wind turbine, the Rutland 504, is established based on both experimental and numerical methods. A three-dimensional (3-D) numerical model of the structure was set up based on the finite volume method (FVM) using the academic finite element analysis software ANSYS. To validate the created model, experimental vibration tests were carried out using the vibration test system of the TREVISE platform at ECAM-EPMI. The tests were based on the experimental modal analysis (EMA) technique, which is one of the most efficient techniques for identifying structural parameters. Indeed, the poles and residues of the frequency response functions (FRF), between input and output spectra, were calculated to extract the mode shapes and the natural frequencies of the structure. Based on the obtained modal parameters, the numerical model was updated.
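
    A toy sketch of the post-processing idea behind experimental modal analysis: estimate the magnitude of a frequency response function and pick resonance peaks as natural-frequency estimates. The two-mode synthetic system is an assumption for illustration; no pole/residue fitting or TREVISE data is involved.

      # Peak-picking sketch for natural frequencies from a synthetic FRF (not the TREVISE code).
      import numpy as np
      from scipy.signal import find_peaks

      f = np.linspace(0.5, 50.0, 2000)          # frequency axis [Hz]
      w = 2 * np.pi * f

      def sdof_frf(fn, zeta):
          wn = 2 * np.pi * fn
          return 1.0 / (wn**2 - w**2 + 2j * zeta * wn * w)   # receptance of one mode

      H = sdof_frf(6.5, 0.02) + sdof_frf(27.0, 0.03)          # superposed modes (assumed values)
      mag = np.abs(H)

      peaks, _ = find_peaks(mag, prominence=mag.max() * 0.05)
      print("estimated natural frequencies [Hz]:", f[peaks])   # ~6.5 and ~27.0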

  12. An approach of ionic liquids/lithium salts based microwave irradiation pretreatment followed by ultrasound-microwave synergistic extraction for two coumarins preparation from Cortex fraxini.

    PubMed

    Liu, Zaizhi; Gu, Huiyan; Yang, Lei

    2015-10-23

    An ionic liquid/lithium salt solvent system was successfully introduced into the separation technique for the preparation of two coumarins (aesculin and aesculetin) from Cortex fraxini. An ionic liquid/lithium salt based microwave irradiation pretreatment followed by ultrasound-microwave synergistic extraction (ILSMP-UMSE) procedure was developed and optimized for efficient extraction of these two analytes. Several variables which can potentially influence the extraction yields, including pretreatment time and temperature, [C4mim]Br concentration, LiAc content, ultrasound-microwave synergistic extraction (UMSE) time, liquid-solid ratio, and UMSE power, were screened by a Plackett-Burman design. Among the seven variables, UMSE time, liquid-solid ratio, and UMSE power were statistically significant, and these three factors were further optimized by a Box-Behnken design to predict optimal extraction conditions and find operable ranges with maximum extraction yields. Under the optimum operating conditions, ILSMP-UMSE showed higher extraction yields of the two target compounds than those obtained with reference extraction solvents. Method validation studies also showed that ILSMP-UMSE is reliable for the preparation of the two coumarins from Cortex fraxini. This study indicates that the proposed procedure has strong application prospects for the preparation of natural products from plant materials. Copyright © 2015 Elsevier B.V. All rights reserved.
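
    A generic sketch of the response-surface step that follows such designed experiments: fit a second-order model to coded design points and search it for the predicted optimum. The two coded factors, design points and yields below are invented placeholders, not data from this study.

      # Generic response-surface sketch (quadratic model fit + optimum search); illustrative data only.
      import numpy as np

      # coded levels of two factors (e.g. extraction time, power) and measured yield
      X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
                    [0, 0], [0, 0], [-1, 0], [1, 0], [0, -1], [0, 1]], float)
      y = np.array([3.1, 3.9, 3.6, 4.2, 4.6, 4.5, 3.8, 4.1, 3.7, 4.0])

      def design(x):
          x1, x2 = x[:, 0], x[:, 1]
          return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

      beta, *_ = np.linalg.lstsq(design(X), y, rcond=None)   # second-order model coefficients

      grid = np.array([[a, b] for a in np.linspace(-1, 1, 41)
                               for b in np.linspace(-1, 1, 41)])
      pred = design(grid) @ beta
      best = grid[np.argmax(pred)]
      print("predicted optimum (coded units):", best, "predicted yield:", pred.max())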

  13. Earthquake Damage Assessment Using Objective Image Segmentation: A Case Study of 2010 Haiti Earthquake

    NASA Technical Reports Server (NTRS)

    Oommen, Thomas; Rebbapragada, Umaa; Cerminaro, Daniel

    2012-01-01

    In this study, we perform a case study on imagery from the Haiti earthquake that evaluates a novel object-based approach for characterizing earthquake-induced surface effects of liquefaction against a traditional pixel-based change detection technique. Our technique, which combines object-oriented change detection with discriminant/categorical functions, shows the power of distinguishing earthquake-induced surface effects from changes in buildings using the object properties of concavity, convexity, orthogonality, and rectangularity. Our results suggest that object-based analysis holds promise for automatically extracting earthquake-induced damage from high-resolution aerial/satellite imagery.

  14. Mobile robots traversability awareness based on terrain visual sensory data fusion

    NASA Astrophysics Data System (ADS)

    Shirkhodaie, Amir

    2007-04-01

    In this paper, we have presented methods that significantly improve a robot's awareness of its terrain traversability conditions. The terrain traversability awareness is achieved by association of terrain image appearances from different poses and fusion of extracted information from multimodality imaging and range sensor data for localization and clustering of environment landmarks. Initially, we describe methods for extraction of salient features of the terrain for the purpose of landmark registration from two or more images taken from different via points along the trajectory path of the robot. The method of image registration is applied as a means of overlaying two or more images of the same terrain scene taken from different viewpoints. The registration geometrically aligns salient landmarks of two images (the reference and sensed images). A similarity matching technique is proposed for matching the terrain salient landmarks. Secondly, we present three terrain classifier models based on rule-based, supervised neural network, and fuzzy logic approaches for classification of terrain condition under uncertainty and mapping the robot's terrain perception to apt traversability measures. This paper addresses the technical challenges and navigational skill requirements of mobile robots for traversability path planning in natural terrain environments similar to Mars surface terrains. We have described different methods for detection of salient terrain features based on imaging texture analysis techniques. We have also presented three competing techniques for terrain traversability assessment of mobile robots navigating in unstructured natural terrain environments: a rule-based terrain classifier, a neural network-based terrain classifier, and a fuzzy-logic terrain classifier. Each proposed terrain classifier divides a region of natural terrain into finite sub-terrain regions and classifies terrain condition exclusively within each sub-terrain region based on terrain spatial and textural cues.

  15. Digital filtering of plume emission spectra

    NASA Technical Reports Server (NTRS)

    Madzsar, George C.

    1990-01-01

    Fourier transformation and digital filtering techniques were used to separate the superpositioned spectral phenomena observed in the exhaust plumes of liquid propellant rocket engines. Space shuttle main engine (SSME) spectral data were used to show that extraction of spectral lines in the spatial frequency domain does not introduce error, and extraction of the background continuum introduces only minimal error. Error introduced during band extraction could not be quantified due to poor spectrometer resolution. Based on the atomic and molecular species found in the SSME plume, it was determined that spectrometer resolution must be 0.03 nm for SSME plume spectral monitoring.
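
    A small sketch of the Fourier-domain separation idea: low spatial frequencies capture the slowly varying continuum, and the residual retains the narrow emission lines. The synthetic spectrum and cutoff are assumptions for illustration, not SSME data.

      # Fourier-domain separation of a narrow emission line from a broad continuum (illustrative).
      import numpy as np

      wl = np.linspace(300.0, 900.0, 4096)                      # wavelength [nm]
      continuum = np.exp(-((wl - 600.0) / 250.0) ** 2)          # slowly varying background
      lines = 0.8 * np.exp(-((wl - 589.0) / 0.3) ** 2)          # narrow atomic line (e.g. Na)
      spectrum = continuum + lines

      F = np.fft.rfft(spectrum)
      cutoff = 20                                               # keep only low spatial frequencies
      F_low = F.copy()
      F_low[cutoff:] = 0.0
      est_continuum = np.fft.irfft(F_low, n=spectrum.size)      # background estimate
      est_lines = spectrum - est_continuum                      # line spectrum residual
      print("peak of extracted line near:", wl[np.argmax(est_lines)], "nm")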

  16. Ionic liquid solutions as extractive solvents for value-added compounds from biomass

    PubMed Central

    Passos, Helena; Freire, Mara G.; Coutinho, João A. P.

    2014-01-01

    In the past few years, the number of studies regarding the application of ionic liquids (ILs) as alternative solvents to extract value-added compounds from biomass has been growing. Based on an extended compilation and analysis of the data hitherto reported, the main objective of this review is to provide an overview on the use of ILs and their mixtures with molecular solvents for the extraction of value-added compounds present in natural sources. The ILs (or IL solutions) investigated as solvents for the extraction of natural compounds, such as alkaloids, flavonoids, terpenoids, lipids, among others, are outlined. The extraction techniques employed, namely solid–liquid extraction, and microwave-assisted and ultrasound-assisted extractions, are emphasized and discussed in terms of extraction yields and purification factors. Furthermore, the evaluation of the IL chemical structure and the optimization of the process conditions (IL concentration, temperature, biomass–solvent ratio, etc.) are critically addressed. Major conclusions on the role of the ILs towards the extraction mechanisms and improved extraction yields are additionally provided. The isolation and recovery procedures of the value-added compounds are ascertained as well as some scattered strategies already reported for the IL solvent recovery and reusability. Finally, a critical analysis on the economic impact versus the extraction performance of IL-based methodologies was also carried out and is here presented and discussed. PMID:25516718

  17. Ionic liquid solutions as extractive solvents for value-added compounds from biomass.

    PubMed

    Passos, Helena; Freire, Mara G; Coutinho, João A P

    2014-12-01

    In the past few years, the number of studies regarding the application of ionic liquids (ILs) as alternative solvents to extract value-added compounds from biomass has been growing. Based on an extended compilation and analysis of the data hitherto reported, the main objective of this review is to provide an overview on the use of ILs and their mixtures with molecular solvents for the extraction of value-added compounds present in natural sources. The ILs (or IL solutions) investigated as solvents for the extraction of natural compounds, such as alkaloids, flavonoids, terpenoids, lipids, among others, are outlined. The extraction techniques employed, namely solid-liquid extraction, and microwave-assisted and ultrasound-assisted extractions, are emphasized and discussed in terms of extraction yields and purification factors. Furthermore, the evaluation of the IL chemical structure and the optimization of the process conditions (IL concentration, temperature, biomass-solvent ratio, etc.) are critically addressed. Major conclusions on the role of the ILs towards the extraction mechanisms and improved extraction yields are additionally provided. The isolation and recovery procedures of the value-added compounds are ascertained as well as some scattered strategies already reported for the IL solvent recovery and reusability. Finally, a critical analysis on the economic impact versus the extraction performance of IL-based methodologies was also carried out and is here presented and discussed.

  18. Plasmonic characterization of photo-induced silver nanoparticles extracted from silver halide based TEM film

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sudheer, E-mail: sudheer@rrcat.gov.in; Tiwari, P.; Rai, V. N.

    The plasmonic responses of silver nanoparticles extracted from silver halide based electron microscope film are investigated. A photo-reduction process is carried out to convert the silver halide grains into metallic silver. The centrifuge technique is used for separating the silver nanoparticles from the residual solution. A morphological study performed by field emission scanning electron microscopy (FESEM) shows that the nanoparticles have an average diameter of ~120 nm with a high degree of monodispersity in size. The localized surface plasmon resonance (LSPR) absorption peak at ~537 nm confirms the presence of large silver nanoparticles.

  19. The 3-D image recognition based on fuzzy neural network technology

    NASA Technical Reports Server (NTRS)

    Hirota, Kaoru; Yamauchi, Kenichi; Murakami, Jun; Tanaka, Kei

    1993-01-01

    A three-dimensional stereoscopic image recognition system based on fuzzy-neural network technology was developed. The system consists of three parts: a preprocessing part, a feature extraction part, and a matching part. Two CCD color camera images are fed to the preprocessing part, where several operations including RGB-HSV transformation are performed. A multi-layer perceptron is used for line detection in the feature extraction part. A fuzzy matching technique is then introduced in the matching part. The system is realized on a Sun SPARC station and a special image input hardware system. An experimental result on bottle images is also presented.

  20. Improved optical axis determination accuracy for fiber-based polarization-sensitive optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Lu, Zenghai; Matcher, Stephen J.

    2013-03-01

    We report on a new calibration technique that permits the accurate extraction of the sample Jones matrix, and hence the fast-axis orientation, using fiber-based polarization-sensitive optical coherence tomography (PS-OCT) built entirely from non-polarization-maintaining fiber such as SMF-28. In this technique, two quarter waveplates are used to completely specify the parameters of the system fibers in the sample arm so that the Jones matrix of the sample can be determined directly. The approach was validated on measurements of a quarter waveplate and an equine tendon sample using a single-mode fiber-based swept-source PS-OCT system.
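
    A worked numerical sketch of the underlying Jones-matrix idea: build a linear retarder with a known fast axis and recover that axis from the eigenvectors of the matrix. The retardance and angle are example values, and the system-fiber calibration chain described in the abstract is not modeled.

      # Recover a linear retarder's fast-axis angle from its Jones matrix by eigen-analysis.
      import numpy as np

      def rot(t):
          return np.array([[np.cos(t), np.sin(t)], [-np.sin(t), np.cos(t)]])

      def linear_retarder(theta, delta):
          d = np.diag([np.exp(-1j * delta / 2), np.exp(1j * delta / 2)])
          return rot(-theta) @ d @ rot(theta)        # fast axis at angle theta

      theta_true = np.deg2rad(30.0)                  # assumed example orientation
      J = linear_retarder(theta_true, np.pi / 2)     # quarter-wave retardance

      vals, vecs = np.linalg.eig(J)
      fast = vecs[:, np.argmin(np.angle(vals))]      # eigenvector with phase -delta/2
      fast = fast * np.exp(-1j * np.angle(fast[0]))  # strip the arbitrary global phase
      theta_est = np.degrees(np.arctan2(fast[1].real, fast[0].real)) % 180
      print("recovered fast-axis orientation [deg]:", round(theta_est, 2))   # ~30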

  1. Application of fermentation for isoflavone extraction from soy molasses

    NASA Astrophysics Data System (ADS)

    Duru, K. C.; Kovaleva, E. G.; Glukhareva, T. V.

    2017-09-01

    Extraction of isoflavones from soy products remains a major challenge for researchers. Different extraction techniques have been employed, but the need for a cheap, green extraction technique remains the main focus. This study applied fermentation of soy molasses using Saccharomyces cerevisiae for the extraction of isoflavones and compared this technique to the conventional extraction method. The aluminum chloride colorimetric method was used for the determination of the total flavonoid content of the extracts. The highest yield was observed for extraction using ethyl acetate after fermentation of soy molasses, and the lowest for the conventional extraction method. The DPPH radical scavenging activities of the extracts were also compared. The extract obtained using ethyl acetate after fermentation showed the highest antioxidant activity (0.0269 meq), while the extract from the conventional extraction had the lowest antioxidant activity (0.0055 meq). The effect of time on daidzein yield was studied using the HPLC standard addition method. Daidzein concentration was higher in the extract obtained at t = 80 min (3.82 ± 0.11 mg of daidzein/g of extract) than in that obtained at t = 60 min (2.89 ± 0.10 mg of daidzein/g of extract).
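
    A short sketch of the standard-addition calculation used for HPLC quantification: fit peak area against added standard and take the magnitude of the x-intercept as the unknown concentration. The spiking levels and peak areas below are invented illustration values.

      # Standard-addition sketch; the unknown concentration is the x-intercept magnitude.
      import numpy as np

      added = np.array([0.0, 1.0, 2.0, 3.0])          # added daidzein standard [mg/L] (assumed)
      area  = np.array([1520., 2310., 3095., 3900.])  # HPLC peak areas (made-up values)

      slope, intercept = np.polyfit(added, area, 1)    # linear calibration through the spiked points
      c_unknown = intercept / slope                    # x-intercept magnitude
      print(f"estimated daidzein in extract: {c_unknown:.2f} mg/L")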

  2. Coupling Front-End Separations, Ion Mobility Spectrometry, and Mass Spectrometry For Enhanced Multidimensional Biological and Environmental Analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zheng, Xueyun; Wojcik, Roza; Zhang, Xing

    Ion mobility spectrometry (IMS) is a widely used analytical technique for rapid molecular separations in the gas phase. IMS alone is useful, but its coupling with mass spectrometry (MS) and front-end separations has been extremely beneficial for increasing measurement sensitivity, peak capacity of complex mixtures, and the scope of molecular information in biological and environmental sample analyses. Multiple studies in disease screening and environmental evaluations have even shown these IMS-based multidimensional separations extract information not possible with each technique individually. This review highlights 3-dimensional separations using IMS-MS in conjunction with a range of front-end techniques, such as gas chromatography (GC), supercritical fluid chromatography (SFC), liquid chromatography (LC), solid phase extractions (SPE), capillary electrophoresis (CE), field asymmetric ion mobility spectrometry (FAIMS), and microfluidic devices. The origination, current state, various applications, and future capabilities for these multidimensional approaches are described to provide insight into the utility and potential of each technique.

  3. Extraction Techniques for Polycyclic Aromatic Hydrocarbons in Soils

    PubMed Central

    Lau, E. V.; Gan, S.; Ng, H. K.

    2010-01-01

    This paper aims to provide a review of the analytical extraction techniques for polycyclic aromatic hydrocarbons (PAHs) in soils. The extraction technologies described here include Soxhlet extraction, ultrasonic and mechanical agitation, accelerated solvent extraction, supercritical and subcritical fluid extraction, microwave-assisted extraction, solid phase extraction and microextraction, thermal desorption and flash pyrolysis, as well as fluidised-bed extraction. The influencing factors in the extraction of PAHs from soil such as temperature, type of solvent, soil moisture, and other soil characteristics are also discussed. The paper concludes with a review of the models used to describe the kinetics of PAH desorption from soils during solvent extraction. PMID:20396670
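
    One family of desorption-kinetics models mentioned in such reviews is the two-compartment (fast/slow) first-order model; the sketch below fits it to extraction-kinetics data with SciPy. The time points and extracted fractions are invented illustration data, not values from the review.

      # Fitting a two-compartment first-order desorption model to extraction-kinetics data.
      import numpy as np
      from scipy.optimize import curve_fit

      t = np.array([1, 2, 5, 10, 20, 40, 60, 120], float)         # extraction time [min]
      frac = np.array([0.22, 0.35, 0.55, 0.68, 0.78, 0.85, 0.88, 0.92])  # fraction desorbed

      def two_site(t, f_fast, k_fast, k_slow):
          # fast fraction desorbs with rate k_fast, the remainder with rate k_slow
          return f_fast * (1 - np.exp(-k_fast * t)) + (1 - f_fast) * (1 - np.exp(-k_slow * t))

      popt, _ = curve_fit(two_site, t, frac, p0=[0.6, 0.3, 0.01], bounds=(0, [1, 5, 1]))
      print("f_fast, k_fast, k_slow:", np.round(popt, 3))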

  4. Determination of Cd in urine by cloud point extraction-tungsten coil atomic absorption spectrometry.

    PubMed

    Donati, George L; Pharr, Kathryn E; Calloway, Clifton P; Nóbrega, Joaquim A; Jones, Bradley T

    2008-09-15

    Cadmium concentrations in human urine are typically at or below the 1 microgL(-1) level, so only a handful of techniques may be appropriate for this application. These include sophisticated methods such as graphite furnace atomic absorption spectrometry and inductively coupled plasma mass spectrometry. While tungsten coil atomic absorption spectrometry is a simpler and less expensive technique, its practical detection limits often prohibit the detection of Cd in normal urine samples. In addition, the nature of the urine matrix often necessitates accurate background correction techniques, which would add expense and complexity to the tungsten coil instrument. This manuscript describes a cloud point extraction method that reduces matrix interference while preconcentrating Cd by a factor of 15. Ammonium pyrrolidinedithiocarbamate and Triton X-114 are used as complexing agent and surfactant, respectively, in the extraction procedure. Triton X-114 forms an extractant coacervate surfactant-rich phase that is denser than water, so the aqueous supernatant is easily removed leaving the metal-containing surfactant layer intact. A 25 microL aliquot of this preconcentrated sample is placed directly onto the tungsten coil for analysis. The cloud point extraction procedure allows for simple background correction based either on the measurement of absorption at a nearby wavelength, or measurement of absorption at a time in the atomization step immediately prior to the onset of the Cd signal. Seven human urine samples are analyzed by this technique and the results are compared to those found by the inductively coupled plasma mass spectrometry analysis of the same samples performed at a different institution. The limit of detection for Cd in urine is 5 ngL(-1) for cloud point extraction tungsten coil atomic absorption spectrometry. The accuracy of the method is determined with a standard reference material (toxic metals in freeze-dried urine) and the determined values agree with the reported levels at the 95% confidence level.

  5. Acquiring 3-D information about thick objects from differential interference contrast images using texture extraction

    NASA Astrophysics Data System (ADS)

    Sierra, Heidy; Brooks, Dana; Dimarzio, Charles

    2010-07-01

    The extraction of 3-D morphological information about thick objects is explored in this work. We extract this information from 3-D differential interference contrast (DIC) images by applying a texture detection method. Texture extraction methods have been successfully used in different applications to study biological samples. A 3-D texture image is obtained by applying a local entropy-based texture extraction method. The use of this method to detect regions of blastocyst mouse embryos that are used in assisted reproduction techniques such as in vitro fertilization is presented as an example. Results demonstrate the potential of using texture detection methods to improve morphological analysis of thick samples, which is relevant to many biomedical and biological studies. Fluorescence and optical quadrature microscope phase images are used for validation.
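
    A minimal sketch of a local entropy texture map of the kind used as a texture cue here, computed with a sliding window; the small random image is a stand-in for real DIC data, and window size and bin count are arbitrary choices.

      # Local (Shannon) entropy texture map over a sliding window; illustrative data only.
      import numpy as np
      from scipy.ndimage import generic_filter

      rng = np.random.default_rng(0)
      img = rng.integers(0, 256, size=(64, 64)).astype(float)   # placeholder image

      def local_entropy(window):
          hist, _ = np.histogram(window, bins=32, range=(0, 256))
          p = hist / hist.sum()
          p = p[p > 0]
          return -np.sum(p * np.log2(p))          # entropy of the neighborhood histogram

      texture = generic_filter(img, local_entropy, size=7)       # 7x7 neighborhood
      print("entropy map range:", texture.min(), "-", texture.max())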

  6. Extractive sampling and optical remote sensing of F100 aircraft engine emissions.

    PubMed

    Cowen, Kenneth; Goodwin, Bradley; Joseph, Darrell; Tefend, Matthew; Satola, Jan; Kagann, Robert; Hashmonay, Ram; Spicer, Chester; Holdren, Michael; Mayfield, Howard

    2009-05-01

    The Strategic Environmental Research and Development Program (SERDP) has initiated several programs to develop and evaluate techniques to characterize emissions from military aircraft to meet increasingly stringent regulatory requirements. This paper describes the results of a recent field study using extractive and optical remote sensing (ORS) techniques to measure emissions from six F-15 fighter aircraft. Testing was performed between November 14 and 16, 2006 on the trim-pad facility at Tyndall Air Force Base in Panama City, FL. Measurements were made on eight different F100 engines, and the engines were tested on-wing of in-use aircraft. A total of 39 test runs were performed at engine power levels that ranged from idle to military power. The approach adopted for these tests involved extractive sampling with collocated ORS measurements at a distance of approximately 20-25 nozzle diameters downstream of the engine exit plane. The emission indices calculated for carbon dioxide, carbon monoxide, nitric oxide, and several volatile organic compounds showed very good agreement when comparing the extractive and ORS sampling methods.
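
    A simplified carbon-balance emission-index calculation of the kind used to convert measured concentrations into grams of pollutant per kilogram of fuel. The mole fractions and the assumed CH1.92 fuel composition are illustrative, and the small carbon contribution of unburned hydrocarbons is neglected.

      # Simplified carbon-balance emission index (g pollutant per kg fuel); illustrative values.
      M_C, M_H = 12.011, 1.008
      M_CO2, M_CO, M_NO = 44.01, 28.01, 30.01
      M_fuel_per_C = M_C + 1.92 * M_H              # g fuel per mole of fuel carbon (assumed H/C)

      d_co2, d_co, d_no = 3.0e-2, 4.0e-4, 8.0e-5   # background-corrected mole fractions (made up)

      total_c = d_co2 + d_co                        # fuel carbon per mole of exhaust sample
      def emission_index(dx, M_x):
          return (dx / total_c) * (M_x / M_fuel_per_C) * 1000.0

      print("EI(CO) g/kg fuel:", round(emission_index(d_co, M_CO), 1))
      print("EI(NO) g/kg fuel:", round(emission_index(d_no, M_NO), 2))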

  7. Driving profile modeling and recognition based on soft computing approach.

    PubMed

    Wahab, Abdul; Quek, Chai; Tan, Chin Keong; Takeda, Kazuya

    2009-04-01

    Advancements in biometrics-based authentication have led to its increasing prominence, and it is being incorporated into everyday tasks. Existing vehicle security systems rely only on alarms or smart cards as forms of protection. A biometric driver recognition system utilizing driving behaviors is a highly novel and personalized approach and could be incorporated into existing vehicle security systems to form a multimodal identification system and offer a greater degree of multilevel protection. In this paper, detailed studies have been conducted to model individual driving behavior in order to identify features that may be efficiently and effectively used to profile each driver. Feature extraction techniques based on Gaussian mixture models (GMMs) are proposed and implemented. Features extracted from the accelerator and brake pedal pressure were then used as inputs to a fuzzy neural network (FNN) system to ascertain the identity of the driver. Two fuzzy neural networks, namely, the evolving fuzzy neural network (EFuNN) and the adaptive network-based fuzzy inference system (ANFIS), are used to demonstrate the viability of the two proposed feature extraction techniques. The performances were compared against an artificial neural network (NN) implementation using the multilayer perceptron (MLP) network and a statistical method based on the GMM. Extensive testing was conducted and the results show great potential in the use of the FNN for real-time driver identification and verification. In addition, the profiling of driver behaviors has numerous other potential applications for use by law enforcement and companies dealing with bus and truck drivers.
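
    A hedged sketch of the GMM side of such a pipeline: fit one Gaussian mixture per driver on pedal-pressure features and identify a new trace by log-likelihood. The data are synthetic, and the paper's fuzzy-neural-network classifiers (EFuNN, ANFIS) are not reproduced.

      # GMM-based driver profiling sketch with synthetic pedal-pressure features.
      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(1)
      driver_a = rng.normal([0.3, 0.1], 0.05, size=(500, 2))   # [accel, brake] pressures (synthetic)
      driver_b = rng.normal([0.5, 0.3], 0.08, size=(500, 2))

      models = {name: GaussianMixture(n_components=2, random_state=0).fit(x)
                for name, x in {"A": driver_a, "B": driver_b}.items()}

      test = rng.normal([0.5, 0.3], 0.08, size=(50, 2))         # unknown trace (actually driver B)
      scores = {name: m.score(test) for name, m in models.items()}
      print("per-driver log-likelihoods:", scores, "-> predicted:", max(scores, key=scores.get))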

  8. Extraction and labeling high-resolution images from PDF documents

    NASA Astrophysics Data System (ADS)

    Chachra, Suchet K.; Xue, Zhiyun; Antani, Sameer; Demner-Fushman, Dina; Thoma, George R.

    2013-12-01

    Accuracy of content-based image retrieval is affected by image resolution among other factors. Higher resolution images enable extraction of image features that more accurately represent the image content. In order to improve the relevance of search results for our biomedical image search engine, Open-I, we have developed techniques to extract and label high-resolution versions of figures from biomedical articles supplied in the PDF format. Open-I uses the open-access subset of biomedical articles from the PubMed Central repository hosted by the National Library of Medicine. Articles are available in XML and in publisher-supplied PDF formats. As these PDF documents contain little or no meta-data to identify the embedded images, the task includes labeling images according to their figure number in the article after they have been successfully extracted. For this purpose we use the labeled small-size images provided with the XML web version of the article. This paper describes the image extraction process and two alternative approaches to image labeling: one measures the similarity between two images based upon the projection of image intensity onto the coordinate axes, and the other is based upon the normalized cross-correlation between the intensities of the two images. Using image identification based on image intensity projection, we were able to achieve a precision of 92.84% and a recall of 82.18% in labeling of the extracted images.
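
    A compact sketch of the two similarity measures described: correlation of axis-wise intensity projections, and normalized cross-correlation of the pixel intensities. The two small synthetic images stand in for extracted and reference figures.

      # Two image-similarity measures: projection correlation and normalized cross-correlation.
      import numpy as np

      rng = np.random.default_rng(0)
      img_a = rng.random((32, 32))
      img_b = img_a + rng.normal(0, 0.05, (32, 32))    # noisy copy of the same figure

      def projection_similarity(a, b):
          pa = np.concatenate([a.sum(axis=0), a.sum(axis=1)])   # column and row projections
          pb = np.concatenate([b.sum(axis=0), b.sum(axis=1)])
          return np.corrcoef(pa, pb)[0, 1]

      def normalized_cross_correlation(a, b):
          a = (a - a.mean()) / a.std()
          b = (b - b.mean()) / b.std()
          return np.mean(a * b)

      print("projection similarity:", round(projection_similarity(img_a, img_b), 3))
      print("NCC:", round(normalized_cross_correlation(img_a, img_b), 3))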

  9. Using Web-Based Knowledge Extraction Techniques to Support Cultural Modeling

    NASA Astrophysics Data System (ADS)

    Smart, Paul R.; Sieck, Winston R.; Shadbolt, Nigel R.

    The World Wide Web is a potentially valuable source of information about the cognitive characteristics of cultural groups. However, attempts to use the Web in the context of cultural modeling activities are hampered by the large-scale nature of the Web and the current dominance of natural language formats. In this paper, we outline an approach to support the exploitation of the Web for cultural modeling activities. The approach begins with the development of qualitative cultural models (which describe the beliefs, concepts and values of cultural groups), and these models are subsequently used to develop an ontology-based information extraction capability. Our approach represents an attempt to combine conventional approaches to information extraction with epidemiological perspectives of culture and network-based approaches to cultural analysis. The approach can be used, we suggest, to support the development of models providing a better understanding of the cognitive characteristics of particular cultural groups.

  10. Query-oriented evidence extraction to support evidence-based medicine practice.

    PubMed

    Sarker, Abeed; Mollá, Diego; Paris, Cecile

    2016-02-01

    Evidence-based medicine practice requires medical practitioners to rely on the best available evidence, in addition to their expertise, when making clinical decisions. The medical domain boasts a large amount of published medical research data, indexed in various medical databases such as MEDLINE. As the size of this data grows, practitioners increasingly face the problem of information overload, and past research has established the time-associated obstacles faced by evidence-based medicine practitioners. In this paper, we focus on the problem of automatic text summarisation to help practitioners quickly find query-focused information from relevant documents. We utilise an annotated corpus that is specialised for the task of evidence-based summarisation of text. In contrast to past summarisation approaches, which mostly rely on surface level features to identify salient pieces of texts that form the summaries, our approach focuses on the use of corpus-based statistics, and domain-specific lexical knowledge for the identification of summary contents. We also apply a target-sentence-specific summarisation technique that reduces the problem of underfitting that persists in generic summarisation models. In automatic evaluations run over a large number of annotated summaries, our extractive summarisation technique statistically outperforms various baseline and benchmark summarisation models with a percentile rank of 96.8%. A manual evaluation shows that our extractive summarisation approach is capable of selecting content with high recall and precision, and may thus be used to generate bottom-line answers to practitioners' queries. Our research shows that the incorporation of specialised data and domain-specific knowledge can significantly improve text summarisation performance in the medical domain. Due to the vast amounts of medical text available, and the high growth of this form of data, we suspect that such summarisation techniques will address the time-related obstacles associated with evidence-based medicine. Copyright © 2015 Elsevier Inc. All rights reserved.

  11. Modified kernel-based nonlinear feature extraction.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, J.; Perkins, S. J.; Theiler, J. P.

    2002-01-01

    Feature Extraction (FE) techniques are widely used in many applications to pre-process data in order to reduce the complexity of subsequent processes. A group of kernel-based nonlinear FE (KFE) algorithms has attracted much attention due to their high performance. However, a serious limitation that is inherent in these algorithms -- the maximal number of features extracted by them is limited by the number of classes involved -- dramatically degrades their flexibility. Here we propose a modified version of those KFE algorithms (MKFE). This algorithm is developed from a special form of scatter matrix, whose rank is not determined by the number of classes involved, and thus breaks the inherent limitation in those KFE algorithms. Experimental results suggest that the MKFE algorithm is especially useful when the training set is small.

  12. Extracting Information about the Rotator Cuff from Magnetic Resonance Images Using Deterministic and Random Techniques

    PubMed Central

    De Los Ríos, F. A.; Paluszny, M.

    2015-01-01

    We consider some methods to extract information about the rotator cuff based on magnetic resonance images; the study aims to define an alternative method of display that might facilitate the detection of partial tears in the supraspinatus tendon. Specifically, we use families of ellipsoidal triangular patches to cover the humerus head near the affected area. These patches are textured and displayed with the information from the magnetic resonance images using the trilinear interpolation technique. For the generation of points to texture each patch, we propose a new method that guarantees the uniform distribution of the points using a random statistical method. Its computational cost, defined as the average computing time to generate a fixed number of points, is significantly lower compared with deterministic and other standard statistical techniques. PMID:25650281
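
    A minimal sketch of trilinear interpolation, the sampling step used to texture the patches from the image volume. The volume here is random placeholder data rather than actual MR images.

      # Trilinear interpolation of a 3-D intensity volume at a non-integer point.
      import numpy as np

      rng = np.random.default_rng(0)
      vol = rng.random((16, 16, 16))                    # placeholder intensity volume

      def trilinear(vol, x, y, z):
          x0, y0, z0 = int(np.floor(x)), int(np.floor(y)), int(np.floor(z))
          dx, dy, dz = x - x0, y - y0, z - z0
          value = 0.0
          for i in (0, 1):
              for j in (0, 1):
                  for k in (0, 1):
                      # weight of each of the 8 surrounding voxels
                      w = ((dx if i else 1 - dx) *
                           (dy if j else 1 - dy) *
                           (dz if k else 1 - dz))
                      value += w * vol[x0 + i, y0 + j, z0 + k]
          return value

      print("interpolated intensity at (3.4, 7.8, 2.1):", round(trilinear(vol, 3.4, 7.8, 2.1), 4))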

  13. Use of different sample temperatures in a single extraction procedure for the screening of the aroma profile of plant matrices by headspace solid-phase microextraction.

    PubMed

    Martendal, Edmar; de Souza Silveira, Cristine Durante; Nardini, Giuliana Stael; Carasek, Eduardo

    2011-06-17

    This study proposes a new approach to the optimization of the extraction of the volatile fraction of plant matrices using the headspace solid-phase microextraction (HS-SPME) technique. The optimization focused on the extraction time and temperature using a CAR/DVB/PDMS 50/30 μm SPME fiber and 100mg of a mixture of plants as the sample in a 15-mL vial. The extraction time (10-60 min) and temperature (5-60 °C) were optimized by means of a central composite design. The chromatogram was divided into four groups of peaks based on the elution temperature to provide a better understanding of the influence of the extraction parameters on the extraction efficiency considering compounds with different volatilities/polarities. In view of the different optimum extraction time and temperature conditions obtained for each group, a new approach based on the use of two extraction temperatures in the same procedure is proposed. The optimum conditions were achieved by extracting for 30 min with a sample temperature of 60 °C followed by a further 15 min at 5 °C. The proposed method was compared with the optimized conventional method based on a single extraction temperature (45 min of extraction at 50 °C) by submitting five samples to both procedures. The proposed method led to better results in all cases, considering as the response both peak area and the number of identified peaks. The newly proposed optimization approach provided an excellent alternative procedure to extract analytes with quite different volatilities in the same procedure. Copyright © 2011 Elsevier B.V. All rights reserved.

  14. Planetary Gears Feature Extraction and Fault Diagnosis Method Based on VMD and CNN.

    PubMed

    Liu, Chang; Cheng, Gang; Chen, Xihui; Pang, Yusong

    2018-05-11

    Given weak local fault feature information, a novel feature extraction and fault diagnosis method for planetary gears based on variational mode decomposition (VMD), singular value decomposition (SVD), and convolutional neural network (CNN) is proposed. VMD was used to decompose the original vibration signal into mode components. The mode matrix was partitioned into a number of submatrices, and the local feature information contained in each submatrix was extracted as a singular value vector using SVD. The singular value vector matrix corresponding to the current fault state was constructed according to the location of each submatrix. Finally, by training a CNN using singular value vector matrices as inputs, planetary gear fault state identification and classification was achieved. The experimental results confirm that the proposed method can successfully extract local weak feature information and accurately identify different faults. The singular value vector matrices of different fault states show a distinct difference in element size and waveform. The VMD-based partition extraction method is better than ensemble empirical mode decomposition (EEMD), resulting in a higher CNN total recognition rate of 100% with fewer training iterations (14). Further analysis demonstrated that the method can also be applied to the degradation recognition of planetary gears. Thus, the proposed method is an effective feature extraction and fault diagnosis technique for planetary gears.
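
    A sketch of the SVD step of this pipeline: partition a mode matrix (modes x samples) into submatrices and keep the dominant singular value of each block as a local feature. The mode matrix below is random placeholder data standing in for VMD mode components, and only the largest singular value per block is kept in this simplified version.

      # SVD-based partition feature extraction from a (modes x samples) matrix; illustrative data.
      import numpy as np

      rng = np.random.default_rng(0)
      mode_matrix = rng.standard_normal((4, 1024))      # e.g. 4 VMD modes x 1024 samples

      def svd_partition_features(modes, n_segments=8):
          feats = []
          for segment in np.array_split(modes, n_segments, axis=1):
              s = np.linalg.svd(segment, compute_uv=False)
              feats.append(s[0])                        # dominant singular value per block
          return np.array(feats)

      vector = svd_partition_features(mode_matrix)
      print("singular-value feature vector:", np.round(vector, 2))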

  15. Planetary Gears Feature Extraction and Fault Diagnosis Method Based on VMD and CNN

    PubMed Central

    Cheng, Gang; Chen, Xihui

    2018-01-01

    Given weak local fault feature information, a novel feature extraction and fault diagnosis method for planetary gears based on variational mode decomposition (VMD), singular value decomposition (SVD), and convolutional neural network (CNN) is proposed. VMD was used to decompose the original vibration signal into mode components. The mode matrix was partitioned into a number of submatrices, and the local feature information contained in each submatrix was extracted as a singular value vector using SVD. The singular value vector matrix corresponding to the current fault state was constructed according to the location of each submatrix. Finally, by training a CNN using singular value vector matrices as inputs, planetary gear fault state identification and classification was achieved. The experimental results confirm that the proposed method can successfully extract local weak feature information and accurately identify different faults. The singular value vector matrices of different fault states show a distinct difference in element size and waveform. The VMD-based partition extraction method is better than ensemble empirical mode decomposition (EEMD), resulting in a higher CNN total recognition rate of 100% with fewer training iterations (14). Further analysis demonstrated that the method can also be applied to the degradation recognition of planetary gears. Thus, the proposed method is an effective feature extraction and fault diagnosis technique for planetary gears. PMID:29751671

  16. Fabric phase sorptive extraction: Two practical sample pretreatment techniques for brominated flame retardants in water.

    PubMed

    Huang, Guiqi; Dong, Sheying; Zhang, Mengfei; Zhang, Haihan; Huang, Tinglin

    2016-09-15

    Sample pretreatment is the critical step for residue monitoring of hazardous pollutants. In this paper, using cellulose fabric as the host matrix, three extraction sorbents, poly(tetrahydrofuran) (PTHF), poly(ethylene glycol) (PEG) and poly(dimethyldiphenylsiloxane) (PDMDPS), were prepared on the surface of the cellulose fabric. Two practical extraction techniques, stir bar fabric phase sorptive extraction (stir bar-FPSE) and magnetic stir fabric phase sorptive extraction (magnetic stir-FPSE), have been designed, which allow stirring of the fabric phase sorbent during the whole extraction process. Meanwhile, three brominated flame retardants (BFRs) [tetrabromobisphenol A (TBBPA), tetrabromobisphenol A bisallylether (TBBPA-BAE), tetrabromobisphenol A bis(2,3-dibromopropyl)ether (TBBPA-BDBPE)] in water samples were selected as model analytes for the practical evaluation of the two proposed techniques using high-performance liquid chromatography (HPLC). Moreover, various experimental conditions affecting the extraction process, such as the type of fabric phase, extraction time, the amount of salt and elution conditions, were also investigated. Due to the large sorbent loading capacity and unique stirring performance, both techniques possessed high extraction capability and fast extraction equilibrium. Under the optimized conditions, high recoveries (90-99%) and low limits of detection (LODs) (0.01-0.05 μg L(-1)) were achieved. In addition, good reproducibility was obtained, with intraday and interday precisions showing relative standard deviations (RSDs) of less than 5.1% and 6.8%, respectively. The results indicated that the two pretreatment techniques are promising and practical for monitoring hazardous pollutants in water samples. Due to low solvent consumption and high reusability, the proposed techniques also meet green analytical criteria. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. The Metabolism of the Volatile Amines

    PubMed Central

    Tobe, Barry A.

    1963-01-01

    As part of a series of studies into the etiology of acute hepatic encephalopathy the identity of the volatile base, which had been extracted from blood and presumed to be ammonia, was investigated. In order to differentiate between ammonia and the other volatile bases with which it can be confused, a method of separating these compounds by the use of gas chromatography was developed. The technique is described in detail because it incorporates several novel ideas that can be applied to the isolation and identification of similar compounds in many biological systems. The volatile base extracted from blood was found to be ammonia, and no other volatile base was demonstrable in blood from both healthy subjects and patients suffering from acute hepatic encephalopathy. PMID:14101451

  18. Rapid and green analytical method for the determination of quinoline alkaloids from Cinchona succirubra based on Microwave-Integrated Extraction and Leaching (MIEL) prior to high performance liquid chromatography.

    PubMed

    Fabiano-Tixier, Anne-Sylvie; Elomri, Abdelhakim; Blanckaert, Axelle; Seguin, Elisabeth; Petitcolas, Emmanuel; Chemat, Farid

    2011-01-01

    Quinas contains several compounds, such as quinoline alkaloids, principally quinine, quinidine, cinchonine and cichonidine. Identified from barks of Cinchona, quinine is still commonly used to treat human malaria. Microwave-Integrated Extraction and Leaching (MIEL) is proposed for the extraction of quinoline alkaloids from bark of Cinchona succirubra. The process is performed in four steps, which ensures complete, rapid and accurate extraction of the samples. Optimal conditions for extraction were obtained using a response surface methodology reached from a central composite design. The MIEL extraction has been compared with a conventional technique soxhlet extraction. The extracts of quinoline alkaloids from C. succirubra obtained by these two different methods were compared by HPLC. The extracts obtained by MIEL in 32 min were quantitatively (yield) and qualitatively (quinine, quinidine, cinchonine, cinchonidine) similar to those obtained by conventional Soxhlet extraction in 3 hours. MIEL is a green technology that serves as a good alternative for the extraction of Cinchona alkaloids.

  19. Rapid and Green Analytical Method for the Determination of Quinoline Alkaloids from Cinchona succirubra Based on Microwave-Integrated Extraction and Leaching (MIEL) Prior to High Performance Liquid Chromatography

    PubMed Central

    Fabiano-Tixier, Anne-Sylvie; Elomri, Abdelhakim; Blanckaert, Axelle; Seguin, Elisabeth; Petitcolas, Emmanuel; Chemat, Farid

    2011-01-01

    Quinas contains several compounds, such as quinoline alkaloids, principally quinine, quinidine, cinchonine and cichonidine. Identified from barks of Cinchona, quinine is still commonly used to treat human malaria. Microwave-Integrated Extraction and Leaching (MIEL) is proposed for the extraction of quinoline alkaloids from bark of Cinchona succirubra. The process is performed in four steps, which ensures complete, rapid and accurate extraction of the samples. Optimal conditions for extraction were obtained using a response surface methodology reached from a central composite design. The MIEL extraction has been compared with a conventional technique soxhlet extraction. The extracts of quinoline alkaloids from C. succirubra obtained by these two different methods were compared by HPLC. The extracts obtained by MIEL in 32 min were quantitatively (yield) and qualitatively (quinine, quinidine, cinchonine, cinchonidine) similar to those obtained by conventional Soxhlet extraction in 3 hours. MIEL is a green technology that serves as a good alternative for the extraction of Cinchona alkaloids. PMID:22174637

  20. Polymer-based adsorption medium prepared using a fragment imprinting technique for homologues of chlorinated bisphenol A produced in the environment.

    PubMed

    Kubo, Takuya; Hosoya, Ken; Watabe, Yoshiyuki; Ikegami, Tohru; Tanaka, Nobuo; Sano, Tomoharu; Kaya, Kunimitsu

    2004-03-12

    A polymer-based adsorption medium having molecular recognition ability for homologues of chlorinated bisphenol A produced in environment was prepared using a fragment imprinting technique. 2,6-Dimethyl phenol was utilized as a pseudo-template molecule and the adsorption media prepared was evaluated by high performance liquid chromatography (HPLC) and solid-phase extraction (SPE). As results, the adsorption medium showed preferable chromatographic retention and specific adsorption ability for the chlorinated bisphenol As having chlorine substituents at 3,5-positions through fragment imprinting effect.

  1. New 3D-printed sorbent for extraction of steroids from human plasma preceding LC-MS analysis.

    PubMed

    Konieczna, Lucyna; Belka, Mariusz; Okońska, Magdalena; Pyszka, Magdalena; Bączek, Tomasz

    2018-04-13

    In recent years, there has been increasing worldwide interest in the use of alternative sample preparation methods that precede separation techniques. Fused deposition modeling (FDM) is a 3D printing technique that is based on the consecutive layering of softened/melted thermoplastic materials. In this study, a group of natural steroids and sex hormones - namely aldosterone, cortisol, β-estradiol, testosterone and dihydrotestosterone - together with the synthetic methyltestosterone and betamethasone were separated and determined using an optimized high-performance liquid chromatography coupled to mass spectrometry (LC-MS) method in positive ionization mode. 3D-printed sorbents were selected as the pre-concentration technique because they are generally low cost, fast, and simple to make and automate. Furthermore, the use of 3D-printed sorbents helps to minimize potential errors due to their repeatability and reproducibility, and because carry-over can be eliminated by using one printed sorbent for a single extraction of steroids from biological matrices. The extraction procedure was optimized, and the parameters influencing the 3D-printed LayFomm 60®-based sorbent extraction and the LC-MS analysis were studied, including the type of extraction solvent used, sorption and desorption times, temperature, and the salting-out effect. To demonstrate this method's applicability for biological sample analysis, the SPME-LC-MS method was validated for its ability to simultaneously quantify endogenous steroids. This evaluation confirmed good linearity, with R2 between 0.9970 and 0.9990. The recovery rates for human plasma samples were 86.34-93.6% for the studied steroids with intra- and inter-day RSDs of 1.44-7.42% and 1.44-9.46%, respectively. To our knowledge, this study is the first time that 3D-printed sorbents have been used to extract trace amounts of endogenous low-molecular-weight compounds, such as steroids, from biological samples, such as plasma. Copyright © 2018 Elsevier B.V. All rights reserved.

  2. Model, analysis, and evaluation of the effects of analog VLSI arithmetic on linear subspace-based image recognition.

    PubMed

    Carvajal, Gonzalo; Figueroa, Miguel

    2014-07-01

    Typical image recognition systems operate in two stages: feature extraction to reduce the dimensionality of the input space, and classification based on the extracted features. Analog Very Large Scale Integration (VLSI) is an attractive technology to achieve compact and low-power implementations of these computationally intensive tasks for portable embedded devices. However, device mismatch limits the resolution of the circuits fabricated with this technology. Traditional layout techniques to reduce the mismatch aim to increase the resolution at the transistor level, without considering the intended application. Relating mismatch parameters to specific effects at the application level would allow designers to apply focalized mismatch compensation techniques according to predefined performance/cost tradeoffs. This paper models, analyzes, and evaluates the effects of mismatched analog arithmetic in both feature extraction and classification circuits. For the feature extraction, we propose analog adaptive linear combiners with on-chip learning for both the Least Mean Square (LMS) and Generalized Hebbian Algorithm (GHA). Using mathematical abstractions of analog circuits, we identify mismatch parameters that are naturally compensated during the learning process, and propose cost-effective guidelines to reduce the effect of the rest. For the classification, we derive analog models for the circuits necessary to implement the Nearest Neighbor (NN) approach and Radial Basis Function (RBF) networks, and use them to emulate analog classifiers with standard databases of faces and handwritten digits. Formal analysis and experiments show how we can exploit adaptive structures and properties of the input space to compensate for the effects of device mismatch at the application level, thus reducing the design overhead of traditional layout techniques. Results are also directly extensible to multiple application domains using linear subspace methods. Copyright © 2014 Elsevier Ltd. All rights reserved.
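
    A minimal software sketch of the LMS adaptive linear combiner that the feature-extraction stage is modeled on; data and learning rate are illustrative, and no device-mismatch model is included.

      # LMS adaptive linear combiner: weights adapt online to predict a target from features.
      import numpy as np

      rng = np.random.default_rng(0)
      X = rng.standard_normal((2000, 4))                # input feature vectors
      w_true = np.array([0.5, -1.0, 0.25, 2.0])
      d = X @ w_true + 0.05 * rng.standard_normal(2000) # desired response (synthetic)

      w = np.zeros(4)
      mu = 0.01                                         # LMS step size (assumed)
      for x_n, d_n in zip(X, d):
          e = d_n - w @ x_n                             # instantaneous error
          w += mu * e * x_n                             # LMS weight update

      print("learned weights:", np.round(w, 3))         # should approach w_true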

  3. Real-time Quaking-induced Conversion Assay for Detection of CWD Prions in Fecal Material.

    PubMed

    Cheng, Yo Ching; Hannaoui, Samia; John, Theodore Ralph; Dudas, Sandor; Czub, Stefanie; Gilch, Sabine

    2017-09-29

    The RT-QuIC technique is a sensitive in vitro cell-free prion amplification assay based mainly on the seeded misfolding and aggregation of recombinant prion protein (PrP) substrate using prion seeds as a template for the conversion. RT-QuIC is a novel high-throughput technique which is analogous to real-time polymerase chain reaction (PCR). Detection of amyloid fibril growth is based on the dye Thioflavin T, which fluoresces upon specific interaction with β-sheet-rich proteins. Thus, amyloid formation can be detected in real time. We attempted to develop a reliable non-invasive screening test to detect chronic wasting disease (CWD) prions in fecal extract. Here, we have specifically adapted the RT-QuIC technique to reveal PrPSc seeding activity in feces of CWD-infected cervids. Initially, the seeding activity of the fecal extracts we prepared was relatively low in RT-QuIC, possibly due to potential assay inhibitors in the fecal material. To improve seeding activity of feces extracts and remove potential assay inhibitors, we homogenized the fecal samples in a buffer containing detergents and protease inhibitors. We also submitted the samples to different methodologies to concentrate PrPSc on the basis of protein precipitation using sodium phosphotungstic acid, and centrifugal force. Finally, the feces extracts were tested by optimized RT-QuIC which included substrate replacement in the protocol to improve the sensitivity of detection. Thus, we established a protocol for sensitive detection of CWD prion seeding activity in feces of pre-clinical and clinical cervids by RT-QuIC, which can be a practical tool for non-invasive CWD diagnosis.

  4. Brownian motion properties of optoelectronic random bit generators based on laser chaos.

    PubMed

    Li, Pu; Yi, Xiaogang; Liu, Xianglian; Wang, Yuncai; Wang, Yongge

    2016-07-11

    The nondeterministic properties of an optoelectronic random bit generator (RBG) based on laser chaos are experimentally analyzed from two aspects: the central limit theorem and the law of the iterated logarithm. The random bits are extracted from an optical feedback chaotic laser diode using a multi-bit extraction technique in the electrical domain. Our experimental results demonstrate that the generated random bits have no statistical distance from Brownian motion, and that they also pass the state-of-the-art industry-benchmark statistical test suite (NIST SP800-22). Together, these results provide mathematically grounded evidence that the ultrafast random bit generator based on laser chaos can be used as a nondeterministic random bit source.
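
    A sketch of a law-of-the-iterated-logarithm style check: map bits to a +/-1 random walk and compare the excursion |S_n| against sqrt(2 n ln ln n). Pseudorandom bits stand in for the chaos-laser bit stream measured in the paper.

      # Random-walk excursion check inspired by the law of the iterated logarithm.
      import numpy as np

      rng = np.random.default_rng(0)
      bits = rng.integers(0, 2, size=1_000_000)          # placeholder for measured bits
      steps = 2 * bits - 1                                # map {0,1} -> {-1,+1}
      S = np.cumsum(steps)                                # random-walk partial sums

      n = np.arange(3, S.size + 1)                        # ln ln n requires n >= 3
      ratio = np.abs(S[2:]) / np.sqrt(2 * n * np.log(np.log(n)))
      print("max normalized excursion:", ratio.max())     # should hover near or below ~1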

  5. Multivariate Optimization for Extraction of Pyrethroids in Milk and Validation for GC-ECD and GC-MS/MS Analysis

    PubMed Central

    Zanchetti Meneghini, Leonardo; Rübensam, Gabriel; Claudino Bica, Vinicius; Ceccon, Amanda; Barreto, Fabiano; Flores Ferrão, Marco; Bergold, Ana Maria

    2014-01-01

    A simple and inexpensive method based on solvent extraction followed by low temperature clean-up was applied for determination of seven pyrethroids residues in bovine raw milk using gas chromatography coupled to tandem mass spectrometry (GC-MS/MS) and gas chromatography with electron-capture detector (GC-ECD). Sample extraction procedure was established through the evaluation of seven different extraction protocols, evaluated in terms of analyte recovery and cleanup efficiency. Sample preparation optimization was based on Doehlert design using fifteen runs with three different variables. Response surface methodologies and polynomial analysis were used to define the best extraction conditions. Method validation was carried out based on SANCO guide parameters and assessed by multivariate analysis. Method performance was considered satisfactory since mean recoveries were between 87% and 101% for three distinct concentrations. Accuracy and precision were lower than ±20%, and led to no significant differences (p < 0.05) between results obtained by GC-ECD and GC-MS/MS techniques. The method has been applied to routine analysis for determination of pyrethroid residues in bovine raw milk in the Brazilian National Residue Control Plan since 2013, in which a total of 50 samples were analyzed. PMID:25380457

  6. Development of a cloud point extraction and spectrophotometry-based microplate method for the determination of nitrite in human urine and blood.

    PubMed

    Zhao, Jiao; Lu, Yunhui; Fan, Chongyang; Wang, Jun; Yang, Yaling

    2015-02-05

    A novel and simple method for the sensitive determination of trace amounts of nitrite in human urine and blood has been developed by combining cloud point extraction (CPE) and a microplate assay. The method is based on the Griess reaction, and the reaction product is extracted into the nonionic surfactant Triton X-114 using the CPE technique. In this study, a decolorization treatment of urine and blood was applied to overcome matrix interference and enhance the sensitivity of nitrite detection. Multiple samples can be detected simultaneously thanks to the 96-well microplate technique. The effects of different operating parameters, such as the type of decolorizing agent, concentration of surfactant (Triton X-114), addition of (NH4)2SO4, extraction temperature and time, and interfering elements, were studied and optimum conditions were obtained. Under the optimum conditions, a linear calibration graph was obtained in the range of 10-400 ng mL(-1) of nitrite with a limit of detection (LOD) of 2.5 ng mL(-1). The relative standard deviation (RSD) for determination of 100 ng mL(-1) of nitrite was 2.80%. The proposed method was successfully applied for the determination of nitrite in urine and blood samples with recoveries of 92.6-101.2%. Copyright © 2014 Elsevier B.V. All rights reserved.
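
    A short calibration sketch for a colorimetric assay of this kind: linear fit of absorbance versus nitrite standards, with the LOD estimated as 3.3 times the blank standard deviation divided by the slope. All numbers are illustrative, not the study's measurements.

      # Calibration-curve and LOD sketch for a Griess-type microplate assay; illustrative data.
      import numpy as np

      conc = np.array([0, 10, 50, 100, 200, 400], float)      # nitrite standards [ng/mL]
      absorb = np.array([0.012, 0.021, 0.055, 0.098, 0.189, 0.371])

      slope, intercept = np.polyfit(conc, absorb, 1)           # linear calibration
      blank_sd = 0.0008                                        # sd of replicate blanks (assumed)
      lod = 3.3 * blank_sd / slope                             # common LOD estimate

      unknown_abs = 0.120
      print("slope:", round(slope, 5), " LOD [ng/mL]:", round(lod, 2))
      print("unknown sample [ng/mL]:", round((unknown_abs - intercept) / slope, 1))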

  7. Feature extraction through parallel Probabilistic Principal Component Analysis for heart disease diagnosis

    NASA Astrophysics Data System (ADS)

    Shah, Syed Muhammad Saqlain; Batool, Safeera; Khan, Imran; Ashraf, Muhammad Usman; Abbas, Syed Hussnain; Hussain, Syed Adnan

    2017-09-01

    Automatic diagnosis of human diseases is mostly achieved through decision support systems. The performance of these systems is mainly dependent on the selection of the most relevant features. This becomes harder when the dataset contains missing values for the different features. Probabilistic Principal Component Analysis (PPCA) is well suited to dealing with the problem of missing attribute values. This research presents a methodology which uses the results of medical tests as input, extracts a reduced-dimensional feature subset, and provides a diagnosis of heart disease. The proposed methodology extracts high-impact features in a new projection by using Probabilistic Principal Component Analysis (PPCA). PPCA extracts the projection vectors that capture the highest covariance, and these projection vectors are used to reduce the feature dimension. The selection of projection vectors is done through Parallel Analysis (PA). The reduced-dimension feature subset is provided to a radial basis function (RBF) kernel based Support Vector Machine (SVM). The RBF-based SVM classifies subjects into two categories, i.e., Heart Patient (HP) and Normal Subject (NS). The proposed methodology is evaluated through accuracy, specificity and sensitivity over three UCI datasets, i.e., Cleveland, Switzerland and Hungarian. The statistical results achieved through the proposed technique are presented in comparison to existing research, showing its impact. The proposed technique achieved an accuracy of 82.18%, 85.82% and 91.30% for the Cleveland, Hungarian and Switzerland datasets, respectively.
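
    A hedged sketch of the dimensionality-reduction plus RBF-SVM pipeline. scikit-learn's standard PCA after simple mean imputation is used here as a stand-in for PPCA's handling of missing values, and the data are synthetic rather than the UCI heart-disease sets; parallel analysis for choosing the number of components is not reproduced.

      # Imputation + PCA + RBF-SVM pipeline sketch (PCA used as a stand-in for PPCA).
      import numpy as np
      from sklearn.impute import SimpleImputer
      from sklearn.decomposition import PCA
      from sklearn.svm import SVC
      from sklearn.pipeline import make_pipeline
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      X = rng.standard_normal((300, 13))                     # 13 synthetic "clinical" features
      y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(0, 0.5, 300) > 0).astype(int)
      X[rng.random(X.shape) < 0.05] = np.nan                 # inject missing values

      clf = make_pipeline(SimpleImputer(strategy="mean"),
                          PCA(n_components=5),
                          SVC(kernel="rbf", C=1.0, gamma="scale"))
      print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())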

  8. Ionic liquid-based microwave-assisted extraction of essential oil and biphenyl cyclooctene lignans from Schisandra chinensis Baill fruits.

    PubMed

    Ma, Chun-hui; Liu, Ting-ting; Yang, Lei; Zu, Yuan-gang; Chen, Xiaoqiang; Zhang, Lin; Zhang, Ying; Zhao, Chunjian

    2011-12-02

    Ionic liquid-based microwave-assisted extraction (ILMAE) was successfully applied to extract essential oil and four kinds of biphenyl cyclooctene lignans from Schisandra chinensis Baill. A 0.25 M solution of the ionic liquid 1-lauryl-3-methylimidazolium bromide was selected as the solvent. For a 25.0 g sample, the optimum parameters were 385 W irradiation power, 40 min microwave extraction time and a 1:12 solid-liquid ratio. Under these conditions, the yields of essential oil and lignans were 12.12±0.37 ml/kg and 250.2±38.2 mg/kg, respectively. The composition of the essential oil extracted by hydro-distillation, steam-distillation and ILMAE was analyzed by GC-MS. With the ILMAE method, the extraction time was shortened to 40 min (compared with 3.0 h of hydro-distillation for the essential oil and 4.0 h of reflux extraction for the lignans), the extraction efficiency was improved (lignans and essential oil were obtained in the same run), and environmental pollution was reduced. S. chinensis materials treated by the different methods were observed by scanning electron microscopy, and the micrographs provide further evidence that ILMAE is a better and faster method. The experimental results also indicate that ILMAE is a simple and efficient technique for sample preparation. Copyright © 2011 Elsevier B.V. All rights reserved.

  9. Bio-refinery of orange peels waste: a new concept based on integrated green and solvent free extraction processes using ultrasound and microwave techniques to obtain essential oil, polyphenols and pectin.

    PubMed

    Boukroufa, Meryem; Boutekedjiret, Chahrazed; Petigny, Loïc; Rakotomanomana, Njara; Chemat, Farid

    2015-05-01

    In this study, the extraction of essential oil, polyphenols and pectin from orange peel was optimized using microwave and ultrasound technology without adding any solvent, using only the "in situ" water, which was recycled and used as solvent. The essential oil extraction performed by Microwave Hydrodiffusion and Gravity (MHG) was optimized and compared to steam distillation (SD). No significant changes in yield were noticed: 4.22 ± 0.03% and 4.16 ± 0.05% for MHG and SD, respectively. After extraction of the essential oil, the residual plant water obtained from MHG extraction was used as solvent for polyphenol and pectin extraction from the MHG residues. Polyphenol extraction was performed by ultrasound-assisted extraction (UAE) and conventional extraction (CE). A response surface methodology (RSM) approach using a central composite design (CCD) was applied to investigate the influence of process variables on the UAE. The statistical analysis revealed optimized conditions of 0.956 W/cm(2) ultrasound power and 59.83°C, giving a polyphenol yield of 50.02 mgGA/100 g dm. Compared with conventional extraction, UAE gave an increase of 30% in TPC yield. Pectin was extracted by conventional and microwave-assisted extraction; the microwave technique gave a maximal yield of 24.2% at a power of 500 W in only 3 min, whereas conventional extraction gave 18.32% in 120 min. The combination of microwave, ultrasound and the recycled "in situ" water of citrus peels allows high added-value compounds to be obtained in a shorter time and closes the loop using only natural resources provided by the plant, intensifying the whole process in terms of time and energy savings, cleanliness and reduced waste water. Copyright © 2014 Elsevier B.V. All rights reserved.

  10. Inhibition of chemiluminescence and chemotactic activity of phagocytes in vitro by the extracts of selected medicinal plants.

    PubMed

    Jantan, Ibrahim; Harun, Nurul Hikmah; Septama, Abdi Wira; Murad, Shahnaz; Mesaik, M A

    2011-04-01

    The methanol extracts of 20 selected medicinal plants were investigated for their effects on the respiratory burst of human whole blood, isolated human polymorphonuclear leukocytes (PMNs) and isolated mouse macrophages using a luminol/lucigenin-based chemiluminescence assay. We also tested the effect of the extracts on chemotactic migration of PMNs using the Boyden chamber technique. The extracts of Curcuma domestica L., Phyllanthus amarus Schum & Thonn and C. xanthorrhiza Roxb. showed the strongest inhibition of the luminol-based chemiluminescence (oxidative burst) of PMNs, with IC(50) values ranging from 0.5 to 0.7 μg/ml. For macrophage cells, the extracts which showed strong suppressive activity for luminol-based chemiluminescence were C. xanthorrhiza and Garcinia mangostana L. Among the extracts studied, C. mangga Valton & Vazsjip, Piper nigrum L. and Labisia pumila var. alata showed strong inhibitory activity on the lucigenin-amplified oxidative burst of PMNs, with IC(50) values ranging from 0.9 to 1.5 μg/ml. The extracts of Zingiber officinale Rosc., Alpinia galangal (L.) Willd and Averrhoa bilimbi Linn showed strong inhibition of the chemotactic migration of cells, with IC(50) values comparable to that of ibuprofen (1.5 μg/ml). The results suggest that some of these plants were able to modulate the innate immune response of phagocytes at different steps, emphasizing their potential as a source of new immunomodulatory agents.

  11. An effective vacuum assisted extraction method for the optimization of labdane diterpenoids from Andrographis paniculata by response surface methodology.

    PubMed

    Wang, Ya-Qi; Wu, Zhen-Feng; Ke, Gang; Yang, Ming

    2014-12-31

    An effective vacuum-assisted extraction (VAE) technique was proposed for the first time and applied to extract bioactive components from Andrographis paniculata. The process was carefully optimized by response surface methodology (RSM). Under the optimized experimental conditions, the best results were obtained using a boiling temperature of 65 °C, 50% ethanol concentration, 16 min of extraction time, one extraction cycle and a 12:1 liquid-solid ratio. Compared with conventional ultrasonic-assisted extraction and heat reflux extraction, the VAE technique gave shorter extraction times and remarkably higher extraction efficiency, indicating that a certain degree of vacuum gives the solvent better penetration into the pores and between the matrix particles and enhances mass transfer. The present results demonstrate that VAE is an efficient, simple and fast method for extracting bioactive components from A. paniculata, and that it shows great potential as an alternative technique for industrial scale-up applications.

  12. Influence of extraction technique on the anti-oxidative potential of hawthorn (Crataegus monogyna) extracts in bovine muscle homogenates.

    PubMed

    Shortle, E; O'Grady, M N; Gilroy, D; Furey, A; Quinn, N; Kerry, J P

    2014-12-01

    Six extracts were prepared from hawthorn (Crataegus monogyna) leaves and flowers (HLF) and berries (HB) using solid-liquid [traditional (T) (HLFT, HBT), sonicated (S) (HLFS, HBS)] and supercritical fluid (C) extraction (HLFC, HBC) techniques. The antioxidant activities of HLF and HB extracts were characterised using in vitro antioxidant assays (TPC, DPPH, FRAP) and in 25% bovine muscle (longissimus lumborum) homogenates (lipid oxidation (TBARS), oxymyoglobin (% of total myoglobin)) after 24h storage at 4°C. Hawthorn extracts exhibited varying degrees of antioxidant potency. In vitro and muscle homogenate (TBARS) antioxidant activity followed the order: HLFS>HLFT and HBT>HBS. In supercritical fluid extracts, HLFC>HBC (in vitro antioxidant activity) and HLFC≈HBC (TBARS). All extracts (except HBS) reduced oxymyoglobin oxidation. The HLFS extract had the highest antioxidant activity in all test systems. Supercritical fluid extraction (SFE) exhibited potential as a technique for the manufacture of functional ingredients (antioxidants) from hawthorn for use in muscle foods. Copyright © 2014 Elsevier Ltd. All rights reserved.

  13. Comparison of different methods for extraction from Tetraclinis articulata: yield, chemical composition and antioxidant activity.

    PubMed

    Herzi, Nejia; Bouajila, Jalloul; Camy, Séverine; Romdhane, Mehrez; Condoret, Jean-Stéphane

    2013-12-15

    In the present study, three extraction techniques, hydrodistillation (HD), solvent extraction (conventional 'Soxhlet' technique) and an innovative technique, supercritical fluid extraction (SFE), were applied to ground Tetraclinis articulata leaves and compared in terms of extraction duration, extraction yield, and chemical composition of the extracts as well as their antioxidant activities. The extracts were analyzed by GC-FID and GC-MS. The antioxidant activity was measured using two methods: ABTS(•+) and DPPH(•). The yields obtained using HD, SFE, and hexane and ethanol Soxhlet extractions were found to be 0.6, 1.6, 40.4 and 21.2-27.4 g/kg, respectively. An original result of this study is that the best antioxidant activity was obtained with an SFE extract (41 mg/L). The SFE method offers some noteworthy advantages over traditional alternatives, such as shorter extraction times, low environmental impact, and a clean, non-thermally-degraded final product. Also, a good correlation between the phenolic contents and the antioxidant activity was observed with extracts obtained by SFE at 9 MPa. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. A Simple and Efficient Method of Extracting DNA from Aged Bones and Teeth.

    PubMed

    Liu, Qiqi; Liu, Liyan; Zhang, Minli; Zhang, Qingzhen; Wang, Qiong; Ding, Xiaoran; Shao, Liting; Zhou, Zhe; Wang, Shengqi

    2018-05-01

    DNA is often difficult to extract from old bones and teeth due to low levels of DNA and high levels of degradation. This study established a simple yet efficient method for extracting DNA from 20 aged bones and teeth (approximately 60 years old). Based on the concentration and STR typing results, the new method of DNA extraction (OM) developed in this study was compared with the PrepFiler™ BTA Forensic DNA Extraction Kit (BM). The total amount of DNA extracted using the OM method was not significantly different from that extracted using the commercial kit (p > 0.05). However, the number of STR loci detected was significantly higher in the samples processed using the OM method than using the BM method (p < 0.05). This study aimed to establish a DNA extraction method for aged bones and teeth to improve the detection rate of STR typing and reduce costs compared to the BM technique. © 2017 American Academy of Forensic Sciences.

  15. Isolation, Separation, and Preconcentration of Biologically Active Compounds from Plant Matrices by Extraction Techniques.

    PubMed

    Raks, Victoria; Al-Suod, Hossam; Buszewski, Bogusław

    2018-01-01

    Development of efficient methods for isolation and separation of biologically active compounds remains an important challenge for researchers. Designing systems such as organomineral composite materials that allow extraction of a wide range of biologically active compounds, acting as broad-utility solid-phase extraction agents, remains an important and necessary task. Selective sorbents can be easily used for highly selective and reliable extraction of specific components present in complex matrices. Herein, state-of-the-art approaches for selective isolation, preconcentration, and separation of biologically active compounds from a range of matrices are discussed. Primary focus is given to novel extraction methods for some biologically active compounds including cyclic polyols, flavonoids, and oligosaccharides from plants. In addition, application of silica-, carbon-, and polymer-based solid-phase extraction adsorbents and membrane extraction for selective separation of these compounds is discussed. Potential separation process interactions are recommended; their understanding is of utmost importance for the creation of optimal conditions to extract biologically active compounds including those with estrogenic properties.

  16. Conventional and Accelerated-Solvent Extractions of Green Tea (Camellia sinensis) for Metabolomics-based Chemometrics

    PubMed Central

    Kellogg, Joshua J.; Wallace, Emily D.; Graf, Tyler N.; Oberlies, Nicholas H.; Cech, Nadja B.

    2018-01-01

    Metabolomics has emerged as an important analytical technique for multiple applications. The value of information obtained from metabolomics analysis depends on the degree to which the entire metabolome is present and the reliability of sample treatment to ensure reproducibility across the study. The purpose of this study was to compare methods of preparing complex botanical extract samples prior to metabolomics profiling. Two extraction methodologies, accelerated solvent extraction and a conventional solvent maceration, were compared using commercial green tea [Camellia sinensis (L.) Kuntze (Theaceae)] products as a test case. The accelerated solvent protocol was first evaluated to ascertain critical factors influencing extraction using a D-optimal experimental design study. The accelerated solvent and conventional extraction methods yielded similar metabolite profiles for the green tea samples studied. The accelerated solvent extraction yielded higher total amounts of extracted catechins, was more reproducible, and required less active bench time to prepare the samples. This study demonstrates the effectiveness of accelerated solvent as an efficient methodology for metabolomics studies. PMID:28787673

  17. An Efficient Extraction Method for Fragrant Volatiles from Jasminum sambac (L.) Ait.

    PubMed

    Ye, Qiuping; Jin, Xinyi; Zhu, Xinliang; Lin, Tongxiang; Hao, Zhilong; Yang, Qian

    2015-01-01

    The sweet aroma of Jasminum sambac (L.) Ait. is released while the flowers are blooming. Although the components of its volatile oil have been extensively studied, problematic issues remain, such as low yield and flavour distortion. Here, subcritical fluid extraction (SFE) was performed to extract fragrant volatiles from activated carbon that had adsorbed the aroma of jasmine flowers. This novel method can effectively recover the main aromatic compounds with a quality significantly better than solvent extraction (SE). Based on analysis with response surface methodology (RSM), the optimized extraction conditions were a temperature of 44°C, a solvent-to-material ratio of 3.5:1, and an extraction time of 53 min. Under these conditions, the extraction yield was 4.91%. Furthermore, the key jasmine essence oil components, benzyl acetate and linalool, increased 7-fold and 2-fold respectively, giving the oil its strong, typical smell. The new method also reduces spicy components, which makes the essential oil smell sweeter. Thus, the quality of the jasmine essence oil was dramatically improved and yields based on the key components increased dramatically. Our results provide a new, effective technique for extracting fragrant volatiles from jasmine flowers.

  18. Extraction and separation of Co(II) and Ni(II) from acidic sulfate solutions using Aliquat 336.

    PubMed

    Nayl, A A

    2010-01-15

    Extraction and separation of Co(II) and Ni(II) from acidic sulfate solutions by the solvent extraction technique were studied using different forms of Aliquat 336 diluted with kerosene. The extraction percentage of each metal ion was found to increase with increasing pH and extractant concentration. Co(II) was preferentially extracted over Ni(II) by the different forms of Aliquat 336 under the same extraction conditions. From analysis of the experimental results, an extraction mechanism of the R(4)N-forms with Co(II) was proposed. The highest separation factor (S(Co/Ni)) value of 606.7 was obtained with 0.36 M R(4)N-SCN in kerosene from 2.0 M H(2)SO(4) solution at pH 4.8 and a shaking time of 20 min. Stripping of the two metal ions from the organic phase was also investigated. Based on the experimental results, a separation method and flow sheet were developed and tested to separate high-purity Co(II), Ni(II) and Ln(III) from Ni-MH batteries leached with 2.0 M H(2)SO(4); 0.34 g Co, 1.39 g Ln and 5.2 g Ni were obtained from the leaching process.
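
    For readers unfamiliar with the quantity, the separation factor reported above is simply the ratio of the two metals' distribution ratios. The short sketch below shows the arithmetic with hypothetical phase concentrations, not the paper's measured values.

    ```python
    # Minimal sketch of how a separation factor such as S(Co/Ni) is computed
    # from distribution ratios (D = [M]_org / [M]_aq); concentrations are hypothetical.
    def distribution_ratio(c_org, c_aq):
        return c_org / c_aq

    D_co = distribution_ratio(9.5, 0.05)   # hypothetical Co(II) organic/aqueous concentrations
    D_ni = distribution_ratio(0.3, 0.96)   # hypothetical Ni(II) organic/aqueous concentrations
    S_co_ni = D_co / D_ni                  # separation factor
    print(f"D(Co)={D_co:.1f}, D(Ni)={D_ni:.3f}, S(Co/Ni)={S_co_ni:.0f}")
    ```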

  19. Cloud-point extraction of green-polymers from Cupriavidus necator lysate using thermoseparating-based aqueous two-phase extraction.

    PubMed

    Leong, Yoong Kit; Lan, John Chi-Wei; Loh, Hwei-San; Ling, Tau Chuan; Ooi, Chien Wei; Show, Pau Loke

    2017-03-01

    Polyhydroxyalkanoates (PHAs), a class of renewable and biodegradable green polymers, have attracted attention as a potential substitute for conventional plastics owing to the increasing concern about environmental pollution as well as the rapidly depleting petroleum reserves. Nevertheless, the high cost of downstream processing has been a bottleneck for the wide adoption of PHAs. Among PHA recovery techniques, aqueous two-phase extraction (ATPE) outshines the others by providing a mild environment for bioseparation, being green and non-toxic, handling large operating volumes, and being easily scaled up. Utilizing the unique property of thermo-responsive polymers, whose solubility in aqueous solution decreases as the temperature rises, cloud point extraction (CPE) is an ATPE technique that allows its phase-forming component to be recycled and reused. A thorough literature review showed that this is the first report of the isolation and recovery of PHAs from Cupriavidus necator H16 via CPE. The optimum conditions for PHA extraction (recovery yield of 94.8% and purification factor of 1.42-fold) were 20 wt/wt % ethylene oxide-propylene oxide (EOPO) with a molecular weight of 3900 g/mol and 10 mM sodium chloride addition at a thermoseparating temperature of 60°C with a crude feedstock limit of 37.5 wt/wt %. Recycling and reuse of EOPO 3900 could be carried out at least twice with satisfactory yield and purification factor. CPE has been demonstrated to be an effective technique for the extraction of PHAs from crude microbial culture. Copyright © 2016 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.

  20. Computer-Aided Diagnostic (CAD) Scheme by Use of Contralateral Subtraction Technique

    NASA Astrophysics Data System (ADS)

    Nagashima, Hiroyuki; Harakawa, Tetsumi

    We developed a computer-aided diagnostic (CAD) scheme for the detection of subtle image findings of acute cerebral infarction in brain computed tomography (CT) using a contralateral subtraction technique. In our computerized scheme, the lateral inclination of the image was first corrected automatically by rotating and shifting. The contralateral subtraction image was then derived by subtracting the left-right reversed image from the original image. Initial candidates for acute cerebral infarction were identified using multiple-thresholding and image-filtering techniques. As the first step in removing false-positive candidates, fourteen image features were extracted for each initial candidate, and halfway candidates were detected by applying a rule-based test to these features. In the second step, five image features were extracted from the overlap of halfway candidates between the slice of interest and the upper and lower slice images. Finally, acute cerebral infarction candidates were detected by applying a rule-based test to these five image features. The sensitivity of detection for 74 training cases was 97.4% with 3.7 false positives per image, and the performance of the CAD scheme for 44 test cases was similar to that for the training cases. Our CAD scheme using the contralateral subtraction technique can reveal suspected image findings of acute cerebral infarction in CT images.
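
    A minimal sketch of the contralateral-subtraction idea is given below: after midline alignment, the left-right mirrored slice is subtracted from the original and asymmetric regions are obtained by thresholding. The threshold, minimum region size and connected-component cleanup are illustrative assumptions, not the scheme's tuned parameters or rule-based tests.

    ```python
    # Hedged sketch: mirror the aligned CT slice, subtract, threshold, and keep
    # connected regions above a minimum size as infarction candidates.
    import numpy as np
    from scipy import ndimage

    def contralateral_candidates(slice_img, threshold=15, min_size=50):
        """Return a labelled map of asymmetric density candidates."""
        mirrored = np.fliplr(slice_img)             # reversed image about the midline
        diff = slice_img.astype(float) - mirrored   # contralateral subtraction image
        mask = np.abs(diff) > threshold             # simple thresholding stand-in
        labels, n = ndimage.label(mask)
        # drop tiny components as a crude false-positive filter
        sizes = ndimage.sum(mask, labels, range(1, n + 1))
        keep = np.isin(labels, np.flatnonzero(sizes >= min_size) + 1)
        return ndimage.label(keep)[0]

    ct_slice = np.random.default_rng(2).normal(40, 5, size=(256, 256))  # stand-in slice
    print(np.max(contralateral_candidates(ct_slice)))  # number of candidate regions
    ```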

  1. Depth extraction method with high accuracy in integral imaging based on moving array lenslet technique

    NASA Astrophysics Data System (ADS)

    Wang, Yao-yao; Zhang, Juan; Zhao, Xue-wei; Song, Li-pei; Zhang, Bo; Zhao, Xing

    2018-03-01

    In order to improve depth extraction accuracy, a method using the moving array lenslet technique (MALT) in the pickup stage is proposed, which decreases the depth interval caused by pixelation. In this method, the lenslet array is moved along the horizontal and vertical directions simultaneously N times within one pitch to obtain N sets of elemental images. A computational integral imaging reconstruction method for MALT is used to obtain slice images of the 3D scene, and the sum modulus difference (SMD) blur metric is applied to these slice images to obtain the depth information of the 3D scene. Simulation and optical experiments are carried out to verify the feasibility of this method.
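
    The depth decision in this approach rests on a focus measure evaluated on each reconstructed slice. The sketch below shows one common form of a sum-modulus-difference style metric on stand-in slices; the exact metric and windowing used in the paper may differ.

    ```python
    # Hedged sketch of an SMD-style focus measure: the slice with the largest
    # value is taken as the in-focus depth plane.
    import numpy as np

    def smd(img):
        """Sum of absolute horizontal and vertical gray-level differences."""
        img = img.astype(float)
        dx = np.abs(np.diff(img, axis=1)).sum()
        dy = np.abs(np.diff(img, axis=0)).sum()
        return dx + dy

    slices = [np.random.rand(64, 64) for _ in range(5)]   # stand-in reconstructed slices
    best_depth = max(range(len(slices)), key=lambda i: smd(slices[i]))
    print("best-focused slice index:", best_depth)
    ```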

  2. An automatic rat brain extraction method based on a deformable surface model.

    PubMed

    Li, Jiehua; Liu, Xiaofeng; Zhuo, Jiachen; Gullapalli, Rao P; Zara, Jason M

    2013-08-15

    The extraction of the brain from the skull in medical images is a necessary first step before image registration or segmentation. While pre-clinical MR imaging studies on small animals, such as rats, are increasing, fully automatic image processing techniques specific to small animal studies remain lacking. In this paper, we present an automatic rat brain extraction method, the Rat Brain Deformable model method (RBD), which adapts the popular human brain extraction tool (BET) through the incorporation of information on the brain geometry and MR image characteristics of the rat brain. The robustness of the method was demonstrated on T2-weighted MR images of 64 rats and compared with other brain extraction methods (BET, PCNN, PCNN-3D). The results demonstrate that RBD reliably extracts the rat brain with high accuracy (>92% volume overlap) and is robust against signal inhomogeneity in the images. Copyright © 2013 Elsevier B.V. All rights reserved.

  3. Comparative analysis of essential oil composition of Iranian and Indian Nigella sativa L. extracted using supercritical fluid extraction and solvent extraction

    PubMed Central

    Ghahramanloo, Kourosh Hasanzadeh; Kamalidehghan, Behnam; Akbari Javar, Hamid; Teguh Widodo, Riyanto; Majidzadeh, Keivan; Noordin, Mohamed Ibrahim

    2017-01-01

    The objective of this study was to compare the oil extraction yield and essential oil composition of Indian and Iranian Nigella sativa L. extracted by using Supercritical Fluid Extraction (SFE) and solvent extraction methods. In this study, a gas chromatography equipped with a mass spectrophotometer detector was employed for qualitative analysis of the essential oil composition of Indian and Iranian N. sativa L. The results indicated that the main fatty acid composition identified in the essential oils extracted by using SFE and solvent extraction were linoleic acid (22.4%–61.85%) and oleic acid (1.64%–18.97%). Thymoquinone (0.72%–21.03%) was found to be the major volatile compound in the extracted N. sativa oil. It was observed that the oil extraction efficiency obtained from SFE was significantly (P<0.05) higher than that achieved by the solvent extraction technique. The present study showed that SFE can be used as a more efficient technique for extraction of N. Sativa L. essential oil, which is composed of higher linoleic acid and thymoquinone contents compared to the essential oil obtained by the solvent extraction technique. PMID:28814830

  4. Comparative analysis of essential oil composition of Iranian and Indian Nigella sativa L. extracted using supercritical fluid extraction and solvent extraction.

    PubMed

    Ghahramanloo, Kourosh Hasanzadeh; Kamalidehghan, Behnam; Akbari Javar, Hamid; Teguh Widodo, Riyanto; Majidzadeh, Keivan; Noordin, Mohamed Ibrahim

    2017-01-01

    The objective of this study was to compare the oil extraction yield and essential oil composition of Indian and Iranian Nigella sativa L. extracted by using Supercritical Fluid Extraction (SFE) and solvent extraction methods. In this study, a gas chromatography equipped with a mass spectrophotometer detector was employed for qualitative analysis of the essential oil composition of Indian and Iranian N. sativa L. The results indicated that the main fatty acid composition identified in the essential oils extracted by using SFE and solvent extraction were linoleic acid (22.4%-61.85%) and oleic acid (1.64%-18.97%). Thymoquinone (0.72%-21.03%) was found to be the major volatile compound in the extracted N. sativa oil. It was observed that the oil extraction efficiency obtained from SFE was significantly ( P <0.05) higher than that achieved by the solvent extraction technique. The present study showed that SFE can be used as a more efficient technique for extraction of N. Sativa L. essential oil, which is composed of higher linoleic acid and thymoquinone contents compared to the essential oil obtained by the solvent extraction technique.

  5. Mapping brain activity in gradient-echo functional MRI using principal component analysis

    NASA Astrophysics Data System (ADS)

    Khosla, Deepak; Singh, Manbir; Don, Manuel

    1997-05-01

    The detection of sites of brain activation in functional MRI has been a topic of immense research interest, and many techniques have been proposed to this end. Recently, principal component analysis (PCA) has been applied to extract the activated regions and their time course of activation. This method is based on the assumption that the activation is orthogonal to other signal variations such as brain motion, physiological oscillations and other uncorrelated noise. A distinct advantage of this method is that it does not require any knowledge of the time course of the true stimulus paradigm. This technique is well suited to EPI image sequences where the sampling rate is high enough to capture the effects of physiological oscillations. In this work, we propose and apply two methods based on PCA to conventional gradient-echo images and investigate their usefulness as tools to extract reliable information on brain activation. The first method is a conventional technique in which a single image sequence with alternating on and off stages is subjected to a principal component analysis. The second method is a PCA-based approach called the common spatial factor analysis technique (CSF). As the name suggests, this method relies on common spatial factors between the above fMRI image sequence and a background fMRI sequence. We applied these methods to identify active brain areas during visual stimulation and motor tasks. The results from these methods were compared to those obtained using the standard cross-correlation technique. We found good agreement in the areas identified as active across all three techniques. The results suggest that the PCA and CSF methods have good potential for detecting true stimulus-correlated changes in the presence of other interfering signals.
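
    A minimal sketch of the first (conventional PCA) method is given below, assuming a synthetic single-slice time series. The array shapes, the number of components and the post-hoc comparison with the paradigm are illustrative only; PCA itself uses no paradigm information.

    ```python
    # Hedged sketch: PCA on an fMRI time series to pull out an activation-like
    # component time course and its spatial map.
    import numpy as np
    from sklearn.decomposition import PCA

    # time series of T volumes, each X*Y voxels (a single slice for simplicity)
    T, X, Y = 80, 32, 32
    rng = np.random.default_rng(3)
    data = rng.normal(size=(T, X, Y))
    paradigm = np.tile([0] * 10 + [1] * 10, 4)             # on/off block design
    data[:, 10:15, 10:15] += 2 * paradigm[:, None, None]   # synthetic activation

    # rows = time points, columns = voxels
    M = data.reshape(T, -1)
    pca = PCA(n_components=5)
    timecourses = pca.fit_transform(M - M.mean(axis=0))    # component time courses
    maps = pca.components_.reshape(5, X, Y)                # corresponding spatial maps

    # find the component whose time course best matches the paradigm (for
    # illustration only; PCA never sees the paradigm)
    best = np.argmax([abs(np.corrcoef(tc, paradigm)[0, 1]) for tc in timecourses.T])
    print("activation-like component:", best)
    ```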

  6. An expert botanical feature extraction technique based on phenetic features for identifying plant species.

    PubMed

    Kolivand, Hoshang; Fern, Bong Mei; Rahim, Mohd Shafry Mohd; Sulong, Ghazali; Baker, Thar; Tully, David

    2018-01-01

    In this paper, we present a new method to recognise the leaf type and identify plant species using phenetic parts of the leaf: lobe, apex and base detection. Most of the research in this area focuses on popular features such as shape, colour, vein and texture, which consume large amounts of computational processing and are not efficient, especially for the Acer database with its highly complex leaf structures. This paper focuses on phenetic parts of the leaf, which increases accuracy. Local maxima and local minima are detected based on the Centroid Contour Distance for every boundary point, using the north and south regions to recognise the apex and base. Digital morphology is used to measure the leaf shape and the leaf margin, and the Centroid Contour Gradient is presented to extract the curvature of the leaf apex and base. We analysed 32 leaf images of tropical plants and evaluated the method on two different datasets, Flavia and Acer, obtaining best accuracies of 94.76% and 82.6%, respectively. Experimental results show the effectiveness of the proposed technique without relying on the commonly used features with high computational cost.
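
    The centroid-contour-distance idea can be sketched as follows: the distance from the shape centroid to each ordered boundary point is computed, and its local extrema mark lobe, apex and base candidates. The synthetic outline and the use of a generic SciPy peak finder are assumptions for illustration, not the authors' exact procedure.

    ```python
    # Hedged sketch of Centroid Contour Distance with local extrema detection.
    import numpy as np
    from scipy.signal import find_peaks

    def centroid_contour_distance(contour):
        """contour: (N, 2) array of boundary points ordered along the outline."""
        centroid = contour.mean(axis=0)
        return np.linalg.norm(contour - centroid, axis=1)

    # a synthetic lobed outline standing in for a traced leaf boundary
    t = np.linspace(0, 2 * np.pi, 400, endpoint=False)
    r = 1.0 + 0.3 * np.cos(5 * t)                        # five lobes
    contour = np.column_stack([r * np.cos(t), r * np.sin(t)])

    ccd = centroid_contour_distance(contour)
    maxima, _ = find_peaks(ccd)                          # lobe tips / apex candidates
    minima, _ = find_peaks(-ccd)                         # sinuses / base candidates
    print(len(maxima), "local maxima,", len(minima), "local minima")
    ```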

  7. An expert botanical feature extraction technique based on phenetic features for identifying plant species

    PubMed Central

    Fern, Bong Mei; Rahim, Mohd Shafry Mohd; Sulong, Ghazali; Baker, Thar; Tully, David

    2018-01-01

    In this paper, we present a new method to recognise the leaf type and identify plant species using phenetic parts of the leaf: lobe, apex and base detection. Most of the research in this area focuses on popular features such as shape, colour, vein and texture, which consume large amounts of computational processing and are not efficient, especially for the Acer database with its highly complex leaf structures. This paper focuses on phenetic parts of the leaf, which increases accuracy. Local maxima and local minima are detected based on the Centroid Contour Distance for every boundary point, using the north and south regions to recognise the apex and base. Digital morphology is used to measure the leaf shape and the leaf margin, and the Centroid Contour Gradient is presented to extract the curvature of the leaf apex and base. We analysed 32 leaf images of tropical plants and evaluated the method on two different datasets, Flavia and Acer, obtaining best accuracies of 94.76% and 82.6%, respectively. Experimental results show the effectiveness of the proposed technique without relying on the commonly used features with high computational cost. PMID:29420568

  8. Mapping population-based structural connectomes.

    PubMed

    Zhang, Zhengwu; Descoteaux, Maxime; Zhang, Jingwen; Girard, Gabriel; Chamberland, Maxime; Dunson, David; Srivastava, Anuj; Zhu, Hongtu

    2018-05-15

    Advances in understanding the structural connectomes of human brain require improved approaches for the construction, comparison and integration of high-dimensional whole-brain tractography data from a large number of individuals. This article develops a population-based structural connectome (PSC) mapping framework to address these challenges. PSC simultaneously characterizes a large number of white matter bundles within and across different subjects by registering different subjects' brains based on coarse cortical parcellations, compressing the bundles of each connection, and extracting novel connection weights. A robust tractography algorithm and streamline post-processing techniques, including dilation of gray matter regions, streamline cutting, and outlier streamline removal are applied to improve the robustness of the extracted structural connectomes. The developed PSC framework can be used to reproducibly extract binary networks, weighted networks and streamline-based brain connectomes. We apply the PSC to Human Connectome Project data to illustrate its application in characterizing normal variations and heritability of structural connectomes in healthy subjects. Copyright © 2018 Elsevier Inc. All rights reserved.

  9. Application of pulsed field gradient NMR techniques for investigating binding of flavor compounds to macromolecules.

    PubMed

    Jung, Da-Mi; De Ropp, Jeffrey S; Ebeler, Susan E

    2002-07-17

    Two diffusion-based NMR techniques are presented and used to investigate the binding of selected flavor compounds to macromolecules. A pulsed field gradient NMR (PFG-NMR) method was applied to measure the apparent diffusion coefficients of four alkanone compounds as they associated with bovine serum albumin (BSA). The change in the apparent diffusion coefficient as a function of the BSA/alkanone ratio was fitted to yield binding constants (K(a)) and binding stoichiometry (n) for each alkanone. The results showed that the apparent diffusion coefficients of alkanones increased with a decrease in the BSA/alkanone ratios, and the measured values of K(a) and n were comparable with those obtained with other methods and depended on the alkanone structure. A diffusion-based nuclear Overhauser effect (called diffusion NOE pumping) method was also applied to screen mixtures of flavor compounds and identify those that have a binding affinity to complex macromolecules. Using this technique, benzaldehyde and vanillin were observed to bind with bovine serum albumin, whereas 2-phenylethanol was identified as a nonbinding or weakly binding ligand with BSA. The diffusion NOE pumping method was also applied to a hydroalcoholic solution of cacao bean tannin extracts to which a mixture of ethylbenzoate, benzaldehyde, and 2-phenylethanol was added. The diffusion NOE pumping technique clearly indicated that ethylbenzoate had a stronger binding affinity to the polymeric (-)-epicatechin units of the cacao bean tannin extracts than the other two flavor compounds. The results successfully demonstrate the potential applications of diffusion-based NMR techniques for studying flavors and nonvolatile food matrix interactions.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Wei, E-mail: wguo2@ncsu.edu; Kirste, Ronny; Bryan, Zachary

    Enhanced light extraction efficiency was demonstrated on nanostructure patterned GaN and AlGaN/AlN Multiple-Quantum-Well (MQW) structures using mass production techniques including natural lithography and interference lithography with feature size as small as 100 nm. Periodic nanostructures showed higher light extraction efficiency and modified emission profile compared to non-periodic structures based on integral reflection and angular-resolved transmission measurement. Light extraction mechanism of macroscopic and microscopic nanopatterning is discussed, and the advantage of using periodic nanostructure patterning is provided. An enhanced photoluminescence emission intensity was observed on nanostructure patterned AlGaN/AlN MQW compared to as-grown structure, demonstrating a large-scale and mass-producible pathway to higher light extraction efficiency in deep-ultra-violet light-emitting diodes.

  11. Supercritical fluid extraction of selected pharmaceuticals from water and serum.

    PubMed

    Simmons, B R; Stewart, J T

    1997-01-24

    Selected drugs from benzodiazepine, anabolic agent and non-steroidal anti-inflammatory drug (NSAID) therapeutic classes were extracted from water and serum using a supercritical CO2 mobile phase. The samples were extracted at a pump pressure of 329 MPa, an extraction chamber temperature of 45 degrees C, and a restrictor temperature of 60 degrees C. The static extraction time for all samples was 2.5 min and the dynamic extraction time ranged from 5 to 20 min. The analytes were collected in appropriate solvent traps and assayed by modified literature HPLC procedures. Analyte recoveries were calculated based on peak height measurements of extracted vs. unextracted analyte. The recovery of the benzodiazepines ranged from 80 to 98% in water and from 75 to 94% in serum. Anabolic drug recoveries from water and serum ranged from 67 to 100% and 70 to 100%, respectively. The NSAIDs were recovered from water in the 76 to 97% range and in the 76 to 100% range from serum. Accuracy, precision and endogenous peak interference, if any, were determined for blank and spiked serum extractions and compared with classical sample preparation techniques of liquid-liquid and solid-phase extraction reported in the literature. For the benzodiazepines, accuracy and precision for supercritical fluid extraction (SFE) ranged from 1.95 to 3.31 and 0.57 to 1.25%, respectively (n = 3). The SFE accuracy and precision data for the anabolic agents ranged from 4.03 to 7.84 and 0.66 to 2.78%, respectively (n = 3). The accuracy and precision data reported for the SFE of the NSAIDs ranged from 2.79 to 3.79 and 0.33 to 1.27%, respectively (n = 3). The precision of the SFE method from serum was shown to be comparable to the precision obtained with other classical preparation techniques.

  12. Analysis of a Knowledge-Management-Based Process of Transferring Project Management Skills

    ERIC Educational Resources Information Center

    Ioi, Toshihiro; Ono, Masakazu; Ishii, Kota; Kato, Kazuhiko

    2012-01-01

    Purpose: The purpose of this paper is to propose a method for the transfer of knowledge and skills in project management (PM) based on techniques in knowledge management (KM). Design/methodology/approach: The literature contains studies on methods to extract experiential knowledge in PM, but few studies exist that focus on methods to convert…

  13. Fundus Image Features Extraction for Exudate Mining in Coordination with Content Based Image Retrieval: A Study

    NASA Astrophysics Data System (ADS)

    Gururaj, C.; Jayadevappa, D.; Tunga, Satish

    2018-02-01

    The medical field has seen phenomenal improvement over recent years. The invention of computers, with the corresponding increase in processing and internet speed, has changed the face of medical technology. However, there is still scope for improving the technologies in use today. One of the many such technologies of medical aid is the detection of afflictions of the eye. Although a repertoire of research has been accomplished in this field, most of it fails to address how to take the detection forward to a stage where it will be beneficial to society at large. An automated system that can predict the current medical condition of a patient from a fundus image of the eye is yet to see the light of day. Such a system is explored in this paper by summarizing a number of techniques for fundus image feature extraction, predominantly hard exudate mining, coupled with Content Based Image Retrieval to develop an automation tool. Knowledge of the same would bring about worthy changes in the domain of exudate extraction of the eye. This is essential in cases where patients may not have access to the best of technologies. This paper attempts a comprehensive summary of the techniques for Content Based Image Retrieval (CBIR) and fundus image feature extraction, a few choice methods of both, and an exploration that aims to find ways to combine these two attractive features so that the combination is beneficial to all.

  14. Fundus Image Features Extraction for Exudate Mining in Coordination with Content Based Image Retrieval: A Study

    NASA Astrophysics Data System (ADS)

    Gururaj, C.; Jayadevappa, D.; Tunga, Satish

    2018-06-01

    The medical field has seen phenomenal improvement over recent years. The invention of computers, with the corresponding increase in processing and internet speed, has changed the face of medical technology. However, there is still scope for improving the technologies in use today. One of the many such technologies of medical aid is the detection of afflictions of the eye. Although a repertoire of research has been accomplished in this field, most of it fails to address how to take the detection forward to a stage where it will be beneficial to society at large. An automated system that can predict the current medical condition of a patient from a fundus image of the eye is yet to see the light of day. Such a system is explored in this paper by summarizing a number of techniques for fundus image feature extraction, predominantly hard exudate mining, coupled with Content Based Image Retrieval to develop an automation tool. Knowledge of the same would bring about worthy changes in the domain of exudate extraction of the eye. This is essential in cases where patients may not have access to the best of technologies. This paper attempts a comprehensive summary of the techniques for Content Based Image Retrieval (CBIR) and fundus image feature extraction, a few choice methods of both, and an exploration that aims to find ways to combine these two attractive features so that the combination is beneficial to all.

  15. Techniques for information extraction from compressed GPS traces : final report.

    DOT National Transportation Integrated Search

    2015-12-31

    Developing techniques for extracting information requires a good understanding of the methods used to compress the traces. Many techniques for compressing trace data consisting of position (i.e., latitude/longitude) and time values have been developed…

  16. X-ray phase contrast tomography by tracking near field speckle

    PubMed Central

    Wang, Hongchang; Berujon, Sebastien; Herzen, Julia; Atwood, Robert; Laundy, David; Hipp, Alexander; Sawhney, Kawal

    2015-01-01

    X-ray imaging techniques that capture variations in the x-ray phase can yield higher contrast images with lower x-ray dose than is possible with conventional absorption radiography. However, the extraction of phase information is often more difficult than the extraction of absorption information and requires a more sophisticated experimental arrangement. We here report a method for three-dimensional (3D) X-ray phase contrast computed tomography (CT) which gives quantitative volumetric information on the real part of the refractive index. The method is based on the recently developed X-ray speckle tracking technique in which the displacement of near field speckle is tracked using a digital image correlation algorithm. In addition to differential phase contrast projection images, the method allows the dark-field images to be simultaneously extracted. After reconstruction, compared to conventional absorption CT images, the 3D phase CT images show greatly enhanced contrast. This new imaging method has advantages compared to other X-ray imaging methods in simplicity of experimental arrangement, speed of measurement and relative insensitivity to beam movements. These features make the technique an attractive candidate for material imaging such as in-vivo imaging of biological systems containing soft tissue. PMID:25735237
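
    At the heart of the speckle-tracking step is a local cross-correlation between reference and sample speckle images whose peak gives the local speckle displacement. The sketch below uses a plain FFT cross-correlation on a single window with synthetic data; the paper's digital image correlation algorithm, window sizes and sub-pixel refinement are not reproduced here.

    ```python
    # Hedged sketch: integer-pixel speckle displacement from a circular FFT
    # cross-correlation of one reference/sample window pair.
    import numpy as np

    def window_shift(ref_win, sam_win):
        """Integer-pixel displacement of sam_win relative to ref_win."""
        r = ref_win - ref_win.mean()
        s = sam_win - sam_win.mean()
        corr = np.fft.ifft2(np.conj(np.fft.fft2(r)) * np.fft.fft2(s)).real
        peak = np.unravel_index(np.argmax(corr), corr.shape)
        # map FFT indices to signed shifts
        return [p if p <= n // 2 else p - n for p, n in zip(peak, corr.shape)]

    rng = np.random.default_rng(4)
    ref = rng.random((64, 64))                      # stand-in reference speckle pattern
    sample = np.roll(ref, (2, -3), axis=(0, 1))     # speckle displaced by the object
    print(window_shift(ref, sample))                # expected displacement: [2, -3]
    ```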

  17. Machine Reading for Extraction of Bacteria and Habitat Taxonomies

    PubMed Central

    Kordjamshidi, Parisa; Massa, Wouter; Provoost, Thomas; Moens, Marie-Francine

    2015-01-01

    There is a vast amount of scientific literature available from various resources such as the internet. Automating the extraction of knowledge from these resources is very helpful for biologists to easily access this information. This paper presents a system to extract the bacteria and their habitats, as well as the relations between them. We investigate to what extent current techniques are suited for this task and test a variety of models in this regard. We detect entities in a biological text and map the habitats into a given taxonomy. Our model uses a linear chain Conditional Random Field (CRF). For the prediction of relations between the entities, a model based on logistic regression is built. Designing a system upon these techniques, we explore several improvements for both the generation and selection of good candidates. One contribution to this lies in the extended flexibility of our ontology mapper that uses an advanced boundary detection and assigns the taxonomy elements to the detected habitats. Furthermore, we discover value in the combination of several distinct candidate generation rules. Using these techniques, we show results that significantly improve upon the state of the art for the BioNLP Bacteria Biotopes task. PMID:27077141
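
    A hedged sketch of the relation-prediction step is shown below: candidate (bacterium, habitat) pairs are described by a few surface features and classified with logistic regression. The features and toy examples are invented for illustration; the entity detection itself, which the paper performs with a linear-chain CRF, is not reproduced here.

    ```python
    # Hedged sketch: logistic-regression classification of candidate
    # bacterium-habitat pairs from simple hand-crafted features.
    from sklearn.feature_extraction import DictVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # each candidate pair is described by a handful of illustrative surface features
    pairs = [
        {"token_distance": 3,  "same_sentence": True,  "verb_between": "isolated"},
        {"token_distance": 25, "same_sentence": False, "verb_between": "none"},
        {"token_distance": 5,  "same_sentence": True,  "verb_between": "found"},
        {"token_distance": 40, "same_sentence": False, "verb_between": "none"},
    ]
    labels = [1, 0, 1, 0]   # 1 = "lives in" relation holds, 0 = no relation

    model = make_pipeline(DictVectorizer(sparse=False), LogisticRegression())
    model.fit(pairs, labels)
    print(model.predict([{"token_distance": 4, "same_sentence": True, "verb_between": "isolated"}]))
    ```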

  18. 3D Feature Extraction for Unstructured Grids

    NASA Technical Reports Server (NTRS)

    Silver, Deborah

    1996-01-01

    Visualization techniques provide tools that help scientists identify observed phenomena in scientific simulation. To be useful, these tools must allow the user to extract regions, classify and visualize them, abstract them for simplified representations, and track their evolution. Object Segmentation provides a technique to extract and quantify regions of interest within these massive datasets. This article explores basic algorithms to extract coherent amorphous regions from two-dimensional and three-dimensional scalar unstructured grids. The techniques are applied to datasets from Computational Fluid Dynamics and those from Finite Element Analysis.

  19. Automatic sub-pixel coastline extraction based on spectral mixture analysis using EO-1 Hyperion data

    NASA Astrophysics Data System (ADS)

    Hong, Zhonghua; Li, Xuesu; Han, Yanling; Zhang, Yun; Wang, Jing; Zhou, Ruyan; Hu, Kening

    2018-06-01

    Many megacities (such as Shanghai) are located in coastal areas, therefore, coastline monitoring is critical for urban security and urban development sustainability. A shoreline is defined as the intersection between coastal land and a water surface and features seawater edge movements as tides rise and fall. Remote sensing techniques have increasingly been used for coastline extraction; however, traditional hard classification methods are performed only at the pixel-level and extracting subpixel accuracy using soft classification methods is both challenging and time consuming due to the complex features in coastal regions. This paper presents an automatic sub-pixel coastline extraction method (ASPCE) from high-spectral satellite imaging that performs coastline extraction based on spectral mixture analysis and, thus, achieves higher accuracy. The ASPCE method consists of three main components: 1) A Water- Vegetation-Impervious-Soil (W-V-I-S) model is first presented to detect mixed W-V-I-S pixels and determine the endmember spectra in coastal regions; 2) The linear spectral mixture unmixing technique based on Fully Constrained Least Squares (FCLS) is applied to the mixed W-V-I-S pixels to estimate seawater abundance; and 3) The spatial attraction model is used to extract the coastline. We tested this new method using EO-1 images from three coastal regions in China: the South China Sea, the East China Sea, and the Bohai Sea. The results showed that the method is accurate and robust. Root mean square error (RMSE) was utilized to evaluate the accuracy by calculating the distance differences between the extracted coastline and the digitized coastline. The classifier's performance was compared with that of the Multiple Endmember Spectral Mixture Analysis (MESMA), Mixture Tuned Matched Filtering (MTMF), Sequential Maximum Angle Convex Cone (SMACC), Constrained Energy Minimization (CEM), and one classical Normalized Difference Water Index (NDWI). The results from the three test sites indicated that the proposed ASPCE method extracted coastlines more efficiently than did the compared methods, and its coastline extraction accuracy corresponded closely to the digitized coastline, with 0.39 pixels, 0.40 pixels, and 0.35 pixels in the three test regions, showing that the ASPCE method achieves an accuracy below 12.0 m (0.40 pixels). Moreover, in the quantitative accuracy assessment for the three test sites, the ASPCE method shows the best performance in coastline extraction, achieving a 0.35 pixel-level at the Bohai Sea, China test site. Therefore, the proposed ASPCE method can extract coastline more accurately than can the hard classification methods or other spectral unmixing methods.
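
    The seawater-abundance step relies on fully constrained least-squares (FCLS) unmixing. A common way to approximate the sum-to-one constraint is to append a heavily weighted row of ones to the endmember matrix and solve a non-negative least-squares problem, as in the sketch below; the endmember spectra stand in for the W-V-I-S endmembers and EO-1 Hyperion bands and are synthetic placeholders.

    ```python
    # Hedged sketch of FCLS unmixing for a single mixed pixel via weighted NNLS.
    import numpy as np
    from scipy.optimize import nnls

    def fcls(endmembers, pixel, delta=1e3):
        """endmembers: (bands, k) matrix; pixel: (bands,) spectrum."""
        E = np.vstack([endmembers, delta * np.ones(endmembers.shape[1])])  # sum-to-one row
        p = np.append(pixel, delta)
        abundances, _ = nnls(E, p)                                         # non-negativity
        return abundances

    rng = np.random.default_rng(5)
    E = rng.random((50, 4))                      # water, vegetation, impervious, soil endmembers
    true = np.array([0.6, 0.1, 0.2, 0.1])        # true abundances (sum to one)
    pixel = E @ true + rng.normal(0, 0.005, 50)  # observed mixed-pixel spectrum
    print(fcls(E, pixel).round(2))               # should be close to the true abundances
    ```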

  20. Sample preparation techniques for the determination of trace residues and contaminants in foods.

    PubMed

    Ridgway, Kathy; Lalljie, Sam P D; Smith, Roger M

    2007-06-15

    The determination of trace residues and contaminants in complex matrices, such as food, often requires extensive sample extraction and preparation prior to instrumental analysis. Sample preparation is often the bottleneck in analysis and there is a need to minimise the number of steps to reduce both time and sources of error. There is also a move towards more environmentally friendly techniques, which use less solvent and smaller sample sizes. Smaller sample size becomes important when dealing with real life problems, such as consumer complaints and alleged chemical contamination. Optimal sample preparation can reduce analysis time, sources of error, enhance sensitivity and enable unequivocal identification, confirmation and quantification. This review considers all aspects of sample preparation, covering general extraction techniques, such as Soxhlet and pressurised liquid extraction, microextraction techniques such as liquid phase microextraction (LPME) and more selective techniques, such as solid phase extraction (SPE), solid phase microextraction (SPME) and stir bar sorptive extraction (SBSE). The applicability of each technique in food analysis, particularly for the determination of trace organic contaminants in foods is discussed.

  1. The Efficiency of Random Forest Method for Shoreline Extraction from LANDSAT-8 and GOKTURK-2 Imageries

    NASA Astrophysics Data System (ADS)

    Bayram, B.; Erdem, F.; Akpinar, B.; Ince, A. K.; Bozkurt, S.; Catal Reis, H.; Seker, D. Z.

    2017-11-01

    Coastal monitoring plays a vital role in environmental planning and hazard management. Since shorelines are fundamental data for environmental management, disaster management, coastal erosion studies, modelling of sediment transport and coastal morphodynamics, various techniques have been developed to extract them. Random Forest, a machine learning method based on decision trees, is one of these techniques and is used in this study for shoreline extraction. Decision trees analyse classes of training data and create rules for classification. In this study, the Terkos region was chosen for the proposed method within the scope of the TUBITAK Project (Project No: 115Y718) titled "Integration of Unmanned Aerial Vehicles for Sustainable Coastal Zone Monitoring Model - Three-Dimensional Automatic Coastline Extraction and Analysis: Istanbul-Terkos Example". The Random Forest algorithm was implemented to extract the shoreline of the Black Sea near the lake from LANDSAT-8 and GOKTURK-2 satellite imagery taken in 2015. The MATLAB environment was used for classification. To obtain land and water-body classes, the Random Forest method was applied to the NIR bands of the LANDSAT-8 (5th band) and GOKTURK-2 (4th band) imagery. Each image was digitized manually and shorelines were obtained for accuracy assessment. According to the accuracy assessment results, the Random Forest method is efficient for shoreline extraction from both medium and high resolution images.
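
    A minimal sketch of the pixel-wise land/water classification is given below in Python with scikit-learn (the study itself used MATLAB): a Random Forest is trained on labelled NIR samples and applied to every pixel of a stand-in NIR band, after which class transitions approximate the shoreline. Reflectance values and labels are synthetic.

    ```python
    # Hedged sketch: Random Forest land/water classification of an NIR band.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(6)
    nir = np.concatenate([rng.normal(0.05, 0.01, 500),    # water pixels (low NIR)
                          rng.normal(0.30, 0.05, 500)])   # land pixels (high NIR)
    labels = np.array([0] * 500 + [1] * 500)              # 0 = water, 1 = land

    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(nir.reshape(-1, 1), labels)

    scene = rng.normal(0.18, 0.12, size=(100, 100)).clip(0, 1)   # stand-in NIR band
    water_land = clf.predict(scene.reshape(-1, 1)).reshape(scene.shape)

    # the shoreline lies where the predicted class changes between neighbours
    boundary = np.abs(np.diff(water_land, axis=0)).sum() + np.abs(np.diff(water_land, axis=1)).sum()
    print("boundary pixel transitions:", int(boundary))
    ```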

  2. Assessment of visual landscape quality using IKONOS imagery.

    PubMed

    Ozkan, Ulas Yunus

    2014-07-01

    The assessment of visual landscape quality is of importance to the management of urban woodlands. Satellite remote sensing may be used for this purpose as a substitute for traditional survey techniques that are both labour-intensive and time-consuming. This study examines the association between the quality of the perceived visual landscape in urban woodlands and texture measures extracted from IKONOS satellite data, which features 4-m spatial resolution and four spectral bands. The study was conducted in the woodlands of Istanbul (the most important element of the urban mosaic) lying along both shores of the Bosporus Strait. The visual quality assessment applied in this study is based on the perceptual approach and was performed via a survey of expressed preferences. For this purpose, representative photographs of real scenery were used to elicit observers' preferences. A slide show comprising 33 images was presented to a group of 153 volunteers (all undergraduate students), and they were asked to rate the visual quality of each on a 10-point scale (1 for very low visual quality, 10 for very high). Average visual quality scores were calculated for each landscape. Texture measures were acquired using two methods: pixel-based and object-based. Pixel-based texture measures were extracted from the first principal component (PC1) image, while object-based texture measures were extracted using the original four bands. The association between image texture measures and perceived visual landscape quality was tested via Pearson's correlation coefficient. The analysis found a strong linear association between image texture measures and visual quality; the highest correlation coefficient was calculated between the standard deviation of gray levels (SDGL), one of the pixel-based texture measures, and visual quality (r = 0.82, P < 0.05). The results showed that the perceived visual quality of urban woodland landscapes can be estimated by using texture measures extracted from satellite data in combination with appropriate modelling techniques.
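
    The core statistical step, correlating a pixel-based texture measure with mean preference scores, can be sketched as below. The SDGL here is taken as the per-scene standard deviation of gray levels, and the scene windows and survey scores are synthetic stand-ins for the IKONOS PC1 data and the 153 observers' ratings.

    ```python
    # Hedged sketch: per-scene SDGL texture measure correlated with mean
    # visual-quality scores via Pearson's r.
    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(7)
    scenes = [rng.normal(100, 10 + 2 * i, size=(64, 64)) for i in range(33)]   # 33 stand-in PC1 windows
    sdgl = np.array([s.std() for s in scenes])                                 # pixel-based texture measure
    visual_quality = 3 + 0.2 * sdgl + rng.normal(0, 0.5, 33)                   # stand-in mean survey scores

    r, p = pearsonr(sdgl, visual_quality)
    print(f"Pearson r = {r:.2f}, p = {p:.3g}")
    ```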

  3. Study on Building Extraction from High-Resolution Images Using Mbi

    NASA Astrophysics Data System (ADS)

    Ding, Z.; Wang, X. Q.; Li, Y. L.; Zhang, S. S.

    2018-04-01

    Building extraction from high-resolution remote sensing images is a hot research topic in the field of photogrammetry and remote sensing. However, the diversity and complexity of buildings mean that building extraction methods still face challenges in terms of accuracy, efficiency, and so on. In this study, a new building extraction framework based on the morphological building index (MBI), combined with image segmentation techniques and spectral, shadow, and shape constraints, is proposed. To verify the proposed method, WorldView-2, GF-2 and GF-1 remote sensing images covering Xiamen Software Park were used for building extraction experiments. Experimental results indicate that the proposed method improves on the original MBI significantly, and the correct rate is over 86%.

  4. Magnetic ionic liquid-based dispersive liquid-liquid microextraction technique for preconcentration and ultra-trace determination of Cd in honey.

    PubMed

    Fiorentini, Emiliano F; Escudero, Leticia B; Wuilloud, Rodolfo G

    2018-04-19

    A simple, highly efficient, batch, and centrifuge-less dispersive liquid-liquid microextraction method based on a magnetic ionic liquid (MIL-DLLME) and electrothermal atomic absorption spectrometry (ETAAS) detection was developed for ultra-trace Cd determination in honey. Initially, Cd(II) was chelated with ammonium diethyldithiophosphate (DDTP) at pH 0.5 followed by its extraction with the MIL trihexyl(tetradecyl)phosphonium tetrachloroferrate(III) ([P 6,6,6,14 ]FeCl 4 ) and acetonitrile as dispersant. The MIL phase containing the analyte was separated from the aqueous phase using only a magnet. A back-extraction procedure was applied to recover Cd from the MIL phase using diluted HNO 3 and this solution was directly injected into the graphite furnace of ETAAS instrument. An extraction efficiency of 93% and a sensitivity enhancement factor of 112 were obtained under optimal experimental conditions. The detection limit (LOD) was 0.4 ng L -1 Cd, while the relative standard deviation (RSD) was 3.8% (at 2 μg L -1 Cd and n = 10), calculated from the peak height of absorbance signals. This work reports the first application of the MIL [P 6,6,6,14 ]FeCl 4 along with the DLLME technique for the successful determination of Cd at trace levels in different honey samples. Graphical abstract Preconcentration of ultratraces of Cd in honey using a magnetic ionic liquid and dispersive liquid-liquid microextraction technique.

  5. Synthetic aperture radar target detection, feature extraction, and image formation techniques

    NASA Technical Reports Server (NTRS)

    Li, Jian

    1994-01-01

    This report presents new algorithms for target detection, feature extraction, and image formation with the synthetic aperture radar (SAR) technology. For target detection, we consider target detection with SAR and coherent subtraction. We also study how the image false alarm rates are related to the target template false alarm rates when target templates are used for target detection. For feature extraction from SAR images, we present a computationally efficient eigenstructure-based 2D-MODE algorithm for two-dimensional frequency estimation. For SAR image formation, we present a robust parametric data model for estimating high resolution range signatures of radar targets and for forming high resolution SAR images.

  6. Various extraction and analytical techniques for isolation and identification of secondary metabolites from Nigella sativa seeds.

    PubMed

    Liu, X; Abd El-Aty, A M; Shim, J-H

    2011-10-01

    Nigella sativa L. (black cumin), commonly known as black seed, is a member of the Ranunculaceae family. This seed is used as a natural remedy in many Middle Eastern and Far Eastern countries. Extracts prepared from N. sativa have, for centuries, been used for medical purposes. Thus far, the organic compounds in N. sativa, including alkaloids, steroids, carbohydrates, flavonoids, fatty acids, etc., have been fairly well characterized. Herein, we summarize newer extraction techniques, including microwave-assisted extraction (MAE) and supercritical fluid extraction (SFE), in addition to the classical method of hydrodistillation (HD), that have been employed for isolation, as well as the various analytical techniques used for the identification of secondary metabolites in black seed. We believe that some compounds contained in N. sativa remain to be identified, and that high-throughput screening could help to identify new compounds. A study addressing environmentally friendly techniques that have minimal or no environmental effects is currently underway in our laboratory.

  7. Complex investigation of extraction techniques applied for cyclitols and sugars isolation from different species of Solidago genus.

    PubMed

    Ratiu, Ileana-Andreea; Al-Suod, Hossam; Ligor, Magdalena; Ligor, Tomasz; Railean-Plugaru, Viorica; Buszewski, Bogusław

    2018-03-15

    Cyclitols are phytochemicals naturally occurring in plant material which have attracted increasing interest due to multiple medicinal attributes, among which the most important are their antidiabetic, antioxidant, and anticancer properties. Due to their valuable properties, sugars are used in the food industry as sweeteners, preservatives, texture modifiers, fermentation substrates, and flavoring and coloring agents. In this study, we report for the first time the quantitative analysis of sugars and cyclitols isolated from Solidago virgaurea L., which was used for the selection of the optimal solvent and extraction technique that can provide the best possible yield. Moreover, the quantities of sugars and cyclitols extracted from two other species, Solidago canadensis and Solidago gigantea, were investigated using the best extraction method and the most appropriate solvent. Comparative analysis of natural plant extracts obtained using five different techniques-maceration, Soxhlet extraction, pressurized liquid extraction, ultrasound-assisted extraction, and supercritical fluid extraction-was performed in order to determine the most suitable, efficient, and economically convenient extraction method. Three different solvents were used. Samples were analyzed by solid-phase extraction for purification and pre-concentration, followed by derivatization and GC-MS analysis. The highest efficiency, in terms of the total amount of compounds obtained, was reached by PLE when water was used as the solvent. The amount of D-pinitol was similar for every solvent and for all the extraction techniques involved. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Application of wavelet techniques for cancer diagnosis using ultrasound images: A Review.

    PubMed

    Sudarshan, Vidya K; Mookiah, Muthu Rama Krishnan; Acharya, U Rajendra; Chandran, Vinod; Molinari, Filippo; Fujita, Hamido; Ng, Kwan Hoong

    2016-02-01

    Ultrasound is an important and low-cost imaging modality used to study the internal organs of the human body and blood flow through blood vessels. It uses high-frequency sound waves to acquire images of internal organs and is used to screen normal, benign, and malignant tissues of various organs. Healthy and malignant tissues generate different ultrasound echoes; hence, ultrasound provides useful information about potential tumor tissues that can be analyzed for diagnostic purposes before therapeutic procedures. Ultrasound images are affected by speckle noise due to the air gap between the transducer probe and the body. The challenge is to design and develop robust image preprocessing, segmentation, and feature extraction algorithms to locate the tumor region and to extract subtle information from the isolated tumor region for diagnosis. This information can be revealed using a scale-space technique such as the Discrete Wavelet Transform (DWT), which decomposes an image into images at different scales using low-pass and high-pass filters. These filters help to identify details and sudden changes in intensity in the image, and these changes are reflected in the wavelet coefficients. Various texture, statistical, and image-based features can be extracted from these coefficients. The extracted features are subjected to statistical analysis to identify the significant features for discriminating normal and malignant ultrasound images using supervised classifiers. This paper presents a review of wavelet techniques used for preprocessing, segmentation, and feature extraction of breast, thyroid, ovarian, and prostate cancer using ultrasound images. Copyright © 2015 Elsevier Ltd. All rights reserved.
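
    To make the wavelet feature extraction step above concrete, the sketch below decomposes a (hypothetical) tumor region of interest with a 2-D DWT and summarizes each subband with simple statistics, assuming PyWavelets is available; the wavelet family, decomposition level, and feature set are illustrative choices, not the ones prescribed by the review.

```python
# Minimal sketch: 2-D DWT of a tumor ROI and simple per-subband statistics
# (energy, mean, standard deviation) that could feed a supervised classifier.
import numpy as np
import pywt

def dwt_features(roi, wavelet="db4", level=2):
    coeffs = pywt.wavedec2(roi, wavelet=wavelet, level=level)
    # coeffs = [approximation, (H, V, D) per level]; flatten into one list of subbands.
    subbands = [coeffs[0]] + [band for detail in coeffs[1:] for band in detail]
    features = []
    for band in subbands:
        band = np.asarray(band)
        features += [np.sum(band ** 2), band.mean(), band.std()]
    return np.array(features)

roi = np.random.rand(128, 128)        # stand-in for an isolated tumor region
print(dwt_features(roi).shape)        # three statistics per subband
```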

  9. Extraction of fault component from abnormal sound in diesel engines using acoustic signals

    NASA Astrophysics Data System (ADS)

    Dayong, Ning; Changle, Sun; Yongjun, Gong; Zengmeng, Zhang; Jiaoyi, Hou

    2016-06-01

    In this paper, a method for extracting fault components from abnormal acoustic signals and automatically diagnosing diesel engine faults is presented. The method, named the dislocation superimposed method (DSM), is based on the improved random decrement technique (IRDT), a differential function (DF), and correlation analysis (CA). The aim of DSM is to linearly superpose multiple segments of the abnormal acoustic signal, exploiting the waveform similarity of the faulty components. The method uses the sample points at which the abnormal sound first appears as the starting position of each segment. In this study, the abnormal sound belonged to the shock-type fault category; thus, a starting-position search method based on gradient variance was adopted. A similarity coefficient between two equally sized signals is presented; by comparing against this similarity measure, the extracted fault component can be judged automatically. The results show that this method is capable of accurately extracting the fault component from abnormal acoustic signals induced by shock-type faults, and the extracted component can be used to identify the fault type.
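
    A minimal sketch of the superposition and similarity steps described above, assuming NumPy and assuming the onset indices and segment length are already known; the gradient-variance onset search itself is not reproduced.

```python
# Align equally long segments at the detected onsets of the abnormal sound,
# superpose them linearly, and score similarity against a reference template.
import numpy as np

def superpose_segments(signal, onsets, seg_len):
    segments = np.stack([signal[i:i + seg_len] for i in onsets
                         if i + seg_len <= len(signal)])
    return segments.mean(axis=0)

def similarity(a, b):
    """Normalized correlation at zero lag between two equal-length signals."""
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float(np.mean(a * b))

# A fault would be flagged automatically when similarity(extracted, template)
# exceeds a chosen threshold for one of the known fault templates.
```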

  10. Ridge extraction from the time-frequency representation (TFR) of signals based on an image processing approach: application to the analysis of uterine electromyogram AR TFR.

    PubMed

    Terrien, Jérémy; Marque, Catherine; Germain, Guy

    2008-05-01

    Time-frequency representations (TFRs) of signals are increasingly being used in biomedical research. Analysis of such representations is sometimes difficult, however, and is often reduced to the extraction of ridges, or local energy maxima. In this paper, we describe a new ridge extraction method based on the image processing technique of active contours or snakes. We have tested our method on several synthetic signals and for the analysis of uterine electromyogram or electrohysterogram (EHG) recorded during gestation in monkeys. We have also evaluated a postprocessing algorithm that is especially suited for EHG analysis. Parameters are evaluated on real EHG signals in different gestational periods. The presented method gives good results when applied to synthetic as well as EHG signals. We have been able to obtain smaller ridge extraction errors when compared to two other methods specially developed for EHG. The gradient vector flow (GVF) snake method, or GVF-snake method, appears to be a good ridge extraction tool, which could be used on TFR of mono or multicomponent signals with good results.
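
    For contrast with the snake-based approach above, the sketch below shows a naive ridge extractor: the strongest frequency bin in each time slice of the TFR, followed by median smoothing. This is only a simple baseline under the stated assumptions (SciPy available, a frequency-by-time TFR array), not the GVF-snake method of the paper.

```python
# Naive ridge extraction: per-column energy maximum plus median filtering.
import numpy as np
from scipy.signal import medfilt

def extract_ridge(tfr, freqs, smooth=9):
    """tfr: 2-D array (frequency x time); returns one ridge frequency per time step."""
    ridge_idx = np.argmax(tfr, axis=0)               # strongest bin in each column
    ridge_freq = freqs[ridge_idx]
    return medfilt(ridge_freq, kernel_size=smooth)   # suppress isolated jumps
```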

  11. Alternative and Efficient Extraction Methods for Marine-Derived Compounds

    PubMed Central

    Grosso, Clara; Valentão, Patrícia; Ferreres, Federico; Andrade, Paula B.

    2015-01-01

    Marine ecosystems cover more than 70% of the globe’s surface. These habitats are occupied by a great diversity of marine organisms that produce highly structurally diverse metabolites as a defense mechanism. In recent decades, these metabolites have been extracted and isolated in order to test them in different bioassays and assess their potential to fight human diseases. Since traditional extraction techniques are both solvent- and time-consuming, this review emphasizes alternative extraction techniques, such as supercritical fluid extraction, pressurized solvent extraction, microwave-assisted extraction, ultrasound-assisted extraction, pulsed electric field-assisted extraction, enzyme-assisted extraction, and extraction with switchable solvents and ionic liquids, applied in the search for marine compounds. Only studies published in the 21st century are considered. PMID:26006714

  12. Characteristic-eddy decomposition of turbulence in a channel

    NASA Technical Reports Server (NTRS)

    Moin, Parviz; Moser, Robert D.

    1989-01-01

    Lumley's proper orthogonal decomposition technique is applied to the turbulent flow in a channel. Coherent structures are extracted by decomposing the velocity field into characteristic eddies with random coefficients. A generalization of the shot-noise expansion is used to determine the characteristic eddies in homogeneous spatial directions. Three different techniques are used to determine the phases of the Fourier coefficients in the expansion: (1) one based on the bispectrum, (2) a spatial compactness requirement, and (3) a functional continuity argument. Similar results are found from each of these techniques.
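
    As an illustration of the decomposition step (though not of the shot-noise expansion or the phase-recovery arguments above), the sketch below computes proper orthogonal modes from a snapshot matrix via the singular value decomposition; the data layout is an assumption.

```python
# Proper orthogonal decomposition of a snapshot matrix: columns are flattened
# instantaneous velocity fields, rows are spatial points.
import numpy as np

def pod_modes(snapshots, n_modes=5):
    mean = snapshots.mean(axis=1, keepdims=True)
    U, s, _ = np.linalg.svd(snapshots - mean, full_matrices=False)
    energy = s ** 2 / np.sum(s ** 2)          # fraction of fluctuation energy per mode
    return U[:, :n_modes], energy[:n_modes]
```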

  13. Construction of Green Tide Monitoring System and Research on its Key Techniques

    NASA Astrophysics Data System (ADS)

    Xing, B.; Li, J.; Zhu, H.; Wei, P.; Zhao, Y.

    2018-04-01

    Green tide, a kind of marine natural disaster, has appeared every year along the Qingdao coast since the large-scale bloom in 2008, bringing great losses to the region. It is therefore of great value to obtain real-time dynamic information about green tide distribution. In this study, optical remote sensing and microwave remote sensing methods are employed in green tide monitoring research. A specific remote sensing data processing flow and a green tide information extraction algorithm are designed according to the different characteristics of the optical and microwave data. For the extraction of green tide spatial distribution information, an automatic extraction algorithm for green tide distribution boundaries is designed based on the principles of mathematical morphology dilation/erosion. Key issues in information extraction, including the division of green tide regions, the derivation of basic distributions, the delimitation of distribution boundaries, and the elimination of islands, have been solved, and the automatic generation of green tide distribution boundaries from the results of remote sensing information extraction is realized. Finally, a green tide monitoring system is built based on IDL/GIS secondary development in an integrated RS and GIS environment, achieving the integration of RS monitoring and information extraction.
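
    A minimal sketch of the morphological boundary-extraction idea, assuming SciPy and a binary green-tide mask as input: small isolated patches are removed and the morphological gradient (dilation minus erosion) marks the distribution boundary. The patch-size threshold is illustrative only.

```python
import numpy as np
from scipy import ndimage

def distribution_boundary(mask, min_patch=50):
    """mask: boolean array of detected green-tide pixels."""
    # Eliminate small isolated patches ("islands") by labelling and size filtering.
    labels, n = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
    keep = np.isin(labels, np.where(sizes >= min_patch)[0] + 1)
    # Morphological gradient: dilated minus eroded marks the boundary pixels.
    return ndimage.binary_dilation(keep) & ~ndimage.binary_erosion(keep)
```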

  14. A mobile unit for memory retrieval in daily life based on image and sensor processing

    NASA Astrophysics Data System (ADS)

    Takesumi, Ryuji; Ueda, Yasuhiro; Nakanishi, Hidenobu; Nakamura, Atsuyoshi; Kakimori, Nobuaki

    2003-10-01

    We developed a mobile unit whose purpose is to support memory retrieval in daily life. In this paper, we describe two characteristic features of this unit: (1) behavior classification with an acceleration sensor, and (2) extraction of differences in the environment with image processing technology. In (1), by analyzing the power and frequency of an acceleration sensor oriented in the direction of gravity, the user's activities can be classified into walking, staying, and so on. In (2), by extracting the difference between the beginning scene and the ending scene of a stay with image processing, the change made by the user is recognized as a difference in the environment. Using these two techniques, specific scenes of daily life can be extracted, and important information at scene changes can be recorded. In particular, we describe the effectiveness of the unit in supporting the retrieval of important items, such as a thing left behind or a task left half finished.
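
    A minimal sketch of the behavior-classification idea in (1), assuming a fixed sampling rate and using only the signal power and the dominant frequency of the gravity-direction acceleration; the thresholds and the walking band are illustrative, not those of the unit.

```python
import numpy as np

def classify_activity(accel, fs=50.0, power_thresh=0.05, walk_band=(1.0, 3.0)):
    """accel: 1-D gravity-direction acceleration for one analysis window."""
    accel = accel - accel.mean()
    power = np.mean(accel ** 2)
    if power < power_thresh:
        return "stay"                                   # little motion energy
    spectrum = np.abs(np.fft.rfft(accel))
    freqs = np.fft.rfftfreq(len(accel), d=1.0 / fs)
    dominant = freqs[np.argmax(spectrum[1:]) + 1]       # skip the DC bin
    return "walk" if walk_band[0] <= dominant <= walk_band[1] else "other"
```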

  15. Peptides, Peptidomimetics, and Polypeptides from Marine Sources: A Wealth of Natural Sources for Pharmaceutical Applications

    PubMed Central

    Sable, Rushikesh; Parajuli, Pravin; Jois, Seetharama

    2017-01-01

    Nature provides a variety of peptides that are expressed in most living species. Evolutionary pressure and natural selection have created and optimized these peptides to bind to receptors with high affinity. Hence, natural resources provide an abundant chemical space to be explored in peptide-based drug discovery. Marine peptides can be extracted by simple solvent extraction techniques. The advancement of analytical techniques has made it possible to obtain pure peptides from natural resources. Extracted peptides have been evaluated as possible therapeutic agents for a wide range of diseases, including antibacterial, antifungal, antidiabetic and anticancer activity as well as cardiovascular and neurotoxin activity. Although marine resources provide thousands of possible peptides, only a few peptides derived from marine sources have reached the pharmaceutical market. This review focuses on some of the peptides derived from marine sources in the past ten years and gives a brief review of those that are currently in clinical trials or on the market. PMID:28441741

  16. A new shock wave assisted sandalwood oil extraction technique

    NASA Astrophysics Data System (ADS)

    Arunkumar, A. N.; Srinivasa, Y. B.; Ravikumar, G.; Shankaranarayana, K. H.; Rao, K. S.; Jagadeesh, G.

    A new shock wave assisted oil extraction technique for sandalwood has been developed in the Shock Waves Lab, IISc, Bangalore. The fragrant oil extracted from sandalwood finds a variety of applications in the medicine and perfumery industries. In the present method, sandalwood specimens (2.5 mm diameter and 25 mm in length) are subjected to shock wave loading (overpressure 15 bar) in a constant area shock tube before the sandal oil is extracted using a non-destructive oil extraction technique. The results from the study indicate that both the rate of extraction and the quantity of oil obtained from sandalwood samples exposed to shock waves are higher (by 15-40 percent) compared to the non-destructive oil extraction technique alone. The compressive squeezing of the interior oil pockets in the sandalwood specimens due to shock wave loading appears to be the main reason for the enhancement in the oil extraction rate. This is confirmed by the presence of warty structures in the cross-section and micro-fissures in the radial direction of the wood samples exposed to shock waves, observed in scanning electron microscopic investigations. In addition, gas chromatographic studies do not show any change in the quality of the sandal oil extracted from samples exposed to shock waves.

  17. The determination of acoustic reflection coefficients by using cepstral techniques, II: Extensions of the technique and considerations of accuracy

    NASA Astrophysics Data System (ADS)

    Bolton, J. S.; Gold, E.

    1986-10-01

    In a companion paper the cepstral technique for the measurement of reflection coefficients was described. In particular the concepts of extraction noise and extraction delay were introduced. They are considered further here, and, in addition, a means of extending the cepstral technique to accommodate surfaces having lengthy impulse responses is described. The character of extraction noise, a cepstral component which interferes with reflection measurements, is largely determined by the spectrum of the signal radiated from the source loudspeaker. Here the origin and effects of extraction noise are discussed and it is shown that inverse filtering techniques may be used to reduce extraction noise without making impractical demands of the electrical test signal or the source loudspeaker. The extraction delay, a factor which is introduced when removing the reflector impulse response from the power cepstrum, has previously been estimated by a cross-correlation technique. Here the importance of estimating the extraction delay accurately is emphasized by showing the effect of small spurious delays on the calculation of the normal impedance of a reflecting surface. The effects are shown to accord with theory, and it was found that the real part of the estimated surface normal impedance is very nearly maximized when the spurious delay is eliminated; this has suggested a new way of determining the extraction delay itself. Finally, the basic cepstral technique is suited only to the measurement of surfaces whose impulse responses are shorter than τ, the delay between the arrival of the direct and specularly reflected components at the measurement position. Here it is shown that this restriction can be eliminated, by using a process known as cepstral inversion, when the direct cepstrum has a duration less than τ and cepstral aliasing is insignificant. It is also possible to use this technique to deconvolve a signal from an echo sequence in the time domain, an operation previously associated with the complex cepstrum rather than with the power cepstrum as used here.
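
    To make the quantities discussed above concrete, the sketch below computes a power cepstrum (in which a reflection appears as a peak at its delay) and estimates the delay between a direct signal and the total signal by cross-correlation, assuming NumPy/SciPy; it does not reproduce the inverse-filtering or cepstral-inversion procedures of the paper.

```python
import numpy as np
from scipy.signal import correlate

def power_cepstrum(x):
    """Inverse FFT of the log power spectrum; an echo shows up as a peak at its delay."""
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    return np.fft.irfft(np.log(spectrum + 1e-12))

def arrival_delay(direct, total, fs):
    """Delay (s) of the reflected arrival estimated by cross-correlation."""
    xcorr = correlate(total, direct, mode="full")
    lag = np.argmax(xcorr) - (len(direct) - 1)
    return lag / fs
```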

  18. Determination of Three Organochlorine Pesticides in Aqueous Samples by Solid-Phase Extraction Based on Natural Nano Diatomite in Packed Syringe Coupled to Gas Chromatography-Mass Spectrometry.

    PubMed

    Taghani, Abdollah; Goudarzi, Nasser; Bagherian, Ghadamali; Chamjangali, Mansour Arab

    2017-01-01

    A rapid, simple, and sensitive technique is proposed based on a miniaturized solid-phase extraction method, named microextraction in a packed syringe, coupled with gas chromatography-mass spectrometry for the preconcentration and determination of three organochlorine pesticides: hexachlorobenzene, heptachlor, and aldrin in aqueous samples. For the first time, natural nano diatomite is used as the sorbent. In this technique, 6.0 mg of the nano sorbent is inserted in a syringe between two polypropylene frits. The analytes are adsorbed on the solid phase and subsequently eluted using organic solvents. The influence of several important parameters, such as the solution pH, the type and volume of the organic desorption solvent, and the amount of sorbent, on the extraction efficiency of the selected pesticides is investigated. The proposed method shows good linearity in the range of 0.1-40.0 μg L-1 and low limits of detection in the range of 0.02-0.13 μg L-1 using the selected ion-monitoring mode. The reproducibility of this method was found to be in the range of 3.5-11.1% for the pesticides under study. In order to evaluate the matrix effect, the developed method is also applied to the preconcentration and determination of the selected pesticides in different water samples.

  19. Infrared moving small target detection based on saliency extraction and image sparse representation

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaomin; Ren, Kan; Gao, Jin; Li, Chaowei; Gu, Guohua; Wan, Minjie

    2016-10-01

    Moving small target detection in infrared images is a crucial technique for infrared search and tracking systems. This paper presents a novel small target detection technique based on frequency-domain saliency extraction and image sparse representation. First, we exploit the features of the Fourier spectrum image and the magnitude spectrum of the Fourier transform to make a rough extraction of saliency regions, and use threshold segmentation to separate the regions that appear salient from the background, which yields a binary image. Second, a new patch-image model and an over-complete dictionary are introduced to the detection system, and the infrared small target detection is converted into an optimization problem of reconstructing patch-image information based on sparse representation. More specifically, the test image and the binary image are decomposed into image patches following certain rules. We select potential target areas according to the binary patch-image, which contains salient region information, and then exploit the over-complete infrared small target dictionary to reconstruct the test image blocks that may contain targets; the coefficients of a target image patch satisfy sparsity constraints. Finally, for image sequences, the Euclidean distance is used to reduce the false alarm ratio and increase the detection accuracy for moving small targets in infrared images, based on the correlation of target positions between frames.
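
    The frequency-domain saliency step above can be illustrated with the well-known spectral-residual formulation, used here only as a stand-in for the Fourier-spectrum features of the paper; SciPy is assumed, and the smoothing sizes and threshold are illustrative.

```python
import numpy as np
from scipy.ndimage import uniform_filter, gaussian_filter

def saliency_map(img):
    spectrum = np.fft.fft2(img)
    log_amp = np.log(np.abs(spectrum) + 1e-12)
    phase = np.angle(spectrum)
    residual = log_amp - uniform_filter(log_amp, size=3)     # spectral residual
    sal = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
    return gaussian_filter(sal, sigma=2)

def binary_target_mask(img, k=3.0):
    sal = saliency_map(img)
    return sal > sal.mean() + k * sal.std()                  # threshold segmentation
```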

  20. Development of On-Line High Performance Liquid Chromatography (HPLC)-Biochemical Detection Methods as Tools in the Identification of Bioactives

    PubMed Central

    Malherbe, Christiaan J.; de Beer, Dalene; Joubert, Elizabeth

    2012-01-01

    Biochemical detection (BCD) methods are commonly used to screen plant extracts for specific biological activities in batch assays. Traditionally, bioactives in the most active extracts were identified through time-consuming bio-assay guided fractionation until single active compounds could be isolated. Not only are isolation procedures often tedious, but they could also lead to artifact formation. On-line coupling of BCD assays to high performance liquid chromatography (HPLC) is gaining ground as a high resolution screening technique to overcome problems associated with pre-isolation by measuring the effects of compounds post-column directly after separation. To date, several on-line HPLC-BCD assays, applied to whole plant extracts and mixtures, have been published. In this review the focus will fall on enzyme-based, receptor-based and antioxidant assays. PMID:22489144

  1. Extraction and Chromatographic Determination of Shikimic Acid in Chinese Conifer Needles with 1-Benzyl-3-methylimidazolium Bromide Ionic Liquid Aqueous Solutions

    PubMed Central

    Chen, Fengli; Hou, Kexin; Li, Shuangyang; Zu, Yuangang; Yang, Lei

    2014-01-01

    An ionic liquids-based ultrasound-assisted extraction (ILUAE) method was successfully developed for extracting shikimic acid from conifer needles. Eleven 1-alkyl-3-methylimidazolium ionic liquids with different cations and anions were investigated and 1-benzyl-3-methylimidazolium bromide solution was selected as the solvent. The conditions for ILUAE, including the ionic liquid concentration, ultrasound power, ultrasound time, and liquid-solid ratio, were optimized. The proposed method had good recovery (99.37%–100.11%) and reproducibility (RSD, n = 6; 3.6%). ILUAE was an efficient, rapid, and simple sample preparation technique that showed high reproducibility. Based on the results, a number of plant species, namely, Picea koraiensis, Picea meyeri, Pinus elliottii, and Pinus banksiana, were identified as among the best resources of shikimic acid. PMID:24782942

  2. A face and palmprint recognition approach based on discriminant DCT feature extraction.

    PubMed

    Jing, Xiao-Yuan; Zhang, David

    2004-12-01

    In the field of image processing and recognition, the discrete cosine transform (DCT) and linear discrimination are two widely used techniques. Based on them, we present a new face and palmprint recognition approach in this paper. It first uses a two-dimensional separability judgment to select the DCT frequency bands with favorable linear separability. Then, from the selected bands, it extracts linear discriminative features by an improved Fisherface method and performs classification with the nearest neighbor classifier. We analyze in detail the theoretical advantages of our approach in feature extraction. Experiments on face databases and a palmprint database demonstrate that, compared to state-of-the-art linear discrimination methods, our approach obtains better classification performance. It can significantly improve the recognition rates for face and palmprint data and effectively reduce the dimension of the feature space.
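
    The general idea can be sketched as follows, assuming SciPy and scikit-learn: keep a block of low-frequency 2-D DCT coefficients as features, then classify with linear discriminant analysis followed by a nearest-neighbor rule. The separability-based band selection and the improved Fisherface step of the paper are replaced here by a fixed low-frequency block, and the image data are synthetic stand-ins.

```python
import numpy as np
from scipy.fftpack import dct
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

def dct_features(img, keep=6):
    """2-D DCT; the top-left keep x keep block holds the low-frequency coefficients."""
    coeffs = dct(dct(img, axis=0, norm="ortho"), axis=1, norm="ortho")
    return coeffs[:keep, :keep].ravel()

rng = np.random.default_rng(0)
train_images = rng.random((40, 64, 64))        # hypothetical 64x64 face/palm crops
train_labels = np.repeat(np.arange(10), 4)     # hypothetical subject identities
X_train = np.array([dct_features(img) for img in train_images])

model = make_pipeline(LinearDiscriminantAnalysis(), KNeighborsClassifier(n_neighbors=1))
model.fit(X_train, train_labels)
```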

  3. Hybrid Feature Extraction-based Approach for Facial Parts Representation and Recognition

    NASA Astrophysics Data System (ADS)

    Rouabhia, C.; Tebbikh, H.

    2008-06-01

    Face recognition is a specialized image processing task that has attracted considerable attention in computer vision. In this article, we develop a new facial recognition system, based on video sequence images, dedicated to the identification of persons whose faces are partly occluded. This system is based on a hybrid image feature extraction technique called ACPDL2D (Rouabhia et al. 2007), which combines two-dimensional principal component analysis and two-dimensional linear discriminant analysis with a neural network. We performed the feature extraction task on the eye and nose images separately, and then a multi-layer perceptron classifier was used. Compared to the whole face, the simulation results are in favor of the facial parts in terms of memory capacity and recognition rate (99.41% for the eyes, 98.16% for the nose, and 97.25% for the whole face).

  4. Determination of Activity Coefficients of di-(2-ethylhexyl) Phosphoric Acid Dimer in Select Organic Solvents Using Vapor Phase Osmometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michael F. Gray; Peter Zalupski; Mikael Nilsson

    2013-08-01

    Effective models for solvent extraction require accurate characterization of the nonideality effects for each component, including the extractants. In this study, the nonideal behavior of the industrial extractant di(2-ethylhexyl) phosphoric acid has been investigated using vapor pressure osmometry (VPO). From the osmometry data, activity coefficients for the HDEHP dimer were obtained based on a formulation of the regular solution theory of Scatchard and Hildebrand, and the Margules two- and three-suffix equations. The results show similarity with a slope-analysis based relation from previous literature, although important differences are highlighted. The work points towards VPO as a useful technique for this type of study, but care must be taken with the choice of standard and method of analysis.

  5. Solid-Phase Extraction Coupled to a Paper-Based Technique for Trace Copper Detection in Drinking Water.

    PubMed

    Quinn, Casey W; Cate, David M; Miller-Lionberg, Daniel D; Reilly, Thomas; Volckens, John; Henry, Charles S

    2018-03-20

    Metal contamination of natural and drinking water systems poses hazards to public and environmental health. Quantifying metal concentrations in water typically requires sample collection in the field followed by expensive laboratory analysis that can take days to weeks to obtain results. The objective of this work was to develop a low-cost, field-deployable method to quantify trace levels of copper in drinking water by coupling solid-phase extraction/preconcentration with a microfluidic paper-based analytical device. This method has the advantages of being hand-powered (instrument-free) and using a simple "read by eye" quantification motif (based on color distance). Tap water samples collected across Fort Collins, CO, were tested with this method and validated against ICP-MS. We demonstrate the ability to quantify the copper content of tap water within 30% of a reference technique at levels ranging from 20 to 500 000 ppb. The application of this technology, which should be sufficient as a rapid screening tool, can lead to faster, more cost-effective detection of soluble metals in water systems.

  6. A comparison of various modes of liquid-liquid based microextraction techniques: determination of picric acid.

    PubMed

    Burdel, Martin; Šandrejová, Jana; Balogh, Ioseph S; Vishnikin, Andriy; Andruch, Vasil

    2013-03-01

    Three modes of liquid-liquid based microextraction techniques--namely auxiliary solvent-assisted dispersive liquid-liquid microextraction, auxiliary solvent-assisted dispersive liquid-liquid microextraction with low-solvent consumption, and ultrasound-assisted emulsification microextraction--were compared. Picric acid was used as the model analyte. The determination is based on the reaction of picric acid with Astra Phloxine reagent to produce an ion associate easily extractable by various organic solvents, followed by spectrophotometric detection at 558 nm. Each of the compared procedures has both advantages and disadvantages. The main benefit of ultrasound-assisted emulsification microextraction is that no hazardous chlorinated extraction solvents and no dispersive solvent are necessary. Therefore, this procedure was selected for validation. Under optimized experimental conditions (pH 3, 7 × 10⁻⁵ mol/L of Astra Phloxine, and 100 μL of toluene), the calibration plot was linear in the range of 0.02-0.14 mg/L and the LOD was 7 μg/L of picric acid. The developed procedure was applied to the analysis of spiked water samples. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Maternal and fetal effect of misgav ladach cesarean section in nigerian women: a randomized control study.

    PubMed

    Ezechi, Oc; Ezeobi, Pm; Gab-Okafor, Cv; Edet, A; Nwokoro, Ca; Akinlade, A

    2013-10-01

    The poor utilisation of the Misgav-Ladach (ML) caesarean section method in our environment, despite its proven advantages, has been attributed to several factors, including the lack of local evaluation. A well-designed and conducted trial is needed to provide evidence to convince clinicians of its advantage over Pfannenstiel-based methods. To evaluate the outcome of ML-based caesarean section among Nigerian women, a randomised controlled open-label study of 323 women undergoing primary caesarean section in Lagos, Nigeria, was conducted. The women were randomised to either the ML method or a Pfannenstiel-based (PB) caesarean section technique using computer-generated random numbers. The mean duration of surgery (P < 0.001), time to first bowel motion (P = 0.01) and time to ambulation (P < 0.001) were significantly shorter in the ML group compared to the PB group. Postoperative anaemia (P < 0.01), analgesic needs (P = 0.02), extra suture use, estimated blood loss (P < 0.01) and postoperative complications (P = 0.001) were significantly lower in the ML group compared to the PB group. Though the mean hospital stay was shorter in the ML group (5.8 days as against 6.0 days), the difference was not statistically significant (P = 0.17). Of the fetal outcome measures compared, only the fetal extraction time differed significantly between the two groups (P = 0.001); the mean fetal extraction time was 162 sec in the ML group compared to 273 sec in the PB group. This study confirmed the established benefits of the ML technique in Nigerian women as regards postoperative outcomes, duration of surgery, and fetal extraction time. The technique is recommended to clinicians, as its superior maternal and fetal outcomes and cost-saving advantages make it appropriate for use in resource-poor settings.

  8. Alternative oil extraction methods from Echium plantagineum L. seeds using advanced techniques and green solvents.

    PubMed

    Castejón, Natalia; Luna, Pilar; Señoráns, Francisco J

    2018-04-01

    The edible oil processing industry involves large losses of organic solvent into the atmosphere and long extraction times. In this work, fast and environmentally friendly alternatives for the production of echium oil using green solvents are proposed. Advanced extraction techniques such as Pressurized Liquid Extraction (PLE), Microwave Assisted Extraction (MAE) and Ultrasound Assisted Extraction (UAE) were evaluated to efficiently extract omega-3 rich oil from Echium plantagineum seeds. Extractions were performed with ethyl acetate, ethanol, water and ethanol:water to develop a hexane-free processing method. Optimal PLE conditions with ethanol at 150 °C for 10 min produced an oil yield (31.2%) very similar to that of Soxhlet extraction with hexane for 8 h (31.3%). The optimized UAE method with ethanol at mild conditions (55 °C) produced a high oil yield (29.1%). Consequently, the advanced extraction techniques showed good lipid yields and, furthermore, the echium oil produced had the same omega-3 fatty acid composition as traditionally extracted oil. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Extraction of bioactive carbohydrates from artichoke (Cynara scolymus L.) external bracts using microwave assisted extraction and pressurized liquid extraction.

    PubMed

    Ruiz-Aceituno, Laura; García-Sarrió, M Jesús; Alonso-Rodriguez, Belén; Ramos, Lourdes; Sanz, M Luz

    2016-04-01

    Microwave assisted extraction (MAE) and pressurized liquid extraction (PLE) methods using water as solvent have been optimized by means of Box-Behnken and 3² composite experimental designs, respectively, for the effective extraction of bioactive carbohydrates (inositols and inulin) from artichoke (Cynara scolymus L.) external bracts. MAE at 60 °C for 3 min on 0.3 g of sample allowed the extraction of slightly higher concentrations of inositol than PLE at 75 °C for 26.7 min (11.6 mg/g dry sample vs. 7.6 mg/g dry sample). On the contrary, under these conditions, higher concentrations of inulin were extracted with the latter technique (185.4 mg/g vs. 96.4 mg/g dry sample), considering two successive extraction cycles for both techniques. Both methodologies can be considered appropriate for the simultaneous extraction of these bioactive carbohydrates from this particular industrial by-product. To the best of our knowledge, this is the first time that these techniques have been applied for this purpose. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Antibacterial Activities of Jatropha curcas (LINN) on Coliforms Isolated from Surface Waters in Akure, Nigeria

    PubMed Central

    Dada, E. O.; Ekundayo, F. O.; Makanjuola, O. O.

    2014-01-01

    This study investigated the antibacterial activities of hot water, ethanol and acetone extracts of Jatropha curcas (LINN) leaves on coliforms isolated from surface waters, using growth inhibition indices based on an agar plate technique. The percentage recovery of the extracts was 19.17%, 18.10% and 18.80% for hot water, ethanol and acetone, respectively. Phytochemical screening of the extracts was also carried out. Qualitative phytochemical screening showed that the plant extracts contained steroids, tannins, flavonoids and cardiac glycosides, while alkaloids, phlobatannins, terpenoids and anthraquinones were absent. Only the ethanolic extract did not possess saponins. The aqueous extract of J. curcas compared most favourably with the standard antibiotic (gentamycin) on all the coliform bacteria except K. pneumoniae and E. coli, likely owing to a measurably higher antibacterial activity than the organic extracts. The minimum inhibitory concentration of the aqueous extract ranged from 3.00 to 7.00 mg/L, while the minimum bactericidal concentration ranged from 4.00 to 10.00 mg/L. The aqueous extract of J. curcas could be used as an antibacterial agent against diseases caused by coliforms. PMID:24711746

  11. Comparison of various techniques for the extraction of umbelliferone and herniarin in Matricaria chamomilla processing fractions.

    PubMed

    Molnar, Maja; Mendešević, Nikolina; Šubarić, Drago; Banjari, Ines; Jokić, Stela

    2017-08-05

    Chamomile, a well-known medicinal plant, is a rich source of bioactive compounds, among which two coumarin derivatives, umbelliferone and herniarin, are often found in its extracts. Chamomile extracts have found different uses in the cosmetic industry, as has umbelliferone itself, which, due to its strong absorption of UV light, is usually added to sunscreens, while herniarin (7-methoxycoumarin) is also known for its biological activity. Therefore, chamomile extracts with a certain herniarin and umbelliferone content could be of interest for application in pharmaceutical and cosmetic products. The aim of this study was to compare extracts of different chamomile fractions (unprocessed chamomile flowers first class, processed chamomile flowers first class, pulvis and processing waste) and to identify the best material and extraction method for obtaining herniarin and umbelliferone. Various extraction techniques such as Soxhlet extraction, hydrodistillation, maceration and supercritical CO2 extraction were used in this study. Umbelliferone and herniarin content was determined by high performance liquid chromatography (HPLC). The highest yields of umbelliferone (11.80 mg/100 g) and herniarin (82.79 mg/100 g) were obtained from chamomile processing waste using the maceration technique with 50% aqueous ethanol solution, and this extract also proved to possess antioxidant activity (61.5% DPPH scavenging activity). This study shows the potential for utilization of chamomile processing waste by applying different extraction techniques.

  12. Efficacy Evaluation of Different Wavelet Feature Extraction Methods on Brain MRI Tumor Detection

    NASA Astrophysics Data System (ADS)

    Nabizadeh, Nooshin; John, Nigel; Kubat, Miroslav

    2014-03-01

    Automated Magnetic Resonance Imaging brain tumor detection and segmentation is a challenging task. Among the different available methods, feature-based methods are dominant. While many feature extraction techniques have been employed, it is still not clear which feature extraction method should be preferred. To help improve the situation, we present the results of a study in which we evaluate the efficiency of different wavelet transform feature extraction methods for brain MRI abnormality detection. Using T1-weighted brain images, Discrete Wavelet Transform (DWT), Discrete Wavelet Packet Transform (DWPT), Dual Tree Complex Wavelet Transform (DTCWT), and Complex Morlet Wavelet Transform (CMWT) methods are applied to construct the feature pool. Three classifiers, namely Support Vector Machine, K Nearest Neighborhood, and a Sparse Representation-Based Classifier, are applied and compared for classifying the selected features. The results show that DTCWT and CMWT features classified with SVM result in the highest classification accuracy, demonstrating the capability of wavelet transform features to be informative in this application.
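
    The classifier-comparison step above can be sketched with scikit-learn as follows; the feature matrix and labels are synthetic stand-ins for the wavelet feature pool, and only two of the three classifiers mentioned are shown.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.normal(size=(120, 40))        # stand-in for wavelet feature vectors
y = rng.integers(0, 2, size=120)      # stand-in labels: 0 = normal, 1 = abnormal

for name, clf in [("SVM", SVC(kernel="rbf")), ("kNN", KNeighborsClassifier(5))]:
    scores = cross_val_score(make_pipeline(StandardScaler(), clf), X, y, cv=5)
    print(f"{name}: accuracy {scores.mean():.3f} +/- {scores.std():.3f}")
```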

  13. ChemBrowser: a flexible framework for mining chemical documents.

    PubMed

    Wu, Xian; Zhang, Li; Chen, Ying; Rhodes, James; Griffin, Thomas D; Boyer, Stephen K; Alba, Alfredo; Cai, Keke

    2010-01-01

    The ability to extract chemical and biological entities and relations from text documents automatically has great value to biochemical research and development activities. The growing maturity of text mining and artificial intelligence technologies shows promise in enabling such automatic chemical entity extraction capabilities (called "Chemical Annotation" in this paper). Many techniques have been reported in the literature, ranging from dictionary and rule-based techniques to machine learning approaches. In practice, we found that no single technique works well in all cases. A combinatorial approach that allows one to quickly compose different annotation techniques together for a given situation is most effective. In this paper, we describe the key challenges we face in real-world chemical annotation scenarios. We then present a solution called ChemBrowser which has a flexible framework for chemical annotation. ChemBrowser includes a suite of customizable processing units that might be utilized in a chemical annotator, a high-level language that describes the composition of various processing units that would form a chemical annotator, and an execution engine that translates the composition language to an actual annotator that can generate annotation results for a given set of documents. We demonstrate the impact of this approach by tailoring an annotator for extracting chemical names from patent documents and show how this annotator can be easily modified with simple configuration alone.
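
    The composability idea can be illustrated with a small sketch in the same spirit, though it is not the actual ChemBrowser language or API: each annotator is a callable returning (start, end, label) spans, and a combinator merges their results; the dictionary terms and the pattern are illustrative.

```python
import re
from typing import Callable, List, Tuple

Span = Tuple[int, int, str]
Annotator = Callable[[str], List[Span]]

def dictionary_annotator(terms: List[str]) -> Annotator:
    def annotate(text: str) -> List[Span]:
        return [(m.start(), m.end(), "CHEMICAL")
                for term in terms
                for m in re.finditer(re.escape(term), text, flags=re.IGNORECASE)]
    return annotate

def rule_annotator(pattern: str) -> Annotator:
    def annotate(text: str) -> List[Span]:
        return [(m.start(), m.end(), "CHEMICAL") for m in re.finditer(pattern, text)]
    return annotate

def compose(*annotators: Annotator) -> Annotator:
    def annotate(text: str) -> List[Span]:
        merged = {span for ann in annotators for span in ann(text)}
        return sorted(merged)
    return annotate

annotator = compose(dictionary_annotator(["benzene", "toluene"]),
                    rule_annotator(r"\b\w+ acid\b"))
print(annotator("The patent claims toluene and acetic acid derivatives."))
```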

  14. Use of advanced techniques for the extraction of phenolic compounds from Tunisian olive leaves: phenolic composition and cytotoxicity against human breast cancer cells.

    PubMed

    Taamalli, Amani; Arráez-Román, David; Barrajón-Catalán, Enrique; Ruiz-Torres, Verónica; Pérez-Sánchez, Almudena; Herrero, Miguel; Ibañez, Elena; Micol, Vicente; Zarrouk, Mokhtar; Segura-Carretero, Antonio; Fernández-Gutiérrez, Alberto

    2012-06-01

    A comparison among different advanced extraction techniques such as microwave-assisted extraction (MAE), supercritical fluid extraction (SFE) and pressurized liquid extraction (PLE), together with traditional solid-liquid extraction, was performed to test their efficiency towards the extraction of phenolic compounds from leaves of six Tunisian olive varieties. Extractions were carried out at the best selected conditions for each technique; the obtained extracts were chemically characterized using high-performance liquid chromatography (HPLC) coupled to electrospray time-of-flight mass spectrometry (ESI-TOF-MS) and electrospray ion trap tandem mass spectrometry (ESI-IT-MS(2)). As expected, higher extraction yields were obtained for PLE while phenolic profiles were mainly influenced by the solvent used as optimum in the different extraction methods. A larger number of phenolic compounds, mostly of a polar character, were found in the extracts obtained by using MAE. Best extraction yields do not correlate with highest cytotoxic activity against breast cancer cells, indicating that cytotoxicity is highly dependent on the presence of certain compounds in the extracts, although not exclusively on a single compound. Therefore, a multifactorial behavior is proposed for the anticancer activity of olive leaf compounds. Copyright © 2012 Elsevier Ltd. All rights reserved.

  15. Video multiple watermarking technique based on image interlacing using DWT.

    PubMed

    Ibrahim, Mohamed M; Abdel Kader, Neamat S; Zorkany, M

    2014-01-01

    Digital watermarking is one of the important techniques for securing digital media files in the domains of data authentication and copyright protection. In nonblind watermarking systems, the need for the original host file in the watermark recovery operation imposes an overhead on system resources, doubling the required memory capacity and communications bandwidth. In this paper, a robust video multiple watermarking technique is proposed to solve this problem. This technique is based on image interlacing. In this technique, a three-level discrete wavelet transform (DWT) is used as the watermark embedding/extracting domain, the Arnold transform is used as the watermark encryption/decryption method, and different types of media (gray image, color image, and video) are used as watermarks. The robustness of this technique is tested by applying different types of attacks, such as geometric, noising, format-compression, and image-processing attacks. The simulation results show the effectiveness and good performance of the proposed technique in saving system resources, memory capacity, and communications bandwidth.
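
    For orientation only, the sketch below shows generic additive watermark embedding and recovery in a single DWT subband, assuming PyWavelets; it uses one decomposition level on a grayscale frame, omits the Arnold-transform encryption, and its recovery step is of the nonblind kind that the interlacing scheme above is designed to avoid.

```python
import numpy as np
import pywt

def embed(frame, watermark, alpha=0.05):
    cA, (cH, cV, cD) = pywt.dwt2(frame, "haar")
    wm = np.resize(watermark, cH.shape)           # fit the watermark to the subband
    return pywt.idwt2((cA, (cH + alpha * wm, cV, cD)), "haar")

def recover(watermarked, original, alpha=0.05):
    _, (cH_w, _, _) = pywt.dwt2(watermarked, "haar")
    _, (cH_o, _, _) = pywt.dwt2(original, "haar")
    return (cH_w - cH_o) / alpha                  # nonblind: needs the host frame
```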

  16. Microwave-Assisted Extraction of Fucoidan from Marine Algae.

    PubMed

    Mussatto, Solange I

    2015-01-01

    Microwave-assisted extraction (MAE) is a technique that can be applied to extract compounds from different natural resources. In this chapter, the use of this technique to extract fucoidan from marine algae is described. The method involves a closed MAE system, ultrapure water as extraction solvent, and suitable conditions of time, pressure, and algal biomass/water ratio. By using this procedure under the specified conditions, the penetration of the electromagnetic waves into the material structure occurs in an efficient manner, generating a distributed heat source that promotes the fucoidan extraction from the algal biomass.

  17. Using time-frequency analysis to determine time-resolved detonation velocity with microwave interferometry.

    PubMed

    Kittell, David E; Mares, Jesus O; Son, Steven F

    2015-04-01

    Two time-frequency analysis methods based on the short-time Fourier transform (STFT) and continuous wavelet transform (CWT) were used to determine time-resolved detonation velocities with microwave interferometry (MI). The results were directly compared to well-established analysis techniques consisting of a peak-picking routine as well as a phase unwrapping method (i.e., quadrature analysis). The comparison is conducted on experimental data consisting of transient detonation phenomena observed in triaminotrinitrobenzene and ammonium nitrate-urea explosives, representing high and low quality MI signals, respectively. Time-frequency analysis proved much more capable of extracting useful and highly resolved velocity information from low quality signals than the phase unwrapping and peak-picking methods. Additionally, control of the time-frequency methods is mainly constrained to a single parameter which allows for a highly unbiased analysis method to extract velocity information. In contrast, the phase unwrapping technique introduces user based variability while the peak-picking technique does not achieve a highly resolved velocity result. Both STFT and CWT methods are proposed as improved additions to the analysis methods applied to MI detonation experiments, and may be useful in similar applications.
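
    A minimal sketch of the STFT-based branch is shown below, assuming SciPy: the dominant beat frequency is tracked frame by frame and converted to velocity using the common microwave-interferometry relation v = f·λ/2, where λ is the microwave wavelength in the material; the relation and all parameter values are stated here as assumptions, not as the paper's exact processing.

```python
import numpy as np
from scipy.signal import stft

def velocity_from_stft(signal, fs, wavelength, nperseg=256):
    freqs, times, Z = stft(signal, fs=fs, nperseg=nperseg)
    ridge = freqs[np.argmax(np.abs(Z), axis=0)]    # dominant beat frequency per frame
    return times, ridge * wavelength / 2.0         # beat frequency -> detonation velocity
```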

  18. Noncontact methods for measuring water-surface elevations and velocities in rivers: Implications for depth and discharge extraction

    USGS Publications Warehouse

    Nelson, Jonathan M.; Kinzel, Paul J.; McDonald, Richard R.; Schmeeckle, Mark

    2016-01-01

    Recently developed optical and videographic methods for measuring water-surface properties in a noninvasive manner hold great promise for extracting river hydraulic and bathymetric information. This paper describes such a technique, concentrating on the method of infrared videography for measuring surface velocities and both acoustic (laboratory-based) and laser-scanning (field-based) techniques for measuring water-surface elevations. In ideal laboratory situations with simple flows, appropriate spatial and temporal averaging results in accurate water-surface elevations and water-surface velocities. In test cases, this accuracy is sufficient to allow direct inversion of the governing equations of motion to produce estimates of depth and discharge. Unlike other optical techniques for determining local depth that rely on transmissivity of the water column (bathymetric lidar, multi/hyperspectral correlation), this method uses only water-surface information, so even deep and/or turbid flows can be investigated. However, significant errors arise in areas of nonhydrostatic spatial accelerations, such as those associated with flow over bedforms or other relatively steep obstacles. Using laboratory measurements for test cases, the cause of these errors is examined and both a simple semi-empirical method and computational results are presented that can potentially reduce bathymetric inversion errors.

  19. Evaluation of economically feasible, natural plant extract-based microbiological media for producing biomass of the dry rot biocontrol strain Pseudomonas fluorescens P22Y05 in liquid culture.

    PubMed

    Khalil, Sadia; Ali, Tasneem Adam; Skory, Chris; Slininger, Patricia J; Schisler, David A

    2016-02-01

    The production of microbial biomass in liquid media often represents an indispensable step in the research and development of bacterial and fungal strains. Costs of commercially prepared nutrient media or purified media components, however, can represent a significant hurdle to conducting research in locations where obtaining these products is difficult. A less expensive option for providing components essential to microbial growth in liquid culture is the use of extracts of fresh or dried plant products obtained by using hot water extraction techniques. A total of 13 plant extract-based media were prepared from a variety of plant fruits, pods or seeds of plant species including Allium cepa (red onion bulb), Phaseolus vulgaris (green bean pods), and Lens culinaris (lentil seeds). In shake flask tests, cell production by potato dry rot antagonist Pseudomonas fluorescens P22Y05 in plant extract-based media was generally statistically indistinguishable from that in commercially produced tryptic soy broth and nutrient broth as measured by optical density and colony forming units/ml produced (P ≤ 0.05, Fisher's protected LSD). The efficacy of biomass produced in the best plant extract-based media or commercial media was equivalent in reducing Fusarium dry rot by 50-96% compared to controls. In studies using a high-throughput microbioreactor, logarithmic growth of P22Y05 in plant extract-based media initiated in 3-5 h in most cases but specific growth rate and the time of maximum OD varied as did the maximum pH obtained in media. Nutrient analysis of selected media before and after cell growth indicated that nitrogen in the form of NH4 accumulated in culture supernatants, possibly due to unbalanced growth conditions brought on by a scarcity of simple sugars in the media tested. The potential of plant extract-based media to economically produce biomass of microbes active in reducing plant disease is considerable and deserves further research.

  20. Patient's pain perception during mandibular molar extraction with articaine: a comparison study between infiltration and inferior alveolar nerve block.

    PubMed

    Bataineh, Anwar B; Alwarafi, Majid A

    2016-11-01

    The aim of this study was to investigate the effectiveness of a local anesthetic agent comprising 4% articaine with 1:100,000 adrenaline, administered through an infiltration technique prior to the extraction of mandibular permanent first molar teeth. The study adopted a split-mouth approach and involved patients who needed simple extractions of permanent mandibular first molar teeth on both sides. A combination of buccal and lingual infiltrations was used on one side, while the conventional inferior alveolar nerve block (IANB) technique, with a 1.8-ml cartridge of 4% articaine with 1:100,000 epinephrine, was administered on the other. The patients' pain perception was assessed using a visual analogue scale (VAS) and a verbal rating scale (VRS) after the injection, followed by extraction. As part of the study, 104 teeth were extracted from the mouths of 52 patients. The difference between the two techniques in pain perception during the local anesthetic injection was statistically insignificant (p > .05). The difference between the two techniques in pain perception during the extraction was also statistically insignificant (p > .05). No difference in pain perception between the two techniques was noted in the study population. This indicates that the extraction of permanent mandibular first molar teeth is possible without the administration of an IANB when 4% articaine with 1:100,000 epinephrine is used. The buccal and lingual infiltrations are slightly less painful than the conventional IANB technique.

  1. Comparative Evaluation of Pavement Crack Detection Using Kernel-Based Techniques in Asphalt Road Surfaces

    NASA Astrophysics Data System (ADS)

    Miraliakbari, A.; Sok, S.; Ouma, Y. O.; Hahn, M.

    2016-06-01

    With the increasing demand for digital survey and acquisition of road pavement conditions, there is a parallel growing need for the development of automated techniques for the analysis and evaluation of actual road conditions. This is due in part to the large volumes of road pavement data captured through digital surveys, and also to the requirements for rapid data processing and evaluation. In this study, a Canon 5D Mark II RGB camera with a resolution of 21 megapixels is used for road pavement condition mapping. Even though many imaging and mapping sensors are available, the development of automated pavement distress detection, recognition and extraction systems remains a challenge. In order to detect and extract pavement cracks, a comparative evaluation of kernel-based segmentation methods comprising line filtering (LF), local binary patterns (LBP) and high-pass filtering (HPF) is carried out. While the LF and LBP methods are based on the principle of rotation invariance for pattern matching, HPF applies the same principle for filtering, but with a rotation-invariant matrix. With respect to processing speed, HPF is fastest because it is based on a single kernel, whereas LF and LBP are based on several kernels. Experiments with 20 sample images containing linear, block and alligator cracks are carried out. On average, completeness of distress extraction of 81.2%, 76.2% and 81.1% was found for LF, HPF and LBP, respectively.
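
    The high-pass-filtering idea can be sketched as a simple convolution with a Laplacian-style kernel followed by thresholding, assuming SciPy; the kernel and threshold are illustrative and are not the rotation-invariant matrix used in the study.

```python
import numpy as np
from scipy.ndimage import convolve

HIGH_PASS = np.array([[-1, -1, -1],
                      [-1,  8, -1],
                      [-1, -1, -1]], dtype=float)

def crack_candidates(gray, thresh=30.0):
    """gray: 2-D grayscale pavement image; returns a boolean crack-candidate mask."""
    response = convolve(gray.astype(float), HIGH_PASS, mode="nearest")
    return np.abs(response) > thresh
```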

  2. The Science and Art of Eyebrow Transplantation by Follicular Unit Extraction

    PubMed Central

    Gupta, Jyoti; Kumar, Amrendra; Chouhan, Kavish; Ariganesh, C; Nandal, Vinay

    2017-01-01

    Eyebrows constitute a very important and prominent feature of the face. With growing information, eyebrow transplant has become a popular procedure. However, though it is a small area it requires a lot of precision and knowledge regarding anatomy, designing of brows, extraction and implantation technique. This article gives a comprehensive view regarding eyebrow transplant with special emphasis on follicular unit extraction technique, which has become the most popular technique. PMID:28852290

  3. Developing a complex independent component analysis technique to extract non-stationary patterns from geophysical time-series

    NASA Astrophysics Data System (ADS)

    Forootan, Ehsan; Kusche, Jürgen

    2016-04-01

    Geodetic/geophysical observations, such as time series of global terrestrial water storage change or of sea level and temperature change, represent samples of physical processes and therefore contain information about complex physical interactions with many inherent time scales. Extracting relevant information from these samples, for example quantifying the seasonality of a physical process or its variability due to large-scale ocean-atmosphere interactions, is not possible with simple time series approaches. In recent decades, decomposition techniques have found increasing interest for extracting patterns from geophysical observations. Traditionally, principal component analysis (PCA) and, more recently, independent component analysis (ICA) are common techniques to extract statistically orthogonal (uncorrelated) and independent modes that represent the maximum variance of observations, respectively. PCA and ICA can be classified as stationary signal decomposition techniques, since they are based on decomposing the auto-covariance matrix or on diagonalizing higher (than second) order statistical tensors from centered time series. However, the stationarity assumption is obviously not justifiable for many geophysical and climate variables, even after removing cyclic components such as the seasonal cycles. In this paper, we present a new decomposition method, complex independent component analysis (CICA; Forootan, PhD-2014), which can be applied to extract non-stationary (changing in space and time) patterns from geophysical time series. Here, CICA is derived as an extension of real-valued ICA (Forootan and Kusche, JoG-2012), where we (i) define a new complex data set using a Hilbert transformation, in which the complex time series contain the observed values in their real part and the temporal rate of variability in their imaginary part; (ii) apply an ICA algorithm based on diagonalization of fourth-order cumulants to decompose the new complex data set from (i); and (iii) recognize dominant non-stationary patterns as independent complex patterns that can be used to represent the amplitude and phase propagation in space and time. We present the results of CICA for simulated and real cases, e.g., for quantifying the impact of large-scale ocean-atmosphere interaction on global mass changes. Forootan (PhD-2014) Statistical signal decomposition techniques for analyzing time-variable satellite gravimetry data, PhD Thesis, University of Bonn, http://hss.ulb.uni-bonn.de/2014/3766/3766.htm Forootan and Kusche (JoG-2012) Separation of global time-variable gravity signals into maximally independent components, Journal of Geodesy 86 (7), 477-497, doi: 10.1007/s00190-011-0532-5
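
    Step (i) above can be sketched directly, assuming SciPy: the complex data set is the analytic signal whose real part is the centred observation and whose imaginary part is its Hilbert transform. The cumulant-based complex ICA of steps (ii)-(iii) is not reproduced here.

```python
import numpy as np
from scipy.signal import hilbert

def complexify(X):
    """X: (n_time, n_grid) array of observations; returns the complex CICA input."""
    X = X - X.mean(axis=0, keepdims=True)      # centre each grid point's series
    return hilbert(X, axis=0)                  # real part: data, imag part: Hilbert transform
```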

  4. Application of ionic liquids based microwave-assisted simultaneous extraction of carnosic acid, rosmarinic acid and essential oil from Rosmarinus officinalis.

    PubMed

    Liu, Tingting; Sui, Xiaoyu; Zhang, Rongrui; Yang, Lei; Zu, Yuangang; Zhang, Lin; Zhang, Ying; Zhang, Zhonghua

    2011-11-25

    An ionic liquid based microwave-assisted simultaneous extraction and distillation (ILMSED) method has been developed for the effective extraction of carnosic acid (CA), rosmarinic acid (RA) and essential oil (EO) from Rosmarinus officinalis. A series of 1-alkyl-3-methylimidazolium ionic liquids differing in the composition of anion and cation were evaluated for extraction yield in this work. The results obtained indicated that the anions and cations of the ionic liquids influenced the extraction of CA and RA; a 1.0 M 1-octyl-3-methylimidazolium bromide ([C8mim]Br) solution was selected as the solvent. In addition, the ILMSED procedures for the three target ingredients were optimized and compared with other conventional extraction techniques. ILMSED gave the best result, with the highest extraction yield within the shortest extraction time for CA and RA. The novel process offered advantages in terms of yield and selectivity of EO and a shorter isolation time (20 min compared with 4 h for hydrodistillation), and provided a more valuable EO (with a high amount of oxygenated compounds). The microstructures and chemical structures of rosemary samples before and after extraction were also investigated. Moreover, the proposed method was validated by stability, repeatability and recovery experiments. The results indicated that the developed ILMSED method provides a good alternative for the extraction of both non-volatile compounds (CA and RA) and EO from rosemary as well as other herbs. Copyright © 2011 Elsevier B.V. All rights reserved.

  5. A Morphological Hessian Based Approach for Retinal Blood Vessels Segmentation and Denoising Using Region Based Otsu Thresholding

    PubMed Central

    BahadarKhan, Khan; A Khaliq, Amir; Shahid, Muhammad

    2016-01-01

    Diabetic Retinopathy (DR) harms retinal blood vessels in the eye, causing visual impairment. The appearance and structure of blood vessels in retinal images play an essential part in the diagnosis of eye diseases. We propose a computationally light, unsupervised, automated technique with promising results for the detection of retinal vasculature using a morphological Hessian-based approach and region-based Otsu thresholding. Contrast Limited Adaptive Histogram Equalization (CLAHE) and morphological filters have been used for enhancement and to remove low frequency noise or geometrical objects, respectively. The Hessian matrix and eigenvalue approach is used in a modified form at two different scales to extract wide and thin vessel enhanced images separately. Otsu thresholding is then applied in a novel way to classify vessel and non-vessel pixels from both enhanced images. Finally, postprocessing steps are used to eliminate unwanted regions/segments, non-vessel pixels, disease abnormalities and noise, to obtain a final segmented image. The proposed technique has been analyzed on the openly accessible DRIVE (Digital Retinal Images for Vessel Extraction) and STARE (STructured Analysis of the REtina) databases along with ground truth data that has been precisely marked by experts. PMID:27441646
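
    A minimal sketch of this kind of pipeline (CLAHE enhancement, Hessian-based vesselness at two scale ranges, Otsu thresholding) is shown below. It relies on scikit-image rather than the authors' own code, and the sample image, scale values and threshold combination are illustrative assumptions.

    ```python
    # CLAHE -> Hessian (Frangi) vesselness at thin and wide scales -> Otsu per scale.
    import numpy as np
    from skimage import data, exposure, filters

    image = data.retina()[..., 1] / 255.0           # green channel of a sample fundus image

    # contrast enhancement (CLAHE)
    enhanced = exposure.equalize_adapthist(image, clip_limit=0.02)

    # Hessian-based vesselness at two scale ranges: thin and wide vessels
    thin = filters.frangi(enhanced, sigmas=range(1, 3), black_ridges=True)
    wide = filters.frangi(enhanced, sigmas=range(3, 7), black_ridges=True)

    # Otsu thresholding applied separately to each enhanced image, then combined
    vessels = (thin > filters.threshold_otsu(thin)) | (wide > filters.threshold_otsu(wide))
    print(vessels.mean())                           # fraction of pixels labelled as vessel
    ```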

  6. Context-based automated defect classification system using multiple morphological masks

    DOEpatents

    Gleason, Shaun S.; Hunt, Martin A.; Sari-Sarraf, Hamed

    2002-01-01

    Automatic detection of defects during the fabrication of semiconductor wafers is largely automated, but the classification of those defects is still performed manually by technicians. This invention includes novel digital image analysis techniques that generate unique feature vector descriptions of semiconductor defects as well as classifiers that use these descriptions to automatically categorize the defects into one of a set of pre-defined classes. Feature extraction techniques based on multiple-focus images, multiple-defect mask images, and segmented semiconductor wafer images are used to create unique feature-based descriptions of the semiconductor defects. These feature-based defect descriptions are subsequently classified by a defect classifier into categories that depend on defect characteristics and defect contextual information, that is, the semiconductor process layer(s) with which the defect comes in contact. At the heart of the system is a knowledge database that stores and distributes historical semiconductor wafer and defect data to guide the feature extraction and classification processes. In summary, this invention takes as its input a set of images containing semiconductor defect information, and generates as its output a classification for the defect that describes not only the defect itself, but also the location of that defect with respect to the semiconductor process layers.

  7. Verification of Eulerian-Eulerian and Eulerian-Lagrangian simulations for turbulent fluid-particle flows

    DOE PAGES

    Patel, Ravi G.; Desjardins, Olivier; Kong, Bo; ...

    2017-09-01

    Here, we present a verification study of three simulation techniques for fluid–particle flows, including an Euler–Lagrange approach (EL) inspired by Jackson's seminal work on fluidized particles, a quadrature–based moment method based on the anisotropic Gaussian closure (AG), and the traditional two-fluid model. We perform simulations of two problems: particles in frozen homogeneous isotropic turbulence (HIT) and cluster-induced turbulence (CIT). For verification, we evaluate various techniques for extracting statistics from EL and study the convergence properties of the three methods under grid refinement. The convergence is found to depend on the simulation method and on the problem, with CIT simulations posing fewer difficulties than HIT. Specifically, EL converges under refinement for both HIT and CIT, but statistics exhibit dependence on the postprocessing parameters. For CIT, AG produces similar results to EL. For HIT, converging both TFM and AG poses challenges. Overall, extracting converged, parameter-independent Eulerian statistics remains a challenge for all methods.

  8. Verification of Eulerian-Eulerian and Eulerian-Lagrangian simulations for turbulent fluid-particle flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patel, Ravi G.; Desjardins, Olivier; Kong, Bo

    Here, we present a verification study of three simulation techniques for fluid–particle flows, including an Euler–Lagrange approach (EL) inspired by Jackson's seminal work on fluidized particles, a quadrature–based moment method based on the anisotropic Gaussian closure (AG), and the traditional two-fluid model. We perform simulations of two problems: particles in frozen homogeneous isotropic turbulence (HIT) and cluster-induced turbulence (CIT). For verification, we evaluate various techniques for extracting statistics from EL and study the convergence properties of the three methods under grid refinement. The convergence is found to depend on the simulation method and on the problem, with CIT simulations posing fewer difficulties than HIT. Specifically, EL converges under refinement for both HIT and CIT, but statistics exhibit dependence on the postprocessing parameters. For CIT, AG produces similar results to EL. For HIT, converging both TFM and AG poses challenges. Overall, extracting converged, parameter-independent Eulerian statistics remains a challenge for all methods.

  9. Considering context: reliable entity networks through contextual relationship extraction

    NASA Astrophysics Data System (ADS)

    David, Peter; Hawes, Timothy; Hansen, Nichole; Nolan, James J.

    2016-05-01

    Existing information extraction techniques can only partially address the problem of exploiting amounts of text too large to read. When discussion of events and relationships is limited to simple, past-tense, factual descriptions of events, current NLP-based systems can identify events and relationships and extract a limited amount of additional information. But the simple subset of available information that existing tools can extract from text is only useful to a small set of users and problems. Automated systems need to find and separate information based on what is threatened or planned to occur, has occurred in the past, or could potentially occur. We address the problem of advanced event and relationship extraction with our event and relationship attribute recognition system, which labels generic, planned, recurring, and potential events. The approach is based on a combination of new machine learning methods, novel linguistic features, and crowd-sourced labeling. The attribute labeler closes the gap between structured event and relationship models and the complicated and nuanced language that people use to describe them. Our operational-quality event and relationship attribute labeler enables Warfighters and analysts to more thoroughly exploit information in unstructured text. This is made possible through 1) more precise event and relationship interpretation, 2) more detailed information about extracted events and relationships, and 3) more reliable and informative entity networks that acknowledge the different attributes of entity-entity relationships.

  10. Magnetic molecularly imprinted polymers based on silica modified by deep eutectic solvents for the rapid simultaneous magnetic-based solid-phase extraction of Salvia miltiorrhiza bunge, Glycine max (Linn.) Merr and green tea.

    PubMed

    Li, Guizhen; Wang, Xiaoqin; Row, Kyung Ho

    2018-04-01

    Novel multiple-template magnetic molecularly imprinted polymers (MMIPs) based on silica were modified by four types of deep eutectic solvents (DESs) for the rapid simultaneous magnetic solid-phase extraction (MSPE) of tanshinone Ⅰ, tanshinone ⅡA, and cryptotanshinone from Salvia miltiorrhiza bunge; glycitein, genistein, and daidzein from Glycine max (Linn.) Merr; and epicatechin, epigallocatechin gallate, and epicatechin gallate from green tea, respectively. The synthesized materials were characterized by Fourier transform infrared spectroscopy and field emission scanning electron microscopy. Single-factor experiments were conducted to explore the relationship between the extraction efficiency and four factors (sample solution pH, amount of DES used for modification, amount of adsorbent, and extraction time). It was shown that the DES4-MMIPs had better extraction ability than the MMIPs without DESs and the other three DES-modified MMIPs. The best extraction recoveries with DES4-MMIP were tanshinone Ⅰ (85.57%), tanshinone ⅡA (80.58%), cryptotanshinone (92.12%), glycitein (81.65%), genistein (87.72%), daidzein (92.24%), epicatechin (86.43%), epigallocatechin gallate (80.92%), and epicatechin gallate (93.64%), respectively. The novel multiple-template MMIP materials modified by DESs for the rapid simultaneous MSPE of active compounds were shown to require fewer experimental steps than the single-template technique and to increase the extraction efficiency. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. An intelligent signal processing and pattern recognition technique for defect identification using an active sensor network

    NASA Astrophysics Data System (ADS)

    Su, Zhongqing; Ye, Lin

    2004-08-01

    The practical utilization of elastic waves, e.g. Rayleigh-Lamb waves, in high-performance structural health monitoring techniques is somewhat impeded by complicated wave dispersion phenomena, the existence of multiple wave modes, high susceptibility to diverse interferences, bulky sampled data and the difficulty of signal interpretation. An intelligent signal processing and pattern recognition (ISPPR) approach using wavelet transform and artificial neural network algorithms was developed and implemented in a signal processing package (SPP). The ISPPR technique performs signal filtration, data compression, characteristic extraction, information mapping and pattern recognition, and is capable of extracting essential yet concise features from acquired raw wave signals and further assisting in structural health evaluation. For validation, the SPP was applied to the prediction of crack growth in an alloy structural beam and the construction of a damage parameter database for defect identification in CF/EP composite structures. It was clearly apparent that elastic wave propagation-based damage assessment could be dramatically streamlined by the introduction of the ISPPR technique.
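
    The general idea of combining wavelet-based compression with a neural classifier can be sketched as follows. This is an illustrative example under stated assumptions (PyWavelets and scikit-learn as stand-in libraries, a synthetic Lamb-wave-like burst, db4 wavelet), not the authors' SPP implementation.

    ```python
    # Compress raw wave signals with a discrete wavelet transform and feed the
    # coarse coefficients to a small neural network for damage-state classification.
    import numpy as np
    import pywt
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(1)

    def make_signal(damaged: bool, n: int = 1024) -> np.ndarray:
        """Toy Lamb-wave-like burst; 'damage' adds a delayed, attenuated echo."""
        t = np.linspace(0, 1, n)
        sig = np.sin(2 * np.pi * 50 * t) * np.exp(-((t - 0.2) ** 2) / 0.002)
        if damaged:
            sig += 0.4 * np.sin(2 * np.pi * 50 * t) * np.exp(-((t - 0.6) ** 2) / 0.002)
        return sig + 0.05 * rng.standard_normal(n)

    def wavelet_features(sig: np.ndarray) -> np.ndarray:
        """Data compression / feature extraction: keep only the coarse DWT coefficients."""
        coeffs = pywt.wavedec(sig, "db4", level=5)
        return np.concatenate(coeffs[:2])           # approximation + coarsest detail level

    X = np.array([wavelet_features(make_signal(d)) for d in ([False] * 50 + [True] * 50)])
    y = np.array([0] * 50 + [1] * 50)

    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, y)
    print(clf.score(X, y))                          # training accuracy on the toy data
    ```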

  12. Secondary metabolites isolation in natural products chemistry: comparison of two semipreparative chromatographic techniques (high pressure liquid chromatography and high performance thin-layer chromatography).

    PubMed

    Do, Thi Kieu Tiên; Hadji-Minaglou, Francis; Antoniotti, Sylvain; Fernandez, Xavier

    2014-01-17

    Chemical investigations on secondary metabolites in natural products chemistry require efficient isolation techniques for characterization purposes as well as for the evaluation of their biological properties. In the case of phytochemical studies, the performance of the techniques is critical (resolution and yield) since the products generally present a narrow range of polarity and physicochemical properties. Several techniques are currently available, but HPLC (preparative and semipreparative) is the most widely used. To compare the performance of semipreparative HPLC and HPTLC for the isolation of secondary metabolites in different types of extracts, we chose carvone from spearmint essential oil (Mentha spicata L.), resveratrol from Fallopia multiflora (Thunb.) Haraldson, and rosmarinic acid from rosemary (Rosmarinus officinalis L.) extracts. The comparison was based on the chromatographic separation, the purity and quantity of the isolated compounds, the solvent consumption, the duration and the cost of the isolation operations. The results showed that semipreparative HPTLC can in some cases offer advantages over conventional semipreparative HPLC. Copyright © 2013 Elsevier B.V. All rights reserved.

  13. Use of liquid/supercritical CO2 extraction process for butanol recovery from fermentation broth

    USDA-ARS?s Scientific Manuscript database

    In order for butanol fermentation to be a viable option, it is essential to recover it from fermentation broth using economical alternate in-situ product recovery techniques such as liquid/supercritical CO2 extraction as compared to distillation. This technique (liquid CO2 extraction & supercritical...

  14. Automated Techniques for Quantification of Coastline Change Rates using Landsat Imagery along Caofeidian, China

    NASA Astrophysics Data System (ADS)

    Dong, Di; Li, Ziwei; Liu, Zhaoqin; Yu, Yang

    2014-03-01

    This paper focuses on automated extraction and monitoring of coastlines by remote sensing techniques using multi-temporal Landsat imagery along Caofeidian, China. Caofeidian, as one of the economically active regions in China, has experienced dramatic change due to intensified human activities, such as land reclamation. These processes have caused morphological changes of the Caofeidian shoreline. In this study, shoreline extraction and change analysis are investigated. An algorithm based on image texture and mathematical morphology is proposed to automate coastline extraction. We tested this approach and found that it is capable of extracting coastlines from TM and ETM+ images with little manual modification. The detected coastline vectors are then imported into ArcGIS software, and the Digital Shoreline Analysis System (DSAS) is used to calculate the change rates (the end point rate and the linear regression rate). The results show that in some parts of the study area remarkable coastline changes are observed, especially in the accretion rate. The abnormal accretion is mostly attributed to the large-scale land reclamation during 2003 and 2004 in Caofeidian. We conclude that various construction projects, especially the land reclamation project, have changed the Caofeidian shorelines greatly, far above normal rates.
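
    The two change-rate statistics mentioned above can be computed for a single DSAS-style transect as shown below; the transect distances and dates are made-up illustrations, not Caofeidian measurements.

    ```python
    # End Point Rate (EPR) and Linear Regression Rate (LRR) along one transect.
    import numpy as np

    years = np.array([1990.0, 1995.0, 2000.0, 2004.0, 2010.0])
    dist_m = np.array([120.0, 135.0, 150.0, 420.0, 455.0])   # shoreline distance along transect

    # End Point Rate: net movement between the oldest and most recent shorelines
    epr = (dist_m[-1] - dist_m[0]) / (years[-1] - years[0])

    # Linear Regression Rate: slope of a least-squares fit through all shorelines
    lrr = np.polyfit(years, dist_m, 1)[0]

    print(f"EPR = {epr:.1f} m/yr, LRR = {lrr:.1f} m/yr")
    ```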

  15. Ancient DNA in historical parchments - identifying a procedure for extraction and amplification of genetic material.

    PubMed

    Lech, T

    2016-05-06

    Historical parchments in the form of documents, manuscripts, books, or letters make up a large portion of cultural heritage collections. Their priceless historical value is associated not only with their content, but also with the information hidden in the DNA deposited on them. Analyses of ancient DNA (aDNA) retrieved from parchments can be used in various investigations, including, but not limited to, studying their authentication, tracing the development of culture, diplomacy, and technology, as well as obtaining information on the usage and domestication of animals. This article proposes and verifies a procedure for aDNA recovery from historical parchments and its appropriate preparation for further analyses. This study involved experimental selection of the aDNA extraction method with the highest efficiency and quality of extracted genetic material, from among multi-stage phenol-chloroform extraction methods and modern, column-based techniques that use selective DNA-binding membranes. Moreover, current techniques to amplify the entire genetic material were questioned, and the possibility of using mitochondrial DNA for species identification was analyzed. The usefulness of the proposed procedure was successfully confirmed in identification tests of historical parchments dating back to the 13th-16th centuries AD.

  16. The effects of solar incidence angle over digital processing of LANDSAT data

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Novo, E. M. L. M.

    1983-01-01

    A technique to extract the topographic modulation component from digital data is described. The enhancement process is based on the fact that the pixel contains two types of information: (1) reflectance variation due to the target; (2) reflectance variation due to the topography. In order to enhance the signal variation due to topography, the technique removes from the original LANDSAT data the component resulting from target reflectance. Considering that the contribution of topographic modulation to the pixel information varies with solar incidence angle, the results of this digital processing technique will differ from one season to another, mainly in highly dissected topography. In this context, the effects of solar incidence angle on the topographic modulation technique were evaluated. Two sets of MSS/LANDSAT data, with solar elevation angles varying from 22 to 41 deg, were selected to implement the digital processing on the Image-100 System. A secondary watershed (Rio Bocaina) draining into Rio Paraiba do Sul (Sao Paulo State) was selected as a test site. The results showed that the technique was more appropriate for MSS data acquired under higher Sun elevation angles. Applied to data acquired at low Sun elevation angles, the topographic modulation technique lessens rather than enhances topography.

  17. Directed Hidden-Code Extractor for Environment-Sensitive Malwares

    NASA Astrophysics Data System (ADS)

    Jia, Chunfu; Wang, Zhi; Lu, Kai; Liu, Xinhai; Liu, Xin

    Malware writers often use packing techniques to hide malicious payloads. A number of dynamic unpacking tools are designed to identify and extract the hidden code in packed malware. However, such unpacking methods are all based on a highly controlled environment that is vulnerable to various anti-unpacking techniques. If the execution environment is suspicious, malware may stay inactive for a long time or stop execution immediately to evade detection. In this paper, we propose a novel approach that automatically reasons about the environment requirements imposed by malware, then directs an unpacking tool to change the controlled environment so that the hidden code can be extracted in the new environment. The experimental results show that our approach significantly increases the resilience of traditional unpacking tools to environment-sensitive malware.

  18. Extracting TSK-type Neuro-Fuzzy model using the Hunting search algorithm

    NASA Astrophysics Data System (ADS)

    Bouzaida, Sana; Sakly, Anis; M'Sahli, Faouzi

    2014-01-01

    This paper proposes a Takagi-Sugeno-Kang (TSK) type Neuro-Fuzzy model tuned by a novel metaheuristic optimization algorithm called Hunting Search (HuS). The HuS algorithm is derived from a model of group hunting by animals such as lions, wolves, and dolphins when looking for prey. In this study, the structure and parameters of the fuzzy model are encoded into a particle. Thus, the optimal structure and parameters are achieved simultaneously. The proposed method was demonstrated through modeling and control problems, and the results have been compared with other optimization techniques. The comparisons indicate that the proposed method represents a powerful search approach and an effective optimization technique, as it can extract an accurate TSK fuzzy model with an appropriate number of rules.

  19. Exploring Characterizations of Learning Object Repositories Using Data Mining Techniques

    NASA Astrophysics Data System (ADS)

    Segura, Alejandra; Vidal, Christian; Menendez, Victor; Zapata, Alfredo; Prieto, Manuel

    Learning object repositories provide a platform for the sharing of Web-based educational resources. As these repositories evolve independently, it is difficult for users to have a clear picture of the kind of contents they give access to. Metadata can be used to automatically extract a characterization of these resources by using machine learning techniques. This paper presents an exploratory study carried out in the contents of four public repositories that uses clustering and association rule mining algorithms to extract characterizations of repository contents. The results of the analysis include potential relationships between different attributes of learning objects that may be useful to gain an understanding of the kind of resources available and eventually develop search mechanisms that consider repository descriptions as a criteria in federated search.

  20. Extraction of Greenhouse Areas with Image Processing Methods in Karabuk Province

    NASA Astrophysics Data System (ADS)

    Yildirima, M. Z.; Ozcan, C.

    2017-11-01

    Greenhouses allow environmental conditions to be controlled and regulated as desired, so that agricultural products can be produced without being affected by external environmental conditions. High quality and a wide variety of agricultural products can be produced throughout the year. In addition, mapping and detection of these areas have great importance in terms of factors such as yield analysis, natural resource management and environmental impact. Various remote sensing techniques are currently available for the extraction of greenhouse areas. These techniques are based on the automatic detection and interpretation of objects in remotely sensed images. In this study, greenhouse areas were determined from optical images obtained from Landsat. The study was carried out in the greenhouse areas of Karabuk province. The obtained results are presented with figures and tables.

  1. Advances in paper-based sample pretreatment for point-of-care testing.

    PubMed

    Tang, Rui Hua; Yang, Hui; Choi, Jane Ru; Gong, Yan; Feng, Shang Sheng; Pingguan-Murphy, Belinda; Huang, Qing Sheng; Shi, Jun Ling; Mei, Qi Bing; Xu, Feng

    2017-06-01

    In recent years, paper-based point-of-care testing (POCT) has been widely used in medical diagnostics, food safety and environmental monitoring. However, a high-cost, time-consuming and equipment-dependent sample pretreatment technique is generally required for raw sample processing, which is impractical for low-resource and disease-endemic areas. Therefore, there is an escalating demand for a cost-effective, simple and portable pretreatment technique to be coupled with the commonly used paper-based assays (e.g. lateral flow assays) in POCT. In this review, we focus on the importance of using paper as a platform for sample pretreatment. We first discuss the beneficial use of paper for sample pretreatment, including sample collection and storage, separation, extraction, and concentration. We then highlight the working principle and fabrication of each sample pretreatment device, the existing challenges and the future perspectives for developing paper-based sample pretreatment techniques.

  2. Interferometric Dynamic Measurement: Techniques Based on High-Speed Imaging or a Single Photodetector

    PubMed Central

    Fu, Yu; Pedrini, Giancarlo

    2014-01-01

    In recent years, optical interferometry-based techniques have been widely used to perform noncontact measurement of dynamic deformation in different industrial areas. In these applications, various physical quantities need to be measured at any instant and the Nyquist sampling theorem has to be satisfied along the time axis at each measurement point. Two types of techniques were developed for such measurements: one is based on high-speed cameras and the other uses a single photodetector. The limitation of the measurement range along the time axis in camera-based technology is mainly due to the low capturing rate, while photodetector-based technology can only take measurements at a single point. In this paper, several aspects of these two technologies are discussed. For camera-based interferometry, the discussion includes the introduction of the carrier, the processing of the recorded images, the phase extraction algorithms in various domains, and how to increase the temporal measurement range by using multiwavelength techniques. For detector-based interferometry, the discussion mainly focuses on single-point and multipoint laser Doppler vibrometers and their applications for measurement under extreme conditions. The results show the efforts made by researchers to improve the measurement capabilities of interferometry-based techniques to cover the requirements of industrial applications. PMID:24963503

  3. Optimization of antibacterial activity by Gold-Thread (Coptidis Rhizoma Franch) against Streptococcus mutans using evolutionary operation-factorial design technique.

    PubMed

    Choi, Ung-Kyu; Kim, Mi-Hyang; Lee, Nan-Hee

    2007-11-01

    This study was conducted to find the optimum extraction conditions of Gold-Thread for antibacterial activity against Streptococcus mutans using the evolutionary operation (EVOP)-factorial design technique. Higher antibacterial activity was achieved at a higher extraction temperature (R2 = -0.79) and with a longer extraction time (R2 = -0.71). Antibacterial activity was not affected by variation of the ethanol concentration in the extraction solvent (R2 = -0.12). The maximum antibacterial activity against S. mutans determined by the EVOP-factorial technique was obtained at an 80 degrees C extraction temperature, 26 h extraction time, and 50% ethanol concentration. The population of S. mutans decreased from 6.110 logCFU/ml in the initial set to 4.125 logCFU/ml in the third set.

  4. Developing a Complex Independent Component Analysis (CICA) Technique to Extract Non-stationary Patterns from Geophysical Time Series

    NASA Astrophysics Data System (ADS)

    Forootan, Ehsan; Kusche, Jürgen; Talpe, Matthieu; Shum, C. K.; Schmidt, Michael

    2017-12-01

    In recent decades, decomposition techniques have enabled increasingly more applications for dimension reduction, as well as extraction of additional information from geophysical time series. Traditionally, the principal component analysis (PCA)/empirical orthogonal function (EOF) method and more recently the independent component analysis (ICA) have been applied to extract statistically orthogonal (uncorrelated) and independent modes that represent the maximum variance of the time series, respectively. PCA and ICA can be classified as stationary signal decomposition techniques since they are based on decomposing the autocovariance matrix and diagonalizing higher (than two) order statistical tensors from centered time series, respectively. However, the stationarity assumption in these techniques is not justified for many geophysical and climate variables even after removing cyclic components, e.g., the commonly removed dominant seasonal cycles. In this paper, we present a novel decomposition method, the complex independent component analysis (CICA), which can be applied to extract non-stationary (changing in space and time) patterns from geophysical time series. Here, CICA is derived as an extension of real-valued ICA, where (a) we first define a new complex dataset that contains the observed time series in its real part, and their Hilbert transformed series as its imaginary part, (b) an ICA algorithm based on diagonalization of fourth-order cumulants is then applied to decompose the new complex dataset in (a), and finally, (c) the dominant independent complex modes are extracted and used to represent the dominant space and time amplitudes and associated phase propagation patterns. The performance of CICA is examined by analyzing synthetic data constructed from multiple physically meaningful modes in a simulation framework, with known truth. Next, global terrestrial water storage (TWS) data from the Gravity Recovery And Climate Experiment (GRACE) gravimetry mission (2003-2016), and satellite radiometric sea surface temperature (SST) data (1982-2016) over the Atlantic and Pacific Oceans are used with the aim of demonstrating signal separations of the North Atlantic Oscillation (NAO) from the Atlantic Multi-decadal Oscillation (AMO), and the El Niño Southern Oscillation (ENSO) from the Pacific Decadal Oscillation (PDO). CICA results indicate that ENSO-related patterns can be extracted from the Gravity Recovery And Climate Experiment Terrestrial Water Storage (GRACE TWS) with an accuracy of 0.5-1 cm in terms of equivalent water height (EWH). The magnitude of errors in extracting NAO or AMO from SST data using the complex EOF (CEOF) approach reaches up to 50% of the signal itself, while it is reduced to 16% when applying CICA. Larger errors with magnitudes of 100% and 30% of the signal itself are found while separating ENSO from PDO using CEOF and CICA, respectively. We thus conclude that CICA is more effective than CEOF in separating non-stationary patterns.

  5. Assessment of biological activity and UPLC-MS based chromatographic profiling of ethanolic extract of Ochradenus arabicus.

    PubMed

    Ali, M Ajmal; Farah, M Abul; Al-Hemaid, Fahad M; Abou-Tarboush, Faisal M; Al-Anazi, Khaled M; Wabaidur, S M; Alothman, Z A; Lee, Joongku

    2016-03-01

    Natural products from wild and medicinal plants, either in the form of crude extracts or pure compounds, provide unlimited opportunities for new drug leads owing to their unmatched chemical diversity. In the present study, the cytotoxic potential of a crude ethanolic extract of Ochradenus arabicus was analyzed by the MTT cell viability assay in MCF-7 adenocarcinoma breast cancer cells. We further investigated its effect against oxidative stress induced by the anticancer drug doxorubicin. In addition, Ultra Performance Liquid Chromatography-Mass Spectrometry (UPLC-MS) based chromatographic profiling of the crude extract of O. arabicus was performed. The MTT assay data showed that the extract is moderately toxic to MCF-7 cells. Treatment with the extract alone does not induce oxidative stress, while doxorubicin increases the level of oxidative stress in MCF-7 cells. In contrast, simultaneous treatment with the plant extract and doxorubicin significantly (p < 0.05) decreased the levels of intracellular reactive oxygen species (ROS) and lipid peroxidation, while an increase in reduced glutathione and superoxide dismutase activity was observed in a time- and dose-dependent manner. Hence, our findings confirmed the cytotoxic and antioxidant potential of the crude extract of O. arabicus in MCF-7 cells. However, further investigations on O. arabicus as a potential chemotherapeutic agent are needed. The analysis of bioactive compounds present in the plant extract using common phytochemical screening assays, such as chromatographic techniques, is also discussed.

  6. Automatic quantitative analysis of in-stent restenosis using FD-OCT in vivo intra-arterial imaging.

    PubMed

    Mandelias, Kostas; Tsantis, Stavros; Spiliopoulos, Stavros; Katsakiori, Paraskevi F; Karnabatidis, Dimitris; Nikiforidis, George C; Kagadis, George C

    2013-06-01

    A new segmentation technique is implemented for automatic lumen area extraction and stent strut detection in intravascular optical coherence tomography (OCT) images for the purpose of quantitative analysis of in-stent restenosis (ISR). In addition, a user-friendly graphical user interface (GUI) is developed based on the employed algorithm toward clinical use. Four clinical datasets of frequency-domain OCT scans of the human femoral artery were analyzed. First, a segmentation method based on fuzzy C means (FCM) clustering and wavelet transform (WT) was applied toward inner luminal contour extraction. Subsequently, stent strut positions were detected by utilizing metrics derived from the local maxima of the wavelet transform into the FCM membership function. The inner lumen contour and the position of stent strut were extracted with high precision. Compared to manual segmentation by an expert physician, the automatic lumen contour delineation had an average overlap value of 0.917 ± 0.065 for all OCT images included in the study. The strut detection procedure achieved an overall accuracy of 93.80% and successfully identified 9.57 ± 0.5 struts for every OCT image. Processing time was confined to approximately 2.5 s per OCT frame. A new fast and robust automatic segmentation technique combining FCM and WT for lumen border extraction and strut detection in intravascular OCT images was designed and implemented. The proposed algorithm integrated in a GUI represents a step forward toward the employment of automated quantitative analysis of ISR in clinical practice.
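
    A minimal fuzzy C-means (FCM) sketch in the spirit of the lumen-segmentation step described above is given below, applied to pixel intensities only. The two-cluster setting, fuzzifier m = 2 and the synthetic frame are assumptions for illustration; the published method combines FCM with wavelet-derived metrics.

    ```python
    # Fuzzy C-means clustering of pixel intensities, then selection of the bright cluster.
    import numpy as np

    def fcm(x, n_clusters=2, m=2.0, n_iter=100, seed=0):
        """Fuzzy C-means on a 1-D feature vector x; returns memberships and centers."""
        rng = np.random.default_rng(seed)
        u = rng.random((len(x), n_clusters))
        u /= u.sum(axis=1, keepdims=True)              # random initial memberships
        for _ in range(n_iter):
            w = u ** m
            centers = (w.T @ x) / w.sum(axis=0)        # membership-weighted cluster centers
            d = np.abs(x[:, None] - centers[None, :]) + 1e-12
            u = 1.0 / (d ** (2 / (m - 1)))
            u /= u.sum(axis=1, keepdims=True)          # normalized membership update
        return u, centers

    # toy OCT-like frame: bright lumen pixels against a darker background
    rng = np.random.default_rng(1)
    frame = np.concatenate([rng.normal(0.2, 0.05, 4000), rng.normal(0.8, 0.05, 1000)])
    u, centers = fcm(frame)
    lumen_mask = u.argmax(axis=1) == centers.argmax()  # pixels assigned to the bright cluster
    print(centers, lumen_mask.mean())
    ```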

  7. Feature generation using genetic programming with application to fault classification.

    PubMed

    Guo, Hong; Jack, Lindsay B; Nandi, Asoke K

    2005-02-01

    One of the major challenges in pattern recognition problems is the feature extraction process, which derives new features from existing features or directly from raw data in order to reduce the cost of computation during classification while improving classifier efficiency. Most current feature extraction techniques transform the original pattern vector into a new vector with increased discrimination capability but lower dimensionality. This is conducted within a predefined feature space and thus has limited searching power. Genetic programming (GP) can generate new features from the original dataset without prior knowledge of the probabilistic distribution. In this paper, a GP-based approach is developed for feature extraction from raw vibration data recorded from a rotating machine with six different conditions. The created features are then used as the inputs to a neural classifier for the identification of six bearing conditions. Experimental results demonstrate the ability of GP to automatically discover the different bearing conditions using features expressed in the form of nonlinear functions. Furthermore, four sets of results have been obtained: GP-extracted features with artificial neural networks (ANN) and with support vector machines (SVM), as well as traditional features with ANN and with SVM. This GP-based approach is used for bearing fault classification for the first time and exhibits superior searching power over other techniques. Additionally, it significantly reduces the computation time compared with a genetic algorithm (GA), making the solution more practical to realize.

  8. Localized Segment Based Processing for Automatic Building Extraction from LiDAR Data

    NASA Astrophysics Data System (ADS)

    Parida, G.; Rajan, K. S.

    2017-05-01

    Current methods for object segmentation, extraction and classification of aerial LiDAR data are manual and tedious. This work proposes a technique for object segmentation from LiDAR data. A bottom-up, geometric rule-based approach was initially used to segment buildings from the LiDAR datasets. For curved wall surfaces, localized surface normals were compared to segment buildings. The algorithm has been applied to both synthetic datasets and a real-world dataset of Vaihingen, Germany. Preliminary results show successful segmentation of building objects from a given scene for the synthetic datasets and promising results for the real-world data. An advantage of the proposed work is that it requires no data other than LiDAR. It is an unsupervised method of building segmentation and thus requires no model training, unlike supervised techniques. It focuses on extracting the walls of the buildings to construct the footprint, rather than focusing on the roof. This focus on extracting the walls to reconstruct buildings from a LiDAR scene is the crux of the proposed method. The current segmentation approach can be used to obtain 2D footprints of the buildings, with further scope to generate 3D models. Thus, the proposed method can be used as a tool to obtain footprints of buildings in urban landscapes, helping in urban planning and the smart cities endeavour.
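
    The localized surface-normal comparison can be sketched as follows: estimate a normal for each LiDAR point from its k nearest neighbours via a local PCA, then flag near-vertical surfaces as candidate building walls. The neighbourhood size, threshold and toy point cloud are assumptions, not the paper's exact rule set.

    ```python
    # Per-point normal estimation from k-nearest neighbours, then wall flagging.
    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(0)
    ground = np.c_[rng.uniform(0, 20, (2000, 2)), rng.normal(0.0, 0.05, 2000)]
    wall = np.c_[rng.uniform(5, 10, 1000),
                 np.full(1000, 8.0) + rng.normal(0, 0.05, 1000),
                 rng.uniform(0, 6, 1000)]
    points = np.vstack([ground, wall])               # synthetic scene: ground plane + one wall

    tree = cKDTree(points)
    _, idx = tree.query(points, k=10)                # k-nearest neighbourhoods

    normals = np.empty_like(points)
    for i, nbrs in enumerate(idx):
        nbr_pts = points[nbrs] - points[nbrs].mean(axis=0)
        # direction of least variance of the local covariance = surface normal
        _, _, vh = np.linalg.svd(nbr_pts, full_matrices=False)
        normals[i] = vh[-1]

    is_wall = np.abs(normals[:, 2]) < 0.2            # near-horizontal normal => vertical surface
    print(is_wall[:2000].mean(), is_wall[2000:].mean())  # fraction flagged: ground vs wall points
    ```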

  9. Evaluation of four automated protocols for extraction of DNA from FTA cards.

    PubMed

    Stangegaard, Michael; Børsting, Claus; Ferrero-Miliani, Laura; Frank-Hansen, Rune; Poulsen, Lena; Hansen, Anders J; Morling, Niels

    2013-10-01

    Extraction of DNA using magnetic bead-based techniques on automated DNA extraction instruments provides a fast, reliable, and reproducible method for DNA extraction from various matrices. Here, we have compared the yield and quality of DNA extracted from FTA cards using four automated extraction protocols on three different instruments. The extraction processes were repeated up to six times with the same pieces of FTA cards. The sample material on the FTA cards was either blood or buccal cells. With the QIAamp DNA Investigator and QIAsymphony DNA Investigator kits, it was possible to extract DNA from the FTA cards in all six rounds of extractions in sufficient amount and quality to obtain complete short tandem repeat (STR) profiles on a QIAcube and a QIAsymphony SP. With the PrepFiler Express kit, almost all the extractable DNA was extracted in the first two rounds of extractions. Furthermore, we demonstrated that it was possible to successfully extract sufficient DNA for STR profiling from previously processed FTA card pieces that had been stored at 4 °C for up to 1 year. This showed that rare or precious FTA card samples may be saved for future analyses even though some DNA was already extracted from the FTA cards.

  10. Dependency-based long short term memory network for drug-drug interaction extraction.

    PubMed

    Wang, Wei; Yang, Xi; Yang, Canqun; Guo, Xiaowei; Zhang, Xiang; Wu, Chengkun

    2017-12-28

    Drug-drug interaction (DDI) extraction needs assistance from automated methods to address the explosively increasing volume of biomedical texts. In recent years, deep neural network based models have been developed to address such needs and they have made significant progress in relation identification. We propose a dependency-based deep neural network model for DDI extraction. By introducing the dependency-based technique to a bi-directional long short term memory network (Bi-LSTM), we build three channels, namely, a Linear channel, a DFS channel and a BFS channel. All of these channels are constructed with three network layers: an embedding layer, an LSTM layer and a max pooling layer, from bottom up. In the embedding layer, we extract two types of features, one distance-based and the other dependency-based. In the LSTM layer, a Bi-LSTM is used in each channel to better capture relation information. Max pooling is then used to select the most informative features from the entire encoded sequence. At last, we concatenate the outputs of all channels and feed them to the softmax layer for relation identification. To the best of our knowledge, our model achieves new state-of-the-art performance with an F-score of 72.0% on the DDIExtraction 2013 corpus. Moreover, our approach obtains a much higher recall value than existing methods. The dependency-based Bi-LSTM model can learn effective relation information with less feature engineering in the task of DDI extraction. Besides, the experimental results show that our model excels at balancing precision and recall.
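
    The layer stack can be sketched for one channel as below: embedding, bidirectional LSTM, max pooling over time, with three such channels concatenated and passed to a softmax classifier. The vocabulary size, hidden size, number of output classes and the use of plain PyTorch are illustrative assumptions, not the authors' exact configuration.

    ```python
    # One embedding -> Bi-LSTM -> max-pooling channel, replicated three times and concatenated.
    import torch
    import torch.nn as nn

    class Channel(nn.Module):
        def __init__(self, vocab_size=5000, emb_dim=100, hidden=100):
            super().__init__()
            self.emb = nn.Embedding(vocab_size, emb_dim)
            self.lstm = nn.LSTM(emb_dim, hidden, bidirectional=True, batch_first=True)

        def forward(self, tokens):                    # tokens: (batch, seq_len) word ids
            h, _ = self.lstm(self.emb(tokens))        # (batch, seq_len, 2*hidden)
            return h.max(dim=1).values                # max pooling over the sequence

    class DDIModel(nn.Module):
        """Three channels (e.g. linear, DFS and BFS token orderings) -> softmax classifier."""
        def __init__(self, n_classes=5):
            super().__init__()
            self.channels = nn.ModuleList([Channel() for _ in range(3)])
            self.out = nn.Linear(3 * 200, n_classes)  # 2*hidden per channel, 3 channels

        def forward(self, linear_ids, dfs_ids, bfs_ids):
            feats = [ch(x) for ch, x in zip(self.channels, (linear_ids, dfs_ids, bfs_ids))]
            return self.out(torch.cat(feats, dim=1))  # logits for softmax / cross-entropy

    model = DDIModel()
    dummy = torch.randint(0, 5000, (2, 40))           # batch of 2 sentences, 40 tokens each
    print(model(dummy, dummy, dummy).shape)           # torch.Size([2, 5])
    ```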

  11. Automatic exudate detection by fusing multiple active contours and regionwise classification.

    PubMed

    Harangi, Balazs; Hajdu, Andras

    2014-11-01

    In this paper, we propose a method for the automatic detection of exudates in digital fundus images. Our approach can be divided into three stages: candidate extraction, precise contour segmentation and the labeling of candidates as true or false exudates. For candidate detection, we borrow a grayscale morphology-based method to identify possible regions containing these bright lesions. Then, to extract the precise boundary of the candidates, we introduce a complex active contour-based method. Namely, to increase the accuracy of segmentation, we extract additional possible contours by taking advantage of the diverse behavior of different pre-processing methods. After selecting an appropriate combination of the extracted contours, a region-wise classifier is applied to remove the false exudate candidates. For this task, we consider several region-based features, and extract an appropriate feature subset to train a Naïve-Bayes classifier optimized further by an adaptive boosting technique. Regarding experimental studies, the method was tested on publicly available databases both to measure the accuracy of the segmentation of exudate regions and to recognize their presence at image-level. In a proper quantitative evaluation on publicly available datasets the proposed approach outperformed several state-of-the-art exudate detector algorithms. Copyright © 2014 Elsevier Ltd. All rights reserved.

  12. Preparation, characterization and application of a new stir bar sorptive extraction based on poly(vinylphthalimide-co-N,N'-methylenebisacrylamide) monolith.

    PubMed

    Huang, Xiaojia; Chen, Linli; Yuan, Dongxing; Luo, Xianbin

    2011-12-01

    In this study, a new stir bar sorptive extraction (SBSE) coating based on poly(vinylphthalimide-co-N,N'-methylenebisacrylamide) monolith (SBSE-VPMB) was prepared. The influences of the contents of monomer in polymerization mixture and the percentage of porogen solvent on the extraction performance were investigated thoroughly. Several characteristic techniques, such as elemental analysis, scanning electron microscopy, mercury intrusion porosimetry and infrared spectroscopy, were used to characterize the monolithic material. The analysis of oxfendazole (OFZ) and mebendazole (MBZ) in milk and honey samples by the combination of SBSE with HPLC with diode array detection was selected as paradigms for the practical evaluation of the new coating. Under the optimized extraction conditions, the limits of detection (S/N=3) for OFZ and MBZ were 0.23-0.60 μg/L in milk and 0.24-1.08 μg/L in honey, respectively. The method also showed good linearity, repeatability, high feasibility and acceptable recoveries for real samples. At the same time, the extraction performance and the distribution coefficients (K(VPMB/W)) of OFZ and MBZ on SBSE-VPMB were compared with other SBSEs based on porous monoliths and commercial SBSE. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Biological activity and chemical profile of Lavatera thuringiaca L. extracts obtained by different extraction approaches.

    PubMed

    Mašković, Pavle Z; Veličković, Vesna; Đurović, Saša; Zeković, Zoran; Radojković, Marija; Cvetanović, Aleksandra; Švarc-Gajić, Jaroslava; Mitić, Milan; Vujić, Jelena

    2018-01-01

    Lavatera thuringiaca L. is a herbaceous perennial plant of the Malvaceae family, which is known for its biological activity and richness in polyphenolic compounds. Despite this, information regarding its biological activity and chemical profile is still insufficient. The aim of this study was to investigate the biological potential and chemical profile of Lavatera thuringiaca L., as well as the influence of the applied extraction technique on them. Two conventional and four non-conventional extraction techniques were applied in order to obtain extracts rich in bioactive compounds. The extracts were further tested for total phenolics, flavonoids, condensed tannins, gallotannins and anthocyanin contents using spectrophotometric assays. The polyphenolic profile was established using HPLC-DAD analysis. Biological activity was investigated in terms of antioxidant, cytotoxic and antibacterial activities. Four antioxidant assays were applied, as well as three different cell lines for cytotoxic activity and fifteen bacterial strains for antibacterial activity. The results showed that subcritical water extraction (SCW) dominated over the other extraction techniques, with the SCW extract exhibiting the highest biological activity. The study indicates that the plant Lavatera thuringiaca L. may be used as a potential source of biologically active compounds. Copyright © 2017 Elsevier GmbH. All rights reserved.

  14. Functional chitosan-based grapefruit seed extract composite films for applications in food packaging technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tan, Y.M.; Lim, S.H.; Tay, B.Y.

    Highlights: • Chitosan-based grapefruit seed extract (GFSE) films were solution casted. • GFSE was uniformly dispersed within all chitosan film matrices. • All chitosan-based composite films showed remarkable transparency. • Increasing amounts of GFSE incorporated increased the elongation at break of films. • Chitosan-based GFSE composite films inhibited the proliferation of fungal growth. - Abstract: Chitosan-based composite films with different amounts of grapefruit seed extract (GFSE) (0.5, 1.0 and 1.5% v/v) were fabricated via solution casting technique. Experimental results showed that GFSE was uniformly dispersed within all chitosan film matrices. The presence of GFSE made the films more amorphous and tensile strength decreased, while elongation at break values increased as GFSE content increased. Results from the measurement of light transmission revealed that increasing amounts of GFSE (from 0.5 to 1.5% v/v) did not affect transparency of the films. Furthermore, packaging of bread samples with chitosan-based GFSE composite films inhibited the proliferation of fungal growth as compared to control samples. Hence, chitosan-based GFSE composite films have the potential to be a useful material in the area of food technology.

  15. Near-infrared image formation and processing for the extraction of hand veins

    NASA Astrophysics Data System (ADS)

    Bouzida, Nabila; Hakim Bendada, Abdel; Maldague, Xavier P.

    2010-10-01

    The main objective of this work is to extract the hand vein network using a non-invasive technique in the near-infrared (NIR) region. The visualization of the veins is based on an optical property of blood at certain wavelengths of the electromagnetic spectrum. In the present paper, we first introduce image formation in the NIR spectral band. The acquisition system is then presented, as well as the image processing method used to extract the vein signature. Extraction of this pattern on the finger, the wrist and the dorsal hand is achieved after exposing the hand to optical stimulation by reflection or transmission of light. We present meaningful results of the extracted vein pattern, demonstrating the utility of the method for clinical applications such as the diagnosis of vein disease and primitive varicose veins, as well as for applications in vein biometrics.

  16. Freeze-out extraction of monocarboxylic acids from water into acetonitrile under the action of centrifugal forces

    NASA Astrophysics Data System (ADS)

    Bekhterev, V. N.

    2016-10-01

    It is established that the efficiency of the freezing-out extraction of monocarboxylic acids C3-C8 and sorbic acid from water into acetonitrile increases under the action of centrifugal forces. The linear growth of the partition coefficient in the homologous series of C2-C8 acids with an increase in molecule length, and the difference between the efficiency of extracting sorbic and hexanoic acid, are discussed using a theoretical model proposed earlier and based on the adsorption-desorption equilibrium of the partition of dissolved organic compounds between the resulting surface of ice and the liquid phase of the extract. The advantages of the proposed technique with respect to the degree of concentration over the method of low-temperature liquid-liquid extraction are explained in light of the phase diagram for the water-acetonitrile mixture.

  17. A surprising method for green extraction of essential oil from dry spices: Microwave dry-diffusion and gravity.

    PubMed

    Farhat, Asma; Fabiano-Tixier, Anne-Sylvie; Visinoni, Franco; Romdhane, Mehrez; Chemat, Farid

    2010-11-19

    Without adding any solvent or water, we proposed a novel and green approach for the extraction of secondary metabolites from dried plant materials. This "solvent, water and vapor free" approach based on a simple principle involves the application of microwave irradiation and earth gravity to extract the essential oil from dried caraway seeds. Microwave dry-diffusion and gravity (MDG) has been compared with a conventional technique, hydrodistillation (HD), for the extraction of essential oil from dried caraway seeds. Essential oils isolated by MDG were quantitatively (yield) and qualitatively (aromatic profile) similar to those obtained by HD, but MDG was better than HD in terms of rapidity (45min versus 300min), energy saving, and cleanliness. The present apparatus permits fast and efficient extraction, reduces waste, avoids water and solvent consumption, and allows substantial energy savings. Copyright © 2010 Elsevier B.V. All rights reserved.

  18. Protection of Health Imagery by Region Based Lossless Reversible Watermarking Scheme

    PubMed Central

    Priya, R. Lakshmi; Sadasivam, V.

    2015-01-01

    Providing authentication and integrity for medical images is a challenge, and this work proposes a new blind, fragile, region-based lossless reversible watermarking technique to improve the trustworthiness of medical images. The proposed technique embeds the watermark using a reversible least significant bit embedding scheme. The scheme combines hashing, compression, and digital signature techniques to create a content-dependent watermark, making use of the compressed region of interest (ROI) for ROI recovery, as reported in the literature. Experiments were carried out to assess the performance of the scheme; the assessment reveals that the ROI is extracted intact, and the PSNR values obtained indicate that the presented scheme offers greater protection for health imagery. PMID:26649328
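
    The core least-significant-bit embedding/extraction step can be sketched as follows; the actual scheme additionally uses hashing, ROI compression and a digital signature, which are omitted here, and the 8x8 toy image and payload are assumptions.

    ```python
    # Minimal LSB embed/extract round trip on a toy image.
    import numpy as np

    rng = np.random.default_rng(0)
    image = rng.integers(0, 256, (8, 8), dtype=np.uint8)
    payload = rng.integers(0, 2, image.size, dtype=np.uint8)      # one watermark bit per pixel

    # embed: clear each pixel's LSB and write the payload bit into it
    watermarked = (image & 0xFE) | payload.reshape(image.shape)

    # extract: read the LSBs back; a reversible scheme would carry the original
    # LSB plane (compressed) inside the payload so the cover image can be restored
    recovered = watermarked & 1
    print(np.array_equal(recovered.ravel(), payload))             # True
    ```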

  19. Current trends in geomorphological mapping

    NASA Astrophysics Data System (ADS)

    Seijmonsbergen, A. C.

    2012-04-01

    Geomorphological mapping is a world currently in motion, driven by technological advances and the availability of new high resolution data. As a consequence, classic (paper) geomorphological maps, which were the standard for more than 50 years, are rapidly being replaced by digital geomorphological information layers. This is witnessed by the following developments: 1. the conversion of classic paper maps into digital information layers, mainly performed in a digital mapping environment such as a Geographical Information System; 2. updating the location precision and the content of the converted maps, by adding more geomorphological details taken from high resolution elevation data and/or high resolution image data; 3. (semi) automated extraction and classification of geomorphological features from digital elevation models, broadly separated into unsupervised and supervised classification techniques; and 4. new digital visualization / cartographic techniques and reading interfaces. New digital geomorphological information layers can be based on manual digitization of polygons using DEMs and/or aerial photographs, or prepared through (semi) automated extraction and delineation of geomorphological features. DEMs are often used as a basis to derive Land Surface Parameter information, which is used as input for (un)supervised classification techniques. Especially when using high resolution data, object-based classification is used as an alternative to traditional pixel-based classification, to cluster grid cells into homogeneous objects, which can be classified as geomorphological features. Classic map content can also be used as training material for the supervised classification of geomorphological features. In the classification process, rule-based protocols, including expert-knowledge input, are used to map specific geomorphological features or entire landscapes. Current (semi) automated classification techniques are increasingly able to extract morphometric, hydrological, and in the near future also morphogenetic information. As a result, these new opportunities have changed the workflows for geomorphological mapmaking, and the focus has shifted from field-based techniques to more computer-based techniques: for example, traditional pre-field air-photo based maps are now replaced by maps prepared in a digital mapping environment, and designated field visits using mobile GIS / digital mapping devices now focus on gathering location information and attribute inventories and are highly time-efficient. The resulting 'modern geomorphological maps' are digital collections of geomorphological information layers consisting of georeferenced vector, raster and tabular data which are stored in a digital environment such as a GIS geodatabase, and are easily visualized as, e.g., 'bird's eye' views, as animated 3D displays, on virtual globes, or stored as GeoPDF maps in which georeferenced attribute information can be easily exchanged over the internet. Digital geomorphological information layers are increasingly accessed via web-based services distributed through remote servers. Information can be consulted - or even built using remote geoprocessing servers - by the end user. Therefore, it is no longer only the geomorphologist, but also the professional end user, who dictates the applied use of digital geomorphological information layers.

  20. Development of Novel Method for Rapid Extract of Radionuclides from Solution Using Polymer Ligand Film

    NASA Astrophysics Data System (ADS)

    Rim, Jung H.

    Accurate and fast determination of the activity of radionuclides in a sample is critical for nuclear forensics and emergency response. Radioanalytical techniques are well established for radionuclides measurement, however, they are slow and labor intensive, requiring extensive radiochemical separations and purification prior to analysis. With these limitations of current methods, there is great interest for a new technique to rapidly process samples. This dissertation describes a new analyte extraction medium called Polymer Ligand Film (PLF) developed to rapidly extract radionuclides. Polymer Ligand Film is a polymer medium with ligands incorporated in its matrix that selectively and rapidly extract analytes from a solution. The main focus of the new technique is to shorten and simplify the procedure necessary to chemically isolate radionuclides for determination by alpha spectrometry or beta counting. Five different ligands were tested for plutonium extraction: bis(2-ethylhexyl) methanediphosphonic acid (H2DEH[MDP]), di(2-ethyl hexyl) phosphoric acid (HDEHP), trialkyl methylammonium chloride (Aliquat-336), 4,4'(5')-di-t-butylcyclohexano 18-crown-6 (DtBuCH18C6), and 2-ethylhexyl 2-ethylhexylphosphonic acid (HEH[EHP]). The ligands that were effective for plutonium extraction further studied for uranium extraction. The plutonium recovery by PLFs has shown dependency on nitric acid concentration and ligand to total mass ratio. H2DEH[MDP] PLFs performed best with 1:10 and 1:20 ratio PLFs. 50.44% and 47.61% of plutonium were extracted on the surface of PLFs with 1M nitric acid for 1:10 and 1:20 PLF, respectively. HDEHP PLF provided the best combination of alpha spectroscopy resolution and plutonium recovery with 1:5 PLF when used with 0.1M nitric acid. The overall analyte recovery was lower than electrodeposited samples, which typically has recovery above 80%. However, PLF is designed to be a rapid field deployable screening technique and consistency is more important than recovery. PLFs were also tested using blind quality control samples and the activities were accurately measured. It is important to point out that PLFs were consistently susceptible to analytes penetrating and depositing below the surface. The internal radiation within the body of PLF is mostly contained and did not cause excessive self-attenuation and peak broadening in alpha spectroscopy. The analyte penetration issue was beneficial in the destructive analysis. H2DEH[MDP] PLF was tested with environmental samples to fully understand the capabilities and limitations of the PLF in relevant environments. The extraction system was very effective in extracting plutonium from environmental water collected from Mortandad Canyon at Los Alamos National Laboratory with minimal sample processing. Soil samples were tougher to process than the water samples. Analytes were first leached from the soil matrixes using nitric acid before processing with PLF. This approach had a limitation in extracting plutonium using PLF. The soil samples from Mortandad Canyon, which are about 1% iron by weight, were effectively processed with the PLF system. Even with certain limitations of the PLF extraction system, this technique was able to considerably decrease the sample analysis time. The entire environmental sample was analyzed within one to two days. The decrease in time can be attributed to the fact that PLF is replacing column chromatography and electrodeposition with a single step for preparing alpha spectrometry samples. 
The two-step process of column chromatography and electrodeposition takes from a couple of days to a week to complete, depending on the sample. The decrease in time and the simplified procedure make this technique a unique solution for nuclear forensics and emergency response applications. A large number of samples can be quickly analyzed, and selected samples can be further analyzed with more sensitive techniques based on the initial data. The deployment of a PLF system as a screening method will greatly reduce the total analysis time required to gain meaningful isotopic data for nuclear forensics applications. (Abstract shortened by UMI.)

  1. Effectiveness of Liquid-Liquid Extraction, Solid Phase Extraction, and Headspace Technique for Determination of Some Volatile Water-Soluble Compounds of Rose Aromatic Water

    PubMed Central

    2017-01-01

    Steam distillation is used to isolate the scent of rose flowers. Rose aromatic water is commonly used in European cuisine and aromatherapy, besides its use in the cosmetic industry for its lovely scent. In this study, three different sampling techniques, liquid-liquid extraction (LLE), headspace technique (HS), and solid phase extraction (SPE), were compared for the analysis of volatile water-soluble compounds in commercial rose aromatic water. Some volatile water-soluble compounds of rose aromatic water were also analyzed by gas chromatography-mass spectrometry (GC-MS). It was concluded that one of the solid phase extraction methods led to higher recoveries of 2-phenylethyl alcohol (PEA) from the rose aromatic water than the liquid-liquid extraction and headspace techniques. The liquid-liquid extraction method provided higher recovery ratios for citronellol, nerol, and geraniol than the other techniques. Ideal linear correlation coefficient values were observed by GC-MS for quantitative analysis of volatile compounds (r2 ≥ 0.999). Optimized methods showed acceptable repeatability (RSDs < 5%) and excellent recovery (>95%). For compounds such as α-pinene, linalool, β-caryophyllene, α-humulene, methyl eugenol, and eugenol, the best recovery values were obtained with LLE and SPE. PMID:28791049

  2. Analysis of atomic force microscopy data for surface characterization using fuzzy logic

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Al-Mousa, Amjed, E-mail: aalmousa@vt.edu; Niemann, Darrell L.; Niemann, Devin J.

    2011-07-15

    In this paper we present a methodology to characterize surface nanostructures of thin films. The methodology identifies and isolates nanostructures using Atomic Force Microscopy (AFM) data and extracts quantitative information, such as their size and shape. The fuzzy logic based methodology relies on a Fuzzy Inference Engine (FIE) to classify the data points as being top, bottom, uphill, or downhill. The resulting data sets are then further processed to extract quantitative information about the nanostructures. In the present work we introduce a mechanism which can consistently distinguish crowded surfaces from those with sparsely distributed structures, and we present an omni-directional search technique to improve the structural recognition accuracy. In order to demonstrate the effectiveness of our approach we present a case study which uses our approach to quantitatively identify particle sizes of two specimens, each with a unique gold nanoparticle size distribution. Research Highlights: a fuzzy logic analysis technique capable of characterizing AFM images of thin films; the technique is applicable to different surfaces regardless of their densities; it does not require manual adjustment of the algorithm parameters; it can quantitatively capture differences between surfaces; it yields more realistic structure boundaries compared with other methods.

  3. Conventional and accelerated-solvent extractions of green tea (camellia sinensis) for metabolomics-based chemometrics.

    PubMed

    Kellogg, Joshua J; Wallace, Emily D; Graf, Tyler N; Oberlies, Nicholas H; Cech, Nadja B

    2017-10-25

    Metabolomics has emerged as an important analytical technique for multiple applications. The value of information obtained from metabolomics analysis depends on the degree to which the entire metabolome is present and the reliability of sample treatment to ensure reproducibility across the study. The purpose of this study was to compare methods of preparing complex botanical extract samples prior to metabolomics profiling. Two extraction methodologies, accelerated solvent extraction and a conventional solvent maceration, were compared using commercial green tea [Camellia sinensis (L.) Kuntze (Theaceae)] products as a test case. The accelerated solvent protocol was first evaluated to ascertain critical factors influencing extraction using a D-optimal experimental design study. The accelerated solvent and conventional extraction methods yielded similar metabolite profiles for the green tea samples studied. The accelerated solvent extraction yielded higher total amounts of extracted catechins, was more reproducible, and required less active bench time to prepare the samples. This study demonstrates the effectiveness of accelerated solvent as an efficient methodology for metabolomics studies. Copyright © 2017. Published by Elsevier B.V.

  4. Dynamic fabric phase sorptive extraction for a group of pharmaceuticals and personal care products from environmental waters.

    PubMed

    Lakade, Sameer S; Borrull, Francesc; Furton, Kenneth G; Kabir, Abuzar; Marcé, Rosa Maria; Fontanals, Núria

    2016-07-22

    This paper describes for the first time the use of a new extraction technique based on fabric phase sorptive extraction (FPSE). This new mode proposes the extraction of the analytes in dynamic mode in order to reduce the extraction time. Dynamic fabric phase sorptive extraction (DFPSE) followed by liquid chromatography-tandem mass spectrometry was evaluated for the extraction of a group of pharmaceuticals and personal care products (PPCPs) from environmental water samples. Different parameters affecting the extraction were optimized, and the best conditions were achieved when 50 mL of sample at pH 3 was passed through 3 disks and the retained analytes were eluted with 10 mL of ethyl acetate. The recoveries were higher than 60% for most compounds, with the exception of the most polar ones (between 8% and 38%). The analytical method was validated with environmental samples such as river water and effluent and influent wastewater, and good performance was obtained. The analysis of samples revealed the presence of some PPCPs at low ng L(-1) concentrations. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. Metabolomics study of Saw palmetto extracts based on 1H NMR spectroscopy.

    PubMed

    de Combarieu, Eric; Martinelli, Ernesto Marco; Pace, Roberto; Sardone, Nicola

    2015-04-01

    Preparations containing Saw palmetto extracts are used in traditional medicine to treat benign prostatic hyperplasia. According to the European and the American Pharmacopoeias, the extract is obtained from comminuted Saw palmetto berries by a suitable extraction procedure using ethanol, supercritical carbon dioxide, or a mixture of n-hexane and methylpentanes. In the present study, a metabolomics profiling approach using nuclear magnetic resonance (NMR) has been used as a fingerprinting tool to assess the overall composition of the extracts. The phytochemical analysis coupled with principal component analysis (PCA) showed that the Saw palmetto extracts obtained with carbon dioxide and hexane have the same composition, with minor, non-significant differences for the extracts obtained with ethanol. These differences are, in any case, lower than the batch-to-batch variability ascribable to the naturally occurring variability in the phytochemical composition of Saw palmetto fruits. The fingerprinting analysis combined with chemometric methods provides a tool to comprehensively assess the quality of Saw palmetto extracts. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Differential evolution-based multi-objective optimization for the definition of a health indicator for fault diagnostics and prognostics

    NASA Astrophysics Data System (ADS)

    Baraldi, P.; Bonfanti, G.; Zio, E.

    2018-03-01

    The identification of the current degradation state of an industrial component and the prediction of its future evolution are fundamental steps for the development of condition-based and predictive maintenance approaches. The objective of the present work is to propose a general method for extracting a health indicator that measures the amount of component degradation from a set of signals measured during operation. The proposed method is based on the combined use of feature extraction techniques, such as Empirical Mode Decomposition and Auto-Associative Kernel Regression, and a multi-objective Binary Differential Evolution (BDE) algorithm for selecting the subset of features optimal for the definition of the health indicator. The objectives of the optimization are desired characteristics of the health indicator, such as monotonicity, trendability and prognosability. A case study is considered, concerning the prediction of the remaining useful life of turbofan engines. The obtained results confirm that the method is capable of extracting health indicators suitable for accurate prognostics.
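
    As a hedged illustration of the kind of objectives such a multi-objective search can optimize, the sketch below computes simple monotonicity and trendability scores for a candidate health-indicator series. These are common textbook forms of the metrics, not necessarily the exact definitions used by the authors, and the synthetic data are assumptions for demonstration only.

```python
import numpy as np

def monotonicity(hi: np.ndarray) -> float:
    """Monotonicity of a health-indicator series in [0, 1].

    Common form: |#positive increments - #negative increments| / (n - 1);
    a strictly monotonic indicator scores 1.
    """
    d = np.diff(hi)
    return abs(np.sum(d > 0) - np.sum(d < 0)) / max(len(d), 1)

def trendability(hi: np.ndarray) -> float:
    """Absolute correlation between the indicator and time (0..1)."""
    t = np.arange(len(hi))
    return abs(np.corrcoef(hi, t)[0, 1])

# Example: score a noisy but drifting synthetic degradation indicator.
rng = np.random.default_rng(0)
candidate = np.cumsum(rng.normal(0.05, 1.0, size=500))
print(monotonicity(candidate), trendability(candidate))
```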

  7. Joint Feature Extraction and Classifier Design for ECG-Based Biometric Recognition.

    PubMed

    Gutta, Sandeep; Cheng, Qi

    2016-03-01

    Traditional biometric recognition systems often utilize physiological traits such as fingerprint, face, iris, etc. Recent years have seen a growing interest in electrocardiogram (ECG)-based biometric recognition techniques, especially in the field of clinical medicine. In existing ECG-based biometric recognition methods, feature extraction and classifier design are usually performed separately. In this paper, a multitask learning approach is proposed in which feature extraction and classifier design are carried out simultaneously. Weights are assigned to the features within the kernel of each task. We decompose the matrix consisting of all the feature weights into sparse and low-rank components. The sparse component determines the features that are relevant to identify each individual, and the low-rank component determines the common feature subspace that is relevant to identify all the subjects. A fast optimization algorithm is developed, which requires only first-order information. The performance of the proposed approach is demonstrated through experiments using the MIT-BIH Normal Sinus Rhythm database.
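
    The sparse-plus-low-rank idea can be illustrated with a generic decomposition of a feature-weight matrix W ≈ L + S. The NumPy sketch below uses a simple alternating proximal scheme (entrywise soft-thresholding for the sparse part, singular-value thresholding for the low-rank part); it is a minimal illustration of the decomposition concept under assumed penalty weights, not the authors' optimization algorithm.

```python
import numpy as np

def soft_threshold(X, tau):
    """Entrywise shrinkage: proximal operator of the l1 norm."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def svd_threshold(X, tau):
    """Singular-value shrinkage: proximal operator of the nuclear norm."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def sparse_plus_low_rank(W, lam=0.1, mu=1.0, n_iter=100):
    """Split W into a low-rank part L (shared subspace) and a sparse part S."""
    L = np.zeros_like(W)
    S = np.zeros_like(W)
    for _ in range(n_iter):
        L = svd_threshold(W - S, mu)      # low-rank update
        S = soft_threshold(W - L, lam)    # sparse update
    return L, S

# Toy weight matrix: features x subjects (low-rank background + a few spikes).
rng = np.random.default_rng(1)
W = rng.normal(size=(20, 5)) @ rng.normal(size=(5, 8)) + (rng.random((20, 8)) < 0.05) * 3.0
L, S = sparse_plus_low_rank(W)
print(np.linalg.matrix_rank(L, tol=1e-3), int((np.abs(S) > 1e-6).sum()))
```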

  8. Pre-trained convolutional neural networks as feature extractors toward improved malaria parasite detection in thin blood smear images.

    PubMed

    Rajaraman, Sivaramakrishnan; Antani, Sameer K; Poostchi, Mahdieh; Silamut, Kamolrat; Hossain, Md A; Maude, Richard J; Jaeger, Stefan; Thoma, George R

    2018-01-01

    Malaria is a blood disease caused by Plasmodium parasites transmitted through the bite of female Anopheles mosquitoes. Microscopists commonly examine thick and thin blood smears to diagnose disease and compute parasitemia. However, their accuracy depends on smear quality and expertise in classifying and counting parasitized and uninfected cells. Such examination can be arduous for large-scale diagnosis, resulting in poor quality. State-of-the-art image-analysis-based computer-aided diagnosis (CADx) methods that apply machine learning (ML) techniques to microscopic images of the smears using hand-engineered features demand expertise in analyzing morphological, textural, and positional variations of the region of interest (ROI). In contrast, Convolutional Neural Networks (CNN), a class of deep learning (DL) models, promise highly scalable and superior results with end-to-end feature extraction and classification. Automated malaria screening using DL techniques could, therefore, serve as an effective diagnostic aid. In this study, we evaluate the performance of pre-trained CNN based DL models as feature extractors toward classifying parasitized and uninfected cells to aid in improved disease screening. We experimentally determine the optimal model layers for feature extraction from the underlying data. Statistical validation of the results demonstrates the use of pre-trained CNNs as a promising tool for feature extraction for this purpose.
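
    As a minimal sketch of the pre-trained-CNN-as-feature-extractor idea (not the specific architectures or layers evaluated in the study), the snippet below strips the classification head from an ImageNet-pretrained ResNet-50 and returns a fixed-length embedding per cell image. The file path is a placeholder and torchvision >= 0.13 is assumed.

```python
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

# Load a pre-trained CNN and strip the classification head so it acts as a
# fixed feature extractor.
backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()   # output: 2048-d feature vector per image
backbone.eval()

preprocess = T.Compose([
    T.Resize((224, 224)),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def extract_features(image_path: str) -> torch.Tensor:
    """Return the CNN embedding of one blood-smear cell image (path is hypothetical)."""
    img = Image.open(image_path).convert("RGB")
    with torch.no_grad():
        return backbone(preprocess(img).unsqueeze(0)).squeeze(0)

# The resulting vectors can then be fed to a conventional classifier
# (e.g. logistic regression or an SVM) to separate parasitized from
# uninfected cells.
```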

  9. Characterization of the volatile components in green tea by IRAE-HS-SPME/GC-MS combined with multivariate analysis.

    PubMed

    Yang, Yan-Qin; Yin, Hong-Xu; Yuan, Hai-Bo; Jiang, Yong-Wen; Dong, Chun-Wang; Deng, Yu-Liang

    2018-01-01

    In the present work, a novel infrared-assisted extraction coupled to headspace solid-phase microextraction (IRAE-HS-SPME) followed by gas chromatography-mass spectrometry (GC-MS) was developed for rapid determination of the volatile components in green tea. The extraction parameters, such as fiber type, sample amount, infrared power, extraction time, and infrared lamp distance, were optimized by orthogonal experimental design. Under optimum conditions, a total of 82 volatile compounds were identified in 21 green tea samples from different geographical origins. Compared with classical water-bath heating, the proposed technique offers remarkable advantages, considerably reducing the analysis time while maintaining high efficiency. In addition, an effective classification of green teas based on their volatile profiles was achieved by partial least squares-discriminant analysis (PLS-DA) and hierarchical clustering analysis (HCA). Furthermore, the application of a dual criterion based on the variable importance in the projection (VIP) values of the PLS-DA models and on the results of one-way analysis of variance (ANOVA) allowed the identification of 12 potential volatile markers, which were considered to make the most important contribution to the discrimination of the samples. The results suggest that the IRAE-HS-SPME/GC-MS technique combined with multivariate analysis offers a valuable tool to assess the geographical traceability of different tea varieties.
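
    A common way to obtain the VIP values mentioned above is the standard textbook formula applied to a fitted PLS model. The sketch below (scikit-learn, with hypothetical X and y placeholders) is a hedged illustration of that computation, not necessarily the software or variant used by the authors.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def vip_scores(pls: PLSRegression) -> np.ndarray:
    """Variable Importance in Projection for a fitted PLS(-DA) model."""
    T = pls.x_scores_        # (n_samples, A) latent scores
    W = pls.x_weights_       # (n_features, A) X weights
    Q = pls.y_loadings_      # (n_targets, A) Y loadings
    p, A = W.shape
    # Explained sum of squares of Y attributed to each component.
    ss = np.array([(Q[:, a] ** 2).sum() * (T[:, a] ** 2).sum() for a in range(A)])
    w_norm = W / np.linalg.norm(W, axis=0, keepdims=True)
    return np.sqrt(p * (w_norm ** 2 @ ss) / ss.sum())

# Usage sketch: X = peak areas (samples x volatiles), y = one-hot origin labels.
# pls = PLSRegression(n_components=3).fit(X, y)
# markers = np.where(vip_scores(pls) > 1.0)[0]   # VIP > 1 is a common cut-off
```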

  10. Recent advances in enzyme extraction strategies: A comprehensive review.

    PubMed

    Nadar, Shamraja S; Pawar, Rohini G; Rathod, Virendra K

    2017-08-01

    The increasing interest in industrial enzymes demands the development of new downstream strategies for maximizing enzyme recovery. Significant efforts have been focused on the development of newly adapted technologies to purify enzymes in catalytically active form. Recently, aqueous two-phase systems (ATPS) have emerged as powerful tools for efficient extraction and purification of enzymes due to their versatility, lower cost, process integration capability and easy scale-up. The present review gives an overview of the effect of parameters such as tie-line length, pH, neutral salts, and the properties of the polymer and salt involved in traditional polymer/polymer and polymer/salt ATPS on enzyme recovery. Further, advanced ATPS based on alcohols, surfactants and micellar compounds have been developed to avoid tedious recovery steps for obtaining the desired enzyme. In order to improve the selectivity and efficiency of ATPS, recent approaches combining conventional ATPS with different techniques, such as affinity ligands, ionic liquids, thermoseparating polymers and microfluidic-device-based ATPS, are reviewed. Moreover, three-phase partitioning is highlighted as an emerging technology for enzyme enrichment and as an efficiently integrated bioseparation technique. Finally, the review includes an overview of CLEAs technology and the preparation of organic-inorganic nanoflowers as novel strategies for simultaneous extraction, purification and immobilization of enzymes. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Characterization of the volatile components in green tea by IRAE-HS-SPME/GC-MS combined with multivariate analysis

    PubMed Central

    Yin, Hong-Xu; Yuan, Hai-Bo; Jiang, Yong-Wen; Dong, Chun-Wang; Deng, Yu-Liang

    2018-01-01

    In the present work, a novel infrared-assisted extraction coupled to headspace solid-phase microextraction (IRAE-HS-SPME) followed by gas chromatography-mass spectrometry (GC-MS) was developed for rapid determination of the volatile components in green tea. The extraction parameters, such as fiber type, sample amount, infrared power, extraction time, and infrared lamp distance, were optimized by orthogonal experimental design. Under optimum conditions, a total of 82 volatile compounds were identified in 21 green tea samples from different geographical origins. Compared with classical water-bath heating, the proposed technique offers remarkable advantages, considerably reducing the analysis time while maintaining high efficiency. In addition, an effective classification of green teas based on their volatile profiles was achieved by partial least squares-discriminant analysis (PLS-DA) and hierarchical clustering analysis (HCA). Furthermore, the application of a dual criterion based on the variable importance in the projection (VIP) values of the PLS-DA models and on the results of one-way analysis of variance (ANOVA) allowed the identification of 12 potential volatile markers, which were considered to make the most important contribution to the discrimination of the samples. The results suggest that the IRAE-HS-SPME/GC-MS technique combined with multivariate analysis offers a valuable tool to assess the geographical traceability of different tea varieties. PMID:29494626

  12. A comparative study of Averrhoa bilimbi extraction methods

    NASA Astrophysics Data System (ADS)

    Zulhaimi, H. I.; Rosli, I. R.; Kasim, K. F.; Akmal, H. Muhammad; Nuradibah, M. A.; Sam, S. T.

    2017-09-01

    In recent years, bioactive compounds in plants have come into the limelight in the food and pharmaceutical markets, leading to research interest in implementing effective technologies for extracting bioactive substances. Therefore, this study focuses on the extraction of Averrhoa bilimbi by two different extraction techniques, namely maceration and ultrasound-assisted extraction. A few plant parts of Averrhoa bilimbi were taken as extraction samples, namely fruits, leaves and twigs. Different solvents such as methanol, ethanol and distilled water were utilized in the process. Fruit extracts resulted in the highest extraction yield compared to the other plant parts. Ethanol and distilled water played a more significant role than methanol for all plant parts and both extraction techniques. The results also show that ultrasound-assisted extraction gave results comparable to maceration. Moreover, its shorter extraction time is useful for industrial implementation.

  13. Critical points of DNA quantification by real-time PCR – effects of DNA extraction method and sample matrix on quantification of genetically modified organisms

    PubMed Central

    Cankar, Katarina; Štebih, Dejan; Dreo, Tanja; Žel, Jana; Gruden, Kristina

    2006-01-01

    Background Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs), quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve; therefore, similarity of PCR efficiency for the sample and the standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Results Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition, 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess the variability of PCR performance within each sample matrix. Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results; therefore, it was chosen as the primary criterion by which to evaluate the quality and performance of the different matrixes and extraction techniques. The effect of PCR efficiency on the resulting GMO content is demonstrated. Conclusion The crucial influence of extraction technique and sample matrix properties on the results of GMO quantification is demonstrated. Appropriate extraction techniques for each matrix need to be determined to achieve accurate DNA quantification. Nevertheless, since in the area of food and feed testing it is impossible to define a matrix with certain specificities, strict quality controls need to be introduced to monitor PCR. The results of our study are also applicable to other fields of quantitative testing by real-time PCR. PMID:16907967
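
    Since quantification hinges on the PCR efficiency estimated from a standard curve, a small worked example may help. The sketch below (plain NumPy, with made-up Cq values) applies the usual relation E = 10^(-1/slope) - 1 to a dilution series; it is illustrative only and is not the validation procedure used in the study.

```python
import numpy as np

def pcr_efficiency(log10_copies: np.ndarray, cq: np.ndarray) -> float:
    """Amplification efficiency from a dilution-series standard curve.

    Fits Cq = slope * log10(copies) + intercept and converts the slope with
    E = 10**(-1/slope) - 1 (1.0 corresponds to 100% efficiency).
    """
    slope, _intercept = np.polyfit(log10_copies, cq, 1)
    return 10 ** (-1.0 / slope) - 1.0

# Example: a ten-fold dilution series of a reference material (synthetic Cq values).
log10_copies = np.array([5, 4, 3, 2, 1], dtype=float)
cq = np.array([18.1, 21.5, 24.9, 28.3, 31.6])
print(f"efficiency = {pcr_efficiency(log10_copies, cq):.2%}")  # ~98% for this series
```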

  14. Tabletted microspheres containing Cynara scolymus (var. Spinoso sardo) extract for the preparation of controlled release nutraceutical matrices.

    PubMed

    Gavini, E; Alamanni, M C; Cossu, M; Giunchedi, P

    2005-08-01

    Controlled release dosage forms based on tabletted microspheres containing fresh artichoke (Cynara scolymus) extract were developed for the oral administration of a nutritional supplement. Microspheres were prepared using a spray-drying technique; lactose or hypromellose were chosen as excipients. The microspheres were characterized in terms of encapsulated extract content, size and morphology. The qualitative and quantitative composition of the extract before and after the spray process was determined. Compressed matrices (tablets) were prepared by direct compression of the spray-dried microspheres. In vitro release tests of the microparticles and tablets were carried out in both acidic and neutral media. Spray-drying is a good method to prepare microspheres containing the artichoke extract. The microspheres encapsulate an amount of extract close to the theoretical value. Particle size analyses indicate that the microparticles have dvs of approximately 6-7 microm. Electron microscopy observations reveal that particles based on lactose have a spherical shape, whereas particles containing hypromellose are almost collapsed. The hydroalcoholic extract is stable to the microsphere production process: its polyphenolic composition (qualitative and quantitative) did not change after spraying. In vitro release studies show that the microparticles are characterized by a quick polyphenolic release in both acidic and neutral media due to the high water solubility of the carrier lactose. On the contrary, microspheres based on hypromellose release only 20% of the loaded extract at pH 1.2 in 2 h, and the total amount of polyphenols is released only after a further 6 h at pH 6.8. Matrices prepared by tabletting lactose microspheres and hypromellose microparticles in a 1:1 weight ratio show a slow release rate that lasts approximately 24 h. This once-a-day sustained release formulation containing Cynara scolymus extract could be proposed as a nutraceutical controlled release dosage form for oral administration.

  15. Automatic extraction of plots from geo-registered UAS imagery of crop fields with complex planting schemes

    NASA Astrophysics Data System (ADS)

    Hearst, Anthony A.

    Complex planting schemes are common in experimental crop fields and can make it difficult to extract plots of interest from high-resolution imagery of the fields gathered by Unmanned Aircraft Systems (UAS). This prevents UAS imagery from being applied in High-Throughput Precision Phenotyping and other areas of agricultural research. If the imagery is accurately geo-registered, then it may be possible to extract plots from the imagery based on their map coordinates. To test this approach, a UAS was used to acquire visual imagery of 5 ha of soybean fields containing 6.0 m2 plots in a complex planting scheme. Sixteen artificial targets were set up in the fields before flights, and different spatial configurations of 0 to 6 targets were used as Ground Control Points (GCPs) for geo-registration, resulting in a total of 175 geo-registered image mosaics with a broad range of geo-registration accuracies. Geo-registration accuracy was quantified based on the horizontal Root Mean Squared Error (RMSE) of targets used as checkpoints. Twenty test plots were extracted from the geo-registered imagery. Plot extraction accuracy was quantified based on the percentage of the desired plot area that was extracted. It was found that using 4 GCPs along the perimeter of the field minimized the horizontal RMSE and enabled a plot extraction accuracy of at least 70%, with a mean plot extraction accuracy of 92%. Future work will focus on further enhancing the plot extraction accuracy through additional image processing techniques so that it becomes sufficiently accurate for all practical purposes in agricultural research and potentially other areas of research.
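
    The accuracy metric used above is straightforward to compute; the short sketch below (NumPy, with hypothetical checkpoint coordinates) shows how the horizontal RMSE over checkpoint targets would be obtained from measured versus surveyed positions.

```python
import numpy as np

def horizontal_rmse(measured_xy: np.ndarray, true_xy: np.ndarray) -> float:
    """Horizontal RMSE of geo-registration from checkpoint targets.

    measured_xy, true_xy: (n, 2) arrays of easting/northing coordinates in metres.
    """
    residuals = measured_xy - true_xy
    return float(np.sqrt(np.mean(np.sum(residuals ** 2, axis=1))))

# Example with three hypothetical checkpoints (coordinates in metres).
measured = np.array([[500010.2, 4420001.1], [500105.7, 4420050.6], [500200.3, 4419998.9]])
surveyed = np.array([[500010.0, 4420001.0], [500105.5, 4420050.9], [500200.0, 4419999.2]])
print(horizontal_rmse(measured, surveyed))
```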

  16. Integrated Micro-Chip Amino Acid Chirality Detector for MOD

    NASA Technical Reports Server (NTRS)

    Glavin, D. P.; Bada, J. L.; Botta, O.; Kminek, G.; Grunthaner, F.; Mathies, R.

    2001-01-01

    Integration of a micro-chip capillary electrophoresis analyzer with a sublimation-based extraction technique, as used in the Mars Organic Detector (MOD), for the in-situ detection of amino acids and their enantiomers on solar system bodies. Additional information is contained in the original extended abstract.

  17. Rapid methods for extraction and concentration of poliovirus from oyster tissues.

    PubMed

    Richards, G P; Goldmintz, D; Green, D L; Babinchak, J A

    1982-12-01

    A procedure is discussed for the extraction of poliovirus from oyster meats by modification of several enterovirus extraction techniques. The modified method uses meat extract and Cat-Floc, a polycationic electrolyte, for virus extraction and concentration. Virus recovery from inoculated oyster homogenates is 93-120%. Adsorption of viruses to oyster proteins by acidification of homogenates does not affect virus recovery. Elution of viruses from oyster proteins appears more efficient at pH 9.5 than at pH 8.0. This technique is relatively simple, economical and requires only 2.5 h to complete the combined extraction and concentration procedure.

  18. Target oriented dimensionality reduction of hyperspectral data by Kernel Fukunaga-Koontz Transform

    NASA Astrophysics Data System (ADS)

    Binol, Hamidullah; Ochilov, Shuhrat; Alam, Mohammad S.; Bal, Abdullah

    2017-02-01

    Principal component analysis (PCA) is a popular technique in remote sensing for dimensionality reduction. While PCA is suitable for data compression, it is not necessarily an optimal technique for feature extraction, particularly when the features are exploited in supervised learning applications (Cheriyadat and Bruce, 2003) [1]. Preserving features belonging to the target is crucial to the performance of target detection/recognition techniques. A Fukunaga-Koontz Transform (FKT) based supervised band reduction technique can be used to meet this requirement. FKT achieves feature selection by transforming the data into a new space in which the feature classes have complementary eigenvectors. Analysis of these eigenvectors under two classes, target and background clutter, can be utilized for target-oriented band reduction, since the basis functions that best represent the target class carry the least information about the background class. By selecting the few eigenvectors that are most relevant to the target class, the dimensionality of hyperspectral data can be reduced, which presents significant advantages for near-real-time target detection applications. The nonlinear properties of the data can be extracted by a kernel approach, which provides better target features. Thus, we propose constructing a kernel FKT (KFKT) to perform target-oriented band reduction. The performance of the proposed KFKT based target-oriented dimensionality reduction algorithm has been tested on two real-world hyperspectral datasets and the results are reported.
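
    For background, the linear (non-kernel) FKT can be written in a few lines: whiten the summed class covariances, then eigendecompose the target covariance in the whitened space, so that directions with large eigenvalues for the target have, by construction, small eigenvalues for the background. The NumPy sketch below is a hedged illustration of the linear transform only, not the kernelized version proposed in the paper.

```python
import numpy as np

def fkt_basis(target: np.ndarray, background: np.ndarray, n_keep: int = 5) -> np.ndarray:
    """Linear Fukunaga-Koontz basis for target-oriented band reduction.

    target, background: (n_samples, n_bands) pixel spectra of each class.
    Returns a (n_bands, n_keep) projection matrix.
    """
    S_t = np.cov(target, rowvar=False)
    S_b = np.cov(background, rowvar=False)
    # Whitening transform P of the summed covariance: P.T @ (S_t + S_b) @ P = I.
    evals, evecs = np.linalg.eigh(S_t + S_b)
    P = evecs @ np.diag(1.0 / np.sqrt(np.maximum(evals, 1e-12)))
    # In the whitened space both class covariances share eigenvectors and their
    # eigenvalues sum to one: large for the target means small for the background.
    St_tilde = P.T @ S_t @ P
    lam, V = np.linalg.eigh(St_tilde)
    order = np.argsort(lam)[::-1]          # most target-dominant directions first
    return P @ V[:, order[:n_keep]]

# Usage sketch: project hyperspectral pixels X (n, bands) onto the reduced basis:
# W = fkt_basis(X_target, X_background, n_keep=10); X_reduced = X @ W
```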

  19. Tuberculosis diagnosis support analysis for precarious health information systems.

    PubMed

    Orjuela-Cañón, Alvaro David; Camargo Mendoza, Jorge Eliécer; Awad García, Carlos Enrique; Vergara Vela, Erika Paola

    2018-04-01

    Pulmonary tuberculosis is a world emergency for the World Health Organization. Techniques and new diagnostic tools are important to battle this bacterial infection. There have been many advances in all those fields, but in developing countries such as Colombia, where resources and infrastructure are limited, new, fast and less expensive strategies are increasingly needed. Artificial neural networks are computational intelligence techniques that can be used in this kind of problem and offer additional support in the tuberculosis diagnosis process, providing a tool for medical staff to make decisions about the management of subjects under suspicion of tuberculosis. A database of 105 subjects under suspicion of pulmonary tuberculosis, containing precarious (incomplete) information, was used in this study. Data on sex, age, diabetes, homelessness, AIDS status and a variable encoding clinical knowledge from the medical personnel were used. Models based on artificial neural networks were used, exploring supervised learning to detect the disease. Unsupervised learning was used to create three risk groups based on the available information. The obtained results are comparable with traditional techniques for detection of tuberculosis, showing advantages such as speed and low implementation costs. A sensitivity of 97% and a specificity of 71% were achieved. The techniques used provided valuable information that can support physicians who treat the disease in their decision-making processes, especially under limited infrastructure and data. Copyright © 2018 Elsevier B.V. All rights reserved.

  20. Improved aqueous scrubber for collection of soluble atmospheric trace gases

    NASA Technical Reports Server (NTRS)

    Cofer, W. R., III; Talbot, R. W.; Collins, V. G.

    1985-01-01

    A new concentration technique for the extraction and enrichment of water-soluble atmospheric trace gases has been developed. The gas scrubbing technique efficiently extracts soluble gases from a large volume flow rate of air sample into a small volume of refluxed trapping solution. The gas scrubber utilizes a small nebulizing nozzle that mixes the incoming air with an aqueous extracting solution to form an air/droplet mist. The mist provides excellent interfacial surface area for mass transfer. The resulting mist sprays upward through the reaction chamber until it impinges upon a hydrophobic membrane that virtually blocks the passage of droplets but offers little resistance to the exiting gas flow. Droplets containing the scrubbed gases coalesce on the membrane and drip back into the reservoir for further refluxing. After a suitable concentration period, the extracting solution containing the analyte can be withdrawn for analysis. The nebulization-reflux concentration technique is more efficient (maximum flow of gas through the minimum volume of extractant) than conventional bubbler/impinger gas extraction techniques and is offered as an alternative method.

  1. Synthetic Minority Oversampling Technique and Fractal Dimension for Identifying Multiple Sclerosis

    NASA Astrophysics Data System (ADS)

    Zhang, Yu-Dong; Zhang, Yin; Phillips, Preetha; Dong, Zhengchao; Wang, Shuihua

    Multiple sclerosis (MS) is a severe brain disease. Early detection can provide timely treatment. The fractal dimension provides a statistical index of how patterns change with scale in a given brain image. In this study, our team used the susceptibility weighted imaging technique to obtain 676 MS slices and 880 healthy slices. We used the synthetic minority oversampling technique to process the unbalanced dataset. Then, we used the Canny edge detector to extract distinguishing edges. The Minkowski-Bouligand dimension, a fractal dimension estimation method, was used to extract features from the edges. A single-hidden-layer neural network was used as the classifier. Finally, we proposed a three-segment representation biogeography-based optimization to train the classifier. Our method achieved a sensitivity of 97.78±1.29%, a specificity of 97.82±1.60% and an accuracy of 97.80±1.40%. The proposed method is superior to seven state-of-the-art methods in terms of sensitivity and accuracy.
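
    The Minkowski-Bouligand dimension is commonly estimated by box counting; the NumPy sketch below illustrates that estimate on a binary edge map (for instance, the output of a Canny detector). It is a generic textbook implementation, not the exact feature-extraction pipeline of the study.

```python
import numpy as np

def box_counting_dimension(edges: np.ndarray) -> float:
    """Minkowski-Bouligand (box-counting) dimension of a binary edge map.

    Counts occupied boxes at dyadic scales and fits log(count) vs log(1/size).
    """
    edges = edges.astype(bool)
    n = 2 ** int(np.floor(np.log2(min(edges.shape))))
    edges = edges[:n, :n]                       # crop to a power-of-two square
    sizes, counts = [], []
    size = n
    while size >= 2:
        blocks = edges.reshape(n // size, size, n // size, size)
        occupied = blocks.any(axis=(1, 3)).sum()
        sizes.append(size)
        counts.append(max(occupied, 1))
        size //= 2
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# Sanity check: a filled square of "edges" approaches dimension 2.
demo = np.zeros((256, 256), dtype=bool)
demo[64:192, 64:192] = True
print(box_counting_dimension(demo))
```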

  2. Coupling Front-End Separations, Ion Mobility Spectrometry, and Mass Spectrometry For Enhanced Multidimensional Biological and Environmental Analyses

    PubMed Central

    Zheng, Xueyun; Wojcik, Roza; Zhang, Xing; Ibrahim, Yehia M.; Burnum-Johnson, Kristin E.; Orton, Daniel J.; Monroe, Matthew E.; Moore, Ronald J.; Smith, Richard D.; Baker, Erin S.

    2017-01-01

    Ion mobility spectrometry (IMS) is a widely used analytical technique for rapid molecular separations in the gas phase. Though IMS alone is useful, its coupling with mass spectrometry (MS) and front-end separations is extremely beneficial for increasing measurement sensitivity, peak capacity of complex mixtures, and the scope of molecular information available from biological and environmental sample analyses. In fact, multiple disease screening and environmental evaluations have illustrated that the IMS-based multidimensional separations extract information that cannot be acquired with each technique individually. This review highlights three-dimensional separations using IMS-MS in conjunction with a range of front-end techniques, such as gas chromatography, supercritical fluid chromatography, liquid chromatography, solid-phase extractions, capillary electrophoresis, field asymmetric ion mobility spectrometry, and microfluidic devices. The origination, current state, various applications, and future capabilities of these multidimensional approaches are described in detail to provide insight into their uses and benefits. PMID:28301728

  3. Multiple Vehicle Detection and Segmentation in Malaysia Traffic Flow

    NASA Astrophysics Data System (ADS)

    Fariz Hasan, Ahmad; Fikri Che Husin, Mohd; Affendi Rosli, Khairul; Norhafiz Hashim, Mohd; Faiz Zainal Abidin, Amar

    2018-03-01

    Vision based systems are widely used in the field of Intelligent Transportation Systems (ITS) to extract a large amount of information for analyzing traffic scenes. The rapid growth in the number of vehicles on the road, as well as the significant increase in cameras, has dictated the need for traffic surveillance systems. Such a system can take over the burdensome task performed by human operators in a traffic monitoring centre. This paper concentrates on developing multiple-vehicle detection and segmentation for monitoring through Closed Circuit Television (CCTV) video. The system is able to automatically segment vehicles extracted from a heavy traffic scene by optical flow estimation together with a blob analysis technique in order to detect the moving vehicles. Prior to segmentation, the blob analysis computes the region of interest corresponding to each moving vehicle, which is used to create a bounding box around that particular vehicle. Experimental validation of the proposed system was performed, and the algorithm is demonstrated on various sets of traffic scenes.
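
    A minimal OpenCV sketch of this motion-based detection idea is given below: dense optical flow produces a motion mask, morphological closing cleans it, and connected-component (blob) statistics give a bounding box per moving object. The video file name and all thresholds are assumptions for illustration, not the parameters used by the authors.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture("traffic.mp4")          # hypothetical CCTV clip
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Dense optical flow between consecutive frames.
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    magnitude = np.linalg.norm(flow, axis=2)
    moving = (magnitude > 1.0).astype(np.uint8) * 255          # motion mask
    moving = cv2.morphologyEx(moving, cv2.MORPH_CLOSE, np.ones((7, 7), np.uint8))
    # Blob analysis: connected components give one bounding box per vehicle.
    n, _, stats, _ = cv2.connectedComponentsWithStats(moving)
    for x, y, w, h, area in stats[1:]:                          # skip background label
        if area > 500:                                          # ignore small blobs
            cv2.rectangle(frame, (int(x), int(y)), (int(x + w), int(y + h)),
                          (0, 255, 0), 2)
    prev_gray = gray
```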

  4. Eyeglasses Lens Contour Extraction from Facial Images Using an Efficient Shape Description

    PubMed Central

    Borza, Diana; Darabant, Adrian Sergiu; Danescu, Radu

    2013-01-01

    This paper presents a system that automatically extracts the position of the eyeglasses and the accurate shape and size of the frame lenses in facial images. The novelty brought by this paper consists in three key contributions. The first one is an original model for representing the shape of the eyeglasses lens, using Fourier descriptors. The second one is a method for generating the search space starting from a finite, relatively small number of representative lens shapes based on Fourier morphing. Finally, we propose an accurate lens contour extraction algorithm using a multi-stage Monte Carlo sampling technique. Multiple experiments demonstrate the effectiveness of our approach. PMID:24152926
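
    To make the shape-representation step concrete, the sketch below computes complex Fourier descriptors of a closed contour in NumPy, with a common translation and scale normalization; the paper's exact descriptor variant and the Fourier-morphing search-space generation are not reproduced here.

```python
import numpy as np

def fourier_descriptors(contour_xy: np.ndarray, n_coeffs: int = 16) -> np.ndarray:
    """Fourier descriptors of a closed 2-D contour.

    contour_xy: (n_points, 2) ordered boundary points of the lens contour.
    Returns n_coeffs complex coefficients normalized for translation and scale
    (a common normalization; the paper's variant may differ).
    """
    z = contour_xy[:, 0] + 1j * contour_xy[:, 1]   # complex representation of the boundary
    coeffs = np.fft.fft(z)
    coeffs[0] = 0.0                                # drop DC term -> translation invariance
    coeffs /= np.abs(coeffs[1]) + 1e-12            # scale invariance
    return coeffs[1:n_coeffs + 1]

# Example: descriptors of an ellipse-like lens outline.
t = np.linspace(0, 2 * np.pi, 256, endpoint=False)
ellipse = np.stack([30 * np.cos(t), 18 * np.sin(t)], axis=1)
print(np.round(np.abs(fourier_descriptors(ellipse, 8)), 3))
```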

  5. Benefits of utilizing CellProfiler as a characterization tool for U–10Mo nuclear fuel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collette, R.; Douglas, J.; Patterson, L.

    2015-07-15

    Automated image processing techniques have the potential to aid in the performance evaluation of nuclear fuels by eliminating judgment calls that may vary from person to person or sample to sample. Analysis of in-core fuel performance is required for design and safety evaluations related to almost every aspect of the nuclear fuel cycle. This study presents a methodology for assessing the quality of uranium-molybdenum fuel images and describes image analysis routines designed for the characterization of several important microstructural properties. The analyses are performed in CellProfiler, an open-source program designed to enable biologists without training in computer vision or programming to automatically extract cellular measurements from large image sets. The quality metric scores an image based on three parameters: the illumination gradient across the image, the overall focus of the image, and the fraction of the image that contains scratches. The metric presents the user with the ability to 'pass' or 'fail' an image based on a reproducible quality score. Passable images may then be characterized through a separate CellProfiler pipeline, which enlists a variety of common image analysis techniques. The results demonstrate the ability to reliably pass or fail images based on the illumination, focus, and scratch fraction of the image, followed by automatic extraction of morphological data with respect to fission gas voids, interaction layers, and grain boundaries. Highlights: a technique is developed to score U-10Mo FIB-SEM image quality using CellProfiler; the pass/fail metric is based on image illumination, focus, and area scratched; automated image analysis is performed in pipeline fashion to characterize images; fission gas void, interaction layer, and grain boundary coverage data are extracted; preliminary characterization results demonstrate the consistency of the algorithm.

  6. Deformation-based augmented reality for hepatic surgery.

    PubMed

    Haouchine, Nazim; Dequidt, Jérémie; Berger, Marie-Odile; Cotin, Stéphane

    2013-01-01

    In this paper we introduce a method for augmenting the laparoscopic view during hepatic tumor resection. Using augmented reality techniques, vessels, tumors and cutting planes computed from pre-operative data can be overlaid onto the laparoscopic video. Compared to current techniques, which are limited to a rigid registration of the pre-operative liver anatomy with the intra-operative image, we propose a real-time, physics-based, non-rigid registration. The main strength of our approach is that the deformable model can also be used to regularize the data extracted from the computer vision algorithms. We show preliminary results on a video sequence which clearly highlights the interest of using physics-based model for elastic registration.

  7. [Technical questions of the transrectal specimen extraction].

    PubMed

    Lukovich, Péter; Csibi, Noémi; Bokor, Attila

    2016-03-01

    During laparoscopic partial colectomy the specimen can be extracted transrectally. This technique decreases the invasiveness of the surgery, because an abdominal wall incision is avoided. The prerequisites of a new surgical technique are a precise technical description as well as a favourable balance of advantages and disadvantages. In this paper the authors review the technique they apply and analyse their first results. 45 laparoscopic bowel resections were performed by a multidisciplinary team between 16th April 2014 and 1st November 2015. The indication for surgery was endometriosis, and the specimen was extracted transrectally in 11 patients. Both bowel ends were ligated proximal and distal to the segment infiltrated with endometriosis, and the proximal bowel was secured with a laparoscopic bulldog clamp. The bowel was then resected and the specimen was extracted transrectally in a camera bag. A purse-string suture was placed into the proximal bowel end, and the anvil of the circular stapler--which was introduced transrectally--was inserted into the bowel. After closing the rectal stump, the anastomosis was performed with a circular stapler. We used this technique when the upper third of the rectum or the sigmoid colon was infiltrated with endometriosis. The difference between the operation times of the two techniques (transabdominal vs. transrectal specimen extraction: 108 min vs. 118 min) was not significant. There was no difference in the WBC count between the first and second postoperative days, and no anastomotic leakage was detected. By using the above technique, postoperative infections could be reduced to a minimum. Transrectal specimen extraction did not increase postoperative complications. The authors believe this is a safe way of specimen extraction after partial colectomy.

  8. An Eulerian time filtering technique to study large-scale transient flow phenomena

    NASA Astrophysics Data System (ADS)

    Vanierschot, Maarten; Persoons, Tim; van den Bulck, Eric

    2009-10-01

    Unsteady fluctuating velocity fields can contain large-scale periodic motions with frequencies well separated from those of turbulence. Examples are the wake behind a cylinder or the precessing vortex core in a swirling jet. These turbulent flow fields contain large-scale, low-frequency oscillations which are obscured by turbulence, making it impossible to identify them. In this paper, we present an Eulerian time filtering (ETF) technique to extract the large-scale motions from unsteady, statistically non-stationary velocity fields or from flow fields with multiple phenomena that have sufficiently separated spectral content. The ETF method is based on non-causal time filtering of the velocity records at each point of the flow field. It is shown that the ETF technique gives good results, similar to the ones obtained by the phase-averaging method. In this paper, not only is the influence of the temporal filter checked, but parameters such as the cut-off frequency and the sampling frequency of the data are also investigated. The technique is validated on a selected set of time-resolved stereoscopic particle image velocimetry measurements, such as the initial region of an annular jet and the transition between flow patterns in an annular jet. The major advantage of the ETF method in the extraction of large scales is that it is computationally less expensive and requires less measurement time compared to other extraction methods. Therefore, the technique is suitable in the startup phase of an experiment or in a measurement campaign where several experiments are needed, such as parametric studies.
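
    Non-causal (zero-phase) filtering of a single-point velocity record can be illustrated with SciPy's forward-backward filter; the sketch below is a generic example of this kind of Eulerian time filter, with an assumed cut-off frequency and synthetic data, and is not the authors' exact filter design.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def eulerian_time_filter(u: np.ndarray, fs: float, f_cut: float, order: int = 4) -> np.ndarray:
    """Non-causal (zero-phase) low-pass filter of one velocity record.

    u: velocity time series at a single grid point; fs: sampling frequency (Hz);
    f_cut: cut-off frequency separating large-scale motion from turbulence.
    filtfilt runs the filter forward and backward, so no phase lag is introduced.
    """
    b, a = butter(order, f_cut / (fs / 2.0))   # normalized low-pass cut-off
    return filtfilt(b, a, u)

# Example: a 5 Hz large-scale oscillation buried in broadband "turbulence".
fs = 1000.0
t = np.arange(0, 2, 1 / fs)
u = np.sin(2 * np.pi * 5 * t) + 0.5 * np.random.default_rng(0).normal(size=t.size)
u_large_scale = eulerian_time_filter(u, fs, f_cut=20.0)
# Applying this point by point to a PIV velocity field recovers the
# large-scale motion, analogous to phase averaging.
```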

  9. A Spatial Division Clustering Method and Low Dimensional Feature Extraction Technique Based Indoor Positioning System

    PubMed Central

    Mo, Yun; Zhang, Zhongzhao; Meng, Weixiao; Ma, Lin; Wang, Yao

    2014-01-01

    Indoor positioning systems based on the fingerprint method are widely used due to the large number of existing devices with a wide range of coverage. However, extensive positioning regions with a massive fingerprint database may cause high computational complexity and large error margins; therefore, clustering methods are widely applied as a solution. However, traditional clustering methods in positioning systems can only measure the similarity of the Received Signal Strength without being concerned with the continuity of the physical coordinates. In addition, outages of access points can result in asymmetric matching problems which severely affect the fine positioning procedure. To solve these issues, in this paper we propose a positioning system based on the Spatial Division Clustering (SDC) method for clustering the fingerprint dataset subject to physical distance constraints. With the Genetic Algorithm and Support Vector Machine techniques, SDC can achieve higher coarse positioning accuracy than traditional clustering algorithms. In terms of fine localization, based on the Kernel Principal Component Analysis method, the proposed positioning system outperforms its counterparts based on other feature extraction methods in low dimensionality. Apart from balancing the online matching computational burden, the new positioning system exhibits advantageous performance in radio map clustering and shows better robustness and adaptability to the asymmetric matching problem. PMID:24451470
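
    A hedged sketch of the fine-localization stage only is given below: Kernel PCA compresses RSS fingerprints (samples x access points) to a few nonlinear components, and a simple nearest-neighbour regressor maps them to coordinates. The data are synthetic, and the SDC clustering, GA, and SVM coarse stage of the paper are not reproduced.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.neighbors import KNeighborsRegressor

# Synthetic fingerprint database: RSS values (dBm) at reference points.
rng = np.random.default_rng(0)
rss = rng.normal(-70, 8, size=(400, 30))        # 400 reference points x 30 access points
xy = rng.uniform(0, 50, size=(400, 2))          # reference-point coordinates (m)

# Kernel PCA extracts a low-dimensional nonlinear feature representation.
kpca = KernelPCA(n_components=5, kernel="rbf", gamma=1e-3).fit(rss)
features = kpca.transform(rss)

# A simple regressor maps the low-dimensional features to positions.
locator = KNeighborsRegressor(n_neighbors=3).fit(features, xy)
query = kpca.transform(rss[:1])                 # an online RSS measurement
print(locator.predict(query))                   # estimated position
```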

  10. Proteomic platform for the identification of proteins in olive (Olea europaea) pulp.

    PubMed

    Capriotti, Anna Laura; Cavaliere, Chiara; Foglia, Patrizia; Piovesana, Susy; Samperi, Roberto; Stampachiacchiere, Serena; Laganà, Aldo

    2013-10-24

    The nutritional and cancer-protective properties of the oil extracted mechanically from the ripe fruits of Olea europaea trees are attracting constantly more attention worldwide. The preparation of high-quality protein samples from plant tissues for proteomic analysis poses many challenging problems. In this study we employed a proteomic platform based on two different extraction methods, SDS and CHAPS based protocols, followed by two precipitation protocols, TCA/acetone and MeOH precipitation, in order to increase the final number of identified proteins. The use of advanced MS techniques in combination with the Swissprot and NCBI Viridiplantae databases and TAIR10 Arabidopsis database allowed us to identify 1265 proteins, of which 22 belong to O. europaea. The application of this proteomic platform for protein extraction and identification will be useful also for other proteomic studies on recalcitrant plant/fruit tissues. Copyright © 2013. Published by Elsevier B.V.

  11. Texture feature extraction based on a uniformity estimation method for local brightness and structure in chest CT images.

    PubMed

    Peng, Shao-Hu; Kim, Deok-Hwan; Lee, Seok-Lyong; Lim, Myung-Kwan

    2010-01-01

    Texture features are among the most important feature analysis methods in computer-aided diagnosis (CAD) systems for disease diagnosis. In this paper, we propose a Uniformity Estimation Method (UEM) for local brightness and structure to detect pathological changes in chest CT images. Based on the characteristics of chest CT images, we extract texture features by proposing an extension of the rotation invariant LBP (ELBP(riu4)) and the gradient orientation difference so as to represent a uniform pattern of the brightness and structure in the image. The use of ELBP(riu4) and the gradient orientation difference allows us to extract rotation invariant texture features in multiple directions. Beyond this, we propose to employ the integral image technique to speed up the texture feature computation of the spatial gray level dependent method (SGLDM). Copyright © 2010 Elsevier Ltd. All rights reserved.
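
    For orientation, the standard rotation-invariant uniform LBP (on which ELBP(riu4) builds; the extension itself is not reproduced here) and a summed-area table are both a few lines of Python. The sketch below uses scikit-image and NumPy on a synthetic slice and is illustrative only.

```python
import numpy as np
from skimage.feature import local_binary_pattern

def lbp_histogram(image: np.ndarray, radius: int = 1, n_points: int = 8) -> np.ndarray:
    """Rotation-invariant uniform LBP histogram of a grayscale slice."""
    codes = local_binary_pattern(image, n_points, radius, method="uniform")
    hist, _ = np.histogram(codes, bins=np.arange(n_points + 3), density=True)
    return hist

def integral_image(image: np.ndarray) -> np.ndarray:
    """Summed-area table: any rectangular sum becomes four look-ups."""
    return image.cumsum(axis=0).cumsum(axis=1)

# Example on a synthetic slice.
img = np.random.default_rng(0).integers(0, 255, size=(128, 128)).astype(np.uint8)
print(lbp_histogram(img))
```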

  12. Classifiers utilized to enhance acoustic based sensors to identify round types of artillery/mortar

    NASA Astrophysics Data System (ADS)

    Grasing, David; Desai, Sachi; Morcos, Amir

    2008-04-01

    Feature extraction methods based on the statistical analysis of the change in event pressure levels over a period, and on the level of ambient pressure excitation, facilitate the development of a robust classification algorithm. The features reliably discriminate mortar and artillery variants via the acoustic signals produced during launch events. Acoustic sensors are utilized to exploit the sound waveform generated by the blast for the identification of mortar and artillery variants (as type A, etc.) through analysis of the waveform. Distinct characteristics arise within the different mortar/artillery variants because varying HE mortar payloads and related charges produce events of varying size at launch. The waveform holds various harmonic properties distinct to a given mortar/artillery variant that, through advanced signal processing and data mining techniques, can be employed to classify a given type. Skewness and other statistical processing techniques are used to extract the predominant components from the acoustic signatures at ranges exceeding 3000 m. Exploiting these techniques helps develop a feature set highly independent of range, providing discrimination based on acoustic elements of the blast wave. Highly reliable discrimination is achieved with a feedforward neural network classifier trained on a feature space derived from the distribution of statistical coefficients, the frequency spectrum, and higher frequency details found within different energy bands. The processes described herein extend current technologies, which emphasize acoustic sensor systems, to provide such situational awareness.

  13. Artillery/mortar type classification based on detected acoustic transients

    NASA Astrophysics Data System (ADS)

    Morcos, Amir; Grasing, David; Desai, Sachi

    2008-04-01

    Feature extraction methods based on the statistical analysis of the change in event pressure levels over a period, and on the level of ambient pressure excitation, facilitate the development of a robust classification algorithm. The features reliably discriminate mortar and artillery variants via the acoustic signals produced during launch events. Acoustic sensors are utilized to exploit the sound waveform generated by the blast for the identification of mortar and artillery variants (as type A, etc.) through analysis of the waveform. Distinct characteristics arise within the different mortar/artillery variants because varying HE mortar payloads and related charges produce events of varying size at launch. The waveform holds various harmonic properties distinct to a given mortar/artillery variant that, through advanced signal processing and data mining techniques, can be employed to classify a given type. Skewness and other statistical processing techniques are used to extract the predominant components from the acoustic signatures at ranges exceeding 3000 m. Exploiting these techniques helps develop a feature set highly independent of range, providing discrimination based on acoustic elements of the blast wave. Highly reliable discrimination is achieved with a feed-forward neural network classifier trained on a feature space derived from the distribution of statistical coefficients, the frequency spectrum, and higher frequency details found within different energy bands. The processes described herein extend current technologies, which emphasize acoustic sensor systems, to provide such situational awareness.

  14. A PCA aided cross-covariance scheme for discriminative feature extraction from EEG signals.

    PubMed

    Zarei, Roozbeh; He, Jing; Siuly, Siuly; Zhang, Yanchun

    2017-07-01

    Feature extraction of EEG signals plays a significant role in brain-computer interfaces (BCI) as it can significantly affect the performance and the computational time of the system. The main aim of the current work is to introduce an innovative algorithm for acquiring reliable discriminating features from EEG signals to improve classification performance and to reduce the time complexity. This study develops a robust feature extraction method combining principal component analysis (PCA) and the cross-covariance technique (CCOV) for the extraction of discriminatory information from the mental states based on EEG signals in BCI applications. We apply the correlation based variable selection method with best-first search on the extracted features to identify the best feature set for characterizing the distribution of mental state signals. To verify the robustness of the proposed feature extraction method, three machine learning techniques, multilayer perceptron neural networks (MLP), least square support vector machine (LS-SVM), and logistic regression (LR), are employed on the obtained features. The proposed methods are evaluated on two publicly available datasets. Furthermore, we evaluate the performance of the proposed methods by comparing them with some recently reported algorithms. The experimental results show that all three classifiers achieve high performance (above 99% overall classification accuracy) for the proposed feature set. Among these classifiers, the MLP and LS-SVM methods yield the best performance for the obtained features. The average sensitivity, specificity and classification accuracy for these two classifiers are the same: 99.32%, 100%, and 99.66%, respectively, for BCI competition dataset IVa, and 100%, 100%, and 100% for BCI competition dataset IVb. The results also indicate that the proposed methods outperform the most recently reported methods by at least 0.25% average accuracy improvement on dataset IVa. The execution time results show that the proposed method has lower time complexity after feature selection. The proposed feature extraction method is very effective for extracting representative information from mental-state EEG signals in BCI applications and for reducing the computational complexity of classifiers by reducing the number of extracted features. Copyright © 2017 Elsevier B.V. All rights reserved.
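
    The general PCA-plus-cross-covariance idea can be sketched as follows (scikit-learn and NumPy, with synthetic data). The authors' exact CCOV construction and the subsequent feature-selection step are not reproduced, so this is only a hedged illustration of combining the two ingredients.

```python
import numpy as np
from sklearn.decomposition import PCA

def pca_ccov_features(epoch: np.ndarray, template: np.ndarray, n_components: int = 4) -> np.ndarray:
    """Illustrative PCA + cross-covariance features for one EEG epoch.

    epoch, template: (n_samples, n_channels) EEG segments of equal length.
    PCA compresses the channels, then the cross-covariance between the epoch
    and a class template in PCA space is vectorized as the feature.
    """
    pca = PCA(n_components=n_components).fit(np.vstack([epoch, template]))
    a = pca.transform(epoch)
    b = pca.transform(template)
    a -= a.mean(axis=0)
    b -= b.mean(axis=0)
    ccov = a.T @ b / (len(a) - 1)          # (n_components, n_components)
    return ccov.ravel()

# Example with synthetic 16-channel epochs of 512 samples.
rng = np.random.default_rng(0)
feat = pca_ccov_features(rng.normal(size=(512, 16)), rng.normal(size=(512, 16)))
print(feat.shape)   # 16 features per epoch
```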

  15. 2D DOST based local phase pattern for face recognition

    NASA Astrophysics Data System (ADS)

    Moniruzzaman, Md.; Alam, Mohammad S.

    2017-05-01

    A new two dimensional (2-D) Discrete Orthogonal Stockwell Transform (DOST) based Local Phase Pattern (LPP) technique has been proposed for efficient face recognition. The proposed technique uses the 2-D DOST as a preliminary preprocessing step and the local phase pattern to form a robust feature signature which can effectively accommodate various 3D facial distortions and illumination variations. The S-transform, an extension of the ideas of the continuous wavelet transform (CWT), is known for its local spectral phase properties in time-frequency representation (TFR). It provides a frequency dependent resolution of the time-frequency space and absolutely referenced local phase information while maintaining a direct relationship with the Fourier spectrum, which is unique among TFRs. Utilizing the 2-D S-transform as preprocessing and building the local phase pattern from the extracted phase information yields a fast and efficient technique for face recognition. The proposed technique shows better correlation discrimination compared to alternative pattern recognition techniques such as wavelet or Gabor based face recognition. The performance of the proposed method has been tested using the Yale and extended Yale facial databases under different environments such as illumination variation and 3D changes in facial expression. Test results show that the proposed technique yields better performance compared to alternative time-frequency representation (TFR) based face recognition techniques.

  16. Hexahedral finite element mesh coarsening using pillowing technique

    DOEpatents

    Staten, Matthew L [Pittsburgh, PA; Woodbury, Adam C [Provo, UT; Benzley, Steven E [Provo, UT; Shepherd, Jason F [Edgewood, NM

    2012-06-05

    A technique for coarsening a hexahedral mesh is described. The technique includes identifying a coarsening region within a hexahedral mesh to be coarsened. A boundary sheet of hexahedral elements is inserted into the hexahedral mesh around the coarsening region. A column of hexahedral elements is identified within the boundary sheet. The column of hexahedral elements is collapsed to create an extraction sheet of hexahedral elements contained within the coarsening region. Then, the extraction sheet of hexahedral elements is extracted to coarsen the hexahedral mesh.

  17. Phospholipid Fatty Acid Analysis: Past, Present and Future

    NASA Astrophysics Data System (ADS)

    Findlay, R. H.

    2008-12-01

    With their 1980 publication, Bobbie and White initiated the use of phospholipid fatty acids for the study of microbial communities. This method, integrated with a previously published biomass assay based on the colorimetric detection of orthophosphate liberated from phospholipids, provided the first quantitative method for determining microbial community structure. The method is based on a quantitative extraction of lipids from the sample matrix, isolation of the phospholipids, conversion of the phospholipid fatty acids to their corresponding fatty acid methyl esters (known by the acronym FAME), and the separation, identification, and quantification of the FAME by gas chromatography. Early laboratory and field studies focused on correlating individual fatty acids with particular groups of microorganisms. Subsequent improvements to the methodology include reduced solvent volumes for extractions, improved sensitivity in the detection of orthophosphate, and the use of solid-phase extraction technology. Improvements in the field of gas chromatography also increased the accessibility of the technique, and it has been widely applied to water, sediment, soil, and aerosol samples. Whole-cell fatty acid analysis, a related but not equivalent technique, is currently used for phenotypic characterization in bacterial species descriptions and is the basis for a commercial, rapid bacterial identification system. In the early 1990s, the application of multivariate statistical analysis, first cluster analysis and then principal component analysis, further improved the usefulness of the technique and allowed the development of a functional-group approach to the interpretation of phospholipid fatty acid profiles. Statistical techniques currently applied to the analysis of phospholipid fatty acid profiles include constrained ordinations and neural networks. Using redundancy analysis, a form of constrained ordination, we have recently shown that both cation concentration and dissolved organic matter (DOM) quality are determinants of microbial community structure in forested headwater streams. One of the most exciting recent developments in phospholipid fatty acid analysis is the application of compound-specific stable isotope analysis. We are currently applying this technique to stream sediments to help determine which microorganisms are involved in the initial processing of DOM, and the technique promises to be a useful tool for assigning ecological function to microbial populations.
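
    As a rough illustration of the multivariate step described above (not code from any of the cited studies), the following sketch normalises mock FAME profiles to relative abundance and ordinates the samples with principal component analysis; the fatty acid labels and abundances are invented.

        # Minimal sketch of multivariate analysis of PLFA/FAME profiles:
        # relative-abundance normalisation followed by PCA ordination.
        import numpy as np
        from sklearn.decomposition import PCA

        fame_labels = ["16:0", "18:1w9c", "18:2w6", "cy19:0", "10Me16:0"]   # example FAMEs
        profiles = np.array([                     # nmol FAME per g sample (mock data)
            [120.0, 40.0, 15.0,  8.0, 5.0],
            [100.0, 55.0, 10.0, 12.0, 7.0],
            [ 80.0, 30.0, 25.0,  5.0, 3.0],
            [ 95.0, 45.0, 20.0, 10.0, 6.0],
        ])

        rel = profiles / profiles.sum(axis=1, keepdims=True)   # mole-fraction profiles
        scores = PCA(n_components=2).fit_transform(rel)        # ordination of samples
        print(scores)   # samples plotting close together have similar community structure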

  18. Extraction and Production of Omega-3 from UniMAP Puyu (Jade Perch) and Mackerel

    NASA Astrophysics Data System (ADS)

    Nur Izzati, I.; Zainab, H.; Nornadhiratulhusna, M.; Chee Hann, Y.; Khairunissa Syairah, A. S.; Amira Farzana, S.

    2018-03-01

    Extraction techniques for obtaining fish oil from various types of fish are numerous but not widely accepted because of the use of chemicals that may be harmful to health. In this study, fish oil was extracted using microwave-assisted extraction, which uses only water. The optimum conditions for fish oil extraction were determined by examining three parameters: microwave power (300-700 W), extraction time (10-30 min), and amount of water used (70-190 mL). Optimum conditions were determined using a design of experiments (DOE) approach. The optimum condition obtained was 300 W microwave power, 10 min extraction time, and 190 mL of water. Fourier transform infrared spectroscopy (FTIR) was used to analyze the functional groups of the fish oil. Two types of fish, Jade Perch (UniMAP Puyu) and Indian Mackerel, were used. A standard omega-3 oil sample (Blackmores) purchased from a pharmacy was also analyzed to confirm the presence of omega-3 oil in these fishes. Compounds similar to those in the standard were present in both Jade Perch and Indian Mackerel; therefore, omega-3 fish oil was present in both types of fish. The omega-3 content of UniMAP Puyu was higher than that of Indian Mackerel. However, the FTIR analysis showed that, besides omega-3, the two fishes also contain other functional groups such as alkanes, alkenes, aldehydes, ketones, and many others. The fish oil yield of Jade Perch (9%) was lower than that of Indian Mackerel (10%).
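
    The sketch below illustrates a design-of-experiments workflow over the three factors named in the abstract: a full-factorial design, a simple linear model fit, and selection of the best predicted run. The factor levels follow the stated ranges, but the yield values and the model form are invented assumptions, not the study's data.

        # Illustrative DOE-style optimisation over the three MAE parameters.
        import itertools
        import numpy as np

        powers = [300, 500, 700]     # microwave power, W
        times  = [10, 20, 30]        # extraction time, min
        waters = [70, 130, 190]      # water volume, mL

        design = list(itertools.product(powers, times, waters))   # full factorial, 27 runs

        rng = np.random.default_rng(1)
        # Hypothetical response: oil yield (%) for each run -- replace with lab data.
        yields = np.array([10 - 0.004*(p-300) - 0.05*(t-10) + 0.01*(w-70) + rng.normal(0, 0.2)
                           for p, t, w in design])

        # Fit a simple linear model yield ~ power + time + water via least squares.
        Xmat = np.column_stack([np.ones(len(design)), *np.array(design).T])
        coef, *_ = np.linalg.lstsq(Xmat, yields, rcond=None)
        print("intercept, power, time, water effects:", coef)

        # Pick the run with the highest modelled yield as the optimum condition.
        best = design[int(np.argmax(Xmat @ coef))]
        print("optimum (power W, time min, water mL):", best)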

  19. Supercritical fluid extraction of grape seeds: extract chemical composition, antioxidant activity and inhibition of nitrite production in LPS-stimulated Raw 264.7 cells.

    PubMed

    Pérez, Concepción; Ruiz del Castillo, María Luisa; Gil, Carmen; Blanch, Gracia Patricia; Flores, Gema

    2015-08-01

    Grape by-products are a rich source of bioactive compounds with broad medicinal properties, but they are usually discarded as waste by juice/wine processing industries. The present study investigates the use of supercritical fluid extraction (SFE) for obtaining an extract rich in bioactive compounds. First, several variables involved in the extraction were evaluated. The SFE conditions were then selected based on the oil mass yield, fatty acid profile, and total phenolic composition. As a result, 40 °C and 300 bar were selected as the operational conditions. The phenolic composition of the grape seed oil was determined using LC-DAD. The antioxidant activity was determined by ABTS and DPPH assays. For the anti-inflammatory activity, the inhibition of nitrite production in LPS-stimulated RAW 264.7 cells was assessed. The grape seed oil extracted was rich in phenolic compounds and fatty acids with significant antioxidant and anti-inflammatory activities. From these results, adding economic value to this agro-industrial residue using environmentally friendly techniques is proposed.

  20. Comparative analysis of feature extraction methods in satellite imagery

    NASA Astrophysics Data System (ADS)

    Karim, Shahid; Zhang, Ye; Asif, Muhammad Rizwan; Ali, Saad

    2017-10-01

    Feature extraction techniques are extensively used in satellite imagery and are attracting considerable attention in remote sensing applications. State-of-the-art feature extraction methods are chosen according to the categories and structures of the objects to be detected. Because each feature extraction method relies on distinctive computations, different types of images are selected to evaluate the performance of the methods, which include binary robust invariant scalable keypoints (BRISK), the scale-invariant feature transform (SIFT), speeded-up robust features (SURF), features from accelerated segment test (FAST), the histogram of oriented gradients (HOG), and local binary patterns (LBP). The total computational time is calculated to evaluate the speed of each feature extraction method. The extracted features are counted in shadow regions and in preprocessed shadow regions to compare the behavior of each method. We have studied the combination of SURF with FAST and with BRISK individually and found very promising results, with an increased number of features and less computational time. Finally, feature matching is compared for all methods.
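
    A minimal sketch of this kind of comparison using OpenCV is given below; it is not the authors' benchmark. SURF is omitted because it resides in the non-free contrib module, the image path is a placeholder, and the FAST+BRISK pairing at the end only hints at the combination strategy discussed in the abstract.

        # Compare keypoint detectors and their timing on one image with OpenCV.
        import time
        import cv2
        import numpy as np

        img = cv2.imread("satellite_tile.png", cv2.IMREAD_GRAYSCALE)
        if img is None:                                  # fall back to synthetic data
            img = np.random.randint(0, 256, (512, 512), dtype=np.uint8)

        detectors = {
            "BRISK": cv2.BRISK_create(),
            "SIFT":  cv2.SIFT_create(),
            "FAST":  cv2.FastFeatureDetector_create(),
            "ORB":   cv2.ORB_create(),
        }

        for name, det in detectors.items():
            t0 = time.perf_counter()
            keypoints = det.detect(img, None)
            dt = time.perf_counter() - t0
            print(f"{name:5s}: {len(keypoints):6d} keypoints in {dt*1000:.1f} ms")

        # Feature matching example: detect with FAST, describe with BRISK, then match
        # descriptors between the image and a shifted copy using brute-force Hamming.
        fast, brisk = detectors["FAST"], detectors["BRISK"]
        kp1 = fast.detect(img, None)
        kp1, des1 = brisk.compute(img, kp1)
        shifted = np.roll(img, 5, axis=1)
        kp2, des2 = brisk.compute(shifted, fast.detect(shifted, None))
        matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
        print("FAST+BRISK matches:", len(matches))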
