Science.gov

Sample records for automated relation extraction

  1. Hybrid curation of gene–mutation relations combining automated extraction and crowdsourcing

    PubMed Central

    Burger, John D.; Doughty, Emily; Khare, Ritu; Wei, Chih-Hsuan; Mishra, Rajashree; Aberdeen, John; Tresner-Kirsch, David; Wellner, Ben; Kann, Maricel G.; Lu, Zhiyong; Hirschman, Lynette

    2014-01-01

    Background: This article describes capture of biological information using a hybrid approach that combines natural language processing to extract biological entities and crowdsourcing with annotators recruited via Amazon Mechanical Turk to judge the correctness of candidate biological relations. These techniques were applied to extract gene–mutation relations from biomedical abstracts with the goal of supporting production-scale capture of gene–mutation–disease findings as an open source resource for personalized medicine. Results: The hybrid system could be configured to provide good performance for gene–mutation extraction (precision ∼82%; recall ∼70% against an expert-generated gold standard) at a cost of $0.76 per abstract. This demonstrates that crowd labor platforms such as Amazon Mechanical Turk can be used to recruit quality annotators, even in an application requiring subject matter expertise; aggregated Turker judgments for gene–mutation relations exceeded 90% accuracy. Over half of the precision errors were due to mismatches against the gold standard hidden from annotator view (e.g. an incorrect EntrezGene identifier or an incorrect mutation position extracted) or incomplete task instructions (e.g. the need to exclude nonhuman mutations). Conclusions: The hybrid curation model provides a readily scalable, cost-effective approach to curation, particularly if coupled with expert human review to filter precision errors. We plan to generalize the framework and make it available as open source software. Database URL: http://www.mitre.org/publications/technical-papers/hybrid-curation-of-gene-mutation-relations-combining-automated PMID:25246425
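
    The aggregation of Turker judgments described above can be illustrated with a simple majority vote over worker labels. The sketch below is illustrative only: the relation identifiers and the voting rule are assumptions, not the paper's actual pipeline.

```python
from collections import Counter

def aggregate_judgments(judgments):
    """Majority-vote aggregation of crowd-worker labels.

    judgments: dict mapping a candidate relation id to a list of
    worker labels ("yes"/"no").  Returns relation id -> (winning
    label, fraction of workers who voted for it).
    """
    results = {}
    for relation_id, labels in judgments.items():
        label, votes = Counter(labels).most_common(1)[0]
        results[relation_id] = (label, votes / len(labels))
    return results

# Five hypothetical workers judge two candidate gene-mutation relations.
votes = {
    "BRCA1:c.68_69delAG": ["yes", "yes", "yes", "no", "yes"],
    "TP53:R175H":         ["no", "no", "yes", "no", "no"],
}
print(aggregate_judgments(votes))
# {'BRCA1:c.68_69delAG': ('yes', 0.8), 'TP53:R175H': ('no', 0.8)}
```

    In practice the vote fraction can also serve as a confidence score for routing low-agreement relations to expert review.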

  2. Automated Neuroanatomical Relation Extraction: A Linguistically Motivated Approach with a PVT Connectivity Graph Case Study

    PubMed Central

    Gökdeniz, Erinç; Özgür, Arzucan; Canbeyli, Reşit

    2016-01-01

    Identifying the relations among different regions of the brain is vital for a better understanding of how the brain functions. While a large number of studies have investigated the neuroanatomical and neurochemical connections among brain structures, their specific findings are scattered across publications spanning many years and many publication types. Text mining techniques have provided the means to extract specific types of information from a large number of publications with the aim of presenting a larger, if not necessarily exhaustive, picture. By using natural language processing techniques, the present paper aims to identify connectivity relations among brain regions in general and relations relevant to the paraventricular nucleus of the thalamus (PVT) in particular. We introduce a linguistically motivated approach based on patterns defined over the constituency and dependency parse trees of sentences. Besides the presence of a relation between a pair of brain regions, the proposed method also identifies the directionality of the relation, which enables the creation and analysis of a directional brain region connectivity graph. The approach is evaluated over the manually annotated data sets of the WhiteText Project. In addition, as a case study, the method is applied to extract and analyze the connectivity graph of the PVT, an important brain region considered to influence many functions ranging from arousal, motivation, and drug-seeking behavior to attention. The results of the PVT connectivity graph suggest that the PVT may be a new target of research in mood assessment. PMID:27708573
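
    Direction-aware, pattern-based relation extraction can be sketched minimally as follows. The actual method operates on constituency and dependency parse trees; the surface-string trigger patterns and region names below are hypothetical stand-ins.

```python
import re

# Hypothetical trigger patterns: each maps a connective phrase to the
# direction of the connectivity relation it expresses.
PATTERNS = [
    (re.compile(r"(\w[\w ]*?) projects to (\w[\w ]*)"), "forward"),
    (re.compile(r"(\w[\w ]*?) receives input from (\w[\w ]*)"), "reverse"),
]

def extract_relations(sentence):
    """Return directed (source, target) connectivity edges found in a sentence."""
    edges = []
    for pattern, direction in PATTERNS:
        for m in pattern.finditer(sentence):
            a, b = m.group(1).strip(), m.group(2).strip()
            # "reverse" triggers express the relation target-first.
            edges.append((a, b) if direction == "forward" else (b, a))
    return edges

print(extract_relations("the paraventricular nucleus projects to the amygdala"))
# [('the paraventricular nucleus', 'the amygdala')]
```

    Collecting such directed edges over a corpus yields the directional connectivity graph analyzed in the paper.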

  3. Automated Extraction of Flow Features

    NASA Technical Reports Server (NTRS)

    Dorney, Suzanne (Technical Monitor); Haimes, Robert

    2004-01-01

    Computational Fluid Dynamics (CFD) simulations are routinely performed as part of the design process of most fluid handling devices. In order to use the results of a CFD simulation efficiently and effectively, visualization tools are often employed. These tools are used in all stages of the CFD simulation, including pre-processing, interim-processing, and post-processing, to interpret the results. Each of these stages requires visualization tools that allow one to examine the geometry of the device as well as the partial or final results of the simulation. An engineer will typically generate a series of contour and vector plots to better understand how the fluid is interacting with the physical device. Of particular interest is the detection of features such as shocks, recirculation zones, and vortices (which highlight areas of stress and loss). As the demand for CFD analyses continues to increase, the need for automated feature extraction capabilities has become vital. In the past, feature extraction and identification were interesting concepts, but not required for understanding the physics of a steady flow field. This is because the results of the more traditional tools, like iso-surfaces, cuts, and streamlines, were more interactive and easily abstracted so they could be presented to the investigator. These tools worked and properly conveyed the collected information, but at the expense of a great deal of interaction. For unsteady flow fields, the investigator does not have the luxury of spending time scanning only one "snapshot" of the simulation. Automated assistance is required to point out areas of potential interest within the flow. This must not impose a heavy compute burden (the visualization should not significantly slow down the solution procedure in co-processing environments), and methods must be developed to abstract the feature of interest and display it in a manner that physically makes sense.

  4. Automated Extraction of Flow Features

    NASA Technical Reports Server (NTRS)

    Dorney, Suzanne (Technical Monitor); Haimes, Robert

    2005-01-01

    Computational Fluid Dynamics (CFD) simulations are routinely performed as part of the design process of most fluid handling devices. In order to use the results of a CFD simulation efficiently and effectively, visualization tools are often employed. These tools are used in all stages of the CFD simulation, including pre-processing, interim-processing, and post-processing, to interpret the results. Each of these stages requires visualization tools that allow one to examine the geometry of the device as well as the partial or final results of the simulation. An engineer will typically generate a series of contour and vector plots to better understand how the fluid is interacting with the physical device. Of particular interest is the detection of features such as shocks, recirculation zones, and vortices (which highlight areas of stress and loss). As the demand for CFD analyses continues to increase, the need for automated feature extraction capabilities has become vital. In the past, feature extraction and identification were interesting concepts, but not required for understanding the physics of a steady flow field. This is because the results of the more traditional tools, like iso-surfaces, cuts, and streamlines, were more interactive and easily abstracted so they could be presented to the investigator. These tools worked and properly conveyed the collected information, but at the expense of a great deal of interaction. For unsteady flow fields, the investigator does not have the luxury of spending time scanning only one "snapshot" of the simulation. Automated assistance is required to point out areas of potential interest within the flow. This must not impose a heavy compute burden (the visualization should not significantly slow down the solution procedure in co-processing environments), and methods must be developed to abstract the feature of interest and display it in a manner that physically makes sense.

  5. Automated DNA extraction from pollen in honey.

    PubMed

    Guertler, Patrick; Eicheldinger, Adelina; Muschler, Paul; Goerlich, Ottmar; Busch, Ulrich

    2014-04-15

    In recent years, honey has become a subject of DNA analysis due to potential risks posed by microorganisms, allergens or genetically modified organisms. However, so far, only a few DNA extraction procedures are available, most of them time-consuming and laborious. Therefore, we developed an automated method for extracting DNA from pollen in honey, based on CTAB-buffer DNA extraction using the Maxwell 16 instrument and the Maxwell 16 FFS Nucleic Acid Extraction System, Custom-Kit. We altered several components and extraction parameters and compared the optimised method with a manual CTAB buffer-based DNA isolation method. The automated DNA extraction was faster and resulted in higher DNA yield and sufficient DNA purity. Real-time PCR results obtained after automated DNA extraction are comparable to results after manual DNA extraction. No PCR inhibition was observed. The applicability of this method was further confirmed by the successful analysis of different routine honey samples.

  6. Automated Extraction of Secondary Flow Features

    NASA Technical Reports Server (NTRS)

    Dorney, Suzanne M.; Haimes, Robert

    2005-01-01

    The use of Computational Fluid Dynamics (CFD) has become standard practice in the design and development of the major components used for air and space propulsion. To aid in the post-processing and analysis phase of CFD, many researchers now use automated feature extraction utilities. These tools can be used to detect the existence of such features as shocks, vortex cores, and separation and re-attachment lines. The existence of secondary flow is another feature of significant importance to CFD engineers. Although the concept of secondary flow is relatively well understood, there is no commonly accepted mathematical definition for it. This paper presents a definition for secondary flow and one approach for automatically detecting and visualizing it.

  7. Automated Fluid Feature Extraction from Transient Simulations

    NASA Technical Reports Server (NTRS)

    Haimes, Robert; Lovely, David

    1999-01-01

    In the past, feature extraction and identification were interesting concepts, but not required to understand the underlying physics of a steady flow field. This is because the results of the more traditional tools, like iso-surfaces, cuts, and streamlines, were more interactive and easily abstracted so they could be presented to the investigator. These tools worked and properly conveyed the collected information, but at the expense of much interaction. For unsteady flow fields, the investigator does not have the luxury of spending time scanning only one "snapshot" of the simulation. Automated assistance is required to point out areas of potential interest within the flow. This must not impose a heavy compute burden (the visualization should not significantly slow down the solution procedure in co-processing environments like pV3), and methods must be developed to abstract the feature and display it in a manner that physically makes sense. The following is a list of the important physical phenomena found in transient (and steady-state) fluid flow: (1) shocks, (2) vortex cores, (3) regions of recirculation, (4) boundary layers, (5) wakes. Three papers and an initial specification for the FX (Fluid eXtraction toolkit) Programmer's Guide are included. The papers, submitted to the AIAA Computational Fluid Dynamics Conference, are entitled: (1) Using Residence Time for the Extraction of Recirculation Regions, (2) Shock Detection from Computational Fluid Dynamics Results, and (3) On the Velocity Gradient Tensor and Fluid Feature Extraction.
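
    As a hedged sketch of feature extraction from the velocity gradient tensor, the widely used Q-criterion flags candidate vortex-core regions where rotation dominates strain. The papers listed above develop their own detectors; this is a generic stand-in, not the FX toolkit's method.

```python
def q_criterion(grad_u):
    """Q-criterion vortex indicator from a 3x3 velocity-gradient tensor.

    Q = 0.5 * (||Omega||^2 - ||S||^2), where S and Omega are the
    symmetric (strain-rate) and antisymmetric (rotation) parts of
    grad_u.  Q > 0 marks cells where rotation dominates strain,
    i.e. candidate vortex-core regions.
    """
    s2 = o2 = 0.0  # squared Frobenius norms of S and Omega
    for i in range(3):
        for j in range(3):
            s = 0.5 * (grad_u[i][j] + grad_u[j][i])
            o = 0.5 * (grad_u[i][j] - grad_u[j][i])
            s2 += s * s
            o2 += o * o
    return 0.5 * (o2 - s2)

# Solid-body rotation about the z axis: pure rotation, so Q > 0.
rotation = [[0.0, -1.0, 0.0],
            [1.0,  0.0, 0.0],
            [0.0,  0.0, 0.0]]
# Simple shear (du/dy = 1): strain and rotation balance, so Q = 0.
shear = [[0.0, 1.0, 0.0],
         [0.0, 0.0, 0.0],
         [0.0, 0.0, 0.0]]
print(q_criterion(rotation), q_criterion(shear))  # 1.0 0.0
```

    Evaluating such a pointwise indicator alongside the solver is cheap, which matters for the co-processing constraint mentioned above.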

  8. Automated Fluid Feature Extraction from Transient Simulations

    NASA Technical Reports Server (NTRS)

    Haimes, Robert

    2000-01-01

    In the past, feature extraction and identification were interesting concepts, but not required in understanding the physics of a steady flow field. This is because the results of the more traditional tools, like iso-surfaces, cuts, and streamlines, were more interactive and easily abstracted so they could be presented to the investigator. These tools worked and properly conveyed the collected information, but at the expense of a great deal of interaction. For unsteady flow fields, the investigator does not have the luxury of spending time scanning only one 'snapshot' of the simulation. Automated assistance is required to point out areas of potential interest within the flow. This must not impose a heavy compute burden (the visualization should not significantly slow down the solution procedure in co-processing environments like pV3), and methods must be developed to abstract the feature and display it in a manner that physically makes sense.

  9. Automated Fluid Feature Extraction from Transient Simulations

    NASA Technical Reports Server (NTRS)

    Haimes, Robert

    1998-01-01

    In the past, feature extraction and identification were interesting concepts, but not required to understand the underlying physics of a steady flow field. This is because the results of the more traditional tools, like iso-surfaces, cuts, and streamlines, were more interactive and easily abstracted so they could be presented to the investigator. These tools worked and properly conveyed the collected information, but at the expense of much interaction. For unsteady flow fields, the investigator does not have the luxury of spending time scanning only one 'snapshot' of the simulation. Automated assistance is required to point out areas of potential interest within the flow. This must not impose a heavy compute burden (the visualization should not significantly slow down the solution procedure in co-processing environments like pV3), and methods must be developed to abstract the feature and display it in a manner that physically makes sense. The following is a list of the important physical phenomena found in transient (and steady-state) fluid flow: shocks; vortex cores; regions of recirculation; boundary layers; wakes.

  10. Automated feature extraction for 3-dimensional point clouds

    NASA Astrophysics Data System (ADS)

    Magruder, Lori A.; Leigh, Holly W.; Soderlund, Alexander; Clymer, Bradley; Baer, Jessica; Neuenschwander, Amy L.

    2016-05-01

    Light detection and ranging (LIDAR) technology offers the capability to rapidly capture high-resolution, 3-dimensional surface data with centimeter-level accuracy for a large variety of applications. Due to the foliage-penetrating properties of LIDAR systems, these geospatial data sets can detect ground surfaces beneath trees, enabling the production of high-fidelity bare earth elevation models. Precise characterization of the ground surface allows for identification of terrain and non-terrain points within the point cloud, and facilitates further discernment between natural and man-made objects based solely on structural aspects and relative neighboring parameterizations. A framework is presented here for automated extraction of natural and man-made features that does not rely on coincident ortho-imagery or point RGB attributes. The TEXAS (Terrain EXtraction And Segmentation) algorithm is used first to generate a bare earth surface from a LIDAR survey, which is then used to classify points as terrain or non-terrain. Further classifications are assigned at the point level by leveraging local spatial information. Similarly classed points are then clustered together into regions to identify individual features. Descriptions of the spatial attributes of each region are generated, resulting in the identification of individual tree locations, forest extents, building footprints, and 3-dimensional building shapes, among others. Results of the fully automated feature extraction algorithm are then compared to ground truth to assess the completeness and accuracy of the methodology.
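
    The terrain/non-terrain step can be illustrated with a simple height-above-ground threshold. The `bare_earth` surface and the 0.2 m threshold below are assumptions standing in for the bare-earth model that the TEXAS algorithm would produce.

```python
def classify_points(points, bare_earth, threshold=0.2):
    """Label each LIDAR return as terrain or non-terrain.

    points: list of (x, y, z) tuples.
    bare_earth: callable returning the bare-earth elevation at (x, y).
    Points within `threshold` metres of the surface count as terrain.
    """
    labels = []
    for x, y, z in points:
        height = z - bare_earth(x, y)
        labels.append("terrain" if height <= threshold else "non-terrain")
    return labels

# Flat site at elevation 100 m: a ground return and a 12 m canopy return.
def flat(x, y):
    return 100.0

print(classify_points([(0, 0, 100.05), (1, 1, 112.0)], flat))
# ['terrain', 'non-terrain']
```

    Subsequent region-growing over similarly classed points would then group non-terrain returns into individual trees or buildings.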

  11. ACME, a GIS tool for Automated Cirque Metric Extraction

    NASA Astrophysics Data System (ADS)

    Spagnolo, Matteo; Pellitero, Ramon; Barr, Iestyn D.; Ely, Jeremy C.; Pellicer, Xavier M.; Rea, Brice R.

    2017-02-01

    Regional scale studies of glacial cirque metrics provide key insights into the (palaeo) environment related to the formation of these erosional landforms. The growing availability of high resolution terrain models means that more glacial cirques can be identified and mapped in the future. However, the extraction of their metrics still largely relies on time-consuming manual techniques or combinations of more or less obsolete GIS tools. In this paper, a newly coded toolbox is provided for the automated, and comparatively quick, extraction of 16 key glacial cirque metrics, including length, width, circularity, planar and 3D area, elevation, slope, aspect, plan closure and hypsometry. The set of tools, named ACME (Automated Cirque Metric Extraction), is coded in Python, runs in one of the most commonly used GIS packages (ArcGIS) and has a user-friendly interface. A polygon layer of mapped cirques is required for all metrics, while a Digital Terrain Model and a point layer of cirque threshold midpoints are needed to run some of the tools. Results from ACME are comparable to those from other techniques and can be obtained rapidly, allowing large cirque datasets to be analysed and potentially important regional trends highlighted.
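
    One of the listed metrics, circularity, is commonly computed as 4πA/P² from the cirque outline polygon (1.0 for a perfect circle, smaller for elongated forms). ACME's exact definition may differ; this is a generic sketch using the shoelace formula.

```python
import math

def circularity(vertices):
    """Planar circularity of a closed polygon: 4 * pi * A / P**2.

    vertices: list of (x, y) points in order (polygon left unclosed).
    """
    n = len(vertices)
    area = 0.0   # signed area accumulator (shoelace formula)
    perim = 0.0  # perimeter accumulator
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        area += x1 * y2 - x2 * y1
        perim += math.hypot(x2 - x1, y2 - y1)
    area = abs(area) / 2.0
    return 4.0 * math.pi * area / perim ** 2

# A square scores pi/4; a 10:1 rectangle scores much lower.
print(circularity([(0, 0), (1, 0), (1, 1), (0, 1)]))  # ≈ 0.785
```

    The same polygon loop also yields length, width and planar area, which is why a mapped cirque layer suffices for most ACME metrics.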

  12. Automated feature extraction and classification from image sources

    USGS Publications Warehouse

    1995-01-01

    The U.S. Department of the Interior, U.S. Geological Survey (USGS), and Unisys Corporation have completed a cooperative research and development agreement (CRADA) to explore automated feature extraction and classification from image sources. The CRADA helped the USGS define the spectral and spatial resolution characteristics of airborne and satellite imaging sensors necessary to meet base cartographic and land use and land cover feature classification requirements and help develop future automated geographic and cartographic data production capabilities. The USGS is seeking a new commercial partner to continue automated feature extraction and classification research and development.

  13. Automated blood vessel extraction using local features on retinal images

    NASA Astrophysics Data System (ADS)

    Hatanaka, Yuji; Samo, Kazuki; Tajima, Mikiya; Ogohara, Kazunori; Muramatsu, Chisako; Okumura, Susumu; Fujita, Hiroshi

    2016-03-01

    An automated blood vessel extraction method using high-order local autocorrelation (HLAC) features on retinal images is presented. Although many blood vessel extraction methods based on contrast have been proposed, a technique based on the relations among neighboring pixels has not been published. HLAC features are shift-invariant; therefore, we applied HLAC features to retinal images. However, HLAC features are sensitive to image rotation, so the method was improved by also computing HLAC features on a polar-transformed image. The blood vessels were classified using an artificial neural network (ANN) with HLAC features based on 105 mask patterns as input. To improve performance, a second ANN (ANN2) was constructed using the green component of the color retinal image and the four output values of the first ANN, a Gabor filter, a double-ring filter, and a black-top-hat transformation. The retinal images used in this study were obtained from the "Digital Retinal Images for Vessel Extraction" (DRIVE) database. The ANN using HLAC features produced distinctly high (white) output values in the blood vessel regions and could also extract blood vessels with low contrast. The outputs were evaluated using the area under the curve (AUC) from receiver operating characteristic (ROC) analysis; the AUC of ANN2 was 0.960. The result can be used for the quantitative analysis of the blood vessels.
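
    A single HLAC feature is the image-wide sum of products of pixel values at a fixed set of local offsets. The toy 3x3 image and two masks below are invented for illustration; they are not the 105 mask patterns used in the study.

```python
def hlac_feature(image, offsets):
    """One high-order local autocorrelation (HLAC) feature.

    Sums, over every pixel, the product of intensities at the given
    mask offsets ((dy, dx) relative to the centre pixel).  Offsets
    falling outside the image contribute nothing.
    """
    h, w = len(image), len(image[0])
    total = 0.0
    for y in range(h):
        for x in range(w):
            prod = 1.0
            for dy, dx in offsets:
                yy, xx = y + dy, x + dx
                if not (0 <= yy < h and 0 <= xx < w):
                    prod = 0.0
                    break
                prod *= image[yy][xx]
            total += prod
    return total

# A vertical bright line responds to a vertical mask, not a horizontal one.
image = [[0, 1, 0],
         [0, 1, 0],
         [0, 1, 0]]
vertical = [(-1, 0), (0, 0), (1, 0)]
horizontal = [(0, -1), (0, 0), (0, 1)]
print(hlac_feature(image, vertical), hlac_feature(image, horizontal))  # 1.0 0.0
```

    Because the sum runs over all pixel positions, the feature value is unchanged when the line shifts sideways, which is the shift-invariance the abstract relies on.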

  14. Automated DNA extraction for large numbers of plant samples.

    PubMed

    Mehle, Nataša; Nikolić, Petra; Rupar, Matevž; Boben, Jana; Ravnikar, Maja; Dermastia, Marina

    2013-01-01

    The method described here is a rapid, total DNA extraction procedure applicable to a large number of plant samples requiring pathogen detection. The procedure combines a simple and quick homogenization step of crude extracts with DNA extraction based upon the binding of DNA to magnetic beads. DNA is purified in an automated process in which the magnetic beads are transferred through a series of washing buffers. The eluted DNA is suitable for efficient amplification in PCR reactions.

  15. Docking automation related technology, Phase 2 report

    SciTech Connect

    Jatko, W.B.; Goddard, J.S.; Gleason, S.S.; Ferrell, R.K.

    1995-04-01

    This report summarizes the progress of Phase II of the Docking Automated Related Technologies task component within the Modular Artillery Ammunition Delivery System (MAADS) technology demonstrator of the Future Armored Resupply Vehicle (FARV) project. This report also covers development activity at Oak Ridge National Laboratory (ORNL) during the period from January to July 1994.

  16. Automated sea floor extraction from underwater video

    NASA Astrophysics Data System (ADS)

    Kelly, Lauren; Rahmes, Mark; Stiver, James; McCluskey, Mike

    2016-05-01

    Ocean floor mapping using video is a simple and cost-effective method to record large areas of the seafloor. Obtaining visual and elevation models has noteworthy applications in search and recovery missions. Hazards to navigation are abundant and pose a significant threat to the safety, effectiveness, and speed of naval operations and commercial vessels. This project's objective was to develop a workflow to automatically extract metadata from marine video and create optical and elevation surface mosaics. Three developments made this possible. First, optical character recognition (OCR) by means of two-dimensional correlation, using a known character set, allowed for the capture of metadata from image files. Second, exploiting the image metadata (i.e., latitude, longitude, heading, camera angle, and depth readings) allowed for the determination of the location and orientation of each image frame in the mosaic; image registration improved the accuracy of mosaicking. Finally, overlapping data allowed us to determine height information: a disparity map was created using the parallax from overlapping viewpoints of a given area, and the relative height data were used to create a three-dimensional, textured elevation map.
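
    OCR against a known character set can be sketched as template matching: score each glyph against every template and keep the best match. The agreement score below is a simplified stand-in for true two-dimensional correlation, and the 3x3 binary templates are invented for illustration.

```python
def correlate_score(window, template):
    """Fraction of pixels where a binary window agrees with a glyph template."""
    hits = sum(1 for wrow, trow in zip(window, template)
                 for w, t in zip(wrow, trow) if w == t)
    return hits / (len(template) * len(template[0]))

def recognize(glyph, charset):
    """Return the character whose template best matches a binary glyph."""
    return max(charset, key=lambda c: correlate_score(glyph, charset[c]))

# Toy templates for a known two-character set.
CHARSET = {
    "1": [[0, 1, 0],
          [0, 1, 0],
          [0, 1, 0]],
    "7": [[1, 1, 1],
          [0, 0, 1],
          [0, 1, 0]],
}
noisy_one = [[0, 1, 0],
             [1, 1, 0],
             [0, 1, 0]]
print(recognize(noisy_one, CHARSET))  # 1
```

    Knowing the character set in advance, as the overlay font here was, is what makes this brute-force matching reliable despite video noise.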

  17. Automated extraction of radiation dose information for CT examinations.

    PubMed

    Cook, Tessa S; Zimmerman, Stefan; Maidment, Andrew D A; Kim, Woojin; Boonn, William W

    2010-11-01

    Exposure to radiation as a result of medical imaging is currently in the spotlight, receiving attention from Congress as well as the lay press. Although scanner manufacturers are moving toward including effective dose information in the Digital Imaging and Communications in Medicine headers of imaging studies, there is a vast repository of retrospective CT data at every imaging center that stores dose information in an image-based dose sheet. As such, it is difficult for imaging centers to participate in the ACR's Dose Index Registry. The authors have designed an automated extraction system to query their PACS archive and parse CT examinations to extract the dose information stored in each dose sheet. First, an open-source optical character recognition program processes each dose sheet and converts the information to American Standard Code for Information Interchange (ASCII) text. Each text file is parsed, and radiation dose information is extracted and stored in a database which can be queried using an existing pathology and radiology enterprise search tool. Using this automated extraction pipeline, it is possible to perform dose analysis on the >800,000 CT examinations in the PACS archive and generate dose reports for all of these patients. It is also possible to more effectively educate technologists, radiologists, and referring physicians about exposure to radiation from CT by generating report cards for interpreted and performed studies. The automated extraction pipeline enables compliance with the ACR's reporting guidelines and greater awareness of radiation dose to patients, thus resulting in improved patient care and management.
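
    Once OCR has converted a dose sheet to ASCII text, the parsing step reduces to pattern matching on the recognized lines. The layout below is hypothetical; real dose sheets vary by vendor and scanner model.

```python
import re

# Hypothetical OCR output from a CT dose sheet.
ocr_text = """
Series  CTDIvol (mGy)  DLP (mGy-cm)
1       12.40          310.00
2        8.15          195.60
"""

ROW = re.compile(r"^\s*(\d+)\s+([\d.]+)\s+([\d.]+)\s*$", re.MULTILINE)

def parse_dose_sheet(text):
    """Extract (series, CTDIvol, DLP) records from OCR'd dose-sheet text."""
    return [(int(series), float(ctdi), float(dlp))
            for series, ctdi, dlp in ROW.findall(text)]

records = parse_dose_sheet(ocr_text)
print(records)  # [(1, 12.4, 310.0), (2, 8.15, 195.6)]
```

    Each parsed record would then be written to the dose database keyed by accession number, ready for registry reporting and per-protocol analysis.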

  18. Automated vasculature extraction from placenta images

    NASA Astrophysics Data System (ADS)

    Almoussa, Nizar; Dutra, Brittany; Lampe, Bryce; Getreuer, Pascal; Wittman, Todd; Salafia, Carolyn; Vese, Luminita

    2011-03-01

    Recent research in perinatal pathology argues that analyzing properties of the placenta may reveal important information on how certain diseases progress. One important property is the structure of the placental blood vessels, which supply a fetus with all of its oxygen and nutrition. An essential step in the analysis of the vascular network pattern is the extraction of the blood vessels, which has only been done manually through a costly and time-consuming process. There is no existing method to automatically detect placental blood vessels; in addition, the large variation in the shape, color, and texture of the placenta makes it difficult to apply standard edge-detection algorithms. We describe a method to automatically detect and extract blood vessels from a given image by using image processing techniques and neural networks. We evaluate several local features for every pixel, in addition to a novel modification to an existing road detector. Pixels belonging to blood vessel regions have recognizable responses; hence, we use an artificial neural network to identify the pattern of blood vessels. A set of images where blood vessels are manually highlighted is used to train the network. We then apply the neural network to recognize blood vessels in new images. The network is effective in capturing the most prominent vascular structures of the placenta.

  19. Automated Image Registration Using Morphological Region of Interest Feature Extraction

    NASA Technical Reports Server (NTRS)

    Plaza, Antonio; LeMoigne, Jacqueline; Netanyahu, Nathan S.

    2005-01-01

    With the recent explosion in the amount of remotely sensed imagery and the corresponding interest in temporal change detection and modeling, image registration has become increasingly important as a necessary first step in the integration of multi-temporal and multi-sensor data for applications such as the analysis of seasonal and annual global climate changes, as well as land use/cover changes. The task of image registration can be divided into two major components: (1) the extraction of control points or features from images; and (2) the search among the extracted features for the matching pairs that represent the same feature in the images to be matched. Manual control feature extraction can be subjective and extremely time-consuming, and often results in few usable points. Automated feature extraction is a solution to this problem, where desired target features are invariant and represent evenly distributed landmarks such as edges, corners and line intersections. In this paper, we develop a novel automated registration approach based on the following steps. First, a mathematical morphology (MM)-based method is used to obtain a scale-orientation morphological profile at each image pixel. Next, a spectral dissimilarity metric such as the spectral information divergence is applied for automated extraction of landmark chips, followed by an initial approximate matching. This initial condition is then refined using a hierarchical robust feature matching (RFM) procedure. Experimental results reveal that the proposed registration technique offers a robust solution in the presence of seasonal changes and other interfering factors. Keywords: automated image registration, multi-temporal imagery, mathematical morphology, robust feature matching.

  20. Automated Road Extraction from High Resolution Multispectral Imagery

    SciTech Connect

    Doucette, Peter J.; Agouris, Peggy; Stefanidis, Anthony

    2004-12-01

    Road networks represent a vital component of geospatial data sets in high demand, and thus contribute significantly to extraction labor costs. Multispectral imagery has only recently become widely available at high spatial resolutions, and modeling spectral content has received limited consideration in road extraction algorithms. This paper presents a methodology that exploits spectral content for fully automated road centerline extraction. Preliminary detection of road centerline pixel candidates is performed with Anti-parallel-edge Centerline Extraction (ACE). This is followed by constructing a road vector topology with a fuzzy grouping model that links nodes from a self-organized mapping of the ACE pixels. Following topology construction, a self-supervised road classification (SSRC) feedback loop is implemented to automate the process of training sample selection and refinement for a road class, as well as deriving practical spectral definitions for non-road classes. SSRC demonstrates a potential to provide dramatic improvement in road extraction results by exploiting spectral content. Road centerline extraction results are presented for three 1 m color-infrared suburban scenes, which show significant improvement following SSRC.

  21. Applications of the Automated SMAC Modal Parameter Extraction Package

    SciTech Connect

    Mayes, Randall L.; Dorrell, Larry R.; Klenke, Scott E.

    1999-10-29

    An algorithm known as SMAC (Synthesize Modes And Correlate), based on principles of modal filtering, has been in development for a few years. The new capabilities of the automated version are demonstrated on test data from a complex shell/payload system. Examples of extractions from impact and shaker data are shown. The automated algorithm extracts 30 to 50 modes in the bandwidth from each column of the frequency response function matrix. Examples of the synthesized Mode Indicator Functions (MIFs) compared with the actual MIFs show the accuracy of the technique. A data set for one input and 170 accelerometer outputs can typically be reduced in an hour. Application to a test with some complex modes is also demonstrated.

  22. Improved Automated Seismic Event Extraction Using Machine Learning

    NASA Astrophysics Data System (ADS)

    Mackey, L.; Kleiner, A.; Jordan, M. I.

    2009-12-01

    Like many organizations engaged in seismic monitoring, the Preparatory Commission for the Comprehensive Test Ban Treaty Organization collects and processes seismic data from a large network of sensors. This data is continuously transmitted to a central data center, and bulletins of seismic events are automatically extracted. However, as for many such automated systems at present, the inaccuracy of this extraction necessitates substantial human analyst review effort. A significant opportunity for improvement thus lies in the fact that these systems currently fail to fully utilize the valuable repository of historical data provided by prior analyst reviews. In this work, we present the results of the application of machine learning approaches to several fundamental sub-tasks in seismic event extraction. These methods share as a common theme the use of historical analyst-reviewed bulletins as ground truth from which they extract relevant patterns to accomplish the desired goals. For instance, we demonstrate the effectiveness of classification and ranking methods for the identification of false events (those that will be invalidated and discarded by analysts) in automated bulletins. We also show gains in the accuracy of seismic phase identification via the use of classification techniques to automatically assign seismic phase labels to station detections. Furthermore, we examine the potential of historical association data to inform the direct association of new signal detections with their corresponding seismic events. Empirical results are based upon parametric historical seismic detection and event data received from the Preparatory Commission for the Comprehensive Test Ban Treaty Organization.
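
    Using historical analyst-reviewed bulletins as ground truth can be sketched with a naive-Bayes-style false-event filter. The discrete features, labels, and add-one smoothing below are assumptions for illustration, not the authors' actual models.

```python
from collections import Counter, defaultdict

def train_false_event_filter(history):
    """Learn a posterior-odds score from analyst-reviewed bulletins.

    history: list of (features, kept) pairs, where features is a dict
    of discrete attributes (e.g. a station-count bucket) and kept is
    True if analysts validated the event.  Returns a scoring function:
    scores > 1 look like historically kept events, < 1 like false ones.
    """
    class_counts = Counter(kept for _, kept in history)
    feat_counts = defaultdict(Counter)
    for features, kept in history:
        for name, value in features.items():
            feat_counts[(name, value)][kept] += 1

    def score(features):
        odds = class_counts[True] / class_counts[False]  # prior odds
        for name, value in features.items():
            counts = feat_counts[(name, value)]
            odds *= (counts[True] + 1) / (counts[False] + 1)  # smoothed likelihood ratio
        return odds

    return score

# Tiny hypothetical review history: low-station events were discarded.
history = [
    ({"stations": "few"}, False),
    ({"stations": "few"}, False),
    ({"stations": "many"}, True),
    ({"stations": "many"}, True),
]
score = train_false_event_filter(history)
print(score({"stations": "many"}) > score({"stations": "few"}))  # True
```

    Ranking automated-bulletin events by such a score lets analysts review the most suspect events first, which is the workload reduction the abstract targets.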

  3. Automated RNA Extraction and Purification for Multiplexed Pathogen Detection

    SciTech Connect

    Bruzek, Amy K.; Bruckner-Lea, Cindy J.

    2005-01-01

    Pathogen detection has become an extremely important part of our nation's defense in this post-9/11 world, where the threat of bioterrorist attacks is a grim reality. When a biological attack takes place, response time is critical. The faster the biothreat is assessed, the faster countermeasures can be put in place to protect the health of the general public. Today some of the most widely used methods for detecting pathogens are either time consuming or not reliable [1]. Therefore, a method for detecting multiple pathogens that is inherently reliable, rapid, automated, and field portable is needed. To that end, we are developing automated fluidics systems for the recovery, cleanup, and direct labeling of community RNA from suspect environmental samples. The advantage of using RNA for detection is that there are multiple copies of mRNA in a cell, whereas there are normally only one or two copies of DNA [2]. Because there are multiple copies of mRNA in a cell for highly expressed genes, no amplification of the genetic material may be necessary, and thus rapid and direct detection of only a few cells may be possible [3]. This report outlines the development of both manual and automated methods for the extraction and purification of mRNA. The methods were evaluated using cell lysates from Escherichia coli 25922 (nonpathogenic), Salmonella typhimurium (pathogenic), and Shigella spp. (pathogenic). Automated RNA purification was achieved using a custom sequential injection fluidics system consisting of a syringe pump, a multi-port valve, and a magnetic capture cell. mRNA was captured using silica-coated superparamagnetic beads that were trapped in the tubing by a rare earth magnet. RNA was detected by gel electrophoresis and/or by hybridization of the RNA to microarrays. The versatility of the fluidics systems and the ability to automate these systems allow for quick and easy processing of samples and eliminate the need for an experienced operator.

  4. Arduino-based automation of a DNA extraction system.

    PubMed

    Kim, Kyung-Won; Lee, Mi-So; Ryu, Mun-Ho; Kim, Jong-Won

    2015-01-01

    There have been many studies on detecting infectious diseases with molecular genetic methods. This study presents an automation process for a DNA extraction system based on microfluidics and magnetic beads, which is part of a portable molecular genetic test system. This DNA extraction system consists of a cartridge with chambers, syringes, four linear stepper actuators, and a rotary stepper actuator. The actuators provide a sequence of steps in the DNA extraction process, such as transporting, mixing, and washing for the gene specimen, magnetic beads, and reagent solutions. The proposed automation system consists of a PC-based host application and an Arduino-based controller. The host application compiles a G code sequence file and interfaces with the controller to execute the compiled sequence. The controller executes stepper motor axis motion, time delay, and input-output manipulation. It drives the stepper motor with an open library, which provides a smooth linear acceleration profile. The controller also provides a homing sequence to establish the motor's reference position, and hard limit checking to prevent any over-travelling. The proposed system was implemented and its functionality was investigated, especially regarding positioning accuracy and velocity profile.
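As a rough illustration of the host/controller split described above, the controller's job of executing axis moves, dwells, and I/O from a compiled G code sequence might look like the following minimal interpreter. The specific commands (standard G0/G4/M-codes) and the axis-to-actuator and M-code-to-valve mappings are illustrative assumptions, not the paper's actual command set:

```python
def run_sequence(lines):
    """Minimal host-side model of a G code sequence executor:
    G0 = move axes, G4 = time delay, M-codes = I/O (e.g. a valve)."""
    pos = {"X": 0.0, "Y": 0.0, "Z": 0.0, "A": 0.0}  # stepper axis positions
    log = []
    for line in lines:
        parts = line.split()
        code = parts[0]
        if code == "G0":                  # linear move of one or more axes
            for word in parts[1:]:
                axis, val = word[0], float(word[1:])
                pos[axis] = val
            log.append(("move", dict(pos)))
        elif code == "G4":                # dwell, e.g. a timed mixing step
            log.append(("dwell", float(parts[1][1:])))
        elif code.startswith("M"):        # I/O manipulation (hypothetical valve)
            log.append(("io", code))
    return pos, log

# Hypothetical extraction step: advance a syringe, open valve, mix, retract, close
sequence = ["G0 X10", "M3", "G4 P2.5", "G0 X0 Y5", "M5"]
pos, log = run_sequence(sequence)
```

A real controller would additionally enforce the homing and hard-limit checks the abstract mentions before accepting any G0 move.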

  5. A simple automated instrument for DNA extraction in forensic casework.

    PubMed

    Montpetit, Shawn A; Fitch, Ian T; O'Donnell, Patrick T

    2005-05-01

    The Qiagen BioRobot EZ1 is a small, rapid, and reliable automated DNA extraction instrument capable of extracting DNA from up to six samples in as few as 20 min using magnetic bead technology. The San Diego Police Department Crime Laboratory has validated the BioRobot EZ1 for the DNA extraction of evidence and reference samples in forensic casework. The BioRobot EZ1 was evaluated for use on a variety of different evidence sample types including blood, saliva, and semen evidence. The performance of the BioRobot EZ1 with regard to DNA recovery and potential cross-contamination was also assessed. DNA yields obtained with the BioRobot EZ1 were comparable to those from organic extraction. The BioRobot EZ1 was effective at removing PCR inhibitors, which often co-purify with DNA in organic extractions. The incorporation of the BioRobot EZ1 into forensic casework has streamlined the DNA analysis process by reducing the need for labor-intensive phenol-chloroform extractions.

  6. Automated tools for phenotype extraction from medical records.

    PubMed

    Yetisgen-Yildiz, Meliha; Bejan, Cosmin A; Vanderwende, Lucy; Xia, Fei; Evans, Heather L; Wurfel, Mark M

    2013-01-01

    Clinical research studying critical illness phenotypes relies on the identification of clinical syndromes defined by consensus definitions. Historically, identifying phenotypes has required manual chart review, a time- and resource-intensive process. The overall research goal of the Critical Illness PHenotype ExtRaction (deCIPHER) project is to develop automated approaches based on natural language processing and machine learning that accurately identify phenotypes from EMR. We chose pneumonia as our first critical illness phenotype and conducted preliminary experiments to explore the problem space. In this abstract, we outline the tools we built for processing clinical records, present our preliminary findings for pneumonia extraction, and describe future steps.

  7. Automated labeling of bibliographic data extracted from biomedical online journals

    NASA Astrophysics Data System (ADS)

    Kim, Jongwoo; Le, Daniel X.; Thoma, George R.

    2003-01-01

    A prototype system has been designed to automate the extraction of bibliographic data (e.g., article title, authors, abstract, affiliation and others) from online biomedical journals to populate the National Library of Medicine's MEDLINE database. This paper describes a key module in this system: the labeling module that employs statistics and fuzzy rule-based algorithms to identify segmented zones in an article's HTML pages as specific bibliographic data. Results from experiments conducted with 1,149 medical articles from forty-seven journal issues are presented.

  8. An automated approach for extracting Barrier Island morphology from digital elevation models

    NASA Astrophysics Data System (ADS)

    Wernette, Phillipe; Houser, Chris; Bishop, Michael P.

    2016-06-01

    The response and recovery of a barrier island to extreme storms depend on the elevation of the dune base and crest, both of which can vary considerably alongshore and through time. Quantifying the response to and recovery from storms requires that we first identify and differentiate the dune(s) from the beach and back-barrier, which in turn depends on accurate identification and delineation of the dune toe, crest and heel. The purpose of this paper is to introduce a multi-scale automated approach for extracting beach, dune (dune toe, dune crest and dune heel), and barrier island morphology. The automated approach introduced here extracts the shoreline and back-barrier shoreline based on elevation thresholds, and extracts the dune toe, dune crest and dune heel based on the average relative relief (RR) across multiple spatial scales of analysis. The multi-scale automated RR approach is more objective than traditional approaches because every pixel is analyzed across multiple computational scales and features are identified from the calculated RR values. The RR approach outperformed contemporary approaches and represents a fast, objective means to define important beach and dune features for predicting barrier island response to storms. The RR method also does not require that the dune toe, crest, or heel be spatially continuous, which is important because dune morphology is likely naturally variable alongshore.
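The multi-scale relative relief idea can be sketched in one dimension: each cell's elevation is placed within the local min-max range over a moving window, and the result is averaged over several window sizes. The window sizes and the toy cross-shore profile below are invented for illustration, not the paper's parameters:

```python
def relative_relief(z, window):
    """Relative relief of each cell: where its elevation sits within the
    local min..max range over a moving window (0 = local low, 1 = local high)."""
    n, half = len(z), window // 2
    rr = []
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        zmin, zmax = min(z[lo:hi]), max(z[lo:hi])
        rr.append((z[i] - zmin) / (zmax - zmin) if zmax > zmin else 0.0)
    return rr

def mean_rr(z, windows=(3, 5, 9)):
    """Average RR across several window sizes (the multi-scale step)."""
    per_scale = [relative_relief(z, w) for w in windows]
    return [sum(col) / len(windows) for col in zip(*per_scale)]

# Toy cross-shore profile: beach, dune toe, crest, heel, back-barrier
profile = [0.5, 0.7, 1.0, 2.5, 4.0, 3.5, 1.8, 1.2, 1.0]
rr = mean_rr(profile)
crest = rr.index(max(rr))  # consistently high RR at every scale -> dune crest
```

In the 2-D method each pixel of the DEM gets such a multi-scale RR value, and toe/crest/heel are then picked from RR thresholds rather than from hand digitization.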

  9. Automated Feature Extraction of Foredune Morphology from Terrestrial Lidar Data

    NASA Astrophysics Data System (ADS)

    Spore, N.; Brodie, K. L.; Swann, C.

    2014-12-01

    Foredune morphology is often described in storm impact prediction models using the elevation of the dune crest and dune toe, which are compared with maximum runup elevations to categorize the storm impact and predicted responses. However, these parameters do not account for other foredune features that may make the dune more or less erodible, such as alongshore variations in morphology, vegetation coverage, or compaction. The goal of this work is to identify other descriptive features that can be extracted from terrestrial lidar data that may affect the rate of dune erosion under wave attack. Daily mobile terrestrial lidar surveys were conducted during a 6-day nor'easter (Hs = 4 m in 6 m water depth) along 20 km of coastline near Duck, North Carolina, which encompassed a variety of foredune forms in close proximity to each other. This abstract will focus on the tools developed for the automated extraction of morphological features from terrestrial lidar data, while the response of the dune will be presented by Brodie and Spore in an accompanying abstract. Raw point cloud data can be dense and is often under-utilized due to the time and personnel constraints required for analysis, since many algorithms are not fully automated. In our approach, the point cloud is first projected into a local coordinate system aligned with the coastline, and then bare earth points are interpolated onto a rectilinear 0.5 m grid, creating a high resolution digital elevation model. The surface is analyzed by identifying features along each cross-shore transect. Surface curvature is used to identify the position of the dune toe; beach and berm morphology is then extracted shoreward of the dune toe, and foredune morphology landward of it. Changes in, and magnitudes of, cross-shore slope, curvature, and surface roughness are used to describe the foredune face, and each cross-shore transect is then classified using its pre-storm morphology for storm-response analysis.
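The curvature-based dune toe pick can be illustrated on a single toy transect. The 0.5 m spacing matches the grid described above; the profile values and the "maximum concave-up curvature marks the toe" rule are simplifying assumptions for illustration:

```python
def curvature(z, dx=0.5):
    """Discrete second derivative of elevation along a cross-shore
    transect sampled on a 0.5 m grid (positive = concave up)."""
    return [(z[i - 1] - 2 * z[i] + z[i + 1]) / dx**2 for i in range(1, len(z) - 1)]

# Toy transect: flat beach, slope break at the dune toe, then the dune face
transect = [1.0, 1.0, 1.0, 1.0, 1.5, 2.5, 3.5, 4.5]
c = curvature(transect)
toe = c.index(max(c)) + 1  # first strong concave-up break -> dune toe index
```

On real lidar transects the curvature signal is noisier, so smoothing and the slope/roughness descriptors mentioned in the abstract are needed alongside the raw second derivative.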

  10. Automated extraction of knowledge for model-based diagnostics

    NASA Technical Reports Server (NTRS)

    Gonzalez, Avelino J.; Myler, Harley R.; Towhidnejad, Massood; Mckenzie, Frederic D.; Kladke, Robin R.

    1990-01-01

    The concept of accessing computer aided design (CAD) design databases and extracting a process model automatically is investigated as a possible source for the generation of knowledge bases for model-based reasoning systems. The resulting system, referred to as automated knowledge generation (AKG), uses an object-oriented programming structure and constraint techniques as well as internal database of component descriptions to generate a frame-based structure that describes the model. The procedure has been designed to be general enough to be easily coupled to CAD systems that feature a database capable of providing label and connectivity data from the drawn system. The AKG system is capable of defining knowledge bases in formats required by various model-based reasoning tools.

  11. Extraction of polychlorinated biphenyls from soils by automated focused microwave-assisted Soxhlet extraction.

    PubMed

    Luque-García, J L; Luque de Castro, M D

    2003-05-23

    The application of a new focused microwave-assisted Soxhlet extractor for the extraction of polychlorinated biphenyls from differently aged soils is presented here. The new extractor overcomes the disadvantages of previous devices based on the same principle and enables a fully automated extraction of two samples simultaneously. The variables affecting the extraction step (namely, power of irradiation, irradiation time, extractant volume, extractant composition and number of extraction cycles) have been optimized using experimental design methodology. The optimized method has also been applied to a certified reference material (CRM910-050, "real" contaminated soil) for quality assurance validation. Quantification of the target compounds has been performed by GC with ion-trap MS. The mass spectrometer was operated in the electron-ionization mode, with selected-ion monitoring at m/z 152, 186, 292, 326 and 498. The results obtained demonstrate that this approach is as efficient as conventional Soxhlet extraction but with a drastic reduction in both extraction time (70 min vs. 24 h for the "real" contaminated soil) and organic solvent disposal, as 75-80% of the extractant is recycled.

  12. Automated Dsm Extraction from Uav Images and Performance Analysis

    NASA Astrophysics Data System (ADS)

    Rhee, S.; Kim, T.

    2015-08-01

    As technology evolves, unmanned aerial vehicle (UAV) imagery is being used for applications ranging from simple image acquisition to complex tasks such as 3D spatial information extraction. Spatial information is usually provided in the form of a DSM or point cloud, so it is important to generate very dense tie points automatically from stereo images. In this paper, we apply a stereo image-based matching technique developed for satellite/aerial images to UAV images, propose processing steps for automated DSM generation, and analyse the feasibility of DSM generation. For DSM generation from UAV images, exterior orientation parameters (EOPs) for each dataset were first adjusted. Secondly, optimum matching pairs were determined. Thirdly, stereo image matching was performed for each pair. The matching algorithm is based on grey-level correlation of pixels along epipolar lines. Finally, the individual match results were merged and the final DSM was produced. The generated DSM was compared with a reference DSM from lidar; overall accuracy was 1.5 m in NMAD. However, several problems remain to be solved, including obtaining precise EOPs and handling occlusion and image blurring. A more effective interpolation technique also needs to be developed.
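The grey-level correlation along epipolar lines can be sketched for a single pixel on one rectified image row. The normalized cross-correlation measure and the window size here are generic choices, not the authors' exact implementation:

```python
def ncc(a, b):
    """Normalized cross-correlation of two equal-length grey-level patches."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return num / (da * db) if da and db else 0.0

def match_along_epipolar(left_row, right_row, x, half=2):
    """Slide a small window along the (already rectified) epipolar line
    in the right image and keep the position maximizing NCC."""
    tpl = left_row[x - half:x + half + 1]
    best_x, best_score = None, -2.0
    for cx in range(half, len(right_row) - half):
        score = ncc(tpl, right_row[cx - half:cx + half + 1])
        if score > best_score:
            best_x, best_score = cx, score
    return best_x, x - best_x  # matched column and disparity

left  = [10, 12, 50, 80, 50, 12, 10, 10, 10, 10]
right = [10, 10, 10, 12, 50, 80, 50, 12, 10, 10]
cx, disp = match_along_epipolar(left, right, 3)
```

Dense tie points come from repeating this search for every pixel of every matching pair; disparities are then converted to heights using the adjusted EOPs.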

  13. Multichannel Convolutional Neural Network for Biological Relation Extraction

    PubMed Central

    Quan, Chanqin; Sun, Xiao; Bai, Wenjun

    2016-01-01

    The plethora of biomedical relations embedded in medical logs (records) demands researchers' attention. Previous theoretical and practical work focused on traditional machine learning techniques. However, these methods are susceptible to the issues of “vocabulary gap” and data sparseness, and their feature extraction is difficult to automate. To address the aforementioned issues, in this work we propose a multichannel convolutional neural network (MCCNN) for automated biomedical relation extraction. The proposed model makes the following two contributions: (1) it enables the fusion of multiple (e.g., five) versions of word embeddings; (2) the need for manual feature engineering is obviated by automated feature learning with a convolutional neural network (CNN). We evaluated our model on two biomedical relation extraction tasks: drug-drug interaction (DDI) extraction and protein-protein interaction (PPI) extraction. For the DDI task, our system achieved an overall f-score of 70.2%, compared to 67.0% for a standard linear SVM-based system, on the DDIExtraction 2013 challenge dataset. For the PPI task, we evaluated our system on the AIMed and BioInfer PPI corpora; our system exceeded the state-of-the-art ensemble SVM system by 2.7% and 5.6% in f-score. PMID:28053977
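A stripped-down sketch of the multichannel convolution plus max-over-time pooling step: each channel is one version of the word embeddings for the same sentence, and filter responses are summed across channels before pooling. The shapes are toy values and the embeddings and filters are random here, whereas in the real MCCNN they are learned:

```python
import numpy as np

def multichannel_conv(channels, filters, width):
    """1-D convolution over a sentence with one embedding matrix per
    channel, summed across channels, followed by max-over-time pooling.
    channels: list of (n_words, dim) arrays; filters: (n_filters, width*dim)."""
    n_words, _ = channels[0].shape
    feats = []
    for f in range(filters.shape[0]):
        scores = []
        for i in range(n_words - width + 1):
            s = 0.0
            for ch in channels:                    # sum response over channels
                s += float(ch[i:i + width].ravel() @ filters[f])
            scores.append(s)
        feats.append(max(scores))                  # max-over-time pooling
    return feats

rng = np.random.default_rng(1)
dim, width, n_filters = 4, 3, 5
channels = [rng.normal(size=(7, dim)) for _ in range(2)]  # two embedding versions
filters = rng.normal(size=(n_filters, width * dim))
feature_vec = multichannel_conv(channels, filters, width)  # one value per filter
```

The pooled feature vector would then feed a fully connected softmax layer that predicts the relation class (e.g., DDI type or PPI vs. no interaction).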

  14. Brain MAPS: an automated, accurate and robust brain extraction technique using a template library

    PubMed Central

    Leung, Kelvin K.; Barnes, Josephine; Modat, Marc; Ridgway, Gerard R.; Bartlett, Jonathan W.; Fox, Nick C.; Ourselin, Sébastien

    2011-01-01

    Whole brain extraction is an important pre-processing step in neuro-image analysis. Manual or semi-automated brain delineations are labour-intensive and thus not desirable in large studies, meaning that automated techniques are preferable. The accuracy and robustness of automated methods are crucial because human expertise may be required to correct any sub-optimal results, which can be very time consuming. We compared the accuracy of four automated brain extraction methods: Brain Extraction Tool (BET), Brain Surface Extractor (BSE), Hybrid Watershed Algorithm (HWA) and a Multi-Atlas Propagation and Segmentation (MAPS) technique we have previously developed for hippocampal segmentation. The four methods were applied to extract whole brains from 682 1.5T and 157 3T T1-weighted MR baseline images from the Alzheimer’s Disease Neuroimaging Initiative database. Semi-automated brain segmentations with manual editing and checking were used as the gold standard against which the results were compared. The median Jaccard index of MAPS was higher than those of HWA, BET and BSE in 1.5T and 3T scans (p < 0.05, all tests), and the 1st-99th centile range of the Jaccard index of MAPS was smaller than those of HWA, BET and BSE in 1.5T and 3T scans (p < 0.05, all tests). HWA and MAPS were found to be best at including all brain tissues (median false negative rate ≤ 0.010% for 1.5T scans and ≤ 0.019% for 3T scans, both methods). The median Jaccard index of MAPS was similar in 1.5T and 3T scans, whereas those of BET, BSE and HWA were higher in 1.5T scans than 3T scans (p < 0.05, all tests). We found that the diagnostic group had a small effect on the median Jaccard index of all four methods. In conclusion, MAPS had relatively high accuracy and low variability compared to HWA, BET and BSE in MR scans with and without atrophy. PMID:21195780
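The Jaccard overlap used to score the extractions is straightforward to compute from two binary masks (the two 6-voxel masks below are toy data for illustration):

```python
def jaccard(a, b):
    """Jaccard index |A ∩ B| / |A ∪ B| between two flattened binary masks."""
    inter = sum(1 for x, y in zip(a, b) if x and y)
    union = sum(1 for x, y in zip(a, b) if x or y)
    return inter / union if union else 1.0

auto   = [1, 1, 1, 0, 0, 1]  # automated brain mask (toy)
manual = [1, 1, 0, 0, 1, 1]  # gold-standard mask (toy)
j = jaccard(auto, manual)    # 3 overlapping voxels / 5 in the union
```

The false negative rate reported in the abstract is the complementary count: gold-standard voxels missed by the automated mask, divided by the gold-standard total.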

  15. AUTOMATED SOLID PHASE EXTRACTION GC/MS FOR ANALYSIS OF SEMIVOLATILES IN WATER AND SEDIMENTS

    EPA Science Inventory

    Data is presented on the development of a new automated system combining solid phase extraction (SPE) with GC/MS spectrometry for the single-run analysis of water samples containing a broad range of organic compounds. The system uses commercially available automated in-line sampl...

  16. Selecting a Relational Database Management System for Library Automation Systems.

    ERIC Educational Resources Information Center

    Shekhel, Alex; O'Brien, Mike

    1989-01-01

    Describes the evaluation of four relational database management systems (RDBMSs) (Informix Turbo, Oracle 6.0 TPS, Unify 2000 and Relational Technology's Ingres 5.0) to determine which is best suited for library automation. The evaluation criteria used to develop a benchmark specifically designed to test RDBMSs for libraries are discussed. (CLB)

  17. Towards automated support for extraction of reusable components

    NASA Technical Reports Server (NTRS)

    Abd-El-hafiz, S. K.; Basili, Victor R.; Caldiera, Gianluigi

    1992-01-01

    A cost effective introduction of software reuse techniques requires the reuse of existing software developed in many cases without aiming at reusability. This paper discusses the problems related to the analysis and reengineering of existing software in order to reuse it. We introduce a process model for component extraction and focus on the problem of analyzing and qualifying software components which are candidates for reuse. A prototype tool for supporting the extraction of reusable components is presented. One of the components of this tool aids in understanding programs and is based on the functional model of correctness. It can assist software engineers in the process of finding correct formal specifications for programs. A detailed description of this component and an example to demonstrate a possible operational scenario are given.

  18. Automated extraction and variability analysis of sulcal neuroanatomy.

    PubMed

    Le Goualher, G; Procyk, E; Collins, D L; Venugopal, R; Barillot, C; Evans, A C

    1999-03-01

    Systematic mapping of the variability in cortical sulcal anatomy is an area of increasing interest which presents numerous methodological challenges. To address these issues, we have implemented sulcal extraction and assisted labeling (SEAL) to automatically extract the two-dimensional (2-D) surface ribbons that represent the median axis of cerebral sulci and to neuroanatomically label these entities. To encode the extracted three-dimensional (3-D) cortical sulcal schematic topography (CSST) we define a relational graph structure composed of two main features: vertices (representing sulci) and arcs (representing the relationships between sulci). Vertices contain a parametric representation of the surface ribbon buried within the sulcus. Points on this surface are expressed in stereotaxic coordinates (i.e., with respect to a standardized brain coordinate system). For each of these vertices, we store length, depth, and orientation as well as anatomical attributes (e.g., hemisphere, lobe, sulcus type, etc.). Each arc stores the 3-D location of the junction between sulci as well as a list of its connecting sulci. Sulcal labeling is performed semiautomatically by selecting a sulcal entity in the CSST and selecting from a menu of candidate sulcus names. In order to help the user in the labeling task, the menu is restricted to the most likely candidates by using priors for the expected sulcal spatial distribution. These priors, i.e., sulcal probabilistic maps, were created from the spatial distribution of 34 sulci traced manually on 36 different subjects. Given these spatial probability maps, the user is provided with the likelihood that the selected entity belongs to a particular sulcus. The cortical structure representation obtained by SEAL is suitable to extract statistical information about both the spatial and the structural composition of the cerebral cortical topography. This methodology allows for the iterative construction of a successively more complete
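The relational graph structure described above (vertices for sulci, arcs for junctions between them) could be represented along these lines. The field names and values are illustrative assumptions, not SEAL's actual CSST schema:

```python
from dataclasses import dataclass

@dataclass
class Sulcus:
    """Vertex of the CSST graph: one extracted sulcal surface ribbon."""
    name: str        # anatomical label, e.g. assigned via the labeling menu
    length: float    # mm, measured in stereotaxic space
    depth: float     # mm
    hemisphere: str  # "L" or "R"

@dataclass
class Junction:
    """Arc of the CSST graph: a 3-D junction point between sulci."""
    location: tuple  # stereotaxic (x, y, z) coordinates
    connects: list   # names of the sulci meeting at this junction

graph = {"vertices": [], "arcs": []}
graph["vertices"].append(Sulcus("central", 92.0, 17.5, "L"))
graph["vertices"].append(Sulcus("postcentral", 80.0, 15.0, "L"))
graph["arcs"].append(Junction((-30.0, -20.0, 55.0), ["central", "postcentral"]))
```

Storing both geometric attributes on vertices and connectivity on arcs is what lets the representation answer the spatial and structural queries the abstract mentions.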

  19. Automating Nuclear-Safety-Related SQA Procedures with Custom Applications

    SciTech Connect

    Freels, James D.

    2016-01-01

    Nuclear safety-related procedures are rigorous for good reason. Small design mistakes can quickly turn into unwanted failures. Researchers at Oak Ridge National Laboratory worked with COMSOL to define a simulation app that automates the software quality assurance (SQA) verification process and provides results in less than 24 hours.

  20. Comparison of manual and automated nucleic acid extraction from whole-blood samples.

    PubMed

    Riemann, Kathrin; Adamzik, Michael; Frauenrath, Stefan; Egensperger, Rupert; Schmid, Kurt W; Brockmeyer, Norbert H; Siffert, Winfried

    2007-01-01

    Nucleic acid extraction and purification from whole blood is a routine application in many laboratories. Automation of this procedure promises standardized sample treatment, a low error rate, and avoidance of contamination. The performance of the BioRobot M48 (Qiagen) and the manual QIAamp DNA Blood Mini Kit (Qiagen) was compared for the extraction of DNA from whole blood. The concentration and purity of the extracted DNAs were determined by spectrophotometry. Analytical sensitivity was assessed by common PCR and genotyping techniques. The quantity and quality of the generated DNAs were slightly higher using the manual extraction method. The results of downstream applications were comparable to each other. Amplification of high-molecular-weight PCR fragments, genotyping by restriction digest, and pyrosequencing were successful for all samples. No cross-contamination could be detected. While automated DNA extraction requires significantly less hands-on time, it is slightly more expensive than the manual extraction method.

  1. Disposable and removable nucleic acid extraction and purification cartridges for automated flow-through systems

    DOEpatents

    Regan, John Frederick

    2014-09-09

    Removable cartridges are used on automated flow-through systems for the purpose of extracting and purifying genetic material from complex matrices. Different types of cartridges are paired with specific automated protocols to concentrate, extract, and purify pathogenic or human genetic material. Their flow-through nature allows large quantities of sample to be processed. Matrices may be filtered using size exclusion and/or affinity filters to concentrate the pathogen of interest. Lysed material is ultimately passed through a filter to remove the insoluble material before the soluble genetic material is delivered past a silica-like membrane that binds the genetic material, where it is washed, dried, and eluted. Cartridges are inserted into the housing areas of flow-through automated instruments, which are equipped with sensors to ensure proper placement and usage of the cartridges. Properly inserted cartridges create fluid- and air-tight seals with the flow lines of an automated instrument.

  2. Evaluation of four automated protocols for extraction of DNA from FTA cards.

    PubMed

    Stangegaard, Michael; Børsting, Claus; Ferrero-Miliani, Laura; Frank-Hansen, Rune; Poulsen, Lena; Hansen, Anders J; Morling, Niels

    2013-10-01

    Extraction of DNA using magnetic bead-based techniques on automated DNA extraction instruments provides a fast, reliable, and reproducible method for DNA extraction from various matrices. Here, we have compared the yield and quality of DNA extracted from FTA cards using four automated extraction protocols on three different instruments. The extraction processes were repeated up to six times with the same pieces of FTA cards. The sample material on the FTA cards was either blood or buccal cells. With the QIAamp DNA Investigator and QIAsymphony DNA Investigator kits, it was possible to extract DNA from the FTA cards in all six rounds of extractions in sufficient amount and quality to obtain complete short tandem repeat (STR) profiles on a QIAcube and a QIAsymphony SP. With the PrepFiler Express kit, almost all the extractable DNA was extracted in the first two rounds of extractions. Furthermore, we demonstrated that it was possible to successfully extract sufficient DNA for STR profiling from previously processed FTA card pieces that had been stored at 4 °C for up to 1 year. This showed that rare or precious FTA card samples may be saved for future analyses even though some DNA was already extracted from the FTA cards.

  3. Spatial resolution requirements for automated cartographic road extraction

    USGS Publications Warehouse

    Benjamin, S.; Gaydos, L.

    1990-01-01

    Ground resolution requirements for detection and extraction of road locations in a digitized large-scale photographic database were investigated. A color infrared photograph of Sunnyvale, California was scanned, registered to a map grid, and spatially degraded to 1- to 5-metre resolution pixels. Road locations in each data set were extracted using a combination of image processing and CAD programs. These locations were compared to a photointerpretation of road locations to determine a preferred pixel size for the extraction method. Based on road pixel omission error computations, a 3-metre pixel resolution appears to be the best choice for this extraction method. -Authors

  4. Automated multisyringe stir bar sorptive extraction using robust montmorillonite/epoxy-coated stir bars.

    PubMed

    Ghani, Milad; Saraji, Mohammad; Maya, Fernando; Cerdà, Víctor

    2016-05-06

    Herein we present a simple, rapid and low cost strategy for the preparation of robust stir bar coatings based on the combination of montmorillonite with epoxy resin. The composite stir bar was implemented in a novel automated multisyringe stir bar sorptive extraction system (MS-SBSE), and applied to the extraction of four chlorophenols (4-chlorophenol, 2,4-dichlorophenol, 2,4,6-trichlorophenol and pentachlorophenol) as model compounds, followed by high performance liquid chromatography-diode array detection. The different experimental parameters of the MS-SBSE, such as sample volume, selection of the desorption solvent, desorption volume, desorption time, sample solution pH, salt effect and extraction time, were studied. Under the optimum conditions, the detection limits were between 0.02 and 0.34 μg L(-1). Relative standard deviations (RSD) of the method for the analytes at the 10 μg L(-1) concentration level ranged from 3.5% to 4.1% (intra-day RSD) and from 3.9% to 4.3% (inter-day RSD at the 50 μg L(-1) concentration level). Batch-to-batch reproducibility for three different stir bars was 4.6-5.1%. The enrichment factors were between 30 and 49. In order to investigate the capability of the developed technique for real sample analysis, well water, wastewater and leachates from a solid waste treatment plant were satisfactorily analyzed.

  5. Artificial intelligence issues related to automated computing operations

    NASA Technical Reports Server (NTRS)

    Hornfeck, William A.

    1989-01-01

    Large data processing installations represent target systems for effective applications of artificial intelligence (AI) constructs. The system organization of a large data processing facility at the NASA Marshall Space Flight Center is presented. The methodology and the issues which are related to AI application to automated operations within a large-scale computing facility are described. Problems to be addressed and initial goals are outlined.

  6. Automated microfluidic DNA/RNA extraction with both disposable and reusable components

    NASA Astrophysics Data System (ADS)

    Kim, Jungkyu; Johnson, Michael; Hill, Parker; Sonkul, Rahul S.; Kim, Jongwon; Gale, Bruce K.

    2012-01-01

    An automated microfluidic nucleic acid extraction system was fabricated with a multilayer polydimethylsiloxane (PDMS) structure that consists of sample wells, microvalves, a micropump and a disposable microfluidic silica cartridge. Both the microvalves and micropump structures were fabricated in a single layer and are operated pneumatically using a 100 µm PDMS membrane. To fabricate the disposable microfluidic silica cartridge, two-cavity structures were made in a PDMS replica to fit the stacked silica membranes. A handheld controller for the microvalves and pumps was developed to enable system automation. With purified ribonucleic acid (RNA), whole blood and E. coli samples, the automated microfluidic nucleic acid extraction system was validated with a guanidine-based solid phase extraction procedure. An extraction efficiency of ~90% for deoxyribonucleic acid (DNA) and ~54% for RNA was obtained in 12 min from whole blood and E. coli samples, respectively. In addition, the same quantity and quality of extracted DNA was confirmed by polymerase chain reaction (PCR) amplification. The PCR also presented the appropriate amplification and melting profiles. Automated, programmable fluid control and physical separation of the reusable components and the disposable components significantly decrease the assay time and manufacturing cost and increase the flexibility and compatibility of the system with downstream components.

  7. Feature Extraction and Selection Strategies for Automated Target Recognition

    NASA Technical Reports Server (NTRS)

    Greene, W. Nicholas; Zhang, Yuhan; Lu, Thomas T.; Chao, Tien-Hsin

    2010-01-01

    Several feature extraction and selection methods for an existing automatic target recognition (ATR) system using JPL's Grayscale Optical Correlator (GOC) and Optimal Trade-Off Maximum Average Correlation Height (OT-MACH) filter were tested using MATLAB. The ATR system is composed of three stages: a cursory region-of-interest (ROI) search using the GOC and OT-MACH filter, a feature extraction and selection stage, and a final classification stage. Feature extraction and selection concern transforming potential target data into more useful forms as well as selecting important subsets of that data which may aid in detection and classification. The strategies tested were built around two popular extraction methods: Principal Component Analysis (PCA) and Independent Component Analysis (ICA). Performance was measured based on the classification accuracy and free-response receiver operating characteristic (FROC) output of a support vector machine (SVM) and a neural net (NN) classifier.
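As an illustration of the PCA stage, a minimal eigendecomposition-based projection is sketched below on synthetic data. In the actual system the input vectors would be ROI data from the GOC/OT-MACH search stage; here the features and the choice of k = 2 are invented:

```python
import numpy as np

def pca_features(X, k):
    """Project feature vectors onto the top-k principal components.
    X: (n_samples, n_features) array; returns (n_samples, k) reduced features."""
    Xc = X - X.mean(axis=0)                      # center the data
    cov = np.cov(Xc, rowvar=False)               # (n_features, n_features)
    vals, vecs = np.linalg.eigh(cov)             # ascending eigenvalues
    top = vecs[:, np.argsort(vals)[::-1][:k]]    # top-k eigenvectors
    return Xc @ top

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 8))                     # 50 synthetic ROI feature vectors
X[:, 0] *= 10                                    # give one direction dominant variance
Z = pca_features(X, 2)                           # reduced features fed to SVM/NN
```

ICA differs in seeking statistically independent (not merely decorrelated) components, but the pipeline position is the same: reduced features feed the SVM or NN classifier.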

  8. Automated Protein Biomarker Analysis: on-line extraction of clinical samples by Molecularly Imprinted Polymers

    NASA Astrophysics Data System (ADS)

    Rossetti, Cecilia; Świtnicka-Plak, Magdalena A.; Grønhaug Halvorsen, Trine; Cormack, Peter A. G.; Sellergren, Börje; Reubsaet, Léon

    2017-03-01

    Robust biomarker quantification is essential for the accurate diagnosis of diseases and is of great value in cancer management. In this paper, an innovative diagnostic platform is presented which provides automated molecularly imprinted solid-phase extraction (MISPE) followed by liquid chromatography-mass spectrometry (LC-MS) for biomarker determination using ProGastrin Releasing Peptide (ProGRP), a highly sensitive biomarker for Small Cell Lung Cancer, as a model. Molecularly imprinted polymer microspheres were synthesized by precipitation polymerization, and analytical optimization of the most promising material led to the development of an automated quantification method for ProGRP. The method enabled analysis of patient serum samples with elevated ProGRP levels. The automated extraction required particularly low sample volumes and was time-efficient, demonstrating the potential of such a strategy in a clinical setting.

  9. Automated Protein Biomarker Analysis: on-line extraction of clinical samples by Molecularly Imprinted Polymers

    PubMed Central

    Rossetti, Cecilia; Świtnicka-Plak, Magdalena A.; Grønhaug Halvorsen, Trine; Cormack, Peter A.G.; Sellergren, Börje; Reubsaet, Léon

    2017-01-01

    Robust biomarker quantification is essential for the accurate diagnosis of diseases and is of great value in cancer management. In this paper, an innovative diagnostic platform is presented which provides automated molecularly imprinted solid-phase extraction (MISPE) followed by liquid chromatography-mass spectrometry (LC-MS) for biomarker determination using ProGastrin Releasing Peptide (ProGRP), a highly sensitive biomarker for Small Cell Lung Cancer, as a model. Molecularly imprinted polymer microspheres were synthesized by precipitation polymerization, and analytical optimization of the most promising material led to the development of an automated quantification method for ProGRP. The method enabled analysis of patient serum samples with elevated ProGRP levels. The automated extraction required particularly low sample volumes and was time-efficient, demonstrating the potential of such a strategy in a clinical setting. PMID:28303910

  10. Automation and Other Extensions of the SMAC Modal Parameter Extraction Package

    SciTech Connect

    KLENKE,SCOTT E.; MAYES,RANDALL L.

    1999-11-01

    As model validation techniques gain more acceptance and increase in power, the demands on modal parameter extraction increase. Estimation accuracy, the number of modes desired, and data reduction efficiency are all required features. An algorithm known as SMAC (Synthesize Modes And Correlate), based on principles of modal filtering, has been in development for a few years. SMAC has now been extended in two main areas. First, it has been automated. Second, it has been extended to fit complex modes as well as real modes. These extensions have enhanced the power of modal extraction so that, typically, the analyst needs to manually fit only 10 percent of the modes in the desired bandwidth, while the automated routines fit the remaining 90 percent. SMAC could be successfully automated because it generally does not produce computational roots.

  11. Discovering Indicators of Successful Collaboration Using Tense: Automated Extraction of Patterns in Discourse

    ERIC Educational Resources Information Center

    Thompson, Kate; Kennedy-Clark, Shannon; Wheeler, Penny; Kelly, Nick

    2014-01-01

    This paper describes a technique for locating indicators of success within the data collected from complex learning environments, proposing an application of e-research to access learner processes and measure and track group progress. The technique combines automated extraction of tense and modality via parts-of-speech tagging with a visualisation…
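    The tense-and-modality extraction step can be illustrated with a toy tagger. This is a hedged sketch only: the paper uses proper parts-of-speech tagging, whereas the rule-based suffix/keyword heuristic below (and the example utterance) are invented stand-ins.

```python
# Minimal rule-based stand-in for POS-based tense/modality extraction.
MODALS = {"can", "could", "may", "might", "shall", "should", "will", "would", "must"}

def tag_tense_modality(utterance):
    counts = {"modal": 0, "past": 0, "present": 0}
    for tok in utterance.lower().split():
        if tok in MODALS:
            counts["modal"] += 1          # modality marker
        elif tok.endswith("ed"):
            counts["past"] += 1           # crude past-tense cue
        elif tok.endswith("s") or tok.endswith("ing"):
            counts["present"] += 1        # crude present-tense cue
    return counts

counts = tag_tense_modality("we should move the rig before the flood peaks")
print(counts)
```

    Aggregating such counts per speaker and per time window yields the kind of discourse patterns the paper visualises.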

  12. Data Mining: The Art of Automated Knowledge Extraction

    NASA Astrophysics Data System (ADS)

    Karimabadi, H.; Sipes, T.

    2012-12-01

    Data mining algorithms are used routinely in a wide variety of fields and are gaining adoption in the sciences. The realities of real-world data analysis are that (a) data has flaws, and (b) the models and assumptions that we bring to the data are inevitably flawed, and/or biased and misspecified in some way. Data mining can improve data analysis by detecting anomalies in the data, checking the consistency of the user's model assumptions, and deciphering complex patterns and relationships that would not be accessible otherwise. The common form of data collected from in situ spacecraft measurements is the multivariate time series, which represents one of the most challenging problems in data mining. We have successfully developed algorithms to deal with such data and have extended the algorithms to handle streaming data. In this talk, we illustrate the utility of our algorithms through several examples including automated detection of reconnection exhausts in the solar wind and flux ropes in the magnetotail. We also show examples from successful applications of our technique to analysis of 3D kinetic simulations. With an eye to the future, we provide an overview of our upcoming plans that include collaborative data mining, expert outsourcing data mining, computer vision for image analysis, among others. Finally, we discuss the integration of data mining algorithms with web-based services such as VxOs and other Heliophysics data centers and the resulting capabilities that it would enable.
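    Event detection in a multivariate time series of the kind described above can be sketched with a trailing-window z-score test. This is an illustrative assumption, not the authors' algorithm: the window size, threshold, and injected "exhaust-like" jump are invented.

```python
import numpy as np

def sliding_zscore_anomalies(ts, window, thresh=4.0):
    # Flag samples whose deviation from the trailing-window mean exceeds
    # `thresh` standard deviations on any channel.
    flags = []
    for t in range(window, len(ts)):
        ref = ts[t - window:t]
        z = np.abs(ts[t] - ref.mean(axis=0)) / (ref.std(axis=0) + 1e-9)
        flags.append(z.max() > thresh)
    return np.array(flags)

rng = np.random.default_rng(1)
data = rng.normal(0, 1, (200, 3))          # 3-channel synthetic "solar wind" series
data[150] += np.array([0.0, 12.0, 0.0])    # injected exhaust-like jump on channel 1
hits = sliding_zscore_anomalies(data, window=30)
print("spike flagged:", bool(hits[150 - 30]))
```

    A streaming variant would keep running window statistics instead of re-slicing the array at each step.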

  13. Multispectral Image Road Extraction Based Upon Automated Map Conflation

    NASA Astrophysics Data System (ADS)

    Chen, Bin

    Road network extraction from remotely sensed imagery enables many important and diverse applications such as vehicle tracking, drone navigation, and intelligent transportation studies. There are, however, a number of challenges to road detection from an image. Road pavement material, width, direction, and topology vary across a scene. Complete or partial occlusions caused by nearby buildings, trees, and the shadows cast by them make maintaining road connectivity difficult. The problems posed by occlusions are exacerbated with the increasing use of oblique imagery from aerial and satellite platforms. Further, common objects such as rooftops and parking lots are made of materials similar or identical to road pavements. This problem of common materials is a classic case of a single land cover material existing for different land use scenarios. This work addresses these problems in road extraction from geo-referenced imagery by leveraging the OpenStreetMap digital road map to guide image-based road extraction. The crowd-sourced cartography has the advantage of worldwide coverage that is constantly updated. The derived road vectors follow only roads and so can serve to guide image-based road extraction with minimal confusion from occlusions and changes in road material. On the other hand, the vector road map has no information on road widths, and misalignments between the vector map and the geo-referenced image are small but nonsystematic. Properly correcting misalignment between two geospatial datasets, also known as map conflation, is an essential step. A generic framework requiring minimal human intervention is described for multispectral image road extraction and automatic road map conflation. The approach relies on the road feature generation of a binary mask and a corresponding curvilinear image. A method for generating the binary road mask from the image by applying a spectral measure is presented. The spectral measure, called anisotropy-tunable distance (ATD
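    The binary-road-mask idea can be sketched with a plain spectral-angle test. This is a hedged illustration: the paper's anisotropy-tunable distance (ATD) is not reproduced, and the reference "asphalt" spectrum and toy scene below are invented.

```python
import numpy as np

def road_mask(img, ref, max_angle=0.1):
    # img: H x W x B multispectral cube; ref: B-vector reference road spectrum.
    # Mark pixels whose spectral angle to the reference is below max_angle (rad).
    dots = img @ ref
    norms = np.linalg.norm(img, axis=2) * np.linalg.norm(ref) + 1e-12
    angle = np.arccos(np.clip(dots / norms, -1.0, 1.0))
    return angle < max_angle

ref = np.array([0.3, 0.3, 0.35, 0.4])     # assumed 4-band road spectrum
scene = np.full((4, 4, 4), 0.8)           # spectrally flat bright background
scene[2, :, :] = ref * 1.1                # a horizontal "road" row, same shape, brighter
mask = road_mask(scene, ref)
print(mask.astype(int))
```

    Because the spectral angle ignores overall brightness, the scaled road row matches the reference exactly while the flat background falls outside the angular tolerance.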

  14. Highly efficient automated extraction of DNA from old and contemporary skeletal remains.

    PubMed

    Zupanič Pajnič, Irena; Debska, Magdalena; Gornjak Pogorelc, Barbara; Vodopivec Mohorčič, Katja; Balažic, Jože; Zupanc, Tomaž; Štefanič, Borut; Geršak, Ksenija

    2016-01-01

    We optimised the automated extraction of DNA from old and contemporary skeletal remains using the AutoMate Express system and the PrepFiler BTA kit. Twenty-four contemporary and 25 old skeletal remains from WWII were analysed. For each skeleton, extraction using only 0.05 g of powder was performed according to the manufacturer's recommendations (no demineralisation - ND method). Since only 32% of full profiles were obtained from aged and 58% from contemporary casework skeletons, the extraction protocol was modified to acquire higher quality DNA, and genomic DNA was obtained after full demineralisation (FD method). The nuclear DNA of the samples was quantified using the Investigator Quantiplex kit, and STR typing was performed using the NGM kit to evaluate the performance of the tested extraction methods. In the aged DNA samples, 64% of full profiles were obtained using the FD method. For the contemporary skeletal remains, the performance of the ND method was closer to the FD method than for the old skeletons, giving 58% of full profiles with the ND method and 71% of full profiles with the FD method. The extraction of DNA from only 0.05 g of bone or tooth powder using the AutoMate Express has proven highly successful in the recovery of DNA from old and contemporary skeletons, especially with the modified FD method. We believe that the results obtained will contribute to the possibilities of using automated devices for extracting DNA from skeletal remains, which would shorten the procedures for obtaining high-quality DNA from skeletons in forensic laboratories.

  15. Visual Routines for Extracting Magnitude Relations

    ERIC Educational Resources Information Center

    Michal, Audrey L.; Uttal, David; Shah, Priti; Franconeri, Steven L.

    2016-01-01

    Linking relations described in text with relations in visualizations is often difficult. We used eye tracking to measure how college students and young children (6- and 8-year-olds) extract such relations from graphs. Participants compared relational statements ("Are there more blueberries than oranges?") with simple…

  16. Automated motif extraction and classification in RNA tertiary structures

    PubMed Central

    Djelloul, Mahassine; Denise, Alain

    2008-01-01

    We used a novel graph-based approach to extract RNA tertiary motifs. We cataloged them all and clustered them using an innovative graph similarity measure. We applied our method to three widely studied structures: Haloarcula marismortui 50S (H.m 50S), Escherichia coli 50S (E. coli 50S), and Thermus thermophilus 16S (T.th 16S) RNAs. We identified 10 known motifs without any prior knowledge of their shapes or positions. We additionally identified four putative new motifs. PMID:18957493
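    Clustering motifs by graph similarity can be illustrated with a simple edge-set Jaccard measure. This is a hedged stand-in for the paper's more elaborate graph similarity measure, and the toy motif encodings below are invented.

```python
# Motif comparison via edge-set Jaccard similarity: 1.0 means identical
# edge sets, 0.0 means no shared base-pair edges.
def jaccard_similarity(g1, g2):
    e1, e2 = set(g1), set(g2)
    return len(e1 & e2) / len(e1 | e2)

# Motifs encoded as sets of labelled base-pair edges (assumed toy encoding).
kink_turn_a = {("A1", "G5"), ("G2", "A4"), ("C3", "G6")}
kink_turn_b = {("A1", "G5"), ("G2", "A4"), ("U3", "G6")}
hairpin     = {("C1", "G4"), ("G2", "C3")}

sim_ab = jaccard_similarity(kink_turn_a, kink_turn_b)   # near-duplicates
sim_ah = jaccard_similarity(kink_turn_a, hairpin)       # unrelated motifs
print(sim_ab, sim_ah)
```

    Feeding such pairwise similarities into any standard clustering routine groups recurring motifs without prior knowledge of their shapes or positions, which is the spirit of the paper's approach.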

  17. Fully Automated Electro Membrane Extraction Autosampler for LC-MS Systems Allowing Soft Extractions for High-Throughput Applications.

    PubMed

    Fuchs, David; Pedersen-Bjergaard, Stig; Jensen, Henrik; Rand, Kasper D; Honoré Hansen, Steen; Petersen, Nickolaj Jacob

    2016-07-05

    The current work describes the implementation of electro membrane extraction (EME) into an autosampler for high-throughput analysis of samples by EME-LC-MS. The extraction probe was built into a luer lock adapter connected to a HTC PAL autosampler syringe. As the autosampler drew sample solution, analytes were extracted into the lumen of the extraction probe and transferred to a LC-MS system for further analysis. Various parameters affecting extraction efficiency were investigated, including syringe fill strokes, syringe pull up volume, pull up delay and volume in the sample vial. The system was optimized for soft extraction of analytes and high sample throughput. Further, it was demonstrated that by flushing the EME-syringe with acidic wash buffer and reversing the applied electric potential, carry-over between samples can be reduced to below 1%. Performance of the system was characterized (RSD, <10%; R², 0.994) and finally, the EME-autosampler was used to analyze in vitro conversion of methadone into its main metabolite by rat liver microsomes and for demonstrating the potential of known CYP3A4 inhibitors to prevent metabolism of methadone. By making use of the high extraction speed of EME, a complete analytical workflow of purification, separation, and analysis of sample could be achieved within only 5.5 min. With the developed system, large sequences of samples could be analyzed in a completely automated manner. This high degree of automation makes the developed EME-autosampler a powerful tool for a wide range of applications where high-throughput extractions are required before sample analysis.

  18. Automated extraction and labelling of the arterial tree from whole-body MRA data.

    PubMed

    Shahzad, Rahil; Dzyubachyk, Oleh; Staring, Marius; Kullberg, Joel; Johansson, Lars; Ahlström, Håkan; Lelieveldt, Boudewijn P F; van der Geest, Rob J

    2015-08-01

    In this work, we present a fully automated algorithm for extraction of the 3D arterial tree and labelling the tree segments from whole-body magnetic resonance angiography (WB-MRA) sequences. The algorithm developed consists of two core parts: (i) 3D volume reconstruction from different stations with simultaneous correction of different types of intensity inhomogeneity, and (ii) extraction of the arterial tree and subsequent labelling of the pruned extracted tree. Extraction of the arterial tree is performed using the probability map of the "contrast" class, which is obtained as one of the results of the inhomogeneity correction scheme. We demonstrate that such an approach is more robust than using the difference between the pre- and post-contrast channels traditionally used for this purpose. Labelling the extracted tree is performed using a combination of graph-based and atlas-based approaches. Validation of our method with respect to the extracted tree was performed on the arterial tree subdivided into 32 segments, of which 82.4% were completely detected, 11.7% partially detected, and 5.9% missed, on a cohort of 35 subjects. With respect to automated labelling accuracy of the 32 segments, various registration strategies were investigated on a training set consisting of 10 scans. Further analysis on the test set consisting of 25 data sets indicates that 69% of the vessel centerline tree in the head and neck region, 80% in the thorax and abdomen region, and 84% in the legs was accurately labelled to the correct vessel segment. These results indicate the clinical potential of our approach in enabling fully automated and accurate analysis of the entire arterial tree. This is the first study that not only automatically extracts the WB-MRA arterial tree, but also labels the vessel tree segments.

  19. Comparisons of Three Automated Systems for Genomic DNA Extraction in a Clinical Diagnostic Laboratory

    PubMed Central

    Lee, Jong-Han; Park, Yongjung; Choi, Jong Rak; Lee, Eun Kyung

    2010-01-01

    Purpose The extraction of nucleic acid is initially a limiting step for a successful molecular-based diagnostic workup. This study aims to compare the effectiveness of three automated DNA extraction systems for clinical laboratory use. Materials and Methods Venous blood samples from 22 healthy volunteers were analyzed using the QIAamp® Blood Mini Kit (Qiagen), the MagNA Pure LC Nucleic Acid Isolation Kit I (Roche), and the Magtration-Magnazorb DNA common kit-200N (PSS). The concentration of extracted DNAs was measured by NanoDrop ND-1000 (PeqLab). Extracted DNAs were also confirmed by direct agarose gel electrophoresis and were amplified by polymerase chain reaction (PCR) for the human beta-globin gene. Results The corrected concentrations of extracted DNAs were 25.42 ± 8.82 ng/µL (13.49-52.85 ng/µL) with the QIAamp® Blood Mini Kit (Qiagen), 22.65 ± 14.49 ng/µL (19.18-93.39 ng/µL) with the MagNA Pure LC Nucleic Acid Isolation Kit I (Roche), and 22.35 ± 6.47 ng/µL (12.57-35.08 ng/µL) with the Magtration-Magnazorb DNA common kit-200N (PSS). No statistically significant difference was noticed among the three commercial kits (p > 0.05). Only the mean DNA purity from the PSS kit was slightly lower than the others. All the extracted DNAs were successfully identified by direct agarose gel electrophoresis, and all products of the beta-globin gene PCR showed a reproducible pattern of bands. Conclusion The effectiveness of the three automated extraction systems is equivalent and good enough to produce reasonable results. Each laboratory could select an automated system according to its clinical and laboratory conditions. PMID:20046522
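    The spectrophotometric quantification underlying the reported concentrations can be shown in two lines, using the standard conversion of 1 A260 unit ≈ 50 ng/µL for double-stranded DNA and the A260/A280 purity ratio (~1.8 for pure DNA). The example absorbance readings are invented for illustration, not taken from the study.

```python
# NanoDrop-style dsDNA quantification from UV absorbance readings.
def dna_concentration(a260, dilution_factor=1.0):
    return a260 * 50.0 * dilution_factor   # ng/µL, standard dsDNA conversion

def purity_ratio(a260, a280):
    return a260 / a280                     # ~1.8 indicates protein-free DNA

a260, a280 = 0.508, 0.282                  # assumed example readings
conc = dna_concentration(a260)
ratio = purity_ratio(a260, a280)
print(f"{conc:.1f} ng/uL, A260/A280 = {ratio:.2f}")
```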

  20. Automated DNA extraction platforms offer solutions to challenges of assessing microbial biofouling in oil production facilities

    PubMed Central

    2012-01-01

    The analysis of microbial assemblages in industrial, marine, and medical systems can inform decisions regarding quality control or mitigation. Modern molecular approaches to detect, characterize, and quantify microorganisms provide rapid and thorough measures unbiased by the need for cultivation. The requirement of timely extraction of high quality nucleic acids for molecular analysis is faced with specific challenges when used to study the influence of microorganisms on oil production. Production facilities are often ill equipped for nucleic acid extraction techniques, making the preservation and transportation of samples off-site a priority. As a potential solution, the possibility of extracting nucleic acids on-site using automated platforms was tested. The performance of two such platforms, the Fujifilm QuickGene-Mini80™ and Promega Maxwell®16 was compared to a widely used manual extraction kit, MOBIO PowerBiofilm™ DNA Isolation Kit, in terms of ease of operation, DNA quality, and microbial community composition. Three pipeline biofilm samples were chosen for these comparisons; two contained crude oil and corrosion products and the third transported seawater. Overall, the two more automated extraction platforms produced higher DNA yields than the manual approach. DNA quality was evaluated for amplification by quantitative PCR (qPCR) and end-point PCR to generate 454 pyrosequencing libraries for 16S rRNA microbial community analysis. Microbial community structure, as assessed by DGGE analysis and pyrosequencing, was comparable among the three extraction methods. Therefore, the use of automated extraction platforms should enhance the feasibility of rapidly evaluating microbial biofouling at remote locations or those with limited resources. PMID:23168231

  1. A fully automated liquid–liquid extraction system utilizing interface detection

    PubMed Central

    Maslana, Eugene; Schmitt, Robert; Pan, Jeffrey

    2000-01-01

    The development of the Abbott Liquid-Liquid Extraction Station was a result of the need for an automated system to perform aqueous extraction on large sets of newly synthesized organic compounds used for drug discovery. The system utilizes a cylindrical laboratory robot to shuttle sample vials between two loading racks, two identical extraction stations, and a centrifuge. Extraction is performed by detecting the phase interface (by difference in refractive index) of the moving column of fluid drawn from the bottom of each vial containing a biphasic mixture. The integration of interface detection with fluid extraction maximizes sample throughput. Abbott-developed electronics process the detector signals. Sample mixing is performed by high-speed solvent injection. Centrifuging of the samples reduces interface emulsions. Operating software permits the user to program wash protocols with any one of six solvents per wash cycle with as many cycle repeats as necessary. Station capacity is eighty 15 ml vials. This system has proven successful with a broad spectrum of both ethyl acetate and methylene chloride based chemistries. The development and characterization of this automated extraction system will be presented. PMID:18924693
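    The interface-detection idea above reduces to locating a step change in the detector signal as the organic phase gives way to the aqueous phase. A minimal sketch, with an invented refractive-index trace (the real station's electronics and signal shape are not reproduced):

```python
# Locate the phase interface as the largest jump between adjacent samples
# of a refractive-index detector trace.
def find_interface(signal):
    jumps = [abs(signal[i + 1] - signal[i]) for i in range(len(signal) - 1)]
    return jumps.index(max(jumps)) + 1   # index of first sample past the step

# Simulated trace: organic phase (~1.37) followed by aqueous phase (~1.33).
trace = [1.372, 1.371, 1.373, 1.372, 1.334, 1.333, 1.332, 1.334]
cut = find_interface(trace)
print("switch collection valve at sample", cut)
```

    In the real instrument this decision is made in hardware as the fluid column moves past the detector, so the valve can divert the two phases in a single pass.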

  2. Automated segmentation and feature extraction of product inspection items

    NASA Astrophysics Data System (ADS)

    Talukder, Ashit; Casasent, David P.

    1997-03-01

    X-ray film and linescan images of pistachio nuts on conveyor trays for product inspection are considered. The final objective is the categorization of pistachios into good, blemished and infested nuts. A crucial step before classification is the separation of touching products and the extraction of features essential for classification. This paper addresses new detection and segmentation algorithms to isolate touching or overlapping items. These algorithms employ a new filter, a new watershed algorithm, and morphological processing to produce nutmeat-only images. Tests on a large database of x-ray film and real-time x-ray linescan images of around 2900 small, medium and large nuts showed excellent segmentation results. A new technique to detect and segment dark regions in nutmeat images is also presented and tested on approximately 300 x-ray film and approximately 300 real-time linescan x-ray images with 95-97 percent detection and correct segmentation. New algorithms are described that determine nutmeat fill ratio and locate splits in nutmeat. The techniques formulated in this paper are of general use in many different product inspection and computer vision problems.
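    The separation step described above ultimately yields isolated per-item regions. As a hedged sketch of that idea, the flood-fill connected-component labelling below separates non-touching items in a binary mask; the paper's filter, watershed algorithm, and morphological processing for touching items are not reproduced, and the toy mask is invented.

```python
# Label 4-connected components in a binary mask via iterative flood fill.
def label_components(mask):
    h, w = len(mask), len(mask[0])
    labels = [[0] * w for _ in range(h)]
    current = 0
    for i in range(h):
        for j in range(w):
            if mask[i][j] and not labels[i][j]:
                current += 1                    # start a new item
                stack = [(i, j)]
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < h and 0 <= x < w and mask[y][x] and not labels[y][x]:
                        labels[y][x] = current
                        stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return labels, current

# Two separate "nuts" in a thresholded x-ray image.
mask = [[1, 1, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 1]]
labels, n = label_components(mask)
print(n, "items found")
```

    Touching items collapse into one component under this scheme, which is precisely why the paper adds a watershed stage on top of plain labelling.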

  3. Extraction of prostatic lumina and automated recognition for prostatic calculus image using PCA-SVM.

    PubMed

    Wang, Zhuocai; Xu, Xiangmin; Ding, Xiaojun; Xiao, Hui; Huang, Yusheng; Liu, Jian; Xing, Xiaofen; Wang, Hua; Liao, D Joshua

    2011-01-01

    Identification of prostatic calculi is an important basis for determining the tissue origin. Computer-assisted diagnosis of prostatic calculi may have promising potential but is currently still little studied. We studied the extraction of prostatic lumina and automated recognition of calculus images. Extraction of lumina from prostate histology images was based on local entropy and Otsu thresholding; recognition used PCA-SVM based on the texture features of prostatic calculi. The SVM classifier showed an average run time of 0.1432 seconds, an average training accuracy of 100%, an average test accuracy of 93.12%, a sensitivity of 87.74%, and a specificity of 94.82%. We concluded that the algorithm, based on texture features and PCA-SVM, can recognize the concentric structure and visualized features easily. Therefore, this method is effective for the automated recognition of prostatic calculi.
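    The reported sensitivity and specificity follow directly from a binary confusion matrix. A minimal sketch; the counts below are invented for illustration and are not the study's raw data:

```python
# Standard binary-classification metrics from confusion-matrix counts.
def metrics(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)        # true positive rate
    specificity = tn / (tn + fp)        # true negative rate
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, accuracy

# Assumed example counts (illustrative only).
sens, spec, acc = metrics(tp=93, fn=13, tn=183, fp=10)
print(f"sensitivity={sens:.2%} specificity={spec:.2%} accuracy={acc:.2%}")
```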

  4. Automated renal histopathology: digital extraction and quantification of renal pathology

    NASA Astrophysics Data System (ADS)

    Sarder, Pinaki; Ginley, Brandon; Tomaszewski, John E.

    2016-03-01

    The branch of pathology concerned with excess blood serum proteins being excreted in the urine pays particular attention to the glomerulus, a small intertwined bunch of capillaries located at the beginning of the nephron. Normal glomeruli allow a moderate amount of blood proteins to be filtered; proteinuric glomeruli allow a large amount of blood proteins to be filtered. Diagnosis of proteinuric diseases requires time-intensive manual examination of the structural compartments of the glomerulus from renal biopsies. Pathological examination includes cellularity of individual compartments, Bowman's and luminal space segmentation, cellular morphology, glomerular volume, capillary morphology, and more. Long examination times may increase diagnosis time and/or reduce the precision of the diagnostic process. Automatic quantification holds strong potential to reduce renal diagnostic time. We have developed a computational pipeline capable of automatically segmenting relevant features from renal biopsies. Our method first segments glomerular compartments from renal biopsies by isolating regions with high nuclear density. Gabor texture segmentation is used to accurately define glomerular boundaries. Bowman's and luminal spaces are segmented using morphological operators. Nuclei structures are segmented using color deconvolution, morphological processing, and bottleneck detection. Average computation time of feature extraction for a typical biopsy, comprising ~12 glomeruli, is ~69 s using an Intel(R) Core(TM) i7-4790 CPU, and is ~65X faster than manual processing. Using images from rat renal tissue samples, automatic glomerular structural feature estimation was reproducibly demonstrated for 15 biopsy images, which contained 148 individual glomeruli images. The proposed method holds immense potential to enhance the information available while making clinical diagnoses.

  5. Metal-organic framework mixed-matrix disks: Versatile supports for automated solid-phase extraction prior to chromatographic separation.

    PubMed

    Ghani, Milad; Font Picó, Maria Francesca; Salehinia, Shima; Palomino Cabello, Carlos; Maya, Fernando; Berlier, Gloria; Saraji, Mohammad; Cerdà, Víctor; Turnes Palomino, Gemma

    2017-03-10

    We present for the first time the application of metal-organic framework (MOF) mixed-matrix disks (MMDs) for the automated flow-through solid-phase extraction (SPE) of environmental pollutants. Zirconium terephthalate UiO-66 and UiO-66-NH2 MOFs of different sizes (90, 200 and 300 nm) have been incorporated into mechanically stable polyvinylidene difluoride (PVDF) disks. The performance of the MOF-MMDs for automated SPE of seven substituted phenols prior to HPLC analysis has been evaluated using the sequential injection analysis technique. MOF-MMDs enabled the simultaneous extraction of phenols with the concomitant size exclusion of molecules of larger size. The best extraction performance was obtained using a MOF-MMD containing 90 nm UiO-66-NH2 crystals. Using the selected MOF-MMD, detection limits ranging from 0.1 to 0.2 µg/L were obtained. Relative standard deviations ranged from 3.9 to 5.3% intra-day and from 4.7 to 5.7% inter-day. Membrane batch-to-batch reproducibility ranged from 5.2 to 6.4%. Three different groundwater samples were analyzed with the proposed method using MOF-MMDs, obtaining recoveries ranging from 90 to 98% for all tested analytes.

  6. An advanced distributed automated extraction of drainage network model on high-resolution DEM

    NASA Astrophysics Data System (ADS)

    Mao, Y.; Ye, A.; Xu, J.; Ma, F.; Deng, X.; Miao, C.; Gong, W.; Di, Z.

    2014-07-01

    A high-resolution, high-accuracy drainage network map is a prerequisite for simulating the water cycle in land surface hydrological models. The objective of this study was to develop a new automated drainage network extraction model that can produce a high-precision, continuous drainage network from a high-resolution DEM (Digital Elevation Model). High-resolution DEMs demand considerable computing resources for drainage network extraction, and the conventional GIS method often cannot complete the calculation on high-resolution DEMs of large basins because the number of grid cells is too large. To decrease the computation time, an advanced distributed automated drainage network extraction model (Adam) is proposed in this study. The Adam model has two features: (1) it searches upward from the outlet of the basin instead of filling sinks, and (2) it divides sub-basins on a low-resolution DEM and then extracts the drainage network on the sub-basins of the high-resolution DEM. The case study used elevation data of the Shuttle Radar Topography Mission (SRTM) at 3 arc-second resolution in the Zhujiang River basin, China. The results show the Adam model can dramatically reduce the computation time. The extracted drainage network was continuous and more accurate than HydroSHEDS (Hydrological data and maps based on Shuttle Elevation Derivatives at multiple Scales).
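    The "search upward from the outlet" idea can be sketched on a toy DEM: assign each cell a steepest-descent flow direction, then walk upstream from the outlet collecting every contributing cell. This is a hedged simplification (4-neighbour descent rather than full D8, no sub-basin decomposition, invented elevations), not the Adam implementation.

```python
import numpy as np

def upstream_cells(dem, outlet):
    h, w = dem.shape
    downstream = {}
    # Steepest-descent flow direction for every cell (4-neighbour for brevity).
    for i in range(h):
        for j in range(w):
            nbrs = [(i + di, j + dj) for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
                    if 0 <= i + di < h and 0 <= j + dj < w]
            lowest = min(nbrs, key=lambda c: dem[c])
            if dem[lowest] < dem[i, j]:
                downstream[(i, j)] = lowest
    # Walk upstream from the outlet, collecting all contributing cells.
    basin, frontier = {outlet}, [outlet]
    while frontier:
        cell = frontier.pop()
        for src, dst in downstream.items():
            if dst == cell and src not in basin:
                basin.add(src)
                frontier.append(src)
    return basin

dem = np.array([[5., 4., 5.],
                [4., 2., 4.],
                [3., 1., 3.]])
basin = upstream_cells(dem, outlet=(2, 1))
print(len(basin), "cells drain to the outlet")
```

    Starting at the outlet avoids the costly basin-wide sink-filling pass, which is one of the two speedups the abstract highlights.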

  7. Bacterial and fungal DNA extraction from positive blood culture bottles: a manual and an automated protocol.

    PubMed

    Mäki, Minna

    2015-01-01

    When adapting a gene amplification-based method for routine sepsis diagnostics using a blood culture sample as the specimen type, a prerequisite for a successful and sensitive downstream analysis is an efficient DNA extraction step. In recent years, a number of in-house and commercial DNA extraction solutions have become available. Careful evaluation with respect to cell wall disruption of various microbes and subsequent recovery of microbial DNA without putative gene amplification inhibitors should be conducted prior to selecting the most feasible DNA extraction solution for the downstream analysis used. Since gene amplification technologies have been developed to be highly sensitive for a broad range of microbial species, it is also important to confirm that the sample preparation reagents and materials used are bioburden-free to avoid any risk of false-positive result reporting or interference with the diagnostic process. Here, one manual and one automated DNA extraction system feasible for blood culture samples are described.

  8. Comparison of QIAGEN automated nucleic acid extraction methods for CMV quantitative PCR testing.

    PubMed

    Miller, Steve; Seet, Henrietta; Khan, Yasmeen; Wright, Carolyn; Nadarajah, Rohan

    2010-04-01

    We examined the effect of nucleic acid extraction methods on the analytic characteristics of a quantitative polymerase chain reaction (PCR) assay for cytomegalovirus (CMV). Human serum samples were extracted with 2 automated instruments (BioRobot EZ1 and QIAsymphony SP; QIAGEN, Valencia, CA), and CMV PCR results were compared with those of pp65 antigenemia testing. Both extraction methods yielded results that were comparably linear and precise, whereas the QIAsymphony SP had a slightly lower limit of detection (1.92 log(10) copies/mL vs 2.26 log(10) copies/mL). In both cases, PCR was more sensitive than CMV antigen detection, detecting CMV viremia in 12% (EZ1) and 21% (QIAsymphony) of antigen-negative specimens. This study demonstrates the feasibility of using 2 different extraction techniques to yield results within 0.5 log(10) copies/mL of the mean value, a level that would allow for clinical comparison between different laboratory assays.
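    The log(10) copies/mL comparison above rests on standard-curve quantification: Ct = slope × log10(copies) + intercept, rearranged for copies and scaled to the plasma volume per reaction. A minimal sketch with assumed curve parameters and Ct values (not the study's data):

```python
import math

# Absolute qPCR quantification from an assumed standard curve.
def log10_copies_per_ml(ct, slope=-3.32, intercept=38.0, ml_per_rxn=0.01):
    log10_copies_rxn = (ct - intercept) / slope          # copies per reaction
    return log10_copies_rxn + math.log10(1.0 / ml_per_rxn)

lab_a = log10_copies_per_ml(ct=28.1)   # hypothetical result, extraction method A
lab_b = log10_copies_per_ml(ct=29.2)   # hypothetical result, extraction method B
# Results within 0.5 log10 copies/mL are considered clinically comparable.
print(f"delta = {abs(lab_a - lab_b):.2f} log10 copies/mL")
```

    A slope of -3.32 corresponds to 100% amplification efficiency, so roughly every 3.3 Ct of difference is one log10 of viral load.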

  9. Automated information extraction of key trial design elements from clinical trial publications.

    PubMed

    de Bruijn, Berry; Carini, Simona; Kiritchenko, Svetlana; Martin, Joel; Sim, Ida

    2008-11-06

    Clinical trials are one of the most valuable sources of scientific evidence for improving the practice of medicine. The Trial Bank project aims to improve structured access to trial findings by including formalized trial information into a knowledge base. Manually extracting trial information from published articles is costly, but automated information extraction techniques can assist. The current study highlights a single architecture to extract a wide array of information elements from full-text publications of randomized clinical trials (RCTs). This architecture combines a text classifier with a weak regular expression matcher. We tested this two-stage architecture on 88 RCT reports from 5 leading medical journals, extracting 23 elements of key trial information such as eligibility rules, sample size, intervention, and outcome names. Results prove this to be a promising avenue to help critical appraisers, systematic reviewers, and curators quickly identify key information elements in published RCT articles.
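    The "weak regular expression matcher" half of the two-stage architecture can be sketched for one element, sample size. The pattern and example sentences below are invented illustrations, not the authors' actual rules:

```python
import re

# Pull a sample-size candidate from a sentence the upstream text
# classifier has already flagged as relevant.
SAMPLE_SIZE = re.compile(
    r"(?:randomi[sz]ed|enrolled|recruited)\s+(\d{2,5})\s+"
    r"(?:patients|subjects|participants)",
    re.IGNORECASE)

def extract_sample_size(sentence):
    m = SAMPLE_SIZE.search(sentence)
    return int(m.group(1)) if m else None

n1 = extract_sample_size("We randomized 248 patients to drug or placebo.")
n2 = extract_sample_size("Outcomes were assessed at 12 months.")
print(n1, n2)
```

    Pairing a classifier (high recall on candidate sentences) with deliberately loose patterns like this one is what lets a single architecture cover 23 heterogeneous trial-design elements.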

  10. Precision Relative Positioning for Automated Aerial Refueling from a Stereo Imaging System

    DTIC Science & Technology

    2015-03-01

    Thesis: "Precision Relative Positioning for Automated Aerial Refueling from a Stereo Imaging System" by Kyle P. Werner, 2Lt, USAF (report no. AFIT-ENG-MS-15-M-048). The work is a product of the U.S. Government and is not subject to copyright protection in the United States. Approved for public release; distribution unlimited.

  11. Analyzing Automated Instructional Systems: Metaphors from Related Design Professions.

    ERIC Educational Resources Information Center

    Jonassen, David H.; Wilson, Brent G.

Noting that automation has had an impact on virtually every manufacturing and information operation in the world, including instructional design (ID), this paper suggests three basic metaphors for automating instructional design activities: (1) computer-aided design and manufacturing (CAD/CAM) systems; (2) expert system advisors; and (3)…

  12. Automated extraction of single H atoms with STM: tip state dependency.

    PubMed

    Møller, Morten; Jarvis, Samuel P; Guérinet, Laurent; Sharp, Peter; Woolley, Richard; Rahe, Philipp; Moriarty, Philip

    2017-02-17

The atomistic structure of the tip apex plays a crucial role in performing reliable atomic-scale surface and adsorbate manipulation using scanning probe techniques. We have developed an automated extraction routine for controlled removal of single hydrogen atoms from the H:Si(100) surface. The set of atomic extraction protocols detects a variety of desorption events during scanning tunneling microscope (STM)-induced modification of the hydrogen-passivated surface. The influence of the tip state on the probability for hydrogen removal was examined by comparing the desorption efficiency for various classifications of STM topographs (rows, dimers, atoms, etc.). We find that dimer-row-resolving tip apices extract hydrogen atoms most readily and reliably (and with least spurious desorption), while tip states which provide atomic resolution counter-intuitively have a lower probability for single H atom removal.

  13. Automated extraction of single H atoms with STM: tip state dependency

    NASA Astrophysics Data System (ADS)

    Møller, Morten; Jarvis, Samuel P.; Guérinet, Laurent; Sharp, Peter; Woolley, Richard; Rahe, Philipp; Moriarty, Philip

    2017-02-01

The atomistic structure of the tip apex plays a crucial role in performing reliable atomic-scale surface and adsorbate manipulation using scanning probe techniques. We have developed an automated extraction routine for controlled removal of single hydrogen atoms from the H:Si(100) surface. The set of atomic extraction protocols detects a variety of desorption events during scanning tunneling microscope (STM)-induced modification of the hydrogen-passivated surface. The influence of the tip state on the probability for hydrogen removal was examined by comparing the desorption efficiency for various classifications of STM topographs (rows, dimers, atoms, etc.). We find that dimer-row-resolving tip apices extract hydrogen atoms most readily and reliably (and with least spurious desorption), while tip states which provide atomic resolution counter-intuitively have a lower probability for single H atom removal.

  14. Factors controlling the manual and automated extraction of image information using imaging polarimetry

    NASA Astrophysics Data System (ADS)

    Duggin, Michael J.

    2004-07-01

    The factors governing the extraction of useful information from polarimetric images depend upon the image acquisition and analytical methodologies being used, and upon systematic and environmental variations present during the acquisition process. The acquisition process generally occurs with foreknowledge of the analysis to be used. Broadly, interactive image analysis and automated image analysis are two different procedures: in each case, there are technical challenges. Imaging polarimetry is more complex than other imaging methodologies, and produces an increased dimensionality. However, there are several potential broad areas of interactive (manual) and automated remote sensing in which imaging polarimetry can provide useful additional information. A review is presented of the factors controlling feature discrimination, of metrics that are used, and of some proposed directions for future research.

  15. Automated Kinematic Extraction of Wing and Body Motions of Free Flying Diptera

    NASA Astrophysics Data System (ADS)

    Kostreski, Nicholas I.

    In the quest to understand the forces generated by micro aerial systems powered by oscillating appendages, it is necessary to study the kinematics that generate those forces. Automated and manual tracking techniques were developed to extract the complex wing and body motions of dipteran insects, ideal micro aerial systems, in free flight. Video sequences were captured by three high speed cameras (7500 fps) oriented orthogonally around a clear flight test chamber. Synchronization and image-based triggering were made possible by an automated triggering circuit. A multi-camera calibration was implemented using image-based tracking techniques. Three-dimensional reconstructions of the insect were generated from the 2-D images by shape from silhouette (SFS) methods. An intensity based segmentation of the wings and body was performed using a mixture of Gaussians. In addition to geometric and cost based filtering, spectral clustering was also used to refine the reconstruction and Principal Component Analysis (PCA) was performed to find the body roll axis and wing-span axes. The unobservable roll state of the cylindrically shaped body was successfully estimated by combining observations of the wing kinematics with a wing symmetry assumption. Wing pitch was determined by a ray tracing technique to compute and minimize a point-to-line cost function. Linear estimation with assumed motion models was accomplished by discrete Kalman filtering the measured body states. Generative models were developed for different species of diptera for model based tracking, simulation, and extraction of inertial properties. Manual and automated tracking results were analyzed and insect flight simulation videos were developed to quantify ground truth errors for an assumed model. The results demonstrated the automated tracker to have comparable performance to a human digitizer, though manual techniques displayed superiority during aggressive maneuvers and image blur. 
Both techniques demonstrated
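The discrete Kalman filtering of measured body states mentioned above can be sketched for a single coordinate under an assumed constant-velocity motion model; the noise covariances below are illustrative placeholders, not the study's tuned values.

```python
import numpy as np

# Minimal discrete Kalman filter for one measured body state (e.g. one
# position coordinate) under an assumed constant-velocity motion model.
dt = 1.0 / 7500.0                      # frame interval at 7500 fps
F = np.array([[1.0, dt], [0.0, 1.0]])  # state transition: position, velocity
H = np.array([[1.0, 0.0]])             # we observe position only
Q = 1e-4 * np.eye(2)                   # process noise covariance (assumed)
R = np.array([[1e-2]])                 # measurement noise covariance (assumed)

def kalman_filter(measurements):
    """Filter a sequence of noisy position measurements; return estimates."""
    x = np.zeros(2)                    # initial state estimate
    P = np.eye(2)                      # initial estimate covariance
    estimates = []
    for z in measurements:
        x = F @ x                      # predict state one frame forward
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + R            # innovation covariance
        K = P @ H.T @ np.linalg.inv(S) # Kalman gain
        x = x + (K @ (np.array([z]) - H @ x)).ravel()
        P = (np.eye(2) - K @ H) @ P
        estimates.append(float(x[0]))
    return estimates
```

With measurements held constant, the estimate converges toward the measured value while the assumed motion model keeps frame-to-frame jitter small.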

  16. Automated extraction of natural drainage density patterns for the conterminous United States through high performance computing

    USGS Publications Warehouse

    Stanislawski, Larry V.; Falgout, Jeff T.; Buttenfield, Barbara P.

    2015-01-01

    Hydrographic networks form an important data foundation for cartographic base mapping and for hydrologic analysis. Drainage density patterns for these networks can be derived to characterize local landscape, bedrock and climate conditions, and further inform hydrologic and geomorphological analysis by indicating areas where too few headwater channels have been extracted. But natural drainage density patterns are not consistently available in existing hydrographic data for the United States because compilation and capture criteria historically varied, along with climate, during the period of data collection over the various terrain types throughout the country. This paper demonstrates an automated workflow that is being tested in a high-performance computing environment by the U.S. Geological Survey (USGS) to map natural drainage density patterns at the 1:24,000-scale (24K) for the conterminous United States. Hydrographic network drainage patterns may be extracted from elevation data to guide corrections for existing hydrographic network data. The paper describes three stages in this workflow including data pre-processing, natural channel extraction, and generation of drainage density patterns from extracted channels. The workflow is concurrently implemented by executing procedures on multiple subbasin watersheds within the U.S. National Hydrography Dataset (NHD). Pre-processing defines parameters that are needed for the extraction process. Extraction proceeds in standard fashion: filling sinks, developing flow direction and weighted flow accumulation rasters. Drainage channels with assigned Strahler stream order are extracted within a subbasin and simplified. Drainage density patterns are then estimated with 100-meter resolution and subsequently smoothed with a low-pass filter. The extraction process is found to be of better quality in higher slope terrains. 
Concurrent processing through the high performance computing environment is shown to facilitate and refine
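The final stage of the workflow, estimating drainage density from extracted channels at 100-meter resolution and smoothing it with a low-pass filter, can be sketched as follows; the 3x3 mean-filter kernel and toy channel grid are illustrative, not the USGS parameters.

```python
import numpy as np

# Sketch of the drainage-density stage: density on a 100 m grid, then a
# low-pass (mean) filter. Kernel size and toy data are illustrative.

def low_pass(grid, size=3):
    """Mean filter with edge padding, a simple low-pass smoother."""
    pad = size // 2
    padded = np.pad(grid, pad, mode="edge")
    out = np.empty_like(grid, dtype=float)
    for i in range(grid.shape[0]):
        for j in range(grid.shape[1]):
            out[i, j] = padded[i:i + size, j:j + size].mean()
    return out

# Drainage density = channel length per unit area. On a 100 m grid each
# channel cell contributes 100 m of channel length over 100 m x 100 m of area.
channels = np.zeros((5, 5))
channels[2, :] = 1.0                       # one horizontal extracted channel
density = channels * 100.0 / (100.0 ** 2)  # meters of channel per square meter
smoothed = low_pass(density)
```

The smoothing spreads each channel cell's contribution over its neighborhood, producing the continuous density surface used for comparison against existing hydrography.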

  17. Automated extraction of DNA from biological stains on fabric from crime cases. A comparison of a manual and three automated methods.

    PubMed

    Stangegaard, Michael; Hjort, Benjamin B; Hansen, Thomas N; Hoflund, Anders; Mogensen, Helle S; Hansen, Anders J; Morling, Niels

    2013-05-01

The presence of PCR inhibitors in extracted DNA may interfere with the subsequent quantification and short tandem repeat (STR) reactions used in forensic genetic DNA typing. DNA extraction from fabric for forensic genetic purposes may be challenging due to the occasional presence of PCR inhibitors that may be co-extracted with the DNA. Using 120 forensic trace evidence samples consisting of various types of fabric, we compared three automated DNA extraction methods based on magnetic beads (PrepFiler Express Forensic DNA Extraction Kit on an AutoMate Express, QIAsymphony DNA Investigator kit either with the sample pre-treatment recommended by Qiagen or an in-house optimized sample pre-treatment on a QIAsymphony SP) and one manual method (Chelex) with the aim of reducing the amount of PCR inhibitors in the DNA extracts and increasing the proportion of reportable STR-profiles. A total of 480 samples were processed. The highest DNA recovery was obtained with the PrepFiler Express kit on an AutoMate Express, while the lowest DNA recovery was obtained using a QIAsymphony SP with the sample pre-treatment recommended by Qiagen. Extraction using a QIAsymphony SP with the sample pre-treatment recommended by Qiagen resulted in the lowest percentage of PCR inhibition (0%), while extraction using manual Chelex resulted in the highest percentage of PCR inhibition (51%). The largest number of reportable STR-profiles was obtained with DNA from samples extracted with the PrepFiler Express kit (75%), while the lowest number was obtained with DNA from samples extracted using a QIAsymphony SP with the sample pre-treatment recommended by Qiagen (41%).

  18. Automated milk fat extraction for the analyses of persistent organic pollutants.

    PubMed

    Archer, Jeffrey C; Jenkins, Roy G

    2017-01-15

We have utilized an automated acid hydrolysis technology, followed by an abbreviated Soxhlet extraction technique, to obtain fat from whole milk for the determination of persistent organic pollutants, namely polychlorinated dibenzo-p-dioxins, polychlorinated dibenzofurans and polychlorinated biphenyls. The process simply involves (1) pouring the liquid milk into the hydrolysis beaker with reagents and standards, (2) drying the obtained fat on a filter paper and (3) obtaining pure fat via the modified Soxhlet extraction using 100 mL of hexane per sample. This technique contrasts with traditional, manually intensive liquid-liquid extractions and avoids the preparatory step of freeze-drying the samples for pressurized liquid extractions. Along with these extraction improvements, analytical results agree closely between the methods, so no quality has been compromised. The native spike (n=12) and internal standard (n=24) precision and accuracy results are within the limits of EPA Methods 1613 and 1668. The median (n=6) Toxic Equivalency Quotient (TEQ) for polychlorinated dibenzo-p-dioxins/polychlorinated dibenzofurans and the concentration of the marker polychlorinated biphenyls show percent differences of 1% and 12%, respectively, compared with 315 milk samples previously analyzed at the same laboratory using liquid-liquid extraction. During our feasibility studies, both egg and fish tissue also showed substantial promise with this technique.

  19. Comparison of manual and automated nucleic acid extraction methods from clinical specimens for microbial diagnosis purposes.

    PubMed

    Wozniak, Aniela; Geoffroy, Enrique; Miranda, Carolina; Castillo, Claudia; Sanhueza, Francia; García, Patricia

    2016-11-01

The choice of nucleic acid (NA) extraction method for molecular diagnosis in microbiology is of major importance because of the low microbial load and the differing natures of microorganisms and clinical specimens. The NA yield of different extraction methods has been studied mostly using spiked samples; information from real human clinical specimens is scarce. The purpose of this study was to compare the performance of a manual low-cost extraction method (Qiagen kit or salting-out extraction method) with the automated high-cost MagNA Pure Compact method. According to cycle threshold values for different pathogens, MagNA Pure is as efficient as Qiagen for NA extraction from noncomplex clinical specimens (nasopharyngeal swab, skin swab, plasma, respiratory specimens). In contrast, according to cycle threshold values for RNAseP, the MagNA Pure method may not be appropriate for NA extraction from blood. We believe that MagNA Pure's versatility, reduced risk of cross-contamination, and reduced hands-on time compensate for its high cost.

  20. Establishing a novel automated magnetic bead-based method for the extraction of DNA from a variety of forensic samples.

    PubMed

    Witt, Sebastian; Neumann, Jan; Zierdt, Holger; Gébel, Gabriella; Röscheisen, Christiane

    2012-09-01

    Automated systems have been increasingly utilized for DNA extraction by many forensic laboratories to handle growing numbers of forensic casework samples while minimizing the risk of human errors and assuring high reproducibility. The step towards automation however is not easy: The automated extraction method has to be very versatile to reliably prepare high yields of pure genomic DNA from a broad variety of sample types on different carrier materials. To prevent possible cross-contamination of samples or the loss of DNA, the components of the kit have to be designed in a way that allows for the automated handling of the samples with no manual intervention necessary. DNA extraction using paramagnetic particles coated with a DNA-binding surface is predestined for an automated approach. For this study, we tested different DNA extraction kits using DNA-binding paramagnetic particles with regard to DNA yield and handling by a Freedom EVO(®)150 extraction robot (Tecan) equipped with a Te-MagS magnetic separator. Among others, the extraction kits tested were the ChargeSwitch(®)Forensic DNA Purification Kit (Invitrogen), the PrepFiler™Automated Forensic DNA Extraction Kit (Applied Biosystems) and NucleoMag™96 Trace (Macherey-Nagel). After an extensive test phase, we established a novel magnetic bead extraction method based upon the NucleoMag™ extraction kit (Macherey-Nagel). The new method is readily automatable and produces high yields of DNA from different sample types (blood, saliva, sperm, contact stains) on various substrates (filter paper, swabs, cigarette butts) with no evidence of a loss of magnetic beads or sample cross-contamination.

  1. Automated solid-phase extraction of herbicides from water for gas chromatographic-mass spectrometric analysis

    USGS Publications Warehouse

    Meyer, M.T.; Mills, M.S.; Thurman, E.M.

    1993-01-01

An automated solid-phase extraction (SPE) method was developed for the pre-concentration of chloroacetanilide and triazine herbicides, and two triazine metabolites, from 100-ml water samples. Breakthrough experiments for the C18 SPE cartridge show that the two triazine metabolites are not fully retained and that increasing flow-rate decreases their retention. Standard curve r² values of 0.998-1.000 for each compound were consistently obtained and a quantitation level of 0.05 µg/L was achieved for each compound tested. More than 10,000 surface and ground water samples have been analyzed by this method.
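The reported standard-curve r² values come from a straightforward least-squares calibration check, which can be sketched as follows; the spike levels and detector responses below are invented illustration data, not the study's measurements.

```python
# Least-squares linear standard curve and its r². Calibration data below are
# invented for illustration only.
conc = [0.05, 0.1, 0.5, 1.0, 2.0]     # herbicide spike levels, ug/L
resp = [1.1, 2.0, 10.1, 19.9, 40.2]   # detector peak areas (arbitrary units)

n = len(conc)
mean_x = sum(conc) / n
mean_y = sum(resp) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(conc, resp))
         / sum((x - mean_x) ** 2 for x in conc))
intercept = mean_y - slope * mean_x
ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(conc, resp))
ss_tot = sum((y - mean_y) ** 2 for y in resp)
r_squared = 1 - ss_res / ss_tot       # near 1.0 for a well-behaved curve
```

An r² close to 1 over the working range, together with acceptable recovery at the lowest spike level, underpins the quoted quantitation limit.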

  2. Automated CO2 extraction from air for clumped isotope analysis in the atmo- and biosphere

    NASA Astrophysics Data System (ADS)

    Hofmann, Magdalena; Ziegler, Martin; Pons, Thijs; Lourens, Lucas; Röckmann, Thomas

    2015-04-01

The conventional stable isotope ratios 13C/12C and 18O/16O in atmospheric CO2 are a powerful tool for unraveling the global carbon cycle. In recent years, it has been suggested that the abundance of the very rare isotopologue 13C18O16O on m/z 47 might be a promising tracer to complement conventional stable isotope analysis of atmospheric CO2 [Affek and Eiler, 2006; Affek et al. 2007; Eiler and Schauble, 2004; Yeung et al., 2009]. Here we present an automated analytical system that is designed for clumped isotope analysis of atmo- and biospheric CO2. The carbon dioxide gas is quantitatively extracted from about 1.5 L of air (ATP). The automated stainless steel extraction and purification line consists of three main components: (i) a drying unit (a magnesium perchlorate unit and a cryogenic water trap), (ii) two CO2 traps cooled with liquid nitrogen [Werner et al., 2001] and (iii) a GC column packed with Porapak Q that can be cooled with liquid nitrogen to -30°C during purification and heated up to 230°C in-between two extraction runs. After CO2 extraction and purification, the CO2 is automatically transferred to the mass spectrometer. Mass spectrometric analysis of the 13C18O16O abundance is carried out in dual inlet mode on a MAT 253 mass spectrometer. Each analysis generally consists of 80 change-over-cycles. Three additional Faraday cups were added to the mass spectrometer for simultaneous analysis of the mass-to-charge ratios 44, 45, 46, 47, 48 and 49. The reproducibility for δ13C, δ18O and Δ47 for repeated CO2 extractions from air is in the range of 0.11‰ (SD), 0.18‰ (SD) and 0.02‰ (SD), respectively. This automated CO2 extraction and purification system will be used to analyse the clumped isotopic signature in atmospheric CO2 (tall tower, Cabauw, Netherlands) and to study the clumped isotopic fractionation during photosynthesis (leaf chamber experiments) and soil respiration. References Affek, H. P., Xu, X. & Eiler, J. M., Geochim. Cosmochim. Acta 71, 5033

  3. Extraction of gravitational waves in numerical relativity.

    PubMed

    Bishop, Nigel T; Rezzolla, Luciano

    2016-01-01

    A numerical-relativity calculation yields in general a solution of the Einstein equations including also a radiative part, which is in practice computed in a region of finite extent. Since gravitational radiation is properly defined only at null infinity and in an appropriate coordinate system, the accurate estimation of the emitted gravitational waves represents an old and non-trivial problem in numerical relativity. A number of methods have been developed over the years to "extract" the radiative part of the solution from a numerical simulation and these include: quadrupole formulas, gauge-invariant metric perturbations, Weyl scalars, and characteristic extraction. We review and discuss each method, in terms of both its theoretical background as well as its implementation. Finally, we provide a brief comparison of the various methods in terms of their inherent advantages and disadvantages.

  4. Extraction of gravitational waves in numerical relativity

    NASA Astrophysics Data System (ADS)

    Bishop, Nigel T.; Rezzolla, Luciano

    2016-12-01

    A numerical-relativity calculation yields in general a solution of the Einstein equations including also a radiative part, which is in practice computed in a region of finite extent. Since gravitational radiation is properly defined only at null infinity and in an appropriate coordinate system, the accurate estimation of the emitted gravitational waves represents an old and non-trivial problem in numerical relativity. A number of methods have been developed over the years to "extract" the radiative part of the solution from a numerical simulation and these include: quadrupole formulas, gauge-invariant metric perturbations, Weyl scalars, and characteristic extraction. We review and discuss each method, in terms of both its theoretical background as well as its implementation. Finally, we provide a brief comparison of the various methods in terms of their inherent advantages and disadvantages.

  5. Automation of lidar-based hydrologic feature extraction workflows using GIS

    NASA Astrophysics Data System (ADS)

    Borlongan, Noel Jerome B.; de la Cruz, Roel M.; Olfindo, Nestor T.; Perez, Anjillyn Mae C.

    2016-10-01

With the advent of LiDAR technology, higher resolution datasets have become available for use in different remote sensing and GIS applications. One significant application of LiDAR datasets in the Philippines is resource feature extraction. Feature extraction using LiDAR datasets requires complex and repetitive workflows that can take researchers a great deal of time to execute and supervise manually. The Development of the Philippine Hydrologic Dataset for Watersheds from LiDAR Surveys (PHD), a project under the Nationwide Detailed Resources Assessment Using LiDAR (Phil-LiDAR 2) program, created a set of scripts, the PHD Toolkit, to automate its processes and workflows necessary for hydrologic feature extraction, specifically Streams and Drainages, Irrigation Network, and Inland Wetlands, using LiDAR datasets. These scripts are written in Python and can be added to the ArcGIS® environment as a toolbox. The toolkit is currently used as an aid for researchers in hydrologic feature extraction, simplifying the workflows, eliminating human error in providing inputs, and offering quick and easy-to-use tools for repetitive tasks. This paper discusses the actual implementation of the different workflows developed by Phil-LiDAR 2 Project 4 for Streams, Irrigation Network and Inland Wetlands extraction.
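The automation pattern such a toolkit embodies, chaining repetitive geoprocessing steps into one ordered pipeline so a whole dataset can run without manual supervision, can be sketched as below. The step names are hypothetical stand-ins; in the PHD Toolkit each would be an arcpy geoprocessing call.

```python
# Sketch of a scripted geoprocessing pipeline. Each "step" is a hypothetical
# stand-in for a real geoprocessing tool invocation.

def fill_sinks(dataset):
    """Stand-in for a DEM sink-filling step."""
    return dataset + "_filled"

def flow_direction(dataset):
    """Stand-in for a flow-direction raster step."""
    return dataset + "_fdir"

def extract_streams(dataset):
    """Stand-in for a stream-extraction step."""
    return dataset + "_streams"

PIPELINE = [fill_sinks, flow_direction, extract_streams]

def run_workflow(dataset, steps=PIPELINE):
    """Run each step on the previous step's output, as a toolbox script would."""
    result = dataset
    for step in steps:
        result = step(result)
    return result
```

Encoding the workflow as data (an ordered list of steps) is what removes the manual execution and supervision the abstract describes, and makes each run reproducible.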

  6. CHANNEL MORPHOLOGY TOOL (CMT): A GIS-BASED AUTOMATED EXTRACTION MODEL FOR CHANNEL GEOMETRY

    SciTech Connect

    JUDI, DAVID; KALYANAPU, ALFRED; MCPHERSON, TIMOTHY; BERSCHEID, ALAN

    2007-01-17

This paper describes an automated Channel Morphology Tool (CMT) developed in the ArcGIS 9.1 environment. The CMT creates cross-sections along a stream centerline and uses a digital elevation model (DEM) to create station points with elevations along each of the cross-sections. The generated cross-sections may then be exported into a hydraulic model. Along with rapid cross-section generation, the CMT also eliminates any cross-section overlaps that might occur due to the sinuosity of the channels, using the Cross-section Overlap Correction Algorithm (COCoA). The CMT was tested by extracting cross-sections from a 5-m DEM for a 50-km channel length in Houston, Texas. The extracted cross-sections were compared directly with surveyed cross-sections in terms of cross-section area. Results indicated that the CMT-generated cross-sections satisfactorily matched the surveyed data.
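The core CMT operation, generating evenly spaced station points along a cross-section line and reading their elevations from the DEM, might look like the following sketch; the nearest-cell sampling and toy V-shaped valley DEM are simplifications of the ArcGIS implementation.

```python
import numpy as np

# Sample DEM elevations at station points along a cross-section line.
# Nearest-cell lookup and the toy DEM are simplifications for illustration.

def station_elevations(dem, start, end, n_stations, cell_size=5.0):
    """Sample DEM elevations at n_stations points from start to end (x, y)."""
    xs = np.linspace(start[0], end[0], n_stations)
    ys = np.linspace(start[1], end[1], n_stations)
    rows = np.clip((ys / cell_size).astype(int), 0, dem.shape[0] - 1)
    cols = np.clip((xs / cell_size).astype(int), 0, dem.shape[1] - 1)
    stations = np.hypot(xs - start[0], ys - start[1])  # distance along section
    return stations, dem[rows, cols]

# Toy 5-m DEM of a V-shaped valley running north-south, channel at column 5.
dem = np.tile(np.abs(np.arange(10) - 5.0)[None, :], (10, 1))
stations, elev = station_elevations(dem, (0.0, 25.0), (45.0, 25.0), 10)
```

The resulting station/elevation pairs are exactly what a hydraulic model expects for a cross-section, which is why the export step in the CMT is straightforward.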

  7. Strategies for Medical Data Extraction and Presentation Part 3: Automated Context- and User-Specific Data Extraction.

    PubMed

    Reiner, Bruce

    2015-08-01

    In current medical practice, data extraction is limited by a number of factors including lack of information system integration, manual workflow, excessive workloads, and lack of standardized databases. The combined limitations result in clinically important data often being overlooked, which can adversely affect clinical outcomes through the introduction of medical error, diminished diagnostic confidence, excessive utilization of medical services, and delays in diagnosis and treatment planning. Current technology development is largely inflexible and static in nature, which adversely affects functionality and usage among the diverse and heterogeneous population of end users. In order to address existing limitations in medical data extraction, alternative technology development strategies need to be considered which incorporate the creation of end user profile groups (to account for occupational differences among end users), customization options (accounting for individual end user needs and preferences), and context specificity of data (taking into account both the task being performed and data subject matter). Creation of the proposed context- and user-specific data extraction and presentation templates offers a number of theoretical benefits including automation and improved workflow, completeness in data search, ability to track and verify data sources, creation of computerized decision support and learning tools, and establishment of data-driven best practice guidelines.

  8. Automated solid-phase extraction approaches for large scale biomonitoring studies.

    PubMed

    Kuklenyik, Zsuzsanna; Ye, Xiaoyun; Needham, Larry L; Calafat, Antonia M

    2009-01-01

The main value in measuring environmental chemicals in biological specimens (i.e., biomonitoring) is the ability to minimize risk assessment uncertainties. The collection of biomonitoring data for risk assessment requires the analysis of a statistically significant number of samples from subjects with a significant prevalence of detectable internal dose levels. This paper addresses the practical laboratory challenges that arise from these statistical requirements: development of high-throughput techniques that can handle, with high accuracy and precision, a large number of samples and can perform trace-level analysis of multiple and diverse environmental chemicals (i.e., analytes). We review here examples of high-throughput, automated solid-phase extraction methods developed in our laboratory for biomonitoring of analytes with representative hydrophobic properties and for typical biomonitoring matrices. We discuss key aspects of sample preparation, column, and solvent selection for off- and online extractions, and the nuts and bolts of online column-switching systems necessary for developing rugged, automated methods with minimal sample handling.

  9. Semi-automated solid-phase extraction method for studying the biodegradation of ochratoxin A by human intestinal microbiota.

    PubMed

    Camel, Valérie; Ouethrani, Minale; Coudray, Cindy; Philippe, Catherine; Rabot, Sylvie

    2012-04-15

A simple and rapid semi-automated solid-phase extraction (SPE) method has been developed for the analysis of ochratoxin A in aqueous matrices related to biodegradation experiments (namely digestive contents and faecal excreta), with a view to using this method to follow OTA biodegradation by human intestinal microbiota. The influence of extraction parameters that could affect semi-automated SPE efficiency was studied, using C18-silica as the sorbent and water as the simplest matrix, before application to the matrices of interest. The conditions finally retained were as follows: 5-mL aqueous samples (pH 3) containing an organic modifier (20% ACN) were applied to 100-mg cartridges. After drying (9 mL of air), the cartridge was rinsed with 5 mL H(2)O/ACN (80:20, v/v), before eluting the compounds with 3 × 1 mL of MeOH/THF (10:90, v/v). Acceptable recoveries and limits of quantification could be obtained considering the complexity of the investigated matrices and the low volumes sampled; the method was also suitable for the analysis of ochratoxin B in faecal extracts. Applicability of the method is illustrated by preliminary results of ochratoxin A biodegradation studies by human intestinal microbiota under simple in vitro conditions. Interestingly, partial degradation of ochratoxin A was observed, with efficiencies ranging from 14% to 47% after 72 h of incubation. In addition, three phase I metabolites could be identified using high resolution mass spectrometry, namely ochratoxin α, open ochratoxin A and ochratoxin B.

  10. ChemDataExtractor: A Toolkit for Automated Extraction of Chemical Information from the Scientific Literature.

    PubMed

    Swain, Matthew C; Cole, Jacqueline M

    2016-10-24

    The emergence of "big data" initiatives has led to the need for tools that can automatically extract valuable chemical information from large volumes of unstructured data, such as the scientific literature. Since chemical information can be present in figures, tables, and textual paragraphs, successful information extraction often depends on the ability to interpret all of these domains simultaneously. We present a complete toolkit for the automated extraction of chemical entities and their associated properties, measurements, and relationships from scientific documents that can be used to populate structured chemical databases. Our system provides an extensible, chemistry-aware, natural language processing pipeline for tokenization, part-of-speech tagging, named entity recognition, and phrase parsing. Within this scope, we report improved performance for chemical named entity recognition through the use of unsupervised word clustering based on a massive corpus of chemistry articles. For phrase parsing and information extraction, we present the novel use of multiple rule-based grammars that are tailored for interpreting specific document domains such as textual paragraphs, captions, and tables. We also describe document-level processing to resolve data interdependencies and show that this is particularly necessary for the autogeneration of chemical databases since captions and tables commonly contain chemical identifiers and references that are defined elsewhere in the text. The performance of the toolkit to correctly extract various types of data was evaluated, affording an F-score of 93.4%, 86.8%, and 91.5% for extracting chemical identifiers, spectroscopic attributes, and chemical property attributes, respectively; set against the CHEMDNER chemical name extraction challenge, ChemDataExtractor yields a competitive F-score of 87.8%. All tools have been released under the MIT license and are available to download from http://www.chemdataextractor.org .
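The idea of a small rule-based grammar tailored to textual paragraphs can be illustrated in miniature; the single regex "grammar" below, linking a compound name to a melting-point measurement, is a toy stand-in and far simpler than ChemDataExtractor's actual parsers, and the record format is hypothetical.

```python
import re

# Toy rule-based extraction: one pattern linking a compound name to a
# melting-point value and unit. Illustrative only.
MP_RE = re.compile(
    r"(?P<compound>[A-Z][\w\-]*(?:\s[\w\-]+)?)\s+melts\s+at\s+"
    r"(?P<value>\d+(?:\.\d+)?)\s*(?P<units>°C|K)"
)

def extract_melting_points(text):
    """Return structured records for every melting-point statement found."""
    return [
        {
            "names": [m["compound"]],
            "melting_point": {"value": float(m["value"]), "units": m["units"]},
        }
        for m in MP_RE.finditer(text)
    ]

records = extract_melting_points("Benzoic acid melts at 122.4 °C in air.")
```

A production system layers many such grammars per document domain (paragraphs, captions, tables) and then resolves the partial records against identifiers defined elsewhere in the text, as the abstract describes.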

  11. Automated sample preparation by pressurized liquid extraction-solid-phase extraction for the liquid chromatographic-mass spectrometric investigation of polyphenols in the brewing process.

    PubMed

    Papagiannopoulos, Menelaos; Mellenthin, Annett

    2002-11-08

The analysis of polyphenols from solid plant or food samples usually requires laborious sample preparation. Liquid extraction of these compounds from the sample is compromised by apolar matrix interferences, an excess of which has to be eliminated prior to subsequent purification and separation. When pressurized liquid extraction is applied to polyphenols from hops, the sequential use of different solvents can partly overcome these problems. An initial extraction with pentane eliminates hydrophobic compounds such as hop resins and oils and enables straightforward automated on-line solid-phase extraction as part of an optimized LC-MS analysis.

  12. Automated Device for Asynchronous Extraction of RNA, DNA, or Protein Biomarkers from Surrogate Patient Samples.

    PubMed

    Bitting, Anna L; Bordelon, Hali; Baglia, Mark L; Davis, Keersten M; Creecy, Amy E; Short, Philip A; Albert, Laura E; Karhade, Aditya V; Wright, David W; Haselton, Frederick R; Adams, Nicholas M

    2016-12-01

Many biomarker-based diagnostic methods are inhibited by nontarget molecules in patient samples, necessitating biomarker extraction before detection. We have developed a simple device that purifies RNA, DNA, or protein biomarkers from complex biological samples without robotics or fluid pumping. The device design is based on functionalized magnetic beads, which capture biomarkers and remove background biomolecules by magnetically transferring the beads through processing solutions arrayed within small-diameter tubing. The process was automated by wrapping the tubing around a disc-like cassette and rotating it past a magnet using a programmable motor. This device recovered biomarkers at ~80% of the yield of the previously published operator-dependent extraction method. The device was validated by extracting biomarkers from a panel of surrogate patient samples containing clinically relevant concentrations of (1) influenza A RNA in nasal swabs, (2) Escherichia coli DNA in urine, (3) Mycobacterium tuberculosis DNA in sputum, and (4) Plasmodium falciparum protein and DNA in blood. The device successfully extracted each biomarker type from samples representing low levels of clinically relevant infectivity (i.e., 7.3 copies/µL of influenza A RNA, 405 copies/µL of E. coli DNA, 0.22 copies/µL of TB DNA, 167 copies/µL of malaria parasite DNA, and 2.7 pM of malaria parasite protein).

  13. Automated Detection and Extraction of Coronal Dimmings from SDO/AIA Data

    NASA Astrophysics Data System (ADS)

    Davey, Alisdair R.; Attrill, G. D. R.; Wills-Davey, M. J.

    2010-05-01

    The sheer volume of data anticipated from the Solar Dynamics Observatory/Atmospheric Imaging Assembly (SDO/AIA) highlights the necessity of developing automatic detection methods for various types of solar activity. Coronal dimmings, first recognised in the 1970s, are now well established as being closely associated with coronal mass ejections (CMEs), and are particularly valued as an indicator of front-side (halo) CMEs, which can be difficult to detect in white-light coronagraph data. An automated coronal dimming region detection and extraction algorithm removes visual observer bias from the determination of physical quantities such as spatial location, area and volume, allowing reproducible, quantifiable results to be mined from very large datasets. The information derived may facilitate more reliable early space weather detection, as well as offering the potential for conducting large-sample studies focused on determining the geoeffectiveness of CMEs, coupled with analysis of their associated coronal dimmings. We present examples of dimming events extracted using our algorithm from existing EUV data, demonstrating the potential for the anticipated application to SDO/AIA data. Metadata returned by our algorithm include the location, area, volume, mass and dynamics of coronal dimmings. As well as running on historic datasets, this algorithm is capable of detecting and extracting coronal dimmings in near real-time. The coronal dimming detection and extraction algorithm described in this poster is part of the SDO/Computer Vision Center effort hosted at SAO (Martens et al., 2009). We acknowledge NASA grant NNH07AB97C.
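
    The basic thresholding idea behind dimming detection can be sketched in a few lines. This is an illustrative base-difference criterion, not the algorithm described in the poster; the function names, the drop_frac value, and the toy image are assumptions:

```python
import numpy as np

def dimming_mask(base, frame, drop_frac=0.5):
    """Flag pixels whose EUV intensity has fallen below a fraction of
    the pre-event base image as candidate dimming pixels."""
    return frame < drop_frac * base

def dimming_area(mask, pixel_area_km2=1.0):
    """Total dimming area = number of flagged pixels x pixel footprint."""
    return float(mask.sum()) * pixel_area_km2

# toy example: a 4x4 base image with one 2x2 patch dimmed to 30% intensity
base = np.full((4, 4), 100.0)
frame = base.copy()
frame[1:3, 1:3] = 30.0
mask = dimming_mask(base, frame)
area = dimming_area(mask, pixel_area_km2=2.0)
```

    A production pipeline would additionally track the mask over time to derive the dynamics, and integrate intensity over the mask to estimate mass.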

  14. Automated hand thermal image segmentation and feature extraction in the evaluation of rheumatoid arthritis.

    PubMed

    Snekhalatha, U; Anburajan, M; Sowmiya, V; Venkatraman, B; Menaka, M

    2015-04-01

    The aim of the study was (1) to perform an automated segmentation of hot-spot regions of the hand from the thermogram using the k-means algorithm and (2) to test the potential of features extracted from the hand thermogram and its measured skin temperature indices in the evaluation of rheumatoid arthritis. Skin temperature measurements, the heat distribution index and the thermographic index were analyzed in rheumatoid arthritis patients and controls. The k-means algorithm was used for image segmentation, and features were extracted from the segmented output image using the gray-level co-occurrence matrix method. In the metacarpo-phalangeal, proximal inter-phalangeal and distal inter-phalangeal regions, the percentage difference in mean skin temperature was higher in rheumatoid arthritis patients (5.3%, 4.9% and 4.8% in the MCP3, PIP3 and DIP3 joints, respectively) than in the normal group. The k-means algorithm applied to the thermal images provided good segmentation results for evaluating the disease. In the total population studied, the mean average skin temperature of the MCP3 joint was highly correlated with most of the extracted features of the hand, and the statistical feature parameters correlated significantly with skin surface temperature measurements and the measured temperature indices. Hence, the developed MATLAB-based computer-aided diagnostic tool could be used as a reliable method for diagnosing and analyzing arthritis in hand thermal images.
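
    A minimal intensity-only k-means of the kind used to isolate hot-spot pixels can be sketched in pure NumPy. The toy temperatures and the quantile initialization are invented for illustration; the study itself works on full thermograms in MATLAB:

```python
import numpy as np

def kmeans_1d(values, k=2, iters=20):
    """Tiny k-means on scalar intensities: cluster pixel temperatures so
    that the hottest cluster becomes the hot-spot mask."""
    # deterministic init: spread centers across the value range
    centers = np.quantile(values, np.linspace(0, 1, k))
    for _ in range(iters):
        # assign each value to its nearest center, then re-estimate centers
        labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = values[labels == j].mean()
    return labels, centers

# toy thermogram: background ~30 C, three hot-spot pixels ~36 C
img = np.array([[30.1, 30.0, 36.2],
                [29.9, 36.0, 36.1],
                [30.2, 30.0, 29.8]])
labels, centers = kmeans_1d(img.ravel(), k=2)
hot_cluster = int(np.argmax(centers))
hot_mask = (labels == hot_cluster).reshape(img.shape)
```

    The resulting hot_mask would then feed the gray-level co-occurrence feature extraction step.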

  15. An automated system for liquid-liquid extraction in monosegmented flow analysis

    PubMed Central

    Facchin, Ileana; Pasquini, Celio

    1997-01-01

    An automated system to perform liquid-liquid extraction in monosegmented flow analysis is described. The system is controlled by a microcomputer that can track the localization of the aqueous monosegmented sample in the manifold. Optical switches are employed to sense the gas-liquid interface of the air bubbles that define the monosegment. The logical level changes generated by the switches are flagged by the computer through a home-made interface that also contains the analogue-to-digital converter for signal acquisition. The sequence of operations necessary for a single extraction, or for concentration of the analyte in the organic phase, is triggered by these logical transitions. The system was evaluated for extraction of Cd(II), Cu(II) and Zn(II) and concentration of Cd(II) from aqueous solutions at pH 9.9 (NH3/NH4Cl buffer) into chloroform containing PAN (1-(2-pyridylazo)-2-naphthol). The results show a mean repeatability of 3% (RSD) for a 2.0 mg l-1 Cd(II) solution and a linear increase of the concentration factor for a 0.5 mg l-1 Cd(II) solution observed for up to nine extraction cycles. PMID:18924792

  16. Semi-automated extraction of landslides in Taiwan based on SPOT imagery and DEMs

    NASA Astrophysics Data System (ADS)

    Eisank, Clemens; Hölbling, Daniel; Friedl, Barbara; Chen, Yi-Chin; Chang, Kang-Tsung

    2014-05-01

    The vast availability and improved quality of optical satellite data and digital elevation models (DEMs), as well as the need for complete and up-to-date landslide inventories at various spatial scales, have fostered the development of semi-automated landslide recognition systems. Among the approaches tested for designing such systems, object-based image analysis (OBIA) has stood out as a highly promising methodology. OBIA offers a flexible, spatially enabled framework for effective landslide mapping. Most object-based landslide mapping systems, however, have been tailored to specific, mainly small-scale study areas or even to single landslides only. Even though reported mapping accuracies tend to be higher than for pixel-based approaches, accuracy values are still relatively low and depend on the particular study. There is thus still room to improve the applicability and objectivity of object-based landslide mapping systems. The presented study aims at developing a knowledge-based landslide mapping system implemented in an OBIA environment, i.e. Trimble eCognition. In contrast to previous knowledge-based approaches, the classification of segmentation-derived multi-scale image objects relies on digital landslide signatures. These signatures hold the common operational knowledge on digital landslide mapping, as reported by 25 Taiwanese landslide experts during personal semi-structured interviews. Specifically, the signatures include information on commonly used data layers, spectral and spatial features, and feature thresholds. The signatures guide the selection and implementation of mapping rules that were finally encoded in Cognition Network Language (CNL). Multi-scale image segmentation is optimized by using the improved Estimation of Scale Parameter (ESP) tool. The approach is developed and tested for mapping landslides in a sub-region of the Baichi catchment in Northern Taiwan based on SPOT imagery and a high-resolution DEM.

  17. Automated endmember extraction for subpixel classification of multispectral and hyperspectral data

    NASA Astrophysics Data System (ADS)

    Shrivastava, Deepali; Kumar, Vinay; Sharma, Richa U.

    2016-04-01

    Most multispectral sensors acquire data in several broad wavelength bands and are capable of distinguishing different land-cover features, while hyperspectral sensors provide ample spectral data in narrow bandwidths (10-20 nm). The spectrally rich data enable the extraction of useful quantitative information from earth surface features. Endmembers are the pure spectral components extracted from remote sensing datasets. Most approaches for endmember extraction (EME) are manual and have been designed from a spectroscopic viewpoint, thus neglecting the spatial arrangement of the pixels. Therefore, EME techniques that consider both spectral and spatial aspects are required to find more accurate endmembers for subpixel classification. Multispectral (EO-1 ALI and Landsat 8 OLI) and hyperspectral (EO-1 Hyperion) datasets of the Udaipur region, Rajasthan, are used in this study. All the datasets are preprocessed and converted to surface reflectance using Fast Line-of-sight Atmospheric Analysis of Spectral Hypercube (FLAASH). Automated endmember extraction and subpixel classification are then carried out using Multiple Endmember Spectral Mixture Analysis (MESMA). Endmembers are selected from spectral libraries to be given as input to MESMA. To optimize these spectral libraries, three endmember-selection techniques are deployed: Count-based Endmember Selection (CoB), Endmember Average RMSE (EAR) and Minimum Average Spectral Angle (MASA). The identified endmembers are then used to classify the multispectral and hyperspectral data using MESMA and the Spectral Angle Mapper (SAM). The classified results show that diverse features spread over a pixel that are spectrally similar are well classified by MESMA, whereas SAM was unable to separate them.
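
    The SAM comparison used above reduces to computing the angle between each pixel spectrum and each endmember spectrum. A sketch with hypothetical 4-band endmembers (the library values are invented; MESMA itself goes further by testing multiple endmember combinations per pixel):

```python
import numpy as np

def spectral_angle(pixel, endmember):
    """Spectral Angle Mapper distance: angle (radians) between a pixel
    spectrum and an endmember spectrum; smaller angle = better match."""
    cos = np.dot(pixel, endmember) / (np.linalg.norm(pixel) * np.linalg.norm(endmember))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

def classify_sam(pixel, library):
    """Assign the pixel to the library endmember with the smallest angle."""
    angles = {name: spectral_angle(pixel, em) for name, em in library.items()}
    return min(angles, key=angles.get)

# hypothetical 4-band reflectance endmembers
library = {"vegetation": np.array([0.05, 0.08, 0.45, 0.50]),
           "soil":       np.array([0.20, 0.25, 0.30, 0.35])}
pixel = np.array([0.06, 0.09, 0.40, 0.48])   # vegetation-like spectrum
label = classify_sam(pixel, library)
```

    Because the angle is insensitive to overall brightness, SAM tolerates illumination scaling but, as the abstract notes, cannot separate spectrally similar materials mixed within one pixel.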

  18. Investigation of automated feature extraction techniques for applications in cancer detection from multispectral histopathology images

    NASA Astrophysics Data System (ADS)

    Harvey, Neal R.; Levenson, Richard M.; Rimm, David L.

    2003-05-01

    Recent developments in imaging technology mean that it is now possible to obtain high-resolution histological image data at multiple wavelengths. This allows pathologists to image specimens over a full spectrum, thereby revealing (often subtle) distinctions between different types of tissue. With this type of data, the spectral content of the specimens, combined with quantitative spatial feature characterization may make it possible not only to identify the presence of an abnormality, but also to classify it accurately. However, such are the quantities and complexities of these data, that without new automated techniques to assist in the data analysis, the information contained in the data will remain inaccessible to those who need it. We investigate the application of a recently developed system for the automated analysis of multi-/hyper-spectral satellite image data to the problem of cancer detection from multispectral histopathology image data. The system provides a means for a human expert to provide training data simply by highlighting regions in an image using a computer mouse. Application of these feature extraction techniques to examples of both training and out-of-training-sample data demonstrate that these, as yet unoptimized, techniques already show promise in the discrimination between benign and malignant cells from a variety of samples.

  19. FBI DRUGFIRE program: the development and deployment of an automated firearms identification system to support serial, gang, and drug-related shooting investigations

    NASA Astrophysics Data System (ADS)

    Sibert, Robert W.

    1994-03-01

    The FBI DRUGFIRE Program entails the continuing phased development and deployment of a scalable automated firearms identification system. The first phase of this system, a networked, database-driven firearms evidence imaging system, has been operational for approximately one year and has demonstrated its effectiveness in facilitating the sharing and linking of firearms evidence collected in serial, gang, and drug-related shooting investigations. However, there is a pressing need for enhancements that will more fully automate the system so that it is capable of processing very large volumes of firearms evidence. These enhancements would provide automated image analysis and pattern matching functionality. Existing "spin-off" technologies need to be integrated into the present DRUGFIRE system to automate the 3-D mensuration, registration, feature extraction, and matching of the microtopographical surface features imprinted on the primers of fired casings during firing.

  20. Automated extraction of the cortical sulci based on a supervised learning approach.

    PubMed

    Tu, Zhuowen; Zheng, Songfeng; Yuille, Alan L; Reiss, Allan L; Dutton, Rebecca A; Lee, Agatha D; Galaburda, Albert M; Dinov, Ivo; Thompson, Paul M; Toga, Arthur W

    2007-04-01

    It is important to detect and extract the major cortical sulci from brain images, but manually annotating these sulci is a time-consuming task and requires the labeler to follow complex protocols. This paper proposes a learning-based algorithm for automated extraction of the major cortical sulci from magnetic resonance imaging (MRI) volumes and cortical surfaces. Unlike alternative methods for detecting the major cortical sulci, which use a small number of predefined rules based on properties of the cortical surface such as the mean curvature, our approach learns a discriminative model using the probabilistic boosting tree algorithm (PBT). PBT is a supervised learning approach which selects and combines hundreds of features at different scales, such as curvatures, gradients and shape index. Our method can be applied to either MRI volumes or cortical surfaces. It first outputs a probability map which indicates how likely each voxel lies on a major sulcal curve. Next, it applies dynamic programming to extract the best curve based on the probability map and a shape prior. The algorithm has almost no parameters to tune for extracting different major sulci. It is very fast (it runs in under 1 min per sulcus including the time to compute the discriminative models) due to efficient implementation of the features (e.g., using the integral volume to rapidly compute the responses of 3-D Haar filters). Because the algorithm can be applied to MRI volumes directly, there is no need to perform preprocessing such as tissue segmentation or mapping to a canonical space. The learning aspect of our approach makes the system very flexible and general. For illustration, we use volumes of the right hemisphere with several major cortical sulci manually labeled. The algorithm is tested on two groups of data, including some brains from patients with Williams Syndrome, and the results are very encouraging.
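
    The dynamic-programming step, extracting the best curve from a probability map, can be illustrated in 2-D. The paper works on 3-D volumes with a learned shape prior; this sketch (invented function names, a toy 4x4 map, and only a crude +/-1-row smoothness constraint) shows the core recursion:

```python
import numpy as np

def best_curve(prob, eps=1e-9):
    """Dynamic programming: the minimum-cost left-to-right path through
    a 2-D probability map, where cost = -log(probability) and the row
    may shift by at most one per column (a simple smoothness prior)."""
    cost = -np.log(prob + eps)          # low cost where probability is high
    H, W = cost.shape
    acc = cost.copy()                   # accumulated cost table
    back = np.zeros((H, W), dtype=int)  # backpointers for traceback
    for x in range(1, W):
        for y in range(H):
            lo, hi = max(0, y - 1), min(H, y + 2)
            prev = acc[lo:hi, x - 1]
            j = int(np.argmin(prev))
            acc[y, x] += prev[j]
            back[y, x] = lo + j
    # trace back from the cheapest endpoint in the last column
    y = int(np.argmin(acc[:, -1]))
    path = [y]
    for x in range(W - 1, 0, -1):
        y = int(back[y, x])
        path.append(y)
    return path[::-1]

# toy map: a high-probability diagonal ridge on a low-probability background
prob = np.full((4, 4), 0.05)
for i in range(4):
    prob[i, i] = 0.9
curve = best_curve(prob)
```

    In the paper, the per-voxel probabilities come from the PBT classifier and the path cost also includes a shape-prior term.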

  1. Validation of the Total Visual Acuity Extraction Algorithm (TOVA) for Automated Extraction of Visual Acuity Data From Free Text, Unstructured Clinical Records

    PubMed Central

    Baughman, Douglas M.; Su, Grace L.; Tsui, Irena; Lee, Cecilia S.; Lee, Aaron Y.

    2017-01-01

    Purpose With increasing volumes of electronic health record data, algorithm-driven extraction may aid manual extraction. Visual acuity often is extracted manually in vision research. The total visual acuity extraction algorithm (TOVA) is presented and validated for automated extraction of visual acuity from free text, unstructured clinical notes. Methods Consecutive inpatient ophthalmology notes over an 8-year period from the University of Washington healthcare system in Seattle, WA were used for validation of TOVA. The total visual acuity extraction algorithm applied natural language processing to recognize Snellen visual acuity in free text notes and assign laterality. The best corrected measurement was determined for each eye and converted to logMAR. The algorithm was validated against manual extraction of a subset of notes. Results A total of 6266 clinical records were obtained giving 12,452 data points. In a subset of 644 validated notes, comparison of manually extracted data versus TOVA output showed 95% concordance. Interrater reliability testing gave κ statistics of 0.94 (95% confidence interval [CI], 0.89–0.99), 0.96 (95% CI, 0.94–0.98), 0.95 (95% CI, 0.92–0.98), and 0.94 (95% CI, 0.90–0.98) for acuity numerators, denominators, adjustments, and signs, respectively. Pearson correlation coefficient was 0.983. Linear regression showed an R2 of 0.966 (P < 0.0001). Conclusions The total visual acuity extraction algorithm is a novel tool for extraction of visual acuity from free text, unstructured clinical notes and provides an open source method of data extraction. Translational Relevance Automated visual acuity extraction through natural language processing can be a valuable tool for data extraction from free text ophthalmology notes. PMID:28299240
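
    The core Snellen-to-logMAR conversion that such an algorithm automates can be sketched as below. The regex, function name, and rounding are illustrative assumptions, not the published TOVA implementation (which also assigns laterality and handles corrections and adjustments):

```python
import math
import re

# naive pattern for a Snellen fraction such as "20/40"
SNELLEN = re.compile(r'(\d+)\s*/\s*(\d+)')

def snellen_to_logmar(text):
    """Pull the first Snellen fraction out of free text and convert it:
    logMAR = log10(denominator / numerator), so 20/20 -> 0.0."""
    m = SNELLEN.search(text)
    if m is None:
        return None
    num, den = int(m.group(1)), int(m.group(2))
    return round(math.log10(den / num), 3)
```

    A real extractor needs to distinguish acuity fractions from dates and other slash-separated numbers, which is where the natural language processing in the abstract comes in.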

  2. Automated data extraction from in situ protein-stable isotope probing studies.

    PubMed

    Slysz, Gordon W; Steinke, Laurey; Ward, David M; Klatt, Christian G; Clauss, Therese R W; Purvine, Samuel O; Payne, Samuel H; Anderson, Gordon A; Smith, Richard D; Lipton, Mary S

    2014-03-07

    Protein-stable isotope probing (protein-SIP) has strong potential for revealing key metabolizing taxa in complex microbial communities. While most protein-SIP work to date has been performed under controlled laboratory conditions to allow extensive isotope labeling of the target organism(s), a key application will be in situ studies of microbial communities for short periods of time under natural conditions that result in small degrees of partial labeling. One hurdle restricting large-scale in situ protein-SIP studies is the lack of algorithms and software for automated data processing of the massive data sets resulting from such studies. In response, we developed Stable Isotope Probing Protein Extraction Resources software (SIPPER) and applied it for large-scale extraction and visualization of data from short-term (3 h) protein-SIP experiments performed in situ on phototrophic bacterial mats isolated from Yellowstone National Park. Several metrics incorporated into the software allow it to support exhaustive analysis of the complex composite isotopic envelope observed as a result of low amounts of partial label incorporation. SIPPER also enables the detection of labeled molecular species without the need for any prior identification.

  3. BLINKER: Automated Extraction of Ocular Indices from EEG Enabling Large-Scale Analysis.

    PubMed

    Kleifges, Kelly; Bigdely-Shamlo, Nima; Kerick, Scott E; Robbins, Kay A

    2017-01-01

    Electroencephalography (EEG) offers a platform for studying the relationships between behavioral measures, such as blink rate and duration, with neural correlates of fatigue and attention, such as theta and alpha band power. Further, the existence of EEG studies covering a variety of subjects and tasks provides opportunities for the community to better characterize variability of these measures across tasks and subjects. We have implemented an automated pipeline (BLINKER) for extracting ocular indices such as blink rate, blink duration, and blink velocity-amplitude ratios from EEG channels, EOG channels, and/or independent components (ICs). To illustrate the use of our approach, we have applied the pipeline to a large corpus of EEG data (comprising more than 2000 datasets acquired at eight different laboratories) in order to characterize variability of certain ocular indicators across subjects. We also investigate dependence of ocular indices on task in a shooter study. We have implemented our algorithms in a freely available MATLAB toolbox called BLINKER. The toolbox, which is easy to use and can be applied to collections of data without user intervention, can automatically discover which channels or ICs capture blinks. The tools extract blinks, calculate common ocular indices, generate a report for each dataset, dump labeled images of the individual blinks, and provide summary statistics across collections. Users can run BLINKER as a script or as a plugin for EEGLAB. The toolbox is available at https://github.com/VisLab/EEG-Blinks. User documentation and examples appear at http://vislab.github.io/EEG-Blinks/.
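
    A naive threshold-crossing blink finder illustrates the kind of ocular indices computed here. BLINKER itself is far more robust (it fits blink shapes and discovers suitable channels automatically); this sketch, with an invented function name and a synthetic trace, only shows how durations and blink rate fall out of run detection:

```python
import numpy as np

def detect_blinks(signal, fs, thresh):
    """Treat each contiguous run of samples above `thresh` as one blink.
    Returns per-blink durations (s) and the blink rate (blinks/min)."""
    above = signal > thresh
    edges = np.diff(above.astype(int))
    starts = np.flatnonzero(edges == 1) + 1
    ends = np.flatnonzero(edges == -1) + 1
    if above[0]:                       # run starting at sample 0
        starts = np.r_[0, starts]
    if above[-1]:                      # run still open at the end
        ends = np.r_[ends, len(signal)]
    durations = (ends - starts) / fs
    rate = len(starts) / (len(signal) / fs) * 60.0
    return durations, rate

# synthetic 10 s trace at 100 Hz with two 0.2 s "blinks"
fs = 100
sig = np.zeros(10 * fs)
sig[200:220] = 1.0
sig[700:720] = 1.0
durations, rate = detect_blinks(sig, fs, thresh=0.5)
```

    Velocity-amplitude ratios, as mentioned in the abstract, would additionally require differentiating the signal within each detected run.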

  4. Automated data extraction from in situ protein stable isotope probing studies

    SciTech Connect

    Slysz, Gordon W.; Steinke, Laurey A.; Ward, David M.; Klatt, Christian G.; Clauss, Therese RW; Purvine, Samuel O.; Payne, Samuel H.; Anderson, Gordon A.; Smith, Richard D.; Lipton, Mary S.

    2014-01-27

    Protein stable isotope probing (protein-SIP) has strong potential for revealing key metabolizing taxa in complex microbial communities. While most protein-SIP work to date has been performed under controlled laboratory conditions to allow extensive isotope labeling of the target organism, a key application will be in situ studies of microbial communities under conditions that result in small degrees of partial labeling. One hurdle restricting large scale in situ protein-SIP studies is the lack of algorithms and software for automated data processing of the massive data sets resulting from such studies. In response, we developed Stable Isotope Probing Protein Extraction Resources software (SIPPER) and applied it for large scale extraction and visualization of data from short term (3 h) protein-SIP experiments performed in situ on Yellowstone phototrophic bacterial mats. Several metrics incorporated into the software allow it to support exhaustive analysis of the complex composite isotopic envelope observed as a result of low amounts of partial label incorporation. SIPPER also enables the detection of labeled molecular species without the need for any prior identification.

  5. BLINKER: Automated Extraction of Ocular Indices from EEG Enabling Large-Scale Analysis

    PubMed Central

    Kleifges, Kelly; Bigdely-Shamlo, Nima; Kerick, Scott E.; Robbins, Kay A.

    2017-01-01

    Electroencephalography (EEG) offers a platform for studying the relationships between behavioral measures, such as blink rate and duration, with neural correlates of fatigue and attention, such as theta and alpha band power. Further, the existence of EEG studies covering a variety of subjects and tasks provides opportunities for the community to better characterize variability of these measures across tasks and subjects. We have implemented an automated pipeline (BLINKER) for extracting ocular indices such as blink rate, blink duration, and blink velocity-amplitude ratios from EEG channels, EOG channels, and/or independent components (ICs). To illustrate the use of our approach, we have applied the pipeline to a large corpus of EEG data (comprising more than 2000 datasets acquired at eight different laboratories) in order to characterize variability of certain ocular indicators across subjects. We also investigate dependence of ocular indices on task in a shooter study. We have implemented our algorithms in a freely available MATLAB toolbox called BLINKER. The toolbox, which is easy to use and can be applied to collections of data without user intervention, can automatically discover which channels or ICs capture blinks. The tools extract blinks, calculate common ocular indices, generate a report for each dataset, dump labeled images of the individual blinks, and provide summary statistics across collections. Users can run BLINKER as a script or as a plugin for EEGLAB. The toolbox is available at https://github.com/VisLab/EEG-Blinks. User documentation and examples appear at http://vislab.github.io/EEG-Blinks/. PMID:28217081

  6. Rapid and automated sample preparation for nucleic acid extraction on a microfluidic CD (compact disk)

    NASA Astrophysics Data System (ADS)

    Kim, Jitae; Kido, Horacio; Zoval, Jim V.; Gagné, Dominic; Peytavi, Régis; Picard, François J.; Bastien, Martine; Boissinot, Maurice; Bergeron, Michel G.; Madou, Marc J.

    2006-01-01

    Rapid and automated preparation of PCR (polymerase chain reaction)-ready genomic DNA was demonstrated on a multiplexed CD (compact disk) platform by using hard-to-lyse bacterial spores. Cell disruption is carried out while bead-cell suspensions are pushed back and forth in center-tapered lysing chambers by angular oscillation of the disk (the keystone effect). During this lysis period, the cell suspensions are securely held within the lysing chambers by heat-activated wax valves. Upon application of remote heat to the disk in motion, the wax valves release the lysate solutions into centrifuge chambers, where cell debris is separated by an elevated rotation of the disk. Only debris-free DNA extract is then transferred to collection chambers by capillary-assisted siphon and collected for heating that inactivates PCR inhibitors. Lysing capacity was evaluated using a real-time PCR assay to monitor the efficiency of Bacillus globigii spore lysis. PCR analysis showed that a 5-minute CD lysis run gave spore lysis efficiency similar to that obtained with a popular commercial DNA extraction kit (the IDI-lysis kit from GeneOhm Sciences Inc.), which is highly efficient for microbial cell and spore lysis. This work will contribute to the development of an integrated CD-based assay for rapid diagnosis of infectious diseases.

  7. Streamlining DNA Barcoding Protocols: Automated DNA Extraction and a New cox1 Primer in Arachnid Systematics

    PubMed Central

    Vidergar, Nina; Toplak, Nataša; Kuntner, Matjaž

    2014-01-01

    Background DNA barcoding is a popular tool in taxonomic and phylogenetic studies, but for most animal lineages protocols for obtaining the barcoding sequence—mitochondrial cytochrome C oxidase subunit I (cox1, also known as CO1)—are not standardized. Our aim was to explore an optimal strategy for arachnids, focusing on the most species-rich lineage, spiders, by (1) improving an automated DNA extraction protocol, (2) testing the performance of commonly used primer combinations, and (3) developing a new cox1 primer suitable for more efficient alignment and phylogenetic analyses. Methodology We used exemplars of 15 species from all major spider clades, processed a range of spider tissues of varying size and quality, optimized genomic DNA extraction using the MagMAX Express magnetic particle processor—an automated high-throughput DNA extraction system—and tested cox1 amplification protocols emphasizing the standard barcoding region using ten routinely employed primer pairs. Results The best results were obtained with the commonly used Folmer primers (LCO1490/HCO2198) that capture the standard barcode region, and with the C1-J-2183/C1-N-2776 primer pair that amplifies its extension. However, C1-J-2183 is designed too close to HCO2198 for well-interpreted, continuous sequence data, and in practice the resulting sequences from the two primer pairs rarely overlap. We therefore designed a new forward primer, C1-J-2123, 60 base pairs upstream of the C1-J-2183 binding site. The success rate of this new primer (93%) matched that of C1-J-2183. Conclusions The use of C1-J-2123 allows full, indel-free overlap of sequences obtained with the standard Folmer primers and with the C1-J-2123 primer pair. Our preliminary tests suggest that in addition to spiders, C1-J-2123 will also perform well in other arachnids and several other invertebrates. We provide optimal PCR protocols for these primer sets, and recommend using them for systematic efforts beyond DNA barcoding. PMID:25415202

  8. Automated fast extraction of nitrated polycyclic aromatic hydrocarbons from soil by focused microwave-assisted Soxhlet extraction prior to gas chromatography--electron-capture detection.

    PubMed

    Priego-Capote, F; Luque-García, J L; Luque de Castro, M D

    2003-04-25

    An approach for the automated fast extraction of nitrated polycyclic aromatic hydrocarbons (nitroPAHs) from soil, using a focused microwave-assisted Soxhlet extractor, is proposed. The main factors affecting the extraction efficiency (namely: irradiation power, irradiation time, number of cycles and extractant volume) were optimised by using experimental design methodology. The reduction of the nitro-PAHs to amino-PAHs and the derivatisation of the reduced analytes with heptafluorobutyric anhydride was mandatory prior to the separation-determination step by gas chromatography--electron-capture detection. The proposed approach has allowed the extraction of these pollutants from spiked and "real" contaminated soils with extraction efficiencies similar to those provided by the US Environmental Protection Agency methods 3540-8091, but with a drastic reduction in both the extraction time and sample handling, and using less organic solvent, as 75-85% of it was recycled.

  9. Development of automated extraction method of biliary tract from abdominal CT volumes based on local intensity structure analysis

    NASA Astrophysics Data System (ADS)

    Koga, Kusuto; Hayashi, Yuichiro; Hirose, Tomoaki; Oda, Masahiro; Kitasaka, Takayuki; Igami, Tsuyoshi; Nagino, Masato; Mori, Kensaku

    2014-03-01

    In this paper, we propose an automated biliary tract extraction method from abdominal CT volumes. The biliary tract is the path by which bile is transported from the liver to the duodenum. No method has been reported for automated extraction of the biliary tract from common contrast-enhanced CT volumes. Our method consists of three steps: (1) extraction of extrahepatic bile duct (EHBD) candidate regions, (2) extraction of intrahepatic bile duct (IHBD) candidate regions, and (3) combination of these candidate regions. The IHBD has linear structures, and its intensities in CT volumes are low. For IHBD candidate region extraction, we use a dark linear structure enhancement (DLSE) filter based on a local intensity structure analysis using the eigenvalues of the Hessian matrix. The EHBD region is extracted using a thresholding process and a connected component analysis. In the combination process, we connect the IHBD candidate regions to each EHBD candidate region and select a bile duct region from the connected candidate regions. We applied the proposed method to 22 CT volumes; the average Dice coefficient of the extraction results was 66.7%.
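
    The eigenvalue criterion behind a dark-line (valley) filter can be illustrated in 2-D: across a dark line the intensity profile is a valley, so the largest Hessian eigenvalue is strongly positive there. A NumPy sketch under stated assumptions (finite differences stand in for the smoothed Gaussian-derivative Hessian used in practice, and the 3-D formulation of the paper is reduced to 2-D):

```python
import numpy as np

def dark_line_response(img):
    """Per-pixel largest Hessian eigenvalue, clipped to keep only
    valley-like (dark line) responses."""
    gy, gx = np.gradient(img.astype(float))   # first derivatives
    gyy, gyx = np.gradient(gy)                # second derivatives
    gxy, gxx = np.gradient(gx)
    # closed-form eigenvalues of the 2x2 Hessian [[gxx, gxy], [gxy, gyy]]
    tr = gxx + gyy
    det = gxx * gyy - gxy * gyx
    disc = np.sqrt(np.maximum((tr / 2.0) ** 2 - det, 0.0))
    lam_max = tr / 2.0 + disc
    return np.maximum(lam_max, 0.0)           # suppress ridge/flat responses

# toy image: a dark vertical line on a bright background
img = np.full((7, 7), 100.0)
img[:, 3] = 20.0
resp = dark_line_response(img)
```

    The response peaks on the dark line itself and is zero on the bright ridges beside it, which is the behavior that lets the filter pick out low-intensity tubular structures.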

  10. Californian demonstration and validation of automated agricultural field extraction from multi-temporal Landsat data

    NASA Astrophysics Data System (ADS)

    Yan, L.; Roy, D. P.

    2013-12-01

    The spatial distribution of agricultural fields is a fundamental description of rural landscapes, and the location and extent of fields are important for establishing the area of land utilized for agricultural yield prediction, resource allocation, and economic planning. To date, field objects have not been extracted from satellite data over large areas because of computational constraints and because consistently processed data of appropriate resolution have not been available or affordable. We present a fully automated computational methodology to extract agricultural fields from 30 m Web Enabled Landsat Data (WELD) time series, with results for approximately 250,000 square kilometers (eleven 150 x 150 km WELD tiles) encompassing all the major agricultural areas of California. The extracted fields, including rectangular, circular, and irregularly shaped fields, are evaluated by comparison with manually interpreted Landsat field objects. Validation results are presented in terms of standard confusion matrix accuracy measures and also the degree of field object over-segmentation, under-segmentation, fragmentation and shape distortion. The apparent success of the presented field extraction methodology is due to several factors. First, the use of multi-temporal Landsat data, as opposed to single Landsat acquisitions, enables crop rotations and inter-annual variability in the state of the vegetation to be accommodated and provides more opportunities for cloud-free, non-missing and atmospherically uncontaminated surface observations. Second, the adoption of an object-based approach, namely the variational region-based geometric active contour method, enables robust segmentation with only a small number of parameters and requires no training data collection. Third, the use of a watershed algorithm to decompose connected segments belonging to multiple fields into coherent isolated field segments and a geometry-based algorithm to detect and associate parts of

  11. Determination of 21 drugs in oral fluid using fully automated supported liquid extraction and UHPLC-MS/MS.

    PubMed

    Valen, Anja; Leere Øiestad, Åse Marit; Strand, Dag Helge; Skari, Ragnhild; Berg, Thomas

    2016-07-28

    Collection of oral fluid (OF) is easy and non-invasive compared to the collection of urine and blood, and interest in OF for drug screening and diagnostic purposes is increasing. A high-throughput ultra-high-performance liquid chromatography-tandem mass spectrometry method for determination of 21 drugs in OF using fully automated 96-well plate supported liquid extraction for sample preparation is presented. The method covers a selection of classic drugs of abuse, including amphetamines, cocaine, cannabis, opioids, and benzodiazepines. The method was fully validated for 200 μL OF/buffer mix using an Intercept OF sampling kit; validation included linearity, sensitivity, precision, accuracy, extraction recovery, matrix effects, stability, and carry-over. Inter-assay precision (RSD) and accuracy (relative error) were <15% and -13 to 5%, respectively, for all compounds at concentrations equal to or higher than the lower limit of quantification. Extraction recoveries were between 58 and 76% (RSD < 8%), except for tetrahydrocannabinol and three 7-amino benzodiazepine metabolites, with recoveries between 23 and 33% (RSD between 51 and 52% and 11 and 25%, respectively). Ion enhancement or ion suppression effects were observed for a few compounds; however, they were largely compensated for by the internal standards used. Deuterium-labelled and 13C-labelled internal standards were used for 8 and 11 of the compounds, respectively. In a comparison between Intercept and Quantisal OF kits, better recoveries and fewer matrix effects were observed for some compounds using Quantisal. The method is sensitive and robust for its purposes and has been used successfully since February 2015 for analysis of Intercept OF samples from 2600 cases in a 12-month period. Copyright © 2016 John Wiley & Sons, Ltd.

  12. Support Vector Machine with Ensemble Tree Kernel for Relation Extraction

    PubMed Central

    Fu, Hui; Du, Zhiguo

    2016-01-01

    Relation extraction is one of the important research topics in the field of information extraction. To address the problem of semantic variation in traditional semisupervised relation extraction algorithms, this paper proposes a novel semisupervised relation extraction algorithm based on ensemble learning (LXRE). The new algorithm mainly integrates two kinds of support vector machine classifiers based on tree kernels and adopts a strategy of constrained seed-set extension. The new algorithm can weaken the inaccuracy of relation extraction caused by the phenomenon of semantic variation. Numerical experiments on two benchmark data sets (PropBank and AIMed) show that the proposed LXRE algorithm is superior to two other common relation extraction methods on four evaluation indexes (Precision, Recall, F-measure, and Accuracy). This indicates that the new algorithm has good relation extraction ability compared with others. PMID:27118966
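The tree kernel at the heart of such SVM classifiers counts shared subtree structure between two parse trees. Below is a minimal, simplified Collins-Duffy-style kernel on trees encoded as nested tuples of the form (label, child, ...). It is an illustrative sketch, not the paper's exact kernel (for instance, it omits the usual decay factor lambda).

```python
def tree_kernel(t1, t2):
    """Count pairs of identical subtree fragments between two trees
    (naive O(n1*n2) simplification of the Collins-Duffy subset-tree kernel)."""
    def nodes(t):
        yield t
        for child in t[1:]:
            yield from nodes(child)

    def delta(a, b):
        # productions must match: same label, same number of children
        if a[0] != b[0] or len(a) != len(b):
            return 0
        if len(a) == 1:            # both are leaves with the same label
            return 1
        prod = 1
        for ca, cb in zip(a[1:], b[1:]):
            prod *= 1 + delta(ca, cb)
        return prod

    return sum(delta(a, b) for a in nodes(t1) for b in nodes(t2))
```

For example, the kernel of the tiny tree ('S', ('NP',), ('VP',)) with itself is 6: four fragments rooted at S plus one match for each leaf.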

  13. A new automated spectral feature extraction method and its application in spectral classification and defective spectra recovery

    NASA Astrophysics Data System (ADS)

    Wang, Ke; Guo, Ping; Luo, A.-Li

    2017-03-01

    Spectral feature extraction is a crucial procedure in automated spectral analysis. This procedure starts from the spectral data and produces informative and non-redundant features, facilitating the subsequent automated processing and analysis with machine-learning and data-mining techniques. In this paper, we present a new automated feature extraction method for astronomical spectra, with application in spectral classification and defective spectra recovery. The basic idea of our approach is to train a deep neural network to extract features of spectra with different levels of abstraction in different layers. The deep neural network is trained with a fast layer-wise learning algorithm in an analytical way without any iterative optimization procedure. We evaluate the performance of the proposed scheme on real-world spectral data. The results demonstrate that our method is superior regarding its comprehensive performance, and the computational cost is significantly lower than that for other methods. The proposed method can be regarded as a new valid alternative general-purpose feature extraction method for various tasks in spectral data analysis.
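The phrase "trained with a fast layer-wise learning algorithm in an analytical way without any iterative optimization" can be illustrated by a fixed random-feature layer with a closed-form least-squares readout, in the style of an extreme learning machine. This is a hedged sketch under that assumption: the layer size, ridge term, and helper names are inventions for illustration, not the paper's architecture.

```python
import math, random

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def random_hidden(X, n_hidden, seed=0):
    """Fixed (untrained) random tanh projection layer."""
    rng = random.Random(seed)
    W = [[rng.gauss(0, 1) for _ in range(n_hidden)] for _ in range(len(X[0]))]
    return [[math.tanh(sum(x * W[i][j] for i, x in enumerate(row)))
             for j in range(n_hidden)] for row in X]

def analytic_readout(H, y, ridge=1e-8):
    """Closed-form ridge least squares: beta = (H^T H + ridge*I)^-1 H^T y."""
    n = len(H[0])
    A = [[sum(h[i] * h[j] for h in H) + (ridge if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    b = [sum(h[i] * yi for h, yi in zip(H, y)) for i in range(n)]
    return solve(A, b)
```

No gradient descent is involved: training is one linear solve, which is what makes such layer-wise schemes fast compared to iteratively optimized deep networks.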

  14. A comparison of methods for forensic DNA extraction: Chelex-100® and the QIAGEN DNA Investigator Kit (manual and automated).

    PubMed

    Phillips, Kirsty; McCallum, Nicola; Welch, Lindsey

    2012-03-01

    Efficient isolation of DNA from a sample is the basis for successful forensic DNA profiling. There are many DNA extraction methods available, and they vary in their ability to efficiently extract the DNA, as well as in processing time, operator intervention, contamination risk and ease of use. In recent years, automated robots have been made available which speed up processing time and decrease the amount of operator input. This project was set up to investigate the efficiency of three DNA extraction methods, two manual (Chelex®-100 and the QIAGEN DNA Investigator Kit) and one automated (QIAcube), using both buccal cells and blood stains as the DNA source. Extracted DNA was quantified using real-time PCR in order to assess the amount of DNA present in each sample. Selected samples were then amplified using the AmpFlSTR SGM Plus amplification kit. The results suggested that there was no statistical difference between results gained for the different methods investigated, but the automated QIAcube robot made sample processing much simpler and quicker without introducing DNA contamination.

  15. Sensitivity testing of trypanosome detection by PCR from whole blood samples using manual and automated DNA extraction methods.

    PubMed

    Dunlop, J; Thompson, C K; Godfrey, S S; Thompson, R C A

    2014-11-01

    Automated extraction of DNA for testing of laboratory samples is an attractive alternative to labour-intensive manual methods when higher throughput is required. However, it is important to maintain the maximum detection sensitivity possible to reduce the occurrence of type II errors (false negatives; failure to detect the target when it is present), especially in the biomedical field, where PCR is used for diagnosis. We used blood infected with known concentrations of Trypanosoma copemani to test the impact of analysis techniques on trypanosome detection sensitivity by PCR. We compared combinations of a manual and an automated DNA extraction method and two different PCR primer sets to investigate the impact of each on detection levels. Both extraction techniques and specificity of primer sets had a significant impact on detection sensitivity. Samples extracted using the same DNA extraction technique performed substantially differently for each of the separate primer sets. Type I errors (false positives; detection of the target when it is not present), produced by contaminants, were avoided with both extraction methods. This study highlights the importance of testing laboratory techniques with known samples to optimise accuracy of test results.

  16. PKDE4J: Entity and relation extraction for public knowledge discovery.

    PubMed

    Song, Min; Kim, Won Chul; Lee, Dahee; Heo, Go Eun; Kang, Keun Young

    2015-10-01

    Due to the enormous number of scientific publications that cannot be handled manually, there is a rising interest in text-mining techniques for automated information extraction, especially in the biomedical field. Such techniques provide effective means of information search, knowledge discovery, and hypothesis generation. Most previous studies have primarily focused on the design and performance improvement of either named entity recognition or relation extraction. In this paper, we present PKDE4J, a comprehensive text-mining system that integrates dictionary-based entity extraction and rule-based relation extraction in a highly flexible and extensible framework. Starting with Stanford CoreNLP, we developed the system to cope with multiple types of entities and relations. The system also has fairly good performance in terms of accuracy, as well as the ability to configure text-processing components. We demonstrate its competitive performance by evaluating it on many corpora, finding that it surpasses existing systems with average F-measures of 85% for entity extraction and 81% for relation extraction.
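A dictionary-based entity extractor combined with a rule-based relation extractor can be sketched in a few lines. The dictionaries and trigger phrases below are invented examples for illustration, not PKDE4J's actual resources or rules.

```python
# Toy dictionaries (stand-ins for curated biomedical vocabularies).
GENES = {"BRCA1", "TP53"}
DISEASES = {"breast cancer", "sarcoma"}
# Toy relation rule: a trigger phrase co-occurring with both entity types.
TRIGGERS = {"associated with", "causes"}

def extract(sentence):
    """Dictionary lookup for entities, then a trigger-word rule for relations
    within the same sentence."""
    s = sentence.lower()
    genes = [g for g in GENES if g.lower() in s]
    diseases = [d for d in DISEASES if d in s]
    relations = [(g, t, d) for t in TRIGGERS if t in s
                 for g in genes for d in diseases]
    return genes, diseases, relations
```

Real systems add tokenization, longest-match lookup, normalization to database identifiers, and dependency-path rules; the skeleton of dictionary matching followed by rule firing is the same.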

  17. Rapid and Semi-Automated Extraction of Neuronal Cell Bodies and Nuclei from Electron Microscopy Image Stacks

    PubMed Central

    Holcomb, Paul S.; Morehead, Michael; Doretto, Gianfranco; Chen, Peter; Berg, Stuart; Plaza, Stephen; Spirou, George

    2016-01-01

    Connectomics—the study of how neurons wire together in the brain—is at the forefront of modern neuroscience research. However, many connectomics studies are limited by the time and precision needed to correctly segment large volumes of electron microscopy (EM) image data. We present here a semi-automated segmentation pipeline using freely available software that can significantly decrease segmentation time for extracting both nuclei and cell bodies from EM image volumes. PMID:27259933

  18. Evaluation of automated and manual commercial DNA extraction methods for recovery of Brucella DNA from suspensions and spiked swabs.

    PubMed

    Dauphin, Leslie A; Hutchins, Rebecca J; Bost, Liberty A; Bowen, Michael D

    2009-12-01

    This study evaluated automated and manual commercial DNA extraction methods for their ability to recover DNA from Brucella species in phosphate-buffered saline (PBS) suspension and from spiked swab specimens. Six extraction methods, representing several of the methodologies which are commercially available for DNA extraction, as well as representing various throughput capacities, were evaluated: the MagNA Pure Compact and the MagNA Pure LC instruments, the IT 1-2-3 DNA sample purification kit, the MasterPure Complete DNA and RNA purification kit, the QIAamp DNA blood mini kit, and the UltraClean microbial DNA isolation kit. These six extraction methods were performed upon three pathogenic Brucella species: B. abortus, B. melitensis, and B. suis. Viability testing of the DNA extracts indicated that all six extraction methods were efficient at inactivating virulent Brucella spp. Real-time PCR analysis using Brucella genus- and species-specific TaqMan assays revealed that use of the MasterPure kit resulted in superior levels of detection from bacterial suspensions, while the MasterPure kit and MagNA Pure Compact performed equally well for extraction of spiked swab samples. This study demonstrated that DNA extraction methodologies differ in their ability to recover Brucella DNA from PBS bacterial suspensions and from swab specimens and, thus, that the extraction method used for a given type of sample matrix can influence the sensitivity of real-time PCR assays for Brucella.

  19. An automated method of on-line extraction coupled with flow injection and capillary electrophoresis for phytochemical analysis.

    PubMed

    Chen, Hongli; Ding, Xiuping; Wang, Min; Chen, Xingguo

    2010-11-01

    In this study, an automated system for phytochemical analysis was successfully fabricated for the first time in our laboratory. The system included on-line decocting, filtering, cooling, sample introduction, separation, and detection, which greatly simplified the sample preparation and shortened the analysis time. Samples from the decoction extract were drawn every 5 min through an on-line filter and a condenser pipe to the sample loop, from which 20-μL samples were injected into the running buffer and transported into a split-flow interface coupling the flow injection and capillary electrophoresis systems. The separation of glycyrrhetinic acid (GTA) and glycyrrhizic acid (GA) took less than 5 min using a 10 mM borate buffer (pH adjusted to 8.8) and a +10 kV voltage. Calibration curves showed good linearity with correlation coefficients (R) greater than 0.9991. The intra-day repeatabilities (n = 5, expressed as relative standard deviation) of the proposed system, obtained using GTA and GA standards, were 1.1% and 0.8% for migration time and 0.7% and 0.9% for peak area, respectively. The mean recoveries of GTA and GA in the off-line extract of Glycyrrhiza uralensis Fisch root were better than 99.0%. The limits of detection (signal-to-noise ratio = 3) of the proposed method were 6.2 μg/mL and 6.9 μg/mL for GTA and GA, respectively. The dynamic changes of GTA and GA with decoction time were obtained during the on-line decoction of Glycyrrhiza uralensis Fisch root.

  20. Automated Semantic Indices Related to Cognitive Function and Rate of Cognitive Decline

    ERIC Educational Resources Information Center

    Pakhomov, Serguei V. S.; Hemmy, Laura S.; Lim, Kelvin O.

    2012-01-01

    The objective of our study is to introduce a fully automated, computational linguistic technique to quantify semantic relations between words generated on a standard semantic verbal fluency test and to determine its cognitive and clinical correlates. Cognitive differences between patients with Alzheimer's disease and mild cognitive impairment are…

  1. Arsenic fractionation in agricultural soil using an automated three-step sequential extraction method coupled to hydride generation-atomic fluorescence spectrometry.

    PubMed

    Rosas-Castor, J M; Portugal, L; Ferrer, L; Guzmán-Mar, J L; Hernández-Ramírez, A; Cerdà, V; Hinojosa-Reyes, L

    2015-05-18

    A fully automated modified three-step BCR flow-through sequential extraction method was developed for the fractionation of the arsenic (As) content from agricultural soil based on a multi-syringe flow injection analysis (MSFIA) system coupled to hydride generation-atomic fluorescence spectrometry (HG-AFS). Critical parameters that affect the performance of the automated system were optimized by exploiting a multivariate approach using a Doehlert design. The validation of the flow-based modified-BCR method was carried out by comparison with the conventional BCR method. Thus, the total As content was determined in the following three fractions: fraction 1 (F1), the acid-soluble or interchangeable fraction; fraction 2 (F2), the reducible fraction; and fraction 3 (F3), the oxidizable fraction. The limits of detection (LOD) were 4.0, 3.4, and 23.6 μg L⁻¹ for F1, F2, and F3, respectively. A wide working concentration range was obtained for the analysis of each fraction, i.e., 0.013-0.800, 0.011-0.900 and 0.079-1.400 mg L⁻¹ for F1, F2, and F3, respectively. The precision of the automated MSFIA-HG-AFS system, expressed as the relative standard deviation (RSD), was evaluated for a 200 μg L⁻¹ As standard solution, and RSD values between 5 and 8% were achieved for the three BCR fractions. The new modified three-step BCR flow-based sequential extraction method was satisfactorily applied for arsenic fractionation in real agricultural soil samples from an arsenic-contaminated mining zone to evaluate its extractability. The frequency of analysis of the proposed method was eight times higher than that of the conventional BCR method (6 vs 48 h), and the kinetics of lixiviation were established for each fraction.

  2. Using mobile laser scanning data for automated extraction of road markings

    NASA Astrophysics Data System (ADS)

    Guan, Haiyan; Li, Jonathan; Yu, Yongtao; Wang, Cheng; Chapman, Michael; Yang, Bisheng

    2014-01-01

    A mobile laser scanning (MLS) system allows direct collection of accurate 3D point information in unprecedented detail at highway speeds and at less than traditional survey costs, serving the fast-growing demands of transportation-related road surveying, including road surface geometry and the road environment. As one type of road feature in traffic management systems, road markings on paved roadways have important functions in providing guidance and information to drivers and pedestrians. This paper presents a stepwise procedure to recognize road markings from MLS point clouds. To improve computational efficiency, we first propose a curb-based method for road surface extraction. This method first partitions the raw MLS data into a set of profiles according to vehicle trajectory data, and then extracts small height jumps caused by curbs in the profiles via slope and elevation-difference thresholds. Next, points belonging to the extracted road surface are interpolated into a geo-referenced intensity image using an extended inverse-distance-weighted (IDW) approach. Finally, we dynamically segment the geo-referenced intensity image into road-marking candidates with multiple thresholds that correspond to different point-density ranges. A morphological closing operation with a linear structuring element is then used to refine the road-marking candidates by removing noise and improving completeness. The road-marking extraction algorithm is discussed in detail, including an analysis of parameter sensitivity and overall performance. An experimental study performed on a set of road markings with ground truth shows that the proposed algorithm provides a promising solution to road-marking extraction from MLS data.
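The curb cue used for road-surface extraction (small height jumps in cross-road elevation profiles) can be sketched as a simple elevation-difference scan. The threshold values here are illustrative assumptions, not the paper's calibrated parameters.

```python
def find_curbs(profile, jump_min=0.08, jump_max=0.25):
    """Scan a cross-road elevation profile (heights in metres, one sample
    per step away from the vehicle trajectory) and return the indices of
    consecutive-sample height jumps that fall within typical curb bounds."""
    curbs = []
    for i in range(1, len(profile)):
        dz = profile[i] - profile[i - 1]
        # a curb is a small, bounded step: large jumps are walls or vehicles,
        # tiny ones are pavement roughness
        if jump_min <= abs(dz) <= jump_max:
            curbs.append(i)
    return curbs
```

The real method also applies a slope threshold and works on trajectory-partitioned profiles; this sketch only shows the elevation-difference test.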

  3. A System for Automated Extraction of Metadata from Scanned Documents using Layout Recognition and String Pattern Search Models.

    PubMed

    Misra, Dharitri; Chen, Siyuan; Thoma, George R

    2009-01-01

    One of the most expensive aspects of archiving digital documents is the manual acquisition of context-sensitive metadata useful for the subsequent discovery of, and access to, the archived items. For certain types of textual documents, such as journal articles, pamphlets, official government records, etc., where the metadata is contained within the body of the documents, a cost-effective method is to identify and extract the metadata in an automated way, applying machine learning and string pattern search techniques. At the U.S. National Library of Medicine (NLM) we have developed an automated metadata extraction (AME) system that employs layout classification and recognition models with a metadata pattern search model for a text corpus with structured or semi-structured information. A combination of Support Vector Machine and Hidden Markov Model is used to create the layout recognition models from a training set of the corpus, following which a rule-based metadata search model is used to extract the embedded metadata by analyzing the string patterns within and surrounding each field in the recognized layouts. In this paper, we describe the design of our AME system, with a focus on the metadata search model. We present the extraction results for a historic collection from the Food and Drug Administration, and outline how the system may be adapted for similar collections. Finally, we discuss some ongoing enhancements to our AME system.
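The string-pattern search step can be sketched with per-field regular expressions applied to the text of a recognized layout region. The patterns and field names below are invented illustrations, not NLM's actual AME rules.

```python
import re

# One regex per metadata field; after layout recognition assigns a region
# to a field, the matching pattern pulls the value out of the region text.
PATTERNS = {
    "date": re.compile(r"\b(\d{1,2}\s+\w+\s+\d{4})\b"),
    "docket": re.compile(r"Docket\s+No\.\s*([A-Z0-9-]+)"),
}

def search_metadata(region_text):
    """Return a dict of field -> extracted value for every pattern that fires."""
    found = {}
    for field, pattern in PATTERNS.items():
        m = pattern.search(region_text)
        if m:
            found[field] = m.group(1)
    return found
```

In the full system, the layout model narrows which patterns are tried where, which keeps false matches down on noisy OCR text.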

  4. A novel automated device for rapid nucleic acid extraction utilizing a zigzag motion of magnetic silica beads.

    PubMed

    Yamaguchi, Akemi; Matsuda, Kazuyuki; Uehara, Masayuki; Honda, Takayuki; Saito, Yasunori

    2016-02-04

    We report a novel automated device for nucleic acid extraction, which consists of a mechanical control system and a disposable cassette. The cassette is composed of a bottle, a capillary tube, and a chamber. After sample injection into the bottle, the sample is lysed, and nucleic acids are adsorbed on the surface of magnetic silica beads. These magnetic beads are transported and vibrated through the washing reagents in the capillary tube under the control of the mechanical control system, and thus the nucleic acid is purified without centrifugation. The purified nucleic acid is automatically extracted in 3 min for the polymerase chain reaction (PCR). The nucleic acid extraction is dependent on the transport speed and the vibration frequency of the magnetic beads, and optimizing these two parameters provided better PCR efficiency than the conventional manual procedure. There was no difference between the detection limits of our novel device and that of the conventional manual procedure. We have already developed the droplet-PCR machine, which can amplify and detect specific nucleic acids rapidly and automatically. Connecting the droplet-PCR machine to our novel automated extraction device enables PCR analysis within 15 min, and this system can be made available for point-of-care testing in clinics as well as general hospitals.

  5. Comparative evaluation of commercially available manual and automated nucleic acid extraction methods for rotavirus RNA detection in stools.

    PubMed

    Esona, Mathew D; McDonald, Sharla; Kamili, Shifaq; Kerin, Tara; Gautam, Rashi; Bowen, Michael D

    2013-12-01

    Rotaviruses are a major cause of viral gastroenteritis in children. For accurate and sensitive detection of rotavirus RNA from stool samples by reverse transcription-polymerase chain reaction (RT-PCR), the extraction process must be robust. However, some extraction methods may not remove the strong RT-PCR inhibitors known to be present in stool samples. The objective of this study was to evaluate and compare the performance of six methods commonly used for extraction of rotavirus RNA from stool that have not previously been formally compared: the MagNA Pure Compact, KingFisher Flex and NucliSENS easyMAG instruments, the NucliSENS miniMAG semi-automated system, and two manual purification kits, the QIAamp Viral RNA kit and a modified RNaid kit. Using each method, total nucleic acid or RNA was extracted from eight rotavirus-positive stool samples with enzyme immunoassay optical density (EIA OD) values ranging from 0.176 to 3.098. Extracts prepared using the MagNA Pure Compact instrument yielded the most consistent results by qRT-PCR and conventional RT-PCR. When extracts prepared from a dilution series were extracted by the six methods and tested, rotavirus RNA was detected in all samples by qRT-PCR, but by conventional RT-PCR testing only the MagNA Pure Compact and KingFisher Flex extracts were positive in all cases. RT-PCR inhibitors were detected in extracts produced with the QIAamp Viral RNA Mini kit. The findings of this study should prove useful for selection of extraction methods to be incorporated into future rotavirus detection and genotyping protocols.

  6. Mixed-mode isolation of triazine metabolites from soil and aquifer sediments using automated solid-phase extraction

    USGS Publications Warehouse

    Mills, M.S.; Thurman, E.M.

    1992-01-01

    Reversed-phase isolation and ion-exchange purification were combined in the automated solid-phase extraction of two polar s-triazine metabolites, 2-amino-4-chloro-6-(isopropylamino)-s-triazine (deethylatrazine) and 2-amino-4-chloro-6-(ethylamino)-s-triazine (deisopropylatrazine), from clay-loam and silt-loam soils and sandy aquifer sediments. First, methanol/water (4/1, v/v) soil extracts were transferred to an automated workstation following evaporation of the methanol phase for the rapid reversed-phase isolation of the metabolites on an octadecyl resin (C18). The retention of the triazine metabolites on C18 decreased substantially when trace methanol concentrations (1%) remained. Furthermore, the retention on C18 increased with decreasing aqueous solubility and increasing alkyl-chain length of the metabolites and parent herbicides, indicating a reversed-phase interaction. The analytes were eluted with ethyl acetate, which left much of the soil organic-matter impurities on the resin. Second, the small-volume organic eluate was purified on an anion-exchange resin (0.5 mL/min) to extract the remaining soil pigments that could foul the ion source of the GC/MS system. Recoveries of the analytes were 75%, using deuterated atrazine as a surrogate, and were comparable to recoveries by Soxhlet extraction. The detection limit was 0.1 μg/kg with a coefficient of variation of 15%. The ease and efficiency of this automated method make it a viable, practical technique for studying triazine metabolites in the environment.

  7. Automated solid phase extraction, on-support derivatization and isotope dilution-GC/MS method for the detection of urinary dialkyl phosphates in humans.

    PubMed

    De Alwis, G K Hemakanthi; Needham, Larry L; Barr, Dana B

    2009-01-15

    We developed an analytical method based on solid phase extraction, on-support derivatization and isotope dilution-GC/MS for the detection of the dialkyl phosphate (DAP) metabolites dimethyl thiophosphate, diethyl thiophosphate, dimethyl dithiophosphate, and diethyl dithiophosphate in human urine. The sample preparation procedure is simple and fully automated. In this method, the analytes were extracted from the urinary matrix onto a styrene-divinyl benzene polymer-based solid phase extraction cartridge and derivatized on-column with pentafluorobenzyl bromide. The ester-conjugated analytes are eluted from the column with acetonitrile, concentrated, and analyzed. Compared to extraction followed by post-extraction derivatization for the analysis of DAP metabolites, this on-support derivatization is fast, efficient, and less labor-intensive. Furthermore, it has fewer steps in the sample preparation, uses less solvent and produces less interference. The method is highly sensitive, with limits of detection for the analytes ranging from 0.1 to 0.3 ng/mL. The recoveries were high and comparable with those of our previous method. Relative standard deviation, indicative of the repeatability and precision of the method, was 1-17% for the metabolites.

  8. Automated extraction of oscillation parameters for Kepler observations of solar-type stars

    NASA Astrophysics Data System (ADS)

    Huber, D.; Stello, D.; Bedding, T. R.; Chaplin, W. J.; Arentoft, T.; Quirion, P.-O.; Kjeldsen, H.

    2009-10-01

    The recent launch of the Kepler space telescope brings the opportunity to study oscillations systematically in large numbers of solar-like stars. In the framework of the asteroFLAG project, we have developed an automated pipeline to estimate global oscillation parameters, such as the frequency of maximum power (νmax) and the large frequency spacing (Δν), for a large number of time series. We present an effective method based on the autocorrelation function to find excess power and use a scaling relation to estimate granulation timescales as initial conditions for background modelling. We derive reliable uncertainties for νmax and Δν through extensive simulations. We have tested the pipeline on about 2000 simulated Kepler stars with magnitudes of V ≈ 7-12 and were able to correctly determine νmax and Δν for about half of the sample. For about 20%, the returned large frequency spacing is accurate enough to determine stellar radii to a 1% precision. We conclude that the methods presented here are a promising approach to process the large amount of data expected from Kepler.
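The autocorrelation idea is that a power spectrum containing a regular comb of oscillation peaks spaced by Δν shows a strong autocorrelation peak at a lag of about Δν. A toy stand-in for the pipeline's Δν search (the function names and the min_lag guard are assumptions for illustration):

```python
def autocorr(x):
    """Normalized autocorrelation of a mean-removed sequence."""
    n = len(x)
    m = sum(x) / n
    d = [v - m for v in x]
    var = sum(v * v for v in d)
    return [sum(d[i] * d[i + k] for i in range(n - k)) / var for k in range(n)]

def spacing_estimate(power, min_lag=2):
    """Estimate the comb spacing, in frequency bins, as the lag with the
    largest autocorrelation beyond a minimum lag (which excludes the trivial
    peak at zero lag)."""
    ac = autocorr(power)
    lags = range(min_lag, len(ac) // 2)
    return max(lags, key=lambda k: ac[k])
```

The real pipeline works on the smoothed power spectrum of each star and converts the winning lag from bins back to μHz; this sketch keeps everything in bins.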

  9. Single-trial event-related potential extraction through one-unit ICA-with-reference

    NASA Astrophysics Data System (ADS)

    Lih Lee, Wee; Tan, Tele; Falkmer, Torbjörn; Leung, Yee Hong

    2016-12-01

    Objective. In recent years, ICA has been one of the more popular methods for extracting event-related potential (ERP) at the single-trial level. It is a blind source separation technique that allows the extraction of an ERP without making strong assumptions on the temporal and spatial characteristics of an ERP. However, the problem with traditional ICA is that the extraction is not direct and is time-consuming due to the need for source selection processing. In this paper, the application of a one-unit ICA-with-Reference (ICA-R), a constrained ICA method, is proposed. Approach. In cases where the time-region of the desired ERP is known a priori, this time information is utilized to generate a reference signal, which is then used for guiding the one-unit ICA-R to extract the source signal of the desired ERP directly. Main results. Our results showed that, as compared to traditional ICA, ICA-R is a more effective method for analysing ERPs because it avoids manual source selection and requires less computation, thus resulting in faster ERP extraction. Significance. In addition, since the method is automated, it reduces the risks of any subjective bias in the ERP analysis. It is also a potential tool for extracting the ERP in online applications.
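The role of the reference signal can be illustrated with the closeness measure alone: a normalized correlation between a candidate source and a reference pulse covering the known ERP time-region. This is a toy sketch; real ICA-R optimizes this constraint inside the ICA contrast function rather than selecting among pre-computed components.

```python
def closeness(component, reference):
    """Absolute normalized correlation between a candidate source time
    course and the reference signal (a simple closeness measure)."""
    n = len(component)
    mc = sum(component) / n
    mr = sum(reference) / n
    num = sum((c - mc) * (r - mr) for c, r in zip(component, reference))
    den = (sum((c - mc) ** 2 for c in component) *
           sum((r - mr) ** 2 for r in reference)) ** 0.5
    return abs(num / den)

def pick_erp(components, reference):
    """Choose the component closest to the reference, i.e. the one most
    likely to be the desired ERP source."""
    return max(components, key=lambda c: closeness(c, reference))
```

Because the reference only needs to mark the approximate ERP time-region, a crude square pulse is usually enough to steer the extraction.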

  10. A Model-Based Analysis of Semi-Automated Data Discovery and Entry Using Automated Content Extraction

    DTIC Science & Technology

    2011-02-01

    Approved for public release; distribution unlimited. (Only the report documentation page was captured; the abstract itself is missing. Surviving fragments define the model's variables: S = number of sentences across all documents; WSa = words per sentence containing a relation for SW; WPa = words per paragraph.)

  11. Automated identification and geometrical features extraction of individual trees from Mobile Laser Scanning data in Budapest

    NASA Astrophysics Data System (ADS)

    Koma, Zsófia; Székely, Balázs; Folly-Ritvay, Zoltán; Skobrák, Ferenc; Koenig, Kristina; Höfle, Bernhard

    2016-04-01

    Mobile Laser Scanning (MLS) is an evolving operational measurement technique for urban environments, providing large amounts of high-resolution information about trees, street features and pole-like objects on street sides or near motorways. In this study we investigate a robust segmentation method to extract individual trees automatically in order to build an object-based tree database system. We focused on the large urban parks in Budapest (Margitsziget and Városliget; KARESZ project), which contain a large diversity of tree species. The MLS data comprised high-density point clouds with 1-8 cm mean absolute accuracy at 80-100 m distance from streets. The robust segmentation method comprises the following steps: first, the ground points are determined. As a second step, cylinders are fitted in a vertical slice 1-1.5 m above ground, which is used to determine the potential location of each single tree trunk and cylinder-like object. Finally, residual values are calculated as the deviation of each point from a vertically expanded fitted cylinder; these residual values are used to separate cylinder-like objects from individual trees. After successful parameterization, the model parameters and the corresponding residual values of the fitted objects are extracted and imported into the tree database. Additionally, geometric features are calculated for each segmented individual tree, such as crown base, crown width, crown length, trunk diameter and volume of the individual tree. In the case of incompletely scanned trees, the extraction of geometric features is based on fitted circles. The result of the study is a tree database containing detailed information about urban trees, which can be a valuable dataset for ecologists, city planners, and planting and mapping purposes. Furthermore, the established database will be the starting point for classifying the trees into single species.
    MLS data used in this project had been measured in the framework of
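The circle fitting used for incompletely scanned trunks can be sketched with the algebraic (Kasa) fit, which reduces to a 3x3 linear system. This is an illustrative sketch under that assumption; the study does not specify its exact fitting method, and the helper names are inventions.

```python
def fit_circle(points):
    """Algebraic (Kasa) circle fit to 2-D trunk-slice points.

    Fits x^2 + y^2 = 2a*x + 2b*y + c in the least-squares sense by solving
    the 3x3 normal equations with Cramer's rule; returns centre (a, b)
    and radius r. Works from a partial arc, which is why it suits
    incompletely scanned trunks.
    """
    n = len(points)
    sx = sy = sz = sxx = syy = sxy = sxz = syz = 0.0
    for x, y in points:
        z = x * x + y * y
        sx += x; sy += y; sz += z
        sxx += x * x; syy += y * y; sxy += x * y
        sxz += x * z; syz += y * z
    # normal equations for the unknowns (2a, 2b, c)
    A = [[sxx, sxy, sx], [sxy, syy, sy], [sx, sy, n]]
    rhs = [sxz, syz, sz]

    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    D = det3(A)
    sol = []
    for i in range(3):
        Ai = [row[:] for row in A]
        for j in range(3):
            Ai[j][i] = rhs[j]
        sol.append(det3(Ai) / D)
    a, b, c = sol[0] / 2, sol[1] / 2, sol[2]
    r = (c + a * a + b * b) ** 0.5
    return (a, b), r
```

From the fitted radius, trunk diameter follows directly, and stacking slice fits along the trunk gives the cylinder parameters used in the segmentation step.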

  12. Automated solid-phase extraction coupled online with HPLC-FLD for the quantification of zearalenone in edible oil.

    PubMed

    Drzymala, Sarah S; Weiz, Stefan; Heinze, Julia; Marten, Silvia; Prinz, Carsten; Zimathies, Annett; Garbe, Leif-Alexander; Koch, Matthias

    2015-05-01

    Established maximum levels for the mycotoxin zearalenone (ZEN) in edible oil require monitoring by reliable analytical methods. Therefore, an automated SPE-HPLC online system based on dynamic covalent hydrazine chemistry has been developed. The SPE step comprises a reversible hydrazone formation by ZEN and a hydrazine moiety covalently attached to a solid phase. Seven hydrazine materials with different properties regarding the resin backbone, pore size, particle size, specific surface area, and loading have been evaluated. As a result, a hydrazine-functionalized silica gel was chosen. The final automated online method was validated and applied to the analysis of three maize germ oil samples including a provisionally certified reference material. Important performance criteria for the recovery (70-120%) and precision (RSDr < 25%) as set by Commission Regulation EC 401/2006 were fulfilled: the mean recovery was 78% and RSDr did not exceed 8%. The results of the SPE-HPLC online method were further compared to results obtained by liquid-liquid extraction with stable isotope dilution analysis LC-MS/MS and found to be in good agreement. The developed SPE-HPLC online system with fluorescence detection allows a reliable, accurate, and sensitive quantification (limit of quantification, 30 μg/kg) of ZEN in edible oils while significantly reducing the workload. To our knowledge, this is the first report on an automated SPE-HPLC method based on a covalent SPE approach.

  13. Novel automated extraction method for quantitative analysis of urinary 11-nor-delta(9)-tetrahydrocannabinol-9-carboxylic acid (THC-COOH).

    PubMed

    Fu, Shanlin; Lewis, John

    2008-05-01

    An automated method for extracting the major urinary metabolite of cannabis, 11-nor-Delta(9)-tetrahydrocannabinol-9-carboxylic acid (THC-COOH), was developed on the four-probe Gilson ASPEC XL4™ solid-phase extraction (SPE) system. The method works on liquid-liquid extraction principles but does not require the use of SPE cartridges. The limits of detection and quantitation and the upper limit of linearity (ULOL) of the developed method were found to be 1, 2, and 1,500 ng/mL, respectively. There was no detectable carryover after 10,000 ng/mL analyte. For a batch of 76 samples, the process uses less than 100 mL of methanol, 450 mL of extracting solvent (hexane/ethyl acetate, 5:1, v/v), and 1 L of rinsing solvent (30% methanol in water). The automated extraction process takes 5 h to complete. Precision and accuracy of the method are comparable to both manual liquid-liquid extraction and automated SPE methods. The method has proven to be a simple, speedy, and economical alternative to the currently popular automated SPE method for the quantitative analysis of urinary THC-COOH.

  14. Screening for anabolic steroids in urine of forensic cases using fully automated solid phase extraction and LC-MS-MS.

    PubMed

    Andersen, David W; Linnet, Kristian

    2014-01-01

    A screening method for 18 frequently measured exogenous anabolic steroids and the testosterone/epitestosterone (T/E) ratio in forensic cases has been developed and validated. The method involves fully automated sample preparation, including enzyme treatment, addition of internal standards, and solid phase extraction, followed by analysis by liquid chromatography-tandem mass spectrometry (LC-MS-MS) using electrospray ionization with adduct formation for two compounds. Urine samples from 580 forensic cases were analyzed to determine the T/E ratio and the occurrence of exogenous anabolic steroids. Extraction recoveries ranged from 77 to 95%, matrix effects from 48 to 78%, overall process efficiencies from 40 to 54%, and the lower limit of identification from 2 to 40 ng/mL. Of the 580 urine samples analyzed from routine forensic cases, 17 (2.9%) were found positive for one or more anabolic steroids. Only seven different steroids, including testosterone, were found in the material, suggesting that only a small number of common steroids are likely to occur in a forensic context. The steroids were often present in high concentrations (>100 ng/mL), and a combination of steroids and/or other drugs of abuse was seen in the majority of cases. The method presented serves as a fast and automated screening procedure, demonstrating the suitability of LC-MS-MS for analyzing anabolic steroids.

  15. INVESTIGATION OF ARSENIC SPECIATION ON DRINKING WATER TREATMENT MEDIA UTILIZING AUTOMATED SEQUENTIAL CONTINUOUS FLOW EXTRACTION WITH IC-ICP-MS DETECTION

    EPA Science Inventory

    Three treatment media, used for the removal of arsenic from drinking water, were sequentially extracted using 10mM MgCl2 (pH 8), 10mM NaH2PO4 (pH 7) followed by 10mM (NH4)2C2O4 (pH 3). The media were extracted using an on-line automated continuous extraction system which allowed...

  16. Extraction of hydroxyaromatic compounds in river water by liquid-liquid-liquid microextraction with automated movement of the acceptor and the donor phase.

    PubMed

    Melwanki, Mahaveer B; Huang, Shang-Da

    2006-08-01

    A liquid-liquid-liquid microextraction technique with automated movement of the acceptor and the donor phase is described for the extraction of six hydroxyaromatic compounds in river water using a disposable, ready-to-use hollow fiber. Separation and quantitative analyses were performed using LC with UV detection at 254 nm. Analytes were extracted from the acidified sample solution (donor phase) into the organic solvent impregnated in the pores of the hollow fiber and then back extracted into the alkaline solution (acceptor phase) inside the lumen of the hollow fiber. The fiber was held by a conventional 10 microL LC syringe. The acceptor phase was sandwiched between the plunger and a small volume of the organic solvent (microcap). The acceptor solution was repeatedly moved in and out of the hollow fiber using a syringe pump. This movement brings a fresh acceptor phase into contact with the organic phase, thus enhancing the extraction kinetics and improving enrichment of the analytes. The microcap separates the acceptor phase from the donor phase, in addition to being partially responsible for mass transfer of the analytes from the donor solution to the acceptor solution. Under stirring, fresh donor phase also enters through the open end of the fiber, further contributing to mass transfer. Various parameters affecting the extraction efficiency, viz. the type of organic solvent, extraction time, stirring speed, effect of sodium chloride, and concentration of the donor and acceptor phases, were studied. RSD (3.9-5.6%), correlation coefficient (0.995-0.997), detection limit (2.0-51.2 ng/mL), enrichment factor (339-630), relative recovery (93.2-97.9%), and absolute recovery (33.9-63.0%) were also investigated. The developed method was applied to the analysis of river water.
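    The enrichment and recovery figures of merit above follow from two standard definitions; the helper names and the example values below are illustrative, not taken from the study:

```python
def enrichment_factor(c_acceptor, c_donor_initial):
    """EF: how many-fold the analyte is concentrated into the acceptor phase."""
    return c_acceptor / c_donor_initial

def absolute_recovery(c_acceptor, v_acceptor, c_donor_initial, v_donor):
    """Fraction of the total analyte mass transferred into the acceptor phase."""
    return (c_acceptor * v_acceptor) / (c_donor_initial * v_donor)

# e.g. a 500-fold enrichment from a 4 mL donor into a 4 uL acceptor phase
ef = enrichment_factor(500.0, 1.0)              # concentrations in ng/mL
ar = absolute_recovery(500.0, 0.004, 1.0, 4.0)  # volumes in mL
print(ef, ar)  # 500.0 0.5
```

A high enrichment factor can thus coexist with a modest absolute recovery, exactly the pattern reported above (EF 339-630 versus absolute recovery 33.9-63.0%).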

  17. Chemical-induced disease relation extraction with various linguistic features

    PubMed Central

    Gu, Jinghang; Qian, Longhua; Zhou, Guodong

    2016-01-01

    Understanding the relations between chemicals and diseases is crucial in various biomedical tasks, such as new drug discovery and new therapy development. Manually mining these relations from the biomedical literature is costly and time-consuming, and such a procedure is difficult to keep up to date. To address these issues, the BioCreative-V community proposed a challenging task of automatic extraction of chemical-induced disease (CID) relations in order to benefit biocuration. This article describes our work on the CID relation extraction task of BioCreative-V. We built a machine learning based system that utilizes simple yet effective linguistic features to extract relations with maximum entropy models. In addition to leveraging various features, hypernym relations between entity concepts derived from the Medical Subject Headings (MeSH) controlled vocabulary were employed during both the training and testing stages to obtain more accurate classification models and better extraction performance. We reduced relation extraction between entities in documents to relation extraction between entity mentions. In our system, pairs of chemical and disease mentions at both intra- and inter-sentence levels were first constructed as relation instances for training and testing; two classification models at the two levels were then trained on the training examples and applied to the testing examples. Finally, we merged the classification results from mention level to document level to acquire the final relations between chemicals and diseases. Our system achieved promising F-scores of 60.4% on the development dataset and 58.3% on the test dataset using gold-standard entity annotations. Database URL: https://github.com/JHnlp/BC5CIDTask PMID:27052618
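    The mention-to-document merge can be sketched as below; treating an entity pair as a document-level relation when any of its mention pairs is classified positive is an assumption about the merging rule, and the MeSH identifiers are illustrative:

```python
from collections import defaultdict

def merge_to_document_level(mention_predictions):
    """Merge mention-level classifications into document-level relations.

    mention_predictions: iterable of (chemical_id, disease_id, is_positive)
    tuples, one per chemical/disease mention pair. A document-level CID
    relation is kept if ANY of its mention pairs was classified positive.
    """
    doc_relations = defaultdict(bool)
    for chem, dis, positive in mention_predictions:
        doc_relations[(chem, dis)] |= positive
    return {pair for pair, pos in doc_relations.items() if pos}

preds = [
    ("D008012", "D056486", False),  # intra-sentence mention pair
    ("D008012", "D056486", True),   # same entity pair, another sentence
    ("D000068877", "D003924", False),
]
print(merge_to_document_level(preds))  # {('D008012', 'D056486')}
```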

  19. Direct Sampling and Analysis from Solid Phase Extraction Cards using an Automated Liquid Extraction Surface Analysis Nanoelectrospray Mass Spectrometry System

    SciTech Connect

    Walworth, Matthew J; ElNaggar, Mariam S; Stankovich, Joseph J; Witkowski II, Charles E.; Norris, Jeremy L; Van Berkel, Gary J

    2011-01-01

    Direct liquid extraction based surface sampling, a technique previously demonstrated with continuous flow and autonomous pipette liquid microjunction surface sampling probes, has recently been implemented as the Liquid Extraction Surface Analysis (LESA) mode on the commercially available Advion NanoMate chip-based infusion nanoelectrospray ionization system. In the present paper, the LESA mode was applied to the analysis of 96-well format custom solid phase extraction (SPE) cards, with each well consisting of either a 1 or 2 mm diameter monolithic hydrophobic stationary phase. These substrate wells were conditioned, loaded with either single- or multi-component aqueous mixtures, and read out using the LESA mode of a TriVersa NanoMate or a NanoMate 100 coupled to an ABI/Sciex 4000 QTRAP™ hybrid triple quadrupole/linear ion trap mass spectrometer and a Thermo LTQ XL linear ion trap mass spectrometer. Extraction conditions, including extraction/nanoESI solvent composition, volume, and dwell times, were optimized in the analysis of targeted compounds. Limit of detection and quantitation as well as analysis reproducibility figures of merit were measured. Calibration data were obtained for propranolol using a deuterated internal standard, demonstrating linearity and reproducibility. A 10× increase in signal, together with cleanup of micromolar angiotensin II from a concentrated salt solution, was demonstrated. Additionally, a multicomponent herbicide mixture at ppb concentration levels was analyzed using MS3 spectra for compound identification in the presence of isobaric interferences.

  20. Performance verification of the Maxwell 16 Instrument and DNA IQ Reference Sample Kit for automated DNA extraction of known reference samples.

    PubMed

    Krnajski, Z; Geering, S; Steadman, S

    2007-12-01

    Advances in automation have been made for a number of processes conducted in the forensic DNA laboratory. However, because most robotic systems are designed for high-throughput laboratories batching large numbers of samples, smaller laboratories are left with a limited number of cost-effective options for employing automation. The Maxwell 16 Instrument and DNA IQ Reference Sample Kit marketed by Promega are designed for rapid, automated purification of DNA extracts from sample sets consisting of sixteen or fewer samples. Because the system is based on DNA capture by paramagnetic particles with maximum binding capacity, it is designed to generate extracts with yield consistency. The studies herein enabled evaluation of STR profile concordance, consistency of yield, and cross-contamination performance for the Maxwell 16 Instrument. Results indicate that the system performs suitably for streamlining the process of extracting known reference samples generally used for forensic DNA analysis and has many advantages in a small or moderate-sized laboratory environment.

  1. Sequential Chromospheric Brightening: An Automated Approach to Extracting Physics from Ephemeral Brightening

    DTIC Science & Technology

    2012-10-17

    …propose a connection of the small-scale features to solar flares. Our automated routine detects and distinguishes three separate types of brightening…

  2. A high yield DNA extraction method for medically important Candida species: A comparison of manual versus QIAcube-based automated system.

    PubMed

    Das, P; Pandey, P; Harishankar, A; Chandy, M; Bhattacharya, S

    2016-01-01

    The prognosis of individuals with candidemia depends on rapid and precise diagnosis, which enables treatment to be optimised. Three fungal DNA extraction protocols were compared in this study for medically important Candida species. The quality and quantity of the DNA extracted by the physical, chemical, and automated protocols were compared using a NanoDrop ND-2000 spectrophotometer. The yield and purity (260/230 ratio) of the extracted DNA were significantly higher with the physical treatment-based protocol than with the chemical-based or automated protocol. Real-time polymerase chain reaction on the extracted DNA showed an analytical sensitivity of 10(3) cfu/mL. The results of this study suggest that physical treatment is the most successful extraction technique of the three protocols.

  3. Online in situ analysis of selected semi-volatile organic compounds in water by automated microscale solid-phase extraction with large-volume injection/gas chromatography/mass spectrometry.

    PubMed

    Li, Yongtao; George, John E; McCarty, Christina L

    2007-12-28

    A fully automated analytical method was developed for the online in situ analysis of selected semi-volatile organic compounds in water. The method used a large-volume injection/gas chromatography/mass spectrometry coupled with a fully automated microscale solid-phase extraction technique, which was based on x-y-z robotic techniques. Water samples were extracted by using a 96-well solid-phase extraction plate. For most analytes included in this study, the obtained linear calibrations ranged from 0.05 to 5.0 microg/L with correlation coefficients of 0.996-1.000, the method detection limits were less than 0.1 microg/L, and the relative recoveries were in the range of 70-120% with a relative standard deviation of less than 15% for fortified reagent water samples. The applications to chlorinated tap water, well water, and river water have been validated. The obtained results were similar to those resulting from fortified reagent water samples for all analytes except metribuzin, bromacil, aldrin, and methoxychlor. Matrix effects were observed for these analytes. In general, this fully automated analytical method was rugged, reliable, and easy to operate, and was capable of providing real-time data to water treatment and distribution systems as well as water reservation and protection systems. In addition, the method could reduce the analytical costs associated with sample collection, transportation, storage, and preparation.
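    Linearity over a calibration range like the one above is typically assessed with an ordinary least-squares fit and its correlation coefficient; the data points below are hypothetical, not from the study:

```python
import numpy as np

def calibrate(concs, responses):
    """Fit a linear calibration curve and report slope, intercept, and r."""
    slope, intercept = np.polyfit(concs, responses, 1)
    r = np.corrcoef(concs, responses)[0, 1]
    return slope, intercept, r

# hypothetical calibration standards spanning the 0.05-5.0 microg/L range
concs = np.array([0.05, 0.1, 0.5, 1.0, 2.5, 5.0])         # microg/L
responses = np.array([0.9, 2.1, 10.2, 19.8, 50.5, 99.0])  # peak areas
slope, intercept, r = calibrate(concs, responses)
print(round(r, 3))  # an r of 0.996-1.000 indicates acceptable linearity
```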

  4. Liquid-liquid-liquid microextraction with automated movement of the acceptor and the donor phase for the extraction of phenoxyacetic acids prior to liquid chromatography detection.

    PubMed

    Chen, Chung-Chiang; Melwanki, Mahaveer B; Huang, Shang-Da

    2006-02-03

    A simple liquid-liquid-liquid microextraction technique with automated movement of the acceptor and the donor phase (LLLME/AMADP) is described for the quantitative determination of five phenoxyacetic acids in water using a disposable, ready-to-use hollow fiber. The target compounds were extracted from the acidified sample solution (donor phase) into the organic solvent residing in the pores of the hollow fiber and then back extracted into the alkaline solution (acceptor phase) inside the lumen of the hollow fiber. The fiber was held by a conventional 10-microl syringe. The acceptor phase was sandwiched between the plunger and a small volume of the organic solvent (microcap). The acceptor solution was repeatedly moved in and out of the hollow fiber by a programmable syringe pump. This repeated movement brings a fresh acceptor phase into contact with the organic phase, thus enhancing the extraction kinetics and leading to high enrichment of the analytes. The microcap separates the aqueous acceptor phase from the donor phase, in addition to being partially responsible for mass transfer of the analytes from the donor solution (which moves in and out of the hollow fiber through its open end) to the acceptor solution. Separation and quantitative analyses were then performed using liquid chromatography (LC) with ultraviolet (UV) detection at 280 nm. Various parameters affecting the extraction efficiency, viz. the type of organic solvent used for immobilization in the pores of the hollow fiber, extraction time, stirring speed, effect of sodium chloride, and concentration of the donor and acceptor phases, were studied. Repeatability (RSD, 3.2-7.4%), correlation coefficient (0.996-0.999), detection limit (0.2-2.8 ng ml(-1)), and enrichment factors (129-240) were also investigated. Relative recoveries (87-101%) and absolute recoveries (4.6-13%) have also been calculated. The developed method was applied to the analysis of river water.

  5. Characterization and Application of Superlig 620 Solid Phase Extraction Resin for Automated Process Monitoring of 90Sr

    SciTech Connect

    Devol, Timothy A.; Clements, John P.; Farawila, Anne F.; O'Hara, Matthew J.; Egorov, Oleg; Grate, Jay W.

    2009-11-30

    Characterization of SuperLig® 620 solid phase extraction resin was performed in order to develop an automated on-line process monitor for 90Sr. The main focus was on strontium separation from barium, with the goal of developing an automated separation process for 90Sr in high-level wastes. High-level waste contains significant 137Cs activity, of which 137mBa is of great concern as an interference to the quantification of strontium. In addition barium, yttrium and plutonium were studied as potential interferences to strontium uptake and detection. A number of complexants were studied in a series of batch Kd experiments, as SuperLig® 620 was not previously known to elute strontium in typical mineral acids. The optimal separation was found using a 2M nitric acid load solution with a strontium elution step of ~0.49M ammonium citrate and a barium elution step of ~1.8M ammonium citrate. 90Sr quantification of Hanford high-level tank waste was performed on a sequential injection analysis microfluidics system coupled to a flow-cell detector. The results of the on-line procedure are compared to standard radiochemical techniques in this paper.
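    The batch Kd experiments mentioned above use the standard distribution-coefficient formula; a minimal sketch, with illustrative values rather than the study's data:

```python
def batch_kd(c0, ceq, volume_ml, mass_g):
    """Distribution coefficient from a batch contact experiment.

    c0, ceq: solution concentration (or activity) before and after contact;
    volume_ml: solution volume in mL; mass_g: dry resin mass in g.
    Kd = ((C0 - Ceq) / Ceq) * (V / m), in mL/g.
    """
    return (c0 - ceq) / ceq * (volume_ml / mass_g)

# e.g. 90% strontium uptake from 10 mL of solution onto 0.1 g of resin
print(batch_kd(100.0, 10.0, 10.0, 0.1))  # 900.0 mL/g
```

Screening Kd values across complexant and acid conditions in this way is what identifies load and elution solutions such as the 2M nitric acid load and ammonium citrate elution steps described above.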

  6. Automated Control of the Organic and Inorganic Composition of Aloe vera Extracts Using (1)H NMR Spectroscopy.

    PubMed

    Monakhova, Yulia B; Randel, Gabriele; Diehl, Bernd W K

    2016-09-01

    The recent classification of Aloe vera whole-leaf extract by the International Agency for Research on Cancer as possibly carcinogenic to humans, as well as the continuing adulteration of authentic A. vera material, has generated renewed interest in controlling A. vera. The existing NMR spectroscopic method for the analysis of A. vera, based on a routine developed at Spectral Service, was extended. Apart from aloverose, glucose, malic acid, lactic acid, citric acid, whole-leaf material (WLM), acetic acid, fumaric acid, sodium benzoate, and potassium sorbate, the quantification of Mg(2+), Ca(2+), and fructose is possible with the addition of a Cs-EDTA solution to the sample. The proposed methodology was automated and includes phasing, baseline correction, deconvolution (based on the Lorentzian function), integration, quantification, and reporting. The NMR method was applied to 41 A. vera preparations in the form of liquid A. vera juice and solid A. vera powder. The advantages of the new NMR methodology over the previous method are discussed. Correlation between the new and standard NMR methodologies was significant for aloverose, glucose, malic acid, lactic acid, citric acid, and WLM (P < 0.0001, R(2) = 0.99). NMR was found to be suitable for the automated simultaneous quantitative determination of 13 parameters in A. vera.
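    The deconvolution-then-integration steps rest on the Lorentzian line shape, whose peak area has a closed form (amplitude × half-width × π). A minimal numeric check of that relationship, not the authors' pipeline:

```python
import numpy as np

def lorentzian(x, amplitude, center, hwhm):
    """Lorentzian line shape used to model NMR peaks during deconvolution."""
    return amplitude * hwhm**2 / ((x - center) ** 2 + hwhm**2)

# Quantification step: integrating a resolved peak should recover the
# analytic area amplitude * hwhm * pi.
x = np.linspace(-500.0, 500.0, 200_001)          # ppm-like axis
peak = lorentzian(x, amplitude=1.0, center=3.4, hwhm=0.5)
area_numeric = peak.sum() * (x[1] - x[0])        # simple Riemann sum
area_exact = 1.0 * 0.5 * np.pi
print(abs(area_numeric - area_exact) < 1e-2)  # True
```

In practice each resonance's fitted area is converted to a concentration against a reference signal; the slow 1/x² tails of the Lorentzian are why the integration window here is deliberately wide.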

  7. Automated 96-well solid phase extraction and hydrophilic interaction liquid chromatography-tandem mass spectrometric method for the analysis of cetirizine (ZYRTEC) in human plasma--with emphasis on method ruggedness.

    PubMed

    Song, Qi; Junga, Heiko; Tang, Yong; Li, Austin C; Addison, Tom; McCort-Tipton, Melanie; Beato, Brian; Naidong, Weng

    2005-01-05

    A high-throughput bioanalytical method based on automated sample transfer, automated solid phase extraction, and hydrophilic interaction liquid chromatography-tandem mass spectrometry (HILIC-MS/MS) analysis has been developed for the determination of cetirizine, a selective H(1)-receptor antagonist. Deuterated cetirizine (cetirizine-d(8)) was synthesized as described and used as the internal standard. Samples were transferred into 96-well plates using an automated sample handling system. Automated solid phase extraction was carried out using a 96-channel programmable liquid-handling workstation. A solid phase extraction 96-well plate on polymer sorbent (Strata X) was used to extract the analyte. The extracted samples were injected onto a Betasil silica column (50 mm x 3 mm, 5 microm) using a mobile phase of acetonitrile-water-acetic acid-trifluoroacetic acid (93:7:1:0.025, v/v/v/v) at a flow rate of 0.5 ml/min. The chromatographic run time is 2.0 min per injection, with retention times of cetirizine and cetirizine-d(8) both at 1.1 min. The system consisted of a Shimadzu HPLC system and a PE Sciex API 3000 or API 4000 tandem mass spectrometer with (+) ESI. The method has been validated over the concentration range of 1.00-1000 ng/ml cetirizine in human plasma, based on a 0.10-ml sample size. The inter-day precision and accuracy of the quality control (QC) samples demonstrated <3.0% relative standard deviation (R.S.D.) and <6.0% relative error (RE). Stability of cetirizine in stock solution, in plasma, and in reconstitution solution was established. The absolute extraction recovery was 85.8%, 84.5%, and 88.0% at 3, 40, and 800 ng/ml, respectively. The recovery for the internal standard was 84.1%. No adverse matrix effects were noticed for this assay. The automation of the sample preparation steps not only increased the analysis throughput but also increased method ruggedness. The use of a stable isotope-labeled internal standard further improved the method ruggedness.

  8. Prospective evaluation of a new automated nucleic acid extraction system using routine clinical respiratory specimens.

    PubMed

    Mengelle, C; Mansuy, J-M; Sandres-Sauné, K; Barthe, C; Boineau, J; Izopet, J

    2012-06-01

    The aim of the study was to evaluate the MagNA Pure 96™ nucleic acid extraction system on clinical respiratory specimens for identifying viruses by qualitative real-time PCR assays. Three extraction methods, the MagNA Pure LC™, the COBAS Ampliprep™, and the MagNA Pure 96™, were tested with 10-fold dilutions of an influenza A(H1N1)pdm09 sample. Two hundred thirty-nine respiratory specimens (35 throat swabs, 164 nasopharyngeal specimens, and 40 broncho-alveolar fluids) were extracted with the MagNA Pure 96™ and the COBAS Ampliprep™ instruments. Forty COBAS Ampliprep™-positive samples were also tested. Real-time PCRs were used to identify influenza A and influenza A(H1N1)pdm09, rhinovirus, enterovirus, adenovirus, varicella zoster virus, cytomegalovirus, and herpes simplex virus. Similar results were obtained on RNA extracted from dilutions of influenza A(H1N1)pdm09 with the three systems. Data from clinical respiratory specimens extracted with the MagNA Pure 96™ and COBAS Ampliprep™ instruments were in 98.5% agreement (P < 0.0001) for influenza A and influenza A(H1N1)pdm09. Data were in 97.3% agreement for rhinovirus (P < 0.0001) and 96.8% for enterovirus. They were in 100% agreement for adenovirus. Data for cytomegalovirus and HSV1-2 were in 95.2% agreement (P < 0.0001). The MagNA Pure 96™ instrument is easy to use, reliable, and has a high throughput for extracting total nucleic acid from respiratory specimens. These extracts are suitable for molecular diagnosis with any type of real-time PCR assay.

  9. Background Knowledge in Learning-Based Relation Extraction

    ERIC Educational Resources Information Center

    Do, Quang Xuan

    2012-01-01

    In this thesis, we study the importance of background knowledge in relation extraction systems. We not only demonstrate the benefits of leveraging background knowledge to improve the systems' performance but also propose a principled framework that allows one to effectively incorporate knowledge into statistical machine learning models for…

  10. Extracting infrared absolute reflectance from relative reflectance measurements.

    PubMed

    Berets, Susan L; Milosevic, Milan

    2012-06-01

    Absolute reflectance measurements are valuable to the optics industry for development of new materials and optical coatings. Yet, absolute reflectance measurements are notoriously difficult to make. In this paper, we investigate the feasibility of extracting the absolute reflectance from a relative reflectance measurement using a reference material with known refractive index.

  11. An automated system for retrieving herb-drug interaction related articles from MEDLINE

    PubMed Central

    Lin, Kuo; Friedman, Carol; Finkelstein, Joseph

    2016-01-01

    An automated, user-friendly, and accurate system for retrieving herb-drug interaction (HDI)-related articles in MEDLINE can increase patient safety and improve both the speed and the experience of physicians' article retrieval. Previous studies show that MeSH-based queries associated with negative effects of drugs can be customized, with good performance in retrieving relevant information, but no study has focused on herb-drug interactions. This paper drew on the characteristics of HDI-related papers to create a multilayer HDI article searching system, which achieved a sensitivity of 92% at a precision of 93% in a preliminary evaluation. Instead of requiring physicians to conduct PubMed searches directly, the system takes a more user-friendly approach by enhancing PubMed queries behind the scenes, shielding users from having to write queries, deal with PubMed, or read many irrelevant articles. The system provides automated processing and outputs target articles based on the input. PMID:27570662
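    A query-enhancing layer of this kind can be sketched with NCBI's E-utilities; the interaction filter terms below are illustrative assumptions, not the system's actual query:

```python
from urllib.parse import urlencode

ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def build_hdi_query(herb, drug):
    """Compose a PubMed ESearch URL combining herb and drug terms with
    interaction-related MeSH filters (the filter terms are illustrative)."""
    term = (f'("{herb}"[Title/Abstract] OR "{herb}"[MeSH Terms]) AND '
            f'("{drug}"[Title/Abstract]) AND '
            f'("herb-drug interactions"[MeSH Terms] OR '
            f'"drug interactions"[MeSH Terms])')
    return f"{ESEARCH}?{urlencode({'db': 'pubmed', 'term': term, 'retmode': 'json'})}"

# The user supplies only the herb and drug names; the query syntax is hidden.
url = build_hdi_query("St. John's Wort", "warfarin")
print(url)
```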

  12. Automated extraction and analysis of rock discontinuity characteristics from 3D point clouds

    NASA Astrophysics Data System (ADS)

    Bianchetti, Matteo; Villa, Alberto; Agliardi, Federico; Crosta, Giovanni B.

    2016-04-01

    A reliable characterization of fractured rock masses requires an exhaustive geometrical description of discontinuities, including orientation, spacing, and size. These are required to describe discontinuum rock mass structure, perform Discrete Fracture Network and DEM modelling, or provide input for rock mass classification or equivalent continuum estimate of rock mass properties. Although several advanced methodologies have been developed in the last decades, a complete characterization of discontinuity geometry in practice is still challenging, due to scale-dependent variability of fracture patterns and difficult accessibility to large outcrops. Recent advances in remote survey techniques, such as terrestrial laser scanning and digital photogrammetry, allow a fast and accurate acquisition of dense 3D point clouds, which promoted the development of several semi-automatic approaches to extract discontinuity features. Nevertheless, these often need user supervision on algorithm parameters which can be difficult to assess. To overcome this problem, we developed an original Matlab tool, allowing fast, fully automatic extraction and analysis of discontinuity features with no requirements on point cloud accuracy, density and homogeneity. The tool consists of a set of algorithms which: (i) process raw 3D point clouds, (ii) automatically characterize discontinuity sets, (iii) identify individual discontinuity surfaces, and (iv) analyse their spacing and persistence. The tool operates in either a supervised or unsupervised mode, starting from an automatic preliminary exploration data analysis. The identification and geometrical characterization of discontinuity features is divided in steps. First, coplanar surfaces are identified in the whole point cloud using K-Nearest Neighbor and Principal Component Analysis algorithms optimized on point cloud accuracy and specified typical facet size. 
Then, discontinuity set orientation is calculated using Kernel Density Estimation and
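    The first step above (coplanar-surface detection via K-Nearest Neighbor and Principal Component Analysis) can be sketched for a single point neighborhood; this illustrates the general technique, not the authors' Matlab tool:

```python
import numpy as np

def local_plane_normal(points):
    """Estimate the plane normal of a point neighborhood via PCA.

    The eigenvector of the covariance matrix with the smallest eigenvalue
    is the surface normal; a near-zero smallest eigenvalue flags the
    neighborhood as coplanar, i.e. a candidate discontinuity facet.
    """
    centered = points - points.mean(axis=0)
    cov = centered.T @ centered / len(points)
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    normal = eigvecs[:, 0]                   # direction of least variance
    planarity_residual = eigvals[0]          # ~0 for a perfectly flat facet
    return normal, planarity_residual

# Points sampled on the z = 0 plane should give a normal along +/- z.
rng = np.random.default_rng(0)
pts = np.c_[rng.uniform(-1, 1, 500), rng.uniform(-1, 1, 500), np.zeros(500)]
normal, residual = local_plane_normal(pts)
print(abs(normal[2]))  # ~1.0: normal aligned with the z axis
```

Repeating this over k-nearest-neighbor neighborhoods and clustering the resulting normals is what yields the discontinuity sets whose orientations are then analysed.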

  13. Coreference based event-argument relation extraction on biomedical text

    PubMed Central

    2011-01-01

    This paper presents a new approach to exploit coreference information for extracting event-argument (E-A) relations from biomedical documents. This approach has two advantages: (1) it can extract a large number of valuable E-A relations based on the concept of salience in discourse; (2) it enables us to identify E-A relations over sentence boundaries (cross-links) using transitivity of coreference relations. We propose two coreference-based models: a pipeline based on Support Vector Machine (SVM) classifiers, and a joint Markov Logic Network (MLN). We show the effectiveness of these models on a biomedical event corpus. Both models outperform the systems that do not use coreference information. When the two proposed models are compared to each other, joint MLN outperforms pipeline SVM with gold coreference information. PMID:22166257

  14. Table Extraction from Web Pages Using Conditional Random Fields to Extract Toponym Related Data

    NASA Astrophysics Data System (ADS)

    Luthfi Hanifah, Hayyu’; Akbar, Saiful

    2017-01-01

    Tables are one of the ways to visualize information on web pages. The abundant number of web pages that compose the World Wide Web has motivated research on information extraction and information retrieval, including research on table extraction. In addition, there is a need for a system designed specifically to handle location-related information. Against this background, this research provides a way to extract location-related data from web tables so that the data can be used in the development of a Geographic Information Retrieval (GIR) system. Location-related data are identified by the toponym (location name). In this research, a rule-based approach with a gazetteer is used to recognize toponyms in web tables. Meanwhile, to extract data from a table, a combination of a rule-based approach and a statistical approach is used; in the statistical approach, a Conditional Random Fields (CRF) model is used to understand the schema of the table. The result of table extraction is presented in JSON format. If a web table contains a toponym, a field is added to the JSON document to store the toponym values. This field can be used to index the table data in accordance with the toponym, which can then be used in the development of a GIR system.
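    The JSON output described above might look like the following sketch; the gazetteer entries, field names, and sample table are illustrative:

```python
import json

GAZETTEER = {"Jakarta", "Bandung", "Surabaya"}  # illustrative gazetteer

def table_to_json(header, rows):
    """Serialize an extracted web table; add a 'toponym' field listing
    any cell values found in the gazetteer, as described above."""
    records = [dict(zip(header, row)) for row in rows]
    toponyms = sorted({cell for row in rows for cell in row
                       if cell in GAZETTEER})
    doc = {"table": records}
    if toponyms:               # the field is added only when a toponym exists
        doc["toponym"] = toponyms
    return json.dumps(doc, ensure_ascii=False)

out = table_to_json(["city", "population"],
                    [["Jakarta", "10.6M"], ["Bogor", "1.1M"]])
print(out)  # {"table": [...], "toponym": ["Jakarta"]}
```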

  15. [Corrected Title: Solid-Phase Extraction of Polar Compounds from Water] Automated Electrostatics Environmental Chamber

    NASA Technical Reports Server (NTRS)

    Sauer, Richard; Rutz, Jeffrey; Schultz, John

    2005-01-01

    A solid-phase extraction (SPE) process has been developed for removing alcohols, carboxylic acids, aldehydes, ketones, amines, and other polar organic compounds from water. This process can be either a subprocess of a water-reclamation process or a means of extracting organic compounds from water samples for gas-chromatographic analysis. This SPE process is an attractive alternative to an Environmental Protection Agency liquid-liquid extraction process that generates some pollution and does not work in a microgravitational environment. In this SPE process, one forces a water sample through a resin bed by use of positive pressure on the upstream side and/or suction on the downstream side, thereby causing organic compounds from the water to be adsorbed onto the resin. If gas-chromatographic analysis is to be done, the resin is dried by use of a suitable gas, then the adsorbed compounds are extracted from the resin by use of a solvent. Unlike the liquid-liquid process, the SPE process works in both microgravity and Earth gravity. In comparison with the liquid-liquid process, the SPE process is more efficient, extracts a wider range of organic compounds, generates less pollution, and costs less.

  16. Automated extraction of information on chemical-P-glycoprotein interactions from the literature.

    PubMed

    Yoshida, Shuya; Yamashita, Fumiyoshi; Ose, Atsushi; Maeda, Kazuya; Sugiyama, Yuichi; Hashida, Mitsuru

    2013-10-28

    Knowledge of the interactions between drugs and transporters is important for drug discovery and development as well as for the evaluation of their clinical safety. We recently developed a text-mining system for the automatic extraction of information on chemical-CYP3A4 interactions from the literature. This system is based on natural language processing and can extract chemical names and their interaction patterns according to sentence context. The present study aimed to extend this system to the extraction of information regarding chemical-transporter interactions. For this purpose, the key verb list designed for cytochrome P450 enzymes was replaced with that for known drug transporters. The performance of the system was then tested by examining the accuracy of information on chemical-P-glycoprotein (P-gp) interactions extracted from randomly selected PubMed abstracts. The system achieved 89.8% recall and 84.2% precision for the identification of chemical names and 71.7% recall and 78.6% precision for the extraction of chemical-P-gp interactions.
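    The key-verb idea can be sketched in a few lines (this is not the authors' NLP system, which parses sentence context): a sentence is flagged when it contains a known chemical, a transporter name, and an interaction verb. The word lists and the example sentence are illustrative assumptions.

```python
# Toy key-verb matcher for chemical-transporter interactions.
# CHEMICALS, TRANSPORTERS, and KEY_VERBS are fabricated examples.
import re

CHEMICALS = {"verapamil", "digoxin"}
TRANSPORTERS = {"p-glycoprotein", "p-gp"}
KEY_VERBS = {"inhibited", "transported", "induced"}

def extract_interactions(sentence):
    tokens = set(re.findall(r"[a-z\-]+", sentence.lower()))
    chems, trans, verbs = (CHEMICALS & tokens, TRANSPORTERS & tokens,
                           KEY_VERBS & tokens)
    # emit a triple for every co-occurring chemical/verb/transporter
    return [(c, v, t) for c in chems for v in verbs for t in trans]

hits = extract_interactions("Verapamil inhibited P-glycoprotein in vitro.")
print(hits)
```

    A real system would additionally resolve negation and sentence structure, which is why the paper's recall and precision are measured per extracted interaction rather than per co-occurrence.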

  17. Automation of static and dynamic non-dispersive liquid phase microextraction. Part 1: Approaches based on extractant drop-, plug-, film- and microflow-formation.

    PubMed

    Alexovič, Michal; Horstkotte, Burkhard; Solich, Petr; Sabo, Ján

    2016-02-04

    Simplicity, effectiveness, swiftness, and environmental friendliness - these are typical requirements for the state-of-the-art development of green analytical techniques. Liquid phase microextraction (LPME) denotes a family of elegant sample pretreatment and analyte preconcentration techniques preserving these principles in numerous applications. By using only fractions of the solvent and sample volumes of classical liquid-liquid extraction, the extraction kinetics, the preconcentration factor, and the cost efficiency can be increased. Moreover, significant improvements can be made by automation, which is still a hot topic in analytical chemistry. This review comprehensively surveys, in two parts, developments in the automation of non-dispersive LPME methodologies performed in static and dynamic modes. Their advantages and limitations and the reported analytical performances are discussed and put into perspective with the corresponding manual procedures. The automation strategies, techniques, and their operational advantages as well as their potentials are further described and discussed. In this first part, an introduction to LPME, its static and dynamic operation modes, and its automation methodologies is given. The LPME techniques are classified according to the different approaches to protecting the extraction solvent, using either a tip-like (needle/tube/rod) support (drop-based approaches), a wall support (film-based approaches), or microfluidic devices. In the second part, the LPME techniques based on porous supports for the extraction solvent, such as membranes and porous media, are overviewed. An outlook on future demands and perspectives in this promising area of analytical chemistry is finally given.

  18. Automated Template-based Brain Localization and Extraction for Fetal Brain MRI Reconstruction.

    PubMed

    Tourbier, Sébastien; Velasco-Annis, Clemente; Taimouri, Vahid; Hagmann, Patric; Meuli, Reto; Warfield, Simon K; Cuadra, Meritxell Bach; Gholipour, Ali

    2017-04-10

    Most fetal brain MRI reconstruction algorithms rely only on brain tissue-relevant voxels of low-resolution (LR) images to enhance the quality of inter-slice motion correction and image reconstruction. Consequently, the fetal brain needs to be localized and extracted as a first step, which is usually a laborious and time-consuming manual or semi-automatic task. In this work we have proposed using age-matched template images as prior knowledge to automate brain localization and extraction. This has been achieved through a novel automatic brain localization and extraction method based on robust template-to-slice block matching and deformable slice-to-template registration. Our template-based approach has also enabled the reconstruction of fetal brain images in standard radiological anatomical planes in a common coordinate space. We have integrated this approach into our new reconstruction pipeline that involves intensity normalization, inter-slice motion correction, and super-resolution (SR) reconstruction. To this end we have adopted a novel approach based on projection of every slice of the LR brain masks into the template space using a fusion strategy. This has enabled the refinement of brain masks in the LR images at each motion correction iteration. The overall brain localization and extraction algorithm has been shown to produce brain masks that are very close to manually drawn brain masks, with an average Dice overlap measure of 94.5%. We have also demonstrated that adopting slice-to-template registration and propagation of the brain mask slice-by-slice leads to a significant improvement in brain extraction performance compared to global rigid brain extraction, and consequently in the quality of the final reconstructed images. Ratings performed by two expert observers show that the proposed pipeline can achieve similar reconstruction quality to reference reconstruction based on manual slice-by-slice brain extraction. The proposed brain mask refinement and

  19. Automated biphasic morphological assessment of hepatitis B-related liver fibrosis using second harmonic generation microscopy

    NASA Astrophysics Data System (ADS)

    Wang, Tong-Hong; Chen, Tse-Ching; Teng, Xiao; Liang, Kung-Hao; Yeh, Chau-Ting

    2015-08-01

    Liver fibrosis assessment by biopsy and conventional staining scores is based on histopathological criteria. Variations in sample preparation and the use of semi-quantitative histopathological methods commonly result in discrepancies between medical centers. Thus, minor changes in liver fibrosis might be overlooked in multi-center clinical trials, leading to statistically non-significant data. Here, we developed a computer-assisted, fully automated, staining-free method for hepatitis B-related liver fibrosis assessment. In total, 175 liver biopsies were divided into training (n = 105) and verification (n = 70) cohorts. Collagen was observed using second harmonic generation (SHG) microscopy without prior staining, and hepatocyte morphology was recorded using two-photon excitation fluorescence (TPEF) microscopy. The training cohort was utilized to establish a quantification algorithm. Eleven of 19 computer-recognizable SHG/TPEF microscopic morphological features were significantly correlated with the ISHAK fibrosis stages (P < 0.001). A biphasic scoring method was applied, combining support vector machine and multivariate generalized linear models to assess the early and late stages of fibrosis, respectively, based on these parameters. The verification cohort was used to verify the scoring method, and the area under the receiver operating characteristic curve was >0.82 for liver cirrhosis detection. Since no subjective gradings are needed, interobserver discrepancies could be avoided using this fully automated method.

  20. Progress in automated extraction and purification of in situ 14C from quartz: Results from the Purdue in situ 14C laboratory

    NASA Astrophysics Data System (ADS)

    Lifton, Nathaniel; Goehring, Brent; Wilson, Jim; Kubley, Thomas; Caffee, Marc

    2015-10-01

    Current extraction methods for in situ 14C from quartz [e.g., Lifton et al., (2001), Pigati et al., (2010), Hippe et al., (2013)] are time-consuming and repetitive, making them an attractive target for automation. We report on the status of in situ 14C extraction and purification systems originally automated at the University of Arizona that have now been reconstructed and upgraded at the Purdue Rare Isotope Measurement Laboratory (PRIME Lab). The Purdue in situ 14C laboratory builds on the flow-through extraction system design of Pigati et al. (2010), automating most of the procedure by retrofitting existing valves with external servo-controlled actuators, regulating the pressure of research purity O2 inside the furnace tube via a PID-based pressure controller in concert with an inlet mass flow controller, and installing an automated liquid N2 distribution system, all driven by LabView® software. A separate system for cryogenic CO2 purification, dilution, and splitting is also fully automated, ensuring a highly repeatable process regardless of the operator. We present results from procedural blanks and an intercomparison material (CRONUS-A), as well as results of experiments to increase the amount of material used in extraction, from the standard 5 g to 10 g or above. Results thus far are quite promising with procedural blanks comparable to previous work and significant improvements in reproducibility for CRONUS-A measurements. The latter analyses also demonstrate the feasibility of quantitative extraction of in situ 14C from sample masses up to 10 g. Our lab is now analyzing unknowns routinely, but lowering overall blank levels is the focus of ongoing research.

  1. Automated reference region extraction and population-based input function for brain [11C]TMSX PET image analyses

    PubMed Central

    Rissanen, Eero; Tuisku, Jouni; Luoto, Pauliina; Arponen, Eveliina; Johansson, Jarkko; Oikonen, Vesa; Parkkola, Riitta; Airas, Laura; Rinne, Juha O

    2015-01-01

    [11C]TMSX ([7-N-methyl-11C]-(E)-8-(3,4,5-trimethoxystyryl)-1,3,7-trimethylxanthine) is a selective adenosine A2A receptor (A2AR) radioligand. In the central nervous system (CNS), A2AR are linked to dopamine D2 receptor function in striatum, but they are also important modulators of inflammation. The gold standard for kinetic modeling of brain [11C]TMSX positron emission tomography (PET) is to obtain the arterial input function via arterial blood sampling. However, this method is laborious, prone to errors and unpleasant for study subjects. The aim of this work was to evaluate alternative input function acquisition methods for brain [11C]TMSX PET imaging. First, a noninvasive, automated method for the extraction of a gray matter reference region using supervised clustering (SCgm) was developed. Second, a method for obtaining a population-based arterial input function (PBIF) was implemented. These methods were created using data from 28 study subjects (7 healthy controls, 12 multiple sclerosis patients, and 9 patients with Parkinson's disease). The results with PBIF correlated well with the original plasma input, and the SCgm yielded similar results compared with cerebellum as a reference region. The clustering method for extracting the reference region and the population-based approach for acquiring input for dynamic [11C]TMSX brain PET image analyses appear to be feasible and robust methods that can be applied in patients with CNS pathology. PMID:25370856

  3. A Simple Method for Automated Solid Phase Extraction of Water Samples for Immunological Analysis of Small Pollutants.

    PubMed

    Heub, Sarah; Tscharner, Noe; Kehl, Florian; Dittrich, Petra S; Follonier, Stéphane; Barbe, Laurent

    2016-01-01

    A new method for solid phase extraction (SPE) of environmental water samples is proposed. The developed prototype is cost-efficient and user-friendly, and enables rapid, automated and simple SPE to be performed. The pre-concentrated solution is compatible with analysis by immunoassay, with a low organic solvent content. A method is described for the extraction and pre-concentration of the natural hormone 17β-estradiol in 100 ml water samples. Reverse phase SPE is performed with octadecyl-silica sorbent and elution is done with 200 µl of methanol 50% v/v. The eluent is diluted with de-ionized water to lower the amount of methanol. After preparing the SPE column manually, the overall procedure is performed automatically within 1 hr. At the end of the process, the estradiol concentration is measured by using a commercial enzyme-linked immunosorbent assay (ELISA). A 100-fold pre-concentration is achieved and the methanol content is only 10% v/v. Full recoveries of the molecule are achieved with 1 ng/L spiked de-ionized and synthetic sea water samples.

  4. Fully automated Liquid Extraction-Based Surface Sampling and Ionization Using a Chip-Based Robotic Nanoelectrospray Platform

    SciTech Connect

    Kertesz, Vilmos; Van Berkel, Gary J

    2010-01-01

    A fully automated liquid extraction-based surface sampling device utilizing an Advion NanoMate chip-based infusion nanoelectrospray ionization system is reported. Analyses were enabled for discrete spot sampling by using the Advanced User Interface of the current commercial control software. This software interface provided the parameter control necessary for the NanoMate robotic pipettor to both form and withdraw a liquid microjunction for sampling from a surface. The system was tested with three types of analytically important sample surface types, viz., spotted sample arrays on a MALDI plate, dried blood spots on paper, and whole-body thin tissue sections from drug dosed mice. The qualitative and quantitative data were consistent with previous studies employing other liquid extraction-based surface sampling techniques. The successful analyses performed here utilized the hardware and software elements already present in the NanoMate system developed to handle and analyze liquid samples. Implementation of an appropriate sample (surface) holder, a solvent reservoir, faster movement of the robotic arm, finer control over solvent flow rate when dispensing and retrieving the solution at the surface, and the ability to select any location on a surface to sample from would improve the analytical performance and utility of the platform.

  5. Concept recognition for extracting protein interaction relations from biomedical text

    PubMed Central

    Baumgartner, William A; Lu, Zhiyong; Johnson, Helen L; Caporaso, J Gregory; Paquette, Jesse; Lindemann, Anna; White, Elizabeth K; Medvedeva, Olga; Cohen, K Bretonnel; Hunter, Lawrence

    2008-01-01

    Background: Reliable information extraction applications have been a long sought goal of the biomedical text mining community, a goal that if reached would provide valuable tools to benchside biologists in their increasingly difficult task of assimilating the knowledge contained in the biomedical literature. We present an integrated approach to concept recognition in biomedical text. Concept recognition provides key information that has been largely missing from previous biomedical information extraction efforts, namely direct links to well defined knowledge resources that explicitly cement the concept's semantics. The BioCreative II tasks discussed in this special issue have provided a unique opportunity to demonstrate the effectiveness of concept recognition in the field of biomedical language processing. Results: Through the modular construction of a protein interaction relation extraction system, we present several use cases of concept recognition in biomedical text, and relate these use cases to potential uses by the benchside biologist. Conclusion: Current information extraction technologies are approaching performance standards at which concept recognition can begin to deliver high quality data to the benchside biologist. Our system is available as part of the BioCreative Meta-Server project and on the internet. PMID:18834500

  6. Towards a Relation Extraction Framework for Cyber-Security Concepts

    SciTech Connect

    Jones, Corinne L; Bridges, Robert A; Huffer, Kelly M; Goodall, John R

    2015-01-01

    In order to assist security analysts in obtaining information pertaining to their network, such as novel vulnerabilities, exploits, or patches, information retrieval methods tailored to the security domain are needed. As labeled text data is scarce and expensive, we follow developments in semi-supervised NLP and implement a bootstrapping algorithm for extracting security entities and their relationships from text. The algorithm requires little input data, specifically, a few relations or patterns (heuristics for identifying relations), and incorporates an active learning component which queries the user on the most important decisions to prevent drift away from the desired relations. Preliminary testing on a small corpus shows promising results, obtaining a precision of 0.82.
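    The bootstrapping loop can be sketched as follows (a toy illustration, not the described system): seed patterns match entity pairs in a corpus, and the harvested pairs would in turn be used to induce new patterns. The corpus, the seed pattern, and the active-learning step are all simplified assumptions.

```python
# Toy pattern-based bootstrapping for relation extraction.
# Corpus sentences and the seed pattern are fabricated examples;
# pattern induction and the user-query step are omitted.
import re

corpus = [
    "CVE-2015-0001 affects Windows",
    "CVE-2015-0002 affects Linux",
    "Linux is patched by KB123",
]
patterns = [r"(\S+) affects (\S+)"]   # seed pattern
relations = set()

for _round in range(2):               # a couple of bootstrap rounds
    for sentence in corpus:
        for pattern in patterns:
            for match in re.finditer(pattern, sentence):
                relations.add(match.groups())
    # a full system would induce new patterns from `relations` here,
    # asking the user about borderline candidates to prevent drift

print(sorted(relations))
```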

  7. Automated object extraction from remote sensor image based on adaptive thresholding technique

    NASA Astrophysics Data System (ADS)

    Zhao, Tongzhou; Ma, Shuaijun; Li, Jin; Ming, Hui; Luo, Xiaobo

    2009-10-01

    Detection and extraction of dim moving small objects in infrared image sequences is an interesting research area. A system for detecting dim moving small targets in IR image sequences is presented, and a new high-performance algorithm for extracting moving small targets from infrared image sequences containing cloud clutter is proposed in the paper. This method achieves better detection precision than several other methods, and the computation can be carried out by two independent units. The novelty of the algorithm is that it applies adaptive thresholding to the moving small targets in both the spatial and temporal domains. Experimental results show that the presented algorithm achieves high detection precision.
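    A minimal sketch of spatial adaptive thresholding, assuming a simple mean-plus-k-sigma rule (the paper's exact spatial/temporal scheme is not specified here): a pixel is flagged when it exceeds the frame's background statistics by a margin.

```python
# Illustrative adaptive threshold for dim small-target detection:
# flag pixels more than k standard deviations above the frame mean.
# The synthetic frame and the value of k are assumptions.
import numpy as np

def adaptive_threshold(frame, k=4.0):
    mu, sigma = frame.mean(), frame.std()
    return frame > mu + k * sigma

rng = np.random.default_rng(0)
frame = rng.normal(100.0, 5.0, size=(64, 64))   # cluttered background
frame[32, 32] = 200.0                           # dim point target
mask = adaptive_threshold(frame)
print(int(mask.sum()), bool(mask[32, 32]))
```

    A temporal variant would apply the same rule along the time axis of the sequence, and the paper combines both domains.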

  8. Automating identification of avian vocalizations using time-frequency information extracted from the Gabor transform.

    PubMed

    Connor, Edward F; Li, Shidong; Li, Steven

    2012-07-01

    Based on the Gabor transform, a metric is developed and applied to automatically identify bird species from a sample of 568 digital recordings of songs/calls from 67 species of birds. The Gabor frequency-amplitude spectrum and the Gabor time-amplitude profile are proposed as a means to characterize the frequency and time patterns of a bird song. An approach based on template matching where unknown song clips are compared to a library of known song clips is used. After adding noise to simulate the background environment and using an adaptive high-pass filter to de-noise the recordings, the successful identification rate exceeded 93% even at signal-to-noise ratios as low as 5 dB. Bird species whose songs/calls were dominated by low frequencies were more difficult to identify than species whose songs were dominated by higher frequencies. The results suggest that automated identification may be practical if comprehensive libraries of recordings that encompass the vocal variation within species can be assembled.
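    The template-matching approach can be sketched in simplified form. Note the paper builds its metric on the Gabor transform; the sketch below substitutes a plain FFT amplitude spectrum, and the synthetic "songs" and species names are fabricated.

```python
# Simplified spectrum-based template matching: an unknown clip is
# assigned to the library template with the highest spectral
# correlation. A plain FFT magnitude stands in for the Gabor spectrum.
import numpy as np

fs = 8000
t = np.arange(0, 1.0, 1 / fs)

def spectrum(x):
    s = np.abs(np.fft.rfft(x))
    return s / np.linalg.norm(s)     # unit-norm frequency-amplitude profile

library = {                          # known-species templates (toy data)
    "species_A": spectrum(np.sin(2 * np.pi * 1000 * t)),
    "species_B": spectrum(np.sin(2 * np.pi * 3000 * t)),
}

noise = 0.3 * np.random.default_rng(1).normal(size=t.size)
unknown = np.sin(2 * np.pi * 3000 * t) + noise
scores = {name: float(spectrum(unknown) @ tmpl)
          for name, tmpl in library.items()}
best = max(scores, key=scores.get)
print(best)
```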

  9. Quantification of lung tumor rotation with automated landmark extraction using orthogonal cine MRI images

    NASA Astrophysics Data System (ADS)

    Paganelli, Chiara; Lee, Danny; Greer, Peter B.; Baroni, Guido; Riboldi, Marco; Keall, Paul

    2015-09-01

    The quantification of tumor motion in sites affected by respiratory motion is of primary importance to improve treatment accuracy. To account for motion, different studies analyzed the translational component only, without focusing on the rotational component, which was quantified in a few studies on the prostate with implanted markers. The aim of our study was to propose a tool able to quantify lung tumor rotation without the use of internal markers, thus providing accurate motion detection close to critical structures such as the heart or liver. Specifically, we propose the use of an automatic feature extraction method in combination with the acquisition of fast orthogonal cine MRI images of nine lung patients. As a preliminary test, we evaluated the performance of the feature extraction method by applying it on regions of interest around (i) the diaphragm and (ii) the tumor and comparing the estimated motion with that obtained by (i) the extraction of the diaphragm profile and (ii) the segmentation of the tumor, respectively. The results confirmed the capability of the proposed method in quantifying tumor motion. Then, a point-based rigid registration was applied to the extracted tumor features between all frames to account for rotation. The median lung rotation values were  -0.6   ±   2.3° and  -1.5   ±   2.7° in the sagittal and coronal planes respectively, confirming the need to account for tumor rotation along with translation to improve radiotherapy treatment.

  10. Exploratory normalized difference water indices for semi-automated extraction of Antarctic lake features

    NASA Astrophysics Data System (ADS)

    Jawak, Shridhar D.; Luis, Alvarinho J.

    2016-05-01

    This work presents various normalized difference water indices (NDWI) to delineate lakes in the Schirmacher Oasis, East Antarctica, using very high resolution WorldView-2 (WV-2) satellite imagery. The Schirmacher Oasis region hosts a number of fresh as well as saline water lakes, such as epishelf lakes and ice-free or landlocked lakes, which may be completely frozen, semi-frozen, or ice-free. Hence, detecting all these types of lakes distinctly on satellite imagery was the major challenge, as the spectral characteristics of the various lake types were identical to other land-cover targets. The multiband spectral-index, pixel-based approach is a widely tested and growing technique because of its simplicity and comparatively short processing time. In the present study, semi-automatic extraction of lakes in a cryospheric region was carried out by designing specific spectral indices. The study tried a number of existing spectral indices to extract lakes, but none delivered satisfactory results, and hence we modified the NDWI. The potential of the newly added bands in WV-2 imagery was explored by developing spectral indices comprising the Yellow (585 - 625 nm) band in combination with the Blue (450 - 510 nm), Coastal (400 - 450 nm) and Green (510 - 580 nm) bands. For extraction of frozen lakes, the Yellow and near-infrared 2 (NIR2) band pair and the Yellow and Green band pair worked well, whereas for ice-free lake extraction a combination of the Blue and Coastal bands yielded appreciable results when compared with manually digitized data. The results suggest that the modified NDWI approach rendered a bias error varying from 1 to 34 m2.
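    A normalized-difference index has the standard form (a - b) / (a + b). The sketch below applies it to a Yellow/NIR2 band pair as discussed above, but the reflectance values and the 0.3 threshold are made-up illustrations, not the paper's calibration.

```python
# Sketch of a normalized-difference water index on a toy 2x2 scene.
# Band values and the water threshold are fabricated for illustration.
import numpy as np

def nd_index(band_a, band_b):
    """Normalized difference (a - b) / (a + b); assumes a + b is nonzero."""
    a = np.asarray(band_a, dtype=float)
    b = np.asarray(band_b, dtype=float)
    return (a - b) / (a + b)

yellow = np.array([[0.10, 0.40], [0.12, 0.38]])   # toy reflectances
nir2   = np.array([[0.30, 0.05], [0.28, 0.06]])
index = nd_index(yellow, nir2)
water_mask = index > 0.3        # threshold would be tuned per scene
print(water_mask)
```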

  11. Automated extraction of DNA from blood and PCR setup using a Tecan Freedom EVO liquid handler for forensic genetic STR typing of reference samples.

    PubMed

    Stangegaard, Michael; Frøslev, Tobias G; Frank-Hansen, Rune; Hansen, Anders J; Morling, Niels

    2011-04-01

    We have implemented and validated automated protocols for DNA extraction and PCR setup using a Tecan Freedom EVO liquid handler mounted with the Te-MagS magnetic separation device (Tecan, Männedorf, Switzerland). The protocols were validated for accredited forensic genetic work according to ISO 17025 using the Qiagen MagAttract DNA Mini M48 kit (Qiagen GmbH, Hilden, Germany) from fresh whole blood and blood from deceased individuals. The workflow was simplified by returning the DNA extracts to the original tubes, minimizing the risk of misplacing samples. The tubes that originally contained the samples were washed with MilliQ water before the return of the DNA extracts. The PCR was set up in 96-well microtiter plates. The methods were validated for the kits AmpFℓSTR Identifiler, SGM Plus and Yfiler (Applied Biosystems, Foster City, CA), and GenePrint FFFL and PowerPlex Y (Promega, Madison, WI). The automated protocols allowed for extraction and addition of PCR master mix for 96 samples within 3.5 h. In conclusion, we demonstrated that (1) DNA extraction with magnetic beads and (2) PCR setup for accredited, forensic genetic short tandem repeat typing can be implemented on a simple automated liquid handler, reducing manual work and increasing quality and throughput.

  12. Automated sample preparation based on the sequential injection principle. Solid-phase extraction on a molecularly imprinted polymer coupled on-line to high-performance liquid chromatography.

    PubMed

    Theodoridis, Georgios; Zacharis, Constantinos K; Tzanavaras, Paraskevas D; Themelis, Demetrius G; Economou, Anastasios

    2004-03-19

    A molecularly imprinted polymer (MIP) prepared using caffeine as a template was validated as a selective sorbent for solid-phase extraction (SPE) within an automated on-line sample preparation method. The polymer produced was packed in a polypropylene cartridge, which was incorporated in a flow system prior to the HPLC analytical instrumentation. The principle of sequential injection was utilised for a rapid, automated and efficient SPE procedure on the MIP. Samples, buffers, washing and elution solvents were introduced to the extraction cartridge via a peristaltic pump and a multi-position valve, both controlled by appropriate software developed in-house. The method was optimised in terms of flow rates, extraction time and volume. After extraction, the final eluent from the extraction cartridge was directed to the injection loop and was subsequently analysed on HPLC. The overall set-up facilitated unattended operation and improved both mixing fluidics and method-development flexibility. This system may be readily built in the laboratory and can be further used as an automated platform for on-line sample preparation.

  13. Evaluation of an Automated Information Extraction Tool for Imaging Data Elements to Populate a Breast Cancer Screening Registry.

    PubMed

    Lacson, Ronilda; Harris, Kimberly; Brawarsky, Phyllis; Tosteson, Tor D; Onega, Tracy; Tosteson, Anna N A; Kaye, Abby; Gonzalez, Irina; Birdwell, Robyn; Haas, Jennifer S

    2015-10-01

    Breast cancer screening is central to early breast cancer detection. Identifying and monitoring process measures for screening is a focus of the National Cancer Institute's Population-based Research Optimizing Screening through Personalized Regimens (PROSPR) initiative, which requires participating centers to report structured data across the cancer screening continuum. We evaluate the accuracy of automated information extraction of imaging findings from radiology reports, which are available as unstructured text. We present prevalence estimates of imaging findings for breast imaging received by women who obtained care in a primary care network participating in PROSPR (n = 139,953 radiology reports) and compared automatically extracted data elements to a "gold standard" based on manual review for a validation sample of 941 randomly selected radiology reports, including mammograms, digital breast tomosynthesis, ultrasound, and magnetic resonance imaging (MRI). The prevalence of imaging findings varies by data element and modality (e.g., suspicious calcification noted in 2.6% of screening mammograms, 12.1% of diagnostic mammograms, and 9.4% of tomosynthesis exams). In the validation sample, the accuracy of identifying imaging findings, including suspicious calcifications, masses, and architectural distortion (on mammogram and tomosynthesis); masses, cysts, non-mass enhancement, and enhancing foci (on MRI); and masses and cysts (on ultrasound), ranges from 0.8 to 1.0 for recall, precision, and F-measure. Information extraction tools can be used for accurate documentation of imaging findings as structured data elements from text reports for a variety of breast imaging modalities. These data can be used to populate screening registries to help elucidate more effective breast cancer screening processes.
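    The validation arithmetic behind recall, precision, and F-measure can be sketched directly: extracted (report, finding) pairs are compared against the manually reviewed gold standard. The report IDs and finding labels below are fabricated examples.

```python
# Sketch of precision/recall/F-measure against a gold standard of
# manually reviewed (report, finding) pairs. All data are fabricated.
def precision_recall_f(extracted, gold):
    tp = len(extracted & gold)                       # true positives
    precision = tp / len(extracted) if extracted else 0.0
    recall = tp / len(gold) if gold else 0.0
    f = (2 * precision * recall / (precision + recall)
         if precision + recall else 0.0)
    return precision, recall, f

gold = {("rpt1", "mass"), ("rpt2", "calcification"), ("rpt3", "cyst")}
extracted = {("rpt1", "mass"), ("rpt2", "calcification"), ("rpt4", "mass")}
p, r, f = precision_recall_f(extracted, gold)
print(round(p, 2), round(r, 2), round(f, 2))
```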

  14. A device for automated direct sampling and quantitation from solid-phase sorbent extraction cards by electrospray tandem mass spectrometry.

    PubMed

    Wachs, Timothy; Henion, Jack

    2003-04-01

    A new solid-phase extraction (SPE) device in the 96-well format (SPE Card) has been employed for automated off-line sample preparation of low-volume urine samples. On-line automated analyte elution via SPE and direct quantitation by micro ion spray mass spectrometry is reported. This sample preparation device has the format of a microtiter plate and is molded in a plastic frame which houses 96 separate sandwiched 3M Empore sorbents (0.5-mm-thickness, 8-microm particles) covered on both sides by a microfiber support material. Ninety-six discrete SPE zones, each 7 mm in diameter, are imbedded into the sheet in the conventional 9-mm pitch (spacing) of a 96-well microtiter plate. In this study one-quarter of an SPE Card (24 individual zones) was used merely as a convenience. After automated off-line interference elution of applied human urine from 24 samples, a section of SPE Card is mounted vertically on a computer-controlled X, Y, Z positioner in front of a micro ion spray direct sampling tube equipped with a beveled tip. The beveled tip of this needle robotically penetrates each SPE elution zone (sorbent disk) or stationary phase in a serial fashion. The eluted analytes are sequentially transferred directly to a microelectrosprayer to obtain tandem mass spectrometric (MS/MS) analysis. This strategy precludes any HPLC separation and the associated method development. The quantitative determination of Ritalin (methylphenidate) from fortified human urine samples is demonstrated. A trideuterated internal standard of methylphenidate was used to obtain ion current response ratios between the parent drug and the internal standard. Human control urine samples fortified from 6.6 to 3300 ng/mL (normal therapeutic levels have been determined in other studies to be between 50 and 100 ng/mL urine) were analyzed and a linear calibration curve was obtained with a correlation coefficient of 0.9999, where the precision of the quality control (QC) samples ranged from 9.6% at the 24

  15. Comparative evaluation of automated and manual commercial DNA extraction methods for detection of Francisella tularensis DNA from suspensions and spiked swabs by real-time polymerase chain reaction.

    PubMed

    Dauphin, Leslie A; Walker, Roblena E; Petersen, Jeannine M; Bowen, Michael D

    2011-07-01

    This study evaluated commercial automated and manual DNA extraction methods for the isolation of Francisella tularensis DNA suitable for real-time polymerase chain reaction (PCR) analysis from cell suspensions and spiked cotton, foam, and polyester swabs. Two automated methods, the MagNA Pure Compact and the QIAcube, were compared to 4 manual methods, the IT 1-2-3 DNA sample purification kit, the MasterPure Complete DNA and RNA purification kit, the QIAamp DNA blood mini kit, and the UltraClean Microbial DNA isolation kit. The methods were compared using 6 F. tularensis strains representing the 2 subspecies which cause the majority of reported cases of tularemia in humans. Cell viability testing of the DNA extracts showed that all 6 extraction methods efficiently inactivated F. tularensis at concentrations of ≤10⁶ CFU/mL. Real-time PCR analysis using a multitarget 5' nuclease assay for F. tularensis revealed that the PCR sensitivity was equivalent using DNA extracted by the 2 automated methods and the manual MasterPure and QIAamp methods. These 4 methods yielded significantly better levels of detection from bacterial suspensions than the remaining 2 and performed equivalently for spiked swab samples. This study identifies optimal DNA extraction methods for processing swab specimens for the subsequent detection of F. tularensis DNA using real-time PCR assays. Furthermore, the results provide diagnostic laboratories with the option to select from 2 automated DNA extraction methods as suitable alternatives to manual methods for the isolation of DNA from F. tularensis.

  16. Applicability of a System for fully automated nucleic acid extraction from formalin-fixed paraffin-embedded sections for routine KRAS mutation testing.

    PubMed

    Lehmann, Annika; Schewe, Christiane; Hennig, Guido; Denkert, Carsten; Weichert, Wilko; Budczies, Jan; Dietel, Manfred

    2012-06-01

    Due to the approval of various new targeted therapies for the treatment of cancer, molecular pathology laboratories with a diagnostic focus have to meet new challenges: simultaneous handling of a large number of samples, small amounts of input material, and fragmentation of nucleic acids because of formalin fixation. As a consequence, fully automated systems for a fast and standardized extraction of high-quality DNA from formalin-fixed paraffin-embedded (FFPE) tissues are urgently needed. In this study, we tested the performance of a fully automated, high-throughput method for the extraction of nucleic acids from FFPE tissues. We investigated the extraction performance in sections of 5 different tissue types often analyzed in routine pathology laboratories (cervix, colon, liver, lymph node, and lung; n=340). Furthermore, we compared the quality, labor input, and applicability of the method for diagnostic purposes with those of a laboratory-validated manual method in a clinical setting by screening a set of 45 colorectal adenocarcinomas for KRAS mutations. Automated extraction of both DNA and RNA was successful in 339 of 340 FFPE samples representing 5 different tissue types. In comparison with a conventional manual extraction protocol, the method showed an overall agreement of 97.7% (95% confidence interval, 88.2%-99.9%) for the subsequent mutational analysis of the KRAS gene in colorectal cancer samples. The fully automated system is a promising tool for a simple, robust, and rapid extraction of DNA and RNA from formalin-fixed tissue. It ensures a standardization of sample processing and can be applied to clinical FFPE samples in routine pathology.
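    Overall agreement between an automated and a manual pipeline is often reported alongside a chance-corrected statistic such as Cohen's kappa (not given in this abstract); a sketch over hypothetical paired KRAS calls, not the study's actual data:

```python
def agreement_stats(pairs):
    """Overall agreement and Cohen's kappa for paired binary calls (1 = mutant, 0 = wild-type)."""
    n = len(pairs)
    po = sum(a == b for a, b in pairs) / n  # observed agreement
    pa = sum(a for a, _ in pairs) / n       # method A mutant rate
    pb = sum(b for _, b in pairs) / n       # method B mutant rate
    pe = pa * pb + (1 - pa) * (1 - pb)      # agreement expected by chance
    return po, (po - pe) / (1 - pe)

# hypothetical paired calls for 45 samples: 43 concordant, 2 discordant
calls = [(1, 1)] * 15 + [(0, 0)] * 28 + [(1, 0)] + [(0, 1)]
po, kappa = agreement_stats(calls)
print(round(po, 3), round(kappa, 2))  # 0.956 0.9
```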

  17. Linearly Supporting Feature Extraction for Automated Estimation of Stellar Atmospheric Parameters

    NASA Astrophysics Data System (ADS)

    Li, Xiangru; Lu, Yu; Comte, Georges; Luo, Ali; Zhao, Yongheng; Wang, Yongjun

    2015-05-01

    We describe a scheme to extract linearly supporting (LSU) features from stellar spectra to automatically estimate the atmospheric parameters T_eff, log g, and [Fe/H]. “Linearly supporting” means that the atmospheric parameters can be accurately estimated from the extracted features through a linear model. The successive steps of the process are as follows: first, decompose the spectrum using a wavelet packet (WP) and represent it by the derived decomposition coefficients; second, detect representative spectral features from the decomposition coefficients using the proposed method LARSbs (based on least angle regression, LARS); third, estimate the atmospheric parameters T_eff, log g, and [Fe/H] from the detected features using a linear regression method. One prominent characteristic of this scheme is its ability to evaluate quantitatively the contribution of each detected feature to the atmospheric parameter estimate and also to trace back the physical significance of that feature. This work also shows that the usefulness of a component depends on both the wavelength and frequency. The proposed scheme has been evaluated on both real spectra from the Sloan Digital Sky Survey (SDSS)/SEGUE and synthetic spectra calculated from Kurucz's NEWODF models. On real spectra, we extracted 23 features to estimate T_eff, 62 features for log g, and 68 features for [Fe/H]. Test consistencies between our estimates and those provided by the Spectroscopic Parameter Pipeline of SDSS show that the mean absolute errors (MAEs) are 0.0062 dex for log T_eff (83 K for T_eff), 0.2345 dex for log g, and 0.1564 dex for [Fe/H]. For the synthetic spectra, the MAE test accuracies are 0.0022 dex for log T_eff (32 K for T_eff), 0.0337 dex for log g, and 0.0268 dex for [Fe/H].
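    The "linearly supporting" idea reduces parameter estimation to ordinary least squares on the extracted features; a single-feature sketch with illustrative toy values (the paper itself uses many features per parameter):

```python
from statistics import mean

def fit_linear(xs, ys):
    """Ordinary least squares for a single feature: y ≈ a*x + b."""
    mx, my = mean(xs), mean(ys)
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

def mae(pred, truth):
    """Mean absolute error, the accuracy measure quoted in the abstract."""
    return mean(abs(p - t) for p, t in zip(pred, truth))

# toy data: one extracted feature value per spectrum vs. known log Teff
xs = [0.1, 0.2, 0.3, 0.4]
ys = [3.70, 3.75, 3.80, 3.85]
a, b = fit_linear(xs, ys)
print(round(a, 2), round(mae([a * x + b for x in xs], ys), 4))  # slope ≈ 0.5, MAE ≈ 0
```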

  18. Pedestrian detection in thermal images: An automated scale based region extraction with curvelet space validation

    NASA Astrophysics Data System (ADS)

    Lakshmi, A.; Faheema, A. G. J.; Deodhare, Dipti

    2016-05-01

    Pedestrian detection is a key problem in night vision processing with a dozen applications that will positively impact the performance of autonomous systems. Despite significant progress, our study shows that performance of state-of-the-art thermal image pedestrian detectors still has much room for improvement. The purpose of this paper is to overcome the challenge faced by thermal image pedestrian detectors, which employ intensity based Region Of Interest (ROI) extraction followed by feature based validation. The most striking disadvantage faced by the first module, ROI extraction, is the failed detection of cloth insulated parts. To overcome this setback, this paper employs a region-growing algorithm tuned to the scale of the pedestrian. The statistics subtended by the pedestrian vary drastically with scale, and a deviation-from-normality approach facilitates scale detection. Further, the paper offers an adaptive mathematical threshold to resolve the problem of subtracting the background while extracting cloth insulated parts as well. The inherent false positives of the ROI extraction module are limited by the choice of good features in the pedestrian validation step. One such feature is the curvelet feature, which has been used extensively in optical images but has as yet no reported results in thermal images. This has been used to arrive at a pedestrian detector with a reduced false positive rate. This work is the first venture to scrutinize the utility of curvelets for characterizing pedestrians in thermal images. An attempt has also been made to improve the speed of curvelet transform computation. The classification task is realized through the use of the well-known methodology of Support Vector Machines (SVMs). 
The proposed method is substantiated with qualified evaluation methodologies that permit us to carry out probing and informative comparisons across state-of-the-art features, including deep learning methods, with six

  19. Automated feature extraction and spatial organization of seafloor pockmarks, Belfast Bay, Maine, USA

    USGS Publications Warehouse

    Andrews, B.D.; Brothers, L.L.; Barnhardt, W.A.

    2010-01-01

    Seafloor pockmarks occur worldwide and may represent millions of m3 of continental shelf erosion, but few numerical analyses of their morphology and spatial distribution exist. We introduce a quantitative definition of pockmark morphology and, based on this definition, propose a three-step geomorphometric method to identify and extract pockmarks from high-resolution swath bathymetry. We apply this GIS-implemented approach to 25 km2 of bathymetry collected in the Belfast Bay, Maine, USA pockmark field. Our model extracted 1767 pockmarks and found a linear pockmark depth-to-diameter relationship field-wide. Mean pockmark depth is 7.6 m and mean diameter is 84.8 m. Pockmark distribution is non-random, and nearly half of the field's pockmarks occur in chains. The most prominent chains are oriented semi-normal to the steepest gradient in Holocene sediment thickness. A descriptive model yields field-wide spatial statistics indicating that pockmarks are distributed in non-random clusters. Results enable quantitative comparison of pockmarks in fields worldwide as well as similar concave features, such as impact craters, dolines, or salt pools.
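    A common way to quantify the non-random clustering described here is a nearest-neighbour statistic such as the Clark–Evans ratio; a sketch with hypothetical pockmark centres (the paper's own spatial statistics may differ):

```python
import math

def clark_evans(points, area):
    """Clark–Evans nearest-neighbour ratio: R < 1 suggests clustering, R ≈ 1 randomness."""
    def nearest(i):
        x, y = points[i]
        return min(math.hypot(x - px, y - py)
                   for j, (px, py) in enumerate(points) if j != i)
    observed = sum(nearest(i) for i in range(len(points))) / len(points)
    expected = 0.5 / math.sqrt(len(points) / area)  # mean NN distance for a random pattern
    return observed / expected

# hypothetical pockmark centres (km) forming two tight chains in a 25 km^2 field
pts = [(1.0, 1.0), (1.1, 1.0), (1.0, 1.1), (4.0, 4.0), (4.1, 4.0), (4.0, 4.1)]
print(clark_evans(pts, area=25.0) < 1.0)  # True: clustered
```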

  20. Automated DICOM metadata and volumetric anatomical information extraction for radiation dosimetry

    NASA Astrophysics Data System (ADS)

    Papamichail, D.; Ploussi, A.; Kordolaimi, S.; Karavasilis, E.; Papadimitroulas, P.; Syrgiamiotis, V.; Efstathopoulos, E.

    2015-09-01

    Patient-specific dosimetry calculations based on simulation techniques have as a prerequisite the modeling of the modality system and the creation of voxelized phantoms. This procedure requires the knowledge of scanning parameters and patients’ information included in a DICOM file as well as image segmentation. However, the extraction of this information is complicated and time-consuming. The objective of this study was to develop a simple graphical user interface (GUI) to (i) automatically extract metadata from every slice image of a DICOM file in a single query and (ii) interactively specify the regions of interest (ROI) without explicit access to the radiology information system. The user-friendly application was developed in the Matlab environment. The user can select a series of DICOM files and manage their text and graphical data. The metadata are automatically formatted and presented to the user as a Microsoft Excel file. The volumetric maps are formed by interactively specifying the ROIs and by assigning a specific value to every ROI. The result is stored in DICOM format for data and trend analysis. The developed GUI is easy and fast to use and constitutes a very useful tool for individualized dosimetry. One of the future goals is to incorporate remote access to a PACS server.
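    The metadata-extraction step amounts to flattening per-slice DICOM attributes into a single table; a stand-alone sketch using the stdlib csv module in place of Excel, with hypothetical attribute dicts (a real tool would read them via a DICOM parser):

```python
import csv
import io

# hypothetical per-slice attributes; in practice a DICOM library supplies these
slices = [
    {"SOPInstanceUID": "1.2.3.1", "KVP": 120, "SliceThickness": 5.0},
    {"SOPInstanceUID": "1.2.3.2", "KVP": 120, "SliceThickness": 5.0},
]

def export_metadata(records, fields):
    """Flatten per-slice metadata dicts into one table (CSV stands in for Excel here)."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fields)
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

table = export_metadata(slices, ["SOPInstanceUID", "KVP", "SliceThickness"])
print(table.splitlines()[0])  # SOPInstanceUID,KVP,SliceThickness
```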

  1. Automated bare earth extraction technique for complex topography in light detection and ranging surveys

    NASA Astrophysics Data System (ADS)

    Stevenson, Terry H.; Magruder, Lori A.; Neuenschwander, Amy L.; Bradford, Brian

    2013-01-01

    Bare earth extraction is an important component to light detection and ranging (LiDAR) data analysis in terms of terrain classification. The challenge in providing accurate digital surface models is augmented when there is diverse topography within the data set or complex combinations of vegetation and built structures. Few existing algorithms can handle substantial terrain diversity without significant editing or user interaction. This effort presents a newly developed methodology that provides a flexible, adaptable tool capable of integrating multiple LiDAR data attributes for an accurate terrain assessment. The terrain extraction and segmentation (TEXAS) approach uses a third-order spatial derivative for each point in the digital surface model to determine the curvature of the terrain rather than relying solely on the slope. The use of curvature has been shown to preserve ground points successfully in areas of steep terrain, as such areas typically exhibit low curvature. Within the framework of TEXAS, the contiguous sets of points with low curvatures are grouped into regions using an edge-based segmentation method. The process does not require any user inputs and is completely data driven. This technique was tested on a variety of existing LiDAR surveys, each with varying levels of topographic complexity.
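    The curvature criterion can be illustrated in one dimension: a steep but constant-gradient slope has a near-zero second difference and is kept as ground, while abrupt breaks are rejected. A sketch (threshold and profile are illustrative; TEXAS itself operates on 2-D point data with a third-order derivative):

```python
def curvature_proxy(z):
    """Second finite difference of an elevation profile; near zero on smooth slopes."""
    return [z[i - 1] - 2 * z[i] + z[i + 1] for i in range(1, len(z) - 1)]

def ground_candidates(z, threshold=0.05):
    """Interior indices whose curvature magnitude stays below the threshold."""
    return [i + 1 for i, v in enumerate(curvature_proxy(z)) if abs(v) < threshold]

# a steep but constant-gradient slope is kept; the break around indices 3-5 is rejected
profile = [0.0, 1.0, 2.0, 3.0, 5.0, 5.0, 6.0]
print(ground_candidates(profile))  # [1, 2]
```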

  2. Automated boundary extraction of the spinal canal in MRI based on dynamic programming.

    PubMed

    Koh, Jaehan; Chaudhary, Vipin; Dhillon, Gurmeet

    2012-01-01

    The spinal cord is the only communication link between the brain and the body. The abnormalities in it can lead to severe pain and sometimes to paralysis. Due to the growing gap between the number of available radiologists and the number of radiologists required, the need for computer-aided diagnosis and characterization is increasing. To help bridge this gap, we have developed a computer-aided diagnosis and characterization framework in lumbar spine that includes the spinal cord, vertebrae, and intervertebral discs. In this paper, we propose two spinal cord boundary extraction methods that fit into our framework based on dynamic programming in lumbar spine MRI. Our method incorporates the intensity of the image and the gradient of the image into a dynamic programming scheme and works in a fully-automatic fashion. The boundaries generated by our method are compared against reference boundaries in terms of the Fréchet distance, which is known to be a metric for shape analysis. The experimental results from 65 clinical data sets show that our method finds the spinal canal boundary correctly, achieving a mean Fréchet distance of 13.5 pixels. For almost all data, the extracted boundary falls within the spinal cord. Thus, it can be used as a landmark when marking background regions and finding regions of interest.
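    The discrete Fréchet distance used to score the extracted boundaries is itself a small dynamic program; a minimal sketch on two toy polylines:

```python
import math
from functools import lru_cache

def frechet(p, q):
    """Discrete Fréchet distance between two polylines, by dynamic programming."""
    @lru_cache(maxsize=None)
    def c(i, j):
        d = math.dist(p[i], q[j])
        if i == 0 and j == 0:
            return d
        if i == 0:
            return max(c(0, j - 1), d)
        if j == 0:
            return max(c(i - 1, 0), d)
        return max(min(c(i - 1, j), c(i - 1, j - 1), c(i, j - 1)), d)
    return c(len(p) - 1, len(q) - 1)

# two parallel toy boundaries one pixel apart
a = [(0, 0), (1, 0), (2, 0)]
b = [(0, 1), (1, 1), (2, 1)]
print(frechet(a, b))  # 1.0
```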

  3. Kernel-Based Learning for Domain-Specific Relation Extraction

    NASA Astrophysics Data System (ADS)

    Basili, Roberto; Giannone, Cristina; Del Vescovo, Chiara; Moschitti, Alessandro; Naggar, Paolo

    In a specific process of business intelligence, i.e. investigation of organized crime, empirical language processing technologies can play a crucial role. The analysis of transcriptions of investigative activities, such as police interrogatories, for the recognition and storage of complex relations among people and locations is a very difficult and time-consuming task, ultimately based on pools of experts. We discuss here an inductive relation extraction platform that opens the way to much cheaper and more consistent workflows. The presented empirical investigation shows that accurate results, comparable to those of expert teams, can be achieved, and that parametrization allows the system's behavior to be fine-tuned to domain-specific requirements.

  4. A novel approach for automated shoreline extraction from remote sensing images using low level programming

    NASA Astrophysics Data System (ADS)

    Rigos, Anastasios; Vaiopoulos, Aristidis; Skianis, George; Tsekouras, George; Drakopoulos, Panos

    2015-04-01

    Tracking coastline changes is a crucial task in the context of coastal management, and synoptic remotely sensed data have become an essential tool for this purpose. In this work, and within the framework of the BeachTour project, we introduce a new method for shoreline extraction from high resolution satellite images. It was applied to two images taken by the WorldView-2 satellite (7 channels, 2 m resolution) during July 2011 and August 2014. The location is the well-known tourist destination of Laganas beach, spanning 5 km along the southern part of Zakynthos Island, Greece. The atmospheric correction was performed with the ENVI FLAASH procedure and the final images were validated against hyperspectral field measurements. Using three channels (CH2=blue, CH3=green and CH7=near infrared), the Modified Redness Index image was calculated according to: MRI = (CH7)²/[CH2×(CH3)³]. MRI has the property that its value keeps increasing as the water becomes shallower. This is followed by an abrupt reduction trend at the location of the wet sand, up to the point where the dry shore face begins. After that it remains low-valued throughout the beach zone. Images based on this index were used for the shoreline extraction process, which included the following steps: a) On the MRI-based image, only an area near the shoreline was kept (this process is known as image masking). b) On the masked image the Canny edge detector operator was applied. c) Of all edges discovered in step (b), only the biggest was kept. d) If the line revealed in step (c) was unacceptable, i.e. it did not define the shoreline or defined only part of it, then either more than one edge was kept in step (c), or the pixel values of the MRI image were bounded in a particular interval [B_low, B_high] and only the ones belonging to this interval were kept. Then, steps (a)-(d) were repeated. 
Using this method, which is still under development, we were able to extract the shoreline position and reveal its changes during the 3-year period
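    The MRI formula above is a simple per-pixel band ratio; a sketch on a toy 1×3 "image" with illustrative band values:

```python
def modified_redness_index(ch2, ch3, ch7):
    """Per-pixel MRI = CH7^2 / (CH2 * CH3^3) from blue, green and near-infrared bands."""
    return [[nir ** 2 / (blue * green ** 3)
             for blue, green, nir in zip(r2, r3, r7)]
            for r2, r3, r7 in zip(ch2, ch3, ch7)]

# toy 1x3 "image": NIR response rises as the water shallows, so MRI rises too
ch2 = [[2.0, 2.0, 2.0]]
ch3 = [[2.0, 2.0, 2.0]]
ch7 = [[1.0, 2.0, 4.0]]
mri = modified_redness_index(ch2, ch3, ch7)
print(mri[0])  # [0.0625, 0.25, 1.0]
```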

  5. Automation or De-automation

    NASA Astrophysics Data System (ADS)

    Gorlach, Igor; Wessel, Oliver

    2008-09-01

    In the global automotive industry, for decades, vehicle manufacturers have continually increased the level of automation of production systems in order to be competitive. However, there is a new trend to decrease the level of automation, especially in final car assembly, for reasons of economy and flexibility. In this research, the final car assembly lines at three production sites of Volkswagen are analysed in order to determine the best level of automation for each, in terms of manufacturing costs, productivity, quality and flexibility. The case study is based on the methodology proposed by the Fraunhofer Institute. The results of the analysis indicate that fully automated assembly systems are not necessarily the best option in terms of cost, productivity and quality combined, which is attributed to high complexity of final car assembly systems; some de-automation is therefore recommended. On the other hand, the analysis shows that low automation can result in poor product quality due to reasons related to plant location, such as inadequate workers' skills, motivation, etc. Hence, the automation strategy should be formulated on the basis of analysis of all relevant aspects of the manufacturing process, such as costs, quality, productivity and flexibility in relation to the local context. A more balanced combination of automated and manual assembly operations provides better utilisation of equipment, reduces production costs and improves throughput.

  6. Dispersive liquid-liquid microextraction combined with semi-automated in-syringe back extraction as a new approach for the sample preparation of ionizable organic compounds prior to liquid chromatography.

    PubMed

    Melwanki, Mahaveer B; Fuh, Ming-Ren

    2008-07-11

    Dispersive liquid-liquid microextraction (DLLME) followed by a newly designed semi-automated in-syringe back extraction technique has been developed for the extraction of polar organic compounds prior to liquid chromatography (LC) measurement. The method is based on the formation of tiny droplets of the extractant in the sample solution using a water-immiscible organic solvent (extractant) dissolved in a water-miscible organic dispersive solvent. Extraction of the analytes from the aqueous sample into the dispersed organic droplets took place. The extracting organic phase was separated by centrifuging, and the sedimented phase was withdrawn into a syringe. Then in-syringe back extraction was utilized to extract the analytes into an aqueous solution prior to LC analysis. Clenbuterol (CB), a basic organic compound used as a model, was extracted from a basified aqueous sample using 25 microL tetrachloroethylene (TCE, extraction solvent) dissolved in 500 microL acetone (as a dispersive solvent). After separation of the organic extracting phase by centrifuging, CB enriched in the TCE phase was back extracted into 10 microL of 1% aqueous formic acid (FA) within the syringe. Back extraction was facilitated by repeatedly moving the plunger back and forth within the barrel of the syringe, assisted by a syringe pump. Due to the plunger movement, a thin organic film is formed on the inner layer of the syringe that comes in contact with the acidic aqueous phase. Here, CB, a basic analyte, is protonated and back extracted into the FA. Various parameters affecting the extraction efficiency, viz., the choice of extraction and dispersive solvents, salt effect, speed of the syringe pump, back extraction time period, and the effect of the concentrations of base and acid, were evaluated. 
Under optimum conditions, precision, linearity (correlation coefficient, r(2)=0.9966 over the concentration range of 10-1000 ng mL(-1) CB), detection limit (4.9 ng mL(-1)), enrichment factor (175), relative

  7. Automated extraction of absorption features from Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) and Geophysical and Environmental Research Imaging Spectrometer (GERIS) data

    NASA Technical Reports Server (NTRS)

    Kruse, Fred A.; Calvin, Wendy M.; Seznec, Olivier

    1988-01-01

    Automated techniques were developed for the extraction and characterization of absorption features from reflectance spectra. The absorption feature extraction algorithms were successfully tested on laboratory, field, and aircraft imaging spectrometer data. A suite of laboratory spectra of the most common minerals was analyzed and absorption band characteristics were tabulated. A prototype expert system was designed, implemented, and successfully tested to allow identification of minerals based on the extracted absorption band characteristics. AVIRIS spectra for a site in the northern Grapevine Mountains, Nevada, have been characterized and the minerals sericite (fine grained muscovite) and dolomite were identified. The minerals kaolinite, alunite, and buddingtonite were identified and mapped for a site at Cuprite, Nevada, by applying the feature extraction algorithms to the new Geophysical and Environmental Research 64 channel imaging spectrometer (GERIS) data. The feature extraction routines (written in FORTRAN and C) were interfaced to the expert system (written in PROLOG) to allow both efficient processing of numerical data and logical spectrum analysis.
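    A typical absorption-feature characterization finds the band minimum relative to a linear continuum between shoulder wavelengths; a sketch on a toy spectrum (illustrative values, not the paper's exact algorithm):

```python
def band_depth(wl, refl, left, right):
    """Deepest absorption (depth, centre) below a linear continuum between two shoulders."""
    i0, i1 = wl.index(left), wl.index(right)
    best = None
    for i in range(i0 + 1, i1):
        # continuum reflectance interpolated linearly between the shoulders
        cont = refl[i0] + (refl[i1] - refl[i0]) * (wl[i] - wl[i0]) / (wl[i1] - wl[i0])
        depth = 1.0 - refl[i] / cont
        if best is None or depth > best[0]:
            best = (depth, wl[i])
    return best

# toy spectrum with a kaolinite-like absorption near 2.2 um
wl = [2.1, 2.15, 2.2, 2.25, 2.3]
refl = [0.8, 0.7, 0.5, 0.7, 0.8]
print(band_depth(wl, refl, 2.1, 2.3))  # depth 0.375 at 2.2
```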

  8. Automated and portable solid phase extraction platform for immuno-detection of 17β-estradiol in water.

    PubMed

    Heub, Sarah; Tscharner, Noe; Monnier, Véronique; Kehl, Florian; Dittrich, Petra S; Follonier, Stéphane; Barbe, Laurent

    2015-02-13

    A fully automated and portable system for solid phase extraction (SPE) has been developed for the analysis of the natural hormone 17β-estradiol (E2) in environmental water by enzyme linked immuno-sorbent assay (ELISA). The system has been validated with de-ionized and artificial sea water as model samples and allowed for pre-concentration of E2 at levels of 1, 10 and 100 ng/L with only 100 ml of sample. Recoveries ranged from 24±3% to 107±6% depending on the concentration and sample matrix. The method successfully allowed us to determine the concentrations of two seawater samples. A concentration of 15.1±0.3 ng/L of E2 was measured in a sample obtained from a food production process, and 8.8±0.7 ng/L in a sample from the Adriatic Sea. The system would be suitable for continuous monitoring of water quality, as it is user-friendly and the method is reproducible and fully compatible with the analysis of water samples by simple immunoassays and other detection methods such as biosensors.

  9. Dried Blood Spot Proteomics: Surface Extraction of Endogenous Proteins Coupled with Automated Sample Preparation and Mass Spectrometry Analysis

    NASA Astrophysics Data System (ADS)

    Martin, Nicholas J.; Bunch, Josephine; Cooper, Helen J.

    2013-08-01

    Dried blood spots offer many advantages as a sample format including ease and safety of transport and handling. To date, the majority of mass spectrometry analyses of dried blood spots have focused on small molecules or hemoglobin. However, dried blood spots are a potentially rich source of protein biomarkers, an area that has been overlooked. To address this issue, we have applied an untargeted bottom-up proteomics approach to the analysis of dried blood spots. We present an automated and integrated method for extraction of endogenous proteins from the surface of dried blood spots and sample preparation via trypsin digestion by use of the Advion Biosciences Triversa Nanomate robotic platform. Liquid chromatography tandem mass spectrometry of the resulting digests enabled identification of 120 proteins from a single dried blood spot. The proteins identified span a concentration range of four orders of magnitude. The method is evaluated and the results discussed in terms of the proteins identified and their potential use as biomarkers in screening programs.

  10. Automated extraction and quantitation of oncogenic HPV genotypes from cervical samples by a real-time PCR-based system.

    PubMed

    Broccolo, Francesco; Cocuzza, Clementina E

    2008-03-01

    Accurate laboratory assays for the diagnosis of persistent oncogenic HPV infection are increasingly being recognized as essential for the clinical management of women with cervical precancerous lesions. HPV viral load has been suggested to be a surrogate marker of persistent infection. Four independent real-time quantitative TaqMan PCR assays were developed for: HPV-16, -31, -18 and/or -45, and -33 and/or -52, -58, -67. The assays had a wide dynamic range of detection and a high degree of accuracy, repeatability and reproducibility. In order to minimize material and hands-on time, automated nucleic acid extraction was performed using a 96-well plate format integrated into a robotic liquid handler workstation. The performance of the TaqMan assays for HPV identification was assessed by comparing results with those obtained by means of PCR using consensus primers (GP5+/GP6+) and sequencing (296 samples) and INNO-LiPA analysis (31 samples). Generally, good agreement was found between results obtained by the real-time PCR assays and the GP(+)-PCR system (kappa statistic = 0.91). In conclusion, this study describes four newly developed real-time PCR assays that provide a reliable and high-throughput method for detection of not only HPV DNA but also HPV activity of the most common oncogenic HPV types in cervical specimens.
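    Real-time quantitative PCR converts threshold-cycle (Ct) values to copy numbers via a linear standard curve; a sketch with a hypothetical tenfold dilution series (the assays' actual calibration data are not given here):

```python
from statistics import mean

def fit_standard_curve(log10_copies, ct_values):
    """Fit Ct = slope*log10(copies) + intercept; slope ≈ -3.32 implies ~100% PCR efficiency."""
    mx, my = mean(log10_copies), mean(ct_values)
    slope = (sum((x - mx) * (y - my) for x, y in zip(log10_copies, ct_values))
             / sum((x - mx) ** 2 for x in log10_copies))
    return slope, my - slope * mx

def quantify(ct, slope, intercept):
    """Copy number of an unknown sample from its Ct and the fitted standard curve."""
    return 10 ** ((ct - intercept) / slope)

# hypothetical tenfold dilution series, 10^2 .. 10^6 copies
slope, intercept = fit_standard_curve([2, 3, 4, 5, 6], [33.4, 30.1, 26.8, 23.5, 20.2])
print(round(slope, 2), round(quantify(26.8, slope, intercept)))  # -3.3 10000
```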

  11. Exposing Exposure: Automated Anatomy-specific CT Radiation Exposure Extraction for Quality Assurance and Radiation Monitoring

    PubMed Central

    Warden, Graham I.; Farkas, Cameron E.; Ikuta, Ichiro; Prevedello, Luciano M.; Andriole, Katherine P.; Khorasani, Ramin

    2012-01-01

    Purpose: To develop and validate an informatics toolkit that extracts anatomy-specific computed tomography (CT) radiation exposure metrics (volume CT dose index and dose-length product) from existing digital image archives through optical character recognition of CT dose report screen captures (dose screens) combined with Digital Imaging and Communications in Medicine attributes. Materials and Methods: This institutional review board–approved HIPAA-compliant study was performed in a large urban health care delivery network. Data were drawn from a random sample of CT encounters that occurred between 2000 and 2010; images from these encounters were contained within the enterprise image archive, which encompassed images obtained at an adult academic tertiary referral hospital and its affiliated sites, including a cancer center, a community hospital, and outpatient imaging centers, as well as images imported from other facilities. Software was validated by using 150 randomly selected encounters for each major CT scanner manufacturer, with outcome measures of dose screen retrieval rate (proportion of correctly located dose screens) and anatomic assignment precision (proportion of extracted exposure data with correctly assigned anatomic region, such as head, chest, or abdomen and pelvis). The 95% binomial confidence intervals (CIs) were calculated for discrete proportions, and CIs were derived from the standard error of the mean for continuous variables. After validation, the informatics toolkit was used to populate an exposure repository from a cohort of 54 549 CT encounters, of which 29 948 had available dose screens. Results: Validation yielded a dose screen retrieval rate of 99% (597 of 605 CT encounters; 95% CI: 98%, 100%) and an anatomic assignment precision of 94% (summed DLP fraction correct, 563 of 600 CT encounters; 95% CI: 92%, 96%). Patient safety applications of the resulting data repository include benchmarking between institutions, CT protocol quality
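    The reported 95% binomial CIs can be approximated with the normal-approximation interval; a sketch using the retrieval counts from the abstract (597 of 605; the paper's exact binomial interval may differ slightly):

```python
import math

def binomial_ci(successes, n, z=1.96):
    """Normal-approximation 95% confidence interval for a proportion, clipped to [0, 1]."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half), min(1.0, p + half)

# dose screen retrieval rate: 597 of 605 validation encounters (from the abstract)
p, lo, hi = binomial_ci(597, 605)
print(round(p, 3), round(lo, 3), round(hi, 3))  # 0.987 0.978 0.996
```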

  12. Hyphenating Centrifugal Partition Chromatography with Nuclear Magnetic Resonance through Automated Solid Phase Extraction.

    PubMed

    Bisson, Jonathan; Brunel, Marion; Badoc, Alain; Da Costa, Grégory; Richard, Tristan; Mérillon, Jean-Michel; Waffo-Téguo, Pierre

    2016-10-18

    Centrifugal partition chromatography (CPC) and all countercurrent separation apparatus provide chemists with efficient ways to work with complex matrixes, especially in the domain of natural products. However, despite the great advances provided by these techniques, more efficient ways of analyzing the output flow would bring further enhancement. This study describes a hyphenated approach that couples CPC with NMR indirectly through a solid phase extraction (SPE) apparatus intended for high-pressure liquid chromatography (HPLC)-NMR hyphenation. Some hardware changes were needed to adapt the incompatible flow rates, and a reverse-engineering approach led to the specific software required to control the apparatus. 1D (1)H NMR and (1)H-(1)H correlation spectroscopy (COSY) spectra were acquired in reasonable time without the need for any solvent-suppression method, thanks to the SPE nitrogen drying step. The reduced usage of expensive deuterated solvents, from several hundred milliliters down to the milliliter order, is the major improvement of this approach compared to previously published ones.

  13. Automated quantification of distributed landslide movement using circular tree trunks extracted from terrestrial laser scan data

    NASA Astrophysics Data System (ADS)

    Conner, Jeremy C.; Olsen, Michael J.

    2014-06-01

    This manuscript presents a novel algorithm to automatically detect landslide movement in a forested area using displacements of tree trunks distributed across the landslide, surveyed repeatedly using terrestrial laser scanning (TLS). Common landslide monitoring techniques include: inclinometers, global positioning system (GPS), and interferometric synthetic aperture radar (InSAR). While these techniques provide valuable data for monitoring landslides, they can be difficult to apply with the spatial or temporal resolution needed to understand complex landslides, specifically in forested environments. Comparison of the center coordinates (determined via least-squares fit of the TLS data) of a cross section of the tree trunk between consecutive surveys enables quantification of landslide movement rates, which can be used to analyze patterns of landslide displacement. The capabilities of this new methodology were tested through a case study analyzing the Johnson Creek Landslide, a complex, quick-moving coastal landslide which has proven difficult to monitor using other techniques. A parametric analysis of fitting thresholds was also conducted to determine the reliability of the calculated tree trunk displacements and the number of features that were extracted. The optimal parameters for selecting trees for movement analysis were found to be less than 1.5 cm for the RMS residuals of the circle fit and less than 1.0 cm for the difference in the calculated tree radii between epochs.
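    Least-squares fitting of a circle to a trunk cross-section can be done algebraically (the Kåsa fit; the paper's exact fitting method is not specified here). A self-contained sketch with four hypothetical scan points on a 0.2 m radius trunk:

```python
def fit_circle(points):
    """Kåsa algebraic circle fit: least squares on x^2 + y^2 = 2a*x + 2b*y + c."""
    # accumulate the 3x3 normal equations M v = t for v = (a, b, c)
    M = [[0.0] * 3 for _ in range(3)]
    t = [0.0] * 3
    for x, y in points:
        row = (2.0 * x, 2.0 * y, 1.0)
        z = x * x + y * y
        for i in range(3):
            t[i] += row[i] * z
            for j in range(3):
                M[i][j] += row[i] * row[j]
    # Gaussian elimination with partial pivoting
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        t[col], t[piv] = t[piv], t[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for k in range(col, 3):
                M[r][k] -= f * M[col][k]
            t[r] -= f * t[col]
    v = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        v[r] = (t[r] - sum(M[r][k] * v[k] for k in range(r + 1, 3))) / M[r][r]
    a, b, c = v
    return (a, b), (c + a * a + b * b) ** 0.5

# hypothetical scan points on a 0.2 m radius trunk centred at (3, 5)
centre, radius = fit_circle([(3.2, 5.0), (2.8, 5.0), (3.0, 5.2), (3.0, 4.8)])
print(round(centre[0], 2), round(centre[1], 2), round(radius, 2))  # 3.0 5.0 0.2
```

Comparing the fitted centre between survey epochs gives the per-tree displacement used for movement analysis.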

  14. High-throughput pharmacokinetics screen of VLA-4 antagonists by LC/MS/MS coupled with automated solid-phase extraction sample preparation.

    PubMed

    Tong, Xinchun S; Wang, Junying; Zheng, Song; Pivnichny, James V

    2004-06-29

    Automation of plasma sample preparation for pharmacokinetic studies on VLA-4 antagonists has been achieved by using 96-well format solid-phase extraction operated by a Beckman Coulter Biomek 2000 liquid handling system. The Biomek 2000 robot performs fully automated plasma sample preparation tasks that include serial dilution of standard solutions, pipetting of plasma samples, addition of standard and internal standard solutions, and solid-phase extraction (SPE) on Waters OASIS 96-well plates. This automated sample preparation process takes less than 2 h for a typical pharmacokinetic study comprising 51 samples, 24 standards, 9 quality controls, and 3-6 dose checks, with minimal manual intervention. Extensive validation has been performed to ensure the accuracy and reliability of this method. A two-stage vacuum pressure controller has been incorporated into the program to improve SPE efficiency. This automated SPE sample preparation approach, combined with the high sensitivity and selectivity of liquid chromatography coupled with tandem mass spectrometry (LC/MS/MS), has been successfully applied to both individual and cassette dosing for pharmacokinetic screening of a large number of VLA-4 antagonists with a limit of quantitation in the range of 1-5 ng/ml. Consequently, a significant throughput increase has been achieved, along with the elimination of tedious labor and its consequential tendency to produce errors.

  15. Validation of the TaqMan Influenza A Detection Kit and a rapid automated total nucleic acid extraction method to detect influenza A virus in nasopharyngeal specimens.

    PubMed

    Bolotin, Shelly; De Lima, Cedric; Choi, Kam-Wing; Lombos, Ernesto; Burton, Laura; Mazzulli, Tony; Drews, Steven J

    2009-01-01

    This study describes the validation of the TaqMan Influenza A Detection Kit v2.0 combined with an automated nucleic acid extraction method. The limit of detection of this assay was determined by probit regression (95% confidence interval) to be 2 influenza A/PR/8/34 (H1N1) virus particles per microlitre. One hundred and eleven specimens previously tested using the Seeplex RV assay and viral culture methods were tested using the TaqMan Influenza A Detection Kit. Compared to the aggregate gold standard, the sensitivity and specificity of the TaqMan Influenza A Detection Kit were 100% (35/35) and 97% (74/76), respectively. Because of its accuracy, quick turnaround time and lyophilized bead format, the TaqMan Influenza A Detection Kit, combined with the NucliSens easyMAG automated extraction method, constitutes a reliable protocol for influenza A diagnosis.

  16. Development of an automated method for Folin-Ciocalteu total phenolic assay in artichoke extracts.

    PubMed

    Yoo, Kil Sun; Lee, Eun Jin; Leskovar, Daniel; Patil, Bhimanagouda S

    2012-12-01

    We developed a fully automatic, consistent, and fast system to run the Folin-Ciocalteu (F-C) total phenolic assay on artichoke extract samples. The system uses two high performance liquid chromatography (HPLC) pumps, an autosampler, a column heater, a UV/Vis detector, and a data collection system. To test the system, one pump delivered 10-fold diluted F-C reagent solution at a rate of 0.7 mL/min, and the other delivered 0.4 g/mL sodium carbonate at a rate of 2.1 mL/min. The autosampler injected 10 μL every 1.2 min; each injection was mixed with the F-C reagent and heated to 65 °C while passing through the column heater. The heated reactant was mixed with sodium carbonate, and color intensity was measured by the detector at 600 nm. The data collection system recorded the color intensity, and the peak area of each sample was converted to the total phenolic concentration, expressed in μg/mL as either chlorogenic acid or gallic acid. This new method had superb repeatability (0.7% CV) and a high correlation with both the manual method (r(2) = 0.93) and the HPLC method (r(2) = 0.78). Ascorbic acid and quercetin showed variable antioxidant activity, but sugars did not. This method can be efficiently applied in research that needs to test large numbers of samples for antioxidant capacity with speed and accuracy.
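The conversion from detector peak area to total phenolic concentration rests on an ordinary least-squares standard curve against a chlorogenic- or gallic-acid standard. A minimal sketch with hypothetical helper names (not the authors' software):

```python
def linfit(x, y):
    """Ordinary least-squares fit y = a*x + b for a standard curve
    (x = standard concentration, y = measured peak area)."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    b = my - a * mx
    return a, b

def area_to_conc(area, a, b):
    """Invert the standard curve: concentration (e.g. ug/mL as gallic
    acid equivalents) from an F-C peak area."""
    return (area - b) / a
```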

  17. Automated liquid-liquid extraction workstation for library synthesis and its use in the parallel and chromatography-free synthesis of 2-alkyl-3-alkyl-4-(3H)-quinazolinones.

    PubMed

    Carpintero, Mercedes; Cifuentes, Marta; Ferritto, Rafael; Haro, Rubén; Toledo, Miguel A

    2007-01-01

    An automated liquid-liquid extraction workstation has been developed. This module processes up to 96 samples in an automated and parallel mode, avoiding time-consuming and labor-intensive sample manipulation during the workup process. To validate the workstation, a highly automated and chromatography-free synthesis of differentially substituted quinazolin-4(3H)-ones with two diversity points was carried out using isatoic anhydride as the starting material.

  18. Path duplication using GPS carrier-based relative position for automated ground vehicle convoys

    NASA Astrophysics Data System (ADS)

    Travis, William E., III

    A GPS-based automated convoy strategy to duplicate the path of a lead vehicle is presented in this dissertation. Laser scanners and cameras are not used; all information available comes from GPS or inertial systems. An algorithm is detailed that uses GPS carrier phase measurements to determine the relative position between two moving ground vehicles. Error analysis shows the accuracy is centimeter level. It is shown that the time to the first solution fix is dependent upon the initial relative position accuracy, and that near-instantaneous fixes can be realized if that accuracy is less than 20 centimeters. The relative positioning algorithm is then augmented with inertial measurement units to dead reckon through brief outages. Performance analysis of automotive- and tactical-grade units shows the 20-centimeter threshold can be maintained for only a few seconds with the automotive-grade unit and for 14 seconds with the tactical unit. Next, techniques to determine odometry information in vector form are discussed. Three methods are outlined: dead reckoning of inertial sensors, time-differencing GPS carrier measurements to determine the change in platform position, and aiding the time-differenced carrier measurements with inertial measurements. Partial integration of a tactical-grade inertial measurement unit provided the lowest error drift for the scenarios investigated, but the time-differenced carrier phase approach provided the most cost-feasible approach with similar accuracy. Finally, the relative position and odometry algorithms are used to generate a reference by which an automated following vehicle can replicate a lead vehicle's path of travel. The first method presented uses only the relative position information to determine a relative angle to the leader. Using the relative angle as a heading reference for steering control causes the follower to drive at the lead vehicle, thereby creating a towing effect on the follower when both vehicles are in motion.
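The "drive at the leader" steering reference described above reduces to a relative bearing computed from the carrier-phase relative-position vector. A minimal sketch assuming east/north coordinates (function and parameter names are hypothetical):

```python
import math

def bearing_to_leader(follower_en, leader_en):
    """Bearing from follower to leader, in degrees clockwise from north,
    given (east, north) positions from the relative-position solution.
    Feeding this bearing to the heading controller makes the follower
    steer directly at the lead vehicle."""
    de = leader_en[0] - follower_en[0]
    dn = leader_en[1] - follower_en[1]
    return math.degrees(math.atan2(de, dn)) % 360.0
```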

  19. Extracted facial feature of racial closely related faces

    NASA Astrophysics Data System (ADS)

    Liewchavalit, Chalothorn; Akiba, Masakazu; Kanno, Tsuneo; Nagao, Tomoharu

    2010-02-01

    Human faces contain a great deal of demographic information such as identity, gender, age, race and emotion. Human beings can perceive these pieces of information and use them as important clues in social interaction with other people. Race perception is considered one of the most delicate and sensitive parts of face perception. There is much research concerning image-based race recognition, but most of it focuses on major race groups such as Caucasoid, Negroid and Mongoloid. This paper focuses on how people classify the race of racially closely related groups. As a sample of such a group, we chose Japanese and Thai faces to represent the difference between Northern and Southern Mongoloid. Three psychological experiments were performed to study the strategies of face perception in race classification. The results suggest that race perception is an ability that can be learned. Eyes and eyebrows attract the most attention, and the eyes are a significant factor in race perception. Principal Component Analysis (PCA) was performed to extract facial features of the sample race groups. Extracted race features of texture and shape were used to synthesize faces. The results suggest that racial features rely on detailed texture rather than shape. This research is fundamental work on race perception, essential to the establishment of a human-like race recognition system.
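The PCA feature-extraction step can be sketched as follows. This is a minimal NumPy version assuming each face is represented as a flattened texture image or shape-landmark vector; it is an illustration, not the authors' implementation:

```python
import numpy as np

def pca_features(X, n_components):
    """PCA via SVD. Rows of X are flattened face vectors (texture or
    shape). Returns the mean face, the top principal components
    ("eigenfaces"), and each face's coordinates along them."""
    mean = X.mean(axis=0)
    Xc = X - mean                              # center the data
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:n_components]             # principal directions
    scores = Xc @ components.T                 # per-face feature values
    return mean, components, scores
```

A face can be re-synthesized (as in the study's face-synthesis step) by adding `scores @ components` back onto the mean face.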

  20. AUTOMATED ANALYSIS OF AQUEOUS SAMPLES CONTAINING PESTICIDES, ACIDIC/BASIC/NEUTRAL SEMIVOLATILES AND VOLATILE ORGANIC COMPOUNDS BY SOLID PHASE EXTRACTION COUPLED IN-LINE TO LARGE VOLUME INJECTION GC/MS

    EPA Science Inventory

    Data are presented on the development of a new automated system combining solid phase extraction (SPE) with gas chromatography/mass spectrometry (GC/MS) for the single-run analysis of water samples containing a broad range of organic compounds. The system uses commercially available automated in-line 10-m...

  1. Determination of dialkyl phosphate metabolites of organophosphorus pesticides in human urine by automated solid-phase extraction, derivatization, and gas chromatography-mass spectrometry.

    PubMed

    Hemakanthi De Alwis, G K; Needham, Larry L; Barr, Dana B

    2008-01-01

    Organophosphorus (OP) pesticides are highly toxic but are nevertheless commonly used worldwide. Their urinary dialkylphosphate (DAP) metabolites are widely used for exposure assessment of OP pesticides in humans. We previously developed an analytical method to measure urinary DAPs using solid-phase extraction (SPE), derivatization, and gas chromatography-tandem mass spectrometry (GC-MS-MS) with quantification by an isotope-dilution technique. We now present a more cost-effective yet highly accurate method that can be easily adapted by many laboratories for routine OP exposure assessment. This method is simple and fast and involves automated SPE of the metabolites followed by derivatization with pentafluorobenzyl bromide and quantification by GC-MS. Dibutyl phosphate (DBP) serves as the internal standard. The detection limits for the six metabolites ranged from 0.1 to 0.15 ng/mL, and the relative standard deviation of the analytical procedure was 2-15%, depending on the metabolite. We compared the performance of DBP as an internal standard with that of isotope-labeled compounds and found that DBP gives reliable results. We also optimized the reaction parameters of the pentafluorobenzylation.

  2. Automated extraction of pressure ridges from SAR images of sea ice - Comparison with surface truth

    NASA Technical Reports Server (NTRS)

    Vesecky, J. F.; Smith, M. P.; Samadani, R.; Daida, J. M.; Comiso, J. C.

    1991-01-01

    The authors estimate the characteristics of ridges and leads in sea ice from SAR (synthetic aperture radar) images. Such estimates are based on the hypothesis that bright filamentary features in SAR sea ice images correspond to pressure ridges. A data set collected in the Greenland Sea in 1987 allows this hypothesis to be evaluated for X-band SAR images. A preliminary analysis of data collected from SAR images and ice elevation (from a laser altimeter) is presented. It is found that SAR image brightness and ice elevation are clearly related. However, the correlation, using the data and techniques applied, is not strong.

  3. Evaluation of three automated nucleic acid extraction systems for identification of respiratory viruses in clinical specimens by multiplex real-time PCR.

    PubMed

    Kim, Yoonjung; Han, Mi-Soon; Kim, Juwon; Kwon, Aerin; Lee, Kyung-A

    2014-01-01

    A total of 84 nasopharyngeal swab specimens were collected from 84 patients. Viral nucleic acid was extracted by three automated extraction systems: QIAcube (Qiagen, Germany), EZ1 Advanced XL (Qiagen), and MICROLAB Nimbus IVD (Hamilton, USA). Fourteen RNA viruses and two DNA viruses were detected using the Anyplex II RV16 Detection kit (Seegene, Republic of Korea). The EZ1 Advanced XL system demonstrated the best analytical sensitivity for all three viral strains, and the nucleic acids it extracted showed higher positive rates for virus detection than the others. Meanwhile, the MICROLAB Nimbus IVD system comprises fully automated steps from nucleic acid extraction to PCR setup, which could reduce human error. For nucleic acids recovered from nasopharyngeal swab specimens, the QIAcube system showed the fewest false negative results and the best concordance rate, and it may be more suitable for detecting various viruses, including RNA and DNA virus strains. Each system showed different sensitivity and specificity for the detection of certain viral pathogens and demonstrated different characteristics such as turnaround time and sample capacity. Therefore, these factors should be considered when new nucleic acid extraction systems are introduced to the laboratory.

  4. Automation of DNA and miRNA co-extraction for miRNA-based identification of human body fluids and tissues.

    PubMed

    Kulstein, Galina; Marienfeld, Ralf; Miltner, Erich; Wiegand, Peter

    2016-10-01

    In recent years, microRNA (miRNA) analysis has come into focus in the field of forensic genetics. Yet, no standardized and recommendable protocols for the co-isolation of miRNA and DNA from forensically relevant samples have been developed so far. Hence, this study evaluated the performance of an automated Maxwell® 16 System-based strategy (Promega) for co-extraction of DNA and miRNA from forensically relevant (blood and saliva) samples compared to (semi-)manual extraction methods. Three procedures were compared on the basis of recovered quantity of DNA and miRNA (as determined by real-time PCR and Bioanalyzer), miRNA profiling (shown by Cq values and extraction efficiency), STR profiles, duration, contamination risk and handling. All in all, the results highlight that the automated co-extraction procedure yielded the highest miRNA and DNA amounts from saliva and blood samples compared to both (semi-)manual protocols. For aged and genuine samples of forensically relevant traces, the miRNA and DNA yields were also sufficient for subsequent downstream analysis. Furthermore, the strategy allows miRNA extraction only in cases where it is relevant to obtain additional information about the sample type. Besides, this system enables flexible sample throughput and labor-saving sample processing with a reduced risk of cross-contamination.

  5. The ValleyMorph Tool: An automated extraction tool for transverse topographic symmetry (T-) factor and valley width to valley height (Vf-) ratio

    NASA Astrophysics Data System (ADS)

    Daxberger, Heidi; Dalumpines, Ron; Scott, Darren M.; Riller, Ulrich

    2014-09-01

    In tectonically active regions on Earth, shallow-crustal deformation associated with seismic hazards may pose a threat to human life and property. The study of landform development, such as analysis of the valley width to valley height ratio (Vf-ratio) and the Transverse Topographic Symmetry Factor (T-factor), which delineates drainage basin symmetry, can be used as a relative measure of tectonic activity along fault-bound mountain fronts. The fast evolution of digital elevation models (DEMs) provides an ideal basis for remotely sensed tectonomorphic studies of large areas using Geographical Information Systems (GIS). However, manual extraction of the above-mentioned morphologic parameters can be tedious and very time consuming, and basic GIS software suites do not provide the necessary built-in functions. Therefore, we present a newly developed, Python-based, ESRI ArcGIS-compatible tool and stand-alone script, the ValleyMorph Tool. This tool facilitates automated extraction of Vf-ratio and T-factor data for large regions. Using a digital elevation raster and watershed polygon files as input, the tool provides output in the form of several ArcGIS data tables and shapefiles, ideal for further data manipulation and computation. This implementation enables easy adoption among the ArcGIS user community and code conversion to earlier ArcGIS versions. The ValleyMorph Tool is easy to use due to a simple graphical user interface. The tool was tested for the southern Central Andes using a total of 3366 watersheds.
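The two morphometric indices the tool extracts are defined in the tectonic-geomorphology literature as simple ratios; a sketch of the underlying formulas (variable names here are generic, not the tool's internals):

```python
def vf_ratio(vfw, eld, erd, esc):
    """Valley floor width to valley height ratio:
    Vf = 2*Vfw / ((Eld - Esc) + (Erd - Esc)),
    where Vfw is the valley-floor width, Eld/Erd are the elevations of
    the left and right valley divides, and Esc is the valley-floor
    elevation. Low Vf suggests active incision (V-shaped valley)."""
    return 2.0 * vfw / ((eld - esc) + (erd - esc))

def t_factor(da, dd):
    """Transverse Topographic Symmetry Factor T = Da/Dd, where Da is the
    distance from the basin midline to the active channel and Dd the
    distance from the midline to the basin divide. T = 0 for a
    perfectly symmetric basin; T approaches 1 with strong asymmetry."""
    return da / dd
```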

  6. Employment and residential characteristics in relation to automated external defibrillator locations

    PubMed Central

    Griffis, Heather M.; Band, Roger A; Ruther, Matthew; Harhay, Michael; Asch, David A.; Hershey, John C.; Hill, Shawndra; Nadkarni, Lindsay; Kilaru, Austin; Branas, Charles C.; Shofer, Frances; Nichol, Graham; Becker, Lance B.; Merchant, Raina M.

    2015-01-01

    Background Survival from out-of-hospital cardiac arrest (OHCA) is generally poor and varies by geography. Variability in automated external defibrillator (AED) locations may be a contributing factor. To inform optimal placement of AEDs, we investigated AED access in a major US city relative to demographic and employment characteristics. Methods and Results This was a retrospective analysis of a Philadelphia AED registry (2,559 total AEDs). The 2010 US Census and the Local Employment Dynamics (LED) database by ZIP code were used. AED access was calculated as the weighted areal percentage of each ZIP code covered by a 400-meter radius around each AED. Of 47 ZIP codes, only 9% (4) were high AED service areas. In 26% (12) of ZIP codes, less than 35% of the area was covered by AED service areas. Higher AED access ZIP codes were more likely to have a moderately populated residential area (p=0.032), higher median household income (p=0.006), and higher paying jobs (p=0.008). Conclusions The locations of AEDs vary across specific ZIP codes; select residential and employment characteristics explain some of the variation. Further work evaluating OHCA locations, AED use and availability, and OHCA outcomes could inform AED placement policies. Optimizing the placement of AEDs through this work may help to increase survival. PMID:26856232
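The areal-percentage coverage metric can be approximated without GIS software by Monte Carlo sampling. A simplified sketch assuming projected (metre) coordinates and a caller-supplied point-in-polygon test; the study itself used GIS overlay, not this method:

```python
import math
import random

def coverage_fraction(contains, bbox, aeds, radius=400.0, n=20000, seed=1):
    """Monte Carlo estimate of the fraction of a ZIP-code polygon lying
    within `radius` metres of any AED.

    `contains(x, y)` is a point-in-polygon test supplied by the caller;
    `bbox` = (xmin, ymin, xmax, ymax); `aeds` is a list of (x, y) AED
    locations in the same projected coordinate system."""
    rng = random.Random(seed)
    xmin, ymin, xmax, ymax = bbox
    inside = covered = 0
    while inside < n:
        x = rng.uniform(xmin, xmax)
        y = rng.uniform(ymin, ymax)
        if not contains(x, y):
            continue  # sample fell outside the ZIP-code polygon
        inside += 1
        if any(math.hypot(x - ax, y - ay) <= radius for ax, ay in aeds):
            covered += 1
    return covered / inside
```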

  7. Submicrometric Magnetic Nanoporous Carbons Derived from Metal-Organic Frameworks Enabling Automated Electromagnet-Assisted Online Solid-Phase Extraction.

    PubMed

    Frizzarin, Rejane M; Palomino Cabello, Carlos; Bauzà, Maria Del Mar; Portugal, Lindomar A; Maya, Fernando; Cerdà, Víctor; Estela, José M; Turnes Palomino, Gemma

    2016-07-19

    We present the first application of submicrometric magnetic nanoporous carbons (μMNPCs) as sorbents for automated solid-phase extraction (SPE). Small zeolitic imidazolate framework-67 crystals are obtained at room temperature and directly carbonized under an inert atmosphere to obtain submicrometric nanoporous carbons containing magnetic cobalt nanoparticles. The μMNPCs have a high contact area and high stability, and their preparation is simple and cost-effective. The prepared μMNPCs are exploited as sorbents in a microcolumn format in a sequential injection analysis (SIA) system with online spectrophotometric detection, which includes a specially designed three-dimensional (3D)-printed holder containing an automatically actuated electromagnet. The combined action of permanent magnets and the automatically actuated electromagnet enabled movement of the solid bed of particles inside the microcolumn, preventing their aggregation, increasing the versatility of the system, and increasing the preconcentration efficiency. The method was optimized using a full factorial design and a Doehlert matrix. The developed system was applied to the determination of anionic surfactants, exploiting the retention of the ion pairs formed with Methylene Blue on the μMNPC. Using sodium dodecyl sulfate as a model analyte, quantification was linear from 50 to 1000 μg L(-1); the detection limit was 17.5 μg L(-1); the coefficient of variation (n = 8; 100 μg L(-1)) was 2.7%; and the analysis throughput was 13 h(-1). The developed approach was applied to the determination of anionic surfactants in water samples (natural water, groundwater, and wastewater), yielding recoveries of 93% to 110% (95% confidence level).

  8. MG-Digger: An Automated Pipeline to Search for Giant Virus-Related Sequences in Metagenomes.

    PubMed

    Verneau, Jonathan; Levasseur, Anthony; Raoult, Didier; La Scola, Bernard; Colson, Philippe

    2016-01-01

    The number of metagenomic studies conducted each year is growing dramatically. Storage and analysis of such big data is difficult and time-consuming. Interestingly, analysis shows that environmental and human metagenomes include a significant amount of non-annotated sequences, representing a 'dark matter.' We established a bioinformatics pipeline that automatically detects metagenome reads matching query sequences from a given set and applied this tool to the detection of sequences matching large and giant DNA viral members of the proposed order Megavirales or virophages. A total of 1,045 environmental and human metagenomes (≈ 1 Terabase) were collected, processed, and stored on our bioinformatics server. In addition, nucleotide and protein sequences from 93 Megavirales representatives, including 19 giant viruses of amoeba, and 5 virophages, were collected. The pipeline was generated by scripts written in the Python language and entitled MG-Digger. Metagenomes previously found to contain megavirus-like sequences were tested as controls. MG-Digger was able to annotate hundreds of metagenome sequences as best matching those of giant viruses. These sequences were most often found to be similar to phycodnavirus or mimivirus sequences, but included reads related to recently available pandoraviruses, Pithovirus sibericum, and faustoviruses. Compared to other tools, MG-Digger combined stand-alone use on Linux or Windows operating systems through a user-friendly interface, implementation of ready-to-use customized metagenome databases and query sequence databases, adjustable parameters for BLAST searches, and creation of output files containing selected reads with best match identification. Compared to Metavir 2, a reference tool in viral metagenome analysis, MG-Digger detected 8% more true positive Megavirales-related reads in a control metagenome. The present work shows that massive, automated and recurrent analyses of metagenomes are effective in improving knowledge about the

  10. An automated method to analyze language use in patients with schizophrenia and their first-degree relatives

    PubMed Central

    Elvevåg, Brita; Foltz, Peter W.; Rosenstein, Mark; DeLisi, Lynn E.

    2009-01-01

    Communication disturbances are prevalent in schizophrenia, and since it is a heritable illness these are likely present, albeit in a muted form, in the relatives of patients. Given the time-consuming and often subjective nature of discourse analysis, these deviations are frequently not assayed in large-scale studies. Recent work in computational linguistics and statistically based semantic analysis has shown the potential and power of automated analysis of communication. We present an automated and objective approach to modeling discourse that detects very subtle deviations between probands, their first-degree relatives and unrelated healthy controls. Although these findings should be regarded as preliminary due to the limitations of the data at our disposal, we present a brief analysis of the models that best differentiate these groups in order to illustrate the utility of the method for future explorations of how language components are differentially affected by familial and illness-related issues. PMID:20383310
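The authors' approach builds on latent semantic analysis; as a greatly simplified stand-in, adjacent-sentence similarity over bag-of-words vectors illustrates the kind of automated coherence measure involved. This is an illustration of the general idea, not the published model:

```python
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse word-count vectors."""
    num = sum(a[w] * b[w] for w in set(a) & set(b))
    den = (math.sqrt(sum(v * v for v in a.values()))
           * math.sqrt(sum(v * v for v in b.values())))
    return num / den if den else 0.0

def coherence_profile(sentences):
    """Mean cosine similarity between adjacent sentences' bag-of-words
    vectors: a crude stand-in for an LSA-based discourse-coherence
    score (low values = more tangential speech)."""
    bags = [Counter(s.lower().split()) for s in sentences]
    sims = [cosine(bags[i], bags[i + 1]) for i in range(len(bags) - 1)]
    return sum(sims) / len(sims)
```

In the published work, word vectors come from a trained LSA space rather than raw counts, which lets semantically related but non-identical words contribute to the similarity.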

  11. Characterization of eleutheroside B metabolites derived from an extract of Acanthopanax senticosus Harms by high-resolution liquid chromatography/quadrupole time-of-flight mass spectrometry and automated data analysis.

    PubMed

    Lu, Fang; Sun, Qiang; Bai, Yun; Bao, Shunru; Li, Xuzhao; Yan, Guangli; Liu, Shumin

    2012-10-01

    We elucidated the structure and metabolite profile of eleutheroside B, a component derived from the extract of Acanthopanax senticosus Harms, after oral administration of the extract in rats. Samples of rat plasma were collected and analyzed by a selective high-resolution liquid chromatography/quadrupole time-of-flight mass spectrometry (UPLC/Q-TOF MS) method with automated data analysis. A total of 11 metabolites were detected: four were identified, and three of those four are reported here for the first time. The three new plasma metabolites were identified on the basis of mass fragmentation patterns and literature reports. The major in vivo metabolic processes associated with eleutheroside B in A. senticosus include demethylation, acetylation, oxidation and glucuronidation after deglycosylation. A fairly comprehensive metabolic pathway was proposed for eleutheroside B. Our results provide a meaningful basis for drug discovery, design and clinical applications related to A. senticosus in traditional Chinese medicine.

  12. Semi-automated disk-type solid-phase extraction method for polychlorinated dibenzo-p-dioxins and dibenzofurans in aqueous samples and its application to natural water.

    PubMed

    Choi, J W; Lee, J H; Moon, B S; Baek, K H

    2007-07-20

    A disk-type solid-phase extraction (SPE) method was used for the extraction of polychlorinated dibenzo-p-dioxins and dibenzofurans (PCDD/Fs) in natural water and tap water. Since this SPE system comprised airtight glass covers with a decompression pump, it enabled continuous extraction with semi-automation. The disk-type SPE method was validated by comparing its recovery rates of spiked internal standards with those of liquid-liquid extraction (LLE). The recovery ranges of both methods were similar for the (13)C-labeled internal standards: 64.3-99.2% for the LLE and 52.4-93.6% for the SPE. For the native spike of 1,3,6,8-tetrachlorinated dibenzo-p-dioxin (TCDD) and octachlorinated dibenzo-p-dioxin (OCDD), the recoveries in the SPE were in the normal range of 77.9-101.1%. However, in the LLE, the recoveries of 1,3,6,8-TCDD decreased significantly; one of the reasons for the low recovery is that the solubility of this congener is high. The semi-automated SPE method was applied to the analysis of different types of water: river water, snow, sea water, raw water for drinking purposes, and tap water. PCDD/F congeners were found in some sea water and snow samples, while their concentrations in the other samples were below the limits of detection (LODs). This SPE system is appropriate for the routine analysis of water samples below 50 L.

  13. A Logic-Based Approach to Relation Extraction from Texts

    NASA Astrophysics Data System (ADS)

    Horváth, Tamás; Paass, Gerhard; Reichartz, Frank; Wrobel, Stefan

    In recent years, text mining has moved far beyond the classical problem of text classification, with an increased interest in more sophisticated processing of large text corpora, such as the evaluation of complex queries. This and several other tasks are based on the essential step of relation extraction. This problem becomes a typical application of learning logic programs by considering the dependency trees of sentences as relational structures and examples of the target relation as ground atoms of a target predicate. In this way, each example is represented by a definite first-order Horn clause. We show that an adaptation of Plotkin's least general generalization (LGG) operator can effectively be applied to such clauses and propose a simple and effective divide-and-conquer algorithm for listing a certain set of LGGs. We use these LGGs to generate binary features and compute the hypothesis by applying an SVM to the feature vectors obtained. Empirical results on the ACE-2003 benchmark dataset indicate that the performance of our approach is comparable to state-of-the-art kernel methods.
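Plotkin's LGG operator on first-order terms, which the paper adapts to dependency-tree clauses, can be sketched as follows. The term encoding (constants as strings, compound terms as tuples) and variable naming are assumptions for illustration:

```python
def lgg(t1, t2, subst=None):
    """Least general generalization of two first-order terms.

    Identical terms generalize to themselves; compound terms with the
    same functor and arity generalize argument-wise; any other pair is
    replaced by a variable, with the same pair always mapped to the
    same variable ('?0', '?1', ...) within one top-level call."""
    if subst is None:
        subst = {}
    if t1 == t2:
        return t1
    if (isinstance(t1, tuple) and isinstance(t2, tuple)
            and t1[0] == t2[0] and len(t1) == len(t2)):
        return (t1[0],) + tuple(lgg(a, b, subst)
                                for a, b in zip(t1[1:], t2[1:]))
    key = (t1, t2)
    if key not in subst:
        subst[key] = f"?{len(subst)}"
    return subst[key]
```

For clauses, the paper's setting pairs up literals and takes LGGs of the paired atoms; the sketch above shows the term-level core of that operation.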

  14. Effects of a psychophysiological system for adaptive automation on performance, workload, and the event-related potential P300 component

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J 3rd; Freeman, Frederick G.; Scerbo, Mark W.; Mikulka, Peter J.; Pope, Alan T.

    2003-01-01

    The present study examined the effects of an electroencephalographic- (EEG-) based system for adaptive automation on tracking performance and workload. In addition, event-related potentials (ERPs) to a secondary task were derived to determine whether they would provide an additional degree of workload specificity. Participants were run in an adaptive automation condition, in which the system switched between manual and automatic task modes based on the value of each individual's own EEG engagement index; a yoked control condition; or another control group, in which task mode switches followed a random pattern. Adaptive automation improved performance and resulted in lower levels of workload. Further, the P300 component of the ERP paralleled the sensitivity to task demands of the performance and subjective measures across conditions. These results indicate that it is possible to improve performance with a psychophysiological adaptive automation system and that ERPs may provide an alternative means for distinguishing among levels of cognitive task demand in such systems. Actual or potential applications of this research include improved methods for assessing operator workload and performance.
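The EEG engagement index used in this line of work is commonly computed as the ratio of beta band power to the sum of alpha and theta band power. A simplified sketch of the index and the negative-feedback mode-switching logic; the baseline comparison and thresholding details here are assumptions, not the exact published procedure:

```python
def engagement_index(beta, alpha, theta):
    """EEG engagement index: beta / (alpha + theta) band power."""
    return beta / (alpha + theta)

def next_mode(index, baseline):
    """Negative-feedback task allocation: when engagement falls below
    the operator's baseline, switch to manual control to re-engage the
    operator; when it rises above baseline, return the task to
    automatic mode."""
    return "manual" if index < baseline else "automatic"
```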

  15. Automated Metadata Extraction

    DTIC Science & Technology

    2008-06-01

  16. Highly sensitive routine method for urinary 3-hydroxybenzo[a]pyrene quantitation using liquid chromatography-fluorescence detection and automated off-line solid phase extraction.

    PubMed

    Barbeau, Damien; Maître, Anne; Marques, Marie

    2011-03-21

    Many workers and also the general population are exposed to polycyclic aromatic hydrocarbons (PAHs), and benzo[a]pyrene (BaP) was recently classified as carcinogenic for humans (group 1) by the International Agency for Research on Cancer. Biomonitoring of PAHs exposure is usually performed by urinary 1-hydroxypyrene (1-OHP) analysis. 1-OHP is a metabolite of pyrene, a non-carcinogenic PAH. In this work, we developed a very simple but highly sensitive analytical method of quantifying one urinary metabolite of BaP, 3-hydroxybenzo[a]pyrene (3-OHBaP), to evaluate carcinogenic PAHs exposure. After hydrolysis of 10 mL urine for two hours and concentration by automated off-line solid phase extraction, the sample was injected in a column-switching high-performance liquid chromatography fluorescence detection system. The limit of quantification was 0.2 pmol L(-1) (0.05 ng L(-1)) and the limit of detection was estimated at 0.07 pmol L(-1) (0.02 ng L(-1)). Linearity was established for 3-OHBaP concentrations ranging from 0.4 to 74.5 pmol L(-1) (0.1 to 20 ng L(-1)). Relative within-day standard deviation was less than 3% and relative between-day standard deviation was less than 4%. In non-occupationally exposed subjects, median concentrations for smokers compared with non-smokers were 3.5 times higher for 1-OHP (p<0.001) and 2 times higher for 3-OHBaP (p<0.05). The two urinary biomarkers were correlated in smokers (ρ=0.636; p<0.05; n=10) but not in non-smokers (ρ=0.09; p>0.05; n=21).

  17. Determination of Low Concentrations of Acetochlor in Water by Automated Solid-Phase Extraction and Gas Chromatography with Mass-Selective Detection

    USGS Publications Warehouse

    Lindley, C.E.; Stewart, J.T.; Sandstrom, M.W.

    1996-01-01

    A sensitive and reliable gas chromatographic/mass spectrometric (GC/MS) method for determining acetochlor in environmental water samples was developed. The method involves automated extraction of the herbicide from a filtered 1 L water sample through a C18 solid-phase extraction column, elution from the column with hexane-isopropyl alcohol (3 + 1), and concentration of the extract with nitrogen gas. The herbicide is quantitated by capillary-column GC/MS with selected-ion monitoring of 3 characteristic ions. The single-operator method detection limit for reagent water samples is 0.0015 µg/L. Mean recoveries ranged from about 92 to 115% for 3 water matrixes fortified at 0.05 and 0.5 µg/L. Average single-operator precision, over the course of 1 week, was better than 5%.
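
    A single-operator method detection limit of this kind is conventionally computed, EPA-style, as the Student-t value (n-1 degrees of freedom, 99% one-sided) times the standard deviation of replicate low-level spikes. The replicate data below are hypothetical, not the paper's measurements.

```python
import statistics

def method_detection_limit(replicates, t_value):
    """EPA-style MDL: Student-t (n-1 df, 99% one-sided) times the standard
    deviation of replicate low-level spiked measurements."""
    return t_value * statistics.stdev(replicates)

# Hypothetical replicate recoveries of a 0.05 ug/L spike (n = 7)
spikes = [0.048, 0.052, 0.050, 0.047, 0.053, 0.049, 0.051]
mdl = method_detection_limit(spikes, t_value=3.143)  # t for 6 df, 99%
```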

  18. [Determination of isothiocyanates and related compounds in mustard extract and horseradish extract used as natural food additives].

    PubMed

    Uematsu, Yoko; Hirata, Keiko; Suzuki, Kumi; Iida, Kenji; Ueta, Tadahiko; Kamata, Kunihiro

    2002-02-01

    Amounts of isothiocyanates and related compounds in a mustard extract and a horseradish extract for food additive use were determined by GC, after confirmation of the identity of GC peaks by GC/MS. Amounts of allyl isothiocyanate, which included that of allyl thiocyanate, because most of the allyl thiocyanate detected in the sample was assumed to have been formed from allyl isothiocyanate during GC analysis, were 97.6% and 85.4%, in the mustard extract and the horseradish extract, respectively. Total amounts of the identified isothiocyanates in the mustard extract and the horseradish extract were 98.5% and 95.4%, respectively. Allyl cyanide, a degradation product of allyl isothiocyanate, was found in the mustard extract and the horseradish extract at the levels of 0.57% and 1.73%, respectively. beta-Phenylethyl cyanide, a possible degradation product of beta-phenylethyl isothiocyanate, and allyl sulfides were found in the horseradish extract, at the levels of 0.13% and 0.46%, respectively. Allylamine, which is another degradation product of allyl isothiocyanate, was determined after acetylation, and was found in the mustard extract and the horseradish extract at the levels of 8 micrograms/g and 67 micrograms/g, respectively.

  19. Fully automated diagnosis of papilledema through robust extraction of vascular patterns and ocular pathology from fundus photographs

    PubMed Central

    Fatima, Khush Naseeb; Hassan, Taimur; Akram, M. Usman; Akhtar, Mahmood; Butt, Wasi Haider

    2017-01-01

    Rapid development in the field of ophthalmology has increased the demand for computer aided diagnosis of various eye diseases. Papilledema is an eye disease in which the optic disc of the eye is swollen due to an increase in intracranial pressure. This increased pressure can cause severe encephalic complications like abscess, tumors, meningitis or encephalitis, which may lead to a patient’s death. Although there have been several papilledema case studies reported from a medical point of view, only a few researchers have presented automated algorithms for this problem. This paper presents a novel computer aided system which aims to automatically detect papilledema from fundus images. Firstly, the fundus images are preprocessed through optic disc detection and vessel segmentation. After preprocessing, a total of 26 different features are extracted to capture possible changes in the optic disc due to papilledema. These features are further divided into four categories based upon their color, textural, vascular and disc margin obscuration properties. The best features are then selected and combined to form a feature matrix that is used to distinguish between normal images and images with papilledema using the supervised support vector machine (SVM) classifier. The proposed method is tested on 160 fundus images obtained from two data sets: the publicly available structured analysis of the retina (STARE) data set, and our local data set acquired from the Armed Forces Institute of Ophthalmology (AFIO). The STARE data set contributed 90 fundus images and the local data set 70. Ground-truth annotations were performed with the help of two ophthalmologists. We report detection accuracies of 95.6% for STARE, 87.4% for the local data set, and 85.9% for the combined STARE and local data sets. The proposed system is fast and robust in detecting papilledema from fundus images with promising results. This will aid
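
    The supervised SVM stage can be sketched with a minimal linear SVM trained by Pegasos-style sub-gradient descent. The toy 2-D vectors stand in for the 26-feature matrix, and every name and parameter here is an illustrative assumption rather than the authors' implementation.

```python
import random

def train_linear_svm(X, y, lam=0.01, epochs=500, seed=0):
    """Pegasos-style SGD for a linear SVM with hinge loss; labels are +/-1."""
    rng = random.Random(seed)
    w = [0.0] * len(X[0])
    b = 0.0
    t = 100  # start the step-size schedule late to avoid huge early steps
    for _ in range(epochs):
        order = list(range(len(X)))
        rng.shuffle(order)
        for i in order:
            t += 1
            eta = 1.0 / (lam * t)
            score = sum(wj * xj for wj, xj in zip(w, X[i])) + b
            w = [(1.0 - eta * lam) * wj for wj in w]  # regularization shrink
            if y[i] * score < 1:  # hinge-loss sub-gradient step
                w = [wj + eta * y[i] * xj for wj, xj in zip(w, X[i])]
                b += eta * y[i]
    return w, b

def predict(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

# Toy stand-in for "normal" (-1) vs "papilledema" (+1) feature vectors
X = [[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]]
y = [-1, -1, 1, 1]
w, b = train_linear_svm(X, y)
```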

  20. Time-resolved characterization of particle associated polycyclic aromatic hydrocarbons using a newly-developed sequential spot sampler with automated extraction and analysis

    NASA Astrophysics Data System (ADS)

    Eiguren-Fernandez, Arantzazu; Lewis, Gregory S.; Spielman, Steven R.; Hering, Susanne V.

    2014-10-01

    A versatile and compact sampling system, the Sequential Spot Sampler (S3) has been developed for pre-concentrated, time-resolved, dry collection of fine and ultrafine particles. Using a temperature-moderated laminar flow water condensation method, ambient particles as small as 6 nm are deposited within a dry, 1-mm diameter spot. Sequential samples are collected on a multiwell plate. Chemical analyses are laboratory-based, but automated. The sample preparation, extraction and chemical analysis steps are all handled through a commercially-available, needle-based autosampler coupled to a liquid chromatography system. This automation is enabled by the small deposition area of the collection. The entire sample is extracted into 50-100 μL volume of solvent, providing quantifiable samples with small collected air volumes. A pair of S3 units was deployed in Stockton (CA) from November 2011 to February 2012. PM2.5 samples were collected every 12 h, and analyzed for polycyclic aromatic hydrocarbons (PAHs). In parallel, conventional filter samples were collected for 48 h and used to assess the new system's performance. An automated sample preparation and extraction was developed for samples collected using the S3. Collocated data from the two sequential spot samplers were highly correlated for all measured compounds, with a regression slope of 1.1 and r2 = 0.9 for all measured concentrations. S3/filter ratios for the mean concentration of each individual PAH vary between 0.82 and 1.33, with the larger variability observed for the semivolatile components. Ratio for total PAH concentrations was 1.08. Total PAH concentrations showed similar temporal trend as ambient PM2.5 concentrations. Source apportionment analysis estimated a significant contribution of biomass burning to ambient PAH concentrations during winter.
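
    The collocated-sampler comparison above boils down to an ordinary least-squares fit of one sampler's concentrations against the other's. A self-contained sketch, with hypothetical paired concentrations rather than the study's data:

```python
def linreg(x, y):
    """Ordinary least-squares slope, intercept and r^2 for paired data."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    # r^2 from residual and total sums of squares
    ss_res = sum((yi - (slope * xi + intercept)) ** 2
                 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return slope, intercept, 1 - ss_res / ss_tot

s1 = [0.5, 1.0, 2.0, 4.0]  # PAH concentrations from sampler A (hypothetical)
s2 = [0.6, 1.1, 2.1, 4.5]  # from the collocated sampler B
slope, intercept, r2 = linreg(s1, s2)
```

    A slope near 1 with high r^2, as reported above, indicates the two collocated units agree across the concentration range.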

  1. Time-resolved Characterization of Particle Associated Polycyclic Aromatic Hydrocarbons using a newly-developed Sequential Spot Sampler with Automated Extraction and Analysis

    PubMed Central

    Lewis, Gregory S.; Spielman, Steven R.; Hering, Susanne V.

    2014-01-01

    A versatile and compact sampling system, the Sequential Spot Sampler (S3) has been developed for pre-concentrated, time-resolved, dry collection of fine and ultrafine particles. Using a temperature-moderated laminar flow water condensation method, ambient particles as small as 6 nm are deposited within a dry, 1-mm diameter spot. Sequential samples are collected on a multiwell plate. Chemical analyses are laboratory-based, but automated. The sample preparation, extraction and chemical analysis steps are all handled through a commercially-available, needle-based autosampler coupled to a liquid chromatography system. This automation is enabled by the small deposition area of the collection. The entire sample is extracted into 50–100μl volume of solvent, providing quantifiable samples with small collected air volumes. A pair of S3 units was deployed in Stockton (CA) from November 2011 to February 2012. PM2.5 samples were collected every 12 hrs, and analyzed for polycyclic aromatic hydrocarbons (PAHs). In parallel, conventional filter samples were collected for 48 hrs and used to assess the new system’s performance. An automated sample preparation and extraction was developed for samples collected using the S3. Collocated data from the two sequential spot samplers were highly correlated for all measured compounds, with a regression slope of 1.1 and r2=0.9 for all measured concentrations. S3/filter ratios for the mean concentration of each individual PAH vary between 0.82 and 1.33, with the larger variability observed for the semivolatile components. Ratio for total PAH concentrations was 1.08. Total PAH concentrations showed similar temporal trend as ambient PM2.5 concentrations. Source apportionment analysis estimated a significant contribution of biomass burning to ambient PAH concentrations during winter. PMID:25574151

  2. Method for extracting copper, silver and related metals

    DOEpatents

    Moyer, Bruce A.; McDowell, W. J.

    1990-01-01

    A process for selectively extracting precious metals such as silver and gold concurrent with copper extraction from aqueous solutions containing the same. The process utilizes tetrathiamacrocycles and high molecular weight organic acids that exhibit a synergistic relationship when complexing with certain metal ions thereby removing them from ore leach solutions.

  3. Method for extracting copper, silver and related metals

    DOEpatents

    Moyer, B.A.; McDowell, W.J.

    1987-10-23

    A process for selectively extracting precious metals such as silver and gold concurrent with copper extraction from aqueous solutions containing the same. The process utilizes tetrathiamacrocycles and high molecular weight organic acids that exhibit a synergistic relationship when complexing with certain metal ions thereby removing them from ore leach solutions.

  4. Toward automated classification of consumers' cancer-related questions with a new taxonomy of expected answer types.

    PubMed

    McRoy, Susan; Jones, Sean; Kurmally, Adam

    2016-09-01

    This article examines methods for automated question classification applied to cancer-related questions that people have asked on the web. This work is part of a broader effort to provide automated question answering for health education. We created a new corpus of consumer-health questions related to cancer and a new taxonomy for those questions. We then compared the effectiveness of different statistical methods for developing classifiers, including weighted classification and resampling. Basic methods for building classifiers were limited by the high variability in the natural distribution of questions, and typical refinement approaches of feature selection and merging categories achieved only small improvements in classifier accuracy. Best performance was achieved using weighted classification and resampling methods, the latter yielding an F1 score of 0.963. Thus, it would appear that statistical classifiers can be trained on natural data, but only if natural distributions of classes are smoothed. Such classifiers would be useful for automated question answering, for enriching web-based content, or for assisting clinical professionals to answer questions.
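
    The resampling idea can be sketched as simple oversampling of minority classes toward a uniform distribution before training any classifier. The class labels below are invented for illustration, not the paper's taxonomy.

```python
import random
from collections import Counter

def oversample_to_uniform(examples, seed=0):
    """examples: list of (text, label) pairs; duplicates minority-class
    items at random until every class matches the majority-class count."""
    rng = random.Random(seed)
    by_label = {}
    for ex in examples:
        by_label.setdefault(ex[1], []).append(ex)
    target = max(len(v) for v in by_label.values())
    balanced = []
    for items in by_label.values():
        balanced.extend(items)
        balanced.extend(rng.choice(items) for _ in range(target - len(items)))
    return balanced

# Hypothetical skewed question corpus: 3 "treatment", 1 "prognosis"
qs = [("q1", "treatment"), ("q2", "treatment"), ("q3", "treatment"),
      ("q4", "prognosis")]
counts = Counter(label for _, label in oversample_to_uniform(qs))
```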

  5. Extraction of a group-pair relation: problem-solving relation from web-board documents.

    PubMed

    Pechsiri, Chaveevan; Piriyakul, Rapepun

    2016-01-01

    This paper aims to extract a group-pair relation as a Problem-Solving relation, for example a DiseaseSymptom-Treatment relation or a CarProblem-Repair relation, between two event-explanation groups: a problem-concept group, such as a symptom-concept or CarProblem-concept group, and a solving-concept group, such as a treatment-concept or repair-concept group, from hospital-web-board and car-repair-guru-web-board documents. The Problem-Solving relation (particularly the Symptom-Treatment relation), including its graphical representation, benefits non-professional persons by supporting basic problem-solving knowledge. The research addresses three problems: how to identify an EDU (an Elementary Discourse Unit, which is a simple sentence) with the event concept of either a problem or a solution; how to determine a problem-concept EDU boundary and a solving-concept EDU boundary as two event-explanation groups; and how to determine the Problem-Solving relation between these two event-explanation groups. Therefore, we apply word co-occurrence to identify a problem-concept EDU and a solving-concept EDU, and machine-learning techniques to determine the problem-concept and solving-concept EDU boundaries. We propose using k-means and Naïve Bayes with clustering features to determine the Problem-Solving relation between the two event-explanation groups. In contrast to previous works, the proposed approach enables group-pair relation extraction with high accuracy.
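
    The clustering step can be illustrated with a minimal 1-D k-means (k = 2). The feature scores below are invented; the real system clusters much richer EDU features.

```python
def kmeans_1d(xs, iters=20):
    """Two-cluster k-means on scalars with deterministic min/max init."""
    c0, c1 = min(xs), max(xs)
    for _ in range(iters):
        g0 = [x for x in xs if abs(x - c0) <= abs(x - c1)]
        g1 = [x for x in xs if abs(x - c0) > abs(x - c1)]
        if g0:
            c0 = sum(g0) / len(g0)  # recompute each cluster mean
        if g1:
            c1 = sum(g1) / len(g1)
    return c0, c1

# Hypothetical 1-D scores separating problem-concept from solving-concept EDUs
centers = kmeans_1d([0.1, 0.2, 0.15, 0.9, 1.0, 0.95])
```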

  6. Automated mini-column solid-phase extraction cleanup for high-throughput analysis of chemical contaminants in foods by low-pressure gas chromatography – tandem mass spectrometry

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This study demonstrated the application of an automated high-throughput mini-cartridge solid-phase extraction (mini-SPE) cleanup for the rapid low-pressure gas chromatography – tandem mass spectrometry (LPGC-MS/MS) analysis of pesticides and environmental contaminants in QuEChERS extracts of foods. ...

  7. A filter paper-based microdevice for low-cost, rapid, and automated DNA extraction and amplification from diverse sample types.

    PubMed

    Gan, Wupeng; Zhuang, Bin; Zhang, Pengfei; Han, Junping; Li, Cai-Xia; Liu, Peng

    2014-10-07

    A plastic microfluidic device that integrates a filter disc as a DNA capture phase was successfully developed for low-cost, rapid and automated DNA extraction and PCR amplification from various raw samples. The microdevice was constructed by sandwiching a piece of Fusion 5 filter, as well as a PDMS (polydimethylsiloxane) membrane, between two PMMA (poly(methyl methacrylate)) layers. An automated DNA extraction from 1 μL of human whole blood can be finished on the chip in 7 minutes by sequentially aspirating NaOH, HCl, and water through the filter. The filter disc containing extracted DNA was then taken out directly for PCR. On-chip DNA purification from 0.25-1 μL of human whole blood yielded 8.1-21.8 ng of DNA, higher than those obtained using QIAamp® DNA Micro kits. To realize DNA extraction from raw samples, an additional sample loading chamber containing a filter net with an 80 μm mesh size was designed in front of the extraction chamber to accommodate sample materials. Real-world samples, including whole blood, dried blood stains on Whatman® 903 paper, dried blood stains on FTA™ cards, buccal swabs, saliva, and cigarette butts, can all be processed in the system in 8 minutes. In addition, multiplex amplification of 15 STR (short tandem repeat) loci and Sanger-based DNA sequencing of the 520 bp GJB2 gene were accomplished from the filters that contained extracted DNA from blood. To further prove the feasibility of integrating this extraction method with downstream analyses, "in situ" PCR amplifications were successfully performed in the DNA extraction chamber following DNA purification from blood and blood stains without DNA elution. Using a modified protocol to bond the PDMS and PMMA, our plastic PDMS devices withstood the PCR process without any leakage. This study represents a significant step towards the practical application of on-chip DNA extraction methods, as well as the development of fully integrated genetic analytical systems.

  8. Automated position control of a surface array relative to a liquid microjunction surface sampler

    DOEpatents

    Van Berkel, Gary J.; Kertesz, Vilmos; Ford, Michael James

    2007-11-13

    A system and method utilizes an image analysis approach for controlling the probe-to-surface distance of a liquid junction-based surface sampling system for use with mass spectrometric detection. Such an approach enables a hands-free formation of the liquid microjunction used to sample solution composition from the surface and for re-optimization, as necessary, of the microjunction thickness during a surface scan to achieve a fully automated surface sampling system.

  9. Office automation.

    PubMed

    Arenson, R L

    1986-03-01

    By now, the term "office automation" should have more meaning for those readers who are not intimately familiar with the subject. Not all of the preceding material pertains to every department or practice, but certainly, word processing and simple telephone management are key items. The size and complexity of the organization will dictate the usefulness of electronic mail and calendar management, and the individual radiologist's personal needs and habits will determine the usefulness of the home computer. Perhaps the most important ingredient for success in the office automation arena relates to the ability to integrate information from various systems in a simple and flexible manner. Unfortunately, this is perhaps the one area that most office automation systems have ignored or handled poorly. In the personal computer world, there has been much emphasis recently on integration of packages such as spreadsheet, database management, word processing, graphics, time management, and communications. This same philosophy of integration has been applied to a few office automation systems, but these are generally vendor-specific and do not allow for a mixture of foreign subsystems. During the next few years, it is likely that a few vendors will emerge as dominant in this integrated office automation field and will stress simplicity and flexibility as major components.

  10. Three Experiments Examining the Use of Electroencephalogram,Event-Related Potentials, and Heart-Rate Variability for Real-Time Human-Centered Adaptive Automation Design

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J., III; Parasuraman, Raja; Freeman, Frederick G.; Scerbo, Mark W.; Mikulka, Peter J.; Pope, Alan T.

    2003-01-01

    Adaptive automation represents an advanced form of human-centered automation design. The approach provides for real-time and model-based assessments of human-automation interaction, determines whether the human has entered into a hazardous state of awareness, and then modulates the task environment to keep the operator in the loop while maintaining an optimal state of task engagement and mental alertness. Because adaptive automation has not matured, numerous challenges remain, including the criteria for determining when adaptive aiding and adaptive function allocation should take place. Human factors experts in the area have suggested a number of measures, including the use of psychophysiology. This NASA Technical Paper reports on three experiments that examined the psychophysiological measures of event-related potentials, electroencephalogram, and heart-rate variability for real-time adaptive automation. The results of the experiments confirm the efficacy of these measures for use in both a developmental and operational role for adaptive automation design. The implications of these results and future directions for psychophysiology and human-centered automation design are discussed.

  11. Automated age-related macular degeneration classification in OCT using unsupervised feature learning

    NASA Astrophysics Data System (ADS)

    Venhuizen, Freerk G.; van Ginneken, Bram; Bloemen, Bart; van Grinsven, Mark J. J. P.; Philipsen, Rick; Hoyng, Carel; Theelen, Thomas; Sánchez, Clara I.

    2015-03-01

    Age-related Macular Degeneration (AMD) is a common eye disorder with high prevalence in elderly people. The disease mainly affects the central part of the retina, and could ultimately lead to permanent vision loss. Optical Coherence Tomography (OCT) is becoming the standard imaging modality in diagnosis of AMD and the assessment of its progression. However, the evaluation of the obtained volumetric scan is time consuming, expensive and the signs of early AMD are easy to miss. In this paper we propose a classification method to automatically distinguish AMD patients from healthy subjects with high accuracy. The method is based on an unsupervised feature learning approach, and processes the complete image without the need for an accurate pre-segmentation of the retina. The method can be divided into two steps: an unsupervised clustering stage that extracts a set of small descriptive image patches from the training data, and a supervised training stage that uses these patches to create a patch occurrence histogram for every image on which a random forest classifier is trained. Experiments using 384 volume scans show that the proposed method is capable of identifying AMD patients with high accuracy, obtaining an area under the receiver operating characteristic (ROC) curve of 0.984. Our method allows for a quick and reliable assessment of the presence of AMD pathology in OCT volume scans without the need for accurate layer segmentation algorithms.
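
    The patch-occurrence-histogram step can be sketched as follows: each patch is assigned to its nearest learned cluster center (a "visual word"), and the image is represented by the normalized histogram of assignments. The values here are toy 1-D stand-ins for real image patches.

```python
def nearest(centers, v):
    """Index of the center closest to value v."""
    return min(range(len(centers)), key=lambda i: abs(centers[i] - v))

def patch_histogram(patches, centers):
    """Normalized occurrence histogram of nearest-center assignments."""
    counts = [0] * len(centers)
    for p in patches:
        counts[nearest(centers, p)] += 1
    total = sum(counts)
    return [c / total for c in counts]

# Five toy patches against three learned "visual words"
hist = patch_histogram([0.1, 0.12, 0.5, 0.9, 0.88], centers=[0.1, 0.5, 0.9])
```

    In the full method, one such histogram per OCT volume becomes the feature vector on which the random forest is trained.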

  12. Automated flow-based anion-exchange method for high-throughput isolation and real-time monitoring of RuBisCO in plant extracts.

    PubMed

    Suárez, Ruth; Miró, Manuel; Cerdà, Víctor; Perdomo, Juan Alejandro; Galmés, Jeroni

    2011-06-15

    In this work, a miniaturized, completely enclosed multisyringe-flow system is proposed for high-throughput purification of RuBisCO from Triticum aestivum extracts. The automated method capitalizes on the uptake of the target protein at 4°C onto Q-Sepharose Fast Flow strong anion-exchanger packed in a cylindrical microcolumn (105 × 4 mm) followed by a stepwise ionic-strength gradient elution (0-0.8 mol/L NaCl) to eliminate concomitant extract components and retrieve highly purified RuBisCO. The manifold is furnished downstream with a flow-through diode-array UV/vis spectrophotometer for real-time monitoring of the column effluent at the protein-specific wavelength of 280 nm to detect the elution of RuBisCO. Quantitation of RuBisCO and total soluble proteins in the eluate fractions were undertaken using polyacrylamide gel electrophoresis (PAGE) and the spectrophotometric Bradford assay, respectively. A comprehensive investigation of the effect of distinct concentration gradients on the isolation of RuBisCO and experimental conditions (namely, type of resin, column dimensions and mobile-phase flow rate) upon column capacity and analyte breakthrough was effected. The assembled set-up was aimed to critically ascertain the efficiency of preliminary batchwise pre-treatments of crude plant extracts (viz., polyethylenglycol (PEG) precipitation, ammonium sulphate precipitation and sucrose gradient centrifugation) in terms of RuBisCO purification and absolute recovery prior to automated anion-exchange column separation. Under the optimum physical and chemical conditions, the flow-through column system is able to admit crude plant extracts and gives rise to RuBisCO purification yields better than 75%, which might be increased up to 96 ± 9% with a prior PEG fractionation followed by sucrose gradient step.

  13. A fully automated method for simultaneous determination of aflatoxins and ochratoxin A in dried fruits by pressurized liquid extraction and online solid-phase extraction cleanup coupled to ultra-high-pressure liquid chromatography-tandem mass spectrometry.

    PubMed

    Campone, Luca; Piccinelli, Anna Lisa; Celano, Rita; Russo, Mariateresa; Valdés, Alberto; Ibáñez, Clara; Rastrelli, Luca

    2015-04-01

    According to current demands and future perspectives in food safety, this study reports a fast and fully automated analytical method for the simultaneous analysis of aflatoxins (AFs) and ochratoxin A (OTA), mycotoxins of high toxicity and widespread occurrence, in dried fruits, a high-risk foodstuff. The method is based on pressurized liquid extraction (PLE), with aqueous methanol (30%) at 110 °C, of the slurried dried fruit and online solid-phase extraction (online SPE) cleanup of the PLE extracts with a C18 cartridge. The purified sample was directly analysed by ultra-high-pressure liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) for sensitive and selective determination of AFs and OTA. The proposed analytical procedure was validated for different dried fruits (vine fruit, fig and apricot), providing method detection and quantification limits much lower than the AFs and OTA maximum levels imposed by EU regulation in dried fruit for direct human consumption. Also, recoveries (83-103%) and repeatability (RSD < 8%, n = 3) meet the performance criteria required by EU regulation for the determination of the levels of mycotoxins in foodstuffs. The main advantage of the proposed method is full automation of the whole analytical procedure, which reduces the time and cost of the analysis, sample manipulation and solvent consumption, enabling high-throughput analysis and highly accurate and precise results.

  14. Demonstration and validation of automated agricultural field extraction from multi-temporal Landsat data for the majority of United States harvested cropland

    NASA Astrophysics Data System (ADS)

    Yan, L.; Roy, D. P.

    2014-12-01

    The spatial distribution of agricultural fields is a fundamental description of rural landscapes and the location and extent of fields is important to establish the area of land utilized for agricultural yield prediction, resource allocation, and for economic planning, and may be indicative of the degree of agricultural capital investment, mechanization, and labor intensity. To date, field objects have not been extracted from satellite data over large areas because of computational constraints, the complexity of the extraction task, and because consistently processed appropriate resolution data have not been available or affordable. A recently published automated methodology to extract agricultural crop fields from weekly 30 m Web Enabled Landsat data (WELD) time series was refined and applied to 14 states that cover 70% of harvested U.S. cropland (USDA 2012 Census). The methodology was applied to 2010 combined weekly Landsat 5 and 7 WELD data. The field extraction and quantitative validation results are presented for the following 14 states: Iowa, North Dakota, Illinois, Kansas, Minnesota, Nebraska, Texas, South Dakota, Missouri, Indiana, Ohio, Wisconsin, Oklahoma and Michigan (sorted by area of harvested cropland). These states include the top 11 U.S states by harvested cropland area. Implications and recommendations for systematic application to global coverage Landsat data are discussed.

  15. A Comprehensive Automated 3D Approach for Building Extraction, Reconstruction, and Regularization from Airborne Laser Scanning Point Clouds

    PubMed Central

    Dorninger, Peter; Pfeifer, Norbert

    2008-01-01

    Three dimensional city models are necessary for supporting numerous management applications. For the determination of city models for visualization purposes, several standardized workflows do exist. They are either based on photogrammetry or on LiDAR or on a combination of both data acquisition techniques. However, the automated determination of reliable and highly accurate city models is still a challenging task, requiring a workflow comprising several processing steps. The most relevant are building detection, building outline generation, building modeling, and finally, building quality analysis. Commercial software tools for building modeling require, generally, a high degree of human interaction and most automated approaches described in literature stress the steps of such a workflow individually. In this article, we propose a comprehensive approach for automated determination of 3D city models from airborne acquired point cloud data. It is based on the assumption that individual buildings can be modeled properly by a composition of a set of planar faces. Hence, it is based on a reliable 3D segmentation algorithm, detecting planar faces in a point cloud. This segmentation is of crucial importance for the outline detection and for the modeling approach. We describe the theoretical background, the segmentation algorithm, the outline detection, and the modeling approach, and we present and discuss several actual projects. PMID:27873931
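
    The planar-face detection that such segmentation relies on can be illustrated with a toy RANSAC plane fit. The authors' segmentation algorithm is more elaborate; this sketch, with invented point data, only shows the underlying idea of finding the plane supported by the most points.

```python
import random

def plane_from_points(p, q, r):
    """Unit normal and offset of the plane through three points, or None
    if the points are collinear."""
    u = [q[i] - p[i] for i in range(3)]
    v = [r[i] - p[i] for i in range(3)]
    n = [u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]
    norm = sum(c * c for c in n) ** 0.5
    if norm == 0:
        return None
    n = [c / norm for c in n]
    return n, -sum(n[i] * p[i] for i in range(3))

def ransac_plane(points, threshold=0.05, iterations=200, seed=1):
    """Return (inliers, (normal, d)) of the best-supported plane found."""
    rng = random.Random(seed)
    best = ([], None)
    for _ in range(iterations):
        model = plane_from_points(*rng.sample(points, 3))
        if model is None:
            continue
        n, d = model
        inliers = [p for p in points
                   if abs(sum(n[i] * p[i] for i in range(3)) + d) < threshold]
        if len(inliers) > len(best[0]):
            best = (inliers, model)
    return best

# Toy cloud: 8 points on the plane z = 0, plus 2 off-plane outliers
pts = [(x, y, 0.0) for x in (0, 1, 2, 3) for y in (0, 1)] \
    + [(0, 0, 2.0), (1, 1, 3.0)]
inliers, model = ransac_plane(pts)
```

    Repeatedly extracting the best plane and removing its inliers yields the set of planar faces from which building models are composed.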

  16. A Comprehensive Automated 3D Approach for Building Extraction, Reconstruction, and Regularization from Airborne Laser Scanning Point Clouds.

    PubMed

    Dorninger, Peter; Pfeifer, Norbert

    2008-11-17

    Three dimensional city models are necessary for supporting numerous management applications. For the determination of city models for visualization purposes, several standardized workflows do exist. They are either based on photogrammetry or on LiDAR or on a combination of both data acquisition techniques. However, the automated determination of reliable and highly accurate city models is still a challenging task, requiring a workflow comprising several processing steps. The most relevant are building detection, building outline generation, building modeling, and finally, building quality analysis. Commercial software tools for building modeling require, generally, a high degree of human interaction and most automated approaches described in literature stress the steps of such a workflow individually. In this article, we propose a comprehensive approach for automated determination of 3D city models from airborne acquired point cloud data. It is based on the assumption that individual buildings can be modeled properly by a composition of a set of planar faces. Hence, it is based on a reliable 3D segmentation algorithm, detecting planar faces in a point cloud. This segmentation is of crucial importance for the outline detection and for the modeling approach. We describe the theoretical background, the segmentation algorithm, the outline detection, and the modeling approach, and we present and discuss several actual projects.

  17. Simultaneous analysis of organochlorinated pesticides (OCPs) and polychlorinated biphenyls (PCBs) from marine samples using automated pressurized liquid extraction (PLE) and Power Prep™ clean-up.

    PubMed

    Helaleh, Murad I H; Al-Rashdan, Amal; Ibtisam, A

    2012-05-30

    An automated pressurized liquid extraction (PLE) method followed by Power Prep™ clean-up was developed for organochlorinated pesticide (OCP) and polychlorinated biphenyl (PCB) analysis in environmental marine samples of fish, squid, bivalves, shells, octopus and shrimp. OCPs and PCBs were simultaneously determined in a single chromatographic run using gas chromatography-mass spectrometry-negative chemical ionization (GC-MS-NCI). About 5 g of each biological marine sample was mixed with anhydrous sodium sulphate and placed in the extraction cell of the PLE system. PLE is controlled by means of a PC using DMS 6000 software. Purification of the extract was accomplished using automated Power Prep™ clean-up with a pre-packed disposable silica column (6 g) supplied by Fluid Management Systems (FMS). All OCPs and PCBs were eluted from the silica column using two types of solvent: 80 mL of hexane and a 50 mL mixture of hexane and dichloromethane (1:1). A wide variety of fish and shellfish were collected from the fish market and analyzed using this method. The total PCB concentrations were 2.53, 0.25, 0.24, 0.24, 0.17 and 1.38 ng g(-1) (w/w) for fish, squid, bivalves, shells, octopus and shrimp, respectively, and the corresponding total OCP concentrations were 30.47, 2.86, 0.92, 10.72, 5.13 and 18.39 ng g(-1) (w/w). Lipids were removed using an SX-3 Bio-Beads gel permeation chromatography (GPC) column. Analytical criteria such as recovery, reproducibility and repeatability were evaluated through a range of biological matrices.

  18. Relation Extraction with Weak Supervision and Distributional Semantics

    DTIC Science & Technology

    2013-05-01

    [Only OCR fragments of this thesis are available: example entity pairs for relation types (<car, sport>: <Chrysler, NASCAR>, <Porsche, Grand Prix racing>; "be a unit of" <company, company>: <ABC, the Walt Disney Co.>, <American Airlines, ...>); acknowledgements thanking the author's advisor, Prof. Ralph Grishman, for the introduction to the field of Information Extraction; and opening remarks noting that building systems that can understand human language has been a goal since the early days of modern computing, pursued within Natural Language Processing (NLP).]

  19. Picogram per liter level determination of estrogens in natural waters and waterworks by a fully automated on-line solid-phase extraction-liquid chromatography-electrospray tandem mass spectrometry method.

    PubMed

    Rodriguez-Mozaz, Sara; Lopez de Alda, Maria J; Barceló, Damià

    2004-12-01

    The present work describes a novel, fully automated method, based on on-line solid-phase extraction-liquid chromatography-electrospray tandem mass spectrometry (SPE-LC-ESI-MS-MS), which allows the unequivocal identification and quantification of the most environmentally relevant estrogens (estradiol, estrone, estriol, estradiol-17-glucuronide, estradiol-17-acetate, estrone-3-sulfate, ethynyl estradiol, diethylstilbestrol) in natural and treated waters at levels well below those of concern (limits of quantification between 0.02 and 1.02 ng/L). The method is highly precise, with relative standard deviations varying between 1.43 and 3.89%, and accurate (recovery percentages >74 %). This method was used to track the presence and fate of the target compounds in a waterworks and to evaluate the removal efficiency of the treatment processes applied. Only estrone and estrone-3-sulfate were detected in the river water used as source (at 0.68 and 0.33 ng/L, respectively). After progressive removal through the various treatment steps, none of them were detected in the finished drinking water. In addition to selectivity, sensitivity, repeatability, and automation (up to 15 samples plus 6 calibration solutions and 1 blank can be analyzed unattended), this technique offers fairly high throughput (analysis time per sample is 60 min), low time and solvent consumption, and ease of use.

  20. Automated suppression of sample-related artifacts in Fluorescence Correlation Spectroscopy.

    PubMed

    Ries, Jonas; Bayer, Mathias; Csúcs, Gábor; Dirkx, Ronald; Solimena, Michele; Ewers, Helge; Schwille, Petra

    2010-05-24

    Fluorescence Correlation Spectroscopy (FCS) in cells often suffers from artifacts caused by bright aggregates or vesicles, depletion of fluorophores or bleaching of a fluorescent background. The common practice of manually discarding distorted curves is time consuming and subjective. Here we demonstrate the feasibility of automated FCS data analysis with efficient rejection of corrupted parts of the signal. As test systems we use a solution of fluorescent molecules, contaminated with bright fluorescent beads, as well as cells expressing a fluorescent protein (ICA512-EGFP), which partitions into bright secretory granules. This approach improves the accuracy of FCS measurements in biological samples, extends its applicability to especially challenging systems and greatly simplifies and accelerates the data analysis.

  1. Automated tissue m-FISH analysis workstation for identification of clonally related cells

    NASA Astrophysics Data System (ADS)

    Dubrowski, Piotr; Lam, Wan; Ling, Victor; Lam, Stephen; MacAulay, Calum

    2008-02-01

    We have developed an automated, high-throughput, multi-colour Fluorescence in-situ Hybridization (FISH) scanning system for examining 5-10 μm thick Non-Small Cell Lung Cancer (NSCLC) tissue specimens, analyzing their FISH spot signals first at the individual cell level and then as clonal populations using cell-cell architecture (spatial distributions). Using FISH probes targeting genomic areas deemed significant to chemotherapy resistance, we aim to identify clonal subpopulations of cells in tissue samples likely to be resistant to cis-platinum/vinorelbine chemotherapy. The scanning system consists of automatic image acquisition, cell nuclei segmentation, spot counting, and measurement of the spatial distribution and connectivity of cells with specific genetic profiles across the entire section, using architectural tools to provide the scoring system.

  2. Automation and robotics and related technology issues for Space Station customer servicing

    NASA Technical Reports Server (NTRS)

    Cline, Helmut P.

    1987-01-01

    Several flight servicing support elements are discussed within the context of the Space Station. Particular attention is given to the servicing facility, the mobile servicing center, and the flight telerobotic servicer (FTS). The role that automation and robotics can play in the design and operation of each of these elements is discussed. It is noted that the FTS, which is currently being developed by NASA, will evolve to increasing levels of autonomy to allow for the virtual elimination of routine EVA. Some of the features of the FTS will probably be: dual manipulator arms having reach and dexterity roughly equivalent to that of an EVA-suited astronaut, force reflection capability allowing efficient teleoperation, and capability of operating from a variety of support systems.

  3. Differential genetic regulation of motor activity and anxiety-related behaviors in mice using an automated home cage task.

    PubMed

    Kas, Martien J H; de Mooij-van Malsen, Annetrude J G; Olivier, Berend; Spruijt, Berry M; van Ree, Jan M

    2008-08-01

    Traditional behavioral tests, such as the open field test, measure an animal's responsiveness to a novel environment. However, it is generally difficult to assess whether the behavioral response obtained from these tests relates to the expression level of motor activity and/or to avoidance of anxiogenic areas. Here, an automated home cage environment for mice was designed to obtain independent measures of motor activity levels and of sheltered feeding preference during three consecutive days. Chronic treatment with the anxiolytic drug chlordiazepoxide (5 and 10 mg/kg/day) in C57BL/6J mice reduced sheltered feeding preference without altering motor activity levels. Furthermore, two distinct chromosome substitution strains, derived from C57BL/6J (host strain) and A/J (donor strain) inbred strains, expressed either increased sheltering preference in females (chromosome 15) or reduced motor activity levels in females and males (chromosome 1) when compared to C57BL/6J. Longitudinal behavioral monitoring revealed that these phenotypic differences maintained after adaptation to the home cage. Thus, by using new automated behavioral phenotyping approaches, behavior can be dissociated into distinct behavioral domains (e.g., anxiety-related and motor activity domains) with different underlying genetic origin and pharmacological responsiveness.

  4. An Automated Approach to Agricultural Tile Drain Detection and Extraction Utilizing High Resolution Aerial Imagery and Object-Based Image Analysis

    NASA Astrophysics Data System (ADS)

    Johansen, Richard A.

    Subsurface drainage from agricultural fields in the Maumee River watershed is suspected to adversely impact the water quality and contribute to the formation of harmful algal blooms (HABs) in Lake Erie. In early August of 2014, a HAB developed in the western Lake Erie Basin that resulted in over 400,000 people being unable to drink their tap water due to the presence of a toxin from the bloom. HAB development in Lake Erie is aided by excess nutrients from agricultural fields, which are transported through subsurface tile and enter the watershed. Compounding the issue within the Maumee watershed, the trend within the watershed has been to increase the installation of tile drains in both total extent and density. Due to the immense area of drained fields, there is a need to establish an accurate and effective technique to monitor subsurface farmland tile installations and their associated impacts. This thesis aimed at developing an automated method in order to identify subsurface tile locations from high resolution aerial imagery by applying an object-based image analysis (OBIA) approach utilizing eCognition. This process was accomplished through a set of algorithms and image filters, which segment and classify image objects by their spectral and geometric characteristics. The algorithms utilized were based on the relative location of image objects and pixels, in order to maximize the robustness and transferability of the final rule-set. These algorithms were coupled with convolution and histogram image filters to generate results for a 10km2 study area located within Clay Township in Ottawa County, Ohio. The eCognition results were compared to previously collected tile locations from an associated project that applied heads-up digitizing of aerial photography to map field tile. The heads-up digitized locations were used as a baseline for the accuracy assessment. The accuracy assessment generated a range of agreement values from 67.20% - 71.20%, and an average

  5. Shoe-String Automation

    SciTech Connect

    Duncan, M.L.

    2001-07-30

    Faced with a downsizing organization, serious budget reductions and retirement of key metrology personnel, maintaining capabilities to provide necessary services to our customers was becoming increasingly difficult. It appeared that the only solution was to automate some of our more personnel-intensive processes; however, it was crucial that the most personnel-intensive candidate process be automated, at the lowest price possible and with the lowest risk of failure. This discussion relates factors in the selection of the Standard Leak Calibration System for automation, the methods of automation used to provide the lowest-cost solution and the benefits realized as a result of the automation.

  6. Simultaneous analysis of thebaine, 6-MAM and six abused opiates in postmortem fluids and tissues using Zymark automated solid-phase extraction and gas chromatography-mass spectrometry.

    PubMed

    Lewis, R J; Johnson, R D; Hattrup, R A

    2005-08-05

    Opiates are some of the most widely prescribed drugs in America and are often abused. Demonstrating the presence or absence of opiate compounds in postmortem fluids and/or tissues derived from fatal civil aviation accidents can have serious legal consequences and may help determine the cause of impairment and/or death. However, the consumption of poppy seed products can result in a positive opiate drug test. We have developed a simple method for the simultaneous determination of eight opiate compounds from one extraction. These compounds are hydrocodone, dihydrocodeine, codeine, oxycodone, hydromorphone, 6-monoacetylmorphine, morphine, and thebaine. The inclusion of thebaine is notable as it is an indicator of poppy seed consumption and may help explain morphine/codeine positives in cases where no opiate use was indicated. This method incorporates a Zymark RapidTrace™ automated solid-phase extraction system, gas chromatography/mass spectrometry, and trimethylsilyl (TMS) and oxime-TMS derivatives. The limits of detection ranged from 0.78 to 12.5 ng/mL. The linear dynamic range for most analytes was 6.25-1600 ng/mL. The extraction efficiencies ranged from 70 to 103%. We applied this method to eight separate aviation fatalities where opiate compounds had previously been detected.

  7. Therapeutic drug monitoring of haloperidol, perphenazine, and zuclopenthixol in serum by a fully automated sequential solid phase extraction followed by high-performance liquid chromatography.

    PubMed

    Angelo, H R; Petersen, A

    2001-04-01

    In Denmark, haloperidol, perphenazine, and zuclopenthixol are among the most frequently requested antipsychotics for therapeutic drug monitoring. With the number of requests made at the authors' laboratory, the only rational analysis is one that can measure all three drugs simultaneously. The authors therefore decided to develop an automated high-performance liquid chromatography (HPLC) method. Two milliliters serum, 2.0 mL 10 mmol/L sodium phosphate buffer (pH 5.5), and 150 microL internal standard (trifluoperazine) solution were pipetted into HPLC vials and extracted on an ASPEC XL equipped with 1 mL (50 mg) Isolute C2 (EC) extraction columns and acetonitrile-methanol-ammonium acetate buffer (60:34:6) as extracting solution. Three hundred fifty microliters was analyzed by HPLC; a 150 x 4.6-mm S5CN Spherisorb column with a mobile phase of 10 mmol/L ammonium acetate buffer-methanol (1:9), a flow rate of 0.6-1.7 mL/min, and ultraviolet detection at 256 and 245 nm were used. Reproducibility was 5-12% and the lower limit of quantitation was 10, 1, and 5 nmol/L (4, 0.4, and 2 ng/mL) for haloperidol, perphenazine, and zuclopenthixol, respectively. The method was found to be sufficiently selective and robust for routine analysis.

  8. Automated detection of feeding strikes by larval fish using continuous high-speed digital video: a novel method to extract quantitative data from fast, sparse kinematic events.

    PubMed

    Shamur, Eyal; Zilka, Miri; Hassner, Tal; China, Victor; Liberzon, Alex; Holzman, Roi

    2016-06-01

    Using videography to extract quantitative data on animal movement and kinematics constitutes a major tool in biomechanics and behavioral ecology. Advanced recording technologies now enable acquisition of long video sequences encompassing sparse and unpredictable events. Although such events may be ecologically important, analysis of sparse data can be extremely time-consuming and potentially biased; data quality is often strongly dependent on the training level of the observer and subject to contamination by observer-dependent biases. These constraints often limit our ability to study animal performance and fitness. Using long videos of foraging fish larvae, we provide a framework for the automated detection of prey acquisition strikes, a behavior that is infrequent yet critical for larval survival. We compared the performance of four video descriptors and their combinations against manually identified feeding events. For our data, the best single descriptor provided a classification accuracy of 77-95% and detection accuracy of 88-98%, depending on fish species and size. Using a combination of descriptors improved the accuracy of classification by ∼2%, but did not improve detection accuracy. Our results indicate that the effort required by an expert to manually label videos can be greatly reduced to examining only the potential feeding detections in order to filter false detections. Thus, using automated descriptors reduces the amount of manual work needed to identify events of interest from weeks to hours, enabling the assembly of an unbiased large dataset of ecologically relevant behaviors.
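    The core of such automated event detection is a threshold-and-duration rule: flag runs of frames where a descriptor score stays high, and discard runs too short to be a real strike. The sketch below is illustrative only; the scores, threshold, and minimum run length are invented and are not the paper's actual descriptors or parameters:

    ```python
    def detect_events(scores, threshold, min_len=3):
        """Flag candidate events as runs of consecutive frames whose
        descriptor score meets a threshold; runs shorter than `min_len`
        frames are discarded as noise. Returns (start, end) frame pairs."""
        events, start = [], None
        for i, s in enumerate(scores):
            if s >= threshold and start is None:
                start = i                      # a candidate run begins
            elif s < threshold and start is not None:
                if i - start >= min_len:       # keep only sufficiently long runs
                    events.append((start, i))
                start = None
        if start is not None and len(scores) - start >= min_len:
            events.append((start, len(scores)))  # run extends to the last frame
        return events

    # Hypothetical per-frame descriptor scores: one 3-frame run passes,
    # while the isolated high frame (0.7) is rejected as noise.
    frame_scores = [0.1, 0.2, 0.9, 0.8, 0.95, 0.1, 0.7, 0.1]
    print(detect_events(frame_scores, 0.6))  # [(2, 5)]
    ```

    An expert then need only review the returned frame ranges rather than the full video, which is where the reported reduction from weeks to hours of manual work comes from.
    
    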

  9. Automated diagnosis of congestive heart failure using dual tree complex wavelet transform and statistical features extracted from 2s of ECG signals.

    PubMed

    Sudarshan, Vidya K; Acharya, U Rajendra; Oh, Shu Lih; Adam, Muhammad; Tan, Jen Hong; Chua, Chua Kuang; Chua, Kok Poo; Tan, Ru San

    2017-04-01

    Identification of alarming features in the electrocardiogram (ECG) signal is extremely significant for the prediction of congestive heart failure (CHF). ECG signal analysis carried out using computer-aided techniques can speed up the diagnosis process and aid in the proper management of CHF patients. Therefore, in this work, a dual tree complex wavelet transform (DTCWT)-based methodology is proposed for an automated identification of ECG signals exhibiting CHF from normal. In the experiment, we have performed a DTCWT on ECG segments of 2s duration up to six levels to obtain the coefficients. From these DTCWT coefficients, statistical features are extracted and ranked using Bhattacharyya, entropy, minimum redundancy maximum relevance (mRMR), receiver-operating characteristics (ROC), Wilcoxon, t-test and reliefF methods. Ranked features are subjected to k-nearest neighbor (KNN) and decision tree (DT) classifiers for automated differentiation of CHF and normal ECG signals. We have achieved 99.86% accuracy, 99.78% sensitivity and 99.94% specificity in the identification of CHF affected ECG signals using 45 features. The proposed method is able to detect CHF patients accurately using only 2s of ECG signal length, hence providing sufficient time for the clinicians to further investigate the severity of CHF and treatments.
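    The final classification stage above (KNN voting over ranked statistical features) can be illustrated with a minimal, library-free sketch. The 2-D toy features and cluster locations below are invented stand-ins for the real DTCWT statistics, not the paper's data:

    ```python
    import math
    from collections import Counter

    def knn_predict(train, query, k=3):
        """Classify a feature vector by majority vote among its k nearest
        training examples (Euclidean distance). `train` is a list of
        (feature_vector, label) pairs."""
        dists = sorted((math.dist(x, query), label) for x, label in train)
        votes = Counter(label for _, label in dists[:k])
        return votes.most_common(1)[0][0]

    # Hypothetical 2-D features: "CHF" segments cluster near (1, 1),
    # "normal" segments near (0, 0).
    train = [([0.9, 1.1], "CHF"), ([1.0, 0.9], "CHF"), ([1.2, 1.0], "CHF"),
             ([0.1, 0.0], "normal"), ([0.0, 0.2], "normal"), ([-0.1, 0.1], "normal")]
    print(knn_predict(train, [0.95, 1.0]))  # CHF
    ```

    In the paper the same vote is taken over 45 ranked features per 2 s ECG segment rather than two toy coordinates; the decision rule is unchanged.
    
    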

  10. Development of an Automated Column Solid-Phase Extraction Cleanup of QuEChERS Extracts, Using a Zirconia-Based Sorbent, for Pesticide Residue Analyses by LC-MS/MS.

    PubMed

    Morris, Bruce D; Schriner, Richard B

    2015-06-03

    A new, automated, high-throughput, mini-column solid-phase extraction (c-SPE) cleanup method for QuEChERS extracts was developed, using a robotic X-Y-Z instrument autosampler, for analysis of pesticide residues in fruits and vegetables by LC-MS/MS. Removal of avocado matrix and recoveries of 263 pesticides and metabolites were studied, using various stationary phase mixtures, including zirconia-based sorbents, and elution with acetonitrile. These experiments allowed selection of a sorbent mixture consisting of zirconia, C18, and carbon-coated silica, that effectively retained avocado matrix but also retained 53 pesticides with <70% recoveries. Addition of MeOH to the elution solvent improved pesticide recoveries from zirconia, as did citrate ions in CEN QuEChERS extracts. Finally, formate buffer in acetonitrile/MeOH (1:1) was required to give >70% recoveries of all 263 pesticides. Analysis of avocado extracts by LC-Q-Orbitrap-MS showed that the method developed was removing >90% of di- and triacylglycerols. The method was validated for 269 pesticides (including homologues and metabolites) in avocado and citrus. Spike recoveries were within 70-120% and 20% RSD for 243 of these analytes in avocado and 254 in citrus, when calibrated against solvent-only standards, indicating effective matrix removal and minimal electrospray ionization suppression.

  11. A simple micro-extraction plate assay for automated LC-MS/MS analysis of human serum 25-hydroxyvitamin D levels.

    PubMed

    Geib, Timon; Meier, Florian; Schorr, Pascal; Lammert, Frank; Stokes, Caroline S; Volmer, Dietrich A

    2015-01-01

    This short application note describes a simple and automated assay for determination of 25-hydroxyvitamin D (25(OH)D) levels in very small volumes of human serum. It utilizes commercial 96-well micro-extraction plates with commercial 25(OH)D isotope calibration and quality control kits. Separation was achieved using a pentafluorophenyl liquid chromatography column followed by multiple reaction monitoring-based quantification on an electrospray triple quadrupole mass spectrometer. Emphasis was placed on providing a simple assay that can be rapidly established in non-specialized laboratories within days, without the need for laborious and time consuming sample preparation steps, advanced calibration or data acquisition routines. The analytical figures of merit obtained from this assay compared well to established assays. To demonstrate the applicability, the assay was applied to analysis of serum samples from patients with chronic liver diseases and compared to results from a routine clinical immunoassay.

  12. Fully automated analysis of four tobacco-specific N-nitrosamines in mainstream cigarette smoke using two-dimensional online solid phase extraction combined with liquid chromatography-tandem mass spectrometry.

    PubMed

    Zhang, Jie; Bai, Ruoshi; Yi, Xiaoli; Yang, Zhendong; Liu, Xingyu; Zhou, Jun; Liang, Wei

    2016-01-01

    A fully automated method for the detection of four tobacco-specific nitrosamines (TSNAs) in mainstream cigarette smoke (MSS) has been developed. The new method is based on two-dimensional online solid-phase extraction-liquid chromatography-tandem mass spectrometry (SPE/LC-MS/MS). The two-dimensional SPE utilizes two cartridges with different extraction mechanisms to remove interferences of different polarity and minimize sample matrix effects on each analyte. Chromatographic separation was achieved using a UPLC C18 reversed phase analytical column. Under the optimum online SPE/LC-MS/MS conditions, N'-nitrosonornicotine (NNN), N'-nitrosoanatabine (NAT), N'-nitrosoanabasine (NAB), and 4-(methylnitrosamino)-1-(3-pyridyl)-1-butanone (NNK) were baseline separated with good peak shapes. This method appears to be the most sensitive method yet reported for determination of TSNAs in mainstream cigarette smoke. The limits of quantification for NNN, NNK, NAT and NAB reached the levels of 6.0, 1.0, 3.0 and 0.6 pg/cig, respectively, which were well below the lowest levels of TSNAs in MSS of current commercial cigarettes. The accuracy of the measurement of four TSNAs was from 92.8 to 107.3%. The relative standard deviations of intra- and inter-day analysis were less than 5.4% and 7.5%, respectively. The main advantages of the method developed are fairly high sensitivity, selectivity and accuracy of results, minimum sample pre-treatment, full automation, and high throughput. As a part of the validation procedure, the developed method was applied to evaluate TSNAs yields for 27 top-selling commercial cigarettes in China.

  13. A Customized Attention-Based Long Short-Term Memory Network for Distant Supervised Relation Extraction.

    PubMed

    He, Dengchao; Zhang, Hongjun; Hao, Wenning; Zhang, Rui; Cheng, Kai

    2017-04-14

    Distant supervision, a widely applied approach in the field of relation extraction, can automatically generate large amounts of labeled training corpus with minimal manual effort. However, the labeled training corpus may contain many false-positive examples, which hurt the performance of relation extraction. Moreover, traditional feature-based distant supervised approaches rely on manually designed features derived from natural language processing, which can also lead to poor performance. To address these two shortcomings, we propose a customized attention-based long short-term memory network. Our approach adopts word-level attention to achieve better data representation for relation extraction without manually designed features, performing distant supervision instead of fully supervised relation extraction, and it utilizes instance-level attention to tackle the problem of false-positive data. Experimental results demonstrate that our proposed approach is effective and achieves better performance than traditional methods.
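    At its core, word-level attention softmax-normalizes a relevance score per word and uses the resulting weights to form a weighted sum of word representations. A minimal sketch, assuming toy 2-D word vectors and externally supplied scores (in the paper the scores come from the learned LSTM, which is not reproduced here):

    ```python
    import math

    def attention_weights(scores):
        """Softmax-normalize raw relevance scores into attention weights
        that are positive and sum to 1 (max-subtracted for stability)."""
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        total = sum(exps)
        return [e / total for e in exps]

    def attend(vectors, scores):
        """Attention-weighted sum of word vectors: the sentence
        representation is dominated by the highest-scoring words."""
        weights = attention_weights(scores)
        dim = len(vectors[0])
        return [sum(w * v[i] for w, v in zip(weights, vectors)) for i in range(dim)]

    # Toy example: three word vectors; the second word scores highest,
    # so it dominates the aggregated representation.
    vecs = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
    raw_scores = [0.1, 2.0, 0.5]
    rep = attend(vecs, raw_scores)
    ```

    Instance-level attention applies the same weighting one level up: each sentence in a distantly labeled bag gets a weight, so suspected false-positive sentences contribute less to the bag representation.
    
    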

  14. Comparison of Boiling and Robotics Automation Method in DNA Extraction for Metagenomic Sequencing of Human Oral Microbes.

    PubMed

    Yamagishi, Junya; Sato, Yukuto; Shinozaki, Natsuko; Ye, Bin; Tsuboi, Akito; Nagasaki, Masao; Yamashita, Riu

    2016-01-01

    The rapid improvement of next-generation sequencing performance now enables us to analyze huge sample sets with more than ten thousand specimens. However, DNA extraction can still be a limiting step in such metagenomic approaches. In this study, we analyzed human oral microbes to compare the performance of three DNA extraction methods: PowerSoil (a method widely used in this field), QIAsymphony (a robotics method), and a simple boiling method. Dental plaque was initially collected from three volunteers in the pilot study and then expanded to 12 volunteers in the follow-up study. Bacterial flora was estimated by sequencing the V4 region of 16S rRNA following species-level profiling. Our results indicate that the efficiency of PowerSoil and QIAsymphony was comparable to the boiling method. Therefore, the boiling method may be a promising alternative because of its simplicity, cost effectiveness, and short handling time. Moreover, this method was reliable for estimating bacterial species and could be used in the future to examine the correlation between oral flora and health status. Despite this, differences in the efficiency of DNA extraction for various bacterial species were observed among the three methods. Based on these findings, there is no "gold standard" for DNA extraction. In future, we suggest that the DNA extraction method should be selected on a case-by-case basis considering the aims and specimens of the study.

  15. Comparison of Boiling and Robotics Automation Method in DNA Extraction for Metagenomic Sequencing of Human Oral Microbes

    PubMed Central

    Shinozaki, Natsuko; Ye, Bin; Tsuboi, Akito; Nagasaki, Masao; Yamashita, Riu

    2016-01-01

    The rapid improvement of next-generation sequencing performance now enables us to analyze huge sample sets with more than ten thousand specimens. However, DNA extraction can still be a limiting step in such metagenomic approaches. In this study, we analyzed human oral microbes to compare the performance of three DNA extraction methods: PowerSoil (a method widely used in this field), QIAsymphony (a robotics method), and a simple boiling method. Dental plaque was initially collected from three volunteers in the pilot study and then expanded to 12 volunteers in the follow-up study. Bacterial flora was estimated by sequencing the V4 region of 16S rRNA following species-level profiling. Our results indicate that the efficiency of PowerSoil and QIAsymphony was comparable to the boiling method. Therefore, the boiling method may be a promising alternative because of its simplicity, cost effectiveness, and short handling time. Moreover, this method was reliable for estimating bacterial species and could be used in the future to examine the correlation between oral flora and health status. Despite this, differences in the efficiency of DNA extraction for various bacterial species were observed among the three methods. Based on these findings, there is no “gold standard” for DNA extraction. In future, we suggest that the DNA extraction method should be selected on a case-by-case basis considering the aims and specimens of the study. PMID:27104353

  16. A METHOD FOR AUTOMATED ANALYSIS OF 10 ML WATER SAMPLES CONTAINING ACIDIC, BASIC, AND NEUTRAL SEMIVOLATILE COMPOUNDS LISTED IN USEPA METHOD 8270 BY SOLID PHASE EXTRACTION COUPLED IN-LINE TO LARGE VOLUME INJECTION GAS CHROMATOGRAPHY/MASS SPECTROMETRY

    EPA Science Inventory

    Data is presented showing the progress made towards the development of a new automated system combining solid phase extraction (SPE) with gas chromatography/mass spectrometry for the single run analysis of water samples containing a broad range of acid, base and neutral compounds...

  17. Screening of drugs in equine plasma using automated on-line solid-phase extraction coupled with liquid chromatography-tandem mass spectrometry.

    PubMed

    Kwok, W H; Leung, David K K; Leung, Gary N W; Wan, Terence S M; Wong, Colton H F; Wong, Jenny K Y

    2010-05-07

    A rapid liquid chromatography-tandem mass spectrometry (LC-MS-MS) method was developed for the simultaneous screening of 19 drugs of different classes in equine plasma using automated on-line solid-phase extraction (SPE) coupled with a triple quadrupole mass spectrometer. Plasma samples were first protein precipitated using acetonitrile. After centrifugation, the supernatant was directly injected into the on-line SPE system and analysed by a triple quadrupole LC-MS-MS in positive electrospray ionisation (+ESI) mode with selected reaction monitoring (SRM) scan function. On-line extraction and chromatographic separation of the targeted drugs were performed using, respectively, a polymeric extraction column (2 cm L × 2.1 mm ID, 25 µm particle size) and a reversed-phase C18 LC column (3 cm L × 2.1 mm ID, 3 µm particle size) with gradient elution to provide fast analysis time. The overall instrument turnaround time was 9.5 min, inclusive of post-run and equilibration time. Plasma samples fortified with 19 targeted drugs including narcotic analgesics, local anaesthetics, antipsychotics, bronchodilators, mucolytics, corticosteroids, sedatives and tranquillisers at sub-parts per billion (ppb) to low parts per trillion (ppt) levels could be consistently detected. No significant matrix interference was observed at the expected retention times of the targeted ion transitions. Over 70% of the drugs studied gave detection limits at or below 100 pg/mL, with some detection limits reaching down to 19 pg/mL. The method had been validated for extraction recovery, precision and sensitivity, and a blockage study had also been carried out. This method is used regularly in the authors' laboratory to screen for the presence of targeted drugs in pre-race plasma samples from racehorses.

  18. MICAS: a fully automated web server for microsatellite extraction and analysis from prokaryote and viral genomic sequences.

    PubMed

    Sreenu, Vattipally B; Ranjitkumar, Gundu; Swaminathan, Sugavanam; Priya, Sasidharan; Bose, Buddhaditta; Pavan, Mogili N; Thanu, Geeta; Nagaraju, Javaregowda; Nagarajaram, Hampapathalu A

    2003-01-01

    MICAS is a web server for extracting microsatellite information from completely sequenced prokaryote and viral genomes, or user-submitted sequences. This server provides an integrated platform for MICdb (database of prokaryote and viral microsatellites), W-SSRF (simple sequence repeat finding program) and Autoprimer (primer design software). MICAS, through dynamic HTML page generation, helps in the systematic extraction of microsatellite information from selected genomes hosted on MICdb or from user-submitted sequences. Further, it assists in the design of primers with the help of Autoprimer, for sequences containing selected microsatellite tracts.

  19. Bacterial flora in relation to cataract extraction. II. Peroperative flora.

    PubMed

    Fahmy, J A; Moller, S; Bentzon, M W

    1975-06-01

    The peroperative flora of 499 patients undergoing cataract extraction was studied with local bacterial cultures taken at the beginning and end of surgery and compared with the preoperative flora examined previously (Fahmy et al. 1975 b) on admission the day prior to surgery. The local application of a single dose of oxytetracycline - polymyxin B, approximately 18 hours before surgery, significantly reduced the incidence of bacteria at the time of surgery. However, 92% of the conjunctivas examined immediately before operation proved to harbour one or more kinds of microorganisms. Furthermore, 61% of the wound sites were found to be contaminated with bacteria at the conclusion of surgery. The reasons are discussed. The origin of Staphylococcus aureus isolated peroperatively from the conjunctiva and wound site was studied. The great majority of strains could be traced to the patient's own conjunctiva preoperatively. In a few cases S. aureus was traced to the patient's own nose, skin of face or to the surgeon's nose. The air of the wards and operating theatre as well as the hands and gloves of surgeons and assistant nurses apparently did not play any role as a source of S. aureus infection.

  20. Discovery of Predicate-Oriented Relations among Named Entities Extracted from Thai Texts

    NASA Astrophysics Data System (ADS)

    Tongtep, Nattapong; Theeramunkong, Thanaruk

    Extracting named entities (NEs) and their relations is more difficult in Thai than in other languages due to several Thai-specific characteristics, including no explicit boundaries for words, phrases and sentences; few case markers and modifier clues; high ambiguity in compound words and serial verbs; and flexible word order. Unlike most previous works, which focused on NE relations of specific actions, such as work_for, live_in, located_in, and kill, this paper proposes more general types of NE relations, called predicate-oriented relations (PoR), where an extracted action part (verb) is used as a core component to associate related named entities extracted from Thai texts. Lacking a practical parser for the Thai language, we present three types of surface features, i.e. punctuation marks (such as token spaces), entity types and the number of entities, and then apply five alternative commonly used learning schemes to investigate their performance on predicate-oriented relation extraction. The experimental results show that our approach achieves F-measures of 97.76%, 99.19%, 95.00% and 93.50% on four different types of predicate-oriented relation (action-location, location-action, action-person and person-action) in crime-related news documents using a data set of 1,736 entity pairs. The effects of NE extraction techniques, feature sets and class imbalance on the performance of relation extraction are explored.
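    The surface-order idea above can be sketched with a toy pair extractor: given NE-tagged tokens, each action verb is paired with its nearest neighboring entities, and the relation type is named by surface order. The tags, example tokens, and pairing heuristic below are invented for illustration; they are not the paper's actual algorithm.

```python
from typing import List, Tuple

def extract_por_pairs(tokens: List[Tuple[str, str]]) -> List[Tuple[str, str, str]]:
    """tokens: (word, tag) pairs; tag is 'PER', 'LOC', 'ACT' (action verb), or 'O'.
    Pair each action with the nearest preceding and following entity, and label
    the pair by surface order, e.g. 'person-action' or 'action-location'."""
    names = {"PER": "person", "LOC": "location"}
    pairs = []
    for i, (word, tag) in enumerate(tokens):
        if tag != "ACT":
            continue
        # nearest entity before the action -> "<entity>-action" relation
        for j in range(i - 1, -1, -1):
            if tokens[j][1] in names:
                pairs.append((f"{names[tokens[j][1]]}-action", tokens[j][0], word))
                break
        # nearest entity after the action -> "action-<entity>" relation
        for j in range(i + 1, len(tokens)):
            if tokens[j][1] in names:
                pairs.append((f"action-{names[tokens[j][1]]}", word, tokens[j][0]))
                break
    return pairs

sent = [("Somchai", "PER"), ("robbed", "ACT"), ("a", "O"), ("bank", "LOC")]
print(extract_por_pairs(sent))
```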

  1. Comparison of automated nucleic acid extraction methods for the detection of cytomegalovirus DNA in fluids and tissues

    PubMed Central

    Waggoner, Jesse J.

    2014-01-01

    Testing for cytomegalovirus (CMV) DNA is increasingly being used for specimen types other than plasma or whole blood. However, few studies have investigated the performance of different nucleic acid extraction protocols in such specimens. In this study, CMV extraction using the Cell-free 1000 and Pathogen Complex 400 protocols on the QIAsymphony Sample Processing (SP) system was compared using bronchoalveolar lavage fluid (BAL), tissue samples, and urine. The QIAsymphony Assay Set-up (AS) system was used to assemble reactions using artus CMV PCR reagents, and amplification was carried out on the Rotor-Gene Q. Samples from 93 patients previously tested for CMV DNA and negative samples spiked with CMV AD-169 were used to evaluate assay performance. The Pathogen Complex 400 protocol yielded the following results: BAL, sensitivity 100% (33/33), specificity 87% (20/23); tissue, sensitivity 100% (25/25), specificity 100% (20/20); urine, sensitivity 100% (21/21), specificity 100% (20/20). Cell-free 1000 extraction gave comparable results for BAL and tissue; however, for urine, the sensitivity was 86% (18/21) and specimen quantitation was inaccurate. Comparative studies of different extraction protocols and DNA detection methods in body fluids and tissues are needed, as assays optimized for blood or plasma will not necessarily perform well on other specimen types. PMID:24765569
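    The per-specimen figures above follow directly from the standard definitions of sensitivity and specificity; a minimal sketch reproducing the reported BAL numbers:

```python
def sensitivity(tp: int, fn: int) -> float:
    """Fraction of true positives detected: TP / (TP + FN)."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """Fraction of true negatives correctly negative: TN / (TN + FP)."""
    return tn / (tn + fp)

# BAL figures reported for the Pathogen Complex 400 protocol:
# 33/33 positives detected, 20/23 negatives correctly negative.
print(round(100 * sensitivity(33, 0)))   # 100
print(round(100 * specificity(20, 3)))   # 87
```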

  2. Pathology report data extraction from relational database using R, with extraction from reports on melanoma of skin as an example

    PubMed Central

    Ye, Jay J.

    2016-01-01

    Background: Different methods have been described for data extraction from pathology reports, with varying degrees of success. Here a technique for extracting data directly from a relational database is described. Methods: Our department uses synoptic reports modified from College of American Pathologists (CAP) Cancer Protocol Templates to report most of our cancer diagnoses. Choosing the melanoma of skin synoptic report as an example, the R scripting language extended with the RODBC package was used to query the pathology information system database. Reports containing the melanoma of skin synoptic report from the past 4 and a half years were retrieved and individual data elements were extracted. Using the retrieved list of cases, the database was queried a second time to retrieve/extract the lymph node staging information in the subsequent reports from the same patients. Results: 426 synoptic reports corresponding to unique lesions of melanoma of skin were retrieved, and data elements of interest were extracted into an R data frame. The distribution of Breslow depth of melanomas grouped by year is used as an example of intra-report data extraction and analysis. When new pN staging information was present in the subsequent reports, 82% (77/94) was precisely retrieved (pN0, pN1, pN2 and pN3). An additional 15% (14/94) was retrieved with some ambiguity (positive, or knowing there was an update). The specificity was 100% for both. The relationship between Breslow depth and lymph node status was graphed as an example of lesion-specific multi-report data extraction and analysis. Conclusions: R extended with the RODBC package is a simple and versatile approach well-suited for the above tasks. The success or failure of the retrieval and extraction depended largely on whether the reports were consistently formatted and whether the contents of the elements were consistently phrased. This approach can be easily modified and adapted for other pathology information systems that use a relational database.
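    A minimal sketch of the same two-step idea (query a relational database for reports containing the synoptic template, then regex-extract a data element), using Python's built-in sqlite3 in place of R/RODBC; the table, column names, and report text below are invented:

```python
import re
import sqlite3

# Stand-in for a pathology information system database (schema invented).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE reports (case_id TEXT, body TEXT)")
con.executemany("INSERT INTO reports VALUES (?, ?)", [
    ("S15-001", "MELANOMA SYNOPTIC REPORT ... Breslow depth: 1.2 mm ..."),
    ("S15-002", "Basal cell carcinoma, no synoptic report."),
    ("S16-003", "MELANOMA SYNOPTIC REPORT ... Breslow depth: 0.4 mm ..."),
])

# Step 1: retrieve reports containing the synoptic template.
rows = con.execute(
    "SELECT case_id, body FROM reports WHERE body LIKE '%MELANOMA SYNOPTIC%'"
).fetchall()

# Step 2: extract a data element; success depends on consistent phrasing.
depth_re = re.compile(r"Breslow depth:\s*([\d.]+)\s*mm")
depths = {cid: float(depth_re.search(body).group(1)) for cid, body in rows}
print(depths)
```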

  3. Temporal Relation Extraction in Outcome Variances of Clinical Pathways.

    PubMed

    Yamashita, Takanori; Wakata, Yoshifumi; Hamai, Satoshi; Nakashima, Yasuharu; Iwamoto, Yukihide; Franagan, Brendan; Nakashima, Naoki; Hirokawa, Sachio

    2015-01-01

    Recently, the clinical pathway has progressed with digitalization and the analysis of activity. There are many previous studies on the clinical pathway, but few feed directly into medical practice. We constructed a mind map system that applies a spanning tree. This system can visualize temporal relations in outcome variances and indicate the outcomes that affect long-term hospitalization.

  4. High-throughput, automated extraction of DNA and RNA from clinical samples using TruTip technology on common liquid handling robots.

    PubMed

    Holmberg, Rebecca C; Gindlesperger, Alissa; Stokes, Tinsley; Brady, Dane; Thakore, Nitu; Belgrader, Philip; Cooney, Christopher G; Chandler, Darrell P

    2013-06-11

    TruTip is a simple nucleic acid extraction technology whereby a porous, monolithic binding matrix is inserted into a pipette tip. The geometry of the monolith can be adapted for specific pipette tips ranging in volume from 1.0 to 5.0 ml. The large porosity of the monolith enables viscous or complex samples to readily pass through it with minimal fluidic backpressure. Bi-directional flow maximizes residence time between the monolith and sample, and enables large sample volumes to be processed within a single TruTip. The fundamental steps, irrespective of sample volume or TruTip geometry, include cell lysis, nucleic acid binding to the inner pores of the TruTip monolith, washing away unbound sample components and lysis buffers, and eluting purified and concentrated nucleic acids into an appropriate buffer. The attributes and adaptability of TruTip are demonstrated in three automated clinical sample processing protocols on Eppendorf epMotion 5070, Hamilton STAR, and Hamilton STARplus liquid handling robots: RNA isolation from nasopharyngeal aspirate, genomic DNA isolation from whole blood, and fetal DNA extraction and enrichment from large volumes of maternal plasma, respectively.

  5. Semi-automated solid phase extraction method for the mass spectrometric quantification of 12 specific metabolites of organophosphorus pesticides, synthetic pyrethroids, and select herbicides in human urine.

    PubMed

    Davis, Mark D; Wade, Erin L; Restrepo, Paula R; Roman-Esteva, William; Bravo, Roberto; Kuklenyik, Peter; Calafat, Antonia M

    2013-06-15

    Organophosphate and pyrethroid insecticides and phenoxyacetic acid herbicides represent important classes of pesticides applied in commercial and residential settings. Interest in assessing the extent of human exposure to these pesticides exists because of their widespread use and their potential adverse health effects. An analytical method for measuring 12 biomarkers of several of these pesticides in urine has been developed. The target analytes were extracted from one milliliter of urine by a semi-automated solid phase extraction technique, separated from each other and from other urinary biomolecules by reversed-phase high performance liquid chromatography, and detected using tandem mass spectrometry with isotope dilution quantitation. This method can measure all the target analytes in one injection, with repeatability and detection limits similar to those of previous methods that required more than one injection. Each step of the procedure was optimized to produce a robust, reproducible, accurate, precise and efficient method. The required selectivity and sensitivity for trace-level analysis (e.g., limits of detection below 0.5 ng/mL) was achieved using a narrow diameter analytical column, higher than unit mass resolution for certain analytes, and stable isotope labeled internal standards. The method was applied to the analysis of 55 samples collected from adult anonymous donors with no known exposure to the target pesticides. This efficient and cost-effective method is adequate to handle the large number of samples required for national biomonitoring surveys.

  6. CD-REST: a system for extracting chemical-induced disease relation in literature.

    PubMed

    Xu, Jun; Wu, Yonghui; Zhang, Yaoyun; Wang, Jingqi; Lee, Hee-Jin; Xu, Hua

    2016-01-01

    Mining chemical-induced disease relations embedded in the vast biomedical literature could facilitate a wide range of computational biomedical applications, such as pharmacovigilance. The BioCreative V organized a Chemical Disease Relation (CDR) Track regarding chemical-induced disease relation extraction from biomedical literature in 2015. We participated in all subtasks of this challenge. In this article, we present our participation system Chemical Disease Relation Extraction SysTem (CD-REST), an end-to-end system for extracting chemical-induced disease relations in biomedical literature. CD-REST consists of two main components: (1) a chemical and disease named entity recognition and normalization module, which employs the Conditional Random Fields algorithm for entity recognition and a Vector Space Model-based approach for normalization; and (2) a relation extraction module that classifies both sentence-level and document-level candidate drug-disease pairs by support vector machines. Our system achieved the best performance on the chemical-induced disease relation extraction subtask in the BioCreative V CDR Track, demonstrating the effectiveness of our proposed machine learning-based approaches for automatic extraction of chemical-induced disease relations in biomedical literature. The CD-REST system provides web services using HTTP POST request. The web services can be accessed from http://clinicalnlptool.com/cdr. The online CD-REST demonstration system is available at http://clinicalnlptool.com/cdr/cdr.html. Database URL: http://clinicalnlptool.com/cdr; http://clinicalnlptool.com/cdr/cdr.html.
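    The sentence-level candidate generation implied above can be sketched in a few lines: every chemical-disease pair co-occurring in a sentence becomes a candidate for classification. The feature extraction and the SVM are omitted, and the entity mentions and identifiers below are illustrative only:

```python
from itertools import product

def candidate_pairs(sentence_entities):
    """sentence_entities: dict with 'chemicals' and 'diseases' lists of
    (mention, normalized_id) tuples. Every chemical-disease pair found in
    the same sentence becomes a candidate for the relation classifier."""
    return [(c, d) for c, d in product(sentence_entities["chemicals"],
                                       sentence_entities["diseases"])]

# Toy sentence with one chemical and two disease mentions (IDs illustrative).
ents = {"chemicals": [("lithium", "D008094")],
        "diseases": [("tremor", "D014202"), ("nausea", "D009325")]}
print(candidate_pairs(ents))
```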

  7. Automated extraction of lysergic acid diethylamide (LSD) and N-demethyl-LSD from blood, serum, plasma, and urine samples using the Zymark RapidTrace with LC/MS/MS confirmation.

    PubMed

    de Kanel, J; Vickery, W E; Waldner, B; Monahan, R M; Diamond, F X

    1998-05-01

    A forensic procedure for the quantitative confirmation of lysergic acid diethylamide (LSD) and the qualitative confirmation of its metabolite, N-demethyl-LSD, in blood, serum, plasma, and urine samples is presented. The Zymark RapidTrace was used to perform fully automated solid-phase extractions of all specimen types. After extract evaporation, confirmations were performed using liquid chromatography (LC) followed by positive electrospray ionization (ESI+) mass spectrometry/mass spectrometry (MS/MS) without derivatization. Quantitation of LSD was accomplished using LSD-d3 as an internal standard. The limit of quantitation (LOQ) for LSD was 0.05 ng/mL. The limit of detection (LOD) for both LSD and N-demethyl-LSD was 0.025 ng/mL. The recovery of LSD was greater than 95% at levels of 0.1 ng/mL and 2.0 ng/mL. For LSD at 1.0 ng/mL, the within-run and between-run (different day) relative standard deviation (RSD) was 2.2% and 4.4%, respectively.

  8. Automated and sensitive determination of four anabolic androgenic steroids in urine by online turbulent flow solid-phase extraction coupled with liquid chromatography-tandem mass spectrometry: a novel approach for clinical monitoring and doping control.

    PubMed

    Guo, Feng; Shao, Jing; Liu, Qian; Shi, Jian-Bo; Jiang, Gui-Bin

    2014-07-01

    A novel method for the automated and sensitive analysis of testosterone, androstenedione, methyltestosterone and methenolone in urine samples by online turbulent flow solid-phase extraction coupled with high performance liquid chromatography-tandem mass spectrometry was developed. The optimization and validation of the method are discussed in detail. The Turboflow C18-P SPE column showed the best extraction efficiency for all the analytes. Nanogram-per-liter (ng/L) levels of anabolic androgenic steroids (AAS) could be determined directly, and the limits of quantification (LOQs) were 0.01 ng/mL, much lower than the commonly monitored concentrations for these typical AAS (0.1 ng/mL). The linearity range was from the LOQ to 100 ng/mL for each compound, with coefficients of determination (r(2)) ranging from 0.9990 to 0.9999. The intraday and interday relative standard deviations (RSDs) ranged from 1.1% to 14.5% (n=5). The proposed method was successfully applied to the analysis of urine samples collected from 24 male athletes and 15 patients with prostate cancer. The proposed method provides a practical alternative for rapidly determining AAS in urine samples, especially for clinical monitoring and doping control.

  9. Maneuver Automation Software

    NASA Technical Reports Server (NTRS)

    Uffelman, Hal; Goodson, Troy; Pellegrin, Michael; Stavert, Lynn; Burk, Thomas; Beach, David; Signorelli, Joel; Jones, Jeremy; Hahn, Yungsun; Attiyah, Ahlam; Illsley, Jeannette

    2009-01-01

    The Maneuver Automation Software (MAS) automates the process of generating commands for maneuvers to keep the spacecraft of the Cassini-Huygens mission on a predetermined prime mission trajectory. Before MAS became available, a team of approximately 10 members had to work about two weeks to design, test, and implement each maneuver in a process that involved running many maneuver-related application programs and then serially handing off data products to other parts of the team. MAS enables a three-member team to design, test, and implement a maneuver in about one-half hour after the Navigation team has processed tracking data. MAS accepts more than 60 parameters and 22 files as input directly from users. MAS consists of Practical Extraction and Reporting Language (PERL) scripts that link, sequence, and execute the maneuver-related application programs: "pushing a single button" on a graphical user interface causes MAS to run navigation programs that design a maneuver; programs that create sequences of commands to execute the maneuver on the spacecraft; and a program that generates predictions about maneuver performance and generates reports and other files that enable users to quickly review and verify the maneuver design. MAS can also generate presentation materials, initiate electronic command request forms, and archive all data products for future reference.
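    The serial hand-off that MAS automates can be sketched abstractly as a pipeline of stages, each consuming the previous stage's data product. The stage names and data fields below are invented for illustration; they are not MAS's actual interfaces:

```python
# Hypothetical sketch of MAS-style sequencing: each stage consumes the
# previous stage's data product and hands it to the next stage in order.
def run_pipeline(stages, data):
    for name, stage in stages:
        data = stage(data)          # serial hand-off of data products
        print(f"{name}: ok")
    return data

stages = [
    ("design_maneuver",    lambda d: {**d, "delta_v": 1.5}),
    ("build_sequence",     lambda d: {**d, "commands": ["BURN_START", "BURN_STOP"]}),
    ("predict_and_report", lambda d: {**d, "report": "ready for review"}),
]
result = run_pipeline(stages, {"tracking": "processed"})
print(result["report"])
```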

  10. Extraction of Children's Friendship Relation from Activity Level

    NASA Astrophysics Data System (ADS)

    Kono, Aki; Shintani, Kimio; Katsuki, Takuya; Kihara, Shin'ya; Ueda, Mari; Kaneda, Shigeo; Haga, Hirohide

    Children learn to fit into society through living in a group, and this is greatly influenced by their friendship relations. Although preschool teachers need to observe children in order to assist the growth of their social skills and support the development of each child's personality, only experienced teachers can watch over children while providing high-quality guidance. To resolve this problem, this paper proposes a mathematical, objective method that assists teachers with observation. It uses numerical activity-level data recorded by pedometers, from which we build a tree diagram, called a dendrogram, based on hierarchical clustering of the recorded activity levels. We also calculate the "breadth" and "depth" of children's friendship relations by using more than one dendrogram. When we recorded children's activity levels in a kindergarten for two months and evaluated the proposed method, the results usually coincided with teachers' remarks about the children.
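    The dendrogram construction can be sketched with a minimal single-linkage agglomerative clustering over per-child activity levels. The children's names and pedometer counts below are invented; the merge history returned here is exactly what a dendrogram visualizes:

```python
# Minimal single-linkage agglomerative clustering over per-child activity
# levels (pedometer step counts). Names and data are invented.
def hierarchical_clusters(points):
    clusters = [[name] for name in points]          # start: one child per cluster
    merges = []
    while len(clusters) > 1:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                # single linkage: distance between the closest members
                d = min(abs(points[a] - points[b])
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        d, i, j = best
        merged = clusters[i] + clusters[j]
        merges.append((sorted(merged), d))
        clusters = [c for k, c in enumerate(clusters) if k not in (i, j)] + [merged]
    return merges

activity = {"Aki": 9800, "Mari": 10100, "Shin": 6400, "Taku": 6550}
for members, dist in hierarchical_clusters(activity):
    print(members, dist)
```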

  11. Semi-automated extraction and delineation of 3D roads of street scene from mobile laser scanning point clouds

    NASA Astrophysics Data System (ADS)

    Yang, Bisheng; Fang, Lina; Li, Jonathan

    2013-05-01

    Accurate 3D road information is important for applications such as road maintenance and virtual 3D modeling. Mobile laser scanning (MLS) is an efficient technique for capturing dense point clouds that can be used to construct detailed road models for large areas. This paper presents a method for extracting and delineating roads from large-scale MLS point clouds. The proposed method partitions MLS point clouds into a set of consecutive "scanning lines", each of which consists of a road cross section. A moving window operator is used to filter out non-ground points line by line, and curb points are detected based on curb patterns. The detected curb points are tracked and refined so that they are both globally consistent and locally similar. To evaluate the validity of the proposed method, experiments were conducted using two types of street-scene point clouds captured by Optech's Lynx Mobile Mapper System. The completeness, correctness, and quality of the extracted roads are over 94.42%, 91.13%, and 91.3%, respectively, which shows that the proposed method is a promising solution for extracting 3D roads from MLS point clouds.
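    The line-by-line ground filtering step can be sketched with a simple moving-window rule: a point survives if it lies close to the lowest point within its window, so curb and off-ground points are rejected. The window size, height tolerance, and scan-line data below are invented, not the paper's actual parameters:

```python
# Sketch of per-scan-line ground filtering with a moving window, in the
# spirit of the method above (window size and tolerance are invented).
def filter_ground(line, window=3, tol=0.15):
    """line: list of (x, z) points ordered along a road cross section.
    A point is kept as ground if its height z lies within `tol` metres of
    the lowest point inside its moving window."""
    ground = []
    for i, (x, z) in enumerate(line):
        lo = max(0, i - window)
        hi = min(len(line), i + window + 1)
        zmin = min(p[1] for p in line[lo:hi])
        if z - zmin <= tol:
            ground.append((x, z))
    return ground

# Flat road surface, then a curb jump at x = 1.5 and a wall point at x = 2.0.
scan = [(0.0, 0.02), (0.5, 0.03), (1.0, 0.01), (1.5, 0.18), (2.0, 1.45)]
print(filter_ground(scan))
```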

  12. Automated on-fiber derivatization with headspace SPME-GC-MS-MS for the determination of primary amines in sewage sludge using pressurized hot water extraction.

    PubMed

    Llop, Anna; Pocurull, Eva; Borrull, Francesc

    2011-07-01

    An automated, environmentally friendly, simple, selective, and sensitive method was developed for the determination of ten primary aliphatic amines in sewage sludge at μg/kg dry weight (d.w.). The procedure involves pressurized hot water extraction (PHWE) of the analytes from the solid matrix, followed by fully automated on-fiber derivatization with 2,3,4,5-pentafluorobenzaldehyde (PFBAY) and headspace solid-phase microextraction (HS-SPME), and subsequent gas chromatography ion-trap tandem mass spectrometry (GC-IT-MS-MS) analysis. The limits of detection (LODs) of the method were between 0.5 and 45 μg/kg (d.w.) for all compounds except for ethyl-, isopropyl-, and amylamine, whose LODs were 70, 109, and 116 μg/kg (d.w.), respectively. The limits of quantification (LOQs) were between 10 and 350 μg/kg (d.w.). Repeatability and intermediate precision, expressed as RSD(%) (n=3), were lower than 18% and 21%, respectively. The method enabled the determination of primary aliphatic amines in sludge from various urban and industrial sewage treatment plants as well as from a potable water treatment plant. Most of the primary aliphatic amines were found in the sewage sludge samples analyzed, with the maximum concentrations in samples from the urban plant: for instance, isobutylamine and methylamine were found at 7728 and 12,536 μg/kg (d.w.), respectively. Amylamine was detected in only a few samples and always at concentrations lower than its LOQ.

  13. An unsupervised text mining method for relation extraction from biomedical literature.

    PubMed

    Quan, Changqin; Wang, Meng; Ren, Fuji

    2014-01-01

    The wealth of interaction information provided in biomedical articles motivated the implementation of text mining approaches to automatically extract biomedical relations. This paper presents an unsupervised method based on pattern clustering and sentence parsing to deal with biomedical relation extraction. The pattern clustering algorithm is based on the polynomial kernel method and identifies interaction words from unlabeled data; these interaction words are then used in relation extraction between entity pairs. Dependency parsing and phrase structure parsing are combined for relation extraction. Based on the semi-supervised KNN algorithm, we extend the proposed unsupervised approach to a semi-supervised approach by combining pattern clustering, dependency parsing and phrase structure parsing rules. We evaluated the approaches on two different tasks: (1) protein-protein interaction extraction, and (2) gene-suicide association extraction. The evaluation of task (1) on the benchmark dataset (AImed corpus) showed that our proposed unsupervised approach outperformed three supervised methods, which are rule-based, SVM-based, and kernel-based, respectively. The proposed semi-supervised approach is superior to the existing semi-supervised methods. The evaluation of gene-suicide association extraction on a smaller dataset from the Genetic Association Database and a larger dataset from publicly available PubMed showed that the proposed unsupervised and semi-supervised methods achieved much higher F-scores than a co-occurrence-based method.
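    The co-occurrence baseline the paper compares against can be sketched in a few lines: a gene-outcome pair is scored by how many abstracts mention both terms. The toy corpus and term lists below are invented for illustration:

```python
from collections import Counter

# Toy corpus and term lists (invented); real systems would use normalized
# entity mentions rather than raw word matching.
abstracts = [
    "BDNF expression was reduced in suicide completers",
    "TPH2 variants were associated with suicide attempts",
    "BDNF levels correlated with depression severity",
]
genes, outcomes = {"BDNF", "TPH2"}, {"suicide", "depression"}

# Co-occurrence score: number of abstracts mentioning both terms.
counts = Counter()
for text in abstracts:
    words = set(text.split())
    for g in genes & words:
        for o in outcomes & words:
            counts[(g, o)] += 1
print(counts[("BDNF", "suicide")], counts[("TPH2", "suicide")])
```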

  14. Automation of Silica Bead-based Nucleic Acid Extraction on a Centrifugal Lab-on-a-Disc Platform

    NASA Astrophysics Data System (ADS)

    Kinahan, David J.; Mangwanya, Faith; Garvey, Robert; Chung, Danielle WY; Lipinski, Artur; Julius, Lourdes AN; King, Damien; Mohammadi, Mehdi; Mishra, Rohit; Al-Ofi, May; Miyazaki, Celina; Ducrée, Jens

    2016-10-01

    We describe a centrifugal microfluidic ‘Lab-on-a-Disc’ (LoaD) technology for DNA purification towards eventual integration into a Sample-to-Answer platform for detection of the pathogen Escherichia coli O157:H7 from food samples. For this application, we use a novel microfluidic architecture which combines ‘event-triggered’ dissolvable film (DF) valves with a reaction chamber gated by a centrifugo-pneumatic siphon valve (CPSV). This architecture permits comprehensive flow control through simple changes in the speed of the platform's innate spindle motor. Even before method optimisation, characterisation by DNA fluorescence reveals an extraction efficiency of 58%, which is close to that of commercial spin columns.

  15. Extracting Concepts Related to Homelessness from the Free Text of VA Electronic Medical Records.

    PubMed

    Gundlapalli, Adi V; Carter, Marjorie E; Divita, Guy; Shen, Shuying; Palmer, Miland; South, Brett; Durgahee, B S Begum; Redd, Andrew; Samore, Matthew

    2014-01-01

    Mining the free text of electronic medical records (EMR) using natural language processing (NLP) is an effective method of extracting information not always captured in administrative data. We sought to determine if concepts related to homelessness, a non-medical condition, were amenable to extraction from the EMR of Veterans Affairs (VA) medical records. As there were no off-the-shelf products, a lexicon of terms related to homelessness was created. A corpus of free text documents from outpatient encounters was reviewed to create the reference standard for NLP training and testing. V3NLP Framework was used to detect instances of lexical terms and was compared to the reference standard. With a positive predictive value of 77% for extracting relevant concepts, this study demonstrates the feasibility of extracting positively asserted concepts related to homelessness from the free text of medical records.
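    A toy version of lexicon-based concept detection, plus the positive predictive value used to evaluate it. The lexicon terms and note text below are invented; only the 77% PPV figure comes from the study:

```python
import re

# Invented mini-lexicon of homelessness-related terms (the study built a
# much larger lexicon from scratch, as no off-the-shelf product existed).
lexicon = ["homeless", "shelter", "living on the street"]
pattern = re.compile("|".join(map(re.escape, lexicon)), re.IGNORECASE)

def find_concepts(note: str):
    """Return lowercased lexicon matches found in a free-text note."""
    return [m.group(0).lower() for m in pattern.finditer(note)]

note = "Patient is homeless, currently staying at a shelter downtown."
hits = find_concepts(note)
print(hits)

# PPV = true positives / all system positives; the study reports 77%.
tp, fp = 77, 23
print(tp / (tp + fp))
```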

  16. Revealing Dimensions of Thinking in Open-Ended Self-Descriptions: An Automated Meaning Extraction Method for Natural Language

    PubMed Central

    2008-01-01

    A new method for extracting common themes from written text is introduced and applied to 1,165 open-ended self-descriptive narratives. Drawing on a lexical approach to personality, the most commonly used adjectives within narratives written by college students were identified using computerized text analytic tools. A factor analysis on the use of these adjectives in the self-descriptions produced a 7-factor solution consisting of psychologically meaningful dimensions. Some dimensions were unipolar (e.g., a Negativity factor, wherein most loaded items were negatively valenced adjectives); others were dimensional in that semantically opposite words clustered together (e.g., a Sociability factor, wherein terms such as shy, outgoing, reserved, and loud all loaded in the same direction). The factors exhibited modest reliability across different types of writing samples and were correlated with self-reports and behaviors consistent with the dimensions. Similar analyses with additional content words (adjectives, adverbs, nouns, and verbs) yielded additional psychological dimensions associated with physical appearance, school, relationships, etc., in which people contextualize their self-concepts. The results suggest that the meaning extraction method is a promising strategy for determining the dimensions along which people think about themselves. PMID:18802499
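    The first step of the meaning extraction method can be sketched as building a narrative-by-adjective count matrix, which a factor analysis would then decompose into dimensions. The adjective list and narratives below are invented; the factor analysis itself is omitted:

```python
# Build a narrative-by-adjective count matrix; each row is one narrative,
# each column one common adjective (lists invented for illustration).
adjectives = ["shy", "outgoing", "happy", "negative"]
narratives = [
    "i am shy but happy",
    "i am outgoing and happy happy",
    "i feel negative and shy",
]
matrix = [[text.split().count(a) for a in adjectives] for text in narratives]
for row in matrix:
    print(row)
```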

  17. MDL constrained 3-D grayscale skeletonization algorithm for automated extraction of dendrites and spines from fluorescence confocal images.

    PubMed

    Yuan, Xiaosong; Trachtenberg, Joshua T; Potter, Steve M; Roysam, Badrinath

    2009-12-01

    This paper presents a method for improved automatic delineation of dendrites and spines from three-dimensional (3-D) images of neurons acquired by confocal or multi-photon fluorescence microscopy. The core advance presented here is a direct grayscale skeletonization algorithm that is constrained by a structural complexity penalty using the minimum description length (MDL) principle, and additional neuroanatomy-specific constraints. The 3-D skeleton is extracted directly from the grayscale image data, avoiding errors introduced by image binarization. The MDL method achieves a practical tradeoff between the complexity of the skeleton and its coverage of the fluorescence signal. Additional advances include the use of 3-D spline smoothing of dendrites to improve spine detection, and graph-theoretic algorithms to explore and extract the dendritic structure from the grayscale skeleton using an intensity-weighted minimum spanning tree (IW-MST) algorithm. This algorithm was evaluated on 30 datasets organized in 8 groups from multiple laboratories. Spines were detected with false negative rates less than 10% on most datasets (the average is 7.1%), and the average false positive rate was 11.8%. The software is available in open source form.
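    The IW-MST step can be sketched with a minimal Kruskal's algorithm in which candidate links between skeleton nodes are weighted inversely to image intensity, so bright paths are preferred. The node ids, intensities, and weighting scheme below are illustrative, not the paper's exact formulation:

```python
# Minimal Kruskal's algorithm for an intensity-weighted MST (IW-MST) over
# skeleton nodes; nodes and weights are invented for illustration.
def kruskal_mst(n, edges):
    """edges: (weight, u, v) tuples; returns the list of MST edges."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path compression
            x = parent[x]
        return x
    mst = []
    for w, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:                        # skip edges that would form a cycle
            parent[ru] = rv
            mst.append((w, u, v))
    return mst

# weight = 1 / mean intensity along the candidate link, so brighter
# (higher-intensity) links are cheaper and preferred by the MST.
edges = [(1 / 200, 0, 1), (1 / 50, 1, 2), (1 / 180, 0, 2), (1 / 40, 2, 3)]
print(kruskal_mst(4, edges))
```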

  18. Extraction and colorimetric determination of azadirachtin-related limonoids in neem seed kernel.

    PubMed

    Dai, J; Yaylayan, V A; Raghavan, G S; Parè, J R

    1999-09-01

    A colorimetric method was developed for the determination of total azadirachtin-related limonoids (AZRL) in neem seed kernel extracts. The method employed acidified vanillin solution in methanol for the colorization of the standard azadirachtin or neem seed kernel extracts in dichloromethane. Through the investigation of various factors influencing the sensitivity of detection, such as the concentrations of vanillin and acid and the time required for color formation, optimum conditions were selected for the assay. Under the optimum conditions, good linearity was found between the absorbance at 577 nm and the concentration of standard azadirachtin solution in the range of 0.01-0.10 mg/mL. In addition, different extraction procedures were evaluated using the vanillin assay. The HPLC analysis of the extracts indicated that if the extractions were performed in methanol followed by partitioning in dichloromethane, approximately 50% of the value determined by the vanillin assay represents azadirachtin content.
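    The calibration implied by the linearity claim is an ordinary least-squares line of absorbance versus concentration; a sketch with invented absorbance readings (only the 0.01-0.10 mg/mL range comes from the abstract):

```python
# Ordinary least-squares fit of a calibration line, absorbance vs
# concentration. Concentration range is from the abstract; the A577
# readings are invented for illustration.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx           # (slope, intercept)

conc = [0.01, 0.02, 0.05, 0.10]          # mg/mL azadirachtin standard
a577 = [0.11, 0.21, 0.52, 1.03]          # absorbance at 577 nm (invented)
m, b = fit_line(conc, a577)
print(round(m, 2), round(b, 3))
```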

  19. Automated and quantitative headspace in-tube extraction for the accurate determination of highly volatile compounds from wines and beers.

    PubMed

    Zapata, Julián; Mateo-Vivaracho, Laura; Lopez, Ricardo; Ferreira, Vicente

    2012-03-23

    An automatic headspace in-tube extraction (ITEX) method for the accurate determination of acetaldehyde, ethyl acetate, diacetyl and other volatile compounds from wine and beer has been developed and validated. Method accuracy is based on the nearly quantitative transfer of volatile compounds from the sample to the ITEX trap. To achieve that goal, most methodological aspects and parameters were carefully examined. The vial and sample sizes and the trapping materials were found to be critical due to the pernicious saturation effects of ethanol. Small 2 mL vials containing very small amounts of sample (20 μL of 1:10 diluted sample) and a trap filled with 22 mg of Bond Elut ENV resins could guarantee complete trapping of sample vapors. The complete extraction requires 100 × 0.5 mL pumping strokes at 60 °C and takes 24 min. Analytes are further desorbed at 240 °C into the GC injector under a 1:5 split ratio. The proportion of analytes finally transferred to the trap ranged from 85 to 99%. The validation of the method showed satisfactory figures of merit. Determination coefficients were better than 0.995 in all cases and good repeatability was also obtained (better than 7% in all cases). Reproducibility was better than 8.3% except for acetaldehyde (13.1%). Detection limits were below the odor detection thresholds of these target compounds in wine and beer and well below the normal ranges of occurrence. Recoveries were not significantly different from 100%, except in the case of acetaldehyde. In that case, it was determined that the method cannot break some of the adducts that this compound forms with sulfites; however, this problem was avoided by incubating the sample with glyoxal. The method can constitute a general and reliable alternative for the analysis of very volatile compounds in other difficult matrices.

  20. Extraction conditions of white rose petals for the inhibition of enzymes related to skin aging.

    PubMed

    Choi, Ehn-Kyoung; Guo, Haiyu; Choi, Jae-Kwon; Jang, Su-Kil; Shin, Kyungha; Cha, Ye-Seul; Choi, Youngjin; Seo, Da-Woom; Lee, Yoon-Bok; Joo, Seong-So; Kim, Yun-Bae

    2015-09-01

    In order to assess inhibitory potentials of white rose petal extracts (WRPE) on the activities of enzymes related to dermal aging according to the extraction conditions, three extraction methods were adopted. WRPE was prepared by extracting dried white rose (Rosa hybrida) petals with 50% ethanol (WRPE-EtOH), Pectinex® SMASH XXL enzyme (WRPE-enzyme) or high temperature-high pressure (WRPE-HTHP). In the inhibition of matrix metalloproteinase-1, although the enzyme activity was fully inhibited by all 3 extracts at 100 µg/mL in 60 min, partial inhibition (50-70%) was achieved only by WRPE-EtOH and WRPE-enzyme at 50 µg/mL. High concentrations (≥250 µg/mL) of all 3 extracts markedly inhibited the elastase activity. However, at low concentrations (15.6-125 µg/mL), only WRPE-EtOH inhibited the enzyme activity. Notably, WRPE-EtOH was superior to WRPE-enzyme and WRPE-HTHP in the inhibition of tyrosinase. WRPE-EtOH significantly inhibited the enzyme activity from 31.2 µM, reaching 80% inhibition at 125 µM. In addition to its strong antioxidative activity, the ethanol extract of white rose petals was confirmed to be effective in inhibiting skin aging-related enzymes. Therefore, it is suggested that WRPE-EtOH could be a good candidate for the improvement of skin aging such as wrinkle formation and pigmentation.

  1. Extraction conditions of white rose petals for the inhibition of enzymes related to skin aging

    PubMed Central

    Choi, Ehn-Kyoung; Guo, Haiyu; Choi, Jae-Kwon; Jang, Su-Kil; Shin, Kyungha; Cha, Ye-Seul; Choi, Youngjin; Seo, Da-Woom; Lee, Yoon-Bok

    2015-01-01

    In order to assess inhibitory potentials of white rose petal extracts (WRPE) on the activities of enzymes related to dermal aging according to the extraction conditions, three extraction methods were adopted. WRPE was prepared by extracting dried white rose (Rosa hybrida) petals with 50% ethanol (WRPE-EtOH), Pectinex® SMASH XXL enzyme (WRPE-enzyme) or high temperature-high pressure (WRPE-HTHP). In the inhibition of matrix metalloproteinase-1, although the enzyme activity was fully inhibited by all 3 extracts at 100 µg/mL in 60 min, partial inhibition (50-70%) was achieved only by WRPE-EtOH and WRPE-enzyme at 50 µg/mL. High concentrations (≥250 µg/mL) of all 3 extracts markedly inhibited the elastase activity. However, at low concentrations (15.6-125 µg/mL), only WRPE-EtOH inhibited the enzyme activity. Notably, WRPE-EtOH was superior to WRPE-enzyme and WRPE-HTHP in the inhibition of tyrosinase. WRPE-EtOH significantly inhibited the enzyme activity from 31.2 µM, reaching 80% inhibition at 125 µM. In addition to its strong antioxidative activity, the ethanol extract of white rose petals was confirmed to be effective in inhibiting skin aging-related enzymes. Therefore, it is suggested that WRPE-EtOH could be a good candidate for mitigating signs of skin aging such as wrinkle formation and pigmentation. PMID:26472968

  2. Analysis of trace contamination of phthalate esters in ultrapure water using a modified solid-phase extraction procedure and automated thermal desorption-gas chromatography/mass spectrometry.

    PubMed

    Liu, Hsu-Chuan; Den, Walter; Chan, Shu-Fei; Kin, Kuan Tzu

    2008-04-25

    The present study aimed to develop a procedure modified from the conventional solid-phase extraction (SPE) method for the analysis of trace concentrations of phthalate esters in industrial ultrapure water (UPW). The proposed procedure allows a UPW sample to be drawn through a sampling tube containing hydrophobic sorbent (Tenax TA) to concentrate the aqueous phthalate esters. The solid trap was then demoisturized by two-stage gas drying before being subjected to thermal desorption and analysis by gas chromatography-mass spectrometry. This process eliminates the solvent extraction step required by the conventional SPE method, and permits automation of the analytical procedure for high-volume analyses. Several important parameters, including desorption temperature and duration, packing quantity and demoisturizing procedure, were optimized in this study based on the analytical sensitivity for a standard mixture containing five different phthalate esters. The method detection limits for the five phthalate esters were between 36 ng l(-1) and 95 ng l(-1) and recovery rates between 15% and 101%. Dioctyl phthalate (DOP) was not recovered adequately because the compound was both poorly adsorbed onto and poorly desorbed from the Tenax TA sorbent. Furthermore, analyses of material leaching from poly(vinyl chloride) (PVC) tubes as well as of actual water samples showed that di-n-butyl phthalate (DBP) and di(2-ethylhexyl) phthalate (DEHP) were the common contaminants detected in PVC-contaminated UPW and the actual UPW, as well as in tap water. The reduction of DEHP in the production processes of actual UPW was clearly observed; however, a DEHP concentration of 0.20 microg l(-1) was still quantified at the point of use, suggesting that the contamination of phthalate esters could present a barrier to future cleanliness requirements of UPW. The work demonstrated that the proposed modified SPE procedure provided an effective method for rapid analysis and contamination

  3. High sensitivity measurements of active oxysterols with automated filtration/filter backflush-solid phase extraction-liquid chromatography-mass spectrometry.

    PubMed

    Roberg-Larsen, Hanne; Strand, Martin Frank; Grimsmo, Anders; Olsen, Petter Angell; Dembinski, Jennifer L; Rise, Frode; Lundanes, Elsa; Greibrokk, Tyge; Krauss, Stefan; Wilson, Steven Ray

    2012-09-14

    Oxysterols are important in numerous biological processes, including cell signaling. Here we present an automated filtration/filter backflush-solid phase extraction-liquid chromatography-tandem mass spectrometry (AFFL-SPE-LC-MS/MS) method for determining 24S-hydroxycholesterol (24S-OHC) and the isomers 25-hydroxycholesterol and 22S-hydroxycholesterol that enables simplified sample preparation, high sensitivity (~25 pg/mL cell lysis sample) and low sample variability. Only one sample transfer step was required for the entire process of cell lysis, derivatization and determination of selected oxysterols. During the procedure, autoxidation of cholesterol, a potential/common problem with standard analytical methods, was found to be negligible. The reversed phase AFFL-SPE-LC-MS/MS method, utilizing a 1 mm inner diameter column, was validated and used to determine levels of the oxysterol analytes in the mouse fibroblast cell lines SSh-LII and NIH-3T3 and the human cancer cell lines BxPC3, HCT-15 and HCT-116. In BxPC3 cells, the AFFL-SPE-LC-MS/MS method was used to detect significant differences in 24S-OHC levels between vimentin+ and vimentin- heterogeneous sub-populations. The methodology also allowed monitoring of significant alterations in 24S-OHC levels upon delivery of the Hedgehog (Hh) antagonist MS-0022 in HCT-116 colorectal carcinoma cell lines.

  4. Comparison of turbulent-flow chromatography with automated solid-phase extraction in 96-well plates and liquid-liquid extraction used as plasma sample preparation techniques for liquid chromatography-tandem mass spectrometry.

    PubMed

    Zimmer, D; Pickard, V; Czembor, W; Müller, C

    1999-08-27

    Turbulent flow chromatography (TFC) combined with the high selectivity and sensitivity of tandem mass spectrometry (MS-MS) is a new technique for the fast direct analysis of drugs from crude plasma. TFC in the 96-well plate format significantly reduces the time required for sample clean-up in the laboratory. For example, for 100 samples the workload for a technician is reduced from about 8 h with a manual liquid-liquid extraction (LLE) assay to about 1 h with TFC. Sample clean-up and analysis are performed on-line on the same column. Similar chromatographic performance and validation results were achieved using HTLC Turbo-C18 columns (Cohesive Technologies) and Oasis HLB extraction columns (Waters). One 96-well plate with 96 plasma samples is analyzed within 5.25 h, corresponding to 3.3 min per sample. By comparison, LLE and analysis of 96 samples takes about 16 h. Two structurally different and highly protein-bound compounds, drug A and drug B, were analyzed under identical TFC conditions and the assays were fully validated for application to toxicokinetic studies (compliant with Good Laboratory Practice, GLP). The limit of quantitation was 1.00 microg/l and the linear working range covered three orders of magnitude for both drugs. In the case of drug A the quality of analysis by TFC was similar to the reference LLE assay and slightly better than automated solid-phase extraction in 96-well plates. The accuracy was -3.1 to 6.7% and the precision was 3.1 to 6.8% for drug A determined in dog plasma by TFC-MS-MS. For drug B the accuracy was -3.7 to 3.5% and the precision was 1.6 to 5.4% in rat plasma, which is even slightly better than what was achieved with the validated protein precipitation assay.
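
    The per-sample throughput figures quoted above follow directly from the batch numbers. The sketch below simply recomputes them; all constants are taken from the abstract.

```python
# Throughput comparison using the figures quoted in the abstract above.
PLATE_SIZE = 96      # samples per 96-well plate
TFC_HOURS = 5.25     # on-line TFC clean-up and analysis per plate
LLE_HOURS = 16.0     # manual LLE and analysis for the same batch

tfc_min_per_sample = TFC_HOURS * 60 / PLATE_SIZE
lle_min_per_sample = LLE_HOURS * 60 / PLATE_SIZE

print(round(tfc_min_per_sample, 1))  # 3.3 min per sample, as reported
print(round(lle_min_per_sample, 1))  # 10.0 min per sample by LLE
```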

  5. The extraction of human urinary kinin (substance z) and its relation to the plasma kinins

    PubMed Central

    Gaddum, J. H.; Horton, E. W.

    1959-01-01

    Human urinary kinin (substance Z) has been extracted by modifications of the methods previously described by Gomes (1955) and Jensen (1958). The separation of two oxytocic fractions from such extracts by paper pulp chromatography (Walaszek, 1957; Jensen, 1958) could not be confirmed. Substance Z could not be distinguished from kallidin, bradykinin or glass-activated kinin by parallel quantitative assays, thus confirming that these four substances are very closely related. PMID:13651588

  6. Automated measurement of parameters related to the deformities of lower limbs based on x-rays images.

    PubMed

    Wojciechowski, Wadim; Molka, Adrian; Tabor, Zbisław

    2016-03-01

    Measurement of the deformation of the lower limbs in current standard full-limb X-ray images presents significant challenges to radiologists and orthopedists. The precision of these measurements is degraded by inexact positioning of the leg during image acquisition, problems with selecting reliable anatomical landmarks in projective X-ray images, and inevitable errors of manual measurements. The influence of the random errors resulting from the last two factors on the precision of the measurement can be reduced if an automated measurement method is used instead of a manual one. In this paper, a framework for automated measurement of various metric and angular quantities used in the description of lower extremity deformation in full-limb frontal X-ray images is described. The results of automated measurements are compared with manual measurements. These results demonstrate that an automated method can be a valuable alternative to manual measurement.
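
    A minimal sketch of the kind of angular measurement such a framework automates: the angle between two limb axes, each defined by a pair of landmark coordinates. The landmark positions below are hypothetical, purely illustrative values; the paper's actual landmark detection is not reproduced here.

```python
import math

def axis_angle_deg(p1, p2, q1, q2):
    """Angle (degrees) between the line p1->p2 and the line q1->q2."""
    v = (p2[0] - p1[0], p2[1] - p1[1])
    w = (q2[0] - q1[0], q2[1] - q1[1])
    dot = v[0] * w[0] + v[1] * w[1]
    norm = math.hypot(*v) * math.hypot(*w)
    return math.degrees(math.acos(dot / norm))

# Hypothetical femoral and tibial axis landmarks (pixel coordinates):
femur = ((100.0, 50.0), (110.0, 400.0))
tibia = ((110.0, 400.0), (105.0, 760.0))
print(round(axis_angle_deg(*femur, *tibia), 1))  # angle between the two axes
```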

  7. Development and validation of an automated liquid-liquid extraction GC/MS method for the determination of THC, 11-OH-THC, and free THC-carboxylic acid (THC-COOH) from blood serum.

    PubMed

    Purschke, Kirsten; Heinl, Sonja; Lerch, Oliver; Erdmann, Freidoon; Veit, Florian

    2016-06-01

    The analysis of Δ(9)-tetrahydrocannabinol (THC) and its metabolites 11-hydroxy-Δ(9)-tetrahydrocannabinol (11-OH-THC), and 11-nor-9-carboxy-Δ(9)-tetrahydrocannabinol (THC-COOH) from blood serum is a routine task in forensic toxicology laboratories. For examination of consumption habits, the concentration of the phase I metabolite THC-COOH is used. Recommendations for interpretation of analysis values in medical-psychological assessments (regranting of driver's licenses, Germany) include threshold values for the free, unconjugated THC-COOH. Using a fully automated two-step liquid-liquid extraction, THC, 11-OH-THC, and free, unconjugated THC-COOH were extracted from blood serum, silylated with N-methyl-N-(trimethylsilyl) trifluoroacetamide (MSTFA), and analyzed by GC/MS. The automation was carried out by an x-y-z sample robot equipped with modules for shaking, centrifugation, and solvent evaporation. This method was based on a previously developed manual sample preparation method. Validation guidelines of the Society of Toxicological and Forensic Chemistry (GTFCh) were fulfilled for both methods, at which the focus of this article is the automated one. Limits of detection and quantification for THC were 0.3 and 0.6 μg/L, for 11-OH-THC were 0.1 and 0.8 μg/L, and for THC-COOH were 0.3 and 1.1 μg/L, when extracting only 0.5 mL of blood serum. Therefore, the required limit of quantification for THC of 1 μg/L in driving under the influence of cannabis cases in Germany (and other countries) can be reached and the method can be employed in that context. Real and external control samples were analyzed, and a round robin test was passed successfully. To date, the method is employed in the Institute of Legal Medicine in Giessen, Germany, in daily routine. Automation helps in avoiding errors during sample preparation and reduces the workload of the laboratory personnel. Due to its flexibility, the analysis system can be employed for other liquid-liquid extractions as

  8. CD-REST: a system for extracting chemical-induced disease relation in literature

    PubMed Central

    Xu, Jun; Wu, Yonghui; Zhang, Yaoyun; Wang, Jingqi; Lee, Hee-Jin; Xu, Hua

    2016-01-01

    Mining chemical-induced disease relations embedded in the vast biomedical literature could facilitate a wide range of computational biomedical applications, such as pharmacovigilance. The BioCreative V organized a Chemical Disease Relation (CDR) Track regarding chemical-induced disease relation extraction from biomedical literature in 2015. We participated in all subtasks of this challenge. In this article, we present our participation system Chemical Disease Relation Extraction SysTem (CD-REST), an end-to-end system for extracting chemical-induced disease relations in biomedical literature. CD-REST consists of two main components: (1) a chemical and disease named entity recognition and normalization module, which employs the Conditional Random Fields algorithm for entity recognition and a Vector Space Model-based approach for normalization; and (2) a relation extraction module that classifies both sentence-level and document-level candidate drug–disease pairs by support vector machines. Our system achieved the best performance on the chemical-induced disease relation extraction subtask in the BioCreative V CDR Track, demonstrating the effectiveness of our proposed machine learning-based approaches for automatic extraction of chemical-induced disease relations in biomedical literature. The CD-REST system provides web services using HTTP POST request. The web services can be accessed from http://clinicalnlptool.com/cdr. The online CD-REST demonstration system is available at http://clinicalnlptool.com/cdr/cdr.html. Database URL: http://clinicalnlptool.com/cdr; http://clinicalnlptool.com/cdr/cdr.html PMID:27016700
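
    To illustrate the pipeline's structure, the sketch below shows only the candidate-generation step that feeds the relation classifiers: enumerating sentence-level chemical-disease co-occurrence pairs from already-recognized, normalized entity mentions. The entity identifiers are hypothetical placeholders; the CRF recognizer and the SVM classifiers themselves are not reproduced.

```python
def candidate_pairs(sentences):
    """Yield (chemical, disease, sentence_idx) co-occurrence candidates.

    `sentences` is a list of dicts holding pre-recognized, normalized
    entity mentions -- a stand-in for the NER/normalization stage.
    """
    pairs = []
    for i, sent in enumerate(sentences):
        for chem in sent["chemicals"]:
            for dis in sent["diseases"]:
                pairs.append((chem, dis, i))
    return pairs

doc = [
    {"chemicals": ["D008750"], "diseases": ["D006973"]},        # hypothetical IDs
    {"chemicals": ["D008750", "D000068877"], "diseases": []},
]
print(candidate_pairs(doc))  # [('D008750', 'D006973', 0)]
```

    Each candidate pair would then be turned into a feature vector and scored by the sentence-level and document-level classifiers.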

  9. Semi-automated relative quantification of cell culture contamination with mycoplasma by Photoshop-based image analysis on immunofluorescence preparations.

    PubMed

    Kumar, Ashok; Yerneni, Lakshmana K

    2009-01-01

    Mycoplasma contamination in cell culture is a serious setback for the cell-culturist. The experiments undertaken using contaminated cell cultures are known to yield unreliable or false results due to various morphological, biochemical and genetic effects. Earlier surveys revealed incidences of mycoplasma contamination in cell cultures to range from 15 to 80%. Out of a vast array of methods for detecting mycoplasma in cell culture, the cytological methods directly demonstrate the contaminating organism present in association with the cultured cells. In this investigation, we report the adoption of a cytological immunofluorescence assay (IFA), in an attempt to obtain a semi-automated relative quantification of contamination by employing the user-friendly Photoshop-based image analysis. The study performed on 77 cell cultures randomly collected from various laboratories revealed mycoplasma contamination in 18 cell cultures simultaneously by IFA and Hoechst DNA fluorochrome staining methods. It was observed that the Photoshop-based image analysis on IFA stained slides was very valuable as a sensitive tool in providing quantitative assessment on the extent of contamination both per se and in comparison to cellularity of cell cultures. The technique could be useful in estimating the efficacy of anti-mycoplasma agents during decontaminating measures.
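
    At its core, the image analysis described above amounts to counting signal-positive pixels in two fluorescence channels and relating the areas. A minimal stand-alone sketch, with a hypothetical threshold and toy pixel data standing in for real micrograph channels:

```python
def relative_contamination(ifa_pixels, dna_pixels, threshold=128):
    """Ratio of IFA-positive (mycoplasma) area to total DNA-stained area.

    Both inputs are flat lists of 0-255 pixel intensities, standing in
    for the IFA and Hoechst DNA-fluorochrome channels of a micrograph.
    """
    ifa_area = sum(1 for p in ifa_pixels if p >= threshold)
    dna_area = sum(1 for p in dna_pixels if p >= threshold)
    return ifa_area / dna_area if dna_area else 0.0

ifa = [0, 200, 150, 30, 255, 10]
dna = [180, 220, 140, 200, 90, 130]
print(relative_contamination(ifa, dna))  # 3 positive / 5 positive = 0.6
```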

  10. Systematic review automation technologies.

    PubMed

    Tsafnat, Guy; Glasziou, Paul; Choong, Miew Keen; Dunn, Adam; Galgani, Filippo; Coiera, Enrico

    2014-07-09

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors to the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of the tasks of the systematic review. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed for realizing automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage that the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time.

  11. Systematic review automation technologies

    PubMed Central

    2014-01-01

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors to the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of the tasks of the systematic review. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed for realizing automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage that the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128

  12. Automation of reverse engineering process in aircraft modeling and related optimization problems

    NASA Technical Reports Server (NTRS)

    Li, W.; Swetits, J.

    1994-01-01

    During 1994, engineering problems in aircraft modeling were studied. The initial concern was to obtain a surface model with desirable geometric characteristics. Much of the effort during the first half of the year was to find an efficient way of solving a computationally difficult optimization model. Since the smoothing technique in the proposal 'Surface Modeling and Optimization Studies of Aerodynamic Configurations' requires solutions of a sequence of large-scale quadratic programming problems, it is important to design algorithms that can solve each quadratic program in a few iterations. This research led to three papers by Dr. W. Li, which were submitted to SIAM Journal on Optimization and Mathematical Programming. Two of these papers have been accepted for publication. Even though significant progress was made during this phase of research and computation time was reduced from 30 min to 2 min for a sample problem, it was not good enough for on-line processing of digitized data points. After discussion with Dr. Robert E. Smith Jr., it was decided not to enforce shape constraints in order to simplify the model. As a consequence, P. Dierckx's nonparametric spline fitting approach was adopted, where one has only one control parameter for the fitting process - the error tolerance. At the same time, the surface modeling software developed by Imageware was tested. Research indicated that a substantially improved fit of digitized data points can be achieved if a proper parameterization of the spline surface is chosen. A winning strategy is to incorporate Dierckx's surface fitting with a natural parameterization for aircraft parts. The report consists of 4 chapters. Chapter 1 provides an overview of reverse engineering related to aircraft modeling and some preliminary findings of the effort in the second half of the year. Chapters 2-4 are the research results by Dr. W. Li on penalty functions and conjugate gradient methods for

  13. Exploiting syntactic and semantics information for chemical–disease relation extraction

    PubMed Central

    Zhou, Huiwei; Deng, Huijie; Chen, Long; Yang, Yunlong; Jia, Chen; Huang, Degen

    2016-01-01

    Identifying chemical–disease relations (CDR) from biomedical literature could improve chemical safety and toxicity studies. This article proposes a novel syntactic and semantic information exploitation method for CDR extraction. The proposed method consists of a feature-based model, a tree kernel-based model and a neural network model. The feature-based model exploits lexical features, the tree kernel-based model captures syntactic structure features, and the neural network model generates semantic representations. The motivation of our method is to fully utilize the nice properties of the three models to explore diverse information for CDR extraction. Experiments on the BioCreative V CDR dataset show that the three models are all effective for CDR extraction, and their combination could further improve extraction performance. Database URL: http://www.biocreative.org/resources/corpora/biocreative-v-cdr-corpus/. PMID:27081156

  14. Exploiting syntactic and semantics information for chemical-disease relation extraction.

    PubMed

    Zhou, Huiwei; Deng, Huijie; Chen, Long; Yang, Yunlong; Jia, Chen; Huang, Degen

    2016-01-01

    Identifying chemical-disease relations (CDR) from biomedical literature could improve chemical safety and toxicity studies. This article proposes a novel syntactic and semantic information exploitation method for CDR extraction. The proposed method consists of a feature-based model, a tree kernel-based model and a neural network model. The feature-based model exploits lexical features, the tree kernel-based model captures syntactic structure features, and the neural network model generates semantic representations. The motivation of our method is to fully utilize the nice properties of the three models to explore diverse information for CDR extraction. Experiments on the BioCreative V CDR dataset show that the three models are all effective for CDR extraction, and their combination could further improve extraction performance. Database URL: http://www.biocreative.org/resources/corpora/biocreative-v-cdr-corpus/.

  15. UHPLC/HRMS Analysis of African Mango (Irvingia gabonensis) Seeds, Extract and Related Dietary Supplements

    PubMed Central

    Sun, Jianghao; Chen, Pei

    2012-01-01

    Dietary supplements based on an extract from Irvingia gabonensis (African mango, AM) seeds are among the popular herbal weight-loss dietary supplements on the US market. The extract is believed to be a natural and healthy way to lose weight and improve overall health. However, the chemical composition of African mango-based dietary supplements (AMDS) has never been reported. In this study, the chemical constituents of African mango seeds, African mango seed extract (AMSE), and different kinds of commercially available AMDS were investigated using an ultra-high-performance liquid chromatography-high-resolution mass spectrometry (UHPLC-HRMS) method. Ellagic acid, mono-, di-, and tri-O-methyl-ellagic acids, and their glycosides were found to be major components of African mango seeds. These compounds may be used for quality control of African mango extract and related dietary supplements. PMID:22880691

  16. A fully automated system with on-line micro solid-phase extraction combined with capillary liquid chromatography-tandem mass spectrometry for high throughput analysis of microcystins and nodularin-R in tap water and lake water.

    PubMed

    Shan, Yuanhong; Shi, Xianzhe; Dou, Abo; Zou, Cunjie; He, Hongbing; Yang, Qin; Zhao, Sumin; Lu, Xin; Xu, Guowang

    2011-04-01

    Microcystins and nodularins are cyclic peptide hepatotoxins and tumour promoters from cyanobacteria. The present study describes the development, validation and practical application of a fully automated analytical method based on on-line micro solid-phase extraction-capillary liquid chromatography-tandem mass spectrometry for the simultaneous determination of seven microcystins and nodularin-R in tap water and lake water. Aliquots of just 100 μL of water sample are sufficient for the detection and quantification of all eight toxins. Selected reaction monitoring was used to obtain the highest sensitivity. Good linear calibrations were obtained for microcystins (50-2000 ng/L) and nodularin-R (25-1000 ng/L) in spiked tap water and lake water samples. Excellent interday and intraday repeatability was achieved for the eight toxins, with relative standard deviations of less than 15.7% at three different concentrations. Acceptable recoveries were achieved at the three concentrations in both the tap water and lake water matrices, and no significant matrix effect was found except for microcystin-RR. The limits of detection (signal-to-noise ratio = 3) of the toxins were lower than 56.6 ng/L, which is far below the 1 μg/L provisional guideline defined by the World Health Organization for microcystin-LR. Finally, this method was successfully applied to lake water samples from Tai Lake and proved to be useful for water quality monitoring.
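
    The detection-limit convention used above (LOD at a signal-to-noise ratio of 3) reduces to a one-line calculation. The calibration slope and noise level below are hypothetical values, chosen only so the toy numbers land near the abstract's 56.6 ng/L upper limit:

```python
def detection_limit(noise_sd, slope, sn_ratio=3.0):
    """Concentration giving a signal sn_ratio times the baseline noise.

    noise_sd : standard deviation of the blank signal (arbitrary units)
    slope    : calibration slope (signal units per ng/L) -- hypothetical here
    """
    return sn_ratio * noise_sd / slope

# Hypothetical calibration: slope 0.053 signal units per ng/L, noise SD 1.0
print(round(detection_limit(1.0, 0.053), 1))  # ~56.6 ng/L
```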

  17. Semisupervised Learning Based Disease-Symptom and Symptom-Therapeutic Substance Relation Extraction from Biomedical Literature

    PubMed Central

    Li, Yuxia

    2016-01-01

    With the rapid growth of biomedical literature, a large amount of knowledge about diseases, symptoms, and therapeutic substances hidden in the literature can be used for drug discovery and disease therapy. In this paper, we present a method of constructing two models for extracting disease-symptom and symptom-therapeutic substance relations from biomedical texts, respectively. The former judges whether a disease causes a certain physiological phenomenon, while the latter determines whether a substance relieves or eliminates a certain physiological phenomenon. These two kinds of relations can be further utilized to extract the relations between disease and therapeutic substance. In our method, two training sets for the disease-symptom and symptom-therapeutic substance relations are first manually annotated, and then two semisupervised learning algorithms, Co-Training and Tri-Training, are applied to utilize the unlabeled data to boost the relation extraction performance. Experimental results show that exploiting the unlabeled data with both the Co-Training and Tri-Training algorithms can enhance the performance effectively. PMID:27822473
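
    A minimal sketch of the Co-Training idea referenced above: two models trained on different feature views take turns confidently labeling unlabeled examples, growing a shared labeled pool. The nearest-centroid learner and the 1-D toy data are stand-ins, not the paper's actual feature views or classifiers:

```python
import statistics

class CentroidClassifier:
    """Tiny 1-D nearest-centroid learner standing in for a real classifier."""

    def fit(self, xs, ys):
        self.c0 = statistics.mean(x for x, y in zip(xs, ys) if y == 0)
        self.c1 = statistics.mean(x for x, y in zip(xs, ys) if y == 1)
        return self

    def predict(self, x):
        return int(abs(x - self.c1) < abs(x - self.c0))

    def confidence(self, x):
        return abs(abs(x - self.c0) - abs(x - self.c1))

def co_train(view_a, view_b, labels, unlabeled_a, unlabeled_b, rounds=2):
    """Each round, one view's model labels its most confident unlabeled
    example and adds it to the shared labeled pool (views alternate)."""
    la, lb, ys = list(view_a), list(view_b), list(labels)
    ua, ub = list(unlabeled_a), list(unlabeled_b)
    for r in range(rounds):
        if not ua:
            break
        ma = CentroidClassifier().fit(la, ys)
        mb = CentroidClassifier().fit(lb, ys)
        picker, pool = (ma, ua) if r % 2 == 0 else (mb, ub)
        i = max(range(len(pool)), key=lambda k: picker.confidence(pool[k]))
        ys.append(picker.predict(pool[i]))
        la.append(ua.pop(i))   # pops stay index-aligned across both views
        lb.append(ub.pop(i))
    return CentroidClassifier().fit(la, ys)

# Toy data: view A and view B are two "feature views" of the same examples.
model = co_train([1.0, 2.0, 8.0, 9.0], [0.1, 0.2, 0.9, 1.0],
                 [0, 0, 1, 1], [8.5, 1.5], [0.95, 0.15])
print(model.predict(7.0))  # 1: closer to the positive-class centroid
```

    Tri-Training follows the same self-labeling principle with three models voting instead of two views.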

  18. Microwave signal extraction from femtosecond mode-locked lasers with attosecond relative timing drift.

    PubMed

    Kim, Jungwon; Kärtner, Franz X

    2010-06-15

    We present a feedback-control method for suppression of excess phase noise in the optical-to-electronic conversion process involved in the extraction of microwave signals from femtosecond mode-locked lasers. A delay-locked loop based on drift-free phase detection with a differentially biased Sagnac loop is employed to eliminate low-frequency (e.g., <1 kHz) excess phase noise and drift in the regenerated microwave signals. A 10 GHz microwave signal is extracted from a 200 MHz repetition rate mode-locked laser with a relative rms timing jitter of 2.4 fs (integrated from 1 mHz to 1 MHz) and a relative rms timing drift of 0.84 fs (integrated over 8 h with 1 Hz bandwidth) between the optical pulse train and the extracted microwave signal.

  19. Automated Brightness and Contrast Adjustment of Color Fundus Photographs for the Grading of Age-Related Macular Degeneration

    PubMed Central

    Tsikata, Edem; Laíns, Inês; Gil, João; Marques, Marco; Brown, Kelsey; Mesquita, Tânia; Melo, Pedro; da Luz Cachulo, Maria; Kim, Ivana K.; Vavvas, Demetrios; Murta, Joaquim N.; Miller, John B.; Silva, Rufino; Miller, Joan W.; Chen, Teresa C.; Husain, Deeba

    2017-01-01

    Purpose The purpose of this study was to develop an algorithm to automatically standardize the brightness, contrast, and color balance of digital color fundus photographs used to grade AMD and to validate this algorithm by determining the effects of the standardization on image quality and disease grading. Methods Seven-field color photographs of patients (>50 years) with any stage of AMD and a control group were acquired at two study sites, with either the Topcon TRC-50DX or Zeiss FF-450 Plus cameras. Field 2 photographs were analyzed. Pixel brightness values in the red, green, and blue (RGB) color channels were adjusted in custom-built software to make the mean brightness and contrast of the images equal to optimal values determined by the Age-Related Eye Disease Study (AREDS) 2 group. Results Color photographs of 370 eyes were analyzed. We found a wide range of brightness and contrast values in the images at baseline, even for those taken with the same camera. After processing, image brightness variability (brightest image minus dimmest image in a color channel) was reduced 69-fold, 62-fold, and 96-fold for the RGB channels. Contrast variability was reduced 6-fold, 8-fold, and 13-fold, respectively, after adjustment. Of the 23% of images considered nongradable before adjustment, only 5.7% remained nongradable. Conclusions This automated software enables rapid and accurate standardization of color photographs for AMD grading. Translational Relevance This work offers the potential to be the future of assessing and grading AMD from photos for clinical research and teleimaging. PMID:28316876
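
    The per-channel standardization described above can be sketched as a linear rescaling of pixel values to a target mean brightness and contrast (standard deviation), clipped to the valid range. The target values below are placeholders, not the AREDS2 optima:

```python
import statistics

def standardize_channel(pixels, target_mean=128.0, target_sd=50.0):
    """Linearly rescale one color channel to a target mean brightness
    and contrast (SD), clipping to the valid 0-255 range."""
    mean = statistics.mean(pixels)
    sd = statistics.pstdev(pixels) or 1.0  # guard against flat images
    return [min(255.0, max(0.0, (p - mean) / sd * target_sd + target_mean))
            for p in pixels]

channel = [10.0, 20.0, 30.0, 40.0]     # a dim, low-contrast toy channel
out = standardize_channel(channel)
print(round(statistics.mean(out), 1))  # 128.0: mean brightness now on target
```

    Applying the same transform to each of the R, G and B channels equalizes brightness and contrast across images, which is what collapses the variability figures reported above.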

  20. Study on electrical current variations in electromembrane extraction process: Relation between extraction recovery and magnitude of electrical current.

    PubMed

    Rahmani, Turaj; Rahimi, Atyeh; Nojavan, Saeed

    2016-01-15

    This contribution presents an experimental approach to improving the analytical performance of the electromembrane extraction (EME) procedure, based on scrutiny of the electrical current pattern under different extraction conditions, such as different organic solvents as the supported liquid membrane, electrical potentials, pH values of the donor and acceptor phases, variable extraction times, temperatures, stirring rates, different hollow fiber lengths, and the addition of salts or organic solvents to the sample matrix. In this study, four basic drugs with different polarities were extracted under different conditions, and the corresponding electrical current patterns were compared against extraction recoveries. The extraction process was demonstrated in terms of EME-HPLC analyses of the selected basic drugs. Comparing the obtained extraction recoveries with the electrical current patterns, most cases exhibited minimum recovery and repeatability at the highest investigated magnitude of electrical current. It was further found that identical current patterns are associated with reproducible extraction efficiencies; in other words, the pattern should be reproduced for a successful extraction. The results showed completely different electrical currents under different extraction conditions, so that all variable parameters contribute to the electrical current pattern. Finally, the current patterns of extractions from wastewater, plasma and urine samples were demonstrated. The results indicated an increase in the electrical current when extracting from complex matrices, which was seen to decrease the extraction efficiency.

  1. Relation of retinal blood flow and retinal oxygen extraction during stimulation with diffuse luminance flicker.

    PubMed

    Palkovits, Stefan; Lasta, Michael; Told, Reinhard; Schmidl, Doreen; Werkmeister, René; Cherecheanu, Alina Popa; Garhöfer, Gerhard; Schmetterer, Leopold

    2015-12-17

Cerebral and retinal blood flow are dependent on local neuronal activity. Several studies quantified the increase in cerebral blood flow and oxygen consumption during activity. In the present study we investigated the relation between changes in retinal blood flow and oxygen extraction during stimulation with diffuse luminance flicker and the influence of breathing gas mixtures with different fractions of O2 (FiO2; 100%, 15%, and 12%). Twenty-four healthy subjects were included. Retinal blood flow was studied by combining measurement of vessel diameters using the Dynamic Vessel Analyser with measurements of blood velocity using laser Doppler velocimetry. Oxygen saturation was measured using spectroscopic reflectometry and oxygen extraction was calculated. Flicker stimulation increased retinal blood flow (57.7 ± 17.8%) and oxygen extraction (34.6 ± 24.1%; p < 0.001 each). During 100% oxygen breathing the response of retinal blood flow and oxygen extraction was increased (p < 0.01 each). By contrast, breathing gas mixtures with 12% and 15% FiO2 did not alter flicker-induced retinal haemodynamic changes. The present study indicates that at a comparable increase in blood flow the increase in oxygen extraction in the retina is larger than in the brain. During systemic hyperoxia the blood flow and oxygen extraction responses to neural stimulation are augmented. The underlying mechanism is unknown.

  2. Influence of DNA extraction methods on relative telomere length measurements and its impact on epidemiological studies

    PubMed Central

    Raschenberger, Julia; Lamina, Claudia; Haun, Margot; Kollerits, Barbara; Coassin, Stefan; Boes, Eva; Kedenko, Ludmilla; Köttgen, Anna; Kronenberg, Florian

    2016-01-01

Measurement of telomere length is widely used in epidemiologic studies. Insufficient standardization of the measurement processes has, however, complicated the comparison of results between studies. We aimed to investigate whether DNA extraction methods have an influence on measured values of relative telomere length (RTL) and whether this has consequences for epidemiological studies. We performed four experiments with RTL measurement in quadruplicate by qPCR using DNA extracted with different methods: 1) a standardized validation experiment including three extraction methods (magnetic-particle-method EZ1, salting-out-method INV, phenol-chloroform-isoamyl-alcohol PCI) each in the same 20 samples demonstrated pronounced differences in RTL with lowest values with EZ1 followed by INV and PCI-isolated DNA; 2) a comparison of 307 samples from an epidemiological study showing EZ1-measurements 40% lower than INV-measurements; 3) a matching-approach of two similar non-diseased control groups including 143 pairs of subjects revealed significantly shorter RTL in EZ1 than INV-extracted DNA (0.844 ± 0.157 vs. 1.357 ± 0.242); 4) an association analysis of RTL with prevalent cardiovascular disease detected a stronger association with INV than with EZ1-extracted DNA. In summary, DNA extraction methods have a pronounced influence on the measured RTL-values. This might result in spurious or lost associations in epidemiological studies under certain circumstances. PMID:27138987

  3. Influence of DNA extraction methods on relative telomere length measurements and its impact on epidemiological studies.

    PubMed

    Raschenberger, Julia; Lamina, Claudia; Haun, Margot; Kollerits, Barbara; Coassin, Stefan; Boes, Eva; Kedenko, Ludmilla; Köttgen, Anna; Kronenberg, Florian

    2016-05-03

Measurement of telomere length is widely used in epidemiologic studies. Insufficient standardization of the measurement processes has, however, complicated the comparison of results between studies. We aimed to investigate whether DNA extraction methods have an influence on measured values of relative telomere length (RTL) and whether this has consequences for epidemiological studies. We performed four experiments with RTL measurement in quadruplicate by qPCR using DNA extracted with different methods: 1) a standardized validation experiment including three extraction methods (magnetic-particle-method EZ1, salting-out-method INV, phenol-chloroform-isoamyl-alcohol PCI) each in the same 20 samples demonstrated pronounced differences in RTL with lowest values with EZ1 followed by INV and PCI-isolated DNA; 2) a comparison of 307 samples from an epidemiological study showing EZ1-measurements 40% lower than INV-measurements; 3) a matching-approach of two similar non-diseased control groups including 143 pairs of subjects revealed significantly shorter RTL in EZ1 than INV-extracted DNA (0.844 ± 0.157 vs. 1.357 ± 0.242); 4) an association analysis of RTL with prevalent cardiovascular disease detected a stronger association with INV than with EZ1-extracted DNA. In summary, DNA extraction methods have a pronounced influence on the measured RTL-values. This might result in spurious or lost associations in epidemiological studies under certain circumstances.

  4. Relation of retinal blood flow and retinal oxygen extraction during stimulation with diffuse luminance flicker

    PubMed Central

    Palkovits, Stefan; Lasta, Michael; Told, Reinhard; Schmidl, Doreen; Werkmeister, René; Cherecheanu, Alina Popa; Garhöfer, Gerhard; Schmetterer, Leopold

    2015-01-01

Cerebral and retinal blood flow are dependent on local neuronal activity. Several studies quantified the increase in cerebral blood flow and oxygen consumption during activity. In the present study we investigated the relation between changes in retinal blood flow and oxygen extraction during stimulation with diffuse luminance flicker and the influence of breathing gas mixtures with different fractions of O2 (FiO2; 100%, 15%, and 12%). Twenty-four healthy subjects were included. Retinal blood flow was studied by combining measurement of vessel diameters using the Dynamic Vessel Analyser with measurements of blood velocity using laser Doppler velocimetry. Oxygen saturation was measured using spectroscopic reflectometry and oxygen extraction was calculated. Flicker stimulation increased retinal blood flow (57.7 ± 17.8%) and oxygen extraction (34.6 ± 24.1%; p < 0.001 each). During 100% oxygen breathing the response of retinal blood flow and oxygen extraction was increased (p < 0.01 each). By contrast, breathing gas mixtures with 12% and 15% FiO2 did not alter flicker-induced retinal haemodynamic changes. The present study indicates that at a comparable increase in blood flow the increase in oxygen extraction in the retina is larger than in the brain. During systemic hyperoxia the blood flow and oxygen extraction responses to neural stimulation are augmented. The underlying mechanism is unknown. PMID:26672758

  5. The Application of Thermal Plasma to Extraction Metallurgy and Related Fields

    NASA Technical Reports Server (NTRS)

    Akashi, K.

    1980-01-01

    Various applications of thermal plasma to extraction metallurgy and related fields are surveyed, chiefly on the basis of documents published during the past two or three years. Applications to melting and smelting, to thermal decomposition, to reduction, to manufacturing of inorganic compounds, and to other fields are considered.

  6. Extracting microRNA-gene relations from biomedical literature using distant supervision

    PubMed Central

    Clarke, Luka A.; Couto, Francisco M.

    2017-01-01

    Many biomedical relation extraction approaches are based on supervised machine learning, requiring an annotated corpus. Distant supervision aims at training a classifier by combining a knowledge base with a corpus, reducing the amount of manual effort necessary. This is particularly useful for biomedicine because many databases and ontologies have been made available for many biological processes, while the availability of annotated corpora is still limited. We studied the extraction of microRNA-gene relations from text. MicroRNA regulation is an important biological process due to its close association with human diseases. The proposed method, IBRel, is based on distantly supervised multi-instance learning. We evaluated IBRel on three datasets, and the results were compared with a co-occurrence approach as well as a supervised machine learning algorithm. While supervised learning outperformed on two of those datasets, IBRel obtained an F-score 28.3 percentage points higher on the dataset for which there was no training set developed specifically. To demonstrate the applicability of IBRel, we used it to extract 27 miRNA-gene relations from recently published papers about cystic fibrosis. Our results demonstrate that our method can be successfully used to extract relations from literature about a biological process without an annotated corpus. The source code and data used in this study are available at https://github.com/AndreLamurias/IBRel. PMID:28263989
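The distant-supervision idea described above, labelling candidate relations from a knowledge base instead of manual annotation, can be sketched in a few lines. The tiny knowledge base, the entity regexes, and the function name below are illustrative stand-ins, not IBRel's actual implementation:

```python
# Hedged sketch of distant-supervision labelling: a sentence mentioning a
# miRNA and a gene yields a positive training instance when the pair already
# exists in the knowledge base, and a negative instance otherwise.
import re

KNOWN_PAIRS = {("miR-21", "PTEN"), ("miR-155", "SOCS1")}   # toy knowledge base

def label_sentence(sentence):
    mirnas = re.findall(r"miR-\d+", sentence)
    genes = re.findall(r"\b[A-Z][A-Z0-9]{2,}\b", sentence)  # crude gene-symbol pattern
    labels = []
    for m in mirnas:
        for g in genes:
            labels.append(((m, g), (m, g) in KNOWN_PAIRS))
    return labels

labels = label_sentence("miR-21 directly targets PTEN in tumour cells.")
```

Multi-instance learning, as used by IBRel, then softens the noisy assumption that every co-mentioning sentence expresses the relation by reasoning over bags of sentences per pair.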

  7. Automated systems to identify relevant documents in product risk management

    PubMed Central

    2012-01-01

Background Product risk management involves critical assessment of the risks and benefits of health products circulating in the market. One of the important sources of safety information is the primary literature, especially for newer products with which regulatory authorities have relatively little experience. Although the primary literature provides vast and diverse information, only a small proportion of it is useful for product risk assessment work. Hence, the aim of this study is to explore the possibility of using text mining to automate the identification of useful articles, which will reduce the time taken for literature search and hence improve work efficiency. In this study, term-frequency inverse document-frequency values were computed for predictors extracted from the titles and abstracts of articles related to three tumour necrosis factor-alpha blockers. A general automated system was developed using only general predictors and was tested for its generalizability using articles related to four other drug classes. Several specific automated systems were developed using both general and specific predictors and training sets of different sizes in order to determine the minimum number of articles required for developing such systems. Results The general automated system had an area under the curve value of 0.731 and was able to rank 34.6% and 46.2% of the total number of 'useful' articles among the first 10% and 20% of the articles presented to the evaluators when tested on the generalizability set. However, its use may be limited by the subjective definition of useful articles. For the specific automated system, it was found that only 20 articles were required to develop a specific automated system with a prediction performance (AUC 0.748) that was better than that of the general automated system. Conclusions Specific automated systems can be developed rapidly and avoid problems caused by the subjective definition of useful articles.
Thus the efficiency of
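The term-frequency inverse document-frequency weighting behind this kind of ranking can be sketched compactly. The toy documents and query terms below are illustrative, and the real system scored trained predictors rather than a raw query:

```python
# Hedged sketch of TF-IDF ranking: documents whose terms are frequent locally
# but rare across the collection score higher for those terms.
import math
from collections import Counter

def tfidf_rank(docs, query_terms):
    """Return document indices sorted by summed TF-IDF weight of the query terms."""
    tokenized = [d.lower().split() for d in docs]
    n = len(docs)
    df = Counter(t for doc in tokenized for t in set(doc))   # document frequency

    def score(doc):
        tf = Counter(doc)
        return sum(tf[t] / len(doc) * math.log(n / df[t])
                   for t in query_terms if df[t])

    return sorted(range(n), key=lambda i: score(tokenized[i]), reverse=True)

docs = ["adverse event reported for drug x",
        "trial design and enrolment notes",
        "serious adverse reaction to drug x observed"]
order = tfidf_rank(docs, ["adverse", "reaction"])
```

Ranking rather than hard classification matches the evaluation above, which measures how many useful articles appear in the first 10% and 20% of the ranked list.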

  8. A method for automatically extracting infectious disease-related primers and probes from the literature

    PubMed Central

    2010-01-01

    Background Primer and probe sequences are the main components of nucleic acid-based detection systems. Biologists use primers and probes for different tasks, some related to the diagnosis and prescription of infectious diseases. The biological literature is the main information source for empirically validated primer and probe sequences. Therefore, it is becoming increasingly important for researchers to navigate this important information. In this paper, we present a four-phase method for extracting and annotating primer/probe sequences from the literature. These phases are: (1) convert each document into a tree of paper sections, (2) detect the candidate sequences using a set of finite state machine-based recognizers, (3) refine problem sequences using a rule-based expert system, and (4) annotate the extracted sequences with their related organism/gene information. Results We tested our approach using a test set composed of 297 manuscripts. The extracted sequences and their organism/gene annotations were manually evaluated by a panel of molecular biologists. The results of the evaluation show that our approach is suitable for automatically extracting DNA sequences, achieving precision/recall rates of 97.98% and 95.77%, respectively. In addition, 76.66% of the detected sequences were correctly annotated with their organism name. The system also provided correct gene-related information for 46.18% of the sequences assigned a correct organism name. Conclusions We believe that the proposed method can facilitate routine tasks for biomedical researchers using molecular methods to diagnose and prescribe different infectious diseases. In addition, the proposed method can be expanded to detect and extract other biological sequences from the literature. The extracted information can also be used to readily update available primer/probe databases or to create new databases from scratch. PMID:20682041
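Phase (2), candidate sequence detection, can be approximated with a single regular expression acting as a finite-state recognizer for runs of IUPAC nucleotide codes. The minimum-length threshold below is an illustrative assumption, not the paper's exact rule:

```python
# Hedged sketch: detect candidate primer/probe sequences as bounded runs of
# IUPAC nucleotide codes of at least 10 characters.
import re

PRIMER_RE = re.compile(r"\b[ACGTUMRWSYKVHDBN]{10,}\b")   # IUPAC nucleotide codes

def find_candidate_sequences(text):
    return PRIMER_RE.findall(text.upper())

seqs = find_candidate_sequences(
    "The forward primer 5'-ATGGCCATTGTAATGGGCCGC-3' was used for PCR.")
```

The paper's later phases, rule-based refinement of problem sequences and organism/gene annotation, would then operate on these candidates.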

  9. Causality patterns and machine learning for the extraction of problem-action relations in discharge summaries.

    PubMed

    Seol, Jae-Wook; Yi, Wangjin; Choi, Jinwook; Lee, Kyung Soon

    2017-02-01

Clinical narrative text includes information related to a patient's medical history, such as the chronological progression of medical problems and clinical treatments. A chronological view of a patient's history makes clinical audits easier and improves quality of care. In this paper, we propose a clinical Problem-Action relation extraction method, based on clinical semantic units and event causality patterns, to present a chronological view of a patient's problems and a doctor's actions. Based on our observation that a clinical text describes a patient's medical problems and a doctor's treatments in chronological order, a clinical semantic unit is defined as a problem and/or an action relation. Since a clinical event is a basic unit of the problem and action relation, events are extracted from narrative texts with conditional random fields, using external knowledge resources and context features. A clinical semantic unit is extracted from each sentence based on time expressions and the context structures of events. Then, a clinical semantic unit is classified into a problem and/or action relation based on event causality patterns with support vector machines. Experimental results on Korean discharge summaries show an F1-measure of 78.8%, indicating that the proposed method effectively classifies clinical Problem-Action relations.
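A much-simplified sketch of the classification step, assuming a small list of causality cue phrases. The cues, regex, and function below are illustrative stand-ins for the paper's CRF/SVM pipeline, not its actual features:

```python
# Hedged sketch: flag a clinical semantic unit as a Problem-Action relation
# when a causality cue phrase links a problem mention to a treatment mention.
import re

CAUSALITY_CUES = r"(treated with|managed with|started on|given)"

def classify_unit(sentence):
    m = re.search(
        rf"(?P<problem>.+?)\s(?:was|were)?\s*{CAUSALITY_CUES}\s(?P<action>.+)",
        sentence, flags=re.IGNORECASE)
    if m:
        return ("PROBLEM-ACTION", m.group("problem").strip(),
                m.group("action").strip(" ."))
    return ("NO-RELATION", None, None)

result = classify_unit("Pneumonia was treated with intravenous antibiotics.")
```

The actual method learns such patterns statistically rather than hard-coding them, which is what lets it generalize beyond a fixed cue list.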

  10. Comparative evaluation of three automated systems for DNA extraction in conjunction with three commercially available real-time PCR assays for quantitation of plasma Cytomegalovirus DNAemia in allogeneic stem cell transplant recipients.

    PubMed

    Bravo, Dayana; Clari, María Ángeles; Costa, Elisa; Muñoz-Cobo, Beatriz; Solano, Carlos; José Remigia, María; Navarro, David

    2011-08-01

    Limited data are available on the performance of different automated extraction platforms and commercially available quantitative real-time PCR (QRT-PCR) methods for the quantitation of cytomegalovirus (CMV) DNA in plasma. We compared the performance characteristics of the Abbott mSample preparation system DNA kit on the m24 SP instrument (Abbott), the High Pure viral nucleic acid kit on the COBAS AmpliPrep system (Roche), and the EZ1 Virus 2.0 kit on the BioRobot EZ1 extraction platform (Qiagen) coupled with the Abbott CMV PCR kit, the LightCycler CMV Quant kit (Roche), and the Q-CMV complete kit (Nanogen), for both plasma specimens from allogeneic stem cell transplant (Allo-SCT) recipients (n = 42) and the OptiQuant CMV DNA panel (AcroMetrix). The EZ1 system displayed the highest extraction efficiency over a wide range of CMV plasma DNA loads, followed by the m24 and the AmpliPrep methods. The Nanogen PCR assay yielded higher mean CMV plasma DNA values than the Abbott and the Roche PCR assays, regardless of the platform used for DNA extraction. Overall, the effects of the extraction method and the QRT-PCR used on CMV plasma DNA load measurements were less pronounced for specimens with high CMV DNA content (>10,000 copies/ml). The performance characteristics of the extraction methods and QRT-PCR assays evaluated herein for clinical samples were extensible at cell-based standards from AcroMetrix. In conclusion, different automated systems are not equally efficient for CMV DNA extraction from plasma specimens, and the plasma CMV DNA loads measured by commercially available QRT-PCRs can differ significantly. The above findings should be taken into consideration for the establishment of cutoff values for the initiation or cessation of preemptive antiviral therapies and for the interpretation of data from clinical studies in the Allo-SCT setting.

  11. PPInterFinder--a mining tool for extracting causal relations on human proteins from literature.

    PubMed

    Raja, Kalpana; Subramani, Suresh; Natarajan, Jeyakumar

    2013-01-01

One of the most common and challenging problems in biomedical text mining is to mine protein-protein interactions (PPIs) from MEDLINE abstracts and full-text research articles, because PPIs play a major role in understanding various biological processes and the impact of proteins in diseases. We implemented PPInterFinder, a web-based text mining tool to extract human PPIs from biomedical literature. PPInterFinder uses relation keyword co-occurrences with protein names to extract information on PPIs from MEDLINE abstracts and consists of three phases. First, it identifies the relation keyword using a parser with Tregex and a relation keyword dictionary. Next, it automatically identifies the candidate PPI pairs with a set of rules related to PPI recognition. Finally, it extracts the relations by matching the sentence with a set of 11 specific patterns based on the syntactic nature of the PPI pair. We find that PPInterFinder is capable of predicting PPIs with an accuracy of 66.05% on the AIMED corpus and outperforms most of the existing systems. Database URL: http://www.biomining-bu.in/ppinterfinder/

  12. Antimutagenicity of Methanolic Extracts from Anemopsis californica in Relation to Their Antioxidant Activity.

    PubMed

    Del-Toro-Sánchez, Carmen Lizette; Bautista-Bautista, Nereyda; Blasco-Cabal, José Luis; Gonzalez-Ávila, Marisela; Gutiérrez-Lomelí, Melesio; Arriaga-Alba, Myriam

    2014-01-01

Anemopsis californica has been used empirically to treat infectious diseases. However, there are no antimutagenic evaluation reports on this plant. The present study evaluated the antioxidant activity in relation to the mutagenic and antimutagenic properties of leaf (LME) and stem (SME) methanolic extracts of A. californica collected in the central Mexican state of Querétaro. Antioxidant properties and total phenols of extracts were evaluated using DPPH (1,1-diphenyl-2-picrylhydrazyl) and Folin-Ciocalteu methods, respectively. Mutagenicity was evaluated using the Ames test employing Salmonella enterica serovar Typhimurium strains (TA98, TA100, and TA102), with and without an aroclor 1254 (S9 mixture). Antimutagenesis was performed against mutations induced on the Ames test with MNNG, 2AA, or 4NQO. SME presented the highest antioxidant capacity and total phenolic content. None of the extracts exhibited mutagenicity in the Ames test. The extracts produced a significant reduction in 2AA-induced mutations in S. typhimurium TA98. In both extracts, mutagenesis induced by 4NQO or methyl-N'-nitro-N-nitrosoguanidine (MNNG) was reduced only if the exposure of strains was <10 μg/Petri dish. A. californica's antioxidant properties and its capacity to reduce point mutations render it suitable to enhance medical cancer treatments. The significant antimutagenic effect against 2AA suggests that its consumption would provide protection against carcinogenic polycyclic aromatic compounds.

  13. Antimutagenicity of Methanolic Extracts from Anemopsis californica in Relation to Their Antioxidant Activity

    PubMed Central

    Del-Toro-Sánchez, Carmen Lizette; Bautista-Bautista, Nereyda; Blasco-Cabal, José Luis; Gonzalez-Ávila, Marisela; Gutiérrez-Lomelí, Melesio; Arriaga-Alba, Myriam

    2014-01-01

Anemopsis californica has been used empirically to treat infectious diseases. However, there are no antimutagenic evaluation reports on this plant. The present study evaluated the antioxidant activity in relation to the mutagenic and antimutagenic properties of leaf (LME) and stem (SME) methanolic extracts of A. californica collected in the central Mexican state of Querétaro. Antioxidant properties and total phenols of extracts were evaluated using DPPH (1,1-diphenyl-2-picrylhydrazyl) and Folin-Ciocalteu methods, respectively. Mutagenicity was evaluated using the Ames test employing Salmonella enterica serovar Typhimurium strains (TA98, TA100, and TA102), with and without an aroclor 1254 (S9 mixture). Antimutagenesis was performed against mutations induced on the Ames test with MNNG, 2AA, or 4NQO. SME presented the highest antioxidant capacity and total phenolic content. None of the extracts exhibited mutagenicity in the Ames test. The extracts produced a significant reduction in 2AA-induced mutations in S. typhimurium TA98. In both extracts, mutagenesis induced by 4NQO or methyl-N′-nitro-N-nitrosoguanidine (MNNG) was reduced only if the exposure of strains was <10 μg/Petri dish. A. californica's antioxidant properties and its capacity to reduce point mutations render it suitable to enhance medical cancer treatments. The significant antimutagenic effect against 2AA suggests that its consumption would provide protection against carcinogenic polycyclic aromatic compounds. PMID:25152760

  14. Extracting the frequencies of the pinna spectral notches in measured head related impulse responses

    NASA Astrophysics Data System (ADS)

    Raykar, Vikas C.; Duraiswami, Ramani; Yegnanarayana, B.

    2005-07-01

The head related impulse response (HRIR) characterizes the auditory cues created by scattering of sound off a person's anatomy. The experimentally measured HRIR depends on several factors such as reflections from body parts (torso, shoulder, and knees), head diffraction, and reflection/diffraction effects due to the pinna. Structural models (Algazi et al., 2002; Brown and Duda, 1998) seek to establish direct relationships between the features in the HRIR and the anatomy. While there is evidence that particular features in the HRIR can be explained by anthropometry, the creation of such models from experimental data is hampered by the fact that the extraction of the features in the HRIR is not automatic. Among the prominent features observed in the HRIR, and ones that have been shown to be important for elevation perception, are the deep spectral notches attributed to the pinna. In this paper we propose a method to robustly extract the frequencies of the pinna spectral notches from the measured HRIR, distinguishing them from other confounding features. The method also extracts the resonances described by Shaw (1997). The techniques are applied to the publicly available CIPIC HRIR database (Algazi et al., 2001c). The extracted notch frequencies are related to the physical dimensions and shape of the pinna.
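A minimal sketch of notch picking, treating notch frequencies as deep local minima of a magnitude response. The frequency bins, dB values, and depth threshold below are illustrative; the paper's method is considerably more robust and works on the measured HRIR itself:

```python
# Hedged sketch: report a frequency as a spectral notch when the magnitude
# there is a local minimum and sits at least `depth_db` below its neighbours.
def find_notches(freqs_hz, mag_db, depth_db=5.0):
    notches = []
    for i in range(1, len(mag_db) - 1):
        is_local_min = mag_db[i] < mag_db[i - 1] and mag_db[i] < mag_db[i + 1]
        deep_enough = max(mag_db[i - 1], mag_db[i + 1]) - mag_db[i] >= depth_db
        if is_local_min and deep_enough:
            notches.append(freqs_hz[i])
    return notches

freqs = [6000, 7000, 8000, 9000, 10000]     # Hz, toy frequency grid
mags = [-2.0, -3.0, -15.0, -4.0, -2.5]      # dB, with a deep dip near 8 kHz
notches = find_notches(freqs, mags)
```

The hard part the paper addresses is exactly what this sketch ignores: distinguishing true pinna notches from confounding dips caused by torso and shoulder reflections.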

  15. Automated Lattice Perturbation Theory

    SciTech Connect

    Monahan, Christopher

    2014-11-01

    I review recent developments in automated lattice perturbation theory. Starting with an overview of lattice perturbation theory, I focus on the three automation packages currently "on the market": HiPPy/HPsrc, Pastor and PhySyCAl. I highlight some recent applications of these methods, particularly in B physics. In the final section I briefly discuss the related, but distinct, approach of numerical stochastic perturbation theory.

  16. Orbital transfer vehicle launch operations study: Automated technology knowledge base, volume 4

    NASA Technical Reports Server (NTRS)

    1986-01-01

    A simplified retrieval strategy for compiling automation-related bibliographies from NASA/RECON is presented. Two subsets of NASA Thesaurus subject terms were extracted: a primary list, which is used to obtain an initial set of citations; and a secondary list, which is used to limit or further specify a large initial set of citations. These subject term lists are presented in Appendix A as the Automated Technology Knowledge Base (ATKB) Thesaurus.

  17. Bridging semantics and syntax with graph algorithms-state-of-the-art of extracting biomedical relations.

    PubMed

    Luo, Yuan; Uzuner, Özlem; Szolovits, Peter

    2017-01-01

    Research on extracting biomedical relations has received growing attention recently, with numerous biological and clinical applications including those in pharmacogenomics, clinical trial screening and adverse drug reaction detection. The ability to accurately capture both semantic and syntactic structures in text expressing these relations becomes increasingly critical to enable deep understanding of scientific papers and clinical narratives. Shared task challenges have been organized by both bioinformatics and clinical informatics communities to assess and advance the state-of-the-art research. Significant progress has been made in algorithm development and resource construction. In particular, graph-based approaches bridge semantics and syntax, often achieving the best performance in shared tasks. However, a number of problems at the frontiers of biomedical relation extraction continue to pose interesting challenges and present opportunities for great improvement and fruitful research. In this article, we place biomedical relation extraction against the backdrop of its versatile applications, present a gentle introduction to its general pipeline and shared resources, review the current state-of-the-art in methodology advancement, discuss limitations and point out several promising future directions.

  18. Integrating Multiple On-line Knowledge Bases for Disease-Lab Test Relation Extraction.

    PubMed

    Zhang, Yaoyun; Soysal, Ergin; Moon, Sungrim; Wang, Jingqi; Tao, Cui; Xu, Hua

    2015-01-01

    A computable knowledge base containing relations between diseases and lab tests would be a great resource for many biomedical informatics applications. This paper describes our initial step towards establishing a comprehensive knowledge base of disease and lab tests relations utilizing three public on-line resources. LabTestsOnline, MedlinePlus and Wikipedia are integrated to create a freely available, computable disease-lab test knowledgebase. Disease and lab test concepts are identified using MetaMap and relations between diseases and lab tests are determined based on source-specific rules. Experimental results demonstrate a high precision for relation extraction, with Wikipedia achieving the highest precision of 87%. Combining the three sources reached a recall of 51.40%, when compared with a subset of disease-lab test relations extracted from a reference book. Moreover, we found additional disease-lab test relations from on-line resources, indicating they are complementary to existing reference books for building a comprehensive disease and lab test relation knowledge base.

  19. A crowdsourcing workflow for extracting chemical-induced disease relations from free text.

    PubMed

    Li, Tong Shu; Bravo, Àlex; Furlong, Laura I; Good, Benjamin M; Su, Andrew I

    2016-01-01

Relations between chemicals and diseases are among the most queried biomedical interactions. Although expert manual curation is the standard method for extracting these relations from the literature, it is expensive and impractical to apply to large numbers of documents, and therefore alternative methods are required. We describe here a crowdsourcing workflow for extracting chemical-induced disease relations from free text as part of the BioCreative V Chemical Disease Relation challenge. Five non-expert workers on the CrowdFlower platform were shown each potential chemical-induced disease relation highlighted in the original source text and asked to make binary judgments about whether the text supported the relation. Worker responses were aggregated through voting, and relations receiving four or more votes were predicted as true. On the official evaluation dataset of 500 PubMed abstracts, the crowd attained an F-score of 0.505 (precision 0.475, recall 0.540), with a maximum theoretical recall of 0.751 due to errors with named entity recognition. The total crowdsourcing cost was $1290.67 ($2.58 per abstract) and took a total of 7 h. A qualitative error analysis revealed that 46.66% of sampled errors were due to task limitations and gold standard errors, indicating that performance can still be improved. All code and results are publicly available at https://github.com/SuLab/crowd_cid_relex. Database URL: https://github.com/SuLab/crowd_cid_relex
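The vote-aggregation rule described above (five workers per relation, predicted true at four or more yes votes) is simple to state in code; the variable names are illustrative:

```python
# Sketch of the aggregation step: a candidate relation is predicted true
# when at least `threshold` of the binary worker judgments are positive.
def aggregate(judgments, threshold=4):
    """judgments: list of five booleans, one per crowd worker."""
    return sum(judgments) >= threshold

votes_a = [True, True, True, True, False]    # 4 of 5 yes -> predicted true
votes_b = [True, True, False, False, True]   # 3 of 5 yes -> predicted false
pred_a, pred_b = aggregate(votes_a), aggregate(votes_b)
```

Requiring a supermajority rather than a bare majority trades recall for precision, which fits a curation setting where false positives are expensive to clean up.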

  20. Integrating Multiple On-line Knowledge Bases for Disease-Lab Test Relation Extraction

    PubMed Central

    Zhang, Yaoyun; Soysal, Ergin; Moon, Sungrim; Wang, Jingqi; Tao, Cui; Xu, Hua

    2015-01-01

    A computable knowledge base containing relations between diseases and lab tests would be a great resource for many biomedical informatics applications. This paper describes our initial step towards establishing a comprehensive knowledge base of disease and lab tests relations utilizing three public on-line resources. LabTestsOnline, MedlinePlus and Wikipedia are integrated to create a freely available, computable disease-lab test knowledgebase. Disease and lab test concepts are identified using MetaMap and relations between diseases and lab tests are determined based on source-specific rules. Experimental results demonstrate a high precision for relation extraction, with Wikipedia achieving the highest precision of 87%. Combining the three sources reached a recall of 51.40%, when compared with a subset of disease-lab test relations extracted from a reference book. Moreover, we found additional disease-lab test relations from on-line resources, indicating they are complementary to existing reference books for building a comprehensive disease and lab test relation knowledge base. PMID:26306271

  1. Complete automation of solid-phase extraction with subsequent liquid chromatography-tandem mass spectrometry for the quantification of benzoylecgonine, m-hydroxybenzoylecgonine, p-hydroxybenzoylecgonine, and norbenzoylecgonine in urine--application to a high-throughput urine analysis laboratory.

    PubMed

    Robandt, Paul P; Reda, Louis J; Klette, Kevin L

    2008-10-01

    A fully automated system utilizing a liquid handler and an online solid-phase extraction (SPE) device coupled with liquid chromatography-tandem mass spectrometry (LC-MS-MS) was designed to process, detect, and quantify benzoylecgonine (BZE), meta-hydroxybenzoylecgonine (m-OH BZE), para-hydroxybenzoylecgonine (p-OH BZE), and norbenzoylecgonine (nor-BZE) metabolites in human urine. The method was linear for BZE, m-OH BZE, and p-OH BZE from 1.2 to 10,000 ng/mL with limits of detection (LOD) and quantification (LOQ) of 1.2 ng/mL. Nor-BZE was linear from 5 to 10,000 ng/mL with an LOD and LOQ of 1.2 and 5 ng/mL, respectively. The intrarun precision measured as the coefficient of variation of 10 replicates of a 100 ng/mL control was less than 2.6%, and the interrun precision for 5 replicates of the same control across 8 batches was less than 4.8% for all analytes. No assay interference was noted from controls containing cocaine, cocaethylene, and ecgonine methyl ester. Excellent data concordance (R2 > 0.994) was found for direct comparison of the automated SPE-LC-MS-MS procedure and an existing gas chromatography-MS procedure using 94 human urine samples previously determined to be positive for BZE. The automated specimen handling and SPE procedure, when compared to the traditional extraction schema, eliminates the human factors of specimen handling, processing, extraction, and derivatization, thereby reducing labor costs and rework resulting from batch handling issues, and may reduce the number of fume hoods required in the laboratory.
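The linearity and concordance figures quoted above (linear ranges, R2 > 0.994) come from least-squares fits of response against concentration. A self-contained sketch with a toy calibration curve; the calibrator and response values are invented for illustration:

```python
# Hedged sketch: ordinary least-squares fit of instrument response vs.
# calibrator concentration, returning slope, intercept, and R^2.
def linear_fit(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    r2 = 1 - ss_res / ss_tot
    return slope, intercept, r2

conc = [1.2, 10, 100, 1000, 10000]           # ng/mL calibrators (toy values)
resp = [1.1, 10.4, 99.0, 1010.0, 9985.0]     # instrument response (toy values)
slope, intercept, r2 = linear_fit(conc, resp)
```

The same fit, applied to paired results from two methods instead of a calibration curve, gives the between-method concordance R2 reported in the abstract.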

  2. Achyrocline satureioides (Lam.) D.C. Hydroalcoholic Extract Inhibits Neutrophil Functions Related to Innate Host Defense

    PubMed Central

    Barioni, Eric Diego; Machado, Isabel Daufenback; Rodrigues, Stephen Fernandes de Paula; Ferraz-de-Paula, Viviane; Wagner, Theodoro Marcel; Cogliati, Bruno; Corrêa dos Santos, Matheus; Machado, Marina da Silva; de Andrade, Sérgio Faloni; Niero, Rivaldo; Farsky, Sandra Helena Poliselli

    2013-01-01

    Achyrocline satureioides (Lam.) D.C. is a herb native to South America, and its inflorescences are popularly employed to treat inflammatory diseases. Here, the effects of the in vivo actions of the hydroalcoholic extract obtained from inflorescences of A. satureioides on neutrophil trafficking into inflamed tissue were investigated. Male Wistar rats were orally treated with A. satureioides extract, and inflammation was induced one hour later by lipopolysaccharide injection into the subcutaneous tissue. The number of leukocytes and the amount of chemotactic mediators were quantified in the inflammatory exudate, and adhesion molecule and toll-like receptor 4 (TLR-4) expressions and phorbol-myristate-acetate- (PMA-) stimulated oxidative burst were quantified in circulating neutrophils. Leukocyte-endothelial interactions were quantified in the mesentery tissue. Enzymes and tissue morphology of the liver and kidney were evaluated. Treatment with A. satureioides extract reduced neutrophil influx and secretion of leukotriene B4 and CINC-1 in the exudates, the number of rolling and adhered leukocytes in the mesentery postcapillary venules, neutrophil L-selectin, β2-integrin and TLR-4 expression, and oxidative burst, but did not cause an alteration in the morphology and activities of liver and kidney. Together, the data show that A. satureioides extract inhibits neutrophil functions related to the innate response and does not cause systemic toxicity. PMID:23476704

  3. In vitro activity of plant extracts against biofilm-producing food-related bacteria.

    PubMed

    Nostro, Antonia; Guerrini, Alessandra; Marino, Andreana; Tacchini, Massimo; Di Giulio, Mara; Grandini, Alessandro; Akin, Methap; Cellini, Luigina; Bisignano, Giuseppe; Saraçoğlu, Hatice T

    2016-12-05

    The identification of effective antimicrobial agents that are also active on biofilms is a topic of crucial importance in food and industrial environments. For that purpose, methanol extracts of Turkish plants, Ficus carica L., Juglans regia L., Olea europaea L., Punica granatum L. and Rhus coriaria L., were investigated. Among the extracts, P. granatum L. and R. coriaria L. showed the best antibacterial activity, with minimum inhibitory concentrations (MIC) of 78-625 μg/ml for Listeria monocytogenes and Staphylococcus aureus and 312-1250 μg/ml for Escherichia coli and Pseudomonas aeruginosa. Sub-MICs produced significant biofilm inhibition of 60-80% for L. monocytogenes and 80-90% for S. aureus. These extracts also showed the highest polyphenol content and the strongest antioxidant activity. Bioassay-guided and HPLC procedures demonstrated the presence of apigenin 4'-O-β-glucoside in P. granatum L. and myricetrin and quercitrin in R. coriaria L. Antigenotoxicity of the plant extracts was also observed. The present findings promote the value-adding of P. granatum L. and R. coriaria L. leaves as natural antimicrobial/antioxidant agents for the control of food-related bacterial biofilms.

  4. Development of a fully automated sequential injection solid-phase extraction procedure coupled to liquid chromatography to determine free 2-hydroxy-4-methoxybenzophenone and 2-hydroxy-4-methoxybenzophenone-5-sulphonic acid in human urine.

    PubMed

    León, Zacarías; Chisvert, Alberto; Balaguer, Angel; Salvador, Amparo

    2010-04-07

    2-Hydroxy-4-methoxybenzophenone and 2-hydroxy-4-methoxybenzophenone-5-sulphonic acid, commonly known as benzophenone-3 (BZ3) and benzophenone-4 (BZ4), respectively, are substances widely used as UV filters in cosmetic products in order to absorb UV radiation and protect human skin from direct exposure to the deleterious wavelengths of sunlight. As with other UV filters, there is evidence of their percutaneous absorption. This work describes an analytical method developed to determine trace levels of free BZ3 and BZ4 in human urine. The methodology is based on a solid-phase extraction (SPE) procedure for clean-up and pre-concentration, followed by monitoring of the UV filters by liquid chromatography with ultraviolet spectrophotometric detection (LC-UV). In order to improve not only the sensitivity and selectivity, but also the precision of the method, the principle of sequential injection analysis was used to automate the SPE process and to transfer the eluates from the SPE to the LC system. The application of a six-channel valve as an interface for the switching arrangements successfully allowed the on-line connection of SPE sample processing with LC analysis. The SPE process for BZ3 and BZ4 was performed using octadecyl (C18) and diethylaminopropyl (DEA) modified silica microcolumns, respectively, in which the analytes were retained and eluted selectively. Due to the matrix effects, the determination was based on standard addition quantification and was fully validated. The relative standard deviations of the results were 13% and 6% for BZ3 and BZ4, respectively, whereas the limits of detection were 60 and 30 ng mL(-1), respectively. The method was satisfactorily applied to determine BZ3 and BZ4 in urine from volunteers who had applied a sunscreen cosmetic containing both UV filters.

  5. Extensible automated dispersive liquid-liquid microextraction.

    PubMed

    Li, Songqing; Hu, Lu; Chen, Ketao; Gao, Haixiang

    2015-05-04

    In this study, a convenient and extensible automated ionic liquid-based in situ dispersive liquid-liquid microextraction (automated IL-based in situ DLLME) method was developed. 1-Octyl-3-methylimidazolium bis[(trifluoromethane)sulfonyl]imide ([C8MIM]NTf2) is formed through the reaction between [C8MIM]Cl and lithium bis[(trifluoromethane)sulfonyl]imide (LiNTf2) to extract the analytes. Using a fully automatic SPE workstation, special SPE columns packed with nonwoven polypropylene (NWPP) fiber, and a modified operation program, the procedures of the IL-based in situ DLLME, including the collection of a water sample, injection of an ion exchange solvent, phase separation of the emulsified solution, elution of the retained extraction phase, and collection of the eluent into vials, can be performed automatically. The developed approach, coupled with high-performance liquid chromatography-diode array detection (HPLC-DAD), was successfully applied to the detection and concentration determination of benzoylurea (BU) insecticides in water samples. Parameters affecting the extraction performance were investigated and optimized. Under the optimized conditions, the proposed method achieved extraction recoveries of 80% to 89% for water samples. The limits of detection (LODs) of the method were in the range of 0.16-0.45 ng mL(-1). The intra-column and inter-column relative standard deviations (RSDs) were <8.6%. Good linearity (r>0.9986) was obtained over the calibration range from 2 to 500 ng mL(-1). The proposed method opens a new avenue for automated DLLME that not only greatly expands the range of viable extractants, especially functional ILs, but also enhances its applicability for various detection methods. Furthermore, multiple samples can be processed simultaneously, which accelerates sample preparation and allows the examination of a large number of samples.

  6. Quantification of asenapine and three metabolites in human plasma using liquid chromatography-tandem mass spectrometry with automated solid-phase extraction: application to a phase I clinical trial with asenapine in healthy male subjects.

    PubMed

    de Boer, Theo; Meulman, Erik; Meijering, Henri; Wieling, Jaap; Dogterom, Peter; Lass, Holger

    2012-02-01

    The development and validation of methods for determining concentrations of the antipsychotic drug asenapine (ASE) and three of its metabolites [N-desmethylasenapine (DMA), asenapine-N(+) -glucuronide (ASG) and 11-O-sulfate-asenapine (OSA)] in human plasma using LC-MS/MS with automated solid-phase extraction is described. The three assessment methods in human plasma were found to be acceptable for quantification in the ranges 0.0250-20.0 ng/mL (ASE), 0.0500-20.0 ng/mL (DMA and OSA) and 0.250-50.0 ng/mL (ASG).

  7. Automation in Photogrammetry,

    DTIC Science & Technology

    1980-07-25

    Allam, 1978), and the OM-Bendix AS-lIB-X (Scarano and Bruma, 1976). The UNAMACE and GPM-2 employ analog (electronic) correlation technology. However...Survey (USGS) and the Surveys and Mapping Branch (Canada) have formed integrated systems based on the Gestalt GPM-2 (Brunson and Olson, 1978; Allam, 1978)...ten years off, and the full automation of planimetric extraction may be more than 20 years in the future. REFERENCES Allam, M. M., 1978. The Role of

  8. Automated Cooperative Trajectories

    NASA Technical Reports Server (NTRS)

    Hanson, Curt; Pahle, Joseph; Brown, Nelson

    2015-01-01

    This presentation is an overview of the Automated Cooperative Trajectories project. An introduction to the phenomena of wake vortices is given, along with a summary of past research into the possibility of extracting energy from the wake by flying close parallel trajectories. Challenges and barriers to adoption of civilian automatic wake surfing technology are identified. A hardware-in-the-loop simulation is described that will support future research. Finally, a roadmap for future research and technology transition is proposed.

  9. Direct DNA isolation from solid biological sources without pretreatments with proteinase-K and/or homogenization through automated DNA extraction.

    PubMed

    Ki, Jang-Seu; Chang, Ki Byum; Roh, Hee June; Lee, Bong Youb; Yoon, Joon Yong; Jang, Gi Young

    2007-03-01

    Genomic DNA from solid biomaterials was directly isolated with an automated DNA extractor, which was based on magnetic bead technology with a bore-mediated grinding (BMG) system. The movement of the bore broke down the solid biomaterials, mixed crude lysates thoroughly with reagents to isolate the DNA, and carried the beads to the next step. The BMG system was suitable for the mechanical homogenization of the solid biomaterials and valid as an automated system for purifying the DNA from the solid biomaterials without the need for pretreatment or disruption procedures prior to the application of the solid biomaterials.

  10. Automation pilot

    NASA Technical Reports Server (NTRS)

    1983-01-01

    An important concept of the Action Information Management System (AIMS) approach is to evaluate office automation technology in the context of hands-on use by technical program managers in the conduct of their work, and to assess the human acceptance difficulties which may accompany the transition to a significantly changed work environment. The improved productivity and communications which result from the application of office automation technology are already well established for general office environments, but benefits unique to NASA are anticipated, and these will be explored in detail.

  11. Method Specific Calibration Corrects for DNA Extraction Method Effects on Relative Telomere Length Measurements by Quantitative PCR.

    PubMed

    Seeker, Luise A; Holland, Rebecca; Underwood, Sarah; Fairlie, Jennifer; Psifidi, Androniki; Ilska, Joanna J; Bagnall, Ainsley; Whitelaw, Bruce; Coffey, Mike; Banos, Georgios; Nussey, Daniel H

    2016-01-01

    Telomere length (TL) is increasingly being used as a biomarker in epidemiological, biomedical and ecological studies. A wide range of DNA extraction techniques have been used in telomere experiments and recent quantitative PCR (qPCR) based studies suggest that the choice of DNA extraction method may influence average relative TL (RTL) measurements. Such extraction method effects may limit the use of historically collected DNA samples extracted with different methods. However, if extraction method effects are systematic an extraction method specific (MS) calibrator might be able to correct for them, because systematic effects would influence the calibrator sample in the same way as all other samples. In the present study we tested whether leukocyte RTL in blood samples from Holstein Friesian cattle and Soay sheep measured by qPCR was influenced by DNA extraction method and whether MS calibration could account for any observed differences. We compared two silica membrane-based DNA extraction kits and a salting out method. All extraction methods were optimized to yield enough high quality DNA for TL measurement. In both species we found that silica membrane-based DNA extraction methods produced shorter RTL measurements than the non-membrane-based method when calibrated against an identical calibrator. However, these differences were not statistically detectable when a MS calibrator was used to calculate RTL. This approach produced RTL measurements that were highly correlated across extraction methods (r > 0.76) and had coefficients of variation lower than 10% across plates of identical samples extracted by different methods. Our results are consistent with previous findings that popular membrane-based DNA extraction methods may lead to shorter RTL measurements than non-membrane-based methods. However, we also demonstrate that these differences can be accounted for by using an extraction method-specific calibrator, offering researchers a simple means of accounting for
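
    The calibration logic can be made concrete with a minimal 2^-ΔΔCt sketch (a common relative-quantification model; the study's actual qPCR analysis may differ, and all Ct values below are invented). The point is that a method-specific calibrator cancels a systematic Ct shift introduced by the extraction method, because the shift affects the calibrator and the sample equally.

```python
# Simplified 2^-ddCt relative telomere length (RTL) with a method-specific
# (MS) calibrator. Ct values are invented for illustration only.
def rtl(ct_telo, ct_scg, cal_ct_telo, cal_ct_scg):
    d_sample = ct_telo - ct_scg       # delta-Ct of the sample (telomere vs. single-copy gene)
    d_cal = cal_ct_telo - cal_ct_scg  # delta-Ct of the calibrator
    return 2 ** -(d_sample - d_cal)   # RTL relative to the calibrator

# Same biological sample measured after two extraction methods; each method
# shifts Ct systematically, but its own calibrator shifts the same way.
print(rtl(14.0, 20.0, 15.0, 21.0))  # membrane kit with MS calibrator -> 1.0
print(rtl(13.5, 19.5, 14.5, 20.5))  # salting out with MS calibrator  -> 1.0
```

Because the systematic shift cancels, both methods recover the same RTL for the same sample, mirroring the paper's finding that MS calibration removes the extraction-method difference.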

  13. Comparative evaluation of Amplicor HIV-1 DNA test, version 1.5, by manual and automated DNA extraction methods using venous blood and dried blood spots for HIV-1 DNA PCR testing.

    PubMed

    Nsojo, Anthony; Aboud, Said; Lyamuya, Eligius

    2010-10-01

    Human immunodeficiency virus (HIV) DNA polymerase chain reaction (PCR) testing using venous blood samples has been used for many years in low-resource settings for early infant diagnosis of HIV infection in children less than 18 months of age. The aim of this study was to evaluate and compare the performance characteristics of the Amplicor HIV-1 DNA assay version 1.5 following processing of venous blood and dried blood spot (DBS) samples by Roche manual DNA extraction and the automated Roche MagNA Pure LC instrument (MP) for HIV-1 DNA PCR testing in Dar es Salaam, Tanzania, in order to scale up early infant diagnosis of HIV infection in routine practice. Venous blood samples were collected between January and April 2008 from children under 18 months born to HIV-infected mothers. Venous blood was used to prepare cell pellet and DBS samples. DNA extractions by the manual procedure and by MP were each performed on cell pellet, venous blood and DBS samples and tested by the Amplicor HIV-1 DNA assay. Of 325 samples included, 60 (18.5%) were confirmed HIV-infected by manual extraction performed on cell pellets. Sensitivity of the assay was 95% (95% CI, 86.1-99.0%) following MP processing of venous blood, and 98.3% (95% CI, 91.1-99.9%) for both manual extraction and MP processing performed on DBS samples. Specificity of the assay with all DNA extraction methods was 99.6% (95% CI, 97.9-100%). Performance of the assay with Roche manual extraction and MP processing on DBS samples compared well with Roche manual extraction performed on cell pellet samples. The choice of DNA extraction method needs to be individualized based on the level of laboratory facility, volume of testing and cost-benefit analysis before it is adopted for use.
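
    The headline figures are simple arithmetic on the reported counts. The sketch below back-calculates plausible 2x2 cells; note the exact false-negative and false-positive counts are inferred from the percentages, not stated in the abstract, so they are assumptions for illustration.

```python
# Counts inferred from the abstract: 60/325 samples confirmed infected,
# so 265 uninfected. The individual cell counts below are back-calculated
# from the reported 95% sensitivity and 99.6% specificity (hypothetical).
true_pos, infected = 57, 60      # 57/60 detected  -> 95% sensitivity
true_neg, uninfected = 264, 265  # one false positive -> 99.6% specificity

sensitivity = true_pos / infected
specificity = true_neg / uninfected
print(f"sensitivity = {sensitivity:.1%}, specificity = {specificity:.1%}")
```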

  14. A Knowledge-Driven Approach to Extract Disease-Related Biomarkers from the Literature

    PubMed Central

    Bravo, À.; Cases, M.; Queralt-Rosinach, N.; Sanz, F.; Furlong, L. I.

    2014-01-01

    The biomedical literature represents a rich source of biomarker information. However, both the size of literature databases and their lack of standardization hamper the automatic exploitation of the information contained in these resources. Text mining approaches have proven to be useful for the exploitation of information contained in scientific publications. Here, we show that a knowledge-driven text mining approach can exploit a large literature database to extract a dataset of biomarkers related to diseases covering all therapeutic areas. Our methodology takes advantage of the annotation of MEDLINE publications pertaining to biomarkers with MeSH terms, narrowing the search to specific publications and, therefore, minimizing the false-positive rate. It is based on a dictionary-based named entity recognition system and a relation extraction module. The application of this methodology resulted in the identification of 131,012 disease-biomarker associations between 2,803 genes and 2,751 diseases, and represents a valuable knowledge base for those interested in disease-related biomarkers. Additionally, we present a bibliometric analysis of the journals reporting biomarker-related information during the last 40 years. PMID:24839601
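
    A toy sketch of the pipeline the abstract outlines (dictionary-based named entity recognition followed by a sentence-level relation rule) might look like the following. The dictionaries, sentence, and simple co-occurrence rule are illustrative stand-ins; the actual system works over MeSH-filtered MEDLINE with far larger vocabularies and a more sophisticated relation module.

```python
import re

# Tiny illustrative dictionaries (the real system uses large curated ones).
gene_dict = {"BRCA1", "TP53", "EGFR"}
disease_dict = {"breast cancer", "lung cancer"}

def extract_pairs(sentence):
    """Dictionary NER, then a naive relation rule: any gene-disease
    co-mention within one sentence yields a candidate association."""
    genes = {g for g in gene_dict if re.search(rf"\b{g}\b", sentence)}
    diseases = {d for d in disease_dict if d in sentence.lower()}
    return {(g, d) for g in genes for d in diseases}

print(extract_pairs("BRCA1 mutations are an established biomarker in breast cancer."))
```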

  15. Drought Resilience of Water Supplies for Shale Gas Extraction and Related Power Generation in Texas

    NASA Astrophysics Data System (ADS)

    Reedy, R. C.; Scanlon, B. R.; Nicot, J. P.; Uhlman, K.

    2014-12-01

    There is considerable concern about water availability to support energy production in Texas, particularly considering that many of the shale plays are in semiarid areas of Texas and the state experienced the most extreme drought on record in 2011. The Eagle Ford shale play provides an excellent case study. Hydraulic fracturing water use for shale gas extraction in the play totaled ~12 billion gallons (bgal) in 2012, representing ~7 - 10% of total water use in the 16 county play area. The dominant source of water is groundwater which is not highly vulnerable to drought from a recharge perspective because water is primarily stored in the confined portion of aquifers that were recharged thousands of years ago. Water supply drought vulnerability results primarily from increased water use for irrigation. Irrigation water use in the Eagle Ford play was 30 billion gallons higher in the 2011 drought year relative to 2010. Recent trends toward increased use of brackish groundwater for shale gas extraction in the Eagle Ford also reduce pressure on fresh water resources. Evaluating the impacts of natural gas development on water resources should consider the use of natural gas in power generation, which now represents 50% of power generation in Texas. Water consumed in extracting the natural gas required for power generation is equivalent to ~7% of the water consumed in cooling these power plants in the state. However, natural gas production from shale plays can be overall beneficial in terms of water resources in the state because natural gas combined cycle power generation decreases water consumption by ~60% relative to traditional coal, nuclear, and natural gas plants that use steam turbine generation. This reduced water consumption enhances drought resilience of power generation in the state. In addition, natural gas combined cycle plants provide peaking capacity that complements increasing renewable wind generation which has no cooling water requirement. However, water

  16. Semi-automated fault system extraction and displacement analysis of an excavated oyster reef using high-resolution laser scanned data

    NASA Astrophysics Data System (ADS)

    Molnár, Gábor; Székely, Balázs; Harzhauser, Mathias; Djuricic, Ana; Mandic, Oleg; Dorninger, Peter; Nothegger, Clemens; Exner, Ulrike; Pfeifer, Norbert

    2015-04-01

    In this contribution we present a semi-automated method for reconstructing the brittle deformation field of an excavated Miocene oyster reef in Stetten, Korneuburg Basin, Lower Austria. Oyster shells up to 80 cm in size were scattered in a shallow estuarine bay, forming a continuous and almost isochronous layer as a consequence of a catastrophic event in the Miocene. This shell bed was preserved by burial under several hundred meters of sandy to silty sediments. Later the layers were tilted westward and uplifted, and erosion almost exhumed them. An excavation revealed a 27 by 17 meter area of the oyster-covered layer. During the tectonic processes the sediment volume suffered brittle deformation. Faults, mostly NW-SE striking and with normal components of a few centimeters, affected the oyster-covered volume, dissecting many shells and the surrounding matrix as well. Faults and the displacements due to them can typically be traced along the site for several meters, and as the fossil oysters are broken and parts are displaced by the faulting, along some faults it is possible to follow these displacements in 3D. In order to quantify these varying displacements and to map the undulating fault traces, high-resolution scanning of the excavated and cleaned surface of the oyster bed has been carried out using a terrestrial laser scanner. The resulting point clouds have been co-georeferenced at mm accuracy and a 1 mm resolution 3D point cloud of the surface has been created. As the faults are well represented in the point cloud, this enables us to measure the dislocations of the dissected shell parts along the fault lines. We used a semi-automatic method to quantify these dislocations. First we manually digitized the fault lines in 2D as an initial model. In the next step we estimated the vertical (i.e., perpendicular to the layer) component of the dislocation along these fault lines by comparing the elevations on the two sides of the faults with moving averaging windows. To estimate the strike
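
    The windowed elevation comparison can be sketched as follows, with synthetic 1-D elevation profiles standing in for the real 1 mm point cloud; the window size and elevation values are illustrative only.

```python
# Estimate vertical fault offset by comparing mean elevations in moving
# windows on either side of a digitized fault trace (simplified 1-D sketch).
def moving_offsets(z_hanging, z_foot, window=3):
    """Windowed mean elevation difference (footwall minus hanging wall)
    along the fault trace."""
    offsets = []
    for i in range(len(z_hanging) - window + 1):
        up = sum(z_foot[i:i + window]) / window
        down = sum(z_hanging[i:i + window]) / window
        offsets.append(up - down)
    return offsets

# Synthetic profiles for a fault with ~2 cm normal throw (elevations in m).
foot = [0.120, 0.121, 0.119, 0.120, 0.122]
hang = [0.100, 0.101, 0.099, 0.100, 0.102]
print(moving_offsets(hang, foot))  # roughly 0.02 m offset in every window
```

Averaging over a window rather than comparing single points suppresses the mm-scale roughness of the shell surface, which is the reason the paper uses moving windows in the first place.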

  17. Age differences in the takeover of vehicle control and engagement in non-driving-related activities in simulated driving with conditional automation.

    PubMed

    Clark, Hallie; Feng, Jing

    2016-09-26

    High-level vehicle automation has been proposed as a valuable means to enhance the mobility of older drivers, as older drivers experience age-related declines in many cognitive functions that are vital for safe driving. Recent research attempted to examine age differences in how engagement in non-driving-related activities impacts driving performance by instructing drivers to engage in mandatory pre-designed activities. While the mandatory engagement method allows precise control of the timing and mental workload of the non-driving-related activities, it differs from how a driver would naturally engage in these activities. This study allowed younger (age 18-35, mean age = 19.9 years) and older drivers (age 62-81, mean age = 70.4 years) to freely decide when and how to engage in voluntarily chosen non-driving-related activities during simulated driving with conditional automation. We coded video recordings of participants' engagement in non-driving-related activities. We examined the effect of age, level of activity engagement and takeover notification interval on vehicle control performance during the takeover, by comparing the high and low engagement groups in younger and older drivers across two takeover notification interval conditions. We found that both younger and older drivers engaged in various non-driving-related activities during the automated driving portion, with distinct preferences on the type of activity for each age group (i.e., while younger drivers mostly used an electronic device, older drivers tended to converse). There were also significant differences between the two age groups and between the two notification intervals on various driving performance measures. Older drivers benefited more than younger drivers from the longer interval in terms of response time to notifications. Voluntary engagement in non-driving-related activities did not impair takeover performance in general, although there was a trend of older drivers who were

  18. Nrf2-mediated mucoprotective and anti-inflammatory actions of Artemisia extracts led to attenuate stress related mucosal damages

    PubMed Central

    Park, Jong-Min; Han, Young-Min; Lee, Jin-Seok; Ko, Kwang Hyun; Hong, Sung-Pyo; Kim, Eun-Hee; Hahm, Ki-Baik

    2015-01-01

    The aim of this study was to compare the biological actions of isopropanol and ethanol extracts of Artemisia, including their antioxidant, anti-inflammatory, and cytoprotective actions. Antioxidant activities were evaluated using the 2,2-diphenyl-1-picrylhydrazyl (DPPH) method and confocal microscopy on lipopolysaccharide-induced RGM1 cells; cytoprotective effects were evaluated by detecting heme oxygenase-1 (HO-1), NF-E2-related factor 2 (Nrf2) and heat shock protein 70 (HSP70); and anti-inflammatory effects were investigated by measuring inflammatory mediators. Water immersion restraint stress was imposed to provoke stress-related mucosal damage (SRMD) in rats. Isopropanol extracts of Artemisia showed higher DPPH radical scavenging activity, lower LPS-induced reactive oxygen species production, and increased HO-1 expression through increased nuclear translocation of the Nrf2 transcription factor compared to ethanol extracts. Increased expression of HSP70 and decreased expression of endothelin-1 were seen only with the isopropanol extracts. A concentration-dependent inhibition of LPS-induced COX-2 and iNOS, even at lower concentrations than the ethanol extract, was achieved with the isopropanol extracts. A cytokine protein array revealed that Artemisia extracts significantly attenuated the levels of CXCL-1, CXCL-16, and MCP-1. These orchestrated actions led to significant rescue from SRMD. In conclusion, Artemisia extracts exerted significant antioxidant and anti-inflammatory activity against SRMD, and the isopropanol extracts were superior to the ethanol extracts in these beneficial actions of Artemisia. PMID:25759519

  19. Automation in organizations: Eternal conflict

    NASA Technical Reports Server (NTRS)

    Dieterly, D. L.

    1981-01-01

    Some ideas on and insights into the problems associated with automation in organizations are presented with emphasis on the concept of automation, its relationship to the individual, and its impact on system performance. An analogy is drawn, based on an American folk hero, to emphasize the extent of the problems encountered when dealing with automation within an organization. A model is proposed to focus attention on a set of appropriate dimensions. The function allocation process becomes a prominent aspect of the model. The current state of automation research is mentioned in relation to the ideas introduced. Proposed directions for an improved understanding of automation's effect on the individual's efficiency are discussed. The importance of understanding the individual's perception of the system in terms of the degree of automation is highlighted.

  20. Effect of Selenium-Enriched Agaricus bisporus (Higher Basidiomycetes) Extracts, Obtained by Pressurized Water Extraction, on the Expression of Cholesterol Homeostasis Related Genes by Low-Density Array.

    PubMed

    Gil-Ramírez, Alicia; Soler-Rivas, Cristina; Rodriguez-Casado, Arantxa; Ruiz-Rodríguez, Alejandro; Reglero, Guillermo; Marín, Francisco Ramón

    2015-01-01

    Culinary-medicinal mushrooms are able to lower blood cholesterol levels in animal models by different mechanisms. They might impair endogenous cholesterol synthesis and exogenous cholesterol absorption during digestion. Mushroom extracts, obtained using pressurized water extraction (PWE) from Agaricus bisporus basidiomes, supplemented or not with selenium, were applied to HepG2 cell cultures to study the expression of 19 genes related to cholesterol homeostasis by low-density arrays (LDA). Only the PWE fractions obtained at 25°C showed 3-hydroxy-3-methylglutaryl-CoA reductase (HMGCR) inhibitory activity. Beyond this enzymatic inhibition, PWE extracts may downregulate some of the key genes involved in cholesterol homeostasis, such as the squalene synthase gene (FDFT1), since its mRNA expression falls by one third of its initial value. In summary, A. bisporus extracts may also modulate biological cholesterol levels through molecular mechanisms beyond the enzymatic route previously reported.

  1. Use of relational database management system by clinicians to create automated MICU progress note from existent data sources.

    PubMed Central

    Delaney, D. P.; Zibrak, J. D.; Samore, M.; Peterson, M.

    1997-01-01

    We designed and built an application called MD Assist that compiles data from several hospital databases to create reports used for daily house officer rounding in the medical intensive care unit (MICU). After rounding, the report becomes the objective portion of the daily "SOAP" MICU progress note. All data used in the automated note was available in digital format residing in an institution-wide Sybase data repository which had been built to fulfill data needs of the parent enterprise. From initial design of target output through actual creation and implementation in the MICU, MD Assist was created by physicians with only consultative help from information systems (IS). This project demonstrated a method for rapidly developing time-saving, clinically useful applications using a comprehensive clinical data repository. PMID:9357578
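
    As a hypothetical miniature of this idea, the sketch below pulls the latest lab values for one patient from an in-memory database and formats them as the objective line of a note. All table and column names are invented; the real system queried an institution-wide Sybase repository rather than SQLite.

```python
import sqlite3

# Stand-in clinical repository with a few lab results (invented schema).
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE labs(pt TEXT, test TEXT, value REAL, taken TEXT);
INSERT INTO labs VALUES
  ('MICU-7', 'WBC', 11.2, '2024-01-02'),
  ('MICU-7', 'WBC', 9.8,  '2024-01-01'),
  ('MICU-7', 'Cr',  1.4,  '2024-01-02');
""")

# Latest value of each test for the patient (correlated subquery).
rows = db.execute("""
  SELECT test, value FROM labs l
  WHERE pt = 'MICU-7'
    AND taken = (SELECT MAX(taken) FROM labs WHERE pt = l.pt AND test = l.test)
  ORDER BY test
""").fetchall()

note = "Objective: " + ", ".join(f"{t} {v}" for t, v in rows)
print(note)  # Objective: Cr 1.4, WBC 11.2
```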

  2. PPI-IRO: a two-stage method for protein-protein interaction extraction based on interaction relation ontology.

    PubMed

    Li, Chuan-Xi; Chen, Peng; Wang, Ru-Jing; Wang, Xiu-Jie; Su, Ya-Ru; Li, Jinyan

    2014-01-01

    Mining protein-protein interactions (PPIs) from the fast-growing biomedical literature has proven to be an effective approach for the identification of biological regulatory networks. This paper presents a novel method based on an Interaction Relation Ontology (IRO), which specifies and organises the words expressing various protein interaction relationships. Our method is a two-stage PPI extraction method. First, the IRO is applied in a binary classifier to determine whether a sentence contains a relation or not. Then, the IRO is used to guide PPI extraction by building the sentence dependency parse tree. Comprehensive and quantitative evaluations and detailed analyses demonstrate the significant performance of the IRO on relation sentence classification and PPI extraction. Our PPI extraction method yielded recalls of around 80% and 90% and F1 scores of around 54% and 66% on the AIMed and BioInfer corpora, respectively, which are superior to most existing extraction methods.
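
    A toy version of the two-stage design might look like the following. The tiny interaction vocabulary and protein list are illustrative only, and the real method uses a full ontology plus dependency parsing rather than simple token matching.

```python
# Stand-in "ontology": a few interaction-relation words (illustrative).
interaction_words = {"binds", "phosphorylates", "activates", "inhibits"}
proteins = ["MDM2", "p53", "AKT1"]  # toy protein mention list

def stage1(sentence):
    """Stage 1 binary filter: does the sentence contain any relation word?"""
    return any(w in sentence.lower().split() for w in interaction_words)

def stage2(sentence):
    """Stage 2: pair up protein mentions in a relation-bearing sentence."""
    found = [p for p in proteins if p in sentence.split()]
    return {(a, b) for i, a in enumerate(found) for b in found[i + 1:]}

sent = "MDM2 binds p53 and inhibits its transcriptional activity ."
if stage1(sent):
    print(stage2(sent))  # {('MDM2', 'p53')}
```

Filtering first keeps the (more expensive) pairing stage off sentences with no relation at all, which is the efficiency argument for the two-stage layout.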

  3. A modified extraction protocol enables detection and quantification of celiac disease-related gluten proteins from wheat.

    PubMed

    van den Broeck, Hetty C; America, Antoine H P; Smulders, Marinus J M; Bosch, Dirk; Hamer, Rob J; Gilissen, Ludovicus J W J; van der Meer, Ingrid M

    2009-04-01

    The detection, analysis, and quantification of individual celiac disease (CD) immune-responsive gluten proteins in wheat and related cereals (barley, rye) require an adequate and reliable extraction protocol. Because different types of gluten proteins differ in solubility, several different extraction protocols are currently in use. The performance of various documented gluten extraction protocols was evaluated for specificity and completeness by gel electrophoresis (SDS-PAGE), immunoblotting, and RIDASCREEN Gliadin competitive ELISA. Based on these results, an optimized two-step extraction protocol was developed.

  4. Thematic orders and the comprehension of subject-extracted relative clauses in Mandarin Chinese

    PubMed Central

    Lin, Chien-Jer Charles

    2015-01-01

    This study investigates the comprehension of three kinds of subject-extracted relative clauses (SRs) in Mandarin Chinese: standard SRs, relative clauses involving the disposal ba construction (“disposal SRs”), and relative clauses involving the long passive bei constructions (“passive SRs”). In a self-paced reading experiment, the regions before the relativizer (where the sentential fragments are temporarily ambiguous) showed reading patterns consistent with expectation-based incremental processing: standard SRs, with the highest constructional frequency and the least complex syntactic structure, were processed faster than the other two variants. However, in the regions after the relativizer and the head noun where the existence of a relative clause is unambiguously indicated, a top-down global effect of thematic ordering was observed: passive SRs, whose thematic role order conforms to the canonical thematic order of Chinese, were read faster than both the standard SRs and the disposal SRs. Taken together, these results suggest that two expectation-based processing factors are involved in the comprehension of Chinese relative clauses, including both the structural probabilities of pre-relativizer constituents and the overall surface thematic orders in the relative clauses. PMID:26441697

  5. A knowledge-poor approach to chemical-disease relation extraction

    PubMed Central

    Alam, Firoj; Corazza, Anna; Lavelli, Alberto; Zanoli, Roberto

    2016-01-01

    The article describes a knowledge-poor approach to the task of extracting Chemical-Disease Relations from PubMed abstracts. A first version of the approach was applied during the participation in the BioCreative V track 3, both in Disease Named Entity Recognition and Normalization (DNER) and in Chemical-induced diseases (CID) relation extraction. For both tasks, we have adopted a general-purpose approach based on machine learning techniques integrated with a limited number of domain-specific knowledge resources and using freely available tools for preprocessing data. Crucially, the system only uses the data sets provided by the organizers. The aim is to design an easily portable approach with a limited need of domain-specific knowledge resources. In the participation in the BioCreative V task, we ranked 5 out of 16 in DNER, and 7 out of 18 in CID. In this article, we present our follow-up study in particular on CID by performing further experiments, extending our approach and improving the performance. PMID:27189609

  6. Quantification of five compounds with heterogeneous physicochemical properties (morphine, 6-monoacetylmorphine, cyamemazine, meprobamate and caffeine) in 11 fluids and tissues, using automated solid-phase extraction and gas chromatography-tandem mass spectrometry.

    PubMed

    Bévalot, Fabien; Bottinelli, Charline; Cartiser, Nathalie; Fanton, Laurent; Guitton, Jérôme

    2014-06-01

    An automated solid-phase extraction (SPE) protocol followed by gas chromatography coupled with tandem mass spectrometry was developed for quantification of caffeine, cyamemazine, meprobamate, morphine and 6-monoacetylmorphine (6-MAM) in 11 biological matrices [blood, urine, bile, vitreous humor, liver, kidney, lung and skeletal muscle, brain, adipose tissue and bone marrow (BM)]. The assay was validated for linearity, within- and between-day precision and accuracy, limits of quantification, selectivity, extraction recovery (ER), sample dilution and autosampler stability on BM. For the other matrices, partial validation was performed (limits of quantification, linearity, within-day precision, accuracy, selectivity and ER). The lower limits of quantification were 12.5 ng/mL(ng/g) for 6-MAM, morphine and cyamemazine, 100 ng/mL(ng/g) for meprobamate and 50 ng/mL(ng/g) for caffeine. Analysis of real-case samples demonstrated the performance of the assay in forensic toxicology for investigating challenging cases in which, for example, blood is not available or in which analysis in alternative matrices could be relevant. The SPE protocol was also assessed as an extraction procedure that could target other relevant analytes of interest. The extraction procedure was applied to 12 molecules of forensic interest with various physicochemical properties (alimemazine, alprazolam, amitriptyline, citalopram, cocaine, diazepam, levomepromazine, nordazepam, tramadol, venlafaxine, pentobarbital and phenobarbital). All drugs could be detected at therapeutic concentrations in blood and in the alternative matrices.

  7. Validation of high-throughput measurement system with microwave-assisted extraction, fully automated sample preparation device, and gas chromatography-electron capture detector for determination of polychlorinated biphenyls in whale blubber.

    PubMed

    Fujita, Hiroyuki; Honda, Katsuhisa; Hamada, Noriaki; Yasunaga, Genta; Fujise, Yoshihiro

    2009-02-01

    A high-throughput measurement system combining microwave-assisted extraction (MAE), a fully automated sample preparation device (SPD), and a gas chromatography-electron capture detector (GC-ECD) was validated for the determination of polychlorinated biphenyls (PCBs) in minke whale blubber. PCB congeners accounting for >95% of the total PCB burden in blubber were efficiently extracted with a small volume (20 mL) of n-hexane using MAE, owing to simultaneous saponification and extraction. The crude MAE extract was then rapidly purified and automatically solvent-exchanged into a small volume (1 mL) of toluene by the SPD, without the use of concentrators, and the PCB concentration in the purified, concentrated solution was accurately determined by GC-ECD. Accuracy testing with a certified reference material (SRM 1588b, cod liver oil) showed good agreement with the NIST-certified concentration values. The method quantification limit for total PCBs in whale blubber was 41 ng g(-1), and the complete measurement takes only four hours. This method is therefore well suited for the monitoring and screening of PCBs in support of marine-ecosystem conservation and the safe distribution of foods.

  8. Habitat automation

    NASA Technical Reports Server (NTRS)

    Swab, Rodney E.

    1992-01-01

    A habitat, on either the surface of the Moon or Mars, will be designed and built with the proven technologies of that day. These technologies will be mature and readily available to the habitat designer. We believe an acceleration of the normal pace of automation would allow a habitat to be safer and more easily maintained than would be the case otherwise. This document examines the operation of a habitat and describes elements of that operation which may benefit from an increased use of automation. Research topics within the automation realm are then defined and discussed with respect to the role they can have in the design of the habitat. Problems associated with the integration of advanced technologies into real-world projects at NASA are also addressed.

  9. A Novel Method for Extracting Respiration Rate and Relative Tidal Volume from Infrared Thermography

    PubMed Central

    Lewis, Gregory F.; Gatto, Rodolfo G.; Porges, Stephen W.

    2010-01-01

    In psychophysiological research, measurement of respiration has been dependent on transducers having direct contact with the participant. The current study provides empirical data demonstrating that a noncontact technology, infrared video thermography, can accurately estimate breathing rate and relative tidal volume across a range of breathing patterns. Video tracking algorithms were applied to frame-by-frame thermal images of the face to extract time series of nostril temperature and to generate breath-by-breath measures of respiration rate and relative tidal volume. The thermal indices of respiration were contrasted with criterion measures collected with inductance plethysmography. The strong correlations observed between the technologies demonstrate the potential use of facial video thermography as a noncontact technology to monitor respiration. PMID:21214587
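
    The breath-by-breath step described above can be sketched as follows. This is a minimal illustration, not the study's tracking algorithm: it assumes a nostril-temperature time series is already available (temperature rises with each warm exhalation) and counts thermal peaks to estimate rate.

```python
import numpy as np

def respiration_rate(temps, fs):
    """Estimate breathing rate (breaths/min) from a nostril-temperature
    time series sampled at fs Hz by counting thermal peaks, one per
    breath (warm exhaled air at the nostril)."""
    x = np.asarray(temps, dtype=float)
    x = x - x.mean()
    # strict local maxima of the mean-centered signal
    peaks = np.flatnonzero((x[1:-1] > x[:-2]) & (x[1:-1] >= x[2:])) + 1
    minutes = len(x) / fs / 60.0
    return len(peaks) / minutes

# synthetic nostril signal: 0.25 Hz thermal oscillation for 60 s at 10 Hz
fs = 10.0
t = np.arange(0, 60, 1 / fs)
signal = 34.0 + 0.5 * np.sin(2 * np.pi * 0.25 * t)
rate = respiration_rate(signal, fs)   # ≈ 15 breaths/min
```

    The peak-to-trough amplitude of each thermal cycle would analogously serve as the relative (uncalibrated) tidal-volume measure contrasted with inductance plethysmography in the study.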

  10. FAQs Related to Response to Petition to Add Oil And Gas Extraction Sector to the TRI Program

    EPA Pesticide Factsheets

    Questions and answers related to EPA's response to a petition by the Environmental Integrity Project and 16 other organizations to add the Oil and Gas Extraction sector to the scope of industries subject to TRI reporting requirements.

  11. miRTex: A Text Mining System for miRNA-Gene Relation Extraction

    PubMed Central

    Li, Gang; Ross, Karen E.; Arighi, Cecilia N.; Peng, Yifan; Wu, Cathy H.; Vijay-Shanker, K.

    2015-01-01

    MicroRNAs (miRNAs) regulate a wide range of cellular and developmental processes through gene expression suppression or mRNA degradation. Experimentally validated miRNA gene targets are often reported in the literature. In this paper, we describe miRTex, a text mining system that extracts miRNA-target relations, as well as miRNA-gene and gene-miRNA regulation relations. The system achieves good precision and recall when evaluated on a literature corpus of 150 abstracts with F-scores close to 0.90 on the three different types of relations. We conducted full-scale text mining using miRTex to process all the Medline abstracts and all the full-length articles in the PubMed Central Open Access Subset. The results for all the Medline abstracts are stored in a database for interactive query and file download via the website at http://proteininformationresource.org/mirtex. Using miRTex, we identified genes potentially regulated by miRNAs in Triple Negative Breast Cancer, as well as miRNA-gene relations that, in conjunction with kinase-substrate relations, regulate the response to abiotic stress in Arabidopsis thaliana. These two use cases demonstrate the usefulness of miRTex text mining in the analysis of miRNA-regulated biological processes. PMID:26407127
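
    A toy version of such relation extraction can be sketched with a pattern rule. This is a hypothetical illustration, not the miRTex system, and the trigger list and gene pattern below are invented for the example.

```python
import re

# Hypothetical rule-based sketch of miRNA-gene regulation extraction:
# match "miR-X <trigger> GENE" patterns in text. Real systems such as
# miRTex use far richer linguistic analysis than a single regex.

MIRNA = r"(miR-\d+[a-z]?(?:-[35]p)?)"
TRIGGER = r"(targets|represses|regulates|downregulates)"
GENE = r"([A-Z][A-Z0-9]{1,9})"

PATTERN = re.compile(MIRNA + r"\s+(?:directly\s+)?" + TRIGGER + r"\s+" + GENE)

def extract_mirna_relations(text):
    """Return (miRNA, trigger, gene) triples found in the text."""
    return PATTERN.findall(text)
```

    Even this crude pattern shows why direction matters in the task: "miR-21 targets PTEN" and "PTEN regulates miR-21" name the same entities but assert different miRNA-gene and gene-miRNA relations, which miRTex extracts as distinct types.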

  12. miRTex: A Text Mining System for miRNA-Gene Relation Extraction.

    PubMed

    Li, Gang; Ross, Karen E; Arighi, Cecilia N; Peng, Yifan; Wu, Cathy H; Vijay-Shanker, K

    2015-01-01

    MicroRNAs (miRNAs) regulate a wide range of cellular and developmental processes through gene expression suppression or mRNA degradation. Experimentally validated miRNA gene targets are often reported in the literature. In this paper, we describe miRTex, a text mining system that extracts miRNA-target relations, as well as miRNA-gene and gene-miRNA regulation relations. The system achieves good precision and recall when evaluated on a literature corpus of 150 abstracts with F-scores close to 0.90 on the three different types of relations. We conducted full-scale text mining using miRTex to process all the Medline abstracts and all the full-length articles in the PubMed Central Open Access Subset. The results for all the Medline abstracts are stored in a database for interactive query and file download via the website at http://proteininformationresource.org/mirtex. Using miRTex, we identified genes potentially regulated by miRNAs in Triple Negative Breast Cancer, as well as miRNA-gene relations that, in conjunction with kinase-substrate relations, regulate the response to abiotic stress in Arabidopsis thaliana. These two use cases demonstrate the usefulness of miRTex text mining in the analysis of miRNA-regulated biological processes.

  13. Separating arterial and venous-related components of photoplethysmographic signals for accurate extraction of oxygen saturation and respiratory rate.

    PubMed

    Yousefi, Rasoul; Nourani, Mehrdad

    2015-05-01

    We propose an algorithm for separating arterial and venous-related signals using second-order statistics of red and infrared signals in a blind source separation technique. The separated arterial signal is used to compute accurate arterial oxygen saturation. We have also introduced an algorithm for extracting the respiratory pattern from the extracted venous-related signal. In addition to real-time monitoring of the respiratory pattern, the respiratory rate is extracted. Our experimental results from multiple subjects show that the proposed separation technique is extremely useful for extracting accurate arterial oxygen saturation and respiratory rate. Specifically, the breathing rate is extracted with an average root mean square deviation of 1.89 and an average mean difference of -0.69.
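
    Second-order blind source separation on a two-channel red/infrared recording can be sketched with an AMUSE-style procedure (whiten on the zero-lag covariance, then rotate to diagonalize a time-lagged covariance). This is a generic illustration on synthetic data, not the paper's algorithm; the 1.2 Hz and 0.25 Hz sources are stand-ins for cardiac and respiratory components.

```python
import numpy as np

def amuse_separate(X, lag=5):
    """AMUSE-style separation using only second-order statistics:
    whiten the mixed channels, then rotate so a symmetrized
    time-lagged covariance matrix becomes diagonal."""
    X = X - X.mean(axis=1, keepdims=True)
    n = X.shape[1]
    # whitening transform from the zero-lag covariance
    c0 = X @ X.T / n
    d, e = np.linalg.eigh(c0)
    W = e @ np.diag(1.0 / np.sqrt(d)) @ e.T
    Z = W @ X
    # eigenvectors of the symmetrized lagged covariance give the rotation
    c1 = Z[:, :-lag] @ Z[:, lag:].T / (n - lag)
    c1 = (c1 + c1.T) / 2
    _, u = np.linalg.eigh(c1)
    return u.T @ Z

# synthetic red/infrared mixture of a "cardiac" and a "respiratory" source
fs = 50.0
t = np.arange(0, 30, 1 / fs)
s = np.vstack([np.sin(2 * np.pi * 1.2 * t), np.sin(2 * np.pi * 0.25 * t)])
A = np.array([[1.0, 0.6], [0.7, 1.0]])   # assumed mixing into two channels
recovered = amuse_separate(A @ s)
```

    The method works because the two sources have different autocorrelations at the chosen lag, which is the essential second-order property such separation relies on.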

  14. Characterization of cysteine related variants in an IgG2 antibody by LC-MS with an automated data analysis approach.

    PubMed

    Zhang, Yuling; Bailey, Robert; Nightlinger, Nancy; Gillespie, Alison; Balland, Alain; Rogers, Richard

    2015-08-01

    In this communication, a high-throughput method for automated data analysis of cysteine-related product quality attributes (PQAs) in IgG2 antibodies is reported. This method leverages recent advances in the relative quantification of PQAs to facilitate the characterization of disulfide variants and free sulfhydryls (SHs) in IgG2 antibodies. The method uses samples labeled with a mass tag (N-ethyl maleimide [NEM]) followed by enzymatic digestion under non-reducing conditions to maintain the cysteine connectivity. The digested IgG2 samples are separated and detected by mass spectrometry (MS) and the resulting peptide map is analyzed in an automated fashion using Pinpoint software (Thermo Scientific). Previous knowledge of IgG2 disulfide structures can be fed into the Pinpoint software to create workbooks for various disulfide linkages and hinge disulfide variants. In addition, the NEM mass tag can be added to the workbooks for targeted analysis of labeled cysteine-containing peptides. The established Pinpoint workbooks are a high-throughput approach to quantify relative abundances of unpaired cysteines and disulfide linkages, including complicated hinge disulfide variants. This approach is especially efficient for comparing large sets of similar samples such as those created in comparability and stability studies or chromatographic fractions. Here, the high throughput method is applied to quantify the relative abundance of hinge disulfide variants and unpaired cysteines in the IgG2 fractions from non-reduced reversed-phase high-performance liquid chromatography (nrRP-HPLC). The LC-MS data analyzed by the Pinpoint workbook suggests that the nrRP-HPLC separated peaks contain hinge disulfide isoforms and free cysteine pairs for each major disulfide isoform structure.

  15. Automating Finance

    ERIC Educational Resources Information Center

    Moore, John

    2007-01-01

    In past years, higher education's financial management side has been riddled with manual processes and aging mainframe applications. This article discusses schools which had taken advantage of an array of technologies that automate billing, payment processing, and refund processing in the case of overpayment. The investments are well worth it:…

  16. Personnel Department Automation.

    ERIC Educational Resources Information Center

    Wilkinson, David

    In 1989, the Austin Independent School District's Office of Research and Evaluation was directed to monitor the automation of personnel information and processes in the district's Department of Personnel. Earlier, a study committee appointed by the Superintendent during the 1988-89 school year identified issues related to Personnel Department…

  17. Flight-deck automation: Promises and problems

    NASA Technical Reports Server (NTRS)

    Wiener, E. L.; Curry, R. E.

    1980-01-01

    The state of the art in human factors in flight-deck automation is presented. A number of critical problem areas are identified and broad design guidelines are offered. Automation-related aircraft accidents and incidents are discussed as examples of human factors problems in automated flight.

  18. Pathogenesis-related protein expression in the apoplast of wheat leaves protected against leaf rust following application of plant extracts.

    PubMed

    Naz, Rabia; Bano, Asghari; Wilson, Neil L; Guest, David; Roberts, Thomas H

    2014-09-01

    Leaf rust (Puccinia triticina) is a major disease of wheat. We tested aqueous leaf extracts of Jacaranda mimosifolia (Bignoniaceae), Thevetia peruviana (Apocynaceae), and Calotropis procera (Apocynaceae) for their ability to protect wheat from leaf rust. Extracts from all three species inhibited P. triticina urediniospore germination in vitro. Plants sprayed with extracts before inoculation developed significantly lower levels of disease incidence (number of plants infected) than unsprayed, inoculated controls. Sprays combining 0.6% leaf extracts and 2 mM salicylic acid with the fungicide Amistar Xtra at 0.05% (azoxystrobin at 10 μg/liter + cyproconazole at 4 μg/liter) reduced disease incidence significantly more effectively than sprays of fungicide at 0.1% alone. Extracts of J. mimosifolia were most active, either alone (1.2%) or in lower doses (0.6%) in combination with 0.05% Amistar Xtra. Leaf extracts combined with fungicide strongly stimulated defense-related gene expression and the subsequent accumulation of pathogenesis-related (PR) proteins in the apoplast of inoculated wheat leaves. The level of protection afforded was significantly correlated with the ability of extracts to increase PR protein expression. We conclude that pretreatment of wheat leaves with spray formulations containing previously untested plant leaf extracts enhances protection against leaf rust provided by fungicide sprays, offering an alternative disease management strategy.

  19. Automating the analytical laboratory via the Chemical Analysis Automation paradigm

    SciTech Connect

    Hollen, R.; Rzeszutko, C.

    1997-10-01

    To address the need for standardization within the analytical chemistry laboratories of the nation, the Chemical Analysis Automation (CAA) program within the US Department of Energy, Office of Science and Technology's Robotic Technology Development Program is developing laboratory sample analysis systems that will automate the environmental chemical laboratories. The current laboratory automation paradigm consists of islands-of-automation that do not integrate into a system architecture. Thus, today the chemist must perform most aspects of environmental analysis manually using instrumentation that generally cannot communicate with other devices in the laboratory. CAA is working towards a standardized and modular approach to laboratory automation based upon the Standard Analysis Method (SAM) architecture. Each SAM system automates a complete chemical method. The building block of a SAM is known as the Standard Laboratory Module (SLM). The SLM, either hardware or software, automates a subprotocol of an analysis method and can operate as a standalone or as a unit within a SAM. The CAA concept allows the chemist to easily assemble an automated analysis system, from sample extraction through data interpretation, using standardized SLMs without the worry of hardware or software incompatibility or the necessity of generating complicated control programs. A Task Sequence Controller (TSC) software program schedules and monitors the individual tasks to be performed by each SLM configured within a SAM. The chemist interfaces with the operation of the TSC through the Human Computer Interface (HCI), a logical, icon-driven graphical user interface. The CAA paradigm has successfully been applied in automating EPA SW-846 Methods 3541/3620/8081 for the analysis of PCBs in a soil matrix utilizing commercially available equipment in tandem with SLMs constructed by CAA.
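
    The SAM/SLM/TSC layering can be sketched structurally. The class names and operations below are purely illustrative, not the actual CAA software: each SLM automates one subprotocol, an ordered list of SLMs forms a SAM, and the controller schedules and monitors the sequence.

```python
# Structural sketch of the CAA concepts (illustrative names only).

class SLM:
    """Standard Laboratory Module: automates one subprotocol."""
    def __init__(self, name, operation):
        self.name = name
        self.operation = operation  # callable: sample -> sample

    def run(self, sample):
        return self.operation(sample)

class TaskSequenceController:
    """TSC: schedules and monitors the SLMs configured within a SAM."""
    def __init__(self, sam_modules):
        self.sam = sam_modules   # ordered SLMs forming one complete method
        self.log = []

    def execute(self, sample):
        for slm in self.sam:
            sample = slm.run(sample)
            self.log.append(f"{slm.name}: done")
        return sample

# a toy SAM: extraction -> cleanup -> analysis
sam = [
    SLM("extraction", lambda s: s + ["extracted"]),
    SLM("cleanup", lambda s: s + ["cleaned"]),
    SLM("analysis", lambda s: s + ["analyzed"]),
]
tsc = TaskSequenceController(sam)
result = tsc.execute(["soil sample"])
```

    The design choice the abstract emphasizes is exactly this modularity: because every SLM exposes the same interface, a chemist can recompose a SAM from standard modules without writing new control programs.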

  20. Automated clean-up, separation and detection of polycyclic aromatic hydrocarbons in particulate matter extracts from urban dust and diesel standard reference materials using a 2D-LC/2D-GC system.

    PubMed

    Ahmed, Trifa M; Lim, Hwanmi; Bergvall, Christoffer; Westerholm, Roger

    2013-10-01

    A multidimensional, on-line coupled liquid chromatographic/gas chromatographic system was developed for the quantification of polycyclic aromatic hydrocarbons (PAHs). A two-dimensional liquid chromatographic system (2D-liquid chromatography (LC)), with three columns having different selectivities, was connected on-line to a two-dimensional gas chromatographic system (2D-gas chromatography (GC)). Samples were cleaned up by combining normal elution and column back-flush of the LC columns to selectively remove matrix constituents and isolate well-defined, PAH enriched fractions. Using this system, the sequential removal of polar, mono/diaromatic, olefinic and alkane compounds from crude extracts was achieved. The LC/GC coupling was performed using a fused silica transfer line into a programmable temperature vaporizer (PTV) GC injector. Using the PTV in the solvent vent mode, excess solvent was removed and the enriched PAH sample extract was injected into the GC. The 2D-GC setup consisted of two capillary columns with different stationary phase selectivities. Heart-cutting of selected PAH compounds in the first GC column (first dimension) and transfer of these to the second GC column (second dimension) increased the baseline resolutions of closely eluting PAHs. The on-line system was validated using the standard reference materials SRM 1649a (urban dust) and SRM 1975 (diesel particulate extract). The PAH concentrations measured were comparable to the certified values and the fully automated LC/GC system performed the clean-up, separation and detection of PAHs in 16 extracts in less than 24 h. The multidimensional, on-line 2D-LC/2D-GC system eliminated manual handling of the sample extracts and minimised the risk of sample loss and contamination, while increasing accuracy and precision.

  1. LC-HR-MS/MS standard urine screening approach: Pros and cons of automated on-line extraction by turbulent flow chromatography versus dilute-and-shoot and comparison with established urine precipitation.

    PubMed

    Helfer, Andreas G; Michely, Julian A; Weber, Armin A; Meyer, Markus R; Maurer, Hans H

    2017-02-01

    Comprehensive urine screening for drugs and metabolites by LC-HR-MS/MS using Orbitrap technology has been described with precipitation as a simple workup. To speed up, automate, and/or simplify the workup, on-line extraction by turbulent flow chromatography and a dilute-and-shoot approach were developed and compared. After chromatographic separation within 10min, the Q-Exactive mass spectrometer was run in full scan mode with positive/negative switching and subsequent data dependent acquisition mode. The workup approaches were validated concerning selectivity, recovery, matrix effects, process efficiency, and limits of identification and detection for typical drug representatives and metabolites. The total workup time for on-line extraction was 6min, for the dilution approach 3min; for comparison, the established urine precipitation and evaporation lasted 10min. The validation results were acceptable. The limits for on-line extraction were comparable with those described for precipitation, but lower than for dilution. Thanks to the high sensitivity of the LC-HR-MS/MS system, all three workup approaches were sufficient for comprehensive urine screening and allowed fast, reliable, and reproducible detection of cardiovascular drugs, drugs of abuse, and other CNS acting drugs after common doses.

  2. Automated headspace-solid-phase micro extraction-retention time locked-isotope dilution gas chromatography-mass spectrometry for the analysis of organotin compounds in water and sediment samples.

    PubMed

    Devosa, Christophe; Vliegen, Maarten; Willaert, Bart; David, Frank; Moens, Luc; Sandra, Pat

    2005-06-24

    An automated method for the simultaneous determination of six important organotin compounds, namely monobutyltin (MBT), dibutyltin (DBT), tributyltin (TBT), monophenyltin (MPhT), diphenyltin (DPhT) and triphenyltin (TPhT), in water and sediment samples is described. The method is based on derivatization with sodium tetraethylborate followed by automated headspace-solid-phase micro extraction (SPME) combined with GC-MS under retention time locked (RTL) conditions. Home-synthesized deuterated organotin analogues were used as internal standards. Two highly abundant fragment ions corresponding to the main tin isotopes Sn118 and Sn120 were chosen: one for quantification and one as a qualifier ion. The method was validated and excellent figures of merit were obtained. Limits of quantification (LOQs) range from 1.3 to 15 ng l(-1) (ppt) for water samples and from 1.0 to 6.3 microg kg(-1) (ppb) for sediment samples. Accuracy for sediment samples was tested on spiked real-life sediment samples and on the PACS-2 marine harbor sediment reference material. The developed method was used in a case study at the harbor of Antwerp, where sediment samples from different areas were collected and screened for TBT contamination. Concentrations ranged from 15 microg kg(-1) in the port of Antwerp up to 43 mg kg(-1) near a ship repair unit.
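
    The internal-standard quantification underlying isotope dilution can be illustrated with the generic response-factor formula (a textbook sketch with made-up numbers, not the paper's calibration data): the analyte concentration follows from the analyte/internal-standard peak-area ratio and a response factor measured on a calibration standard.

```python
# Generic isotope-dilution / internal-standard quantification sketch.

def response_factor(area_std, area_is_std, conc_std, conc_is):
    """RF = (A_analyte / A_IS) / (C_analyte / C_IS) on a calibration
    standard spiked with the (e.g. deuterated) internal standard."""
    return (area_std / area_is_std) / (conc_std / conc_is)

def quantify(area_sample, area_is_sample, conc_is, rf):
    """Back-calculate the analyte concentration in a sample carrying
    the same internal-standard spike."""
    return (area_sample / area_is_sample) * conc_is / rf

# illustrative peak areas and concentrations (arbitrary units)
rf = response_factor(area_std=2.0e5, area_is_std=1.0e5,
                     conc_std=50.0, conc_is=25.0)
conc = quantify(area_sample=1.2e5, area_is_sample=1.0e5,
                conc_is=25.0, rf=rf)
```

    Because the deuterated analogue behaves almost identically to the analyte through extraction and ionization, losses and matrix effects largely cancel in the area ratio, which is what makes this scheme robust for trace-level work.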

  3. Extract the Relational Information of Static Features and Motion Features for Human Activities Recognition in Videos

    PubMed Central

    2016-01-01

    Both static features and motion features have shown promising performance in the human activity recognition task. However, the information included in these features is insufficient for complex human activities. In this paper, we propose extracting relational information between static features and motion features for human activity recognition. Videos are represented by the classical Bag-of-Words (BoW) model, which has proven useful in many works. To obtain a compact and discriminative codebook of small dimension, we employ a divisive algorithm based on KL-divergence to reconstruct the codebook. To further capture strong relational information, we then construct a bipartite graph modeling the relationships between words from the different feature sets, and apply a k-way partition to create a new codebook in which similar words are grouped together. With this new codebook, videos can be represented by a new BoW vector carrying strong relational information. Moreover, we propose a method to compute new clusters from the divisive algorithm's projective function. We test our approach on several datasets and obtain very promising results. PMID:27656199
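
    The relational-codebook idea can be sketched in simplified form. The greedy pairing below is a stand-in for the paper's KL-divergence divisive clustering and k-way graph partition: it builds bipartite co-occurrence weights between static and motion words over a video collection and maps each motion word to its strongest static partner.

```python
import numpy as np

# Simplified relational-codebook sketch (illustrative, not the paper's
# algorithm): accumulate static/motion word co-occurrence over videos,
# then fuse each motion word with its most co-occurring static word.

def build_bipartite(static_hists, motion_hists):
    """Bipartite co-occurrence weights between static words (rows)
    and motion words (columns), summed over videos."""
    return sum(np.outer(s, m) for s, m in zip(static_hists, motion_hists))

def fuse_codebook(W):
    """Map each motion word to its strongest static partner."""
    return {m: int(np.argmax(W[:, m])) for m in range(W.shape[1])}

# two toy videos, each a BoW histogram over 2 static and 2 motion words
static_hists = [np.array([3, 0]), np.array([0, 2])]
motion_hists = [np.array([2, 0]), np.array([0, 1])]
W = build_bipartite(static_hists, motion_hists)
pairing = fuse_codebook(W)
```

    A full k-way partition would instead cut the bipartite graph into k groups of mixed static and motion words, but the intuition is the same: words that consistently co-occur across videos carry the relational signal.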

  4. Locoregional Control of Non-Small Cell Lung Cancer in Relation to Automated Early Assessment of Tumor Regression on Cone Beam Computed Tomography

    SciTech Connect

    Brink, Carsten; Bernchou, Uffe; Bertelsen, Anders; Hansen, Olfred; Schytte, Tine; Bentzen, Soren M.

    2014-07-15

    Purpose: Large interindividual variations in volume regression of non-small cell lung cancer (NSCLC) are observable on standard cone beam computed tomography (CBCT) during fractionated radiation therapy. Here, a method for automated assessment of tumor volume regression is presented and its potential use in response-adapted personalized radiation therapy is evaluated empirically. Methods and Materials: Automated deformable registration with calculation of the Jacobian determinant was applied to serial CBCT scans in a series of 99 patients with NSCLC. Tumor volume at the end of treatment was estimated on the basis of the first one third and two thirds of the scans. The concordance between estimated and actual relative volume at the end of radiation therapy was quantified by Pearson's correlation coefficient. On the basis of the estimated relative volume, the patients were stratified into 2 groups having volume regressions below or above the population median value. Kaplan-Meier plots of locoregional disease-free rate and overall survival in the 2 groups were used to evaluate the predictive value of tumor regression during treatment. A Cox proportional hazards model was used to adjust for other clinical characteristics. Results: Automatic measurement of the tumor regression from standard CBCT images was feasible. Pearson's correlation coefficient between manual and automatic measurement was 0.86 in a sample of 9 patients. Most patients experienced tumor volume regression, and this could be quantified early in the treatment course. Interestingly, patients with pronounced volume regression had worse locoregional tumor control and overall survival. This effect was significant in patients with non-adenocarcinoma histology. Conclusions: Evaluation of routinely acquired CBCT images during radiation therapy provides biological information on the specific tumor. This could potentially form the basis for personalized response-adaptive therapy.
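
    The Jacobian-determinant measurement can be sketched in two dimensions (the study used 3D deformable registration of CBCT scans; everything below is synthetic): for a deformation x -> x + u(x), det(I + grad u) gives the local volume scaling, and its mean over the tumor region estimates the relative volume at a later scan.

```python
import numpy as np

def jacobian_determinant(ux, uy):
    """det(I + grad u) on a 2D grid of displacements (pixel units)."""
    duy_dy, duy_dx = np.gradient(uy)   # axis 0 = y, axis 1 = x
    dux_dy, dux_dx = np.gradient(ux)
    return (1 + dux_dx) * (1 + duy_dy) - dux_dy * duy_dx

# synthetic deformation: uniform 10% shrink toward the origin,
# u(x) = -0.1 * x, so the local area scaling is 0.9**2 everywhere
n = 32
y, x = np.mgrid[0:n, 0:n].astype(float)
ux, uy = -0.1 * x, -0.1 * y
J = jacobian_determinant(ux, uy)
relative_volume = J.mean()   # 0.9**2 = 0.81 of the original area
```

    In the study this quantity, integrated over the registered tumor, is what allows volume regression to be read off automatically from routine CBCT without manual recontouring.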

  5. Comparative analyses of universal extraction buffers for assay of stress related biochemical and physiological parameters.

    PubMed

    Han, Chunyu; Chan, Zhulong; Yang, Fan

    2015-01-01

    The efficiencies of three extraction solutions, namely the universal sodium phosphate buffer (USPB), the universal Tris-HCl buffer (UTHB), and assay-specific buffers, were compared for assays of soluble protein, free proline, superoxide radical (O2∙-), hydrogen peroxide (H2O2), and antioxidant enzymes including superoxide dismutase (SOD), catalase (CAT), guaiacol peroxidase (POD), ascorbate peroxidase (APX), glutathione peroxidase (GPX), and glutathione reductase (GR) in Populus deltoides. Significant differences in protein extraction were detected via sodium dodecyl sulfate polyacrylamide gel electrophoresis (SDS-PAGE) and two-dimensional electrophoresis (2-DE). Between the two universal extraction buffers, the USPB showed higher efficiency for extraction of soluble protein, CAT, GR, O2∙-, GPX, SOD, and free proline, while the UTHB had higher efficiency for extraction of APX, POD, and H2O2. When compared with the specific buffers, the USPB showed higher extraction efficiency for measurement of soluble protein, CAT, GR, and O2∙-, parallel extraction efficiency for GPX, SOD, free proline, and H2O2, and lower extraction efficiency for APX and POD, whereas the UTHB had higher extraction efficiency for measurement of POD and H2O2. Further comparisons showed that the 100 mM USPB gave the highest extraction efficiencies. These results indicate that USPB is suitable and efficient for extraction of soluble protein, CAT, GR, GPX, SOD, H2O2, O2∙-, and free proline.

  6. Automated solid-phase extraction-liquid chromatography-tandem mass spectrometry analysis of 11-nor-Delta9-tetrahydrocannabinol-9-carboxylic acid in human urine specimens: application to a high-throughput urine analysis laboratory.

    PubMed

    Robandt, P V; Klette, K L; Sibum, M

    2009-10-01

    An automated solid-phase extraction coupled with liquid chromatography and tandem mass spectrometry (SPE-LC-MS-MS) method for the analysis of 11-nor-Delta(9)-tetrahydrocannabinol-9-carboxylic acid (THC-COOH) in human urine specimens was developed. The method was linear (R(2) = 0.9986) to 1000 ng/mL with no carryover evidenced at 2000 ng/mL. Limits of quantification and detection were found to be 2 ng/mL. Interrun precision was evaluated at the 15 ng/mL level over nine batches spanning 15 days (n = 45); the coefficient of variation (%CV) was 5.5% over the course of the validation. Intrarun precision of a 15 ng/mL control (n = 5) ranged from 0.58% CV to 7.4% CV for the same set of analytical batches. Interference was tested using (+/-)-11-hydroxy-Delta(9)-tetrahydrocannabinol, cannabidiol, (-)-Delta(8)-tetrahydrocannabinol, and cannabinol. One hundred and nineteen specimens found to contain THC-COOH by a previously validated gas chromatography-mass spectrometry (GC-MS) procedure were compared to the SPE-LC-MS-MS method, with excellent agreement (R(2) = 0.9925) in the parallel comparison study. The automated SPE procedure eliminates the human factors of specimen handling, extraction, and derivatization, thereby reducing labor costs and rework resulting from human error or technique issues. Additionally, method runtime is greatly reduced (e.g., during parallel studies the SPE-LC-MS-MS instrument was often finished with analysis by the time the technician finished the offline SPE and derivatization procedure prior to the GC-MS analysis).
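
    Two of the validation figures reported above, calibration linearity (R²) and precision (%CV), are straightforward to compute. The sketch below uses illustrative numbers, not the study's raw data.

```python
import numpy as np

def linearity_r2(conc, response):
    """R^2 of a straight-line least-squares fit of instrument
    response versus concentration."""
    slope, intercept = np.polyfit(conc, response, 1)
    pred = slope * np.asarray(conc, dtype=float) + intercept
    ss_res = np.sum((np.asarray(response, dtype=float) - pred) ** 2)
    ss_tot = np.sum((response - np.mean(response)) ** 2)
    return 1 - ss_res / ss_tot

def percent_cv(values):
    """Coefficient of variation: 100 * sample sd / mean."""
    v = np.asarray(values, dtype=float)
    return 100.0 * v.std(ddof=1) / v.mean()

# illustrative calibration points (ng/mL vs detector response) and
# illustrative replicate measurements of a 15 ng/mL control
conc = [2, 15, 50, 200, 1000]
response = [4.1, 29.8, 101.0, 398.0, 2003.0]
r2 = linearity_r2(conc, response)
cv = percent_cv([14.2, 15.1, 14.8, 15.6, 14.9])
```

    Note that %CV uses the sample standard deviation (ddof=1), the usual convention for replicate-based precision statements like the interrun figure quoted in the abstract.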

  7. Automated PCR setup for forensic casework samples using the Normalization Wizard and PCR Setup robotic methods.

    PubMed

    Greenspoon, S A; Sykes, K L V; Ban, J D; Pollard, A; Baisden, M; Farr, M; Graham, N; Collins, B L; Green, M M; Christenson, C C

    2006-12-20

    Human genome, pharmaceutical and research laboratories have long enjoyed the application of robotics to performing repetitive laboratory tasks. However, the utilization of robotics in forensic laboratories for processing casework samples is relatively new and poses particular challenges. Since the quantity and quality (a mixture versus a single source sample, the level of degradation, the presence of PCR inhibitors) of the DNA contained within a casework sample is unknown, particular attention must be paid to procedural susceptibility to contamination, as well as DNA yield, especially as it pertains to samples with little biological material. The Virginia Department of Forensic Science (VDFS) has successfully automated forensic casework DNA extraction utilizing the DNA IQ™ System in conjunction with the Biomek 2000 Automation Workstation. Human DNA quantitation is also performed in a near complete automated fashion utilizing the AluQuant Human DNA Quantitation System and the Biomek 2000 Automation Workstation. Recently, the PCR setup for casework samples has been automated, employing the Biomek 2000 Automation Workstation and Normalization Wizard, Genetic Identity version, which utilizes the quantitation data, imported into the software, to create a customized automated method for DNA dilution, unique to that plate of DNA samples. The PCR Setup software method, used in conjunction with the Normalization Wizard method and written for the Biomek 2000, functions to mix the diluted DNA samples, transfer the PCR master mix, and transfer the diluted DNA samples to PCR amplification tubes. Once the process is complete, the DNA extracts, still on the deck of the robot in PCR amplification strip tubes, are transferred to pre-labeled 1.5 mL tubes for long-term storage using an automated method. The automation of these steps in the process of forensic DNA casework analysis has been accomplished by performing extensive optimization, validation and testing of the

  8. Office Automation Boosts University's Productivity.

    ERIC Educational Resources Information Center

    School Business Affairs, 1986

    1986-01-01

    The University of Pittsburgh has a 2-year agreement designating the Xerox Corporation as the primary supplier of word processing and related office automation equipment in order to increase productivity and more efficient use of campus resources. (MLF)

  9. Handwritten Chinese character recognition based on supervised competitive learning neural network and block-based relative fuzzy feature extraction

    NASA Astrophysics Data System (ADS)

    Sun, Limin; Wu, Shuanhu

    2005-02-01

    Offline handwritten Chinese character recognition remains a difficult problem because of large stroke variation, writing anomalies, and the difficulty of recovering stroke-order information. Generally, offline handwritten Chinese character recognition can be divided into two procedures: feature extraction, which captures handwritten character information, and feature classification for character recognition. In this paper, we propose a new Chinese character recognition algorithm. In the feature extraction part, we adopt an elastic mesh dividing method to extract block features and their relative fuzzy features, which exploit the relationships between different strokes and the distribution probability of a stroke over its neighboring sub-blocks. In the recognition part, we construct a classifier based on a supervised competitive learning algorithm that trains a competitive learning neural network on the extracted feature set. Experimental results show that the performance of our algorithm is encouraging and comparable to that of other algorithms.

  10. How to extract clinically useful information from large amount of dialysis related stored data.

    PubMed

    Vito, Domenico; Casagrande, Giustina; Bianchi, Camilla; Costantino, Maria L

    2015-01-01

    The storage infrastructure arising from technological evolution in the healthcare field has led to ever larger quantities of data related to patients and their pathological evolution being stored in public or private repositories. Big data techniques are spreading in medical research as well: they make it possible to extract information from complex heterogeneous sources and to conduct longitudinal studies correlating patient status with biometric parameters. In our work we developed a common data infrastructure involving four clinical dialysis centers in Lombardy and Switzerland. The common platform was built to store a large amount of clinical data related to 716 dialysis sessions of 70 patients. The platform is made up of a MySQL(®) database (Dialysis Database) combined with a MATLAB-based mining library (Dialysis MATlib). A statistical analysis of the gathered data was performed, leading to the development of two clinical indexes and providing an example of the transformation of big data into clinical information.

  11. RELATIVE POTENCY OF FUNGAL EXTRACTS IN INDUCING ALLERGIC ASTHMA-LIKE RESPONSES IN BALB/C MICE

    EPA Science Inventory

    Indoor mold has been associated with the development of allergic asthma. However, the relative potency of molds in the induction of allergic asthma is not clear. In this study, we tested the relative potency of fungal extracts (Metarhizium anisopliae [MACA], Stachybotrys ...

  12. BioCreative V CDR task corpus: a resource for chemical disease relation extraction.

    PubMed

    Li, Jiao; Sun, Yueping; Johnson, Robin J; Sciaky, Daniela; Wei, Chih-Hsuan; Leaman, Robert; Davis, Allan Peter; Mattingly, Carolyn J; Wiegers, Thomas C; Lu, Zhiyong

    2016-01-01

    Community-run, formal evaluations and manually annotated text corpora are critically important for advancing biomedical text-mining research. Recently in BioCreative V, a new challenge was organized for the tasks of disease named entity recognition (DNER) and chemical-induced disease (CID) relation extraction. Given the nature of both tasks, a test collection is required to contain both disease/chemical annotations and relation annotations in the same set of articles. Despite previous efforts in biomedical corpus construction, none was found to be sufficient for the task. Thus, we developed our own corpus called BC5CDR during the challenge by inviting a team of Medical Subject Headings (MeSH) indexers for disease/chemical entity annotation and Comparative Toxicogenomics Database (CTD) curators for CID relation annotation. To ensure high annotation quality and productivity, detailed annotation guidelines and automatic annotation tools were provided. The resulting BC5CDR corpus consists of 1500 PubMed articles with 4409 annotated chemicals, 5818 diseases and 3116 chemical-disease interactions. Each entity annotation includes both the mention text spans and normalized concept identifiers, using MeSH as the controlled vocabulary. To ensure accuracy, the entities were first captured independently by two annotators followed by a consensus annotation: The average inter-annotator agreement (IAA) scores were 87.49% and 96.05% for the disease and chemicals, respectively, in the test set according to the Jaccard similarity coefficient. Our corpus was successfully used for the BioCreative V challenge tasks and should serve as a valuable resource for the text-mining research community.Database URL: http://www.biocreative.org/tasks/biocreative-v/track-3-cdr/.
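
    The inter-annotator agreement figures reported above use the Jaccard similarity coefficient. A minimal sketch of that measure over two annotators' entity sets (the example annotations and MeSH identifiers are invented, not taken from BC5CDR):

```python
def jaccard(set_a, set_b):
    """Jaccard similarity: |intersection| / |union| of two annotation sets."""
    if not set_a and not set_b:
        return 1.0
    return len(set_a & set_b) / len(set_a | set_b)

# Hypothetical disease annotations from two annotators, as (mention, MeSH id) pairs
ann1 = {("hepatotoxicity", "D056486"), ("nausea", "D009325"), ("rash", "D005076")}
ann2 = {("hepatotoxicity", "D056486"), ("nausea", "D009325"), ("headache", "D006261")}
print(jaccard(ann1, ann2))  # 2 shared of 4 distinct pairs -> 0.5
```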

  13. Extraction of solubles from plant biomass for use as microbial growth stimulant and methods related thereto

    SciTech Connect

    Lau, Ming Woei

    2015-12-08

    A method for producing a microbial growth stimulant (MGS) from plant biomass is described. In one embodiment, an ammonium hydroxide solution is used to extract a solution of proteins and ammonia from the biomass. Some of the proteins and ammonia are separated from the extracted solution to provide the MGS solution. The removed ammonia can be recycled, and the proteins are useful as animal feed. In another embodiment, the method comprises extracting solubles from pretreated lignocellulosic biomass with a cellulase-enzyme-producing growth medium (such as T. reesei) in the presence of water and an aqueous extract.

  14. An object-oriented approach to automated landform mapping: A case study of drumlins

    NASA Astrophysics Data System (ADS)

    Saha, Kakoli; Wells, Neil A.; Munro-Stasiuk, Mandy

    2011-09-01

    This paper details an automated object-oriented approach to mapping landforms from digital elevation models (DEMs), using the example of drumlins in the Chautauqua drumlin field in NW Pennsylvania and upstate New York. Object-oriented classification is highly desirable as it can identify specific shapes in datasets based on both the pixel values in a raster dataset and the contextual information between pixels and extracted objects. The methodology is built specifically for application to the USGS 30 m resolution DEM data, which are freely available to the public and of sufficient resolution to map medium-scale landforms. Using the raw DEM data, as well as derived aspect and slope, Definiens Developer (v.7) was used to perform multiresolution segmentation, followed by rule-based classification, in order to extract individual polygons that represent drumlins. Drumlins obtained by automated extraction were visually and statistically compared to those identified via manual digitization. Detailed morphometric descriptive statistics such as means, ranges, and standard deviations were inspected and compared for length, width, elongation ratio, area, and perimeter. Although the manual and automated results were not always statistically identical, a more detailed comparison of just the drumlins identified by both procedures showed that the automated method easily matched the manual digitization. Differences between the two methods related to the mapping of compound drumlins and of smaller and larger drumlins. The automated method generally identified more features in these categories than the manual method and thus outperformed it.
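
    The morphometric summary described above (means, ranges, and standard deviations of length, width, and elongation ratio, where elongation ratio = length/width) can be sketched as follows; the drumlin dimensions are invented, not taken from the study:

```python
import statistics

def summarize(values):
    """Mean, range, and sample standard deviation, as used for morphometric comparison."""
    return {
        "mean": statistics.mean(values),
        "range": (min(values), max(values)),
        "stdev": statistics.stdev(values),
    }

# Hypothetical drumlin (length, width) pairs in metres from one mapping method
drumlins = [(850, 320), (1200, 400), (640, 260), (980, 350)]
elongation = [length / width for length, width in drumlins]
print(summarize(elongation))
```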

  15. Assessing the clinical uses of fuzzy detection results in the automated detection of CVC-related infections: a preliminary report.

    PubMed

    de Bruin, Jeroen S; Blacky, Alexander; Adlassnig, Klaus-Peter

    2012-01-01

    Central venous catheters (CVCs) play an essential role in the care of the critically ill, but their use comes at the risk of infection. By using fuzzy set theory and logic to model clinical linguistic CVC-related infection criteria, clinical detection systems can detect borderline infections where not all infection parameters have been (fully) met, also called fuzzy results. In this paper we analyzed the clinical use of these results. We used a fuzzy-logic-based computerized infection control system for the monitoring of healthcare-associated infections to uncover fuzzy results and periods, after which we classified them, and used these classifications together with knowledge of prior CVC-related infection episodes in temporal association rule mining. As a result, we uncovered several rules which can help with the early detection of re-occurring CVC-related infections.
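
    A minimal sketch of the fuzzy-criterion idea described above: each clinical parameter gets a graded membership value rather than a hard yes/no, so borderline infections score between 0 and 1. The parameters and thresholds below are invented, not the system's actual criteria:

```python
def ramp_membership(x, low, high):
    """Degree to which x satisfies a criterion: 0 below low, 1 above high, linear between."""
    if x <= low:
        return 0.0
    if x >= high:
        return 1.0
    return (x - low) / (high - low)

# Hypothetical CVC-infection criteria: body temperature and leukocyte count
fever = ramp_membership(38.2, 38.0, 39.0)        # borderline fever, partial membership
leukocytes = ramp_membership(13.0, 10.0, 15.0)   # moderately elevated count
# Fuzzy AND via minimum: overall degree to which the infection criteria are met
print(min(fever, leukocytes))
```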

  16. A novel dual-valve sequential injection manifold (DV-SIA) for automated liquid-liquid extraction. Application for the determination of picric acid.

    PubMed

    Skrlíková, Jana; Andruch, Vasil; Sklenárová, Hana; Chocholous, Petr; Solich, Petr; Balogh, Ioseph S

    2010-05-07

    A novel dual-valve sequential injection system (DV-SIA) for online liquid-liquid extraction which resolves the main problems of LLE utilization in SIA has been designed. The main idea behind this new design was to construct an SIA system by connecting two independent units, one for aqueous-organic mixture flow and the second specifically for organic phase flow. As a result, the DV-SIA manifold consists of an Extraction unit and a Detection unit. Processing a mixture of aqueous-organic phase in the Extraction unit and a separated organic phase in the Detection unit solves the problems associated with the change of phases having different affinities to the walls of the Teflon tubing used in the SI-system. The developed manifold is a simple, user-friendly and universal system built entirely from commercially available components. The system can be used for a variety of samples and organic solvents and is simple enough to be easily handled by operators less familiar with flow systems. The efficiency of the DV-SIA system is demonstrated by the extraction of picric acid in the form of an ion associate with 2-[2-(4-methoxy-phenylamino)-vinyl]-1,3,3-trimethyl-3H-indolium reagent, with subsequent spectrophotometric detection. The suggested DV-SIA concept can be expected to stimulate new experiments in analytical laboratories and can be applied to the elaboration of procedures for the determination of other compounds extractable by organic solvents. It could thus form a basis for the design of simple, single-purpose commercial instruments used in LLE procedures.

  17. Aspects of the antimicrobial efficacy of grapefruit seed extract and its relation to preservative substances contained.

    PubMed

    von Woedtke, T; Schlüter, B; Pflegel, P; Lindequist, U; Jülich, W D

    1999-06-01

    The antimicrobial efficacy as well as the content of preservative agents of six commercially available grapefruit seed extracts were examined. Five of the six extracts showed high growth-inhibiting activity against the test organisms Bacillus subtilis SBUG 14, Micrococcus flavus SBUG 16, Staphylococcus aureus SBUG 11, Serratia marcescens SBUG 9, Escherichia coli SBUG 17, Proteus mirabilis SBUG 47, and Candida maltosa SBUG 700. In all of the antimicrobially active grapefruit seed extracts, the preservative benzethonium chloride was detected by thin layer chromatography. Additionally, three extracts contained the preservatives triclosan and methylparaben. In only one of the grapefruit seed extracts tested was no preservative agent found. However, with this extract, as well as with several self-made extracts from seed and juiceless pulp of grapefruits (Citrus paradisi), no antimicrobial activity could be detected (standard serial broth dilution assay, agar diffusion test). Thus, it is concluded that the potent and nearly universal antimicrobial activity attributed to grapefruit seed extract is merely due to the synthetic preservative agents contained within. Natural products with antimicrobial activity do not appear to be present.

  18. Optimization of DNA extraction and PCR protocols for phylogenetic analysis in Schinopsis spp. and related Anacardiaceae.

    PubMed

    Mogni, Virginia Y; Kahan, Mariano A; de Queiroz, Luciano Paganucci; Vesprini, José L; Ortiz, Juan Pablo A; Prado, Darién E

    2016-01-01

    The Anacardiaceae is an important, worldwide-distributed family of ecological and socio-economic relevance. Notwithstanding that, molecular studies in this family are scarce and problematic because of the particularly high concentrations of secondary metabolites (i.e. tannins and oleoresins) present in almost all tissues of many members of the group, which complicate the purification and amplification of DNA. The objective of this work was to improve an available DNA isolation method for Schinopsis spp. and other related Anacardiaceae, as well as the PCR protocols for amplification of the chloroplast trnL-F, rps16 and ndhF and nuclear ITS-ETS fragments. The modifications proposed allowed the extraction of 70-120 µg of non-degraded genomic DNA per gram of dry tissue, which proved useful for PCR amplification. PCR reactions produced the expected fragments, which could be directly sequenced. Sequence analyses of amplicons showed similarity with the corresponding Schinopsis accessions available in GenBank. The methodology presented here can be routinely applied in molecular studies of the group, aimed at clarifying not only the molecular biology but also the taxonomy and phylogeny of this fascinating group of vascular plants.

  19. Validation of an assay for the determination of cotinine and 3-hydroxycotinine in human saliva using automated solid-phase extraction and liquid chromatography with tandem mass spectrometric detection.

    PubMed

    Bentley, M C; Abrar, M; Kelk, M; Cook, J; Phillips, K

    1999-02-19

    The validation of a high-performance liquid chromatographic method for the simultaneous determination of low level cotinine and 3-hydroxycotinine in human saliva is reported. Analytes and deuterated internal standards were extracted from saliva samples using automated solid-phase extraction, the columns containing a hyper cross-linked styrene-divinylbenzene copolymer sorbent, and analysed by reversed-phase liquid chromatography with tandem mass spectrometric detection (LC-MS-MS). Lower limits of quantitation of 0.05 and 0.10 ng/ml for cotinine and 3-hydroxycotinine, respectively, were achieved. Intra- and inter-batch precision and accuracy values fell within +/-17% for all quality control samples, with the exception of quality control samples prepared at 0.30 ng/ml for 3-hydroxycotinine (inter-day precision 21.1%). Results from the analysis of saliva samples using this assay were consistent with subjects' self-reported environmental tobacco smoke (ETS) exposures, enhancing the applicability of cotinine as a biomarker for the assessment of low level ETS exposure.

  20. Multiresidue trace analysis of pharmaceuticals, their human metabolites and transformation products by fully automated on-line solid-phase extraction-liquid chromatography-tandem mass spectrometry.

    PubMed

    García-Galán, María Jesús; Petrovic, Mira; Rodríguez-Mozaz, Sara; Barceló, Damià

    2016-09-01

    A novel, fully automated analytical methodology based on dual-column liquid chromatography coupled to tandem mass spectrometry (LC-LC-MS(2)) has been developed and validated for the analysis of 12 pharmaceuticals and 20 metabolites and transformation products (TPs) in different types of water (influent and effluent wastewaters and surface water). Two LC columns were used, one for pre-concentration of the sample and the second for separation and analysis, so that water samples could be injected directly into the chromatographic system. Besides the many advantages of the methodology, such as minimization of the required sample volume and of sample manipulation, compounds ionized in both positive and negative mode could be analyzed simultaneously without compromising sensitivity. A comparative study of different mobile phases, gradients and LC pre-concentration columns was carried out to obtain the best analytical performance. Method limits of detection (MLODs) achieved were in the low ng L(-1) range for all the compounds. The method was successfully applied to study the presence of the target analytes in different wastewater and surface water samples collected near the city of Girona (Catalonia, Spain). Data on the environmental presence and fate of pharmaceutical metabolites and TPs are still scarce, highlighting the relevance of the developed methodology.

  1. Analysis of Cocaine, Its Metabolites, Pyrolysis Products, and Ethanol Adducts in Postmortem Fluids and Tissues Using Zymark Automated Solid-Phase Extraction and Gas Chromatography-Mass Spectrometry

    DTIC Science & Technology

    2003-12-01

    NA, et al. Solid-phase extraction and GC/MS quantitation of cocaine, ecgonine methyl ester, benzoylecgonine, and cocaethylene from meconium, whole...blood, and plasma. J Anal Toxicol 1993;17(6):353-8. 21. Browne SP, Tebbett IR, Moore CM, Dusick A, Covert R, Yee GT. Analysis of meconium for...humor. J Anal Toxicol 2000;24(1):59-65. 29. Oyler J, Darwin WD, Preston KL, Suess P, Cone EJ. Cocaine disposition in meconium from newborns of

  2. An Automated Method for Extracting Spatially Varying Time-Dependent Quantities from an ALEGRA Simulation Using VisIt Visualization Software

    DTIC Science & Technology

    2014-07-01

    Visualization software such as VisIt presents an alternative method to examine data through the use of EXODUS databases.3 In addition, VisIt...extracting transient quantities that vary spatially from an EXODUS database using a VisIt macro written in the Python programming language. 2...Graphics; Sandia National Laboratories: Albuquerque, NM, September 1991. Revised April 1994. 3 Schoof, L. A.; Yarberry, V. R. EXODUS II: A Finite

  3. Performance of the Automated Self-Administered 24-hour Recall relative to a measure of true intakes and to an interviewer-administered 24-h recall

    PubMed Central

    Kirkpatrick, Sharon I; Subar, Amy F; Douglass, Deirdre; Zimmerman, Thea P; Thompson, Frances E; Kahle, Lisa L; George, Stephanie M; Dodd, Kevin W; Potischman, Nancy

    2014-01-01

    Background: The Automated Self-Administered 24-hour Recall (ASA24), a freely available Web-based tool, was developed to enhance the feasibility of collecting high-quality dietary intake data from large samples. Objective: The purpose of this study was to assess the criterion validity of ASA24 through a feeding study in which the true intake for 3 meals was known. Design: True intake and plate waste from 3 meals were ascertained for 81 adults by inconspicuously weighing foods and beverages offered at a buffet before and after each participant served him- or herself. Participants were randomly assigned to complete an ASA24 or an interviewer-administered Automated Multiple-Pass Method (AMPM) recall the following day. With the use of linear and Poisson regression analysis, we examined the associations between recall mode and 1) the proportions of items consumed for which a match was reported and that were excluded, 2) the number of intrusions (items reported but not consumed), and 3) differences between energy, nutrient, food group, and portion size estimates based on true and reported intakes. Results: Respondents completing ASA24 reported 80% of items truly consumed compared with 83% in AMPM (P = 0.07). For both ASA24 and AMPM, additions to or ingredients in multicomponent foods and drinks were more frequently omitted than were main foods or drinks. The number of intrusions was higher in ASA24 (P < 0.01). Little evidence of differences by recall mode was found in the gap between true and reported energy, nutrient, and food group intakes or portion sizes. Conclusions: Although the interviewer-administered AMPM performed somewhat better relative to true intakes for matches, exclusions, and intrusions, ASA24 performed well. Given the substantial cost savings that ASA24 offers, it has the potential to make important contributions to research aimed at describing the diets of populations, assessing the effect of interventions on diet, and elucidating diet and health
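
    The match/exclusion/intrusion bookkeeping described above reduces to simple set operations over truly consumed versus reported items. A minimal sketch (the food items are invented, not from the study):

```python
# Hypothetical comparison of truly consumed items vs items reported in a recall
true_items = {"coffee", "milk (in coffee)", "bagel", "butter", "orange juice"}
reported = {"coffee", "bagel", "butter", "apple juice"}

matches = true_items & reported     # consumed and reported
exclusions = true_items - reported  # consumed but omitted from the recall
intrusions = reported - true_items  # reported but not actually consumed

print(len(matches), len(exclusions), len(intrusions))  # 3 2 1
print(round(100 * len(matches) / len(true_items)))     # percent of items matched: 60
```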

  4. Automated Inadvertent Intruder Application

    SciTech Connect

    Koffman, Larry D.; Lee, Patricia L.; Cook, James R.; Wilhite, Elmer L.

    2008-01-15

    The Environmental Analysis and Performance Modeling group of Savannah River National Laboratory (SRNL) conducts performance assessments of the Savannah River Site (SRS) low-level waste facilities to meet the requirements of DOE Order 435.1. These performance assessments, which result in limits on the amounts of radiological substances that can be placed in the waste disposal facilities, consider numerous potential exposure pathways that could occur in the future. One set of exposure scenarios, known as inadvertent intruder analysis, considers the impact on hypothetical individuals who are assumed to inadvertently intrude onto the waste disposal site. Inadvertent intruder analysis considers three distinct scenarios for exposure referred to as the agriculture scenario, the resident scenario, and the post-drilling scenario. Each of these scenarios has specific exposure pathways that contribute to the overall dose for the scenario. For the inadvertent intruder analysis, the calculation of dose for the exposure pathways is a relatively straightforward algebraic calculation that utilizes dose conversion factors. Prior to 2004, these calculations were performed using an Excel spreadsheet. However, design checks of the spreadsheet calculations revealed that errors could be introduced inadvertently when copying spreadsheet formulas cell by cell and finding these errors was tedious and time consuming. This weakness led to the specification of functional requirements to create a software application that would automate the calculations for inadvertent intruder analysis using a controlled source of input parameters. This software application, named the Automated Inadvertent Intruder Application, has undergone rigorous testing of the internal calculations and meets software QA requirements. The Automated Inadvertent Intruder Application was intended to replace the previous spreadsheet analyses with an automated application that was verified to produce the same calculations and

  5. Automated solid-phase extraction-liquid chromatography-tandem mass spectrometry analysis of 6-acetylmorphine in human urine specimens: application for a high-throughput urine analysis laboratory.

    PubMed

    Robandt, P V; Bui, H M; Scancella, J M; Klette, K L

    2010-10-01

    An automated solid-phase extraction-liquid chromatography-tandem mass spectrometry (SPE-LC-MS-MS) method using the Spark Holland Symbiosis Pharma SPE-LC coupled to a Waters Quattro Micro MS-MS was developed for the analysis of 6-acetylmorphine (6-AM) in human urine specimens. The method was linear (R² = 0.9983) to 100 ng/mL, with no carryover at 200 ng/mL. Limits of quantification and detection were found to be 2 ng/mL. Interrun precision calculated as percent coefficient of variation (%CV) and evaluated by analyzing five specimens at 10 ng/mL over nine batches (n = 45) was 3.6%. Intrarun precision evaluated from 0 to 100 ng/mL ranged from 1.0 to 4.4%CV. Other opioids (codeine, morphine, oxycodone, oxymorphone, hydromorphone, hydrocodone, and norcodeine) did not interfere in the detection, quantification, or chromatography of 6-AM or the deuterated internal standard. The quantified values for 41 authentic human urine specimens previously found to contain 6-AM by a validated gas chromatography (GC)-MS method were compared to those obtained by the SPE-LC-MS-MS method. The SPE-LC-MS-MS procedure eliminates the human factors of specimen handling, extraction, and derivatization, thereby reducing labor costs and rework resulting from human error or technique issues. The time required for extraction and analysis was reduced by approximately 50% when compared to a validated 6-AM procedure using manual SPE and GC-MS analysis.
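
    The linearity figure quoted above is the coefficient of determination (R²) of a least-squares calibration line. A hedged sketch of that fit; the calibrator concentrations and response ratios are invented, not the study's data:

```python
def linear_fit_r2(xs, ys):
    """Least-squares line ys ~ a*xs + b and its coefficient of determination R^2."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    b = my - a * mx
    ss_res = sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return a, b, 1 - ss_res / ss_tot

# Hypothetical 6-AM calibrators (ng/mL) vs analyte/internal-standard response ratio
conc = [2, 5, 10, 25, 50, 100]
resp = [0.021, 0.049, 0.103, 0.248, 0.502, 0.998]
a, b, r2 = linear_fit_r2(conc, resp)
print(round(r2, 4))
```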

  6. A semi-automated system for quantifying the oxidative potential of ambient particles in aqueous extracts using the dithiothreitol (DTT) assay: results from the Southeastern Center for Air Pollution and Epidemiology (SCAPE)

    NASA Astrophysics Data System (ADS)

    Fang, T.; Verma, V.; Guo, H.; King, L. E.; Edgerton, E. S.; Weber, R. J.

    2014-07-01

    A variety of methods are used to measure the capability of particulate matter (PM) to catalytically generate reactive oxygen species (ROS) in vivo, also defined as the aerosol oxidative potential. A widely used measure of aerosol oxidative potential is the dithiothreitol (DTT) assay, which monitors the depletion of DTT (a surrogate for cellular antioxidants) as catalyzed by the redox-active species in PM. However, a major constraint in the routine use of the DTT assay for integrating it with the large-scale health studies is its labor-intensive and time-consuming protocol. To specifically address this concern, we have developed a semi-automated system for quantifying the oxidative potential of aerosol liquid extracts using the DTT assay. The system, capable of unattended analysis at one sample per hour, has a high analytical precision (Coefficient of Variation of 12% for standards, 4% for ambient samples), and reasonably low limit of detection (0.31 nmol min-1). Comparison of the automated approach with the manual method conducted on ambient samples yielded good agreement (slope = 1.08 ± 0.12, r2 = 0.92, N = 9). The system was utilized for the Southeastern Center for Air Pollution and Epidemiology (SCAPE) to generate an extensive data set on DTT activity of ambient particles collected from contrasting environments (urban, road-side, and rural) in the southeastern US. We find that water-soluble PM2.5 DTT activity on a per air volume basis was spatially uniform and often well correlated with PM2.5 mass (r = 0.49 to 0.88), suggesting regional sources contributing to the PM oxidative potential in southeast US. However, the greater heterogeneity in the intrinsic DTT activity (per PM mass basis) across seasons indicates variability in the DTT activity associated with aerosols from sources that vary with season. Although developed for the DTT assay, the instrument can also be used to determine oxidative potential with other acellular assays.
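
    The DTT activity quoted above (nmol min-1) is a depletion rate, i.e. the slope of remaining DTT against incubation time. A minimal sketch of that estimate by ordinary least squares; the depletion measurements are invented, not SCAPE data:

```python
def slope(xs, ys):
    """Ordinary least-squares slope of ys against xs."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

# Hypothetical remaining DTT (nmol) measured over incubation time (minutes)
times = [0, 10, 20, 30, 40]
dtt_left = [100.0, 96.1, 92.3, 88.0, 84.2]
# DTT activity is the depletion rate, i.e. the negative of the fitted slope
activity = -slope(times, dtt_left)
print(round(activity, 2))
```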

  7. Quantitative analysis of simvastatin and its beta-hydroxy acid in human plasma using automated liquid-liquid extraction based on 96-well plate format and liquid chromatography-tandem mass spectrometry.

    PubMed

    Zhang, Nanyan; Yang, Amy; Rogers, John Douglas; Zhao, Jamie J

    2004-01-27

    An assay based on automated liquid-liquid extraction (LLE) and liquid chromatography-tandem mass spectrometry (LC/MS/MS) has been developed and validated for the quantitative analysis of simvastatin (SV) and its beta-hydroxy acid (SVA) in human plasma. A Packard MultiProbe II workstation was used to convert human plasma samples collected following administration of simvastatin and quality control (QC) samples from individual tubes into 96-well plate format. The workstation was also used to prepare calibration standards and spike internal standards. A Tomtec Quadra 96-channel liquid handling workstation was used to perform LLE based on 96-well plates including adding solvents, separating organic from aqueous layer and reconstitution. SV and SVA were separated through a Kromasil C18 column (50 mm x 2 mm i.d., 5 microm) and detected by tandem mass spectrometry with a TurboIonspray interface. Stable isotope-labeled SV and SVA, (13)CD(3)-SV and (13)CD(3)-SVA, were used as the internal standards for SV and SVA, respectively. The automated procedures reduced the overall analytical time (96 samples) to 1/3 of that of manual LLE. Most importantly, an analyst spent only a fraction of the time on the 96-well LLE. A limit of quantitation of 50 pg/ml was achieved for both SV and SVA. The interconversion between SV and SVA during the 96-well LLE was found to be negligible. The assay showed very good reproducibility, with intra- and inter-assay precision (%R.S.D.) of less than 7.5%, and accuracy of 98.7-102.3% of nominal values for both analytes. By using this method, sample throughput should be enhanced at least three-fold compared to that of the manual procedure.

  8. Physiological changes in rhizobia after growth in peat extract may be related to improved desiccation tolerance.

    PubMed

    Casteriano, Andrea; Wilkes, Meredith A; Deaker, Rosalind

    2013-07-01

    Improved survival of peat-cultured rhizobia compared to survival of liquid-cultured cells has been attributed to cellular adaptations during solid-state fermentation in moist peat. We have observed improved desiccation tolerance of Rhizobium leguminosarum bv. trifolii TA1 and Bradyrhizobium japonicum CB1809 after aerobic growth in water extracts of peat. Survival of TA1 grown in crude peat extract was 18-fold greater than that of cells grown in a defined liquid medium but was diminished when cells were grown in different-sized colloidal fractions of peat extract. Survival of CB1809 was generally better when grown in crude peat extract than in the control but was not statistically significant (P > 0.05) and was strongly dependent on peat extract concentration. Accumulation of intracellular trehalose by both TA1 and CB1809 was higher after growth in peat extract than in the defined medium control. Cells grown in water extracts of peat exhibit morphological changes similar to those observed after growth in moist peat. Electron microscopy revealed thickened plasma membranes, with an electron-dense material occupying the periplasmic space in both TA1 and CB1809. Growth in peat extract also resulted in changes to polypeptide expression in both strains, and peptide analysis by liquid chromatography-mass spectrometry indicated increased expression of stress response proteins. Our results suggest that increased capacity for desiccation tolerance in rhizobia is multifactorial, involving the accumulation of trehalose together with increased expression of proteins involved in protection of the cell envelope, repair of DNA damage, oxidative stress responses, and maintenance of stability and integrity of proteins.

  9. Feature Extraction of Event-Related Potentials Using Wavelets: An Application to Human Performance Monitoring

    NASA Technical Reports Server (NTRS)

    Trejo, Leonard J.; Shensa, Mark J.; Remington, Roger W. (Technical Monitor)

    1998-01-01

    This report describes the development and evaluation of mathematical models for predicting human performance from discrete wavelet transforms (DWT) of event-related potentials (ERP) elicited by task-relevant stimuli. The DWT was compared to principal components analysis (PCA) for representation of ERPs in linear regression and neural network models developed to predict a composite measure of human signal detection performance. Linear regression models based on coefficients of the decimated DWT predicted signal detection performance with half as many free parameters as comparable models based on PCA scores. In addition, the DWT-based models were more resistant to model degradation due to over-fitting than PCA-based models. Feed-forward neural networks were trained using the backpropagation algorithm to predict signal detection performance based on raw ERPs, PCA scores, or high-power coefficients of the DWT. Neural networks based on high-power DWT coefficients trained with fewer iterations, generalized to new data better, and were more resistant to overfitting than networks based on raw ERPs. Networks based on PCA scores did not generalize to new data as well as either the DWT network or the raw ERP network. The results show that wavelet expansions represent the ERP efficiently and extract behaviorally important features for use in linear regression or neural network models of human performance. The efficiency of the DWT is discussed in terms of its decorrelation and energy compaction properties. In addition, the DWT models provided evidence that a pattern of low-frequency activity (1 to 3.5 Hz) occurring at specific times and scalp locations is a reliable correlate of human signal detection performance.
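The decimated-DWT feature pipeline described above can be sketched with a Haar filter; the filter choice and the trial data below are illustrative assumptions, not the study's actual wavelet or ERP recordings:

```python
import numpy as np

def haar_dwt(signal):
    """Fully decimated Haar DWT (input length assumed a power of two).

    Returns [final approximation, coarsest detail, ..., finest detail].
    """
    coeffs = []
    a = np.asarray(signal, dtype=float)
    while len(a) > 1:
        det = (a[0::2] - a[1::2]) / np.sqrt(2.0)  # detail coefficients
        a = (a[0::2] + a[1::2]) / np.sqrt(2.0)    # approximation coefficients
        coeffs.append(det)
    return np.concatenate([a] + coeffs[::-1])

def top_power_features(X, k):
    """Keep the k coefficients with the highest mean power across trials."""
    idx = np.argsort((X ** 2).mean(axis=0))[::-1][:k]
    return X[:, idx], idx

# Transform each trial (row), then keep only the high-power coefficients
# as regression features; the two toy "trials" here are made up.
trials = np.array([[1.0, 2.0, 3.0, 4.0], [4.0, 3.0, 2.0, 1.0]])
dwt = np.array([haar_dwt(t) for t in trials])
feats, _ = top_power_features(dwt, 2)
```

Fitting `feats` against the performance measure with ordinary least squares (e.g. `np.linalg.lstsq`) then gives a linear model with far fewer free parameters than one built on the full ERP, which mirrors the report's comparison against PCA scores.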

  10. Feature extraction of event-related potentials using wavelets: an application to human performance monitoring

    NASA Technical Reports Server (NTRS)

    Trejo, L. J.; Shensa, M. J.

    1999-01-01

    This report describes the development and evaluation of mathematical models for predicting human performance from discrete wavelet transforms (DWT) of event-related potentials (ERP) elicited by task-relevant stimuli. The DWT was compared to principal components analysis (PCA) for representation of ERPs in linear regression and neural network models developed to predict a composite measure of human signal detection performance. Linear regression models based on coefficients of the decimated DWT predicted signal detection performance with half as many free parameters as comparable models based on PCA scores. In addition, the DWT-based models were more resistant to model degradation due to over-fitting than PCA-based models. Feed-forward neural networks were trained using the backpropagation algorithm to predict signal detection performance based on raw ERPs, PCA scores, or high-power coefficients of the DWT. Neural networks based on high-power DWT coefficients trained with fewer iterations, generalized to new data better, and were more resistant to overfitting than networks based on raw ERPs. Networks based on PCA scores did not generalize to new data as well as either the DWT network or the raw ERP network. The results show that wavelet expansions represent the ERP efficiently and extract behaviorally important features for use in linear regression or neural network models of human performance. The efficiency of the DWT is discussed in terms of its decorrelation and energy compaction properties. In addition, the DWT models provided evidence that a pattern of low-frequency activity (1 to 3.5 Hz) occurring at specific times and scalp locations is a reliable correlate of human signal detection performance. Copyright 1999 Academic Press.

  11. Rapid and automated analysis of aflatoxin M1 in milk and dairy products by online solid phase extraction coupled to ultra-high-pressure-liquid-chromatography tandem mass spectrometry.

    PubMed

    Campone, Luca; Piccinelli, Anna Lisa; Celano, Rita; Pagano, Imma; Russo, Mariateresa; Rastrelli, Luca

    2016-01-08

    This study reports a fast and automated analytical procedure for the analysis of aflatoxin M1 (AFM1) in milk and dairy products. The method is based on simultaneous protein precipitation and AFM1 extraction by salt-induced liquid-liquid extraction (SI-LLE), followed by online solid-phase extraction (online SPE) coupled to ultra-high-pressure liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) analysis for the automatic pre-concentration, clean-up, and sensitive and selective determination of AFM1. The main parameters affecting the extraction efficiency and accuracy of the analytical method were studied in detail. Under the optimal conditions, acetonitrile and NaCl were used as extraction/denaturant solvent and salting-out agent in SI-LLE, respectively. After centrifugation, the organic phase (acetonitrile) was diluted with water (1:9 v/v) and purified (1 mL) by an online C18 cartridge coupled with a UHPLC column. Finally, selected reaction monitoring (SRM) acquisition mode was applied to the detection of AFM1. Validation studies were carried out on different dairy products (whole and skimmed cow milk, yogurt, goat milk, and powder infant formula), providing method quantification limits about 25 times lower than the AFM1 maximum levels permitted by EU regulation 1881/2006 in milk and dairy products for direct human consumption. Recoveries (86-102%) and repeatability (RSD < 3%, n = 6) meet the performance criteria required by EU regulation N. 401/2006 for the determination of the levels of mycotoxins in foodstuffs. Moreover, no matrix effects were observed in the different milk and dairy products studied. The proposed method improves the performance of AFM1 analysis in milk samples, as AFM1 determination is performed with a degree of accuracy higher than that of conventional methods. Other advantages are the reduced sample preparation, time, and cost of the analysis, enabling a high sample throughput that meets the current concerns of food safety and the public

  12. [Relations between extraction of wisdom teeth and temporomandibular disorders: a case/control study].

    PubMed

    Duval, Florian; Leroux, Agathe; Bertaud, Valérie; Meary, Fleur; Le Padellec, Clément; Refuveille, Laura; Lemaire, Arnaud; Sorel, Olivier; Chauvel-Lebret, Dominique

    2015-09-01

    The aim of this study was to assess the impact of extraction of third molars on the occurrence of temporomandibular disorders (TMD). A review of the literature and a case-control study were conducted. The case-control study compared the frequency of extraction of third molars between the sample with TMD (cases) and the sample without TMD (controls). The proportion of patients who had undergone extraction of wisdom teeth was higher in the case group than in the control group. The difference was statistically significant when patients had undergone extraction of all four wisdom teeth, or when the four wisdom teeth were extracted in one sitting or under general anesthesia. The study of patients in the case sample showed that all signs of TMD were more common in patients who had undergone extractions in several sessions and under local anesthesia. Temporomandibular joint sounds were significantly more frequent with local anesthesia. In the case group, 85 to 92% of patients had parafunctions and 5 to 11% had malocclusion. This demonstrates the multifactorial etiology of temporomandibular disorders.

  13. An automated method for measurement of methoxetamine in human plasma by use of turbulent flow on-line extraction coupled with liquid chromatography and mass spectrometric detection.

    PubMed

    Abe, Emuri; Ricard, Florian; Darrouzain, François; Alvarez, Jean Claude

    2013-01-01

    Methoxetamine is a new ketamine-derivative designer drug which has recently become available via the Internet, marketed as "legal ketamine". It is a new dissociative recreational drug, acting as an NMDA receptor antagonist and dopamine reuptake inhibitor. The objective of this study was to develop on-line automated sample preparation using a TurboFlow device coupled with liquid chromatography and ion-trap mass spectrometric detection for measurement of methoxetamine in human plasma. Samples (100 μL) were vortex mixed with internal standard solution (ketamine-d4 in acetonitrile). After centrifugation, 20 μL of the supernatant was injected on to a 50 mm × 0.5-mm C18XL TurboFlow column. The retained analytes were then back-flushed on to a 50 mm × 3-mm (3 μm) Hypersil Gold analytical column for chromatographic separation, then eluted with a formate buffer-acetonitrile gradient. Methoxetamine and the IS were ionized by electrospray in positive mode. Parent [M + H](+) ions were m/z 248.1 for methoxetamine and m/z 242.0 for the IS. The most intense product ions from methoxetamine (m/z 203.0) and the IS (m/z 224.0) were used for quantification. The assay was accurate (96.8-108.8% range) and precise (intra- and inter-day coefficients of variation <8.8%) over the range of 2.0 (lower limit of quantification) to 1000.0 ng mL(-1) (upper limit of quantification). No matrix effect was observed. This method has been successfully applied to determination of plasma concentrations of methoxetamine in the first French hospitalization case report after acute intoxication; the plasma concentration was 136 ng mL(-1).

  14. Automated statistical experimental design approach for rapid separation of coenzyme Q10 and identification of its biotechnological process related impurities using UHPLC and UHPLC-APCI-MS.

    PubMed

    Talluri, Murali V N Kumar; Kalariya, Pradipbhai D; Dharavath, Shireesha; Shaikh, Naeem; Garg, Prabha; Ramisetti, Nageswara Rao; Ragampeta, Srinivas

    2016-09-01

    A novel ultra-high performance liquid chromatography method development strategy was devised by applying a quality-by-design approach. The systematic approach was divided into five steps: (i) analytical target profile, (ii) critical quality attributes, (iii) risk assessment of critical parameters using design of experiments (screening and optimization phases), (iv) generation of the design space, and (v) process capability analysis (Cp) for the robustness study using Monte Carlo simulation. The complete quality-by-design-based method development was automated and expedited by employing a sub-2 μm particle column with an ultra-high performance liquid chromatography system. Successful chromatographic separation of coenzyme Q10 from its biotechnological process-related impurities was achieved on a Waters Acquity phenyl hexyl (100 mm × 2.1 mm, 1.7 μm) column with gradient elution of 10 mM ammonium acetate buffer (pH 4.0) and a mixture of acetonitrile/2-propanol (1:1) as the mobile phase. Through this study, a fast and organized method development workflow was established, and the robustness of the method was also demonstrated. The method was validated for specificity, linearity, accuracy, precision, and robustness in compliance with the International Conference on Harmonization Q2(R1) guidelines. The impurities were identified by the atmospheric pressure chemical ionization-mass spectrometry technique. Further, the in silico toxicity of the impurities was analyzed using TOPKAT and DEREK software.

  15. The automation of science.

    PubMed

    King, Ross D; Rowland, Jem; Oliver, Stephen G; Young, Michael; Aubrey, Wayne; Byrne, Emma; Liakata, Maria; Markham, Magdalena; Pir, Pinar; Soldatova, Larisa N; Sparkes, Andrew; Whelan, Kenneth E; Clare, Amanda

    2009-04-03

    The basis of science is the hypothetico-deductive method and the recording of experiments in sufficient detail to enable reproducibility. We report the development of Robot Scientist "Adam," which advances the automation of both. Adam has autonomously generated functional genomics hypotheses about the yeast Saccharomyces cerevisiae and experimentally tested these hypotheses by using laboratory automation. We have confirmed Adam's conclusions through manual experiments. To describe Adam's research, we have developed an ontology and logical language. The resulting formalization involves over 10,000 different research units in a nested treelike structure, 10 levels deep, that relates the 6.6 million biomass measurements to their logical description. This formalization describes how a machine contributed to scientific knowledge.

  16. The automated command transmission

    NASA Astrophysics Data System (ADS)

    Inoue, Y.; Satoh, S.

    A technique for automated command transmission (ACT) to geostationary satellites is presented. The system is intended to ease the command center workload. The ACT system determines the relation of commands to on-board units, connects the telemetry with on-board units, defines the control path on the spacecraft, identifies the correspondence of back-up units to primary units, and ascertains sunlight or eclipse conditions. The system also holds the address of the satellite and command decoders, the ID and content of the mission command sequence, and group and inhibit codes; it maintains a listing of all available commands and restricts the data to a command sequence. Telemetry supplies data for automated problem correction. All other mission operations are terminated during system recovery data processing after a crash. The ACT system is intended for use with the GMS spacecraft.

  17. A Shortest Dependency Path Based Convolutional Neural Network for Protein-Protein Relation Extraction

    PubMed Central

    Quan, Chanqin

    2016-01-01

    The state-of-the-art methods for protein-protein interaction (PPI) extraction are primarily based on kernel methods, and their performance strongly depends on handcrafted features. In this paper, we tackle PPI extraction by using convolutional neural networks (CNN) and propose a shortest dependency path based CNN (sdpCNN) model. The proposed method (1) takes only the sdp and word embeddings as input and (2) avoids bias from feature selection by using a CNN. We performed experiments on the standard AIMed and BioInfer datasets, and the experimental results demonstrated that our approach outperformed state-of-the-art kernel-based methods. In particular, by examining the sdpCNN model, we find that sdpCNN could extract key features automatically, and it is verified that pretrained word embeddings are crucial to the PPI task. PMID:27493967
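The sdp input described above can be illustrated with a plain breadth-first search over a dependency graph; the toy parse of "PROT1 interacts with PROT2" below is an assumed example, not output of the parser used in the paper:

```python
from collections import deque

def shortest_dep_path(edges, start, goal):
    """BFS shortest path; edges are (head, dependent) pairs, treated as undirected."""
    adj = {}
    for head, dep in edges:
        adj.setdefault(head, set()).add(dep)
        adj.setdefault(dep, set()).add(head)
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in adj.get(path[-1], ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # the two mentions are not connected

# Toy dependency parse (assumed): "interacts" governs "PROT1" and "with",
# and "with" governs "PROT2".
edges = [("interacts", "PROT1"), ("interacts", "with"), ("with", "PROT2")]
sdp = shortest_dep_path(edges, "PROT1", "PROT2")
# sdp == ['PROT1', 'interacts', 'with', 'PROT2']
```

In the sdpCNN pipeline, the tokens along this path would then be mapped to (pretrained) word embeddings and fed to the convolutional layer.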

  18. Automated Desalting Apparatus

    NASA Technical Reports Server (NTRS)

    Spencer, Maegan K.; Liu, De-Ling; Kanik, Isik; Beegle, Luther

    2010-01-01

    Because salt and metals can mask the signature of a variety of organic molecules (like amino acids) in any given sample, an automated system to purify complex field samples has been created for the analytical techniques of electrospray ionization/mass spectrometry (ESI/MS), capillary electrophoresis (CE), and biological assays where unique identification requires at least some processing of complex samples. This development allows for automated sample preparation in the laboratory and analysis of complex samples in the field with multiple types of analytical instruments. Rather than using tedious, exacting protocols for desalting samples by hand, this innovation, called the Automated Sample Processing System (ASPS), takes analytes that have been extracted through high-temperature solvent extraction and introduces them into the desalting column. After 20 minutes, the eluent is produced. This clear liquid can then be directly analyzed by the techniques listed above. The current apparatus, including the computer and power supplies, is sturdy, has an approximate mass of 10 kg and a volume of about 20 × 20 × 20 cm, and is undergoing further miniaturization. This system currently targets amino acids. For these molecules, a slurry of 1 g cation exchange resin in deionized water is packed into a column of the apparatus. Initial generation of the resin is done by sequentially flowing 2.3 bed volumes of 2N NaOH and 2N HCl (1 mL each) to rinse the resin, followed by 0.5 mL of deionized water. This makes the pH of the resin near neutral, and eliminates cross-sample contamination. Afterward, 2.3 mL of extracted sample is loaded into the column onto the top of the resin bed. Because the column is packed tightly, the sample can be applied without disturbing the resin bed. This is a vital step needed to ensure that the analytes adhere to the resin. After the sample is drained, oxalic acid (1 mL, pH 1.6-1.8, adjusted with NH4OH) is pumped into the column. Oxalic acid works as a

  19. Detection of Staphylococcus aureus enterotoxin production genes from patient samples using an automated extraction platform and multiplex real-time PCR.

    PubMed

    Chiefari, Amy K; Perry, Michael J; Kelly-Cirino, Cassandra; Egan, Christina T

    2015-12-01

    To minimize specimen volume, handling and testing time, we have developed two TaqMan(®) multiplex real-time PCR (rtPCR) assays to detect staphylococcal enterotoxin A-E and toxic shock syndrome toxin production genes directly from clinical patient stool specimens, utilizing a novel lysis extraction process in parallel with the Roche MagNA Pure Compact. These assays are specific, sensitive and reliable for the detection of the staphylococcal enterotoxin-encoding genes and the tst1 gene from known toxin-producing strains of Staphylococcus aureus. Specificity was determined by testing a total of 47 microorganism strains, including 8 previously characterized staphylococcal enterotoxin-producing strains, against each rtPCR target. Sensitivity for these assays ranges from 1 to 25 cfu per rtPCR reaction for cultured isolates and from 8 to 20 cfu per rtPCR reaction for the clinical stool matrix.

  20. Automated 3-D extraction and evaluation of the inner and outer cortical surfaces using a Laplacian map and partial volume effect classification.

    PubMed

    Kim, June Sic; Singh, Vivek; Lee, Jun Ki; Lerch, Jason; Ad-Dab'bagh, Yasser; MacDonald, David; Lee, Jong Min; Kim, Sun I; Evans, Alan C

    2005-08-01

    Accurate reconstruction of the inner and outer cortical surfaces of the human cerebrum is a critical objective for a wide variety of neuroimaging analysis purposes, including visualization, morphometry, and brain mapping. The Anatomic Segmentation using Proximity (ASP) algorithm, previously developed by our group, provides a topology-preserving cortical surface deformation method that has been extensively used for the aforementioned purposes. However, constraints in the algorithm to ensure topology preservation occasionally produce incorrect thickness measurements due to a restriction in the range of allowable distances between the gray and white matter surfaces. This problem is particularly prominent in pediatric brain images with tightly folded gyri. This paper presents a novel method for improving the conventional ASP algorithm by making use of partial volume information through probabilistic classification in order to allow for topology preservation across a less restricted range of cortical thickness values. The new algorithm also corrects the classification of the insular cortex by masking out subcortical tissues. For 70 pediatric brains, validation experiments for the modified algorithm, Constrained Laplacian ASP (CLASP), were performed by three methods: (i) volume matching between surface-masked gray matter (GM) and conventional tissue-classified GM, (ii) surface matching between simulated and CLASP-extracted surfaces, and (iii) repeatability of the surface reconstruction among 16 MRI scans of the same subject. In the volume-based evaluation, the volume enclosed by the CLASP WM and GM surfaces matched the classified GM volume 13% more accurately than using conventional ASP. In the surface-based evaluation, using synthesized thick cortex, the average difference between simulated and extracted surfaces was 4.6 +/- 1.4 mm for conventional ASP and 0.5 +/- 0.4 mm for CLASP. 
    In a repeatability study, CLASP produced a 30% lower RMS error for the GM surface and a 8
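The Laplacian-map idea behind CLASP can be sketched on a toy 2-D grid: fix the inner (white matter) boundary at 0 and the outer (pial) boundary at 1, then relax Laplace's equation between them. The grid size and iteration count here are illustrative assumptions, not the algorithm's actual parameters:

```python
import numpy as np

def laplace_map(mask_inner, mask_outer, iters=500):
    """Jacobi relaxation of a potential field between two fixed boundaries (2-D)."""
    phi = np.where(mask_outer, 1.0, 0.0)   # outer boundary = 1, everything else = 0
    interior = ~(mask_inner | mask_outer)
    for _ in range(iters):
        nb = (np.roll(phi, 1, 0) + np.roll(phi, -1, 0) +
              np.roll(phi, 1, 1) + np.roll(phi, -1, 1)) / 4.0
        phi = np.where(interior, nb, phi)  # update interior points only
    return phi

# A thin "cortical ribbon": left column plays the inner surface, right column
# the outer surface; the potential rises smoothly from 0 to 1 in between.
inner = np.zeros((4, 6), dtype=bool); inner[:, 0] = True
outer = np.zeros((4, 6), dtype=bool); outer[:, -1] = True
phi = laplace_map(inner, outer)
```

Following the gradient of `phi` from one boundary to the other yields streamlines whose lengths give a thickness estimate, which is how Laplacian-based cortical thickness is commonly defined.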

  1. AIDS-related non-Hodgkin's lymphoma presenting as delayed healing of an extraction wound.

    PubMed

    Nittayananta, W; Chungpanich, S; Pongpanich, S; Mitarnun, W

    1996-08-10

    Non-Hodgkin's lymphoma (NHL) of the oral cavity frequently occurs in patients infected with human immunodeficiency virus (HIV). This report describes a lesion presenting as delayed healing of an extraction wound with hyperaemic swollen gingivae and ulceration in an apparently healthy 34-year-old Thai fisherman. The lesion was the first evidence of his HIV-positivity. It is, therefore, imperative that clinicians should consider a diagnosis of HIV infection in cases of non-healing extraction wounds in patients in high risk categories.

  2. Inhibitive Effects of Mulberry Leaf-Related Extracts on Cell Adhesion and Inflammatory Response in Human Aortic Endothelial Cells

    PubMed Central

    Chao, P.-Y.; Lin, K.-H.; Chiu, C.-C.; Yang, Y.-Y.; Huang, M.-Y.; Yang, C.-M.

    2013-01-01

    Effects of mulberry leaf-related extracts (MLREs) on hydrogen peroxide-induced DNA damage in human lymphocytes and on inflammatory signaling pathways in human aortic endothelial cells (HAECs) were studied. The tested MLREs were rich in flavonols, especially bombyx faeces tea (BT) in quercetin and kaempferol. Polyphenols, flavonoids, and anthocyanidin also abounded in BT. The highest trolox equivalent antioxidant capacity (TEAC) was obtained from the acidic methanolic extracts of BT. Acidic methanolic and water extracts of mulberry leaf tea (MT), mulberry leaf (M), and BT significantly inhibited DNA oxidative damage to lymphocytes based on the comet assay, as compared to the H2O2-treated group. TNF-α-induced monocyte-endothelial cell adhesion was significantly suppressed by MLREs. Additionally, nuclear factor kappa B (NF-κB) expression was significantly reduced by BT and MT. Significant reductions were also observed in both NF-κB and activator protein (AP)-1 DNA binding by MLREs. Significant increases in peroxisome proliferator-activated receptor (PPAR) α and γ DNA binding by MLREs were also detected in M and MT extracts, but no evidence for PPAR α DNA binding in 50 μg/mL MT extract was found. Apparently, MLREs can provide distinct cytoprotective mechanisms that may contribute to their putative beneficial effects on suppressing endothelial responses to cytokines during inflammation. PMID:24371453

  3. Algorithms Could Automate Cancer Diagnosis

    NASA Technical Reports Server (NTRS)

    Baky, A. A.; Winkler, D. G.

    1982-01-01

    Five new algorithms form a complete statistical procedure for quantifying cell abnormalities from digitized images. The procedure could be the basis for automated detection and diagnosis of cancer. The objective of the procedure is to assign each cell an atypia status index (ASI), which quantifies its level of abnormality. It is possible that ASI values will be accurate and economical enough to allow diagnoses to be made quickly and accurately by computer processing of laboratory specimens extracted from patients.

  4. Validation of a sensitive and automated 96-well solid-phase extraction liquid chromatography-tandem mass spectrometry method for the determination of desloratadine and 3-hydroxydesloratadine in human plasma.

    PubMed

    Yang, Liyu; Clement, Robert P; Kantesaria, Bhavna; Reyderman, Larisa; Beaudry, Francis; Grandmaison, Charles; Di Donato, Lorella; Masse, Robert; Rudewicz, Patrick J

    2003-07-25

    To support clinical development, a liquid chromatographic-tandem mass spectrometric (LC-MS-MS) method was developed and validated for the determination of desloratadine (descarboethoxyloratadine) and 3-OH desloratadine (3-hydroxydescarboethoxyloratadine) concentrations in human plasma. The method consisted of automated 96-well solid-phase extraction for sample preparation and liquid chromatography/turbo ionspray tandem mass spectrometry for analysis. [2H(4)]Desloratadine and [2H(4)]3-OH desloratadine were used as internal standards (I.S.). A quadratic regression (weighted 1/concentration^2) gave the best fit for calibration curves over the concentration range of 25-10000 pg/ml for both desloratadine and 3-OH desloratadine. There was no interference from endogenous components in the blank plasma tested. The accuracy (%bias) at the lower limit of quantitation (LLOQ) was -12.8 and +3.4% for desloratadine and 3-OH desloratadine, respectively. The precision (%CV) for samples at the LLOQ was 15.1 and 10.9% for desloratadine and 3-OH desloratadine, respectively. For quality control samples at 75, 1000 and 7500 pg/ml, the between-run %CV and the stability of the analytes in extracts (up to 185 h at 5 degrees C) met acceptance criteria. This LC-MS-MS method for the determination of desloratadine and 3-OH desloratadine in human plasma met regulatory requirements for selectivity, sensitivity, goodness of fit, precision, accuracy and stability.
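The quadratic, 1/concentration²-weighted calibration mentioned above can be sketched as a weighted least-squares fit. The weighting convention (weights 1/x² applied to squared residuals) and the back-calculation step are standard bioanalytical practice; any specific numbers used with this sketch are illustrative, not assay data:

```python
import numpy as np

def fit_weighted_quadratic(x, y):
    """Weighted least squares for y = a*x^2 + b*x + c with weights 1/x^2."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    rw = 1.0 / x                                    # sqrt of the 1/x^2 weights
    A = np.column_stack([x**2, x, np.ones_like(x)])
    coef, *_ = np.linalg.lstsq(A * rw[:, None], y * rw, rcond=None)
    return coef                                     # (a, b, c)

def back_calculate(a, b, c, response):
    """Concentration from response: positive root of a*x^2 + b*x + (c - y) = 0."""
    disc = b**2 - 4.0 * a * (c - response)
    return (-b + np.sqrt(disc)) / (2.0 * a)
```

The 1/x² weighting keeps the relative (rather than absolute) error roughly constant across the curve, which is why low-end accuracy at the LLOQ stays acceptable over a 400-fold concentration range.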

  5. A multiplex real-time PCR-platform integrated into automated extraction method for the rapid detection and measurement of oncogenic HPV type-specific viral DNA load from cervical samples.

    PubMed

    Broccolo, Francesco

    2014-01-01

    The persistent infection with the most frequent high-risk (HR)-HPV types (HPV-16, -18, -31, -33, -45, -52, and -58) is considered to be the true precursor of neoplastic progression. HR-HPV detection and genotyping is the most effective and accurate approach in screening for early cervical lesions and cervical cancer, although the HR-HPV DNA load is also considered an ancillary marker for persistent HPV infection. Here, an in-house multiplex quantitative real-time PCR (qPCR)-based typing system for the rapid detection and quantitation of the most common HR-HPV genotypes from cervical cytology screening tests is described. First, a separate qPCR assay to quantify a single-copy gene is recommended prior to screening (prescreening assay) to verify the adequate cellularity of the sample and the quality of the extracted DNA, and to normalize the HPV copy number per genomic DNA equivalent in the sample. Subsequently, to minimize the number of reactions, two multiplex qPCR assays (first-line screening) are performed to detect and quantify HPV-16, -18, -31, -33, -45, -52, and -58 (HPV-18 and -45 are measured together with a single fluorophore). In addition, a multiplex qPCR assay specific for HPV-18 and HPV-45 is also available to precisely type the samples found to be positive for one of the two strains. Finally, two nucleic acid extraction methods are proposed using a 96-well plate format: one manual method (supported by centrifuge or by vacuum) and one automated method integrated into a robotic liquid handler workstation to minimize material and hands-on time. In conclusion, this system provides a reliable high-throughput method for the rapid detection and quantitation of the HR-HPV DNA load in cervical samples.
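The prescreening and normalization steps described above amount to standard-curve arithmetic: fit Ct against log10 copies over a dilution series, convert sample Ct values to copy numbers, and express HPV load per genomic equivalent (two copies of a single-copy gene per diploid cell). All numbers used with this sketch are illustrative assumptions:

```python
import numpy as np

def standard_curve(log10_copies, ct):
    """Fit Ct = slope * log10(copies) + intercept over a dilution series."""
    slope, intercept = np.polyfit(log10_copies, ct, 1)
    return slope, intercept

def copies_from_ct(ct, slope, intercept):
    """Invert the standard curve to get an absolute copy number."""
    return 10.0 ** ((ct - intercept) / slope)

def hpv_load_per_cell(hpv_copies, cell_gene_copies):
    """Normalize HPV copies per cell: two single-copy-gene copies per diploid genome."""
    return hpv_copies / (cell_gene_copies / 2.0)

def efficiency(slope):
    """Amplification efficiency implied by the curve slope (1.0 = 100%)."""
    return 10.0 ** (-1.0 / slope) - 1.0
```

A slope near -3.32 corresponds to ~100% amplification efficiency, which is the usual acceptance check before using such a curve for quantitation.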

  6. Prevention of medication-related osteonecrosis of the jaws secondary to tooth extractions. A systematic review

    PubMed Central

    Limeres, Jacobo

    2016-01-01

    Background A study was made to identify the most effective protocol for reducing the risk of osteonecrosis of the jaws (ONJ) following tooth extraction in patients subjected to treatment with antiresorptive or antiangiogenic drugs. Material and Methods A MEDLINE and SCOPUS search (January 2003 - March 2015) was made with the purpose of conducting a systematic literature review based on the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) guidelines. All articles contributing information on tooth extractions in patients treated with oral or intravenous antiresorptive or antiangiogenic drugs were included. Results Only 13 of the 380 selected articles were finally included in the review: 11 and 5 of them offered data on patients treated with intravenous and oral bisphosphonates, respectively. No randomized controlled trials were found; all publications corresponded to case series or cohort studies. The prevalence of ONJ in the patients treated with intravenous and oral bisphosphonates was 6.9% (range 0-34.7%) and 0.47% (range 0-2.5%), respectively. The main preventive measures comprised local and systemic infection control. Conclusions No conclusive scientific evidence is available to date on the efficacy of ONJ prevention protocols in patients treated with antiresorptive or antiangiogenic drugs subjected to tooth extraction. Key words: Bisphosphonates, angiogenesis inhibitors, antiresorptive drugs, extraction, osteonecrosis. PMID:26827065

  7. [Aspects related to extraction and preservation in 60 cases of liver transplant].

    PubMed

    Mora, N P; Turrión, V S; Pereira, F; Herrera, J; Murcia, J; Vázquez, J; De Vicente, E; Ardaiz, J

    1989-02-01

    Extraction and preservation are of special interest in any liver transplant program. The viability and correct early function of the graft are determinant factors of the success or failure of the transplant. Application of a restrictive criterion in the acceptance of donor livers has allowed us to achieve an optimal viability (96.7%) in our first 60 cases of liver transplant.

  8. The relative allergenicity of Stachybotrys chartarum compared to house dust mite extracts in a mouse model

    EPA Science Inventory

    A report by the Institute of Medicine suggested that more research is needed to better understand mold effects on allergic disease, particularly asthma development. The authors compared the ability of the fungus Stachybotrys chartarum (SCE) and house dust mite (HDM) extracts to i...

  9. Cinnamon polyphenol extract regulates tristetraprolin and related gene expression in mouse adipocytes

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Cinnamon (Cinnamomum verum) has been widely used in spices, flavoring agents, and preservatives. Cinnamon polyphenol extract (CPE) may be important in the alleviation of chronic diseases, but the molecular evidence is not substantial. Tristetraprolin (TTP) family proteins have anti-inflammatory ef...

  10. On-line coupling of automated solid-phase extraction with high-performance liquid chromatography and electrochemical detection. Quantitation of oxidizable drugs of abuse and their metabolites in plasma and urine.

    PubMed

    Krämer, E; Kovar, K A

    1999-08-20

    The concentration effect of automated on-line solid-phase extraction (SPE) in combination with HPLC and very sensitive electrochemical detection was employed for the determination of N-ethyl-4-hydroxy-3-methoxy-amphetamine (HMEA, the main metabolite of the ecstasy analogue MDE), delta 9-tetrahydrocannabinol (THC) and 11-nor-delta 9-tetrahydrocannabinol-carboxylic acid (THC-COOH) in plasma and urine, in comparison to a previously published psilocin assay. For the SPE, either CBA (functional group: carboxypropyl) or CH (functional group: cyclohexyl) sorbent was used. The subsequent separation was carried out on a reversed-phase column (LiChroCart, Superspher 60 RP select B from Merck). Depending on the hydrodynamic voltammogram of the analyzed substance, the oxidation potential varied from 920 mV up to 1.2 V. In spite of the high potentials used, precision and accuracy were always within the accepted statistical requirements. The limits of quantitation were between 5 ng/ml (THC, THC-COOH in plasma) and 20 ng/ml (HMEA in plasma). Advantages of on-line SPE in comparison with off-line methods were less manual effort, markedly smaller volumes (< or = 400 microliters) of plasma or urine, and almost always higher recovery rates (> 93%). The assays have been successfully proven with real biological samples and found suitable for use in routine analysis.

  11. Automated liquid-liquid extraction based on 96-well plate format in conjunction with ultra-performance liquid chromatography tandem mass spectrometry (UPLC-MS/MS) for the quantitation of methoxsalen in human plasma.

    PubMed

    Yadav, Manish; Contractor, Pritesh; Upadhyay, Vivek; Gupta, Ajay; Guttikar, Swati; Singhal, Puran; Goswami, Sailendra; Shrivastav, Pranav S

    2008-09-01

    A sensitive, specific and high-throughput bioanalytical method using automated sample processing via 96-well plate liquid-liquid extraction and ultra-performance liquid chromatography tandem mass spectrometry (UPLC-MS/MS) has been developed for the determination of methoxsalen in human plasma. Plasma samples (0.2 mL), with ketoconazole as internal standard (IS), were extracted with ethyl acetate:dichloromethane (80:20, v/v). The chromatographic separation was achieved on a Waters Acquity UPLC BEH C18 column using an isocratic mobile phase consisting of 10 mM ammonium formate and acetonitrile (60:40, v/v) at a flow rate of 0.5 mL/min. The linear dynamic range was established over the concentration range 1.1-213.1 ng/mL for methoxsalen. The method was rugged and rapid, with a total run time of 1.5 min. It was successfully applied to a pivotal bioequivalence study in 12 healthy human subjects after oral administration of a 10 mg extended-release methoxsalen formulation under fasting conditions.

  12. Bioactive compounds extracted from Indian wild legume seeds: antioxidant and type II diabetes-related enzyme inhibition properties.

    PubMed

    Gautam, Basanta; Vadivel, Vellingiri; Stuetz, Wolfgang; Biesalski, Hans K

    2012-03-01

    Seven different wild legume seeds (Acacia leucophloea, Bauhinia variegata, Canavalia gladiata, Entada scandens, Mucuna pruriens, Sesbania bispinosa and Tamarindus indica) from various parts of India were analyzed for total free phenolics, l-Dopa (l-3,4-dihydroxyphenylalanine), phytic acid, their antioxidant capacity (ferric-reducing antioxidant power [FRAP] and 2,2-diphenyl-1-picrylhydrazyl [DPPH] assays) and type II diabetes-related enzyme inhibition activity (α-amylase). S. bispinosa had the highest content of both total free phenolics and l-Dopa, and relatively low phytic acid compared with the other seeds. Phytic acid content, highest in E. scandens, M. pruriens and T. indica, was highly predictive for the FRAP (r = 0.47, p < 0.05) and DPPH (r = 0.66, p < 0.001) assays. The phenolic extract from T. indica and the l-Dopa extract from E. scandens showed significantly higher FRAP values than the others. All seed extracts demonstrated a remarkable reducing power (7-145 mM FeSO4 per mg extract), DPPH radical scavenging activity (16-95%) and α-amylase inhibition activity (28-40%).

  13. Exploiting the UMLS Metathesaurus for extracting and categorizing concepts representing signs and symptoms to anatomically related organ systems

    PubMed Central

    Tran, Le-Thuy T.; Divita, Guy; Carter, Marjorie E.; Judd, Joshua; Samore, Matthew H.; Gundlapalli, Adi V.

    2016-01-01

    Objective To develop a method to exploit the UMLS Metathesaurus for extracting and categorizing concepts found in clinical text representing signs and symptoms to anatomically related organ systems. The overarching goal is to classify patient reported symptoms to organ systems for population health and epidemiological analyses. Materials and methods Using the concepts’ semantic types and the inter-concept relationships as guidance, a selective portion of the concepts within the UMLS Metathesaurus was traversed starting from the concepts representing the highest level organ systems. The traversed concepts were chosen, filtered, and reviewed to obtain the concepts representing clinical signs and symptoms by blocking deviations, pruning superfluous concepts, and manual review. The mapping process was applied to signs and symptoms annotated in a corpus of 750 clinical notes. Results The mapping process yielded a total of 91,000 UMLS concepts (with approximately 300,000 descriptions) possibly representing physical and mental signs and symptoms that were extracted and categorized to the anatomically related organ systems. Of 1864 distinct descriptions of signs and symptoms found in the 750 document corpus, 1635 of these (88%) were successfully mapped to the set of concepts extracted from the UMLS. Of 668 unique concepts mapped, 603 (90%) were correctly categorized to their organ systems. Conclusion We present a process that facilitates mapping of signs and symptoms to their organ systems. By providing a smaller set of UMLS concepts to use for comparing and matching patient records, this method has the potential to increase efficiency of information extraction pipelines. PMID:26362345
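    The traversal described above — starting from top-level organ-system concepts, walking narrower relations, and pruning blocked deviations — can be sketched with a toy concept graph (all concept names and the graph structure here are hypothetical simplifications; the real UMLS Metathesaurus is far larger and uses explicit relationship tables):

```python
from collections import deque

# Hypothetical miniature concept graph: concept -> narrower concepts.
graph = {
    "cardiovascular_system": ["chest_pain", "palpitations"],
    "respiratory_system": ["dyspnea", "cough"],
    "cough": ["dry_cough"],
    "chest_pain": [], "palpitations": [], "dyspnea": [], "dry_cough": [],
}
blocked = {"dyspnea"}  # concepts pruned as deviations during review


def categorize(graph, roots, blocked):
    """Breadth-first traversal from each organ-system root; every reachable,
    non-blocked concept is categorized to the root it was reached from."""
    mapping = {}
    for root in roots:
        queue = deque(graph.get(root, []))
        while queue:
            concept = queue.popleft()
            if concept in blocked or concept in mapping:
                continue  # prune deviations; keep first categorization
            mapping[concept] = root
            queue.extend(graph.get(concept, []))
    return mapping


m = categorize(graph, ["cardiovascular_system", "respiratory_system"], blocked)
# e.g. "dry_cough" is categorized to "respiratory_system" via "cough",
# while the blocked "dyspnea" is excluded.
```

A matcher over extracted sign/symptom mentions would then only need to compare against this much smaller categorized set, which is the efficiency gain the abstract describes.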

  14. Automated Confocal Microscope Bias Correction

    NASA Astrophysics Data System (ADS)

    Dorval, Thierry; Genovesio, Auguste

    2006-10-01

    Illumination artifacts systematically occur in 2D cross-section confocal microscopy imaging. These biases can strongly corrupt higher-level image processing such as segmentation, fluorescence evaluation, or even pattern extraction/recognition. This paper presents a new, fully automated bias-correction methodology based on preprocessing of a large image database. The method is well suited to High Content Screening (HCS), a method dedicated to drug discovery. Our method assumes that the number of available images is large enough to allow a reliable statistical computation of an average bias image. A relevant segmentation evaluation protocol and experimental results validate our correction algorithm, which improves object extraction relative to uncorrected images.

  15. Microwave-assisted extraction of coumarin and related compounds from Melilotus officinalis (L.) Pallas as an alternative to Soxhlet and ultrasound-assisted extraction.

    PubMed

    Martino, Emanuela; Ramaiola, Ilaria; Urbano, Mariangela; Bracco, Francesco; Collina, Simona

    2006-09-01

    Soxhlet extraction, ultrasound-assisted extraction (USAE) and microwave-assisted extraction (MAE) in a closed system were investigated to determine the content of coumarin, o-coumaric and melilotic acids in flowering tops of Melilotus officinalis. The extracts were analyzed with an appropriate HPLC procedure, and the reproducibility of both extraction and chromatographic analysis was demonstrated. Taking into account extraction yield, cost and time, we studied the effects of the extraction variables on the yield of the above-mentioned compounds. The best results were obtained with MAE (50% v/v aqueous ethanol, two heating cycles of 5 min, 50 degrees C). On the basis of the ratio of extraction yield to extraction time, we therefore propose MAE as the most efficient method.

  16. Analysis of drugs of abuse in hair by automated solid-phase extraction, GC/EI/MS and GC ion trap/CI/MS.

    PubMed

    Girod, C; Staub, C

    2000-01-10

    In our laboratory, analysis of human hair for the detection of drugs of abuse was first performed in 1995. Initially, requests for hair analysis were few, and it is only since 1997 that these analyses have become routine. As demand grew, we developed an automated solid-phase extraction method; the use of an ASPEC robot allowed us to eliminate certain tedious manipulations and to process a large number of samples at a time. This method is described, along with analysis by gas chromatography-mass spectrometry (GC/MS) in selected ion monitoring (SIM) mode, for the following drugs: codeine, 6-monoacetylmorphine (6-MAM), morphine, cocaine, methadone, ecstasy (MDMA) and Eve (MDE). This requires prior derivatization with propionic anhydride. The validation parameters (linearity, repeatability, recovery and detection limits) are described, as well as the application of this method to some real cases. These cases are also analyzed by ion trap GC/MS in chemical ionization mode (GC/IT/CI/MS) in order to demonstrate the usefulness of this technique as a complement to routine analysis. Analysis by GC/IT/CI/MS avoids the risk of false-positive results through the identification of metabolites.

  17. Automating Frame Analysis

    SciTech Connect

    Sanfilippo, Antonio P.; Franklin, Lyndsey; Tratz, Stephen C.; Danielson, Gary R.; Mileson, Nicholas D.; Riensche, Roderick M.; McGrath, Liam

    2008-04-01

    Frame Analysis has come to play an increasingly important role in the study of social movements in Sociology and Political Science. While significant steps have been made in providing a theory of frames and framing, a systematic characterization of the frame concept is still largely lacking, and there are no recognized criteria and methods that can be used to identify and marshal frame evidence reliably and in a time- and cost-effective manner. Consequently, current Frame Analysis work is still too reliant on manual annotation and subjective interpretation. The goal of this paper is to present an approach to the representation, acquisition and analysis of frame evidence which leverages Content Analysis, Information Extraction and Semantic Search methods to provide a systematic treatment of Frame Analysis and automate frame annotation.

  18. Relation between various soil phosphorus extraction methods and sorption parameters in calcareous soils with different texture.

    PubMed

    Jalali, Mohsen; Jalali, Mahdi

    2016-10-01

    The aim of this study was to investigate the influence of soil texture on phosphorus (P) extractability and sorption in a wide range of calcareous soils across Hamedan, western Iran. Fifty-seven soil samples were selected and partitioned into five types on the basis of soil texture (clay, sandy, sandy clay loam, sandy loam and mixed loam), and P was extracted with calcium chloride (PCaCl2), citrate (Pcitrate), HCl (PHCl), Olsen (POls), and Mehlich-3 (PM3) solutions. On average, the P extracted was in the order PHCl > PM3 > Pcitrate > POls > PCaCl2. The P extracted by the Pcitrate, PHCl, POls and PM3 methods was significantly higher in the sandy, sandy clay loam and sandy loam textures than in the clay and mixed loam textures, while the soil phosphorus buffer capacity (PBC) was significantly higher in the clay and mixed loam textures. Correlation analysis revealed a significant positive relationship between silt content and the Freundlich sorption coefficient (KF), maximum P sorption (Qmax), linear distribution coefficient (Kd), and PBC. All extractions were highly correlated with each other and, among soil components, with silt content. Principal component analysis (PCA) performed on the data identified five principal components describing 74.5% of the total variation. The results point to soil texture as an important factor, with silt the crucial soil property associated with P sorption and its extractability in these calcareous soils. DPSM3-2 (PM3/(PM3+Qmax)×100) and DPScitrate (Pcitrate/(Pcitrate+Qmax)×100) proved to be good indicators of a soil's potential P release in these calcareous soils. Among the soils, 21% had DPSM3-2 values higher than the environmental threshold, indicating build-up of P and P release. Most of the studied sandy clay loam soils had exceeded the environmentally acceptable P concentration. Various management practices should be taken into account to reduce P losses from these soils, and further inorganic and organic P fertilizer inputs should be reduced.
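    The degree-of-P-saturation (DPS) indices in the abstract are simple ratios of extracted P to the sorption capacity; a minimal sketch (the numbers below are illustrative only, not values from the study):

```python
def degree_of_p_saturation(p_extracted, q_max):
    """DPS (%) = P_extracted / (P_extracted + Qmax) * 100, following the
    DPSM3-2 and DPScitrate definitions given in the abstract."""
    return p_extracted / (p_extracted + q_max) * 100.0


# Illustrative values only (e.g. mg P per kg soil), not data from the study:
dps = degree_of_p_saturation(30.0, 270.0)  # -> 10.0 %
```

A soil whose DPS exceeds an environmental threshold would be flagged as a potential P-release risk, which is how the 21% figure in the abstract is derived.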

  19. HITSZ_CDR: an end-to-end chemical and disease relation extraction system for BioCreative V

    PubMed Central

    Li, Haodi; Tang, Buzhou; Chen, Qingcai; Chen, Kai; Wang, Xiaolong; Wang, Baohua; Wang, Zhe

    2016-01-01

    In this article, an end-to-end system was proposed for the challenge task of disease named entity recognition (DNER) and chemical-induced disease (CID) relation extraction in BioCreative V, where DNER includes disease mention recognition (DMR) and normalization (DN). Evaluation on the challenge corpus showed that our system achieved F1-scores of 86.93% on DMR, 84.11% on DN and 43.04% on CID relation extraction. The F1-score on DMR is higher than our previous one reported by the challenge organizers (86.76%), which was the highest F1-score of the challenge. Database URL: http://database.oxfordjournals.org/content/2016/baw077 PMID:27270713

  20. Safety in the Automated Office.

    ERIC Educational Resources Information Center

    Graves, Pat R.; Greathouse, Lillian R.

    1990-01-01

    Office automation has introduced new hazards to the workplace: electrical hazards related to computer wiring, musculoskeletal problems resulting from use of computer terminals and design of work stations, and environmental concerns related to ventilation, noise levels, and office machine chemicals. (SK)

  1. Automated data entry system: performance issues

    NASA Astrophysics Data System (ADS)

    Thoma, George R.; Ford, Glenn

    2001-12-01

    This paper discusses the performance of a system for extracting bibliographic fields from scanned pages in biomedical journals to populate MEDLINE, the flagship database of the National Library of Medicine (NLM), which is heavily used worldwide. The system consists of automated processes to extract the article title, author names, affiliations and abstract, and manual workstations for the entry of other required fields, such as pagination, grant support information, databank accession numbers and others, needed for a completed bibliographic record in MEDLINE. Labor and time data are given for (1) a wholly manual keyboarding process to create the records, (2) an OCR-based system that requires all fields except the abstract to be manually input, and (3) a more automated system that relies on document image analysis and understanding techniques for the extraction of several fields. It is shown that this last, most automated, approach requires less than 25% of the labor effort of the first, fully manual, process.

  2. 22 CFR 120.30 - The Automated Export System (AES).

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 22 Foreign Relations 1 2012-04-01 2012-04-01 false The Automated Export System (AES). 120.30 Section 120.30 Foreign Relations DEPARTMENT OF STATE INTERNATIONAL TRAFFIC IN ARMS REGULATIONS PURPOSE AND DEFINITIONS § 120.30 The Automated Export System (AES). The Automated Export System (AES) is the Department...

  3. 22 CFR 120.30 - The Automated Export System (AES).

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 22 Foreign Relations 1 2014-04-01 2014-04-01 false The Automated Export System (AES). 120.30 Section 120.30 Foreign Relations DEPARTMENT OF STATE INTERNATIONAL TRAFFIC IN ARMS REGULATIONS PURPOSE AND DEFINITIONS § 120.30 The Automated Export System (AES). The Automated Export System (AES) is the Department...

  4. 22 CFR 120.30 - The Automated Export System (AES).

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 22 Foreign Relations 1 2013-04-01 2013-04-01 false The Automated Export System (AES). 120.30 Section 120.30 Foreign Relations DEPARTMENT OF STATE INTERNATIONAL TRAFFIC IN ARMS REGULATIONS PURPOSE AND DEFINITIONS § 120.30 The Automated Export System (AES). The Automated Export System (AES) is the Department...

  5. Goitrogenic/antithyroidal potential of green tea extract in relation to catechin in rats.

    PubMed

    Chandra, Amar K; De, Neela

    2010-01-01

    Catechins are flavonoids found in abundance in green tea and have elicited high interest due to their beneficial effects on health. Though flavonoids have been reported to have an antithyroid effect and to be goitrogenic, there have been no reports on the effect of green tea on the rat thyroid. The present study was designed to examine whether high doses of green tea have any harmful effect on thyroid physiology. For this purpose, green tea extract was administered orally to male albino rats for 30 days at doses of 1.25 g%, 2.5 g% and 5.0 g%, respectively. Similarly, pure catechin was administered at doses of 25, 50 and 100 mg/kg body weight, equivalent to the above doses of green tea extract. Lower body weight gain associated with marked hypertrophy and/or hyperplasia of the follicles was noted in the high-dose green tea and catechin treated groups. Decreased activity of thyroid peroxidase and 5'-deiodinase I and substantially elevated thyroidal Na+,K+-ATPase activity were observed. Moreover, serum T3 and T4 levels were found to be reduced, with a significant elevation of serum TSH. Taken together, these results suggest that the catechin present in green tea extract might behave as an antithyroid agent, and that consumption of green tea at high doses could adversely alter thyroid function.

  6. A semi-automated system for quantifying the oxidative potential of ambient particles in aqueous extracts using the dithiothreitol (DTT) assay: results from the Southeastern Center for Air Pollution and Epidemiology (SCAPE)

    NASA Astrophysics Data System (ADS)

    Fang, T.; Verma, V.; Guo, H.; King, L. E.; Edgerton, E. S.; Weber, R. J.

    2015-01-01

    A variety of methods are used to measure the capability of particulate matter (PM) to catalytically generate reactive oxygen species (ROS) in vivo, also defined as the aerosol oxidative potential. A widely used measure of aerosol oxidative potential is the dithiothreitol (DTT) assay, which monitors the depletion of DTT (a surrogate for cellular antioxidants) as catalyzed by the redox-active species in PM. However, a major constraint in the routine use of the DTT assay for integrating it with large-scale health studies is its labor-intensive and time-consuming protocol. To specifically address this concern, we have developed a semi-automated system for quantifying the oxidative potential of aerosol liquid extracts using the DTT assay. The system, capable of unattended analysis at one sample per hour, has a high analytical precision (coefficient of variation of 15% for positive control, 4% for ambient samples) and reasonably low limit of detection (0.31 nmol min-1). Comparison of the automated approach with the manual method conducted on ambient samples yielded good agreement (slope = 1.08 ± 0.12, r2 = 0.92, N = 9). The system was utilized for the Southeastern Center for Air Pollution & Epidemiology (SCAPE) to generate an extensive data set on DTT activity of ambient particles collected from contrasting environments (urban, roadside, and rural) in the southeastern US. We find that water-soluble PM2.5 DTT activity on a per-air-volume basis was spatially uniform and often well correlated with PM2.5 mass (r = 0.49 to 0.88), suggesting regional sources contributing to the PM oxidative potential in the southeastern US. The correlation may also suggest a mechanistic explanation (oxidative stress) for observed PM2.5 mass-health associations. The heterogeneity in the intrinsic DTT activity (per-PM-mass basis) across seasons indicates variability in the DTT activity associated with aerosols from sources that vary with season. Although developed for the DTT assay, the

  7. Liver-related safety assessment of green tea extracts in humans: a systematic review of randomized controlled trials

    PubMed Central

    Isomura, T; Suzuki, S; Origasa, H; Hosono, A; Suzuki, M; Sawada, T; Terao, S; Muto, Y; Koga, T

    2016-01-01

    Liver-related safety concerns remain regarding potential hepatotoxicity in humans induced by green tea intake, despite its supposed benefits. Although many randomized controlled trials (RCTs) of green tea extracts have been reported in the literature, the systematic reviews published to date were based only on subjective assessment of case reports. To examine the liver-related safety of green tea intake more objectively, we conducted a systematic review of published RCTs. A systematic literature search was conducted using three databases (PubMed, EMBASE and the Cochrane Central Register of Controlled Trials) in December 2013 to identify RCTs of green tea extracts. Data on liver-related adverse events, including laboratory test abnormalities, were abstracted from the identified articles, and the methodological quality of the RCTs was assessed. After excluding duplicates, 561 titles and abstracts and 119 full-text articles were screened, and finally 34 trials were identified. Of these, liver-related adverse events were reported in four trials; these adverse events involved seven subjects (eight events) in the green tea intervention group and one subject (one event) in the control group. The summary odds ratio, estimated using a meta-analysis method for sparse event data, for intervention compared with placebo was 2.1 (95% confidence interval: 0.5–9.8). The few events reported in both groups were elevations of liver enzymes. Most were mild, and no serious liver-related adverse events were reported. The results of this review, although not conclusive, suggest that liver-related adverse events after intake of green tea extracts are expected to be rare. PMID:27188915

  8. Determination of nicotine, cotinine, and related alkaloids in human urine and saliva by automated in-tube solid-phase microextraction coupled with liquid chromatography-mass spectrometry.

    PubMed

    Kataoka, Hiroyuki; Inoue, Reiko; Yagi, Katsuharu; Saito, Keita

    2009-01-15

    A simple, rapid and sensitive method for the determination of nicotine, cotinine, nornicotine, anabasine, and anatabine in human urine and saliva was developed. These compounds were analyzed by on-line in-tube solid-phase microextraction (SPME) coupled with liquid chromatography-mass spectrometry (LC-MS). Nicotine, cotinine and related alkaloids were separated within 7 min by high-performance liquid chromatography (HPLC) using a Synergi 4u POLAR-RP 80A column and 5 mM ammonium formate/methanol (55/45, v/v) as the mobile phase at a flow rate of 0.8 mL/min. Electrospray ionization conditions in positive ion mode were optimized for MS detection of these compounds. The optimum in-tube SPME conditions were 25 draw/eject cycles with a sample size of 40 microL, using a CP-Pora PLOT amine capillary column as the extraction device. The extracted compounds could be desorbed easily from the capillary by passage of the mobile phase, and no carryover was observed. Using the in-tube SPME LC-MS method, the calibration curves were linear in the concentration range of 0.5-20 ng/mL for nicotine, cotinine and related compounds in urine and saliva, and the detection limits (S/N = 3) were 15-40 pg/mL. The method showed 20-46-fold higher sensitivity than the direct injection method (5 microL injection). The within-run and between-day precision (relative standard deviations) were below 4.7% and 11.3% (n = 5), respectively. The method was applied successfully to the analysis of urine and saliva samples without interfering peaks. The recoveries of nicotine, cotinine and related compounds spiked into urine and saliva samples were above 83%, and the relative standard deviations were below 7.1%. This method was used to analyze urinary and salivary levels of these compounds in relation to nicotine intake and smoking.

  9. Celery Seed and Related Extracts with Antiarthritic, Antiulcer, and Antimicrobial Activities.

    PubMed

    Powanda, Michael C; Whitehouse, Michael W; Rainsford, K D

    2015-01-01

    Celery preparations have been used extensively for several millennia as natural therapies for acute and chronic painful or inflammatory conditions. This chapter reviews some of the biological and chemical properties of various celery preparations that have been used as natural remedies; many of these have varying activities and product qualities. A fully standardized preparation, an alcoholic extract of the seeds of a plant source derived from northern India, has been prepared. This is termed Celery Seed Extract (CSE) and has been found to be at least as effective as aspirin, ibuprofen, and naproxen in suppressing arthritis in a model of polyarthritis. CSE can also reduce existing inflammation in rats, and has been shown to provide analgesia in two model systems. CSE, in addition to acting as an analgesic and anti-inflammatory agent, has been shown to protect against and/or reduce gastric irritation caused by NSAIDs, as well as act synergistically with them to reduce inflammation. The CSE was fractionated by organic solvent extractions, then subjected to column chromatography followed by HPLC, and was characterized by mass spectrometry. This yielded a purified component with specific inhibitory effects on Helicobacter pylori that was not active against Campylobacter jejuni or Escherichia coli. Additionally, toxicology studies did not reveal any clear signs of toxicity at doses relevant to human use. Also, unlike many dietary supplements, the available data suggest that CSE does not significantly affect the cytochrome P450 enzyme systems and thus is less likely to alter the metabolism of drugs the individual may be taking. CSE may be a prototype of a natural product that can be used therapeutically to treat arthritis and other inflammatory diseases.

  10. Urinary excretion of lipoxin A(4) and related compounds: development of new extraction techniques for lipoxins.

    PubMed

    Romano, Mario; Luciotti, Graziella; Gangemi, Sebastiano; Marinucci, Francesca; Prontera, Cesaria; D'Urbano, Etrusca; Davì, Giovanni

    2002-09-01

    Lipoxins (LX) are tetraene-containing eicosanoids generated by lipoxygenase (LO) transformation of arachidonic acid (Serhan and Romano, 1995). LX possess potent anti-inflammatory activity in vivo, and temporal biosynthesis of LX, concurrent with spontaneous resolution, has been observed during exudate formation (Levy et al., 2001). Limited results are currently available on the involvement of LX in clinical settings. Recently, a rabbit anti-LXA(4) antiserum was raised to produce an enzyme-linked immunosorbent assay (ELISA) kit for LXA(4) (Levy et al., 1993). Although specific and accurate with isolated cells, this kit has not been tested with a complex biological matrix such as urine. Initial attempts to determine urinary excretion of LXA(4) using the LXA(4) ELISA kit were unsuccessful because of high nonspecific absorbance readings. In this report, we show that the LXA(4) extraction procedure indicated in the ELISA kit is inadequate for urinary measurements of immunoreactive (i)LXA(4). We present the development of a new extraction technique, more selective for LX, that abolishes background contamination and minimizes the nonspecific readings. Using this method, we show for the first time that urine from healthy subjects contains (i)LXA(4) material, and we identify a urinary tetraene with the physical properties of an LXA(4) metabolite. Although reliable methods have previously been established to quantitate LXA(4) from whole blood (Brezinski et al., 1992), the present extraction technique, which is optimized for LXA(4) recovery from human urine, represents a substantial achievement for LX investigation and may open a new avenue of clinical studies on LXA(4).

  11. Relative contributions of hypoxia and natural gas extraction to atmospheric methane emissions from Lake Erie

    NASA Astrophysics Data System (ADS)

    Disbennett, D. A.; Townsend-Small, A.; Bourbonniere, R.; Mackay, R.

    2013-12-01

    Reduced oxygen availability in lakes due to summer stratification can create conditions suitable for methanogenic activity, which ultimately contributes to atmospheric methane emissions. Lake Erie has persistent low oxygen conditions in bottom waters during summer, which contributes to methane production through anaerobic organic matter respiration. Lake Erie also has substantial subsurface natural gas deposits that are currently being extracted in Canadian waters. We hypothesized that the lake would be a source of methane to the atmosphere in late summer, prior to fall turnover, and that natural gas wells and pipelines would contribute to additional methane emissions from resource extraction areas in Canadian waters. Initial sampling was conducted at a total of 20 sites in central and western Lake Erie during early September 2012. Sites were selected to collect samples from a wide range of environmental conditions in order to better establish the baseline flux from these areas. We selected an array of sites in the offshore environment, sites from a very shallow bay and sites within the Canadian gas fields. Air samples were gathered using floating flux chambers tethered to the research vessel. Dissolved gas water samples were collected using a Van Dorn bottle. We found a consistent positive flux of methane throughout the lake during late summer, with flux rates adjacent to natural gas pipelines up to an order of magnitude greater than elsewhere. Stable isotope analysis yielded results that were not entirely expected. The δ13C of surface samples from areas of fossil fuel extraction and suspected biogenic sources were very similar, likely due to oxidation of methane in the water column. Additional sampling occurred during 2012 and 2013 concentrating on bottom waters and surface fluxes which should allow us to further constrain sources of CH4 from Lake Erie. This project is an effort to constrain the global warming potential of hypoxia in the Great Lakes, and

  12. Extraction efficiency of hydrophilic and lipophilic antioxidants from lyophilized foods using pressurized liquid extraction and manual extraction.

    PubMed

    Watanabe, Jun; Oki, Tomoyuki; Takebayashi, Jun; Takano-Ishikawa, Yuko

    2014-09-01

    The efficient extraction of antioxidants from food samples is necessary in order to accurately measure their antioxidant capacities. α-Tocopherol and gallic acid were spiked into samples of 5 lyophilized and pulverized vegetables and fruits (onion, cabbage, Satsuma mandarin orange, pumpkin, and spinach). The lipophilic and hydrophilic antioxidants in the samples were sequentially extracted with a mixed solvent of n-hexane and dichloromethane, and then with acetic acid-acidified aqueous methanol. Duplicate samples were extracted: one set was extracted using an automated pressurized liquid extraction apparatus, and the other set was extracted manually. Spiked α-tocopherol and gallic acid were recovered almost quantitatively in the extracted lipophilic and hydrophilic fractions, respectively, especially when pressurized liquid extraction was used. The expected increase in lipophilic oxygen radical absorbance capacity (L-ORAC) due to spiking with α-tocopherol, and the expected increase in 2,2-diphenyl-1-picrylhydrazyl radical scavenging activities and total polyphenol content due to spiking with gallic acid, were all recovered in high yield. Relatively low recoveries, as reflected in the hydrophilic ORAC (H-ORAC) value, were obtained following spiking with gallic acid, suggesting an interaction between gallic acid and endogenous antioxidants. The H-ORAC values of gallic acid-spiked samples were almost the same as those of postadded (spiked) samples. These results clearly indicate that lipophilic and hydrophilic antioxidants are effectively extracted from lyophilized food, especially when pressurized liquid extraction is used.
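    The spike-recovery figure used throughout this abstract is a simple calculation: the increase in measured analyte attributable to the spike, divided by the amount actually added. A minimal sketch (all numbers illustrative, not data from the study):

```python
def spike_recovery_percent(measured_spiked, measured_unspiked, amount_added):
    """Percent recovery of a spiked analyte:
    (level measured in the spiked sample - endogenous level) / amount added * 100."""
    return (measured_spiked - measured_unspiked) / amount_added * 100.0


# Illustrative values (e.g. mg gallic acid per 100 g lyophilized sample):
recovery = spike_recovery_percent(14.5, 5.0, 10.0)  # -> 95.0 %
```

A recovery near 100% indicates near-quantitative extraction; values well below 100%, as reported for gallic acid in the H-ORAC assay, point to losses or to interactions with endogenous antioxidants.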

  13. Sieve-based coreference resolution enhances semi-supervised learning model for chemical-induced disease relation extraction

    PubMed Central

    Le, Hoang-Quynh; Tran, Mai-Vu; Dang, Thanh Hai; Ha, Quang-Thuy; Collier, Nigel

    2016-01-01

    The BioCreative V chemical-disease relation (CDR) track was proposed to accelerate the progress of text mining in facilitating integrative understanding of chemicals, diseases and their relations. In this article, we describe an extension of our system (namely UET-CAM) that participated in the BioCreative V CDR. The original UET-CAM system’s performance was ranked fourth among 18 participating systems by the BioCreative CDR track committee. In the Disease Named Entity Recognition and Normalization (DNER) phase, our system employed joint inference (decoding) with a perceptron-based named entity recognizer (NER) and a back-off model with Semantic Supervised Indexing and Skip-gram for named entity normalization. In the chemical-induced disease (CID) relation extraction phase, we proposed a pipeline that includes a coreference resolution module and a Support Vector Machine relation extraction model. The former module utilized a multi-pass sieve to extend entity recall. In this article, the UET-CAM system was improved by adding a ‘silver’ CID corpus to train the prediction model. This silver-standard corpus of more than 50 thousand sentences was built automatically from the Comparative Toxicogenomics Database (CTD). We evaluated our method on the CDR test set. Results showed that our system reached state-of-the-art performance, with an F1 of 82.44 for the DNER task and 58.90 for the CID task. Analysis demonstrated substantial benefits from both the multi-pass sieve coreference resolution method (+4.13% F1) and the silver CID corpus (+7.3% F1). Database URL: SilverCID, the silver-standard corpus for CID relation extraction, is freely available online at: https://zenodo.org/record/34530 (doi:10.5281/zenodo.34530). PMID:27630201
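    The multi-pass sieve idea behind the coreference module — an ordered sequence of high-precision matching rules, where a mention resolved by an earlier pass is not revisited by later ones — can be sketched as follows (the two rules and the mention strings are hypothetical simplifications for illustration, not the UET-CAM implementation):

```python
def exact_match(mention, antecedents):
    # Pass 1: link to an earlier mention with the same surface string.
    return next((a for a in antecedents if a.lower() == mention.lower()), None)


def abbreviation_match(mention, antecedents):
    # Pass 2: link an all-caps mention to an antecedent whose word
    # initials spell it out (e.g. "CKD" <- "chronic kidney disease").
    for a in antecedents:
        initials = "".join(word[0] for word in a.split()).upper()
        if mention.isupper() and mention == initials:
            return a
    return None


def resolve(mentions, sieves):
    """Run the sieves in precision order; the first pass that finds an
    antecedent for a mention wins, and later passes are skipped."""
    links = {}
    for i, mention in enumerate(mentions):
        for sieve in sieves:
            antecedent = sieve(mention, mentions[:i])
            if antecedent is not None:
                links[mention] = antecedent
                break
    return links


mentions = ["chronic kidney disease", "CKD", "chronic kidney disease"]
links = resolve(mentions, [exact_match, abbreviation_match])
# "CKD" is linked back to "chronic kidney disease" by the abbreviation pass.
```

Linking abbreviations and repeated mentions back to a canonical antecedent is what extends entity recall before the SVM relation classifier runs, as the abstract describes.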

  15. Knowledge, attitudes, and performance of dental students in relation to sterilization/disinfection methods of extracted human teeth

    PubMed Central

    Hashemipour, Maryam Alsadat; Mozafarinia, Romina; Mirzadeh, Azin; Aramon, Moien; Nassab, Sayed Amir Hossein Gandjalikhan

    2013-01-01

    Background: Dental students use extracted human teeth to learn practical and technical skills before they enter the clinical environment. In the present research, knowledge, performance, and attitudes toward sterilization/disinfection methods of extracted human teeth were evaluated in a selected group of Iranian dental students. Materials and Methods: In this descriptive cross-sectional study, the subjects consisted of fourth-, fifth- and sixth-year dental students. Data were collected by questionnaires and analyzed by Fisher's exact test and the Chi-squared test using SPSS 11.5. Results: In this study, 100 dental students participated. The average knowledge score was 15.9 ± 4.8. In the opinion of 81 students, sodium hypochlorite was a suitable material for sterilization, and 78 students believed that oven sterilization is a suitable method for this purpose. The average performance score was 4.1 ± 0.8, with 3.9 ± 1.7 for males and 4.3 ± 1.1 for females, with no significant difference between the two sexes. The maximum and minimum attitude scores were 60 and 25, with an average score of 53.1 ± 5.2. Conclusion: The results of this study indicated that the knowledge, performance and attitudes of dental students in relation to sterilization/disinfection methods of extracted human teeth were good. However, weaknesses were observed in relation to teaching and to materials suitable for sterilization. PMID:24130583

  16. Multivariate calibration for the determination of total azadirachtin-related limonoids and simple terpenoids in neem extracts using vanillin assay.

    PubMed

    Dai, J; Yaylayan, V A; Raghavan, G S; Parè, J R; Liu, Z

    2001-03-01

    Two-component and multivariate calibration techniques were developed for the simultaneous quantification of total azadirachtin-related limonoids (AZRL) and simple terpenoids (ST) in neem extracts using the vanillin assay. A mathematical modeling method was also developed to aid in the analysis of the spectra and to simplify the calculations. The mathematical models were used in a two-component calibration (with azadirachtin and limonene as standards) for samples containing mainly limonoids and terpenoids (such as neem seed kernel extracts). However, for extracts from other parts of neem, such as neem leaf, a multivariate calibration was necessary to eliminate possible interference from phenolics and other components in order to obtain accurate AZRL and ST contents. It was demonstrated that the accuracy of the vanillin assay in predicting the azadirachtin content of a model mixture containing limonene (25% w/w) can be improved from a 50% overestimate to 95% accuracy using the two-component calibration, while predicting the limonene content with 98% accuracy. Both calibration techniques were applied to estimate the AZRL and ST contents in different parts of the neem plant. The results of this study indicated that the relative content of limonoids was much higher than that of terpenoids in all parts of the neem plant studied.
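The core of a two-component calibration can be illustrated as a least-squares fit of a mixture spectrum to two unit-concentration reference spectra: solving the 2×2 normal equations for the two concentrations. The spectra below are synthetic and the code is a sketch of the general technique, not the authors' calibration procedure:

```python
# Fit mixture = cA * ref_a + cB * ref_b in the least-squares sense by
# solving the 2x2 normal equations directly (plain Python, no libraries).

def two_component_fit(ref_a, ref_b, mixture):
    """Return (cA, cB) minimizing ||mixture - cA*ref_a - cB*ref_b||^2."""
    saa = sum(a * a for a in ref_a)
    sbb = sum(b * b for b in ref_b)
    sab = sum(a * b for a, b in zip(ref_a, ref_b))
    sam = sum(a * m for a, m in zip(ref_a, mixture))
    sbm = sum(b * m for b, m in zip(ref_b, mixture))
    det = saa * sbb - sab * sab  # determinant of the normal matrix
    c_a = (sam * sbb - sbm * sab) / det
    c_b = (sbm * saa - sam * sab) / det
    return c_a, c_b

# Synthetic absorbances at five wavelengths; mixture built as 2.0*A + 0.5*B.
ref_a = [0.10, 0.40, 0.80, 0.30, 0.05]
ref_b = [0.50, 0.20, 0.10, 0.60, 0.40]
mix = [2.0 * a + 0.5 * b for a, b in zip(ref_a, ref_b)]
c_a, c_b = two_component_fit(ref_a, ref_b, mix)
print(round(c_a, 6), round(c_b, 6))  # recovers 2.0 and 0.5
```

Fitting both components jointly is what corrects the single-standard overestimate described in the abstract: absorbance contributed by the second component is attributed to it rather than inflating the first component's apparent concentration.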

  17. Personnel Administration in an Automated Environment.

    ERIC Educational Resources Information Center

    Leinbach, Philip E.; And Others

    1990-01-01

    Fourteen articles address issues related to library personnel administration in an automated environment, such as education for automation, salaries, impact of technology, expert systems, core competencies, administrative issues, technology services, job satisfaction, and performance appraisal. A selected annotated bibliography is included. (MES)

  18. In vitro antibacterial, antifungal and antioxidant activities of Eucalyptus spp. leaf extracts related to phenolic composition.

    PubMed

    Elansary, Hosam O; Salem, Mohamed Z M; Ashmawy, Nader A; Yessoufou, Kowiyou; El-Settawy, Ahmed A A

    2017-03-16

    The crude methanolic extracts from leaves of Eucalyptus camaldulensis L., E. camaldulensis var. obtusa and E. gomphocephala grown in Egypt were investigated to explore their chemical composition as well as their antibacterial, antifungal and antioxidant activities. The major phenolics found were ellagic acid, quercetin 3-O-rhamnoside, quercetin 3-O-β-D-glucuronide, caffeic acid and chlorogenic acid. The antioxidant activities were examined by the 2,2'-diphenylpicrylhydrazyl (DPPH) and β-carotene-linoleic acid assays. E. camaldulensis extracts showed the highest phenolic content and the strongest antioxidant and antimicrobial activities of the taxa studied. MIC values for the antibacterial activity of E. camaldulensis ranged from 0.08 μg/mL (Bacillus cereus) to 0.22 μg/mL (Staphylococcus aureus), while MBC values ranged from 0.16 μg/mL (Dickeya solani and B. cereus) to 0.40 μg/mL (S. aureus). The inhibitory activities against the growth of the bacteria and fungi tested indicate that E. camaldulensis might be a useful resource for the development and formulation of antibacterial and antifungal drugs.

  19. SU-D-BRD-03: Improving Plan Quality with Automation of Treatment Plan Checks

    SciTech Connect

    Covington, E; Younge, K; Chen, X; Lee, C; Matuszak, M; Kessler, M; Acosta, E; Orow, A; Filpansick, S; Moran, J; Keranen, W

    2015-06-15

    Purpose: To evaluate the effectiveness of an automated plan check tool for improving first-time plan quality as well as standardizing and documenting the performance of physics plan checks. Methods: The Plan Checker Tool (PCT) uses the Eclipse Scripting API to check and compare data from the treatment planning system (TPS) and treatment management system (TMS). PCT was created to improve first-time plan quality, reduce patient delays, increase the efficiency of our electronic workflow, and standardize and partially automate plan checks in the TPS. A framework was developed that can be configured with different reference values and types of checks. One example is the prescribed dose check, where PCT flags the user when the planned dose and the prescribed dose disagree. PCT includes a comprehensive checklist of automated and manual checks that are documented when performed by the user. A PDF report is created and automatically uploaded into the TMS. Prior to and during PCT development, errors caught during plan checks, as well as patient delays, were tracked in order to prioritize which checks should be automated, and the most common and significant errors were determined. Results: Nineteen of 33 checklist items were automated, with data extracted by the PCT. These include checks for prescription, reference point and machine scheduling errors, which are three of the top six causes of patient delays related to physics and dosimetry. Since the clinical roll-out, no delays have been due to errors that are automatically flagged by the PCT. Development continues to automate the remaining checks. Conclusion: With PCT, 57% of the physics plan checklist has been partially or fully automated. Treatment delays have declined since release of the PCT for clinical use. By tracking delays and errors, we have been able to measure the effectiveness of automating checks and are using this information to prioritize future development. This project was supported in part by P01CA059827.
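A configurable check framework of the kind described (reference values plus per-item check functions, each flagging a disagreement) might look like the following sketch. The check names and plan fields are hypothetical; the real PCT works through the Eclipse Scripting API against TPS/TMS data:

```python
# Minimal sketch of a configurable plan-check framework: each check compares
# planned values against the prescription and returns a flag message or None.

def check_prescribed_dose(plan, prescription):
    # Flag the user when the planned dose and prescribed dose disagree.
    if plan["total_dose_gy"] != prescription["total_dose_gy"]:
        return (f"Dose mismatch: planned {plan['total_dose_gy']} Gy, "
                f"prescribed {prescription['total_dose_gy']} Gy")
    return None

def check_fractions(plan, prescription):
    if plan["fractions"] != prescription["fractions"]:
        return (f"Fraction mismatch: planned {plan['fractions']}, "
                f"prescribed {prescription['fractions']}")
    return None

# The configured checklist: adding a check means adding a function here.
CHECKS = [check_prescribed_dose, check_fractions]

def run_checks(plan, prescription):
    """Run every configured check; return the list of flagged messages."""
    flags = []
    for check in CHECKS:
        msg = check(plan, prescription)
        if msg:
            flags.append(msg)
    return flags

plan = {"total_dose_gy": 60.0, "fractions": 30}
rx = {"total_dose_gy": 66.0, "fractions": 30}
for flag in run_checks(plan, rx):
    print(flag)  # one flag: the dose mismatch
```

Keeping each check as an independent function is what makes the checklist incrementally automatable: manual items can be converted one at a time as data extraction for them becomes available.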

  20. Antioxidant Activity and Thermal Stability of Oleuropein and Related Phenolic Compounds of Olive Leaf Extract after Separation and Concentration by Salting-Out-Assisted Cloud Point Extraction.

    PubMed

    Stamatopoulos, Konstantinos; Katsoyannos, Evangelos; Chatzilazarou, Arhontoula

    2014-04-08

    A fast, clean, energy-saving, non-toxic method for the stabilization of the antioxidant activity and the improvement of the thermal stability of oleuropein and related phenolic compounds separated from olive leaf extract via salting-out-assisted cloud point extraction (CPE) was developed using Tween 80. The process was based on the decrease of the solubility of polyphenols and the lowering of the cloud point temperature of Tween 80 due to the presence of elevated amounts of sulfates (salting-out) and the separation from the bulk solution with centrifugation. The optimum conditions were chosen based on polyphenols recovery (%), phase volume ratio (Vs/Vw) and concentration factor (Fc). The maximum recovery of polyphenols was in total 95.9%; Vs/Vw was 0.075 and Fc was 15 at the following conditions: pH 2.6, ambient temperature (25 °C), 4% Tween 80 (w/v), 35% Na₂SO₄ (w/v) and a settling time of 5 min. The total recovery of oleuropein, hydroxytyrosol, luteolin-7-O-glucoside, verbascoside and apigenin-7-O-glucoside, at optimum conditions, was 99.8%, 93.0%, 87.6%, 99.3% and 100.0%, respectively. Polyphenolic compounds entrapped in the surfactant-rich phase (Vs) showed higher thermal stability (activation energy (Ea) 23.8 kJ/mol) compared to non-entrapped ones (Ea 76.5 kJ/mol). The antioxidant activity of separated polyphenols remained unaffected as determined by the 1,1-diphenyl-2-picrylhydrazyl method.

  1. PLAN2L: a web tool for integrated text mining and literature-based bioentity relation extraction.

    PubMed

    Krallinger, Martin; Rodriguez-Penagos, Carlos; Tendulkar, Ashish; Valencia, Alfonso

    2009-07-01

    There is increasing interest in using literature mining techniques to complement information extracted from annotation databases or generated by bioinformatics applications. Here we present PLAN2L, a web-based online search system that integrates text mining and information extraction techniques to provide systematic access to information useful for analyzing genetic, cellular and molecular aspects of the plant model organism Arabidopsis thaliana. Our system facilitates more efficient retrieval of information relevant to heterogeneous biological topics, from biological relationships at the level of protein interactions and gene regulation, to sub-cellular locations of gene products and associations with cellular and developmental processes, i.e. cell cycle, flowering, and root, leaf and seed development. Beyond single entities, predefined pairs of entities can also be provided as queries, for which literature-derived relations together with textual evidence are returned. PLAN2L does not require registration and is freely accessible at http://zope.bioinfo.cnio.es/plan2l.

  2. [Extraction of single-trial event-related potentials by means of ARX modeling and independent component analysis].

    PubMed

    Wang, Rongchang; Du, Sidan

    2006-12-01

    The present paper focuses on the extraction of event-related potentials on a single sweep under an extremely low S/N ratio. Two methods that can efficiently remove spontaneous EEG, ocular artifacts and power-line interference are presented, based on ARX modeling and independent component analysis (ICA). The former applies an ARX model to the measured compound signal, which extensively contains the three kinds of ordinary noise mentioned above, and uses an ARX algorithm for parametric identification. The latter decomposes the signal by means of independent component analysis. In addition, some important properties of the ICA decomposition and its intrinsic causality are pointed out. Some modifications of the FastICA algorithm are also given to suit the practical situation, so as to implement auto-adaptive mapping of the decomposed results to ERP components. Simulations show that both methods are highly capable of extracting the signal and improving the S/N ratio.

  3. On-line solid-phase extraction coupled with high-performance liquid chromatography and tandem mass spectrometry (SPE-HPLC-MS-MS) for quantification of bromazepam in human plasma: an automated method for bioequivalence studies.

    PubMed

    Gonçalves, José Carlos Saraiva; Monteiro, Tânia Maria; Neves, Claúdia Silvana de Miranda; Gram, Karla Regina da Silva; Volpato, Nádia Maria; Silva, Vivian A; Caminha, Ricardo; Gonçalves, Maria do Rocio Bencke; Santos, Fábio Monteiro Dos; Silveira, Gabriel Estolano da; Noël, François

    2005-10-01

    A validated method for on-line solid-phase extraction coupled with high-performance liquid chromatography tandem mass spectrometry (SPE-HPLC-MS-MS) is described for the quantification of bromazepam in human plasma. The method involves dilution of 300 μL of plasma with 100 μL of carbamazepine (2.5 ng/mL), used as internal standard, vortex-mixing, centrifugation, and injection of 100 μL of the supernatant. The analytes were ionized using positive electrospray mass spectrometry and detected by multiple reaction monitoring (MRM). The m/z transitions 316→182 (bromazepam) and 237→194 (carbamazepine) were used for quantification. The calibration curve was linear from 1 ng/mL (limit of quantification) to 200 ng/mL. The retention times of bromazepam and carbamazepine were 2.6 and 3.2 minutes, respectively. The intraday and interday precisions were 3.43%-15.45% and 5.2%-17%, respectively. The intraday and interday accuracy was 94.00%-103.94%. This new automated method has been successfully applied in a bioequivalence study of two tablet formulations of 6 mg bromazepam: Lexotan® from Produtos Roche Químicos e Farmacêuticos SA, Rio de Janeiro, Brazil (reference) and a test formulation from Laboratórios Biosintética Ltda, São Paulo, Brazil. Because the 90% CIs of the geometric mean ratios between reference and test were completely included in the 80%-125% interval, the two formulations were considered bioequivalent. The comparison of different experimental conditions for establishing an in vitro dissolution profile, together with our bioavailability data, further allowed us to propose rationally based experimental conditions for a dissolution test of bromazepam tablets, which currently lack a pharmacopeial monograph.

  4. Screening of over 100 drugs in horse urine using automated on-line solid-phase extraction coupled to liquid chromatography-high resolution mass spectrometry for doping control.

    PubMed

    Kwok, W H; Choi, Timmy L S; Tsoi, Yeuki Y K; Leung, Gary N W; Wan, Terence S M

    2017-02-14

    A fast method for the direct analysis of enzyme-hydrolysed horse urine using automated on-line solid-phase extraction (SPE) coupled to liquid chromatography/high-resolution mass spectrometry was developed. Over 100 drugs of diverse drug classes could be simultaneously detected in horse urine at sub- to low parts-per-billion levels. The urine sample was first hydrolysed by β-glucuronidase to release conjugated drugs, followed by centrifugal filtration. The filtrate (1 mL) was directly injected into an on-line SPE system consisting of a pre-column filter and an SPE cartridge column for the separation of analytes from matrix components. Through valve switching, the interfering matrix components were flushed to waste, and the analytes were eluted to a C18 analytical column for refocusing and chromatographic separation. Detection was achieved by full-scan HRMS in alternating positive and negative electrospray ionisation modes within a turnaround time of 16 min, inclusive of on-line sample clean-up and post-run mobile phase equilibration. No significant matrix interference was observed at the expected retention times of the targeted masses. Over 90% of the drugs studied gave estimated limits of detection (LoDs) at or below 5 ng/mL, with some LoDs reaching down to 0.05 ng/mL. Data-dependent acquisition (DDA) was included to provide additional product-ion scan data to substantiate the presence of detected analytes. The resulting product-ion spectra can be searched against an in-house MS/MS library for identity verification. The applicability of the method has been demonstrated by the detection of drugs in doping control samples.

  5. Comparative Analysis of Circulating Endothelial Progenitor Cells in Age-Related Macular Degeneration Patients Using Automated Rare Cell Analysis (ARCA) and Fluorescence Activated Cell Sorting (FACS)

    PubMed Central

    Say, Emil Anthony T.; Melamud, Alex; Esserman, Denise Ann; Povsic, Thomas J.; Chavala, Sai H.

    2013-01-01

    Background Patients with age-related macular degeneration (ARMD) begin with non-neovascular (NNV) phenotypes usually associated with good vision. Approximately 20% of NNV-ARMD patients will convert to vision-debilitating neovascular (NV) ARMD, but the precise timing of this event is unknown. Developing a clinical test predicting impending conversion to NV-ARMD is necessary to prevent vision loss. Endothelial progenitor cells (EPCs), defined as CD34+VEGFR2+ using traditional fluorescence activated cell sorting (FACS), are rare cell populations known to be elevated in patients with NV-ARMD compared to NNV-ARMD. FACS has high inter-observer variability and subjectivity when measuring rare cell populations, precluding development into a diagnostic test. We hypothesized that automated rare cell analysis (ARCA), a validated and FDA-approved technology for reproducible rare cell identification, can enumerate EPCs in ARMD patients more reliably. This pilot study serves as the first step in developing methods for reproducibly predicting ARMD phenotype conversion. Methods We obtained peripheral venous blood samples from 23 subjects with NNV-ARMD or treatment-naïve NV-ARMD. Strict criteria were used to exclude subjects with known angiogenic diseases to minimize confounding results. Blood samples were analyzed in masked fashion in two separate laboratories. EPCs were independently enumerated using ARCA and FACS within 24 hours of blood sample collection; p<0.2 was considered indicative of a trend for this proof-of-concept study, while statistical significance was established at 0.05. Results Using ARCA, we measured levels of CD34+VEGFR2+ EPCs suggestive of a trend, with higher values in patients with NV compared to NNV-ARMD (p = 0.17). Interestingly, CD34+VEGFR2+ EPC analysis using FACS did not produce similar results (p = 0.94). Conclusions CD34+VEGFR2+ may have predictive value for EPC enumeration in future ARCA studies. EPC measurements in a small sample size were

  6. Automation U.S.A.: Overcoming Barriers to Automation.

    ERIC Educational Resources Information Center

    Brody, Herb

    1985-01-01

    Although labor unions and inadequate technology play minor roles, the principal barrier to factory automation is "fear of change." Related problems include long-term benefits, nontechnical executives, and uncertainty of factory cost accounting. Industry support for university programs is helping to educate engineers to design, implement, and…

  7. Automatic Extraction and Post-coordination of Spatial Relations in Consumer Language.

    PubMed

    Roberts, Kirk; Rodriguez, Laritza; Shooshan, Sonya E; Demner-Fushman, Dina

    2015-01-01

    To incorporate ontological concepts in natural language processing (NLP) it is often necessary to combine simple concepts into complex concepts (post-coordination). This is especially true in consumer language, where a more limited vocabulary forces consumers to utilize highly productive language that is almost impossible to pre-coordinate in an ontology. Our work focuses on recognizing an important case for post-coordination in natural language: spatial relations between disorders and anatomical structures. Consumers typically utilize such spatial relations when describing symptoms. We describe an annotated corpus of 2,000 sentences with 1,300 spatial relations, and a second corpus of 500 of these relations manually normalized to UMLS concepts. We use machine learning techniques to recognize these relations, obtaining good performance. Further, we experiment with methods to normalize the relations to an existing ontology. This two-step process is analogous to the combination of concept recognition and normalization, and achieves comparable results.
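The two-step process described above (relation recognition followed by normalization to ontology concepts) can be illustrated with a toy pattern-based extractor and a small concept dictionary. The pattern, lexicons, and CUIs below are illustrative stand-ins for the paper's machine-learned models and UMLS lookups:

```python
import re

# Tiny lexicons mapping surface terms to illustrative concept identifiers.
DISORDERS = {"pain": "C0030193", "rash": "C0015230", "swelling": "C0038999"}
ANATOMY = {"knee": "C0022742", "arm": "C0003793", "stomach": "C0038351"}

# Step 1: recognize a spatial relation between a disorder and an
# anatomical structure in consumer language ("pain in my left knee").
PATTERN = re.compile(
    r"\b(?P<disorder>pain|rash|swelling)\b\s+"
    r"(?P<rel>in|on|near)\s+(?:my|the)\s+"
    r"(?:left|right)?\s*(?P<anatomy>knee|arm|stomach)\b")

def extract_and_normalize(sentence):
    """Step 1: pattern-match the relation; step 2: map arguments to CUIs."""
    m = PATTERN.search(sentence.lower())
    if not m:
        return None
    return {
        "relation": m.group("rel"),
        "disorder_cui": DISORDERS[m.group("disorder")],
        "anatomy_cui": ANATOMY[m.group("anatomy")],
    }

print(extract_and_normalize("I have a sharp pain in my left knee"))
```

The output pairs a pre-coordinated concept for each argument with the spatial relation between them, which is exactly the post-coordinated structure ("disorder located-at anatomy") that an ontology typically cannot enumerate in advance.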

  9. Is cell death induced by nematocysts extract of medusa Pelagia noctiluca related to oxidative stress?

    PubMed

    Ayed, Yosra; Chayma, Bouaziz; Hayla, Abassi; Abid, Salwa; Bacha, Hassen

    2013-09-01

    Pelagia noctiluca, a jellyfish widely distributed in Mediterranean waters, especially in coastal areas of Tunisia, has garnered attention because of its stinging capacity and the resulting public health hazard. Crude extracts of P. noctiluca nematocysts were tested for their cytotoxicity on Vero cells. Our results clearly showed that the nematocyst extracts induced cell mortality in a dose- and time-dependent manner. A cytoprotective effect against cell mortality was obtained when Vero cells were treated with vitamin E. This process was further confirmed by the generation of reactive oxygen species (ROS) and the induction of Hsp70 and Hsp27 protein expression. Thus, our findings suggest that oxidative stress is involved in the toxicity of P. noctiluca nematocysts and may constitute the major mechanism of this medusa's nematocyst toxicity.

  10. Correlation of mutagenic assessment of Houston air particulate extracts in relation to lung cancer mortality rates

    SciTech Connect

    Walker, R.D.; Connor, T.H.; MacDonald, E.J.; Trieff, N.M.; Legator, M.S.; MacKenzie, K.W. Jr.; Dobbins, J.G.

    1982-08-01

    Air particulate extracts from a series of solvents were tested in the Ames mutagen detection system and were found to be mutagenic in varying degrees as a function of the particulate collection site in Houston, Texas. The mutagenicity level at seven sites was compared with age-adjusted mortality rates in the same areas. Significant correlation was found with the lung cancer mortality rates but not with mortality rates for other causes. These findings support the hypothesis of a contribution of urban air particulate to the lung cancer rates. Furthermore, these findings suggest that an index of the mutagenicity of air particulate is a more powerful measure of the human health hazard of air pollution than the traditional indices of particulate concentration.

  11. Autonomy and Automation

    NASA Technical Reports Server (NTRS)

    Shively, Jay

    2017-01-01

    A significant level of debate and confusion has surrounded the meaning of the terms autonomy and automation. Automation is a multi-dimensional concept, and we propose that Remotely Piloted Aircraft Systems (RPAS) automation should be described with reference to the specific system and task that has been automated, the context in which the automation functions, and other relevant dimensions. In this paper, we present definitions of automation, pilot in the loop, pilot on the loop and pilot out of the loop. We further propose that in future, the International Civil Aviation Organization (ICAO) RPAS Panel avoids the use of the terms autonomy and autonomous when referring to automated systems on board RPA. Work Group 7 proposes to develop, in consultation with other workgroups, a taxonomy of Levels of Automation for RPAS.

  12. Automated Global Feature Analyzer (AGFA) for the Intelligent and Autonomous Robotic Exploration of the Solar System

    NASA Astrophysics Data System (ADS)

    Fink, W.; Datta, A.; Dohm, J. M.; Tarbell, M. A.; Jobling, F. M.; Furfaro, R.; Kargel, J. S.; Schulze-Makuch, D.; Lunine, J. I.; Baker, V. R.

    2008-03-01

    AGFA performs automated target identification and characterization through segmentation, providing for feature extraction, feature classification, target prioritization, and unbiased anomaly detection within mapped planetary operational areas.

  13. Automated labeling in document images

    NASA Astrophysics Data System (ADS)

    Kim, Jongwoo; Le, Daniel X.; Thoma, George R.

    2000-12-01

    The National Library of Medicine (NLM) is developing an automated system to produce bibliographic records for its MEDLINE database. This system, named the Medical Article Record System (MARS), employs document image analysis and understanding techniques and optical character recognition (OCR). This paper describes a key module in MARS called the Automated Labeling (AL) module, which labels all zones of interest (title, author, affiliation, and abstract) automatically. The AL algorithm is based on 120 rules derived from an analysis of journal page layouts and features extracted from OCR output. Experiments carried out on more than 11,000 articles in over 1,000 biomedical journals show that the accuracy of this rule-based algorithm exceeds 96%.
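Rule-based zone labeling of this kind can be sketched as a handful of feature tests over OCR zones, each rule combining layout features (position, font size) with text cues. The rules, features, and thresholds below are invented for illustration and bear no relation to the 120 production rules:

```python
# Illustrative rule-based labeler: assign a label to one OCR zone from
# simple layout features (normalized vertical position y, font size) and
# keyword cues in the recognized text. Rule order matters: more specific
# rules fire before more generic ones.

def label_zone(zone):
    """Return a label for one OCR zone based on simple feature rules."""
    text = zone["text"].lower()
    if zone["font_size"] >= 14 and zone["y"] < 0.2:
        return "title"                       # large type near the page top
    if "@" in text or "university" in text or "department" in text:
        return "affiliation"
    if text.startswith("abstract"):
        return "abstract"
    if zone["y"] < 0.35 and "," in text and zone["font_size"] < 14:
        return "author"                      # comma-separated names under title
    return "other"

zones = [
    {"text": "Automated labeling in document images", "font_size": 18, "y": 0.05},
    {"text": "Kim J, Le D, Thoma G", "font_size": 10, "y": 0.12},
    {"text": "National Library of Medicine, Department of ...", "font_size": 9, "y": 0.16},
    {"text": "Abstract: The National Library of Medicine ...", "font_size": 10, "y": 0.25},
]
print([label_zone(z) for z in zones])
```

Because such rules read directly off OCR output and page geometry, they can be tuned per journal layout family, which is how a rule set of this style scales across many journals.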

  14. Sensors and Automated Analyzers for Radionuclides

    SciTech Connect

    Grate, Jay W.; Egorov, Oleg B.

    2003-03-27

    The production of nuclear weapons materials has generated large quantities of nuclear waste and significant environmental contamination. We have developed new, rapid, automated methods for determination of radionuclides using sequential injection methodologies to automate extraction chromatographic separations, with on-line flow-through scintillation counting for real time detection. This work has progressed in two main areas: radionuclide sensors for water monitoring and automated radiochemical analyzers for monitoring nuclear waste processing operations. Radionuclide sensors have been developed that collect and concentrate radionuclides in preconcentrating minicolumns with dual functionality: chemical selectivity for radionuclide capture and scintillation for signal output. These sensors can detect pertechnetate to below regulatory levels and have been engineered into a prototype for field testing. A fully automated process monitor has been developed for total technetium in nuclear waste streams. This instrument performs sample acidification, speciation adjustment, separation and detection in fifteen minutes or less.

  15. Image segmentation for automated dental identification

    NASA Astrophysics Data System (ADS)

    Haj Said, Eyad; Nassar, Diaa Eldin M.; Ammar, Hany H.

    2006-02-01

    Dental features are one of the few biometric identifiers that qualify for postmortem identification; therefore, the creation of an Automated Dental Identification System (ADIS), with goals and objectives similar to those of the Automated Fingerprint Identification System (AFIS), has received increased attention. As a part of ADIS, teeth segmentation from dental radiograph films is an essential step in the identification process. In this paper, we introduce a fully automated approach for teeth segmentation with the goal of extracting at least one tooth from the dental radiograph film. We evaluate our approach on theoretical and empirical bases, and we compare its performance with that of other approaches introduced in the literature. The results show that our approach exhibits the lowest failure rate and the highest optimality among all fully automated approaches introduced in the literature.

  16. Summary of astronaut inputs concerning automation

    NASA Technical Reports Server (NTRS)

    Weeks, David J.

    1990-01-01

    An assessment of the potential for increased productivity on Space Station Freedom through advanced automation and robotics was recently completed. Sponsored by the Office of Space Station, the study involved reviews of on-orbit operations experience documentation, interviews with 23 current and former astronauts/payload specialists as well as other NASA and contractor personnel, and a survey of 32 astronauts and payload specialists. Assessed areas of related on-orbit experience included Skylab, the space shuttle, Spacelab, and the Soviet space program, as well as analogs from the U.S. nuclear submarine program and Antarctic research stations. The survey questionnaire asked the respondents to rate the desirability of advanced automation, EVA robotics, and IVA robotics. They were also asked to rate the safety impacts of automated fault diagnosis, isolation, and recovery (FDIR); automated exception reporting and alarm filtering; and an EVA retriever. The respondents were also asked to evaluate 26 specific applications of advanced automation and robotics in terms of their perceived impact on productivity.

  17. Temporal Feature Extraction from DCE-MRI to Identify Poorly Perfused Subvolumes of Tumors Related to Outcomes of Radiation Therapy in Head and Neck Cancer

    PubMed Central

    You, Daekeun; Aryal, Madhava; Samuels, Stuart E.; Eisbruch, Avraham; Cao, Yue

    2017-01-01

    This study aimed to develop an automated model to extract temporal features from DCE-MRI in head-and-neck (HN) cancers in order to localize significant tumor subvolumes having low blood volume (LBV) for predicting local and regional failure after chemoradiation therapy. Temporal features were extracted from time-intensity curves to build a classification model for differentiating voxels with LBV from those with high BV. A support vector machine (SVM) classifier was trained on the extracted features for voxel classification, and subvolumes with LBV were then assembled from the classified voxels. The model was trained and validated on independent datasets created from 456,873 DCE curves. The resultant subvolumes were compared to those derived by a two-step method via pharmacokinetic modeling of blood volume, and evaluated for classification accuracy and volumetric similarity by DSC. The proposed model achieved an average voxel-level classification accuracy of 82% and a DSC of 0.72. The model also showed tolerance to different acquisition parameters of DCE-MRI. With further validation, the model could be used directly for outcome prediction and therapy assessment in radiation therapy of HN cancers, or even to support boost target definition in adaptive clinical trials. The model is fully automatable, extendable, and scalable to extract temporal features of DCE-MRI in other tumors. PMID:28111634
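Extracting temporal features from a time-intensity curve can be illustrated with a few common summary features (time-to-peak, peak enhancement, washout slope) and a toy threshold rule for flagging a weakly enhancing voxel. The feature set and threshold here are hypothetical; the study trained an SVM over its extracted features rather than thresholding one of them:

```python
# Summary features of a DCE time-intensity curve sampled at uniform
# time points, plus a toy low-blood-volume flag based on enhancement.

def temporal_features(curve):
    """curve: list of signal intensities at uniform time points."""
    baseline = curve[0]
    peak = max(curve)
    t_peak = curve.index(peak)
    enhancement = peak - baseline
    # Mean slope from the peak to the end of the curve (washout phase).
    tail = len(curve) - 1 - t_peak
    washout = (curve[-1] - peak) / tail if tail else 0.0
    return {"t_peak": t_peak, "enhancement": enhancement, "washout": washout}

def is_low_blood_volume(curve, min_enhancement=20.0):
    # Toy rule: weakly enhancing voxels are flagged as low blood volume.
    return temporal_features(curve)["enhancement"] < min_enhancement

perfused = [100, 150, 190, 180, 170, 160]        # strong uptake, then washout
poorly_perfused = [100, 104, 108, 110, 109, 108]  # barely enhances
print(is_low_blood_volume(perfused), is_low_blood_volume(poorly_perfused))
```

In the voxelwise setting, each curve yields one such feature vector, and the per-voxel decisions are then grouped spatially into the LBV subvolumes described above.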

  18. Workflow automation architecture standard

    SciTech Connect

    Moshofsky, R.P.; Rohen, W.T.

    1994-11-14

    This document presents an architectural standard for application of workflow automation technology. The standard includes a functional architecture, process for developing an automated workflow system for a work group, functional and collateral specifications for workflow automation, and results of a proof of concept prototype.

  19. An extract from Taxodium distichum targets hemagglutinin- and neuraminidase-related activities of influenza virus in vitro

    PubMed Central

    Hsieh, Chung-Fan; Chen, Yu-Li; Lin, Chwan-Fwu; Ho, Jin-Yuan; Huang, Chun-Hsun; Chiu, Cheng-Hsun; Hsieh, Pei-Wen; Horng, Jim-Tong

    2016-01-01

    Influenza virus remains an emerging pathogen and causes pandemics with high fatality rates. After screening different plant extracts with potential anti-influenza activity, a water extract of Taxodium distichum stems (TDSWex) showed excellent activity against influenza viruses. The EC50 of TDSWex was 0.051 ± 0.024 mg/mL against influenza virus A/WSN/33. TDSWex had excellent antiviral efficacy against various strains of human influenza A and B viruses, particularly oseltamivir-resistant clinical isolates and a swine-origin influenza strain. We observed that the synthesis of viral RNA and protein was inhibited in the presence of TDSWex. The results of the time-of-addition assay suggested that TDSWex inhibited viral entry and budding. In the hemagglutination inhibition assay, TDSWex inhibited the hemagglutination of red blood cells, implying that the extract targeted hemagglutinin-related functions such as viral entry. In the attachment and penetration assays, TDSWex showed antiviral activity with EC50s of 0.045 ± 0.026 and 0.012 ± 0.003 mg/mL, respectively. In addition, TDSWex blocked neuraminidase activity. We conclude that TDSWex has bimodal activities against both hemagglutinin and neuraminidase during viral replication. PMID:27796330

  20. Expression pattern of sonic hedgehog signaling and calcitonin gene-related peptide in the socket healing process after tooth extraction.

    PubMed

    Pang, Pai; Shimo, Tsuyoshi; Takada, Hiroyuki; Matsumoto, Kenichi; Yoshioka, Norie; Ibaragi, Soichiro; Sasaki, Akira

    2015-11-06

    Sonic Hedgehog (SHH), a neural development inducer, plays a significant role in the bone healing process. Calcitonin gene-related peptide (CGRP), a neuropeptide marker of sensory nerves, has been demonstrated to affect bone formation. The roles of SHH signaling and CGRP-positive sensory nerves in the alveolar bone formation process have been unknown. Here we examined the expression patterns of SHH signaling and CGRP in the mouse extraction socket by immunohistochemistry and immunofluorescence analysis. We found that the expression level of SHH peaked at day 3 and then decreased by day 5 after tooth extraction. CGRP, PTCH1 and GLI2 were each expressed in a similar pattern, with their highest expression levels at day 5 and day 7 after tooth extraction. CGRP and GLI2 were co-expressed in some inflammatory cells and bone-forming cells. In some areas, CGRP-positive neurons expressed GLI2. In conclusion, SHH may affect alveolar bone healing by interacting with CGRP-positive sensory neurons and thus regulate the socket's healing process after tooth extraction.

  1. Automated Teaching: A Review of Theory and Research. Technical Report.

    ERIC Educational Resources Information Center

    Silverman, Robert E.

    The present state of research in the area of automated teaching and the application of automated teaching devices were reviewed in terms of issues relating to programing, to machine variables, and to studies comparing conventional instruction with automated instruction. Some basic issues were singled out for consideration. Such problems as…

  2. Workshop on Office Automation and Telecommunication: Applying the Technology.

    ERIC Educational Resources Information Center

    Mitchell, Bill

    This document contains 12 outlines that forecast the office of the future. The outlines cover the following topics: (1) office automation definition and objectives; (2) functional categories of office automation software packages for mini and mainframe computers; (3) office automation-related software for microcomputers; (4) office automation…

  3. 22 CFR 120.30 - The Automated Export System (AES).

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 22 Foreign Relations, § 120.30 (2010-04-01). The Automated Export System (AES). ... DEFINITIONS § 120.30 The Automated Export System (AES). The Automated Export System (AES) is the Department of Commerce, Bureau of Census, electronic filing of export information. The AES shall serve as the...

  4. Automated Vehicle Regulation: An Energy and Emissions Perspective

    SciTech Connect

    Levine, Aaron

    2016-05-18

    This presentation provides a summary of current automated vehicle policies in the United States and how they relate to reducing greenhouse gas (GHG) emissions. The presentation then looks at future automated vehicle trends that could either increase or reduce GHG emissions, and considers how policies currently used in other areas of law could be adapted to address automated vehicle GHG emissions.

  5. Tumorigenesis of diesel exhaust, gasoline exhaust, and related emission extracts on SENCAR mouse skin

    SciTech Connect

    Nesnow, S; Triplett, L L; Slaga, T J

    1980-01-01

    The tumorigenicity of diesel exhaust particulate emissions was examined using a sensitive mouse skin tumorigenesis model (SENCAR). The tumorigenic potency of particulate emissions from diesel, gasoline, and related emission sources was compared.

  6. Surface subsidence and collapse in relation to extraction of salt and other soluble evaporites

    USGS Publications Warehouse

    Ege, John R.

    1979-01-01

    Extraction of soluble minerals, whether by natural or man-induced processes, can result in localized land-surface subsidence and more rarely sinkhole formation. One process cited by many investigators is that uncontrolled dissolving of salt or other soluble evaporites can create or enlarge underground cavities, thereby increasing the span of the unsupported roof to the strength limit of the overlying rocks. Downwarping results when spans are exceeded, or collapse of the undermined roof leads to upward sloping or chimneying of the overburden rocks. If underground space is available for rock debris to collect, the void can migrate to the surface with the end result being surface subsidence or collapse. In North America natural solution subsidence and collapse features in rocks ranging in age from Silurian to the present are found in evaporite terranes in the Great Plains from Saskatchewan in the north to Texas and New Mexico in the south, in the Great Lakes area, and in the southeastern States. Man-induced subsidence and collapse in evaporites are generally associated with conventional or solution mining, oilfield operations, and reservoir and dam construction, and can be especially hazardous in populated or built-up areas.

  7. Automation in Clinical Microbiology

    PubMed Central

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  8. Automated Characterization Of Vibrations Of A Structure

    NASA Technical Reports Server (NTRS)

    Bayard, David S.; Yam, Yeung; Mettler, Edward; Hadaegh, Fred Y.; Milman, Mark H.; Scheid, Robert E.

    1992-01-01

    Automated method of characterizing dynamical properties of large flexible structure yields estimates of modal parameters used by robust control system to stabilize structure and minimize undesired motions. Based on extraction of desired modal and control-design data from responses of structure to known vibrational excitations. Applicable to terrestrial structures where vibrations are important - aircraft, buildings, bridges, cranes, and drill strings.
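    The record above describes estimating modal parameters from a structure's response to known vibrational excitations. As a toy illustration of the core idea, the sketch below picks the dominant vibration frequency from a sampled response via a naive DFT peak search; this stands in for the real estimation algorithms, which the abstract does not detail.

    ```python
    # Toy modal-frequency identification: record a structure's response, compute
    # its spectrum, and take the largest peak as the dominant mode. A naive O(n^2)
    # DFT is used here purely for illustration.
    import cmath
    import math

    def dominant_frequency(samples, sample_rate):
        """Return the frequency (Hz) of the largest DFT bin below Nyquist."""
        n = len(samples)
        mags = []
        for k in range(1, n // 2):  # skip the DC bin, stay below Nyquist
            s = sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))
            mags.append((abs(s), k))
        _, k = max(mags)
        return k * sample_rate / n

    # A 5 Hz sinusoid sampled at 100 Hz for 1 s peaks at the 5 Hz bin.
    response = [math.sin(2 * math.pi * 5 * t / 100) for t in range(100)]
    ```

    Real modal-identification methods also recover damping ratios and mode shapes from multi-sensor data, but the spectrum-peak intuition is the same.
    
    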

  9. Assessing the state of the art in biomedical relation extraction: overview of the BioCreative V chemical-disease relation (CDR) task.

    PubMed

    Wei, Chih-Hsuan; Peng, Yifan; Leaman, Robert; Davis, Allan Peter; Mattingly, Carolyn J; Li, Jiao; Wiegers, Thomas C; Lu, Zhiyong

    2016-01-01

    Manually curating chemicals, diseases and their relationships is critically important to biomedical research, but it is plagued by its high cost and the rapid growth of the biomedical literature. In recent years, there has been a growing interest in developing computational approaches for automatic chemical-disease relation (CDR) extraction. Despite these attempts, the lack of a comprehensive benchmarking dataset has limited the comparison of different techniques in order to assess and advance the current state of the art. To this end, we organized a challenge task through BioCreative V to automatically extract CDRs from the literature. We designed two challenge tasks: disease named entity recognition (DNER) and chemical-induced disease (CID) relation extraction. To assist system development and assessment, we created a large annotated text corpus that consisted of human annotations of chemicals, diseases and their interactions from 1500 PubMed articles. 34 teams worldwide participated in the CDR task: 16 (DNER) and 18 (CID). The best systems achieved an F-score of 86.46% for the DNER task--a result that approaches the human inter-annotator agreement (0.8875)--and an F-score of 57.03% for the CID task, the highest results ever reported for such tasks. When combining team results via machine learning, the ensemble system was able to further improve over the best team results, achieving F-scores of 88.89% and 62.80% for the DNER and CID tasks, respectively. Additionally, another novel aspect of our evaluation was to test each participating system's ability to return real-time results: the average response times for each team's DNER and CID web service systems were 5.6 and 9.3 s, respectively. Most teams used hybrid systems for their submissions based on machine learning. Given the level of participation and results, we found our task to be successful in engaging the text-mining research community, producing a large annotated corpus and improving the results of
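    The CDR evaluation above scores systems by F-score over extracted relations. A minimal sketch of that metric follows, treating predictions and gold annotations as sets of (document, chemical, disease) triples; the triple representation is an assumption for illustration, not the official BioCreative scorer.

    ```python
    # Micro-averaged precision, recall and F1 for relation extraction, computed
    # over sets of (document, chemical, disease) triples. Illustrative sketch of
    # the metric only, not the official BioCreative V evaluation script.

    def prf(gold, predicted):
        """Return (precision, recall, F1) for two sets of relation triples."""
        gold, predicted = set(gold), set(predicted)
        tp = len(gold & predicted)  # true positives: exact triple matches
        p = tp / len(predicted) if predicted else 0.0
        r = tp / len(gold) if gold else 0.0
        f1 = 2 * p * r / (p + r) if p + r else 0.0
        return p, r, f1
    ```

    Under this metric, a system that predicts 5 triples of which 3 match a 4-triple gold set scores precision 0.60, recall 0.75, and F1 about 0.67; ensembling multiple systems helps precisely because it can add true positives without proportionally adding spurious triples.
    
    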

  10. Assessing the state of the art in biomedical relation extraction: overview of the BioCreative V chemical-disease relation (CDR) task

    PubMed Central

    Wei, Chih-Hsuan; Peng, Yifan; Leaman, Robert; Davis, Allan Peter; Mattingly, Carolyn J.; Li, Jiao; Wiegers, Thomas C.; Lu, Zhiyong

    2016-01-01

    Manually curating chemicals, diseases and their relationships is critically important to biomedical research, but it is plagued by its high cost and the rapid growth of the biomedical literature. In recent years, there has been a growing interest in developing computational approaches for automatic chemical-disease relation (CDR) extraction. Despite these attempts, the lack of a comprehensive benchmarking dataset has limited the comparison of different techniques in order to assess and advance the current state of the art. To this end, we organized a challenge task through BioCreative V to automatically extract CDRs from the literature. We designed two challenge tasks: disease named entity recognition (DNER) and chemical-induced disease (CID) relation extraction. To assist system development and assessment, we created a large annotated text corpus that consisted of human annotations of chemicals, diseases and their interactions from 1500 PubMed articles. 34 teams worldwide participated in the CDR task: 16 (DNER) and 18 (CID). The best systems achieved an F-score of 86.46% for the DNER task—a result that approaches the human inter-annotator agreement (0.8875)—and an F-score of 57.03% for the CID task, the highest results ever reported for such tasks. When combining team results via machine learning, the ensemble system was able to further improve over the best team results, achieving F-scores of 88.89% and 62.80% for the DNER and CID tasks, respectively. Additionally, another novel aspect of our evaluation was to test each participating system’s ability to return real-time results: the average response times for each team’s DNER and CID web service systems were 5.6 and 9.3 s, respectively. Most teams used hybrid systems for their submissions based on machine learning. Given the level of participation and results, we found our task to be successful in engaging the text-mining research community, producing a large annotated corpus and improving the

  11. Automated office blood pressure.

    PubMed

    Myers, Martin G; Godwin, Marshall

    2012-05-01

    Manual blood pressure (BP) is gradually disappearing from clinical practice with the mercury sphygmomanometer