Science.gov

Sample records for automated relation extraction

  1. Hybrid curation of gene–mutation relations combining automated extraction and crowdsourcing

    PubMed Central

    Burger, John D.; Doughty, Emily; Khare, Ritu; Wei, Chih-Hsuan; Mishra, Rajashree; Aberdeen, John; Tresner-Kirsch, David; Wellner, Ben; Kann, Maricel G.; Lu, Zhiyong; Hirschman, Lynette

    2014-01-01

    Background: This article describes capture of biological information using a hybrid approach that combines natural language processing to extract biological entities and crowdsourcing, with annotators recruited via Amazon Mechanical Turk to judge the correctness of candidate biological relations. These techniques were applied to extract gene–mutation relations from biomedical abstracts, with the goal of supporting production-scale capture of gene–mutation–disease findings as an open source resource for personalized medicine. Results: The hybrid system could be configured to provide good performance for gene–mutation extraction (precision ∼82%; recall ∼70% against an expert-generated gold standard) at a cost of $0.76 per abstract. This demonstrates that crowd labor platforms such as Amazon Mechanical Turk can be used to recruit quality annotators, even in an application requiring subject matter expertise; aggregated Turker judgments for gene–mutation relations exceeded 90% accuracy. Over half of the precision errors were due to mismatches against the gold standard hidden from annotator view (e.g., an incorrect EntrezGene identifier or an incorrect extracted mutation position) or to incomplete task instructions (e.g., the need to exclude nonhuman mutations). Conclusions: The hybrid curation model provides a readily scalable, cost-effective approach to curation, particularly if coupled with expert human review to filter precision errors. We plan to generalize the framework and make it available as open source software. Database URL: http://www.mitre.org/publications/technical-papers/hybrid-curation-of-gene-mutation-relations-combining-automated PMID:25246425
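    The judgment-aggregation and scoring steps described above can be sketched as follows. This is an illustrative reconstruction, not the authors' code; the gene–mutation candidates, vote lists, and gold standard are invented:

```python
from collections import Counter

def aggregate_majority(judgments):
    """Collapse multiple worker judgments per candidate relation to one label."""
    return {cand: Counter(votes).most_common(1)[0][0]
            for cand, votes in judgments.items()}

def precision_recall(predicted, gold):
    """Score accepted candidate relations against a gold-standard set."""
    accepted = {c for c, label in predicted.items() if label == "yes"}
    tp = len(accepted & gold)
    precision = tp / len(accepted) if accepted else 0.0
    recall = tp / len(gold) if gold else 0.0
    return precision, recall

# Three candidate gene-mutation pairs, each judged by three workers.
judgments = {
    ("BRCA1", "c.68_69delAG"): ["yes", "yes", "no"],
    ("TP53", "R175H"):         ["yes", "yes", "yes"],
    ("EGFR", "L858R"):         ["no", "no", "yes"],
}
gold = {("BRCA1", "c.68_69delAG"), ("TP53", "R175H"), ("KRAS", "G12D")}

labels = aggregate_majority(judgments)
p, r = precision_recall(labels, gold)
```

    A production pipeline would presumably also weight workers by their accuracy on embedded gold questions rather than using a flat majority vote.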

  2. Automated Neuroanatomical Relation Extraction: A Linguistically Motivated Approach with a PVT Connectivity Graph Case Study

    PubMed Central

    Gökdeniz, Erinç; Özgür, Arzucan; Canbeyli, Reşit

    2016-01-01

    Identifying the relations among different regions of the brain is vital for a better understanding of how the brain functions. While a large number of studies have investigated the neuroanatomical and neurochemical connections among brain structures, their specific findings are scattered across a large number of publications of different types spanning many years. Text mining techniques have provided the means to extract specific types of information from a large number of publications with the aim of presenting a larger, if not necessarily exhaustive, picture. By using natural language processing techniques, the present paper aims to identify connectivity relations among brain regions in general and relations relevant to the paraventricular nucleus of the thalamus (PVT) in particular. We introduce a linguistically motivated approach based on patterns defined over the constituency and dependency parse trees of sentences. Besides the presence of a relation between a pair of brain regions, the proposed method also identifies the directionality of the relation, which enables the creation and analysis of a directional brain region connectivity graph. The approach is evaluated over the manually annotated data sets of the WhiteText Project. In addition, as a case study, the method is applied to extract and analyze the connectivity graph of the PVT, an important brain region considered to influence many functions ranging from arousal, motivation, and drug-seeking behavior to attention. The results of the PVT connectivity graph show that the PVT may be a new target of research in mood assessment. PMID:27708573
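    A minimal, hedged sketch of pattern-based relation extraction with directionality. The paper's actual patterns operate over constituency and dependency parse trees; the plain regular expressions, lexical cues, and region names below are stand-ins:

```python
import re

# Toy cue patterns; each maps a lexical cue to a (source, target) order.
PATTERNS = [
    (re.compile(r"(\w[\w ]*?) projects to (\w[\w ]*)", re.I), "forward"),
    (re.compile(r"(\w[\w ]*?) receives input from (\w[\w ]*)", re.I), "reverse"),
]

def norm(region):
    """Lower-case a region mention and drop a leading article."""
    return re.sub(r"^the\s+", "", region.strip(), flags=re.I).lower()

def extract_relations(sentences):
    """Directed connectivity graph: source region -> set of target regions."""
    graph = {}
    for sent in sentences:
        for pattern, direction in PATTERNS:
            m = pattern.search(sent)
            if not m:
                continue
            a, b = norm(m.group(1)), norm(m.group(2))
            src, dst = (a, b) if direction == "forward" else (b, a)
            graph.setdefault(src, set()).add(dst)
    return graph

sents = [
    "The paraventricular nucleus projects to the nucleus accumbens.",
    "The paraventricular nucleus receives input from the suprachiasmatic nucleus.",
]
graph = extract_relations(sents)
```

    The "reverse" cue is what captures directionality: the grammatical subject of "receives input from" is the target of the anatomical connection, not its source.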

  3. Automated Extraction of Flow Features

    NASA Technical Reports Server (NTRS)

    Dorney, Suzanne (Technical Monitor); Haimes, Robert

    2005-01-01

    Computational Fluid Dynamics (CFD) simulations are routinely performed as part of the design process of most fluid handling devices. In order to efficiently and effectively use the results of a CFD simulation, visualization tools are often used. These tools are used in all stages of the CFD simulation, including pre-processing, interim processing, and post-processing, to interpret the results. Each of these stages requires visualization tools that allow one to examine the geometry of the device, as well as the partial or final results of the simulation. An engineer will typically generate a series of contour and vector plots to better understand the physics of how the fluid is interacting with the physical device. Of particular interest is detecting features such as shocks, re-circulation zones, and vortices (which will highlight areas of stress and loss). As the demand for CFD analyses continues to increase, the need for automated feature extraction capabilities has become vital. In the past, feature extraction and identification were interesting concepts, but not required in understanding the physics of a steady flow field. This is because the results of the more traditional tools, like iso-surfaces, cuts, and streamlines, were more interactive and easily abstracted so they could be represented to the investigator. These tools worked and properly conveyed the collected information at the expense of a great deal of interaction. For unsteady flow-fields, the investigator does not have the luxury of spending time scanning only one "snapshot" of the simulation. Automated assistance is required in pointing out areas of potential interest contained within the flow. This must not require a heavy compute burden (the visualization should not significantly slow down the solution procedure for co-processing environments). Methods must be developed to abstract the feature of interest and display it in a manner that physically makes sense.

  4. Automated DNA extraction from pollen in honey.

    PubMed

    Guertler, Patrick; Eicheldinger, Adelina; Muschler, Paul; Goerlich, Ottmar; Busch, Ulrich

    2014-04-15

    In recent years, honey has become a subject of DNA analysis due to potential risks evoked by microorganisms, allergens or genetically modified organisms. However, so far only a few DNA extraction procedures are available, most of them time-consuming and laborious. Therefore, we developed an automated method for DNA extraction from pollen in honey, based on a CTAB buffer DNA extraction using the Maxwell 16 instrument and the Maxwell 16 FFS Nucleic Acid Extraction System, Custom-Kit. We altered several components and extraction parameters and compared the optimised method with a manual CTAB buffer-based DNA isolation method. The automated DNA extraction was faster and resulted in higher DNA yield and sufficient DNA purity. Real-time PCR results obtained after automated DNA extraction are comparable to results after manual DNA extraction. No PCR inhibition was observed. The applicability of this method was further successfully confirmed by analysis of different routine honey samples.

  5. Automated Extraction of Secondary Flow Features

    NASA Technical Reports Server (NTRS)

    Dorney, Suzanne M.; Haimes, Robert

    2005-01-01

    The use of Computational Fluid Dynamics (CFD) has become standard practice in the design and development of the major components used for air and space propulsion. To aid in the post-processing and analysis phase of CFD, many researchers now use automated feature extraction utilities. These tools can be used to detect the existence of such features as shocks, vortex cores, and separation and re-attachment lines. The existence of secondary flow is another feature of significant importance to CFD engineers. Although the concept of secondary flow is relatively well understood, there is no commonly accepted mathematical definition of secondary flow. This paper will present a definition for secondary flow and one approach for automatically detecting and visualizing secondary flow.
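    The abstract does not spell out its definition, but one common way to make "secondary flow" precise is as the velocity component perpendicular to a chosen primary flow direction (e.g., the passage-averaged flow). A sketch under that assumption, with invented vectors:

```python
import math

def secondary_component(v, primary_dir):
    """Split velocity v into components along and perpendicular to primary_dir."""
    mag = math.sqrt(sum(c * c for c in primary_dir))
    u = [c / mag for c in primary_dir]            # unit primary direction
    along = sum(vi * ui for vi, ui in zip(v, u))  # scalar projection
    secondary = [vi - along * ui for vi, ui in zip(v, u)]
    return along, secondary

# A velocity with a cross-stream component relative to an x-aligned primary flow.
along, sec = secondary_component([3.0, 4.0, 0.0], [1.0, 0.0, 0.0])
```

    A detector built on this decomposition would flag regions where the magnitude of the secondary component is large relative to the primary one.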

  6. Acceleration of Automated HI Source Extraction

    NASA Astrophysics Data System (ADS)

    Badenhorst, S. J.; Blyth, S.; Kuttel, M. M.

    2013-10-01

    We aim to enable fast automated extraction of neutral hydrogen (HI) sources from large survey data sets. This requires both handling the large files (>5 TB) to be produced by next-generation interferometers and acceleration of the source extraction algorithm. We develop an efficient multithreaded implementation of the A'Trous wavelet reconstruction algorithm, which we evaluate against the serial implementation in the DUCHAMP package. We also evaluate three memory management libraries (Mmap, Boost and Stxxl) that enable processing of data files too large to fit into main memory, to establish which provides the best performance.
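    A hedged illustration of the à trous ("with holes") idea: each smoothing level convolves with a B3-spline kernel dilated by 2^level, and the per-level differences (wavelet planes) plus the final smooth reconstruct the signal exactly. This is a serial 1-D sketch, not the multithreaded implementation evaluated against DUCHAMP:

```python
def atrous_smooth(signal, level):
    """One a trous smoothing pass: B3-spline kernel with 2**level holes."""
    kernel = [1 / 16, 4 / 16, 6 / 16, 4 / 16, 1 / 16]
    step = 2 ** level
    n = len(signal)
    out = []
    for i in range(n):
        acc = 0.0
        for k, w in zip(range(-2, 3), kernel):
            j = min(max(i + k * step, 0), n - 1)  # clamp at the borders
            acc += w * signal[j]
        out.append(acc)
    return out

def atrous_decompose(signal, levels):
    """Return wavelet planes plus the final smooth; their sum reconstructs signal."""
    planes, current = [], list(signal)
    for lev in range(levels):
        smooth = atrous_smooth(current, lev)
        planes.append([c - s for c, s in zip(current, smooth)])
        current = smooth
    return planes, current

sig = [0.0, 0.0, 1.0, 5.0, 1.0, 0.0, 0.0, 0.0]
planes, residual = atrous_decompose(sig, 3)
recon = [sum(vals) + r for vals, r in zip(zip(*planes), residual)]
```

    Source finding then thresholds the wavelet planes before summing, so that narrow noise spikes are suppressed while extended HI emission survives.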

  7. Automated Fluid Feature Extraction from Transient Simulations

    NASA Technical Reports Server (NTRS)

    Haimes, Robert; Lovely, David

    1999-01-01

    In the past, feature extraction and identification were interesting concepts, but not required to understand the underlying physics of a steady flow field. This is because the results of the more traditional tools like iso-surfaces, cuts and streamlines were more interactive and easily abstracted so they could be represented to the investigator. These tools worked and properly conveyed the collected information at the expense of much interaction. For unsteady flow-fields, the investigator does not have the luxury of spending time scanning only one "snap-shot" of the simulation. Automated assistance is required in pointing out areas of potential interest contained within the flow. This must not require a heavy compute burden (the visualization should not significantly slow down the solution procedure for co-processing environments like pV3). And methods must be developed to abstract the feature and display it in a manner that physically makes sense. The following is a list of the important physical phenomena found in transient (and steady-state) fluid flow: (1) Shocks, (2) Vortex cores, (3) Regions of recirculation, (4) Boundary layers, (5) Wakes. Three papers and an initial specification for the FX (Fluid eXtraction) tool kit Programmer's Guide are included. The papers, submitted to the AIAA Computational Fluid Dynamics Conference, are entitled: (1) Using Residence Time for the Extraction of Recirculation Regions, (2) Shock Detection from Computational Fluid Dynamics Results and (3) On the Velocity Gradient Tensor and Fluid Feature Extraction.
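    As an illustration of the first phenomenon in the list, shocks appear as near-discontinuous jumps in density or pressure across a cell interface. A toy 1-D detector (the threshold and the density profile are invented) might flag interfaces this way:

```python
def detect_shocks(density, threshold=0.2):
    """Flag cell interfaces whose relative density jump exceeds threshold."""
    hits = []
    for i in range(len(density) - 1):
        jump = abs(density[i + 1] - density[i]) / max(density[i], 1e-12)
        if jump > threshold:
            hits.append(i)
    return hits

# Shock-tube-like profile: smooth region, sharp jump, smooth region.
rho = [1.0, 1.0, 0.99, 0.98, 0.4, 0.39, 0.39]
shock_faces = detect_shocks(rho)
```

    Production detectors are more discriminating, e.g. requiring the pressure gradient to align with the flow direction so that contact discontinuities are not misclassified as shocks.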

  8. Automated Fluid Feature Extraction from Transient Simulations

    NASA Technical Reports Server (NTRS)

    Haimes, Robert

    2000-01-01

    In the past, feature extraction and identification were interesting concepts, but not required in understanding the physics of a steady flow field. This is because the results of the more traditional tools like iso-surfaces, cuts and streamlines, were more interactive and easily abstracted so they could be represented to the investigator. These tools worked and properly conveyed the collected information at the expense of a great deal of interaction. For unsteady flow-fields, the investigator does not have the luxury of spending time scanning only one 'snap-shot' of the simulation. Automated assistance is required in pointing out areas of potential interest contained within the flow. This must not require a heavy compute burden (the visualization should not significantly slow down the solution procedure for co-processing environments like pV3). And methods must be developed to abstract the feature and display it in a manner that physically makes sense.

  9. Automated Fluid Feature Extraction from Transient Simulations

    NASA Technical Reports Server (NTRS)

    Haimes, Robert

    1998-01-01

    In the past, feature extraction and identification were interesting concepts, but not required to understand the underlying physics of a steady flow field. This is because the results of the more traditional tools like iso-surfaces, cuts and streamlines were more interactive and easily abstracted so they could be represented to the investigator. These tools worked and properly conveyed the collected information at the expense of much interaction. For unsteady flow-fields, the investigator does not have the luxury of spending time scanning only one "snap-shot" of the simulation. Automated assistance is required in pointing out areas of potential interest contained within the flow. This must not require a heavy compute burden (the visualization should not significantly slow down the solution procedure for co-processing environments like pV3). And methods must be developed to abstract the feature and display it in a manner that physically makes sense. The following is a list of the important physical phenomena found in transient (and steady-state) fluid flow: Shocks; Vortex Cores; Regions of Recirculation; Boundary Layers; Wakes.

  10. Automated feature extraction for 3-dimensional point clouds

    NASA Astrophysics Data System (ADS)

    Magruder, Lori A.; Leigh, Holly W.; Soderlund, Alexander; Clymer, Bradley; Baer, Jessica; Neuenschwander, Amy L.

    2016-05-01

    Light detection and ranging (LIDAR) technology offers the capability to rapidly capture high-resolution, 3-dimensional surface data with centimeter-level accuracy for a large variety of applications. Due to the foliage-penetrating properties of LIDAR systems, these geospatial data sets can detect ground surfaces beneath trees, enabling the production of high-fidelity bare earth elevation models. Precise characterization of the ground surface allows for identification of terrain and non-terrain points within the point cloud, and facilitates further discernment between natural and man-made objects based solely on structural aspects and relative neighboring parameterizations. A framework is presented here for automated extraction of natural and man-made features that does not rely on coincident ortho-imagery or point RGB attributes. The TEXAS (Terrain EXtraction And Segmentation) algorithm is used first to generate a bare earth surface from a LIDAR survey, which is then used to classify points as terrain or non-terrain. Further classifications are assigned at the point level by leveraging local spatial information. Similarly classed points are then clustered together into regions to identify individual features. Descriptions of the spatial attributes of each region are generated, resulting in the identification of individual tree locations, forest extents, building footprints, and 3-dimensional building shapes, among others. Results of the fully automated feature extraction algorithm are then compared to ground truth to assess completeness and accuracy of the methodology.
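    The terrain/non-terrain split can be caricatured with a minimum-z grid surface: take the lowest return per cell as bare earth, then label points by height above it. This is a deliberate simplification of the TEXAS algorithm, which the abstract does not detail, and the cell size, tolerance, and points are invented:

```python
from collections import defaultdict

def classify_points(points, cell=1.0, ground_tol=0.3):
    """Grid min-z bare-earth surface, then label points by height above it."""
    ground = defaultdict(lambda: float("inf"))
    for x, y, z in points:
        key = (int(x // cell), int(y // cell))
        ground[key] = min(ground[key], z)
    labels = []
    for x, y, z in points:
        key = (int(x // cell), int(y // cell))
        labels.append("terrain" if z - ground[key] <= ground_tol else "non-terrain")
    return labels

# Three returns in one cell (one a canopy hit) plus one in a neighbouring cell.
pts = [(0.2, 0.3, 10.0), (0.4, 0.6, 10.1), (0.5, 0.5, 14.0), (1.5, 0.5, 10.2)]
labels = classify_points(pts)
```

    Clustering the non-terrain points by proximity would then yield the per-feature regions (trees, buildings) the paper goes on to describe.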

  11. Automated feature extraction and classification from image sources

    USGS Publications Warehouse

    U.S. Geological Survey

    1995-01-01

    The U.S. Department of the Interior, U.S. Geological Survey (USGS), and Unisys Corporation have completed a cooperative research and development agreement (CRADA) to explore automated feature extraction and classification from image sources. The CRADA helped the USGS define the spectral and spatial resolution characteristics of airborne and satellite imaging sensors necessary to meet base cartographic and land use and land cover feature classification requirements and help develop future automated geographic and cartographic data production capabilities. The USGS is seeking a new commercial partner to continue automated feature extraction and classification research and development.

  12. Automating Relational Database Design for Microcomputer Users.

    ERIC Educational Resources Information Center

    Pu, Hao-Che

    1991-01-01

    Discusses issues involved in automating the relational database design process for microcomputer users and presents a prototype of a microcomputer-based system (RA, Relation Assistant) that is based on expert systems technology and helps avoid database maintenance problems. Relational database design is explained and the importance of easy input…
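    Relational design automation of this kind typically builds on functional-dependency reasoning. As a generic, hedged example (not drawn from the RA prototype, whose internals the abstract does not describe), the attribute-closure computation used for key finding looks like:

```python
def closure(attrs, fds):
    """Attribute closure of attrs under functional dependencies (lhs -> rhs)."""
    result = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            if set(lhs) <= result and not set(rhs) <= result:
                result |= set(rhs)
                changed = True
    return result

# Hypothetical schema R(A, B, C, D) with dependencies A -> B and B -> C.
fds = [("A", "B"), ("B", "C")]
a_closure = closure({"A"}, fds)   # D is missing, so A alone is not a key
```

    A design assistant can use such closures to find candidate keys and to test whether a proposed decomposition preserves dependencies.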

  13. Automated blood vessel extraction using local features on retinal images

    NASA Astrophysics Data System (ADS)

    Hatanaka, Yuji; Samo, Kazuki; Tajima, Mikiya; Ogohara, Kazunori; Muramatsu, Chisako; Okumura, Susumu; Fujita, Hiroshi

    2016-03-01

    An automated blood vessel extraction method using high-order local autocorrelation (HLAC) features on retinal images is presented. Although many blood vessel extraction methods based on contrast have been proposed, a technique based on the relation of neighboring pixels has not been published. HLAC features are shift-invariant; therefore, we applied HLAC features to retinal images. However, HLAC features are sensitive to rotation, so the method was improved by also computing HLAC features on a polar-transformed image. The blood vessels were classified using an artificial neural network (ANN) with HLAC features based on 105 mask patterns as input. To improve performance, a second ANN (ANN2) was constructed using the green component of the color retinal image and the four output values of the first ANN, a Gabor filter, a double-ring filter, and a black-top-hat transformation. The retinal images used in this study were obtained from the "Digital Retinal Images for Vessel Extraction" (DRIVE) database. The ANN using HLAC features produced clearly white values in the blood vessel regions and could also extract blood vessels with low contrast. The outputs were evaluated using the area under the curve (AUC) based on receiver operating characteristic (ROC) analysis. The AUC of ANN2 was 0.960. The result can be used for the quantitative analysis of the blood vessels.
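    HLAC features are counts of local mask-pattern co-occurrences, which is what makes them shift-invariant. A toy binary-image version with three masks (the actual method uses 105 mask patterns on grayscale images, feeding an ANN):

```python
def hlac_features(img, masks):
    """Count, per mask, pixels where the centre and all offset pixels are 1."""
    h, w = len(img), len(img[0])
    feats = []
    for offsets in masks:
        count = 0
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                if img[y][x] and all(img[y + dy][x + dx] for dy, dx in offsets):
                    count += 1
        feats.append(count)
    return feats

# Three simple 3x3 masks: centre only, horizontal pair, vertical pair.
MASKS = [[], [(0, 1)], [(1, 0)]]
img = [[0, 0, 0, 0],
       [0, 1, 1, 0],
       [0, 1, 1, 0],
       [0, 0, 0, 0]]
feats = hlac_features(img, MASKS)
```

    Translating the square anywhere in the interior leaves the counts unchanged, which is the shift-invariance the abstract relies on; rotating it changes which masks fire, which is the weakness the polar transform is meant to address.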

  14. Automated sea floor extraction from underwater video

    NASA Astrophysics Data System (ADS)

    Kelly, Lauren; Rahmes, Mark; Stiver, James; McCluskey, Mike

    2016-05-01

    Ocean floor mapping using video is a method to simply and cost-effectively record large areas of the seafloor. Obtaining visual and elevation models has noteworthy applications in search and recovery missions. Hazards to navigation are abundant and pose a significant threat to the safety, effectiveness, and speed of naval operations and commercial vessels. This project's objective was to develop a workflow to automatically extract metadata from marine video and create image optical and elevation surface mosaics. Three developments made this possible. First, optical character recognition (OCR) by means of two-dimensional correlation, using a known character set, allowed for the capture of metadata from image files. Second, exploiting the image metadata (i.e., latitude, longitude, heading, camera angle, and depth readings) allowed for the determination of location and orientation of the image frame in mosaic. Image registration improved the accuracy of mosaicking. Finally, overlapping data allowed us to determine height information. A disparity map was created using the parallax from overlapping viewpoints of a given area and the relative height data was utilized to create a three-dimensional, textured elevation map.
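    OCR by two-dimensional correlation against a known character set can be sketched with tiny binary templates. The 3x3 glyphs below are invented, and a real system would normalize scores so heavily inked templates do not dominate:

```python
def correlate(patch, template):
    """2-D correlation score between equal-sized binary arrays."""
    return sum(p * t for prow, trow in zip(patch, template)
                     for p, t in zip(prow, trow))

def recognize(patch, charset):
    """Return the character whose template correlates best with the patch."""
    return max(charset, key=lambda ch: correlate(patch, charset[ch]))

CHARSET = {
    "I": [[0, 1, 0],
          [0, 1, 0],
          [0, 1, 0]],
    "L": [[1, 0, 0],
          [1, 0, 0],
          [1, 1, 1]],
}
patch = [[1, 0, 0],
         [1, 0, 0],
         [1, 1, 1]]
best = recognize(patch, CHARSET)
```

    Running this over the fixed screen positions of a video overlay recovers the latitude, longitude, heading, and depth fields that drive the mosaicking step.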

  15. Toward the automation of road networks extraction processes

    NASA Astrophysics Data System (ADS)

    Leymarie, Frederic; Boichis, Nicolas; Airault, Sylvain; Jamet, Olivier

    1996-12-01

    Syseca and IGN are working on various steps in the ongoing march from digital photogrammetry to the semi-automation and ultimately the full automation of data manipulation, i.e., capture and analysis. The immediate goals are to reduce the production costs and the data availability delays. Within this context, we have tackled the distinctive problem of 'automated road network extraction.' The methodology adopted is to first study semi-automatic solutions, which are likely to increase the global efficiency of human operators in topographic data capture; in a second step, automatic solutions are designed based upon the experience gained. We report on different (semi-)automatic solutions for the road following algorithm. One key aspect of our method is to have the stages of 'detection' and 'geometric recovery' cooperate while remaining distinct. 'Detection' is based on a local (texture) analysis of the image, while 'geometric recovery' is concerned with the extraction of 'road objects' from both monocular and stereo information. 'Detection' is a low-level visual process, 'reasoning' directly at the level of image intensities, while the mid-level visual process, 'geometric recovery', uses contextual knowledge about roads, both generic, e.g. parallelism of borders, and specific, e.g. using previously extracted road segments and disparities. We then pursue our 'march' by reporting on steps we are exploring toward full automation. We have in particular made attempts at tackling the automation of the initialization step, to start searching in a valid direction.

  16. Automated vasculature extraction from placenta images

    NASA Astrophysics Data System (ADS)

    Almoussa, Nizar; Dutra, Brittany; Lampe, Bryce; Getreuer, Pascal; Wittman, Todd; Salafia, Carolyn; Vese, Luminita

    2011-03-01

    Recent research in perinatal pathology argues that analyzing properties of the placenta may reveal important information on how certain diseases progress. One important property is the structure of the placental blood vessels, which supply a fetus with all of its oxygen and nutrition. An essential step in the analysis of the vascular network pattern is the extraction of the blood vessels, which has only been done manually through a costly and time-consuming process. There is no existing method to automatically detect placental blood vessels; in addition, the large variation in the shape, color, and texture of the placenta makes it difficult to apply standard edge-detection algorithms. We describe a method to automatically detect and extract blood vessels from a given image by using image processing techniques and neural networks. We evaluate several local features for every pixel, in addition to a novel modification to an existing road detector. Pixels belonging to blood vessel regions have recognizable responses; hence, we use an artificial neural network to identify the pattern of blood vessels. A set of images where blood vessels are manually highlighted is used to train the network. We then apply the neural network to recognize blood vessels in new images. The network is effective in capturing the most prominent vascular structures of the placenta.

  17. Automated Image Registration Using Morphological Region of Interest Feature Extraction

    NASA Technical Reports Server (NTRS)

    Plaza, Antonio; LeMoigne, Jacqueline; Netanyahu, Nathan S.

    2005-01-01

    With the recent explosion in the amount of remotely sensed imagery and the corresponding interest in temporal change detection and modeling, image registration has become increasingly important as a necessary first step in the integration of multi-temporal and multi-sensor data for applications such as the analysis of seasonal and annual global climate changes, as well as land use/cover changes. The task of image registration can be divided into two major components: (1) the extraction of control points or features from images; and (2) the search among the extracted features for the matching pairs that represent the same feature in the images to be matched. Manual control feature extraction can be subjective and extremely time consuming, and often results in few usable points. Automated feature extraction is a solution to this problem, where desired target features are invariant, and represent evenly distributed landmarks such as edges, corners and line intersections. In this paper, we develop a novel automated registration approach based on the following steps. First, a mathematical morphology (MM)-based method is used to obtain a scale-orientation morphological profile at each image pixel. Next, a spectral dissimilarity metric such as the spectral information divergence is applied for automated extraction of landmark chips, followed by an initial approximate matching. This initial condition is then refined using a hierarchical robust feature matching (RFM) procedure. Experimental results reveal that the proposed registration technique offers a robust solution in the presence of seasonal changes and other interfering factors. Keywords: automated image registration, multi-temporal imagery, mathematical morphology, robust feature matching.
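    Once landmark features are extracted, the initial approximate matching can be caricatured as an exhaustive search over integer shifts. This stands in for the paper's hierarchical robust feature matching, and the feature coordinates and images are invented:

```python
def estimate_shift(features, moving, max_shift=3):
    """Integer shift mapping the most reference feature points onto
    foreground pixels of the moving image."""
    h, w = len(moving), len(moving[0])
    best, best_score = (0, 0), -1
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            score = sum(1 for y, x in features
                        if 0 <= y + dy < h and 0 <= x + dx < w
                        and moving[y + dy][x + dx])
            if score > best_score:
                best_score, best = score, (dy, dx)
    return best

features = [(2, 2), (2, 3), (3, 2)]              # landmark pixels in the reference
moving = [[0] * 6 for _ in range(6)]
moving[3][4] = moving[3][5] = moving[4][4] = 1   # same landmark shifted by (1, 2)
shift = estimate_shift(features, moving)
```

    Scoring only foreground agreement (rather than all pixels) keeps the empty background from swamping the few landmark pixels; a refinement stage would then recover sub-pixel and rotational components.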

  18. Automated extraction of odontocete whistle contours.

    PubMed

    Roch, Marie A; Brandes, T Scott; Patel, Bhavesh; Barkley, Yvonne; Baumann-Pickering, Simone; Soldevilla, Melissa S

    2011-10-01

    Many odontocetes produce frequency modulated tonal calls known as whistles. The ability to automatically determine time × frequency tracks corresponding to these vocalizations has numerous applications including species description, identification, and density estimation. This work develops and compares two algorithms on a common corpus of nearly one hour of data collected in the Southern California Bight and at Palmyra Atoll. The corpus contains over 3000 whistles from bottlenose dolphins, long- and short-beaked common dolphins, spinner dolphins, and melon-headed whales that have been annotated by a human, and released to the Moby Sound archive. Both algorithms use a common signal processing front end to determine time × frequency peaks from a spectrogram. In the first method, a particle filter performs Bayesian filtering, estimating the contour from the noisy spectral peaks. The second method uses an adaptive polynomial prediction to connect peaks into a graph, merging graphs when they cross. Whistle contours are extracted from graphs using information from both sides of crossings. The particle filter was able to retrieve 71.5% (recall) of the human annotated tonals with 60.8% of the detections being valid (precision). The graph algorithm's recall rate was 80.0% with a precision of 76.9%.
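    The peak-linking idea behind the graph method can be sketched greedily; the published algorithm uses adaptive polynomial prediction and resolves contour crossings, which this toy version (with invented frequencies and tolerance) does not:

```python
def link_peaks(frames, max_jump=200.0):
    """Greedily link per-frame spectral peaks (Hz) into contours."""
    contours = []            # each contour: list of (frame_index, freq)
    open_tracks = []
    for t, peaks in enumerate(frames):
        next_open = []
        unused = list(peaks)
        for track in open_tracks:
            last_f = track[-1][1]
            best = min(unused, key=lambda f: abs(f - last_f), default=None)
            if best is not None and abs(best - last_f) <= max_jump:
                track.append((t, best))
                unused.remove(best)
                next_open.append(track)
            else:
                contours.append(track)   # no continuation: track ends
        for f in unused:                 # leftover peaks start new tracks
            next_open.append([(t, f)])
        open_tracks = next_open
    return contours + open_tracks

# A rising whistle plus a second, lower whistle starting two frames later.
frames = [[10000.0], [10150.0], [10300.0, 5000.0], [5100.0]]
tracks = link_peaks(frames)
```

    Replacing the "nearest previous frequency" rule with a polynomial fit over each track's recent history is what lets the real algorithm follow contours through crossings.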

  19. Automated Brain Extraction from T2-weighted Magnetic Resonance Images

    PubMed Central

    Datta, Sushmita; Narayana, Ponnada A.

    2011-01-01

    Purpose: To develop and implement an automated and robust technique to extract brain from T2-weighted images. Materials and Methods: Magnetic resonance imaging (MRI) was performed on 75 adult volunteers to acquire dual fast spin echo (FSE) images with fat-saturation technique on a 3T Philips scanner. Histogram-derived thresholds were derived directly from the original images, followed by the application of regional labeling, regional connectivity, and mathematical morphological operations to extract brain from axial late-echo FSE (T2-weighted) images. The proposed technique was evaluated subjectively by an expert and quantitatively using a Bland-Altman plot and Jaccard and Dice similarity measures. Results: Excellent agreement between the extracted brain volumes with the proposed technique and manual stripping by an expert was observed based on the Bland-Altman plot and also as assessed by high similarity indices (Jaccard: 0.9825 ± 0.0045; Dice: 0.9912 ± 0.0023). Conclusion: Brain extraction using the proposed automated methodology is robust and the results are reproducible. PMID:21448946
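    The pipeline of thresholding followed by regional connectivity can be illustrated on a toy 2-D image. The fixed threshold below is a stand-in for the paper's histogram-derived one, and real use would add the morphological operations:

```python
from collections import deque

def largest_component(mask):
    """Keep only the largest 4-connected component of a binary mask."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    best = []
    for sy in range(h):
        for sx in range(w):
            if not mask[sy][sx] or seen[sy][sx]:
                continue
            comp, queue = [], deque([(sy, sx)])
            seen[sy][sx] = True
            while queue:
                y, x = queue.popleft()
                comp.append((y, x))
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not seen[ny][nx]:
                        seen[ny][nx] = True
                        queue.append((ny, nx))
            if len(comp) > len(best):
                best = comp
    out = [[0] * w for _ in range(h)]
    for y, x in best:
        out[y][x] = 1
    return out

image = [[0, 9, 9, 0, 0],
         [0, 9, 9, 0, 8],
         [0, 0, 0, 0, 8]]
mask = [[1 if v > 5 else 0 for v in row] for row in image]  # threshold stand-in
brain = largest_component(mask)
```

    Keeping only the largest connected region is what discards bright non-brain tissue (e.g. scalp fat) that survives the intensity threshold.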

  20. Automated RNA Extraction and Purification for Multiplexed Pathogen Detection

    SciTech Connect

    Bruzek, Amy K.; Bruckner-Lea, Cindy J.

    2005-01-01

    Pathogen detection has become an extremely important part of our nation's defense in this post-9/11 world, where the threat of bioterrorist attacks is a grim reality. When a biological attack takes place, response time is critical. The faster the biothreat is assessed, the faster countermeasures can be put in place to protect the health of the general public. Today, some of the most widely used methods for detecting pathogens are either time consuming or not reliable [1]. Therefore, a method that can detect multiple pathogens and that is inherently reliable, rapid, automated and field portable is needed. To that end, we are developing automated fluidics systems for the recovery, cleanup, and direct labeling of community RNA from suspect environmental samples. The advantage of using RNA for detection is that there are multiple copies of mRNA in a cell, whereas there are normally only one or two copies of DNA [2]. Because there are multiple copies of mRNA in a cell for highly expressed genes, no amplification of the genetic material may be necessary, and thus rapid and direct detection of only a few cells may be possible [3]. This report outlines the development of both manual and automated methods for the extraction and purification of mRNA. The methods were evaluated using cell lysates from Escherichia coli 25922 (nonpathogenic), Salmonella typhimurium (pathogenic), and Shigella spp. (pathogenic). Automated RNA purification was achieved using a custom sequential injection fluidics system consisting of a syringe pump, a multi-port valve and a magnetic capture cell. mRNA was captured using silica-coated superparamagnetic beads that were trapped in the tubing by a rare earth magnet. RNA was detected by gel electrophoresis and/or by hybridization of the RNA to microarrays. The versatility of the fluidics systems and the ability to automate these systems allows for quick and easy processing of samples and eliminates the need for an experienced operator.

  1. Arduino-based automation of a DNA extraction system.

    PubMed

    Kim, Kyung-Won; Lee, Mi-So; Ryu, Mun-Ho; Kim, Jong-Won

    2015-01-01

    There have been many studies to detect infectious diseases with the molecular genetic method. This study presents an automation process for a DNA extraction system based on microfluidics and magnetic bead, which is part of a portable molecular genetic test system. This DNA extraction system consists of a cartridge with chambers, syringes, four linear stepper actuators, and a rotary stepper actuator. The actuators provide a sequence of steps in the DNA extraction process, such as transporting, mixing, and washing for the gene specimen, magnetic bead, and reagent solutions. The proposed automation system consists of a PC-based host application and an Arduino-based controller. The host application compiles a G code sequence file and interfaces with the controller to execute the compiled sequence. The controller executes stepper motor axis motion, time delay, and input-output manipulation. It drives the stepper motor with an open library, which provides a smooth linear acceleration profile. The controller also provides a homing sequence to establish the motor's reference position, and hard limit checking to prevent any over-travelling. The proposed system was implemented and its functionality was investigated, especially regarding positioning accuracy and velocity profile. PMID:26409535
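    The host-side G code compilation step might look like the following sketch. The supported codes, the steps-per-millimeter figure, and the sample program are assumptions for illustration, not the authors' protocol:

```python
def compile_gcode(lines, steps_per_mm=80):
    """Translate a tiny G code subset into (command, axis/arg, value) motor ops."""
    ops = []
    for line in lines:
        line = line.split(";")[0].strip()   # strip trailing comments
        if not line:
            continue
        fields = line.split()
        code = fields[0].upper()
        args = {f[0].upper(): float(f[1:]) for f in fields[1:]}
        if code == "G1":                    # linear move, mm -> motor steps
            for axis in ("X", "Y", "Z"):
                if axis in args:
                    ops.append(("MOVE", axis, int(args[axis] * steps_per_mm)))
        elif code == "G4":                  # dwell, P in milliseconds
            ops.append(("DWELL", "P", int(args.get("P", 0))))
        elif code == "G28":                 # home: establish reference position
            ops.append(("HOME", "ALL", 0))
        else:
            raise ValueError("unsupported code: " + code)
    return ops

program = ["G28        ; establish reference position",
           "G1 X2.5    ; advance syringe axis 2.5 mm",
           "G4 P500    ; wait for mixing",
           "G1 X-2.5"]
ops = compile_gcode(program)
```

    The controller side would consume such ops one at a time, applying its acceleration profile to MOVE commands and checking hard limits before each step.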

  3. Automated labeling of bibliographic data extracted from biomedical online journals

    NASA Astrophysics Data System (ADS)

    Kim, Jongwoo; Le, Daniel X.; Thoma, George R.

    2003-01-01

    A prototype system has been designed to automate the extraction of bibliographic data (e.g., article title, authors, abstract, affiliation and others) from online biomedical journals to populate the National Library of Medicine's MEDLINE database. This paper describes a key module in this system: the labeling module that employs statistics and fuzzy rule-based algorithms to identify segmented zones in an article's HTML pages as specific bibliographic data. Results from experiments conducted with 1,149 medical articles from forty-seven journal issues are presented.
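
    The labeling step can be sketched as rule-based scoring of zone features. The features, rules, and weights below are illustrative assumptions, not the statistics used by the prototype.

```python
# Toy rule-based zone labeling in the spirit of the module described above.
# Zone features (font size, relative vertical position y, keyword flags) and
# rule weights are invented for illustration.

RULES = {
    'title':    lambda z: 2.0 * (z['font_size'] >= 14) + 1.0 * (z['y'] < 0.2),
    'authors':  lambda z: 1.5 * (0.1 < z['y'] < 0.4) + 1.0 * z['has_commas'],
    'abstract': lambda z: 2.0 * z['has_kw_abstract'] + 1.0 * (z['n_words'] > 50),
}

def label_zone(zone):
    """Return the bibliographic label whose rule scores the zone highest."""
    scores = {label: rule(zone) for label, rule in RULES.items()}
    return max(scores, key=scores.get)

# A large-font zone near the top of the page should score as a title.
zone = {'font_size': 18, 'y': 0.05, 'has_commas': False,
        'has_kw_abstract': False, 'n_words': 8}
```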

  4. Feature extraction from Doppler ultrasound signals for automated diagnostic systems.

    PubMed

    Ubeyli, Elif Derya; Güler, Inan

    2005-11-01

    This paper presents an assessment of feature extraction methods used in the automated diagnosis of arterial diseases. Since classification is more accurate when the pattern is simplified through representation by important features, feature extraction and selection play an important role in classifying systems such as neural networks. Different feature extraction methods were used to obtain feature vectors from ophthalmic and internal carotid arterial Doppler signals. In addition, the problem of selecting relevant features from those available for classification of Doppler signals was addressed. Multilayer perceptron neural networks (MLPNNs) with different inputs (feature vectors) were used for diagnosis of ophthalmic and internal carotid arterial diseases. The feature extraction methods were assessed by comparing the performance of the resulting MLPNNs, evaluated in terms of convergence rate (number of training epochs) and total classification accuracy. Finally, conclusions were drawn concerning the efficiency of the discrete wavelet transform as a feature extraction method for the diagnosis of ophthalmic and internal carotid arterial diseases. PMID:16278106
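
    As a sketch of the wavelet-based feature extraction discussed above, the following computes a one-level Haar DWT and summarizes each subband with simple statistics (mean absolute value and energy). These particular statistics are common choices assumed for illustration, not necessarily the ones used in the paper.

```python
# One-level Haar DWT feature extraction, sketched with stdlib Python only.

def haar_dwt(signal):
    """One-level Haar transform: approximation and detail coefficients."""
    assert len(signal) % 2 == 0
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return approx, detail

def features(signal):
    """Feature vector: mean absolute value and energy of each subband."""
    approx, detail = haar_dwt(signal)
    def stats(band):
        return [sum(abs(x) for x in band) / len(band),   # mean |coefficient|
                sum(x * x for x in band)]                # subband energy
    return stats(approx) + stats(detail)

# Feature vector for a short synthetic signal; a classifier (e.g. an MLPNN)
# would be trained on vectors like this one.
fv = features([1.0, 1.0, 2.0, 4.0, 3.0, 1.0, 0.0, 0.0])
```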

  5. An automated approach for extracting Barrier Island morphology from digital elevation models

    NASA Astrophysics Data System (ADS)

    Wernette, Phillipe; Houser, Chris; Bishop, Michael P.

    2016-06-01

    The response and recovery of a barrier island to extreme storms depend on the elevation of the dune base and crest, both of which can vary considerably alongshore and through time. Quantifying the response to and recovery from storms requires that we first identify and differentiate the dune(s) from the beach and back-barrier, which in turn depends on accurate identification and delineation of the dune toe, crest and heel. The purpose of this paper is to introduce a multi-scale automated approach for extracting beach, dune (dune toe, dune crest and dune heel), and barrier island morphology. The automated approach introduced here extracts the shoreline and back-barrier shoreline based on elevation thresholds, and extracts the dune toe, dune crest and dune heel based on the average relative relief (RR) across multiple spatial scales of analysis. The multi-scale automated RR approach to extracting the dune toe, dune crest, and dune heel is more objective than traditional approaches because every pixel is analyzed across multiple computational scales and features are identified from the calculated RR values. The RR approach outperformed contemporary approaches and represents a fast, objective means to define important beach and dune features for predicting barrier island response to storms. The RR method also does not require that the dune toe, crest, or heel be spatially continuous, which is important because dune morphology naturally varies alongshore.
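
    The relative relief computation can be sketched as follows: within a moving window, RR = (z - z_min) / (z_max - z_min), averaged over several window sizes. The 1-D profile and window scales below are synthetic, for illustration only.

```python
# Multi-scale relative relief (RR) on a 1-D elevation profile.

def relative_relief(elev, half_width):
    """RR of each sample within a window of +/- half_width samples."""
    rr = []
    for i in range(len(elev)):
        lo, hi = max(0, i - half_width), min(len(elev), i + half_width + 1)
        window = elev[lo:hi]
        zmin, zmax = min(window), max(window)
        rr.append(0.0 if zmax == zmin else (elev[i] - zmin) / (zmax - zmin))
    return rr

def mean_rr(elev, scales=(1, 2, 4)):
    """Average RR across multiple window scales, per sample."""
    per_scale = [relative_relief(elev, s) for s in scales]
    return [sum(vals) / len(vals) for vals in zip(*per_scale)]

# Synthetic cross-shore profile: beach -> dune crest -> heel.
profile = [0.0, 0.5, 2.0, 4.0, 3.5, 1.0, 0.2]
rr = mean_rr(profile)
```

    High averaged RR marks the crest; the toe and heel would be picked out as the flanking transitions in RR.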

  6. Automated Feature Extraction of Foredune Morphology from Terrestrial Lidar Data

    NASA Astrophysics Data System (ADS)

    Spore, N.; Brodie, K. L.; Swann, C.

    2014-12-01

    Foredune morphology is often described in storm impact prediction models using the elevation of the dune crest and dune toe and compared with maximum runup elevations to categorize the storm impact and predicted responses. However, these parameters do not account for other foredune features that may make them more or less erodible, such as alongshore variations in morphology, vegetation coverage, or compaction. The goal of this work is to identify other descriptive features that can be extracted from terrestrial lidar data that may affect the rate of dune erosion under wave attack. Daily, mobile-terrestrial lidar surveys were conducted during a 6-day nor'easter (Hs = 4 m in 6 m water depth) along 20km of coastline near Duck, North Carolina which encompassed a variety of foredune forms in close proximity to each other. This abstract will focus on the tools developed for the automated extraction of the morphological features from terrestrial lidar data, while the response of the dune will be presented by Brodie and Spore as an accompanying abstract. Raw point cloud data can be dense and is often under-utilized due to time and personnel constraints required for analysis, since many algorithms are not fully automated. In our approach, the point cloud is first projected into a local coordinate system aligned with the coastline, and then bare earth points are interpolated onto a rectilinear 0.5 m grid creating a high resolution digital elevation model. The surface is analyzed by identifying features along each cross-shore transect. Surface curvature is used to identify the position of the dune toe, and then beach and berm morphology is extracted shoreward of the dune toe, and foredune morphology is extracted landward of the dune toe. Changes in, and magnitudes of, cross-shore slope, curvature, and surface roughness are used to describe the foredune face and each cross-shore transect is then classified using its pre-storm morphology for storm-response analysis.

  7. Automation of Cn2 profile extraction from weather radar images

    NASA Astrophysics Data System (ADS)

    Burchett, Lee R.; Fiorino, Steven T.; Buchanan, Matthew

    2012-06-01

    A novel method for measuring the structure constant of atmospheric turbulence (Cn2) on an arbitrary path has recently been demonstrated by the Air Force Institute of Technology (AFIT). This method provides a unique ability to remotely measure the intensity of turbulence, which is important for predicting beam spread, wander, and scintillation effects on High Energy Laser (HEL) propagation. Because this is a new technique, estimating Cn2 using radar is a complicated and time-consuming process. This paper presents a new software program being developed to automate the calculation of Cn2 over an arbitrary path. The program takes regional National Weather Service NEXRAD radar reflectivity measurements and extracts data for the path of interest. These reflectivity measurements are then used to estimate Cn2 over the path. The program uses the Radar Software Library (RSL) produced by the Tropical Rainfall Measuring Mission (TRMM) at the NASA/Goddard Flight Center. RSL provides support for nearly all formats of weather radar data. The particular challenge in extracting data is determining which data bins the path passes through. Due to variations in radar systems and measurement conditions, the RSL produces data grids that are not consistent in geometry or completeness. The Cn2 program adapts to the varying geometries of each radar image. Automation of the process allows for fast estimation of Cn2 and supports a goal of real-time remote turbulence measurement. Recently, this software was used to create comparison data for RF
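
    The bin-selection challenge mentioned above (determining which grid cells a path passes through) can be sketched by sampling points along the path and recording the cells they fall in. A real implementation would work in the radar's range/azimuth geometry; this toy version uses a Cartesian grid with invented endpoints.

```python
# Which grid cells does a straight path cross? Sketched by dense sampling.

def cells_on_path(start, end, cell_size, samples=200):
    """Grid cells (ix, iy) crossed by the segment start->end, in path order."""
    (x0, y0), (x1, y1) = start, end
    seen, order = set(), []
    for k in range(samples + 1):
        t = k / samples
        x, y = x0 + t * (x1 - x0), y0 + t * (y1 - y0)
        cell = (int(x // cell_size), int(y // cell_size))
        if cell not in seen:          # record each cell once, first-crossing order
            seen.add(cell)
            order.append(cell)
    return order

path_cells = cells_on_path((0.5, 0.5), (3.5, 1.5), cell_size=1.0)
```

    Once the crossed bins are known, the reflectivity values in those bins are the inputs to the Cn2 estimate along the path.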

  8. Automated extraction of knowledge for model-based diagnostics

    NASA Technical Reports Server (NTRS)

    Gonzalez, Avelino J.; Myler, Harley R.; Towhidnejad, Massood; Mckenzie, Frederic D.; Kladke, Robin R.

    1990-01-01

    The concept of accessing computer aided design (CAD) databases and extracting a process model automatically is investigated as a possible source for the generation of knowledge bases for model-based reasoning systems. The resulting system, referred to as automated knowledge generation (AKG), uses an object-oriented programming structure and constraint techniques, as well as an internal database of component descriptions, to generate a frame-based structure that describes the model. The procedure has been designed to be general enough to be easily coupled to CAD systems that feature a database capable of providing label and connectivity data from the drawn system. The AKG system is capable of defining knowledge bases in formats required by various model-based reasoning tools.

  9. Automated Dsm Extraction from Uav Images and Performance Analysis

    NASA Astrophysics Data System (ADS)

    Rhee, S.; Kim, T.

    2015-08-01

    As technology evolves, unmanned aerial vehicle (UAV) imagery is being used for applications ranging from simple image acquisition to complicated 3D spatial information extraction. Spatial information is usually provided in the form of a DSM or point cloud, so it is important to generate very dense tie points automatically from stereo images. In this paper, we applied a stereo image-based matching technique developed for satellite/aerial images to UAV images, propose processing steps for automated DSM generation, and analyse the feasibility of DSM generation. For DSM generation from UAV images, firstly, exterior orientation parameters (EOPs) for each dataset were adjusted. Secondly, optimum matching pairs were determined. Thirdly, stereo image matching was performed for each pair. The matching algorithm is based on grey-level correlation applied along epipolar lines. Finally, the matching results were merged into a single result and the final DSM was generated. The generated DSM was compared with a reference DSM from lidar; overall accuracy was 1.5 m in NMAD. However, several problems have to be solved in the future, including obtaining precise EOPs and handling occlusion and image blurring. More effective interpolation techniques also need to be developed.
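
    The grey-level correlation step can be sketched in one dimension: slide a template from the left image along the corresponding epipolar scanline of the right image and keep the offset with the highest normalized cross-correlation. The scanline data here are synthetic.

```python
# 1-D normalized cross-correlation matching along an epipolar scanline.

def ncc(a, b):
    """Normalized cross-correlation of two equal-length intensity windows."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return 0.0 if da == 0 or db == 0 else num / (da * db)

def best_match(template, scanline):
    """Offset in scanline where the template correlates best."""
    scores = [ncc(template, scanline[i:i + len(template)])
              for i in range(len(scanline) - len(template) + 1)]
    return scores.index(max(scores))

left_patch = [10, 50, 90, 50, 10]                      # window from left image
right_line = [12, 11, 13, 12, 49, 88, 52, 11, 10, 12]  # right epipolar scanline
offset = best_match(left_patch, right_line)
```

    In 2-D the same score is computed over square windows, and the winning offset per pixel gives the disparity used to triangulate the DSM height.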

  10. Automated Extraction of Substance Use Information from Clinical Texts

    PubMed Central

    Wang, Yan; Chen, Elizabeth S.; Pakhomov, Serguei; Arsoniadis, Elliot; Carter, Elizabeth W.; Lindemann, Elizabeth; Sarkar, Indra Neil; Melton, Genevieve B.

    2015-01-01

    Within clinical discourse, social history (SH) includes important information about substance use (alcohol, drug, and nicotine use) as key risk factors for disease, disability, and mortality. In this study, we developed and evaluated a natural language processing (NLP) system for automated detection of substance use statements and extraction of substance use attributes (e.g., temporal and status) based on Stanford Typed Dependencies. The developed NLP system leveraged linguistic resources and domain knowledge from a multi-site social history study, Propbank and the MiPACQ corpus. The system attained F-scores of 89.8, 84.6 and 89.4 respectively for alcohol, drug, and nicotine use statement detection, as well as average F-scores of 82.1, 90.3, 80.8, 88.7, 96.6, and 74.5 respectively for extraction of attributes. Our results suggest that NLP systems can achieve good performance when augmented with linguistic resources and domain knowledge when applied to a wide breadth of substance use free text clinical notes. PMID:26958312
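
    A greatly simplified sketch of the detection task: the system above uses Stanford Typed Dependencies, but a keyword/regex baseline illustrates what statement detection and attribute extraction mean. The lexicons below are invented and nowhere near clinical coverage.

```python
# Toy substance-use statement detection with invented keyword lexicons.
import re

SUBSTANCES = {
    'alcohol':  r'\b(alcohol|etoh|beer|wine)\b',
    'nicotine': r'\b(smok\w*|tobacco|cigarette\w*|nicotine)\b',
    'drug':     r'\b(cocaine|marijuana|heroin|drug use)\b',
}
STATUS = {'current': r'\b(current\w*|still)\b',
          'past':    r'\b(former|quit|denies|no longer)\b'}

def detect(sentence):
    """Return (substance, status) pairs found in one social-history sentence."""
    s = sentence.lower()
    found = []
    for substance, pat in SUBSTANCES.items():
        if re.search(pat, s):
            status = next((name for name, sp in STATUS.items()
                           if re.search(sp, s)), 'unknown')
            found.append((substance, status))
    return found

hits = detect('Former smoker, quit 10 years ago; denies alcohol use.')
```

    A dependency-based system would instead attach the status cue to the correct substance mention, which this bag-of-keywords sketch cannot do when a sentence mixes statuses.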

  11. Automated Tract Extraction via Atlas Based Adaptive Clustering

    PubMed Central

    Tunç, Birkan; Parker, William A.; Ingalhalikar, Madhura; Verma, Ragini

    2014-01-01

    Advancements in imaging protocols such as the high angular resolution diffusion-weighted imaging (HARDI) and in tractography techniques are expected to cause an increase in the tract-based analyses. Statistical analyses over white matter tracts can contribute greatly towards understanding structural mechanisms of the brain since tracts are representative of the connectivity pathways. The main challenge with tract-based studies is the extraction of the tracts of interest in a consistent and comparable manner over a large group of individuals without drawing the inclusion and exclusion regions of interest. In this work, we design a framework for automated extraction of white matter tracts. The framework introduces three main components, namely a connectivity based fiber representation, a fiber clustering atlas, and a clustering approach called Adaptive Clustering. The fiber representation relies on the connectivity signatures of fibers to establish an easy correspondence between different subjects. A group-wise clustering of these fibers that are represented by the connectivity signatures is then used to generate a fiber bundle atlas. Finally, Adaptive Clustering incorporates the previously generated clustering atlas as a prior, to cluster the fibers of a new subject automatically. Experiments on the HARDI scans of healthy individuals acquired repeatedly, demonstrate the applicability, the reliability and the repeatability of our approach in extracting white matter tracts. By alleviating the seed region selection or the inclusion/exclusion ROI drawing requirements that are usually handled by trained radiologists, the proposed framework expands the range of possible clinical applications and establishes the ability to perform tract-based analyses with large samples. PMID:25134977

  12. AUTOMATED SOLID PHASE EXTRACTION GC/MS FOR ANALYSIS OF SEMIVOLATILES IN WATER AND SEDIMENTS

    EPA Science Inventory

    Data is presented on the development of a new automated system combining solid phase extraction (SPE) with GC/MS spectrometry for the single-run analysis of water samples containing a broad range of organic compounds. The system uses commercially available automated in-line sampl...

  13. ACIS Extract: A Chandra/ACIS Tool for Automated Point Source Extraction and Spectral Fitting

    NASA Astrophysics Data System (ADS)

    Townsley, L.; Broos, P.; Bauer, F.; Getman, K.

    2003-03-01

    ACIS Extract (AE) is an IDL program that assists the observer in performing the many tasks involved in analyzing the spectra of large numbers of point sources observed with the ACIS instrument on Chandra. Notably, all tasks are performed in a context that may include multiple observations of the field. Features of AE and its several accessory tools include refining the accuracy of source positions, defining extraction regions based on the PSF of each source in each observation, generating single-observation and composite ARFs and RMFs, applying energy-dependent aperture corrections to the ARFs, computing light curves and K-S tests for source variability, automated broad-band photometry, automated spectral fitting and review of fitting results, and compilation of results into LaTeX tables. A variety of interactive plots are produced showing various source properties across the catalog. This poster details the capabilities of the package and shows example output. The code and a detailed users' manual are available to the community at http://www.astro.psu.edu/xray/docs/TARA/ae_users_guide.html. Support for this effort was provided by NASA contract NAS8-38252 to Gordon Garmire, the ACIS Principal Investigator.

  14. A Need For Automated Tools For Extraction And Visualisation Of The Data From Orbiter Payload

    NASA Astrophysics Data System (ADS)

    Zharkova, V. V.

    2007-01-01

    The Solar Orbiter instruments are expected to provide a large number of full-disk solar images, whose processing by individual users could delay the scientific advances of the mission. Recently developed automated techniques for detection of sunspots, active regions and filaments in full-disk solar images in Ca II K1, Ca II K3 and Ha images (Meudon Observatory), EUV images (SOHO/EIT) and white-light images (SOHO/MDI) showed good agreement with manual synoptic maps and NOAA data, and can be used for automated image standardization and feature extraction from the Solar Orbiter images. The extracted parameters of active features can also be automatically populated into the extended relational database of the Solar Feature Catalogues (SFCs), http://solar.inf.brad.ac.uk. In addition to the original images, this will allow delivering to users the major activity features extracted with sufficient accuracy, and daily updating of the fully digitized database of solar features, whose continuation and consistency are valuable for solar activity models.

  15. Selecting a Relational Database Management System for Library Automation Systems.

    ERIC Educational Resources Information Center

    Shekhel, Alex; O'Brien, Mike

    1989-01-01

    Describes the evaluation of four relational database management systems (RDBMSs) (Informix Turbo, Oracle 6.0 TPS, Unify 2000 and Relational Technology's Ingres 5.0) to determine which is best suited for library automation. The evaluation criteria used to develop a benchmark specifically designed to test RDBMSs for libraries are discussed. (CLB)

  16. Towards automated support for extraction of reusable components

    NASA Technical Reports Server (NTRS)

    Abd-El-hafiz, S. K.; Basili, Victor R.; Caldiera, Gianluigi

    1992-01-01

    A cost effective introduction of software reuse techniques requires the reuse of existing software developed in many cases without aiming at reusability. This paper discusses the problems related to the analysis and reengineering of existing software in order to reuse it. We introduce a process model for component extraction and focus on the problem of analyzing and qualifying software components which are candidates for reuse. A prototype tool for supporting the extraction of reusable components is presented. One of the components of this tool aids in understanding programs and is based on the functional model of correctness. It can assist software engineers in the process of finding correct formal specifications for programs. A detailed description of this component and an example to demonstrate a possible operational scenario are given.

  17. Towards automated support for extraction of reusable components

    NASA Technical Reports Server (NTRS)

    Abd-El-hafiz, S. K.; Basili, V. R.; Caldier, G.

    1991-01-01

    A cost effective introduction of software reuse techniques requires the reuse of existing software developed in many cases without aiming at reusability. This paper discusses the problems related to the analysis and reengineering of existing software in order to reuse it. We introduce a process model for component extraction and focus on the problem of analyzing and qualifying software components which are candidates for reuse. A prototype tool for supporting the extraction of reusable components is presented. One of the components of this tool aids in understanding programs and is based on the functional model of correctness. It can assist software engineers in the process of finding correct formal specifications for programs. A detailed description of this component and an example to demonstrate a possible operational scenario are given.

  18. Automating Nuclear-Safety-Related SQA Procedures with Custom Applications

    SciTech Connect

    Freels, James D.

    2016-01-01

    Nuclear safety-related procedures are rigorous for good reason. Small design mistakes can quickly turn into unwanted failures. Researchers at Oak Ridge National Laboratory worked with COMSOL to define a simulation app that automates the software quality assurance (SQA) verification process and provides results in less than 24 hours.

  19. Extraction and quantification of phytoestrogens in foods using automated solid-phase extraction and LC/MS/MS.

    PubMed

    Kuhnle, Gunter G C; Dell'aquila, Caterina; Low, Yen-Ling; Kussmaul, Michaela; Bingham, Sheila A

    2007-12-01

    Phytoestrogens are a group of polyphenolic plant metabolites that can induce biological responses. Their bioactivity is based on their similarity to 17beta-estradiol and their ability to bind to the beta-estrogen receptor. Although epidemiological data are inconclusive, phytoestrogens are considered to be beneficial for a variety of conditions, for example, hormone-related cancers like breast and prostate cancer. To investigate the biological effects of these compounds and to assess the exposure of larger cohorts or the general public, reliable data on the phytoestrogen content of food is necessary. Previously, food analysis for phytoestrogens was performed using either HPLC-UV or GC/MS. Here, we describe the development of the first generic method for the analysis of phytoestrogens in food, using automated solid-phase extraction and liquid chromatography-tandem mass spectrometry. The presented method shows a good reproducibility and can be easily adapted to other phytoestrogens if required.

  20. Disposable and removable nucleic acid extraction and purification cartridges for automated flow-through systems

    SciTech Connect

    Regan, John Frederick

    2014-09-09

    Removable cartridges are used on automated flow-through systems for the purpose of extracting and purifying genetic material from complex matrices. Different types of cartridges are paired with specific automated protocols to concentrate, extract, and purify pathogenic or human genetic material. Their flow-through nature allows large quantities of sample to be processed. Matrices may be filtered using size-exclusion and/or affinity filters to concentrate the pathogen of interest. Lysed material is ultimately passed through a filter to remove the insoluble material before the soluble genetic material is delivered past a silica-like membrane that binds the genetic material, where it is washed, dried, and eluted. Cartridges are inserted into the housing areas of flow-through automated instruments, which are equipped with sensors to ensure proper placement and usage of the cartridges. Properly inserted cartridges create fluid- and air-tight seals with the flow lines of an automated instrument.

  1. Automated serial extraction of DNA and RNA from biobanked tissue specimens

    PubMed Central

    2013-01-01

    Background With increasing biobanking of biological samples, methods for large-scale extraction of nucleic acids are in demand. The lack of such techniques designed for extraction from tissues results in a bottleneck in downstream genetic analyses, particularly in the field of cancer research. We have developed an automated procedure for tissue homogenization and extraction of DNA and RNA into separate fractions from the same frozen tissue specimen. A purpose-developed magnetic bead-based technology to serially extract both DNA and RNA from tissues was automated on a Tecan Freedom Evo robotic workstation. Results 864 fresh-frozen human normal and tumor tissue samples from breast and colon were serially extracted in batches of 96 samples. Yields and quality of DNA and RNA were determined. The DNA was evaluated in several downstream analyses, and the stability of RNA was determined after 9 months of storage. The extracted DNA performed consistently well in processes including PCR-based STR analysis, HaloPlex selection and deep sequencing on an Illumina platform, and gene copy number analysis using microarrays. The RNA has performed well in RT-PCR analyses and maintains integrity upon storage. Conclusions The technology described here enables the processing of many tissue samples simultaneously with a high-quality product and a time and cost reduction for the user. This reduces the sample preparation bottleneck in cancer research. The open automation format also enables integration with upstream and downstream devices for automated sample quantitation or storage. PMID:23957867

  2. Artificial intelligence issues related to automated computing operations

    NASA Technical Reports Server (NTRS)

    Hornfeck, William A.

    1989-01-01

    Large data processing installations represent target systems for effective applications of artificial intelligence (AI) constructs. The system organization of a large data processing facility at the NASA Marshall Space Flight Center is presented. The methodology and the issues which are related to AI application to automated operations within a large-scale computing facility are described. Problems to be addressed and initial goals are outlined.

  3. Automated multisyringe stir bar sorptive extraction using robust montmorillonite/epoxy-coated stir bars.

    PubMed

    Ghani, Milad; Saraji, Mohammad; Maya, Fernando; Cerdà, Víctor

    2016-05-01

    Herein we present a simple, rapid and low cost strategy for the preparation of robust stir bar coatings based on the combination of montmorillonite with epoxy resin. The composite stir bar was implemented in a novel automated multisyringe stir bar sorptive extraction system (MS-SBSE), and applied to the extraction of four chlorophenols (4-chlorophenol, 2,4-dichlorophenol, 2,4,6-trichlorophenol and pentachlorophenol) as model compounds, followed by high performance liquid chromatography-diode array detection. The different experimental parameters of the MS-SBSE, such as sample volume, selection of the desorption solvent, desorption volume, desorption time, sample solution pH, salt effect and extraction time were studied. Under the optimum conditions, the detection limits were between 0.02 and 0.34μgL(-1). Relative standard deviations (RSD) of the method for the analytes at 10μgL(-1) concentration level ranged from 3.5% to 4.1% (as intra-day RSD) and from 3.9% to 4.3% (as inter-day RSD at 50μgL(-1) concentration level). Batch-to-batch reproducibility for three different stir bars was 4.6-5.1%. The enrichment factors were between 30 and 49. In order to investigate the capability of the developed technique for real sample analysis, well water, wastewater and leachates from a solid waste treatment plant were satisfactorily analyzed.

  4. Comparison of an automated nucleic acid extraction system with the column-based procedure

    PubMed Central

    Hinz, Rebecca; Hagen, Ralf Matthias

    2015-01-01

    Here, we assessed the extraction efficiency of a deployable bench-top nucleic acid extractor, the EZ1, in comparison with a column-based approach for complex sample matrices. A total of 48 EDTA blood samples and 81 stool samples were extracted by EZ1 automated extraction and the column-based QIAamp DNA Mini Kit. Blood sample extractions were assessed by two real-time malaria PCRs, while stool samples were analyzed by six multiplex real-time PCR assays targeting bacterial, viral, and parasitic stool pathogens. Inhibition control PCR testing was performed as well. In total, 147 concordant and 13 discordant pathogen-specific PCR results were obtained; the latter comprised 11 positive results after column-based extraction only and two positive results after EZ1 extraction only. EZ1 extraction showed a higher frequency of inhibition, although this phenomenon was inconsistent across the different PCR schemes. For concordant PCR results, relevant differences in cycle threshold numbers between the compared extraction schemes were not observed. Switching from well-established column-based extraction to the automated EZ1 system therefore does not relevantly reduce the yield of target DNA when complex sample matrices are used. If sample inhibition is observed, column-based extraction from another sample aliquot may be considered. PMID:25883797
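
    The concordance bookkeeping described above can be sketched as a simple tally over paired per-sample results; the sample data below are invented, not the study's.

```python
# Tally concordant and discordant positive/negative PCR results between
# two extraction methods, per sample.

def tally(column_results, ez1_results):
    """Counts of (concordant, positive-by-column-only, positive-by-EZ1-only)."""
    concordant = col_only = ez1_only = 0
    for col, ez1 in zip(column_results, ez1_results):
        if col == ez1:
            concordant += 1
        elif col:              # positive only after column-based extraction
            col_only += 1
        else:                  # positive only after EZ1 extraction
            ez1_only += 1
    return concordant, col_only, ez1_only

# Invented paired results (True = PCR positive) for six samples.
col = [True, True, False, True, False, False]
ez1 = [True, False, False, True, True, False]
counts = tally(col, ez1)
```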

  5. Automated microfluidic DNA/RNA extraction with both disposable and reusable components

    NASA Astrophysics Data System (ADS)

    Kim, Jungkyu; Johnson, Michael; Hill, Parker; Sonkul, Rahul S.; Kim, Jongwon; Gale, Bruce K.

    2012-01-01

    An automated microfluidic nucleic acid extraction system was fabricated with a multilayer polydimethylsiloxane (PDMS) structure that consists of sample wells, microvalves, a micropump and a disposable microfluidic silica cartridge. Both the microvalve and micropump structures were fabricated in a single layer and are operated pneumatically using a 100 µm PDMS membrane. To fabricate the disposable microfluidic silica cartridge, two-cavity structures were made in a PDMS replica to fit the stacked silica membranes. A handheld controller for the microvalves and pumps was developed to enable system automation. With purified ribonucleic acid (RNA), whole blood and E. coli samples, the automated microfluidic nucleic acid extraction system was validated with a guanidine-based solid phase extraction procedure. An extraction efficiency of ~90% for deoxyribonucleic acid (DNA) and ~54% for RNA was obtained in 12 min from whole blood and E. coli samples, respectively. In addition, the quantity and quality of the extracted DNA were confirmed by polymerase chain reaction (PCR) amplification, which presented the appropriate amplification and melting profiles. Automated, programmable fluid control and physical separation of the reusable and disposable components significantly decrease the assay time and manufacturing cost and increase the flexibility and compatibility of the system with downstream components.

  6. Feature Extraction and Selection Strategies for Automated Target Recognition

    NASA Technical Reports Server (NTRS)

    Greene, W. Nicholas; Zhang, Yuhan; Lu, Thomas T.; Chao, Tien-Hsin

    2010-01-01

    Several feature extraction and selection methods for an existing automatic target recognition (ATR) system using JPL's Grayscale Optical Correlator (GOC) and Optimal Trade-Off Maximum Average Correlation Height (OT-MACH) filter were tested using MATLAB. The ATR system is composed of three stages: a cursory region-of-interest (ROI) search using the GOC and OT-MACH filter, a feature extraction and selection stage, and a final classification stage. Feature extraction and selection concern transforming potential target data into more useful forms and selecting important subsets of that data which may aid in detection and classification. The strategies tested were built around two popular extraction methods: Principal Component Analysis (PCA) and Independent Component Analysis (ICA). Performance was measured based on the classification accuracy and free-response receiver operating characteristic (FROC) output of a support vector machine (SVM) and a neural net (NN) classifier.
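
    As a sketch of PCA-style feature extraction, the 2-D case admits a closed form: the leading eigenvector of the covariance matrix [[s_xx, s_xy], [s_xy, s_yy]] lies at angle 0.5*atan2(2*s_xy, s_xx - s_yy). The data points below are synthetic.

```python
# Principal axis (first PCA component) of 2-D points via the closed-form
# eigenvector angle of a 2x2 covariance matrix.
import math

def principal_axis(points):
    """Unit vector along the direction of maximum variance of 2-D points."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    # angle of the leading eigenvector of [[sxx, sxy], [sxy, syy]]
    theta = 0.5 * math.atan2(2 * sxy, sxx - syy)
    return math.cos(theta), math.sin(theta)

# Synthetic points spread roughly along the y = x diagonal.
pts = [(0, 0), (1, 1.1), (2, 1.9), (3, 3.2), (4, 3.8)]
ax = principal_axis(pts)
```

    Projecting each point onto this axis (and onto successive orthogonal axes) yields the reduced feature vector fed to the SVM or NN classifier.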

  7. Automated extraction of precise protein expression patterns in lymphoma by text mining abstracts of immunohistochemical studies

    PubMed Central

    Chang, Jia-Fu; Popescu, Mihail; Arthur, Gerald L.

    2013-01-01

    Background: In general, surgical pathology reviews report protein expression by tumors in a semi-quantitative manner, that is, -, -/+, +/-, +. At the same time, the experimental pathology literature provides multiple examples of precise expression levels determined by immunohistochemical (IHC) tissue examination of populations of tumors. Natural language processing (NLP) techniques enable the automated extraction of such information through text mining. We propose establishing a database linking quantitative protein expression levels with specific tumor classifications through NLP. Materials and Methods: Our method takes advantage of typical forms of representing experimental findings in terms of percentages of protein expression manifest by the tumor population under study. Characteristically, percentages are represented straightforwardly with the % symbol or as the number of positive findings of the total population. Such text is readily recognized using regular expressions and templates permitting extraction of sentences containing these forms for further analysis using grammatical structures and rule-based algorithms. Results: Our pilot study is limited to the extraction of such information related to lymphomas. We achieved a satisfactory level of retrieval as reflected in scores of 69.91% precision and 57.25% recall with an F-score of 62.95%. In addition, we demonstrate the utility of a web-based curation tool for confirming and correcting our findings. Conclusions: The experimental pathology literature represents a rich source of pathobiological information, which has been relatively underutilized. There has been a combinatorial explosion of knowledge within the pathology domain as represented by increasing numbers of immunophenotypes and disease subclassifications. NLP techniques support practical text mining techniques for extracting this knowledge and organizing it in forms appropriate for pathology decision support systems. PMID:23967385
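
    The percentage-pattern extraction described above can be sketched with regular expressions that pull 'N%' and 'k of n cases' forms out of abstract-like text. The example sentence and marker names are invented.

```python
# Regex extraction of protein expression percentages from free text.
import re

PCT = re.compile(r'(\d+(?:\.\d+)?)\s*%')
RATIO = re.compile(r'(\d+)\s*(?:of|/)\s*(\d+)\s+(?:cases|tumou?rs|samples)')

def expression_levels(text):
    """Extract expression percentages, converting 'k of n cases' to percent."""
    levels = [float(p) for p in PCT.findall(text)]
    levels += [round(100.0 * int(k) / int(n), 1) for k, n in RATIO.findall(text)]
    return levels

# Invented example sentence in the style of an IHC study abstract.
sentence = ('CD30 was expressed in 85% of anaplastic large cell lymphomas, '
            'and BCL2 was positive in 7 of 20 cases.')
levels = expression_levels(sentence)
```

    A full pipeline would additionally resolve which marker and which tumor class each number attaches to, which is where the grammatical analysis described above comes in.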

  8. Application and flexibility of robotics in automating extraction methods for food samples.

    PubMed

    Higgs, D J; Vanderslice, J T

    1987-05-01

    Laboratory robotic technology has made it possible to automate the manually intensive operations associated with the extraction of vitamins from food. The modular approach to robotics allows the conversion from one extraction procedure to another by a simple addition or replacement of a module plus reprogramming. This is illustrated for the extraction of vitamins C and B1 from food samples. Because many of the organic micronutrients are unstable, storage and extraction conditions must be established to stabilize labile compounds if the full capabilities of robotics are to be realized.

  9. Dynamic electromembrane extraction: Automated movement of donor and acceptor phases to improve extraction efficiency.

    PubMed

    Asl, Yousef Abdossalami; Yamini, Yadollah; Seidi, Shahram; Amanzadeh, Hatam

    2015-11-01

    In the present research, dynamic electromembrane extraction (DEME) was introduced for the first time for extraction and determination of ionizable species from different biological matrices. The setup proposed for DEME provides an efficient, stable, and reproducible method to increase extraction efficiency. This setup consists of a piece of hollow fiber mounted inside a glass flow cell by means of two plastic connector tubes. In this dynamic system, an organic solvent is impregnated into the pores of the hollow fiber as a supported liquid membrane (SLM); an aqueous acceptor solution is repeatedly pumped into the lumen of the hollow fiber by a syringe pump, whereas a peristaltic pump is used to move the sample solution around the mounted hollow fiber in the flow cell. Two platinum electrodes connected to a power supply, located in the lumen of the hollow fiber and in the glass flow cell, respectively, are used during extractions. The method was applied to the extraction of amitriptyline (AMI) and nortriptyline (NOR) as model analytes from biological fluids. Parameters affecting DEME of the model analytes were investigated and optimized. Under optimized conditions, the calibration curves were linear in the range of 2.0-100 μg L(-1), with coefficients of determination (r(2)) greater than 0.9902 for both analytes. The relative standard deviations (RSD%) were less than 8.4% based on four replicate measurements. LODs of less than 1.0 μg L(-1) were obtained for both AMI and NOR. Preconcentration factors higher than 83-fold were obtained for the extraction of AMI and NOR in various biological samples. PMID:26455283

  10. Prescription Extraction from Clinical Notes: Towards Automating EMR Medication Reconciliation

    PubMed Central

    Wang, Yajuan; Steinhubl, Steven R.; Defilippi, Christopher; Ng, Kenney; Ebadollahi, Shahram; Stewart, Walter F.; Byrd, Roy J

    2015-01-01

    Medication information is one of the most important clinical data types in electronic medical records (EMR). This study developed an NLP application (PredMED) to extract full prescriptions and their relevant components from a large corpus of unstructured ambulatory office visit clinical notes and from the corresponding structured medication reconciliation (MED_REC) data in the EMR. PredMED achieved an 84.4% F-score on office visit encounter notes and 95.0% on MED_REC data, outperforming two available medication extraction systems. To assess the potential for using automatically extracted prescriptions in the medication reconciliation task, we manually analyzed discrepancies between prescriptions found in clinical encounter notes and in matching MED_REC data for sample patient encounters. PMID:26306266

  11. Discovering Indicators of Successful Collaboration Using Tense: Automated Extraction of Patterns in Discourse

    ERIC Educational Resources Information Center

    Thompson, Kate; Kennedy-Clark, Shannon; Wheeler, Penny; Kelly, Nick

    2014-01-01

    This paper describes a technique for locating indicators of success within the data collected from complex learning environments, proposing an application of e-research to access learner processes and measure and track group progress. The technique combines automated extraction of tense and modality via parts-of-speech tagging with a visualisation…
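
    Extraction of tense and modality from parts of speech can be illustrated with a deliberately simple lexicon-based tagger. The modal word list and the "-ed" suffix heuristic below are illustrative assumptions only; the study used a real parts-of-speech tagger:

```python
# Lexicon-based sketch: tag tokens as modal auxiliaries or past-tense
# verbs, then profile an utterance by counting each tag.
MODALS = {"can", "could", "may", "might", "must", "shall", "should", "will", "would"}

def tag_token(token):
    t = token.lower().strip(".,!?")
    if t in MODALS:
        return "MODAL"
    if t.endswith("ed") and len(t) > 3:
        return "PAST"
    return "OTHER"

def tense_profile(utterance):
    tags = [tag_token(w) for w in utterance.split()]
    return {tag: tags.count(tag) for tag in ("MODAL", "PAST")}

print(tense_profile("We should test what we planned yesterday"))
# {'MODAL': 1, 'PAST': 1}
```

    The suffix heuristic mistags words like "need", which is precisely why production systems rely on trained taggers rather than surface patterns.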

  12. Data Mining: The Art of Automated Knowledge Extraction

    NASA Astrophysics Data System (ADS)

    Karimabadi, H.; Sipes, T.

    2012-12-01

    Data mining algorithms are used routinely in a wide variety of fields and are gaining adoption in the sciences. The realities of real-world data analysis are that (a) data has flaws, and (b) the models and assumptions that we bring to the data are inevitably flawed, biased, or misspecified in some way. Data mining can improve data analysis by detecting anomalies in the data, checking the consistency of the user's model assumptions, and deciphering complex patterns and relationships that would not be accessible otherwise. The common form of data collected from in situ spacecraft measurements is the multivariate time series, which represents one of the most challenging problems in data mining. We have successfully developed algorithms to deal with such data and have extended them to handle streaming data. In this talk, we illustrate the utility of our algorithms through several examples, including automated detection of reconnection exhausts in the solar wind and flux ropes in the magnetotail. We also show examples from successful applications of our technique to the analysis of 3D kinetic simulations. With an eye to the future, we provide an overview of our upcoming plans, which include collaborative data mining, expert outsourcing of data mining, and computer vision for image analysis, among others. Finally, we discuss the integration of data mining algorithms with web-based services such as VxOs and other Heliophysics data centers and the capabilities that this would enable.
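
    The abstract does not specify its detection algorithms, so as a stand-in, here is the simplest possible time-series anomaly detector: flag points that deviate from the mean by more than a chosen number of standard deviations. The data and threshold are invented for illustration:

```python
import statistics

def zscore_anomalies(series, threshold=3.0):
    """Flag indices deviating from the mean by more than `threshold`
    standard deviations -- a deliberately simple stand-in for the
    unnamed algorithms in the abstract."""
    mu = statistics.fmean(series)
    sigma = statistics.stdev(series)
    return [i for i, x in enumerate(series) if abs(x - mu) > threshold * sigma]

# A quiet signal with one spike (e.g. an exhaust-like event) at index 5.
data = [1.0, 1.1, 0.9, 1.0, 1.05, 8.0, 1.0, 0.95]
print(zscore_anomalies(data, threshold=2.0))  # [5]
```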

  13. Multispectral Image Road Extraction Based Upon Automated Map Conflation

    NASA Astrophysics Data System (ADS)

    Chen, Bin

    Road network extraction from remotely sensed imagery enables many important and diverse applications such as vehicle tracking, drone navigation, and intelligent transportation studies. There are, however, a number of challenges to road detection from an image. Road pavement material, width, direction, and topology vary across a scene. Complete or partial occlusions caused by nearby buildings, trees, and the shadows cast by them make maintaining road connectivity difficult. The problems posed by occlusions are exacerbated with the increasing use of oblique imagery from aerial and satellite platforms. Further, common objects such as rooftops and parking lots are made of materials similar or identical to road pavements. This problem of common materials is a classic case of a single land cover material existing for different land use scenarios. This work addresses these problems in road extraction from geo-referenced imagery by leveraging the OpenStreetMap digital road map to guide image-based road extraction. The crowd-sourced cartography has the advantage of worldwide coverage that is constantly updated. The derived road vectors follow only roads and so can serve to guide image-based road extraction with minimal confusion from occlusions and changes in road material. On the other hand, the vector road map has no information on road widths, and misalignments between the vector map and the geo-referenced image are small but nonsystematic. Properly correcting misalignment between two geospatial datasets, also known as map conflation, is an essential step. A generic framework requiring minimal human intervention is described for multispectral image road extraction and automatic road map conflation. The approach relies on generating road features as a binary mask and a corresponding curvilinear image. A method for generating the binary road mask from the image by applying a spectral measure is presented. The spectral measure, called anisotropy-tunable distance (ATD

  14. Automated Extraction Improves Multiplex Molecular Detection of Infection in Septic Patients

    PubMed Central

    Regueiro, Benito J.; Varela-Ledo, Eduardo; Martinez-Lamas, Lucia; Rodriguez-Calviño, Javier; Aguilera, Antonio; Santos, Antonio; Gomez-Tato, Antonio; Alvarez-Escudero, Julian

    2010-01-01

    Sepsis is one of the leading causes of morbidity and mortality in hospitalized patients worldwide. Molecular technologies for rapid detection of microorganisms in patients with sepsis have only recently become available. The LightCycler SeptiFast Test MGRADE (Roche Diagnostics GmbH) is a multiplex PCR assay able to detect DNA of the 25 most frequent pathogens in bloodstream infections. The time and labor saved by avoiding excessive laboratory manipulation are the rationale for selecting the automated MagNA Pure Compact Nucleic Acid Isolation Kit I (Roche Applied Science GmbH) as an alternative to conventional SeptiFast extraction. For the purposes of this study, we evaluated extraction in order to demonstrate the feasibility of automation. Finally, a prospective observational study was conducted using 106 clinical samples obtained from 76 patients in our ICU. Both extraction methods were used in parallel to test the samples. When molecular detection results using both manual and automated extraction were compared with data from blood cultures obtained at the same time, the results showed that SeptiFast with the alternative MagNA Pure Compact extraction not only shortens the complete workflow to 3.57 hrs., but also increases the sensitivity of the molecular assay for detecting infection as defined by positive blood culture confirmation. PMID:20967222

  15. Automated DNA extraction of single dog hairs without roots for mitochondrial DNA analysis.

    PubMed

    Bekaert, Bram; Larmuseau, Maarten H D; Vanhove, Maarten P M; Opdekamp, Anouschka; Decorte, Ronny

    2012-03-01

    Dogs are intensely integrated in human social life and their shed hairs can play a major role in forensic investigations. The overall aim of this study was to validate a semi-automated extraction method for mitochondrial DNA analysis of telogenic dog hairs. Extracted DNA was amplified with a 95% success rate from 43 samples using two new experimental designs in which the mitochondrial control region was amplified as a single large (±1260 bp) amplicon or as two individual amplicons (HV1 and HV2; ±650 and 350 bp) with tailed primers. The results prove that the extraction of dog hair mitochondrial DNA can easily be automated to provide sufficient DNA yield for the amplification of a forensically useful long mitochondrial DNA fragment, or alternatively two short fragments with minimal loss of sequence in the case of degraded samples.

  16. Highly efficient automated extraction of DNA from old and contemporary skeletal remains.

    PubMed

    Zupanič Pajnič, Irena; Debska, Magdalena; Gornjak Pogorelc, Barbara; Vodopivec Mohorčič, Katja; Balažic, Jože; Zupanc, Tomaž; Štefanič, Borut; Geršak, Ksenija

    2016-01-01

    We optimised the automated extraction of DNA from old and contemporary skeletal remains using the AutoMate Express system and the PrepFiler BTA kit. Twenty-four contemporary and 25 old skeletal remains from WWII were analysed. For each skeleton, extraction using only 0.05 g of powder was performed according to the manufacturer's recommendations (no demineralisation - ND method). Since only 32% of full profiles were obtained from aged and 58% from contemporary casework skeletons, the extraction protocol was modified to acquire higher quality DNA, and genomic DNA was obtained after full demineralisation (FD method). The nuclear DNA of the samples was quantified using the Investigator Quantiplex kit and STR typing was performed using the NGM kit to evaluate the performance of the tested extraction methods. In the aged DNA samples, 64% of full profiles were obtained using the FD method. For the contemporary skeletal remains the performance of the ND method was closer to the FD method than for the old skeletons, giving 58% of full profiles with the ND method and 71% of full profiles using the FD method. The extraction of DNA from only 0.05 g of bone or tooth powder using the AutoMate Express has proven highly successful in the recovery of DNA from old and contemporary skeletons, especially with the modified FD method. We believe that the results obtained will contribute to the possibilities of using automated devices for extracting DNA from skeletal remains, which would shorten the procedures for obtaining high-quality DNA from skeletons in forensic laboratories. PMID:26615474

  18. Automated extraction and semantic analysis of mutation impacts from the biomedical literature

    PubMed Central

    2012-01-01

    We present Open Mutation Miner (OMM), the first comprehensive, fully open-source approach to automatically extracting mutation impacts and related relevant information from the biomedical literature. We assessed the performance of our work on manually annotated corpora, and the results show the reliability of our approach. The representation of the extracted information in a structured format facilitates knowledge management and aids in database curation and correction. Furthermore, access to the analysis results is provided through multiple interfaces, including web services for automated data integration and desktop-based solutions for end-user interactions. PMID:22759648
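
    One small subtask in such a pipeline is spotting protein point mutations written in the common wild-type/position/variant form (e.g. "V600E"). This regex sketch is illustrative; real systems handle many more mutation nomenclatures:

```python
import re

# Single-letter amino-acid codes; matches optional "p." prefix,
# wild-type residue, position, and variant residue.
AA = "ACDEFGHIKLMNPQRSTVWY"
MUTATION = re.compile(rf"\b(?:p\.)?([{AA}])(\d+)([{AA}])\b")

def find_mutations(text):
    """Return (wild-type, position, variant) tuples found in text."""
    return [(m.group(1), int(m.group(2)), m.group(3))
            for m in MUTATION.finditer(text)]

print(find_mutations("The p.V600E substitution abolished binding, unlike A123T."))
# [('V', 600, 'E'), ('A', 123, 'T')]
```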

  19. Visual Routines for Extracting Magnitude Relations

    ERIC Educational Resources Information Center

    Michal, Audrey L.; Uttal, David; Shah, Priti; Franconeri, Steven L.

    2016-01-01

    Linking relations described in text with relations in visualizations is often difficult. We used eye tracking to measure how college students and young children (6- and 8-year-olds) extract such relations from graphs. Participants compared relational statements ("Are there more blueberries than oranges?") with simple…

  20. Fully Automated Electro Membrane Extraction Autosampler for LC-MS Systems Allowing Soft Extractions for High-Throughput Applications.

    PubMed

    Fuchs, David; Pedersen-Bjergaard, Stig; Jensen, Henrik; Rand, Kasper D; Honoré Hansen, Steen; Petersen, Nickolaj Jacob

    2016-07-01

    The current work describes the implementation of electro membrane extraction (EME) into an autosampler for high-throughput analysis of samples by EME-LC-MS. The extraction probe was built into a luer lock adapter connected to a HTC PAL autosampler syringe. As the autosampler drew sample solution, analytes were extracted into the lumen of the extraction probe and transferred to a LC-MS system for further analysis. Various parameters affecting extraction efficacy were investigated, including syringe fill strokes, syringe pull-up volume, pull-up delay, and volume in the sample vial. The system was optimized for soft extraction of analytes and high sample throughput. Further, it was demonstrated that by flushing the EME syringe with acidic wash buffer and reversing the applied electric potential, carry-over between samples can be reduced to below 1%. Performance of the system was characterized (RSD, <10%; R(2), 0.994) and finally, the EME autosampler was used to analyze in vitro conversion of methadone into its main metabolite by rat liver microsomes and to demonstrate the potential of known CYP3A4 inhibitors to prevent metabolism of methadone. By making use of the high extraction speed of EME, a complete analytical workflow of purification, separation, and analysis of a sample could be achieved within only 5.5 min. With the developed system, large sequences of samples could be analyzed in a completely automated manner. This high degree of automation makes the developed EME autosampler a powerful tool for a wide range of applications where high-throughput extractions are required before sample analysis. PMID:27237618

  1. Automated DNA extraction platforms offer solutions to challenges of assessing microbial biofouling in oil production facilities.

    PubMed

    Oldham, Athenia L; Drilling, Heather S; Stamps, Blake W; Stevenson, Bradley S; Duncan, Kathleen E

    2012-11-20

    The analysis of microbial assemblages in industrial, marine, and medical systems can inform decisions regarding quality control or mitigation. Modern molecular approaches to detect, characterize, and quantify microorganisms provide rapid and thorough measures unbiased by the need for cultivation. Timely extraction of high-quality nucleic acids for molecular analysis faces specific challenges, however, when used to study the influence of microorganisms on oil production. Production facilities are often ill equipped for nucleic acid extraction techniques, making the preservation and transportation of samples off-site a priority. As a potential solution, the possibility of extracting nucleic acids on-site using automated platforms was tested. The performance of two such platforms, the Fujifilm QuickGene-Mini80™ and the Promega Maxwell®16, was compared to a widely used manual extraction kit, the MOBIO PowerBiofilm™ DNA Isolation Kit, in terms of ease of operation, DNA quality, and microbial community composition. Three pipeline biofilm samples were chosen for these comparisons; two were from pipelines containing crude oil and corrosion products, and the third from a pipeline transporting seawater. Overall, the two more automated extraction platforms produced higher DNA yields than the manual approach. DNA quality was evaluated by amplification with quantitative PCR (qPCR) and by end-point PCR to generate 454 pyrosequencing libraries for 16S rRNA microbial community analysis. Microbial community structure, as assessed by DGGE analysis and pyrosequencing, was comparable among the three extraction methods. Therefore, the use of automated extraction platforms should enhance the feasibility of rapidly evaluating microbial biofouling at remote locations or those with limited resources. PMID:23168231

  3. Automated segmentation and feature extraction of product inspection items

    NASA Astrophysics Data System (ADS)

    Talukder, Ashit; Casasent, David P.

    1997-03-01

    X-ray film and linescan images of pistachio nuts on conveyor trays for product inspection are considered. The final objective is the categorization of pistachios into good, blemished and infested nuts. A crucial step before classification is the separation of touching products and the extraction of features essential for classification. This paper addresses new detection and segmentation algorithms to isolate touching or overlapping items. These algorithms employ a new filter, a new watershed algorithm, and morphological processing to produce nutmeat-only images. Tests on a large database of x-ray film and real-time x-ray linescan images of around 2900 small, medium and large nuts showed excellent segmentation results. A new technique to detect and segment dark regions in nutmeat images is also presented and tested on approximately 300 x-ray film and approximately 300 real-time linescan x-ray images with 95-97 percent detection and correct segmentation. New algorithms are described that determine nutmeat fill ratio and locate splits in nutmeat. The techniques formulated in this paper are of general use in many different product inspection and computer vision problems.
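
    The separation of touching items can be illustrated with a distance-transform core-extraction step, which captures the intuition behind the watershed stage above. The geometry and threshold are invented for illustration; the paper's actual filter and watershed algorithm are not reproduced here:

```python
import numpy as np
from scipy import ndimage

# Two touching "items" modeled as overlapping disks in a binary image.
img = np.zeros((40, 80), dtype=bool)
yy, xx = np.mgrid[0:40, 0:80]
img |= (yy - 20) ** 2 + (xx - 25) ** 2 < 15 ** 2
img |= (yy - 20) ** 2 + (xx - 50) ** 2 < 15 ** 2

# Plain connected-component labelling sees a single blob...
n_touching = ndimage.label(img)[1]

# ...but keeping only pixels far from the background (distance-transform
# "cores") erodes the thin neck between the items and separates them,
# which is the idea a watershed step builds on.
dist = ndimage.distance_transform_edt(img)
cores = dist > 0.65 * dist.max()
n_separated = ndimage.label(cores)[1]
print(n_touching, n_separated)  # 1 2
```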

  4. Automated renal histopathology: digital extraction and quantification of renal pathology

    NASA Astrophysics Data System (ADS)

    Sarder, Pinaki; Ginley, Brandon; Tomaszewski, John E.

    2016-03-01

    The branch of pathology concerned with excess blood serum proteins being excreted in the urine pays particular attention to the glomerulus, a small intertwined tuft of capillaries located at the beginning of the nephron. Normal glomeruli allow a moderate amount of blood proteins to be filtered; proteinuric glomeruli allow a large amount of blood proteins to be filtered. Diagnosis of proteinuric diseases requires time-intensive manual examination of the structural compartments of the glomerulus from renal biopsies. Pathological examination includes cellularity of individual compartments, Bowman's and luminal space segmentation, cellular morphology, glomerular volume, capillary morphology, and more. Long examination times may lead to increased diagnosis time and/or reduced precision of the diagnostic process. Automatic quantification holds strong potential to reduce renal diagnostic time. We have developed a computational pipeline capable of automatically segmenting relevant features from renal biopsies. Our method first segments glomerular compartments from renal biopsies by isolating regions with high nuclear density. Gabor texture segmentation is used to accurately define glomerular boundaries. Bowman's and luminal spaces are segmented using morphological operators. Nuclei structures are segmented using color deconvolution, morphological processing, and bottleneck detection. Average computation time of feature extraction for a typical biopsy, comprising ~12 glomeruli, is ~69 s using an Intel(R) Core(TM) i7-4790 CPU, and is ~65× faster than manual processing. Using images from rat renal tissue samples, automatic glomerular structural feature estimation was reproducibly demonstrated for 15 biopsy images, which contained 148 individual glomeruli images. The proposed method holds immense potential to enhance information available while making clinical diagnoses.
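
    The color deconvolution step mentioned above can be sketched in the classic Ruifrok-Johnston style: convert RGB to optical density via Beer-Lambert, then unmix with the pseudo-inverse of a stain matrix. The H&E stain vectors below are illustrative values, not the paper's settings:

```python
import numpy as np

# Unit-normalized stain vectors (rows): illustrative H&E values.
stains = np.array([[0.65, 0.70, 0.29],   # hematoxylin (stains nuclei)
                   [0.07, 0.99, 0.11]])  # eosin (stains cytoplasm)
stains /= np.linalg.norm(stains, axis=1, keepdims=True)

def unmix(rgb):
    od = -np.log10(np.clip(rgb, 1e-6, 1.0))  # Beer-Lambert optical density
    return od @ np.linalg.pinv(stains)       # per-pixel stain concentrations

# A pixel absorbing purely along the hematoxylin vector (strength 0.8).
pixel = np.array([[10.0 ** (-0.8 * s) for s in stains[0]]])
print(np.round(unmix(pixel), 2))  # ~[[0.8, 0.0]]: hematoxylin only
```

    Thresholding the hematoxylin channel then isolates nuclei for the density-based segmentation the abstract describes.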

  5. Automated identification of adverse events related to central venous catheters.

    PubMed

    Penz, Janet F E; Wilcox, Adam B; Hurdle, John F

    2007-04-01

    Methods for surveillance of adverse events (AEs) in clinical settings are limited by cost, technology, and appropriate data availability. In this study, two methods for semi-automated review of text records within the Veterans Administration database are utilized to identify AEs related to the placement of central venous catheters (CVCs): a Natural Language Processing program and a phrase-matching algorithm. A sample of manually reviewed records was then compared to the results of both methods to assess sensitivity and specificity. The phrase-matching algorithm was found to be a sensitive but relatively non-specific method, whereas the natural language processing system was significantly more specific but less sensitive. Positive predictive values for each method estimated the CVC-associated AE rate at this institution to be 6.4% and 6.2%, respectively. Using both methods together yields acceptable sensitivity and specificity (72.0% and 80.1%, respectively). All methods, including manual chart review, are limited by incomplete or inaccurate clinician documentation. A secondary finding was related to the completeness of administrative data (ICD-9 and CPT codes) used to identify intensive care unit patients in whom a CVC was placed. Administrative data identified fewer than 11% of patients who had a CVC placed. This suggests that other methods, including automated methods such as phrase matching, may be more sensitive than administrative data in identifying patients with devices. Considerable potential exists for the use of such methods for the identification of patients at risk, AE surveillance, and prevention of AEs through decision support technologies. PMID:16901760
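
    The phrase-matching arm of such a study reduces to scanning each note for a trigger list. The phrases and notes below are invented for illustration; a real list would come from clinical experts:

```python
# Flag a note if it contains any phrase from a trigger list.
TRIGGERS = ["pneumothorax", "line infection", "catheter tip malposition"]

def flags_adverse_event(note):
    text = note.lower()
    return any(phrase in text for phrase in TRIGGERS)

notes = [
    "Right IJ CVC placed; post-procedure film shows small apical pneumothorax.",
    "CVC placed without complication; line flushing well.",
]
print([flags_adverse_event(n) for n in notes])  # [True, False]
```

    Note that plain substring matching ignores negation ("no pneumothorax" would still match), which is consistent with the study's finding that phrase matching is sensitive but relatively non-specific.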

  6. Automated Extraction of the Barthel Index from Clinical Texts

    PubMed Central

    Giang, Phan; Williams, Allison; Argyros, Lisa

    2013-01-01

    This paper describes a text mining program that computes the Barthel score of functional status by analyzing clinical notes stored in Electronic Health Record (EHR) systems and comparing them to textual evidence provided by clinical experts. The program demonstrates high accuracy and overall reliability based on a relatively small number of expert-abstracted charts. It offers an efficient and affordable method for estimating functional status using clinical notes. An important feature is an architecture that facilitates interaction between users and the program, allowing the program to improve its performance based on user feedback. PMID:24551352

  7. Automated road network extraction from high spatial resolution multi-spectral imagery

    NASA Astrophysics Data System (ADS)

    Zhang, Qiaoping

    For the last three decades, the Geomatics Engineering and Computer Science communities have considered automated road network extraction from remotely-sensed imagery to be a challenging and important research topic. The main objective of this research is to investigate the theory and methodology of automated feature extraction for image-based road database creation, refinement or updating, and to develop a series of algorithms for road network extraction from high resolution multi-spectral imagery. The proposed framework for road network extraction from multi-spectral imagery begins with an image segmentation using the k-means algorithm. This step mainly concerns the exploitation of the spectral information for feature extraction. The road cluster is automatically identified using a fuzzy classifier based on a set of predefined road surface membership functions. These membership functions are established based on the general spectral signature of road pavement materials and the corresponding normalized digital numbers on each multi-spectral band. Shape descriptors of the Angular Texture Signature are defined and used to reduce the misclassifications between roads and other spectrally similar objects (e.g., crop fields, parking lots, and buildings). An iterative and localized Radon transform is developed for the extraction of road centerlines from the classified images. The purpose of the transform is to accurately and completely detect the road centerlines. It is able to find short, long, and even curvilinear lines. The input image is partitioned into a set of subset images called road component images. An iterative Radon transform is locally applied to each road component image. At each iteration, road centerline segments are detected based on an accurate estimation of the line parameters and line widths. Three localization approaches are implemented and compared using qualitative and quantitative methods. 
Finally, the road centerline segments are grouped into a
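
    The core idea of Radon-based line detection can be shown in miniature: the transform integrates the image along lines, so a bright road produces a sharp peak at its orientation. This toy version takes each projection by rotating the image and summing columns; the iterative, localized machinery of the dissertation is omitted, and the image is invented for illustration:

```python
import numpy as np
from scipy import ndimage

def dominant_line_angle(img, angles):
    """Return the rotation angle (degrees) whose vertical projection
    has the sharpest peak, i.e. the angle that aligns the line."""
    best_angle, best_peak = None, -np.inf
    for a in angles:
        rotated = ndimage.rotate(img, a, reshape=False, order=1)
        peak = rotated.sum(axis=0).max()  # strongest vertical line integral
        if peak > best_peak:
            best_angle, best_peak = a, peak
    return best_angle

road = np.zeros((51, 51))
road[:, 25] = 1.0                                        # a vertical "road"
road = ndimage.rotate(road, 30, reshape=False, order=1)  # tilt it by 30 degrees
print(dominant_line_angle(road, range(0, 180, 5)))       # 150: undoes the tilt (mod 180)
```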

  8. Automated extraction of acetylgestagens from kidney fat by matrix solid phase dispersion.

    PubMed

    Rosén, J; Hellenäs, K E; Törnqvist, P; Shearan, P

    1994-12-01

    A new extraction method for the acetylgestagens medroxyprogesterone acetate (MPA), chloromadinone acetate and megestrol acetate, from kidney fat, has been developed. The method is a combination of matrix solid phase dispersion and solid phase extraction and is simpler and safer than previous methods, especially as it can be automated. The recovery was estimated as 59 +/- 5% (mean +/- standard deviation) for MPA. For screening purposes detection can be achieved using a commercially available enzyme immunoassay kit giving detection limits in the range of 1.0-2.0 ng g-1.

  9. Neuron Image Analyzer: Automated and Accurate Extraction of Neuronal Data from Low Quality Images.

    PubMed

    Kim, Kwang-Min; Son, Kilho; Palmore, G Tayhas R

    2015-01-01

    Image analysis software is an essential tool used in neuroscience and neural engineering to evaluate changes in neuronal structure following extracellular stimuli. Both manual and automated methods in current use are severely inadequate at detecting and quantifying changes in neuronal morphology when the images analyzed have a low signal-to-noise ratio (SNR). This inadequacy derives from the fact that these methods often include data from non-neuronal structures or artifacts by simply tracing pixels with high intensity. In this paper, we describe Neuron Image Analyzer (NIA), a novel algorithm that overcomes these inadequacies by employing Laplacian of Gaussian filter and graphical models (i.e., Hidden Markov Model, Fully Connected Chain Model) to specifically extract relational pixel information corresponding to neuronal structures (i.e., soma, neurite). As such, NIA that is based on vector representation is less likely to detect false signals (i.e., non-neuronal structures) or generate artifact signals (i.e., deformation of original structures) than current image analysis algorithms that are based on raster representation. We demonstrate that NIA enables precise quantification of neuronal processes (e.g., length and orientation of neurites) in low quality images with a significant increase in the accuracy of detecting neuronal changes post-stimulation. PMID:26593337
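
    The first stage described above, a Laplacian-of-Gaussian filter responding to ridge-like structures at low SNR, can be sketched with scipy. The noise level and sigma are illustrative choices, not the paper's settings:

```python
import numpy as np
from scipy import ndimage

# A faint bright "neurite" buried in Gaussian background noise.
rng = np.random.default_rng(0)
img = rng.normal(0.0, 0.05, size=(64, 64))
img[32, 10:54] += 1.0

# The Laplacian-of-Gaussian filter responds strongly to ridges;
# negating it makes bright ridges positive.
response = -ndimage.gaussian_laplace(img, sigma=2)
row_strength = response.mean(axis=1)
print(int(row_strength.argmax()))  # 32: the neurite row dominates even at low SNR
```

    The graphical-model stages of NIA would then link such filter responses into connected soma/neurite structures rather than thresholding raw pixel intensity.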

  10. Neuron Image Analyzer: Automated and Accurate Extraction of Neuronal Data from Low Quality Images

    PubMed Central

    Kim, Kwang-Min; Son, Kilho; Palmore, G. Tayhas R.

    2015-01-01

    Image analysis software is an essential tool used in neuroscience and neural engineering to evaluate changes in neuronal structure following extracellular stimuli. Both manual and automated methods in current use are severely inadequate at detecting and quantifying changes in neuronal morphology when the images analyzed have a low signal-to-noise ratio (SNR). This inadequacy derives from the fact that these methods often include data from non-neuronal structures or artifacts by simply tracing pixels with high intensity. In this paper, we describe Neuron Image Analyzer (NIA), a novel algorithm that overcomes these inadequacies by employing a Laplacian of Gaussian filter and graphical models (i.e., Hidden Markov Model, Fully Connected Chain Model) to specifically extract relational pixel information corresponding to neuronal structures (i.e., soma, neurite). Because NIA is based on a vector representation, it is less likely to detect false signals (i.e., non-neuronal structures) or generate artifact signals (i.e., deformation of original structures) than current image analysis algorithms based on a raster representation. We demonstrate that NIA enables precise quantification of neuronal processes (e.g., length and orientation of neurites) in low quality images with a significant increase in the accuracy of detecting neuronal changes post-stimulation. PMID:26593337
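
    The Laplacian-of-Gaussian filtering step NIA builds on can be illustrated with a minimal sketch using `scipy.ndimage.gaussian_laplace` on a synthetic image; the filter scale and toy data below are illustrative, not the paper's settings:

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_laplace

    def log_response(image, sigma=1.5):
        """Laplacian-of-Gaussian response. Bright curvilinear structures
        (e.g. neurites) on a dark background give strongly negative LoG
        values at their centre, so the sign is flipped to make them positive."""
        return -gaussian_laplace(image.astype(float), sigma=sigma)

    # Toy image: one bright horizontal "neurite" buried in noise.
    rng = np.random.default_rng(0)
    img = rng.normal(0.0, 0.05, (64, 64))
    img[32, 10:54] += 1.0  # the neurite

    resp = log_response(img)
    # The response peaks on the neurite row, not in the background noise.
    print(resp[32, 30] > resp[10, 30])  # → True
    ```

    In practice this response map would then feed the graphical-model stage that links high-response pixels into connected neuronal structures.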

  11. Analysis of Selected Factors Relative to Automated School Scheduling Processes.

    ERIC Educational Resources Information Center

    Chaffee, Leonard M.; Heller, Robert W.

    Project PASS (Project in Automated School Scheduling) was sponsored in 1965 by the Western New York School Study Council to provide in-service education for school personnel contemplating the use of automated approaches to school scheduling. Two techniques were utilized--Class Loading and Student Selection (CLASS), and General Academic Simulation…

  12. Analyzing Automated Instructional Systems: Metaphors from Related Design Professions.

    ERIC Educational Resources Information Center

    Jonassen, David H.; Wilson, Brent G.

    Noting that automation has had an impact on virtually every manufacturing and information operation in the world, including instructional design (ID), this paper suggests three basic metaphors for automating instructional design activities: (1) computer-aided design and manufacturing (CAD/CAM) systems; (2) expert system advisor systems; and (3)…

  13. Automated diagnosis of Age-related Macular Degeneration using greyscale features from digital fundus images.

    PubMed

    Mookiah, Muthu Rama Krishnan; Acharya, U Rajendra; Koh, Joel E W; Chandran, Vinod; Chua, Chua Kuang; Tan, Jen Hong; Lim, Choo Min; Ng, E Y K; Noronha, Kevin; Tong, Louis; Laude, Augustinus

    2014-10-01

    Age-related Macular Degeneration (AMD) is one of the major causes of vision loss and blindness in the ageing population. Currently, there is no cure for AMD; however, early detection and subsequent treatment may prevent severe vision loss or slow the progression of the disease. AMD can be classified into two types: dry and wet. Most people with macular degeneration are affected by the dry form. Early symptoms of AMD are the formation of drusen and yellow pigmentation. These lesions are identified by manual inspection of fundus images by ophthalmologists. This is a time-consuming, tiresome process; hence, an automated AMD screening tool can significantly aid clinicians in their diagnosis. This study proposes an automated dry AMD detection system using various entropies (Shannon, Kapur, Renyi and Yager), Higher Order Spectra (HOS) bispectra features, Fractal Dimension (FD), and Gabor wavelet features extracted from greyscale fundus images. The features are ranked using t-test, Kullback-Leibler Divergence (KLD), Chernoff Bound and Bhattacharyya Distance (CBBD), Receiver Operating Characteristic (ROC) curve-based and Wilcoxon ranking methods in order to select optimum features, and classified into normal and AMD classes using Naive Bayes (NB), k-Nearest Neighbour (k-NN), Probabilistic Neural Network (PNN), Decision Tree (DT) and Support Vector Machine (SVM) classifiers. The performance of the proposed system is evaluated using private (Kasturba Medical Hospital, Manipal, India), Automated Retinal Image Analysis (ARIA) and STructured Analysis of the Retina (STARE) datasets. The proposed system yielded the highest average classification accuracies of 90.19%, 95.07% and 95% with 42, 54 and 38 optimal ranked features using the SVM classifier for the private, ARIA and STARE datasets respectively. This automated AMD detection system can be used for mass fundus image screening and aid clinicians by making better use of their expertise on selected images that
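
    The rank-then-classify pipeline described above can be sketched generically. The toy example below ranks synthetic features by absolute t-statistic and classifies with a simple nearest-centroid rule standing in for the SVM stage; all data, dimensions and the classifier choice are invented for illustration:

    ```python
    import numpy as np
    from scipy.stats import ttest_ind

    rng = np.random.default_rng(1)
    # Synthetic stand-in data: 40 "normal" and 40 "AMD" samples with 10
    # features each, of which only the first 3 actually separate the classes.
    X_norm = rng.normal(0, 1, (40, 10))
    X_amd = rng.normal(0, 1, (40, 10))
    X_amd[:, :3] += 2.0

    # Rank features by absolute t-statistic (the t-test ranking step).
    t, _ = ttest_ind(X_norm, X_amd)
    top = np.argsort(-np.abs(t))[:3]
    print(sorted(top.tolist()))  # the informative features rise to the top

    # Nearest-centroid classification on the selected features (a simple
    # stand-in for the SVM stage described in the abstract).
    mu_n = X_norm[:, top].mean(axis=0)
    mu_a = X_amd[:, top].mean(axis=0)
    sample = np.array([2.0, 2.0, 2.0])  # an AMD-like test sample
    label = "AMD" if np.linalg.norm(sample - mu_a) < np.linalg.norm(sample - mu_n) else "normal"
    print(label)
    ```

    The same skeleton applies to any of the ranking methods listed (KLD, CBBD, ROC, Wilcoxon): only the scoring function that produces the ranking changes.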

  14. Extraction, identification, and functional characterization of a bioactive substance from automated compound-handling plastic tips.

    PubMed

    Watson, John; Greenough, Emily B; Leet, John E; Ford, Michael J; Drexler, Dieter M; Belcastro, James V; Herbst, John J; Chatterjee, Moneesh; Banks, Martyn

    2009-06-01

    Disposable plastic labware is ubiquitous in contemporary pharmaceutical research laboratories. Plastic labware is routinely used for chemical compound storage and during automated liquid-handling processes that support assay development, high-throughput screening, structure-activity determinations, and liability profiling. However, there is little information available in the literature on the contaminants released from plastic labware upon DMSO exposure and their resultant effects on specific biological assays. The authors report here the extraction, by simple DMSO washing, of a biologically active substance from one particular size of disposable plastic tips used in automated compound handling. The active contaminant was identified as erucamide ((Z)-docos-13-enamide), a long-chain mono-unsaturated fatty acid amide commonly used in plastics manufacturing, by gas chromatography/mass spectrometry analysis of the DMSO-extracted material. Tip extracts prepared in DMSO, as well as a commercially obtained sample of erucamide, were active in a functional bioassay of a known G-protein-coupled fatty acid receptor. A sample of a different disposable tip product from the same vendor did not release detectable erucamide following solvent extraction, and DMSO extracts prepared from this product were inactive in the receptor functional assay. These results demonstrate that solvent-extractable contaminants from some plastic labware used in the contemporary pharmaceutical research and development (R&D) environment can be introduced into physical and biological assays during routine compound management liquid-handling processes. These contaminants may further possess biological activity and are therefore a potential source of assay-specific confounding artifacts.

  15. Automated Kinematic Extraction of Wing and Body Motions of Free Flying Diptera

    NASA Astrophysics Data System (ADS)

    Kostreski, Nicholas I.

    In the quest to understand the forces generated by micro aerial systems powered by oscillating appendages, it is necessary to study the kinematics that generate those forces. Automated and manual tracking techniques were developed to extract the complex wing and body motions of dipteran insects, ideal micro aerial systems, in free flight. Video sequences were captured by three high-speed cameras (7500 fps) oriented orthogonally around a clear flight test chamber. Synchronization and image-based triggering were made possible by an automated triggering circuit. A multi-camera calibration was implemented using image-based tracking techniques. Three-dimensional reconstructions of the insect were generated from the 2-D images by shape from silhouette (SFS) methods. An intensity based segmentation of the wings and body was performed using a mixture of Gaussians. In addition to geometric and cost based filtering, spectral clustering was also used to refine the reconstruction and Principal Component Analysis (PCA) was performed to find the body roll axis and wing-span axes. The unobservable roll state of the cylindrically shaped body was successfully estimated by combining observations of the wing kinematics with a wing symmetry assumption. Wing pitch was determined by a ray tracing technique to compute and minimize a point-to-line cost function. Linear estimation with assumed motion models was accomplished by discrete Kalman filtering the measured body states. Generative models were developed for different species of Diptera for model based tracking, simulation, and extraction of inertial properties. Manual and automated tracking results were analyzed and insect flight simulation videos were developed to quantify ground truth errors for an assumed model. The results demonstrated the automated tracker to have comparable performance to a human digitizer, though manual techniques displayed superiority during aggressive maneuvers and image blur. Both techniques demonstrated
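
    The "linear estimation with assumed motion models" step amounts to discrete Kalman filtering of the measured body states. A minimal constant-velocity filter on one synthetic state coordinate, assuming the 7500 fps frame rate from the abstract but otherwise invented noise parameters, looks like:

    ```python
    import numpy as np

    # Constant-velocity Kalman filter for one body coordinate; frame rate
    # matches the 7500 fps cameras, all noise levels are invented.
    dt = 1.0 / 7500.0
    F = np.array([[1.0, dt], [0.0, 1.0]])  # state transition (position, velocity)
    H = np.array([[1.0, 0.0]])             # only position is measured
    Q = 1e-6 * np.eye(2)                   # process noise covariance
    R = np.array([[1e-4]])                 # measurement noise (sd = 0.01)

    x, P = np.zeros(2), np.eye(2)
    true_vel = 3.0
    rng = np.random.default_rng(2)
    for k in range(500):
        z = true_vel * dt * k + rng.normal(0.0, 0.01)  # noisy position measurement
        x = F @ x                                      # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + R                            # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)                 # Kalman gain
        x = x + K @ (np.array([z]) - H @ x)            # update
        P = (np.eye(2) - K @ H) @ P

    print(round(x[1], 2))  # estimated velocity, close to true_vel
    ```

    The per-frame measurement noise here dwarfs the per-frame motion, yet the filter still recovers the velocity by integrating over many frames, which is exactly why filtering helps with blurred or noisy tracker output.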

  16. Automated extraction of natural drainage density patterns for the conterminous United States through high performance computing

    USGS Publications Warehouse

    Stanislawski, Larry V.; Falgout, Jeff T.; Buttenfield, Barbara P.

    2015-01-01

    Hydrographic networks form an important data foundation for cartographic base mapping and for hydrologic analysis. Drainage density patterns for these networks can be derived to characterize local landscape, bedrock and climate conditions, and further inform hydrologic and geomorphological analysis by indicating areas where too few headwater channels have been extracted. But natural drainage density patterns are not consistently available in existing hydrographic data for the United States because compilation and capture criteria historically varied, along with climate, during the period of data collection over the various terrain types throughout the country. This paper demonstrates an automated workflow that is being tested in a high-performance computing environment by the U.S. Geological Survey (USGS) to map natural drainage density patterns at the 1:24,000-scale (24K) for the conterminous United States. Hydrographic network drainage patterns may be extracted from elevation data to guide corrections for existing hydrographic network data. The paper describes three stages in this workflow including data pre-processing, natural channel extraction, and generation of drainage density patterns from extracted channels. The workflow is concurrently implemented by executing procedures on multiple subbasin watersheds within the U.S. National Hydrography Dataset (NHD). Pre-processing defines parameters that are needed for the extraction process. Extraction proceeds in standard fashion: filling sinks, developing flow direction and weighted flow accumulation rasters. Drainage channels with assigned Strahler stream order are extracted within a subbasin and simplified. Drainage density patterns are then estimated with 100-meter resolution and subsequently smoothed with a low-pass filter. The extraction process is found to be of better quality in higher slope terrains. Concurrent processing through the high performance computing environment is shown to facilitate and refine
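
    The final stage of this kind of workflow, turning an extracted channel raster into a smoothed drainage-density surface, can be sketched as follows; cell size, window size and the toy channel layout are illustrative, not the USGS parameters:

    ```python
    import numpy as np
    from scipy.ndimage import uniform_filter

    cell = 100.0                        # metres per raster cell (100-m resolution)
    channels = np.zeros((50, 50))
    channels[25, :] = 1.0               # an extracted main channel
    channels[:, 10] = 1.0               # a tributary

    # Channel length per unit area in a moving window; the subsequent
    # low-pass smoothing is the same kind of averaging at a larger scale.
    frac = uniform_filter(channels, size=11)   # fraction of channel cells per window
    density = frac / cell * 1000.0             # km of channel per square km

    print(density[25, 30] > density[5, 30])  # → True (denser near the channel)
    ```

    The unit conversion works because each channel cell contributes roughly one cell-length of channel, so (fraction of channel cells) / (cell size) is length per unit area, scaled to km/km².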

  17. Automated headspace solid-phase dynamic extraction for the determination of cannabinoids in hair samples.

    PubMed

    Musshoff, Frank; Lachenmeier, Dirk W; Kroener, Lars; Madea, Burkhard

    2003-04-23

    This article describes a fully automated procedure for detecting cannabinoids in human hair samples. The procedure uses alkaline hydrolysis and headspace solid-phase dynamic extraction (HS-SPDE), followed by on-coating derivatization and gas chromatography-mass spectrometry (GC-MS). SPDE is a further development of solid-phase microextraction (SPME), based on an inside needle capillary absorption trap. It uses a hollow needle with an internal coating of polydimethylsiloxane as extraction and pre-concentration medium. Ten mg of hair were washed with deionised water, petroleum ether and dichloromethane. After adding deuterated internal standards, the sample was hydrolyzed with sodium hydroxide and directly submitted to HS-SPDE. After absorption of analytes for an on-coating derivatization procedure, the SPDE-needle was directly placed into the headspace of a second vial containing N-methyl-N-trimethylsilyl-trifluoroacetamide before GC-MS analysis. The limit of detection was 0.14 ng/mg for Delta(9)-tetrahydrocannabinol, 0.09 ng/mg for cannabidiol, and 0.12 ng/mg for cannabinol. Absolute recoveries were in the range of 0.6 to 8.4%. Linearity was verified over a range from 0.2 to 20 ng/mg, with coefficients of correlation between 0.998 and 0.999. Intra- and inter-day precision were determined at two different concentrations and resulted in ranges between 2.3 and 6.0% (intra-day) and 3.3 and 7.6% (inter-day). Compared with conventional methods of hair analysis, this automated HS-SPDE-GC-MS procedure is substantially faster. It is easy to perform without using solvents and with minimal sample quantities, and it yields the same sensitivity and reproducibility. Compared to SPME, we found a higher extraction rate, coupled with a faster automated operation and greater stability of the device.

  18. Munitions related feature extraction from LIDAR data.

    SciTech Connect

    Roberts, Barry L.

    2010-06-01

    The characterization of former military munitions ranges is critical in the identification of areas likely to contain residual unexploded ordnance (UXO). Although these ranges are large, often covering tens of thousands of acres, the actual target areas represent only a small fraction of the sites. The challenge is that many of these sites do not have records indicating locations of former target areas. The identification of target areas is critical in the characterization and remediation of these sites. The Strategic Environmental Research and Development Program (SERDP) and Environmental Security Technology Certification Program (ESTCP) of the DoD have been developing and implementing techniques for the efficient characterization of large munitions ranges. As part of this process, high-resolution LIDAR terrain data sets have been collected over several former ranges. These data sets have been shown to contain information relating to former munitions usage at these ranges, specifically terrain cratering due to high-explosives detonations. The location and relative intensity of crater features can provide information critical in reconstructing the usage history of a range, and indicate areas most likely to contain UXO. We have developed an automated procedure using an adaptation of the Circular Hough Transform for the identification of crater features in LIDAR terrain data. The Circular Hough Transform is highly adept at finding circular features (craters) in noisy terrain data sets. This technique has the ability to find features of a specific radius, providing a means of filtering features based on expected scale and providing additional spatial characterization of the identified feature. This method of automated crater identification has been applied to several former munitions ranges with positive results.
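
    The core of the Circular Hough Transform is radius-constrained voting: every edge point votes for all centres lying at the given radius from it, and true circle centres accumulate the most votes. A small self-contained sketch (not the authors' implementation) recovers the centre of a synthetic crater rim:

    ```python
    import numpy as np

    def hough_circle_votes(edge_points, radius, shape):
        """Accumulate Circular Hough Transform votes for a single radius:
        each edge point votes for every candidate centre at distance `radius`."""
        acc = np.zeros(shape)
        thetas = np.linspace(0, 2 * np.pi, 90, endpoint=False)
        for (y, x) in edge_points:
            cy = np.round(y - radius * np.sin(thetas)).astype(int)
            cx = np.round(x - radius * np.cos(thetas)).astype(int)
            ok = (cy >= 0) & (cy < shape[0]) & (cx >= 0) & (cx < shape[1])
            np.add.at(acc, (cy[ok], cx[ok]), 1)
        return acc

    # Synthetic "crater rim": edge points on a circle of radius 8 centred at (30, 30).
    angles = np.linspace(0, 2 * np.pi, 60, endpoint=False)
    rim = [(30 + 8 * np.sin(a), 30 + 8 * np.cos(a)) for a in angles]

    acc = hough_circle_votes(rim, radius=8, shape=(64, 64))
    peak = np.unravel_index(np.argmax(acc), acc.shape)
    print(peak)  # accumulator peak at (or adjacent to) the true centre
    ```

    Running the vote over a sweep of radii is what gives the scale filtering the abstract mentions: each radius gets its own accumulator, and only craters of the expected size produce a sharp peak.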

  19. Establishing a novel automated magnetic bead-based method for the extraction of DNA from a variety of forensic samples.

    PubMed

    Witt, Sebastian; Neumann, Jan; Zierdt, Holger; Gébel, Gabriella; Röscheisen, Christiane

    2012-09-01

    Automated systems have been increasingly utilized for DNA extraction by many forensic laboratories to handle growing numbers of forensic casework samples while minimizing the risk of human errors and assuring high reproducibility. The step towards automation, however, is not easy: the automated extraction method has to be very versatile to reliably prepare high yields of pure genomic DNA from a broad variety of sample types on different carrier materials. To prevent possible cross-contamination of samples or the loss of DNA, the components of the kit have to be designed in a way that allows for the automated handling of the samples with no manual intervention necessary. DNA extraction using paramagnetic particles coated with a DNA-binding surface is predestined for an automated approach. For this study, we tested different DNA extraction kits using DNA-binding paramagnetic particles with regard to DNA yield and handling by a Freedom EVO(®) 150 extraction robot (Tecan) equipped with a Te-MagS magnetic separator. Among others, the extraction kits tested were the ChargeSwitch(®) Forensic DNA Purification Kit (Invitrogen), the PrepFiler™ Automated Forensic DNA Extraction Kit (Applied Biosystems) and NucleoMag™ 96 Trace (Macherey-Nagel). After an extensive test phase, we established a novel magnetic bead extraction method based upon the NucleoMag™ extraction kit (Macherey-Nagel). The new method is readily automatable and produces high yields of DNA from different sample types (blood, saliva, sperm, contact stains) on various substrates (filter paper, swabs, cigarette butts) with no evidence of a loss of magnetic beads or sample cross-contamination.

  20. Fully automated DNA extraction from blood using magnetic particles modified with a hyperbranched polyamidoamine dendrimer.

    PubMed

    Yoza, Brandon; Arakaki, Atsushi; Maruyama, Kohei; Takeyama, Haruko; Matsunaga, Tadashi

    2003-01-01

    Bacterial and artificial magnetic particles were modified using a polyamidoamine (PAMAM) dendrimer, and the number of outer-shell amines was determined. Bacterial magnetic particles were the most consistently modified. Transmission electron microscopic (TEM) analysis showed that the artificial magnetic particles were structurally damaged by the modification process, including sonication. Furthermore, laser particle analysis of the magnetite also revealed damage. Small quantities of dendrimer-modified bacterial magnetic particles were used to extract DNA from blood. The efficiency of DNA recovery was consistently about 30 ng of DNA using 2-10 microg of dendrimer-modified bacterial magnetite. This technique was fully automated using newly developed liquid handling robots and bacterial magnetic particles.

  1. Automated solid-phase extraction of herbicides from water for gas chromatographic-mass spectrometric analysis

    USGS Publications Warehouse

    Meyer, M.T.; Mills, M.S.; Thurman, E.M.

    1993-01-01

    An automated solid-phase extraction (SPE) method was developed for the pre-concentration of chloroacetanilide and triazine herbicides, and two triazine metabolites from 100-ml water samples. Breakthrough experiments for the C18 SPE cartridge show that the two triazine metabolites are not fully retained and that increasing flow-rate decreases their retention. Standard curve r2 values of 0.998-1.000 for each compound were consistently obtained and a quantitation level of 0.05 µg/l was achieved for each compound tested. More than 10,000 surface and ground water samples have been analyzed by this method.
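
    The linearity criterion (standard-curve r² of 0.998-1.000) is ordinary least-squares bookkeeping; a quick sketch with invented calibration data for one analyte:

    ```python
    import numpy as np

    # Illustrative standard curve: known concentrations vs. instrument
    # response (all numbers invented, roughly linear by construction).
    conc = np.array([0.05, 0.1, 0.5, 1.0, 2.0, 5.0])       # ug/L
    resp = np.array([0.11, 0.21, 1.02, 2.05, 3.98, 10.1])  # peak area, arbitrary units

    slope, intercept = np.polyfit(conc, resp, 1)
    pred = slope * conc + intercept
    ss_res = np.sum((resp - pred) ** 2)
    ss_tot = np.sum((resp - resp.mean()) ** 2)
    r2 = 1 - ss_res / ss_tot
    print(r2 > 0.998)  # → True: the curve meets the linearity criterion
    ```

    Unknown samples are then quantified by inverting the fitted line, `(area - intercept) / slope`, which is why a near-unity r² down to the quantitation level matters.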

  2. Automated CO2 extraction from air for clumped isotope analysis in the atmo- and biosphere

    NASA Astrophysics Data System (ADS)

    Hofmann, Magdalena; Ziegler, Martin; Pons, Thijs; Lourens, Lucas; Röckmann, Thomas

    2015-04-01

    The conventional stable isotope ratios 13C/12C and 18O/16O in atmospheric CO2 are a powerful tool for unraveling the global carbon cycle. In recent years, it has been suggested that the abundance of the very rare isotopologue 13C18O16O on m/z 47 might be a promising tracer to complement conventional stable isotope analysis of atmospheric CO2 [Affek and Eiler, 2006; Affek et al. 2007; Eiler and Schauble, 2004; Yeung et al., 2009]. Here we present an automated analytical system that is designed for clumped isotope analysis of atmo- and biospheric CO2. The carbon dioxide gas is quantitatively extracted from about 1.5 L of air (ATP). The automated stainless steel extraction and purification line consists of three main components: (i) a drying unit (a magnesium perchlorate unit and a cryogenic water trap), (ii) two CO2 traps cooled with liquid nitrogen [Werner et al., 2001] and (iii) a GC column packed with Porapak Q that can be cooled with liquid nitrogen to -30°C during purification and heated up to 230°C in-between two extraction runs. After CO2 extraction and purification, the CO2 is automatically transferred to the mass spectrometer. Mass spectrometric analysis of the 13C18O16O abundance is carried out in dual inlet mode on a MAT 253 mass spectrometer. Each analysis generally consists of 80 change-over-cycles. Three additional Faraday cups were added to the mass spectrometer for simultaneous analysis of the mass-to-charge ratios 44, 45, 46, 47, 48 and 49. The reproducibility for δ13C, δ18O and Δ47 for repeated CO2 extractions from air is in the range of 0.11‰ (SD), 0.18‰ (SD) and 0.02‰ (SD), respectively. This automated CO2 extraction and purification system will be used to analyse the clumped isotopic signature in atmospheric CO2 (tall tower, Cabauw, Netherlands) and to study the clumped isotopic fractionation during photosynthesis (leaf chamber experiments) and soil respiration. References Affek, H. P., Xu, X. & Eiler, J. M., Geochim. Cosmochim. Acta 71, 5033
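
    The Δ47 bookkeeping behind clumped-isotope analysis can be sketched numerically. The definition below follows the standard stochastic-reference form (deviation of mass-47 abundance from a random isotope distribution, in permil); the atom ratios are illustrative values, not the authors' reference-gas composition:

    ```python
    # Atom ratios (13C/12C, 17O/16O, 18O/16O): illustrative values only.
    r13, r17, r18 = 0.011180, 0.000380, 0.002005

    # Stochastic (random-distribution) isotopologue abundances relative to mass 44:
    #   mass 45: 13C16O16O + 12C17O16O
    #   mass 46: 12C18O16O + 13C17O16O + 12C17O17O
    #   mass 47: 13C18O16O + 12C17O18O + 13C17O17O
    R45s = r13 + 2 * r17
    R46s = 2 * r18 + 2 * r13 * r17 + r17 ** 2
    R47s = 2 * r13 * r18 + 2 * r17 * r18 + r13 * r17 ** 2

    def delta47(R45, R46, R47):
        """Deviation of measured ratios from the stochastic distribution, in permil."""
        return ((R47 / R47s - 1) - (R46 / R46s - 1) - (R45 / R45s - 1)) * 1000.0

    # Sanity check: a gas whose measured ratios equal the stochastic ones
    # has Delta47 = 0 by definition.
    print(delta47(R45s, R46s, R47s))  # → 0.0
    ```

    The instrument measures the mass-44 through mass-49 beams; converting beam ratios to the R45-R47 values above (and correcting for 17O) is where the cited Affek and Eiler methodology comes in.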

  3. BRONCO: Biomedical entity Relation ONcology COrpus for extracting gene-variant-disease-drug relations.

    PubMed

    Lee, Kyubum; Lee, Sunwon; Park, Sungjoon; Kim, Sunkyu; Kim, Suhkyung; Choi, Kwanghun; Tan, Aik Choon; Kang, Jaewoo

    2016-01-01

    Comprehensive knowledge of genomic variants in a biological context is key for precision medicine. As next-generation sequencing technologies improve, the amount of literature containing genomic variant data, such as new functions or related phenotypes, rapidly increases. Because numerous articles are published every day, it is almost impossible to manually curate all the variant information from the literature. Many researchers focus on creating an improved automated biomedical natural language processing (BioNLP) method that extracts useful variants and their functional information from the literature. However, there is no gold-standard data set that contains texts annotated with variants and their related functions. To overcome these limitations, we introduce a Biomedical entity Relation ONcology COrpus (BRONCO) that contains more than 400 variants and their relations with genes, diseases, drugs and cell lines in the context of cancer and anti-tumor drug screening research. The variants and their relations were manually extracted from 108 full-text articles. BRONCO can be utilized to evaluate and train new methods used for extracting biomedical entity relations from full-text publications, and thus be a valuable resource to the biomedical text mining research community. Using BRONCO, we quantitatively and qualitatively evaluated the performance of three state-of-the-art BioNLP methods. We also identified their shortcomings, and suggested remedies for each method. We implemented post-processing modules for the three BioNLP methods, which improved their performance. Database URL: http://infos.korea.ac.kr/bronco. PMID:27074804

  4. BRONCO: Biomedical entity Relation ONcology COrpus for extracting gene-variant-disease-drug relations

    PubMed Central

    Lee, Kyubum; Lee, Sunwon; Park, Sungjoon; Kim, Sunkyu; Kim, Suhkyung; Choi, Kwanghun; Tan, Aik Choon; Kang, Jaewoo

    2016-01-01

    Comprehensive knowledge of genomic variants in a biological context is key for precision medicine. As next-generation sequencing technologies improve, the amount of literature containing genomic variant data, such as new functions or related phenotypes, rapidly increases. Because numerous articles are published every day, it is almost impossible to manually curate all the variant information from the literature. Many researchers focus on creating an improved automated biomedical natural language processing (BioNLP) method that extracts useful variants and their functional information from the literature. However, there is no gold-standard data set that contains texts annotated with variants and their related functions. To overcome these limitations, we introduce a Biomedical entity Relation ONcology COrpus (BRONCO) that contains more than 400 variants and their relations with genes, diseases, drugs and cell lines in the context of cancer and anti-tumor drug screening research. The variants and their relations were manually extracted from 108 full-text articles. BRONCO can be utilized to evaluate and train new methods used for extracting biomedical entity relations from full-text publications, and thus be a valuable resource to the biomedical text mining research community. Using BRONCO, we quantitatively and qualitatively evaluated the performance of three state-of-the-art BioNLP methods. We also identified their shortcomings, and suggested remedies for each method. We implemented post-processing modules for the three BioNLP methods, which improved their performance. Database URL: http://infos.korea.ac.kr/bronco PMID:27074804
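
    Evaluating an extractor against a gold-standard corpus like BRONCO reduces to set comparisons of predicted versus manually curated relations; a toy sketch with invented gene-variant pairs:

    ```python
    # Gold-standard relations (as curated pairs) vs. a system's predictions.
    # All pairs here are invented examples, not BRONCO annotations.
    gold = {("BRAF", "V600E"), ("EGFR", "T790M"), ("KRAS", "G12D")}
    pred = {("BRAF", "V600E"), ("EGFR", "T790M"), ("TP53", "R175H")}

    tp = len(gold & pred)                 # correctly predicted relations
    precision = tp / len(pred)            # fraction of predictions that are right
    recall = tp / len(gold)               # fraction of gold relations recovered
    f1 = 2 * precision * recall / (precision + recall)
    print(round(precision, 3), round(recall, 3), round(f1, 3))
    ```

    Post-processing modules of the kind the authors describe act on `pred` before this comparison, typically trading a little recall for a larger precision gain.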

  5. CHANNEL MORPHOLOGY TOOL (CMT): A GIS-BASED AUTOMATED EXTRACTION MODEL FOR CHANNEL GEOMETRY

    SciTech Connect

    JUDI, DAVID; KALYANAPU, ALFRED; MCPHERSON, TIMOTHY; BERSCHEID, ALAN

    2007-01-17

    This paper describes an automated Channel Morphology Tool (CMT) developed in the ArcGIS 9.1 environment. The CMT creates cross-sections along a stream centerline and uses a digital elevation model (DEM) to create station points with elevations along each of the cross-sections. The generated cross-sections may then be exported into a hydraulic model. Along with rapid cross-section generation, the CMT also eliminates any cross-section overlaps that might occur due to the sinuosity of the channels, using the Cross-section Overlap Correction Algorithm (COCoA). The CMT was tested by extracting cross-sections from a 5-m DEM for a 50-km channel length in Houston, Texas. The extracted cross-sections were compared directly with surveyed cross-sections in terms of the cross-section area. Results indicated that the CMT-generated cross-sections satisfactorily matched the surveyed data.
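
    The geometric core of detecting cross-section overlaps in a sinuous channel is a 2-D segment-intersection test; a minimal sketch of that primitive (not the COCoA algorithm itself):

    ```python
    def ccw(a, b, c):
        """Signed area test: positive if a->b->c turns counter-clockwise."""
        return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

    def segments_cross(p1, p2, q1, q2):
        """True if segments p1p2 and q1q2 properly intersect (endpoints excluded)."""
        d1, d2 = ccw(q1, q2, p1), ccw(q1, q2, p2)
        d3, d4 = ccw(p1, p2, q1), ccw(p1, p2, q2)
        return (d1 * d2 < 0) and (d3 * d4 < 0)

    # Two cross-sections drawn across a meander bend that overlap...
    print(segments_cross((0, 0), (4, 4), (0, 4), (4, 0)))  # → True
    # ...and two parallel cross-sections that do not.
    print(segments_cross((0, 0), (1, 0), (0, 2), (1, 2)))  # → False
    ```

    An overlap-correction step would run this test on neighbouring cross-section pairs and shorten or re-orient any pair that crosses.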

  6. Automated DNA extraction, quantification, dilution, and PCR preparation for genotyping by high-resolution melting.

    PubMed

    Seipp, Michael T; Herrmann, Mark; Wittwer, Carl T

    2010-12-01

    Genotyping by high-resolution amplicon melting uses only two PCR primers per locus and a generic, saturating DNA dye that detects heteroduplexes as well as homoduplexes. Heterozygous genotypes have a characteristic melting curve shape and a broader width than homozygous genotypes, which are usually differentiated by their melting temperature (T(m)). The H63D mutation, associated with hemochromatosis, is a single nucleotide polymorphism, which is impossible to genotype based on T(m), as the homozygous WT and mutant amplicons melt at the same temperature. To distinguish such homozygous variants, WT DNA can be added to controls and unknown samples to create artificial heterozygotes with all genotypes distinguished by quantitative heteroduplex analysis. By automating DNA extraction, quantification, and PCR preparation, a hands-off integrated solution for genotyping is possible. A custom Biomek® NX robot with an onboard spectrophotometer and custom programming was used to extract DNA from whole blood, dilute the DNA to appropriate concentrations, and add the sample DNA to pre-prepared PCR plates. Agencourt® Genfind™ v.2 chemistry was used for DNA extraction. PCR was performed on a plate thermocycler, and high-resolution melting data were collected on a LightScanner-96, followed by analysis and automatic genotyping using custom software. In a blinded study of 42 H63D samples, 41 of the 42 sample genotypes were concordant with dual hybridization probe genotyping. The incorrectly assigned genotype was a heterozygote that appeared to be a homozygous mutant as a result of a low sample DNA concentration. Automated DNA extraction from whole blood with quantification, dilution, and PCR preparation was demonstrated using quantitative heteroduplex analysis. Accuracy is critically dependent on DNA quantification.

  7. Development and validation of an automated unit for the extraction of radiocaesium from seawater.

    PubMed

    Bokor, Ilonka; Sdraulig, Sandra; Jenkinson, Peter; Madamperuma, Janaka; Martin, Paul

    2016-01-01

    An automated unit was developed for the in-situ extraction of radiocaesium ((137)Cs and (134)Cs) from large volumes of seawater to achieve very low detection limits. The unit was designed for monitoring of Australian ocean and coastal waters, including at ports visited by nuclear-powered warships. The unit is housed within a robust case, and is easily transported and operated. It contains four filter cartridges connected in series. The first two cartridges are used to remove any suspended material that may be present in the seawater, while the last two cartridges are coated with potassium copper hexacyanoferrate for caesium extraction. Once the extraction is completed the coated cartridges are ashed. The ash is transferred to a small petri dish for counting of (137)Cs and (134)Cs by high resolution gamma spectrometry for a minimum of 24 h. The extraction method was validated for the following criteria: selectivity, trueness, precision, linearity, limit of detection and traceability. The validation showed the unit to be fit for purpose with the method capable of achieving low detection limits required for environmental samples. The results for the environmental measurements in Australian seawater correlate well with those reported in the Worldwide Marine Radioactivity Study (WOMARS). The cost of preparation and running the system is low and waste generation is minimal. PMID:26330020

  8. Automated Detection and Extraction of Coronal Dimmings from SDO/AIA Data

    NASA Astrophysics Data System (ADS)

    Davey, Alisdair R.; Attrill, G. D. R.; Wills-Davey, M. J.

    2010-05-01

    The sheer volume of data anticipated from the Solar Dynamics Observatory/Atmospheric Imaging Assembly (SDO/AIA) highlights the necessity for the development of automatic detection methods for various types of solar activity. Coronal dimmings, initially recognised in the 1970s, are now well established as closely associated with coronal mass ejections (CMEs), and are particularly recognised as an indicator of front-side (halo) CMEs, which can be difficult to detect in white-light coronagraph data. An automated coronal dimming region detection and extraction algorithm removes visual observer bias from determination of physical quantities such as spatial location, area and volume. This allows reproducible, quantifiable results to be mined from very large datasets. The information derived may facilitate more reliable early space weather detection, as well as offering the potential for conducting large-sample studies focused on determining the geoeffectiveness of CMEs, coupled with analysis of their associated coronal dimmings. We present examples of dimming events extracted using our algorithm from existing EUV data, demonstrating the potential for the anticipated application to SDO/AIA data. Metadata returned by our algorithm include: location, area, volume, mass and dynamics of coronal dimmings. As well as running on historic datasets, this algorithm is capable of detecting and extracting coronal dimmings in near real-time. The coronal dimming detection and extraction algorithm described in this poster is part of the SDO/Computer Vision Center effort hosted at SAO (Martens et al., 2009). We acknowledge NASA grant NNH07AB97C.
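
    A common base-difference approach to dimming detection (a generic sketch, not necessarily the authors' pipeline) thresholds the intensity drop against a pre-event frame and then measures the dimming region:

    ```python
    import numpy as np
    from scipy.ndimage import label

    # Synthetic EUV frames: uniform pre-event intensity, then a localized
    # brightness drop standing in for a coronal dimming.
    pre = np.full((100, 100), 100.0)
    post = pre.copy()
    post[40:60, 40:70] -= 60.0          # the dimming region

    diff = post - pre
    mask = diff < -30.0                  # pixels that dimmed strongly
    regions, n = label(mask)             # connected-component labelling
    area_px = (regions == 1).sum()       # area of the first detected region
    print(n, area_px)  # → 1 600
    ```

    Location comes from the region's centroid, and area in physical units follows by multiplying the pixel count by the instrument's pixel footprint; volume and mass estimates need additional assumptions about the emitting plasma.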

  9. An automated string-based approach to extracting and characterizing White Matter fiber-bundles.

    PubMed

    Cauteruccio, Francesco; Stamile, Claudio; Terracina, Giorgio; Ursino, Domenico; Sappey-Marinier, Dominique

    2016-10-01

    In this paper, we propose an automated approach to extracting White Matter (WM) fiber-bundles through clustering and model characterization. The key novelties of our approach are: a new string-based formalism allowing an alternative representation of WM fibers; a new string dissimilarity metric; a WM fiber clustering technique; and a new model-based characterization algorithm. Thanks to these novelties, the complex problem of WM fiber-bundle extraction and characterization reduces to a much simpler and well-known string extraction and analysis problem. Interestingly, while several past approaches extract fiber-bundles by grouping available fibers on the basis of provided atlases (and, therefore, cannot capture possibly existing fiber-bundles not represented in the atlases), our approach first clusters available fibers once and for all, and then tries to associate the obtained clusters with models provided directly and dynamically by users. This more dynamic and interactive way of proceeding can help detect fiber-bundles autonomously proposed by our approach that are not present in the initial models provided by experts. PMID:27522235
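The paper's formalism and metric are its own; as a generic illustration of the idea, a fiber polyline can be quantized into a string of direction symbols and compared with an edit distance, so that fiber comparison becomes string comparison. A sketch with an assumed 2D encoding (the symbol alphabet and quantization are illustrative, not the authors'):

```python
def encode(points):
    """Quantize a 2D polyline into a string of coarse step directions (illustrative encoding)."""
    symbols = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dx, dy = x1 - x0, y1 - y0
        if abs(dx) >= abs(dy):
            symbols.append('R' if dx >= 0 else 'L')
        else:
            symbols.append('U' if dy >= 0 else 'D')
    return ''.join(symbols)

def edit_distance(a, b):
    """Classic Levenshtein distance between two strings (one possible dissimilarity)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[j - 1] + 1, prev[j - 1] + (ca != cb)))
        prev = cur
    return prev[-1]

f1 = encode([(0, 0), (1, 0), (2, 0), (2, 1)])   # 'RRU'
f2 = encode([(0, 0), (1, 0), (2, 0), (3, 0)])   # 'RRR'
print(edit_distance(f1, f2))                     # 1: the two fibers differ in one step
```

Any standard string-clustering technique (e.g. hierarchical clustering over the pairwise distance matrix) can then group the encoded fibers.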

  11. A Novel Validation Algorithm Allows for Automated Cell Tracking and the Extraction of Biologically Meaningful Parameters

    PubMed Central

    Madany Mamlouk, Amir; Schicktanz, Simone; Kruse, Charli

    2011-01-01

    Automated microscopy is currently the only method to non-invasively and label-free observe complex multi-cellular processes, such as cell migration, cell cycle, and cell differentiation. Extracting biological information from a time-series of micrographs requires each cell to be recognized and followed through sequential microscopic snapshots. Although recent attempts to automate this process have resulted in ever-improving cell detection rates, manual identification of identical cells is still the most reliable technique. However, its tedious and subjective nature has prevented tracking from becoming a standardized tool for the investigation of cell cultures. Here, we present a novel method to accomplish automated cell tracking with a reliability comparable to manual tracking. Previously, automated cell tracking could not rival the reliability of manual tracking because, in contrast to the human way of solving this task, none of the algorithms had an independent quality control mechanism; they lacked validation. Thus, instead of trying to improve the cell detection or tracking rates, we proceeded from the idea of automatically inspecting the tracking results and accepting only those of high trustworthiness, while rejecting all other results. This validation algorithm works independently of the quality of cell detection and tracking through a systematic search for tracking errors. It is based only on very general assumptions about the spatiotemporal contiguity of cell paths. While traditional tracking often aims to yield genealogic information about single cells, the natural outcome of a validated cell tracking algorithm turns out to be a set of complete, but often unconnected cell paths, i.e. records of cells from mitosis to mitosis. This is a consequence of the fact that the validation algorithm takes complete paths as the unit of rejection/acceptance. The resulting set of complete paths can be used to automatically extract important biological parameters with high

  12. Automated Large Scale Parameter Extraction of Road-Side Trees Sampled by a Laser Mobile Mapping System

    NASA Astrophysics Data System (ADS)

    Lindenbergh, R. C.; Berthold, D.; Sirmacek, B.; Herrero-Huerta, M.; Wang, J.; Ebersbach, D.

    2015-08-01

    In urbanized Western Europe trees are considered an important component of the built-up environment. This also means that there is an increasing demand for tree inventories. Laser mobile mapping systems provide an efficient and accurate way to sample the 3D road surroundings, including notable roadside trees. Indeed, at, say, 50 km/h such systems collect point clouds consisting of half a million points per 100 m. Methods exist that extract tree parameters from relatively small patches of such data, but a remaining challenge is to operationally extract roadside tree parameters at the regional level. For this purpose a workflow is presented as follows: the input point clouds are consecutively downsampled, retiled, classified, segmented into individual trees and upsampled to enable automated extraction of tree location, tree height, canopy diameter and trunk diameter at breast height (DBH). The workflow is implemented to work on a laser mobile mapping data set sampling 100 km of road in Sachsen, Germany, and is tested on a 7 km stretch of road. Along this road, the method detected 315 trees that were considered well detected, plus 56 clusters of tree points where no individual trees could be identified. Using voxels, the data volume could be reduced by about 97 % in a default scenario. Processing the results of this scenario took ~2500 seconds, corresponding to about 10 km/h, which approaches but is still below the acquisition rate, estimated at 50 km/h.
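The ~97% data reduction reported above comes from voxelization: points are binned into a regular 3D grid and each occupied voxel is represented once. A minimal sketch of that downsampling step (the voxel size and toy point set are illustrative assumptions, not the paper's parameters):

```python
def voxel_downsample(points, size):
    """Keep one representative point (the first seen) per occupied cubic voxel of edge `size`."""
    seen = {}
    for x, y, z in points:
        key = (int(x // size), int(y // size), int(z // size))  # integer voxel coordinates
        if key not in seen:
            seen[key] = (x, y, z)
    return list(seen.values())

# A dense run of 1000 points along a short diagonal, binned into 0.5 m voxels
pts = [(i * 0.001, i * 0.001, 0.0) for i in range(1000)]
reduced = voxel_downsample(pts, 0.5)
print(len(reduced))  # only the occupied voxels remain
```

Averaging all points per voxel instead of keeping the first is a common variant when positional accuracy matters more than speed.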

  13. Semi-automated extraction of landslides in Taiwan based on SPOT imagery and DEMs

    NASA Astrophysics Data System (ADS)

    Eisank, Clemens; Hölbling, Daniel; Friedl, Barbara; Chen, Yi-Chin; Chang, Kang-Tsung

    2014-05-01

    The vast availability and improved quality of optical satellite data and digital elevation models (DEMs), as well as the need for complete and up-to-date landslide inventories at various spatial scales, have fostered the development of semi-automated landslide recognition systems. Among the tested approaches for designing such systems, object-based image analysis (OBIA) has emerged as a highly promising methodology. OBIA offers a flexible, spatially enabled framework for effective landslide mapping. Most object-based landslide mapping systems, however, have been tailored to specific, mainly small-scale study areas or even to single landslides only. Even though reported mapping accuracies tend to be higher than for pixel-based approaches, accuracy values are still relatively low and depend on the particular study. There is still room to improve the applicability and objectivity of object-based landslide mapping systems. The presented study aims at developing a knowledge-based landslide mapping system implemented in an OBIA environment, i.e. Trimble eCognition. In comparison to previous knowledge-based approaches, the classification of segmentation-derived multi-scale image objects relies on digital landslide signatures. These signatures hold the common operational knowledge on digital landslide mapping, as reported by 25 Taiwanese landslide experts during personal semi-structured interviews. Specifically, the signatures include information on commonly used data layers, spectral and spatial features, and feature thresholds. The signatures guide the selection and implementation of mapping rules that were finally encoded in Cognition Network Language (CNL). Multi-scale image segmentation is optimized by using the improved Estimation of Scale Parameter (ESP) tool. The approach described above is developed and tested for mapping landslides in a sub-region of the Baichi catchment in Northern Taiwan based on SPOT imagery and a high-resolution DEM. An object

  14. AUTOMATION.

    ERIC Educational Resources Information Center

    Manpower Research Council, Milwaukee, WI.

    THE MANPOWER RESEARCH COUNCIL, A NONPROFIT SERVICE ORGANIZATION, HAS AS ITS OBJECTIVE THE DEVELOPMENT OF AN INTERCHANGE AMONG THE MANUFACTURING AND SERVICE INDUSTRIES OF THE UNITED STATES OF INFORMATION ON EMPLOYMENT, INDUSTRIAL RELATIONS TRENDS AND ACTIVITIES, AND MANAGEMENT PROBLEMS. A SURVEY OF 200 MEMBER CORPORATIONS, EMPLOYING A TOTAL OF…

  15. Investigation of automated feature extraction techniques for applications in cancer detection from multispectral histopathology images

    NASA Astrophysics Data System (ADS)

    Harvey, Neal R.; Levenson, Richard M.; Rimm, David L.

    2003-05-01

    Recent developments in imaging technology mean that it is now possible to obtain high-resolution histological image data at multiple wavelengths. This allows pathologists to image specimens over a full spectrum, thereby revealing (often subtle) distinctions between different types of tissue. With this type of data, the spectral content of the specimens, combined with quantitative spatial feature characterization, may make it possible not only to identify the presence of an abnormality, but also to classify it accurately. However, such are the quantities and complexities of these data that, without new automated techniques to assist in the data analysis, the information contained in the data will remain inaccessible to those who need it. We investigate the application of a recently developed system for the automated analysis of multi-/hyper-spectral satellite image data to the problem of cancer detection from multispectral histopathology image data. The system provides a means for a human expert to provide training data simply by highlighting regions in an image using a computer mouse. Application of these feature extraction techniques to examples of both training and out-of-training-sample data demonstrates that these, as yet unoptimized, techniques already show promise in the discrimination between benign and malignant cells from a variety of samples.

  16. Extraction of words from the national ID cards for automated recognition

    NASA Astrophysics Data System (ADS)

    Akhter, Md. Rezwan; Bhuiyan, Md. Hasanuzzaman; Uddin, Mohammad Shorif

    2011-10-01

    The government of Bangladesh introduced national ID cards in 2008 for all people aged 18 years and above. This card is now a de-facto identity document and finds diverse applications in vote casting, bank account opening and telephone subscription, as well as in many real-life transactions and security checks. To reap the full benefits of this versatile ID card, automated retrieval and recognition of an individual from this very large national database is an ultimate necessity. This work is a first step toward filling this gap by making the recognition automated. Here we have investigated an image analysis technique to extract the words that will be used in subsequent recognition steps. First, the scanned ID card image is input to the computer system and the target text region is separated from the picture region. The text region is used for separation of lines and words on the basis of the horizontal and vertical projections of image intensity, respectively. Experimentation using real national ID cards confirms the effectiveness of our technique.
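The projection-based separation described above can be sketched simply: rows (or columns) whose summed ink is zero mark the gaps between lines (or words). A minimal sketch on a toy binary image (a real card scan would first need binarization, and the gap threshold here is the simplest possible assumption):

```python
def runs(profile):
    """Return (start, end) index pairs of consecutive non-zero entries in a projection profile."""
    segments, start = [], None
    for i, v in enumerate(profile):
        if v and start is None:
            start = i
        elif not v and start is not None:
            segments.append((start, i))
            start = None
    if start is not None:
        segments.append((start, len(profile)))
    return segments

# Toy binary image: 1 = ink. Two text lines; the first line holds two "words".
img = [
    [1, 1, 0, 0, 1],
    [1, 1, 0, 0, 1],
    [0, 0, 0, 0, 0],
    [1, 1, 1, 1, 1],
]
h_profile = [sum(row) for row in img]                 # horizontal projection -> text lines
lines = runs(h_profile)
print(lines)                                          # [(0, 2), (3, 4)]
first_line = img[lines[0][0]:lines[0][1]]
v_profile = [sum(col) for col in zip(*first_line)]    # vertical projection -> words in a line
print(runs(v_profile))                                # [(0, 2), (4, 5)]
```

Real documents usually require a small minimum-gap width instead of strict zero, to tolerate noise between characters.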

  17. FBI DRUGFIRE program: the development and deployment of an automated firearms identification system to support serial, gang, and drug-related shooting investigations

    NASA Astrophysics Data System (ADS)

    Sibert, Robert W.

    1994-03-01

    The FBI DRUGFIRE Program entails the continuing phased development and deployment of a scalable automated firearms identification system. The first phase of this system, a networked, database-driven firearms evidence imaging system, has been operational for approximately one year and has demonstrated its effectiveness in facilitating the sharing and linking of firearms evidence collected in serial, gang, and drug-related shooting investigations. However, there is a pressing need for development of enhancements which will more fully automate the system so that it is capable of processing very large volumes of firearms evidence. These enhancements would provide automated image analysis and pattern matching functionalities. Existing 'spin-off' technologies need to be integrated into the present DRUGFIRE system to automate the 3-D mensuration, registration, feature extraction, and matching of the microtopographical surface features imprinted on the primers of fired casings during firing.

  18. Comparison of two extraction methods independently developed on two conceptually different automated supercritical fluid extraction systems for the determination of polychlorinated biphenyls in sediments.

    PubMed

    Nilsson, T; Björklund, E; Bøwadt, S

    2000-09-01

    Two extraction methods that independently have been developed on conceptually different automated supercritical fluid extraction systems, ISCO SFX 3560 (syringe pump and liquid trapping) and Hewlett-Packard 7680T SFE (reciprocating pump and solid-phase trapping), were compared for the extraction of polychlorinated biphenyls from two Swedish sediments. The results demonstrated that the high-temperature ISCO method in some cases yields a more exhaustive extraction, but also less clean extracts due to co-extraction of unwanted matrix components which are all present in the trapping solvent. The medium-temperature Hewlett-Packard method may sometimes cause problems with quantitative recoveries, but on the other hand it yields very clean extracts due to the extra selectivity resulting from collection on a solid-phase trap.

  19. Automated data extraction from in situ protein stable isotope probing studies

    SciTech Connect

    Slysz, Gordon W.; Steinke, Laurey A.; Ward, David M.; Klatt, Christian G.; Clauss, Therese RW; Purvine, Samuel O.; Payne, Samuel H.; Anderson, Gordon A.; Smith, Richard D.; Lipton, Mary S.

    2014-01-27

    Protein stable isotope probing (protein-SIP) has strong potential for revealing key metabolizing taxa in complex microbial communities. While most protein-SIP work to date has been performed under controlled laboratory conditions to allow extensive isotope labeling of the target organism, a key application will be in situ studies of microbial communities under conditions that result in small degrees of partial labeling. One hurdle restricting large scale in situ protein-SIP studies is the lack of algorithms and software for automated data processing of the massive data sets resulting from such studies. In response, we developed Stable Isotope Probing Protein Extraction Resources software (SIPPER) and applied it for large scale extraction and visualization of data from short term (3 h) protein-SIP experiments performed in situ on Yellowstone phototrophic bacterial mats. Several metrics incorporated into the software allow it to support exhaustive analysis of the complex composite isotopic envelope observed as a result of low amounts of partial label incorporation. SIPPER also enables the detection of labeled molecular species without the need for any prior identification.

  20. Rapid and automated sample preparation for nucleic acid extraction on a microfluidic CD (compact disk)

    NASA Astrophysics Data System (ADS)

    Kim, Jitae; Kido, Horacio; Zoval, Jim V.; Gagné, Dominic; Peytavi, Régis; Picard, François J.; Bastien, Martine; Boissinot, Maurice; Bergeron, Michel G.; Madou, Marc J.

    2006-01-01

    Rapid and automated preparation of PCR (polymerase chain reaction)-ready genomic DNA was demonstrated on a multiplexed CD (compact disk) platform by using hard-to-lyse bacterial spores. Cell disruption is carried out while bead-cell suspensions are pushed back and forth in center-tapered lysing chambers by angular oscillation of the disk (the keystone effect). During this lysis period, the cell suspensions are securely held within the lysing chambers by heat-activated wax valves. Upon application of remote heat to the disk in motion, the wax valves release lysate solutions into centrifuge chambers where cell debris is separated by an elevated rotation of the disk. Only debris-free DNA extract is then transferred to collection chambers by capillary-assisted siphon and collected for heating that inactivates PCR inhibitors. Lysing capacity was evaluated using a real-time PCR assay to monitor the efficiency of Bacillus globigii spore lysis. PCR analysis showed that a 5-minute CD lysis run gave a spore lysis efficiency similar to that obtained with a popular commercial DNA extraction kit (i.e., the IDI-lysis kit from GeneOhm Sciences Inc.), which is highly efficient for microbial cell and spore lysis. This work will contribute to the development of an integrated CD-based assay for rapid diagnosis of infectious diseases.

  1. Streamlining DNA Barcoding Protocols: Automated DNA Extraction and a New cox1 Primer in Arachnid Systematics

    PubMed Central

    Vidergar, Nina; Toplak, Nataša; Kuntner, Matjaž

    2014-01-01

    Background DNA barcoding is a popular tool in taxonomic and phylogenetic studies, but for most animal lineages protocols for obtaining the barcoding sequences—mitochondrial cytochrome C oxidase subunit I (cox1 AKA CO1)—are not standardized. Our aim was to explore an optimal strategy for arachnids, focusing on the most species-rich lineage, spiders, by (1) improving an automated DNA extraction protocol, (2) testing the performance of commonly used primer combinations, and (3) developing a new cox1 primer suitable for more efficient alignment and phylogenetic analyses. Methodology We used exemplars of 15 species from all major spider clades, processed a range of spider tissues of varying size and quality, optimized genomic DNA extraction using the MagMAX Express magnetic particle processor—an automated high throughput DNA extraction system—and tested cox1 amplification protocols emphasizing the standard barcoding region using ten routinely employed primer pairs. Results The best results were obtained with the commonly used Folmer primers (LCO1490/HCO2198) that capture the standard barcode region, and with the C1-J-2183/C1-N-2776 primer pair that amplifies its extension. However, C1-J-2183 is designed too close to HCO2198 for well-interpreted, continuous sequence data, and in practice the resulting sequences from the two primer pairs rarely overlap. We therefore designed a new forward primer C1-J-2123, 60 base pairs upstream of the C1-J-2183 binding site. The success rate of this new primer (93%) matched that of C1-J-2183. Conclusions The use of C1-J-2123 allows full, indel-free overlap of sequences obtained with the standard Folmer primers and with the C1-J-2123 primer pair. Our preliminary tests suggest that in addition to spiders, C1-J-2123 will also perform well in other arachnids and several other invertebrates. We provide optimal PCR protocols for these primer sets, and recommend using them for systematic efforts beyond DNA barcoding. PMID:25415202

  2. Californian demonstration and validation of automated agricultural field extraction from multi-temporal Landsat data

    NASA Astrophysics Data System (ADS)

    Yan, L.; Roy, D. P.

    2013-12-01

    The spatial distribution of agricultural fields is a fundamental description of rural landscapes, and the location and extent of fields is important to establish the area of land utilized for agricultural yield prediction, resource allocation, and for economic planning. To date, field objects have not been extracted from satellite data over large areas because of computational constraints and because consistently processed appropriate resolution data have not been available or affordable. We present a fully automated computational methodology to extract agricultural fields from 30 m Web Enabled Landsat Data (WELD) time series, and results for approximately 250,000 square kilometers (eleven 150 x 150 km WELD tiles) encompassing all the major agricultural areas of California. The extracted fields, including rectangular, circular, and irregularly shaped fields, are evaluated by comparison with manually interpreted Landsat field objects. Validation results are presented in terms of standard confusion matrix accuracy measures and also the degree of field object over-segmentation, under-segmentation, fragmentation and shape distortion. The apparent success of the presented field extraction methodology is due to several factors. First, the use of multi-temporal Landsat data, as opposed to single Landsat acquisitions, which enables crop rotations and inter-annual variability in the state of the vegetation to be accommodated and provides more opportunities for cloud-free, non-missing and atmospherically uncontaminated surface observations. Second, the adoption of an object-based approach, namely the variational region-based geometric active contour method, which enables robust segmentation with only a small number of parameters and requires no training data collection. Third, the use of a watershed algorithm to decompose connected segments belonging to multiple fields into coherent isolated field segments and a geometry-based algorithm to detect and associate parts of

  3. Automated Outreach for Cardiovascular-Related Medication Refill Reminders.

    PubMed

    Harrison, Teresa N; Green, Kelley R; Liu, In-Lu Amy; Vansomphone, Southida S; Handler, Joel; Scott, Ronald D; Cheetham, T Craig; Reynolds, Kristi

    2016-07-01

    The objective of this study was to evaluate the effectiveness of an automated telephone system reminding patients with hypertension and/or cardiovascular disease to obtain overdue medication refills. The authors compared the intervention with usual care among patients with an overdue prescription for a statin or lisinopril-hydrochlorothiazide (lisinopril-HCTZ). The primary outcome was refill rate at 2 weeks. Secondary outcomes included time to refill and change in low-density lipoprotein cholesterol and blood pressure. Significantly more patients who received a reminder call refilled their prescription compared with the usual-care group (statin cohort: 30.3% vs 24.9% [P<.0001]; lisinopril-HCTZ cohort: 30.7% vs 24.2% [P<.0001]). The median time to refill was shorter in patients receiving the reminder call (statin cohort: 29 vs 36 days [P<.0001]; lisinopril-HCTZ cohort: 24 vs 31 days [P<.0001]). There were no statistically significant differences in mean low-density lipoprotein cholesterol and blood pressure. These findings suggest the need for interventions that have a longer-term impact. PMID:26542896

  4. Support Vector Machine with Ensemble Tree Kernel for Relation Extraction.

    PubMed

    Liu, Xiaoyong; Fu, Hui; Du, Zhiguo

    2016-01-01

    Relation extraction is one of the important research topics in the field of information extraction research. To solve the problem of semantic variation in traditional semisupervised relation extraction algorithms, this paper proposes a novel semisupervised relation extraction algorithm based on ensemble learning (LXRE). The new algorithm mainly uses two kinds of support vector machine classifiers based on tree kernels for integration, and incorporates a constrained seed-set extension strategy. The new algorithm can weaken the inaccuracy of relation extraction caused by the phenomenon of semantic variation. Numerical experiments based on two benchmark data sets (PropBank and AIMed) show that the LXRE algorithm proposed in the paper is superior to two other common relation extraction methods on four evaluation indexes (Precision, Recall, F-measure, and Accuracy). This indicates that the new algorithm has good relation extraction ability compared with others. PMID:27118966
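Tree-kernel SVMs aside, the ensemble step itself is a simple combination rule: each member classifier votes on a relation label and the majority wins. A minimal sketch of majority voting (the member classifiers here are toy stand-ins, not the paper's tree-kernel SVMs):

```python
# Toy member "classifiers": each maps a feature dict to a relation label
def clf_a(x): return "interacts" if x.get("verb") == "binds" else "none"
def clf_b(x): return "interacts" if x.get("dist", 99) < 5 else "none"
def clf_c(x): return "none"

def ensemble_predict(classifiers, x):
    """Majority vote over the member classifiers' predicted labels."""
    votes = [clf(x) for clf in classifiers]
    return max(set(votes), key=votes.count)

print(ensemble_predict([clf_a, clf_b, clf_c], {"verb": "binds", "dist": 2}))  # 'interacts'
```

Weighted voting (e.g. by each member's held-out accuracy) is a common refinement when members differ in reliability.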

  6. Semi-automated procedures for shoreline extraction using single RADARSAT-1 SAR image

    NASA Astrophysics Data System (ADS)

    Al Fugura, A.'kif; Billa, Lawal; Pradhan, Biswajeet

    2011-12-01

    Coastline identification is important for surveying and mapping purposes. The coastline serves as a basic point of reference and is used on nautical charts for navigation. Its delineation has become crucial and more important in the wake of the many recent earthquakes and tsunamis, which have completely changed and redrawn some shorelines. In a tropical country like Malaysia, the presence of cloud cover hinders the application of optical remote sensing data. In this study a semi-automated technique and procedures are presented for shoreline delineation from RADARSAT-1 imagery. A scene of RADARSAT-1 satellite imagery was processed using an enhanced filtering technique to identify and extract the shoreline coast of Kuala Terengganu, Malaysia. RADARSAT imagery has many advantages over optical data because of its ability to penetrate cloud cover and its night sensing capabilities. At first, speckles were removed from the image by using a Lee sigma filter, which was used to reduce random noise, enhance the image and discriminate the boundary between land and water. The results showed an accurate and improved extraction and delineation of the entire coastline of Kuala Terengganu. The study demonstrated the reliability of the image averaging filter in reducing random noise over the sea surface, especially near the shoreline. It enhanced land-water boundary differentiation, enabling better delineation of the shoreline. Overall, the developed techniques showed the potential of radar imagery for accurate shoreline mapping and will be useful for monitoring shoreline changes during high and low tides as well as shoreline erosion in a tropical country like Malaysia.
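A sigma filter of the kind mentioned above averages only those neighbours whose intensity lies within a band around the centre pixel, smoothing speckle while preserving sharp land-water edges. A minimal single-pass sketch (the 3x3 window, band width and toy values are illustrative assumptions, not the RADARSAT processing parameters):

```python
def sigma_filter(img, band):
    """3x3 sigma filter: replace each pixel by the mean of neighbours within +/- band of it."""
    rows, cols = len(img), len(img[0])
    out = [row[:] for row in img]
    for r in range(rows):
        for c in range(cols):
            centre = img[r][c]
            vals = [img[y][x]
                    for y in range(max(0, r - 1), min(rows, r + 2))
                    for x in range(max(0, c - 1), min(cols, c + 2))
                    if abs(img[y][x] - centre) <= band]  # keep only in-band neighbours
            out[r][c] = sum(vals) / len(vals)
    return out

# Speckled "sea" (values near 10) next to "land" (values near 100)
img = [
    [10, 12, 100],
    [11,  9, 102],
    [10, 13, 101],
]
smoothed = sigma_filter(img, band=5)
print(smoothed[1][0])  # sea pixel averaged only with sea neighbours; land pixels are excluded
```

Because out-of-band neighbours are excluded, the land-water boundary is not blurred the way a plain mean filter would blur it.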

  7. Automated centreline extraction of neuronal dendrite from optical microscopy image stacks

    NASA Astrophysics Data System (ADS)

    Xiao, Liang; Zhang, Fanbiao

    2010-11-01

    In this work we present a novel vision-based pipeline for automated skeleton detection and centreline extraction of neuronal dendrites from optical microscopy image stacks. The proposed pipeline is an integrated solution that merges image stack pre-processing, seed point detection, a ridge traversal procedure, minimum spanning tree optimization and tree trimming into a unified framework to deal with this challenging problem. In image stack pre-processing, we first apply a curvelet transform based shrinkage and cycle spinning technique to remove the noise. This is followed by an adaptive threshold method to compute the neuronal object segmentation, and a 3D distance transformation is performed to get the distance map. According to the eigenvalues and eigenvectors of the Hessian matrix, the skeleton seed points are detected. Starting from the seed points, the initial centrelines are obtained using a ridge traversal procedure. After that, we use a minimum spanning tree to organize the geometrical structure of the skeleton points, and then graph trimming post-processing to compute the final centreline. Experimental results on different datasets demonstrate that our approach has high reliability, good robustness and requires less user interaction.
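The minimum-spanning-tree step above organizes scattered skeleton points into a connected structure by linking each point to its nearest neighbour in the growing tree. A minimal Prim's-algorithm sketch over 3D points with Euclidean edge weights (an illustration of the step, not the authors' implementation; the toy coordinates are assumptions):

```python
import math

def mst_edges(points):
    """Prim's algorithm: return the n-1 edges (index pairs) of the Euclidean MST."""
    n = len(points)
    dist = lambda a, b: math.dist(points[a], points[b])
    in_tree = {0}
    edges = []
    while len(in_tree) < n:
        # Cheapest edge crossing from the tree to an unvisited point
        u, v = min(((a, b) for a in in_tree for b in range(n) if b not in in_tree),
                   key=lambda e: dist(*e))
        in_tree.add(v)
        edges.append((u, v))
    return edges

# Skeleton points along a branching dendrite (toy coordinates)
pts = [(0, 0, 0), (1, 0, 0), (2, 0, 0), (2, 1, 0), (2, -1, 0)]
print(mst_edges(pts))  # 4 edges connecting all 5 points, branching at index 2
```

The subsequent trimming step would then prune short spurious branches from this tree.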

  8. Sensitivity testing of trypanosome detection by PCR from whole blood samples using manual and automated DNA extraction methods.

    PubMed

    Dunlop, J; Thompson, C K; Godfrey, S S; Thompson, R C A

    2014-11-01

    Automated extraction of DNA for testing of laboratory samples is an attractive alternative to labour-intensive manual methods when higher throughput is required. However, it is important to maintain the maximum detection sensitivity possible to reduce the occurrence of type II errors (false negatives; failure to detect the target when it is present), especially in the biomedical field, where PCR is used for diagnosis. We used blood infected with known concentrations of Trypanosoma copemani to test the impact of analysis techniques on trypanosome detection sensitivity by PCR. We compared combinations of a manual and an automated DNA extraction method and two different PCR primer sets to investigate the impact of each on detection levels. Both extraction techniques and specificity of primer sets had a significant impact on detection sensitivity. Samples extracted using the same DNA extraction technique performed substantially differently for each of the separate primer sets. Type I errors (false positives; detection of the target when it is not present), produced by contaminants, were avoided with both extraction methods. This study highlights the importance of testing laboratory techniques with known samples to optimise accuracy of test results.

  9. Rapid and Semi-Automated Extraction of Neuronal Cell Bodies and Nuclei from Electron Microscopy Image Stacks

    PubMed Central

    Holcomb, Paul S.; Morehead, Michael; Doretto, Gianfranco; Chen, Peter; Berg, Stuart; Plaza, Stephen; Spirou, George

    2016-01-01

    Connectomics—the study of how neurons wire together in the brain—is at the forefront of modern neuroscience research. However, many connectomics studies are limited by the time and precision needed to correctly segment large volumes of electron microscopy (EM) image data. We present here a semi-automated segmentation pipeline using freely available software that can significantly decrease segmentation time for extracting both nuclei and cell bodies from EM image volumes. PMID:27259933

  10. PKDE4J: Entity and relation extraction for public knowledge discovery.

    PubMed

    Song, Min; Kim, Won Chul; Lee, Dahee; Heo, Go Eun; Kang, Keun Young

    2015-10-01

    Due to an enormous number of scientific publications that cannot be handled manually, there is a rising interest in text-mining techniques for automated information extraction, especially in the biomedical field. Such techniques provide effective means of information search, knowledge discovery, and hypothesis generation. Most previous studies have primarily focused on the design and performance improvement of either named entity recognition or relation extraction. In this paper, we present PKDE4J, a comprehensive text-mining system that integrates dictionary-based entity extraction and rule-based relation extraction in a highly flexible and extensible framework. Starting with the Stanford CoreNLP, we developed the system to cope with multiple types of entities and relations. The system also has fairly good performance in terms of accuracy as well as the ability to configure text-processing components. We demonstrate its competitive performance by evaluating it on many corpora and found that it surpasses existing systems with average F-measures of 85% for entity extraction and 81% for relation extraction.
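The dictionary-plus-rule design described above can be illustrated in miniature: entities are found by dictionary lookup, and a relation is asserted when two entities co-occur in a sentence containing a trigger phrase. The dictionaries and the co-occurrence rule below are toy examples, not PKDE4J's actual resources or rule language:

```python
import re

# Toy dictionaries mapping surface forms to entity types
ENTITIES = {"BRCA1": "Gene", "TP53": "Gene", "breast cancer": "Disease"}
TRIGGERS = {"associated with", "causes"}

def extract_entities(sentence):
    """Dictionary-based entity extraction: whole-word lookup of each dictionary form."""
    found = []
    for form, etype in ENTITIES.items():
        if re.search(r'\b' + re.escape(form) + r'\b', sentence):
            found.append((form, etype))
    return found

def extract_relations(sentence):
    """Rule-based relation extraction: two entities plus a trigger phrase in one sentence."""
    ents = extract_entities(sentence)
    rels = []
    for trigger in TRIGGERS:
        if trigger in sentence:
            for i, (e1, t1) in enumerate(ents):
                for e2, t2 in ents[i + 1:]:
                    rels.append((e1, trigger, e2))
    return rels

s = "BRCA1 mutations are associated with breast cancer."
print(extract_relations(s))  # [('BRCA1', 'associated with', 'breast cancer')]
```

Real systems add normalization (mapping surface forms to database identifiers) and syntactic constraints so that mere co-occurrence is not mistaken for a relation.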

  11. Evaluation of Automated and Manual Commercial DNA Extraction Methods for Recovery of Brucella DNA from Suspensions and Spiked Swabs

    PubMed Central

    Dauphin, Leslie A.; Hutchins, Rebecca J.; Bost, Liberty A.; Bowen, Michael D.

    2009-01-01

    This study evaluated automated and manual commercial DNA extraction methods for their ability to recover DNA from Brucella species in phosphate-buffered saline (PBS) suspension and from spiked swab specimens. Six extraction methods, representing several of the methodologies which are commercially available for DNA extraction, as well as representing various throughput capacities, were evaluated: the MagNA Pure Compact and the MagNA Pure LC instruments, the IT 1-2-3 DNA sample purification kit, the MasterPure Complete DNA and RNA purification kit, the QIAamp DNA blood mini kit, and the UltraClean microbial DNA isolation kit. These six extraction methods were performed upon three pathogenic Brucella species: B. abortus, B. melitensis, and B. suis. Viability testing of the DNA extracts indicated that all six extraction methods were efficient at inactivating virulent Brucella spp. Real-time PCR analysis using Brucella genus- and species-specific TaqMan assays revealed that use of the MasterPure kit resulted in superior levels of detection from bacterial suspensions, while the MasterPure kit and MagNA Pure Compact performed equally well for extraction of spiked swab samples. This study demonstrated that DNA extraction methodologies differ in their ability to recover Brucella DNA from PBS bacterial suspensions and from swab specimens and, thus, that the extraction method used for a given type of sample matrix can influence the sensitivity of real-time PCR assays for Brucella. PMID:19846627

  12. Automated Semantic Indices Related to Cognitive Function and Rate of Cognitive Decline

    ERIC Educational Resources Information Center

    Pakhomov, Serguei V. S.; Hemmy, Laura S.; Lim, Kelvin O.

    2012-01-01

    The objective of our study is to introduce a fully automated, computational linguistic technique to quantify semantic relations between words generated on a standard semantic verbal fluency test and to determine its cognitive and clinical correlates. Cognitive differences between patients with Alzheimer's disease and mild cognitive impairment are…

  13. Automated Agricultural Field Extraction from Multi-temporal Web Enabled Landsat Data

    NASA Astrophysics Data System (ADS)

    Yan, L.; Roy, D. P.

    2012-12-01

    Agriculture has caused significant anthropogenic surface change. In many regions agricultural field sizes may be increasing to maximize yields and reduce costs, resulting in decreased landscape spatial complexity and increased homogenization of land uses, with potential for significant biogeochemical and ecological effects. To date, studies of the incidence, drivers and impacts of changing field sizes have not been undertaken over large areas because of computational constraints and because consistently processed appropriate resolution data have not been available or affordable. The Landsat series of satellites provides near-global coverage, long-term, appropriate spatial resolution (30m) satellite data to document changing field sizes. The recent free availability of all the Landsat data in the U.S. Landsat archive now provides the opportunity to study field size changes in a global and consistent way. Commercial software can be used to extract fields from Landsat data but is inappropriate for large-area application because it requires considerable human interaction. This paper presents research to develop and validate an automated computational Geographic Object Based Image Analysis methodology to extract agricultural fields and derive field sizes from Web Enabled Landsat Data (WELD) (http://weld.cr.usgs.gov/). WELD weekly products (30m reflectance and brightness temperature) are classified into Satellite Image Automatic Mapper™ (SIAM™) spectral categories, and an edge intensity map and a map of the probability of each pixel being agricultural are derived from five years of 52 weeks of WELD and corresponding SIAM™ data. These data are fused to derive candidate agriculture field segments using a variational region-based geometric active contour model. Geometry-based algorithms are used to decompose connected segments belonging to multiple fields into coherent isolated field objects with a divide and conquer strategy to detect and merge partial circle
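
The actual pipeline fuses these layers with a variational active contour; as a gross simplification, the evidence-fusion idea of combining a per-pixel agriculture probability with an edge-intensity map can be illustrated with hypothetical 3×3 arrays (thresholds invented):

```python
import numpy as np

# Hypothetical fused evidence for candidate field segments: high agriculture
# probability AND low edge intensity (field interiors are spectrally smooth).
prob_agri = np.array([[0.9, 0.8, 0.2],
                      [0.9, 0.7, 0.1],
                      [0.3, 0.2, 0.1]])
edge_intensity = np.array([[0.1, 0.2, 0.9],
                           [0.1, 0.3, 0.8],
                           [0.7, 0.9, 0.2]])

candidate = (prob_agri > 0.5) & (edge_intensity < 0.5)
print(candidate.sum())  # → 4 candidate field pixels in the top-left block
```

The active contour then refines such coarse candidate masks into geometrically coherent field segments.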

  14. Arsenic fractionation in agricultural soil using an automated three-step sequential extraction method coupled to hydride generation-atomic fluorescence spectrometry.

    PubMed

    Rosas-Castor, J M; Portugal, L; Ferrer, L; Guzmán-Mar, J L; Hernández-Ramírez, A; Cerdà, V; Hinojosa-Reyes, L

    2015-05-18

    A fully automated modified three-step BCR flow-through sequential extraction method was developed for the fractionation of the arsenic (As) content from agricultural soil based on a multi-syringe flow injection analysis (MSFIA) system coupled to hydride generation-atomic fluorescence spectrometry (HG-AFS). Critical parameters that affect the performance of the automated system were optimized by exploiting a multivariate approach using a Doehlert design. The validation of the flow-based modified-BCR method was carried out by comparison with the conventional BCR method. Thus, the total As content was determined in the following three fractions: fraction 1 (F1), the acid-soluble or interchangeable fraction; fraction 2 (F2), the reducible fraction; and fraction 3 (F3), the oxidizable fraction. The limits of detection (LOD) were 4.0, 3.4, and 23.6 μg L(-1) for F1, F2, and F3, respectively. A wide working concentration range was obtained for the analysis of each fraction, i.e., 0.013-0.800, 0.011-0.900 and 0.079-1.400 mg L(-1) for F1, F2, and F3, respectively. The precision of the automated MSFIA-HG-AFS system, expressed as the relative standard deviation (RSD), was evaluated for a 200 μg L(-1) As standard solution, and RSD values between 5 and 8% were achieved for the three BCR fractions. The new modified three-step BCR flow-based sequential extraction method was satisfactorily applied for arsenic fractionation in real agricultural soil samples from an arsenic-contaminated mining zone to evaluate its extractability. The frequency of analysis of the proposed method was eight times higher than that of the conventional BCR method (6 vs 48 h), and the kinetics of lixiviation were established for each fraction. PMID:25910440
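
The RSD quoted above is simply the standard deviation expressed as a percentage of the mean; a quick sketch with invented replicate readings for a 200 μg L(-1) standard:

```python
import statistics

def rsd_percent(values):
    """Relative standard deviation: sample standard deviation as % of the mean."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical replicate As readings (ug/L) for a 200 ug/L standard solution
replicates = [195.0, 204.0, 199.0, 210.0, 192.0]
print(round(rsd_percent(replicates), 1))  # → 3.6
```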

  16. Semi-automated and automated glioma grading using dynamic susceptibility-weighted contrast-enhanced perfusion MRI relative cerebral blood volume measurements

    PubMed Central

    Friedman, S N; Bambrough, P J; Kotsarini, C; Khandanpour, N; Hoggard, N

    2012-01-01

    Objective: Despite the established role of MRI in the diagnosis of brain tumours, histopathological assessment remains the clinically used technique, especially for the glioma group. Relative cerebral blood volume (rCBV) is a dynamic susceptibility-weighted contrast-enhanced perfusion MRI parameter that has been shown to correlate to tumour grade, but assessment requires a specialist and is time consuming. We developed analysis software to determine glioma gradings from perfusion rCBV scans in a manner that is quick, easy and does not require a specialist operator. Methods: MRI perfusion data from 47 patients with different histopathological grades of glioma were analysed with custom-designed software. Semi-automated analysis was performed with a specialist and non-specialist operator separately determining the maximum rCBV value corresponding to the tumour. Automated histogram analysis was performed by calculating the mean, standard deviation, median, mode, skewness and kurtosis of rCBV values. All values were compared with the histopathologically assessed tumour grade. Results: A strong correlation between specialist and non-specialist observer measurements was found. Significantly different values were obtained between tumour grades using both semi-automated and automated techniques, consistent with previous results. The raw (unnormalised) data single-pixel maximum rCBV semi-automated analysis value had the strongest correlation with glioma grade. Standard deviation of the raw data had the strongest correlation of the automated analysis. Conclusion: Semi-automated calculation of raw maximum rCBV value was the best indicator of tumour grade and does not require a specialist operator. Advances in knowledge: Both semi-automated and automated MRI perfusion techniques provide viable non-invasive alternatives to biopsy for glioma tumour grading. PMID:23175486
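
The automated histogram analysis described above reduces each rCBV map to a handful of summary statistics; a minimal sketch (mode omitted for brevity, values invented):

```python
import numpy as np

def histogram_features(rcbv):
    """Histogram statistics of an rCBV map, of the kind used for automated grading."""
    x = np.asarray(rcbv, dtype=float)
    mu, sigma = x.mean(), x.std()  # population std as a descriptive statistic
    z = (x - mu) / sigma
    return {
        "mean": float(mu),
        "std": float(sigma),
        "median": float(np.median(x)),
        "skewness": float((z ** 3).mean()),        # Fisher-Pearson (biased)
        "kurtosis": float((z ** 4).mean() - 3.0),  # excess kurtosis
    }

# Hypothetical rCBV values from a tumour region of interest
feats = histogram_features([1.2, 1.5, 1.1, 3.8, 2.0, 1.4])
print(round(feats["std"], 3))  # → 0.925
```

Per the abstract, the standard deviation is the automated feature that correlated most strongly with grade; the positive skewness here reflects the heavy right tail typical of a high-perfusion hotspot.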

  17. Comparative evaluation of commercially available manual and automated nucleic acid extraction methods for rotavirus RNA detection in stools.

    PubMed

    Esona, Mathew D; McDonald, Sharla; Kamili, Shifaq; Kerin, Tara; Gautam, Rashi; Bowen, Michael D

    2013-12-01

    Rotaviruses are a major cause of viral gastroenteritis in children. For accurate and sensitive detection of rotavirus RNA from stool samples by reverse transcription-polymerase chain reaction (RT-PCR), the extraction process must be robust. However, some extraction methods may not remove the strong RT-PCR inhibitors known to be present in stool samples. The objective of this study was to evaluate and compare the performance of six extraction methods used commonly for extraction of rotavirus RNA from stool, which have never been formally evaluated: the MagNA Pure Compact, KingFisher Flex and NucliSENS easyMAG instruments, the NucliSENS miniMAG semi-automated system, and two manual purification kits, the QIAamp Viral RNA kit and a modified RNaid kit. Using each method, total nucleic acid or RNA was extracted from eight rotavirus-positive stool samples with enzyme immunoassay optical density (EIA OD) values ranging from 0.176 to 3.098. Extracts prepared using the MagNA Pure Compact instrument yielded the most consistent results by qRT-PCR and conventional RT-PCR. When extracts prepared from a dilution series were extracted by the 6 methods and tested, rotavirus RNA was detected in all samples by qRT-PCR but by conventional RT-PCR testing, only the MagNA Pure Compact and KingFisher Flex extracts were positive in all cases. RT-PCR inhibitors were detected in extracts produced with the QIAamp Viral RNA Mini kit. The findings of this study should prove useful for selection of extraction methods to be incorporated into future rotavirus detection and genotyping protocols. PMID:24036075

  18. Mixed-mode isolation of triazine metabolites from soil and aquifer sediments using automated solid-phase extraction

    USGS Publications Warehouse

    Mills, M.S.; Thurman, E.M.

    1992-01-01

    Reversed-phase isolation and ion-exchange purification were combined in the automated solid-phase extraction of two polar s-triazine metabolites, 2-amino-4-chloro-6-(isopropylamino)-s-triazine (deethylatrazine) and 2-amino-4-chloro-6-(ethylamino)-s-triazine (deisopropylatrazine), from clay-loam and silt-loam soils and sandy aquifer sediments. First, methanol/water (4/1, v/v) soil extracts were transferred to an automated workstation following evaporation of the methanol phase for the rapid reversed-phase isolation of the metabolites on an octadecyl resin (C18). The retention of the triazine metabolites on C18 decreased substantially when trace methanol concentrations (1%) remained. Furthermore, the retention on C18 increased with decreasing aqueous solubility and increasing alkyl-chain length of the metabolites and parent herbicides, indicating a reversed-phase interaction. The analytes were eluted with ethyl acetate, which left much of the soil organic-matter impurities on the resin. Second, the small-volume organic eluate was purified on an anion-exchange resin (0.5 mL/min) to extract the remaining soil pigments that could foul the ion source of the GC/MS system. Recoveries of the analytes were 75%, using deuterated atrazine as a surrogate, and were comparable to recoveries by soxhlet extraction. The detection limit was 0.1 ??g/kg with a coefficient of variation of 15%. The ease and efficiency of this automated method make it a viable, practical technique for studying triazine metabolites in the environment.
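
Recoveries here are reported against a deuterated atrazine surrogate; the correction is just a ratio of recoveries. A sketch with invented amounts:

```python
def surrogate_corrected_recovery(analyte_measured, analyte_spiked,
                                 surrogate_measured, surrogate_spiked):
    """Analyte recovery normalised by the recovery of a deuterated surrogate,
    correcting for losses common to both during sample work-up."""
    analyte_rec = analyte_measured / analyte_spiked
    surrogate_rec = surrogate_measured / surrogate_spiked
    return 100.0 * analyte_rec / surrogate_rec

# Hypothetical amounts (ng): 1.5 of 2.0 ng analyte recovered,
# 1.6 of 2.0 ng surrogate recovered
print(round(surrogate_corrected_recovery(1.5, 2.0, 1.6, 2.0), 2))  # → 93.75
```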

  19. A fully integrated and automated microsystem for rapid pharmacogenetic typing of multiple warfarin-related single-nucleotide polymorphisms.

    PubMed

    Zhuang, Bin; Han, Junping; Xiang, Guangxin; Gan, Wupeng; Wang, Shuaiqin; Wang, Dong; Wang, Lei; Sun, Jing; Li, Cai-Xia; Liu, Peng

    2016-01-01

    A fully integrated and automated microsystem consisting of low-cost, disposable plastic chips for DNA extraction and PCR amplification combined with a reusable glass capillary array electrophoresis chip in a modular-based format was successfully developed for warfarin pharmacogenetic testing. DNA extraction was performed by adopting a filter paper-based method, followed by "in situ" PCR that was carried out directly in the same reaction chamber of the chip without elution. PCR products were then co-injected with sizing standards into separation channels for detection using a novel injection electrode. The entire process was automatically conducted on a custom-made compact control and detection instrument. The limit of detection of the microsystem for the singleplex amplification of amelogenin was determined to be 0.625 ng of standard K562 DNA and 0.3 μL of human whole blood. A two-color multiplex allele-specific PCR assay for detecting the warfarin-related single-nucleotide polymorphisms (SNPs) 6853 (-1639G>A) and 6484 (1173C>T) in the VKORC1 gene and the *3 SNP (1075A>C) in the CYP2C9 gene was developed and used for validation studies. The fully automated genetic analysis was completed in two hours with a minimum requirement of 0.5 μL of input blood. Samples from patients with different genotypes were all accurately analyzed. In addition, both dried bloodstains and oral swabs were successfully processed by the microsystem with a simple modification to the DNA extraction and amplification chip. The successful development and operation of this microsystem establish the feasibility of rapid warfarin pharmacogenetic testing in routine clinical practice. PMID:26568290

  20. On-line automated sample preparation for liquid chromatography using parallel supported liquid membrane extraction and microporous membrane liquid-liquid extraction.

    PubMed

    Sandahl, Margareta; Mathiasson, Lennart; Jönsson, Jan Ake

    2002-10-25

    An automated system was developed for analysis of non-polar and polar ionisable compounds at trace levels in natural water. Sample work-up was performed in a flow system using two parallel membrane extraction units. This system was connected on-line to a reversed-phase HPLC system for final determination. One of the membrane units was used for supported liquid membrane (SLM) extraction, which is suitable for ionisable or permanently charged compounds. The other unit was used for microporous membrane liquid-liquid extraction (MMLLE), suitable for uncharged compounds. The fungicide thiophanate methyl and its polar metabolites carbendazim and 2-aminobenzimidazole were used as model compounds. The whole system was controlled by means of four syringe pumps. While extracting one part of the sample using the SLM technique, the extract from the MMLLE extraction was analysed, and vice versa. This gave a total analysis time of 63 min for each sample, resulting in a sample throughput of 22 samples per 24 h.
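
The throughput figure follows directly from the interleaving arithmetic: one sample completes every 63 min cycle, so over 24 h:

```python
# One extract is analysed while the other membrane unit extracts,
# so the interleaved system completes one sample per 63 min cycle.
minutes_per_day = 24 * 60
samples_per_day = minutes_per_day // 63
print(samples_per_day)  # → 22
```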

  1. Solid phase extraction-liquid chromatography (SPE-LC) interface for automated peptide separation and identification by tandem mass spectrometry

    NASA Astrophysics Data System (ADS)

    Hørning, Ole Bjeld; Theodorsen, Søren; Vorm, Ole; Jensen, Ole Nørregaard

    2007-12-01

    Reversed-phase solid phase extraction (SPE) is a simple and widely used technique for desalting and concentration of peptide and protein samples prior to mass spectrometry analysis. Often, SPE sample preparation is done manually and the samples eluted, dried and reconstituted into 96-well titer plates for subsequent LC-MS/MS analysis. To reduce the number of sample handling stages and increase throughput, we developed a robotic system to interface off-line SPE to LC-ESI-MS/MS. Samples were manually loaded onto disposable SPE tips that subsequently were connected in-line with a capillary chromatography column. Peptides were recovered from the SPE column and separated on the RP-LC column using isocratic elution conditions and analysed by electrospray tandem mass spectrometry. Peptide mixtures eluted within approximately 5 min, with individual peptide peak resolution of ~7 s (FWHM), making the SPE-LC suited for analysis of medium complex samples (3-12 protein components). For optimum performance, the isocratic flow rate was reduced to 30 nL/min, producing nanoelectrospray like conditions which ensure high ionisation efficiency and sensitivity. Using a modified autosampler for mounting and disposing of the SPE tips, the SPE-LC-MS/MS system could analyse six samples per hour, and up to 192 SPE tips in one batch. The relatively high sample throughput, medium separation power and high sensitivity makes the automated SPE-LC-MS/MS setup attractive for proteomics experiments as demonstrated by the identification of the components of simple protein mixtures and of proteins recovered from 2DE gels.

  2. Automated identification and geometrical features extraction of individual trees from Mobile Laser Scanning data in Budapest

    NASA Astrophysics Data System (ADS)

    Koma, Zsófia; Székely, Balázs; Folly-Ritvay, Zoltán; Skobrák, Ferenc; Koenig, Kristina; Höfle, Bernhard

    2016-04-01

    Mobile Laser Scanning (MLS) is an evolving operational measurement technique for the urban environment, providing large amounts of high-resolution information about trees, street features, and pole-like objects on street sides or near motorways. In this study we investigate a robust segmentation method to extract individual trees automatically in order to build an object-based tree database system. We focused on the large urban parks in Budapest (Margitsziget and Városliget; KARESZ project), which contained a large diversity of tree species. The MLS data comprised high-density point clouds with 1-8 cm mean absolute accuracy at 80-100 m distance from the streets. The robust segmentation method comprises the following steps. First, the ground points are determined. Second, cylinders are fitted in a vertical slice 1-1.5 m above ground, which is used to determine the potential location of each single tree's trunk and of cylinder-like objects. Finally, residual values are calculated as the deviation of each point from a vertically expanded fitted cylinder; these residual values are used to separate cylinder-like objects from individual trees. After successful parameterization, the model parameters and the corresponding residual values of the fitted object are extracted and imported into the tree database. Additionally, geometric features are calculated for each segmented individual tree, such as crown base, crown width, crown length, trunk diameter, and volume. In the case of incompletely scanned trees, the extraction of geometric features is based on fitted circles. The result of the study is a tree database containing detailed information about urban trees, which can be a valuable dataset for ecologists, city planners, and planting and mapping purposes. Furthermore, the established database will be the initial point for classifying trees into single species. MLS data used in this project had been measured in the framework of
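
The residual computation at the heart of the trunk test is small enough to sketch: the deviation of a point from a vertically expanded fitted cylinder is its horizontal distance to the axis minus the radius. The axis position, radius, and points below are invented:

```python
import math

def cylinder_residuals(points, axis_xy, radius):
    """Deviation of each (x, y, z) point from a vertically expanded fitted
    cylinder: horizontal distance to the vertical axis minus the radius."""
    ax, ay = axis_xy
    return [math.hypot(x - ax, y - ay) - radius for x, y, z in points]

# Hypothetical trunk-slice points around an axis at (0, 0), radius 0.25 m
pts = [(0.25, 0.0, 1.2), (0.0, -0.26, 1.3), (0.4, 0.0, 1.4)]
print([round(r, 2) for r in cylinder_residuals(pts, (0.0, 0.0), 0.25)])
# → [0.0, 0.01, 0.15]: small residuals suggest a trunk, large ones do not
```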

  3. Automated solid-phase extraction coupled online with HPLC-FLD for the quantification of zearalenone in edible oil.

    PubMed

    Drzymala, Sarah S; Weiz, Stefan; Heinze, Julia; Marten, Silvia; Prinz, Carsten; Zimathies, Annett; Garbe, Leif-Alexander; Koch, Matthias

    2015-05-01

    Established maximum levels for the mycotoxin zearalenone (ZEN) in edible oil require monitoring by reliable analytical methods. Therefore, an automated SPE-HPLC online system based on dynamic covalent hydrazine chemistry has been developed. The SPE step comprises a reversible hydrazone formation by ZEN and a hydrazine moiety covalently attached to a solid phase. Seven hydrazine materials with different properties regarding the resin backbone, pore size, particle size, specific surface area, and loading have been evaluated. As a result, a hydrazine-functionalized silica gel was chosen. The final automated online method was validated and applied to the analysis of three maize germ oil samples including a provisionally certified reference material. Important performance criteria for the recovery (70-120 %) and precision (RSDr <25 %) as set by the Commission Regulation EC 401/2006 were fulfilled: The mean recovery was 78 % and RSDr did not exceed 8 %. The results of the SPE-HPLC online method were further compared to results obtained by liquid-liquid extraction with stable isotope dilution analysis LC-MS/MS and found to be in good agreement. The developed SPE-HPLC online system with fluorescence detection allows a reliable, accurate, and sensitive quantification (limit of quantification, 30 μg/kg) of ZEN in edible oils while significantly reducing the workload. To our knowledge, this is the first report on an automated SPE-HPLC method based on a covalent SPE approach.
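
The acceptance check against the Commission Regulation (EC) No 401/2006 criteria quoted above is a simple range test; as a sketch:

```python
def meets_ec_401_2006(mean_recovery_pct, rsd_r_pct):
    """Check ZEN method performance against the EC 401/2006 criteria
    quoted above: recovery 70-120 % and repeatability RSDr below 25 %."""
    return 70.0 <= mean_recovery_pct <= 120.0 and rsd_r_pct < 25.0

# The validated method's figures: 78 % mean recovery, RSDr up to 8 %
print(meets_ec_401_2006(78.0, 8.0))  # → True
```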

  4. Screening for anabolic steroids in urine of forensic cases using fully automated solid phase extraction and LC-MS-MS.

    PubMed

    Andersen, David W; Linnet, Kristian

    2014-01-01

    A screening method for 18 frequently measured exogenous anabolic steroids and the testosterone/epitestosterone (T/E) ratio in forensic cases has been developed and validated. The method involves a fully automated sample preparation including enzyme treatment, addition of internal standards and solid phase extraction followed by analysis by liquid chromatography-tandem mass spectrometry (LC-MS-MS) using electrospray ionization with adduct formation for two compounds. Urine samples from 580 forensic cases were analyzed to determine the T/E ratio and occurrence of exogenous anabolic steroids. Extraction recoveries ranged from 77 to 95%, matrix effects from 48 to 78%, overall process efficiencies from 40 to 54% and the lower limit of identification ranged from 2 to 40 ng/mL. In the 580 urine samples analyzed from routine forensic cases, 17 (2.9%) were found positive for one or more anabolic steroids. Only seven different steroids including testosterone were found in the material, suggesting that only a small number of common steroids are likely to occur in a forensic context. The steroids were often in high concentrations (>100 ng/mL), and a combination of steroids and/or other drugs of abuse were seen in the majority of cases. The method presented serves as a fast and automated screening procedure, proving the suitability of LC-MS-MS for analyzing anabolic steroids.
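
The three figures of merit above are related: overall process efficiency is conventionally the product of extraction recovery and matrix effect (both in %), which is consistent with the reported ranges. A sketch with invented values:

```python
def process_efficiency(recovery_pct, matrix_effect_pct):
    """Overall process efficiency as the product of extraction recovery
    and matrix effect, both expressed in percent."""
    return recovery_pct * matrix_effect_pct / 100.0

# Hypothetical values inside the ranges reported above (77-95 % and 48-78 %)
print(process_efficiency(85.0, 60.0))  # → 51.0
```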

  5. INVESTIGATION OF ARSENIC SPECIATION ON DRINKING WATER TREATMENT MEDIA UTILIZING AUTOMATED SEQUENTIAL CONTINUOUS FLOW EXTRACTION WITH IC-ICP-MS DETECTION

    EPA Science Inventory

    Three treatment media, used for the removal of arsenic from drinking water, were sequentially extracted using 10mM MgCl2 (pH 8), 10mM NaH2PO4 (pH 7) followed by 10mM (NH4)2C2O4 (pH 3). The media were extracted using an on-line automated continuous extraction system which allowed...

  6. Direct Sampling and Analysis from Solid Phase Extraction Cards using an Automated Liquid Extraction Surface Analysis Nanoelectrospray Mass Spectrometry System

    SciTech Connect

    Walworth, Matthew J; ElNaggar, Mariam S; Stankovich, Joseph J; Witkowski II, Charles E.; Norris, Jeremy L; Van Berkel, Gary J

    2011-01-01

    Direct liquid extraction based surface sampling, a technique previously demonstrated with continuous flow and autonomous pipette liquid microjunction surface sampling probes, has recently been implemented as the Liquid Extraction Surface Analysis (LESA) mode on the commercially available Advion NanoMate chip-based infusion nanoelectrospray ionization system. In the present paper, the LESA mode was applied to the analysis of 96-well format custom solid phase extraction (SPE) cards, with each well consisting of either a 1 or 2 mm diameter monolithic hydrophobic stationary phase. These substrate wells were conditioned, loaded with either single or multi-component aqueous mixtures, and read out using the LESA mode of a TriVersa NanoMate or a NanoMate 100 coupled to an ABI/Sciex 4000 QTRAP™ hybrid triple quadrupole/linear ion trap mass spectrometer and a Thermo LTQ XL linear ion trap mass spectrometer. Extraction conditions, including extraction/nanoESI solvent composition, volume, and dwell times, were optimized in the analysis of targeted compounds. Limit of detection and quantitation as well as analysis reproducibility figures of merit were measured. Calibration data were obtained for propranolol using a deuterated internal standard, which demonstrated linearity and reproducibility. A 10x increase in signal and cleanup of micromolar Angiotensin II from a concentrated salt solution was demonstrated. Additionally, a multicomponent herbicide mixture at ppb concentration levels was analyzed using MS3 spectra for compound identification in the presence of isobaric interferences.

  7. Age-Related Cognitive Deficits In Rhesus Monkeys Mirror Human Deficits on an Automated Test Battery

    PubMed Central

    Nagahara, Alan H.; Bernot, Tim; Tuszynski, Mark H.

    2010-01-01

    Aged non-human primates are a valuable model for gaining insight into mechanisms underlying neural decline with aging and during the course of neurodegenerative disorders. Behavioral studies are a valuable component of aged primate models, but are difficult to perform, time consuming, and often of uncertain relevance to human cognitive measures. We now report findings from an automated cognitive test battery in aged primates using equipment that is identical, and tasks that are similar, to those employed in human aging and Alzheimer’s disease studies. Young (7.1 ± 0.8 years) and aged (23.0 ± 0.5 years) rhesus monkeys underwent testing on a modified version of the Cambridge Automated Neuropsychological Test Battery (CANTAB), examining cognitive performance on separate tasks that sample features of visuospatial learning, spatial working memory, discrimination learning, and skilled motor performance. We find selective cognitive impairments among aged subjects in visuospatial learning and spatial working memory, but not in delayed recall of previously learned discriminations. Aged monkeys also exhibit slower speed in skilled motor function. Thus, aged monkeys behaviorally characterized on a battery of automated tests reveal patterns of age-related cognitive impairment that mirror in quality and severity those of aged humans, and differ fundamentally from more severe patterns of deficits observed in Alzheimer’s Disease. PMID:18760505

  8. Automated wide-angle SAR stereo height extraction in rugged terrain using shift-scaling correlation.

    SciTech Connect

    Yocky, David Alan; Jakowatz, Charles V., Jr.

    2003-07-01

    Coherent stereo pairs from cross-track synthetic aperture radar (SAR) collects allow fully automated correlation matching using magnitude and phase data. Yet, automated feature matching (correspondence) becomes more difficult when imaging rugged terrain utilizing large stereo crossing angle geometries because high-relief features can undergo significant spatial distortions. These distortions sometimes cause traditional, shift-only correlation matching to fail. This paper presents a possible solution addressing this difficulty. Changing the complex correlation maximization search from shift-only to shift-and-scaling using the downhill simplex method results in higher correlation. This is shown on eight coherent spotlight-mode cross-track stereo pairs with stereo crossing angles averaging 93.7° collected over terrain with slopes greater than 20°. The resulting digital elevation maps (DEMs) are compared to ground truth. Using the shift-scaling correlation approach to calculate disparity, height errors decrease and the number of reliable DEM posts increase.
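
The paper's central point, that allowing a scale parameter as well as a shift raises the attainable correlation where terrain relief stretches one image relative to the other, can be illustrated in 1-D. The profile, patch geometry, and search grids below are invented, and a brute-force grid search stands in for the downhill simplex optimizer:

```python
import numpy as np

# Hypothetical 1-D stand-in for a SAR image row: a smooth terrain-like bump
x = np.arange(300, dtype=float)
signal = np.exp(-((x - 150.0) ** 2) / (2 * 20.0 ** 2))

def sample(shift, scale, n=64):
    """Template pixels mapped into the signal by a shift and a scale factor."""
    return np.interp(shift + scale * np.arange(n), x, signal)

# The second image "sees" a shifted AND stretched patch of the first
template = sample(120.0, 1.25)

def corr(shift, scale):
    """Normalized correlation of a warped candidate patch with the template."""
    w = sample(shift, scale)
    t, u = template - template.mean(), w - w.mean()
    return float(t @ u / (np.linalg.norm(t) * np.linalg.norm(u)))

# Shift-only search (scale fixed at 1) versus shift-and-scale search
best_shift_only = max(corr(s, 1.0) for s in np.arange(100, 140, 0.5))
best_shift_scale = max(
    corr(s, k) for s in np.arange(100, 140, 0.5) for k in np.arange(0.8, 1.5, 0.05)
)
print(best_shift_only < best_shift_scale)  # → True
```

When the distortion includes a genuine scale change, the shift-only maximum falls short of the shift-and-scale maximum, which is exactly why the two-parameter search yields more reliable DEM posts.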

  9. Chemical-induced disease relation extraction with various linguistic features

    PubMed Central

    Gu, Jinghang; Qian, Longhua; Zhou, Guodong

    2016-01-01

    Understanding the relations between chemicals and diseases is crucial in various biomedical tasks such as new drug discoveries and new therapy developments. While manually mining these relations from the biomedical literature is costly and time-consuming, such a procedure is often difficult to keep up-to-date. To address these issues, the BioCreative-V community proposed a challenging task of automatic extraction of chemical-induced disease (CID) relations in order to benefit biocuration. This article describes our work on the CID relation extraction task on the BioCreative-V tasks. We built a machine learning based system that utilized simple yet effective linguistic features to extract relations with maximum entropy models. In addition to leveraging various features, the hypernym relations between entity concepts derived from the Medical Subject Headings (MeSH)-controlled vocabulary were also employed during both training and testing stages to obtain more accurate classification models and better extraction performance, respectively. We demoted relation extraction between entities in documents to relation extraction between entity mentions. In our system, pairs of chemical and disease mentions at both intra- and inter-sentence levels were first constructed as relation instances for training and testing, then two classification models at both levels were trained from the training examples and applied to the testing examples. Finally, we merged the classification results from mention level to document level to acquire final relations between chemicals and diseases. Our system achieved promising F-scores of 60.4% on the development dataset and 58.3% on the test dataset using gold-standard entity annotations, respectively. Database URL: https://github.com/JHnlp/BC5CIDTask PMID:27052618
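
The final step described above, merging mention-level classifications up to document-level relations, can be sketched as keeping a (chemical, disease) concept pair if any of its mention pairs was classified positive. The MeSH identifiers, probabilities, and threshold below are invented:

```python
def merge_to_document_level(mention_predictions, threshold=0.5):
    """Merge mention-level CID decisions up to document level: keep a
    (chemical, disease) concept pair if its best mention-pair probability
    clears the decision threshold."""
    doc_level = {}
    for (chem_id, dis_id), prob in mention_predictions:
        key = (chem_id, dis_id)
        doc_level[key] = max(doc_level.get(key, 0.0), prob)
    return {pair for pair, p in doc_level.items() if p >= threshold}

# Hypothetical mention-level classifier outputs:
# ((MeSH chemical ID, MeSH disease ID), P(positive))
preds = [
    (("D008694", "D006973"), 0.82),  # intra-sentence mention pair
    (("D008694", "D006973"), 0.31),  # inter-sentence pair, same concepts
    (("D000068", "D003924"), 0.12),
]
print(merge_to_document_level(preds))  # → {('D008694', 'D006973')}
```

The max-over-mentions rule mirrors the intuition that one confidently stated relation anywhere in the document is enough to assert the document-level relation.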

  12. Automated on-line liquid-liquid extraction system for temporal mass spectrometric analysis of dynamic samples.

    PubMed

    Hsieh, Kai-Ta; Liu, Pei-Han; Urban, Pawel L

    2015-09-24

    Most real samples cannot be directly infused into mass spectrometers because they could contaminate delicate parts of the ion source and ion guides, or cause ion suppression. Conventional sample preparation procedures limit the temporal resolution of analysis. We have developed an automated liquid-liquid extraction system that enables unsupervised repetitive treatment of dynamic samples and instantaneous analysis by mass spectrometry (MS). It incorporates inexpensive open-source microcontroller boards (Arduino and Netduino) to guide the extraction and analysis process. Every extraction cycle lasts 17 min, and the system enables monitoring of dynamic processes over many hours. The extracts are automatically transferred to the ion source, which incorporates a Venturi pump. Operation of the device has been characterized (repeatability, RSD = 15%, n = 20; concentration range for ibuprofen, 0.053-2.000 mM; LOD for ibuprofen, ∼0.005 mM; including extraction and detection). To exemplify its usefulness in real-world applications, we implemented this device in chemical profiling of a pharmaceutical formulation dissolution process. Temporal dissolution profiles of commercial ibuprofen and acetaminophen tablets were recorded over 10 h. The extraction-MS datasets were fitted with exponential functions to characterize the rates of release of the main and auxiliary ingredients (e.g. ibuprofen, k = 0.43 ± 0.01 h(-1)). The electronic control unit of this system interacts with the operator via touch screen, internet, voice, and short text messages sent to a mobile phone, which is helpful when launching long-term (e.g. overnight) measurements. Due to these interactive features, the platform brings the concept of the Internet of Things (IoT) to the chemistry laboratory environment. PMID:26423626
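
    The exponential release fits can be reproduced with a small sketch. Assuming the plateau concentration C_max is known, the first-order model C(t) = C_max(1 - exp(-k*t)) linearizes to ln(1 - C/C_max) = -k*t, so k follows from a least-squares slope through the origin. This is our illustration; the authors' fitting procedure is not detailed in the abstract.

```python
import math

def release_rate(times, conc, c_max):
    # fit C(t) = c_max * (1 - exp(-k*t)) by linearisation:
    # ln(1 - C/c_max) = -k*t, then a least-squares slope through the origin
    xs, ys = [], []
    for t, c in zip(times, conc):
        frac = c / c_max
        if 0.0 < frac < 1.0:   # skip t = 0 and the saturated plateau
            xs.append(t)
            ys.append(math.log(1.0 - frac))
    # minimise sum (y + k*x)^2  =>  k = -sum(x*y) / sum(x*x)
    return -sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)
```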

  13. Toward automated parasitic extraction of silicon photonics using layout physical verifications

    NASA Astrophysics Data System (ADS)

    Ismail, Mohamed; El Shamy, Raghi S.; Madkour, Kareem; Hammouda, Sherif; Swillam, Mohamed A.

    2016-08-01

    A physical verification flow for the layout of silicon photonic circuits is suggested. Simple empirical models are developed to estimate the bend power loss and the coupled power in photonic integrated circuits fabricated on standard silicon-on-insulator (SOI) wafers. These models are utilized in the physical verification flow of the circuit layout to verify reliable fabrication using any electronic design automation tool. The models are accurate compared with electromagnetic (EM) solvers, yet they are closed form and circumvent the need for any EM solver in the verification process, dramatically reducing verification time.
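
    A minimal sketch of how a closed-form model plugs into a layout check is given below. The exponential loss model and its coefficients are purely illustrative assumptions; a real flow would calibrate them against EM-solver sweeps for the target SOI process, as the authors do.

```python
import math

# Illustrative coefficients only -- not from the paper; a real flow would
# fit them to EM-solver sweeps for the target SOI process.
A_DB = 10.0      # extrapolated loss at zero radius, dB (hypothetical)
B_PER_UM = 1.2   # exponential decay rate per micron of bend radius (hypothetical)

def bend_loss_db(radius_um):
    # closed-form empirical bend-loss estimate (hypothetical model)
    return A_DB * math.exp(-B_PER_UM * radius_um)

def check_layout(bend_radii_um, budget_db=0.1):
    # return indices of bends whose estimated loss exceeds the per-bend budget
    return [i for i, r in enumerate(bend_radii_um) if bend_loss_db(r) > budget_db]
```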

  14. Automated Control of the Organic and Inorganic Composition of Aloe vera Extracts Using (1)H NMR Spectroscopy.

    PubMed

    Monakhova, Yulia B; Randel, Gabriele; Diehl, Bernd W K

    2016-09-01

    The recent classification of Aloe vera whole-leaf extract by the International Agency for Research on Cancer as a possible carcinogen to humans, as well as the continuous adulteration of A. vera's authentic material, has generated renewed interest in the quality control of A. vera. The existing NMR spectroscopic method for the analysis of A. vera, which is based on a routine developed at Spectral Service, was extended. Apart from aloverose, glucose, malic acid, lactic acid, citric acid, whole-leaf material (WLM), acetic acid, fumaric acid, sodium benzoate, and potassium sorbate, the quantification of Mg(2+), Ca(2+), and fructose is possible with the addition of a Cs-EDTA solution to the sample. The proposed methodology was automated, including phasing, baseline correction, deconvolution (based on the Lorentzian function), integration, quantification, and reporting. The NMR method was applied to 41 A. vera preparations in the form of liquid A. vera juice and solid A. vera powder. The advantages of the new NMR methodology over the previous method are discussed. Correlation between the new and standard NMR methodologies was significant for aloverose, glucose, malic acid, lactic acid, citric acid, and WLM (P < 0.0001, R(2) = 0.99). NMR was found to be suitable for the automated simultaneous quantitative determination of 13 parameters in A. vera. PMID:27413027
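
    The deconvolution-and-integration step can be illustrated with the Lorentzian line shape the method fits. For an amplitude-form Lorentzian of height A and half-width at half-maximum γ, the peak area is πAγ, the quantity that (scaled by the number of contributing protons) underlies NMR quantification. The sketch below is ours, not the Spectral Service routine:

```python
import math

def lorentzian(x, amp, x0, hwhm):
    # amplitude-form Lorentzian line shape: height amp at x0, HWHM hwhm
    return amp * hwhm ** 2 / ((x - x0) ** 2 + hwhm ** 2)

def peak_area(amp, hwhm):
    # closed-form integral of the amplitude-form Lorentzian over the real line
    return math.pi * amp * hwhm

def numeric_area(amp, x0, hwhm, lo, hi, n=100000):
    # trapezoidal check of the closed form over a finite window
    h = (hi - lo) / n
    s = 0.5 * (lorentzian(lo, amp, x0, hwhm) + lorentzian(hi, amp, x0, hwhm))
    s += sum(lorentzian(lo + i * h, amp, x0, hwhm) for i in range(1, n))
    return s * h
```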

  15. Characterization and Application of SuperLig® 620 Solid Phase Extraction Resin for Automated Process Monitoring of 90Sr

    SciTech Connect

    Devol, Timothy A.; Clements, John P.; Farawila, Anne F.; O'Hara, Matthew J.; Egorov, Oleg; Grate, Jay W.

    2009-11-30

    Characterization of SuperLig® 620 solid phase extraction resin was performed in order to develop an automated on-line process monitor for 90Sr. The main focus was on strontium separation from barium, with the goal of developing an automated separation process for 90Sr in high-level wastes. High-level waste contains significant 137Cs activity, of which the 137mBa daughter is of great concern as an interference to the quantification of strontium. In addition, barium, yttrium, and plutonium were studied as potential interferences to strontium uptake and detection. A number of complexants were studied in a series of batch Kd experiments, as SuperLig® 620 was not previously known to elute strontium in typical mineral acids. The optimal separation was found using a 2M nitric acid load solution with a strontium elution step of ~0.49M ammonium citrate and a barium elution step of ~1.8M ammonium citrate. 90Sr quantification of Hanford high-level tank waste was performed on a sequential injection analysis microfluidics system coupled to a flow-cell detector. The results of the on-line procedure are compared to standard radiochemical techniques in this paper.
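
    The batch Kd values referred to above follow the standard definition: the fraction of analyte taken up by the resin relative to what remains in solution, scaled by the solution-volume to resin-mass ratio. A small sketch:

```python
def batch_kd(c0, c_eq, volume_ml, mass_g):
    # Kd (mL/g) = ((C0 - Ceq) / Ceq) * (V / m), where C0 and Ceq are the
    # initial and equilibrium solution concentrations in consistent units
    return (c0 - c_eq) / c_eq * (volume_ml / mass_g)
```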

  17. Automated feature extraction for the classification of human in vivo 13C NMR spectra using statistical pattern recognition and wavelets.

    PubMed

    Tate, A R; Watson, D; Eglen, S; Arvanitis, T N; Thomas, E L; Bell, J D

    1996-06-01

    If magnetic resonance spectroscopy (MRS) is to become a useful tool in clinical medicine, it will be necessary to find reliable methods for analyzing and classifying MRS data. Automated methods are desirable because they can remove user bias and can deal with large amounts of data, allowing the use of all the available information. In this study, techniques for automatically extracting features for the classification of MRS in vivo data are investigated. Among the techniques used were wavelets, principal component analysis, and linear discriminant function analysis. These techniques were tested on a set of 75 in vivo 13C spectra of human adipose tissue from subjects from three different dietary groups (vegan, vegetarian, and omnivore). It was found that it was possible to automatically assign 94% of the vegans and omnivores to their correct dietary groups, without the need for explicit identification or measurement of peaks.
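
    One common way to realize the wavelet step is the Haar transform, whose coarse coefficients serve as a compact feature vector for the subsequent PCA and discriminant analysis. The abstract does not state which wavelet family was used, so Haar here is an assumption and the code is only a sketch:

```python
import math

def haar_level(signal):
    # one level of the Haar DWT: pairwise averages and differences,
    # scaled by 1/sqrt(2); input length must be even
    s = 1.0 / math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) * s for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) * s for i in range(0, len(signal), 2)]
    return approx, detail

def haar_features(signal, levels):
    # repeatedly transform the approximation; the final coarse coefficients
    # form a compact feature vector for a downstream classifier
    a = list(signal)
    for _ in range(levels):
        a, _ = haar_level(a)
    return a
```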

  18. Technical Note: Semi-automated effective width extraction from time-lapse RGB imagery of a remote, braided Greenlandic river

    NASA Astrophysics Data System (ADS)

    Gleason, C. J.; Smith, L. C.; Finnegan, D. C.; LeWinter, A. L.; Pitcher, L. H.; Chu, V. W.

    2015-06-01

    River systems in remote environments are often challenging to monitor and understand where traditional gauging apparatus is difficult to install or where safety concerns prohibit field measurements. In such cases, remote sensing, especially via terrestrial time-lapse imaging platforms, offers a means to better understand these fluvial systems. One such environment is found at the proglacial Isortoq River in southwestern Greenland, a river whose constantly shifting floodplain and remote Arctic location make gauging and in situ measurements all but impossible. In order to derive relevant hydraulic parameters for this river, two true color (RGB) cameras were installed in July 2011, and by September 2012 they had collected over 10 000 half-hourly time-lapse images of the river. Existing approaches for extracting hydraulic parameters from RGB imagery require manual or supervised classification of images into water and non-water areas, a task that was impractical for the volume of data in this study. As such, automated image filters were developed that removed images with environmental obstacles (e.g., shadows, sun glint, snow) from the processing stream. Further image filtering was accomplished via a novel automated histogram similarity filtering process. This similarity filtering allowed successful (mean accuracy 79.6%) supervised classification of filtered images from training data collected from just 10% of those images. Effective width, a hydraulic parameter highly correlated with discharge in braided rivers, was extracted from these classified images, producing a hydrograph proxy for the Isortoq River between 2011 and 2012. This hydrograph proxy shows agreement with historic flooding observed in other parts of Greenland in July 2012 and offers promise that the imaging platform and processing methodology presented here will be useful for future monitoring studies of remote rivers.
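
    The histogram similarity filtering step can be sketched as follows. The abstract does not specify the similarity measure, so histogram intersection against a reference image is an assumption, and the names and threshold are ours:

```python
def histogram(values, bins, lo, hi):
    # normalised histogram of pixel values over [lo, hi)
    h = [0] * bins
    w = (hi - lo) / bins
    for v in values:
        h[min(int((v - lo) / w), bins - 1)] += 1
    total = float(len(values))
    return [c / total for c in h]

def similarity(h1, h2):
    # histogram intersection of two normalised histograms, in [0, 1]
    return sum(min(a, b) for a, b in zip(h1, h2))

def keep_image(pixels, ref_hist, bins=16, lo=0, hi=256, thresh=0.8):
    # keep an image only if its histogram resembles the reference scene
    return similarity(histogram(pixels, bins, lo, hi), ref_hist) >= thresh
```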

  19. An automated system for retrieving herb-drug interaction related articles from MEDLINE

    PubMed Central

    Lin, Kuo; Friedman, Carol; Finkelstein, Joseph

    2016-01-01

    An automated, user-friendly and accurate system for retrieving herb-drug interaction (HDI)-related articles from MEDLINE can increase patient safety and improve physicians' article retrieval in terms of both speed and experience. Previous studies show that MeSH-based queries associated with negative effects of drugs can be customized, resulting in good performance in retrieving relevant information, but no study has focused on the area of herb-drug interactions. This paper characterized HDI-related papers and created a multilayer HDI article searching system. It achieved a sensitivity of 92% at a precision of 93% in a preliminary evaluation. Instead of requiring physicians to conduct PubMed searches directly, this system applies a more user-friendly approach by employing a customized system that enhances PubMed queries, shielding users from having to write queries, deal with PubMed, or read many irrelevant articles. The system provides automated processes and outputs target articles based on the input. PMID:27570662
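
    The reported figures correspond to the standard retrieval metrics, which can be computed directly from the sets of retrieved and relevant article identifiers:

```python
def retrieval_metrics(retrieved, relevant):
    # sensitivity (recall) = TP / (TP + FN); precision = TP / (TP + FP),
    # computed from sets of article identifiers
    retrieved, relevant = set(retrieved), set(relevant)
    tp = len(retrieved & relevant)
    sensitivity = tp / len(relevant) if relevant else 0.0
    precision = tp / len(retrieved) if retrieved else 0.0
    return sensitivity, precision
```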

  1. Automated extraction and analysis of rock discontinuity characteristics from 3D point clouds

    NASA Astrophysics Data System (ADS)

    Bianchetti, Matteo; Villa, Alberto; Agliardi, Federico; Crosta, Giovanni B.

    2016-04-01

    A reliable characterization of fractured rock masses requires an exhaustive geometrical description of discontinuities, including orientation, spacing, and size. These are required to describe discontinuum rock mass structure, perform Discrete Fracture Network and DEM modelling, or provide input for rock mass classification or equivalent continuum estimate of rock mass properties. Although several advanced methodologies have been developed in the last decades, a complete characterization of discontinuity geometry in practice is still challenging, due to scale-dependent variability of fracture patterns and difficult accessibility to large outcrops. Recent advances in remote survey techniques, such as terrestrial laser scanning and digital photogrammetry, allow a fast and accurate acquisition of dense 3D point clouds, which promoted the development of several semi-automatic approaches to extract discontinuity features. Nevertheless, these often need user supervision on algorithm parameters which can be difficult to assess. To overcome this problem, we developed an original Matlab tool, allowing fast, fully automatic extraction and analysis of discontinuity features with no requirements on point cloud accuracy, density and homogeneity. The tool consists of a set of algorithms which: (i) process raw 3D point clouds, (ii) automatically characterize discontinuity sets, (iii) identify individual discontinuity surfaces, and (iv) analyse their spacing and persistence. The tool operates in either a supervised or unsupervised mode, starting from an automatic preliminary exploration data analysis. The identification and geometrical characterization of discontinuity features is divided in steps. First, coplanar surfaces are identified in the whole point cloud using K-Nearest Neighbor and Principal Component Analysis algorithms optimized on point cloud accuracy and specified typical facet size. Then, discontinuity set orientation is calculated using Kernel Density Estimation and
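
    Once a planar facet's unit normal has been estimated (e.g. via PCA of a point neighborhood), converting it to the dip / dip-direction convention used for discontinuity sets is a standard step. A sketch (ours, not the authors' Matlab tool):

```python
import math

def dip_and_direction(nx, ny, nz):
    # convert a plane's unit normal (x = east, y = north, z = up) to
    # dip and dip direction in degrees; the normal is flipped upward first
    if nz < 0.0:
        nx, ny, nz = -nx, -ny, -nz
    dip = math.degrees(math.acos(max(-1.0, min(1.0, nz))))
    # azimuth of the normal's horizontal component, measured from north
    direction = math.degrees(math.atan2(nx, ny)) % 360.0
    return dip, direction
```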

  2. [Corrected Title: Solid-Phase Extraction of Polar Compounds from Water] Automated Electrostatics Environmental Chamber

    NASA Technical Reports Server (NTRS)

    Sauer, Richard; Rutz, Jeffrey; Schultz, John

    2005-01-01

    A solid-phase extraction (SPE) process has been developed for removing alcohols, carboxylic acids, aldehydes, ketones, amines, and other polar organic compounds from water. This process can be either a subprocess of a water-reclamation process or a means of extracting organic compounds from water samples for gas-chromatographic analysis. This SPE process is an attractive alternative to an Environmental Protection Agency liquid-liquid extraction process that generates some pollution and does not work in a microgravitational environment. In this SPE process, one forces a water sample through a resin bed by use of positive pressure on the upstream side and/or suction on the downstream side, thereby causing organic compounds from the water to be adsorbed onto the resin. If gas-chromatographic analysis is to be done, the resin is dried by use of a suitable gas, then the adsorbed compounds are extracted from the resin by use of a solvent. Unlike the liquid-liquid process, the SPE process works in both microgravity and Earth gravity. In comparison with the liquid-liquid process, the SPE process is more efficient, extracts a wider range of organic compounds, generates less pollution, and costs less.

  3. Automated extraction of urban trees from mobile LiDAR point clouds

    NASA Astrophysics Data System (ADS)

    Fan, W.; Chenglu, W.; Jonathan, L.

    2016-03-01

    This paper presents an automatic algorithm to localize and extract urban trees from mobile LiDAR point clouds. First, in order to reduce the number of points to be processed, the ground points are filtered out from the raw point clouds, and the non-ground points are segmented into supervoxels. Then, a novel localization method is proposed to accurately locate the urban trees. Next, a segmentation method guided by the localization results is proposed to obtain individual objects. Finally, features of the objects are extracted, and the feature vectors are classified by random forests trained on manually labeled objects. The proposed method has been tested on a point cloud dataset, and the results show that our algorithm extracts urban trees efficiently.

  4. Background Knowledge in Learning-Based Relation Extraction

    ERIC Educational Resources Information Center

    Do, Quang Xuan

    2012-01-01

    In this thesis, we study the importance of background knowledge in relation extraction systems. We not only demonstrate the benefits of leveraging background knowledge to improve the systems' performance but also propose a principled framework that allows one to effectively incorporate knowledge into statistical machine learning models for…

  5. Automation of static and dynamic non-dispersive liquid phase microextraction. Part 1: Approaches based on extractant drop-, plug-, film- and microflow-formation.

    PubMed

    Alexovič, Michal; Horstkotte, Burkhard; Solich, Petr; Sabo, Ján

    2016-02-01

    Simplicity, effectiveness, swiftness, and environmental friendliness - these are the typical requirements for state-of-the-art development of green analytical techniques. Liquid phase microextraction (LPME) stands for a family of elegant sample pretreatment and analyte preconcentration techniques preserving these principles in numerous applications. By using only fractions of the solvent and sample volumes of classical liquid-liquid extraction, the extraction kinetics, the preconcentration factor, and the cost efficiency can be increased. Moreover, significant improvements can be made by automation, which remains a hot topic in analytical chemistry. This review comprehensively surveys, in two parts, developments in the automation of non-dispersive LPME methodologies performed in static and dynamic modes. Their advantages and limitations and the reported analytical performances are discussed and put into perspective with the corresponding manual procedures. The automation strategies and techniques, their operational advantages, and their potentials are further described and discussed. In this first part, an introduction to LPME, its static and dynamic operation modes, and the corresponding automation methodologies is given. The LPME techniques are classified according to the different approaches to protecting the extraction solvent, using either a tip-like (needle/tube/rod) support (drop-based approaches), a wall support (film-based approaches), or microfluidic devices. In the second part, LPME techniques based on porous supports for the extraction solvent, such as membranes and porous media, are reviewed. An outlook on future demands and perspectives in this promising area of analytical chemistry is finally given.

  6. A semi-automated methodology for finding lipid-related GO terms

    PubMed Central

    Fan, Mengyuan; Low, Hong Sang; Wenk, Markus R.; Wong, Limsoon

    2014-01-01

    Motivation: Although semantic similarity in Gene Ontology (GO) and other approaches may be used to find similar GO terms, there has been no method to systematically find a class of GO terms sharing a common property with high accuracy (e.g. involving human curation). Results: We have developed a methodology to address this issue and applied it to identify lipid-related GO terms, owing to the important and varied roles of lipids in many biological processes. Our methodology finds lipid-related GO terms in a semi-automated manner, requiring only moderate manual curation. We first obtain a list of lipid-related gold-standard GO terms by keyword search and manual curation. Then, based on the hypothesis that co-annotated GO terms share similar properties, we develop a machine learning method that expands the list of lipid-related terms from the gold standard. The terms predicted most likely to be lipid related are examined by a human curator following specific curation rules to confirm the class labels. The structure of GO is also exploited to help reduce the curation effort. The prediction and curation cycle is repeated until no further lipid-related term is found. Our approach has covered a high proportion, if not all, of lipid-related terms with relatively high efficiency. Database URL: http://compbio.ddns.comp.nus.edu.sg/∼lipidgo PMID:25209026
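
    The co-annotation hypothesis behind the expansion step can be sketched with a simple similarity over annotated gene sets. Jaccard similarity and the threshold are our illustrative stand-ins for the paper's trained classifier, and all names are hypothetical:

```python
def co_annotation_similarity(genes_a, genes_b):
    # Jaccard similarity between the gene sets annotated to two GO terms;
    # the hypothesis is that highly co-annotated terms share properties
    a, b = set(genes_a), set(genes_b)
    return len(a & b) / len(a | b) if a | b else 0.0

def expand_candidates(seed_terms, term_genes, thresh=0.3):
    # rank unlabelled terms by their best similarity to any seed term;
    # those above the threshold are queued for manual curation
    out = []
    for term, genes in term_genes.items():
        if term in seed_terms:
            continue
        score = max(co_annotation_similarity(genes, term_genes[s])
                    for s in seed_terms)
        if score >= thresh:
            out.append((score, term))
    return sorted(out, reverse=True)
```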

  7. An automated algorithm for extracting road edges from terrestrial mobile LiDAR data

    NASA Astrophysics Data System (ADS)

    Kumar, Pankaj; McElhinney, Conor P.; Lewis, Paul; McCarthy, Timothy

    2013-11-01

    Terrestrial mobile laser scanning systems provide rapid and cost-effective 3D point cloud data which can be used for extracting features such as the road edge along a route corridor. This information can assist road authorities in carrying out safety risk assessment studies along road networks. Knowledge of the road edge is also a prerequisite for the automatic estimation of most other road features. In this paper, we present an algorithm developed for extracting left and right road edges from terrestrial mobile LiDAR data. The algorithm is based on a novel combination of two modified versions of the parametric active contour, or snake, model. The parameters involved in the algorithm are selected empirically and are fixed for all road sections. We have developed a novel way of initialising the snake model based on the navigation information obtained from the mobile mapping vehicle. We tested our algorithm on different types of road sections representing rural, urban and national primary environments. The successful extraction of road edges from these multiple road-section environments validates our algorithm. These findings provide valuable insights, as well as a prototype road edge extraction toolset, for both national road authorities and survey companies.

  8. Automated biphasic morphological assessment of hepatitis B-related liver fibrosis using second harmonic generation microscopy.

    PubMed

    Wang, Tong-Hong; Chen, Tse-Ching; Teng, Xiao; Liang, Kung-Hao; Yeh, Chau-Ting

    2015-01-01

    Liver fibrosis assessment by biopsy and conventional staining scores is based on histopathological criteria. Variations in sample preparation and the use of semi-quantitative histopathological methods commonly result in discrepancies between medical centers. Thus, minor changes in liver fibrosis might be overlooked in multi-center clinical trials, leading to statistically non-significant data. Here, we developed a computer-assisted, fully automated, staining-free method for hepatitis B-related liver fibrosis assessment. In total, 175 liver biopsies were divided into training (n = 105) and verification (n = 70) cohorts. Collagen was observed using second harmonic generation (SHG) microscopy without prior staining, and hepatocyte morphology was recorded using two-photon excitation fluorescence (TPEF) microscopy. The training cohort was utilized to establish a quantification algorithm. Eleven of 19 computer-recognizable SHG/TPEF microscopic morphological features were significantly correlated with the ISHAK fibrosis stages (P < 0.001). A biphasic scoring method was applied, combining support vector machine and multivariate generalized linear models to assess the early and late stages of fibrosis, respectively, based on these parameters. The verification cohort was used to verify the scoring method, and the area under the receiver operating characteristic curve was >0.82 for liver cirrhosis detection. Since no subjective grading is needed, interobserver discrepancies could be avoided using this fully automated method. PMID:26260921

  10. Quantitative analysis of ex vivo colorectal epithelium using an automated feature extraction algorithm for microendoscopy image data.

    PubMed

    Prieto, Sandra P; Lai, Keith K; Laryea, Jonathan A; Mizell, Jason S; Muldoon, Timothy J

    2016-04-01

    Qualitative screening for colorectal polyps via fiber bundle microendoscopy imaging has shown promising results, with studies reporting high rates of sensitivity and specificity, as well as low interobserver variability with trained clinicians. A quantitative image quality control and image feature extraction algorithm (QFEA) was designed to lessen the burden of training and provide objective data for improved clinical efficacy of this method. After a quantitative image quality control step, QFEA extracts field-of-view area, crypt area, crypt circularity, and crypt number per image. To develop and validate this QFEA, a training set of microendoscopy images was collected from freshly resected porcine colon epithelium. The algorithm was then further validated on ex vivo image data collected from eight human subjects, selected from clinically normal appearing regions distant from grossly visible tumor in surgically resected colorectal tissue. QFEA has proven flexible in application to both mosaics and individual images, and its automated crypt detection sensitivity ranges from 71 to 94% despite intensity and contrast variation within the field of view. It also demonstrates the ability to detect and quantify differences in grossly normal regions among different subjects, suggesting the potential efficacy of this approach in detecting occult regions of dysplasia. PMID:27335893
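
    Of the features QFEA extracts, crypt circularity has a standard closed form, 4πA/P², which equals 1 for a perfect circle and decreases for elongated or irregular crypts:

```python
import math

def circularity(area, perimeter):
    # shape circularity 4*pi*A / P**2: 1.0 for a perfect circle,
    # smaller for elongated or irregular crypt cross-sections
    return 4.0 * math.pi * area / perimeter ** 2
```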

  11. Fully automated Liquid Extraction-Based Surface Sampling and Ionization Using a Chip-Based Robotic Nanoelectrospray Platform

    SciTech Connect

    Kertesz, Vilmos; Van Berkel, Gary J

    2010-01-01

    A fully automated liquid extraction-based surface sampling device utilizing an Advion NanoMate chip-based infusion nanoelectrospray ionization system is reported. Analyses were enabled for discrete spot sampling by using the Advanced User Interface of the current commercial control software. This software interface provided the parameter control necessary for the NanoMate robotic pipettor to both form and withdraw a liquid microjunction for sampling from a surface. The system was tested with three analytically important surface types, viz., spotted sample arrays on a MALDI plate, dried blood spots on paper, and whole-body thin tissue sections from drug-dosed mice. The qualitative and quantitative data were consistent with previous studies employing other liquid extraction-based surface sampling techniques. The successful analyses performed here utilized the hardware and software elements already present in the NanoMate system developed to handle and analyze liquid samples. Implementation of an appropriate sample (surface) holder, a solvent reservoir, faster movement of the robotic arm, finer control over solvent flow rate when dispensing and retrieving the solution at the surface, and the ability to select any location on a surface to sample from would improve the analytical performance and utility of the platform.

  12. Sequential automated fusion/extraction chromatography methodology for the dissolution of uranium in environmental samples for mass spectrometric determination.

    PubMed

    Milliard, Alex; Durand-Jézéquel, Myriam; Larivière, Dominic

    2011-01-17

    An improved methodology has been developed, based on dissolution by automated fusion followed by extraction chromatography for the detection and quantification of uranium in environmental matrices by mass spectrometry. A rapid fusion protocol (<8 min) was investigated for the complete dissolution of various samples. It could be preceded, if required, by an effective ashing procedure using the M4 fluxer and a newly designed platinum lid. Complete dissolution of the sample was observed and measured using standard reference materials (SRMs) and experimental data show no evidence of cross-contamination of crucibles when LiBO(2)/LiBr melts were used. The use of a M4 fusion unit also improved repeatability in sample preparation over muffle furnace fusion. Instrumental issues originating from the presence of high salt concentrations in the digestate after lithium metaborate fusion were also mitigated using an extraction chromatography (EXC) protocol aimed at removing lithium and interfering matrix constituents prior to the elution of uranium. The sequential methodology, which can be performed simultaneously on three samples, requires less than 20 min per sample for fusion and separation. It was successfully coupled to inductively coupled plasma mass spectrometry (ICP-MS) achieving detection limits below 100 pg kg(-1) for 5-300 mg of sample. PMID:21167982
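The detection limits quoted above are typically estimated from blank replicates and the calibration slope. A generic calibration-based sketch of that calculation follows (the conventional k = 3.3 IUPAC factor; the paper's exact procedure and all numbers below are assumptions for illustration).

```python
def detection_limit(blank_signals, slope, k=3.3):
    """Calibration-based limit of detection: k * sd(blank) / sensitivity.
    k = 3.3 is the common IUPAC convention; slope is signal per unit
    concentration from the calibration curve."""
    n = len(blank_signals)
    mean = sum(blank_signals) / n
    sd = (sum((x - mean) ** 2 for x in blank_signals) / (n - 1)) ** 0.5
    return k * sd / slope

# Hypothetical ICP-MS blank replicates (counts) and slope (counts per pg/kg).
lod = detection_limit([10.0, 12.0, 11.0, 9.0, 13.0], slope=0.5)
```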

  13. Rhythmic brushstrokes distinguish van Gogh from his contemporaries: findings via automated brushstroke extraction.

    PubMed

    Li, Jia; Yao, Lei; Hendriks, Ella; Wang, James Z

    2012-06-01

    Art historians have long observed the highly characteristic brushstroke styles of Vincent van Gogh and have relied on discerning these styles for authenticating and dating his works. In our work, we compared van Gogh with his contemporaries by statistically analyzing a massive set of automatically extracted brushstrokes. A novel extraction method is developed by exploiting an integration of edge detection and clustering-based segmentation. Evidence substantiates that van Gogh's brushstrokes are strongly rhythmic. That is, regularly shaped brushstrokes are tightly arranged, creating a repetitive and patterned impression. We also found that the traits that distinguish van Gogh's paintings in different time periods of his development are all different from those distinguishing van Gogh from his peers. This study confirms that the combined brushwork features identified as special to van Gogh are consistently held throughout his French periods of production (1886-1890). PMID:22516651

  15. Quantification of lung tumor rotation with automated landmark extraction using orthogonal cine MRI images

    NASA Astrophysics Data System (ADS)

    Paganelli, Chiara; Lee, Danny; Greer, Peter B.; Baroni, Guido; Riboldi, Marco; Keall, Paul

    2015-09-01

    The quantification of tumor motion in sites affected by respiratory motion is of primary importance to improve treatment accuracy. To account for motion, different studies analyzed the translational component only, without focusing on the rotational component, which was quantified in a few studies on the prostate with implanted markers. The aim of our study was to propose a tool able to quantify lung tumor rotation without the use of internal markers, thus providing accurate motion detection close to critical structures such as the heart or liver. Specifically, we propose the use of an automatic feature extraction method in combination with the acquisition of fast orthogonal cine MRI images of nine lung patients. As a preliminary test, we evaluated the performance of the feature extraction method by applying it on regions of interest around (i) the diaphragm and (ii) the tumor and comparing the estimated motion with that obtained by (i) the extraction of the diaphragm profile and (ii) the segmentation of the tumor, respectively. The results confirmed the capability of the proposed method in quantifying tumor motion. Then, a point-based rigid registration was applied to the extracted tumor features between all frames to account for rotation. The median lung rotation values were -0.6 ± 2.3° and -1.5 ± 2.7° in the sagittal and coronal planes, respectively, confirming the need to account for tumor rotation along with translation to improve radiotherapy treatment.
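The point-based rigid registration step above estimates the rotation between landmark sets in consecutive frames. In 2D this has a closed-form least-squares solution (the planar case of a Procrustes/Kabsch fit): centre both point sets, then take the angle from summed cross and dot products. A minimal sketch, with hypothetical landmarks:

```python
import math

def rotation_angle(src, dst):
    """Best-fit 2D rotation (radians) mapping centred src points onto dst:
    the closed-form least-squares solution used in rigid registration."""
    n = len(src)
    sx = sum(p[0] for p in src) / n; sy = sum(p[1] for p in src) / n
    dx = sum(p[0] for p in dst) / n; dy = sum(p[1] for p in dst) / n
    num = den = 0.0
    for (x, y), (u, v) in zip(src, dst):
        x, y, u, v = x - sx, y - sy, u - dx, v - dy
        num += x * v - y * u   # summed cross products -> sin component
        den += x * u + y * v   # summed dot products  -> cos component
    return math.atan2(num, den)

# Recover a known 10-degree in-plane rotation of three tumour landmarks.
theta = math.radians(10.0)
src = [(0.0, 0.0), (10.0, 0.0), (0.0, 5.0)]
dst = [(x * math.cos(theta) - y * math.sin(theta),
        x * math.sin(theta) + y * math.cos(theta)) for x, y in src]
est = math.degrees(rotation_angle(src, dst))
```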

  17. Novel tools for extraction and validation of disease-related mutations applied to Fabry disease.

    PubMed

    Kuipers, Remko; van den Bergh, Tom; Joosten, Henk-Jan; Lekanne dit Deprez, Ronald H; Mannens, Marcel Mam; Schaap, Peter J

    2010-09-01

    Genetic disorders are often caused by nonsynonymous nucleotide changes in one or more genes associated with the disease. Specific amino acid changes, however, can lead to large variability of phenotypic expression. For many genetic disorders this results in an increasing amount of publications describing phenotype-associated mutations in disorder-related genes. Keeping up with this stream of publications is essential for molecular diagnostics and translational research purposes but often impossible due to time constraints: there are simply too many articles to read. To help solve this problem, we have created Mutator, an automated method to extract mutations from full-text articles. Extracted mutations are cross-referenced to sequence data, and a scoring method is applied to distinguish false positives. To analyze stored and new mutation data for their (potential) effect we have developed Validator, a Web-based tool specifically designed for DNA diagnostics. Fabry disease, a monogenic disorder of the GLA gene, was used as a test case. A structure-based sequence alignment of the alpha-amylase superfamily was used to validate results. We have compared our data with existing Fabry mutation data sets obtained from the HGMD and Swiss-Prot databases. Compared to these data sets, Mutator extracted 30% additional mutations from the literature.
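Mutation mentions in full text are commonly pulled out with patterns over amino-acid nomenclature. The sketch below uses two illustrative regular expressions ("A143T" style and HGVS-like "p.Arg112Cys" style); these patterns and the example sentence are assumptions for illustration, not Mutator's actual grammar.

```python
import re

# One-letter form, e.g. N215S: wild-type residue, position, variant residue.
SHORT = re.compile(r'\b([ACDEFGHIKLMNPQRSTVWY])(\d+)([ACDEFGHIKLMNPQRSTVWY])\b')
# HGVS-like three-letter form, e.g. p.Arg112Cys.
HGVS = re.compile(r'\bp\.([A-Z][a-z]{2})(\d+)([A-Z][a-z]{2})\b')

def find_mutations(text):
    """Return (wild-type, position, variant) triples found in text."""
    hits = [(m.group(1), int(m.group(2)), m.group(3)) for m in SHORT.finditer(text)]
    hits += [(m.group(1), int(m.group(2)), m.group(3)) for m in HGVS.finditer(text)]
    return hits

sentence = ("The classic Fabry mutation p.Arg112Cys and the cardiac variant "
            "N215S were both reported in GLA.")
muts = find_mutations(sentence)
```

In a pipeline like Mutator's, such candidate hits would then be cross-referenced against the gene's reference sequence to discard false positives.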

  18. Unsupervised entity and relation extraction from clinical records in Italian.

    PubMed

    Alicante, Anita; Corazza, Anna; Isgrò, Francesco; Silvestri, Stefano

    2016-05-01

    This paper proposes and discusses the use of text mining techniques for the extraction of information from clinical records written in Italian. However, as it is very difficult and expensive to obtain annotated material for languages different from English, we only consider unsupervised approaches, where no annotated training set is necessary. We therefore propose a complete system that is structured in two steps. In the first, domain entities are extracted from the clinical records by means of a metathesaurus and standard natural language processing tools. The second step attempts to discover relations between the entity pairs extracted from the whole set of clinical records. For this last step, we investigate the performance of unsupervised methods such as clustering in the space of entity pairs, represented by an ad hoc feature vector. The resulting clusters are then automatically labelled by using the most significant features. The system has been tested on a fairly large data set of clinical records in Italian, investigating the variation in performance when adopting different similarity measures in the feature space. The results of our experiments show that the unsupervised approach proposed is promising and well suited for a semi-automatic labelling of the extracted relations. PMID:26851833
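Clustering entity pairs requires a similarity measure over their feature vectors; cosine similarity is one commonly tried choice (which specific measures the paper compares is not restated here). A minimal sketch over hypothetical sparse feature vectors:

```python
import math

def cosine(u, v):
    """Cosine similarity between two sparse feature vectors (dicts
    mapping feature name -> count)."""
    dot = sum(u[k] * v[k] for k in u if k in v)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

# Hypothetical context features for three entity pairs from clinical records.
pair_a = {"affected_by": 2, "treated_with": 1}
pair_b = {"affected_by": 1, "treated_with": 1}
pair_c = {"located_in": 3}
sim_ab = cosine(pair_a, pair_b)   # similar contexts -> high similarity
sim_ac = cosine(pair_a, pair_c)   # disjoint contexts -> zero
```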

  19. Towards a Relation Extraction Framework for Cyber-Security Concepts

    SciTech Connect

    Jones, Corinne L; Bridges, Robert A; Huffer, Kelly M; Goodall, John R

    2015-01-01

    In order to assist security analysts in obtaining information pertaining to their network, such as novel vulnerabilities, exploits, or patches, information retrieval methods tailored to the security domain are needed. As labeled text data is scarce and expensive, we follow developments in semi-supervised NLP and implement a bootstrapping algorithm for extracting security entities and their relationships from text. The algorithm requires little input data, specifically, a few relations or patterns (heuristics for identifying relations), and incorporates an active learning component which queries the user on the most important decisions to prevent drift away from the desired relations. Preliminary testing on a small corpus shows promising results, obtaining a precision of 0.82.
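The bootstrapping idea can be sketched in a few lines: seed relation pairs induce textual patterns, and those patterns then extract new pairs from the corpus. This toy version (with a made-up corpus and no active-learning or drift control) only illustrates the loop, not the ORNL implementation.

```python
def bootstrap(corpus, seeds, rounds=2):
    """Grow a set of (entity, entity) relation pairs from seeds by
    harvesting the text between known pairs as patterns, then matching
    those patterns elsewhere in the corpus."""
    pairs, patterns = set(seeds), set()
    for _ in range(rounds):
        for a, b in list(pairs):           # 1. induce patterns from known pairs
            for sent in corpus:
                if a in sent and b in sent:
                    i, j = sent.index(a), sent.index(b)
                    if i < j:
                        patterns.add(sent[i + len(a):j])
        for sent in corpus:                # 2. apply patterns to find new pairs
            for pat in patterns:
                if pat and pat in sent:
                    left, _, right = sent.partition(pat)
                    a, b = left.split()[-1:], right.split()[:1]
                    if a and b:
                        pairs.add((a[0], b[0]))
    return pairs

corpus = [
    "CVE-2014-0160 affects OpenSSL",
    "CVE-2017-5638 affects Struts",
    "CVE-2019-0708 affects RDP",
]
found = bootstrap(corpus, seeds={("CVE-2014-0160", "OpenSSL")})
```

Here the single seed yields the pattern " affects ", which in turn extracts the two unseen vulnerability-software pairs.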

  20. Evaluation of an Automated Information Extraction Tool for Imaging Data Elements to Populate a Breast Cancer Screening Registry.

    PubMed

    Lacson, Ronilda; Harris, Kimberly; Brawarsky, Phyllis; Tosteson, Tor D; Onega, Tracy; Tosteson, Anna N A; Kaye, Abby; Gonzalez, Irina; Birdwell, Robyn; Haas, Jennifer S

    2015-10-01

    Breast cancer screening is central to early breast cancer detection. Identifying and monitoring process measures for screening is a focus of the National Cancer Institute's Population-based Research Optimizing Screening through Personalized Regimens (PROSPR) initiative, which requires participating centers to report structured data across the cancer screening continuum. We evaluate the accuracy of automated information extraction of imaging findings from radiology reports, which are available as unstructured text. We present prevalence estimates of imaging findings for breast imaging received by women who obtained care in a primary care network participating in PROSPR (n = 139,953 radiology reports) and compared automatically extracted data elements to a "gold standard" based on manual review for a validation sample of 941 randomly selected radiology reports, including mammograms, digital breast tomosynthesis, ultrasound, and magnetic resonance imaging (MRI). The prevalence of imaging findings varies by data element and modality (e.g., suspicious calcification noted in 2.6% of screening mammograms, 12.1% of diagnostic mammograms, and 9.4% of tomosynthesis exams). In the validation sample, the accuracy of identifying imaging findings, including suspicious calcifications, masses, and architectural distortion (on mammogram and tomosynthesis); masses, cysts, non-mass enhancement, and enhancing foci (on MRI); and masses and cysts (on ultrasound), ranges from 0.8 to 1.0 for recall, precision, and F-measure. Information extraction tools can be used for accurate documentation of imaging findings as structured data elements from text reports for a variety of breast imaging modalities. These data can be used to populate screening registries to help elucidate more effective breast cancer screening processes. PMID:25561069
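Recall, precision, and F-measure are all derived from true-positive, false-positive, and false-negative counts against the gold standard. A small sketch of the definitions (the counts below are hypothetical, not the study's):

```python
def prf(tp, fp, fn):
    """Precision, recall and balanced F-measure from validation counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f = 2 * precision * recall / (precision + recall)
    return precision, recall, f

# Hypothetical counts for one finding type in a validation sample.
p, r, f = prf(tp=90, fp=5, fn=10)
```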

  2. An energy minimization approach to automated extraction of regular building footprints from airborne LiDAR data

    NASA Astrophysics Data System (ADS)

    He, Y.; Zhang, C.; Fraser, C. S.

    2014-08-01

    This paper presents an automated approach to the extraction of building footprints from airborne LiDAR data based on energy minimization. Automated 3D building reconstruction in complex urban scenes has been a long-standing challenge in photogrammetry and computer vision. Building footprints constitute a fundamental component of a 3D building model and they are useful for a variety of applications. Airborne LiDAR provides large-scale elevation representation of urban scenes and as such is an important data source for object reconstruction in spatial information systems. However, LiDAR points on building edges often exhibit a jagged pattern, partially due to either occlusion from neighbouring objects, such as overhanging trees, or to the nature of the data itself, including unavoidable noise and irregular point distributions. The explicit 3D reconstruction may thus result in irregular or incomplete building polygons. In the presented work, a vertex-driven Douglas-Peucker method is developed to generate polygonal hypotheses from points forming initial building outlines. The energy function is adopted to examine and evaluate each hypothesis and the optimal polygon is determined through energy minimization. The energy minimization also plays a key role in bridging gaps, where the building outlines are ambiguous due to insufficient LiDAR points. In formulating the energy function, hard constraints such as parallelism and perpendicularity of building edges are imposed, and local and global adjustments are applied. The developed approach has been extensively tested and evaluated on datasets with varying point cloud density over different terrain types. Results are presented and analysed. The successful reconstruction of building footprints, of varying structural complexity, along with a quantitative assessment employing accurate reference data, demonstrate the practical potential of the proposed approach.
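The polygonal hypotheses above build on the Douglas-Peucker idea of simplifying a jagged outline. A sketch of the classic recursive algorithm follows (the paper's vertex-driven variant and energy terms are more elaborate; the outline data here are made up):

```python
def perp_dist(p, a, b):
    """Perpendicular distance from point p to the line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return ((px - ax) ** 2 + (py - ay) ** 2) ** 0.5
    return abs(dy * px - dx * py + bx * ay - by * ax) / (dx * dx + dy * dy) ** 0.5

def douglas_peucker(points, eps):
    """Classic recursive simplification: keep the farthest vertex if it
    deviates more than eps from the chord, otherwise drop interior points."""
    if len(points) < 3:
        return list(points)
    dmax, idx = 0.0, 0
    for i in range(1, len(points) - 1):
        d = perp_dist(points[i], points[0], points[-1])
        if d > dmax:
            dmax, idx = d, i
    if dmax <= eps:
        return [points[0], points[-1]]
    left = douglas_peucker(points[:idx + 1], eps)
    right = douglas_peucker(points[idx:], eps)
    return left[:-1] + right

# A jagged, nearly collinear outline collapses to its two endpoints.
outline = [(0, 0), (1, 0.05), (2, -0.03), (3, 0.04), (4, 0)]
simplified = douglas_peucker(outline, eps=0.1)
```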

  3. Automated extraction of DNA from blood and PCR setup using a Tecan Freedom EVO liquid handler for forensic genetic STR typing of reference samples.

    PubMed

    Stangegaard, Michael; Frøslev, Tobias G; Frank-Hansen, Rune; Hansen, Anders J; Morling, Niels

    2011-04-01

    We have implemented and validated automated protocols for DNA extraction and PCR setup using a Tecan Freedom EVO liquid handler mounted with the Te-MagS magnetic separation device (Tecan, Männedorf, Switzerland). The protocols were validated for accredited forensic genetic work according to ISO 17025 using the Qiagen MagAttract DNA Mini M48 kit (Qiagen GmbH, Hilden, Germany) from fresh whole blood and blood from deceased individuals. The workflow was simplified by returning the DNA extracts to the original tubes minimizing the risk of misplacing samples. The tubes that originally contained the samples were washed with MilliQ water before the return of the DNA extracts. The PCR was setup in 96-well microtiter plates. The methods were validated for the kits: AmpFℓSTR Identifiler, SGM Plus and Yfiler (Applied Biosystems, Foster City, CA), GenePrint FFFL and PowerPlex Y (Promega, Madison, WI). The automated protocols allowed for extraction and addition of PCR master mix of 96 samples within 3.5h. In conclusion, we demonstrated that (1) DNA extraction with magnetic beads and (2) PCR setup for accredited, forensic genetic short tandem repeat typing can be implemented on a simple automated liquid handler leading to the reduction of manual work, and increased quality and throughput. PMID:21609694

  4. Validation of an automated solid-phase extraction method for the analysis of 23 opioids, cocaine, and metabolites in urine with ultra-performance liquid chromatography-tandem mass spectrometry.

    PubMed

    Ramírez Fernández, María del Mar; Van Durme, Filip; Wille, Sarah M R; di Fazio, Vincent; Kummer, Natalie; Samyn, Nele

    2014-06-01

    The aim of this work was to automate a sample preparation procedure extracting morphine, hydromorphone, oxymorphone, norcodeine, codeine, dihydrocodeine, oxycodone, 6-monoacetyl-morphine, hydrocodone, ethylmorphine, benzoylecgonine, cocaine, cocaethylene, tramadol, meperidine, pentazocine, fentanyl, norfentanyl, buprenorphine, norbuprenorphine, propoxyphene, methadone and 2-ethylidene-1,5-dimethyl-3,3-diphenylpyrrolidine from urine samples. Samples were extracted by solid-phase extraction (SPE) with cation exchange cartridges using a TECAN Freedom Evo 100 base robotic system, including a hydrolysis step prior to extraction when required. Block modules were carefully selected in order to use the same consumable material as in manual procedures to reduce cost and/or manual sample transfers. Moreover, the present configuration included pressure monitoring pipetting increasing pipetting accuracy and detecting sampling errors. The compounds were then separated in a chromatographic run of 9 min using a BEH Phenyl analytical column on a ultra-performance liquid chromatography-tandem mass spectrometry system. Optimization of the SPE was performed with different wash conditions and elution solvents. Intra- and inter-day relative standard deviations (RSDs) were within ±15% and bias was within ±15% for most of the compounds. Recovery was >69% (RSD < 11%) and matrix effects ranged from 1 to 26% when compensated with the internal standard. The limits of quantification ranged from 3 to 25 ng/mL depending on the compound. No cross-contamination in the automated SPE system was observed. The extracted samples were stable for 72 h in the autosampler (4°C). This method was applied to authentic samples (from forensic and toxicology cases) and to proficiency testing schemes containing cocaine, heroin, buprenorphine and methadone, offering fast and reliable results. Automation resulted in improved precision and accuracy, and minimal operator intervention, leading to safer sample

  5. Pedestrian detection in thermal images: An automated scale based region extraction with curvelet space validation

    NASA Astrophysics Data System (ADS)

    Lakshmi, A.; Faheema, A. G. J.; Deodhare, Dipti

    2016-05-01

    Pedestrian detection is a key problem in night vision processing with dozens of applications that will positively impact the performance of autonomous systems. Despite significant progress, our study shows that performance of state-of-the-art thermal image pedestrian detectors still has much room for improvement. The purpose of this paper is to overcome the challenge faced by the thermal image pedestrian detectors, which employ intensity based Region Of Interest (ROI) extraction followed by feature based validation. The most striking disadvantage faced by the first module, ROI extraction, is the failed detection of cloth-insulated parts. To overcome this setback, this paper employs a region-growing algorithm tuned to the scale of the pedestrian. The statistics subtended by the pedestrian vary drastically with scale, and a deviation-from-normality approach facilitates scale detection. Further, the paper offers an adaptive mathematical threshold to resolve the problem of subtracting the background while extracting cloth-insulated parts as well. The inherent false positives of the ROI extraction module are limited by the choice of good features in the pedestrian validation step. One such feature is the curvelet feature, which has found extensive use in optical images, but has as yet no reported results in thermal images. This has been used to arrive at a pedestrian detector with a reduced false positive rate. This work is the first venture made to scrutinize the utility of curvelet for characterizing pedestrians in thermal images. Attempt has also been made to improve the speed of curvelet transform computation. The classification task is realized through the use of the well known methodology of Support Vector Machines (SVMs). 
The proposed method is substantiated with qualified evaluation methodologies that permits us to carry out probing and informative comparisons across state-of-the-art features, including deep learning methods, with six
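The ROI-extraction stage described above rests on region growing. A minimal BFS sketch with a fixed intensity tolerance follows; the paper's scale-tuned, adaptive-threshold variant is more elaborate, and the toy frame below is hypothetical.

```python
from collections import deque

def region_grow(image, seed, tol):
    """BFS region growing: collect 4-connected pixels whose intensity is
    within tol of the seed pixel's intensity."""
    h, w = len(image), len(image[0])
    sy, sx = seed
    base = image[sy][sx]
    region, queue = {seed}, deque([seed])
    while queue:
        y, x = queue.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if 0 <= ny < h and 0 <= nx < w and (ny, nx) not in region \
                    and abs(image[ny][nx] - base) <= tol:
                region.add((ny, nx))
                queue.append((ny, nx))
    return region

# Toy thermal frame: a warm pedestrian blob (8-9) on a cool background (1).
frame = [
    [1, 1, 1, 1],
    [1, 9, 9, 1],
    [1, 9, 8, 1],
    [1, 1, 1, 1],
]
blob = region_grow(frame, seed=(1, 1), tol=1)
```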

  6. Automated extraction of BI-RADS final assessment categories from radiology reports with natural language processing.

    PubMed

    Sippo, Dorothy A; Warden, Graham I; Andriole, Katherine P; Lacson, Ronilda; Ikuta, Ichiro; Birdwell, Robyn L; Khorasani, Ramin

    2013-10-01

    The objective of this study is to evaluate a natural language processing (NLP) algorithm that determines American College of Radiology Breast Imaging Reporting and Data System (BI-RADS) final assessment categories from radiology reports. This HIPAA-compliant study was granted institutional review board approval with waiver of informed consent. This cross-sectional study involved 1,165 breast imaging reports in the electronic medical record (EMR) from a tertiary care academic breast imaging center from 2009. Reports included screening mammography, diagnostic mammography, breast ultrasound, combined diagnostic mammography and breast ultrasound, and breast magnetic resonance imaging studies. Over 220 reports were included from each study type. The recall (sensitivity) and precision (positive predictive value) of a NLP algorithm to collect BI-RADS final assessment categories stated in the report final text was evaluated against a manual human review standard reference. For all breast imaging reports, the NLP algorithm demonstrated a recall of 100.0 % (95 % confidence interval (CI), 99.7, 100.0 %) and a precision of 96.6 % (95 % CI, 95.4, 97.5 %) for correct identification of BI-RADS final assessment categories. The NLP algorithm demonstrated high recall and precision for extraction of BI-RADS final assessment categories from the free text of breast imaging reports. NLP may provide an accurate, scalable data extraction mechanism from reports within EMRs to create databases to track breast imaging performance measures and facilitate optimal breast cancer population management strategies. PMID:23868515
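Pulling a stated BI-RADS final assessment category out of free text can be approximated with a regular expression; the study's NLP algorithm is more involved, and the pattern and sample reports below are illustrative assumptions only.

```python
import re

# Matches e.g. "BI-RADS Category 4", "BIRADS: 2", "BI-RADS 4a" (case-insensitive).
BIRADS = re.compile(r'BI-?RADS\s*(?:category)?\s*:?\s*([0-6][abc]?)', re.IGNORECASE)

def final_assessment(report_text):
    """Return the last BI-RADS category stated in the report, or None."""
    matches = BIRADS.findall(report_text)
    return matches[-1].lower() if matches else None

report = ("FINDINGS: Scattered fibroglandular densities. "
          "IMPRESSION: No mammographic evidence of malignancy. "
          "BI-RADS Category 1.")
cat1 = final_assessment(report)
cat2 = final_assessment("Targeted ultrasound. BIRADS: 4a, biopsy recommended.")
```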

  7. Automated DICOM metadata and volumetric anatomical information extraction for radiation dosimetry

    NASA Astrophysics Data System (ADS)

    Papamichail, D.; Ploussi, A.; Kordolaimi, S.; Karavasilis, E.; Papadimitroulas, P.; Syrgiamiotis, V.; Efstathopoulos, E.

    2015-09-01

    Patient-specific dosimetry calculations based on simulation techniques have as a prerequisite the modeling of the modality system and the creation of voxelized phantoms. This procedure requires the knowledge of scanning parameters and patients’ information included in a DICOM file as well as image segmentation. However, the extraction of this information is complicated and time-consuming. The objective of this study was to develop a simple graphical user interface (GUI) to (i) automatically extract metadata from every slice image of a DICOM file in a single query and (ii) interactively specify the regions of interest (ROI) without explicit access to the radiology information system. The user-friendly application was developed in the Matlab environment. The user can select a series of DICOM files and manage their text and graphical data. The metadata are automatically formatted and presented to the user as a Microsoft Excel file. The volumetric maps are formed by interactively specifying the ROIs and by assigning a specific value in every ROI. The result is stored in DICOM format, for data and trend analysis. The developed GUI is easy and fast, and constitutes a very useful tool for individualized dosimetry. One of the future goals is to incorporate remote access to a PACS server.

  8. Automated boundary extraction of the spinal canal in MRI based on dynamic programming.

    PubMed

    Koh, Jaehan; Chaudhary, Vipin; Dhillon, Gurmeet

    2012-01-01

    The spinal cord is the only communication link between the brain and the body. The abnormalities in it can lead to severe pain and sometimes to paralysis. Due to the growing gap between the number of available radiologists and the number of required radiologists, the need for computer-aided diagnosis and characterization is increasing. To ease this gap, we have developed a computer-aided diagnosis and characterization framework in lumbar spine that includes the spinal cord, vertebrae, and intervertebral discs. In this paper, we propose two spinal cord boundary extraction methods that fit into our framework based on dynamic programming in lumbar spine MRI. Our method incorporates the intensity of the image and the gradient of the image into a dynamic programming scheme and works in a fully-automatic fashion. The boundaries generated by our method are compared against reference boundaries in terms of the Fréchet distance, which is known to be a metric for shape analysis. The experimental results from 65 clinical datasets show that our method finds the spinal canal boundary correctly, achieving a mean Fréchet distance of 13.5 pixels. For almost all data, the extracted boundary falls within the spinal cord. So, it can be used as a landmark when marking background regions and finding regions of interest.
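The evaluation above uses the Fréchet distance between extracted and reference boundaries. For polylines this is usually computed as the discrete Fréchet distance via the Eiter-Mannila dynamic programme, sketched below on toy pixel coordinates (illustrative only, not the authors' evaluation code):

```python
import math
from functools import lru_cache

def discrete_frechet(P, Q):
    """Discrete Frechet distance (Eiter-Mannila dynamic programme)
    between two polylines given as lists of (x, y) points."""
    def d(i, j):
        return math.dist(P[i], Q[j])

    @lru_cache(maxsize=None)
    def c(i, j):
        if i == 0 and j == 0:
            return d(0, 0)
        if i == 0:
            return max(c(0, j - 1), d(0, j))
        if j == 0:
            return max(c(i - 1, 0), d(i, 0))
        return max(min(c(i - 1, j), c(i - 1, j - 1), c(i, j - 1)), d(i, j))

    return c(len(P) - 1, len(Q) - 1)

# Two boundary traces one pixel apart everywhere -> distance 1.
ref = [(0, 0), (1, 0), (2, 0), (3, 0)]
auto = [(0, 1), (1, 1), (2, 1), (3, 1)]
df = discrete_frechet(ref, auto)
```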

  9. Automated feature extraction and spatial organization of seafloor pockmarks, Belfast Bay, Maine, USA

    USGS Publications Warehouse

    Andrews, B.D.; Brothers, L.L.; Barnhardt, W.A.

    2010-01-01

    Seafloor pockmarks occur worldwide and may represent millions of m³ of continental shelf erosion, but few numerical analyses of pockmark morphology and spatial distribution exist. We introduce a quantitative definition of pockmark morphology and, based on this definition, propose a three-step geomorphometric method to identify and extract pockmarks from high-resolution swath bathymetry. We apply this GIS-implemented approach to 25 km² of bathymetry collected in the Belfast Bay, Maine (USA) pockmark field. Our model extracted 1767 pockmarks and found a linear pockmark depth-to-diameter ratio for pockmarks field-wide. Mean pockmark depth is 7.6 m and mean diameter is 84.8 m. Pockmark distribution is non-random, and nearly half of the field's pockmarks occur in chains. The most prominent chains are oriented semi-normal to the steepest gradient in Holocene sediment thickness. A descriptive model yields field-wide spatial statistics indicating that pockmarks are distributed in non-random clusters. Results enable quantitative comparison of pockmarks in fields worldwide as well as similar concave features, such as impact craters, dolines, or salt pools. © 2010.
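Non-random clustering of point features is often screened with a nearest-neighbour statistic such as the Clark-Evans ratio (observed mean nearest-neighbour distance over the expectation under complete spatial randomness; R < 1 indicates clustering). Whether this is the exact statistic used in the paper is an assumption; the pockmark centres below are hypothetical.

```python
import math

def clark_evans(points, area):
    """Clark-Evans ratio: mean nearest-neighbour distance divided by
    0.5 / sqrt(n / area), the CSR expectation. R < 1 suggests clustering,
    R > 1 regularity."""
    n = len(points)
    mean_nn = sum(
        min(math.dist(p, q) for q in points if q is not p) for p in points
    ) / n
    expected = 0.5 / math.sqrt(n / area)
    return mean_nn / expected

# Hypothetical pockmark centres: a tight chain inside a 1000 x 1000 m area.
chain = [(500 + 5 * i, 500.0) for i in range(10)]
R = clark_evans(chain, area=1000.0 * 1000.0)
```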

  10. A novel approach for automated shoreline extraction from remote sensing images using low level programming

    NASA Astrophysics Data System (ADS)

    Rigos, Anastasios; Vaiopoulos, Aristidis; Skianis, George; Tsekouras, George; Drakopoulos, Panos

    2015-04-01

    Tracking coastline changes is a crucial task in the context of coastal management, and synoptic remotely sensed data have become an essential tool for this purpose. In this work, within the framework of the BeachTour project, we introduce a new method for shoreline extraction from high-resolution satellite images. It was applied to two images taken by the WorldView-2 satellite (7 channels, 2 m resolution) during July 2011 and August 2014. The location is the well-known tourist destination of Laganas beach, spanning 5 km along the southern part of Zakynthos Island, Greece. Atmospheric correction was performed with the ENVI FLAASH procedure and the final images were validated against hyperspectral field measurements. Using three channels (CH2 = blue, CH3 = green and CH7 = near infrared), the Modified Redness Index image was calculated as MRI = (CH7)^2 / [CH2 × (CH3)^3]. MRI has the property that its value keeps increasing as the water becomes shallower, followed by an abrupt decrease at the location of the wet sand up to the point where the dry shore face begins; after that it remains low-valued throughout the beach zone. Images based on this index were used for the shoreline extraction process, which included the following steps: a) On the MRI-based image, only an area near the shoreline was kept (a process known as image masking). b) The Canny edge detector was applied to the masked image. c) Of all edges discovered in step (b), only the biggest was kept. d) If the line revealed in step (c) was unacceptable, i.e. not defining the shoreline or defining only part of it, then either more than one edge was kept in step (c), or the pixel values of the MRI image were bounded to a particular interval [B_low, B_high] and only pixels within this interval were kept; steps (a)-(d) were then repeated. Using this method, which is still under development, we were able to extract the shoreline position and reveal its changes over the 3-year period.
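    The index computation and the interval masking used in the steps above can be sketched with NumPy (a minimal illustration; the function and parameter names are assumptions, not the authors' code):

```python
import numpy as np

def modified_redness_index(ch2, ch3, ch7):
    """MRI = (CH7)^2 / [CH2 * (CH3)^3], guarding against division by zero."""
    denom = ch2 * ch3 ** 3
    return np.divide(ch7 ** 2, denom,
                     out=np.zeros_like(denom, dtype=float),
                     where=denom != 0)

def band_mask(mri, low, high):
    """Keep only pixels whose MRI value falls within [low, high] (step d)."""
    return (mri >= low) & (mri <= high)
```

    The masked index image would then be passed to an edge detector (e.g. a Canny implementation) and the largest connected edge retained, per steps (b) and (c).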

  11. Automated on-line renewable solid-phase extraction-liquid chromatography exploiting multisyringe flow injection-bead injection lab-on-valve analysis.

    PubMed

    Quintana, José Benito; Miró, Manuel; Estela, José Manuel; Cerdà, Víctor

    2006-04-15

    In this paper, the third generation of flow injection analysis, also named the lab-on-valve (LOV) approach, is proposed for the first time as a front end to high-performance liquid chromatography (HPLC) for on-line solid-phase extraction (SPE) sample processing by exploiting the bead injection (BI) concept. The proposed microanalytical system based on discontinuous programmable flow features automated packing (and withdrawal after single use) of a small amount of sorbent (<5 mg) into the microconduits of the flow network and quantitative elution of sorbed species into a narrow band (150 microL of 95% MeOH). The hyphenation of multisyringe flow injection analysis (MSFIA) with BI-LOV prior to HPLC analysis is utilized for on-line postextraction treatment to ensure chemical compatibility between the eluate medium and the initial HPLC gradient conditions. This circumvents the band-broadening effect commonly observed in conventional on-line SPE-based sample processors due to the low eluting strength of the mobile phase. The potential of the novel MSFI-BI-LOV hyphenation for on-line handling of complex environmental and biological samples prior to reversed-phase chromatographic separations was assessed for the expeditious determination of five acidic pharmaceutical residues (viz., ketoprofen, naproxen, bezafibrate, diclofenac, and ibuprofen) and one metabolite (viz., salicylic acid) in surface water, urban wastewater, and urine. To this end, the copolymeric divinylbenzene-co-n-vinylpyrrolidone beads (Oasis HLB) were utilized as renewable sorptive entities in the micromachined unit. The automated analytical method features relative recovery percentages of >88%, limits of detection within the range 0.02-0.67 ng mL(-1), and coefficients of variation <11% for the column renewable mode and gives rise to a drastic reduction in operation costs (approximately 25-fold) as compared to on-line column switching systems. PMID:16615800

  12. Automation or De-automation

    NASA Astrophysics Data System (ADS)

    Gorlach, Igor; Wessel, Oliver

    2008-09-01

    In the global automotive industry, for decades, vehicle manufacturers have continually increased the level of automation of production systems in order to be competitive. However, there is a new trend to decrease the level of automation, especially in final car assembly, for reasons of economy and flexibility. In this research, the final car assembly lines at three production sites of Volkswagen are analysed in order to determine the best level of automation for each, in terms of manufacturing costs, productivity, quality and flexibility. The case study is based on the methodology proposed by the Fraunhofer Institute. The results of the analysis indicate that fully automated assembly systems are not necessarily the best option in terms of cost, productivity and quality combined, which is attributed to the high complexity of final car assembly systems; some de-automation is therefore recommended. On the other hand, the analysis shows that low automation can result in poor product quality for reasons related to plant location, such as inadequate workers' skills, motivation, etc. Hence, the automation strategy should be formulated on the basis of an analysis of all relevant aspects of the manufacturing process, such as costs, quality, productivity and flexibility, in relation to the local context. A more balanced combination of automated and manual assembly operations provides better utilisation of equipment, reduces production costs and improves throughput.

  13. Acquisition of Data for Plasma Simulation by Automated Extraction of Terminology from Article Abstracts

    NASA Astrophysics Data System (ADS)

    Pichl, Lukáš; Suzuki, Manabu; Murata, Masaki; Sasaki, Akira; Kato, Daiji; Murakami, Izumi; Rhee, Yongjoo

    Computer simulation of burning plasmas, as well as computational plasma modeling in image processing, requires accurate data in addition to a relevant model framework. To this end, it is very important to recognize, obtain and evaluate data relevant to such a simulation from the literature. This work focuses on the simultaneous search of relevant data across various online databases, the extraction of cataloguing and numerical information, and the automatic recognition of specific terminology in the retrieved text. The concept is illustrated on the particular terminology of atomic and molecular data relevant to edge plasma simulation. The IAEA search engine GENIE and the NIFS search engine Joint Search 2 are compared and discussed. Accurate modeling of the imaged object is considered the ultimate challenge in improving the resolution limits of plasma imaging.

  14. Robust semi-automated path extraction for visualising stenosis of the coronary arteries.

    PubMed

    Mueller, Daniel; Maeder, Anthony

    2008-09-01

    Computed tomography angiography (CTA) is useful for diagnosing and planning treatment of heart disease. However, contrast agent in surrounding structures (such as the aorta and left ventricle) makes 3D visualisation of the coronary arteries difficult. This paper presents a composite method employing segmentation and volume rendering to overcome this issue. A key contribution is a novel Fast Marching minimal path cost function for vessel centreline extraction. The resultant centreline is used to compute a measure of vessel lumen, which indicates the degree of stenosis (narrowing of a vessel). Two volume visualisation techniques are presented which utilise the segmented arteries and lumen measure. The system is evaluated and demonstrated using synthetic and clinically obtained datasets. PMID:18603408
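    The paper's Fast Marching cost function is not reproduced in the abstract, but the underlying minimal-path idea (accumulate a vesselness-derived cost over the cheapest route between two seed points) can be illustrated with a Dijkstra search on a cost grid. This is a sketch of the general technique, not the authors' method:

```python
import heapq

def minimal_path_cost(cost, start, goal):
    """Cheapest accumulated cost between two grid cells (4-connected Dijkstra).

    `cost` is a 2-D list of per-pixel costs (low inside vessels, high outside);
    `start` and `goal` are (row, col) seed points."""
    rows, cols = len(cost), len(cost[0])
    best = {start: cost[start[0]][start[1]]}
    heap = [(best[start], start)]
    while heap:
        c, (r, col) = heapq.heappop(heap)
        if (r, col) == goal:
            return c
        if c > best[(r, col)]:
            continue  # stale heap entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, ncol = r + dr, col + dc
            if 0 <= nr < rows and 0 <= ncol < cols:
                new_cost = c + cost[nr][ncol]
                if new_cost < best.get((nr, ncol), float("inf")):
                    best[(nr, ncol)] = new_cost
                    heapq.heappush(heap, (new_cost, (nr, ncol)))
    return float("inf")
```

    In a centreline extractor, the recovered path (not just its cost) would be kept by storing predecessors; Fast Marching replaces the discrete graph search with a continuous eikonal solver but follows the same principle.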

  15. Kernel-Based Learning for Domain-Specific Relation Extraction

    NASA Astrophysics Data System (ADS)

    Basili, Roberto; Giannone, Cristina; Del Vescovo, Chiara; Moschitti, Alessandro; Naggar, Paolo

    In a specific process of business intelligence, i.e. investigation of organized crime, empirical language processing technologies can play a crucial role. The analysis of transcriptions of investigative activities, such as police interrogations, for the recognition and storage of complex relations among people and locations is a very difficult and time-consuming task, ultimately relying on pools of experts. We discuss here an inductive relation extraction platform that opens the way to much cheaper and more consistent workflows. The empirical investigation presented shows that accurate results, comparable to those of the expert teams, can be achieved, and that parametrization allows the system behavior to be fine-tuned to fit domain-specific requirements.

  16. Dried blood spot proteomics: surface extraction of endogenous proteins coupled with automated sample preparation and mass spectrometry analysis.

    PubMed

    Martin, Nicholas J; Bunch, Josephine; Cooper, Helen J

    2013-08-01

    Dried blood spots offer many advantages as a sample format, including ease and safety of transport and handling. To date, the majority of mass spectrometry analyses of dried blood spots have focused on small molecules or hemoglobin. However, dried blood spots are a potentially rich source of protein biomarkers, an area that has been overlooked. To address this issue, we have applied an untargeted bottom-up proteomics approach to the analysis of dried blood spots. We present an automated and integrated method for extraction of endogenous proteins from the surface of dried blood spots and sample preparation via trypsin digestion using the Advion Biosciences Triversa Nanomate robotic platform. Liquid chromatography tandem mass spectrometry of the resulting digests enabled identification of 120 proteins from a single dried blood spot. The proteins identified span a concentration range of four orders of magnitude. The method is evaluated and the results discussed in terms of the proteins identified and their potential use as biomarkers in screening programs.

  18. Automated and portable solid phase extraction platform for immuno-detection of 17β-estradiol in water.

    PubMed

    Heub, Sarah; Tscharner, Noe; Monnier, Véronique; Kehl, Florian; Dittrich, Petra S; Follonier, Stéphane; Barbe, Laurent

    2015-02-13

    A fully automated and portable system for solid phase extraction (SPE) has been developed for the analysis of the natural hormone 17β-estradiol (E2) in environmental water by enzyme linked immuno-sorbent assay (ELISA). The system has been validated with de-ionized and artificial sea water as model samples and allowed for pre-concentration of E2 at levels of 1, 10 and 100 ng/L with only 100 mL of sample. Recoveries ranged from 24±3% to 107±6% depending on the concentration and sample matrix. The method successfully allowed us to determine the concentration of two seawater samples. A concentration of 15.1±0.3 ng/L of E2 was measured in a sample obtained from a food production process, and 8.8±0.7 ng/L in a sample from the Adriatic Sea. The system would be suitable for continuous monitoring of water quality, as it is user friendly and the method is reproducible and fully compatible with the analysis of water samples by simple immunoassays and other detection methods such as biosensors. PMID:25604269

  19. Automated extraction of absorption features from Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) and Geophysical and Environmental Research Imaging Spectrometer (GERIS) data

    NASA Technical Reports Server (NTRS)

    Kruse, Fred A.; Calvin, Wendy M.; Seznec, Olivier

    1988-01-01

    Automated techniques were developed for the extraction and characterization of absorption features from reflectance spectra. The absorption feature extraction algorithms were successfully tested on laboratory, field, and aircraft imaging spectrometer data. A suite of laboratory spectra of the most common minerals was analyzed and absorption band characteristics tabulated. A prototype expert system was designed, implemented, and successfully tested to allow identification of minerals based on the extracted absorption band characteristics. AVIRIS spectra for a site in the northern Grapevine Mountains, Nevada, have been characterized and the minerals sericite (fine grained muscovite) and dolomite were identified. The minerals kaolinite, alunite, and buddingtonite were identified and mapped for a site at Cuprite, Nevada, using the feature extraction algorithms on the new Geophysical and Environmental Research 64 channel imaging spectrometer (GERIS) data. The feature extraction routines (written in FORTRAN and C) were interfaced to the expert system (written in PROLOG) to allow both efficient processing of numerical data and logical spectrum analysis.

  20. Automated Feature Extraction in Brain Tumor by Magnetic Resonance Imaging Using Gaussian Mixture Models

    PubMed Central

    Chaddad, Ahmad

    2015-01-01

    This paper presents a novel method for glioblastoma (GBM) feature extraction based on Gaussian mixture model (GMM) features using MRI. We addressed the task of using the new features to identify GBM in T1- and T2-weighted images (T1-WI, T2-WI) and Fluid-Attenuated Inversion Recovery (FLAIR) MR images. A pathologic area was detected using multithresholding segmentation with morphological operations on the MR images. Multiclassifier techniques were used to evaluate the performance of the feature-based scheme in terms of its capability to discriminate GBM from normal tissue. GMM features demonstrated the best performance in a comparative study against principal component analysis (PCA) and wavelet-based features. For T1-WI, the accuracy was 97.05% (AUC = 92.73%) with 0.00% missed detections and 2.95% false alarms. For T2-WI, the same accuracy (97.05%, AUC = 91.70%) was achieved with 2.95% missed detections and 0.00% false alarms. In FLAIR mode the accuracy decreased to 94.11% (AUC = 95.85%) with 0.00% missed detections and 5.89% false alarms. These experimental results are promising for characterizing tumor heterogeneity and hence for earlier treatment of GBM. PMID:26136774
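    The abstract does not specify how the GMM features are fitted; as a generic illustration of fitting a Gaussian mixture to a 1-D intensity sample, here is a minimal EM sketch (NumPy only; the quantile initialisation is an assumption, not the authors' procedure):

```python
import numpy as np

def fit_gmm_1d(x, k=2, iters=50):
    """Plain EM for a 1-D Gaussian mixture: returns means, variances, weights."""
    x = np.asarray(x, dtype=float)
    mu = np.quantile(x, np.linspace(0.0, 1.0, k))  # spread initial means over the data range
    var = np.full(k, x.var())
    w = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each sample
        pdf = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2.0 * np.pi * var)
        resp = w * pdf
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from the responsibilities
        n = resp.sum(axis=0)
        mu = (resp * x[:, None]).sum(axis=0) / n
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / n
        w = n / x.size
    return mu, var, w
```

    On MR intensities, the fitted component means, variances and weights would serve as the per-image feature vector fed to the classifiers.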

  1. Automated Semantic Indices Related to Cognitive Function and Rate of Cognitive Decline

    PubMed Central

    Pakhomov, Serguei V.S.; Hemmy, Laura S.; Lim, Kelvin O.

    2012-01-01

    The objective of our study is to introduce a fully automated, computational linguistic technique to quantify semantic relations between words generated on a standard semantic verbal fluency test and to determine its cognitive and clinical correlates. Cognitive differences between patients with Alzheimer’s disease and mild cognitive impairment are evident in their performance on the semantic verbal fluency test. In addition to the semantic verbal fluency test score, several other performance characteristics sensitive to disease status and predictive of future cognitive decline have been defined in terms of words generated from semantically related categories (clustering) and shifting between categories (switching). However, the traditional assessment of clustering and switching has been performed manually in a qualitative fashion, resulting in subjective scoring with limited reproducibility and scalability. Our approach uses word definitions and hierarchical relations between the words in WordNet®, a large electronic lexical database, to quantify the degree of semantic similarity and relatedness between words. We investigated the novel semantic fluency indices of mean cumulative similarity and relatedness between all pairs of words regardless of their order, and mean sequential similarity and relatedness between pairs of adjacent words, in a sample of patients with clinically diagnosed probable (n=55) or possible (n=27) Alzheimer’s disease or mild cognitive impairment (n=31). The semantic fluency indices differed significantly between the diagnostic groups, and were strongly associated with neuropsychological tests of executive function, as well as the rate of global cognitive decline. Our results suggest that word meanings and relations between words shared across individuals and computationally modeled via WordNet and large text corpora provide the necessary context to account for the variability in language-based behavior and relate it to cognitive dysfunction.
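    The cumulative and sequential indices described above can be sketched generically. The toy is-a taxonomy and path-based similarity below are stand-ins for WordNet (illustrative assumptions, not the authors' measures):

```python
from itertools import combinations

# Toy is-a taxonomy standing in for WordNet's hypernym hierarchy (hypothetical).
parent = {"dog": "canine", "wolf": "canine", "canine": "mammal",
          "cat": "feline", "feline": "mammal", "mammal": "animal"}

def ancestors(word):
    """The word followed by its chain of hypernyms up to the root."""
    chain = [word]
    while word in parent:
        word = parent[word]
        chain.append(word)
    return chain

def path_similarity(a, b):
    """1 / (1 + shortest is-a path length), a WordNet-style path measure."""
    anc_a, anc_b = ancestors(a), ancestors(b)
    common = set(anc_a) & set(anc_b)
    if not common:
        return 0.0
    hops = min(anc_a.index(c) + anc_b.index(c) for c in common)
    return 1.0 / (1.0 + hops)

def mean_cumulative_similarity(words):
    """Mean similarity over all unordered word pairs (order-independent index)."""
    pairs = list(combinations(words, 2))
    return sum(path_similarity(a, b) for a, b in pairs) / len(pairs)

def mean_sequential_similarity(words):
    """Mean similarity between adjacent words (order-sensitive index)."""
    return sum(path_similarity(a, b) for a, b in zip(words, words[1:])) / (len(words) - 1)
```

    With real fluency data, `path_similarity` would be replaced by a WordNet-backed measure (e.g. NLTK's synset path similarity) applied to the words a patient produced.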

  2. Development of an automated method for Folin-Ciocalteu total phenolic assay in artichoke extracts.

    PubMed

    Yoo, Kil Sun; Lee, Eun Jin; Leskovar, Daniel; Patil, Bhimanagouda S

    2012-12-01

    We developed a system to run the Folin-Ciocalteu (F-C) total phenolic assay on artichoke extract samples that is fully automatic, consistent, and fast. The system uses two high performance liquid chromatography (HPLC) pumps, an autosampler, a column heater, a UV/Vis detector, and a data collection system. To test the system, a pump delivered 10-fold diluted F-C reagent solution at a rate of 0.7 mL/min, and 0.4 g/mL sodium carbonate at a rate of 2.1 mL/min. The autosampler injected 10 μL per 1.2 min, which was mixed with the F-C reagent and heated to 65 °C while passing through the column heater. The heated reactant was mixed with sodium carbonate and the color intensity was measured by the detector at 600 nm. The data collection system recorded the color intensity, and the peak area of each sample was calculated as the total phenolic content, expressed in μg/mL as either chlorogenic acid or gallic acid. This new method had superb repeatability (0.7% CV) and a high correlation with both the manual method (r(2) = 0.93) and the HPLC method (r(2) = 0.78). Ascorbic acid and quercetin showed variable antioxidant activity, but sugars did not. This method can be efficiently applied to research that needs to test large numbers of samples for antioxidant capacity with speed and accuracy. PMID:23163965

  3. Automated oral cancer identification using histopathological images: a hybrid feature extraction paradigm.

    PubMed

    Krishnan, M Muthu Rama; Venkatraghavan, Vikram; Acharya, U Rajendra; Pal, Mousumi; Paul, Ranjan Rashmi; Min, Lim Choo; Ray, Ajoy Kumar; Chatterjee, Jyotirmoy; Chakraborty, Chandan

    2012-02-01

    Oral cancer (OC) is the sixth most common cancer in the world. In India it is the most common malignant neoplasm. Histopathological images have been widely used in the differential diagnosis of normal, oral precancerous (oral submucous fibrosis (OSF)) and cancerous lesions. However, this technique is limited by subjective interpretation and less accurate diagnosis. The objective of this work is to improve classification accuracy based on textural features in the development of computer-assisted screening of OSF. The approach introduced here grades histopathological tissue sections into normal, OSF without dysplasia (OSFWD) and OSF with dysplasia (OSFD), which would help oral onco-pathologists to screen subjects rapidly. The biopsy sections are stained with H&E. The optical density of the pixels in the light microscopic images is recorded and represented as a matrix quantized as integers from 0 to 255 for each fundamental color (red, green, blue), resulting in an M×N×3 matrix of integers. Depending on the condition (normal or OSF), the image contains various granular structures, which are self-similar patterns at different scales termed "texture". We extracted these textural changes using Higher Order Spectra (HOS), Local Binary Pattern (LBP), and Laws Texture Energy (LTE) from the histopathological images (normal, OSFWD and OSFD). These feature vectors were fed to five different classifiers: Decision Tree (DT), Sugeno fuzzy, Gaussian Mixture Model (GMM), K-Nearest Neighbor (K-NN) and Radial Basis Probabilistic Neural Network (RBPNN), to select the best classifier. Our results show that the combination of texture and HOS features coupled with the fuzzy classifier resulted in 95.7% accuracy, with sensitivity and specificity of 94.5% and 98.8%, respectively. Finally, we have proposed a novel integrated index called the Oral Malignancy Index (OMI) using the HOS, LBP and LTE features, to diagnose benign or malignant tissues using just one number. We hope that this OMI can
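    Of the three texture descriptors named above, LBP is the simplest to sketch. A minimal 8-neighbour variant in NumPy (an illustration of the descriptor, not the authors' implementation):

```python
import numpy as np

def lbp_8(image):
    """Basic 8-neighbour Local Binary Pattern codes for the interior pixels.

    Each interior pixel gets an 8-bit code: one bit per neighbour, set when
    the neighbour's intensity is >= the centre's. Histograms of these codes
    form the texture feature vector."""
    img = np.asarray(image, dtype=float)
    center = img[1:-1, 1:-1]
    # neighbour offsets in clockwise order, each contributing one bit
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(center, dtype=np.uint8)
    for bit, (dr, dc) in enumerate(offsets):
        neigh = img[1 + dr:img.shape[0] - 1 + dr, 1 + dc:img.shape[1] - 1 + dc]
        codes |= (neigh >= center).astype(np.uint8) << bit
    return codes
```

    A flat region yields the all-ones code (255), while a local maximum yields 0; the distribution of codes over a tissue patch characterises its texture.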

  4. Automation of static and dynamic non-dispersive liquid phase microextraction. Part 1: Approaches based on extractant drop-, plug-, film- and microflow-formation.

    PubMed

    Alexovič, Michal; Horstkotte, Burkhard; Solich, Petr; Sabo, Ján

    2016-02-01

    Simplicity, effectiveness, swiftness, and environmental friendliness - these are the typical requirements for state-of-the-art development of green analytical techniques. Liquid phase microextraction (LPME) stands for a family of elegant sample pretreatment and analyte preconcentration techniques preserving these principles in numerous applications. By using only fractions of solvent and sample compared to classical liquid-liquid extraction, the extraction kinetics, the preconcentration factor, and the cost efficiency can be increased. Moreover, significant improvements can be made by automation, which is still a hot topic in analytical chemistry. This review comprehensively surveys, in two parts, the development of automation of non-dispersive LPME methodologies performed in static and dynamic modes. Their advantages and limitations and the reported analytical performances are discussed and put into perspective with the corresponding manual procedures. The automation strategies, techniques, and their operational advantages as well as their potential are further described and discussed. In this first part, an introduction to LPME, its static and dynamic operation modes, and its automation methodologies is given. The LPME techniques are classified according to the different approaches to protecting the extraction solvent, using either a tip-like (needle/tube/rod) support (drop-based approaches), a wall support (film-based approaches), or microfluidic devices. In the second part, the LPME techniques based on porous supports for the extraction solvent, such as membranes and porous media, are overviewed. An outlook on future demands and perspectives in this promising area of analytical chemistry is finally given. PMID:26772123

  5. Automated extraction of pressure ridges from SAR images of sea ice - Comparison with surface truth

    NASA Technical Reports Server (NTRS)

    Vesecky, J. F.; Smith, M. P.; Samadani, R.; Daida, J. M.; Comiso, J. C.

    1991-01-01

    The authors estimate the characteristics of ridges and leads in sea ice from SAR (synthetic aperture radar) images. Such estimates are based on the hypothesis that bright filamentary features in SAR sea ice images correspond to pressure ridges. A data set collected in the Greenland Sea in 1987 allows this hypothesis to be evaluated for X-band SAR images. A preliminary analysis of data collected from SAR images and ice elevation (from a laser altimeter) is presented. It is found that SAR image brightness and ice elevation are clearly related. However, the correlation, using the data and techniques applied, is not strong.

  6. Automated solid-phase extraction and quantitative analysis of 14 phthalate metabolites in human serum using isotope dilution-high-performance liquid chromatography-tandem mass spectrometry.

    PubMed

    Silva, Manori J; Samandar, Ella; Preau, James L; Reidy, John A; Needham, Larry L; Calafat, Antonia

    2005-01-01

    Phthalates are industrial chemicals with many commercial applications. Because of their common usage, the general population is exposed to phthalates. A sensitive and selective analytical method is necessary to accurately determine the phthalate levels in serum. We improved our previously developed analytical method to measure nine phthalate metabolites in human serum by automating the solid-phase extraction (SPE) procedure and by including five additional phthalate metabolites: phthalic acid; mono-isobutyl phthalate, a metabolite of di-isobutyl phthalate; mono-(3-carboxypropyl) phthalate, a major oxidative metabolite of di-n-octyl phthalate; and mono-(2-ethyl-5-oxohexyl) phthalate and mono-(2-ethyl-5-hydroxyhexyl) phthalate, two oxidative metabolites of di-(2-ethylhexyl) phthalate. Automation of the SPE eliminated the human variation associated with the manual SPE, thus improving the reproducibility of the measurements. Additional wash steps during SPE produced cleaner extracts and resulted in higher recoveries (80-99%) than the manual SPE method. Furthermore, the automated SPE method allowed for the unattended extraction of samples, with a concomitant increase in sample throughput compared to the manual SPE method. The method is accurate, precise, and sensitive, with limits of detection in the low nanogram-per-milliliter range.

  7. Superheated liquid extraction of oleuropein and related biophenols from olive leaves.

    PubMed

    Japón-Luján, R; Luque de Castro, M D

    2006-12-15

    Oleuropein and other healthy olive biophenols (OBPs) such as verbacoside, apigenin-7-glucoside and luteolin-7-glucoside have been extracted from olive leaves by using superheated liquids and a static-dynamic approach. Multivariate methodology has been used to carry out a detailed optimisation of the extraction. Under the optimal working conditions, complete removal without degradation of the target analytes was achieved in 13 min. The extract was injected into a chromatograph-photodiode array detector assembly for individual separation-quantification. The proposed approach - which provides more concentrated extracts than previous alternatives - is very useful to study matrix-extractant analyte partition. In addition, the efficacy of superheated liquids to extract OBPs, the simplicity of the experimental setup, its easy automation and low acquisition and maintenance costs make the industrial implementation of the proposed method advisable. PMID:17045596

  9. AUTOMATED ANALYSIS OF AQUEOUS SAMPLES CONTAINING PESTICIDES, ACIDIC/BASIC/NEUTRAL SEMIVOLATILES AND VOLATILE ORGANIC COMPOUNDS BY SOLID PHASE EXTRACTION COUPLED IN-LINE TO LARGE VOLUME INJECTION GC/MS

    EPA Science Inventory

    Data is presented on the development of a new automated system combining solid phase extraction (SPE) with GC/MS spectrometry for the single-run analysis of water samples containing a broad range of organic compounds. The system uses commercially available automated in-line 10-m...

  10. Extracted facial feature of racial closely related faces

    NASA Astrophysics Data System (ADS)

    Liewchavalit, Chalothorn; Akiba, Masakazu; Kanno, Tsuneo; Nagao, Tomoharu

    2010-02-01

    Human faces contain a great deal of demographic information such as identity, gender, age, race and emotion. Human beings can perceive these pieces of information and use them as important clues in social interaction with other people. Race perception is considered one of the most delicate and sensitive parts of face perception. There is much research concerning image-based race recognition, but most of it focuses on major race groups such as Caucasoid, Negroid and Mongoloid. This paper focuses on how people classify race within closely related racial groups. As a sample of such a group, we chose Japanese and Thai faces to represent the difference between Northern and Southern Mongoloid. Three psychological experiments were performed to study the strategies of face perception in race classification. The results suggest that race perception is an ability that can be learned. Eyes and eyebrows attract the most attention, and the eyes are a significant factor in race perception. Principal Component Analysis (PCA) was performed to extract facial features of the sample race groups. Extracted racial features of texture and shape were used to synthesize faces. The results suggest that racial features rely on detailed texture rather than on shape. This is fundamental research on race perception, which is essential for the establishment of a human-like race recognition system.
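    The PCA step, extracting feature axes from vectorised face images, can be sketched as an SVD on mean-centred sample vectors (a generic illustration, not the study's pipeline):

```python
import numpy as np

def pca(X, n_components):
    """PCA via SVD: rows of X are samples (e.g. flattened face images), columns features."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:n_components]   # principal axes, ordered by explained variance
    scores = Xc @ components.T       # per-sample projections ("face features")
    return components, scores
```

    The `scores` are the low-dimensional facial features; faces can be synthesised by adding a linear combination of `components` back to the mean face.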

  11. Evaluation of three automated nucleic acid extraction systems for identification of respiratory viruses in clinical specimens by multiplex real-time PCR.

    PubMed

    Kim, Yoonjung; Han, Mi-Soon; Kim, Juwon; Kwon, Aerin; Lee, Kyung-A

    2014-01-01

    A total of 84 nasopharyngeal swab specimens were collected from 84 patients. Viral nucleic acid was extracted by three automated extraction systems: QIAcube (Qiagen, Germany), EZ1 Advanced XL (Qiagen), and MICROLAB Nimbus IVD (Hamilton, USA). Fourteen RNA viruses and two DNA viruses were detected using the Anyplex II RV16 Detection kit (Seegene, Republic of Korea). The EZ1 Advanced XL system demonstrated the best analytical sensitivity for all three viral strains, and the nucleic acids it extracted showed higher positive rates for virus detection than the others. Meanwhile, the MICROLAB Nimbus IVD system comprises fully automated steps from nucleic acid extraction to PCR setup, which could reduce human error. For nucleic acids recovered from nasopharyngeal swab specimens, the QIAcube system showed the fewest false-negative results and the best concordance rate, and it may be more suitable for detecting various viruses, including RNA and DNA virus strains. Each system showed different sensitivity and specificity for the detection of certain viral pathogens and demonstrated different characteristics such as turnaround time and sample capacity. These factors should therefore be considered when new nucleic acid extraction systems are introduced to the laboratory.

  12. The ValleyMorph Tool: An automated extraction tool for transverse topographic symmetry (T-) factor and valley width to valley height (Vf-) ratio

    NASA Astrophysics Data System (ADS)

    Daxberger, Heidi; Dalumpines, Ron; Scott, Darren M.; Riller, Ulrich

    2014-09-01

    In tectonically active regions on Earth, shallow-crustal deformation associated with seismic hazards may pose a threat to human life and property. The study of landform development, such as analysis of the valley width to valley height ratio (Vf-ratio) and the Transverse Topographic Symmetry Factor (T-factor), which delineates drainage basin symmetry, can be used as a relative measure of tectonic activity along fault-bound mountain fronts. The fast evolution of digital elevation models (DEMs) provides an ideal base for remotely sensed tectonomorphic studies of large areas using Geographical Information Systems (GIS). However, manual extraction of the above-mentioned morphologic parameters is tedious and very time consuming, and basic GIS software suites do not provide the necessary built-in functions. Therefore, we present a newly developed, Python-based, ESRI ArcGIS-compatible tool and stand-alone script, the ValleyMorph Tool. This tool facilitates automated extraction of the Vf-ratio and T-factor data for large regions. Using a digital elevation raster and watershed polygon files as input, the tool provides output in the form of several ArcGIS data tables and shapefiles, ideal for further data manipulation and computation. The coding enables easy application among the ArcGIS user community and code conversion to earlier ArcGIS versions. The ValleyMorph Tool is easy to use due to a simple graphical user interface. The tool is tested for the southern Central Andes using a total of 3366 watersheds.
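The two indices the tool extracts have simple closed forms. The sketch below assumes the standard definitions of the Vf-ratio (Bull and McFadden) and T-factor (Cox); the variable names are ours, not the tool's:

```python
def vf_ratio(vfw, eld, erd, esc):
    """Valley floor width to valley height ratio:
    Vf = 2*Vfw / ((Eld - Esc) + (Erd - Esc)), where Vfw is the valley
    floor width, Eld/Erd are the left/right divide elevations, and Esc
    is the valley floor elevation (all in the same units)."""
    return 2.0 * vfw / ((eld - esc) + (erd - esc))

def t_factor(da, dd):
    """Transverse Topographic Symmetry Factor T = Da/Dd, where Da is the
    distance from the basin midline to the active channel and Dd the
    distance from the midline to the basin divide; T = 0 for a fully
    symmetric basin, approaching 1 with increasing asymmetry."""
    return da / dd

# A narrow, V-shaped valley yields a low Vf-ratio (tectonically active front):
vf = vf_ratio(vfw=50.0, eld=900.0, erd=880.0, esc=400.0)  # ~0.102
t = t_factor(da=120.0, dd=400.0)                          # 0.3
```

The tool automates measuring these inputs (widths, elevations, distances) from the DEM and watershed polygons; the arithmetic itself is this simple.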

  13. Submicrometric Magnetic Nanoporous Carbons Derived from Metal-Organic Frameworks Enabling Automated Electromagnet-Assisted Online Solid-Phase Extraction.

    PubMed

    Frizzarin, Rejane M; Palomino Cabello, Carlos; Bauzà, Maria Del Mar; Portugal, Lindomar A; Maya, Fernando; Cerdà, Víctor; Estela, José M; Turnes Palomino, Gemma

    2016-07-19

    We present the first application of submicrometric magnetic nanoporous carbons (μMNPCs) as sorbents for automated solid-phase extraction (SPE). Small zeolitic imidazolate framework-67 crystals are obtained at room temperature and directly carbonized under an inert atmosphere to obtain submicrometric nanoporous carbons containing magnetic cobalt nanoparticles. The μMNPCs have a high contact area, high stability, and their preparation is simple and cost-effective. The prepared μMNPCs are exploited as sorbents in a microcolumn format in a sequential injection analysis (SIA) system with online spectrophotometric detection, which includes a specially designed three-dimensional (3D)-printed holder containing an automatically actuated electromagnet. The combined action of permanent magnets and an automatically actuated electromagnet enabled the movement of the solid bed of particles inside the microcolumn, preventing their aggregation, increasing the versatility of the system, and increasing the preconcentration efficiency. The method was optimized using a full factorial design and Doehlert Matrix. The developed system was applied to the determination of anionic surfactants, exploiting the retention of the ion-pairs formed with Methylene Blue on the μMNPC. Using sodium dodecyl sulfate as a model analyte, quantification was linear from 50 to 1000 μg L(-1), and the detection limit was equal to 17.5 μg L(-1), the coefficient of variation (n = 8; 100 μg L(-1)) was 2.7%, and the analysis throughput was 13 h(-1). The developed approach was applied to the determination of anionic surfactants in water samples (natural water, groundwater, and wastewater), yielding recoveries of 93% to 110% (95% confidence level). PMID:27336802

  15. MG-Digger: An Automated Pipeline to Search for Giant Virus-Related Sequences in Metagenomes.

    PubMed

    Verneau, Jonathan; Levasseur, Anthony; Raoult, Didier; La Scola, Bernard; Colson, Philippe

    2016-01-01

    The number of metagenomic studies conducted each year is growing dramatically. Storage and analysis of such big data is difficult and time-consuming. Interestingly, analysis shows that environmental and human metagenomes include a significant amount of non-annotated sequences, representing a 'dark matter.' We established a bioinformatics pipeline that automatically detects metagenome reads matching query sequences from a given set and applied this tool to the detection of sequences matching large and giant DNA viral members of the proposed order Megavirales or virophages. A total of 1,045 environmental and human metagenomes (≈ 1 Terabase) were collected, processed, and stored on our bioinformatics server. In addition, nucleotide and protein sequences from 93 Megavirales representatives, including 19 giant viruses of amoeba, and 5 virophages, were collected. The pipeline was generated by scripts written in Python language and entitled MG-Digger. Metagenomes previously found to contain megavirus-like sequences were tested as controls. MG-Digger was able to annotate 100s of metagenome sequences as best matching those of giant viruses. These sequences were most often found to be similar to phycodnavirus or mimivirus sequences, but included reads related to recently available pandoraviruses, Pithovirus sibericum, and faustoviruses. Compared to other tools, MG-Digger combined stand-alone use on Linux or Windows operating systems through a user-friendly interface, implementation of ready-to-use customized metagenome databases and query sequence databases, adjustable parameters for BLAST searches, and creation of output files containing selected reads with best match identification. Compared to Metavir 2, a reference tool in viral metagenome analysis, MG-Digger detected 8% more true positive Megavirales-related reads in a control metagenome. The present work shows that massive, automated and recurrent analyses of metagenomes are effective in improving knowledge about the
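As a rough illustration of the post-BLAST filtering such a pipeline performs, the sketch below keeps each read's single best hit from tabular BLAST output (`-outfmt 6`). The thresholds and example rows are illustrative assumptions, not MG-Digger's actual parameters:

```python
def best_hits(blast_lines, min_identity=50.0, max_evalue=1e-3):
    """Keep each read's single best hit (highest bit score) from tabular
    BLAST output (-outfmt 6 columns: qseqid sseqid pident length mismatch
    gapopen qstart qend sstart send evalue bitscore)."""
    best = {}
    for line in blast_lines:
        f = line.rstrip("\n").split("\t")
        qid, sid = f[0], f[1]
        pident, evalue, bits = float(f[2]), float(f[10]), float(f[11])
        if pident < min_identity or evalue > max_evalue:
            continue  # discard weak or insignificant alignments
        if qid not in best or bits > best[qid][1]:
            best[qid] = (sid, bits)
    return best

# Hypothetical rows: the stronger mimivirus alignment wins for read1.
rows = [
    "read1\tmimivirus\t92.1\t100\t5\t0\t1\t100\t1\t100\t1e-30\t180.0",
    "read1\tphage\t60.0\t100\t5\t0\t1\t100\t1\t100\t1e-5\t80.0",
]
hits = best_hits(rows)
```

The real pipeline adds database construction, adjustable BLAST parameters, and output-file generation around this core selection step.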

  18. Satellite mapping and automated feature extraction: Geographic information system-based change detection of the Antarctic coast

    NASA Astrophysics Data System (ADS)

    Kim, Kee-Tae

    Declassified Intelligence Satellite Photograph (DISP) data are important resources for measuring the geometry of the coastline of Antarctica. Using state-of-the-art digital imaging technology and bundle block triangulation based on tie points and control points derived from a RADARSAT-1 Synthetic Aperture Radar (SAR) image mosaic and the Ohio State University (OSU) Antarctic digital elevation model (DEM), the individual DISP images were accurately assembled into a map-quality mosaic of Antarctica as it appeared in 1963. The new map is an important benchmark for gauging the response of the Antarctic coastline to changing climate. Automated coastline extraction algorithm design is the second theme of this dissertation. At the pre-processing stage, adaptive neighborhood filtering was used to remove film-grain noise while preserving edge features. At the segmentation stage, an adaptive Bayesian approach to image segmentation was used to split the DISP imagery into its homogeneous regions, in which the fuzzy c-means clustering (FCM) technique and a Gibbs random field (GRF) model were introduced to estimate the conditional and prior probability density functions. A Gaussian mixture model was used to estimate reliable initial values for the FCM technique. At the post-processing stage, image object formation and labeling, removal of noisy image objects, and vectorization algorithms were sequentially applied to the segmented images to extract a vector representation of coastlines. Results were presented that demonstrate the effectiveness of the algorithm in segmenting the DISP data. For cloud-covered and low-contrast scenes, manual editing was carried out based on intermediate image processing and visual inspection in comparison with old paper maps. Through a geographic information system (GIS), the derived DISP coastline data were integrated with earlier and later data to assess continental-scale changes in the Antarctic coast. Computing the area of

  19. Effects of a psychophysiological system for adaptive automation on performance, workload, and the event-related potential P300 component

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J 3rd; Freeman, Frederick G.; Scerbo, Mark W.; Mikulka, Peter J.; Pope, Alan T.

    2003-01-01

    The present study examined the effects of an electroencephalographic- (EEG-) based system for adaptive automation on tracking performance and workload. In addition, event-related potentials (ERPs) to a secondary task were derived to determine whether they would provide an additional degree of workload specificity. Participants were run in an adaptive automation condition, in which the system switched between manual and automatic task modes based on the value of each individual's own EEG engagement index; a yoked control condition; or another control group, in which task mode switches followed a random pattern. Adaptive automation improved performance and resulted in lower levels of workload. Further, the P300 component of the ERP paralleled the sensitivity to task demands of the performance and subjective measures across conditions. These results indicate that it is possible to improve performance with a psychophysiological adaptive automation system and that ERPs may provide an alternative means for distinguishing among levels of cognitive task demand in such systems. Actual or potential applications of this research include improved methods for assessing operator workload and performance.

  20. Determination of Low Concentrations of Acetochlor in Water by Automated Solid-Phase Extraction and Gas Chromatography with Mass-Selective Detection

    USGS Publications Warehouse

    Lindley, C.E.; Stewart, J.T.; Sandstrom, M.W.

    1996-01-01

    A sensitive and reliable gas chromatographic/mass spectrometric (GC/MS) method for determining acetochlor in environmental water samples was developed. The method involves automated extraction of the herbicide from a filtered 1 L water sample through a C18 solid-phase extraction column, elution from the column with hexane-isopropyl alcohol (3 + 1), and concentration of the extract with nitrogen gas. The herbicide is quantitated by capillary-column GC/MS with selected-ion monitoring of 3 characteristic ions. The single-operator method detection limit for reagent water samples is 0.0015 μg/L. Mean recoveries ranged from about 92 to 115% for 3 water matrixes fortified at 0.05 and 0.5 μg/L. Average single-operator precision, over the course of 1 week, was better than 5%.

  1. A quantitative measure for degree of automation and its relation to system performance and mental load.

    PubMed

    Wei, Z G; Macwan, A P; Wieringa, P A

    1998-06-01

    In this paper we quantitatively model degree of automation (DofA) in supervisory control as a function of the number and nature of tasks to be performed by the operator and automation. This model uses a task weighting scheme in which weighting factors are obtained from task demand load, task mental load, and task effect on system performance. The computation of DofA is demonstrated using an experimental system. Based on controlled experiments using operators, analyses of the task effect on system performance, the prediction and assessment of task demand load, and the prediction of mental load were performed. Each experiment had a different DofA. The effect of a change in DofA on system performance and mental load was investigated. It was found that system performance became less sensitive to changes in DofA at higher levels of DofA. The experimental data showed that when the operator controlled a partly automated system, perceived mental load could be predicted from the task mental load for each task component, as calculated by analyzing a situation in which all tasks were manually controlled. Actual or potential applications of this research include a methodology to balance and optimize the automation of complex industrial systems.

  2. On the Relation between Automated Essay Scoring and Modern Views of the Writing Construct

    ERIC Educational Resources Information Center

    Deane, Paul

    2013-01-01

    This paper examines the construct measured by automated essay scoring (AES) systems. AES systems measure features of the text structure, linguistic structure, and conventional print form of essays; as such, the systems primarily measure text production skills. In the current state-of-the-art, AES provide little direct evidence about such matters…

  3. Extraction of text-related features for condensing image documents

    NASA Astrophysics Data System (ADS)

    Bloomberg, Dan S.; Chen, Francine R.

    1996-03-01

    A system has been built that selects excerpts from a scanned document for presentation as a summary, without using character recognition. The method relies on the idea that the most significant sentences in a document contain words that are both specific to the document and have a relatively high frequency of occurrence within it. Accordingly, and entirely within the image domain, each page image is deskewed, and the text regions are found and extracted as a set of textblocks. Blocks with font size near the median for the document are selected and then placed in reading order. The textlines and words are segmented, and the words are placed into equivalence classes of similar shape. The sentences are identified by finding baselines for each line of text and analyzing the size and location of the connected components relative to the baseline. Scores can then be given to each word, depending on its shape and frequency of occurrence, and to each sentence, depending on the scores for the words in the sentence. Other salient features, such as textblocks that have a large font or are likely to contain an abstract, can also be used to select image parts that are likely to be thematically relevant. The method has been applied to a variety of documents, including articles scanned from magazines and technical journals.
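The frequency-based scoring idea can be sketched with character strings standing in for word-shape equivalence classes (the real system never performs character recognition); this is a simplified illustration, not the authors' code:

```python
from collections import Counter

def score_sentences(sentences):
    """Score each sentence by the summed corpus frequency of its words,
    normalized by sentence length, so sentences composed of
    document-frequent words rank highest."""
    words = [w for s in sentences for w in s.lower().split()]
    freq = Counter(words)
    return [sum(freq[w] for w in s.lower().split()) / len(s.split())
            for s in sentences]

sentences = [
    "the summarizer scores frequent words",
    "frequent words mark significant sentences",
    "unrelated filler text here",
]
scores = score_sentences(sentences)  # first two sentences outrank the third
```

In the image-domain version, `freq` would count occurrences of each word-shape class, and document-specificity weighting would further down-rank common function words.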

  4. Toward automated classification of consumers' cancer-related questions with a new taxonomy of expected answer types.

    PubMed

    McRoy, Susan; Jones, Sean; Kurmally, Adam

    2016-09-01

    This article examines methods for automated question classification applied to cancer-related questions that people have asked on the web. This work is part of a broader effort to provide automated question answering for health education. We created a new corpus of consumer-health questions related to cancer and a new taxonomy for those questions. We then compared the effectiveness of different statistical methods for developing classifiers, including weighted classification and resampling. Basic methods for building classifiers were limited by the high variability in the natural distribution of questions, and typical refinement approaches of feature selection and merging categories achieved only small improvements to classifier accuracy. The best performance was achieved using weighted classification and resampling methods, the latter yielding F1 = 0.963. Thus, it would appear that statistical classifiers can be trained on natural data, but only if the natural distributions of classes are smoothed. Such classifiers would be useful for automated question answering, for enriching web-based content, or for assisting clinical professionals in answering questions.
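As a sketch of the resampling idea (not the study's implementation), the following balances a skewed label distribution by random oversampling of minority classes; the class names and data are illustrative:

```python
import random
from collections import Counter

def oversample(examples, labels, seed=0):
    """Balance a skewed class distribution by randomly duplicating
    minority-class examples until every class matches the majority
    count -- one simple form of resampling."""
    rng = random.Random(seed)
    counts = Counter(labels)
    target = max(counts.values())
    by_class = {}
    for x, y in zip(examples, labels):
        by_class.setdefault(y, []).append(x)
    out_x, out_y = list(examples), list(labels)
    for y, xs in by_class.items():
        for _ in range(target - counts[y]):
            out_x.append(rng.choice(xs))
            out_y.append(y)
    return out_x, out_y

# Hypothetical question labels: "diagnosis" is the rare class.
X = ["q1", "q2", "q3", "q4", "q5"]
y = ["treatment", "treatment", "treatment", "treatment", "diagnosis"]
Xb, yb = oversample(X, y)
```

The balanced set can then be fed to any standard classifier; weighted classification achieves a similar effect by scaling per-class loss instead of duplicating examples.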

  5. Time-resolved Characterization of Particle Associated Polycyclic Aromatic Hydrocarbons using a newly-developed Sequential Spot Sampler with Automated Extraction and Analysis

    PubMed Central

    Lewis, Gregory S.; Spielman, Steven R.; Hering, Susanne V.

    2014-01-01

    A versatile and compact sampling system, the Sequential Spot Sampler (S3), has been developed for pre-concentrated, time-resolved, dry collection of fine and ultrafine particles. Using a temperature-moderated laminar flow water condensation method, ambient particles as small as 6 nm are deposited within a dry, 1-mm diameter spot. Sequential samples are collected on a multiwell plate. Chemical analyses are laboratory-based, but automated. The sample preparation, extraction and chemical analysis steps are all handled through a commercially-available, needle-based autosampler coupled to a liquid chromatography system. This automation is enabled by the small deposition area of the collection. The entire sample is extracted into a 50–100 μL volume of solvent, providing quantifiable samples with small collected air volumes. A pair of S3 units was deployed in Stockton (CA) from November 2011 to February 2012. PM2.5 samples were collected every 12 h, and analyzed for polycyclic aromatic hydrocarbons (PAHs). In parallel, conventional filter samples were collected for 48 h and used to assess the new system's performance. An automated sample preparation and extraction procedure was developed for samples collected using the S3. Collocated data from the two sequential spot samplers were highly correlated for all measured compounds, with a regression slope of 1.1 and r2 = 0.9 for all measured concentrations. S3/filter ratios for the mean concentration of each individual PAH varied between 0.82 and 1.33, with the larger variability observed for the semivolatile components. The ratio for total PAH concentrations was 1.08. Total PAH concentrations showed a temporal trend similar to that of ambient PM2.5 concentrations. Source apportionment analysis estimated a significant contribution of biomass burning to ambient PAH concentrations during winter. PMID:25574151
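The collocated-sampler comparison reduces to an ordinary least-squares fit; the sketch below is our own minimal version, not the authors' analysis code:

```python
def slope_and_r2(x, y):
    """Ordinary least-squares slope (with intercept) and coefficient of
    determination r^2 for paired measurements, as used to compare two
    collocated samplers."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((b - (slope * a + intercept)) ** 2 for a, b in zip(x, y))
    ss_tot = sum((b - my) ** 2 for b in y)
    return slope, 1.0 - ss_res / ss_tot

# Hypothetical paired concentrations where sampler B reads 10% high:
s, r2 = slope_and_r2([1.0, 2.0, 3.0, 4.0], [1.1, 2.2, 3.3, 4.4])  # slope ~1.1, r2 ~1
```

A slope near 1 and r2 near 1, as reported in the abstract, indicate that the two units respond almost identically across the measured concentration range.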

  6. Time-resolved characterization of particle associated polycyclic aromatic hydrocarbons using a newly-developed sequential spot sampler with automated extraction and analysis

    NASA Astrophysics Data System (ADS)

    Eiguren-Fernandez, Arantzazu; Lewis, Gregory S.; Spielman, Steven R.; Hering, Susanne V.

    2014-10-01

    A versatile and compact sampling system, the Sequential Spot Sampler (S3), has been developed for pre-concentrated, time-resolved, dry collection of fine and ultrafine particles. Using a temperature-moderated laminar flow water condensation method, ambient particles as small as 6 nm are deposited within a dry, 1-mm diameter spot. Sequential samples are collected on a multiwell plate. Chemical analyses are laboratory-based, but automated. The sample preparation, extraction and chemical analysis steps are all handled through a commercially-available, needle-based autosampler coupled to a liquid chromatography system. This automation is enabled by the small deposition area of the collection. The entire sample is extracted into a 50-100 μL volume of solvent, providing quantifiable samples with small collected air volumes. A pair of S3 units was deployed in Stockton (CA) from November 2011 to February 2012. PM2.5 samples were collected every 12 h, and analyzed for polycyclic aromatic hydrocarbons (PAHs). In parallel, conventional filter samples were collected for 48 h and used to assess the new system's performance. An automated sample preparation and extraction procedure was developed for samples collected using the S3. Collocated data from the two sequential spot samplers were highly correlated for all measured compounds, with a regression slope of 1.1 and r2 = 0.9 for all measured concentrations. S3/filter ratios for the mean concentration of each individual PAH varied between 0.82 and 1.33, with the larger variability observed for the semivolatile components. The ratio for total PAH concentrations was 1.08. Total PAH concentrations showed a temporal trend similar to that of ambient PM2.5 concentrations. Source apportionment analysis estimated a significant contribution of biomass burning to ambient PAH concentrations during winter.

  7. Method for extracting copper, silver and related metals

    DOEpatents

    Moyer, Bruce A.; McDowell, W. J.

    1990-01-01

    A process for selectively extracting precious metals such as silver and gold concurrent with copper extraction from aqueous solutions containing the same. The process utilizes tetrathiamacrocycles and high molecular weight organic acids that exhibit a synergistic relationship when complexing with certain metal ions thereby removing them from ore leach solutions.

  8. Method for extracting copper, silver and related metals

    DOEpatents

    Moyer, B.A.; McDowell, W.J.

    1987-10-23

    A process for selectively extracting precious metals such as silver and gold concurrent with copper extraction from aqueous solutions containing the same. The process utilizes tetrathiamacrocycles and high molecular weight organic acids that exhibit a synergistic relationship when complexing with certain metal ions thereby removing them from ore leach solutions.

  9. A filter paper-based microdevice for low-cost, rapid, and automated DNA extraction and amplification from diverse sample types.

    PubMed

    Gan, Wupeng; Zhuang, Bin; Zhang, Pengfei; Han, Junping; Li, Cai-Xia; Liu, Peng

    2014-10-01

    A plastic microfluidic device that integrates a filter disc as a DNA capture phase was successfully developed for low-cost, rapid and automated DNA extraction and PCR amplification from various raw samples. The microdevice was constructed by sandwiching a piece of Fusion 5 filter, as well as a PDMS (polydimethylsiloxane) membrane, between two PMMA (poly(methyl methacrylate)) layers. An automated DNA extraction from 1 μL of human whole blood can be finished on the chip in 7 minutes by sequentially aspirating NaOH, HCl, and water through the filter. The filter disc containing extracted DNA was then taken out directly for PCR. On-chip DNA purification from 0.25-1 μL of human whole blood yielded 8.1-21.8 ng of DNA, higher than those obtained using QIAamp® DNA Micro kits. To realize DNA extraction from raw samples, an additional sample loading chamber containing a filter net with an 80 μm mesh size was designed in front of the extraction chamber to accommodate sample materials. Real-world samples, including whole blood, dried blood stains on Whatman® 903 paper, dried blood stains on FTA™ cards, buccal swabs, saliva, and cigarette butts, can all be processed in the system in 8 minutes. In addition, multiplex amplification of 15 STR (short tandem repeat) loci and Sanger-based DNA sequencing of the 520 bp GJB2 gene were accomplished from the filters that contained extracted DNA from blood. To further prove the feasibility of integrating this extraction method with downstream analyses, "in situ" PCR amplifications were successfully performed in the DNA extraction chamber following DNA purification from blood and blood stains without DNA elution. Using a modified protocol to bond the PDMS and PMMA, our plastic PDMS devices withstood the PCR process without any leakage. This study represents a significant step towards the practical application of on-chip DNA extraction methods, as well as the development of fully integrated genetic analytical systems.

  10. Automated position control of a surface array relative to a liquid microjunction surface sampler

    DOEpatents

    Van Berkel, Gary J.; Kertesz, Vilmos; Ford, Michael James

    2007-11-13

    A system and method utilizes an image analysis approach for controlling the probe-to-surface distance of a liquid junction-based surface sampling system for use with mass spectrometric detection. Such an approach enables a hands-free formation of the liquid microjunction used to sample solution composition from the surface and for re-optimization, as necessary, of the microjunction thickness during a surface scan to achieve a fully automated surface sampling system.

  11. Automated mini-column solid-phase extraction cleanup for high-throughput analysis of chemical contaminants in foods by low-pressure gas chromatography – tandem mass spectrometry

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This study demonstrated the application of an automated high-throughput mini-cartridge solid-phase extraction (mini-SPE) cleanup for the rapid low-pressure gas chromatography – tandem mass spectrometry (LPGC-MS/MS) analysis of pesticides and environmental contaminants in QuEChERS extracts of foods. ...

  12. Extraction of a group-pair relation: problem-solving relation from web-board documents.

    PubMed

    Pechsiri, Chaveevan; Piriyakul, Rapepun

    2016-01-01

    This paper aims to extract a group-pair relation as a Problem-Solving relation, for example a DiseaseSymptom-Treatment relation or a CarProblem-Repair relation, between two event-explanation groups: a problem-concept group (a symptom/CarProblem-concept group) and a solving-concept group (a treatment-concept/repair-concept group), from hospital-web-board and car-repair-guru-web-board documents. The Problem-Solving relation (particularly the Symptom-Treatment relation), including its graphical representation, benefits non-professionals by supporting basic problem-solving knowledge. The research addresses three problems: how to identify an EDU (an Elementary Discourse Unit, which is a simple sentence) with the event concept of either a problem or a solution; how to determine a problem-concept EDU boundary and a solving-concept EDU boundary as two event-explanation groups; and how to determine the Problem-Solving relation between these two event-explanation groups. We apply word co-occurrence to identify a problem-concept EDU and a solving-concept EDU, and machine-learning techniques to determine the problem-concept and solving-concept EDU boundaries. We propose using k-means and Naïve Bayes with clustering features to determine the Problem-Solving relation between the two event-explanation groups. In contrast to previous works, the proposed approach enables group-pair relation extraction with high accuracy.
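As an illustration of the clustering step only (the Naïve Bayes stage and real EDU features are omitted), the sketch below is a minimal 1-D k-means; all names and data are illustrative assumptions, not the paper's:

```python
def kmeans_1d(values, k=2, iters=20):
    """Minimal 1-D k-means: seed centers from the sorted values, then
    alternate assignment and mean-update steps."""
    centers = sorted(values)[:: max(1, len(values) // k)][:k]
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for v in values:
            i = min(range(len(centers)), key=lambda j: abs(v - centers[j]))
            clusters[i].append(v)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers

# Two well-separated groups of hypothetical EDU feature scores:
centers = sorted(kmeans_1d([0.1, 0.2, 0.15, 5.0, 5.2, 4.9]))
```

In the paper's setting, each EDU would be mapped to a feature vector before clustering, and cluster membership would feed the Naïve Bayes relation classifier.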

  13. Extraction of a group-pair relation: problem-solving relation from web-board documents.

    PubMed

    Pechsiri, Chaveevan; Piriyakul, Rapepun

    2016-01-01

    This paper aims to extract a group-pair relation, a Problem-Solving relation (for example, a DiseaseSymptom-Treatment relation or a CarProblem-Repair relation), between two event-explanation groups: a problem-concept group (a symptom or car-problem concept group) and a solving-concept group (a treatment or repair concept group), from hospital web-board and car-repair-guru web-board documents. The Problem-Solving relation (particularly the Symptom-Treatment relation), together with its graphical representation, benefits non-professionals by providing knowledge for preliminary problem solving. The research addresses three problems: how to identify an EDU (an Elementary Discourse Unit, essentially a simple sentence) expressing either a problem concept or a solving concept; how to determine a problem-concept EDU boundary and a solving-concept EDU boundary forming the two event-explanation groups; and how to determine the Problem-Solving relation between these two groups. We apply word co-occurrence to identify problem-concept and solving-concept EDUs, and machine-learning techniques to determine the EDU boundaries. We propose using k-means and Naïve Bayes with clustering features to determine the Problem-Solving relation between the two event-explanation groups. In contrast to previous works, the proposed approach enables group-pair relation extraction with high accuracy. PMID:27540498
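    The Naïve Bayes step above can be sketched with a tiny multinomial Naïve Bayes classifier over word features. This is an illustrative stand-in only: the token lists and labels below are invented, and the paper's actual features (word co-occurrence, clustering features) and corpus are not reproduced.

```python
import math
from collections import Counter

def train_nb(docs, labels):
    """Train a multinomial Naive Bayes model with add-one smoothing.

    docs: list of token lists; labels: parallel list of class names.
    Returns (log class priors, per-class token counts, vocabulary)."""
    classes = set(labels)
    priors = {c: math.log(labels.count(c) / len(labels)) for c in classes}
    counts = {c: Counter() for c in classes}
    vocab = set()
    for toks, y in zip(docs, labels):
        counts[y].update(toks)
        vocab.update(toks)
    return priors, counts, vocab

def predict_nb(model, toks):
    """Return the class maximizing the smoothed log-posterior."""
    priors, counts, vocab = model
    best, best_score = None, float("-inf")
    for c, prior in priors.items():
        total = sum(counts[c].values()) + len(vocab)
        score = prior + sum(
            math.log((counts[c][t] + 1) / total) for t in toks if t in vocab)
        if score > best_score:
            best, best_score = c, score
    return best

# Toy training data: EDU token lists labelled as problem vs. solving concepts
# (hypothetical examples, not the paper's corpus).
docs = [["fever", "cough", "headache"],
        ["rash", "fever", "pain"],
        ["take", "aspirin", "rest"],
        ["apply", "ointment", "rest"]]
labels = ["problem", "problem", "solving", "solving"]
model = train_nb(docs, labels)
print(predict_nb(model, ["fever", "rash"]))  # prints "problem"
print(predict_nb(model, ["take", "rest"]))   # prints "solving"
```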

  14. Evaluation of automated stir bar sorptive extraction-thermal desorption-gas chromatography electron capture negative ion mass spectrometry for the analysis of PBDEs and PBBs in sheep and human serum.

    PubMed

    Loconto, Paul R

    2009-09-01

    Stir-bar sorptive extraction and automated thermal desorption/cryotrapping interfaced to capillary gas chromatography and electron capture negative ion mass spectrometry is shown to effectively isolate and recover polybrominated diphenyl ethers and polybrominated biphenyls from sheep and human serum. This paper describes the development of the method and demonstrates the feasibility of using Twister with spiked serum. Conditions for conducting stir-bar sorptive extraction and for automated thermal desorption that led to acceptable analyte recoveries were optimized. The approach to sample preparation introduced here significantly reduces tedious labor and solvent consumption associated with conventional liquid-liquid extraction. PMID:19772742

  15. Automated scheme for measuring polyp volume in CT colonography using Hessian matrix-based shape extraction and 3D volume growing

    NASA Astrophysics Data System (ADS)

    Suzuki, Kenji; Epstein, Mark L.; Xu, Jianwu; Obara, Piotr; Rockey, Don C.; Dachman, Abraham H.

    2010-03-01

    Current measurement of the single longest dimension of a polyp is subjective and varies among radiologists. Our purpose was to develop an automated measurement of polyp volume in CT colonography (CTC). We developed a computerized segmentation scheme for measuring polyp volume in CTC, consisting of extraction of a highly polyp-like seed region based on the Hessian matrix, segmentation of polyps by use of a 3D volume-growing technique, and sub-voxel refinement to reduce segmentation bias. Our database consisted of 30 polyp views (15 polyps) in CTC scans from 13 patients. To obtain a "gold standard," a radiologist outlined polyps in each slice and calculated volumes by summation of areas. The measurement study was repeated three times at least one week apart to minimize memory-effect bias, and the mean volume of the three studies was used as the "gold standard." Our measurement scheme yielded a mean polyp volume of 0.38 cc (range: 0.15-1.24 cc), whereas the mean "gold standard" manual volume was 0.40 cc (range: 0.15-1.08 cc). The mean absolute difference between automated and manual volumes was 0.11 cc with a standard deviation of 0.14 cc. The two volumetrics reached excellent agreement (intra-class correlation coefficient 0.80) with no statistically significant difference (p(F<=f) = 0.42). Thus, our automated scheme efficiently provides accurate polyp volumes for radiologists.

  16. Automated Retinal Image Analysis for Evaluation of Focal Hyperpigmentary Changes in Intermediate Age-Related Macular Degeneration

    PubMed Central

    Schmitz-Valckenberg, Steffen; Göbel, Arno P.; Saur, Stefan C.; Steinberg, Julia S.; Thiele, Sarah; Wojek, Christian; Russmann, Christoph; Holz, Frank G.; for the MODIAMD-Study Group

    2016-01-01

    Purpose To develop and evaluate a software tool for automated detection of focal hyperpigmentary changes (FHC) in eyes with intermediate age-related macular degeneration (AMD). Methods Color fundus (CFP) and autofluorescence (AF) photographs of 33 eyes with FHC from 28 AMD patients (mean age 71 years) in the prospective longitudinal natural-history MODIAMD-study were included. Fully automated and semiautomated registration of baseline images to corresponding follow-up images was evaluated. Following manual circumscription of individual FHC (four readings by two readers), a machine-learning algorithm was evaluated for automatic FHC detection. Results The overall pixel distance error was larger for semiautomated registration (CFP follow-up to CFP baseline: median 5.7; CFP to AF images from the same visit: median 6.5) than for automated registration (4.5 and 5.7; P < 0.001 for both). The total number of manually circumscribed objects varied from 637 to 1163, and the corresponding total size from 520,848 to 924,860 pixels. The learning algorithms achieved a sensitivity of 96% at a specificity of 98% using information from both CFP and AF images and treating small areas of FHC ("speckle appearance") as "neutral." Conclusions FHC, a high-risk feature for progression of AMD to late stages, can be automatically assessed at different time points with sensitivity and specificity similar to manual outlining. Upon further development of the research prototype, this approach may be useful in both natural-history and interventional large-scale studies for a more refined classification and risk assessment of eyes with intermediate AMD. Translational Relevance Automated FHC detection opens the door to a more refined and detailed classification and risk assessment of eyes with intermediate AMD in both natural-history and future interventional studies. PMID:26966639

  17. Automated age-related macular degeneration classification in OCT using unsupervised feature learning

    NASA Astrophysics Data System (ADS)

    Venhuizen, Freerk G.; van Ginneken, Bram; Bloemen, Bart; van Grinsven, Mark J. J. P.; Philipsen, Rick; Hoyng, Carel; Theelen, Thomas; Sánchez, Clara I.

    2015-03-01

    Age-related Macular Degeneration (AMD) is a common eye disorder with high prevalence in elderly people. The disease mainly affects the central part of the retina and can ultimately lead to permanent vision loss. Optical Coherence Tomography (OCT) is becoming the standard imaging modality for the diagnosis of AMD and the assessment of its progression. However, evaluation of the resulting volumetric scan is time-consuming and expensive, and the signs of early AMD are easy to miss. In this paper we propose a classification method to automatically distinguish AMD patients from healthy subjects with high accuracy. The method is based on an unsupervised feature learning approach and processes the complete image without the need for an accurate pre-segmentation of the retina. The method can be divided into two steps: an unsupervised clustering stage that extracts a set of small descriptive image patches from the training data, and a supervised training stage that uses these patches to create a patch-occurrence histogram for every image, on which a random forest classifier is trained. Experiments using 384 volume scans show that the proposed method is capable of identifying AMD patients with high accuracy, obtaining an area under the Receiver Operating Characteristic curve of 0.984. Our method allows for a quick and reliable assessment of the presence of AMD pathology in OCT volume scans without the need for accurate layer segmentation algorithms.
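    The two-stage pipeline described above (an unsupervised patch dictionary, then a patch-occurrence histogram per image) can be sketched in a few lines of NumPy. Everything below is a toy stand-in: patch size, dictionary size, and the random "scan" are invented, and the paper's random forest stage is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def extract_patches(img, size=3, n=50):
    """Sample n random size x size patches from a 2D image, flattened."""
    H, W = img.shape
    ys = rng.integers(0, H - size, n)
    xs = rng.integers(0, W - size, n)
    return np.stack([img[y:y+size, x:x+size].ravel() for y, x in zip(ys, xs)])

def kmeans(X, k=4, iters=10):
    """Tiny k-means to learn a patch dictionary (the unsupervised stage)."""
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        assign = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if (assign == j).any():
                centers[j] = X[assign == j].mean(0)
    return centers

def histogram_features(img, centers, size=3, n=200):
    """Patch-occurrence histogram: how often each dictionary atom fires."""
    P = extract_patches(img, size, n)
    assign = np.argmin(((P[:, None] - centers[None]) ** 2).sum(-1), axis=1)
    h = np.bincount(assign, minlength=len(centers)).astype(float)
    return h / h.sum()

# Demo on a synthetic "scan" (random texture; a stand-in for an OCT B-scan).
img = rng.normal(size=(32, 32))
centers = kmeans(extract_patches(img), k=4)
feat = histogram_features(img, centers)
print(feat.shape, round(feat.sum(), 6))
```

A supervised classifier (a random forest in the paper) would then be trained on these per-image histograms.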

  18. Automated procedure for determination of ammonia in concrete with headspace single-drop micro-extraction by stepwise injection spectrophotometric analysis.

    PubMed

    Timofeeva, Irina; Khubaibullin, Ilnur; Kamencev, Mihail; Moskvin, Aleksey; Bulatov, Andrey

    2015-02-01

    A novel automatic stepwise injection headspace single-drop micro-extraction system is proposed as a versatile approach for the automated determination of volatile compounds; its application is demonstrated for ammonia determination in concrete samples. Ammonia gas was generated from ammonium ions and extracted on-line into 5 μL of 0.1 M H3PO4 to eliminate the interfering effect of concrete species on the stepwise injection spectrophotometric determination of ammonia. The linear range was 0.1-1 mg kg(-1) with an LOD of 30 µg kg(-1), and the sample throughput was 4 h(-1). The system has been successfully applied to the determination of ammonia in concretes.
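    Figures of merit such as the linear range and LOD quoted above follow from an ordinary linear calibration. A minimal sketch, assuming invented absorbance readings and the common 3.3·s/slope LOD estimate, which may differ from the authors' actual procedure:

```python
import numpy as np

# Hypothetical calibration data over the reported linear range (mg/kg);
# the absorbance values are invented for illustration.
conc = np.array([0.1, 0.25, 0.5, 0.75, 1.0])
absorb = np.array([0.012, 0.030, 0.061, 0.089, 0.118])

# Least-squares linear calibration: absorbance = slope * conc + intercept.
slope, intercept = np.polyfit(conc, absorb, 1)
residual_sd = np.std(absorb - (slope * conc + intercept), ddof=2)

# One common LOD estimate: 3.3 x residual SD / slope (IUPAC-style).
lod = 3.3 * residual_sd / slope
print(f"slope={slope:.4f}, LOD ~ {lod*1000:.0f} ug/kg")
```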

  19. Three Experiments Examining the Use of Electroencephalogram,Event-Related Potentials, and Heart-Rate Variability for Real-Time Human-Centered Adaptive Automation Design

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J., III; Parasuraman, Raja; Freeman, Frederick G.; Scerbo, Mark W.; Mikulka, Peter J.; Pope, Alan T.

    2003-01-01

    Adaptive automation represents an advanced form of human-centered automation design. This approach provides real-time, model-based assessments of human-automation interaction, determines whether the human has entered a hazardous state of awareness, and then modulates the task environment to keep the operator in the loop while maintaining an optimal state of task engagement and mental alertness. Because adaptive automation has not yet matured, numerous challenges remain, including the criteria for determining when adaptive aiding and adaptive function allocation should take place. Human factors experts in the area have suggested a number of measures, including the use of psychophysiology. This NASA Technical Paper reports on three experiments that examined the psychophysiological measures of event-related potentials, electroencephalogram, and heart-rate variability for real-time adaptive automation. The results of the experiments confirm the efficacy of these measures in both a developmental and an operational role for adaptive automation design. The implications of these results and future directions for psychophysiology and human-centered automation design are discussed.

  20. Quantitative high-throughput analysis of 16 (fluoro)quinolones in honey using automated extraction by turbulent flow chromatography coupled to liquid chromatography-tandem mass spectrometry.

    PubMed

    Mottier, Pascal; Hammel, Yves-Alexis; Gremaud, Eric; Guy, Philippe A

    2008-01-01

    A method making use of turbulent flow chromatography automated online extraction with tandem mass spectrometry (MS/MS) was developed for the analysis of 4 quinolones and 12 fluoroquinolones in honey. The manual sample preparation was limited to a simple dilution of the honey test portion in water followed by a filtration. The extract was purified online on a large-particle-size extraction column, where the sample matrix was washed away while the analytes were retained. Subsequently, the analytes were eluted from the extraction column onto an analytical column by means of an organic solvent prior to chromatographic separation and MS detection. Validation was performed at three fortification levels (5, 20, and 50 microg/kg) in three different honeys (acacia, multiflower, and forest) using a single-point calibration procedure with either a 10 or 25 microg/kg calibrant. Good recoveries (85-127%, median 101%) as well as within-day (2-18%, median 6%) and between-day (2-42%, median 9%) precision values were obtained regardless of the fortification level and the analyte surveyed. Due to the complexity of the honey matrix and the large, honey-dependent variation of the MS/MS transition signals, the limit of quantification for all compounds was set at the lowest fortification level considered during the validation, i.e., 5 microg/kg. This method has been successfully applied in a minisurvey of 34 honeys, showing ciprofloxacin and norfloxacin to be the main (fluoro)quinolone antibiotics administered to treat bacterial diseases of bees. Turbulent flow chromatography coupled to LC-MS/MS showed strong potential as an alternative to methods based on offline sample preparation, both in increasing analysis throughput and in achieving the higher reproducibility afforded by automation, helping to ensure the absence of contaminants in honey samples.
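    Single-point calibration as used above amounts to a ratio of responses, assuming a linear response through zero. A minimal sketch with invented peak areas:

```python
def single_point_conc(area_sample, area_calibrant, conc_calibrant):
    """Single-point calibration: assumes the response is linear through zero,
    so concentration scales with the ratio of peak areas."""
    return area_sample / area_calibrant * conc_calibrant

# Hypothetical peak areas for one MS/MS transition against a 10 microg/kg calibrant.
print(single_point_conc(4200.0, 10500.0, 10.0))  # prints 4.0 (microg/kg)
```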

  1. Automated Extraction of Gravity Wave Signatures from the Super Dual Auroral Radar Network (SuperDARN) Database Using Spatio-Temporal Process Discovery Algorithms

    NASA Astrophysics Data System (ADS)

    Baker, J. B.; Ramakrishnan, N.; Ruohoniemi, J. M.; Hossain, M.; Ribeiro, A.

    2011-12-01

    A major challenge in space physics research is the automated extraction of recurrent features from multi-dimensional datasets that tend to be irregularly gridded in both space and time. In many cases, the complexity of the datasets impedes their use by scientists, who are often most interested in extracting a simple time series or higher-level data product that can be easily compared with other measurements. As such, the collective archive of space physics measurements is vastly under-utilized at the present time. Application of cutting-edge computer-aided data mining and knowledge discovery techniques has the potential to improve this situation by making space physics datasets much more accessible to the scientific user community and accelerating the rate of research and collaboration. As a first step in this direction, we are applying the principles of feature extraction, sub-clustering and motif mining to the analysis of HF backscatter measurements from the Super Dual Auroral Radar Network (SuperDARN). The SuperDARN database is an ideal test-bed for development of space physics data mining algorithms because: (1) there is a richness of geophysical phenomena manifested in the data; (2) the data is multi-dimensional and exhibits a high degree of spatiotemporal sparseness; and (3) some of the radars have been operating continuously with infrequent outages for more than 25 years. In this presentation we discuss results obtained from the application of new data mining algorithms designed specifically to automate the extraction of gravity wave signatures from the SuperDARN database. In particular, we examine the occurrence statistics of gravity waves as a function of latitude, local time, and geomagnetic conditions.
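    Occurrence statistics as a function of latitude and local time, as described above, reduce to binning detected events on a 2D grid. A minimal NumPy sketch with invented detections and bin edges (not the SuperDARN data or the presentation's algorithms):

```python
import numpy as np

# Hypothetical detections: (magnetic latitude in degrees, magnetic local time in hours).
events = np.array([[62.0, 9.5], [65.0, 10.2], [71.0, 2.1],
                   [63.5, 9.9], [70.2, 1.7], [66.1, 11.4]])

lat_bins = np.arange(60, 80, 5)   # 60-65, 65-70, 70-75 degrees
mlt_bins = np.arange(0, 25, 6)    # 0-6, 6-12, 12-18, 18-24 hours

# 2D occurrence histogram: rows are latitude bins, columns are MLT bins.
counts, _, _ = np.histogram2d(events[:, 0], events[:, 1],
                              bins=[lat_bins, mlt_bins])
print(counts)
```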

  3. A fully automated method for simultaneous determination of aflatoxins and ochratoxin A in dried fruits by pressurized liquid extraction and online solid-phase extraction cleanup coupled to ultra-high-pressure liquid chromatography-tandem mass spectrometry.

    PubMed

    Campone, Luca; Piccinelli, Anna Lisa; Celano, Rita; Russo, Mariateresa; Valdés, Alberto; Ibáñez, Clara; Rastrelli, Luca

    2015-04-01

    According to current demands and future perspectives in food safety, this study reports a fast and fully automated analytical method for the simultaneous analysis of aflatoxins (AFs) and ochratoxin A (OTA), two highly toxic and widespread classes of mycotoxins, in dried fruits, a high-risk foodstuff. The method is based on pressurized liquid extraction (PLE) of the slurried dried fruit with aqueous methanol (30%) at 110 °C, followed by online solid-phase extraction (online SPE) cleanup of the PLE extracts with a C18 cartridge. The purified sample was directly analysed by ultra-high-pressure liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) for sensitive and selective determination of AFs and OTA. The proposed analytical procedure was validated for different dried fruits (vine fruit, fig and apricot), providing method detection and quantification limits much lower than the AF and OTA maximum levels imposed by EU regulation in dried fruit for direct human consumption. Recoveries (83-103%) and repeatability (RSD < 8%, n = 3) also meet the performance criteria required by EU regulation for the determination of mycotoxin levels in foodstuffs. The main advantage of the proposed method is full automation of the whole analytical procedure, which reduces the time and cost of the analysis, sample manipulation and solvent consumption, enabling high-throughput analysis and highly accurate and precise results. PMID:25694147

  4. High-throughput method of dioxin analysis in aqueous samples using consecutive solid phase extraction steps with the new C18 Ultraflow™ pressurized liquid extraction and automated clean-up.

    PubMed

    Youn, Yeu-Young; Park, Deok Hie; Lee, Yeon Hwa; Lim, Young Hee; Cho, Hye Sung

    2015-01-01

    A high-throughput analytical method has been developed for the determination of the seventeen 2,3,7,8-substituted congeners of polychlorinated dibenzo-p-dioxins and dibenzofurans (PCDD/Fs) in aqueous samples. A recently introduced octadecyl (C18) disk for semi-automated solid-phase extraction of PCDD/Fs from water samples with a high level of particulate material has been tested for dioxin analysis. This type of C18 disk was specially designed for the analysis of hexane-extractable material (HEM) and has not previously been reported for use in PCDD/F analysis. It allows a higher filtration flow, and therefore the time of analysis is reduced. The solid-phase extraction technique transfers samples from the liquid to the solid phase, so pressurized liquid extraction (PLE) can be used in the pre-treatment. To achieve efficient purification, extracts from the PLE are purified using an automated Power-prep system with disposable silica, alumina, and carbon columns. Quantitative analyses of PCDD/Fs were performed by GC-HRMS in multi-ion detection (MID) mode. The method was successfully applied to the analysis of water samples from the wastewater treatment system of a vinyl chloride monomer plant. The entire procedure is in agreement with EPA Method 1613 recommendations regarding blank control, MDLs (method detection limits), accuracy, and precision. The high-throughput method not only meets the requirements of international standards but also shortens the required analysis time from 2 weeks to 3 days.

  5. A Comprehensive Automated 3D Approach for Building Extraction, Reconstruction, and Regularization from Airborne Laser Scanning Point Clouds

    PubMed Central

    Dorninger, Peter; Pfeifer, Norbert

    2008-01-01

    Three dimensional city models are necessary for supporting numerous management applications. For the determination of city models for visualization purposes, several standardized workflows exist. They are based on photogrammetry, on LiDAR, or on a combination of both data acquisition techniques. However, the automated determination of reliable and highly accurate city models is still a challenging task, requiring a workflow comprising several processing steps. The most relevant are building detection, building outline generation, building modeling, and finally, building quality analysis. Commercial software tools for building modeling generally require a high degree of human interaction, and most automated approaches described in the literature address the steps of such a workflow individually. In this article, we propose a comprehensive approach for automated determination of 3D city models from airborne acquired point cloud data. It is based on the assumption that individual buildings can be modeled properly by a composition of a set of planar faces. Hence, it relies on a reliable 3D segmentation algorithm that detects planar faces in a point cloud. This segmentation is of crucial importance for the outline detection and for the modeling approach. We describe the theoretical background, the segmentation algorithm, the outline detection, and the modeling approach, and we present and discuss several actual projects.

  6. Demonstration and validation of automated agricultural field extraction from multi-temporal Landsat data for the majority of United States harvested cropland

    NASA Astrophysics Data System (ADS)

    Yan, L.; Roy, D. P.

    2014-12-01

    The spatial distribution of agricultural fields is a fundamental description of rural landscapes and the location and extent of fields is important to establish the area of land utilized for agricultural yield prediction, resource allocation, and for economic planning, and may be indicative of the degree of agricultural capital investment, mechanization, and labor intensity. To date, field objects have not been extracted from satellite data over large areas because of computational constraints, the complexity of the extraction task, and because consistently processed appropriate resolution data have not been available or affordable. A recently published automated methodology to extract agricultural crop fields from weekly 30 m Web Enabled Landsat data (WELD) time series was refined and applied to 14 states that cover 70% of harvested U.S. cropland (USDA 2012 Census). The methodology was applied to 2010 combined weekly Landsat 5 and 7 WELD data. The field extraction and quantitative validation results are presented for the following 14 states: Iowa, North Dakota, Illinois, Kansas, Minnesota, Nebraska, Texas, South Dakota, Missouri, Indiana, Ohio, Wisconsin, Oklahoma and Michigan (sorted by area of harvested cropland). These states include the top 11 U.S states by harvested cropland area. Implications and recommendations for systematic application to global coverage Landsat data are discussed.

  7. Quantitative radiology: automated measurement of polyp volume in computed tomography colonography using Hessian matrix-based shape extraction and volume growing

    PubMed Central

    Epstein, Mark L.; Obara, Piotr R.; Chen, Yisong; Liu, Junchi; Zarshenas, Amin; Makkinejad, Nazanin; Dachman, Abraham H.

    2015-01-01

    Background Current measurement of the single longest dimension of a polyp is subjective and has variations among radiologists. Our purpose was to develop a computerized measurement of polyp volume in computed tomography colonography (CTC). Methods We developed a 3D automated scheme for measuring polyp volume at CTC. Our scheme consisted of segmentation of the colon wall to confine polyp segmentation to the colon wall, extraction of a highly polyp-like seed region based on the Hessian matrix, a 3D volume growing technique under the minimum surface expansion criterion for segmentation of polyps, and sub-voxel refinement and surface smoothing for obtaining a smooth polyp surface. Our database consisted of 30 polyp views (15 polyps) in CTC scans from 13 patients. Each patient was scanned in the supine and prone positions. Polyp sizes measured in optical colonoscopy (OC) ranged from 6-18 mm with a mean of 10 mm. A radiologist outlined polyps in each slice and calculated volumes by summation of volumes in each slice. The measurement study was repeated 3 times at least 1 week apart to minimize memory-effect bias. We used the mean volume of the three studies as the “gold standard”. Results Our measurement scheme yielded a mean polyp volume of 0.38 cc (range, 0.15-1.24 cc), whereas the mean “gold standard” manual volume was 0.40 cc (range, 0.15-1.08 cc). The “gold-standard” manual and computer volumetrics reached excellent agreement (intra-class correlation coefficient =0.80), with no statistically significant difference [P (F≤f) =0.42]. Conclusions We developed an automated scheme for measuring polyp volume at CTC based on Hessian matrix-based shape extraction and volume growing. Polyp volumes obtained by our automated scheme agreed excellently with “gold standard” manual volumes. Our fully automated scheme can efficiently provide accurate polyp volumes for radiologists; thus, it would help radiologists improve the accuracy and efficiency of polyp volume
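    The Hessian-based seed extraction described above can be sketched for a synthetic volume: at a bright blob, all three Hessian eigenvalues are negative, and the most blob-like voxel serves as the seed for volume growing. This is a simplified, single-scale illustration with an invented test volume, not the authors' implementation:

```python
import numpy as np

def hessian_eigvals(vol):
    """Per-voxel Hessian eigenvalues of a 3D volume via finite differences."""
    grads = np.gradient(vol)
    H = np.empty(vol.shape + (3, 3))
    for i in range(3):
        second = np.gradient(grads[i])
        for j in range(3):
            H[..., i, j] = second[j]
    return np.linalg.eigvalsh(H)  # eigenvalues in ascending order per voxel

def blob_seed(vol):
    """Seed voxel of the most 'polyp-like' (bright blob) region: all three
    Hessian eigenvalues negative, i.e. a local intensity cap in every direction."""
    ev = hessian_eigvals(vol)
    blobness = np.where((ev < 0).all(axis=-1), -ev.sum(axis=-1), 0.0)
    return np.unravel_index(np.argmax(blobness), vol.shape)

# Synthetic volume: one Gaussian blob (a stand-in for a polyp) centred at (8, 8, 8).
z, y, x = np.mgrid[0:16, 0:16, 0:16]
vol = np.exp(-((z - 8)**2 + (y - 8)**2 + (x - 8)**2) / 8.0)
print(blob_seed(vol))  # a voxel at/near the blob centre
```

In the paper the seed region then initializes 3D volume growing under a minimum-surface-expansion criterion, which this sketch omits.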

  8. Automated headspace solid-phase microextraction and in-matrix derivatization for the determination of amphetamine-related drugs in human urine by gas chromatography-mass spectrometry.

    PubMed

    Namera, Akira; Yashiki, Mikio; Kojima, Tohru; Ueki, Makoto

    2002-01-01

    An automated extraction and determination method for gas chromatography-mass spectrometry (GC-MS) analysis of amphetamine-related drugs in human urine is developed using headspace solid-phase microextraction (SPME) and in-matrix derivatization. A urine sample (0.5 mL), potassium carbonate (5 M, 1.0 mL), sodium chloride (0.5 g), and ethylchloroformate (20 microL) are placed in a sample vial. The amphetamine-related drugs in urine react quickly with ethylchloroformate and are converted to ethylformate derivatives (carbamates) in the vial. An SPME fiber is then exposed at 80 degrees C for 15 min in the headspace of the vial. The derivatives extracted onto the fiber are desorbed by exposing the fiber in the injection port of the GC-MS. The calibration curves show linearity in the range of 1.0 to 1000 ng/mL for methamphetamine, fenfluramine, and methylenedioxymethamphetamine; 2.0 to 1000 ng/mL for amphetamine and phentermine; 5.0 to 1000 ng/mL for methylenedioxyamphetamine; 10 to 1000 ng/mL for phenethylamine; and 50 to 1000 ng/mL for 4-bromo-2,5-dimethoxyphenethylamine in urine. No interferences are found, and the analysis time is 30 min per sample. Furthermore, the proposed method is applied to several clinical and medico-legal cases involving methamphetamine. Methamphetamine and its metabolite amphetamine are detected in urine samples collected from the patients in the clinical cases. Methamphetamine, amphetamine, and phenethylamine are detected in the urine sample collected from the victim of a medico-legal case.

  9. Automation and robotics and related technology issues for Space Station customer servicing

    NASA Technical Reports Server (NTRS)

    Cline, Helmut P.

    1987-01-01

    Several flight servicing support elements are discussed within the context of the Space Station. Particular attention is given to the servicing facility, the mobile servicing center, and the flight telerobotic servicer (FTS). The role that automation and robotics can play in the design and operation of each of these elements is discussed. It is noted that the FTS, which is currently being developed by NASA, will evolve to increasing levels of autonomy to allow for the virtual elimination of routine EVA. Some of the features of the FTS will probably be: dual manipulator arms having reach and dexterity roughly equivalent to that of an EVA-suited astronaut, force reflection capability allowing efficient teleoperation, and capability of operating from a variety of support systems.

  10. Differential genetic regulation of motor activity and anxiety-related behaviors in mice using an automated home cage task.

    PubMed

    Kas, Martien J H; de Mooij-van Malsen, Annetrude J G; Olivier, Berend; Spruijt, Berry M; van Ree, Jan M

    2008-08-01

    Traditional behavioral tests, such as the open field test, measure an animal's responsiveness to a novel environment. However, it is generally difficult to assess whether the behavioral response obtained from these tests relates to the expression level of motor activity and/or to avoidance of anxiogenic areas. Here, an automated home cage environment for mice was designed to obtain independent measures of motor activity levels and of sheltered feeding preference during three consecutive days. Chronic treatment with the anxiolytic drug chlordiazepoxide (5 and 10 mg/kg/day) in C57BL/6J mice reduced sheltered feeding preference without altering motor activity levels. Furthermore, two distinct chromosome substitution strains, derived from C57BL/6J (host strain) and A/J (donor strain) inbred strains, expressed either increased sheltering preference in females (chromosome 15) or reduced motor activity levels in females and males (chromosome 1) when compared to C57BL/6J. Longitudinal behavioral monitoring revealed that these phenotypic differences maintained after adaptation to the home cage. Thus, by using new automated behavioral phenotyping approaches, behavior can be dissociated into distinct behavioral domains (e.g., anxiety-related and motor activity domains) with different underlying genetic origin and pharmacological responsiveness.

  11. Shoe-String Automation

    SciTech Connect

    Duncan, M.L.

    2001-07-30

    Faced with a downsizing organization, serious budget reductions and retirement of key metrology personnel, maintaining capabilities to provide necessary services to our customers was becoming increasingly difficult. It appeared that the only solution was to automate some of our more personnel-intensive processes; however, it was crucial that the most personnel-intensive candidate process be automated, at the lowest price possible and with the lowest risk of failure. This discussion relates factors in the selection of the Standard Leak Calibration System for automation, the methods of automation used to provide the lowest-cost solution and the benefits realized as a result of the automation.

  12. An Automated Approach to Agricultural Tile Drain Detection and Extraction Utilizing High Resolution Aerial Imagery and Object-Based Image Analysis

    NASA Astrophysics Data System (ADS)

    Johansen, Richard A.

    Subsurface drainage from agricultural fields in the Maumee River watershed is suspected to adversely impact water quality and contribute to the formation of harmful algal blooms (HABs) in Lake Erie. In early August of 2014, a HAB developed in the western Lake Erie Basin that left over 400,000 people unable to drink their tap water due to the presence of a toxin from the bloom. HAB development in Lake Erie is aided by excess nutrients from agricultural fields, which are transported through subsurface tile and enter the watershed. Compounding the issue, the trend within the Maumee watershed has been to increase the installation of tile drains in both total extent and density. Due to the immense area of drained fields, there is a need to establish an accurate and effective technique to monitor subsurface farmland tile installations and their associated impacts. This thesis aimed to develop an automated method to identify subsurface tile locations from high-resolution aerial imagery by applying an object-based image analysis (OBIA) approach utilizing eCognition. This was accomplished through a set of algorithms and image filters, which segment and classify image objects by their spectral and geometric characteristics. The algorithms were based on the relative location of image objects and pixels, in order to maximize the robustness and transferability of the final rule-set. These algorithms were coupled with convolution and histogram image filters to generate results for a 10 km2 study area located within Clay Township in Ottawa County, Ohio. The eCognition results were compared to tile locations previously collected in an associated project that applied heads-up digitizing of aerial photography to map field tile. The heads-up digitized locations were used as the baseline for the accuracy assessment. The accuracy assessment generated a range of agreement values from 67.20% - 71.20%, and an average

  13. Fully automated online solid phase extraction coupled directly to liquid chromatography-tandem mass spectrometry. Quantification of sulfonamide antibiotics, neutral and acidic pesticides at low concentrations in surface waters.

    PubMed

    Stoob, Krispin; Singer, Heinz P; Goetz, Christian W; Ruff, Matthias; Mueller, Stephan R

    2005-12-01

    A fully automated online solid phase extraction-liquid chromatography-tandem mass spectrometry (SPE-LC-MS/MS) instrumental setup has been developed for the quantification of sulfonamide antibiotics and pesticides in natural water. The direct coupling of an online solid phase extraction cartridge (Oasis HLB) to LC-MS/MS was accomplished using column-switching techniques. High sensitivity in the low ng/L range was achieved by large-volume injections of 18 mL using a combination of a tri-directional auto-sampler and a dispenser system. This setup allowed high sample throughput with minimal investment cost. Special emphasis was placed on low cross contamination. The chosen approach is suitable for research as well as for monitoring applications. The flexible instrumental setup was successfully optimised for different important groups of bioactive chemicals, resulting in three trace analytical methods for quantification of (i) sulfonamide antibiotics and their acetyl metabolites; (ii) neutral pesticides (triazines, phenylureas, amides, chloracetanilides); and (iii) acidic pesticides (phenoxyacetic acids and triketones). Absolute extraction recoveries from 85 to 112% were obtained for the different analytes. More than 500 samples could be analyzed with one extraction cartridge. The inter-day precision of the method was excellent, as indicated by relative standard deviations between 1 and 6%. The methods were also accurate, with maximum deviations of 8-15% relative to the spiked amount for the different analytes. Detection limits for various environmental samples were between 0.5 and 5 ng/L. Matrix-induced ion suppression was in general smaller than 25%. The performance of the online methods was demonstrated by measuring the concentration dynamics of sulfonamide antibiotics and pesticides in a small creek during rainfall events.
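    Figures of merit like the recoveries and inter-day RSDs quoted above come directly from replicate measurements. A minimal sketch, with invented replicate concentrations:

```python
# Illustrative computation of extraction recovery and relative standard
# deviation (RSD) from replicate determinations. Values are made up.
from statistics import mean, stdev

def recovery_percent(measured, spiked):
    """Absolute extraction recovery: measured vs. spiked concentration."""
    return 100.0 * measured / spiked

def rsd_percent(values):
    """Relative standard deviation of replicate determinations."""
    return 100.0 * stdev(values) / mean(values)

inter_day = [101.2, 99.8, 102.5, 100.4, 98.9, 101.0]  # ng/L, spiked at 100 ng/L
print(round(recovery_percent(mean(inter_day), 100.0), 1))  # -> 100.6
print(round(rsd_percent(inter_day), 1))                    # -> 1.2
```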

  14. A Framework for the Relative and Absolute Performance Evaluation of Automated Spectroscopy Systems

    SciTech Connect

    Portnoy, David; Heimberg, Peter; Heimberg, Jennifer; Feuerbach, Robert; McQuarrie, Allan; Noonan, William; Mattson, John

    2009-12-02

    The development of high-speed, high-performance gamma-ray spectroscopy algorithms is critical to the success of many automated threat detection systems. In response to this need, a proliferation of such algorithms has taken place. With this proliferation comes the necessary and non-trivial task of validation. There is (and always will be) insufficient experimental data to determine the performance of spectroscopy algorithms over the relevant factor space with any reasonable precision. In the case of gamma-ray spectroscopy, there are hundreds of radioisotopes of interest, which may come in arbitrary admixtures; there are many materials of unknown quantity that may be found in the intervening space between the source and the detection system; and there are also irregular variations in the detector systems themselves. All of these factors and more should be explored to determine algorithm/system performance. This paper describes a statistical framework for the performance estimation and comparison of gamma-ray spectroscopy algorithms. The framework relies heavily on data of increasing levels of artificiality to sufficiently cover the factor space. At each level, rigorous statistical methods are employed to validate performance estimates.

  15. Automated isotope dilution liquid chromatography-tandem mass spectrometry with on-line dilution and solid phase extraction for the measurement of cortisol in human serum sample.

    PubMed

    Kawaguchi, Migaku; Eyama, Sakae; Takatsu, Akiko

    2014-08-01

    A candidate reference measurement procedure involving automated isotope dilution coupled with liquid chromatography-tandem mass spectrometry (ID-LC-MS/MS) with on-line dilution and solid phase extraction (SPE) has been developed and critically evaluated. We constructed an LC-MS/MS system with on-line dilution and SPE. An isotopically labelled internal standard, cortisol-d4, was added to the serum sample. After equilibration, methanol was added to the sample and deproteination was performed. The sample was then applied to the LC-MS/MS system. The limit of detection (LOD) and limit of quantification (LOQ) were 0.2 and 1 ng g(-1), respectively. Excellent precision was obtained, with a within-day variation (RSD) of 1.9% for ID-LC-MS/MS analysis (n=6). The method is simple, accurate, precise, and free from interference by structural analogues, and thus qualifies as a reference measurement procedure.
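    In isotope dilution generally, the unknown is quantified from the analyte-to-internal-standard peak-area ratio against a calibration of ratios. The sketch below is a generic illustration of that idea (not the published procedure); all calibrator concentrations and area ratios are hypothetical.

```python
# Generic isotope-dilution quantification sketch: fit area ratio
# (cortisol / cortisol-d4) vs. calibrator concentration, then invert
# the fit for an unknown sample. All numbers are invented.

def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

cal_conc  = [10.0, 50.0, 100.0, 200.0]    # ng/g cortisol calibrators
cal_ratio = [0.102, 0.498, 1.010, 1.990]  # analyte/IS area ratios

slope, intercept = linear_fit(cal_conc, cal_ratio)
unknown_ratio = 0.750
conc = (unknown_ratio - intercept) / slope
print(round(conc, 1))  # -> 74.9 ng/g
```

    Because analyte and labelled standard behave nearly identically through extraction and ionization, the ratio largely cancels recovery losses and matrix effects, which is why ID-MS can serve as a reference procedure.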

  16. Development of an Automated Column Solid-Phase Extraction Cleanup of QuEChERS Extracts, Using a Zirconia-Based Sorbent, for Pesticide Residue Analyses by LC-MS/MS.

    PubMed

    Morris, Bruce D; Schriner, Richard B

    2015-06-01

    A new, automated, high-throughput, mini-column solid-phase extraction (c-SPE) cleanup method for QuEChERS extracts was developed, using a robotic X-Y-Z instrument autosampler, for analysis of pesticide residues in fruits and vegetables by LC-MS/MS. Removal of avocado matrix and recoveries of 263 pesticides and metabolites were studied using various stationary phase mixtures, including zirconia-based sorbents, and elution with acetonitrile. These experiments allowed selection of a sorbent mixture consisting of zirconia, C18, and carbon-coated silica that effectively retained avocado matrix but also retained 53 pesticides with <70% recoveries. Addition of MeOH to the elution solvent improved pesticide recoveries from zirconia, as did citrate ions in CEN QuEChERS extracts. Finally, formate buffer in acetonitrile/MeOH (1:1) was required to give >70% recoveries of all 263 pesticides. Analysis of avocado extracts by LC-Q-Orbitrap-MS showed that the method removed >90% of di- and triacylglycerols. The method was validated for 269 pesticides (including homologues and metabolites) in avocado and citrus. Spike recoveries were within 70-120%, with RSDs below 20%, for 243 of these analytes in avocado and 254 in citrus when calibrated against solvent-only standards, indicating effective matrix removal and minimal electrospray ionization suppression.
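    The validation criteria cited above (mean spike recovery within 70-120%, RSD no greater than 20%) are easy to encode as a pass/fail check. A small sketch with invented recovery data:

```python
# Illustrative pass/fail check of typical residue-method validation
# criteria: mean spike recovery within 70-120% and RSD <= 20%.
from statistics import mean, stdev

def passes_validation(recoveries, lo=70.0, hi=120.0, max_rsd=20.0):
    """True if mean recovery is in [lo, hi] and RSD does not exceed max_rsd."""
    m = mean(recoveries)
    rsd = 100.0 * stdev(recoveries) / m
    return lo <= m <= hi and rsd <= max_rsd

print(passes_validation([95.0, 102.0, 88.0, 110.0, 99.0]))  # passes
print(passes_validation([45.0, 150.0, 60.0, 130.0, 70.0]))  # mean ok, RSD too high
```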

  18. Sieve-based relation extraction of gene regulatory networks from biological literature

    PubMed Central

    2015-01-01

    Background: Relation extraction is an essential procedure in literature mining. It focuses on extracting semantic relations between parts of text, called mentions. Biomedical literature includes an enormous amount of textual descriptions of biological entities, their interactions and results of related experiments. To extract them in an explicit, computer-readable format, these relations were at first extracted manually from databases. Manual curation was later replaced with automatic or semi-automatic tools with natural language processing capabilities. The current challenge is the development of information extraction procedures that can directly infer more complex relational structures, such as gene regulatory networks. Results: We develop a computational approach for extraction of gene regulatory networks from textual data. Our method is designed as a sieve-based system and uses linear-chain conditional random fields and rules for relation extraction. With this method we successfully extracted the sporulation gene regulation network in the bacterium Bacillus subtilis for the information extraction challenge at the BioNLP 2013 conference. To enable extraction of distant relations using first-order models, we transform the data into skip-mention sequences. We infer multiple models, each of which is able to extract different relationship types. Following the shared task, we conducted additional analysis using different system settings that resulted in reducing the reconstruction error of bacterial sporulation network from 0.73 to 0.68, measured as the slot error rate between the predicted and the reference network. We observe that all relation extraction sieves contribute to the predictive performance of the proposed approach. Also, features constructed by considering mention words and their prefixes and suffixes are the most important features for higher accuracy of extraction. Analysis of distances between different mention types in the text shows that our choice
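    The slot error rate used above to score the reconstructed network can be sketched as follows. This is one common SER formulation (substitutions + deletions + insertions over reference slots), not necessarily the shared task's exact scorer, and the edge triples below are illustrative examples drawn loosely from B. subtilis sporulation genes.

```python
# Illustrative slot error rate (SER) between a predicted and a reference
# gene regulatory network. An edge is (regulator, target, relation_type).
# Substitution: right pair, wrong type; deletion: missing reference edge;
# insertion: spurious predicted edge. Edge data are invented examples.

def slot_error_rate(predicted, reference):
    pred_pairs = {(a, b): t for a, b, t in predicted}
    ref_pairs  = {(a, b): t for a, b, t in reference}
    subs = sum(1 for p, t in ref_pairs.items()
               if p in pred_pairs and pred_pairs[p] != t)
    dels = sum(1 for p in ref_pairs if p not in pred_pairs)
    ins  = sum(1 for p in pred_pairs if p not in ref_pairs)
    return (subs + dels + ins) / len(ref_pairs)

reference = [("sigF", "spoIIR", "activation"),
             ("spo0A", "sigF", "activation"),
             ("sigE", "spoIIID", "activation"),
             ("spoIIID", "sigE", "repression")]
predicted = [("sigF", "spoIIR", "activation"),
             ("spo0A", "sigF", "repression"),   # wrong type -> substitution
             ("sigK", "gerE", "activation")]    # spurious   -> insertion

print(slot_error_rate(predicted, reference))  # (1 + 2 + 1) / 4 = 1.0
```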

  19. Fully automated analysis of four tobacco-specific N-nitrosamines in mainstream cigarette smoke using two-dimensional online solid phase extraction combined with liquid chromatography-tandem mass spectrometry.

    PubMed

    Zhang, Jie; Bai, Ruoshi; Yi, Xiaoli; Yang, Zhendong; Liu, Xingyu; Zhou, Jun; Liang, Wei

    2016-01-01

    A fully automated method for the detection of four tobacco-specific nitrosamines (TSNAs) in mainstream cigarette smoke (MSS) has been developed, based on two-dimensional online solid-phase extraction-liquid chromatography-tandem mass spectrometry (SPE/LC-MS/MS). The two-dimensional SPE employs two cartridges with different extraction mechanisms to clean up interferences of different polarity, minimizing sample matrix effects on each analyte. Chromatographic separation was achieved using a UPLC C18 reversed-phase analytical column. Under the optimum online SPE/LC-MS/MS conditions, N'-nitrosonornicotine (NNN), N'-nitrosoanatabine (NAT), N'-nitrosoanabasine (NAB), and 4-(methylnitrosamino)-1-(3-pyridyl)-1-butanone (NNK) were baseline separated with good peak shapes. This appears to be the most sensitive method yet reported for determination of TSNAs in mainstream cigarette smoke. The limits of quantification for NNN, NNK, NAT and NAB reached 6.0, 1.0, 3.0 and 0.6 pg/cig, respectively, well below the lowest levels of TSNAs in MSS of current commercial cigarettes. The accuracy of the measurement of the four TSNAs ranged from 92.8 to 107.3%. The relative standard deviations of intra- and inter-day analysis were less than 5.4% and 7.5%, respectively. The main advantages of the method are its high sensitivity, selectivity and accuracy, minimal sample pre-treatment, full automation, and high throughput. As part of the validation procedure, the method was applied to evaluate TSNA yields for 27 top-selling commercial cigarettes in China.

  20. Comparison of Boiling and Robotics Automation Method in DNA Extraction for Metagenomic Sequencing of Human Oral Microbes.

    PubMed

    Yamagishi, Junya; Sato, Yukuto; Shinozaki, Natsuko; Ye, Bin; Tsuboi, Akito; Nagasaki, Masao; Yamashita, Riu

    2016-01-01

    The rapid improvement of next-generation sequencing performance now enables us to analyze huge sample sets with more than ten thousand specimens. However, DNA extraction can still be a limiting step in such metagenomic approaches. In this study, we analyzed human oral microbes to compare the performance of three DNA extraction methods: PowerSoil (a method widely used in this field), QIAsymphony (a robotics method), and a simple boiling method. Dental plaque was initially collected from three volunteers in the pilot study and then expanded to 12 volunteers in the follow-up study. Bacterial flora was estimated by sequencing the V4 region of 16S rRNA following species-level profiling. Our results indicate that the efficiency of PowerSoil and QIAsymphony was comparable to the boiling method. Therefore, the boiling method may be a promising alternative because of its simplicity, cost effectiveness, and short handling time. Moreover, this method was reliable for estimating bacterial species and could be used in the future to examine the correlation between oral flora and health status. Despite this, differences in the efficiency of DNA extraction for various bacterial species were observed among the three methods. Based on these findings, there is no "gold standard" for DNA extraction. In future, we suggest that the DNA extraction method should be selected on a case-by-case basis considering the aims and specimens of the study. PMID:27104353
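    One simple way to quantify how similar the flora estimates are under two extraction methods, sketched below, is a Bray-Curtis dissimilarity between relative-abundance profiles. This is a generic ecology metric offered as an illustration, not the paper's actual analysis, and the taxa and abundances are invented.

```python
# Illustrative sketch: Bray-Curtis dissimilarity between the bacterial
# profiles obtained with two DNA extraction methods (0 = identical
# profiles, 1 = no shared abundance). Data are invented.

def bray_curtis(profile_a, profile_b):
    """Dissimilarity between two abundance dicts keyed by taxon."""
    taxa = set(profile_a) | set(profile_b)
    num = sum(abs(profile_a.get(t, 0.0) - profile_b.get(t, 0.0)) for t in taxa)
    den = sum(profile_a.get(t, 0.0) + profile_b.get(t, 0.0) for t in taxa)
    return num / den

boiling   = {"Streptococcus": 0.45, "Neisseria": 0.20, "Veillonella": 0.35}
powersoil = {"Streptococcus": 0.40, "Neisseria": 0.25, "Veillonella": 0.35}
print(round(bray_curtis(boiling, powersoil), 3))  # small value -> similar flora
```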

  2. Automated Identification of the Heart Wall Throughout the Entire Cardiac Cycle Using Optimal Cardiac Phase for Extracted Features

    NASA Astrophysics Data System (ADS)

    Takahashi, Hiroki; Hasegawa, Hideyuki; Kanai, Hiroshi

    2011-07-01

    In most methods for evaluation of cardiac function based on echocardiography, the heart wall is currently identified manually by an operator. However, this task is very time-consuming and suffers from inter- and intraobserver variability. The present paper proposes a method that uses multiple features of ultrasonic echo signals for automated identification of the heart wall region throughout an entire cardiac cycle. In addition, the optimal cardiac phase to select a frame of interest, i.e., the frame for the initiation of tracking, was determined. The heart wall region at the frame of interest in this cardiac phase was identified by the expectation-maximization (EM) algorithm, and heart wall regions in the following frames were identified by tracking each point classified in the initial frame as the heart wall region using the phased tracking method. The results for two subjects indicate the feasibility of the proposed method in the longitudinal axis view of the heart.
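    The EM classification step mentioned above can be illustrated with a toy one-dimensional two-component Gaussian mixture: bright "wall" echoes versus dark "lumen" samples. This is a generic EM sketch under invented amplitude distributions, not the authors' implementation.

```python
import math, random

# Toy EM illustration: fit a two-component 1-D Gaussian mixture to echo
# amplitudes so that samples can be classified as lumen (dark) or heart
# wall (bright). Generic algorithm sketch; data are simulated.

def em_two_gaussians(xs, iters=50):
    mu = [min(xs), max(xs)]          # spread initial means apart
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each sample
        resp = []
        for x in xs:
            p = [pi[k] / math.sqrt(2 * math.pi * var[k]) *
                 math.exp(-(x - mu[k]) ** 2 / (2 * var[k])) for k in (0, 1)]
            s = p[0] + p[1]
            resp.append((p[0] / s, p[1] / s))
        # M-step: update weights, means, variances
        for k in (0, 1):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(xs)
            mu[k] = sum(r[k] * x for r, x in zip(resp, xs)) / nk
            var[k] = max(sum(r[k] * (x - mu[k]) ** 2
                             for r, x in zip(resp, xs)) / nk, 1e-6)
    return mu, var, pi

random.seed(0)
lumen = [random.gauss(1.0, 0.3) for _ in range(200)]  # dark blood pool
wall  = [random.gauss(5.0, 0.5) for _ in range(200)]  # bright heart wall
mu, var, pi = em_two_gaussians(lumen + wall)
print([round(m, 1) for m in sorted(mu)])  # means recovered near 1.0 and 5.0
```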

  3. Recommendations relative to the scientific missions of a Mars Automated Roving Vehicle (MARV)

    NASA Technical Reports Server (NTRS)

    Spencer, R. L. (Editor)

    1973-01-01

    Scientific objectives of the MARV mission are outlined and specific science systems requirements and experimental payloads defined. All aspects of the Martian surface relative to biotic and geologic elements and those relating to geophysical and geochemical properties are explored.

  4. A METHOD FOR AUTOMATED ANALYSIS OF 10 ML WATER SAMPLES CONTAINING ACIDIC, BASIC, AND NEUTRAL SEMIVOLATILE COMPOUNDS LISTED IN USEPA METHOD 8270 BY SOLID PHASE EXTRACTION COUPLED IN-LINE TO LARGE VOLUME INJECTION GAS CHROMATOGRAPHY/MASS SPECTROMETRY

    EPA Science Inventory

    Data is presented showing the progress made towards the development of a new automated system combining solid phase extraction (SPE) with gas chromatography/mass spectrometry for the single run analysis of water samples containing a broad range of acid, base and neutral compounds...

  5. Automated gaseous criteria pollutant audits

    SciTech Connect

    Watson, J.P.

    1998-12-31

    The Quality Assurance Section (QAS) of the California Air Resources Board (CARB) began performing automated gaseous audits of its ambient air monitoring sites in July 1996. The concept of automated audits evolved from the constant streamlining of the through-the-probe audit process. Continual audit van development and the desire to utilize advanced technology to save time and improve the accuracy of the overall audit process also contributed to the concept. The automated audit process is a computer program which controls an audit van's ambient gas calibration system, isolated relay and analog-to-digital cards, and a monitoring station's data logging system. The program instructs the audit van's gas calibration system to deliver specified audit concentrations to a monitoring station's instruments through their collection probe inlet. The monitoring station's responses to the audit concentrations are obtained by the program polling the station's datalogger through its RS-232 port. The program calculates relevant audit statistics and stores all data collected during an audit in a relational database. Planning for the development of an automated gaseous audit system began in earnest in 1993, when the CARB purchased computerized ambient air calibration systems which could be remotely controlled by computer through their serial ports. After receiving all the required components of the automated audit system, they were individually tested to confirm their correct operation. Subsequently, a prototype program was developed to perform through-the-probe automated ozone audits. Numerous simulated ozone audits documented the program's ability to control audit equipment and extract data from a monitoring station's data logging system. The program was later modified to incorporate the capability to perform audits for carbon monoxide, total hydrocarbons, methane, nitrogen dioxide, sulfur dioxide, and hydrogen sulfide.
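    The audit cycle described above (deliver a known concentration, poll the station's response, compute statistics, store them in a relational database) can be sketched conceptually as below. The calibrator/datalogger here is a simulated stand-in, not a real instrument interface; function names and the table layout are assumptions for illustration.

```python
import sqlite3

# Conceptual sketch of one automated audit cycle: deliver a known
# concentration, read back the station's reported value, compute the
# percent difference, and record it in a relational database.
# `read_station` stands in for the real RS-232 datalogger polling.

def run_audit(points, read_station, db):
    db.execute("""CREATE TABLE IF NOT EXISTS audit
                  (pollutant TEXT, delivered REAL, reported REAL, pct_diff REAL)""")
    for pollutant, delivered in points:
        reported = read_station(pollutant, delivered)
        pct = 100.0 * (reported - delivered) / delivered
        db.execute("INSERT INTO audit VALUES (?, ?, ?, ?)",
                   (pollutant, delivered, reported, pct))
    db.commit()

def fake_station(pollutant, delivered):
    return delivered * 0.98  # simulated analyzer reading 2% low

db = sqlite3.connect(":memory:")
run_audit([("O3", 0.090), ("CO", 9.0), ("SO2", 0.050)], fake_station, db)
rows = db.execute("SELECT pollutant, round(pct_diff, 1) FROM audit").fetchall()
print(rows)  # every simulated pollutant reads 2.0% low
```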

  6. A Relation Extraction Framework for Biomedical Text Using Hybrid Feature Set

    PubMed Central

    Muzaffar, Abdul Wahab; Azam, Farooque; Qamar, Usman

    2015-01-01

    Information extraction from unstructured text segments is a complex task. Although manual information extraction often produces the best results, it is hard to manage manually at the scale of biomedical data because of the exponential increase in data size. Thus, there is a need for automatic tools and techniques for information extraction in biomedical text mining. Relation extraction is a significant area within biomedical information extraction that has gained much importance in the last two decades. A lot of work has been done on biomedical relation extraction focusing on rule-based and machine learning techniques. In the last decade, the focus has shifted to hybrid approaches, which show better results. This research presents a hybrid feature set for classification of relations between biomedical entities. Its main contribution lies in the semantic feature set, where verb phrases are ranked using the Unified Medical Language System (UMLS) and a ranking algorithm. Two effective machine learning techniques, Support Vector Machine and Naïve Bayes, are used to classify these relations. Our approach has been validated on the standard biomedical text corpus obtained from MEDLINE 2001, and it outperforms all state-of-the-art approaches for relation extraction evaluated on the same corpus. PMID:26347797
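    The classification step can be illustrated with a toy stand-in: a stdlib multinomial Naive Bayes over bag-of-words features of the text between two entities. This is only a minimal sketch of the idea (the paper uses SVM/NB over a richer hybrid lexical + UMLS semantic feature set); the class and training examples are invented.

```python
import math
from collections import Counter, defaultdict

# Toy relation classifier: multinomial Naive Bayes with Laplace smoothing
# over bag-of-words features of the text containing two entity slots.
# Illustrative only; training sentences and labels are invented.

class NaiveBayesRelation:
    def fit(self, texts, labels):
        self.counts = defaultdict(Counter)
        self.class_n = Counter(labels)
        for text, y in zip(texts, labels):
            self.counts[y].update(text.lower().split())
        self.vocab = {w for c in self.counts.values() for w in c}
        return self

    def predict(self, text):
        words = text.lower().split()
        best, best_lp = None, -math.inf
        for y, n in self.class_n.items():
            lp = math.log(n / sum(self.class_n.values()))  # class prior
            total = sum(self.counts[y].values())
            for w in words:
                lp += math.log((self.counts[y][w] + 1) /
                               (total + len(self.vocab)))  # Laplace smoothing
            if lp > best_lp:
                best, best_lp = y, lp
        return best

texts = ["ENTITY1 inhibits the activity of ENTITY2",
         "ENTITY1 phosphorylates ENTITY2 in vitro",
         "ENTITY1 was detected near ENTITY2",
         "no interaction between ENTITY1 and ENTITY2 was observed"]
labels = ["interaction", "interaction", "no_interaction", "no_interaction"]

clf = NaiveBayesRelation().fit(texts, labels)
print(clf.predict("ENTITY1 inhibits ENTITY2"))  # -> interaction
```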

  8. Automated Identification of Closed Mesoscale Cellular Convection and Impact of Resolution on Related Mesoscale Dynamics

    NASA Astrophysics Data System (ADS)

    Martini, M.; Gustafson, W. I.; Yang, Q.; Xiao, H.

    2013-12-01

    Organized mesoscale cellular convection (MCC) is a common feature of marine stratocumulus that forms in response to a balance between mesoscale dynamics and smaller-scale processes such as cloud radiative cooling and microphysics. Cloud-resolving models resolve some, but not all, of these processes, with less of the mesoscale dynamics resolved as grid spacing increases from <1 km to 10 km. While limited-domain cloud-resolving models can use high resolution to simulate MCC, global cloud-resolving models must resort to grid spacings closer to 5 to 10 km. This effectively truncates the scales through which the dynamics can act and impacts the MCC characteristics, potentially altering the climate impact of these clouds in climate models. To understand the impact of this truncation, we use the Weather Research and Forecasting model with chemistry (WRF-Chem) and fully coupled cloud-aerosol interactions to simulate marine low clouds during the VOCALS-REx campaign over the Southeast Pacific. A suite of experiments with 1-, 3- and 9-km grid spacing indicates resolution-dependent behavior. The simulations with finer grid spacing have lower liquid water paths and cloud fractions, while cloud tops are higher. When compared to observed liquid water paths from GOES and MODIS, the 3-km simulation agrees better over the coastal regions while the 9-km simulation agrees better over remote regions. The observed diurnal cycle is reasonably well simulated. To isolate organized MCC characteristics we developed a new automated method, which uses a variation of the watershed segmentation technique that combines the detection of cloud boundaries with a test for coincident vertical velocity characteristics. This has the advantage of ensuring that the detected cloud fields are dynamically consistent for closed MCC and helps minimize false detections from secondary circulations. We demonstrate that the 3-km simulation is able to reproduce the scaling between
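    The core idea, pairing detected cloud regions with a vertical-velocity consistency test, can be sketched in simplified form: find connected cloudy regions and keep only those whose interior mean vertical velocity is upward. This is a toy flood-fill stand-in for the watershed-based method described above; the grids and thresholds are invented.

```python
from collections import deque

# Toy sketch: detect connected cloudy regions in a cloud-fraction grid and
# accept only regions whose mean vertical velocity w is positive (updraft),
# a simplified dynamical-consistency test for closed cells. Data invented.

def closed_cells(cloud, w, cloud_thresh=0.5, w_thresh=0.0):
    ny, nx = len(cloud), len(cloud[0])
    seen = [[False] * nx for _ in range(ny)]
    cells = []
    for i in range(ny):
        for j in range(nx):
            if cloud[i][j] >= cloud_thresh and not seen[i][j]:
                region, queue = [], deque([(i, j)])  # flood-fill one region
                seen[i][j] = True
                while queue:
                    y, x = queue.popleft()
                    region.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        y2, x2 = y + dy, x + dx
                        if (0 <= y2 < ny and 0 <= x2 < nx
                                and cloud[y2][x2] >= cloud_thresh
                                and not seen[y2][x2]):
                            seen[y2][x2] = True
                            queue.append((y2, x2))
                mean_w = sum(w[y][x] for y, x in region) / len(region)
                if mean_w > w_thresh:  # keep dynamically consistent updraft cells
                    cells.append(region)
    return cells

cloud = [[0.9, 0.8, 0.0, 0.7],
         [0.9, 0.6, 0.0, 0.8],
         [0.0, 0.0, 0.0, 0.0]]
w     = [[0.3, 0.2, -0.1, -0.2],
         [0.4, 0.1, -0.1, -0.3],
         [0.0, 0.0,  0.0,  0.0]]
print(len(closed_cells(cloud, w)))  # left region rises, right one sinks
```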

  9. Discovery of Predicate-Oriented Relations among Named Entities Extracted from Thai Texts

    NASA Astrophysics Data System (ADS)

    Tongtep, Nattapong; Theeramunkong, Thanaruk

    Extracting named entities (NEs) and their relations is more difficult in Thai than in other languages due to several Thai-specific characteristics, including no explicit boundaries for words, phrases and sentences; few case markers and modifier clues; high ambiguity in compound words and serial verbs; and flexible word orders. Unlike most previous works, which focused on NE relations of specific actions such as work_for, live_in, located_in, and kill, this paper proposes a more general type of NE relation, called predicate-oriented relation (PoR), where an extracted action part (verb) is used as a core component to associate related named entities extracted from Thai texts. Lacking a practical parser for the Thai language, we present three types of surface features, i.e. punctuation marks (such as token spaces), entity types and the number of entities, and then apply five alternative commonly used learning schemes to investigate their performance on predicate-oriented relation extraction. The experimental results show that our approach achieves F-measures of 97.76%, 99.19%, 95.00% and 93.50% on four different types of predicate-oriented relation (action-location, location-action, action-person and person-action) in crime-related news documents using a data set of 1,736 entity pairs. The effects of NE extraction techniques, feature sets and class imbalance on the performance of relation extraction are explored.

  10. Automated solid-phase extraction and liquid chromatography-electrospray ionization-mass spectrometry for the determination of flunitrazepam and its metabolites in human urine and plasma samples.

    PubMed

    Jourdil, N; Bessard, J; Vincent, F; Eysseric, H; Bessard, G

    2003-05-25

    A sensitive and specific method using reversed-phase liquid chromatography coupled with electrospray ionization-mass spectrometry (LC-ESI-MS) has been developed for the quantitative determination of flunitrazepam (F) and its metabolites 7-aminoflunitrazepam (7-AF), N-desmethylflunitrazepam (N-DMF) and 3-hydroxyflunitrazepam (3-OHF) in biological fluids. After the addition of deuterium-labelled standards of F, 7-AF and N-DMF, the drugs were isolated from urine or plasma by automated solid-phase extraction, then chromatographed in an isocratic elution mode with a salt-free eluent. Quantification was performed using selected ion monitoring of protonated molecular ions (M+H(+)). Experiments were carried out to improve the extraction recovery (81-100%) and the sensitivity (limit of detection 0.025 ng/ml for F and 7-AF, 0.040 ng/ml for N-DMF and 0.200 ng/ml for 3-OHF). The method was applied to the determination of F and its metabolites in samples from drug addicts, including withdrawal urine samples, and in plasma and urine samples from one date-rape case. PMID:12705961

  11. Fully automated determination of 74 pharmaceuticals in environmental and waste waters by online solid phase extraction-liquid chromatography-electrospray-tandem mass spectrometry.

    PubMed

    López-Serna, Rebeca; Pérez, Sandra; Ginebreda, Antoni; Petrović, Mira; Barceló, Damià

    2010-12-15

    The present work describes the development of a fully automated method, based on on-line solid-phase extraction (SPE)-liquid chromatography-electrospray-tandem mass spectrometry (LC-MS-MS), for the determination of 74 pharmaceuticals in environmental waters (surface water and groundwater) as well as sewage waters. On-line SPE is performed by passing 2.5 mL of the water sample through a HySphere Resin GP cartridge. For unequivocal identification and confirmation, two selected reaction monitoring (SRM) transitions are monitored per compound, so that four identification points are achieved. Quantification is performed by the internal standard approach, indispensable for correcting losses during the solid phase extraction as well as matrix effects. The main advantages of the method are high sensitivity (limits of detection in the low ng L(-1) range), selectivity due to the use of tandem mass spectrometry, and reliability due to the use of 51 surrogates and minimal sample manipulation. As part of the validation procedure, the method was applied to the analysis of various environmental and sewage samples from a Spanish river and a sewage treatment plant.

  12. Comparison of Two Commercial Automated Nucleic Acid Extraction and Integrated Quantitation Real-Time PCR Platforms for the Detection of Cytomegalovirus in Plasma

    PubMed Central

    Tsai, Huey-Pin; Tsai, You-Yuan; Lin, I-Ting; Kuo, Pin-Hwa; Chen, Tsai-Yun; Chang, Kung-Chao; Wang, Jen-Ren

    2016-01-01

    Quantitation of cytomegalovirus (CMV) viral load in transplant patients has become standard practice for monitoring the response to antiviral therapy. The cut-off values of CMV viral load assays for preemptive therapy differ because of the various assay designs employed. To establish a sensitive and reliable diagnostic assay for preemptive therapy of CMV infection, two commercial automated platforms, the Abbott m2000sp extraction system integrated with the Abbott RealTime PCR (m2000rt) and the Roche COBAS AmpliPrep extraction system integrated with the COBAS TaqMan (CAP/CTM), were evaluated using WHO international CMV standards and 110 plasma specimens from transplant patients. The performance characteristics, correlation, and workflow of the two platforms were investigated. The Abbott RealTime assay correlated well with the Roche CAP/CTM assay (R2 = 0.9379, P<0.01). The Abbott RealTime assay exhibited higher sensitivity for the detection of CMV viral load, and viral load values measured with the Abbott RealTime assay were on average 0.76 log10 IU/mL higher than those measured with the Roche CAP/CTM assay (P<0.0001). In a workflow analysis of small batches, the Roche CAP/CTM platform required less hands-on time than the Abbott RealTime platform. In conclusion, the two assays provide reliable data for different purposes in a clinical virology laboratory setting. PMID:27494707
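    A method comparison like the one above typically reports a correlation statistic and a mean paired log10 difference (the 0.76 log10 IU/mL bias). A minimal sketch with invented paired viral loads:

```python
# Illustrative platform comparison: R-squared of paired log10 viral loads
# plus the mean paired log10 difference (Bland-Altman-style bias).
# All viral load values are invented.

def compare(roche, abbott):
    n = len(roche)
    mx, my = sum(roche) / n, sum(abbott) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(roche, abbott))
    sxx = sum((x - mx) ** 2 for x in roche)
    syy = sum((y - my) ** 2 for y in abbott)
    r2 = sxy * sxy / (sxx * syy)                       # coefficient of determination
    bias = sum(y - x for x, y in zip(roche, abbott)) / n  # mean log10 difference
    return r2, bias

roche  = [2.1, 2.8, 3.4, 4.0, 4.9, 5.6]  # log10 IU/mL
abbott = [2.9, 3.5, 4.2, 4.7, 5.7, 6.3]
r2, bias = compare(roche, abbott)
print(round(r2, 3), round(bias, 2))  # strong correlation with a positive offset
```

    A consistent positive bias like this is why assay-specific cut-offs matter when viral load results from different platforms are compared.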

  13. An automated flow injection system for metal determination by flame atomic absorption spectrometry involving on-line fabric disk sorptive extraction technique.

    PubMed

    Anthemidis, A; Kazantzi, V; Samanidou, V; Kabir, A; Furton, K G

    2016-08-15

    A novel flow injection-fabric disk sorptive extraction (FI-FDSE) system was developed for automated determination of trace metals. The platform was based on a minicolumn packed with sol-gel coated fabric media in the form of disks, incorporated into an on-line solid-phase extraction system coupled with flame atomic absorption spectrometry (FAAS). This configuration produces minor backpressure, allowing high loading flow rates and shorter analytical cycles. The potential of this technique was demonstrated for trace lead and cadmium determination in environmental water samples. The applicability of different sol-gel coated FPSE media was investigated. The on-line formed complex of metal with ammonium pyrrolidine dithiocarbamate (APDC) was retained on the fabric surface, and methyl isobutyl ketone (MIBK) was used to elute the analytes prior to atomization. For a 90 s preconcentration time, enrichment factors of 140 and 38 and detection limits (3σ) of 1.8 and 0.4 μg L(-1) were achieved for lead and cadmium, respectively, with a sampling frequency of 30 h(-1). The accuracy of the proposed method was estimated by analyzing standard reference materials and spiked water samples. PMID:27260436
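    Two figures of merit quoted above can be computed in a back-of-the-envelope way: the enrichment factor as the ratio of calibration slopes with and without preconcentration, and the 3σ detection limit from blank noise. The slopes and blank readings below are invented for illustration.

```python
from statistics import stdev

# Illustrative figures of merit for a preconcentration method.
# All slope and blank values are invented.

def enrichment_factor(slope_preconc, slope_direct):
    """Ratio of calibration slopes with vs. without preconcentration."""
    return slope_preconc / slope_direct

def lod_3sigma(blank_signals, slope):
    """3-sigma detection limit: 3 x blank standard deviation / sensitivity."""
    return 3 * stdev(blank_signals) / slope

ef = enrichment_factor(slope_preconc=0.0140, slope_direct=0.0001)
blanks = [0.0021, 0.0025, 0.0019, 0.0023, 0.0022]  # absorbance of blanks
print(round(ef), round(lod_3sigma(blanks, 0.0140), 3))  # EF and LOD (conc. units of the calibration)
```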

  14. Regenerable immuno-biochip for screening ochratoxin A in green coffee extract using an automated microarray chip reader with chemiluminescence detection.

    PubMed

    Sauceda-Friebe, Jimena C; Karsunke, Xaver Y Z; Vazac, Susanna; Biselli, Scarlett; Niessner, Reinhard; Knopp, Dietmar

    2011-03-18

    Ochratoxin A (OTA) can contaminate foodstuffs in the ppb to ppm range and, once formed, is difficult to remove. Because of its toxicity and potential risks to human health, the need exists for rapid, efficient detection methods that comply with legal maximum residual limits. In this work we synthesized an OTA conjugate functionalized with a water-soluble peptide for covalent immobilization on a glass biochip by means of contact spotting. The chip was used for OTA determination in an indirect competitive immunoassay format with flow-through reagent addition and chemiluminescence detection, carried out with the stand-alone automated Munich Chip Reader 3 (MCR 3) platform. A buffer model and real green coffee extracts were used for this purpose. At present, covalent conjugate immobilization allowed for at least 20 assay-regeneration cycles of the biochip surface. The total analysis time for a single sample, including measurement and surface regeneration, was 12 min, and the LOQ of OTA in green coffee extract was 0.3 μg L(-1), which corresponds to 7 μg kg(-1).

  15. Automated in-syringe single-drop head-space micro-extraction applied to the determination of ethanol in wine samples.

    PubMed

    Srámková, Ivana; Horstkotte, Burkhard; Solich, Petr; Sklenářová, Hana

    2014-05-30

    A novel approach to head-space single-drop micro-extraction applied to the determination of ethanol in wine is presented. For the first time, the syringe of an automated syringe pump was used as an extraction chamber of adaptable size for a volatile analyte. This approach made it possible to apply negative pressure during the enrichment step, which favored evaporation of the analyte. By placing a slowly spinning magnetic stirring bar inside the syringe, effective syringe cleaning as well as mixing of the sample with buffer solution to suppress the interference of acetic acid was achieved. Ethanol determination was based on the reduction of a single drop of 3 mmol L(-1) potassium dichromate dissolved in 8 mol L(-1) sulfuric acid. The drop was positioned in the syringe inlet in the head-space above the sample, with subsequent spectrophotometric quantification. The entire procedure was carried out automatically using a simple sequential injection analyzer system. One analysis required less than 5 min including the washing step. A limit of detection of 0.025% (v/v) of ethanol and an average repeatability of less than 5.0% RSD were achieved. The consumption of dichromate reagent, buffer, and sample per analysis was only 20 μL, 200 μL, and 1 mL, respectively. The results of real sample analysis did not differ significantly from those obtained with the reference gas chromatography method.

  16. Maneuver Automation Software

    NASA Technical Reports Server (NTRS)

    Uffelman, Hal; Goodson, Troy; Pellegrin, Michael; Stavert, Lynn; Burk, Thomas; Beach, David; Signorelli, Joel; Jones, Jeremy; Hahn, Yungsun; Attiyah, Ahlam; Illsley, Jeannette

    2009-01-01

    The Maneuver Automation Software (MAS) automates the process of generating commands for maneuvers to keep the spacecraft of the Cassini-Huygens mission on a predetermined prime mission trajectory. Before MAS became available, a team of approximately 10 members had to work about two weeks to design, test, and implement each maneuver in a process that involved running many maneuver-related application programs and then serially handing off data products to other parts of the team. MAS enables a three-member team to design, test, and implement a maneuver in about one-half hour after the navigation team has processed tracking data. MAS accepts more than 60 parameters and 22 files as input directly from users. MAS consists of Practical Extraction and Reporting Language (PERL) scripts that link, sequence, and execute the maneuver-related application programs: "pushing a single button" on a graphical user interface causes MAS to run navigation programs that design a maneuver; programs that create sequences of commands to execute the maneuver on the spacecraft; and a program that generates predictions about maneuver performance and generates reports and other files that enable users to quickly review and verify the maneuver design. MAS can also generate presentation materials, initiate electronic command request forms, and archive all data products for future reference.
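    The "single button" behavior described above amounts to running a fixed sequence of application programs and handing each product file to the next step. A hedged sketch of that pattern (the program and file names are hypothetical, not the actual MAS tools, and MAS itself is written in PERL):

```python
# Sketch of a "push one button" pipeline: run a fixed sequence of external
# programs in order, stopping on the first failure. Program and file names
# below are invented stand-ins for the real maneuver applications.
import subprocess

PIPELINE = [
    ["design_maneuver", "--tracking", "tracking.dat", "--out", "maneuver.dat"],
    ["build_sequence", "--maneuver", "maneuver.dat", "--out", "commands.seq"],
    ["predict_performance", "--maneuver", "maneuver.dat", "--report", "report.txt"],
]

def run_pipeline(steps, dry_run=False):
    """Run each step in order; return the command lines attempted."""
    attempted = []
    for argv in steps:
        attempted.append(" ".join(argv))
        if not dry_run:
            subprocess.run(argv, check=True)  # raises on a failed step
    return attempted

commands = run_pipeline(PIPELINE, dry_run=True)  # dry run: list without executing
```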

  17. High-quality DNA obtained with an automated DNA extraction method from 70+ year old formalin-fixed celloidin-embedded (FFCE) blocks from the Indiana Medical History Museum.

    PubMed

    Niland, Erin E; McGuire, Audrey; Cox, Mary H; Sandusky, George E

    2012-01-01

    DNA and RNA have been used as markers of tissue quality and integrity throughout the last few decades. In this research study, genomic-quality DNA from kidney, liver, heart, lung, spleen, and brain was analyzed in tissues from post-mortem patients and surgical cancer cases spanning the past century. DNA extraction was performed on over 180 samples from: 70+ year old formalin-fixed celloidin-embedded (FFCE) tissues; formalin-fixed paraffin-embedded (FFPE) tissue samples from surgical cases and post-mortem cases from the 1970's, 1980's, 1990's, and 2000's; tissues fixed in 10% neutral buffered formalin and stored in 70% ethanol from the 1990's; 70+ year old tissues fixed in unbuffered formalin of various concentrations; and fresh tissue as a control. To extract DNA from FFCE and ethanol-soaked samples, a modified standard operating procedure was used in which all tissues were homogenized and digested with a proteinase K solution for a long period (24-48 hours), and DNA was extracted using the Autogen Flexstar automated extraction machine. To extract DNA from FFPE samples, all tissues were soaked in xylene to remove the paraffin prior to digestion, and FFPE tissues were not homogenized. The results were as follows: celloidin-embedded and paraffin-embedded tissues yielded the highest DNA concentration and greatest DNA quality, while the tissues fixed in formalin of various concentrations and the long-term formalin/ethanol-stored tissues yielded both the lowest DNA concentration and quality of the tissues tested. The average DNA yield for the various fixatives was: 367.77 μg/mL for FFCE, 590.7 μg/mL for FFPE, 53.74 μg/mL for formalin-fixed/70% ethanol-stored, and 33.2 μg/mL for unbuffered formalin tissues. The average OD readings for FFCE, FFPE, formalin-fixed/70% ethanol-stored tissues, and tissues fixed in unbuffered formalin were 1.86, 1.87, 1.43, and 1.48, respectively. The results show that usable DNA can be extracted from tissue fixed in formalin and embedded in celloidin or

  18. Rapid analysis of three β-agonist residues in food of animal origin by automated on-line solid-phase extraction coupled to liquid chromatography and tandem mass spectrometry.

    PubMed

    Mi, Jiebo; Li, Shujing; Xu, Hong; Liang, Wei; Sun, Tao

    2014-09-01

    An automated online solid-phase extraction with liquid chromatography and tandem mass spectrometry method was developed and validated for the detection of clenbuterol, salbutamol, and ractopamine in food of animal origin. The samples from the food matrix were pretreated with an online solid-phase extraction cartridge (Oasis MCX) in less than 5 min after acid hydrolysis for 30 min. The peak-focusing mode was used to elute the target compounds directly onto a C18 column. Chromatographic separation was achieved under gradient conditions using a mobile phase composed of acetonitrile/0.1% formic acid in aqueous solution. Each analyte was detected in two multiple reaction monitoring transitions via an electrospray ionization source in positive mode. The relative standard deviations ranged from 2.6 to 10.5%, and recovery was between 76.7 and 107.2% at all quality control levels. The limits of quantification of the three β-agonists were in the range of 0.024-0.29 μg/kg in pork, sausage, and milk powder. This newly developed method offers high sensitivity and minimal sample pretreatment for the high-throughput analysis of β-agonist residues.

  19. Automated extraction of lysergic acid diethylamide (LSD) and N-demethyl-LSD from blood, serum, plasma, and urine samples using the Zymark RapidTrace with LC/MS/MS confirmation.

    PubMed

    de Kanel, J; Vickery, W E; Waldner, B; Monahan, R M; Diamond, F X

    1998-05-01

    A forensic procedure for the quantitative confirmation of lysergic acid diethylamide (LSD) and the qualitative confirmation of its metabolite, N-demethyl-LSD, in blood, serum, plasma, and urine samples is presented. The Zymark RapidTrace was used to perform fully automated solid-phase extractions of all specimen types. After extract evaporation, confirmations were performed using liquid chromatography (LC) followed by positive electrospray ionization (ESI+) mass spectrometry/mass spectrometry (MS/MS) without derivatization. Quantitation of LSD was accomplished using LSD-d3 as an internal standard. The limit of quantitation (LOQ) for LSD was 0.05 ng/mL. The limit of detection (LOD) for both LSD and N-demethyl-LSD was 0.025 ng/mL. The recovery of LSD was greater than 95% at levels of 0.1 ng/mL and 2.0 ng/mL. For LSD at 1.0 ng/mL, the within-run and between-run (different day) relative standard deviation (RSD) was 2.2% and 4.4%, respectively.
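    Quantitation against a deuterated internal standard such as LSD-d3 is conventionally done by reading the analyte/internal-standard peak-area ratio off a linear calibration. A minimal sketch with invented areas and calibration values (the paper's own calibration is not reproduced here):

```python
# Internal-standard quantitation: concentration from the response ratio
# and a linear calibration. Areas, slope, and intercept are invented.
def quantify(area_analyte, area_is, slope, intercept=0.0):
    """Concentration = (analyte/IS area ratio - intercept) / slope."""
    ratio = area_analyte / area_is
    return (ratio - intercept) / slope

# e.g. a sample whose response ratio sits on a calibration of slope
# 0.25 (ratio units per ng/mL, made up)
conc = quantify(area_analyte=1250.0, area_is=5000.0, slope=0.25)  # ng/mL
```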

  20. Automated and sensitive determination of four anabolic androgenic steroids in urine by online turbulent flow solid-phase extraction coupled with liquid chromatography-tandem mass spectrometry: a novel approach for clinical monitoring and doping control.

    PubMed

    Guo, Feng; Shao, Jing; Liu, Qian; Shi, Jian-Bo; Jiang, Gui-Bin

    2014-07-01

    A novel method for automated and sensitive analysis of testosterone, androstenedione, methyltestosterone and methenolone in urine samples by online turbulent flow solid-phase extraction coupled with high performance liquid chromatography-tandem mass spectrometry was developed. The optimization and validation of the method were discussed in detail. The Turboflow C18-P SPE column showed the best extraction efficiency for all the analytes. Nanogram per liter (ng/L) level of AAS could be determined directly and the limits of quantification (LOQs) were 0.01 ng/mL, which were much lower than normally concerned concentrations for these typical anabolic androgenic steroids (AAS) (0.1 ng/mL). The linearity range was from the LOQ to 100 ng/mL for each compound, with the coefficients of determination (r(2)) ranging from 0.9990 to 0.9999. The intraday and interday relative standard deviations (RSDs) ranged from 1.1% to 14.5% (n=5). The proposed method was successfully applied to the analysis of urine samples collected from 24 male athletes and 15 patients of prostate cancer. The proposed method provides an alternative practical way to rapidly determine AAS in urine samples, especially for clinical monitoring and doping control.

  1. Automated extraction and assessment of functional features of areal measured microstructures using a segmentation-based evaluation method

    NASA Astrophysics Data System (ADS)

    Hartmann, Wito; Loderer, Andreas

    2014-10-01

    In addition to currently available surface parameters, according to ISO 4287:2010 and ISO 25178-2:2012—which are defined particularly for stochastic surfaces—a universal evaluation procedure is provided for geometrical, well-defined, microstructured surfaces. Since several million features (such as diameters and depths) are present on microstructured surfaces, segmentation techniques are used to automate the feature-based dimensional evaluation. By applying an additional extended 3D evaluation after the segmentation and classification procedure, the accuracy of the evaluation is improved compared with the direct evaluation of segments, and additional functional parameters can be derived. Advantages of the extended segmentation-based evaluation method include not only the ability to evaluate the manufacturing process statistically (e.g. by capability indices, according to ISO 21747:2007 and ISO 3534-2:2013) and to derive statistically reliable values for the correction of microstructuring processes, but also the direct re-use of the evaluated parameters (including their statistical distributions) in simulations for the calculation of probabilities with respect to the functionality of the microstructured surface. The practical suitability of this method is demonstrated using examples of microstructures for the improvement of sliding behavior and ink transfer in printing machines.
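    The capability indices referenced above (ISO 21747) are typically the classical Cp and Cpk. An illustrative computation with invented feature measurements and tolerance limits:

```python
# Process capability indices from measured feature dimensions.
# Tolerance limits and measurements below are invented for the example.
import statistics

def capability_indices(samples, lsl, usl):
    """Cp = (USL-LSL)/(6*sigma); Cpk also penalizes off-centre processes."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)            # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)               # potential capability
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)  # capability incl. centring
    return cp, cpk

# e.g. measured diameters (um) of one microstructured feature
diameters = [49.8, 50.1, 50.0, 49.9, 50.2, 50.0]
cp, cpk = capability_indices(diameters, lsl=49.0, usl=51.0)
```

For a perfectly centred process, as in this example, Cp and Cpk coincide.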

  2. CD-REST: a system for extracting chemical-induced disease relation in literature.

    PubMed

    Xu, Jun; Wu, Yonghui; Zhang, Yaoyun; Wang, Jingqi; Lee, Hee-Jin; Xu, Hua

    2016-01-01

    Mining chemical-induced disease relations embedded in the vast biomedical literature could facilitate a wide range of computational biomedical applications, such as pharmacovigilance. In 2015, BioCreative V organized a Chemical Disease Relation (CDR) Track on chemical-induced disease relation extraction from biomedical literature. We participated in all subtasks of this challenge. In this article, we present our participating system, the Chemical Disease Relation Extraction SysTem (CD-REST), an end-to-end system for extracting chemical-induced disease relations from biomedical literature. CD-REST consists of two main components: (1) a chemical and disease named entity recognition and normalization module, which employs the Conditional Random Fields algorithm for entity recognition and a Vector Space Model-based approach for normalization; and (2) a relation extraction module that classifies both sentence-level and document-level candidate drug-disease pairs using support vector machines. Our system achieved the best performance on the chemical-induced disease relation extraction subtask of the BioCreative V CDR Track, demonstrating the effectiveness of our machine learning-based approaches for the automatic extraction of chemical-induced disease relations from biomedical literature. The CD-REST system provides web services via HTTP POST requests. The web services can be accessed from http://clinicalnlptool.com/cdr. The online CD-REST demonstration system is available at http://clinicalnlptool.com/cdr/cdr.html. Database URL: http://clinicalnlptool.com/cdr; http://clinicalnlptool.com/cdr/cdr.html.
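    The sentence-level classification step can be pictured as a supervised classifier over candidate pairs. A minimal stand-in using a bag-of-words SVM (CD-REST itself uses richer engineered features; the sentences and labels below are invented):

```python
# Toy sentence-level relation classifier: does a sentence assert a
# chemical-induced disease relation for the candidate pair, or merely
# mention both entities? Training data here are invented placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

train_sentences = [
    "CHEM induced severe DISEASE in treated rats",          # relation asserted
    "CHEM treatment caused DISEASE in two patients",        # relation asserted
    "CHEM and DISEASE were both mentioned in the review",   # co-occurrence only
    "patients with DISEASE also received CHEM for pain",    # co-occurrence only
]
train_labels = [1, 1, 0, 0]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
clf.fit(train_sentences, train_labels)
pred = clf.predict(["CHEM exposure induced DISEASE"])
```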

  3. Semi-automated extraction and delineation of 3D roads of street scene from mobile laser scanning point clouds

    NASA Astrophysics Data System (ADS)

    Yang, Bisheng; Fang, Lina; Li, Jonathan

    2013-05-01

    Accurate 3D road information is important for applications such as road maintenance and virtual 3D modeling. Mobile laser scanning (MLS) is an efficient technique for capturing dense point clouds that can be used to construct detailed road models for large areas. This paper presents a method for extracting and delineating roads from large-scale MLS point clouds. The proposed method partitions MLS point clouds into a set of consecutive "scanning lines", each of which represents a road cross section. A moving-window operator is used to filter out non-ground points line by line, and curb points are detected based on curb patterns. The detected curb points are tracked and refined so that they are both globally consistent and locally similar. To evaluate the validity of the proposed method, experiments were conducted using two types of street-scene point clouds captured by Optech's Lynx Mobile Mapper System. The completeness, correctness, and quality of the extracted roads are over 94.42%, 91.13%, and 91.3%, respectively, which demonstrates that the proposed method is a promising solution for extracting 3D roads from MLS point clouds.
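    Completeness, correctness, and quality are the standard evaluation measures for such extraction tasks, defined from true-positive (TP), false-positive (FP), and false-negative (FN) counts. A small sketch with made-up counts (not the paper's actual tallies):

```python
# Standard extraction-evaluation measures from TP/FP/FN counts.
# The counts below are invented for illustration.
def extraction_quality(tp, fp, fn):
    completeness = tp / (tp + fn)  # share of reference roads recovered
    correctness = tp / (tp + fp)   # share of extracted roads that are real
    quality = tp / (tp + fp + fn)  # combined measure, always <= the other two
    return completeness, correctness, quality

comp, corr, qual = extraction_quality(tp=90, fp=10, fn=5)
```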

  4. Extraction of Children's Friendship Relation from Activity Level

    NASA Astrophysics Data System (ADS)

    Kono, Aki; Shintani, Kimio; Katsuki, Takuya; Kihara, Shin'ya; Ueda, Mari; Kaneda, Shigeo; Haga, Hirohide

    Children learn to fit into society through living in a group, and this is greatly influenced by their friend relations. Although preschool teachers need to observe children in order to assist their social development and to support the development of each child's personality, only experienced teachers can watch over children while providing high-quality guidance. To address this problem, this paper proposes a mathematical, objective method that assists teachers with observation. The method uses numerical activity-level data recorded by pedometers, from which a tree diagram called a dendrogram is built by hierarchical clustering. In addition, the ``breadth'' and ``depth'' of children's friend relations are calculated using more than one dendrogram. When children's activity levels were recorded in a kindergarten for two months and the proposed method was evaluated, the results largely coincided with the teachers' remarks about the children.
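    The clustering step described above can be sketched with standard hierarchical clustering: children with similar activity profiles merge early in the dendrogram. The children and step counts below are invented:

```python
# Hierarchical (Ward) clustering of daily pedometer counts: the linkage
# matrix Z is the merge tree behind a dendrogram. Data are invented.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

# rows = children, columns = activity level (steps) on consecutive days
activity = np.array([
    [8200, 7900, 8500],   # child A
    [8100, 8000, 8300],   # child B, similar profile to A
    [3100, 2900, 3300],   # child C
    [3000, 3200, 2800],   # child D, similar profile to C
], dtype=float)

Z = linkage(activity, method="ward")             # dendrogram merge tree
groups = fcluster(Z, t=2, criterion="maxclust")  # cut into two friend groups
```

Passing `Z` to `scipy.cluster.hierarchy.dendrogram` would draw the tree diagram the paper describes.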

  5. Human Factors In Aircraft Automation

    NASA Technical Reports Server (NTRS)

    Billings, Charles

    1995-01-01

    Report presents a survey of the state of the art in human factors in the automation of aircraft operation. Examines aircraft automation and its effects on flight crews in relation to human error and aircraft accidents.

  6. An unsupervised text mining method for relation extraction from biomedical literature.

    PubMed

    Quan, Changqin; Wang, Meng; Ren, Fuji

    2014-01-01

    The wealth of interaction information provided in biomedical articles has motivated the implementation of text mining approaches to automatically extract biomedical relations. This paper presents an unsupervised method based on pattern clustering and sentence parsing for biomedical relation extraction. The pattern clustering algorithm is based on the polynomial kernel method and identifies interaction words from unlabeled data; these interaction words are then used in relation extraction between entity pairs. Dependency parsing and phrase structure parsing are combined for relation extraction. Based on the semi-supervised KNN algorithm, we extend the proposed unsupervised approach to a semi-supervised one by combining pattern clustering, dependency parsing, and phrase structure parsing rules. We evaluated the approaches on two different tasks: (1) protein-protein interaction extraction, and (2) gene-suicide association extraction. The evaluation of task (1) on the benchmark dataset (AImed corpus) showed that our proposed unsupervised approach outperformed three supervised methods, which are rule-based, SVM-based, and kernel-based, respectively. The proposed semi-supervised approach is superior to existing semi-supervised methods. The evaluation of gene-suicide association extraction on a smaller dataset from the Genetic Association Database and a larger dataset from publicly available PubMed showed that the proposed unsupervised and semi-supervised methods achieved much higher F-scores than a co-occurrence-based method.
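    The polynomial kernel at the heart of the pattern clustering step scores the similarity of two pattern feature vectors as (x·y + c)^d. A minimal sketch with invented pattern vectors (not the paper's actual features), grouping each pattern with its most similar neighbor:

```python
# Polynomial-kernel similarity between candidate interaction patterns.
# The patterns and their feature vectors are invented stand-ins.
import numpy as np

def poly_kernel(x, y, degree=2, coef0=1.0):
    """K(x, y) = (x . y + coef0) ** degree"""
    return (np.dot(x, y) + coef0) ** degree

patterns = {
    "A interacts with B": np.array([1, 1, 0, 0]),
    "A binds B": np.array([1, 0, 1, 0]),
    "A associated with B": np.array([1, 1, 0, 1]),
}

# pair each pattern with its most kernel-similar other pattern
nearest = {
    name: max((other for other in patterns if other != name),
              key=lambda other: poly_kernel(patterns[name], patterns[other]))
    for name in patterns
}
```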

  7. Revealing Dimensions of Thinking in Open-Ended Self-Descriptions: An Automated Meaning Extraction Method for Natural Language.

    PubMed

    2008-02-01

    A new method for extracting common themes from written text is introduced and applied to 1,165 open-ended self-descriptive narratives. Drawing on a lexical approach to personality, the most commonly used adjectives within narratives written by college students were identified using computerized text analytic tools. A factor analysis on the use of these adjectives in the self-descriptions produced a 7-factor solution consisting of psychologically meaningful dimensions. Some dimensions were unipolar (e.g., a Negativity factor, wherein most loaded items were negatively valenced adjectives); others were dimensional in that semantically opposite words clustered together (e.g., a Sociability factor, wherein terms such as shy, outgoing, reserved, and loud all loaded in the same direction). The factors exhibited modest reliability across different types of writing samples and were correlated with self-reports and behaviors consistent with the dimensions. Similar analyses with additional content words (adjectives, adverbs, nouns, and verbs) yielded additional psychological dimensions associated with physical appearance, school, relationships, etc., in which people contextualize their self-concepts. The results suggest that the meaning extraction method is a promising strategy for determining the dimensions along which people think about themselves.
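    The meaning extraction method reduces to factoring an adjective-by-narrative count matrix. A minimal sketch using PCA via SVD as a stand-in for the factor analysis (the counts below are invented):

```python
# Factor a narrative-by-adjective count matrix: the top right-singular
# vectors act as extracted "dimensions" (adjective loadings). A 2-component
# PCA stands in for the article's factor analysis; data are invented.
import numpy as np

# rows = narratives, columns = counts of ["shy", "outgoing", "sad", "angry"]
X = np.array([
    [3, 0, 0, 1],
    [2, 0, 1, 0],
    [0, 3, 0, 0],
    [0, 2, 1, 0],
    [1, 0, 3, 2],
    [0, 1, 2, 3],
], dtype=float)

Xc = X - X.mean(axis=0)                    # centre each adjective column
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
loadings = Vt[:2]                          # top-2 dimensions (adjective loadings)
scores = Xc @ loadings.T                   # each narrative's position on them
```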

  8. Revealing Dimensions of Thinking in Open-Ended Self-Descriptions: An Automated Meaning Extraction Method for Natural Language

    PubMed Central

    2008-01-01

    A new method for extracting common themes from written text is introduced and applied to 1,165 open-ended self-descriptive narratives. Drawing on a lexical approach to personality, the most commonly used adjectives within narratives written by college students were identified using computerized text analytic tools. A factor analysis on the use of these adjectives in the self-descriptions produced a 7-factor solution consisting of psychologically meaningful dimensions. Some dimensions were unipolar (e.g., a Negativity factor, wherein most loaded items were negatively valenced adjectives); others were dimensional in that semantically opposite words clustered together (e.g., a Sociability factor, wherein terms such as shy, outgoing, reserved, and loud all loaded in the same direction). The factors exhibited modest reliability across different types of writing samples and were correlated with self-reports and behaviors consistent with the dimensions. Similar analyses with additional content words (adjectives, adverbs, nouns, and verbs) yielded additional psychological dimensions associated with physical appearance, school, relationships, etc., in which people contextualize their self-concepts. The results suggest that the meaning extraction method is a promising strategy for determining the dimensions along which people think about themselves. PMID:18802499

  9. Is the use of Gunnera perpensa extracts in endometritis related to antibacterial activity?

    PubMed

    McGaw, L J; Gehring, R; Katsoulis, L; Eloff, J N

    2005-06-01

    Rhizome extracts of Gunnera perpensa are used in traditional remedies in South Africa to treat endometritis in both humans and animals. An investigation was undertaken to determine whether this plant possesses antibacterial activity, which may explain its efficacy. Gunnera perpensa rhizome extracts were prepared serially with solvents of increasing polarity and tested for antibacterial activity. Test bacteria included the Gram-positive Enterococcus faecalis and Staphylococcus aureus and the Gram-negative Escherichia coli and Pseudomonas aeruginosa. Most of the extracts showed moderate to weak antibacterial activity, with the best minimal inhibitory concentration (MIC) value of 2.61 mg ml(-1) shown by the acetone extract against S. aureus. The extracts were also submitted to the brine shrimp assay to detect possible toxic or pharmacological effects. All the extracts were lethal to the brine shrimp larvae at a concentration of 5 mg ml(-1). The acetone extract was extremely toxic at 1 mg ml(-1), with some toxicity evident at 0.1 mg ml(-1). The remaining extracts generally displayed little activity at concentrations lower than 5 mg ml(-1). In summary, the results indicate that although the extracts demonstrated a level of pharmacological activity, the relatively weak antibacterial activity is unlikely to justify the use of G. perpensa rhizomes in the traditional treatment of endometritis. Rather, the slightly antibacterial nature of the rhizomes may contribute an additive effect, along with their known uterotonic activity, to the overall efficacy of the preparation. PMID:16137130

  10. Hematocrit-Independent Quantitation of Stimulants in Dried Blood Spots: Pipet versus Microfluidic-Based Volumetric Sampling Coupled with Automated Flow-Through Desorption and Online Solid Phase Extraction-LC-MS/MS Bioanalysis.

    PubMed

    Verplaetse, Ruth; Henion, Jack

    2016-07-01

    A workflow overcoming microsample collection issues and hematocrit (HCT)-related bias would facilitate more widespread use of dried blood spots (DBS). This report describes comparative results between the use of a pipet and a microfluidic-based sampling device for the creation of volumetric DBS. Both approaches were successfully coupled to HCT-independent, fully automated sample preparation and online liquid chromatography coupled to tandem mass spectrometry (LC-MS/MS) analysis allowing detection of five stimulants in finger prick blood. Reproducible, selective, accurate, and precise responses meeting generally accepted regulated bioanalysis guidelines were observed over the range of 5-1000 ng/mL whole blood. The applied heated flow-through solvent desorption of the entire spot and online solid phase extraction (SPE) procedure were unaffected by the blood's HCT value within the tested range of 28.0-61.5% HCT. Enhanced stability for mephedrone on DBS compared to liquid whole blood was observed. Finger prick blood samples were collected using both volumetric sampling approaches over a time course of 25 h after intake of a single oral dose of phentermine. A pharmacokinetic curve for the incurred phentermine was successfully produced using the described validated method. These results suggest that either volumetric sample collection method may be amenable to field-use followed by fully automated, HCT-independent DBS-SPE-LC-MS/MS bioanalysis for the quantitation of these representative controlled substances. Analytical data from DBS prepared with a pipet and microfluidic-based sampling devices were comparable, but the latter is easier to operate, making this approach more suitable for sample collection by unskilled persons. PMID:27270226

  11. Extraction and colorimetric determination of azadirachtin-related limonoids in neem seed kernel.

    PubMed

    Dai, J; Yaylayan, V A; Raghavan, G S; Parè, J R

    1999-09-01

    A colorimetric method was developed for the determination of total azadirachtin-related limonoids (AZRL) in neem seed kernel extracts. The method employed acidified vanillin solution in methanol for the colorization of standard azadirachtin or neem seed kernel extracts in dichloromethane. Through the investigation of various factors influencing the sensitivity of detection, such as the concentrations of vanillin and acid and the time required for color formation, optimum conditions were selected for performing the assay. Under the optimum conditions, good linearity was found between the absorbance at 577 nm and the concentration of standard azadirachtin solution in the range of 0.01-0.10 mg/mL. In addition, different extraction procedures were evaluated using the vanillin assay. HPLC analysis of the extracts indicated that if the extractions were performed in methanol followed by partitioning into dichloromethane, approximately 50% of the value determined by the vanillin assay represents azadirachtin content. PMID:10552715

  12. Analysis of trace contamination of phthalate esters in ultrapure water using a modified solid-phase extraction procedure and automated thermal desorption-gas chromatography/mass spectrometry.

    PubMed

    Liu, Hsu-Chuan; Den, Walter; Chan, Shu-Fei; Kin, Kuan Tzu

    2008-04-25

    The present study aimed to develop a procedure, modified from the conventional solid-phase extraction (SPE) method, for the analysis of trace concentrations of phthalate esters in industrial ultrapure water (UPW). The proposed procedure allows a UPW sample to be drawn through a sampling tube containing a hydrophobic sorbent (Tenax TA) to concentrate the aqueous phthalate esters. The solid trap was then demoisturized by two-stage gas drying before being subjected to thermal desorption and analysis by gas chromatography-mass spectrometry. This process eliminates the solvent extraction step required by the conventional SPE method and permits automation of the analytical procedure for high-volume analyses. Several important parameters, including desorption temperature and duration, packing quantity, and the demoisturizing procedure, were optimized in this study based on the analytical sensitivity for a standard mixture containing five different phthalate esters. The method detection limits for the five phthalate esters were between 36 ng l(-1) and 95 ng l(-1), with recovery rates between 15% and 101%. Dioctyl phthalate (DOP) was not recovered adequately because the compound was both poorly adsorbed onto and poorly desorbed from the Tenax TA sorbent. Furthermore, analyses of material leaching from poly(vinyl chloride) (PVC) tubes, as well as of actual water samples, showed that di-n-butyl phthalate (DBP) and di(2-ethylhexyl) phthalate (DEHP) were the common contaminants detected in PVC-contaminated UPW and in actual UPW, as well as in tap water. The reduction of DEHP in the production processes of actual UPW was clearly observed; however, a DEHP concentration of 0.20 μg l(-1) at the point of use was still being quantified, suggesting that the contamination of phthalate esters could present a barrier to future cleanliness requirements for UPW. The work demonstrated that the proposed modified SPE procedure provided an effective method for rapid analysis and contamination

  13. Automated measurement of parameters related to the deformities of lower limbs based on x-rays images.

    PubMed

    Wojciechowski, Wadim; Molka, Adrian; Tabor, Zbisław

    2016-03-01

    Measurement of the deformation of the lower limbs in current standard full-limb X-ray images presents significant challenges to radiologists and orthopedists. The precision of these measurements is degraded by inexact positioning of the leg during image acquisition, difficulty in selecting reliable anatomical landmarks in projective X-ray images, and inevitable errors of manual measurement. The influence of the random errors resulting from the last two factors on the precision of the measurement can be reduced if an automated measurement method is used instead of a manual one. In this paper, a framework for the automated measurement of various metric and angular quantities used in the description of lower extremity deformation in full-limb frontal X-ray images is described. The results of automated measurements are compared with manual measurements. These results demonstrate that an automated method can be a valuable alternative to manual measurements.

  14. Integrated DNA and RNA extraction and purification on an automated microfluidic cassette from bacterial and viral pathogens causing community-acquired lower respiratory tract infections.

    PubMed

    Van Heirstraeten, Liesbet; Spang, Peter; Schwind, Carmen; Drese, Klaus S; Ritzi-Lehnert, Marion; Nieto, Benjamin; Camps, Marta; Landgraf, Bryan; Guasch, Francesc; Corbera, Antoni Homs; Samitier, Josep; Goossens, Herman; Malhotra-Kumar, Surbhi; Roeser, Tina

    2014-05-01

    In this paper, we describe the development of an automated sample preparation procedure for etiological agents of community-acquired lower respiratory tract infections (CA-LRTI). The consecutive assay steps, including sample re-suspension, pre-treatment, lysis, nucleic acid purification, and concentration, were integrated into a microfluidic lab-on-a-chip (LOC) cassette that is operated hands-free by a demonstrator setup providing fluidic and valve actuation. The performance of the assay was evaluated on viral and Gram-positive and Gram-negative bacterial broth cultures previously sampled using a nasopharyngeal swab. Sample preparation on the microfluidic cassette resulted in higher or similar concentrations of pure bacterial DNA or viral RNA compared with manual benchtop experiments. The miniaturization and integration of the complete sample preparation procedure, which extracts purified nucleic acids from real samples of CA-LRTI pathogens at, or above, laboratory quality and efficiency, represent important steps towards its application in a point-of-care test (POCT) for the rapid diagnosis of CA-LRTI. PMID:24615272

  15. Automation of reverse engineering process in aircraft modeling and related optimization problems

    NASA Technical Reports Server (NTRS)

    Li, W.; Swetits, J.

    1994-01-01

    During 1994, the engineering problems in aircraft modeling were studied. The initial concern was to obtain a surface model with desirable geometric characteristics. Much of the effort during the first half of the year went into finding an efficient way of solving a computationally difficult optimization model. Since the smoothing technique in the proposal 'Surface Modeling and Optimization Studies of Aerodynamic Configurations' requires solutions of a sequence of large-scale quadratic programming problems, it is important to design algorithms that can solve each quadratic program in a few iterations. This research led to three papers by Dr. W. Li, which were submitted to SIAM Journal on Optimization and Mathematical Programming; two of these papers have been accepted for publication. Even though significant progress was made during this phase of research and computation time was reduced from 30 min to 2 min for a sample problem, this was not good enough for on-line processing of digitized data points. After discussion with Dr. Robert E. Smith Jr., it was decided not to enforce shape constraints, in order to simplify the model. As a consequence, P. Dierckx's nonparametric spline fitting approach was adopted, in which there is only one control parameter for the fitting process - the error tolerance. At the same time, the surface modeling software developed by Imageware was tested. Research indicated that a substantially improved fit of digitized data points can be achieved if a proper parameterization of the spline surface is chosen. A winning strategy is to combine Dierckx's surface fitting with a natural parameterization for aircraft parts. The report consists of 4 chapters. Chapter 1 provides an overview of reverse engineering related to aircraft modeling and some preliminary findings of the effort in the second half of the year. Chapters 2-4 are the research results by Dr. W. Li on penalty functions and conjugate gradient methods for

  16. Semi-automated relative quantification of cell culture contamination with mycoplasma by Photoshop-based image analysis on immunofluorescence preparations.

    PubMed

    Kumar, Ashok; Yerneni, Lakshmana K

    2009-01-01

    Mycoplasma contamination in cell culture is a serious setback for the cell culturist. Experiments undertaken using contaminated cell cultures are known to yield unreliable or false results owing to various morphological, biochemical and genetic effects. Earlier surveys revealed that the incidence of mycoplasma contamination in cell cultures ranges from 15 to 80%. Among the vast array of methods for detecting mycoplasma in cell culture, cytological methods directly demonstrate the contaminating organism present in association with the cultured cells. In this investigation, we report the adoption of a cytological immunofluorescence assay (IFA) to obtain a semi-automated relative quantification of contamination by employing user-friendly Photoshop-based image analysis. The study, performed on 77 cell cultures randomly collected from various laboratories, revealed mycoplasma contamination in 18 cell cultures simultaneously by IFA and Hoechst DNA fluorochrome staining methods. The Photoshop-based image analysis of IFA-stained slides proved to be a sensitive tool for quantitative assessment of the extent of contamination, both per se and in relation to the cellularity of the cell cultures. The technique could be useful in estimating the efficacy of anti-mycoplasma agents during decontamination measures.
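    The quantification step amounts to comparing thresholded stained-signal area against cell-derived signal area. A stdlib-only sketch of that idea (the function name, toy pixel grids, and threshold are invented for illustration; the study itself used Photoshop tooling):

```python
def contamination_index(ifa_pixels, dna_pixels, threshold=128):
    """Ratio of IFA-positive (mycoplasma) pixel area to Hoechst/DNA
    (cellularity) pixel area, each counted above an intensity threshold."""
    ifa_area = sum(1 for row in ifa_pixels for p in row if p >= threshold)
    dna_area = sum(1 for row in dna_pixels for p in row if p >= threshold)
    return ifa_area / dna_area if dna_area else float("nan")

# Toy 3x3 intensity grids standing in for grayscale channel images
ifa = [[200, 10, 10], [10, 180, 10], [10, 10, 10]]
dna = [[255, 255, 0], [255, 255, 0], [0, 0, 0]]
index = contamination_index(ifa, dna)  # 2 positive IFA pixels / 4 DNA pixels
```

    Normalizing by the DNA channel is what makes the measure relative to cellularity rather than to raw image area.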

  17. Automated Learning of Subcellular Variation among Punctate Protein Patterns and a Generative Model of Their Relation to Microtubules

    PubMed Central

    Johnson, Gregory R.; Li, Jieyue; Shariff, Aabid; Rohde, Gustavo K.; Murphy, Robert F.

    2015-01-01

    Characterizing the spatial distribution of proteins directly from microscopy images is a difficult problem with numerous applications in cell biology (e.g. identifying motor-related proteins) and clinical research (e.g. identification of cancer biomarkers). Here we describe the design of a system that provides automated analysis of punctate protein patterns in microscope images, including quantification of their relationships to microtubules. We constructed the system using confocal immunofluorescence microscopy images from the Human Protein Atlas project for 11 punctate proteins in three cultured cell lines. These proteins have previously been characterized as being primarily located in punctate structures, but their images had all been annotated by visual examination as being simply “vesicular”. We were able to show that these patterns could be distinguished from each other with high accuracy, and we were able to assign to one of these subclasses hundreds of proteins whose subcellular localization had not previously been well defined. In addition to providing these novel annotations, we built a generative approach to modeling of punctate distributions that captures the essential characteristics of the distinct patterns. Such models are expected to be valuable for representing and summarizing each pattern and for constructing systems biology simulations of cell behaviors. PMID:26624011

  18. RADARS, a bioinformatics solution that automates proteome mass spectral analysis, optimises protein identification, and archives data in a relational database.

    PubMed

    Field, Helen I; Fenyö, David; Beavis, Ronald C

    2002-01-01

    RADARS, a rapid, automated, data archiving and retrieval software system for high-throughput proteomic mass spectral data processing and storage, is described. The majority of mass spectrometer data files are compatible with RADARS, allowing consistent processing. The system automatically takes unprocessed data files, identifies proteins via in silico database searching, then stores the processed data and search results in a relational database suitable for customized reporting. The system is robust, used in 24/7 operation, accessible to multiple users of an intranet through a web browser, may be monitored by Virtual Private Network, and is secure. RADARS is scalable for use on one or many computers, and is suited to multiple-processor systems. It can incorporate any local database in FASTA format, and can search protein and DNA databases online. A key feature is a suite of visualisation tools (many available gratis), allowing facile manipulation of spectra by hand annotation, reanalysis, and access to all procedures. We also describe the use of Sonar MS/MS, a novel, rapid search engine requiring 40 MB RAM per process for searches against a genomic or EST database translated in all six reading frames. RADARS reduces the cost of analysis through its efficient algorithms: Sonar MS/MS can identify proteins without accurate knowledge of the parent ion mass and without protein tags. Statistical scoring methods provide close-to-expert accuracy and bring robust data analysis to the non-expert user.
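    The archiving step comes down to writing identification results into relational tables that reporting queries can hit later. A toy illustration with SQLite (the table layout, column names, and values are invented; the actual RADARS schema is not described in the abstract):

```python
import sqlite3

# In-memory database standing in for the archive
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE search_result (
    spectrum_id TEXT, protein_acc TEXT, score REAL)""")

# One hypothetical protein identification from a database search
conn.execute("INSERT INTO search_result VALUES (?, ?, ?)",
             ("run01_scan0042", "P68871", 153.2))

# Customized reporting: best-scoring hit
top = conn.execute(
    "SELECT protein_acc, score FROM search_result ORDER BY score DESC LIMIT 1"
).fetchone()
```

    Storing results relationally, rather than as flat per-run files, is what enables the multi-user, report-oriented access the abstract emphasizes.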

  19. Automated extraction of typing information for bacterial pathogens from whole genome sequence data: Neisseria meningitidis as an exemplar.

    PubMed

    Jolley, K A; Maiden, M C

    2013-01-01

    Whole genome sequence (WGS) data are increasingly used to characterise bacterial pathogens. These data provide detailed information on the genotypes and likely phenotypes of aetiological agents, enabling the relationships of samples from potential disease outbreaks to be established precisely. However, the generation of increasing quantities of sequence data does not, in itself, resolve the problems that many microbiological typing methods have addressed over the last 100 years or so; indeed, providing large volumes of unstructured data can confuse rather than resolve these issues. Here we review the nascent field of storage of WGS data for clinical application and show how curated sequence-based typing schemes on websites have generated an infrastructure that can exploit WGS for bacterial typing efficiently. We review the tools that have been implemented within the PubMLST website to extract clinically useful, strain-characterisation information that can be provided to physicians and public health professionals in a timely, concise and understandable way. These data can be used to inform medical decisions such as how to treat a patient, whether to instigate public health action, and what action might be appropriate. The information is compatible both with previous sequence-based typing data and also with data obtained in the absence of WGS, providing a flexible infrastructure for WGS-based clinical microbiology. PMID:23369391
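    Curated sequence-based typing of the kind hosted on PubMLST maps an isolate's allele combination at a fixed set of loci to a named sequence type. A minimal illustration of that lookup (the seven-locus profiles and ST labels below are invented, not real scheme data):

```python
# Hypothetical allele-profile -> sequence-type table, in the spirit of an
# MLST scheme; real schemes are curated databases with thousands of entries.
ST_TABLE = {
    (1, 3, 2, 2, 4, 3, 3): "ST-11",
    (2, 3, 4, 3, 8, 4, 6): "ST-41",
}

def assign_sequence_type(profile):
    """Return the named sequence type for an allele profile, if known."""
    return ST_TABLE.get(tuple(profile), "novel profile")

st = assign_sequence_type([1, 3, 2, 2, 4, 3, 3])
```

    The point of such schemes is exactly what the abstract argues: turning unstructured WGS data into a concise, comparable label that a physician or public health professional can act on.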

  20. A new adaptive algorithm for automated feature extraction in exponentially damped signals for health monitoring of smart structures

    NASA Astrophysics Data System (ADS)

    Qarib, Hossein; Adeli, Hojjat

    2015-12-01

    In this paper the authors introduce a new adaptive signal processing technique for feature extraction and parameter estimation in noisy exponentially damped signals. The iterative three-stage method is based on the adroit integration of the strengths of parametric and nonparametric methods such as the multiple signal classification (MUSIC), matrix pencil, and empirical mode decomposition algorithms. The first stage is a new adaptive filtration or noise removal scheme. The second stage is a hybrid parametric-nonparametric signal parameter estimation technique based on an output-only system identification technique. The third stage is optimization of the estimated parameters using a combination of the primal-dual path-following interior point algorithm and a genetic algorithm. The methodology is evaluated using a synthetic signal and a signal obtained experimentally from transverse vibrations of a steel cantilever beam. The method is successful in estimating the frequencies accurately, and it also estimates the damping exponents. The proposed adaptive filtration method does not include any frequency-domain manipulation; consequently, the time-domain signal is not affected by frequency-domain and inverse transformations.
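    For a single clean exponentially damped mode, the two quantities the pipeline targets (frequency and damping exponent) can be read off directly from successive peaks: peak spacing gives the period, and the log-ratio of peak amplitudes gives the damping. A simplified single-mode sketch (not the authors' three-stage method, which is built for noisy, multi-mode signals):

```python
import math

def estimate_damped_params(signal, dt):
    """Estimate (frequency_hz, sigma) of A*exp(-sigma*t)*cos(2*pi*f*t)
    from the first two local maxima of a sampled signal."""
    peaks = [(i * dt, signal[i]) for i in range(1, len(signal) - 1)
             if signal[i - 1] < signal[i] > signal[i + 1]]
    (t1, a1), (t2, a2) = peaks[0], peaks[1]
    freq = 1.0 / (t2 - t1)                 # one period between positive peaks
    sigma = math.log(a1 / a2) / (t2 - t1)  # envelope decay between peaks
    return freq, sigma

# Synthetic signal: f = 5 Hz, sigma = 2, sampled at 1 kHz for 2 s
dt = 0.001
sig = [math.exp(-2.0 * i * dt) * math.cos(2 * math.pi * 5.0 * i * dt)
       for i in range(2000)]
f_est, sigma_est = estimate_damped_params(sig, dt)
```

    Noise breaks this naive peak-picking immediately, which is why the paper's first stage is an adaptive filtration step before any parameter estimation.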

  1. Extraction conditions of white rose petals for the inhibition of enzymes related to skin aging.

    PubMed

    Choi, Ehn-Kyoung; Guo, Haiyu; Choi, Jae-Kwon; Jang, Su-Kil; Shin, Kyungha; Cha, Ye-Seul; Choi, Youngjin; Seo, Da-Woom; Lee, Yoon-Bok; Joo, Seong-So; Kim, Yun-Bae

    2015-09-01

    In order to assess inhibitory potentials of white rose petal extracts (WRPE) on the activities of enzymes related to dermal aging according to the extraction conditions, three extraction methods were adopted. WRPE was prepared by extracting dried white rose (Rosa hybrida) petals with 50% ethanol (WRPE-EtOH), Pectinex® SMASH XXL enzyme (WRPE-enzyme) or high temperature-high pressure (WRPE-HTHP). In the inhibition of matrix metalloproteinase-1, although the enzyme activity was fully inhibited by all 3 extracts at 100 µg/mL in 60 min, partial inhibition (50-70%) was achieved only by WRPE-EtOH and WRPE-enzyme at 50 µg/mL. High concentrations (≥250 µg/mL) of all 3 extracts markedly inhibited the elastase activity. However, at low concentrations (15.6-125 µg/mL), only WRPE-EtOH inhibited the enzyme activity. Notably, WRPE-EtOH was superior to WRPE-enzyme and WRPE-HTHP in the inhibition of tyrosinase. WRPE-EtOH significantly inhibited the enzyme activity from 31.2 µM, reaching 80% inhibition at 125 µM. In addition to its strong antioxidative activity, the ethanol extract of white rose petals was confirmed to be effective in inhibiting skin aging-related enzymes. Therefore, it is suggested that WRPE-EtOH could be a good candidate for the improvement of skin aging such as wrinkle formation and pigmentation.
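    The inhibition percentages reported above come from paired activity readings with and without extract. A generic sketch of that calculation (the readings are hypothetical, not the study's raw data):

```python
def percent_inhibition(control_activity, treated_activity):
    """Percent reduction in enzyme activity relative to an uninhibited control."""
    return 100.0 * (control_activity - treated_activity) / control_activity

# Hypothetical tyrosinase readings (arbitrary absorbance units)
inhibition = percent_inhibition(0.80, 0.16)  # 80% inhibition
```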

  2. Progress Toward Automated Cost Estimation

    NASA Technical Reports Server (NTRS)

    Brown, Joseph A.

    1992-01-01

    Report discusses efforts to develop standard system of automated cost estimation (ACE) and computer-aided design (CAD). Advantage of system is time saved and accuracy enhanced by automating extraction of quantities from design drawings, consultation of price lists, and application of cost and markup formulas.
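    The ACE idea combines three automatable steps: quantity takeoff from drawings, price-list lookup, and a markup formula. A toy sketch (the price list, item names, and markup value are invented for illustration):

```python
# Hypothetical unit prices, standing in for a consulted price list
PRICE_LIST = {"sheet_aluminum_m2": 42.0, "rivet": 0.08}

def estimate_cost(quantities, markup=0.15):
    """Price extracted quantities from the list and apply a markup formula."""
    base = sum(PRICE_LIST[item] * qty for item, qty in quantities.items())
    return base * (1.0 + markup)

# Quantities that would be extracted automatically from design drawings
total = estimate_cost({"sheet_aluminum_m2": 10, "rivet": 500})
```

    The time saving comes from the takeoff step; the accuracy gain comes from applying the same lookup and formula uniformly rather than by hand.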

  3. Systematic review automation technologies.

    PubMed

    Tsafnat, Guy; Glasziou, Paul; Choong, Miew Keen; Dunn, Adam; Galgani, Filippo; Coiera, Enrico

    2014-07-09

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors to the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of its tasks. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed to realize automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage that the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time.

  5. Development and validation of an automated liquid-liquid extraction GC/MS method for the determination of THC, 11-OH-THC, and free THC-carboxylic acid (THC-COOH) from blood serum.

    PubMed

    Purschke, Kirsten; Heinl, Sonja; Lerch, Oliver; Erdmann, Freidoon; Veit, Florian

    2016-06-01

    The analysis of Δ(9)-tetrahydrocannabinol (THC) and its metabolites 11-hydroxy-Δ(9)-tetrahydrocannabinol (11-OH-THC), and 11-nor-9-carboxy-Δ(9)-tetrahydrocannabinol (THC-COOH) from blood serum is a routine task in forensic toxicology laboratories. For examination of consumption habits, the concentration of the phase I metabolite THC-COOH is used. Recommendations for interpretation of analysis values in medical-psychological assessments (regranting of driver's licenses, Germany) include threshold values for the free, unconjugated THC-COOH. Using a fully automated two-step liquid-liquid extraction, THC, 11-OH-THC, and free, unconjugated THC-COOH were extracted from blood serum, silylated with N-methyl-N-(trimethylsilyl) trifluoroacetamide (MSTFA), and analyzed by GC/MS. The automation was carried out by an x-y-z sample robot equipped with modules for shaking, centrifugation, and solvent evaporation. This method was based on a previously developed manual sample preparation method. Validation guidelines of the Society of Toxicological and Forensic Chemistry (GTFCh) were fulfilled for both methods, with the focus of this article on the automated one. Limits of detection and quantification for THC were 0.3 and 0.6 μg/L, for 11-OH-THC were 0.1 and 0.8 μg/L, and for THC-COOH were 0.3 and 1.1 μg/L, when extracting only 0.5 mL of blood serum. Therefore, the required limit of quantification for THC of 1 μg/L in driving under the influence of cannabis cases in Germany (and other countries) can be reached and the method can be employed in that context. Real and external control samples were analyzed, and a round robin test was passed successfully. To date, the method is employed in the Institute of Legal Medicine in Giessen, Germany, in daily routine. Automation helps in avoiding errors during sample preparation and reduces the workload of the laboratory personnel. Due to its flexibility, the analysis system can be employed for other liquid-liquid extractions as
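    Limits of detection and quantification like those reported are commonly derived from a calibration line as 3.3·s/slope and 10·s/slope, where s is the residual standard error of the regression. A generic sketch of that convention (illustrative data; not necessarily the GTFCh-prescribed computation):

```python
import statistics

def lod_loq(concs, responses):
    """Calibration-based limits: LOD = 3.3*s/slope, LOQ = 10*s/slope,
    with s the standard error of the linear-regression residuals."""
    n = len(concs)
    mean_x = statistics.fmean(concs)
    mean_y = statistics.fmean(responses)
    sxx = sum((x - mean_x) ** 2 for x in concs)
    slope = sum((x - mean_x) * (y - mean_y)
                for x, y in zip(concs, responses)) / sxx
    intercept = mean_y - slope * mean_x
    resid = [y - (intercept + slope * x) for x, y in zip(concs, responses)]
    s = (sum(r * r for r in resid) / (n - 2)) ** 0.5
    return 3.3 * s / slope, 10 * s / slope

# Hypothetical calibration points: concentration (ug/L) vs. detector response
lod, loq = lod_loq([0.5, 1.0, 2.0, 4.0], [0.9, 2.1, 3.9, 8.2])
```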

  7. CD-REST: a system for extracting chemical-induced disease relation in literature

    PubMed Central

    Xu, Jun; Wu, Yonghui; Zhang, Yaoyun; Wang, Jingqi; Lee, Hee-Jin; Xu, Hua

    2016-01-01

    Mining chemical-induced disease relations embedded in the vast biomedical literature could facilitate a wide range of computational biomedical applications, such as pharmacovigilance. The BioCreative V organized a Chemical Disease Relation (CDR) Track regarding chemical-induced disease relation extraction from biomedical literature in 2015. We participated in all subtasks of this challenge. In this article, we present our participation system Chemical Disease Relation Extraction SysTem (CD-REST), an end-to-end system for extracting chemical-induced disease relations in biomedical literature. CD-REST consists of two main components: (1) a chemical and disease named entity recognition and normalization module, which employs the Conditional Random Fields algorithm for entity recognition and a Vector Space Model-based approach for normalization; and (2) a relation extraction module that classifies both sentence-level and document-level candidate drug–disease pairs by support vector machines. Our system achieved the best performance on the chemical-induced disease relation extraction subtask in the BioCreative V CDR Track, demonstrating the effectiveness of our proposed machine learning-based approaches for automatic extraction of chemical-induced disease relations in biomedical literature. The CD-REST system provides web services using HTTP POST request. The web services can be accessed from http://clinicalnlptool.com/cdr. The online CD-REST demonstration system is available at http://clinicalnlptool.com/cdr/cdr.html. Database URL: http://clinicalnlptool.com/cdr; http://clinicalnlptool.com/cdr/cdr.html PMID:27016700

  8. Cockpit automation

    NASA Technical Reports Server (NTRS)

    Wiener, Earl L.

    1988-01-01

    The aims and methods of aircraft cockpit automation are reviewed from a human-factors perspective. Consideration is given to the mixed pilot reception of increased automation, government concern with the safety and reliability of highly automated aircraft, the formal definition of automation, and the ground-proximity warning system and accidents involving controlled flight into terrain. The factors motivating automation include technology availability; safety; economy, reliability, and maintenance; workload reduction and two-pilot certification; more accurate maneuvering and navigation; display flexibility; economy of cockpit space; and military requirements.

  9. Determination of propoxur in environmental samples by automated solid-phase extraction followed by flow-injection analysis with tris(2,2'-bipyridyl)ruthenium(II) chemiluminescence detection.

    PubMed

    Pérez-Ruiz, Tomás; Martínez-Lozano, Carmen; García, María Dolores

    2007-02-19

    A sensitive method for the analysis of propoxur in environmental samples has been developed. It involves an automated solid-phase extraction (SPE) procedure using a Gilson Aspec XLi and flow-injection analysis (FI) with chemiluminescence (CL) detection. The FI-CL system relies on the photolysis of propoxur by irradiation using a low-pressure mercury lamp (main spectral line 254 nm). The resultant methylamine is subsequently detected by CL using tris(2,2'-bipyridyl)ruthenium(III), which is generated on-line by photo-oxidation of the ruthenium(II) complex in the presence of peroxydisulfate. The linear concentration range of application was 0.05-5 microg mL(-1) of propoxur, with a detection limit of 5 ng mL(-1). The repeatability was 0.82% expressed as relative standard deviation (n=10) and the reproducibility, studied on 5 consecutive days, was 2.1%. The sample throughput was 160 injections per hour. Propoxur residues below ng mL(-1) levels could be determined in environmental water samples when an SPE preconcentration device was coupled on-line with the FI system. This SPE-FI-CL arrangement provides a detection limit as low as 5 ng L(-1) using only 500 mL of sample. In the analysis of fruits and vegetables, the detection limit was about 10 microg kg(-1).

  10. Exploiting syntactic and semantics information for chemical–disease relation extraction

    PubMed Central

    Zhou, Huiwei; Deng, Huijie; Chen, Long; Yang, Yunlong; Jia, Chen; Huang, Degen

    2016-01-01

    Identifying chemical–disease relations (CDR) from biomedical literature could improve chemical safety and toxicity studies. This article proposes a novel syntactic and semantic information exploitation method for CDR extraction. The proposed method consists of a feature-based model, a tree kernel-based model and a neural network model. The feature-based model exploits lexical features, the tree kernel-based model captures syntactic structure features, and the neural network model generates semantic representations. The motivation of our method is to fully utilize the nice properties of the three models to explore diverse information for CDR extraction. Experiments on the BioCreative V CDR dataset show that the three models are all effective for CDR extraction, and their combination could further improve extraction performance. Database URL: http://www.biocreative.org/resources/corpora/biocreative-v-cdr-corpus/. PMID:27081156
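    Combining a feature-based model, a tree-kernel model and a neural model can be as simple as averaging their relation probabilities. A hypothetical late-fusion sketch (the abstract does not specify the authors' actual combination rule):

```python
def combined_cdr_score(p_feature, p_tree_kernel, p_neural,
                       weights=(1.0, 1.0, 1.0)):
    """Weighted average of three models' chemical-disease relation probabilities."""
    scores = (p_feature, p_tree_kernel, p_neural)
    return sum(w * p for w, p in zip(weights, scores)) / sum(weights)

# Hypothetical per-model probabilities for one candidate chemical-disease pair
score = combined_cdr_score(0.9, 0.6, 0.75)
is_relation = score >= 0.5
```

    The motivation matches the abstract's: each model contributes a different view (lexical, syntactic, semantic), so even a simple fusion can beat any single model.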

  11. Automatic extraction of semantic relations between medical entities: a rule based approach

    PubMed Central

    2011-01-01

    Background Information extraction is a complex task which is necessary to develop high-precision information retrieval tools. In this paper, we present the platform MeTAE (Medical Texts Annotation and Exploration). MeTAE allows (i) extraction and annotation of medical entities and relationships from medical texts and (ii) semantic exploration of the produced RDF annotations. Results Our annotation approach relies on linguistic patterns and domain knowledge and consists of two steps: (i) recognition of medical entities and (ii) identification of the correct semantic relation between each pair of entities. The first step is achieved by an enhanced use of MetaMap which improves the precision obtained by MetaMap by 19.59% in our evaluation. The second step relies on linguistic patterns which are built semi-automatically from a corpus selected according to semantic criteria. We evaluate our system's ability to identify medical entities of 16 types. We also evaluate the extraction of treatment relations between a treatment (e.g. medication) and a problem (e.g. disease): we obtain 75.72% precision and 60.46% recall. Conclusions According to our experiments, using an external sentence segmenter and noun phrase chunker may improve the precision of MetaMap-based medical entity recognition. Our pattern-based relation extraction method obtains good precision and recall with respect to related work. A more precise comparison with related approaches remains difficult, however, given the differences in corpora and in the exact nature of the extracted relations. The selection of MEDLINE articles through queries related to known drug-disease pairs enabled us to obtain a more focused corpus of relevant examples of treatment relations than a more general MEDLINE query. PMID:22166723
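    A linguistic pattern for treatment relations can be approximated with a lexical trigger between two entity slots. A deliberately simplified sketch (the trigger phrases and example sentence are invented; MeTAE's patterns are richer and operate on recognized entities, not raw tokens):

```python
import re

# Hypothetical treatment-relation pattern:
# <TREATMENT> (treats | is used to treat | is prescribed for) <PROBLEM>
PATTERN = re.compile(
    r"(?P<treatment>\w+)\s+"
    r"(?:treats|is used to treat|is prescribed for)\s+"
    r"(?P<problem>[\w\s]+)")

def extract_treatment_relation(sentence):
    """Return (treatment, problem) if the pattern matches, else None."""
    m = PATTERN.search(sentence)
    return (m.group("treatment"), m.group("problem").strip()) if m else None

rel = extract_treatment_relation("Metformin is used to treat type 2 diabetes")
```

    Building such patterns semi-automatically from a relation-rich corpus, as the paper does, is what keeps precision high without hand-writing every trigger.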

  12. Study on electrical current variations in electromembrane extraction process: Relation between extraction recovery and magnitude of electrical current.

    PubMed

    Rahmani, Turaj; Rahimi, Atyeh; Nojavan, Saeed

    2016-01-15

    This contribution presents an experimental approach to improving the analytical performance of the electromembrane extraction (EME) procedure, based on scrutiny of the electrical current pattern under different extraction conditions: different organic solvents as the supported liquid membrane, electrical potentials, pH values of donor and acceptor phases, extraction times, temperatures, stirring rates, hollow fiber lengths, and the addition of salts or organic solvents to the sample matrix. In this study, four basic drugs with different polarities were extracted under different conditions, and the corresponding electrical current patterns were compared against extraction recoveries. The extraction process was demonstrated in terms of EME-HPLC analyses of the selected basic drugs. Comparison of the obtained extraction recoveries with the electrical current patterns showed that, in most cases, recovery and repeatability were lowest at the highest investigated magnitude of electrical current. It was further found that identical current patterns are associated with repeatable extraction efficiencies; in other words, the pattern should be repeated for a successful extraction. The results showed completely different electrical currents under different extraction conditions, so that all variable parameters contribute to the electrical current pattern. Finally, the current patterns of extractions from wastewater, plasma and urine samples were demonstrated. The results indicated an increase in the electrical current when extracting from complex matrices, which was seen to decrease the extraction efficiency.

  13. A simple, rapid, sensitive method detecting homoserine lactone (HSL)-related compounds in microbial extracts.

    PubMed

    Singh, Maya Prakash; Greenstein, Michael

    2006-04-01

    A simple, rapid, sensitive microtiter plate method for detecting N-acyl homoserine lactone (HSL)-related compounds was established using an Agrobacterium tumefaciens strain harboring a traG::lacZ/traR reporter gene responsive to HSLs. This strain did not produce its own HSL, and the traG::lacZ reporter gene was induced only when its transcription activator TraR detected a cognate exogenous HSL; the assay was therefore expected to be highly specific for HSL-related compounds. Induction of the reporter gene, leading to production of the beta-galactosidase enzyme, was measured using two different beta-galactosidase substrates, X-gal and Galacton-Star, for colorimetric and chemiluminometric detection, respectively. The screen was validated in both the 96-well and 384-well plate formats, and extracts derived from 696 different microbial isolates, mostly unidentified actinomycetes isolated from diverse locations, were tested. Crude extracts of 81 (11.64%) cultures tested positive for HSL-related compounds, and an additional 34 (4.8%) crude extracts showed a moderate to weak signal for HSLs. Data from the fractionated samples, however, suggested a much higher prevalence of HSL signals in these extracts. Of 144 crude extracts fractionated into 10 individual samples at a 10x concentration, 72 (50%) cultures tested positive for HSLs. Six cultures were active only in the crude extract, 18 were active both in the crude extract and in one or more of their fractions, and an additional 48 were active in just one or more of their fractions. This finding may be the first to suggest such a high prevalence of HSL signals in nature, and a large number of actinomycetes in our collection appeared to produce HSL-related compounds.

  14. Relation of retinal blood flow and retinal oxygen extraction during stimulation with diffuse luminance flicker

    PubMed Central

    Palkovits, Stefan; Lasta, Michael; Told, Reinhard; Schmidl, Doreen; Werkmeister, René; Cherecheanu, Alina Popa; Garhöfer, Gerhard; Schmetterer, Leopold

    2015-01-01

    Cerebral and retinal blood flow are dependent on local neuronal activity. Several studies have quantified the increase in cerebral blood flow and oxygen consumption during activity. In the present study we investigated the relation between changes in retinal blood flow and oxygen extraction during stimulation with diffuse luminance flicker, and the influence of breathing gas mixtures with different fractions of O2 (FiO2: 100%, 15% and 12%). Twenty-four healthy subjects were included. Retinal blood flow was studied by combining measurement of vessel diameters using the Dynamic Vessel Analyser with measurements of blood velocity using laser Doppler velocimetry. Oxygen saturation was measured using spectroscopic reflectometry and oxygen extraction was calculated. Flicker stimulation increased retinal blood flow (57.7 ± 17.8%) and oxygen extraction (34.6 ± 24.1%; p < 0.001 each). During 100% oxygen breathing the response of retinal blood flow and oxygen extraction was increased (p < 0.01 each). By contrast, breathing gas mixtures with 12% and 15% FiO2 did not alter flicker-induced retinal haemodynamic changes. The present study indicates that, at a comparable increase in blood flow, the increase in oxygen extraction in the retina is larger than in the brain. During systemic hyperoxia the blood flow and oxygen extraction responses to neural stimulation are augmented. The underlying mechanism is unknown. PMID:26672758
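    The calculated oxygen extraction is essentially the product of blood flow and the arteriovenous oxygen saturation difference. A schematic calculation with invented numbers (the study derives saturations from calibrated reflectometry and uses absolute flow units):

```python
def oxygen_extraction(flow, sa_o2, sv_o2):
    """Relative O2 extraction ~ blood flow * arteriovenous saturation
    difference (simplified; ignores haemoglobin and dissolved O2 terms)."""
    return flow * (sa_o2 - sv_o2)

# Hypothetical values: baseline vs. flicker (~57.7% flow increase, with the
# arteriovenous difference narrowing slightly under stimulation)
baseline = oxygen_extraction(1.0, 0.95, 0.60)
flicker = oxygen_extraction(1.577, 0.95, 0.63)
increase_pct = 100.0 * (flicker - baseline) / baseline
```

    This shows why extraction can rise by less than flow does: the arteriovenous difference shrinks as flow increases.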

  15. Influence of DNA extraction methods on relative telomere length measurements and its impact on epidemiological studies

    PubMed Central

    Raschenberger, Julia; Lamina, Claudia; Haun, Margot; Kollerits, Barbara; Coassin, Stefan; Boes, Eva; Kedenko, Ludmilla; Köttgen, Anna; Kronenberg, Florian

    2016-01-01

    Measurement of telomere length is widely used in epidemiologic studies. Insufficient standardization of the measurement processes has, however, complicated the comparison of results between studies. We aimed to investigate whether DNA extraction methods have an influence on measured values of relative telomere length (RTL) and whether this has consequences for epidemiological studies. We performed four experiments with RTL measurement in quadruplicate by qPCR using DNA extracted with different methods: 1) a standardized validation experiment including three extraction methods (magnetic-particle-method EZ1, salting-out-method INV, phenol-chloroform-isoamyl-alcohol PCI) each in the same 20 samples demonstrated pronounced differences in RTL, with the lowest values with EZ1 followed by INV and PCI-isolated DNA; 2) a comparison of 307 samples from an epidemiological study showing EZ1-measurements 40% lower than INV-measurements; 3) a matching approach of two similar non-diseased control groups including 143 pairs of subjects revealed significantly shorter RTL in EZ1 than INV-extracted DNA (0.844 ± 0.157 vs. 1.357 ± 0.242); 4) an association analysis of RTL with prevalent cardiovascular disease detected a stronger association with INV than with EZ1-extracted DNA. In summary, DNA extraction methods have a pronounced influence on the measured RTL values. This might result in spurious or lost associations in epidemiological studies under certain circumstances. PMID:27138987
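
    RTL by qPCR is conventionally reported as a telomere/single-copy-gene (T/S) ratio, typically via the 2^-ΔΔCt method. The abstract does not reproduce its formula, so the sketch below shows the standard calculation with hypothetical Ct values:

    ```python
    def relative_telomere_length(ct_telomere, ct_single_copy,
                                 ct_telomere_ref, ct_single_copy_ref):
        """T/S ratio by the standard 2^-ddCt method (sketch).

        ct_telomere / ct_single_copy         : sample Ct values
        ct_telomere_ref / ct_single_copy_ref : reference-sample Ct values
        A lower telomere Ct relative to the single-copy gene means
        more telomeric template, hence a larger T/S ratio.
        """
        d_ct_sample = ct_telomere - ct_single_copy
        d_ct_reference = ct_telomere_ref - ct_single_copy_ref
        return 2.0 ** -(d_ct_sample - d_ct_reference)

    # One cycle earlier telomere amplification than the reference -> T/S of 2
    rtl = relative_telomere_length(19.0, 20.0, 20.0, 20.0)
    ```

    Because the Ct values depend on template quality, a systematic shift introduced by the extraction method (as reported here for EZ1 vs. INV) propagates directly into the T/S ratio.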

  18. The Application of Thermal Plasma to Extraction Metallurgy and Related Fields

    NASA Technical Reports Server (NTRS)

    Akashi, K.

    1980-01-01

    Various applications of thermal plasma to extraction metallurgy and related fields are surveyed, chiefly on the basis of documents published during the past two or three years. Applications to melting and smelting, to thermal decomposition, to reduction, to manufacturing of inorganic compounds, and to other fields are considered.

  19. A method for automatically extracting infectious disease-related primers and probes from the literature

    PubMed Central

    2010-01-01

    Background Primer and probe sequences are the main components of nucleic acid-based detection systems. Biologists use primers and probes for different tasks, some related to the diagnosis and prescription of infectious diseases. The biological literature is the main information source for empirically validated primer and probe sequences. Therefore, it is becoming increasingly important for researchers to navigate this important information. In this paper, we present a four-phase method for extracting and annotating primer/probe sequences from the literature. These phases are: (1) convert each document into a tree of paper sections, (2) detect the candidate sequences using a set of finite state machine-based recognizers, (3) refine problem sequences using a rule-based expert system, and (4) annotate the extracted sequences with their related organism/gene information. Results We tested our approach using a test set composed of 297 manuscripts. The extracted sequences and their organism/gene annotations were manually evaluated by a panel of molecular biologists. The results of the evaluation show that our approach is suitable for automatically extracting DNA sequences, achieving precision/recall rates of 97.98% and 95.77%, respectively. In addition, 76.66% of the detected sequences were correctly annotated with their organism name. The system also provided correct gene-related information for 46.18% of the sequences assigned a correct organism name. Conclusions We believe that the proposed method can facilitate routine tasks for biomedical researchers using molecular methods to diagnose and prescribe different infectious diseases. In addition, the proposed method can be expanded to detect and extract other biological sequences from the literature. The extracted information can also be used to readily update available primer/probe databases or to create new databases from scratch. PMID:20682041
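
    Phase (2), candidate detection, can be approximated with a simple pattern recognizer. The sketch below uses a regular expression over IUPAC nucleotide codes with an assumed plausible primer length of 15-40 bases; the paper's finite state machine-based recognizers and rule-based refinement are more elaborate:

    ```python
    import re

    # IUPAC nucleotide codes: A/C/G/T plus ambiguity letters (R, Y, S, W, ...).
    # The 15-40 base window is an illustrative primer-length assumption.
    CANDIDATE = re.compile(r"\b[ACGTRYSWKMBDHVN]{15,40}\b")

    def detect_candidates(text):
        """Return putative primer/probe sequences found in a text span."""
        hits = CANDIDATE.findall(text)
        # Crude refinement: require at least three of the four unambiguous
        # bases, to discard letter runs that are unlikely to be sequences.
        return [h for h in hits if len(set(h) & set("ACGT")) >= 3]

    detect_candidates("The forward primer 5'-AGGTCACTGAGCTTAAGCT-3' was used.")
    ```

    A production system would additionally use the document's section tree to restrict matching to methods sections, as the four-phase pipeline above describes.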

  20. Vision related daily life problems in patients waiting for a cataract extraction.

    PubMed Central

    Lundström, M; Fregell, G; Sjöblom, A

    1994-01-01

    Problems in daily life activities caused by bad vision were studied in 150 patients with cataract before and 6 months after a cataract extraction. A relation was found between binocular visual acuity before surgery and the number of problems experienced (p < 0.001). After cataract extraction a reduction in problems was closely associated with an increase in visual acuity (p < 0.001) and also, in the patients' opinion, a better life situation (p < 0.001). Six questions to be answered when considering surgery are given. PMID:7918286

  1. Determination of Other Related Carotenoids Substances in Astaxanthin Crystals Extracted from Adonis amurensis.

    PubMed

    Zhang, Li-hua; Peng, Yong-jian; Xu, Xin-de; Wang, Sheng-nan; Yu, Lei-ming; Hong, Yi-min; Ma, Jin-ping

    2015-01-01

    Astaxanthin is an important carotenoid with powerful antioxidant capacity and other health functions. Extraction from Adonis amurensis is a promising way to obtain natural astaxanthin. However, ensuring high purity and investigating related substances in astaxanthin crystals are necessary issues. In this study, to identify possible impurities, astaxanthin crystal was first extracted from Adonis amurensis, then purified by saponification and separation. The concentration of total carotenoids in the purified astaxanthin crystals was as high as 97% by weight when analyzed by UV-visible absorption spectra. After identification by TLC, HPLC and MS, besides free astaxanthin as the main ingredient of the crystals, there were four other unknown related substances, which were further investigated by HPLC/ESI/MS in positive-ion mode combined with auxiliary reference data obtained in stress tests; it was confirmed that the four related carotenoid substances were three structural isomers of semi-astacene and adonirubin.

  2. PPInterFinder--a mining tool for extracting causal relations on human proteins from literature.

    PubMed

    Raja, Kalpana; Subramani, Suresh; Natarajan, Jeyakumar

    2013-01-01

    One of the most common and challenging problems in biomedical text mining is mining protein-protein interactions (PPIs) from MEDLINE abstracts and full-text research articles, because PPIs play a major role in understanding various biological processes and the impact of proteins in diseases. We implemented PPInterFinder, a web-based text mining tool to extract human PPIs from the biomedical literature. PPInterFinder uses relation keyword co-occurrences with protein names to extract information on PPIs from MEDLINE abstracts and consists of three phases. First, it identifies the relation keyword using a parser with Tregex and a relation keyword dictionary. Next, it automatically identifies the candidate PPI pairs with a set of rules related to PPI recognition. Finally, it extracts the relations by matching the sentence with a set of 11 specific patterns based on the syntactic nature of the PPI pair. We find that PPInterFinder is capable of predicting PPIs with an accuracy of 66.05% on the AIMED corpus and outperforms most existing systems. DATABASE URL: http://www.biomining-bu.in/ppinterfinder/ PMID:23325628
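
    The keyword co-occurrence idea behind the first two phases can be sketched as follows; the keyword list and helper names are illustrative, and the Tregex parsing and the 11 syntactic patterns of the real system are omitted:

    ```python
    # Illustrative relation keywords; PPInterFinder uses a full dictionary.
    RELATION_KEYWORDS = {"interacts", "binds", "phosphorylates", "activates"}

    def candidate_ppis(sentence, known_proteins):
        """Flag a sentence as carrying candidate PPI pairs when two or more
        known protein names co-occur with a relation keyword (sketch)."""
        words = set(sentence.lower().replace(",", " ").split())
        found = [p for p in known_proteins if p.lower() in words]
        has_relation = any(k in words for k in RELATION_KEYWORDS)
        if has_relation and len(found) >= 2:
            # All unordered pairs of co-occurring proteins are candidates.
            return [(found[i], found[j])
                    for i in range(len(found))
                    for j in range(i + 1, len(found))]
        return []

    candidate_ppis("BRCA1 binds RAD51 in vivo", ["BRCA1", "RAD51", "TP53"])
    ```

    Co-occurrence alone over-generates; the pattern-matching phase of the real tool exists precisely to prune pairs that merely share a sentence without a syntactic relation.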

  4. Extraction procedures for oilseeds and related high fat-low moisture products.

    PubMed

    Sawyer, L D

    1982-09-01

    A combined sample preparation/extraction procedure is presented for pesticide residue analysis of oilseeds and related high fat-low moisture products. The procedure utilizes high-speed milling to prepare the sample and high-speed homogenization in the extraction step to achieve what is apparently quantitative isolation of both incurred residues and natural oils. A separate, simple, oil determination step allows findings to be reported on either the fat or whole product basis. Petroleum ether, ethyl ether-petroleum ether (1 + 1), and ethanol are used serially as the extractants. Usual fatty food cleanup procedures and multiresidue gas chromatographic detection techniques are utilized. The procedure presented in this paper is a refinement of earlier work which used a homogenizer both to grind and to extract samples of unground seeds and which demonstrated essentially complete extraction of endrin residues in soybeans and DDT residues in mustard seed. Identical samples analyzed by the currently recommended shakeout procedure, 29.012, gave recoveries of approximately 50% of the total residues. The procedure presented in this paper was satisfactorily tested on 13 different oilseed types and one sample of soda crackers. Oil content for these samples ranged from 5 to 69%. PMID:6890060

  5. Antimutagenicity of Methanolic Extracts from Anemopsis californica in Relation to Their Antioxidant Activity

    PubMed Central

    Del-Toro-Sánchez, Carmen Lizette; Bautista-Bautista, Nereyda; Blasco-Cabal, José Luis; Gonzalez-Ávila, Marisela; Gutiérrez-Lomelí, Melesio; Arriaga-Alba, Myriam

    2014-01-01

    Anemopsis californica has been used empirically to treat infectious diseases. However, there are no antimutagenic evaluation reports on this plant. The present study evaluated the antioxidant activity in relation to the mutagenic and antimutagenic activity properties of leaf (LME) and stem (SME) methanolic extracts of A. californica collected in the central Mexican state of Querétaro. Antioxidant properties and total phenols of extracts were evaluated using DPPH (1,1-diphenyl-2-picrylhydrazyl) and Folin-Ciocalteu methods, respectively. Mutagenicity was evaluated using the Ames test employing Salmonella enterica serovar Typhimurium strains (TA98, TA100, and TA102), with and without an aroclor 1254 (S9 mixture). Antimutagenesis was performed against mutations induced on the Ames test with MNNG, 2AA, or 4NQO. SME presented the highest antioxidant capacity and total phenolic content. None of the extracts exhibited mutagenicity in the Ames test. The extracts produced a significant reduction in 2AA-induced mutations in S. typhimurium TA98. In both extracts, mutagenesis induced by 4NQO or methyl-N′-nitro-N-nitrosoguanidine (MNNG) was reduced only if the exposure of strains was <10 μg/Petri dish. A. californica's antioxidant properties and its capacity to reduce point mutations render it suitable to enhance medical cancer treatments. The significant antimutagenic effect against 2AA suggests that its consumption would provide protection against carcinogenic polycyclic aromatic compounds. PMID:25152760

  6. Automated Lattice Perturbation Theory

    SciTech Connect

    Monahan, Christopher

    2014-11-01

    I review recent developments in automated lattice perturbation theory. Starting with an overview of lattice perturbation theory, I focus on the three automation packages currently "on the market": HiPPy/HPsrc, Pastor and PhySyCAl. I highlight some recent applications of these methods, particularly in B physics. In the final section I briefly discuss the related, but distinct, approach of numerical stochastic perturbation theory.

  7. Extracting the frequencies of the pinna spectral notches in measured head related impulse responses

    NASA Astrophysics Data System (ADS)

    Raykar, Vikas C.; Duraiswami, Ramani; Yegnanarayana, B.

    2005-07-01

    The head related impulse response (HRIR) characterizes the auditory cues created by scattering of sound off a person's anatomy. The experimentally measured HRIR depends on several factors such as reflections from body parts (torso, shoulder, and knees), head diffraction, and reflection/diffraction effects due to the pinna. Structural models (Algazi et al., 2002; Brown and Duda, 1998) seek to establish direct relationships between the features in the HRIR and the anatomy. While there is evidence that particular features in the HRIR can be explained by anthropometry, the creation of such models from experimental data is hampered by the fact that the extraction of the features in the HRIR is not automatic. Among the prominent features observed in the HRIR, and ones that have been shown to be important for elevation perception, are the deep spectral notches attributed to the pinna. In this paper we propose a method to robustly extract the frequencies of the pinna spectral notches from the measured HRIR, distinguishing them from other confounding features. The method also extracts the resonances described by Shaw (1997). The techniques are applied to the publicly available CIPIC HRIR database (Algazi et al., 2001c). The extracted notch frequencies are related to the physical dimensions and shape of the pinna.
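
    A naive version of notch extraction treats candidate notches as sufficiently deep local minima of the magnitude response. The sketch below assumes a sampled magnitude spectrum in dB and an arbitrary 5 dB depth threshold, and it ignores the paper's central difficulty of separating pinna effects from head/torso reflections:

    ```python
    import numpy as np

    def notch_frequencies(freqs_hz, mag_db, depth_db=5.0):
        """Return frequencies of local minima of the magnitude response that
        lie at least depth_db below the surrounding maxima (sketch only)."""
        notches = []
        for i in range(1, len(mag_db) - 1):
            if mag_db[i] < mag_db[i - 1] and mag_db[i] < mag_db[i + 1]:
                # Depth measured against the highest level on each side.
                left_peak = mag_db[:i].max()
                right_peak = mag_db[i + 1:].max()
                if min(left_peak, right_peak) - mag_db[i] >= depth_db:
                    notches.append(float(freqs_hz[i]))
        return notches
    ```

    On a real HRIR this simple detector would also pick up torso/shoulder comb-filter dips, which is exactly why the paper's method works to distinguish pinna notches from those confounding features.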

  8. Orbital transfer vehicle launch operations study: Automated technology knowledge base, volume 4

    NASA Technical Reports Server (NTRS)

    1986-01-01

    A simplified retrieval strategy for compiling automation-related bibliographies from NASA/RECON is presented. Two subsets of NASA Thesaurus subject terms were extracted: a primary list, which is used to obtain an initial set of citations; and a secondary list, which is used to limit or further specify a large initial set of citations. These subject term lists are presented in Appendix A as the Automated Technology Knowledge Base (ATKB) Thesaurus.

  9. Process automation

    SciTech Connect

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs.

  10. A crowdsourcing workflow for extracting chemical-induced disease relations from free text

    PubMed Central

    Li, Tong Shu; Bravo, Àlex; Furlong, Laura I.; Good, Benjamin M.; Su, Andrew I.

    2016-01-01

    Relations between chemicals and diseases are one of the most queried biomedical interactions. Although expert manual curation is the standard method for extracting these relations from the literature, it is expensive and impractical to apply to large numbers of documents, and therefore alternative methods are required. We describe here a crowdsourcing workflow for extracting chemical-induced disease relations from free text as part of the BioCreative V Chemical Disease Relation challenge. Five non-expert workers on the CrowdFlower platform were shown each potential chemical-induced disease relation highlighted in the original source text and asked to make binary judgments about whether the text supported the relation. Worker responses were aggregated through voting, and relations receiving four or more votes were predicted as true. On the official evaluation dataset of 500 PubMed abstracts, the crowd attained a 0.505 F-score (0.475 precision, 0.540 recall), with a maximum theoretical recall of 0.751 due to errors with named entity recognition. The total crowdsourcing cost was $1290.67 ($2.58 per abstract) and took a total of 7 h. A qualitative error analysis revealed that 46.66% of sampled errors were due to task limitations and gold standard errors, indicating that performance can still be improved. All code and results are publicly available at https://github.com/SuLab/crowd_cid_relex Database URL: https://github.com/SuLab/crowd_cid_relex PMID:27087308
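
    The aggregation and scoring described above reduce to a few lines; the helper names below are illustrative, and the vote threshold default follows the abstract (four or more of five votes):

    ```python
    from collections import Counter

    def aggregate_votes(judgments, threshold=4):
        """judgments: iterable of (relation_id, bool_vote) pairs, one per
        worker response. A relation is predicted true when it receives at
        least `threshold` yes-votes."""
        yes_votes = Counter()
        for relation, vote in judgments:
            yes_votes[relation] += 1 if vote else 0
        return {r for r, n in yes_votes.items() if n >= threshold}

    def precision_recall_f1(predicted, gold):
        """Standard set-based scoring against a gold-standard relation set."""
        true_positives = len(predicted & gold)
        p = true_positives / len(predicted) if predicted else 0.0
        r = true_positives / len(gold) if gold else 0.0
        f1 = 2 * p * r / (p + r) if p + r else 0.0
        return p, r, f1
    ```

    The "maximum theoretical recall of 0.751" arises because relations whose entities were missed by named entity recognition never reach the crowd, so no vote threshold can recover them.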

  11. A crowdsourcing workflow for extracting chemical-induced disease relations from free text.

    PubMed

    Li, Tong Shu; Bravo, Àlex; Furlong, Laura I; Good, Benjamin M; Su, Andrew I

    2016-01-01

    Relations between chemicals and diseases are one of the most queried biomedical interactions. Although expert manual curation is the standard method for extracting these relations from the literature, it is expensive and impractical to apply to large numbers of documents, and therefore alternative methods are required. We describe here a crowdsourcing workflow for extracting chemical-induced disease relations from free text as part of the BioCreative V Chemical Disease Relation challenge. Five non-expert workers on the CrowdFlower platform were shown each potential chemical-induced disease relation highlighted in the original source text and asked to make binary judgments about whether the text supported the relation. Worker responses were aggregated through voting, and relations receiving four or more votes were predicted as true. On the official evaluation dataset of 500 PubMed abstracts, the crowd attained a 0.505 F-score (0.475 precision, 0.540 recall), with a maximum theoretical recall of 0.751 due to errors with named entity recognition. The total crowdsourcing cost was $1290.67 ($2.58 per abstract) and took a total of 7 h. A qualitative error analysis revealed that 46.66% of sampled errors were due to task limitations and gold standard errors, indicating that performance can still be improved. All code and results are publicly available at https://github.com/SuLab/crowd_cid_relex Database URL: https://github.com/SuLab/crowd_cid_relex PMID:27087308

  13. Complete automation of solid-phase extraction with subsequent liquid chromatography-tandem mass spectrometry for the quantification of benzoylecgonine, m-hydroxybenzoylecgonine, p-hydroxybenzoylecgonine, and norbenzoylecgonine in urine--application to a high-throughput urine analysis laboratory.

    PubMed

    Robandt, Paul P; Reda, Louis J; Klette, Kevin L

    2008-10-01

    A fully automated system utilizing a liquid handler and an online solid-phase extraction (SPE) device coupled with liquid chromatography-tandem mass spectrometry (LC-MS-MS) was designed to process, detect, and quantify benzoylecgonine (BZE), meta-hydroxybenzoylecgonine (m-OH BZE), para-hydroxybenzoylecgonine (p-OH BZE), and norbenzoylecgonine (nor-BZE) metabolites in human urine. The method was linear for BZE, m-OH BZE, and p-OH BZE from 1.2 to 10,000 ng/mL with limits of detection (LOD) and quantification (LOQ) of 1.2 ng/mL. Nor-BZE was linear from 5 to 10,000 ng/mL with an LOD and LOQ of 1.2 and 5 ng/mL, respectively. The intrarun precision measured as the coefficient of variation of 10 replicates of a 100 ng/mL control was less than 2.6%, and the interrun precision for 5 replicates of the same control across 8 batches was less than 4.8% for all analytes. No assay interference was noted from controls containing cocaine, cocaethylene, and ecgonine methyl ester. Excellent data concordance (R2 > 0.994) was found for direct comparison of the automated SPE-LC-MS-MS procedure and an existing gas chromatography-MS procedure using 94 human urine samples previously determined to be positive for BZE. The automated specimen handling and SPE procedure, when compared to the traditional extraction schema, eliminates the human factors of specimen handling, processing, extraction, and derivatization, thereby reducing labor costs and rework resulting from batch handling issues, and may reduce the number of fume hoods required in the laboratory. PMID:19007506
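
    The intra- and inter-run precision figures quoted above are coefficients of variation. As a reminder of the calculation (sample standard deviation over mean, in percent):

    ```python
    import statistics

    def coefficient_of_variation(replicates):
        """%CV = 100 * sample standard deviation / mean, as used for the
        replicate measurements of the 100 ng/mL control."""
        return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

    # Hypothetical replicate concentrations (ng/mL) for illustration
    cv = coefficient_of_variation([98.2, 101.5, 99.8, 100.4, 100.1])
    ```

    An intrarun CV under 2.6% for ten replicates, as reported, indicates tight clustering of the measured concentrations around the control's nominal value.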

  14. RP-HPTLC densitometric determination and validation of vanillin and related phenolic compounds in accelerated solvent extract of Vanilla planifolia*.

    PubMed

    Sharma, Upendra Kumar; Sharma, Nandini; Gupta, Ajai Prakash; Kumar, Vinod; Sinha, Arun Kumar

    2007-12-01

    A simple, fast and sensitive RP-HPTLC method is developed for simultaneous quantitative determination of vanillin and related phenolic compounds in ethanolic extracts of Vanilla planifolia pods. In addition to this, the applicability of accelerated solvent extraction (ASE) as an alternative to microwave-assisted extraction (MAE), ultrasound-assisted extraction (UAE) and Soxhlet extraction was also explored for the rapid extraction of phenolic compounds in vanilla pods. Good separation was achieved on aluminium plates precoated with silica gel RP-18 F(254S) in the mobile phase of methanol/water/isopropanol/acetic acid (30:65:2:3, by volume). The method showed good linearity, high precision and good recovery of compounds of interest. ASE showed good extraction efficiency in less time as compared to other techniques for all the phenolic compounds. The present method would be useful for analytical research and for routine analysis of vanilla extracts for their quality control.

  15. Achyrocline satureioides (Lam.) D.C. Hydroalcoholic Extract Inhibits Neutrophil Functions Related to Innate Host Defense

    PubMed Central

    Barioni, Eric Diego; Machado, Isabel Daufenback; Rodrigues, Stephen Fernandes de Paula; Ferraz-de-Paula, Viviane; Wagner, Theodoro Marcel; Cogliati, Bruno; Corrêa dos Santos, Matheus; Machado, Marina da Silva; de Andrade, Sérgio Faloni; Niero, Rivaldo; Farsky, Sandra Helena Poliselli

    2013-01-01

    Achyrocline satureioides (Lam.) D.C. is a herb native to South America, and its inflorescences are popularly employed to treat inflammatory diseases. Here, the effects of the in vivo actions of the hydroalcoholic extract obtained from inflorescences of A. satureioides on neutrophil trafficking into inflamed tissue were investigated. Male Wistar rats were orally treated with A. satureioides extract, and inflammation was induced one hour later by lipopolysaccharide injection into the subcutaneous tissue. The number of leukocytes and the amount of chemotactic mediators were quantified in the inflammatory exudate, and adhesion molecule and toll-like receptor 4 (TLR-4) expressions and phorbol-myristate-acetate- (PMA-) stimulated oxidative burst were quantified in circulating neutrophils. Leukocyte-endothelial interactions were quantified in the mesentery tissue. Enzymes and tissue morphology of the liver and kidney were evaluated. Treatment with A. satureioides extract reduced neutrophil influx and secretion of leukotriene B4 and CINC-1 in the exudates, the number of rolling and adhered leukocytes in the mesentery postcapillary venules, neutrophil L-selectin, β2-integrin and TLR-4 expression, and oxidative burst, but did not cause an alteration in the morphology and activities of liver and kidney. Together, the data show that A. satureioides extract inhibits neutrophil functions related to the innate response and does not cause systemic toxicity. PMID:23476704

  17. Automated Cooperative Trajectories

    NASA Technical Reports Server (NTRS)

    Hanson, Curt; Pahle, Joseph; Brown, Nelson

    2015-01-01

    This presentation is an overview of the Automated Cooperative Trajectories project. An introduction to the phenomena of wake vortices is given, along with a summary of past research into the possibility of extracting energy from the wake by flying close parallel trajectories. Challenges and barriers to adoption of civilian automatic wake surfing technology are identified. A hardware-in-the-loop simulation is described that will support future research. Finally, a roadmap for future research and technology transition is proposed.

  18. 77 FR 123 - Final Reissuance of General NPDES Permits (GP) for Facilities Related to Oil and Gas Extraction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-03

    ... AGENCY Final Reissuance of General NPDES Permits (GP) for Facilities Related to Oil and Gas Extraction... permit. SUMMARY: A GP regulating the activities of facilities related to oil and gas extraction on the... reissue the GP expanding the coverage area to the TransAlaska Pipeline Corridor along with other...

  19. Direct DNA isolation from solid biological sources without pretreatments with proteinase-K and/or homogenization through automated DNA extraction.

    PubMed

    Ki, Jang-Seu; Chang, Ki Byum; Roh, Hee June; Lee, Bong Youb; Yoon, Joon Yong; Jang, Gi Young

    2007-03-01

    Genomic DNA from solid biomaterials was directly isolated with an automated DNA extractor, which was based on magnetic bead technology with a bore-mediated grinding (BMG) system. The movement of the bore broke down the solid biomaterials, mixed crude lysates thoroughly with reagents to isolate the DNA, and carried the beads to the next step. The BMG system was suitable for the mechanical homogenization of the solid biomaterials and valid as an automated system for purifying the DNA from the solid biomaterials without the need for pretreatment or disruption procedures prior to the application of the solid biomaterials.

  20. Multiresidue determination of ultratrace levels of fluoroquinolone antimicrobials in drinking and aquaculture water samples by automated online molecularly imprinted solid phase extraction and liquid chromatography.

    PubMed

    Rodríguez, Erika; Navarro-Villoslada, Fernando; Benito-Peña, Elena; Marazuela, María Dolores; Moreno-Bondi, María Cruz

    2011-03-15

    The present work describes the development of a sensitive and highly selective innovative method for the simultaneous detection of six fluoroquinolone (FQ) antimicrobials (enrofloxacin, ciprofloxacin, norfloxacin, levofloxacin, danofloxacin, and sarafloxacin) in water samples. This detection is based on online solid phase extraction, coupled to liquid chromatography (LC), using for the first time tailor-made molecularly imprinted microspherical polymer particles prepared via precipitation polymerization. Various parameters affecting the extraction efficiency of the polymer have been optimized to reduce nonspecific interactions and to achieve selective uptake of the antibiotics from real samples. The method shows good recoveries ranging between 62% and 102% (V = 25 mL) for the different FQs tested and excellent interday and intraday precision with relative standard deviation (RSD) values between 2-5% and 2-6%, respectively. The detection limits were between 1-11 ng L(-1) (drinking water) and 1-12 ng L(-1) (fish farm water) when 25 mL samples were processed. The polymer showed selectivity for FQs containing a piperazine moiety whereas no retention was found for other antibiotics or nonrelated compounds. The method has been applied to the analysis of trace amounts of the FQs tested in drinking and fish farm water samples with excellent recoveries (>91%) and good precision (RSDs <5%).

  1. Microfabricated Devices for Sample Extraction, Concentrations, and Related Sample Processing Technologies

    SciTech Connect

    Chen, Gang; Lin, Yuehe

    2006-12-01

    This is an invited book chapter. As with other analytical techniques, sample pretreatment, sample extraction, sample introduction, and related techniques are of extreme importance for micro-electro-mechanical systems (MEMS). Bio-MEMS devices and systems start with a sampling step. The biological sample then usually undergoes several sample preparation steps before the actual analysis. These steps may involve extracting the target sample from its matrix, removing interferences from the sample, derivatizing the sample to detectable species, or performing a sample preconcentration step. The integration of components for sample pretreatment into microfluidic devices represents one of the remaining bottlenecks toward achieving true miniaturized total analysis systems (µTAS). This chapter provides a thorough state-of-the-art review of developments in this field to date.

  2. Automated imagery orthorectification pilot

    NASA Astrophysics Data System (ADS)

    Slonecker, E. Terrence; Johnson, Brad; McMahon, Joe

    2009-10-01

    Automated orthorectification of raw image products is now possible based on the comprehensive metadata collected by Global Positioning System and Inertial Measurement Unit technology aboard aircraft and satellite digital imaging systems, and based on emerging pattern-matching and automated image-to-image and control point selection capabilities in many advanced image processing systems. Automated orthorectification of standard aerial photography is also possible if a camera calibration report and sufficient metadata are available. Orthorectification of historical imagery, for which only limited metadata was available, was also attempted and found to require some user input, creating a semi-automated process that still has significant potential to reduce processing time and expense for the conversion of archival historical imagery into geospatially enabled, digital formats, facilitating preservation and utilization of a vast archive of historical imagery. Over 90 percent of the frames of historical aerial photos used in this experiment were successfully orthorectified to the accuracy of the USGS 100K base map series utilized for the geospatial reference of the archive. The accuracy standard for the 100K series maps is approximately 167 feet (51 meters). The main problems associated with orthorectification failure were cloud cover, shadow and historical landscape change, which confused automated image-to-image matching processes. Further research is recommended to optimize automated orthorectification methods and enable broad operational use, especially as related to historical imagery archives.

  3. Using SemRep to Label Semantic Relations Extracted from Clinical Text

    PubMed Central

    Liu, Ying; Bill, Robert; Fiszman, Marcelo; Rindflesch, Thomas; Pedersen, Ted; Melton, Genevieve B.; Pakhomov, Serguei V.

    2012-01-01

    In this paper we examined the relationship between semantic relatedness among medical concepts found in clinical reports and biomedical literature. Our objective is to determine whether relations between medical concepts identified from Medline abstracts may be used to inform us as to the nature of the association between medical concepts that appear to be closely related based on their distribution in clinical reports. We used a corpus of 800k inpatient clinical notes as a source of data for determining the strength of association between medical concepts and SemRep database as a source of labeled relations extracted from Medline abstracts. The same pair of medical concepts may be found with more than one predicate type in the SemRep database but often with different frequencies. Our analysis shows that predicate type frequency information obtained from the SemRep database appears to be helpful for labeling semantic relations obtained with measures of semantic relatedness and similarity. PMID:23304331
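The labeling idea in the abstract above, assigning a predicate type to a distributionally related concept pair based on how often each predicate occurs for that pair, can be sketched as follows. The concept pairs, predicate names, and counts are invented for illustration and are not taken from the SemRep database.

```python
from collections import Counter

# Hypothetical predicate counts per concept pair, as might be aggregated
# from a SemRep-style database of relations extracted from Medline.
pair_predicates = {
    ("aspirin", "myocardial infarction"): Counter(
        {"TREATS": 42, "PREVENTS": 17, "AFFECTS": 3}
    ),
    ("insulin", "diabetes mellitus"): Counter({"TREATS": 88, "ASSOCIATED_WITH": 12}),
}

def label_relation(pair, min_count=5):
    """Label a concept pair with its most frequent predicate type,
    returning None when there is no, or too little, evidence."""
    counts = pair_predicates.get(pair)
    if not counts:
        return None
    predicate, n = counts.most_common(1)[0]
    return predicate if n >= min_count else None
```

Frequency-based labeling like this reflects the paper's observation that the same pair of medical concepts may appear with more than one predicate type, but often at very different frequencies.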

  4. Method Specific Calibration Corrects for DNA Extraction Method Effects on Relative Telomere Length Measurements by Quantitative PCR

    PubMed Central

    Holland, Rebecca; Underwood, Sarah; Fairlie, Jennifer; Psifidi, Androniki; Ilska, Joanna J.; Bagnall, Ainsley; Whitelaw, Bruce; Coffey, Mike; Banos, Georgios; Nussey, Daniel H.

    2016-01-01

    Telomere length (TL) is increasingly being used as a biomarker in epidemiological, biomedical and ecological studies. A wide range of DNA extraction techniques have been used in telomere experiments and recent quantitative PCR (qPCR) based studies suggest that the choice of DNA extraction method may influence average relative TL (RTL) measurements. Such extraction method effects may limit the use of historically collected DNA samples extracted with different methods. However, if extraction method effects are systematic an extraction method specific (MS) calibrator might be able to correct for them, because systematic effects would influence the calibrator sample in the same way as all other samples. In the present study we tested whether leukocyte RTL in blood samples from Holstein Friesian cattle and Soay sheep measured by qPCR was influenced by DNA extraction method and whether MS calibration could account for any observed differences. We compared two silica membrane-based DNA extraction kits and a salting out method. All extraction methods were optimized to yield enough high quality DNA for TL measurement. In both species we found that silica membrane-based DNA extraction methods produced shorter RTL measurements than the non-membrane-based method when calibrated against an identical calibrator. However, these differences were not statistically detectable when a MS calibrator was used to calculate RTL. This approach produced RTL measurements that were highly correlated across extraction methods (r > 0.76) and had coefficients of variation lower than 10% across plates of identical samples extracted by different methods. Our results are consistent with previous findings that popular membrane-based DNA extraction methods may lead to shorter RTL measurements than non-membrane-based methods. 
However, we also demonstrate that these differences can be accounted for by using an extraction method-specific calibrator, offering researchers a simple means of accounting for
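The calibration approach described above can be illustrated with the standard 2^-ddCt calculation used in qPCR telomere studies. This is a generic sketch rather than the authors' code, and the Ct values in the example are invented.

```python
def rtl(ct_t_sample, ct_s_sample, ct_t_cal, ct_s_cal):
    """Relative telomere length (RTL) by the 2^-ddCt method: the telomere (T)
    assay signal is normalised to a single-copy gene (S) assay, then to a
    calibrator sample. Using a calibrator extracted with the SAME DNA
    extraction method as the samples absorbs systematic extraction effects,
    which is the method-specific (MS) calibration idea of the study."""
    d_ct_sample = ct_t_sample - ct_s_sample   # sample dCt (T - S)
    d_ct_cal = ct_t_cal - ct_s_cal            # calibrator dCt (T - S)
    return 2 ** -(d_ct_sample - d_ct_cal)
```

A sample whose telomere assay crosses threshold one cycle earlier than the calibrator (with equal single-copy signals) yields RTL = 2, i.e. roughly twice the relative telomere signal.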

  5. BSQA: integrated text mining using entity relation semantics extracted from biological literature of insects.

    PubMed

    He, Xin; Li, Yanen; Khetani, Radhika; Sanders, Barry; Lu, Yue; Ling, Xu; Zhai, Chengxiang; Schatz, Bruce

    2010-07-01

    Text mining is one promising way of extracting information automatically from the vast biological literature. To maximize its potential, the knowledge encoded in the text should be translated to some semantic representation, such as entities and relations, which can be analyzed by machines. But large-scale practical systems for this purpose are rare. We present the BeeSpace question/answering (BSQA) system, which performs integrated text mining for insect biology, covering diverse aspects from molecular interactions of genes to insect behavior. BSQA recognizes a number of entities and relations in Medline documents about the model insect, Drosophila melanogaster. For any text query, BSQA exploits entity annotation of retrieved documents to identify important concepts in different categories. By utilizing the extracted relations, BSQA is also able to answer many biologically motivated questions, from simple ones, such as which anatomical part a gene is expressed in, to more complex ones involving multiple types of relations. BSQA is freely available at http://www.beespace.uiuc.edu/QuestionAnswer.

  6. Semi-automated fault system extraction and displacement analysis of an excavated oyster reef using high-resolution laser scanned data

    NASA Astrophysics Data System (ADS)

    Molnár, Gábor; Székely, Balázs; Harzhauser, Mathias; Djuricic, Ana; Mandic, Oleg; Dorninger, Peter; Nothegger, Clemens; Exner, Ulrike; Pfeifer, Norbert

    2015-04-01

    In this contribution we present a semi-automated method for reconstructing the brittle deformation field of an excavated Miocene oyster reef in Stetten, Korneuburg Basin, Lower Austria. Oyster shells up to 80 cm in size were scattered in a shallow estuarine bay, forming a continuous and almost isochronous layer as a consequence of a catastrophic event in the Miocene. This shell bed was preserved by burial under several hundred meters of sandy to silty sediments. Later, the layers were tilted westward and uplifted, and erosion almost exhumed them. An excavation revealed a 27 by 17 meter area of the oyster-covered layer. During the tectonic processes the sediment volume suffered brittle deformation. Faults, mostly NW-SE striking with normal components of a few centimeters, affected the oyster-covered volume, dissecting many shells as well as the surrounding matrix. Faults and their displacements can typically be traced for several meters across the site, and as fossil oysters are broken and their parts displaced by the faulting, it is possible to follow these displacements in 3D along some faults. In order to quantify these varying displacements and to map the undulating fault traces, high-resolution scanning of the excavated and cleaned surface of the oyster bed has been carried out using a terrestrial laser scanner. The resulting point clouds have been co-georeferenced at mm accuracy and a 1 mm resolution 3D point cloud of the surface has been created. As the faults are well represented in the point cloud, this enables us to measure the dislocations of the dissected shell parts along the fault lines. We used a semi-automatic method to quantify these dislocations. First, we manually digitized the fault lines in 2D as an initial model. In the next step we estimated the vertical (i.e., perpendicular to the layer) component of the dislocation along these fault lines by comparing the elevations on the two sides of the faults with moving averaging windows.
To estimate the strike
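The averaging-window step described in the abstract can be sketched in simplified form: compare mean elevations in windows on either side of a digitized fault trace along a profile crossing it. This is an illustrative reconstruction with invented window parameters, not the authors' implementation.

```python
def vertical_offset(points, fault_x, window=0.5):
    """Estimate the vertical (layer-perpendicular) displacement component
    across a fault by differencing mean elevations in averaging windows on
    either side of it. `points` is a list of (x, z) samples along a profile
    that crosses the fault trace at x = fault_x; units are arbitrary."""
    left = [z for x, z in points if fault_x - window <= x < fault_x]
    right = [z for x, z in points if fault_x < x <= fault_x + window]
    if not left or not right:
        return None  # window does not sample both sides of the fault
    return sum(right) / len(right) - sum(left) / len(left)
```

Sliding `fault_x` along the digitized trace gives the laterally varying offset profile that the authors map along each fault.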

  7. Drought Resilience of Water Supplies for Shale Gas Extraction and Related Power Generation in Texas

    NASA Astrophysics Data System (ADS)

    Reedy, R. C.; Scanlon, B. R.; Nicot, J. P.; Uhlman, K.

    2014-12-01

    There is considerable concern about water availability to support energy production in Texas, particularly considering that many of the shale plays are in semiarid areas of Texas and the state experienced the most extreme drought on record in 2011. The Eagle Ford shale play provides an excellent case study. Hydraulic fracturing water use for shale gas extraction in the play totaled ~ 12 billion gallons (bgal) in 2012, representing ~7 - 10% of total water use in the 16 county play area. The dominant source of water is groundwater which is not highly vulnerable to drought from a recharge perspective because water is primarily stored in the confined portion of aquifers that were recharged thousands of years ago. Water supply drought vulnerability results primarily from increased water use for irrigation. Irrigation water use in the Eagle Ford play was 30 billion gallons higher in the 2011 drought year relative to 2010. Recent trends toward increased use of brackish groundwater for shale gas extraction in the Eagle Ford also reduce pressure on fresh water resources. Evaluating the impacts of natural gas development on water resources should consider the use of natural gas in power generation, which now represents 50% of power generation in Texas. Water consumed in extracting the natural gas required for power generation is equivalent to ~7% of the water consumed in cooling these power plants in the state. However, natural gas production from shale plays can be overall beneficial in terms of water resources in the state because natural gas combined cycle power generation decreases water consumption by ~60% relative to traditional coal, nuclear, and natural gas plants that use steam turbine generation. This reduced water consumption enhances drought resilience of power generation in the state. In addition, natural gas combined cycle plants provide peaking capacity that complements increasing renewable wind generation which has no cooling water requirement. 
However, water

  8. Use of relational database management system by clinicians to create automated MICU progress note from existent data sources.

    PubMed

    Delaney, D P; Zibrak, J D; Samore, M; Peterson, M

    1997-01-01

    We designed and built an application called MD Assist that compiles data from several hospital databases to create reports used for daily house officer rounding in the medical intensive care unit (MICU). After rounding, the report becomes the objective portion of the daily "SOAP" MICU progress note. All data used in the automated note was available in digital format residing in an institution wide Sybase data repository which had been built to fulfill data needs of the parent enterprise. From initial design of target output through actual creation and implementation in the MICU, MD Assist was created by physicians with only consultative help from information systems (IS). This project demonstrated a method for rapidly developing time saving, clinically useful applications using a comprehensive clinical data repository.
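The MD Assist approach, compiling the objective portion of a progress note from data already residing in institutional databases, can be sketched with a relational query. The table and column names below are hypothetical stand-ins; they are not those of the actual Sybase repository.

```python
import sqlite3

# Minimal in-memory stand-in for a clinical data repository.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE labs (mrn TEXT, test TEXT, value REAL, drawn_at TEXT);
    CREATE TABLE vitals (mrn TEXT, sign TEXT, value REAL, taken_at TEXT);
    INSERT INTO labs VALUES ('001', 'creatinine', 1.4, '2024-01-02 06:00');
    INSERT INTO vitals VALUES ('001', 'heart_rate', 92, '2024-01-02 07:00');
""")

def progress_note_objective(mrn):
    """Compile the objective section of a daily progress note for one
    patient by joining results from the separate source tables."""
    vitals = con.execute(
        "SELECT sign, value FROM vitals WHERE mrn = ? ORDER BY taken_at", (mrn,)
    ).fetchall()
    labs = con.execute(
        "SELECT test, value FROM labs WHERE mrn = ? ORDER BY drawn_at", (mrn,)
    ).fetchall()
    return "\n".join(f"{name}: {value}" for name, value in vitals + labs)
```

The point of the design, as in the paper, is that no new data entry is needed: the report is assembled entirely from sources that already exist for other purposes.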

  9. Automation in organizations: Eternal conflict

    NASA Technical Reports Server (NTRS)

    Dieterly, D. L.

    1981-01-01

    Some ideas on and insights into the problems associated with automation in organizations are presented with emphasis on the concept of automation, its relationship to the individual, and its impact on system performance. An analogy is drawn, based on an American folk hero, to emphasize the extent of the problems encountered when dealing with automation within an organization. A model is proposed to focus attention on a set of appropriate dimensions. The function allocation process becomes a prominent aspect of the model. The current state of automation research is mentioned in relation to the ideas introduced. Proposed directions for an improved understanding of automation's effect on the individual's efficiency are discussed. The importance of understanding the individual's perception of the system in terms of the degree of automation is highlighted.

  10. Ginseng berry extract supplementation improves age-related decline of insulin signaling in mice.

    PubMed

    Seo, Eunhui; Kim, Sunmi; Lee, Sang Jun; Oh, Byung-Chul; Jun, Hee-Sook

    2015-04-01

    The aim of this study was to evaluate the effects of ginseng berry extract on insulin sensitivity and associated molecular mechanisms in aged mice. C57BL/6 mice (15 months old) were maintained on a regular diet (CON) or a regular diet supplemented with 0.05% ginseng berry extract (GBD) for 24 or 32 weeks. GBD-fed mice showed significantly lower serum insulin levels (p = 0.016) and insulin resistance scores (HOMA-IR) (p = 0.012), suggesting that GBD improved insulin sensitivity. Pancreatic islet hypertrophy was also ameliorated in GBD-fed mice (p = 0.007). Protein levels of tyrosine phosphorylated insulin receptor substrate (IRS)-1 (p = 0.047), and protein kinase B (AKT) (p = 0.037), were up-regulated in the muscle of insulin-injected GBD-fed mice compared with CON-fed mice. The expressions of forkhead box protein O1 (FOXO1) (p = 0.036) and peroxisome proliferator-activated receptor gamma (PPARγ) (p = 0.032), which are known as aging- and insulin resistance-related genes, were also increased in the muscle of GBD-fed mice. We conclude that ginseng berry extract consumption might increase activation of IRS-1 and AKT, contributing to the improvement of insulin sensitivity in aged mice. PMID:25912041
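The HOMA-IR score reported above has a standard closed-form definition that can be computed directly. The formula below is the widely used homeostatic model approximation, not anything specific to this study, and the example values are invented.

```python
def homa_ir(glucose_mmol_l, insulin_uU_ml):
    """Homeostatic Model Assessment of insulin resistance (HOMA-IR):
    fasting glucose (mmol/L) times fasting insulin (uU/mL), divided by 22.5.
    Lower scores indicate better insulin sensitivity, as reported for the
    ginseng-berry-extract (GBD) group."""
    return glucose_mmol_l * insulin_uU_ml / 22.5
```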

  11. Ginseng Berry Extract Supplementation Improves Age-Related Decline of Insulin Signaling in Mice

    PubMed Central

    Seo, Eunhui; Kim, Sunmi; Lee, Sang Jun; Oh, Byung-Chul; Jun, Hee-Sook

    2015-01-01

    The aim of this study was to evaluate the effects of ginseng berry extract on insulin sensitivity and associated molecular mechanisms in aged mice. C57BL/6 mice (15 months old) were maintained on a regular diet (CON) or a regular diet supplemented with 0.05% ginseng berry extract (GBD) for 24 or 32 weeks. GBD-fed mice showed significantly lower serum insulin levels (p = 0.016) and insulin resistance scores (HOMA-IR) (p = 0.012), suggesting that GBD improved insulin sensitivity. Pancreatic islet hypertrophy was also ameliorated in GBD-fed mice (p = 0.007). Protein levels of tyrosine phosphorylated insulin receptor substrate (IRS)-1 (p = 0.047), and protein kinase B (AKT) (p = 0.037), were up-regulated in the muscle of insulin-injected GBD-fed mice compared with CON-fed mice. The expressions of forkhead box protein O1 (FOXO1) (p = 0.036) and peroxisome proliferator-activated receptor gamma (PPARγ) (p = 0.032), which are known as aging- and insulin resistance-related genes, were also increased in the muscle of GBD-fed mice. We conclude that ginseng berry extract consumption might increase activation of IRS-1 and AKT, contributing to the improvement of insulin sensitivity in aged mice. PMID:25912041

  12. Extraction of Keywords Related with Stock Price Change from Bloggers' Hot Topics

    NASA Astrophysics Data System (ADS)

    Hara, Shinji; Nadamoto, Hironori; Horiuchi, Tadashi

    This paper presents a method for extracting keywords related to actual stock price changes from bloggers' hot topics. We implemented a computer program to collect bloggers' hot topics about stocks together with the actual stock prices. Important keywords that correlate with changes in the stock prices are then selected based on stochastic complexity. We classify stock price change information using text classification methods such as Naive Bayes and decision tree learning, and confirm the effectiveness of our method through classification experiments.
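The classification step mentioned in the abstract can be sketched with a minimal multinomial Naive Bayes classifier over bag-of-words features. The training snippets and up/down labels below are invented for illustration.

```python
import math
from collections import Counter, defaultdict

# Toy training data: blog-topic snippets paired with a price-change label.
train = [
    ("earnings beat strong guidance", "up"),
    ("record profit upgrade", "up"),
    ("lawsuit recall weak sales", "down"),
    ("downgrade miss loss", "down"),
]

label_counts = Counter(label for _, label in train)
word_counts = defaultdict(Counter)
for text, label in train:
    word_counts[label].update(text.split())
vocab = {w for counts in word_counts.values() for w in counts}

def classify(text):
    """Return the label maximising log P(label) + sum log P(word | label),
    with add-one (Laplace) smoothing over the vocabulary."""
    def score(label):
        total = sum(word_counts[label].values())
        s = math.log(label_counts[label] / len(train))
        for w in text.split():
            s += math.log((word_counts[label][w] + 1) / (total + len(vocab)))
        return s
    return max(label_counts, key=score)
```

In the paper's setting the features would be the keywords selected by stochastic complexity rather than raw tokens, but the decision rule is the same.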

  13. On the effects of a plant extract of Orthosiphon stamineus on sebum-related skin imperfections.

    PubMed

    Vogelgesang, B; Abdul-Malak, N; Reymermier, C; Altobelli, C; Saget, J

    2011-02-01

    Overproduction of sebum is very common and results in an undesirable oily, shiny complexion with enlarged pores. Sebum secretion is basically under the control of 5-α reductase, and more particularly under that of type 1 isozyme. But it is also highly sensitive to environmental factors such as temperature, humidity and food. Moreover, in Asia, the edicts of a flawless facial skin turn oily skin into a major concern for Asian women. We identified Orthosiphon stamineus leaf extract as an interesting ingredient for reducing the oily appearance of skin thanks to its ability to reduce 5-α reductase type 1 expression in normal human epidermal keratinocytes in vitro. This was confirmed ex vivo, where Orthosiphon stamineus leaf extract was shown to reduce 5-α reductase activity as well as the production of squalene, one of the main components of sebum that was used as a tracer of sebum. To evaluate the efficacy of Orthosiphon stamineus leaf extract at reducing sebum-related skin imperfections in vivo, we performed two different clinical studies, one in France on a panel of Caucasian volunteers and the other one in Thailand on a panel of Asian volunteers. Using instrumental techniques as well as clinical evaluation and self-evaluation, we could highlight that an O/W cosmetic formula containing 2% of Orthosiphon stamineus leaf extract could visibly reduce the oily appearance of skin as well as the size of pores, thus leading to a significant improvement of complexion evenness and radiance. Overall, the results obtained were better than those observed with the same formula containing 1% of zinc gluconate, an ingredient frequently used in oily skin care products. PMID:20807263

  14. On the Agreement between Manual and Automated Methods for Single-Trial Detection and Estimation of Features from Event-Related Potentials.

    PubMed

    Biurrun Manresa, José A; Arguissain, Federico G; Medina Redondo, David E; Mørch, Carsten D; Andersen, Ole K

    2015-01-01

    The agreement between humans and algorithms on whether an event-related potential (ERP) is present or not and the level of variation in the estimated values of its relevant features are largely unknown. Thus, the aim of this study was to determine the categorical and quantitative agreement between manual and automated methods for single-trial detection and estimation of ERP features. To this end, ERPs were elicited in sixteen healthy volunteers using electrical stimulation at graded intensities below and above the nociceptive withdrawal reflex threshold. Presence/absence of an ERP peak (categorical outcome) and its amplitude and latency (quantitative outcome) in each single-trial were evaluated independently by two human observers and two automated algorithms taken from existing literature. Categorical agreement was assessed using percentage positive and negative agreement and Cohen's κ, whereas quantitative agreement was evaluated using Bland-Altman analysis and the coefficient of variation. Typical values for the categorical agreement between manual and automated methods were derived, as well as reference values for the average and maximum differences that can be expected if one method is used instead of the others. Results showed that the human observers presented the highest categorical and quantitative agreement, and there were significantly large differences between detection and estimation of quantitative features among methods. In conclusion, substantial care should be taken in the selection of the detection/estimation approach, since factors like stimulation intensity and expected number of trials with/without response can play a significant role in the outcome of a study.
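The two agreement statistics named above, Cohen's κ for the categorical present/absent judgements and the coefficient of variation for the quantitative feature estimates, can be computed directly. A plain-Python sketch, not the authors' analysis code:

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two raters' binary judgements
    (1 = ERP peak present, 0 = absent): observed agreement corrected
    for the agreement expected by chance."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n        # observed agreement
    p_yes = (sum(a) / n) * (sum(b) / n)               # chance both say "present"
    p_no = (1 - sum(a) / n) * (1 - sum(b) / n)        # chance both say "absent"
    pe = p_yes + p_no
    return (po - pe) / (1 - pe)

def coefficient_of_variation(values):
    """CV = standard deviation / mean, e.g. across repeated single-trial
    amplitude or latency estimates of the same ERP feature."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    return var ** 0.5 / mean
```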

  15. On the Agreement between Manual and Automated Methods for Single-Trial Detection and Estimation of Features from Event-Related Potentials

    PubMed Central

    Biurrun Manresa, José A.; Arguissain, Federico G.; Medina Redondo, David E.; Mørch, Carsten D.; Andersen, Ole K.

    2015-01-01

    The agreement between humans and algorithms on whether an event-related potential (ERP) is present or not and the level of variation in the estimated values of its relevant features are largely unknown. Thus, the aim of this study was to determine the categorical and quantitative agreement between manual and automated methods for single-trial detection and estimation of ERP features. To this end, ERPs were elicited in sixteen healthy volunteers using electrical stimulation at graded intensities below and above the nociceptive withdrawal reflex threshold. Presence/absence of an ERP peak (categorical outcome) and its amplitude and latency (quantitative outcome) in each single-trial were evaluated independently by two human observers and two automated algorithms taken from existing literature. Categorical agreement was assessed using percentage positive and negative agreement and Cohen’s κ, whereas quantitative agreement was evaluated using Bland-Altman analysis and the coefficient of variation. Typical values for the categorical agreement between manual and automated methods were derived, as well as reference values for the average and maximum differences that can be expected if one method is used instead of the others. Results showed that the human observers presented the highest categorical and quantitative agreement, and there were significantly large differences between detection and estimation of quantitative features among methods. In conclusion, substantial care should be taken in the selection of the detection/estimation approach, since factors like stimulation intensity and expected number of trials with/without response can play a significant role in the outcome of a study. PMID:26258532

  16. Habitat automation

    NASA Technical Reports Server (NTRS)

    Swab, Rodney E.

    1992-01-01

    A habitat, on either the surface of the Moon or Mars, will be designed and built with the proven technologies of that day. These technologies will be mature and readily available to the habitat designer. We believe an acceleration of the normal pace of automation would allow a habitat to be safer and more easily maintained than would be the case otherwise. This document examines the operation of a habitat and describes elements of that operation which may benefit from an increased use of automation. Research topics within the automation realm are then defined and discussed with respect to the role they can have in the design of the habitat. Problems associated with the integration of advanced technologies into real-world projects at NASA are also addressed.

  17. Blastocyst microinjection automation.

    PubMed

    Mattos, Leonardo S; Grant, Edward; Thresher, Randy; Kluckman, Kimberly

    2009-09-01

    Blastocyst microinjections are routinely involved in the process of creating genetically modified mice for biomedical research, but their efficiency is highly dependent on the skills of the operators. As a consequence, much time and resources are required for training microinjection personnel. This situation has been aggravated by the rapid growth of genetic research, which has increased the demand for mutant animals. Therefore, increased productivity and efficiency in this area are highly desired. Here, we pursue these goals through the automation of a previously developed teleoperated blastocyst microinjection system. This included the design of a new system setup to facilitate automation, the definition of rules for automatic microinjections, the implementation of video processing algorithms to extract feedback information from microscope images, and the creation of control algorithms for process automation. Experimentation conducted with this new system and operator assistance during the cells delivery phase demonstrated a 75% microinjection success rate. In addition, implantation of the successfully injected blastocysts resulted in a 53% birth rate and a 20% yield of chimeras. These results proved that the developed system was capable of automatic blastocyst penetration and retraction, demonstrating the success of major steps toward full process automation.

  18. PPI-IRO: a two-stage method for protein-protein interaction extraction based on interaction relation ontology.

    PubMed

    Li, Chuan-Xi; Chen, Peng; Wang, Ru-Jing; Wang, Xiu-Jie; Su, Ya-Ru; Li, Jinyan

    2014-01-01

    Mining Protein-Protein Interactions (PPIs) from the fast-growing biomedical literature has been proven to be an effective approach for the identification of biological regulatory networks. This paper presents a novel method based on an Interaction Relation Ontology (IRO), which specifies and organises words describing various protein interaction relationships. Our method is a two-stage PPI extraction method. First, the IRO is applied in a binary classifier to determine whether sentences contain a relation or not. Then, the IRO is used to guide PPI extraction by building a sentence dependency parse tree. Comprehensive and quantitative evaluations and detailed analyses demonstrate the significant performance of the IRO on relation sentence classification and PPI extraction. Our PPI extraction method yielded recalls of around 80% and 90% and F1 scores of around 54% and 66% on the AIMed and BioInfer corpora, respectively, which are superior to most existing extraction methods. PMID:25757257
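The two-stage design described above can be caricatured in a few lines: a keyword-based stand-in for the relation-sentence classifier, followed by extraction of protein pairs from sentences that pass. The word list and protein lexicon are illustrative, and the actual method uses an ontology and dependency parsing rather than token matching.

```python
# Illustrative stand-ins for the Interaction Relation Ontology terms and
# for a named-entity recognition step over protein mentions.
INTERACTION_WORDS = {"binds", "phosphorylates", "activates", "inhibits"}
PROTEINS = {"BRCA1", "TP53", "MDM2", "AKT1"}

def stage1_is_relation_sentence(sentence):
    """Stage 1 (binary classifier stand-in): does the sentence contain an
    interaction-relation term?"""
    return any(w in sentence.lower().split() for w in INTERACTION_WORDS)

def stage2_extract_pairs(sentence):
    """Stage 2: extract candidate interacting protein pairs, but only from
    sentences that passed the stage-1 filter."""
    if not stage1_is_relation_sentence(sentence):
        return []
    found = [t.strip(".,") for t in sentence.split() if t.strip(".,") in PROTEINS]
    return [(a, b) for i, a in enumerate(found) for b in found[i + 1:]]
```

Filtering first keeps the expensive extraction stage focused on sentences likely to contain a relation, which is the efficiency argument for the two-stage split.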

  19. PPI-IRO: a two-stage method for protein-protein interaction extraction based on interaction relation ontology.

    PubMed

    Li, Chuan-Xi; Chen, Peng; Wang, Ru-Jing; Wang, Xiu-Jie; Su, Ya-Ru; Li, Jinyan

    2014-01-01

    Mining Protein-Protein Interactions (PPIs) from the fast-growing biomedical literature has been proven to be an effective approach for the identification of biological regulatory networks. This paper presents a novel method based on an Interaction Relation Ontology (IRO), which specifies and organises words describing various protein interaction relationships. Our method is a two-stage PPI extraction method. First, the IRO is applied in a binary classifier to determine whether sentences contain a relation or not. Then, the IRO is used to guide PPI extraction by building a sentence dependency parse tree. Comprehensive and quantitative evaluations and detailed analyses demonstrate the significant performance of the IRO on relation sentence classification and PPI extraction. Our PPI extraction method yielded recalls of around 80% and 90% and F1 scores of around 54% and 66% on the AIMed and BioInfer corpora, respectively, which are superior to most existing extraction methods.

  20. Quantification of five compounds with heterogeneous physicochemical properties (morphine, 6-monoacetylmorphine, cyamemazine, meprobamate and caffeine) in 11 fluids and tissues, using automated solid-phase extraction and gas chromatography-tandem mass spectrometry.

    PubMed

    Bévalot, Fabien; Bottinelli, Charline; Cartiser, Nathalie; Fanton, Laurent; Guitton, Jérôme

    2014-06-01

    An automated solid-phase extraction (SPE) protocol followed by gas chromatography coupled with tandem mass spectrometry was developed for quantification of caffeine, cyamemazine, meprobamate, morphine and 6-monoacetylmorphine (6-MAM) in 11 biological matrices [blood, urine, bile, vitreous humor, liver, kidney, lung and skeletal muscle, brain, adipose tissue and bone marrow (BM)]. The assay was validated for linearity, within- and between-day precision and accuracy, limits of quantification, selectivity, extraction recovery (ER), sample dilution and autosampler stability on BM. For the other matrices, partial validation was performed (limits of quantification, linearity, within-day precision, accuracy, selectivity and ER). The lower limits of quantification were 12.5 ng/mL(ng/g) for 6-MAM, morphine and cyamemazine, 100 ng/mL(ng/g) for meprobamate and 50 ng/mL(ng/g) for caffeine. Analysis of real-case samples demonstrated the performance of the assay in forensic toxicology to investigate challenging cases in which, for example, blood is not available or in which analysis in alternative matrices could be relevant. The SPE protocol was also assessed as an extraction procedure that could target other relevant analytes of interest. The extraction procedure was applied to 12 molecules of forensic interest with various physicochemical properties (alimemazine, alprazolam, amitriptyline, citalopram, cocaine, diazepam, levomepromazine, nordazepam, tramadol, venlafaxine, pentobarbital and phenobarbital). All drugs were able to be detected at therapeutic concentrations in blood and in the alternate matrices. PMID:24790060

  1. Validation of high-throughput measurement system with microwave-assisted extraction, fully automated sample preparation device, and gas chromatography-electron capture detector for determination of polychlorinated biphenyls in whale blubber.

    PubMed

    Fujita, Hiroyuki; Honda, Katsuhisa; Hamada, Noriaki; Yasunaga, Genta; Fujise, Yoshihiro

    2009-02-01

    Validation of a high-throughput measurement system comprising microwave-assisted extraction (MAE), a fully automated sample preparation device (SPD), and gas chromatography-electron capture detection (GC-ECD) was performed for the determination of polychlorinated biphenyls (PCBs) in minke whale blubber. PCB congeners accounting for >95% of the total PCB burden in blubber were efficiently extracted with a small volume (20 mL) of n-hexane using MAE, owing to simultaneous saponification and extraction. The crude extract obtained by MAE was then rapidly purified and automatically solvent-exchanged into a small volume (1 mL) of toluene using the SPD, without concentrators. The concentration of PCBs in the purified, concentrated solution was accurately determined by GC-ECD. An accuracy test using a certified reference material (SRM 1588b, cod liver oil) showed good agreement with the NIST certified concentration values. The method quantification limit for total PCBs in whale blubber was 41 ng g(-1), and the complete measurement takes only four hours. These results indicate that the method is well suited for the monitoring and screening of PCBs in support of marine ecosystem conservation and the safe distribution of foods.

  2. A knowledge-poor approach to chemical-disease relation extraction

    PubMed Central

    Alam, Firoj; Corazza, Anna; Lavelli, Alberto; Zanoli, Roberto

    2016-01-01

    The article describes a knowledge-poor approach to the task of extracting chemical-disease relations from PubMed abstracts. A first version of the approach was applied during participation in BioCreative V track 3, in both Disease Named Entity Recognition and Normalization (DNER) and chemical-induced disease (CID) relation extraction. For both tasks, we adopted a general-purpose approach based on machine learning techniques, integrated with a limited number of domain-specific knowledge resources and using freely available tools for preprocessing data. Crucially, the system uses only the data sets provided by the organizers. The aim is to design an easily portable approach with a limited need for domain-specific knowledge resources. In the BioCreative V task, we ranked 5th out of 16 in DNER and 7th out of 18 in CID. In this article, we present a follow-up study, in particular on CID, performing further experiments, extending our approach, and improving its performance. PMID:27189609
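    A knowledge-poor approach of this kind rests on features computable from the text alone. The sketch below, with invented feature names and an invented example sentence, shows the sort of surface features one might extract for a chemical-disease candidate pair before feeding them to a machine-learning classifier; it illustrates the general idea, not the authors' actual feature set.

```python
def pair_features(tokens, chem_idx, dis_idx):
    """Surface features for one chemical-disease candidate pair, computed
    from the abstract text alone (no external knowledge resources)."""
    lo, hi = sorted((chem_idx, dis_idx))
    between = tokens[lo + 1:hi]
    return {
        "token_distance": hi - lo,                   # mention separation
        "chem_first": chem_idx < dis_idx,            # mention order
        "induce_between": any(t.startswith("induc")  # trigger-word cue
                              for t in between),
        "bow_between": sorted(set(between)),         # words between mentions
    }

toks = "tacrine induced hepatotoxicity in patients".split()
print(pair_features(toks, 0, 2))
```

    Such feature dicts can be vectorized and passed to any off-the-shelf classifier, which is what keeps the approach easily portable across domains.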

  3. Introduction of a modular automated voltage-clamp platform and its correlation with manual human Ether-à-go-go related gene voltage-clamp data.

    PubMed

    Scheel, Olaf; Himmel, Herbert; Rascher-Eggstein, Gesa; Knott, Thomas

    2011-12-01

    In investigating ion channel pharmacology, the manual patch clamp is still considered the gold standard for data quality, notwithstanding the major drawbacks of low throughput and the need for skilled operators. The automated patch clamp platform CytoPatch™ Instrument overcomes these restrictions. Its modular fully automated design makes it possible to obtain scalable throughput without the need for well-trained operators. Its chip design and perfusion system reproduces the manual patch technique, thus ensuring optimal data quality. Further, the use of stably transfected frozen cells, usable immediately after thawing, eliminates the cell quality impairment and low success rates associated with a running cell culture and renders screening costs accurately calculable. To demonstrate the applicability of this platform, 18 blinded compounds were assessed for their impact on the cardiac human Ether-à-go-go related gene K(+) channel. The IC(50) values obtained by the CytoPatch Instrument using the frozen human embryonic kidney 293 cells showed a high correlation (R(2)=0.928) with those obtained from manual patch clamp recordings with human embryonic kidney 293 cells from a running cell culture. Moreover, this correlation extended to sticky compounds such as terfenadine or astemizole. In conclusion, the CytoPatch Instrument operated with frozen cells ready to use directly after thawing provides the same high data quality known from the manual voltage clamp and has the added benefit of enhanced throughput for use in ion channel screening and safety assessment. PMID:21675869

  4. Characterization of cysteine related variants in an IgG2 antibody by LC-MS with an automated data analysis approach.

    PubMed

    Zhang, Yuling; Bailey, Robert; Nightlinger, Nancy; Gillespie, Alison; Balland, Alain; Rogers, Richard

    2015-08-01

    In this communication, a high-throughput method for automated data analysis of cysteine-related product quality attributes (PQAs) in IgG2 antibodies is reported. This method leverages recent advances in the relative quantification of PQAs to facilitate the characterization of disulfide variants and free sulfhydryls (SHs) in IgG2 antibodies. The method uses samples labeled with a mass tag (N-ethyl maleimide [NEM]) followed by enzymatic digestion under non-reducing conditions to maintain the cysteine connectivity. The digested IgG2 samples are separated and detected by mass spectrometry (MS) and the resulting peptide map is analyzed in an automated fashion using Pinpoint software (Thermo Scientific). Previous knowledge of IgG2 disulfide structures can be fed into the Pinpoint software to create workbooks for various disulfide linkages and hinge disulfide variants. In addition, the NEM mass tag can be added to the workbooks for targeted analysis of labeled cysteine-containing peptides. The established Pinpoint workbooks are a high-throughput approach to quantify relative abundances of unpaired cysteines and disulfide linkages, including complicated hinge disulfide variants. This approach is especially efficient for comparing large sets of similar samples such as those created in comparability and stability studies or chromatographic fractions. Here, the high throughput method is applied to quantify the relative abundance of hinge disulfide variants and unpaired cysteines in the IgG2 fractions from non-reduced reversed-phase high-performance liquid chromatography (nrRP-HPLC). The LC-MS data analyzed by the Pinpoint workbook suggests that the nrRP-HPLC separated peaks contain hinge disulfide isoforms and free cysteine pairs for each major disulfide isoform structure.

  5. Automated dispersive liquid-liquid microextraction-gas chromatography-mass spectrometry.

    PubMed

    Guo, Liang; Lee, Hian Kee

    2014-04-15

    An innovative automated procedure, low-density-solvent-based/solvent-demulsification dispersive liquid-liquid microextraction (automated DLLME) coupled to gas chromatography-mass spectrometry (GC/MS), has been developed. The most significant innovation of the method is the automation: the entire procedure, including the extraction of the model analytes (phthalate esters) by DLLME from the aqueous sample solution, breaking up of the emulsion after extraction, collection of the extract, and analysis of the extract by GC/MS, was completely automated. The use of a low-density extraction solvent and of the solvent demulsification technique to break up the emulsion simplified the procedure and facilitated its automation. Orthogonal array design (OAD) was employed as an efficient optimization strategy for the extraction parameters, with all experiments conducted automatically. An OA16 (4(1) × 2(12)) matrix was initially employed to identify the relevant extraction parameters (type and volume of extraction solvent, type and volume of dispersive solvent and demulsification solvent, demulsification time, and injection speed). On the basis of those results, more levels (values) of five extraction parameters were investigated with an OA16 (4(5)) matrix and quantitatively assessed by analysis of variance (ANOVA). Enrichment factors of between 178- and 272-fold were obtained for the phthalate esters. The linear ranges were 0.1-50 μg/L or 0.2-50 μg/L, depending on the analyte. Good limits of detection (0.01 to 0.02 μg/L) and satisfactory repeatability (relative standard deviations below 5.9%) were obtained. The proposed method demonstrates for the first time integrated sample preparation by DLLME and analysis by GC/MS operated automatically across multiple experiments.
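    The screening logic behind an orthogonal array design can be sketched in a few lines: each factor's main effect is estimated as the mean response at each of its levels, which the balanced array makes a fair comparison. The tiny OA4 (2^3) array and the response values below are invented for illustration; the study itself used the much larger OA16 matrices described above.

```python
# A four-run OA4 (2^3) orthogonal array: three two-level factors, each level
# appearing equally often and balanced against the other factors' levels.
oa4 = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]
response = [178, 210, 195, 272]        # invented responses (e.g. enrichment)

def main_effects(design, y):
    """Mean response at each level of each factor."""
    effects = []
    for f in range(len(design[0])):
        by_level = {}
        for run, yi in zip(design, y):
            by_level.setdefault(run[f], []).append(yi)
        effects.append({lvl: sum(v) / len(v) for lvl, v in by_level.items()})
    return effects

eff = main_effects(oa4, response)
print(eff[0])  # {0: 194.0, 1: 233.5} -> level 1 of factor 0 looks better
```

    In a real OAD study these level means would then be tested by ANOVA, as the abstract describes, to decide which factors matter.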

  6. Automated dispenser

    SciTech Connect

    Hollen, R.M.; Stalnaker, N.D.

    1989-04-06

    An automated dispenser having a conventional pipette attached to an actuating cylinder through a flexible cable for delivering precise quantities of a liquid through commands from remotely located computer software. The travel of the flexible cable is controlled by adjustable stops and a locking shaft. The pipette can be positioned manually or by the hands of a robot. 1 fig.

  7. Automating Finance

    ERIC Educational Resources Information Center

    Moore, John

    2007-01-01

    In past years, higher education's financial management side has been riddled with manual processes and aging mainframe applications. This article discusses schools which had taken advantage of an array of technologies that automate billing, payment processing, and refund processing in the case of overpayment. The investments are well worth it:…

  8. Automated protein NMR resonance assignments.

    PubMed

    Wan, Xiang; Xu, Dong; Slupsky, Carolyn M; Lin, Guohui

    2003-01-01

    NMR resonance peak assignment is one of the key steps in solving an NMR protein structure. The assignment process links resonance peaks to individual residues of the target protein sequence, providing the prerequisite for establishing intra- and inter-residue spatial relationships between atoms. The assignment process is tedious and time-consuming and can take many weeks. Though a number of computer programs exist to assist the assignment process, many NMR labs still perform assignments manually to ensure quality. This paper presents (1) a new scoring scheme for mapping spin systems to residues, (2) an automated procedure for extracting adjacency information from NMR spectra, and (3) a very fast assignment algorithm, based on our previously proposed greedy filtering method and a maximum matching algorithm, to automate the assignment process. Computational tests on 70 instances of (pseudo-)experimental NMR data from 14 proteins demonstrate that the new scoring scheme has much better discerning power when aided by adjacency information between spin systems simulated across various NMR spectra. Typically, with automated extraction of adjacency information, our method achieves nearly complete assignments for most of the proteins. These experiments suggest that the fast automated assignment algorithm, together with the new scoring scheme and automated adjacency extraction, may be ready for practical use. PMID:16452794
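    The mapping of spin systems to residues described above can be posed as maximum bipartite matching over a score matrix. The sketch below uses Kuhn's augmenting-path algorithm with an invented score matrix and threshold; it illustrates the matching step only, not the paper's scoring scheme or greedy filtering.

```python
def assign(score, threshold=0.5):
    """Map spin systems to residues by maximum bipartite matching
    (Kuhn's augmenting-path algorithm); edges require score >= threshold."""
    n_spin, n_res = len(score), len(score[0])
    match_res = [-1] * n_res  # residue index -> assigned spin system

    def augment(s, seen):
        for r in range(n_res):
            if score[s][r] >= threshold and r not in seen:
                seen.add(r)
                # Residue r is free, or its current owner can be re-routed.
                if match_res[r] == -1 or augment(match_res[r], seen):
                    match_res[r] = s
                    return True
        return False

    for s in range(n_spin):
        augment(s, set())
    return {s: r for r, s in enumerate(match_res) if s != -1}

# Toy score matrix: rows are spin systems, columns are residues.
score = [[0.9, 0.8, 0.1],
         [0.7, 0.2, 0.0],
         [0.1, 0.9, 0.8]]
print(assign(score))  # every spin system receives a distinct residue
```

    Note how the augmenting path re-routes spin system 0 from residue 0 to residue 1 so that spin system 1 can also be placed; a greedy pass alone would leave one of them unassigned.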

  9. Evaluation of the chemical compatibility of plastic contact materials and pharmaceutical products; safety considerations related to extractables and leachables.

    PubMed

    Jenke, Dennis

    2007-10-01

    A review is provided on the general topic of the compatibility of plastic materials with pharmaceutical products, with specific emphasis on the safety aspects associated with extractables and leachables related to such plastic materials. PMID:17701994

  10. Separating arterial and venous-related components of photoplethysmographic signals for accurate extraction of oxygen saturation and respiratory rate.

    PubMed

    Yousefi, Rasoul; Nourani, Mehrdad

    2015-05-01

    We propose an algorithm for separating arterial and venous-related signals using second-order statistics of red and infrared signals in a blind source separation technique. The separated arterial signal is used to compute accurate arterial oxygen saturation. We have also introduced an algorithm for extracting the respiratory pattern from the extracted venous-related signal. In addition to real-time monitoring, respiratory rate is also extracted. Our experimental results from multiple subjects show that the proposed separation technique is extremely useful for extracting accurate arterial oxygen saturation and respiratory rate. Specifically, the breathing rate is extracted with average root mean square deviation of 1.89 and average mean difference of -0.69. PMID:25055387
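    The separation step above relies on second-order statistics of the two channels. A generic algorithm of that family is AMUSE (whitening followed by diagonalization of one time-lagged covariance matrix); the sketch below applies it to synthetic two-channel data and illustrates the technique class, not the authors' specific algorithm. The frequencies and mixing matrix are invented.

```python
import numpy as np

def amuse(X, lag=1):
    """AMUSE: whiten the mixtures, then rotate with the eigenvectors of a
    single time-lagged covariance matrix (second-order statistics only)."""
    X = X - X.mean(axis=1, keepdims=True)
    C0 = X @ X.T / X.shape[1]
    d, E = np.linalg.eigh(C0)
    W = E @ np.diag(1.0 / np.sqrt(d)) @ E.T      # whitening matrix
    Z = W @ X
    C1 = Z[:, :-lag] @ Z[:, lag:].T / (Z.shape[1] - lag)
    _, V = np.linalg.eigh((C1 + C1.T) / 2)       # symmetrize, then rotate
    return V.T @ Z

# Synthetic "red"/"infrared" channels: a fast pulsatile (arterial-like) tone
# mixed with a slow (respiratory/venous-like) tone.
t = np.arange(0, 10, 0.01)                       # 10 s at 100 Hz
S = np.vstack([np.sin(2 * np.pi * 1.2 * t),      # ~72 bpm "pulse"
               np.sin(2 * np.pi * 0.2 * t)])     # ~12 bpm "respiration"
A = np.array([[1.0, 0.6], [0.4, 1.0]])           # unknown channel mixing
Y = amuse(A @ S, lag=25)                         # recovered components
```

    With the components unmixed, the arterial-like one can feed an oxygen-saturation computation and the venous-like one a respiratory-rate estimator, mirroring the split described in the abstract.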

  11. Flight-deck automation: Promises and problems

    NASA Technical Reports Server (NTRS)

    Wiener, E. L.; Curry, R. E.

    1980-01-01

    The state of the art in human factors in flight-deck automation is presented. A number of critical problem areas are identified and broad design guidelines are offered. Automation-related aircraft accidents and incidents are discussed as examples of human factors problems in automated flight.

  12. Automating the analytical laboratory via the Chemical Analysis Automation paradigm

    SciTech Connect

    Hollen, R.; Rzeszutko, C.

    1997-10-01

    To address the need for standardization within the analytical chemistry laboratories of the nation, the Chemical Analysis Automation (CAA) program within the US Department of Energy, Office of Science and Technology's Robotic Technology Development Program is developing laboratory sample analysis systems that will automate the environmental chemical laboratories. The current laboratory automation paradigm consists of islands-of-automation that do not integrate into a system architecture. Thus, today the chemist must perform most aspects of environmental analysis manually using instrumentation that generally cannot communicate with other devices in the laboratory. CAA is working towards a standardized and modular approach to laboratory automation based upon the Standard Analysis Method (SAM) architecture. Each SAM system automates a complete chemical method. The building block of a SAM is known as the Standard Laboratory Module (SLM). The SLM, either hardware or software, automates a subprotocol of an analysis method and can operate as a standalone or as a unit within a SAM. The CAA concept allows the chemist to easily assemble an automated analysis system, from sample extraction through data interpretation, using standardized SLMs without the worry of hardware or software incompatibility or the necessity of generating complicated control programs. A Task Sequence Controller (TSC) software program schedules and monitors the individual tasks to be performed by each SLM configured within a SAM. The chemist interfaces with the operation of the TSC through the Human Computer Interface (HCI), a logical, icon-driven graphical user interface. The CAA paradigm has successfully been applied in automating EPA SW-846 Methods 3541/3620/8081 for the analysis of PCBs in a soil matrix utilizing commercially available equipment in tandem with SLMs constructed by CAA.
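    The SAM/SLM/TSC layering can be caricatured in a few lines of code: a Task Sequence Controller runs Standard Laboratory Module steps in order, threading the sample record through. The module names and the dict-based sample record are invented for illustration; real SLMs are hardware or software modules coordinated through the HCI.

```python
# Toy SLMs: each automates one subprotocol and returns the updated sample.
def extract(sample):
    sample["extracted"] = True
    return sample

def cleanup(sample):
    sample["clean"] = True
    return sample

def analyze(sample):
    sample["result"] = "done"
    return sample

class TaskSequenceController:
    """Schedules SLMs sequentially and monitors the sample passing through."""
    def __init__(self, slms):
        self.slms = list(slms)

    def run(self, sample):
        for slm in self.slms:  # a real TSC also monitors/retries each task
            sample = slm(sample)
        return sample

sam = TaskSequenceController([extract, cleanup, analyze])
print(sam.run({"id": "soil-001"}))
# {'id': 'soil-001', 'extracted': True, 'clean': True, 'result': 'done'}
```

    The design point is that modules share only the sample record's interface, so SLMs can be recombined into new SAMs without custom control programs.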

  13. Extract from Eugenia punicifolia is an antioxidant and inhibits enzymes related to metabolic syndrome.

    PubMed

    Lopes Galeno, Denise Morais; Carvalho, Rosany Piccolotto; Boleti, Ana Paula de Araújo; Lima, Arleilson Sousa; Oliveira de Almeida, Patricia Danielle; Pacheco, Carolina Carvalho; Pereira de Souza, Tatiane; Lima, Emerson Silva

    2014-01-01

    The present study aimed to investigate the in vitro biological activities of an extract of Eugenia punicifolia leaves (EEP), with emphasis on its inhibition of enzymes related to metabolic syndrome and its antioxidant effects. Antioxidant activity was analyzed with in vitro free-radical scavenging assays (DPPH·, ABTS(·+), O2(·−), and NO·) and a cell-based assay. EEP was tested in inhibitory colorimetric assays using the α-amylase, α-glucosidase, xanthine oxidase, and pancreatic lipase enzymes. EEP exhibited ABTS(·+), DPPH·, and O2(·−) scavenging activity (IC50 = 10.5 ± 1.2, 28.84 ± 0.54, and 38.12 ± 2.6 μg/mL, respectively). EEP did not show cytotoxic effects, and it showed antioxidant activity in cells in a concentration-dependent manner. EEP inhibited α-amylase, α-glucosidase, and xanthine oxidase activities in vitro (IC50 = 122.8 ± 6.3, 2.9 ± 0.1, and 23.5 ± 2.6, respectively); however, EEP did not inhibit lipase activity. The findings support that the extract of E. punicifolia leaves is a natural antioxidant and an inhibitor of enzymes such as α-amylase, α-glucosidase, and xanthine oxidase, which can reduce the carbohydrate absorption rate and decrease risk factors for cardiovascular disease, thereby providing a novel dietary opportunity for the prevention of metabolic syndrome. PMID:24078187

  14. Extracting a kinetic relation from the dynamics of a bistable chain

    NASA Astrophysics Data System (ADS)

    Zhao, Qingze; Purohit, Prashant K.

    2014-06-01

    We integrate Newton's second law for a chain of masses and bistable springs with a spinodal region with the goal of extracting a kinetic relation for propagating phase boundaries. Our numerical experiments correspond to the impact on a bar made of phase changing material. By reading off the spring extensions ahead and behind the phase boundaries in our numerical experiments, we compute a driving force and plot it as a function of the phase boundary velocity to get a kinetic relation. We then show that this kinetic relation results in solutions to Riemann problems in continuum bars that agree with the corresponding numerical experiments on the discrete mass-spring chain. We also integrate Langevin's equations of motion for the same chain of masses and springs to account for the presence of a heat bath at a fixed temperature. We find that the xt-plane looks similar to the purely mechanical numerical experiments at low temperatures but at high temperatures there is an increased incidence of random nucleation events. Using results from both impact and Riemann problems, we show that the kinetic relation is a function of the bath temperature.

  15. Pathogenesis-related protein expression in the apoplast of wheat leaves protected against leaf rust following application of plant extracts.

    PubMed

    Naz, Rabia; Bano, Asghari; Wilson, Neil L; Guest, David; Roberts, Thomas H

    2014-09-01

    Leaf rust (Puccinia triticina) is a major disease of wheat. We tested aqueous leaf extracts of Jacaranda mimosifolia (Bignoniaceae), Thevetia peruviana (Apocynaceae), and Calotropis procera (Apocynaceae) for their ability to protect wheat from leaf rust. Extracts from all three species inhibited P. triticina urediniospore germination in vitro. Plants sprayed with extracts before inoculation developed significantly lower levels of disease incidence (number of plants infected) than unsprayed, inoculated controls. Sprays combining 0.6% leaf extracts and 2 mM salicylic acid with the fungicide Amistar Xtra at 0.05% (azoxystrobin at 10 μg/liter + cyproconazole at 4 μg/liter) reduced disease incidence significantly more effectively than sprays of fungicide at 0.1% alone. Extracts of J. mimosifolia were most active, either alone (1.2%) or in lower doses (0.6%) in combination with 0.05% Amistar Xtra. Leaf extracts combined with fungicide strongly stimulated defense-related gene expression and the subsequent accumulation of pathogenesis-related (PR) proteins in the apoplast of inoculated wheat leaves. The level of protection afforded was significantly correlated with the ability of extracts to increase PR protein expression. We conclude that pretreatment of wheat leaves with spray formulations containing previously untested plant leaf extracts enhances protection against leaf rust provided by fungicide sprays, offering an alternative disease management strategy.

  16. Procyanidins extracted from the lotus seedpod ameliorate age-related antioxidant deficit in aged rats.

    PubMed

    Xu, Jiqu; Rong, Shuang; Xie, Bijun; Sun, Zhida; Zhang, Li; Wu, Hailei; Yao, Ping; Hao, Liping; Liu, Liegang

    2010-03-01

    The alleviative effect of procyanidins extracted from the lotus seedpod (LSPC) on oxidative stress in various tissues was evaluated by determining the activities of the antioxidant enzymes and the content of reduced glutathione (GSH) in heart, liver, lung, kidney, skeletal muscle, and serum in aged rats. Aging led to antioxidant deficit in various tissues in this study, which is confirmed by remarkable increased lipid peroxidation, whereas the change patterns of superoxide dismutase (SOD), catalase (CAT), glutathione peroxidase (GPx), and GSH were diverse in various tissues of aged rats. LSPC treatment (50 and 100 mg/kg body weight) modified the activity of SOD, CAT, and GPx as well as GSH content alteration in these tissues, which reversed the age-related antioxidant deficit in aged rats. However, the regulatory patterns on the activities of these enzymes and GSH content by LSPC treatment were different according to the tissues in aged rats.

  17. Determination of musk fragrances in sewage sludge by pressurized liquid extraction coupled to automated ionic liquid-based headspace single-drop microextraction followed by GC-MS/MS.

    PubMed

    Vallecillos, Laura; Borrull, Francesc; Pocurull, Eva

    2012-10-01

    A method for the quantitative determination in sewage sludge of ten musk fragrances extensively used in personal care products was developed using pressurized liquid extraction (PLE) followed by automated ionic liquid-based headspace single-drop microextraction and gas chromatography-tandem mass spectrometry. The influence of the main factors on the efficiency of PLE was studied. For all musks, the highest recovery values were achieved using 1 g of pretreated sewage sludge, H(2)O/methanol (1:1) as the extraction solvent, a temperature of 80°C, a pressure of 1500 psi, an extraction time of 5 min, 2 cycles, a 100% flush volume, a purge time of 120 s, and 1 g of Florisil as in-cell clean-up sorbent. The use and optimization of an in-cell clean-up sorbent was necessary to remove fatty interferents from the PLE extract that would otherwise hamper the subsequent ionic liquid-based headspace single-drop microextraction. LODs and LOQs ranged from 0.5-1.5 to 2.5-5 ng/g, respectively. Good intra- and interday repeatability was obtained when analyzing sewage sludge samples spiked at 10 ng/g (n = 3, RSDs < 10%). The method's applicability was tested with sewage sludge from different wastewater treatment plants. The analysis revealed the presence of all the polycyclic musks studied at concentrations above the LOQs, ranging from 6 to 530 ng/g; the nitro musk concentrations, however, were below the LOQs or, in the case of musk xylene, the compound was not detected at all.

  18. Locoregional Control of Non-Small Cell Lung Cancer in Relation to Automated Early Assessment of Tumor Regression on Cone Beam Computed Tomography

    SciTech Connect

    Brink, Carsten; Bernchou, Uffe; Bertelsen, Anders; Hansen, Olfred; Schytte, Tine; Bentzen, Soren M.

    2014-07-15

    Purpose: Large interindividual variations in volume regression of non-small cell lung cancer (NSCLC) are observable on standard cone beam computed tomography (CBCT) during fractionated radiation therapy. Here, a method for automated assessment of tumor volume regression is presented and its potential use in response-adapted personalized radiation therapy is evaluated empirically. Methods and Materials: Automated deformable registration with calculation of the Jacobian determinant was applied to serial CBCT scans in a series of 99 patients with NSCLC. Tumor volume at the end of treatment was estimated on the basis of the first one third and two thirds of the scans. The concordance between estimated and actual relative volume at the end of radiation therapy was quantified by Pearson's correlation coefficient. On the basis of the estimated relative volume, the patients were stratified into 2 groups having volume regressions below or above the population median value. Kaplan-Meier plots of locoregional disease-free rate and overall survival in the 2 groups were used to evaluate the predictive value of tumor regression during treatment. A Cox proportional hazards model was used to adjust for other clinical characteristics. Results: Automatic measurement of tumor regression from standard CBCT images was feasible. Pearson's correlation coefficient between manual and automatic measurement was 0.86 in a sample of 9 patients. Most patients experienced tumor volume regression, and this could be quantified early in the treatment course. Interestingly, patients with pronounced volume regression had worse locoregional tumor control and overall survival; this effect was significant in patients with non-adenocarcinoma histology. Conclusions: Evaluation of routinely acquired CBCT images during radiation therapy provides biological information on the specific tumor. This could potentially form the basis for personalized response-adaptive therapy.
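    The volume estimate behind the method rests on a standard identity: averaging the Jacobian determinant of the deformation x -> x + u(x) over the baseline tumor region gives the relative volume after deformation. A minimal 2-D numpy sketch with a synthetic displacement field (the study's registration is 3-D, but the formula extends directly):

```python
import numpy as np

def relative_volume(disp, mask):
    """Mean Jacobian determinant of x -> x + u(x) over a region mask.
    disp has shape (2, H, W): x- and y-displacements on a pixel grid."""
    dux_dy, dux_dx = np.gradient(disp[0])   # derivatives of u_x
    duy_dy, duy_dx = np.gradient(disp[1])   # derivatives of u_y
    jac = (1 + dux_dx) * (1 + duy_dy) - dux_dy * duy_dx
    return jac[mask].mean()

# Illustration: a uniform 10% linear contraction shrinks area to 0.9^2.
H = W = 32
yy, xx = np.mgrid[0:H, 0:W].astype(float)
disp = np.stack([-0.1 * xx, -0.1 * yy])     # u(x) = -0.1 * x
mask = np.ones((H, W), dtype=bool)          # "tumor" covers the whole grid
print(relative_volume(disp, mask))          # ~0.81, i.e. 19% regression
```

    In the study, values like this computed from the early fraction of CBCT scans are what allow end-of-treatment volume to be predicted ahead of time.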

  19. Automated headspace-solid-phase micro extraction-retention time locked-isotope dilution gas chromatography-mass spectrometry for the analysis of organotin compounds in water and sediment samples.

    PubMed

    Devosa, Christophe; Vliegen, Maarten; Willaert, Bart; David, Frank; Moens, Luc; Sandra, Pat

    2005-06-24

    An automated method for the simultaneous determination in water and sediment samples of six important organotin compounds, namely monobutyltin (MBT), dibutyltin (DBT), tributyltin (TBT), monophenyltin (MPhT), diphenyltin (DPhT) and triphenyltin (TPhT), is described. The method is based on derivatization with sodium tetraethylborate followed by automated headspace solid-phase microextraction (SPME) combined with GC-MS under retention time locked (RTL) conditions. Home-synthesized deuterated organotin analogues were used as internal standards. Two highly abundant fragment ions corresponding to the main tin isotopes Sn-118 and Sn-120 were chosen: one for quantification and one as a qualifier ion. The method was validated and excellent figures of merit were obtained. Limits of quantification (LOQs) are 1.3 to 15 ng l(-1) (ppt) for water samples and 1.0 to 6.3 microg kg(-1) (ppb) for sediment samples. Accuracy for sediment samples was tested on spiked real-life sediment samples and on a PACS-2 marine harbor sediment reference material. The developed method was used in a case study at the harbor of Antwerp, where sediment samples from different areas were taken and subsequently screened for TBT contamination. Concentrations ranged from 15 microg kg(-1) in the port of Antwerp up to 43 mg kg(-1) near a ship repair unit. PMID:16038329

  20. Automated SPME-GC-MS monitoring of headspace metabolomic responses of E. coli to biologically active components extracted by the coating.

    PubMed

    Hossain, S M Zakir; Bojko, Barbara; Pawliszyn, Janusz

    2013-05-01

    Monitoring the extracellular metabolites of bacteria is very useful not only for metabolomics research but also for assessing the effects of various chemicals, including antimicrobial agents and drugs. Herein, we describe an automated headspace solid-phase microextraction (HS-SPME) method coupled with gas chromatography-mass spectrometry (GC-MS) for the qualitative as well as semi-quantitative determination of the metabolic responses of Escherichia coli to an antimicrobial agent, cinnamaldehyde. The minimum inhibitory concentration of cinnamaldehyde was calculated to be 2 g L(-1). We found that cinnamaldehyde was an important factor influencing the metabolic profile and growth process. A higher number of metabolites was observed during the mid-logarithmic growth phase. The metabolite variations (types and concentrations) induced by cinnamaldehyde depended on both cell density and the dose of cinnamaldehyde. Twenty-five different metabolites (e.g., indoles, alkanes, alcohols, organic acids, and esters) were simultaneously separated and detected in the headspace of these complex biological samples following intermittent addition of a high dose of cinnamaldehyde. The study was performed on an automated system, thereby minimizing manual workup and indicating the potential of the method for high-throughput analysis. These findings enhance understanding of the metabolic response of E. coli to cinnamaldehyde shock and demonstrate the effectiveness of the SPME-GC-MS-based metabolomics approach for studying such a complex biological system. PMID:23601279

  1. Pressure-driven mesofluidic platform integrating automated on-chip renewable micro-solid-phase extraction for ultrasensitive determination of waterborne inorganic mercury.

    PubMed

    Portugal, Lindomar A; Laglera, Luis M; Anthemidis, Aristidis N; Ferreira, Sérgio L C; Miró, Manuel

    2013-06-15

    A dedicated pressure-driven mesofluidic platform incorporating on-chip sample clean-up and analyte preconcentration is herein reported for the expedient determination of trace-level concentrations of waterborne inorganic mercury. Capitalizing upon the Lab-on-a-Valve (LOV) concept, the mesofluidic device integrates on-chip micro-solid-phase extraction (μSPE) in automatic disposable mode followed by chemical vapor generation and gas-liquid separation prior to in-line atomic fluorescence spectrometric detection. In contrast to the prevailing chelating sorbents for Hg(II), bare poly(divinylbenzene-N-vinylpyrrolidone) copolymer sorptive beads were resorted to for the efficient uptake of Hg(II) in hydrochloric acid milieu (pH = 2.3), without the need for metal derivatization or for pH adjustment of previously acidified water samples to near-neutral conditions. Experimental variables influencing the sorptive uptake and retrieval of the target species and the evolvement of elemental mercury within the miniaturized integrated reaction chamber/gas-liquid separator were investigated in detail. Using merely <10 mg of sorbent, the limits of detection and quantification at the 3s(blank) and 10s(blank) levels, respectively, for a sample volume of 3 mL were 12 and 42 ng L(-1) Hg(II), with a dynamic range extending up to 5.0 μg L(-1). The proposed mesofluidic platform copes with the requirements of regulatory bodies (US-EPA, WHO, EU Commission) for drinking water quality and surface waters, which endorse maximum allowed mercury concentrations spanning 0.07 to 6.0 μg L(-1). As demonstrated with the analysis of aqueous samples of varying matrix complexity, the LOV approach afforded reliable results, with relative recoveries of 86-107% and intermediate precision down to 9% in the renewable μSPE format.

  2. Extract the Relational Information of Static Features and Motion Features for Human Activities Recognition in Videos

    PubMed Central

    2016-01-01

    Both static features and motion features have shown promising performance in the human activity recognition task. However, the information contained in these features is insufficient for complex human activities. In this paper, we propose extracting the relational information of static features and motion features for human activity recognition. The videos are represented by the classical Bag-of-Words (BoW) model, which has proved useful in many works. To get a compact and discriminative codebook of small dimension, we employ a divisive algorithm based on KL-divergence to reconstruct the codebook. After that, to further capture strong relational information, we construct a bipartite graph to model the relationship between words of the different feature sets. We then use a k-way partition to create a new codebook in which similar words are grouped together. With this new codebook, videos can be represented by a new BoW vector with strong relational information. Moreover, we propose a method to compute new clusters from the divisive algorithm's projective function. We test our approach on several datasets and obtain very promising results. PMID:27656199
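
    The BoW representation that this abstract builds on can be sketched independently of the authors' codebook construction. A minimal illustration using Euclidean nearest-codeword assignment and a random toy codebook (the paper instead derives its codebook via KL-divergence-based divisive clustering and bipartite-graph partitioning, which is not reproduced here):

```python
import numpy as np

def bow_histogram(descriptors, codebook):
    """Quantize local feature descriptors against a codebook and
    return a normalized Bag-of-Words histogram."""
    # distance of every descriptor to every codeword
    d = np.linalg.norm(descriptors[:, None, :] - codebook[None, :, :], axis=2)
    words = d.argmin(axis=1)  # nearest-codeword assignment
    hist = np.bincount(words, minlength=len(codebook)).astype(float)
    return hist / hist.sum()

rng = np.random.default_rng(0)
descs = rng.normal(size=(200, 16))    # e.g. static or motion descriptors
codebook = rng.normal(size=(32, 16))  # toy codebook; real ones come from clustering
h = bow_histogram(descs, codebook)    # one normalized BoW vector per video
```

    A video is then a single such histogram, and relational variants of the method rearrange which codewords the histogram bins correspond to.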

  4. Comparative analyses of universal extraction buffers for assay of stress related biochemical and physiological parameters.

    PubMed

    Han, Chunyu; Chan, Zhulong; Yang, Fan

    2015-01-01

    The extraction efficiencies of three solutions, the universal sodium phosphate buffer (USPB), the universal Tris-HCl buffer (UTHB), and assay-specific buffers, were compared for assays of soluble protein, free proline, superoxide radical (O2∙-), hydrogen peroxide (H2O2), and antioxidant enzymes such as superoxide dismutase (SOD), catalase (CAT), guaiacol peroxidase (POD), ascorbate peroxidase (APX), glutathione peroxidase (GPX), and glutathione reductase (GR) in Populus deltoides. Significant differences in protein extraction were detected via sodium dodecyl sulfate polyacrylamide gel electrophoresis (SDS-PAGE) and two-dimensional electrophoresis (2-DE). Between the two universal extraction buffers, the USPB showed higher efficiency for extraction of soluble protein, CAT, GR, O2∙-, GPX, SOD, and free proline, while the UTHB had higher efficiency for extraction of APX, POD, and H2O2. When compared with the specific buffers, the USPB showed higher extraction efficiency for measurement of soluble protein, CAT, GR, and O2∙-, parallel extraction efficiency for GPX, SOD, free proline, and H2O2, and lower extraction efficiency for APX and POD, whereas the UTHB had higher extraction efficiency for measurement of POD and H2O2. Further comparisons showed that the 100 mM USPB buffer had the highest extraction efficiencies. These results indicate that USPB is suitable and efficient for extraction of soluble protein, CAT, GR, GPX, SOD, H2O2, O2∙-, and free proline.

  5. How to extract clinically useful information from large amount of dialysis related stored data.

    PubMed

    Vito, Domenico; Casagrande, Giustina; Bianchi, Camilla; Costantino, Maria L

    2015-01-01

    The storage infrastructure arising from technological evolution, in the healthcare field as elsewhere, has led to ever larger quantities of data related to patients and their pathological evolution being stored in public or private repositories. Big data techniques are spreading in medical research as well. These techniques make it possible to extract information from complex heterogeneous sources and to carry out longitudinal studies that correlate patient status with biometric parameters. In our work we developed a common data infrastructure involving four clinical dialysis centers in Lombardy and Switzerland. The common platform was built to store a large amount of clinical data related to 716 dialysis sessions of 70 patients. The platform combines a MySQL(®) database (Dialysis Database) with a MATLAB-based mining library (Dialysis MATlib). A statistical analysis was performed on the data gathered. These analyses led to the development of two clinical indexes, an example of transforming big data into clinical information. PMID:26737858

  6. Towards a wave-extraction method for numerical relativity. II. The quasi-Kinnersley frame

    SciTech Connect

    Nerozzi, Andrea; Beetle, Christopher; Bruni, Marco; Burko, Lior M.; Pollney, Denis

    2005-07-15

    The Newman-Penrose formalism may be used in numerical relativity to extract coordinate-invariant information about gravitational radiation emitted in strong-field dynamical scenarios. The main challenge in doing so is to identify a null tetrad appropriately adapted to the simulated geometry such that Newman-Penrose quantities computed relative to it have an invariant physical meaning. In black hole perturbation theory, the Teukolsky formalism uses such adapted tetrads, those which differ only perturbatively from the background Kinnersley tetrad. At late times, numerical simulations of astrophysical processes producing isolated black holes ought to admit descriptions in the Teukolsky formalism. However, adapted tetrads in this context must be identified using only the numerically computed metric, since no background Kerr geometry is known a priori. To do this, this paper introduces the notion of a quasi-Kinnersley frame. This frame, when space-time is perturbatively close to Kerr, approximates the background Kinnersley frame. However, it remains calculable much more generally, in space-times nonperturbatively different from Kerr. We give an explicit solution for the tetrad transformation which is required in order to find this frame in a general space-time.

  7. RELATIVE POTENCY OF FUNGAL EXTRACTS IN INDUCING ALLERGIC ASTHMA-LIKE RESPONSES IN BALB/C MICE

    EPA Science Inventory

    Indoor mold has been associated with the development of allergic asthma. However, relative potency of molds in the induction of allergic asthma is not clear. In this study, we tested the relative potency of fungal extracts (Metarizium anisophilae [MACA], Stachybotrys ...

  8. Automated lithocell

    NASA Astrophysics Data System (ADS)

    Englisch, Andreas; Deuter, Armin

    1990-06-01

    Integration and automation have gained more and more ground in modern IC manufacturing. It is difficult to calculate directly the profit these investments yield. On the other hand, the demands on man, machine and technology have increased enormously of late; it is not difficult to see that only by means of integration and automation can these demands be met. Some salient points: the complexity and costs incurred by the equipment and processes have become significantly higher; owing to the reduction of all dimensions, the tolerances within which the various process steps have to be carried out have become smaller and smaller, and adherence to these tolerances more and more difficult; and the cycle time has become more and more important, both for the development and control of new processes and, to a great extent, for a rapid and reliable supply to the customer. In order for the products to be competitive under these conditions, all sorts of costs have to be reduced and the yield has to be maximized. Therefore, computer-aided control of the equipment and the process, combined with automatic data collection and real-time SPC (statistical process control), has become absolutely necessary for successful IC manufacturing. Human errors must be eliminated from the execution of the various process steps by automation. The work time set free in this way makes it possible for human creativity to be employed on a larger scale in stabilizing the processes. Besides, computer-aided equipment control can ensure optimal utilization of the equipment round the clock.

  9. Extraction of solubles from plant biomass for use as microbial growth stimulant and methods related thereto

    DOEpatents

    Lau, Ming Woei

    2015-12-08

    A method for producing a microbial growth stimulant (MGS) from plant biomass is described. In one embodiment, an ammonium hydroxide solution is used to extract a solution of proteins and ammonia from the biomass. Some of the proteins and ammonia are separated from the extracted solution to provide the MGS solution. The removed ammonia can be recycled, and the proteins are useful as animal feed. In another embodiment, the method comprises extracting solubles from pretreated lignocellulosic biomass with a cellulase enzyme-producing growth medium (such as T. reesei) in the presence of water and an aqueous extract.

  10. Ginseng Purified Dry Extract, BST204, Improved Cancer Chemotherapy-Related Fatigue and Toxicity in Mice

    PubMed Central

    Park, Hyun-Jung; Shim, Hyun Soo; Kim, Jeom Yong; Kim, Joo Young; Park, Sun Kyu

    2015-01-01

    Cancer-related fatigue (CRF) is one of the most common side effects of cancer and its treatments. A large proportion of cancer patients experience cancer-related physical and central fatigue, so new strategies are needed for treatment and improved survival of these patients. BST204 was prepared by incubating crude ginseng extract with ginsenoside-β-glucosidase. The purpose of the present study was to examine the effects of BST204, a mixture of ginsenosides, on 5-fluorouracil (5-FU)-induced CRF, glycogen synthesis, and biochemical parameters in mice. The mice were randomly divided into the following groups: naïve normal (normal), HT-29 cell inoculated (xenograft), xenograft and 5-FU treated (control), xenograft + 5-FU + BST204 treated (100 and 200 mg/kg) (BST204), and xenograft + 5-FU + modafinil (13 mg/kg) treated (modafinil). Running wheel activity and the forced swimming test were used to evaluate CRF. Muscle glycogen, serum inflammatory cytokines, aspartate aminotransferase (AST), alanine aminotransferase (ALT), creatinine (CRE), white blood cells (WBC), neutrophils (NEUT), red blood cells (RBC), and hemoglobin (HGB) were measured. Treatment with BST204 significantly increased running wheel activity and forced swimming time compared to the control group. Consistent with the behavioral data, BST204 markedly increased muscle glycogen and the concentrations of WBC, NEUT, RBC, and HGB. Tumor necrosis factor-α (TNF-α), interleukin-6 (IL-6), AST, ALT, and CRE levels in the serum were also significantly reduced in the BST204-treated group compared to the control group. These results suggest that BST204 may improve chemotherapy-related fatigue and adverse toxic side effects. PMID:25945105

  11. BioCreative V CDR task corpus: a resource for chemical disease relation extraction

    PubMed Central

    Li, Jiao; Sun, Yueping; Johnson, Robin J.; Sciaky, Daniela; Wei, Chih-Hsuan; Leaman, Robert; Davis, Allan Peter; Mattingly, Carolyn J.; Wiegers, Thomas C.; Lu, Zhiyong

    2016-01-01

    Community-run, formal evaluations and manually annotated text corpora are critically important for advancing biomedical text-mining research. Recently in BioCreative V, a new challenge was organized for the tasks of disease named entity recognition (DNER) and chemical-induced disease (CID) relation extraction. Given the nature of both tasks, a test collection is required to contain both disease/chemical annotations and relation annotations in the same set of articles. Despite previous efforts in biomedical corpus construction, none was found to be sufficient for the task. Thus, we developed our own corpus, called BC5CDR, during the challenge by inviting a team of Medical Subject Headings (MeSH) indexers for disease/chemical entity annotation and Comparative Toxicogenomics Database (CTD) curators for CID relation annotation. To ensure high annotation quality and productivity, detailed annotation guidelines and automatic annotation tools were provided. The resulting BC5CDR corpus consists of 1500 PubMed articles with 4409 annotated chemicals, 5818 diseases and 3116 chemical-disease interactions. Each entity annotation includes both the mention text spans and normalized concept identifiers, using MeSH as the controlled vocabulary. To ensure accuracy, the entities were first captured independently by two annotators, followed by a consensus annotation; the average inter-annotator agreement (IAA) scores were 87.49% and 96.05% for diseases and chemicals, respectively, in the test set according to the Jaccard similarity coefficient. Our corpus was successfully used for the BioCreative V challenge tasks and should serve as a valuable resource for the text-mining research community. Database URL: http://www.biocreative.org/tasks/biocreative-v/track-3-cdr/ PMID:27161011
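
    The Jaccard-based inter-annotator agreement reported above can be computed in a few lines. A minimal sketch with hypothetical annotations represented as (start, end, MeSH ID) tuples; the spans and identifiers below are invented for illustration:

```python
def jaccard_iaa(ann_a, ann_b):
    """Inter-annotator agreement as the Jaccard similarity between
    two annotators' sets of (span, concept-ID) annotations."""
    a, b = set(ann_a), set(ann_b)
    return len(a & b) / len(a | b) if a | b else 1.0

# Hypothetical disease annotations as (start, end, MeSH ID) tuples
a1 = {(10, 18, "D003924"), (42, 50, "D006973"), (60, 72, "D009369")}
a2 = {(10, 18, "D003924"), (42, 50, "D006973"), (80, 90, "D001943")}
iaa = jaccard_iaa(a1, a2)  # 2 shared of 4 distinct annotations -> 0.5
```

    Counting an annotation as matched only when both the span and the normalized concept identifier agree is one common convention; looser span-overlap matching is also used in practice.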

  12. Automation in haemostasis.

    PubMed

    Huber, A R; Méndez, A; Brunner-Agten, S

    2013-01-01

    Automatia, the ancient Greek goddess of luck who makes things happen by themselves, of her own will and without human engagement, is present in our daily life in the medical laboratory. Automation was introduced and perfected by clinical chemistry and has since expanded into other fields such as haematology, immunology, molecular biology and coagulation testing. The initial small and relatively simple standalone instruments have been replaced by more complex systems that allow multitasking. Integration of automated coagulation testing into total laboratory automation has become possible in recent years. Automation has many strengths and opportunities, provided its weaknesses and threats are respected. On the positive side, standardization, reduction of errors, reduction of cost and increase of throughput are clearly beneficial. Dependence on manufacturers, high initial costs and somewhat expensive maintenance are less favourable factors. Today's laboratory technicians and academic personnel do not add value for the doctor and his patients by spending much of their time behind the machines. In the future the laboratory needs to contribute at the bedside, suggesting laboratory testing and providing support and interpretation of the obtained results. The human factor will continue to play an important role in haemostasis testing, yet under different circumstances.

  13. A novel dual-valve sequential injection manifold (DV-SIA) for automated liquid-liquid extraction. Application for the determination of picric acid.

    PubMed

    Skrlíková, Jana; Andruch, Vasil; Sklenárová, Hana; Chocholous, Petr; Solich, Petr; Balogh, Ioseph S

    2010-05-01

    A novel dual-valve sequential injection system (DV-SIA) for online liquid-liquid extraction which resolves the main problems of LLE utilization in SIA has been designed. The main idea behind this new design was to construct an SIA system by connecting two independent units, one for aqueous-organic mixture flow and the second specifically for organic phase flow. As a result, the DV-SIA manifold consists of an Extraction unit and a Detection unit. Processing the aqueous-organic mixture in the Extraction unit and the separated organic phase in the Detection unit solves the problems associated with alternating between phases that have different affinities for the walls of the Teflon tubing used in the SI system. The developed manifold is a simple, user-friendly and universal system built entirely from commercially available components. The system can be used for a variety of samples and organic solvents and is simple enough to be easily handled by operators less familiar with flow systems. The efficiency of the DV-SIA system is demonstrated by the extraction of picric acid in the form of an ion associate with the 2-[2-(4-methoxy-phenylamino)-vinyl]-1,3,3-trimethyl-3H-indolium reagent, with subsequent spectrophotometric detection. The suggested DV-SIA concept can be expected to stimulate new experiments in analytical laboratories and can be applied to the elaboration of procedures for the determination of other compounds extractable by organic solvents. It could thus form a basis for the design of simple, single-purpose commercial instruments used in LLE procedures.

  14. Semi-automated building extraction from airborne laser scanning data. (Polish Title: Półautomatyczne modelowanie brył budynków na podstawie danych z lotniczego skaningu laserowego)

    NASA Astrophysics Data System (ADS)

    Marjasiewicz, M.; Malej, T.

    2014-12-01

    The main idea of this project is to introduce a semi-automated method for extracting building models from Airborne Laser Scanning (ALS) data. The presented method is based on the RANSAC algorithm, which automatically collects the planes used to create roof models. In the case of Airborne Laser Scanning, the algorithm can process point clouds affected by noise and erroneous measurements (gross errors). The RANSAC algorithm iteratively processes a set of points in order to estimate a geometric model. Research on applying the algorithm to ALS data was performed in the available CloudCompare and SketchUp software. An important aspect of this research was the selection of algorithm parameters, which was made on the basis of the characteristics of the point cloud and the scanned objects. The analysis showed that the accuracy of plane extraction with the RANSAC algorithm does not exceed 20 centimeters for point clouds with a density of 4 pts/m^2. RANSAC can be successfully used for building modelling based on ALS data. Roofs created by the presented method could be used in visualizations at a much better level than Level of Detail 2 (LoD2) of the CityGML standard; if the model is textured, it can represent the LoD3 standard.
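
    The core RANSAC loop for plane extraction is compact. A minimal sketch (not the authors' CloudCompare workflow), assuming a plain NumPy point cloud and illustrative parameter values; real ALS processing would tune the inlier tolerance to the point density and noise:

```python
import numpy as np

def ransac_plane(points, n_iter=500, tol=0.2, seed=0):
    """Fit a plane to a noisy 3-D point cloud with RANSAC: repeatedly
    fit a plane through 3 random points and keep the candidate with
    the most inliers within perpendicular distance `tol`."""
    rng = np.random.default_rng(seed)
    best_inliers, best_plane = np.array([], dtype=int), None
    for _ in range(n_iter):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-12:  # degenerate (collinear) sample
            continue
        normal /= norm
        dist = np.abs((points - p0) @ normal)  # point-to-plane distances
        inliers = np.flatnonzero(dist < tol)
        if len(inliers) > len(best_inliers):
            best_inliers, best_plane = inliers, (normal, p0)
    return best_plane, best_inliers

# Toy "roof": points near z = 0 with noise, plus gross-error outliers
rng = np.random.default_rng(1)
roof = np.column_stack([rng.uniform(0, 10, 300), rng.uniform(0, 10, 300),
                        rng.normal(0, 0.05, 300)])
noise = rng.uniform(0, 10, (30, 3))
plane, inliers = ransac_plane(np.vstack([roof, noise]))
```

    For a full building model this fit is run repeatedly, removing each extracted plane's inliers before searching for the next roof facet.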

  15. Automated procedure to determine the thermodynamic stability of a material and the range of chemical potentials necessary for its formation relative to competing phases and compounds

    NASA Astrophysics Data System (ADS)

    Buckeridge, J.; Scanlon, D. O.; Walsh, A.; Catlow, C. R. A.

    2014-01-01

    We present a simple and fast algorithm to test the thermodynamic stability and determine the necessary chemical environment for the production of a multiternary material, relative to competing phases and compounds formed from the constituent elements. If the material is found to be stable, the region of stability, in terms of the constituent elemental chemical potentials, is determined from the intersection points of hypersurfaces in an (n-1)-dimensional chemical potential space, where n is the number of atomic species in the material. The input required is the free energy of formation of the material itself, and that of all competing phases. Output consists of the result of the test of stability, the intersection points in the chemical potential space and the competing phases to which they relate, and, for two- and three-dimensional spaces, a file which may be used for visualization of the stability region. We illustrate the use of the program by applying it both to a ternary system and to a quaternary system. The algorithm automates essential analysis of the thermodynamic stability of a material. This analysis is a lengthy process for ternary materials, and becomes much more complicated for materials with four or more constituent elements, which have attracted increased interest in recent years for technological applications such as energy harvesting and optoelectronics. The algorithm will therefore be of great benefit to the theoretical and computational study of such materials.
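
    For a binary compound the geometry described above reduces to intersecting half-planes along a single equilibrium line. A minimal numerical sketch with invented formation energies, using a grid scan rather than the paper's analytic intersection of hypersurfaces:

```python
import numpy as np

def stability_range(dh_target, competing):
    """Range of mu_A (eV, relative to elemental A) over which a
    compound AB2 with formation energy dh_target is stable on the
    equilibrium line mu_A + 2*mu_B = dh_target, subject to
    mu_A, mu_B <= 0 and a*mu_A + b*mu_B <= dh for every competing
    phase A_a B_b with formation energy dh."""
    mu_b = lambda mu_a: (dh_target - mu_a) / 2.0
    grid = np.linspace(dh_target, 0.0, 100001)  # mu_A candidates
    ok = mu_b(grid) <= 1e-12                    # enforce mu_B <= 0
    for a, b, dh in competing:
        ok &= a * grid + b * mu_b(grid) <= dh + 1e-12
    return (grid[ok].min(), grid[ok].max()) if ok.any() else None

# Invented energies: target AB2 (dH = -3 eV), competing AB (dH = -2 eV)
mu_range = stability_range(-3.0, [(1, 1, -2.0)])  # roughly (-3, -1)
```

    If the competing phase is too stable, no part of the line survives the constraints and the compound is predicted unstable (the function returns None).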

  16. Optimization of DNA extraction and PCR protocols for phylogenetic analysis in Schinopsis spp. and related Anacardiaceae.

    PubMed

    Mogni, Virginia Y; Kahan, Mariano A; de Queiroz, Luciano Paganucci; Vesprini, José L; Ortiz, Juan Pablo A; Prado, Darién E

    2016-01-01

    The Anacardiaceae is an important, worldwide-distributed family of ecological and socio-economic relevance. Notwithstanding that, molecular studies in this family are scarce and problematic because of the particularly high concentrations of secondary metabolites (i.e. tannins and oleoresins) present in almost all tissues of many members of the group, which complicate the purification and amplification of DNA. The objective of this work was to improve an available DNA isolation method for Schinopsis spp. and other related Anacardiaceae, as well as the PCR protocols for amplification of the chloroplast trnL-F, rps16 and ndhF and the nuclear ITS-ETS fragments. The proposed modifications allowed the extraction of 70-120 µg of non-degraded genomic DNA per gram of dry tissue, which proved useful for PCR amplification. PCR reactions produced the expected fragments, which could be directly sequenced. Sequence analyses of the amplicons showed similarity with the corresponding Schinopsis accessions available in GenBank. The methodology presented here can be routinely applied in molecular studies of the group, aimed at clarifying not only aspects of the molecular biology but also the taxonomy and phylogeny of this fascinating group of vascular plants. PMID:27217992

  18. Multiresidue trace analysis of pharmaceuticals, their human metabolites and transformation products by fully automated on-line solid-phase extraction-liquid chromatography-tandem mass spectrometry.

    PubMed

    García-Galán, María Jesús; Petrovic, Mira; Rodríguez-Mozaz, Sara; Barceló, Damià

    2016-09-01

    A novel, fully automated analytical methodology based on dual-column liquid chromatography coupled to tandem mass spectrometry (LC-LC-MS(2)) has been developed and validated for the analysis of 12 pharmaceuticals and 20 metabolites and transformation products (TPs) in different types of water (influent and effluent wastewaters and surface water). Two LC columns were used, one for pre-concentration of the sample and the second for separation and analysis, so that water samples were injected directly into the chromatographic system. Besides the many advantages of the methodology, such as minimal sample volume and handling, compounds ionized in both positive and negative mode could be analyzed simultaneously without compromising sensitivity. A comparative study of different mobile phases, gradients and LC pre-concentration columns was carried out to obtain the best analytical performance. Method limits of detection (MLODs) were in the low ng L(-1) range for all compounds. The method was successfully applied to study the presence of the target analytes in wastewater and surface water samples collected near the city of Girona (Catalonia, Spain). Data on the environmental presence and fate of pharmaceutical metabolites and TPs are still scarce, highlighting the relevance of the developed methodology. PMID:27343613

  19. Follicular unit extraction hair transplant automation: options in overcoming challenges of the latest technology in hair restoration with the goal of avoiding the line scar.

    PubMed

    Rashid, Rashid M; Morgan Bicknell, Lindsay T

    2012-09-01

    Follicular unit extraction (FUE) provides many advantages over the strip surgical method of harvesting hair grafts for hair restoration. However, FUE also has its shortcomings: it is a more time-intensive approach, which increases costs, and it is technically a more challenging hair transplantation technique. In this manuscript, we share the approaches used at our center to minimize and/or improve on some of the challenges of FUE. PMID:23031379

  20. AutoLink: Automated sequential resonance assignment of biopolymers from NMR data by relative-hypothesis-prioritization-based simulated logic

    NASA Astrophysics Data System (ADS)

    Masse, James E.; Keller, Rochus

    2005-05-01

    We have developed a new computer algorithm for determining the backbone resonance assignments for biopolymers. The approach we have taken, relative hypothesis prioritization, is implemented as a Lua program interfaced to the recently developed computer-aided resonance assignment (CARA) program. Our program can work with virtually any spectrum type, and is especially good with NOESY data. The results of the program are displayed in an easy-to-read, color-coded, graphic representation, allowing users to assess the quality of the results in minutes. Here we report the application of the program to two RNA recognition motifs of Apobec-1 Complementation Factor. The assignment of these domains demonstrates AutoLink's ability to deliver accurate resonance assignments from very minimal data and with minimal user intervention.

  1. Dynamic ultrasound-assisted extraction of oleuropein and related biophenols from olive leaves.

    PubMed

    Japón-Luján, R; Luque-Rodríguez, J M; Luque de Castro, M D

    2006-03-01

    A continuous approach for the ultrasound-assisted extraction of olive biophenols (OBPs) from olive leaves is proposed. Multivariate methodology was used to carry out a detailed optimisation of the extraction. Under the optimal working conditions, complete extraction of the target analytes (namely oleuropein, verbascoside, apigenin-7-glucoside and luteolin-7-glucoside, with LODs of 11.04, 2.68, 1.49 and 3.91 mg/kg, respectively) was achieved in 25 min. The extract was injected into a chromatograph-photodiode array detector assembly (HPLC-DAD) for individual separation and quantification. No clean-up or preconcentration steps were required. Gas chromatography-mass spectrometry (without derivatization of the analytes) was used to identify OBPs at concentrations below the LODs obtained by HPLC-DAD. The efficacy of ethanol-water mixtures in extracting OBPs from olive leaves has been demonstrated and compared with that of a conventional method requiring 24 h for complete extraction; these mixtures can thus replace the toxic extractants used to date. PMID:16442552

  2. A semi-automated system for quantifying the oxidative potential of ambient particles in aqueous extracts using the dithiothreitol (DTT) assay: results from the Southeastern Center for Air Pollution and Epidemiology (SCAPE)

    NASA Astrophysics Data System (ADS)

    Fang, T.; Verma, V.; Guo, H.; King, L. E.; Edgerton, E. S.; Weber, R. J.

    2014-07-01

    A variety of methods are used to measure the capability of particulate matter (PM) to catalytically generate reactive oxygen species (ROS) in vivo, also defined as the aerosol oxidative potential. A widely used measure of aerosol oxidative potential is the dithiothreitol (DTT) assay, which monitors the depletion of DTT (a surrogate for cellular antioxidants) as catalyzed by the redox-active species in PM. However, a major constraint on routine use of the DTT assay and its integration with large-scale health studies is its labor-intensive and time-consuming protocol. To specifically address this concern, we have developed a semi-automated system for quantifying the oxidative potential of aerosol liquid extracts using the DTT assay. The system, capable of unattended analysis at one sample per hour, has high analytical precision (coefficient of variation of 12% for standards, 4% for ambient samples) and a reasonably low limit of detection (0.31 nmol min-1). Comparison of the automated approach with the manual method on ambient samples yielded good agreement (slope = 1.08 ± 0.12, r2 = 0.92, N = 9). The system was utilized in the Southeastern Center for Air Pollution and Epidemiology (SCAPE) to generate an extensive data set on the DTT activity of ambient particles collected from contrasting environments (urban, road-side, and rural) in the southeastern US. We find that water-soluble PM2.5 DTT activity on a per-air-volume basis was spatially uniform and often well correlated with PM2.5 mass (r = 0.49 to 0.88), suggesting regional sources contributing to the PM oxidative potential in the southeastern US. However, the greater heterogeneity in the intrinsic DTT activity (per PM mass basis) across seasons indicates variability in the DTT activity associated with aerosols from sources that vary with season. Although developed for the DTT assay, the instrument can also be used to determine oxidative potential with other acellular assays.
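
    The DTT activity such a system reports (nmol min-1) is the rate of DTT depletion, i.e. the negative slope of remaining DTT versus time. A minimal sketch with invented measurements (the time points and DTT amounts below are illustrative, not SCAPE data):

```python
import numpy as np

def dtt_rate(times_min, dtt_nmol):
    """DTT consumption rate (nmol/min) as the negative slope of a
    least-squares line through remaining-DTT vs. time measurements."""
    slope, _ = np.polyfit(times_min, dtt_nmol, 1)
    return -slope

# Hypothetical time points (min) and remaining DTT (nmol)
t = np.array([0, 10, 20, 30, 40])
dtt = np.array([100.0, 94.8, 90.1, 85.2, 79.9])
rate = dtt_rate(t, dtt)  # ~0.5 nmol/min
```

    Subtracting the blank's rate from each sample's rate, and normalizing by sampled air volume or PM mass, then gives the volume- or mass-based oxidative potential discussed in the abstract.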

  3. Fully automated determination of selective retinoic acid receptor ligands in mouse plasma and tissue by reversed-phase liquid chromatography coupled on-line with solid-phase extraction.

    PubMed

    Arafa, H M; Hamada, F M; Elmazar, M M; Nau, H

    1996-04-01

    A fully automated reversed-phase HPLC method was developed for the quantitative assay of three retinoids (Am-580, CD-2019 and CD-437) which selectively activate the retinoic acid receptors RAR alpha, RAR beta and RAR gamma, respectively. Mouse plasma, embryo and maternal tissues were prepared for injection by on-line solid-phase extraction (SPE) and valve-switching techniques. Following automatic injection, the sample was loaded on preconditioned disposable cartridges, cleaned up and then transferred onto the analytical column to be eluted in the backflush mode, separated by gradient elution and detected by UV, while a new cartridge was concomitantly conditioned. The overall recovery was quantitative, allowing for external standardization. The calibration curves were linear in all biological samples tested so far, with a correlation coefficient (r) > 0.99. The intra-day precision was < or = 7.8% (n = 5-6) and the inter-day variability was < or = 9.4% (n = 3). The lower limit of detection was 2.5 ng/ml or ng/g for CD-2019 and CD-437, and 5 ng/ml for Am-580, with an S/N ratio of 5 using a sample weight of 25 microliters or mg. The method is now in routine use in our laboratory for the assessment of the pharmacokinetic profiles of these retinoids. The small sample size required, the simple sample preparation and the rapid analysis with a high degree of automation make this method convenient for microanalysis of biological samples in both animal and human studies.

  4. Automated measurement of centering errors and relative surface distances for the optimized assembly of micro-optics

    NASA Astrophysics Data System (ADS)

    Langehanenberg, Patrik; Dumitrescu, Eugen; Heinisch, Josef; Krey, Stefan; Ruprecht, Aiko K.

    2011-03-01

For any kind of compound optical system, the precise geometric alignment of every single element according to the optical design is essential to obtain the desired imaging properties. In this contribution we present a measurement system for the determination of the complete set of geometric alignment parameters in assembled systems. The deviation of each center of curvature with respect to a reference axis is measured with an autocollimator system. These data are further processed to provide the shift and tilt of an individual lens or group of lenses with respect to a defined reference axis. Previously it was shown that such an instrument can measure the centering errors of up to 40 surfaces within a system under test with accuracies in the range of an arc second. In addition, the relative distances of the optical surfaces (center thicknesses of lens elements, and the air gaps in between) are optically determined in the same measurement system by means of low-coherence interferometry. Subsequently, the acquired results can be applied to compensate the detected geometric alignment errors before the assembly is finally bonded (e.g., glued). The presented applications mainly include measurements of miniaturized lens systems such as mobile phone optics. However, any type of objective lens, from endoscope imaging systems up to very complex objective lenses used in microlithography, can be analyzed with the presented measurement system.
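The shift-and-tilt derivation can be illustrated with a simplified single-lens model in which the lens's optical axis is the line through the two measured centers of curvature; the instrument's actual processing is not described in the abstract, and all numbers below are hypothetical:

```python
from math import atan, degrees

def lens_axis_errors(z1, d1, z2, d2, z_eval):
    """Simplified model: the lens axis is the line through the two centers of
    curvature. z1, z2 are their axial positions (mm); d1, d2 their measured
    lateral deviations (mm) from the reference axis; z_eval is the axial
    position where decenter is evaluated. Returns (tilt arcsec, decenter um)."""
    slope = (d2 - d1) / (z2 - z1)          # lateral drift per mm along the axis
    tilt_arcsec = degrees(atan(slope)) * 3600
    decenter_mm = d1 + slope * (z_eval - z1)
    return tilt_arcsec, decenter_mm * 1000

# Hypothetical autocollimator readings for one lens element
tilt, dec_um = lens_axis_errors(z1=30.0, d1=0.004, z2=-45.0, d2=0.001, z_eval=0.0)
```

With these invented deviations the tilt comes out on the order of a few arc seconds, consistent with the accuracy scale quoted above.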

  5. Rapid and automated analysis of aflatoxin M1 in milk and dairy products by online solid phase extraction coupled to ultra-high-pressure-liquid-chromatography tandem mass spectrometry.

    PubMed

    Campone, Luca; Piccinelli, Anna Lisa; Celano, Rita; Pagano, Imma; Russo, Mariateresa; Rastrelli, Luca

    2016-01-01

This study reports a fast and automated analytical procedure for the analysis of aflatoxin M1 (AFM1) in milk and dairy products. The method is based on simultaneous protein precipitation and AFM1 extraction by salt-induced liquid-liquid extraction (SI-LLE), followed by online solid-phase extraction (online SPE) coupled to ultra-high-pressure liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) for the automatic pre-concentration, clean-up, and sensitive and selective determination of AFM1. The main parameters affecting the extraction efficiency and accuracy of the analytical method were studied in detail. Under the optimal conditions, acetonitrile and NaCl were used as extraction/denaturant solvent and salting-out agent in SI-LLE, respectively. After centrifugation, the organic phase (acetonitrile) was diluted with water (1:9 v/v) and purified (1 mL) by an online C18 cartridge coupled with a UHPLC column. Finally, selected reaction monitoring (SRM) acquisition mode was applied to the detection of AFM1. Validation studies were carried out on different dairy products (whole and skimmed cow milk, yogurt, goat milk, and powdered infant formula), providing method quantification limits about 25 times lower than the AFM1 maximum levels permitted by EU regulation 1881/2006 in milk and dairy products for direct human consumption. Recoveries (86-102%) and repeatability (RSD < 3%, n = 6) meet the performance criteria required by EU regulation No. 401/2006 for the determination of the levels of mycotoxins in foodstuffs. Moreover, no matrix effects were observed in the different milk and dairy products studied. The proposed method improves the performance of AFM1 analysis in milk samples, as AFM1 determination is performed with a higher degree of accuracy than conventional methods. Other advantages are the reduced time and cost of sample preparation and analysis, enabling the high sample throughput that current concerns of food safety and public health demand.
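The performance criteria cited (recovery 86-102%, RSD < 3% for n = 6) can be checked mechanically from replicate spike recoveries. A sketch with hypothetical replicates at the 0.050 µg/kg EU maximum level for milk; only the acceptance ranges come from the abstract:

```python
from statistics import mean, stdev

def recovery_pct(measured, spiked):
    """Mean recovery (%) of replicate measurements of a spiked sample."""
    return 100 * mean(measured) / spiked

def rsd_pct(measured):
    """Relative standard deviation (%) of replicates."""
    return 100 * stdev(measured) / mean(measured)

# Hypothetical replicate results (ug/kg) for a spike at the regulatory limit
spike = 0.050
reps = [0.047, 0.049, 0.048, 0.046, 0.048, 0.047]

rec = recovery_pct(reps, spike)
rsd = rsd_pct(reps)
# Criteria quoted in the abstract: recovery 86-102 %, RSD < 3 % (n = 6)
ok = 86 <= rec <= 102 and rsd < 3
```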

  7. Automated statistical experimental design approach for rapid separation of coenzyme Q10 and identification of its biotechnological process related impurities using UHPLC and UHPLC-APCI-MS.

    PubMed

    Talluri, Murali V N Kumar; Kalariya, Pradipbhai D; Dharavath, Shireesha; Shaikh, Naeem; Garg, Prabha; Ramisetti, Nageswara Rao; Ragampeta, Srinivas

    2016-09-01

A novel ultra-high-performance liquid chromatography method development strategy was devised by applying a quality-by-design approach. The systematic approach was divided into five steps: (i) analytical target profile, (ii) critical quality attributes, (iii) risk assessment of critical parameters using design of experiments (screening and optimization phases), (iv) generation of the design space, and (v) process capability analysis (Cp) for the robustness study using Monte Carlo simulation. The complete quality-by-design-based method development was automated and expedited by employing a sub-2 μm particle column with an ultra-high-performance liquid chromatography system. Successful chromatographic separation of coenzyme Q10 from its biotechnological process-related impurities was achieved on a Waters Acquity phenyl hexyl (100 mm × 2.1 mm, 1.7 μm) column with gradient elution of 10 mM ammonium acetate buffer (pH 4.0) and a mixture of acetonitrile/2-propanol (1:1) as the mobile phase. Through this study, a fast and organized method development workflow was established, and the robustness of the method was also demonstrated. The method was validated for specificity, linearity, accuracy, precision, and robustness in compliance with the International Conference on Harmonization Q2(R1) guidelines. The impurities were identified by the atmospheric pressure chemical ionization-mass spectrometry technique. Further, the in silico toxicity of the impurities was analyzed using the TOPKAT and DEREK software. PMID:27488256
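Step (v), robustness via a process-capability index from Monte Carlo simulation, amounts to perturbing the method parameters within their expected variation and computing Cp = (USL − LSL)/6σ of the simulated response. A sketch with hypothetical parameter sensitivities and specification limits (none of these numbers come from the paper):

```python
import random
from statistics import mean, stdev

random.seed(7)

def simulate_retention_time(n=5000):
    """Monte Carlo draws of a response (retention time, min) under small random
    variation of method parameters; the linear sensitivities are hypothetical."""
    samples = []
    for _ in range(n):
        ph = random.gauss(4.0, 0.05)       # buffer pH
        flow = random.gauss(0.4, 0.01)     # flow rate, mL/min
        temp = random.gauss(30.0, 0.5)     # column temperature, deg C
        rt = 6.0 + 0.8 * (ph - 4.0) - 4.0 * (flow - 0.4) - 0.02 * (temp - 30.0)
        samples.append(rt)
    return samples

def cp(samples, lsl, usl):
    """Process capability index: spec width over six process sigmas."""
    return (usl - lsl) / (6 * stdev(samples))

rts = simulate_retention_time()
capability = cp(rts, lsl=5.7, usl=6.3)    # illustrative spec limits
```

A Cp above roughly 1.33 is the conventional threshold for calling a response robust to the simulated parameter noise.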

  9. Physiological Changes in Rhizobia after Growth in Peat Extract May Be Related to Improved Desiccation Tolerance

    PubMed Central

    Wilkes, Meredith A.; Deaker, Rosalind

    2013-01-01

Improved survival of peat-cultured rhizobia compared to that of liquid-cultured cells has been attributed to cellular adaptations during solid-state fermentation in moist peat. We have observed improved desiccation tolerance of Rhizobium leguminosarum bv. trifolii TA1 and Bradyrhizobium japonicum CB1809 after aerobic growth in water extracts of peat. Survival of TA1 grown in crude peat extract was 18-fold greater than that of cells grown in a defined liquid medium but was diminished when cells were grown in different-sized colloidal fractions of peat extract. Survival of CB1809 was generally better after growth in crude peat extract than in the control, although the improvement was not statistically significant (P > 0.05) and was strongly dependent on peat extract concentration. Accumulation of intracellular trehalose by both TA1 and CB1809 was higher after growth in peat extract than in the defined-medium control. Cells grown in water extracts of peat exhibited morphological changes similar to those observed after growth in moist peat. Electron microscopy revealed thickened plasma membranes, with an electron-dense material occupying the periplasmic space in both TA1 and CB1809. Growth in peat extract also resulted in changes to polypeptide expression in both strains, and peptide analysis by liquid chromatography-mass spectrometry indicated increased expression of stress response proteins. Our results suggest that the increased capacity for desiccation tolerance in rhizobia is multifactorial, involving the accumulation of trehalose together with increased expression of proteins involved in protection of the cell envelope, repair of DNA damage, oxidative stress responses, and maintenance of the stability and integrity of proteins. PMID:23603686

  10. Physiological changes in rhizobia after growth in peat extract may be related to improved desiccation tolerance.

    PubMed

    Casteriano, Andrea; Wilkes, Meredith A; Deaker, Rosalind

    2013-07-01

Improved survival of peat-cultured rhizobia compared to that of liquid-cultured cells has been attributed to cellular adaptations during solid-state fermentation in moist peat. We have observed improved desiccation tolerance of Rhizobium leguminosarum bv. trifolii TA1 and Bradyrhizobium japonicum CB1809 after aerobic growth in water extracts of peat. Survival of TA1 grown in crude peat extract was 18-fold greater than that of cells grown in a defined liquid medium but was diminished when cells were grown in different-sized colloidal fractions of peat extract. Survival of CB1809 was generally better after growth in crude peat extract than in the control, although the improvement was not statistically significant (P > 0.05) and was strongly dependent on peat extract concentration. Accumulation of intracellular trehalose by both TA1 and CB1809 was higher after growth in peat extract than in the defined-medium control. Cells grown in water extracts of peat exhibited morphological changes similar to those observed after growth in moist peat. Electron microscopy revealed thickened plasma membranes, with an electron-dense material occupying the periplasmic space in both TA1 and CB1809. Growth in peat extract also resulted in changes to polypeptide expression in both strains, and peptide analysis by liquid chromatography-mass spectrometry indicated increased expression of stress response proteins. Our results suggest that the increased capacity for desiccation tolerance in rhizobia is multifactorial, involving the accumulation of trehalose together with increased expression of proteins involved in protection of the cell envelope, repair of DNA damage, oxidative stress responses, and maintenance of the stability and integrity of proteins.

  11. Feature extraction of event-related potentials using wavelets: an application to human performance monitoring

    NASA Technical Reports Server (NTRS)

    Trejo, L. J.; Shensa, M. J.

    1999-01-01

    This report describes the development and evaluation of mathematical models for predicting human performance from discrete wavelet transforms (DWT) of event-related potentials (ERP) elicited by task-relevant stimuli. The DWT was compared to principal components analysis (PCA) for representation of ERPs in linear regression and neural network models developed to predict a composite measure of human signal detection performance. Linear regression models based on coefficients of the decimated DWT predicted signal detection performance with half as many free parameters as comparable models based on PCA scores. In addition, the DWT-based models were more resistant to model degradation due to over-fitting than PCA-based models. Feed-forward neural networks were trained using the backpropagation algorithm to predict signal detection performance based on raw ERPs, PCA scores, or high-power coefficients of the DWT. Neural networks based on high-power DWT coefficients trained with fewer iterations, generalized to new data better, and were more resistant to overfitting than networks based on raw ERPs. Networks based on PCA scores did not generalize to new data as well as either the DWT network or the raw ERP network. The results show that wavelet expansions represent the ERP efficiently and extract behaviorally important features for use in linear regression or neural network models of human performance. The efficiency of the DWT is discussed in terms of its decorrelation and energy compaction properties. In addition, the DWT models provided evidence that a pattern of low-frequency activity (1 to 3.5 Hz) occurring at specific times and scalp locations is a reliable correlate of human signal detection performance. Copyright 1999 Academic Press.
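The decimated DWT used above can be illustrated with the Haar wavelet: an orthonormal transform preserves the signal's energy, and the "high-power coefficients" fed to the models are simply the coefficients with the largest squared magnitude. A toy sketch (the report's actual wavelet family and real ERP epochs are not reproduced here; the 8-sample epoch is hypothetical):

```python
def haar_dwt(signal):
    """Full orthonormal Haar decomposition of a length-2^k signal.
    Returns [final approximation] + detail coefficients, coarsest level first."""
    approx = list(signal)
    levels = []
    while len(approx) > 1:
        half = len(approx) // 2
        detail = [(approx[2*i] - approx[2*i + 1]) / 2**0.5 for i in range(half)]
        approx = [(approx[2*i] + approx[2*i + 1]) / 2**0.5 for i in range(half)]
        levels.insert(0, detail)
    return approx + [c for level in levels for c in level]

def top_power_indices(coeffs, k):
    """Indices of the k highest-power (squared) coefficients."""
    return sorted(range(len(coeffs)), key=lambda i: coeffs[i] ** 2, reverse=True)[:k]

# Hypothetical 8-sample ERP epoch
erp = [0.0, 0.5, 1.5, 2.0, 1.0, 0.2, -0.3, -0.1]
coeffs = haar_dwt(erp)
features = top_power_indices(coeffs, 3)  # compact feature vector for regression
```

Because the transform is orthonormal, keeping the few highest-power coefficients discards little energy, which is the "energy compaction" property discussed in the report.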

  12. Feature Extraction of Event-Related Potentials Using Wavelets: An Application to Human Performance Monitoring

    NASA Technical Reports Server (NTRS)

    Trejo, Leonard J.; Shensa, Mark J.; Remington, Roger W. (Technical Monitor)

    1998-01-01

This report describes the development and evaluation of mathematical models for predicting human performance from discrete wavelet transforms (DWT) of event-related potentials (ERP) elicited by task-relevant stimuli. The DWT was compared to principal components analysis (PCA) for representation of ERPs in linear regression and neural network models developed to predict a composite measure of human signal detection performance. Linear regression models based on coefficients of the decimated DWT predicted signal detection performance with half as many free parameters as comparable models based on PCA scores. In addition, the DWT-based models were more resistant to model degradation due to over-fitting than PCA-based models. Feed-forward neural networks were trained using the backpropagation algorithm to predict signal detection performance based on raw ERPs, PCA scores, or high-power coefficients of the DWT. Neural networks based on high-power DWT coefficients trained with fewer iterations, generalized to new data better, and were more resistant to overfitting than networks based on raw ERPs. Networks based on PCA scores did not generalize to new data as well as either the DWT network or the raw ERP network. The results show that wavelet expansions represent the ERP efficiently and extract behaviorally important features for use in linear regression or neural network models of human performance. The efficiency of the DWT is discussed in terms of its decorrelation and energy compaction properties. In addition, the DWT models provided evidence that a pattern of low-frequency activity (1 to 3.5 Hz) occurring at specific times and scalp locations is a reliable correlate of human signal detection performance.

  13. Anticonvulsant and related neuropharmacological effects of the whole plant extract of Synedrella nodiflora (L.) Gaertn (Asteraceae)

    PubMed Central

    Amoateng, Patrick; Woode, Eric; Kombian, Samuel B.

    2012-01-01

Purpose: The plant Synedrella nodiflora (L.) Gaertn is traditionally used by some Ghanaian communities to treat epilepsy. To determine if this use has merit, we studied the anticonvulsant and other neuropharmacological effects of a hydro-ethanolic extract of the whole plant using murine models. Materials and Methods: The anticonvulsant effect of the extract (10–1000 mg/kg) was tested on the pentylenetetrazole-, picrotoxin-, and pilocarpine-induced seizure models and PTZ kindling in mice/rats. The effect of the extract on motor coordination was also tested using the rota-rod. Results: The extract possessed anticonvulsant effects in all the experimental seizure models tested, significantly reducing the latencies to myoclonic jerks and seizures as well as the duration and percentage severity of seizures. The extract also caused motor incoordination at the highest dose of 1000 mg/kg. Conclusions: In summary, the hydro-ethanolic extract of the whole plant of S. nodiflora possesses anticonvulsant effects, possibly through an interaction with GABAergic transmission, antioxidant mechanisms, and muscle-relaxant effects. These findings provide scientific evidence in support of the traditional use of the plant in the management of epilepsy. PMID:22557925

  14. [Relations between extraction of wisdom teeth and temporomandibular disorders: a case/control study].

    PubMed

    Duval, Florian; Leroux, Agathe; Bertaud, Valérie; Meary, Fleur; Le Padellec, Clément; Refuveille, Laura; Lemaire, Arnaud; Sorel, Olivier; Chauvel-Lebret, Dominique

    2015-09-01

The aim of this study was to assess the impact of the extraction of third molars on the occurrence of temporomandibular disorders (TMD). A review of the literature and a case-control study were conducted. The case-control study compared the frequency of extraction of third molars between a sample with TMD (cases) and a sample without TMD (controls). The proportion of patients who had undergone extraction of wisdom teeth was higher in the case group than in the control group. The difference was statistically significant when patients had undergone extraction of all four wisdom teeth, or when the extraction of the four wisdom teeth was performed in one sitting or under general anesthesia. The study of patients in the case sample shows that all signs of TMD were more common in patients who had undergone extractions in several sessions and under local anesthesia. Temporomandibular joint sounds were significantly more frequent with local anesthesia. In the case group, 85 to 92% of patients had parafunctions and 5 to 11% had malocclusion. This demonstrates the multifactorial etiology of temporomandibular disorders.
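For a case-control design like this, the standard effect measure is the odds ratio from the 2×2 exposure table, with a Woolf (log-scale) confidence interval. A sketch with hypothetical counts, since the study's actual table is not given in the abstract:

```python
from math import exp, log, sqrt

def odds_ratio(a, b, c, d):
    """2x2 table: a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls.
    Returns the odds ratio and its 95% Woolf confidence interval."""
    or_ = (a * d) / (b * c)
    se = sqrt(1/a + 1/b + 1/c + 1/d)          # SE of log(OR)
    lo, hi = exp(log(or_) - 1.96 * se), exp(log(or_) + 1.96 * se)
    return or_, (lo, hi)

# Hypothetical counts: wisdom-tooth extraction (exposure) vs. TMD status
or_est, ci = odds_ratio(a=45, b=55, c=30, d=70)
```

An interval excluding 1.0 corresponds to the "statistically significant" comparisons reported above.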

  15. Determination of perfluorochemicals in biological, environmental and food samples by an automated on-line solid phase extraction ultra high performance liquid chromatography tandem mass spectrometry method.

    PubMed

    Gosetti, Fabio; Chiuminatto, Ugo; Zampieri, Davide; Mazzucco, Eleonora; Robotti, Elisa; Calabrese, Giorgio; Gennaro, Maria Carla; Marengo, Emilio

    2010-12-10

A rapid on-line solid-phase extraction ultra-high-performance liquid chromatography tandem mass spectrometry method was developed for the identification and quantitation of nine perfluorinated compounds in matrices of environmental, biological and food interest. Pre-treatment, solid-phase extraction, chromatographic and mass detection conditions were optimised in order to apply the whole methodology to the analysis of different matrices. Particular attention was devoted to the evaluation of matrix effects and the correlated phenomena of ion enhancement or suppression in mass spectrometry detection. LODs and LOQs range from 3 to 15 ng L⁻¹ and from 10 to 50 ng L⁻¹, respectively. Method detection limits (MDLs) were also calculated for each kind of matrix. The recovery, evaluated for each analyte, does not depend on analyte concentration in the explored concentration range: average recovery values are always greater than 82.9%. On the whole, the results obtained for samples of river waters, blood serum, blood plasma, and fish confirm the ubiquitous presence of perfluorinated compounds, as recently reported by many sources.

  16. Automated theorem proving.

    PubMed

    Plaisted, David A

    2014-03-01

Automated theorem proving is the use of computers to prove or disprove mathematical or logical statements. Such statements can express properties of hardware or software systems, or facts about the world that are relevant for applications such as natural language processing and planning. A brief introduction to propositional and first-order logic is given, along with some of the main methods of automated theorem proving in these logics. These methods include resolution, Davis-Putnam-style approaches, and others. Methods for handling the equality axioms are also presented. Methods of theorem proving in propositional logic are presented first, followed by methods for first-order logic. WIREs Cogn Sci 2014, 5:115-128. doi: 10.1002/wcs.1269 PMID:26304304
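The resolution method mentioned above can be demonstrated in a few lines for propositional clauses: saturate the clause set with binary resolvents, and report a contradiction as soon as the empty clause appears. This is a toy sketch for illustration, not an efficient prover:

```python
def negate(lit):
    """Flip the sign of a literal written as 'p' or '~p'."""
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(c1, c2):
    """All binary resolvents of two clauses (sets of literals)."""
    out = []
    for lit in c1:
        if negate(lit) in c2:
            out.append(frozenset((c1 - {lit}) | (c2 - {negate(lit)})))
    return out

def unsatisfiable(clauses):
    """Saturate with resolution; the empty clause signals a contradiction."""
    clauses = set(map(frozenset, clauses))
    while True:
        new = set()
        pairs = [(a, b) for a in clauses for b in clauses if a != b]
        for a, b in pairs:
            for r in resolve(a, b):
                if not r:          # empty clause derived
                    return True
                new.add(r)
        if new <= clauses:         # no progress: satisfiable (saturated)
            return False
        clauses |= new

# Refuting {p -> q, p, ~q} shows that p and p -> q entail q
proof = unsatisfiable([{"~p", "q"}, {"p"}, {"~q"}])
```

Termination is guaranteed because resolvents only ever use literals already present, so the clause universe is finite.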

  17. The automation of science.

    PubMed

    King, Ross D; Rowland, Jem; Oliver, Stephen G; Young, Michael; Aubrey, Wayne; Byrne, Emma; Liakata, Maria; Markham, Magdalena; Pir, Pinar; Soldatova, Larisa N; Sparkes, Andrew; Whelan, Kenneth E; Clare, Amanda

    2009-04-01

    The basis of science is the hypothetico-deductive method and the recording of experiments in sufficient detail to enable reproducibility. We report the development of Robot Scientist "Adam," which advances the automation of both. Adam has autonomously generated functional genomics hypotheses about the yeast Saccharomyces cerevisiae and experimentally tested these hypotheses by using laboratory automation. We have confirmed Adam's conclusions through manual experiments. To describe Adam's research, we have developed an ontology and logical language. The resulting formalization involves over 10,000 different research units in a nested treelike structure, 10 levels deep, that relates the 6.6 million biomass measurements to their logical description. This formalization describes how a machine contributed to scientific knowledge. PMID:19342587

  18. Automated Desalting Apparatus

    NASA Technical Reports Server (NTRS)

    Spencer, Maegan K.; Liu, De-Ling; Kanik, Isik; Beegle, Luther

    2010-01-01

Because salt and metals can mask the signature of a variety of organic molecules (like amino acids) in any given sample, an automated system to purify complex field samples has been created for the analytical techniques of electrospray ionization/mass spectrometry (ESI/MS), capillary electrophoresis (CE), and biological assays, where unique identification requires at least some processing of complex samples. This development allows for automated sample preparation in the laboratory and analysis of complex samples in the field with multiple types of analytical instruments. Rather than using tedious, exacting protocols for desalting samples by hand, this innovation, called the Automated Sample Processing System (ASPS), takes analytes that have been extracted through high-temperature solvent extraction and introduces them into the desalting column. After 20 minutes, the eluent is produced. This clear liquid can then be directly analyzed by the techniques listed above. The current apparatus, including the computer and power supplies, is sturdy, has an approximate mass of 10 kg and a volume of about 20 × 20 × 20 cm, and is undergoing further miniaturization. This system currently targets amino acids. For these molecules, a slurry of 1 g cation-exchange resin in deionized water is packed into a column of the apparatus. Initial generation of the resin is done by flowing sequentially 2.3 bed volumes of 2N NaOH and 2N HCl (1 mL each) to rinse the resin, followed by 0.5 mL of deionized water. This makes the pH of the resin near neutral and eliminates cross-sample contamination. Afterward, 2.3 mL of extracted sample is loaded into the column onto the top of the resin bed. Because the column is packed tightly, the sample can be applied without disturbing the resin bed. This is a vital step needed to ensure that the analytes adhere to the resin. After the sample is drained, oxalic acid (1 mL, pH 1.6-1.8, adjusted with NH4OH) is pumped into the column. Oxalic acid works as a

  19. Detection of Staphylococcus aureus enterotoxin production genes from patient samples using an automated extraction platform and multiplex real-time PCR.

    PubMed

    Chiefari, Amy K; Perry, Michael J; Kelly-Cirino, Cassandra; Egan, Christina T

    2015-12-01

To minimize specimen volume, handling and testing time, we have developed two TaqMan® multiplex real-time PCR (rtPCR) assays to detect staphylococcal enterotoxins A-E and toxic shock syndrome toxin production genes directly from clinical patient stool specimens, utilizing a novel lysis extraction process in parallel with the Roche MagNA Pure Compact. These assays are specific, sensitive and reliable for the detection of the staphylococcal enterotoxin-encoding genes and the tst1 gene from known toxin-producing strains of Staphylococcus aureus. Specificity was determined by testing a total of 47 microorganism strains, including 8 previously characterized staphylococcal enterotoxin-producing strains, against each rtPCR target. Sensitivity for these assays ranges from 1 to 25 cfu per rtPCR reaction for cultured isolates and 8-20 cfu per rtPCR reaction for the clinical stool matrix.

  20. Antioxidant properties of water extracts from Cassia tora L. in relation to the degree of roasting.

    PubMed

    Yen, G C; Chuang, D Y

    2000-07-01

The antioxidant properties of water extracts from Cassia tora L. (WECT) prepared under different degrees of roasting were investigated. The water extract of unroasted C. tora L. (WEUCT) showed 94% inhibition of the peroxidation of linoleic acid at a dose of 0.2 mg/mL, which was higher than that of alpha-tocopherol (82%). Water extracts prepared from C. tora L. roasted at 175 °C for 5 min and at 200 °C for 5 min exhibited 83% and 82% inhibition of linoleic acid peroxidation, respectively. This result indicated that the antioxidant activity of WECT decreased with longer roasting time or higher roasting temperature. The IC50 of WEUCT in liposome oxidation induced by the Fenton reaction was 0.41 mg/mL, lower (i.e., more potent) than that of alpha-tocopherol (IC50 = 0.55 mg/mL). WEUCT also exhibited good antioxidant activity in enzymatic and nonenzymatic microsome oxidative systems. The water extracts of roasted C. tora L. increased in degree of browning and produced chemiluminescence when compared with the unroasted sample. However, the total polyphenolic compounds of WECT decreased after roasting. In conclusion, the decrease in the antioxidant activity of water extracts from roasted C. tora L. might have been due to the degradation of Maillard reaction products and the decrease in polyphenolic compounds.
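The percent-inhibition and IC50 figures above follow from standard definitions: inhibition is the fractional drop in an oxidation signal relative to an extract-free control, and IC50 can be read off a dose-response curve by interpolation. A sketch with a hypothetical dose-response (the paper's raw absorbances are not given in the abstract):

```python
def percent_inhibition(signal_control, signal_sample):
    """Inhibition (%) of peroxidation relative to an extract-free control."""
    return 100 * (signal_control - signal_sample) / signal_control

def ic50(doses, inhibitions):
    """Dose giving 50% inhibition, by linear interpolation between the
    bracketing points (doses ascending, inhibition increasing with dose)."""
    pts = list(zip(doses, inhibitions))
    for (d0, i0), (d1, i1) in zip(pts, pts[1:]):
        if i0 <= 50 <= i1:
            return d0 + (50 - i0) * (d1 - d0) / (i1 - i0)
    raise ValueError("50% inhibition not bracketed by the data")

# Hypothetical dose-response for a water extract (mg/mL vs. % inhibition)
est = ic50([0.1, 0.2, 0.4, 0.8], [18, 35, 62, 90])
```

Note the inverse reading: a lower IC50 means a more potent antioxidant, which is why 0.41 mg/mL beats 0.55 mg/mL above.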

  1. Automated Agitation-Assisted Demulsification Dispersive Liquid-Liquid Microextraction.

    PubMed

    Guo, Liang; Chia, Shao Hua; Lee, Hian Kee

    2016-03-01

Dispersive liquid-liquid microextraction (DLLME) is an extremely fast and efficient sample preparation procedure. For its capability and applicability to be fully exploited, full automation of its operations, seamlessly integrated with analysis, is necessary. In this work, for the first time, fully automated agitation-assisted demulsification (AAD)-DLLME integrated with gas chromatography/mass spectrometry was developed for the convenient and efficient determination of polycyclic aromatic hydrocarbons (PAHs) in environmental water samples. The use of a commercially available multipurpose autosampler equipped with two microsyringes of different capacities allowed elimination or significant reduction of manual labor and time, with the large-volume microsyringe used for liquid transfers and the small-volume microsyringe for extract collection and injection for analysis. Apart from enhancing the accessibility of DLLME, the procedure was characterized by the application of agitation after extraction to break up the emulsion (which would otherwise need centrifugation or a demulsification solvent), further improving overall operational efficiency and flexibility. Additionally, the use of a low-density solvent as extractant facilitated the easy collection of the extract as the upper layer over water. Parameters affecting the automated AAD-DLLME procedure were investigated. Under the optimized conditions, the procedure provided good linearity (ranging from a minimum of 0.1-0.5 μg/L to a maximum of 50 μg/L), low limits of detection (0.010-0.058 μg/L), and good repeatability of the extractions (relative standard deviations below 5.3%, n = 6). The proposed method was applied to analyze PAHs in real river water samples. PMID:26818217
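Two of the figures of merit behind such a procedure are the enrichment factor (analyte concentration in the collected extract phase over its initial concentration) and the repeatability RSD over replicate extractions. A sketch with hypothetical peak areas and concentrations, since the abstract reports only the final RSD range:

```python
from statistics import mean, stdev

def enrichment_factor(area_extract, area_std, conc_std, conc_sample):
    """EF = extract-phase concentration / initial sample concentration,
    with the extract concentration read off a direct-injection standard."""
    conc_extract = conc_std * area_extract / area_std
    return conc_extract / conc_sample

# Hypothetical: 10 ug/L sample; extract peak scaled against a 100 ug/L standard
ef = enrichment_factor(area_extract=5500, area_std=2000, conc_std=100,
                       conc_sample=10)

# Repeatability of six hypothetical replicate extractions (peak areas)
areas = [5400, 5600, 5350, 5550, 5500, 5450]
rsd = 100 * stdev(areas) / mean(areas)
```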

  2. A Shortest Dependency Path Based Convolutional Neural Network for Protein-Protein Relation Extraction

    PubMed Central

    Quan, Chanqin

    2016-01-01

The state-of-the-art methods for protein-protein interaction (PPI) extraction are primarily based on kernel methods, and their performance strongly depends on handcrafted features. In this paper, we tackle PPI extraction using convolutional neural networks (CNN) and propose a shortest dependency path based CNN (sdpCNN) model. The proposed method (1) takes only the shortest dependency path (sdp) and word embeddings as input and (2) avoids bias from feature selection by using CNN. We performed experiments on the standard AIMed and BioInfer datasets, and the experimental results demonstrated that our approach outperformed state-of-the-art kernel-based methods. In particular, by inspecting the sdpCNN model, we find that sdpCNN extracts key features automatically, and we verified that pretrained word embeddings are crucial to the PPI task. PMID:27493967
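The shortest dependency path (sdp) between the two protein mentions is itself just a shortest path in the sentence's dependency graph, recoverable with breadth-first search. A sketch over a toy parse (the sentence, edge list, and token names are invented; a real pipeline would take edges from a dependency parser):

```python
from collections import deque

def shortest_dependency_path(edges, start, goal):
    """BFS over an undirected dependency graph; returns the token path."""
    adj = {}
    for head, dep in edges:
        adj.setdefault(head, set()).add(dep)
        adj.setdefault(dep, set()).add(head)
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in adj.get(path[-1], ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # the two mentions are not connected

# Toy parse of "PROT1 interacts directly with PROT2" as (head, dependent) edges
edges = [("interacts", "PROT1"), ("interacts", "directly"),
         ("interacts", "with"), ("with", "PROT2")]
sdp = shortest_dependency_path(edges, "PROT1", "PROT2")
```

The resulting token sequence (here dropping the irrelevant modifier "directly") is what gets mapped to word embeddings and fed to the CNN.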

  3. A Shortest Dependency Path Based Convolutional Neural Network for Protein-Protein Relation Extraction.

    PubMed

    Hua, Lei; Quan, Chanqin

    2016-01-01

    The state-of-the-art methods for protein-protein interaction (PPI) extraction are primarily based on kernel methods, and their performance strongly depends on handcrafted features. In this paper, we tackle PPI extraction by using convolutional neural networks (CNN) and propose a shortest dependency path based CNN (sdpCNN) model. The proposed method (1) takes only the sdp and word embeddings as input and (2) avoids bias from feature selection by using a CNN. We performed experiments on the standard AIMed and BioInfer datasets, and the experimental results demonstrated that our approach outperformed state-of-the-art kernel-based methods. In particular, by inspecting the sdpCNN model, we found that sdpCNN extracts key features automatically, and we verified that pretrained word embeddings are crucial to the PPI task. PMID:27493967

  5. Algorithms Could Automate Cancer Diagnosis

    NASA Technical Reports Server (NTRS)

    Baky, A. A.; Winkler, D. G.

    1982-01-01

    Five new algorithms are a complete statistical procedure for quantifying cell abnormalities from digitized images. Procedure could be basis for automated detection and diagnosis of cancer. Objective of procedure is to assign each cell an atypia status index (ASI), which quantifies level of abnormality. It is possible that ASI values will be accurate and economical enough to allow diagnoses to be made quickly and accurately by computer processing of laboratory specimens extracted from patients.

  6. Free aluminium extraction from various reference materials and acid soils with relation to plant availability.

    PubMed

    Matús, Peter; Kubová, Jana; Bujdos, Marek; Medved', Ján

    2006-12-15

    The single extractions with 15 extractants (H(2)O, KCl, NH(4)Cl, NH(4)F, CaCl(2), BaCl(2), CuCl(2), LaCl(3), Na(2)S(2)O(4), (NH(4))(2)C(2)O(4), Na(4)P(2)O(7), NTA, EDTA, DTPA, HCl), the optimised BCR (Community Bureau of Reference) three-step sequential extraction procedure (SEP) and solid phase extraction (SPE) by the chelating ion-exchanger Iontosorb Salicyl (a cellulose resin containing covalently bound salicylic acid functional groups) were used for the partitioning of Al in very acid soil samples taken from an area influenced by acid mine solutions. The precision, accuracy and repeatability of all steps of the optimised BCR SEP were checked on various reference materials (CRM 483 sewage sludge amended soil, CRM BCR 701 freshwater sediment, SRM 2710 and SRM 2711 Montana soils). New indicative values of the optimised BCR SEP fractional Al concentrations were also obtained for these reference materials. The aluminium amounts obtained by these extraction procedures were evaluated and discussed with respect to the Al concentration in the plants (grass) growing on the studied soils. Aluminium toxicity indexes (ATI) calculated for the studied soils, the BaCl(2) and acetic acid soil extracts, and the grass stems and roots were used to assess Al toxicity to the plants. The ATI value was defined as the ratio of the sum of nutrient cation (Ca, Mg, K, Na) concentrations to the Al concentration. Flame atomic absorption spectrometry (LOQ=0.2 mg l(-1)) and inductively coupled plasma optical emission spectrometry (LOQ=0.03 mg l(-1)) were used for aluminium quantification. PMID:18970873
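
    The ATI defined in the abstract is a simple concentration ratio; a small sketch with invented concentrations (the example values are not from the study, and all five inputs are assumed to share the same units):

```python
def aluminium_toxicity_index(ca, mg, k, na, al):
    """ATI = sum of nutrient cation concentrations / Al concentration.
    All concentrations must be in the same units (e.g. mmol/L)."""
    if al <= 0:
        raise ValueError("Al concentration must be positive")
    return (ca + mg + k + na) / al

# Hypothetical BaCl2-extract concentrations (mmol/L):
print(aluminium_toxicity_index(ca=2.0, mg=1.0, k=0.5, na=0.5, al=2.0))  # 2.0
```

    A lower ATI means proportionally more Al relative to nutrient cations, i.e. higher expected toxicity to the plant.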

  7. Accumulation and depuration of trinitrotoluene and related extractable and nonextractable (bound) residues in marine fish and mussels.

    PubMed

    Lotufo, Guilherme R; Belden, Jason B; Fisher, Jonathon C; Chen, Shou-Feng; Mowery, Richard A; Chambliss, C Kevin; Rosen, Gunther

    2016-03-01

    To determine whether trinitrotoluene (TNT) forms nonextractable residues in mussels (Mytilus galloprovincialis) and fish (Cyprinodon variegatus), and to measure the relative degree of accumulation compared to extractable TNT and its major metabolites, organisms were exposed to water fortified with (14)C-TNT. After 24 h, nonextractable residues made up 75% (mussel) and 83% (fish) of total radioactivity, while TNT accounted for 2%. Depuration of extractable TNT, aminodinitrotoluenes (ADNTs) and diaminonitrotoluenes (DANTs) was fast initially (half-lives <0.5 h), but slower for nonextractable residues. Nonextractable residues from organisms were identified as ADNTs and DANTs using 0.1 M HCl for solubilization followed by liquid chromatography-tandem mass spectrometry. Recovered metabolites accounted for only a small fraction of the bound residue quantified using a radiotracer, likely because of low extraction or hydrolysis efficiency or alternative pathways of incorporation of the radiolabel into tissue. PMID:26708767
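
    Depuration half-lives such as those above are conventionally obtained from a first-order elimination model, where t1/2 = ln 2 / k. A minimal sketch; the rate constant below is illustrative, not a value from the study:

```python
import math

def half_life(k):
    """Half-life (h) from a first-order elimination rate constant k (1/h)."""
    return math.log(2) / k

def remaining_fraction(k, t):
    """Fraction of residue remaining after t hours of depuration."""
    return math.exp(-k * t)

k_fast = 1.5  # 1/h, illustrative: consistent with a half-life < 0.5 h
print(round(half_life(k_fast), 2))               # 0.46
print(round(remaining_fraction(k_fast, 1.0), 5))  # 0.22313
```

    A slowly depurating bound residue would correspond to a much smaller k, hence a long half-life and a remaining fraction near 1 over the same interval.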

  9. Ginseng extracts restore high-glucose induced vascular dysfunctions by altering triglyceride metabolism and downregulation of atherosclerosis-related genes.

    PubMed

    Chan, Gabriel Hoi-Huen; Law, Betty Yuen-Kwan; Chu, John Man-Tak; Yue, Kevin Kin-Man; Jiang, Zhi-Hong; Lau, Chi-Wai; Huang, Yu; Chan, Shun-Wan; Ying-Kit Yue, Patrick; Wong, Ricky Ngok-Shun

    2013-01-01

    The king of herbs, Panax ginseng, has been used widely as a therapeutic agent vis-à-vis its active pharmacological and physiological effects. Based on Chinese pharmacopeia Ben Cao Gang Mu and various pieces of literature, Panax ginseng was believed to exert active vascular protective effects through its antiobesity and anti-inflammation properties. We investigated the vascular protective effects of ginseng by administrating ginseng extracts to rats after the induction of diabetes. We found that Panax ginseng can restore diabetes-induced impaired vasorelaxation and can reduce serum triglyceride but not cholesterol level in the diabetic rats. The ginseng extracts also suppressed the expression of atherosclerosis-related genes and altered the expression of lipid-related genes. The results provide evidence that Panax ginseng improves vascular dysfunction induced by diabetes and the protective effects may possibly be due to the downregulation of atherosclerosis-related genes and altered lipid metabolism, which help to restore normal endothelium functions.

  10. High-resolution twin-ion metabolite extraction (HiTIME) mass spectrometry: nontargeted detection of unknown drug metabolites by isotope labeling, liquid chromatography mass spectrometry, and automated high-performance computing.

    PubMed

    Leeming, Michael G; Isaac, Andrew P; Pope, Bernard J; Cranswick, Noel; Wright, Christine E; Ziogas, James; O'Hair, Richard A J; Donald, William A

    2015-04-21

    The metabolic fate of a compound can often determine the success of a new drug lead. Thus, significant effort is directed toward identifying the metabolites formed from a given molecule. Here, an automated and nontargeted procedure is introduced for detecting drug metabolites without authentic metabolite standards via the use of stable isotope labeling, liquid chromatography mass spectrometry (LC/MS), and high-performance computing. LC/MS of blood plasma extracts from rats that were administered a 1:1 mixture of acetaminophen (APAP) and (13)C6-APAP resulted in mass spectra that contained "twin" ions for drug metabolites that were not detected in control spectra (i.e., no APAP administered). Because of the development of a program (high-resolution twin-ion metabolite extraction; HiTIME) that can identify twin-ions in high-resolution mass spectra without centroiding (i.e., reduction of mass spectral peaks to single data points), 9 doublets corresponding to APAP metabolites were identified. This is nearly twice that obtained by use of existing programs that make use of centroiding to reduce computational cost under these conditions with a quadrupole time-of-flight mass spectrometer. By a manual search for all reported APAP metabolite ions, no additional twin-ion signals were assigned. These data indicate that all the major metabolites of APAP and multiple low-abundance metabolites (e.g., acetaminophen hydroxy- and methoxysulfate) that are rarely reported were detected. This methodology can be used to detect drug metabolites without prior knowledge of their identity. HiTIME is freely available from https://github.com/bjpop/HiTIME .
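
    The twin-ion search at the heart of HiTIME can be sketched as a scan for peak pairs separated by the 12C/13C6 mass difference with roughly matched intensities (a 1:1 dose mixture gives ~1:1 twins). The peak list, tolerances, and function below are illustrative assumptions; the actual tool operates on full high-resolution profile data without centroiding:

```python
# Mass shift for six 13C substitutions (Da).
DELTA_13C6 = 6 * 1.003355

def find_twin_ions(peaks, delta=DELTA_13C6, mz_tol=0.01, ratio_tol=0.3):
    """Return (light, heavy) peak pairs separated by `delta` in m/z
    whose intensity ratio is close to 1:1."""
    twins = []
    for mz1, i1 in peaks:
        for mz2, i2 in peaks:
            if abs((mz2 - mz1) - delta) <= mz_tol:
                ratio = i2 / i1 if i1 else 0.0
                if abs(ratio - 1.0) <= ratio_tol:
                    twins.append(((mz1, i1), (mz2, i2)))
    return twins

# Toy centroided spectrum: (m/z, intensity); values are invented.
spectrum = [
    (152.0706, 1000.0),              # light APAP-like ion
    (152.0706 + DELTA_13C6, 950.0),  # its 13C6 twin
    (180.1020, 400.0),               # unrelated background peak
]
print(find_twin_ions(spectrum))
```

    The background peak has no partner 6.02 Da away, so only the labeled pair survives the filter; metabolites inherit the label, so their ions pair the same way.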

  11. Inhibitive Effects of Mulberry Leaf-Related Extracts on Cell Adhesion and Inflammatory Response in Human Aortic Endothelial Cells

    PubMed Central

    Chao, P.-Y.; Lin, K.-H.; Chiu, C.-C.; Yang, Y.-Y.; Huang, M.-Y.; Yang, C.-M.

    2013-01-01

    Effects of mulberry leaf-related extracts (MLREs) on hydrogen peroxide-induced DNA damage in human lymphocytes and on inflammatory signaling pathways in human aortic endothelial cells (HAECs) were studied. The tested MLREs were rich in flavonols, especially bombyx faeces tea (BT) in quercetin and kaempferol. Polyphenols, flavonoids, and anthocyanidins also abounded in BT. The best trolox equivalent antioxidant capacity (TEAC) was generated from the acidic methanolic extracts of BT. Acidic methanolic and water extracts of mulberry leaf tea (MT), mulberry leaf (M), and BT significantly inhibited oxidative DNA damage to lymphocytes based on the comet assay as compared to the H2O2-treated group. TNF-α-induced monocyte-endothelial cell adhesion was significantly suppressed by MLREs. Additionally, nuclear factor kappa B (NF-κB) expression was significantly reduced by BT and MT. Significant reductions were also observed in both NF-κB and activator protein (AP)-1 DNA binding by MLREs. Significant increases in peroxisome proliferator-activated receptor (PPAR) α and γ DNA binding were also detected in M and MT extracts, but no evidence for PPAR α DNA binding in 50 μg/mL MT extract was found. Apparently, MLREs can provide distinct cytoprotective mechanisms that may contribute to their putative beneficial effects on suppressing endothelial responses to cytokines during inflammation. PMID:24371453

  12. A retention time locked gas chromatography-mass spectrometry method based on stir-bar sorptive extraction and thermal desorption for automated determination of synthetic musk fragrances in natural and wastewaters.

    PubMed

    Arbulu, Maria; Sampedro, M Carmen; Unceta, Nora; Gómez-Caballero, Alberto; Goicolea, M Aránzazu; Barrio, Ramón J

    2011-05-20

    A stir-bar sorptive extraction (SBSE) method followed by automated thermal desorption (ATD) coupled to gas chromatography-mass spectrometry was optimized for determining trace levels of 18 synthetic fragrances (musks). Using the developed method, a retention-time-locked library was created and converted to a screening database. This homebuilt database can be combined with deconvolution software for the identification of musks. A factorial design was applied to evaluate the main parameters and the interactions between the factors affecting the SBSE process. Operating the MS detector in full-scan mode, high sensitivity (detection limits in the low ng L(-1) range) and good linearity and repeatability were achieved for all musks. The applicability of the method was tested in natural waters (surface and groundwater) and wastewater from a treatment plant (WWTP). The results obtained confirmed the usefulness of the proposed method for the determination and unequivocal identification of musks. This approach enables the developed method to be used for routine screening of environmental samples and subsequent rapid quantitation of positive samples.

  14. Automating Frame Analysis

    SciTech Connect

    Sanfilippo, Antonio P.; Franklin, Lyndsey; Tratz, Stephen C.; Danielson, Gary R.; Mileson, Nicholas D.; Riensche, Roderick M.; McGrath, Liam

    2008-04-01

    Frame Analysis has come to play an increasingly strong role in the study of social movements in Sociology and Political Science. While significant steps have been made in providing a theory of frames and framing, a systematic characterization of the frame concept is still largely lacking, and there are no recognized criteria and methods that can be used to identify and marshal frame evidence reliably and in a time- and cost-effective manner. Consequently, current Frame Analysis work is still too reliant on manual annotation and subjective interpretation. The goal of this paper is to present an approach to the representation, acquisition and analysis of frame evidence which leverages Content Analysis, Information Extraction and Semantic Search methods to provide a systematic treatment of Frame Analysis and automate frame annotation.

  15. Automated Defect Classification (ADC)

    SciTech Connect

    1998-01-01

    The ADC Software System is designed to provide semiconductor defect feature analysis and defect classification capabilities. Defect classification is an important software method used by semiconductor wafer manufacturers to automate the analysis of defect data collected by a wide range of microscopy techniques in semiconductor wafer manufacturing today. These microscopies (e.g., optical bright and dark field, scanning electron microscopy, atomic force microscopy, etc.) generate images of anomalies that are induced or otherwise appear on wafer surfaces as a result of errant manufacturing processes or simple atmospheric contamination (e.g., airborne particles). This software provides methods for analyzing these images, extracting statistical features from the anomalous regions, and applying supervised classifiers to label the anomalies into user-defined categories.
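
    The extract-features-then-classify pipeline described above can be sketched with a toy nearest-centroid classifier over per-region feature vectors. The features, labels, and values below are invented for illustration; ADC's actual features and supervised classifiers are not specified in this record:

```python
# Invented region features: (area, mean_intensity) per labeled example.
training = {
    "particle": [(5.0, 0.9), (6.0, 0.8)],
    "scratch":  [(40.0, 0.3), (50.0, 0.2)],
}

def centroid(points):
    """Mean feature vector of a list of 2-D feature tuples."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(2))

CENTROIDS = {label: centroid(pts) for label, pts in training.items()}

def classify(features):
    """Assign the user-defined category whose centroid is nearest."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(CENTROIDS, key=lambda lbl: dist2(features, CENTROIDS[lbl]))

print(classify((45.0, 0.25)))  # 'scratch'
```

    The statistical features extracted from each anomalous region play the role of the (area, intensity) tuples here; any supervised classifier can be substituted for the nearest-centroid rule.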

  16. Prevention of medication-related osteonecrosis of the jaws secondary to tooth extractions. A systematic review

    PubMed Central

    Limeres, Jacobo

    2016-01-01

    Background A study was made to identify the most effective protocol for reducing the risk of osteonecrosis of the jaws (ONJ) following tooth extraction in patients subjected to treatment with antiresorptive or antiangiogenic drugs. Material and Methods A MEDLINE and SCOPUS search (January 2003 - March 2015) was made with the purpose of conducting a systematic literature review based on the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) guidelines. All articles contributing information on tooth extractions in patients treated with oral or intravenous antiresorptive or antiangiogenic drugs were included. Results Only 13 of the 380 selected articles were finally included in the review: 11 and 5 of them offered data on patients treated with intravenous and oral bisphosphonates, respectively. No randomized controlled trials were found – all publications corresponding to case series or cohort studies. The prevalence of ONJ in the patients treated with intravenous and oral bisphosphonates was 6.9% (range 0-34.7%) and 0.47% (range 0-2.5%), respectively. The main preventive measures comprised local and systemic infection control. Conclusions No conclusive scientific evidence is available to date on the efficacy of ONJ prevention protocols in patients treated with antiresorptive or antiangiogenic drugs subjected to tooth extraction. Key words: Bisphosphonates, angiogenesis inhibitors, antiresorptive drugs, extraction, osteonecrosis. PMID:26827065

  17. The relative allergenicity of Stachybotrys chartarum compared to house dust mite extracts in a mouse model

    EPA Science Inventory

    A report by the Institute of Medicine suggested that more research is needed to better understand mold effects on allergic disease, particularly asthma development. The authors compared the ability of the fungus Stachybotrys chartarum (SCE) and house dust mite (HDM) extracts to i...

  18. Cinnamon polyphenol extract regulates tristetraprolin and related gene expression in mouse adipocytes

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Cinnamon (Cinnamomum verum) has been widely used in spices, flavoring agents, and preservatives. Cinnamon polyphenol extract (CPE) may be important in the alleviation of chronic diseases, but the molecular evidence is not substantial. Tristetraprolin (TTP) family proteins have anti-inflammatory ef...

  20. Bioactive compounds extracted from Indian wild legume seeds: antioxidant and type II diabetes-related enzyme inhibition properties.

    PubMed

    Gautam, Basanta; Vadivel, Vellingiri; Stuetz, Wolfgang; Biesalski, Hans K

    2012-03-01

    Seven different wild legume seeds (Acacia leucophloea, Bauhinia variegata, Canavalia gladiata, Entada scandens, Mucuna pruriens, Sesbania bispinosa and Tamarindus indica) from various parts of India were analyzed for total free phenolics, l-Dopa (l-3,4 dihydroxyphenylalanine), phytic acid and their antioxidant capacity (ferric-reducing antioxidant power [FRAP] and 2,2-diphenyl-1-picrylhydrazyl [DPPH] assay) and type II diabetes-related enzyme inhibition activity (α-amylase). S. bispinosa had the highest content in both total free phenolics and l-Dopa, and relatively low phytic acid when compared with other seeds. Phytic acid content, being highest in E. scandens, M. pruriens and T. indica, was highly predictive for FRAP (r = 0.47, p < 0.05) and DPPH (r = 0.66, p < 0.001) assays. The phenolic extract from T. indica and l-Dopa extract from E. scandens showed significantly higher FRAP values among others. All seed extracts demonstrated a remarkable reducing power (7-145 mM FeSO4 per mg extract), DPPH radical scavenging activity (16-95%) and α-amylase enzyme inhibition activity (28-40%). PMID:21970446

  1. Safety in the Automated Office.

    ERIC Educational Resources Information Center

    Graves, Pat R.; Greathouse, Lillian R.

    1990-01-01

    Office automation has introduced new hazards to the workplace: electrical hazards related to computer wiring, musculoskeletal problems resulting from use of computer terminals and design of work stations, and environmental concerns related to ventilation, noise levels, and office machine chemicals. (SK)

  2. Relation between various soil phosphorus extraction methods and sorption parameters in calcareous soils with different texture.

    PubMed

    Jalali, Mohsen; Jalali, Mahdi

    2016-10-01

    The aim of this study was to investigate the influence of soil texture on phosphorus (P) extractability and sorption from a wide range of calcareous soils across Hamedan, western Iran. Fifty-seven soil samples were selected and partitioned into five types on the basis of soil texture (clay, sandy, sandy clay loam, sandy loam and mixed loam) and the P extracted with calcium chloride (PCaCl2), citrate (Pcitrate), HCl (PHCl), Olsen (POls), and Mehlich-3 (PM3) solutions. On average, the P extracted was in the order PHCl>PM3>Pcitrate>POls>PCaCl2. The P extracted by the Pcitrate, PHCl, POls, and PM3 methods was significantly higher in sandy, sandy clay loam and sandy loam textures than in clay and mixed loam textures, while soil phosphorus buffer capacity (PBC) was significantly higher in clay and mixed loam soil textures. Correlation analysis revealed a significant positive relationship between silt content and the Freundlich sorption coefficient (KF), maximum P sorption (Qmax), linear distribution coefficient (Kd), and PBC. All extractions were highly correlated with each other and, among soil components, with silt content. Principal component analysis (PCA) performed on the data identified five principal components describing 74.5% of total variation. The results point to soil texture as an important factor, with silt as the crucial soil property associated with P sorption and its extractability in these calcareous soils. DPSM3-2 (PM3/(PM3+Qmax)×100) and DPScitrate (Pcitrate/(Pcitrate+Qmax)×100) proved to be good indicators of the soils' potential P release. 21% of soils showed DPSM3-2 values higher than the environmental threshold, indicating build-up and release of P. Most of the studied sandy clay loam soils had exceeded the environmentally unacceptable P concentration. Various management practices should be taken into account to reduce P losses from these soils, and further inorganic and organic P fertilizer inputs should be reduced.
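
    The degree-of-P-saturation indicator used above follows DPS = P_extracted / (P_extracted + Qmax) × 100. A sketch with invented values; the threshold below is illustrative, not the study's value:

```python
def degree_of_p_saturation(p_extracted, q_max):
    """DPS (%) = extracted P / (extracted P + maximum P sorption) * 100.
    Both arguments must be in the same units (e.g. mg P per kg soil)."""
    return 100.0 * p_extracted / (p_extracted + q_max)

# Invented example values for one soil sample (mg/kg):
p_m3, q_max = 60.0, 240.0
dps = degree_of_p_saturation(p_m3, q_max)
print(round(dps, 1))  # 20.0

threshold = 25.0  # illustrative environmental threshold (%)
print("exceeds threshold:", dps > threshold)  # exceeds threshold: False
```

    The same function serves for DPScitrate by substituting the citrate-extracted P for the Mehlich-3 value.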

  4. Automated data entry system: performance issues

    NASA Astrophysics Data System (ADS)

    Thoma, George R.; Ford, Glenn

    2001-12-01

    This paper discusses the performance of a system for extracting bibliographic fields from scanned pages in biomedical journals to populate MEDLINE, the flagship database of the National Library of Medicine (NLM), heavily used worldwide. This system consists of automated processes to extract the article title, author names, affiliations and abstract, and manual workstations for the entry of other required fields such as pagination, grant support information, databank accession numbers and others needed for a completed bibliographic record in MEDLINE. Labor and time data are given for (1) a wholly manual keyboarding process to create the records, (2) an OCR-based system that requires all fields except the abstract to be manually input, and (3) a more automated system that relies on document image analysis and understanding techniques for the extraction of several fields. It is shown that this last, most automated, approach requires less than 25% of the labor effort of the first, manual, process.

  5. Preventive effect of Vaccinium uliginosum L. extract and its fractions on age-related macular degeneration and its action mechanisms.

    PubMed

    Yoon, Sun-Myung; Lee, Bom-Lee; Guo, Yuan-Ri; Choung, Se-Young

    2016-01-01

    Age-related macular degeneration (AMD) is the leading cause of vision loss and blindness among the elderly. Although the pathogenesis of this disease remains obscure, several researchers have reported that death of the retinal pigmented epithelium (RPE) caused by excessive accumulation of A2E is a crucial determinant of AMD. In this study, the preventive effect of Vaccinium uliginosum L. (V.U) extract and its fractions on AMD was investigated in blue light-irradiated human RPE cells (ARPE-19 cells). Blue light-induced RPE cell death was significantly inhibited by treatment with V.U extract or its fractions. To identify the mechanism, FAB-MS analysis revealed that V.U inhibits the photooxidation of N-retinyl-N-retinylidene ethanolamine (A2E) induced by blue light in a cell-free system. Moreover, monitoring by quantitative HPLC revealed that V.U extract and its fractions also reduced intracellular accumulation of A2E, suggesting that they inhibit not only blue light-induced photooxidation but also intracellular accumulation of A2E, resulting in RPE cell survival after blue light exposure. A2E-laden cells exposed to blue light underwent apoptosis, with increases in cleaved caspase-3 and the Bax/Bcl-2 ratio; these changes were inhibited by treatment with V.U extract or quercetin-3-O-arabinofuranoside. These results suggest that V.U extract and its fractions have a preventive effect on blue light-induced damage in RPE cells and AMD.

  6. Fully automated determination of pesticides in wine.

    PubMed

    Kaufmann, A

    1997-01-01

    A fully automated solid-phase extraction gas chromatographic/mass spectrometric (SPE/GC/MS) method was developed for determination of pesticides in wine. All steps from aspiration of the filtered wine to printout of the integrated chromatogram were performed without human interaction. A dedicated robot performed addition of the internal standard, application of wine onto the SPE cartridge, elution of analytes, drying and concentration of the eluate, and transfer of the concentrate to the GC sampler. All steps were performed in standard liquid chromatography/GC vials, using a minimum of organic solvent. The method permits determination of 21 different pesticides. Individual detection limits were 0.005-0.01 mg/L. The regression coefficients relating to linearity were > 0.99; only 4,4-dichlorobenzophenone and dicofol showed lower coefficients. The recoveries for 17 pesticides ranged from 80 to 115%.

  7. Automated recognition of urinary microscopic solid particles.

    PubMed

    Almadhoun, Mohamed D; El-Halees, Alaa

    2014-03-01

    Urine analysis reveals the presence of many problems and diseases in the human body. Manual microscopic urine analysis is time-consuming, subject to human observation, and prone to mistakes. Computer-aided automatic microscopic analysis can help to overcome these problems. This paper introduces a comprehensive approach for automating the detection and recognition of microscopic urine particles. Samples of red blood cells (RBC), white blood cells (WBC), calcium oxalate, triple phosphate and other undefined images were used in experiments. Image processing functions and segmentation were applied, shape and textural features were extracted, and five classifiers were tested to obtain the best results. Repeated experiments were performed to adjust factors and produce the best evaluation results. Good performance was achieved compared with many related works. PMID:24392883

  8. GDRMS: a system for automatic extraction of the disease-centre relation

    NASA Astrophysics Data System (ADS)

    Yang, Ronggen; Zhang, Yue; Gong, Lejun

    2011-12-01

    With the rapid growth of biomedical literature, the deluge of new articles is leading to information overload. Extracting the available knowledge from this huge body of literature has become a major challenge. GDRMS is a tool that extracts disease-gene and gene-gene relationships from the biomedical literature using text mining. It is a rule-based system that also provides disease-centred network visualization, constructs a disease-gene database, and offers a gene search engine for understanding gene function. The main aim of GDRMS is to give the research community a valuable opportunity to explore disease-gene relationships in the study of disease etiology.

  10. Broccoli sprout extract induces detoxification-related gene expression and attenuates acute liver injury

    PubMed Central

    Yoshida, Kazutaka; Ushida, Yusuke; Ishijima, Tomoko; Suganuma, Hiroyuki; Inakuma, Takahiro; Yajima, Nobuhiro; Abe, Keiko; Nakai, Yuji

    2015-01-01

    AIM: To investigate the effects of broccoli sprout extract (BSEx) on liver gene expression and acute liver injury in the rat. METHODS: First, the effects of BSEx on liver gene expression were examined. Male rats were divided into two groups. The Control group was fed the AIN-76 diet, and the BSEx group was fed the AIN-76 diet containing BSEx. After a 10-d feeding period, rats were sacrificed and their livers were used for DNA microarray and real-time reverse transcription-polymerase chain reaction (RT-PCR) analyses. Next, the effects of BSEx on acute liver injury were examined. In experiments using acute liver injury models, 1000 mg/kg acetaminophen (APAP) or 350 mg/kg D-galactosamine (D-GalN) was used to induce injury. These male rats were divided into four groups: Control, BSEx, Inducer (APAP or D-GalN), and Inducer+BSEx. The feeding regimens were identical for the two analyses. Twenty-four hours following APAP administration via p.o. or D-GalN administration via i.p., rats were sacrificed to determine serum aspartate transaminase (AST) and alanine transaminase (ALT) levels, hepatic glutathione (GSH) and thiobarbituric acid-reactive substances accumulation and glutathione-S-transferase (GST) activity. RESULTS: Microarray and real-time RT-PCR analyses revealed that BSEx upregulated the expression of genes related to detoxification and glutathione synthesis in normal rat liver. The levels of AST (70.91 ± 15.74 IU/mL vs 5614.41 ± 1997.83 IU/mL, P < 0.05) and ALT (11.78 ± 2.08 IU/mL vs 1297.71 ± 447.33 IU/mL, P < 0.05) were significantly suppressed in the APAP + BSEx group compared with the APAP group. The level of GSH (2.61 ± 0.75 nmol/g tissue vs 1.66 ± 0.59 nmol/g tissue, P < 0.05) and liver GST activity (93.19 ± 16.55 U/g tissue vs 51.90 ± 16.85 U/g tissue, P < 0.05) were significantly increased in the APAP + BSEx group compared with the APAP group. AST (4820.05 ± 3094.93 IU/mL vs 12465.63 ± 3223.97 IU/mL, P < 0.05) and ALT (1808.95 ± 1014.04 IU/mL vs

  11. HITSZ_CDR: an end-to-end chemical and disease relation extraction system for BioCreative V

    PubMed Central

    Li, Haodi; Tang, Buzhou; Chen, Qingcai; Chen, Kai; Wang, Xiaolong; Wang, Baohua; Wang, Zhe

    2016-01-01

    In this article, an end-to-end system was proposed for the challenge task of disease named entity recognition (DNER) and chemical-induced disease (CID) relation extraction in BioCreative V, where DNER includes disease mention recognition (DMR) and normalization (DN). Evaluation on the challenge corpus showed that our system achieved F1-scores of 86.93% on DMR, 84.11% on DN and 43.04% on CID relation extraction. The F1-score on DMR is higher than our previous result reported by the challenge organizers (86.76%), which was the highest F1-score of the challenge. Database URL: http://database.oxfordjournals.org/content/2016/baw077 PMID:27270713
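
    The F1-scores quoted in these relation-extraction records are the harmonic mean of precision and recall; a one-line sketch, using the precision/recall figures from the hybrid gene-mutation curation record at the head of this section:

```python
def f1(precision, recall):
    """F1 is the harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Precision ~0.82 and recall ~0.70 (the hybrid curation figures above)
# combine to an F1 of roughly 0.755:
print(round(f1(0.82, 0.70), 3))  # 0.755
```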

  12. A semi-automated system for quantifying the oxidative potential of ambient particles in aqueous extracts using the dithiothreitol (DTT) assay: results from the Southeastern Center for Air Pollution and Epidemiology (SCAPE)

    NASA Astrophysics Data System (ADS)

    Fang, T.; Verma, V.; Guo, H.; King, L. E.; Edgerton, E. S.; Weber, R. J.

    2015-01-01

    A variety of methods are used to measure the capability of particulate matter (PM) to catalytically generate reactive oxygen species (ROS) in vivo, also defined as the aerosol oxidative potential. A widely used measure of aerosol oxidative potential is the dithiothreitol (DTT) assay, which monitors the depletion of DTT (a surrogate for cellular antioxidants) as catalyzed by the redox-active species in PM. However, a major constraint in the routine use of the DTT assay for integrating it with large-scale health studies is its labor-intensive and time-consuming protocol. To specifically address this concern, we have developed a semi-automated system for quantifying the oxidative potential of aerosol liquid extracts using the DTT assay. The system, capable of unattended analysis at one sample per hour, has a high analytical precision (coefficient of variation of 15% for positive control, 4% for ambient samples) and reasonably low limit of detection (0.31 nmol min-1). Comparison of the automated approach with the manual method conducted on ambient samples yielded good agreement (slope = 1.08 ± 0.12, r2 = 0.92, N = 9). The system was utilized for the Southeastern Center for Air Pollution & Epidemiology (SCAPE) to generate an extensive data set on DTT activity of ambient particles collected from contrasting environments (urban, roadside, and rural) in the southeastern US. We find that water-soluble PM2.5 DTT activity on a per-air-volume basis was spatially uniform and often well correlated with PM2.5 mass (r = 0.49 to 0.88), suggesting regional sources contributing to the PM oxidative potential in the southeastern US. The correlation may also suggest a mechanistic explanation (oxidative stress) for observed PM2.5 mass-health associations. The heterogeneity in the intrinsic DTT activity (per-PM-mass basis) across seasons indicates variability in the DTT activity associated with aerosols from sources that vary with season. Although developed for the DTT assay, the
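
    The DTT activity figures above (nmol min-1, subsequently normalized per air volume or per PM mass) reduce to a slope over timed DTT readings. The readings below are hypothetical, not SCAPE data:

```python
def dtt_rate(times_min, dtt_nmol):
    """DTT consumption rate: the negated least-squares slope of
    remaining DTT (nmol) versus reaction time (min)."""
    n = len(times_min)
    mt = sum(times_min) / n
    md = sum(dtt_nmol) / n
    num = sum((t - mt) * (d - md) for t, d in zip(times_min, dtt_nmol))
    den = sum((t - mt) ** 2 for t in times_min)
    return -num / den

times = [0, 10, 20, 30]                # min (hypothetical readings)
remaining = [100.0, 96.9, 93.8, 90.7]  # nmol DTT left in the reaction vial
rate = dtt_rate(times, remaining)      # nmol/min consumed by the PM extract
print(round(rate, 2))                  # 0.31 nmol/min
# Dividing the rate by sampled air volume gives per-air-volume activity;
# dividing instead by PM mass gives the intrinsic (per-mass) activity.
```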

  13. A Method of Extracting Sentences Related to Protein Interaction from Literature using a Structure Database

    NASA Astrophysics Data System (ADS)

    Kaneta, Yoshikazu; Munna, Md. Ahaduzzaman; Ohkawa, Takenao

    Because a protein expresses its function through interaction with other substrates, it is vital to build a database of protein interactions. Since information on protein interactions is scattered across thousands of papers, it is nearly impossible to extract all of it manually. Although extraction systems for interaction information based on template matching have already been developed, they cannot match every sentence containing interaction information because of the complexity of sentence structure. We propose a method of extracting sentences with interaction information that is independent of sentence structure. In a protein-compound complex structure, the interacting residue is close to its partner. The distance between them can be calculated from the structure data in the PDB database, and a short distance indicates that sentences associated with the pair might describe interaction information. For a free-protein structure, the distance cannot be calculated because the coordinates of the protein's partner are not registered in the structure data; we therefore use homologous protein structure data in which the protein is complexed with its partner. The proposed method was applied to seven papers on protein-compound complexes and four papers on free proteins, obtaining F-measures of 71% and 72%, respectively.
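
    The distance test at the heart of this method can be sketched directly: given atom coordinates from PDB ATOM/HETATM records, compute the shortest residue-to-partner distance and compare it with a contact cutoff. The coordinates and the 4 Å cutoff below are illustrative assumptions:

```python
import math

def min_distance(residue_atoms, partner_atoms):
    """Shortest atom-atom distance (angstroms) between a residue and its
    partner; a small value suggests the pair interacts in the complex."""
    return min(math.dist(a, b)
               for a in residue_atoms for b in partner_atoms)

# Hypothetical Cartesian coordinates, as found in PDB ATOM/HETATM records.
residue = [(12.1, 4.0, 7.7), (13.0, 5.2, 8.1)]
ligand  = [(14.2, 6.0, 8.5), (20.0, 11.0, 3.0)]

d = min_distance(residue, ligand)
print(d < 4.0)  # within a typical contact cutoff -> candidate interaction
```

    Residue-compound pairs passing the cutoff mark the sentences mentioning them as likely carriers of interaction information, regardless of how those sentences are phrased.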

  14. Celery Seed and Related Extracts with Antiarthritic, Antiulcer, and Antimicrobial Activities.

    PubMed

    Powanda, Michael C; Whitehouse, Michael W; Rainsford, K D

    2015-01-01

    Celery preparations have been used extensively for several millennia as natural therapies for acute and chronic painful or inflammatory conditions. This chapter reviews some of the biological and chemical properties of various celery preparations that have been used as natural remedies. Many of these have varying activities and product qualities. A fully standardized celery preparation, an alcoholic extract of the seeds of a plant source from northern India, has been prepared. Termed Celery Seed Extract (CSE), it has been found to be at least as effective as aspirin, ibuprofen and naproxen in suppressing arthritis in a model of polyarthritis. CSE can also reduce existing inflammation in rats and has been shown to provide analgesia in two model systems. In addition to acting as an analgesic and anti-inflammatory agent, CSE has been shown to protect against and/or reduce gastric irritation caused by NSAIDs, as well as act synergistically with them to reduce inflammation. The CSE was fractionated by organic solvent extraction, subjected to column chromatography followed by HPLC, and characterized by mass spectrometry. This yielded a purified component with specific inhibitory effects on Helicobacter pylori that was not active against Campylobacter jejuni or Escherichia coli. Additionally, toxicology studies did not reveal any clear signs of toxicity at doses relevant to human use. Unlike many dietary supplements, the available data suggest that CSE does not significantly affect the cytochrome P450 enzyme systems and is thus less likely to alter the metabolism of drugs an individual may be taking. CSE may be a prototype of a natural product that can be used therapeutically to treat arthritis and other inflammatory diseases. PMID:26462366

  15. Antiviral activity of Plantago major extracts and related compounds in vitro.

    PubMed

    Chiang, L C; Chiang, W; Chang, M Y; Ng, L T; Lin, C C

    2002-07-01

    Plantago major L., a popular traditional Chinese medicine, has long been used for treating various diseases ranging from the common cold to viral hepatitis. The aim of the present study was to examine the antiviral activity of an aqueous extract and pure compounds of P. major. Studies were conducted on a series of viruses, namely herpesviruses (HSV-1, HSV-2) and adenoviruses (ADV-3, ADV-8, ADV-11). The antiviral activity (EC50) was defined as the concentration that achieved 50% cytoprotection against virus infection, and the selectivity index (SI) was determined as the ratio of CC50 (concentration of 50% cellular cytotoxicity) to EC50. Results showed that the aqueous extract of P. major possessed only slight anti-herpesvirus activity. In contrast, certain pure compounds belonging to the five different classes of chemicals found in extracts of this plant exhibited potent antiviral activity. Among them, caffeic acid exhibited the strongest activity against HSV-1 (EC50 = 15.3 microg/ml, SI = 671), HSV-2 (EC50 = 87.3 microg/ml, SI = 118) and ADV-3 (EC50 = 14.2 microg/ml, SI = 727), whereas chlorogenic acid possessed the strongest anti-ADV-11 activity (EC50 = 13.3 microg/ml, SI = 301). The study concludes that the antiviral pure compounds of P. major are mainly phenolic compounds, especially caffeic acid. Its mode of action against HSV-2 and ADV-3 was found to be at the multiplication stages (postinfection of HSV-1: 0-12 h; ADV-3: 0-2 h), and with SI values greater than 400, suggesting the potential use of this compound for treatment of infection by these two viruses. PMID:12076751
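
    The selectivity index used throughout this abstract is a simple ratio; a sketch, back-calculating a CC50 from the reported caffeic acid figures (the back-calculation is ours, not the paper's):

```python
def selectivity_index(cc50, ec50):
    """SI = CC50 / EC50: the margin between cytotoxic and effective doses."""
    return cc50 / ec50

# Reported for caffeic acid against HSV-1: EC50 = 15.3 microg/ml, SI = 671,
# which implies CC50 = SI * EC50 (about 10266 microg/ml):
cc50 = 671 * 15.3
print(round(selectivity_index(cc50, 15.3)))  # 671
```

    A large SI (the paper highlights values above 400) means cytotoxicity appears only far above the antiviral dose.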

  18. Relative contributions of hypoxia and natural gas extraction to atmospheric methane emissions from Lake Erie

    NASA Astrophysics Data System (ADS)

    Disbennett, D. A.; Townsend-Small, A.; Bourbonniere, R.; Mackay, R.

    2013-12-01

    Reduced oxygen availability in lakes due to summer stratification can create conditions suitable for methanogenic activity, which ultimately contributes to atmospheric methane emissions. Lake Erie has persistently low oxygen conditions in bottom waters during summer, which contributes to methane production through anaerobic organic matter respiration. Lake Erie also has substantial subsurface natural gas deposits that are currently being extracted in Canadian waters. We hypothesized that the lake would be a source of methane to the atmosphere in late summer, prior to fall turnover, and that natural gas wells and pipelines would contribute additional methane emissions in resource extraction areas in Canadian waters. Initial sampling was conducted at a total of 20 sites in central and western Lake Erie during early September 2012. Sites were selected to cover a wide range of environmental conditions in order to better establish the baseline flux from these areas: an array of offshore sites, sites in a very shallow bay, and sites within the Canadian gas fields. Air samples were gathered using floating flux chambers tethered to the research vessel, and dissolved gas water samples were collected using a Van Dorn bottle. We found a consistent positive flux of methane throughout the lake during late summer, with flux rates adjacent to natural gas pipelines up to an order of magnitude greater than elsewhere. Stable isotope analysis yielded results that were not entirely expected: the δ13C of surface samples from areas of fossil fuel extraction and from suspected biogenic sources was very similar, likely due to oxidation of methane in the water column. Additional sampling during 2012 and 2013, concentrating on bottom waters and surface fluxes, should allow us to further constrain sources of CH4 from Lake Erie. This project is an effort to constrain the global warming potential of hypoxia in the Great Lakes, and

  19. 49 CFR 238.445 - Automated monitoring.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 4 2012-10-01 2012-10-01 false Automated monitoring. 238.445 Section 238.445 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD ADMINISTRATION... Equipment § 238.445 Automated monitoring. (a) Each passenger train shall be equipped to monitor...

  20. 49 CFR 238.445 - Automated monitoring.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 4 2013-10-01 2013-10-01 false Automated monitoring. 238.445 Section 238.445 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD ADMINISTRATION... Equipment § 238.445 Automated monitoring. (a) Each passenger train shall be equipped to monitor...

  1. 49 CFR 238.445 - Automated monitoring.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 4 2014-10-01 2014-10-01 false Automated monitoring. 238.445 Section 238.445 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD ADMINISTRATION... Equipment § 238.445 Automated monitoring. (a) Each passenger train shall be equipped to monitor...

  2. 49 CFR 238.445 - Automated monitoring.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 4 2011-10-01 2011-10-01 false Automated monitoring. 238.445 Section 238.445 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD ADMINISTRATION... Equipment § 238.445 Automated monitoring. (a) Each passenger train shall be equipped to monitor...

  3. Sieve-based coreference resolution enhances semi-supervised learning model for chemical-induced disease relation extraction

    PubMed Central

    Le, Hoang-Quynh; Tran, Mai-Vu; Dang, Thanh Hai; Ha, Quang-Thuy; Collier, Nigel

    2016-01-01

    The BioCreative V chemical-disease relation (CDR) track was proposed to accelerate the progress of text mining in facilitating integrative understanding of chemicals, diseases and their relations. In this article, we describe an extension of our system (namely UET-CAM) that participated in the BioCreative V CDR task. The original UET-CAM system was ranked fourth among 18 participating systems by the BioCreative CDR track committee. In the Disease Named Entity Recognition and Normalization (DNER) phase, our system employed joint inference (decoding) with a perceptron-based named entity recognizer (NER) and a back-off model with Semantic Supervised Indexing and Skip-gram for named entity normalization. In the chemical-induced disease (CID) relation extraction phase, we proposed a pipeline that includes a coreference resolution module and a Support Vector Machine relation extraction model; the former utilizes a multi-pass sieve to extend entity recall. The UET-CAM system was then improved by adding a 'silver' CID corpus to train the prediction model. This silver-standard corpus of more than 50 thousand sentences was built automatically from the Comparative Toxicogenomics Database (CTD). We evaluated our method on the CDR test set. Results showed that our system reaches state-of-the-art performance with F1 of 82.44 for the DNER task and 58.90 for the CID task. Analysis demonstrated substantial benefits of both the multi-pass sieve coreference resolution method (F1 +4.13%) and the silver CID corpus (F1 +7.3%). Database URL: SilverCID, the silver-standard corpus for CID relation extraction, is freely available online at: https://zenodo.org/record/34530 (doi:10.5281/zenodo.34530).

  6. Knowledge, attitudes, and performance of dental students in relation to sterilization/disinfection methods of extracted human teeth

    PubMed Central

    Hashemipour, Maryam Alsadat; Mozafarinia, Romina; Mirzadeh, Azin; Aramon, Moien; Nassab, Sayed Amir Hossein Gandjalikhan

    2013-01-01

    Background: Dental students use extracted human teeth to learn practical and technical skills before they enter the clinical environment. In the present research, knowledge, performance and attitudes toward sterilization/disinfection methods for extracted human teeth were evaluated in a selected group of Iranian dental students. Materials and Methods: In this descriptive cross-sectional study the subjects were fourth-, fifth- and sixth-year dental students. Data were collected by questionnaire and analyzed by Fisher's exact test and the chi-squared test using SPSS 11.5. Results: In this study, 100 dental students participated. The average knowledge score was 15.9 ± 4.8. Eighty-one students considered sodium hypochlorite a suitable material for sterilization, and 78 believed that oven sterilization is a good method for the purpose. The average performance score was 4.1 ± 0.8 (3.9 ± 1.7 for males and 4.3 ± 1.1 for females), with no significant difference between the sexes. The maximum and minimum attitude scores were 60 and 25, with an average of 53.1 ± 5.2. Conclusion: The results indicated that the knowledge, performance and attitudes of dental students in relation to sterilization/disinfection methods for extracted human teeth were good. However, weaknesses were observed in relation to teaching and to materials suitable for sterilization. PMID:24130583

  7. Multivariate calibration for the determination of total azadirachtin-related limonoids and simple terpenoids in neem extracts using vanillin assay.

    PubMed

    Dai, J; Yaylayan, V A; Raghavan, G S; Parè, J R; Liu, Z

    2001-03-01

    Two-component and multivariate calibration techniques were developed for the simultaneous quantification of total azadirachtin-related limonoids (AZRL) and simple terpenoids (ST) in neem extracts using vanillin assay. A mathematical modeling method was also developed to aid in the analysis of the spectra and to simplify the calculations. The mathematical models were used in a two-component calibration (using azadirachtin and limonene as standards) for samples containing mainly limonoids and terpenoids (such as neem seed kernel extracts). However, for the extracts from other parts of neem, such as neem leaf, a multivariate calibration was necessary to eliminate the possible interference from phenolics and other components in order to obtain the accurate content of AZRL and ST. It was demonstrated that the accuracy of the vanillin assay in predicting the content of azadirachtin in a model mixture containing limonene (25% w/w) can be improved from 50% overestimation to 95% accuracy using the two-component calibration, while predicting the content of limonene with 98% accuracy. Both calibration techniques were applied to estimate the content of AZRL and ST in different parts of the neem plant. The results of this study indicated that the relative content of limonoids was much higher than that of the terpenoids in all parts of the neem plant studied. PMID:11312830
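
    A two-component calibration of this kind assumes the assay responses of the two analyte classes are additive, so readings at two channels give a 2x2 linear system in the two concentrations. The calibration slopes and concentrations below are illustrative assumptions, not the paper's data:

```python
def solve_two_component(k, responses):
    """Solve the 2x2 linear system K*c = r for concentrations c, where
    k[i][j] is the response at channel i per unit of component j.
    Assumes the two components' assay responses are additive."""
    (a, b), (c, d) = k
    r1, r2 = responses
    det = a * d - b * c
    return ((r1 * d - r2 * b) / det, (r2 * a - r1 * c) / det)

# Hypothetical slopes (absorbance per mg/ml) for azadirachtin-related
# limonoids (AZRL) and simple terpenoids (ST) at two detection channels:
K = [[0.80, 0.30],
     [0.20, 0.90]]
# Simulated readings for a mixture of 1.5 mg/ml AZRL and 2.0 mg/ml ST:
measured = (0.80 * 1.5 + 0.30 * 2.0,
            0.20 * 1.5 + 0.90 * 2.0)
print(tuple(round(x, 3) for x in solve_two_component(K, measured)))  # (1.5, 2.0)
```

    The multivariate case extends the same idea to more channels and components (e.g. a phenolics term for leaf extracts), solved by least squares rather than by exact inversion.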

  8. SU-D-BRD-03: Improving Plan Quality with Automation of Treatment Plan Checks

    SciTech Connect

    Covington, E; Younge, K; Chen, X; Lee, C; Matuszak, M; Kessler, M; Acosta, E; Orow, A; Filpansick, S; Moran, J; Keranen, W

    2015-06-15

    Purpose: To evaluate the effectiveness of an automated plan check tool to improve first-time plan quality as well as standardize and document performance of physics plan checks. Methods: The Plan Checker Tool (PCT) uses the Eclipse Scripting API to check and compare data from the treatment planning system (TPS) and treatment management system (TMS). PCT was created to improve first-time plan quality, reduce patient delays, increase efficiency of our electronic workflow, and to standardize and partially automate plan checks in the TPS. A framework was developed which can be configured with different reference values and types of checks. One example is the prescribed dose check where PCT flags the user when the planned dose and the prescribed dose disagree. PCT includes a comprehensive checklist of automated and manual checks that are documented when performed by the user. A PDF report is created and automatically uploaded into the TMS. Prior to and during PCT development, errors caught during plan checks and also patient delays were tracked in order to prioritize which checks should be automated. The most common and significant errors were determined. Results: Nineteen of 33 checklist items were automated with data extracted with the PCT. These include checks for prescription, reference point and machine scheduling errors which are three of the top six causes of patient delays related to physics and dosimetry. Since the clinical roll-out, no delays have been due to errors that are automatically flagged by the PCT. Development continues to automate the remaining checks. Conclusion: With PCT, 57% of the physics plan checklist has been partially or fully automated. Treatment delays have declined since release of the PCT for clinical use. By tracking delays and errors, we have been able to measure the effectiveness of automating checks and are using this information to prioritize future development. This project was supported in part by P01CA059827.
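
    The PCT itself is built on the Eclipse Scripting API; as a language-neutral illustration, an automated check such as the prescribed-dose comparison reduces to a tolerance test between TPS and TMS values. The function name, units and tolerance below are hypothetical:

```python
def check_prescribed_dose(planned_cgy, prescribed_cgy, tolerance_cgy=0.5):
    """Flag the plan when planned and prescribed dose disagree beyond a
    tolerance, mimicking a PCT-style comparison of TPS vs TMS values.
    Returns (passed, message)."""
    ok = abs(planned_cgy - prescribed_cgy) <= tolerance_cgy
    return ok, None if ok else (
        f"Dose mismatch: planned {planned_cgy} cGy vs prescribed {prescribed_cgy} cGy")

print(check_prescribed_dose(6000, 6000)[0])  # True  -- check passes
print(check_prescribed_dose(5940, 6000)[0])  # False -- flagged for review
```

    Aggregating many such checks into a documented report is what lets the checklist be partially automated while the remaining items stay manual.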

  9. Automation U.S.A.: Overcoming Barriers to Automation.

    ERIC Educational Resources Information Center

    Brody, Herb

    1985-01-01

    Although labor unions and inadequate technology play minor roles, the principal barrier to factory automation is "fear of change." Related problems include long-term benefits, nontechnical executives, and uncertainty of factory cost accounting. Industry support for university programs is helping to educate engineers to design, implement, and…

  10. Automated solid-phase extraction and high-performance liquid chromatographic determination of nitrosamines using post-column photolysis and tris(2,2'-bipyridyl) ruthenium(III) chemiluminescence.

    PubMed

    Pérez-Ruiz, Tomás; Martínez-Lozano, Carmen; Tomás, Virginia; Martín, Jesús

    2005-06-01

    A sensitive and selective post-column detection system for nitrosamines is described. The principle upon which the detector works is that UV irradiation of aqueous solutions of nitrosamines leads to cleavage of the N-NO bond. The amine generated is subsequently detected by chemiluminescence using tris(2,2'-bipyridyl) ruthenium(III), which is generated on-line by photo-oxidation of the ruthenium(II) complex in the presence of peroxydisulfate. Factors affecting the photochemical and chemiluminescent reactions were optimized to minimize their contribution to the total band-broadening. This detection system was tested for N-nitrosodimethylamine, N-nitrosodiethylamine, N-nitrosomorpholine, N-nitrosopiperidine and N-nitrosopyrrolidine, which were separated on an ODS column by isocratic reversed-phase chromatography with acetonitrile-water containing 5 mM acetate buffer at pH 4.0. A linear relationship between analyte concentration and peak area was obtained within the range 0.13-500 microg l(-1), with correlation coefficients greater than 0.9995 and detection limits of between 0.03 and 0.76 microg l(-1). Intra- and inter-day precision values of about 1.2% RSD (n = 11) and 2.5% RSD (n = 10), respectively, were obtained. Depending on the nitrosamine in question, the sensitivity is 9 to 280 times higher than that of UV detection. An automated solid-phase extraction (SPE) system was used in conjunction with HPLC to determine nitrosamine residues in waters. Detection limits within the range 0.10-3.0 ng l(-1) were achieved with only 250 ml of sample.
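    The reported linear range and correlation coefficients come from a calibration line relating peak area to analyte concentration. A minimal sketch of that computation (illustrative only; the function names are not from the paper):

```python
# Ordinary least-squares calibration line: fit peak area against known
# standard concentrations, then invert the line to quantify an unknown.
def fit_line(x, y):
    """Return (slope, intercept) of the least-squares line y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

def concentration_from_area(area, slope, intercept):
    """Invert the calibration line to estimate concentration from peak area."""
    return (area - intercept) / slope
```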

  11. Development of an automated on-line solid-phase extraction-high-performance liquid chromatographic method for the analysis of aniline, phenol, caffeine and various selected substituted aniline and phenol compounds in aqueous matrices.

    PubMed

    Patsias, J; Papadopoulou-Mourkidou, E

    2000-12-29

    A fully automated solid-phase extraction (SPE)-high-performance liquid chromatographic method has been developed for the simultaneous analysis of substituted anilines and phenols in aqueous matrices at the low- to sub-microg/l level. Diode array and electrochemical detection operated in tandem mode were used for analyte detection. Two new polymeric sorbent materials (Hysphere-GP and Hysphere-SH) were evaluated for the on-line SPE of substituted anilines and phenols from aqueous matrices and their performance was compared with the PRP-1 and PLRP-S sorbents. Hysphere-GP sorbent packed in 10 x 2 mm cartridges was found to give better results in terms of sensitivity and selectivity of the overall analytical method. The proposed analytical method was validated for the analysis of these compounds in Axios river water that receives industrial, communal and agricultural wastes. The detection limits for all the compounds range between 0.05 and 0.2 microg/l, except for aniline and phenol which have detection limits of 0.5 and 1 microg/l, respectively (aniline detected by electrochemical detection). The recoveries for all the compounds are higher than 75% except for aniline (6%), phenol (50%) and 3-chlorophenol (67%). Finally, in order to evaluate the efficiency of the Hysphere-GP (10 x 2 mm) cartridges for sample stabilization and storage, the stability of the compounds of interest at the sorbed state onto these cartridges has been evaluated under three different temperature regimes (deep freeze, refrigeration, 20 degrees C).

  12. Automated External Defibrillator

    MedlinePlus

    What Is an Automated External Defibrillator? An automated external defibrillator (AED) is a portable device that ...

  13. Antioxidant Activity and Thermal Stability of Oleuropein and Related Phenolic Compounds of Olive Leaf Extract after Separation and Concentration by Salting-Out-Assisted Cloud Point Extraction.

    PubMed

    Stamatopoulos, Konstantinos; Katsoyannos, Evangelos; Chatzilazarou, Arhontoula

    2014-04-08

    A fast, clean, energy-saving, non-toxic method for the stabilization of the antioxidant activity and the improvement of the thermal stability of oleuropein and related phenolic compounds separated from olive leaf extract via salting-out-assisted cloud point extraction (CPE) was developed using Tween 80. The process was based on the decrease of the solubility of polyphenols and the lowering of the cloud point temperature of Tween 80 due to the presence of elevated amounts of sulfates (salting-out), followed by separation from the bulk solution by centrifugation. The optimum conditions were chosen based on polyphenol recovery (%), phase volume ratio (Vs/Vw) and concentration factor (Fc). The maximum total recovery of polyphenols was 95.9%; Vs/Vw was 0.075 and Fc was 15 under the following conditions: pH 2.6, ambient temperature (25 °C), 4% Tween 80 (w/v), 35% Na₂SO₄ (w/v) and a settling time of 5 min. The total recovery of oleuropein, hydroxytyrosol, luteolin-7-O-glucoside, verbascoside and apigenin-7-O-glucoside, at optimum conditions, was 99.8%, 93.0%, 87.6%, 99.3% and 100.0%, respectively. Polyphenolic compounds entrapped in the surfactant-rich phase (Vs) showed higher thermal stability (activation energy (Ea) 23.8 kJ/mol) compared to non-entrapped ones (Ea 76.5 kJ/mol). The antioxidant activity of the separated polyphenols remained unaffected, as determined by the 1,1-diphenyl-2-picrylhydrazyl method.
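    The three figures of merit quoted above (recovery, Vs/Vw and Fc) can be computed from the phase volumes and concentrations. The definitions below are plausible assumptions for illustration, not formulas taken from the paper:

```python
# Hedged sketch of the three CPE figures of merit; the exact definitions are
# assumptions, not taken from the abstract above.
def cpe_figures(c0, v0, cs, vs, vw):
    """c0/v0: initial concentration and volume; cs/vs: concentration and
    volume of the surfactant-rich phase; vw: aqueous (dilute) phase volume."""
    recovery_pct = 100.0 * (cs * vs) / (c0 * v0)   # mass fraction recovered
    phase_ratio = vs / vw                          # Vs/Vw
    concentration_factor = cs / c0                 # Fc
    return recovery_pct, phase_ratio, concentration_factor
```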

  14. Antioxidant Activity and Thermal Stability of Oleuropein and Related Phenolic Compounds of Olive Leaf Extract after Separation and Concentration by Salting-Out-Assisted Cloud Point Extraction

    PubMed Central

    Stamatopoulos, Konstantinos; Katsoyannos, Evangelos; Chatzilazarou, Arhontoula

    2014-01-01

    A fast, clean, energy-saving, non-toxic method for the stabilization of the antioxidant activity and the improvement of the thermal stability of oleuropein and related phenolic compounds separated from olive leaf extract via salting-out-assisted cloud point extraction (CPE) was developed using Tween 80. The process was based on the decrease of the solubility of polyphenols and the lowering of the cloud point temperature of Tween 80 due to the presence of elevated amounts of sulfates (salting-out) and the separation from the bulk solution with centrifugation. The optimum conditions were chosen based on polyphenols recovery (%), phase volume ratio (Vs/Vw) and concentration factor (Fc). The maximum recovery of polyphenols was in total 95.9%; Vs/Vw was 0.075 and Fc was 15 at the following conditions: pH 2.6, ambient temperature (25 °C), 4% Tween 80 (w/v), 35% Na2SO4 (w/v) and a settling time of 5 min. The total recovery of oleuropein, hydroxytyrosol, luteolin-7-O-glucoside, verbascoside and apigenin-7-O-glucoside, at optimum conditions, was 99.8%, 93.0%, 87.6%, 99.3% and 100.0%, respectively. Polyphenolic compounds entrapped in the surfactant-rich phase (Vs) showed higher thermal stability (activation energy (Ea) 23.8 kJ/mol) compared to non-entrapped ones (Ea 76.5 kJ/mol). The antioxidant activity of separated polyphenols remained unaffected as determined by the 1,1-diphenyl-2-picrylhydrazyl method. PMID:26784869

  15. Automation: triumph or trap?

    PubMed

    Smythe, M H

    1997-01-01

    Automation, a hot topic in the laboratory world today, can be a very expensive option. Those who are considering implementing automation can save time and money by examining the issues from the standpoint of an industrial/manufacturing engineer. The engineer not only asks what problems will be solved by automation, but what problems will be created. This article discusses questions that must be asked and answered to ensure that automation efforts will yield real and substantial payoffs.

  16. Automated labeling in document images

    NASA Astrophysics Data System (ADS)

    Kim, Jongwoo; Le, Daniel X.; Thoma, George R.

    2000-12-01

    The National Library of Medicine (NLM) is developing an automated system to produce bibliographic records for its MEDLINE database. This system, named the Medical Article Record System (MARS), employs document image analysis and understanding techniques and optical character recognition (OCR). This paper describes a key module in MARS called the Automated Labeling (AL) module, which labels all zones of interest (title, author, affiliation, and abstract) automatically. The AL algorithm is based on 120 rules derived from an analysis of journal page layouts and features extracted from OCR output. Experiments carried out on more than 11,000 articles in over 1,000 biomedical journals show the accuracy of this rule-based algorithm to exceed 96%.
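    A drastically simplified sketch of rule-based zone labeling in the spirit of the AL module. The real module applies 120 rules over layout and OCR features; the three toy rules and feature names below are invented for illustration:

```python
# Toy rule-based zone labeler: assign title/author/abstract labels from a
# zone's vertical position and font size. The thresholds and feature names
# are hypothetical, not the AL module's actual rules.
def label_zone(zone: dict) -> str:
    """zone: {'y': vertical position from page top (0..1),
              'font_size': point size, 'text': OCR text}."""
    text = zone["text"].lower()
    if zone["y"] < 0.15 and zone["font_size"] >= 14:
        return "title"            # large type near the top of the page
    if zone["y"] < 0.30 and ("," in zone["text"] or "and" in text):
        return "author"           # name lists typically contain separators
    if text.startswith("abstract"):
        return "abstract"
    return "other"
```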

  17. Workflow automation architecture standard

    SciTech Connect

    Moshofsky, R.P.; Rohen, W.T.

    1994-11-14

    This document presents an architectural standard for application of workflow automation technology. The standard includes a functional architecture, process for developing an automated workflow system for a work group, functional and collateral specifications for workflow automation, and results of a proof of concept prototype.

  18. Image segmentation for automated dental identification

    NASA Astrophysics Data System (ADS)

    Haj Said, Eyad; Nassar, Diaa Eldin M.; Ammar, Hany H.

    2006-02-01

    Dental features are one of the few biometric identifiers that qualify for postmortem identification; therefore, the creation of an Automated Dental Identification System (ADIS), with goals and objectives similar to those of the Automated Fingerprint Identification System (AFIS), has received increased attention. As part of ADIS, teeth segmentation from dental radiograph films is an essential step in the identification process. In this paper, we introduce a fully automated approach to teeth segmentation with the goal of extracting at least one tooth from the dental radiograph film. We evaluate our approach on theoretical and empirical bases, and we compare its performance with that of other approaches introduced in the literature. The results show that our approach exhibits the lowest failure rate and the highest optimality among all fully automated approaches introduced in the literature.

  19. Automated microbiological assay of thiamin in serum and red cells.

    PubMed Central

    Icke, G; Nicol, D

    1994-01-01

    AIMS--To develop a sensitive, direct, automated method for the measurement of serum and red cell thiamin. METHODS--A microbiological assay using a chloramphenicol resistant strain of Lactobacillus fermenti as the test organism was developed. Addition of chloramphenicol and cycloheximide to the assay medium suppressed bacterial and yeast contamination and enabled tests to be automated without recourse to aseptic procedures. Evaluation of the assay included precision analysis and estimation of thiamin recovery. Results obtained on red cell extracts were compared with an established colorimetric (thiochrome) method. RESULTS--Acceptable intrabatch and interbatch precision was achieved, and good recovery of thiamin added to serum was obtained. Non-parametric reference ranges based on the results from 505 healthy people were: serum thiamin 11.3-35.0 nmol/l and red cell thiamin 190-400 nmol/l. Results were not age or gender related. The method gave results for red cell thiamin which were significantly higher than those obtained with an established thiochrome method. CONCLUSIONS--This automated microbiological assay is sensitive to 2.0 nmol/l of thiamin and allows tests to be set up at a rate of 100 per hour; after 20-22 hours of incubation, results can be read at a rate of 60 per hour. The method has proved reliable, suitable for the assay of large numbers of samples, and relatively inexpensive to perform. PMID:8089221

  20. 22 CFR 120.30 - The Automated Export System (AES).

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 22 Foreign Relations 1 2013-04-01 2013-04-01 false The Automated Export System (AES). 120.30... DEFINITIONS § 120.30 The Automated Export System (AES). The Automated Export System (AES) is the Department of... data and defense services shall be reported directly to the Directorate of Defense Trade Controls...

  1. 22 CFR 120.30 - The Automated Export System (AES).

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 22 Foreign Relations 1 2014-04-01 2014-04-01 false The Automated Export System (AES). 120.30... DEFINITIONS § 120.30 The Automated Export System (AES). The Automated Export System (AES) is the Department of... data and defense services shall be reported directly to the Directorate of Defense Trade Controls...

  2. 22 CFR 120.30 - The Automated Export System (AES).

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 22 Foreign Relations 1 2012-04-01 2012-04-01 false The Automated Export System (AES). 120.30... DEFINITIONS § 120.30 The Automated Export System (AES). The Automated Export System (AES) is the Department of... data and defense services shall be reported directly to the Directorate of Defense Trade Controls...

  3. Expression pattern of sonic hedgehog signaling and calcitonin gene-related peptide in the socket healing process after tooth extraction.

    PubMed

    Pang, Pai; Shimo, Tsuyoshi; Takada, Hiroyuki; Matsumoto, Kenichi; Yoshioka, Norie; Ibaragi, Soichiro; Sasaki, Akira

    2015-11-01

    Sonic Hedgehog (SHH), a neural development inducer, plays a significant role in the bone healing process. Calcitonin gene-related peptide (CGRP), a neuropeptide marker of sensory nerves, has been demonstrated to affect bone formation. The roles of SHH signaling and CGRP-positive sensory nerves in the alveolar bone formation process have been unknown. Here we examined the expression patterns of SHH signaling and CGRP in mouse socket by immunohistochemistry and immunofluorescence analysis. We found that the expression level of SHH peaked at day 3 and was then decreased at 5 days after tooth extraction. CGRP, PTCH1 and GLI2 were each expressed in a similar pattern with their highest expression levels at day 5 and day 7 after tooth extraction. CGRP and GLI2 were co-expressed in some inflammatory cells and bone forming cells. In some areas, CGRP-positive neurons expressed GLI2. In conclusion, SHH may affect alveolar bone healing by interacting with CGRP-positive sensory neurons and thus regulate the socket's healing process after tooth extraction. PMID:26427874

  4. An extract from Taxodium distichum targets hemagglutinin- and neuraminidase-related activities of influenza virus in vitro

    PubMed Central

    Hsieh, Chung-Fan; Chen, Yu-Li; Lin, Chwan-Fwu; Ho, Jin-Yuan; Huang, Chun-Hsun; Chiu, Cheng-Hsun; Hsieh, Pei-Wen; Horng, Jim-Tong

    2016-01-01

    Influenza virus remains an emerging virus and causes pandemics with high fatality. After screening different plant extracts for potential anti-influenza activity, a water extract of Taxodium distichum stems (TDSWex) showed excellent activity against influenza viruses. The EC50 of TDSWex was 0.051 ± 0.024 mg/mL against influenza virus A/WSN/33. TDSWex had excellent antiviral efficacy against various strains of human influenza A and B viruses, particularly oseltamivir-resistant clinical isolates and a swine-origin influenza strain. We observed that the synthesis of viral RNA and protein was inhibited in the presence of TDSWex. The results of the time-of-addition assay suggested that TDSWex inhibited viral entry and budding. In the hemagglutination inhibition assay, TDSWex inhibited the hemagglutination of red blood cells, implying that the extract targeted hemagglutinin-related functions such as viral entry. In the attachment and penetration assay, TDSWex showed antiviral activity with EC50s of 0.045 ± 0.026 and 0.012 ± 0.003 mg/mL, respectively. In addition, TDSWex blocked neuraminidase activity. We conclude that TDSWex has bimodal activities against both hemagglutinin and neuraminidase during viral replication. PMID:27796330

  5. Salvia officinalis L.: composition and antioxidant-related activities of a crude extract and selected sub-fractions.

    PubMed

    Koşar, Müberra; Dorman, H J Damien; Başer, K Hüsnü Can; Hiltunen, Raimo

    2010-09-01

    The composition and antioxidant properties of a methanol:acetic acid (99:1, v/v) soluble crude extract isolated from S. officinalis L. leaves through maceration, and of selected fractions isolated therefrom, are presented in this study. The total phenol content was estimated as gallic acid equivalents, whilst the qualitative-quantitative phenolic content was determined using high performance liquid chromatography with photodiode array detection. Antioxidant evaluation consisted of ferric reductive capacity and 1,1-diphenyl-2-picrylhydrazyl and hydroxyl free radical scavenging determinations. The crude extract contained hydroxybenzoic acids, hydroxycinnamic acids, flavonoids and diterpenoids; caffeic acid, carnosic acid, luteolin, luteolin-7-O-glucoside and rosmarinic acid were identified from their chromatographic and spectral characteristics and quantified from their respective calibration curves. The crude extract and sub-fractions demonstrated varying degrees of efficacy in the antioxidant-related assays used, except the n-hexane fraction, which was unable to reduce iron(III) at reasonable concentrations. Although the positive controls, ascorbic acid, BHA and BHT, were more potent than the S. officinalis samples, two fractions were significantly (p < 0.05) more potent iron(III) reducing agents than pycnogenol, a proanthocyanidin-rich commercial preparation.

  6. Automation of industrial bioprocesses.

    PubMed

    Beyeler, W; DaPra, E; Schneider, K

    2000-01-01

    The dramatic development of new electronic devices within the last 25 years has had a substantial influence on the control and automation of industrial bioprocesses. Within this short period, the way industrial bioprocesses are controlled has changed completely. In this paper, the authors take a practical approach focusing on the industrial applications of automation systems. Some milestones are highlighted, from the early attempts to use computers for the automation of biotechnological processes up to modern process automation systems. Special attention is given to the influence of standards and guidelines on the development of automation systems.

  7. Automation in Clinical Microbiology

    PubMed Central

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  8. Tumorigenesis of diesel exhaust, gasoline exhaust, and related emission extracts on SENCAR mouse skin

    SciTech Connect

    Nesnow, S; Triplett, L L; Slaga, T J

    1980-01-01

    The tumorigenicity of diesel exhaust particulate emissions was examined using a sensitive mouse skin tumorigenesis model (SENCAR). The tumorigenic potency of particulate emissions from diesel, gasoline, and related emission sources was compared.

  9. In Vitro Antimicrobial Activity of Extracts from Plants Used Traditionally in South Africa to Treat Tuberculosis and Related Symptoms

    PubMed Central

    Madikizela, Balungile; Ndhlala, Ashwell Rungano; Finnie, Jeffrey Franklin; Staden, Johannes Van

    2013-01-01

    Respiratory ailments are major human killers, especially in developing countries. Tuberculosis (TB) is an infectious disease that poses a threat to human healthcare. Many South African plants are used in the traditional treatment of TB and related symptoms, but there has not been a sufficient focus on evaluating their antimicrobial properties. The aim of this study was to evaluate the antimicrobial properties of plants used traditionally to treat TB and related symptoms against microorganisms (Klebsiella pneumoniae, Staphylococcus aureus, and Mycobacterium aurum A+) associated with respiratory infections using the microdilution assay. Ten plants were selected based on a survey of the available literature on medicinal plants used in South Africa for the treatment of TB and related symptoms. The petroleum ether, dichloromethane, 80% ethanol, and water extracts of the selected plants were evaluated for antibacterial activity. Of the 68 extracts tested from different parts of the 10 plant species, 17 showed good antimicrobial activity against at least one of the microbial strains tested, with minimum inhibitory concentrations ranging from 0.195 to 12.5 mg/mL. The good antimicrobial properties of Abrus precatorius, Terminalia phanerophlebia, Indigofera arrecta, and Pentanisia prunelloides authenticate their traditional use in the treatment of respiratory diseases. Thus, further pharmacological and phytochemical analysis is required. PMID:23533527
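    Reading a minimum inhibitory concentration (MIC) from a microdilution series can be sketched as follows. The helper and its conventions (descending concentrations, boolean growth flags) are assumptions for illustration, not the authors' protocol:

```python
# Toy MIC reader for a dilution series: the MIC is the lowest concentration
# at which no visible growth is seen, scanning from the highest concentration
# down until growth resumes.
def mic_from_series(concentrations, growth):
    """concentrations: descending values (e.g. mg/mL); growth: True where the
    well shows visible growth. Returns the MIC, or None if growth occurs at
    every tested concentration."""
    mic = None
    for c, grew in zip(concentrations, growth):
        if grew:
            break        # growth resumes below this concentration
        mic = c          # still inhibited at this (lower) concentration
    return mic
```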

  10. Automated office blood pressure.

    PubMed

    Myers, Martin G; Godwin, Marshall

    2012-05-01

    Manual blood pressure (BP) measurement is gradually disappearing from clinical practice, with the mercury sphygmomanometer now considered an environmental hazard. Manual BP is also subject to measurement error on the part of the physician or nurse and to patient-related anxiety, which can result in poor-quality BP measurements and office-induced (white coat) hypertension. Automated office (AO) BP with devices such as the BpTRU (BpTRU Medical Devices, Coquitlam, BC) has already replaced conventional manual BP in many primary care practices in Canada and has also attracted interest in other countries where research studies using AOBP have been undertaken. The basic principles of AOBP include multiple readings taken with a fully automated recorder with the patient resting alone in a quiet room. When these principles are followed, office-induced hypertension is eliminated and AOBP exhibits a much stronger correlation with the awake ambulatory BP than do routine manual BP measurements. Unlike routine manual BP, AOBP correlates as well with left ventricular mass as does the awake ambulatory BP. AOBP also simplifies the definition of hypertension in that the cut point for a normal AOBP (< 135/85 mm Hg) is the same as for the awake ambulatory BP and home BP. This article summarizes the currently available evidence supporting the use of AOBP in routine clinical practice and proposes an algorithm in which AOBP replaces manual BP for the diagnosis and management of hypertension. PMID:22265230
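    A minimal sketch of the AOBP averaging described above, under the assumption (common in practice, but not a device specification taken from this article) that the first automated reading is discarded and the remainder averaged, using the article's 135/85 mm Hg cut point:

```python
# Assumed AOBP protocol sketch: drop the first of several automated readings,
# average the rest, and compare against the 135/85 mm Hg cut point.
def aobp_average(readings):
    """readings: list of (systolic, diastolic) mm Hg tuples, length >= 2."""
    kept = readings[1:]                      # discard the first reading
    n = len(kept)
    sys_mean = sum(s for s, _ in kept) / n
    dia_mean = sum(d for _, d in kept) / n
    return sys_mean, dia_mean

def is_abnormal(sys_mean, dia_mean):
    """AOBP cut point from the article: >= 135/85 mm Hg is abnormal."""
    return sys_mean >= 135 or dia_mean >= 85
```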

  11. Automated Characterization Of Vibrations Of A Structure

    NASA Technical Reports Server (NTRS)

    Bayard, David S.; Yam, Yeung; Mettler, Edward; Hadaegh, Fred Y.; Milman, Mark H.; Scheid, Robert E.

    1992-01-01

    Automated method of characterizing dynamical properties of large flexible structure yields estimates of modal parameters used by robust control system to stabilize structure and minimize undesired motions. Based on extraction of desired modal and control-design data from responses of structure to known vibrational excitations. Applicable to terrestrial structures where vibrations are important - aircraft, buildings, bridges, cranes, and drill strings.
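    One ingredient of characterizing a structure's vibrations is identifying dominant response frequencies in measured data. A toy sketch via the discrete Fourier transform (real modal-parameter estimation from known excitations, as described above, is far more involved):

```python
import cmath

# Toy dominant-frequency estimate: compute DFT magnitudes of a sampled
# response and report the frequency of the largest bin below Nyquist.
def dominant_frequency(samples, sample_rate):
    """Return the frequency (Hz) of the largest DFT bin, excluding DC."""
    n = len(samples)
    mags = []
    for k in range(1, n // 2):               # skip DC and the Nyquist bin
        coeff = sum(samples[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n))
        mags.append((abs(coeff), k))
    _, k_best = max(mags)
    return k_best * sample_rate / n
```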

  12. Assessing the state of the art in biomedical relation extraction: overview of the BioCreative V chemical-disease relation (CDR) task.

    PubMed

    Wei, Chih-Hsuan; Peng, Yifan; Leaman, Robert; Davis, Allan Peter; Mattingly, Carolyn J; Li, Jiao; Wiegers, Thomas C; Lu, Zhiyong

    2016-01-01

    Manually curating chemicals, diseases and their relationships is critically important to biomedical research, but it is plagued by its high cost and the rapid growth of the biomedical literature. In recent years, there has been growing interest in developing computational approaches for automatic chemical-disease relation (CDR) extraction. Despite these attempts, the lack of a comprehensive benchmarking dataset has limited the comparison of different techniques needed to assess and advance the current state of the art. To this end, we organized a challenge task through BioCreative V to automatically extract CDRs from the literature. We designed two challenge tasks: disease named entity recognition (DNER) and chemical-induced disease (CID) relation extraction. To assist system development and assessment, we created a large annotated text corpus that consisted of human annotations of chemicals, diseases and their interactions from 1500 PubMed articles. 34 teams worldwide participated in the CDR task: 16 (DNER) and 18 (CID). The best systems achieved an F-score of 86.46% for the DNER task--a result that approaches the human inter-annotator agreement (0.8875)--and an F-score of 57.03% for the CID task, the highest results ever reported for such tasks. When combining team results via machine learning, the ensemble system was able to further improve over the best team results, achieving F-scores of 88.89% and 62.80% for the DNER and CID tasks, respectively. Additionally, a novel aspect of our evaluation was to test each participating system's ability to return real-time results: the average response times for each team's DNER and CID web service systems were 5.6 and 9.3 s, respectively. Most teams used hybrid systems for their submissions based on machine learning. Given the level of participation and results, we found our task to be successful in engaging the text-mining research community, producing a large annotated corpus and improving the results of
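    The F-scores reported above combine precision and recall in the standard way; a minimal sketch of that computation from true positive, false positive and false negative counts:

```python
# Standard precision, recall and F1 from an extraction system's counts of
# true positives (tp), false positives (fp) and false negatives (fn).
def precision_recall_f(tp, fp, fn):
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1
```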

  14. Assessing the state of the art in biomedical relation extraction: overview of the BioCreative V chemical-disease relation (CDR) task

    PubMed Central

    Wei, Chih-Hsuan; Peng, Yifan; Leaman, Robert; Davis, Allan Peter; Mattingly, Carolyn J.; Li, Jiao; Wiegers, Thomas C.; Lu, Zhiyong

    2016-01-01

    Manually curating chemicals, diseases and their relationships is critically important to biomedical research, but it is plagued by its high cost and the rapid growth of the biomedical literature. In recent years, there has been growing interest in developing computational approaches for automatic chemical-disease relation (CDR) extraction. Despite these attempts, the lack of a comprehensive benchmarking dataset has limited the comparison of different techniques needed to assess and advance the current state of the art. To this end, we organized a challenge task through BioCreative V to automatically extract CDRs from the literature. We designed two challenge tasks: disease named entity recognition (DNER) and chemical-induced disease (CID) relation extraction. To assist system development and assessment, we created a large annotated text corpus that consisted of human annotations of chemicals, diseases and their interactions from 1500 PubMed articles. 34 teams worldwide participated in the CDR task: 16 (DNER) and 18 (CID). The best systems achieved an F-score of 86.46% for the DNER task—a result that approaches the human inter-annotator agreement (0.8875)—and an F-score of 57.03% for the CID task, the highest results ever reported for such tasks. When combining team results via machine learning, the ensemble system was able to further improve over the best team results, achieving F-scores of 88.89% and 62.80% for the DNER and CID tasks, respectively. Additionally, a novel aspect of our evaluation was to test each participating system’s ability to return real-time results: the average response times for each team’s DNER and CID web service systems were 5.6 and 9.3 s, respectively. Most teams used hybrid systems for their submissions based on machine learning. Given the level of participation and results, we found our task to be successful in engaging the text-mining research community, producing a large annotated corpus and improving the

  15. Relative brain signature: a population-based feature extraction procedure to identify functional biomarkers in the brain of alcoholics

    PubMed Central

    Karamzadeh, Nader; Ardeshirpour, Yasaman; Kellman, Matthew; Chowdhry, Fatima; Anderson, Afrouz; Chorlian, David; Wegman, Edward; Gandjbakhche, Amir

    2015-01-01

    Background: A novel feature extraction technique, Relative-Brain-Signature (RBS), which characterizes subjects' relationship to populations with distinctive neuronal activity, is presented. The proposed method transforms a set of electroencephalography (EEG) time series in a high dimensional space to a space of fewer dimensions by projecting the time series onto orthogonal subspaces. Methods: We apply our technique to an EEG data set of 77 abstinent alcoholics and 43 control subjects. To characterize subjects' relationship to the alcoholic and control populations, one RBS vector with respect to the alcoholic and one with respect to the control population is constructed. We used the extracted RBS vectors to identify functional biomarkers over the brain of alcoholics. To achieve this goal, a classification algorithm was used to categorize subjects into alcoholics and controls, which resulted in 78% accuracy. Results and Conclusions: Using the results of the classification, regions with distinctive functionality in alcoholic subjects were detected. These affected regions, with respect to their spatial extent, are the frontal, anterior frontal, centro-parietal, parieto-occipital, and occipital lobes. The distribution of these regions over the scalp indicates that the impact of alcohol on the cerebral cortex of alcoholics is spatially diffuse. Our finding suggests that these regions engage more of the right hemisphere relative to the left hemisphere of the alcoholics' brain. PMID:26221569
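    The core projection step described above can be sketched as follows: projecting a series onto an orthonormal basis of a population subspace yields a low-dimensional coordinate vector (the construction of the actual RBS basis is not reproduced here; this is an illustration of the linear-algebra step only):

```python
# Project a time series (as a vector) onto an orthonormal basis spanning a
# population subspace; the dot products are the low-dimensional coordinates.
def project_onto_subspace(series, basis):
    """series: list of samples; basis: list of orthonormal vectors of the
    same length as `series`. Returns the projection coordinates."""
    return [sum(s * b for s, b in zip(series, vec)) for vec in basis]
```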

  16. Performance-friendly rule extraction in large water data-sets with AOC posets and relational concept analysis

    NASA Astrophysics Data System (ADS)

    Dolques, Xavier; Le Ber, Florence; Huchard, Marianne; Grac, Corinne

    2016-02-01

    In this paper, we consider data analysis methods for knowledge extraction from large water data-sets. More specifically, we try to connect physico-chemical parameters and the characteristics of taxons living in sample sites. Among these data analysis methods, we consider formal concept analysis (FCA), which is a recognized tool for classification and rule discovery on object-attribute data. Relational concept analysis (RCA) relies on FCA and deals with sets of object-attribute data provided with relations. RCA produces more informative results but at the expense of an increase in complexity. Moreover, in numerous applications of FCA, the partially ordered set of concepts introducing attributes or objects (AOC poset, for Attribute-Object-Concept poset) is used rather than the concept lattice in order to reduce combinatorial problems. AOC posets are much smaller and easier to compute than concept lattices and still contain the information needed to rebuild the initial data. This paper introduces a variant of the RCA process based on AOC posets rather than concept lattices. This approach is compared with RCA based on iceberg lattices. Experiments are performed with various scaling operators, and a specific operator is introduced to deal with noisy data. We show that using AOC posets on water data-sets provides a reasonable concept number and allows us to extract meaningful implication rules (association rules whose confidence is 1), whose semantics depends on the chosen scaling operator.
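    The notion of an implication rule, an association rule with confidence exactly 1, can be sketched over a small object-attribute context. This is plain FCA-style rule mining, not the paper's RCA/AOC-poset implementation, and the water-sample attributes below are invented.

```python
from itertools import combinations

# Invented object-attribute context: sample sites and their attributes.
context = {
    "site1": {"high_nitrate", "low_oxygen", "taxonA"},
    "site2": {"high_nitrate", "low_oxygen"},
    "site3": {"low_oxygen", "taxonA"},
}

def extent(attrs):
    """Objects possessing every attribute in attrs."""
    return {o for o, a in context.items() if attrs <= a}

attributes = set().union(*context.values())
implications = []
for n in (1, 2):
    for premise in combinations(sorted(attributes), n):
        objs = extent(set(premise))
        if not objs:
            continue
        for concl in attributes - set(premise):
            # Confidence is 1 iff every object with the premise
            # also has the conclusion.
            if objs <= extent({concl}):
                implications.append((premise, concl))
```

    On this toy context the miner finds, for instance, that every high-nitrate site is also low-oxygen, while the converse does not hold (site3 is low-oxygen without high nitrate), so only the first direction is emitted.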

  17. Automated DNA Sequencing System

    SciTech Connect

    Armstrong, G.A.; Ekkebus, C.P.; Hauser, L.J.; Kress, R.L.; Mural, R.J.

    1999-04-25

    Oak Ridge National Laboratory (ORNL) is developing a core DNA sequencing facility to support biological research endeavors at ORNL and to conduct basic sequencing automation research. This facility is novel because its development is based on existing standard biology laboratory equipment; thus, the development process is of interest to the many small laboratories trying to use automation to control costs and increase throughput. Before automation, biology laboratory personnel purified DNA, completed cycle sequencing, and prepared 96-well sample plates with commercially available hardware designed specifically for each step in the process. Following purification and thermal cycling, an automated sequencing machine was used for the sequencing. A technician handled all movement of the 96-well sample plates between machines. To automate the process, ORNL is adding a CRS Robotics A-465 arm, an ABI 377 sequencing machine, an automated centrifuge, an automated refrigerator, and possibly an automated SpeedVac. The entire system will be integrated with one central controller that will direct each machine and the robot. The goal of this system is to completely automate the sequencing procedure from bacterial cell samples through ready-to-be-sequenced DNA and ultimately to completed sequence. The system will be flexible and will accommodate chemistries other than those used by existing automated sequencing lines. The system will be expanded in the future to include colony picking and/or actual sequencing. This discrete-event DNA sequencing system will demonstrate that smaller sequencing labs can achieve cost-effective automation as the laboratory grows.

  18. Improvements in automated analysis of catecholamine and related metabolites in biological samples by column-switching high-performance liquid chromatography.

    PubMed

    Grossi, G; Bargossi, A M; Lucarelli, C; Paradisi, R; Sprovieri, C; Sprovieri, G

    1991-03-22

    Two fully automated methods based on column switching and high-performance liquid chromatography have previously been described, one for plasma and urinary catecholamines and the other for urinary catecholamine metabolites. Improvements in these methods, after 3 years of routine application, are now reported. The sample processing scheme was changed to eliminate memory effects and, in the procedure for plasma catecholamines, a pre-analytical deproteinization step was added that extends the analytical column lifetime. The applied voltages of the electrochemical detector have been optimized, resulting in an automated method suitable for the simultaneous determination of vanillylmandelic acid, 3,4-dihydroxyphenylacetic acid, homovanillic acid and 5-hydroxyindoleacetic acid. The sensitivity of the methods allows the detection of 2-3 ng/l of plasma catecholamines and 0.01-0.06 mg/l of urinary metabolites. It is also possible to switch from one method to the other in only 30 min. The normal values obtained from 200 healthy people are reported, together with a list of 57 potential interfering substances tested.

  19. Cockpit Adaptive Automation and Pilot Performance

    NASA Technical Reports Server (NTRS)

    Parasuraman, Raja

    2001-01-01

    The introduction of high-level automated systems in the aircraft cockpit has provided several benefits, e.g., new capabilities, enhanced operational efficiency, and reduced crew workload. At the same time, conventional 'static' automation has sometimes degraded human operator monitoring performance, increased workload, and reduced situation awareness. Adaptive automation represents an alternative to static automation. In this approach, task allocation between human operators and computer systems is flexible and context-dependent rather than static. Adaptive automation, or adaptive task allocation, is thought to provide for regulation of operator workload and performance, while preserving the benefits of static automation. In previous research we have reported beneficial effects of adaptive automation on the performance of both pilots and non-pilots of flight-related tasks. For adaptive systems to be viable, however, such benefits need to be examined jointly in the context of a single set of tasks. The studies carried out under this project evaluated a systematic method for combining different forms of adaptive automation. A model for effective combination of different forms of adaptive automation, based on matching adaptation to operator workload, was proposed and tested. The model was evaluated in studies using IFR-rated pilots flying a general-aviation simulator. Performance, subjective, and physiological (heart rate variability, eye scan-paths) measures of workload were recorded. The studies compared workload-based adaptation to non-adaptive control conditions and found evidence for systematic benefits of adaptive automation. The research provides an empirical basis for evaluating the effectiveness of adaptive automation in the cockpit. The results contribute to the development of design principles and guidelines for the implementation of adaptive automation in the cockpit, particularly in general aviation, and in other human-machine systems. Project goals
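    The workload-matched allocation idea above can be sketched as a simple threshold rule. This is a generic illustration of adaptive task allocation, not the project's actual model; the workload index, thresholds, and hysteresis band are invented.

```python
def allocate(workload_index, high=0.7, low=0.3, current="human"):
    """Decide who handles the task next, given a normalized workload
    index (e.g. one derived from heart rate variability).

    A hysteresis band between `low` and `high` keeps the allocation
    from oscillating around a single threshold."""
    if workload_index >= high:
        return "automation"  # operator overloaded: offload the task
    if workload_index <= low:
        return "human"       # operator underloaded: keep them in the loop
    return current           # in between: leave allocation unchanged

# An overloaded operator hands the task to automation; a moderate
# reading leaves whatever allocation is already in force.
decision = allocate(0.9)
held = allocate(0.5, current="automation")
```

    The hysteresis band is the design choice worth noting: with a single threshold, sensor noise near the cutoff would flip control back and forth, which is itself a workload and situation-awareness hazard.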

  20. Periodontal Disease, Dental Implants, Extractions and Medications Related to Osteonecrosis of the Jaws.

    PubMed

    Shah, Neha P; Katsarelis, Helen; Pazianas, Michael; Dhariwal, Daljit K

    2015-11-01

    Patients taking bisphosphonates and other anti-resorptive drugs are likely to attend general dental practice. The term 'bisphosphonate' is often immediately associated with osteonecrosis of the jaws (ONJ). Risk assessment and subsequent management of these patients should be carried out taking into account all the risk factors associated with ONJ. The introduction of newer drugs, also shown to be associated with ONJ, demands increased awareness of these medications among general dental practitioners. CPD/CLINICAL RELEVANCE: This paper provides an update on medication-related ONJ and considers the effects of anti-resorptive drugs on the management of patients needing exodontia, treatment for periodontal disease and dental implant placement.

  1. Effects of the polysaccharides extracted from Ganoderma lucidum on chemotherapy-related fatigue in mice.

    PubMed

    Ouyang, Ming-Zi; Lin, Li-Zhu; Lv, Wen-Jiao; Zuo, Qian; Lv, Zhuo; Guan, Jie-Shan; Wang, Shu-Tang; Sun, Ling-Ling; Chen, Han-Rui; Xiao, Zhi-Wei

    2016-10-01

    The weight-loaded swimming capability, tumor growth, survival time and biochemical markers of Ganoderma lucidum polysaccharides (GLPs) in a chemotherapy-related fatigue mouse model were tested in the present study. The results showed that the middle-dose GLPs (GLP-M) and the high-dose GLPs (GLP-H) could increase the exhaustive swimming time, which was observed to decrease in the cisplatin control group (PCG) and the tumor control group (TCG). The GLP-M and the GLP-H reduced serum levels of tumor necrosis factor-α and interleukin-6, which were up-regulated by cisplatin. Cisplatin and the presence of tumor significantly enhanced the malondialdehyde (MDA) content and inhibited the activity of superoxide dismutase (SOD) in the muscle. Administration of GLPs at a high dose decreased the levels of MDA and up-regulated the SOD activity. The high-dose GLPs+cisplatin group presented a decreased tendency of tumor volume and a lower tumor weight compared with the PCG. Moreover, the mice in the GLP-M and GLP-H groups had longer survival times compared with the mice in the TCG and PCG. The levels of creatinine and serum blood urea nitrogen, which are up-regulated by cisplatin, were significantly reduced by GLP-M and GLP-H. Therefore, these results suggest that GLPs might improve chemotherapy-related fatigue via regulation of inflammatory responses, oxidative stress and reduction of nephrotoxicity. PMID:27208798

  3. Follicular Unit Extraction Hair Transplant

    PubMed Central

    Dua, Aman; Dua, Kapil

    2010-01-01

    Hair transplantation has come a long way from the days of Punch Hair Transplant by Dr. Orentreich in 1950s to Follicular Unit Hair Transplant (FUT) of 1990s and the very recent Follicular Unit Extraction (FUE) technique. With the advent of FUE, the dream of ‘no visible scarring’ in the donor area is now looking like a possibility. In FUE, the grafts are extracted as individual follicular units in a two-step or three-step technique whereas the method of implantation remains the same as in the traditional FUT. The addition of latest automated FUE technique seeks to overcome some of the limitations in this relatively new technique and it is now possible to achieve more than a thousand grafts in one day in trained hands. This article reviews the methodology, limitations and advantages of FUE hair transplant. PMID:21031064

  4. Enhancing Seismic Calibration Research Through Software Automation

    SciTech Connect

    Ruppert, S; Dodge, D; Elliott, A; Ganzberger, M; Hauk, T; Matzel, E; Ryall, F

    2004-07-09

    observations. Even partial automation of this second tier, through development of prototype tools to extract observations and make many thousands of scientific measurements, has significantly increased the efficiency of the scientists who construct and validate integrated calibration surfaces. This gain in efficiency and quality control is likely to continue and even accelerate through continued application of information science and scientific automation. Data volume and calibration research requirements have increased by several orders of magnitude over the past decade. Whereas individual researchers could once download individual waveforms and make time-consuming measurements event by event, with the terabytes of data available today a software automation framework must exist to efficiently populate and deliver quality data to the researcher. This framework must also provide the researcher with robust measurement and analysis tools that can handle and extract groups of events effectively, and it must isolate the researcher from the now onerous tasks of database management and metadata collection necessary for validation and error analysis. We have succeeded in automating many of the collection, parsing, reconciliation and extraction tasks individually. Several software automation prototypes have been produced and have yielded demonstrated gains in the efficiency of producing scientific data products. Future software automation tasks will continue to leverage database and information management technologies to address additional scientific calibration research tasks.

  5. Systematically Extracting Metal- and Solvent-Related Occupational Information from Free-Text Responses to Lifetime Occupational History Questionnaires

    PubMed Central

    Friesen, Melissa C.; Locke, Sarah J.; Tornow, Carina; Chen, Yu-Cheng; Koh, Dong-Hee; Stewart, Patricia A.; Purdue, Mark; Colt, Joanne S.

    2014-01-01