Science.gov

Sample records for fully automated microarray

  1. Crossword: A Fully Automated Algorithm for the Segmentation and Quality Control of Protein Microarray Images

    PubMed Central

    2015-01-01

    Biological assays formatted as microarrays have become a critical tool for the generation of the comprehensive data sets required for systems-level understanding of biological processes. Manual annotation of data extracted from images of microarrays, however, remains a significant bottleneck, particularly for protein microarrays due to the sensitivity of this technology to weak artifact signal. In order to automate the extraction and curation of data from protein microarrays, we describe an algorithm called Crossword that logically combines information from multiple approaches to fully automate microarray segmentation. Automated artifact removal is also accomplished by segregating structured pixels from the background noise using iterative clustering and pixel connectivity. Correlation of the location of structured pixels across image channels is used to identify and remove artifact pixels from the image prior to data extraction. This component improves the accuracy of data sets while reducing the requirement for time-consuming visual inspection of the data. Crossword enables a fully automated protocol that is robust to significant spatial and intensity aberrations. Overall, the average amount of user intervention is reduced by an order of magnitude and the data quality is increased through artifact removal and reduced user variability. The increase in throughput should aid the further implementation of microarray technologies in clinical studies. PMID:24417579
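The artifact-removal idea described here, segregating structured pixels from background via iterative clustering and then using pixel connectivity, can be sketched in a few lines. This is a generic illustration, not the Crossword implementation: the ISODATA-style iterative threshold and 4-connectivity component labeling are common stand-ins for the clustering and connectivity steps the abstract names.

```python
from collections import deque

def isodata_threshold(values, eps=0.5):
    """Iterative two-class (ISODATA-style) threshold: a stand-in for the
    'iterative clustering' step that splits structured pixels from noise."""
    t = sum(values) / len(values)
    while True:
        lo = [v for v in values if v <= t]
        hi = [v for v in values if v > t]
        if not lo or not hi:
            return t
        new_t = (sum(lo) / len(lo) + sum(hi) / len(hi)) / 2
        if abs(new_t - t) < eps:
            return new_t
        t = new_t

def structured_pixels(img, min_size=3):
    """Keep 4-connected foreground components with at least min_size pixels;
    smaller components are treated as artifact/noise and dropped."""
    h, w = len(img), len(img[0])
    t = isodata_threshold([v for row in img for v in row])
    fg = {(r, c) for r in range(h) for c in range(w) if img[r][c] > t}
    kept, seen = set(), set()
    for start in fg:
        if start in seen:
            continue
        comp, q = [], deque([start])
        seen.add(start)
        while q:
            r, c = q.popleft()
            comp.append((r, c))
            for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                if (nr, nc) in fg and (nr, nc) not in seen:
                    seen.add((nr, nc))
                    q.append((nr, nc))
        if len(comp) >= min_size:
            kept.update(comp)
    return kept
```

On a toy 5x5 image containing a bright 2x2 blob and one isolated bright speck, the blob survives and the single-pixel speck is discarded as an artifact.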

  2. Fully automated analysis of multi-resolution four-channel micro-array genotyping data

    NASA Astrophysics Data System (ADS)

    Abbaspour, Mohsen; Abugharbieh, Rafeef; Podder, Mohua; Tebbutt, Scott J.

    2006-03-01

We present a fully automated and robust microarray image analysis system for handling multi-resolution images (down to 3-micron resolution, with sizes up to 80 MB per channel). The system was developed to provide rapid and accurate data extraction for our recently developed microarray analysis and quality control tool (SNP Chart). Currently available commercial microarray image analysis applications are inefficient, due to the considerable user interaction typically required. Four-channel DNA microarray technology is a robust and accurate tool for determining the genotypes of multiple genetic markers in individuals. It plays an important role in the trend toward replacing traditional medical treatments with personalized genetic medicine, i.e. individualized therapy based on the patient's genetic heritage. However, fast, robust, and precise image processing tools are required for the prospective practical use of microarray-based genetic testing for predicting disease susceptibilities and drug effects in clinical practice, which requires a turn-around time compatible with clinical decision-making. In this paper we present a fully automated image analysis platform for the rapid investigation of hundreds of genetic variations across multiple genes. Validation tests indicate very high accuracy levels for genotyping results. Our method achieves a significant reduction in analysis time, from several hours to just a few minutes, and is completely automated, requiring no manual interaction or guidance.

  3. Fully Automated Complementary DNA Microarray Segmentation using a Novel Fuzzy-based Algorithm

    PubMed Central

    Saberkari, Hamidreza; Bahrami, Sheyda; Shamsi, Mousa; Amoshahy, Mohammad Javad; Ghavifekr, Habib Badri; Sedaaghi, Mohammad Hossein

    2015-01-01

DNA microarrays are a powerful approach for studying the expression of thousands of genes simultaneously in a single experiment. The average fluorescent intensity of each spot can be calculated in a microarray experiment, and these intensity values closely track the expression levels of the corresponding genes. However, determining the correct position of every spot in microarray images is a major challenge, one that underpins the accurate classification of normal and abnormal (cancer) cells. In this paper, a preprocessing step first eliminates the noise and artifacts present in microarray images using the nonlinear anisotropic diffusion filtering method. Then, the center coordinate of each spot is located using mathematical morphology operations. Finally, the position of each spot is determined exactly by applying a novel hybrid model based on principal component analysis and the spatial fuzzy c-means (SFCM) clustering algorithm. Using a Gaussian kernel in the SFCM algorithm improves the quality of complementary DNA microarray segmentation. The performance of the proposed algorithm was evaluated on real microarray images available in the Stanford Microarray Database. Results show that the segmentation accuracy of the proposed algorithm reaches 100% and 98% for noiseless and noisy cells, respectively. PMID:26284175
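The core of the segmentation step above is fuzzy c-means, whose two update rules are compact enough to sketch. The following is a minimal 1-D illustration of plain FCM only; the paper's SFCM adds a spatial term and a Gaussian kernel, both omitted here.

```python
def fuzzy_c_means(xs, c=2, m=2.0, iters=50):
    """Minimal 1-D fuzzy c-means (assumes c >= 2). Returns the final cluster
    centers and the membership matrix u (one row per data point)."""
    srt = sorted(xs)
    # deterministic init: spread initial centers across the data range
    centers = [srt[round(i * (len(xs) - 1) / (c - 1))] for i in range(c)]
    for _ in range(iters):
        # membership update: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
        u = []
        for x in xs:
            d = [abs(x - ck) or 1e-12 for ck in centers]
            u.append([1.0 / sum((d[i] / d[j]) ** (2.0 / (m - 1))
                                for j in range(c)) for i in range(c)])
        # center update: mean of the data weighted by u^m
        centers = [sum((u[k][i] ** m) * xs[k] for k in range(len(xs))) /
                   sum(u[k][i] ** m for k in range(len(xs)))
                   for i in range(c)]
    return centers, u
```

Run on two well-separated intensity groups such as [0, 1, 2] and [10, 11, 12], the centers converge near the group means and each point's largest membership identifies its cluster.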

  4. Fully automated urban traffic system

    NASA Technical Reports Server (NTRS)

    Dobrotin, B. M.; Hansen, G. R.; Peng, T. K. C.; Rennels, D. A.

    1977-01-01

The replacement of the driver with an automatic system that could perform the functions of guiding and routing a vehicle, with a human's capability of responding to changing traffic demands, was discussed. The problem was divided into four technological areas: guidance, routing, computing, and communications. It was determined that the latter three areas were being developed independently of any need for fully automated urban traffic. A guidance system that would meet system requirements was not being developed but was technically feasible.

  5. Progress in Fully Automated Abdominal CT Interpretation

    PubMed Central

    Summers, Ronald M.

    2016-01-01

    OBJECTIVE Automated analysis of abdominal CT has advanced markedly over just the last few years. Fully automated assessment of organs, lymph nodes, adipose tissue, muscle, bowel, spine, and tumors are some examples where tremendous progress has been made. Computer-aided detection of lesions has also improved dramatically. CONCLUSION This article reviews the progress and provides insights into what is in store in the near future for automated analysis for abdominal CT, ultimately leading to fully automated interpretation. PMID:27101207

  6. An automated method for gridding and clustering-based segmentation of cDNA microarray images.

    PubMed

    Giannakeas, Nikolaos; Fotiadis, Dimitrios I

    2009-01-01

Microarrays are widely used to quantify gene expression levels. Microarray image analysis is one of the tools necessary when dealing with vast amounts of biological data. In this work we propose a new method for the automated analysis of microarray images. The proposed method consists of two stages: gridding and segmentation. Initially, the microarray images are preprocessed using template matching, and block and spot finding takes place. Then, the non-expressed spots are detected and a grid is fitted to the image using a Voronoi diagram. In the segmentation stage, K-means and Fuzzy C-means (FCM) clustering are employed. The proposed method was evaluated using images from the Stanford Microarray Database (SMD). The segmentation results show the efficiency of our Fuzzy C-means-based method compared to two previously developed K-means-based methods. The proposed method can handle images with artefacts and it is fully automated. PMID:19046850
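The Voronoi-based gridding step can be illustrated with a small sketch: once spot centres are known from spot finding, every pixel is assigned to its nearest centre, and the resulting cells form the grid. This is a generic discrete Voronoi labelling for illustration, not code from the paper.

```python
def voronoi_grid(centers, shape):
    """Discrete Voronoi labelling: assign each pixel (r, c) the index of
    the nearest spot centre; each Voronoi cell becomes one grid cell."""
    h, w = shape
    return [[min(range(len(centers)),
                 key=lambda i: (r - centers[i][0]) ** 2 + (c - centers[i][1]) ** 2)
             for c in range(w)] for r in range(h)]
```

For two centres at columns 1 and 6 in a 3x8 image, the label map splits cleanly down the midline between them.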

  7. Fully integrated, fully automated generation of short tandem repeat profiles

    PubMed Central

    2013-01-01

Background The generation of short tandem repeat profiles, also referred to as ‘DNA typing,’ is not currently performed outside the laboratory because the process requires highly skilled technical operators and a controlled laboratory environment and infrastructure with several specialized instruments. The goal of this work was to develop a fully integrated system for the automated generation of short tandem repeat profiles from buccal swab samples, to improve forensic laboratory process flow as well as to enable short tandem repeat profile generation to be performed in police stations and in field-forward military, intelligence, and homeland security settings. Results An integrated system was developed consisting of an injection-molded microfluidic BioChipSet cassette, a ruggedized instrument, and expert system software. For each of five buccal swabs, the system purifies DNA using guanidinium-based lysis and silica binding, amplifies 15 short tandem repeat loci and the amelogenin locus, electrophoretically separates the resulting amplicons, and generates a profile. No operator processing of the samples is required, and the time from swab insertion to profile generation is 84 minutes. All required reagents are contained within the BioChipSet cassette; these consist of a lyophilized polymerase chain reaction mix and liquids for purification and electrophoretic separation. Profiles obtained from fully automated runs demonstrate that the integrated system generates concordant short tandem repeat profiles. The system exhibits single-base resolution from 100 to greater than 500 bases, with an inter-run precision (standard deviation) of 0.05–0.10 bases for most alleles. The reagents are stable for at least 6 months at 22°C, and the instrument has been designed and tested to Military Standard 810F for shock and vibration ruggedization. A nontechnical user can operate the system within or outside the laboratory. Conclusions The integrated system represents the

  8. A microfluidic device for the automated electrical readout of low-density glass-slide microarrays.

    PubMed

    Díaz-González, María; Salvador, J Pablo; Bonilla, Diana; Marco, M Pilar; Fernández-Sánchez, César; Baldi, Antoni

    2015-12-15

Microarrays are a powerful platform for rapid and multiplexed analysis in a wide range of research fields. Electrical readout systems have emerged as an alternative to conventional optical methods for microarray analysis thanks to their potential advantages: low cost, low power, and easy miniaturization of the required instrumentation. In this work an automated electrical readout system for low-cost glass-slide microarrays is described. The system enables the simultaneous conductimetric detection of up to 36 biorecognition events by incorporating an array of interdigitated electrode transducers. A polydimethylsiloxane microfluidic structure has been designed that creates microwells over the transducers and incorporates the microfluidic channels required for filling and draining them with readout and cleaning solutions, thus making the readout process fully automated. Since the capture biomolecules are not immobilized on the transducer surface, this readout system is reusable, in contrast to previously reported electrochemical microarrays. A low-density microarray based on a competitive enzymatic immunoassay for atrazine detection was used to test the performance of the readout system. The electrical assay shows a detection limit of 0.22±0.03 μg L(-1), similar to that obtained with fluorescent detection, and allows the direct determination of the pesticide in polluted water samples. These results prove that an electrical readout system such as the one presented in this work is a reliable and cost-effective alternative to fluorescence scanners for the analysis of low-density microarrays. PMID:26210466

  9. Fully Mechanically Controlled Automated Electron Microscopic Tomography

    NASA Astrophysics Data System (ADS)

    Liu, Jinxin; Li, Hongchang; Zhang, Lei; Rames, Matthew; Zhang, Meng; Yu, Yadong; Peng, Bo; Celis, César Díaz; Xu, April; Zou, Qin; Yang, Xu; Chen, Xuefeng; Ren, Gang

    2016-07-01

Knowledge of the three-dimensional (3D) structure of each individual particle of asymmetric and flexible proteins is essential to understanding those proteins’ functions, but their structures are difficult to determine. Electron tomography (ET) provides a tool for imaging a single, unique biological object from a series of tilt angles, but it is challenging to image a single protein for 3D reconstruction due to the imperfect mechanical control capability of the specimen goniometer under both a medium to high magnification (approximately 50,000–160,000×) and an optimized beam coherence condition. Here, we report a fully mechanical control method for automating ET data acquisition without using beam tilt/shift processes. This method avoids the accumulation of the beam tilt/shift that was previously used to compensate for mechanical-control errors but degraded beam coherence. Our method was developed by minimizing the error of the target object center during the tilting process through a closed-loop proportional-integral (PI) control algorithm. Validation by both negative staining (NS) and cryo-electron microscopy (cryo-EM) suggests that this method is comparable to other ET methods in tracking target proteins while maintaining optimized beam coherence conditions for imaging.
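The closed-loop PI idea at the heart of the method, measure the target-centre error and command the stage to cancel it, can be sketched generically. The toy plant and gain values below are illustrative assumptions, not the authors' microscope interface.

```python
def pi_track(setpoint, measure, actuate, kp=0.6, ki=0.2, steps=40, dt=1.0):
    """Discrete PI loop: drive the measured position toward the setpoint.
    measure() returns the current target-centre position; actuate(u)
    applies a correction command u to the (simulated) stage."""
    integral = 0.0
    for _ in range(steps):
        error = setpoint - measure()
        integral += error * dt
        actuate(kp * error + ki * integral)  # u = Kp*e + Ki*integral of e
    return setpoint - measure()

# Toy plant: a stage whose position moves by exactly the commanded amount,
# starting 10 units off-target.
pos = [10.0]
residual = pi_track(0.0, lambda: pos[0], lambda u: pos.__setitem__(0, pos[0] + u))
```

With these gains the closed-loop error decays geometrically, so after a few dozen iterations the residual tracking error is negligible.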

  10. Fully Mechanically Controlled Automated Electron Microscopic Tomography

    PubMed Central

    Liu, Jinxin; Li, Hongchang; Zhang, Lei; Rames, Matthew; Zhang, Meng; Yu, Yadong; Peng, Bo; Celis, César Díaz; Xu, April; Zou, Qin; Yang, Xu; Chen, Xuefeng; Ren, Gang

    2016-01-01

Knowledge of the three-dimensional (3D) structure of each individual particle of asymmetric and flexible proteins is essential to understanding those proteins’ functions, but their structures are difficult to determine. Electron tomography (ET) provides a tool for imaging a single, unique biological object from a series of tilt angles, but it is challenging to image a single protein for 3D reconstruction due to the imperfect mechanical control capability of the specimen goniometer under both a medium to high magnification (approximately 50,000–160,000×) and an optimized beam coherence condition. Here, we report a fully mechanical control method for automating ET data acquisition without using beam tilt/shift processes. This method avoids the accumulation of the beam tilt/shift that was previously used to compensate for mechanical-control errors but degraded beam coherence. Our method was developed by minimizing the error of the target object center during the tilting process through a closed-loop proportional-integral (PI) control algorithm. Validation by both negative staining (NS) and cryo-electron microscopy (cryo-EM) suggests that this method is comparable to other ET methods in tracking target proteins while maintaining optimized beam coherence conditions for imaging. PMID:27403922

  11. Fully Mechanically Controlled Automated Electron Microscopic Tomography.

    PubMed

    Liu, Jinxin; Li, Hongchang; Zhang, Lei; Rames, Matthew; Zhang, Meng; Yu, Yadong; Peng, Bo; Celis, César Díaz; Xu, April; Zou, Qin; Yang, Xu; Chen, Xuefeng; Ren, Gang

    2016-01-01

Knowledge of the three-dimensional (3D) structure of each individual particle of asymmetric and flexible proteins is essential to understanding those proteins' functions, but their structures are difficult to determine. Electron tomography (ET) provides a tool for imaging a single, unique biological object from a series of tilt angles, but it is challenging to image a single protein for 3D reconstruction due to the imperfect mechanical control capability of the specimen goniometer under both a medium to high magnification (approximately 50,000-160,000×) and an optimized beam coherence condition. Here, we report a fully mechanical control method for automating ET data acquisition without using beam tilt/shift processes. This method avoids the accumulation of the beam tilt/shift that was previously used to compensate for mechanical-control errors but degraded beam coherence. Our method was developed by minimizing the error of the target object center during the tilting process through a closed-loop proportional-integral (PI) control algorithm. Validation by both negative staining (NS) and cryo-electron microscopy (cryo-EM) suggests that this method is comparable to other ET methods in tracking target proteins while maintaining optimized beam coherence conditions for imaging. PMID:27403922

  12. Evaluation of a novel automated allergy microarray platform compared with three other allergy test methods.

    PubMed

    Williams, P; Önell, A; Baldracchini, F; Hui, V; Jolles, S; El-Shanawany, T

    2016-04-01

Microarray platforms, enabling simultaneous measurement of many allergens with a small serum sample, are potentially powerful tools in allergy diagnostics. We report here the first study comparing a fully automated microarray system, the Microtest allergy system, with a manual microarray platform, Immuno-Solid phase Allergen Chip (ISAC), and two well-established singleplex allergy tests, the skin prick test (SPT) and ImmunoCAP, all tested on the same patients. One hundred and three adult allergic patients attending the allergy clinic were included in the study. All patients were tested with the four allergy test methods (SPT, ImmunoCAP, Microtest and ISAC 112), and a total of 3485 pairwise test results were analysed and compared. The four methods showed comparable results, with a positive/negative agreement of 81-88% for any pair of test methods compared, which is in line with data in the literature. The most prevalent allergens (cat, dog, mite, timothy, birch and peanut) and their individual allergen components revealed agreement between methods, with correlation coefficients between 0.73 and 0.95. All four methods revealed deviating individual patient results for a minority of patients. These results indicate that microarray platforms are efficient and useful tools for characterizing the specific immunoglobulin (Ig)E profile of allergic patients using a small volume of serum. The results produced by the Microtest system were in agreement with diagnostic tests in current use. Further data collection and evaluation are needed for other populations, geographical regions and allergens. PMID:26437695

  13. Fully automated three-dimensional microscopy system

    NASA Astrophysics Data System (ADS)

    Kerschmann, Russell L.

    2000-04-01

Tissue-scale structures such as vessel networks are imaged at micron resolution with the Virtual Tissue System (VT System). VT System imaging of cubic millimeters of tissue and other material extends the capabilities of conventional volumetric techniques such as confocal microscopy, and allows for the first time the integrated 2D and 3D analysis of important tissue structural relationships. The VT System eliminates the need for glass slide-mounted tissue sections and instead captures images directly from the surface of a block containing a sample. Tissues are stained en bloc with fluorochrome compounds, embedded in an optically conditioned polymer that suppresses image signals from deep within the block, and serially sectioned for imaging. Thousands of fully registered 2D images are automatically captured digitally to completely convert tissue samples into blocks of high-resolution information. The resulting multi-gigabyte data sets constitute the raw material for precision visualization and analysis. Cellular function may be seen in a larger anatomical context. VT System technology makes tissue metrics, accurate cell enumeration, and cell cycle analyses possible while preserving the full histologic setting.

  14. Towards A Fully Automated High-Throughput Phototransfection System

    PubMed Central

    Cappelleri, David J.; Halasz, Adam; Sul, Jai-Yoon; Kim, Tae Kyung; Eberwine, James; Kumar, Vijay

    2010-01-01

We have designed and implemented a framework for creating a fully automated high-throughput phototransfection system. Integrated image processing, laser target position calculation, and stage movements show a throughput increase of > 23X over the current manual phototransfection method, while the potential for even greater throughput improvements (> 110X) is described. A software tool for automated off-line single-cell morphological measurements, as well as real-time image segmentation analysis, has also been constructed and shown to be able to quantify changes in the cell before and after the process, successfully characterizing them using metrics such as cell perimeter, area, major and minor axis length, and eccentricity values. PMID:20706617
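Per-cell metrics like those listed above are straightforward to compute from a segmented binary mask. The sketch below shows a common discrete approximation for two of them (area as pixel count, perimeter as exposed 4-neighbour pixel edges); it is a generic illustration, not the authors' tool.

```python
def region_metrics(mask):
    """Area and perimeter of the foreground region in a binary mask.
    Perimeter is counted as the number of 4-neighbour pixel edges that
    face background or the image border (a common discrete approximation)."""
    h, w = len(mask), len(mask[0])
    area = perimeter = 0
    for r in range(h):
        for c in range(w):
            if not mask[r][c]:
                continue
            area += 1
            for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                if not (0 <= nr < h and 0 <= nc < w) or not mask[nr][nc]:
                    perimeter += 1
    return area, perimeter
```

A 2x2 foreground block, for instance, has area 4 and eight exposed edges.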

  15. A fully automated digitally controlled 30-inch telescope

    NASA Technical Reports Server (NTRS)

    Colgate, S. A.; Moore, E. P.; Carlson, R.

    1975-01-01

    A fully automated 30-inch (75-cm) telescope has been successfully designed and constructed from a military surplus Nike-Ajax radar mount. Novel features include: closed-loop operation between mountain telescope and campus computer 30 km apart via microwave link, a TV-type sensor which is photon shot-noise limited, a special lightweight primary mirror, and a stepping motor drive capable of slewing and settling one degree in one second or a radian in fifteen seconds.

  16. Semi-automated and fully automated mammographic density measurement and breast cancer risk prediction.

    PubMed

    Llobet, Rafael; Pollán, Marina; Antón, Joaquín; Miranda-García, Josefa; Casals, María; Martínez, Inmaculada; Ruiz-Perales, Francisco; Pérez-Gómez, Beatriz; Salas-Trejo, Dolores; Pérez-Cortés, Juan-Carlos

    2014-09-01

The task of breast density quantification is becoming increasingly relevant due to its association with breast cancer risk. In this work, a semi-automated and a fully automated tool to assess breast density from full-field digitized mammograms are presented. The first tool is based on a supervised interactive thresholding procedure for segmenting dense from fatty tissue and is used with a twofold goal: to assess mammographic density (MD) in a more objective and accurate way than via visual-based methods, and to label the mammograms that are later employed to train the fully automated tool. Although most automated methods rely on supervised approaches based on a global labeling of the mammogram, the proposed method relies on pixel-level labeling, allowing better tissue classification and density measurement on a continuous scale. The fully automated method combines a classification scheme based on local features and thresholding operations that improve the performance of the classifier. A dataset of 655 mammograms was used to test the concordance of both approaches in measuring MD. Three expert radiologists measured MD in each of the mammograms using the semi-automated tool (DM-Scan). MD was then measured by the fully automated system and the correlation between both methods was computed. The relation between MD and breast cancer was then analyzed using a case-control dataset consisting of 230 mammograms. The Intraclass Correlation Coefficient (ICC) was used to compute reliability among raters and between techniques. The results obtained showed an average ICC=0.922 among raters when using the semi-automated tool, whilst the average correlation between the semi-automated and automated measures was ICC=0.838. In the case-control study, the results showed Odds Ratios (OR) of 1.38 and 1.50 per 10% increase in MD when using the semi-automated and fully automated approaches, respectively. 
It can therefore be concluded that the automated and semi-automated
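The intraclass correlation coefficient used in the study above has several standard forms; the study likely used a two-way model, so the simpler one-way random-effects ICC(1,1) below is shown for illustration only, to make clear how between- and within-subject variance enter the index.

```python
def icc_oneway(ratings):
    """One-way random-effects ICC(1,1) for a table of n targets x k raters.
    ICC = (MSB - MSW) / (MSB + (k-1)*MSW), where MSB/MSW are the between-
    and within-target mean squares. Illustrative sketch only."""
    n, k = len(ratings), len(ratings[0])
    grand = sum(sum(row) for row in ratings) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    msb = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    msw = sum((x - m) ** 2
              for row, m in zip(ratings, row_means) for x in row) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)
```

Perfectly agreeing raters yield an ICC of 1, and disagreement pulls the value down toward 0.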

  17. A Fully Automatic Method for Gridding Bright Field Images of Bead-Based Microarrays.

    PubMed

    Datta, Abhik; Wai-Kin Kong, Adams; Yow, Kin-Choong

    2016-07-01

In this paper, a fully automatic method for gridding bright field images of bead-based microarrays is proposed. Numerous techniques have been developed for gridding fluorescence images of traditional spotted microarrays but, to the best of our knowledge, no algorithm has yet been developed for gridding bright field images of bead-based microarrays. The proposed gridding method is designed for automatic quality control during fabrication and assembly of bead-based microarrays. The method begins by estimating the grid parameters using an evolutionary algorithm. This is followed by a grid-fitting step that rigidly aligns an ideal grid with the image. Finally, a grid refinement step deforms the ideal grid to better fit the image. The grid fitting and refinement are performed locally and the final grid is a nonlinear (piecewise affine) grid. To deal with extreme corruptions in the image, the initial grid parameter estimation and grid-fitting steps employ robust search techniques. The proposed method does not have any free parameters that need tuning. The method is capable of identifying the grid structure even in the presence of extreme amounts of artifacts and distortions. Evaluation results on a variety of images are presented. PMID:26011899

  18. Fully Automated Image Orientation in the Absence of Targets

    NASA Astrophysics Data System (ADS)

    Stamatopoulos, C.; Chuang, T. Y.; Fraser, C. S.; Lu, Y. Y.

    2012-07-01

Automated close-range photogrammetric network orientation has traditionally been associated with the use of coded targets in the object space to allow for an initial relative orientation (RO) and subsequent spatial resection of the images. Over the past decade, automated orientation via feature-based matching (FBM) techniques has attracted renewed research attention in both the photogrammetry and computer vision (CV) communities. This is largely due to advances made towards the goal of automated relative orientation of multi-image networks covering untargeted (markerless) objects. There are now a number of CV-based algorithms, with accompanying open-source software, that can achieve multi-image orientation within narrow-baseline networks. From a photogrammetric standpoint, the results are typically disappointing as the metric integrity of the resulting models is generally poor, or even unknown, while the number of outliers within the image matching and triangulation is large, and generally too large to allow relative orientation (RO) via the commonly used coplanarity equations. On the other hand, there are few examples within the photogrammetric research field of automated markerless camera calibration to metric tolerances, and these too are restricted to narrow-baseline, low-convergence imaging geometry. The objective addressed in this paper is markerless automatic multi-image orientation, maintaining metric integrity, within networks that incorporate wide-baseline imagery. By wide-baseline we imply convergent multi-image configurations with convergence angles of up to around 90°. An associated aim is provision of a fast, fully automated process, which can be performed without user intervention. For this purpose, various algorithms require optimisation to allow parallel processing utilising multiple PC cores and graphics processing units (GPUs).

  19. Design and development of a microarray processing station (MPS) for automated miniaturized immunoassays.

    PubMed

    Pla-Roca, Mateu; Altay, Gizem; Giralt, Xavier; Casals, Alícia; Samitier, Josep

    2016-08-01

Here we describe the design and evaluation of a fluidic device for the automatic processing of microarrays, called the microarray processing station (MPS). Once installed on a commercial microarrayer, the microarray processing station automates the washing and drying steps, which are often performed manually. The substrate where the assay occurs remains in place during the microarray printing, incubation, and processing steps; therefore, nL volumes of the distinct immunoassay reagents, such as capture and detection antibodies and samples, can be addressed to the same coordinate of the substrate in perfect alignment, without requiring any additional mechanical or optical re-alignment methods. This allows independent immunoassays to be performed in a single microarray spot. PMID:27405464

  20. Towards a fully automated eclipsing binary solver for Gaia

    NASA Astrophysics Data System (ADS)

    Tingley, Brandon; Sadowski, Gilles; Siopis, Christos

    2009-02-01

Gaia, an ESA cornerstone mission, will obtain on the order of 100 high-precision photometric observations and lower-precision radial velocity measurements over five years for around a billion stars, several hundred thousand of which will be eclipsing binaries. In order to extract the characteristics of these systems, a fully automated code must be available. During this development, two tools that may be of use to the transit community have emerged: a very fast, simple, detached eclipsing binary simulator/solver based on a new approach, and an interacting eclipsing binary simulator with most of the features of the Wilson-Devinney and Nightfall codes, but fully documented and written in easy-to-follow and highly portable Java. Currently undergoing development and testing, this code includes an intuitive graphical interface and an optimizer for the estimation of the physical parameters of the system.

  1. A fully automated high-throughput training system for rodents.

    PubMed

    Poddar, Rajesh; Kawai, Risa; Ölveczky, Bence P

    2013-01-01

Addressing the neural mechanisms underlying complex learned behaviors requires training animals in well-controlled tasks, an often time-consuming and labor-intensive process that can severely limit the feasibility of such studies. To overcome this constraint, we developed a fully computer-controlled general-purpose system for high-throughput training of rodents. By standardizing and automating the implementation of predefined training protocols within the animal's home cage, our system dramatically reduces the effort involved in animal training while also removing human errors and biases from the process. We deployed this system to train rats in a variety of sensorimotor tasks, achieving learning rates comparable to existing, but more laborious, methods. By incrementally and systematically increasing the difficulty of the task over weeks of training, rats were able to master motor tasks that, in complexity and structure, resemble ones used in primate studies of motor sequence learning. By enabling fully automated training of rodents in a home-cage setting, this low-cost and modular system increases the utility of rodents for studying the neural underpinnings of a variety of complex behaviors. PMID:24349451

  2. FASTER: an unsupervised fully automated sleep staging method for mice

    PubMed Central

    Sunagawa, Genshiro A; Séi, Hiroyoshi; Shimba, Shigeki; Urade, Yoshihiro; Ueda, Hiroki R

    2013-01-01

Identifying the stages of sleep, or sleep staging, is an unavoidable step in sleep research and typically requires visual inspection of electroencephalography (EEG) and electromyography (EMG) data. Currently, scoring by humans is slow, biased, and prone to error, and is thus the most important bottleneck for large-scale sleep research in animals. We have developed an unsupervised, fully automated sleep staging method for mice that allows less subjective and high-throughput evaluation of sleep. The Fully Automated Sleep sTaging method via EEG/EMG Recordings (FASTER) is based on nonparametric density estimation clustering of comprehensive EEG/EMG power spectra. FASTER can accurately identify sleep patterns in mice that have been perturbed by drugs or by genetic modification of a clock gene. The overall accuracy is over 90% in every group. 24-h data are staged by a laptop computer in 10 min, which is faster than an experienced human rater. By dramatically improving the sleep staging process in both quality and throughput, FASTER will open the door to quantitative and comprehensive animal sleep research. PMID:23621645

  3. A Fully Automated High-Throughput Training System for Rodents

    PubMed Central

    Poddar, Rajesh; Kawai, Risa; Ölveczky, Bence P.

    2013-01-01

    Addressing the neural mechanisms underlying complex learned behaviors requires training animals in well-controlled tasks, an often time-consuming and labor-intensive process that can severely limit the feasibility of such studies. To overcome this constraint, we developed a fully computer-controlled, general-purpose system for high-throughput training of rodents. By standardizing and automating the implementation of predefined training protocols within the animal’s home-cage, our system dramatically reduces the effort involved in animal training while also removing human errors and biases from the process. We deployed this system to train rats in a variety of sensorimotor tasks, achieving learning rates comparable to existing, but more laborious, methods. By incrementally and systematically increasing the difficulty of the task over weeks of training, rats were able to master motor tasks that, in complexity and structure, resemble ones used in primate studies of motor sequence learning. By enabling fully automated training of rodents in a home-cage setting, this low-cost and modular system increases the utility of rodents for studying the neural underpinnings of a variety of complex behaviors. PMID:24349451

  4. Fully Automated Lipid Pool Detection Using Near Infrared Spectroscopy

    PubMed Central

    Wojakowski, Wojciech

    2016-01-01

    Background. Detecting and identifying vulnerable plaque, which is prone to rupture, remains a challenge for cardiologists. Such lipid core-containing plaque is still not identifiable by everyday angiography, hence the need for a new tool, NIRS-IVUS, which can characterize plaque in terms of its chemical and morphologic properties. The new tool can lead to the development of new methods of interpreting the newly obtained data. In this study, an algorithm for fully automated lipid pool detection on NIRS images is proposed. Method. The designed algorithm is divided into four stages: preprocessing (image enhancement), segmentation of artifacts, detection of lipid areas, and calculation of the Lipid Core Burden Index (LCBI). Results. A total of 31 NIRS chemograms were analyzed by two methods. Three metrics, total LCBI, maximal LCBI in 4 mm blocks, and maximal LCBI in 2 mm blocks, were calculated to compare the presented algorithm with a commercially available system. Both intraclass correlation (ICC) and Bland-Altman plots showed good agreement and correlation between the two methods. Conclusions. The proposed algorithm provides fully automated lipid pool detection on near-infrared spectroscopy images. It is a tool developed for offline data analysis, which could easily be extended with new functions for future projects. PMID:27610191

  5. Fully Automated Lipid Pool Detection Using Near Infrared Spectroscopy.

    PubMed

    Pociask, Elżbieta; Jaworek-Korjakowska, Joanna; Malinowski, Krzysztof Piotr; Roleder, Tomasz; Wojakowski, Wojciech

    2016-01-01

    Background. Detecting and identifying vulnerable plaque, which is prone to rupture, remains a challenge for cardiologists. Such lipid core-containing plaque is still not identifiable by everyday angiography, hence the need for a new tool, NIRS-IVUS, which can characterize plaque in terms of its chemical and morphologic properties. The new tool can lead to the development of new methods of interpreting the newly obtained data. In this study, an algorithm for fully automated lipid pool detection on NIRS images is proposed. Method. The designed algorithm is divided into four stages: preprocessing (image enhancement), segmentation of artifacts, detection of lipid areas, and calculation of the Lipid Core Burden Index (LCBI). Results. A total of 31 NIRS chemograms were analyzed by two methods. Three metrics, total LCBI, maximal LCBI in 4 mm blocks, and maximal LCBI in 2 mm blocks, were calculated to compare the presented algorithm with a commercially available system. Both intraclass correlation (ICC) and Bland-Altman plots showed good agreement and correlation between the two methods. Conclusions. The proposed algorithm provides fully automated lipid pool detection on near-infrared spectroscopy images. It is a tool developed for offline data analysis, which could easily be extended with new functions for future projects. PMID:27610191
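    The LCBI metrics named in this abstract lend themselves to a compact sketch. Assuming a binary chemogram mask (1 = lipid pixel) whose rows follow the pullback axis, total LCBI is the lipid-pixel fraction scaled to 0-1000, and the maximal LCBI in a 4 mm or 2 mm block is a sliding-window maximum; converting millimetres to chemogram rows is left to the caller and the row pitch is an assumption:

    ```python
    def lcbi(lipid_mask):
        # Lipid Core Burden Index: lipid-pixel fraction scaled to the 0-1000 range
        total = sum(len(row) for row in lipid_mask)
        hits = sum(sum(row) for row in lipid_mask)
        return 1000.0 * hits / total if total else 0.0

    def max_block_lcbi(lipid_mask, window_rows):
        # maximal LCBI over any contiguous run of `window_rows` rows (e.g. a
        # 4 mm block, once the caller converts millimetres to chemogram rows)
        n = len(lipid_mask)
        if n <= window_rows:
            return lcbi(lipid_mask)
        return max(lcbi(lipid_mask[i:i + window_rows])
                   for i in range(n - window_rows + 1))
    ```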

  6. Fully automated 2D-3D registration and verification.

    PubMed

    Varnavas, Andreas; Carrell, Tom; Penney, Graeme

    2015-12-01

    Clinical application of 2D-3D registration technology often requires a significant amount of human interaction during initialisation and result verification. This is one of the main barriers to more widespread clinical use of this technology. We propose novel techniques for automated initial pose estimation of the 3D data and verification of the registration result, and show how these techniques can be combined to enable fully automated 2D-3D registration, particularly in the case of a vertebra based system. The initialisation method is based on preoperative computation of 2D templates over a wide range of 3D poses. These templates are used to apply the Generalised Hough Transform to the intraoperative 2D image and the sought 3D pose is selected with the combined use of the generated accumulator arrays and a Gradient Difference Similarity Measure. On the verification side, two algorithms are proposed: one using normalised features based on the similarity value and the other based on the pose agreement between multiple vertebra based registrations. The proposed methods are employed here for CT to fluoroscopy registration and are trained and tested with data from 31 clinical procedures with 417 low dose, i.e. low quality, high noise interventional fluoroscopy images. When similarity value based verification is used, the fully automated system achieves a 95.73% correct registration rate, whereas a no registration result is produced for the remaining 4.27% of cases (i.e. incorrect registration rate is 0%). The system also automatically detects input images outside its operating range. PMID:26387052
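    The pose-agreement verification idea described here, accepting a registration only when independent vertebra-based registrations land on mutually consistent poses, can be sketched as follows. The tolerance values (millimetres and degrees) are illustrative assumptions, not the paper's:

    ```python
    def poses_agree(poses, tol_trans=3.0, tol_rot=2.0):
        # Accept a 2D-3D registration only if every pair of per-vertebra pose
        # estimates (tx, ty, tz, rx, ry, rz) agrees within tolerance.
        for i in range(len(poses)):
            for j in range(i + 1, len(poses)):
                dt = max(abs(a - b) for a, b in zip(poses[i][:3], poses[j][:3]))
                dr = max(abs(a - b) for a, b in zip(poses[i][3:], poses[j][3:]))
                if dt > tol_trans or dr > tol_rot:
                    return False
        return True
    ```

    Rejecting disagreeing poses yields a "no registration" output rather than a wrong one, which is how the abstract's 0% incorrect-registration rate is achieved.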

  7. An automated method for gridding in microarray images.

    PubMed

    Giannakeas, Nikolaos; Fotiadis, Dimitrios I; Politou, Anastasia S

    2006-01-01

    Microarray technology is a powerful tool for analyzing the expression of a large number of genes in parallel. A typical microarray image consists of a few thousand spots, which determine the level of gene expression in the sample. In this paper we propose a method which automatically addresses each spot area in the image. Initially, a preliminary segmentation of the image is produced using a template matching algorithm. Next, grid and spot finding are realized. The positions of non-expressed spots are located, and finally a Voronoi diagram is employed to fit the grid on the image. Our method has been evaluated on a set of five images consisting of 45960 spots from the Stanford microarray database, and the reported accuracy for spot detection was 93%. PMID:17946343
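    Grid finding of this kind often starts from 1-D projection profiles of the image, where spot rows and columns appear as regularly spaced peaks. A minimal peak-picking sketch (illustrative only; the paper's template-matching and Voronoi steps are more involved):

    ```python
    def spot_centres(profile, min_sep):
        # Local maxima of a 1-D intensity projection, kept at least `min_sep`
        # samples apart; for microarrays these correspond to spot row/column
        # centres along one image axis.
        peaks = [i for i in range(1, len(profile) - 1)
                 if profile[i] >= profile[i - 1] and profile[i] > profile[i + 1]]
        centres = []
        for p in peaks:
            if not centres or p - centres[-1] >= min_sep:
                centres.append(p)
        return centres
    ```

    Running this on both the row-sum and column-sum profiles gives the initial grid-line intersections that a refinement step (such as the Voronoi fitting mentioned above) can then adjust.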

  8. Fully automated low-cost setup for fringe projection profilometry.

    PubMed

    Rivera-Ortega, Uriel; Dirckx, Joris; Meneses-Fabian, Cruz

    2015-02-20

    In this paper an alternative low-cost, easy-to-use, and fully automated profilometry setup is proposed. The setup is based on a phase-shifting fringe projection technique with four projected fringe parameters. It uses the well-known triangulation arrangement and low-cost electronic and image acquisition components such as a data acquisition board, a motor controller board, a printer rail, a CMOS webcam, and an LCD projector. The position of the camera, the generation of the fringe pattern, the acquisition of the images, and the calculation of the wrapped and unwrapped phase are all performed in LabVIEW. The setup is portable and can be perfectly adapted to be used in other profilometry techniques such as electronic speckle pattern interferometry and laser scanning profilometry. PMID:25968198

  9. A fully automated precise electrical resistance measurement system

    SciTech Connect

    Marhas, M.K.; Balakrishnan, K.; Ganesan, V.; Srinivasan, R.

    1996-08-01

    A fully automated precise electrical resistance measurement system for more than one sample has been constructed. Conventional four-probe measurements with van der Pauw and Montgomery configurations are possible with this system. Resistance measurements in the range of a few μΩ to a few GΩ are possible for six samples at a time from room temperature down to liquid-helium or liquid-nitrogen temperatures with a temperature control accuracy of better than 10 mK. The design features of the system, with special reference to the low-noise switching methods of currents and voltages, are described in detail. The precision of the results obtained using this system is highlighted for a few superconducting and semiconducting samples. © 1996 American Institute of Physics.
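    For the van der Pauw configuration mentioned above, the sheet resistance R_s follows from two measured resistances R_A and R_B via exp(-πR_A/R_s) + exp(-πR_B/R_s) = 1, which has no closed form in general but is monotone in R_s and therefore easy to bisect. A generic sketch, not tied to the instrument described in the abstract:

    ```python
    import math

    def van_der_pauw_sheet_resistance(r_a, r_b):
        # Solve exp(-pi*r_a/rs) + exp(-pi*r_b/rs) = 1 for the sheet resistance
        # rs. The left side is monotonically increasing in rs, so bisection
        # on a wide bracket converges reliably.
        f = lambda rs: (math.exp(-math.pi * r_a / rs)
                        + math.exp(-math.pi * r_b / rs) - 1.0)
        lo, hi = 1e-9, 1e9 * max(r_a, r_b, 1.0)
        for _ in range(200):
            mid = 0.5 * (lo + hi)
            if f(mid) < 0.0:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)
    ```

    For a symmetric sample (R_A = R_B = R) this reduces to the textbook result R_s = πR/ln 2.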

  10. A fully automated robotic system for microinjection of zebrafish embryos.

    PubMed

    Wang, Wenhui; Liu, Xinyu; Gelinas, Danielle; Ciruna, Brian; Sun, Yu

    2007-01-01

    As an important embodiment of biomanipulation, injection of foreign materials (e.g., DNA, RNAi, sperm, protein, and drug compounds) into individual cells has significant implications in genetics, transgenics, assisted reproduction, and drug discovery. This paper presents a microrobotic system for fully automated zebrafish embryo injection, which overcomes the problems inherent in manual operation, such as human fatigue and large variations in success rates due to poor reproducibility. Based on computer vision and motion control, the microrobotic system performs injection at a speed of 15 zebrafish embryos (chorion unremoved) per minute, with a survival rate of 98% (n = 350 embryos), a success rate of 99% (n = 350 embryos), and a phenotypic rate of 98.5% (n = 210 embryos). The sample immobilization technique and microrobotic control method are applicable to other biological injection applications such as the injection of mouse oocytes/embryos and Drosophila embryos to enable high-throughput biological and pharmaceutical research. PMID:17848993

  11. A Fully Automated Robotic System for Microinjection of Zebrafish Embryos

    PubMed Central

    Gelinas, Danielle; Ciruna, Brian; Sun, Yu

    2007-01-01

    As an important embodiment of biomanipulation, injection of foreign materials (e.g., DNA, RNAi, sperm, protein, and drug compounds) into individual cells has significant implications in genetics, transgenics, assisted reproduction, and drug discovery. This paper presents a microrobotic system for fully automated zebrafish embryo injection, which overcomes the problems inherent in manual operation, such as human fatigue and large variations in success rates due to poor reproducibility. Based on computer vision and motion control, the microrobotic system performs injection at a speed of 15 zebrafish embryos (chorion unremoved) per minute, with a survival rate of 98% (n = 350 embryos), a success rate of 99% (n = 350 embryos), and a phenotypic rate of 98.5% (n = 210 embryos). The sample immobilization technique and microrobotic control method are applicable to other biological injection applications such as the injection of mouse oocytes/embryos and Drosophila embryos to enable high-throughput biological and pharmaceutical research. PMID:17848993

  12. Fully automated adipose tissue measurement on abdominal CT

    NASA Astrophysics Data System (ADS)

    Yao, Jianhua; Sussman, Daniel L.; Summers, Ronald M.

    2011-03-01

    Obesity has become widespread in America and has been associated as a risk factor for many illnesses. Adipose tissue (AT) content, especially visceral AT (VAT), is an important indicator for risks of many disorders, including heart disease and diabetes. Measuring AT with traditional means is often unreliable and inaccurate. CT provides a means to measure AT accurately and consistently. We present a fully automated method to segment and measure abdominal AT in CT. Our method integrates image preprocessing, which attempts to correct for image artifacts and inhomogeneities. We use fuzzy c-means to cluster AT regions and active contour models to separate subcutaneous and visceral AT. We tested our method on 50 abdominal CT scans and evaluated the correlations between several measurements.
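    Fuzzy c-means, used here to cluster AT regions, assigns each sample a graded membership in every cluster rather than a hard label. A minimal one-dimensional version (illustrative only; CT segmentation would run this on voxel intensities or feature vectors, and the min/max initialization is a simplification):

    ```python
    def fuzzy_cmeans_1d(xs, c=2, m=2.0, iters=50):
        # Minimal 1-D fuzzy c-means: returns cluster centres and the membership
        # matrix u, where u[i][k] is sample i's degree of belonging to cluster k.
        centres = [min(xs), max(xs)] if c == 2 else list(xs[:c])
        expo = 2.0 / (m - 1.0)
        for _ in range(iters):
            u = []
            for x in xs:
                dists = [abs(x - ck) or 1e-12 for ck in centres]
                u.append([1.0 / sum((dk / dj) ** expo for dj in dists)
                          for dk in dists])
            # centres move to the membership-weighted mean of the data
            centres = [sum(u[i][k] ** m * xs[i] for i in range(len(xs)))
                       / sum(u[i][k] ** m for i in range(len(xs)))
                       for k in range(c)]
        return centres, u
    ```

    The soft memberships are what make the subsequent active-contour separation of subcutaneous and visceral AT robust to partial-volume voxels at class boundaries.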

  13. Fully automated diabetic retinopathy screening using morphological component analysis.

    PubMed

    Imani, Elaheh; Pourreza, Hamid-Reza; Banaee, Touka

    2015-07-01

    Diabetic retinopathy is the major cause of blindness in the world. It has been shown that early diagnosis can play a major role in the prevention of visual loss and blindness. This diagnosis can be made through regular screening and timely treatment. Moreover, automation of this process can significantly reduce the workload of ophthalmologists and alleviate inter- and intra-observer variability. This paper provides a fully automated diabetic retinopathy screening system with the ability of retinal image quality assessment. The novelty of the proposed method lies in the use of the Morphological Component Analysis (MCA) algorithm to discriminate between normal and pathological retinal structures. To this end, first a pre-screening algorithm is used to assess the quality of retinal images. If the quality of the image is not satisfactory, it is examined by an ophthalmologist and must be recaptured if necessary. Otherwise, the image is processed for diabetic retinopathy detection. In this stage, normal and pathological structures of the retinal image are separated by the MCA algorithm. Finally, the normal and abnormal retinal images are distinguished by statistical features of the retinal lesions. Our proposed system achieved 92.01% sensitivity and 95.45% specificity on the Messidor dataset, which is a remarkable result in comparison with previous work. PMID:25863517

  14. Fully Automated Cloud-Drift Winds in NESDIS Operations.

    NASA Astrophysics Data System (ADS)

    Nieman, Steven J.; Menzel, W. Paul; Hayden, Christopher M.; Gray, Donald; Wanzong, Steven T.; Velden, Christopher S.; Daniels, Jaime

    1997-06-01

    Cloud-drift winds have been produced from geostationary satellite data in the Western Hemisphere since the early 1970s. During the early years, winds were used as an aid for the short-term forecaster in an era when numerical forecasts were often of questionable quality, especially over oceanic regions. Increased computing resources over the last two decades have led to significant advances in the performance of numerical forecast models. As a result, continental forecasts now stand to gain little from the inspection or assimilation of cloud-drift wind fields. However, the oceanic data void remains, and although numerical forecasts in such areas have improved, they still suffer from a lack of in situ observations. During the same two decades, the quality of geostationary satellite data has improved considerably, and the cloud-drift wind production process has also benefited from increased computing power. As a result, fully automated wind production is now possible, yielding cloud-drift winds whose quality and quantity is sufficient to add useful information to numerical model forecasts in oceanic and coastal regions. This article will detail the automated cloud-drift wind production process, as operated by the National Environmental Satellite Data and Information Service within the National Oceanic and Atmospheric Administration.

  15. Experience with the JMS fully automated dialysis machine.

    PubMed

    Tsuchiya, Shinichiro; Moriishi, Misaki; Takahashi, Naoko; Watanabe, Hiroshi; Kawanishi, Hideki; Kim, Sung-Teh; Masaoka, Katsunori

    2003-01-01

    A fully automated dialysis machine has been developed and evaluated clinically. It uses highly pure dialysate (produced by a new dialysate cleaning system) instead of the conventional physiologic saline for the processes of priming, guiding blood to the dialysis machine, replenishing fluid, and returning the blood to the body. The piping for the dialysate is in the shape of a loop, and the dialyzer coupler has no mechanical parts that might become contaminated. As a result of these and certain other improvements in machine design, it is now possible to obtain reasonably clean dialysate. For the priming process, the machine uses a volume of up to 4 L of the dialysate after reverse filtration from the dialyzer. Most foreign matter or eluates can be removed from the dialyzer and the blood channels. Before blood is guided out of the body into the dialysis system, the needles inserted in the artery and vein are simultaneously connected to the blood channel, and the dialysate remaining in the channel is removed from the dialyzer. If the patient's blood pressure falls during dialysis, the dialysate can be replenished at any desired flow rate for reverse filtration. Blood return can be started automatically when the planned dialysis time has elapsed and the target water volume has been removed. The cleaned dialysate is infused from the dialyzer into the blood channel by reverse filtration to allow the blood to be returned to the body via both the artery and the vein at the same time. A total of 216 units of this fully automated dialysis machine have been placed in service at two of our facilities. During the 6 month period beginning in July 2001, they were used for 40,000 hemodialysis sessions in 516 patients. During the dialysate preparation process, the endotoxin levels in the reverse osmosis (RO) water, prefilter dialysate, and reverse filtered dialysate were all less than 1 EU/L. The time required to guide blood into the dialyzer (n = 39) decreased from the 4.6 +/- 1

  16. Fully automated liver segmentation from SPIR image series.

    PubMed

    Göçeri, Evgin; Gürcan, Metin N; Dicle, Oğuz

    2014-10-01

    Accurate liver segmentation is an important component of surgery planning for liver transplantation, which offers patients with liver disease a chance of survival. Spectral pre-saturation inversion recovery (SPIR) image sequences are useful for liver vessel segmentation because vascular structures in the liver are clearly visible in these sequences. Although level-set based segmentation techniques are frequently used in liver segmentation due to their flexibility to adapt to different problems by incorporating prior knowledge, the need to initialize the contours on each slice is a common drawback of such techniques. In this paper, we present a fully automated variational level set approach for liver segmentation from SPIR image sequences. Our approach is designed to be efficient while achieving high accuracy. The efficiency is achieved by (1) automatically defining an initial contour for each slice, and (2) automatically computing weight values of each term in the applied energy functional at each iteration during evolution. Automated detection and exclusion of spurious structures (e.g. cysts and other bright white regions on the skin) in the pre-processing stage increases the accuracy and robustness. We also present a novel approach to reduce computational cost by employing binary regularization of the level set function. A signed pressure force function controls the evolution of the active contour. The method was applied to ten data sets. In each image, the performance of the algorithm was measured using the receiver operating characteristics method in terms of accuracy, sensitivity and specificity. The accuracy of the proposed method was 96%. Quantitative analyses of results indicate that the proposed method can accurately, efficiently and consistently segment liver images. PMID:25192606

  17. PLIP: fully automated protein-ligand interaction profiler.

    PubMed

    Salentin, Sebastian; Schreiber, Sven; Haupt, V Joachim; Adasme, Melissa F; Schroeder, Michael

    2015-07-01

    The characterization of interactions in protein-ligand complexes is essential for research in structural bioinformatics, drug discovery and biology. However, comprehensive tools are not freely available to the research community. Here, we present the protein-ligand interaction profiler (PLIP), a novel web service for fully automated detection and visualization of relevant non-covalent protein-ligand contacts in 3D structures, freely available at projects.biotec.tu-dresden.de/plip-web. The input is either a Protein Data Bank structure, a protein or ligand name, or a custom protein-ligand complex (e.g. from docking). In contrast to other tools, the rule-based PLIP algorithm does not require any structure preparation. It returns a list of detected interactions on single atom level, covering seven interaction types (hydrogen bonds, hydrophobic contacts, pi-stacking, pi-cation interactions, salt bridges, water bridges and halogen bonds). PLIP stands out by offering publication-ready images, PyMOL session files to generate custom images and parsable result files to facilitate successive data processing. The full python source code is available for download on the website. PLIP's command-line mode allows for high-throughput interaction profiling. PMID:25873628
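    Rule-based interaction detection of the kind PLIP performs typically reduces to geometric cutoffs on atomic coordinates. A hydrogen-bond test on donor/hydrogen/acceptor positions might look like the following sketch; the 4.1 Å distance and 100° angle cutoffs are assumptions for illustration and should be checked against PLIP's documentation:

    ```python
    import math

    def is_hydrogen_bond(donor, hydrogen, acceptor,
                         max_da_dist=4.1, min_dha_angle=100.0):
        # Geometric hydrogen-bond test on 3-D coordinates: the donor-acceptor
        # distance and the D-H...A angle must both pass their cutoffs.
        def angle_at(b, a, c):
            # angle a-b-c at vertex b, in degrees
            v1 = [a[i] - b[i] for i in range(3)]
            v2 = [c[i] - b[i] for i in range(3)]
            dot = sum(p * q for p, q in zip(v1, v2))
            n1 = math.sqrt(sum(p * p for p in v1))
            n2 = math.sqrt(sum(p * p for p in v2))
            return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))
        return (math.dist(donor, acceptor) <= max_da_dist
                and angle_at(hydrogen, donor, acceptor) >= min_dha_angle)
    ```

    Each of the seven interaction types listed in the abstract can be expressed as a rule of this shape, which is why no structure preparation (energy minimization, protonation optimization, etc.) is required before profiling.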

  18. Fully automated scoring of chest radiographs in cystic fibrosis.

    PubMed

    Lee, Min-Zhao; Cai, Weidong; Song, Yang; Selvadurai, Hiran; Feng, David Dagan

    2013-01-01

    We present a prototype of a fully automated scoring system for chest radiographs (CXRs) in cystic fibrosis. The system was used to analyze real, clinical CXR data, to estimate the Shwachman-Kulczycki score for the image. Images were resampled and normalized to a standard size and intensity level, then segmented with a patch-based nearest-neighbor mapping algorithm. Texture features were calculated regionally and globally, using Tamura features, local binary patterns (LBP), gray-level co-occurrence matrix and Gabor filtering. Feature selection was guided by current understanding of the disease process, in particular the reorganization and thickening of airways. Combinations of these features were used as inputs for support vector machine (SVM) learning to classify each CXR, and evaluated using two-fold cross-validation for agreement with clinician scoring. The final computed score for each image was compared with the score assigned by a physician. Using this prototype system, we analyzed 139 CXRs from an Australian pediatric cystic fibrosis registry, for which texture directionality showed greatest discriminating power. Computed scores agreed with clinician scores in 75% of cases, and up to 90% of cases in discriminating severe disease from mild disease, similar to the level of human interobserver agreement for this dataset. PMID:24110600
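    Of the texture features listed, local binary patterns are the simplest to sketch: each pixel is encoded by thresholding its eight neighbours against the centre value, and the histogram of codes over a region becomes the feature vector fed to the SVM. An illustrative per-pixel encoder (the neighbour ordering is a convention chosen here, not necessarily the paper's):

    ```python
    def lbp_code(img, r, c):
        # 8-neighbour local binary pattern: set bit k when neighbour k's
        # intensity is >= the centre pixel's intensity.
        offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
                   (1, 1), (1, 0), (1, -1), (0, -1)]
        centre = img[r][c]
        code = 0
        for bit, (dr, dc) in enumerate(offsets):
            if img[r + dr][c + dc] >= centre:
                code |= 1 << bit
        return code
    ```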

  19. A fully automated immunoassay from whole blood on a disc.

    PubMed

    Lee, Beom Seok; Lee, Jung-Nam; Park, Jong-Myeon; Lee, Jeong-Gun; Kim, Suhyeon; Cho, Yoon-Kyoung; Ko, Christopher

    2009-06-01

    A portable, disc-based, and fully automated enzyme-linked immuno-sorbent assay (ELISA) system is developed to test infectious diseases from whole blood. The innovative laser irradiated ferrowax microvalves and centrifugal microfluidics were utilized for the full integration of microbead-based suspension ELISA assays on a disc starting from whole blood. The concentrations of the antigen and the antibody of Hepatitis B virus (HBV), HBsAg and Anti-HBs respectively, were measured using the lab-on-a-disc (LOD). All the necessary reagents are preloaded on the disc and the total process of the plasma separation, incubation with target specific antigen or antibody coated microbeads, multiple steps of washing, enzyme reaction with substrates, and the absorbance detection could be finished within 30 minutes. Compared to the conventional ELISA, the operation time was dramatically reduced from over 2 hours to less than 30 minutes while the limit of detection was kept similar; e.g., the limits of detection of the Anti-HBs tests were 8.6 mIU mL⁻¹ and 10 mIU mL⁻¹ for the disc-based and the conventional ELISA, respectively. PMID:19458861

  20. A fully automated TerraSAR-X based flood service

    NASA Astrophysics Data System (ADS)

    Martinis, Sandro; Kersten, Jens; Twele, André

    2015-06-01

    In this paper, a fully automated processing chain for near real-time flood detection using high resolution TerraSAR-X Synthetic Aperture Radar (SAR) data is presented. The processing chain including SAR data pre-processing, computation and adaption of global auxiliary data, unsupervised initialization of the classification as well as post-classification refinement by using a fuzzy logic-based approach is automatically triggered after satellite data delivery. The dissemination of flood maps resulting from this service is performed through an online service which can be activated on-demand for emergency response purposes (i.e., when a flood situation evolves). The classification methodology is based on previous work of the authors but was substantially refined and extended for robustness and transferability to guarantee high classification accuracy under different environmental conditions and sensor configurations. With respect to accuracy and computational effort, experiments performed on a data set of 175 different TerraSAR-X scenes acquired during flooding all over the world with different sensor configurations confirm the robustness and effectiveness of the proposed flood mapping service. These promising results have been further confirmed by means of an in-depth validation performed for three study sites in Germany, Thailand, and Albania/Montenegro.
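    The unsupervised classification initialization in SAR flood mappers of this kind is commonly a histogram-based threshold between the dark open-water class and brighter land, for example Otsu's method. Whether the paper's processing chain uses Otsu specifically is not stated, so this is a generic sketch:

    ```python
    def otsu_threshold(values, nbins=64):
        # Otsu's method: choose the threshold maximising the between-class
        # variance of the two classes it induces on the intensity histogram.
        lo, hi = min(values), max(values)
        width = (hi - lo) / nbins or 1.0
        hist = [0] * nbins
        for v in values:
            hist[min(int((v - lo) / width), nbins - 1)] += 1
        total = len(values)
        sum_all = sum((lo + (i + 0.5) * width) * h for i, h in enumerate(hist))
        best, best_t = -1.0, lo
        w0, sum0 = 0, 0.0
        for i in range(nbins - 1):
            w0 += hist[i]
            sum0 += (lo + (i + 0.5) * width) * hist[i]
            w1 = total - w0
            if w0 == 0 or w1 == 0:
                continue
            m0, m1 = sum0 / w0, (sum_all - sum0) / w1
            var_between = w0 * w1 * (m0 - m1) ** 2
            if var_between > best:
                best, best_t = var_between, lo + (i + 1) * width
        return best_t
    ```

    An initialization like this, applied per tile, would then be refined by the fuzzy logic-based post-classification step the abstract describes.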

  1. PLIP: fully automated protein–ligand interaction profiler

    PubMed Central

    Salentin, Sebastian; Schreiber, Sven; Haupt, V. Joachim; Adasme, Melissa F.; Schroeder, Michael

    2015-01-01

    The characterization of interactions in protein–ligand complexes is essential for research in structural bioinformatics, drug discovery and biology. However, comprehensive tools are not freely available to the research community. Here, we present the protein–ligand interaction profiler (PLIP), a novel web service for fully automated detection and visualization of relevant non-covalent protein–ligand contacts in 3D structures, freely available at projects.biotec.tu-dresden.de/plip-web. The input is either a Protein Data Bank structure, a protein or ligand name, or a custom protein–ligand complex (e.g. from docking). In contrast to other tools, the rule-based PLIP algorithm does not require any structure preparation. It returns a list of detected interactions on single atom level, covering seven interaction types (hydrogen bonds, hydrophobic contacts, pi-stacking, pi-cation interactions, salt bridges, water bridges and halogen bonds). PLIP stands out by offering publication-ready images, PyMOL session files to generate custom images and parsable result files to facilitate successive data processing. The full python source code is available for download on the website. PLIP's command-line mode allows for high-throughput interaction profiling. PMID:25873628

  2. Microarrays

    ERIC Educational Resources Information Center

    Plomin, Robert; Schalkwyk, Leonard C.

    2007-01-01

    Microarrays are revolutionizing genetics by making it possible to genotype hundreds of thousands of DNA markers and to assess the expression (RNA transcripts) of all of the genes in the genome. Microarrays are slides the size of a postage stamp that contain millions of DNA sequences to which single-stranded DNA or RNA can hybridize. This…

  3. Participation through Automation: Fully Automated Critical PeakPricing in Commercial Buildings

    SciTech Connect

    Piette, Mary Ann; Watson, David S.; Motegi, Naoya; Kiliccote,Sila; Linkugel, Eric

    2006-06-20

    California electric utilities have been exploring the use of dynamic critical peak prices (CPP) and other demand response programs to help reduce peaks in customer electric loads. CPP is a tariff design to promote demand response. Levels of automation in DR can be defined as follows: Manual Demand Response involves a potentially labor-intensive approach such as manually turning off or changing comfort set points at each equipment switch or controller. Semi-Automated Demand Response involves a pre-programmed demand response strategy initiated by a person via centralized control system. Fully Automated Demand Response does not involve human intervention, but is initiated at a home, building, or facility through receipt of an external communications signal. The receipt of the external signal initiates pre-programmed demand response strategies. They refer to this as Auto-DR. This paper describes the development, testing, and results from automated CPP (Auto-CPP) as part of a utility project in California. The paper presents the project description and test methodology. This is followed by a discussion of Auto-DR strategies used in the field test buildings. They present a sample Auto-CPP load shape case study, and a selection of the Auto-CPP response data from September 29, 2005. If all twelve sites reached their maximum saving simultaneously, a total of approximately 2 MW of DR is available from these twelve sites that represent about two million ft². The average DR was about half that value, at about 1 MW. These savings translate to about 0.5 to 1.0 W/ft² of demand reduction. They are continuing field demonstrations and economic evaluations to pursue increasing penetrations of automated DR that has demonstrated ability to provide a valuable DR resource for California.
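    The W/ft² figures quoted above follow from simple arithmetic, demand reduction divided by floor area, using the twelve sites' combined 2 MW maximum and roughly two million ft² of floor space:

    ```python
    def demand_reduction_intensity(reduction_kw, floor_area_ft2):
        # demand reduction normalised by floor area, in W per square foot
        return reduction_kw * 1000.0 / floor_area_ft2
    ```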

  4. Toward Fully Automated Multicriterial Plan Generation: A Prospective Clinical Study

    SciTech Connect

    Voet, Peter W.J.; Dirkx, Maarten L.P.; Breedveld, Sebastiaan; Fransen, Dennie; Levendag, Peter C.; Heijmen, Ben J.M.

    2013-03-01

    Purpose: To prospectively compare plans generated with iCycle, an in-house-developed algorithm for fully automated multicriterial intensity modulated radiation therapy (IMRT) beam profile and beam orientation optimization, with plans manually generated by dosimetrists using the clinical treatment planning system. Methods and Materials: For 20 randomly selected head-and-neck cancer patients with various tumor locations (of whom 13 received sequential boost treatments), we offered the treating physician the choice between an automatically generated iCycle plan and a manually optimized plan using standard clinical procedures. Although iCycle used a fixed “wish list” with hard constraints and prioritized objectives, the dosimetrists manually selected the beam configuration and fine tuned the constraints and objectives for each IMRT plan. Dosimetrists were not informed in advance whether a competing iCycle plan was made. The 2 plans were simultaneously presented to the physician, who then selected the plan to be used for treatment. For the patient group, differences in planning target volume coverage and sparing of critical tissues were quantified. Results: In 32 of 33 plan comparisons, the physician selected the iCycle plan for treatment. This highly consistent preference for the automatically generated plans was mainly caused by the improved sparing for the large majority of critical structures. With iCycle, the normal tissue complication probabilities for the parotid and submandibular glands were reduced by 2.4% ± 4.9% (maximum, 18.5%, P=.001) and 6.5% ± 8.3% (maximum, 27%, P=.005), respectively. The reduction in the mean oral cavity dose was 2.8 ± 2.8 Gy (maximum, 8.1 Gy, P=.005). For the swallowing muscles, the esophagus and larynx, the mean dose reduction was 3.3 ± 1.1 Gy (maximum, 9.2 Gy, P<.001). For 15 of the 20 patients, target coverage was also improved. Conclusions: In 97% of cases, automatically generated plans were selected for treatment because of

  5. ATLAS from Data Research Associates: A Fully Integrated Automation System.

    ERIC Educational Resources Information Center

    Mellinger, Michael J.

    1987-01-01

    This detailed description of a fully integrated, turnkey library system includes a complete profile of the system (functions, operational characteristics, hardware, operating system, minimum memory and pricing); history of the technologies involved; and descriptions of customer services and availability. (CLB)

  6. Datamining approach for automation of diagnosis of breast cancer in immunohistochemically stained tissue microarray images.

    PubMed

    Prasad, Keerthana; Zimmermann, Bernhard; Prabhu, Gopalakrishna; Pai, Muktha

    2010-01-01

    Cancer of the breast is the second most common human neoplasm, accounting for approximately one quarter of all cancers in females after cervical carcinoma. Estrogen receptor (ER), progesterone receptor, and human epidermal growth factor receptor (HER-2/neu) expressions play an important role in the diagnosis and prognosis of breast carcinoma. The tissue microarray (TMA) technique is a high-throughput technique which provides a standardized set of uniformly stained images, facilitating effective automation of the evaluation of specimen images. The TMA technique is widely used to evaluate hormone expression for the diagnosis of breast cancer. If one considers the time taken for each of the steps in the tissue microarray process workflow, the maximum amount of time is taken by the analysis step. Hence, automated analysis will significantly reduce the overall time required to complete a study. Many tools are available for automated digital acquisition of images of the spots from the microarray slide. Each of these images needs to be evaluated by a pathologist, who assigns a score based on staining intensity to represent the hormone expression and classifies the image as a negative or positive case. Our work aims to develop a system for automated evaluation of sets of images generated through the tissue microarray technique, representing ER expression and HER-2/neu expression. Our study is based on the Tissue Microarray Database portal of Stanford University at http://tma.stanford.edu/cgi-bin/cx?n=her1, which has made a huge number of images available to researchers. We used 171 images corresponding to ER expression and 214 images corresponding to HER-2/neu expression of breast carcinoma. Of the 171 ER images, 104 were negative and 67 were positive; of the 214 HER-2/neu images, 112 were negative and 102 were positive. Our method has 92
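The scoring step described above (classifying a core as positive or negative from staining intensity) can be illustrated with a minimal sketch. This is not the authors' pipeline; the intensity threshold and positivity cutoff below are hypothetical parameters:

```python
# Toy sketch: score a TMA core as hormone-receptor positive or negative from
# the fraction of stained pixels. Pixel values are invented stain intensities
# in [0, 255]; both thresholds are assumed, not taken from the paper.

STAIN_THRESHOLD = 100      # intensity above which a pixel counts as stained
POSITIVE_FRACTION = 0.30   # fraction of stained pixels needed to call "positive"

def classify_core(pixels):
    stained = sum(1 for p in pixels if p > STAIN_THRESHOLD)
    return "positive" if stained / len(pixels) >= POSITIVE_FRACTION else "negative"

weak_core   = [20, 35, 90, 110, 40, 55, 60, 80, 30, 25]        # 1/10 stained
strong_core = [180, 200, 90, 150, 170, 40, 130, 60, 210, 190]  # 7/10 stained

print(classify_core(weak_core))    # negative
print(classify_core(strong_core))  # positive
```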

  7. A Program Certification Assistant Based on Fully Automated Theorem Provers

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd

    2005-01-01

    We describe a certification assistant to support formal safety proofs for programs. It is based on a graphical user interface that hides the low-level details of first-order automated theorem provers while supporting limited interactivity: it allows users to customize and control the proof process on a high level, manages the auxiliary artifacts produced during this process, and provides traceability between the proof obligations and the relevant parts of the program. The certification assistant is part of a larger program synthesis system and is intended to support the deployment of automatically generated code in safety-critical applications.

  8. Gene Expression Measurement Module (GEMM) - a fully automated, miniaturized instrument for measuring gene expression in space

    NASA Astrophysics Data System (ADS)

    Karouia, Fathi; Ricco, Antonio; Pohorille, Andrew; Peyvan, Kianoosh

    2012-07-01

    The capability to measure gene expression on board spacecraft opens the door to a large number of experiments on the influence of the space environment on biological systems that will profoundly impact our ability to conduct safe and effective space travel, and might also shed light on terrestrial physiology, biological function, and human disease and aging processes. Measurements of gene expression will help us to understand adaptation of terrestrial life to conditions beyond the planet of origin, identify deleterious effects of the space environment on a wide range of organisms from microbes to humans, develop effective countermeasures against these effects, determine the metabolic basis of microbial pathogenicity and drug resistance, test our ability to sustain and grow in space organisms that can be used for life support and in situ resource utilization during long-duration space exploration, and monitor both the spacecraft environment and crew health. These and other applications hold significant potential for discoveries in space biology, biotechnology and medicine. Accordingly, supported by funding from the NASA Astrobiology Science and Technology Instrument Development Program, we are developing a fully automated, miniaturized, integrated fluidic system for small spacecraft capable of in situ measurement of microbial expression of thousands of genes from multiple samples. The instrument will be capable of (1) lysing bacterial cell walls, (2) extracting and purifying RNA released from cells, (3) hybridizing it on a microarray, and (4) providing electrochemical readout, all in a microfluidics cartridge. The prototype under development is suitable for deployment on nanosatellite platforms developed by the NASA Small Spacecraft Office. The first target application is to cultivate and measure gene expression of the photosynthetic bacterium Synechococcus elongatus, a cyanobacterium known to exhibit remarkable metabolic diversity and resilience to adverse conditions

  9. Gene Expression Measurement Module (GEMM) - A Fully Automated, Miniaturized Instrument for Measuring Gene Expression in Space

    NASA Technical Reports Server (NTRS)

    Pohorille, Andrew; Peyvan, Kia; Karouia, Fathi; Ricco, Antonio

    2012-01-01

    The capability to measure gene expression on board spacecraft opens the door to a large number of high-value experiments on the influence of the space environment on biological systems. For example, measurements of gene expression will help us to understand adaptation of terrestrial life to conditions beyond the planet of origin, identify deleterious effects of the space environment on a wide range of organisms from microbes to humans, develop effective countermeasures against these effects, and determine the metabolic bases of microbial pathogenicity and drug resistance. These and other applications hold significant potential for discoveries in space biology, biotechnology, and medicine. Supported by funding from the NASA Astrobiology Science and Technology Instrument Development Program, we are developing a fully automated, miniaturized, integrated fluidic system for small spacecraft capable of in-situ measurement of expression of several hundreds of microbial genes from multiple samples. The instrument will be capable of (1) lysing cell walls of bacteria sampled from cultures grown in space, (2) extracting and purifying RNA released from cells, (3) hybridizing the RNA on a microarray and (4) providing readout of the microarray signal, all in a single microfluidics cartridge. The device is suitable for deployment on nanosatellite platforms developed by NASA Ames' Small Spacecraft Division. To meet space and other technical constraints imposed by these platforms, a number of technical innovations are being implemented. The integration and end-to-end technological and biological validation of the instrument are carried out using as a model the photosynthetic bacterium Synechococcus elongatus, known for its remarkable metabolic diversity and resilience to adverse conditions. 
Each step in the measurement process (lysis, nucleic acid extraction, purification, and hybridization to an array) is assessed through comparison of the results obtained using the instrument with

  10. An automated microfluidic system for single-stranded DNA preparation and magnetic bead-based microarray analysis

    PubMed Central

    Wang, Shuaiqin; Sun, Yujia; Liu, Yan; Xiang, Guangxin; Wang, Lei; Cheng, Jing; Liu, Peng

    2015-01-01

    We present an integrated microfluidic device capable of performing single-stranded DNA (ssDNA) preparation and magnetic bead-based microarray analysis with white-light detection for detecting mutations that account for hereditary hearing loss. The entire operation process, which includes loading of streptavidin-coated magnetic beads (MBs) and biotin-labeled polymerase chain reaction products, active dispersion of the MBs with DNA for binding, alkaline denaturation of DNA, dynamic hybridization of the bead-labeled ssDNA to a tag array, and white-light detection, can all be automatically accomplished in a single chamber of the microchip, which was operated on a self-contained instrument with all the necessary components for thermal control, fluidic control, and detection. Two novel mixing valves with embedded polydimethylsiloxane membranes, which can alternately generate a 3-μl pulse flow at a peak rate of around 160 mm/s, were integrated into the chip for thoroughly dispersing magnetic beads in 2 min. The binding efficiency of biotinylated oligonucleotides to beads was measured to be 80.6% of that obtained in a tube with the conventional method. To critically test the performance of this automated microsystem, we employed a commercial microarray-based detection kit for detecting nine mutation loci that account for hereditary hearing loss. The limit of detection of the microsystem was determined as 2.5 ng of input K562 standard genomic DNA using this kit. In addition, four blood samples obtained from persons with mutations were all correctly typed by our system in less than 45 min per run. The fully automated, “amplicon-in-answer-out” operation, together with the white-light detection, makes our system an excellent platform for low-cost, rapid genotyping in clinical diagnosis. PMID:25825617

  11. An automated microfluidic system for single-stranded DNA preparation and magnetic bead-based microarray analysis.

    PubMed

    Wang, Shuaiqin; Sun, Yujia; Gan, Wupeng; Liu, Yan; Xiang, Guangxin; Wang, Dong; Wang, Lei; Cheng, Jing; Liu, Peng

    2015-03-01

    We present an integrated microfluidic device capable of performing single-stranded DNA (ssDNA) preparation and magnetic bead-based microarray analysis with white-light detection for detecting mutations that account for hereditary hearing loss. The entire operation process, which includes loading of streptavidin-coated magnetic beads (MBs) and biotin-labeled polymerase chain reaction products, active dispersion of the MBs with DNA for binding, alkaline denaturation of DNA, dynamic hybridization of the bead-labeled ssDNA to a tag array, and white-light detection, can all be automatically accomplished in a single chamber of the microchip, which was operated on a self-contained instrument with all the necessary components for thermal control, fluidic control, and detection. Two novel mixing valves with embedded polydimethylsiloxane membranes, which can alternately generate a 3-μl pulse flow at a peak rate of around 160 mm/s, were integrated into the chip for thoroughly dispersing magnetic beads in 2 min. The binding efficiency of biotinylated oligonucleotides to beads was measured to be 80.6% of that obtained in a tube with the conventional method. To critically test the performance of this automated microsystem, we employed a commercial microarray-based detection kit for detecting nine mutation loci that account for hereditary hearing loss. The limit of detection of the microsystem was determined as 2.5 ng of input K562 standard genomic DNA using this kit. In addition, four blood samples obtained from persons with mutations were all correctly typed by our system in less than 45 min per run. The fully automated, "amplicon-in-answer-out" operation, together with the white-light detection, makes our system an excellent platform for low-cost, rapid genotyping in clinical diagnosis. PMID:25825617

  12. Current advances and strategies towards fully automated sample preparation for regulated LC-MS/MS bioanalysis.

    PubMed

    Zheng, Naiyu; Jiang, Hao; Zeng, Jianing

    2014-09-01

    Robotic liquid handlers (RLHs) have been widely used in automated sample preparation for liquid chromatography-tandem mass spectrometry (LC-MS/MS) bioanalysis. Automated sample preparation for regulated bioanalysis offers significantly higher assay efficiency, better data quality and potential bioanalytical cost-savings. For RLHs that are used for regulated bioanalysis, there are additional requirements, including 21 CFR Part 11 compliance, software validation, system qualification, calibration verification and proper maintenance. This article reviews recent advances in automated sample preparation for regulated bioanalysis in the last 5 years. Specifically, it covers the following aspects: regulated bioanalysis requirements, recent advances in automation hardware and software development, sample extraction workflow simplification, strategies towards fully automated sample extraction, and best practices in automated sample preparation for regulated bioanalysis. PMID:25384595

  13. A fully automated system for adherent cells microinjection.

    PubMed

    Becattini, Gabriele; Mattos, Leonardo S; Caldwell, Darwin G

    2014-01-01

    This paper proposes an automated robotic system to perform cell microinjections to relieve human operators from this highly difficult and tedious manual procedure. The system, which uses commercial equipment currently found in most biomanipulation laboratories, consists of a multitask software framework combining computer vision and robotic control elements. The vision part features an injection pipette tracker and an automatic cell targeting system that is responsible for defining injection points within the contours of adherent cells in culture. The main challenge is the use of bright-field microscopy only, without the need for chemical markers normally employed to highlight the cells. Here, cells are identified and segmented using a threshold-based image processing technique working on defocused images. Fast and precise microinjection pipette positioning over the automatically defined targets is performed by a two-stage robotic system which achieves an average injection rate of 7.6 cells/min with a pipette positioning precision of 0.23 μm. The consistency of these microinjections and the performance of the visual targeting framework were experimentally evaluated using two cell lines (CHO-K1 and HEK) and over 500 cells. In these trials, the cells were automatically targeted and injected with a fluorescent marker, resulting in a correct cell detection rate of 87% and a successful marker delivery rate of 67.5%. These results demonstrate that the new system is capable of better performance than expert operators, highlighting its benefits and potential for large-scale application. PMID:24403406
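The threshold-based segmentation of defocused bright-field images described above can be sketched in miniature: threshold the frame, then group bright pixels into connected components as candidate injection targets. The image and threshold below are toy assumptions, not the paper's data or code:

```python
from collections import deque

# Minimal sketch (assumed parameters): segment bright blobs from a toy
# "defocused" frame by global thresholding, then count 4-connected
# components as candidate cells.
THRESHOLD = 5

image = [
    [0, 0, 7, 8, 0, 0],
    [0, 0, 9, 7, 0, 0],
    [0, 0, 0, 0, 0, 0],
    [6, 7, 0, 0, 8, 9],
    [7, 6, 0, 0, 9, 7],
]

def segment(img, thr):
    rows, cols = len(img), len(img[0])
    mask = [[img[r][c] > thr for c in range(cols)] for r in range(rows)]
    seen, blobs = set(), []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and (r, c) not in seen:
                queue, blob = deque([(r, c)]), []
                seen.add((r, c))
                while queue:  # breadth-first flood fill over 4-neighbors
                    y, x = queue.popleft()
                    blob.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and mask[ny][nx] and (ny, nx) not in seen:
                            seen.add((ny, nx))
                            queue.append((ny, nx))
                blobs.append(blob)
    return blobs

cells = segment(image, THRESHOLD)
print(len(cells), "candidate cells")  # 3 blobs in this toy frame
```

A real pipeline would then compute each blob's centroid to position the pipette.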

  14. A fully automated, quantitative test of upper limb function.

    PubMed

    Prochazka, Arthur; Kowalczewski, Jan

    2015-01-01

    The Rehabilitation Joystick for Computerized Exercise (ReJoyce, Rehabtronics Inc., Edmonton, Alberta, Canada), is a workstation on which participants exercise dexterous movement tasks in the guise of computer games. The system incorporates the ReJoyce Arm and Hand Function Test (RAHFT). Here the authors evaluate the RAHFT against the Action Research Arm Test (ARAT) and the Fugl-Meyer Assessment (FMA). All 3 tests were performed in 36 separate sessions in 13 tetraplegic individuals. Concurrent and criterion validities of the RAHFT were supported by a high level of correlation with the ARAT (r2 = .88). Regarding responsiveness, the effect size of the RAHFT at week 6 of 1 hr/day exercise training was 1.8. Regarding reliability, the mean test-retest difference in RAHFT baseline scores was 0.67% ± 3.6%, which was not statistically significant. The RAHFT showed less ceiling effect than either ARAT or FMA. These data help validate the RAHFT as a quantitative, automated alternative to the ARAT and FMA. The RAHFT is the first comprehensive test of arm and dexterous hand function that does not depend on human judgment. It offers a standardized, quantitative outcome evaluation, which can be performed not only in the clinic, but also in the participant's home, administered by a remote therapist over the Internet. PMID:25575220
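The concurrent-validity figure quoted above (r2 = .88 between RAHFT and ARAT) is a squared Pearson correlation over paired scores. A minimal sketch, using invented scores rather than study data:

```python
from math import sqrt

# Toy validity analysis: squared Pearson correlation (r^2) between paired
# scores from two tests. The score lists are illustrative only.
rahft = [40, 55, 62, 70, 85, 90]
arat  = [18, 25, 30, 34, 45, 50]

def r_squared(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return (cov / sqrt(vx * vy)) ** 2

print(f"r^2 = {r_squared(rahft, arat):.3f}")
```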

  15. Improving reticle defect disposition via fully automated lithography simulation

    NASA Astrophysics Data System (ADS)

    Mann, Raunak; Goodman, Eliot; Lao, Keith; Ha, Steven; Vacca, Anthony; Fiekowsky, Peter; Fiekowsky, Dan

    2016-03-01

    Most advanced wafer fabs have embraced complex pattern decoration, which creates numerous challenges during in-fab reticle qualification. These optical proximity correction (OPC) techniques create assist features that tend to be very close in size and shape to the main patterns as seen in Figure 1. A small defect on an assist feature will most likely have little or no impact on the fidelity of the wafer image, whereas the same defect on a main feature could significantly decrease device functionality. In order to properly disposition these defects, reticle inspection technicians need an efficient method that automatically separates main from assist features and predicts the resulting defect impact on the wafer image; this is provided by the Automated Defect Analysis System (ADAS) defect simulation system[1]. Until now, using ADAS simulation was limited to engineers due to the complexity of the settings that must be manually entered in order to create an accurate result. A single error in entering one of these values can cause erroneous results; therefore, full automation is necessary. In this study, we propose a new method where all needed simulation parameters are automatically loaded into ADAS. This is accomplished in two parts. First, we have created a scanner parameter database that is automatically identified from mask product and level names. Second, we automatically determine the appropriate simulation printability threshold by using a new reference image (provided by the inspection tool) that contains a known measured value of the reticle critical dimension (CD). This new method automatically loads the correct scanner conditions, sets the appropriate simulation threshold, and automatically measures the percentage of CD change caused by the defect. This streamlines qualification and reduces the number of reticles being put on hold waiting for engineer review. We also present data showing the consistency and reliability of the new method, along with the impact on the efficiency of in

  16. A Fully Automated Classification for Mapping the Annual Cropland Extent

    NASA Astrophysics Data System (ADS)

    Waldner, F.; Defourny, P.

    2015-12-01

    Mapping the global cropland extent is of paramount importance for food security. Indeed, accurate and reliable information on cropland and the location of major crop types is required to make future policy, investment, and logistical decisions, as well as for production monitoring. Timely cropland information directly feeds early warning systems such as GIEWS and FEWS NET. In Africa, and particularly in the arid and semi-arid regions, food security is at the center of debate (at least 10% of the population remains undernourished) and accurate cropland estimation is a challenge. Space-borne Earth Observation provides opportunities for global cropland monitoring in a spatially explicit, economic, efficient, and objective fashion. In both agricultural monitoring and climate modelling, cropland maps serve as a mask to isolate agricultural land for (i) time-series analysis for crop condition monitoring and (ii) investigating how cropland responds to climatic evolution. A large diversity of mapping strategies, ranging from the local to the global scale and associated with various degrees of accuracy, can be found in the literature. At the global scale, despite efforts, cropland is generally one of the classes with the poorest accuracy, which limits its use for agricultural applications. This research aims at improving cropland delineation from the local scale to the regional and global scales, as well as allowing near-real-time updates. To that aim, five temporal features were designed to target the key characteristics of crop spectral-temporal behavior. To ensure a high degree of automation, training data are extracted from available baseline land cover maps. The method delivers cropland maps with high accuracy over contrasted agro-systems in Ukraine, Argentina, China, and Belgium. The accuracies reached are comparable to those obtained with classifiers trained with in-situ data. Besides, it was found that the cropland class is associated with a low uncertainty.
The temporal features
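The idea of temporal features for cropland detection, as described above, is to reduce a per-pixel vegetation-index time series to a few values capturing crop phenology. A hedged sketch; the feature set and NDVI series below are illustrative, not the paper's five features:

```python
# Hypothetical temporal features computed from a per-pixel NDVI time series.
# Crops show a strong seasonal cycle (high amplitude); stable covers do not.

def temporal_features(ndvi):
    peak = max(ndvi)
    trough = min(ndvi)
    return {
        "maximum": peak,                   # greenness at peak of season
        "amplitude": peak - trough,        # magnitude of the growth cycle
        "time_of_peak": ndvi.index(peak),  # timing within the season
        "mean": sum(ndvi) / len(ndvi),     # overall vegetation level
    }

cropland  = [0.2, 0.3, 0.6, 0.8, 0.7, 0.4, 0.2]  # strong seasonal cycle
grassland = [0.5, 0.5, 0.6, 0.6, 0.6, 0.5, 0.5]  # flat profile

print(temporal_features(cropland))   # high amplitude flags cropland
print(temporal_features(grassland))  # low amplitude
```

A classifier trained on such features (with labels taken from a baseline land cover map, as the abstract describes) would then separate cropland from other classes.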

  17. Accurate, fully-automated NMR spectral profiling for metabolomics.

    PubMed

    Ravanbakhsh, Siamak; Liu, Philip; Bjorndahl, Trent C; Mandal, Rupasri; Grant, Jason R; Wilson, Michael; Eisner, Roman; Sinelnikov, Igor; Hu, Xiaoyu; Luchinat, Claudio; Greiner, Russell; Wishart, David S

    2015-01-01

    Many diseases cause significant changes to the concentrations of small molecules (a.k.a. metabolites) that appear in a person's biofluids, which means such diseases can often be readily detected from a person's "metabolic profile", i.e., the list of concentrations of those metabolites. This information can be extracted from a biofluid's Nuclear Magnetic Resonance (NMR) spectrum. However, due to its complexity, NMR spectral profiling has remained manual, resulting in slow, expensive and error-prone procedures that have hindered clinical and industrial adoption of metabolomics via NMR. This paper presents a system, BAYESIL, which can quickly, accurately, and autonomously produce a person's metabolic profile. Given a 1D 1H NMR spectrum of a complex biofluid (specifically serum or cerebrospinal fluid), BAYESIL can automatically determine the metabolic profile. This requires first performing several spectral processing steps, then matching the resulting spectrum against a reference compound library, which contains the "signatures" of each relevant metabolite. BAYESIL views spectral matching as an inference problem within a probabilistic graphical model that rapidly approximates the most probable metabolic profile. Our extensive studies on a diverse set of complex mixtures, including real biological samples (serum and CSF), defined mixtures, and realistic computer-generated spectra involving > 50 compounds, show that BAYESIL can autonomously find the concentration of NMR-detectable metabolites accurately (~90% correct identification and ~10% quantification error) in less than 5 minutes on a single CPU. These results demonstrate that BAYESIL is the first fully automatic, publicly accessible system that provides quantitative NMR spectral profiling effectively, with an accuracy on these biofluids that meets or exceeds the performance of trained experts. We anticipate this tool will usher in high-throughput metabolomics and enable a wealth of new applications of NMR in
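The core idea of matching a mixture spectrum against reference "signatures" can be illustrated with a toy least-squares fit. This is not BAYESIL's probabilistic inference; the two signatures, compound names, and concentrations below are made up:

```python
# Toy illustration: recover metabolite concentrations by fitting a mixture
# spectrum as a weighted sum of known reference signatures (4 bins each).
glucose = [1.0, 0.5, 0.0, 0.2]
lactate = [0.0, 0.3, 1.0, 0.4]

# Synthetic mixture generated from 2.0 x glucose + 1.5 x lactate
mixture = [2.0 * g + 1.5 * l for g, l in zip(glucose, lactate)]

# Solve the 2x2 normal equations of the least-squares problem by hand
a11 = sum(g * g for g in glucose)
a12 = sum(g * l for g, l in zip(glucose, lactate))
a22 = sum(l * l for l in lactate)
b1 = sum(g * m for g, m in zip(glucose, mixture))
b2 = sum(l * m for l, m in zip(lactate, mixture))
det = a11 * a22 - a12 * a12
c_glucose = (b1 * a22 - b2 * a12) / det
c_lactate = (a11 * b2 - a12 * b1) / det

print(round(c_glucose, 3), round(c_lactate, 3))  # recovers 2.0 and 1.5
```

Real spectra add peak shifts, overlaps, and noise, which is why BAYESIL resorts to a probabilistic graphical model rather than a plain linear fit.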

  18. A Fully Automated Microfluidic Femtosecond Laser Axotomy Platform for Nerve Regeneration Studies in C. elegans

    PubMed Central

    Gokce, Sertan Kutal; Guo, Samuel X.; Ghorashian, Navid; Everett, W. Neil; Jarrell, Travis; Kottek, Aubri; Bovik, Alan C.; Ben-Yakar, Adela

    2014-01-01

    Femtosecond laser nanosurgery has been widely accepted as an axonal injury model, enabling nerve regeneration studies in the small model organism, Caenorhabditis elegans. To overcome the time limitations of manual worm handling techniques, automation and new immobilization technologies must be adopted to improve throughput in these studies. While new microfluidic immobilization techniques have been developed that promise to reduce the time required for axotomies, there is a need for automated procedures to minimize the required amount of human intervention and accelerate the axotomy processes crucial for high-throughput. Here, we report a fully automated microfluidic platform for performing laser axotomies of fluorescently tagged neurons in living Caenorhabditis elegans. The presented automation process reduces the time required to perform axotomies within individual worms to ∼17 s/worm, at least one order of magnitude faster than manual approaches. The full automation is achieved with a unique chip design and an operation sequence that is fully computer controlled and synchronized with efficient and accurate image processing algorithms. The microfluidic device includes a T-shaped architecture and three-dimensional microfluidic interconnects to serially transport, position, and immobilize worms. The image processing algorithms can identify and precisely position axons targeted for ablation. There were no statistically significant differences observed in reconnection probabilities between axotomies carried out with the automated system and those performed manually with anesthetics. The overall success rate of automated axotomies was 67.4±3.2% of the cases (236/350) at an average processing rate of 17.0±2.4 s. This fully automated platform establishes a promising methodology for prospective genome-wide screening of nerve regeneration in C. elegans in a truly high-throughput manner. PMID:25470130

  19. A fully automated center pivot using crop canopy temperature: Preliminary results

    Technology Transfer Automated Retrieval System (TEKTRAN)

    It has been shown that the temperature-time threshold (TTT) method of automatic irrigation scheduling is a viable alternative to traditional soil water based irrigation scheduling in the Southern High Plains. This method was used to fully automate a center pivot in the panhandle of Texas. An array o...

  20. 21 CFR 866.1645 - Fully automated short-term incubation cycle antimicrobial susceptibility system.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Fully automated short-term incubation cycle antimicrobial susceptibility system. 866.1645 Section 866.1645 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES IMMUNOLOGY AND MICROBIOLOGY...

  1. 21 CFR 866.1645 - Fully automated short-term incubation cycle antimicrobial susceptibility system.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Fully automated short-term incubation cycle antimicrobial susceptibility system. 866.1645 Section 866.1645 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES IMMUNOLOGY AND MICROBIOLOGY...

  2. 21 CFR 866.1645 - Fully automated short-term incubation cycle antimicrobial susceptibility system.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Fully automated short-term incubation cycle antimicrobial susceptibility system. 866.1645 Section 866.1645 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES IMMUNOLOGY AND MICROBIOLOGY...

  3. 21 CFR 866.1645 - Fully automated short-term incubation cycle antimicrobial susceptibility system.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Fully automated short-term incubation cycle antimicrobial susceptibility system. 866.1645 Section 866.1645 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES IMMUNOLOGY AND MICROBIOLOGY...

  4. 21 CFR 866.1645 - Fully automated short-term incubation cycle antimicrobial susceptibility system.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Fully automated short-term incubation cycle antimicrobial susceptibility system. 866.1645 Section 866.1645 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES IMMUNOLOGY AND MICROBIOLOGY...

  5. ProDeGe: A Computational Protocol for fully Automated Decontamination of Genomic Data

    Energy Science and Technology Software Center (ESTSC)

    2015-12-01

    The Single Cell Data Decontamination Pipeline is a fully-automated software tool which classifies unscreened contigs from single cell datasets through a combination of homology and feature-based methodologies using the organism's nucleotide sequences and known NCBI taxonomy. The software is freely available to download and install, and can be run on any system.

  6. Considerations for Using Phased Array Ultrasonics in a Fully Automated Inspection System

    NASA Astrophysics Data System (ADS)

    Kramb, V. A.; Olding, R. B.; Sebastian, J. R.; Hoppe, W. C.; Petricola, D. L.; Hoeffel, J. D.; Gasper, D. A.; Stubbs, D. A.

    2004-02-01

    The University of Dayton Research Institute (UDRI) under contract by the US Air Force has designed and constructed a fully automated ultrasonic inspection system for the detection of embedded defects in rotating gas turbine engine components. The system performs automated inspections using the "scan plan" concept developed for the Air Force sponsored "Retirement For Cause" (RFC) automated eddy current system. Execution of the scan plan results in a fully automated inspection process producing engine component accept/reject decisions based on probability of detection (POD) information. Use of the phased-array ultrasonic instrument and probes allows for optimization of both the sensitivity and resolution for each inspection through electronic beamforming, scanning, and focusing processes. However, issues such as alignment of the array probe, calibration of individual elements and overall beam response prior to the inspection have not been addressed for an automated system. This paper will discuss current progress in the development of an automated alignment and calibration procedure for various phased array apertures and specimen geometries.

  7. FORCe: Fully Online and Automated Artifact Removal for Brain-Computer Interfacing.

    PubMed

    Daly, Ian; Scherer, Reinhold; Billinger, Martin; Müller-Putz, Gernot

    2015-09-01

    A fully automated and online artifact removal method for the electroencephalogram (EEG) is developed for use in brain-computer interfacing (BCI). The method (FORCe) is based upon a novel combination of wavelet decomposition, independent component analysis, and thresholding. FORCe is able to operate on a small channel set during online EEG acquisition and does not require additional signals (e.g., electrooculogram signals). Evaluation of FORCe is performed offline on EEG recorded from 13 BCI participants with cerebral palsy (CP) and online with three healthy participants. The method outperforms the state-of-the-art automated artifact removal methods Lagged Auto-Mutual Information Clustering (LAMIC) and Fully Automated Statistical Thresholding for EEG artifact Rejection (FASTER), and is able to remove a wide range of artifact types including blink, electromyogram (EMG), and electrooculogram (EOG) artifacts. PMID:25134085
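The wavelet-thresholding ingredient of such pipelines can be sketched with a one-level Haar transform: decompose the signal, zero detail coefficients whose magnitude exceeds a threshold (large spikes, e.g. blinks), and reconstruct. This is only one of FORCe's three ingredients (it also uses ICA), and the threshold value is an assumption:

```python
# Minimal Haar wavelet hard-thresholding sketch for spike-like EEG artifacts.
# Signal length must be even; the threshold of 3.0 is an assumed parameter.

def haar_denoise(signal, threshold):
    s = 0.5 ** 0.5  # 1/sqrt(2)
    pairs = list(zip(signal[::2], signal[1::2]))
    approx = [s * (a + b) for a, b in pairs]
    detail = [s * (a - b) for a, b in pairs]
    # Zero large detail coefficients, which carry the spike artifacts
    detail = [0.0 if abs(d) > threshold else d for d in detail]
    out = []
    for a, d in zip(approx, detail):  # inverse one-level Haar transform
        out += [s * (a + d), s * (a - d)]
    return out

eeg = [1.0, 1.2, 0.9, 1.1, 9.0, -7.0, 1.0, 0.8]  # spike = blink-like artifact
clean = haar_denoise(eeg, threshold=3.0)
print([round(x, 2) for x in clean])  # spike at indices 4-5 is flattened
```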

  8. A Comparison between Manual and Automated Evaluations of Tissue Microarray Patterns of Protein Expression

    PubMed Central

    Alvarenga, Arthur W.; Coutinho-Camillo, Claudia M.; Rodrigues, Bruna R.; Rocha, Rafael M.; Torres, Luiz Fernando B.; Martins, Vilma R.; da Cunha, Isabela W.

    2013-01-01

    Tissue microarray technology enables us to evaluate the pattern of protein expression in large numbers of samples. However, manual data acquisition and analysis still represent a challenge because they are subjective and time-consuming. Automated analysis may thus increase the speed and reproducibility of evaluation. However, the reliability of automated analysis systems should be independently evaluated. Herein, the expression of phosphorylated AKT and mTOR was determined by ScanScope XT (Aperio; Vista, CA) and ACIS III (Dako; Glostrup, Denmark) and compared with the manual analysis by two observers. The percentage of labeled pixels or nuclei analysis had a good correlation between human observers and automated systems (κ = 0.855 and 0.879 for ScanScope vs. observers and κ = 0.765 and 0.793 for ACIS III vs. observers). The intensity of labeling determined by ScanScope was also correlated with that found by the human observers (correlation index of 0.946 and 0.851 for pAKT and 0.851 and 0.875 for pmTOR). However, the correlation between ACIS III and human observation varied for labeling intensity and was considered poor in some cases (correlation index of 0.718 and 0.680 for pAKT and 0.223 and 0.225 for pmTOR). Thus, the percentage of positive pixels or nuclei determination was satisfactorily performed by both systems; however, labeling intensity was better identified by ScanScope XT. PMID:23340270
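The agreement statistic used above (Cohen's kappa between human observers and automated systems) corrects raw agreement for chance. A minimal sketch; the positive/negative labels below are invented:

```python
# Cohen's kappa between a human observer and an automated scorer on binary
# positive/negative calls. The label lists are illustrative only.

def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    labels = set(rater_a) | set(rater_b)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal label frequencies
    expected = sum(
        (rater_a.count(l) / n) * (rater_b.count(l) / n) for l in labels
    )
    return (observed - expected) / (1 - expected)

human     = ["pos", "pos", "neg", "neg", "pos", "neg", "pos", "neg"]
automated = ["pos", "pos", "neg", "neg", "pos", "neg", "neg", "neg"]

print(round(cohens_kappa(human, automated), 3))  # → 0.75
```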

  9. Fully automated digital holographic processing for monitoring the dynamics of a vesicle suspension under shear flow

    PubMed Central

    Minetti, Christophe; Podgorski, Thomas; Coupier, Gwennou; Dubois, Frank

    2014-01-01

    We investigate the dynamics of a vesicle suspension under shear flow between plates using digital holographic microscopy (DHM) with a spatially reduced coherent source. Holograms are grabbed at a frequency of 24 frames/sec. The distribution of the vesicle suspension is obtained after numerical processing of the digital hologram sequence, resulting in a 4D distribution. Obtaining this distribution is not straightforward and requires special processing to automate the analysis. We present an original method that fully automates the analysis and provides distributions that are further analyzed to extract physical properties of the fluid. Details of the numerical implementation, as well as sample experimental results, are presented. PMID:24877015

  10. Automated regenerable microarray-based immunoassay for rapid parallel quantification of mycotoxins in cereals.

    PubMed

    Oswald, S; Karsunke, X Y Z; Dietrich, R; Märtlbauer, E; Niessner, R; Knopp, D

    2013-08-01

    An automated flow-through multi-mycotoxin immunoassay using the stand-alone Munich Chip Reader 3 platform and reusable biochips was developed and evaluated. This technology combines a unique microarray, prepared by covalent immobilization of target analytes or derivatives on diamino-poly(ethylene glycol) functionalized glass slides, with a dedicated chemiluminescence readout by a CCD camera. In a first stage, we aimed for the parallel detection of aflatoxins, ochratoxin A, deoxynivalenol, and fumonisins in cereal samples in a competitive indirect immunoassay format. The method combines sample extraction with methanol/water (80:20, v/v), extract filtration and dilution, and immunodetection using horseradish peroxidase-labeled anti-mouse IgG antibodies. The total analysis time, including extraction, extract dilution, measurement, and surface regeneration, was 19 min. The prepared microarray chip was reusable at least 50 times. Oat extract proved to be a representative sample matrix for the preparation of mycotoxin standards and the determination of different types of cereals such as oat, wheat, rye, and maize polenta at relevant concentrations according to the European Commission regulation. The recovery rates of fortified samples in different matrices were lower (55-80% and 58-79%) for the more water-soluble fumonisin B1 and deoxynivalenol, and higher (127-132% and 82-120%) for the less polar aflatoxins and ochratoxin A, respectively. Finally, the results for wheat samples naturally contaminated with deoxynivalenol were critically compared in an interlaboratory comparison with data obtained from microtiter plate ELISA, the aokinmycontrol® method, and liquid chromatography-mass spectrometry, and were found to be in good agreement. PMID:23620369
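Competitive immunoassay calibration curves of this kind are commonly fitted with a four-parameter logistic (4PL) model; the abstract does not state the fitting function used, so the sketch below is a generic illustration with synthetic numbers, recovering the IC50 of a simulated calibration series:

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    """Four-parameter logistic: in a competitive format the signal falls
    with analyte concentration (a = max signal, b = slope, c = IC50, d = min)."""
    return d + (a - d) / (1.0 + (x / c) ** b)

# Synthetic calibration series (concentration in ug/kg, CL signal in a.u.).
conc = np.array([0.1, 0.5, 1.0, 5.0, 10.0, 50.0, 100.0])
signal = four_pl(conc, 1000.0, 1.0, 5.0, 50.0)

popt, _ = curve_fit(four_pl, conc, signal, p0=[900.0, 1.0, 3.0, 40.0])
print(round(float(popt[2]), 2))   # recovered IC50 (true value: 5.0)
```

Unknown samples are then quantified by inverting the fitted curve; recoveries like the 55-132% figures above are the ratio of back-calculated to spiked concentration.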

  11. Automated Immunomagnetic Separation and Microarray Detection of E. coli O157:H7 from Poultry Carcass Rinse

    SciTech Connect

    Chandler, Darrell P.; Brown, Jeremy D.; Call, Douglas R.; Wunschel, Sharon C.; Grate, Jay W.; Holman, David A.; Olson, Lydia G.; Stottlemyer, Mark S.; Bruckner-Lea, Cindy J.

    2001-09-01

    We describe the development and application of a novel electromagnetic flow cell and fluidics system for automated immunomagnetic separation of E. coli directly from unprocessed poultry carcass rinse, and the biochemical coupling of automated sample preparation with nucleic acid microarrays without cell growth. Highly porous nickel foam was used as a magnetic flux conductor. Up to 32% recovery efficiency of 'total' E. coli was achieved within the automated system with 6 sec contact times and a 15 minute protocol (from sample injection through elution), statistically similar to cell recovery efficiencies in > 1 hour 'batch' captures. The electromagnetic flow cell allowed complete recovery of 2.8 µm particles directly from unprocessed poultry carcass rinse, whereas the batch system did not. O157:H7 cells were reproducibly isolated directly from unprocessed poultry rinse with 39% recovery efficiency at a 10^3 cells ml^-1 inoculum. Direct plating of washed beads showed positive recovery of O157:H7 directly from carcass rinse at an inoculum of 10 cells ml^-1. Recovered beads were used for direct PCR amplification and microarray detection, with a process-level detection limit (automated cell concentration through microarray detection) of < 10^3 cells ml^-1 carcass rinse. The fluidic system and analytical approach described here are generally applicable to most microbial detection problems and applications.

  12. Neurodegenerative changes in Alzheimer's disease: a comparative study of manual, semi-automated, and fully automated assessment using MRI

    NASA Astrophysics Data System (ADS)

    Fritzsche, Klaus H.; Giesel, Frederik L.; Heimann, Tobias; Thomann, Philipp A.; Hahn, Horst K.; Pantel, Johannes; Schröder, Johannes; Essig, Marco; Meinzer, Hans-Peter

    2008-03-01

    Objective quantification of disease-specific neurodegenerative changes can facilitate diagnosis and therapeutic monitoring in several neuropsychiatric disorders. Reproducibility and easy-to-perform assessment are essential to ensure applicability in clinical environments. The aim of this comparative study is the evaluation of a fully automated approach that assesses atrophic changes in Alzheimer's disease (AD) and Mild Cognitive Impairment (MCI). 21 healthy volunteers (mean age 66.2), 21 patients with MCI (66.6), and 10 patients with AD (65.1) were enrolled. Subjects underwent extensive neuropsychological testing and MRI was conducted on a 1.5 Tesla clinical scanner. Atrophic changes were measured automatically by a series of image processing steps including state-of-the-art brain mapping techniques. Results were compared with two reference approaches: a manual segmentation of the hippocampal formation and a semi-automated estimation of temporal horn volume, which is based upon interactive selection of two to six landmarks in the ventricular system. All approaches separated controls and AD patients significantly (10^-5 < p < 10^-4) and showed a slight but not significant increase of neurodegeneration for subjects with MCI compared to volunteers. The automated approach correlated significantly with the manual (r = -0.65, p < 10^-6) and semi-automated (r = -0.83, p < 10^-13) measurements. It achieved high accuracy and at the same time maximized observer independence and time reduction, and thus usefulness for clinical routine.

  13. Fully Automated Quantification of Insulin Concentration Using a Microfluidic-Based Chemiluminescence Immunoassay.

    PubMed

    Yao, Ping; Liu, Zhu; Tung, Steve; Dong, Zaili; Liu, Lianqing

    2016-06-01

    A fully automated microfluidic-based detection system for the rapid determination of insulin concentration through a chemiluminescence immunoassay has been developed. The microfluidic chip used in the system is a double-layered polydimethylsiloxane device embedded with interconnecting micropumps, microvalves, and a micromixer. At a high injection rate of the developing solution, the chemiluminescence signal can be excited and measured within a short period of time. The integral value of the chemiluminescence light signal is used to determine the insulin concentration of the samples, and the results indicate that the measurement is accurate in the range from 1.5 pM to 391 pM. The entire chemiluminescence assay can be completed in less than 10 min. The fully automated microfluidic-based insulin detection system provides a useful platform for rapid determination of insulin in clinical diagnostics for diabetes, which is expected to become increasingly important for future point-of-care applications. PMID:25824205

  14. Fully automated corneal endothelial morphometry of images captured by clinical specular microscopy

    NASA Astrophysics Data System (ADS)

    Bucht, Curry; Söderberg, Per; Manneberg, Göran

    2010-02-01

    The corneal endothelium serves as the posterior barrier of the cornea. Factors such as clarity and refractive properties of the cornea are in direct relationship to the quality of the endothelium. The endothelial cell density is considered the most important morphological factor of the corneal endothelium. Pathological conditions and physical trauma may threaten the endothelial cell density to such an extent that the optical property of the cornea and thus clear eyesight is threatened. Diagnosis of the corneal endothelium through morphometry is an important part of several clinical applications. Morphometry of the corneal endothelium is presently carried out by semi-automated analysis of pictures captured by a Clinical Specular Microscope (CSM). Because of the occasional need for operator involvement, this process can be tedious, having a negative impact on sampling size. This study was dedicated to the development and use of fully automated analysis of a very large range of images of the corneal endothelium, captured by CSM, using Fourier analysis. Software was developed in the mathematical programming language Matlab. Pictures of the corneal endothelium, captured by CSM, were read into the analysis software. The software automatically performed digital enhancement of the images, normalizing lights and contrasts. The digitally enhanced images of the corneal endothelium were Fourier transformed, using the fast Fourier transform (FFT), and stored as new images. Tools were developed and applied for identification and analysis of relevant characteristics of the Fourier transformed images. The data obtained from each Fourier transformed image was used to calculate the mean cell density of its corresponding corneal endothelium. The calculation was based on well-known diffraction theory. Results in form of estimated cell density of the corneal endothelium were obtained, using fully automated analysis software on 292 images captured by CSM.
The cell density obtained by the
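The abstract is truncated above; the underlying idea, estimating cell density from the dominant spatial frequency of the Fourier-transformed mosaic, can be sketched on a synthetic periodic image. The regular square-lattice pattern and the numbers below are illustrative assumptions, not the paper's method details:

```python
import numpy as np

# Synthetic "endothelial mosaic": a regular pattern with a 16-pixel period.
n, period = 256, 16
x = np.arange(n)
img = (np.cos(2 * np.pi * x / period)[None, :]
       + np.cos(2 * np.pi * x / period)[:, None])

# Magnitude spectrum; the dominant off-centre peaks sit at radius n / period.
spec = np.abs(np.fft.fftshift(np.fft.fft2(img)))
spec[n // 2, n // 2] = 0.0                     # suppress any DC component
py, px = np.unravel_index(np.argmax(spec), spec.shape)
radius = np.hypot(py - n // 2, px - n // 2)    # dominant frequency, cycles/image

cells_per_pixel = (radius / n) ** 2            # cells per square pixel, square lattice
print(radius)   # → 16.0
```

With the pixel size of the specular microscope known, cells per square pixel converts directly to cells/mm², the clinically reported endothelial cell density.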

  15. Vervet MRI Atlas and Label Map for Fully Automated Morphometric Analyses

    PubMed Central

    Maldjian, Joseph A.; Daunais, James B.; Friedman, David P.

    2016-01-01

    Currently available non-human primate templates typically require input of a skull-stripped brain for structural processing. This can be a manually intensive procedure, and considerably limits their utility. The purpose of this study was to create a vervet MRI population template, associated tissue probability maps (TPM), and a label atlas to facilitate truly fully automated magnetic resonance imaging (MRI) structural processing for morphometric analyses. Structural MRI scans of ten vervet monkeys (Chlorocebus aethiops) scanned at three time points were used in this study. An unbiased population average template was created using a symmetric diffeomorphic registration (SyN) procedure. Skull stripping, segmentation, and label map generation were performed using the publicly available rhesus INIA19 MRI template and NeuroMap label atlas. A six-class TPM and a six-layer two-class normalization template were created from the vervet segmentation for use within the Statistical Parametric Mapping (SPM) framework. Fully automated morphologic processing of all of the vervet MRI scans was then performed using the vervet TPM and vervet normalization template, including skull-stripping, segmentation, and normalization. The vervet template creation procedure resulted in excellent skull stripping, segmentation, and NeuroMap atlas labeling, with 720 structures successfully registered. Fully automated processing was accomplished for all vervet scans, demonstrating excellent skull-stripping, segmentation, and normalization performance. We describe creation of an unbiased vervet structural MRI population template and atlas. The template includes an associated six-class TPM and DARTEL six-layer two-class normalization template for truly fully automated skull-stripping, segmentation, and normalization of vervet structural T1-weighted MRI scans. We provide the most detailed vervet label atlas currently available, based on the NeuroMaps atlas, with 720 labels successfully registered. We

  16. How a Fully Automated eHealth Program Simulates Three Therapeutic Processes: A Case Study

    PubMed Central

    Johansen, Ayna; Brendryen, Håvar

    2016-01-01

    Background eHealth programs may be better understood by breaking down the components of one particular program and discussing its potential for interactivity and tailoring in regard to concepts from face-to-face counseling. In the search for the efficacious elements within eHealth programs, it is important to understand how a program using lapse management may simultaneously support working alliance, internalization of motivation, and behavior maintenance. These processes have been applied to fully automated eHealth programs individually. However, given their significance in face-to-face counseling, it may be important to simulate the processes simultaneously in interactive, tailored programs. Objective We propose a theoretical model for how fully automated behavior change eHealth programs may be more effective by simulating a therapist’s support of a working alliance, internalization of motivation, and managing lapses. Methods We show how the model is derived from theory and its application to Endre, a fully automated smoking cessation program that engages the user in several “counseling sessions” about quitting. A descriptive case study based on tools from the intervention mapping protocol shows how each therapeutic process is simulated. Results The program supports the user’s working alliance through alliance factors, the nonembodied relational agent Endre and computerized motivational interviewing. Computerized motivational interviewing also supports internalized motivation to quit, whereas a lapse management component responds to lapses. The description operationalizes working alliance, internalization of motivation, and managing lapses, in terms of eHealth support of smoking cessation. Conclusions A program may simulate working alliance, internalization of motivation, and lapse management through interactivity and individual tailoring, potentially making fully automated eHealth behavior change programs more effective. PMID:27354373

  17. Fully Automated Volumetric Modulated Arc Therapy Plan Generation for Prostate Cancer Patients

    SciTech Connect

    Voet, Peter W.J.; Dirkx, Maarten L.P.; Breedveld, Sebastiaan; Al-Mamgani, Abrahim; Incrocci, Luca; Heijmen, Ben J.M.

    2014-04-01

    Purpose: To develop and evaluate fully automated volumetric modulated arc therapy (VMAT) treatment planning for prostate cancer patients, avoiding manual trial-and-error tweaking of plan parameters by dosimetrists. Methods and Materials: A system was developed for fully automated generation of VMAT plans with our commercial clinical treatment planning system (TPS), linked to the in-house developed Erasmus-iCycle multicriterial optimizer for preoptimization. For 30 randomly selected patients, automatically generated VMAT plans (VMAT_auto) were compared with VMAT plans generated manually by 1 expert dosimetrist in the absence of time pressure (VMAT_man). For all treatment plans, planning target volume (PTV) coverage and sparing of organs-at-risk were quantified. Results: All generated plans were clinically acceptable and had similar PTV coverage (V_95% > 99%). For VMAT_auto and VMAT_man plans, the organ-at-risk sparing was similar as well, although only the former plans were generated without any planning workload. Conclusions: Fully automated generation of high-quality VMAT plans for prostate cancer patients is feasible and has recently been implemented in our clinic.
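The V95% coverage criterion used to compare the plans is simply the fraction of PTV voxels receiving at least 95% of the prescribed dose; a minimal sketch with hypothetical dose values (the 78 Gy prescription and voxel doses below are illustrative, not the study's data):

```python
import numpy as np

def v95(ptv_doses, prescribed):
    """Fraction of PTV voxels receiving at least 95% of the prescription."""
    ptv_doses = np.asarray(ptv_doses, dtype=float)
    return float(np.mean(ptv_doses >= 0.95 * prescribed))

# Hypothetical PTV voxel doses (Gy) against a 78 Gy prescription.
doses = [77.9, 78.2, 76.0, 74.5, 78.5, 77.0, 73.0, 79.1]
print(100.0 * v95(doses, 78.0))   # → 87.5
```

A clinically acceptable plan per the abstract's criterion would need this figure to exceed 99% over the full PTV voxel set.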

  18. TreeRipper web application: towards a fully automated optical tree recognition software

    PubMed Central

    2011-01-01

    Background Relationships between species, genes and genomes have been printed as trees for over a century. Whilst this may have been the best format for exchanging and sharing phylogenetic hypotheses during the 20th century, the worldwide web now provides faster and automated ways of transferring and sharing phylogenetic knowledge. However, novel software is needed to defrost these published phylogenies for the 21st century. Results TreeRipper is a simple website for the fully-automated recognition of multifurcating phylogenetic trees (http://linnaeus.zoology.gla.ac.uk/~jhughes/treeripper/). The program accepts a range of input image formats (PNG, JPG/JPEG or GIF). The underlying command-line C++ program follows a number of cleaning steps to detect lines, remove node labels, patch up broken lines and corners, and detect line edges. The edge contour is then determined to detect the branch length, tip label positions and the topology of the tree. Optical Character Recognition (OCR) is used to convert the tip labels into text with the freely available tesseract-ocr software. 32% of images meeting the prerequisites for TreeRipper were successfully recognised; the largest tree had 115 leaves. Conclusions Despite the diversity of ways phylogenies have been illustrated, which makes the design of a fully automated tree recognition software difficult, TreeRipper is a step towards automating the digitization of past phylogenies. We also provide a dataset of 100 tree images and associated tree files for training and/or benchmarking future software. TreeRipper is an open source project licensed under the GNU General Public Licence v3. PMID:21599881

  19. Fully-automated quality assurance in multi-center studies using MRI phantom measurements.

    PubMed

    Davids, Mathias; Zöllner, Frank G; Ruttorf, Michaela; Nees, Frauke; Flor, Herta; Schumann, Gunter; Schad, Lothar R

    2014-07-01

    Phantom measurements allow for investigating the overall quality characteristics of an MRI scanner. Especially within multicenter studies, these characteristics ensure the comparability of the results across different sites, in addition to the performance stability of a single scanner over time. This comparability requires consistent phantoms, sequence protocols, and quality assurance criteria. Within the scope of this work, a software library was implemented for fully-automated determination of important quality characteristics (comprising signal-to-noise ratio, image uniformity, ghosting artifacts, chemical shift, spatial resolution, and spatial linearity), including methods for data preparation, automated pre- and postprocessing as well as visualization and interpretation. All methods were evaluated using both synthetic images with predefined distortions and a set of 44 real phantom measurements involving eight sites and three manufacturers. Using the synthetic phantom images, predefined levels of distortion that were incorporated artificially were correctly detected by the automated routines with no more than 2.6% relative error. In addition, the methods were applied to real phantom measurements: all data sets could be evaluated automatically considering all quality parameters as long as the acquisition protocols were followed. Shortcomings in processability only occurred in the ghosting artifact (39/44 evaluable) and spatial linearity (43/44 evaluable) analyses, due to gross misalignment of the phantom during image acquisition. Based on evaluation results, the accuracy of the evaluation appears to be robust to misalignments, artifacts, and distortions affecting the images, allowing for objective fully-automated evaluation and interpretation of large numbers of data sets. PMID:24602825
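Of the listed quality characteristics, signal-to-noise ratio is the simplest to automate: one common (simplified) definition divides the mean of a phantom ROI by the standard deviation of a background (air) ROI. A numpy sketch on a synthetic phantom image follows; the ROI placements are illustrative, and corrections for non-Gaussian background statistics in magnitude images are omitted:

```python
import numpy as np

def phantom_snr(image, signal_mask, noise_mask):
    """Mean of the phantom ROI divided by the SD of a background ROI."""
    return image[signal_mask].mean() / image[noise_mask].std()

rng = np.random.default_rng(1)
img = rng.normal(2.0, 1.0, (64, 64))          # air background, noise SD = 1
img[16:48, 16:48] += 100.0                    # uniform phantom region

sig = np.zeros((64, 64), dtype=bool)
sig[20:44, 20:44] = True                      # ROI inside the phantom
bg = np.zeros((64, 64), dtype=bool)
bg[:8, :8] = True                             # background (air) corner ROI
print(phantom_snr(img, sig, bg) > 50.0)
```

In a QA library like the one described, such a routine would run on every acquired phantom series and flag scanners whose SNR drifts outside the study's tolerance band.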

  20. Fully automated quantitative analysis of breast cancer risk in DCE-MR images

    NASA Astrophysics Data System (ADS)

    Jiang, Luan; Hu, Xiaoxin; Gu, Yajia; Li, Qiang

    2015-03-01

    The amounts of fibroglandular tissue (FGT) and background parenchymal enhancement (BPE) in dynamic contrast-enhanced magnetic resonance (DCE-MR) images are two important indices for breast cancer risk assessment in clinical practice. The purpose of this study is to develop and evaluate a fully automated scheme for quantitative analysis of FGT and BPE in DCE-MR images. Our fully automated method consists of three steps, i.e., segmentation of the whole breast, fibroglandular tissues, and enhanced fibroglandular tissues. Based on the volume of interest extracted automatically, a dynamic programming method was applied in each 2-D slice of a 3-D MR scan to delineate the chest wall and breast skin line for segmenting the whole breast. This step took advantage of the continuity of the chest wall and breast skin line across adjacent slices. We then used a fuzzy c-means clustering method with automatic selection of the cluster number for segmenting the fibroglandular tissues within the segmented whole breast area. Finally, a statistical method was used to set a threshold based on the estimated noise level for segmenting the enhanced fibroglandular tissues in the subtraction images of pre- and post-contrast MR scans. Based on the segmented whole breast, fibroglandular tissues, and enhanced fibroglandular tissues, FGT and BPE were automatically computed. Preliminary results of technical evaluation and clinical validation showed that our fully automated scheme could obtain good segmentation of the whole breast, fibroglandular tissues, and enhanced fibroglandular tissues to achieve accurate assessment of FGT and BPE for quantitative analysis of breast cancer risk.
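The fuzzy c-means step can be sketched in a few lines of numpy. This is a generic 1-D intensity implementation with a fixed c = 2 and a deterministic percentile initialization, not the authors' automatic cluster-number selection:

```python
import numpy as np

def fuzzy_cmeans(x, c=2, m=2.0, iters=100):
    """Minimal fuzzy c-means on 1-D intensities; returns cluster centers."""
    x = np.asarray(x, dtype=float)
    centers = np.percentile(x, np.linspace(10, 90, c))      # deterministic init
    for _ in range(iters):
        d = np.abs(x[None, :] - centers[:, None]) + 1e-12   # c x n distances
        u = d ** (-2.0 / (m - 1.0))
        u /= u.sum(axis=0, keepdims=True)                   # fuzzy memberships
        um = u ** m
        centers = (um @ x) / um.sum(axis=1)                 # weighted centroids
    return centers

# Two well-separated intensity populations (e.g. fat vs. fibroglandular).
rng = np.random.default_rng(42)
data = np.concatenate([rng.normal(20, 1, 200), rng.normal(80, 1, 200)])
centers = np.sort(fuzzy_cmeans(data, c=2))
print(np.round(centers))   # → [20. 80.]
```

Each voxel's fuzzy membership (rather than a hard label) is what makes c-means robust to the partial-volume intensities typical of breast MR slices.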

  1. Fully automated segmentation and characterization of the dendritic trees of retinal horizontal neurons

    SciTech Connect

    Kerekes, Ryan A; Gleason, Shaun Scott; Martins, Rodrigo; Dyer, Michael

    2010-01-01

    We introduce a new fully automated method for segmenting and characterizing the dendritic tree of neurons in confocal image stacks. Our method is aimed at wide-field-of-view, low-resolution imagery of retinal neurons in which dendrites can be intertwined and difficult to follow. The approach is based on 3-D skeletonization and includes a method for automatically determining an appropriate global threshold as well as a soma detection algorithm. We provide the details of the algorithm and a qualitative performance comparison against a commercially available neurite tracing software package, showing that a segmentation produced by our method more closely matches the ground-truth segmentation.
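The abstract does not specify how the global threshold is determined automatically; Otsu's method is one common choice for bimodal intensity distributions, sketched here on synthetic values standing in for a confocal stack (the intensity populations below are illustrative assumptions):

```python
import numpy as np

def otsu_threshold(values, nbins=256):
    """Global threshold maximizing between-class variance (Otsu's method)."""
    counts, edges = np.histogram(values, bins=nbins)
    mids = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(counts).astype(float)          # class-0 weight per cut
    w1 = w0[-1] - w0                              # class-1 weight
    s0 = np.cumsum(counts * mids)
    mu0 = s0 / np.where(w0 == 0, 1, w0)           # class means (guard /0)
    mu1 = (s0[-1] - s0) / np.where(w1 == 0, 1, w1)
    between = w0 * w1 * (mu0 - mu1) ** 2
    return mids[np.argmax(between)]

# Synthetic stack intensities: dim background plus sparse bright dendrites.
rng = np.random.default_rng(3)
intensities = np.concatenate([rng.normal(30, 5, 5000), rng.normal(180, 10, 500)])
t = otsu_threshold(intensities)
print(round(float(t), 1))
```

The binarized volume would then be thinned to a 3-D skeleton, from which branch points and the detected soma anchor the dendritic tree.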

  2. Regeneration of Recombinant Antigen Microarrays for the Automated Monitoring of Antibodies against Zoonotic Pathogens in Swine Sera

    PubMed Central

    Meyer, Verena K.; Kober, Catharina; Niessner, Reinhard; Seidel, Michael

    2015-01-01

    The ability to regenerate immobilized proteins like recombinant antigens (rAgs) on surfaces is an unsolved problem for flow-based immunoassays on microarray analysis systems. The regeneration on microarray chip surfaces is achieved by changing the protein structures and desorption of antibodies. Afterwards, reactivation of immobilized protein antigens is necessary for reconstitution processes. Any backfolding should be managed in a way that antibodies are able to detect the protein antigens in the next measurement cycle. The regeneration of rAg microarrays was examined for the first time on the MCR3 flow-based chemiluminescence (CL) microarray analysis platform. The aim was to reuse rAg microarray chips in order to reduce the screening effort and costs. An antibody capturing format was used to detect antibodies against zoonotic pathogens in sera of slaughtered pigs. Different denaturation and reactivation buffers were tested. Acidic glycine-SDS buffer (pH 2.5) and 8 M guanidinium hydrochloride showed the best results with respect to denaturation efficiency. The highest CL signals after regeneration were achieved with a carbonate buffer containing 10 mM DTT and 0.1% BSA for reactivation. Antibodies against Yersinia spp. and hepatitis E virus (HEV) were detected in swine sera on one immunochip over 4 days and 25 measurement cycles. Each cycle took 10 min for detection and regeneration. By using the rAg microarray chip, a fast and automated screening of antibodies against pathogens in sera of slaughtered pigs would be possible for zoonosis monitoring. PMID:25625908

  3. DIVAS: fully automated simulation based mask defect dispositioning and defect management system

    NASA Astrophysics Data System (ADS)

    Munir, Saghir; Bald, Daniel J.; Tolani, Vikram; Ghadiali, Firoz; Lieberman, Barry

    2004-12-01

    This article presents the evolution of the first fully automated simulation-based mask defect dispositioning and defect management system, used since late 2002 in a production environment at Intel Mask Operation (IMO). Given that inspection tools flag defects which may or may not have any lithographic significance, it makes sense to repair only those defects that resolve on the wafer. The system described here is a fully automated defect dispositioning system, where the lithographic impact of a defect is determined through computer simulation of the mask-level image. From the simulated aerial images, combined with image processing techniques, the system can automatically determine the actual critical dimension (CD) impact (in nanometers). Then, using the product specification as the criterion, it can pass or fail the defect. Furthermore, this system allows engineers and technicians in the factory to track defects as they are repaired, compare defects at various inspection steps, and annotate repair history. Trends such as yield and defect commonality can also be determined. The article concludes with performance results, indicating the speed and accuracy of the system, as well as the savings in the number of defects needing repair.

  4. Designs and concept reliance of a fully automated high-content screening platform.

    PubMed

    Radu, Constantin; Adrar, Hosna Sana; Alamir, Ab; Hatherley, Ian; Trinh, Trung; Djaballah, Hakim

    2012-10-01

    High-content screening (HCS) is becoming an accepted platform in academic and industry screening labs and does require slightly different logistics for execution. To automate our stand-alone HCS microscopes, namely, an alpha IN Cell Analyzer 3000 (INCA3000), originally a Praelux unit hooked to a Hudson Plate Crane with a maximum capacity of 50 plates per run, and the IN Cell Analyzer 2000 (INCA2000), in which up to 320 plates could be fed per run using the Thermo Fisher Scientific Orbitor, we opted for a 4 m linear track system harboring both microscopes, plate washer, bulk dispensers, and a high-capacity incubator allowing us to perform both live and fixed cell-based assays while accessing both microscopes on deck. Considerations in design were given to the integration of the alpha INCA3000, a new gripper concept to access the onboard nest, and peripheral locations on deck to ensure a self-reliant system capable of achieving higher throughput. The resulting system, referred to as Hestia, has been fully operational since the new year, has an onboard capacity of 504 plates, and harbors the only fully automated alpha INCA3000 unit in the world. PMID:22797489

  5. MAGNETIC RESONANCE IMAGING COMPATIBLE ROBOTIC SYSTEM FOR FULLY AUTOMATED BRACHYTHERAPY SEED PLACEMENT

    PubMed Central

    Muntener, Michael; Patriciu, Alexandru; Petrisor, Doru; Mazilu, Dumitru; Bagga, Herman; Kavoussi, Louis; Cleary, Kevin; Stoianovici, Dan

    2011-01-01

    Objectives To introduce the development of the first magnetic resonance imaging (MRI)-compatible robotic system capable of automated brachytherapy seed placement. Methods An MRI-compatible robotic system was conceptualized and manufactured. The entire robot was built of nonmagnetic and dielectric materials. The key technology of the system is a unique pneumatic motor that was specifically developed for this application. Various preclinical experiments were performed to test the robot for precision and imager compatibility. Results The robot was fully operational within all closed-bore MRI scanners. Compatibility tests in scanners of up to 7 Tesla field intensity showed no interference of the robot with the imager. Precision tests in tissue mockups yielded a mean seed placement error of 0.72 ± 0.36 mm. Conclusions The robotic system is fully MRI compatible. The new technology allows for automated and highly accurate operation within MRI scanners and does not deteriorate the MRI quality. We believe that this robot may become a useful instrument for image-guided prostate interventions. PMID:17169653

  7. Fully Automated Data Collection Using PAM and the Development of PAM/SPACE Reversible Cassettes

    NASA Astrophysics Data System (ADS)

    Hiraki, Masahiko; Watanabe, Shokei; Chavas, Leonard M. G.; Yamada, Yusuke; Matsugaki, Naohiro; Igarashi, Noriyuki; Wakatsuki, Soichi; Fujihashi, Masahiro; Miki, Kunio; Baba, Seiki; Ueno, Go; Yamamoto, Masaki; Suzuki, Mamoru; Nakagawa, Atsushi; Watanabe, Nobuhisa; Tanaka, Isao

    2010-06-01

    To remotely control and automatically collect data in high-throughput X-ray data collection experiments, the Structural Biology Research Center at the Photon Factory (PF) developed and installed sample exchange robots PAM (PF Automated Mounting system) at the PF macromolecular crystallography beamlines: BL-5A, BL-17A, AR-NW12A and AR-NE3A. We developed and installed software that manages the flow of the automated X-ray experiments: sample exchange, loop-centering and X-ray diffraction data collection. The fully automated data collection function has been available since February 2009. To identify sample cassettes, PAM employs a two-dimensional bar code reader. New beamlines, BL-1A at the Photon Factory and BL32XU at SPring-8, are currently under construction as part of the Targeted Proteins Research Program (TPRP) by the Ministry of Education, Culture, Sports, Science and Technology of Japan. However, different robots, PAM and SPACE (SPring-8 Precise Automatic Cryo-sample Exchanger), will be installed at BL-1A and BL32XU, respectively. For the convenience of the users of both facilities, pins and cassettes compatible with both PAM and SPACE are being developed as part of the TPRP.

  8. Artificial Golgi apparatus: globular protein-like dendrimer facilitates fully automated enzymatic glycan synthesis.

    PubMed

    Matsushita, Takahiko; Nagashima, Izuru; Fumoto, Masataka; Ohta, Takashi; Yamada, Kuriko; Shimizu, Hiroki; Hinou, Hiroshi; Naruchi, Kentaro; Ito, Takaomi; Kondo, Hirosato; Nishimura, Shin-Ichiro

    2010-11-24

    Despite the growing importance of synthetic glycans as tools for biological studies and drug discovery, the lack of a general method for their routine synthesis remains a major obstacle. We have developed a new method for automated glycan synthesis that employs an enzymatic approach with a dendrimer as an ideal support within the chemical process. Recovery tests using a hollow fiber ultrafiltration module revealed that monodisperse G6 (MW = 58 kDa) and G7 (MW = 116 kDa) poly(amidoamine) dendrimers exhibit a profile similar to that of BSA (MW = 66 kDa). The characteristics of the globular protein-like G7 dendrimer, with high solubility and low viscosity in water, greatly enhanced throughput and efficiency in automated synthesis, whereas random polyacrylamide-based supports entail significant losses during the repetitive reaction/separation steps. The present protocol allowed the fully automated enzymatic synthesis of sialyl Lewis X tetrasaccharide derivatives over a period of 4 days in 16% overall yield from a simple N-acetyl-d-glucosamine linked to an aminooxy-functionalized G7 dendrimer. PMID:21033706

  9. Fully automated high-quality NMR structure determination of small 2H-enriched proteins

    PubMed Central

    Tang, Yuefeng; Schneider, William M.; Shen, Yang; Raman, Srivatsan; Inouye, Masayori; Baker, David; Roth, Monica J.

    2010-01-01

    Determination of high-quality small protein structures by nuclear magnetic resonance (NMR) methods generally requires acquisition and analysis of an extensive set of structural constraints. The process generally demands extensive backbone and sidechain resonance assignments, and weeks or even months of data collection and interpretation. Here we demonstrate rapid and high-quality protein NMR structure generation using CS-Rosetta with a perdeuterated protein sample made at a significantly reduced cost using new bacterial culture condensation methods. Our strategy provides the basis for a high-throughput approach for routine, rapid, high-quality structure determination of small proteins. As an example, we demonstrate the determination of a high-quality 3D structure of a small 8 kDa protein, E. coli cold shock protein A (CspA), using <4 days of data collection and fully automated data analysis methods together with CS-Rosetta. The resulting CspA structure is highly converged and in excellent agreement with the published crystal structure, with a backbone RMSD of 0.5 Å, an all-atom RMSD of 1.2 Å to the crystal structure for well-defined regions, and an RMSD of 1.1 Å to the crystal structure for core, non-solvent-exposed sidechain atoms. Cross validation of the structure with 15N- and 13C-edited NOESY data obtained with a perdeuterated 15N, 13C-enriched 13CH3 methyl protonated CspA sample confirms that essentially all of these independently interpreted NOE-based constraints are already satisfied in each of the 10 CS-Rosetta structures. By these criteria, the CS-Rosetta structure generated by fully automated analysis of data for a perdeuterated sample provides an accurate structure of CspA. This represents a general approach for rapid, automated structure determination of small proteins by NMR. PMID:20734145
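
    The backbone and all-atom RMSD values quoted above are the standard root-mean-square deviation between matched atom coordinates. As a minimal sketch (assuming the two structures are already optimally superposed; real comparisons first apply a Kabsch-style alignment, omitted here), the metric can be computed as:

```python
import math

def rmsd(coords_a, coords_b):
    """Root-mean-square deviation between matched, pre-superposed atoms.

    coords_a / coords_b: equal-length lists of (x, y, z) tuples in angstroms.
    """
    assert len(coords_a) == len(coords_b)
    sq = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
             for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b))
    return math.sqrt(sq / len(coords_a))

# Toy coordinates (hypothetical): two atoms, each displaced by 0.5 A.
a = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
b = [(0.0, 0.0, 0.5), (1.0, 0.5, 0.0)]
r = rmsd(a, b)  # sqrt((0.25 + 0.25) / 2) = 0.5
```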

  10. Performance of automated scoring of ER, PR, HER2, CK5/6 and EGFR in breast cancer tissue microarrays in the Breast Cancer Association Consortium

    PubMed Central

    Howat, William J; Blows, Fiona M; Provenzano, Elena; Brook, Mark N; Morris, Lorna; Gazinska, Patrycja; Johnson, Nicola; McDuffus, Leigh‐Anne; Miller, Jodi; Sawyer, Elinor J; Pinder, Sarah; van Deurzen, Carolien H M; Jones, Louise; Sironen, Reijo; Visscher, Daniel; Caldas, Carlos; Daley, Frances; Coulson, Penny; Broeks, Annegien; Sanders, Joyce; Wesseling, Jelle; Nevanlinna, Heli; Fagerholm, Rainer; Blomqvist, Carl; Heikkilä, Päivi; Ali, H Raza; Dawson, Sarah‐Jane; Figueroa, Jonine; Lissowska, Jolanta; Brinton, Louise; Mannermaa, Arto; Kataja, Vesa; Kosma, Veli‐Matti; Cox, Angela; Brock, Ian W; Cross, Simon S; Reed, Malcolm W; Couch, Fergus J; Olson, Janet E; Devillee, Peter; Mesker, Wilma E; Seyaneve, Caroline M; Hollestelle, Antoinette; Benitez, Javier; Perez, Jose Ignacio Arias; Menéndez, Primitiva; Bolla, Manjeet K; Easton, Douglas F; Schmidt, Marjanka K; Pharoah, Paul D; Sherman, Mark E

    2014-01-01

    Abstract Breast cancer risk factors and clinical outcomes vary by tumour marker expression. However, individual studies often lack the power required to assess these relationships, and large‐scale analyses are limited by the need for high‐throughput, standardized scoring methods. To address these limitations, we assessed whether automated image analysis of immunohistochemically stained tissue microarrays can permit rapid, standardized scoring of tumour markers from multiple studies. Tissue microarray sections prepared in nine studies, containing 20,263 cores from 8,267 breast cancers stained for two nuclear (oestrogen receptor, progesterone receptor), two membranous (human epidermal growth factor receptor 2 and epidermal growth factor receptor) and one cytoplasmic (cytokeratin 5/6) marker, were scanned as digital images. Automated algorithms were used to score markers in tumour cells using the Ariol system. We compared automated scores against visual reads, and their associations with breast cancer survival. Approximately 65–70% of tissue microarray cores were satisfactory for scoring. Among satisfactory cores, agreement between dichotomous automated and visual scores was highest for oestrogen receptor (Kappa = 0.76), followed by human epidermal growth factor receptor 2 (Kappa = 0.69) and progesterone receptor (Kappa = 0.67). Automated quantitative scores for these markers were associated with hazard ratios for breast cancer mortality in a dose‐response manner. Considering visual scores of epidermal growth factor receptor or cytokeratin 5/6 as the reference, automated scoring achieved excellent negative predictive value (96–98%) but yielded many false positives (positive predictive value = 30–32%). For all markers, we observed substantial heterogeneity in automated scoring performance across tissue microarrays. Automated analysis is a potentially useful tool for large‐scale, quantitative scoring of immunohistochemically stained tissue microarrays.
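
    The positive and negative predictive values reported above follow the standard definitions, with the visual read taken as the reference standard. A small illustration (all data hypothetical):

```python
# Illustrative sketch (hypothetical data, not the Ariol pipeline):
# PPV and NPV of dichotomous automated scores, with the visual read
# treated as the reference standard.

def ppv_npv(automated, visual):
    tp = sum(1 for a, v in zip(automated, visual) if a and v)
    fp = sum(1 for a, v in zip(automated, visual) if a and not v)
    tn = sum(1 for a, v in zip(automated, visual) if not a and not v)
    fn = sum(1 for a, v in zip(automated, visual) if not a and v)
    ppv = tp / (tp + fp) if tp + fp else float("nan")  # positive predictive value
    npv = tn / (tn + fn) if tn + fn else float("nan")  # negative predictive value
    return ppv, npv

# Toy cores: two false positives depress PPV while NPV stays perfect.
auto_scores   = [1, 1, 1, 0, 0, 0, 0, 1, 0, 0]
visual_scores = [1, 0, 0, 0, 0, 0, 0, 1, 0, 0]
ppv, npv = ppv_npv(auto_scores, visual_scores)  # ppv = 0.5, npv = 1.0
```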

  11. Development and evaluation of fully automated demand response in large facilities

    SciTech Connect

    Piette, Mary Ann; Sezgen, Osman; Watson, David S.; Motegi, Naoya; Shockman, Christine; ten Hope, Laurie

    2004-03-30

    This report describes the results of a research project to develop and evaluate the performance of new Automated Demand Response (Auto-DR) hardware and software technology in large facilities. Demand Response (DR) is a set of activities to reduce or shift electricity use to improve electric grid reliability, manage electricity costs, and ensure that customers receive signals that encourage load reduction during times when the electric grid is near its capacity. The two main drivers for widespread demand responsiveness are the prevention of future electricity crises and the reduction of electricity prices. Additional goals for price responsiveness include equity through cost-of-service pricing, and customer control of electricity usage and bills. The technology developed and evaluated in this report could be used to support numerous forms of DR programs and tariffs. For the purpose of this report, we have defined three levels of Demand Response automation. Manual Demand Response involves manually turning off lights or equipment; this can be a labor-intensive approach. Semi-Automated Demand Response involves the use of building energy management control systems for load shedding, where a preprogrammed load-shedding strategy is initiated by facilities staff. Fully Automated Demand Response is initiated at a building or facility through receipt of an external communications signal: facility staff set up a pre-programmed load-shedding strategy which is automatically initiated by the system without the need for human intervention. We have defined this approach to be Auto-DR. An important concept in Auto-DR is that a facility manager is able to "opt out" of or "override" an individual DR event if it occurs at a time when the reduction in end-use services is not desirable. This project sought to improve the feasibility and nature of Auto-DR strategies in large facilities. The research focused on technology development, testing, characterization, and evaluation relating to Auto-DR.
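
    The fully automated level described above can be sketched as a small controller: an external signal triggers a pre-programmed shed strategy unless the facility manager has opted out of the event. This is a hypothetical illustration of the concept, not the project's actual software; all names and actions are invented.

```python
# Minimal sketch (hypothetical) of Auto-DR control logic: a pre-programmed
# load-shedding strategy fires automatically on an external DR signal,
# unless the facility manager has opted out of ("overridden") the event.

PRICE_NORMAL, PRICE_HIGH = "normal", "high"

class AutoDRController:
    def __init__(self, shed_strategy):
        self.shed_strategy = shed_strategy  # callable returning shed actions
        self.opt_out = False                # manager override for this event

    def on_signal(self, price_level):
        """Handle an external DR communications signal."""
        if price_level == PRICE_HIGH and not self.opt_out:
            return self.shed_strategy()     # no human intervention needed
        return []

def dim_lights_and_reset_setpoints():
    # hypothetical pre-programmed strategy set up by facility staff
    return ["dim lighting 30%", "raise cooling setpoint 2F"]

ctl = AutoDRController(dim_lights_and_reset_setpoints)
actions = ctl.on_signal(PRICE_HIGH)      # strategy runs automatically
ctl.opt_out = True
none_taken = ctl.on_signal(PRICE_HIGH)   # override: no load shed
```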

  12. Broadband fully automated digitally assisted coaxial bridge for high accuracy impedance ratio measurements

    NASA Astrophysics Data System (ADS)

    Overney, Frédéric; Lüönd, Felix; Jeanneret, Blaise

    2016-06-01

    This paper describes the principle of a new fully automated, digitally assisted coaxial bridge with a large bandwidth ranging from 60 Hz to 50 kHz. The performance of the bridge is evaluated by making 1:1 comparisons between calculable ac resistors. The agreement between the calculated and the measured frequency dependence of the resistors is better than 5 × 10⁻⁸ at frequencies up to 5 kHz, better than 1 × 10⁻⁷ up to 20 kHz and better than 0.8 × 10⁻⁶ up to 50 kHz. This bridge is particularly well suited to investigating the ac transport properties of graphene in the quantum Hall regime.

  13. Development of fully automated quantitative capillary electrophoresis with high accuracy and repeatability.

    PubMed

    Xu, Yuan; Ling, Bang-Zan; Zhu, Wen-Jun; Yao, Dong; Zhang, Lin; Wang, Yan; Yan, Chao

    2016-03-01

    A quantitative capillary electrophoresis (qCE) system was developed by combining a rotary nano-volume injector, an autosampler, and a thermostat with cooling capacity. Accuracy and precision were greatly improved compared with conventional capillary electrophoresis. The 10 nL injection volume accuracy was guaranteed by the carefully designed nano-injector with an accurate internal loop. Using DMSO as a test sample, system repeatability (precision) of RSD < 0.5% for migration time and < 1% for peak area was achieved. We believe that this fully automated qCE system has the potential to be employed broadly in quality control and quality assurance in the pharmaceutical industry. PMID:26174138

  14. A fully automated in vitro diagnostic system based on magnetic tunnel junction arrays and superparamagnetic particles

    NASA Astrophysics Data System (ADS)

    Lian, Jie; Chen, Si; Qiu, Yuqin; Zhang, Suohui; Shi, Stone; Gao, Yunhua

    2012-04-01

    A fully automated in vitro diagnostic (IVD) system for diagnosing acute myocardial infarction (AMI) was developed using a high-sensitivity magnetic tunnel junction (MTJ) array as the sensor and nano-magnetic particles as tags. On the chip is an array of 12 × 10⁶ MTJ devices integrated onto a three-metal-layer CMOS circuit. The array is divided into 48 detection areas, so up to 48 different types of bio-targets can be analyzed simultaneously if needed. The chip is assembled with a micro-fluidic cartridge which contains all the reagents necessary for completing the assay process. Integrated with electrical, mechanical and micro-fluidic pumping devices, and with the reaction protocol programmed in a microprocessor, the system requires only a simple one-step analyte application procedure to operate and yields results for the three major AMI biomarkers (cTnI, MYO, CK-MB) in 15 min.

  15. Evaluation and comparison of two fully automated radioassay systems with distinctly different modes of analysis

    SciTech Connect

    Chen, I.W.; Maxon, H.R.; Heminger, L.A.; Ellis, K.S.; Volle, C.P.

    1980-12-01

    Two fully automated radioimmunoassay systems, with batch and sequential modes of analysis respectively, were used to assay serum thyroxine, triiodothyronine, and digoxin, and the results were compared with those obtained by manual methods. The batch system uses antibody-coated tubes, while the sequential system uses immobilized-antibody chambers for the separation of bound from free ligand. In accuracy, both systems compared favorably with the established manual methods, but the sequential system showed better precision than the batch system. There was a statistically significant carryover of thyroxine in the sequential system when there were at least six-fold differences in the thyroxine concentrations of adjacent samples; carryover was not significant in the batch system. Compared with the batch system, the sequential system has a shorter per-sample turnaround time (from aspiration of the sample to printout of the result) but takes longer to produce the final overall printout of assay results (lower throughput).

  16. Fully automated multifunctional ultrahigh pressure liquid chromatography system for advanced proteome analyses

    SciTech Connect

    Lee, Jung Hwa; Hyung, Seok-Won; Mun, Dong-Gi; Jung, Hee-Jung; Kim, Hokeun; Lee, Hangyeore; Kim, Su-Jin; Park, Kyong Soo; Moore, Ronald J.; Smith, Richard D.; Lee, Sang-Won

    2012-08-03

    A multi-functional liquid chromatography system that performs one-dimensional and two-dimensional (strong cation exchange/reversed-phase liquid chromatography, or SCX/RPLC) separations, as well as online phosphopeptide enrichment, using a single binary nano-flow pump has been developed. By simply switching a function-selection valve, which is equipped with an SCX column and a TiO2 (titanium dioxide) column, fully automated selection among the three experiment modes is achieved. Because the system uses essentially the same solvent flow paths, the same trap column, and the same separation column for the reversed-phase separations of the 1D, 2D, and online phosphopeptide enrichment experiments, the elution times obtained from these experiments are in excellent agreement, which facilitates correlating peptide information across experiments.

  17. Fully automated detection of the counting area in blood smears for computer aided hematology.

    PubMed

    Rupp, Stephan; Schlarb, Timo; Hasslmeyer, Erik; Zerfass, Thorsten

    2011-01-01

    For medical diagnosis, blood is an indispensable indicator for a wide variety of diseases, e.g. hemic, parasitic and sexually transmitted diseases. Robust detection and exact segmentation of white blood cells (leukocytes) in stained smears of peripheral blood provide the basis for a fully automated, image-based preparation of the so-called differential blood cell count in medical laboratory diagnostics. For the localization of the blood cells, and in particular for their segmentation, it is necessary to detect the working area of the blood smear. In this contribution we present an approach for locating the so-called counting area of a stained blood smear, that is, the region where cells are predominantly separated and do not touch each other. For this purpose, multiple images of a blood smear are taken and analyzed in order to select the image corresponding to this area. The analysis involves computing, from image content, a unimodal function that serves as an indicator for the corresponding image. This requires a prior segmentation of the cells, which is carried out by binarization in the HSV color space. Finally, the indicator function is derived from the number of cells and the cells' surface area. Its unimodality guarantees a maximum value that corresponds to the counting area's image index. This allows a fast lookup of the counting area, enabling fully automated analysis of blood smears for medical diagnosis. For evaluation, the algorithm's performance on a number of blood smears was compared with ground-truth information defined by an expert hematologist. PMID:22256137
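
    The selection step described above reduces each candidate image to a scalar indicator and takes the maximum. In the sketch below, the per-image measurements and the indicator formula (a simple product of cell count and total cell area) are illustrative stand-ins, not the paper's actual unimodal function:

```python
def counting_area_index(fields):
    """fields: per-image (cell_count, total_cell_area) pairs, one per
    field of view taken along the smear."""
    def indicator(count, area):
        # hypothetical combination; the paper derives its own unimodal
        # function from the number of cells and the cells' surface area
        return count * area
    scores = [indicator(c, a) for c, a in fields]
    # unimodality means the global maximum identifies the counting area
    return max(range(len(scores)), key=scores.__getitem__)

# Toy data: smear tip (few cells), counting area, thick end (clumped cells).
fields = [(8, 320.0), (60, 3000.0), (25, 5000.0)]
idx = counting_area_index(fields)  # -> 1, the middle field of view
```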

  18. AutoDrug: fully automated macromolecular crystallography workflows for fragment-based drug discovery

    PubMed Central

    Tsai, Yingssu; McPhillips, Scott E.; González, Ana; McPhillips, Timothy M.; Zinn, Daniel; Cohen, Aina E.; Feese, Michael D.; Bushnell, David; Tiefenbrunn, Theresa; Stout, C. David; Ludaescher, Bertram; Hedman, Britt; Hodgson, Keith O.; Soltis, S. Michael

    2013-01-01

    AutoDrug is software based upon the scientific workflow paradigm that integrates the Stanford Synchrotron Radiation Lightsource macromolecular crystallography beamlines and third-party processing software to automate the crystallography steps of the fragment-based drug-discovery process. AutoDrug screens a cassette of fragment-soaked crystals, selects crystals for data collection based on screening results and user-specified criteria and determines optimal data-collection strategies. It then collects and processes diffraction data, performs molecular replacement using provided models and detects electron density that is likely to arise from bound fragments. All processes are fully automated, i.e. are performed without user interaction or supervision. Samples can be screened in groups corresponding to particular proteins, crystal forms and/or soaking conditions. A single AutoDrug run is only limited by the capacity of the sample-storage dewar at the beamline: currently 288 samples. AutoDrug was developed in conjunction with RestFlow, a new scientific workflow-automation framework. RestFlow simplifies the design of AutoDrug by managing the flow of data and the organization of results and by orchestrating the execution of computational pipeline steps. It also simplifies the execution and interaction of third-party programs and the beamline-control system. Modeling AutoDrug as a scientific workflow enables multiple variants that meet the requirements of different user groups to be developed and supported. A workflow tailored to mimic the crystallography stages comprising the drug-discovery pipeline of CoCrystal Discovery Inc. has been deployed and successfully demonstrated. This workflow was run once on the same 96 samples that the group had examined manually; it cycled successfully through all of the samples, collected data from the same samples that had been selected manually and located the same peaks of unmodeled density in the resulting difference maps.

  19. Fully automated and colorimetric foodborne pathogen detection on an integrated centrifugal microfluidic device.

    PubMed

    Oh, Seung Jun; Park, Byung Hyun; Choi, Goro; Seo, Ji Hyun; Jung, Jae Hwan; Choi, Jong Seob; Kim, Do Hyun; Seo, Tae Seok

    2016-05-21

    This work describes fully automated and colorimetric foodborne pathogen detection on an integrated centrifugal microfluidic device, which is called a lab-on-a-disc. All the processes for molecular diagnostics including DNA extraction and purification, DNA amplification and amplicon detection were integrated on a single disc. Silica microbeads incorporated in the disc enabled extraction and purification of bacterial genomic DNA from bacteria-contaminated milk samples. We targeted four kinds of foodborne pathogens (Escherichia coli O157:H7, Salmonella typhimurium, Vibrio parahaemolyticus and Listeria monocytogenes) and performed loop-mediated isothermal amplification (LAMP) to amplify the specific genes of the targets. Colorimetric detection mediated by a metal indicator confirmed the results of the LAMP reactions with the colour change of the LAMP mixtures from purple to sky blue. The whole process was conducted in an automated manner using the lab-on-a-disc and a miniaturized rotary instrument equipped with three heating blocks. We demonstrated that a milk sample contaminated with foodborne pathogens can be automatically analysed on the centrifugal disc even at the 10 bacterial cell level in 65 min. The simplicity and portability of the proposed microdevice would provide an advanced platform for point-of-care diagnostics of foodborne pathogens, where prompt confirmation of food quality is needed. PMID:27112702

  20. DeepPicker: A deep learning approach for fully automated particle picking in cryo-EM.

    PubMed

    Wang, Feng; Gong, Huichao; Liu, Gaochao; Li, Meijing; Yan, Chuangye; Xia, Tian; Li, Xueming; Zeng, Jianyang

    2016-09-01

    Particle picking is a time-consuming step in single-particle analysis and often requires significant interventions from users, which has become a bottleneck for future automated electron cryo-microscopy (cryo-EM). Here we report a deep learning framework, called DeepPicker, to address this problem and fill the current gaps toward a fully automated cryo-EM pipeline. DeepPicker employs a novel cross-molecule training strategy to capture common features of particles from previously-analyzed micrographs, and thus does not require any human intervention during particle picking. Tests on the recently-published cryo-EM data of three complexes have demonstrated that our deep learning based scheme can successfully accomplish the human-level particle picking process and identify a sufficient number of particles that are comparable to those picked manually by human experts. These results indicate that DeepPicker can provide a practically useful tool to significantly reduce the time and manual effort spent in single-particle analysis and thus greatly facilitate high-resolution cryo-EM structure determination. DeepPicker is released as an open-source program, which can be downloaded from https://github.com/nejyeah/DeepPicker-python. PMID:27424268
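
    DeepPicker itself scores candidate windows with a trained convolutional network. As a much simplified stand-in for that sliding-window scoring idea (not the paper's method), the sketch below scans a micrograph with a reference template and keeps positions whose normalized cross-correlation clears a threshold; the toy arrays are hypothetical.

```python
import math

def ncc(patch, template):
    """Normalized cross-correlation between two same-sized 2D patches."""
    n = len(patch) * len(patch[0])
    pv = [v for row in patch for v in row]
    tv = [v for row in template for v in row]
    pm, tm = sum(pv) / n, sum(tv) / n
    num = sum((p - pm) * (t - tm) for p, t in zip(pv, tv))
    den = math.sqrt(sum((p - pm) ** 2 for p in pv) *
                    sum((t - tm) ** 2 for t in tv))
    return num / den if den else 0.0

def pick_particles(micrograph, template, threshold=0.8):
    """Slide the template over the micrograph; keep high-scoring positions."""
    h, w = len(template), len(template[0])
    picks = []
    for y in range(len(micrograph) - h + 1):
        for x in range(len(micrograph[0]) - w + 1):
            patch = [row[x:x + w] for row in micrograph[y:y + h]]
            if ncc(patch, template) >= threshold:
                picks.append((y, x))
    return picks

# Toy micrograph with one copy of the template embedded at (1, 1).
template = [[0, 1],
            [1, 0]]
micrograph = [[0, 0, 0, 0],
              [0, 0, 1, 0],
              [0, 1, 0, 0],
              [0, 0, 0, 0]]
picks = pick_particles(micrograph, template)  # -> [(1, 1)]
```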

  1. Fully automated muscle quality assessment by Gabor filtering of second harmonic generation images

    NASA Astrophysics Data System (ADS)

    Paesen, Rik; Smolders, Sophie; Vega, José Manolo de Hoyos; Eijnde, Bert O.; Hansen, Dominique; Ameloot, Marcel

    2016-02-01

    Although structural changes on the sarcomere level of skeletal muscle are known to occur due to various pathologies, rigorous studies of the reduced sarcomere quality remain scarce. This can possibly be explained by the lack of an objective tool for analyzing and comparing sarcomere images across biological conditions. Recent developments in second harmonic generation (SHG) microscopy and increasing insight into the interpretation of sarcomere SHG intensity profiles have made SHG microscopy a valuable tool to study microstructural properties of sarcomeres. Typically, sarcomere integrity is analyzed by fitting a set of manually selected, one-dimensional SHG intensity profiles with a supramolecular SHG model. To circumvent this tedious manual selection step, we developed a fully automated image analysis procedure to map the sarcomere disorder for the entire image at once. The algorithm relies on a single-frequency wavelet-based Gabor approach and includes a newly developed normalization procedure allowing for unambiguous data interpretation. The method was validated by showing the correlation between the sarcomere disorder, quantified by the M-band size obtained from manually selected profiles, and the normalized Gabor value ranging from 0 to 1 for decreasing disorder. Finally, to elucidate the applicability of our newly developed protocol, Gabor analysis was used to study the effect of experimental autoimmune encephalomyelitis on the sarcomere regularity. We believe that the technique developed in this work holds great promise for high-throughput, unbiased, and automated image analysis to study sarcomere integrity by SHG microscopy.
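
    The single-frequency Gabor idea can be illustrated on a one-dimensional intensity profile: convolve with a complex Gabor kernel tuned to the sarcomere frequency and reduce the mean response magnitude to a 0-to-1 score. The kernel parameters and the normalization against an ideal periodic profile below are illustrative assumptions, not the paper's exact procedure.

```python
import math

def gabor_kernel(freq, sigma, half_width):
    """Even (cosine) and odd (sine) parts of a 1D Gabor kernel."""
    xs = range(-half_width, half_width + 1)
    env = [math.exp(-x * x / (2 * sigma * sigma)) for x in xs]
    even = [e * math.cos(2 * math.pi * freq * x) for e, x in zip(env, xs)]
    odd = [e * math.sin(2 * math.pi * freq * x) for e, x in zip(env, xs)]
    return even, odd

def gabor_magnitude(profile, freq, sigma=6.0, half_width=12):
    """Mean complex-response magnitude over all valid window positions."""
    even, odd = gabor_kernel(freq, sigma, half_width)
    k = len(even)
    mags = []
    for i in range(len(profile) - k + 1):
        win = profile[i:i + k]
        re = sum(w * e for w, e in zip(win, even))
        im = sum(w * o for w, o in zip(win, odd))
        mags.append(math.hypot(re, im))
    return sum(mags) / len(mags)

def normalized_gabor(profile, freq):
    """0..1 score: response relative to an ideal periodic profile
    (an assumed normalization, standing in for the paper's own)."""
    ideal = [math.cos(2 * math.pi * freq * x) for x in range(len(profile))]
    ref = gabor_magnitude(ideal, freq)
    return min(1.0, gabor_magnitude(profile, freq) / ref) if ref else 0.0

f = 0.1  # hypothetical sarcomere frequency in cycles per pixel
ordered = [math.cos(2 * math.pi * f * x) for x in range(60)]
score_ordered = normalized_gabor(ordered, f)   # ideal periodicity -> 1.0
score_flat = normalized_gabor([0.0] * 60, f)   # no periodic signal -> 0.0
```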

  2. Fully Automated Generation of Accurate Digital Surface Models with Sub-Meter Resolution from Satellite Imagery

    NASA Astrophysics Data System (ADS)

    Wohlfeil, J.; Hirschmüller, H.; Piltz, B.; Börner, A.; Suppa, M.

    2012-07-01

    Modern pixel-wise image matching algorithms like Semi-Global Matching (SGM) are able to compute high-resolution digital surface models from airborne and spaceborne stereo imagery. Although image matching itself can be performed automatically, there are prerequisites, like high geometric accuracy, which are essential for ensuring the high quality of the resulting surface models. Especially for line cameras, these prerequisites currently require laborious manual interaction using standard tools, which is a growing problem given the continually increasing demand for such surface models. The tedious work includes the partly or fully manual selection of tie points and/or ground control points to ensure the required accuracy of the relative orientation of the images for stereo matching. It also includes the masking of large water areas, which seriously reduce the quality of the results. Furthermore, a good estimate of the depth range is required, since accurate estimates can substantially reduce the processing time for stereo matching. In this paper an approach is presented that allows all these steps to be performed fully automatically. It includes very robust and precise tie point selection, enabling the accurate calculation of the images' relative orientation via bundle adjustment. It is also shown how water masking and elevation range estimation can be performed automatically on the basis of freely available SRTM data. Extensive tests with a large number of different satellite images from QuickBird and WorldView demonstrate the robustness and reliability of the proposed method.

  3. High‐throughput automated scoring of Ki67 in breast cancer tissue microarrays from the Breast Cancer Association Consortium

    PubMed Central

    Howat, William J; Daley, Frances; Zabaglo, Lila; McDuffus, Leigh‐Anne; Blows, Fiona; Coulson, Penny; Raza Ali, H; Benitez, Javier; Milne, Roger; Brenner, Herman; Stegmaier, Christa; Mannermaa, Arto; Chang‐Claude, Jenny; Rudolph, Anja; Sinn, Peter; Couch, Fergus J; Tollenaar, Rob A.E.M.; Devilee, Peter; Figueroa, Jonine; Sherman, Mark E; Lissowska, Jolanta; Hewitt, Stephen; Eccles, Diana; Hooning, Maartje J; Hollestelle, Antoinette; WM Martens, John; HM van Deurzen, Carolien; Investigators, kConFab; Bolla, Manjeet K; Wang, Qin; Jones, Michael; Schoemaker, Minouk; Broeks, Annegien; van Leeuwen, Flora E; Van't Veer, Laura; Swerdlow, Anthony J; Orr, Nick; Dowsett, Mitch; Easton, Douglas; Schmidt, Marjanka K; Pharoah, Paul D; Garcia‐Closas, Montserrat

    2016-01-01

    Abstract Automated methods are needed to facilitate high‐throughput and reproducible scoring of Ki67 and other markers in breast cancer tissue microarrays (TMAs) in large‐scale studies. To address this need, we developed an automated protocol for Ki67 scoring and evaluated its performance in studies from the Breast Cancer Association Consortium. We utilized 166 TMAs containing 16,953 tumour cores representing 9,059 breast cancer cases, from 13 studies, with information on other clinical and pathological characteristics. TMAs were stained for Ki67 using standard immunohistochemical procedures, and scanned and digitized using the Ariol system. An automated algorithm was developed for the scoring of Ki67, and scores were compared to computer assisted visual (CAV) scores in a subset of 15 TMAs in a training set. We also assessed the correlation between automated Ki67 scores and other clinical and pathological characteristics. Overall, we observed good discriminatory accuracy (AUC = 85%) and good agreement (kappa = 0.64) between the automated and CAV scoring methods in the training set. The performance of the automated method varied by TMA (kappa range = 0.37–0.87) and study (kappa range = 0.39–0.69). The automated method performed better in satisfactory cores (kappa = 0.68) than suboptimal (kappa = 0.51) cores (p‐value for comparison = 0.005); and among cores with higher total nuclei counted by the machine (4,000–4,500 cells: kappa = 0.78) than those with lower counts (50–500 cells: kappa = 0.41; p‐value = 0.010). Among the 9,059 cases in this study, the correlations between automated Ki67 and clinical and pathological characteristics were found to be in the expected directions. Our findings indicate that automated scoring of Ki67 can be an efficient method to obtain good quality data across large numbers of TMAs from multicentre studies. However, robust algorithm development and rigorous pre‐ and post‐analytical quality control procedures are needed.

  4. High-throughput automated scoring of Ki67 in breast cancer tissue microarrays from the Breast Cancer Association Consortium.

    PubMed

    Abubakar, Mustapha; Howat, William J; Daley, Frances; Zabaglo, Lila; McDuffus, Leigh-Anne; Blows, Fiona; Coulson, Penny; Raza Ali, H; Benitez, Javier; Milne, Roger; Brenner, Herman; Stegmaier, Christa; Mannermaa, Arto; Chang-Claude, Jenny; Rudolph, Anja; Sinn, Peter; Couch, Fergus J; Tollenaar, Rob A E M; Devilee, Peter; Figueroa, Jonine; Sherman, Mark E; Lissowska, Jolanta; Hewitt, Stephen; Eccles, Diana; Hooning, Maartje J; Hollestelle, Antoinette; Wm Martens, John; Hm van Deurzen, Carolien; Investigators, kConFab; Bolla, Manjeet K; Wang, Qin; Jones, Michael; Schoemaker, Minouk; Broeks, Annegien; van Leeuwen, Flora E; Van't Veer, Laura; Swerdlow, Anthony J; Orr, Nick; Dowsett, Mitch; Easton, Douglas; Schmidt, Marjanka K; Pharoah, Paul D; Garcia-Closas, Montserrat

    2016-07-01

    Automated methods are needed to facilitate high-throughput and reproducible scoring of Ki67 and other markers in breast cancer tissue microarrays (TMAs) in large-scale studies. To address this need, we developed an automated protocol for Ki67 scoring and evaluated its performance in studies from the Breast Cancer Association Consortium. We utilized 166 TMAs containing 16,953 tumour cores representing 9,059 breast cancer cases, from 13 studies, with information on other clinical and pathological characteristics. TMAs were stained for Ki67 using standard immunohistochemical procedures, and scanned and digitized using the Ariol system. An automated algorithm was developed for the scoring of Ki67, and scores were compared to computer assisted visual (CAV) scores in a subset of 15 TMAs in a training set. We also assessed the correlation between automated Ki67 scores and other clinical and pathological characteristics. Overall, we observed good discriminatory accuracy (AUC = 85%) and good agreement (kappa = 0.64) between the automated and CAV scoring methods in the training set. The performance of the automated method varied by TMA (kappa range = 0.37-0.87) and study (kappa range = 0.39-0.69). The automated method performed better in satisfactory cores (kappa = 0.68) than suboptimal (kappa = 0.51) cores (p-value for comparison = 0.005); and among cores with higher total nuclei counted by the machine (4,000-4,500 cells: kappa = 0.78) than those with lower counts (50-500 cells: kappa = 0.41; p-value = 0.010). Among the 9,059 cases in this study, the correlations between automated Ki67 and clinical and pathological characteristics were found to be in the expected directions. Our findings indicate that automated scoring of Ki67 can be an efficient method to obtain good quality data across large numbers of TMAs from multicentre studies. However, robust algorithm development and rigorous pre- and post-analytical quality control procedures are needed.
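
    The agreement figures above are Cohen's kappa for two raters: observed agreement corrected for the agreement expected by chance. A minimal implementation for the dichotomous case (example data hypothetical):

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two dichotomous raters (e.g. automated vs CAV),
    with labels given as 0/1 lists of equal length."""
    n = len(rater_a)
    po = sum(1 for a, b in zip(rater_a, rater_b) if a == b) / n  # observed
    pa1 = sum(rater_a) / n  # rater A's positive rate
    pb1 = sum(rater_b) / n  # rater B's positive rate
    pe = pa1 * pb1 + (1 - pa1) * (1 - pb1)  # chance agreement
    return (po - pe) / (1 - pe) if pe != 1 else 1.0

# Hypothetical cores: complete agreement vs chance-level agreement.
k_perfect = cohens_kappa([1, 0, 1, 0], [1, 0, 1, 0])  # -> 1.0
k_chance = cohens_kappa([1, 1, 0, 0], [1, 0, 1, 0])   # -> 0.0
```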

  5. Fully Automated Segmentation of the Pons and Midbrain Using Human T1 MR Brain Images

    PubMed Central

    Nigro, Salvatore; Cerasa, Antonio; Zito, Giancarlo; Perrotta, Paolo; Chiaravalloti, Francesco; Donzuso, Giulia; Fera, Franceso; Bilotta, Eleonora; Pantano, Pietro; Quattrone, Aldo

    2014-01-01

    Purpose This paper describes a novel method to automatically segment the human brainstem into midbrain and pons, called LABS: Landmark-based Automated Brainstem Segmentation. LABS processes high-resolution structural magnetic resonance images (MRIs) according to a revised landmark-based approach integrated with a thresholding method, without manual interaction. Methods The method was first tested on morphological T1-weighted MRIs of 30 healthy subjects. Its reliability was further confirmed by including neurological patients (with Alzheimer's disease) from the ADNI repository, in whom volumetric loss within the brainstem had previously been described. Segmentation accuracy was evaluated against expert-drawn manual delineations using volumetric, spatial overlap and distance-based metrics. Results The comparison of the quantitative measurements provided by LABS against manual segmentations revealed excellent results in healthy controls for both the midbrain (DICE measures higher than 0.9, volume ratio around 1, and Hausdorff distance around 3) and the pons (DICE measures around 0.93, volume ratio ranging from 1.024 to 1.05, and Hausdorff distance around 2). Similar performance was found in AD patients for segmentation of the pons (DICE measures higher than 0.93, volume ratio ranging from 0.97 to 0.98, and Hausdorff distance ranging from 1.07 to 1.33), while LABS performed less well for the midbrain (DICE measures ranging from 0.86 to 0.88, volume ratio around 0.95, and Hausdorff distance ranging from 1.71 to 2.15). Conclusions Our study represents the first attempt to validate a new fully automated method for in vivo segmentation of two anatomically complex brainstem subregions. We believe that our method may represent a useful tool for future applications in clinical practice. PMID:24489664
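
    The DICE and volume-ratio figures above are standard metrics comparing an automated binary segmentation against a manual one. A minimal sketch with masks represented as sets of voxel coordinates (toy data, not LABS itself):

```python
def dice(mask_a, mask_b):
    """Dice overlap between two binary segmentations given as voxel sets:
    2|A & B| / (|A| + |B|)."""
    inter = len(mask_a & mask_b)
    return 2 * inter / (len(mask_a) + len(mask_b)) if (mask_a or mask_b) else 1.0

def volume_ratio(mask_auto, mask_manual):
    """Ratio of automated to manual segmentation volume (voxel counts)."""
    return len(mask_auto) / len(mask_manual)

# Toy 3-voxel overlap between two 4-voxel masks.
auto_mask = {(0, 0, 0), (0, 0, 1), (0, 1, 0), (1, 0, 0)}
manual_mask = {(0, 0, 0), (0, 0, 1), (0, 1, 0), (1, 1, 1)}
d = dice(auto_mask, manual_mask)           # 2 * 3 / (4 + 4) = 0.75
vr = volume_ratio(auto_mask, manual_mask)  # 4 / 4 = 1.0
```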

  6. Multi-Q: a fully automated tool for multiplexed protein quantitation.

    PubMed

    Lin, Wen-Ting; Hung, Wei-Neng; Yian, Yi-Hwa; Wu, Kun-Pin; Han, Chia-Li; Chen, Yet-Ran; Chen, Yu-Ju; Sung, Ting-Yi; Hsu, Wen-Lian

    2006-09-01

    The iTRAQ labeling method combined with shotgun proteomic techniques represents a new dimension in multiplexed quantitation for relative protein expression measurement across different cell states. To expedite the analysis of vast amounts of spectral data, we present a fully automated software package, called Multi-Q, for multiplexed iTRAQ-based quantitation in protein profiling. Multi-Q is designed as a generic platform that can accommodate various input data formats from search engines and mass spectrometer manufacturers. To calculate peptide ratios, the software automatically processes iTRAQ signature peaks, including peak detection, background subtraction, isotope correction, and normalization to remove systematic errors. Furthermore, Multi-Q allows users to define their own data-filtering thresholds based on semiempirical values or statistical models, so that the computed fold changes in peptide ratios are statistically significant. This feature facilitates the use of Multi-Q with various instrument types with different dynamic ranges, an important aspect of iTRAQ analysis. The performance of Multi-Q is evaluated with a mixture of 10 standard proteins and with human Jurkat T cells. The results are consistent with the expected protein ratios and thus demonstrate the high accuracy, full automation, and high-throughput capability of Multi-Q as a large-scale quantitative proteomics tool. These features allow rapid interpretation of output from large proteomic datasets without the need for manual validation. Executable Multi-Q files are available for the Windows platform at http://ms.iis.sinica.edu.tw/Multi-Q/. PMID:16944945
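    The core ratio computation that such tools automate can be pictured compactly. A hedged sketch with hypothetical helper names (real iTRAQ processing also applies isotope-impurity correction matrices, omitted here):

```python
import numpy as np

def peptide_ratios(reporter_intensities, background=0.0, reference=0):
    """Background-subtract reporter-ion peak intensities and express each
    channel relative to a chosen reference channel."""
    corrected = np.clip(np.asarray(reporter_intensities, float) - background, 0.0, None)
    return corrected / corrected[reference]

def normalize_ratios(ratio_matrix):
    """Remove systematic per-channel bias by dividing each channel (column)
    by its median ratio over all peptides (rows)."""
    r = np.asarray(ratio_matrix, float)
    return r / np.median(r, axis=0)
```

    Median normalization rests on the assumption that most proteins are unchanged between states, so the median ratio per channel should be 1.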

  7. Towards fully automated processing of VLBI sessions - results from ultra-rapid UT1 experiments

    NASA Astrophysics Data System (ADS)

    Hobiger, T.; Sekido, M.; Koyama, Y.; Kondo, T.; Takiguchi, H.; Kurihara, S.; Kokado, K.; Nozawa, K.; Haas, R.; Otsubo, T.; Gotoh, T.; Kubo-Oka, T.

    2010-12-01

    Like other space geodetic techniques, the processing and analysis of VLBI data requires some human interaction before the target parameters are available to the scientific community. If the processing chain can be completely automated, results become available independent of time zones, holidays, or the illness of an analyst. In VLBI, a lot of effort is put into near real-time monitoring of Earth orientation parameters, especially UT1. Since VLBI is the only space-geodetic technique which gives direct access to the Earth's phase of rotation, i.e. universal time UT1, a low-latency product is desirable. Besides multi-baseline sessions, regular single-baseline VLBI experiments are scheduled in order to provide estimates of UT1 for the international science community. Although the turn-around time of such sessions is usually much shorter and results are available within one day after the data were recorded, lower latency of UT1 results is still requested. Based on the experience gained over the last three years, an automated processing chain was established. In July 2010, we started to provide automatically processed results to the IERS rapid service, and thus fully unattended operation and robust estimation of UT1 have become routine. A new analysis software package ensures that all post-processing stages run smoothly, and a variety of scripts guarantees that the data flow to and through the correlator takes full advantage of the available resources. The concept of ultra-rapid VLBI sessions can be extended further to include additional, geometrically well-distributed stations, in order to also derive the polar motion components with the same latency as UT1 and thus provide an up-to-date, complete set of Earth orientation parameters for the navigation of space and satellite missions. Moreover, our work demonstrates how future VLBI networks can be processed automatically in order to provide near real-time information about the instantaneous Earth orientation in the framework of GGOS.

  8. A Fully Automated Diabetes Prevention Program, Alive-PD: Program Design and Randomized Controlled Trial Protocol

    PubMed Central

    Azar, Kristen MJ; Block, Torin J; Romanelli, Robert J; Carpenter, Heather; Hopkins, Donald; Palaniappan, Latha; Block, Clifford H

    2015-01-01

    Background In the United States, 86 million adults have pre-diabetes. Evidence-based interventions that are both cost-effective and widely scalable are needed to prevent diabetes. Objective Our goal was to develop a fully automated diabetes prevention program and determine its effectiveness in a randomized controlled trial. Methods Subjects with verified pre-diabetes were recruited to participate in a trial of the effectiveness of Alive-PD, a newly developed, 1-year, fully automated behavior change program delivered by email and Web. The program involves weekly tailored goal-setting, team-based and individual challenges, gamification, and other opportunities for interaction. An accompanying mobile phone app supports goal-setting and activity planning. For the trial, participants were randomized by computer algorithm to start the program immediately or after a 6-month delay. The primary outcome measures are change in HbA1c and fasting glucose from baseline to 6 months. The secondary outcome measures are change in HbA1c, glucose, lipids, body mass index (BMI), weight, waist circumference, and blood pressure at 3, 6, 9, and 12 months. Randomization and delivery of the intervention are independent of clinic staff, who are blinded to treatment assignment. Outcomes will be evaluated for the intention-to-treat and per-protocol populations. Results A total of 340 subjects with pre-diabetes were randomized to the intervention (n=164) or delayed-entry control group (n=176). Baseline characteristics were as follows: mean age 55 years (SD 8.9); mean BMI 31.1 kg/m2 (SD 4.3); male 68.5%; mean fasting glucose 109.9 mg/dL (SD 8.4); and mean HbA1c 5.6% (SD 0.3). Data collection and analysis are in progress. We hypothesize that participants in the intervention group will achieve statistically significant reductions in fasting glucose and HbA1c as compared to the control group at 6 months post baseline. Conclusions The randomized trial will provide rigorous evidence regarding the efficacy of

  9. New Fully Automated Method for Segmentation of Breast Lesions on Ultrasound Based on Texture Analysis.

    PubMed

    Gómez-Flores, Wilfrido; Ruiz-Ortega, Bedert Abel

    2016-07-01

    The study described here explored a fully automatic segmentation approach based on texture analysis for breast lesions on ultrasound images. The proposed method involves two main stages: (i) In lesion region detection, the original gray-scale image is transformed into a texture domain based on log-Gabor filters. Local texture patterns are then extracted from overlapping lattices and classified by a linear discriminant analysis classifier to distinguish between the "normal tissue" and "breast lesion" classes. Next, an incremental method based on the average radial derivative function reveals the region with the highest probability of being a lesion. (ii) In lesion delineation, using the detected region and the pre-processed ultrasound image, an iterative thresholding procedure based on the average radial derivative function is performed to determine the final lesion contour. The experiments were carried out on a data set of 544 breast ultrasound images (including cysts, benign solid masses and malignant lesions) acquired with three distinct ultrasound machines. In terms of the area under the receiver operating characteristic curve, a one-way analysis of variance test (α = 0.05) indicates that the proposed approach significantly outperforms two published fully automatic methods (p < 0.001), with areas under the curve of 0.91 for the proposed method versus 0.82 and 0.63 for the two published methods. Hence, these results suggest that the log-Gabor domain improves the discriminative power of texture features for accurately segmenting breast lesions. In addition, the proposed approach can potentially be used for automated computer diagnosis to assist physicians in the detection and classification of breast masses. PMID:27095150

  10. A fully automated multi-functional ultrahigh pressure liquid chromatography system for advanced proteome analyses

    PubMed Central

    Lee, Jung Hwa; Hyung, Seok-Won; Mun, Dong-Gi; Jung, Hee-Jung; Kim, Hokeun; Lee, Hangyeore; Kim, Su-Jin; Park, Kyong Soo; Moore, Ronald J.; Smith, Richard D.; Lee, Sang-Won

    2012-01-01

    A multi-functional liquid chromatography system that performs 1-dimensional and 2-dimensional (strong cation exchange/reverse-phase liquid chromatography, or SCX/RPLC) separations and online phosphopeptide enrichment using a single binary nano-flow pump has been developed. With a simple operation of a function selection valve equipped with an SCX column and a TiO2 (titanium dioxide) column, fully automated selection among the three experiment modes is achieved. Because the current system uses essentially the same solvent flow paths, the same trap column, and the same separation column for the reverse-phase separation in the 1D, 2D, and online phosphopeptide enrichment experiments, the elution time information obtained from these experiments is in excellent agreement, which facilitates correlating peptide information across experiments. The final reverse-phase separation of the three experiments is completely decoupled from all function-selection processes; salts or acids from the SCX or TiO2 columns therefore do not affect the efficiency of the reverse-phase separation. PMID:22709424

  11. Fully automated objective-based method for master recession curve separation.

    PubMed

    Posavec, Kristijan; Parlov, Jelena; Nakić, Zoran

    2010-01-01

    A fully automated, objective-based method for master recession curve (MRC) separation was developed using a Microsoft Excel spreadsheet and Visual Basic for Applications (VBA) code. The core of the program constructs an MRC using the adapted matching strip method (Posavec et al. 2006). Criteria for separating the MRC into two or three segments are determined from the flow-duration curve and are represented as a probable range of percent of flow-rate duration. Successive separations into two- and three-segment MRCs are performed automatically using sets of percent of flow-rate duration from the selected ranges, and the optimal separation scenario, i.e. the one with the highest average coefficient of determination R², is selected as the most appropriate. The resulting separated master recession curves are presented graphically and the statistics numerically, each in separate sheets. Field data from two springs in Istria, Croatia, are used to illustrate the application. The freely available Excel spreadsheet and VBA program ensure ease of use and applicability to larger data sets. PMID:20100291
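    The model-selection criterion (highest average R²) is easy to reproduce outside Excel/VBA. A minimal Python sketch under the assumption that each candidate separation is summarized by (observed, predicted) pairs per segment; the function names are illustrative:

```python
import numpy as np

def r_squared(observed, predicted):
    """Coefficient of determination: R^2 = 1 - SS_res / SS_tot."""
    y, yhat = np.asarray(observed, float), np.asarray(predicted, float)
    return 1.0 - np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2)

def best_separation(scenarios):
    """Select the separation scenario whose segment fits have the highest
    average R^2; `scenarios` maps a name to a list of (observed, predicted)."""
    mean_r2 = {name: np.mean([r_squared(o, p) for o, p in segments])
               for name, segments in scenarios.items()}
    return max(mean_r2, key=mean_r2.get)
```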

  12. A fully automated tortuosity quantification system with application to corneal nerve fibres in confocal microscopy images.

    PubMed

    Annunziata, Roberto; Kheirkhah, Ahmad; Aggarwal, Shruti; Hamrah, Pedram; Trucco, Emanuele

    2016-08-01

    Recent clinical research has highlighted important links between a number of diseases and the tortuosity of curvilinear anatomical structures like corneal nerve fibres, suggesting that tortuosity changes might detect early stages of specific conditions. Currently, clinical studies are mainly based on subjective, visual assessment, with limited repeatability and inter-observer agreement. To address these problems, we propose a fully automated framework for image-level tortuosity estimation, consisting of a hybrid segmentation method and a highly adaptable, definition-free tortuosity estimation algorithm. The former combines an appearance model, based on a Scale and Curvature-Invariant Ridge Detector (SCIRD), with a context model, including multi-range learned context filters. The latter is based on a novel tortuosity estimation paradigm in which discriminative, multi-scale features can be automatically learned for specific anatomical objects and diseases. Experimental results on 140 in vivo confocal microscopy images of corneal nerve fibres from healthy and unhealthy subjects demonstrate the excellent performance of our method compared to state-of-the-art approaches and ground truth annotations from 3 expert observers. PMID:27136674
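    For contrast with the learned, definition-free estimator described above, the classical hand-crafted tortuosity measure it moves beyond is the arc-over-chord ratio (a sketch of the textbook definition, not the paper's method):

```python
import numpy as np

def arc_chord_tortuosity(points):
    """Classical tortuosity of a sampled curve: total arc length divided
    by the straight-line distance between its endpoints (always >= 1)."""
    p = np.asarray(points, float)
    arc = np.sum(np.linalg.norm(np.diff(p, axis=0), axis=1))
    chord = np.linalg.norm(p[-1] - p[0])
    return arc / chord
```

    A perfectly straight fibre scores 1; the score grows as the fibre winds, but this single number ignores the multi-scale cues the learned approach captures.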

  13. DynaMet: a fully automated pipeline for dynamic LC-MS data.

    PubMed

    Kiefer, Patrick; Schmitt, Uwe; Müller, Jonas E N; Hartl, Johannes; Meyer, Fabian; Ryffel, Florian; Vorholt, Julia A

    2015-10-01

    Dynamic isotope labeling data provide crucial information about the operation of metabolic pathways and are commonly generated via liquid chromatography-mass spectrometry (LC-MS). Metabolome-wide analysis is challenging because it requires grouping of metabolite features over different samples. We developed DynaMet for fully automated investigation of isotope labeling experiments from LC-high-resolution MS raw data. DynaMet enables untargeted extraction of metabolite labeling profiles and provides integrated tools for expressive data visualization. To validate DynaMet, we first used time-course labeling data of the model strain Bacillus methanolicus grown on (13)C methanol, which yields complex spectra for multicarbon compounds. Analysis of two biological replicates revealed the high robustness and reproducibility of the pipeline. In total, DynaMet extracted 386 features showing dynamic labeling within 10 min. Of these features, 357 could be fitted by the implemented kinetic models. Feature identification against the KEGG database resulted in 215 matches covering multiple pathways of core metabolism and major biosynthetic routes. Moreover, we performed a time-course labeling experiment with Escherichia coli on uniformly labeled (13)C glucose, yielding a comparable number of detected features with high-quality labeling profiles. The distinct labeling patterns of common central metabolites generated from the two model bacteria can readily be explained by one-carbon versus multicarbon compound metabolism. DynaMet is freely available as an extension package for the Python-based eMZed2, an open-source framework built for rapid development of LC-MS data analysis workflows. PMID:26366644
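    The kinetic-model fitting step can be pictured with the simplest case, a single-pool exponential approach to full labeling, L(t) = 1 - exp(-kt). A hedged sketch (DynaMet's actual models are richer; the linearized fit and function name below are illustrative only):

```python
import numpy as np

def fit_labeling_rate(times, fraction_labeled):
    """Least-squares estimate of k in L(t) = 1 - exp(-k t), obtained by
    linearizing to log(1 - L) = -k t and regressing through the origin."""
    t = np.asarray(times, float)
    y = np.log(1.0 - np.asarray(fraction_labeled, float))
    return -np.sum(t * y) / np.sum(t * t)
```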

  14. Fully Automated and Robust Tracking of Transient Waves in Structured Anatomies Using Dynamic Programming.

    PubMed

    Akkus, Zeynettin; Bayat, Mahdi; Cheong, Mathew; Kumar, Viksit; Erickson, Bradley J; Alizad, Azra; Fatemi, Mostafa

    2016-10-01

    Tissue stiffness is often linked to underlying pathology and can be quantified by measuring the mechanical transient transverse wave speed (TWS) within the medium. Time-of-flight methods based on correlation of the transient signals or tracking of peaks have been used to quantify the TWS from displacement maps obtained with ultrasound pulse-echo techniques. However, it is challenging to apply these methods to in vivo data because of tissue inhomogeneity, noise and artifacts that produce outliers. In this study, we introduce a robust and fully automated method based on dynamic programming to estimate TWS in tissues with known geometries. The method is validated using ultrasound bladder vibrometry data from an in vivo study. We compared the results of our method with those of time-of-flight techniques. Our method performs better than time-of-flight techniques. In conclusion, we present a robust and accurate TWS detection method that overcomes the difficulties of time-of-flight methods. PMID:27425150
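    The dynamic-programming idea is the same one used for seam tracking: accumulate a cost over (candidate arrival index × position) and backtrack the globally cheapest, smoothness-constrained path, which is what makes the estimate robust to isolated outliers. A generic sketch (not the authors' code; the cost map would come from the displacement data):

```python
import numpy as np

def track_wave(cost, max_jump=1):
    """Find the minimum-cost path through a (candidate x position) cost
    map, letting the tracked index move at most `max_jump` rows between
    neighboring positions; returns one row index per column."""
    cost = np.asarray(cost, float)
    n_rows, n_cols = cost.shape
    acc = cost.copy()                       # accumulated path cost
    back = np.zeros((n_rows, n_cols), int)  # backpointers
    for j in range(1, n_cols):
        for i in range(n_rows):
            lo, hi = max(0, i - max_jump), min(n_rows, i + max_jump + 1)
            k = lo + int(np.argmin(acc[lo:hi, j - 1]))
            back[i, j] = k
            acc[i, j] += acc[k, j - 1]
    path = [int(np.argmin(acc[:, -1]))]
    for j in range(n_cols - 1, 0, -1):
        path.append(int(back[path[-1], j]))
    return path[::-1]
```

    Unlike greedy peak tracking, a single spurious minimum cannot pull the whole trajectory off course, because the path cost is minimized globally.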

  15. Fully automated VLBI analysis with c5++ for ultra-rapid determination of UT1

    NASA Astrophysics Data System (ADS)

    Hobiger, Thomas; Otsubo, Toshimichi; Sekido, Mamoru; Gotoh, Tadahiro; Kubooka, Toshihiro; Takiguchi, Hiroshi

    2010-12-01

    VLBI is the only space-geodetic technique which gives direct access to the Earth's phase of rotation, i.e. universal time UT1. Besides multi-baseline sessions, regular single-baseline VLBI experiments are scheduled in order to provide estimates of UT1 for the international space community. Although the turn-around time of such sessions is usually much shorter and results are available within one day after the data were recorded, lower latency of UT1 results is still requested. Based on the experience gained over the last two years, an automated analysis procedure was established. The main goal was to realize fully unattended operation and robust estimation of UT1. Our new analysis software, named c5++, is capable of interfacing directly with the correlator output, carries out all processing stages without human interaction, and provides the results to the scientific community or dedicated space applications. Moreover, the concept of ultra-rapid VLBI sessions can be extended to include further well-distributed stations, in order to obtain the polar motion parameters with the same latency and thereby provide an up-to-date, complete set of Earth orientation parameters for the navigation of space and satellite missions.

  16. Fully automated glottis segmentation in endoscopic videos using local color and shape features of glottal regions.

    PubMed

    Gloger, Oliver; Lehnert, Bernhard; Schrade, Andreas; Völzke, Henry

    2015-03-01

    Exact analysis of glottal vibration patterns is indispensable for the assessment of laryngeal pathologies. The increasing demand for voice-related examinations and the large amounts of data provided by high-speed laryngoscopy and stroboscopy call for automatic assistance in research and patient care. Automatic glottis segmentation is necessary to assist glottal vibration pattern analysis, but unfortunately proves to be very challenging. Previous glottis segmentation approaches hardly consider characteristic glottis features or the inhomogeneity of glottal regions, and show serious drawbacks when applied for diagnostic purposes. We developed a fully automated glottis segmentation framework that extracts a set of glottal regions in endoscopic videos by using a flexible thresholding technique combined with a refining level set method that incorporates prior knowledge of glottis shape. A novel descriptor for glottal regions is presented to remove spurious nonglottal regions that exhibit glottis-like shape properties. Knowledge of local color distributions is incorporated into Bayesian probability image generation. Glottal regions are then tracked frame by frame in the probability images with a region-based level set segmentation strategy. Principal component analysis of pixel coordinates is applied to determine glottal orientation in each frame and to remove nonglottal regions if erroneous regions are included. The framework shows very promising results in terms of segmentation accuracy and processing times and is applicable to both stroboscopic and high-speed videos. PMID:25350912

  17. Improving GPR Surveys Productivity by Array Technology and Fully Automated Processing

    NASA Astrophysics Data System (ADS)

    Morello, Marco; Ercoli, Emanuele; Mazzucchelli, Paolo; Cottino, Edoardo

    2016-04-01

    The realization of network infrastructures with lower environmental impact and the tendency to use digging technologies that are less invasive in terms of time and space of road occupation and restoration play a key role in the development of communication networks. However, pre-existing buried utilities must be detected and located in the subsurface to exploit the high productivity of modern digging apparatus. According to SUE quality level B+, both the position and the depth of subsurface utilities must be accurately estimated, demanding 3D GPR surveys. In fact, the advantages of 3D GPR acquisitions (obtained either by multiple 2D recordings or by an antenna array) over 2D acquisitions are well known. Nonetheless, the amount of data acquired in such 3D surveys does not usually allow processing and interpretation to be completed directly in the field and in real time, thus limiting the overall efficiency of the GPR acquisition. As an example, the "low-impact mini-trench" technique (addressed in ITU (International Telecommunication Union) recommendation L.83) requires that non-destructive mapping of buried services enhance its productivity to match the improvements of new digging equipment. Nowadays, multi-antenna and multi-pass GPR acquisitions demand new processing techniques that can obtain high-quality subsurface images while taking full advantage of 3D data: the development of a fully automated, real-time 3D GPR processing system plays a key role in the overall profitability of optical network deployment. Furthermore, currently available computing power suggests the feasibility of processing schemes that incorporate better focusing algorithms. A novel processing scheme, whose goal is the automated processing and detection of buried targets and which can be applied in real time to 3D GPR array systems, has been developed and successfully tested with two different GPR arrays (16 antennas, 900 MHz central frequency, and 34 antennas, 600 MHz central frequency). The proposed processing

  18. Assessment of Automated Image Analysis of Breast Cancer Tissue Microarrays for Epidemiologic Studies

    PubMed Central

    Bolton, Kelly L.; Garcia-Closas, Montserrat; Pfeiffer, Ruth M.; Duggan, Máire A.; Howat, William J.; Hewitt, Stephen M.; Yang, Xiaohong R.; Cornelison, Robert; Anzick, Sarah L.; Meltzer, Paul; Davis, Sean; Lenz, Petra; Figueroa, Jonine D.; Pharoah, Paul D.P.; Sherman, Mark E.

    2010-01-01

    A major challenge in studies of etiologic heterogeneity in breast cancer has been the limited throughput, accuracy and reproducibility of measuring tissue markers. Computerized image analysis systems may help address these concerns, but published reports of their use are limited. We assessed agreement between automated and pathologist scores of a diverse set of immunohistochemical (IHC) assays performed on breast cancer tissue microarrays (TMAs). TMAs of 440 breast cancers previously stained for ER-α, PR, HER2, ER-β and aromatase were independently scored by two pathologists and three automated systems (TMALabII, TMAx, Ariol). Agreement between automated and pathologist negative/positive scores was measured using the area under the receiver operating characteristic curve (AUC), and weighted kappa statistics (κ) were used for categorical scores. We also investigated the correlation between IHC scores and mRNA expression levels. Agreement between pathologist and automated negative/positive and categorical scores was excellent for ER-α and PR (AUC range = 0.98–0.99; κ range = 0.86–0.91). Lower levels of agreement were seen for ER-β categorical scores (AUC = 0.99–1.0; κ = 0.80–0.86) and for both negative/positive and categorical scores for aromatase (AUC = 0.85–0.96; κ = 0.41–0.67) and HER2 (AUC = 0.94–0.97; κ = 0.53–0.72). For ER-α and PR, there was strong correlation between mRNA levels and both automated (ρ = 0.67–0.74) and pathologist IHC scores (ρ = 0.67–0.77). HER2 mRNA levels were more strongly correlated with pathologist (ρ = 0.63) than automated IHC scores (ρ = 0.41–0.49). Automated analysis of IHC markers is a promising approach for scoring large numbers of breast cancer tissues in epidemiologic investigations. This would facilitate studies of etiologic heterogeneity, which ultimately may allow improved risk prediction and better prevention approaches. PMID:20332278
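    The weighted kappa used above penalizes disagreements by how far apart the two ordinal category scores are, rather than treating all disagreements equally. A minimal sketch (illustrative, not the study's statistical code):

```python
import numpy as np

def weighted_kappa(scores_a, scores_b, n_categories, weights="linear"):
    """Weighted Cohen's kappa for paired ordinal scores in 0..n-1."""
    observed = np.zeros((n_categories, n_categories))
    for a, b in zip(scores_a, scores_b):
        observed[a, b] += 1
    observed /= observed.sum()
    # Chance agreement from the two raters' marginal distributions.
    expected = np.outer(observed.sum(axis=1), observed.sum(axis=0))
    i, j = np.indices((n_categories, n_categories))
    w = np.abs(i - j) if weights == "linear" else (i - j) ** 2
    return 1.0 - (w * observed).sum() / (w * expected).sum()
```

    Perfect agreement gives κ = 1, chance-level agreement gives κ = 0, which is why the κ ranges of 0.86-0.91 for ER-α and PR indicate near-perfect concordance.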

  19. Fully automated prostate magnetic resonance imaging and transrectal ultrasound fusion via a probabilistic registration metric

    NASA Astrophysics Data System (ADS)

    Sparks, Rachel; Bloch, B. Nicholas; Feleppa, Ernest; Barratt, Dean; Madabhushi, Anant

    2013-03-01

    In this work, we present a novel, automated, registration method to fuse magnetic resonance imaging (MRI) and transrectal ultrasound (TRUS) images of the prostate. Our methodology consists of: (1) delineating the prostate on MRI, (2) building a probabilistic model of prostate location on TRUS, and (3) aligning the MRI prostate segmentation to the TRUS probabilistic model. TRUS-guided needle biopsy is the current gold standard for prostate cancer (CaP) diagnosis. Up to 40% of CaP lesions appear isoechoic on TRUS, hence TRUS-guided biopsy cannot reliably target CaP lesions and is associated with a high false negative rate. MRI is better able to distinguish CaP from benign prostatic tissue, but requires special equipment and training. MRI-TRUS fusion, whereby MRI is acquired pre-operatively and aligned to TRUS during the biopsy procedure, allows for information from both modalities to be used to help guide the biopsy. The use of MRI and TRUS in combination to guide biopsy at least doubles the yield of positive biopsies. Previous work on MRI-TRUS fusion has involved aligning manually determined fiducials or prostate surfaces to achieve image registration. The accuracy of these methods is dependent on the reader's ability to determine fiducials or prostate surfaces with minimal error, which is a difficult and time-consuming task. Our novel, fully automated MRI-TRUS fusion method represents a significant advance over the current state-of-the-art because it does not require manual intervention after TRUS acquisition. All necessary preprocessing steps (i.e. delineation of the prostate on MRI) can be performed offline prior to the biopsy procedure. We evaluated our method on seven patient studies, with B-mode TRUS and a 1.5 T surface coil MRI. Our method has a root mean square error (RMSE) for expertly selected fiducials (consisting of the urethra, calcifications, and the centroids of CaP nodules) of 3.39 ± 0.85 mm.

  20. High-throughput, fully-automated volumetry for prediction of MMSE and CDR decline in mild cognitive impairment

    PubMed Central

    Kovacevic, Sanja; Rafii, Michael S.; Brewer, James B.

    2008-01-01

    Medial temporal lobe (MTL) atrophy is associated with increased risk of conversion to Alzheimer's disease (AD), but manual tracing techniques, and even semi-automated techniques, for volumetric assessment are not practical in the clinical setting. In addition, most studies that have examined MTL atrophy in AD have focused only on the hippocampus. The extent to which the volumes of the amygdala and the temporal horn of the lateral ventricle predict subsequent clinical decline is unknown. This study examined whether measures of hippocampus, amygdala, and temporal horn volume predict clinical decline over the following 6-month period in patients with mild cognitive impairment (MCI). Fully-automated volume measurements were performed in 269 MCI patients. Baseline volumes of the hippocampus, amygdala, and temporal horn were evaluated as predictors of change in the Mini-Mental State Exam (MMSE) and Clinical Dementia Rating Sum of Boxes (CDR SB) over a 6-month interval. Fully-automated measurements of baseline hippocampus and amygdala volumes correlated with baseline delayed recall scores. Patients with smaller baseline volumes of the hippocampus and amygdala, or larger baseline volumes of the temporal horn, had more rapid subsequent clinical decline on the MMSE and CDR SB. Fully-automated and rapid measurement of segmental MTL volumes may help clinicians predict clinical decline in MCI patients. PMID:19474571

  1. Assessment of fully-automated atlas-based segmentation of novel oral mucosal surface organ-at-risk

    PubMed Central

    Dean, Jamie A; Welsh, Liam C; McQuaid, Dualta; Wong, Kee H; Aleksic, Aleksandar; Dunne, Emma; Islam, Mohammad R; Patel, Anushka; Patel, Priyanka; Petkar, Imran; Phillips, Iain; Sham, Jackie; Newbold, Kate L; Bhide, Shreerang A; Harrington, Kevin J; Gulliford, Sarah L; Nutting, Christopher M

    2016-01-01

    Background and Purpose Current oral mucositis normal tissue complication probability models, based on the dose distribution to the oral cavity volume, have suboptimal predictive power. Improving the delineation of the oral mucosa is likely to improve these models, but is resource intensive. We developed and evaluated fully-automated atlas-based segmentation (ABS) of a novel delineation technique for the oral mucosal surfaces. Material and Methods An atlas of mucosal surface contours (MSC) consisting of 46 patients was developed. It was applied to an independent test cohort of 10 patients for whom manual segmentation of MSC structures, by three different clinicians, and conventional outlining of oral cavity contours (OCC), by an additional clinician, were also performed. Geometric comparisons were made using the Dice similarity coefficient (DSC), validation index (VI) and Hausdorff distance (HD). Dosimetric comparisons were carried out using dose-volume histograms. Results The median differences in the DSC and HD between automated-manual comparisons and manual-manual comparisons were small and non-significant (-0.024, p = 0.33 and -0.5, p = 0.88, respectively). The median VI was 0.086. The maximum normalised volume difference between automated and manual MSC structures across all of the dose levels, averaged over the test cohort, was 8%. This difference reached approximately 28% when comparing automated MSC and OCC structures. Conclusions Fully-automated ABS of MSC is suitable for use in radiotherapy dose-response modelling. PMID:26970676

  2. Fully Automated Whole-Head Segmentation with Improved Smoothness and Continuity, with Theory Reviewed

    PubMed Central

    Huang, Yu; Parra, Lucas C.

    2015-01-01

    Individualized current-flow models are needed for precise targeting of brain structures using transcranial electrical or magnetic stimulation (TES/TMS). The same is true for current-source reconstruction in electroencephalography and magnetoencephalography (EEG/MEG). The first step in generating such models is to obtain an accurate segmentation of individual head anatomy, including not only the brain but also the cerebrospinal fluid (CSF), skull and soft tissues, with a field of view (FOV) that covers the whole head. Currently available automated segmentation tools only provide results for brain tissues, have a limited FOV, and do not guarantee the continuity and smoothness of tissues, which is crucially important for accurate current-flow estimates. Here we present a tool that addresses these needs. It is based on a rigorous Bayesian inference framework that combines an image intensity model, an anatomical prior (atlas) and morphological constraints using Markov random fields (MRF). The method is evaluated on 20 simulated and 8 real head volumes acquired with magnetic resonance imaging (MRI) at 1 mm3 resolution. We find improved surface smoothness and continuity compared to the segmentation algorithms currently implemented in Statistical Parametric Mapping (SPM). With this tool, accurate and morphologically correct modeling of whole-head anatomy for individual subjects may now be feasible on a routine basis. Code and data are fully integrated into the SPM software tool and are made publicly available. In addition, a review of atlas- and MRF-based MRI segmentation over the last 20 years is provided, with the general mathematical framework derived in full. PMID:25992793
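    The Bayesian combination at the heart of such pipelines, a per-voxel posterior proportional to intensity likelihood times atlas prior, can be sketched without the MRF smoothing term (the names and the Gaussian likelihood below are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def map_labels(intensity, atlas_prior, class_means, class_sigmas):
    """Per-voxel MAP tissue label: Gaussian intensity likelihood per class,
    multiplied by a spatial atlas prior (MRF constraints omitted)."""
    x = np.asarray(intensity, float)[..., None]   # shape (voxels, 1)
    mu = np.asarray(class_means, float)           # shape (classes,)
    sd = np.asarray(class_sigmas, float)
    likelihood = np.exp(-0.5 * ((x - mu) / sd) ** 2) / sd
    posterior = likelihood * np.asarray(atlas_prior, float)
    return posterior.argmax(axis=-1)
```

    The MRF term the paper adds couples neighboring voxels, which is what enforces the smoothness and continuity that this voxel-independent sketch cannot guarantee.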

  3. Fully Automated Assessment of the Severity of Parkinson’s Disease from Speech

    PubMed Central

    Bayestehtashk, Alireza; Asgari, Meysam; Shafran, Izhak; McNames, James

    2014-01-01

    For several decades now, there has been sporadic interest in automatically characterizing the speech impairment due to Parkinson’s disease (PD). Most early studies were confined to quantifying a few speech features that were easy to compute. More recent studies have adopted a machine learning approach where a large number of potential features are extracted and the models are learned automatically from the data. In the same vein, here we characterize the disease using a relatively large cohort of 168 subjects, collected from multiple (three) clinics. We elicited speech using three tasks – the sustained phonation task, the diadochokinetic task and a reading task, all within a time budget of 4 minutes, prompted by a portable device. From these recordings, we extracted 1582 features for each subject using openSMILE, a standard feature extraction tool. We compared the effectiveness of three strategies for learning a regularized regression and find that ridge regression performs better than lasso and support vector regression for our task. We refine the feature extraction to capture pitch-related cues, including jitter and shimmer, more accurately using a time-varying harmonic model of speech. Our results show that the severity of the disease can be inferred from speech with a mean absolute error of about 5.5, explaining 61% of the variance and consistently well above chance across all clinics. Of the three speech elicitation tasks, we find that the reading task is significantly better at capturing cues than the diadochokinetic or sustained phonation tasks. In all, we have demonstrated that the data collection and inference can be fully automated, and the results show that speech-based assessment has promising practical applications in PD. The techniques reported here are more widely applicable to other paralinguistic tasks in the clinical domain. PMID:25382935

  4. Mammographic Breast Density Evaluation in Korean Women Using Fully Automated Volumetric Assessment

    PubMed Central

    2016-01-01

The purpose was to present the mean breast density of Korean women according to age using fully automated volumetric assessment. This study included 5,967 screening normal or benign mammograms (mean age, 46.2 ± 9.7 years; range, 30–89 years) from a cancer-screening program. We evaluated mean fibroglandular tissue volume, breast tissue volume, and volumetric breast density (VBD); the results were 53.7 ± 30.8 cm³, 383.8 ± 205.2 cm³, and 15.8% ± 7.3%, respectively. The frequency of dense breasts and mean VBD by age group were 94.3% and 19.1% ± 6.7% for the 30s (n = 1,484), 91.4% and 17.2% ± 6.8% for the 40s (n = 2,706), 72.2% and 12.4% ± 6.2% for the 50s (n = 1,138), 44.0% and 8.6% ± 4.3% for the 60s (n = 89), 39.1% and 8.0% ± 3.8% for the 70s (n = 138), and 39.1% and 8.0% ± 3.5% for the 80s (n = 12). The frequency of dense breasts was higher in younger women (n = 4,313, 92.3%) than in older women (n = 1,654, 59.8%). Mean VBD decreased with aging or menopause, and was about 16% for 46-year-old Korean women, much higher than in other countries. The proportion of dense breasts sharply decreases in Korean women between 40 and 69 years of age. PMID:26955249
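VBD is simply fibroglandular tissue volume expressed as a percentage of total breast tissue volume. A quick check with the cohort means reported above also shows why the reported mean VBD (an average of per-woman ratios) differs from the ratio of the mean volumes:

```python
# Volumetric breast density (VBD) = fibroglandular volume / breast volume,
# as a percentage. Using the cohort means reported in the abstract:
fibroglandular_cm3 = 53.7
breast_cm3 = 383.8
vbd_of_means = fibroglandular_cm3 / breast_cm3 * 100
print(f"VBD from mean volumes: {vbd_of_means:.1f}%")  # ~14.0%
# The reported mean VBD (15.8%) is the average of per-woman ratios, which
# in general differs from the ratio of the means computed here.
```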

  5. Fully Automated Assessment of the Severity of Parkinson's Disease from Speech.

    PubMed

    Bayestehtashk, Alireza; Asgari, Meysam; Shafran, Izhak; McNames, James

    2015-01-01

For several decades now, there has been sporadic interest in automatically characterizing the speech impairment due to Parkinson's disease (PD). Most early studies were confined to quantifying a few speech features that were easy to compute. More recent studies have adopted a machine learning approach where a large number of potential features are extracted and the models are learned automatically from the data. In the same vein, here we characterize the disease using a relatively large cohort of 168 subjects, collected from multiple (three) clinics. We elicited speech using three tasks - the sustained phonation task, the diadochokinetic task and a reading task, all within a time budget of 4 minutes, prompted by a portable device. From these recordings, we extracted 1582 features for each subject using openSMILE, a standard feature extraction tool. We compared the effectiveness of three strategies for learning a regularized regression and found that ridge regression performs better than lasso and support vector regression for our task. We refined the feature extraction to capture pitch-related cues, including jitter and shimmer, more accurately using a time-varying harmonic model of speech. Our results show that the severity of the disease can be inferred from speech with a mean absolute error of about 5.5, explaining 61% of the variance and performing consistently well above chance across all clinics. Of the three speech elicitation tasks, we find that the reading task is significantly better at capturing cues than the diadochokinetic or sustained phonation tasks. In all, we have demonstrated that data collection and inference can be fully automated, and the results show that speech-based assessment has promising practical application in PD. The techniques reported here are more widely applicable to other paralinguistic tasks in the clinical domain. PMID:25382935

  6. Fully Automated Detection of Cloud and Aerosol Layers in the CALIPSO Lidar Measurements

    NASA Technical Reports Server (NTRS)

    Vaughan, Mark A.; Powell, Kathleen A.; Kuehn, Ralph E.; Young, Stuart A.; Winker, David M.; Hostetler, Chris A.; Hunt, William H.; Liu, Zhaoyan; McGill, Matthew J.; Getzewich, Brian J.

    2009-01-01

Accurate knowledge of the vertical and horizontal extent of clouds and aerosols in the earth's atmosphere is critical in assessing the planet's radiation budget and for advancing human understanding of climate change issues. To retrieve this fundamental information from the elastic backscatter lidar data acquired during the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) mission, a selective, iterated boundary location (SIBYL) algorithm has been developed and deployed. SIBYL accomplishes its goals by integrating an adaptive context-sensitive profile scanner into an iterated multiresolution spatial averaging scheme. This paper provides an in-depth overview of the architecture and performance of the SIBYL algorithm. It begins with a brief review of the theory of target detection in noise-contaminated signals, and an enumeration of the practical constraints levied on the retrieval scheme by the design of the lidar hardware, the geometry of a space-based remote sensing platform, and the spatial variability of the measurement targets. Detailed descriptions are then provided for both the adaptive threshold algorithm used to detect features of interest within individual lidar profiles and the fully automated multiresolution averaging engine within which this profile scanner functions. The resulting fusion of profile scanner and averaging engine is specifically designed to optimize the trade-offs between the widely varying signal-to-noise ratio of the measurements and the disparate spatial resolutions of the detection targets. Throughout the paper, specific algorithm performance details are illustrated using examples drawn from the existing CALIPSO dataset. Overall performance is established by comparisons to existing layer height distributions obtained by other airborne and space-based lidars.
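The core of such a profile scanner is threshold-based detection against estimated noise statistics. The sketch below illustrates that principle only; it is not the SIBYL implementation, and the profile, threshold factor, and "feature-free" regions are all assumptions of the example.

```python
import numpy as np

# Minimal sketch of profile scanning: flag an elevated-signal layer in a
# noisy backscatter-like profile by thresholding against noise statistics
# estimated from regions assumed to be feature-free.
rng = np.random.default_rng(1)
profile = rng.normal(loc=1.0, scale=0.2, size=500)  # clear-air background
profile[200:240] += 3.0                             # embedded "cloud" layer

background = np.r_[profile[:100], profile[-100:]]   # assumed feature-free
threshold = background.mean() + 5.0 * background.std()
above = profile > threshold

edges = np.flatnonzero(np.diff(above.astype(int)))  # layer boundary bins
start, end = edges[0] + 1, edges[-1] + 1
print(f"layer detected between bins {start} and {end}")
```

Real retrievals must additionally average weak profiles before scanning, which is exactly the multiresolution trade-off the abstract describes.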

  7. Accurate, fully-automated registration of coronary arteries for volumetric CT digital subtraction angiography

    NASA Astrophysics Data System (ADS)

    Razeto, Marco; Mohr, Brian; Arakita, Kazumasa; Schuijf, Joanne D.; Fuchs, Andreas; Kühl, J. Tobias; Chen, Marcus Y.; Kofoed, Klaus F.

    2014-03-01

Diagnosis of coronary artery disease with Coronary Computed Tomography Angiography (CCTA) is complicated by the presence of significant calcification or stents. Volumetric CT Digital Subtraction Angiography (CTDSA) has recently been shown to be effective at overcoming these limitations. Precise registration of structures is essential as any misalignment can produce artifacts potentially inhibiting clinical interpretation of the data. The fully-automated registration method described in this paper addresses the problem by combining a dense deformation field with rigid-body transformations where calcifications/stents are present. The method contains non-rigid and rigid components. Non-rigid registration recovers the majority of motion artifacts and produces a dense deformation field valid over the entire scan domain. Discrete domains are identified in which rigid registrations very accurately align each calcification/stent. These rigid-body transformations are combined within the immediate area of the deformation field using a distance transform to minimize distortion of the surrounding tissue. A recent interim analysis of a clinical feasibility study evaluated reader confidence and diagnostic accuracy in conventional CCTA and CTDSA registered using this method. Conventional invasive coronary angiography was used as the reference. The study included 27 patients scanned with a second-generation 320-row CT detector in which 41 lesions were identified. Compared to conventional CCTA, CTDSA improved reader confidence in 13/36 (36%) of segments with severe calcification and 3/5 (60%) of segments with coronary stents. Also, the false positive rate of CTDSA was reduced compared to conventional CCTA from 18% (24/130) to 14% (19/130).

  8. Fully automated intrinsic respiratory and cardiac gating for small animal CT

    NASA Astrophysics Data System (ADS)

    Kuntz, J.; Dinkel, J.; Zwick, S.; Bäuerle, T.; Grasruck, M.; Kiessling, F.; Gupta, R.; Semmler, W.; Bartling, S. H.

    2010-04-01

    A fully automated, intrinsic gating algorithm for small animal cone-beam CT is described and evaluated. A parameter representing the organ motion, derived from the raw projection images, is used for both cardiac and respiratory gating. The proposed algorithm makes it possible to reconstruct motion-corrected still images as well as to generate four-dimensional (4D) datasets representing the cardiac and pulmonary anatomy of free-breathing animals without the use of electrocardiogram (ECG) or respiratory sensors. Variation analysis of projections from several rotations is used to place a region of interest (ROI) on the diaphragm. The ROI is cranially extended to include the heart. The centre of mass (COM) variation within this ROI, the filtered frequency response and the local maxima are used to derive a binary motion-gating parameter for phase-sensitive gated reconstruction. This algorithm was implemented on a flat-panel-based cone-beam CT scanner and evaluated using a moving phantom and animal scans (seven rats and eight mice). Volumes were determined using a semiautomatic segmentation. In all cases robust gating signals could be obtained. The maximum volume error in phantom studies was less than 6%. By utilizing extrinsic gating via externally placed cardiac and respiratory sensors, the functional parameters (e.g. cardiac ejection fraction) and image quality were equivalent to this current gold standard. This algorithm obviates the necessity of both gating hardware and user interaction. The simplicity of the proposed algorithm enables adoption in a wide range of small animal cone-beam CT scanners.
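The gating signal in this kind of intrinsic scheme comes from tracking the center of mass (COM) of an ROI across raw projections. The sketch below is an illustration of that idea with a simulated moving diaphragm edge, not the published algorithm; the breathing model and acceptance rule are assumptions of the example.

```python
import numpy as np

# Illustrative sketch of intrinsic gating: derive a binary respiratory
# gate from the center of mass (COM) of an ROI in each raw projection.
# Projections are simulated as a bright "diaphragm" region whose upper
# edge oscillates with a breathing cycle.
rng = np.random.default_rng(2)
n_proj, height = 360, 64
phase = np.sin(2 * np.pi * np.arange(n_proj) / 60.0)  # breathing cycle
projections = np.zeros((n_proj, height))
for i in range(n_proj):
    edge = int(32 + 6 * phase[i])                      # moving diaphragm
    projections[i, edge:] = 1.0
projections += rng.normal(scale=0.05, size=projections.shape)

rows = np.arange(height)
com = (projections * rows).sum(axis=1) / projections.sum(axis=1)
gate = com < np.median(com)        # accept one respiratory phase bin
print(f"accepted {gate.sum()} of {n_proj} projections")
```

Only the accepted projections would then enter a phase-sensitive reconstruction; repeating with several COM bins yields a 4D dataset.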

  9. Fully Automated Laser Ablation Liquid Capture Sample Analysis using NanoElectrospray Ionization Mass Spectrometry

    SciTech Connect

    Lorenz, Matthias; Ovchinnikova, Olga S; Van Berkel, Gary J

    2014-01-01

RATIONALE: Laser ablation provides for the possibility of sampling a large variety of surfaces with high spatial resolution. This type of sampling, when employed in conjunction with liquid capture followed by nanoelectrospray ionization, provides the opportunity for sensitive and prolonged interrogation of samples by mass spectrometry as well as the ability to analyze surfaces not amenable to direct liquid extraction. METHODS: A fully automated, reflection geometry, laser ablation liquid capture spot sampling system was achieved by incorporating appropriate laser fiber optics and a focusing lens into a commercially available, liquid extraction surface analysis (LESA) ready Advion TriVersa NanoMate system. RESULTS: Under optimized conditions about 10% of laser ablated material could be captured in a droplet positioned vertically over the ablation region using the NanoMate robot-controlled pipette. The sampling spot size area with this laser ablation liquid capture surface analysis (LA/LCSA) mode of operation (typically about 120 µm x 160 µm) was approximately 50 times smaller than that achievable by direct liquid extraction using LESA (ca. 1 mm diameter liquid extraction spot). The set-up was successfully applied to the analysis of ink on glass and paper as well as the endogenous components in Alstroemeria Yellow King flower petals. In a second mode of operation with a comparable sampling spot size, termed laser ablation/LESA, the laser system was used to drill through, penetrate, or otherwise expose material beneath a solvent-resistant surface. Once drilled, LESA was effective in sampling soluble material exposed at that location on the surface. CONCLUSIONS: Incorporating the capability for different laser ablation liquid capture spot sampling modes of operation into a LESA-ready Advion TriVersa NanoMate enhanced the spot sampling spatial resolution of this device and broadened the surface types amenable to analysis to include absorbent and solvent-resistant surfaces.

  10. Fully automated, quantitative, noninvasive assessment of collagen fiber content and organization in thick collagen gels

    NASA Astrophysics Data System (ADS)

    Bayan, Christopher; Levitt, Jonathan M.; Miller, Eric; Kaplan, David; Georgakoudi, Irene

    2009-05-01

Collagen is the most prominent protein of human tissues. Its content and organization define to a large extent the mechanical properties of tissue as well as its function. Methods that have been used traditionally to visualize and analyze collagen are invasive, provide only qualitative or indirect information, and have limited use in studies that aim to understand the dynamic nature of collagen remodeling and its interactions with the surrounding cells and other matrix components. Second harmonic generation (SHG) imaging emerged as a promising noninvasive modality for providing high-resolution images of collagen fibers within thick specimens, such as tissues. In this article, we present a fully automated procedure to acquire quantitative information on the content, orientation, and organization of collagen fibers. We use this procedure to monitor the dynamic remodeling of collagen gels in the absence or presence of fibroblasts over periods of 12 or 14 days. We find that an adaptive thresholding and stretching approach provides great insight to the content of collagen fibers within SHG images without the need for user input. An additional feature-erosion and feature-dilation step is useful for preserving structure and noise removal in images with low signal. To quantitatively assess the orientation of collagen fibers, we extract the orientation index (OI), a parameter based on the power distribution of the spatial-frequency-averaged, two-dimensional Fourier transform of the SHG images. To measure the local organization of the collagen fibers, we assess the Hough transform of small tiles of the image and compute the entropy distribution, which represents the probability of finding the direction of fibers along a dominant direction. Using these methods we observed that the presence and number of fibroblasts within the collagen gel significantly affects the remodeling of the collagen matrix.
In the absence of fibroblasts, gels contract, especially during the first few
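The orientation measure described above rests on a simple property of the 2D Fourier transform: aligned fibers concentrate spectral power along one frequency axis, while isotropic texture spreads it evenly. The sketch below illustrates that principle with a toy axis-power ratio; the paper's exact OI definition may differ, so treat the function name and formula as assumptions of the example.

```python
import numpy as np

# Hedged sketch of an orientation measure in the spirit of the OI above:
# compare 2D-FFT power along the two frequency axes for an aligned-stripe
# image versus an isotropic noise image.
def axis_power_ratio(img):
    f = np.fft.fftshift(np.fft.fft2(img - img.mean()))
    p = np.abs(f) ** 2
    cy, cx = p.shape[0] // 2, p.shape[1] // 2
    horiz = p[cy, :].sum()   # power along kx (variation across columns)
    vert = p[:, cx].sum()    # power along ky (variation across rows)
    return horiz / (horiz + vert + 1e-12)

x = np.arange(128)
stripes = np.tile(np.sin(2 * np.pi * x / 8), (128, 1))  # vertical "fibers"
rng = np.random.default_rng(3)
noise = rng.normal(size=(128, 128))                     # no preferred axis

r_stripes = axis_power_ratio(stripes)
r_noise = axis_power_ratio(noise)
print(f"stripes: {r_stripes:.2f}, noise: {r_noise:.2f}")
```

A ratio near 1 (or 0) indicates strong alignment along one axis; a ratio near 0.5 indicates no dominant orientation, which is the qualitative behavior the OI is meant to capture.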

  11. Numerical and experimental analysis of a ducted propeller designed by a fully automated optimization process under open water condition

    NASA Astrophysics Data System (ADS)

    Yu, Long; Druckenbrod, Markus; Greve, Martin; Wang, Ke-qi; Abdel-Maksoud, Moustafa

    2015-10-01

A fully automated optimization process is presented for the design of ducted propellers under open water conditions, covering 3D geometry modeling, meshing, the optimization algorithm, and CFD analysis techniques. The developed process allows the direct integration of a RANSE solver in the design stage. A practical ducted propeller design case study is carried out for validation. Numerical simulations and open water tests were performed and confirmed that the optimized ducted propeller improves hydrodynamic performance as predicted.

  12. Two Fully Automated Web-Based Interventions for Risky Alcohol Use: Randomized Controlled Trial

    PubMed Central

    Strüber, Evelin

    2013-01-01

    Background Excessive alcohol use is a widespread problem in many countries, especially among young people. To reach more people engaging in high-risk drinking behaviors, a number of online programs have been developed in recent years. Change Your Drinking is a German, diary-based, fully automated alcohol intervention. In 2010, a revised version of the program was developed. It is more strongly oriented to concepts of relapse prevention than the previous version, includes more feedback, and offers more possibilities to interact with the program. Moreover, the program duration was extended from 10 to 14 days. Objective This paper examines whether the revised version of Change Your Drinking is more effective in reducing alcohol consumption than the original version. Methods The effectiveness of both program versions was compared in a Web-based, open, randomized controlled trial with follow-up surveys 6 weeks and 3 months after registration. Participants were recruited online and were randomly assigned to either the original or the revised version of Change Your Drinking. The following self-assessed outcomes were used: alcohol use days, alcohol intake in grams, the occurrence of binge drinking and risky drinking (all referring to the past 7 days prior to each survey), and the number of alcohol-related problems. Results A total of 595 participants were included in the trial. Follow-up rates were 58.0% after 6 weeks and 49.6% after 3 months. No significant group differences were found in any of the outcomes. However, the revised version was used by more participants (80.7%) than the original version (55.7%). A significant time effect was detected in all outcomes (alcohol use days: P=.002; alcohol intake in grams: P<.001; binge drinking: P<.001; alcohol-related problems: P=.004; risky drinking: P<.001). Conclusions The duration and complexity of the program played a minor role in reducing alcohol consumption. However, differences in program usage between the versions

  13. Results from the first fully automated PBS-mask process and pelliclization

    NASA Astrophysics Data System (ADS)

    Oelmann, Andreas B.; Unger, Gerd M.

    1994-02-01

Automation is widely discussed in IC and mask manufacturing and partially realized everywhere. The idea for this automation goes back to 1978, when it became clear that the operators of the then newly installed PBS process line (the first in Europe) would have to be trained to behave like robots in order to reduce particle contamination and achieve lower defect densities on the masks. More than this goal has been achieved. It also turned out recently that the automation, with its dedicated work routes and detailed documentation of every lot (individual mask or reticle), made it easy to obtain the CEEC certificate, which includes ISO 9001.

  14. A Rapid, Fully Automated, Molecular-Based Assay Accurately Analyzes Sentinel Lymph Nodes for the Presence of Metastatic Breast Cancer

    PubMed Central

    Hughes, Steven J.; Xi, Liqiang; Raja, Siva; Gooding, William; Cole, David J.; Gillanders, William E.; Mikhitarian, Keidi; McCarty, Kenneth; Silver, Susan; Ching, Jesus; McMillan, William; Luketich, James D.; Godfrey, Tony E.

    2006-01-01

    Objective: To develop a fully automated, rapid, molecular-based assay that accurately and objectively evaluates sentinel lymph nodes (SLN) from breast cancer patients. Summary Background Data: Intraoperative analysis for the presence of metastatic cancer in SLNs from breast cancer patients lacks sensitivity. Even with immunohistochemical staining (IHC) and time-consuming review, alarming discordance in the interpretation of SLN has been observed. Methods: A total of 43 potential markers were evaluated for the ability to accurately characterize lymph node specimens from breast cancer patients as compared with complete histologic analysis including IHC. Selected markers then underwent external validation on 90 independent SLN specimens using rapid, multiplex quantitative reverse transcription-polymerase chain reaction (QRT-PCR) assays. Finally, 18 SLNs were analyzed using a completely automated RNA isolation, reverse transcription, and quantitative PCR instrument (GeneXpert). Results: Following analysis of potential markers, promising markers were evaluated to establish relative level of expression cutoff values that maximized classification accuracy. A validation set of 90 SLNs from breast cancer patients was prospectively characterized using 4 markers individually or in combinations, and the results compared with histologic analysis. A 2-marker assay was found to be 97.8% accurate (94% sensitive, 100% specific) compared with histologic analysis. The fully automated GeneXpert instrument produced comparable and reproducible results in less than 35 minutes. Conclusions: A rapid, fully automated QRT-PCR assay definitively characterizes breast cancer SLN with accuracy equal to conventional pathology. This approach is superior to intraoperative SLN analysis and can provide standardized, objective results to assist in pathologic diagnosis. PMID:16495705

  15. A Fully Automated Method for CT-on-Rails-Guided Online Adaptive Planning for Prostate Cancer Intensity Modulated Radiation Therapy

    SciTech Connect

    Li, Xiaoqiang; Quan, Enzhuo M.; Li, Yupeng; Pan, Xiaoning; Zhou, Yin; Wang, Xiaochun; Du, Weiliang; Kudchadker, Rajat J.; Johnson, Jennifer L.; Kuban, Deborah A.; Lee, Andrew K.; Zhang, Xiaodong

    2013-08-01

Purpose: This study was designed to validate a fully automated adaptive planning (AAP) method which integrates automated recontouring and automated replanning to account for interfractional anatomical changes in prostate cancer patients receiving adaptive intensity modulated radiation therapy (IMRT) based on daily repeated computed tomography (CT)-on-rails images. Methods and Materials: Nine prostate cancer patients treated at our institution were randomly selected. For the AAP method, contours on each repeat CT image were automatically generated by mapping the contours from the simulation CT image using deformable image registration. An in-house automated planning tool incorporated into the Pinnacle treatment planning system was used to generate the original and the adapted IMRT plans. The cumulative dose–volume histograms (DVHs) of the target and critical structures were calculated based on the manual contours for all plans and compared with those of plans generated by the conventional method, that is, shifting the isocenters by aligning the images based on the center of the volume (COV) of prostate (prostate COV-aligned). Results: The target coverage from our AAP method for every patient was acceptable, while 1 of the 9 patients showed target underdosing from prostate COV-aligned plans. The normalized volume receiving at least 70 Gy (V70), and the mean dose of the rectum and bladder were reduced by 8.9%, 6.4 Gy and 4.3%, 5.3 Gy, respectively, for the AAP method compared with the values obtained from prostate COV-aligned plans. Conclusions: The AAP method, which is fully automated, is effective for online replanning to compensate for target dose deficits and critical organ overdosing caused by interfractional anatomical changes in prostate cancer.

  16. Development and application of fully-automated EBIC techniques for solar-cell measurements

    SciTech Connect

    Russell, P.E.; Herrington, C.R.

    1982-04-01

The electron beam induced current, or EBIC, technique is a powerful tool for the investigation of semiconductor materials and device properties. The technique utilizes a focused electron beam as a source of electron-hole pair generation in a well controlled, localized volume. If the sample contains a collecting electrical junction, the short circuit current response to the electron beam can be measured and/or used as intensity modulation for a short circuit current, or EBIC, map. In this work, significant improvements to the state of the art in EBIC measurement systems are described. The system developed is fully computer-automated and includes such features as beam current regulation and reproducibility, beam blanking, and automated digital data recording, display, and manipulation. The manual system on which the automation is based will be described first, followed by the objectives of automating the system. Then a complete description of the automated system including hardware, software and several examples of the use of the system will be detailed. The most commonly desired form of EBIC data is either EBIC or log EBIC versus beam position.

  17. Parenchymal texture analysis in digital mammography: A fully automated pipeline for breast cancer risk assessment

    PubMed Central

    Zheng, Yuanjie; Keller, Brad M.; Ray, Shonket; Wang, Yan; Conant, Emily F.; Gee, James C.; Kontos, Despina

    2015-01-01

Purpose: Mammographic percent density (PD%) is known to be a strong risk factor for breast cancer. Recent studies also suggest that parenchymal texture features, which are more granular descriptors of the parenchymal pattern, can provide additional information about breast cancer risk. To date, most studies have measured mammographic texture within selected regions of interest (ROIs) in the breast, which cannot adequately capture the complexity of the parenchymal pattern throughout the whole breast. To better characterize patterns of the parenchymal tissue, the authors have developed a fully automated software pipeline based on a novel lattice-based strategy to extract a range of parenchymal texture features from the entire breast region. Methods: Digital mammograms from 106 cases with 318 age-matched controls were retrospectively analyzed. The lattice-based approach is based on a regular grid virtually overlaid on each mammographic image. Texture features are computed from the intersection (i.e., lattice) points of the grid lines within the breast, using a local window centered at each lattice point. Using this strategy, a range of statistical (gray-level histogram, co-occurrence, and run-length) and structural (edge-enhancing, local binary pattern, and fractal dimension) features are extracted. To cover the entire breast, the size of the local window for feature extraction is set equal to the lattice grid spacing and optimized experimentally by evaluating different window sizes. The association between the lattice-based texture features and breast cancer was evaluated using logistic regression with leave-one-out cross validation and further compared to that of breast PD% and commonly used single-ROI texture features extracted from the retroareolar or the central breast region. Classification performance was evaluated using the area under the curve (AUC) of the receiver operating characteristic (ROC). DeLong’s test was used to compare the different ROCs in
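The lattice-based sampling described above amounts to iterating over regular grid points and computing texture statistics in a local window at each one. The sketch below illustrates only that sampling pattern; the image is synthetic and the feature set (mean, std, histogram entropy) is a stand-in for the paper's statistical and structural features.

```python
import numpy as np

# Illustrative sketch of lattice-based texture sampling: overlay a regular
# grid on the image and compute simple gray-level statistics in a local
# window centered at each lattice point (window size = grid spacing).
rng = np.random.default_rng(4)
image = rng.random((256, 256))   # stand-in for a mammogram region
spacing = 32                     # lattice grid spacing
half = spacing // 2

features = []
for y in range(half, image.shape[0] - half, spacing):
    for x in range(half, image.shape[1] - half, spacing):
        win = image[y - half:y + half, x - half:x + half]
        hist, _ = np.histogram(win, bins=16, range=(0.0, 1.0))
        p = hist / hist.sum()
        p = p[p > 0]
        features.append({"y": y, "x": x,
                         "mean": win.mean(),
                         "std": win.std(),
                         "entropy": float(-(p * np.log2(p)).sum())})
print(f"{len(features)} lattice points sampled")
```

In the actual pipeline each lattice point's feature vector would then feed the logistic-regression risk model rather than being printed.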

  18. Fully automated 3D prostate central gland segmentation in MR images: a LOGISMOS based approach

    NASA Astrophysics Data System (ADS)

    Yin, Yin; Fotin, Sergei V.; Periaswamy, Senthil; Kunz, Justin; Haldankar, Hrishikesh; Muradyan, Naira; Turkbey, Baris; Choyke, Peter

    2012-02-01

One widely accepted classification of the prostate divides it into a central gland (CG) and a peripheral zone (PZ). In some clinical applications, separating the CG and PZ from the whole prostate is useful. For instance, in prostate cancer detection, radiologists want to know in which zone the cancer occurs. Another application is multiparametric MR tissue characterization. In prostate T2 MR images, due to the high intensity variation between the CG and PZ, automated differentiation of the two zones is difficult. Previously, we developed an automated prostate boundary segmentation system, which was tested on large datasets and showed good performance. Using the results of the pre-segmented prostate boundary, in this paper we propose an automated CG segmentation algorithm based on Layered Optimal Graph Image Segmentation of Multiple Objects and Surfaces (LOGISMOS). The designed LOGISMOS model contains both shape and topology information during deformation. We generate the graph cost by training classifiers and use a coarse-to-fine search. The LOGISMOS framework guarantees an optimal solution with respect to the cost function and shape constraints. A five-fold cross-validation approach was applied to a training dataset containing 261 images to optimize the system performance and compare with a voxel-classification-based reference approach. After the best parameter settings were found, the system was tested on a dataset containing another 261 images. The mean DSC of 0.81 for the test set indicates that our approach is promising for automated CG segmentation. Running time for the system is about 15 seconds.

  19. Fully Automated Sample Preparation for Ultrafast N-Glycosylation Analysis of Antibody Therapeutics.

    PubMed

    Szigeti, Marton; Lew, Clarence; Roby, Keith; Guttman, Andras

    2016-04-01

    There is a growing demand in the biopharmaceutical industry for high-throughput, large-scale N-glycosylation profiling of therapeutic antibodies in all phases of product development, but especially during clone selection when hundreds of samples should be analyzed in a short period of time to assure their glycosylation-based biological activity. Our group has recently developed a magnetic bead-based protocol for N-glycosylation analysis of glycoproteins to alleviate the hard-to-automate centrifugation and vacuum-centrifugation steps of the currently used protocols. Glycan release, fluorophore labeling, and cleanup were all optimized, resulting in a <4 h magnetic bead-based process with excellent yield and good repeatability. This article demonstrates the next level of this work by automating all steps of the optimized magnetic bead-based protocol from endoglycosidase digestion, through fluorophore labeling and cleanup with high-throughput sample processing in 96-well plate format, using an automated laboratory workstation. Capillary electrophoresis analysis of the fluorophore-labeled glycans was also optimized for rapid (<3 min) separation to accommodate the high-throughput processing of the automated sample preparation workflow. Ultrafast N-glycosylation analyses of several commercially relevant antibody therapeutics are also shown and compared to their biosimilar counterparts, addressing the biological significance of the differences. PMID:26429557

  20. Fully-automated synthesis of 16β-18F-fluoro-5α-dihydrotestosterone (FDHT) on the ELIXYS radiosynthesizer

    PubMed Central

    Lazari, Mark; Lyashchenko, Serge K.; Burnazi, Eva M.; Lewis, Jason S.; van Dam, R. Michael; Murphy, Jennifer M.

    2015-01-01

Noninvasive in vivo imaging of androgen receptor (AR) levels with positron emission tomography (PET) is becoming the primary tool in prostate cancer detection and staging. Of the potential 18F-labeled PET tracers, 18F-FDHT has been clinically shown to be of the highest diagnostic value. We demonstrate the first automated synthesis of 18F-FDHT by adapting the conventional manual synthesis onto the fully-automated ELIXYS radiosynthesizer. Clinically-relevant amounts of 18F-FDHT were synthesized on ELIXYS in 90 min with a decay-corrected radiochemical yield of 29 ± 5% (n = 7). The specific activity was 4.6 Ci/µmol (170 GBq/µmol) at end of formulation with a starting activity of 1.0 Ci (37 GBq). The formulated 18F-FDHT yielded sufficient activity for multiple patient doses and passed all quality control tests required for routine clinical use. PMID:26046518

  1. A fully automated and highly versatile system for testing multi-cognitive functions and recording neuronal activities in rodents.

    PubMed

    Zheng, Weimin; Ycu, Edgar A

    2012-01-01

    We have developed a fully automated system for operant behavior testing and neuronal activity recording by which multiple cognitive brain functions can be investigated in a single task sequence. The unique feature of this system is a custom-made, acoustically transparent chamber that eliminates many of the issues associated with auditory cue control in most commercially available chambers. The ease with which operant devices can be added or replaced makes this system quite versatile, allowing for the implementation of a variety of auditory, visual, and olfactory behavioral tasks. Automation of the system allows fine temporal (10 ms) control and precise time-stamping of each event in a predesigned behavioral sequence. When combined with a multi-channel electrophysiology recording system, multiple cognitive brain functions, such as motivation, attention, decision-making, patience, and rewards, can be examined sequentially or independently. PMID:22588124

  2. Feasibility of fully automated detection of fiducial markers implanted into the prostate using electronic portal imaging: A comparison of methods

    SciTech Connect

    Harris, Emma J. . E-mail: eharris@icr.ac.uk; McNair, Helen A.; Evans, Phillip M.

    2006-11-15

Purpose: To investigate the feasibility of fully automated detection of fiducial markers implanted into the prostate using portal images acquired with an electronic portal imaging device. Methods and Materials: We have made a direct comparison of 4 different methods (2 template matching-based methods, a method incorporating attenuation and constellation analyses, and a cross-correlation method) that have been published in the literature for the automatic detection of fiducial markers. The cross-correlation technique requires a priori information from the portal images; therefore, the technique is not fully automated for the first treatment fraction. Images of 7 patients implanted with gold fiducial markers (8 mm in length and 1 mm in diameter) were acquired before treatment (set-up images) and during treatment (movie images) using 1 MU and 15 MU per image, respectively. Images included: 75 anterior (AP) and 69 lateral (LAT) set-up images and 51 AP and 83 LAT movie images. Using the different methods described in the literature, marker positions were automatically identified. Results: The method based upon cross-correlation techniques gave the highest percentage detection success rate of 99% (AP) and 83% (LAT) for set-up (1 MU) images. The other methods gave detection success rates of less than 91% (AP) and 42% (LAT) for set-up images. The amount of a priori information used, and how it affects the way the techniques are implemented, is discussed. Conclusions: Fully automated marker detection in set-up images for the first treatment fraction is unachievable using these methods; cross-correlation is the best technique for automatic detection on subsequent radiotherapy treatment fractions.
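The cross-correlation approach that performed best above slides a marker template over the image and takes the normalized cross-correlation (NCC) maximum as the marker position. The sketch below is a generic NCC implementation on a synthetic portal-image stand-in, not the specific published pipeline; the image, noise level, and marker shape are assumptions of the example.

```python
import numpy as np

# Generic normalized cross-correlation (NCC) template matching: slide a
# marker template over the image and return the location of the maximum
# correlation score.
def ncc_detect(image, template):
    th, tw = template.shape
    t = template - template.mean()
    tnorm = np.sqrt((t ** 2).sum())
    best, best_pos = -np.inf, (0, 0)
    for y in range(image.shape[0] - th + 1):
        for x in range(image.shape[1] - tw + 1):
            w = image[y:y + th, x:x + tw]
            wz = w - w.mean()
            denom = np.sqrt((wz ** 2).sum()) * tnorm
            score = (wz * t).sum() / denom if denom > 0 else 0.0
            if score > best:
                best = score
                best_pos = (y, x)
    return best_pos, best

rng = np.random.default_rng(5)
marker = np.zeros((10, 6))
marker[1:9, 2:4] = 1.0                      # idealized elongated seed
image = rng.normal(scale=0.3, size=(64, 64))
image[30:40, 20:26] += 5.0 * marker         # implant marker at (30, 20)

pos, score = ncc_detect(image, marker)
print(f"marker found at {pos}, NCC = {score:.2f}")
```

Because NCC normalizes out local mean and contrast, the same template works across exposure levels, which is what makes the technique usable on both 1 MU set-up images and 15 MU movie images.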

  3. LDRD final report: Automated planning and programming of assembly of fully 3D mechanisms

    SciTech Connect

    Kaufman, S.G.; Wilson, R.H.; Jones, R.E.; Calton, T.L.; Ames, A.L.

    1996-11-01

    This report describes the results of assembly planning research under the LDRD. The assembly planning problem is that of finding a sequence of assembly operations, starting from individual parts, that will result in complete assembly of a device specified as a CAD model. The automated assembly programming problem is that of automatically producing a robot program that will carry out a given assembly sequence. Given solutions to both of these problems, it is possible to automatically program a robot to assemble a mechanical device given as a CAD data file. This report describes the current state of our solutions to both of these problems, and a software system called Archimedes 2 we have constructed to automate these solutions. Because Archimedes 2 can input CAD data in several standard formats, we have been able to test it on a number of industrial assembly models more complex than any before attempted by automated assembly planning systems, some having over 100 parts. A complete path from a CAD model to an automatically generated robot program for assembling the device represented by the CAD model has also been demonstrated.

  4. A Robust and Fully-Automated Chromatographic Method for the Quantitative Purification of Ca and Sr for Isotopic Analysis

    NASA Astrophysics Data System (ADS)

    Smith, H. B.; Kim, H.; Romaniello, S. J.; Field, P.; Anbar, A. D.

    2014-12-01

    High throughput methods for sample purification are required to effectively exploit new opportunities in the study of non-traditional stable isotopes. Many geochemical isotopic studies would benefit from larger data sets, but these are often impractical with manual drip chromatography techniques, which can be time-consuming and demand the attention of skilled laboratory staff. Here we present a new, fully-automated single-column method suitable for the purification of both Ca and Sr for stable and radiogenic isotopic analysis. The method can accommodate a wide variety of sample types, including carbonates, bones, and teeth; silicate rocks and sediments; fresh and marine waters; and biological samples such as blood and urine. Protocols for these isotopic analyses are being developed for use on the new prepFAST-MCTM system from Elemental Scientific (ESI). The system is highly adaptable and processes up to 24-60 samples per day by reusing a single chromatographic column. Efficient column cleaning between samples and an all Teflon flow path ensures that sample carryover is maintained at the level of background laboratory blanks typical for manual drip chromatography. This method is part of a family of new fully-automated chromatographic methods being developed to address many different isotopic systems including B, Ca, Fe, Cu, Zn, Sr, Cd, Pb, and U. These methods are designed to be rugged and transferrable, and to allow the preparation of large, diverse sample sets via a highly repeatable process with minimal effort.

  5. Computer-assisted automatic synthesis II. Development of a fully automated apparatus for preparing substituted N–(carboxyalkyl)amino acids

    PubMed Central

    Hayashi, Nobuyoshi; Sugawara, Tohru; Shintani, Motoaki; Kato, Shinji

    1989-01-01

    A versatile automated apparatus equipped with artificial intelligence has been developed which may be used to prepare and isolate a wide variety of compounds. The prediction of the optimum reaction conditions and the reaction control in real time are accomplished using novel kinetic equations and substituent effects in artificial intelligence software that has already been reported [1]. This paper deals with the design and construction of the fully automated system, and its application to the synthesis of a substituted N-(carboxyalkyl)amino acid. The apparatus is composed of units for performing various tasks, e.g. reagent supply, reaction, purification and separation, each linked to a control system. All synthetic processes, including washing and drying of the apparatus after each synthetic run, were performed automatically, from the mixing of the reactants to the isolation of the products as powders with purities greater than 98%. The automated apparatus has been able to run 24 hours per day, and the average rate of synthesis of substituted N-(carboxyalkyl)amino acids has been three compounds daily. The apparatus is extremely valuable for synthesizing many derivatives of one particular compound structure. Even if the chemical yields are low under the optimum conditions, it is still possible to obtain a sufficient amount of the desired product by repetition of the reaction. Moreover, it was possible to greatly reduce the manual involvement in the many syntheses which are a necessary part of pharmaceutical research. PMID:18924679

  7. A Neurocomputational Method for Fully Automated 3D Dendritic Spine Detection and Segmentation of Medium-sized Spiny Neurons

    PubMed Central

    Zhang, Yong; Chen, Kun; Baron, Matthew; Teylan, Merilee A.; Kim, Yong; Song, Zhihuan; Greengard, Paul

    2010-01-01

    Acquisition and quantitative analysis of high resolution images of dendritic spines are challenging tasks but are necessary for the study of animal models of neurological and psychiatric diseases. Currently available methods for automated dendritic spine detection are for the most part customized for 2D image slices, not volumetric 3D images. In this work, a fully automated method is proposed to detect and segment dendritic spines from 3D confocal microscopy images of medium-sized spiny neurons (MSNs). MSNs constitute a major neuronal population in striatum, and abnormalities in their function are associated with several neurological and psychiatric diseases. Such automated detection is critical for the development of new 3D neuronal assays which can be used for the screening of drugs and the studies of their therapeutic effects. The proposed method utilizes a generalized gradient vector flow (GGVF) with a new smoothing constraint and then detects feature points near the central regions of dendrites and spines. Then, the central regions are refined and separated based on eigen-analysis and multiple shape measurements. Finally, the spines are segmented in 3D space using the fast marching algorithm, taking the detected central regions of spines as initial points. The proposed method is compared with three popular existing methods for centerline extraction and also with manual results for dendritic spine detection in 3D space. The experimental results and comparisons show that the proposed method is able to automatically and accurately detect, segment, and quantitate dendritic spines in 3D images of MSNs. PMID:20100579
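The final segmentation step above grows each spine from its detected centre using fast marching. As a simple 2-D stand-in, a Dijkstra-style front propagation over a pixel grid captures the same idea (the paper solves the true Eikonal equation in 3-D; the function name and toy speed map here are illustrative):

```python
import heapq
import numpy as np

def front_propagation(speed, seed):
    """Propagate an arrival-time front from `seed` across a grid.

    Each pixel's traversal cost is the reciprocal of its speed, so the
    front advances quickly through high-speed (spine-like) regions and
    slowly elsewhere; 4-connectivity for simplicity.
    """
    h, w = speed.shape
    dist = np.full((h, w), np.inf)
    dist[seed] = 0.0
    heap = [(0.0, seed)]
    while heap:
        d, (y, x) = heapq.heappop(heap)
        if d > dist[y, x]:
            continue                           # stale heap entry
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w:
                nd = d + 1.0 / speed[ny, nx]   # low speed = high cost
                if nd < dist[ny, nx]:
                    dist[ny, nx] = nd
                    heapq.heappush(heap, (nd, (ny, nx)))
    return dist

speed = np.ones((5, 5))
speed[:, 2] = 0.1            # a slow "membrane" column
arrival = front_propagation(speed, (2, 0))
# crossing the slow column delays the front's arrival on the far side
```

Thresholding the arrival-time map around a detected centre point yields a segmented region, which is the role fast marching plays in the pipeline above.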

  8. A fully automated trabecular bone structural analysis tool based on T2* -weighted magnetic resonance imaging.

    PubMed

    Kraiger, Markus; Martirosian, Petros; Opriessnig, Peter; Eibofner, Frank; Rempp, Hansjoerg; Hofer, Michael; Schick, Fritz; Stollberger, Rudolf

    2012-03-01

    One major source of imprecision in bone structure analysis by quantitative magnetic resonance imaging (qMRI) is inter- and intraoperator variability, inherent in delineating and tracing regions of interest in longitudinal studies. In this paper an automated analysis tool is presented, featuring bone marrow segmentation, region-of-interest generation, and characterization of the cancellous bone of articular joints. In evaluation studies conducted at the knee joint, the novel analysis tool significantly decreased the standard error of measurement and improved the sensitivity in detecting minor structural changes. It further eliminated the need for time-consuming user interaction, thereby increasing reproducibility. PMID:21862288

  9. Early detection of glaucoma using fully automated disparity analysis of the optic nerve head (ONH) from stereo fundus images

    NASA Astrophysics Data System (ADS)

    Sharma, Archie; Corona, Enrique; Mitra, Sunanda; Nutter, Brian S.

    2006-03-01

    Early detection of structural damage to the optic nerve head (ONH) is critical in diagnosis of glaucoma, because such glaucomatous damage precedes clinically identifiable visual loss. Early detection of glaucoma can prevent progression of the disease and consequent loss of vision. Traditional early detection techniques involve observing changes in the ONH through an ophthalmoscope. Stereo fundus photography is also routinely used to detect subtle changes in the ONH. However, clinical evaluation of stereo fundus photographs suffers from inter- and intra-subject variability. Even the Heidelberg Retina Tomograph (HRT) has not been found to be sufficiently sensitive for early detection. A semi-automated algorithm for quantitative representation of the optic disc and cup contours by computing accumulated disparities in the disc and cup regions from stereo fundus image pairs has already been developed using advanced digital image analysis methodologies. A 3-D visualization of the disc and cup is achieved assuming camera geometry. High correlation among computer-generated and manually segmented cup to disc ratios in a longitudinal study involving 159 stereo fundus image pairs has already been demonstrated. However, clinical usefulness of the proposed technique can only be tested by a fully automated algorithm. In this paper, we present a fully automated algorithm for segmentation of optic cup and disc contours from corresponding stereo disparity information. Because this technique does not involve human intervention, it eliminates subjective variability encountered in currently used clinical methods and provides ophthalmologists with a cost-effective and quantitative method for detection of ONH structural damage for early detection of glaucoma.
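Once cup and disc contours are segmented, the clinically tracked summary number is the cup-to-disc ratio. A minimal sketch, assuming the contours have already been filled into binary masks (the paper derives the contours from stereo disparity first; the helper name and masks below are hypothetical):

```python
import numpy as np

def cup_to_disc_ratio(cup_mask, disc_mask):
    """Vertical cup-to-disc ratio from binary segmentations:
    the ratio of the vertical extents of the two regions.
    """
    cup_rows = np.flatnonzero(cup_mask.any(axis=1))
    disc_rows = np.flatnonzero(disc_mask.any(axis=1))
    cup_height = cup_rows[-1] - cup_rows[0] + 1
    disc_height = disc_rows[-1] - disc_rows[0] + 1
    return cup_height / disc_height

disc = np.zeros((100, 100), dtype=bool)
disc[20:80, 20:80] = True    # disc spans 60 rows
cup = np.zeros((100, 100), dtype=bool)
cup[35:65, 35:65] = True     # cup spans 30 rows
print(cup_to_disc_ratio(cup, disc))  # -> 0.5
```

Tracking this ratio over time, computed consistently by software rather than by eye, is what removes the inter-observer variability the abstract describes.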

  10. SARA South Observatory: A Fully Automated Boller & Chivens 0.6-m Telescope at C.T.I.O.

    NASA Astrophysics Data System (ADS)

    Mack, Peter; KanniahPadmanaban, S. Y.; Kaitchuck, R.; Borstad, A.; Luzier, N.

    2010-05-01

    The SARA South Observatory is the rebirth of the Lowell 24-inch telescope located on the south-east ridge of Cerro Tololo, Chile. Installed in 1968, this Boller & Chivens telescope fell into disuse for almost 20 years. The telescope and observatory have undergone a major restoration. A new dome with a wide slit has been fully automated with an ACE SmartDome controller featuring autonomous closure. The telescope was completely gutted, repainted, and virtually every electronic component and wire replaced. Modern infrastructure, such as USB, Ethernet, and video ports, has been incorporated into the telescope tube saddle boxes. Absolute encoders with a resolution of better than 0.7 arc seconds have been placed on the hour angle and declination axes. The secondary mirror is also equipped with an absolute encoder and temperature sensor to allow fully automated focus. New mirror coatings, automated mirror covers, a new 150 mm refractor, and new instrumentation have been deployed. An integrated X-stage guider and dual filter wheel containing 18 filters is used for direct imaging. The guider camera can be easily removed and a standard 2-inch eyepiece used for occasional viewing by VIPs at C.T.I.O. A 12-megapixel all-sky camera produces color images every 30 seconds, showing details in the Milky Way and Magellanic Clouds. Two low-light-level cameras are deployed: one on the finder and one at the top of the telescope showing a 30° field. Other auxiliary equipment, including daytime color video cameras, a weather station, and remotely controllable power outlets, permits complete control and servicing of the system. The SARA Consortium (www.saraobservatory.org), a collection of ten eastern universities, also operates a 0.9-m telescope at the Kitt Peak National Observatory using an almost identical set of instruments with the same ACE control system. This project was funded by the SARA Consortium.

  11. A fully automated primary screening system for the discovery of therapeutic antibodies directly from B cells.

    PubMed

    Tickle, Simon; Howells, Louise; O'Dowd, Victoria; Starkie, Dale; Whale, Kevin; Saunders, Mark; Lee, David; Lightwood, Daniel

    2015-04-01

    For a therapeutic antibody to succeed, it must meet a range of potency, stability, and specificity criteria. Many of these characteristics are conferred by the amino acid sequence of the heavy and light chain variable regions and, for this reason, can be screened for during antibody selection. However, it is important to consider that antibodies satisfying all these criteria may be of low frequency in an immunized animal; for this reason, it is essential to have a mechanism that allows for efficient sampling of the immune repertoire. UCB's core antibody discovery platform combines high-throughput B cell culture screening and the identification and isolation of single, antigen-specific IgG-secreting B cells through a proprietary technique called the "fluorescent foci" method. Using state-of-the-art automation to facilitate primary screening, extremely efficient interrogation of the natural antibody repertoire is made possible; more than 1 billion immune B cells can now be screened to provide a useful starting point from which to identify the rare therapeutic antibody. This article will describe the design, construction, and commissioning of a bespoke automated screening platform and two examples of how it was used to screen for antibodies against two targets. PMID:25548140

  12. Fully Automated Centrifugal Microfluidic Device for Ultrasensitive Protein Detection from Whole Blood.

    PubMed

    Park, Yang-Seok; Sunkara, Vijaya; Kim, Yubin; Lee, Won Seok; Han, Ja-Ryoung; Cho, Yoon-Kyoung

    2016-01-01

    Enzyme-linked immunosorbent assay (ELISA) is a promising method for detecting small amounts of proteins in biological samples. Devices providing a platform for reduced sample volume and assay time, as well as full automation, are required for potential use in point-of-care diagnostics. Recently, we demonstrated ultrasensitive detection of the serum proteins C-reactive protein (CRP) and cardiac troponin I (cTnI) utilizing a lab-on-a-disc composed of TiO2 nanofibrous (NF) mats. It showed a large dynamic range with femtomolar (fM) detection sensitivity from a small volume of whole blood in 30 min. The device consists of several components for blood separation, metering, mixing, and washing that are automated for improved sensitivity from low sample volumes. Here, in the video demonstration, we show the experimental protocols and know-how for the fabrication of the NFs and the disc, their integration, and their operation, in the following order: processes for preparing the TiO2 NF mat; transfer-printing of the TiO2 NF mat onto the disc; surface modification for immunoreactions; disc assembly and operation; and on-disc detection and representative results for the immunoassay. Use of this device enables multiplexed analysis with minimal consumption of samples and reagents. Given these advantages, the device should find use in a wide variety of applications and prove beneficial in facilitating the analysis of low-abundance proteins. PMID:27167836

  13. Revisiting the Fully Automated Double-ring Infiltrometer using Open-source Electronics

    NASA Astrophysics Data System (ADS)

    Ong, J.; Werkema, D., Jr.; Lane, J. W.

    2012-12-01

    The double-ring infiltrometer (DRI) is commonly used for measuring soil hydraulic conductivity. However, constant-head DRI tests typically involve the use of Mariotte tubes, which can be problematic to set up and time-consuming to maintain and monitor during infiltration tests. Maheshwari (1996, Australian Journal of Soil Research, v. 34, p. 709-714) developed a method that eliminates Mariotte tubes for constant-head tests, using a computer-controlled combination of water-level indicators and solenoids to maintain a near-constant head in the DRI. A pressure transducer mounted on a depth-to-volume calibrated tank measures the water delivery rates during the test, and data are saved on a hard drive or floppy disk. Here we use an inexpensive combination of pressure transducers, a microcontroller, and open-source electronics that eliminates the need for Mariotte tubes. The system automates DRI water delivery and data recording for both constant- and falling-head infiltration tests. The user has the option of supplying water to the DRI through a pressurized water system, a pump, or gravity feed. An LCD screen enables user interaction and observation of data for quality analysis in the field. The digital data are stored on a micro-SD card in standard column format for future retrieval and easy importing into conventional processing and plotting software. We show the results of infiltrometer tests using the automated system and a conventional Mariotte tube system conducted over test beds of uniform soils.
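The solenoid-based head control can be pictured as a simple bang-bang loop: the transducer reads the head, and the valve opens whenever the level drops below the setpoint. The class below is a hypothetical simulation of that logic; the setpoint, rates, and names are illustrative, not taken from the instrument.

```python
class ConstantHeadController:
    """Bang-bang control of the water level in a double-ring infiltrometer."""

    def __init__(self, setpoint_cm=10.0, valve_rate=0.5, infil_rate=0.1):
        self.setpoint = setpoint_cm
        self.valve_rate = valve_rate    # head added per tick, valve open (cm)
        self.infil_rate = infil_rate    # head lost per tick to infiltration (cm)
        self.head = 0.0
        self.delivered = 0.0            # cumulative water delivered (cm of head)
        self.log = []                   # (valve_open, head) per tick

    def tick(self):
        valve_open = self.head < self.setpoint
        if valve_open:
            self.head += self.valve_rate
            self.delivered += self.valve_rate
        self.head -= self.infil_rate    # loss to the soil every tick
        self.log.append((valve_open, self.head))

ctrl = ConstantHeadController()
for _ in range(100):
    ctrl.tick()
# after an initial fill, the head oscillates in a narrow band
# around the 10 cm setpoint, and `delivered` tracks infiltration
```

The delivered-volume log, time-stamped per tick, is exactly the record from which the infiltration rate is later computed.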

  14. Development of a fully automated Flow Injection analyzer implementing bioluminescent biosensors for water toxicity assessment.

    PubMed

    Komaitis, Efstratios; Vasiliou, Efstathios; Kremmydas, Gerasimos; Georgakopoulos, Dimitrios G; Georgiou, Constantinos

    2010-01-01

    This paper describes the development of an automated Flow Injection analyzer for water toxicity assessment. The analyzer is validated by assessing the toxicity of heavy metal (Pb(2+), Hg(2+) and Cu(2+)) solutions. One hundred μL of a Vibrio fischeri suspension are injected into a carrier solution containing different heavy metal concentrations. Biosensor cells are mixed with the toxic carrier solution in the mixing coil on the way to the detector. The registered response is the percentage inhibition of biosensor bioluminescence due to heavy metal toxicity, relative to that obtained by injecting the Vibrio fischeri suspension into deionised water. Carrier solutions of mercury showed higher toxicity than those of the other heavy metals, and all metals showed concentration-dependent levels of toxicity. The biosensor's response to carrier solutions of different pH was also tested; Vibrio fischeri bioluminescence is promoted in the pH 5-10 range. Experiments indicate that the whole-cell biosensor, as applied in the automated fluidic system, responds to various toxic solutions. PMID:22163592
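The inhibition readout described above reduces to a one-line calculation. A minimal sketch, with a hypothetical function name and illustrative luminescence values:

```python
def percent_inhibition(rlu_sample, rlu_control):
    """Bioluminescence inhibition relative to the deionised-water control.

    rlu_sample:  light output with the biosensor in the toxic carrier
    rlu_control: light output with the biosensor in deionised water
    """
    return 100.0 * (1.0 - rlu_sample / rlu_control)

# e.g. a heavy-metal carrier that halves the light output
print(percent_inhibition(5000, 10000))  # -> 50.0
```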

  15. Fully automated high-performance liquid chromatographic assay for the analysis of free catecholamines in urine.

    PubMed

    Said, R; Robinet, D; Barbier, C; Sartre, J; Huguet, C

    1990-08-24

    A totally automated and reliable high-performance liquid chromatographic method is described for the routine determination of free catecholamines (norepinephrine, epinephrine and dopamine) in urine. The catecholamines were isolated from urine samples using small alumina columns. A standard automated method for pH adjustment of urine before the extraction step has been developed. The extraction was performed on an ASPEC (Automatic Sample Preparation with Extraction Columns, Gilson). The eluate was collected in a separate tube and then automatically injected into the chromatographic column. The catecholamines were separated by reversed-phase ion-pair liquid chromatography and quantified by fluorescence detection. No manual intervention was required during the extraction and separation procedure. One sample may be run every 15 min, ca. 96 samples in 24 h. Analytical recoveries for all three catecholamines are 63-87%, and the detection limits are 0.01, 0.01, and 0.03 microM for norepinephrine, epinephrine and dopamine, respectively, which is highly satisfactory for urine. Day-to-day coefficients of variation were less than 10%. PMID:2277100
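The day-to-day reproducibility figure quoted above is an ordinary coefficient of variation. A minimal sketch with hypothetical replicate values (the numbers are illustrative, not the paper's data):

```python
import statistics

def coefficient_of_variation(values):
    """Day-to-day CV (%) used to judge assay reproducibility."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# hypothetical day-to-day norepinephrine results (micromol/L)
cv = coefficient_of_variation([0.21, 0.20, 0.22, 0.19, 0.21])
# a CV under 10% matches the reproducibility the assay reports
```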

  16. Predicting non-small cell lung cancer prognosis by fully automated microscopic pathology image features

    PubMed Central

    Yu, Kun-Hsing; Zhang, Ce; Berry, Gerald J.; Altman, Russ B.; Ré, Christopher; Rubin, Daniel L.; Snyder, Michael

    2016-01-01

    Lung cancer is the most prevalent cancer worldwide, and histopathological assessment is indispensable for its diagnosis. However, human evaluation of pathology slides cannot accurately predict patients' prognoses. In this study, we obtain 2,186 haematoxylin and eosin stained histopathology whole-slide images of lung adenocarcinoma and squamous cell carcinoma patients from The Cancer Genome Atlas (TCGA), and 294 additional images from Stanford Tissue Microarray (TMA) Database. We extract 9,879 quantitative image features and use regularized machine-learning methods to select the top features and to distinguish shorter-term survivors from longer-term survivors with stage I adenocarcinoma (P<0.003) or squamous cell carcinoma (P=0.023) in the TCGA data set. We validate the survival prediction framework with the TMA cohort (P<0.036 for both tumour types). Our results suggest that automatically derived image features can predict the prognosis of lung cancer patients and thereby contribute to precision oncology. Our methods are extensible to histopathology images of other organs. PMID:27527408
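Regularized feature selection of the kind described can be sketched with a plain coordinate-descent lasso on synthetic data. This is a generic sketch, not the authors' exact pipeline over the 9,879 image features; the data, penalty, and settings below are illustrative.

```python
import numpy as np

def lasso_cd(X, y, lam, n_sweeps=200):
    """Coordinate-descent lasso: coefficients driven exactly to zero
    drop the corresponding features, which is what performs selection.
    """
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_sweeps):
        for j in range(p):
            # partial residual excluding feature j
            r = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r / n
            z = X[:, j] @ X[:, j] / n
            # soft-thresholding update
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / z
    return beta

rng = np.random.default_rng(1)
X = rng.normal(size=(120, 10))
y = 3.0 * X[:, 0] - 2.0 * X[:, 4] + rng.normal(scale=0.1, size=120)
beta = lasso_cd(X, y, lam=0.2)
selected = np.flatnonzero(np.abs(beta) > 1e-8)
# only the two truly informative features should survive the penalty
```

In the prognostic setting, the surviving features are the quantitative image features carried forward into the survival model.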

  18. Development and Validation of a Fully Automated Platform for Extended Blood Group Genotyping.

    PubMed

    Boccoz, Stephanie A; Le Goff, Gaelle C; Mandon, Celine A; Corgier, Benjamin P; Blum, Loïc J; Marquette, Christophe A

    2016-01-01

    Thirty-five blood group systems, containing >300 antigens, are listed by the International Society of Blood Transfusion. Most of these antigens result from a single-nucleotide polymorphism. Blood group typing is conventionally performed by serology. However, this technique has some limitations and cannot respond to the growing demand for blood products typed for a large number of antigens. Knowledge of the molecular basis of these red blood cell systems has allowed the implementation of molecular biology methods in immunohematology laboratories. Here, we describe a blood group genotyping assay, based on the use of a TKL immobilization support and microarray-based HIFI technology, that takes approximately 4 hours and 30 minutes from whole-blood samples to results analysis. Targets amplified by multiplex PCR were hybridized on the chip, and a revelation step allowed the simultaneous identification of up to 24 blood group antigens, leading to the determination of extended genotypes. Two multiplex PCR panels were developed: Panel 1 (KEL1/2, KEL3/4; JK1/2; FY1/2; MNS1/2, MNS3/4; FY*Fy and FY*X) and Panel 2 (YT1/2; CO1/2; DO1/2, HY+, Jo(a+); LU1/2; DI1/2). We present the results of the evaluation of our platform on panels of 583 and 190 blood donor samples for Panels 1 and 2, respectively. Good correlations (99% to 100%) with the reference method were obtained. PMID:26621100
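The final genotype-to-phenotype step can be sketched as a lookup from biallelic calls to predicted antigen phenotypes. The table below follows common ISBT antigen naming for a few of the listed systems, but it is illustrative, not the platform's actual decision table.

```python
# Hypothetical decision table from biallelic SNP calls to predicted
# red-cell antigen phenotypes (AA/AB/BB = hom. allele 1 / het. / hom. allele 2).
SNP_TO_PHENOTYPE = {
    "KEL1/2": {"AA": "K+k-", "AB": "K+k+", "BB": "K-k+"},
    "JK1/2":  {"AA": "Jk(a+b-)", "AB": "Jk(a+b+)", "BB": "Jk(a-b+)"},
    "FY1/2":  {"AA": "Fy(a+b-)", "AB": "Fy(a+b+)", "BB": "Fy(a-b+)"},
}

def predict_phenotypes(calls):
    """calls: dict mapping marker -> genotype call, e.g. {"KEL1/2": "AB"}."""
    return {marker: SNP_TO_PHENOTYPE[marker][call]
            for marker, call in calls.items()}

print(predict_phenotypes({"KEL1/2": "AB", "JK1/2": "BB"}))
# -> {'KEL1/2': 'K+k+', 'JK1/2': 'Jk(a-b+)'}
```

A real implementation must also encode the exceptions the panel names hint at (e.g. FY*Fy and FY*X silencing or weakening Fy(b) expression), which is why extended genotyping goes beyond a per-SNP lookup.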

  19. Protein Microarrays

    NASA Astrophysics Data System (ADS)

    Ricard-Blum, S.

    Proteins are key actors in the life of the cell, involved in many physiological and pathological processes. Since variations in the expression of messenger RNA are not systematically correlated with variations in the protein levels, the latter better reflect the way a cell functions. Protein microarrays thus supply complementary information to DNA chips. They are used in particular to analyse protein expression profiles, to detect proteins within complex biological media, and to study protein-protein interactions, which give information about the functions of those proteins [3-9]. They have the same advantages as DNA microarrays for high-throughput analysis, miniaturisation, and the possibility of automation. Section 18.1 gives a brief overview of proteins. Following this, Sect. 18.2 describes how protein microarrays can be made on flat supports, explaining how proteins can be produced and immobilised on a solid support, and discussing the different kinds of substrate and detection method. Section 18.3 discusses the particular format of protein microarrays in suspension. The diversity of protein microarrays and their applications are then reported in Sect. 18.4, with applications to therapeutics (protein-drug interactions) and diagnostics. The prospects for future developments of protein microarrays are then outlined in the conclusion. The bibliography provides an extensive list of reviews and detailed references for those readers who wish to go further in this area. Indeed, the aim of the present chapter is not to give an exhaustive or detailed analysis of the state of the art, but rather to provide the reader with the basic elements needed to understand how proteins are designed and used.

  20. Fully automated hybrid diode laser assembly using high precision active alignment

    NASA Astrophysics Data System (ADS)

    Böttger, Gunnar; Weber, Daniel; Scholz, Friedemann; Schröder, Henning; Schneider-Ramelow, Martin; Lang, Klaus-Dieter

    2016-03-01

    Fraunhofer IZM, Technische Universität Berlin and eagleyard Photonics present various implementations of current micro-optical assemblies for high quality free space laser beam forming and efficient fiber coupling. The laser modules shown are optimized for fast and automated assembly in small form factor packages via state-of-the-art active alignment machinery, using alignment and joining processes that have been developed and established in various industrial research projects. Operational wavelengths and optical powers ranging from 600 to 1600 nm and from 1 mW to several W respectively are addressed, for application in high-resolution laser spectroscopy, telecom and optical sensors, up to the optical powers needed in industrial and medical laser treatment.

  1. Fully automated and adaptive detection of amyloid plaques in stained brain sections of Alzheimer transgenic mice.

    PubMed

    Feki, Abdelmonem; Teboul, Olivier; Dubois, Albertine; Bozon, Bruno; Faure, Alexis; Hantraye, Philippe; Dhenain, Marc; Delatour, Benoit; Delzescaux, Thierry

    2007-01-01

    Automated detection of amyloid plaques (AP) in post mortem brain sections of patients with Alzheimer disease (AD), or in mouse models of the disease, is a major issue for improving the quantitative, standardized, and accurate assessment of neuropathological lesions as well as of their modulation by treatment. We propose a new segmentation method to automatically detect amyloid plaques in Congo Red stained sections, based on adaptive thresholds and a dedicated amyloid plaque/tissue model. A set of histological sections focusing on anatomical structures was used to validate the method against expert segmentation. Original information concerning global amyloid load has been derived from 6 mouse brains, which opens new perspectives for the extensive analysis of such data in 3-D and the possibility of integrating in vivo and post mortem information for diagnostic purposes. PMID:18044661
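Adaptive (locally referenced) thresholding of the kind the method builds on can be sketched with a local-mean rule computed via an integral image. This is a generic stand-in for the paper's dedicated plaque/tissue model; the block size, offset, and synthetic image are illustrative.

```python
import numpy as np

def adaptive_threshold(img, block=15, offset=5.0):
    """Flag a pixel when it exceeds the mean of its block x block
    neighbourhood by more than `offset` (edge-replicated borders).
    """
    pad = block // 2
    padded = np.pad(img, pad, mode="edge")
    # integral image -> O(1) local sums per pixel
    ii = np.pad(padded, ((1, 0), (1, 0))).cumsum(0).cumsum(1)
    h, w = img.shape
    s = (ii[block:block + h, block:block + w]
         - ii[:h, block:block + w]
         - ii[block:block + h, :w]
         + ii[:h, :w])
    local_mean = s / (block * block)
    return img > local_mean + offset

# bright "plaque" on a uniform background
tissue = np.full((20, 20), 10.0)
tissue[8:12, 8:12] = 50.0
mask = adaptive_threshold(tissue)
# only the 16 plaque pixels should be flagged
```

Referencing each pixel to its local neighbourhood, rather than a single global cutoff, is what keeps detection stable across staining and illumination variations between sections.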

  2. a Fully Automated Pipeline for Classification Tasks with AN Application to Remote Sensing

    NASA Astrophysics Data System (ADS)

    Suzuki, K.; Claesen, M.; Takeda, H.; De Moor, B.

    2016-06-01

    Deep learning has recently been in the spotlight owing to its victories at major competitions, which has pushed "shallow" machine learning methods (relatively simple algorithms commonly used by industrial engineers) into the background, despite their advantages such as the small amounts of training time and data they require. Taking a practical point of view, we used shallow learning algorithms to construct a learning pipeline that operators can use without specialist knowledge, an expensive computational environment, or a large amount of labelled data. The proposed pipeline automates the whole classification process, namely feature selection, feature weighting, and the selection of the most suitable classifier with optimized hyperparameters. The configuration uses particle swarm optimization, one of the best-known metaheuristic algorithms, for generally fast and fine-grained optimization; this enables us not only to optimize (hyper)parameters but also to determine the appropriate features and classifier for the problem, choices that have conventionally been made a priori from domain knowledge or handled with naive algorithms such as grid search. Through experiments with the MNIST and CIFAR-10 datasets, common computer-vision benchmarks for character recognition and object recognition respectively, our automated learning approach provides high performance considering its simple, non-specialized setting, small amount of training data, and practical learning time. Moreover, compared to deep learning, the performance stays robust almost without modification even on a remote sensing object recognition problem, which indicates that our approach is likely to contribute to general classification problems.
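The particle swarm optimization at the heart of such a pipeline can be sketched in a few lines. The inertia and acceleration constants below are common textbook values, not the paper's settings, and the sphere function stands in for the real cross-validation objective over features, classifier choice, and hyperparameters.

```python
import numpy as np

def pso(f, dim, n_particles=30, n_iter=200, seed=0):
    """Minimal particle swarm minimizer: each particle is pulled toward
    its own best position and the swarm's best position.
    """
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5.0, 5.0, (n_particles, dim))   # positions
    v = np.zeros((n_particles, dim))                 # velocities
    pbest = x.copy()                                 # personal bests
    pbest_f = np.array([f(xi) for xi in x])
    gbest = pbest[np.argmin(pbest_f)].copy()         # global best
    w, c1, c2 = 0.7, 1.5, 1.5    # inertia, cognitive, social weights
    for _ in range(n_iter):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        fx = np.array([f(xi) for xi in x])
        improved = fx < pbest_f
        pbest[improved] = x[improved]
        pbest_f[improved] = fx[improved]
        gbest = pbest[np.argmin(pbest_f)].copy()
    return gbest, float(pbest_f.min())

best, best_f = pso(lambda z: float(np.sum(z ** 2)), dim=3)
# the swarm should drive the sphere objective very close to zero
```

In the pipeline, the search space would mix continuous hyperparameters with encoded discrete choices (which features, which classifier), all scored by cross-validated accuracy instead of the sphere function.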

  3. A quality assurance framework for the fully automated and objective evaluation of image quality in cone-beam computed tomography

    SciTech Connect

    Steiding, Christian; Kolditz, Daniel; Kalender, Willi A.

    2014-03-15

    Purpose: Thousands of cone-beam computed tomography (CBCT) scanners for vascular, maxillofacial, neurological, and body imaging are in clinical use today, but there is no consensus on uniform acceptance and constancy testing for image quality (IQ) and dose yet. The authors developed a quality assurance (QA) framework for fully automated and time-efficient performance evaluation of these systems. In addition, the dependence of objective Fourier-based IQ metrics on direction and position in 3D volumes was investigated for CBCT. Methods: The authors designed a dedicated QA phantom 10 cm in length consisting of five compartments, each with a diameter of 10 cm, and an optional extension ring 16 cm in diameter. A homogeneous section of water-equivalent material allows measuring CT value accuracy, image noise and uniformity, and multidimensional global and local noise power spectra (NPS). For the quantitative determination of 3D high-contrast spatial resolution, the modulation transfer function (MTF) of centrally and peripherally positioned aluminum spheres was computed from edge profiles. Additional in-plane and axial resolution patterns were used to assess resolution qualitatively. The characterization of low-contrast detectability as well as CT value linearity and artifact behavior was tested by utilizing sections with soft-tissue-equivalent and metallic inserts. For an automated QA procedure, a phantom detection algorithm was implemented. All tests used in the dedicated QA program were initially verified in simulation studies and experimentally confirmed on a clinical dental CBCT system. Results: The automated IQ evaluation of volume data sets of the dental CBCT system was achieved with the proposed phantom requiring only one scan for the determination of all desired parameters. Typically, less than 5 min were needed for phantom set-up, scanning, and data analysis. 
Quantitative evaluation of system performance over time by comparison to previous examinations was also
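
    The MTF computation from edge profiles described above can be sketched as follows: differentiate the edge-spread function (ESF) to obtain the line-spread function (LSF), then take the normalized magnitude of its discrete Fourier transform. The ideal step edge below is a hypothetical input for illustration, not data from the phantom.

```python
import cmath

def mtf_from_edge(esf):
    """ESF -> LSF (finite difference) -> MTF (|DFT| normalized to 1 at DC)."""
    lsf = [esf[i + 1] - esf[i] for i in range(len(esf) - 1)]
    n = len(lsf)
    # Naive O(n^2) DFT magnitude, adequate for a short profile.
    mag = [abs(sum(lsf[j] * cmath.exp(-2j * cmath.pi * k * j / n)
                   for j in range(n)))
           for k in range(n)]
    return [m / mag[0] for m in mag]  # normalize so MTF(0) = 1

# An ideal step edge has an impulse LSF, hence a flat MTF of 1 at all frequencies.
esf = [0.0] * 8 + [1.0] * 8
mtf = mtf_from_edge(esf)
```

    A measured edge from the aluminum spheres would instead yield an MTF that rolls off with spatial frequency; resolution metrics such as MTF50 are then read off the curve.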

  4. A Fully Automated and Robust Method to Incorporate Stamping Data in Crash, NVH and Durability Analysis

    NASA Astrophysics Data System (ADS)

    Palaniswamy, Hariharasudhan; Kanthadai, Narayan; Roy, Subir; Beauchesne, Erwan

    2011-08-01

    Crash, NVH (Noise, Vibration, Harshness), and durability analyses are commonly deployed in structural CAE analysis for the mechanical design of components, especially in the automotive industry. Components manufactured by stamping constitute a major portion of the automotive structure. In CAE analysis they are modeled at a nominal state with uniform thickness and no residual stresses and strains. However, in reality the stamped components have non-uniformly distributed thickness and residual stresses and strains resulting from stamping. It is essential to consider the stamping information in CAE analysis to accurately model the behavior of the sheet metal structures under different loading conditions. Especially with the current emphasis on weight reduction by replacing conventional steels with aluminum and advanced high strength steels, it is imperative to avoid over-design. Considering this growing need in industry, a highly automated and robust method has been integrated within Altair HyperWorks® to initialize sheet metal components in CAE models with stamping data. This paper demonstrates this new feature and the influence of stamping data for a full car frontal crash analysis.

  5. Evaluation of Cross-Protocol Stability of a Fully Automated Brain Multi-Atlas Parcellation Tool

    PubMed Central

    Liang, Zifei; He, Xiaohai; Ceritoglu, Can; Tang, Xiaoying; Li, Yue; Kutten, Kwame S.; Oishi, Kenichi; Miller, Michael I.; Mori, Susumu; Faria, Andreia V.

    2015-01-01

    Brain parcellation tools based on multiple-atlas algorithms have recently emerged as a promising method with which to accurately define brain structures. When dealing with data from various sources, it is crucial that these tools are robust for many different imaging protocols. In this study, we tested the robustness of a multiple-atlas, likelihood fusion algorithm using Alzheimer’s Disease Neuroimaging Initiative (ADNI) data with six different protocols, comprising three manufacturers and two magnetic field strengths. The entire brain was parceled into five different levels of granularity. In each level, which defines a set of brain structures, ranging from eight to 286 regions, we evaluated the variability of brain volumes related to the protocol, age, and diagnosis (healthy or Alzheimer’s disease). Our results indicated that, with proper pre-processing steps, the impact of different protocols is minor compared to biological effects, such as age and pathology. A precise knowledge of the sources of data variation enables sufficient statistical power and ensures the reliability of an anatomical analysis when using this automated brain parcellation tool on datasets from various imaging protocols, such as clinical databases. PMID:26208327

  6. Fully Automated Field-Deployable Bioaerosol Monitoring System Using Carbon Nanotube-Based Biosensors.

    PubMed

    Kim, Junhyup; Jin, Joon-Hyung; Kim, Hyun Soo; Song, Wonbin; Shin, Su-Kyoung; Yi, Hana; Jang, Dae-Ho; Shin, Sehyun; Lee, Byung Yang

    2016-05-17

    Much progress has been made in the field of automated monitoring systems of airborne pathogens. However, they still lack the robustness and stability necessary for field deployment. Here, we demonstrate a bioaerosol automonitoring instrument (BAMI) specifically designed for the in situ capturing and continuous monitoring of airborne fungal particles. This was possible by developing highly sensitive and selective fungi sensors based on two-channel carbon nanotube field-effect transistors (CNT-FETs), followed by integration with a bioaerosol sampler, a Peltier cooler for receptor lifetime enhancement, and a pumping assembly for fluidic control. These four main components collectively cooperated with each other to enable the real-time monitoring of fungi. The two-channel CNT-FETs can detect two different fungal species simultaneously. The Peltier cooler effectively lowers the working temperature of the sensor device, resulting in extended sensor lifetime and receptor stability. The system performance was verified in both laboratory conditions and real residential areas. The system response was in accordance with reported fungal species distribution in the environment. Our system is versatile enough that it can be easily modified for the monitoring of other airborne pathogens. We expect that our system will expedite the development of hand-held and portable systems for airborne bioaerosol monitoring. PMID:27070239

  7. Grid-Competitive Residential and Commercial Fully Automated PV Systems Technology: Final technical Report, August 2011

    SciTech Connect

    Brown, Katie E.; Cousins, Peter; Culligan, Matt; Jonathan Botkin; DeGraaff, David; Bunea, Gabriella; Rose, Douglas; Bourne, Ben; Koehler, Oliver

    2011-08-26

    Under DOE's Technology Pathway Partnership program, SunPower Corporation developed turn-key, high-efficiency residential and commercial systems that are cost effective. Key program objectives include a reduction in LCOE values to 9-12 cents/kWh and 13-18 cents/kWh respectively for the commercial and residential markets. Target LCOE values for the commercial ground, commercial roof, and residential markets are 10, 11, and 13 cents/kWh. For this effort, SunPower collaborated with a variety of suppliers and partners to complete the tasks below. Subcontractors included: Solaicx, SiGen, Ribbon Technology, Dow Corning, Xantrex, Tigo Energy, and Solar Bridge. SunPower's TPP addressed nearly the complete PV value chain: from ingot growth through system deployment. Throughout the award period of performance, SunPower has made progress toward achieving these reduced costs through the development of 20%+ efficient modules, increased cell efficiency through the understanding of loss mechanisms and improved manufacturing technologies, novel module development, automated design tools and techniques, and reduced system development and installation time. Based on an LCOE assessment using NREL's Solar Advisor Model, SunPower achieved the 2010 target range, as well as progress toward 2015 targets.

  8. Fully automatic leaf characterisation in heterogeneous environment of plant growing automation

    NASA Astrophysics Data System (ADS)

    Chareyron, Gaël; Da Rugna, Jérôme; Darsch, Amaury

    2010-01-01

    In the last decade, we have seen a tremendous emergence of genome sequencing and analysis systems. These systems are limited by the ability to phenotype large numbers of plants under controlled environmental conditions. To overcome this limitation, it is desirable to use an automated system designed with controlled plant growth in mind. For each experimental sequence, many parameters are subject to variation: illuminant, plant size and color, humidity, and temperature, to name a few. These variations require the adjustment of classical plant detection algorithms. This paper presents an innovative automatic imaging scheme for characterising the growth of a plant's leaves. Given a plant growth sequence, the color histogram sequence can be used to detect day-to-day color variations and then to set the algorithm parameters accordingly. The main difficulty is accounting for the properties of the automaton, since the plant is not photographed at exactly the same position and angle each time. There is also an important evolution of the plant background, such as moss, which needs to be taken into account. Ground truth experiments on several complete sequences demonstrate the ability to identify the rosettes and to extract the plant characteristics whatever the culture conditions are.

  9. Evaluation of a Fully Automated Research Prototype for the Immediate Identification of Microorganisms from Positive Blood Cultures under Clinical Conditions

    PubMed Central

    Hyman, Jay M.; Walsh, John D.; Ronsick, Christopher; Wilson, Mark; Hazen, Kevin C.; Borzhemskaya, Larisa; Link, John; Clay, Bradford; Ullery, Michael; Sanchez-Illan, Mirta; Rothenberg, Steven; Robinson, Ron; van Belkum, Alex

    2016-01-01

    ABSTRACT A clinical laboratory evaluation of an intrinsic fluorescence spectroscopy (IFS)-based identification system paired to a BacT/Alert Virtuo microbial detection system (bioMérieux, Inc., Durham, NC) was performed to assess the potential for fully automated identification of positive blood cultures. The prototype IFS system incorporates a novel method combining a simple microbial purification procedure with rapid in situ identification via spectroscopy. Results were available within 15 min of a bottle signaling positive and required no manual intervention. Among cultures positive for organisms contained within the database and producing acceptable spectra, 75 of 88 (85.2%) and 79 of 88 (89.8%) were correctly identified to the species and genus level, respectively. These results are similar to the performance of existing rapid methods. PMID:27094332

  10. Fully automated detection of diabetic macular edema and dry age-related macular degeneration from optical coherence tomography images

    PubMed Central

    Srinivasan, Pratul P.; Kim, Leo A.; Mettu, Priyatham S.; Cousins, Scott W.; Comer, Grant M.; Izatt, Joseph A.; Farsiu, Sina

    2014-01-01

    We present a novel fully automated algorithm for the detection of retinal diseases via optical coherence tomography (OCT) imaging. Our algorithm utilizes multiscale histograms of oriented gradient descriptors as feature vectors of a support vector machine based classifier. The spectral domain OCT data sets used for cross-validation consisted of volumetric scans acquired from 45 subjects: 15 normal subjects, 15 patients with dry age-related macular degeneration (AMD), and 15 patients with diabetic macular edema (DME). Our classifier correctly identified 100% of cases with AMD, 100% cases with DME, and 86.67% cases of normal subjects. This algorithm is a potentially impactful tool for the remote diagnosis of ophthalmic diseases. PMID:25360373
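
    The feature extraction described above (multiscale HOG descriptors feeding an SVM) can be illustrated at its core with a single-scale, single-cell orientation histogram. This is a simplified sketch on a hypothetical image, not the authors' implementation, which uses multiscale descriptors over OCT B-scans.

```python
import math

def orientation_histogram(img, n_bins=8):
    """Magnitude-weighted histogram of gradient orientations for one cell."""
    h, w = len(img), len(img[0])
    hist = [0.0] * n_bins
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = img[y][x + 1] - img[y][x - 1]  # central differences
            gy = img[y + 1][x] - img[y - 1][x]
            mag = math.hypot(gx, gy)
            if mag == 0.0:
                continue
            ang = math.atan2(gy, gx) % math.pi   # unsigned orientation in [0, pi)
            b = min(int(ang / math.pi * n_bins), n_bins - 1)
            hist[b] += mag
    total = sum(hist) or 1.0
    return [v / total for v in hist]             # L1-normalized feature vector

# A vertical edge yields purely horizontal gradients, i.e. all mass in bin 0.
img = [[0.0] * 4 + [1.0] * 4 for _ in range(8)]
feat = orientation_histogram(img)
```

    A full HOG descriptor concatenates such histograms over a grid of cells at several scales; the resulting vectors are what the SVM classifier consumes.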

  11. A fully integrated and automated microsystem for rapid pharmacogenetic typing of multiple warfarin-related single-nucleotide polymorphisms.

    PubMed

    Zhuang, Bin; Han, Junping; Xiang, Guangxin; Gan, Wupeng; Wang, Shuaiqin; Wang, Dong; Wang, Lei; Sun, Jing; Li, Cai-Xia; Liu, Peng

    2016-01-01

    A fully integrated and automated microsystem consisting of low-cost, disposable plastic chips for DNA extraction and PCR amplification combined with a reusable glass capillary array electrophoresis chip in a modular-based format was successfully developed for warfarin pharmacogenetic testing. DNA extraction was performed by adopting a filter paper-based method, followed by "in situ" PCR that was carried out directly in the same reaction chamber of the chip without elution. PCR products were then co-injected with sizing standards into separation channels for detection using a novel injection electrode. The entire process was automatically conducted on a custom-made compact control and detection instrument. The limit of detection of the microsystem for the singleplex amplification of amelogenin was determined to be 0.625 ng of standard K562 DNA and 0.3 μL of human whole blood. A two-color multiplex allele-specific PCR assay for detecting the warfarin-related single-nucleotide polymorphisms (SNPs) 6853 (-1639G>A) and 6484 (1173C>T) in the VKORC1 gene and the *3 SNP (1075A>C) in the CYP2C9 gene was developed and used for validation studies. The fully automated genetic analysis was completed in two hours with a minimum requirement of 0.5 μL of input blood. Samples from patients with different genotypes were all accurately analyzed. In addition, both dried bloodstains and oral swabs were successfully processed by the microsystem with a simple modification to the DNA extraction and amplification chip. The successful development and operation of this microsystem establish the feasibility of rapid warfarin pharmacogenetic testing in routine clinical practice. PMID:26568290

  12. “Smart” RCTs: Development of a Smartphone App for Fully Automated Nutrition-Labeling Intervention Trials

    PubMed Central

    Li, Nicole; Dunford, Elizabeth; Eyles, Helen; Crino, Michelle; Michie, Jo; Ni Mhurchu, Cliona

    2016-01-01

    Background There is substantial interest in the effects of nutrition labels on consumer food-purchasing behavior. However, conducting randomized controlled trials on the impact of nutrition labels in the real world presents a significant challenge. Objective The Food Label Trial (FLT) smartphone app was developed to enable conducting fully automated trials, delivering intervention remotely, and collecting individual-level data on food purchases for two nutrition-labeling randomized controlled trials (RCTs) in New Zealand and Australia. Methods Two versions of the smartphone app were developed: one for a 5-arm trial (Australian) and the other for a 3-arm trial (New Zealand). The RCT protocols guided requirements for app functionality, that is, obtaining informed consent, two-stage eligibility check, questionnaire administration, randomization, intervention delivery, and outcome assessment. Intervention delivery (nutrition labels) and outcome data collection (individual shopping data) used the smartphone camera technology, where a barcode scanner was used to identify a packaged food and link it with its corresponding match in a food composition database. Scanned products were either recorded in an electronic list (data collection mode) or allocated a nutrition label on screen if matched successfully with an existing product in the database (intervention delivery mode). All recorded data were transmitted to the RCT database hosted on a server. Results In total approximately 4000 users have downloaded the FLT app to date; 606 (Australia) and 1470 (New Zealand) users met the eligibility criteria and were randomized. Individual shopping data collected by participants currently comprise more than 96,000 (Australia) and 229,000 (New Zealand) packaged food and beverage products. Conclusions The FLT app is one of the first smartphone apps to enable conducting fully automated RCTs. Preliminary app usage statistics demonstrate large potential of such technology, both for

  13. A randomised control study of a fully automated internet based smoking cessation programme

    PubMed Central

    Swartz, L H G; Noell, J W; Schroeder, S W; Ary, D V

    2006-01-01

    Objective The objective of this project was to test the short term (90 days) efficacy of an automated behavioural intervention for smoking cessation, the “1‐2‐3 Smokefree” programme, delivered via an internet website. Design Randomised control trial. Subjects surveyed at baseline, immediately post‐intervention, and 90 days later. Settings The study and the intervention occurred entirely via the internet site. Subjects were recruited primarily via worksites, which referred potential subjects to the website. Subjects The 351 qualifying subjects were notified of the study via their worksite and required to have internet access. Additionally, subjects were required to be over 18 years of age, smoke cigarettes, and be interested in quitting smoking in the next 30 days. Eligible subjects were randomly assigned individually to treatment or control condition by computer algorithm. Intervention The intervention consisted of a video based internet site that presented current strategies for smoking cessation and motivational materials tailored to the user's race/ethnicity, sex, and age. Control subjects received nothing for 90 days and were then allowed access to the programme. Main outcome measures The primary outcome measure was abstinence from smoking at 90 day follow up. Results At follow up, the cessation rate at 90 days was 24.1% (n  =  21) for the treatment group and 8.2% (n  =  9) for the control group (p  =  0.002). Using an intent‐to‐treat model, 12.3% (n  =  21) of the treatment group were abstinent, compared to 5.0% (n  =  9) in the control group (p  =  0.015). Conclusions These evaluation results suggest that a smoking cessation programme, with at least short term efficacy, can be successfully delivered via the internet. PMID:16436397
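
    The two analyses reported above differ only in their denominators. Assuming arm sizes back-calculated from the reported percentages (171 treatment and 180 control randomized, summing to the 351 subjects; 87 and 110 providing 90-day follow-up data), the figures can be reproduced:

```python
def pct(numer, denom):
    """Percentage rounded to one decimal place, as reported in the abstract."""
    return round(100.0 * numer / denom, 1)

# Intent-to-treat: everyone randomized stays in the denominator.
itt_treatment = pct(21, 171)   # assumed arm size; 171 + 180 = 351 randomized
itt_control = pct(9, 180)
# Follow-up-only analysis: just the subjects who completed the 90-day survey.
fu_treatment = pct(21, 87)     # assumed completer counts
fu_control = pct(9, 110)
```

    The intent-to-treat rates are lower because subjects lost to follow-up are conservatively counted as still smoking.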

  14. Analytical and Clinical Comparison of Two Fully Automated Immunoassay Systems for the Diagnosis of Celiac Disease

    PubMed Central

    Norman, Gary L.; Santora, Debby; Fasano, Alessio

    2014-01-01

    Objective. Here we compared analytical and clinical performance characteristics of two novel automated assay systems for the detection of celiac disease (CD) specific antibodies: QUANTA Flash (INOVA Diagnostics, Inc.) and EliA (Thermo Scientific). Methods. A total of 74 biopsy-proven CD patients (2 with IgA deficiency) and 138 controls were tested by both methods. Results. Sensitivities of QUANTA Flash assays ranged from 35.1% to 90.5% and specificities from 96.4% to 99.3%, while sensitivities for EliA assays ranged from 37.8% to 90.5% (equivocal considered positive) and specificities from 97.1% to 100.0%. Good qualitative agreement was found between all assays. Thirty-four (50.0%) of the 68 QUANTA Flash h-tTG IgA positive results were higher than 10 times the upper limit of normal (ULN). In contrast, only 22.8% of the EliA tTG IgA positive samples were >10x ULN. Seventy-three (98.6%) biopsy-proven CD patients were correctly identified with the QUANTA Flash h-tTG IgA+DGP IgG combination, while 64 (86.5%) and 72 (97.3%) (depending on equivocal range) were identified with the same combination of EliA assays. Conclusion. The QUANTA Flash CD assays have outstanding clinical performance. Of particular clinical significance, in light of proposals to decrease the absolute necessity of biopsy, was the demonstration that 50% of the QUANTA Flash h-tTG IgA results were >10x ULN. PMID:24741592

  15. Fully automated screening of immunocytochemically stained specimens for early cancer detection

    NASA Astrophysics Data System (ADS)

    Bell, André A.; Schneider, Timna E.; Müller-Frank, Dirk A. C.; Meyer-Ebrecht, Dietrich; Böcking, Alfred; Aach, Til

    2007-03-01

    Cytopathological cancer diagnoses can be obtained less invasively than histopathological investigations: cell-containing specimens can be collected without pain or discomfort, bloody biopsies are avoided, and in some cases the diagnosis can even be made earlier. Since no tissue biopsies are necessary, these methods can also be used in screening applications, e.g., for cervical cancer. Among the cytopathological methods, diagnosis based on the analysis of the amount of DNA in individual cells achieves high sensitivity and specificity, yet this analysis is time consuming, which is prohibitive for a screening application. It is therefore advantageous to retain, through a preceding selection step, only a subset of suspicious specimens. This can be achieved using highly sensitive immunocytochemical markers such as p16 ink4a for the preselection of suspicious cells and specimens. We present a method to fully automatically acquire images at distinct positions on cytological specimens using a conventional computer-controlled microscope and an autofocus algorithm. On the images thus obtained, we automatically detect p16 ink4a-positive objects. This detection is based on an analysis of the color distribution of the p16 ink4a marker in the Lab-colorspace, described by a Gaussian mixture model; the method described in this paper so far achieves a sensitivity of up to 90%.
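
    The Gaussian mixture model of marker color mentioned above can be sketched in one dimension with a minimal EM fit. The two synthetic clusters below are hypothetical stand-ins for background and stained pixel values; the paper models the full color distribution in the Lab-colorspace.

```python
import math, random

def em_gmm_1d(data, iters=50):
    """EM fit of a two-component 1D Gaussian mixture; returns (weights, means, stds)."""
    mu = [min(data), max(data)]   # crude but effective initialization
    sd = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point.
        resp = []
        for x in data:
            p = [pi[k] / (sd[k] * math.sqrt(2.0 * math.pi))
                 * math.exp(-0.5 * ((x - mu[k]) / sd[k]) ** 2) for k in (0, 1)]
            s = p[0] + p[1]
            resp.append([p[0] / s, p[1] / s] if s > 0.0 else [0.5, 0.5])
        # M-step: re-estimate mixture weights, means, and standard deviations.
        for k in (0, 1):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk
            sd[k] = max(math.sqrt(var), 1e-3)   # floor to avoid degenerate components
    return pi, mu, sd

# Two well-separated synthetic clusters standing in for background vs. stained pixels.
rnd = random.Random(1)
data = ([rnd.gauss(0.0, 0.5) for _ in range(200)]
        + [rnd.gauss(5.0, 0.5) for _ in range(200)])
pi, mu, sd = em_gmm_1d(data)
```

    Once fitted, a pixel is flagged as marker-positive when the stained component's posterior responsibility for its color exceeds a threshold.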

  16. Semantic Focusing Allows Fully Automated Single-Layer Slide Scanning of Cervical Cytology Slides

    PubMed Central

    Lahrmann, Bernd; Valous, Nektarios A.; Eisenmann, Urs; Wentzensen, Nicolas; Grabe, Niels

    2013-01-01

    Liquid-based cytology (LBC) in conjunction with Whole-Slide Imaging (WSI) enables the objective, sensitive, and quantitative evaluation of biomarkers in cytology. However, the complex three-dimensional distribution of cells on LBC slides requires manual focusing, long scanning times, and multi-layer scanning. Here, we present a solution that overcomes these limitations in two steps: first, we ensure that focus points are only set on cells; second, we check the total slide focus quality. From a first analysis we found that superficial dust can be separated from the cell layer (the thin layer of cells on the glass slide) itself. We then analyzed 2,295 individual focus points from 51 LBC slides stained for p16 and Ki67. Using the number of edges in a focus point image, specific color values, and size-inclusion filters, focus points detecting cells could be distinguished from focus points on artifacts (accuracy 98.6%). Sharpness, as the total focus quality of a virtual LBC slide, is computed from 5 sharpness features. We trained a multi-parameter SVM classifier on 1,600 images. On an independent validation set of 3,232 cell images we achieved an accuracy of 94.8% for classifying images as focused. Our results show that single-layer scanning of LBC slides is possible and how it can be achieved. We assembled focus point analysis and sharpness classification into a fully automatic, iterative workflow, free of user intervention, which performs repetitive slide scanning as necessary. On 400 LBC slides we achieved a scanning time of 13.9±10.1 min with 29.1±15.5 focus points. In summary, the integration of semantic focus information into whole-slide imaging allows automatic high-quality imaging of LBC slides and subsequent biomarker analysis. PMID:23585899

  17. A preliminary study for fully automated quantification of psoriasis severity using image mapping

    NASA Astrophysics Data System (ADS)

    Mukai, Kazuhiro; Iyatomi, Hitoshi

    2014-03-01

    Psoriasis is a common chronic skin disease that seriously detracts from patients' quality of life. Since there is as yet no known permanent cure, keeping the disease in an appropriate condition is necessary, and quantifying its severity is therefore important. In clinical practice, the psoriasis area and severity index (PASI) is commonly used for this purpose, but it is often subjective and laborious. A fully automatic computer-assisted area and severity index (CASI) has been proposed for the objective quantification of skin disease. It assesses the size and density of erythema by digital image analysis, but it does not account for the various confounding effects of differing geometrical conditions during clinical follow-up (i.e., variability in the direction and distance between camera and patient). In this study, we proposed an image alignment method for clinical images and investigated quantifying the severity of psoriasis during clinical follow-up by combining it with the idea of CASI. The proposed method finds geometrically corresponding points on the patient's body (ROI) between images using the Scale Invariant Feature Transform (SIFT) and performs an affine transform to map pixel values from one image to the other. Clinical images from 7 patients with psoriasis lesions on the trunk, taken during clinical follow-up, were used. Within each series, our algorithm aligns the images to the geometry of the first image. The proposed method aligned images appropriately on visual assessment, and we confirmed that psoriasis areas were properly extracted using the CASI approach. Although PASI and CASI cannot be evaluated against each other directly because they define the ROI differently, we confirmed a strong correlation between the two scores under our image quantification method.
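
    Once correspondences are found, the SIFT-plus-affine mapping described above reduces to fitting six affine parameters. Below is a minimal sketch from exactly three hypothetical correspondences via Cramer's rule; in practice one would fit many SIFT matches by least squares or RANSAC to tolerate mismatches.

```python
def fit_affine(src, dst):
    """Fit x' = a*x + b*y + c and y' = d*x + e*y + f from exactly 3 point pairs."""
    (x1, y1), (x2, y2), (x3, y3) = src
    det = x1 * (y2 - y3) - y1 * (x2 - x3) + (x2 * y3 - x3 * y2)

    def solve(v1, v2, v3):
        # Cramer's rule on the system [x_i, y_i, 1] . [a, b, c] = v_i.
        a = (v1 * (y2 - y3) - y1 * (v2 - v3) + (v2 * y3 - v3 * y2)) / det
        b = (x1 * (v2 - v3) - v1 * (x2 - x3) + (x2 * v3 - x3 * v2)) / det
        c = (x1 * (y2 * v3 - y3 * v2) - y1 * (x2 * v3 - x3 * v2)
             + v1 * (x2 * y3 - x3 * y2)) / det
        return a, b, c

    return solve(*[p[0] for p in dst]), solve(*[p[1] for p in dst])

def apply_affine(params, pt):
    """Map a point through the fitted transform."""
    (a, b, c), (d, e, f) = params
    x, y = pt
    return a * x + b * y + c, d * x + e * y + f

# Hypothetical correspondences realizing a 90-degree rotation plus translation:
# x' = -y + 2, y' = x + 1.
src = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
dst = [(2.0, 1.0), (2.0, 2.0), (1.0, 1.0)]
params = fit_affine(src, dst)
```

    With the transform in hand, every pixel of a follow-up image can be mapped into the geometry of the first image of the series before the CASI-style area measurement.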

  18. Comparison of fully and semi-automated area-based methods for measuring mammographic density and predicting breast cancer risk

    PubMed Central

    Sovio, U; Li, J; Aitken, Z; Humphreys, K; Czene, K; Moss, S; Hall, P; McCormack, V; dos-Santos-Silva, I

    2014-01-01

    Background: Mammographic density is a strong risk factor for breast cancer but the lack of valid fully automated methods for quantifying it has precluded its use in clinical and screening settings. We compared the performance of a recently developed automated approach, based on the public domain ImageJ programme, to the well-established semi-automated Cumulus method. Methods: We undertook a case-control study within the intervention arm of the Age Trial, in which ∼54 000 British women were offered annual mammography at ages 40–49 years. A total of 299 breast cancer cases diagnosed during follow-up and 422 matched (on screening centre, date of birth and dates of screenings) controls were included. Medio-lateral oblique (MLO) images taken closest to age 41 and at least one year before the index case's diagnosis were digitised for each participant. Cumulus readings were performed in the left MLO and ImageJ-based readings in both left and right MLOs. Conditional logistic regression was used to examine density–breast cancer associations. Results: The association between density readings taken from one single MLO and breast cancer risk was weaker for the ImageJ-based method than for Cumulus (age–body mass index-adjusted odds ratio (OR) per one s.d. increase in percent density (95% CI): 1.52 (1.24–1.86) and 1.61 (1.33–1.94), respectively). The ImageJ-based density–cancer association strengthened when the mean of left–right MLO readings was used: OR=1.61 (1.31–1.98). Conclusions: The mean of left–right MLO readings yielded by the ImageJ-based method was as strong a predictor of risk as Cumulus readings from a single MLO image. The ImageJ-based method, using the mean of two measurements, is a valid automated alternative to Cumulus for measuring density in analogue films. PMID:24556624

  19. Fully automated segmentation of the cervical cord from T1-weighted MRI using PropSeg: Application to multiple sclerosis☆

    PubMed Central

    Yiannakas, Marios C.; Mustafa, Ahmed M.; De Leener, Benjamin; Kearney, Hugh; Tur, Carmen; Altmann, Daniel R.; De Angelis, Floriana; Plantone, Domenico; Ciccarelli, Olga; Miller, David H.; Cohen-Adad, Julien; Gandini Wheeler-Kingshott, Claudia A.M.

    2015-01-01

    Spinal cord (SC) atrophy, i.e. a reduction in the SC cross-sectional area (CSA) over time, can be measured by means of image segmentation using magnetic resonance imaging (MRI). However, segmentation methods have been limited by factors relating to reproducibility or sensitivity to change. The purpose of this study was to evaluate a fully automated SC segmentation method (PropSeg), and compare this to a semi-automated active surface (AS) method, in healthy controls (HC) and people with multiple sclerosis (MS). MRI data from 120 people were retrospectively analysed; 26 HC, 21 with clinically isolated syndrome, 26 relapsing remitting MS, 26 primary and 21 secondary progressive MS. MRI data from 40 people returning after one year were also analysed. CSA measurements were obtained within the cervical SC. Reproducibility of the measurements was assessed using the intraclass correlation coefficient (ICC). A comparison between mean CSA changes obtained with the two methods over time was performed using multivariate structural equation regression models. Associations between CSA measures and clinical scores were investigated using linear regression models. Compared to the AS method, the reproducibility of CSA measurements obtained with PropSeg was high, both in patients and in HC, with ICC > 0.98 in all cases. There was no significant difference between PropSeg and AS in terms of detecting change over time. Furthermore, PropSeg provided measures that correlated with physical disability, similar to the AS method. PropSeg is a time-efficient and reliable segmentation method, which requires no manual intervention, and may facilitate large multi-centre neuroprotective trials in progressive MS. PMID:26793433

  20. A fully automated multi-modal computer aided diagnosis approach to coronary calcium scoring of MSCT images

    NASA Astrophysics Data System (ADS)

    Wu, Jing; Ferns, Gordon; Giles, John; Lewis, Emma

    2012-03-01

    Inter- and intra-observer variability is a common problem when an expert is tasked with assessing the severity of a disease. The issue is keenly felt in coronary calcium scoring of patients with atherosclerosis, where in clinical practice the observer must identify first the presence, and then the location, of candidate calcified plaques within the coronary arteries that may obstruct oxygenated blood flow to the heart muscle. It can, however, be difficult for a human observer to differentiate calcified plaques located in the coronary arteries from those in surrounding anatomy such as the mitral valve or pericardium. In addition to the benefits to scoring accuracy, fast, low-dose multi-slice CT imaging can acquire the entire heart within a single breath hold, exposing the patient to a lower radiation dose, which is beneficial for a progressive disease such as atherosclerosis where multiple scans may be required. Presented here is a fully automated method for calcium scoring using both the traditional Agatston method and the volume scoring method. Unwanted regions of the cardiac image slices, such as lungs, ribs, and vertebrae, are eliminated using adaptive heart isolation; such regions cannot contain calcified plaques but can be of similar intensity, and their removal aids detection. Both the ascending and descending aortas, which contain clinically insignificant plaques, are removed before the final calcium scores are calculated and examined against ground truth scores averaged from three expert observers. The results presented here are intended to show the feasibility of, and requirement for, an automated scoring method that reduces the subjectivity and reproducibility error inherent in manual clinical calcium scoring.
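
    The traditional Agatston method mentioned above weights each lesion's area by its peak attenuation in Hounsfield units. A minimal sketch using the standard HU thresholds follows; the clinical score is computed per slice from lesions above a minimum area, details omitted here, and the lesion list below is hypothetical.

```python
def agatston_weight(peak_hu):
    """Standard Agatston density weights: 130-199 HU -> 1, 200-299 -> 2,
    300-399 -> 3, >= 400 -> 4; below the 130 HU threshold a lesion scores 0."""
    if peak_hu < 130:
        return 0
    if peak_hu < 200:
        return 1
    if peak_hu < 300:
        return 2
    if peak_hu < 400:
        return 3
    return 4

def agatston_score(lesions):
    """Sum over lesions of (area in mm^2) * density weight.

    lesions: iterable of (area_mm2, peak_hu) tuples."""
    return sum(area * agatston_weight(peak) for area, peak in lesions)

# Two hypothetical plaques: 10 mm^2 peaking at 250 HU, 4 mm^2 peaking at 420 HU.
score = agatston_score([(10.0, 250), (4.0, 420)])
```

    The volume score mentioned in the abstract instead sums voxel volumes above the calcification threshold, making it less sensitive to the peak-intensity step function.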

  1. Fully automated molecular biology: Plasmid-Based Functional Proteomic Workcell Evaluation and Characterization of Yeast Strains with Optimized "Trojan Horse" Amino Acid Scanning Mutational Inserts.

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The optimization of various genes is important in cellulosic fuel ethanol production from S. cerevisiae to meet the rapidly expanding need for ethanol derived from hemicellulosic materials. The United States Department of Agriculture has developed a fully automated platform for molecular biology ro...

  2. Fully automated molecular biology routines on a plasmid-based functional proteomic workcell: Evaluation and Characterization of Yeast Strains Optimized for Growth on Xylose Expressing "Stealth" Insecticidal Peptides.

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Optimization of genes important to production of fuel ethanol from hemicellulosic biomass for use in developing improved commercial yeast strains is necessary to meet the rapidly expanding need for ethanol. The United States Department of Agriculture has developed a fully automated platform for mol...

  3. Active shape models for a fully automated 3D segmentation of the liver--an evaluation on clinical data.

    PubMed

    Heimann, Tobias; Wolf, Ivo; Meinzer, Hans-Peter

    2006-01-01

    This paper presents an evaluation of the performance of a three-dimensional Active Shape Model (ASM) to segment the liver in 48 clinical CT scans. The employed shape model is built from 32 samples using an optimization approach based on the minimum description length (MDL). Three different gray-value appearance models (plain intensity, gradient and normalized gradient profiles) are created to guide the search. The employed segmentation techniques are ASM search with 10 and 30 modes of variation and a deformable model coupled to a shape model with 10 modes of variation. To assess the segmentation performance, the obtained results are compared to manual segmentations with four different measures (overlap, average distance, RMS distance and ratio of deviations larger than 5 mm). The only appearance model delivering usable results is the normalized gradient profile. The deformable model search achieves the best results, followed by the ASM search with 30 modes. Overall, statistical shape modeling delivers very promising results for a fully automated segmentation of the liver. PMID:17354754
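
    The four comparison measures named above (overlap, average distance, RMS distance, and the ratio of deviations larger than 5 mm) can be sketched as follows. This is an illustrative reading of those measures, not the authors' code, and the brute-force nearest-point search is only practical for small point sets:

```python
import math

def volumetric_overlap(a, b):
    """Dice-style overlap between two voxel sets (e.g. sets of (x, y) or
    (x, y, z) coordinates): 2|A∩B| / (|A| + |B|)."""
    return 2.0 * len(a & b) / (len(a) + len(b))

def surface_distances(auto_pts, manual_pts):
    """Distance from each automatic surface point to the nearest manual
    surface point (brute force, adequate for a sketch)."""
    return [min(math.dist(p, q) for q in manual_pts) for p in auto_pts]

def distance_measures(auto_pts, manual_pts, threshold_mm=5.0):
    """Average distance, RMS distance, and fraction of deviations
    larger than `threshold_mm`."""
    d = surface_distances(auto_pts, manual_pts)
    avg = sum(d) / len(d)
    rms = math.sqrt(sum(x * x for x in d) / len(d))
    frac_large = sum(x > threshold_mm for x in d) / len(d)
    return avg, rms, frac_large
```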

  4. Fully automated spatially resolved reflectance spectrometer for the determination of the absorption and scattering in turbid media

    NASA Astrophysics Data System (ADS)

    Foschum, F.; Jäger, M.; Kienle, A.

    2011-10-01

    We describe a fully automated setup based on measurements of the spatially resolved reflectance for the determination of the reduced scattering and absorption coefficients in semi-infinite turbid media. The sample is illuminated with a xenon light source in combination with a monochromator, enabling wavelength scans from 450 nm to 950 nm. Reflected light from the sample is detected with a CCD camera, providing high spatial resolution. The essential steps for signal processing, including, e.g., consideration of the optical transfer function and correct treatment of the background subtraction, are presented. The solutions of diffusion theory and of radiative transfer theory are investigated with regard to the exact detection and illumination geometry. Systematic errors caused by using the different theories for fitting the optical parameters are characterized. The system was validated using liquid phantoms containing Intralipid 20% and ink, and the measurement range of the system is specified. Further, we carefully characterized the optical properties of Intralipid 20% in the wavelength range between 450 nm and 950 nm.

  5. Detection of motile micro-organisms in biological samples by means of a fully automated image processing system

    NASA Astrophysics Data System (ADS)

    Alanis, Elvio; Romero, Graciela; Alvarez, Liliana; Martinez, Carlos C.; Hoyos, Daniel; Basombrio, Miguel A.

    2001-08-01

    A fully automated image processing system for the detection of motile microorganisms in biological samples is presented. The system is specifically calibrated for determining the concentration of Trypanosoma cruzi parasites in blood samples of mice infected with Chagas disease, and the method can be adapted for use with other biological samples. A thin layer of blood infected by T. cruzi parasites is examined in a common microscope, in which images of the field of view are taken by a CCD camera and temporarily stored in computer memory. In a typical field, a few motile parasites are observable surrounded by red blood cells. The parasites have low contrast and are thus difficult to detect visually, but their great motility betrays their presence through the movement of the neighboring red cells. Several consecutive images of the same field are taken, decorrelated with each other where parasites are present, and digitally processed in order to measure the number of parasites present in the field. Several fields are sequentially processed in the same fashion, displacing the sample by means of step motors driven by the computer. A direct advantage of this system is that its results are more reliable and the process less time consuming than the current subjective visual evaluations made by technicians.
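
    The decorrelation idea — motile parasites make pixel intensities vary across consecutive images of an otherwise static field — can be illustrated with a simple range-threshold mask over a frame stack. This is a hypothetical sketch, not the published system:

```python
# Hedged sketch of frame-differencing motion detection: pixels whose
# intensity varies strongly across a stack of consecutive frames are
# flagged as candidate motion (e.g. a motile parasite jostling red cells).

def motion_mask(frames, threshold):
    """frames: list of equal-sized 2D grids (lists of lists of intensities).
    Returns a binary mask flagging pixels whose max-min intensity range
    across the frame stack exceeds `threshold`."""
    rows, cols = len(frames[0]), len(frames[0][0])
    mask = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            vals = [f[r][c] for f in frames]
            if max(vals) - min(vals) > threshold:
                mask[r][c] = 1
    return mask
```

    In a real system the mask would be cleaned up (e.g. by connected-component filtering) before counting moving objects.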

  6. Fully automated Liquid Extraction-Based Surface Sampling and Ionization Using a Chip-Based Robotic Nanoelectrospray Platform

    SciTech Connect

    Kertesz, Vilmos; Van Berkel, Gary J

    2010-01-01

    A fully automated liquid extraction-based surface sampling device utilizing an Advion NanoMate chip-based infusion nanoelectrospray ionization system is reported. Analyses were enabled for discrete spot sampling by using the Advanced User Interface of the current commercial control software. This software interface provided the parameter control necessary for the NanoMate robotic pipettor to both form and withdraw a liquid microjunction for sampling from a surface. The system was tested with three types of analytically important sample surface types, viz., spotted sample arrays on a MALDI plate, dried blood spots on paper, and whole-body thin tissue sections from drug dosed mice. The qualitative and quantitative data were consistent with previous studies employing other liquid extraction-based surface sampling techniques. The successful analyses performed here utilized the hardware and software elements already present in the NanoMate system developed to handle and analyze liquid samples. Implementation of an appropriate sample (surface) holder, a solvent reservoir, faster movement of the robotic arm, finer control over solvent flow rate when dispensing and retrieving the solution at the surface, and the ability to select any location on a surface to sample from would improve the analytical performance and utility of the platform.

  7. Development of a Fully Automated, GPS Based Monitoring System for Disaster Prevention and Emergency Preparedness: PPMS+RT

    PubMed Central

    Bond, Jason; Kim, Don; Chrzanowski, Adam; Szostak-Chrzanowski, Anna

    2007-01-01

    The increasing number of structural collapses, slope failures and other natural disasters has led to a demand for new sensors, sensor integration techniques and data processing strategies for deformation monitoring systems. In order to meet the extraordinary accuracy requirements for displacement detection in recent deformation monitoring projects, research has been devoted to integrating the Global Positioning System (GPS) as a monitoring sensor. Although GPS has been used for monitoring purposes worldwide, certain environments pose challenges where conventional processing techniques cannot provide the required accuracy with sufficient update frequency. Described here is the development of a fully automated, continuous, real-time monitoring system that employs GPS sensors and pseudolite technology to meet these requirements in such environments. Ethernet and/or serial port communication techniques are used to transfer data between GPS receivers at target points and a central processing computer. The data can be processed locally or remotely, based upon client needs. A test illustrated that a 10 mm displacement at a target point could be remotely detected using the designed system. This information could then be used to signal an alarm if conditions are deemed unsafe.
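
    The displacement-alarm logic described above reduces to comparing the distance between a surveyed baseline position and the latest GPS solution against a safety threshold. A minimal sketch (function names and the threshold convention are our assumptions, not the PPMS+RT design):

```python
import math

def displacement(baseline, current):
    """3D displacement (same units as input) between a surveyed baseline
    position and the latest GPS solution, both (x, y, z) tuples."""
    return math.dist(baseline, current)

def check_alarm(baseline, current, threshold):
    """Flag an alarm when the displacement exceeds the safety threshold."""
    return displacement(baseline, current) > threshold
```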

  8. Fully-automated in-syringe dispersive liquid-liquid microextraction for the determination of caffeine in coffee beverages.

    PubMed

    Frizzarin, Rejane M; Maya, Fernando; Estela, José M; Cerdà, Víctor

    2016-12-01

    A novel fully-automated magnetic stirring-assisted lab-in-syringe analytical procedure has been developed for the fast and efficient dispersive liquid-liquid microextraction (DLLME) of caffeine in coffee beverages. The procedure is based on the microextraction of caffeine with a minute amount of dichloromethane, isolating caffeine from the sample matrix with no further sample pretreatment. Relevant extraction parameters, such as the dispersive solvent, the aqueous/organic phase ratio, pH, and flow rates, have been carefully evaluated. Caffeine quantification was linear from 2 to 75 mg L(-1), with detection and quantification limits of 0.46 mg L(-1) and 1.54 mg L(-1), respectively. A coefficient of variation of 2.1% (n=8; 5 mg L(-1)) and a sampling rate of 16 h(-1) were obtained. The procedure was satisfactorily applied to the determination of caffeine in brewed, instant and decaffeinated coffee samples, with the results validated by high-performance liquid chromatography. PMID:27374593
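
    Detection and quantification limits of the kind quoted above are conventionally derived from a linear calibration as 3.3 and 10 times the blank (or residual) standard deviation divided by the calibration slope. A sketch under that common convention (not necessarily the exact formulas used in the paper):

```python
# Sketch of calibration-based figures of merit:
# LOD = 3.3 * s / b and LOQ = 10 * s / b, where s is the standard
# deviation of the blank and b the calibration slope.

def linear_fit(x, y):
    """Ordinary least-squares slope and intercept for a calibration line."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return b, a

def lod_loq(blank_sd, slope):
    """Detection and quantification limits in concentration units."""
    return 3.3 * blank_sd / slope, 10.0 * blank_sd / slope
```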

  9. Towards fully automated closed-loop Deep Brain Stimulation in Parkinson's disease patients: A LAMSTAR-based tremor predictor.

    PubMed

    Khobragade, Nivedita; Graupe, Daniel; Tuninetti, Daniela

    2015-08-01

    This paper describes the application of the LAMSTAR (LArge Memory STorage and Retrieval) neural network to predicting the onset of tremor in Parkinson's disease (PD) patients, to allow on-off adaptive control of Deep Brain Stimulation (DBS). Currently, the therapeutic treatment of PD by DBS is an open-loop system in which continuous stimulation is applied to a target area in the brain. This work demonstrates a fully automated closed-loop DBS system, so that stimulation can be applied on demand only when needed to treat PD symptoms. The proposed LAMSTAR network uses spectral, entropy and recurrence-rate parameters to predict the advent of tremor after the DBS stimulation is switched off. These parameters are extracted from non-invasively collected surface electromyography and accelerometry signals. The LAMSTAR network has useful characteristics, such as fast retrieval of patterns and the ability to handle large amounts of data of different types, which make it attractive for medical applications. Out of 21 trials from one subject, the average ratio of the delay in predicted tremor to the actual delay in observed tremor from the time stimulation was switched off, achieved by the proposed LAMSTAR network, is 0.77. Moreover, a sensitivity of 100% and overall performance better than previously proposed back-propagation neural networks were obtained. PMID:26736828

  10. Fully automated measurement of field-dependent AMS using MFK1-FA Kappabridge equipped with 3D rotator

    NASA Astrophysics Data System (ADS)

    Chadima, Martin; Studynka, Jan

    2013-04-01

    Low-field magnetic susceptibility of paramagnetic and diamagnetic minerals is field-independent by definition; it is also field-independent in pure magnetite. In pyrrhotite, hematite and high-Ti titanomagnetite, on the other hand, it may be clearly field-dependent. Consequently, the field-dependent AMS enables the magnetic fabric of the latter group of minerals to be separated from the whole-rock AMS. The methods for determining the field-dependent AMS consist of separate measurements of each specimen in several fields within the Rayleigh Law range and subsequent processing in which the field-independent and field-dependent AMS components are calculated. The disadvantage of this technique is that each specimen must be measured several times, which is relatively laborious and time consuming. Recently, a new 3D rotator was developed for the MFK1-FA Kappabridge, which rotates the specimen simultaneously about two axes with different velocities. The measurement is fully automated in such a way that, once the specimen is inserted into the rotator, no additional manipulation is required to measure the full AMS tensor. Consequently, the 3D rotator makes it possible to measure AMS tensors at the pre-set field intensities without any operator interference. The whole procedure is controlled by the newly developed Safyr5 software; once the measurements are finished, the acquired data are immediately processed and can be visualized in a standard way.
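
    Within the Rayleigh Law range the measured susceptibility varies linearly with field amplitude, k(H) = k0 + k1·H, so separating the field-independent and field-dependent components amounts to a straight-line fit over the pre-set field intensities. A minimal sketch (not the Safyr5 implementation; applied per tensor element in practice):

```python
# Illustrative least-squares fit of k(H) = k0 + k1 * H over several
# field intensities: k0 is the field-independent component, k1 the
# field-dependence slope (Rayleigh Law range assumed).

def separate_components(fields, susceptibilities):
    """fields: applied field amplitudes; susceptibilities: measured k(H).
    Returns (k0, k1)."""
    n = len(fields)
    mh = sum(fields) / n
    mk = sum(susceptibilities) / n
    k1 = sum((h - mh) * (k - mk) for h, k in zip(fields, susceptibilities)) / \
         sum((h - mh) ** 2 for h in fields)
    k0 = mk - k1 * mh
    return k0, k1
```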

  11. Evaluation of the Q analyzer, a new cap-piercing fully automated coagulometer with clotting, chromogenic, and immunoturbidometric capability.

    PubMed

    Kitchen, Steve; Woolley, Anita

    2013-01-01

    The Q analyzer is a recently launched fully automated photo-optical analyzer equipped with primary tube cap-piercing and capable of clotting, chromogenic, and immunoturbidometric tests. The purpose of the present study was to evaluate the performance characteristics of the Q analyzer with reagents from the instrument manufacturer. We assessed precision and throughput when performing coagulation screening tests, prothrombin time (PT)/international normalized ratio (INR), activated partial thromboplastin time (APTT), and fibrinogen assay by Clauss assay. We compared results with established reagent instrument combinations in widespread use. Precision of PT/INR and APTT was acceptable as indicated by total precision of around 3%. The time to first result was 3 min for an INR and 5 min for PT/APTT. The system produced 115 completed samples per hour when processing only INRs and 60 samples (120 results) per hour for PT/APTT combined. The sensitivity of the DG-APTT Synth/Q method to mild deficiency of factor VIII (FVIII), IX, and XI was excellent (as indicated by APTTs being prolonged above the upper limit of the reference range). The Q analyzer was associated with high precision, acceptable throughput, and good reliability. When used in combination with DG-PT reagent and manufacturer's instrument-specific international sensitivity index, the INRs obtained were accurate. The Q analyzer with DG-APTT Synth reagent demonstrated good sensitivity to isolated mild deficiency of FVIII, IX, and XI and had the advantage of relative insensitivity to mild FXII deficiency. Taken together, our data indicate that the Q hemostasis analyzer was suitable for routine use in combination with the reagents evaluated. PMID:23249565

  12. Multicenter Evaluation of Fully Automated BACTEC Mycobacteria Growth Indicator Tube 960 System for Susceptibility Testing of Mycobacterium tuberculosis

    PubMed Central

    Bemer, Pascale; Palicova, Frantiska; Rüsch-Gerdes, Sabine; Drugeon, Henri B.; Pfyffer, Gaby E.

    2002-01-01

    The reliability of the BACTEC Mycobacteria Growth Indicator Tube (MGIT) 960 system for testing of Mycobacterium tuberculosis susceptibility to the three front-line drugs (isoniazid [INH], rifampin [RIF], and ethambutol [EMB]) plus streptomycin (STR) was compared to that of the BACTEC 460 TB system. The proportion method was used to resolve discrepant results by an independent arbiter. One hundred and ten strains were tested with an overall agreement of 93.5%. Discrepant results were obtained for seven strains (6.4%) with INH (resistant by BACTEC MGIT 960; susceptible by BACTEC 460 TB), for one strain (0.9%) with RIF (resistant by BACTEC MGIT 960; susceptible by BACTEC 460 TB), for seven strains (6.4%) with EMB (six resistant by BACTEC MGIT 960 and susceptible by BACTEC 460 TB; one susceptible by BACTEC MGIT 960 and resistant by BACTEC 460 TB), and for 19 strains (17.3%) with STR (resistant by BACTEC MGIT 960 and susceptible by BACTEC 460 TB). After resolution of discrepant results, the sensitivity of the BACTEC MGIT 960 system was 100% for all four drugs and specificity ranged from 89.8% for STR to 100% for RIF. Turnaround times were 4.6 to 11.7 days (median, 6.5 days) for BACTEC MGIT 960 and 4.0 to 10.0 days (median, 7.0 days) for BACTEC 460 TB. These data demonstrate that the fully automated and nonradiometric BACTEC MGIT 960 system is an accurate method for rapid susceptibility testing of M. tuberculosis. PMID:11773109
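
    The sensitivity and specificity figures quoted above follow from a standard confusion-matrix calculation, treating resistance as the positive class. An illustrative sketch (data and helper names are ours, not the study's analysis code):

```python
# Hedged sketch of sensitivity/specificity for a susceptibility test:
# each record pairs the test call with the resolved reference result,
# 'R' (resistant, positive class) or 'S' (susceptible).

def sensitivity_specificity(results):
    """results: list of (test_call, reference) pairs of 'R'/'S'.
    Returns (sensitivity, specificity)."""
    tp = sum(1 for t, g in results if t == 'R' and g == 'R')
    tn = sum(1 for t, g in results if t == 'S' and g == 'S')
    fp = sum(1 for t, g in results if t == 'R' and g == 'S')
    fn = sum(1 for t, g in results if t == 'S' and g == 'R')
    return tp / (tp + fn), tn / (tn + fp)
```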

  13. Real-time direct cell concentration and viability determination using a fully automated microfluidic platform for standalone process monitoring.

    PubMed

    Nunes, P S; Kjaerulff, S; Dufva, M; Mogensen, K B

    2015-06-21

    The industrial production of cells has a large unmet need for greater process monitoring, beyond the standard temperature, pH and oxygen concentration determination. Monitoring cell health with the vast range of fluorescence cell-based assays can greatly improve feedback control and thereby ensure optimal cell production, prolonging the fermentation cycle and increasing bioreactor output. In this work, we report on the development of a fully automated microfluidic system capable of extracting samples directly from a bioreactor, diluting the sample, staining the cells, and determining the total and dead cell concentrations, within a time frame of 10.3 min. The platform consists of custom-made stepper-motor-actuated peristaltic pumps and valves, fluidic interconnections, sample-to-waste liquid management and image cytometry-based detection. The total concentration of cells is determined by brightfield microscopy, while fluorescence detection is used to detect propidium iodide-stained non-viable cells. This method can be incorporated into facilities with bioreactors to monitor the cell concentration and viability during the cultivation process. Here, we demonstrate the performance of the microfluidic system by monitoring in real time the cell concentration and viability of yeast extracted directly from an in-house-made bioreactor. This is the first demonstration of using the Dean drag force, generated by a curved microchannel geometry in conjunction with high flow rates, to promote passive mixing of cell samples and thus homogenization of the diluted cell plug. The autonomous operation of the fluidics furthermore allows implementation of intelligent protocols for handling air bubbles from the bioreactor, guiding them away from the imaging region and thereby significantly improving both the robustness of the system and the quality of the data. PMID:25923294
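
    The count-to-concentration arithmetic behind such a system is straightforward: counts from the imaged volume are scaled by the dilution factor, and viability is the live fraction of the total. A hypothetical sketch (parameter names and units are our assumptions, not the paper's code):

```python
# Hedged sketch: total cells counted by brightfield, dead cells by
# propidium-iodide fluorescence; both scaled to the undiluted sample.

def concentrations(total_count, dead_count, imaged_volume_ml, dilution_factor):
    """Returns (total cells/mL, dead cells/mL, viability %)
    for the undiluted sample."""
    total_conc = total_count / imaged_volume_ml * dilution_factor
    dead_conc = dead_count / imaged_volume_ml * dilution_factor
    viability_pct = 100.0 * (total_count - dead_count) / total_count
    return total_conc, dead_conc, viability_pct
```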

  14. Separation of field-independent and field-dependent susceptibility tensors using a sequence of fully automated AMS measurements

    NASA Astrophysics Data System (ADS)

    Studynka, J.; Chadima, M.; Hrouda, F.; Suza, P.

    2013-12-01

    Low-field magnetic susceptibility of diamagnetic and paramagnetic minerals, as well as that of pure magnetite and all single-domain ferromagnetic (s.l.) minerals, is field-independent. In contrast, the magnetic susceptibility of multi-domain pyrrhotite, hematite and titanomagnetite may depend significantly on the field intensity. Hence, AMS data acquired in various fields have great potential to separate the magnetic fabric carried by the latter group of minerals from the whole-rock fabric. The determination of the field variation of AMS consists of separate measurements of each sample in several fields within the Rayleigh Law range and subsequent processing in which the field-independent and field-dependent susceptibility tensors are calculated. The disadvantage of this technique is that each sample must be measured several times in various positions, which is relatively laborious and time consuming. Recently, a new 3D rotator was developed for the MFK1 Kappabridges, which rotates the sample simultaneously about two axes with different velocities. The measurement is fully automated in such a way that, once the sample is mounted in the rotator, no additional positioning is required to measure the full AMS tensor. The important advantage of the 3D rotator is that it makes it possible to measure AMS in a sequence of pre-set field intensities without any operator manipulation. The whole procedure is computer-controlled and, once a sequence of measurements is finished, the acquired data are immediately processed and visualized. Examples of natural rocks demonstrating various types of field dependence of AMS are given.

  15. Fully Automated Pulmonary Lobar Segmentation: Influence of Different Prototype Software Programs onto Quantitative Evaluation of Chronic Obstructive Lung Disease

    PubMed Central

    Lim, Hyun-ju; Weinheimer, Oliver; Wielpütz, Mark O.; Dinkel, Julien; Hielscher, Thomas; Gompelmann, Daniela; Kauczor, Hans-Ulrich; Heussel, Claus Peter

    2016-01-01

    Objectives Surgical or bronchoscopic lung volume reduction (BLVR) techniques can be beneficial for heterogeneous emphysema. Post-processing software tools for lobar emphysema quantification are useful for patient and target lobe selection, treatment planning and post-interventional follow-up. We aimed to evaluate the inter-software variability of emphysema quantification using fully automated lobar segmentation prototypes. Material and Methods 66 patients with moderate to severe COPD who underwent CT for planning of BLVR were included. Emphysema quantification was performed using 2 modified versions of in-house software (without and with prototype advanced lung vessel segmentation; programs 1 [YACTA v.2.3.0.2] and 2 [YACTA v.2.4.3.1]), as well as 1 commercial program 3 [Pulmo3D VA30A_HF2] and 1 pre-commercial prototype 4 [CT COPD ISP ver7.0]). The following parameters were computed for each segmented anatomical lung lobe and the whole lung: lobar volume (LV), mean lobar density (MLD), 15th percentile of lobar density (15th), emphysema volume (EV) and emphysema index (EI). Bland-Altman analysis (limits of agreement, LoA) and linear random effects models were used for comparison between the software. Results Segmentation using programs 1, 3 and 4 was unsuccessful in 1 (1%), 7 (10%) and 5 (7%) patients, respectively. Program 2 could analyze all datasets. The 53 patients with successful segmentation by all 4 programs were included for further analysis. For LV, program 1 and 4 showed the largest mean difference of 72 ml and the widest LoA of [-356, 499 ml] (p<0.05). Program 3 and 4 showed the largest mean difference of 4% and the widest LoA of [-7, 14%] for EI (p<0.001). Conclusions Only a single software program was able to successfully analyze all scheduled data-sets. Although mean bias of LV and EV were relatively low in lobar quantification, ranges of disagreement were substantial in both of them. 
For longitudinal emphysema monitoring, not only scanning protocol but
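
    The Bland-Altman limits of agreement used in the comparison above are the mean paired difference (bias) plus or minus 1.96 standard deviations of the differences. A minimal sketch (not the study's statistical code):

```python
import math

def bland_altman_limits(values_a, values_b):
    """Mean difference (bias) and 95% limits of agreement
    (bias +/- 1.96 SD of the paired differences) between two methods."""
    diffs = [a - b for a, b in zip(values_a, values_b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, bias - 1.96 * sd, bias + 1.96 * sd
```

    A wide interval between the lower and upper limits signals substantial disagreement between programs even when the mean bias is small, which is the study's central finding for lobar volume and emphysema volume.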

  16. Therapeutic Alliance With a Fully Automated Mobile Phone and Web-Based Intervention: Secondary Analysis of a Randomized Controlled Trial

    PubMed Central

    Parker, Gordon; Manicavasagar, Vijaya; Hadzi-Pavlovic, Dusan; Fogarty, Andrea

    2016-01-01

    Background Studies of Internet-delivered psychotherapies suggest that clients report development of a therapeutic alliance in the Internet environment. Because a majority of the interventions studied to date have been therapist-assisted to some degree, it remains unclear whether a therapeutic alliance can develop within the context of an Internet-delivered self-guided intervention with no therapist support, and whether this has consequences for program outcomes. Objective This study reports findings of a secondary analysis of data from 90 participants with mild-to-moderate depression, anxiety, and/or stress who used a fully automated mobile phone and Web-based cognitive behavior therapy (CBT) intervention called “myCompass” in a recent randomized controlled trial (RCT). Methods Symptoms, functioning, and positive well-being were assessed at baseline and post-intervention using the Depression, Anxiety and Stress Scale (DASS), the Work and Social Adjustment Scale (WSAS), and the Mental Health Continuum-Short Form (MHC-SF). Therapeutic alliance was measured at post-intervention using the Agnew Relationship Measure (ARM), and this was supplemented with qualitative data obtained from 16 participant interviews. Extent of participant engagement with the program was also assessed. Results Mean ratings on the ARM subscales were above the neutral midpoints, and the interviewees provided rich detail of a meaningful and collaborative therapeutic relationship with the myCompass program. Whereas scores on the ARM subscales did not correlate with treatment outcomes, participants’ ratings of the quality of their emotional connection with the program correlated significantly and positively with program logins, frequency of self-monitoring, and number of treatment modules completed (r values between .32-.38, P≤.002). The alliance (ARM) subscales measuring perceived empowerment (r=.26, P=.02) and perceived freedom to self-disclose (r=.25, P=.04) also correlated significantly

  17. Fully Automated RNAscope In Situ Hybridization Assays for Formalin-Fixed Paraffin-Embedded Cells and Tissues.

    PubMed

    Anderson, Courtney M; Zhang, Bingqing; Miller, Melanie; Butko, Emerald; Wu, Xingyong; Laver, Thomas; Kernag, Casey; Kim, Jeffrey; Luo, Yuling; Lamparski, Henry; Park, Emily; Su, Nan; Ma, Xiao-Jun

    2016-10-01

    Biomarkers such as DNA, RNA, and protein are powerful tools in clinical diagnostics and therapeutic development for many diseases. Identifying RNA expression at the single cell level within the morphological context by RNA in situ hybridization provides a great deal of information on gene expression changes over conventional techniques that analyze bulk tissue, yet widespread use of this technique in the clinical setting has been hampered by the dearth of automated RNA ISH assays. Here we present an automated version of the RNA ISH technology RNAscope that is adaptable to multiple automation platforms. The automated RNAscope assay yields a high signal-to-noise ratio with little to no background staining and results comparable to the manual assay. In addition, the automated duplex RNAscope assay was able to detect two biomarkers simultaneously. Lastly, assay consistency and reproducibility were confirmed by quantification of TATA-box binding protein (TBP) mRNA signals across multiple lots and multiple experiments. Taken together, the data presented in this study demonstrate that the automated RNAscope technology is a high performance RNA ISH assay with broad applicability in biomarker research and diagnostic assay development. J. Cell. Biochem. 117: 2201-2208, 2016. © 2016 Wiley Periodicals, Inc. PMID:27191821

  18. Fully Automated Quantitative Estimation of Volumetric Breast Density from Digital Breast Tomosynthesis Images: Preliminary Results and Comparison with Digital Mammography and MR Imaging.

    PubMed

    Pertuz, Said; McDonald, Elizabeth S; Weinstein, Susan P; Conant, Emily F; Kontos, Despina

    2016-04-01

    Purpose To assess a fully automated method for volumetric breast density (VBD) estimation in digital breast tomosynthesis (DBT) and to compare the findings with those of full-field digital mammography (FFDM) and magnetic resonance (MR) imaging. Materials and Methods Bilateral DBT images, FFDM images, and sagittal breast MR images were retrospectively collected from 68 women who underwent breast cancer screening from October 2011 to September 2012 with institutional review board-approved, HIPAA-compliant protocols. A fully automated computer algorithm was developed for quantitative estimation of VBD from DBT images. FFDM images were processed with U.S. Food and Drug Administration-cleared software, and the MR images were processed with a previously validated automated algorithm to obtain corresponding VBD estimates. Pearson correlation and analysis of variance with Tukey-Kramer post hoc correction were used to compare the multimodality VBD estimates. Results Estimates of VBD from DBT were significantly correlated with FFDM-based and MR imaging-based estimates with r = 0.83 (95% confidence interval [CI]: 0.74, 0.90) and r = 0.88 (95% CI: 0.82, 0.93), respectively (P < .001). The corresponding correlation between FFDM and MR imaging was r = 0.84 (95% CI: 0.76, 0.90). However, statistically significant differences after post hoc correction (α = 0.05) were found among VBD estimates from FFDM (mean ± standard deviation, 11.1% ± 7.0) relative to MR imaging (16.6% ± 11.2) and DBT (19.8% ± 16.2). Differences between VBD estimates from DBT and MR imaging were not significant (P = .26). Conclusion Fully automated VBD estimates from DBT, FFDM, and MR imaging are strongly correlated but show statistically significant differences. Therefore, absolute differences in VBD between FFDM, DBT, and MR imaging should be considered in breast cancer risk assessment. (©) RSNA, 2015 Online supplemental material is available for this article. PMID:26491909

  19. Usage and Effectiveness of a Fully Automated, Open-Access, Spanish Web-Based Smoking Cessation Program: Randomized Controlled Trial

    PubMed Central

    2014-01-01

    Background The Internet is an optimal setting to provide massive access to tobacco treatments. To evaluate open-access Web-based smoking cessation programs in a real-world setting, adherence and retention data should be taken into account as much as abstinence rate. Objective The objective was to analyze the usage and effectiveness of a fully automated, open-access, Web-based smoking cessation program by comparing interactive versus noninteractive versions. Methods Participants were randomly assigned either to the interactive or noninteractive version of the program, both with identical content divided into 4 interdependent modules. At baseline, we collected demographic, psychological, and smoking characteristics of the smokers self-enrolled in the Web-based program of Universidad Nacional de Educación a Distancia (National Distance Education University; UNED) in Madrid, Spain. The following questionnaires were administered: the anxiety and depression subscales from the Symptom Checklist-90-Revised, the 4-item Perceived Stress Scale, and the Heaviness of Smoking Index. At 3 months, we analyzed dropout rates, module completion, user satisfaction, follow-up response rate, and self-assessed smoking abstinence. Results A total of 23,213 smokers were registered, 50.06% (11,620/23,213) women and 49.94% (11,593/23,213) men, with a mean age of 39.5 years (SD 10.3). Of these, 46.10% (10,701/23,213) were married and 34.43% (7992/23,213) were single, 46.03% (10,686/23,213) had university education, and 78.73% (18,275/23,213) were employed. Participants smoked an average of 19.4 cigarettes per day (SD 10.3). Of the 11,861 smokers randomly assigned to the interactive version, 2720 (22.93%) completed the first module, 1052 (8.87%) the second, 624 (5.26%) the third, and 355 (2.99%) the fourth. Completion data was not available for the noninteractive version (no way to record it automatically). The 3-month follow-up questionnaire was completed by 1085 of 23,213 enrolled smokers

  20. Technical Note: A fully automated purge and trap GC-MS system for quantification of volatile organic compound (VOC) fluxes between the ocean and atmosphere

    NASA Astrophysics Data System (ADS)

    Andrews, S. J.; Hackenberg, S. C.; Carpenter, L. J.

    2015-04-01

    The oceans are a key source of a number of atmospherically important volatile gases. The accurate and robust determination of trace gases in seawater is a significant analytical challenge, requiring reproducible and ideally automated sample handling, a high efficiency of seawater-air transfer, removal of water vapour from the sample stream, and high sensitivity and selectivity of the analysis. Here we describe a system that was developed for the fully automated analysis of dissolved very short-lived halogenated species (VSLS) sampled from an under-way seawater supply. The system can also be used for semi-automated batch sampling from Niskin bottles filled during CTD (conductivity, temperature, depth) profiles. The essential components comprise a bespoke, automated purge and trap (AutoP&T) unit coupled to a commercial thermal desorption and gas chromatograph mass spectrometer (TD-GC-MS). The AutoP&T system has completed five research cruises, from the tropics to the poles, and collected over 2500 oceanic samples to date. It is able to quantify >25 species over a boiling point range of 34-180 °C with Henry's law coefficients of 0.018 and greater (CH2I2; kHcc, dimensionless gas/aqueous) and has been used to measure organic sulfurs, hydrocarbons, halocarbons and terpenes. In the eastern tropical Pacific, the high sensitivity and sampling frequency provided new information regarding the distribution of VSLS, including novel measurements of a photolytically driven diurnal cycle of CH2I2 within the surface ocean water.

  1. Parameter evaluation and fully-automated radiosynthesis of [(11)C]harmine for imaging of MAO-A for clinical trials.

    PubMed

    Philippe, C; Zeilinger, M; Mitterhauser, M; Dumanic, M; Lanzenberger, R; Hacker, M; Wadsak, W

    2015-03-01

    The aim of the present study was the evaluation and automation of the radiosynthesis of [(11)C]harmine for clinical trials. The following parameters have been investigated: amount of base, precursor concentration, solvent, reaction temperature and time. The optimum reaction conditions were determined to be 2-3 mg/mL precursor activated with 1 eq. 5 M NaOH in DMSO, 80 °C reaction temperature and 2 min reaction time. Under these conditions 6.1±1 GBq (51.0±11% based on [(11)C]CH3I, corrected for decay) of [(11)C]harmine (n=72) were obtained. The specific activity was 101.32±28.2 GBq/µmol (at EOS). All quality control parameters were in accordance with the standards for parenteral human application. Due to its reliability and high yields, this fully-automated synthesis method can be used as a routine set-up. PMID:25594603
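The yield quoted above is "corrected for decay", i.e. the product activity is scaled back to the start of the synthesis using the ~20.4 min half-life of carbon-11. A small sketch of that correction follows; the half-life is a physical constant, but the activities and elapsed time in the example are made-up illustration values, not the paper's data.

```python
import math

T_HALF_C11_MIN = 20.4   # carbon-11 half-life in minutes

def decay_corrected_yield(product_gbq, start_gbq, elapsed_min,
                          t_half_min=T_HALF_C11_MIN):
    """Yield relative to starting activity, corrected back to start of synthesis."""
    correction = 2.0 ** (elapsed_min / t_half_min)   # undo physical decay
    return product_gbq * correction / start_gbq

# Hypothetical example: 6.1 GBq of product from 24 GBq of [11C]CH3I
# after one half-life (20.4 min) of synthesis time.
rcy = decay_corrected_yield(6.1, 24.0, 20.4)
print(f"{rcy:.1%}")   # 6.1 GBq doubles to 12.2 GBq -> 50.8%
```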

  2. Toward fully automated genotyping: Allele assignment, pedigree construction, phase determination, and recombination detection in Duchenne muscular dystrophy

    SciTech Connect

    Perlin, M.W.; Burks, M.B.; Hoop, R.C.; Hoffman, E.P.

    1994-10-01

    Human genetic maps have made quantum leaps in the past few years, because of the characterization of >2,000 CA dinucleotide repeat loci: these PCR-based markers offer extraordinarily high PIC, and within the next year their density is expected to reach intervals of a few centimorgans per marker. These new genetic maps open new avenues for disease gene research, including large-scale genotyping for both simple and complex disease loci. However, the allele patterns of many dinucleotide repeat loci can be complex and difficult to interpret, with genotyping errors a recognized problem. Furthermore, the possibility of genotyping individuals at hundreds or thousands of polymorphic loci requires improvements in data handling and analysis. The automation of genotyping and analysis of computer-derived haplotypes would remove many of the barriers preventing optimal use of dense and informative dinucleotide genetic maps. Toward this end, we have automated the allele identification, genotyping, phase determinations, and inheritance consistency checks generated by four CA repeats within the 2.5-Mbp, 10-cM X-linked dystrophin gene, using fluorescein-labeled multiplexed PCR products analyzed on automated sequencers. The described algorithms can deconvolute and resolve closely spaced alleles, despite interfering stutter noise; set phase in females; propagate the phase through the family; and identify recombination events. We show the implementation of these algorithms for the completely automated interpretation of allele data and risk assessment for five Duchenne/Becker muscular dystrophy families. The described approach can be scaled up to perform genome-based analyses with hundreds or thousands of CA-repeat loci, using multiple fluorophors on automated sequencers. 16 refs., 5 figs., 1 tab.

  3. Fully-automated identification of fish species based on otolith contour: using short-time Fourier transform and discriminant analysis (STFT-DA)

    PubMed Central

    Salimi, Nima; Loh, Kar Hoe; Kaur Dhillon, Sarinder

    2016-01-01

    Background. Fish species may be identified based on their unique otolith shape or contour. Several pattern recognition methods have been proposed to classify fish species through morphological features of the otolith contours. However, there has been no fully-automated species identification model with an accuracy higher than 80%. The purpose of the current study is to develop a fully-automated model, based on the otolith contours, to identify fish species with high classification accuracy. Methods. Images of the right sagittal otoliths of 14 fish species from three families, namely Sciaenidae, Ariidae, and Engraulidae, were used to develop the proposed identification model. Short-time Fourier transform (STFT) was used, for the first time in the area of otolith shape analysis, to extract important features of the otolith contours. Discriminant Analysis (DA), as a classification technique, was used to train and test the model based on the extracted features. Results. Performance of the model was demonstrated using species from three families separately, as well as all species combined. Overall classification accuracy of the model was greater than 90% for all cases. In addition, effects of STFT variables on the performance of the identification model were explored in this study. Conclusions. Short-time Fourier transform could determine important features of the otolith outlines. The fully-automated model proposed in this study (STFT-DA) could predict the species of an unknown specimen with acceptable identification accuracy. The model codes can be accessed at http://mybiodiversityontologies.um.edu.my/Otolith/ and https://peerj.com/preprints/1517/. The current model has the flexibility to be used for more species and families in future studies. PMID:26925315
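The STFT-feature idea above can be sketched in a few lines: treat the otolith contour as a one-dimensional radius-versus-angle signal, take magnitudes of windowed FFTs over sliding segments as features, and classify. This is not the authors' code; in particular, the nearest-centroid rule below is a simple stand-in for the paper's discriminant analysis, and the synthetic "species" contours are invented for the demo.

```python
import numpy as np

def stft_features(radius, win=32, hop=16, keep=8):
    """Concatenate low-frequency |FFT| magnitudes of Hann-windowed segments."""
    w = np.hanning(win)
    feats = []
    for start in range(0, len(radius) - win + 1, hop):
        seg = radius[start:start + win] * w
        feats.append(np.abs(np.fft.rfft(seg))[:keep])
    return np.concatenate(feats)

def train_centroids(X, y):
    """One mean feature vector per class (stand-in for discriminant analysis)."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def predict(centroids, x):
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))

# Toy demo: two synthetic "species" whose contours differ in lobe count.
theta = np.linspace(0, 2 * np.pi, 128, endpoint=False)
make = lambda lobes, amp: 1.0 + amp * np.sin(lobes * theta)
X = np.array([stft_features(make(l, a)) for l, a in
              [(3, 0.20), (3, 0.25), (8, 0.20), (8, 0.25)]])
y = np.array([0, 0, 1, 1])
cents = train_centroids(X, y)
print(predict(cents, stft_features(make(3, 0.22))))   # a 3-lobe contour -> class 0
```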

  5. Aptamer Microarrays

    SciTech Connect

    Angel-Syrett, Heather; Collett, Jim; Ellington, Andrew D.

    2009-01-02

    In vitro selection can yield specific, high-affinity aptamers. We and others have devised methods for the automated selection of aptamers, and have begun to use these reagents for the construction of arrays. Arrayed aptamers have proven to be almost as sensitive as their solution phase counterparts, and when ganged together can provide both specific and general diagnostic signals for proteins and other analytes. We describe here technical details regarding the production and processing of aptamer microarrays, including blocking, washing, drying, and scanning. We will also discuss the challenges involved in developing standardized and reproducible methods for binding and quantitating protein targets. While signals from fluorescent analytes or sandwiches are typically captured, it has proven possible for immobilized aptamers to be uniquely coupled to amplification methods not available to protein reagents, thus allowing for protein-binding signals to be greatly amplified. Into the future, many of the biosensor methods described in this book can potentially be adapted to array formats, thus further expanding the utility of and applications for aptamer arrays.

  6. Fully automated synthesis of O-(3-[18F]fluoropropyl)-L-tyrosine by direct nucleophilic exchange on a quaternary 4-aminopyridinium resin.

    PubMed

    Tang, Ganghua; Tang, Xiaolan; Wang, Mingfang; Luo, Lei; Gan, Manquan

    2003-06-01

    A fully automated synthesis of O-(3-[18F]fluoropropyl)-L-tyrosine (FPT), an amino acid tracer for tumor imaging with positron emission tomography, is described. FPT was prepared by a two-step reaction sequence: direct nucleophilic substitution of 1,3-di(4-methylphenylsulfonyloxy)propane with [18F]fluoride on a quaternary 4-(4-methylpiperidinyl)pyridinium functionalized polystyrene anion exchange resin gave 3-[18F]fluoro-1-(4-methylphenylsulfonyloxy)propane, which was then used to alkylate L-tyrosine to yield FPT. The overall radiochemical yield with no decay correction was about 12%; the whole synthesis time was about 52 min, and the radiochemical purity was above 95%. PMID:12798378

  7. Towards fully automated genotyping: use of an X linked recessive spastic paraplegia family to test alternative analysis methods.

    PubMed

    Kobayashi, H; Matise, T C; Perlin, M W; Marks, H G; Hoffman, E P

    1995-05-01

    Advances in dinucleotide-based genetic maps open possibilities for large scale genotyping at high resolution. The current rate-limiting steps in the use of these dense maps are data interpretation (allele definition), data entry, and statistical calculations. We have recently reported automated allele identification methods. Here we show that a 10-cM framework map of the human X chromosome can be analyzed on two lanes of an automated sequencer per individual (10-12 loci per lane). We use this map and analysis strategy to generate allele data for an X-linked recessive spastic paraplegia family with a known PLP mutation. We analyzed 198 genotypes in a single gel and used the data to test three methods of data analysis: manual meiotic breakpoint mapping, automated concordance analysis, and whole chromosome multipoint linkage analysis. All methods pinpointed the correct location of the gene. We propose that multipoint exclusion mapping may permit valid inflation of LOD scores using the equation max LOD - (next best LOD). PMID:7759066
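The adjustment quoted at the end of the abstract, max LOD - (next best LOD), is a one-line computation over the multipoint LOD scores along the chromosome. A minimal sketch, with purely illustrative scores (not from the family studied):

```python
def adjusted_lod(lod_by_location):
    """max LOD - (next best LOD), per the equation quoted in the abstract."""
    ranked = sorted(lod_by_location, reverse=True)
    return ranked[0] - ranked[1]

# Illustrative multipoint LOD scores at candidate locations along a chromosome.
lods = [0.9, 3.4, 1.2, -2.0]
print(f"{adjusted_lod(lods):.1f}")   # 3.4 - 1.2 -> 2.2
```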

  8. Fully Automated Atlas-Based Hippocampus Volumetry for Clinical Routine: Validation in Subjects with Mild Cognitive Impairment from the ADNI Cohort.

    PubMed

    Suppa, Per; Hampel, Harald; Spies, Lothar; Fiebach, Jochen B; Dubois, Bruno; Buchert, Ralph

    2015-01-01

    Hippocampus volumetry based on magnetic resonance imaging (MRI) has not yet been translated into everyday clinical diagnostic patient care, at least in part due to limited availability of appropriate software tools. In the present study, we evaluate a fully-automated and computationally efficient processing pipeline for atlas-based hippocampal volumetry using freely available Statistical Parametric Mapping (SPM) software in 198 amnestic mild cognitive impairment (MCI) subjects from the Alzheimer's Disease Neuroimaging Initiative (ADNI1). Subjects were grouped into MCI stable and MCI to probable Alzheimer's disease (AD) converters according to follow-up diagnoses at 12, 24, and 36 months. Hippocampal grey matter volume (HGMV) was obtained from baseline T1-weighted MRI and then corrected for total intracranial volume and age. Average processing time per subject was less than 4 minutes on a standard PC. The area under the receiver operating characteristic curve of the corrected HGMV for identification of MCI to probable AD converters within 12, 24, and 36 months was 0.78, 0.72, and 0.71, respectively. Thus, hippocampal volume computed with the fully-automated processing pipeline provides similar power for prediction of MCI to probable AD conversion as computationally more expensive methods. The whole processing pipeline has been made freely available as an SPM8 toolbox. It is easily set up and integrated into everyday clinical patient care. PMID:25720402

  9. A fully-automated image processing technique to improve measurement of suspended particles and flocs by removing out-of-focus objects

    NASA Astrophysics Data System (ADS)

    Keyvani, Ali; Strom, Kyle

    2013-03-01

    A fully-automated image processing script was developed to analyze large datasets of imaged flocs in dilute turbulent suspensions of mud. In the procedure, out-of-focus flocs are automatically removed from the dataset to attain a more precise floc size distribution. This automated technique was tested against visual inspection of images to ensure that the procedure was only selecting in-focus flocs for inclusion in the size measurements, and the resulting measured sizes were compared to flocs measured through manual image processing of the same data. The results show that the automated method is able to accurately measure the floc size distribution by correctly sizing in-focus flocs and removing out-of-focus flocs. The processing procedures were developed with sizing of suspended mud flocs in mind, but the process is general and can be applied to other applications. We show the ability of the method to handle large numbers of images (over 15,000 at a time) by tracking the change in the floc size population with time at 1-min intervals over the course of a 160 min floc growth experiment.
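The abstract does not give the paper's exact focus criterion, but one common way to score whether an imaged object is in focus is the variance of a discrete Laplacian over its pixel patch: sharp edges give high variance, blurred (out-of-focus) objects give low variance. The sketch below illustrates that idea on synthetic patches; it is an assumed stand-in, not the authors' script.

```python
import numpy as np

def laplacian_variance(patch):
    """Variance of the 5-point discrete Laplacian of a 2-D grayscale patch."""
    p = patch.astype(float)
    lap = (-4 * p[1:-1, 1:-1] + p[:-2, 1:-1] + p[2:, 1:-1]
           + p[1:-1, :-2] + p[1:-1, 2:])
    return lap.var()

def keep_in_focus(patches, threshold):
    """Retain only patches whose focus score clears the threshold."""
    return [p for p in patches if laplacian_variance(p) >= threshold]

# Demo: a sharp random-texture patch vs. the same patch after a crude box blur.
rng = np.random.default_rng(0)
sharp = rng.random((32, 32))
kernel = np.ones(5) / 5
blurred = np.apply_along_axis(lambda r: np.convolve(r, kernel, "same"), 0, sharp)
blurred = np.apply_along_axis(lambda r: np.convolve(r, kernel, "same"), 1, blurred)

thr = (laplacian_variance(sharp) + laplacian_variance(blurred)) / 2
print(len(keep_in_focus([sharp, blurred], thr)))   # only the sharp patch survives: 1
```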

  10. Fully automated high-throughput chromatin immunoprecipitation for ChIP-seq: Identifying ChIP-quality p300 monoclonal antibodies

    PubMed Central

    Gasper, William C.; Marinov, Georgi K.; Pauli-Behn, Florencia; Scott, Max T.; Newberry, Kimberly; DeSalvo, Gilberto; Ou, Susan; Myers, Richard M.; Vielmetter, Jost; Wold, Barbara J.

    2014-01-01

    Chromatin immunoprecipitation coupled with DNA sequencing (ChIP-seq) is the major contemporary method for mapping in vivo protein-DNA interactions in the genome. It identifies sites of transcription factor, cofactor and RNA polymerase occupancy, as well as the distribution of histone marks. Consortia such as the ENCyclopedia Of DNA Elements (ENCODE) have produced large datasets using manual protocols. However, future measurements of hundreds of additional factors in many cell types and physiological states call for higher throughput and consistency afforded by automation. Such automation advances, when provided by multiuser facilities, could also improve the quality and efficiency of individual small-scale projects. The immunoprecipitation process has become rate-limiting, and is a source of substantial variability when performed manually. Here we report a fully automated robotic ChIP (R-ChIP) pipeline that allows up to 96 reactions. A second bottleneck is the dearth of renewable ChIP-validated immune reagents, which do not yet exist for most mammalian transcription factors. We used R-ChIP to screen new mouse monoclonal antibodies raised against p300, a histone acetylase, well-known as a marker of active enhancers, for which ChIP-competent monoclonal reagents have been lacking. We identified, validated for ChIP-seq, and made publicly available a monoclonal reagent called ENCITp300-1. PMID:24919486

  11. Centrifugal LabTube platform for fully automated DNA purification and LAMP amplification based on an integrated, low-cost heating system.

    PubMed

    Hoehl, Melanie M; Weißert, Michael; Dannenberg, Arne; Nesch, Thomas; Paust, Nils; von Stetten, Felix; Zengerle, Roland; Slocum, Alexander H; Steigert, Juergen

    2014-06-01

    This paper introduces a disposable battery-driven heating system for loop-mediated isothermal DNA amplification (LAMP) inside a centrifugally-driven DNA purification platform (LabTube). We demonstrate LabTube-based fully automated DNA purification of as low as 100 cell-equivalents of verotoxin-producing Escherichia coli (VTEC) in water, milk and apple juice in a laboratory centrifuge, followed by integrated and automated LAMP amplification with a reduction of hands-on time from 45 to 1 min. The heating system consists of two parallel SMD thick film resistors and an NTC as heating and temperature-sensing elements. They are driven by a 3 V battery and controlled by a microcontroller. The LAMP reagents are stored in the elution chamber and the amplification starts immediately after the eluate is purged into the chamber. The LabTube, including a microcontroller-based heating system, demonstrates contamination-free and automated sample-to-answer nucleic acid testing within a laboratory centrifuge. The heating system can be easily parallelized within one LabTube and it is deployable for a variety of heating and electrical applications. PMID:24562605
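The control loop described above (NTC temperature sensing plus microcontroller-driven resistive heating) can be sketched as a Beta-equation thermistor reading and simple on/off control around the roughly 65 °C LAMP setpoint. The component values (10 kΩ NTC, Beta = 3950 K) and the bang-bang rule are assumptions for illustration, not the paper's firmware.

```python
import math

R25, BETA = 10_000.0, 3950.0    # assumed 10k NTC with Beta = 3950 K
T25_K = 298.15                  # 25 degC reference in kelvin

def ntc_temperature_c(resistance_ohm):
    """Beta-equation approximation: 1/T = 1/T25 + ln(R/R25)/Beta."""
    inv_t = 1.0 / T25_K + math.log(resistance_ohm / R25) / BETA
    return 1.0 / inv_t - 273.15

def heater_on(current_temp_c, setpoint_c=65.0, hysteresis_c=0.5):
    """Bang-bang control with hysteresis, a stand-in for the real firmware logic."""
    return current_temp_c < setpoint_c - hysteresis_c

t = ntc_temperature_c(10_000.0)      # at R = R25 the reading is 25 degC
print(round(t, 1), heater_on(t))     # 25.0 True (well below setpoint, so heat)
```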

  12. ELIXYS - a fully automated, three-reactor high-pressure radiosynthesizer for development and routine production of diverse PET tracers

    PubMed Central

    2013-01-01

    Background Automated radiosynthesizers are vital for routine production of positron-emission tomography tracers to minimize radiation exposure to operators and to ensure reproducible synthesis yields. The recent trend in the synthesizer industry towards the use of disposable kits aims to simplify setup and operation for the user, but often introduces several limitations related to temperature and chemical compatibility, thus requiring reoptimization of protocols developed on non-cassette-based systems. Radiochemists would benefit from a single hybrid system that provides tremendous flexibility for development and optimization of reaction conditions while also providing a pathway to simple, cassette-based production of diverse tracers. Methods We have designed, built, and tested an automated three-reactor radiosynthesizer (ELIXYS) to provide a flexible radiosynthesis platform suitable for both tracer development and routine production. The synthesizer is capable of performing high-pressure and high-temperature reactions by eliminating permanent tubing and valve connections to the reaction vessel. Each of the three movable reactors can seal against different locations on disposable cassettes to carry out different functions such as sealed reactions, evaporations, and reagent addition. A reagent and gas handling robot moves sealed reagent vials from storage locations in the cassette to addition positions and also dynamically provides vacuum and inert gas to ports on the cassette. The software integrates these automated features into chemistry unit operations (e.g., React, Evaporate, Add) to intuitively create synthesis protocols. 2-Deoxy-2-[18F]fluoro-5-methyl-β-l-arabinofuranosyluracil (l-[18F]FMAU) and 2-deoxy-2-[18F]fluoro-β-d-arabinofuranosylcytosine (d-[18F]FAC) were synthesized to validate the system. Results l-[18F]FMAU and d-[18F]FAC were successfully synthesized in 165 and 170 min, respectively, with decay-corrected radiochemical yields of 46% ± 1% (n = 6
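The abstract describes synthesis protocols built from chemistry unit operations (React, Evaporate, Add). One minimal way to model that is a protocol as an ordered list of operation records; the class, field names, and parameter values below are hypothetical illustrations, not the ELIXYS software's actual API.

```python
from dataclasses import dataclass

@dataclass
class Op:
    kind: str          # "Add", "React", or "Evaporate"
    reactor: int       # which of the three reactors (1-3)
    params: dict       # operation-specific settings

# A hypothetical three-step protocol expressed as unit operations.
protocol = [
    Op("Add",       1, {"reagent": "precursor", "volume_ml": 1.0}),
    Op("React",     1, {"temp_c": 160, "minutes": 10, "sealed": True}),
    Op("Evaporate", 1, {"temp_c": 110, "minutes": 3}),
]

def run(protocol):
    """Stand-in executor: in a real system each step would drive hardware."""
    for op in protocol:
        print(f"reactor {op.reactor}: {op.kind} {op.params}")

run(protocol)
```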

  13. Anxiety Online—A Virtual Clinic: Preliminary Outcomes Following Completion of Five Fully Automated Treatment Programs for Anxiety Disorders and Symptoms

    PubMed Central

    Meyer, Denny; Austin, David William; Kyrios, Michael

    2011-01-01

    Background The development of e-mental health interventions to treat or prevent mental illness and to enhance wellbeing has risen rapidly over the past decade. This development assists the public in sidestepping some of the obstacles that are often encountered when trying to access traditional face-to-face mental health care services. Objective The objective of our study was to investigate the posttreatment effectiveness of five fully automated self-help cognitive behavior e-therapy programs for generalized anxiety disorder (GAD), panic disorder with or without agoraphobia (PD/A), obsessive–compulsive disorder (OCD), posttraumatic stress disorder (PTSD), and social anxiety disorder (SAD) offered to the international public via Anxiety Online, an open-access full-service virtual psychology clinic for anxiety disorders. Methods We used a naturalistic participant choice, quasi-experimental design to evaluate each of the five Anxiety Online fully automated self-help e-therapy programs. Participants were required to have at least subclinical levels of one of the anxiety disorders to be offered the associated disorder-specific fully automated self-help e-therapy program. These programs are offered free of charge via Anxiety Online. Results A total of 225 people self-selected one of the five e-therapy programs (GAD, n = 88; SAD, n = 50; PD/A, n = 40; PTSD, n = 30; OCD, n = 17) and completed their 12-week posttreatment assessment. Significant improvements were found on 21/25 measures across the five fully automated self-help programs. At postassessment we observed significant reductions on all five anxiety disorder clinical disorder severity ratings (Cohen d range 0.72–1.22), increased confidence in managing one’s own mental health care (Cohen d range 0.70–1.17), and decreases in the total number of clinical diagnoses (except for the PD/A program, where a positive trend was found) (Cohen d range 0.45–1.08). In addition, we found significant improvements in
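The effect sizes reported above are Cohen's d values: the mean difference divided by the pooled standard deviation. A short sketch of that computation follows; the numbers in the example are made up for illustration, not the study's data.

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Cohen's d with pooled standard deviation across the two groups."""
    pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled

# Hypothetical pre/post symptom scores (equal SDs make the pooled SD 5.0).
d = cohens_d(mean1=18.0, sd1=5.0, n1=40, mean2=13.0, sd2=5.0, n2=40)
print(round(d, 2))   # (18 - 13) / 5 = 1.0
```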

  14. WebMark--A Fully Automated Method of Submission, Assessment, Grading, and Commentary for Laboratory Practical Scripts

    NASA Astrophysics Data System (ADS)

    Olivier, George W. J.; Herson, Katie; Sosabowski, Michael H.

    2001-12-01

    The traditional (manual) method of checking and grading student laboratory practical scripts is time-consuming and therefore can cause long script turnaround times; it is labor intensive, especially for nonuniform quantitative data; there is potential for inconsistency; and, for large student groups, a great deal of tedium for the checker. Automation of checking such scripts has the potential to alleviate these disadvantages. This paper describes a strategy adopted by the School of Pharmacy and Biomolecular Sciences, University of Brighton, UK, to automate the submission, assessment, grading, and commentary of laboratory practical scripts. Student evaluation and feedback are also reported. Students enter their results into a Web-based form via the school intranet. Their results are linked to a Filemaker Pro database, which calculates the "right" answers on the basis of the primary data used, compares them with the students' answers, and grades the scripts. The database detects where students have made errors in calculations and calculates the grade, which it sends to students with qualitative feedback. Students receive their grade and feedback by email immediately upon submission of their results, which gives them the opportunity to reflect upon and discuss their results with the instructor while the exercise is still fresh in their minds.
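The grading logic described above (recompute the expected answers from each student's own primary data, compare, and attach feedback) can be sketched as a tolerance-based comparison. The field names, tolerance, and values below are illustrative assumptions, not the WebMark database's actual schema.

```python
def grade(student_answers, correct_answers, rel_tol=0.01):
    """Score one script; returns (grade as a percentage, feedback strings)."""
    feedback, right = [], 0
    for key, correct in correct_answers.items():
        got = student_answers.get(key)
        if got is not None and abs(got - correct) <= rel_tol * abs(correct):
            right += 1                       # within tolerance: full credit
        else:
            feedback.append(f"{key}: check your calculation (got {got})")
    return 100.0 * right / len(correct_answers), feedback

# Hypothetical script: one answer within 1% tolerance, one well outside it.
score, notes = grade({"molarity": 0.100, "yield_pct": 72.0},
                     {"molarity": 0.101, "yield_pct": 85.0})
print(score, notes)   # 50.0 and one feedback line for yield_pct
```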

  15. Fully automated prostate segmentation in 3D MR based on normalized gradient fields cross-correlation initialization and LOGISMOS refinement

    NASA Astrophysics Data System (ADS)

    Yin, Yin; Fotin, Sergei V.; Periaswamy, Senthil; Kunz, Justin; Haldankar, Hrishikesh; Muradyan, Naira; Cornud, François; Turkbey, Baris; Choyke, Peter

    2012-02-01

    Manual delineation of the prostate is a challenging task for a clinician due to its complex and irregular shape. Furthermore, the need for precisely targeting the prostate boundary continues to grow. Planning for radiation therapy, MR-ultrasound fusion for image-guided biopsy, multi-parametric MRI tissue characterization, and context-based organ retrieval are examples where accurate prostate delineation can play a critical role in a successful patient outcome. Therefore, a robust automated full prostate segmentation system is desired. In this paper, we present an automated prostate segmentation system for 3D MR images. In this system, the prostate is segmented in two steps: the prostate displacement and size are first detected, and then the boundary is refined by a shape model. The detection approach is based on normalized gradient fields cross-correlation. This approach is fast, robust to intensity variation, and provides good accuracy to initialize a prostate mean shape model. The refinement model is based on a graph-search based framework, which incorporates both shape and topology information during deformation. We generated the graph cost using trained classifiers and used coarse-to-fine search and region-specific classifier training. The proposed algorithm was developed using 261 training images and tested on another 290 cases. Segmentation performance, with mean DSC ranging from 0.89 to 0.91 depending on the evaluation subset, demonstrates state-of-the-art performance. Running time for the system is about 20 to 40 seconds depending on image size and resolution.

  16. Technical Note: A fully automated purge and trap-GC-MS system for quantification of volatile organic compound (VOC) fluxes between the ocean and atmosphere

    NASA Astrophysics Data System (ADS)

    Andrews, S. J.; Hackenberg, S. C.; Carpenter, L. J.

    2014-12-01

    The oceans are a key source of a number of atmospherically important volatile gases. The accurate and robust determination of trace gases in seawater is a significant analytical challenge, requiring reproducible and ideally automated sample handling, a high efficiency of seawater-air transfer, removal of water vapour from the sample stream, and high sensitivity and selectivity of the analysis. Here we describe a system that was developed for the fully automated analysis of dissolved very short-lived halogenated species (VSLS) sampled from an under-way seawater supply. The system can also be used for semi-automated batch sampling from Niskin bottles filled during CTD (Conductivity, Temperature, Depth) profiles. The essential components comprise a bespoke, automated purge and trap (AutoP & T) unit coupled to a commercial thermal desorption and gas chromatograph-mass spectrometer (TD-GC-MS). The AutoP & T system has completed five research cruises, from the tropics to the poles, and collected over 2500 oceanic samples to date. It is able to quantify >25 species over a boiling point range of 34-180 °C with Henry's Law coefficients of 0.018 and greater (CH2I2, kHcc dimensionless gas/aqueous) and has been used to measure organic sulfurs, hydrocarbons, halocarbons and terpenes. In the east tropical Pacific, the high sensitivity and sampling frequency provided new information regarding the distribution of VSLS, including novel measurements of a photolytically driven diurnal cycle of CH2I2 within the surface ocean water.

  17. Fully automated, high speed, tomographic phase object reconstruction using the transport of intensity equation in transmission and reflection configurations.

    PubMed

    Nguyen, Thanh; Nehmetallah, George; Tran, Dat; Darudi, Ahmad; Soltani, Peyman

    2015-12-10

    While traditional transport of intensity equation (TIE) based phase retrieval of a phase object is performed through axial translation of the CCD, in this work a tunable lens TIE is employed in both transmission and reflection configurations. These configurations are extended to a 360° tomographic 3D reconstruction through multiple illuminations from different angles by a custom fabricated rotating assembly of the phase object. Synchronization circuitry is developed to control the CCD camera and the Arduino board, which in turn controls the tunable lens and the stepper motor to automate the tomographic reconstruction process. Finally, a MATLAB based user-friendly graphical user interface is developed to control the whole system and perform tomographic reconstruction using both multiplicative and inverse Radon based techniques. PMID:26836869
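For context, the transport of intensity equation underlying the phase retrieval described above has the following standard form (stated here from the general literature; the abstract itself does not reproduce it):

```latex
% Transport of intensity equation (TIE): the transverse divergence of the
% phase-weighted intensity flow balances the axial intensity derivative.
\nabla_{\perp} \cdot \left( I(x,y)\, \nabla_{\perp} \phi(x,y) \right)
  = -k \, \frac{\partial I(x,y)}{\partial z},
\qquad k = \frac{2\pi}{\lambda}
```

Measuring the axial derivative on the right-hand side is what normally requires translating the CCD; the tunable lens in this work changes the effective defocus instead, which is why no mechanical axial motion is needed.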

  18. Fully Automated Electro Membrane Extraction Autosampler for LC-MS Systems Allowing Soft Extractions for High-Throughput Applications.

    PubMed

    Fuchs, David; Pedersen-Bjergaard, Stig; Jensen, Henrik; Rand, Kasper D; Honoré Hansen, Steen; Petersen, Nickolaj Jacob

    2016-07-01

    The current work describes the implementation of electro membrane extraction (EME) into an autosampler for high-throughput analysis of samples by EME-LC-MS. The extraction probe was built into a luer lock adapter connected to a HTC PAL autosampler syringe. As the autosampler drew sample solution, analytes were extracted into the lumen of the extraction probe and transferred to a LC-MS system for further analysis. Various parameters affecting extraction efficacy were investigated, including syringe fill strokes, syringe pull-up volume, pull-up delay and volume in the sample vial. The system was optimized for soft extraction of analytes and high sample throughput. Further, it was demonstrated that by flushing the EME-syringe with acidic wash buffer and reversing the applied electric potential, carry-over between samples can be reduced to below 1%. Performance of the system was characterized (RSD, <10%; R(2), 0.994) and finally, the EME-autosampler was used to analyze the in vitro conversion of methadone into its main metabolite by rat liver microsomes and to demonstrate the potential of known CYP3A4 inhibitors to prevent metabolism of methadone. By making use of the high extraction speed of EME, a complete analytical workflow of purification, separation, and analysis of a sample could be achieved within only 5.5 min. With the developed system, large sequences of samples could be analyzed in a completely automated manner. This high degree of automation makes the developed EME-autosampler a powerful tool for a wide range of applications where high-throughput extractions are required before sample analysis. PMID:27237618

  19. Fully Automated One-Step Production of Functional 3D Tumor Spheroids for High-Content Screening.

    PubMed

    Monjaret, François; Fernandes, Mathieu; Duchemin-Pelletier, Eve; Argento, Amelie; Degot, Sébastien; Young, Joanne

    2016-04-01

    Adoption of spheroids within high-content screening (HCS) has lagged behind high-throughput screening (HTS) due to issues with running complex assays on large three-dimensional (3D) structures. To enable multiplexed imaging and analysis of spheroids, different cancer cell lines were grown in 3D on micropatterned 96-well plates with automated production of nine uniform spheroids per well. Spheroids achieve diameters of up to 600 µm, and reproducibility was experimentally validated (interwell and interplate CV(diameter) <5%). Biphoton imaging confirmed that micropatterned spheroids exhibit characteristic cell heterogeneity with distinct microregions. Furthermore, central necrosis appears at a consistent spheroid size, suggesting standardized growth. Using three reference compounds (fluorouracil, irinotecan, and staurosporine), we validated HT-29 micropatterned spheroids on an HCS platform, benchmarking against hanging-drop spheroids. Spheroid formation and imaging in a single plate accelerate assay workflow, and fixed positioning prevents structures from overlapping or sticking to the well wall, augmenting image processing reliability. Furthermore, multiple spheroids per well increase the statistical confidence sufficiently to discriminate compound mechanisms of action and generate EC50 values for endpoints of cell death, architectural change, and size within a single-pass read. Higher quality data and a more efficient HCS work chain should encourage integration of micropatterned spheroid models within fundamental research and drug discovery applications. PMID:26385905

  20. Capillary-based fully integrated and automated system for nanoliter polymerase chain reaction analysis directly from cheek cells.

    PubMed

    He, Y; Zhang, Y H; Yeung, E S

    2001-07-27

    A miniaturized, integrated and automated system based on capillary fluidics has been developed for nanoliter DNA analysis directly from cheek cells. All steps for DNA analysis, including injecting aqueous reagents and DNA samples, mixing the solutions together, thermal cell lysis, polymerase chain reaction (PCR), transfer and injection of PCR product, separation, sizing and detection of those products are performed in a capillary-based integrated system. A small amount of cheek cells collected by a plastic toothpick is directly dissolved in the PCR cocktail in a plastic vial or mixed on-line with a small volume of PCR cocktail (125 nl) in the capillary. After thermal cell lysis and PCR in a microthermal cycler, the DNA fragments are mixed with DNA size standards and transferred to a micro-cross for injection and separation by capillary gel electrophoresis. Programmable syringe pumps, switching valves, multiposition and freeze-thaw valves are used for microfluidic control in the entire system. This work establishes the feasibility of performing all the steps of DNA analysis from real samples in a capillary-based nanoliter integrated system. PMID:11521874

  1. A fully automated non-external marker 4D-CT sorting algorithm using a serial cine scanning protocol

    NASA Astrophysics Data System (ADS)

    Carnes, Greg; Gaede, Stewart; Yu, Edward; Van Dyk, Jake; Battista, Jerry; Lee, Ting-Yim

    2009-04-01

    Current 4D-CT methods require external marker data to retrospectively sort image data and generate CT volumes. In this work we develop an automated 4D-CT sorting algorithm that performs without the aid of data collected from an external respiratory surrogate. The sorting algorithm requires an overlapping cine scan protocol. The overlapping protocol provides a spatial link between couch positions. Beginning with a starting scan position, images from the adjacent scan position (which spatial match the starting scan position) are selected by maximizing the normalized cross correlation (NCC) of the images at the overlapping slice position. The process was continued by 'daisy chaining' all couch positions using the selected images until an entire 3D volume was produced. The algorithm produced 16 phase volumes to complete a 4D-CT dataset. Additional 4D-CT datasets were also produced using external marker amplitude and phase angle sorting methods. The image quality of the volumes produced by the different methods was quantified by calculating the mean difference of the sorted overlapping slices from adjacent couch positions. The NCC sorted images showed a significant decrease in the mean difference (p < 0.01) for the five patients.

  2. The Simoa HD-1 Analyzer: A Novel Fully Automated Digital Immunoassay Analyzer with Single-Molecule Sensitivity and Multiplexing.

    PubMed

    Wilson, David H; Rissin, David M; Kan, Cheuk W; Fournier, David R; Piech, Tomasz; Campbell, Todd G; Meyer, Raymond E; Fishburn, Matthew W; Cabrera, Carlos; Patel, Purvish P; Frew, Erica; Chen, Yao; Chang, Lei; Ferrell, Evan P; von Einem, Volker; McGuigan, William; Reinhardt, Marcus; Sayer, Heiko; Vielsack, Claus; Duffy, David C

    2016-08-01

    Disease detection at the molecular level is driving the emerging revolution of early diagnosis and treatment. A challenge facing the field is that protein biomarkers for early diagnosis can be present in very low abundance. The lower limit of detection with conventional immunoassay technology is the upper femtomolar range (10(-13) M). Digital immunoassay technology has improved detection sensitivity three logs, to the attomolar range (10(-16) M). This capability has the potential to open new advances in diagnostics and therapeutics, but such technologies have been relegated to manual procedures that are not well suited for efficient routine use. We describe a new laboratory instrument that provides full automation of single-molecule array (Simoa) technology for digital immunoassays. The instrument is capable of single-molecule sensitivity and multiplexing with short turnaround times and a throughput of 66 samples/h. Singleplex and multiplexed digital immunoassays were developed for 16 proteins of interest in cardiovascular, cancer, infectious disease, neurology, and inflammation research. The average sensitivity improvement of the Simoa immunoassays versus conventional ELISA was >1200-fold, with coefficients of variation of <10%. The potential of digital immunoassays to advance human diagnostics was illustrated in two clinical areas: traumatic brain injury and early detection of infectious disease. PMID:26077162

  3. Fully automated 1.5 MHz FDML laser with more than 100mW output power at 1310 nm

    NASA Astrophysics Data System (ADS)

    Wieser, Wolfgang; Klein, Thomas; Draxinger, Wolfgang; Huber, Robert

    2015-07-01

    While FDML lasers with MHz sweep speeds have been presented five years ago, these devices have required manual control for startup and operation. Here, we present a fully self-starting and continuously regulated FDML laser with a sweep rate of 1.5 MHz. The laser operates over a sweep range of 115 nm centered at 1315 nm, and provides very high average output power of more than 100 mW. We characterize the laser performance, roll-off, coherence length and investigate the wavelength and phase stability of the laser output under changing environmental conditions. The high output power allows optical coherence tomography (OCT) imaging with an OCT sensitivity of 108 dB at 1.5 MHz.

  4. Fully automated E-field measurement setup using pigtailed electro-optic sensors for accurate, vectorial, and reliable remote measurement of high-power microwave signals

    NASA Astrophysics Data System (ADS)

    Bernier, M.; Warzecha, A.; Duvillaret, L.; Lasserre, J.-L.; Paupert, A.

    2008-10-01

    The EO probe developed, offers an accurate evaluation of only one component of either continuous or single shot electric signal as long as the electric field to be measured is strong enough. Since those probes are also non intrusive, very small (tens of microns width) and have a flat response over a very large bandwidth (more than seven decades), they are competitive candidates for accurate vectorial measurement of either radiated or guided high power microwave electric field in the far- and near-field region. Unfortunately what makes them so versatile is also their Achilles' heel: the strong temporal instability of their response. Therefore, we present, in this paper, a fully-automated electro-optic probe developed to stabilise the transducer.

  5. Fully automated synthesis module for preparation of S-(2-[(18)F]fluoroethyl)-L-methionine by direct nucleophilic exchange on a quaternary 4-aminopyridinium resin.

    PubMed

    Tang, Ganghua; Wang, Mingfang; Tang, Xiaolan; Luo, Lei; Gan, Manquan

    2003-07-01

    A fully automated preparation of S-(2-[(18)F]fluoroethyl)-L-methionine (FEMET), an amino acid tracer for tumor imaging with positron emission tomography, is described. [(18)F]F(-) was produced via nuclear reaction (18)O(p,n) [(18)F] at PETtrace Cyclotron. Direction nucleophilic fluorination reaction of [(18)F]fluoride with 1,2-di(4-methylphenylsulfonyloxy)ethane on a quaternary 4-(4-methylpiperidinyl)pyridinium functionalized polystyrene anion exchange resin gave 2-[(18)F]-1-(4-methylphenyl-sulfonyloxy)ethane, and then [(18)F]fluoroalkylation of L-homocysteine thiolactone with 2-[(18)F]-1-(4-methylphenylsulfonyloxy)ethane yielded FEMET. The overall radiochemical yield with no decay correction was about 10%, the whole synthesis time was about 52 min, and the radiochemical purity was above 95%. PMID:12831988

  6. Fully automated classification of bone marrow infiltration in low-dose CT of patients with multiple myeloma based on probabilistic density model and supervised learning.

    PubMed

    Martínez-Martínez, Francisco; Kybic, Jan; Lambert, Lukáš; Mecková, Zuzana

    2016-04-01

    This paper presents a fully automated method for the identification of bone marrow infiltration in femurs in low-dose CT of patients with multiple myeloma. We automatically find the femurs and the bone marrow within them. In the next step, we create a probabilistic, spatially dependent density model of normal tissue. At test time, we detect unexpectedly high density voxels which may be related to bone marrow infiltration, as outliers to this model. Based on a set of global, aggregated features representing all detections from one femur, we classify the subjects as being either healthy or not. This method was validated on a dataset of 127 subjects with ground truth created from a consensus of two expert radiologists, obtaining an AUC of 0.996 for the task of distinguishing healthy controls and patients with bone marrow infiltration. To the best of our knowledge, no other automatic image-based method for this task has been published before. PMID:26894595

  7. Evaluation of the Fully Automated BD MAX Cdiff and Xpert C. difficile Assays for Direct Detection of Clostridium difficile in Stool Specimens

    PubMed Central

    Hofko, Marjeta; Zorn, Markus; Zimmermann, Stefan

    2013-01-01

    We evaluated the fully automated molecular BD MAX Cdiff assay (BD Diagnostics) and the Xpert C. difficile test (Cepheid) for rapid detection of Clostridium difficile infection. Culture was done on chromogenic agar followed by matrix-assisted laser desorption ionization–time of flight (MALDI-TOF) mass spectrometry identification and toxin detection. Repeat testing was required for 1.8% and 6.0% of the BD MAX and Xpert tests, respectively. Sensitivities, specificities, positive predictive values (PPV), and negative predictive values (NPV) were 90.5%, 97.9%, 89.3%, and 98.1%, respectively, for BD MAX and 97.3%, 97.9%, 90.0%, and 99.5%, respectively, for Xpert. PMID:23515539

  8. Instrumentation of LOTIS: Livermore Optical Transient Imaging System; a fully automated wide field of view telescope system searching for simultaneous optical counterparts of gamma ray bursts

    SciTech Connect

    Park, H.S.; Ables, E.; Barthelmy, S.D.; Bionta, R.M.; Ott, L.L.; Parker, E.L.; Williams, G.G.

    1998-03-06

    LOTIS is a rapidly slewing wide-field-of-view telescope which was designed and constructed to search for simultaneous gamma-ray burst (GRB) optical counterparts. This experiment requires a rapidly slewing ({lt} 10 sec), wide-field-of-view ({gt} 15{degrees}), automatic and dedicated telescope. LOTIS utilizes commercial tele-photo lenses and custom 2048 x 2048 CCD cameras to view a 17.6 x 17.6{degrees} field of view. It can point to any part of the sky within 5 sec and is fully automated. It is connected via Internet socket to the GRB coordinate distribution network which analyzes telemetry from the satellite and delivers GRB coordinate information in real-time. LOTIS started routine operation in Oct. 1996. In the idle time between GRB triggers, LOTIS systematically surveys the entire available sky every night for new optical transients. This paper will describe the system design and performance.

  9. Fully Automated Gis-Based Individual Tree Crown Delineation Based on Curvature Values from a LIDAR Derived Canopy Height Model in a Coniferous Plantation

    NASA Astrophysics Data System (ADS)

    Argamosa, R. J. L.; Paringit, E. C.; Quinton, K. R.; Tandoc, F. A. M.; Faelga, R. A. G.; Ibañez, C. A. G.; Posilero, M. A. V.; Zaragosa, G. P.

    2016-06-01

    The generation of high resolution canopy height model (CHM) from LiDAR makes it possible to delineate individual tree crown by means of a fully-automated method using the CHM's curvature through its slope. The local maxima are obtained by taking the maximum raster value in a 3 m x 3 m cell. These values are assumed as tree tops and therefore considered as individual trees. Based on the assumptions, thiessen polygons were generated to serve as buffers for the canopy extent. The negative profile curvature is then measured from the slope of the CHM. The results show that the aggregated points from a negative profile curvature raster provide the most realistic crown shape. The absence of field data regarding tree crown dimensions require accurate visual assessment after the appended delineated tree crown polygon was superimposed to the hill shaded CHM.

  10. Multi-center evaluation of the novel fully-automated PCR-based Idylla™ BRAF Mutation Test on formalin-fixed paraffin-embedded tissue of malignant melanoma.

    PubMed

    Melchior, Linea; Grauslund, Morten; Bellosillo, Beatriz; Montagut, Clara; Torres, Erica; Moragón, Ester; Micalessi, Isabel; Frans, Johan; Noten, Veerle; Bourgain, Claire; Vriesema, Renske; van der Geize, Robert; Cokelaere, Kristof; Vercooren, Nancy; Crul, Katrien; Rüdiger, Thomas; Buchmüller, Diana; Reijans, Martin; Jans, Caroline

    2015-12-01

    The advent of BRAF-targeted therapies led to increased survival in patients with metastatic melanomas harboring a BRAF V600 mutation (implicated in 46-48% of malignant melanomas). The Idylla(™) System (Idylla(™)), i.e., the real-time-PCR-based Idylla(™) BRAF Mutation Test performed on the fully-automated Idylla(™) platform, enables detection of the most frequent BRAF V600 mutations (V600E/E2/D, V600K/R/M) in tumor material within approximately 90 min and with 1% detection limit. Idylla(™) performance was determined in a multi-center study by analyzing BRAF mutational status of 148 archival formalin-fixed paraffin-embedded (FFPE) tumor samples from malignant melanoma patients, and comparing Idylla(™) results with assessments made by commercial or in-house routine diagnostic methods. Of the 148 samples analyzed, Idylla(™) initially recorded 7 insufficient DNA input calls and 15 results discordant with routine method results. Further analysis learned that the quality of 8 samples was insufficient for Idylla(™) testing, 1 sample had an invalid routine test result, and Idylla(™) results were confirmed in 10 samples. Hence, Idylla(™) identified all mutations present, including 7 not identified by routine methods. Idylla(™) enables fully automated BRAF V600 testing directly on FFPE tumor tissue with increased sensitivity, ease-of-use, and much shorter turnaround time compared to existing diagnostic tests, making it a tool for rapid, simple and highly reliable analysis of therapeutically relevant BRAF mutations, in particular for diagnostic units without molecular expertise and infrastructure. PMID:26407762

  11. Implementation and Evaluation of a Fully Automated Multiplex Real-Time PCR Assay on the BD Max Platform to Detect and Differentiate Herpesviridae from Cerebrospinal Fluids

    PubMed Central

    Köller, Thomas; Kurze, Daniel; Lange, Mirjam; Scherdin, Martin; Podbielski, Andreas; Warnke, Philipp

    2016-01-01

    A fully automated multiplex real-time PCR assay—including a sample process control and a plasmid based positive control—for the detection and differentiation of herpes simplex virus 1 (HSV1), herpes simplex virus 2 (HSV2) and varicella-zoster virus (VZV) from cerebrospinal fluids (CSF) was developed on the BD Max platform. Performance was compared to an established accredited multiplex real time PCR protocol utilizing the easyMAG and the LightCycler 480/II, both very common devices in viral molecular diagnostics. For clinical validation, 123 CSF specimens and 40 reference samples from national interlaboratory comparisons were examined with both methods, resulting in 97.6% and 100% concordance for CSF and reference samples, respectively. Utilizing the BD Max platform revealed sensitivities of 173 (CI 95%, 88–258) copies/ml for HSV1, 171 (CI 95%, 148–194) copies/ml for HSV2 and 84 (CI 95%, 5–163) copies/ml for VZV. Cross reactivity could be excluded by checking 25 common viral, bacterial and fungal human pathogens. Workflow analyses displayed shorter test duration as well as remarkable fewer and easier preparation steps with the potential to reduce error rates occurring when manually assessing patient samples. This protocol allows for a fully automated PCR assay on the BD Max platform for the simultaneously detection of herpesviridae from CSF specimens. Singular or multiple infections due to HSV1, HSV2 and VZV can reliably be differentiated with good sensitivities. Control parameters are included within the assay, thereby rendering its suitability for current quality management requirements. PMID:27092772

  12. Use of short roll C-arm computed tomography and fully automated 3D analysis tools to guide transcatheter aortic valve replacement.

    PubMed

    Kim, Michael S; Bracken, John; Eshuis, Peter; Chen, S Y James; Fullerton, David; Cleveland, Joseph; Messenger, John C; Carroll, John D

    2016-07-01

    Determination of the coplanar view is a critical component of transcatheter aortic valve replacement (TAVR). The safety and accuracy of a novel reduced angular range C-arm computed tomography (CACT) approach coupled with a fully automated 3D analysis tool package to predict the coplanar view in TAVR was evaluated. Fifty-seven patients with severe symptomatic aortic stenosis deemed prohibitive-risk for surgery and who underwent TAVR were enrolled. Patients were randomized 2:1 to CACT vs. angiography (control) in estimating the coplanar view. These approaches to determine the coplanar view were compared quantitatively. Radiation doses needed to determine the coplanar view were recorded for both the CACT and control patients. Use of CACT offered good agreement with the actual angiographic view utilized during TAVR in 34 out of 41 cases in which a CACT scan was performed (83 %). For these 34 cases, the mean angular magnitude difference, taking into account both oblique and cranial/caudal angulation, was 1.3° ± 0.4°, while the maximum difference was 7.3°. There were no significant differences in the mean total radiation dose delivered to patients between the CACT and control groups as measured by either dose area product (207.8 ± 15.2 Gy cm(2) vs. 186.1 ± 25.3 Gy cm(2), P = 0.47) or air kerma (1287.6 ± 117.7 mGy vs. 1098.9 ± 143.8 mGy, P = 0.32). Use of reduced-angular range CACT coupled with fully automated 3D analysis tools is a safe, practical, and feasible method by which to determine the optimal angiographic deployment view for guiding TAVR procedures. PMID:27091735

  13. Fully Automated Simultaneous Integrated Boosted-Intensity Modulated Radiation Therapy Treatment Planning Is Feasible for Head-and-Neck Cancer: A Prospective Clinical Study

    SciTech Connect

    Wu Binbin; McNutt, Todd; Zahurak, Marianna; Simari, Patricio; Pang, Dalong; Taylor, Russell; Sanguineti, Giuseppe

    2012-12-01

    Purpose: To prospectively determine whether overlap volume histogram (OVH)-driven, automated simultaneous integrated boosted (SIB)-intensity-modulated radiation therapy (IMRT) treatment planning for head-and-neck cancer can be implemented in clinics. Methods and Materials: A prospective study was designed to compare fully automated plans (APs) created by an OVH-driven, automated planning application with clinical plans (CPs) created by dosimetrists in a 3-dose-level (70 Gy, 63 Gy, and 58.1 Gy), head-and-neck SIB-IMRT planning. Because primary organ sparing (cord, brain, brainstem, mandible, and optic nerve/chiasm) always received the highest priority in clinical planning, the study aimed to show the noninferiority of APs with respect to PTV coverage and secondary organ sparing (parotid, brachial plexus, esophagus, larynx, inner ear, and oral mucosa). The sample size was determined a priori by a superiority hypothesis test that had 85% power to detect a 4% dose decrease in secondary organ sparing with a 2-sided alpha level of 0.05. A generalized estimating equation (GEE) regression model was used for statistical comparison. Results: Forty consecutive patients were accrued from July to December 2010. GEE analysis indicated that in APs, overall average dose to the secondary organs was reduced by 1.16 (95% CI = 0.09-2.33) with P=.04, overall average PTV coverage was increased by 0.26% (95% CI = 0.06-0.47) with P=.02 and overall average dose to the primary organs was reduced by 1.14 Gy (95% CI = 0.45-1.8) with P=.004. A physician determined that all APs could be delivered to patients, and APs were clinically superior in 27 of 40 cases. Conclusions: The application can be implemented in clinics as a fast, reliable, and consistent way of generating plans that need only minor adjustments to meet specific clinical needs.

  14. Quantitative determination of opioids in whole blood using fully automated dried blood spot desorption coupled to on-line SPE-LC-MS/MS.

    PubMed

    Verplaetse, Ruth; Henion, Jack

    2016-01-01

    Opioids are well known, widely used painkillers. Increased stability of opioids in the dried blood spot (DBS) matrix compared to blood/plasma has been described. Other benefits provided by DBS techniques include point-of-care collection, less invasive micro sampling, more economical shipment, and convenient storage. Current methodology for analysis of micro whole blood samples for opioids is limited to the classical DBS workflow, including tedious manual punching of the DBS cards followed by extraction and liquid chromatography-tandem mass spectrometry (LC-MS/MS) bioanalysis. The goal of this study was to develop and validate a fully automated on-line sample preparation procedure for the analysis of DBS micro samples relevant to the detection of opioids in finger prick blood. To this end, automated flow-through elution of DBS cards was followed by on-line solid-phase extraction (SPE) and analysis by LC-MS/MS. Selective, sensitive, accurate, and reproducible quantitation of five representative opioids in human blood at sub-therapeutic, therapeutic, and toxic levels was achieved. The range of reliable response (R(2)  ≥0.997) was 1 to 500 ng/mL whole blood for morphine, codeine, oxycodone, hydrocodone; and 0.1 to 50 ng/mL for fentanyl. Inter-day, intra-day, and matrix inter-lot accuracy and precision was less than 15% (even at lower limits of quantitation (LLOQ) level). The method was successfully used to measure hydrocodone and its major metabolite norhydrocodone in incurred human samples. Our data support the enormous potential of DBS sampling and automated analysis for monitoring opioids as well as other pharmaceuticals in both anti-doping and pain management regimens. PMID:26607771

  15. SU-D-BRD-06: Creating a Safety Net for a Fully Automated, Script Driven Electronic Medical Record

    SciTech Connect

    Sheu, R; Ghafar, R; Powers, A; Green, S; Lo, Y

    2015-06-15

    Purpose: Demonstrate the effectiveness of in-house software in ensuring EMR workflow efficiency and safety. Methods: A web-based dashboard system (WBDS) was developed to monitor clinical workflow in real time using web technology (WAMP) through ODBC (Open Database Connectivity). Within Mosaiq (Elekta Inc), operational workflow is driven and indicated by Quality Check Lists (QCLs), which is triggered by automation software IQ Scripts (Elekta Inc); QCLs rely on user completion to propagate. The WBDS retrieves data directly from the Mosaig SQL database and tracks clinical events in real time. For example, the necessity of a physics initial chart check can be determined by screening all patients on treatment who have received their first fraction and who have not yet had their first chart check. Monitoring similar “real” events with our in-house software creates a safety net as its propagation does not rely on individual users input. Results: The WBDS monitors the following: patient care workflow (initial consult to end of treatment), daily treatment consistency (scheduling, technique, charges), physics chart checks (initial, EOT, weekly), new starts, missing treatments (>3 warning/>5 fractions, action required), and machine overrides. The WBDS can be launched from any web browser which allows the end user complete transparency and timely information. Since the creation of the dashboards, workflow interruptions due to accidental deletion or completion of QCLs were eliminated. Additionally, all physics chart checks were completed timely. Prompt notifications of treatment record inconsistency and machine overrides have decreased the amount of time between occurrence and execution of corrective action. Conclusion: Our clinical workflow relies primarily on QCLs and IQ Scripts; however, this functionality is not the panacea of safety and efficiency. The WBDS creates a more thorough system of checks to provide a safer and near error-less working environment.

  16. Fast and Efficient Fragment-Based Lead Generation by Fully Automated Processing and Analysis of Ligand-Observed NMR Binding Data.

    PubMed

    Peng, Chen; Frommlet, Alexandra; Perez, Manuel; Cobas, Carlos; Blechschmidt, Anke; Dominguez, Santiago; Lingel, Andreas

    2016-04-14

    NMR binding assays are routinely applied in hit finding and validation during early stages of drug discovery, particularly for fragment-based lead generation. To this end, compound libraries are screened by ligand-observed NMR experiments such as STD, T1ρ, and CPMG to identify molecules interacting with a target. The analysis of a high number of complex spectra is performed largely manually and therefore represents a limiting step in hit generation campaigns. Here we report a novel integrated computational procedure that processes and analyzes ligand-observed proton and fluorine NMR binding data in a fully automated fashion. A performance evaluation comparing automated and manual analysis results on (19)F- and (1)H-detected data sets shows that the program delivers robust, high-confidence hit lists in a fraction of the time needed for manual analysis and greatly facilitates visual inspection of the associated NMR spectra. These features enable considerably higher throughput, the assessment of larger libraries, and shorter turn-around times. PMID:26964888

  17. Comparative evaluation of two fully-automated real-time PCR methods for MRSA admission screening in a tertiary-care hospital.

    PubMed

    Hos, N J; Wiegel, P; Fischer, J; Plum, G

    2016-09-01

    We evaluated two fully-automated real-time PCR systems, the novel QIAGEN artus MRSA/SA QS-RGQ and the widely used BD MAX MRSA assay, for their diagnostic performance in MRSA admission screening in a tertiary-care university hospital. Two hundred sixteen clinical swabs were analyzed for MRSA DNA using the BD MAX MRSA assay. In parallel, the same specimens were tested with the QIAGEN artus MRSA/SA QS-RGQ. Automated steps included lysis of bacteria, DNA extraction, real-time PCR and interpretation of results. MRSA culture was additionally performed as a reference method for MRSA detection. Sensitivity values were similar for both assays (80 %), while the QIAGEN artus MRSA/SA QS-RGQ reached a slightly higher specificity (95.8 % versus 90.0 %). Positive (PPVs) and negative predictive values (NPVs) were 17.4 % and 99.4 % for the BD MAX MRSA assay and 33.3 % and 99.5 % for the QIAGEN artus MRSA/SA QS-RGQ, respectively. Total turn-around time (TAT) for 24 samples was 3.5 hours for both assays. In conclusion, both assays represent reliable diagnostic tools due to their high negative predictive values, especially for the rapid identification of MRSA negative patients in a low prevalence MRSA area. PMID:27259711

  18. A fully automated meltwater monitoring and collection system for spatially distributed isotope analysis in snowmelt-dominated catchments

    NASA Astrophysics Data System (ADS)

    Rücker, Andrea; Boss, Stefan; Von Freyberg, Jana; Zappa, Massimiliano; Kirchner, James

    2016-04-01

    In many mountainous catchments the seasonal snowpack stores a significant volume of water, which is released as streamflow during the melting period. The predicted change in future climate will bring new challenges in water resource management in snow-dominated headwater catchments and their receiving lowlands. To improve predictions of hydrologic extreme events, particularly summer droughts, it is important characterize the relationship between winter snowpack and summer (low) flows in such areas (e.g., Godsey et al., 2014). In this context, stable water isotopes (18O, 2H) are a powerful tool for fingerprinting the sources of streamflow and tracing water flow pathways. For this reason, we have established an isotope sampling network in the Alptal catchment (46.4 km2) in Central-Switzerland as part of the SREP-Drought project (Snow Resources and the Early Prediction of hydrological DROUGHT in mountainous streams). Samples of precipitation (daily), snow cores (weekly) and runoff (daily) are analyzed for their isotopic signature in a regular cycle. Precipitation is also sampled along a horizontal transect at the valley bottom, and along an elevational transect. Additionally, the analysis of snow meltwater is of importance. As the sample collection of snow meltwater in mountainous terrain is often impractical, we have developed a fully automatic snow lysimeter system, which measures meltwater volume and collects samples for isotope analysis at daily intervals. The system consists of three lysimeters built from Decagon-ECRN-100 High Resolution Rain Gauges as standard component that allows monitoring of meltwater flow. Each lysimeter leads the meltwater into a 10-liter container that is automatically sampled and then emptied daily. These water samples are replaced regularly and analyzed afterwards on their isotopic composition in the lab. Snow melt events as well as system status can be monitored in real time. 
In our presentation we describe the automatic snow lysimeter

  19. Assessing cereal grain quality with a fully automated instrument using artificial neural network processing of digitized color video images

    NASA Astrophysics Data System (ADS)

    Egelberg, Peter J.; Mansson, Olle; Peterson, Carsten

    1995-01-01

    A fully integrated instrument for cereal grain quality assessment is presented. Color video images of grains fed onto a belt are digitized. These images are then segmented into kernel entities, which are subject to the analysis. The number of degrees of freedom for each such object is decreased to a suitable level for Artificial Neural Network (ANN) processing. Feed- forward ANN's with one hidden layer are trained with respect to desired features such as purity and flour yield. The resulting performance is compatible with that of manual human ocular inspection and alternative measuring methods. A statistical analysis of training and test set population densities is used to estimate the prediction reliabilities and to set appropriate alarm levels. The instrument containing feeder belts, balance and CCD video camera is physically separated from the 90 MHz Pentium PC computer which is used to perform the segmentation, ANN analysis and for controlling the instrument under the Unix operating system. A user-friendly graphical user interface is used to operate the instrument. The processing time for a 50 g grain sample is approximately 2 - 3 minutes.

  20. Fully automated determination of the sterol composition and total content in edible oils and fats by online liquid chromatography-gas chromatography-flame ionization detection.

    PubMed

    Nestola, Marco; Schmidt, Torsten C

    2016-09-01

    Sterol analysis of edible oils and fats is important in authenticity control. The gas chromatographic determination of the sterol distribution and total content is described by ISO norm 12228. Extraction, purification, and detection of the sterols are time-consuming and error-prone. Collaborative trials prove this regularly. Purification by thin-layer chromatography (TLC) and robust GC determination of all mentioned sterols is not straightforward. Therefore, a fully automated LC-GC-FID method was developed to facilitate the determination of sterols. The only manual step left was to weigh the sample into an autosampler vial. Saponification and extraction were performed by an autosampler while purification, separation, and detection were accomplished by online coupled normal-phase LC-GC-FID. Interlacing of sample preparation and analysis allowed an average sample throughput of one sample per hour. The obtained quantitative results were fully comparable with the ISO method with one apparent exception. In the case of sunflower oils, an additional unknown sterol was detected generally missed by ISO 12228. The reason was found in the omission of sterol silylation before subjection to GC-FID. The derivatization reaction changed the retention time and hid this compound behind a major sterol. The compound could be identified as 14-methyl fecosterol. Its structure was elucidated by GC-MS and ensured by HPLC and GC retention times. Finally, validation of the designed method confirmed its suitability for routine environments. PMID:27522150

  1. Cell proliferation measurement in cecum and colon of rats using scanned images and fully automated image analysis: validation of method.

    PubMed

    Persohn, E; Seewald, W; Bauer, J; Schreiber, J

    2007-08-01

The purpose of this study was to establish and validate fully automatic measurement of cell proliferation on scanned images of rat cecum and colon. Tissue slides were taken from a 4-week mechanistic study and processed for BrdU immunohistochemistry. Four sections of the cecum and colon per slide were scanned with the Zeiss MIRAX SCAN and transferred to the Definiens eCognition Analyst LS5.0 system for evaluation. Two rule sets for automatic counting of BrdU-positive and -negative nuclei from mucosal cells on the image tiles were created by Definiens, one for cecum and one for colon. For validation, manual counting of 16 randomly selected tiles from five different slides of colon and cecum was performed. Negative and positive cell nuclei were counted in each image tile by four different people. Comparison of the manual and automatic counts showed that the sums as well as the single-tile data and labeling index (LI) from automatic counting were within ±10% of the manual counting results. Automatic counting included only cell nuclei within the mucosa, whereas muscularis and lymphoid tissue as well as wrinkles from tissue preparation were excluded. In addition, two data sets from automatic counting of the same image tiles were compared: (1) data where image tiles with incorrect detection of mucosa were excluded from further calculation of LI and area, and (2) data where no visual check was performed and all measurements were included. Results were very similar for both data sets; manual correction may therefore be unnecessary. PMID:17467963
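The ±10% acceptance criterion on the labeling index (LI) can be illustrated with hypothetical tile counts (all numbers below are invented for the example):

```python
def labeling_index(positive, negative):
    # LI: BrdU-positive nuclei as a percentage of all counted nuclei in a tile
    return 100.0 * positive / (positive + negative)

manual_li = labeling_index(42, 158)      # 21.0
automatic_li = labeling_index(45, 160)   # about 21.95
within_tolerance = abs(automatic_li - manual_li) <= 0.1 * manual_li
```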

  2. Gene ARMADA: an integrated multi-analysis platform for microarray data implemented in MATLAB

    PubMed Central

    Chatziioannou, Aristotelis; Moulos, Panagiotis; Kolisis, Fragiskos N

    2009-01-01

Background The microarray data analysis realm is ever growing through the development of various tools, open source and commercial. However, there is an absence of predefined, rational algorithmic analysis workflows or standardized batch processing that incorporates all steps, from raw data import up to the derivation of significantly differentially expressed gene lists. This absence obfuscates the analytical procedure and obstructs large-scale comparative processing of genomic microarray datasets. Moreover, the solutions provided depend heavily on the programming skills of the user, whereas GUI-embedded solutions do not provide direct support for various raw image analysis formats or a versatile and flexible combination of signal processing methods. Results We describe here Gene ARMADA (Automated Robust MicroArray Data Analysis), a MATLAB-implemented platform with a Graphical User Interface. This suite integrates all steps of microarray data analysis, including automated data import, noise correction and filtering, normalization, statistical selection of differentially expressed genes, clustering, classification, and annotation. In its current version, Gene ARMADA fully supports two-color cDNA and Affymetrix oligonucleotide arrays, plus custom arrays for which experimental details are given in tabular form (Excel spreadsheet, comma-separated values, or tab-delimited text formats). It also supports the analysis of already processed results through its versatile import editor. Besides being fully automated, Gene ARMADA incorporates numerous functionalities of the Statistics and Bioinformatics Toolboxes of MATLAB. In addition, it provides numerous visualization and exploration tools, plus customizable export data formats for seamless integration with other analysis tools or MATLAB for further processing. Gene ARMADA requires MATLAB 7.4 (R2007a) or higher and is also distributed as a stand-alone application with MATLAB Component Runtime

  3. Diabetes Prevention and Weight Loss with a Fully Automated Behavioral Intervention by Email, Web, and Mobile Phone: A Randomized Controlled Trial Among Persons with Prediabetes

    PubMed Central

    Romanelli, Robert J; Block, Torin J; Hopkins, Donald; Carpenter, Heather A; Dolginsky, Marina S; Hudes, Mark L; Palaniappan, Latha P; Block, Clifford H

    2015-01-01

    Background One-third of US adults, 86 million people, have prediabetes. Two-thirds of adults are overweight or obese and at risk for diabetes. Effective and affordable interventions are needed that can reach these 86 million, and others at high risk, to reduce their progression to diagnosed diabetes. Objective The aim was to evaluate the effectiveness of a fully automated algorithm-driven behavioral intervention for diabetes prevention, Alive-PD, delivered via the Web, Internet, mobile phone, and automated phone calls. Methods Alive-PD provided tailored behavioral support for improvements in physical activity, eating habits, and factors such as weight loss, stress, and sleep. Weekly emails suggested small-step goals and linked to an individual Web page with tools for tracking, coaching, social support through virtual teams, competition, and health information. A mobile phone app and automated phone calls provided further support. The trial randomly assigned 339 persons to the Alive-PD intervention (n=163) or a 6-month wait-list usual-care control group (n=176). Participants were eligible if either fasting glucose or glycated hemoglobin A1c (HbA1c) was in the prediabetic range. Primary outcome measures were changes in fasting glucose and HbA1c at 6 months. Secondary outcome measures included clinic-measured changes in body weight, body mass index (BMI), waist circumference, triglyceride/high-density lipoprotein cholesterol (TG/HDL) ratio, and Framingham diabetes risk score. Analysis was by intention-to-treat. Results Participants’ mean age was 55 (SD 8.9) years, mean BMI was 31.2 (SD 4.4) kg/m2, and 68.7% (233/339) were male. Mean fasting glucose was in the prediabetic range (mean 109.9, SD 8.4 mg/dL), whereas the mean HbA1c was 5.6% (SD 0.3), in the normal range. In intention-to-treat analyses, Alive-PD participants achieved significantly greater reductions than controls in fasting glucose (mean –7.36 mg/dL, 95% CI –7.85 to –6.87 vs mean –2.19, 95% CI

  4. Comparison of Two Theory-Based, Fully Automated Telephone Interventions Designed to Maintain Dietary Change in Healthy Adults: Study Protocol of a Three-Arm Randomized Controlled Trial

    PubMed Central

    Quintiliani, Lisa M; Turner-McGrievy, Gabrielle M; Migneault, Jeffrey P; Heeren, Timothy; Friedman, Robert H

    2014-01-01

    Background Health behavior change interventions have focused on obtaining short-term intervention effects; few studies have evaluated mid-term and long-term outcomes, and even fewer have evaluated interventions that are designed to maintain and enhance initial intervention effects. Moreover, behavior theory has not been developed for maintenance or applied to maintenance intervention design to the degree that it has for behavior change initiation. Objective The objective of this paper is to describe a study that compared two theory-based interventions (social cognitive theory [SCT] vs goal systems theory [GST]) designed to maintain previously achieved improvements in fruit and vegetable (F&V) consumption. Methods The interventions used tailored, interactive conversations delivered by a fully automated telephony system (Telephone-Linked Care [TLC]) over a 6-month period. TLC maintenance intervention based on SCT used a skills-based approach to build self-efficacy. It assessed confidence in and barriers to eating F&V, provided feedback on how to overcome barriers, plan ahead, and set goals. The TLC maintenance intervention based on GST used a cognitive-based approach. Conversations trained participants in goal management to help them integrate their newly acquired dietary behavior into their hierarchical system of goals. Content included goal facilitation, conflict, shielding, and redundancy, and reflection on personal goals and priorities. To evaluate and compare the two approaches, a sample of adults whose F&V consumption was below public health goal levels were recruited from a large urban area to participate in a fully automated telephony intervention (TLC-EAT) for 3-6 months. Participants who increase their daily intake of F&V by ≥1 serving/day will be eligible for the three-arm randomized controlled trial. A sample of 405 participants will be randomized to one of three arms: (1) an assessment-only control, (2) TLC-SCT, and (3) TLC-GST. The maintenance

  5. A fully automated system for analysis of pesticides in water: on-line extraction followed by liquid chromatography-tandem photodiode array/postcolumn derivatization/fluorescence detection.

    PubMed

    Patsias, J; Papadopoulou-Mourkidou, E

    1999-01-01

A fully automated system for on-line solid-phase extraction (SPE) followed by high-performance liquid chromatography (HPLC) with tandem detection by a photodiode array detector and a fluorescence detector (after postcolumn derivatization) was developed for the analysis of many chemical classes of pesticides and their major conversion products in aquatic systems. An automated on-line SPE system (Prospekt) operated with reversed-phase cartridges (PRP-1) extracts the analytes from a 100 mL acidified (pH 3), filtered water sample. On-line HPLC analysis is performed with a 15 cm C18 analytical column eluted with a phosphate (pH 3)-acetonitrile mobile phase in a 25 min linear gradient mode. Solutes are detected by tandem diode array/derivatization/fluorescence detection. The system is controlled and monitored by a single computer running Millennium software. Recoveries of most analytes in samples fortified at 1 microgram/L are > 90%, with relative standard deviation values of < 5%. For a few very polar analytes, mostly N-methylcarbamoyloximes (i.e., aldicarb sulfone, methomyl, and oxamyl), recoveries are < 20%. However, for these compounds, as well as for the rest of the N-methylcarbamates except aldicarb sulfoxide and butoxycarboxim, the limits of detection (LODs) are 0.005-0.05 microgram/L. LODs for aldicarb sulfoxide and butoxycarboxim are 0.2 and 0.1 microgram/L, respectively. LODs for the rest of the analytes except 4-nitrophenol, bentazone, captan, decamethrin, and MCPA are 0.05-0.1 microgram/L. LODs for the latter compounds are 0.2-1.0 microgram/L. The system can be operated unattended. PMID:10444834
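The reported recovery and relative standard deviation figures follow from standard definitions; a sketch with invented replicate values for a 1 microgram/L fortification:

```python
import statistics

def recovery_stats(measured, fortified):
    # percent recovery per replicate, then mean recovery and RSD (%)
    recoveries = [100.0 * m / fortified for m in measured]
    mean_recovery = statistics.mean(recoveries)
    rsd = 100.0 * statistics.stdev(recoveries) / mean_recovery
    return mean_recovery, rsd

# hypothetical replicate concentrations (microgram/L) from fortified samples
mean_recovery, rsd = recovery_stats([0.95, 0.97, 0.93, 0.96], fortified=1.0)
```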

  6. Fully automated analysis of beta-lactams in bovine milk by online solid phase extraction-liquid chromatography-electrospray-tandem mass spectrometry.

    PubMed

    Kantiani, Lina; Farré, Marinella; Sibum, Martin; Postigo, Cristina; López de Alda, Miren; Barceló, Damiá

    2009-06-01

A fully automated method for the detection of beta-lactam antibiotics, including six penicillins (amoxicillin, ampicillin, cloxacillin, dicloxacillin, oxacillin, and penicillin G) and four cephalosporins (cefazolin, ceftiofur, cefoperazone, and cefalexin), in bovine milk samples has been developed. The method is based on online solid-phase extraction-liquid chromatography/electrospray-tandem mass spectrometry (SPE-LC/ESI-MS-MS). Target compounds were concentrated from 500 microL of centrifuged milk samples using an online SPE procedure with C18 HD cartridges. Target analytes were eluted with a gradient mobile phase (water + 0.1% formic acid/methanol + 0.1% formic acid) at a flow rate of 0.7 mL/min. Chromatographic separation was achieved within 10 min using a C12 reversed-phase analytical column. For unequivocal identification and confirmation, two multiple reaction monitoring (MRM) transitions were acquired for each analyte in positive electrospray ionization mode (ESI+). Method limits of detection (LODs) in milk were well below the maximum residue limits (MRLs) set by the European Union for all compounds. Limits of quantification in milk were between 0.09 ng/mL and 1.44 ng/mL. The developed method was validated according to the EU's requirements, and accuracy results ranged from 80 to 116%. Finally, the method was applied to the analysis of twenty real samples previously screened by the microbial growth inhibition test Eclipse 100. This newly developed method offers high sensitivity and accuracy, minimal sample pretreatment, and, for the first time, a fully automated online SPE step affording high-throughput analysis. Because of these characteristics, the proposed method is well suited to the field of food control and safety. PMID:19402673

  7. Development of a Real-Time PCR Protocol Requiring Minimal Handling for Detection of Vancomycin-Resistant Enterococci with the Fully Automated BD Max System.

    PubMed

    Dalpke, Alexander H; Hofko, Marjeta; Zimmermann, Stefan

    2016-09-01

Vancomycin-resistant enterococci (VRE) are an important cause of health care-associated infections, resulting in significant mortality and a significant economic burden in hospitals. Active surveillance for at-risk populations contributes to the prevention of infections with VRE. The availability of a combination of automation and molecular detection procedures for rapid screening would be beneficial. Here, we report on the development of a laboratory-developed PCR for detection of VRE which runs on the fully automated Becton Dickinson (BD) Max platform, which combines DNA extraction, PCR setup, and real-time PCR amplification. We evaluated two protocols: one using a liquid master mix and the other employing commercially ordered dry-down reagents. The BD Max VRE PCR was evaluated in two rounds with 86 and 61 rectal elution swab (eSwab) samples, and the results were compared to the culture results. The sensitivities of the different PCR formats were 84 to 100% for vanA and 83.7 to 100% for vanB; specificities were 96.8 to 100% for vanA and 81.8 to 97% for vanB. The use of dry-down reagents and the ExK DNA-2 kit for extraction showed that the samples were less inhibited (3.3%) than they were by the use of the liquid master mix (14.8%). Adoption of a cutoff threshold cycle of 35 for discrimination of vanB-positive samples allowed an increase of specificity to 87.9%. The performance of the BD Max VRE assay equaled that of the BD GeneOhm VanR assay, which was run in parallel. The use of dry-down reagents simplifies the assay and omits any need to handle liquid PCR reagents. PMID:27358466
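The cutoff logic for vanB discrimination amounts to thresholding the threshold cycle (Ct); a minimal sketch with invented Ct values:

```python
def call_vanb(ct, cutoff=35.0):
    # a measurable Ct at or below the cutoff is reported positive;
    # None stands for no amplification
    return ct is not None and ct <= cutoff

calls = [call_vanb(ct) for ct in (28.4, 34.9, 36.7, None)]
```

Raising or lowering the cutoff trades sensitivity against specificity, which is how the reported increase of vanB specificity to 87.9% was obtained.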

  8. A fully-automated approach to land cover mapping with airborne LiDAR and high resolution multispectral imagery in a forested suburban landscape

    NASA Astrophysics Data System (ADS)

    Parent, Jason R.; Volin, John C.; Civco, Daniel L.

    2015-06-01

    Information on land cover is essential for guiding land management decisions and supporting landscape-level ecological research. In recent years, airborne light detection and ranging (LiDAR) and high resolution aerial imagery have become more readily available in many areas. These data have great potential to enable the generation of land cover at a fine scale and across large areas by leveraging 3-dimensional structure and multispectral information. LiDAR and other high resolution datasets must be processed in relatively small subsets due to their large volumes; however, conventional classification techniques cannot be fully automated and thus are unlikely to be feasible options when processing large high-resolution datasets. In this paper, we propose a fully automated rule-based algorithm to develop a 1 m resolution land cover classification from LiDAR data and multispectral imagery. The algorithm we propose uses a series of pixel- and object-based rules to identify eight vegetated and non-vegetated land cover features (deciduous and coniferous tall vegetation, medium vegetation, low vegetation, water, riparian wetlands, buildings, low impervious cover). The rules leverage both structural and spectral properties including height, LiDAR return characteristics, brightness in visible and near-infrared wavelengths, and normalized difference vegetation index (NDVI). Pixel-based properties were used initially to classify each land cover class while minimizing omission error; a series of object-based tests were then used to remove errors of commission. These tests used conservative thresholds, based on diverse test areas, to help avoid over-fitting the algorithm to the test areas. The accuracy assessment of the classification results included a stratified random sample of 3198 validation points distributed across 30 1 × 1 km tiles in eastern Connecticut, USA. The sample tiles were selected in a stratified random manner from locations representing the full range of
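A pixel-level rule cascade of the kind described (height, spectral brightness, NDVI) can be sketched as follows; the class subset and every threshold here are illustrative, not the calibrated values of the algorithm:

```python
def ndvi(nir, red):
    # normalized difference vegetation index from near-infrared and red bands
    return (nir - red) / (nir + red)

def classify_pixel(height_m, nir, red):
    if ndvi(nir, red) < 0.2:                       # spectrally non-vegetated
        return "building" if height_m > 2.5 else "low impervious cover"
    if height_m > 5.0:
        return "tall vegetation"
    if height_m > 0.5:
        return "medium vegetation"
    return "low vegetation"

labels = [classify_pixel(12.0, 0.6, 0.2),
          classify_pixel(6.0, 0.3, 0.4),
          classify_pixel(0.1, 0.5, 0.2)]
```

In the paper's design, such pixel rules are tuned to minimize omission error, and object-based tests then remove errors of commission.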

  9. Fully-Automated High-Throughput NMR System for Screening of Haploid Kernels of Maize (Corn) by Measurement of Oil Content

    PubMed Central

    Xu, Xiaoping; Huang, Qingming; Chen, Shanshan; Yang, Peiqiang; Chen, Shaojiang; Song, Yiqiao

    2016-01-01

One of the modern crop breeding techniques uses doubled haploid plants that contain an identical pair of chromosomes in order to accelerate the breeding process. A rapid haploid identification method is critical for large-scale selection of doubled haploids. The conventional methods, based on the color of the endosperm and embryo of the seeds, are slow, manual, and prone to error. On the other hand, there is a significant difference in oil content between diploid and haploid seeds generated by a high-oil inducer, which makes it possible to use oil content to identify haploids. This paper describes a fully-automated high-throughput NMR screening system for maize haploid kernel identification. The system comprises a sampler unit that selects a single kernel and feeds it for NMR and weight measurement, and a kernel sorter that distributes the kernel according to the measurement result. Tests of the system show a consistent accuracy of 94% with an average screening time of 4 seconds per kernel. Field test results are described and directions for future improvement are discussed. PMID:27454427
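The sorting decision reduces to a threshold on measured oil content; a minimal sketch with an invented cutoff and invented readings:

```python
def classify_kernel(oil_content, cutoff):
    # haploid kernels from a high-oil inducer cross carry markedly less oil;
    # the cutoff value is illustrative, not the system's calibrated one
    return "haploid" if oil_content < cutoff else "diploid"

batch = [1.8, 4.6, 5.1, 1.5, 4.9]   # hypothetical NMR oil readings (%)
haploids = [oil for oil in batch
            if classify_kernel(oil, cutoff=3.0) == "haploid"]
kernels_per_hour = 3600 // 4        # at the reported 4 s per kernel
```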

  10. Left Ventricle: Fully Automated Segmentation Based on Spatiotemporal Continuity and Myocardium Information in Cine Cardiac Magnetic Resonance Imaging (LV-FAST)

    PubMed Central

    Wang, Lijia; Pei, Mengchao; Codella, Noel C. F.; Kochar, Minisha; Weinsaft, Jonathan W.; Li, Jianqi; Prince, Martin R.

    2015-01-01

CMR quantification of LV chamber volumes typically requires manual definition of the basal-most LV slice, which adds processing time and user-dependence. This study developed an LV segmentation method that is fully automated based on the spatiotemporal continuity of the LV (LV-FAST). An iteratively decreasing threshold region-growing approach was applied first from the midventricle to the apex, until the LV area and shape lost continuity, and then from the midventricle to the base, until less than 50% of the myocardium circumference was observable. Region growth was constrained by LV spatiotemporal continuity to improve the robustness of apical and basal segmentations. The LV-FAST method was compared with manual tracing on cine cardiac MRI data of 45 consecutive patients. Of the 45 patients, LV-FAST and manual selection identified the same apical slices at both ED and ES and the same basal slices at both ED and ES in 38, 38, 38, and 41 cases, respectively, and their measurements agreed within −1.6 ± 8.7 mL, −1.4 ± 7.8 mL, and 1.0 ± 5.8% for EDV, ESV, and EF, respectively. LV-FAST quantified the LV volume-time course within 3 seconds on a standard desktop computer, which is fast and accurate for processing cine volumetric cardiac MRI data and enables quantification of the LV filling course over the cardiac cycle. PMID:25738153
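The core of the approach is threshold-based region growing; a minimal 4-connected flood-fill sketch on a toy intensity grid (the LV-FAST iteration over decreasing thresholds and its spatiotemporal constraints are omitted):

```python
from collections import deque

def region_grow(img, seed, thresh):
    # BFS flood fill: collect 4-connected pixels with intensity >= thresh
    h, w = len(img), len(img[0])
    region = {seed}
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < h and 0 <= nc < w
                    and (nr, nc) not in region and img[nr][nc] >= thresh):
                region.add((nr, nc))
                queue.append((nr, nc))
    return region

toy_slice = [[10, 80, 85],
             [12, 90, 88],
             [11, 14, 86]]
blood_pool = region_grow(toy_slice, seed=(1, 1), thresh=70)
```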

  11. Fully-Automated High-Throughput NMR System for Screening of Haploid Kernels of Maize (Corn) by Measurement of Oil Content.

    PubMed

    Wang, Hongzhi; Liu, Jin; Xu, Xiaoping; Huang, Qingming; Chen, Shanshan; Yang, Peiqiang; Chen, Shaojiang; Song, Yiqiao

    2016-01-01

One of the modern crop breeding techniques uses doubled haploid plants that contain an identical pair of chromosomes in order to accelerate the breeding process. A rapid haploid identification method is critical for large-scale selection of doubled haploids. The conventional methods, based on the color of the endosperm and embryo of the seeds, are slow, manual, and prone to error. On the other hand, there is a significant difference in oil content between diploid and haploid seeds generated by a high-oil inducer, which makes it possible to use oil content to identify haploids. This paper describes a fully-automated high-throughput NMR screening system for maize haploid kernel identification. The system comprises a sampler unit that selects a single kernel and feeds it for NMR and weight measurement, and a kernel sorter that distributes the kernel according to the measurement result. Tests of the system show a consistent accuracy of 94% with an average screening time of 4 seconds per kernel. Field test results are described and directions for future improvement are discussed. PMID:27454427

  12. Fully-automated estimation of actual to potential evapotranspiration in the Everglades using Landsat and air temperature data as inputs to the Vegetation Index-Temperature Trapezoid method

    NASA Astrophysics Data System (ADS)

    Yagci, A. L.; Jones, J. W.

    2014-12-01

    While the greater Everglades contains a vast wetland, evapotranspiration (ET) is a major source of water "loss" from the system. Like other ecosystems, the Everglades is vulnerable to drought. Everglades restoration science and resource management requires information on the spatial and temporal distribution of ET. We developed a fully-automated ET model using the Vegetation Index-Temperature Trapezoid concept. The model was tested and evaluated against in-situ ET observations collected at the Shark River Slough Mangrove Forest eddy-covariance tower in Everglades National Park (Sitename / FLUXNET ID: Florida Everglades Shark River Slough Mangrove Forest / US-Skr). It uses Landsat Surface Reflectance Climate Data from Landsat 5, and Landsat 5 thermal and air temperature data from the Daily Gridded Surface Dataset to output the ratio of actual evapotranspiration (AET) and potential evapotranspiration (PET). When multiplied with a PET estimate, this output can be used to estimate ET at high spatial resolution. Furthermore, it can be used to downscale coarse resolution ET and PET products. Two example outputs covering the agricultural lands north of the major Everglades wetlands extracted from two different dates are shown below along with a National Land Cover Database image from 2011. The irrigated and non-irrigated farms are easily distinguishable from the background (i.e., natural land covers). Open water retained the highest AET/PET ratio. Wetlands had a higher AET/PET ratio than farmlands. The main challenge in this study area is prolonged cloudiness during the growing season.
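The trapezoid method interpolates surface temperature between a dry edge (AET/PET = 0) and a wet edge (AET/PET = 1); a minimal sketch with invented edge temperatures:

```python
def aet_pet_ratio(t_surface, t_dry, t_wet):
    # linear interpolation between the dry and wet edges of the
    # vegetation index-temperature trapezoid, clamped to [0, 1]
    ratio = (t_dry - t_surface) / (t_dry - t_wet)
    return max(0.0, min(1.0, ratio))

# hypothetical edge temperatures (K) at one pixel's vegetation index
open_water = aet_pet_ratio(301.0, t_dry=315.0, t_wet=301.0)
dry_farm = aet_pet_ratio(310.0, t_dry=315.0, t_wet=301.0)
```

Multiplying the ratio by a PET estimate then yields an AET estimate, as described above.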

  13. Significantly improved precision of cell migration analysis in time-lapse video microscopy through use of a fully automated tracking system

    PubMed Central

    2010-01-01

Background Cell motility is a critical parameter in many physiological as well as pathophysiological processes. In time-lapse video microscopy, manual cell tracking remains the most common method of analyzing the migratory behavior of cell populations. In addition to being labor-intensive, this method is susceptible to user-dependent errors in the selection of "representative" subsets of cells and the manual determination of precise cell positions. Results We have quantitatively analyzed these error sources, demonstrating that manual cell tracking of pancreatic cancer cells leads to miscalculation of migration rates of up to 410%. In order to provide objective measurements of cell migration rates, we have employed multi-target tracking technologies commonly used in radar applications to develop a fully automated cell identification and tracking system suitable for high-throughput screening of video sequences of unstained living cells. Conclusion We demonstrate that our automatic multi-target tracking system identifies cell objects, follows individual cells, and computes migration rates with high precision, clearly outperforming manual procedures. PMID:20377897
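A migration rate from a tracked cell is simply total path length over elapsed time; a minimal sketch with an invented centroid track:

```python
import math

def migration_rate(track, frame_interval):
    # mean speed: summed step lengths divided by total elapsed time
    path_length = sum(math.dist(a, b) for a, b in zip(track, track[1:]))
    return path_length / (frame_interval * (len(track) - 1))

# hypothetical centroid positions (micrometers) at 10-minute frame intervals
track = [(0.0, 0.0), (3.0, 4.0), (6.0, 8.0)]
rate = migration_rate(track, frame_interval=10.0)   # micrometers per minute
```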

  14. A fully automated effervescence-assisted switchable solvent-based liquid phase microextraction procedure: Liquid chromatographic determination of ofloxacin in human urine samples.

    PubMed

    Vakh, Christina; Pochivalov, Aleksei; Andruch, Vasil; Moskvin, Leonid; Bulatov, Andrey

    2016-02-11

A novel fully automated effervescence-assisted switchable-solvent-based liquid-phase microextraction procedure has been suggested. In this extraction method, medium-chain saturated fatty acids were investigated as switchable-hydrophilicity solvents. The conversion of the fatty acid into its hydrophilic form was carried out in the presence of sodium carbonate. Injection of sulfuric acid then decreased the pH of the solution, generating microdroplets of the fatty acid. Carbon dioxide bubbles were generated in situ and promoted the extraction process and the final phase separation. The performance of the suggested approach was demonstrated by the determination of ofloxacin in human urine samples using high-performance liquid chromatography with fluorescence detection; this analytical task served as a proof-of-concept example. Under the optimal conditions, the detector response for ofloxacin was linear in the concentration range of 3·10(-8)-3·10(-6) mol L(-1). The limit of detection, calculated from a blank test based on 3σ, was 1·10(-8) mol L(-1). The results demonstrate that the presented approach is highly cost-effective, simple, rapid, and environmentally friendly. PMID:26803002
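The 3σ blank criterion for the limit of detection can be sketched directly; the blank readings and calibration slope below are invented:

```python
import statistics

def lod_3sigma(blank_signals, slope):
    # LOD = 3 * standard deviation of the blank signal / calibration slope
    return 3.0 * statistics.stdev(blank_signals) / slope

# hypothetical blank fluorescence readings and slope (signal per mol L^-1)
lod = lod_3sigma([1.02, 0.98, 1.01, 0.99, 1.00], slope=4.0e6)
```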

  15. The microfluidic bioagent autonomous networked detector (M-BAND): an update. Fully integrated, automated, and networked field identification of airborne pathogens

    NASA Astrophysics Data System (ADS)

    Sanchez, M.; Probst, L.; Blazevic, E.; Nakao, B.; Northrup, M. A.

    2011-11-01

We describe a fully automated and autonomous airborne biothreat detection system for biosurveillance applications. The system, including the nucleic-acid-based detection assay, was designed, built, and shipped by Microfluidic Systems Inc (MFSI), a new subsidiary of PositiveID Corporation (PSID). Our findings demonstrate that the system and assay unequivocally identify pathogenic strains of Bacillus anthracis, Yersinia pestis, Francisella tularensis, Burkholderia mallei, and Burkholderia pseudomallei. To assess the assay's ability to detect unknown samples, our team also challenged it against a series of blind samples provided by the Department of Homeland Security (DHS). These samples included naturally occurring isolated strains, near-neighbor isolates, and environmental samples. Our results indicate that the multiplex assay was specific and produced no false positives when challenged with in-house gDNA collections and DHS-provided panels. Here we present another analytical tool for the rapid identification of nine Centers for Disease Control and Prevention category A and B biothreat organisms.

  16. Multiresidue trace analysis of pharmaceuticals, their human metabolites and transformation products by fully automated on-line solid-phase extraction-liquid chromatography-tandem mass spectrometry.

    PubMed

    García-Galán, María Jesús; Petrovic, Mira; Rodríguez-Mozaz, Sara; Barceló, Damià

    2016-09-01

A novel, fully automated analytical methodology based on dual-column liquid chromatography coupled to tandem mass spectrometry (LC-LC-MS(2)) has been developed and validated for the analysis of 12 pharmaceuticals and 20 metabolites and transformation products (TPs) in different types of water (influent and effluent wastewaters and surface water). Two LC columns were used, one for pre-concentration of the sample and the second for separation and analysis, so that water samples were injected directly into the chromatographic system. Among the many advantages of the methodology, such as minimization of the required sample volume and its manipulation, compounds ionized in both positive and negative mode could be analyzed simultaneously without compromising sensitivity. A comparative study of different mobile phases, gradients, and LC pre-concentration columns was carried out to obtain the best analytical performance. Method limits of detection (MLODs) were in the low ngL(-1) range for all compounds. The method was successfully applied to study the presence of the target analytes in wastewater and surface water samples collected near the city of Girona (Catalonia, Spain). Data on the environmental presence and fate of pharmaceutical metabolites and TPs are still scarce, highlighting the relevance of the developed methodology. PMID:27343613

  17. Web-Based Fully Automated Self-Help With Different Levels of Therapist Support for Individuals With Eating Disorder Symptoms: A Randomized Controlled Trial

    PubMed Central

    Dingemans, Alexandra E; Spinhoven, Philip; van Ginkel, Joost R; de Rooij, Mark; van Furth, Eric F

    2016-01-01

    Background Despite the disabling nature of eating disorders (EDs), many individuals with ED symptoms do not receive appropriate mental health care. Internet-based interventions have potential to reduce the unmet needs by providing easily accessible health care services. Objective This study aimed to investigate the effectiveness of an Internet-based intervention for individuals with ED symptoms, called “Featback.” In addition, the added value of different intensities of therapist support was investigated. Methods Participants (N=354) were aged 16 years or older with self-reported ED symptoms, including symptoms of anorexia nervosa, bulimia nervosa, and binge eating disorder. Participants were recruited via the website of Featback and the website of a Dutch pro-recovery–focused e-community for young women with ED problems. Participants were randomized to: (1) Featback, consisting of psychoeducation and a fully automated self-monitoring and feedback system, (2) Featback supplemented with low-intensity (weekly) digital therapist support, (3) Featback supplemented with high-intensity (3 times a week) digital therapist support, and (4) a waiting list control condition. Internet-administered self-report questionnaires were completed at baseline, post-intervention (ie, 8 weeks after baseline), and at 3- and 6-month follow-up. The primary outcome measure was ED psychopathology. Secondary outcome measures were symptoms of depression and anxiety, perseverative thinking, and ED-related quality of life. Statistical analyses were conducted according to an intent-to-treat approach using linear mixed models. Results The 3 Featback conditions were superior to a waiting list in reducing bulimic psychopathology (d=−0.16, 95% confidence interval (CI)=−0.31 to −0.01), symptoms of depression and anxiety (d=−0.28, 95% CI=−0.45 to −0.11), and perseverative thinking (d=−0.28, 95% CI=−0.45 to −0.11). No added value of therapist support was found in terms of symptom

  18. Effectiveness of a Web-Based Screening and Fully Automated Brief Motivational Intervention for Adolescent Substance Use: A Randomized Controlled Trial

    PubMed Central

    Elgán, Tobias H; De Paepe, Nina; Tønnesen, Hanne; Csémy, Ladislav; Thomasius, Rainer

    2016-01-01

Background Mid-to-late adolescence is a critical period for the initiation of alcohol and drug problems, which can be reduced by targeted brief motivational interventions. Web-based brief interventions have advantages in terms of acceptability and accessibility and have shown significant reductions of substance use among college students. However, the evidence is sparse among adolescents with at-risk use of alcohol and other drugs. Objective This study evaluated the effectiveness of a targeted and fully automated Web-based brief motivational intervention with no face-to-face components on substance use among adolescents screened for at-risk substance use in four European countries. Methods In an open-access, purely Web-based randomized controlled trial, a convenience sample of adolescents aged 16-18 years from Sweden, Germany, Belgium, and the Czech Republic was recruited using online and offline methods and screened online for at-risk substance use using the CRAFFT (Car, Relax, Alone, Forget, Friends, Trouble) screening instrument. Participants were randomized, without blinding, to a single-session brief motivational intervention group or an assessment-only control group. The primary outcome was the difference in past-month drinking, measured by a self-reported AUDIT-C-based index score for drinking frequency, quantity, and frequency of binge drinking, with measures collected online at baseline and after 3 months. Secondary outcomes were the separate AUDIT-C-based drinking indicators, illegal drug use, and polydrug use. All outcome analyses were conducted with and without Expectation Maximization (EM) imputation of missing follow-up data. Results In total, 2673 adolescents were screened and 1449 (54.2%) participants were randomized to the intervention or control group. After 3 months, 211 adolescents (14.5%) provided follow-up data. Compared to the control group, results from linear mixed models revealed significant reductions in self-reported past-month drinking in favor of the

  19. Development and Evaluation of a Real-Time PCR Assay for Detection of Pneumocystis jirovecii on the Fully Automated BD MAX Platform

    PubMed Central

    Hofko, Marjeta; Zimmermann, Stefan

    2013-01-01

    Pneumocystis jirovecii is an opportunistic pathogen in immunocompromised and AIDS patients. Detection by quantitative PCR is faster and more sensitive than microscopic diagnosis yet requires specific infrastructure. We adapted a real-time PCR amplifying the major surface glycoprotein (MSG) target from Pneumocystis jirovecii for use on the new BD MAX platform. The assay allowed fully automated DNA extraction and multiplex real-time PCR. The BD MAX assay was evaluated against manual DNA extraction and conventional real-time PCR. The BD MAX was used in the research mode running a multiplex PCR (MSG, internal control, and sample process control). The assay had a detection limit of 10 copies of an MSG-encoding plasmid per PCR that equated to 500 copies/ml in respiratory specimens. We observed accurate quantification of MSG targets over a 7- to 8-log range. Prealiquoting and sealing of the complete PCR reagents in conical tubes allowed easy and convenient handling of the BD MAX PCR. In a retrospective analysis of 54 positive samples, the BD MAX assay showed good quantitative correlation with the reference PCR method (R2 = 0.82). Cross-contamination was not observed. Prospectively, 278 respiratory samples were analyzed by both molecular assays. The positivity rate overall was 18.3%. The BD MAX assay identified 46 positive samples, compared to 40 by the reference PCR. The BD MAX assay required liquefaction of highly viscous samples with dithiothreitol as the only manual step, thus offering advantages for timely availability of molecular-based detection assays. PMID:23678059
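The accurate quantification over a 7- to 8-log range described above rests on a standard curve relating quantification cycle (Cq) to log10 copy number. A minimal sketch of how such a curve is fitted and inverted; the dilution-series values below are illustrative, not the paper's data:

```python
import math

# Hypothetical Cq values for a 10-fold MSG-plasmid dilution series
# (illustrative numbers, not taken from the study).
log_copies = [1, 2, 3, 4, 5, 6, 7]                # log10(copies per PCR)
cq = [36.6, 33.3, 30.0, 26.7, 23.4, 20.1, 16.8]   # measured quantification cycles

# Ordinary least-squares fit: Cq = slope * log10(copies) + intercept
n = len(cq)
mx = sum(log_copies) / n
my = sum(cq) / n
slope = sum((x - mx) * (y - my) for x, y in zip(log_copies, cq)) / \
        sum((x - mx) ** 2 for x in log_copies)
intercept = my - slope * mx

# Amplification efficiency: E = 10**(-1/slope) - 1 (1.0 means perfect doubling)
efficiency = 10 ** (-1 / slope) - 1

def copies_from_cq(c):
    """Invert the standard curve to estimate copies per PCR."""
    return 10 ** ((c - intercept) / slope)

print(round(slope, 2), round(efficiency, 2), round(copies_from_cq(30.0)))
```

A slope near -3.32 (efficiency near 100%) is the usual acceptance criterion for a real-time PCR standard curve.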

  20. Predicting survival in heart failure case and control subjects by use of fully automated methods for deriving nonlinear and conventional indices of heart rate dynamics

    NASA Technical Reports Server (NTRS)

    Ho, K. K.; Moody, G. B.; Peng, C. K.; Mietus, J. E.; Larson, M. G.; Levy, D.; Goldberger, A. L.

    1997-01-01

    BACKGROUND: Despite much recent interest in quantification of heart rate variability (HRV), the prognostic value of conventional measures of HRV and of newer indices based on nonlinear dynamics is not universally accepted. METHODS AND RESULTS: We have designed algorithms for analyzing ambulatory ECG recordings and measuring HRV without human intervention, using robust methods for obtaining time-domain measures (mean and SD of heart rate), frequency-domain measures (power in the bands of 0.001 to 0.01 Hz [VLF], 0.01 to 0.15 Hz [LF], and 0.15 to 0.5 Hz [HF] and total spectral power [TP] over all three of these bands), and measures based on nonlinear dynamics (approximate entropy [ApEn], a measure of complexity, and detrended fluctuation analysis [DFA], a measure of long-term correlations). The study population consisted of chronic congestive heart failure (CHF) case patients and sex- and age-matched control subjects in the Framingham Heart Study. After exclusion of technically inadequate studies and those with atrial fibrillation, we used these algorithms to study HRV in 2-hour ambulatory ECG recordings of 69 participants (mean age, 71.7+/-8.1 years). By use of separate Cox proportional-hazards models, the conventional measures SD (P<.01), LF (P<.01), VLF (P<.05), and TP (P<.01) and the nonlinear measure DFA (P<.05) were predictors of survival over a mean follow-up period of 1.9 years; other measures, including ApEn (P>.3), were not. In multivariable models, DFA was of borderline predictive significance (P=.06) after adjustment for the diagnosis of CHF and SD. CONCLUSIONS: These results demonstrate that HRV analysis of ambulatory ECG recordings based on fully automated methods can have prognostic value in a population-based study and that nonlinear HRV indices may contribute prognostic value to complement traditional HRV measures.
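Of the nonlinear indices above, detrended fluctuation analysis (DFA) admits a compact implementation. The following is a generic sketch, not the authors' code, applied to synthetic white noise, for which the scaling exponent alpha is expected to lie near 0.5:

```python
import numpy as np

def dfa_alpha(x, box_sizes):
    """Detrended fluctuation analysis: return the scaling exponent alpha.

    x         -- 1-D series (e.g. interbeat intervals)
    box_sizes -- iterable of window lengths n
    """
    y = np.cumsum(x - np.mean(x))              # integrated, mean-centred profile
    fluct = []
    for n in box_sizes:
        n_boxes = len(y) // n
        f2 = []
        for i in range(n_boxes):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            coef = np.polyfit(t, seg, 1)       # linear detrend within each box
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        fluct.append(np.sqrt(np.mean(f2)))
    # alpha is the slope of log F(n) versus log n
    alpha, _ = np.polyfit(np.log(box_sizes), np.log(fluct), 1)
    return alpha

rng = np.random.default_rng(0)
white = rng.standard_normal(4000)              # uncorrelated noise
alpha_white = dfa_alpha(white, [4, 8, 16, 32, 64])
print(round(alpha_white, 2))
```

For long-range correlated signals such as healthy heart rate series, alpha drifts toward 1.0, which is what makes the index discriminative.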

  1. Fully automated chip-based negative mode nanoelectrospray mass spectrometry of fructooligosaccharides produced by heterologously expressed levansucrase from Pseudomonas syringae pv. tomato DC3000.

    PubMed

    Visnapuu, Triinu; Zamfir, Alina D; Mosoarca, Cristina; Stanescu, Michaela D; Alamäe, Tiina

    2009-05-01

Pseudomonas syringae pathovars possess multiple levansucrases whose specific roles for the bacteria remain unclear. We have cloned and expressed three levansucrase genes, lsc1, lsc2 and lsc3, from P. syringae DC3000 in Escherichia coli. Levansucrases synthesize a high molecular weight fructan polymer, levan, from sucrose, and some levansucrases also produce fructooligosaccharides (FOS) with potential prebiotic effects. The ability of purified Lsc3 protein of DC3000 to synthesize FOS was tested using prolonged incubation time and varied concentrations of sugar substrates. Thin-layer chromatography (TLC) analysis of reaction products disclosed formation of FOS from both sucrose and raffinose, revealing a new catalytic property for P. syringae levansucrases. In order to analyze Lsc3-produced FOS in underivatized form, we optimized a novel method recently introduced in carbohydrate research, based on fully automated chip-based nanoelectrospray ionization (nanoESI) high-capacity ion trap mass spectrometry (HCT-MS). Using chip-based nanoESI MS in negative ion mode, FOS with degrees of polymerization up to five were detected in reaction mixtures of Lsc3 with sucrose and raffinose. For confirmation, further structural analysis by tandem mass spectrometry (MS/MS) employing collision-induced dissociation at low energies was performed. To validate the method, commercial inulin-derived FOS preparations Orafti P95 and Orafti Synergy1, which are currently used as prebiotics, were used as controls. Chip-based nanoESI HCT-MS revealed a similar FOS distribution in these reference mixtures. The data obtained thereby allowed us to postulate that FOS produced by the Lsc3 protein of P. syringae DC3000 may be prebiotic as well. PMID:19337979
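The negative-mode ions observed for FOS can be checked against theoretical monoisotopic masses. A sketch assuming linear hexose oligomers; the constants are standard monoisotopic values, and the function name is ours:

```python
# Monoisotopic masses (u); standard reference values
HEXOSE_RESIDUE = 162.052824   # C6H10O5, one anhydro-hexose unit
WATER = 18.010565             # terminal H and OH of the chain
PROTON = 1.007276

def fos_mz_neg(dp):
    """Theoretical [M-H]- m/z for a linear fructan of polymerisation degree dp."""
    return dp * HEXOSE_RESIDUE + WATER - PROTON

# DP 2 (sucrose) through DP 5, the range detected in the study
for dp in range(2, 6):
    print(dp, round(fos_mz_neg(dp), 4))
```

The DP 2 value, 341.1089, matches the well-known [M-H]- ion of sucrose, a quick sanity check on any assignment.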

  2. Chromosome Microarray.

    PubMed

    Anderson, Sharon

    2016-01-01

Over the last half century, knowledge about genetics, genetic testing, and its complexity has flourished. Completion of the Human Genome Project provided a foundation upon which the accuracy of genetics, genomics, and the integration of bioinformatics knowledge and testing has grown exponentially. What is lagging, however, are efforts to reach and engage nurses about this rapidly changing field. The purpose of this article is to familiarize nurses with several frequently ordered genetic tests, including chromosome analysis and fluorescence in situ hybridization, followed by a comprehensive review of chromosome microarray. It describes the complexity of microarray testing, including how testing is performed and how results are analyzed. A case report demonstrates how this technology is applied in clinical practice and reveals the benefits and limitations of this scientific and bioinformatics genetic technology. Clinical implications for maternal-child nurses across practice levels are discussed. PMID:27276104

  3. Autonomous system for Web-based microarray image analysis.

    PubMed

    Bozinov, Daniel

    2003-12-01

Software-based feature extraction from DNA microarray images still requires human intervention on various levels. Manual adjustment of grid and metagrid parameters, precise alignment of superimposed grid templates and gene spots, or simply identification of large-scale artifacts have to be performed beforehand to reliably analyze DNA signals and correctly quantify their expression values. Ideally, a Web-based system with input solely confined to a single microarray image, and a data table as output containing measurements for all gene spots, would directly transform raw image data into abstracted gene expression tables. Sophisticated algorithms with advanced procedures for iterative correction can overcome imminent challenges in image processing. Introduced herein is an integrated software system with a Java-based interface on the client side that allows for decentralized access and enables scientists to instantly employ the most up-to-date software version at any given time. This software tool extends PixClust, as used in Extractiff, and incorporates Java Web Start deployment technology. Ultimately, this setup is destined for high-throughput pipelines in genome-wide medical diagnostics labs or microarray core facilities aimed at providing fully automated service to their users. PMID:15376911
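Automated grid finding of the kind described above is often bootstrapped from intensity projections: summing the image along rows and columns makes spot rows and columns appear as peaks. A toy sketch on a synthetic image (a generic illustration, not this system's algorithm):

```python
import numpy as np

def grid_lines(profile, thresh):
    """Return centres of above-threshold runs in a 1-D intensity projection,
    a minimal stand-in for locating spot rows/columns on a microarray image."""
    mask = profile > thresh
    centres, start = [], None
    for i, m in enumerate(mask):
        if m and start is None:
            start = i                      # run begins
        elif not m and start is not None:
            centres.append((start + i - 1) // 2)   # run ends: record its centre
            start = None
    if start is not None:
        centres.append((start + len(mask) - 1) // 2)
    return centres

# Synthetic image: a 2 x 2 grid of bright 5x5 spots on a dark background
img = np.zeros((40, 40))
for r in (10, 30):
    for c in (10, 30):
        img[r - 2:r + 3, c - 2:c + 3] = 1.0

cols = grid_lines(img.sum(axis=0), thresh=1.0)   # column projection
rows = grid_lines(img.sum(axis=1), thresh=1.0)   # row projection
print(rows, cols)
```

Real images need background correction and robustness to rotation, which is where the iterative correction procedures mentioned above come in.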

  4. A comparison of fully automated methods of data analysis and computer assisted heuristic methods in an electrode kinetic study of the pathologically variable [Fe(CN)6](3-/4-) process by AC voltammetry.

    PubMed

    Morris, Graham P; Simonov, Alexandr N; Mashkina, Elena A; Bordas, Rafel; Gillow, Kathryn; Baker, Ruth E; Gavaghan, David J; Bond, Alan M

    2013-12-17

    Fully automated and computer assisted heuristic data analysis approaches have been applied to a series of AC voltammetric experiments undertaken on the [Fe(CN)6](3-/4-) process at a glassy carbon electrode in 3 M KCl aqueous electrolyte. The recovered parameters in all forms of data analysis encompass E(0) (reversible potential), k(0) (heterogeneous charge transfer rate constant at E(0)), α (charge transfer coefficient), Ru (uncompensated resistance), and Cdl (double layer capacitance). The automated method of analysis employed time domain optimization and Bayesian statistics. This and all other methods assumed the Butler-Volmer model applies for electron transfer kinetics, planar diffusion for mass transport, Ohm's Law for Ru, and a potential-independent Cdl model. Heuristic approaches utilize combinations of Fourier Transform filtering, sensitivity analysis, and simplex-based forms of optimization applied to resolved AC harmonics and rely on experimenter experience to assist in experiment-theory comparisons. Remarkable consistency of parameter evaluation was achieved, although the fully automated time domain method provided consistently higher α values than those based on frequency domain data analysis. The origin of this difference is that the implemented fully automated method requires a perfect model for the double layer capacitance. In contrast, the importance of imperfections in the double layer model is minimized when analysis is performed in the frequency domain. Substantial variation in k(0) values was found by analysis of the 10 data sets for this highly surface-sensitive pathologically variable [Fe(CN)6](3-/4-) process, but remarkably, all fit the quasi-reversible model satisfactorily. PMID:24160752
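The Butler-Volmer model assumed in all of these analyses relates current to overpotential through the transfer coefficient α. A minimal sketch with illustrative parameter values (not fitted values from the study):

```python
import math

F = 96485.33    # C/mol, Faraday constant
R = 8.314462    # J/(mol K), gas constant
T = 298.15      # K

def butler_volmer(eta, i0=1e-3, alpha=0.5, n=1):
    """Current density (A/cm^2) at overpotential eta (V) under the
    Butler-Volmer model; i0 is the exchange current density."""
    f = n * F / (R * T)
    # anodic branch minus cathodic branch
    return i0 * (math.exp(alpha * f * eta) - math.exp(-(1 - alpha) * f * eta))

print(butler_volmer(0.0))                       # zero net current at equilibrium
print(round(butler_volmer(0.05, alpha=0.5), 6))
```

The parameter-recovery problem the paper addresses is the inverse of this forward model: inferring i0 (hence k0) and α from measured current harmonics.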

  5. DNA Microarrays

    NASA Astrophysics Data System (ADS)

    Nguyen, C.; Gidrol, X.

Genomics has revolutionised biological and biomedical research. This revolution was predictable on the basis of its two driving forces: the ever increasing availability of genome sequences and the development of new technology able to exploit them. Until recently, technical limitations meant that molecular biology could only analyse one or two parameters per experiment, providing relatively little information compared with the great complexity of the systems under investigation. This gene-by-gene approach is inadequate for understanding biological systems containing several thousand genes. It is essential to have an overall view of the DNA, RNA, and relevant proteins. A simple inventory of the genome is not sufficient to understand the functions of the genes, or indeed the way that cells and organisms work. For this purpose, functional studies based on whole genomes are needed. Among these new large-scale methods of molecular analysis, DNA microarrays provide a way of studying the genome and the transcriptome. The idea of integrating a large amount of data on a support of very small area has led biologists to call these devices chips, borrowing the term from the microelectronics industry. At the beginning of the 1990s, the development of DNA chips on nylon membranes [1, 2], then on glass [3] and silicon [4] supports, made it possible for the first time to carry out simultaneous measurements of the equilibrium concentration of all the messenger RNA (mRNA) or transcribed RNA in a cell. These microarrays offer a wide range of applications, in both fundamental and clinical research, providing a method for genome-wide characterisation of changes occurring within a cell or tissue, as for example in polymorphism studies, detection of mutations, and quantitative assays of gene copies. With regard to the transcriptome, they provide a way of characterising differentially expressed genes, profiling given biological states, and identifying regulatory pathways.

  6. Feasibility and User Perception of a Fully Automated Push-Based Multiple-Session Alcohol Intervention for University Students: Randomized Controlled Trial

    PubMed Central

    2014-01-01

    Background In recent years, many electronic health behavior interventions have been developed in order to reach individuals with unhealthy behaviors, such as risky drinking. This is especially relevant for university students, many of whom are risky drinkers. Objective This study explored the acceptability and feasibility in a nontreatment-seeking group of university students (including both risk and nonrisk drinkers), of a fully automated, push-based, multiple-session, alcohol intervention, comparing two modes of delivery by randomizing participants to receive the intervention either by SMS text messaging (short message service, SMS) or by email. Methods A total of 5499 students at Luleå University in northern Sweden were invited to participate in a single-session alcohol assessment and feedback intervention; 28.04% (1542/5499) students completed this part of the study. In total, 29.44% (454/1542) of those participating in the single-session intervention accepted to participate further in the extended multiple-session intervention lasting for 4 weeks. The students were randomized to receive the intervention messages via SMS or email. A follow-up questionnaire was sent immediately after the intervention and 52.9% (240/454) responded. Results No difference was seen regarding satisfaction with the length and frequency of the intervention, regardless of the mode of delivery. Approximately 15% in both the SMS (19/136) and email groups (15/104) would have preferred the other mode of delivery. On the other hand, more students in the SMS group (46/229, 20.1%) stopped participating in the intervention during the 4-week period compared with the email group (10/193, 5.2%). Most students in both groups expressed satisfaction with the content of the messages and would recommend the intervention to a fellow student in need of reducing drinking. 
A striking difference was seen regarding when a message was read; 88.2% (120/136) of the SMS group read the messages within 1 hour in

  7. Nucleosome positioning from tiling microarray data

    PubMed Central

    Yassour, Moran; Kaplan, Tommy; Jaimovich, Ariel; Friedman, Nir

    2008-01-01

    Motivation: The packaging of DNA around nucleosomes in eukaryotic cells plays a crucial role in regulation of gene expression, and other DNA-related processes. To better understand the regulatory role of nucleosomes, it is important to pinpoint their position in a high (5–10 bp) resolution. Toward this end, several recent works used dense tiling arrays to map nucleosomes in a high-throughput manner. These data were then parsed and hand-curated, and the positions of nucleosomes were assessed. Results: In this manuscript, we present a fully automated algorithm to analyze such data and predict the exact location of nucleosomes. We introduce a method, based on a probabilistic graphical model, to increase the resolution of our predictions even beyond that of the microarray used. We show how to build such a model and how to compile it into a simple Hidden Markov Model, allowing for a fast and accurate inference of nucleosome positions. We applied our model to nucleosomal data from mid-log yeast cells reported by Yuan et al. and compared our predictions to those of the original paper; to a more recent method that uses five times denser tiling arrays as explained by Lee et al.; and to a curated set of literature-based nucleosome positions. Our results suggest that by applying our algorithm to the same data used by Yuan et al. our fully automated model traced 13% more nucleosomes, and increased the overall accuracy by about 20%. We believe that such an improvement opens the way for a better understanding of the regulatory mechanisms controlling gene expression, and how they are encoded in the DNA. Contact: nir@cs.huji.ac.il PMID:18586706
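Compiling the graphical model into a Hidden Markov Model, as described above, makes inference a standard Viterbi decode. A toy two-state sketch; the states, probabilities, and discretized probe signal are illustrative, not the paper's model:

```python
import math

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely hidden state path (log-space Viterbi)."""
    V = [{s: math.log(start_p[s]) + math.log(emit_p[s][obs[0]]) for s in states}]
    path = {s: [s] for s in states}
    for o in obs[1:]:
        V.append({})
        new_path = {}
        for s in states:
            best_prev = max(states, key=lambda p: V[-2][p] + math.log(trans_p[p][s]))
            V[-1][s] = (V[-2][best_prev] + math.log(trans_p[best_prev][s])
                        + math.log(emit_p[s][o]))
            new_path[s] = path[best_prev] + [s]
        path = new_path
    last = max(states, key=lambda s: V[-1][s])
    return path[last]

states = ("nuc", "link")                  # nucleosome-bound vs linker DNA
start = {"nuc": 0.5, "link": 0.5}
trans = {"nuc": {"nuc": 0.8, "link": 0.2},    # sticky states model run lengths
         "link": {"nuc": 0.2, "link": 0.8}}
emit = {"nuc": {"hi": 0.9, "lo": 0.1},        # occupied probes tend to read high
        "link": {"hi": 0.1, "lo": 0.9}}

probes = ["hi", "hi", "hi", "lo", "lo", "hi", "hi"]   # discretized tiling signal
decoded = viterbi(probes, states, start, trans, emit)
print(decoded)
```

The sticky transition probabilities play the role of nucleosome-length constraints: isolated low probes are absorbed into a nucleosome call, while sustained low stretches flip the path to linker.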

  8. A robotics platform for automated batch fabrication of high density, microfluidics-based DNA microarrays, with applications to single cell, multiplex assays of secreted proteins

    PubMed Central

    Ahmad, Habib; Sutherland, Alex; Shin, Young Shik; Hwang, Kiwook; Qin, Lidong; Krom, Russell-John; Heath, James R.

    2011-01-01

    Microfluidics flow-patterning has been utilized for the construction of chip-scale miniaturized DNA and protein barcode arrays. Such arrays have been used for specific clinical and fundamental investigations in which many proteins are assayed from single cells or other small sample sizes. However, flow-patterned arrays are hand-prepared, and so are impractical for broad applications. We describe an integrated robotics/microfluidics platform for the automated preparation of such arrays, and we apply it to the batch fabrication of up to eighteen chips of flow-patterned DNA barcodes. The resulting substrates are comparable in quality with hand-made arrays and exhibit excellent substrate-to-substrate consistency. We demonstrate the utility and reproducibility of robotics-patterned barcodes by utilizing two flow-patterned chips for highly parallel assays of a panel of secreted proteins from single macrophage cells. PMID:21974603

  9. Regenerable immuno-biochip for screening ochratoxin A in green coffee extract using an automated microarray chip reader with chemiluminescence detection.

    PubMed

    Sauceda-Friebe, Jimena C; Karsunke, Xaver Y Z; Vazac, Susanna; Biselli, Scarlett; Niessner, Reinhard; Knopp, Dietmar

    2011-03-18

Ochratoxin A (OTA) can contaminate foodstuffs in the ppb to ppm range and, once formed, it is difficult to remove. Because of its toxicity and potential risks to human health, the need exists for rapid, efficient detection methods that comply with legal maximum residual limits. In this work we have synthesized an OTA conjugate functionalized with a water-soluble peptide for covalent immobilization on a glass biochip by means of contact spotting. The chip was used for OTA determination with an indirect competitive immunoassay format with flow-through reagent addition and chemiluminescence detection, carried out with the stand-alone automated Munich Chip Reader 3 (MCR 3) platform. A buffer model and real green coffee extracts were used for this purpose. At present, covalent conjugate immobilization allowed for at least 20 assay-regeneration cycles of the biochip surface. The total analysis time for a single sample, including measurement and surface regeneration, was 12 min and the LOQ of OTA in green coffee extract was 0.3 μg L(-1), which corresponds to 7 μg kg(-1). PMID:21397079
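Indirect competitive immunoassays of this kind are typically calibrated with a four-parameter logistic (4PL) curve, which is then inverted to read concentration from signal. A sketch with hypothetical parameters, not the MCR 3 calibration:

```python
def four_pl(x, a, b, c, d):
    """4-parameter logistic: a = max signal (no OTA), d = background,
    c = midpoint (IC50), b = slope. In the indirect competitive format
    the chemiluminescence signal falls as analyte concentration rises."""
    return d + (a - d) / (1 + (x / c) ** b)

def invert_4pl(y, a, b, c, d):
    """Estimate concentration from a measured signal."""
    return c * ((a - d) / (y - d) - 1) ** (1 / b)

# Hypothetical calibration parameters (illustrative only)
a, b, c, d = 1000.0, 1.0, 2.0, 50.0    # signal units; c in ug/L
y = four_pl(0.3, a, b, c, d)           # signal expected at 0.3 ug/L OTA
print(round(y, 1), round(invert_4pl(y, a, b, c, d), 3))
```

The LOQ reported above corresponds to the lowest concentration whose back-calculated value stays within acceptable precision on such a curve.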

  10. PMD: A Resource for Archiving and Analyzing Protein Microarray data

    PubMed Central

    Xu, Zhaowei; Huang, Likun; Zhang, Hainan; Li, Yang; Guo, Shujuan; Wang, Nan; Wang, Shi-hua; Chen, Ziqing; Wang, Jingfang; Tao, Sheng-ce

    2016-01-01

Protein microarray is a powerful technology for both basic research and clinical study. However, because there is no database specifically tailored for protein microarrays, the majority of the valuable original protein microarray data is still not publicly accessible. To address this issue, we constructed the Protein Microarray Database (PMD), which is specifically designed for archiving and analyzing protein microarray data. In PMD, users can easily browse and search the entire database by experimental name, protein microarray type, and sample information. Additionally, PMD integrates several data analysis tools and provides an automated data analysis pipeline for users. With just one click, users can obtain a comprehensive analysis report for their protein microarray data. The report includes preliminary data analysis, such as data normalization, candidate identification, and an in-depth bioinformatics analysis of the candidates, which includes functional annotation, pathway analysis, and protein-protein interaction network analysis. PMD is now freely available at www.proteinmicroarray.cn. PMID:26813635

  11. SaDA: From Sampling to Data Analysis—An Extensible Open Source Infrastructure for Rapid, Robust and Automated Management and Analysis of Modern Ecological High-Throughput Microarray Data

    PubMed Central

    Singh, Kumar Saurabh; Thual, Dominique; Spurio, Roberto; Cannata, Nicola

    2015-01-01

One of the most crucial characteristics of day-to-day laboratory information management is the collection, storage and retrieval of information about research subjects and environmental or biomedical samples. An efficient link between sample data and experimental results is essential for the successful outcome of a collaborative project. Currently available software solutions are largely limited to large-scale, expensive commercial Laboratory Information Management Systems (LIMS). Acquiring such a LIMS can indeed bring laboratory information management to a higher level, but most of the time this requires a substantial investment of money, time and technical effort. There is a clear need for a lightweight open source system which can easily be managed on local servers and handled by individual researchers. Here we present software named SaDA for storing, retrieving and analyzing data originating from microorganism monitoring experiments. SaDA is fully integrated in the management of environmental samples, oligonucleotide sequences, microarray data and the subsequent downstream analysis procedures. It is simple and generic software, and can be extended and customized for various environmental and biomedical studies. PMID:26047146

  12. SaDA: From Sampling to Data Analysis-An Extensible Open Source Infrastructure for Rapid, Robust and Automated Management and Analysis of Modern Ecological High-Throughput Microarray Data.

    PubMed

    Singh, Kumar Saurabh; Thual, Dominique; Spurio, Roberto; Cannata, Nicola

    2015-06-01

One of the most crucial characteristics of day-to-day laboratory information management is the collection, storage and retrieval of information about research subjects and environmental or biomedical samples. An efficient link between sample data and experimental results is essential for the successful outcome of a collaborative project. Currently available software solutions are largely limited to large-scale, expensive commercial Laboratory Information Management Systems (LIMS). Acquiring such a LIMS can indeed bring laboratory information management to a higher level, but most of the time this requires a substantial investment of money, time and technical effort. There is a clear need for a lightweight open source system which can easily be managed on local servers and handled by individual researchers. Here we present software named SaDA for storing, retrieving and analyzing data originating from microorganism monitoring experiments. SaDA is fully integrated in the management of environmental samples, oligonucleotide sequences, microarray data and the subsequent downstream analysis procedures. It is simple and generic software, and can be extended and customized for various environmental and biomedical studies. PMID:26047146

  13. Fully Automated Renal Tissue Volumetry in MR Volume Data Using Prior-Shape-Based Segmentation in Subject-Specific Probability Maps.

    PubMed

    Gloger, Oliver; Tönnies, Klaus; Laqua, Rene; Völzke, Henry

    2015-10-01

Organ segmentation in magnetic resonance (MR) volume data is of increasing interest in epidemiological studies and clinical practice. Especially in large-scale population-based studies, organ volumetry is highly relevant, requiring exact organ segmentation. Since manual segmentation is time consuming and prone to reader variability, large-scale studies need automatic methods to perform organ segmentation. In this paper, we present an automated framework for renal tissue segmentation that computes renal parenchyma, cortex, and medulla volumetry in native MR volume data without any user interaction. We introduce a novel strategy of subject-specific probability map computation for renal tissue types, which takes inter- and intra-MR-intensity variability into account. Several kinds of tissue-related 2-D and 3-D prior-shape knowledge are incorporated in modularized framework parts to segment renal parenchyma in a final level set segmentation strategy. Subject-specific probabilities for medulla and cortex tissue are applied in a fuzzy clustering technique to delineate cortex and medulla tissue inside segmented parenchyma regions. The novel subject-specific computation approach provides clearly better tissue probability map quality than existing methods. Compared with existing methods, the framework provides improved results for parenchyma segmentation. Furthermore, cortex and medulla segmentation quality is very promising but cannot be compared with existing methods, since state-of-the-art methods for automated cortex and medulla segmentation in native MR volume data are still missing. PMID:25915954
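The fuzzy clustering step used to split cortex from medulla can be illustrated with a minimal 1-D fuzzy c-means on synthetic intensities. This is a generic textbook sketch, not the framework's implementation:

```python
import numpy as np

def fuzzy_cmeans(x, k=2, m=2.0, iters=100, seed=0):
    """Minimal 1-D fuzzy c-means: returns (centres, membership matrix).
    u[i, j] is the degree to which sample i belongs to cluster j."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(x), k))
    u /= u.sum(axis=1, keepdims=True)          # rows sum to 1
    for _ in range(iters):
        w = u ** m                             # fuzzified memberships
        centres = (w.T @ x) / w.sum(axis=0)    # weighted means
        d = np.abs(x[:, None] - centres[None, :]) + 1e-12
        u = 1.0 / (d ** (2 / (m - 1)))         # inverse-distance memberships
        u /= u.sum(axis=1, keepdims=True)
    return centres, u

# Two synthetic intensity populations (e.g. darker medulla, brighter cortex)
x = np.concatenate([np.full(50, 100.0), np.full(50, 200.0)])
centres, u = fuzzy_cmeans(x)
print(sorted(np.round(centres, 1)))
```

Unlike hard k-means, the membership matrix preserves graded assignments, which suits voxels near the cortex-medulla boundary.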

  14. Development of a high-throughput screening for nerve agent detoxifying materials using a fully-automated robot-assisted biological assay.

    PubMed

    Wille, T; Thiermann, H; Worek, F

    2010-04-01

    Developing improved medical countermeasures against chemical warfare agents (nerve agents) is urgently needed but time-consuming and costly. Here we introduce a robot-assisted liquid handling system with warming, cooling and incubating facilities to screen the detoxifying properties of biological and chemical materials against nerve agents. Two biological tests were established and plasma from various species, DFPase and three cyclodextrins were used as test materials. In test 1, plasma was mixed with sarin or VX and the inhibitory potency of the incubate was determined with human acetylcholinesterase (AChE) at 0, 30 and 60 min. In test 2, test materials and nerve agents were mixed and incubated. Between 0 and 40 min samples were taken and incubated for 3 min with AChE and the residual AChE inhibition was determined to enable the semi-quantitative evaluation of the detoxification kinetics. The automated assays proved to be highly reproducible. It was possible to pre-select detoxifying reagents with test 1 and to determine more detailed detoxifying kinetics with test 2. In conclusion, the automated assay may be considered as a versatile tool for the high-throughput screening of potential detoxifying materials against different nerve agents. With this two-step assay it is possible to screen effectively for detoxifying materials in a high-throughput system. PMID:19961920
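The semi-quantitative detoxification kinetics measured in test 2 can be summarized by a pseudo-first-order rate constant fitted to the residual inhibition time course. A sketch on illustrative numbers (not data from the study):

```python
import math

# Hypothetical residual AChE inhibition (%) of a nerve-agent incubate
# sampled over time, as in test 2 (values are illustrative).
t = [0, 10, 20, 30, 40]                   # min
inhib = [90.0, 54.6, 33.1, 20.1, 12.2]    # % residual inhibition

# Pseudo-first-order detoxification: ln(inhib) = ln(inhib0) - k * t
logs = [math.log(v) for v in inhib]
n = len(t)
mt, ml = sum(t) / n, sum(logs) / n
k = -sum((a - mt) * (b - ml) for a, b in zip(t, logs)) / \
     sum((a - mt) ** 2 for a in t)
half_life = math.log(2) / k               # detoxification half-time
print(round(k, 3), round(half_life, 1))
```

Ranking candidate materials by k (or half-life) is one way such a screen turns the raw time courses into a shortlist.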

  15. The Genopolis Microarray Database

    PubMed Central

    Splendiani, Andrea; Brandizi, Marco; Even, Gael; Beretta, Ottavio; Pavelka, Norman; Pelizzola, Mattia; Mayhaus, Manuel; Foti, Maria; Mauri, Giancarlo; Ricciardi-Castagnoli, Paola

    2007-01-01

Background Gene expression databases are key resources for microarray data management and analysis and the importance of a proper annotation of their content is well understood. Public repositories as well as microarray database systems that can be implemented by single laboratories exist. However, there is not yet a tool that can easily support a collaborative environment where different users with different rights of access to data can interact to define a common highly coherent content. The scope of the Genopolis database is to provide a resource that allows different groups performing microarray experiments related to a common subject to create a common coherent knowledge base and to analyse it. The Genopolis database has been implemented as a dedicated system for the scientific community studying dendritic cell and macrophage functions and host-parasite interactions. Results The Genopolis Database system allows the community to build an object-based, MIAME-compliant annotation of their experiments and to store images, raw and processed data from the Affymetrix GeneChip® platform. It supports dynamic definition of controlled vocabularies and provides automated and supervised steps to control the coherence of data and annotations. It allows precise control of the visibility of the database content to different subgroups in the community and facilitates exports of its content to public repositories. It provides an interactive user interface for data analysis: this allows users to visualize data matrices based on functional lists and sample characterization, and to navigate to other data matrices defined by similarity of expression values as well as functional characterizations of the genes involved. A collaborative environment is also provided for the definition and sharing of functional annotation by users. Conclusion The Genopolis Database supports a community in building a common coherent knowledge base and analysing it. This fills a gap between a local

  16. Novel Microarrays for Simultaneous Serodiagnosis of Multiple Antiviral Antibodies

    PubMed Central

    Sivakumar, Ponnurengam Malliappan; Moritsugu, Nozomi; Obuse, Sei; Isoshima, Takashi; Tashiro, Hideo; Ito, Yoshihiro

    2013-01-01

We developed an automated diagnostic system for the detection of virus-specific immunoglobulin Gs (IgGs) that was based on a microarray platform. We compared efficacies of our automated system with conventional enzyme immunoassays (EIAs). Viruses were immobilized to microarrays using a radical cross-linking reaction that was induced by photo-irradiation. A new photoreactive polymer containing perfluorophenyl azide (PFPA) and poly(ethylene glycol) methacrylate was prepared and coated on plates. Inactivated measles, rubella, mumps, and Varicella-Zoster viruses and recombinant Epstein-Barr virus antigen were added to coated plates, and irradiated with ultraviolet light to facilitate immobilization. Virus-specific IgGs in healthy human sera were assayed using these prepared microarrays and the results obtained compared with those from conventional EIAs. We observed high correlation (0.79–0.96) in the results between the automated microarray technique and EIAs. The microarray-based assay was more rapid, required fewer reagents and less sample, and was easier to conduct than conventional EIA techniques. The automated microarray system was further improved by introducing reagent storage reservoirs inside the chamber, thereby conserving the use of expensive reagents and antibodies. We considered the microarray format to be suitable for rapid and multiple serological diagnoses of viral diseases that could be developed further for clinical applications. PMID:24367491
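The reported 0.79–0.96 agreement between microarray and EIA readouts is a correlation over paired measurements, computable directly as a Pearson coefficient. The paired values below are hypothetical:

```python
import math

def pearson_r(x, y):
    """Pearson correlation between paired assay readouts."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = sum((a - mx) ** 2 for a in x)
    sy = sum((b - my) ** 2 for b in y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return sxy / math.sqrt(sx * sy)

# Hypothetical paired titres: microarray signal vs EIA value for one virus
microarray = [0.12, 0.35, 0.50, 0.81, 0.95, 1.30]
eia = [10.0, 31.0, 47.0, 75.0, 99.0, 121.0]
r = pearson_r(microarray, eia)
print(round(r, 3))
```

For method-comparison studies, a Bland-Altman plot of the paired differences is a common complement to the correlation coefficient.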

  17. Novel microarrays for simultaneous serodiagnosis of multiple antiviral antibodies.

    PubMed

    Sivakumar, Ponnurengam Malliappan; Moritsugu, Nozomi; Obuse, Sei; Isoshima, Takashi; Tashiro, Hideo; Ito, Yoshihiro

    2013-01-01

We developed an automated diagnostic system for the detection of virus-specific immunoglobulin Gs (IgGs) that was based on a microarray platform. We compared efficacies of our automated system with conventional enzyme immunoassays (EIAs). Viruses were immobilized to microarrays using a radical cross-linking reaction that was induced by photo-irradiation. A new photoreactive polymer containing perfluorophenyl azide (PFPA) and poly(ethylene glycol) methacrylate was prepared and coated on plates. Inactivated measles, rubella, mumps, and Varicella-Zoster viruses and recombinant Epstein-Barr virus antigen were added to coated plates, and irradiated with ultraviolet light to facilitate immobilization. Virus-specific IgGs in healthy human sera were assayed using these prepared microarrays and the results obtained compared with those from conventional EIAs. We observed high correlation (0.79-0.96) in the results between the automated microarray technique and EIAs. The microarray-based assay was more rapid, required fewer reagents and less sample, and was easier to conduct than conventional EIA techniques. The automated microarray system was further improved by introducing reagent storage reservoirs inside the chamber, thereby conserving the use of expensive reagents and antibodies. We considered the microarray format to be suitable for rapid and multiple serological diagnoses of viral diseases that could be developed further for clinical applications. PMID:24367491

  18. Microarrays, Integrated Analytical Systems

    NASA Astrophysics Data System (ADS)

    Combinatorial chemistry is used to find materials that form sensor microarrays. This book discusses the fundamentals and then proceeds to the many applications of microarrays, from measuring gene expression (DNA microarrays) to protein-protein interactions, peptide chemistry, carbohydrate chemistry, electrochemical detection, and microfluidics.

  19. A fully automated effervescence assisted dispersive liquid-liquid microextraction based on a stepwise injection system. Determination of antipyrine in saliva samples.

    PubMed

    Medinskaia, Kseniia; Vakh, Christina; Aseeva, Darina; Andruch, Vasil; Moskvin, Leonid; Bulatov, Andrey

    2016-01-01

    A first attempt to automate effervescence-assisted dispersive liquid-liquid microextraction (EA-DLLME) is reported. The method is based on the aspiration of a sample and all required aqueous reagents into the stepwise injection analysis (SWIA) manifold, followed by simultaneous counterflow injection of the extraction solvent (dichloromethane) and a mixture of the effervescence agent (0.5 mol L⁻¹ Na2CO3) and the proton-donor solution (1 mol L⁻¹ CH3COOH). Carbon dioxide microbubbles generated in situ disperse the extraction solvent throughout the aqueous sample and extract the analyte into the organic phase. Unlike conventional DLLME, EA-DLLME avoids both the addition of a dispersive solvent and the time-consuming centrifugation step needed to disrupt the cloudy state. Phase separation was achieved by gentle bubbling of a nitrogen stream (2 mL min⁻¹ for 2 min). The performance of the suggested approach is demonstrated by the determination of antipyrine in saliva samples. The procedure is based on the derivatization of antipyrine by nitrite ion, followed by EA-DLLME of 4-nitrosoantipyrine and subsequent UV-Vis detection using the SWIA manifold. The absorbance of the yellow-colored extract at 345 nm obeys Beer's law in the range of 1.5-100 µmol L⁻¹ of antipyrine in saliva. The LOD, calculated from a blank test based on 3σ, was 0.5 µmol L⁻¹. PMID:26703262
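    The abstract derives its LOD from a blank test based on 3σ, and its calibration from Beer's-law linearity. A common way to combine the two is LOD = 3·s(blank)/slope of the calibration line; a sketch with made-up calibration and blank data (not the paper's values):

```python
def linear_fit(conc, absorbance):
    """Least-squares slope and intercept for a Beer's-law calibration line."""
    n = len(conc)
    mx = sum(conc) / n
    my = sum(absorbance) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(conc, absorbance))
             / sum((x - mx) ** 2 for x in conc))
    return slope, my - slope * mx

def lod_3sigma(blank_signals, slope):
    """LOD estimated as 3 * (sample standard deviation of blanks) / slope."""
    n = len(blank_signals)
    m = sum(blank_signals) / n
    sd = (sum((s - m) ** 2 for s in blank_signals) / (n - 1)) ** 0.5
    return 3 * sd / slope

# Hypothetical calibration: absorbance at 345 nm vs. antipyrine (µmol/L)
conc    = [1.5, 10, 25, 50, 100]
absorb  = [0.012, 0.080, 0.200, 0.400, 0.800]
slope, intercept = linear_fit(conc, absorb)

# Hypothetical replicate blank absorbances
blanks = [0.0010, 0.0014, 0.0008, 0.0012, 0.0011]
lod = lod_3sigma(blanks, slope)
```

    With the illustrative numbers above, the slope is about 0.008 absorbance units per µmol L⁻¹ and the computed LOD falls well below the lowest calibration standard, which is the behaviour a usable 3σ LOD should show.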

  20. Automation or De-automation

    NASA Astrophysics Data System (ADS)

    Gorlach, Igor; Wessel, Oliver

    2008-09-01

    In the global automotive industry, for decades, vehicle manufacturers have continually increased the level of automation of production systems in order to be competitive. However, there is a new trend to decrease the level of automation, especially in final car assembly, for reasons of economy and flexibility. In this research, the final car assembly lines at three production sites of Volkswagen are analysed in order to determine the best level of automation for each, in terms of manufacturing costs, productivity, quality and flexibility. The case study is based on the methodology proposed by the Fraunhofer Institute. The results of the analysis indicate that fully automated assembly systems are not necessarily the best option in terms of cost, productivity and quality combined, which is attributed to the high complexity of final car assembly systems; some de-automation is therefore recommended. On the other hand, the analysis shows that low automation can result in poor product quality for reasons related to plant location, such as inadequate worker skills, motivation, etc. Hence, the automation strategy should be formulated on the basis of an analysis of all relevant aspects of the manufacturing process, such as costs, quality, productivity and flexibility, in relation to the local context. A more balanced combination of automated and manual assembly operations provides better utilisation of equipment, reduces production costs and improves throughput.

  1. MARS: Microarray analysis, retrieval, and storage system

    PubMed Central

    Maurer, Michael; Molidor, Robert; Sturn, Alexander; Hartler, Juergen; Hackl, Hubert; Stocker, Gernot; Prokesch, Andreas; Scheideler, Marcel; Trajanoski, Zlatko

    2005-01-01

    Background Microarray analysis has become a widely used technique for the study of gene-expression patterns on a genomic scale. As more and more laboratories adopt microarray technology, there is a need for powerful and easy-to-use microarray databases facilitating array fabrication, labeling, hybridization, and data analysis. The wealth of data generated by this high-throughput approach renders adequate database and analysis tools crucial for the pursuit of insights into the transcriptomic behavior of cells. Results MARS (Microarray Analysis and Retrieval System) provides a comprehensive MIAME-supportive suite for storing, retrieving, and analyzing multi-color microarray data. The system comprises a laboratory information management system (LIMS), quality-control management, and a sophisticated user management system. MARS is fully integrated into an analytical pipeline of microarray image analysis, normalization, gene-expression clustering, and mapping of gene-expression data onto biological pathways. The incorporation of ontologies and the use of MAGE-ML enable the export of studies stored in MARS to public repositories and other databases accepting these documents. Conclusion We have developed an integrated system tailored to the specific needs of microarray-based research projects, using a unique fusion of Web-based and standalone applications connected to the latest J2EE application server technology. The presented system is freely available for academic and non-profit institutions. More information can be found at . PMID:15836795

  2. Detection of BRAF Mutations Using a Fully Automated Platform and Comparison with High Resolution Melting, Real-Time Allele Specific Amplification, Immunohistochemistry and Next Generation Sequencing Assays, for Patients with Metastatic Melanoma

    PubMed Central

    Harlé, Alexandre; Salleron, Julia; Franczak, Claire; Dubois, Cindy; Filhine-Tressarieu, Pierre; Leroux, Agnès; Merlin, Jean-Louis

    2016-01-01

    Background Metastatic melanoma is a severe disease with one of the highest mortality rates among skin diseases. Overall survival has significantly improved with immunotherapy and targeted therapies, and kinase inhibitors targeting BRAF V600 have shown promising results. BRAF genotyping is mandatory for the prescription of anti-BRAF therapies. Methods Fifty-nine formalin-fixed, paraffin-embedded melanoma samples were assessed using high-resolution melting (HRM) PCR, real-time allele-specific amplification (RT-ASA) PCR, next-generation sequencing (NGS), immunohistochemistry (IHC) and the fully automated molecular diagnostics platform Idylla™. Sensitivity, specificity, positive predictive value and negative predictive value were calculated using NGS as the reference standard to compare the different assays. Results BRAF mutations were found in 28 (47.5%), 29 (49.2%), 31 (52.5%), 29 (49.2%) and 27 (45.8%) samples with HRM, RT-ASA, NGS, Idylla™ and IHC, respectively. Twenty-six (81.2%) samples were found to bear a c.1799T>A (p.Val600Glu) mutation, three (9.4%) a c.1798_1799delinsAA (p.Val600Lys) mutation and one a c.1789_1790delinsTC (p.Leu597Ser) mutation. Two samples were found to bear complex mutations. Conclusions HRM appears to be the least sensitive assay for the detection of BRAF V600 mutations. The RT-ASA, Idylla™ and IHC assays are suitable for routine molecular diagnostics aimed at the prescription of anti-BRAF therapies. The Idylla™ assay is fully automated, requires less than 2 minutes for sample preparation, and is the fastest of the tested assays. PMID:27111917
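    The study compares assays by sensitivity, specificity, PPV and NPV against NGS as the reference standard. These follow directly from confusion-matrix counts; a sketch with hypothetical counts (the abstract does not report the per-assay confusion matrices):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, PPV and NPV from confusion-matrix counts,
    computed against a reference standard (NGS, in the abstract)."""
    return {
        "sensitivity": tp / (tp + fn),  # true positives among reference-positives
        "specificity": tn / (tn + fp),  # true negatives among reference-negatives
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Hypothetical counts for one assay vs. NGS on 59 samples (31 NGS-positive)
m = diagnostic_metrics(tp=28, fp=0, tn=28, fn=3)
```

    With these illustrative counts, an assay missing three NGS-confirmed mutations but producing no false positives would have sensitivity 28/31 and specificity 1.0.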

  3. Quantification of 31 illicit and medicinal drugs and metabolites in whole blood by fully automated solid-phase extraction and ultra-performance liquid chromatography-tandem mass spectrometry.

    PubMed

    Bjørk, Marie Kjærgaard; Simonsen, Kirsten Wiese; Andersen, David Wederkinck; Dalsgaard, Petur Weihe; Sigurðardóttir, Stella Rögn; Linnet, Kristian; Rasmussen, Brian Schou

    2013-03-01

    An efficient method for analyzing illegal and medicinal drugs in whole blood, using fully automated sample preparation and a short ultra-high-performance liquid chromatography-tandem mass spectrometry (MS/MS) run time, is presented. A selection of 31 drugs, including amphetamines, cocaine, opioids, and benzodiazepines, was used. In order to increase the efficiency of routine analysis, a robotic system based on automated liquid handling and capable of handling all unit operations for sample preparation was built on a Freedom Evo 200 platform with several add-ons from Tecan and third-party vendors. Solid-phase extraction was performed using Strata X-C plates. Extraction time for 96 samples was less than 3 h. Chromatography was performed using an ACQUITY UPLC system (Waters Corporation, Milford, USA). Analytes were separated on a 100 mm × 2.1 mm, 1.7 μm ACQUITY UPLC CSH C18 column using a 6.5 min 0.1% ammonia (25%) in water/0.1% ammonia (25%) in methanol gradient and quantified by MS/MS (Waters Quattro Premier XE) in multiple-reaction monitoring mode. Full validation, including linearity, precision and trueness, matrix effect, ion suppression/enhancement of co-eluting analytes, recovery, and specificity, was performed. The method was employed successfully in the laboratory and used for routine analysis of forensic material. In combination with tetrahydrocannabinol analysis, the method covered 96% of cases involving driving under the influence of drugs. The manual labor involved in preparing blood samples, solvents, etc., was reduced to half an hour per batch. The automated sample preparation setup also minimized human exposure to hazardous materials, provided highly improved ergonomics, and eliminated manual pipetting. PMID:23292043

  4. Fully automated determination of nicotine and its major metabolites in whole blood by means of a DBS online-SPE LC-HR-MS/MS approach for sports drug testing.

    PubMed

    Tretzel, Laura; Thomas, Andreas; Piper, Thomas; Hedeland, Mikael; Geyer, Hans; Schänzer, Wilhelm; Thevis, Mario

    2016-05-10

    Dried blood spots (DBS) represent a sample matrix collected under minimally invasive, straightforward and robust conditions. DBS specimens have been shown to provide appropriate test material for different analytical disciplines, e.g., preclinical drug development, therapeutic drug monitoring, forensic toxicology and the diagnostic analysis of metabolic disorders in newborns. However, the sample preparation has occasionally been reported as laborious and time-consuming. In order to minimize the manual workload and to substantiate the suitability of DBS for high sample throughput, the automation of sample preparation processes is of paramount interest. In the current study, the development and validation of a fully automated DBS extraction method coupled to online solid-phase extraction is presented, using the example of nicotine, its major metabolites nornicotine, cotinine and trans-3'-hydroxycotinine, and the tobacco alkaloids anabasine and anatabine, based on the rationale that the use of nicotine-containing products for performance-enhancing purposes has been monitored by the World Anti-Doping Agency (WADA) for several years. Automation-derived DBS sample extracts were directed online to liquid chromatography high-resolution/high-mass-accuracy tandem mass spectrometry, and target analytes were determined with the support of four deuterated internal standards. Validation of the method yielded precise (CV < 7.5% for intraday and < 12.3% for interday measurements) and linear (r² > 0.998) results. The limit of detection was established at 5 ng mL⁻¹ for all studied compounds, the extraction recovery ranged from 25 to 44%, and no matrix effects were observed. To exemplify the applicability of the DBS online-SPE LC-MS/MS approach for sports drug testing purposes, the method was applied to authentic DBS samples obtained from smokers, snus users, and e-cigarette users. Statistical evaluation of the obtained results indicated differences in metabolic behavior depending on the route

  5. A fully automated method for simultaneous determination of aflatoxins and ochratoxin A in dried fruits by pressurized liquid extraction and online solid-phase extraction cleanup coupled to ultra-high-pressure liquid chromatography-tandem mass spectrometry.

    PubMed

    Campone, Luca; Piccinelli, Anna Lisa; Celano, Rita; Russo, Mariateresa; Valdés, Alberto; Ibáñez, Clara; Rastrelli, Luca

    2015-04-01

    According to current demands and future perspectives in food safety, this study reports a fast and fully automated analytical method for the simultaneous analysis of aflatoxins (AFs) and ochratoxin A (OTA), highly toxic and widespread mycotoxins, in dried fruits, a high-risk foodstuff. The method is based on pressurized liquid extraction (PLE), with aqueous methanol (30%) at 110 °C, of the slurried dried fruit, and online solid-phase extraction (online SPE) cleanup of the PLE extracts with a C18 cartridge. The purified sample was directly analysed by ultra-high-pressure liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) for sensitive and selective determination of AFs and OTA. The proposed analytical procedure was validated for different dried fruits (vine fruit, fig and apricot), providing method detection and quantification limits much lower than the AF and OTA maximum levels imposed by EU regulation for dried fruit intended for direct human consumption. Also, recoveries (83-103%) and repeatability (RSD < 8%, n = 3) meet the performance criteria required by EU regulation for the determination of mycotoxin levels in foodstuffs. The main advantage of the proposed method is the full automation of the whole analytical procedure, which reduces the time and cost of the analysis, sample manipulation and solvent consumption, enabling high-throughput analysis and highly accurate and precise results. PMID:25694147

  6. A Texture Based Pattern Recognition Approach to Distinguish Melanoma from Non-Melanoma Cells in Histopathological Tissue Microarray Sections

    PubMed Central

    Rexhepaj, Elton; Agnarsdóttir, Margrét; Bergman, Julia; Edqvist, Per-Henrik; Bergqvist, Michael; Uhlén, Mathias; Gallagher, William M.; Lundberg, Emma; Ponten, Fredrik

    2013-01-01

    Aims Immunohistochemistry is routine practice in clinical cancer diagnostics and an established technology for tissue-based biomarker discovery research. Tedious manual assessment of immunohistochemically stained tissue needs to be fully automated to take full advantage of the potential for high-throughput analysis enabled by tissue microarrays and digital pathology. Such automated tools also need to be reproducible across different experimental conditions and biomarker targets. In this study we present a novel supervised, melanoma-specific pattern recognition approach that is fully automated and quantitative. Methods and Results Melanoma samples were immunostained for the melanocyte-specific target Melan-A. Images representing immunostained melanoma tissue were then digitally processed to segment regions of interest, highlighting Melan-A-positive and -negative areas. Color deconvolution was applied to each region of interest to separate the channel containing the immunohistochemistry signal from the hematoxylin counterstaining channel. A support vector machine melanoma classification model was learned from a discovery cohort of melanoma patients (n = 264) and subsequently validated on an independent cohort of melanoma patient tissue sample images (n = 157). Conclusion We propose a novel method that takes advantage of an immunohistochemical marker highlighting melanocytes to fully automate the learning of a general melanoma cell classification model. The presented method can be applied to any protein of interest and thus provides a tool for quantification of immunohistochemistry-based protein expression in melanoma. PMID:23690928
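    The study learns a supervised classifier (an SVM in the paper) on image-derived features from a discovery cohort and validates it on an independent cohort. As a deliberately simplified, hypothetical stand-in for that pipeline — a perceptron instead of an SVM, on two made-up texture features — the train-on-discovery / validate-independently pattern looks like this:

```python
def train_perceptron(samples, labels, epochs=100, lr=0.1):
    """Simple linear classifier on feature vectors; labels are -1/+1.
    Illustrative stand-in for the SVM used in the study."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            # Update weights only on misclassified (or boundary) samples
            if y * (sum(wi * xi for wi, xi in zip(w, x)) + b) <= 0:
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1

# Hypothetical texture features (e.g., stain intensity, granularity)
discovery = [[0.9, 0.8], [0.8, 0.9], [0.2, 0.1], [0.1, 0.2]]
labels    = [1, 1, -1, -1]            # +1 melanoma, -1 non-melanoma
w, b = train_perceptron(discovery, labels)

# Independent validation sample, scored with the frozen model
validation_pred = predict(w, b, [0.85, 0.75])
```

    The essential discipline, as in the paper, is that the model parameters are frozen after the discovery cohort and the validation cohort is only ever scored, never trained on.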

  7. Fully automated trace level determination of parent and alkylated PAHs in environmental waters by online SPE-LC-APPI-MS/MS.

    PubMed

    Ramirez, Cesar E; Wang, Chengtao; Gardinali, Piero R

    2014-01-01

    Polycyclic aromatic hydrocarbons (PAHs) are ubiquitous compounds that enter the environment from natural and anthropogenic sources and are often used as markers to determine the extent, fate, and potential effects on natural resources after an accidental crude oil release. Gas chromatography-mass spectrometry after liquid-liquid extraction (LLE+GC-MS) has been extensively used to isolate and quantify both parent and alkylated PAHs. However, it requires labor-intensive extraction and cleanup steps and generates large amounts of toxic solvent waste. Therefore, there is a clear need for greener, faster techniques with enough reproducibility and sensitivity to quantify many PAHs in large numbers of water samples in a short period of time. This study combines online solid-phase extraction followed by liquid chromatography (LC) separation with dopant-assisted atmospheric pressure photoionization (APPI) and tandem MS detection, to provide a one-step protocol that detects PAHs at low nanograms per liter with almost no sample preparation and with a significantly lower consumption of toxic halogenated solvents. Water samples were amended with methanol, fortified with isotopically labeled PAHs, and loaded onto an online SPE column, using a large-volume sample loop with an auxiliary LC pump for sample preconcentration and salt removal. The loaded SPE column was connected to an UPLC pump and analytes were backflushed to a Thermo Hypersil Green PAH analytical column where a 20-min gradient separation was performed at a variable flow rate. Detection was performed by a triple-quadrupole MS equipped with a gas-phase dopant delivery system, using 1.50 mL of chlorobenzene dopant per run. In contrast, LLE+GC-MS typically uses 150 mL of organic solvent per sample, and methylene chloride is preferred because of its low boiling point; however, this solvent has a higher environmental persistence than chlorobenzene and is considered a carcinogen. The automated system is capable of

  8. Microarrays in hematology.

    PubMed

    Walker, Josef; Flower, Darren; Rigley, Kevin

    2002-01-01

    Microarrays are fast becoming routine tools for the high-throughput analysis of gene expression in a wide range of biologic systems, including hematology. Although a number of approaches can be taken when implementing microarray-based studies, all are capable of providing important insights into biologic function. Although some technical issues have not been resolved, microarrays will continue to make a significant impact on hematologically important research. PMID:11753074

  9. Evaluation of Surface Chemistries for Antibody Microarrays

    SciTech Connect

    Seurynck-Servoss, Shannon L.; White, Amanda M.; Baird, Cheryl L.; Rodland, Karin D.; Zangar, Richard C.

    2007-12-01

    Antibody microarrays are an emerging technology that promises to be a powerful tool for the detection of disease biomarkers. The current technology for protein microarrays has been primarily derived from DNA microarrays and is not fully characterized for use with proteins. For example, there are a myriad of surface chemistries that are commercially available for antibody microarrays, but no rigorous studies that compare these different surfaces. Therefore, we have used an enzyme-linked immunosorbent assay (ELISA) microarray platform to analyze 16 different commercially available slide types. Full standard curves were generated for 24 different assays. We found that this approach provides a rigorous and quantitative system for comparing the different slide types based on spot size and morphology, slide noise, spot background, lower limit of detection, and reproducibility. These studies demonstrate that the properties of the slide surface affect the activity of immobilized antibodies and the quality of data produced. Although many slide types can produce useful data, glass slides coated with poly-L-lysine or aminosilane, with or without activation with a crosslinker, consistently produce superior results in the ELISA microarray analyses we performed.
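    The slide comparison rests on full standard curves generated for each ELISA-microarray assay; concentrations in unknowns are then read back off the fitted curve. A minimal sketch of that read-back step, using linear interpolation on a monotonic standard curve (the data points are hypothetical; real ELISA curves are usually fitted with a four-parameter logistic model instead):

```python
def interpolate_concentration(signal, curve):
    """Estimate analyte concentration by linear interpolation on a
    monotonic standard curve given as (concentration, signal) points."""
    curve = sorted(curve)  # ascending concentration (signal rises with it)
    for (c0, s0), (c1, s1) in zip(curve, curve[1:]):
        if s0 <= signal <= s1:
            return c0 + (c1 - c0) * (signal - s0) / (s1 - s0)
    raise ValueError("signal outside calibrated range")

# Hypothetical ELISA-microarray standards: (pg/mL, fluorescence signal)
standards = [(0, 50), (10, 200), (100, 900), (1000, 3000)]
conc = interpolate_concentration(550, standards)
```

    Slide-type figures of merit such as the lower limit of detection then follow from where blank-derived signal thresholds land on this curve.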

  10. Microarrays: an overview.

    PubMed

    Lee, Norman H; Saeed, Alexander I

    2007-01-01

    Gene expression microarrays are being used widely to address a myriad of complex biological questions. To gather meaningful expression data, it is crucial to have a firm understanding of the steps involved in the application of microarrays. The available microarray platforms are discussed along with their advantages and disadvantages. Additional considerations include study design, quality control and systematic assessment of microarray performance, RNA-labeling strategies, sample allocation, signal amplification schemes, defining the number of appropriate biological replicates, data normalization, statistical approaches to identify differentially regulated genes, and clustering algorithms for data visualization. In this chapter, the underlying principles regarding microarrays are reviewed, to serve as a guide when navigating through this powerful technology. PMID:17332646

  11. An Overview of DNA Microarray Grid Alignment and Foreground Separation Approaches

    NASA Astrophysics Data System (ADS)

    Bajcsy, Peter

    2006-12-01

    This paper overviews DNA microarray grid alignment and foreground separation approaches. Microarray grid alignment and foreground separation are the basic processing steps of DNA microarray images that affect the quality of gene expression information, and hence impact our confidence in any data-derived biological conclusions. Thus, understanding microarray data processing steps becomes critical for performing optimal microarray data analysis. In the past, the grid alignment and foreground separation steps have not been covered extensively in the survey literature. We present several classifications of existing algorithms, and describe the fundamental principles of these algorithms. Challenges related to automation and reliability of processed image data are outlined at the end of this overview paper.

  12. Microarrays--status and prospects.

    PubMed

    Venkatasubbarao, Srivatsa

    2004-12-01

    Microarrays have become an extremely important research tool for life science researchers and are also beginning to be used in diagnostic, treatment and monitoring applications. This article provides a detailed description of microarrays prepared by in situ synthesis, deposition using microspotting methods, nonplanar bead arrays, flow-through microarrays, optical fiber bundle arrays and nanobarcodes. The problems and challenges in the development of microarrays, development of standards and diagnostic microarrays are described. Tables summarizing the vendor list of various derivatized microarray surfaces, commercially sold premade microarrays, bead arrays and unique microarray products in development are also included. PMID:15542153

  13. Fully automated multidimensional reversed-phase liquid chromatography with tandem anion/cation exchange columns for simultaneous global endogenous tyrosine nitration detection, integral membrane protein characterization, and quantitative proteomics mapping in cerebral infarcts.

    PubMed

    Quan, Quan; Szeto, Samuel S W; Law, Henry C H; Zhang, Zaijun; Wang, Yuqiang; Chu, Ivan K

    2015-10-01

    Protein tyrosine nitration (PTN) is a signature hallmark of radical-induced nitrative stress in a wide range of pathophysiological conditions, with naturally occurring abundances at substoichiometric levels. In the present study, a fully automated four-dimensional platform, consisting of high-/low-pH reversed-phase dimensions with two additional complementary chromatographic separation stages, strong anion exchange (SAX) and strong cation exchange (SCX), inserted in tandem, was implemented for the simultaneous mapping of endogenous nitrated tyrosine-containing peptides within the global proteomic context of a Macaca fascicularis cerebral ischemic stroke model. This integrated RP-SA(C)X-RP platform was initially benchmarked through proteomic analyses of Saccharomyces cerevisiae, revealing extended proteome and protein coverage. A total of 27 144 unique peptides from 3684 nonredundant proteins [1% global false discovery rate (FDR)] were identified from M. fascicularis cerebral cortex tissue. The inclusion of the S(A/C)X columns contributed to the increased detection of acidic, hydrophilic, and hydrophobic peptide populations; these separation features enabled the concomitant identification of 127 endogenous nitrated peptides and 137 transmembrane domain-containing peptides corresponding to integral membrane proteins, without the need for specific targeted enrichment strategies. The enhanced diversity of the peptide inventory obtained from the RP-SA(C)X-RP platform also improved analytical confidence in isobaric tags for relative and absolute quantitation (iTRAQ)-based proteomic analyses. PMID:26335518
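    The peptide inventory is filtered at a 1% global FDR. A common way to obtain such a threshold in proteomics is target-decoy estimation: accept matches down to the lowest score at which (decoys accepted)/(targets accepted) stays at or below the desired FDR. The sketch below illustrates that idea with hypothetical scores; the study's actual FDR procedure is not specified in the abstract and may differ:

```python
def fdr_threshold(scores, is_decoy, fdr=0.01):
    """Lowest score cutoff at which the target-decoy FDR estimate
    (decoys accepted / targets accepted) stays at or below `fdr`."""
    paired = sorted(zip(scores, is_decoy), reverse=True)  # best score first
    targets = decoys = 0
    cutoff = None
    for score, decoy in paired:
        if decoy:
            decoys += 1
        else:
            targets += 1
        # Accepting everything down to this score keeps the estimate <= fdr
        if targets > 0 and decoys / targets <= fdr:
            cutoff = score
    return cutoff

# Hypothetical peptide-spectrum match scores; True marks decoy-database hits
scores = [10, 9, 8, 7, 6, 5]
decoy  = [False, False, False, False, True, False]
cutoff = fdr_threshold(scores, decoy, fdr=0.1)
```

    At a 10% FDR the single decoy at score 6 forces the cutoff up to 7; relaxing the FDR lets it drop back down.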

  14. Near point-of-care administration by the attending physician of the rapid influenza antigen detection immunochromatography test and the fully automated respiratory virus nucleic acid test: contribution to patient management.

    PubMed

    Boku, Soushin; Naito, Toshio; Murai, Kenji; Tanei, Mika; Inui, Akihiro; Nisimura, Hidekazu; Isonuma, Hiroshi; Takahashi, Hiroshi; Kikuchi, Ken

    2013-08-01

    Rapid influenza antigen detection tests (RIADTs) using immunochromatography are the most readily available tools for the diagnosis and management of influenza. This study was designed to assess whether near point-of-care administration by primary care physicians of the RIADT and a fully automated respiratory virus nucleic acid test (Verigene Respiratory Virus Plus®; RV+) would contribute to improved patient management. When viral culture and RT-PCR/bi-directional sequencing were used as the gold standard, sensitivities and specificities for RIADT and RV+ were 58.3% and 90.9%, and 97.2% and 100%, respectively. Within 12 hours from onset of fever, sensitivities were 44.4% and 94.4%, respectively, for RIADT and RV+. In clinical situations where a higher-sensitivity test is needed, such as during pre-admission evaluations, for testing of hospital employees during the prodromal phase of infection, during the therapeutic decision-making process, and during outbreaks, we suggest that patients testing negative by the RIADT can be reassessed with the RV+ test to achieve maximal diagnostic accuracy. PMID:23743175

  15. Fully Automated Anesthesia, Analgesia and Fluid Management

    ClinicalTrials.gov

    2016-09-05

    General Anesthetic Drug Overdose; Adverse Effect of Intravenous Anesthetics, Sequela; Complication of Anesthesia; Drug Delivery System Malfunction; Hemodynamic Instability; Underdosing of Other General Anesthetics

  16. Microarray Analysis in Glioblastomas.

    PubMed

    Bhawe, Kaumudi M; Aghi, Manish K

    2016-01-01

    Microarray analysis in glioblastomas is done using either cell lines or patient samples as starting material. A survey of the current literature points to transcript-based microarrays and immunohistochemistry (IHC)-based tissue microarrays as being the preferred methods of choice in cancers of neurological origin. Microarray analysis may be carried out for various purposes including the following: i. To correlate gene expression signatures of glioblastoma cell lines or tumors with response to chemotherapy (DeLay et al., Clin Cancer Res 18(10):2930-2942, 2012). ii. To correlate gene expression patterns with biological features like proliferation or invasiveness of the glioblastoma cells (Jiang et al., PLoS One 8(6):e66008, 2013). iii. To discover new tumor classificatory systems based on gene expression signature, and to correlate therapeutic response and prognosis with these signatures (Huse et al., Annu Rev Med 64(1):59-70, 2013; Verhaak et al., Cancer Cell 17(1):98-110, 2010). While investigators can sometimes use archived tumor gene expression data available from repositories such as the NCBI Gene Expression Omnibus to answer their questions, new arrays must often be run to adequately answer specific questions. Here, we provide a detailed description of microarray methodologies, how to select the appropriate methodology for a given question, and analytical strategies that can be used. Experimental methodology for protein microarrays is outside the scope of this chapter, but basic sample preparation techniques for transcript-based microarrays are included here. PMID:26113463

  17. Automation in Clinical Microbiology

    PubMed Central

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  18. Hybridization and Selective Release of DNA Microarrays

    SciTech Connect

    Beer, N R; Baker, B; Piggott, T; Maberry, S; Hara, C M; DeOtte, J; Benett, W; Mukerjee, E; Dzenitis, J; Wheeler, E K

    2011-11-29

    DNA microarrays contain sequence-specific probes arrayed in distinct spots numbering from 10,000 to over 1,000,000, depending on the platform. This tremendous degree of multiplexing gives microarrays great potential for environmental background sampling, broad-spectrum clinical monitoring, and continuous biological threat detection. In practice, their use in these applications is not common due to limited information content, long processing times, and high cost. The work focused on characterizing the phenomena of microarray hybridization and selective release so that these limitations can be addressed, which will revolutionize the ways microarrays can be used for LLNL's Global Security missions. The goals of this project were two-fold: automated, faster hybridizations and selective release of hybridized features. The first study area involves hybridization kinetics and mass-transfer effects. The standard hybridization protocol uses an overnight incubation to achieve the best possible signal for any sample type, as well as for convenience in manual processing. There is potential to significantly shorten this time based on better understanding and control of the rate-limiting processes and knowledge of the progress of the hybridization. In the hybridization work, a custom microarray flow cell was used to manipulate the chemical and thermal environment of the array and to autonomously image the changes over time during hybridization. The second study area is selective release. Microarrays easily generate hybridization patterns and signatures, but there is still an unmet need for methodologies enabling rapid and selective analysis of these patterns and signatures. Detailed analysis of individual spots by subsequent sequencing could potentially yield significant information for rapidly mutating and emerging (or deliberately engineered) pathogens.
In the selective release work, optical energy deposition with coherent light quickly provides the thermal energy to

  19. Nanotechnologies in protein microarrays.

    PubMed

    Krizkova, Sona; Heger, Zbynek; Zalewska, Marta; Moulick, Amitava; Adam, Vojtech; Kizek, Rene

    2015-01-01

    Protein microarray technology has become an important research tool for the study and detection of proteins and protein-protein interactions, among a number of other applications. The utilization of nanoparticle-based materials and nanotechnology-based immobilization techniques allows us not only to extend the surface available for biomolecule immobilization, resulting in enhanced substrate-binding properties, but also to decrease background signals and enhance reporter systems for more sensitive assays. Generally, contemporary microarray systems combine multiple nanotechnology-based techniques. In this review, applications of nanoparticles and nanotechnologies in creating protein microarrays and in protein immobilization and detection are summarized. We anticipate that advanced nanotechnologies can be exploited to expand the promising fields of protein identification, monitoring of protein-protein or drug-protein interactions, and protein structure. PMID:26039143

  20. Identifying Fishes through DNA Barcodes and Microarrays

    PubMed Central

    Kochzius, Marc; Seidel, Christian; Antoniou, Aglaia; Botla, Sandeep Kumar; Campo, Daniel; Cariani, Alessia; Vazquez, Eva Garcia; Hauschild, Janet; Hervet, Caroline; Hjörleifsdottir, Sigridur; Hreggvidsson, Gudmundur; Kappel, Kristina; Landi, Monica; Magoulas, Antonios; Marteinsson, Viggo; Nölte, Manfred; Planes, Serge; Tinti, Fausto; Turan, Cemal; Venugopal, Moleyur N.; Weber, Hannes; Blohm, Dietmar

    2010-01-01

    Background International fish trade reached an import value of 62.8 billion Euro in 2006, of which 44.6% was covered by the European Union. Species identification is a key problem throughout the life cycle of fishes: from eggs and larvae to adults in fisheries research and control, as well as processed fish products in consumer protection. Methodology/Principal Findings This study aims to evaluate the applicability of the three mitochondrial genes 16S rRNA (16S), cytochrome b (cyt b), and cytochrome oxidase subunit I (COI) for the identification of 50 European marine fish species by combining techniques of “DNA barcoding” and microarrays. In a DNA barcoding approach, Neighbour Joining (NJ) phylogenetic trees of 369 16S, 212 cyt b, and 447 COI sequences indicated that cyt b and COI are suitable for unambiguous identification, whereas 16S failed to discriminate closely related flatfish and gurnard species. In the course of probe design for DNA microarray development, each of the markers yielded a high number of potentially species-specific probes in silico, although many of them were rejected based on microarray hybridisation experiments. None of the markers provided probes to discriminate the sibling flatfish and gurnard species. However, since 16S-probes were less negatively influenced by the “position of label” effect and showed the lowest rejection rate and the highest mean signal intensity, 16S is more suitable for DNA microarray probe design than cyt b and COI. The large portion of rejected COI-probes after hybridisation experiments (>90%) renders this DNA barcoding marker rather unsuitable for this high-throughput technology. Conclusions/Significance Based on these data, a DNA microarray containing 64 functional oligonucleotide probes for the identification of 30 out of the 50 fish species investigated was developed. It represents the next step towards an automated and easy-to-handle method to identify fish, ichthyoplankton, and fish products. PMID

  1. Evaluation of the Fully Automated BACTEC MGIT 960 System for Testing Susceptibility of Mycobacterium tuberculosis to Pyrazinamide, Streptomycin, Isoniazid, Rifampin, and Ethambutol and Comparison with the Radiometric BACTEC 460TB Method

    PubMed Central

    Scarparo, Claudio; Ricordi, Paolo; Ruggiero, Giuliana; Piccoli, Paola

    2004-01-01

    The performance of the fully automated BACTEC MGIT 960 (M960) system for the testing of Mycobacterium tuberculosis susceptibility to streptomycin (SM), isoniazid (INH), rifampin (RMP), ethambutol (EMB), and pyrazinamide (PZA) was evaluated with 100 clinical isolates and compared to that of the radiometric BACTEC 460TB (B460) system. The agar proportion method and the B460 system were used as reference methods to resolve the discordant results for SM, INH, RMP, and EMB (a combination known as SIRE) and PZA, respectively. The overall agreements were 96.3% for SIRE and 92% for PZA. For SIRE, a total of 26 discrepancies were found and were resolved in favor of the M960 system in 8 cases and in favor of the B460 system in 18 cases. The M960 system produced 8 very major errors (VME) and 10 major errors (ME), while the B460 system showed 4 VME and 4 ME. No statistically significant differences were found. Both systems exhibited excellent performance, but a higher number of VME was observed with the M960 system at the critical concentrations of EMB and SM. For PZA, a total of eight discrepancies were observed and were resolved in favor of the M960 system in one case and in favor of the B460 system in seven cases; no statistically significant differences were found. The M960 system showed four VME and three ME. The mean times to report overall PZA results and resistant results were 8.2 and 9.8 days, respectively, for the M960 system and 7.4 and 8.1 days, respectively, for the B460 system. Statistically significant differences were found. The mean times to report SIRE results were 8.3 days for the M960 system and 8.2 days for the B460 system. No statistically significant differences were found. Twelve strains tested for SIRE susceptibility and seven strains tested for PZA susceptibility had been reprocessed because of contamination. In conclusion, the M960 system can represent a valid alternative to the B460 for M. 
tuberculosis susceptibility testing; however, the frequent

  2. Microarrays for Undergraduate Classes

    ERIC Educational Resources Information Center

    Hancock, Dale; Nguyen, Lisa L.; Denyer, Gareth S.; Johnston, Jill M.

    2006-01-01

    A microarray experiment is presented that, in six laboratory sessions, takes undergraduate students from the tissue sample right through to data analysis. The model chosen, the murine erythroleukemia cell line, can be easily cultured in sufficient quantities for class use. Large changes in gene expression can be induced in these cells by…

  3. Microarray analysis at single molecule resolution

    PubMed Central

    Mureşan, Leila; Jacak, Jarosław; Klement, Erich Peter; Hesse, Jan; Schütz, Gerhard J.

    2010-01-01

    Bioanalytical chip-based assays have improved enormously in sensitivity in recent years; detection of trace amounts of substances down to the level of individual fluorescent molecules has become state-of-the-art technology. The impact of such detection methods, however, has not yet been fully exploited, mainly due to a lack of appropriate mathematical tools for robust data analysis. One particular example relates to the analysis of microarray data. While classical microarray analysis works at resolutions of two to 20 micrometers and quantifies the abundance of target molecules by determining average pixel intensities, a novel high-resolution approach [1] directly visualizes individual bound molecules as diffraction-limited peaks. The now possible quantification via counting is less susceptible to labeling artifacts and background noise. We have developed an approach for the analysis of high-resolution microarray images. It consists, first, of a single-molecule detection step based on undecimated wavelet transforms and, second, of a spot identification step via a spatial statistics approach (corresponding to the segmentation step in classical microarray analysis). The detection method was tested on simulated images with a concentration range of 0.001 to 0.5 molecules per square micron and signal-to-noise ratios (SNR) between 0.9 and 31.6. For SNR above 15, the relative false-negative error was below 15%. Separation of foreground from background proved reliable provided the foreground density exceeded the background density by a factor of 2. The method has also been applied to real data from high-resolution microarray measurements. PMID:20123580
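
    The counting-based quantification described above can be shown with a deliberately simplified sketch: instead of undecimated wavelet transforms, it merely counts local maxima above a threshold in a toy intensity grid. The grid values and threshold below are invented for illustration and are not the authors' detection method.

```python
# Simplified single-molecule counting sketch: count strict local maxima
# above a threshold in a toy 2-D intensity grid (the real method uses
# undecimated wavelet transforms; this is only the counting idea).

def count_peaks(img, threshold):
    """Count pixels that exceed the threshold and all 8 neighbours."""
    h, w = len(img), len(img[0])
    peaks = 0
    for y in range(h):
        for x in range(w):
            v = img[y][x]
            if v < threshold:
                continue
            neigh = [img[j][i]
                     for j in range(max(0, y - 1), min(h, y + 2))
                     for i in range(max(0, x - 1), min(w, x + 2))
                     if (i, j) != (x, y)]
            if all(v > n for n in neigh):
                peaks += 1
    return peaks

# toy image: two diffraction-limited "molecules" on a dark background
img = [[0, 0, 0, 0, 0],
       [0, 9, 0, 0, 0],
       [0, 0, 0, 7, 0],
       [0, 0, 0, 0, 0]]
n = count_peaks(img, threshold=5)
```

    Counting discrete peaks in this way, rather than averaging pixel intensities over a spot, is what makes the high-resolution readout less sensitive to diffuse background signal.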

  4. Microarrays under the microscope.

    PubMed

    Wildsmith, S E; Elcock, F J

    2001-02-01

    Microarray technology is a rapidly advancing area, which is gaining popularity in many biological disciplines from drug target identification to predictive toxicology. Over the past few years, there has been a dramatic increase in the number of methods and techniques available for carrying out this form of gene expression analysis. The techniques and associated peripherals, such as slide types, deposition methods, robotics, and scanning equipment, are undergoing constant improvement, helping to drive the technology forward in terms of robustness and ease of use. These rapid developments, combined with the number of options available and the associated hyperbole, can prove daunting for the new user. This review aims to guide the researcher through the various steps of conducting microarray experiments, from initial strategy to analysing the data, with critical examination of the benefits and disadvantages along the way. PMID:11212888

  5. Navigating Public Microarray Databases

    PubMed Central

    Bähler, Jürg

    2004-01-01

    With the ever-escalating amount of data being produced by genome-wide microarray studies, it is of increasing importance that these data are captured in public databases so that researchers can use this information to complement and enhance their own studies. Many groups have set up databases of expression data, ranging from large repositories, which are designed to comprehensively capture all published data, through to more specialized databases. The public repositories, such as ArrayExpress at the European Bioinformatics Institute contain complete datasets in raw format in addition to processed data, whilst the specialist databases tend to provide downstream analysis of normalized data from more focused studies and data sources. Here we provide a guide to the use of these public microarray resources. PMID:18629145

  6. Navigating public microarray databases.

    PubMed

    Penkett, Christopher J; Bähler, Jürg

    2004-01-01

    With the ever-escalating amount of data being produced by genome-wide microarray studies, it is of increasing importance that these data are captured in public databases so that researchers can use this information to complement and enhance their own studies. Many groups have set up databases of expression data, ranging from large repositories, which are designed to comprehensively capture all published data, through to more specialized databases. The public repositories, such as ArrayExpress at the European Bioinformatics Institute contain complete datasets in raw format in addition to processed data, whilst the specialist databases tend to provide downstream analysis of normalized data from more focused studies and data sources. Here we provide a guide to the use of these public microarray resources. PMID:18629145

  7. Evaluating concentration estimation errors in ELISA microarray experiments

    SciTech Connect

    Daly, Don S.; White, Amanda M.; Varnum, Susan M.; Anderson, Kevin K.; Zangar, Richard C.

    2005-01-26

    Enzyme-linked immunosorbent assay (ELISA) is a standard immunoassay to predict a protein concentration in a sample. Deploying ELISA in a microarray format permits simultaneous prediction of the concentrations of numerous proteins in a small sample. These predictions, however, are uncertain due to processing error and biological variability. Evaluating prediction error is critical to interpreting biological significance and improving the ELISA microarray process. Evaluating prediction error must be automated to realize a reliable high-throughput ELISA microarray system. In this paper, we present a statistical method based on propagation of error to evaluate prediction errors in the ELISA microarray process. Although propagation of error is central to this method, it is effective only when comparable data are available. Therefore, we briefly discuss the roles of experimental design, data screening, normalization and statistical diagnostics when evaluating ELISA microarray prediction errors. We use an ELISA microarray investigation of breast cancer biomarkers to illustrate the evaluation of prediction errors. The illustration begins with a description of the design and resulting data, followed by a brief discussion of data screening and normalization. In our illustration, we fit a standard curve to the screened and normalized data, review the modeling diagnostics, and apply propagation of error.
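
    As a rough illustration of propagation of error in this setting, the sketch below inverts a four-parameter logistic (4PL) standard curve to predict a concentration from an observed signal, then propagates the signal's standard deviation through the inverse curve via the delta method. The 4PL parameters and signal values are hypothetical; this is a generic sketch, not the authors' statistical procedure.

```python
# Delta-method error propagation through an inverted 4PL standard curve.
# All parameter values are hypothetical, for illustration only.

def four_pl(x, a, b, c, d):
    """4-parameter logistic: signal as a function of concentration x."""
    return d + (a - d) / (1.0 + (x / c) ** b)

def inverse_4pl(y, a, b, c, d):
    """Predict concentration from an observed signal y (analytic inverse)."""
    return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

def propagated_sd(y, sigma_y, a, b, c, d, eps=1e-6):
    """Delta method: sd of predicted concentration ~ |dx/dy| * sd of signal."""
    dxdy = (inverse_4pl(y + eps, a, b, c, d)
            - inverse_4pl(y - eps, a, b, c, d)) / (2 * eps)
    return abs(dxdy) * sigma_y

params = dict(a=0.05, b=1.2, c=10.0, d=2.0)  # a/d: asymptotes, c: EC50, b: slope
y_obs = 1.0                                   # observed spot signal
conc = inverse_4pl(y_obs, **params)           # predicted concentration
sd = propagated_sd(y_obs, 0.02, **params)     # its propagated uncertainty
```

    The point of the abstract's method is that `sd` varies strongly along the curve: near the asymptotes the inverse curve steepens, so the same signal noise maps to a much larger concentration uncertainty.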

  8. Automated, Miniaturized Instrument for Measuring Gene Expression in Space

    NASA Technical Reports Server (NTRS)

    Pohorille, A.; Peyvan, K.; Danley, D.; Ricco, A. J.

    2010-01-01

    To facilitate astrobiological studies on the survival and adaptation of microorganisms and mixed microbial cultures to the space environment, we have been developing a fully automated, miniaturized system for measuring their gene expression on small spacecraft. This low-cost, multi-purpose instrument represents a major scientific and technological advancement in our ability to study the impact of the space environment on biological systems by providing data on cellular metabolism and regulation orders of magnitude richer than what is currently available. The system supports growth of the organism, lyses the cells to release the expressed RNA, labels the RNA, reads the expression levels of a large number of genes by microarray analysis of the labeled RNA, and transmits the measurements to Earth. To measure gene expression we use microarray technology developed by CombiMatrix, which is based on electrochemical reactions on arrays of electrodes on a semiconductor substrate. Since the electrical integrity of the microarray remains intact after probe synthesis, the circuitry can be employed to sense nucleic acid binding at each electrode. CombiMatrix arrays can be sectored to allow multiple samples per chip. In addition, a single array can be used for several assays. The array has been integrated into an automated microfluidic cartridge that uses flexible reagent blisters and pinch pumping to move liquid reagents between chambers. The proposed instrument will help to understand adaptation of terrestrial life to conditions beyond the planet of origin, identify deleterious effects of the space environment, develop effective countermeasures against these effects, and test our ability to sustain and grow in space organisms that can be used for life support and in situ resource utilization during long-duration space exploration. The instrument is suitable for small satellite platforms, which provide frequent, low-cost access to space. It can also be used on any other platform in space

  9. High-Throughput Fully Automated Construction of a Multiplex Library of Mutagenized Open Reading Frames for an Insecticidal Peptide Using a Plasmid-Based Functional Proteomic Robotic Workcell with Improved Vacuum System

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Robotic platforms are essential for the production and screening of large numbers of expression-ready plasmid sets used to develop optimized clones and improved microbial strains. Here we demonstrate a plasmid-based integrated workcell that was used to automate the molecular biology protocols inclu...

  10. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated cell counter. 864.5200 Section 864.5200....5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of...

  11. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Automated cell counter. 864.5200 Section 864.5200....5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of...

  12. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Automated cell counter. 864.5200 Section 864.5200....5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of...

  13. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Automated cell counter. 864.5200 Section 864.5200....5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of...

  14. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Automated cell counter. 864.5200 Section 864.5200....5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of...

  15. Automation in Immunohematology

    PubMed Central

    Bajpai, Meenu; Kaur, Ravneet; Gupta, Ekta

    2012-01-01

    There have been rapid technological advances in blood banking in the South Asian region over the past decade, with an increasing emphasis on the quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as the column agglutination technique, solid-phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation, and major manufacturers in this field have developed semi- and fully automated equipment for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents and processes, and archiving of results are other major advantages of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service seeking to provide quality patient care with a shorter turnaround time for an ever-increasing workload. This article discusses the various issues involved in the process. PMID:22988378

  16. [Research progress of probe design software of oligonucleotide microarrays].

    PubMed

    Chen, Xi; Wu, Zaoquan; Liu, Zhengchun

    2014-02-01

    DNA microarrays have become an essential medical genetic diagnostic tool thanks to their high throughput, miniaturization and automation. The design and selection of oligonucleotide probes are critical for preparing gene chips of high quality. Several probe design software packages have been developed and are now available to perform this work. Each package targets different sequence types and has its own advantages and limitations. In this article, the research and development of these packages are reviewed against three main criteria: specificity, sensitivity and melting temperature (Tm). In addition, based on experimental results from the literature, the packages are classified according to their applications. This review should help users choose an appropriate probe design package. It should also reduce the costs of microarrays, improve the application efficiency of microarrays, and promote both the research and development (R&D) and commercialization of high-performance probe design software. PMID:24804514
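
    Melting temperature is one of the three screening criteria named above. As a minimal illustration, not taken from any of the reviewed packages, the sketch below estimates Tm for short oligos with the Wallace rule (2 °C per A/T pair, 4 °C per G/C pair) and keeps only candidate probes whose Tm falls in a target window. The sequences and thresholds are invented.

```python
# Minimal probe-screening sketch using the Wallace rule, a rough Tm
# estimate valid only for short oligos. Sequences and the Tm window
# are illustrative, not from any specific probe design package.

def wallace_tm(seq):
    """Wallace rule: Tm ~ 2*(A+T) + 4*(G+C) degrees C."""
    s = seq.upper()
    return 2 * (s.count("A") + s.count("T")) + 4 * (s.count("G") + s.count("C"))

def screen_probes(probes, tm_min=48, tm_max=60):
    """Keep candidate probes whose estimated Tm lies in the target window."""
    return [p for p in probes if tm_min <= wallace_tm(p) <= tm_max]

candidates = ["ATGCATGCATGCATGC", "GCGCGCGCGCGC", "ATATATATATAT"]
selected = screen_probes(candidates)  # the AT-only probe melts too low
```

    Real packages combine a Tm filter like this with specificity checks (cross-hybridization against the whole target set) and sensitivity estimates, which is where they differ most.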

  17. Tiling Microarray Analysis Tools

    SciTech Connect

    Nix, Davis Austin

    2005-05-04

    TiMAT is a package of 23 command line Java applications for use in the analysis of Affymetrix tiled genomic microarray data. TiMAT enables: 1) rebuilding the genome annotation for entire tiled arrays (repeat filtering, chromosomal coordinate assignment); 2) post-processing of oligo intensity values (quantile normalization, median scaling, PMMM transformation); 3) significance testing (Wilcoxon rank sum and signed rank tests, intensity difference and ratio tests) and interval refinement (filtering based on multiple statistics, overlap comparisons); 4) data visualization (detailed thumbnail/zoomed view with Interval Plots and data export to Affymetrix's Integrated Genome Browser) and data reports (spreadsheet summaries and detailed profiles).
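
    Quantile normalization, one of the post-processing steps listed above, can be sketched in a few lines: each array's values are replaced, rank by rank, with the mean of the values at that rank across all arrays, forcing every array onto a common intensity distribution. This is a generic illustration of the technique, not TiMAT's implementation.

```python
# Generic quantile normalization sketch for equal-length intensity arrays:
# sort each array, average across arrays at each rank, then write the
# rank-wise means back in each array's original order.

def quantile_normalize(arrays):
    n = len(arrays[0])
    # for each array, the probe indices in ascending intensity order
    order = [sorted(range(n), key=a.__getitem__) for a in arrays]
    # rank-wise mean of the sorted intensity values across arrays
    means = [sum(a[order[i][r]] for i, a in enumerate(arrays)) / len(arrays)
             for r in range(n)]
    out = [[0.0] * n for _ in arrays]
    for i, idx in enumerate(order):
        for r, j in enumerate(idx):
            out[i][j] = means[r]
    return out

a = [5.0, 2.0, 3.0]
b = [4.0, 1.0, 6.0]
na, nb = quantile_normalize([a, b])  # both now share the same distribution
```

    After normalization the two arrays contain exactly the same set of values, so downstream significance tests compare ranks rather than array-specific intensity scales.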

  18. Microwave-Assisted Sample Treatment in a Fully Automated Flow-Based Instrument: Oxidation of Reduced Technetium Species in the Analysis of Total Technetium-99 in Caustic Aged Nuclear Waste Samples

    SciTech Connect

    Egorov, Oleg B.; O'Hara, Matthew J.; Grate, Jay W.

    2004-07-15

    An automated flow-based instrument for microwave-assisted treatment of liquid samples has been developed and characterized. The instrument utilizes a flow-through reaction vessel design that facilitates the addition of multiple reagents during sample treatment, removal of the gaseous reaction products, and enables quantitative removal of liquids from the reaction vessel for carryover-free operations. Matrix modification and speciation control chemistries that are required for the radiochemical determination of total 99Tc in caustic aged nuclear waste samples have been investigated. A rapid and quantitative oxidation procedure using peroxydisulfate in acidic solution was developed to convert reduced technetium species to pertechnetate in samples with high content of reducing organics. The effectiveness of the automated sample treatment procedures has been validated in the radiochemical analysis of total 99Tc in caustic aged nuclear waste matrixes from the Hanford site.

  19. Compressive Sensing DNA Microarrays

    PubMed Central

    2009-01-01

    Compressive sensing microarrays (CSMs) are DNA-based sensors that operate using group testing and compressive sensing (CS) principles. In contrast to conventional DNA microarrays, in which each genetic sensor is designed to respond to a single target, in a CSM, each sensor responds to a set of targets. We study the problem of designing CSMs that simultaneously account for both the constraints from CS theory and the biochemistry of probe-target DNA hybridization. An appropriate cross-hybridization model is proposed for CSMs, and several methods are developed for probe design and CS signal recovery based on the new model. Lab experiments suggest that in order to achieve accurate hybridization profiling, consensus probe sequences are required to have sequence homology of at least 80% with all targets to be detected. Furthermore, out-of-equilibrium datasets are usually as accurate as those obtained from equilibrium conditions. Consequently, one can use CSMs in applications in which only short hybridization times are allowed. PMID:19158952
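
    The group-testing principle behind CSMs, where each spot responds to a set of targets rather than a single one, can be illustrated with the simplest combinatorial decoder (COMP): any target that appears in a spot showing no signal is ruled out. This is a hedged sketch of the general idea with invented pools; it is not the paper's CS signal-recovery algorithm.

```python
# COMP-style decoder sketch for a group-testing microarray readout.
# pools[i] is the set of targets spot i responds to; results[i] says
# whether spot i lit up. Pools and results are invented for illustration.

def comp_decode(pools, results):
    """Rule out every target contained in any negative spot."""
    candidates = set().union(*pools)
    for pool, positive in zip(pools, results):
        if not positive:
            candidates -= pool  # a target in a dark spot cannot be present
    return candidates

pools = [{"A", "B"}, {"B", "C"}, {"C", "D"}, {"A", "D"}]
results = [True, False, False, True]
present = comp_decode(pools, results)
```

    With well-designed pools, a small number of composite spots can distinguish many sparse target combinations, which is the same sparsity assumption CS recovery exploits with real-valued intensities.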

  20. Single nucleotide polymorphism genotyping using BeadChip microarrays.

    PubMed

    Lambert, Gilliam; Tsinajinnie, Darwin; Duggan, David

    2013-07-01

    The genotyping of single nucleotide polymorphisms (SNPs) has contributed more to the study of complex diseases than any other technology to date. Genome-wide association studies (GWAS) using 10,000s to >1,000,000 SNPs have identified 1000s of statistically significant SNPs across 17 different human disease and trait categories. Post-GWAS fine-mapping studies using 10,000s to 100,000s of SNPs on a microarray have narrowed the region of interest for many of these GWAS findings; in addition, independent signals within the original GWAS regions have been identified. Focused-content, SNP-based microarrays, such as the human exome array, have also been used successfully to identify novel disease associations. Success has come mostly to studies in which 100s to 10,000s of samples, and sometimes >100,000, were genotyped. For the time being, SNP-based microarrays remain cost-effective, especially when studying large numbers of samples, compared to other "genotyping" technologies including next-generation sequencing. In this unit, protocols for manual (LIMS-free), semi-manual, and automated processing of BeadChip microarrays are presented. Lower-throughput studies will find value in the manual and semi-manual protocols, while all types of studies--low-, medium-, and high-throughput--will find value in the semi-manual and automated protocols. PMID:23853082

  1. Electronic microarray assays for avian influenza and Newcastle disease virus.

    PubMed

    Lung, Oliver; Beeston, Anne; Ohene-Adjei, Samuel; Pasick, John; Hodko, Dalibor; Hughes, Kimberley Burton; Furukawa-Stoffer, Tara; Fisher, Mathew; Deregt, Dirk

    2012-11-01

    Microarrays are suitable for multiplexed detection and typing of pathogens. Avian influenza virus (AIV) is currently classified into 16 H (hemagglutinin) and 9 N (neuraminidase) subtypes, whereas Newcastle disease virus (NDV) strains differ in virulence and are broadly classified into high and low pathogenicity types. In this study, three assays for detection and typing of poultry viruses were developed on an automated microarray platform: a multiplex assay for simultaneous detection of AIV and detection and pathotyping of NDV, and two separate assays for differentiating all AIV H and N subtypes. The AIV-NDV multiplex assay detected all strains in a 63 virus panel, and accurately typed all high pathogenicity NDV strains tested. A limit of detection of 10^1-10^3 TCID50/mL and 200-400 EID50/mL was obtained for NDV and AIV, respectively. The AIV typing assays accurately typed all 41 AIV strains and a limit of detection of 4-200 EID50/mL was obtained. Assay validation showed that the microarray assays were generally comparable to real-time RT-PCR. However, the AIV typing microarray assays detected more positive clinical samples than the AIV matrix real-time RT-PCR, and also provided information regarding the subtype. The AIV-NDV multiplex and AIV H typing microarray assays detected mixed infections and could be useful for detection and typing of AIV and NDV. PMID:22796283

  2. Inkjet printing for the production of protein microarrays.

    PubMed

    McWilliam, Iain; Chong Kwan, Marisa; Hall, Duncan

    2011-01-01

    A significant proportion of protein microarray researchers would like the arrays they develop to become widely used research, screening, validation or diagnostic devices. For this to be achievable, the arrays must be compatible with high-throughput techniques that allow manufacturing-scale production. In order to simplify the transition from laboratory bench to market, Arrayjet have developed a range of inkjet microarray printers which, at one end of the scale, are suitable for R&D and, at the other, are capable of true high-throughput array output. To maintain scalability, all Arrayjet microarray printers utilise identical core technology comprising a JetSpyder™ liquid handling adaptor, which enables automated loading of an industry-standard inkjet printhead compatible with non-contact on-the-fly printing. This chapter contains a detailed explanation of Arrayjet technology, followed by a historical look at the development of inkjet technologies for protein microarray production. The method described subsequently is a simple example of an antibody array printed onto nitrocellulose-coated slides, with specific detection by fluorescently labelled IgG. The method is linked to a notes section with advice on best practice and sources of useful information for protein microarray production using inkjet technology. PMID:21901611

  3. DNA Microarray-Based Diagnostics.

    PubMed

    Marzancola, Mahsa Gharibi; Sedighi, Abootaleb; Li, Paul C H

    2016-01-01

    The DNA microarray technology is currently a useful biomedical tool which has been developed for a variety of diagnostic applications. However, the development pathway has not been smooth and the technology has faced some challenges. The reliability of the microarray data and also the clinical utility of the results in the early days were criticized. These criticisms added to the severe competition from other techniques, such as next-generation sequencing (NGS), impacting the growth of microarray-based tests in the molecular diagnostic market. Thanks to the advances in the underlying technologies as well as the tremendous effort offered by the research community and commercial vendors, these challenges have mostly been addressed. Nowadays, the microarray platform has achieved sufficient standardization and method validation as well as efficient probe printing, liquid handling and signal visualization. Integration of various steps of the microarray assay into a harmonized and miniaturized handheld lab-on-a-chip (LOC) device has been a goal for the microarray community. In this respect, notable progress has been achieved in coupling the DNA microarray with the liquid manipulation microsystem as well as the supporting subsystem that will generate the stand-alone LOC device. In this chapter, we discuss the major challenges that microarray technology has faced in its almost two decades of development and also describe the solutions to overcome the challenges. In addition, we review the advancements of the technology, especially the progress toward developing the LOC devices for DNA diagnostic applications. PMID:26614075

  4. Living-Cell Microarrays

    PubMed Central

    Yarmush, Martin L.; King, Kevin R.

    2011-01-01

    Living cells are remarkably complex. To unravel this complexity, living-cell assays have been developed that allow delivery of experimental stimuli and measurement of the resulting cellular responses. High-throughput adaptations of these assays, known as living-cell microarrays, which are based on microtiter plates, high-density spotting, microfabrication, and microfluidics technologies, are being developed for two general applications: (a) to screen large-scale chemical and genomic libraries and (b) to systematically investigate the local cellular microenvironment. These emerging experimental platforms offer exciting opportunities to rapidly identify genetic determinants of disease, to discover modulators of cellular function, and to probe the complex and dynamic relationships between cells and their local environment. PMID:19413510

  5. Tiling Microarray Analysis Tools

    Energy Science and Technology Software Center (ESTSC)

    2005-05-04

    TiMAT is a package of 23 command line Java applications for use in the analysis of Affymetrix tiled genomic microarray data. TiMAT enables: 1) rebuilding the genome annotation for entire tiled arrays (repeat filtering, chromosomal coordinate assignment); 2) post-processing of oligo intensity values (quantile normalization, median scaling, PMMM transformation); 3) significance testing (Wilcoxon rank sum and signed rank tests, intensity difference and ratio tests) and interval refinement (filtering based on multiple statistics, overlap comparisons); 4) data visualization (detailed thumbnail/zoomed view with Interval Plots and data export to Affymetrix's Integrated Genome Browser) and data reports (spreadsheet summaries and detailed profiles).

  6. Validating Automated Speaking Tests

    ERIC Educational Resources Information Center

    Bernstein, Jared; Van Moere, Alistair; Cheng, Jian

    2010-01-01

    This paper presents evidence that supports the valid use of scores from fully automatic tests of spoken language ability to indicate a person's effectiveness in spoken communication. The paper reviews the constructs, scoring, and the concurrent validity evidence of "facility-in-L2" tests, a family of automated spoken language tests in Spanish,…

  7. Segmentation of prostate cancer tissue microarray images

    NASA Astrophysics Data System (ADS)

    Cline, Harvey E.; Can, Ali; Padfield, Dirk

    2006-02-01

    Prostate cancer is diagnosed by histopathology interpretation of hematoxylin and eosin (H and E)-stained tissue sections. Gland and nuclei distributions vary with the disease grade. The morphological features vary with the advance of cancer as the epithelial regions grow into the stroma. An efficient pathology slide image analysis method was developed using a tissue microarray with known disease stages. Digital 24-bit RGB images were acquired for each tissue element on the slide with both 10X and 40X objectives. Initial segmentation at low magnification was accomplished using prior spectral characteristics from a training tissue set composed of four tissue clusters, namely glands, epithelia, stroma and nuclei. The segmentation method was automated by using the training RGB values as an initial guess and iterating the averaging process 10 times to find the four cluster centers. Labels were assigned to the nearest cluster center in red-blue spectral feature space. An automatic threshold algorithm separated the glands from the tissue. A visual pseudo-color representation of 60 segmented tissue microarray images was generated in which white, pink, red and blue represent glands, epithelia, stroma and nuclei, respectively. The higher-magnification images provided refined nuclei morphology. The nuclei were detected with an RGB color space principal component analysis that resulted in a grey-scale image. Shape metrics such as compactness, elongation, and minimum and maximum diameters were calculated based on the eigenvalues of the best-fitting ellipses to the nuclei.
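
    The segmentation procedure described above, initializing cluster centers from training RGB values and iterating assignment and averaging, amounts to training-seeded k-means followed by nearest-center labeling. The sketch below shows this under invented RGB values; the seed colors and pixels are illustrative, not measured from slides.

```python
# Training-seeded k-means sketch: start from mean RGB values of known
# tissue classes, iterate assign/average, then label pixels by nearest
# center. All RGB values are invented for illustration.

def assign(pixel, centers):
    """Index of the nearest cluster center (squared Euclidean distance)."""
    return min(range(len(centers)),
               key=lambda k: sum((p - c) ** 2 for p, c in zip(pixel, centers[k])))

def kmeans(pixels, centers, iters=10):
    """Iterate assignment and averaging, as in the abstract's procedure."""
    for _ in range(iters):
        groups = [[] for _ in centers]
        for px in pixels:
            groups[assign(px, centers)].append(px)
        centers = [tuple(sum(ch) / len(g) for ch in zip(*g)) if g else c
                   for g, c in zip(groups, centers)]
    return centers

# hypothetical training means for glands, epithelia, stroma, nuclei
seeds = [(230, 230, 230), (200, 120, 160), (180, 60, 90), (60, 60, 150)]
pixels = [(228, 231, 229), (201, 118, 158), (182, 63, 88), (58, 62, 149),
          (225, 228, 233), (199, 125, 161)]
centers = kmeans(pixels, seeds)
label = assign((59, 61, 150), centers)  # falls in the nuclei cluster (index 3)
```

    Seeding with class means from a training set, rather than random initialization, is what lets the final cluster indices keep their semantic labels (glands, epithelia, stroma, nuclei).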

  8. Microarray platform for omics analysis

    NASA Astrophysics Data System (ADS)

    Mecklenburg, Michael; Xie, Bin

    2001-09-01

    Microarray technology has revolutionized genetic analysis. However, limitations in genome analysis have led to renewed interest in establishing 'omic' strategies. As we enter the post-genomic era, new microarray technologies are needed to address these new classes of 'omic' targets, such as proteins, as well as lipids and carbohydrates. We have developed a microarray platform that combines self-assembling monolayers with the biotin-streptavidin system to provide a robust, versatile immobilization scheme. A hydrophobic film is patterned on the surface, creating an array of tension wells that eliminates evaporation effects, thereby reducing the shear stress to which biomolecules are exposed during immobilization. The streptavidin linker layer makes it possible to adapt and/or develop microarray-based assays using virtually any class of biomolecule, including carbohydrates, peptides, antibodies and receptors, as well as the more traditional DNA-based arrays. Our microarray technology is designed to furnish seamless compatibility across the various 'omic' platforms by providing a common blueprint for fabricating and analyzing arrays. The prototype microarray uses a microscope slide footprint patterned with 2 by 96 flat wells. Data on the microarray platform will be presented.

  9. Systematic review automation technologies

    PubMed Central

    2014-01-01

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, the availability of the requisite expertise and timeliness are often cited as major contributors to the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends toward the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of its tasks. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated, while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed to realize automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage that the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128

  10. Evaluating concentration estimation errors in ELISA microarray experiments

    PubMed Central

    Daly, Don Simone; White, Amanda M; Varnum, Susan M; Anderson, Kevin K; Zangar, Richard C

    2005-01-01

    Background Enzyme-linked immunosorbent assay (ELISA) is a standard immunoassay to estimate a protein's concentration in a sample. Deploying ELISA in a microarray format permits simultaneous estimation of the concentrations of numerous proteins in a small sample. These estimates, however, are uncertain due to processing error and biological variability. Evaluating estimation error is critical to interpreting biological significance and improving the ELISA microarray process. Estimation error evaluation must be automated to realize a reliable high-throughput ELISA microarray system. In this paper, we present a statistical method based on propagation of error to evaluate concentration estimation errors in the ELISA microarray process. Although propagation of error is central to this method and the focus of this paper, it is most effective only when comparable data are available. Therefore, we briefly discuss the roles of experimental design, data screening, normalization, and statistical diagnostics when evaluating ELISA microarray concentration estimation errors. Results We use an ELISA microarray investigation of breast cancer biomarkers to illustrate the evaluation of concentration estimation errors. The illustration begins with a description of the design and resulting data, followed by a brief discussion of data screening and normalization. In our illustration, we fit a standard curve to the screened and normalized data, review the modeling diagnostics, and apply propagation of error. We summarize the results with a simple, three-panel diagnostic visualization featuring a scatterplot of the standard data with logistic standard curve and 95% confidence intervals, an annotated histogram of sample measurements, and a plot of the 95% concentration coefficient of variation, or relative error, as a function of concentration. Conclusions This statistical method should be of value in the rapid evaluation and quality control of high-throughput ELISA microarray analyses
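
    The propagation-of-error step can be sketched for a four-parameter logistic (4PL) standard curve using the delta method; the parameter values and measurement error below are illustrative, not fitted to the study's data:

```python
import numpy as np

def logistic4(x, a, b, c, d):
    """4PL standard curve: predicted response at concentration x."""
    return d + (a - d) / (1.0 + (x / c) ** b)

def inverse4(y, a, b, c, d):
    """Concentration estimate recovered from a measured response y."""
    return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

def concentration_cv(y, sd_y, params, eps=1e-6):
    """Propagate response error to concentration via the delta method:
    var(x) ~ (dx/dy)^2 * var(y); return the relative error (CV) of x."""
    x = inverse4(y, *params)
    dxdy = (inverse4(y + eps, *params) - inverse4(y - eps, *params)) / (2 * eps)
    return abs(dxdy) * sd_y / x

params = (0.1, 1.2, 50.0, 3.0)        # a, b, c, d: illustrative fit values
y = logistic4(20.0, *params)          # response at a known concentration of 20
print(round(inverse4(y, *params), 3))  # recovers 20.0
```

    Plotting `concentration_cv` against concentration reproduces the kind of relative-error curve the abstract's third diagnostic panel describes.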

  11. Chemistry of Natural Glycan Microarray

    PubMed Central

    Song, Xuezheng; Heimburg-Molinaro, Jamie; Cummings, Richard D.; Smith, David F.

    2014-01-01

    Glycan microarrays have become indispensable tools for studying protein-glycan interactions. Along with chemo-enzymatic synthesis, glycans isolated from natural sources have played important roles in array development and will continue to be a major source of glycans. N- and O-glycans from glycoproteins, and glycans from glycosphingolipids, can be released from the corresponding glycoconjugates with relatively mature methods, although the isolation of large numbers and quantities of glycans is still very challenging. Glycosylphosphatidylinositol (GPI) anchors and glycosaminoglycans (GAGs) are less represented on current glycan microarrays. Glycan microarray development has been greatly facilitated by bifunctional fluorescent linkers, which can be applied in a “Shotgun Glycomics” approach to incorporate isolated natural glycans. Glycan presentation on microarrays may affect glycan binding by GBPs, often through multivalent recognition by the GBP. PMID:24487062

  12. Microarray Analysis of Microbial Weathering

    NASA Astrophysics Data System (ADS)

    Olsson-Francis, K.; van Houdt, R.; Leys, N.; Mergeay, M.; Cockell, C. S.

    2010-04-01

    Microarray analysis of the heavy-metal-resistant bacterium Cupriavidus metallidurans CH34 was used to investigate the genes involved in weathering. The results demonstrated that large porin and membrane transporter genes were upregulated.

  13. Standard operation protocol for analysis of lipid hydroperoxides in human serum using a fully automated method based on solid-phase extraction and liquid chromatography-mass spectrometry in selected reaction monitoring.

    PubMed

    Ferreiro-Vera, C; Ribeiro, Joana P N; Mata-Granados, J M; Priego-Capote, F; Luque de Castro, M D

    2011-09-23

    Standard operating procedures (SOPs) are of paramount importance in the analytical field to ensure the reproducibility of results among laboratories. SOPs gain special interest when the aim is the analysis of potentially unstable compounds. An SOP for the analysis of lipid hydroperoxides (HpETEs) in human serum is reported here, after optimization of the critical steps from sampling to final analysis. The method is based on automated hyphenation of solid-phase extraction (SPE) and liquid chromatography-mass spectrometry (LC-MS). The research involved: (i) optimization of the SPE and LC-MS steps with proper synchronization; (ii) validation of the method, viz. an accuracy study (86.4% as minimum value), evaluation of sensitivity (quantification limits from 2.5 to 7.0 ng/mL, i.e., 0.25-0.70 ng on column) and precision (below 13.2%), and a robustness study (the cartridge was reusable 5 times without affecting the accuracy and precision of the method); and (iii) a stability study, involving freeze-thaw, short-term, long-term and stock-solution stability tests. The results thus obtained allow both random and systematic variation of the metabolic profiles of the target compounds to be minimized by correct application of the established protocol. PMID:21851945

  14. Giant Magnetoresistive Sensors for DNA Microarray

    PubMed Central

    Xu, Liang; Yu, Heng; Han, Shu-Jen; Osterfeld, Sebastian; White, Robert L.; Pourmand, Nader; Wang, Shan X.

    2009-01-01

    Giant magnetoresistive (GMR) sensors are developed for a DNA microarray. Compared with conventional fluorescent sensors, GMR sensors are cheaper and more sensitive, generate fully electronic signals, and can be easily integrated with electronics and microfluidics. The GMR sensor used in this work has a bottom spin valve structure with an MR ratio of 12%. The single-strand target DNA detected has a length of 20 bases. Assays with DNA concentrations down to 10 pM were performed, with a dynamic range of 3 logs. A double modulation technique was used in signal detection to reduce the 1/f noise in the sensor while circumventing electromagnetic interference. The logarithmic relationship between the magnetic signal and the target DNA concentration can be described by the Temkin isotherm. Furthermore, GMR sensors integrated with microfluidics have great potential to improve the sensitivity to 1 pM or below, and the total assay time can be reduced to less than 1 hour. PMID:20824116
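
    A Temkin-type logarithmic response can be captured by a simple log-linear least-squares fit; the calibration points below are synthetic numbers spanning a 3-log range, not measured GMR data:

```python
import numpy as np

# synthetic calibration data: signal rises linearly with log10(concentration),
# mimicking a Temkin-type logarithmic response over a 3-log dynamic range
conc_pM = np.array([10.0, 100.0, 1000.0, 10000.0])
signal = np.array([12.0, 25.0, 37.0, 51.0])

# fit signal = slope * log10(C) + intercept
slope, intercept = np.polyfit(np.log10(conc_pM), signal, 1)

def estimate_concentration(s):
    """Invert the fitted log-linear curve to estimate concentration (pM)."""
    return 10.0 ** ((s - intercept) / slope)

print(round(slope, 2))  # 12.9 signal units per decade of concentration
```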

  15. The Stanford Tissue Microarray Database.

    PubMed

    Marinelli, Robert J; Montgomery, Kelli; Liu, Chih Long; Shah, Nigam H; Prapong, Wijan; Nitzberg, Michael; Zachariah, Zachariah K; Sherlock, Gavin J; Natkunam, Yasodha; West, Robert B; van de Rijn, Matt; Brown, Patrick O; Ball, Catherine A

    2008-01-01

    The Stanford Tissue Microarray Database (TMAD; http://tma.stanford.edu) is a public resource for disseminating annotated tissue images and associated expression data. Stanford University pathologists, researchers and their collaborators worldwide use TMAD for designing, viewing, scoring and analyzing their tissue microarrays. The use of tissue microarrays allows hundreds of human tissue cores to be simultaneously probed by antibodies to detect protein abundance (Immunohistochemistry; IHC), or by labeled nucleic acids (in situ hybridization; ISH) to detect transcript abundance. TMAD archives multi-wavelength fluorescence and bright-field images of tissue microarrays for scoring and analysis. As of July 2007, TMAD contained 205 161 images archiving 349 distinct probes on 1488 tissue microarray slides. Of these, 31 306 images for 68 probes on 125 slides have been released to the public. To date, 12 publications have been based on these raw public data. TMAD incorporates the NCI Thesaurus ontology for searching tissues in the cancer domain. Image processing researchers can extract images and scores for training and testing classification algorithms. The production server uses the Apache HTTP Server, Oracle Database and Perl application code. Source code is available to interested researchers under a no-cost license. PMID:17989087

  16. Comparing Bacterial DNA Microarray Fingerprints

    SciTech Connect

    Willse, Alan R.; Chandler, Darrell P.; White, Amanda M.; Protic, Miroslava; Daly, Don S.; Wunschel, Sharon C.

    2005-08-15

    Detecting subtle genetic differences between microorganisms is an important problem in molecular epidemiology and microbial forensics. In a typical investigation, gel electrophoresis is used to compare randomly amplified DNA fragments between microbial strains, where the patterns of DNA fragment sizes are proxies for a microbe's genotype. The limited genomic sample captured on a gel is often insufficient to discriminate nearly identical strains. This paper examines the application of microarray technology to DNA fingerprinting as a high-resolution alternative to gel-based methods. The so-called universal microarray, which uses short oligonucleotide probes that do not target specific genes or species, is intended to be applicable to all microorganisms because it does not require prior knowledge of genomic sequence. In principle, closely related strains can be distinguished if the number of probes on the microarray is sufficiently large, i.e., if the genome is sufficiently sampled. In practice, we confront noisy data, imperfectly matched hybridizations, and a high-dimensional inference problem. We describe the statistical problems of microarray fingerprinting, outline similarities with and differences from more conventional microarray applications, and illustrate the statistical fingerprinting problem for 10 closely related strains from three Bacillus species, and 3 strains from non-Bacillus species.
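
    As a toy illustration of the comparison problem, each strain's probe-hybridization intensities can be treated as a vector, with one minus the Pearson correlation as a dissimilarity; the profiles below are invented, and the paper's statistical treatment is considerably more sophisticated:

```python
import numpy as np

def fingerprint_distance(profiles):
    """Pairwise dissimilarity (1 - Pearson correlation) between strain
    hybridization-intensity profiles; rows are strains, columns are probes."""
    return 1.0 - np.corrcoef(profiles)

# invented intensity profiles over five probes
profiles = np.array([
    [0.90, 0.80, 0.10, 0.20, 0.70],   # strain A
    [0.85, 0.82, 0.15, 0.18, 0.72],   # near-identical strain A'
    [0.10, 0.20, 0.90, 0.80, 0.30],   # unrelated strain
])
d = fingerprint_distance(profiles)
```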

  17. Dual modality intravascular optical coherence tomography (OCT) and near-infrared fluorescence (NIRF) imaging: a fully automated algorithm for the distance-calibration of NIRF signal intensity for quantitative molecular imaging.

    PubMed

    Ughi, Giovanni J; Verjans, Johan; Fard, Ali M; Wang, Hao; Osborn, Eric; Hara, Tetsuya; Mauskapf, Adam; Jaffer, Farouc A; Tearney, Guillermo J

    2015-02-01

    Intravascular optical coherence tomography (IVOCT) is a well-established method for the high-resolution investigation of atherosclerosis in vivo. Intravascular near-infrared fluorescence (NIRF) imaging is a novel technique for the assessment of molecular processes associated with coronary artery disease. Integration of NIRF and IVOCT technology in a single catheter provides the capability to simultaneously obtain co-localized anatomical and molecular information from the artery wall. Since NIRF signal intensity attenuates as a function of imaging catheter distance to the vessel wall, the generation of quantitative NIRF data requires an accurate measurement of the vessel wall in IVOCT images. Given that dual-modality intravascular OCT-NIRF systems acquire data at a very high frame rate (>100 frames/s), a high number of images per pullback need to be analyzed, making manual processing of OCT-NIRF data extremely time consuming. To overcome this limitation, we developed an algorithm for the automatic distance-correction of dual-modality OCT-NIRF images. We validated this method by comparing automatic to manual segmentation results in 180 in vivo images from six atherosclerotic New Zealand White rabbits after indocyanine-green injection. A high Dice similarity coefficient was found (0.97 ± 0.03), together with an average individual A-line error of 22 µm (i.e., approximately twice the axial resolution of IVOCT) and a processing time of 44 ms per image. In a similar manner, the algorithm was validated using 120 IVOCT clinical images from eight different in vivo pullbacks in human coronary arteries. The results suggest that the proposed algorithm enables fully automatic visualization of dual-modality OCT-NIRF pullbacks, and provides an accurate and efficient calibration of NIRF data for quantification of the molecular agent in the atherosclerotic vessel wall. PMID:25341407
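
    Two pieces of such a pipeline lend themselves to a compact sketch: the distance correction itself and the Dice coefficient used for validation. The exponential attenuation model and the coefficient value below are illustrative assumptions, not the paper's calibration function:

```python
import math

def correct_nirf(raw_signal, distance_mm, mu_per_mm=0.5):
    """Distance-correct a NIRF intensity given the catheter-to-wall distance
    obtained from IVOCT lumen segmentation. Assumes simple exponential
    attenuation with coefficient mu (an illustrative value, not from the paper)."""
    return raw_signal * math.exp(mu_per_mm * distance_mm)

def dice(mask_a, mask_b):
    """Dice similarity coefficient, as used to compare automatic vs. manual
    segmentations (masks are 0/1 sequences of equal length)."""
    inter = sum(a and b for a, b in zip(mask_a, mask_b))
    return 2.0 * inter / (sum(mask_a) + sum(mask_b))

print(round(dice([1, 1, 1, 0, 0, 1], [1, 1, 0, 0, 1, 1]), 2))  # → 0.75
```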

  18. Dual modality intravascular optical coherence tomography (OCT) and near-infrared fluorescence (NIRF) imaging: a fully automated algorithm for the distance-calibration of NIRF signal intensity for quantitative molecular imaging

    PubMed Central

    Ughi, Giovanni J.; Verjans, Johan; Fard, Ali M.; Wang, Hao; Osborn, Eric; Hara, Tetsuya; Mauskapf, Adam; Jaffer, Farouc A.; Tearney, Guillermo J.

    2015-01-01

    Intravascular optical coherence tomography (IVOCT) is a well-established method for the high-resolution investigation of atherosclerosis in vivo. Intravascular near-infrared fluorescence (NIRF) imaging is a novel technique for the assessment of molecular processes associated with coronary artery disease. Integration of NIRF and IVOCT technology in a single catheter provides the capability to simultaneously obtain co-localized anatomical and molecular information from the artery wall. Since NIRF signal intensity attenuates as a function of imaging catheter distance to the vessel wall, the generation of quantitative NIRF data requires an accurate measurement of the vessel wall in IVOCT images. Given that dual-modality intravascular OCT-NIRF systems acquire data at a very high frame rate (>100 frames/second), a high number of images per pullback need to be analyzed, making manual processing of OCT-NIRF data extremely time consuming. To overcome this limitation, we developed an algorithm for the automatic distance-correction of dual-modality OCT-NIRF images. We validated this method by comparing automatic to manual segmentation results in 180 in vivo images from six atherosclerotic New Zealand White rabbits after indocyanine-green (ICG) injection. A high Dice similarity coefficient was found (0.97 ± 0.03), together with an average individual A-line error of 22 μm (i.e., approximately twice the axial resolution of IVOCT) and a processing time of 44 ms per image. In a similar manner, the algorithm was validated using 120 IVOCT clinical images from eight different in vivo pullbacks in human coronary arteries. The results suggest that the proposed algorithm enables fully automatic visualization of dual-modality OCT-NIRF pullbacks, and provides an accurate and efficient calibration of NIRF data for quantification of the molecular agent in the atherosclerotic vessel wall. PMID:25341407

  19. Nanodroplet chemical microarrays and label-free assays.

    PubMed

    Gosalia, Dhaval; Diamond, Scott L

    2010-01-01

    The microarraying of chemicals or biomolecules on a glass surface allows for dense storage and miniaturized screening experiments and can be deployed in chemical-biology research or drug discovery. Microarraying allows the production of scores of replicate slides. Small-molecule libraries are typically stored as 10 mM DMSO stock solutions, whereas libraries of biomolecules are typically stored in high percentages of glycerol. Thus, a method is required to print such libraries on microarrays and then assay them against biological targets. By printing either small-molecule or biomolecule libraries in an aqueous solvent containing glycerol, each adherent nanodroplet remains fixed at a position on the microarray by surface tension, without the use of wells, without evaporating, and without the need to chemically link the compound to the surface. Importantly, glycerol is a high-boiling-point solvent that is fully miscible with DMSO and water and has the additional property of stabilizing various enzymes. The nanoliter volume of the droplet forms the reaction compartment once additional reagents are metered onto the microarray, either by aerosol spray deposition or by addressable acoustic dispensing. Incubation of the nanodroplet microarray in a high-humidity environment controls the final water content of the reaction. This platform has been validated for fluorescent HTS assays of proteases and kinases, as well as for fluorogenic substrate profiling of proteases. Label-free HTS is also possible by running nanoliter HTS reactions on a MALDI target for mass spectrometry (MS) analysis without the need for desalting of the samples. A method is described for running nanoliter-scale multicomponent homogeneous reactions followed by label-free MALDI MS analysis of the reactions. PMID:20857358

  20. Characteristic attributes in cancer microarrays.

    PubMed

    Sarkar, I N; Planet, P J; Bael, T E; Stanley, S E; Siddall, M; DeSalle, R; Figurski, D H

    2002-04-01

    Rapid advances in genome sequencing and gene expression microarray technologies are providing unprecedented opportunities to identify specific genes involved in complex biological processes, such as development, signal transduction, and disease. The vast amount of data generated by these technologies has presented new challenges in bioinformatics. To help organize and interpret microarray data, new and efficient computational methods are needed to: (1) distinguish accurately between different biological or clinical categories (e.g., malignant vs. benign), and (2) identify specific genes that play a role in determining those categories. Here we present a novel and simple method that exhaustively scans microarray data for unambiguous gene expression patterns. Such patterns of data can be used as the basis for classification into biological or clinical categories. The method, termed the Characteristic Attribute Organization System (CAOS), is derived from fundamental precepts in systematic biology. In CAOS we define two types of characteristic attributes ('pure' and 'private') that may exist in gene expression microarray data. We also consider additional attributes ('compound') that are composed of expression states of more than one gene that are not characteristic on their own. CAOS was tested on three well-known cancer DNA microarray data sets for its ability to classify new microarray samples. We found CAOS to be a highly accurate and robust class prediction technique. In addition, CAOS identified specific genes, not emphasized in other analyses, that may be crucial to the biology of certain types of cancer. The success of CAOS in this study has significant implications for basic research and the future development of reliable methods for clinical diagnostic tools. PMID:12474425
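
    A 'pure' characteristic attribute in this sense can be read as a discretized expression state shared by every member of one class and absent from all others. A minimal scan in that spirit (the discretized data are invented for illustration; CAOS itself is more elaborate):

```python
def pure_attributes(samples, labels, target):
    """Find genes whose discretized expression state is identical in every
    sample of the target class and never occurs in any other class
    ('pure' characteristic attributes in CAOS terms)."""
    n_genes = len(samples[0])
    pure = []
    for g in range(n_genes):
        in_states = {s[g] for s, lab in zip(samples, labels) if lab == target}
        out_states = {s[g] for s, lab in zip(samples, labels) if lab != target}
        if len(in_states) == 1 and not (in_states & out_states):
            pure.append(g)
    return pure

# discretized expression states (0 = low, 1 = high); data are illustrative
samples = [[1, 0, 1], [1, 1, 1], [0, 0, 1], [0, 1, 1]]
labels = ["malignant", "malignant", "benign", "benign"]
print(pure_attributes(samples, labels, "malignant"))  # → [0]
```

    Gene 0 is 'pure' because every malignant sample is high and every benign sample is low; gene 2 is uninformative because both classes share the same state.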

  1. Microarrayed Materials for Stem Cells

    PubMed Central

    Mei, Ying

    2013-01-01

    Stem cells hold remarkable promise for applications in disease modeling, cancer therapy and regenerative medicine. Despite the significant progress made during the last decade, designing materials to control stem cell fate remains challenging. As an alternative, materials microarray technology has received great attention because it allows for high throughput materials synthesis and screening at a reasonable cost. Here, we discuss recent developments in materials microarray technology and their applications in stem cell engineering. Future opportunities in the field will also be reviewed. PMID:24311967

  2. Immunoprofiling Using NAPPA Protein Microarrays

    PubMed Central

    Sibani, Sahar; LaBaer, Joshua

    2012-01-01

    Protein microarrays provide an efficient method to immunoprofile patients in an effort to rapidly identify disease immunosignatures. The validity of using autoantibodies in diagnosis has been demonstrated in type 1 diabetes, rheumatoid arthritis, and systemic lupus, and is now being strongly considered in cancer. Several types of protein microarrays exist including antibody and antigen arrays. In this chapter, we describe the immunoprofiling application for one type of antigen array called NAPPA (nucleic acids programmable protein array). We provide a guideline for setting up the screening study and designing protein arrays to maximize the likelihood of obtaining quality data. PMID:21370064

  3. Microfluidic microarray systems and methods thereof

    DOEpatents

    West, Jay A. A.; Hukari, Kyle W.; Hux, Gary A.

    2009-04-28

    Disclosed are systems that include a manifold in fluid communication with a microfluidic chip having a microarray, an illuminator, and a detector in optical communication with the microarray. Methods for using these systems for biological detection are also disclosed.

  4. Microarray analysis: Uses and Limitations

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The use of microarray technology has exploded in recent years. All areas of biological research have found applications for this powerful platform. From human disease studies to microbial detection systems, a plethora of uses for this technology are currently in place, with new uses being developed ...

  5. Microarray Developed on Plastic Substrates.

    PubMed

    Bañuls, María-José; Morais, Sergi B; Tortajada-Genaro, Luis A; Maquieira, Ángel

    2016-01-01

    There is huge potential interest in using synthetic polymers as versatile solid supports for analytical microarraying. Chemical modification of polycarbonate (PC) for covalent immobilization of probes, micro-printing of protein or nucleic acid probes, development of indirect immunoassays, and development of hybridization protocols are described and discussed. PMID:26614067

  6. Automated, Miniaturized Instrument for Measuring Gene Expression in Space

    NASA Astrophysics Data System (ADS)

    Pohorille, Andrew; Danley, David; Payvan, Kia; Ricco, Antonio

    To facilitate astrobiological studies on the survival and adaptation of microorganisms and mixed microbial cultures to the space environment, we have been developing a fully automated, miniaturized system for measuring their gene expression on small spacecraft. This low-cost, multi-purpose instrument represents a major scientific and technological advancement in our ability to study the impact of the space environment on biological systems by providing data on cellular metabolism and regulation orders of magnitude richer than what is currently available. The system supports growth of the organism, lyses it to release the expressed RNA, labels the RNA, reads the expression levels of a large number of genes by microarray analysis of the labeled RNA, and transmits the measurements to Earth. To measure gene expression we use microarray technology developed by CombiMatrix, which is based on electrochemical reactions on arrays of electrodes on a semiconductor substrate. Since the electrical integrity of the microarray remains intact after probe synthesis, the circuitry can be employed to sense nucleic acid binding at each electrode. CombiMatrix arrays can be sectored to allow multiple samples per chip. In addition, a single array can be used for several assays. The array has been integrated into an automated microfluidic cartridge that uses flexible reagent blisters and pinch pumping to move liquid reagents between chambers. The proposed instrument will help to understand adaptation of terrestrial life to conditions beyond the planet of origin, identify deleterious effects of the space environment, develop effective countermeasures against these effects, and test our ability to sustain and grow in space organisms that can be used for life support and in situ resource utilization during long-duration space exploration. The instrument is suitable for small satellite platforms, which provide frequent, low-cost access to space. It can also be used on any other platform in space.

  7. Using Kepler for Tool Integration in Microarray Analysis Workflows

    PubMed Central

    Gan, Zhuohui; Stowe, Jennifer C.; Altintas, Ilkay; McCulloch, Andrew D.; Zambon, Alexander C.

    2015-01-01

    Increasing numbers of genomic technologies are producing massive amounts of genomic data, all of which require complex analysis. More and more bioinformatics analysis tools are being developed by scientists to simplify these analyses. However, different pipelines have been developed using different software environments, which makes integration of these diverse bioinformatics tools difficult. Kepler provides an open-source environment to integrate these disparate packages. Using Kepler, we integrated several external tools, including Bioconductor packages, AltAnalyze (a Python-based open-source tool), and an R-based comparison tool, to build an automated workflow to meta-analyze both online and local microarray data. The automated workflow connects the integrated tools seamlessly, delivers data flow between the tools smoothly, and hence improves the efficiency and accuracy of complex data analyses. Our workflow exemplifies the usage of Kepler as a scientific workflow platform for bioinformatics pipelines. PMID:26605000

  8. The Microarray Revolution: Perspectives from Educators

    ERIC Educational Resources Information Center

    Brewster, Jay L.; Beason, K. Beth; Eckdahl, Todd T.; Evans, Irene M.

    2004-01-01

    In recent years, microarray analysis has become a key experimental tool, enabling the analysis of genome-wide patterns of gene expression. This review approaches the microarray revolution with a focus upon four topics: 1) the early development of this technology and its application to cancer diagnostics; 2) a primer of microarray research,…

  9. Multipartite Fully Entangled Fraction

    NASA Astrophysics Data System (ADS)

    Xu, Jianwei

    2016-06-01

    Fully entangled fraction is a quantity defined for bipartite states; it is tightly related to bipartite maximally entangled states and has clear experimental and theoretical significance. In this work, we generalize it to the multipartite case, calling the generalized version the multipartite fully entangled fraction (MFEF). MFEF measures the closeness of a state to GHZ states. Analytical expressions for the MFEF are very difficult to obtain except for very special states; however, we show that the MFEF of any state is determined by a system of finite-order polynomial equations. Therefore, the MFEF can be efficiently computed numerically.
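
    For context, the bipartite fully entangled fraction is usually defined as the maximal overlap with a maximally entangled state; a standard formulation and a natural GHZ-based multipartite analogue (sketched in conventional notation, not necessarily the paper's exact definition) are:

```latex
% Bipartite fully entangled fraction of a state \rho on C^d \otimes C^d:
F(\rho) = \max_{|\Phi\rangle} \, \langle \Phi | \rho | \Phi \rangle ,
\qquad |\Phi\rangle \ \text{maximally entangled.}

% A natural multipartite analogue (MFEF) measures closeness to GHZ states,
% maximizing over local unitaries U_1, \dots, U_n:
\mathrm{MFEF}(\rho) = \max_{U_1, \dots, U_n}
\langle \mathrm{GHZ} | \, (U_1 \otimes \cdots \otimes U_n) \, \rho \,
(U_1 \otimes \cdots \otimes U_n)^{\dagger} \, | \mathrm{GHZ} \rangle .
```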

  10. Wire-Guide Manipulator For Automated Welding

    NASA Technical Reports Server (NTRS)

    Morris, Tim; White, Kevin; Gordon, Steve; Emerich, Dave; Richardson, Dave; Faulkner, Mike; Stafford, Dave; Mccutcheon, Kim; Neal, Ken; Milly, Pete

    1994-01-01

    Compact motor drive positions guide for welding filler wire. Drive part of automated wire feeder in partly or fully automated welding system. Drive unit contains three parallel subunits. Rotations of lead screws in three subunits coordinated to obtain desired motions in three degrees of freedom. Suitable for both variable-polarity plasma arc welding and gas/tungsten arc welding.

  11. ArrayPipe: a flexible processing pipeline for microarray data.

    PubMed

    Hokamp, Karsten; Roche, Fiona M; Acab, Michael; Rousseau, Marc-Etienne; Kuo, Byron; Goode, David; Aeschliman, Dana; Bryan, Jenny; Babiuk, Lorne A; Hancock, Robert E W; Brinkman, Fiona S L

    2004-07-01

    A number of microarray analysis software packages exist already; however, none combines the user-friendly features of a web-based interface with the ability to analyse multiple arrays at once using flexible analysis steps. The ArrayPipe web server (freely available at www.pathogenomics.ca/arraypipe) allows the automated application of complex analyses to microarray data, which can range from single slides to large data sets including replicates and dye-swaps. It handles output from most commonly used quantification software packages for dual-labelled arrays. Application features range from quality assessment of slides through various data visualizations to multi-step analyses including normalization, detection of differentially expressed genes, and comparison and highlighting of gene lists. A highly customizable action set-up facilitates unrestricted arrangement of functions, which can be stored as action profiles. A unique combination of web-based and command-line functionality enables comfortable configuration of processes that can be repeatedly applied to large data sets in high throughput. The output consists of reports formatted as standard web pages and tab-delimited lists of calculated values that can be inserted into other analysis programs. Additional features, such as web-based spreadsheet functionality, auto-parallelization and password protection, make this a powerful tool in microarray research for individuals and large groups alike. PMID:15215429
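
    As a flavor of the normalization stage such a pipeline automates, median-centering of dual-channel log ratios might look like this (a deliberately simple stand-in; ArrayPipe's actual normalization options are richer):

```python
import numpy as np

def median_normalize(cy5, cy3):
    """Median-center per-spot log2 ratios from a dual-labelled array so the
    median spot has ratio 0: one of the simplest normalization steps applied
    before detecting differentially expressed genes."""
    m = np.log2(np.asarray(cy5)) - np.log2(np.asarray(cy3))
    return m - np.median(m)

# toy channel intensities for three spots (illustrative numbers)
ratios = median_normalize([2.0, 4.0, 8.0], [1.0, 1.0, 1.0])
print(ratios)  # → [-1.  0.  1.]
```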

  12. Biclustering of time series microarray data.

    PubMed

    Meng, Jia; Huang, Yufei

    2012-01-01

    Clustering is a popular data exploration technique widely used in microarray data analysis. In this chapter, we review the ideas and algorithms of biclustering and its applications in time series microarray analysis. We first introduce the concept and importance of biclustering and its different variations. We then focus our discussion on the popular iterative signature algorithm (ISA) for searching for biclusters in a microarray dataset. Next, we discuss in detail the enrichment constraint time-dependent ISA (ECTDISA) for identifying biologically meaningful temporal transcription modules from time series microarray datasets. In the end, we provide an example of ECTDISA applied to time series microarray data of Kaposi's Sarcoma-associated Herpesvirus (KSHV) infection. PMID:22130875
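
    The ISA iteration alternates between scoring conditions against the current gene set and genes against the selected conditions, until the bicluster stabilizes; a simplified sketch with a planted bicluster (the matrix, noise level and thresholds are illustrative):

```python
import numpy as np

def zscore(v):
    return (v - v.mean()) / v.std()

def isa(E, genes0, t_c=1.0, t_g=1.0, n_iter=50):
    """Simplified iterative signature algorithm (ISA). E is a genes-by-conditions
    expression matrix. Starting from a seed gene set, alternately keep the
    conditions scoring more than t_c standard deviations above average for the
    current genes, then the genes scoring more than t_g over those conditions,
    until the bicluster stabilizes."""
    genes = np.asarray(genes0)
    conds = np.arange(E.shape[1])
    for _ in range(n_iter):
        cond_score = E[genes].mean(axis=0)        # how strongly each condition responds
        conds = np.where(zscore(cond_score) > t_c)[0]
        if conds.size == 0:
            break
        gene_score = E[:, conds].mean(axis=1)     # how strongly each gene responds
        new_genes = np.where(zscore(gene_score) > t_g)[0]
        if np.array_equal(new_genes, genes):      # converged
            break
        genes = new_genes
    return genes, conds

rng = np.random.default_rng(1)
E = rng.normal(0.0, 0.1, size=(10, 8))            # low-noise background (illustrative)
E[:4, :3] += 5.0                                  # planted bicluster: genes 0-3 x conditions 0-2
genes, conds = isa(E, genes0=[0, 1])
print(genes.tolist(), conds.tolist())  # → [0, 1, 2, 3] [0, 1, 2]
```

    Seeded with only two of the four planted genes, the iteration recovers the full bicluster and then stops changing.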

  13. The Current Status of DNA Microarrays

    NASA Astrophysics Data System (ADS)

    Shi, Leming; Perkins, Roger G.; Tong, Weida

    DNA microarray technology that allows simultaneous assay of thousands of genes in a single experiment has steadily advanced to become a mainstream method used in research, and has reached a stage that envisions its use in medical applications and personalized medicine. Many different strategies have been developed for manufacturing DNA microarrays. In this chapter, we discuss the manufacturing characteristics of seven microarray platforms that were used in a recently completed large study by the MicroArray Quality Control (MAQC) consortium, which evaluated the concordance of results across these platforms. The platforms can be grouped into three categories: (1) in situ synthesis of oligonucleotide probes on microarrays (Affymetrix GeneChip® arrays based on photolithography synthesis and Agilent's arrays based on inkjet synthesis); (2) spotting of presynthesized oligonucleotide probes on microarrays (GE Healthcare's CodeLink system, Applied Biosystems' Genome Survey Microarrays, and the custom microarrays printed with Operon's oligonucleotide set); and (3) deposition of presynthesized oligonucleotide probes on bead-based microarrays (Illumina's BeadChip microarrays). We conclude this chapter with our views on the challenges and opportunities toward acceptance of DNA microarray data in clinical and regulatory settings.

  15. Tissue microarrays: applications in genomic research.

    PubMed

    Watanabe, Aprill; Cornelison, Robert; Hostetter, Galen

    2005-03-01

    The widespread application of tissue microarrays in cancer research and the clinical pathology laboratory demonstrates a versatile and portable technology. The rapid integration of tissue microarrays into biomarker discovery and validation processes reflects the forward thinking of researchers who have pioneered the high-density tissue microarray. The precise arrangement of hundreds of archival clinical tissue samples into a composite tissue microarray block is now a proven method for the efficient and standardized analysis of molecular markers. With applications in cancer research, tissue microarrays are a valuable tool in validating candidate markers discovered in highly sensitive genome-wide microarray experiments. With applications in clinical pathology, tissue microarrays are used widely in immunohistochemistry quality control and quality assurance. The timeline of a biomarker implicated in prostate neoplasia, which was identified by complementary DNA expression profiling, validated by tissue microarrays and is now used as a prognostic immunohistochemistry marker, is reviewed. The tissue microarray format provides opportunities for digital imaging acquisition, image processing and database integration. Advances in digital imaging help to alleviate previous bottlenecks in the research pipeline, permit computer image scoring and convey telepathology opportunities for remote image analysis. The tissue microarray industry now includes public and private sectors with varying degrees of research utility and offers a range of potential tissue microarray applications in basic research, prognostic oncology and drug discovery. PMID:15833047

  16. Microarray analysis in pulmonary hypertension

    PubMed Central

    Hoffmann, Julia; Wilhelm, Jochen; Olschewski, Andrea

    2016-01-01

    Microarrays are a powerful and effective tool that allows the detection of genome-wide gene expression differences between controls and disease conditions. They have been broadly applied to investigate the pathobiology of diverse forms of pulmonary hypertension, namely group 1, including patients with idiopathic pulmonary arterial hypertension, and group 3, including pulmonary hypertension associated with chronic lung diseases such as chronic obstructive pulmonary disease and idiopathic pulmonary fibrosis. To date, numerous human microarray studies have been conducted to analyse global (lung homogenate samples), compartment-specific (laser capture microdissection), cell type-specific (isolated primary cells) and circulating cell (peripheral blood) expression profiles. Combined, they provide important information on development, progression and the end-stage disease. In the future, system biology approaches, expression of noncoding RNAs that regulate coding RNAs, and direct comparison between animal models and human disease might be of importance. PMID:27076594

  17. Phenotypic MicroRNA Microarrays

    PubMed Central

    Kwon, Yong-Jun; Heo, Jin Yeong; Kim, Hi Chul; Kim, Jin Yeop; Liuzzi, Michel; Soloveva, Veronica

    2013-01-01

    Microarray technology has become a very popular approach in cases where multiple experiments need to be conducted repeatedly or done with a variety of samples. In our lab, we are applying our high-density spot microarray approach to microscopy visualization of the effects of transiently introduced siRNA or cDNA on cellular morphology or phenotype. In this publication, we discuss the possibility of using this micro-scale high-throughput process to study the role of microRNAs in the biology of selected cellular models. After reverse transfection of microRNAs and siRNA, the resulting cellular phenotypes showed that the microRNAs regulated NF-κB expression comparably to the siRNA. The ability to print microRNA molecules for reverse transfection into cells opens up a wide horizon for phenotypic high-content screening of microRNA libraries using cellular disease models.

  18. Microarray analysis in pulmonary hypertension.

    PubMed

    Hoffmann, Julia; Wilhelm, Jochen; Olschewski, Andrea; Kwapiszewska, Grazyna

    2016-07-01

    Microarrays are a powerful and effective tool that allows the detection of genome-wide gene expression differences between controls and disease conditions. They have been broadly applied to investigate the pathobiology of diverse forms of pulmonary hypertension, namely group 1, including patients with idiopathic pulmonary arterial hypertension, and group 3, including pulmonary hypertension associated with chronic lung diseases such as chronic obstructive pulmonary disease and idiopathic pulmonary fibrosis. To date, numerous human microarray studies have been conducted to analyse global (lung homogenate samples), compartment-specific (laser capture microdissection), cell type-specific (isolated primary cells) and circulating cell (peripheral blood) expression profiles. Combined, they provide important information on development, progression and the end-stage disease. In the future, system biology approaches, expression of noncoding RNAs that regulate coding RNAs, and direct comparison between animal models and human disease might be of importance. PMID:27076594

  19. Self-Assembling Protein Microarrays

    NASA Astrophysics Data System (ADS)

    Ramachandran, Niroshan; Hainsworth, Eugenie; Bhullar, Bhupinder; Eisenstein, Samuel; Rosen, Benjamin; Lau, Albert Y.; C. Walter, Johannes; LaBaer, Joshua

    2004-07-01

    Protein microarrays provide a powerful tool for the study of protein function. However, they are not widely used, in part because of the challenges in producing proteins to spot on the arrays. We generated protein microarrays by printing complementary DNAs onto glass slides and then translating target proteins with mammalian reticulocyte lysate. Epitope tags fused to the proteins allowed them to be immobilized in situ. This obviated the need to purify proteins, avoided protein stability problems during storage, and captured sufficient protein for functional studies. We used the technology to map pairwise interactions among 29 human DNA replication initiation proteins, recapitulate the regulation of Cdt1 binding to select replication proteins, and map its geminin-binding domain.

  20. Optimisation algorithms for microarray biclustering.

    PubMed

    Perrin, Dimitri; Duhamel, Christophe

    2013-01-01

    In providing simultaneous information on expression profiles for thousands of genes, microarray technologies have, in recent years, been widely used to investigate mechanisms of gene expression. Clustering and classification of such data can, indeed, highlight patterns and provide insight into biological processes. A common approach is to consider the genes and samples of microarray datasets as nodes in a bipartite graph, where edges are weighted, e.g., based on the expression levels. In this paper, using a previously evaluated weighting scheme, we focus on search algorithms and evaluate, in the context of biclustering, several variations of Genetic Algorithms. We also introduce a new heuristic, "Propagate", which consists in recursively evaluating neighbour solutions with one more or one fewer active condition. The results obtained on three well-known datasets show that, for a given weighting scheme, optimal or near-optimal solutions can be identified. PMID:24109756
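    A "Propagate"-style move, as described, can be sketched as a recursive local search over subsets of active conditions. This is a generic hill-climbing interpretation under an arbitrary scoring function, not the authors' exact procedure:

    ```python
    def propagate(score, n_conds, active):
        """Recursively evaluate neighbour solutions that differ by one
        condition (one added or one removed), move to the best improving
        neighbour, and stop at a local optimum.

        score: callable mapping a frozenset of condition indices to a number.
        """
        active = frozenset(active)
        best, best_score = active, score(active)
        for c in range(n_conds):
            neighbour = active - {c} if c in active else active | {c}
            s = score(neighbour)
            if s > best_score:
                best, best_score = neighbour, s
        if best == active:
            return active  # no neighbour improves: local optimum
        return propagate(score, n_conds, best)
    ```

    With a score that rewards a planted set of conditions and penalizes the rest, the search climbs to that set from an empty start.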

  1. Process automation

    SciTech Connect

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs.

  2. Simple Fully Automated Group Classification on Brain fMRI

    SciTech Connect

    Honorio, J.; Goldstein, R.; Honorio, J.; Samaras, D.; Tomasi, D.; Goldstein, R.Z.

    2010-04-14

    We propose a simple, well-grounded classification technique suited for group classification on brain fMRI data sets that have high dimensionality, a small number of subjects, high noise levels, high subject variability, and imperfect registration, and that capture subtle cognitive effects. We propose threshold-split region as a new feature selection method and majority vote as the classification technique. Our method does not require a predefined set of regions of interest. We use the average across sessions, only one feature per experimental condition, a feature independence assumption, and simple classifiers. The seemingly counter-intuitive approach of using a simple design is supported by signal processing and statistical theory. Experimental results on two block-design data sets that capture brain function under distinct monetary rewards for cocaine-addicted and control subjects show that our method exhibits increased generalization accuracy compared to commonly used feature selection and classification techniques.
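    The final classification step described above, one simple per-feature classifier combined by majority vote, can be sketched as follows. The threshold-per-feature form is an assumption for illustration; it is not the authors' exact threshold-split region construction:

    ```python
    def majority_vote_classify(sample, thresholds, signs):
        """Classify by majority vote over per-feature threshold classifiers.

        sample: feature values for one subject.
        thresholds: per-feature cut points learned independently.
        signs: +1 if values above the threshold vote for the positive
        class, -1 if values below it do. Returns 1 or 0.
        """
        votes = sum(1 if s * (x - t) > 0 else 0
                    for x, t, s in zip(sample, thresholds, signs))
        return 1 if votes > len(thresholds) / 2 else 0
    ```

    Because each feature votes independently, no multivariate model is fit, which suits the small-sample, high-dimensional regime the abstract describes.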

  3. Investigating Factors Affecting the Uptake of Automated Assessment Technology

    ERIC Educational Resources Information Center

    Dreher, Carl; Reiners, Torsten; Dreher, Heinz

    2011-01-01

    Automated assessment is an emerging innovation in educational praxis, however its pedagogical potential is not fully utilised in Australia, particularly regarding automated essay grading. The rationale for this research is that the usage of automated assessment currently lags behind the capacity that the technology provides, thus restricting the…

  4. Automation pilot

    NASA Technical Reports Server (NTRS)

    1983-01-01

    An important concept of the Action Information Management System (AIMS) approach is to evaluate office automation technology in the context of hands-on use by technical program managers, and to assess the human acceptance difficulties which may accompany the transition to a significantly changing work environment. The improved productivity and communications which result from the application of office automation technology are already well established for general office environments, but benefits unique to NASA are anticipated, and these will be explored in detail.

  5. Automated Urinalysis

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Information from NASA Tech Briefs assisted DiaSys Corporation in the development of the R/S 2000 which automates urinalysis, eliminating most manual procedures. An automatic aspirator is inserted into a standard specimen tube, the "Sample" button is pressed, and within three seconds a consistent amount of urine sediment is transferred to a microscope. The instrument speeds up, standardizes, automates and makes urine analysis safer. Additional products based on the same technology are anticipated.

  6. Automating the Purple Crow Lidar

    NASA Astrophysics Data System (ADS)

    Hicks, Shannon; Sica, R. J.; Argall, P. S.

    2016-06-01

    The Purple Crow LiDAR (PCL) was built to measure short and long term coupling between the lower, middle, and upper atmosphere. The initial component of my MSc. project is to automate two key elements of the PCL: the rotating liquid mercury mirror and the Zaber alignment mirror. In addition to the automation of the Zaber alignment mirror, it is also necessary to describe the mirror's movement and positioning errors. Its properties will then be added into the alignment software. Once the alignment software has been completed, we will compare the new alignment method with the previous manual procedure. This is the first among several projects that will culminate in a fully-automated lidar. Eventually, we will be able to work remotely, thereby increasing the amount of data we collect. This paper will describe the motivation for automation, the methods we propose, preliminary results for the Zaber alignment error analysis, and future work.

  7. Automated High Throughput Drug Target Crystallography

    SciTech Connect

    Rupp, B

    2005-02-18

    The molecular structures of drug target proteins and receptors form the basis for 'rational', or structure-guided, drug design. The majority of target structures are experimentally determined by protein X-ray crystallography, which has evolved into a highly automated, high-throughput drug discovery and screening tool. Process automation has accelerated tasks from parallel protein expression, through fully automated crystallization and rapid data collection, to highly efficient structure determination methods. A thoroughly designed automation technology platform, supported by a powerful informatics infrastructure, forms the basis for optimal workflow implementation and for the data mining and analysis tools needed to generate new leads from experimental protein drug target structures.

  8. Integrated Amplification Microarrays for Infectious Disease Diagnostics

    PubMed Central

    Chandler, Darrell P.; Bryant, Lexi; Griesemer, Sara B.; Gu, Rui; Knickerbocker, Christopher; Kukhtin, Alexander; Parker, Jennifer; Zimmerman, Cynthia; George, Kirsten St.; Cooney, Christopher G.

    2012-01-01

    This overview describes microarray-based tests that combine solution-phase amplification chemistry and microarray hybridization within a single microfluidic chamber. The integrated biochemical approach improves microarray workflow for diagnostic applications by reducing the number of steps and minimizing the potential for sample or amplicon cross-contamination. Examples described herein illustrate a basic, integrated approach for DNA and RNA genomes, and a simple consumable architecture for incorporating wash steps while retaining an entirely closed system. It is anticipated that integrated microarray biochemistry will provide an opportunity to significantly reduce the complexity and cost of microarray consumables, equipment, and workflow, which in turn will enable a broader spectrum of users to exploit the intrinsic multiplexing power of microarrays for infectious disease diagnostics.

  9. THE ABRF MARG MICROARRAY SURVEY 2005: TAKING THE PULSE ON THE MICROARRAY FIELD

    EPA Science Inventory

    Over the past several years microarray technology has evolved into a critical component of any discovery based program. Since 1999, the Association of Biomolecular Resource Facilities (ABRF) Microarray Research Group (MARG) has conducted biennial surveys designed to generate a pr...

  10. Living Cell Microarrays: An Overview of Concepts.

    PubMed

    Jonczyk, Rebecca; Kurth, Tracy; Lavrentieva, Antonina; Walter, Johanna-Gabriela; Scheper, Thomas; Stahl, Frank

    2016-01-01

    Living cell microarrays are a highly efficient cellular screening system. Due to the low number of cells required per spot, cell microarrays enable the use of primary and stem cells and provide resolution close to the single-cell level. Apart from a variety of conventional static designs, microfluidic microarray systems have also been established. An alternative format is a microarray consisting of three-dimensional cell constructs ranging from cell spheroids to cells encapsulated in hydrogel. These systems provide an in vivo-like microenvironment and are preferably used for the investigation of cellular physiology, cytotoxicity, and drug screening. Thus, many different high-tech microarray platforms are currently available. Disadvantages of many systems include their high cost, the requirement of specialized equipment for their manufacture, and the poor comparability of results between different platforms. In this article, we provide an overview of static, microfluidic, and 3D cell microarrays. In addition, we describe a simple method for the printing of living cell microarrays on modified microscope glass slides using standard DNA microarray equipment available in most laboratories. Applications in research and diagnostics are discussed, e.g., the selective and sensitive detection of biomarkers. Finally, we highlight current limitations and the future prospects of living cell microarrays. PMID:27600077

  12. Clustering Short Time-Series Microarray

    NASA Astrophysics Data System (ADS)

    Ping, Loh Wei; Hasan, Yahya Abu

    2008-01-01

    Most microarray analyses are carried out on static gene expression. However, the dynamical study of microarrays has lately gained more attention. Most research on time-series microarrays emphasizes the bioscience and medical aspects, but little addresses the numerical aspects. This study attempts to analyze short time-series microarray data mathematically using the STEM clustering tool, which first preprocesses the data and then clusters it. We next introduce the Circular Mould Distance (CMD) algorithm, which combines both preprocessing and clustering analysis. The two methods are subsequently compared in terms of efficiency.

  13. Protein microarrays as tools for functional proteomics.

    PubMed

    LaBaer, Joshua; Ramachandran, Niroshan

    2005-02-01

    Protein microarrays present an innovative and versatile approach to study protein abundance and function at an unprecedented scale. Given the chemical and structural complexity of the proteome, the development of protein microarrays has been challenging. Despite these challenges there has been a marked increase in the use of protein microarrays to map interactions of proteins with various other molecules, and to identify potential disease biomarkers, especially in the area of cancer biology. In this review, we discuss some of the promising advances made in the development and use of protein microarrays. PMID:15701447

  14. Array2BIO: A Comprehensive Suite of Utilities for the Analysis of Microarray Data

    SciTech Connect

    Loots, G G; Chain, P G; Mabery, S; Rasley, A; Garcia, E; Ovcharenko, I

    2006-02-13

    We have developed an integrative and automated toolkit for the analysis of Affymetrix microarray data, named Array2BIO. It identifies groups of coexpressed genes using two complementary approaches: comparative analysis of signal versus control microarrays, and clustering analysis of gene expression across different conditions. The identified genes are assigned to functional categories based on the Gene Ontology classification, with detection of corresponding KEGG protein interaction pathways. Array2BIO reliably handles low-expressor genes and provides a set of statistical methods to quantify the odds of observations, including the Benjamini-Hochberg and Bonferroni multiple testing corrections. An automated interface with the ECR Browser provides evolutionary conservation analysis of identified gene loci, while the interconnection with Creme allows high-throughput analysis of human promoter regions and prediction of the gene regulatory elements that underlie the observed expression patterns. Array2BIO is publicly available at http://array2bio.dcode.org.
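    Of the multiple-testing corrections named above, the Benjamini-Hochberg step-up procedure is the standard way to control the false discovery rate across thousands of per-gene tests. A minimal self-contained sketch (not Array2BIO's code):

    ```python
    def benjamini_hochberg(pvals, alpha=0.05):
        """Benjamini-Hochberg step-up procedure.

        Returns a list of booleans marking which hypotheses are
        rejected at false-discovery rate alpha.
        """
        m = len(pvals)
        order = sorted(range(m), key=lambda i: pvals[i])
        # Find the largest rank k with p_(k) <= (k / m) * alpha,
        # then reject the k smallest p-values.
        k = 0
        for rank, i in enumerate(order, start=1):
            if pvals[i] <= rank / m * alpha:
                k = rank
        reject = [False] * m
        for i in order[:k]:
            reject[i] = True
        return reject
    ```

    Unlike the Bonferroni correction (which compares every p-value to alpha/m), the step-up rule lets later ranks rescue borderline p-values, making it less conservative on large gene lists.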

  15. Photoelectrochemical synthesis of DNA microarrays

    PubMed Central

    Chow, Brian Y.; Emig, Christopher J.; Jacobson, Joseph M.

    2009-01-01

    Optical addressing of semiconductor electrodes represents a powerful technology that enables the independent and parallel control of a very large number of electrical phenomena at the solid-electrolyte interface. To date, it has been used in a wide range of applications including electrophoretic manipulation, biomolecule sensing, and stimulating networks of neurons. Here, we have adapted this approach for the parallel addressing of redox reactions, and report the construction of a DNA microarray synthesis platform based on semiconductor photoelectrochemistry (PEC). An amorphous silicon photoconductor is activated by an optical projection system to create virtual electrodes capable of electrochemically generating protons; these PEC-generated protons then cleave the acid-labile dimethoxytrityl protecting groups of DNA phosphoramidite synthesis reagents with the requisite spatial selectivity to generate DNA microarrays. Furthermore, a thin-film porous glass dramatically increases the amount of DNA synthesized per chip by over an order of magnitude versus uncoated glass. This platform demonstrates that PEC can be used toward combinatorial bio-polymer and small molecule synthesis. PMID:19706433

  16. THE ABRF-MARG MICROARRAY SURVEY 2004: TAKING THE PULSE OF THE MICROARRAY FIELD

    EPA Science Inventory

    Over the past several years, the field of microarrays has grown and evolved drastically. In its continued efforts to track this evolution, the ABRF-MARG has once again conducted a survey of international microarray facilities and individual microarray users. The goal of the surve...

  17. 2008 Microarray Research Group (MARG Survey): Sensing the State of Microarray Technology

    EPA Science Inventory

    Over the past several years, the field of microarrays has grown and evolved drastically. In its continued efforts to track this evolution and transformation, the ABRF-MARG has once again conducted a survey of international microarray facilities and individual microarray users. Th...

  18. Habitat automation

    NASA Technical Reports Server (NTRS)

    Swab, Rodney E.

    1992-01-01

    A habitat, on either the surface of the Moon or Mars, will be designed and built with the proven technologies of that day. These technologies will be mature and readily available to the habitat designer. We believe an acceleration of the normal pace of automation would allow a habitat to be safer and more easily maintained than would be the case otherwise. This document examines the operation of a habitat and describes elements of that operation which may benefit from an increased use of automation. Research topics within the automation realm are then defined and discussed with respect to the role they can have in the design of the habitat. Problems associated with the integration of advanced technologies into real-world projects at NASA are also addressed.

  19. Innovative instrumentation for microarray scanning and analysis: application for characterization of oligonucleotide duplexes behavior.

    PubMed

    Khomyakova, E B; Dreval, E V; Tran-Dang, M; Potier, M C; Soussaline, F P

    2004-05-01

    Accuracy in microarray technology requires new approaches to microarray reader development. A microarray reader system (optical scanning array, or OSA, reader) based on automated microscopy with a large field of view and high-speed, three-axis scanning at multiple narrow-band spectra of excitation light has been developed. It allows fast capture of high-resolution, multi-fluorescence images and is characterized by a linear dynamic range and sensitivity comparable to commonly used photomultiplier tube (PMT)-based laser scanners. Controlled by high-performance software, the instrument can be used for scanning and quantitative analysis of any type of dry microarray. Studies employing a temperature-controlled hybridization chamber containing a microarray can also be performed. This enables the registration of kinetics and melting curves, a feature required for a wide range of on-chip chemical and enzymatic reactions, including on-chip PCR amplification. We used the OSA reader to characterize the hybridization and melting behaviour of oligonucleotide:oligonucleotide duplexes on three-dimensional CodeLink slides. PMID:15209342

  20. Microarrays Made Simple: "DNA Chips" Paper Activity

    ERIC Educational Resources Information Center

    Barnard, Betsy

    2006-01-01

    DNA microarray technology is revolutionizing biological science. DNA microarrays (also called DNA chips) allow simultaneous screening of many genes for changes in expression between different cells. Now researchers can obtain information about genes in days or weeks that used to take months or years. The paper activity described in this article…

  1. Automating Finance

    ERIC Educational Resources Information Center

    Moore, John

    2007-01-01

    In past years, higher education's financial management side has been riddled with manual processes and aging mainframe applications. This article discusses schools which had taken advantage of an array of technologies that automate billing, payment processing, and refund processing in the case of overpayment. The investments are well worth it:…

  2. Automated dispenser

    SciTech Connect

    Hollen, R.M.; Stalnaker, N.D.

    1989-04-06

    An automated dispenser having a conventional pipette attached to an actuating cylinder through a flexible cable for delivering precise quantities of a liquid through commands from remotely located computer software. The travel of the flexible cable is controlled by adjustable stops and a locking shaft. The pipette can be positioned manually or by the hands of a robot. 1 fig.

  3. Protein-Based Microarray for the Detection of Pathogenic Bacteria

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Microarrays have been used for gene expression and protein interaction studies, but recently, multianalyte diagnostic assays have employed the microarray platform. We developed a microarray immunoassay for bacteria, with biotinylated capture antibodies on streptavidin slides. To complete the fluor...

  4. Tissue Microarrays in Clinical Oncology

    PubMed Central

    Voduc, David; Kenney, Challayne; Nielsen, Torsten O.

    2008-01-01

    The tissue microarray (TMA) is a recently implemented, high-throughput technology for the analysis of molecular markers in oncology. This research tool permits the rapid assessment of a biomarker in thousands of tumor samples, using commonly available laboratory assays such as immunohistochemistry and in-situ hybridization. Although introduced less than a decade ago, the TMA has proven to be invaluable in the study of tumor biology, the development of diagnostic tests, and the investigation of oncological biomarkers. This review describes the impact of TMA-based research in clinical oncology and its potential future applications. Technical aspects of TMA construction, and the advantages and disadvantages inherent to this technology, are also discussed. PMID:18314063

  5. DNA Microarrays for Identifying Fishes

    PubMed Central

    Nölte, M.; Weber, H.; Silkenbeumer, N.; Hjörleifsdottir, S.; Hreggvidsson, G. O.; Marteinsson, V.; Kappel, K.; Planes, S.; Tinti, F.; Magoulas, A.; Garcia Vazquez, E.; Turan, C.; Hervet, C.; Campo Falgueras, D.; Antoniou, A.; Landi, M.; Blohm, D.

    2008-01-01

    In many cases marine organisms, and especially their diverse developmental stages, are difficult to identify by morphological characters. DNA-based identification methods offer an analytically powerful addition or even an alternative. In this study, a DNA microarray has been developed to investigate its potential as a tool for the identification of fish species from European seas based on mitochondrial 16S rDNA sequences. Eleven commercially important fish species were selected for a first prototype. Oligonucleotide probes were designed based on the 16S rDNA sequences obtained from 230 individuals of 27 fish species. In addition, more than 1200 sequences of 380 species served as a sequence background against which the specificity of the probes was tested in silico. Single target hybridisations with Cy5-labelled, PCR-amplified 16S rDNA fragments from each of the 11 species on microarrays containing the complete set of probes confirmed their suitability. True-positive fluorescence signals were at least one order of magnitude stronger than false-positive cross-hybridisations. Single nontarget hybridisations resulted in cross-hybridisation signals in approximately 27% of the cases tested, but all of them were at least one order of magnitude lower than true-positive signals. This study demonstrates that the 16S rDNA gene is suitable for designing oligonucleotide probes, which can be used to differentiate 11 fish species. These data are a solid basis for the second step: creating a “Fish Chip” for approximately 50 fish species relevant in marine environmental and fisheries research, as well as in the control of fisheries products. PMID:18270778
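    The decision rule implied by the abstract (a species call is accepted only when the strongest probe signal exceeds the runner-up by at least one order of magnitude) can be sketched as follows. This is an illustrative toy: the function name, species labels and intensity values are assumptions, not data from the published array.

    ```python
    # Toy sketch of the "one order of magnitude" call rule from the abstract.
    # Probe names and intensities are invented for illustration.

    def call_species(signals, ratio=10.0):
        """signals: dict mapping species probe -> fluorescence intensity.
        Returns the called species, or None when the top signal is not at
        least `ratio` times the runner-up (ambiguous cross-hybridisation)."""
        ranked = sorted(signals.items(), key=lambda kv: kv[1], reverse=True)
        if not ranked:
            return None
        if len(ranked) == 1:
            return ranked[0][0]
        (top, top_val), (_, second_val) = ranked[0], ranked[1]
        if second_val == 0 or top_val / second_val >= ratio:
            return top
        return None

    # Clear call: the top probe is ~17x the strongest cross-hybridisation.
    print(call_species({"G. morhua": 5200.0, "S. salar": 310.0}))
    # Ambiguous: signals differ by far less than one order of magnitude.
    print(call_species({"G. morhua": 900.0, "S. salar": 700.0}))
    ```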

  6. Automated Methods for Multiplexed Pathogen Detection

    SciTech Connect

    Straub, Tim M.; Dockendorff, Brian P.; Quinonez-Diaz, Maria D.; Valdez, Catherine O.; Shutthanandan, Janani I.; Tarasevich, Barbara J.; Grate, Jay W.; Bruckner-Lea, Cindy J.

    2005-09-01

    Detection of pathogenic microorganisms in environmental samples is a difficult process. Concentration of the organisms of interest also co-concentrates inhibitors of many end-point detection methods, notably nucleic acid methods. In addition, sensitive, highly multiplexed pathogen detection continues to be problematic. The primary function of the BEADS (Biodetection Enabling Analyte Delivery System) platform is the automated concentration and purification of target analytes from interfering substances, often present in these samples, via a renewable surface column. In one version of BEADS, automated immunomagnetic separation (IMS) is used to separate cells from their samples. Captured cells are transferred to a flow-through thermal cycler where PCR, using labeled primers, is performed. PCR products are then detected by hybridization to a DNA suspension array. In another version of BEADS, cell lysis is performed, and community RNA is purified and directly labeled. Multiplexed detection is accomplished by direct hybridization of the RNA to a planar microarray. The integrated IMS/PCR version of BEADS can successfully purify and amplify 10 E. coli O157:H7 cells from river water samples. Multiplexed PCR assays for the simultaneous detection of E. coli O157:H7, Salmonella, and Shigella on bead suspension arrays were demonstrated for the detection of as few as 100 cells of each organism. The RNA version of BEADS is also showing promising results. Automation yields highly purified RNA, suitable for multiplexed detection on microarrays, with microarray detection specificity equivalent to PCR. Both versions of the BEADS platform show great promise for automated pathogen detection from environmental samples. Highly multiplexed pathogen detection using PCR continues to be problematic, but may be required for trace detection in large volume samples. The RNA approach solves the issues of highly multiplexed PCR and provides ''live vs. dead'' capabilities. However

  7. Microarray-integrated optoelectrofluidic immunoassay system.

    PubMed

    Han, Dongsik; Park, Je-Kyun

    2016-05-01

    A microarray-based analytical platform has been utilized as a powerful tool in biological assay fields. However, an analyte depletion problem due to the slow mass transport based on molecular diffusion causes low reaction efficiency, resulting in a limitation for practical applications. This paper presents a novel method to improve the efficiency of microarray-based immunoassay via an optically induced electrokinetic phenomenon by integrating an optoelectrofluidic device with a conventional glass slide-based microarray format. A sample droplet was loaded between the microarray slide and the optoelectrofluidic device on which a photoconductive layer was deposited. Under the application of an AC voltage, optically induced AC electroosmotic flows caused by a microarray-patterned light actively enhanced the mass transport of target molecules at the multiple assay spots of the microarray simultaneously, which reduced tedious reaction time from more than 30 min to 10 min. Based on this enhancing effect, a heterogeneous immunoassay with a tiny volume of sample (5 μl) was successfully performed in the microarray-integrated optoelectrofluidic system using immunoglobulin G (IgG) and anti-IgG, resulting in improved efficiency compared to the static environment. Furthermore, the application of multiplex assays was also demonstrated by multiple protein detection. PMID:27190571

  8. DNA Microarrays in Herbal Drug Research

    PubMed Central

    Chavan, Preeti; Joshi, Kalpana; Patwardhan, Bhushan

    2006-01-01

    Natural products are gaining increased applications in drug discovery and development. Being chemically diverse, they are able to modulate several targets simultaneously in a complex system. Analysis of gene expression becomes necessary for a better understanding of molecular mechanisms. Conventional strategies for expression profiling are optimized for single gene analysis. DNA microarrays serve as a suitable high-throughput tool for the simultaneous analysis of multiple genes. The major practical applicability of DNA microarrays remains in DNA mutation and polymorphism analysis. This review highlights applications of DNA microarrays in pharmacodynamics, pharmacogenomics, toxicogenomics and quality control of herbal drugs and extracts. PMID:17173108

  9. Progress in the application of DNA microarrays.

    PubMed Central

    Lobenhofer, E K; Bushel, P R; Afshari, C A; Hamadeh, H K

    2001-01-01

    Microarray technology has been applied to a variety of different fields to address fundamental research questions. The use of microarrays, or DNA chips, to study the gene expression profiles of biologic samples began in 1995. Since that time, the fundamental concepts behind the chip, the technology required for making and using these chips, and the multitude of statistical tools for analyzing the data have been extensively reviewed. For this reason, the focus of this review will be not on the technology itself but on the application of microarrays as a research tool and the future challenges of the field. PMID:11673116

  10. Space science experimentation automation and support

    NASA Technical Reports Server (NTRS)

    Frainier, Richard J.; Groleau, Nicolas; Shapiro, Jeff C.

    1994-01-01

    This paper outlines recent work done at the NASA Ames Artificial Intelligence Research Laboratory on automation and support of science experiments on the US Space Shuttle in low earth orbit. Three approaches to increasing the science return of these experiments using emerging automation technologies are described: remote control (telescience), science advisors for astronaut operators, and fully autonomous experiments. The capabilities and limitations of these approaches are reviewed.

  11. AMIC@: All MIcroarray Clusterings @ once

    PubMed Central

    Geraci, Filippo; Pellegrini, Marco; Renda, M. Elena

    2008-01-01

    The AMIC@ Web Server offers a light-weight multi-method clustering engine for microarray gene-expression data. AMIC@ is a highly interactive tool that stresses user-friendliness and robustness by adopting AJAX technology, thus allowing an effective interleaved execution of different clustering algorithms and inspection of results. Among the salient features AMIC@ offers are: (i) automatic file format detection, (ii) suggestions on the number of clusters using a variant of the stability-based method of Tibshirani et al., (iii) intuitive visual inspection of the data via heatmaps and (iv) measurements of the clustering quality using cluster homogeneity. Large data sets can be processed efficiently by selecting algorithms (such as FPF-SB and k-Boost) specifically designed for this purpose. In the case of very large data sets, the user can opt for a batch-mode use of the system by means of the Clustering wizard that runs all algorithms at once and delivers the results via email. AMIC@ is freely available and open to all users with no login requirement at the following URL http://bioalgo.iit.cnr.it/amica. PMID:18477631
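    The cluster-homogeneity measurement listed under (iv) can be illustrated with a minimal sketch. The abstract does not give AMIC@'s exact formula, so the score below (mean distance of each expression profile to its cluster centroid, lower meaning tighter clusters) is an assumed stand-in.

    ```python
    # Assumed homogeneity score: mean distance of profiles to their cluster
    # centroid. Not AMIC@'s exact definition, which the abstract omits.
    from math import dist

    def centroid(points):
        """Component-wise mean of a list of equal-length vectors."""
        return [sum(coord) / len(points) for coord in zip(*points)]

    def homogeneity(clusters):
        """clusters: list of clusters, each a list of expression vectors.
        Lower values indicate more homogeneous (tighter) clusters."""
        total, n = 0.0, 0
        for pts in clusters:
            c = centroid(pts)
            total += sum(dist(p, c) for p in pts)
            n += len(pts)
        return total / n

    tight = [[[1.0, 1.0], [1.1, 0.9]], [[5.0, 5.0], [5.2, 4.8]]]
    loose = [[[1.0, 1.0], [4.0, 4.0]], [[5.0, 5.0], [9.0, 1.0]]]
    print(homogeneity(tight) < homogeneity(loose))  # tight clustering scores lower
    ```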

  12. AMIC@: All MIcroarray Clusterings @ once.

    PubMed

    Geraci, Filippo; Pellegrini, Marco; Renda, M Elena

    2008-07-01

    The AMIC@ Web Server offers a light-weight multi-method clustering engine for microarray gene-expression data. AMIC@ is a highly interactive tool that stresses user-friendliness and robustness by adopting AJAX technology, thus allowing an effective interleaved execution of different clustering algorithms and inspection of results. Among the salient features AMIC@ offers are: (i) automatic file format detection, (ii) suggestions on the number of clusters using a variant of the stability-based method of Tibshirani et al., (iii) intuitive visual inspection of the data via heatmaps and (iv) measurements of the clustering quality using cluster homogeneity. Large data sets can be processed efficiently by selecting algorithms (such as FPF-SB and k-Boost) specifically designed for this purpose. In the case of very large data sets, the user can opt for a batch-mode use of the system by means of the Clustering wizard that runs all algorithms at once and delivers the results via email. AMIC@ is freely available and open to all users with no login requirement at the following URL http://bioalgo.iit.cnr.it/amica. PMID:18477631

  13. Integrating Microarray Data and GRNs.

    PubMed

    Koumakis, L; Potamias, G; Tsiknakis, M; Zervakis, M; Moustakis, V

    2016-01-01

    With the completion of the Human Genome Project and the emergence of high-throughput technologies, a vast amount of molecular and biological data are being produced. Two of the most important and significant data sources are microarray gene-expression experiments and the respective databanks (e.g., Gene Expression Omnibus-GEO (http://www.ncbi.nlm.nih.gov/geo)), and molecular pathways and Gene Regulatory Networks (GRNs) stored and curated in public (e.g., Kyoto Encyclopedia of Genes and Genomes-KEGG (http://www.genome.jp/kegg/pathway.html), Reactome (http://www.reactome.org/ReactomeGWT/entrypoint.html)) as well as in commercial repositories (e.g., Ingenuity IPA (http://www.ingenuity.com/products/ipa)). The association of these two sources aims to give new insight into disease understanding and reveal new molecular targets in the treatment of specific phenotypes. Three major research lines and respective efforts that try to utilize and combine data from both of these sources can be identified, namely: (1) de novo reconstruction of GRNs, (2) identification of gene signatures, and (3) identification of differentially expressed GRN functional paths (i.e., sub-GRN paths that distinguish between different phenotypes). In this chapter, we give an overview of the existing methods that support the different types of gene-expression and GRN integration, with a focus on methodologies that aim to identify phenotype-discriminant GRNs or subnetworks, and we also present our methodology. PMID:26134183

  14. DNA microarrays in prostate cancer.

    PubMed

    Ho, Shuk-Mei; Lau, Kin-Mang

    2002-02-01

    DNA microarray technology provides a means to examine large numbers of molecular changes related to a biological process in a high throughput manner. This review discusses plausible utilities of this technology in prostate cancer research, including definition of prostate cancer predisposition, global profiling of gene expression patterns associated with cancer initiation and progression, identification of new diagnostic and prognostic markers, and discovery of novel patient classification schemes. The technology, at present, has only been explored in a limited fashion in prostate cancer research. Some hurdles to be overcome are the high cost of the technology, insufficient sample size and repeated experiments, and the inadequate use of bioinformatics. With the completion of the Human Genome Project and the advance of several highly complementary technologies, such as laser capture microdissection, unbiased RNA amplification, customized functional arrays (eg, single-nucleotide polymorphism chips), and amenable bioinformatics software, this technology will become widely used by investigators in the field. The large amount of novel, unbiased hypotheses and insights generated by this technology is expected to have a significant impact on the diagnosis, treatment, and prevention of prostate cancer. Finally, this review emphasizes existing, but currently underutilized, data-mining tools, such as multivariate statistical analyses, neural networking, and machine learning techniques, to stimulate wider usage. PMID:12084220

  15. Genome-scale cluster analysis of replicated microarrays using shrinkage correlation coefficient

    PubMed Central

    Yao, Jianchao; Chang, Chunqi; Salmi, Mari L; Hung, Yeung Sam; Loraine, Ann; Roux, Stanley J

    2008-01-01

    Background Currently, clustering with some form of correlation coefficient as the gene similarity metric has become a popular method for profiling genomic data. The Pearson correlation coefficient and the standard deviation (SD)-weighted correlation coefficient are the two most widely used correlations as the similarity metrics in clustering microarray data. However, these two correlations are not optimal for analyzing replicated microarray data generated by most laboratories. An effective correlation coefficient is needed to provide statistically sufficient analysis of replicated microarray data. Results In this study, we describe a novel correlation coefficient, the shrinkage correlation coefficient (SCC), that fully exploits the similarity between the replicated microarray experimental samples. The methodology considers both the number of replicates and the variance within each experimental group in clustering expression data, and provides a robust statistical estimation of the error of replicated microarray data. The value of SCC is revealed by its comparison with the two other correlation coefficients that are currently the most widely used (the Pearson correlation coefficient and the SD-weighted correlation coefficient), using statistical measures on both synthetic expression data and real gene expression data from Saccharomyces cerevisiae. Two leading clustering methods, hierarchical and k-means clustering, were applied for the comparison. The comparison indicated that using SCC achieves better clustering performance. Applying SCC-based hierarchical clustering to the replicated microarray data obtained from germinating spores of the fern Ceratopteris richardii, we discovered two clusters of genes with shared expression patterns during spore germination. Functional analysis suggested that some of the genetic mechanisms that control germination in such diverse plant lineages as mosses and angiosperms are also conserved among ferns. Conclusion This study shows that SCC is
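    A toy numeric illustration of the shrinkage idea (not the authors' SCC estimator, whose exact form the abstract does not give): compute the Pearson correlation of replicate-mean profiles, then shrink it toward zero by a weight that grows with within-replicate noise, so poorly replicated genes contribute weaker similarities. All formulas below are assumed for illustration.

    ```python
    # Assumed shrinkage scheme, not the published SCC formula: noisy replicate
    # groups inflate `lam`, which pulls the correlation toward zero.
    from statistics import mean, pvariance

    def pearson(x, y):
        mx, my = mean(x), mean(y)
        num = sum((a - mx) * (b - my) for a, b in zip(x, y))
        den = (sum((a - mx) ** 2 for a in x)
               * sum((b - my) ** 2 for b in y)) ** 0.5
        return num / den

    def shrunk_correlation(reps_x, reps_y):
        """reps_x[i] holds the replicate measurements of gene x in condition i."""
        mx = [mean(r) for r in reps_x]
        my = [mean(r) for r in reps_y]
        noise = mean([pvariance(r) for r in reps_x + reps_y])  # replicate scatter
        spread = pvariance(mx) + pvariance(my)                 # biological signal
        lam = noise / (noise + spread)                         # shrinkage weight
        return (1 - lam) * pearson(mx, my)

    clean = [[1.0, 1.0], [2.0, 2.0], [3.0, 3.0]]
    noisy = [[0.0, 2.0], [1.0, 3.0], [2.0, 4.0]]   # same means, poor replication
    print(shrunk_correlation(clean, clean))         # no noise: stays at 1.0
    print(shrunk_correlation(clean, noisy) < 1.0)   # noise shrinks the similarity
    ```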

  16. Enhancing Interdisciplinary Mathematics and Biology Education: A Microarray Data Analysis Course Bridging These Disciplines

    PubMed Central

    Evans, Irene M.

    2010-01-01

    BIO2010 put forth the goal of improving the mathematical educational background of biology students. The analysis and interpretation of microarray high-dimensional data can be very challenging and is best done by a statistician and a biologist working and teaching in a collaborative manner. We set up such a collaboration and designed a course on microarray data analysis. We started using Genome Consortium for Active Teaching (GCAT) materials and Microarray Genome and Clustering Tool software and added R statistical software along with Bioconductor packages. In response to student feedback, one microarray data set was fully analyzed in class, starting from preprocessing to gene discovery to pathway analysis using the latter software. A class project was to conduct a similar analysis where students analyzed their own data or data from a published journal paper. This exercise showed the impact that filtering, preprocessing, and different normalization methods had on gene inclusion in the final data set. We conclude that this course achieved its goals to equip students with skills to analyze data from a microarray experiment. We offer our insight about collaborative teaching as well as how other faculty might design and implement a similar interdisciplinary course. PMID:20810954

  17. Quality Visualization of Microarray Datasets Using Circos

    PubMed Central

    Koch, Martin; Wiese, Michael

    2012-01-01

    Quality control and normalization are considered the most important steps in the analysis of microarray data. At present there are various methods available for quality assessment of microarray datasets. However, there seems to be no standard visualization routine that also depicts individual microarray quality. Here we present a convenient method for visualizing the results of standard quality control tests using Circos plots. In these plots various quality measurements are drawn in a circular fashion, thus allowing for visualization of the quality and all outliers of each distinct array within a microarray dataset. The proposed method is intended for use with the Affymetrix Human Genome platform (i.e., GPL96, GPL570 and GPL571). Circos quality measurement plots are a convenient way to obtain an initial quality estimate of Affymetrix datasets that are stored in publicly available databases.

  18. Microarray: an approach for current drug targets.

    PubMed

    Gomase, Virendra S; Tagore, Somnath; Kale, Karbhari V

    2008-03-01

    Microarrays are a powerful tool with multiple applications in both the clinical and the cellular and molecular biology arenas, including early assessment of the probable biological importance of drug targets, pharmacogenomics, toxicogenomics and single nucleotide polymorphisms (SNPs). A list of new drug candidates along with proposed targets for intervention is described. Recent advances in microarray analysis of organisms and the availability of genomic sequences provide a wide range of novel targets for drug design. This review covers the different microarray technologies; methods for comparative gene expression studies; applications of microarrays in medicine and pharmacogenomics; and current drug targets in research relevant to common diseases, as they relate to clinical and future perspectives. PMID:18336225

  19. Contributions to Statistical Problems Related to Microarray Data

    ERIC Educational Resources Information Center

    Hong, Feng

    2009-01-01

    Microarray is a high throughput technology to measure gene expression. Analysis of microarray data brings many interesting and challenging problems. This thesis consists of three studies related to microarray data. First, we propose a Bayesian model for microarray data and use Bayes factors to identify differentially expressed genes. Second, we…

  20. Automated lithocell

    NASA Astrophysics Data System (ADS)

    Englisch, Andreas; Deuter, Armin

    1990-06-01

    Integration and automation have gained more and more ground in modern IC-manufacturing. It is difficult to make a direct calculation of the profit these investments yield. On the other hand, the demands on man, machine and technology have increased enormously of late; it is not difficult to see that only by means of integration and automation can these demands be coped with. Some salient points: the complexity and costs incurred by the equipment and processes have become significantly higher; owing to the reduction of all dimensions, the tolerances within which the various process steps have to be carried out have become smaller and smaller, and adherence to these tolerances more and more difficult; and the cycle time has become more and more important, both for the development and control of new processes and, to a great extent, for a rapid and reliable supply to the customer. In order that the products be competitive under these conditions, costs of all sorts have to be reduced and the yield has to be maximized. Therefore, computer-aided control of the equipment and the process, combined with automatic data collection and real-time SPC (statistical process control), has become absolutely necessary for successful IC-manufacturing. Human errors must be eliminated from the execution of the various process steps by automation. The work time set free in this way makes it possible for human creativity to be employed on a larger scale in stabilizing the processes. Besides, computer-aided equipment control can ensure the optimal utilization of the equipment round the clock.

  1. The Impact of Photobleaching on Microarray Analysis

    PubMed Central

    von der Haar, Marcel; Preuß, John-Alexander; von der Haar, Kathrin; Lindner, Patrick; Scheper, Thomas; Stahl, Frank

    2015-01-01

    DNA-Microarrays have become a potent technology for high-throughput analysis of genetic regulation. However, the wide dynamic range of signal intensities of fluorophore-based microarrays exceeds the dynamic range of a single array scan by far, thus limiting the key benefit of microarray technology: parallelization. The implementation of multi-scan techniques represents a promising approach to overcome these limitations. These techniques are, in turn, limited by the fluorophores’ susceptibility to photobleaching when exposed to the scanner’s laser light. In this paper the photobleaching characteristics of cyanine-3 and cyanine-5 as part of solid state DNA microarrays are studied. The effects of initial fluorophore intensity as well as laser scanner dependent variables such as the photomultiplier tube’s voltage on bleaching and imaging are investigated. The resulting data is used to develop a model capable of simulating the expected degree of signal intensity reduction caused by photobleaching for each fluorophore individually, allowing for the removal of photobleaching-induced, systematic bias in multi-scan procedures. Single-scan applications also benefit as they rely on pre-scans to determine the optimal scanner settings. These findings constitute a step towards standardization of microarray experiments and analysis and may help to increase the lab-to-lab comparability of microarray experiment results. PMID:26378589
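    The kind of correction such a model enables can be sketched with an assumed first-order exponential bleaching law. The per-fluorophore decay constants below are invented for illustration, not values from the paper.

    ```python
    # Assumed first-order photobleaching model: signal decays exponentially
    # with the number of scans, at a rate that differs per fluorophore.
    from math import exp

    DECAY = {"Cy3": 0.010, "Cy5": 0.035}  # assumed per-scan bleaching constants

    def observed(i0, dye, n_scans):
        """Signal expected after n_scans exposures, given initial intensity i0."""
        return i0 * exp(-DECAY[dye] * n_scans)

    def debleach(signal, dye, n_scans):
        """Invert the decay model to remove photobleaching bias in multi-scan
        procedures, recovering an estimate of the pre-bleaching intensity."""
        return signal * exp(DECAY[dye] * n_scans)

    i0 = 10000.0
    faded = observed(i0, "Cy5", 5)          # signal after five scans
    print(round(debleach(faded, "Cy5", 5), 1))  # recovers the initial intensity
    ```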

  2. Unsupervised assessment of microarray data quality using a Gaussian mixture model

    PubMed Central

    Howard, Brian E; Sick, Beate; Heber, Steffen

    2009-01-01

    Background Quality assessment of microarray data is an important and often challenging aspect of gene expression analysis. This task frequently involves the examination of a variety of summary statistics and diagnostic plots. The interpretation of these diagnostics is often subjective, and generally requires careful expert scrutiny. Results We show how an unsupervised classification technique based on the Expectation-Maximization (EM) algorithm and the naïve Bayes model can be used to automate microarray quality assessment. The method is flexible and can be easily adapted to accommodate alternate quality statistics and platforms. We evaluate our approach using Affymetrix 3' gene expression and exon arrays and compare the performance of this method to a similar supervised approach. Conclusion This research illustrates the efficacy of an unsupervised classification approach for the purpose of automated microarray data quality assessment. Since our approach requires only unannotated training data, it is easy to customize and to keep up-to-date as technology evolves. In contrast to other "black box" classification systems, this method also allows for intuitive explanations. PMID:19545436
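    The core of such an unsupervised classifier can be sketched as a two-component Gaussian mixture fitted by EM on a single quality statistic. The paper's method models several statistics jointly under a naive Bayes structure, so this one-dimensional version is a simplified, assumed illustration.

    ```python
    # Minimal 1-D, two-component Gaussian mixture fitted by EM (toy sketch;
    # the published method combines several quality statistics).
    from math import exp, pi, sqrt

    def gauss(x, mu, var):
        return exp(-(x - mu) ** 2 / (2 * var)) / sqrt(2 * pi * var)

    def em_two_gaussians(xs, iters=50):
        mu1, mu2 = min(xs), max(xs)                 # crude initialisation
        var1 = var2 = (max(xs) - min(xs)) ** 2 / 4
        w = 0.5
        for _ in range(iters):
            # E-step: responsibility of component 1 for each point
            r = [w * gauss(x, mu1, var1) /
                 (w * gauss(x, mu1, var1) + (1 - w) * gauss(x, mu2, var2))
                 for x in xs]
            # M-step: re-estimate weight, means and variances
            n1 = sum(r); n2 = len(xs) - n1
            mu1 = sum(ri * x for ri, x in zip(r, xs)) / n1
            mu2 = sum((1 - ri) * x for ri, x in zip(r, xs)) / n2
            var1 = max(sum(ri * (x - mu1) ** 2 for ri, x in zip(r, xs)) / n1, 1e-6)
            var2 = max(sum((1 - ri) * (x - mu2) ** 2 for ri, x in zip(r, xs)) / n2, 1e-6)
            w = n1 / len(xs)
        return (mu1, var1), (mu2, var2), w

    # One assumed quality statistic per array, e.g. fraction of weak spots:
    stats = [0.02, 0.03, 0.025, 0.04, 0.35, 0.41]
    good, bad, w = em_two_gaussians(stats)
    print(good[0] < bad[0])  # the "good" component sits at lower statistic values
    ```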

  3. Maximizing Your Investment in Building Automation System Technology.

    ERIC Educational Resources Information Center

    Darnell, Charles

    2001-01-01

    Discusses how organizational issues and system standardization can be important factors that determine an institution's ability to fully exploit contemporary building automation systems (BAS). Further presented is management strategy for maximizing BAS investments. (GR)

  4. Automated office blood pressure.

    PubMed

    Myers, Martin G; Godwin, Marshall

    2012-05-01

    Manual blood pressure (BP) is gradually disappearing from clinical practice with the mercury sphygmomanometer now considered to be an environmental hazard. Manual BP is also subject to measurement error on the part of the physician/nurse and patient-related anxiety which can result in poor quality BP measurements and office-induced (white coat) hypertension. Automated office (AO) BP with devices such as the BpTRU (BpTRU Medical Devices, Coquitlam, BC) has already replaced conventional manual BP in many primary care practices in Canada and has also attracted interest in other countries where research studies using AOBP have been undertaken. The basic principles of AOBP include multiple readings taken with a fully automated recorder with the patient resting alone in a quiet room. When these principles are followed, office-induced hypertension is eliminated and AOBP exhibits a much stronger correlation with the awake ambulatory BP as compared with routine manual BP measurements. Unlike routine manual BP, AOBP correlates as well with left ventricular mass as does the awake ambulatory BP. AOBP also simplifies the definition of hypertension in that the cut point for a normal AOBP (< 135/85 mm Hg) is the same as for the awake ambulatory BP and home BP. This article summarizes the currently available evidence supporting the use of AOBP in routine clinical practice and proposes an algorithm in which AOBP replaces manual BP for the diagnosis and management of hypertension. PMID:22265230
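    The AOBP scheme described above (multiple automated readings, averaged and compared against the 135/85 mm Hg cut point given for a normal automated office BP) can be sketched as follows. Device protocols vary between recorders, and the readings here are invented for illustration.

    ```python
    # Sketch of the AOBP averaging scheme; the 135/85 mm Hg cut point comes
    # from the article, the readings and function names are illustrative.

    def aobp_mean(readings):
        """readings: list of (systolic, diastolic) pairs from one sitting."""
        sys = sum(s for s, _ in readings) / len(readings)
        dia = sum(d for _, d in readings) / len(readings)
        return sys, dia

    def is_elevated(readings, cut=(135, 85)):
        """True when the mean AOBP reaches the hypertension cut point."""
        s, d = aobp_mean(readings)
        return s >= cut[0] or d >= cut[1]

    print(is_elevated([(128, 78), (124, 80), (122, 76)]))  # below 135/85: False
    print(is_elevated([(142, 88), (138, 86), (140, 90)]))  # above 135/85: True
    ```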

  5. Agile automated vision

    NASA Astrophysics Data System (ADS)

    Fandrich, Juergen; Schmitt, Lorenz A.

    1994-11-01

    The microelectronic industry is a protagonist in driving automated vision to new paradigms. Today semiconductor manufacturers use vision systems quite frequently in their fabs in the front-end process. In fact, the process depends on reliable image processing systems. In the back-end process, where ICs are assembled and packaged, vision systems are today only partly used. But in the next years automated vision will become compulsory for the back-end process as well. Vision will be fully integrated into every IC package production machine to increase yields and reduce costs. Modern high-speed material processing requires dedicated and efficient concepts in image processing. But the integration of various equipment in a production plant demands unified handling of data flow and interfaces. Only agile vision systems can meet these conflicting demands: fast, reliable, adaptable, scalable and comprehensive. A powerful hardware platform is an indispensable requirement for the use of advanced and reliable, but unfortunately computing-intensive, image processing algorithms. The massively parallel SIMD hardware product LANTERN/VME supplies a powerful platform for existing and new functionality. LANTERN/VME is used with a new optical sensor for IC package lead inspection. This is done in 3D, including horizontal and coplanarity inspection. The appropriate software is designed for lead inspection, alignment and control tasks in IC package production and handling equipment, like Trim&Form, Tape&Reel and Pick&Place machines.

  6. Evaluation of Solid Supports for Slide- and Well-Based Recombinant Antibody Microarrays

    PubMed Central

    Gerdtsson, Anna S.; Dexlin-Mellby, Linda; Delfani, Payam; Berglund, Erica; Borrebaeck, Carl A. K.; Wingren, Christer

    2016-01-01

    Antibody microarrays have emerged as an important tool within proteomics, enabling multiplexed protein expression profiling in both health and disease. The design and performance of antibody microarrays and how they are processed are dependent on several factors, of which the interplay between the antibodies and the solid surfaces plays a central role. In this study, we have taken on the first comprehensive view and evaluated the overall impact of solid surfaces on the recombinant antibody microarray design. The results clearly demonstrated the importance of the surface-antibody interaction and showed the effect of the solid supports on the printing process, the array format of planar arrays (slide- and well-based), the assay performance (spot features, reproducibility, specificity and sensitivity) and assay processing (degree of automation). In the end, two high-end recombinant antibody microarray technology platforms were designed, based on slide-based (black polymer) and well-based (clear polymer) arrays, paving the way for future large-scale protein expression profiling efforts. PMID:27600082

  7. Evaluation of Solid Supports for Slide- and Well-Based Recombinant Antibody Microarrays.

    PubMed

    Gerdtsson, Anna S; Dexlin-Mellby, Linda; Delfani, Payam; Berglund, Erica; Borrebaeck, Carl A K; Wingren, Christer

    2016-01-01

    Antibody microarrays have emerged as an important tool within proteomics, enabling multiplexed protein expression profiling in both health and disease. The design and performance of antibody microarrays and how they are processed are dependent on several factors, of which the interplay between the antibodies and the solid surfaces plays a central role. In this study, we have taken on the first comprehensive view and evaluated the overall impact of solid surfaces on the recombinant antibody microarray design. The results clearly demonstrated the importance of the surface-antibody interaction and showed the effect of the solid supports on the printing process, the array format of planar arrays (slide- and well-based), the assay performance (spot features, reproducibility, specificity and sensitivity) and assay processing (degree of automation). In the end, two high-end recombinant antibody microarray technology platforms were designed, based on slide-based (black polymer) and well-based (clear polymer) arrays, paving the way for future large-scale protein expression profiling efforts. PMID:27600082

  8. Chromosomal Microarray versus Karyotyping for Prenatal Diagnosis

    PubMed Central

    Wapner, Ronald J.; Martin, Christa Lese; Levy, Brynn; Ballif, Blake C.; Eng, Christine M.; Zachary, Julia M.; Savage, Melissa; Platt, Lawrence D.; Saltzman, Daniel; Grobman, William A.; Klugman, Susan; Scholl, Thomas; Simpson, Joe Leigh; McCall, Kimberly; Aggarwal, Vimla S.; Bunke, Brian; Nahum, Odelia; Patel, Ankita; Lamb, Allen N.; Thom, Elizabeth A.; Beaudet, Arthur L.; Ledbetter, David H.; Shaffer, Lisa G.; Jackson, Laird

    2013-01-01

Background Chromosomal microarray analysis has emerged as a primary diagnostic tool for the evaluation of developmental delay and structural malformations in children. We aimed to evaluate the accuracy, efficacy, and incremental yield of chromosomal microarray analysis as compared with karyotyping for routine prenatal diagnosis. Methods Samples from women undergoing prenatal diagnosis at 29 centers were sent to a central karyotyping laboratory. Each sample was split in two; standard karyotyping was performed on one portion and the other was sent to one of four laboratories for chromosomal microarray. Results We enrolled a total of 4406 women. Indications for prenatal diagnosis were advanced maternal age (46.6%), abnormal result on Down’s syndrome screening (18.8%), structural anomalies on ultrasonography (25.2%), and other indications (9.4%). In 4340 (98.8%) of the fetal samples, microarray analysis was successful; 87.9% of samples could be used without tissue culture. Microarray analysis of the 4282 nonmosaic samples identified all the aneuploidies and unbalanced rearrangements identified on karyotyping but did not identify balanced translocations and fetal triploidy. In samples with a normal karyotype, microarray analysis revealed clinically relevant deletions or duplications in 6.0% of samples with a structural anomaly and in 1.7% of those whose indications were advanced maternal age or positive screening results. Conclusions In the context of prenatal diagnostic testing, chromosomal microarray analysis identified additional, clinically significant cytogenetic information as compared with karyotyping and was equally efficacious in identifying aneuploidies and unbalanced rearrangements but did not identify balanced translocations and triploidies. (Funded by the Eunice Kennedy Shriver National Institute of Child Health and Human Development and others; ClinicalTrials.gov number, NCT01279733.) PMID:23215555

  9. Integrating Test-Form Formatting into Automated Test Assembly

    ERIC Educational Resources Information Center

    Diao, Qi; van der Linden, Wim J.

    2013-01-01

    Automated test assembly uses the methodology of mixed integer programming to select an optimal set of items from an item bank. Automated test-form generation uses the same methodology to optimally order the items and format the test form. From an optimization point of view, production of fully formatted test forms directly from the item pool using…

  10. Quantifying the Antibody Binding on Protein Microarrays using Microarray Nonlinear Calibration

    PubMed Central

    Yu, Xiaobo; Wallstrom, Garrick; Magee, Dewey Mitchell; Qiu, Ji; Mendoza, D. Eliseo A.; Wang, Jie; Bian, Xiaofang; Graves, Morgan; LaBaer, Joshua

    2015-01-01

To address the issue of quantification for antibody assays with protein microarrays, we first developed a Microarray Nonlinear Calibration (MiNC) method that applies to the quantification of antibody binding to the surface of microarray spots. We found that MiNC significantly increased the linear dynamic range and reduced assay variation. A serological analysis of guinea pig Mycobacterium tuberculosis models showed that a larger number of putative antigen targets were identified with MiNC, which is consistent with the improved assay performance of protein microarrays. We expect that our cumulative results will provide scientists with a new appreciation of antibody assays with protein microarrays. Our MiNC method has the potential to be employed in biomedical research with multiplex antibody assays that need quantitation, including the discovery of antibody biomarkers, clinical diagnostics with multi-antibody signatures, and construction of immune mathematical models. PMID:23662896
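The abstract does not give MiNC's functional form, but nonlinear calibration of immunoassay signals is commonly modeled with a four-parameter logistic (4PL) curve. A minimal sketch, with hypothetical curve parameters (not taken from the paper), of converting observed spot signals back to concentrations by inverting a 4PL:

```python
import numpy as np

def four_pl(x, a, b, c, d):
    """Four-parameter logistic: signal as a function of concentration x."""
    return d + (a - d) / (1.0 + (x / c) ** b)

def inverse_four_pl(y, a, b, c, d):
    """Invert the 4PL to recover concentration from an observed signal."""
    return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

# Hypothetical parameters: a = response at zero dose, d = response at
# saturation, c = inflection point (EC50), b = slope factor.
a, b, c, d = 100.0, 1.2, 50.0, 5000.0

conc = np.array([1.0, 10.0, 100.0, 1000.0])
signal = four_pl(conc, a, b, c, d)
recovered = inverse_four_pl(signal, a, b, c, d)
assert np.allclose(recovered, conc)  # round-trip recovers the concentrations
```

In practice the parameters would be fitted to standard-curve spots by nonlinear least squares before the inverse is applied to unknowns.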

  11. Microbial Diagnostic Microarrays for the Detection and Typing of Food- and Water-Borne (Bacterial) Pathogens

    PubMed Central

    Kostić, Tanja; Sessitsch, Angela

    2011-01-01

    Reliable and sensitive pathogen detection in clinical and environmental (including food and water) samples is of greatest importance for public health. Standard microbiological methods have several limitations and improved alternatives are needed. Most important requirements for reliable analysis include: (i) specificity; (ii) sensitivity; (iii) multiplexing potential; (iv) robustness; (v) speed; (vi) automation potential; and (vii) low cost. Microarray technology can, through its very nature, fulfill many of these requirements directly and the remaining challenges have been tackled. In this review, we attempt to compare performance characteristics of the microbial diagnostic microarrays developed for the detection and typing of food and water pathogens, and discuss limitations, points still to be addressed and issues specific for the analysis of food, water and environmental samples.

  12. ArrayWiki: an enabling technology for sharing public microarray data repositories and meta-analyses

    PubMed Central

    Stokes, Todd H; Torrance, JT; Li, Henry; Wang, May D

    2008-01-01

Background A survey of microarray databases reveals that most of the repository contents and data models are heterogeneous (i.e., data obtained from different chip manufacturers), and that the repositories provide only basic biological keywords linking to PubMed. As a result, it is difficult to find datasets using research-context or analysis-parameter information beyond a few keywords. For example, to reduce the "curse-of-dimension" problem in microarray analysis, the number of samples is often increased by merging array data from different datasets. Knowing chip data parameters such as pre-processing steps (e.g., normalization, artefact removal, etc.), and knowing any previous biological validation of the dataset, is essential due to the heterogeneity of the data. However, most of the microarray repositories do not have meta-data information in the first place, and do not have a mechanism to add or insert this information. Thus, there is a critical need to create "intelligent" microarray repositories that (1) enable update of meta-data with the raw array data, and (2) provide standardized archiving protocols to minimize bias from the raw data sources. Results To address the problems discussed, we have developed a community-maintained system called ArrayWiki that unites disparate meta-data of microarray meta-experiments from multiple primary sources with four key features. First, ArrayWiki provides a user-friendly knowledge management interface in addition to a programmable interface using standards developed by Wikipedia. Second, ArrayWiki includes automated quality control processes (caCORRECT) and novel visualization methods (BioPNG, Gel Plots), which provide extra information about data quality unavailable in other microarray repositories. Third, it provides a user-curation capability through the familiar Wiki interface. 
Fourth, ArrayWiki provides users with simple text-based searches across all experiment meta-data, and exposes data to search engine crawlers

  13. PERFORMANCE CHARACTERISTICS OF 65-MER OLIGONUCLEOTIDE MICROARRAYS

    PubMed Central

    Lee, Myoyong; Xiang, Charlie C.; Trent, Jeffrey M.; Bittner, Michael L.

    2009-01-01

Microarray fabrication using pre-synthesized long oligonucleotides is becoming increasingly important, but no study of large-scale array production has yet been published. We addressed the issue of fabricating oligonucleotide microarrays by spotting commercial, pre-synthesized 65-mers with 5′ amines representing 7500 murine genes. Amine-modified oligonucleotides were immobilized on glass slides having aldehyde groups via transient Schiff base formation followed by reduction to produce a covalent conjugate. When RNA derived from the same source was used for Cy3 and Cy5 labeling and hybridized to the same array, signal intensities spanning three orders of magnitude were observed, and the coefficient of variation between the two channels for all spots was 8–10%. To ascertain the reproducibility of ratio determination of these arrays, two triplicate hybridizations (with fluorochrome reversal) comparing RNAs from a fibroblast (NIH3T3) and a breast cancer (JC) cell line were carried out. The 95% confidence interval for all spots in the six hybridizations was 0.60–1.66. This level of reproducibility allows use of the full range of pattern finding and discriminant analysis typically applied to cDNA microarrays. Further comparative testing was carried out with oligonucleotide microarrays, cDNA microarrays and RT-PCR assays to examine the comparability of results across these different methodologies. PMID:17617369
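The 8–10% between-channel coefficient of variation reported above is a per-spot statistic over the Cy3/Cy5 pair. A sketch of how such a CV could be computed, using simulated self-self hybridization intensities rather than the paper's data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated self-self hybridization: the same RNA labeled with Cy3 and Cy5,
# so the two channel intensities per spot should agree up to labeling noise.
true_signal = rng.uniform(1e2, 1e5, size=7500)   # spans ~3 orders of magnitude
cy3 = true_signal * rng.normal(1.0, 0.06, size=true_signal.size)
cy5 = true_signal * rng.normal(1.0, 0.06, size=true_signal.size)

# Per-spot CV across the two channels: std / mean of the channel pair.
pairs = np.stack([cy3, cy5])
cv = pairs.std(axis=0, ddof=1) / pairs.mean(axis=0)

print(f"median per-spot CV: {float(np.median(cv)):.3f}")
```

With ~6% multiplicative noise per channel (an assumed value), the median per-spot CV lands in the few-percent range, the same order as the paper's figure.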

  14. Advancing microarray assembly with acoustic dispensing technology.

    PubMed

    Wong, E Y; Diamond, S L

    2009-01-01

In the assembly of microarrays and microarray-based chemical assays and enzymatic bioassays, most approaches use pins for contact spotting. Acoustic dispensing is a technology capable of nanoliter transfers by using acoustic energy to eject liquid sample from an open source well. Although typically used for well plate transfers, when applied to microarraying it avoids the drawbacks of undesired physical contact with the sample; difficulty in assembling multicomponent reactions on a chip by readdressing; a rigid mode of printing that lacks patterning capabilities; and time-consuming wash steps. We demonstrated the utility of acoustic dispensing by delivering human cathepsin L in a drop-on-drop fashion into individual 50-nanoliter, prespotted reaction volumes to activate enzyme reactions at targeted positions on a microarray. We generated variable-sized spots ranging from 200 to 750 µm (and higher) and handled the transfer of fluorescent bead suspensions with source well concentrations increasing from 0.1 to 10 × 10^8 beads/mL in a linear fashion. There are no tips that can clog, and liquid dispensing CVs are generally below 5%. This platform expands the toolbox for generating analytical arrays and meets needs associated with spatially addressed assembly of multicomponent microarrays on the nanoliter scale. PMID:19035650

  15. A Synthetic Kinome Microarray Data Generator

    PubMed Central

    Maleki, Farhad; Kusalik, Anthony

    2015-01-01

    Cellular pathways involve the phosphorylation and dephosphorylation of proteins. Peptide microarrays called kinome arrays facilitate the measurement of the phosphorylation activity of hundreds of proteins in a single experiment. Analyzing the data from kinome microarrays is a multi-step process. Typically, various techniques are possible for a particular step, and it is necessary to compare and evaluate them. Such evaluations require data for which correct analysis results are known. Unfortunately, such kinome data is not readily available in the community. Further, there are no established techniques for creating artificial kinome datasets with known results and with the same characteristics as real kinome datasets. In this paper, a methodology for generating synthetic kinome array data is proposed. The methodology relies on actual intensity measurements from kinome microarray experiments and preserves their subtle characteristics. The utility of the methodology is demonstrated by evaluating methods for eliminating heterogeneous variance in kinome microarray data. Phosphorylation intensities from kinome microarrays often exhibit such heterogeneous variance and its presence can negatively impact downstream statistical techniques that rely on homogeneity of variance. It is shown that using the output from the proposed synthetic data generator, it is possible to critically compare two variance stabilization methods.
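The heterogeneous-variance problem the generator is designed to probe can be illustrated with simulated intensities (not real kinome data): under multiplicative noise, the raw standard deviation grows with the mean, while a log2 transform, one simple variance stabilization method, makes it roughly constant across peptides:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated kinome-style intensities: 9 replicate spots per peptide, with
# multiplicative noise so the SD grows with the mean (heterogeneous variance).
means = rng.uniform(500, 50000, size=300)
replicates = means[:, None] * rng.normal(1.0, 0.2, size=(300, 9))

raw_sd = replicates.std(axis=1, ddof=1)
log_sd = np.log2(np.clip(replicates, 1, None)).std(axis=1, ddof=1)

# On the raw scale the per-peptide SDs vary with intensity; on the log2
# scale they are roughly constant, so their relative spread is much smaller.
print("raw-scale SD spread:", float(raw_sd.std() / raw_sd.mean()))
print("log2-scale SD spread:", float(log_sd.std() / log_sd.mean()))
```

A synthetic generator with a known noise model like this one allows variance stabilization methods to be compared against a ground truth.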

  16. Segment and fit thresholding: a new method for image analysis applied to microarray and immunofluorescence data.

    PubMed

    Ensink, Elliot; Sinha, Jessica; Sinha, Arkadeep; Tang, Huiyuan; Calderone, Heather M; Hostetter, Galen; Winter, Jordan; Cherba, David; Brand, Randall E; Allen, Peter J; Sempere, Lorenzo F; Haab, Brian B

    2015-10-01

    Experiments involving the high-throughput quantification of image data require algorithms for automation. A challenge in the development of such algorithms is to properly interpret signals over a broad range of image characteristics, without the need for manual adjustment of parameters. Here we present a new approach for locating signals in image data, called Segment and Fit Thresholding (SFT). The method assesses statistical characteristics of small segments of the image and determines the best-fit trends between the statistics. Based on the relationships, SFT identifies segments belonging to background regions; analyzes the background to determine optimal thresholds; and analyzes all segments to identify signal pixels. We optimized the initial settings for locating background and signal in antibody microarray and immunofluorescence data and found that SFT performed well over multiple, diverse image characteristics without readjustment of settings. When used for the automated analysis of multicolor, tissue-microarray images, SFT correctly found the overlap of markers with known subcellular localization, and it performed better than a fixed threshold and Otsu's method for selected images. SFT promises to advance the goal of full automation in image analysis. PMID:26339978
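The published SFT method fits trends between segment statistics; the code below is a deliberately simplified stand-in that captures only the general idea of scoring small segments by local statistics, treating the dimmest segments as background, and deriving a threshold from background statistics (all names, parameters, and data are hypothetical):

```python
import numpy as np

def segment_threshold(image, tile=16, k=3.0):
    """Simplified segment-based thresholding, loosely inspired by SFT:
    tile the image, take the dimmest half of the tiles as background,
    and set the threshold at background mean + k * background std.
    (A sketch only; the published SFT fits trends between statistics.)"""
    h, w = image.shape
    crop = image[: h - h % tile, : w - w % tile]
    tiles = crop.reshape(h // tile, tile, w // tile, tile).swapaxes(1, 2)
    means = tiles.mean(axis=(2, 3))

    # Tiles at or below the median tile mean are assumed to be background.
    bg = tiles[means <= np.median(means)]
    threshold = bg.mean() + k * bg.std()
    return image > threshold

# Hypothetical image: flat noisy background plus one bright square "spot".
rng = np.random.default_rng(2)
img = rng.normal(100.0, 5.0, size=(64, 64))
img[20:30, 20:30] += 80.0
mask = segment_threshold(img)
print(int(mask.sum()))  # close to the 100 pixels of the spot
```

The appeal of this family of methods is that the threshold adapts to each image's own background statistics instead of requiring a fixed, manually tuned cutoff.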

  17. Segment and Fit Thresholding: A New Method for Image Analysis Applied to Microarray and Immunofluorescence Data

    PubMed Central

    Ensink, Elliot; Sinha, Jessica; Sinha, Arkadeep; Tang, Huiyuan; Calderone, Heather M.; Hostetter, Galen; Winter, Jordan; Cherba, David; Brand, Randall E.; Allen, Peter J.; Sempere, Lorenzo F.; Haab, Brian B.

    2016-01-01

    Certain experiments involve the high-throughput quantification of image data, thus requiring algorithms for automation. A challenge in the development of such algorithms is to properly interpret signals over a broad range of image characteristics, without the need for manual adjustment of parameters. Here we present a new approach for locating signals in image data, called Segment and Fit Thresholding (SFT). The method assesses statistical characteristics of small segments of the image and determines the best-fit trends between the statistics. Based on the relationships, SFT identifies segments belonging to background regions; analyzes the background to determine optimal thresholds; and analyzes all segments to identify signal pixels. We optimized the initial settings for locating background and signal in antibody microarray and immunofluorescence data and found that SFT performed well over multiple, diverse image characteristics without readjustment of settings. When used for the automated analysis of multi-color, tissue-microarray images, SFT correctly found the overlap of markers with known subcellular localization, and it performed better than a fixed threshold and Otsu’s method for selected images. SFT promises to advance the goal of full automation in image analysis. PMID:26339978

  18. Fully automatic telemetry data processor

    NASA Technical Reports Server (NTRS)

    Cox, F. B.; Keipert, F. A.; Lee, R. C.

    1968-01-01

    Satellite Telemetry Automatic Reduction System /STARS 2/, a fully automatic computer-controlled telemetry data processor, maximizes data recovery, reduces turnaround time, increases flexibility, and improves operational efficiency. The system incorporates a CDC 3200 computer as its central element.

  19. Design automation for integrated optics

    NASA Astrophysics Data System (ADS)

    Condrat, Christopher

Recent breakthroughs in silicon photonics technology are enabling the integration of optical devices into silicon-based semiconductor processes. Photonics technology enables high-speed, high-bandwidth, and high-fidelity communications on the chip scale, an important development in an increasingly communications-oriented semiconductor world. Significant developments in silicon photonic manufacturing and integration are also enabling investigations into applications beyond that of traditional telecom: sensing, filtering, signal processing, quantum technology, and even optical computing. In effect, we are now seeing a convergence of communications and computation, where the traditional roles of optics and microelectronics are becoming blurred. As the applications for opto-electronic integrated circuits (OEICs) are developed, and manufacturing capabilities expand, design support is necessary to fully exploit the potential of this optics technology. Such design support for moving beyond custom design to automated synthesis and optimization is not well developed. Scalability requires abstractions, which in turn enable and require the use of optimization algorithms and design methodology flows. Design automation represents an opportunity to take OEIC design to a larger scale, facilitating design-space exploration and laying the foundation for current and future optical applications, thus fully realizing the potential of this technology. This dissertation proposes design automation for integrated optic system design. Using a building-block model for optical devices, we provide an EDA-inspired design flow and methodologies for optical design automation. Underlying these flows and methodologies are new supporting techniques in behavioral and physical synthesis, as well as device-resynthesis techniques for thermal-aware system integration. We also provide modeling for optical devices and determine optimization and constraint parameters that guide the automation

  20. Introduction to the statistical analysis of two-color microarray data.

    PubMed

    Bremer, Martina; Himelblau, Edward; Madlung, Andreas

    2010-01-01

Microarray experiments have become routine in the past few years in many fields of biology. Analysis of array hybridizations is often performed with the help of commercial software programs, which produce gene lists, graphs, and sometimes provide values for the statistical significance of the results. Exactly what is computed by many of the available programs is often not easy to reconstruct or may even be impossible to know for the end user. It is therefore not surprising that many biology students and some researchers using microarray data do not fully understand the nature of the underlying statistics used to arrive at the results. We have developed a module that we have used successfully in undergraduate biology and statistics education that allows students to get a better understanding of both the basic biological and statistical theory needed to comprehend primary microarray data. The module is intended for the undergraduate level but may be useful to anyone who is new to the field of microarray biology. Additional course material that was developed for classroom use can be found at http://www.polyploidy.org/. In our undergraduate classrooms we encourage students to manipulate microarray data using Microsoft Excel to reinforce some of the concepts they learn. We have included instructions for some of these manipulations throughout this chapter (see the "Do this..." boxes). However, it should be noted that while Excel can effectively analyze our small sample data set, more specialized software would typically be used to analyze full microarray data sets. Nevertheless, we believe that manipulating a small data set with Excel can provide insights into the workings of more advanced analysis software. PMID:20652509
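The kinds of spreadsheet manipulations the module describes can equally be sketched in a few lines of code. Using simulated two-color intensities (not the module's data set), the standard log-ratio (M) and average-intensity (A) values with a global median normalization look like:

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated two-color data: green (Cy3) and red (Cy5) intensities per gene,
# with a slight multiplicative dye bias on the red channel.
green = rng.lognormal(mean=8.0, sigma=1.0, size=500)
red = green * rng.lognormal(mean=0.1, sigma=0.15, size=500)

# M = log2 ratio, A = average log2 intensity (the usual MA representation).
M = np.log2(red / green)
A = 0.5 * np.log2(red * green)

# Global median normalization: shift M so the median log-ratio is zero,
# assuming most genes are not differentially expressed.
M_norm = M - np.median(M)
print(float(np.median(M_norm)))  # ~0 after normalization
```

Working through these few steps by hand (in Excel or otherwise) makes it much clearer what a normalization button in commercial software actually does.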

  1. Long synthetic oligonucleotides for microarray expression measurement

    NASA Astrophysics Data System (ADS)

    Li, Jiong; Wang, Hong; Liu, Heping; Zhang, M.; Zhang, Chunxiu; Lu, Zu-Hong; Gao, Xiang; Kong, Dong

    2001-09-01

There are generally two kinds of DNA microarray used for genomic-scale gene expression profiling of mRNA, the cDNA array and the DNA chip, but both suffer from drawbacks. To meet additional requirements, another type of microarray based on long synthetic oligonucleotides was produced. This type of microarray has the advantages of low cost, minimal cross-hybridization, and flexibility, and is easy to make, which makes it well suited for small laboratories with special purposes. In this paper, we devised probes with different lengths, GC contents and gene positions to optimize the probe design. Experiments showed that 70-mer probes are suitable for both sufficient sensitivity and reasonable cost. Higher GC content produces stronger signal intensity and thus better sensitivity, and probes designed within 300 bp of the 3′ untranslated region of a gene should be best for both sensitivity and specificity.

  2. Protein microarrays for parasite antigen discovery.

    PubMed

    Driguez, Patrick; Doolan, Denise L; Molina, Douglas M; Loukas, Alex; Trieu, Angela; Felgner, Phil L; McManus, Donald P

    2015-01-01

    The host serological profile to a parasitic infection, such as schistosomiasis, can be used to define potential vaccine and diagnostic targets. Determining the host antibody response using traditional approaches is hindered by the large number of putative antigens in any parasite proteome. Parasite protein microarrays offer the potential for a high-throughput host antibody screen to simplify this task. In order to construct the array, parasite proteins are selected from available genomic sequence and protein databases using bioinformatic tools. Selected open reading frames are PCR amplified, incorporated into a vector for cell-free protein expression, and printed robotically onto glass slides. The protein microarrays can be probed with antisera from infected/immune animals or humans and the antibody reactivity measured with fluorophore labeled antibodies on a confocal laser microarray scanner to identify potential targets for diagnosis or therapeutic or prophylactic intervention. PMID:25388117

  3. Applications of protein microarrays for biomarker discovery

    PubMed Central

    Ramachandran, Niroshan; Srivastava, Sanjeeva; LaBaer, Joshua

    2011-01-01

The search for new biomarkers for diagnosis, prognosis and therapeutic monitoring of diseases continues in earnest despite dwindling success at finding novel reliable markers. Some of the current markers in clinical use do not provide optimal sensitivity and specificity, with prostate-specific antigen (PSA) being one of many such examples. The emergence of proteomic techniques and systems approaches to study disease pathophysiology has rekindled the quest for new biomarkers. In particular, the use of protein microarrays has surged as a powerful tool for large-scale testing of biological samples. Approximately half the reports on protein microarrays have been published in the last two years, especially in the area of biomarker discovery. In this review, we discuss the application of protein microarray technologies that offer unique opportunities to find novel biomarkers. PMID:21136793

  4. Library Automation: A Survey of Leading Academic and Public Libraries in the United States.

    ERIC Educational Resources Information Center

    Mann, Thomas W., Jr.; And Others

    Results of this survey of 26 public and academic libraries of national stature show that the country's major libraries are fully committed to automating their library operations. Major findings of the survey show that: (1) all libraries surveyed are involved in automation; (2) all libraries surveyed have automated their catalogs and bibliographic…

  5. Analysis of High-Throughput ELISA Microarray Data

    SciTech Connect

    White, Amanda M.; Daly, Don S.; Zangar, Richard C.

    2011-02-23

    Our research group develops analytical methods and software for the high-throughput analysis of quantitative enzyme-linked immunosorbent assay (ELISA) microarrays. ELISA microarrays differ from DNA microarrays in several fundamental aspects and most algorithms for analysis of DNA microarray data are not applicable to ELISA microarrays. In this review, we provide an overview of the steps involved in ELISA microarray data analysis and how the statistically sound algorithms we have developed provide an integrated software suite to address the needs of each data-processing step. The algorithms discussed are available in a set of open-source software tools (http://www.pnl.gov/statistics/ProMAT).

  6. Overview of DNA microarrays: types, applications, and their future.

    PubMed

    Bumgarner, Roger

    2013-01-01

    This unit provides an overview of DNA microarrays. Microarrays are a technology in which thousands of nucleic acids are bound to a surface and are used to measure the relative concentration of nucleic acid sequences in a mixture via hybridization and subsequent detection of the hybridization events. This overview first discusses the history of microarrays and the antecedent technologies that led to their development. This is followed by discussion of the methods of manufacture of microarrays and the most common biological applications. The unit ends with a brief description of the limitations of microarrays and discusses how microarrays are being rapidly replaced by DNA sequencing technologies. PMID:23288464

  7. Does bacteriology laboratory automation reduce time to results and increase quality management?

    PubMed

    Dauwalder, O; Landrieve, L; Laurent, F; de Montclos, M; Vandenesch, F; Lina, G

    2016-03-01

Due to reductions in financial and human resources, many microbiological laboratories have merged to build very large clinical microbiology laboratories, which allow the use of fully automated laboratory instruments. For clinical chemistry and haematology, automation has reduced the time to results and improved the management of laboratory quality. The aim of this review was to examine whether fully automated laboratory instruments for microbiology can reduce time to results and impact quality management. This study focused on solutions that are currently available, including the BD Kiestra™ Work Cell Automation and Total Lab Automation and the Copan WASPLab®. PMID:26577142

  8. The use of microarrays in microbial ecology

    SciTech Connect

    Andersen, G.L.; He, Z.; DeSantis, T.Z.; Brodie, E.L.; Zhou, J.

    2009-09-15

    Microarrays have proven to be a useful and high-throughput method to provide targeted DNA sequence information for up to many thousands of specific genetic regions in a single test. A microarray consists of multiple DNA oligonucleotide probes that, under high stringency conditions, hybridize only to specific complementary nucleic acid sequences (targets). A fluorescent signal indicates the presence and, in many cases, the abundance of genetic regions of interest. In this chapter we will look at how microarrays are used in microbial ecology, especially with the recent increase in microbial community DNA sequence data. Of particular interest to microbial ecologists, phylogenetic microarrays are used for the analysis of phylotypes in a community and functional gene arrays are used for the analysis of functional genes, and, by inference, phylotypes in environmental samples. A phylogenetic microarray that has been developed by the Andersen laboratory, the PhyloChip, will be discussed as an example of a microarray that targets the known diversity within the 16S rRNA gene to determine microbial community composition. Using multiple, confirmatory probes to increase the confidence of detection and a mismatch probe for every perfect match probe to minimize the effect of cross-hybridization by non-target regions, the PhyloChip is able to simultaneously identify any of thousands of taxa present in an environmental sample. The PhyloChip is shown to reveal greater diversity within a community than rRNA gene sequencing due to the placement of the entire gene product on the microarray compared with the analysis of up to thousands of individual molecules by traditional sequencing methods. A functional gene array that has been developed by the Zhou laboratory, the GeoChip, will be discussed as an example of a microarray that dynamically identifies functional activities of multiple members within a community. The recent version of GeoChip contains more than 24,000 50mer

  9. Pineal function: impact of microarray analysis.

    PubMed

    Klein, David C; Bailey, Michael J; Carter, David A; Kim, Jong-so; Shi, Qiong; Ho, Anthony K; Chik, Constance L; Gaildrat, Pascaline; Morin, Fabrice; Ganguly, Surajit; Rath, Martin F; Møller, Morten; Sugden, David; Rangel, Zoila G; Munson, Peter J; Weller, Joan L; Coon, Steven L

    2010-01-27

    Microarray analysis has provided a new understanding of pineal function by identifying genes that are highly expressed in this tissue relative to other tissues and also by identifying over 600 genes that are expressed on a 24-h schedule. This effort has highlighted surprising similarity to the retina and has provided reason to explore new avenues of study including intracellular signaling, signal transduction, transcriptional cascades, thyroid/retinoic acid hormone signaling, metal biology, RNA splicing, and the role the pineal gland plays in the immune/inflammation response. The new foundation that microarray analysis has provided will broadly support future research on pineal function. PMID:19622385

  10. MicroRNA expression profiling using microarrays.

    PubMed

    Love, Cassandra; Dave, Sandeep

    2013-01-01

    MicroRNAs are small noncoding RNAs which are able to regulate gene expression at both the transcriptional and translational levels. There is a growing recognition of the role of microRNAs in nearly every tissue type and cellular process. Thus there is an increasing need for accurate quantitation of microRNA expression in a variety of tissues. Microarrays provide a robust method for the examination of microRNA expression. In this chapter, we describe detailed methods for the use of microarrays to measure microRNA expression and discuss methods for the analysis of microRNA expression data. PMID:23666707

  11. Protein Microarrays for the Detection of Biothreats

    NASA Astrophysics Data System (ADS)

    Herr, Amy E.

Although protein microarrays have proven to be an important tool in proteomics research, the technology is also emerging as useful for public health and defense applications. Recent progress in the measurement and characterization of biothreat agents is reviewed in this chapter. Details concerning validation of various protein microarray formats, from contact-printed sandwich assays to supported lipid bilayers, are presented. The reviewed technologies have important implications for in vitro characterization of toxin-ligand interactions, serotyping of bacteria, and screening of potential biothreat inhibitors, and serve as core components of biosensors, among other research and engineering applications.

  12. Both Automation and Paper.

    ERIC Educational Resources Information Center

    Purcell, Royal

    1988-01-01

    Discusses the concept of a paperless society and the current situation in library automation. Various applications of automation and telecommunications are addressed, and future library automation is considered. Automation at the Monroe County Public Library in Bloomington, Indiana, is described as an example. (MES)

  13. Automated Demand Response and Commissioning

    SciTech Connect

    Piette, Mary Ann; Watson, David S.; Motegi, Naoya; Bourassa, Norman

    2005-04-01

This paper describes the results from the second season of research to develop and evaluate the performance of new Automated Demand Response (Auto-DR) hardware and software technology in large facilities. Demand Response (DR) is a set of activities to reduce or shift electricity use to improve electric grid reliability and manage electricity costs. Fully-Automated Demand Response does not involve human intervention, but is initiated at a home, building, or facility through receipt of an external communications signal; we refer to this as Auto-DR. For evaluation, the control and communications systems must be properly configured and must pass through a set of test stages: Readiness, Approval, Price Client/Price Server Communication, Internet Gateway/Internet Relay Communication, Control of Equipment, and DR Shed Effectiveness. New commissioning tests are needed for such systems to improve the connection of demand-responsive building systems to electric grid demand response systems.

  14. Sensors and Automated Analyzers for Radionuclides

    SciTech Connect

    Grate, Jay W.; Egorov, Oleg B.

    2003-03-27

    The production of nuclear weapons materials has generated large quantities of nuclear waste and significant environmental contamination. We have developed new, rapid, automated methods for determination of radionuclides using sequential injection methodologies to automate extraction chromatographic separations, with on-line flow-through scintillation counting for real time detection. This work has progressed in two main areas: radionuclide sensors for water monitoring and automated radiochemical analyzers for monitoring nuclear waste processing operations. Radionuclide sensors have been developed that collect and concentrate radionuclides in preconcentrating minicolumns with dual functionality: chemical selectivity for radionuclide capture and scintillation for signal output. These sensors can detect pertechnetate to below regulatory levels and have been engineered into a prototype for field testing. A fully automated process monitor has been developed for total technetium in nuclear waste streams. This instrument performs sample acidification, speciation adjustment, separation and detection in fifteen minutes or less.

  15. Image segmentation for automated dental identification

    NASA Astrophysics Data System (ADS)

    Haj Said, Eyad; Nassar, Diaa Eldin M.; Ammar, Hany H.

    2006-02-01

    Dental features are among the few biometric identifiers that qualify for postmortem identification; therefore, creation of an Automated Dental Identification System (ADIS) with goals and objectives similar to those of the Automated Fingerprint Identification System (AFIS) has received increased attention. As a part of ADIS, teeth segmentation from dental radiograph films is an essential step in the identification process. In this paper, we introduce a fully automated approach for teeth segmentation with the goal of extracting at least one tooth from the dental radiograph film. We evaluate our approach on both theoretical and empirical bases, and we compare its performance with that of other approaches introduced in the literature. The results show that our approach exhibits the lowest failure rate and the highest optimality among all fully automated approaches introduced in the literature.

  16. Synthesis and characterization of nitric oxide-releasing sol-gel microarrays.

    PubMed

    Robbins, Mary E; Hopper, Erin D; Schoenfisch, Mark H

    2004-11-01

    Diazeniumdiolate-modified sol-gel microarrays capable of releasing low levels of nitric oxide are reported as a viable means for improving the blood compatibility of a surface without fully modifying the underlying substrate. Several parameters are characterized, including: (1) NO surface flux as a function of sol-gel composition and microarray geometry; (2) microstructure dimensions and spacing for optimal blood compatibility; and (3) the effect of sol-gel surface modification on analyte accessibility to platinum electrodes. The sol-gel microarrays release biologically relevant levels of NO under physiological conditions for >24 h. In vitro platelet adhesion assays indicate that a NO surface flux of 2.2 pmol cm⁻² s⁻¹ effectively reduces platelet adhesion to glass substrates modified with sol-gel microstructures separated by 50 µm. The blood compatibility observed for these micropatterned surfaces is comparable to that of NO-releasing sol-gel films. When the separation between NO-releasing microstructures is reduced to 10 µm, the NO surface flux required to reduce platelet adhesion is lowered to 0.4 pmol cm⁻² s⁻¹. Finally, the oxygen response of platinum electrodes modified with NO-releasing sol-gel microarrays indicates that selective modification via micropatterning enhances analyte accessibility to the sensor surface. PMID:15518528

  17. High-Throughput Nano-Biofilm Microarray for Antifungal Drug Discovery

    PubMed Central

    Srinivasan, Anand; Leung, Kai P.; Lopez-Ribot, Jose L.; Ramasubramanian, Anand K.

    2013-01-01

    ABSTRACT Micro- and nanoscale technologies have radically transformed biological research from genomics to tissue engineering, with the relative exception of microbial cell culture, which is still largely performed in microtiter plates and petri dishes. Here, we present nanoscale culture of the opportunistic fungal pathogen Candida albicans on a microarray platform. The microarray consists of 1,200 individual cultures of 30 nl of C. albicans biofilms (“nano-biofilms”) encapsulated in an inert alginate matrix. We demonstrate that these nano-biofilms are similar to conventional macroscopic biofilms in their morphological, architectural, growth, and phenotypic characteristics. We also demonstrate that the nano-biofilm microarray is a robust and efficient tool for accelerating the drug discovery process: (i) combinatorial screening against a collection of 28 antifungal compounds in the presence of immunosuppressant FK506 (tacrolimus) identified six drugs that showed synergistic antifungal activity, and (ii) screening against the NCI challenge set small-molecule library identified three heretofore-unknown hits. This cell-based microarray platform allows for miniaturization of microbial cell culture and is fully compatible with other high-throughput screening technologies. PMID:23800397

  18. Technical Advances of the Recombinant Antibody Microarray Technology Platform for Clinical Immunoproteomics

    PubMed Central

    Delfani, Payam; Dexlin Mellby, Linda; Nordström, Malin; Holmér, Andreas; Ohlsson, Mattias; Borrebaeck, Carl A. K.; Wingren, Christer

    2016-01-01

    In the quest for deciphering disease-associated biomarkers, high-performing tools for multiplexed protein expression profiling of crude clinical samples will be crucial. Affinity proteomics, mainly represented by antibody-based microarrays, has in recent years been established as a proteomic tool providing unique opportunities for parallelized protein expression profiling. Despite this progress, however, several key technical features and assay procedures remain to be fully resolved. Among these issues, the handling of protein microarray data, i.e. the biostatistics, is one of the key problems to solve. In this study, we have therefore further optimized, validated, and standardized our in-house designed recombinant antibody microarray technology platform. To this end, we addressed the main remaining technical issues (e.g. antibody quality, array production, sample labelling, and selected assay conditions) and, most importantly, key biostatistics subjects (e.g. array data pre-processing and biomarker panel condensation). This represents one of the first antibody array studies in which these key biostatistics subjects have been studied in detail. Here, we thus present the next generation of the recombinant antibody microarray technology platform designed for clinical immunoproteomics. PMID:27414037

  19. PRACTICAL STRATEGIES FOR PROCESSING AND ANALYZING SPOTTED OLIGONUCLEOTIDE MICROARRAY DATA

    EPA Science Inventory

    Thoughtful data analysis is as important as experimental design, biological sample quality, and appropriate experimental procedures for making microarrays a useful supplement to traditional toxicology. In the present study, spotted oligonucleotide microarrays were used to profile...

  20. MICROARRAY DATA ANALYSIS USING MULTIPLE STATISTICAL MODELS

    EPA Science Inventory

    Microarray Data Analysis Using Multiple Statistical Models

    Wenjun Bao1, Judith E. Schmid1, Amber K. Goetz1, Ming Ouyang2, William J. Welsh2,Andrew I. Brooks3,4, ChiYi Chu3,Mitsunori Ogihara3,4, Yinhe Cheng5, David J. Dix1. 1National Health and Environmental Effects Researc...

  1. Microarrays (DNA Chips) for the Classroom Laboratory

    ERIC Educational Resources Information Center

    Barnard, Betsy; Sussman, Michael; BonDurant, Sandra Splinter; Nienhuis, James; Krysan, Patrick

    2006-01-01

    We have developed and optimized the necessary laboratory materials to make DNA microarray technology accessible to all high school students at a fraction of both cost and data size. The primary component is a DNA chip/array that students "print" by hand and then analyze using research tools that have been adapted for classroom use. The primary…

  2. DISC-BASED IMMUNOASSAY MICROARRAYS. (R825433)

    EPA Science Inventory

    Microarray technology as applied to areas that include genomics, diagnostics, environmental, and drug discovery, is an interesting research topic for which different chip-based devices have been developed. As an alternative, we have explored the principle of compact disc-based...

  3. Microarray data classified by artificial neural networks.

    PubMed

    Linder, Roland; Richards, Tereza; Wagner, Mathias

    2007-01-01

    Systems biology has enjoyed explosive growth in both the number of people participating in this area of research and the number of publications on the topic. The field of systems biology encompasses the in silico analysis of high-throughput data as provided by DNA or protein microarrays. Along with the increasing availability of microarray data, attention is focused on methods of analyzing the expression rates. One important type of analysis is the classification task, for example, distinguishing different types of cell functions or tumors. Recently, interest has been awakened toward artificial neural networks (ANN), which have many appealing characteristics, such as an exceptional degree of accuracy, the ability to capture nonlinear relationships, and independence from certain assumptions regarding the data distribution. The current work reviews advantages as well as disadvantages of neural networks in the context of microarray analysis. Comparisons are drawn to alternative methods. Selected solutions are discussed, and finally algorithms for the effective combination of multiple ANNs are presented. The development of approaches that use ANN-processed microarray data to run cell and tissue simulations may be a subject of future investigation. PMID:18220242

  4. Data Analysis Strategies for Protein Microarrays

    PubMed Central

    Díez, Paula; Dasilva, Noelia; González-González, María; Matarraz, Sergio; Casado-Vela, Juan; Orfao, Alberto; Fuentes, Manuel

    2012-01-01

    Microarrays constitute a new platform which allows the discovery and characterization of proteins. According to different features, such as content, surface or detection system, there are many types of protein microarrays, which can be applied to the identification of disease biomarkers and the characterization of protein expression patterns. However, the analysis and interpretation of the amount of information generated by microarrays remain a challenge. Further data analysis strategies are essential to obtain representative and reproducible results. The experimental design is therefore key, since the number of samples and dyes, among other aspects, defines the appropriate analysis method to be used. In this sense, several algorithms have been proposed so far to overcome analytical difficulties derived from fluorescence overlapping and/or background noise. Each kind of microarray is developed to fulfill a specific purpose. Therefore, the selection of appropriate analytical and data analysis strategies is crucial to achieve successful biological conclusions. In the present review, we focus on current algorithms and main strategies for data interpretation.

  5. Diagnostic Oligonucleotide Microarray Fingerprinting of Bacillus Isolates

    SciTech Connect

    Chandler, Darrell P.; Alferov, Oleg; Chernov, Boris; Daly, Don S.; Golova, Julia; Perov, Alexander N.; Protic, Miroslava; Robison, Richard; Shipma, Matthew; White, Amanda M.; Willse, Alan R.

    2006-01-01

    A diagnostic, genome-independent microbial fingerprinting method using DNA oligonucleotide microarrays was used for high-resolution differentiation between closely related Bacillus strains, including two strains of Bacillus anthracis that are monomorphic (indistinguishable) via amplified fragment length polymorphism fingerprinting techniques. Replicated hybridizations on 391-probe nonamer arrays were used to construct a prototype fingerprint library for quantitative comparisons. Descriptive analysis of the fingerprints, including phylogenetic reconstruction, is consistent with previous taxonomic organization of the genus. Newly developed statistical analysis methods were used to quantitatively compare and objectively confirm apparent differences in microarray fingerprints with the statistical rigor required for microbial forensics and clinical diagnostics. These data suggest that a relatively simple fingerprinting microarray and statistical analysis method can differentiate between species in the Bacillus cereus complex, and between strains of B. anthracis. A synthetic DNA standard was used to understand underlying microarray and process-level variability, leading to specific recommendations for the development of a standard operating procedure and/or continued technology enhancements for microbial forensics and diagnostics.

  6. Shrinkage covariance matrix approach for microarray data

    NASA Astrophysics Data System (ADS)

    Karjanto, Suryaefiza; Aripin, Rasimah

    2013-04-01

    Microarray technology was developed for the purpose of monitoring the expression levels of thousands of genes. A microarray data set typically consists of tens of thousands of genes (variables) from just dozens of samples, due to various constraints including the high cost of producing microarray chips. As a result, the widely used standard covariance estimator is not appropriate for this purpose. One common multivariate test statistic for comparing means between two groups is Hotelling's T2. It requires that the number of observations (n) exceed the number of genes (p) in the set, but in microarray studies it is common that n < p, which leads to a biased estimate of the covariance matrix. In this study, the Hotelling's T2 statistic with a shrinkage approach is proposed to estimate the covariance matrix for testing differential gene expression. The performance of this approach is then compared with that of other commonly used multivariate tests, using a widely analysed diabetes data set as an illustration. The results across the methods are consistent, implying that this approach provides an alternative to existing techniques.
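
    The shrinkage idea summarized in this record can be sketched in a few lines: pool the two group covariances, shrink the pooled estimate toward its diagonal so it stays invertible when genes outnumber samples (n < p), and plug it into the usual two-sample T2 formula. The function names and the fixed shrinkage intensity below are illustrative assumptions, not the study's actual estimator (which the abstract does not specify); data-driven intensities such as Ledoit-Wolf are common in practice.

```python
import numpy as np

def shrunk_pooled_cov(X1, X2, lam=0.2):
    """Pooled sample covariance shrunk toward its diagonal.

    lam is a hypothetical fixed shrinkage intensity; the diagonal
    target keeps the estimate positive definite even when n < p.
    """
    n1, n2 = X1.shape[0], X2.shape[0]
    S = ((n1 - 1) * np.cov(X1, rowvar=False) +
         (n2 - 1) * np.cov(X2, rowvar=False)) / (n1 + n2 - 2)
    target = np.diag(np.diag(S))          # well-conditioned diagonal target
    return lam * target + (1 - lam) * S

def hotelling_t2(X1, X2, lam=0.2):
    """Two-sample Hotelling's T2 using the shrinkage covariance."""
    n1, n2 = X1.shape[0], X2.shape[0]
    diff = X1.mean(axis=0) - X2.mean(axis=0)
    S = shrunk_pooled_cov(X1, X2, lam)
    # (n1*n2)/(n1+n2) * diff' S^{-1} diff, solved without forming S^{-1}
    return (n1 * n2) / (n1 + n2) * diff @ np.linalg.solve(S, diff)

rng = np.random.default_rng(0)
X1 = rng.normal(0.0, 1.0, size=(10, 50))   # 10 samples, 50 genes: n < p
X2 = rng.normal(0.5, 1.0, size=(12, 50))
t2 = hotelling_t2(X1, X2)
print(t2 > 0)  # strictly positive since the shrunk covariance is positive definite
```

    Without shrinkage (lam=0) the pooled covariance here would be singular, since its rank is at most n1+n2-2 = 20 while p = 50.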

  7. Raman-based microarray readout: a review.

    PubMed

    Haisch, Christoph

    2016-07-01

    For a quarter of a century, microarrays have been part of the routine analytical toolbox. Label-based fluorescence detection is still the commonest optical readout strategy. Since the 1990s, a continuously increasing number of label-based as well as label-free experiments on Raman-based microarray readout concepts have been reported. This review summarizes the possible concepts and methods and their advantages and challenges. A common label-based strategy is based on the binding of selective receptors as well as Raman reporter molecules to plasmonic nanoparticles in a sandwich immunoassay, which results in surface-enhanced Raman scattering signals of the reporter molecule. Alternatively, capture of the analytes can be performed by receptors on a microarray surface. Addition of plasmonic nanoparticles again leads to a surface-enhanced Raman scattering signal, not of a label but directly of the analyte. This approach is mostly proposed for bacteria and cell detection. However, although many promising readout strategies have been discussed in numerous publications, rarely have any of them made the step from proof of concept to a practical application, let alone routine use. Graphical Abstract: Possible realization of a SERS (Surface-Enhanced Raman Scattering) system for microarray readout. PMID:26973235

  8. Automation of Meudon Synoptic Maps

    NASA Astrophysics Data System (ADS)

    Aboudarham, J.; Scholl, I.; Fuller, N.; Fouesneau, M.; Galametz, M.; Gonon, F.; Maire, A.; Leroy, Y.

    2007-05-01

    Thanks to the automatic solar feature detection developed in the frame of the European EGSO (European Grid of Solar Observations) project, an important part of the automation of Meudon Synoptic Maps is achieved. Nevertheless, the tracking of these solar structures over time still has to be done to synthesize their evolution during a Carrington rotation. A new approach to tracking filaments, based on image segmentation and intersection of regions of interest, gives successful results. This is a major step towards a fully automatic building of the Meudon Synoptic Maps of Solar Activity.

  9. Examining microarray slide quality for the EPA using SNL's hyperspectral microarray scanner.

    SciTech Connect

    Rohde, Rachel M.; Timlin, Jerilyn Ann

    2005-11-01

    This report summarizes research performed at Sandia National Laboratories (SNL) in collaboration with the Environmental Protection Agency (EPA) to assess microarray quality on arrays from two platforms of interest to the EPA. Custom microarrays from two novel, commercially produced array platforms were imaged with SNL's unique hyperspectral imaging technology, and multivariate data analysis was performed to investigate sources of emission on the arrays. No extraneous sources of emission were evident in any of the array areas scanned. This led to the conclusion that either of these array platforms could produce high-quality, reliable microarray data for the EPA toxicology programs. Hyperspectral imaging results are presented and recommendations for microarray analyses using these platforms are detailed within the report.

  10. Facilitating functional annotation of chicken microarray data

    PubMed Central

    2009-01-01

    Background Modeling results from chicken microarray studies is challenging for researchers due to the limited functional annotation associated with these arrays. The Affymetrix GeneChip chicken genome array, one of the largest arrays serving as a key research tool for the study of chicken functional genomics, is among the few arrays that link gene products to Gene Ontology (GO). However, the GO annotation data presented by Affymetrix are incomplete; for example, they do not show references linked to manually annotated functions. In addition, there is no tool that allows microarray researchers to directly retrieve functional annotations for their datasets from the annotated arrays. This costs researchers a considerable amount of time searching multiple GO databases for functional information. Results We have improved the breadth of functional annotations of the gene products associated with probesets on the Affymetrix chicken genome array by 45% and the quality of annotation by 14%. We have also identified the most significant diseases and disorders, different types of genes, and known drug targets represented on the Affymetrix chicken genome array. To facilitate functional annotation of other arrays and microarray experimental datasets we developed an Array GO Mapper (AGOM) tool to help researchers quickly retrieve corresponding functional information for their datasets. Conclusion Results from this study will directly facilitate annotation of other chicken arrays and microarray experimental datasets. Researchers will be able to quickly model their microarray datasets into more reliable biological functional information by using the AGOM tool. The diseases, disorders, gene types and drug targets revealed in the study will allow researchers to learn more about how genes function in complex biological systems and may lead to new drug discovery and development of therapies. The GO annotation data generated will be available for public use via the AgBase website and will be updated on a regular basis.

  11. Fully Massive Six Dimensional Box

    NASA Astrophysics Data System (ADS)

    Glosser, Chris; Ward, B. F. L.; Yost, Scott

    2004-05-01

    In this work, we present a fully analytic calculation of the six dimensional scalar four-point function, which is necessary for calculations using the amplitude decomposition of Bern, Dixon, and Kosower. The calculation proceeds along the lines of the calculation of the 3-point function by Vermaseren and Oldenburg.

  12. Automated Miniaturized Instrument for Space Biology Applications and the Monitoring of the Astronauts Health Onboard the ISS

    NASA Technical Reports Server (NTRS)

    Karouia, Fathi; Peyvan, Kia; Danley, David; Ricco, Antonio J.; Santos, Orlando; Pohorille, Andrew

    2011-01-01

    Human space travelers experience a unique environment that affects homeostasis and physiologic adaptation. The spacecraft environment subjects the traveler to noise, chemical and microbiological contaminants, increased radiation, and variable gravity forces. As humans prepare for long-duration missions to the International Space Station (ISS) and beyond, effective measures must be developed, verified and implemented to ensure mission success. Limited biomedical quantitative capabilities are currently available onboard the ISS. Therefore, the development of versatile instruments to perform space biological analysis and to monitor astronauts' health is needed. We are developing a fully automated, miniaturized system for measuring gene expression on small spacecraft in order to better understand the influence of the space environment on biological systems. This low-cost, low-power, multi-purpose instrument represents a major scientific and technological advancement by providing data on cellular metabolism and regulation. The current system will support growth of microorganisms, extract and purify the RNA, hybridize it to the array, read the expression levels of a large number of genes by microarray analysis, and transmit the measurements to Earth. The system will help discover how bacteria develop resistance to antibiotics and how pathogenic bacteria sometimes increase their virulence in space, facilitating the development of adequate countermeasures to decrease risks associated with human spaceflight. The current stand-alone technology could be used as an integrated platform onboard the ISS to perform similar genetic analyses on any biological systems from the tree of life. Additionally, with some modification the system could be implemented to perform real-time in-situ microbial monitoring of the ISS environment (air, surface and water samples) and the astronauts' microbiome using 16S rRNA microarray technology. Furthermore, the current system can be enhanced

  13. Structured oligonucleotides for target indexing to allow single-vessel PCR amplification and solid support microarray hybridization

    PubMed Central

    Girard, Laurie D.; Boissinot, Karel; Peytavi, Régis; Boissinot, Maurice; Bergeron, Michel G.

    2014-01-01

    The combination of molecular diagnostic technologies is increasingly used to overcome limitations on sensitivity, specificity or multiplexing capabilities, and to provide efficient lab-on-chip devices. Two such techniques, PCR amplification and microarray hybridization, are used serially to take advantage of the high sensitivity and specificity of the former combined with the high multiplexing capacity of the latter. These methods are usually performed in different buffers and reaction chambers. However, these elaborate methods have a high complexity cost related to reagent requirements, liquid storage and the number of reaction chambers to integrate into automated devices. Furthermore, microarray hybridizations have a sequence-dependent efficiency that is not always predictable. In this work, we have developed the concept of a structured oligonucleotide probe which is activated by cleavage from polymerase exonuclease activity. This technology is called SCISSOHR, for Structured Cleavage Induced Single-Stranded Oligonucleotide Hybridization Reaction. The SCISSOHR probes enable indexing of the target sequence to a tag sequence. The SCISSOHR technology also allows the combination of nucleic acid amplification and microarray hybridization in a single vessel in the presence of the PCR buffer only. The SCISSOHR technology uses an amplification probe that is irreversibly modified in the presence of the target, releasing a single-stranded DNA tag for microarray hybridization. Each tag is composed of a 3-nucleotide sequence-dependent segment and a unique “target sequence-independent” 14-nucleotide segment allowing for optimal hybridization with minimal cross-hybridization. We evaluated the performance of five (5) PCR buffers to support microarray hybridization, compared to a conventional hybridization buffer. Finally, as a proof of concept, we developed a multiplexed assay for the amplification, detection, and identification of three (3) DNA targets. This new technology will facilitate the design
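
    The target-to-tag indexing described above can be illustrated with a toy sketch. All sequences, names, and the lookup scheme below are hypothetical placeholders, not the published probe designs: each target maps to a 17-nucleotide tag whose first three bases are target-dependent and whose last 14 bases form a unique, target-independent hybridization barcode.

```python
# Illustrative sketch (not the authors' implementation) of SCISSOHR-style
# indexing: a target is detected via the tag released by exonuclease
# cleavage, and the microarray only ever hybridizes the uniform tags.
TAG_INDEX = {
    # target name -> (3-nt target-dependent part, 14-nt unique barcode)
    "targetA": ("ACG", "TTAGCCATGGTACT"),
    "targetB": ("GTA", "CCGATTAGCATGCA"),
}

def released_tag(target_name):
    """Single-stranded DNA tag released when the target is present."""
    dep, barcode = TAG_INDEX[target_name]
    return dep + barcode

def identify(tag):
    """Microarray-side lookup: map a hybridized tag back to its target."""
    for name in TAG_INDEX:
        if released_tag(name) == tag:
            return name
    return None  # tag does not match any indexed target

print(identify(released_tag("targetB")))  # → targetB
```

    The point of the indirection is that hybridization behavior depends only on the small, uniform tag set rather than on each target's own sequence.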

  14. A software framework for microarray and gene expression object model (MAGE-OM) array design annotation

    PubMed Central

    Qureshi, Matloob; Ivens, Alasdair

    2008-01-01

    Background The MIAME and MAGE-OM standards defined by the MGED society provide a specification and implementation of a software infrastructure to facilitate the submission and sharing of data from microarray studies via public repositories. However, although the MAGE object model is flexible enough to support different annotation strategies, the annotation of array descriptions can be complex. Results We have developed a graphical Java-based application (Adamant) to assist with the submission of microarray designs to public repositories. Output of the application is fully compliant with the standards prescribed by the various public data repositories. Conclusion Adamant will allow researchers to annotate and submit their own array designs to public repositories without requiring programming expertise or knowledge of MAGE-OM or XML. The application has been used to submit a number of ArrayDesigns to the ArrayExpress database. PMID:18366695

  15. Automation of Capacity Bidding with an Aggregator Using Open Automated Demand Response

    SciTech Connect

    Kiliccote, Sila; Piette, Mary Ann

    2008-10-01

    This report summarizes San Diego Gas & Electric Company's collaboration with the Demand Response Research Center to develop and test automation capability for the Capacity Bidding Program in 2007. The report describes the Open Automated Demand Response architecture and summarizes the history of technology development and pilot studies. It also outlines the Capacity Bidding Program and the technology being used by an aggregator that participated in this demand response program. Due to delays, the program was not fully operational for summer 2007. However, a test event on October 3, 2007, showed that the project successfully achieved the objective to develop and demonstrate how an open, Web-based interoperable automated notification system for capacity bidding can be used by aggregators for demand response. The system was effective in initiating a fully automated demand response shed at the aggregated sites. This project also demonstrated how aggregators can integrate their demand response automation systems with San Diego Gas & Electric Company's Demand Response Automation Server and capacity bidding program.

  16. Detecting and Genotyping Escherichia coli O157:H7 using multiplexed PCR and nucleic acid microarrays

    SciTech Connect

    Call, Douglas R.; Brockman, Fred J. ); Chandler, Darrell P.

    2000-12-01

    Rapid detection and characterization of foodborne pathogens such as Escherichia coli O157:H7 is crucial for epidemiological investigations and food safety surveillance. As an alternative to conventional technologies, we examined the sensitivity and specificity of nucleic acid microarrays for detecting and genotyping E. coli O157:H7. The array was composed of oligonucleotide probes (25-30 mer) complementary to four virulence loci (intimin, Shiga-like toxins I and II, and hemolysin A). Target DNA was amplified from whole cells or from purified DNA via single or multiplexed polymerase chain reaction (PCR), and PCR products were hybridized to the array without further modification or purification. The array was 32-fold more sensitive than gel electrophoresis and capable of detecting amplification products from < 1 cell equivalent of genomic DNA (1 fg). Immunomagnetic capture, PCR and a microarray were subsequently used to detect 55 CFU ml-1 of E. coli O157:H7 from chicken rinsate without the aid of pre-enrichment. Four isolates of E. coli O157:H7 and one isolate of O91:H2, for which genotypic data were available, were unambiguously genotyped with this array. Glass-based microarrays are relatively simple to construct and provide a rapid and sensitive means to detect multiplexed PCR products, and the system is amenable to automation.

  17. GEPAS, a web-based tool for microarray data analysis and interpretation

    PubMed Central

    Tárraga, Joaquín; Medina, Ignacio; Carbonell, José; Huerta-Cepas, Jaime; Minguez, Pablo; Alloza, Eva; Al-Shahrour, Fátima; Vegas-Azcárate, Susana; Goetz, Stefan; Escobar, Pablo; Garcia-Garcia, Francisco; Conesa, Ana; Montaner, David; Dopazo, Joaquín

    2008-01-01

    Gene Expression Profile Analysis Suite (GEPAS) is one of the most complete and extensively used web-based packages for microarray data analysis. During its more than 5 years of activity it has been continuously updated to keep pace with the state of the art in the changing microarray data analysis arena. GEPAS offers diverse analysis options that include well-established as well as novel algorithms for normalization, gene selection, class prediction, clustering and functional profiling of the experiment. New options for time-course (or dose-response) experiments, microarray-based class prediction, new clustering methods and new tests for differential expression have been included. The new pipeliner module allows automating the execution of sequential analysis steps by means of a simple but powerful graphic interface. An extensive re-engineering of GEPAS has been carried out, which includes the use of web services and Web 2.0 technology features, a new user interface with persistent sessions and a new extended database of gene identifiers. GEPAS is nowadays the most quoted web tool in its field; it is extensively used by researchers in many countries, and its records indicate an average usage rate of 500 experiments per day. GEPAS is available at http://www.gepas.org. PMID:18508806

  18. Sequencing by Cyclic Ligation and Cleavage (CycLiC) directly on a microarray captured template

    PubMed Central

    Mir, Kalim U.; Qi, Hong; Salata, Oleg; Scozzafava, Giuseppe

    2009-01-01

    Next-generation sequencing methods that can be applied both to the resequencing of whole genomes and to the selective resequencing of specific parts of genomes are needed. We describe (i) a massively scalable biochemistry, Cyclical Ligation and Cleavage (CycLiC), for contiguous base sequencing and (ii) its application directly to a template captured on a microarray. CycLiC uses four color-coded DNA/RNA chimeric oligonucleotide libraries (OL) to extend a primer, a base at a time, along a template. The cycles comprise the steps: (i) ligation of OLs, (ii) identification of the extended base by label detection, and (iii) cleavage to remove the label/terminator and undetermined bases. For proof of principle, we show that the method conforms to design and that we can read contiguous bases of sequence correctly from a template captured by hybridization from solution to a microarray probe. The method is amenable to massive scale-up, miniaturization and automation. Implementation in a microarray format offers the potential for both selection and sequencing of a large number of genomic regions on a single platform. Because the method uses commonly available reagents, it can be developed further by a community of users. PMID:19015154
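
    As an illustration only, the three-step cycle described above can be mimicked by a toy simulator in which each round reports the complement of the next template base. The function names are assumptions, and the chemistry (ligation efficiency, labels, cleavage) is of course not modeled; the sketch just shows how one base is committed per ligate/detect/cleave round.

```python
# Toy model of the CycLiC read cycle: one base identified per round.
def complement(base):
    return {"A": "T", "T": "A", "C": "G", "G": "C"}[base]

def cyclic_read(template, n_cycles):
    """Return the sequence read from `template` after n_cycles rounds.

    Each round corresponds to one full cycle:
      (i)   ligate a labeled chimeric oligo whose first base pairs with
            the current template position,
      (ii)  read that base from the oligo's color-coded label,
      (iii) cleave off the label/terminator and undetermined bases so the
            next round extends by exactly one determined base.
    """
    read = []
    for pos in range(min(n_cycles, len(template))):
        detected = complement(template[pos])  # step (ii): label reports the paired base
        read.append(detected)
        # step (iii): cleavage leaves an extendable end one base further along
    return "".join(read)

print(cyclic_read("ACGT", 4))  # → TGCA
```

    The key property captured here is that the read length is bounded by the number of cycles performed, not by the template length.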

  19. Detecting and genotyping Escherichia coli O157:H7 using multiplexed PCR and nucleic acid microarrays

    SciTech Connect

    Call, Douglas R.; Brockman, Fred J.; Chandler, Darrell P.

    2001-07-05

    Rapid detection and characterization of foodborne pathogens such as Escherichia coli O157:H7 is crucial for epidemiological investigations and food safety surveillance. As an alternative to conventional technologies, we examined the sensitivity and specificity of nucleic acid microarrays for detecting and genotyping E. coli O157:H7. The array was composed of oligonucleotide probes (25-30 mer) complementary to four virulence loci (intimin, Shiga-like toxins I and II, and hemolysin A). Target DNA was amplified from whole cells or from purified DNA via single or multiplexed polymerase chain reaction (PCR), and PCR products were hybridized to the array without further modification or purification. The array was 32-fold more sensitive than gel electrophoresis and capable of detecting amplification products from < 1 cell equivalent of genomic DNA (1 fg). Immunomagnetic capture, PCR and a microarray were subsequently used to detect 55 CFU ml-1 of E. coli O157:H7 from chicken rinsate without the aid of pre-enrichment. Four isolates of E. coli O157:H7 and one isolate of O91:H2, for which genotypic data were available, were unambiguously genotyped with this array. Glass-based microarrays are relatively simple to construct and provide a rapid and sensitive means to detect multiplexed PCR products, and the system is amenable to automation.

  20. Viral diagnosis in Indian livestock using customized microarray chips

    PubMed Central

    Yadav, Brijesh S; Pokhriyal, Mayank; Ratta, Barkha; Kumar, Ajay; Saxena, Meeta; Sharma, Bhaskar

    2015-01-01

    Viral diagnosis in Indian livestock using customized microarray chips has been gaining momentum in recent years, and it is now possible to design customized microarray chips for viruses infecting livestock in India. Customized microarray chips identified Bovine herpes virus-1 (BHV-1), Canine adeno virus-1 (CAV-1), and Canine parvo virus-2 (CPV-2) in clinical samples. Probes identified by the microarray were further confirmed using RT-PCR in all clinical and known samples. Therefore, the application of microarray chips during viral disease outbreaks in Indian livestock is possible where conventional methods are unsuitable. It should be noted that customized application requires a detailed cost-efficiency calculation. PMID:26912948

  1. Advancing translational research with next-generation protein microarrays.

    PubMed

    Yu, Xiaobo; Petritis, Brianne; LaBaer, Joshua

    2016-04-01

    Protein microarrays are a high-throughput technology used increasingly in translational research, seeking to apply basic science findings to enhance human health. In addition to assessing protein levels, posttranslational modifications, and signaling pathways in patient samples, protein microarrays have aided in the identification of potential protein biomarkers of disease and infection. In this perspective, the different types of full-length protein microarrays that are used in translational research are reviewed. Specific studies employing these microarrays are presented to highlight their potential in finding solutions to real clinical problems. Finally, the criteria that should be considered when developing next-generation protein microarrays are provided. PMID:26749402

  2. An automated microfluidic platform for C. elegans embryo arraying, phenotyping, and long-term live imaging

    PubMed Central

    Cornaglia, Matteo; Mouchiroud, Laurent; Marette, Alexis; Narasimhan, Shreya; Lehnert, Thomas; Jovaisaite, Virginija; Auwerx, Johan; Gijs, Martin A. M.

    2015-01-01

    Studies of the real-time dynamics of embryonic development require a gentle embryo handling method, the possibility of long-term live imaging during the complete embryogenesis, as well as of parallelization providing a population’s statistics, while keeping single embryo resolution. We describe an automated approach that fully accomplishes these requirements for embryos of Caenorhabditis elegans, one of the most employed model organisms in biomedical research. We developed a microfluidic platform which makes use of pure passive hydrodynamics to run on-chip worm cultures, from which we obtain synchronized embryo populations, and to immobilize these embryos in incubator microarrays for long-term high-resolution optical imaging. We successfully employ our platform to investigate morphogenesis and mitochondrial biogenesis during the full embryonic development and elucidate the role of the mitochondrial unfolded protein response (UPRmt) within C. elegans embryogenesis. Our method can be generally used for protein expression and developmental studies at the embryonic level, but can also provide clues to understand the aging process and age-related diseases in particular. PMID:25950235

  3. An automated microfluidic platform for C. elegans embryo arraying, phenotyping, and long-term live imaging.

    PubMed

    Cornaglia, Matteo; Mouchiroud, Laurent; Marette, Alexis; Narasimhan, Shreya; Lehnert, Thomas; Jovaisaite, Virginija; Auwerx, Johan; Gijs, Martin A M

    2015-01-01

    Studies of the real-time dynamics of embryonic development require a gentle embryo handling method, the possibility of long-term live imaging during the complete embryogenesis, as well as of parallelization providing a population's statistics, while keeping single embryo resolution. We describe an automated approach that fully accomplishes these requirements for embryos of Caenorhabditis elegans, one of the most employed model organisms in biomedical research. We developed a microfluidic platform which makes use of pure passive hydrodynamics to run on-chip worm cultures, from which we obtain synchronized embryo populations, and to immobilize these embryos in incubator microarrays for long-term high-resolution optical imaging. We successfully employ our platform to investigate morphogenesis and mitochondrial biogenesis during the full embryonic development and elucidate the role of the mitochondrial unfolded protein response (UPR(mt)) within C. elegans embryogenesis. Our method can be generally used for protein expression and developmental studies at the embryonic level, but can also provide clues to understand the aging process and age-related diseases in particular. PMID:25950235

  4. An automated microfluidic platform for C. elegans embryo arraying, phenotyping, and long-term live imaging

    NASA Astrophysics Data System (ADS)

    Cornaglia, Matteo; Mouchiroud, Laurent; Marette, Alexis; Narasimhan, Shreya; Lehnert, Thomas; Jovaisaite, Virginija; Auwerx, Johan; Gijs, Martin A. M.

    2015-05-01

    Studies of the real-time dynamics of embryonic development require a gentle embryo handling method, the possibility of long-term live imaging during the complete embryogenesis, as well as of parallelization providing a population’s statistics, while keeping single embryo resolution. We describe an automated approach that fully accomplishes these requirements for embryos of Caenorhabditis elegans, one of the most employed model organisms in biomedical research. We developed a microfluidic platform which makes use of pure passive hydrodynamics to run on-chip worm cultures, from which we obtain synchronized embryo populations, and to immobilize these embryos in incubator microarrays for long-term high-resolution optical imaging. We successfully employ our platform to investigate morphogenesis and mitochondrial biogenesis during the full embryonic development and elucidate the role of the mitochondrial unfolded protein response (UPRmt) within C. elegans embryogenesis. Our method can be generally used for protein expression and developmental studies at the embryonic level, but can also provide clues to understand the aging process and age-related diseases in particular.

  5. Fully depleted back illuminated CCD

    DOEpatents

    Holland, Stephen Edward

    2001-01-01

    A backside illuminated charge coupled device (CCD) is formed of a relatively thick high resistivity photon sensitive silicon substrate, with frontside electronic circuitry, and an optically transparent backside ohmic contact for applying a backside voltage which is at least sufficient to substantially fully deplete the substrate. A greater bias voltage which overdepletes the substrate may also be applied. One way of applying the bias voltage to the substrate is by physically connecting the voltage source to the ohmic contact. An alternate way of applying the bias voltage to the substrate is to physically connect the voltage source to the frontside of the substrate, at a point outside the depletion region. Thus both frontside and backside contacts can be used for backside biasing to fully deplete the substrate. Also, high resistivity gaps around the CCD channels and electrically floating channel stop regions can be provided in the CCD array around the CCD channels. The CCD array forms an imaging sensor useful in astronomy.

  6. Fully synthetic taped insulation cables

    DOEpatents

    Forsyth, Eric B.; Muller, Albert C.

    1984-01-01

    A high voltage oil-impregnated electrical cable with fully polymer taped insulation operable to 765 kV. Biaxially oriented, specially processed, polyethylene, polybutene or polypropylene tape with an embossed pattern is wound in multiple layers over a conductive core with a permeable screen around the insulation. Conventional oil which closely matches the dielectric constant of the tape is used, and the cable can be impregnated after field installation because of its excellent impregnation characteristics.

  7. Fully synthetic taped insulation cables

    SciTech Connect

    Forsyth, E. B.; Muller, A. C.

    1984-12-11

    A high voltage oil-impregnated electrical cable with fully polymer taped insulation operable to 765 kV. Biaxially oriented, specially processed, polyethylene, polybutene or polypropylene tape with an embossed pattern is wound in multiple layers over a conductive core with a permeable screen around the insulation. Conventional oil which closely matches the dielectric constant of the tape is used, and the cable can be impregnated after field installation because of its excellent impregnation characteristics.

  8. Automated External Defibrillator

    MedlinePlus

    What Is an Automated External Defibrillator? An automated external defibrillator (AED) is a portable device that ...

  9. Automation of surface observations program

    NASA Technical Reports Server (NTRS)

    Short, Steve E.

    1988-01-01

    At present, surface weather observing methods are still largely manual and labor intensive. Through the nationwide implementation of Automated Surface Observing Systems (ASOS), this situation can be improved. Two ASOS capability levels are planned. The first is a basic-level system which will automatically observe the weather parameters essential for aviation operations and will operate either with or without supplemental contributions by an observer. The second is a more fully automated, stand-alone system which will observe and report the full range of weather parameters and will operate primarily in the unattended mode. Approximately 250 systems are planned by the end of the decade. When deployed, these systems will generate the standard hourly and special long-line transmitted weather observations, as well as provide continuous weather information direct to airport users. Specific ASOS configurations will vary depending upon whether the operation is unattended, minimally attended, or fully attended. The major functions of ASOS are data collection, data processing, product distribution, and system control. The program phases of development, demonstration, production system acquisition, and operational implementation are described.

  10. Plasmonically amplified fluorescence bioassay with microarray format

    NASA Astrophysics Data System (ADS)

    Gogalic, S.; Hageneder, S.; Ctortecka, C.; Bauch, M.; Khan, I.; Preininger, Claudia; Sauer, U.; Dostalek, J.

    2015-05-01

    Plasmonic amplification of fluorescence signal in bioassays with microarray detection format is reported. A crossed relief diffraction grating was designed to couple an excitation laser beam to surface plasmons at the wavelength overlapping with the absorption and emission bands of fluorophore Dy647 that was used as a label. The surface of periodically corrugated sensor chip was coated with surface plasmon-supporting gold layer and a thin SU8 polymer film carrying epoxy groups. These groups were employed for the covalent immobilization of capture antibodies at arrays of spots. The plasmonic amplification of fluorescence signal on the developed microarray chip was tested by using interleukin 8 sandwich immunoassay. The readout was performed ex situ after drying the chip by using a commercial scanner with high numerical aperture collecting lens. Obtained results reveal the enhancement of fluorescence signal by a factor of 5 when compared to a regular glass chip.

  11. Immobilization Techniques for Microarray: Challenges and Applications

    PubMed Central

    Nimse, Satish Balasaheb; Song, Keumsoo; Sonawane, Mukesh Digambar; Sayyed, Danishmalik Rafiq; Kim, Taisun

    2014-01-01

    The highly programmable positioning of molecules (biomolecules, nanoparticles, nanobeads, nanocomposite materials) on surfaces has potential applications in the fields of biosensors, biomolecular electronics, and nanodevices. However, the conventional techniques, including self-assembled monolayers, fail to position the molecules on the nanometer scale to produce highly organized monolayers on the surface. The present article elaborates on different techniques for the immobilization of biomolecules on the surface to produce microarrays and their diagnostic applications. The advantages and the drawbacks of various methods are compared. This article also sheds light on the applications of the different technologies for the detection and discrimination of viral/bacterial genotypes and the detection of biomarkers. A brief survey with 115 references covering the last 10 years on the biological applications of microarrays in various fields is also provided. PMID:25429408

  12. Use of microarray technologies in toxicology research.

    PubMed

    Vrana, Kent E; Freeman, Willard M; Aschner, Michael

    2003-06-01

    Microarray technology provides a unique tool for the determination of gene expression at the level of messenger RNA (mRNA). The simultaneous measurement of the entire human genome (thousands of genes) will facilitate the uncovering of specific gene expression patterns that are associated with disease. One important application of microarray technology, within the context of neurotoxicological studies, is its use as a screening tool for the identification of molecular mechanisms of toxicity. Such approaches enable researchers to identify those genes and their products (either single or whole pathways) that are involved in conferring resistance or sensitivity to toxic substances. This review addresses: (1) the potential uses of array data; (2) the various array platforms, highlighting both their advantages and disadvantages; (3) insights into data analysis and presentation strategies; and (4) concrete examples of DNA array studies in neurotoxicological research. PMID:12782098

  13. A Flexible Microarray Data Simulation Model

    PubMed Central

    Dembélé, Doulaye

    2013-01-01

    Microarray technology allows monitoring of gene expression profiling at the genome level. This is useful in order to search for genes involved in a disease. The performance of the methods used to select interesting genes is most often judged after other analyses (qPCR validation, searches in databases, etc.), which are also subject to error. A good evaluation of gene selection methods is possible with data whose characteristics are known, that is to say, synthetic data. We propose a model to simulate microarray data with similar characteristics to the data commonly produced by current platforms. The parameters used in this model are described to allow the user to generate data with varying characteristics. In order to show the flexibility of the proposed model, a commented example is given and illustrated. An R package is available for immediate use.
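    A minimal sketch of such a simulator, assuming a two-class design with Gaussian log2 intensities and a fixed fraction of differentially expressed genes (all parameter names and defaults are illustrative, not the R package's API):

```python
import random

def simulate_microarray(n_genes=1000, n_per_class=5, frac_de=0.1,
                        log2_fc=1.0, sigma=0.5, seed=0):
    """Simulate log2 expression for two classes of samples.

    A fraction frac_de of genes is shifted by log2_fc in class 2; all genes
    get a gene-specific baseline plus within-class Gaussian noise.
    Returns (matrix, de_flags), where matrix is n_genes x (2 * n_per_class).
    """
    rng = random.Random(seed)
    de = [g < int(frac_de * n_genes) for g in range(n_genes)]  # known truth
    data = []
    for g in range(n_genes):
        base = rng.gauss(8.0, 2.0)              # gene-specific baseline intensity
        shift = log2_fc if de[g] else 0.0       # differential expression, class 2 only
        row = ([rng.gauss(base, sigma) for _ in range(n_per_class)] +
               [rng.gauss(base + shift, sigma) for _ in range(n_per_class)])
        data.append(row)
    return data, de

data, de = simulate_microarray()
```

    Because the differentially expressed genes are known by construction, a gene-selection method can be scored directly against the `de` flags.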

  14. Microarrays: how many do you need?

    PubMed

    Zien, Alexander; Fluck, Juliane; Zimmer, Ralf; Lengauer, Thomas

    2003-01-01

    We estimate the number of microarrays that is required in order to gain reliable results from a common type of study: the pairwise comparison of different classes of samples. We show that current knowledge allows for the construction of models that look realistic with respect to searches for individual differentially expressed genes and derive prototypical parameters from real data sets. Such models allow investigation of the dependence of the required number of samples on the relevant parameters: the biological variability of the samples within each class, the fold changes in expression that are desired to be detected, the detection sensitivity of the microarrays, and the acceptable error rates of the results. We supply experimentalists with general conclusions as well as a freely accessible Java applet at www.scai.fhg.de/special/bio/howmanyarrays/ for fine tuning simulations to their particular settings. PMID:12935350
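    The dependence of the required number of arrays on these parameters can be sketched with a generic two-sample, normal-approximation power calculation (this is a textbook formula, not the authors' model; the function name and defaults are illustrative):

```python
from math import ceil
from statistics import NormalDist

def arrays_per_class(log2_fc, sigma, alpha=0.001, power=0.9):
    """Samples per class needed to detect a mean log2 fold change of log2_fc,
    given within-class SD sigma, via the two-sample normal approximation:
    n = 2 * ((z_{1-alpha/2} + z_{power}) * sigma / log2_fc)^2."""
    z = NormalDist()
    z_a = z.inv_cdf(1 - alpha / 2)  # two-sided type-I error threshold
    z_b = z.inv_cdf(power)          # 1 - type-II error threshold
    n = 2 * ((z_a + z_b) * sigma / log2_fc) ** 2
    return ceil(n)

# Detecting a 2-fold change (log2 fc = 1) with within-class SD 1:
n = arrays_per_class(1.0, 1.0)
```

    The formula makes the qualitative conclusions of the paper concrete: halving the detectable fold change, or doubling the biological variability, quadruples the required number of arrays.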

  15. Glycan microarrays for decoding the glycome

    PubMed Central

    Rillahan, Cory D.; Paulson, James C.

    2011-01-01

    In the last decade glycan microarrays have revolutionized the analysis of the specificity of glycan binding proteins, providing information that simultaneously illuminates the biology mediated by them and decodes the information content of the glycome. Numerous methods have emerged for arraying glycans in a ‘chip’ format, and glycan libraries have been assembled that address the diversity of the human glycome. Such arrays have been successfully used for analysis of glycan binding proteins that mediate mammalian biology, host-pathogen interactions, immune recognition of glycans relevant to vaccine production and cancer antigens. This review covers the development of glycan microarrays and applications that have provided insights into the roles of mammalian and microbial glycan binding proteins. PMID:21469953

  16. Metadata Management and Semantics in Microarray Repositories

    PubMed Central

    Kocabaş, F; Can, T; Baykal, N

    2011-01-01

    The number of microarray and other high-throughput experiments on primary repositories keeps increasing, as do the size and complexity of the results in response to biomedical investigations. Initiatives have been started on standardization of content, object model, exchange format and ontology. However, there are backlogs and an inability to exchange data between microarray repositories, which indicate that there is a great need for a standard format and data management. We have introduced a metadata framework that includes a metadata card and semantic nets that make experimental results visible, understandable and usable. These are encoded in syntax encoding schemes and represented in RDF (Resource Description Framework), can be integrated with other metadata cards and semantic nets, and can be exchanged, shared and queried. We demonstrated the performance and potential benefits through a case study on a selected microarray repository. We concluded that the backlogs can be reduced and that exchange of information and asking of knowledge discovery questions can become possible with the use of this metadata framework. PMID:24052712

  17. Development and Applications of the Lectin Microarray.

    PubMed

    Hirabayashi, Jun; Kuno, Atsushi; Tateno, Hiroaki

    2015-01-01

    The lectin microarray is an emerging technology for glycomics. It has already found maximum use in diverse fields of glycobiology by providing simple procedures for differential glycan profiling in a rapid and high-throughput manner. Since its first appearance in the literature in 2005, many application methods have been developed essentially on the same platform, comprising a series of glycan-binding proteins immobilized on an appropriate substrate such as a glass slide. Because the lectin microarray strategy does not require prior liberation of glycans from the core protein in glycoprotein analysis, it should encourage researchers not familiar with glycotechnology to use glycan analysis in future work. This feasibility should provide a broader range of experimental scientists with good opportunities to investigate novel aspects of glycoscience. Applications of the technology include not only basic sciences but also the growing fields of bio-industry. This chapter describes first the essence of glycan profiling and the basic fabrication of the lectin microarray for this purpose. In the latter part the focus is on diverse applications to both structural and functional glycomics, with emphasis on the wide applicability now available with this new technology. Finally, the importance of developing advanced lectin engineering is discussed. PMID:25821171

  18. RNAi microarray analysis in cultured mammalian cells.

    PubMed

    Mousses, Spyro; Caplen, Natasha J; Cornelison, Robert; Weaver, Don; Basik, Mark; Hautaniemi, Sampsa; Elkahloun, Abdel G; Lotufo, Roberto A; Choudary, Ashish; Dougherty, Edward R; Suh, Ed; Kallioniemi, Olli

    2003-10-01

    RNA interference (RNAi) mediated by small interfering RNAs (siRNAs) is a powerful new tool for analyzing gene knockdown phenotypes in living mammalian cells. To facilitate large-scale, high-throughput functional genomics studies using RNAi, we have developed a microarray-based technology for highly parallel analysis. Specifically, siRNAs in a transfection matrix were first arrayed on glass slides, overlaid with a monolayer of adherent cells, incubated to allow reverse transfection, and assessed for the effects of gene silencing by digital image analysis at a single cell level. Validation experiments with HeLa cells stably expressing GFP showed spatially confined, sequence-specific, time- and dose-dependent inhibition of green fluorescence for those cells growing directly on microspots containing siRNA targeting the GFP sequence. Microarray-based siRNA transfections analyzed with a custom-made quantitative image analysis system produced results that were identical to those from traditional well-based transfection, quantified by flow cytometry. Finally, to integrate experimental details, image analysis, data display, and data archiving, we developed a prototype information management system for high-throughput cell-based analyses. In summary, this RNAi microarray platform, together with ongoing efforts to develop large-scale human siRNA libraries, should facilitate genomic-scale cell-based analyses of gene function. PMID:14525932

  19. Integrating data from heterogeneous DNA microarray platforms.

    PubMed

    Valente, Eduardo; Rocha, Miguel

    2015-01-01

    DNA microarrays are one of the most used technologies for gene expression measurement. However, there are several distinct microarray platforms, from different manufacturers, each with its own measurement protocol, resulting in data that can hardly be compared or directly integrated. Data integration from multiple sources aims to improve the power of statistical tests and to mitigate the data dimensionality problem. The integration of heterogeneous DNA microarray platforms involves a set of tasks that range from the re-annotation of the features used on gene expression, to data normalization and batch effect elimination. In this work, a complete methodology for gene expression data integration and application is proposed, comprising a transcript-based re-annotation process and several methods for batch effect attenuation. The integrated data will be used to select the best feature set and learning algorithm for a brain tumor classification case study. The integration will consider data from heterogeneous Agilent and Affymetrix platforms, collected from public gene expression databases, such as The Cancer Genome Atlas and Gene Expression Omnibus. PMID:26673932
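    Batch-effect attenuation can be as simple as per-batch mean-centering, sketched below (a deliberately crude stand-in for the location/scale and empirical-Bayes methods typically evaluated in such studies; not the authors' implementation):

```python
def mean_center_batches(values, batches):
    """Attenuate batch effects for one gene by subtracting each batch's mean,
    putting all batches on a shared zero-centered scale."""
    groups = {}
    for v, b in zip(values, batches):
        groups.setdefault(b, []).append(v)
    means = {b: sum(vs) / len(vs) for b, vs in groups.items()}
    return [v - means[b] for v, b in zip(values, batches)]

# Two platforms measuring the same gene on different scales:
adjusted = mean_center_batches([1.0, 3.0, 10.0, 12.0], ["A", "A", "B", "B"])
```

    After centering, the within-batch structure is preserved while the platform-specific offset (2.0 for batch A, 11.0 for batch B) is removed; real pipelines also need variance adjustment and careful feature re-annotation beforehand.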

  20. An imputation approach for oligonucleotide microarrays.

    PubMed

    Li, Ming; Wen, Yalu; Lu, Qing; Fu, Wenjiang J

    2013-01-01

    Oligonucleotide microarrays are commonly adopted for detecting and quantifying the abundance of molecules in biological samples. Analysis of microarray data starts with recording and interpreting hybridization signals from CEL images. However, many CEL images may be blemished by noise from various sources, observed as "bright spots", "dark clouds", and "shadowy circles", etc. It is crucial that these image defects are correctly identified and properly processed. Existing approaches mainly focus on detecting defect areas and removing affected intensities. In this article, we propose to use a mixed effect model for imputing the affected intensities. The proposed imputation procedure is a single-array-based approach which does not require any biological replicate or between-array normalization. We further examine its performance by using Affymetrix high-density SNP arrays. The results show that this imputation procedure significantly reduces genotyping error rates. We also discuss the necessary adjustments for its potential extension to other oligonucleotide microarrays, such as gene expression profiling. The R source code for the implementation of the approach is freely available upon request. PMID:23505547
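    The idea of single-array imputation can be illustrated with a simplified additive row/column-effect model (a stand-in for the authors' mixed-effect model, not their method): a flagged intensity is predicted from the unflagged cells in its row and column.

```python
def impute_blemished(grid, bad):
    """Impute flagged cells of an intensity grid from row and column means,
    using only unflagged cells: value ~ row mean + column mean - grand mean.
    This is a simplified additive-effects sketch of single-array imputation."""
    ok = [(r, c) for r in range(len(grid)) for c in range(len(grid[0]))
          if (r, c) not in bad]
    grand = sum(grid[r][c] for r, c in ok) / len(ok)

    def mean(cells):
        vals = [grid[r][c] for r, c in cells]
        return sum(vals) / len(vals)

    out = [row[:] for row in grid]  # leave unflagged intensities untouched
    for r, c in bad:
        row_m = mean([(r2, c2) for r2, c2 in ok if r2 == r])
        col_m = mean([(r2, c2) for r2, c2 in ok if c2 == c])
        out[r][c] = row_m + col_m - grand
    return out

grid = [[1.0, 2.0, 3.0],
        [2.0, 3.0, 4.0],
        [3.0, 4.0, 99.0]]  # bottom-right cell hit by a "bright spot"
fixed = impute_blemished(grid, {(2, 2)})
```

    Like the paper's procedure, this needs no biological replicate and no between-array normalization, since everything is estimated from the single array itself.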

  1. [Genomic medicine. Polymorphisms and microarray applications].

    PubMed

    Spalvieri, Mónica P; Rotenberg, Rosa G

    2004-01-01

    This update shows new concepts related to the significance of DNA variations among individuals, as well as to their detection by using a new technology. The sequencing of the human genome is only the beginning of what will enable us to understand genetic diversity. The unit of DNA variability is the single nucleotide polymorphism (SNP). At present, studies on SNPs are restricted to basic research, but the large number of papers on this subject makes feasible their entrance into clinical practice. We illustrate here the use of SNPs as molecular markers in ethnical genotyping, gene expression in some diseases and as potential targets in pharmacological response, and also introduce the technology of arrays. Microarray experiments allow the quantification and comparison of gene expression on a large scale, at the same time, by using special chips and array designs. Conventional methods provide data from up to 20 genes, while a single microarray may provide information about thousands of them simultaneously, leading to more rapid and accurate genotyping. Biotechnology improvements will facilitate our knowledge of each gene sequence, the frequency and exact location of SNPs and their influence on cellular behavior. Although experimental efficiency and validity of results from microarrays are still controversial, the knowledge and characterization of a patient's genetic profile will lead, undoubtedly, to advances in prevention, diagnosis, prognosis and treatment of human diseases. PMID:15637833

  2. High-Throughput Enzyme Kinetics Using Microarrays

    SciTech Connect

    Guoxin Lu; Edward S. Yeung

    2007-11-01

    We report a microanalytical method to study enzyme kinetics. The technique involves immobilizing horseradish peroxidase on a poly-L-lysine (PLL)-coated glass slide in a microarray format, followed by applying substrate solution onto the enzyme microarray. Enzyme molecules are immobilized on the PLL-coated glass slide through electrostatic interactions, and no further modification of the enzyme or glass slide is needed. In situ detection of the products generated on the enzyme spots is made possible by monitoring the light intensity of each spot using a scientific-grade charge-coupled device (CCD). Reactions of substrate solutions of various types and concentrations can be carried out sequentially on one enzyme microarray. To account for the loss of enzyme from washing in between runs, a standard substrate solution is used for calibration. Substantially reduced amounts of substrate solution are consumed for each reaction on each enzyme spot. The Michaelis constant Km obtained by using this method is comparable to the result for homogeneous solutions. Absorbance detection allows universal monitoring, and no chemical modification of the substrate is needed. High-throughput studies of native enzyme kinetics for multiple enzymes are therefore possible in a simple, rapid, and low-cost manner.

  3. Workflow automation architecture standard

    SciTech Connect

    Moshofsky, R.P.; Rohen, W.T.

    1994-11-14

    This document presents an architectural standard for application of workflow automation technology. The standard includes a functional architecture, process for developing an automated workflow system for a work group, functional and collateral specifications for workflow automation, and results of a proof of concept prototype.

  4. DNA Microarray for Detection of Gastrointestinal Viruses

    PubMed Central

    Martínez, Miguel A.; Soto-del Río, María de los Dolores; Gutiérrez, Rosa María; Chiu, Charles Y.; Greninger, Alexander L.; Contreras, Juan Francisco; López, Susana; Arias, Carlos F.

    2014-01-01

    Gastroenteritis is a clinical illness of humans and other animals that is characterized by vomiting and diarrhea and caused by a variety of pathogens, including viruses. An increasing number of viral species have been associated with gastroenteritis or have been found in stool samples as new molecular tools have been developed. In this work, a DNA microarray capable in theory of parallel detection of more than 100 viral species was developed and tested. Initial validation was done with 10 different virus species, and an additional 5 species were validated using clinical samples. Detection limits of 1 × 10^3 virus particles of Human adenovirus C (HAdV), Human astrovirus (HAstV), and group A Rotavirus (RV-A) were established. Furthermore, when exogenous RNA was added, the limit for RV-A detection decreased by one log. In a small group of clinical samples from children with gastroenteritis (n = 76), the microarray detected at least one viral species in 92% of the samples. Single infection was identified in 63 samples (83%), and coinfection with more than one virus was identified in 7 samples (9%). The most abundant virus species were RV-A (58%), followed by Anellovirus (15.8%), HAstV (6.6%), HAdV (5.3%), Norwalk virus (6.6%), Human enterovirus (HEV) (9.2%), Human parechovirus (1.3%), Sapporo virus (1.3%), and Human bocavirus (1.3%). To further test the specificity and sensitivity of the microarray, the results were verified by reverse transcription-PCR (RT-PCR) detection of 5 gastrointestinal viruses. The RT-PCR assay detected a virus in 59 samples (78%). The microarray showed good performance for detection of RV-A, HAstV, and calicivirus, while the sensitivity for HAdV and HEV was low. Furthermore, some discrepancies in detection of mixed infections were observed and were addressed by reverse transcription-quantitative PCR (RT-qPCR) of the viruses involved. It was observed that differences in the amount of genetic material favored the detection of the most abundant

  5. Automation of industrial bioprocesses.

    PubMed

    Beyeler, W; DaPra, E; Schneider, K

    2000-01-01

    The dramatic development of new electronic devices within the last 25 years has had a substantial influence on the control and automation of industrial bioprocesses. Within this short period of time the method of controlling industrial bioprocesses has changed completely. In this paper, the authors use a practical approach focusing on the industrial applications of automation systems. Some milestones are highlighted, from the early attempts to use computers for the automation of biotechnological processes up to modern process automation systems. Special attention is given to the influence of standards and guidelines on the development of automation systems. PMID:11092132

  6. Experience of automation failures in training: effects on trust, automation bias, complacency and performance.

    PubMed

    Sauer, Juergen; Chavaillaz, Alain; Wastell, David

    2016-06-01

    This work examined the effects of operators' exposure to various types of automation failures in training. Forty-five participants were trained for 3.5 h on a simulated process control environment. During training, participants either experienced a fully reliable, automatic fault repair facility (i.e. faults detected and correctly diagnosed), a misdiagnosis-prone one (i.e. faults detected but not correctly diagnosed) or a miss-prone one (i.e. faults not detected). One week after training, participants were tested for 3 h, experiencing two types of automation failures (misdiagnosis, miss). The results showed that automation bias was very high when operators trained on miss-prone automation encountered a failure of the diagnostic system. Operator errors resulting from automation bias were much higher when automation misdiagnosed a fault than when it missed one. Differences in trust levels that were instilled by the different training experiences disappeared during the testing session. Practitioner Summary: The experience of automation failures during training has some consequences. A greater potential for operator errors may be expected when an automatic system failed to diagnose a fault than when it failed to detect one. PMID:26374396

  7. Assessing Statistical Significance in Microarray Experiments Using the Distance Between Microarrays

    PubMed Central

    Hayden, Douglas; Lazar, Peter; Schoenfeld, David

    2009-01-01

    We propose permutation tests based on the pairwise distances between microarrays to compare location, variability, or equivalence of gene expression between two populations. For these tests the entire microarray or some pre-specified subset of genes is the unit of analysis. The pairwise distances only have to be computed once so the procedure is not computationally intensive despite the high dimensionality of the data. An R software package, permtest, implementing the method is freely available from the Comprehensive R Archive Network at http://cran.r-project.org. PMID:19529777
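The scheme described in this abstract — whole arrays as the unit of analysis, distances computed once, labels permuted — can be sketched in a few lines. The following is a hypothetical Python illustration (not the `permtest` R package itself); the statistic and function name are assumptions for the sketch:

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform

def distance_permutation_test(X, labels, n_perm=2000, seed=0):
    """Permutation test for a location difference between two groups,
    treating each whole microarray (a row of X) as the unit of analysis.
    The pairwise distance matrix is computed only once; each permutation
    merely re-indexes it."""
    rng = np.random.default_rng(seed)
    D = squareform(pdist(X))          # n x n pairwise distances, computed once
    labels = np.asarray(labels)

    def stat(lab):
        # mean between-group distance minus mean within-group distance
        between = D[np.ix_(lab == 0, lab == 1)].mean()
        within_sum = (D[np.ix_(lab == 0, lab == 0)].sum()
                      + D[np.ix_(lab == 1, lab == 1)].sum())
        n0, n1 = (lab == 0).sum(), (lab == 1).sum()
        return between - within_sum / (n0 * (n0 - 1) + n1 * (n1 - 1))

    observed = stat(labels)
    perms = [stat(rng.permutation(labels)) for _ in range(n_perm)]
    p = (1 + sum(s >= observed for s in perms)) / (n_perm + 1)
    return observed, p
```

Because only the label vector is shuffled, the cost per permutation is independent of the number of genes, which is what keeps the procedure cheap despite the high dimensionality.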

  8. Automated ILA design for synchronous sequential circuits

    NASA Technical Reports Server (NTRS)

    Liu, M. N.; Liu, K. Z.; Maki, G. K.; Whitaker, S. R.

    1991-01-01

    An iterative logic array (ILA) architecture for synchronous sequential circuits is presented. This technique utilizes linear algebra to produce the design equations. The ILA realization of synchronous sequential logic can be fully automated with a computer program. A programmable design procedure is proposed to fulfill the design task and layout generation. A software algorithm in the C language has been developed and tested to generate 1 micron CMOS layouts using the Hewlett-Packard FUNGEN module generator shell.

  9. Automated simulated distillation using an articulated laboratory robot system.

    PubMed

    Berry, W F; Giarrocco, V

    1994-01-01

    An automated method, based on the Hewlett-Packard ORCA (Optimized Robot for Chemical Analysis) system, for sample preparation and analysis of petroleum samples by simulated distillation (SIMDIS) is described. Results obtained for the robotically prepared samples show excellent agreement with those obtained from the same samples prepared manually. The application, based on ASTM method D 2887, is the foundation for a more fully automated system that can perform a variety of SIMDIS samples and methods. PMID:18924992
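The core calculation behind simulated distillation (SIMDIS) can be illustrated compactly: a calibration mixture of n-alkanes maps GC retention time to boiling point, and the cumulative detector area gives the percent of sample recovered at each boiling point. The sketch below is a simplified illustration of that logic (all names are assumptions); the actual ASTM D 2887 procedure additionally specifies baseline correction, solvent handling, and start/end-of-elution rules:

```python
import numpy as np

def simdis_percentiles(ret_times, signal, calib_rt, calib_bp, cuts=(10, 50, 90)):
    """Convert a GC chromatogram into boiling-point percentiles.
    ret_times/signal: the sample chromatogram.
    calib_rt/calib_bp: retention times and boiling points of the
    n-alkane calibration standards (both monotonically increasing)."""
    area = np.cumsum(signal)
    pct_off = 100.0 * area / area[-1]                 # % of sample recovered
    bp = np.interp(ret_times, calib_rt, calib_bp)     # RT -> boiling point
    return {c: float(np.interp(c, pct_off, bp)) for c in cuts}
```

For example, a flat chromatogram over a linear calibration yields a linear boiling-point distribution, so the 50% cut falls at the midpoint of the calibrated range.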

  10. Identification of significant features in DNA microarray data

    PubMed Central

    Bair, Eric

    2013-01-01

    DNA microarrays are a relatively new technology that can simultaneously measure the expression level of thousands of genes. They have become an important tool for a wide variety of biological experiments. One of the most common goals of DNA microarray experiments is to identify genes associated with biological processes of interest. Conventional statistical tests often produce poor results when applied to microarray data owing to small sample sizes, noisy data, and correlation among the expression levels of the genes. Thus, novel statistical methods are needed to identify significant genes in DNA microarray experiments. This article discusses the challenges inherent in DNA microarray analysis and describes a series of statistical techniques that can be used to overcome these challenges. The problem of multiple hypothesis testing and its relation to microarray studies are also considered, along with several possible solutions. PMID:24244802
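One widely used answer to the multiple-hypothesis-testing problem raised here is to control the false discovery rate rather than the family-wise error rate. As a minimal illustration (not code from the article), the Benjamini-Hochberg step-up procedure can be sketched as:

```python
import numpy as np

def benjamini_hochberg(pvals, alpha=0.05):
    """Benjamini-Hochberg step-up procedure: returns a boolean mask of the
    hypotheses rejected while controlling the false discovery rate at alpha.
    Rejects the k smallest p-values, where k is the largest index with
    p_(k) <= (k/m) * alpha."""
    p = np.asarray(pvals, dtype=float)
    m = len(p)
    order = np.argsort(p)
    ranked = p[order]
    below = ranked <= (np.arange(1, m + 1) / m) * alpha
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])
        reject[order[: k + 1]] = True
    return reject
```

With thousands of genes tested simultaneously, this keeps the expected fraction of false positives among the reported genes at or below `alpha`, a far less conservative criterion than a Bonferroni correction.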

  11. High-throughput allogeneic antibody detection using protein microarrays.

    PubMed

    Paul, Jed; Sahaf, Bita; Perloff, Spenser; Schoenrock, Kelsi; Wu, Fang; Nakasone, Hideki; Coller, John; Miklos, David

    2016-05-01

    Enzyme-linked immunosorbent assays (ELISAs) have traditionally been used to detect alloantibodies in patient plasma samples post hematopoietic cell transplantation (HCT); however, protein microarrays have the potential to be multiplexed, more sensitive, and higher throughput than ELISAs. Here, we describe the development of a novel and sensitive microarray method for detection of allogeneic antibodies against minor histocompatibility antigens encoded on the Y chromosome, called HY antigens. Six microarray surfaces were tested for their ability to bind recombinant protein and peptide HY antigens. Significant allogeneic immune responses were determined in male patients with female donors by considering normal male donor responses as baseline. HY microarray results were also compared with our previous ELISA results. Our overall goal was to maximize antibody detection for both recombinant protein and peptide epitopes. For detection of HY antigens, the Epoxy (Schott) protein microarray surface was both most sensitive and reliable and has become the standard surface in our microarray platform. PMID:26902899

  12. Towards Automated Annotation of Benthic Survey Images: Variability of Human Experts and Operational Modes of Automation.

    PubMed

    Beijbom, Oscar; Edmunds, Peter J; Roelfsema, Chris; Smith, Jennifer; Kline, David I; Neal, Benjamin P; Dunlap, Matthew J; Moriarty, Vincent; Fan, Tung-Yung; Tan, Chih-Jui; Chan, Stephen; Treibitz, Tali; Gamst, Anthony; Mitchell, B Greg; Kriegman, David

    2015-01-01

    Global climate change and other anthropogenic stressors have heightened the need to rapidly characterize ecological changes in marine benthic communities across large scales. Digital photography enables rapid collection of survey images to meet this need, but the subsequent image annotation is typically a time-consuming, manual task. We investigated the feasibility of using automated point-annotation to expedite cover estimation of the 17 dominant benthic categories from survey images captured at four Pacific coral reefs. Inter- and intra-annotator variability among six human experts was quantified and compared to semi- and fully-automated annotation methods, which are made available at coralnet.ucsd.edu. Our results indicate high expert agreement for identification of coral genera, but lower agreement for algal functional groups, in particular between turf algae and crustose coralline algae. This indicates the need for unequivocal definitions of algal groups, careful training of multiple annotators, and enhanced imaging technology. Semi-automated annotation, where 50% of the annotation decisions were performed automatically, yielded cover estimate errors comparable to those of the human experts. Furthermore, fully-automated annotation yielded rapid, unbiased cover estimates but with increased variance. These results show that automated annotation can increase spatial coverage and decrease time and financial outlay for image-based reef surveys. PMID:26154157
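This summary does not specify how the semi-automated 50/50 split between machine and human decisions is chosen; a common scheme is to accept the classifier's most confident predictions automatically and queue the rest for expert review. The sketch below is a hypothetical illustration of that routing step (the function name and threshold scheme are assumptions, not the paper's method):

```python
import numpy as np

def split_annotations(confidences, auto_fraction=0.5):
    """Route each point-annotation either to the classifier or to a human:
    the most confident `auto_fraction` of predictions are accepted
    automatically; the rest are queued for expert review.
    Returns a boolean mask: True = automated, False = sent to a human."""
    conf = np.asarray(confidences, dtype=float)
    n_auto = int(round(auto_fraction * len(conf)))
    order = np.argsort(conf)[::-1]       # most confident first
    auto = np.zeros(len(conf), dtype=bool)
    auto[order[:n_auto]] = True
    return auto
```

Because classification errors concentrate in the low-confidence predictions, routing only those to humans is what lets a 50% automated workload achieve error rates comparable to fully manual annotation.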

  13. Towards Automated Annotation of Benthic Survey Images: Variability of Human Experts and Operational Modes of Automation

    PubMed Central

    Beijbom, Oscar; Edmunds, Peter J.; Roelfsema, Chris; Smith, Jennifer; Kline, David I.; Neal, Benjamin P.; Dunlap, Matthew J.; Moriarty, Vincent; Fan, Tung-Yung; Tan, Chih-Jui; Chan, Stephen; Treibitz, Tali; Gamst, Anthony; Mitchell, B. Greg; Kriegman, David

    2015-01-01

    Global climate change and other anthropogenic stressors have heightened the need to rapidly characterize ecological changes in marine benthic communities across large scales. Digital photography enables rapid collection of survey images to meet this need, but the subsequent image annotation is typically a time-consuming, manual task. We investigated the feasibility of using automated point-annotation to expedite cover estimation of the 17 dominant benthic categories from survey images captured at four Pacific coral reefs. Inter- and intra-annotator variability among six human experts was quantified and compared to semi- and fully-automated annotation methods, which are made available at coralnet.ucsd.edu. Our results indicate high expert agreement for identification of coral genera, but lower agreement for algal functional groups, in particular between turf algae and crustose coralline algae. This indicates the need for unequivocal definitions of algal groups, careful training of multiple annotators, and enhanced imaging technology. Semi-automated annotation, where 50% of the annotation decisions were performed automatically, yielded cover estimate errors comparable to those of the human experts. Furthermore, fully-automated annotation yielded rapid, unbiased cover estimates but with increased variance. These results show that automated annotation can increase spatial coverage and decrease time and financial outlay for image-based reef surveys. PMID:26154157

  14. Fully analogue photonic reservoir computer.

    PubMed

    Duport, François; Smerieri, Anteo; Akrout, Akram; Haelterman, Marc; Massar, Serge

    2016-01-01

    Introduced a decade ago, reservoir computing is an efficient approach for signal processing. State-of-the-art capabilities have already been demonstrated with both computer simulations and physical implementations. While photonic reservoir computing appears to be a promising solution for ultrafast nontrivial computing, all the implementations presented up to now require digital pre- or post-processing, which prevents them from exploiting their full potential, in particular in terms of processing speed. We address here the possibility of eliminating both digital pre- and post-processing simultaneously. The standalone fully analogue reservoir computer resulting from our endeavour is compared to previous experiments and exhibits only rather limited degradation of performance. Our experiment constitutes a proof of concept for standalone physical reservoir computers. PMID:26935166
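The principle that makes analogue hardware implementations of reservoir computing feasible is that the recurrent network itself is fixed and random; only a linear readout is trained. As a minimal digital illustration of what such hardware computes (an echo-state sketch with assumed names and parameters, not the authors' photonic system):

```python
import numpy as np

def run_reservoir(u, n_res=100, spectral_radius=0.9, seed=0):
    """Drive a fixed random recurrent network (the 'reservoir') with the
    scalar input sequence u and collect its states. Nothing here is trained."""
    rng = np.random.default_rng(seed)
    W_in = rng.uniform(-0.5, 0.5, n_res)
    W = rng.normal(0.0, 1.0, (n_res, n_res))
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))  # echo-state scaling
    x = np.zeros(n_res)
    states = np.empty((len(u), n_res))
    for t, ut in enumerate(u):
        x = np.tanh(W @ x + W_in * ut)
        states[t] = x
    return states

def train_readout(states, target, ridge=1e-6):
    """The only trained component: a linear readout fit by ridge regression."""
    return np.linalg.solve(states.T @ states + ridge * np.eye(states.shape[1]),
                           states.T @ target)
```

On a short-term memory task (reproduce the input delayed by one step), the trained readout recovers the target almost perfectly, which illustrates why the heavy lifting can be delegated to a fixed physical medium.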

  15. Singularities in fully developed turbulence

    NASA Astrophysics Data System (ADS)

    Shivamoggi, Bhimsen K.

    2015-09-01

    Phenomenological arguments are used to explore finite-time singularity (FTS) development in different physical fully-developed turbulence (FDT) situations. Effects of spatial intermittency and fluid compressibility in three-dimensional (3D) FDT and the role of the divorticity amplification mechanism in two-dimensional (2D) FDT and quasi-geostrophic FDT and the advection-diffusion mechanism in magnetohydrodynamic turbulence are considered to provide physical insights into the FTS development in variant cascade physics situations. The quasi-geostrophic FDT results connect with the 2D FDT results in the barotropic limit while they connect with 3D FDT results in the baroclinic limit and hence apparently provide a bridge between 2D and 3D.

  16. Fully analogue photonic reservoir computer

    PubMed Central

    Duport, François; Smerieri, Anteo; Akrout, Akram; Haelterman, Marc; Massar, Serge

    2016-01-01

    Introduced a decade ago, reservoir computing is an efficient approach for signal processing. State-of-the-art capabilities have already been demonstrated with both computer simulations and physical implementations. While photonic reservoir computing appears to be a promising solution for ultrafast nontrivial computing, all the implementations presented up to now require digital pre- or post-processing, which prevents them from exploiting their full potential, in particular in terms of processing speed. We address here the possibility of eliminating both digital pre- and post-processing simultaneously. The standalone fully analogue reservoir computer resulting from our endeavour is compared to previous experiments and exhibits only rather limited degradation of performance. Our experiment constitutes a proof of concept for standalone physical reservoir computers. PMID:26935166

  17. Automated DNA Sequencing System

    SciTech Connect

    Armstrong, G.A.; Ekkebus, C.P.; Hauser, L.J.; Kress, R.L.; Mural, R.J.

    1999-04-25

    Oak Ridge National Laboratory (ORNL) is developing a core DNA sequencing facility to support biological research endeavors at ORNL and to conduct basic sequencing automation research. This facility is novel because its development is based on existing standard biology laboratory equipment; thus, the development process is of interest to the many small laboratories trying to use automation to control costs and increase throughput. Before automation, biology laboratory personnel purified DNA, completed cycle sequencing, and prepared 96-well sample plates with commercially available hardware designed specifically for each step in the process. Following purification and thermal cycling, an automated sequencing machine was used for the sequencing. A technician handled all movement of the 96-well sample plates between machines. To automate the process, ORNL is adding a CRS Robotics A-465 arm, ABI 377 sequencing machine, automated centrifuge, automated refrigerator, and possibly an automated SpeedVac. The entire system will be integrated with one central controller that will direct each machine and the robot. The goal of this system is to completely automate the sequencing procedure from bacterial cell samples through ready-to-be-sequenced DNA and ultimately to completed sequence. The system will be flexible and will accommodate different chemistries than existing automated sequencing lines. The system will be expanded in the future to include colony picking and/or actual sequencing. This discrete-event DNA sequencing system will demonstrate that smaller sequencing labs can automate cost-effectively as the laboratory grows.

  18. Prenatal chromosomal microarray for the Catholic physician

    PubMed Central

    Bringman, Jay J.

    2014-01-01

    Prenatal chromosomal microarray (CMA) is a test that is used to diagnose certain genetic problems in the fetus. While the test has been used in the pediatric setting for several years, it is now being introduced for use in the prenatal setting. The test offers great hope for detection of certain genetic defects in the fetus so that early intervention can be performed to improve the outcome for that individual. As with many biotechnical advances, CMA comes with certain bioethical issues that need to be addressed prior to its implementation. This paper is intended to provide guidance to all those that provide counseling regarding genetic testing options during pregnancy. PMID:24899750

  19. Protein Microarrays--Without a Trace

    SciTech Connect

    Camarero, J A

    2007-04-05

    Many experimental approaches in biology and biophysics, as well as applications in diagnosis and drug discovery, require proteins to be immobilized on solid supports. Protein microarrays, for example, provide a high-throughput format to study biomolecular interactions. The technique employed for protein immobilization is a key to the success of these applications. Recent biochemical developments are allowing, for the first time, the selective and traceless immobilization of proteins generated by cell-free systems without the need for purification and/or reconcentration prior to the immobilization step.

  20. ProMAT: protein microarray analysis tool

    SciTech Connect

    White, Amanda M.; Daly, Don S.; Varnum, Susan M.; Anderson, Kevin K.; Bollinger, Nikki; Zangar, Richard C.

    2006-04-04

    Summary: ProMAT is a software tool for statistically analyzing data from ELISA microarray experiments. The software estimates standard curves, sample protein concentrations and their uncertainties for multiple assays. ProMAT generates a set of comprehensive figures for assessing results and diagnosing process quality. The tool is available for Windows or Mac, and is distributed as open-source Java and R code. Availability: ProMAT is available at http://www.pnl.gov/statistics/ProMAT. ProMAT requires Java version 1.5.0 and R version 1.9.1 (or more recent versions), which are distributed with the tool.
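Standard-curve estimation of the kind this record describes is commonly done with a four-parameter logistic (4PL) model: fit the curve to standards of known concentration, then invert it to back-calculate sample concentrations from their signals. The sketch below illustrates that general approach in Python with SciPy; the 4PL choice and all function names are assumptions for illustration, not ProMAT's actual implementation:

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    """Four-parameter logistic: a = response at zero concentration,
    d = response at infinite concentration, c = inflection point,
    b = slope factor."""
    return d + (a - d) / (1.0 + (x / c) ** b)

def fit_standard_curve(conc, signal):
    """Fit a 4PL standard curve to known concentrations and signals."""
    p0 = [signal.min(), 1.0, float(np.median(conc)), signal.max()]
    params, _ = curve_fit(four_pl, conc, signal, p0=p0, maxfev=10000)
    return params

def invert_curve(signal, a, b, c, d):
    """Back-calculate a sample concentration from its observed signal."""
    return c * ((a - d) / (signal - d) - 1.0) ** (1.0 / b)
```

In practice the fit is repeated per assay, and the covariance returned by `curve_fit` is what propagates into the concentration uncertainties a tool like ProMAT reports.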