Science.gov

Sample records for fully automated microarray

  1. Fully automated analysis of multi-resolution four-channel micro-array genotyping data

    NASA Astrophysics Data System (ADS)

    Abbaspour, Mohsen; Abugharbieh, Rafeef; Podder, Mohua; Tebbutt, Scott J.

    2006-03-01

We present a fully automated and robust microarray image analysis system for handling multi-resolution images (down to 3-micron resolution, with sizes up to 80 MB per channel). The system was developed to provide rapid and accurate data extraction for our recently developed microarray analysis and quality control tool (SNP Chart). Currently available commercial microarray image analysis applications are inefficient, due to the considerable user interaction typically required. Four-channel DNA microarray technology is a robust and accurate tool for determining the genotypes of multiple genetic markers in individuals. It plays an important role in the ongoing shift from traditional medical treatments towards personalized genetic medicine, i.e. individualized therapy based on the patient's genetic heritage. However, fast, robust, and precise image processing tools are required before microarray-based genetic testing can be used in clinical practice to predict disease susceptibilities and drug effects, which demands a turnaround time compatible with clinical decision-making. In this paper we present a fully automated image analysis platform for the rapid investigation of hundreds of genetic variations across multiple genes. Validation tests indicate very high accuracy for the genotyping results. Our method achieves a significant reduction in analysis time, from several hours to just a few minutes, and is completely automated, requiring no manual interaction or guidance.
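The record above describes automated extraction of per-spot intensities for SNP genotyping from multi-channel images. As a rough illustration of the final calling step only (not the authors' pipeline), the sketch below makes a genotype call from two allele-specific channel intensities using an assumed log-ratio threshold.

```python
import numpy as np

def call_genotype(allele_a: float, allele_b: float, ratio_threshold: float = 1.5) -> str:
    """Toy two-channel genotype call from background-corrected spot intensities.

    allele_a, allele_b: median spot intensities for the two allele-specific
    channels. The threshold and the reduction to two channels are assumptions
    made for illustration, not the system's actual calling rule.
    """
    if allele_a <= 0 or allele_b <= 0:
        return "no-call"
    log_ratio = np.log2(allele_a / allele_b)
    if log_ratio > np.log2(ratio_threshold):
        return "AA"
    if log_ratio < -np.log2(ratio_threshold):
        return "BB"
    return "AB"

# Balanced signal in both allele channels reads as heterozygous.
print(call_genotype(5200.0, 4800.0))   # -> "AB"
print(call_genotype(9000.0, 400.0))    # -> "AA"
```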

  2. Fully Automated Complementary DNA Microarray Segmentation using a Novel Fuzzy-based Algorithm.

    PubMed

    Saberkari, Hamidreza; Bahrami, Sheyda; Shamsi, Mousa; Amoshahy, Mohammad Javad; Ghavifekr, Habib Badri; Sedaaghi, Mohammad Hossein

    2015-01-01

DNA microarrays are a powerful approach for studying the expression of thousands of genes simultaneously in a single experiment. The average fluorescent intensity of each spot can be calculated in a microarray experiment, and these intensity values closely reflect the expression level of the corresponding gene. However, determining the correct position of every spot in microarray images is a major challenge, and a prerequisite for the accurate classification of normal and abnormal (cancer) cells. In this paper, a preprocessing step first eliminates the noise and artifacts present in the microarray image using nonlinear anisotropic diffusion filtering. The center coordinates of each spot are then located using mathematical morphology operations. Finally, the position of each spot is determined precisely by applying a novel hybrid model based on principal component analysis and the spatial fuzzy c-means (SFCM) clustering algorithm. Using a Gaussian kernel in the SFCM algorithm improves the quality of complementary DNA microarray segmentation. The performance of the proposed algorithm was evaluated on real microarray images available from the Stanford Microarray Database. Results show that the segmentation accuracy of the proposed algorithm reaches 100% and 98% for noiseless and noisy cells, respectively.
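The pipeline above combines anisotropic diffusion, morphology, PCA, and spatial fuzzy c-means (SFCM). The sketch below shows only the SFCM idea in a simplified form, with the spatial term taken as a local membership average; the paper's kernelized PCA hybrid and its parameter choices are not reproduced.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def spatial_fcm(image, n_clusters=2, m=2.0, n_iter=50, p=1.0, q=1.0):
    """Simplified spatial fuzzy c-means for foreground/background spot segmentation.

    Intensity-based memberships are re-weighted by their local average (the
    spatial function), which suppresses isolated noisy pixels. Illustrative
    sketch only; cluster count, window size, and exponents are assumptions.
    """
    x = image.astype(float).ravel()
    rng = np.random.default_rng(0)
    u = rng.random((n_clusters, x.size))
    u /= u.sum(axis=0)
    for _ in range(n_iter):
        um = u ** m
        centers = um @ x / um.sum(axis=1)                     # cluster centroids
        d = np.abs(x[None, :] - centers[:, None]) + 1e-12     # pixel-to-centroid distances
        u = 1.0 / (d ** (2.0 / (m - 1.0)))                    # standard FCM membership (unnormalized)
        u /= u.sum(axis=0)
        # spatial function: local average of each membership map in a 5x5 window
        h = np.stack([uniform_filter(ui.reshape(image.shape), size=5).ravel() for ui in u])
        u = (u ** p) * (h ** q)
        u /= u.sum(axis=0)
    labels = u.argmax(axis=0).reshape(image.shape)
    return labels, centers
```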

  3. Automated analytical microarrays: a critical review.

    PubMed

    Seidel, Michael; Niessner, Reinhard

    2008-07-01

Microarrays provide a powerful analytical tool for the simultaneous detection of multiple analytes in a single experiment. The specific affinity reactions of nucleic acids (hybridization) and of antibodies towards antigens are the most common bioanalytical methods for generating multiplexed quantitative results. Nucleic acid-based analysis is restricted to the detection of cells and viruses. Antibodies are more universal biomolecular receptors that selectively bind to small molecules such as pesticides, small toxins, and pharmaceuticals, as well as to biopolymers (e.g. toxins, allergens) and complex biological structures like bacterial cells and viruses. By producing an appropriate antibody, the corresponding antigenic analyte can be detected on a multiplexed immunoanalytical microarray. Food and water analysis, along with clinical diagnostics, constitute potential application fields for multiplexed analysis. Diverse fluorescence, chemiluminescence, electrochemical, and label-free microarray readout systems have been developed in the last decade. Some of them are constructed as flow-through microarrays by combination with a fluidic system. Microarrays have the potential to become widely accepted as a system for analytical applications, provided that robust and validated results can be generated on fully automated platforms. This review gives an overview of current research on microarrays, with a focus on automated systems and quantitative multiplexed applications.

  4. Automated Microarray Image Analysis Toolbox for MATLAB

    SciTech Connect

    White, Amanda M.; Daly, Don S.; Willse, Alan R.; Protic, Miroslava; Chandler, Darrell P.

    2005-09-01

The Automated Microarray Image Analysis (AMIA) Toolbox for MATLAB is a flexible, open-source microarray image analysis tool that allows the user to customize the analysis of sets of microarray images. The tool provides several methods for identifying and quantifying spot statistics, as well as extensive diagnostic statistics and images to identify poor-quality data or processing problems. The open nature of this software allows researchers to understand the algorithms used to produce intensity estimates and to modify them easily if desired.

  5. Fully automated urban traffic system

    NASA Technical Reports Server (NTRS)

    Dobrotin, B. M.; Hansen, G. R.; Peng, T. K. C.; Rennels, D. A.

    1977-01-01

The replacement of the driver with an automatic system that could guide and route a vehicle while matching a human's ability to respond to changing traffic demands is discussed. The problem was divided into four technological areas: guidance, routing, computing, and communications. It was determined that the latter three areas were already being developed independently of any need for fully automated urban traffic. A guidance system that would meet the system requirements was not being developed, but was judged technically feasible.

  6. Fully automated solid weighing workstation.

    PubMed

    Wong, Stephen K-F; Lu, YiFeng; Heineman, William; Palmer, Janice; Courtney, Carter

    2005-08-01

A fully automated, solid-to-solid weighing workstation (patent pending) is described in this article. The core of this automated process is the use of an electrostatically charged pipette tip to attract solid particles onto its outside surface. The particles were then dislodged into a 1.2-mL destination vial in a microbalance by spinning the pipette tip. Sample textures that could be weighed included powder, crystalline, liquid, and semi-solid substances. The workstation can pick up submilligram quantities of sample (≥0.3 mg) from source vials containing as little as 1 mg. The destination vials containing the samples were stored in a 96-well rack to enable subsequent automated liquid handling. Using bovine serum albumin as the test solid, the coefficient of variation of the protein concentration for 48 samples was less than 6%. The workstation was used successfully to weigh out 48 different synthetic compounds. The time required for automated weighing was similar to that for manual weighing, but the workstation reduced hands-on time, and thus exposure to potentially toxic compounds, by 90%. In addition, it minimized sample waste and reduced artifacts due to the poor solubility of compounds in solvents. Moreover, it enabled compounds synthesized in milligram quantities to be weighed out and tested in biological assays.

  7. Progress in Fully Automated Abdominal CT Interpretation

    PubMed Central

    Summers, Ronald M.

    2016-01-01

OBJECTIVE Automated analysis of abdominal CT has advanced markedly over just the last few years. Fully automated assessments of organs, lymph nodes, adipose tissue, muscle, bowel, spine, and tumors are some examples where tremendous progress has been made. Computer-aided detection of lesions has also improved dramatically. CONCLUSION This article reviews that progress and provides insights into what is in store in the near future for automated analysis of abdominal CT, ultimately leading to fully automated interpretation. PMID:27101207

  8. Fully integrated, fully automated generation of short tandem repeat profiles

    PubMed Central

    2013-01-01

Background The generation of short tandem repeat profiles, also referred to as ‘DNA typing’, is not currently performed outside the laboratory because the process requires highly skilled technical operators and a controlled laboratory environment and infrastructure with several specialized instruments. The goal of this work was to develop a fully integrated system for the automated generation of short tandem repeat profiles from buccal swab samples, to improve forensic laboratory process flow as well as to enable short tandem repeat profile generation to be performed in police stations and in field-forward military, intelligence, and homeland security settings. Results An integrated system was developed consisting of an injection-molded microfluidic BioChipSet cassette, a ruggedized instrument, and expert system software. For each of five buccal swabs, the system purifies DNA using guanidinium-based lysis and silica binding, amplifies 15 short tandem repeat loci and the amelogenin locus, electrophoretically separates the resulting amplicons, and generates a profile. No operator processing of the samples is required, and the time from swab insertion to profile generation is 84 minutes. All required reagents are contained within the BioChipSet cassette; these consist of a lyophilized polymerase chain reaction mix and liquids for purification and electrophoretic separation. Profiles obtained from fully automated runs demonstrate that the integrated system generates concordant short tandem repeat profiles. The system exhibits single-base resolution from 100 to greater than 500 bases, and inter-run precision with a standard deviation of ±0.05-0.10 bases for most alleles. The reagents are stable for at least 6 months at 22°C, and the instrument has been designed and tested to Military Standard 810F for shock and vibration ruggedization. A nontechnical user can operate the system within or outside the laboratory. Conclusions The integrated system represents the

  9. Fully automated determination of pesticides in wine.

    PubMed

    Kaufmann, A

    1997-01-01

A fully automated solid-phase extraction gas chromatographic/mass spectrometric (SPE/GC/MS) method was developed for the determination of pesticides in wine. All steps, from aspiration of the filtered wine to printout of the integrated chromatogram, were performed without human interaction. A dedicated robot performed the addition of the internal standard, application of the wine onto the SPE cartridge, elution of the analytes, drying and concentration of the eluate, and transfer of the concentrate to the GC sampler. All steps were performed in standard liquid chromatography/GC vials, using a minimum of organic solvent. The method permits the determination of 21 different pesticides, with individual detection limits of 0.005-0.01 mg/L. The regression coefficients for linearity were >0.99; only 4,4'-dichlorobenzophenone and dicofol showed lower coefficients. Recoveries for 17 pesticides ranged from 80 to 115%.

  10. Peptide nucleic acids (PNAs) patterning by an automated microarray synthesis system through photolithography.

    PubMed

    Wu, Yan-Qi; Yang, Fei-Peng; Wang, Hong-Yin; Liu, Jian-Xin; Liu, Zheng-Chun

    2013-03-01

A peptide nucleic acid (PNA) microarray assembled from hundreds of unique PNA oligomers is regarded as a strong new competitor to the DNA chip in gene analysis. However, PNA microarrays remain costly because of the difficult and laborious chemical synthesis they require. Here, we have developed a fully automated synthesizer for PNA microarrays based on photolithography. A preactivation mixer was designed and integrated into the synthesizer to eliminate the tedious manual preactivation step and to increase the coupling efficiency of the PNA monomers. A PNA patterning experiment was carried out to check the performance of the automated synthesizer; it showed that an exposure time of 3 min was sufficient for complete removal of the o-nitroveratryloxycarbonyl (NVOC) groups from the synthetic sites with the help of the photosensitizer isopropylthioxanthone, and that the stepwise yield was about 98.0%, comparable to that obtained with conventional fluorenylmethyloxycarbonyl (FMOC) chemistry. These results demonstrate the capability of this fully automated synthesizer to fabricate high-quality PNA microarrays.

  11. A microfluidic device for the automated electrical readout of low-density glass-slide microarrays.

    PubMed

    Díaz-González, María; Salvador, J Pablo; Bonilla, Diana; Marco, M Pilar; Fernández-Sánchez, César; Baldi, Antoni

    2015-12-15

Microarrays are a powerful platform for rapid and multiplexed analysis in a wide range of research fields. Electrical readout systems have emerged as an alternative to conventional optical methods for microarray analysis thanks to potential advantages such as low cost, low power consumption, and easy miniaturization of the required instrumentation. In this work an automated electrical readout system for low-cost glass-slide microarrays is described. The system enables the simultaneous conductimetric detection of up to 36 biorecognition events by incorporating an array of interdigitated electrode transducers. A polydimethylsiloxane microfluidic structure creates microwells over the transducers and incorporates the microfluidic channels required for filling and draining them with readout and cleaning solutions, making the readout process fully automated. Because the capture biomolecules are not immobilized on the transducer surface, this readout system is reusable, in contrast to previously reported electrochemical microarrays. A low-density microarray based on a competitive enzymatic immunoassay for atrazine detection was used to test the performance of the readout system. The electrical assay shows a detection limit of 0.22 ± 0.03 μg L⁻¹, similar to that obtained with fluorescent detection, and allows the direct determination of the pesticide in polluted water samples. These results show that an electrical readout system such as the one presented here is a reliable and cost-effective alternative to fluorescence scanners for the analysis of low-density microarrays.

  12. Fully Mechanically Controlled Automated Electron Microscopic Tomography

    PubMed Central

    Liu, Jinxin; Li, Hongchang; Zhang, Lei; Rames, Matthew; Zhang, Meng; Yu, Yadong; Peng, Bo; Celis, César Díaz; Xu, April; Zou, Qin; Yang, Xu; Chen, Xuefeng; Ren, Gang

    2016-01-01

Knowledge of the three-dimensional (3D) structure of each individual particle of asymmetric and flexible proteins is essential for understanding those proteins' functions, but such structures are difficult to determine. Electron tomography (ET) provides a tool for imaging a single, unique biological object from a series of tilt angles, but it is challenging to image a single protein for 3D reconstruction because of the imperfect mechanical control of the specimen goniometer under both medium-to-high magnification (approximately 50,000–160,000×) and optimized beam coherence conditions. Here, we report a fully mechanical control method for automating ET data acquisition without using beam tilt/shift processes. This method avoids the accumulation of beam tilt/shift adjustments that were previously used to compensate for mechanical control errors but degraded beam coherence. Our method minimizes the error in the target object's center position during the tilting process through a closed-loop proportional-integral (PI) control algorithm. Validations by both negative staining (NS) and cryo-electron microscopy (cryo-EM) suggest that this method tracks target proteins as well as other ET methods while maintaining optimized beam coherence conditions for imaging. PMID:27403922
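The abstract centers on a closed-loop proportional-integral (PI) controller that keeps the tracked particle at the image center using stage motion alone. A minimal discrete PI sketch is given below; the gains, units, and error-measurement step are assumptions for illustration, not the calibrated values used on the authors' microscope.

```python
from dataclasses import dataclass

@dataclass
class PIController:
    """Discrete proportional-integral controller (illustrative gains)."""
    kp: float = 0.6      # proportional gain
    ki: float = 0.1      # integral gain
    integral: float = 0.0

    def update(self, error_nm: float, dt_s: float) -> float:
        """Return a stage correction (nm) for the measured centering error."""
        self.integral += error_nm * dt_s
        return self.kp * error_nm + self.ki * self.integral

# After each tilt step: measure the target's offset from the image center
# (e.g. by cross-correlation), then apply the returned mechanical correction.
ctrl_x = PIController()
correction_nm = ctrl_x.update(error_nm=120.0, dt_s=1.0)
```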

  13. Fully Mechanically Controlled Automated Electron Microscopic Tomography

    NASA Astrophysics Data System (ADS)

    Liu, Jinxin; Li, Hongchang; Zhang, Lei; Rames, Matthew; Zhang, Meng; Yu, Yadong; Peng, Bo; Celis, César Díaz; Xu, April; Zou, Qin; Yang, Xu; Chen, Xuefeng; Ren, Gang

    2016-07-01

Knowledge of the three-dimensional (3D) structure of each individual particle of asymmetric and flexible proteins is essential for understanding those proteins' functions, but such structures are difficult to determine. Electron tomography (ET) provides a tool for imaging a single, unique biological object from a series of tilt angles, but it is challenging to image a single protein for 3D reconstruction because of the imperfect mechanical control of the specimen goniometer under both medium-to-high magnification (approximately 50,000–160,000×) and optimized beam coherence conditions. Here, we report a fully mechanical control method for automating ET data acquisition without using beam tilt/shift processes. This method avoids the accumulation of beam tilt/shift adjustments that were previously used to compensate for mechanical control errors but degraded beam coherence. Our method minimizes the error in the target object's center position during the tilting process through a closed-loop proportional-integral (PI) control algorithm. Validations by both negative staining (NS) and cryo-electron microscopy (cryo-EM) suggest that this method tracks target proteins as well as other ET methods while maintaining optimized beam coherence conditions for imaging.

  14. Fully Mechanically Controlled Automated Electron Microscopic Tomography.

    PubMed

    Liu, Jinxin; Li, Hongchang; Zhang, Lei; Rames, Matthew; Zhang, Meng; Yu, Yadong; Peng, Bo; Celis, César Díaz; Xu, April; Zou, Qin; Yang, Xu; Chen, Xuefeng; Ren, Gang

    2016-01-01

Knowledge of the three-dimensional (3D) structure of each individual particle of asymmetric and flexible proteins is essential for understanding those proteins' functions, but such structures are difficult to determine. Electron tomography (ET) provides a tool for imaging a single, unique biological object from a series of tilt angles, but it is challenging to image a single protein for 3D reconstruction because of the imperfect mechanical control of the specimen goniometer under both medium-to-high magnification (approximately 50,000–160,000×) and optimized beam coherence conditions. Here, we report a fully mechanical control method for automating ET data acquisition without using beam tilt/shift processes. This method avoids the accumulation of beam tilt/shift adjustments that were previously used to compensate for mechanical control errors but degraded beam coherence. Our method minimizes the error in the target object's center position during the tilting process through a closed-loop proportional-integral (PI) control algorithm. Validations by both negative staining (NS) and cryo-electron microscopy (cryo-EM) suggest that this method tracks target proteins as well as other ET methods while maintaining optimized beam coherence conditions for imaging. PMID:27403922

  15. Semiautomated inspection versus fully automated inspection of lyophilized products.

    PubMed

    Seidenader, N W

    1994-01-01

The development of fully automated inspection systems for parenteral products has created high expectations for productivity and quality improvements. However, not all products and production situations are suited to automation. A guideline for inspection and automation strategies is discussed, structuring the field of lyophilized products according to the critical decision parameters.

  16. A home-built, fully automated observatory

    NASA Astrophysics Data System (ADS)

    Beales, M.

    2010-12-01

    This paper describes the design of an automated observatory making use of off-the-shelf components and software. I make no claims for originality in the design but it has been an interesting and rewarding exercise to get all the components to work together.

  17. Fully automated three-dimensional microscopy system

    NASA Astrophysics Data System (ADS)

    Kerschmann, Russell L.

    2000-04-01

Tissue-scale structures such as vessel networks are imaged at micron resolution with the Virtual Tissue System (VT System). VT System imaging of cubic millimeters of tissue and other material extends the capabilities of conventional volumetric techniques such as confocal microscopy, and allows for the first time the integrated 2D and 3D analysis of important tissue structural relationships. The VT System eliminates the need for glass slide-mounted tissue sections and instead captures images directly from the surface of a block containing the sample. Tissues are stained en bloc with fluorochrome compounds, embedded in an optically conditioned polymer that suppresses image signals from deep within the block, and serially sectioned for imaging. Thousands of fully registered 2D images are captured digitally and automatically, completely converting tissue samples into blocks of high-resolution information. The resulting multi-gigabyte data sets constitute the raw material for precision visualization and analysis. Cellular function may be seen in a larger anatomical context. VT System technology makes tissue metrics, accurate cell enumeration, and cell-cycle analyses possible while preserving the full histologic context.

  18. Evaluation of a novel automated allergy microarray platform compared with three other allergy test methods.

    PubMed

    Williams, P; Önell, A; Baldracchini, F; Hui, V; Jolles, S; El-Shanawany, T

    2016-04-01

    Microarray platforms, enabling simultaneous measurement of many allergens with a small serum sample, are potentially powerful tools in allergy diagnostics. We report here the first study comparing a fully automated microarray system, the Microtest allergy system, with a manual microarray platform, Immuno-Solid phase Allergen Chip (ISAC), and two well-established singleplex allergy tests, skin prick test (SPT) and ImmunoCAP, all tested on the same patients. One hundred and three adult allergic patients attending the allergy clinic were included into the study. All patients were tested with four allergy test methods (SPT, ImmunoCAP, Microtest and ISAC 112) and a total of 3485 pairwise test results were analysed and compared. The four methods showed comparable results with a positive/negative agreement of 81-88% for any pair of test methods compared, which is in line with data in the literature. The most prevalent allergens (cat, dog, mite, timothy, birch and peanut) and their individual allergen components revealed an agreement between methods with correlation coefficients between 0·73 and 0·95. All four methods revealed deviating individual patient results for a minority of patients. These results indicate that microarray platforms are efficient and useful tools to characterize the specific immunoglobulin (Ig)E profile of allergic patients using a small volume of serum sample. The results produced by the Microtest system were in agreement with diagnostic tests in current use. Further data collection and evaluation are needed for other populations, geographical regions and allergens.

  19. Evaluation of a novel automated allergy microarray platform compared with three other allergy test methods.

    PubMed

    Williams, P; Önell, A; Baldracchini, F; Hui, V; Jolles, S; El-Shanawany, T

    2016-04-01

    Microarray platforms, enabling simultaneous measurement of many allergens with a small serum sample, are potentially powerful tools in allergy diagnostics. We report here the first study comparing a fully automated microarray system, the Microtest allergy system, with a manual microarray platform, Immuno-Solid phase Allergen Chip (ISAC), and two well-established singleplex allergy tests, skin prick test (SPT) and ImmunoCAP, all tested on the same patients. One hundred and three adult allergic patients attending the allergy clinic were included into the study. All patients were tested with four allergy test methods (SPT, ImmunoCAP, Microtest and ISAC 112) and a total of 3485 pairwise test results were analysed and compared. The four methods showed comparable results with a positive/negative agreement of 81-88% for any pair of test methods compared, which is in line with data in the literature. The most prevalent allergens (cat, dog, mite, timothy, birch and peanut) and their individual allergen components revealed an agreement between methods with correlation coefficients between 0·73 and 0·95. All four methods revealed deviating individual patient results for a minority of patients. These results indicate that microarray platforms are efficient and useful tools to characterize the specific immunoglobulin (Ig)E profile of allergic patients using a small volume of serum sample. The results produced by the Microtest system were in agreement with diagnostic tests in current use. Further data collection and evaluation are needed for other populations, geographical regions and allergens. PMID:26437695

  20. Flexible automated platform for blood group genotyping on DNA microarrays.

    PubMed

    Paris, Sandra; Rigal, Dominique; Barlet, Valérie; Verdier, Martine; Coudurier, Nicole; Bailly, Pascal; Brès, Jean-Charles

    2014-05-01

    The poor suitability of standard hemagglutination-based assay techniques for large-scale automated screening of red blood cell antigens severely limits the ability of blood banks to supply extensively phenotype-matched blood. With better understanding of the molecular basis of blood antigens, it is now possible to predict blood group phenotype by identifying single-nucleotide polymorphisms in genomic DNA. Development of DNA-typing assays for antigen screening in blood donation qualification laboratories promises to enable blood banks to provide optimally matched donations. We have designed an automated genotyping system using 96-well DNA microarrays for blood donation screening and a first panel of eight single-nucleotide polymorphisms to identify 16 alleles in four blood group systems (KEL, KIDD, DUFFY, and MNS). Our aim was to evaluate this system on 960 blood donor samples with known phenotype. Study data revealed a high concordance rate (99.92%; 95% CI, 99.77%-99.97%) between predicted and serologic phenotypes. These findings demonstrate that our assay using a simple protocol allows accurate, relatively low-cost phenotype prediction at the DNA level. This system could easily be configured with other blood group markers for identification of donors with rare blood types or blood units for IH panels or antigens from other systems. PMID:24726279

  1. Towards A Fully Automated High-Throughput Phototransfection System

    PubMed Central

    Cappelleri, David J.; Halasz, Adam; Sul, Jai-Yoon; Kim, Tae Kyung; Eberwine, James; Kumar, Vijay

    2010-01-01

We have designed and implemented a framework for creating a fully automated high-throughput phototransfection system. Integrated image processing, laser target position calculation, and stage movements show a throughput increase of >23× over the current manual phototransfection method, and the potential for even greater throughput improvements (>110×) is described. A software tool for automated off-line single-cell morphological measurements, as well as real-time image segmentation analysis, has also been constructed and shown to be able to quantify changes in the cell before and after the process, successfully characterizing them using metrics such as cell perimeter, area, major and minor axis length, and eccentricity. PMID:20706617
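The morphology metrics listed at the end of the abstract (perimeter, area, axis lengths, eccentricity) map directly onto standard region properties. A small sketch of how such measurements could be computed from a binary cell mask is shown below; it assumes the segmentation itself is produced upstream and is not the authors' tool.

```python
import numpy as np
from skimage.measure import label, regionprops

def cell_morphology(mask: np.ndarray) -> list[dict]:
    """Per-cell morphology metrics from a binary segmentation mask."""
    props = regionprops(label(mask))          # one entry per connected cell region
    return [
        {
            "area": p.area,
            "perimeter": p.perimeter,
            "major_axis": p.major_axis_length,
            "minor_axis": p.minor_axis_length,
            "eccentricity": p.eccentricity,
        }
        for p in props
    ]
```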

  2. A fully automated robotic system for high throughput fermentation.

    PubMed

    Zimmermann, Hartmut F; Rieth, Jochen

    2007-03-01

High-throughput robotic systems have been used since the 1990s to carry out biochemical assays in microtiter plates. However, before such systems can be applied to industrial fermentation process development, some important specific demands must be taken into account: sufficient oxygen supply, optimal growth temperature, minimized sample evaporation, avoidance of contamination, and simple but reliable process monitoring. A fully automated solution that addresses all of these aspects is presented.

  3. Automated prostate cancer diagnosis and Gleason grading of tissue microarrays

    NASA Astrophysics Data System (ADS)

    Tabesh, Ali; Kumar, Vinay P.; Pang, Ho-Yuen; Verbel, David; Kotsianti, Angeliki; Teverovskiy, Mikhail; Saidi, Olivier

    2005-04-01

We present results on the development of an automated system for prostate cancer diagnosis and Gleason grading. Images of representative areas of the original Hematoxylin-and-Eosin (H&E)-stained tissue retrieved from each patient, either from a tissue microarray (TMA) core or a whole section, were captured and analyzed. The image sets consisted of 367 and 268 color images for the diagnosis and Gleason grading problems, respectively. In diagnosis, the goal is to classify a tissue image into tumor versus non-tumor classes. In Gleason grading, which characterizes tumor aggressiveness, the objective is to classify a tissue image as coming from either a low- or high-grade tumor. Several feature sets were computed from each image: (i) color channel histograms, (ii) fractal dimension features, (iii) fractal code features, (iv) wavelet features, and (v) color, shape, and texture features computed using Aureon Biosciences' MAGIC system. Linear and quadratic Gaussian classifiers were used together with a greedy search feature selection algorithm. For cancer diagnosis, a classification accuracy of 94.5% was obtained on an independent test set. For Gleason grading, the accuracy of classification into low- and high-grade classes on an independent test set was 77.6%.
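The classification stage pairs linear/quadratic Gaussian classifiers with greedy feature selection. The following sketch shows that generic combination with a quadratic discriminant and cross-validated forward selection; it is not the study's exact protocol, features, or validation scheme.

```python
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

def greedy_forward_selection(X: np.ndarray, y: np.ndarray, max_features: int = 10):
    """Greedy forward selection wrapped around a quadratic Gaussian classifier.

    At each step the feature whose addition gives the best cross-validated
    accuracy is kept; selection stops when no feature improves the score.
    """
    selected, remaining = [], list(range(X.shape[1]))
    best_score = 0.0
    while remaining and len(selected) < max_features:
        scores = {
            f: cross_val_score(QuadraticDiscriminantAnalysis(),
                               X[:, selected + [f]], y, cv=5).mean()
            for f in remaining
        }
        f_best = max(scores, key=scores.get)
        if scores[f_best] <= best_score:
            break                              # no further improvement
        best_score = scores[f_best]
        selected.append(f_best)
        remaining.remove(f_best)
    return selected, best_score
```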

  4. Fully Automated Lipid Pool Detection Using Near Infrared Spectroscopy.

    PubMed

    Pociask, Elżbieta; Jaworek-Korjakowska, Joanna; Malinowski, Krzysztof Piotr; Roleder, Tomasz; Wojakowski, Wojciech

    2016-01-01

Background. Detecting and identifying vulnerable plaque, which is prone to rupture, remains a challenge for cardiologists. Such lipid core-containing plaque cannot be identified by routine angiography, which has motivated the development of NIRS-IVUS, a tool that can characterize plaque in terms of its chemical and morphologic properties, and of new methods for interpreting the data it produces. In this study, an algorithm for fully automated lipid pool detection in NIRS images is proposed. Method. The designed algorithm is divided into four stages: preprocessing (image enhancement), segmentation of artifacts, detection of lipid areas, and calculation of the Lipid Core Burden Index (LCBI). Results. A total of 31 NIRS chemograms were analyzed by two methods. The metrics total LCBI, maximal LCBI in 4-mm blocks, and maximal LCBI in 2-mm blocks were calculated to compare the presented algorithm with a commercially available system. Both intraclass correlation (ICC) and Bland-Altman plots showed good agreement and correlation between the methods. Conclusions. The proposed algorithm provides fully automated lipid pool detection in near-infrared spectroscopy images. It is a tool developed for offline data analysis, which can easily be extended with new functions for future projects. PMID:27610191
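The metrics compared here (total LCBI, maximal LCBI in 4-mm and 2-mm blocks) follow from the commonly used definition of the Lipid Core Burden Index as lipid-positive valid chemogram pixels per 1000 valid pixels. The sketch below computes LCBI and a windowed maximum under that assumed definition; the probability threshold and frame spacing are placeholders, not the paper's values.

```python
import numpy as np

def lcbi(lipid_prob: np.ndarray, valid: np.ndarray, threshold: float = 0.6) -> float:
    """LCBI over a chemogram region: lipid-positive valid pixels per 1000 valid pixels."""
    n_valid = valid.sum()
    return 1000.0 * (lipid_prob[valid] > threshold).sum() / max(n_valid, 1)

def max_lcbi(lipid_prob: np.ndarray, valid: np.ndarray,
             mm_per_frame: float, window_mm: float = 4.0) -> float:
    """Maximum LCBI over any axial window of the given length (e.g. maxLCBI in 4-mm blocks).

    lipid_prob and valid are (pullback frames x circumferential bins) arrays.
    """
    window = max(int(round(window_mm / mm_per_frame)), 1)
    n_frames = lipid_prob.shape[0]
    return max(
        lcbi(lipid_prob[i:i + window], valid[i:i + window])
        for i in range(max(n_frames - window + 1, 1))
    )
```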

  5. Fully Automated Lipid Pool Detection Using Near Infrared Spectroscopy

    PubMed Central

    Wojakowski, Wojciech

    2016-01-01

Background. Detecting and identifying vulnerable plaque, which is prone to rupture, remains a challenge for cardiologists. Such lipid core-containing plaque cannot be identified by routine angiography, which has motivated the development of NIRS-IVUS, a tool that can characterize plaque in terms of its chemical and morphologic properties, and of new methods for interpreting the data it produces. In this study, an algorithm for fully automated lipid pool detection in NIRS images is proposed. Method. The designed algorithm is divided into four stages: preprocessing (image enhancement), segmentation of artifacts, detection of lipid areas, and calculation of the Lipid Core Burden Index (LCBI). Results. A total of 31 NIRS chemograms were analyzed by two methods. The metrics total LCBI, maximal LCBI in 4-mm blocks, and maximal LCBI in 2-mm blocks were calculated to compare the presented algorithm with a commercially available system. Both intraclass correlation (ICC) and Bland-Altman plots showed good agreement and correlation between the methods. Conclusions. The proposed algorithm provides fully automated lipid pool detection in near-infrared spectroscopy images. It is a tool developed for offline data analysis, which can easily be extended with new functions for future projects. PMID:27610191

  6. Fully Automated Lipid Pool Detection Using Near Infrared Spectroscopy

    PubMed Central

    Wojakowski, Wojciech

    2016-01-01

Background. Detecting and identifying vulnerable plaque, which is prone to rupture, remains a challenge for cardiologists. Such lipid core-containing plaque cannot be identified by routine angiography, which has motivated the development of NIRS-IVUS, a tool that can characterize plaque in terms of its chemical and morphologic properties, and of new methods for interpreting the data it produces. In this study, an algorithm for fully automated lipid pool detection in NIRS images is proposed. Method. The designed algorithm is divided into four stages: preprocessing (image enhancement), segmentation of artifacts, detection of lipid areas, and calculation of the Lipid Core Burden Index (LCBI). Results. A total of 31 NIRS chemograms were analyzed by two methods. The metrics total LCBI, maximal LCBI in 4-mm blocks, and maximal LCBI in 2-mm blocks were calculated to compare the presented algorithm with a commercially available system. Both intraclass correlation (ICC) and Bland-Altman plots showed good agreement and correlation between the methods. Conclusions. The proposed algorithm provides fully automated lipid pool detection in near-infrared spectroscopy images. It is a tool developed for offline data analysis, which can easily be extended with new functions for future projects.

  7. Design and development of a microarray processing station (MPS) for automated miniaturized immunoassays.

    PubMed

    Pla-Roca, Mateu; Altay, Gizem; Giralt, Xavier; Casals, Alícia; Samitier, Josep

    2016-08-01

Here we describe the design and evaluation of a fluidic device for the automatic processing of microarrays, called the microarray processing station (MPS). Once installed on a commercial microarrayer, the microarray processing station automates the washing and drying steps, which are often performed manually. The substrate on which the assay occurs remains in place during the microarray printing, incubation, and processing steps, so nanoliter volumes of the distinct immunoassay reagents, such as capture and detection antibodies and samples, can be addressed to the same coordinates of the substrate in perfect alignment, without any additional mechanical or optical re-alignment. This allows independent immunoassays to be performed in a single microarray spot. PMID:27405464

  8. Fully automated algorithm for wound surface area assessment.

    PubMed

    Deana, Alessandro Melo; de Jesus, Sérgio Henrique Costa; Sampaio, Brunna Pileggi Azevedo; Oliveira, Marcelo Tavares; Silva, Daniela Fátima Teixeira; França, Cristiane Miranda

    2013-01-01

Worldwide, clinicians, dentists, nurses, researchers, and other health professionals need to monitor wound healing progress and to quantify the rate of wound closure. The aim of this study is to demonstrate, step by step, a fully automated numerical method for estimating the size of a wound and the percentage of damaged area relative to the body surface area (BSA) from images, without the requirement for human intervention. The formula for BSA in rats is included in the algorithm. The methodology was validated on experimental wounds and human ulcers and was compared with the analysis of an experienced pathologist, with good agreement. The algorithm is therefore suitable for experimental wounds, burns, and human ulcers, as these have high contrast with the adjacent normal skin.
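The percentage-of-BSA output depends on converting a segmented wound area to physical units and dividing by a body surface area estimate. The sketch below uses Meeh's formula for rat BSA with an assumed constant; the paper's exact formula and calibration are not reproduced.

```python
def percent_bsa_rat(wound_pixels: int, mm_per_pixel: float, weight_g: float,
                    meeh_k: float = 9.1) -> float:
    """Wound area as a percentage of a rat's body surface area.

    Wound area comes from the segmented pixel count and the image scale; BSA
    uses Meeh's formula BSA[cm^2] = k * weight[g]^(2/3). The constant k for
    rats (~9.1 here) varies between sources, so treat it as an assumption.
    """
    wound_cm2 = wound_pixels * (mm_per_pixel ** 2) / 100.0   # mm^2 -> cm^2
    bsa_cm2 = meeh_k * weight_g ** (2.0 / 3.0)
    return 100.0 * wound_cm2 / bsa_cm2

# Example: 12,000 wound pixels at 0.1 mm/pixel on a 300 g rat.
print(percent_bsa_rat(12000, 0.1, 300.0))
```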

  9. Fully automated adipose tissue measurement on abdominal CT

    NASA Astrophysics Data System (ADS)

    Yao, Jianhua; Sussman, Daniel L.; Summers, Ronald M.

    2011-03-01

Obesity has become widespread in America and is a risk factor for many illnesses. Adipose tissue (AT) content, especially visceral AT (VAT), is an important indicator of risk for many disorders, including heart disease and diabetes. Measuring AT by traditional means is often unreliable and inaccurate, whereas CT provides a means to measure AT accurately and consistently. We present a fully automated method to segment and measure abdominal AT in CT. Our method integrates image preprocessing that corrects for image artifacts and inhomogeneities. We use fuzzy c-means to cluster AT regions and active contour models to separate subcutaneous and visceral AT. We tested our method on 50 abdominal CT scans and evaluated the correlations between several measurements.
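As a rough stand-in for the fuzzy c-means plus active-contour separation described above, the sketch below thresholds fat by a typical CT attenuation window and splits visceral from subcutaneous fat by eroding the body mask inward past the subcutaneous layer; the HU window and erosion depth are assumptions, not the authors' parameters.

```python
import numpy as np
from scipy.ndimage import binary_fill_holes, binary_erosion

def adipose_masks(ct_slice_hu: np.ndarray, body_mask: np.ndarray,
                  hu_range=(-190, -30), erosion_iter: int = 15):
    """Split adipose tissue into approximate visceral and subcutaneous masks.

    Fat = voxels inside the body whose attenuation falls in an assumed adipose
    HU window; the visceral compartment is approximated by the eroded interior
    of the filled body mask, the subcutaneous rim by its complement.
    """
    fat = (ct_slice_hu >= hu_range[0]) & (ct_slice_hu <= hu_range[1]) & body_mask
    inner = binary_erosion(binary_fill_holes(body_mask), iterations=erosion_iter)
    vat = fat & inner          # fat inside the (approximate) abdominal wall
    sat = fat & ~inner         # fat in the subcutaneous rim
    return vat, sat
```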

  10. Fully automated low-cost setup for fringe projection profilometry.

    PubMed

    Rivera-Ortega, Uriel; Dirckx, Joris; Meneses-Fabian, Cruz

    2015-02-20

    In this paper an alternative low-cost, easy-to-use, and fully automated profilometry setup is proposed. The setup is based on a phase-shifting fringe projection technique with four projected fringe parameters. It uses the well-known triangulation arrangement and low-cost electronic and image acquisition components such as a data acquisition board, a motor controller board, a printer rail, a CMOS webcam, and an LCD projector. The position of the camera, the generation of the fringe pattern, the acquisition of the images, and the calculation of the wrapped and unwrapped phase are all performed in LabVIEW. The setup is portable and can be perfectly adapted to be used in other profilometry techniques such as electronic speckle pattern interferometry and laser scanning profilometry. PMID:25968198

  11. Fully automated diabetic retinopathy screening using morphological component analysis.

    PubMed

    Imani, Elaheh; Pourreza, Hamid-Reza; Banaee, Touka

    2015-07-01

Diabetic retinopathy is the major cause of blindness in the world. It has been shown that early diagnosis can play a major role in the prevention of visual loss and blindness. This diagnosis can be made through regular screening and timely treatment. Moreover, automation of this process can significantly reduce the workload of ophthalmologists and alleviate inter- and intra-observer variability. This paper provides a fully automated diabetic retinopathy screening system with the ability to assess retinal image quality. The novelty of the proposed method lies in the use of the Morphological Component Analysis (MCA) algorithm to discriminate between normal and pathological retinal structures. To this end, a pre-screening algorithm is first used to assess the quality of the retinal images. If the quality of an image is not satisfactory, it is examined by an ophthalmologist and recaptured if necessary. Otherwise, the image is processed for diabetic retinopathy detection. In this stage, the normal and pathological structures of the retinal image are separated by the MCA algorithm. Finally, normal and abnormal retinal images are distinguished by statistical features of the retinal lesions. Our proposed system achieved 92.01% sensitivity and 95.45% specificity on the Messidor dataset, which is a remarkable result in comparison with previous work.

  12. Fully Automated Cloud-Drift Winds in NESDIS Operations.

    NASA Astrophysics Data System (ADS)

    Nieman, Steven J.; Menzel, W. Paul; Hayden, Christopher M.; Gray, Donald; Wanzong, Steven T.; Velden, Christopher S.; Daniels, Jaime

    1997-06-01

    Cloud-drift winds have been produced from geostationary satellite data in the Western Hemisphere since the early 1970s. During the early years, winds were used as an aid for the short-term forecaster in an era when numerical forecasts were often of questionable quality, especially over oceanic regions. Increased computing resources over the last two decades have led to significant advances in the performance of numerical forecast models. As a result, continental forecasts now stand to gain little from the inspection or assimilation of cloud-drift wind fields. However, the oceanic data void remains, and although numerical forecasts in such areas have improved, they still suffer from a lack of in situ observations. During the same two decades, the quality of geostationary satellite data has improved considerably, and the cloud-drift wind production process has also benefited from increased computing power. As a result, fully automated wind production is now possible, yielding cloud-drift winds whose quality and quantity is sufficient to add useful information to numerical model forecasts in oceanic and coastal regions. This article will detail the automated cloud-drift wind production process, as operated by the National Environmental Satellite Data and Information Service within the National Oceanic and Atmospheric Administration.

  13. A fully scalable online pre-processing algorithm for short oligonucleotide microarray atlases

    PubMed Central

    Lahti, Leo; Torrente, Aurora; Elo, Laura L.; Brazma, Alvis; Rung, Johan

    2013-01-01

Rapid accumulation of large and standardized microarray data collections is opening up novel opportunities for holistic characterization of genome function. The limited scalability of current preprocessing techniques has, however, formed a bottleneck to full utilization of these data resources. Although short oligonucleotide arrays constitute a major source of genome-wide profiling data, scalable probe-level techniques have been available for only a few platforms, based on pre-calculated probe effects from restricted reference training sets. To overcome these key limitations, we introduce a fully scalable online-learning algorithm for probe-level analysis and pre-processing of large microarray atlases involving tens of thousands of arrays. In contrast to the alternatives, our algorithm scales up linearly with sample size and is applicable to all short oligonucleotide platforms. The model can use the most comprehensive data collections available to date to pinpoint individual probes affected by noise and biases, providing tools to guide array design and quality control. This is the only available algorithm that can learn probe-level parameters through sequential hyperparameter updates over small consecutive batches of data, thus circumventing the extensive memory requirements of standard approaches and opening up novel opportunities to take full advantage of contemporary microarray collections. PMID:23563154
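The key property claimed is that probe-level parameters are learned from small consecutive batches of arrays, so no array has to be revisited and memory use stays flat. The sketch below caricatures that idea with running per-probe moment updates; the paper's actual probabilistic model (probe affinities and noise hyperparameters) is richer than this.

```python
import numpy as np

class OnlineProbeModel:
    """Sequential per-probe mean/variance estimates updated in small batches.

    Each batch of (n_arrays x n_probes) log-intensities updates sufficient
    statistics via Welford's online algorithm; probes with persistently high
    variance can be flagged as noisy without reloading earlier arrays.
    """
    def __init__(self, n_probes: int):
        self.n = 0
        self.mean = np.zeros(n_probes)
        self.m2 = np.zeros(n_probes)              # running sum of squared deviations

    def update(self, batch: np.ndarray) -> None:
        for row in batch:                          # one array at a time
            self.n += 1
            delta = row - self.mean
            self.mean += delta / self.n
            self.m2 += delta * (row - self.mean)

    @property
    def variance(self) -> np.ndarray:
        return self.m2 / max(self.n - 1, 1)
```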

  14. PLIP: fully automated protein-ligand interaction profiler.

    PubMed

    Salentin, Sebastian; Schreiber, Sven; Haupt, V Joachim; Adasme, Melissa F; Schroeder, Michael

    2015-07-01

The characterization of interactions in protein-ligand complexes is essential for research in structural bioinformatics, drug discovery, and biology. However, comprehensive tools are not freely available to the research community. Here, we present the protein-ligand interaction profiler (PLIP), a novel web service for fully automated detection and visualization of relevant non-covalent protein-ligand contacts in 3D structures, freely available at projects.biotec.tu-dresden.de/plip-web. The input is either a Protein Data Bank structure, a protein or ligand name, or a custom protein-ligand complex (e.g. from docking). In contrast to other tools, the rule-based PLIP algorithm does not require any structure preparation. It returns a list of detected interactions at the single-atom level, covering seven interaction types (hydrogen bonds, hydrophobic contacts, pi-stacking, pi-cation interactions, salt bridges, water bridges, and halogen bonds). PLIP stands out by offering publication-ready images, PyMOL session files for generating custom images, and parsable result files to facilitate downstream data processing. The full Python source code is available for download on the website. PLIP's command-line mode allows for high-throughput interaction profiling.

  15. PLIP: fully automated protein-ligand interaction profiler.

    PubMed

    Salentin, Sebastian; Schreiber, Sven; Haupt, V Joachim; Adasme, Melissa F; Schroeder, Michael

    2015-07-01

The characterization of interactions in protein-ligand complexes is essential for research in structural bioinformatics, drug discovery, and biology. However, comprehensive tools are not freely available to the research community. Here, we present the protein-ligand interaction profiler (PLIP), a novel web service for fully automated detection and visualization of relevant non-covalent protein-ligand contacts in 3D structures, freely available at projects.biotec.tu-dresden.de/plip-web. The input is either a Protein Data Bank structure, a protein or ligand name, or a custom protein-ligand complex (e.g. from docking). In contrast to other tools, the rule-based PLIP algorithm does not require any structure preparation. It returns a list of detected interactions at the single-atom level, covering seven interaction types (hydrogen bonds, hydrophobic contacts, pi-stacking, pi-cation interactions, salt bridges, water bridges, and halogen bonds). PLIP stands out by offering publication-ready images, PyMOL session files for generating custom images, and parsable result files to facilitate downstream data processing. The full Python source code is available for download on the website. PLIP's command-line mode allows for high-throughput interaction profiling. PMID:25873628

  16. Fully automated grey and white matter spinal cord segmentation

    PubMed Central

    Prados, Ferran; Cardoso, M. Jorge; Yiannakas, Marios C.; Hoy, Luke R.; Tebaldi, Elisa; Kearney, Hugh; Liechti, Martina D.; Miller, David H.; Ciccarelli, Olga; Wheeler-Kingshott, Claudia A. M. Gandini; Ourselin, Sebastien

    2016-01-01

Axonal loss in the spinal cord is one of the main contributing factors to irreversible clinical disability in multiple sclerosis (MS). In vivo, axonal loss can be assessed indirectly by estimating the reduction over time of the cervical cross-sectional area (CSA) of the spinal cord, which is indicative of spinal cord atrophy; such a measure may be obtained by means of image segmentation using magnetic resonance imaging (MRI). In this work, we propose a new fully automated spinal cord segmentation technique that incorporates two different multi-atlas segmentation propagation and fusion techniques: the Optimized PatchMatch Label fusion (OPAL) algorithm for localizing and approximately segmenting the spinal cord, and the Similarity and Truth Estimation for Propagated Segmentations (STEPS) algorithm for segmenting white and grey matter simultaneously. In a retrospective analysis of MRI data, the proposed method provided CSA measurements with accuracy equivalent to the inter-rater variability, with a Dice score (DSC) of 0.967 at the C2/C3 level. The segmentation performance for grey matter at the C2/C3 level was also close to the inter-rater variability, reaching an accuracy (DSC) of 0.826 for healthy subjects and 0.835 for people with clinically isolated syndrome. PMID:27786306
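The accuracies above are reported as Dice similarity coefficients. For reference, a minimal implementation of that overlap measure between a candidate segmentation and a reference mask:

```python
import numpy as np

def dice_score(seg: np.ndarray, ref: np.ndarray) -> float:
    """Dice similarity coefficient between two binary masks (1.0 = perfect overlap)."""
    seg, ref = seg.astype(bool), ref.astype(bool)
    denom = seg.sum() + ref.sum()
    return 2.0 * np.logical_and(seg, ref).sum() / denom if denom else 1.0
```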

  17. A fully automated TerraSAR-X based flood service

    NASA Astrophysics Data System (ADS)

    Martinis, Sandro; Kersten, Jens; Twele, André

    2015-06-01

    In this paper, a fully automated processing chain for near real-time flood detection using high resolution TerraSAR-X Synthetic Aperture Radar (SAR) data is presented. The processing chain including SAR data pre-processing, computation and adaption of global auxiliary data, unsupervised initialization of the classification as well as post-classification refinement by using a fuzzy logic-based approach is automatically triggered after satellite data delivery. The dissemination of flood maps resulting from this service is performed through an online service which can be activated on-demand for emergency response purposes (i.e., when a flood situation evolves). The classification methodology is based on previous work of the authors but was substantially refined and extended for robustness and transferability to guarantee high classification accuracy under different environmental conditions and sensor configurations. With respect to accuracy and computational effort, experiments performed on a data set of 175 different TerraSAR-X scenes acquired during flooding all over the world with different sensor configurations confirm the robustness and effectiveness of the proposed flood mapping service. These promising results have been further confirmed by means of an in-depth validation performed for three study sites in Germany, Thailand, and Albania/Montenegro.
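The unsupervised initialization of the classification can be pictured as tile-based automatic thresholding of the SAR backscatter, keeping only tiles that look bimodal. The sketch below illustrates that generic idea with Otsu thresholds and an assumed bimodality criterion; it is not the paper's exact split-based method and omits the fuzzy-logic post-classification refinement.

```python
import numpy as np
from skimage.filters import threshold_otsu

def flood_mask(sigma0_db: np.ndarray, tile: int = 200, bimodality_gap: float = 3.0) -> np.ndarray:
    """Unsupervised initialization of a SAR flood mask via tile-based thresholding.

    Thresholds are estimated only on tiles whose two Otsu classes are clearly
    separated (gap in dB is an assumption), then averaged into one global
    threshold; low backscatter is labelled water.
    """
    thresholds = []
    for i in range(0, sigma0_db.shape[0] - tile + 1, tile):
        for j in range(0, sigma0_db.shape[1] - tile + 1, tile):
            patch = sigma0_db[i:i + tile, j:j + tile]
            t = threshold_otsu(patch)
            lo, hi = patch[patch <= t], patch[patch > t]
            if lo.size and hi.size and hi.mean() - lo.mean() > bimodality_gap:
                thresholds.append(t)
    t_global = np.mean(thresholds) if thresholds else threshold_otsu(sigma0_db)
    return sigma0_db <= t_global
```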

  18. Participation through Automation: Fully Automated Critical PeakPricing in Commercial Buildings

    SciTech Connect

    Piette, Mary Ann; Watson, David S.; Motegi, Naoya; Kiliccote,Sila; Linkugel, Eric

    2006-06-20

California electric utilities have been exploring the use of dynamic critical peak prices (CPP) and other demand response (DR) programs to help reduce peaks in customer electric loads. CPP is a tariff design to promote demand response. Levels of automation in DR can be defined as follows: Manual Demand Response involves a potentially labor-intensive approach such as manually turning off equipment or changing comfort set points at each switch or controller; Semi-Automated Demand Response involves a pre-programmed demand response strategy initiated by a person via a centralized control system; Fully Automated Demand Response does not involve human intervention, but is initiated at a home, building, or facility through receipt of an external communications signal, which triggers pre-programmed demand response strategies. The authors refer to this as Auto-DR. This paper describes the development, testing, and results from automated CPP (Auto-CPP) as part of a utility project in California. The paper presents the project description and test methodology, followed by a discussion of the Auto-DR strategies used in the field test buildings, a sample Auto-CPP load shape case study, and a selection of the Auto-CPP response data from September 29, 2005. If all twelve sites had reached their maximum savings simultaneously, a total of approximately 2 MW of DR would have been available from these twelve sites, which represent about two million ft². The average DR was about half that value, at about 1 MW. These savings translate to about 0.5 to 1.0 W/ft² of demand reduction. The authors are continuing field demonstrations and economic evaluations to pursue increasing penetration of automated DR, which has demonstrated its ability to provide a valuable DR resource for California.

  19. Microarrays

    ERIC Educational Resources Information Center

    Plomin, Robert; Schalkwyk, Leonard C.

    2007-01-01

    Microarrays are revolutionizing genetics by making it possible to genotype hundreds of thousands of DNA markers and to assess the expression (RNA transcripts) of all of the genes in the genome. Microarrays are slides the size of a postage stamp that contain millions of DNA sequences to which single-stranded DNA or RNA can hybridize. This…

  20. Toward Fully Automated Multicriterial Plan Generation: A Prospective Clinical Study

    SciTech Connect

    Voet, Peter W.J.; Dirkx, Maarten L.P.; Breedveld, Sebastiaan; Fransen, Dennie; Levendag, Peter C.; Heijmen, Ben J.M.

    2013-03-01

Purpose: To prospectively compare plans generated with iCycle, an in-house-developed algorithm for fully automated multicriterial intensity modulated radiation therapy (IMRT) beam profile and beam orientation optimization, with plans manually generated by dosimetrists using the clinical treatment planning system. Methods and Materials: For 20 randomly selected head-and-neck cancer patients with various tumor locations (of whom 13 received sequential boost treatments), we offered the treating physician the choice between an automatically generated iCycle plan and a manually optimized plan using standard clinical procedures. Although iCycle used a fixed “wish list” with hard constraints and prioritized objectives, the dosimetrists manually selected the beam configuration and fine-tuned the constraints and objectives for each IMRT plan. Dosimetrists were not informed in advance whether a competing iCycle plan was made. The 2 plans were simultaneously presented to the physician, who then selected the plan to be used for treatment. For the patient group, differences in planning target volume coverage and sparing of critical tissues were quantified. Results: In 32 of 33 plan comparisons, the physician selected the iCycle plan for treatment. This highly consistent preference for the automatically generated plans was mainly caused by the improved sparing of the large majority of critical structures. With iCycle, the normal tissue complication probabilities for the parotid and submandibular glands were reduced by 2.4% ± 4.9% (maximum, 18.5%, P=.001) and 6.5% ± 8.3% (maximum, 27%, P=.005), respectively. The reduction in the mean oral cavity dose was 2.8 ± 2.8 Gy (maximum, 8.1 Gy, P=.005). For the swallowing muscles, esophagus, and larynx, the mean dose reduction was 3.3 ± 1.1 Gy (maximum, 9.2 Gy, P<.001). For 15 of the 20 patients, target coverage was also improved. Conclusions: In 97% of cases, automatically generated plans were selected for treatment because of

  1. Automated and Multiplexed Soft Lithography for the Production of Low-Density DNA Microarrays.

    PubMed

    Fredonnet, Julie; Foncy, Julie; Cau, Jean-Christophe; Séverac, Childérick; François, Jean Marie; Trévisiol, Emmanuelle

    2016-09-26

Microarrays are established research tools for genotyping, expression profiling, and molecular diagnostics, in which DNA molecules are precisely addressed to the surface of a solid support. This study assesses the fabrication of low-density oligonucleotide arrays using an automated microcontact printing device, the InnoStamp 40®. This instrument allows multiplexed deposition of oligoprobes on a functionalized surface using a MacroStamp™ bearing 64 individual pillars, each carrying 50 circular micropatterns (spots) of 160 µm diameter at 320 µm pitch. Reuse of the MacroStamp™ was shown to be fast and robust, requiring only a simple washing step in 96% ethanol. The low-density microarrays printed on either epoxysilane- or dendrimer-functionalized slides (DendriSlides) showed excellent hybridization response with complementary sequences at unusually low probe and target concentrations, since the actual probe density immobilized by this technology was at least 10-fold lower than with conventional mechanical spotting. In addition, we found a comparable hybridization response, in terms of fluorescence intensity, between spotted and printed oligoarrays with a 1 nM complementary target, even though a 50-fold lower probe concentration was used to produce the oligoarrays by the microcontact printing method. Taken together, our results support the potential development of this multiplexed microcontact printing technology employing soft lithography as an alternative, cost-competitive tool for the fabrication of low-density DNA microarrays.

  2. A Program Certification Assistant Based on Fully Automated Theorem Provers

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd

    2005-01-01

    We describe a certification assistant to support formal safety proofs for programs. It is based on a graphical user interface that hides the low-level details of first-order automated theorem provers while supporting limited interactivity: it allows users to customize and control the proof process on a high level, manages the auxiliary artifacts produced during this process, and provides traceability between the proof obligations and the relevant parts of the program. The certification assistant is part of a larger program synthesis system and is intended to support the deployment of automatically generated code in safety-critical applications.

  3. Automated microfluidic assay system for autoantibodies found in autoimmune diseases using a photoimmobilized autoantigen microarray.

    PubMed

    Matsudaira, Takahiro; Tsuzuki, Saki; Wada, Akira; Suwa, Akira; Kohsaka, Hitoshi; Tomida, Maiko; Ito, Yoshihiro

    2008-01-01

    Autoimmune diseases such as rheumatoid arthritis, multiple sclerosis, and autoimmune diabetes are characterized by the production of autoantibodies that serve as useful diagnostic markers, surrogate markers, and prognostic factors. We devised an in vitro system to detect these clinically pivotal autoantibodies using a photoimmobilized autoantigen microarray. Photoimmobilization was useful for preparing the autoantigen microarray, where autoantigens are covalently immobilized on a plate, because it does not require specific functional groups of the autoantigens and any organic material can be immobilized by a radical reaction induced by photoirradiation. Here, we prepared the microarray using a very convenient method. Aqueous solutions of each autoantigen were mixed with a polymer of poly(ethylene glycol) methacrylate and a photoreactive crosslinker, and the mixtures were microspotted on a plate and dried in air. Finally, the plate was irradiated with an ultraviolet lamp to obtain immobilization. In the assay, patient serum was added to the microarray plate. Antigen-specific IgG adsorbed on the microspotted autoantigen was detected by peroxidase-conjugated anti-IgG antibody. The chemical luminescence intensities of the substrate decomposed by the peroxidase were detected with a sensitive CCD camera. All autoantigens were immobilized stably by this method and used to screen antigen-specific IgG. In addition, the plate was covered with a polydimethylsiloxane sheet containing microchannels and automated measurement was carried out. PMID:19194953

  4. Toward fully automated genotyping: Genotyping microsatellite markers by deconvolution

    SciTech Connect

    Perlin, M.W.; Lancia, G.; Ng, See-Kiong

    1995-11-01

    Dense genetic linkage maps have been constructed for the human and mouse genomes, with average densities of 2.9 cM and 0.35 cM, respectively. These genetic maps are crucial for mapping both Mendelian and complex traits and are useful in clinical genetic diagnosis. Current maps are largely comprised of abundant, easily assayed, and highly polymorphic PCR-based microsatellite markers, primarily dinucleotide (CA)n repeats. One key limitation of these length polymorphisms is the PCR stutter (or slippage) artifact that introduces additional stutter bands. With two (or more) closely spaced alleles, the stutter bands overlap, and it is difficult to accurately determine the correct alleles; this stutter phenomenon has all but precluded full automation, since a human must visually inspect the allele data. We describe here novel deconvolution methods for accurate genotyping that mathematically remove PCR stutter artifact from microsatellite markers. These methods overcome the manual interpretation bottleneck and thereby enable full automation of genetic map construction and use. New functionalities, including the pooling of DNAs and the pooling of markers, are described that may greatly reduce the associated experimentation requirements. 32 refs., 5 figs., 3 tabs.
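
    The key step, mathematically removing the stutter artifact, can be illustrated with a small numerical sketch. The snippet below is only an illustration (not the authors' implementation; the stutter pattern and band intensities are invented values): it models the observed band intensities as a convolution of the true allele weights with a stutter response and recovers the alleles by non-negative least squares.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    # Hypothetical stutter response: a true allele at repeat size i also yields
    # weaker bands at i-1 and i-2 (weights are illustrative, not measured values).
    stutter = np.array([0.15, 0.35, 1.0])          # contributions at i-2, i-1, i

    # Observed band intensities across a ladder of repeat sizes, with two
    # closely spaced alleles whose stutter bands overlap.
    observed = np.array([0.03, 0.12, 0.36, 0.50, 1.00, 0.47, 0.95, 0.00])

    # Build the convolution matrix A so that observed ~ A @ true_alleles.
    n = len(observed)
    A = np.zeros((n, n))
    for j in range(n):
        for k, w in enumerate(stutter):
            i = j - (len(stutter) - 1) + k
            if 0 <= i < n:
                A[i, j] = w

    # Non-negative least squares strips the stutter contribution; peaks in the
    # solution indicate the called alleles.
    true_alleles, _ = nnls(A, observed)
    print(np.round(true_alleles, 2))
    ```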

  5. Gene Expression Measurement Module (GEMM) - a fully automated, miniaturized instrument for measuring gene expression in space

    NASA Astrophysics Data System (ADS)

    Karouia, Fathi; Ricco, Antonio; Pohorille, Andrew; Peyvan, Kianoosh

    2012-07-01

    The capability to measure gene expression on board spacecraft opens the door to a large number of experiments on the influence of the space environment on biological systems that will profoundly impact our ability to conduct safe and effective space travel, and might also shed light on terrestrial physiology or biological function and on human disease and aging processes. Measurements of gene expression will help us to understand adaptation of terrestrial life to conditions beyond the planet of origin, identify deleterious effects of the space environment on a wide range of organisms from microbes to humans, develop effective countermeasures against these effects, determine the metabolic basis of microbial pathogenicity and drug resistance, test our ability to sustain and grow in space organisms that can be used for life support and in situ resource utilization during long-duration space exploration, and monitor both the spacecraft environment and crew health. These and other applications hold significant potential for discoveries in space biology, biotechnology and medicine. Accordingly, supported by funding from the NASA Astrobiology Science and Technology Instrument Development Program, we are developing a fully automated, miniaturized, integrated fluidic system for small spacecraft capable of in situ measurement of microbial expression of thousands of genes from multiple samples. The instrument will be capable of (1) lysing bacterial cell walls, (2) extracting and purifying RNA released from cells, (3) hybridizing it on a microarray and (4) providing electrochemical readout, all in a microfluidics cartridge. The prototype under development is suitable for deployment on nanosatellite platforms developed by the NASA Small Spacecraft Office. The first target application is to cultivate and measure gene expression of the photosynthetic bacterium Synechococcus elongatus, a cyanobacterium known to exhibit remarkable metabolic diversity and resilience to adverse conditions

  6. Gene Expression Measurement Module (GEMM) - A Fully Automated, Miniaturized Instrument for Measuring Gene Expression in Space

    NASA Technical Reports Server (NTRS)

    Pohorille, Andrew; Peyvan, Kia; Karouia, Fathi; Ricco, Antonio

    2012-01-01

    The capability to measure gene expression on board spacecraft opens the door to a large number of high-value experiments on the influence of the space environment on biological systems. For example, measurements of gene expression will help us to understand adaptation of terrestrial life to conditions beyond the planet of origin, identify deleterious effects of the space environment on a wide range of organisms from microbes to humans, develop effective countermeasures against these effects, and determine the metabolic bases of microbial pathogenicity and drug resistance. These and other applications hold significant potential for discoveries in space biology, biotechnology, and medicine. Supported by funding from the NASA Astrobiology Science and Technology Instrument Development Program, we are developing a fully automated, miniaturized, integrated fluidic system for small spacecraft capable of in-situ measurement of expression of several hundred microbial genes from multiple samples. The instrument will be capable of (1) lysing cell walls of bacteria sampled from cultures grown in space, (2) extracting and purifying RNA released from cells, (3) hybridizing the RNA on a microarray and (4) providing readout of the microarray signal, all in a single microfluidics cartridge. The device is suitable for deployment on nanosatellite platforms developed by NASA Ames' Small Spacecraft Division. To meet space and other technical constraints imposed by these platforms, a number of technical innovations are being implemented. The integration and end-to-end technological and biological validation of the instrument are carried out using as a model the photosynthetic bacterium Synechococcus elongatus, known for its remarkable metabolic diversity and resilience to adverse conditions. Each step in the measurement process (lysis, nucleic acid extraction, purification, and hybridization to an array) is assessed through comparison of the results obtained using the instrument with

  7. A fully automated system for adherent cells microinjection.

    PubMed

    Becattini, Gabriele; Mattos, Leonardo S; Caldwell, Darwin G

    2014-01-01

    This paper proposes an automated robotic system to perform cell microinjections to relieve human operators from this highly difficult and tedious manual procedure. The system, which uses commercial equipment currently found in most biomanipulation laboratories, consists of a multitask software framework combining computer vision and robotic control elements. The vision part features an injection pipette tracker and an automatic cell targeting system that is responsible for defining injection points within the contours of adherent cells in culture. The main challenge is the use of bright-field microscopy only, without the need for chemical markers normally employed to highlight the cells. Here, cells are identified and segmented using a threshold-based image processing technique working on defocused images. Fast and precise microinjection pipette positioning over the automatically defined targets is performed by a two-stage robotic system which achieves an average injection rate of 7.6 cells/min with a pipette positioning precision of 0.23 μm. The consistency of these microinjections and the performance of the visual targeting framework were experimentally evaluated using two cell lines (CHO-K1 and HEK) and over 500 cells. In these trials, the cells were automatically targeted and injected with a fluorescent marker, resulting in a correct cell detection rate of 87% and a successful marker delivery rate of 67.5%. These results demonstrate that the new system is capable of better performance than expert operators, highlighting its benefits and potential for large-scale application. PMID:24403406

  8. Canadian macromolecular crystallography facility: a suite of fully automated beamlines.

    PubMed

    Grochulski, Pawel; Fodje, Michel; Labiuk, Shaunivan; Gorin, James; Janzen, Kathryn; Berg, Russ

    2012-06-01

    The Canadian light source is a 2.9 GeV national synchrotron radiation facility located on the University of Saskatchewan campus in Saskatoon. The small-gap in-vacuum undulator illuminated beamline, 08ID-1, together with the bending magnet beamline, 08B1-1, constitute the Canadian Macromolecular Crystallography Facility (CMCF). The CMCF provides service to more than 50 Principal Investigators in Canada and the United States. Up to 25% of the beam time is devoted to commercial users and the general user program is guaranteed up to 55% of the useful beam time through a peer-review process. CMCF staff provides "Mail-In" crystallography service to users with the highest scored proposals. Both beamlines are equipped with very robust end-stations including on-axis visualization systems, Rayonix 300 CCD series detectors and Stanford-type robotic sample auto-mounters. MxDC, an in-house developed beamline control system, is integrated with a data processing module, AutoProcess, allowing full automation of data collection and data processing with minimal human intervention. Sample management and remote monitoring of experiments is enabled through interaction with a Laboratory Information Management System developed at the facility.

  9. Improving reticle defect disposition via fully automated lithography simulation

    NASA Astrophysics Data System (ADS)

    Mann, Raunak; Goodman, Eliot; Lao, Keith; Ha, Steven; Vacca, Anthony; Fiekowsky, Peter; Fiekowsky, Dan

    2016-03-01

    Most advanced wafer fabs have embraced complex pattern decoration, which creates numerous challenges during in-fab reticle qualification. These optical proximity correction (OPC) techniques create assist features that tend to be very close in size and shape to the main patterns as seen in Figure 1. A small defect on an assist feature will most likely have little or no impact on the fidelity of the wafer image, whereas the same defect on a main feature could significantly decrease device functionality. In order to properly disposition these defects, reticle inspection technicians need an efficient method that automatically separates main from assist features and predicts the resulting defect impact on the wafer image. One such method is the Automated Defect Analysis System (ADAS) defect simulation system[1]. Up until now, using ADAS simulation was limited to engineers due to the complexity of the settings that need to be manually entered in order to create an accurate result. A single error in entering one of these values can cause erroneous results; therefore, full automation is necessary. In this study, we propose a new method where all needed simulation parameters are automatically loaded into ADAS. This is accomplished in two parts. First, we have created a scanner parameter database that is automatically identified from mask product and level names. Second, we automatically determine the appropriate simulation printability threshold by using a new reference image (provided by the inspection tool) that contains a known measured value of the reticle critical dimension (CD). This new method automatically loads the correct scanner conditions, sets the appropriate simulation threshold, and automatically measures the percentage of CD change caused by the defect. This streamlines qualification and reduces the number of reticles being put on hold, waiting for engineer review. We also present data showing the consistency and reliability of the new method, along with the impact on the efficiency of in
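
    As a rough illustration of the kind of automation described here (not the actual ADAS interface; every name, parameter, and value below is hypothetical), the two steps of looking up scanner conditions from the mask product/level name and expressing defect severity relative to a measured reference CD might look like this:

    ```python
    # Hypothetical scanner-parameter database keyed by (mask product, level);
    # names, values, and the lookup scheme are illustrative only.
    SCANNER_DB = {
        ("PRODUCT_A", "METAL1"): {"wavelength_nm": 193, "NA": 1.35, "illumination": "annular"},
        ("PRODUCT_A", "VIA2"):   {"wavelength_nm": 193, "NA": 1.20, "illumination": "quasar"},
    }

    def load_simulation_setup(mask_name: str, reference_cd_nm: float) -> dict:
        """Resolve scanner conditions from the mask name, with no manual entry."""
        product, level = mask_name.split("-")[:2]       # e.g. "PRODUCT_A-METAL1-REV3"
        setup = dict(SCANNER_DB[(product, level)])      # automatic lookup of scanner conditions
        setup["reference_cd_nm"] = reference_cd_nm      # known CD taken from the reference image
        return setup

    def defect_cd_change_pct(defect_cd_nm: float, reference_cd_nm: float) -> float:
        """Percentage CD change caused by a defect, relative to the reference feature."""
        return 100.0 * (defect_cd_nm - reference_cd_nm) / reference_cd_nm

    setup = load_simulation_setup("PRODUCT_A-METAL1-REV3", reference_cd_nm=45.2)
    print(setup, defect_cd_change_pct(defect_cd_nm=42.1, reference_cd_nm=setup["reference_cd_nm"]))
    ```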

  10. A Fully Automated Classification for Mapping the Annual Cropland Extent

    NASA Astrophysics Data System (ADS)

    Waldner, F.; Defourny, P.

    2015-12-01

    Mapping the global cropland extent is of paramount importance for food security. Accurate and reliable information on cropland and on the location of major crop types is required for policy, investment, and logistical decisions, as well as for production monitoring. Timely cropland information directly feeds early warning systems such as GIEWS and FEWS NET. In Africa, and particularly in the arid and semi-arid regions, food security is at the center of debate (at least 10% of the population remains undernourished) and accurate cropland estimation is a challenge. Spaceborne Earth observation provides opportunities for global cropland monitoring in a spatially explicit, economical, efficient, and objective fashion. In both agriculture monitoring and climate modelling, cropland maps serve as a mask to isolate agricultural land for (i) time-series analysis for crop condition monitoring and (ii) investigating how cropland responds to climatic evolution. A large diversity of mapping strategies, ranging from the local to the global scale and associated with various degrees of accuracy, can be found in the literature. At the global scale, despite these efforts, cropland is generally one of the classes with the poorest accuracy, which limits its use for agricultural applications. This research aims at improving cropland delineation from the local to the regional and global scales, as well as allowing near-real-time updates. To that aim, five temporal features were designed to target the key characteristics of crop spectral-temporal behavior. To ensure a high degree of automation, training data are extracted from available baseline land cover maps. The method delivers cropland maps with high accuracy over contrasted agro-systems in Ukraine, Argentina, China, and Belgium. The accuracies reached are comparable to those obtained with classifiers trained with in situ data. In addition, the cropland class is associated with a low uncertainty. The temporal features
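
    A minimal sketch of the overall strategy, deriving a handful of temporal descriptors per pixel from a reflectance time series and training a classifier on labels taken from an existing baseline land cover map, might look as follows. The five features named in the abstract are not specified here, so the feature set below is an assumption, and the data are synthetic:

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def temporal_features(ts: np.ndarray) -> np.ndarray:
        """Summarize per-pixel NDVI time series (n_pixels, n_dates) into a few
        temporal descriptors; simple stand-ins for the paper's five features."""
        return np.column_stack([
            ts.max(axis=1),                   # peak greenness
            ts.min(axis=1),                   # background level
            ts.max(axis=1) - ts.min(axis=1),  # seasonal amplitude
            np.argmax(ts, axis=1),            # timing of the peak
            ts.mean(axis=1),                  # average greenness over the season
        ])

    rng = np.random.default_rng(0)
    ndvi = rng.random((1000, 24))                    # synthetic reflectance time series
    baseline_labels = rng.integers(0, 2, size=1000)  # 1 = cropland in the baseline land cover map

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(temporal_features(ndvi), baseline_labels)    # training data taken from the baseline map
    cropland_probability = clf.predict_proba(temporal_features(ndvi))[:, 1]
    ```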

  11. AMDA 2.13: A major update for automated cross-platform microarray data analysis.

    PubMed

    Kapetis, Dimos; Clarelli, Ferdinando; Vitulli, Federico; de Rosbo, Nicole Kerlero; Beretta, Ottavio; Foti, Maria; Ricciardi-Castagnoli, Paola; Zolezzi, Francesca

    2012-07-01

    Microarray platforms require analytical pipelines with modules for data pre-processing including data normalization, statistical analysis for identification of differentially expressed genes, cluster analysis, and functional annotation. We previously developed the Automated Microarray Data Analysis (AMDA, version 2.3.5) pipeline to process Affymetrix 3' IVT GeneChips. The availability of newer technologies that demand open-source tools for microarray data analysis has impelled us to develop an updated multi-platform version, AMDA 2.13. It includes additional quality control metrics, annotation-driven (annotation grade of Affymetrix NetAffx) and signal-driven (Inter-Quartile Range) gene filtering, and approaches to experimental design. To enhance understanding of biological data, differentially expressed genes have been mapped into KEGG pathways. Finally, a more stable and user-friendly interface was designed to integrate the requirements for different platforms. AMDA 2.13 allows the analysis of Affymetrix (cartridges and plates) and whole transcript probe design (Gene 1.0/1.1 ST and Exon 1.0 ST GeneChips), Illumina Bead Arrays, and one-channel Agilent 4×44 arrays. Relative to early versions, it supports various experimental designs and delivers more insightful biological understanding and up-to-date annotations.
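
    One of the added filters, signal-driven gene filtering by Inter-Quartile Range, is easy to illustrate: probes whose expression varies little across samples are dropped before statistical testing. A minimal sketch (the threshold and data below are illustrative, not AMDA's defaults):

    ```python
    import numpy as np

    def iqr_filter(expr: np.ndarray, min_iqr: float = 0.5) -> np.ndarray:
        """Keep probes (rows) whose inter-quartile range across samples exceeds min_iqr.

        expr: log2 expression matrix of shape (n_probes, n_samples).
        Returns a boolean mask over probes."""
        q75 = np.percentile(expr, 75, axis=1)
        q25 = np.percentile(expr, 25, axis=1)
        return (q75 - q25) >= min_iqr

    expr = np.random.default_rng(1).normal(8.0, 1.0, size=(10000, 12))
    mask = iqr_filter(expr, min_iqr=0.5)
    filtered = expr[mask]          # matrix passed on to differential-expression testing
    print(mask.sum(), "of", expr.shape[0], "probes retained")
    ```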

  12. Automated identification of multiple micro-organisms from resequencing DNA microarrays.

    PubMed

    Malanoski, Anthony P; Lin, Baochuan; Wang, Zheng; Schnur, Joel M; Stenger, David A

    2006-01-01

    There is an increasing recognition that detailed nucleic acid sequence information will be useful and even required in the diagnosis, treatment and surveillance of many significant pathogens. Because generating detailed information about pathogens leads to significantly larger amounts of data, it is necessary to develop automated analysis methods to reduce analysis time and to standardize identification criteria. This is especially important for multiple pathogen assays designed to reduce assay time and costs. In this paper, we present a successful algorithm for detecting pathogens and reporting the maximum level of detail possible using multi-pathogen resequencing microarrays. The algorithm filters the sequence of base calls from the microarray and finds entries in genetic databases that most closely match. Taxonomic databases are then used to relate these entries to each other so that the microorganism can be identified. Although developed using a resequencing microarray, the approach is applicable to any assay method that produces base call sequence information. The success and continued development of this approach means that a non-expert can now perform unassisted analysis of the results obtained from partial sequence data.
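
    In outline, the approach turns the microarray's base-call sequence into a database query and then rolls the best hits up a taxonomy until a unique answer remains. The sketch below is a much-simplified stand-in for that flow (the filtering rule, scoring, reference sequences, and taxonomy are placeholders, not the published algorithm):

    ```python
    from collections import Counter

    def filter_calls(base_calls: str, min_run: int = 12) -> list:
        """Keep contiguous runs of unambiguous base calls long enough to be informative."""
        return [run for run in base_calls.split("N") if len(run) >= min_run]

    def identify(base_calls: str, reference_db: dict, taxonomy: dict) -> str:
        """Score references by how many filtered fragments they contain, then report the
        most detailed taxon shared by all top-scoring references."""
        fragments = filter_calls(base_calls)
        scores = Counter({name: sum(frag in seq for frag in fragments)
                          for name, seq in reference_db.items()})
        best_score = max(scores.values(), default=0)
        if best_score == 0:
            return "no identification"
        best = [name for name, score in scores.items() if score == best_score]
        if len(best) == 1:
            return best[0]                                  # e.g. strain- or species-level call
        parents = {taxonomy.get(name, "unknown") for name in best}
        return parents.pop() if len(parents) == 1 else "ambiguous"

    refs = {"Influenza A/H3N2": "ACGTACGTGGTTACGTACGA", "Influenza A/H1N1": "ACGTACGTGGTTTTTTACGA"}
    taxa = {"Influenza A/H3N2": "Influenza A", "Influenza A/H1N1": "Influenza A"}
    print(identify("ACGTACGTGGTTNNNNNNNNNNNN", refs, taxa))   # rolls up to "Influenza A"
    ```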

  13. Accurate, Fully-Automated NMR Spectral Profiling for Metabolomics

    PubMed Central

    Ravanbakhsh, Siamak; Liu, Philip; Bjorndahl, Trent C.; Mandal, Rupasri; Grant, Jason R.; Wilson, Michael; Eisner, Roman; Sinelnikov, Igor; Hu, Xiaoyu; Luchinat, Claudio; Greiner, Russell; Wishart, David S.

    2015-01-01

    Many diseases cause significant changes to the concentrations of small molecules (a.k.a. metabolites) that appear in a person’s biofluids, which means such diseases can often be readily detected from a person’s “metabolic profile”—i.e., the list of concentrations of those metabolites. This information can be extracted from a biofluid’s Nuclear Magnetic Resonance (NMR) spectrum. However, due to its complexity, NMR spectral profiling has remained manual, resulting in slow, expensive and error-prone procedures that have hindered clinical and industrial adoption of metabolomics via NMR. This paper presents a system, BAYESIL, which can quickly, accurately, and autonomously produce a person’s metabolic profile. Given a 1D 1H NMR spectrum of a complex biofluid (specifically serum or cerebrospinal fluid), BAYESIL can automatically determine the metabolic profile. This requires first performing several spectral processing steps, then matching the resulting spectrum against a reference compound library, which contains the “signatures” of each relevant metabolite. BAYESIL views spectral matching as an inference problem within a probabilistic graphical model that rapidly approximates the most probable metabolic profile. Our extensive studies on a diverse set of complex mixtures, including real biological samples (serum and CSF), defined mixtures and realistic computer-generated spectra, involving > 50 compounds, show that BAYESIL can autonomously find the concentration of NMR-detectable metabolites accurately (~ 90% correct identification and ~ 10% quantification error), in less than 5 minutes on a single CPU. These results demonstrate that BAYESIL is the first fully-automatic publicly-accessible system that provides quantitative NMR spectral profiling effectively—with an accuracy on these biofluids that meets or exceeds the performance of trained experts. We anticipate this tool will usher in high-throughput metabolomics and enable a wealth of new applications

  14. A fully automated microfluidic micellar electrokinetic chromatography analyzer for organic compound detection.

    PubMed

    Jang, Lee-Woon; Razu, Md Enayet; Jensen, Erik C; Jiao, Hong; Kim, Jungkyu

    2016-09-21

    An integrated microfluidic chemical analyzer utilizing micellar electrokinetic chromatography (MEKC) is developed using a pneumatically actuated Lifting-Gate microvalve array and a capillary zone electrophoresis (CZE) chip. Each of the necessary liquid handling processes such as metering, mixing, transferring, and washing steps are performed autonomously by the microvalve array. In addition, a method is presented for automated washing of the high resistance CZE channel for device reuse and periodic automated in situ analyses. To demonstrate the functionality of this MEKC platform, amino acids and thiols are labeled and efficiently separated via a fully automated program. Reproducibility of the automated programs for sample labeling and periodic in situ MEKC analysis was tested and found to be equivalent to conventional sample processing techniques for capillary electrophoresis analysis. This platform enables simple, portable, and automated chemical compound analysis which can be used in challenging environments. PMID:27507322

  15. A Fully Automated Microfluidic Femtosecond Laser Axotomy Platform for Nerve Regeneration Studies in C. elegans

    PubMed Central

    Gokce, Sertan Kutal; Guo, Samuel X.; Ghorashian, Navid; Everett, W. Neil; Jarrell, Travis; Kottek, Aubri; Bovik, Alan C.; Ben-Yakar, Adela

    2014-01-01

    Femtosecond laser nanosurgery has been widely accepted as an axonal injury model, enabling nerve regeneration studies in the small model organism, Caenorhabditis elegans. To overcome the time limitations of manual worm handling techniques, automation and new immobilization technologies must be adopted to improve throughput in these studies. While new microfluidic immobilization techniques have been developed that promise to reduce the time required for axotomies, there is a need for automated procedures to minimize the required amount of human intervention and accelerate the axotomy processes crucial for high-throughput. Here, we report a fully automated microfluidic platform for performing laser axotomies of fluorescently tagged neurons in living Caenorhabditis elegans. The presented automation process reduces the time required to perform axotomies within individual worms to ∼17 s/worm, at least one order of magnitude faster than manual approaches. The full automation is achieved with a unique chip design and an operation sequence that is fully computer controlled and synchronized with efficient and accurate image processing algorithms. The microfluidic device includes a T-shaped architecture and three-dimensional microfluidic interconnects to serially transport, position, and immobilize worms. The image processing algorithms can identify and precisely position axons targeted for ablation. There were no statistically significant differences observed in reconnection probabilities between axotomies carried out with the automated system and those performed manually with anesthetics. The overall success rate of automated axotomies was 67.4±3.2% of the cases (236/350) at an average processing rate of 17.0±2.4 s. This fully automated platform establishes a promising methodology for prospective genome-wide screening of nerve regeneration in C. elegans in a truly high-throughput manner. PMID:25470130

  16. Considerations for Using Phased Array Ultrasonics in a Fully Automated Inspection System

    NASA Astrophysics Data System (ADS)

    Kramb, V. A.; Olding, R. B.; Sebastian, J. R.; Hoppe, W. C.; Petricola, D. L.; Hoeffel, J. D.; Gasper, D. A.; Stubbs, D. A.

    2004-02-01

    The University of Dayton Research Institute (UDRI) under contract by the US Air Force has designed and constructed a fully automated ultrasonic inspection system for the detection of embedded defects in rotating gas turbine engine components. The system performs automated inspections using the "scan plan" concept developed for the Air Force sponsored "Retirement For Cause" (RFC) automated eddy current system. Execution of the scan plan results in a fully automated inspection process producing engine component accept/reject decisions based on probability of detection (POD) information. Use of the phased-array ultrasonic instrument and probes allows for optimization of both the sensitivity and resolution for each inspection through electronic beamforming, scanning, and focusing processes. However, issues such as alignment of the array probe, calibration of individual elements and overall beam response prior to the inspection have not been addressed for an automated system. This paper will discuss current progress in the development of an automated alignment and calibration procedure for various phased array apertures and specimen geometries.

  1. ProDeGe: A Computational Protocol for fully Automated Decontamination of Genomic Data

    SciTech Connect

    2015-12-01

    The Single Cell Data Decontamination Pipeline is a fully-automated software tool which classifies unscreened contigs from single cell datasets through a combination of homology and feature-based methodologies, using the organism's nucleotide sequences and known NCBI taxonomy. The software is freely available to download and install, and can be run on any system.

  2. 21 CFR 866.1645 - Fully automated short-term incubation cycle antimicrobial susceptibility system.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Fully automated short-term incubation cycle antimicrobial susceptibility system. 866.1645 Section 866.1645 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES IMMUNOLOGY AND MICROBIOLOGY...

  3. 21 CFR 866.1645 - Fully automated short-term incubation cycle antimicrobial susceptibility system.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Fully automated short-term incubation cycle antimicrobial susceptibility system. 866.1645 Section 866.1645 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES IMMUNOLOGY AND MICROBIOLOGY...

  4. 21 CFR 866.1645 - Fully automated short-term incubation cycle antimicrobial susceptibility system.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Fully automated short-term incubation cycle antimicrobial susceptibility system. 866.1645 Section 866.1645 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES IMMUNOLOGY AND MICROBIOLOGY...

  5. 21 CFR 866.1645 - Fully automated short-term incubation cycle antimicrobial susceptibility system.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Fully automated short-term incubation cycle antimicrobial susceptibility system. 866.1645 Section 866.1645 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES IMMUNOLOGY AND MICROBIOLOGY...

  6. 21 CFR 866.1645 - Fully automated short-term incubation cycle antimicrobial susceptibility system.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Fully automated short-term incubation cycle antimicrobial susceptibility system. 866.1645 Section 866.1645 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES IMMUNOLOGY AND MICROBIOLOGY...

  7. Neurodegenerative changes in Alzheimer's disease: a comparative study of manual, semi-automated, and fully automated assessment using MRI

    NASA Astrophysics Data System (ADS)

    Fritzsche, Klaus H.; Giesel, Frederik L.; Heimann, Tobias; Thomann, Philipp A.; Hahn, Horst K.; Pantel, Johannes; Schröder, Johannes; Essig, Marco; Meinzer, Hans-Peter

    2008-03-01

    Objective quantification of disease-specific neurodegenerative changes can facilitate diagnosis and therapeutic monitoring in several neuropsychiatric disorders. Reproducibility and easy-to-perform assessment are essential to ensure applicability in clinical environments. The aim of this comparative study was to evaluate a fully automated approach that assesses atrophic changes in Alzheimer's disease (AD) and Mild Cognitive Impairment (MCI). 21 healthy volunteers (mean age 66.2), 21 patients with MCI (66.6), and 10 patients with AD (65.1) were enrolled. Subjects underwent extensive neuropsychological testing, and MRI was conducted on a 1.5 Tesla clinical scanner. Atrophic changes were measured automatically by a series of image processing steps including state-of-the-art brain mapping techniques. Results were compared with two reference approaches: a manual segmentation of the hippocampal formation and a semi-automated estimation of temporal horn volume, which is based upon interactive selection of two to six landmarks in the ventricular system. All approaches separated controls and AD patients significantly (10^-5 < p < 10^-4) and showed a slight but not significant increase of neurodegeneration for subjects with MCI compared to volunteers. The automated approach correlated significantly with the manual (r = -0.65, p < 10^-6) and semi-automated (r = -0.83, p < 10^-13) measurements. It achieved high accuracy while maximizing observer independence and reducing analysis time, and is thus useful in clinical routine.
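
    The statistical comparison reported here, group separation plus correlation of the automated measure against the reference measurements, can be reproduced in a few lines. The sketch below uses invented volumes purely to show the computation, not the study's data:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    # Invented hippocampal volumes (ml), only to make the computations concrete.
    controls_manual = rng.normal(3.2, 0.3, 21)
    patients_manual = rng.normal(2.4, 0.3, 10)
    manual = np.concatenate([controls_manual, patients_manual])
    # A simulated automated atrophy index that grows as volume shrinks (hence negative r).
    automated = -0.9 * manual + rng.normal(0.0, 0.1, manual.size)

    t, p_group = stats.ttest_ind(controls_manual, patients_manual)   # controls vs AD separation
    r, p_corr = stats.pearsonr(manual, automated)                    # agreement with the reference
    print(f"group separation p = {p_group:.1e}; correlation r = {r:.2f} (p = {p_corr:.1e})")
    ```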

  8. How a Fully Automated eHealth Program Simulates Three Therapeutic Processes: A Case Study

    PubMed Central

    Johansen, Ayna; Brendryen, Håvar

    2016-01-01

    Background: eHealth programs may be better understood by breaking down the components of one particular program and discussing its potential for interactivity and tailoring in regard to concepts from face-to-face counseling. In the search for the efficacious elements within eHealth programs, it is important to understand how a program using lapse management may simultaneously support working alliance, internalization of motivation, and behavior maintenance. These processes have been applied to fully automated eHealth programs individually. However, given their significance in face-to-face counseling, it may be important to simulate the processes simultaneously in interactive, tailored programs. Objective: We propose a theoretical model for how fully automated behavior change eHealth programs may be more effective by simulating a therapist’s support of a working alliance, internalization of motivation, and managing lapses. Methods: We show how the model is derived from theory and its application to Endre, a fully automated smoking cessation program that engages the user in several “counseling sessions” about quitting. A descriptive case study based on tools from the intervention mapping protocol shows how each therapeutic process is simulated. Results: The program supports the user’s working alliance through alliance factors, the nonembodied relational agent Endre and computerized motivational interviewing. Computerized motivational interviewing also supports internalized motivation to quit, whereas a lapse management component responds to lapses. The description operationalizes working alliance, internalization of motivation, and managing lapses, in terms of eHealth support of smoking cessation. Conclusions: A program may simulate working alliance, internalization of motivation, and lapse management through interactivity and individual tailoring, potentially making fully automated eHealth behavior change programs more effective. PMID:27354373

  9. Integration of image analysis and robotics into a fully automated colony picking and plate handling system.

    PubMed Central

    Jones, P; Watson, A; Davies, M; Stubbings, S

    1992-01-01

    We describe here the integration of image analysis and robotics to produce a fully automated colony picking/plate handling system. Biological tests were performed to verify its performance in terms of sterilisation and accuracy of picking. The machine was then used by a single operative to pick a 36,000 clone cDNA library in approximately 42 hrs over 5 days. PMID:1408762

  10. Fully Automated Volumetric Modulated Arc Therapy Plan Generation for Prostate Cancer Patients

    SciTech Connect

    Voet, Peter W.J.; Dirkx, Maarten L.P.; Breedveld, Sebastiaan; Al-Mamgani, Abrahim; Incrocci, Luca; Heijmen, Ben J.M.

    2014-04-01

    Purpose: To develop and evaluate fully automated volumetric modulated arc therapy (VMAT) treatment planning for prostate cancer patients, avoiding manual trial-and-error tweaking of plan parameters by dosimetrists. Methods and Materials: A system was developed for fully automated generation of VMAT plans with our commercial clinical treatment planning system (TPS), linked to the in-house developed Erasmus-iCycle multicriterial optimizer for preoptimization. For 30 randomly selected patients, automatically generated VMAT plans (VMAT_auto) were compared with VMAT plans generated manually by 1 expert dosimetrist in the absence of time pressure (VMAT_man). For all treatment plans, planning target volume (PTV) coverage and sparing of organs-at-risk were quantified. Results: All generated plans were clinically acceptable and had similar PTV coverage (V95% > 99%). For VMAT_auto and VMAT_man plans, the organ-at-risk sparing was similar as well, although only the former plans were generated without any planning workload. Conclusions: Fully automated generation of high-quality VMAT plans for prostate cancer patients is feasible and has recently been implemented in our clinic.

  11. Automated Immunomagnetic Separation and Microarray Detection of E. coli O157:H7 from Poultry Carcass Rinse

    SciTech Connect

    Chandler, Darrell P.; Brown, Jeremy D.; Call, Douglas R.; Wunschel, Sharon C.; Grate, Jay W.; Holman, David A.; Olson, Lydia G.; Stottlemyer, Mark S.; Bruckner-Lea, Cindy J.

    2001-09-01

    We describe the development and application of a novel electromagnetic flow cell and fluidics system for automated immunomagnetic separation of E. coli directly from unprocessed poultry carcass rinse, and the biochemical coupling of automated sample preparation with nucleic acid microarrays without cell growth. Highly porous nickel foam was used as a magnetic flux conductor. Up to 32% recovery efficiency of 'total' E. coli was achieved within the automated system with 6 sec contact times and a 15 minute protocol (from sample injection through elution), statistically similar to cell recovery efficiencies in > 1 hour 'batch' captures. The electromagnet flow cell allowed complete recovery of 2.8 µm particles directly from unprocessed poultry carcass rinse, whereas the batch system did not. O157:H7 cells were reproducibly isolated directly from unprocessed poultry rinse with 39% recovery efficiency at a 10^3 cells ml^-1 inoculum. Direct plating of washed beads showed positive recovery of O157:H7 directly from carcass rinse at an inoculum of 10 cells ml^-1. Recovered beads were used for direct PCR amplification and microarray detection, with a process-level detection limit (automated cell concentration through microarray detection) of < 10^3 cells ml^-1 carcass rinse. The fluidic system and analytical approach described here are generally applicable to most microbial detection problems and applications.

  12. Regeneration of recombinant antigen microarrays for the automated monitoring of antibodies against zoonotic pathogens in swine sera.

    PubMed

    Meyer, Verena K; Kober, Catharina; Niessner, Reinhard; Seidel, Michael

    2015-01-23

    The ability to regenerate immobilized proteins like recombinant antigens (rAgs) on surfaces is an unsolved problem for flow-based immunoassays on microarray analysis systems. The regeneration on microarray chip surfaces is achieved by changing the protein structures and desorption of antibodies. Afterwards, reactivation of immobilized protein antigens is necessary for reconstitution processes. Any backfolding should be managed in a way that antibodies are able to detect the protein antigens in the next measurement cycle. The regeneration of rAg microarrays was examined for the first time on the MCR3 flow-based chemiluminescence (CL) microarray analysis platform. The aim was to reuse rAg microarray chips in order to reduce the screening effort and costs. An antibody capturing format was used to detect antibodies against zoonotic pathogens in sera of slaughtered pigs. Different denaturation and reactivation buffers were tested. Acidic glycine-SDS buffer (pH 2.5) and 8 M guanidinium hydrochloride showed the best results in respect of denaturation efficiencies. The highest CL signals after regeneration were achieved with a carbonate buffer containing 10 mM DTT and 0.1% BSA for reactivation. Antibodies against Yersinia spp. and hepatitis E virus (HEV) were detected in swine sera on one immunochip over 4 days and 25 measurement cycles. Each cycle took 10 min for detection and regeneration. By using the rAg microarray chip, a fast and automated screening of antibodies against pathogens in sera of slaughtered pigs would be possible for zoonosis monitoring.

  13. Automated versus Manual Sample Inoculations in Routine Clinical Microbiology: a Performance Evaluation of the Fully Automated InoqulA Instrument

    PubMed Central

    Froment, P.; Marchandin, H.; Vande Perre, P.

    2014-01-01

    The process of plate streaking has been automated to improve the culture readings, isolation quality, and workflow of microbiology laboratories. However, instruments have not been well evaluated under routine conditions. We aimed to evaluate the performance of the fully automated InoqulA instrument (BD Kiestra B.V., The Netherlands) in the automated seeding of liquid specimens and samples collected using swabs with transport medium. We compared manual and automated methods according to the (i) within-run reproducibility using Escherichia coli-calibrated suspensions, (ii) intersample contamination using a series of alternating sterile broths and broths with >10^5 CFU/ml of either E. coli or Proteus mirabilis, (iii) isolation quality with standardized mixed bacterial suspensions of diverse complexity and a 4-category standardized scale (very poor, poor, fair to good, or excellent), and (iv) agreement of the results obtained from 244 clinical specimens. By involving 15 technicians in the latter part of the comparative study, we estimated the variability in the culture quality at the level of the laboratory team. The instrument produced satisfactory reproducibility with no sample cross-contamination, and it performed better than the manual method, with more colony types recovered and isolated (up to 11% and 17%, respectively). Finally, we showed that the instrument did not shorten the seeding time over short periods of work compared to that for the manual method. Altogether, the instrument improved the quality and standardization of the isolation, thereby contributing to a better overall workflow, shortened the time to results, and provided more accurate results for polymicrobial specimens. PMID:24353001

  14. Comparison of a modified thiazole orange technique with a fully automated analyser for reticulocyte counting.

    PubMed

    Bowen, D; Bentley, N; Hoy, T; Cavill, I

    1991-02-01

    Two independent methods for quantitating reticulocyte counts were compared. One used a modified thiazole orange technique and a flow cytometer (Becton Dickinson FACS); the other was a fully automated whole blood analyser (Sysmex R1000). Both methods gave comparable results with a coefficient of variation of less than 5%. Samples measured using the R1000 showed a negligible decrease in the reticulocyte count over five days at room temperature, although there was evidence of continuing intracellular maturation: with thiazole orange there was an apparent increase. A practical reference range of 20-70 x 10^9/l was established from 89 normal subjects. The close correlation between the two independent estimates indicates the validity of the quantitation of the reticulocyte count and shows that automation allows significant changes within and below the normal range to be detected with a degree of reliability which was not previously possible.
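
    Two of the quantities reported, within-method imprecision as a coefficient of variation and a reference range derived from normal subjects, are simple to compute. The sketch below uses simulated counts and a plain percentile definition of the range, not the paper's exact procedure:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Replicate measurements of one sample (x 10^9/l) for within-run imprecision.
    replicates = rng.normal(55.0, 2.0, 20)
    cv_percent = 100.0 * replicates.std(ddof=1) / replicates.mean()

    # Counts from normal subjects, used here for a central 95% reference interval.
    normals = rng.normal(45.0, 13.0, 89)
    low, high = np.percentile(normals, [2.5, 97.5])

    print(f"CV = {cv_percent:.1f}%  reference range ~ {low:.0f}-{high:.0f} x 10^9/l")
    ```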

  15. Fully automated quantitative analysis of breast cancer risk in DCE-MR images

    NASA Astrophysics Data System (ADS)

    Jiang, Luan; Hu, Xiaoxin; Gu, Yajia; Li, Qiang

    2015-03-01

    The amount of fibroglandular tissue (FGT) and the degree of background parenchymal enhancement (BPE) in dynamic contrast enhanced magnetic resonance (DCE-MR) images are two important indices for breast cancer risk assessment in clinical practice. The purpose of this study is to develop and evaluate a fully automated scheme for quantitative analysis of FGT and BPE in DCE-MR images. Our fully automated method consists of three steps, i.e., segmentation of the whole breast, the fibroglandular tissues, and the enhanced fibroglandular tissues. Based on the volume of interest extracted automatically, a dynamic programming method was applied in each 2-D slice of a 3-D MR scan to delineate the chest wall and breast skin line for segmenting the whole breast. This step took advantage of the continuity of the chest wall and breast skin line across adjacent slices. We then used a fuzzy c-means clustering method with automatic selection of the cluster number for segmenting the fibroglandular tissues within the segmented whole breast area. Finally, a statistical method was used to set a threshold based on the estimated noise level for segmenting the enhanced fibroglandular tissues in the subtraction images of pre- and post-contrast MR scans. Based on the segmented whole breast, fibroglandular tissues, and enhanced fibroglandular tissues, FGT and BPE were automatically computed. Preliminary results of technical evaluation and clinical validation showed that our fully automated scheme could obtain good segmentation of the whole breast, fibroglandular tissues, and enhanced fibroglandular tissues to achieve accurate assessment of FGT and BPE for quantitative analysis of breast cancer risk.
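
    The middle step, separating fibroglandular from fatty tissue inside the segmented breast with fuzzy c-means clustering, can be sketched directly on the voxel intensities. The implementation below is a generic two-class fuzzy c-means on simulated intensities, not the authors' code; the automatic choice of cluster number is omitted, and which cluster corresponds to FGT depends on the MR sequence:

    ```python
    import numpy as np

    def fuzzy_cmeans_1d(x: np.ndarray, c: int = 2, m: float = 2.0, iters: int = 100):
        """Minimal fuzzy c-means on 1-D voxel intensities; returns (centers, memberships)."""
        rng = np.random.default_rng(0)
        u = rng.random((len(x), c))
        u /= u.sum(axis=1, keepdims=True)                # memberships of each voxel sum to 1
        p = 2.0 / (m - 1.0)
        for _ in range(iters):
            w = u ** m
            centers = (w * x[:, None]).sum(axis=0) / w.sum(axis=0)
            d = np.abs(x[:, None] - centers[None, :]) + 1e-9
            u = 1.0 / (d ** p * (1.0 / d ** p).sum(axis=1, keepdims=True))
        return centers, u

    # Simulated intensities inside an already-segmented breast: two tissue populations.
    breast_voxels = np.concatenate([np.random.default_rng(1).normal(200, 20, 5000),
                                    np.random.default_rng(2).normal(600, 50, 1500)])
    centers, u = fuzzy_cmeans_1d(breast_voxels)
    fgt_cluster = int(np.argmax(centers))   # assumption: FGT is the brighter class here
    fgt_fraction = (u[:, fgt_cluster] > 0.5).mean()
    ```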

  16. Fully automated DNA extraction from blood using magnetic particles modified with a hyperbranched polyamidoamine dendrimer.

    PubMed

    Yoza, Brandon; Arakaki, Atsushi; Maruyama, Kohei; Takeyama, Haruko; Matsunaga, Tadashi

    2003-01-01

    Bacterial and artificial magnetic particles were modified using a polyamidoamine (PAMAM) dendrimer and outer shell amines determined. Bacterial magnetic particles were the most consistently modified. Transmission electron microscopic (TEM) analysis showed that the artificial magnetic particles were structurally damaged by the modification process including sonication. Furthermore, laser particle analysis of the magnetite also revealed damage. Small quantities of dendrimer-modified bacterial magnetic particles were used to extract DNA from blood. The efficiency of DNA recovery was consistently about 30 ng of DNA using 2-10 microg of dendrimer-modified bacterial magnetite. This technique was fully automated using newly developed liquid handling robots and bacterial magnetic particles.

  17. Fully automated segmentation and characterization of the dendritic trees of retinal horizontal neurons

    SciTech Connect

    Kerekes, Ryan A; Gleason, Shaun Scott; Martins, Rodrigo; Dyer, Michael

    2010-01-01

    We introduce a new fully automated method for segmenting and characterizing the dendritic tree of neurons in confocal image stacks. Our method is aimed at wide-field-of-view, low-resolution imagery of retinal neurons in which dendrites can be intertwined and difficult to follow. The approach is based on 3-D skeletonization and includes a method for automatically determining an appropriate global threshold as well as a soma detection algorithm. We provide the details of the algorithm and a qualitative performance comparison against a commercially available neurite tracing software package, showing that a segmentation produced by our method more closely matches the ground-truth segmentation.
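
    A rough stand-in for the described pipeline, global thresholding of the confocal stack followed by 3-D skeletonization and a crude soma estimate, can be assembled from standard scikit-image and SciPy primitives. This is only an approximation under those assumptions, not the authors' custom algorithm (which chooses the global threshold automatically and includes a dedicated soma detector):

    ```python
    import numpy as np
    from scipy import ndimage as ndi
    from skimage.filters import threshold_otsu
    from skimage.morphology import skeletonize

    def skeletonize_stack(stack: np.ndarray):
        """Threshold a confocal stack (z, y, x) and reduce the neuron to a 3-D skeleton."""
        binary = stack > threshold_otsu(stack)       # stand-in for the automatic global threshold
        binary = ndi.binary_opening(binary)          # light cleanup of speckle noise
        skeleton = skeletonize(binary)               # 3-D thinning for volumetric input (recent scikit-image)
        # Crude soma proxy: the thickest point of the foreground (largest distance to background).
        dist = ndi.distance_transform_edt(binary)
        soma_center = np.unravel_index(int(np.argmax(dist)), dist.shape)
        return skeleton, soma_center

    stack = np.random.default_rng(0).random((20, 128, 128))   # placeholder for a real confocal stack
    skeleton, soma_center = skeletonize_stack(stack)
    ```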

  18. Fully automated two-dimensional electrophoresis system for high-throughput protein analysis.

    PubMed

    Hiratsuka, Atsunori; Yokoyama, Kenji

    2009-01-01

    A fully automated two-dimensional electrophoresis (2DE) system for rapid and reproducible protein analysis is described. 2DE that is a combination of isoelectric focusing (IEF) and sodium dodecyl sulfate polyacrylamide gel electrophoresis (SDS-PAGE) is widely used for protein expression analysis. Here, all the operations are achieved in a shorter time and all the transferring procedures are performed automatically. The system completed the entire process within 1.5 h. A device configuration, operational procedure, and data analysis are described using this system.

  19. Artificial Golgi apparatus: globular protein-like dendrimer facilitates fully automated enzymatic glycan synthesis.

    PubMed

    Matsushita, Takahiko; Nagashima, Izuru; Fumoto, Masataka; Ohta, Takashi; Yamada, Kuriko; Shimizu, Hiroki; Hinou, Hiroshi; Naruchi, Kentaro; Ito, Takaomi; Kondo, Hirosato; Nishimura, Shin-Ichiro

    2010-11-24

    Despite the growing importance of synthetic glycans as tools for biological studies and drug discovery, a lack of common methods for the routine synthesis remains a major obstacle. We have developed a new method for automated glycan synthesis that employs the enzymatic approach and a dendrimer as an ideal support within the chemical process. Recovery tests using a hollow fiber ultrafiltration module have revealed that monodisperse G6 (MW = 58 kDa) and G7 (MW = 116 kDa) poly(amidoamine) dendrimers exhibit a similar profile to BSA (MW = 66 kDa). Characteristics of the globular protein-like G7 dendrimer with high solubility and low viscosity in water greatly enhanced throughput and efficiency in automated synthesis while random polyacrylamide-based supports entail significant loss during the repetitive reaction/separation step. The present protocol allowed for the fully automated enzymatic synthesis of sialyl Lewis X tetrasaccharide derivatives over a period of 4 days in 16% overall yield from a simple N-acetyl-d-glucosamine linked to an aminooxy-functionalized G7 dendrimer.

  20. [Locally Dynamically Moving Average Algorithm for the Fully Automated Baseline Correction of Raman Spectrum].

    PubMed

    Gao, Peng-fei; Yang, Rui; Ji, Jiang; Guo, Han-ming; Hu, Qi; Zhuang, Song-lin

    2015-05-01

    Baseline correction is an extremely important spectral preprocessing step that can significantly improve the accuracy of subsequent spectral analysis algorithms. At present, most baseline correction algorithms are manual or semi-automated. Manual baseline correction depends on user experience, and its accuracy is greatly affected by this subjective factor. Semi-automated baseline correction requires different optimization parameters for different Raman spectra, which is inconvenient for users. In this paper, a locally dynamically moving average algorithm (LDMA) for fully automated baseline correction is presented and its basic ideas and steps are demonstrated in detail. In the LDMA algorithm, a modified moving average algorithm (MMA) is used to strip the Raman peaks. By automatically finding the baseline subintervals of the raw Raman spectrum to divide the total spectral range into multiple Raman peak subintervals, the LDMA algorithm dynamically changes the window half-width of the moving average and controls the number of smoothing iterations in each Raman peak subinterval. Hence, overcorrection and under-correction are avoided to the greatest possible degree. The LDMA algorithm performs well not only on synthetic Raman spectra with convex, exponential, or sigmoidal baselines but also on real Raman spectra.
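
    The building block of the LDMA method, stripping peaks with an iterated moving average and subtracting the result as the baseline, can be shown in a few lines. The sketch below is a plain fixed-window version on a synthetic spectrum, without the local, dynamic window adaptation that gives LDMA its name:

    ```python
    import numpy as np

    def moving_average_baseline(spectrum: np.ndarray, half_width: int = 25, iterations: int = 50):
        """Estimate a Raman baseline by repeatedly smoothing and clipping peaks from above."""
        kernel = np.ones(2 * half_width + 1) / (2 * half_width + 1)
        baseline = spectrum.astype(float).copy()
        for _ in range(iterations):
            smoothed = np.convolve(baseline, kernel, mode="same")
            baseline = np.minimum(baseline, smoothed)   # peaks are eroded, the baseline is kept
        return baseline

    x = np.linspace(0.0, 1.0, 1000)
    raw = 5 * x + np.exp(-((x - 0.4) ** 2) / 1e-4) + np.exp(-((x - 0.7) ** 2) / 5e-5)
    corrected = raw - moving_average_baseline(raw)      # baseline-corrected synthetic spectrum
    ```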

  1. Fully automated two-dimensional electrophoresis system for high-throughput protein analysis.

    PubMed

    Hiratsuka, Atsunori; Kinoshita, Hideki; Maruo, Yuji; Takahashi, Katsuyoshi; Akutsu, Satonari; Hayashida, Chie; Sakairi, Koji; Usui, Keisuke; Shiseki, Kisho; Inamochi, Hajime; Nakada, Yoshiko; Yodoya, Kouhei; Namatame, Ichiji; Unuma, Yutaka; Nakamura, Makoto; Ueyama, Kosuke; Ishii, Yoshinori; Yano, Kazuyoshi; Yokoyama, Kenji

    2007-08-01

    We developed a fully automated electrophoresis system for rapid and highly reproducible protein analysis. All the two-dimensional (2D) electrophoresis procedures, including isoelectric focusing (IEF), on-part protein staining, sodium dodecyl sulfate polyacrylamide gel electrophoresis (SDS-PAGE), and in situ protein detection, were completed automatically. The system comprised Peltier devices, high-voltage generating devices, electrodes, and three disposable polymethylmethacrylate (PMMA) parts for IEF, reaction chambers, and SDS-PAGE. Because of miniaturization of the IEF part, rapid IEF was achieved in 30 min. A tapered-edge gel on the SDS-PAGE part realized a connection between the parts without the use of a gluing material. A biaxial conveyer was employed for part relocation, sample introduction, and washing processes to realize a low-maintenance and cost-effective automation system. The performance of the system was compared with that of a commercial minigel system in terms of the number, resolution, and reproducibility of the detected protein spots. The system achieved resolution comparable to the minigel system despite a shorter focusing time and smaller part dimensions. The resulting reproducibility was better than or comparable to that of the minigel system. Complete 2D separation was achieved within 1.5 h. The system is practical, portable, and has automation capabilities.

  2. Artificial Golgi apparatus: globular protein-like dendrimer facilitates fully automated enzymatic glycan synthesis.

    PubMed

    Matsushita, Takahiko; Nagashima, Izuru; Fumoto, Masataka; Ohta, Takashi; Yamada, Kuriko; Shimizu, Hiroki; Hinou, Hiroshi; Naruchi, Kentaro; Ito, Takaomi; Kondo, Hirosato; Nishimura, Shin-Ichiro

    2010-11-24

    Despite the growing importance of synthetic glycans as tools for biological studies and drug discovery, the lack of common methods for their routine synthesis remains a major obstacle. We have developed a new method for automated glycan synthesis that employs the enzymatic approach and a dendrimer as an ideal support within the chemical process. Recovery tests using a hollow fiber ultrafiltration module have revealed that monodisperse G6 (MW = 58 kDa) and G7 (MW = 116 kDa) poly(amidoamine) dendrimers exhibit a similar profile to BSA (MW = 66 kDa). The characteristics of the globular protein-like G7 dendrimer, with high solubility and low viscosity in water, greatly enhanced throughput and efficiency in automated synthesis, whereas random polyacrylamide-based supports entail significant losses during the repetitive reaction/separation steps. The present protocol allowed for the fully automated enzymatic synthesis of sialyl Lewis X tetrasaccharide derivatives over a period of 4 days in 16% overall yield from a simple N-acetyl-d-glucosamine linked to an aminooxy-functionalized G7 dendrimer. PMID:21033706

  3. Fully Automated Data Collection Using PAM and the Development of PAM/SPACE Reversible Cassettes

    NASA Astrophysics Data System (ADS)

    Hiraki, Masahiko; Watanabe, Shokei; Chavas, Leonard M. G.; Yamada, Yusuke; Matsugaki, Naohiro; Igarashi, Noriyuki; Wakatsuki, Soichi; Fujihashi, Masahiro; Miki, Kunio; Baba, Seiki; Ueno, Go; Yamamoto, Masaki; Suzuki, Mamoru; Nakagawa, Atsushi; Watanabe, Nobuhisa; Tanaka, Isao

    2010-06-01

    To remotely control and automatically collect data in high-throughput X-ray data collection experiments, the Structural Biology Research Center at the Photon Factory (PF) developed and installed sample exchange robots, PAM (PF Automated Mounting system), at the PF macromolecular crystallography beamlines BL-5A, BL-17A, AR-NW12A and AR-NE3A. We developed and installed software that manages the flow of the automated X-ray experiments: sample exchange, loop centering and X-ray diffraction data collection. The fully automated data collection function has been available since February 2009. To identify sample cassettes, PAM employs a two-dimensional bar code reader. New beamlines, BL-1A at the Photon Factory and BL32XU at SPring-8, are currently under construction as part of the Targeted Proteins Research Program (TPRP) by the Ministry of Education, Culture, Sports, Science and Technology of Japan. However, different robots, PAM and SPACE (SPring-8 Precise Automatic Cryo-sample Exchanger), will be installed at BL-1A and BL32XU, respectively. For the convenience of the users of both facilities, pins and cassettes compatible with both PAM and SPACE are being developed as part of the TPRP.

  4. Fully automated segmentation of left ventricle using dual dynamic programming in cardiac cine MR images

    NASA Astrophysics Data System (ADS)

    Jiang, Luan; Ling, Shan; Li, Qiang

    2016-03-01

    Cardiovascular diseases are becoming a leading cause of death all over the world. Cardiac function can be evaluated by global and regional parameters of the left ventricle (LV) of the heart. The purpose of this study is to develop and evaluate a fully automated scheme for segmentation of the LV in short-axis cardiac cine MR images. Our fully automated method consists of three major steps, i.e., LV localization, LV segmentation at the end-diastolic phase, and propagation of the LV segmentation to the other phases. First, the maximum intensity projection image along the time phases of the midventricular slice, located at the center of the image, was calculated to locate the region of interest of the LV. Based on the mean intensity of the roughly segmented blood pool in the midventricular slice at each phase, the end-diastolic (ED) and end-systolic (ES) phases were determined. Second, the endocardial and epicardial boundaries of the LV of each slice at the ED phase were synchronously delineated by use of a dual dynamic programming technique. The external costs of the endocardial and epicardial boundaries were defined with the gradient values obtained from the original and enhanced images, respectively. Finally, taking advantage of the continuity of the LV boundaries across adjacent phases, we propagated the LV segmentation from the ED phase to the other phases by use of the dual dynamic programming technique. The preliminary results on 9 clinical cardiac cine MR cases show that the proposed method can obtain accurate segmentation of the LV based on subjective evaluation.
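
    As an illustration of the boundary-delineation step, the sketch below traces a single minimal-cost contour on a polar cost map (rows are candidate radii, columns are angles) by dynamic programming; the cost would typically be a negative gradient magnitude so that strong edges are cheap. The study's dual dynamic programming couples the endocardial and epicardial contours, which this simplified single-contour sketch does not attempt, and all arrays and the smoothness constraint are illustrative assumptions.

        import numpy as np

        def dp_contour(cost, max_step=2):
            """For each angle (column), return the radius index of the minimal-cost
            path whose radius changes by at most max_step between columns."""
            n_r, n_theta = cost.shape
            acc = np.full((n_r, n_theta), np.inf)
            back = np.zeros((n_r, n_theta), dtype=int)
            acc[:, 0] = cost[:, 0]
            for t in range(1, n_theta):
                for r in range(n_r):
                    lo, hi = max(0, r - max_step), min(n_r, r + max_step + 1)
                    prev = acc[lo:hi, t - 1]
                    k = int(np.argmin(prev))
                    acc[r, t] = cost[r, t] + prev[k]
                    back[r, t] = lo + k
            path = np.zeros(n_theta, dtype=int)
            path[-1] = int(np.argmin(acc[:, -1]))   # cheapest end point
            for t in range(n_theta - 1, 0, -1):     # backtrack
                path[t - 1] = back[path[t], t]
            return path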

  5. Designs and concept reliance of a fully automated high-content screening platform.

    PubMed

    Radu, Constantin; Adrar, Hosna Sana; Alamir, Ab; Hatherley, Ian; Trinh, Trung; Djaballah, Hakim

    2012-10-01

    High-content screening (HCS) is becoming an accepted platform in academic and industry screening labs and does require slightly different logistics for execution. To automate our stand-alone HCS microscopes, namely, an alpha IN Cell Analyzer 3000 (INCA3000), originally a Praelux unit hooked to a Hudson Plate Crane with a maximum capacity of 50 plates per run, and the IN Cell Analyzer 2000 (INCA2000), in which up to 320 plates could be fed per run using the Thermo Fisher Scientific Orbitor, we opted for a 4 m linear track system harboring both microscopes, plate washer, bulk dispensers, and a high-capacity incubator allowing us to perform both live and fixed cell-based assays while accessing both microscopes on deck. Considerations in design were given to the integration of the alpha INCA3000, a new gripper concept to access the onboard nest, and peripheral locations on deck to ensure a self-reliant system capable of achieving higher throughput. The resulting system, referred to as Hestia, has been fully operational since the new year, has an onboard capacity of 504 plates, and harbors the only fully automated alpha INCA3000 unit in the world.

  6. MAGNETIC RESONANCE IMAGING COMPATIBLE ROBOTIC SYSTEM FOR FULLY AUTOMATED BRACHYTHERAPY SEED PLACEMENT

    PubMed Central

    Muntener, Michael; Patriciu, Alexandru; Petrisor, Doru; Mazilu, Dumitru; Bagga, Herman; Kavoussi, Louis; Cleary, Kevin; Stoianovici, Dan

    2011-01-01

    Objectives To introduce the development of the first magnetic resonance imaging (MRI)-compatible robotic system capable of automated brachytherapy seed placement. Methods An MRI-compatible robotic system was conceptualized and manufactured. The entire robot was built of nonmagnetic and dielectric materials. The key technology of the system is a unique pneumatic motor that was specifically developed for this application. Various preclinical experiments were performed to test the robot for precision and imager compatibility. Results The robot was fully operational within all closed-bore MRI scanners. Compatibility tests in scanners of up to 7 Tesla field intensity showed no interference of the robot with the imager. Precision tests in tissue mockups yielded a mean seed placement error of 0.72 ± 0.36 mm. Conclusions The robotic system is fully MRI compatible. The new technology allows for automated and highly accurate operation within MRI scanners and does not deteriorate the MRI quality. We believe that this robot may become a useful instrument for image-guided prostate interventions. PMID:17169653

  7. A fully automated IIF system for the detection of antinuclear antibodies and antineutrophil cytoplasmic antibodies.

    PubMed

    Shovman, O; Agmon-Levin, N; Gilburd, B; Martins, T; Petzold, A; Matthias, T; Shoenfeld, Y

    2015-02-01

    Indirect immunofluorescence (IIF) is the main technique for the detection of antinuclear antibodies (ANA) and antineutrophil cytoplasmic antibodies (ANCA). The fully automated IIF processor HELIOS(®) is the first IIF processor that is able to automatically prepare slides and perform automatic reading. The objective of the present study was to determine the diagnostic performance of this system for ANA and ANCA IIF interpretation, in comparison with visual IIF. ANA detection by visual IIF or HELIOS(®) was performed on 425 sera samples including: 218 consecutive samples submitted to a reference laboratory for routine ANA testing, 137 samples from healthy subjects and 70 ANA/ENA positive samples. For ANCA determination, 170 sera samples were collected: 40 samples for routine testing, 90 samples from healthy blood donors and 40 anti-PR3/anti-MPO positive subjects. Good correlation was found for the visual and automated ANA IIF approach regarding positive/negative discrimination of these samples (kappa = 0.633 for ANA positive samples and kappa = 0.657 for ANA negative samples, respectively). Positive/negative IIF ANCA discrimination by HELIOS(®) and visual IIF revealed a complete agreement of 100% in sera from healthy patients and PR3/MPO positive samples (kappa = 1.00). There was 95% agreement between the ANCA IIF performed by automated and visual IIF on the investigation of routine samples. Based on these results, HELIOS(®) demonstrated a high diagnostic performance for the automated ANA and ANCA IIF interpretation that was similar to a visual reading in all groups of samples.
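
    The agreement statistic quoted above is Cohen's kappa for binary positive/negative calls; a minimal sketch of its computation is given below. The visual and automated labels are made-up toy data, not values from the study.

        import numpy as np

        def cohens_kappa(a, b):
            """Kappa = (observed agreement - chance agreement) / (1 - chance)."""
            a, b = np.asarray(a), np.asarray(b)
            observed = np.mean(a == b)
            p_a, p_b = a.mean(), b.mean()                    # marginal positive rates
            expected = p_a * p_b + (1 - p_a) * (1 - p_b)     # chance agreement
            return (observed - expected) / (1 - expected)

        visual    = [1, 1, 0, 0, 1, 0, 1, 0, 0, 1]
        automated = [1, 0, 0, 0, 1, 0, 1, 0, 1, 1]
        print(round(cohens_kappa(visual, automated), 3))     # 0.6 for this toy data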

  8. Development and evaluation of fully automated demand response in large facilities

    SciTech Connect

    Piette, Mary Ann; Sezgen, Osman; Watson, David S.; Motegi, Naoya; Shockman, Christine; ten Hope, Laurie

    2004-03-30

    This report describes the results of a research project to develop and evaluate the performance of new Automated Demand Response (Auto-DR) hardware and software technology in large facilities. Demand Response (DR) is a set of activities to reduce or shift electricity use to improve electric grid reliability, manage electricity costs, and ensure that customers receive signals that encourage load reduction during times when the electric grid is near its capacity. The two main drivers for widespread demand responsiveness are the prevention of future electricity crises and the reduction of electricity prices. Additional goals for price responsiveness include equity through cost-of-service pricing, and customer control of electricity usage and bills. The technology developed and evaluated in this report could be used to support numerous forms of DR programs and tariffs. For the purpose of this report, we have defined three levels of Demand Response automation. Manual Demand Response involves manually turning off lights or equipment; this can be a labor-intensive approach. Semi-Automated Demand Response involves the use of building energy management control systems for load shedding, where a preprogrammed load shedding strategy is initiated by facilities staff. Fully-Automated Demand Response is initiated at a building or facility through receipt of an external communications signal: facility staff set up a pre-programmed load shedding strategy which is automatically initiated by the system without the need for human intervention. We have defined this approach to be Auto-DR. An important concept in Auto-DR is that a facility manager is able to "opt out" or "override" an individual DR event if it occurs at a time when the reduction in end-use services is not desirable. This project sought to improve the feasibility and nature of Auto-DR strategies in large facilities. The research focused on technology development, testing, characterization, and evaluation relating to Auto-DR.
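
    Conceptually, an Auto-DR client does little more than poll an external signal source and trigger a pre-programmed shed strategy unless the facility manager has opted out, as in the hedged Python sketch below. The endpoint URL, payload fields and shed actions are hypothetical placeholders, not part of any protocol described in the report.

        import json
        import time
        import urllib.request

        SIGNAL_URL = "https://example.org/dr-signal"   # hypothetical endpoint
        OPT_OUT = False                                # facility manager override

        def shed_load():
            """Pre-programmed strategy, e.g. raise zone setpoints, dim lighting."""
            print("DR event active: executing pre-programmed load shed")

        def poll_once():
            with urllib.request.urlopen(SIGNAL_URL, timeout=10) as resp:
                signal = json.load(resp)
            if signal.get("event_active") and not OPT_OUT:
                shed_load()

        # while True: poll_once(); time.sleep(300)     # poll every five minutes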

  9. AutoDrug: fully automated macromolecular crystallography workflows for fragment-based drug discovery

    SciTech Connect

    Tsai, Yingssu; McPhillips, Scott E.; González, Ana; McPhillips, Timothy M.; Zinn, Daniel; Cohen, Aina E.; Feese, Michael D.; Bushnell, David; Tiefenbrunn, Theresa; Stout, C. David; Ludaescher, Bertram; Hedman, Britt; Hodgson, Keith O.; Soltis, S. Michael

    2013-05-01

    New software has been developed for automating the experimental and data-processing stages of fragment-based drug discovery at a macromolecular crystallography beamline. A new workflow-automation framework orchestrates beamline-control and data-analysis software while organizing results from multiple samples. AutoDrug is software based upon the scientific workflow paradigm that integrates the Stanford Synchrotron Radiation Lightsource macromolecular crystallography beamlines and third-party processing software to automate the crystallography steps of the fragment-based drug-discovery process. AutoDrug screens a cassette of fragment-soaked crystals, selects crystals for data collection based on screening results and user-specified criteria and determines optimal data-collection strategies. It then collects and processes diffraction data, performs molecular replacement using provided models and detects electron density that is likely to arise from bound fragments. All processes are fully automated, i.e. are performed without user interaction or supervision. Samples can be screened in groups corresponding to particular proteins, crystal forms and/or soaking conditions. A single AutoDrug run is only limited by the capacity of the sample-storage dewar at the beamline: currently 288 samples. AutoDrug was developed in conjunction with RestFlow, a new scientific workflow-automation framework. RestFlow simplifies the design of AutoDrug by managing the flow of data and the organization of results and by orchestrating the execution of computational pipeline steps. It also simplifies the execution and interaction of third-party programs and the beamline-control system. Modeling AutoDrug as a scientific workflow enables multiple variants that meet the requirements of different user groups to be developed and supported. A workflow tailored to mimic the crystallography stages comprising the drug-discovery pipeline of CoCrystal Discovery Inc. has been deployed and successfully demonstrated.

  10. Fully automated multifunctional ultrahigh pressure liquid chromatography system for advanced proteome analyses

    SciTech Connect

    Lee, Jung Hwa; Hyung, Seok-Won; Mun, Dong-Gi; Jung, Hee-Jung; Kim, Hokeun; Lee, Hangyeore; Kim, Su-Jin; Park, Kyong Soo; Moore, Ronald J.; Smith, Richard D.; Lee, Sang-Won

    2012-08-03

    A multi-functional liquid chromatography system that performs 1-dimensional and 2-dimensional (strong cation exchange/reverse phase liquid chromatography, or SCX/RPLC) separations, and online phosphopeptide enrichment, using a single binary nano-flow pump has been developed. With a simple operation of a function selection valve, which is equipped with a SCX column and a TiO2 (titanium dioxide) column, a fully automated selection of three different experiment modes was achieved. Because the current system uses essentially the same solvent flow paths, the same trap column, and the same separation column for reverse-phase separation of 1D, 2D, and online phosphopeptide enrichment experiments, the elution time information obtained from these experiments is in excellent agreement, which facilitates correlating peptide information from different experiments.

  11. A fully automated in vitro diagnostic system based on magnetic tunnel junction arrays and superparamagnetic particles

    NASA Astrophysics Data System (ADS)

    Lian, Jie; Chen, Si; Qiu, Yuqin; Zhang, Suohui; Shi, Stone; Gao, Yunhua

    2012-04-01

    A fully automated in vitro diagnostic (IVD) system for diagnosing acute myocardial infarction was developed using high-sensitivity MTJ arrays as sensors and nano-magnetic particles as tags. On the chip is an array of 12 × 10^6 MTJ devices integrated onto a 3-metal-layer CMOS circuit. The array is divided into 48 detection areas, so 48 different types of bio targets can be analyzed simultaneously if needed. The chip is assembled with a micro-fluidic cartridge which contains all the reagents necessary for completing the assaying process. Integrated with electrical, mechanical and micro-fluidic pumping devices and with the reaction protocol programmed in a microprocessor, the system only requires a simple one-step analyte application procedure to operate and yields results for the three major AMI bio-markers (cTnI, MYO, CK-MB) in 15 min.

  12. Broadband fully automated digitally assisted coaxial bridge for high accuracy impedance ratio measurements

    NASA Astrophysics Data System (ADS)

    Overney, Frédéric; Lüönd, Felix; Jeanneret, Blaise

    2016-06-01

    This paper describes the principle of a new fully automated digitally assisted coaxial bridge having a large bandwidth ranging from 60 Hz to 50 kHz. The performance of the bridge is evaluated by making 1:1 comparisons between calculable ac resistors. The agreement between the calculated and the measured frequency dependence of the resistors is better than 5 × 10^-8 at frequencies up to 5 kHz, better than 1 × 10^-7 up to 20 kHz and better than 0.8 × 10^-6 up to 50 kHz. This bridge is particularly well suited to investigate the ac transport properties of graphene in the quantum Hall regime.

  13. Development of fully automated quantitative capillary electrophoresis with high accuracy and repeatability.

    PubMed

    Xu, Yuan; Ling, Bang-Zan; Zhu, Wen-Jun; Yao, Dong; Zhang, Lin; Wang, Yan; Yan, Chao

    2016-03-01

    A quantitative capillary electrophoresis (qCE) system was developed by utilizing a rotary-type nano-volume injector, an autosampler, and a thermostat with cooling capacity. The accuracy and precision were greatly improved compared with conventional capillary electrophoresis. The 10 nL volume accuracy was guaranteed by the carefully designed nano-injector with an accurate internal loop. System repeatability (precision), in terms of RSD, of <0.5% for migration time and 1% for peak area was achieved using DMSO as a test sample. We believe that this fully automated qCE system has the potential to be employed broadly in quality control and quality assurance in the pharmaceutical industry.

  14. AutoDrug: fully automated macromolecular crystallography workflows for fragment-based drug discovery

    PubMed Central

    Tsai, Yingssu; McPhillips, Scott E.; González, Ana; McPhillips, Timothy M.; Zinn, Daniel; Cohen, Aina E.; Feese, Michael D.; Bushnell, David; Tiefenbrunn, Theresa; Stout, C. David; Ludaescher, Bertram; Hedman, Britt; Hodgson, Keith O.; Soltis, S. Michael

    2013-01-01

    AutoDrug is software based upon the scientific workflow paradigm that integrates the Stanford Synchrotron Radiation Lightsource macromolecular crystallography beamlines and third-party processing software to automate the crystallography steps of the fragment-based drug-discovery process. AutoDrug screens a cassette of fragment-soaked crystals, selects crystals for data collection based on screening results and user-specified criteria and determines optimal data-collection strategies. It then collects and processes diffraction data, performs molecular replacement using provided models and detects electron density that is likely to arise from bound fragments. All processes are fully automated, i.e. are performed without user interaction or supervision. Samples can be screened in groups corresponding to particular proteins, crystal forms and/or soaking conditions. A single AutoDrug run is only limited by the capacity of the sample-storage dewar at the beamline: currently 288 samples. AutoDrug was developed in conjunction with RestFlow, a new scientific workflow-automation framework. RestFlow simplifies the design of AutoDrug by managing the flow of data and the organization of results and by orchestrating the execution of computational pipeline steps. It also simplifies the execution and interaction of third-party programs and the beamline-control system. Modeling AutoDrug as a scientific workflow enables multiple variants that meet the requirements of different user groups to be developed and supported. A workflow tailored to mimic the crystallography stages comprising the drug-discovery pipeline of CoCrystal Discovery Inc. has been deployed and successfully demonstrated. This workflow was run once on the same 96 samples that the group had examined manually and the workflow cycled successfully through all of the samples, collected data from the same samples that were selected manually and located the same peaks of unmodeled density in the resulting difference

  15. A fully automated system for quantification of background parenchymal enhancement in breast DCE-MRI

    NASA Astrophysics Data System (ADS)

    Ufuk Dalmiş, Mehmet; Gubern-Mérida, Albert; Borelli, Cristina; Vreemann, Suzan; Mann, Ritse M.; Karssemeijer, Nico

    2016-03-01

    Background parenchymal enhancement (BPE) observed in breast dynamic contrast enhanced magnetic resonance imaging (DCE-MRI) has been identified as an important biomarker associated with risk for developing breast cancer. In this study, we present a fully automated framework for quantification of BPE. We initially segmented fibroglandular tissue (FGT) of the breasts using an improved version of an existing method. Subsequently, we computed BPEabs (volume of the enhancing tissue), BPErf (BPEabs divided by FGT volume) and BPErb (BPEabs divided by breast volume), using different relative enhancement threshold values between 1% and 100%. To evaluate and compare the previous and improved FGT segmentation methods, we used 20 breast DCE-MRI scans and we computed Dice similarity coefficient (DSC) values with respect to manual segmentations. For evaluation of the BPE quantification, we used a dataset of 95 breast DCE-MRI scans. Two radiologists, in individual reading sessions, visually analyzed the dataset and categorized each breast into minimal, mild, moderate and marked BPE. To measure the correlation between automated BPE values and the radiologists' assessments, we converted these values into ordinal categories and we used Spearman's rho as a measure of correlation. According to our results, the new segmentation method obtained an average DSC of 0.81 ± 0.09, which was significantly higher (p<0.001) compared to the previous method (0.76 ± 0.10). The highest correlation values between automated BPE categories and radiologists' assessments were obtained with the BPErf measurement (r=0.55, r=0.49, p<0.001 for both), while the correlation between the scores given by the two radiologists was 0.82 (p<0.001). The presented framework can be used to systematically investigate the correlation between BPE and risk in large screening cohorts.
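
    The three BPE measures defined above reduce to simple ratios of voxel counts once the FGT and breast masks and a relative-enhancement map are available; a minimal sketch follows. Array names, the threshold and the voxel volume are illustrative assumptions, not values from the study.

        import numpy as np

        def bpe_measures(rel_enh, fgt_mask, breast_mask, threshold=0.2, voxel_ml=0.001):
            """rel_enh: per-voxel relative enhancement, e.g. (post - pre) / pre;
            fgt_mask, breast_mask: boolean arrays of the same shape."""
            enhancing = fgt_mask & (rel_enh >= threshold)
            bpe_abs = enhancing.sum() * voxel_ml                   # enhancing volume (ml)
            fgt_vol = fgt_mask.sum() * voxel_ml
            breast_vol = breast_mask.sum() * voxel_ml
            bpe_rf = bpe_abs / fgt_vol if fgt_vol else 0.0         # relative to FGT volume
            bpe_rb = bpe_abs / breast_vol if breast_vol else 0.0   # relative to breast volume
            return bpe_abs, bpe_rf, bpe_rb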

  16. Toxicity assessment of ionic liquids with Vibrio fischeri: an alternative fully automated methodology.

    PubMed

    Costa, Susana P F; Pinto, Paula C A G; Lapa, Rui A S; Saraiva, M Lúcia M F S

    2015-03-01

    A fully automated Vibrio fischeri methodology based on sequential injection analysis (SIA) has been developed. The methodology was based on the aspiration of 75 μL of bacteria and 50 μL of inhibitor followed by measurement of the luminescence of bacteria. The assays were conducted for contact times of 5, 15, and 30 min, by means of three mixing chambers that ensured adequate mixing conditions. The optimized methodology provided a precise control of the reaction conditions which is an asset for the analysis of a large number of samples. The developed methodology was applied to the evaluation of the impact of a set of ionic liquids (ILs) on V. fischeri and the results were compared with those provided by a conventional assay kit (Biotox(®)). The collected data evidenced the influence of different cation head groups and anion moieties on the toxicity of ILs. Generally, aromatic cations and fluorine-containing anions displayed higher impact on V. fischeri, evidenced by lower EC50. The proposed methodology was validated through statistical analysis which demonstrated a strong positive correlation (P>0.98) between assays. It is expected that the automated methodology can be tested for more classes of compounds and used as alternative to microplate based V. fischeri assay kits.
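
    EC50 values such as those used above to rank ionic-liquid toxicity are typically obtained by fitting a dose-response curve to the measured luminescence inhibition; a minimal sketch with a four-parameter logistic model is shown below. The concentrations and responses are made-up illustrative values, and the model choice is an assumption rather than the method used in the study.

        import numpy as np
        from scipy.optimize import curve_fit

        def four_pl(conc, bottom, top, ec50, hill):
            """Four-parameter logistic dose-response model."""
            return bottom + (top - bottom) / (1.0 + (conc / ec50) ** hill)

        conc = np.array([0.1, 0.3, 1, 3, 10, 30, 100])     # e.g. mg/L
        resp = np.array([98, 95, 85, 60, 30, 10, 4])       # % luminescence remaining
        params, _ = curve_fit(four_pl, conc, resp, p0=[0, 100, 5, 1], maxfev=10000)
        print(f"EC50 estimate: {params[2]:.2f} mg/L")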

  17. DeepPicker: A deep learning approach for fully automated particle picking in cryo-EM.

    PubMed

    Wang, Feng; Gong, Huichao; Liu, Gaochao; Li, Meijing; Yan, Chuangye; Xia, Tian; Li, Xueming; Zeng, Jianyang

    2016-09-01

    Particle picking is a time-consuming step in single-particle analysis and often requires significant interventions from users, which has become a bottleneck for future automated electron cryo-microscopy (cryo-EM). Here we report a deep learning framework, called DeepPicker, to address this problem and fill the current gaps toward a fully automated cryo-EM pipeline. DeepPicker employs a novel cross-molecule training strategy to capture common features of particles from previously-analyzed micrographs, and thus does not require any human intervention during particle picking. Tests on the recently-published cryo-EM data of three complexes have demonstrated that our deep learning based scheme can successfully accomplish the human-level particle picking process and identify a sufficient number of particles that are comparable to those picked manually by human experts. These results indicate that DeepPicker can provide a practically useful tool to significantly reduce the time and manual effort spent in single-particle analysis and thus greatly facilitate high-resolution cryo-EM structure determination. DeepPicker is released as an open-source program, which can be downloaded from https://github.com/nejyeah/DeepPicker-python. PMID:27424268

  18. DeepPicker: A deep learning approach for fully automated particle picking in cryo-EM.

    PubMed

    Wang, Feng; Gong, Huichao; Liu, Gaochao; Li, Meijing; Yan, Chuangye; Xia, Tian; Li, Xueming; Zeng, Jianyang

    2016-09-01

    Particle picking is a time-consuming step in single-particle analysis and often requires significant interventions from users, which has become a bottleneck for future automated electron cryo-microscopy (cryo-EM). Here we report a deep learning framework, called DeepPicker, to address this problem and fill the current gaps toward a fully automated cryo-EM pipeline. DeepPicker employs a novel cross-molecule training strategy to capture common features of particles from previously-analyzed micrographs, and thus does not require any human intervention during particle picking. Tests on the recently-published cryo-EM data of three complexes have demonstrated that our deep learning based scheme can successfully accomplish the human-level particle picking process and identify a sufficient number of particles that are comparable to those picked manually by human experts. These results indicate that DeepPicker can provide a practically useful tool to significantly reduce the time and manual effort spent in single-particle analysis and thus greatly facilitate high-resolution cryo-EM structure determination. DeepPicker is released as an open-source program, which can be downloaded from https://github.com/nejyeah/DeepPicker-python.

  19. Fully automated and colorimetric foodborne pathogen detection on an integrated centrifugal microfluidic device.

    PubMed

    Oh, Seung Jun; Park, Byung Hyun; Choi, Goro; Seo, Ji Hyun; Jung, Jae Hwan; Choi, Jong Seob; Kim, Do Hyun; Seo, Tae Seok

    2016-05-21

    This work describes fully automated and colorimetric foodborne pathogen detection on an integrated centrifugal microfluidic device, which is called a lab-on-a-disc. All the processes for molecular diagnostics including DNA extraction and purification, DNA amplification and amplicon detection were integrated on a single disc. Silica microbeads incorporated in the disc enabled extraction and purification of bacterial genomic DNA from bacteria-contaminated milk samples. We targeted four kinds of foodborne pathogens (Escherichia coli O157:H7, Salmonella typhimurium, Vibrio parahaemolyticus and Listeria monocytogenes) and performed loop-mediated isothermal amplification (LAMP) to amplify the specific genes of the targets. Colorimetric detection mediated by a metal indicator confirmed the results of the LAMP reactions with the colour change of the LAMP mixtures from purple to sky blue. The whole process was conducted in an automated manner using the lab-on-a-disc and a miniaturized rotary instrument equipped with three heating blocks. We demonstrated that a milk sample contaminated with foodborne pathogens can be automatically analysed on the centrifugal disc even at the 10 bacterial cell level in 65 min. The simplicity and portability of the proposed microdevice would provide an advanced platform for point-of-care diagnostics of foodborne pathogens, where prompt confirmation of food quality is needed. PMID:27112702

  20. High‐throughput automated scoring of Ki67 in breast cancer tissue microarrays from the Breast Cancer Association Consortium

    PubMed Central

    Howat, William J; Daley, Frances; Zabaglo, Lila; McDuffus, Leigh‐Anne; Blows, Fiona; Coulson, Penny; Raza Ali, H; Benitez, Javier; Milne, Roger; Brenner, Herman; Stegmaier, Christa; Mannermaa, Arto; Chang‐Claude, Jenny; Rudolph, Anja; Sinn, Peter; Couch, Fergus J; Tollenaar, Rob A.E.M.; Devilee, Peter; Figueroa, Jonine; Sherman, Mark E; Lissowska, Jolanta; Hewitt, Stephen; Eccles, Diana; Hooning, Maartje J; Hollestelle, Antoinette; WM Martens, John; HM van Deurzen, Carolien; Investigators, kConFab; Bolla, Manjeet K; Wang, Qin; Jones, Michael; Schoemaker, Minouk; Broeks, Annegien; van Leeuwen, Flora E; Van't Veer, Laura; Swerdlow, Anthony J; Orr, Nick; Dowsett, Mitch; Easton, Douglas; Schmidt, Marjanka K; Pharoah, Paul D; Garcia‐Closas, Montserrat

    2016-01-01

    Abstract Automated methods are needed to facilitate high‐throughput and reproducible scoring of Ki67 and other markers in breast cancer tissue microarrays (TMAs) in large‐scale studies. To address this need, we developed an automated protocol for Ki67 scoring and evaluated its performance in studies from the Breast Cancer Association Consortium. We utilized 166 TMAs containing 16,953 tumour cores representing 9,059 breast cancer cases, from 13 studies, with information on other clinical and pathological characteristics. TMAs were stained for Ki67 using standard immunohistochemical procedures, and scanned and digitized using the Ariol system. An automated algorithm was developed for the scoring of Ki67, and scores were compared to computer assisted visual (CAV) scores in a subset of 15 TMAs in a training set. We also assessed the correlation between automated Ki67 scores and other clinical and pathological characteristics. Overall, we observed good discriminatory accuracy (AUC = 85%) and good agreement (kappa = 0.64) between the automated and CAV scoring methods in the training set. The performance of the automated method varied by TMA (kappa range= 0.37–0.87) and study (kappa range = 0.39–0.69). The automated method performed better in satisfactory cores (kappa = 0.68) than suboptimal (kappa = 0.51) cores (p‐value for comparison = 0.005); and among cores with higher total nuclei counted by the machine (4,000–4,500 cells: kappa = 0.78) than those with lower counts (50–500 cells: kappa = 0.41; p‐value = 0.010). Among the 9,059 cases in this study, the correlations between automated Ki67 and clinical and pathological characteristics were found to be in the expected directions. Our findings indicate that automated scoring of Ki67 can be an efficient method to obtain good quality data across large numbers of TMAs from multicentre studies. However, robust algorithm development and rigorous pre‐ and post

  1. Fully Automated Segmentation of the Pons and Midbrain Using Human T1 MR Brain Images

    PubMed Central

    Nigro, Salvatore; Cerasa, Antonio; Zito, Giancarlo; Perrotta, Paolo; Chiaravalloti, Francesco; Donzuso, Giulia; Fera, Franceso; Bilotta, Eleonora; Pantano, Pietro; Quattrone, Aldo

    2014-01-01

    Purpose This paper describes a novel method to automatically segment the human brainstem into midbrain and pons, called LABS: Landmark-based Automated Brainstem Segmentation. LABS processes high-resolution structural magnetic resonance images (MRIs) according to a revised landmark-based approach integrated with a thresholding method, without manual interaction. Methods This method was first tested on morphological T1-weighted MRIs of 30 healthy subjects. Its reliability was further confirmed by including neurological patients (with Alzheimer's Disease) from the ADNI repository, in whom the presence of volumetric loss within the brainstem had been previously described. Segmentation accuracies were evaluated against expert-drawn manual delineation. To evaluate the quality of LABS segmentation we used volumetric, spatial overlap and distance-based metrics. Results The comparison of the quantitative measurements provided by LABS against manual segmentations revealed excellent results in healthy controls when considering either the midbrain (DICE measures higher than 0.9; Volume ratio around 1 and Hausdorff distance around 3) or the pons (DICE measures around 0.93; Volume ratio ranging from 1.024 to 1.05 and Hausdorff distance around 2). Similar performances were detected for AD patients considering segmentation of the pons (DICE measures higher than 0.93; Volume ratio ranging from 0.97 to 0.98 and Hausdorff distance ranging from 1.07 to 1.33), while LABS performed lower for the midbrain (DICE measures ranging from 0.86 to 0.88; Volume ratio around 0.95 and Hausdorff distance ranging from 1.71 to 2.15). Conclusions Our study represents the first attempt to validate a new fully automated method for in vivo segmentation of two anatomically complex brainstem subregions. We believe that our method might represent a useful tool for future applications in clinical practice. PMID:24489664
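
    The two accuracy metrics quoted above, the Dice coefficient and the Hausdorff distance, can be computed for a pair of binary masks as sketched below; the implementation uses SciPy's Euclidean distance transform and is an illustrative assumption, not the evaluation code used in the study.

        import numpy as np
        from scipy.ndimage import distance_transform_edt

        def dice(a, b):
            """Dice overlap between two binary masks."""
            a, b = a.astype(bool), b.astype(bool)
            denom = a.sum() + b.sum()
            return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

        def hausdorff(a, b):
            """Symmetric Hausdorff distance (in voxel units) between two masks."""
            a, b = a.astype(bool), b.astype(bool)
            dist_to_b = distance_transform_edt(~b)   # distance of each voxel to mask b
            dist_to_a = distance_transform_edt(~a)
            return max(dist_to_b[a].max(), dist_to_a[b].max())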

  2. Towards fully automated processing of VLBI sessions - results from ultra-rapid UT1 experiments

    NASA Astrophysics Data System (ADS)

    Hobiger, T.; Sekido, M.; Koyama, Y.; Kondo, T.; Takiguchi, H.; Kurihara, S.; Kokado, K.; Nozawa, K.; Haas, R.; Otsubo, T.; Gotoh, T.; Kubo-Oka, T.

    2010-12-01

    Like other space geodetic techniques, VLBI requires some human interaction in data processing and analysis before the target parameters are available to the scientific community. If the processing chain can be completely automated, results become available independently of time zones, holidays or illness of the analyst. In VLBI, a lot of effort is put into near real-time monitoring of Earth orientation parameters, especially UT1. Since VLBI is the only space-geodetic technique which gives direct access to the Earth's phase of rotation, i.e. universal time UT1, a low-latency product is desirable. Besides multi-baseline sessions, regular single-baseline VLBI experiments are scheduled in order to provide estimates of UT1 for the international science community. Although the turn-around time of such sessions is usually much shorter and results are available within one day after the data were recorded, lower latency of UT1 results is still requested. Based on the experience gained over the last three years, an automated processing chain was established. In July 2010, we started to provide automatically processed results to the IERS rapid service, and thus fully unattended operation and robust estimation of UT1 have become routine. New analysis software ensures that all post-processing stages run smoothly and a variety of scripts guarantee that the data flow to and through the correlator takes full advantage of the available resources. The concept of ultra-rapid VLBI sessions can be extended further to include additional, geometrically well-distributed stations, in order to also derive polar motion components with the same latency as UT1 and to provide an up-to-date complete set of Earth orientation parameters for navigation of space and satellite missions. Moreover, our work demonstrates how future VLBI networks can be processed automatically in order to provide near real-time information about the instantaneous Earth orientation in the framework of GGOS.

  3. Difference Tracker: ImageJ plugins for fully automated analysis of multiple axonal transport parameters.

    PubMed

    Andrews, Simon; Gilley, Jonathan; Coleman, Michael P

    2010-11-30

    Studies of axonal transport are critical, not only to understand its normal regulation, but also to determine the roles of transport impairment in disease. Exciting new resources have recently become available allowing live imaging of axonal transport in physiologically relevant settings, such as mammalian nerves. Thus the effects of disease, ageing and therapies can now be assessed directly in nervous system tissue. However, these imaging studies present new challenges. Manual or semi-automated analysis of the range of transport parameters required for a suitably complete evaluation is very time-consuming and can be subjective due to the complexity of the particle movements in axons in ex vivo explants or in vivo. We have developed Difference Tracker, a program combining two new plugins for the ImageJ image-analysis freeware, to provide fast, fully automated and objective analysis of a number of relevant measures of trafficking of fluorescently labeled particles so that axonal transport in different situations can be easily compared. We confirm that Difference Tracker can accurately track moving particles in highly simplified, artificial simulations. It can also identify and track multiple motile fluorescently labeled mitochondria simultaneously in time-lapse image stacks from live imaging of tibial nerve axons, reporting values for a number of parameters that are comparable to those obtained through manual analysis of the same axons. Difference Tracker therefore represents a useful free resource for the comparative analysis of axonal transport under different conditions, and could potentially be used and developed further in many other studies requiring quantification of particle movements.

  4. A Fully Automated Diabetes Prevention Program, Alive-PD: Program Design and Randomized Controlled Trial Protocol

    PubMed Central

    Azar, Kristen MJ; Block, Torin J; Romanelli, Robert J; Carpenter, Heather; Hopkins, Donald; Palaniappan, Latha; Block, Clifford H

    2015-01-01

    Background In the United States, 86 million adults have pre-diabetes. Evidence-based interventions that are both cost effective and widely scalable are needed to prevent diabetes. Objective Our goal was to develop a fully automated diabetes prevention program and determine its effectiveness in a randomized controlled trial. Methods Subjects with verified pre-diabetes were recruited to participate in a trial of the effectiveness of Alive-PD, a newly developed, 1-year, fully automated behavior change program delivered by email and Web. The program involves weekly tailored goal-setting, team-based and individual challenges, gamification, and other opportunities for interaction. An accompanying mobile phone app supports goal-setting and activity planning. For the trial, participants were randomized by computer algorithm to start the program immediately or after a 6-month delay. The primary outcome measures are change in HbA1c and fasting glucose from baseline to 6 months. The secondary outcome measures are change in HbA1c, glucose, lipids, body mass index (BMI), weight, waist circumference, and blood pressure at 3, 6, 9, and 12 months. Randomization and delivery of the intervention are independent of clinic staff, who are blinded to treatment assignment. Outcomes will be evaluated for the intention-to-treat and per-protocol populations. Results A total of 340 subjects with pre-diabetes were randomized to the intervention (n=164) or delayed-entry control group (n=176). Baseline characteristics were as follows: mean age 55 (SD 8.9); mean BMI 31.1 (SD 4.3); male 68.5%; mean fasting glucose 109.9 (SD 8.4) mg/dL; and mean HbA1c 5.6 (SD 0.3)%. Data collection and analysis are in progress. We hypothesize that participants in the intervention group will achieve statistically significant reductions in fasting glucose and HbA1c as compared to the control group at 6 months post baseline. Conclusions The randomized trial will provide rigorous evidence regarding the efficacy of

  5. An innovative fully automated RTM plant for class "A" mass transit applications

    SciTech Connect

    Vaccarella, P.W.; Barbieri, G.; Mazzola, M.; Russo, M.

    1996-11-01

    In order to guarantee high quality and reliability of molded parts, a fully automated RTM plant, equipped with a smart system, was set up. The controls and the servo-mechanisms mean that the plant requires very little labor, so it represents a truly innovative industrial solution. The plant configuration was optimized to maximize the return on investment, also for low and medium production series. The plant and the smart system are described in detail. The smart system consists of four main blocks: (1) a mathematical model that simulates the rheology and the kinetics of the injected resin system; (2) sensors for temperature, pressure and degree of polymerization, and a data acquisition system; (3) software for real-time comparison between the mathematical model data and the experimental ones; (4) servo-mechanisms interfacing the software with the plant. The smart system permits complete filling of the mold even if the molding parameters were selected outside the admitted working range, and optimization of the molding time by means of sensors based on evaluation of the electric conductivity of the resin system as a function of the degree of polymerization. A comparative cost analysis for class A bus body panels was carried out with respect to different production technologies. The comparison shows the economic advantages of the adopted choice.

  6. A fully automated method for quantitative cerebral hemodynamic analysis using DSC-MRI.

    PubMed

    Bjørnerud, Atle; Emblem, Kyrre E

    2010-05-01

    Dynamic susceptibility contrast (DSC)-based perfusion analysis from MR images has become an established method for analysis of cerebral blood volume (CBV) in glioma patients. To date, little emphasis has, however, been placed on quantitative perfusion analysis of these patients, mainly due to the associated increased technical complexity and lack of sufficient stability in a clinical setting. The aim of our study was to develop a fully automated analysis framework for quantitative DSC-based perfusion analysis. The method presented here generates quantitative hemodynamic maps without user interaction, combined with automatic segmentation of normal-appearing cerebral tissue. Validation of 101 patients with confirmed glioma after surgery gave mean values for CBF, CBV, and MTT, extracted automatically from normal-appearing whole-brain white and gray matter, in good agreement with literature values. The measured age- and gender-related variations in the same parameters were also in agreement with those in the literature. Several established analysis methods were compared and the resulting perfusion metrics depended significantly on method and parameter choice. In conclusion, we present an accurate, fast, and automatic quantitative perfusion analysis method where all analysis steps are based on raw DSC data only. PMID:20087370
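
    The perfusion metrics named above are linked by the central volume principle: CBV is proportional to the area under the tissue concentration-time curve normalised by the arterial input function, and MTT = CBV / CBF. A minimal sketch of that relationship is given below; it omits the deconvolution step needed to estimate CBF, and the variable names are illustrative assumptions rather than the study's implementation.

        import numpy as np

        def cbv_mtt(tissue_conc, aif_conc, t, cbf):
            """tissue_conc, aif_conc: concentration-time curves; t: time axis (s);
            cbf: cerebral blood flow estimate obtained elsewhere (not shown here)."""
            cbv = np.trapz(tissue_conc, t) / np.trapz(aif_conc, t)   # area-under-curve ratio
            mtt = cbv / cbf if cbf > 0 else np.nan                   # central volume principle
            return cbv, mtt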

  7. A fully automated scheme for mammographic segmentation and classification based on breast density and asymmetry.

    PubMed

    Tzikopoulos, Stylianos D; Mavroforakis, Michael E; Georgiou, Harris V; Dimitropoulos, Nikos; Theodoridis, Sergios

    2011-04-01

    This paper presents a fully automated segmentation and classification scheme for mammograms, based on breast density estimation and detection of asymmetry. First, image preprocessing and segmentation techniques are applied, including a breast boundary extraction algorithm and an improved version of a pectoral muscle segmentation scheme. Features for breast density categorization are extracted, including a new fractal dimension-related feature, and support vector machines (SVMs) are employed for classification, achieving accuracy of up to 85.7%. Most of these properties are used to extract a new set of statistical features for each breast; the differences among these feature values from the two images of each pair of mammograms are used to detect breast asymmetry, using a one-class SVM classifier, which resulted in a success rate of 84.47%. This composite methodology has been applied to the miniMIAS database, consisting of 322 (MLO) mammograms, including 15 asymmetric pairs of images, obtained via a (noisy) digitization procedure. The results were evaluated by expert radiologists and are very promising, showing equal or higher success rates compared to other related works, despite the fact that some of them used only selected portions of this specific mammographic database. In contrast, our methodology is applied to the complete miniMIAS database and it exhibits the reliability that is normally required for clinical use in CAD systems. PMID:21306782

  8. DynaMet: a fully automated pipeline for dynamic LC-MS data.

    PubMed

    Kiefer, Patrick; Schmitt, Uwe; Müller, Jonas E N; Hartl, Johannes; Meyer, Fabian; Ryffel, Florian; Vorholt, Julia A

    2015-10-01

    Dynamic isotope labeling data provide crucial information about the operation of metabolic pathways and are commonly generated via liquid chromatography-mass spectrometry (LC-MS). Metabolome-wide analysis is challenging as it requires grouping of metabolite features over different samples. We developed DynaMet for fully automated investigations of isotope labeling experiments from LC-high-resolution MS raw data. DynaMet enables untargeted extraction of metabolite labeling profiles and provides integrated tools for expressive data visualization. To validate DynaMet we first used time course labeling data of the model strain Bacillus methanolicus from (13)C methanol, resulting in complex spectra for multicarbon compounds. Analysis of two biological replicates revealed high robustness and reproducibility of the pipeline. In total, DynaMet extracted 386 features showing dynamic labeling within 10 min. Of these features, 357 could be fitted by the implemented kinetic models. Feature identification against the KEGG database resulted in 215 matches covering multiple pathways of core metabolism and major biosynthetic routes. Moreover, we performed a time course labeling experiment with Escherichia coli on uniformly labeled (13)C glucose, resulting in a comparable number of detected features with labeling profiles of high quality. The distinct labeling patterns of common central metabolites generated from both model bacteria can readily be explained by one-carbon versus multicarbon compound metabolism. DynaMet is freely available as an extension package for the Python-based eMZed2, an open-source framework built for rapid development of LC-MS data analysis workflows.
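
    The kinetic models mentioned above describe how the labeled fraction of a metabolite rises after the switch to a labeled substrate; a minimal sketch fitting a first-order incorporation model to a labeling time course is shown below. The time points, fractions and model form are illustrative assumptions and may differ from DynaMet's own model set.

        import numpy as np
        from scipy.optimize import curve_fit

        def label_model(t, k, plateau):
            """Labeled fraction rising exponentially toward a plateau."""
            return plateau * (1.0 - np.exp(-k * t))

        t = np.array([0, 1, 2, 4, 6, 10])                    # minutes after label switch
        frac = np.array([0.0, 0.35, 0.55, 0.78, 0.86, 0.92])
        (k, plateau), _ = curve_fit(label_model, t, frac, p0=[0.5, 1.0])
        print(f"turnover rate k = {k:.2f} per min, plateau = {plateau:.2f}")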

  9. Sectoranalysis of left ventricular function by fully automated equilibrium radionuclide ventriculography.

    PubMed

    Standke, R; Hör, G; Klepzig, H; Maul, F D; Bussmann, W D; Kaltenbach, M

    1985-01-01

    We describe a fully automated method for quantification of left ventricular performance by equilibrium radionuclide ventriculographic studies, based on subdivision of the left ventricular region into 9 equiangular sectors. The precise identification of the left ventricular contours is achieved by the use of morphological and functional criteria in a sequential edge detection algorithm with a success rate of 96%. In addition to left ventricular global and sectorial ejection fraction the first harmonic of the corresponding Fourier spectrum is approximated to each sectorial time-activity curve and to the global one. Sectorial phase is calculated as the difference between the phase of the sectorial and global first Fourier component. Computerized comparison between the sectorial parameters at rest and during peak exercise localizes and classifies the degree of global and regional impairment in response to exercise. The processing time of 60 sec makes this method suitable for routine use. The validity of our procedure has been tested in 34 patients before and after successful transluminal coronary angioplasty. In these patients, 73% of the stenosed vessels before dilatation were localized by sectorial ejection fraction, 77% by sectorial phases, and 88% by the combination of both.
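
    The phase analysis described above amounts to taking the first Fourier component of each time-activity curve and reporting the sectorial phase relative to the global one, as in the sketch below. The 16-frame curves are hypothetical examples, not patient data.

        import numpy as np

        def first_harmonic_phase(curve):
            """Phase (degrees) of the first Fourier component of a cyclic curve."""
            return np.degrees(np.angle(np.fft.rfft(curve)[1]))

        frames = np.arange(16)
        global_curve = 100 + 20 * np.cos(2 * np.pi * frames / 16)
        sector_curve = 30 + 8 * np.cos(2 * np.pi * frames / 16 - 0.4)   # delayed sector
        phase_diff = first_harmonic_phase(sector_curve) - first_harmonic_phase(global_curve)
        print(f"sectorial phase difference: {phase_diff:.1f} degrees")  # about -23 degrees here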

  10. Microscopic characterisation of filamentous microbes: towards fully automated morphological quantification through image analysis.

    PubMed

    Barry, D J; Williams, G A

    2011-10-01

    Mycelial morphology is a critically important process property in industrial fermentations of filamentous microorganisms, as particular phenotypes are associated with maximum productivity. The morphological form that develops in a given process results from the combination of various environmental factors, together with the genotype of the organism itself. The design of systems capable of rapidly and accurately characterising morphology within a given process represents a significant challenge to biotechnologists, as the complex phenotypes that are manifested are often not easily quantified. Over the last two decades, the proliferation of high-power personal computers and high-resolution digital cameras has enabled scientists to apply digital image analysis to this challenge. Although several fully automated systems have been designed for this purpose, manual analysis of images is still commonplace, together with qualitative, subjective descriptions of morphologies. This review describes the complex morphologies that can develop in fermentations of filamentous microbes and the application of microscopy and image analysis techniques to the quantification of such structures.

  11. Fully Automated and Robust Tracking of Transient Waves in Structured Anatomies Using Dynamic Programming.

    PubMed

    Akkus, Zeynettin; Bayat, Mahdi; Cheong, Mathew; Viksit, Kumar; Erickson, Bradley J; Alizad, Azra; Fatemi, Mostafa

    2016-10-01

    Tissue stiffness is often linked to underlying pathology and can be quantified by measuring the mechanical transient transverse wave speed (TWS) within the medium. Time-of-flight methods based on correlation of the transient signals or tracking of peaks have been used to quantify the TWS from displacement maps obtained with ultrasound pulse-echo techniques. However, it is challenging to apply these methods to in vivo data because of tissue inhomogeneity, noise and artifacts that produce outliers. In this study, we introduce a robust and fully automated method based on dynamic programming to estimate TWS in tissues with known geometries. The method is validated using ultrasound bladder vibrometry data from an in vivo study. We compared the results of our method with those of time-of-flight techniques. Our method performs better than time-of-flight techniques. In conclusion, we present a robust and accurate TWS detection method that overcomes the difficulties of time-of-flight methods.

  12. Fully automated high-resolution spectroscopy at Swiss 1.2 m La Silla telescope

    NASA Astrophysics Data System (ADS)

    Weber, Luc; Blecha, Andre; Davignon, Geert; Maire, Charles; Queloz, Didier; Russiniello, Giovanni B.; Simond, Gilles

    2000-06-01

    A Ritchey-Chretien 1.2 m telescope (EULER) and the High-Resolution echelle Spectrograph (CORALIE), a new Swiss observing facility at the ESO La Silla Observatory, have been operational since Summer 1998. Observatory operation is fully automated and supports unattended, attended and interactive modes of operation under local or remote control. The control hardware is based on Local Control Units (LCUs) built from PC/RedHat Linux computers and a Unix Computing Server. The Operational Software is built around INTER, a command language interpreter featuring communication control, data access, image processing functions and easy access to external resources. The general SW architecture is a non-hierarchical tree of pairs made of hardware-independent interpreters running on the Observing Server and hardware-dependent servers running on the LCUs. The Operational Software includes full access (creation/modification/retrieval) to the input/output databases; telescope, instrument and auxiliary set-up and control files; as well as a full data-reduction pipeline. We briefly describe the system architecture, summarize the performance and the experience gained over 18 months of operation, and discuss some critical issues: use of standard components, parallel operation, real-time requirements, system upgrade and maintenance.

  13. Fully Automated and Robust Tracking of Transient Waves in Structured Anatomies Using Dynamic Programming.

    PubMed

    Akkus, Zeynettin; Bayat, Mahdi; Cheong, Mathew; Viksit, Kumar; Erickson, Bradley J; Alizad, Azra; Fatemi, Mostafa

    2016-10-01

    Tissue stiffness is often linked to underlying pathology and can be quantified by measuring the mechanical transient transverse wave speed (TWS) within the medium. Time-of-flight methods based on correlation of the transient signals or tracking of peaks have been used to quantify the TWS from displacement maps obtained with ultrasound pulse-echo techniques. However, it is challenging to apply these methods to in vivo data because of tissue inhomogeneity, noise and artifacts that produce outliers. In this study, we introduce a robust and fully automated method based on dynamic programming to estimate TWS in tissues with known geometries. The method is validated using ultrasound bladder vibrometry data from an in vivo study. We compared the results of our method with those of time-of-flight techniques. Our method performs better than time-of-flight techniques. In conclusion, we present a robust and accurate TWS detection method that overcomes the difficulties of time-of-flight methods. PMID:27425150

  14. Improving GPR Surveys Productivity by Array Technology and Fully Automated Processing

    NASA Astrophysics Data System (ADS)

    Morello, Marco; Ercoli, Emanuele; Mazzucchelli, Paolo; Cottino, Edoardo

    2016-04-01

    The realization of network infrastructures with lower environmental impact and the tendency to use digging technologies that are less invasive in terms of time and space of road occupation and restoration play a key role in the development of communication networks. However, pre-existing buried utilities must be detected and located in the subsurface, to exploit the high productivity of modern digging apparatus. According to SUE quality level B+, both position and depth of subsurface utilities must be accurately estimated, demanding 3D GPR surveys. In fact, the advantages of 3D GPR acquisitions (obtained either by multiple 2D recordings or by an antenna array) over 2D acquisitions are well known. Nonetheless, the amount of data acquired in such 3D acquisitions does not usually allow processing and interpretation to be completed directly in the field and in real time, thus limiting the overall efficiency of the GPR acquisition. As an example, the "low impact mini-trench" technique (addressed in ITU - International Telecommunication Union - recommendation L.83) requires that non-destructive mapping of buried services enhances its productivity to match the improvements of new digging equipment. Nowadays multi-antenna and multi-pass GPR acquisitions demand new processing techniques that can obtain high-quality subsurface images, taking full advantage of 3D data: the development of a fully automated and real-time 3D GPR processing system plays a key role in overall optical network deployment profitability. Furthermore, currently available computing power suggests the feasibility of processing schemes that incorporate better focusing algorithms. A novel processing scheme, whose goal is the automated processing and detection of buried targets that can be applied in real time to 3D GPR array systems, has been developed and fruitfully tested with two different GPR arrays (16 antennas, 900 MHz central frequency, and 34 antennas, 600 MHz central frequency). The proposed processing

  15. Fully Automated Prostate Magnetic Resonance Imaging and Transrectal Ultrasound Fusion via a Probabilistic Registration Metric.

    PubMed

    Sparks, Rachel; Bloch, B Nicolas; Feleppa, Ernest; Barratt, Dean; Madabhushi, Anant

    2013-03-01

    In this work, we present a novel, automated, registration method to fuse magnetic resonance imaging (MRI) and transrectal ultrasound (TRUS) images of the prostate. Our methodology consists of: (1) delineating the prostate on MRI, (2) building a probabilistic model of prostate location on TRUS, and (3) aligning the MRI prostate segmentation to the TRUS probabilistic model. TRUS-guided needle biopsy is the current gold standard for prostate cancer (CaP) diagnosis. Up to 40% of CaP lesions appear isoechoic on TRUS, hence TRUS-guided biopsy cannot reliably target CaP lesions and is associated with a high false negative rate. MRI is better able to distinguish CaP from benign prostatic tissue, but requires special equipment and training. MRI-TRUS fusion, whereby MRI is acquired pre-operatively and aligned to TRUS during the biopsy procedure, allows for information from both modalities to be used to help guide the biopsy. The use of MRI and TRUS in combination to guide biopsy at least doubles the yield of positive biopsies. Previous work on MRI-TRUS fusion has involved aligning manually determined fiducials or prostate surfaces to achieve image registration. The accuracy of these methods is dependent on the reader's ability to determine fiducials or prostate surfaces with minimal error, which is a difficult and time-consuming task. Our novel, fully automated MRI-TRUS fusion method represents a significant advance over the current state-of-the-art because it does not require manual intervention after TRUS acquisition. All necessary preprocessing steps (i.e. delineation of the prostate on MRI) can be performed offline prior to the biopsy procedure. We evaluated our method on seven patient studies, with B-mode TRUS and a 1.5 T surface coil MRI. Our method has a root mean square error (RMSE) for expertly selected fiducials (consisting of the urethra, calcifications, and the centroids of CaP nodules) of 3.39 ± 0.85 mm. PMID:24353393

  16. Assessment of fully-automated atlas-based segmentation of novel oral mucosal surface organ-at-risk

    PubMed Central

    Dean, Jamie A; Welsh, Liam C; McQuaid, Dualta; Wong, Kee H; Aleksic, Aleksandar; Dunne, Emma; Islam, Mohammad R; Patel, Anushka; Patel, Priyanka; Petkar, Imran; Phillips, Iain; Sham, Jackie; Newbold, Kate L; Bhide, Shreerang A; Harrington, Kevin J; Gulliford, Sarah L; Nutting, Christopher M

    2016-01-01

    Background and Purpose Current oral mucositis normal tissue complication probability models, based on the dose distribution to the oral cavity volume, have suboptimal predictive power. Improving the delineation of the oral mucosa is likely to improve these models, but is resource intensive. We developed and evaluated fully-automated atlas-based segmentation (ABS) of a novel delineation technique for the oral mucosal surfaces. Material and Methods An atlas of mucosal surface contours (MSC) consisting of 46 patients was developed. It was applied to an independent test cohort of 10 patients for whom manual segmentation of MSC structures, by three different clinicians, and conventional outlining of oral cavity contours (OCC), by an additional clinician, were also performed. Geometric comparisons were made using the Dice similarity coefficient (DSC), validation index (VI) and Hausdorff distance (HD). Dosimetric comparisons were carried out using dose-volume histograms. Results The median differences in the DSC and HD between automated-manual comparisons and manual-manual comparisons were small and non-significant (-0.024, p = 0.33 and -0.5, p = 0.88, respectively). The median VI was 0.086. The maximum normalised volume difference between automated and manual MSC structures across all of the dose levels, averaged over the test cohort, was 8%. This difference reached approximately 28% when comparing automated MSC and OCC structures. Conclusions Fully-automated ABS of MSC is suitable for use in radiotherapy dose-response modelling. PMID:26970676
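
    For reference, the two geometric agreement measures quoted above can be computed directly from binary masks or contour point sets; the snippet below (toy masks rather than the study data, and omitting the validation index) shows one standard formulation using NumPy and SciPy.

      import numpy as np
      from scipy.spatial.distance import directed_hausdorff

      def dice(a, b):
          """Dice similarity coefficient between two boolean masks."""
          a, b = a.astype(bool), b.astype(bool)
          denom = a.sum() + b.sum()
          return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

      def hausdorff(points_a, points_b):
          """Symmetric Hausdorff distance between two point sets (N x D arrays)."""
          return max(directed_hausdorff(points_a, points_b)[0],
                     directed_hausdorff(points_b, points_a)[0])

      # Toy 2D masks standing in for automated and manual contoured structures.
      auto = np.zeros((60, 60), bool); auto[10:40, 10:40] = True
      manual = np.zeros((60, 60), bool); manual[12:42, 11:41] = True
      print("DSC:", round(dice(auto, manual), 3))
      print("HD :", hausdorff(np.argwhere(auto), np.argwhere(manual)), "voxels")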

  17. Fully Automated Assessment of the Severity of Parkinson’s Disease from Speech

    PubMed Central

    Bayestehtashk, Alireza; Asgari, Meysam; Shafran, Izhak; McNames, James

    2014-01-01

    For several decades now, there has been sporadic interest in automatically characterizing the speech impairment due to Parkinson’s disease (PD). Most early studies were confined to quantifying a few speech features that were easy to compute. More recent studies have adopted a machine learning approach where a large number of potential features are extracted and the models are learned automatically from the data. In the same vein, here we characterize the disease using a relatively large cohort of 168 subjects, collected from multiple (three) clinics. We elicited speech using three tasks – the sustained phonation task, the diadochokinetic task and a reading task, all within a time budget of 4 minutes, prompted by a portable device. From these recordings, we extracted 1582 features for each subject using openSMILE, a standard feature extraction tool. We compared the effectiveness of three strategies for learning a regularized regression and found that ridge regression performs better than lasso and support vector regression for our task. We refined the feature extraction to capture pitch-related cues, including jitter and shimmer, more accurately using a time-varying harmonic model of speech. Our results show that the severity of the disease can be inferred from speech with a mean absolute error of about 5.5, explaining 61% of the variance and consistently well above chance across all clinics. Of the three speech elicitation tasks, we find that the reading task is significantly better at capturing cues than the diadochokinetic or sustained phonation tasks. In all, we have demonstrated that the data collection and inference can be fully automated, and the results show that speech-based assessment has promising practical application in PD. The techniques reported here are more widely applicable to other paralinguistic tasks in the clinical domain. PMID:25382935
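
    As a schematic of the modelling stage only (openSMILE feature extraction is taken as given), the sketch below fits a cross-validated ridge regression to a simulated feature matrix of the dimensions described above and reports mean absolute error and explained variance; the data, alpha grid and fold count are illustrative, not the study's.

      import numpy as np
      from sklearn.linear_model import RidgeCV
      from sklearn.model_selection import cross_val_predict
      from sklearn.metrics import mean_absolute_error, r2_score

      rng = np.random.default_rng(0)
      X = rng.normal(size=(168, 1582))                 # 168 subjects x 1582 acoustic features
      w = np.zeros(1582); w[:25] = rng.normal(size=25)
      y = X @ w + rng.normal(scale=5.0, size=168)      # synthetic severity scores

      model = RidgeCV(alphas=np.logspace(-2, 3, 20))   # inner CV picks the regularization strength
      pred = cross_val_predict(model, X, y, cv=5)
      print(f"MAE = {mean_absolute_error(y, pred):.2f}")
      print(f"explained variance (R^2) = {r2_score(y, pred):.2f}")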

  18. Fully Automated Laser Ablation Liquid Capture Sample Analysis using NanoElectrospray Ionization Mass Spectrometry

    SciTech Connect

    Lorenz, Matthias; Ovchinnikova, Olga S; Van Berkel, Gary J

    2014-01-01

    RATIONALE: Laser ablation provides for the possibility of sampling a large variety of surfaces with high spatial resolution. This type of sampling when employed in conjunction with liquid capture followed by nanoelectrospray ionization provides the opportunity for sensitive and prolonged interrogation of samples by mass spectrometry as well as the ability to analyze surfaces not amenable to direct liquid extraction. METHODS: A fully automated, reflection geometry, laser ablation liquid capture spot sampling system was achieved by incorporating appropriate laser fiber optics and a focusing lens into a commercially available, liquid extraction surface analysis (LESA)-ready Advion TriVersa NanoMate system. RESULTS: Under optimized conditions about 10% of laser ablated material could be captured in a droplet positioned vertically over the ablation region using the NanoMate robot controlled pipette. The sampling spot size area with this laser ablation liquid capture surface analysis (LA/LCSA) mode of operation (typically about 120 µm x 160 µm) was approximately 50 times smaller than that achievable by direct liquid extraction using LESA (ca. 1 mm diameter liquid extraction spot). The set-up was successfully applied for the analysis of ink on glass and paper as well as the endogenous components in Alstroemeria Yellow King flower petals. In a second mode of operation with a comparable sampling spot size, termed laser ablation/LESA, the laser system was used to drill through, penetrate, or otherwise expose material beneath a solvent resistant surface. Once drilled, LESA was effective in sampling soluble material exposed at that location on the surface. CONCLUSIONS: Incorporating the capability for different laser ablation liquid capture spot sampling modes of operation into a LESA-ready Advion TriVersa NanoMate enhanced the spot sampling spatial resolution of this device and broadened the surface types amenable to analysis to include absorbent and solvent resistant

  19. Fully Automated Detection of Cloud and Aerosol Layers in the CALIPSO Lidar Measurements

    NASA Technical Reports Server (NTRS)

    Vaughan, Mark A.; Powell, Kathleen A.; Kuehn, Ralph E.; Young, Stuart A.; Winker, David M.; Hostetler, Chris A.; Hunt, William H.; Liu, Zhaoyan; McGill, Matthew J.; Getzewich, Brian J.

    2009-01-01

    Accurate knowledge of the vertical and horizontal extent of clouds and aerosols in the earth's atmosphere is critical in assessing the planet's radiation budget and for advancing human understanding of climate change issues. To retrieve this fundamental information from the elastic backscatter lidar data acquired during the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) mission, a selective, iterated boundary location (SIBYL) algorithm has been developed and deployed. SIBYL accomplishes its goals by integrating an adaptive context-sensitive profile scanner into an iterated multiresolution spatial averaging scheme. This paper provides an in-depth overview of the architecture and performance of the SIBYL algorithm. It begins with a brief review of the theory of target detection in noise-contaminated signals, and an enumeration of the practical constraints levied on the retrieval scheme by the design of the lidar hardware, the geometry of a space-based remote sensing platform, and the spatial variability of the measurement targets. Detailed descriptions are then provided for both the adaptive threshold algorithm used to detect features of interest within individual lidar profiles and the fully automated multiresolution averaging engine within which this profile scanner functions. The resulting fusion of profile scanner and averaging engine is specifically designed to optimize the trade-offs between the widely varying signal-to-noise ratio of the measurements and the disparate spatial resolutions of the detection targets. Throughout the paper, specific algorithm performance details are illustrated using examples drawn from the existing CALIPSO dataset. Overall performance is established by comparisons to existing layer height distributions obtained by other airborne and space-based lidars.
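
    SIBYL itself combines context-sensitive thresholds with iterated multiresolution averaging, which the abstract only summarizes; the minimal sketch below illustrates just the core profile-scanning idea of flagging contiguous runs of range bins that exceed an expected clear-air background plus a noise-scaled threshold. The profile, background model and threshold factor k are synthetic assumptions.

      import numpy as np

      def detect_layers(profile, background, sigma, k=3.0, min_len=3):
          """Return (start, end) index pairs where the signal exceeds background + k*sigma."""
          above = profile > background + k * sigma
          layers, start = [], None
          for i, flag in enumerate(above):
              if flag and start is None:
                  start = i
              elif not flag and start is not None:
                  if i - start >= min_len:
                      layers.append((start, i - 1))
                  start = None
          if start is not None and len(above) - start >= min_len:
              layers.append((start, len(above) - 1))
          return layers

      rng = np.random.default_rng(1)
      z = np.arange(500)                        # range bins
      bg = 1.0 + 0.001 * z                      # slowly varying "clear-air" background
      sig = bg + rng.normal(scale=0.05, size=500)
      sig[120:150] += 0.5                       # a strong "cloud" layer
      sig[300:320] += 0.2                       # a weaker "aerosol" layer
      print(detect_layers(sig, bg, sigma=0.05))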

  20. Fully Automated Whole-Head Segmentation with Improved Smoothness and Continuity, with Theory Reviewed

    PubMed Central

    Huang, Yu; Parra, Lucas C.

    2015-01-01

    Individualized current-flow models are needed for precise targeting of brain structures using transcranial electrical or magnetic stimulation (TES/TMS). The same is true for current-source reconstruction in electroencephalography and magnetoencephalography (EEG/MEG). The first step in generating such models is to obtain an accurate segmentation of individual head anatomy, including not only brain but also cerebrospinal fluid (CSF), skull and soft tissues, with a field of view (FOV) that covers the whole head. Currently available automated segmentation tools only provide results for brain tissues, have a limited FOV, and do not guarantee continuity and smoothness of tissues, which is crucially important for accurate current-flow estimates. Here we present a tool that addresses these needs. It is based on a rigorous Bayesian inference framework that combines an image intensity model, an anatomical prior (atlas) and morphological constraints using Markov random fields (MRF). The method is evaluated on 20 simulated and 8 real head volumes acquired with magnetic resonance imaging (MRI) at 1 mm³ resolution. We find improved surface smoothness and continuity as compared to the segmentation algorithms currently implemented in Statistical Parametric Mapping (SPM). With this tool, accurate and morphologically correct modeling of the whole-head anatomy for individual subjects may now be feasible on a routine basis. Code and data are fully integrated into the SPM software tool and are made publicly available. In addition, a review of MRI segmentation using atlases and MRFs over the last 20 years is also provided, with the general mathematical framework clearly derived. PMID:25992793

  1. Accurate, fully-automated registration of coronary arteries for volumetric CT digital subtraction angiography

    NASA Astrophysics Data System (ADS)

    Razeto, Marco; Mohr, Brian; Arakita, Kazumasa; Schuijf, Joanne D.; Fuchs, Andreas; Kühl, J. Tobias; Chen, Marcus Y.; Kofoed, Klaus F.

    2014-03-01

    Diagnosis of coronary artery disease with Coronary Computed Tomography Angiography (CCTA) is complicated by the presence of significant calcification or stents. Volumetric CT Digital Subtraction Angiography (CTDSA) has recently been shown to be effective at overcoming these limitations. Precise registration of structures is essential as any misalignment can produce artifacts potentially inhibiting clinical interpretation of the data. The fully-automated registration method described in this paper addresses the problem by combining a dense deformation field with rigid-body transformations where calcifications/stents are present. The method contains non-rigid and rigid components. Non-rigid registration recovers the majority of motion artifacts and produces a dense deformation field valid over the entire scan domain. Discrete domains are identified in which rigid registrations very accurately align each calcification/stent. These rigid-body transformations are combined within the immediate area of the deformation field using a distance transform to minimize distortion of the surrounding tissue. A recent interim analysis of a clinical feasibility study evaluated reader confidence and diagnostic accuracy in conventional CCTA and CTDSA registered using this method. Conventional invasive coronary angiography was used as the reference. The study included 27 patients scanned with a second-generation 320-row CT detector in which 41 lesions were identified. Compared to conventional CCTA, CTDSA improved reader confidence in 13/36 (36%) of segments with severe calcification and 3/5 (60%) of segments with coronary stents. Also, the false positive rate of CTDSA was reduced compared to conventional CCTA from 18% (24/130) to 14% (19/130).
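
    The paper does not include code, but the distance-transform blending it describes can be sketched roughly as follows: a rigid displacement, assumed valid near a calcification or stent, is feathered into the surrounding dense non-rigid field with a weight that decays with distance from the region. The field shapes, mask and falloff radius are placeholders, not values from the study.

      import numpy as np
      from scipy.ndimage import distance_transform_edt

      shape = (64, 64)
      nonrigid = np.stack(np.meshgrid(np.linspace(0, 2, shape[0]),
                                      np.linspace(0, 1, shape[1]), indexing="ij"))  # (2, H, W) displacement
      rigid = np.zeros_like(nonrigid); rigid[0] += 1.5; rigid[1] += 0.2              # pure translation

      mask = np.zeros(shape, bool); mask[28:36, 28:36] = True   # calcification/stent region
      dist = distance_transform_edt(~mask)                      # distance (voxels) to the region
      falloff = 8.0                                             # voxels over which rigid influence fades
      w = np.clip(1.0 - dist / falloff, 0.0, 1.0)               # 1 inside the mask, 0 far away

      blended = w * rigid + (1.0 - w) * nonrigid
      print(blended[:, 32, 32], blended[:, 0, 0])               # rigid-dominated vs non-rigid-dominated voxels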

  2. Fully automated, quantitative, noninvasive assessment of collagen fiber content and organization in thick collagen gels

    NASA Astrophysics Data System (ADS)

    Bayan, Christopher; Levitt, Jonathan M.; Miller, Eric; Kaplan, David; Georgakoudi, Irene

    2009-05-01

    Collagen is the most prominent protein of human tissues. Its content and organization define to a large extent the mechanical properties of tissue as well as its function. Methods that have been used traditionally to visualize and analyze collagen are invasive, provide only qualitative or indirect information, and have limited use in studies that aim to understand the dynamic nature of collagen remodeling and its interactions with the surrounding cells and other matrix components. Second harmonic generation (SHG) imaging has emerged as a promising noninvasive modality for providing high-resolution images of collagen fibers within thick specimens, such as tissues. In this article, we present a fully automated procedure to acquire quantitative information on the content, orientation, and organization of collagen fibers. We use this procedure to monitor the dynamic remodeling of collagen gels in the absence or presence of fibroblasts over periods of 12 or 14 days. We find that an adaptive thresholding and stretching approach provides great insight into the content of collagen fibers within SHG images without the need for user input. An additional feature-erosion and feature-dilation step is useful for preserving structure and noise removal in images with low signal. To quantitatively assess the orientation of collagen fibers, we extract the orientation index (OI), a parameter based on the power distribution of the spatial-frequency-averaged, two-dimensional Fourier transform of the SHG images. To measure the local organization of the collagen fibers, we take the Hough transform of small tiles of the image and compute the entropy distribution, which represents the probability of finding the direction of fibers along a dominant direction. Using these methods we observed that the presence and number of fibroblasts within the collagen gel significantly affects the remodeling of the collagen matrix. In the absence of fibroblasts, gels contract, especially during the first few
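
    The exact orientation index definition is not reproduced here, so the following is only a plausible Fourier-based alignment measure in the same spirit: the 2D power spectrum is binned by orientation angle, and a strongly peaked angular distribution indicates aligned fibers. The synthetic stripe image and the peak-to-mean summary statistic are illustrative choices.

      import numpy as np

      def angular_power(image, n_bins=180):
          """Distribution of 2D FFT power over orientation angle (0-180 degrees)."""
          f = np.fft.fftshift(np.fft.fft2(image - image.mean()))
          power = np.abs(f) ** 2
          h, w = image.shape
          y, x = np.mgrid[:h, :w]
          ang = np.degrees(np.arctan2(y - h / 2, x - w / 2)) % 180.0
          hist, _ = np.histogram(ang, bins=n_bins, range=(0, 180), weights=power)
          return hist / hist.sum()

      # Synthetic "fibres": horizontal stripes should give a strongly peaked distribution.
      img = np.sin(np.linspace(0, 20 * np.pi, 128))[:, None] * np.ones((128, 128))
      p = angular_power(img)
      alignment = p.max() / p.mean()   # >> 1 for aligned fibres, ~1 for isotropic texture
      print(f"peak/mean angular power = {alignment:.1f}")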

  3. Fully automated system for the quantification of human osteoarthritic knee joint effusion volume using magnetic resonance imaging

    PubMed Central

    2010-01-01

    Introduction Joint effusion is frequently associated with osteoarthritis (OA) flare-up and is an important marker of therapeutic response. This study aimed at developing and validating a fully automated system based on magnetic resonance imaging (MRI) for the quantification of joint effusion volume in knee OA patients. Methods MRI examinations consisted of two axial sequences: a T2-weighted true fast imaging with steady-state precession and a T1-weighted gradient echo. An automated joint effusion volume quantification system using MRI was developed and validated (a) with calibrated phantoms (cylinder and sphere) and effusion from knee OA patients; (b) with assessment by manual quantification; and (c) by direct aspiration. Twenty-five knee OA patients with joint effusion were included in the study. Results The automated joint effusion volume quantification was developed as a four stage sequencing process: bone segmentation, filtering of unrelated structures, segmentation of joint effusion, and subvoxel volume calculation. Validation experiments revealed excellent coefficients of variation with the calibrated cylinder (1.4%) and sphere (0.8%) phantoms. Comparison of the OA knee joint effusion volume assessed by the developed automated system and by manual quantification was also excellent (r = 0.98; P < 0.0001), as was the comparison with direct aspiration (r = 0.88; P = 0.0008). Conclusions The newly developed fully automated MRI-based system provided precise quantification of OA knee joint effusion volume with excellent correlation with data from phantoms, a manual system, and joint aspiration. Such an automated system will be instrumental in improving the reproducibility/reliability of the evaluation of this marker in clinical application. PMID:20846392
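
    The study's final stage uses a subvoxel volume calculation; the minimal example below shows only the plain voxel-counting version of that step, converting a binary effusion mask and an assumed voxel spacing into millilitres.

      import numpy as np

      mask = np.zeros((40, 256, 256), dtype=bool)
      mask[10:20, 100:140, 90:150] = True              # stand-in effusion segmentation

      spacing_mm = (3.0, 0.55, 0.55)                   # slice thickness and in-plane spacing (assumed)
      voxel_volume_ml = np.prod(spacing_mm) / 1000.0   # mm^3 -> mL
      volume_ml = mask.sum() * voxel_volume_ml
      print(f"effusion volume ~ {volume_ml:.1f} mL")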

  4. Two Fully Automated Web-Based Interventions for Risky Alcohol Use: Randomized Controlled Trial

    PubMed Central

    Strüber, Evelin

    2013-01-01

    Background Excessive alcohol use is a widespread problem in many countries, especially among young people. To reach more people engaging in high-risk drinking behaviors, a number of online programs have been developed in recent years. Change Your Drinking is a German, diary-based, fully automated alcohol intervention. In 2010, a revised version of the program was developed. It is more strongly oriented to concepts of relapse prevention than the previous version, includes more feedback, and offers more possibilities to interact with the program. Moreover, the program duration was extended from 10 to 14 days. Objective This paper examines whether the revised version of Change Your Drinking is more effective in reducing alcohol consumption than the original version. Methods The effectiveness of both program versions was compared in a Web-based, open, randomized controlled trial with follow-up surveys 6 weeks and 3 months after registration. Participants were recruited online and were randomly assigned to either the original or the revised version of Change Your Drinking. The following self-assessed outcomes were used: alcohol use days, alcohol intake in grams, the occurrence of binge drinking and risky drinking (all referring to the past 7 days prior to each survey), and the number of alcohol-related problems. Results A total of 595 participants were included in the trial. Follow-up rates were 58.0% after 6 weeks and 49.6% after 3 months. No significant group differences were found in any of the outcomes. However, the revised version was used by more participants (80.7%) than the original version (55.7%). A significant time effect was detected in all outcomes (alcohol use days: P=.002; alcohol intake in grams: P<.001; binge drinking: P<.001; alcohol-related problems: P=.004; risky drinking: P<.001). Conclusions The duration and complexity of the program played a minor role in reducing alcohol consumption. However, differences in program usage between the versions

  5. Results from the first fully automated PBS-mask process and pelliclization

    NASA Astrophysics Data System (ADS)

    Oelmann, Andreas B.; Unger, Gerd M.

    1994-02-01

    Automation is widely discussed in IC- and mask-manufacturing and partially realized everywhere. The idea for the automation goes back to 1978, when it turned out that the operators for the then newly installed PBS-process-line (the first in Europe) should be trained to behave like robots for particle reduction gaining lower defect densities on the masks. More than this goal has been achieved. It turned out recently, that the automation with its dedicated work routes and detailed documentation of every lot (individual mask or reticle) made it easy to obtain the CEEC certificate which includes ISO 9001.

  6. A Fully Automated Method for CT-on-Rails-Guided Online Adaptive Planning for Prostate Cancer Intensity Modulated Radiation Therapy

    SciTech Connect

    Li, Xiaoqiang; Quan, Enzhuo M.; Li, Yupeng; Pan, Xiaoning; Zhou, Yin; Wang, Xiaochun; Du, Weiliang; Kudchadker, Rajat J.; Johnson, Jennifer L.; Kuban, Deborah A.; Lee, Andrew K.; Zhang, Xiaodong

    2013-08-01

    Purpose: This study was designed to validate a fully automated adaptive planning (AAP) method which integrates automated recontouring and automated replanning to account for interfractional anatomical changes in prostate cancer patients receiving adaptive intensity modulated radiation therapy (IMRT) based on daily repeated computed tomography (CT)-on-rails images. Methods and Materials: Nine prostate cancer patients treated at our institution were randomly selected. For the AAP method, contours on each repeat CT image were automatically generated by mapping the contours from the simulation CT image using deformable image registration. An in-house automated planning tool incorporated into the Pinnacle treatment planning system was used to generate the original and the adapted IMRT plans. The cumulative dose–volume histograms (DVHs) of the target and critical structures were calculated based on the manual contours for all plans and compared with those of plans generated by the conventional method, that is, shifting the isocenters by aligning the images based on the center of the volume (COV) of prostate (prostate COV-aligned). Results: The target coverage from our AAP method for every patient was acceptable, while 1 of the 9 patients showed target underdosing from prostate COV-aligned plans. The normalized volume receiving at least 70 Gy (V70), and the mean dose of the rectum and bladder were reduced by 8.9%, 6.4 Gy and 4.3%, 5.3 Gy, respectively, for the AAP method compared with the values obtained from prostate COV-aligned plans. Conclusions: The AAP method, which is fully automated, is effective for online replanning to compensate for target dose deficits and critical organ overdosing caused by interfractional anatomical changes in prostate cancer.
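
    The dose metrics quoted above (V70 and mean organ dose) are straightforward to compute once a dose grid and an organ mask are available; the toy example below uses fabricated values purely to show the arithmetic.

      import numpy as np

      rng = np.random.default_rng(2)
      dose = rng.normal(loc=60.0, scale=10.0, size=(50, 50, 50))   # dose grid in Gy (made up)
      rectum = np.zeros_like(dose, dtype=bool); rectum[20:30, 20:30, 20:30] = True

      organ_dose = dose[rectum]
      v70 = 100.0 * (organ_dose >= 70.0).mean()   # % of organ volume receiving >= 70 Gy
      print(f"V70 = {v70:.1f}%, mean dose = {organ_dose.mean():.1f} Gy")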

  7. Parenchymal texture analysis in digital mammography: A fully automated pipeline for breast cancer risk assessment

    PubMed Central

    Zheng, Yuanjie; Keller, Brad M.; Ray, Shonket; Wang, Yan; Conant, Emily F.; Gee, James C.; Kontos, Despina

    2015-01-01

    Purpose: Mammographic percent density (PD%) is known to be a strong risk factor for breast cancer. Recent studies also suggest that parenchymal texture features, which are more granular descriptors of the parenchymal pattern, can provide additional information about breast cancer risk. To date, most studies have measured mammographic texture within selected regions of interest (ROIs) in the breast, which cannot adequately capture the complexity of the parenchymal pattern throughout the whole breast. To better characterize patterns of the parenchymal tissue, the authors have developed a fully automated software pipeline based on a novel lattice-based strategy to extract a range of parenchymal texture features from the entire breast region. Methods: Digital mammograms from 106 cases with 318 age-matched controls were retrospectively analyzed. The lattice-based approach is based on a regular grid virtually overlaid on each mammographic image. Texture features are computed from the intersection (i.e., lattice) points of the grid lines within the breast, using a local window centered at each lattice point. Using this strategy, a range of statistical (gray-level histogram, co-occurrence, and run-length) and structural (edge-enhancing, local binary pattern, and fractal dimension) features are extracted. To cover the entire breast, the size of the local window for feature extraction is set equal to the lattice grid spacing and optimized experimentally by evaluating different window sizes. The association between these lattice-based texture features and breast cancer was evaluated using logistic regression with leave-one-out cross validation and further compared to that of breast PD% and commonly used single-ROI texture features extracted from the retroareolar or the central breast region. Classification performance was evaluated using the area under the curve (AUC) of the receiver operating characteristic (ROC). DeLong’s test was used to compare the different ROCs in
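
    The feature set in the paper is much richer (co-occurrence, run-length, local binary patterns, fractal dimension), but the lattice idea itself can be sketched simply: overlay a regular grid on the breast and compute local-window statistics at every lattice point inside the breast mask. The image, mask, grid spacing and the three statistics below are illustrative stand-ins.

      import numpy as np

      def lattice_features(image, breast_mask, spacing):
          half = spacing // 2
          feats = []
          for r in range(half, image.shape[0] - half, spacing):
              for c in range(half, image.shape[1] - half, spacing):
                  if not breast_mask[r, c]:
                      continue                      # keep only lattice points inside the breast
                  win = image[r - half:r + half, c - half:c + half]
                  feats.append((r, c, win.mean(), win.std(), np.percentile(win, 90)))
          return np.array(feats)

      rng = np.random.default_rng(3)
      mammo = rng.gamma(shape=2.0, scale=50.0, size=(512, 512))
      mask = np.zeros_like(mammo, bool); mask[50:460, 50:300] = True
      print(lattice_features(mammo, mask, spacing=64).shape)   # (n_lattice_points, 5)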

  8. Fully Automated Sample Preparation for Ultrafast N-Glycosylation Analysis of Antibody Therapeutics.

    PubMed

    Szigeti, Marton; Lew, Clarence; Roby, Keith; Guttman, Andras

    2016-04-01

    There is a growing demand in the biopharmaceutical industry for high-throughput, large-scale N-glycosylation profiling of therapeutic antibodies in all phases of product development, but especially during clone selection when hundreds of samples should be analyzed in a short period of time to assure their glycosylation-based biological activity. Our group has recently developed a magnetic bead-based protocol for N-glycosylation analysis of glycoproteins to alleviate the hard-to-automate centrifugation and vacuum-centrifugation steps of the currently used protocols. Glycan release, fluorophore labeling, and cleanup were all optimized, resulting in a <4 h magnetic bead-based process with excellent yield and good repeatability. This article demonstrates the next level of this work by automating all steps of the optimized magnetic bead-based protocol from endoglycosidase digestion, through fluorophore labeling and cleanup with high-throughput sample processing in 96-well plate format, using an automated laboratory workstation. Capillary electrophoresis analysis of the fluorophore-labeled glycans was also optimized for rapid (<3 min) separation to accommodate the high-throughput processing of the automated sample preparation workflow. Ultrafast N-glycosylation analyses of several commercially relevant antibody therapeutics are also shown and compared to their biosimilar counterparts, addressing the biological significance of the differences.

  9. Fully automated 3D prostate central gland segmentation in MR images: a LOGISMOS based approach

    NASA Astrophysics Data System (ADS)

    Yin, Yin; Fotin, Sergei V.; Periaswamy, Senthil; Kunz, Justin; Haldankar, Hrishikesh; Muradyan, Naira; Turkbey, Baris; Choyke, Peter

    2012-02-01

    One widely accepted classification of the prostate divides it into a central gland (CG) and a peripheral zone (PZ). In some clinical applications, separating CG and PZ from the whole prostate is useful. For instance, in prostate cancer detection, radiologists want to know in which zone the cancer occurs. Another application is multiparametric MR tissue characterization. In prostate T2 MR images, due to the high intensity variation between CG and PZ, automated differentiation of CG and PZ is difficult. Previously, we developed an automated prostate boundary segmentation system, which was tested on large datasets and showed good performance. Using the results of the pre-segmented prostate boundary, in this paper we propose an automated CG segmentation algorithm based on Layered Optimal Graph Image Segmentation of Multiple Objects and Surfaces (LOGISMOS). The designed LOGISMOS model contains both shape and topology information during deformation. We generated graph costs by training classifiers and used a coarse-to-fine search. The LOGISMOS framework guarantees an optimal solution with respect to the cost function and shape constraints. A five-fold cross-validation approach was applied to a training dataset containing 261 images to optimize the system performance and compare with a voxel-classification-based reference approach. After the best parameter settings were found, the system was tested on a dataset containing another 261 images. The mean DSC of 0.81 for the test set indicates that our approach is promising for automated CG segmentation. Running time for the system is about 15 seconds.

  10. Rapid access to compound libraries through flow technology: fully automated synthesis of a 3-aminoindolizine library via orthogonal diversification.

    PubMed

    Lange, Paul P; James, Keith

    2012-10-01

    A novel methodology for the synthesis of druglike heterocycle libraries has been developed through the use of flow reactor technology. The strategy employs orthogonal modification of a heterocyclic core, which is generated in situ, and was used to construct both a 25-membered library of druglike 3-aminoindolizines, and selected examples of a 100-member virtual library. This general protocol allows a broad range of acylation, alkylation and sulfonamidation reactions to be performed in conjunction with a tandem Sonogashira coupling/cycloisomerization sequence. All three synthetic steps were conducted under full automation in the flow reactor, with no handling or isolation of intermediates, to afford the desired products in good yields. This fully automated, multistep flow approach opens the way to highly efficient generation of druglike heterocyclic systems as part of a lead discovery strategy or within a lead optimization program.

  11. A Fully Automated and Highly Versatile System for Testing Multi-cognitive Functions and Recording Neuronal Activities in Rodents

    PubMed Central

    Zheng, Weimin; Ycu, Edgar A.

    2012-01-01

    We have developed a fully automated system for operant behavior testing and neuronal activity recording by which multiple cognitive brain functions can be investigated in a single task sequence. The unique feature of this system is a custom-made, acoustically transparent chamber that eliminates many of the issues associated with auditory cue control in most commercially available chambers. The ease with which operant devices can be added or replaced makes this system quite versatile, allowing for the implementation of a variety of auditory, visual, and olfactory behavioral tasks. Automation of the system allows fine temporal (10 ms) control and precise time-stamping of each event in a predesigned behavioral sequence. When combined with a multi-channel electrophysiology recording system, multiple cognitive brain functions, such as motivation, attention, decision-making, patience, and rewards, can be examined sequentially or independently. PMID:22588124

  12. Fully-automated synthesis of 16β-18F-fluoro-5α-dihydrotestosterone (FDHT) on the ELIXYS radiosynthesizer

    PubMed Central

    Lazari, Mark; Lyashchenko, Serge K.; Burnazi, Eva M.; Lewis, Jason S.; van Dam, R. Michael; Murphy, Jennifer M.

    2015-01-01

    Noninvasive in vivo imaging of androgen receptor (AR) levels with positron emission tomography (PET) is becoming the primary tool in prostate cancer detection and staging. Of the potential 18F-labeled PET tracers, 18F-FDHT has clinically shown to be of highest diagnostic value. We demonstrate the first automated synthesis of 18F-FDHT by adapting the conventional manual synthesis onto the fully-automated ELIXYS radiosynthesizer. Clinically-relevant amounts of 18F-FDHT were synthesized on ELIXYS in 90 min with decay-corrected radiochemical yield of 29 ± 5% (n = 7). The specific activity was 4.6 Ci/µmol (170 GBq/µmol) at end of formulation with a starting activity of 1.0 Ci (37 GBq). The formulated 18F-FDHT yielded sufficient activity for multiple patient doses and passed all quality control tests required for routine clinical use. PMID:26046518

  13. Feasibility of fully automated detection of fiducial markers implanted into the prostate using electronic portal imaging: A comparison of methods

    SciTech Connect

    Harris, Emma J. . E-mail: eharris@icr.ac.uk; McNair, Helen A.; Evans, Phillip M.

    2006-11-15

    Purpose: To investigate the feasibility of fully automated detection of fiducial markers implanted into the prostate using portal images acquired with an electronic portal imaging device. Methods and Materials: We have made a direct comparison of 4 different methods (2 template matching-based methods, a method incorporating attenuation and constellation analyses, and a cross-correlation method) that have been published in the literature for the automatic detection of fiducial markers. The cross-correlation technique requires a priori information from the portal images, therefore the technique is not fully automated for the first treatment fraction. Images of 7 patients implanted with gold fiducial markers (8 mm in length and 1 mm in diameter) were acquired before treatment (set-up images) and during treatment (movie images) using 1 MU and 15 MU per image, respectively. Images included: 75 anterior (AP) and 69 lateral (LAT) set-up images and 51 AP and 83 LAT movie images. Using the different methods described in the literature, marker positions were automatically identified. Results: The method based upon cross-correlation techniques gave the highest detection success rates of 99% (AP) and 83% (LAT) for set-up (1 MU) images. The other methods gave detection success rates of less than 91% (AP) and 42% (LAT) for set-up images. The amount of a priori information used, and how it affects the way the techniques are implemented, is discussed. Conclusions: Fully automated marker detection in set-up images for the first treatment fraction is unachievable using these methods; cross-correlation is the best technique for automatic detection on subsequent radiotherapy treatment fractions.
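
    As an illustration of the template-matching/cross-correlation family of methods compared above (not a reimplementation of any of them), the snippet below plants three rod-shaped markers in a noisy synthetic portal image and recovers their positions from peaks in a normalised cross-correlation map; the marker shape, noise level and detection threshold are assumptions.

      import numpy as np
      from skimage.feature import match_template, peak_local_max

      rng = np.random.default_rng(4)
      image = rng.normal(size=(200, 200))
      rod = np.zeros((11, 5)); rod[1:10, 1:4] = 1.0      # marker-shaped template (rod on background)
      for r, c in [(50, 60), (120, 140), (160, 80)]:     # three implanted markers
          image[r:r + 11, c:c + 5] += 4.0 * rod

      ncc = match_template(image, rod, pad_input=True)   # normalised cross-correlation in [-1, 1]
      peaks = peak_local_max(ncc, min_distance=10, threshold_abs=0.6)
      print(peaks)   # detected marker centres (row, col)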

  14. Development of a fully automated network system for long-term health-care monitoring at home.

    PubMed

    Motoi, K; Kubota, S; Ikarashi, A; Nogawa, M; Tanaka, S; Nemoto, T; Yamakoshi, K

    2007-01-01

    Daily monitoring of health condition at home is very important not only as an effective scheme for early diagnosis and treatment of cardiovascular and other diseases, but also for prevention and control of such diseases. From this point of view, we have developed a prototype room for fully automated monitoring of various vital signs. From the results of preliminary experiments using this room, it was confirmed that (1) ECG and respiration during bathing, (2) excretion weight and blood pressure, and (3) respiration and cardiac beat during sleep could be monitored with reasonable accuracy by the sensor system installed in bathtub, toilet and bed, respectively.

  15. A wearable device for a fully automated in-hospital staff and patient identification.

    PubMed

    Cavalleri, M; Morstabilini, R; Reni, G

    2004-01-01

    In the health care context, devices for automated staff / patient identification provide multiple benefits, including error reduction in drug administration, an easier and faster use of the Electronic Health Record, enhanced security and control features when accessing confidential data, etc. Current identification systems (e.g. smartcards, bar codes) are not completely seamless to users and require mechanical operations that sometimes are difficult to perform for impaired subjects. Emerging wireless RFID technologies are encouraging, but cannot still be introduced in health care environments due to their electromagnetic emissions and the need for large size antenna to operate at reasonable distances. The present work describes a prototype of wearable device for automated staff and patient identification which is small in size and complies with the in-hospital electromagnetic requirements. This prototype also implements an anti-counterfeit option. Its experimental application allowed the introduction of some security functions for confidential data management.

  16. A scalable, fully automated process for construction of sequence-ready human exome targeted capture libraries

    PubMed Central

    2011-01-01

    Genome targeting methods enable cost-effective capture of specific subsets of the genome for sequencing. We present here an automated, highly scalable method for carrying out the Solution Hybrid Selection capture approach that provides a dramatic increase in scale and throughput of sequence-ready libraries produced. Significant process improvements and a series of in-process quality control checkpoints are also added. These process improvements can also be used in a manual version of the protocol. PMID:21205303

  17. LDRD final report: Automated planning and programming of assembly of fully 3D mechanisms

    SciTech Connect

    Kaufman, S.G.; Wilson, R.H.; Jones, R.E.; Calton, T.L.; Ames, A.L.

    1996-11-01

    This report describes the results of assembly planning research under the LDRD. The assembly planning problem is that of finding a sequence of assembly operations, starting from individual parts, that will result in complete assembly of a device specified as a CAD model. The automated assembly programming problem is that of automatically producing a robot program that will carry out a given assembly sequence. Given solutions to both of these problems, it is possible to automatically program a robot to assemble a mechanical device given as a CAD data file. This report describes the current state of our solutions to both of these problems, and a software system called Archimedes 2 we have constructed to automate these solutions. Because Archimedes 2 can input CAD data in several standard formats, we have been able to test it on a number of industrial assembly models more complex than any before attempted by automated assembly planning systems, some having over 100 parts. A complete path from a CAD model to an automatically generated robot program for assembling the device represented by the CAD model has also been demonstrated.

  18. Advanced manufacturing rules check (MRC) for fully automated assessment of complex reticle designs: Part II

    NASA Astrophysics Data System (ADS)

    Straub, J. A.; Aguilar, D.; Buck, P. D.; Dawkins, D.; Gladhill, R.; Nolke, S.; Riddick, J.

    2006-10-01

    Advanced electronic design automation (EDA) tools, with their simulation, modeling, design rule checking, and optical proximity correction capabilities, have facilitated the improvement of first pass wafer yields. While the data produced by these tools may have been processed for optimal wafer manufacturing, it is possible for the same data to be far from ideal for photomask manufacturing, particularly at lithography and inspection stages, resulting in production delays and increased costs. The same EDA tools used to produce the data can be used to detect potential problems for photomask manufacturing in the data. In the previous paper, it was shown how photomask MRC is used to uncover data related problems prior to automated defect inspection. It was demonstrated how jobs which are likely to have problems at inspection could be identified and separated from those which are not. The use of photomask MRC in production was shown to reduce time lost to aborted runs and troubleshooting due to data issues. In this paper, the effectiveness of this photomask MRC program in a high volume photomask factory over the course of a year as applied to more than ten thousand jobs will be shown. Statistics on the results of the MRC runs will be presented along with the associated impact to the automated defect inspection process. Common design problems will be shown as well as their impact to mask manufacturing throughput and productivity. Finally, solutions to the most common and most severe problems will be offered and discussed.

  19. Fully automated synthesis of [18F]fluoro-dihydrotestosterone ([18F]FDHT) using the FlexLab module.

    PubMed

    Ackermann, Uwe; Lewis, Jason S; Young, Kenneth; Morris, Michael J; Weickhardt, Andrew; Davis, Ian D; Scott, Andrew M

    2016-08-01

    Imaging of androgen receptor expression in prostate cancer using [18F]FDHT is becoming increasingly popular. With the radiolabelling precursor now commercially available, developing a fully automated synthesis of [18F]FDHT is important. We have fully automated the synthesis of [18F]FDHT on the iPhase FlexLab module using only commercially available components. Total synthesis time was 90 min, radiochemical yields were 25-33% (n = 11). Radiochemical purity of the final formulation was > 99% and specific activity was > 18.5 GBq/µmol for all batches. This method can be up-scaled as desired, thus making it possible to study multiple patients in a day. Furthermore, our procedure uses only 4 mg of precursor and is therefore cost-effective. The synthesis has now been validated at Austin Health and is currently used for [18F]FDHT studies in patients. We believe that this method can easily be adapted to other modules to further widen the availability of [18F]FDHT.

  20. A Robust and Fully-Automated Chromatographic Method for the Quantitative Purification of Ca and Sr for Isotopic Analysis

    NASA Astrophysics Data System (ADS)

    Smith, H. B.; Kim, H.; Romaniello, S. J.; Field, P.; Anbar, A. D.

    2014-12-01

    High throughput methods for sample purification are required to effectively exploit new opportunities in the study of non-traditional stable isotopes. Many geochemical isotopic studies would benefit from larger data sets, but these are often impractical with manual drip chromatography techniques, which can be time-consuming and demand the attention of skilled laboratory staff. Here we present a new, fully-automated single-column method suitable for the purification of both Ca and Sr for stable and radiogenic isotopic analysis. The method can accommodate a wide variety of sample types, including carbonates, bones, and teeth; silicate rocks and sediments; fresh and marine waters; and biological samples such as blood and urine. Protocols for these isotopic analyses are being developed for use on the new prepFAST-MC™ system from Elemental Scientific (ESI). The system is highly adaptable and processes up to 24-60 samples per day by reusing a single chromatographic column. Efficient column cleaning between samples and an all Teflon flow path ensures that sample carryover is maintained at the level of background laboratory blanks typical for manual drip chromatography. This method is part of a family of new fully-automated chromatographic methods being developed to address many different isotopic systems including B, Ca, Fe, Cu, Zn, Sr, Cd, Pb, and U. These methods are designed to be rugged and transferrable, and to allow the preparation of large, diverse sample sets via a highly repeatable process with minimal effort.

  1. Development of a Fully Automated Flow Injection Analyzer Implementing Bioluminescent Biosensors for Water Toxicity Assessment

    PubMed Central

    Komaitis, Efstratios; Vasiliou, Efstathios; Kremmydas, Gerasimos; Georgakopoulos, Dimitrios G.; Georgiou, Constantinos

    2010-01-01

    This paper describes the development of an automated Flow Injection analyzer for water toxicity assessment. The analyzer is validated by assessing the toxicity of heavy metal (Pb2+, Hg2+ and Cu2+) solutions. One hundred μL of a Vibrio fischeri suspension are injected in a carrier solution containing different heavy metal concentrations. Biosensor cells are mixed with the toxic carrier solution in the mixing coil on the way to the detector. Response registered is % inhibition of biosensor bioluminescence due to heavy metal toxicity in comparison to that resulting by injecting the Vibrio fischeri suspension in deionised water. Carrier solutions of mercury showed higher toxicity than the other heavy metals, whereas all metals show concentration related levels of toxicity. The biosensor’s response to carrier solutions of different pHs was tested. Vibrio fischeri’s bioluminescence is promoted in the pH 5–10 range. Experiments indicate that the whole cell biosensor, as applied in the automated fluidic system, responds to various toxic solutions. PMID:22163592
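
    The response metric described above reduces to a one-line calculation; the helper below expresses it explicitly, with hypothetical relative-light-unit readings for a control and a heavy-metal carrier solution.

      # Percent inhibition of bioluminescence relative to a deionised-water control.
      def percent_inhibition(luminescence_sample: float, luminescence_control: float) -> float:
          return 100.0 * (1.0 - luminescence_sample / luminescence_control)

      # Hypothetical readings (relative light units) for a control and a Hg2+ carrier.
      print(f"{percent_inhibition(4200.0, 15500.0):.1f}% inhibition")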

  2. Fully automated high-performance liquid chromatographic assay for the analysis of free catecholamines in urine.

    PubMed

    Said, R; Robinet, D; Barbier, C; Sartre, J; Huguet, C

    1990-08-24

    A totally automated and reliable high-performance liquid chromatographic method is described for the routine determination of free catecholamines (norepinephrine, epinephrine and dopamine) in urine. The catecholamines were isolated from urine samples using small alumina columns. A standard automated method for pH adjustment of urine before the extraction step has been developed. The extraction was performed on an ASPEC (Automatic Sample Preparation with Extraction Columns, Gilson). The eluate was collected in a separate tube and then automatically injected into the chromatographic column. The catecholamines were separated by reversed-phase ion-pair liquid chromatography and quantified by fluorescence detection. No manual intervention was required during the extraction and separation procedure. One sample may be run every 15 min, ca. 96 samples in 24 h. Analytical recoveries for all three catecholamines are 63-87%, and the detection limits are 0.01, 0.01, and 0.03 microM for norepinephrine, epinephrine and dopamine, respectively, which is highly satisfactory for urine. Day-to-day coefficients of variation were less than 10%.

  3. A fully automated primary screening system for the discovery of therapeutic antibodies directly from B cells.

    PubMed

    Tickle, Simon; Howells, Louise; O'Dowd, Victoria; Starkie, Dale; Whale, Kevin; Saunders, Mark; Lee, David; Lightwood, Daniel

    2015-04-01

    For a therapeutic antibody to succeed, it must meet a range of potency, stability, and specificity criteria. Many of these characteristics are conferred by the amino acid sequence of the heavy and light chain variable regions and, for this reason, can be screened for during antibody selection. However, it is important to consider that antibodies satisfying all these criteria may be of low frequency in an immunized animal; for this reason, it is essential to have a mechanism that allows for efficient sampling of the immune repertoire. UCB's core antibody discovery platform combines high-throughput B cell culture screening and the identification and isolation of single, antigen-specific IgG-secreting B cells through a proprietary technique called the "fluorescent foci" method. Using state-of-the-art automation to facilitate primary screening, extremely efficient interrogation of the natural antibody repertoire is made possible; more than 1 billion immune B cells can now be screened to provide a useful starting point from which to identify the rare therapeutic antibody. This article will describe the design, construction, and commissioning of a bespoke automated screening platform and two examples of how it was used to screen for antibodies against two targets.

  4. Evaluation of a fully automated method to measure the critical removal stress of adult barnacles.

    PubMed

    Conlan, Sheelagh L; Mutton, Robert J; Aldred, Nick; Clare, Anthony S

    2008-01-01

    A computer-controlled force gauge designed to measure the adhesive strength of barnacles on test substrata is described. The instrument was evaluated with adult barnacles grown in situ on Silastic T2(R)-coated microscope slides and epoxy replicas adhered to the same substratum with synthetic adhesive. The force per unit area required to detach the barnacles (critical removal stress) using the new automated system was comparable to that obtained with ASTM D5618 (1994) (0.19 and 0.28 MPa compared with 0.18 and 0.27 MPa for two batches of barnacles). The automated method showed a faster rate of force development compared with the manual spring force gauge used for ASTM D5618 (1994). The new instrument was as accurate and precise at determining surface area as manual delineation used with ASTM D5618 (1994). The method provided significant advantages such as higher throughput speed, the ability to test smaller barnacles (which took less time to grow) and to control the force application angle and speed. The variability in measurements was lower than previously reported, suggesting an improved ability to compare the results obtained by different researchers.

  5. Fast Image Analysis for the Micronucleus Assay in a Fully Automated High-Throughput Biodosimetry System

    PubMed Central

    Lyulko, Oleksandra V.; Garty, Guy; Randers-Pehrson, Gerhard; Turner, Helen C.; Szolc, Barbara; Brenner, David J.

    2014-01-01

    The development of, and results from, an image analysis system are presented for automated detection and scoring of micronuclei in human peripheral blood lymphocytes. The system is part of the Rapid Automated Biodosimetry Tool, which was developed at the Center for High-Throughput Minimally Invasive Radiation Biodosimetry for rapid radiation dose assessment of many individuals based on single fingerstick samples of blood. Blood lymphocytes were subjected to the cytokinesis-block micronucleus assay and the images of cell cytoplasm and nuclei are analyzed to estimate the frequency of micronuclei in binucleated cells. We describe an algorithm that is based on dual fluorescent labeling of lymphocytes with separate analysis of images of cytoplasm and nuclei. To evaluate the performance of the system, blood samples of seven healthy donors were irradiated in vitro with doses from 0 to 10 Gy and dose-response curves of micronuclei frequencies were generated. To establish the applicability of the system to the detection of high doses, the ratios of mononucleated cells to binucleated cells were determined for three of the donors. All of the dose-response curves generated automatically showed clear dose dependence and good correlation (R² from 0.914 to 0.998) with the results of manual scoring. PMID:24502354
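
    Downstream of the image analysis, the dose-response step amounts to fitting a curve to micronuclei frequencies versus dose; the sketch below fits a linear-quadratic model by least squares and reports an R² value, using invented frequencies rather than the study's data.

      import numpy as np

      doses = np.array([0, 0.5, 1, 2, 4, 6, 8, 10], dtype=float)            # Gy
      mn_freq = np.array([0.01, 0.03, 0.06, 0.15, 0.45, 0.85, 1.30, 1.80])  # micronuclei per binucleated cell

      # Least-squares fit of y = c + alpha*D + beta*D^2.
      A = np.column_stack([np.ones_like(doses), doses, doses ** 2])
      coef, *_ = np.linalg.lstsq(A, mn_freq, rcond=None)
      pred = A @ coef
      r2 = 1.0 - np.sum((mn_freq - pred) ** 2) / np.sum((mn_freq - mn_freq.mean()) ** 2)
      print(f"c={coef[0]:.3f}, alpha={coef[1]:.3f}, beta={coef[2]:.4f}, R^2={r2:.3f}")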

  6. Towards heterotic computing with droplets in a fully automated droplet-maker platform.

    PubMed

    Henson, Alon; Gutierrez, Juan Manuel Parrilla; Hinkley, Trevor; Tsuda, Soichiro; Cronin, Leroy

    2015-07-28

    The control and prediction of complex chemical systems is a difficult problem due to the nature of the interactions, transformations and processes occurring. From self-assembly to catalysis and self-organization, complex chemical systems are often heterogeneous mixtures that at the most extreme exhibit system-level functions, such as those that could be observed in a living cell. In this paper, we outline an approach to understand and explore complex chemical systems using an automated droplet maker to control the composition, size and position of the droplets in a predefined chemical environment. By investigating the spatio-temporal dynamics of the droplets, the aim is to understand how to control system-level emergence of complex chemical behaviour and even view the system-level behaviour as a programmable entity capable of information processing. Herein, we explore how our automated droplet-maker platform could be viewed as a prototype chemical heterotic computer with some initial data and example problems that may be viewed as potential chemically embodied computations.

  7. Fully Automated Centrifugal Microfluidic Device for Ultrasensitive Protein Detection from Whole Blood.

    PubMed

    Park, Yang-Seok; Sunkara, Vijaya; Kim, Yubin; Lee, Won Seok; Han, Ja-Ryoung; Cho, Yoon-Kyoung

    2016-01-01

    Enzyme-linked immunosorbent assay (ELISA) is a promising method to detect small amounts of proteins in biological samples. Devices providing a platform for reduced sample volume and assay time, as well as full automation, are required for potential use in point-of-care diagnostics. Recently, we have demonstrated ultrasensitive detection of serum proteins, C-reactive protein (CRP) and cardiac troponin I (cTnI), utilizing a lab-on-a-disc composed of TiO2 nanofibrous (NF) mats. It showed a large dynamic range with femtomolar (fM) detection sensitivity, from a small volume of whole blood in 30 min. The device consists of several components for blood separation, metering, mixing, and washing that are automated for improved sensitivity from low sample volumes. Here, in the video demonstration, we show the experimental protocols and know-how for the fabrication of NFs as well as the disc, their integration and the operation in the following order: processes for preparing the TiO2 NF mat; transfer-printing of the TiO2 NF mat onto the disc; surface modification for immunoreactions, disc assembly and operation; on-disc detection and representative results for immunoassay. Use of this device enables multiplexed analysis with minimal consumption of samples and reagents. Given the advantages, the device should find use in a wide variety of applications, and prove beneficial in facilitating the analysis of low abundant proteins. PMID:27167836

  8. Fully automated image-guided needle insertion: application to small animal biopsies.

    PubMed

    Ayadi, A; Bour, G; Aprahamian, M; Bayle, B; Graebling, P; Gangloff, J; Soler, L; Egly, J M; Marescaux, J

    2007-01-01

    The study of biological process evolution in small animals requires time-consuming and expensive analyses of a large population of animals. Serial analyses of the same animal are potentially a great alternative. However, non-invasive procedures must be set up to retrieve valuable tissue samples from precisely defined areas in living animals. Taking advantage of the high resolution level of in vivo molecular imaging, we defined a procedure to perform image-guided needle insertion and automated biopsy using a micro CT-scan, a robot and a vision system. Workspace limitations in the scanner require the animal to be removed and laid in front of the robot. A vision system composed of a grid projector and a camera is used to register the designed animal bed with respect to the robot and to automatically calibrate the needle position and orientation. Automated biopsy is then synchronised with respiration and performed with a pneumatic translation device, at high velocity, to minimize organ deformation. We have experimentally tested our biopsy system with different needles.

  9. Development of a fully automated Flow Injection analyzer implementing bioluminescent biosensors for water toxicity assessment.

    PubMed

    Komaitis, Efstratios; Vasiliou, Efstathios; Kremmydas, Gerasimos; Georgakopoulos, Dimitrios G; Georgiou, Constantinos

    2010-01-01

    This paper describes the development of an automated Flow Injection analyzer for water toxicity assessment. The analyzer is validated by assessing the toxicity of heavy metal (Pb(2+), Hg(2+) and Cu(2+)) solutions. One hundred μL of a Vibrio fischeri suspension are injected in a carrier solution containing different heavy metal concentrations. Biosensor cells are mixed with the toxic carrier solution in the mixing coil on the way to the detector. Response registered is % inhibition of biosensor bioluminescence due to heavy metal toxicity in comparison to that resulting by injecting the Vibrio fischeri suspension in deionised water. Carrier solutions of mercury showed higher toxicity than the other heavy metals, whereas all metals show concentration related levels of toxicity. The biosensor's response to carrier solutions of different pHs was tested. Vibrio fischeri's bioluminescence is promoted in the pH 5-10 range. Experiments indicate that the whole cell biosensor, as applied in the automated fluidic system, responds to various toxic solutions. PMID:22163592

  10. SARA South Observatory: A Fully Automated Boller & Chivens 0.6-m Telescope at C.T.I.O.

    NASA Astrophysics Data System (ADS)

    Mack, Peter; KanniahPadmanaban, S. Y.; Kaitchuck, R.; Borstad, A.; Luzier, N.

    2010-05-01

    The SARA South Observatory is the re-birth of the Lowell 24-inch telescope located on the south-east ridge of Cerro Tololo, Chile. Installed in 1968, this Boller & Chivens telescope fell into disuse for almost 20 years. The telescope and observatory have undergone a major restoration. A new dome with a wide slit has been fully automated with an ACE SmartDome controller featuring autonomous closure. The telescope was completely gutted, repainted, and virtually every electronic component and wire replaced. Modern infrastructure, such as USB, Ethernet and video ports, has been incorporated into the telescope tube saddle boxes. Absolute encoders have been placed on the hour angle and declination axes with a resolution of less than 0.7 arc seconds. The secondary mirror is also equipped with an absolute encoder and temperature sensor to allow for fully automated focus. New mirror coatings, automated mirror covers, a new 150 mm refractor, and new instrumentation have been deployed. An integrated X-stage guider and dual filter wheel containing 18 filters is used for direct imaging. The guider camera can be easily removed and a standard 2-inch eyepiece used for occasional viewing by VIPs at C.T.I.O. A 12-megapixel all-sky camera produces color images every 30 seconds showing details in the Milky Way and Magellanic Clouds. Two low-light-level cameras are deployed: one on the finder and one at the top of the telescope showing a 30° field. Other auxiliary equipment, including daytime color video cameras, a weather station and remotely controllable power outlets, permits complete control and servicing of the system. The SARA Consortium (www.saraobservatory.org), a collection of ten eastern universities, also operates a 0.9-m telescope at the Kitt Peak National Observatory using an almost identical set of instruments with the same ACE control system. This project was funded by the SARA Consortium.

  11. Predicting non-small cell lung cancer prognosis by fully automated microscopic pathology image features

    PubMed Central

    Yu, Kun-Hsing; Zhang, Ce; Berry, Gerald J.; Altman, Russ B.; Ré, Christopher; Rubin, Daniel L.; Snyder, Michael

    2016-01-01

    Lung cancer is the most prevalent cancer worldwide, and histopathological assessment is indispensable for its diagnosis. However, human evaluation of pathology slides cannot accurately predict patients' prognoses. In this study, we obtain 2,186 haematoxylin and eosin stained histopathology whole-slide images of lung adenocarcinoma and squamous cell carcinoma patients from The Cancer Genome Atlas (TCGA), and 294 additional images from Stanford Tissue Microarray (TMA) Database. We extract 9,879 quantitative image features and use regularized machine-learning methods to select the top features and to distinguish shorter-term survivors from longer-term survivors with stage I adenocarcinoma (P<0.003) or squamous cell carcinoma (P=0.023) in the TCGA data set. We validate the survival prediction framework with the TMA cohort (P<0.036 for both tumour types). Our results suggest that automatically derived image features can predict the prognosis of lung cancer patients and thereby contribute to precision oncology. Our methods are extensible to histopathology images of other organs. PMID:27527408
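
    The abstract does not specify the exact estimator, so the following is only a hedged sketch of the general approach: an L1-regularized classifier separating shorter-term from longer-term survivors on a matrix of quantitative image features, with the non-zero coefficients acting as the selected "top features". All data and names here are placeholders, not the authors' pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# X: patients x quantitative image features, y: 1 = shorter-term survivor.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 500))          # placeholder for the image-feature matrix
y = rng.integers(0, 2, size=200)         # placeholder survival-group labels

model = make_pipeline(
    StandardScaler(),
    LogisticRegression(penalty="l1", solver="liblinear", C=0.1),
)
print("CV accuracy:", cross_val_score(model, X, y, cv=5).mean())

model.fit(X, y)
coef = model.named_steps["logisticregression"].coef_.ravel()
selected = np.flatnonzero(coef)          # indices of the features kept by the L1 penalty
print(f"{selected.size} features retained")
```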

  12. Fully automated hybrid diode laser assembly using high precision active alignment

    NASA Astrophysics Data System (ADS)

    Böttger, Gunnar; Weber, Daniel; Scholz, Friedemann; Schröder, Henning; Schneider-Ramelow, Martin; Lang, Klaus-Dieter

    2016-03-01

    Fraunhofer IZM, Technische Universität Berlin and eagleyard Photonics present various implementations of current micro-optical assemblies for high-quality free-space laser beam forming and efficient fiber coupling. The laser modules shown are optimized for fast and automated assembly in small-form-factor packages via state-of-the-art active alignment machinery, using alignment and joining processes that have been developed and established in various industrial research projects. Operational wavelengths from 600 to 1600 nm and optical powers from 1 mW to several W are addressed, for applications ranging from high-resolution laser spectroscopy, telecom and optical sensing up to the optical powers needed in industrial and medical laser treatment.

  13. a Fully Automated Pipeline for Classification Tasks with AN Application to Remote Sensing

    NASA Astrophysics Data System (ADS)

    Suzuki, K.; Claesen, M.; Takeda, H.; De Moor, B.

    2016-06-01

    Deep learning is currently in the spotlight owing to its victories in major competitions, which has undeservedly pushed 'shallow' machine learning methods, the relatively simple and convenient algorithms commonly used by industrial engineers, into the background despite their advantages, such as the small amount of training time and data they require. Taking a practical point of view, we used shallow learning algorithms to construct a learning pipeline that lets operators apply machine learning without specialist knowledge, an expensive computation environment, or a large amount of labelled data. The proposed pipeline automates the whole classification process, namely feature selection, feature weighting, and the selection of the most suitable classifier with optimized hyperparameters. The configuration relies on particle swarm optimization, a well-known metaheuristic that is generally fast and precise, which enables us not only to optimize (hyper)parameters but also to determine the features and classifier appropriate to the problem; these choices have conventionally been made a priori from domain knowledge, left untouched, or handled with naive procedures such as grid search. Through experiments with the MNIST and CIFAR-10 datasets, common computer-vision benchmarks for character recognition and object recognition respectively, our automated learning approach provides high performance given its simple, non-specialized setting, small amount of training data, and practical learning time. Moreover, compared to deep learning, the performance remains robust with almost no modification even on a remote sensing object recognition problem, which indicates that our approach is likely to contribute to general classification problems.
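
    The pipeline itself is not spelled out in the abstract, so the sketch below only illustrates the core idea of using particle swarm optimization to tune classifier hyperparameters against cross-validated accuracy; a plain SVM on the scikit-learn digits data stands in for the feature/classifier selection described above, and all bounds and swarm settings are illustrative.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)

def fitness(params):
    """Cross-validated accuracy for log-scaled (C, gamma)."""
    C, gamma = 10.0 ** params
    return cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=3).mean()

# Minimal particle swarm over log10(C) in [-2, 3] and log10(gamma) in [-5, 0].
rng = np.random.default_rng(0)
lo, hi = np.array([-2.0, -5.0]), np.array([3.0, 0.0])
pos = rng.uniform(lo, hi, size=(10, 2))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_val.argmax()]

for _ in range(15):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    val = np.array([fitness(p) for p in pos])
    improved = val > pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[pbest_val.argmax()]

print("best log10(C), log10(gamma):", gbest, "accuracy:", pbest_val.max())
```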

  14. A quality assurance framework for the fully automated and objective evaluation of image quality in cone-beam computed tomography

    SciTech Connect

    Steiding, Christian; Kolditz, Daniel; Kalender, Willi A.

    2014-03-15

    Purpose: Thousands of cone-beam computed tomography (CBCT) scanners for vascular, maxillofacial, neurological, and body imaging are in clinical use today, but there is no consensus on uniform acceptance and constancy testing for image quality (IQ) and dose yet. The authors developed a quality assurance (QA) framework for fully automated and time-efficient performance evaluation of these systems. In addition, the dependence of objective Fourier-based IQ metrics on direction and position in 3D volumes was investigated for CBCT. Methods: The authors designed a dedicated QA phantom 10 cm in length consisting of five compartments, each with a diameter of 10 cm, and an optional extension ring 16 cm in diameter. A homogeneous section of water-equivalent material allows measuring CT value accuracy, image noise and uniformity, and multidimensional global and local noise power spectra (NPS). For the quantitative determination of 3D high-contrast spatial resolution, the modulation transfer function (MTF) of centrally and peripherally positioned aluminum spheres was computed from edge profiles. Additional in-plane and axial resolution patterns were used to assess resolution qualitatively. The characterization of low-contrast detectability as well as CT value linearity and artifact behavior was tested by utilizing sections with soft-tissue-equivalent and metallic inserts. For an automated QA procedure, a phantom detection algorithm was implemented. All tests used in the dedicated QA program were initially verified in simulation studies and experimentally confirmed on a clinical dental CBCT system. Results: The automated IQ evaluation of volume data sets of the dental CBCT system was achieved with the proposed phantom requiring only one scan for the determination of all desired parameters. Typically, less than 5 min were needed for phantom set-up, scanning, and data analysis. Quantitative evaluation of system performance over time by comparison to previous examinations was also
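
    As a rough illustration of one of the metrics mentioned above, an MTF can be estimated from an edge profile by differentiating the edge spread function into a line spread function and taking its Fourier magnitude. The sketch below assumes a 1D, already-extracted edge profile with known sample spacing; it is a generic textbook calculation, not the authors' implementation.

```python
import numpy as np

def mtf_from_edge(esf, sample_spacing_mm):
    """Estimate the MTF from a 1D edge spread function (ESF).

    esf               -- intensity samples across a sharp edge
    sample_spacing_mm -- spacing of the profile samples in mm
    Returns spatial frequencies (cycles/mm) and normalized MTF values.
    """
    lsf = np.gradient(np.asarray(esf, dtype=float))   # line spread function
    lsf *= np.hanning(lsf.size)                       # damp noise at the tails
    mtf = np.abs(np.fft.rfft(lsf))
    mtf /= mtf[0]                                      # normalize to DC = 1
    freqs = np.fft.rfftfreq(lsf.size, d=sample_spacing_mm)
    return freqs, mtf

# Example with a synthetic blurred edge sampled every 0.1 mm.
x = (np.arange(200) - 100) * 0.1
esf = 0.5 * (1 + np.tanh(x / 0.4))
freqs, mtf = mtf_from_edge(esf, sample_spacing_mm=0.1)
f50 = freqs[np.argmax(mtf < 0.5)]                      # frequency where the MTF drops to 50%
print(f"MTF50 = {f50:.2f} cycles/mm")
```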

  15. Fully Automated Field-Deployable Bioaerosol Monitoring System Using Carbon Nanotube-Based Biosensors.

    PubMed

    Kim, Junhyup; Jin, Joon-Hyung; Kim, Hyun Soo; Song, Wonbin; Shin, Su-Kyoung; Yi, Hana; Jang, Dae-Ho; Shin, Sehyun; Lee, Byung Yang

    2016-05-17

    Much progress has been made in automated monitoring systems for airborne pathogens. However, they still lack the robustness and stability necessary for field deployment. Here, we demonstrate a bioaerosol automonitoring instrument (BAMI) specifically designed for the in situ capture and continuous monitoring of airborne fungal particles. This was possible by developing highly sensitive and selective fungi sensors based on two-channel carbon nanotube field-effect transistors (CNT-FETs), followed by integration with a bioaerosol sampler, a Peltier cooler for receptor lifetime enhancement, and a pumping assembly for fluidic control. These four main components operate together to enable real-time monitoring of fungi. The two-channel CNT-FETs can detect two different fungal species simultaneously. The Peltier cooler effectively lowers the working temperature of the sensor device, resulting in extended sensor lifetime and receptor stability. The system performance was verified both under laboratory conditions and in real residential areas. The system response was in accordance with the reported distribution of fungal species in the environment. Our system is versatile enough that it can be easily modified for the monitoring of other airborne pathogens. We expect that our system will expedite the development of hand-held and portable systems for bioaerosol monitoring. PMID:27070239

  16. Grid-Competitive Residential and Commercial Fully Automated PV Systems Technology: Final technical Report, August 2011

    SciTech Connect

    Brown, Katie E.; Cousins, Peter; Culligan, Matt; Jonathan Botkin; DeGraaff, David; Bunea, Gabriella; Rose, Douglas; Bourne, Ben; Koehler, Oliver

    2011-08-26

    Under DOE's Technology Pathway Partnership program, SunPower Corporation developed turn-key, high-efficiency residential and commercial systems that are cost effective. Key program objectives include a reduction in LCOE values to 9-12 cents/kWh and 13-18 cents/kWh respectively for the commercial and residential markets. Target LCOE values for the commercial ground, commercial roof, and residential markets are 10, 11, and 13 cents/kWh. For this effort, SunPower collaborated with a variety of suppliers and partners to complete the tasks below. Subcontractors included: Solaicx, SiGen, Ribbon Technology, Dow Corning, Xantrex, Tigo Energy, and Solar Bridge. SunPower's TPP addressed nearly the complete PV value chain: from ingot growth through system deployment. Throughout the award period of performance, SunPower has made progress toward achieving these reduced costs through the development of 20%+ efficient modules, increased cell efficiency through the understanding of loss mechanisms and improved manufacturing technologies, novel module development, automated design tools and techniques, and reduced system development and installation time. Based on an LCOE assessment using NREL's Solar Advisor Model, SunPower achieved the 2010 target range, as well as progress toward 2015 targets.

  17. A Fully Automated and Robust Method to Incorporate Stamping Data in Crash, NVH and Durability Analysis

    NASA Astrophysics Data System (ADS)

    Palaniswamy, Hariharasudhan; Kanthadai, Narayan; Roy, Subir; Beauchesne, Erwan

    2011-08-01

    Crash, NVH (noise, vibration, harshness), and durability analyses are commonly deployed in structural CAE analysis for the mechanical design of components, especially in the automotive industry. Components manufactured by stamping constitute a major portion of the automotive structure. In CAE analysis they are modeled at a nominal state, with uniform thickness and no residual stresses and strains. In reality, however, stamped components have non-uniformly distributed thickness and residual stresses and strains resulting from stamping. It is essential to consider this stamping information in CAE analysis to accurately model the behavior of sheet metal structures under different loading conditions. Especially with the current emphasis on weight reduction by replacing conventional steels with aluminum and advanced high-strength steels, it is imperative to avoid over-design. Considering this growing need in industry, a highly automated and robust method has been integrated within Altair Hyperworks® to initialize sheet metal components in CAE models with stamping data. This paper demonstrates this new feature and the influence of stamping data on a full-car frontal crash analysis.

  18. Fully automated synthesis of [(18)F]T807, a PET tau tracer for Alzheimer's disease.

    PubMed

    Gao, Mingzhang; Wang, Min; Zheng, Qi-Huang

    2015-08-01

    The authentic standard T807 and its nitro precursor T807P, as well as a t-Boc-protected T807P precursor for radiolabeling, were synthesized from (4-bromophenyl)boronic acid, 3-bromo-4-nitropyridine and 3-bromo-6-nitropyridine with overall chemical yields of 27% in three steps, 4-7% in three to five steps, and 3-8% in four to five steps, respectively. [(18)F]T807 was synthesized from T807P by nucleophilic [(18)F]fluorination with K[(18)F]F/Kryptofix 2.2.2 in DMSO at 140 °C, followed by reduction with Fe powder/HCOOH, through manual synthesis with a 5-10% decay-corrected radiochemical yield in two steps. [(18)F]T807 was also synthesized from the t-Boc-protected T807P by concurrent [(18)F]fluorination and deprotection with K[(18)F]F/Kryptofix 2.2.2 in DMSO at 140 °C and purified by an HPLC-SPE method in a home-built automated [(18)F]radiosynthesis module with a 20-30% decay-corrected radiochemical yield in one step. The specific activity of [(18)F]T807 at end of bombardment (EOB) was 37-370 GBq/μmol.
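
    For context, the decay correction behind a "decay-corrected radiochemical yield" for fluorine-18 (half-life about 109.77 min) can be written in a few lines; the activities below are made-up illustrative numbers, not values from the paper.

```python
import math

F18_HALF_LIFE_MIN = 109.77

def decay_corrected_yield(product_mbq, start_mbq, elapsed_min,
                          half_life_min=F18_HALF_LIFE_MIN):
    """Radiochemical yield corrected back to the start of synthesis.

    product_mbq -- activity of the purified tracer at end of synthesis
    start_mbq   -- starting [18F]fluoride activity
    elapsed_min -- synthesis time in minutes
    """
    decay_factor = math.exp(-math.log(2) * elapsed_min / half_life_min)
    return 100.0 * product_mbq / (start_mbq * decay_factor)

# Illustrative numbers only: 10 GBq of [18F]fluoride, 1.5 GBq of tracer after 60 min.
print(f"{decay_corrected_yield(1500, 10000, 60):.1f} % decay-corrected")
```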

  19. UFCORIN: A fully automated predictor of solar flares in GOES X-ray flux

    NASA Astrophysics Data System (ADS)

    Muranushi, Takayuki; Shibayama, Takuya; Muranushi, Yuko Hada; Isobe, Hiroaki; Nemoto, Shigeru; Komazaki, Kenji; Shibata, Kazunari

    2015-11-01

    We have developed UFCORIN, a platform for studying and automating space weather prediction. Using our system we have tested 6160 different combinations of Solar Dynamics Observatory/Helioseismic and Magnetic Imager data as input, and simulated the prediction of GOES X-ray flux for 2 years (2011-2012) with 1 h cadence. We found that direct comparison of the true skill statistic (TSS) from small cross-validation sets is ill-posed, and we therefore used the standard scores (z) of the TSS to compare the performance of the various prediction strategies. The z of a strategy is a stochastic variable of the stochastically chosen cross-validation data set, and the z values for the three strategies best at predicting X-, ≥M-, and ≥C-class flares exceed the average z of the 6160 strategies at the 2.3σ, 2.1σ, and 3.8σ confidence levels, respectively. The best three TSS values were 0.75 ± 0.07, 0.48 ± 0.02, and 0.56 ± 0.04, respectively.
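
    The true skill statistic is the standard contingency-table quantity TSS = TP/(TP+FN) - FP/(FP+TN); a minimal sketch of that formula (not the UFCORIN code, and with invented counts) is:

```python
def true_skill_statistic(tp, fn, fp, tn):
    """TSS = hit rate - false alarm rate, ranging from -1 to +1."""
    hit_rate = tp / (tp + fn)
    false_alarm_rate = fp / (fp + tn)
    return hit_rate - false_alarm_rate

# Example: a forecast catching 8 of 10 flares with 50 false alarms in 1000 quiet hours.
print(true_skill_statistic(tp=8, fn=2, fp=50, tn=950))  # -> 0.75
```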

  20. Fully-automated roller bottle handling system for large scale culture of mammalian cells.

    PubMed

    Kunitake, R; Suzuki, A; Ichihashi, H; Matsuda, S; Hirai, O; Morimoto, K

    1997-01-20

    A fully automatic and continuous cell culture system based on roller bottles is described in this paper. The system includes a culture rack storage station for storing a large number of roller bottles filled with culture medium and inoculated with mammalian cells, and a mass-handling facility for extracting completed cultures from the roller bottles and replacing the culture medium. The various component units of the system were controlled either by a general-purpose programmable logic controller or by a dedicated controller. The system provided four sequential operation modes: cell inoculation, medium change, harvesting, and medium change. The operator could easily select and change the appropriate mode from outside the aseptic area. The development of the system made the large-scale production of mammalian cells and the manufacture and stabilization of high-quality products such as erythropoietin possible under total aseptic control, and opened the door to the industrial production of physiologically active substances as pharmaceutical drugs by mammalian cell culture.

  1. A Fully Automated Multi-Scale Flood Monitoring System Based On MODIS And TerraSAR-X Data

    NASA Astrophysics Data System (ADS)

    Martinis, Sandro; Kersten, Jens; Twele, Andre; Eberle, Jonas; Strobl, Christian; Stein, Enrico

    2013-12-01

    In this contribution, a fully automated multi-scale flood monitoring system is presented. The monitoring component of the system systematically detects the extent of inundations on a continental scale at medium spatial resolution using daily acquired data of the Moderate Resolution Imaging Spectroradiometer (MODIS). If a user-defined threshold for flooded area over an area of interest is exceeded, the crisis component of the system is triggered to derive flood information at higher spatial detail using a Synthetic Aperture Radar (SAR) satellite mission (TerraSAR-X). The automatic processing chains of both components include data pre-processing, computation and adaptation of global auxiliary data, thematic classification, and subsequent dissemination of flood maps through an interactive web client. The performance of the flood monitoring system is demonstrated for a flood event in Russia in May 2013.

  2. Evaluation of a Fully Automated Research Prototype for the Immediate Identification of Microorganisms from Positive Blood Cultures under Clinical Conditions

    PubMed Central

    Hyman, Jay M.; Walsh, John D.; Ronsick, Christopher; Wilson, Mark; Hazen, Kevin C.; Borzhemskaya, Larisa; Link, John; Clay, Bradford; Ullery, Michael; Sanchez-Illan, Mirta; Rothenberg, Steven; Robinson, Ron; van Belkum, Alex

    2016-01-01

    A clinical laboratory evaluation of an intrinsic fluorescence spectroscopy (IFS)-based identification system paired to a BacT/Alert Virtuo microbial detection system (bioMérieux, Inc., Durham, NC) was performed to assess the potential for fully automated identification of positive blood cultures. The prototype IFS system incorporates a novel method combining a simple microbial purification procedure with rapid in situ identification via spectroscopy. Results were available within 15 min of a bottle signaling positive and required no manual intervention. Among cultures positive for organisms contained within the database and producing acceptable spectra, 75 of 88 (85.2%) and 79 of 88 (89.8%) were correctly identified to the species and genus level, respectively. These results are similar to the performance of existing rapid methods. PMID:27094332

  3. Fully automated segmentation of oncological PET volumes using a combined multiscale and statistical model

    SciTech Connect

    Montgomery, David W. G.; Amira, Abbes; Zaidi, Habib

    2007-02-15

    The widespread application of positron emission tomography (PET) in clinical oncology has driven this imaging technology into a number of new research and clinical arenas. Increasing numbers of patient scans have led to an urgent need for efficient data handling and the development of new image analysis techniques to aid clinicians in the diagnosis of disease and planning of treatment. Automatic quantitative assessment of metabolic PET data is attractive and will certainly revolutionize the practice of functional imaging since it can lower variability across institutions and may enhance the consistency of image interpretation independent of reader experience. In this paper, a novel automated system for the segmentation of oncological PET data aiming at providing an accurate quantitative analysis tool is proposed. The initial step involves expectation maximization (EM)-based mixture modeling using a k-means clustering procedure, which varies voxel order for initialization. A multiscale Markov model is then used to refine this segmentation by modeling spatial correlations between neighboring image voxels. An experimental study using an anthropomorphic thorax phantom was conducted for quantitative evaluation of the performance of the proposed segmentation algorithm. The comparison of actual tumor volumes to the volumes calculated using different segmentation methodologies including standard k-means, spatial domain Markov Random Field Model (MRFM), and the new multiscale MRFM proposed in this paper showed that the latter dramatically reduces the relative error to less than 8% for small lesions (7 mm radii) and less than 3.5% for larger lesions (9 mm radii). The analysis of the resulting segmentations of clinical oncologic PET data seems to confirm that this methodology shows promise and can successfully segment patient lesions. For problematic images, this technique enables the identification of tumors situated very close to nearby high normal physiologic uptake. The
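
    A minimal stand-in for the first stage described above (EM-based mixture modeling with a k-means-style initialization) can be written with scikit-learn's GaussianMixture; the synthetic volume and component count below are illustrative, and the multiscale Markov refinement from the paper is not reproduced here.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Synthetic 3D "PET" volume: low-uptake background plus one hot lesion.
rng = np.random.default_rng(1)
volume = rng.normal(1.0, 0.2, size=(32, 32, 32))
volume[20:26, 20:26, 20:26] += 4.0                  # hot lesion
intensities = volume.reshape(-1, 1)

# EM mixture fit, initialized with k-means as in the first stage of the pipeline.
gmm = GaussianMixture(n_components=2, init_params="kmeans", random_state=0)
labels = gmm.fit_predict(intensities).reshape(volume.shape)

lesion_class = int(np.argmax(gmm.means_.ravel()))   # component with the highest mean uptake
print("voxels labelled as lesion:", int((labels == lesion_class).sum()))
```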

  4. “Smart” RCTs: Development of a Smartphone App for Fully Automated Nutrition-Labeling Intervention Trials

    PubMed Central

    Li, Nicole; Dunford, Elizabeth; Eyles, Helen; Crino, Michelle; Michie, Jo; Ni Mhurchu, Cliona

    2016-01-01

    Background There is substantial interest in the effects of nutrition labels on consumer food-purchasing behavior. However, conducting randomized controlled trials on the impact of nutrition labels in the real world presents a significant challenge. Objective The Food Label Trial (FLT) smartphone app was developed to enable fully automated trials to be conducted, delivering the intervention remotely and collecting individual-level data on food purchases, for two nutrition-labeling randomized controlled trials (RCTs) in New Zealand and Australia. Methods Two versions of the smartphone app were developed: one for a 5-arm trial (Australian) and the other for a 3-arm trial (New Zealand). The RCT protocols guided requirements for app functionality, that is, obtaining informed consent, two-stage eligibility check, questionnaire administration, randomization, intervention delivery, and outcome assessment. Intervention delivery (nutrition labels) and outcome data collection (individual shopping data) used the smartphone camera technology, where a barcode scanner was used to identify a packaged food and link it with its corresponding match in a food composition database. Scanned products were either recorded in an electronic list (data collection mode) or allocated a nutrition label on screen if matched successfully with an existing product in the database (intervention delivery mode). All recorded data were transmitted to the RCT database hosted on a server. Results In total, approximately 4000 users have downloaded the FLT app to date; 606 (Australia) and 1470 (New Zealand) users met the eligibility criteria and were randomized. Individual shopping data collected by participants currently comprise more than 96,000 (Australia) and 229,000 (New Zealand) packaged food and beverage products. Conclusions The FLT app is one of the first smartphone apps to enable conducting fully automated RCTs. Preliminary app usage statistics demonstrate large potential of such technology, both for

  5. A fully integrated and automated microsystem for rapid pharmacogenetic typing of multiple warfarin-related single-nucleotide polymorphisms.

    PubMed

    Zhuang, Bin; Han, Junping; Xiang, Guangxin; Gan, Wupeng; Wang, Shuaiqin; Wang, Dong; Wang, Lei; Sun, Jing; Li, Cai-Xia; Liu, Peng

    2016-01-01

    A fully integrated and automated microsystem consisting of low-cost, disposable plastic chips for DNA extraction and PCR amplification combined with a reusable glass capillary array electrophoresis chip in a modular-based format was successfully developed for warfarin pharmacogenetic testing. DNA extraction was performed by adopting a filter paper-based method, followed by "in situ" PCR that was carried out directly in the same reaction chamber of the chip without elution. PCR products were then co-injected with sizing standards into separation channels for detection using a novel injection electrode. The entire process was automatically conducted on a custom-made compact control and detection instrument. The limit of detection of the microsystem for the singleplex amplification of amelogenin was determined to be 0.625 ng of standard K562 DNA and 0.3 μL of human whole blood. A two-color multiplex allele-specific PCR assay for detecting the warfarin-related single-nucleotide polymorphisms (SNPs) 6853 (-1639G>A) and 6484 (1173C>T) in the VKORC1 gene and the *3 SNP (1075A>C) in the CYP2C9 gene was developed and used for validation studies. The fully automated genetic analysis was completed in two hours with a minimum requirement of 0.5 μL of input blood. Samples from patients with different genotypes were all accurately analyzed. In addition, both dried bloodstains and oral swabs were successfully processed by the microsystem with a simple modification to the DNA extraction and amplification chip. The successful development and operation of this microsystem establish the feasibility of rapid warfarin pharmacogenetic testing in routine clinical practice. PMID:26568290

  6. Protein Microarrays

    NASA Astrophysics Data System (ADS)

    Ricard-Blum, S.

    Proteins are key actors in the life of the cell, involved in many physiological and pathological processes. Since variations in the expression of messenger RNA are not systematically correlated with variations in the protein levels, the latter better reflect the way a cell functions. Protein microarrays thus supply complementary information to DNA chips. They are used in particular to analyse protein expression profiles, to detect proteins within complex biological media, and to study protein-protein interactions, which give information about the functions of those proteins [3-9]. They have the same advantages as DNA microarrays for high-throughput analysis, miniaturisation, and the possibility of automation. Section 18.1 gives a brief overview of proteins. Following this, Sect. 18.2 describes how protein microarrays can be made on flat supports, explaining how proteins can be produced and immobilised on a solid support, and discussing the different kinds of substrate and detection method. Section 18.3 discusses the particular format of protein microarrays in suspension. The diversity of protein microarrays and their applications are then reported in Sect. 18.4, with applications to therapeutics (protein-drug interactions) and diagnostics. The prospects for future developments of protein microarrays are then outlined in the conclusion. The bibliography provides an extensive list of reviews and detailed references for those readers who wish to go further in this area. Indeed, the aim of the present chapter is not to give an exhaustive or detailed analysis of the state of the art, but rather to provide the reader with the basic elements needed to understand how proteins are designed and used.

  7. Fully automated measuring equipment for aqueous boron and its application to online monitoring of industrial process effluents.

    PubMed

    Ohyama, Seiichi; Abe, Keiko; Ohsumi, Hitoshi; Kobayashi, Hirokazu; Miyazaki, Naotsugu; Miyadera, Koji; Akasaka, Kin-ichi

    2009-06-01

    Fully automated measuring equipment for aqueous boron (referred to as the online boron monitor) was developed on the basis of a rapid potentiometric determination method using a commercial BF4(-) ion-selective electrode (ISE). The equipment can measure boron compounds at concentrations ranging from a few to several hundred mg/L, and the measurement is completed in less than 20 min without any pretreatment of the sample. In the monitor, the series of operations required for the measurement, i.e., sampling and dispensing of the sample, addition of the chemicals, acquisition and processing of potentiometric data, rinsing of the measurement cell, and calibration of the BF4(-) ISE, is automated. To demonstrate the performance, we installed the monitor in full-scale coal-fired power plants and measured the effluent from a flue gas desulfurization unit. The boron concentration in the wastewater varied significantly depending on the type of coal and the load of power generation. An excellent correlation (R2 = 0.987) was obtained between measurements from the online boron monitor and inductively coupled plasma atomic emission spectrometry, which demonstrates that the developed monitor can serve as a useful tool for managing boron emissions in industrial process effluents.
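
    Potentiometric determination with an ISE ultimately rests on a Nernst-type calibration, E = E0 + S*log10(c), followed by back-calculation of the unknown concentration from a measured potential. The sketch below shows that generic calculation with invented calibration points; it is not the monitor's internal data processing.

```python
import numpy as np

# Invented calibration: electrode potential (mV) versus boron concentration (mg/L).
conc_mg_l = np.array([5.0, 20.0, 50.0, 100.0, 300.0])
potential_mv = np.array([212.0, 178.0, 155.0, 138.0, 111.0])

# Fit E = E0 + S * log10(c); an ideal monovalent-anion ISE slope is about -59 mV/decade.
slope, e0 = np.polyfit(np.log10(conc_mg_l), potential_mv, 1)

def concentration_from_potential(e_mv):
    """Back-calculate the boron concentration from a measured potential."""
    return 10.0 ** ((e_mv - e0) / slope)

print(f"slope = {slope:.1f} mV/decade")
print(f"unknown at 150 mV -> {concentration_from_potential(150.0):.0f} mg/L")
```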

  8. Screening for anabolic steroids in urine of forensic cases using fully automated solid phase extraction and LC-MS-MS.

    PubMed

    Andersen, David W; Linnet, Kristian

    2014-01-01

    A screening method for 18 frequently measured exogenous anabolic steroids and the testosterone/epitestosterone (T/E) ratio in forensic cases has been developed and validated. The method involves fully automated sample preparation, including enzyme treatment, addition of internal standards and solid phase extraction, followed by analysis by liquid chromatography-tandem mass spectrometry (LC-MS-MS) using electrospray ionization with adduct formation for two compounds. Urine samples from 580 forensic cases were analyzed to determine the T/E ratio and the occurrence of exogenous anabolic steroids. Extraction recoveries ranged from 77 to 95%, matrix effects from 48 to 78%, overall process efficiencies from 40 to 54%, and the lower limit of identification ranged from 2 to 40 ng/mL. In the 580 urine samples analyzed from routine forensic cases, 17 (2.9%) were found positive for one or more anabolic steroids. Only seven different steroids, including testosterone, were found in the material, suggesting that only a small number of common steroids are likely to occur in a forensic context. The steroids were often present at high concentrations (>100 ng/mL), and a combination of steroids and/or other drugs of abuse was seen in the majority of cases. The method presented serves as a fast and automated screening procedure, proving the suitability of LC-MS-MS for analyzing anabolic steroids.

  9. Fully automated screening of immunocytochemically stained specimens for early cancer detection

    NASA Astrophysics Data System (ADS)

    Bell, André A.; Schneider, Timna E.; Müller-Frank, Dirk A. C.; Meyer-Ebrecht, Dietrich; Böcking, Alfred; Aach, Til

    2007-03-01

    Cytopathological cancer diagnoses can be obtained less invasively than histopathological investigations. Cell-containing specimens can be obtained without pain or discomfort, bloody biopsies are avoided, and the diagnosis can, in some cases, even be made earlier. Since no tissue biopsies are necessary, these methods can also be used in screening applications, e.g., for cervical cancer. Among the cytopathological methods, a diagnosis based on the analysis of the amount of DNA in individual cells achieves high sensitivity and specificity. Yet this analysis is time-consuming, which is prohibitive for a screening application. Hence, it is advantageous to retain, by a preceding selection step, only a subset of suspicious specimens. This can be achieved using highly sensitive immunocytochemical markers such as p16 ink4a for the preselection of suspicious cells and specimens. We present a method to fully automatically acquire images at distinct positions on cytological specimens using a conventional computer-controlled microscope and an autofocus algorithm. Based on the images thus obtained, we automatically detect p16 ink4a-positive objects. This detection is in turn based on an analysis of the color distribution of the p16 ink4a marker in the Lab colorspace. A Gaussian mixture model is used to describe this distribution, and the method described in this paper so far achieves a sensitivity of up to 90%.
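
    The color-based detection step can be imitated in a few lines: convert the image to Lab, fit a Gaussian mixture to the pixel colors, and keep pixels belonging to the component closest to a reference stain color. This is only a hedged approximation of the Gaussian-mixture idea described above; the component count, reference color and test image are invented.

```python
import numpy as np
from skimage.color import rgb2lab
from sklearn.mixture import GaussianMixture

def stain_positive_mask(rgb_image, reference_lab, n_components=3):
    """Mark pixels whose mixture component is closest to a reference stain color.

    rgb_image     -- H x W x 3 float image in [0, 1]
    reference_lab -- approximate Lab color of the positively stained objects
    """
    lab = rgb2lab(rgb_image).reshape(-1, 3)
    gmm = GaussianMixture(n_components=n_components, random_state=0).fit(lab)
    # Component whose mean is nearest (Euclidean) to the reference color.
    stain_comp = int(np.argmin(np.linalg.norm(gmm.means_ - reference_lab, axis=1)))
    labels = gmm.predict(lab)
    return (labels == stain_comp).reshape(rgb_image.shape[:2])

# Illustrative use on a random image with an assumed stain color in Lab coordinates.
rng = np.random.default_rng(0)
img = rng.random((64, 64, 3))
mask = stain_positive_mask(img, reference_lab=np.array([40.0, 20.0, 40.0]))
print("candidate positive pixels:", int(mask.sum()))
```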

  10. Fully automated precision predictions for heavy neutrino production mechanisms at hadron colliders

    NASA Astrophysics Data System (ADS)

    Degrande, Céline; Mattelaer, Olivier; Ruiz, Richard; Turner, Jessica

    2016-09-01

    Motivated by TeV-scale neutrino mass models, we propose a systematic treatment of heavy neutrino (N ) production at hadron colliders. Our simple and efficient modeling of the vector boson fusion (VBF) W±γ →N ℓ± and N ℓ±+nj signal definitions resolves collinear and soft divergences that have plagued past studies, and is applicable to other color-singlet processes, e.g., associated Higgs (W±h), sparticle (ℓ˜±νℓ˜), and charged Higgs (h±±h∓) production. We present, for the first time, a comparison of all leading N production modes, including both gluon fusion (GF) g g →Z*/h*→N νℓ (-) and VBF. We obtain fully differential results up to next-to-leading order (NLO) in QCD accuracy using a Monte Carlo tool chain linking feynrules, nloct, and madgraph5_amc@nlo. Associated model files are publicly available. At the 14 TeV LHC, the leading order GF rate is small and comparable to the NLO N ℓ±+1 j rate; at a future 100 TeV Very Large Hadron Collider, GF dominates for mN=300 - 1500 GeV , beyond which VBF takes the lead.

  12. A preliminary study for fully automated quantification of psoriasis severity using image mapping

    NASA Astrophysics Data System (ADS)

    Mukai, Kazuhiro; Iyatomi, Hitoshi

    2014-03-01

    Psoriasis is a common chronic skin disease that seriously detracts from patients' QoL. Since there is no known permanent cure, keeping the disease under appropriate control is necessary, and quantification of its severity is therefore important. In clinical practice, the psoriasis area and severity index (PASI) is commonly used for this purpose; however, it is often subjective and laborious. A fully automatic computer-assisted area and severity index (CASI) has been proposed for objective quantification of skin disease. It assesses the size and density of erythema based on digital image analysis, but it does not account for the effects of differing geometrical conditions during clinical follow-up (i.e., variability in direction and distance between camera and patient). In this study, we proposed an image alignment method for clinical images and investigated quantifying the severity of psoriasis under clinical follow-up combined with the idea of CASI. The proposed method finds geometrically corresponding points on the patient's body (ROI) between images using the Scale Invariant Feature Transform (SIFT) and applies an affine transform to map pixel values from one image to the other. In this study, clinical images from 7 patients with psoriasis lesions on their trunk under clinical follow-up were used. In each series, our image alignment algorithm aligns images to the geometry of the first image. The proposed method aligned images appropriately on visual assessment and confirmed that psoriasis areas were properly extracted using the CASI approach. Although we cannot compare PASI and CASI directly owing to their different definitions of ROI, we confirmed a strong correlation between the two scores with our image quantification method.
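
    A hedged OpenCV sketch of this kind of alignment step (SIFT keypoints, ratio-test matching, affine estimation, and remapping onto the first image's geometry) is shown below; the matcher, ratio threshold and file names are illustrative rather than the authors' values.

```python
import cv2
import numpy as np

def align_to_reference(reference_bgr, moving_bgr):
    """Map `moving_bgr` onto the geometry of `reference_bgr` with SIFT + affine."""
    sift = cv2.SIFT_create()
    gray_ref = cv2.cvtColor(reference_bgr, cv2.COLOR_BGR2GRAY)
    gray_mov = cv2.cvtColor(moving_bgr, cv2.COLOR_BGR2GRAY)
    kp_ref, des_ref = sift.detectAndCompute(gray_ref, None)
    kp_mov, des_mov = sift.detectAndCompute(gray_mov, None)

    # Lowe ratio test on brute-force matches.
    matcher = cv2.BFMatcher()
    good = [m for m, n in matcher.knnMatch(des_mov, des_ref, k=2)
            if m.distance < 0.75 * n.distance]

    src = np.float32([kp_mov[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp_ref[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    affine, _ = cv2.estimateAffine2D(src, dst, method=cv2.RANSAC)

    h, w = reference_bgr.shape[:2]
    return cv2.warpAffine(moving_bgr, affine, (w, h))

# Hypothetical usage: aligned = align_to_reference(cv2.imread("visit1.jpg"), cv2.imread("visit2.jpg"))
```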

  13. Fully automated segmentation of carotid and vertebral arteries from contrast enhanced CTA

    NASA Astrophysics Data System (ADS)

    Cuisenaire, Olivier; Virmani, Sunny; Olszewski, Mark E.; Ardon, Roberto

    2008-03-01

    We propose a method for segmenting and labeling the main head and neck vessels (common, internal, and external carotid, vertebral) from a contrast-enhanced computed tomography angiography (CTA) volume. First, an initial centerline of each vessel is extracted. Next, the vessels are segmented using 3D active objects initialized from the first step. Finally, the true centerline is identified by smoothly deforming it away from the segmented mask edges using a spline-snake. We focus particularly on the novel initial centerline extraction technique. It uses a locally adaptive front propagation algorithm that attempts to find the optimal path connecting the ends of the vessel, typically from the lowest image of the scan to the Circle of Willis in the brain. It uses a patient-adapted anatomical model of the different vessels both to initialize and to constrain this fast marching, thus eliminating the need for manual selection of seed points. The method is evaluated using data from multiple regions (USA, India, China, Israel), including a variety of scanners (10-, 16-, 40-, 64-slice; Brilliance CT, Philips Healthcare, Cleveland, OH, USA), contrast agent doses, and image resolutions. It is fully successful in over 90% of patients and only misses a single vessel in most remaining cases. We also demonstrate its robustness to metal and dental artifacts and to anatomical variability. Total processing time is approximately two minutes with no user interaction, which dramatically improves the workflow over existing clinical software. It also reduces patient dose exposure by obviating the need to acquire an unenhanced scan for bone suppression, as this can be done by applying the segmentation masks.
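
    The front-propagation idea (find an optimal path between the two ends of a vessel through a cost image) can be caricatured with a generic minimal-path routine; the sketch below uses scikit-image's route_through_array on a synthetic 2D slice rather than the authors' fast-marching implementation, and the cost definition is a placeholder.

```python
import numpy as np
from skimage.graph import route_through_array

# Synthetic 2D slice: a bright, curved "vessel" on a dark background.
img = np.zeros((100, 100))
rows = np.arange(100)
cols = (50 + 20 * np.sin(rows / 15.0)).astype(int)
img[rows, cols] = 1.0

# Cheap to travel where the vessel is bright, expensive elsewhere.
cost = 1.0 / (img + 0.05)

start, end = (0, cols[0]), (99, cols[-1])   # the two vessel ends
path, total_cost = route_through_array(cost, start, end,
                                       fully_connected=True, geometric=True)
centerline = np.array(path)
print("centerline samples:", centerline.shape[0], "total cost:", round(total_cost, 2))
```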

  14. Fully automated circulating tumor cell isolation platform with large-volume capacity based on lab-on-a-disc.

    PubMed

    Park, Jong-Myeon; Kim, Minseok S; Moon, Hui-Sung; Yoo, Chang Eun; Park, Donghyun; Kim, Yeon Jeong; Han, Kyung-Yeon; Lee, June-Young; Oh, Jin Ho; Kim, Sun Soo; Park, Woong-Yang; Lee, Won-Yong; Huh, Nam

    2014-04-15

    Full automation with high purity for circulating tumor cell (CTC) isolation has been regarded as a key goal in making CTC analysis a "bench-to-bedside" technology. Here, we have developed a novel centrifugal microfluidic platform that can isolate these rare cells from a large volume of whole blood. To isolate CTCs from whole blood, we introduce, for the first time, a disc device that both offers the largest sample capacity reported to date and manipulates blood cells. The fully automated disc platform could handle 5 mL of blood thanks to a blood chamber designed with a laterally oriented triangular obstacle structure (TOS). To guarantee the high purity needed for molecular analysis of the rare cells, CTCs were bound to microbeads coated with anti-EpCAM so that their density differed from that of blood cells; the CTCs, now heavier than blood cells, were the only cells to settle below a density gradient medium (DGM) layer. To understand the movement of CTCs under centrifugal force, we performed computational fluid dynamics simulation and found that their major trajectories followed the boundary walls of the DGM chamber, which allowed us to optimize the chamber design. After whole blood was loaded into the blood chamber of the disc platform, size- and density-amplified cancer cells were isolated within 78 min, with contamination as low as approximately 12 leukocytes per milliliter. As a model of molecular analysis toward personalized cancer treatment, we performed epidermal growth factor receptor (EGFR) mutation analysis with HCC827 lung cancer cells, and the mutation was successfully detected in the isolated cells by PCR clamping and direct sequencing.

  15. A fully automated multi-modal computer aided diagnosis approach to coronary calcium scoring of MSCT images

    NASA Astrophysics Data System (ADS)

    Wu, Jing; Ferns, Gordon; Giles, John; Lewis, Emma

    2012-03-01

    Inter- and intra-observer variability is a problem often faced when an expert or observer is tasked with assessing the severity of a disease. This issue is keenly felt in coronary calcium scoring of patients suffering from atherosclerosis, where in clinical practice the observer must identify first the presence, and then the location, of candidate calcified plaques within the coronary arteries that may prevent oxygenated blood flow to the heart muscle. However, it can be difficult for a human observer to differentiate calcified plaques located in the coronary arteries from those found in surrounding anatomy such as the mitral valve or pericardium. In addition to the benefits to scoring accuracy, fast, low-dose multi-slice CT imaging can acquire the entire heart within a single breath hold, exposing the patient to a lower radiation dose, which is beneficial for a progressive disease such as atherosclerosis where multiple scans may be required. Presented here is a fully automated method for calcium scoring using both the traditional Agatston method and the volume scoring method. Elimination of unwanted regions of the cardiac image slices, such as lungs, ribs, and vertebrae, is carried out using adaptive heart isolation; such regions cannot contain calcified plaques but can be of similar intensity, and their removal aids detection. Removal of both the ascending and descending aortas, as they contain clinically insignificant plaques, is necessary before the final calcium scores are calculated and compared against ground-truth scores averaged from three expert observers. The results presented here are intended to show the feasibility of, and the requirement for, an automated scoring method to reduce the subjectivity and reproducibility error inherent in manual clinical calcium scoring.
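
    For reference, the Agatston score itself is a simple rule: lesion area in mm², weighted by peak HU, summed per slice. The sketch below applies the standard definition to one already-masked axial slice; the demo array, pixel spacing and minimum lesion area are illustrative, and this is not the authors' pipeline.

```python
import numpy as np
from scipy import ndimage

def agatston_weight(max_hu):
    """Standard density weighting: 130-199 HU -> 1, 200-299 -> 2, 300-399 -> 3, >=400 -> 4."""
    return 1 + min(3, int((max_hu - 100) // 100)) if max_hu >= 130 else 0

def agatston_slice_score(hu_slice, pixel_area_mm2, threshold_hu=130, min_area_mm2=1.0):
    """Agatston contribution of one axial slice restricted to the coronary region."""
    mask = hu_slice >= threshold_hu
    labels, n = ndimage.label(mask)
    score = 0.0
    for lesion in range(1, n + 1):
        area = (labels == lesion).sum() * pixel_area_mm2
        if area < min_area_mm2:
            continue                      # ignore tiny, likely-noise specks
        peak = hu_slice[labels == lesion].max()
        score += area * agatston_weight(peak)
    return score

# Illustrative 5-pixel lesion peaking at 320 HU with 0.5 x 0.5 mm pixels -> 1.25 mm2 * 3 = 3.75.
demo = np.zeros((64, 64))
demo[30, 30:35] = [320, 150, 200, 180, 140]
print(agatston_slice_score(demo, pixel_area_mm2=0.25))
```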

  16. Fully automated molecular biology routines on a plasmid-based functional proteomic workcell: Evaluation and Characterization of Yeast Strains Optimized for Growth on Xylose Expressing "Stealth" Insecticidal Peptides.

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Optimization of genes important to production of fuel ethanol from hemicellulosic biomass for use in developing improved commercial yeast strains is necessary to meet the rapidly expanding need for ethanol. The United States Department of Agriculture has developed a fully automated platform for mol...

  17. Fully-automated radiosynthesis and in vitro uptake investigation of [N-methyl-¹¹C]methylene blue.

    PubMed

    Schweiger, Lutz F; Smith, Tim A D

    2013-10-01

    Malignant melanoma is a type of skin cancer which can spread rapidly if it is not detected early and treated. Positron Emission Tomography (PET) is a powerful imaging technique for detecting cancer, but with only a limited number of radiotracers available, the development of novel PET probes for the detection and prevention of cancer is imperative. In the present study we present the fully-automated radiosynthesis of [N-methyl-(11)C]methylene blue and an in vitro uptake study in metastatic melanoma cell lines. Using the GE TRACERlab FXc Pro module, [N-methyl-(11)C]methylene blue was isolated via solid-phase extraction in an average time of 36 min after end of bombardment and formulated with a radiochemical purity greater than 95%. The in vitro uptake study of [N-methyl-(11)C]methylene blue in the SK-MEL28 melanin-expressing melanoma cell line demonstrated site-specific binding of 51%, supporting its potential as a promising melanoma PET imaging agent.

  18. Fully-automated in-syringe dispersive liquid-liquid microextraction for the determination of caffeine in coffee beverages.

    PubMed

    Frizzarin, Rejane M; Maya, Fernando; Estela, José M; Cerdà, Víctor

    2016-12-01

    A novel fully-automated magnetic stirring-assisted lab-in-syringe analytical procedure has been developed for the fast and efficient dispersive liquid-liquid microextraction (DLLME) of caffeine in coffee beverages. The procedure is based on the microextraction of caffeine with a minute amount of dichloromethane, isolating caffeine from the sample matrix with no further sample pretreatment. The relevant extraction parameters, such as the dispersive solvent, the proportion of aqueous to organic phase, pH and flow rates, were carefully evaluated. Caffeine quantification was linear from 2 to 75 mg L(-1), with detection and quantification limits of 0.46 mg L(-1) and 1.54 mg L(-1), respectively. A coefficient of variation (n=8; 5 mg L(-1)) of 2.1% and a sampling rate of 16 h(-1) were obtained. The procedure was satisfactorily applied to the determination of caffeine in brewed, instant and decaf coffee samples, with the results validated by high-performance liquid chromatography. PMID:27374593
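
    Detection and quantification limits like those quoted above are conventionally derived from the calibration line (LOD about 3.3*s/slope and LOQ about 10*s/slope, with s the residual standard deviation of the fit); a short, generic sketch with invented calibration points, not the paper's data:

```python
import numpy as np

# Invented caffeine calibration: concentration (mg/L) vs. detector response.
conc = np.array([2.0, 10.0, 25.0, 50.0, 75.0])
signal = np.array([0.041, 0.198, 0.502, 0.995, 1.510])

slope, intercept = np.polyfit(conc, signal, 1)
residuals = signal - (slope * conc + intercept)
s_res = residuals.std(ddof=2)              # residual standard deviation of the linear fit

lod = 3.3 * s_res / slope                  # limit of detection
loq = 10.0 * s_res / slope                 # limit of quantification
print(f"LOD = {lod:.2f} mg/L, LOQ = {loq:.2f} mg/L")
```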

  19. Development of a Fully Automated, GPS Based Monitoring System for Disaster Prevention and Emergency Preparedness: PPMS+RT

    PubMed Central

    Bond, Jason; Kim, Don; Chrzanowski, Adam; Szostak-Chrzanowski, Anna

    2007-01-01

    The increasing number of structural collapses, slope failures and other natural disasters has led to a demand for new sensors, sensor integration techniques and data processing strategies for deformation monitoring systems. In order to meet the extraordinary accuracy requirements for displacement detection in recent deformation monitoring projects, research has been devoted to integrating the Global Positioning System (GPS) as a monitoring sensor. Although GPS has been used for monitoring purposes worldwide, certain environments pose challenges where conventional processing techniques cannot provide the required accuracy with sufficient update frequency. Described here is the development of a fully automated, continuous, real-time monitoring system that employs GPS sensors and pseudolite technology to meet these requirements in such environments. Ethernet and/or serial port communication techniques are used to transfer data between GPS receivers at target points and a central processing computer. The data can be processed locally or remotely, based upon client needs. A test was conducted which demonstrated that a 10 mm displacement at a target point could be detected remotely using the designed system. This information could then be used to signal an alarm if conditions are deemed to be unsafe.

  20. Detection of motile micro-organisms in biological samples by means of a fully automated image processing system

    NASA Astrophysics Data System (ADS)

    Alanis, Elvio; Romero, Graciela; Alvarez, Liliana; Martinez, Carlos C.; Hoyos, Daniel; Basombrio, Miguel A.

    2001-08-01

    A fully automated image processing system for the detection of motile microorganisms in biological samples is presented. The system is specifically calibrated for determining the concentration of Trypanosoma cruzi parasites in blood samples of mice infected with Chagas disease, but the method can be adapted for use with other biological samples. A thin layer of blood infected by T. cruzi parasites is examined in a common microscope, in which images of the field of view are taken by a CCD camera and temporarily stored in the computer memory. In a typical field, a few motile parasites are observable surrounded by red blood cells. The parasites have low contrast and are thus difficult to detect visually, but their great motility betrays their presence through the movement of the neighboring red cells. Several consecutive images of the same field are taken, decorrelated with each other where parasites are present, and digitally processed in order to measure the number of parasites present in the field. Several fields are sequentially processed in the same fashion, displacing the sample by means of step motors driven by the computer. A direct advantage of this system is that its results are more reliable and the process is less time-consuming than the current subjective evaluations made visually by technicians.
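
    The decorrelation idea (motile parasites make consecutive frames differ locally while static red cells cancel out) can be approximated by thresholding the pixel-wise change across a short frame stack and counting the resulting regions; this is only a hedged sketch with invented thresholds and synthetic frames, not the authors' algorithm.

```python
import numpy as np
from scipy import ndimage

def count_motile_regions(frames, diff_threshold=25, min_pixels=20):
    """Count connected regions of strong frame-to-frame change.

    frames -- array of shape (n_frames, height, width), grayscale
    """
    stack = np.asarray(frames, dtype=float)
    motion = np.abs(np.diff(stack, axis=0)).max(axis=0)   # strongest change per pixel
    mask = motion > diff_threshold
    labels, n = ndimage.label(mask)
    sizes = np.asarray(ndimage.sum(mask, labels, index=np.arange(1, n + 1)))
    return int((sizes >= min_pixels).sum())

# Synthetic field: static noisy background plus one small moving blob.
rng = np.random.default_rng(0)
frames = rng.normal(100, 2, size=(8, 120, 160))
for t in range(8):
    frames[t, 40:48, 30 + 3 * t: 38 + 3 * t] += 80        # "parasite" drifting to the right
print("motile objects detected:", count_motile_regions(frames))
```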

  1. Fully automated measurement of field-dependent AMS using MFK1-FA Kappabridge equipped with 3D rotator

    NASA Astrophysics Data System (ADS)

    Chadima, Martin; Studynka, Jan

    2013-04-01

    Low-field magnetic susceptibility of paramagnetic and diamagnetic minerals is field-independent by definition, and it is also field-independent in pure magnetite. On the other hand, in pyrrhotite, hematite and high-Ti titanomagnetite it may be clearly field-dependent. Consequently, the field-dependent AMS enables the magnetic fabric of the latter group of minerals to be separated from the whole-rock AMS. The methods for determining the field-dependent AMS consist of separate measurements of each specimen in several fields within the Rayleigh Law range and subsequent processing in which the field-independent and field-dependent AMS components are calculated. The disadvantage of this technique is that each specimen must be measured several times, which is relatively laborious and time-consuming. Recently, a new 3D rotator was developed for the MFK1-FA Kappabridge, which rotates the specimen simultaneously about two axes with different velocities. The measurement is fully automated in such a way that, once the specimen is inserted into the rotator, no additional manipulation is required to measure the full AMS tensor. Consequently, the 3D rotator enables the AMS tensors to be measured at pre-set field intensities without any operator interference. The whole procedure is controlled by the newly developed Safyr5 software; once the measurements are finished, the acquired data are immediately processed and can be visualized in a standard way.

  2. Development and laboratory-scale testing of a fully automated online flow cytometer for drinking water analysis.

    PubMed

    Hammes, Frederik; Broger, Tobias; Weilenmann, Hans-Ulrich; Vital, Marius; Helbing, Jakob; Bosshart, Ulrich; Huber, Pascal; Odermatt, Res Peter; Sonnleitner, Bernhard

    2012-06-01

    Accurate and sensitive online detection tools would benefit both fundamental research and practical applications in aquatic microbiology. Here, we describe the development and testing of an online flow cytometer (FCM), with a specific use foreseen in the field of drinking water microbiology. The system incorporated fully automated sampling and fluorescent labeling of bacterial nucleic acids with analysis at 5-min intervals for periods in excess of 24 h. The laboratory scale testing showed sensitive detection (< 5% error) of bacteria over a broad concentration range (1 × 10(3) -1 × 10(6) cells mL(-1) ) and particularly the ability to track both gradual changes and dramatic events in water samples. The system was tested with bacterial pure cultures as well as indigenous microbial communities from natural water samples. Moreover, we demonstrated the possibility of using either a single fluorescent dye (e.g., SYBR Green I) or a combination of two dyes (SYBR Green I and Propidium Iodide), thus broadening the application possibilities of the system. The online FCM approach described herein has considerable potential for routine and continuous monitoring of drinking water, optimization of specific drinking water processes such as biofiltration or disinfection, as well as aquatic microbiology research in general.

  3. Fully automated Liquid Extraction-Based Surface Sampling and Ionization Using a Chip-Based Robotic Nanoelectrospray Platform

    SciTech Connect

    Kertesz, Vilmos; Van Berkel, Gary J

    2010-01-01

    A fully automated liquid extraction-based surface sampling device utilizing an Advion NanoMate chip-based infusion nanoelectrospray ionization system is reported. Analyses were enabled for discrete spot sampling by using the Advanced User Interface of the current commercial control software. This software interface provided the parameter control necessary for the NanoMate robotic pipettor to both form and withdraw a liquid microjunction for sampling from a surface. The system was tested with three types of analytically important sample surface types, viz., spotted sample arrays on a MALDI plate, dried blood spots on paper, and whole-body thin tissue sections from drug dosed mice. The qualitative and quantitative data were consistent with previous studies employing other liquid extraction-based surface sampling techniques. The successful analyses performed here utilized the hardware and software elements already present in the NanoMate system developed to handle and analyze liquid samples. Implementation of an appropriate sample (surface) holder, a solvent reservoir, faster movement of the robotic arm, finer control over solvent flow rate when dispensing and retrieving the solution at the surface, and the ability to select any location on a surface to sample from would improve the analytical performance and utility of the platform.

  4. Determination of 18 beta-glycyrrhetinic acid in human serum using the fully automated ALCA-system.

    PubMed

    Heilmann, P; Heide, J; Schöneshöfer, M

    1997-07-01

    We report a method for the determination of 18 beta-glycyrrhetinic acid (glycyrrhetinic acid) in human serum using the ALCA-system. The technology of the ALCA-system is based on the principles of adsorptive and desorptive processes between liquid and solid phases. The assay runs fully automated and is selective. Procedural losses throughout the analysis are negligible, thereby allowing for external calibration. The calibration curve is linear up to 10 mg/l and concentrations as low as 10 micrograms/l are detectable. The CV is 2.5% for within- and 7.5% for between-assay precision at a level of 50 micrograms/l, and 1.2% for within- and 8.5% for between-assay precision at a level of 500 micrograms/l. Specific and expensive reagents are not necessary and time-consuming manual operations are not involved. This assay can be selected from a wide spectrum of methods at any time. Thus, the present method is well-suited for drug monitoring purposes in the routine laboratory. In a pharmacokinetic study we measured serum levels of glycyrrhetinic acid in ten healthy young volunteers after ingestion of 500 mg glycyrrhetinic acid. Maximum levels of glycyrrhetinic acid were 6.3 mg/l, 2 to 4 hours after ingestion. Twenty-four hours after ingestion, seven probands still had glycyrrhetinic acid levels above the detection limit, with a mean level of 0.33 mg/l.

  5. Towards a fully automated computation of RG functions for the three-dimensional O(N) vector model: parametrizing amplitudes

    NASA Astrophysics Data System (ADS)

    Guida, Riccardo; Ribeca, Paolo

    2006-02-01

    Within the framework of the field-theoretical description of second-order phase transitions via the three-dimensional O(N) vector model, accurate predictions for critical exponents can be obtained from (resummation of) the perturbative series of renormalization-group functions, which are in turn derived, following Parisi's approach, from the expansions of appropriate field correlators evaluated at zero external momenta. Such a technique was fully exploited 30 years ago in two seminal works of Baker, Nickel, Green and Meiron, which led to the knowledge of the β-function up to the six-loop level; they succeeded in obtaining a precise numerical evaluation of all needed Feynman amplitudes in momentum space by lowering the dimensionality of each integration with a cleverly arranged set of computational simplifications. In fact, extending this computation is not straightforward, due both to the factorial proliferation of relevant diagrams and to the increasing dimensionality of their associated integrals; in any case, this task can reasonably be carried out only in the framework of an automated environment. On the road towards the creation of such an environment, we show here how a strategy closely inspired by that of Nickel and co-workers can be stated in algorithmic form and successfully implemented on a computer. As an application, we plot the minimized distributions of residual integrations for the sets of diagrams needed to obtain the RG functions to the full seven-loop level; these represent a good estimate of the computational effort that will be required to improve the currently available estimates of critical exponents.

  6. Fully-Automated μMRI Morphometric Phenotyping of the Tc1 Mouse Model of Down Syndrome

    PubMed Central

    Modat, Marc; Cardoso, M. Jorge; Ma, Da; Holmes, Holly E.; Yu, Yichao; O’Callaghan, James; Cleary, Jon O.; Sinclair, Ben; Wiseman, Frances K.; Tybulewicz, Victor L. J.; Fisher, Elizabeth M. C.; Lythgoe, Mark F.; Ourselin, Sébastien

    2016-01-01

    We describe a fully automated pipeline for the morphometric phenotyping of mouse brains from μMRI data, and show its application to the Tc1 mouse model of Down syndrome, to identify new morphological phenotypes in the brain of this first transchromosomic animal carrying human chromosome 21. We incorporate an accessible approach for simultaneously scanning multiple ex vivo brains, requiring only a 3D-printed brain holder, and novel image processing steps for their separation and orientation. We employ clinically established multi-atlas techniques–superior to single-atlas methods–together with publicly-available atlas databases for automatic skull-stripping and tissue segmentation, providing high-quality, subject-specific tissue maps. We follow these steps with group-wise registration, structural parcellation and both Voxel- and Tensor-Based Morphometry–advantageous for their ability to highlight morphological differences without the laborious delineation of regions of interest. We show the application of freely available open-source software developed for clinical MRI analysis to mouse brain data: NiftySeg for segmentation and NiftyReg for registration, and discuss atlases and parameters suitable for the preclinical paradigm. We used this pipeline to compare 29 Tc1 brains with 26 wild-type littermate controls, imaged ex vivo at 9.4T. We show an unexpected increase in Tc1 total intracranial volume and, controlling for this, local volume and grey matter density reductions in the Tc1 brain compared to the wild-types, most prominently in the cerebellum, in agreement with human DS and previous histological findings. PMID:27658297

  7. Real-time direct cell concentration and viability determination using a fully automated microfluidic platform for standalone process monitoring.

    PubMed

    Nunes, P S; Kjaerulff, S; Dufva, M; Mogensen, K B

    2015-06-21

    The industrial production of cells has a large unmet need for greater process monitoring, in addition to the standard temperature, pH and oxygen concentration determination. Monitoring the cell health by a vast range of fluorescence cell-based assays can greatly improve the feedback control and thereby ensure optimal cell production, by prolonging the fermentation cycle and increasing the bioreactor output. In this work, we report on the development of a fully automated microfluidic system capable of extracting samples directly from a bioreactor, diluting the sample, staining the cells, and determining the total and dead cell concentrations, within a time frame of 10.3 min. The platform consists of custom made stepper motor actuated peristaltic pumps and valves, fluidic interconnections, sample to waste liquid management and image cytometry-based detection. The total concentration of cells is determined by brightfield microscopy, while fluorescence detection is used to detect propidium iodide stained non-viable cells. This method can be incorporated into facilities with bioreactors to monitor the cell concentration and viability during the cultivation process. Here, we demonstrate the microfluidic system performance by monitoring in real time the cell concentration and viability of yeast extracted directly from an in-house made bioreactor. This is the first demonstration of using the Dean drag force, generated due to the implementation of a curved microchannel geometry in conjunction with high flow rates, to promote passive mixing of cell samples and thus homogenization of the diluted cell plug. The autonomous operation of the fluidics furthermore allows implementation of intelligent protocols for administering air bubbles from the bioreactor in the microfluidic system, so that these will be guided away from the imaging region, thereby significantly improving both the robustness of the system and the quality of the data. PMID:25923294

  8. Multicenter evaluation of fully automated BACTEC Mycobacteria Growth Indicator Tube 960 system for susceptibility testing of Mycobacterium tuberculosis.

    PubMed

    Bemer, Pascale; Palicova, Frantiska; Rüsch-Gerdes, Sabine; Drugeon, Henri B; Pfyffer, Gaby E

    2002-01-01

    The reliability of the BACTEC Mycobacteria Growth Indicator Tube (MGIT) 960 system for testing of Mycobacterium tuberculosis susceptibility to the three front-line drugs (isoniazid [INH], rifampin [RIF], and ethambutol [EMB]) plus streptomycin (STR) was compared to that of the BACTEC 460 TB system. The proportion method was used to resolve discrepant results by an independent arbiter. One hundred and ten strains were tested with an overall agreement of 93.5%. Discrepant results were obtained for seven strains (6.4%) with INH (resistant by BACTEC MGIT 960; susceptible by BACTEC 460 TB), for one strain (0.9%) with RIF (resistant by BACTEC MGIT 960; susceptible by BACTEC 460 TB), for seven strains (6.4%) with EMB (six resistant by BACTEC MGIT 960 and susceptible by BACTEC 460 TB; one susceptible by BACTEC MGIT 960 and resistant by BACTEC 460 TB), and for 19 strains (17.3%) with STR (resistant by BACTEC MGIT 960 and susceptible by BACTEC 460 TB). After resolution of discrepant results, the sensitivity of the BACTEC MGIT 960 system was 100% for all four drugs and specificity ranged from 89.8% for STR to 100% for RIF. Turnaround times were 4.6 to 11.7 days (median, 6.5 days) for BACTEC MGIT 960 and 4.0 to 10.0 days (median, 7.0 days) for BACTEC 460 TB. These data demonstrate that the fully automated and nonradiometric BACTEC MGIT 960 system is an accurate method for rapid susceptibility testing of M. tuberculosis.
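
    A rough illustration of how agreement, sensitivity, and specificity figures such as those above are derived from paired susceptibility calls is sketched below; the per-drug counts in the example are hypothetical and not taken from the study.

    ```python
    # Overall agreement and per-drug sensitivity/specificity from paired
    # susceptibility calls ('R' = resistant, 'S' = susceptible).
    # The example counts are hypothetical, not data from the study.

    def agreement(pairs):
        """pairs: list of (new_method, reference) calls."""
        return 100.0 * sum(new == ref for new, ref in pairs) / len(pairs)

    def sensitivity_specificity(pairs):
        """Resistance is treated as the 'positive' result."""
        tp = sum(ref == "R" and new == "R" for new, ref in pairs)
        fn = sum(ref == "R" and new == "S" for new, ref in pairs)
        tn = sum(ref == "S" and new == "S" for new, ref in pairs)
        fp = sum(ref == "S" and new == "R" for new, ref in pairs)
        return 100.0 * tp / (tp + fn), 100.0 * tn / (tn + fp)

    # Hypothetical single-drug example: 103 concordant and 7 discrepant strains.
    calls = [("R", "R")] * 10 + [("S", "S")] * 93 + [("R", "S")] * 7
    print(f"agreement: {agreement(calls):.1f}%")
    print("sensitivity, specificity:", sensitivity_specificity(calls))
    ```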

  9. Multicenter Evaluation of Fully Automated BACTEC Mycobacteria Growth Indicator Tube 960 System for Susceptibility Testing of Mycobacterium tuberculosis

    PubMed Central

    Bemer, Pascale; Palicova, Frantiska; Rüsch-Gerdes, Sabine; Drugeon, Henri B.; Pfyffer, Gaby E.

    2002-01-01

    The reliability of the BACTEC Mycobacteria Growth Indicator Tube (MGIT) 960 system for testing of Mycobacterium tuberculosis susceptibility to the three front-line drugs (isoniazid [INH], rifampin [RIF], and ethambutol [EMB]) plus streptomycin (STR) was compared to that of the BACTEC 460 TB system. The proportion method was used to resolve discrepant results by an independent arbiter. One hundred and ten strains were tested with an overall agreement of 93.5%. Discrepant results were obtained for seven strains (6.4%) with INH (resistant by BACTEC MGIT 960; susceptible by BACTEC 460 TB), for one strain (0.9%) with RIF (resistant by BACTEC MGIT 960; susceptible by BACTEC 460 TB), for seven strains (6.4%) with EMB (six resistant by BACTEC MGIT 960 and susceptible by BACTEC 460 TB; one susceptible by BACTEC MGIT 960 and resistant by BACTEC 460 TB), and for 19 strains (17.3%) with STR (resistant by BACTEC MGIT 960 and susceptible by BACTEC 460 TB). After resolution of discrepant results, the sensitivity of the BACTEC MGIT 960 system was 100% for all four drugs and specificity ranged from 89.8% for STR to 100% for RIF. Turnaround times were 4.6 to 11.7 days (median, 6.5 days) for BACTEC MGIT 960 and 4.0 to 10.0 days (median, 7.0 days) for BACTEC 460 TB. These data demonstrate that the fully automated and nonradiometric BACTEC MGIT 960 system is an accurate method for rapid susceptibility testing of M. tuberculosis. PMID:11773109

  10. Real-time direct cell concentration and viability determination using a fully automated microfluidic platform for standalone process monitoring.

    PubMed

    Nunes, P S; Kjaerulff, S; Dufva, M; Mogensen, K B

    2015-06-21

    The industrial production of cells has a large unmet need for greater process monitoring, in addition to the standard temperature, pH and oxygen concentration determination. Monitoring the cell health by a vast range of fluorescence cell-based assays can greatly improve the feedback control and thereby ensure optimal cell production, by prolonging the fermentation cycle and increasing the bioreactor output. In this work, we report on the development of a fully automated microfluidic system capable of extracting samples directly from a bioreactor, diluting the sample, staining the cells, and determining the total and dead cell concentrations, within a time frame of 10.3 min. The platform consists of custom made stepper motor actuated peristaltic pumps and valves, fluidic interconnections, sample to waste liquid management and image cytometry-based detection. The total concentration of cells is determined by brightfield microscopy, while fluorescence detection is used to detect propidium iodide stained non-viable cells. This method can be incorporated into facilities with bioreactors to monitor the cell concentration and viability during the cultivation process. Here, we demonstrate the microfluidic system performance by monitoring in real time the cell concentration and viability of yeast extracted directly from an in-house made bioreactor. This is the first demonstration of using the Dean drag force, generated due to the implementation of a curved microchannel geometry in conjunction with high flow rates, to promote passive mixing of cell samples and thus homogenization of the diluted cell plug. The autonomous operation of the fluidics furthermore allows implementation of intelligent protocols for administering air bubbles from the bioreactor in the microfluidic system, so that these will be guided away from the imaging region, thereby significantly improving both the robustness of the system and the quality of the data.

  11. Separation of field-independent and field-dependent susceptibility tensors using a sequence of fully automated AMS measurements

    NASA Astrophysics Data System (ADS)

    Studynka, J.; Chadima, M.; Hrouda, F.; Suza, P.

    2013-12-01

    Low-field magnetic susceptibility of diamagnetic and paramagnetic minerals as well as that of pure magnetite and all single-domain ferromagnetic (s.l.) minerals is field-independent. In contrast, magnetic susceptibility of multi-domain pyrrhotite, hematite and titanomagnetite may significantly depend on the field intensity. Hence, the AMS data acquired in various fields have a great potential to separate the magnetic fabric carried by the latter group of minerals from the whole-rock fabric. The determination of the field variation of AMS consists of separate measurements of each sample in several fields within the Rayleigh Law range and subsequent processing in which the field-independent and field-dependent susceptibility tensors are calculated. The disadvantage of this technique is that each sample must be measured several times in various positions, which is relatively laborious and time-consuming. Recently, a new 3D rotator was developed for the MFK1 Kappabridges, which rotates the sample simultaneously about two axes with different velocities. The measurement is fully automated in such a way that, once the sample is mounted into the rotator, it requires no additional positioning to measure the full AMS tensor. The important advantage of the 3D rotator is that it enables AMS to be measured in a sequence of pre-set field intensities without any operator manipulation. The whole procedure is computer-controlled and, once a sequence of measurements is finished, the acquired data are immediately processed and visualized. Examples of natural rocks demonstrating various types of field dependence of AMS are given.
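
    A minimal sketch of the tensor separation step described above, assuming that within the Rayleigh Law range each susceptibility tensor element varies linearly with field intensity; the field sequence and tensor values are hypothetical placeholders.

    ```python
    # Separate field-independent (intercept) and field-dependent (slope)
    # susceptibility tensors by least-squares fitting each of the six
    # independent tensor elements against field intensity.
    import numpy as np

    fields = np.array([5.0, 15.0, 30.0, 60.0])  # field amplitudes, A/m (hypothetical)
    # One symmetric tensor (k11, k22, k33, k12, k23, k13) per field step
    tensors = np.array([
        [1.02, 0.98, 1.00, 0.01, 0.00, 0.02],
        [1.05, 0.99, 1.01, 0.02, 0.00, 0.02],
        [1.11, 1.01, 1.03, 0.03, 0.01, 0.03],
        [1.23, 1.05, 1.07, 0.05, 0.01, 0.04],
    ])

    design = np.column_stack([np.ones_like(fields), fields])
    coef, *_ = np.linalg.lstsq(design, tensors, rcond=None)
    k_independent = coef[0]   # field-independent tensor elements
    k_dependent = coef[1]     # field-dependent tensor elements (per A/m)
    print("field-independent:", k_independent)
    print("field-dependent  :", k_dependent)
    ```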

  12. Therapeutic Alliance With a Fully Automated Mobile Phone and Web-Based Intervention: Secondary Analysis of a Randomized Controlled Trial

    PubMed Central

    Parker, Gordon; Manicavasagar, Vijaya; Hadzi-Pavlovic, Dusan; Fogarty, Andrea

    2016-01-01

    Background Studies of Internet-delivered psychotherapies suggest that clients report development of a therapeutic alliance in the Internet environment. Because a majority of the interventions studied to date have been therapist-assisted to some degree, it remains unclear whether a therapeutic alliance can develop within the context of an Internet-delivered self-guided intervention with no therapist support, and whether this has consequences for program outcomes. Objective This study reports findings of a secondary analysis of data from 90 participants with mild-to-moderate depression, anxiety, and/or stress who used a fully automated mobile phone and Web-based cognitive behavior therapy (CBT) intervention called “myCompass” in a recent randomized controlled trial (RCT). Methods Symptoms, functioning, and positive well-being were assessed at baseline and post-intervention using the Depression, Anxiety and Stress Scale (DASS), the Work and Social Adjustment Scale (WSAS), and the Mental Health Continuum-Short Form (MHC-SF). Therapeutic alliance was measured at post-intervention using the Agnew Relationship Measure (ARM), and this was supplemented with qualitative data obtained from 16 participant interviews. Extent of participant engagement with the program was also assessed. Results Mean ratings on the ARM subscales were above the neutral midpoints, and the interviewees provided rich detail of a meaningful and collaborative therapeutic relationship with the myCompass program. Whereas scores on the ARM subscales did not correlate with treatment outcomes, participants’ ratings of the quality of their emotional connection with the program correlated significantly and positively with program logins, frequency of self-monitoring, and number of treatment modules completed (r values between .32-.38, P≤.002). The alliance (ARM) subscales measuring perceived empowerment (r=.26, P=.02) and perceived freedom to self-disclose (r=.25, P=.04) also correlated significantly

  13. Fully Automated Pulmonary Lobar Segmentation: Influence of Different Prototype Software Programs onto Quantitative Evaluation of Chronic Obstructive Lung Disease

    PubMed Central

    Lim, Hyun-ju; Weinheimer, Oliver; Wielpütz, Mark O.; Dinkel, Julien; Hielscher, Thomas; Gompelmann, Daniela; Kauczor, Hans-Ulrich; Heussel, Claus Peter

    2016-01-01

    Objectives Surgical or bronchoscopic lung volume reduction (BLVR) techniques can be beneficial for heterogeneous emphysema. Post-processing software tools for lobar emphysema quantification are useful for patient and target lobe selection, treatment planning and post-interventional follow-up. We aimed to evaluate the inter-software variability of emphysema quantification using fully automated lobar segmentation prototypes. Material and Methods 66 patients with moderate to severe COPD who underwent CT for planning of BLVR were included. Emphysema quantification was performed using 2 modified versions of in-house software (without and with prototype advanced lung vessel segmentation; programs 1 [YACTA v.2.3.0.2] and 2 [YACTA v.2.4.3.1]), as well as 1 commercial program (3 [Pulmo3D VA30A_HF2]) and 1 pre-commercial prototype (4 [CT COPD ISP ver7.0]). The following parameters were computed for each segmented anatomical lung lobe and the whole lung: lobar volume (LV), mean lobar density (MLD), 15th percentile of lobar density (15th), emphysema volume (EV) and emphysema index (EI). Bland-Altman analysis (limits of agreement, LoA) and linear random effects models were used for comparison between the software programs. Results Segmentation using programs 1, 3 and 4 was unsuccessful in 1 (1%), 7 (10%) and 5 (7%) patients, respectively. Program 2 could analyze all datasets. The 53 patients with successful segmentation by all 4 programs were included for further analysis. For LV, programs 1 and 4 showed the largest mean difference of 72 ml and the widest LoA of [-356, 499 ml] (p<0.05). Programs 3 and 4 showed the largest mean difference of 4% and the widest LoA of [-7, 14%] for EI (p<0.001). Conclusions Only a single software program was able to successfully analyze all scheduled datasets. Although the mean biases of LV and EV were relatively low in lobar quantification, ranges of disagreement were substantial in both of them. For longitudinal emphysema monitoring, not only scanning protocol but
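
    The Bland-Altman quantities quoted above (mean difference and limits of agreement between two programs) reduce to a short calculation; the lobar volumes in the sketch are hypothetical.

    ```python
    # Bland-Altman bias and 95% limits of agreement for paired measurements
    # of the same lobes by two software programs (hypothetical volumes, ml).
    import numpy as np

    prog_a = np.array([1510.0, 1320.0, 980.0, 1105.0, 1430.0])
    prog_b = np.array([1490.0, 1350.0, 1010.0, 1080.0, 1400.0])

    diff = prog_a - prog_b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    loa = (bias - 1.96 * sd, bias + 1.96 * sd)
    print(f"bias = {bias:.1f} ml, LoA = [{loa[0]:.1f}, {loa[1]:.1f}] ml")
    ```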

  14. Fully Automated RNAscope In Situ Hybridization Assays for Formalin-Fixed Paraffin-Embedded Cells and Tissues.

    PubMed

    Anderson, Courtney M; Zhang, Bingqing; Miller, Melanie; Butko, Emerald; Wu, Xingyong; Laver, Thomas; Kernag, Casey; Kim, Jeffrey; Luo, Yuling; Lamparski, Henry; Park, Emily; Su, Nan; Ma, Xiao-Jun

    2016-10-01

    Biomarkers such as DNA, RNA, and protein are powerful tools in clinical diagnostics and therapeutic development for many diseases. Identifying RNA expression at the single cell level within the morphological context by RNA in situ hybridization provides a great deal of information on gene expression changes over conventional techniques that analyze bulk tissue, yet widespread use of this technique in the clinical setting has been hampered by the dearth of automated RNA ISH assays. Here we present an automated version of the RNA ISH technology RNAscope that is adaptable to multiple automation platforms. The automated RNAscope assay yields a high signal-to-noise ratio with little to no background staining and results comparable to the manual assay. In addition, the automated duplex RNAscope assay was able to detect two biomarkers simultaneously. Lastly, assay consistency and reproducibility were confirmed by quantification of TATA-box binding protein (TBP) mRNA signals across multiple lots and multiple experiments. Taken together, the data presented in this study demonstrate that the automated RNAscope technology is a high performance RNA ISH assay with broad applicability in biomarker research and diagnostic assay development. J. Cell. Biochem. 117: 2201-2208, 2016. © 2016 Wiley Periodicals, Inc.

  15. Usage and Effectiveness of a Fully Automated, Open-Access, Spanish Web-Based Smoking Cessation Program: Randomized Controlled Trial

    PubMed Central

    2014-01-01

    Background The Internet is an optimal setting to provide massive access to tobacco treatments. To evaluate open-access Web-based smoking cessation programs in a real-world setting, adherence and retention data should be taken into account as much as abstinence rate. Objective The objective was to analyze the usage and effectiveness of a fully automated, open-access, Web-based smoking cessation program by comparing interactive versus noninteractive versions. Methods Participants were randomly assigned either to the interactive or noninteractive version of the program, both with identical content divided into 4 interdependent modules. At baseline, we collected demographic, psychological, and smoking characteristics of the smokers self-enrolled in the Web-based program of Universidad Nacional de Educación a Distancia (National Distance Education University; UNED) in Madrid, Spain. The following questionnaires were administered: the anxiety and depression subscales from the Symptom Checklist-90-Revised, the 4-item Perceived Stress Scale, and the Heaviness of Smoking Index. At 3 months, we analyzed dropout rates, module completion, user satisfaction, follow-up response rate, and self-assessed smoking abstinence. Results A total of 23,213 smokers were registered, 50.06% (11,620/23,213) women and 49.94% (11,593/23,213) men, with a mean age of 39.5 years (SD 10.3). Of these, 46.10% (10,701/23,213) were married and 34.43% (7992/23,213) were single, 46.03% (10,686/23,213) had university education, and 78.73% (18,275/23,213) were employed. Participants smoked an average of 19.4 cigarettes per day (SD 10.3). Of the 11,861 smokers randomly assigned to the interactive version, 2720 (22.93%) completed the first module, 1052 (8.87%) the second, 624 (5.26%) the third, and 355 (2.99%) the fourth. Completion data was not available for the noninteractive version (no way to record it automatically). The 3-month follow-up questionnaire was completed by 1085 of 23,213 enrolled smokers

  16. Technical Note: A fully automated purge and trap GC-MS system for quantification of volatile organic compound (VOC) fluxes between the ocean and atmosphere

    NASA Astrophysics Data System (ADS)

    Andrews, S. J.; Hackenberg, S. C.; Carpenter, L. J.

    2015-04-01

    The oceans are a key source of a number of atmospherically important volatile gases. The accurate and robust determination of trace gases in seawater is a significant analytical challenge, requiring reproducible and ideally automated sample handling, a high efficiency of seawater-air transfer, removal of water vapour from the sample stream, and high sensitivity and selectivity of the analysis. Here we describe a system that was developed for the fully automated analysis of dissolved very short-lived halogenated species (VSLS) sampled from an under-way seawater supply. The system can also be used for semi-automated batch sampling from Niskin bottles filled during CTD (conductivity, temperature, depth) profiles. The essential components comprise a bespoke, automated purge and trap (AutoP & T) unit coupled to a commercial thermal desorption and gas chromatograph mass spectrometer (TD-GC-MS). The AutoP & T system has completed five research cruises, from the tropics to the poles, and collected over 2500 oceanic samples to date. It is able to quantify >25 species over a boiling point range of 34-180 °C with Henry's law coefficients of 0.018 and greater (CH2I2, kHcc dimensionless gas/aqueous) and has been used to measure organic sulfurs, hydrocarbons, halocarbons and terpenes. In the eastern tropical Pacific, the high sensitivity and sampling frequency provided new information regarding the distribution of VSLS, including novel measurements of a photolytically driven diurnal cycle of CH2I2 within the surface ocean water.

  17. Fully automated microvessel counting and hot spot selection by image processing of whole tumour sections in invasive breast cancer.

    PubMed Central

    Beliën, J A; Somi, S; de Jong, J S; van Diest, P J; Baak, J P

    1999-01-01

    BACKGROUND: Manual counting of microvessels is subjective and may lead to unacceptable interobserver variability, which may explain conflicting results. AIMS: To develop and test an automated method for microvessel counting and objective selection of the hot spot, based on image processing of whole sections, and to compare this with manual selection of a hot spot and counting of microvessels. METHODS: Microvessels were stained by CD31 immunohistochemistry in 10 cases of invasive breast cancer. The number of microvessels was counted manually in a subjectively selected hot spot, and also in the same complete tumour sections by interactive and automated image processing methods. An algorithm identified the hot spots from microvessel maps of the whole tumour section. RESULTS: No significant difference in manual microvessel counts was found between two observers within the same hot spot, and counts were significantly correlated. However, when the hot spot was reselected, significantly different results were found between repeated counts by the same observer. Counting all microvessels manually within the entire tumour section resulted in hot spots that differed significantly from those obtained by manual counts in subjectively selected hot spots by the same observer. Within the entire tumour section no significant differences were found between the hot spots of the manual and automated methods using an automated microscope. The hot spot, which was found using an eight-connected path search algorithm, was located at or near the border of the tumour and (depending on the size of the hot spot) did not always contain the field with the largest number of microvessels. CONCLUSIONS: The automated counting of microvessels is preferable to the manual method because of the reduction in measurement time when the complete tumour is scanned, the greater accuracy and objectivity of hot spot selection, and the possibility of visual inspection and relocation of each measurement field afterwards. PMID:10450177
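
    For illustration, the sketch below selects a hot spot by exhaustively scanning a whole-section microvessel map for the field-sized window containing the most vessels; this is a simplified stand-in for the eight-connected path search used in the study, and the coordinates, field size, and grid resolution are hypothetical.

    ```python
    # Objective hot-spot selection from a map of detected microvessel centroids:
    # bin the centroids on a grid, then find the field-sized window with the
    # maximum count using a 2D cumulative sum.
    import numpy as np

    rng = np.random.default_rng(0)
    vessels = rng.uniform(0, 10_000, size=(500, 2))   # (x, y) centroids, µm

    section = 10_000.0   # section side length, µm
    field = 800.0        # measurement-field side length, µm
    cell = 100.0         # grid resolution, µm
    n = int(section / cell)
    grid, _, _ = np.histogram2d(vessels[:, 0], vessels[:, 1],
                                bins=n, range=[[0, section], [0, section]])

    w = int(field / cell)
    c = np.pad(grid.cumsum(axis=0).cumsum(axis=1), ((1, 0), (1, 0)))
    counts = c[w:, w:] - c[:-w, w:] - c[w:, :-w] + c[:-w, :-w]
    i0, j0 = np.unravel_index(counts.argmax(), counts.shape)
    print("hot spot vessel count:", int(counts.max()),
          "window origin (µm):", (i0 * cell, j0 * cell))
    ```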

  18. Toward fully automated genotyping: Allele assignment, pedigree construction, phase determination, and recombination detection in Duchenne muscular dystrophy

    SciTech Connect

    Perlin, M.W.; Burks, M.B.; Hoop, R.C.; Hoffman, E.P.

    1994-10-01

    Human genetic maps have made quantum leaps in the past few years, because of the characterization of >2,000 CA dinucleotide repeat loci: these PCR-based markers offer extraordinarily high PIC, and within the next year their density is expected to reach intervals of a few centimorgans per marker. These new genetic maps open new avenues for disease gene research, including large-scale genotyping for both simple and complex disease loci. However, the allele patterns of many dinucleotide repeat loci can be complex and difficult to interpret, with genotyping errors a recognized problem. Furthermore, the possibility of genotyping individuals at hundreds or thousands of polymorphic loci requires improvements in data handling and analysis. The automation of genotyping and analysis of computer-derived haplotypes would remove many of the barriers preventing optimal use of dense and informative dinucleotide genetic maps. Toward this end, we have automated the allele identification, genotyping, phase determinations, and inheritance consistency checks generated by four CA repeats within the 2.5-Mbp, 10-cM X-linked dystrophin gene, using fluorescein-labeled multiplexed PCR products analyzed on automated sequencers. The described algorithms can deconvolute and resolve closely spaced alleles, despite interfering stutter noise; set phase in females; propagate the phase through the family; and identify recombination events. We show the implementation of these algorithms for the completely automated interpretation of allele data and risk assessment for five Duchenne/Becker muscular dystrophy families. The described approach can be scaled up to perform genome-based analyses with hundreds or thousands of CA-repeat loci, using multiple fluorophors on automated sequencers. 16 refs., 5 figs., 1 tab.

  19. Fully-automated identification of fish species based on otolith contour: using short-time Fourier transform and discriminant analysis (STFT-DA)

    PubMed Central

    Salimi, Nima; Loh, Kar Hoe; Kaur Dhillon, Sarinder

    2016-01-01

    Background. Fish species may be identified based on their unique otolith shape or contour. Several pattern recognition methods have been proposed to classify fish species through morphological features of the otolith contours. However, there has been no fully-automated species identification model with accuracy higher than 80%. The purpose of the current study is to develop a fully-automated model, based on the otolith contours, to identify fish species with high classification accuracy. Methods. Images of the right sagittal otoliths of 14 fish species from three families, namely Sciaenidae, Ariidae, and Engraulidae, were used to develop the proposed identification model. Short-time Fourier transform (STFT) was used, for the first time in the area of otolith shape analysis, to extract important features of the otolith contours. Discriminant Analysis (DA), as a classification technique, was used to train and test the model based on the extracted features. Results. Performance of the model was demonstrated using species from the three families separately, as well as all species combined. Overall classification accuracy of the model was greater than 90% for all cases. In addition, effects of STFT variables on the performance of the identification model were explored in this study. Conclusions. Short-time Fourier transform could determine important features of the otolith outlines. The fully-automated model proposed in this study (STFT-DA) could predict the species of an unknown specimen with acceptable identification accuracy. The model codes can be accessed at http://mybiodiversityontologies.um.edu.my/Otolith/ and https://peerj.com/preprints/1517/. The current model has flexibility to be used for more species and families in future studies. PMID:26925315
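
    A minimal sketch of the STFT-plus-discriminant-analysis idea outlined above, applied to synthetic contours; the contour parameterization, window length, and classifier settings are placeholders rather than the published configuration.

    ```python
    # Represent each otolith contour as a radius-versus-angle signal, take STFT
    # magnitudes as features, and classify with linear discriminant analysis.
    import numpy as np
    from scipy.signal import stft
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    def contour_features(radii, nperseg=64):
        """radii: contour radius sampled at equal angular steps."""
        _, _, Z = stft(radii - radii.mean(), nperseg=nperseg)
        return np.abs(Z).ravel()

    def synthetic_contour(lobes, rng, n=512):
        """Toy stand-in for a digitized otolith outline."""
        theta = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
        return 1.0 + 0.2 * np.cos(lobes * theta) + 0.02 * rng.standard_normal(n)

    rng = np.random.default_rng(1)
    X = np.array([contour_features(synthetic_contour(3, rng)) for _ in range(10)] +
                 [contour_features(synthetic_contour(5, rng)) for _ in range(10)])
    y = np.array([0] * 10 + [1] * 10)

    clf = LinearDiscriminantAnalysis().fit(X, y)
    print("training accuracy:", clf.score(X, y))
    ```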

  20. Towards fully automated genotyping: use of an X linked recessive spastic paraplegia family to test alternative analysis methods.

    PubMed

    Kobayashi, H; Matise, T C; Perlin, M W; Marks, H G; Hoffman, E P

    1995-05-01

    Advances in dinucleotide-based genetic maps open possibilities for large scale genotyping at high resolution. The current rate-limiting steps in the use of these dense maps are data interpretation (allele definition), data entry, and statistical calculations. We have recently reported automated allele identification methods. Here we show that a 10-cM framework map of the human X chromosome can be analyzed on two lanes of an automated sequencer per individual (10-12 loci per lane). We use this map and analysis strategy to generate allele data for an X-linked recessive spastic paraplegia family with a known PLP mutation. We analyzed 198 genotypes in a single gel and used the data to test three methods of data analysis: manual meiotic breakpoint mapping, automated concordance analysis, and whole chromosome multipoint linkage analysis. All methods pinpointed the correct location of the gene. We propose that multipoint exclusion mapping may permit valid inflation of LOD scores using the equation max LOD - (next best LOD). PMID:7759066
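
    The proposed contrast, the maximum multipoint LOD minus the next best LOD, is a simple computation over per-location results; a sketch with hypothetical locus names and LOD values follows.

    ```python
    # "max LOD - (next best LOD)" over hypothetical multipoint LOD scores.
    lod_by_location = {"locus_A": 1.8, "PLP": 3.4, "locus_B": 2.1, "locus_C": 0.6}

    best_locus, best_lod = max(lod_by_location.items(), key=lambda kv: kv[1])
    next_best = max(v for k, v in lod_by_location.items() if k != best_locus)
    print(f"max LOD = {best_lod} at {best_locus}; "
          f"max LOD - (next best LOD) = {best_lod - next_best:.1f}")
    ```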

  1. ELIXYS - a fully automated, three-reactor high-pressure radiosynthesizer for development and routine production of diverse PET tracers

    PubMed Central

    2013-01-01

    Background Automated radiosynthesizers are vital for routine production of positron-emission tomography tracers to minimize radiation exposure to operators and to ensure reproducible synthesis yields. The recent trend in the synthesizer industry towards the use of disposable kits aims to simplify setup and operation for the user, but often introduces several limitations related to temperature and chemical compatibility, thus requiring reoptimization of protocols developed on non-cassette-based systems. Radiochemists would benefit from a single hybrid system that provides tremendous flexibility for development and optimization of reaction conditions while also providing a pathway to simple, cassette-based production of diverse tracers. Methods We have designed, built, and tested an automated three-reactor radiosynthesizer (ELIXYS) to provide a flexible radiosynthesis platform suitable for both tracer development and routine production. The synthesizer is capable of performing high-pressure and high-temperature reactions by eliminating permanent tubing and valve connections to the reaction vessel. Each of the three movable reactors can seal against different locations on disposable cassettes to carry out different functions such as sealed reactions, evaporations, and reagent addition. A reagent and gas handling robot moves sealed reagent vials from storage locations in the cassette to addition positions and also dynamically provides vacuum and inert gas to ports on the cassette. The software integrates these automated features into chemistry unit operations (e.g., React, Evaporate, Add) to intuitively create synthesis protocols. 2-Deoxy-2-[18F]fluoro-5-methyl-β-l-arabinofuranosyluracil (l-[18F]FMAU) and 2-deoxy-2-[18F]fluoro-β-d-arabinofuranosylcytosine (d-[18F]FAC) were synthesized to validate the system. Results l-[18F]FMAU and d-[18F]FAC were successfully synthesized in 165 and 170 min, respectively, with decay-corrected radiochemical yields of 46% ± 1% (n = 6

  2. Fully automated high-throughput chromatin immunoprecipitation for ChIP-seq: Identifying ChIP-quality p300 monoclonal antibodies

    PubMed Central

    Gasper, William C.; Marinov, Georgi K.; Pauli-Behn, Florencia; Scott, Max T.; Newberry, Kimberly; DeSalvo, Gilberto; Ou, Susan; Myers, Richard M.; Vielmetter, Jost; Wold, Barbara J.

    2014-01-01

    Chromatin immunoprecipitation coupled with DNA sequencing (ChIP-seq) is the major contemporary method for mapping in vivo protein-DNA interactions in the genome. It identifies sites of transcription factor, cofactor and RNA polymerase occupancy, as well as the distribution of histone marks. Consortia such as the ENCyclopedia Of DNA Elements (ENCODE) have produced large datasets using manual protocols. However, future measurements of hundreds of additional factors in many cell types and physiological states call for higher throughput and consistency afforded by automation. Such automation advances, when provided by multiuser facilities, could also improve the quality and efficiency of individual small-scale projects. The immunoprecipitation process has become rate-limiting, and is a source of substantial variability when performed manually. Here we report a fully automated robotic ChIP (R-ChIP) pipeline that allows up to 96 reactions. A second bottleneck is the dearth of renewable ChIP-validated immune reagents, which do not yet exist for most mammalian transcription factors. We used R-ChIP to screen new mouse monoclonal antibodies raised against p300, a histone acetylase, well-known as a marker of active enhancers, for which ChIP-competent monoclonal reagents have been lacking. We identified, validated for ChIP-seq, and made publicly available a monoclonal reagent called ENCITp300-1. PMID:24919486

  3. Centrifugal LabTube platform for fully automated DNA purification and LAMP amplification based on an integrated, low-cost heating system.

    PubMed

    Hoehl, Melanie M; Weißert, Michael; Dannenberg, Arne; Nesch, Thomas; Paust, Nils; von Stetten, Felix; Zengerle, Roland; Slocum, Alexander H; Steigert, Juergen

    2014-06-01

    This paper introduces a disposable battery-driven heating system for loop-mediated isothermal DNA amplification (LAMP) inside a centrifugally-driven DNA purification platform (LabTube). We demonstrate LabTube-based fully automated DNA purification of as low as 100 cell-equivalents of verotoxin-producing Escherichia coli (VTEC) in water, milk and apple juice in a laboratory centrifuge, followed by integrated and automated LAMP amplification with a reduction of hands-on time from 45 to 1 min. The heating system consists of two parallel SMD thick film resistors and a NTC as heating and temperature sensing elements. They are driven by a 3 V battery and controlled by a microcontroller. The LAMP reagents are stored in the elution chamber and the amplification starts immediately after the eluate is purged into the chamber. The LabTube, including a microcontroller-based heating system, demonstrates contamination-free and automated sample-to-answer nucleic acid testing within a laboratory centrifuge. The heating system can be easily parallelized within one LabTube and it is deployable for a variety of heating and electrical applications.
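
    For context, a battery-driven isothermal heater of this kind typically runs a simple control loop: read the NTC through a voltage divider, convert resistance to temperature, and switch the heating resistors around the LAMP set point. The sketch below illustrates that idea; all hardware constants and the I/O hooks are hypothetical and do not describe the LabTube firmware.

    ```python
    # On/off (hysteresis) temperature control around a LAMP set point using an
    # NTC thermistor on the lower leg of a resistive divider (Beta equation).
    import math

    R_FIXED = 10_000.0      # divider resistor, ohm (hypothetical)
    NTC_R25 = 10_000.0      # NTC resistance at 25 °C, ohm (hypothetical)
    NTC_BETA = 3950.0       # Beta coefficient, K (hypothetical)
    SETPOINT_C = 65.0       # typical LAMP incubation temperature
    HYSTERESIS_C = 0.5

    def ntc_temperature_c(adc_fraction):
        """adc_fraction: NTC node voltage as a fraction of the supply (0..1)."""
        r_ntc = R_FIXED * adc_fraction / (1.0 - adc_fraction)
        inv_t = 1.0 / 298.15 + math.log(r_ntc / NTC_R25) / NTC_BETA
        return 1.0 / inv_t - 273.15

    def control_step(read_adc, set_heater):
        """One loop iteration; read_adc and set_heater wrap the real hardware."""
        t = ntc_temperature_c(read_adc())
        if t < SETPOINT_C - HYSTERESIS_C:
            set_heater(True)        # heat
        elif t > SETPOINT_C + HYSTERESIS_C:
            set_heater(False)       # coast
        return t
    ```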

  4. Fully automated quantification of regional cerebral blood flow with three-dimensional stereotaxic region of interest template: validation using magnetic resonance imaging--technical note.

    PubMed

    Takeuchi, Ryo; Yonekura, Yoshiharu; Takeda, Shigenori; Katayama, Naoya; Fujita, Katsuzo; Konishi, Junji

    2003-03-01

    The previously reported three-dimensional stereotaxic region of interest (ROI) template (3DSRT-t) for the analysis of anatomically standardized technetium-99m-L,L-ethyl cysteinate dimer (99mTc-ECD) single photon emission computed tomography (SPECT) images was modified for use in a fully automated regional cerebral blood flow (rCBF) quantification software package, 3DSRT, incorporating an anatomical standardization engine transplanted from statistical parametric mapping 99 and ROIs for quantification based on 3DSRT-t. Three-dimensional T2-weighted magnetic resonance images of 10 patients with localized infarcted areas were compared with the ROI contour of 3DSRT, and the positions of the central sulcus in the primary sensorimotor area were also estimated. All positions of the 20 lesions were in strict accordance with the ROI delineation of 3DSRT. The central sulcus was identified on at least one side of 210 paired ROIs and in the middle of 192 (91.4%) of these 210 paired ROIs among the 273 paired ROIs of the primary sensorimotor area. The central sulcus was recognized in the middle of more than 71.4% of the ROIs in which the central sulcus was identifiable in the respective 28 slices of the primary sensorimotor area. Fully automated accurate ROI delineation on anatomically standardized images is possible with 3DSRT, which enables objective quantification of rCBF and vascular reserve in only a few minutes using 99mTc-ECD SPECT images obtained by the RVR method.

  5. Aptamer Microarrays

    SciTech Connect

    Angel-Syrett, Heather; Collett, Jim; Ellington, Andrew D.

    2009-01-02

    In vitro selection can yield specific, high-affinity aptamers. We and others have devised methods for the automated selection of aptamers, and have begun to use these reagents for the construction of arrays. Arrayed aptamers have proven to be almost as sensitive as their solution phase counterparts, and when ganged together can provide both specific and general diagnostic signals for proteins and other analytes. We describe here technical details regarding the production and processing of aptamer microarrays, including blocking, washing, drying, and scanning. We will also discuss the challenges involved in developing standardized and reproducible methods for binding and quantitating protein targets. While signals from fluorescent analytes or sandwiches are typically captured, it has proven possible for immobilized aptamers to be uniquely coupled to amplification methods not available to protein reagents, thus allowing for protein-binding signals to be greatly amplified. Into the future, many of the biosensor methods described in this book can potentially be adapted to array formats, thus further expanding the utility of and applications for aptamer arrays.

  6. Anxiety Online—A Virtual Clinic: Preliminary Outcomes Following Completion of Five Fully Automated Treatment Programs for Anxiety Disorders and Symptoms

    PubMed Central

    Meyer, Denny; Austin, David William; Kyrios, Michael

    2011-01-01

    Background The development of e-mental health interventions to treat or prevent mental illness and to enhance wellbeing has risen rapidly over the past decade. This development assists the public in sidestepping some of the obstacles that are often encountered when trying to access traditional face-to-face mental health care services. Objective The objective of our study was to investigate the posttreatment effectiveness of five fully automated self-help cognitive behavior e-therapy programs for generalized anxiety disorder (GAD), panic disorder with or without agoraphobia (PD/A), obsessive–compulsive disorder (OCD), posttraumatic stress disorder (PTSD), and social anxiety disorder (SAD) offered to the international public via Anxiety Online, an open-access full-service virtual psychology clinic for anxiety disorders. Methods We used a naturalistic participant choice, quasi-experimental design to evaluate each of the five Anxiety Online fully automated self-help e-therapy programs. Participants were required to have at least subclinical levels of one of the anxiety disorders to be offered the associated disorder-specific fully automated self-help e-therapy program. These programs are offered free of charge via Anxiety Online. Results A total of 225 people self-selected one of the five e-therapy programs (GAD, n = 88; SAD, n = 50; PD/A, n = 40; PTSD, n = 30; OCD, n = 17) and completed their 12-week posttreatment assessment. Significant improvements were found on 21/25 measures across the five fully automated self-help programs. At postassessment we observed significant reductions on all five anxiety disorder clinical disorder severity ratings (Cohen d range 0.72–1.22), increased confidence in managing one’s own mental health care (Cohen d range 0.70–1.17), and decreases in the total number of clinical diagnoses (except for the PD/A program, where a positive trend was found) (Cohen d range 0.45–1.08). In addition, we found significant improvements in

  7. Fully automated prostate segmentation in 3D MR based on normalized gradient fields cross-correlation initialization and LOGISMOS refinement

    NASA Astrophysics Data System (ADS)

    Yin, Yin; Fotin, Sergei V.; Periaswamy, Senthil; Kunz, Justin; Haldankar, Hrishikesh; Muradyan, Naira; Cornud, François; Turkbey, Baris; Choyke, Peter

    2012-02-01

    Manual delineation of the prostate is a challenging task for a clinician due to its complex and irregular shape. Furthermore, the need for precisely targeting the prostate boundary continues to grow. Planning for radiation therapy, MR-ultrasound fusion for image-guided biopsy, multi-parametric MRI tissue characterization, and context-based organ retrieval are examples where accurate prostate delineation can play a critical role in a successful patient outcome. Therefore, a robust automated full prostate segmentation system is desired. In this paper, we present an automated prostate segmentation system for 3D MR images. In this system, the prostate is segmented in two steps: the prostate displacement and size are first detected, and then the boundary is refined by a shape model. The detection approach is based on normalized gradient fields cross-correlation. This approach is fast, robust to intensity variation, and provides good accuracy for initializing a prostate mean shape model. The refinement model is based on a graph-search framework, which incorporates both shape and topology information during deformation. We generated the graph cost using trained classifiers and used coarse-to-fine search and region-specific classifier training. The proposed algorithm was developed using 261 training images and tested on another 290 cases. The mean DSC, ranging from 0.89 to 0.91 depending on the evaluation subset, demonstrates state-of-the-art performance. Running time for the system is about 20 to 40 seconds depending on image size and resolution.
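
    A minimal sketch of the normalized gradient fields similarity that underlies the detection step; the regularization constant and the patch-scoring usage are assumptions, not the authors' implementation.

    ```python
    # Normalized gradient fields (NGF): normalize image gradients so that only
    # edge orientation matters, then score how well two patches' edges align.
    import numpy as np

    def normalized_gradient_field(img, eps=1e-3):
        gy, gx = np.gradient(img.astype(float))
        mag = np.sqrt(gx**2 + gy**2 + eps**2)
        return gx / mag, gy / mag

    def ngf_score(patch, template):
        """Gradient-alignment similarity of two same-sized images (max ~1)."""
        px, py = normalized_gradient_field(patch)
        tx, ty = normalized_gradient_field(template)
        return np.mean((px * tx + py * ty) ** 2)

    # Detection would slide the template over a search region and keep the
    # best-scoring offset. Self-comparison gives the maximum attainable score:
    img = np.add.outer(np.hanning(64), np.hanning(64))
    print(ngf_score(img, img))
    ```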

  8. Eco-HAB as a fully automated and ecologically relevant assessment of social impairments in mouse models of autism

    PubMed Central

    Puścian, Alicja; Łęski, Szymon; Kasprowicz, Grzegorz; Winiarski, Maciej; Borowska, Joanna; Nikolaev, Tomasz; Boguszewski, Paweł M; Lipp, Hans-Peter; Knapska, Ewelina

    2016-01-01

    Eco-HAB is an open source, RFID-based system for automated measurement and analysis of social preference and in-cohort sociability in mice. The system closely follows murine ethology. It requires no contact between a human experimenter and tested animals, overcoming the confounding factors that lead to irreproducible assessment of murine social behavior between laboratories. In Eco-HAB, group-housed animals live in a spacious, four-compartment apparatus with shadowed areas and narrow tunnels, resembling natural burrows. Eco-HAB allows for assessment of the tendency of mice to voluntarily spend time together in ethologically relevant mouse group sizes. Custom-made software for automated tracking, data extraction, and analysis enables quick evaluation of social impairments. The developed protocols and standardized behavioral measures demonstrate high replicability. Unlike classic three-chambered sociability tests, Eco-HAB provides measurements of spontaneous, ecologically relevant social behaviors in group-housed animals. Results are obtained faster, with less manpower, and without confounding factors. DOI: http://dx.doi.org/10.7554/eLife.19532.001 PMID:27731798

  9. Fully automated segmentation of cartilage from the MR images of knee using a multi-atlas and local structural analysis method

    PubMed Central

    Lee, June-Goo; Gumus, Serter; Moon, Chan Hong; Kwoh, C. Kent; Bae, Kyongtae Ty

    2014-01-01

    Purpose: To develop a fully automated method to segment cartilage from magnetic resonance (MR) images of the knee and to evaluate the performance of the method on a public, open dataset. Methods: The segmentation scheme consisted of three procedures: multiple-atlas building, applying a locally weighted vote (LWV), and region adjustment. In the atlas building procedure, all training cases were registered to a target image by a nonrigid registration scheme and the best-matched atlases selected. An LWV algorithm was applied to merge the information from these atlases and generate the initial segmentation result. Subsequently, for the region adjustment procedure, the statistical information of bone, cartilage, and surrounding regions was computed from the initial segmentation result. The statistical information directed the automated determination of the seed points inside and outside bone regions for the graph-cut based method. Finally, the region adjustment was conducted by the revision of outliers and the inclusion of abnormal bone regions. Results: A total of 150 knee MR images from a public, open dataset (available at www.ski10.org) were used for the development and evaluation of this approach. The 150 cases were divided into the training set (100 cases) and the test set (50 cases). The cartilage was segmented successfully in all test cases, with an average computation time of 40 min. The average Dice similarity coefficient was 71.7% ± 8.0% for femoral and 72.4% ± 6.9% for tibial cartilage. Conclusions: The authors have developed a fully automated segmentation program for knee cartilage from MR images. The performance of the program based on 50 test cases was highly promising. PMID:25186408
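
    The locally weighted vote step can be sketched as below, using a local mean squared intensity difference as the similarity measure; the patch size, weighting kernel, and binary cartilage labels are assumptions rather than the published settings.

    ```python
    # Locally weighted vote (LWV) fusion of registered atlas labels: each atlas
    # votes at every voxel, weighted by its local similarity to the target.
    import numpy as np
    from scipy.ndimage import uniform_filter

    def lwv_fusion(target, atlas_images, atlas_labels, patch=5, sigma=50.0):
        """target: 3D array; atlas_images/labels: lists of registered 3D arrays."""
        votes = np.zeros_like(target, dtype=float)    # weighted cartilage votes
        weights = np.zeros_like(target, dtype=float)  # sum of weights
        for img, lab in zip(atlas_images, atlas_labels):
            msd = uniform_filter((target - img) ** 2, size=patch)  # local MSD
            w = np.exp(-msd / (2.0 * sigma ** 2))
            votes += w * (lab > 0)
            weights += w
        return (votes / weights) > 0.5                # weighted majority vote
    ```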

  10. EST2uni: an open, parallel tool for automated EST analysis and database creation, with a data mining web interface and microarray expression data integration

    PubMed Central

    Forment, Javier; Gilabert, Francisco; Robles, Antonio; Conejero, Vicente; Nuez, Fernando; Blanca, Jose M

    2008-01-01

    Background Expressed sequence tag (EST) collections are composed of a high number of single-pass, redundant, partial sequences, which need to be processed, clustered, and annotated to remove low-quality and vector regions, eliminate redundancy and sequencing errors, and provide biologically relevant information. In order to provide a suitable way of performing the different steps in the analysis of the ESTs, flexible computation pipelines adapted to the local needs of specific EST projects have to be developed. Furthermore, EST collections must be stored in highly structured relational databases available to researchers through user-friendly interfaces which allow efficient and complex data mining, thus offering maximum capabilities for their full exploitation. Results We have created EST2uni, an integrated, highly-configurable EST analysis pipeline and data mining software package that automates the pre-processing, clustering, annotation, database creation, and data mining of EST collections. The pipeline uses standard EST analysis tools and the software has a modular design to facilitate the addition of new analytical methods and their configuration. Currently implemented analyses include functional and structural annotation, SNP and microsatellite discovery, integration of previously known genetic marker data and gene expression results, and assistance in cDNA microarray design. It can be run in parallel in a PC cluster in order to reduce the time necessary for the analysis. It also creates a web site linked to the database, showing collection statistics, with complex query capabilities and tools for data mining and retrieval. Conclusion The software package presented here provides an efficient and complete bioinformatics tool for the management of EST collections which is very easy to adapt to the local needs of different EST projects. The code is freely available under the GPL license and can be obtained at . This site also provides detailed instructions for

  11. Technical Note: A fully automated purge and trap-GC-MS system for quantification of volatile organic compound (VOC) fluxes between the ocean and atmosphere

    NASA Astrophysics Data System (ADS)

    Andrews, S. J.; Hackenberg, S. C.; Carpenter, L. J.

    2014-12-01

    The oceans are a key source of a number of atmospherically important volatile gases. The accurate and robust determination of trace gases in seawater is a significant analytical challenge, requiring reproducible and ideally automated sample handling, a high efficiency of seawater-air transfer, removal of water vapour from the sample stream, and high sensitivity and selectivity of the analysis. Here we describe a system that was developed for the fully automated analysis of dissolved very short-lived halogenated species (VSLS) sampled from an under-way seawater supply. The system can also be used for semi-automated batch sampling from Niskin bottles filled during CTD (Conductivity, Temperature, Depth) profiles. The essential components comprise a bespoke, automated purge and trap (AutoP & T) unit coupled to a commercial thermal desorption and gas chromatograph-mass spectrometer (TD-GC-MS). The AutoP & T system has completed five research cruises, from the tropics to the poles, and collected over 2500 oceanic samples to date. It is able to quantify >25 species over a boiling point range of 34-180 °C with Henry's Law coefficients of 0.018 and greater (CH2I2, kHcc dimensionless gas/aqueous) and has been used to measure organic sulfurs, hydrocarbons, halocarbons and terpenes. In the eastern tropical Pacific, the high sensitivity and sampling frequency provided new information regarding the distribution of VSLS, including novel measurements of a photolytically driven diurnal cycle of CH2I2 within the surface ocean water.

  12. Development of a Fully Automated Guided Wave System for In-Process Cure Monitoring of CFRP Composite Laminates

    NASA Technical Reports Server (NTRS)

    Hudson, Tyler B.; Hou, Tan-Hung; Grimsley, Brian W.; Yaun, Fuh-Gwo

    2016-01-01

    A guided wave-based in-process cure monitoring technique for carbon fiber reinforced polymer (CFRP) composites was investigated at NASA Langley Research Center. A key cure transition point (vitrification) was identified and the degree of cure was monitored using metrics such as amplitude and time of arrival (TOA) of guided waves. Using an automated system preliminarily developed in this work, high-temperature piezoelectric transducers were utilized to interrogate a twenty-four ply unidirectional composite panel fabricated from Hexcel (Registered Trademark) IM7/8552 prepreg during cure. It was shown that the amplitude of the guided wave increased sharply around vitrification and the TOA curve possessed an inverse relationship with degree of cure. The work is a first step in demonstrating the feasibility of transitioning the technique to perform in-process cure monitoring in an autoclave, defect detection during cure, and ultimately a closed-loop process control to maximize composite part quality and consistency.
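
    One common way to obtain the time-of-arrival metric mentioned above is to cross-correlate the received waveform with the excitation burst and take the lag of the correlation peak; the sketch below demonstrates this on a synthetic signal with hypothetical waveform parameters and is not the processing chain of the system described.

    ```python
    # Estimate guided-wave time of arrival (TOA) from the peak of the
    # cross-correlation between the received signal and the excitation burst.
    import numpy as np

    fs = 10e6                                   # sampling rate, Hz
    t = np.arange(0, 200e-6, 1 / fs)
    burst = np.sin(2 * np.pi * 300e3 * t) * np.exp(-((t - 10e-6) / 5e-6) ** 2)

    true_delay = 48e-6                          # simulated propagation delay, s
    received = np.interp(t - true_delay, t, burst, left=0.0) \
               + 0.05 * np.random.default_rng(2).standard_normal(t.size)

    xcorr = np.correlate(received, burst, mode="full")
    lag = np.argmax(xcorr) - (burst.size - 1)
    print(f"estimated TOA = {lag / fs * 1e6:.1f} µs")   # close to 48 µs
    ```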

  13. Fully automated, high speed, tomographic phase object reconstruction using the transport of intensity equation in transmission and reflection configurations.

    PubMed

    Nguyen, Thanh; Nehmetallah, George; Tran, Dat; Darudi, Ahmad; Soltani, Peyman

    2015-12-10

    While traditional transport of intensity equation (TIE) based phase retrieval of a phase object is performed through axial translation of the CCD, in this work a tunable lens TIE is employed in both transmission and reflection configurations. These configurations are extended to a 360° tomographic 3D reconstruction through multiple illuminations from different angles by a custom fabricated rotating assembly of the phase object. Synchronization circuitry is developed to control the CCD camera and the Arduino board, which in its turn controls the tunable lens and the stepper motor to automate the tomographic reconstruction process. Finally, a MATLAB based user friendly graphical user interface is developed to control the whole system and perform tomographic reconstruction using both multiplicative and inverse radon based techniques. PMID:26836869
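
    For reference, under the uniform-intensity approximation the transport of intensity equation reduces to a Poisson equation for the phase, which is commonly solved with FFTs; the sketch below shows that standard solution with hypothetical wavelength, pixel size, and defocus values, and is not the authors' tomographic pipeline.

    ```python
    # FFT-based TIE phase retrieval under the uniform-intensity approximation:
    #   I0 * laplacian(phi) = -k * dI/dz,  with  k = 2*pi/lambda,
    # where dI/dz is estimated from two defocused intensity images.
    import numpy as np

    def tie_phase(I_minus, I_plus, I0, dz, wavelength, pixel, reg=1e-6):
        k = 2.0 * np.pi / wavelength
        dIdz = (I_plus - I_minus) / (2.0 * dz)
        rhs = -(k / I0) * dIdz                          # Poisson right-hand side

        ny, nx = rhs.shape
        fx = np.fft.fftfreq(nx, d=pixel)
        fy = np.fft.fftfreq(ny, d=pixel)
        FX, FY = np.meshgrid(fx, fy)
        laplace_f = -4.0 * np.pi**2 * (FX**2 + FY**2)   # Fourier symbol of the Laplacian

        phi_f = np.fft.fft2(rhs) / (laplace_f - reg)    # reg avoids division by zero
        phi_f[0, 0] = 0.0                               # phase defined up to a constant
        return np.real(np.fft.ifft2(phi_f))
    ```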

  14. Fully automated, high speed, tomographic phase object reconstruction using the transport of intensity equation in transmission and reflection configurations.

    PubMed

    Nguyen, Thanh; Nehmetallah, George; Tran, Dat; Darudi, Ahmad; Soltani, Peyman

    2015-12-10

    While traditional transport of intensity equation (TIE) based phase retrieval of a phase object is performed through axial translation of the CCD, in this work a tunable lens TIE is employed in both transmission and reflection configurations. These configurations are extended to a 360° tomographic 3D reconstruction through multiple illuminations from different angles by a custom fabricated rotating assembly of the phase object. Synchronization circuitry is developed to control the CCD camera and the Arduino board, which in its turn controls the tunable lens and the stepper motor to automate the tomographic reconstruction process. Finally, a MATLAB based user friendly graphical user interface is developed to control the whole system and perform tomographic reconstruction using both multiplicative and inverse radon based techniques.

  15. Fully Automated Electro Membrane Extraction Autosampler for LC-MS Systems Allowing Soft Extractions for High-Throughput Applications.

    PubMed

    Fuchs, David; Pedersen-Bjergaard, Stig; Jensen, Henrik; Rand, Kasper D; Honoré Hansen, Steen; Petersen, Nickolaj Jacob

    2016-07-01

    The current work describes the implementation of electro membrane extraction (EME) into an autosampler for high-throughput analysis of samples by EME-LC-MS. The extraction probe was built into a luer lock adapter connected to an HTC PAL autosampler syringe. As the autosampler drew sample solution, analytes were extracted into the lumen of the extraction probe and transferred to an LC-MS system for further analysis. Various parameters affecting extraction efficacy were investigated, including syringe fill strokes, syringe pull-up volume, pull-up delay, and volume in the sample vial. The system was optimized for soft extraction of analytes and high sample throughput. Further, it was demonstrated that by flushing the EME-syringe with acidic wash buffer and reversing the applied electric potential, carry-over between samples can be reduced to below 1%. Performance of the system was characterized (RSD < 10%; R² = 0.994) and finally, the EME-autosampler was used to analyze the in vitro conversion of methadone into its main metabolite by rat liver microsomes and to demonstrate the potential of known CYP3A4 inhibitors to prevent metabolism of methadone. By making use of the high extraction speed of EME, a complete analytical workflow of purification, separation, and analysis of a sample could be achieved within only 5.5 min. With the developed system, large sequences of samples could be analyzed in a completely automated manner. This high degree of automation makes the developed EME-autosampler a powerful tool for a wide range of applications where high-throughput extractions are required before sample analysis. PMID:27237618

  16. Comparison of Cobas 6500 and Iris IQ200 fully-automated urine analyzers to manual urine microscopy

    PubMed Central

    Bakan, Ebubekir; Ozturk, Nurinnisa; Baygutalp, Nurcan Kilic; Polat, Elif; Akpinar, Kadriye; Dorman, Emrullah; Polat, Harun; Bakan, Nuri

    2016-01-01

    Introduction Urine screening is achieved by either automated or manual microscopic analysis. The aim of the study was to compare Cobas 6500 and Iris IQ200 urine analyzers, and manual urine microscopic analysis. Materials and methods A total of 540 urine samples sent to the laboratory for chemical and sediment analysis were analyzed on Cobas 6500 and Iris IQ200 within 1 hour from sampling. One hundred and fifty-three samples were found to have pathological sediment results and were subjected to manual microscopic analysis performed by laboratory staff blinded to the study. Spearman's and Gamma statistics were used for correlation analyses, and the McNemar test for the comparison of the two automated analyzers. Results The comparison of Cobas u701 to the manual method yielded the following regression equations: y = -0.12 (95% CI: -1.09 to 0.67) + 0.78 (95% CI: 0.65 to 0.95) x for WBC and y = 0.06 (95% CI: -0.09 to 0.25) + 0.66 (95% CI: 0.57 to 0.73) x for RBC. The comparison of IQ200 Elite to the manual method yielded the following equations: y = 0.03 (95% CI: -1.00 to 1.00) + 0.88 (95% CI: 0.66 to 1.00) x for WBC and y = -0.22 (95% CI: -0.80 to 0.20) + 0.40 (95% CI: 0.32 to 0.50) x for RBC. IQ200 Elite compared to Cobas u701 yielded the following equations: y = -0.95 (95% CI: -2.13 to 0.11) + 1.25 (95% CI: 1.08 to 1.44) x for WBC and y = -1.20 (95% CI: -1.80 to -0.30) + 0.80 (95% CI: 0.55 to 1.00) x for RBC. Conclusions The two analyzers showed similar performances and good compatibility with manual microscopy. However, they are still inadequate in the determination of WBC, RBC, and EC in highly pathological samples. Thus, confirmation by manual microscopic analysis may be useful. PMID:27812305
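
    Method-comparison regression equations with confidence intervals like those above can be obtained, for example, with a robust median-of-slopes (Theil-Sen) fit and bootstrap percentile intervals; the sketch below uses hypothetical counts and is not necessarily the regression procedure applied in the study.

    ```python
    # Theil-Sen regression (intercept, slope) with bootstrap 95% CIs for
    # comparing an automated analyzer against manual microscopy.
    import numpy as np

    def theil_sen(x, y):
        slopes = [(y[j] - y[i]) / (x[j] - x[i])
                  for i in range(len(x)) for j in range(i + 1, len(x))
                  if x[j] != x[i]]
        slope = np.median(slopes)
        return float(np.median(y - slope * x)), float(slope)

    def bootstrap_ci(x, y, n_boot=2000, seed=0):
        rng = np.random.default_rng(seed)
        est = []
        for _ in range(n_boot):
            idx = rng.integers(0, len(x), size=len(x))
            est.append(theil_sen(x[idx], y[idx]))
        return np.percentile(est, [2.5, 97.5], axis=0)  # rows: lo/hi; cols: a, b

    # Hypothetical paired WBC counts (manual vs. automated), cells per field
    manual = np.array([0, 1, 2, 3, 5, 8, 12, 20, 35, 60], dtype=float)
    auto = 0.78 * manual + np.array([0.2, -0.3, 0.4, 0.1, -0.6, 1.0, -1.1, 1.8, -2.5, 3.0])
    print("intercept, slope:", theil_sen(manual, auto))
    print("95% CIs:\n", bootstrap_ci(manual, auto))
    ```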

  17. [Condition setting for the measurement of blood coagulation factor XIII activity using a fully automated blood coagulation analyzer, COAGTRON-350].

    PubMed

    Kanno, Nobuko; Kaneko, Makoto; Tanabe, Kumiko; Jyona, Masahiro; Yokota, Hiromitsu; Yatomi, Yutaka

    2012-12-01

    The automated laboratory analyzer COAGTRON-350 (Trinity Biotech) is used for routine and specific coagulation testing; it detects fibrin formation by either a mechanical principle (ball method) or a photo-optical principle and also offers chromogenic kinetic enzyme analysis and immuno-turbidimetric detection in one benchtop unit. In this study, we established the measurement conditions for factor XIII (FXIII) activity using the Berichrom FXIII reagent on the COAGTRON-350 analyzer. The usual protocol for this reagent, based on its handling instructions, was slightly modified for this device. The fundamental evaluation of FXIII activity measurement under our condition setting showed favorable reproducibility, linearity, and correlation with another assay. Since FXIII is a key enzyme that plays important roles in hemostasis by stabilizing fibrin, the measurement of FXIII is essential for the diagnosis of bleeding disorders. With this setting, FXIII activity assessment as well as routine coagulation testing can be conducted simultaneously with one instrument, which is useful in coagulopathy assessment.

  18. Fully Automated One-Step Production of Functional 3D Tumor Spheroids for High-Content Screening.

    PubMed

    Monjaret, François; Fernandes, Mathieu; Duchemin-Pelletier, Eve; Argento, Amelie; Degot, Sébastien; Young, Joanne

    2016-04-01

    Adoption of spheroids within high-content screening (HCS) has lagged behind high-throughput screening (HTS) due to issues with running complex assays on large three-dimensional (3D) structures. To enable multiplexed imaging and analysis of spheroids, different cancer cell lines were grown in 3D on micropatterned 96-well plates with automated production of nine uniform spheroids per well. Spheroids achieve diameters of up to 600 µm, and reproducibility was experimentally validated (interwell and interplate CV of diameter <5%). Biphoton imaging confirmed that micropatterned spheroids exhibit characteristic cell heterogeneity with distinct microregions. Furthermore, central necrosis appears at a consistent spheroid size, suggesting standardized growth. Using three reference compounds (fluorouracil, irinotecan, and staurosporine), we validated HT-29 micropatterned spheroids on an HCS platform, benchmarking against hanging-drop spheroids. Spheroid formation and imaging in a single plate accelerate assay workflow, and fixed positioning prevents structures from overlapping or sticking to the well wall, augmenting image processing reliability. Furthermore, multiple spheroids per well increase the statistical confidence sufficiently to discriminate compound mechanisms of action and generate EC50 values for endpoints of cell death, architectural change, and size within a single-pass read. Higher quality data and a more efficient HCS work chain should encourage integration of micropatterned spheroid models within fundamental research and drug discovery applications.
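
    The abstract mentions deriving EC50 values from per-well endpoint readouts. As an illustrative sketch only (not the authors' analysis pipeline), the Python example below fits a four-parameter logistic dose-response curve with SciPy to hypothetical spheroid-viability data and reports the estimated EC50.

      import numpy as np
      from scipy.optimize import curve_fit

      def four_pl(conc, bottom, top, ec50, hill):
          """Four-parameter logistic dose-response curve."""
          return bottom + (top - bottom) / (1.0 + (conc / ec50) ** hill)

      # Hypothetical mean spheroid-viability readouts at increasing drug concentrations (µM)
      conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
      resp = np.array([0.98, 0.95, 0.90, 0.75, 0.52, 0.30, 0.15, 0.10])

      # Initial guesses: bottom, top, EC50, Hill slope
      p0 = [resp.min(), resp.max(), 1.0, 1.0]
      params, _ = curve_fit(four_pl, conc, resp, p0=p0, maxfev=10000)
      print(f"Estimated EC50 = {params[2]:.2f} µM (Hill slope {params[3]:.2f})")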

  19. Fully Automated On-Chip Imaging Flow Cytometry System with Disposable Contamination-Free Plastic Re-Cultivation Chip

    PubMed Central

    Hayashi, Masahito; Hattori, Akihiro; Kim, Hyonchol; Terazono, Hideyuki; Kaneko, Tomoyuki; Yasuda, Kenji

    2011-01-01

    We have developed a novel imaging cytometry system using a poly(methyl methacrylate) (PMMA)-based microfluidic chip. The system was contamination-free, because sample suspensions contacted only the disposable PMMA chip and no other component of the system. The transparency and low fluorescence of PMMA were suitable for microscopic imaging of cells flowing through microchannels on the chip. Sample particles flowing through microchannels on the chip were discriminated by an image-recognition unit with a high-speed camera in real time at a rate of 200 events/s; e.g., microparticles 2.5 μm and 3.0 μm in diameter were differentiated with an error rate of less than 2%. Desired cells were separated automatically from other cells by electrophoretic or dielectrophoretic force, one by one, with a separation efficiency of 90%. Cells in suspension with fluorescent dye were separated using the same kind of microfluidic chip. A 5 μL sample containing 1 × 10⁶ particles/mL was processed within 40 min. Separated cells could be cultured on the microfluidic chip without contamination. The whole operation of sample handling was automated using a 3D micropipetting system. These results show that the novel imaging flow cytometry system is practically applicable for biological research and clinical diagnostics. PMID:21747698

  20. Improved synthesis of [(18)F]FLETT via a fully automated vacuum distillation method for [(18)F]2-fluoroethyl azide purification.

    PubMed

    Ackermann, Uwe; Plougastel, Lucie; Goh, Yit Wooi; Yeoh, Shinn Dee; Scott, Andrew M

    2014-12-01

    The synthesis of [(18)F]2-fluoroethyl azide and its subsequent click reaction with 5-ethynyl-2'-deoxyuridine (EDU) to form [(18)F]FLETT was performed using an iPhase FlexLab module. The implementation of a vacuum distillation method afforded [(18)F]2-fluoroethyl azide in 87±5.3% radiochemical yield. The use of Cu(CH3CN)4PF6 and TBTA as catalyst enabled us to fully automate the [(18)F]FLETT synthesis without the need for the operator to enter the radiation field. [(18)F]FLETT was produced in higher overall yield (41.3±6.5%) and shorter synthesis time (67 min) than with our previously reported manual method (32.5±2.5% in 130 min).

  1. Fully Automated Gis-Based Individual Tree Crown Delineation Based on Curvature Values from a LIDAR Derived Canopy Height Model in a Coniferous Plantation

    NASA Astrophysics Data System (ADS)

    Argamosa, R. J. L.; Paringit, E. C.; Quinton, K. R.; Tandoc, F. A. M.; Faelga, R. A. G.; Ibañez, C. A. G.; Posilero, M. A. V.; Zaragosa, G. P.

    2016-06-01

    The generation of a high-resolution canopy height model (CHM) from LiDAR makes it possible to delineate individual tree crowns by means of a fully automated method using the CHM's curvature, derived from its slope. Local maxima are obtained by taking the maximum raster value within a 3 m x 3 m cell. These maxima are assumed to be tree tops and are therefore considered individual trees. Based on these assumptions, Thiessen polygons were generated to serve as buffers for the canopy extent. The negative profile curvature is then measured from the slope of the CHM. The results show that the aggregated points from a negative profile curvature raster provide the most realistic crown shape. In the absence of field data on tree crown dimensions, accuracy was assessed visually after superimposing the delineated tree crown polygons on the hillshaded CHM.
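
    The local-maxima step described above (taking the maximum CHM value within a 3 m x 3 m window as a candidate tree top) can be sketched as follows. This is a simplified illustration on a synthetic raster, assuming NumPy/SciPy; the curvature-based crown delineation and Thiessen-polygon buffering are not reproduced.

      import numpy as np
      from scipy import ndimage

      def detect_tree_tops(chm, cell_size_m=1.0, window_m=3.0, min_height_m=2.0):
          """Flag CHM cells that are the maximum within a window_m x window_m
          neighborhood and taller than min_height_m (candidate tree tops)."""
          win = max(1, int(round(window_m / cell_size_m)))
          local_max = ndimage.maximum_filter(chm, size=win)
          tops = (chm == local_max) & (chm >= min_height_m)
          return np.argwhere(tops)              # (row, col) indices of candidate tops

      # Synthetic 1 m resolution CHM with two Gaussian "crowns"
      yy, xx = np.mgrid[0:50, 0:50]
      chm = 15 * np.exp(-((xx - 15) ** 2 + (yy - 20) ** 2) / 30.0) \
          + 12 * np.exp(-((xx - 35) ** 2 + (yy - 30) ** 2) / 25.0)
      print(detect_tree_tops(chm, cell_size_m=1.0, window_m=3.0))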

  2. GenomEra MRSA/SA, a fully automated homogeneous PCR assay for rapid detection of Staphylococcus aureus and the marker of methicillin resistance in various sample matrixes.

    PubMed

    Hirvonen, Jari J; Kaukoranta, Suvi-Sirkku

    2013-09-01

    The GenomEra MRSA/SA assay (Abacus Diagnostica, Turku, Finland) is the first commercial homogeneous PCR assay using thermally stable, intrinsically fluorescent time-resolved fluorometric (TRF) labels resistant to autofluorescence and other background effects. This fully automated closed tube PCR assay simultaneously detects Staphylococcus aureus specific DNA and the mecA gene within 50 min. It can be used for both screening and confirmation of methicillin-resistant and -sensitive S. aureus (MRSA and MSSA) directly in different specimen types or from preceding cultures. The assay has shown excellent performance in comparisons with other diagnostic methods in all the sample types tested. The GenomEra MRSA/SA assay provides rapid assistance for the detection of MRSA as well as invasive staphylococcal infections and helps the early targeting of antimicrobial therapy to patients with potential MRSA infection.

  3. Fully automated classification of bone marrow infiltration in low-dose CT of patients with multiple myeloma based on probabilistic density model and supervised learning.

    PubMed

    Martínez-Martínez, Francisco; Kybic, Jan; Lambert, Lukáš; Mecková, Zuzana

    2016-04-01

    This paper presents a fully automated method for the identification of bone marrow infiltration in femurs in low-dose CT of patients with multiple myeloma. We automatically find the femurs and the bone marrow within them. In the next step, we create a probabilistic, spatially dependent density model of normal tissue. At test time, we detect unexpectedly high density voxels which may be related to bone marrow infiltration, as outliers to this model. Based on a set of global, aggregated features representing all detections from one femur, we classify the subjects as being either healthy or not. This method was validated on a dataset of 127 subjects with ground truth created from a consensus of two expert radiologists, obtaining an AUC of 0.996 for the task of distinguishing healthy controls and patients with bone marrow infiltration. To the best of our knowledge, no other automatic image-based method for this task has been published before. PMID:26894595
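
    As a rough illustration of the overall idea (model normal marrow density, flag unexpectedly dense voxels as outliers, aggregate per-femur features, and classify subjects), the Python sketch below uses a simple global Gaussian density model and a random forest on synthetic data; the authors' spatially dependent probabilistic model and their exact feature set are not reproduced here.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      def outlier_fraction(hu_values, normal_mean, normal_std, z_thresh=3.0):
          """Fraction of marrow voxels whose density (HU) exceeds the normal model
          by more than z_thresh standard deviations (simplified, not spatially
          dependent as in the paper)."""
          z = (hu_values - normal_mean) / normal_std
          return float(np.mean(z > z_thresh))

      rng = np.random.default_rng(1)

      def simulate_femur(infiltrated):
          # Synthetic marrow densities: healthy ~ N(40, 15) HU; infiltrated adds a high-HU tail
          hu = rng.normal(40, 15, size=5000)
          if infiltrated:
              hu[:500] += rng.normal(120, 20, size=500)   # focal high-density lesions
          return hu

      subjects = [simulate_femur(i % 2 == 1) for i in range(60)]
      labels = np.array([i % 2 for i in range(60)])

      # Aggregated global features per subject: outlier fraction, mean and 99th percentile HU
      features = np.array([[outlier_fraction(s, 40, 15), s.mean(), np.percentile(s, 99)]
                           for s in subjects])

      clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(features, labels)
      print("training accuracy:", clf.score(features, labels))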

  4. Instrumentation of LOTIS: Livermore Optical Transient Imaging System; a fully automated wide field of view telescope system searching for simultaneous optical counterparts of gamma ray bursts

    SciTech Connect

    Park, H.S.; Ables, E.; Barthelmy, S.D.; Bionta, R.M.; Ott, L.L.; Parker, E.L.; Williams, G.G.

    1998-03-06

    LOTIS is a rapidly slewing, wide-field-of-view telescope which was designed and constructed to search for simultaneous gamma-ray burst (GRB) optical counterparts. This experiment requires a rapidly slewing (<10 s), wide-field-of-view (>15°), automatic and dedicated telescope. LOTIS utilizes commercial telephoto lenses and custom 2048 x 2048 CCD cameras to view a 17.6° x 17.6° field of view. It can point to any part of the sky within 5 s and is fully automated. It is connected via an Internet socket to the GRB coordinate distribution network, which analyzes telemetry from the satellite and delivers GRB coordinate information in real time. LOTIS started routine operation in Oct. 1996. In the idle time between GRB triggers, LOTIS systematically surveys the entire available sky every night for new optical transients. This paper describes the system design and performance.

  5. A novel fully automated on-line coupled liquid chromatography-gas chromatography technique used for the determination of organochlorine pesticide residues in tobacco and tobacco products.

    PubMed

    Qi, Dawei; Fei, Ting; Sha, Yunfei; Wang, Leijun; Li, Gang; Wu, Da; Liu, Baizhan

    2014-12-29

    In this study, a novel fully automated on-line coupled liquid chromatography-gas chromatography (LC-GC) technique is reported and applied to the determination of organochlorine pesticide residues (OCPs) in tobacco and tobacco products. Using a switching valve to isolate the capillary pre-column and the analytical column during the solvent evaporation period, the LC solvent can be completely removed and prevented from reaching the GC column and the detector. The established method was used to determine OCPs in tobacco samples. By using a Florisil SPE column and employing the GPC technique, polar impurities and large-molecule impurities were removed. A dynamic range of 1-100 ng/mL was achieved, with detection limits from 1.5 to 3.3 μg/kg. The method exhibited good repeatability and recoveries. This technology may provide an alternative approach to the trace analysis of complex samples.

  6. On transcending the impasse of respiratory motion correction applications in routine clinical imaging - a consideration of a fully automated data driven motion control framework.

    PubMed

    Kesner, Adam L; Schleyer, Paul J; Büther, Florian; Walter, Martin A; Schäfers, Klaus P; Koo, Phillip J

    2014-12-01

    Positron emission tomography (PET) is increasingly used for the detection, characterization, and follow-up of tumors located in the thorax. However, patient respiratory motion presents a unique limitation that hinders the application of high-resolution PET technology for this type of imaging. Efforts to transcend this limitation have been underway for more than a decade, yet for practical purposes PET remains a modality vulnerable to motion-induced image degradation, and respiratory motion control is not employed in routine clinical operations. In this article, we take the opportunity to highlight some of the recent advancements in data-driven motion control strategies and how they may form the underpinning of what we are presenting as a fully automated data-driven motion control framework. This framework represents an alternative direction for future endeavors in motion control and can conceptually connect individual focused studies with a strategy for addressing big-picture challenges and goals. PMID:26501450

  7. SU-D-BRD-06: Creating a Safety Net for a Fully Automated, Script Driven Electronic Medical Record

    SciTech Connect

    Sheu, R; Ghafar, R; Powers, A; Green, S; Lo, Y

    2015-06-15

    Purpose: Demonstrate the effectiveness of in-house software in ensuring EMR workflow efficiency and safety. Methods: A web-based dashboard system (WBDS) was developed to monitor clinical workflow in real time using web technology (WAMP) through ODBC (Open Database Connectivity). Within Mosaiq (Elekta Inc), operational workflow is driven and indicated by Quality Check Lists (QCLs), which are triggered by the automation software IQ Scripts (Elekta Inc); QCLs rely on user completion to propagate. The WBDS retrieves data directly from the Mosaiq SQL database and tracks clinical events in real time. For example, the necessity of a physics initial chart check can be determined by screening all patients on treatment who have received their first fraction and have not yet had their first chart check. Monitoring such "real" events with our in-house software creates a safety net, as its propagation does not rely on individual users' input. Results: The WBDS monitors the following: patient care workflow (initial consult to end of treatment), daily treatment consistency (scheduling, technique, charges), physics chart checks (initial, EOT, weekly), new starts, missed treatments (>3 fractions, warning; >5 fractions, action required), and machine overrides. The WBDS can be launched from any web browser, which allows the end user complete transparency and timely information. Since the creation of the dashboards, workflow interruptions due to accidental deletion or completion of QCLs have been eliminated. Additionally, all physics chart checks were completed on time. Prompt notification of treatment record inconsistencies and machine overrides has decreased the time between occurrence and execution of corrective action. Conclusion: Our clinical workflow relies primarily on QCLs and IQ Scripts; however, this functionality is not a panacea for safety and efficiency. The WBDS creates a more thorough system of checks to provide a safer and nearly error-free working environment.
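
    The dashboard concept described above (querying the record-and-verify SQL database through ODBC to surface patients who have started treatment but lack an initial physics chart check) could be sketched as below. The connection string, table names, and column names are purely hypothetical placeholders and not the actual Mosaiq schema; pyodbc is assumed.

      import pyodbc

      # Connection string and schema below are illustrative only; the actual Mosaiq
      # database layout is proprietary and differs from these hypothetical names.
      conn = pyodbc.connect(
          "DRIVER={ODBC Driver 17 for SQL Server};SERVER=mosaiq-db;DATABASE=MOSAIQ;"
          "Trusted_Connection=yes;"
      )

      PENDING_INITIAL_CHECK_SQL = """
      SELECT p.patient_id, p.last_name, MIN(t.tx_datetime) AS first_fraction
      FROM   patients p
      JOIN   treatments t ON t.patient_id = p.patient_id
      LEFT JOIN physics_checks c
             ON c.patient_id = p.patient_id AND c.check_type = 'INITIAL'
      WHERE  c.patient_id IS NULL
      GROUP BY p.patient_id, p.last_name
      """

      # Each row flags a patient who has received a fraction but has no initial check on record
      for row in conn.execute(PENDING_INITIAL_CHECK_SQL):
          print(f"Initial chart check pending: {row.patient_id} {row.last_name} "
                f"(first fraction {row.first_fraction})")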

  8. Fully Automated Simultaneous Integrated Boosted-Intensity Modulated Radiation Therapy Treatment Planning Is Feasible for Head-and-Neck Cancer: A Prospective Clinical Study

    SciTech Connect

    Wu Binbin; McNutt, Todd; Zahurak, Marianna; Simari, Patricio; Pang, Dalong; Taylor, Russell; Sanguineti, Giuseppe

    2012-12-01

    Purpose: To prospectively determine whether overlap volume histogram (OVH)-driven, automated simultaneous integrated boosted (SIB)-intensity-modulated radiation therapy (IMRT) treatment planning for head-and-neck cancer can be implemented in clinics. Methods and Materials: A prospective study was designed to compare fully automated plans (APs) created by an OVH-driven, automated planning application with clinical plans (CPs) created by dosimetrists in 3-dose-level (70 Gy, 63 Gy, and 58.1 Gy) head-and-neck SIB-IMRT planning. Because primary organ sparing (cord, brain, brainstem, mandible, and optic nerve/chiasm) always received the highest priority in clinical planning, the study aimed to show the noninferiority of APs with respect to PTV coverage and secondary organ sparing (parotid, brachial plexus, esophagus, larynx, inner ear, and oral mucosa). The sample size was determined a priori by a superiority hypothesis test that had 85% power to detect a 4% dose decrease in secondary organ sparing with a 2-sided alpha level of 0.05. A generalized estimating equation (GEE) regression model was used for statistical comparison. Results: Forty consecutive patients were accrued from July to December 2010. GEE analysis indicated that in APs, the overall average dose to the secondary organs was reduced by 1.16 Gy (95% CI = 0.09-2.33) with P=.04, the overall average PTV coverage was increased by 0.26% (95% CI = 0.06-0.47) with P=.02, and the overall average dose to the primary organs was reduced by 1.14 Gy (95% CI = 0.45-1.8) with P=.004. A physician determined that all APs could be delivered to patients, and APs were clinically superior in 27 of 40 cases. Conclusions: The application can be implemented in clinics as a fast, reliable, and consistent way of generating plans that need only minor adjustments to meet specific clinical needs.

  9. Quantitative determination of opioids in whole blood using fully automated dried blood spot desorption coupled to on-line SPE-LC-MS/MS.

    PubMed

    Verplaetse, Ruth; Henion, Jack

    2016-01-01

    Opioids are well-known, widely used painkillers. Increased stability of opioids in the dried blood spot (DBS) matrix compared with blood/plasma has been described. Other benefits provided by DBS techniques include point-of-care collection, less invasive micro-sampling, more economical shipment, and convenient storage. Current methodology for the analysis of micro whole blood samples for opioids is limited to the classical DBS workflow, including tedious manual punching of the DBS cards followed by extraction and liquid chromatography-tandem mass spectrometry (LC-MS/MS) bioanalysis. The goal of this study was to develop and validate a fully automated on-line sample preparation procedure for the analysis of DBS micro samples relevant to the detection of opioids in finger-prick blood. To this end, automated flow-through elution of DBS cards was followed by on-line solid-phase extraction (SPE) and analysis by LC-MS/MS. Selective, sensitive, accurate, and reproducible quantitation of five representative opioids in human blood at sub-therapeutic, therapeutic, and toxic levels was achieved. The range of reliable response (R² ≥ 0.997) was 1 to 500 ng/mL whole blood for morphine, codeine, oxycodone, and hydrocodone, and 0.1 to 50 ng/mL for fentanyl. Inter-day, intra-day, and matrix inter-lot accuracy and precision were less than 15% (even at the lower limit of quantitation (LLOQ)). The method was successfully used to measure hydrocodone and its major metabolite norhydrocodone in incurred human samples. Our data support the enormous potential of DBS sampling and automated analysis for monitoring opioids as well as other pharmaceuticals in both anti-doping and pain management regimens. PMID:26607771

  10. Implementation and Evaluation of a Fully Automated Multiplex Real-Time PCR Assay on the BD Max Platform to Detect and Differentiate Herpesviridae from Cerebrospinal Fluids

    PubMed Central

    Köller, Thomas; Kurze, Daniel; Lange, Mirjam; Scherdin, Martin; Podbielski, Andreas; Warnke, Philipp

    2016-01-01

    A fully automated multiplex real-time PCR assay—including a sample process control and a plasmid-based positive control—for the detection and differentiation of herpes simplex virus 1 (HSV1), herpes simplex virus 2 (HSV2), and varicella-zoster virus (VZV) from cerebrospinal fluid (CSF) was developed on the BD Max platform. Performance was compared with an established, accredited multiplex real-time PCR protocol utilizing the easyMAG and the LightCycler 480/II, both very common devices in viral molecular diagnostics. For clinical validation, 123 CSF specimens and 40 reference samples from national interlaboratory comparisons were examined with both methods, resulting in 97.6% and 100% concordance for CSF and reference samples, respectively. The BD Max platform achieved sensitivities of 173 (95% CI, 88-258) copies/mL for HSV1, 171 (95% CI, 148-194) copies/mL for HSV2, and 84 (95% CI, 5-163) copies/mL for VZV. Cross-reactivity was excluded by checking 25 common viral, bacterial, and fungal human pathogens. Workflow analyses showed a shorter test duration as well as remarkably fewer and easier preparation steps, with the potential to reduce error rates that occur when patient samples are assessed manually. This protocol provides a fully automated PCR assay on the BD Max platform for the simultaneous detection of herpesviruses from CSF specimens. Singular or multiple infections due to HSV1, HSV2, and VZV can reliably be differentiated with good sensitivities. Control parameters are included within the assay, making it suitable for current quality management requirements. PMID:27092772

  11. Use of short roll C-arm computed tomography and fully automated 3D analysis tools to guide transcatheter aortic valve replacement.

    PubMed

    Kim, Michael S; Bracken, John; Eshuis, Peter; Chen, S Y James; Fullerton, David; Cleveland, Joseph; Messenger, John C; Carroll, John D

    2016-07-01

    Determination of the coplanar view is a critical component of transcatheter aortic valve replacement (TAVR). The safety and accuracy of a novel reduced angular range C-arm computed tomography (CACT) approach coupled with a fully automated 3D analysis tool package to predict the coplanar view in TAVR were evaluated. Fifty-seven patients with severe symptomatic aortic stenosis deemed prohibitive-risk for surgery who underwent TAVR were enrolled. Patients were randomized 2:1 to CACT vs. angiography (control) for estimating the coplanar view, and the two approaches were compared quantitatively. Radiation doses needed to determine the coplanar view were recorded for both the CACT and control patients. CACT showed good agreement with the actual angiographic view utilized during TAVR in 34 of 41 cases in which a CACT scan was performed (83%). For these 34 cases, the mean angular magnitude difference, taking into account both oblique and cranial/caudal angulation, was 1.3° ± 0.4°, while the maximum difference was 7.3°. There were no significant differences in the mean total radiation dose delivered to patients between the CACT and control groups as measured by either dose-area product (207.8 ± 15.2 Gy·cm² vs. 186.1 ± 25.3 Gy·cm², P = 0.47) or air kerma (1287.6 ± 117.7 mGy vs. 1098.9 ± 143.8 mGy, P = 0.32). Use of reduced angular range CACT coupled with fully automated 3D analysis tools is a safe, practical, and feasible method by which to determine the optimal angiographic deployment view for guiding TAVR procedures.

  12. Multi-center evaluation of the novel fully-automated PCR-based Idylla™ BRAF Mutation Test on formalin-fixed paraffin-embedded tissue of malignant melanoma.

    PubMed

    Melchior, Linea; Grauslund, Morten; Bellosillo, Beatriz; Montagut, Clara; Torres, Erica; Moragón, Ester; Micalessi, Isabel; Frans, Johan; Noten, Veerle; Bourgain, Claire; Vriesema, Renske; van der Geize, Robert; Cokelaere, Kristof; Vercooren, Nancy; Crul, Katrien; Rüdiger, Thomas; Buchmüller, Diana; Reijans, Martin; Jans, Caroline

    2015-12-01

    The advent of BRAF-targeted therapies led to increased survival in patients with metastatic melanomas harboring a BRAF V600 mutation (implicated in 46-48% of malignant melanomas). The Idylla™ System (Idylla™), i.e., the real-time PCR-based Idylla™ BRAF Mutation Test performed on the fully automated Idylla™ platform, enables detection of the most frequent BRAF V600 mutations (V600E/E2/D, V600K/R/M) in tumor material within approximately 90 min and with a 1% detection limit. Idylla™ performance was determined in a multi-center study by analyzing the BRAF mutational status of 148 archival formalin-fixed paraffin-embedded (FFPE) tumor samples from malignant melanoma patients and comparing Idylla™ results with assessments made by commercial or in-house routine diagnostic methods. Of the 148 samples analyzed, Idylla™ initially recorded 7 insufficient-DNA-input calls and 15 results discordant with routine method results. Further analysis showed that the quality of 8 samples was insufficient for Idylla™ testing, 1 sample had an invalid routine test result, and Idylla™ results were confirmed in 10 samples. Hence, Idylla™ identified all mutations present, including 7 not identified by routine methods. Idylla™ enables fully automated BRAF V600 testing directly on FFPE tumor tissue with increased sensitivity, ease of use, and much shorter turnaround time compared with existing diagnostic tests, making it a tool for rapid, simple, and highly reliable analysis of therapeutically relevant BRAF mutations, in particular for diagnostic units without molecular expertise and infrastructure. PMID:26407762

  14. Parameter evaluation and fully-automated radiosynthesis of [11C]harmine for imaging of MAO-A for clinical trials

    PubMed Central

    Philippe, C.; Zeilinger, M.; Mitterhauser, M.; Dumanic, M.; Lanzenberger, R.; Hacker, M.; Wadsak, W.

    2015-01-01

    The aim of the present study was the evaluation and automation of the radiosynthesis of [11C]harmine for clinical trials. The following parameters were investigated: amount of base, precursor concentration, solvent, reaction temperature, and reaction time. The optimum reaction conditions were determined to be 2–3 mg/mL precursor activated with 1 eq. of 5 M NaOH in DMSO, a reaction temperature of 80 °C, and a reaction time of 2 min. Under these conditions, 6.1±1 GBq (51.0±11% based on [11C]CH3I, corrected for decay) of [11C]harmine (n=72) were obtained. The specific activity was 101.32±28.2 GBq/µmol (at EOS). All quality control parameters were in accordance with the standards for parenteral human application. Due to its reliability and high yields, this fully automated synthesis method can be used as a routine set-up. PMID:25594603

  15. Fully automated determination of N-nitrosamines in environmental waters by headspace solid-phase microextraction followed by GC-MS-MS.

    PubMed

    Llop, Anna; Borrull, Francesc; Pocurull, Eva

    2010-12-01

    A fully automated method for determining nine US Environmental Protection Agency N-nitrosamines in several types of environmental waters at ng/L levels is presented. The method is based on headspace solid-phase microextraction followed by GC-MS-MS using chemical ionization. Three different fibers (carboxen/PDMS, divinylbenzene/carboxen/PDMS, and PEG) were tested. Solid-phase microextraction conditions were best when a divinylbenzene/carboxen/PDMS fiber was exposed for 60 min, at 45°C, in the headspace of 10 mL water samples at pH 7 containing 360 g/L of NaCl. All compounds were analyzed by GC-MS-MS within 18 min. The method was validated using effluent from an urban wastewater treatment plant, and the LODs ranged from 1 to 5 ng/L. The method was then applied to determine the N-nitrosamines in samples of different complexities, such as tap water and several influent and effluent wastewater samples from urban and industrial wastewater treatment plants and a potable water treatment plant. Although the analysis of influent industrial wastewater revealed high concentrations of some compounds (N-nitrosomorpholine and N-nitrosodimethylamine at μg/L levels), the concentrations in industrial effluents and other samples were substantially lower (ng/L levels). The new method is suitable for the simple and reliable determination of N-nitrosamines in highly complex water samples in a completely automated procedure.

  16. Comparative evaluation of two fully-automated real-time PCR methods for MRSA admission screening in a tertiary-care hospital.

    PubMed

    Hos, N J; Wiegel, P; Fischer, J; Plum, G

    2016-09-01

    We evaluated two fully-automated real-time PCR systems, the novel QIAGEN artus MRSA/SA QS-RGQ and the widely used BD MAX MRSA assay, for their diagnostic performance in MRSA admission screening in a tertiary-care university hospital. Two hundred sixteen clinical swabs were analyzed for MRSA DNA using the BD MAX MRSA assay. In parallel, the same specimens were tested with the QIAGEN artus MRSA/SA QS-RGQ. Automated steps included lysis of bacteria, DNA extraction, real-time PCR and interpretation of results. MRSA culture was additionally performed as a reference method for MRSA detection. Sensitivity values were similar for both assays (80 %), while the QIAGEN artus MRSA/SA QS-RGQ reached a slightly higher specificity (95.8 % versus 90.0 %). Positive (PPVs) and negative predictive values (NPVs) were 17.4 % and 99.4 % for the BD MAX MRSA assay and 33.3 % and 99.5 % for the QIAGEN artus MRSA/SA QS-RGQ, respectively. Total turn-around time (TAT) for 24 samples was 3.5 hours for both assays. In conclusion, both assays represent reliable diagnostic tools due to their high negative predictive values, especially for the rapid identification of MRSA negative patients in a low prevalence MRSA area. PMID:27259711

  17. A fully automated meltwater monitoring and collection system for spatially distributed isotope analysis in snowmelt-dominated catchments

    NASA Astrophysics Data System (ADS)

    Rücker, Andrea; Boss, Stefan; Von Freyberg, Jana; Zappa, Massimiliano; Kirchner, James

    2016-04-01

    In many mountainous catchments the seasonal snowpack stores a significant volume of water, which is released as streamflow during the melting period. The predicted change in future climate will bring new challenges in water resource management in snow-dominated headwater catchments and their receiving lowlands. To improve predictions of hydrologic extreme events, particularly summer droughts, it is important to characterize the relationship between the winter snowpack and summer (low) flows in such areas (e.g., Godsey et al., 2014). In this context, stable water isotopes (18O, 2H) are a powerful tool for fingerprinting the sources of streamflow and tracing water flow pathways. For this reason, we have established an isotope sampling network in the Alptal catchment (46.4 km2) in central Switzerland as part of the SREP-Drought project (Snow Resources and the Early Prediction of hydrological DROUGHT in mountainous streams). Samples of precipitation (daily), snow cores (weekly), and runoff (daily) are analyzed for their isotopic signature in a regular cycle. Precipitation is also sampled along a horizontal transect at the valley bottom and along an elevational transect. Additionally, the analysis of snow meltwater is of importance. Because collecting snow meltwater samples in mountainous terrain is often impractical, we have developed a fully automatic snow lysimeter system that measures meltwater volume and collects samples for isotope analysis at daily intervals. The system consists of three lysimeters built from Decagon ECRN-100 high-resolution rain gauges as standard components, which allow monitoring of meltwater flow. Each lysimeter leads the meltwater into a 10-liter container that is automatically sampled and then emptied daily. These water samples are retrieved regularly and later analyzed for their isotopic composition in the lab. Snowmelt events as well as system status can be monitored in real time. In our presentation we describe the automatic snow lysimeter

  18. GARLig: a fully automated tool for subset selection of large fragment spaces via a self-adaptive genetic algorithm.

    PubMed

    Pfeffer, Patrick; Fober, Thomas; Hüllermeier, Eyke; Klebe, Gerhard

    2010-09-27

    In combinatorial chemistry, molecules are assembled according to combinatorial principles by linking suitable reagents or decorating a given scaffold with appropriate substituents from a large chemical space of starting materials. Often, the number of possible combinations greatly exceeds what is feasible to handle with an in-depth in silico approach, let alone to synthesize experimentally. Therefore, powerful tools to efficiently enumerate large chemical spaces are required. They can be provided by genetic algorithms, which mimic Darwinian evolution. GARLig (genetic algorithm using reagents to compose ligands) has been developed to perform subset selection in large chemical compound spaces subject to target-specific 3D scoring criteria. GARLig uses different scoring schemes, such as the AutoDock4 score, GOLDScore, and DrugScore(CSD), as fitness functions. Its genetic parameters have been optimized to characterize combinatorial libraries with respect to binding to various targets of pharmaceutical interest. A large tripeptide library of 20³ members was used to profile amino acid frequencies in putative substrates for trypsin, thrombin, factor Xa, and plasmin. A peptidomimetic scaffold assembled from a selection of 25³ building-block combinations was used to test the performance of the evolutionary algorithm in suggesting potent inhibitors of the enzyme cathepsin D. In a final case study, our program was used to characterize and rank a combinatorial drug-like library comprising 33,750 potential thrombin inhibitors. These case studies demonstrate that GARLig finds experimentally confirmed potent leads by processing a significantly smaller subset of the fully enumerated combinatorial library. Furthermore, the profiles of amino acids computed by the genetic algorithm match the observed amino acid frequencies found by screening peptide libraries in substrate cleavage assays.
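
    To illustrate the general mechanism of a genetic algorithm for subset selection over building-block combinations (not GARLig itself), the Python sketch below evolves tuples of building-block indices with truncation selection, one-point crossover, and per-gene mutation; a trivial placeholder stands in for a docking-score fitness function such as GOLDScore or AutoDock4.

      import random

      N_POSITIONS, N_BLOCKS = 3, 25          # e.g., a scaffold with 3 substituent positions
      POP_SIZE, GENERATIONS, MUT_RATE = 40, 30, 0.1

      def fitness(individual):
          """Placeholder for a docking score; here we simply reward a fixed
          target combination for demonstration purposes."""
          target = (3, 17, 8)
          return sum(1.0 for a, b in zip(individual, target) if a == b)

      def mutate(ind):
          return tuple(random.randrange(N_BLOCKS) if random.random() < MUT_RATE else g
                       for g in ind)

      def crossover(a, b):
          cut = random.randrange(1, N_POSITIONS)
          return a[:cut] + b[cut:]

      random.seed(0)
      population = [tuple(random.randrange(N_BLOCKS) for _ in range(N_POSITIONS))
                    for _ in range(POP_SIZE)]

      for _ in range(GENERATIONS):
          population.sort(key=fitness, reverse=True)
          parents = population[: POP_SIZE // 2]            # truncation selection
          children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                      for _ in range(POP_SIZE - len(parents))]
          population = parents + children

      best = max(population, key=fitness)
      print("best building-block combination:", best, "score:", fitness(best))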

  19. Fully automated determination of the sterol composition and total content in edible oils and fats by online liquid chromatography-gas chromatography-flame ionization detection.

    PubMed

    Nestola, Marco; Schmidt, Torsten C

    2016-09-01

    Sterol analysis of edible oils and fats is important in authenticity control. The gas chromatographic determination of the sterol distribution and total content is described in ISO 12228. Extraction, purification, and detection of the sterols are time-consuming and error-prone, as collaborative trials regularly demonstrate. Purification by thin-layer chromatography (TLC) and robust GC determination of all relevant sterols are not straightforward. Therefore, a fully automated LC-GC-FID method was developed to facilitate the determination of sterols. The only remaining manual step is weighing the sample into an autosampler vial. Saponification and extraction were performed by an autosampler, while purification, separation, and detection were accomplished by online coupled normal-phase LC-GC-FID. Interlacing of sample preparation and analysis allowed an average sample throughput of one sample per hour. The quantitative results were fully comparable with the ISO method, with one apparent exception: in the case of sunflower oils, an additional unknown sterol was detected that is generally missed by ISO 12228. The reason lies in the omission of sterol silylation before GC-FID in the new method; the derivatization reaction used in the ISO procedure changes the retention time and hides this compound behind a major sterol. The compound could be identified as 14-methyl fecosterol; its structure was elucidated by GC-MS and confirmed by HPLC and GC retention times. Finally, validation of the designed method confirmed its suitability for routine environments. PMID:27522150

  20. Rapid microarray-based DNA genoserotyping of Escherichia coli.

    PubMed

    Geue, Lutz; Monecke, Stefan; Engelmann, Ines; Braun, Sascha; Slickers, Peter; Ehricht, Ralf

    2014-02-01

    In this study, an improvement of the oligonucleotide-based DNA microarray for the genoserotyping of Escherichia coli is presented. Primers and probes for an additional 70 O antigen groups were developed. The microarray was transferred to a new platform, the ArrayStrip format, which allows high-throughput testing in a 96-well format and fully automated microarray analysis. Thus, starting from a single colony, it is possible to determine, within a few hours and in a single experiment, 94 of the more than 180 known O antigen groups as well as 47 of the 53 different H antigens. The microarray was initially validated with a set of defined reference strains that had previously been serotyped by conventional agglutination in various reference centers. For further validation of the microarray, 180 clinical E. coli isolates of human origin (from urine samples, blood cultures, bronchial secretions, and wound swabs) and 53 E. coli isolates from cattle, pigs, and poultry were used. A high degree of concordance between the results of classical antibody-based serotyping and DNA-based genoserotyping was demonstrated during validation of the 70 new O antigen groups as well as for the field strains of human and animal origin. Therefore, this oligonucleotide array is a diagnostic tool that is user-friendly and more efficient than classical serotyping by agglutination. Furthermore, the tests can be performed in almost every routine lab and are easily expanded and standardized.

  1. Fully automated production of diverse 18F-labeled PET tracers on the ELIXYS multi-reactor radiosynthesizer without hardware modification

    PubMed Central

    Lazari, Mark; Collins, Jeffrey; Shen, Bin; Farhoud, Mohammed; Yeh, Daniel; Maraglia, Brandon; Chin, Frederick T.; Nathanson, David A.; Moore, Melissa; van Dam, R. Michael

    2015-01-01

    Fully automated radiosynthesizers continue to be developed to meet the growing need for the reliable production of positron emission tomography (PET) tracers made under current good manufacturing practice (cGMP) guidelines. There is a current trend towards supporting "kit-like" disposable cassettes that come preconfigured for particular tracers, thus eliminating the need for cleaning protocols between syntheses and enabling quick transitions to synthesizing other tracers. Though ideal for production, these systems are often limited for the development of novel tracers due to pressure, temperature, and chemical compatibility considerations. This study demonstrates the versatile use of the ELIXYS fully automated radiosynthesizer to adapt and produce eight different 18F-labeled PET tracers of varying complexity. Methods: Three-reactor syntheses of D-[18F]FAC, L-[18F]FMAU, and D-[18F]FEAU, along with one-reactor syntheses of D-[18F]FEAU, [18F]FDG, [18F]FLT, [18F]Fallypride, [18F]FHBG, and [18F]SFB, were all performed on ELIXYS without the need for any hardware modifications or reconfiguration. Synthesis protocols were adapted and slightly modified from the literature but not fully optimized. Furthermore, [18F]FLT, [18F]FDG, and [18F]Fallypride were produced sequentially on the same day and used for preclinical imaging of A431 tumor-bearing SCID mice and wild-type BALB/c mice, respectively. To assess future translation to the clinical setting, several batches of tracers were subjected to a full set of quality control tests. Results: All tracers were produced with radiochemical yields comparable to those in the literature. [18F]FLT, [18F]FDG, and [18F]Fallypride were successfully used to image the mice, with results consistent with the literature. All tracers subjected to clinical quality control tests passed. Conclusion: The ELIXYS radiosynthesizer facilitates rapid tracer development and is capable of producing multiple 18F-labeled PET tracers suitable for clinical use.

  2. Fully automated segmentation and tracking of the intima media thickness in ultrasound video sequences of the common carotid artery.

    PubMed

    Ilea, Dana E; Duffy, Caoimhe; Kavanagh, Liam; Stanton, Alice; Whelan, Paul F

    2013-01-01

    The robust identification and measurement of the intima media thickness (IMT) has a high clinical relevance because it represents one of the most precise predictors used in the assessment of potential future cardiovascular events. To facilitate the analysis of arterial wall thickening in serial clinical investigations, in this paper we have developed a novel fully automatic algorithm for the segmentation, measurement, and tracking of the intima media complex (IMC) in B-mode ultrasound video sequences. The proposed algorithm entails a two-stage image analysis process that initially addresses the segmentation of the IMC in the first frame of the ultrasound video sequence using a model-based approach; in the second step, a novel customized tracking procedure is applied to robustly detect the IMC in the subsequent frames. For the video tracking procedure, we introduce a spatially coherent algorithm called adaptive normalized correlation that prevents the tracking process from converging to wrong arterial interfaces. This represents the main contribution of this paper and was developed to deal with inconsistencies in the appearance of the IMC over the cardiac cycle. The quantitative evaluation has been carried out on 40 ultrasound video sequences of the common carotid artery (CCA) by comparing the results returned by the developed algorithm with respect to ground truth data that has been manually annotated by clinical experts. The measured IMT(mean) ± standard deviation recorded by the proposed algorithm is 0.60 mm ± 0.10, with a mean coefficient of variation (CV) of 2.05%, whereas the corresponding result obtained for the manually annotated ground truth data is 0.60 mm ± 0.11 with a mean CV equal to 5.60%. The numerical results reported in this paper indicate that the proposed algorithm is able to correctly segment and track the IMC in ultrasound CCA video sequences, and we were encouraged by the stability of our technique when applied to data captured under
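
    As a simplified illustration of tracking by normalized correlation (the plain building block underlying the paper's adaptive, spatially coherent variant, which is not reproduced here), the NumPy sketch below searches a small window around the previous position for the patch that best matches a template.

      import numpy as np

      def normalized_correlation(a, b):
          """Zero-mean normalized cross-correlation of two equally sized patches."""
          a = a - a.mean()
          b = b - b.mean()
          denom = np.sqrt((a * a).sum() * (b * b).sum())
          return float((a * b).sum() / denom) if denom > 0 else 0.0

      def track_template(frame, template, prev_rc, search=5):
          """Search a small window around the previous position for the location
          whose patch best matches the template."""
          h, w = template.shape
          best_rc, best_score = prev_rc, -1.0
          for dr in range(-search, search + 1):
              for dc in range(-search, search + 1):
                  r, c = prev_rc[0] + dr, prev_rc[1] + dc
                  if r < 0 or c < 0 or r + h > frame.shape[0] or c + w > frame.shape[1]:
                      continue
                  score = normalized_correlation(frame[r:r + h, c:c + w], template)
                  if score > best_score:
                      best_score, best_rc = score, (r, c)
          return best_rc, best_score

      # Synthetic example: a bright band (stand-in for the IMC) shifts down by 2 pixels
      rng = np.random.default_rng(0)
      frame0 = rng.normal(0, 0.1, (60, 60)); frame0[20:24, :] += 1.0
      frame1 = rng.normal(0, 0.1, (60, 60)); frame1[22:26, :] += 1.0
      template = frame0[18:28, 25:45]
      print(track_template(frame1, template, prev_rc=(18, 25)))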

  3. Diabetes Prevention and Weight Loss with a Fully Automated Behavioral Intervention by Email, Web, and Mobile Phone: A Randomized Controlled Trial Among Persons with Prediabetes

    PubMed Central

    Romanelli, Robert J; Block, Torin J; Hopkins, Donald; Carpenter, Heather A; Dolginsky, Marina S; Hudes, Mark L; Palaniappan, Latha P; Block, Clifford H

    2015-01-01

    Background One-third of US adults, 86 million people, have prediabetes. Two-thirds of adults are overweight or obese and at risk for diabetes. Effective and affordable interventions are needed that can reach these 86 million, and others at high risk, to reduce their progression to diagnosed diabetes. Objective The aim was to evaluate the effectiveness of a fully automated algorithm-driven behavioral intervention for diabetes prevention, Alive-PD, delivered via the Web, Internet, mobile phone, and automated phone calls. Methods Alive-PD provided tailored behavioral support for improvements in physical activity, eating habits, and factors such as weight loss, stress, and sleep. Weekly emails suggested small-step goals and linked to an individual Web page with tools for tracking, coaching, social support through virtual teams, competition, and health information. A mobile phone app and automated phone calls provided further support. The trial randomly assigned 339 persons to the Alive-PD intervention (n=163) or a 6-month wait-list usual-care control group (n=176). Participants were eligible if either fasting glucose or glycated hemoglobin A1c (HbA1c) was in the prediabetic range. Primary outcome measures were changes in fasting glucose and HbA1c at 6 months. Secondary outcome measures included clinic-measured changes in body weight, body mass index (BMI), waist circumference, triglyceride/high-density lipoprotein cholesterol (TG/HDL) ratio, and Framingham diabetes risk score. Analysis was by intention-to-treat. Results Participants’ mean age was 55 (SD 8.9) years, mean BMI was 31.2 (SD 4.4) kg/m2, and 68.7% (233/339) were male. Mean fasting glucose was in the prediabetic range (mean 109.9, SD 8.4 mg/dL), whereas the mean HbA1c was 5.6% (SD 0.3), in the normal range. In intention-to-treat analyses, Alive-PD participants achieved significantly greater reductions than controls in fasting glucose (mean –7.36 mg/dL, 95% CI –7.85 to –6.87 vs mean –2.19, 95% CI

  4. Comparison of Two Theory-Based, Fully Automated Telephone Interventions Designed to Maintain Dietary Change in Healthy Adults: Study Protocol of a Three-Arm Randomized Controlled Trial

    PubMed Central

    Quintiliani, Lisa M; Turner-McGrievy, Gabrielle M; Migneault, Jeffrey P; Heeren, Timothy; Friedman, Robert H

    2014-01-01

    Background Health behavior change interventions have focused on obtaining short-term intervention effects; few studies have evaluated mid-term and long-term outcomes, and even fewer have evaluated interventions that are designed to maintain and enhance initial intervention effects. Moreover, behavior theory has not been developed for maintenance or applied to maintenance intervention design to the degree that it has for behavior change initiation. Objective The objective of this paper is to describe a study that compared two theory-based interventions (social cognitive theory [SCT] vs goal systems theory [GST]) designed to maintain previously achieved improvements in fruit and vegetable (F&V) consumption. Methods The interventions used tailored, interactive conversations delivered by a fully automated telephony system (Telephone-Linked Care [TLC]) over a 6-month period. TLC maintenance intervention based on SCT used a skills-based approach to build self-efficacy. It assessed confidence in and barriers to eating F&V, provided feedback on how to overcome barriers, plan ahead, and set goals. The TLC maintenance intervention based on GST used a cognitive-based approach. Conversations trained participants in goal management to help them integrate their newly acquired dietary behavior into their hierarchical system of goals. Content included goal facilitation, conflict, shielding, and redundancy, and reflection on personal goals and priorities. To evaluate and compare the two approaches, a sample of adults whose F&V consumption was below public health goal levels were recruited from a large urban area to participate in a fully automated telephony intervention (TLC-EAT) for 3-6 months. Participants who increase their daily intake of F&V by ≥1 serving/day will be eligible for the three-arm randomized controlled trial. A sample of 405 participants will be randomized to one of three arms: (1) an assessment-only control, (2) TLC-SCT, and (3) TLC-GST. The maintenance

  5. Surface free energy and microarray deposition technology.

    PubMed

    McHale, Glen

    2007-03-01

    Microarray techniques use a combinatorial approach to assess complex biochemical interactions. The fundamental goal is simultaneous, large-scale experimentation analogous to the automation achieved in the semiconductor industry. However, microarray deposition inherently involves liquids contacting solid substrates. Liquid droplet shapes are determined by surface and interfacial tension forces, and flows during drying. This article looks at how surface free energy and wetting considerations may influence the accuracy and reliability of spotted microarray experiments.

  6. Multiresidue trace analysis of pharmaceuticals, their human metabolites and transformation products by fully automated on-line solid-phase extraction-liquid chromatography-tandem mass spectrometry.

    PubMed

    García-Galán, María Jesús; Petrovic, Mira; Rodríguez-Mozaz, Sara; Barceló, Damià

    2016-09-01

    A novel, fully automated analytical methodology based on dual-column liquid chromatography coupled to tandem mass spectrometry (LC-LC-MS²) has been developed and validated for the analysis of 12 pharmaceuticals and 20 metabolites and transformation products (TPs) in different types of water (influent and effluent wastewater and surface water). Two LC columns were used - one for pre-concentration of the sample and the second for separation and analysis - so that water samples were injected directly into the chromatographic system. Besides the many advantages of the methodology, such as minimization of the required sample volume and its manipulation, compounds ionizing in positive and in negative mode could be analyzed simultaneously without compromising sensitivity. A comparative study of different mobile phases, gradients, and LC pre-concentration columns was carried out to obtain the best analytical performance. Method limits of detection (MLODs) were in the low ng L⁻¹ range for all the compounds. The method was successfully applied to study the presence of the target analytes in different wastewater and surface water samples collected near the city of Girona (Catalonia, Spain). Data on the environmental presence and fate of pharmaceutical metabolites and TPs are still scarce, highlighting the relevance of the developed methodology. PMID:27343613

  7. Fully-Automated High-Throughput NMR System for Screening of Haploid Kernels of Maize (Corn) by Measurement of Oil Content.

    PubMed

    Wang, Hongzhi; Liu, Jin; Xu, Xiaoping; Huang, Qingming; Chen, Shanshan; Yang, Peiqiang; Chen, Shaojiang; Song, Yiqiao

    2016-01-01

    One of the modern crop breeding techniques uses doubled haploid plants, which contain an identical pair of chromosomes, in order to accelerate the breeding process. A rapid haploid identification method is critical for large-scale selection of doubled haploids. The conventional methods, based on the color of the endosperm and embryo of seeds, are slow, manual, and prone to error. On the other hand, there is a significant difference in oil content between diploid and haploid seeds generated with a high-oil inducer, which makes it possible to use oil content to identify haploids. This paper describes a fully automated high-throughput NMR screening system for maize haploid kernel identification. The system comprises a sampler unit that selects a single kernel and feeds it for NMR and weight measurement, and a kernel sorter that distributes the kernel according to the measurement result. Tests of the system show a consistent accuracy of 94% with an average screening time of 4 seconds per kernel. Field test results are described and directions for future improvement are discussed. PMID:27454427

  8. Fully automated determination of 74 pharmaceuticals in environmental and waste waters by online solid phase extraction-liquid chromatography-electrospray-tandem mass spectrometry.

    PubMed

    López-Serna, Rebeca; Pérez, Sandra; Ginebreda, Antoni; Petrović, Mira; Barceló, Damià

    2010-12-15

    The present work describes the development of a fully automated method, based on on-line solid-phase extraction (SPE)-liquid chromatography-electrospray-tandem mass spectrometry (LC-MS-MS), for the determination of 74 pharmaceuticals in environmental waters (surface water and groundwater) as well as sewage waters. On-line SPE is performed by passing 2.5 mL of the water sample through a HySphere Resin GP cartridge. For unequivocal identification and confirmation, two selected reaction monitoring (SRM) transitions are monitored per compound, so that four identification points are achieved. Quantification is performed by the internal standard approach, which is indispensable for correcting losses during solid-phase extraction as well as matrix effects. The main advantages of the method are its high sensitivity (limits of detection in the low ng L(-1) range), selectivity due to the use of tandem mass spectrometry, and reliability due to the use of 51 surrogates and minimal sample manipulation. As part of the validation procedure, the developed method was applied to the analysis of various environmental and sewage samples from a Spanish river and a sewage treatment plant.

  9. The microfluidic bioagent autonomous networked detector (M-BAND): an update. Fully integrated, automated, and networked field identification of airborne pathogens

    NASA Astrophysics Data System (ADS)

    Sanchez, M.; Probst, L.; Blazevic, E.; Nakao, B.; Northrup, M. A.

    2011-11-01

    We describe a fully automated and autonomous airborne biothreat detection system for biosurveillance applications. The system, including the nucleic-acid-based detection assay, was designed, built, and shipped by Microfluidic Systems Inc (MFSI), a new subsidiary of PositiveID Corporation (PSID). Our findings demonstrate that the system and assay unequivocally identify pathogenic strains of Bacillus anthracis, Yersinia pestis, Francisella tularensis, Burkholderia mallei, and Burkholderia pseudomallei. In order to assess the assay's ability to detect unknown samples, our team also challenged it against a series of blind samples provided by the Department of Homeland Security (DHS). These samples included naturally occurring isolates, near-neighbor isolates, and environmental samples. Our results indicate that the multiplex assay was specific and produced no false positives when challenged with in-house gDNA collections and DHS-provided panels. Here we present another analytical tool for the rapid identification of nine Centers for Disease Control and Prevention category A and B biothreat organisms.

  10. Diagnosing Aleutian mink disease infection by a new fully automated ELISA or by counter current immunoelectrophoresis: a comparison of sensitivity and specificity.

    PubMed

    Dam-Tuxen, Rebekka; Dahl, Jan; Jensen, Trine Hammer; Dam-Tuxen, Thomas; Struve, Tina; Bruun, Leif

    2014-04-01

    Aleutian disease (AD) is a severe disease characterized by hypergammaglobulinemia causing multiple symptoms in mink, such as acute renal failure, arteritis, reduced reproductive performance, and pneumonia. AD is caused by the parvovirus Aleutian mink disease virus (ADV) and is diagnosed primarily on the basis of ADV serology, sometimes supplemented by PCR analysis of organs. In Denmark, approximately 3.5-4 million serum samples are tested every year for the presence of anti-ADV antibodies as part of a national eradication program. The present study compares the diagnostic performance of the two most commonly used assays for serological screening for Aleutian disease: counter-current immunoelectrophoresis (CIEP) and ELISA. In total, 3810 mink were sampled in doublets and analyzed by CIEP and a newly developed, fully automated ELISA. The results show that the two assays have comparable diagnostic performance, with the ELISA having a higher sensitivity but lower specificity than the CIEP assay. The ELISA has been approved by the Danish authorities for diagnosing Aleutian disease in mink.
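
    The sensitivity/specificity comparison reported above reduces, for each assay, to a 2x2 contingency table against the reference infection status. A minimal Python sketch of that calculation, with purely hypothetical counts, is shown below.

      def sensitivity_specificity(tp, fp, fn, tn):
          """Diagnostic sensitivity and specificity from a 2x2 contingency table."""
          sensitivity = tp / (tp + fn)
          specificity = tn / (tn + fp)
          return sensitivity, specificity

      # Hypothetical counts: assay result vs. reference infection status
      sens, spec = sensitivity_specificity(tp=480, fp=60, fn=20, tn=3250)
      print(f"sensitivity = {sens:.3f}, specificity = {spec:.3f}")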

  11. A fully automated effervescence-assisted switchable solvent-based liquid phase microextraction procedure: Liquid chromatographic determination of ofloxacin in human urine samples.

    PubMed

    Vakh, Christina; Pochivalov, Aleksei; Andruch, Vasil; Moskvin, Leonid; Bulatov, Andrey

    2016-02-11

    A novel fully automated effervescence-assisted switchable solvent-based liquid phase microextraction procedure has been suggested. In this extraction method, medium-chain saturated fatty acids were investigated as switchable hydrophilicity solvents. The conversion of the fatty acid into its hydrophilic form was carried out in the presence of sodium carbonate. The injection of sulfuric acid into the solution decreased the pH value of the solution; thus, microdroplets of the fatty acid were generated. Carbon dioxide bubbles were generated in situ and promoted the extraction process and the final phase separation. The performance of the suggested approach was demonstrated by the determination of ofloxacin in human urine samples using high-performance liquid chromatography with fluorescence detection. This analytical task was used as a proof-of-concept example. Under the optimal conditions, the detector response for ofloxacin was linear in the concentration range of 3·10(-8)-3·10(-6) mol L(-1). The limit of detection, calculated from a blank test based on 3σ, was 1·10(-8) mol L(-1). The results demonstrated that the presented approach is highly cost-effective, simple, rapid and environmentally friendly.
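    The 3σ blank criterion mentioned above can be illustrated with a short sketch; the blank signals and calibration slope are hypothetical placeholders.

```python
# Minimal sketch of the 3-sigma blank criterion used to estimate the limit
# of detection from replicate blank measurements and a calibration slope.
# Values are hypothetical and expressed in the units used in the abstract.
import statistics

blank_signals = [102, 95, 110, 99, 104, 97, 101]   # detector counts, blank runs
calibration_slope = 3.1e10                          # counts per (mol/L)

sigma_blank = statistics.stdev(blank_signals)
lod = 3 * sigma_blank / calibration_slope           # LOD = 3 * sigma / slope
print(f"estimated LOD: {lod:.1e} mol/L")
```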

  12. Fully-automated estimation of actual to potential evapotranspiration in the Everglades using Landsat and air temperature data as inputs to the Vegetation Index-Temperature Trapezoid method

    NASA Astrophysics Data System (ADS)

    Yagci, A. L.; Jones, J. W.

    2014-12-01

    While the greater Everglades contains a vast wetland, evapotranspiration (ET) is a major source of water "loss" from the system. Like other ecosystems, the Everglades is vulnerable to drought. Everglades restoration science and resource management require information on the spatial and temporal distribution of ET. We developed a fully-automated ET model using the Vegetation Index-Temperature Trapezoid concept. The model was tested and evaluated against in-situ ET observations collected at the Shark River Slough Mangrove Forest eddy-covariance tower in Everglades National Park (Sitename / FLUXNET ID: Florida Everglades Shark River Slough Mangrove Forest / US-Skr). It uses Landsat Surface Reflectance Climate Data Record products from Landsat 5, and Landsat 5 thermal and air temperature data from the Daily Gridded Surface Dataset, to output the ratio of actual evapotranspiration (AET) to potential evapotranspiration (PET). When multiplied by a PET estimate, this output can be used to estimate ET at high spatial resolution. Furthermore, it can be used to downscale coarse-resolution ET and PET products. Two example outputs covering the agricultural lands north of the major Everglades wetlands, extracted for two different dates, are shown along with a National Land Cover Database image from 2011. The irrigated and non-irrigated farms are easily distinguishable from the background (i.e., natural land covers). Open water retained the highest AET/PET ratio. Wetlands had a higher AET/PET ratio than farmlands. The main challenge in this study area is prolonged cloudiness during the growing season.
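    The Vegetation Index-Temperature Trapezoid idea can be sketched as a per-pixel interpolation between a dry (warm) edge and a wet (cold) edge in VI-Ts space; the edge coefficients and the toy arrays below are assumptions for illustration only, not the model's calibrated values.

```python
# Illustrative numpy sketch of the Vegetation Index-Temperature Trapezoid
# idea: each pixel's AET/PET ratio is taken as its position between a
# VI-dependent dry (warm) edge and wet (cold) edge in VI-Ts space.
# Edge coefficients and the example arrays are hypothetical.
import numpy as np

ndvi = np.array([[0.2, 0.5], [0.7, 0.9]])          # vegetation index
ts   = np.array([[312.0, 305.0], [299.0, 296.0]])  # surface temperature, K

# Dry and wet edges modelled as linear functions of NDVI (assumed values).
t_dry = 320.0 - 15.0 * ndvi
t_wet = 295.0 - 2.0 * ndvi

ratio = np.clip((t_dry - ts) / (t_dry - t_wet), 0.0, 1.0)  # AET/PET in [0, 1]
print(ratio)
```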

  13. Fully-Automated High-Throughput NMR System for Screening of Haploid Kernels of Maize (Corn) by Measurement of Oil Content

    PubMed Central

    Xu, Xiaoping; Huang, Qingming; Chen, Shanshan; Yang, Peiqiang; Chen, Shaojiang; Song, Yiqiao

    2016-01-01

    One of the modern crop breeding techniques uses doubled haploid plants that contain an identical pair of chromosomes in order to accelerate the breeding process. A rapid haploid identification method is critical for large-scale selection of doubled haploids. The conventional methods, based on the color of the seed endosperm and embryo, are slow, manual and prone to error. On the other hand, there is a significant difference in oil content between diploid and haploid seeds generated by a high-oil inducer, which makes it possible to use oil content to identify haploids. This paper describes a fully-automated high-throughput NMR screening system for maize haploid kernel identification. The system comprises a sampler unit that selects and feeds a single kernel for NMR and weight measurement, and a kernel sorter that distributes the kernel according to the measurement result. Tests of the system show a consistent accuracy of 94% with an average screening time of 4 seconds per kernel. Field test results are described and directions for future improvement are discussed. PMID:27454427
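    The sorting decision itself reduces to a threshold on the NMR-derived oil content; the sketch below is a hypothetical illustration in which the threshold, the weight cut-off and the measurements are invented, not the system's calibrated parameters.

```python
# Minimal sketch of the sorting decision: a kernel whose NMR-derived oil
# content falls below a threshold is routed to the haploid bin. The
# threshold and the measurements are hypothetical illustrations.

OIL_THRESHOLD_PERCENT = 4.0   # assumed cut-off between haploid and diploid

def sort_kernel(oil_percent, weight_mg):
    """Return the output bin for a single kernel measurement."""
    if weight_mg < 50:                      # reject fragments / empty picks
        return "reject"
    return "haploid" if oil_percent < OIL_THRESHOLD_PERCENT else "diploid"

for oil, weight in [(2.8, 260), (6.1, 310), (3.5, 41)]:
    print(oil, weight, "->", sort_kernel(oil, weight))
```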

  14. Web-Based Fully Automated Self-Help With Different Levels of Therapist Support for Individuals With Eating Disorder Symptoms: A Randomized Controlled Trial

    PubMed Central

    Dingemans, Alexandra E; Spinhoven, Philip; van Ginkel, Joost R; de Rooij, Mark; van Furth, Eric F

    2016-01-01

    Background Despite the disabling nature of eating disorders (EDs), many individuals with ED symptoms do not receive appropriate mental health care. Internet-based interventions have potential to reduce the unmet needs by providing easily accessible health care services. Objective This study aimed to investigate the effectiveness of an Internet-based intervention for individuals with ED symptoms, called “Featback.” In addition, the added value of different intensities of therapist support was investigated. Methods Participants (N=354) were aged 16 years or older with self-reported ED symptoms, including symptoms of anorexia nervosa, bulimia nervosa, and binge eating disorder. Participants were recruited via the website of Featback and the website of a Dutch pro-recovery–focused e-community for young women with ED problems. Participants were randomized to: (1) Featback, consisting of psychoeducation and a fully automated self-monitoring and feedback system, (2) Featback supplemented with low-intensity (weekly) digital therapist support, (3) Featback supplemented with high-intensity (3 times a week) digital therapist support, and (4) a waiting list control condition. Internet-administered self-report questionnaires were completed at baseline, post-intervention (ie, 8 weeks after baseline), and at 3- and 6-month follow-up. The primary outcome measure was ED psychopathology. Secondary outcome measures were symptoms of depression and anxiety, perseverative thinking, and ED-related quality of life. Statistical analyses were conducted according to an intent-to-treat approach using linear mixed models. Results The 3 Featback conditions were superior to a waiting list in reducing bulimic psychopathology (d=−0.16, 95% confidence interval (CI)=−0.31 to −0.01), symptoms of depression and anxiety (d=−0.28, 95% CI=−0.45 to −0.11), and perseverative thinking (d=−0.28, 95% CI=−0.45 to −0.11). No added value of therapist support was found in terms of symptom

  16. Microarray technology for use in molecular epidemiology.

    PubMed

    Vernon, Suzanne D; Whistler, Toni

    2007-01-01

    Microarrays are a powerful laboratory tool for the simultaneous assessment of the activity of thousands of genes. Remarkable advances in biological sample collection, preparation and automation of hybridization have enabled the application of microarray technology to large, population-based studies. Now, microarrays have the potential to serve as screening tools for the detection of altered gene expression activity that might contribute to diseases in human populations. Reproducible and reliable microarray results depend on multiple factors. In this chapter, biological sample parameters are introduced that should be considered for any microarray experiment. Then, the microarray technology that we have successfully applied to limited biological samples from all of our molecular epidemiology studies is detailed. This reproducible and reliable approach for using microarrays should be applicable to any biological question asked.

  17. Effectiveness of a Web-Based Screening and Fully Automated Brief Motivational Intervention for Adolescent Substance Use: A Randomized Controlled Trial

    PubMed Central

    Elgán, Tobias H; De Paepe, Nina; Tønnesen, Hanne; Csémy, Ladislav; Thomasius, Rainer

    2016-01-01

    Background Mid-to-late adolescence is a critical period for initiation of alcohol and drug problems, which can be reduced by targeted brief motivational interventions. Web-based brief interventions have advantages in terms of acceptability and accessibility and have shown significant reductions of substance use among college students. However, the evidence is sparse among adolescents with at-risk use of alcohol and other drugs. Objective This study evaluated the effectiveness of a targeted and fully automated Web-based brief motivational intervention with no face-to-face components on substance use among adolescents screened for at-risk substance use in four European countries. Methods In an open-access, purely Web-based randomized controlled trial, a convenience sample of adolescents aged 16-18 years from Sweden, Germany, Belgium, and the Czech Republic was recruited using online and offline methods and screened online for at-risk substance use using the CRAFFT (Car, Relax, Alone, Forget, Friends, Trouble) screening instrument. Participants were randomized to a single session brief motivational intervention group or an assessment-only control group but not blinded. Primary outcome was differences in past month drinking measured by a self-reported AUDIT-C-based index score for drinking frequency, quantity, and frequency of binge drinking with measures collected online at baseline and after 3 months. Secondary outcomes were the AUDIT-C-based separate drinking indicators, illegal drug use, and polydrug use. All outcome analyses were conducted with and without Expectation Maximization (EM) imputation of missing follow-up data. Results In total, 2673 adolescents were screened and 1449 (54.2%) participants were randomized to the intervention or control group. After 3 months, 211 adolescents (14.5%) provided follow-up data. Compared to the control group, results from linear mixed models revealed significant reductions in self-reported past-month drinking in favor of the

  18. Evaluation of fully automated assays for the detection of Rubella IgM and IgG antibodies by the Elecsys(®) immunoassay system.

    PubMed

    van Helden, Josef; Grangeot-Keros, Liliane; Vauloup-Fellous, Christelle; Vleminckx, Renaud; Masset, Frédéric; Revello, Maria-Grazia

    2014-04-01

    Screening for acute rubella infection in pregnancy is an important element of antenatal care. This study compared the sensitivity, specificity and reproducibility of two new, fully automated Elecsys(®) Rubella IgM and IgG immunoassays, designed for the Elecsys 2010, Modular Analytics E170, COBAS e-411, and COBAS e-601 and e-602 analytical platforms, with current assays, using serum from patients with primary rubella infections, vaccinated patients, patients with potentially cross-reacting infections, and routine samples in clinical laboratories in France, Germany and Italy. Both assays showed good within-run and within-laboratory precision. A sensitivity of 79.8-96.0% was demonstrated for Elecsys IgM in primary, early acute infection, consistent with existing assays. In samples obtained from routine antenatal screening, the Elecsys Rubella IgM assay revealed high specificity (98.7-99.0%). A significantly (p<0.0001) lower reactivity was demonstrated in samples from previously infected patients in whom acute rubella infection was excluded, and the incidence of false positives in patients with potentially cross-reacting infections was lower with Elecsys Rubella IgM compared with other assays. The Elecsys Rubella IgG assay exhibited a relative sensitivity of 99.9-100.0% and specificity of 97.4-100.0% in samples from routine antenatal screening. The Elecsys Rubella IgM and IgG assays allow convenient, rapid and reliable determination of anti-rubella antibodies. Sensitivity, specificity and reproducibility were comparable with existing assay systems. Assay results were available in approximately half the time required for currently employed methods, and the assays are compatible with widely used analytical platforms.

  19. Predicting survival in heart failure case and control subjects by use of fully automated methods for deriving nonlinear and conventional indices of heart rate dynamics

    NASA Technical Reports Server (NTRS)

    Ho, K. K.; Moody, G. B.; Peng, C. K.; Mietus, J. E.; Larson, M. G.; Levy, D.; Goldberger, A. L.

    1997-01-01

    BACKGROUND: Despite much recent interest in quantification of heart rate variability (HRV), the prognostic value of conventional measures of HRV and of newer indices based on nonlinear dynamics is not universally accepted. METHODS AND RESULTS: We have designed algorithms for analyzing ambulatory ECG recordings and measuring HRV without human intervention, using robust methods for obtaining time-domain measures (mean and SD of heart rate), frequency-domain measures (power in the bands of 0.001 to 0.01 Hz [VLF], 0.01 to 0.15 Hz [LF], and 0.15 to 0.5 Hz [HF] and total spectral power [TP] over all three of these bands), and measures based on nonlinear dynamics (approximate entropy [ApEn], a measure of complexity, and detrended fluctuation analysis [DFA], a measure of long-term correlations). The study population consisted of chronic congestive heart failure (CHF) case patients and sex- and age-matched control subjects in the Framingham Heart Study. After exclusion of technically inadequate studies and those with atrial fibrillation, we used these algorithms to study HRV in 2-hour ambulatory ECG recordings of 69 participants (mean age, 71.7+/-8.1 years). By use of separate Cox proportional-hazards models, the conventional measures SD (P<.01), LF (P<.01), VLF (P<.05), and TP (P<.01) and the nonlinear measure DFA (P<.05) were predictors of survival over a mean follow-up period of 1.9 years; other measures, including ApEn (P>.3), were not. In multivariable models, DFA was of borderline predictive significance (P=.06) after adjustment for the diagnosis of CHF and SD. CONCLUSIONS: These results demonstrate that HRV analysis of ambulatory ECG recordings based on fully automated methods can have prognostic value in a population-based study and that nonlinear HRV indices may contribute prognostic value to complement traditional HRV measures.
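    Of the nonlinear indices named above, detrended fluctuation analysis is straightforward to sketch; the following is a generic textbook formulation applied to a synthetic RR-interval series, not the authors' implementation.

```python
# Rough sketch of detrended fluctuation analysis (DFA) on an RR-interval
# series, the nonlinear long-range-correlation index referred to in the
# abstract. Generic textbook formulation on synthetic data only.
import numpy as np

def dfa_alpha(x, scales=(4, 8, 16, 32, 64)):
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())                 # integrated, mean-centred series
    fluct = []
    for n in scales:
        n_boxes = len(y) // n
        f2 = []
        for i in range(n_boxes):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            coeffs = np.polyfit(t, seg, 1)      # local linear trend per box
            f2.append(np.mean((seg - np.polyval(coeffs, t)) ** 2))
        fluct.append(np.sqrt(np.mean(f2)))
    # Scaling exponent alpha = slope of log F(n) versus log n.
    return np.polyfit(np.log(scales), np.log(fluct), 1)[0]

rr = np.random.default_rng(0).normal(0.8, 0.05, 4096)   # synthetic RR intervals, s
print(f"DFA alpha (white-noise-like series, expect ~0.5): {dfa_alpha(rr):.2f}")
```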

  20. Fully automated standard addition method for the quantification of 29 polar pesticide metabolites in different water bodies using LC-MS/MS.

    PubMed

    Kowal, Sebastian; Balsaa, Peter; Werres, Friedrich; Schmidt, Torsten C

    2013-07-01

    A reliable quantification by LC-ESI-MS/MS, the most suitable analytical method for polar substances in the aquatic environment, is usually hampered by matrix effects from co-eluting compounds, which are unavoidably present in environmental samples. The standard addition method (SAM) is the most appropriate method to compensate for matrix effects. However, when performed manually, this method is too labour- and time-intensive for routine analysis. In the present work, a fully automated SAM using a multi-purpose sample manager "Open Architecture UPLC®-MS/MS" (ultra-performance liquid chromatography tandem mass spectrometry) was developed for the sensitive and reliable determination of 29 polar pesticide metabolites in environmental samples. A four-point SAM was conducted in parallel to direct-injection UPLC-ESI-MS/MS determination and was followed by a workflow to calculate the analyte concentrations, including monitoring of the required quality criteria. Several parameters of the SAM, chromatography and mass spectrometry conditions were optimised in order to obtain a fast and reliable analytical method. The matrix effects were examined by comparison of the SAM with an external calibration method. The accuracy of the SAM was investigated by recovery tests in samples from different catchment areas. The method detection limit was estimated to be between 1 and 10 ng/L for all metabolites by direct injection of a 10-μL sample. The relative standard deviation values were between 2 and 10% at the upper end of the calibration range (30 ng/L). About 200 samples from different water bodies in the Rhine and Ruhr region of North Rhine-Westphalia (Germany) were examined with this method. Approximately 94% of the analysed samples contained measurable amounts of metabolites. For most metabolites, low concentrations ≤0.10 μg/L were determined. Only for three metabolites were the concentrations in groundwater significantly higher (up to 20 μg/L). In none of the examined drinking
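    The standard addition calculation itself is a simple linear regression whose x-intercept magnitude gives the native concentration; the sketch below uses hypothetical four-point data.

```python
# Minimal sketch of standard-addition quantification: regress the measured
# response against the added concentration and take the magnitude of the
# x-intercept as the native concentration in the sample. Data are hypothetical.
import numpy as np

added_ng_per_L = np.array([0.0, 10.0, 20.0, 30.0])        # four-point SAM
peak_areas     = np.array([1520.0, 3490.0, 5510.0, 7480.0])

slope, intercept = np.polyfit(added_ng_per_L, peak_areas, 1)
native_conc = intercept / slope        # |x-intercept| of the regression line
print(f"native concentration: {native_conc:.1f} ng/L")
```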

  1. Fully automated TV-image analysis of the cell-cycle: comparison of the PLM method with determinations of the percentage and the DNA content of labelled cells.

    PubMed

    Wachsmuth, E D; van Golitschek, M; Macht, F; Maurer-Schultze, B

    1988-01-01

    A cell-cycle analysis based on a fully automated TV-image scanning system is proposed to replace the laborious PLM method. To compare the efficiency of the two procedures, cell-cycle parameters were assessed in Ehrlich (diploid and hyperdiploid), L-1210, and JB-1 mouse ascites tumours and in rat jejunal crypts. The percentages of labelled mitoses (PLM) were counted visually on Feulgen-stained autoradiographs obtained at various times after a single 3H-thymidine pulse. The fraction of labelled cells (P) and the DNA ratio of labelled and unlabelled cells were measured by TV-image analysis in the same slides and plotted against time. Within practical limits, TV-image analysis using the P-curve gives the same results as the PLM method. Using the P-curve has the important advantage that its first part, beginning at the time of 3H-thymidine injection and ending at the first maximum, furnishes more information about the cell cycle than the corresponding part of the PLM curve. It can be used to compute tG2M, tS and the ratio of the growth fraction index to the cell-cycle time (IP/tC), whereas the first part of the PLM curve reveals only the length of the S-phase (tS). The IP/tC ratio is a readily accessible measure of growth and increases when the cells divide more frequently. Cell death rates may be neglected since the ratio is determined within less than the duration of one cell cycle. Moreover, the data from the first part of the P curve indicate whether there is a large non-growth fraction. If the non-growth fraction is small, i.e. if IP is approximately 1, the P curve need only be measured until the first maximum is reached, so that fewer samples and animals are required. If the non-growth fraction is large or unknown, the cell-cycle parameters are calculated by reference to the position and size not only of the first minimum and the first maximum, but also of the second minimum of the P curve.

  2. Fully automated ionic liquid-based headspace single drop microextraction coupled to GC-MS/MS to determine musk fragrances in environmental water samples.

    PubMed

    Vallecillos, Laura; Pocurull, Eva; Borrull, Francesc

    2012-09-15

    A fully automated ionic liquid-based headspace single drop microextraction (IL-HS-SDME) procedure has been developed for the first time to preconcentrate trace amounts of ten musk fragrances extensively used in personal care products (six polycyclic musks, three nitro musks and one polycyclic musk degradation product) from wastewater samples prior to analysis by gas chromatography and ion trap tandem mass spectrometry (GC-IT-MS/MS). Due to the low volatility of the ILs, a large internal diameter liner (3.4 mm i.d.) was used to improve the ILs evaporation. Furthermore, a piece of glass wool was introduced into the liner to avoid the entrance of the ILs in the GC column and a guard column was used to prevent analytical column damages. The main factors influencing the IL-HS-SDME were optimized. For all species, the highest enrichments factors were achieved using 1 μL of 1-octyl-3-methylimidazolium hexafluorophosphate ([OMIM][PF(6)]) ionic liquid exposed in the headspace of 10 mL water samples containing 300 g L(-1) of NaCl and stirred at 750 rpm and 60 °C for 45 min. All compounds were determined by direct injection GC-IT-MS/MS with a chromatographic time of 19 min. Method detection limits were found in the low ng mL(-1) range between 0.010 ng mL(-1) and 0.030 ng mL(-1) depending on the target analytes. Also, under optimized conditions, the method gave good levels of intra-day and inter-day repeatabilities in wastewater samples with relative standard deviations varying between 3% and 6% and 5% and 11%, respectively (n=3, 1 ng mL(-1)). The applicability of the method was tested with different wastewater samples from influent and effluent urban wastewater treatment plants (WWTPs) and one potable treatment plant (PTP). The analysis of influent urban wastewater revealed the presence of galaxolide and tonalide at concentrations of between 2.10 ng mL(-1) and 0.29 ng mL(-1) and 0.32 ng mL(-1) and

  3. Tissue Microarrays.

    PubMed

    Dancau, Ana-Maria; Simon, Ronald; Mirlacher, Martina; Sauter, Guido

    2016-01-01

    Modern next-generation sequencing and microarray technologies allow for the simultaneous analysis of all human genes at the DNA, RNA, miRNA, and methylation level. Studies using such techniques have led to the identification of hundreds of genes with a potential role in cancer or other diseases. The validation of all of these candidate genes requires in situ analysis of high numbers of clinical tissue samples. The tissue microarray (TMA) technology greatly facilitates such analysis. In this method, minute tissue samples (typically 0.6 mm in diameter) from up to 1000 different tissues can be analyzed on one microscope glass slide. All in situ methods suitable for histological studies can be applied to TMAs without major changes of protocols, including immunohistochemistry, fluorescence in situ hybridization, or RNA in situ hybridization. Because all tissues are analyzed simultaneously with the same batch of reagents, TMA studies provide an unprecedented degree of standardization, speed, and cost efficiency.

  4. Chromosome Microarray.

    PubMed

    Anderson, Sharon

    2016-01-01

    Over the last half century, knowledge about genetics, genetic testing, and its complexity has flourished. Completion of the Human Genome Project provided a foundation upon which the accuracy of genetics, genomics, and integration of bioinformatics knowledge and testing has grown exponentially. What is lagging, however, are efforts to reach and engage nurses about this rapidly changing field. The purpose of this article is to familiarize nurses with several frequently ordered genetic tests, including chromosome analysis and fluorescence in situ hybridization, followed by a comprehensive review of chromosome microarray. It describes the complexity of microarray testing, including how testing is performed and how results are analyzed. A case report demonstrates how this technology is applied in clinical practice and reveals benefits and limitations of this scientific and bioinformatics genetic technology. Clinical implications for maternal-child nurses across practice levels are discussed. PMID:27276104

  5. A comparison of fully automated methods of data analysis and computer assisted heuristic methods in an electrode kinetic study of the pathologically variable [Fe(CN)6](3-/4-) process by AC voltammetry.

    PubMed

    Morris, Graham P; Simonov, Alexandr N; Mashkina, Elena A; Bordas, Rafel; Gillow, Kathryn; Baker, Ruth E; Gavaghan, David J; Bond, Alan M

    2013-12-17

    Fully automated and computer assisted heuristic data analysis approaches have been applied to a series of AC voltammetric experiments undertaken on the [Fe(CN)6](3-/4-) process at a glassy carbon electrode in 3 M KCl aqueous electrolyte. The recovered parameters in all forms of data analysis encompass E(0) (reversible potential), k(0) (heterogeneous charge transfer rate constant at E(0)), α (charge transfer coefficient), Ru (uncompensated resistance), and Cdl (double layer capacitance). The automated method of analysis employed time domain optimization and Bayesian statistics. This and all other methods assumed the Butler-Volmer model applies for electron transfer kinetics, planar diffusion for mass transport, Ohm's Law for Ru, and a potential-independent Cdl model. Heuristic approaches utilize combinations of Fourier Transform filtering, sensitivity analysis, and simplex-based forms of optimization applied to resolved AC harmonics and rely on experimenter experience to assist in experiment-theory comparisons. Remarkable consistency of parameter evaluation was achieved, although the fully automated time domain method provided consistently higher α values than those based on frequency domain data analysis. The origin of this difference is that the implemented fully automated method requires a perfect model for the double layer capacitance. In contrast, the importance of imperfections in the double layer model is minimized when analysis is performed in the frequency domain. Substantial variation in k(0) values was found by analysis of the 10 data sets for this highly surface-sensitive pathologically variable [Fe(CN)6](3-/4-) process, but remarkably, all fit the quasi-reversible model satisfactorily.
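    All of the compared analysis methods assume Butler-Volmer kinetics; a small sketch of that rate law is shown below with illustrative parameter values, not the fitted values from the study.

```python
# Small sketch of the Butler-Volmer rate law that all the compared analysis
# methods assume for the electron-transfer kinetics; parameter values are
# illustrative placeholders, not the fitted ones from the study.
import numpy as np

F, R, T = 96485.0, 8.314, 298.15
f = F / (R * T)

def bv_rate_constants(E, E0=0.25, k0=0.01, alpha=0.5):
    """Forward (reduction) and backward (oxidation) rate constants, cm/s."""
    kf = k0 * np.exp(-alpha * f * (E - E0))
    kb = k0 * np.exp((1.0 - alpha) * f * (E - E0))
    return kf, kb

for E in (0.15, 0.25, 0.35):                      # electrode potentials, V
    kf, kb = bv_rate_constants(E)
    print(f"E={E:.2f} V  kf={kf:.3e}  kb={kb:.3e}")
```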

  6. Low-complexity PDE-based approach for automatic microarray image processing.

    PubMed

    Belean, Bogdan; Terebes, Romulus; Bot, Adrian

    2015-02-01

    Microarray image processing is known as a valuable tool for gene expression estimation, a crucial step in understanding biological processes within living organisms. Automation and reliability are open subjects in microarray image processing, where grid alignment and spot segmentation are essential processes that can influence the quality of gene expression information. The paper proposes a novel partial differential equation (PDE)-based approach for fully automatic grid alignment in microarray images. Our approach can handle image distortions and performs grid alignment using the vertical and horizontal luminance function profiles. These profiles are evolved using a hyperbolic shock filter PDE and then refined using the autocorrelation function. The results are compared with the ones delivered by state-of-the-art approaches for grid alignment in terms of accuracy and computational complexity. Using the same PDE formalism and curve fitting, automatic spot segmentation is achieved and visual results are presented. Considering microarray images with different spot layouts, reliable results in terms of accuracy and reduced computational complexity are achieved, compared with existing software platforms and state-of-the-art methods for microarray image processing.
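    The role of the autocorrelation refinement can be illustrated on a one-dimensional luminance profile: the first strong off-zero peak of the profile's autocorrelation estimates the grid period. The sketch below uses a synthetic profile and is only a simplified stand-in for the published PDE-based procedure.

```python
# Illustrative sketch of the profile-plus-autocorrelation idea: collapse the
# image to a luminance profile and read the dominant spot spacing from the
# first strong off-zero peak of its autocorrelation. Synthetic data only.
import numpy as np

rng = np.random.default_rng(1)
true_period = 12                                     # pixels between spot rows
profile = np.zeros(600)
profile[::true_period] = 1.0                         # idealised spot rows
profile = np.convolve(profile, np.ones(3) / 3, "same") + 0.05 * rng.normal(size=600)

p = profile - profile.mean()
acf = np.correlate(p, p, mode="full")[len(p) - 1:]   # one-sided autocorrelation
acf /= acf[0]

# First local maximum above a prominence threshold gives the grid period.
lags = np.arange(2, len(acf) - 1)
peaks = lags[(acf[lags] > 0.3) &
             (acf[lags] > acf[lags - 1]) & (acf[lags] > acf[lags + 1])]
print(f"estimated grid period: {peaks[0]} px (true {true_period})")
```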

  7. A robotics platform for automated batch fabrication of high density, microfluidics-based DNA microarrays, with applications to single cell, multiplex assays of secreted proteins

    NASA Astrophysics Data System (ADS)

    Ahmad, Habib; Sutherland, Alex; Shin, Young Shik; Hwang, Kiwook; Qin, Lidong; Krom, Russell-John; Heath, James R.

    2011-09-01

    Microfluidics flow-patterning has been utilized for the construction of chip-scale miniaturized DNA and protein barcode arrays. Such arrays have been used for specific clinical and fundamental investigations in which many proteins are assayed from single cells or other small sample sizes. However, flow-patterned arrays are hand-prepared, and so are impractical for broad applications. We describe an integrated robotics/microfluidics platform for the automated preparation of such arrays, and we apply it to the batch fabrication of up to eighteen chips of flow-patterned DNA barcodes. The resulting substrates are comparable in quality with hand-made arrays and exhibit excellent substrate-to-substrate consistency. We demonstrate the utility and reproducibility of robotics-patterned barcodes by utilizing two flow-patterned chips for highly parallel assays of a panel of secreted proteins from single macrophage cells.

  8. A robotics platform for automated batch fabrication of high density, microfluidics-based DNA microarrays, with applications to single cell, multiplex assays of secreted proteins

    PubMed Central

    Ahmad, Habib; Sutherland, Alex; Shin, Young Shik; Hwang, Kiwook; Qin, Lidong; Krom, Russell-John; Heath, James R.

    2011-01-01

    Microfluidics flow-patterning has been utilized for the construction of chip-scale miniaturized DNA and protein barcode arrays. Such arrays have been used for specific clinical and fundamental investigations in which many proteins are assayed from single cells or other small sample sizes. However, flow-patterned arrays are hand-prepared, and so are impractical for broad applications. We describe an integrated robotics/microfluidics platform for the automated preparation of such arrays, and we apply it to the batch fabrication of up to eighteen chips of flow-patterned DNA barcodes. The resulting substrates are comparable in quality with hand-made arrays and exhibit excellent substrate-to-substrate consistency. We demonstrate the utility and reproducibility of robotics-patterned barcodes by utilizing two flow-patterned chips for highly parallel assays of a panel of secreted proteins from single macrophage cells. PMID:21974603

  9. DNA Microarrays

    NASA Astrophysics Data System (ADS)

    Nguyen, C.; Gidrol, X.

    Genomics has revolutionised biological and biomedical research. This revolution was predictable on the basis of its two driving forces: the ever increasing availability of genome sequences and the development of new technology able to exploit them. Up until now, technical limitations meant that molecular biology could only analyse one or two parameters per experiment, providing relatively little information compared with the great complexity of the systems under investigation. This gene-by-gene approach is inadequate to understand biological systems containing several thousand genes. It is essential to have an overall view of the DNA, RNA, and relevant proteins. A simple inventory of the genome is not sufficient to understand the functions of the genes, or indeed the way that cells and organisms work. For this purpose, functional studies based on whole genomes are needed. Among these new large-scale methods of molecular analysis, DNA microarrays provide a way of studying the genome and the transcriptome. The idea of integrating a large amount of data derived from a support with a very small area has led biologists to call these devices chips, borrowing the term from the microelectronics industry. At the beginning of the 1990s, the development of DNA chips on nylon membranes [1, 2], then on glass [3] and silicon [4] supports, made it possible for the first time to carry out simultaneous measurements of the equilibrium concentration of all the messenger RNA (mRNA) or transcribed RNA in a cell. These microarrays offer a wide range of applications, in both fundamental and clinical research, providing a method for genome-wide characterisation of changes occurring within a cell or tissue, as for example in polymorphism studies, detection of mutations, and quantitative assays of gene copies. With regard to the transcriptome, they provide a way of characterising differentially expressed genes, profiling given biological states, and identifying regulatory channels.

  10. Regenerable immuno-biochip for screening ochratoxin A in green coffee extract using an automated microarray chip reader with chemiluminescence detection.

    PubMed

    Sauceda-Friebe, Jimena C; Karsunke, Xaver Y Z; Vazac, Susanna; Biselli, Scarlett; Niessner, Reinhard; Knopp, Dietmar

    2011-03-18

    Ochratoxin A (OTA) can contaminate foodstuffs in the ppb to ppm range and, once formed, it is difficult to remove. Because of its toxicity and potential risks to human health, the need exists for rapid, efficient detection methods that comply with legal maximum residue limits. In this work we have synthesized an OTA conjugate functionalized with a water-soluble peptide for covalent immobilization on a glass biochip by means of contact spotting. The chip was used for OTA determination in an indirect competitive immunoassay format with flow-through reagent addition and chemiluminescence detection, carried out with the stand-alone automated Munich Chip Reader 3 (MCR 3) platform. A buffer model and real green coffee extracts were used for this purpose. At present, covalent conjugate immobilization allows for at least 20 assay-regeneration cycles of the biochip surface. The total analysis time for a single sample, including measurement and surface regeneration, was 12 min, and the LOQ of OTA in green coffee extract was 0.3 μg L(-1), which corresponds to 7 μg kg(-1).

  11. SaDA: From Sampling to Data Analysis-An Extensible Open Source Infrastructure for Rapid, Robust and Automated Management and Analysis of Modern Ecological High-Throughput Microarray Data.

    PubMed

    Singh, Kumar Saurabh; Thual, Dominique; Spurio, Roberto; Cannata, Nicola

    2015-06-01

    One of the most crucial characteristics of day-to-day laboratory information management is the collection, storage and retrieval of information about research subjects and environmental or biomedical samples. An efficient link between sample data and experimental results is essential for the successful outcome of a collaborative project. Currently available software solutions are largely limited to large-scale, expensive commercial Laboratory Information Management Systems (LIMS). Acquiring such a LIMS can indeed bring laboratory information management to a higher level, but most of the time this requires a substantial investment of money, time and technical effort. There is a clear need for a lightweight open-source system which can easily be managed on local servers and handled by individual researchers. Here we present a software package named SaDA for storing, retrieving and analyzing data originating from microorganism monitoring experiments. SaDA is fully integrated in the management of environmental samples, oligonucleotide sequences, microarray data and the subsequent downstream analysis procedures. It is simple and generic software, and can be extended and customized for various environmental and biomedical studies.

  12. SaDA: From Sampling to Data Analysis—An Extensible Open Source Infrastructure for Rapid, Robust and Automated Management and Analysis of Modern Ecological High-Throughput Microarray Data

    PubMed Central

    Singh, Kumar Saurabh; Thual, Dominique; Spurio, Roberto; Cannata, Nicola

    2015-01-01

    One of the most crucial characteristics of day-to-day laboratory information management is the collection, storage and retrieval of information about research subjects and environmental or biomedical samples. An efficient link between sample data and experimental results is essential for the successful outcome of a collaborative project. Currently available software solutions are largely limited to large-scale, expensive commercial Laboratory Information Management Systems (LIMS). Acquiring such a LIMS can indeed bring laboratory information management to a higher level, but most of the time this requires a substantial investment of money, time and technical effort. There is a clear need for a lightweight open-source system which can easily be managed on local servers and handled by individual researchers. Here we present a software package named SaDA for storing, retrieving and analyzing data originating from microorganism monitoring experiments. SaDA is fully integrated in the management of environmental samples, oligonucleotide sequences, microarray data and the subsequent downstream analysis procedures. It is simple and generic software, and can be extended and customized for various environmental and biomedical studies. PMID:26047146

  13. Validation of a Fully Automated 3D Hippocampal Segmentation Method Using Subjects with Alzheimer's Disease, Mild Cognitive Impairment, and Elderly Controls

    PubMed Central

    Morra, Jonathan H.; Tu, Zhuowen; Apostolova, Liana G.; Green, Amity E.; Avedissian, Christina; Madsen, Sarah K.; Parikshak, Neelroop; Hua, Xue; Toga, Arthur W.; Jack, Clifford R.; Weiner, Michael W.; Thompson, Paul M.

    2008-01-01

    We introduce a new method for brain MRI segmentation, called the auto context model (ACM), to segment the hippocampus automatically in 3D T1-weighted structural brain MRI scans of subjects from the Alzheimer's Disease Neuroimaging Initiative (ADNI). In a training phase, our algorithm used 21 hand-labeled segmentations to learn a classification rule for hippocampal versus non-hippocampal regions using a modified AdaBoost method, based on ∼18,000 features (image intensity, position, image curvatures, image gradients, tissue classification maps of gray/white matter and CSF, and mean, standard deviation, and Haar filters of size 1×1×1 to 7×7×7). We linearly registered all brains to a standard template to devise a basic shape prior to capture the global shape of the hippocampus, defined as the pointwise summation of all the training masks. We also included curvature, gradient, mean, standard deviation, and Haar filters of the shape prior and the tissue classified images as features. During each iteration of ACM - our extension of AdaBoost - the Bayesian posterior distribution of the labeling was fed back in as an input, along with its neighborhood features, as new features for AdaBoost to use. In validation studies, we compared our results with hand-labeled segmentations by two experts. Using a leave-one-out approach and standard overlap and distance error metrics, our automated segmentations agreed well with human raters; any differences were comparable to differences between trained human raters. Our error metrics compare favorably with those previously reported for other automated hippocampal segmentations, suggesting the utility of the approach for large-scale studies. PMID:18675918
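    A very reduced sketch of the auto-context loop, using scikit-learn's stock AdaBoost on random placeholder features instead of the authors' modified AdaBoost and MRI-derived features, and appending only the pointwise posterior (not neighborhood posterior features) at each iteration.

```python
# Very reduced sketch of the auto-context idea: at each iteration the
# classifier is retrained with the previous iteration's posterior appended
# as an extra feature. scikit-learn's AdaBoost stands in for the authors'
# modified AdaBoost; the data are random placeholders, not MRI features.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))                 # stand-in voxel feature vectors
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # stand-in hippocampus labels

posterior = np.full((len(X), 1), 0.5)          # uninformative prior probability
for iteration in range(3):                     # auto-context iterations
    X_aug = np.hstack([X, posterior])          # append previous posterior
    clf = AdaBoostClassifier(n_estimators=50, random_state=0).fit(X_aug, y)
    posterior = clf.predict_proba(X_aug)[:, [1]]
    print(f"iteration {iteration}: training accuracy {clf.score(X_aug, y):.3f}")
```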

  14. PMD: A Resource for Archiving and Analyzing Protein Microarray data.

    PubMed

    Xu, Zhaowei; Huang, Likun; Zhang, Hainan; Li, Yang; Guo, Shujuan; Wang, Nan; Wang, Shi-Hua; Chen, Ziqing; Wang, Jingfang; Tao, Sheng-Ce

    2016-01-27

    Protein microarray is a powerful technology for both basic research and clinical study. However, because there is no database specifically tailored for protein microarrays, the majority of the valuable original protein microarray data is still not publicly accessible. To address this issue, we constructed the Protein Microarray Database (PMD), which is specifically designed for archiving and analyzing protein microarray data. In PMD, users can easily browse and search the entire database by experiment name, protein microarray type, and sample information. Additionally, PMD integrates several data analysis tools and provides an automated data analysis pipeline for users. With just one click, users can obtain a comprehensive analysis report for their protein microarray data. The report includes preliminary data analysis, such as data normalization, candidate identification, and an in-depth bioinformatics analysis of the candidates, which includes functional annotation, pathway analysis, and protein-protein interaction network analysis. PMD is now freely available at www.proteinmicroarray.cn.

  15. PMD: A Resource for Archiving and Analyzing Protein Microarray data

    PubMed Central

    Xu, Zhaowei; Huang, Likun; Zhang, Hainan; Li, Yang; Guo, Shujuan; Wang, Nan; Wang, Shi-hua; Chen, Ziqing; Wang, Jingfang; Tao, Sheng-ce

    2016-01-01

    Protein microarray is a powerful technology for both basic research and clinical study. However, because there is no database specifically tailored for protein microarrays, the majority of the valuable original protein microarray data is still not publicly accessible. To address this issue, we constructed the Protein Microarray Database (PMD), which is specifically designed for archiving and analyzing protein microarray data. In PMD, users can easily browse and search the entire database by experiment name, protein microarray type, and sample information. Additionally, PMD integrates several data analysis tools and provides an automated data analysis pipeline for users. With just one click, users can obtain a comprehensive analysis report for their protein microarray data. The report includes preliminary data analysis, such as data normalization, candidate identification, and an in-depth bioinformatics analysis of the candidates, which includes functional annotation, pathway analysis, and protein-protein interaction network analysis. PMD is now freely available at www.proteinmicroarray.cn. PMID:26813635

  16. Protein Microarrays: Novel Developments and Applications

    PubMed Central

    Berrade, Luis; Garcia, Angie E.

    2011-01-01

    Protein microarray technology possesses some of the greatest potential for providing direct information on protein function and potential drug targets. For example, functional protein microarrays are ideal tools suited for the mapping of biological pathways. They can be used to study most major types of interactions and enzymatic activities that take place in biochemical pathways and have been used for the analysis of simultaneous multiple biomolecular interactions involving protein-protein, protein-lipid, protein-DNA and protein-small molecule interactions. Because of this unique ability to analyze many kinds of molecular interactions en masse, the requirement of very small sample amount and the potential to be miniaturized and automated, protein microarrays are extremely well suited for protein profiling, drug discovery, drug target identification and clinical prognosis and diagnosis. The aim of this review is to summarize the most recent developments in the production, applications and analysis of protein microarrays. PMID:21116694

  17. The Genopolis Microarray Database

    PubMed Central

    Splendiani, Andrea; Brandizi, Marco; Even, Gael; Beretta, Ottavio; Pavelka, Norman; Pelizzola, Mattia; Mayhaus, Manuel; Foti, Maria; Mauri, Giancarlo; Ricciardi-Castagnoli, Paola

    2007-01-01

    Background Gene expression databases are key resources for microarray data management and analysis and the importance of a proper annotation of their content is well understood. Public repositories as well as microarray database systems that can be implemented by single laboratories exist. However, there is not yet a tool that can easily support a collaborative environment where different users with different rights of access to data can interact to define a common highly coherent content. The scope of the Genopolis database is to provide a resource that allows different groups performing microarray experiments related to a common subject to create a common coherent knowledge base and to analyse it. The Genopolis database has been implemented as a dedicated system for the scientific community studying dendritic and macrophage cells functions and host-parasite interactions. Results The Genopolis Database system allows the community to build an object based MIAME compliant annotation of their experiments and to store images, raw and processed data from the Affymetrix GeneChip® platform. It supports dynamical definition of controlled vocabularies and provides automated and supervised steps to control the coherence of data and annotations. It allows a precise control of the visibility of the database content to different sub groups in the community and facilitates exports of its content to public repositories. It provides an interactive users interface for data analysis: this allows users to visualize data matrices based on functional lists and sample characterization, and to navigate to other data matrices defined by similarity of expression values as well as functional characterizations of genes involved. A collaborative environment is also provided for the definition and sharing of functional annotation by users. Conclusion The Genopolis Database supports a community in building a common coherent knowledge base and analyse it. This fills a gap between a local

  18. Platotex: an innovative and fully automated device for cell growth scale-up of agar-supported solid-state fermentation.

    PubMed

    Adelin, Emilie; Slimani, Noureddine; Cortial, Sylvie; Schmitz-Alfonso, Isabelle; Ouazzani, Jamal

    2011-02-01

    Among various factors that influence the production of microbial secondary metabolites (MSM), the method of cultivation is an important one that has not been thoroughly investigated. In order to increase microbial throughput and simplify the extraction and workup steps, we performed a study to compare liquid-state fermentation (LSF) with agar-supported solid-state fermentation (AgSF). We found that AgSF is not only more suitable for our applications but offers, for some microbial strains, a higher yield and broader diversity of secondary metabolites. The main limitation of AgSF is the lack of a system to allow production scale-up. In order to overcome this obstacle we developed Platotex, an original fermentation unit offering 2 m(2) of cultivation surface that combines automatic sterilization, cultivation, and drying steps. Platotex is also able to support both LSF and solid-state fermentation (SSF). Platotex conforms to international security and quality requirements and benefits from total remote automation through industrial communication and control standards.

  19. A fully automated effervescence assisted dispersive liquid-liquid microextraction based on a stepwise injection system. Determination of antipyrine in saliva samples.

    PubMed

    Medinskaia, Kseniia; Vakh, Christina; Aseeva, Darina; Andruch, Vasil; Moskvin, Leonid; Bulatov, Andrey

    2016-01-01

    A first attempt to automate effervescence-assisted dispersive liquid-liquid microextraction (EA-DLLME) is reported. The method is based on the aspiration of a sample and all required aqueous reagents into the stepwise injection analysis (SWIA) manifold, followed by simultaneous counterflow injection of the extraction solvent (dichloromethane) and the mixture of the effervescence agent (0.5 mol L(-1) Na2CO3) and the proton donor solution (1 mol L(-1) CH3COOH). Formation of carbon dioxide microbubbles generated in situ leads to the dispersion of the extraction solvent in the whole aqueous sample and extraction of the analyte into the organic phase. Unlike conventional DLLME, EA-DLLME avoids the addition of a dispersive solvent as well as the time-consuming centrifugation step for disruption of the cloudy state. The phase separation was achieved by gentle bubbling of a nitrogen stream (2 mL min(-1) for 2 min). The performance of the suggested approach is demonstrated by the determination of antipyrine in saliva samples. The procedure is based on the derivatization of antipyrine by nitrite ion, followed by EA-DLLME of 4-nitrosoantipyrine and subsequent UV-Vis detection using the SWIA manifold. The absorbance of the yellow-colored extract at a wavelength of 345 nm obeys Beer's law in the range of 1.5-100 µmol L(-1) of antipyrine in saliva. The LOD, calculated from a blank test based on 3σ, was 0.5 µmol L(-1).

  20. Automation or De-automation

    NASA Astrophysics Data System (ADS)

    Gorlach, Igor; Wessel, Oliver

    2008-09-01

    In the global automotive industry, for decades, vehicle manufacturers have continually increased the level of automation of production systems in order to be competitive. However, there is a new trend to decrease the level of automation, especially in final car assembly, for reasons of economy and flexibility. In this research, the final car assembly lines at three production sites of Volkswagen are analysed in order to determine the best level of automation for each, in terms of manufacturing costs, productivity, quality and flexibility. The case study is based on the methodology proposed by the Fraunhofer Institute. The results of the analysis indicate that fully automated assembly systems are not necessarily the best option in terms of cost, productivity and quality combined, which is attributed to high complexity of final car assembly systems; some de-automation is therefore recommended. On the other hand, the analysis shows that low automation can result in poor product quality due to reasons related to plant location, such as inadequate workers' skills, motivation, etc. Hence, the automation strategy should be formulated on the basis of analysis of all relevant aspects of the manufacturing process, such as costs, quality, productivity and flexibility in relation to the local context. A more balanced combination of automated and manual assembly operations provides better utilisation of equipment, reduces production costs and improves throughput.

  1. Novel microarrays for simultaneous serodiagnosis of multiple antiviral antibodies.

    PubMed

    Sivakumar, Ponnurengam Malliappan; Moritsugu, Nozomi; Obuse, Sei; Isoshima, Takashi; Tashiro, Hideo; Ito, Yoshihiro

    2013-01-01

    We developed an automated diagnostic system for the detection of virus-specific immunoglobulin Gs (IgGs) that was based on a microarray platform, and compared its efficacy with conventional enzyme immunoassays (EIAs). Viruses were immobilized to microarrays using a radical cross-linking reaction induced by photo-irradiation. A new photoreactive polymer containing perfluorophenyl azide (PFPA) and poly(ethylene glycol) methacrylate was prepared and coated on plates. Inactivated measles, rubella, mumps and Varicella-Zoster viruses and recombinant Epstein-Barr virus antigen were added to the coated plates, which were then irradiated with ultraviolet light to facilitate immobilization. Virus-specific IgGs in healthy human sera were assayed using these prepared microarrays and the results obtained were compared with those from conventional EIAs. We observed high correlations (0.79-0.96) between the results of the automated microarray technique and the EIAs. The microarray-based assay was more rapid, required less reagent and sample, and was easier to conduct than conventional EIA techniques. The automated microarray system was further improved by introducing reagent storage reservoirs inside the chamber, thereby conserving the use of expensive reagents and antibodies. We consider the microarray format to be suitable for rapid and multiple serological diagnoses of viral diseases, and it could be developed further for clinical applications. PMID:24367491

  2. An Automatic and Power Spectra-based Rotate Correcting Algorithm for Microarray Image.

    PubMed

    Deng, Ning; Duan, Huilong

    2005-01-01

    Microarray image analysis, an important aspect of microarray technology, involves processing vast amounts of data. At present, the speed of microarray image analysis is limited by excessive manual intervention. The geometric structure of a microarray requires that, during analysis, the image be aligned with the vertical scanning orientation. If the microarray image is rotated or tilted, the analysis result may be incorrect. Although some automatic image analysis algorithms are used for microarrays, few methods have been reported that correct microarray image rotation. In this paper, an automatic rotation-correction algorithm is presented which addresses the deflection problem in microarray images. The method is based on image power spectra. Tested on hundreds of clinical samples, the algorithm is shown to achieve high precision. As a result, by adopting this algorithm, the overall procedure of microarray image analysis can be fully automated.
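    One way to picture the power-spectrum idea: a regular spot grid concentrates frequency-domain energy along two orthogonal directions, so the deviation of the dominant direction from the image axes estimates the tilt. The sketch below applies this to a synthetic tilted grid; the published algorithm is more elaborate, and the angular-histogram approach here is only an assumed stand-in.

```python
# Rough numpy sketch of the power-spectrum idea: a spotted grid produces
# frequency-domain energy concentrated along two orthogonal directions, so
# the deviation of the dominant direction from the axes estimates the tilt.
# The image is synthetic; this is not the published algorithm.
import numpy as np
from scipy.ndimage import rotate

# Synthetic microarray-like grid, tilted by a known angle.
yy, xx = np.mgrid[0:256, 0:256]
grid = ((yy % 16 < 3) & (xx % 16 < 3)).astype(float)
img = rotate(grid, angle=4.0, reshape=False, order=1)

# Windowed power spectrum (window suppresses border artefacts).
win = np.outer(np.hanning(256), np.hanning(256))
spec = np.abs(np.fft.fftshift(np.fft.fft2((img - img.mean()) * win))) ** 2

cy, cx = np.array(spec.shape) // 2
ys, xs = np.mgrid[0:spec.shape[0], 0:spec.shape[1]]
theta = np.degrees(np.arctan2(ys - cy, xs - cx)) % 180   # polar angle of each bin
radius = np.hypot(ys - cy, xs - cx)

# Angular energy profile in 0.5-degree bins, ignoring the DC neighbourhood.
mask = radius > 5
bins = np.arange(0, 180.5, 0.5)
energy, _ = np.histogram(theta[mask], bins=bins, weights=spec[mask])
dominant = bins[np.argmax(energy)]
tilt = min(dominant % 90, 90 - dominant % 90)            # deviation from an axis
print(f"dominant orientation {dominant:.1f} deg, estimated tilt ~{tilt:.1f} deg (true 4.0)")
```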

  3. Toward fully automated high performance computing drug discovery: a massively parallel virtual screening pipeline for docking and molecular mechanics/generalized Born surface area rescoring to improve enrichment.

    PubMed

    Zhang, Xiaohua; Wong, Sergio E; Lightstone, Felice C

    2014-01-27

    In this work we announce and evaluate a high throughput virtual screening pipeline for in-silico screening of virtual compound databases using high performance computing (HPC). Notable features of this pipeline are an automated receptor preparation scheme with unsupervised binding site identification. The pipeline includes receptor/target preparation, ligand preparation, VinaLC docking calculation, and molecular mechanics/generalized Born surface area (MM/GBSA) rescoring using the GB model by Onufriev and co-workers [J. Chem. Theory Comput. 2007, 3, 156-169]. Furthermore, we leverage HPC resources to perform an unprecedented, comprehensive evaluation of MM/GBSA rescoring when applied to the DUD-E data set (Directory of Useful Decoys: Enhanced), in which we selected 38 protein targets and a total of ∼0.7 million actives and decoys. The computer wall time for virtual screening has been reduced drastically on HPC machines, which increases the feasibility of extremely large ligand database screening with more accurate methods. HPC resources allowed us to rescore 20 poses per compound and evaluate the optimal number of poses to rescore. We find that keeping 5-10 poses is a good compromise between accuracy and computational expense. Overall, the results demonstrate that MM/GBSA rescoring has higher average receiver operating characteristic (ROC) area under curve (AUC) values and consistently better early recovery of actives than Vina docking alone. Specifically, the enrichment performance is target-dependent. MM/GBSA rescoring significantly outperforms Vina docking for the folate enzymes, kinases, and several other enzymes. The more accurate energy function and solvation terms of the MM/GBSA method allow MM/GBSA to achieve better enrichment, but the rescoring is still limited by the docking method to generate the poses with the correct binding modes. PMID:24358939

  5. Preliminary evaluation of a fully automated quantitative framework for characterizing general breast tissue histology via color histogram and color texture analysis

    NASA Astrophysics Data System (ADS)

    Keller, Brad M.; Gastounioti, Aimilia; Batiste, Rebecca C.; Kontos, Despina; Feldman, Michael D.

    2016-03-01

    Visual characterization of histologic specimens is known to suffer from intra- and inter-observer variability. To help address this, we developed an automated framework for characterizing digitized histology specimens based on a novel application of color histogram and color texture analysis. We perform a preliminary evaluation of this framework using a set of 73 trichrome-stained, digitized slides of normal breast tissue which were visually assessed by an expert pathologist in terms of the percentage of collagenous stroma, stromal collagen density, duct-lobular unit density and the presence of elastosis. For each slide, our algorithm automatically segments the tissue region based on the lightness channel in CIELAB colorspace. Within each tissue region, a color histogram feature vector is extracted using a common color palette for trichrome images generated with a previously described method. Then, using a whole-slide, lattice-based methodology, color texture maps are generated using a set of color co-occurrence matrix statistics: contrast, correlation, energy and homogeneity. The extracted feature sets are compared to the visually assessed tissue characteristics. Overall, the extracted texture features correlate highly with both the percentage of collagenous stroma (r=0.95, p<0.001) and duct-lobular unit density (r=0.71, p<0.001) seen in the tissue samples, and several individual features were associated with collagen density and/or the presence of elastosis (p<=0.05). This suggests that the proposed framework has promise as a means to quantitatively extract descriptors reflecting tissue-level characteristics and thus could be useful in detecting and characterizing histological processes in digitized histology specimens.
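
    The texture statistics named above (contrast, correlation, energy, homogeneity) are co-occurrence matrix properties. The sketch below computes them for a single lattice tile using scikit-image; for simplicity it quantizes only the CIELAB lightness channel, whereas the framework described uses color co-occurrence matrices over a trichrome color palette, so this illustrates the statistics rather than the authors' method (graycomatrix/graycoprops are spelled greycomatrix/greycoprops in older scikit-image releases).

      import numpy as np
      from skimage import color
      from skimage.feature import graycomatrix, graycoprops

      def tile_texture_features(rgb_tile, levels=32):
          # Quantize the CIELAB lightness channel (L* in [0, 100]) to a small
          # number of gray levels, then compute co-occurrence statistics.
          lightness = color.rgb2lab(rgb_tile)[:, :, 0]
          q = np.clip((lightness / 100.0 * (levels - 1)).astype(np.uint8), 0, levels - 1)
          glcm = graycomatrix(q, distances=[1], angles=[0, np.pi / 2],
                              levels=levels, symmetric=True, normed=True)
          return {prop: float(graycoprops(glcm, prop).mean())
                  for prop in ("contrast", "correlation", "energy", "homogeneity")}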

  6. A fully automated method for simultaneous determination of aflatoxins and ochratoxin A in dried fruits by pressurized liquid extraction and online solid-phase extraction cleanup coupled to ultra-high-pressure liquid chromatography-tandem mass spectrometry.

    PubMed

    Campone, Luca; Piccinelli, Anna Lisa; Celano, Rita; Russo, Mariateresa; Valdés, Alberto; Ibáñez, Clara; Rastrelli, Luca

    2015-04-01

    According to current demands and future perspectives in food safety, this study reports a fast and fully automated analytical method for the simultaneous analysis of highly toxic and widespread mycotoxins, aflatoxins (AFs) and ochratoxin A (OTA), in dried fruits, a high-risk foodstuff. The method is based on pressurized liquid extraction (PLE), with aqueous methanol (30%) at 110 °C, of the slurried dried fruit and online solid-phase extraction (online SPE) cleanup of the PLE extracts with a C18 cartridge. The purified sample was directly analysed by ultra-high-pressure liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) for sensitive and selective determination of AFs and OTA. The proposed analytical procedure was validated for different dried fruits (vine fruit, fig and apricot), providing method detection and quantification limits much lower than the AFs and OTA maximum levels imposed by EU regulation in dried fruit for direct human consumption. Also, recoveries (83-103%) and repeatability (RSD < 8%, n = 3) meet the performance criteria required by EU regulation for the determination of the levels of mycotoxins in foodstuffs. The main advantage of the proposed method is full automation of the whole analytical procedure, which reduces the time and cost of the analysis, sample manipulation and solvent consumption, enabling high-throughput analysis and highly accurate and precise results.

  8. Microarrays, Integrated Analytical Systems

    NASA Astrophysics Data System (ADS)

    Combinatorial chemistry is used to find materials that form sensor microarrays. This book discusses the fundamentals, and then proceeds to the many applications of microarrays, from measuring gene expression (DNA microarrays) to protein-protein interactions, peptide chemistry, carbohydrate chemistry, electrochemical detection, and microfluidics.

  9. Manufacturing of microarrays.

    PubMed

    Petersen, David W; Kawasaki, Ernest S

    2007-01-01

    DNA microarray technology has become a powerful tool in the arsenal of the molecular biologist. Capitalizing on high precision robotics and the wealth of DNA sequences annotated from the genomes of a large number of organisms, the manufacture of microarrays is now possible for the average academic laboratory with the funds and motivation. Microarray production requires attention to both biological and physical resources, including DNA libraries, robotics, and qualified personnel. While the fabrication of microarrays is a very labor-intensive process, production of quality microarrays individually tailored on a project-by-project basis will help researchers shed light on future scientific questions.

  10. Fully automated analysis of four tobacco-specific N-nitrosamines in mainstream cigarette smoke using two-dimensional online solid phase extraction combined with liquid chromatography-tandem mass spectrometry.

    PubMed

    Zhang, Jie; Bai, Ruoshi; Yi, Xiaoli; Yang, Zhendong; Liu, Xingyu; Zhou, Jun; Liang, Wei

    2016-01-01

    A fully automated method for the detection of four tobacco-specific nitrosamines (TSNAs) in mainstream cigarette smoke (MSS) has been developed. The newly developed method is based on two-dimensional online solid-phase extraction-liquid chromatography-tandem mass spectrometry (SPE/LC-MS/MS). The two-dimensional SPE utilizes two cartridges with different extraction mechanisms to clean up interferences of different polarity and minimize sample matrix effects on each analyte. Chromatographic separation was achieved using a UPLC C18 reversed-phase analytical column. Under the optimum online SPE/LC-MS/MS conditions, N'-nitrosonornicotine (NNN), N'-nitrosoanatabine (NAT), N'-nitrosoanabasine (NAB), and 4-(methylnitrosamino)-1-(3-pyridyl)-1-butanone (NNK) were baseline separated with good peak shapes. This appears to be the most sensitive method yet reported for determination of TSNAs in mainstream cigarette smoke. The limits of quantification for NNN, NNK, NAT and NAB reached the levels of 6.0, 1.0, 3.0 and 0.6 pg/cig, respectively, which were well below the lowest levels of TSNAs in MSS of current commercial cigarettes. The accuracy of the measurement of the four TSNAs was from 92.8 to 107.3%. The relative standard deviations of intra- and inter-day analysis were less than 5.4% and 7.5%, respectively. The main advantages of the developed method are fairly high sensitivity, selectivity and accuracy of results, minimal sample pre-treatment, full automation, and high throughput. As part of the validation procedure, the developed method was applied to evaluate TSNA yields for 27 top-selling commercial cigarettes in China.

  11. Fully automated determination of selective retinoic acid receptor ligands in mouse plasma and tissue by reversed-phase liquid chromatography coupled on-line with solid-phase extraction.

    PubMed

    Arafa, H M; Hamada, F M; Elmazar, M M; Nau, H

    1996-04-01

    A fully automated reversed-phase HPLC method was developed for the quantitative assay of three retinoids (Am-580, CD-2019 and CD-437) which selectively activate the retinoic acid receptors RAR alpha, RAR beta and RAR gamma, respectively. Mouse plasma, embryo and maternal tissues were prepared for injection by on-line solid-phase extraction (SPE) and valve-switching techniques. Following automatic injection, the sample was loaded on preconditioned disposable cartridges, cleaned-up and then transferred onto the analytical column to be eluted in the backflush mode, separated by gradient elution and detected by UV, while a new cartridge was concomitantly conditioned. The overall recovery was quantitative allowing for external standardization. The calibration curves were linear in all biological samples tested so far, with a correlation coefficient (r) >0.99. The intra-day precision was < or = 7.8% (n = 5-6) and the inter-day variability was < or = 9.4% (n = 3). The lower limit of detection was 2.5 ng/ml or ng/g for CD-2019 and CD-437, and 5 ng/ml for Am-580 with a S/N ratio of 5 using a sample weight of 25 microliters or mg. The method is now in routine use in our laboratory for the assessment of the pharmacokinetic profiles of these retinoids. The small sample size required, the simple sample preparation and the rapid analysis with high degree of automation make this method convenient for microanalysis of biological samples both in animal and human studies.

  12. Microarrays in hematology.

    PubMed

    Walker, Josef; Flower, Darren; Rigley, Kevin

    2002-01-01

    Microarrays are fast becoming routine tools for the high-throughput analysis of gene expression in a wide range of biologic systems, including hematology. Although a number of approaches can be taken when implementing microarray-based studies, all are capable of providing important insights into biologic function. Although some technical issues have not been resolved, microarrays will continue to make a significant impact on hematologically important research. PMID:11753074

  13. Fully automated multidimensional reversed-phase liquid chromatography with tandem anion/cation exchange columns for simultaneous global endogenous tyrosine nitration detection, integral membrane protein characterization, and quantitative proteomics mapping in cerebral infarcts.

    PubMed

    Quan, Quan; Szeto, Samuel S W; Law, Henry C H; Zhang, Zaijun; Wang, Yuqiang; Chu, Ivan K

    2015-10-01

    Protein tyrosine nitration (PTN) is a signature hallmark of radical-induced nitrative stress in a wide range of pathophysiological conditions, with naturally occurring abundances at substoichiometric levels. In the present study, a fully automated four-dimensional platform, consisting of high-/low-pH reversed-phase dimensions with two additional complementary chromatographic separation stages, strong anion exchange (SAX) and strong cation exchange (SCX), inserted in tandem, was implemented for the simultaneous mapping of endogenous nitrated tyrosine-containing peptides within the global proteomic context of a Macaca fascicularis cerebral ischemic stroke model. This integrated RP-SA(C)X-RP platform was initially benchmarked through proteomic analyses of Saccharomyces cerevisiae, revealing extended proteome and protein coverage. A total of 27 144 unique peptides from 3684 nonredundant proteins [1% global false discovery rate (FDR)] were identified from M. fascicularis cerebral cortex tissue. The inclusion of the S(A/C)X columns contributed to the increased detection of acidic, hydrophilic, and hydrophobic peptide populations; these separation features enabled the concomitant identification of 127 endogenous nitrated peptides and 137 transmembrane domain-containing peptides corresponding to integral membrane proteins, without the need for specific targeted enrichment strategies. The enhanced diversity of the peptide inventory obtained from the RP-SA(C)X-RP platform also improved analytical confidence in isobaric tags for relative and absolute quantitation (iTRAQ)-based proteomic analyses. PMID:26335518

  14. Validation of high-throughput measurement system with microwave-assisted extraction, fully automated sample preparation device, and gas chromatography-electron capture detector for determination of polychlorinated biphenyls in whale blubber.

    PubMed

    Fujita, Hiroyuki; Honda, Katsuhisa; Hamada, Noriaki; Yasunaga, Genta; Fujise, Yoshihiro

    2009-02-01

    Validation of a high-throughput measurement system with microwave-assisted extraction (MAE), a fully automated sample preparation device (SPD), and gas chromatography-electron capture detector (GC-ECD) for the determination of polychlorinated biphenyls (PCBs) in minke whale blubber was performed. PCB congeners accounting for >95% of the total PCB burden in blubber were efficiently extracted with a small volume (20 mL) of n-hexane using MAE, owing to simultaneous saponification and extraction. Further, the crude extract obtained by MAE was rapidly purified and automatically solvent-exchanged into a small volume (1 mL) of toluene using the SPD, without using concentrators. Furthermore, the concentration of PCBs in the purified and concentrated solution was accurately determined by GC-ECD. Moreover, the result of an accuracy test using a certified material (SRM 1588b; Cod liver oil) showed good agreement with the NIST certified concentration values. In addition, the method quantification limit for total PCBs in whale blubber was 41 ng g(-1). This new measurement system for PCBs takes only four hours. Consequently, these results indicate that the method is well suited for the monitoring and screening of PCBs in the conservation of the marine ecosystem and the safe distribution of foods.

  15. Fully Automated Anesthesia, Analgesia and Fluid Management

    ClinicalTrials.gov

    2016-10-29

    General Anesthetic Drug Overdose; Adverse Effect of Intravenous Anesthetics, Sequela; Complication of Anesthesia; Drug Delivery System Malfunction; Hemodynamic Instability; Underdosing of Other General Anesthetics

  16. Evaluation of Surface Chemistries for Antibody Microarrays

    SciTech Connect

    Seurynck-Servoss, Shannon L.; White, Amanda M.; Baird, Cheryl L.; Rodland, Karin D.; Zangar, Richard C.

    2007-12-01

    Antibody microarrays are an emerging technology that promises to be a powerful tool for the detection of disease biomarkers. The current technology for protein microarrays has been primarily derived from DNA microarrays and is not fully characterized for use with proteins. For example, there are a myriad of surface chemistries that are commercially available for antibody microarrays, but no rigorous studies that compare these different surfaces. Therefore, we have used an enzyme-linked immunosorbent assay (ELISA) microarray platform to analyze 16 different commercially available slide types. Full standard curves were generated for 24 different assays. We found that this approach provides a rigorous and quantitative system for comparing the different slide types based on spot size and morphology, slide noise, spot background, lower limit of detection, and reproducibility. These studies demonstrate that the properties of the slide surface affect the activity of immobilized antibodies and the quality of data produced. Although many slide types can produce useful data, glass slides coated with poly-L-lysine or aminosilane, with or without activation with a crosslinker, consistently produce superior results in the ELISA microarray analyses we performed.

  17. Automation in Clinical Microbiology

    PubMed Central

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  18. Microarray Analysis in Glioblastomas

    PubMed Central

    Bhawe, Kaumudi M.; Aghi, Manish K.

    2016-01-01

    Microarray analysis in glioblastomas is done using either cell lines or patient samples as starting material. A survey of the current literature points to transcript-based microarrays and immunohistochemistry (IHC)-based tissue microarrays as being the preferred methods of choice in cancers of neurological origin. Microarray analysis may be carried out for various purposes including the following: (i) to correlate gene expression signatures of glioblastoma cell lines or tumors with response to chemotherapy (DeLay et al., Clin Cancer Res 18(10):2930–2942, 2012); (ii) to correlate gene expression patterns with biological features like proliferation or invasiveness of the glioblastoma cells (Jiang et al., PLoS One 8(6):e66008, 2013); and (iii) to discover new tumor classificatory systems based on gene expression signature, and to correlate therapeutic response and prognosis with these signatures (Huse et al., Annu Rev Med 64(1):59–70, 2013; Verhaak et al., Cancer Cell 17(1):98–110, 2010). While investigators can sometimes use archived tumor gene expression data available from repositories such as the NCBI Gene Expression Omnibus to answer their questions, new arrays must often be run to adequately answer specific questions. Here, we provide a detailed description of microarray methodologies, how to select the appropriate methodology for a given question, and analytical strategies that can be used. Experimental methodology for protein microarrays is outside the scope of this chapter, but basic sample preparation techniques for transcript-based microarrays are included here. PMID:26113463

  19. Microarray Analysis in Glioblastomas.

    PubMed

    Bhawe, Kaumudi M; Aghi, Manish K

    2016-01-01

    Microarray analysis in glioblastomas is done using either cell lines or patient samples as starting material. A survey of the current literature points to transcript-based microarrays and immunohistochemistry (IHC)-based tissue microarrays as being the preferred methods of choice in cancers of neurological origin. Microarray analysis may be carried out for various purposes including the following: i. To correlate gene expression signatures of glioblastoma cell lines or tumors with response to chemotherapy (DeLay et al., Clin Cancer Res 18(10):2930-2942, 2012). ii. To correlate gene expression patterns with biological features like proliferation or invasiveness of the glioblastoma cells (Jiang et al., PLoS One 8(6):e66008, 2013). iii. To discover new tumor classificatory systems based on gene expression signature, and to correlate therapeutic response and prognosis with these signatures (Huse et al., Annu Rev Med 64(1):59-70, 2013; Verhaak et al., Cancer Cell 17(1):98-110, 2010). While investigators can sometimes use archived tumor gene expression data available from repositories such as the NCBI Gene Expression Omnibus to answer their questions, new arrays must often be run to adequately answer specific questions. Here, we provide a detailed description of microarray methodologies, how to select the appropriate methodology for a given question, and analytical strategies that can be used. Experimental methodology for protein microarrays is outside the scope of this chapter, but basic sample preparation techniques for transcript-based microarrays are included here. PMID:26113463
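
    Purpose (i) above, correlating expression signatures with response, is typically approached by per-gene group comparisons with multiple-testing correction. The sketch below shows a generic version using a two-sample t-test and Benjamini-Hochberg adjustment; the expression matrix, column groupings and threshold are hypothetical, and this is not a protocol taken from the chapter.

      import numpy as np
      from scipy import stats

      def responder_signature(expr, responders, nonresponders, alpha=0.05):
          # expr: (n_genes, n_samples) matrix; responders/nonresponders: column indices.
          t, p = stats.ttest_ind(expr[:, responders], expr[:, nonresponders], axis=1)
          # Benjamini-Hochberg adjusted p-values.
          order = np.argsort(p)
          scaled = p[order] * len(p) / (np.arange(len(p)) + 1)
          adjusted = np.empty_like(p)
          adjusted[order] = np.minimum.accumulate(scaled[::-1])[::-1].clip(max=1.0)
          return t, adjusted, adjusted < alpha   # t-statistics, FDR, significance mask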

  20. Fully automated online solid phase extraction coupled directly to liquid chromatography-tandem mass spectrometry. Quantification of sulfonamide antibiotics, neutral and acidic pesticides at low concentrations in surface waters.

    PubMed

    Stoob, Krispin; Singer, Heinz P; Goetz, Christian W; Ruff, Matthias; Mueller, Stephan R

    2005-12-01

    A fully automated online solid phase extraction-liquid chromatography-tandem mass spectrometry (SPE-LC-MS/MS) instrumental setup has been developed for the quantification of sulfonamide antibiotics and pesticides in natural water. The direct coupling of an online solid phase extraction cartridge (Oasis HLB) to LC-MS/MS was accomplished using column switching techniques. High sensitivity in the low ng/L range was achieved by large volume injections of 18 mL with a combination of a tri-directional auto-sampler and a dispenser system. This setup allowed high sample throughput with minimal investment costs. Special emphasis was placed on low cross-contamination. The chosen approach is suitable for research as well as for monitoring applications. The flexible instrumental setup was successfully optimised for different important groups of bioactive chemicals, resulting in three trace analytical methods for quantification of (i) sulfonamide antibiotics and their acetyl metabolites; (ii) neutral pesticides (triazines, phenylureas, amides, chloracetanilides) and (iii) acidic pesticides (phenoxyacetic acids and triketones). Absolute extraction recoveries from 85 to 112% were obtained for the different analytes. More than 500 samples could be analyzed with one extraction cartridge. The inter-day precision of the method was excellent, indicated by relative standard deviations between 1 and 6%. High accuracy was achieved by the developed methods, resulting in maximum deviations of 8-15% relative to the spiked amount for the different analytes. Detection limits for various environmental samples were between 0.5 and 5 ng/L. Matrix-induced ion suppression was in general smaller than 25%. The performance of the online methods was demonstrated with measurements of the concentration dynamics of sulfonamide antibiotics and pesticides in a small creek during rainfall events.
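
    Two of the figures of merit quoted above, absolute extraction recovery and matrix-induced ion suppression, are simple ratios of peak responses. The sketch below shows one common way to compute them; the exact reference samples used (neat standard, post-extraction spike) vary between laboratories, so these definitions are illustrative rather than those of this specific study.

      def extraction_recovery_pct(area_spiked_before_extraction, area_neat_standard):
          # Response after the full online-SPE procedure relative to a directly
          # injected standard at the same nominal concentration.
          return 100.0 * area_spiked_before_extraction / area_neat_standard

      def ion_suppression_pct(area_in_matrix_extract, area_in_pure_solvent):
          # Signal lost to matrix effects; 0% means no suppression.
          return 100.0 * (1.0 - area_in_matrix_extract / area_in_pure_solvent)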

  1. Microarrays in Glycoproteomics Research

    PubMed Central

    Yue, Tingting; Haab, Brian B.

    2009-01-01

    Microarrays have been extremely useful for investigating binding interactions among diverse types of molecular species, with the main advantage being the ability to examine many interactions using small amounts of samples and reagents. Microarrays are increasingly being used to advance research in the field of glycobiology, which is the study of the nature and function of carbohydrates in health and disease. Several types of microarrays are being used in the study of glycans and proteins in glycobiology, including glycan arrays to study the recognition of carbohydrates, lectin arrays to determine carbohydrate expression on purified proteins or on cells, and antibody arrays to examine the variation in particular glycan structures on specific proteins. This review covers the technology and applications of these types of microarrays, as well as their use for obtaining complementary information on various aspects of glycobiology. PMID:19389548

  2. Functional Protein Microarray Technology

    PubMed Central

    Hu, Shaohui; Xie, Zhi; Qian, Jiang; Blackshaw, Seth; Zhu, Heng

    2010-01-01

    Functional protein microarrays are emerging as a promising new tool for large-scale and high-throughput studies. In this article, we review their applications in basic proteomics research, where various types of assays have been developed to probe binding activities to other biomolecules, such as proteins, DNA, RNA, small molecules, and glycans. We also report recent progress in using functional protein microarrays to profile protein posttranslational modifications, including phosphorylation, ubiquitylation, acetylation, and nitrosylation. Finally, we discuss the potential of functional protein microarrays in biomarker identification and clinical diagnostics. We strongly believe that functional protein microarrays will soon become an indispensable and invaluable tool in proteomics research and systems biology. PMID:20872749

  3. Hybridization and Selective Release of DNA Microarrays

    SciTech Connect

    Beer, N R; Baker, B; Piggott, T; Maberry, S; Hara, C M; DeOtte, J; Benett, W; Mukerjee, E; Dzenitis, J; Wheeler, E K

    2011-11-29

    DNA microarrays contain sequence-specific probes arrayed in distinct spots numbering from 10,000 to over 1,000,000, depending on the platform. This tremendous degree of multiplexing gives microarrays great potential for environmental background sampling, broad-spectrum clinical monitoring, and continuous biological threat detection. In practice, their use in these applications is not common due to limited information content, long processing times, and high cost. The work focused on characterizing the phenomena of microarray hybridization and selective release that will allow these limitations to be addressed. This will revolutionize the ways that microarrays can be used for LLNL's Global Security missions. The goals of this project were two-fold: faster, automated hybridizations and selective release of hybridized features. The first study area involves hybridization kinetics and mass-transfer effects. The standard hybridization protocol uses an overnight incubation to achieve the best possible signal for any sample type, as well as for convenience in manual processing. There is potential to significantly shorten this time based on better understanding and control of the rate-limiting processes and knowledge of the progress of the hybridization. In the hybridization work, a custom microarray flow cell was used to manipulate the chemical and thermal environment of the array and autonomously image the changes over time during hybridization. The second study area is selective release. Microarrays easily generate hybridization patterns and signatures, but there is still an unmet need for methodologies enabling rapid and selective analysis of these patterns and signatures. Detailed analysis of individual spots by subsequent sequencing could potentially yield significant information for rapidly mutating and emerging (or deliberately engineered) pathogens. In the selective release work, optical energy deposition with coherent light quickly provides the thermal energy to
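
    The rate-limiting behaviour mentioned above can be illustrated with the simplest two-state (Langmuir) hybridization model, in which probe occupancy approaches equilibrium exponentially. The sketch below is generic: the rate constants are placeholder values for illustration, and mass-transfer limitation, which the abstract identifies as important, is deliberately ignored.

      import numpy as np

      def fraction_hybridized(t_hours, target_conc, k_on=1e5, k_off=1e-4):
          # Pseudo-first-order approach to equilibrium:
          #   f(t) = f_eq * (1 - exp(-(k_on*C + k_off) * t)),  f_eq = C / (C + Kd)
          # C in mol/L, k_on in 1/(M*s), k_off in 1/s, t converted from hours to s.
          t = np.asarray(t_hours, float) * 3600.0
          k_d = k_off / k_on
          f_eq = target_conc / (target_conc + k_d)
          return f_eq * (1.0 - np.exp(-(k_on * target_conc + k_off) * t))

      # E.g. occupancy at 1 nM target after 1 h vs. an overnight (16 h) incubation:
      print(fraction_hybridized([1, 16], 1e-9))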

  4. DNA Microarray Technology

    SciTech Connect

    WERNER-WASHBURNE, MARGARET; DAVIDSON, GEORGE S.

    2002-01-01

    Collaboration between Sandia National Laboratories and the University of New Mexico Biology Department resulted in the capability to train students in microarray techniques and the interpretation of data from microarray experiments. These studies provide for a better understanding of the role of stationary phase and the gene regulation involved in exit from stationary phase, which may eventually have important clinical implications. Importantly, this research trained numerous students and is the basis for three new Ph.D. projects.

  5. DNA microarrays in neuropsychopharmacology.

    PubMed

    Marcotte, E R; Srivastava, L K; Quirion, R

    2001-08-01

    Recent advances in experimental genomics, coupled with the wealth of sequence information available for a variety of organisms, have the potential to transform the way pharmacological research is performed. At present, high-density DNA microarrays allow researchers to quickly and accurately quantify gene-expression changes in a massively parallel manner. Although now well established in other biomedical fields, such as cancer and genetics research, DNA microarrays have only recently begun to make significant inroads into pharmacology. To date, the major focus in this field has been on the general application of DNA microarrays to toxicology and drug discovery and design. This review summarizes the major microarray findings of relevance to neuropsychopharmacology, as a prelude to the design and analysis of future basic and clinical microarray experiments. The ability of DNA microarrays to monitor gene expression simultaneously in a large-scale format is helping to usher in a post-genomic age, where simple constructs about the role of nature versus nurture are being replaced by a functional understanding of gene expression in living organisms. PMID:11479006

  6. Evaluation of the Fully Automated BACTEC MGIT 960 System for Testing Susceptibility of Mycobacterium tuberculosis to Pyrazinamide, Streptomycin, Isoniazid, Rifampin, and Ethambutol and Comparison with the Radiometric BACTEC 460TB Method

    PubMed Central

    Scarparo, Claudio; Ricordi, Paolo; Ruggiero, Giuliana; Piccoli, Paola

    2004-01-01

    The performance of the fully automated BACTEC MGIT 960 (M960) system for the testing of Mycobacterium tuberculosis susceptibility to streptomycin (SM), isoniazid (INH), rifampin (RMP), ethambutol (EMB), and pyrazinamide (PZA) was evaluated with 100 clinical isolates and compared to that of the radiometric BACTEC 460TB (B460) system. The agar proportion method and the B460 system were used as reference methods to resolve the discordant results for SM, INH, RMP, and EMB (a combination known as SIRE) and PZA, respectively. The overall agreements were 96.3% for SIRE and 92% for PZA. For SIRE, a total of 26 discrepancies were found and were resolved in favor of the M960 system in 8 cases and in favor of the B460 system in 18 cases. The M960 system produced 8 very major errors (VME) and 10 major errors (ME), while the B460 system showed 4 VME and 4 ME. No statistically significant differences were found. Both systems exhibited excellent performance, but a higher number of VME was observed with the M960 system at the critical concentrations of EMB and SM. For PZA, a total of eight discrepancies were observed and were resolved in favor of the M960 system in one case and in favor of the B460 system in seven cases; no statistically significant differences were found. The M960 system showed four VME and three ME. The mean times to report overall PZA results and resistant results were 8.2 and 9.8 days, respectively, for the M960 system and 7.4 and 8.1 days, respectively, for the B460 system. Statistically significant differences were found. The mean times to report SIRE results were 8.3 days for the M960 system and 8.2 days for the B460 system. No statistically significant differences were found. Twelve strains tested for SIRE susceptibility and seven strains tested for PZA susceptibility had been reprocessed because of contamination. In conclusion, the M960 system can represent a valid alternative to the B460 for M. tuberculosis susceptibility testing; however, the frequent

  7. Nanotechnologies in protein microarrays.

    PubMed

    Krizkova, Sona; Heger, Zbynek; Zalewska, Marta; Moulick, Amitava; Adam, Vojtech; Kizek, Rene

    2015-01-01

    Protein microarray technology has become an important research tool for the study and detection of proteins, protein-protein interactions and a number of other applications. The utilization of nanoparticle-based materials and nanotechnology-based immobilization techniques not only extends the surface available for biomolecule immobilization, resulting in enhanced substrate binding properties and decreased background signals, but also enables enhanced reporter systems for more sensitive assays. In contemporary microarray systems, multiple nanotechnology-based techniques are generally combined. In this review, applications of nanoparticles and nanotechnologies in creating protein microarrays and in protein immobilization and detection are summarized. We anticipate that advanced nanotechnologies can be exploited to expand the promising fields of protein identification, monitoring of protein-protein or drug-protein interactions, and protein structures. PMID:26039143

  8. Identifying Fishes through DNA Barcodes and Microarrays

    PubMed Central

    Kochzius, Marc; Seidel, Christian; Antoniou, Aglaia; Botla, Sandeep Kumar; Campo, Daniel; Cariani, Alessia; Vazquez, Eva Garcia; Hauschild, Janet; Hervet, Caroline; Hjörleifsdottir, Sigridur; Hreggvidsson, Gudmundur; Kappel, Kristina; Landi, Monica; Magoulas, Antonios; Marteinsson, Viggo; Nölte, Manfred; Planes, Serge; Tinti, Fausto; Turan, Cemal; Venugopal, Moleyur N.; Weber, Hannes; Blohm, Dietmar

    2010-01-01

    Background: International fish trade reached an import value of 62.8 billion Euro in 2006, of which 44.6% is covered by the European Union. Species identification is a key problem throughout the life cycle of fishes: from eggs and larvae to adults in fisheries research and control, as well as processed fish products in consumer protection. Methodology/Principal Findings: This study aims to evaluate the applicability of the three mitochondrial genes 16S rRNA (16S), cytochrome b (cyt b), and cytochrome oxidase subunit I (COI) for the identification of 50 European marine fish species by combining techniques of “DNA barcoding” and microarrays. In a DNA barcoding approach, Neighbour-Joining (NJ) phylogenetic trees of 369 16S, 212 cyt b, and 447 COI sequences indicated that cyt b and COI are suitable for unambiguous identification, whereas 16S failed to discriminate closely related flatfish and gurnard species. In the course of probe design for DNA microarray development, each of the markers yielded a high number of potentially species-specific probes in silico, although many of them were rejected based on microarray hybridisation experiments. None of the markers provided probes to discriminate the sibling flatfish and gurnard species. However, since 16S probes were less negatively influenced by the “position of label” effect and showed the lowest rejection rate and the highest mean signal intensity, 16S is more suitable for DNA microarray probe design than cyt b and COI. The large portion of rejected COI probes after hybridisation experiments (>90%) renders this DNA barcoding marker rather unsuitable for this high-throughput technology. Conclusions/Significance: Based on these data, a DNA microarray containing 64 functional oligonucleotide probes for the identification of 30 out of the 50 fish species investigated was developed. It represents the next step towards an automated and easy-to-handle method to identify fish, ichthyoplankton, and fish products. PMID
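
    The Neighbour-Joining trees mentioned above are built from pairwise distances between aligned barcode sequences. A minimal Biopython sketch of that step follows; the input file name is hypothetical, and the identity distance model is used for brevity where barcoding studies more commonly use Kimura two-parameter distances.

      from Bio import AlignIO
      from Bio.Phylo.TreeConstruction import DistanceCalculator, DistanceTreeConstructor

      # Hypothetical pre-computed multiple alignment of COI/cyt b/16S barcodes.
      alignment = AlignIO.read("barcodes_aligned.fasta", "fasta")

      distances = DistanceCalculator("identity").get_distance(alignment)
      nj_tree = DistanceTreeConstructor().nj(distances)   # Neighbour-Joining tree
      print(nj_tree)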

  9. Microarrays for Undergraduate Classes

    ERIC Educational Resources Information Center

    Hancock, Dale; Nguyen, Lisa L.; Denyer, Gareth S.; Johnston, Jill M.

    2006-01-01

    A microarray experiment is presented that, in six laboratory sessions, takes undergraduate students from the tissue sample right through to data analysis. The model chosen, the murine erythroleukemia cell line, can be easily cultured in sufficient quantities for class use. Large changes in gene expression can be induced in these cells by…

  10. High-Throughput Fully Automated Construction of a Multiplex Library of Mutagenized Open Reading Frames for an Insecticidal Peptide Using a Plasmid-Based Functional Proteomic Robotic Workcell with Improved Vacuum System

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Robotic platforms are essential for the production and screening of large numbers of expression-ready plasmid sets used to develop optimized clones and improved microbial strains. Here we demonstrate a plasmid-based integrated workcell that was used to automate the molecular biology protocols inclu...

  11. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Automated cell counter. 864.5200 Section 864.5200....5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of...

  12. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Automated cell counter. 864.5200 Section 864.5200....5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of...

  13. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated cell counter. 864.5200 Section 864.5200....5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of...

  14. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Automated cell counter. 864.5200 Section 864.5200....5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of...

  15. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Automated cell counter. 864.5200 Section 864.5200....5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of...

  16. Automation in Immunohematology

    PubMed Central

    Bajpai, Meenu; Kaur, Ravneet; Gupta, Ekta

    2012-01-01

    There have been rapid technological advances in blood banking in South Asian region over the past decade with an increasing emphasis on quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as column agglutination technique, solid phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation and major manufacturers in this field have come up with semi and fully automated equipments for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents and processes and archiving of results is another major advantage of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service to provide quality patient care with lesser turnaround time for their ever increasing workload. This article discusses the various issues involved in the process. PMID:22988378

  17. Automation in immunohematology.

    PubMed

    Bajpai, Meenu; Kaur, Ravneet; Gupta, Ekta

    2012-07-01

    There have been rapid technological advances in blood banking in South Asian region over the past decade with an increasing emphasis on quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as column agglutination technique, solid phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation and major manufacturers in this field have come up with semi and fully automated equipments for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents and processes and archiving of results is another major advantage of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service to provide quality patient care with lesser turnaround time for their ever increasing workload. This article discusses the various issues involved in the process. PMID:22988378

  19. Automated, Miniaturized Instrument for Measuring Gene Expression in Space

    NASA Technical Reports Server (NTRS)

    Pohorille, A.; Peyvan, K.; Danley, D.; Ricco, A. J.

    2010-01-01

    To facilitate astrobiological studies on the survival and adaptation of microorganisms and mixed microbial cultures to the space environment, we have been developing a fully automated, miniaturized system for measuring their gene expression on small spacecraft. This low-cost, multi-purpose instrument represents a major scientific and technological advancement in our ability to study the impact of the space environment on biological systems by providing data on cellular metabolism and regulation orders of magnitude richer than what is currently available. The system supports growth of the organism, lyses it to release the expressed RNA, labels the RNA, reads the expression levels of a large number of genes by microarray analysis of the labeled RNA, and transmits the measurements to Earth. To measure gene expression we use microarray technology developed by CombiMatrix, which is based on electrochemical reactions on arrays of electrodes on a semiconductor substrate. Since the electrical integrity of the microarray remains intact after probe synthesis, the circuitry can be employed to sense nucleic acid binding at each electrode. CombiMatrix arrays can be sectored to allow multiple samples per chip. In addition, a single array can be used for several assays. The array has been integrated into an automated microfluidic cartridge that uses flexible reagent blisters and pinch pumping to move liquid reagents between chambers. The proposed instrument will help us understand the adaptation of terrestrial life to conditions beyond the planet of origin, identify deleterious effects of the space environment, develop effective countermeasures against these effects, and test our ability to sustain and grow in space organisms that can be used for life support and in situ resource utilization during long-duration space exploration. The instrument is suitable for small satellite platforms, which provide frequent, low-cost access to space. It can also be used on any other platform in space.

  20. Microarray analysis at single molecule resolution

    PubMed Central

    Mureşan, Leila; Jacak, Jarosław; Klement, Erich Peter; Hesse, Jan; Schütz, Gerhard J.

    2010-01-01

    Bioanalytical chip-based assays have been enormously improved in sensitivity in recent years; detection of trace amounts of substances, down to the level of individual fluorescent molecules, has become state-of-the-art technology. The impact of such detection methods, however, has not yet been fully exploited, mainly due to a lack of appropriate mathematical tools for robust data analysis. One particular example relates to the analysis of microarray data. While classical microarray analysis works at resolutions of two to 20 micrometers and quantifies the abundance of target molecules by determining average pixel intensities, a novel high-resolution approach [1] directly visualizes individual bound molecules as diffraction-limited peaks. The now-possible quantification via counting is less susceptible to labeling artifacts and background noise. We have developed an approach for the analysis of high-resolution microarray images. It consists, first, of a single-molecule detection step based on undecimated wavelet transforms and, second, of a spot identification step via a spatial statistics approach (corresponding to the segmentation step in classical microarray analysis). The detection method was tested on simulated images with a concentration range of 0.001 to 0.5 molecules per square micron and signal-to-noise ratio (SNR) between 0.9 and 31.6. For SNR above 15 the relative error in false negatives was below 15%. Separation of foreground and background proved reliable provided the foreground density exceeded the background by a factor of 2. The method has also been applied to real data from high-resolution microarray measurements. PMID:20123580
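
    The counting idea described above can be illustrated with a very reduced peak-detection sketch: band-pass filter the image, then keep local maxima that clear a robust noise threshold. A difference of Gaussians is used here as a simple stand-in for the undecimated wavelet detection step, and the spatial-statistics spot identification is not reproduced, so this illustrates the principle rather than the published method.

      import numpy as np
      from scipy import ndimage

      def count_molecules(image, sigma_small=1.0, sigma_large=3.0, k=4.0):
          # Band-pass filter (difference of Gaussians) to enhance
          # diffraction-limited peaks over slowly varying background.
          img = np.asarray(image, float)
          dog = ndimage.gaussian_filter(img, sigma_small) - ndimage.gaussian_filter(img, sigma_large)
          # Robust noise estimate from the median absolute deviation.
          noise = 1.4826 * np.median(np.abs(dog - np.median(dog)))
          # A detection is a local maximum exceeding k times the noise level.
          peaks = (dog == ndimage.maximum_filter(dog, size=3)) & (dog > k * noise)
          return int(peaks.sum())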

  1. Microarrays under the microscope

    PubMed Central

    Wildsmith, S E; Elcock, F J

    2001-01-01

    Microarray technology is a rapidly advancing area, which is gaining popularity in many biological disciplines from drug target identification to predictive toxicology. Over the past few years, there has been a dramatic increase in the number of methods and techniques available for carrying out this form of gene expression analysis. The techniques and associated peripherals, such as slide types, deposition methods, robotics, and scanning equipment, are undergoing constant improvement, helping to drive the technology forward in terms of robustness and ease of use. These rapid developments, combined with the number of options available and the associated hyperbole, can prove daunting for the new user. This review aims to guide the researcher through the various steps of conducting microarray experiments, from initial strategy to analysing the data, with critical examination of the benefits and disadvantages along the way. PMID:11212888

  2. Navigating public microarray databases.

    PubMed

    Penkett, Christopher J; Bähler, Jürg

    2004-01-01

    With the ever-escalating amount of data being produced by genome-wide microarray studies, it is of increasing importance that these data are captured in public databases so that researchers can use this information to complement and enhance their own studies. Many groups have set up databases of expression data, ranging from large repositories, which are designed to comprehensively capture all published data, through to more specialized databases. The public repositories, such as ArrayExpress at the European Bioinformatics Institute contain complete datasets in raw format in addition to processed data, whilst the specialist databases tend to provide downstream analysis of normalized data from more focused studies and data sources. Here we provide a guide to the use of these public microarray resources.

  3. Evaluating concentration estimation errors in ELISA microarray experiments

    SciTech Connect

    Daly, Don S.; White, Amanda M.; Varnum, Susan M.; Anderson, Kevin K.; Zangar, Richard C.

    2005-01-26

    Enzyme-linked immunosorbent assay (ELISA) is a standard immunoassay to predict a protein concentration in a sample. Deploying ELISA in a microarray format permits simultaneous prediction of the concentrations of numerous proteins in a small sample. These predictions, however, are uncertain due to processing error and biological variability. Evaluating prediction error is critical to interpreting biological significance and improving the ELISA microarray process. Evaluating prediction error must be automated to realize a reliable high-throughput ELISA microarray system. Methods: In this paper, we present a statistical method based on propagation of error to evaluate prediction errors in the ELISA microarray process. Although propagation of error is central to this method, it is effective only when comparable data are available. Therefore, we briefly discuss the roles of experimental design, data screening, normalization and statistical diagnostics when evaluating ELISA microarray prediction errors. We use an ELISA microarray investigation of breast cancer biomarkers to illustrate the evaluation of prediction errors. The illustration begins with a description of the design and resulting data, followed by a brief discussion of data screening and normalization. In our illustration, we fit a standard curve to the screened and normalized data, review the modeling diagnostics, and apply propagation of error.
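
    The propagation-of-error idea described above can be sketched for a single analyte: intensities are converted to concentrations by inverting the fitted standard curve, and the intensity uncertainty is carried through by the delta method. The sketch below assumes a four-parameter logistic standard curve and, for brevity, ignores the uncertainty in the fitted curve parameters, which the full treatment would also propagate.

      import numpy as np

      def concentration(y, a, b, c, d):
          # Invert the 4PL standard curve y = d + (a - d) / (1 + (x/c)**b)
          # for the concentration x (valid for responses between d and a).
          return c * ((a - y) / (y - d)) ** (1.0 / b)

      def concentration_sd(y, y_sd, a, b, c, d, eps=1e-6):
          # Delta method: sd(x) ~= |dx/dy| * sd(y), derivative taken numerically.
          dxdy = (concentration(y + eps, a, b, c, d) - concentration(y - eps, a, b, c, d)) / (2 * eps)
          return abs(dxdy) * y_sd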

  4. Microarray oligonucleotide probe designer (MOPeD): A web service

    PubMed Central

    Patel, Viren C; Mondal, Kajari; Shetty, Amol Carl; Horner, Vanessa L; Bedoyan, Jirair K; Martin, Donna; Caspary, Tamara; Cutler, David J; Zwick, Michael E

    2011-01-01

    Methods of genomic selection that combine high-density oligonucleotide microarrays with next-generation DNA sequencing allow investigators to characterize genomic variation in selected portions of complex eukaryotic genomes. Yet choosing which specific oligonucleotides to use can pose a major technical challenge. To address this issue, we have developed a software package called MOPeD (Microarray Oligonucleotide Probe Designer), which automates the process of designing genomic selection microarrays. This web-based software allows individual investigators to design custom genomic selection microarrays optimized for synthesis with Roche NimbleGen’s maskless photolithography. Design parameters include uniqueness of the probe sequences, melting temperature, hairpin formation, and the presence of single nucleotide polymorphisms. We generated probe databases for the human, mouse, and rhesus macaque genomes and conducted experimental validation of MOPeD-designed microarrays in human samples by sequencing the human X chromosome exome, where relevant sequence metrics indicated superior performance relative to a microarray designed by the Roche NimbleGen proprietary algorithm. We also performed validation in the mouse to identify known mutations contained within a 487-kb region from mouse chromosome 16, the mouse chromosome 16 exome (1.7 Mb), and the mouse chromosome 12 exome (3.3 Mb). Our results suggest that the open source MOPeD software package and website (http://moped.genetics.emory.edu/) will be a valuable resource for investigators in their sequence-based studies of complex eukaryotic genomes. PMID:21379402
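
    The design parameters listed above (melting temperature, hairpin formation, and so on) translate naturally into per-probe filters. The sketch below shows simplified versions of two such filters; MOPeD's actual criteria and thermodynamic models are not described here, so the Wallace-rule Tm estimate, the crude self-complementarity check, and the thresholds are all illustrative assumptions.

      def wallace_tm(probe):
          # Wallace rule: 2 °C per A/T plus 4 °C per G/C. Real probe designers
          # usually use nearest-neighbor thermodynamics; this suffices for a sketch.
          p = probe.upper()
          return 2 * (p.count("A") + p.count("T")) + 4 * (p.count("G") + p.count("C"))

      def self_complementary(probe, stem=6):
          # Crude hairpin screen: is any stem-length subsequence of the probe
          # also present as a reverse complement within the probe?
          p = probe.upper()
          rc = p.translate(str.maketrans("ACGT", "TGCA"))[::-1]
          return any(p[i:i + stem] in rc for i in range(len(p) - stem + 1))

      def passes_filters(probe, tm_range=(60, 80)):
          return tm_range[0] <= wallace_tm(probe) <= tm_range[1] and not self_complementary(probe)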

  5. Microwave-assisted sample treatment in a fully automated flow-based instrument: oxidation of reduced technetium species in the analysis of total technetium-99 in caustic aged nuclear waste samples.

    PubMed

    Egorov, Oleg B; O'Hara, Matthew J; Grate, Jay W

    2004-07-15

    An automated flow-based instrument for microwave-assisted treatment of liquid samples has been developed and characterized. The instrument utilizes a flow-through reaction vessel design that facilitates the addition of multiple reagents during sample treatment and removal of the gaseous reaction products and enables quantitative removal of liquids from the reaction vessel for carryover-free operations. Matrix modification and speciation control chemistries that are required for the radiochemical determination of total (99)Tc in caustic aged nuclear waste samples have been investigated. A rapid and quantitative oxidation procedure using peroxydisulfate in acidic solution was developed to convert reduced technetium species to pertechnetate in samples with high content of reducing organics. The effectiveness of the automated sample treatment procedures has been validated in the radiochemical analysis of total (99)Tc in caustic aged nuclear waste matrixes from the Hanford site.

  6. [Research progress of probe design software of oligonucleotide microarrays].

    PubMed

    Chen, Xi; Wu, Zaoquan; Liu, Zhengchun

    2014-02-01

    DNA microarrays have become an essential medical genetic diagnostic tool owing to their high throughput, miniaturization and automation. The design and selection of oligonucleotide probes are critical for preparing gene chips of high quality. Several probe design software packages have been developed and are now available to perform this work. Each package targets different sequences and has its own advantages and limitations. In this article, the research and development of these software packages are reviewed in terms of three main criteria: specificity, sensitivity and melting temperature (Tm). In addition, based on experimental results from the literature, the packages are classified according to their applications. This review should help users choose appropriate probe-design software, reduce the cost of microarrays, improve the application efficiency of microarrays, and promote both the research and development (R&D) and commercialization of high-performance probe design software.

  7. Identification of spots in rotated and skewed microarray images

    NASA Astrophysics Data System (ADS)

    Le Brese, Christopher; Zou, Ju Jia

    2009-12-01

    DNA microarray image processing has vast potential in the measurement of mass gene expression. A common approach to processing microarrays consists of spot identification, spot segmentation, and information extraction. We are concerned with spot identification. We aim to tackle the problem of identifying spots in rotated and skewed arrays via an automated process. The method proposed is composed of three steps, namely, array orientation calculation based on the Hough transform, affine calculation and correction, and gridding. The method is able to correctly identify spots in a microarray that has been rotated or skewed at an angle between 0 and +/-30 deg and corrupted by various types of noise such as high-intensity streaks, Gaussian noise, and salt-and-pepper noise.
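
    The orientation-estimation step described above can be sketched with scikit-image's Hough line transform: threshold the image, detect the dominant near-vertical lines formed by the spot columns, and rotate by the median detected angle. The affine (skew) correction and gridding steps are not reproduced, and the sign of the correction depends on the image coordinate convention, so this is an illustration rather than the authors' algorithm.

      import numpy as np
      from skimage.filters import threshold_otsu
      from skimage.transform import hough_line, hough_line_peaks, rotate

      def deskew(image, search_deg=30.0):
          # Foreground spots as a binary mask.
          binary = image > threshold_otsu(image)
          # Restrict the Hough search to near-vertical lines (spot columns).
          thetas = np.deg2rad(np.linspace(-search_deg, search_deg, 241))
          hspace, angles, dists = hough_line(binary, theta=thetas)
          _, peak_angles, _ = hough_line_peaks(hspace, angles, dists)
          tilt = np.rad2deg(np.median(peak_angles))
          # Rotate by the opposite of the estimated tilt; flip the sign if the
          # grid ends up more misaligned in your coordinate convention.
          return rotate(image, -tilt, preserve_range=True)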

  8. Surface chemistries for antibody microarrays

    SciTech Connect

    Seurynck-Servoss, Shannon L.; Baird, Cheryl L.; Rodland, Karin D.; Zangar, Richard C.

    2007-05-01

    Enzyme-linked immunosorbent assay (ELISA) microarrays promise to be a powerful tool for the detection of disease biomarkers. The original technology for printing ELISA microarray chips and capturing antibodies on slides was derived from the DNA microarray field. However, due to the need to maintain antibody structure and function when immobilized, surface chemistries used for DNA microarrays are not always appropriate for ELISA microarrays. In order to identify better surface chemistries for antibody capture, a number of commercial companies and academic research groups have developed new slide types that could improve antibody function in microarray applications. In this review we compare and contrast the commercially available slide chemistries, as well as highlight some promising recent advances in the field.

  9. Tiling Microarray Analysis Tools

    SciTech Connect

    Nix, Davis Austin

    2005-05-04

    TiMAT is a package of 23 command line Java applications for use in the analysis of Affymetrix tiled genomic microarray data. TiMAT enables: 1) Rebuilding the genome annotation for entire tiled arrays (repeat filtering, chromosomal coordinate assignment). 2) Post processing of oligo intensity values (quantile normalization, median scaling, PMMM transformation), 3) Significance testing (Wilcoxon rank sum and signed rank tests, intensity difference and ratio tests) and Interval refinement (filtering based on multiple statistics, overlap comparisons), 4) Data visualization (detailed thumbnail/zoomed view with Interval Plots and data export to Affymetrix's Integrated Genome Browser) and Data reports (spreadsheet summaries and detailed profiles)
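
    TiMAT itself is a Java package; purely to illustrate the quantile-normalization step listed above, a minimal NumPy version (with naive tie handling) could look like this.

      import numpy as np

      def quantile_normalize(intensities):
          """intensities: 2-D array, rows = probes, columns = arrays."""
          order = np.argsort(intensities, axis=0)        # per-array ranking
          ranked = np.sort(intensities, axis=0)
          mean_per_rank = ranked.mean(axis=1)            # reference distribution
          normalized = np.empty_like(intensities, dtype=float)
          for j in range(intensities.shape[1]):
              normalized[order[:, j], j] = mean_per_rank
          return normalized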

  10. Enhanced production of L-(+)-lactic acid in chemostat by Lactobacillus casei DSM 20011 using ion-exchange resins and cross-flow filtration in a fully automated pilot plant controlled via NIR.

    PubMed

    González-Vara Y R, A; Vaccari, G; Dosi, E; Trilli, A; Rossi, M; Matteuzzi, D

    2000-01-20

    Due to the lack of suitable in-process sensors, on-line monitoring of fermentation processes is restricted almost exclusively to the measurement of physical parameters only indirectly related to key process variables, i.e., substrate, product, and biomass concentration. This obstacle can be overcome by near infrared (NIR) spectroscopy, which allows not only real-time process monitoring, but also automated process control, provided that NIR-generated information is fed to a suitable computerized bioreactor control system. Once the relevant calibrations have been obtained, substrate, biomass and product concentration can be evaluated on-line and used by the bioreactor control system to manage the fermentation. In this work, an NIR-based control system allowed the full automation of a small-scale pilot plant for lactic acid production and provided an excellent tool for process optimization. The growth-inhibiting effect of lactic acid present in the culture broth is enhanced when the growth-limiting substrate, glucose, is also present at relatively high concentrations. Both combined factors can result in a severe reduction of the performance of the lactate production process. A dedicated software package enabling on-line NIR data acquisition and reduction, and automated process management through feed addition, culture removal and/or product recovery by microfiltration, was developed in order to allow the implementation of continuous fermentation processes with recycling of culture medium and cell recycling. Both operation modes were tested at different dilution rates and the respective cultivation parameters observed were compared with those obtained in a conventional continuous fermentation. Steady states were obtained in both modes with high lactate production performance. The highest lactate volumetric productivity, 138 g L(-1) h(-1), was obtained in continuous fermentation with cell recycling.

  11. Ecotoxicogenomics: Microarray interlaboratory comparability.

    PubMed

    Vidal-Dorsch, Doris E; Bay, Steven M; Moore, Shelly; Layton, Blythe; Mehinto, Alvine C; Vulpe, Chris D; Brown-Augustine, Marianna; Loguinov, Alex; Poynton, Helen; Garcia-Reyero, Natàlia; Perkins, Edward J; Escalon, Lynn; Denslow, Nancy D; Cristina, Colli-Dula R; Doan, Tri; Shukradas, Shweta; Bruno, Joy; Brown, Lorraine; Van Agglen, Graham; Jackman, Paula; Bauer, Megan

    2016-02-01

    Transcriptomic analysis can complement traditional ecotoxicology data by providing mechanistic insight, and by identifying sub-lethal organismal responses and contaminant classes underlying observed toxicity. Before transcriptomic information can be used in monitoring and risk assessment, it is necessary to determine its reproducibility and detect key steps impacting the reliable identification of differentially expressed genes. A custom 15K-probe microarray was used to conduct transcriptomics analyses across six laboratories with estuarine amphipods exposed to cyfluthrin-spiked or control sediments (10 days). Two sample types were generated: one consisted of total RNA extracts (Ex) from exposed and control samples (extracted by one laboratory) and the other consisted of exposed and control whole body amphipods (WB) from which each laboratory extracted RNA. Our findings indicate that gene expression microarray results are repeatable. Differentially expressed data had a higher degree of repeatability across all laboratories in samples with similar RNA quality (Ex) when compared to WB samples with more variable RNA quality. Despite such variability, a subset of genes was consistently identified as differentially expressed across all laboratories and sample types. We found that the differences among the individual laboratory results can be attributed to several factors, including RNA quality and technical expertise, but the overall results can be improved by following consistent protocols and with appropriate training.

  12. Systematic review automation technologies.

    PubMed

    Tsafnat, Guy; Glasziou, Paul; Choong, Miew Keen; Dunn, Adam; Galgani, Filippo; Coiera, Enrico

    2014-07-09

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors to the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of the tasks of the systematic review. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed for realizing automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time.

  13. Systematic review automation technologies

    PubMed Central

    2014-01-01

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors to the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of the tasks of the systematic review. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed for realizing automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128

  14. Electronic microarray assays for avian influenza and Newcastle disease virus.

    PubMed

    Lung, Oliver; Beeston, Anne; Ohene-Adjei, Samuel; Pasick, John; Hodko, Dalibor; Hughes, Kimberley Burton; Furukawa-Stoffer, Tara; Fisher, Mathew; Deregt, Dirk

    2012-11-01

    Microarrays are suitable for multiplexed detection and typing of pathogens. Avian influenza virus (AIV) is currently classified into 16 H (hemagglutinin) and 9 N (neuraminidase) subtypes, whereas Newcastle disease virus (NDV) strains differ in virulence and are broadly classified into high and low pathogenicity types. In this study, three assays for detection and typing of poultry viruses were developed on an automated microarray platform: a multiplex assay for simultaneous detection of AIV and detection and pathotyping of NDV, and two separate assays for differentiating all AIV H and N subtypes. The AIV-NDV multiplex assay detected all strains in a 63 virus panel, and accurately typed all high pathogenicity NDV strains tested. A limit of detection of 10(1)-10(3) TCID(50)/mL and 200-400 EID(50)/mL was obtained for NDV and AIV, respectively. The AIV typing assays accurately typed all 41 AIV strains and a limit of detection of 4-200 EID(50)/mL was obtained. Assay validation showed that the microarray assays were generally comparable to real-time RT-PCR. However, the AIV typing microarray assays detected more positive clinical samples than the AIV matrix real-time RT-PCR, and also provided information regarding the subtype. The AIV-NDV multiplex and AIV H typing microarray assays detected mixed infections and could be useful for detection and typing of AIV and NDV.

  15. DNA Microarray-Based Diagnostics.

    PubMed

    Marzancola, Mahsa Gharibi; Sedighi, Abootaleb; Li, Paul C H

    2016-01-01

    The DNA microarray technology is currently a useful biomedical tool which has been developed for a variety of diagnostic applications. However, the development pathway has not been smooth and the technology has faced some challenges. The reliability of the microarray data and also the clinical utility of the results in the early days were criticized. These criticisms added to the severe competition from other techniques, such as next-generation sequencing (NGS), impacting the growth of microarray-based tests in the molecular diagnostic market. Thanks to the advances in the underlying technologies as well as the tremendous effort offered by the research community and commercial vendors, these challenges have mostly been addressed. Nowadays, the microarray platform has achieved sufficient standardization and method validation as well as efficient probe printing, liquid handling and signal visualization. Integration of various steps of the microarray assay into a harmonized and miniaturized handheld lab-on-a-chip (LOC) device has been a goal for the microarray community. In this respect, notable progress has been achieved in coupling the DNA microarray with the liquid manipulation microsystem as well as the supporting subsystem that will generate the stand-alone LOC device. In this chapter, we discuss the major challenges that microarray technology has faced in its almost two decades of development and also describe the solutions to overcome the challenges. In addition, we review the advancements of the technology, especially the progress toward developing the LOC devices for DNA diagnostic applications.

  16. A rapid automatic processing platform for bead label-assisted microarray analysis: application for genetic hearing-loss mutation detection.

    PubMed

    Zhu, Jiang; Song, Xiumei; Xiang, Guangxin; Feng, Zhengde; Guo, Hongju; Mei, Danyang; Zhang, Guohao; Wang, Dong; Mitchelson, Keith; Xing, Wanli; Cheng, Jing

    2014-04-01

    Molecular diagnostics using microarrays are increasingly being used in clinical diagnosis because of their high throughput, sensitivity, and accuracy. However, standard microarray processing takes several hours and involves manual steps during hybridization, slide clean up, and imaging. Here we describe the development of an integrated platform that automates these individual steps as well as significantly shortens the processing time and improves reproducibility. The platform integrates such key elements as a microfluidic chip, flow control system, temperature control system, imaging system, and automated analysis of clinical results. Bead labeling of microarray signals required a simple imaging system and allowed continuous monitoring of the microarray processing. To demonstrate utility, the automated platform was used to genotype hereditary hearing-loss gene mutations. Compared with conventional microarray processing procedures, the platform increases the efficiency and reproducibility of hybridization, speeding microarray processing through to result analysis. The platform also continuously monitors the microarray signals, which can be used to facilitate optimization of microarray processing conditions. In addition, the modular design of the platform lends itself to development of simultaneous processing of multiple microfluidic chips. We believe the novel features of the platform will benefit its use in clinical settings in which fast, low-complexity molecular genetic testing is required.

  17. Living-Cell Microarrays

    PubMed Central

    Yarmush, Martin L.; King, Kevin R.

    2011-01-01

    Living cells are remarkably complex. To unravel this complexity, living-cell assays have been developed that allow delivery of experimental stimuli and measurement of the resulting cellular responses. High-throughput adaptations of these assays, known as living-cell microarrays, which are based on microtiter plates, high-density spotting, microfabrication, and microfluidics technologies, are being developed for two general applications: (a) to screen large-scale chemical and genomic libraries and (b) to systematically investigate the local cellular microenvironment. These emerging experimental platforms offer exciting opportunities to rapidly identify genetic determinants of disease, to discover modulators of cellular function, and to probe the complex and dynamic relationships between cells and their local environment. PMID:19413510

  18. Tiling Microarray Analysis Tools

    2005-05-04

    TiMAT is a package of 23 command line Java applications for use in the analysis of Affymetrix tiled genomic microarray data. TiMAT enables: 1) Rebuilding the genome annotation for entire tiled arrays (repeat filtering, chromosomal coordinate assignment). 2) Post processing of oligo intensity values (quantile normalization, median scaling, PMMM transformation), 3) Significance testing (Wilcoxon rank sum and signed rank tests, intensity difference and ratio tests) and Interval refinement (filtering based on multiple statistics, overlap comparisons), 4) Data visualization (detailed thumbnail/zoomed view with Interval Plots and data export to Affymetrix's Integrated Genome Browser) and Data reports (spreadsheet summaries and detailed profiles)

  19. Segmentation of prostate cancer tissue microarray images

    NASA Astrophysics Data System (ADS)

    Cline, Harvey E.; Can, Ali; Padfield, Dirk

    2006-02-01

    Prostate cancer is diagnosed by histopathology interpretation of hematoxylin and eosin (H and E)-stained tissue sections. Gland and nuclei distributions vary with the disease grade. The morphological features vary with the advance of cancer where the epithelial regions grow into the stroma. An efficient pathology slide image analysis method involved using a tissue microarray with known disease stages. Digital 24-bit RGB images were acquired for each tissue element on the slide with both 10X and 40X objectives. Initial segmentation at low magnification was accomplished using prior spectral characteristics from a training tissue set composed of four tissue clusters; namely, glands, epithelia, stroma and nuclei. The segmentation method was automated by using the training RGB values as an initial guess and iterating the averaging process 10 times to find the four cluster centers. Labels were assigned to the nearest cluster center in red-blue spectral feature space. An automatic threshold algorithm separated the glands from the tissue. A visual pseudo color representation of 60 segmented tissue microarray images was generated where white, pink, red, and blue represent glands, epithelia, stroma and nuclei, respectively. The higher magnification images provided refined nuclei morphology. The nuclei were detected with an RGB color space principal component analysis that resulted in a grey scale image. The shape metrics such as compactness, elongation, minimum and maximum diameters were calculated based on the eigenvalues of the best-fitting ellipses to the nuclei.
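
    The seeded, iterated averaging described above is essentially k-means initialized from training-set means; a minimal sketch under that reading (everything beyond the four-class, ten-iteration setup stated in the abstract is an assumption) follows.

      import numpy as np

      def seeded_cluster(pixels, init_centers, iterations=10):
          """pixels: (N, d) spectral features; init_centers: (4, d) training means."""
          centers = np.asarray(init_centers, dtype=float)
          labels = np.zeros(len(pixels), dtype=int)
          for _ in range(iterations):
              # assign each pixel to its nearest cluster center
              dist = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
              labels = dist.argmin(axis=1)
              # recompute each center as the mean of its assigned pixels
              for k in range(len(centers)):
                  if np.any(labels == k):
                      centers[k] = pixels[labels == k].mean(axis=0)
          return labels, centers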

  20. Fully Automated Quantification of the Striatal Uptake Ratio of [99mTc]-TRODAT with SPECT Imaging: Evaluation of the Diagnostic Performance in Parkinson's Disease and the Temporal Regression of Striatal Tracer Uptake

    PubMed Central

    Fang, Yu-Hua Dean; Chiu, Shao-Chieh; Lu, Chin-Song; Yen, Tzu-Chen; Weng, Yi-Hsin

    2015-01-01

    Purpose. We aimed at improving the existing methods for the fully automatic quantification of striatal uptake of [99mTc]-TRODAT with SPECT imaging. Procedures. A normal [99mTc]-TRODAT template was first formed based on 28 healthy controls. Images from PD patients (n = 365) and nPD subjects (28 healthy controls and 33 essential tremor patients) were spatially normalized to the normal template. We performed an inverse transform on the predefined striatal and reference volumes of interest (VOIs) and applied the transformed VOIs to the original image data to calculate the striatal-to-reference ratio (SRR). The diagnostic performance of the SRR was determined through receiver operating characteristic (ROC) analysis. Results. The SRR measured with our new and automatic method demonstrated excellent diagnostic performance with 92% sensitivity, 90% specificity, 92% accuracy, and an area under the curve (AUC) of 0.94. For the evaluation of the mean SRR and the clinical duration, a quadratic function fit the data with R2 = 0.84. Conclusions. We developed and validated a fully automatic method for the quantification of the SRR in a large study sample. This method has an excellent diagnostic performance and exhibits a strong correlation between the mean SRR and the clinical duration in PD patients. PMID:26366413
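
    A hedged sketch of the quantities involved: the striatal-to-reference ratio from two volumes of interest, with the ROC evaluation left as a comment (the mask handling and sign convention are assumptions, not the paper's implementation).

      import numpy as np

      def striatal_to_reference_ratio(image, striatal_mask, reference_mask):
          # mean tracer counts in the striatal VOI over those in the reference VOI
          return image[striatal_mask].mean() / image[reference_mask].mean()

      # Diagnostic performance could then be scored with, e.g.,
      # sklearn.metrics.roc_auc_score(labels, -np.asarray(srr_values)),
      # using the negative ratio because PD patients show reduced uptake.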

  1. Microarray platform for omics analysis

    NASA Astrophysics Data System (ADS)

    Mecklenburg, Michael; Xie, Bin

    2001-09-01

    Microarray technology has revolutionized genetic analysis. However, limitations in genome analysis have led to renewed interest in establishing 'omic' strategies. As we enter the post-genomic era, new microarray technologies are needed to address these new classes of 'omic' targets, such as proteins, as well as lipids and carbohydrates. We have developed a microarray platform that combines self-assembling monolayers with the biotin-streptavidin system to provide a robust, versatile immobilization scheme. A hydrophobic film is patterned on the surface creating an array of tension wells that eliminates evaporation effects, thereby reducing the shear stress to which biomolecules are exposed during immobilization. The streptavidin linker layer makes it possible to adapt and/or develop microarray-based assays using virtually any class of biomolecules including carbohydrates, peptides, antibodies, and receptors, as well as the more traditional DNA-based arrays. Our microarray technology is designed to furnish seamless compatibility across the various 'omic' platforms by providing a common blueprint for fabricating and analyzing arrays. The prototype microarray uses a microscope slide footprint patterned with 2 by 96 flat wells. Data on the microarray platform will be presented.

  2. Fully automated method for the liquid chromatographic-tandem mass spectrometric determination of cyproterone acetate in human plasma using restricted access material for on-line sample clean-up.

    PubMed

    Christiaens, B; Fillet, M; Chiap, P; Rbeida, O; Ceccato, A; Streel, B; De Graeve, J; Crommen, J; Hubert, Ph

    2004-11-12

    A new automated method for the quantitative analysis of cyproterone acetate (CPA) in human plasma has been developed using on-line solid phase extraction (SPE) prior to the LC-MS/MS determination. The method was based on the use of a pre-column packed with internal-surface reversed-phase material (LiChrospher RP-4 ADS, 25 mm x 2 mm) for sample clean-up coupled to LC separation on an octadecyl silica stationary phase by means of a column switching system. A 30 microl plasma sample volume was injected directly onto the pre-column using a mixture of water, acetonitrile and formic acid (90:10:0.1 (v/v/v)) adjusted to pH 4.0 with diluted ammonia as washing liquid. The analyte was then eluted in the back-flush mode with the LC mobile phase consisting of water, methanol and formic acid (10:90:0.1 (v/v/v)). The dispensing flow rates of the washing liquid and the LC mobile phase were 300 microl min(-1). Medroxyprogesterone acetate (MPA) was used as internal standard. The MS ionization of the analytes was achieved using electrospray (ESI) in the positive ion mode. The pseudomolecular ionic species of CPA and MPA (417.4 and 387.5) were selected to generate daughter ions at 357.4 and 327.5, respectively. Finally, the developed method was validated according to a new approach using accuracy profiles as a decision tool. Very good results with respect to accuracy, detectability, repeatability, intermediate precision and selectivity were obtained. The LOQ of cyproterone acetate was 300 pg ml(-1).

  3. Cockpit automation

    NASA Technical Reports Server (NTRS)

    Wiener, Earl L.

    1988-01-01

    The aims and methods of aircraft cockpit automation are reviewed from a human-factors perspective. Consideration is given to the mixed pilot reception of increased automation, government concern with the safety and reliability of highly automated aircraft, the formal definition of automation, and the ground-proximity warning system and accidents involving controlled flight into terrain. The factors motivating automation include technology availability; safety; economy, reliability, and maintenance; workload reduction and two-pilot certification; more accurate maneuvering and navigation; display flexibility; economy of cockpit space; and military requirements.

  4. Fully Regressive Melanoma

    PubMed Central

    Ehrsam, Eric; Kallini, Joseph R.; Lebas, Damien; Modiano, Philippe; Cotten, Hervé

    2016-01-01

    Fully regressive melanoma is a phenomenon in which the primary cutaneous melanoma becomes completely replaced by fibrotic components as a result of host immune response. Although 10 to 35 percent of cases of cutaneous melanomas may partially regress, fully regressive melanoma is very rare; only 47 cases have been reported in the literature to date. All of the cases of fully regressive melanoma reported in the literature were diagnosed in conjunction with metastasis in a patient. The authors describe a case of fully regressive melanoma without any metastases at the time of its diagnosis. Characteristic findings on dermoscopy, as well as the absence of melanoma on final biopsy, confirmed the diagnosis.

  5. Fully Regressive Melanoma

    PubMed Central

    Ehrsam, Eric; Kallini, Joseph R.; Lebas, Damien; Modiano, Philippe; Cotten, Hervé

    2016-01-01

    Fully regressive melanoma is a phenomenon in which the primary cutaneous melanoma becomes completely replaced by fibrotic components as a result of host immune response. Although 10 to 35 percent of cases of cutaneous melanomas may partially regress, fully regressive melanoma is very rare; only 47 cases have been reported in the literature to date. All of the cases of fully regressive melanoma reported in the literature were diagnosed in conjunction with metastasis in a patient. The authors describe a case of fully regressive melanoma without any metastases at the time of its diagnosis. Characteristic findings on dermoscopy, as well as the absence of melanoma on final biopsy, confirmed the diagnosis. PMID:27672418

  6. Microarray Analysis of Microbial Weathering

    NASA Astrophysics Data System (ADS)

    Olsson-Francis, K.; van Houdt, R.; Leys, N.; Mergeay, M.; Cockell, C. S.

    2010-04-01

    Microarray analysis of the heavy metal resistant bacterium, Cupriavidus metallidurans CH34, was used to investigate the genes involved in weathering. The results demonstrated that large porin and membrane transporter genes were upregulated.

  7. [Protein microarrays and personalized medicine].

    PubMed

    Yu, Xiabo; Schneiderhan-Marra, Nicole; Joos, Thomas O

    2011-01-01

    Over the last 10 years, DNA microarrays have achieved a robust analytical performance, enabling their use for analyzing the whole transcriptome or for screening thousands of single-nucleotide polymorphisms in a single experiment. DNA microarrays allow scientists to correlate gene expression signatures with disease progression, to screen for disease-specific mutations, and to treat patients according to their individual genetic profiles; however, the real key is proteins and their manifold functions. It is necessary to achieve a greater understanding of not only protein function and abundance but also their role in the development of diseases. Protein concentrations have been shown to reflect the physiological and pathologic state of an organ, tissue, or cells far more directly than DNA, and proteins can be profiled effectively with protein microarrays, which require only a small amount of sample material. Protein microarrays have become well-established tools in basic and applied research, and the first products have already entered the in vitro diagnostics market. This review focuses on protein microarray applications for biomarker discovery and validation, disease diagnosis, and use within the area of personalized medicine. Protein microarrays have proved to be reliable research tools in screening for a multitude of parameters with only a minimal quantity of sample and have enormous potential in applications for diagnostic and personalized medicine.

  8. Dual modality intravascular optical coherence tomography (OCT) and near-infrared fluorescence (NIRF) imaging: a fully automated algorithm for the distance-calibration of NIRF signal intensity for quantitative molecular imaging

    PubMed Central

    Ughi, Giovanni J.; Verjans, Johan; Fard, Ali M.; Wang, Hao; Osborn, Eric; Hara, Tetsuya; Mauskapf, Adam; Jaffer, Farouc A.; Tearney, Guillermo J.

    2015-01-01

    Intravascular optical coherence tomography (IVOCT) is a well-established method for the high-resolution investigation of atherosclerosis in vivo. Intravascular near-infrared fluorescence (NIRF) imaging is a novel technique for the assessment of molecular processes associated with coronary artery disease. Integration of NIRF and IVOCT technology in a single catheter provides the capability to simultaneously obtain co-localized anatomical and molecular information from the artery wall. Since NIRF signal intensity attenuates as a function of imaging catheter distance to the vessel wall, the generation of quantitative NIRF data requires an accurate measurement of the vessel wall in IVOCT images. Given that dual-modality intravascular OCT-NIRF systems acquire data at a very high frame rate (>100 frames/second), a high number of images per pullback need to be analyzed, making manual processing of OCT-NIRF data extremely time consuming. To overcome this limitation, we developed an algorithm for the automatic distance-correction of dual-modality OCT-NIRF images. We validated this method by comparing automatic to manual segmentation results in 180 in vivo images from 6 atherosclerotic New Zealand White rabbits after indocyanine green (ICG) injection. A high Dice similarity coefficient was found (0.97 ± 0.03) together with an average individual A-line error of 22 μm (i.e., approximately twice the axial resolution of IVOCT) and a processing time of 44 ms per image. In a similar manner, the algorithm was validated using 120 IVOCT clinical images from 8 different in vivo pullbacks in human coronary arteries. The results suggest that the proposed algorithm enables fully automatic visualization of dual modality OCT-NIRF pullbacks, and provides an accurate and efficient calibration of NIRF data for quantification of the molecular agent in the atherosclerotic vessel wall. PMID:25341407
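
    For reference, the Dice similarity coefficient used in the validation above can be computed as follows (a generic implementation, not the authors' code).

      import numpy as np

      def dice_coefficient(mask_a, mask_b):
          a = np.asarray(mask_a, dtype=bool)
          b = np.asarray(mask_b, dtype=bool)
          total = a.sum() + b.sum()
          return 2.0 * np.logical_and(a, b).sum() / total if total else 1.0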

  9. A quantitative liposome microarray to systematically characterize protein-lipid interactions.

    PubMed

    Saliba, Antoine-Emmanuel; Vonkova, Ivana; Ceschia, Stefano; Findlay, Greg M; Maeda, Kenji; Tischer, Christian; Deghou, Samy; van Noort, Vera; Bork, Peer; Pawson, Tony; Ellenberg, Jan; Gavin, Anne-Claude

    2014-01-01

    Lipids have a role in virtually all biological processes, acting as structural elements, scaffolds and signaling molecules, but they are still largely under-represented in known biological networks. Here we describe a liposome microarray-based assay (LiMA), a method that measures protein recruitment to membranes in a quantitative, automated, multiplexed and high-throughput manner.

  10. DNA microarray technology in nutraceutical and food safety.

    PubMed

    Liu-Stratton, Yiwen; Roy, Sashwati; Sen, Chandan K

    2004-04-15

    The quality and quantity of diet are key determinants of health and disease. Molecular diagnostics may play a key role in food safety related to genetically modified foods, food-borne pathogens and novel nutraceuticals. Functional outcomes in biology are determined, for the most part, by net balance between sets of genes related to the specific outcome in question. The DNA microarray technology offers a new dimension of strength in molecular diagnostics by permitting the simultaneous analysis of large sets of genes. Assay automation and novel bioinformatics tools make DNA microarrays a robust technology for diagnostics. Since its development a few years ago, this technology has been used for the applications of toxicogenomics, pharmacogenomics, cell biology, and clinical investigations addressing the prevention and intervention of diseases. Optimization of this technology to specifically address food safety represents a vast, largely untapped resource. Efforts to develop diagnostic custom arrays and simplified bioinformatics tools for field use are warranted.

  11. Comparing Bacterial DNA Microarray Fingerprints

    SciTech Connect

    Willse, Alan R.; Chandler, Darrell P.; White, Amanda M.; Protic, Miroslava; Daly, Don S.; Wunschel, Sharon C.

    2005-08-15

    Detecting subtle genetic differences between microorganisms is an important problem in molecular epidemiology and microbial forensics. In a typical investigation, gel electrophoresis is used to compare randomly amplified DNA fragments between microbial strains, where the patterns of DNA fragment sizes are proxies for a microbe's genotype. The limited genomic sample captured on a gel is often insufficient to discriminate nearly identical strains. This paper examines the application of microarray technology to DNA fingerprinting as a high-resolution alternative to gel-based methods. The so-called universal microarray, which uses short oligonucleotide probes that do not target specific genes or species, is intended to be applicable to all microorganisms because it does not require prior knowledge of genomic sequence. In principle, closely related strains can be distinguished if the number of probes on the microarray is sufficiently large, i.e., if the genome is sufficiently sampled. In practice, we confront noisy data, imperfectly matched hybridizations, and a high-dimensional inference problem. We describe the statistical problems of microarray fingerprinting, outline similarities with and differences from more conventional microarray applications, and illustrate the statistical fingerprinting problem for 10 closely related strains from three Bacillus species, and 3 strains from non-Bacillus species.
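
    One simple way to compare two such fingerprints, offered only as an illustration (the paper develops a more careful statistical treatment), is a correlation-based distance between hybridization-intensity vectors.

      import numpy as np

      def fingerprint_distance(profile_a, profile_b):
          # 0 for identical intensity profiles, approaching 2 for anti-correlated ones
          a = np.asarray(profile_a, dtype=float)
          b = np.asarray(profile_b, dtype=float)
          return 1.0 - np.corrcoef(a, b)[0, 1]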

  12. Nanodroplet chemical microarrays and label-free assays.

    PubMed

    Gosalia, Dhaval; Diamond, Scott L

    2010-01-01

    The microarraying of chemicals or biomolecules on a glass surface allows for dense storage and miniaturized screening experiments and can be deployed in chemical-biology research or drug discovery. Microarraying allows the production of scores of replicate slides. Small molecule libraries are typically stored as 10 mM DMSO stock solutions, whereas libraries of biomolecules are typically stored in high percentages of glycerol. Thus, a method is required to print such libraries on microarrays, and then assay them against biological targets. By printing either small molecule libraries or biomolecule libraries in an aqueous solvent containing glycerol, each adherent nanodroplet remains fixed at a position on the microarray by surface tension without the use of wells, without evaporating, and without the need for chemically linking the compound to the surface. Importantly, glycerol is a high boiling point solvent that is fully miscible with DMSO and water and has the additional property of stabilizing various enzymes. The nanoliter volume of the droplet forms the reaction compartment once additional reagents are metered onto the microarray, either by aerosol spray deposition or by addressable acoustic dispensing. Incubation of the nanodroplet microarray in a high humidity environment controls the final water content of the reaction. This platform has been validated for fluorescent HTS assays of proteases and kinases as well as for fluorogenic substrate profiling of proteases. Label-free HTS is also possible by running nanoliter HTS reactions on a MALDI target for mass spectrometry (MS) analysis without the need for desalting of the samples. A method is described for running nanoliter-scale multicomponent homogeneous reactions followed by label-free MALDI MS analysis of the reactions. PMID:20857358

  13. An automated photolithography facility for IC's

    NASA Technical Reports Server (NTRS)

    Kennedy, B. W.

    1980-01-01

    Report discusses subsystems that will constitute fully-automated photolithography facility for IC's. Facility being developed at Marshall Space Flight Center will produce ultrareliable IC's with minimal human intervention.

  14. Protein microarrays: prospects and problems.

    PubMed

    Kodadek, T

    2001-02-01

    Protein microarrays are potentially powerful tools in biochemistry and molecular biology. Two types of protein microarrays are defined. One, termed a protein function array, will consist of thousands of native proteins immobilized in a defined pattern. Such arrays can be utilized for massively parallel testing of protein function, hence the name. The other type is termed a protein-detecting array. This will consist of large numbers of arrayed protein-binding agents. These arrays will allow for expression profiling to be done at the protein level. In this article, some of the major technological challenges to the development of protein arrays are discussed, along with potential solutions.

  15. Automated, Miniaturized Instrument for Measuring Gene Expression in Space

    NASA Astrophysics Data System (ADS)

    Pohorille, Andrew; Danley, David; Payvan, Kia; Ricco, Antonio

    To facilitate astrobiological studies on the survival and adaptation of microorganisms and mixed microbial cultures to the space environment, we have been developing a fully automated, miniaturized system for measuring their gene expression on small spacecraft. This low-cost, multi-purpose instrument represents a major scientific and technological advancement in our ability to study the impact of the space environment on biological systems by providing data on cellular metabolism and regulation orders of magnitude richer than what is currently available. The system supports growth of the organism, lyses it to release the expressed RNA, labels the RNA, reads the expression levels of a large number of genes by microarray analysis of the labeled RNA, and transmits the measurements to Earth. To measure gene expression we use microarray technology developed by CombiMatrix, which is based on electrochemical reactions on arrays of electrodes on a semiconductor substrate. Since the electrical integrity of the microarray remains intact after probe synthesis, the circuitry can be employed to sense nucleic acid binding at each electrode. CombiMatrix arrays can be sectored to allow multiple samples per chip. In addition, a single array can be used for several assays. The array has been integrated into an automated microfluidic cartridge that uses flexible reagent blisters and pinch pumping to move liquid reagents between chambers. The proposed instrument will help to understand adaptation of terrestrial life to conditions beyond the planet of origin, identify deleterious effects of the space environment, develop effective countermeasures against these effects, and test our ability to sustain and grow, in space, organisms that can be used for life support and in situ resource utilization during long-duration space exploration. The instrument is suitable for small satellite platforms, which provide frequent, low cost access to space. It can also be used on any other platform in space.

  16. Process automation

    SciTech Connect

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs.

  17. Microarray Developed on Plastic Substrates.

    PubMed

    Bañuls, María-José; Morais, Sergi B; Tortajada-Genaro, Luis A; Maquieira, Ángel

    2016-01-01

    There is great potential interest in using synthetic polymers as versatile solid supports for analytical microarraying. Chemical modification of polycarbonate (PC) for covalent immobilization of probes, micro-printing of protein or nucleic acid probes, development of indirect immunoassays, and development of hybridization protocols are described and discussed. PMID:26614067

  18. Microfluidic microarray systems and methods thereof

    SciTech Connect

    West, Jay A. A.; Hukari, Kyle W.; Hux, Gary A.

    2009-04-28

    Disclosed are systems that include a manifold in fluid communication with a microfluidic chip having a microarray, an illuminator, and a detector in optical communication with the microarray. Methods for using these systems for biological detection are also disclosed.

  19. Using Kepler for Tool Integration in Microarray Analysis Workflows

    PubMed Central

    Gan, Zhuohui; Stowe, Jennifer C.; Altintas, Ilkay; McCulloch, Andrew D.; Zambon, Alexander C.

    2015-01-01

    Increasing numbers of genomic technologies are leading to massive amounts of genomic data, all of which requires complex analysis. More and more bioinformatics analysis tools are being developed by scientists to simplify these analyses. However, different pipelines have been developed using different software environments. This makes integration of these diverse bioinformatics tools difficult. Kepler provides an open source environment to integrate these disparate packages. Using Kepler, we integrated several external tools including Bioconductor packages, AltAnalyze, a python-based open source tool, and an R-based comparison tool to build an automated workflow to meta-analyze both online and local microarray data. The automated workflow connects the integrated tools seamlessly, delivers data flow between the tools smoothly, and hence improves efficiency and accuracy of complex data analyses. Our workflow exemplifies the usage of Kepler as a scientific workflow platform for bioinformatics pipelines. PMID:26605000

  20. Simple Fully Automated Group Classification on Brain fMRI

    SciTech Connect

    Honorio, J.; Goldstein, R.; Honorio, J.; Samaras, D.; Tomasi, D.; Goldstein, R.Z.

    2010-04-14

    We propose a simple, well-grounded classification technique which is suited for group classification on brain fMRI data sets that have high dimensionality, a small number of subjects, high noise levels, high subject variability, and imperfect registration, and that capture subtle cognitive effects. We propose threshold-split region as a new feature selection method and majority vote as the classification technique. Our method does not require a predefined set of regions of interest. We use averaging across sessions, only one feature per experimental condition, a feature independence assumption, and simple classifiers. The seemingly counter-intuitive approach of using a simple design is supported by signal processing and statistical theory. Experimental results on two block-design data sets that capture brain function under distinct monetary rewards for cocaine-addicted and control subjects show that our method exhibits increased generalization accuracy compared to commonly used feature selection and classification techniques.
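
    The majority-vote step can be sketched on its own as below (the threshold-split region feature selection that precedes it in the paper is omitted here).

      import numpy as np

      def majority_vote(per_feature_predictions):
          """per_feature_predictions: (n_features, n_subjects) array of 0/1 votes."""
          votes = np.asarray(per_feature_predictions)
          return (votes.mean(axis=0) >= 0.5).astype(int)

      print(majority_vote([[1, 0, 1],
                           [1, 1, 0],
                           [0, 0, 1]]))   # -> [1 0 1]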

  1. Fully automated procedure for ship detection using optical satellite imagery

    NASA Astrophysics Data System (ADS)

    Corbane, C.; Pecoul, E.; Demagistri, L.; Petit, M.

    2009-01-01

    Ship detection from remote sensing imagery is a crucial application for maritime security which includes among others traffic surveillance, protection against illegal fisheries, oil discharge control and sea pollution monitoring. In the framework of a European integrated project GMES-Security/LIMES, we developed an operational ship detection algorithm using high spatial resolution optical imagery to complement existing regulations, in particular the fishing control system. The automatic detection model is based on statistical methods, mathematical morphology and other signal processing techniques such as the wavelet analysis and Radon transform. This paper presents current progress made on the detection model and describes the prototype designed to classify small targets. The prototype was tested on panchromatic SPOT 5 imagery taking into account the environmental and fishing context in French Guiana. In terms of automatic detection of small ship targets, the proposed algorithm performs well. Its advantages are manifold: it is simple and robust, but most of all, it is efficient and fast, which is a crucial point in performance evaluation of advanced ship detection strategies.

  2. Rapid, fully automated flow injection antioxidant capacity assay.

    PubMed

    Labrinea, Eleni P; Georgiou, Constantinos A

    2005-06-01

    A flow injection method for antioxidant capacity assessment based on a low-cost laboratory-made analyzer is reported. A sample of 30 microL is injected in acetate buffer stream, pH 4.6, that converges with ABTS*(+) reagent stream. Detection is achieved by monitoring absorbance at 414 nm. The proposed method achieves a sample throughput of up to 120 samples h(-1), the detection limit being 1.3 microM trolox. Precision was better than 5% relative standard deviation (n = 4) and the linear range was 4-100 microM, expanded to 250 microM trolox utilizing concentration gradients formed along the injected sample bolus. Information on reaction kinetics is obtained through a single injection. The method was applied to pure compounds and wine and honey samples. Good correlation was found between antioxidant capacity assessed through the proposed method and phenolic content: r = 0.94 for red wines, r = 0.96 for white and rose wines, and r = 0.89 for honeys. PMID:15913292
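
    A hedged sketch of the calibration arithmetic implied above: fit the signal against trolox standards within the linear range and express unknowns as trolox equivalents (the numbers below are invented for illustration).

      import numpy as np

      trolox_uM = np.array([4.0, 20.0, 50.0, 100.0])    # standards in the linear range
      signal = np.array([0.012, 0.060, 0.150, 0.300])   # e.g. absorbance change at 414 nm
      slope, intercept = np.polyfit(trolox_uM, signal, 1)

      def trolox_equivalent_uM(sample_signal):
          return (sample_signal - intercept) / slope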

  3. Automated Estimating System (AES)

    SciTech Connect

    Holder, D.A.

    1989-09-01

    This document describes Version 3.1 of the Automated Estimating System, a personal computer-based software package designed to aid in the creation, updating, and reporting of project cost estimates for the Estimating and Scheduling Department of the Martin Marietta Energy Systems Engineering Division. Version 3.1 of the Automated Estimating System is capable of running in a multiuser environment across a token ring network. The token ring network makes possible services and applications that will more fully integrate all aspects of information processing, provides a central area for large data bases to reside, and allows access to the data base by multiple users. Version 3.1 of the Automated Estimating System also has been enhanced to include an Assembly pricing data base that may be used to retrieve cost data into an estimate. A WBS Title File program has also been included in Version 3.1. The WBS Title File program allows for the creation of a WBS title file that has been integrated with the Automated Estimating System to provide WBS titles in update mode and in reports. This provides for consistency in WBS titles and provides the capability to display WBS titles on reports generated at a higher WBS level.

  4. The Microarray Revolution: Perspectives from Educators

    ERIC Educational Resources Information Center

    Brewster, Jay L.; Beason, K. Beth; Eckdahl, Todd T.; Evans, Irene M.

    2004-01-01

    In recent years, microarray analysis has become a key experimental tool, enabling the analysis of genome-wide patterns of gene expression. This review approaches the microarray revolution with a focus upon four topics: 1) the early development of this technology and its application to cancer diagnostics; 2) a primer of microarray research,…

  5. Investigating Factors Affecting the Uptake of Automated Assessment Technology

    ERIC Educational Resources Information Center

    Dreher, Carl; Reiners, Torsten; Dreher, Heinz

    2011-01-01

    Automated assessment is an emerging innovation in educational praxis, however its pedagogical potential is not fully utilised in Australia, particularly regarding automated essay grading. The rationale for this research is that the usage of automated assessment currently lags behind the capacity that the technology provides, thus restricting the…

  6. Automating the Purple Crow Lidar

    NASA Astrophysics Data System (ADS)

    Hicks, Shannon; Sica, R. J.; Argall, P. S.

    2016-06-01

    The Purple Crow LiDAR (PCL) was built to measure short and long term coupling between the lower, middle, and upper atmosphere. The initial component of my MSc. project is to automate two key elements of the PCL: the rotating liquid mercury mirror and the Zaber alignment mirror. In addition to the automation of the Zaber alignment mirror, it is also necessary to describe the mirror's movement and positioning errors. Its properties will then be added into the alignment software. Once the alignment software has been completed, we will compare the new alignment method with the previous manual procedure. This is the first among several projects that will culminate in a fully-automated lidar. Eventually, we will be able to work remotely, thereby increasing the amount of data we collect. This paper will describe the motivation for automation, the methods we propose, preliminary results for the Zaber alignment error analysis, and future work.

  7. High-throughput variation detection and genotyping using microarrays.

    PubMed

    Cutler, D J; Zwick, M E; Carrasquillo, M M; Yohn, C T; Tobin, K P; Kashuk, C; Mathews, D J; Shah, N A; Eichler, E E; Warrington, J A; Chakravarti, A

    2001-11-01

    The genetic dissection of complex traits may ultimately require a large number of SNPs to be genotyped in multiple individuals who exhibit phenotypic variation in a trait of interest. Microarray technology can enable rapid genotyping of variation specific to study samples. To facilitate their use, we have developed an automated statistical method (ABACUS) to analyze microarray hybridization data and applied this method to Affymetrix Variation Detection Arrays (VDAs). ABACUS provides a quality score to individual genotypes, allowing investigators to focus their attention on sites that give accurate information. We have applied ABACUS to an experiment encompassing 32 autosomal and eight X-linked genomic regions, each consisting of approximately 50 kb of unique sequence spanning a 100-kb region, in 40 humans. At sufficiently high-quality scores, we are able to read approximately 80% of all sites. To assess the accuracy of SNP detection, 108 of 108 SNPs have been experimentally confirmed; an additional 371 SNPs have been confirmed electronically. To assess the accuracy of diploid genotypes at segregating autosomal sites, we confirmed 1515 of 1515 homozygous calls, and 420 of 423 (99.29%) heterozygotes. In replicate experiments, consisting of independent amplification of identical samples followed by hybridization to distinct microarrays of the same design, genotyping is highly repeatable. In an autosomal replicate experiment, 813,295 of 813,295 genotypes are called identically (including 351 heterozygotes); at an X-linked locus in males (haploid), 841,236 of 841,236 sites are called identically.
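
    In the same spirit as the quality-score gating described above, a toy Python sketch (the record layout and threshold are assumptions, not part of ABACUS):

      def callable_genotypes(calls, min_quality):
          """calls: list of (site, genotype, quality_score) tuples."""
          kept = [(site, genotype) for site, genotype, quality in calls
                  if quality >= min_quality]
          fraction_callable = len(kept) / len(calls) if calls else 0.0
          return kept, fraction_callable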

  8. Microarray analysis in pulmonary hypertension.

    PubMed

    Hoffmann, Julia; Wilhelm, Jochen; Olschewski, Andrea; Kwapiszewska, Grazyna

    2016-07-01

    Microarrays are a powerful and effective tool that allows the detection of genome-wide gene expression differences between controls and disease conditions. They have been broadly applied to investigate the pathobiology of diverse forms of pulmonary hypertension, namely group 1, including patients with idiopathic pulmonary arterial hypertension, and group 3, including pulmonary hypertension associated with chronic lung diseases such as chronic obstructive pulmonary disease and idiopathic pulmonary fibrosis. To date, numerous human microarray studies have been conducted to analyse global (lung homogenate samples), compartment-specific (laser capture microdissection), cell type-specific (isolated primary cells) and circulating cell (peripheral blood) expression profiles. Combined, they provide important information on development, progression and the end-stage disease. In the future, system biology approaches, expression of noncoding RNAs that regulate coding RNAs, and direct comparison between animal models and human disease might be of importance. PMID:27076594

  9. Phenotypic MicroRNA Microarrays

    PubMed Central

    Kwon, Yong-Jun; Heo, Jin Yeong; Kim, Hi Chul; Kim, Jin Yeop; Liuzzi, Michel; Soloveva, Veronica

    2013-01-01

    Microarray technology has become a very popular approach in cases where multiple experiments need to be conducted repeatedly or done with a variety of samples. In our lab, we are applying our high-density spot microarray approach to microscopy visualization of the effects of transiently introduced siRNA or cDNA on cellular morphology or phenotype. In this publication, we are discussing the possibility of using this micro-scale high throughput process to study the role of microRNAs in the biology of selected cellular models. After reverse-transfection of microRNAs and siRNA, the cellular phenotype generated by microRNAs regulated NF-κB expression comparably to the siRNA. The ability to print microRNA molecules for reverse transfection into cells is opening up a wide horizon for the phenotypic high content screening of microRNA libraries using cellular disease models.

  10. Self-Assembling Protein Microarrays

    NASA Astrophysics Data System (ADS)

    Ramachandran, Niroshan; Hainsworth, Eugenie; Bhullar, Bhupinder; Eisenstein, Samuel; Rosen, Benjamin; Lau, Albert Y.; C. Walter, Johannes; LaBaer, Joshua

    2004-07-01

    Protein microarrays provide a powerful tool for the study of protein function. However, they are not widely used, in part because of the challenges in producing proteins to spot on the arrays. We generated protein microarrays by printing complementary DNAs onto glass slides and then translating target proteins with mammalian reticulocyte lysate. Epitope tags fused to the proteins allowed them to be immobilized in situ. This obviated the need to purify proteins, avoided protein stability problems during storage, and captured sufficient protein for functional studies. We used the technology to map pairwise interactions among 29 human DNA replication initiation proteins, recapitulate the regulation of Cdt1 binding to select replication proteins, and map its geminin-binding domain.

  11. Microarray analysis in pulmonary hypertension

    PubMed Central

    Hoffmann, Julia; Wilhelm, Jochen; Olschewski, Andrea

    2016-01-01

    Microarrays are a powerful and effective tool that allows the detection of genome-wide gene expression differences between controls and disease conditions. They have been broadly applied to investigate the pathobiology of diverse forms of pulmonary hypertension, namely group 1, including patients with idiopathic pulmonary arterial hypertension, and group 3, including pulmonary hypertension associated with chronic lung diseases such as chronic obstructive pulmonary disease and idiopathic pulmonary fibrosis. To date, numerous human microarray studies have been conducted to analyse global (lung homogenate samples), compartment-specific (laser capture microdissection), cell type-specific (isolated primary cells) and circulating cell (peripheral blood) expression profiles. Combined, they provide important information on development, progression and the end-stage disease. In the future, system biology approaches, expression of noncoding RNAs that regulate coding RNAs, and direct comparison between animal models and human disease might be of importance. PMID:27076594

  12. Automated High Throughput Drug Target Crystallography

    SciTech Connect

    Rupp, B

    2005-02-18

    The molecular structures of drug target proteins and receptors form the basis for 'rational' or structure-guided drug design. The majority of target structures are experimentally determined by protein X-ray crystallography, which has evolved into a highly automated, high-throughput drug discovery and screening tool. Process automation has accelerated tasks from parallel protein expression, fully automated crystallization, and rapid data collection to highly efficient structure determination methods. A thoroughly designed automation technology platform supported by a powerful informatics infrastructure forms the basis for optimal workflow implementation and the data mining and analysis tools to generate new leads from experimental protein drug target structures.

  13. Microarrays, antiobesity and the liver

    PubMed Central

    Castro-Chávez, Fernando

    2013-01-01

    In this review, microarray technology and especially oligonucleotide arrays are exemplified with a practical example taken from the perilipin−/− mice, using the dChip software, which is available free for non-commercial purposes. It was found that the liver of perilipin−/− mice was healthy and normal, even under a high-fat diet, when compared with the results published for the scd1−/− mice, which under high-fat diets had a darker liver, suggestive of hepatic steatosis. Scd1 is required for the biosynthesis of monounsaturated fatty acids and plays a key role in the hepatic synthesis of triglycerides and of very-low-density lipoproteins. Both models of obesity resistance share many similar phenotypic antiobesity features; however, the perilipin−/− mice had a significant downregulation of the stearoyl-CoA desaturases scd1 and scd2 in their white adipose tissue, but a normal level of both genes in the liver, even under a high-fat diet. Here, different microarray methodologies are discussed, along with some of the most recent discoveries and perspectives regarding the use of microarrays, with an emphasis on obesity gene expression, and a personal remark on my findings of increased expression of hemoglobin transcripts and other hemo-related (hemo-like) genes, and of leukocyte-like (leuko-like) genes, in the white adipose tissue of the perilipin−/− mice. In conclusion, microarrays have much to offer in comparative studies such as those in antiobesity research, and they are methodologies well suited to striking new molecular discoveries. PMID:15657555

  14. Lectin microarrays for glycomic analysis.

    PubMed

    Gupta, Garima; Surolia, Avadhesha; Sampathkumar, Srinivasa-Gopalan

    2010-08-01

    Glycomics is the study of comprehensive structural elucidation and characterization of all glycoforms found in nature and their dynamic spatiotemporal changes that are associated with biological processes. Glycocalyx of mammalian cells actively participate in cell-cell, cell-matrix, and cell-pathogen interactions, which impact embryogenesis, growth and development, homeostasis, infection and immunity, signaling, malignancy, and metabolic disorders. Relative to genomics and proteomics, glycomics is just growing out of infancy with great potential in biomedicine for biomarker discovery, diagnosis, and treatment. However, the immense diversity and complexity of glycan structures and their multiple modes of interactions with proteins pose great challenges for development of analytical tools for delineating structure function relationships and understanding glyco-code. Several tools are being developed for glycan profiling based on chromatography, mass spectrometry, glycan microarrays, and glyco-informatics. Lectins, which have long been used in glyco-immunology, printed on a microarray provide a versatile platform for rapid high throughput analysis of glycoforms of biological samples. Herein, we summarize technological advances in lectin microarrays and critically review their impact on glycomics analysis. Challenges remain in terms of expansion to include nonplant derived lectins, standardization for routine clinical use, development of recombinant lectins, and exploration of plant kingdom for discovery of novel lectins. PMID:20726799

  15. Lectin microarrays for glycomic analysis.

    PubMed

    Gupta, Garima; Surolia, Avadhesha; Sampathkumar, Srinivasa-Gopalan

    2010-08-01

    Glycomics is the study of comprehensive structural elucidation and characterization of all glycoforms found in nature and their dynamic spatiotemporal changes that are associated with biological processes. Glycocalyx of mammalian cells actively participate in cell-cell, cell-matrix, and cell-pathogen interactions, which impact embryogenesis, growth and development, homeostasis, infection and immunity, signaling, malignancy, and metabolic disorders. Relative to genomics and proteomics, glycomics is just growing out of infancy with great potential in biomedicine for biomarker discovery, diagnosis, and treatment. However, the immense diversity and complexity of glycan structures and their multiple modes of interactions with proteins pose great challenges for development of analytical tools for delineating structure function relationships and understanding glyco-code. Several tools are being developed for glycan profiling based on chromatography, mass spectrometry, glycan microarrays, and glyco-informatics. Lectins, which have long been used in glyco-immunology, printed on a microarray provide a versatile platform for rapid high throughput analysis of glycoforms of biological samples. Herein, we summarize technological advances in lectin microarrays and critically review their impact on glycomics analysis. Challenges remain in terms of expansion to include nonplant derived lectins, standardization for routine clinical use, development of recombinant lectins, and exploration of plant kingdom for discovery of novel lectins.

  16. Workflows for microarray data processing in the Kepler environment

    PubMed Central

    2012-01-01

    Background Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. Results We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to traditional shell scripting or

  17. Habitat automation

    NASA Technical Reports Server (NTRS)

    Swab, Rodney E.

    1992-01-01

    A habitat, on either the surface of the Moon or Mars, will be designed and built with the proven technologies of that day. These technologies will be mature and readily available to the habitat designer. We believe an acceleration of the normal pace of automation would allow a habitat to be safer and more easily maintained than would be the case otherwise. This document examines the operation of a habitat and describes elements of that operation which may benefit from an increased use of automation. Research topics within the automation realm are then defined and discussed with respect to the role they can have in the design of the habitat. Problems associated with the integration of advanced technologies into real-world projects at NASA are also addressed.

  18. Gene Expression Network Reconstruction by LEP Method Using Microarray Data

    PubMed Central

    You, Na; Mou, Peng; Qiu, Ting; Kou, Qiang; Zhu, Huaijin; Chen, Yuexi; Wang, Xueqin

    2012-01-01

    Gene expression network reconstruction using microarray data is widely studied aiming to investigate the behavior of a gene cluster simultaneously. Under the Gaussian assumption, the conditional dependence between genes in the network is fully described by the partial correlation coefficient matrix. Due to the high dimensionality and sparsity, we utilize the LEP method to estimate it in this paper. Compared to the existing methods, the LEP reaches the highest PPV with the sensitivity controlled at the satisfactory level. A set of gene expression data from the HapMap project is analyzed for illustration. PMID:23365528
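
    The abstract above turns on estimating a sparse partial correlation matrix from expression data under a Gaussian assumption. The LEP estimator itself is not reproduced here; as a rough, hedged illustration of the same idea, the sketch below uses the graphical lasso from scikit-learn (an analogous sparse precision-matrix estimator, not the LEP method) and converts the precision matrix into partial correlations whose non-zero entries define network edges. The data and threshold are placeholders.

      # Illustrative sketch (not the LEP estimator from the paper): estimate a sparse
      # precision matrix with the graphical lasso, then convert it to partial correlations.
      import numpy as np
      from sklearn.covariance import GraphicalLassoCV

      rng = np.random.default_rng(0)
      X = rng.standard_normal((100, 20))          # 100 arrays x 20 genes (placeholder data)

      model = GraphicalLassoCV().fit(X)
      theta = model.precision_                    # estimated precision (inverse covariance) matrix

      # Partial correlation between genes i and j: -theta_ij / sqrt(theta_ii * theta_jj)
      d = np.sqrt(np.diag(theta))
      partial_corr = -theta / np.outer(d, d)
      np.fill_diagonal(partial_corr, 1.0)

      # Edges of the reconstructed network: gene pairs with non-zero partial correlation
      edges = [(i, j) for i in range(20) for j in range(i + 1, 20)
               if abs(partial_corr[i, j]) > 1e-6]
      print(len(edges), "edges recovered")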

  19. Automated dispenser

    SciTech Connect

    Hollen, R.M.; Stalnaker, N.D.

    1989-04-06

    An automated dispenser having a conventional pipette attached to an actuating cylinder through a flexible cable for delivering precise quantities of a liquid through commands from remotely located computer software. The travel of the flexible cable is controlled by adjustable stops and a locking shaft. The pipette can be positioned manually or by the hands of a robot. 1 fig.

  20. Automating Finance

    ERIC Educational Resources Information Center

    Moore, John

    2007-01-01

    In past years, higher education's financial management side has been riddled with manual processes and aging mainframe applications. This article discusses schools which had taken advantage of an array of technologies that automate billing, payment processing, and refund processing in the case of overpayment. The investments are well worth it:…

  1. Integrated Amplification Microarrays for Infectious Disease Diagnostics

    PubMed Central

    Chandler, Darrell P.; Bryant, Lexi; Griesemer, Sara B.; Gu, Rui; Knickerbocker, Christopher; Kukhtin, Alexander; Parker, Jennifer; Zimmerman, Cynthia; George, Kirsten St.; Cooney, Christopher G.

    2012-01-01

    This overview describes microarray-based tests that combine solution-phase amplification chemistry and microarray hybridization within a single microfluidic chamber. The integrated biochemical approach improves microarray workflow for diagnostic applications by reducing the number of steps and minimizing the potential for sample or amplicon cross-contamination. Examples described herein illustrate a basic, integrated approach for DNA and RNA genomes, and a simple consumable architecture for incorporating wash steps while retaining an entirely closed system. It is anticipated that integrated microarray biochemistry will provide an opportunity to significantly reduce the complexity and cost of microarray consumables, equipment, and workflow, which in turn will enable a broader spectrum of users to exploit the intrinsic multiplexing power of microarrays for infectious disease diagnostics.

  2. Microarray data analysis and mining approaches.

    PubMed

    Cordero, Francesca; Botta, Marco; Calogero, Raffaele A

    2007-12-01

    Microarray based transcription profiling is now a consolidated methodology and has widespread use in areas such as pharmacogenomics, diagnostics and drug target identification. Large-scale microarray studies are also becoming crucial to a new way of conceiving experimental biology. A main issue in microarray transcription profiling is data analysis and mining. When microarrays became a methodology of general use, considerable effort was made to produce algorithms and methods for the identification of differentially expressed genes. More recently, the focus has switched to algorithms and database development for microarray data mining. Furthermore, the evolution of microarray technology is allowing researchers to grasp the regulative nature of transcription, integrating basic expression analysis with mRNA characteristics, i.e. exon-based arrays, and with DNA characteristics, i.e. comparative genomic hybridization, single nucleotide polymorphism, tiling and promoter structure. In this article, we will review approaches used to detect differentially expressed genes and to link differential expression to specific biological functions.

  3. Array2BIO: A Comprehensive Suite of Utilities for the Analysis of Microarray Data

    SciTech Connect

    Loots, G G; Chain, P G; Mabery, S; Rasley, A; Garcia, E; Ovcharenko, I

    2006-02-13

    We have developed an integrative and automated toolkit for the analysis of Affymetrix microarray data, named Array2BIO. It identifies groups of coexpressed genes using two complementary approaches--comparative analysis of signal versus control microarrays and clustering analysis of gene expression across different conditions. The identified genes are assigned to functional categories based on the Gene Ontology classification, with detection of corresponding KEGG protein interaction pathways. Array2BIO reliably handles low-expressor genes and provides a set of statistical methods to quantify the odds of observations, including the Benjamini-Hochberg and Bonferroni multiple testing corrections. Automated interface with the ECR Browser provides evolutionary conservation analysis of identified gene loci while the interconnection with Creme allows high-throughput analysis of human promoter regions and prediction of gene regulatory elements that underlie the observed expression patterns. Array2BIO is publicly available at http://array2bio.dcode.org.
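
    The record above mentions Benjamini-Hochberg and Bonferroni multiple-testing corrections. The sketch below is a generic, textbook implementation of both adjustments for a vector of p-values; it is not Array2BIO code, and the example p-values are placeholders.

      # Generic illustration of the two multiple-testing corrections named above;
      # standard procedures, not Array2BIO's implementation.
      import numpy as np

      def bonferroni(pvals):
          p = np.asarray(pvals, dtype=float)
          return np.minimum(p * p.size, 1.0)

      def benjamini_hochberg(pvals):
          p = np.asarray(pvals, dtype=float)
          n = p.size
          order = np.argsort(p)
          ranked = p[order] * n / np.arange(1, n + 1)          # p_(i) * n / i
          ranked = np.minimum.accumulate(ranked[::-1])[::-1]   # enforce monotonicity
          adjusted = np.empty(n)
          adjusted[order] = np.minimum(ranked, 1.0)
          return adjusted

      pvals = [0.001, 0.008, 0.039, 0.041, 0.27, 0.60]
      print(bonferroni(pvals))
      print(benjamini_hochberg(pvals))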

  4. THE ABRF MARG MICROARRAY SURVEY 2005: TAKING THE PULSE ON THE MICROARRAY FIELD

    EPA Science Inventory

    Over the past several years microarray technology has evolved into a critical component of any discovery based program. Since 1999, the Association of Biomolecular Resource Facilities (ABRF) Microarray Research Group (MARG) has conducted biennial surveys designed to generate a pr...

  5. Living Cell Microarrays: An Overview of Concepts.

    PubMed

    Jonczyk, Rebecca; Kurth, Tracy; Lavrentieva, Antonina; Walter, Johanna-Gabriela; Scheper, Thomas; Stahl, Frank

    2016-01-01

    Living cell microarrays are a highly efficient cellular screening system. Due to the low number of cells required per spot, cell microarrays enable the use of primary and stem cells and provide resolution close to the single-cell level. Apart from a variety of conventional static designs, microfluidic microarray systems have also been established. An alternative format is a microarray consisting of three-dimensional cell constructs ranging from cell spheroids to cells encapsulated in hydrogel. These systems provide an in vivo-like microenvironment and are preferably used for the investigation of cellular physiology, cytotoxicity, and drug screening. Thus, many different high-tech microarray platforms are currently available. Disadvantages of many systems include their high cost, the requirement of specialized equipment for their manufacture, and the poor comparability of results between different platforms. In this article, we provide an overview of static, microfluidic, and 3D cell microarrays. In addition, we describe a simple method for the printing of living cell microarrays on modified microscope glass slides using standard DNA microarray equipment available in most laboratories. Applications in research and diagnostics are discussed, e.g., the selective and sensitive detection of biomarkers. Finally, we highlight current limitations and the future prospects of living cell microarrays. PMID:27600077

  6. Living Cell Microarrays: An Overview of Concepts

    PubMed Central

    Jonczyk, Rebecca; Kurth, Tracy; Lavrentieva, Antonina; Walter, Johanna-Gabriela; Scheper, Thomas; Stahl, Frank

    2016-01-01

    Living cell microarrays are a highly efficient cellular screening system. Due to the low number of cells required per spot, cell microarrays enable the use of primary and stem cells and provide resolution close to the single-cell level. Apart from a variety of conventional static designs, microfluidic microarray systems have also been established. An alternative format is a microarray consisting of three-dimensional cell constructs ranging from cell spheroids to cells encapsulated in hydrogel. These systems provide an in vivo-like microenvironment and are preferably used for the investigation of cellular physiology, cytotoxicity, and drug screening. Thus, many different high-tech microarray platforms are currently available. Disadvantages of many systems include their high cost, the requirement of specialized equipment for their manufacture, and the poor comparability of results between different platforms. In this article, we provide an overview of static, microfluidic, and 3D cell microarrays. In addition, we describe a simple method for the printing of living cell microarrays on modified microscope glass slides using standard DNA microarray equipment available in most laboratories. Applications in research and diagnostics are discussed, e.g., the selective and sensitive detection of biomarkers. Finally, we highlight current limitations and the future prospects of living cell microarrays. PMID:27600077

  7. Living Cell Microarrays: An Overview of Concepts

    PubMed Central

    Jonczyk, Rebecca; Kurth, Tracy; Lavrentieva, Antonina; Walter, Johanna-Gabriela; Scheper, Thomas; Stahl, Frank

    2016-01-01

    Living cell microarrays are a highly efficient cellular screening system. Due to the low number of cells required per spot, cell microarrays enable the use of primary and stem cells and provide resolution close to the single-cell level. Apart from a variety of conventional static designs, microfluidic microarray systems have also been established. An alternative format is a microarray consisting of three-dimensional cell constructs ranging from cell spheroids to cells encapsulated in hydrogel. These systems provide an in vivo-like microenvironment and are preferably used for the investigation of cellular physiology, cytotoxicity, and drug screening. Thus, many different high-tech microarray platforms are currently available. Disadvantages of many systems include their high cost, the requirement of specialized equipment for their manufacture, and the poor comparability of results between different platforms. In this article, we provide an overview of static, microfluidic, and 3D cell microarrays. In addition, we describe a simple method for the printing of living cell microarrays on modified microscope glass slides using standard DNA microarray equipment available in most laboratories. Applications in research and diagnostics are discussed, e.g., the selective and sensitive detection of biomarkers. Finally, we highlight current limitations and the future prospects of living cell microarrays.

  8. Highly parallel microbial diagnostics using oligonucleotide microarrays.

    PubMed

    Loy, Alexander; Bodrossy, Levente

    2006-01-01

    Oligonucleotide microarrays are highly parallel hybridization platforms, allowing rapid and simultaneous identification of many different microorganisms and viruses in a single assay. In the past few years, researchers have been confronted with a dramatic increase in the number of studies reporting development and/or improvement of oligonucleotide microarrays for microbial diagnostics, but use of the technology in routine diagnostics is still constrained by a variety of factors. Careful development of microarray essentials (such as oligonucleotide probes, protocols for target preparation and hybridization, etc.) combined with extensive performance testing are thus mandatory requirements for the maturation of diagnostic microarrays from fancy technological gimmicks to robust and routinely applicable tools.

  9. 2008 Microarray Research Group (MARG Survey): Sensing the State of Microarray Technology

    EPA Science Inventory

    Over the past several years, the field of microarrays has grown and evolved drastically. In its continued efforts to track this evolution and transformation, the ABRF-MARG has once again conducted a survey of international microarray facilities and individual microarray users. Th...

  10. THE ABRF-MARG MICROARRAY SURVEY 2004: TAKING THE PULSE OF THE MICROARRAY FIELD

    EPA Science Inventory

    Over the past several years, the field of microarrays has grown and evolved drastically. In its continued efforts to track this evolution, the ABRF-MARG has once again conducted a survey of international microarray facilities and individual microarray users. The goal of the surve...

  11. Systematic review of accuracy of prenatal diagnosis for abnormal chromosome diseases by microarray technology.

    PubMed

    Xu, H B; Yang, H; Liu, G; Chen, H

    2014-10-31

    The accuracies of prenatal diagnosis for abnormal chromosome diseases by chromosome microarray technology and by karyotyping were compared. A literature search was carried out in the MEDLINE database with the keywords "chromosome", "karyotype", "genetic testing", "prenatal diagnosis", and "oligonucleotide array sequence". The studies obtained were filtered using the QUADAS tool, and studies conforming to the quality standard were fully analyzed. One paper conformed to the QUADAS standards, covering 4406 gravidas with indications for prenatal diagnosis, including advanced maternal age, abnormal structure on type-B ultrasound, and other abnormalities. Microarray technology yielded successful diagnoses in 4340 cases (98.8%), and tissue culture was unnecessary in 87.9% of the samples. All aneuploidies and unbalanced translocations in the 4282 non-chimeric cases identified by karyotyping could be detected by microarray analysis, whereas balanced translocations and fetal triploidies could not. In samples with normal karyotyping results but abnormal type-B ultrasound findings, microarray technology detected chromosomal deletions or duplications in 6% of cases, and the same chromosomal abnormalities were detected in 1.7% of women of advanced maternal age and of samples with positive serology screening results. In prenatal diagnosis, compared with karyotyping, microarray technology could identify additional, clinically significant cytogenetic information, aneuploidies, and unbalanced translocations; its disadvantage is that it could not identify balanced translocations and triploidies.

  12. Automated Methods for Multiplexed Pathogen Detection

    SciTech Connect

    Straub, Tim M.; Dockendorff, Brian P.; Quinonez-Diaz, Maria D.; Valdez, Catherine O.; Shutthanandan, Janani I.; Tarasevich, Barbara J.; Grate, Jay W.; Bruckner-Lea, Cindy J.

    2005-09-01

    Detection of pathogenic microorganisms in environmental samples is a difficult process. Concentration of the organisms of interest also co-concentrates inhibitors of many end-point detection methods, notably nucleic acid methods. In addition, sensitive, highly multiplexed pathogen detection continues to be problematic. The primary function of the BEADS (Biodetection Enabling Analyte Delivery System) platform is the automated concentration and purification of target analytes from interfering substances, often present in these samples, via a renewable surface column. In one version of BEADS, automated immunomagnetic separation (IMS) is used to separate cells from their samples. Captured cells are transferred to a flow-through thermal cycler where PCR, using labeled primers, is performed. PCR products are then detected by hybridization to a DNA suspension array. In another version of BEADS, cell lysis is performed, and community RNA is purified and directly labeled. Multiplexed detection is accomplished by direct hybridization of the RNA to a planar microarray. The integrated IMS/PCR version of BEADS can successfully purify and amplify 10 E. coli O157:H7 cells from river water samples. Multiplexed PCR assays for the simultaneous detection of E. coli O157:H7, Salmonella, and Shigella on bead suspension arrays were demonstrated for the detection of as few as 100 cells of each organism. The RNA version of BEADS is also showing promising results. Automation yields highly purified RNA, suitable for multiplexed detection on microarrays, with microarray detection specificity equivalent to PCR. Both versions of the BEADS platform show great promise for automated pathogen detection from environmental samples. Highly multiplexed pathogen detection using PCR continues to be problematic, but may be required for trace detection in large volume samples. The RNA approach solves the issues of highly multiplexed PCR and provides "live vs. dead" capabilities. However

  13. Innovative instrumentation for microarray scanning and analysis: application for characterization of oligonucleotide duplexes behavior.

    PubMed

    Khomyakova, E B; Dreval, E V; Tran-Dang, M; Potier, M C; Soussaline, F P

    2004-05-01

    Accuracy in microarray technology requires new approaches to microarray reader development. A microarray reader system (optical scanning array or OSA reader) based on automated microscopy with a large field of view and high-speed, three-axis scanning at multiple narrow-band spectra of excitation light has been developed. It allows fast capture of high-resolution, multi-fluorescence images and is characterized by a linear dynamic range and sensitivity comparable to commonly used photo-multiplier tube (PMT)-based laser scanners. Controlled by high-performance software, the instrument can be used for scanning and quantitative analysis of any type of dry microarray. Studies employing a temperature-controlled hybridization chamber containing a microarray can also be performed. This enables the registration of kinetics and melting curves. This feature is required in a wide range of on-chip chemical and enzymatic reactions including on-chip PCR amplification. We used the OSA reader for the characterization of hybridization and melting behaviour of oligonucleotide:oligonucleotide duplexes on three-dimensional CodeLink slides. PMID:15209342
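
    The record above describes recording melting curves on-chip. As a hedged illustration of how such a curve might be reduced to an apparent melting temperature, the sketch below fits a logistic melting model to synthetic intensity-versus-temperature data; the functional form, the data, and the fitting choices are assumptions, not the analysis implemented in the OSA reader software.

      # Illustrative sketch: fit a logistic melting curve to spot intensity vs. temperature
      # to estimate an apparent melting temperature (Tm). Model and data are placeholders.
      import numpy as np
      from scipy.optimize import curve_fit

      def melt(T, low, high, Tm, width):
          """Duplex signal remaining as temperature rises (decreasing sigmoid)."""
          return low + (high - low) / (1.0 + np.exp((T - Tm) / width))

      T = np.arange(25, 86, 5, dtype=float)                      # degrees Celsius
      true_curve = melt(T, 100.0, 1000.0, 55.0, 3.0)
      signal = true_curve + np.random.default_rng(1).normal(0, 20, T.size)

      params, _ = curve_fit(melt, T, signal, p0=[signal.min(), signal.max(), 50.0, 5.0])
      print("estimated Tm: %.1f degC" % params[2])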

  14. Development of a sensitive DNA microarray suitable for rapid detection of Campylobacter spp.

    PubMed

    Keramas, Georgios; Bang, Dang Duong; Lund, Marianne; Madsen, Mogens; Rasmussen, Svend Erik; Bunkenborg, Henrik; Telleman, Pieter; Christensen, Claus Bo Vöge

    2003-08-01

    Campylobacter is the most common cause of human acute bacterial gastroenteritis worldwide, widely distributed and isolated from human clinical samples as well as from many other different sources. To comply with the demands of consumers for food safety, there is a need for development of a rapid, sensitive and specific detection method for Campylobacter. In this study, we present the development of a novel sensitive DNA-microarray based detection method, evaluated on Campylobacter and non-Campylobacter reference strains, to detect Campylobacter directly from faecal cloacal swabs. The DNA-microarray method consists of two steps: first, both universal bacterial sequences and specific Campylobacter sequences (size range: 149-307 bp) are amplified and fluorescently labeled using multiplex-PCR, targeting the 16S rRNA, the 16S-23S rRNA intergenic region and specific Campylobacter genes. Secondly, the Cy5-labeled PCR amplicons are hybridised to immobilised capture probes on the microarray. The method allows detection of three to thirty genome equivalents (6-60 fg DNA) of Campylobacter within 3 h, with a hands-on time of only 15 min. Using the DNA-microarrays, two closely related Campylobacter species, Campylobacter jejuni and Campylobacter coli, could be detected and differentiated directly from chicken faeces. The DNA-microarray method has a high potential for automation and incorporation into a dedicated mass screening microsystem.

  15. Studying bovine early embryo transcriptome by microarray.

    PubMed

    Dufort, Isabelle; Robert, Claude; Sirard, Marc-André

    2015-01-01

    Microarrays represent a significant advantage when studying gene expression in early embryo because they allow for a speedy study of a large number of genes even if the sample of interest contains small quantities of genetic material. Here we describe the protocols developed by the EmbryoGENE Network to study the bovine transcriptome in early embryo using a microarray experimental design.

  16. Microarrays Made Simple: "DNA Chips" Paper Activity

    ERIC Educational Resources Information Center

    Barnard, Betsy

    2006-01-01

    DNA microarray technology is revolutionizing biological science. DNA microarrays (also called DNA chips) allow simultaneous screening of many genes for changes in expression between different cells. Now researchers can obtain information about genes in days or weeks that used to take months or years. The paper activity described in this article…

  17. Application of microarray technology in pulmonary diseases

    PubMed Central

    Tzouvelekis, Argyris; Patlakas, George; Bouros, Demosthenes

    2004-01-01

    Microarrays are a powerful tool that have multiple applications both in clinical and cell biology arenas of common lung diseases. To exemplify how this tool can be useful, in this review, we will provide an overview of the application of microarray technology in research relevant to common lung diseases and present some of the future perspectives. PMID:15585067

  18. Sensing immune responses with customized peptide microarrays.

    PubMed

    Schirwitz, Christopher; Loeffler, Felix F; Felgenhauer, Thomas; Stadler, Volker; Breitling, Frank; Bischoff, F Ralf

    2012-12-01

    The intent to solve biological and biomedical questions in high-throughput led to an immense interest in microarray technologies. Nowadays, DNA microarrays are routinely used to screen for oligonucleotide interactions within a large variety of potential interaction partners. To study interactions on the protein level with the same efficiency, protein and peptide microarrays offer similar advantages, but their production is more demanding. A new technology to produce peptide microarrays with a laser printer provides access to affordable and highly complex peptide microarrays. Such a peptide microarray can contain up to 775 peptide spots per cm², whereby the position of each peptide spot and, thus, the amino acid sequence of the corresponding peptide, is exactly known. Compared to other techniques, such as the SPOT synthesis, more features per cm² at lower costs can be synthesized which paves the way for laser printed peptide microarrays to take on roles as efficient and affordable biomedical sensors. Here, we describe the laser printer-based synthesis of peptide microarrays and focus on an application involving the blood sera of tetanus immunized individuals, indicating the potential of peptide arrays to sense immune responses.

  19. Testing fully depleted CCD

    NASA Astrophysics Data System (ADS)

    Casas, Ricard; Cardiel-Sas, Laia; Castander, Francisco J.; Jiménez, Jorge; de Vicente, Juan

    2014-08-01

    The focal plane of the PAU camera is composed of eighteen 2K x 4K CCDs. These devices, plus four spares, were provided by the Japanese company Hamamatsu Photonics K.K. with type no. S10892-04(X). These detectors are 200 μm thick, fully depleted and back illuminated, with an n-type silicon base. They have been built with a specific coating to be sensitive in the range from 300 to 1,100 nm. Their square pixel size is 15 μm. The read-out system consists of a Monsoon controller (NOAO) and the panVIEW software package. The default CCD read-out speed is 133 kpixel/s. This is the value used in the calibration process. Before installing these devices in the camera focal plane, they were characterized using the facilities of the ICE (CSIC-IEEC) and IFAE in the UAB Campus in Bellaterra (Barcelona, Catalonia, Spain). The basic tests performed for all CCDs were to obtain the photon transfer curve (PTC), the charge transfer efficiency (CTE) using X-rays and the EPER method, linearity, read-out noise, dark current, persistence, cosmetics and quantum efficiency. The X-ray images were also used for the analysis of the charge diffusion for different substrate voltages (VSUB). Regarding the cosmetics, and in addition to white and dark pixels, some patterns were also found. The first one, which appears in all devices, is the presence of half circles at the external edges. The origin of this pattern may be related to the assembly process. A second one appears in the dark images, and shows bright arcs connecting corners along the vertical axis of the CCD. This feature appears in all CCDs in exactly the same position, so our guess is that the pattern is due to electrical fields. Finally, and in just two devices, there is a spot with wavelength dependence whose origin could be the result of a defective coating process.
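
    The record above lists the photon transfer curve (PTC) among the basic characterization tests. As a hedged, minimal illustration of how a PTC yields the gain, the sketch below takes the gain in e-/ADU as the inverse of the slope of variance versus mean signal for pairs of flat-field exposures; the synthetic frames and illumination levels are assumptions standing in for real calibration data.

      # Minimal PTC sketch: gain (e-/ADU) ~= 1 / slope of variance vs. mean signal
      # computed from pairs of flat fields. Synthetic frames replace real exposures.
      import numpy as np

      rng = np.random.default_rng(2)
      gain_true = 2.5                                       # e-/ADU, used only to simulate frames
      means, variances = [], []
      for level_e in (1000, 5000, 20000, 50000, 100000):    # illumination levels in electrons
          flat1 = rng.poisson(level_e, (512, 512)) / gain_true
          flat2 = rng.poisson(level_e, (512, 512)) / gain_true
          means.append((flat1.mean() + flat2.mean()) / 2.0)
          variances.append(np.var(flat1 - flat2) / 2.0)     # difference removes fixed pattern

      slope, _ = np.polyfit(means, variances, 1)
      print("estimated gain: %.2f e-/ADU" % (1.0 / slope))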

  20. Space science experimentation automation and support

    NASA Technical Reports Server (NTRS)

    Frainier, Richard J.; Groleau, Nicolas; Shapiro, Jeff C.

    1994-01-01

    This paper outlines recent work done at the NASA Ames Artificial Intelligence Research Laboratory on automation and support of science experiments on the US Space Shuttle in low earth orbit. Three approaches to increasing the science return of these experiments using emerging automation technologies are described: remote control (telescience), science advisors for astronaut operators, and fully autonomous experiments. The capabilities and limitations of these approaches are reviewed.

  1. Microarray Applications in Microbial Ecology Research.

    SciTech Connect

    Gentry, T.; Schadt, C.; Zhou, J.

    2006-04-06

    Microarray technology has the unparalleled potential to simultaneously determine the dynamics and/or activities of most, if not all, of the microbial populations in complex environments such as soils and sediments. Researchers have developed several types of arrays that characterize the microbial populations in these samples based on their phylogenetic relatedness or functional genomic content. Several recent studies have used these microarrays to investigate ecological issues; however, most have only analyzed a limited number of samples with relatively few experiments utilizing the full high-throughput potential of microarray analysis. This is due in part to the unique analytical challenges that these samples present with regard to sensitivity, specificity, quantitation, and data analysis. This review discusses specific applications of microarrays to microbial ecology research along with some of the latest studies addressing the difficulties encountered during analysis of complex microbial communities within environmental samples. With continued development, microarray technology may ultimately achieve its potential for comprehensive, high-throughput characterization of microbial populations in near real-time.

  2. Real-time DNA microarray analysis

    PubMed Central

    Hassibi, Arjang; Vikalo, Haris; Riechmann, José Luis; Hassibi, Babak

    2009-01-01

    We present a quantification method for affinity-based DNA microarrays which is based on the real-time measurements of hybridization kinetics. This method, i.e. real-time DNA microarrays, enhances the detection dynamic range of conventional systems by being impervious to probe saturation in the capturing spots, washing artifacts, microarray spot-to-spot variations, and other signal amplitude-affecting non-idealities. We demonstrate in both theory and practice that the time-constant of target capturing in microarrays, similar to all affinity-based biosensors, is inversely proportional to the concentration of the target analyte, which we subsequently use as the fundamental parameter to estimate the concentration of the analytes. Furthermore, to empirically validate the capabilities of this method in practical applications, we present a FRET-based assay which enables the real-time detection in gene expression DNA microarrays. PMID:19723688
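
    The central idea in the record above is that the capture time-constant is inversely proportional to target concentration. As a hedged sketch of that quantification step, the code below fits an exponential capture model s(t) = A(1 - exp(-t/tau)) to a simulated real-time signal and converts 1/tau into a concentration estimate; the model form, the data, and the calibration constant are assumptions, not the authors' assay.

      # Sketch of the quantification idea: fit capture kinetics and use 1/tau (proportional
      # to target concentration per the abstract) as the read-out. Data are placeholders.
      import numpy as np
      from scipy.optimize import curve_fit

      def capture(t, amplitude, tau):
          return amplitude * (1.0 - np.exp(-t / tau))

      t = np.linspace(0, 120, 25)                           # minutes
      rng = np.random.default_rng(3)
      signal = capture(t, 1500.0, 30.0) + rng.normal(0, 25, t.size)

      (amplitude, tau), _ = curve_fit(capture, t, signal, p0=[signal.max(), 10.0])
      k_cal = 600.0                                         # hypothetical calibration constant
      print("tau = %.1f min, estimated concentration ~ %.1f pM" % (tau, k_cal / tau))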

  3. Real-time DNA microarray analysis.

    PubMed

    Hassibi, Arjang; Vikalo, Haris; Riechmann, José Luis; Hassibi, Babak

    2009-11-01

    We present a quantification method for affinity-based DNA microarrays which is based on the real-time measurements of hybridization kinetics. This method, i.e. real-time DNA microarrays, enhances the detection dynamic range of conventional systems by being impervious to probe saturation in the capturing spots, washing artifacts, microarray spot-to-spot variations, and other signal amplitude-affecting non-idealities. We demonstrate in both theory and practice that the time-constant of target capturing in microarrays, similar to all affinity-based biosensors, is inversely proportional to the concentration of the target analyte, which we subsequently use as the fundamental parameter to estimate the concentration of the analytes. Furthermore, to empirically validate the capabilities of this method in practical applications, we present a FRET-based assay which enables the real-time detection in gene expression DNA microarrays. PMID:19723688

  4. Tissue Microarrays in Clinical Oncology

    PubMed Central

    Voduc, David; Kenney, Challayne; Nielsen, Torsten O.

    2008-01-01

    The tissue microarray is a recently-implemented, high-throughput technology for the analysis of molecular markers in oncology. This research tool permits the rapid assessment of a biomarker in thousands of tumor samples, using commonly available laboratory assays such as immunohistochemistry and in-situ hybridization. Although introduced less than a decade ago, the TMA has proven to be invaluable in the study of tumor biology, the development of diagnostic tests, and the investigation of oncological biomarkers. This review describes the impact of TMA-based research in clinical oncology and its potential future applications. Technical aspects of TMA construction, and the advantages and disadvantages inherent to this technology are also discussed. PMID:18314063

  5. DNA Microarrays for Identifying Fishes

    PubMed Central

    Nölte, M.; Weber, H.; Silkenbeumer, N.; Hjörleifsdottir, S.; Hreggvidsson, G. O.; Marteinsson, V.; Kappel, K.; Planes, S.; Tinti, F.; Magoulas, A.; Garcia Vazquez, E.; Turan, C.; Hervet, C.; Campo Falgueras, D.; Antoniou, A.; Landi, M.; Blohm, D.

    2008-01-01

    In many cases marine organisms and especially their diverse developmental stages are difficult to identify by morphological characters. DNA-based identification methods offer an analytically powerful addition or even an alternative. In this study, a DNA microarray has been developed to be able to investigate its potential as a tool for the identification of fish species from European seas based on mitochondrial 16S rDNA sequences. Eleven commercially important fish species were selected for a first prototype. Oligonucleotide probes were designed based on the 16S rDNA sequences obtained from 230 individuals of 27 fish species. In addition, more than 1200 sequences of 380 species served as sequence background against which the specificity of the probes was tested in silico. Single target hybridisations with Cy5-labelled, PCR-amplified 16S rDNA fragments from each of the 11 species on microarrays containing the complete set of probes confirmed their suitability. True-positive, fluorescence signals obtained were at least one order of magnitude stronger than false-positive cross-hybridisations. Single nontarget hybridisations resulted in cross-hybridisation signals at approximately 27% of the cases tested, but all of them were at least one order of magnitude lower than true-positive signals. This study demonstrates that the 16S rDNA gene is suitable for designing oligonucleotide probes, which can be used to differentiate 11 fish species. These data are a solid basis for the second step to create a “Fish Chip” for approximately 50 fish species relevant in marine environmental and fisheries research, as well as control of fisheries products. PMID:18270778

  6. Automated lithocell

    NASA Astrophysics Data System (ADS)

    Englisch, Andreas; Deuter, Armin

    1990-06-01

    Integration and automation have gained more and more ground in modern IC-manufacturing. It is difficult to make a direct calculation of the profit these investments yield. On the other hand, the demands on man, machine and technology have increased enormously of late; it is not difficult to see that only by means of integration and automation can these demands be coped with. Here are some salient points: the complexity and costs incurred by the equipment and processes have become significantly higher; owing to the reduction of all dimensions, the tolerances within which the various process steps have to be carried out have become smaller and smaller, and adherence to these tolerances more and more difficult; the cycle time has become more and more important, both for the development and control of new processes and, to a great extent, for a rapid and reliable supply to the customer. In order that the products be competitive under these conditions, all sorts of costs have to be reduced and the yield has to be maximized. Therefore, computer-aided control of the equipment and the process, combined with automatic data collection and real-time SPC (statistical process control), has become absolutely necessary for successful IC-manufacturing. Human errors must be eliminated from the execution of the various process steps by automation. The work time set free in this way makes it possible for human creativity to be employed on a larger scale in stabilizing the processes. Besides, computer-aided equipment control can ensure the optimal utilization of the equipment round the clock.

  7. Microarray-integrated optoelectrofluidic immunoassay system.

    PubMed

    Han, Dongsik; Park, Je-Kyun

    2016-05-01

    A microarray-based analytical platform has been utilized as a powerful tool in biological assay fields. However, an analyte depletion problem due to the slow mass transport based on molecular diffusion causes low reaction efficiency, resulting in a limitation for practical applications. This paper presents a novel method to improve the efficiency of microarray-based immunoassay via an optically induced electrokinetic phenomenon by integrating an optoelectrofluidic device with a conventional glass slide-based microarray format. A sample droplet was loaded between the microarray slide and the optoelectrofluidic device on which a photoconductive layer was deposited. Under the application of an AC voltage, optically induced AC electroosmotic flows caused by a microarray-patterned light actively enhanced the mass transport of target molecules at the multiple assay spots of the microarray simultaneously, which reduced tedious reaction time from more than 30 min to 10 min. Based on this enhancing effect, a heterogeneous immunoassay with a tiny volume of sample (5 μl) was successfully performed in the microarray-integrated optoelectrofluidic system using immunoglobulin G (IgG) and anti-IgG, resulting in improved efficiency compared to the static environment. Furthermore, the application of multiplex assays was also demonstrated by multiple protein detection.

  8. MAAMD: a workflow to standardize meta-analyses and comparison of affymetrix microarray data

    PubMed Central

    2014-01-01

    Background Mandatory deposit of raw microarray data files for public access, prior to study publication, provides significant opportunities to conduct new bioinformatics analyses within and across multiple datasets. Analysis of raw microarray data files (e.g. Affymetrix CEL files) can be time consuming, complex, and requires fundamental computational and bioinformatics skills. The development of analytical workflows to automate these tasks simplifies the processing of, improves the efficiency of, and serves to standardize multiple and sequential analyses. Once installed, workflows facilitate the tedious steps required to run rapid intra- and inter-dataset comparisons. Results We developed a workflow to facilitate and standardize Meta-Analysis of Affymetrix Microarray Data analysis (MAAMD) in Kepler. Two freely available stand-alone software tools, R and AltAnalyze, were embedded in MAAMD. The inputs of MAAMD are user-editable csv files, which contain sample information and parameters describing the locations of input files and required tools. MAAMD was tested by analyzing 4 different GEO datasets from mice and Drosophila. MAAMD automates data downloading, data organization, data quality control assessment, differential gene expression analysis, clustering analysis, pathway visualization, gene-set enrichment analysis, and cross-species orthologous-gene comparisons. MAAMD was utilized to identify gene orthologues responding to hypoxia or hyperoxia in both mice and Drosophila. The entire set of analyses for 4 datasets (34 total microarrays) finished in ~1 hour. Conclusions MAAMD saves time, minimizes the required computer skills, and offers a standardized procedure for users to analyze microarray datasets and make new intra- and inter-dataset comparisons. PMID:24621103
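
    The record above describes a workflow driven by user-editable csv files that parameterize each dataset. The sketch below only illustrates that csv-driven pattern: a small sample sheet drives one analysis command per dataset. The column names, file paths, and the Rscript command are assumptions; MAAMD itself is implemented as a Kepler workflow embedding R and AltAnalyze.

      # Illustrative sketch of a csv-driven batch pattern (not MAAMD code).
      import csv
      import io

      sample_sheet = io.StringIO(
          "geo_accession,cel_dir,species\n"
          "GSE0001,/data/gse0001_cel,mouse\n"
          "GSE0002,/data/gse0002_cel,drosophila\n"
      )

      for row in csv.DictReader(sample_sheet):
          # In a real workflow this command would be executed (e.g. via subprocess.run);
          # here it is only printed to show how the sheet parameterizes each run.
          cmd = ["Rscript", "analyze_dataset.R",
                 row["geo_accession"], row["cel_dir"], row["species"]]
          print(" ".join(cmd))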

  9. Improved statistical analysis of budding yeast TAG microarrays revealed by defined spike-in pools.

    PubMed

    Peyser, Brian D; Irizarry, Rafael A; Tiffany, Carol W; Chen, Ou; Yuan, Daniel S; Boeke, Jef D; Spencer, Forrest A

    2005-09-15

    Saccharomyces cerevisiae knockout collection TAG microarrays are an emergent platform for rapid, genome-wide functional characterization of yeast genes. TAG arrays report abundance of unique oligonucleotide 'TAG' sequences incorporated into each deletion mutation of the yeast knockout collection, allowing measurement of relative strain representation across experimental conditions for all knockout mutants simultaneously. One application of TAG arrays is to perform genome-wide synthetic lethality screens, known as synthetic lethality analyzed by microarray (SLAM). We designed a fully defined spike-in pool to resemble typical SLAM experiments and performed TAG microarray hybridizations. We describe a method for analyzing two-color array data to efficiently measure the differential knockout strain representation across two experimental conditions, and use the spike-in pool to show that the sensitivity and specificity of this method exceed typical current approaches.
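
    The record above measures differential knockout-strain representation between two conditions from two-color TAG arrays. As a generic, hedged sketch of that comparison (not the authors' published analysis), the code below computes per-TAG log2 ratios between the two channels, applies a simple median normalization, and flags strongly depleted strains; the TAG names, intensities, and cutoff are placeholders.

      # Generic sketch: per-TAG log2 ratios, median-centred, ranked to flag depleted strains.
      import numpy as np

      tags = ["YAL001C", "YBR002W", "YCL003C", "YDR004W"]     # placeholder TAG identifiers
      control = np.array([5200.0, 830.0, 1500.0, 2400.0])     # channel 1 intensities
      experiment = np.array([5100.0, 790.0, 90.0, 2350.0])    # channel 2 intensities

      log_ratio = np.log2(experiment / control)
      log_ratio -= np.median(log_ratio)                       # simple median normalization

      for tag, lr in sorted(zip(tags, log_ratio), key=lambda x: x[1]):
          flag = "  <- candidate synthetic-lethal interaction" if lr < -2 else ""
          print("%s  %+.2f%s" % (tag, lr, flag))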

  10. Automated office blood pressure.

    PubMed

    Myers, Martin G; Godwin, Marshall

    2012-05-01

    Manual blood pressure (BP) is gradually disappearing from clinical practice with the mercury sphygmomanometer now considered to be an environmental hazard. Manual BP is also subject to measurement error on the part of the physician/nurse and patient-related anxiety which can result in poor quality BP measurements and office-induced (white coat) hypertension. Automated office (AO) BP with devices such as the BpTRU (BpTRU Medical Devices, Coquitlam, BC) has already replaced conventional manual BP in many primary care practices in Canada and has also attracted interest in other countries where research studies using AOBP have been undertaken. The basic principles of AOBP include multiple readings taken with a fully automated recorder with the patient resting alone in a quiet room. When these principles are followed, office-induced hypertension is eliminated and AOBP exhibits a much stronger correlation with the awake ambulatory BP as compared with routine manual BP measurements. Unlike routine manual BP, AOBP correlates as well with left ventricular mass as does the awake ambulatory BP. AOBP also simplifies the definition of hypertension in that the cut point for a normal AOBP (< 135/85 mm Hg) is the same as for the awake ambulatory BP and home BP. This article summarizes the currently available evidence supporting the use of AOBP in routine clinical practice and proposes an algorithm in which AOBP replaces manual BP for the diagnosis and management of hypertension. PMID:22265230

  11. Progress in the application of DNA microarrays.

    PubMed Central

    Lobenhofer, E K; Bushel, P R; Afshari, C A; Hamadeh, H K

    2001-01-01

    Microarray technology has been applied to a variety of different fields to address fundamental research questions. The use of microarrays, or DNA chips, to study the gene expression profiles of biologic samples began in 1995. Since that time, the fundamental concepts behind the chip, the technology required for making and using these chips, and the multitude of statistical tools for analyzing the data have been extensively reviewed. For this reason, the focus of this review will be not on the technology itself but on the application of microarrays as a research tool and the future challenges of the field. PMID:11673116

  12. DNA Microarrays in Herbal Drug Research

    PubMed Central

    Chavan, Preeti; Joshi, Kalpana; Patwardhan, Bhushan

    2006-01-01

    Natural products are gaining increased applications in drug discovery and development. Being chemically diverse they are able to modulate several targets simultaneously in a complex system. Analysis of gene expression becomes necessary for better understanding of molecular mechanisms. Conventional strategies for expression profiling are optimized for single gene analysis. DNA microarrays serve as suitable high throughput tool for simultaneous analysis of multiple genes. Major practical applicability of DNA microarrays remains in DNA mutation and polymorphism analysis. This review highlights applications of DNA microarrays in pharmacodynamics, pharmacogenomics, toxicogenomics and quality control of herbal drugs and extracts. PMID:17173108

  13. Maximizing Your Investment in Building Automation System Technology.

    ERIC Educational Resources Information Center

    Darnell, Charles

    2001-01-01

    Discusses how organizational issues and system standardization can be important factors that determine an institution's ability to fully exploit contemporary building automation systems (BAS). Further presented is management strategy for maximizing BAS investments. (GR)

  14. Comparison of microarray preprocessing methods.

    PubMed

    Shakya, K; Ruskin, H J; Kerr, G; Crane, M; Becker, J

    2010-01-01

    Data preprocessing in microarray technology is a crucial initial step before data analysis is performed. Many preprocessing methods have been proposed but none has proved to be ideal to date. Frequently, datasets are limited by laboratory constraints so that the need is for guidelines on quality and robustness, to inform further experimentation while data are yet restricted. In this paper, we compared the performance of four popular methods, namely MAS5, Li & Wong pmonly (LWPM), Li & Wong subtractMM (LWMM), and Robust Multichip Average (RMA). The comparison is based on the analysis carried out on sets of laboratory-generated data from the Bioinformatics Lab, National Institute of Cellular Biotechnology (NICB), Dublin City University, Ireland. These experiments were designed to examine the effect of Bromodeoxyuridine (5-bromo-2-deoxyuridine, BrdU) treatment in deep lamellar keratoplasty (DLKP) cells. The methodology employed is to assess dispersion across the replicates and analyze the false discovery rate. From the dispersion analysis, we found that variability is reduced more effectively by LWPM and RMA methods. From the false positive analysis, and for both parametric and nonparametric approaches, LWMM is found to perform best. Based on a complementary q-value analysis, LWMM approach again is the strongest candidate. The indications are that, while LWMM is marginally less effective than LWPM and RMA in terms of variance reduction, it has considerably improved discrimination overall.
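
    The comparison described above assesses how much dispersion each preprocessing method leaves across replicates. As a hedged illustration of that kind of check (not the authors' exact analysis), the sketch below computes a per-gene coefficient of variation across replicate arrays and summarizes it per method; the expression matrices are synthetic placeholders standing in for MAS5, LWPM, LWMM and RMA output.

      # Sketch of a replicate-dispersion comparison: median per-gene CV for each method.
      import numpy as np

      rng = np.random.default_rng(4)
      preprocessed = {
          "method_A": rng.normal(8.0, 0.30, size=(1000, 4)),   # 1000 genes x 4 replicates
          "method_B": rng.normal(8.0, 0.15, size=(1000, 4)),
      }

      for name, expr in preprocessed.items():
          cv = expr.std(axis=1, ddof=1) / expr.mean(axis=1)
          print("%s: median per-gene CV = %.3f" % (name, np.median(cv)))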

  15. AMIC@: All MIcroarray Clusterings @ once.

    PubMed

    Geraci, Filippo; Pellegrini, Marco; Renda, M Elena

    2008-07-01

    The AMIC@ Web Server offers a light-weight multi-method clustering engine for microarray gene-expression data. AMIC@ is a highly interactive tool that stresses user-friendliness and robustness by adopting AJAX technology, thus allowing an effective interleaved execution of different clustering algorithms and inspection of results. Among the salient features AMIC@ offers are: (i) automatic file format detection, (ii) suggestions on the number of clusters using a variant of the stability-based method of Tibshirani et al., (iii) intuitive visual inspection of the data via heatmaps, and (iv) measurements of clustering quality using cluster homogeneity. Large data sets can be processed efficiently by selecting algorithms (such as FPF-SB and k-Boost) specifically designed for this purpose. In the case of very large data sets, the user can opt for a batch-mode use of the system by means of the Clustering wizard that runs all algorithms at once and delivers the results via email. AMIC@ is freely available and open to all users with no login requirement at the following URL http://bioalgo.iit.cnr.it/amica.
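
    The record above evaluates clusterings via a cluster-homogeneity measure and a stability-based suggestion for the number of clusters. As a rough illustration of the homogeneity idea only (not AMIC@'s implementation), the sketch below clusters synthetic expression profiles with k-means for several candidate k and reports an ad hoc homogeneity score based on distance to cluster centroids; the data, the score definition, and the range of k are assumptions.

      # Rough illustration of a cluster-homogeneity measurement across candidate k values.
      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(5)
      expr = rng.normal(size=(300, 10))                 # 300 genes x 10 conditions (placeholder)

      baseline = np.linalg.norm(expr - expr.mean(axis=0), axis=1).mean()
      for k in (2, 4, 8, 16):
          km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(expr)
          within = np.linalg.norm(expr - km.cluster_centers_[km.labels_], axis=1).mean()
          homogeneity = 1.0 - within / baseline         # ad hoc score: 1 = perfectly tight clusters
          print("k=%2d  homogeneity=%.3f" % (k, homogeneity))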

  16. Microarrays for Genotyping Human Group A Rotavirus by Multiplex Capture and Type-Specific Primer Extension

    PubMed Central

    Lovmar, Lovisa; Fock, Caroline; Espinoza, Felix; Bucardo, Filemon; Syvänen, Ann-Christine; Bondeson, Kåre

    2003-01-01

    Human group A rotavirus (HRV) is the major cause of severe gastroenteritis in infants worldwide. HRV shares the feature of a high degree of genetic diversity with many other RNA viruses, and therefore, genotyping of this organism is more complicated than genotyping of more stable DNA viruses. We describe a novel microarray-based method that allows high-throughput genotyping of RNA viruses with a high degree of polymorphism by multiplex capture and type-specific extension on microarrays. Denatured reverse transcription (RT)-PCR products derived from two outer capsid genes of clinical isolates of HRV were hybridized to immobilized capture oligonucleotides representing the most commonly occurring P and G genotypes on a microarray. Specific primer extension of the type-specific capture oligonucleotides was applied to incorporate the fluorescent nucleotide analogue cyanine 5-labeled dUTP as a detectable label. Laser scanning and fluorescence detection of the microarrays was followed by visual or computer-assisted interpretation of the fluorescence patterns generated on the microarrays. Initially, the method detected HRV in all 40 samples and correctly determined both the G and the P genotypes of 35 of the 40 strains analyzed. After modification by inclusion of additional capture oligonucleotides specific for the initially unassigned genotypes, all genotypes could be correctly defined. The results of genotyping with the microarray fully agreed with the results obtained by nucleotide sequence analysis and sequence-specific multiplex RT-PCR. Owing to its robustness, simplicity, and general utility, the microarray-based method may gain wide applicability for the genotyping of microorganisms, including highly variable RNA and DNA viruses. PMID:14605152
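
    The record above reports that true-positive spot signals were at least an order of magnitude stronger than cross-hybridization, and that genotypes were read from the fluorescence pattern. As a hedged sketch of such a pattern-interpretation rule (not the authors' software), the code below calls a genotype within each probe group only when the brightest spot exceeds the runner-up by roughly a factor of ten; the intensities and the ratio threshold are assumptions.

      # Hedged sketch of genotype calling from spot intensities within each probe group.
      def call_genotype(spot_intensities, ratio=10.0):
          ranked = sorted(spot_intensities.items(), key=lambda kv: kv[1], reverse=True)
          best, runner_up = ranked[0], ranked[1]
          return best[0] if best[1] >= ratio * runner_up[1] else "unassigned"

      g_spots = {"G1": 18500.0, "G2": 950.0, "G3": 700.0, "G4": 620.0}
      p_spots = {"P[4]": 800.0, "P[8]": 21200.0, "P[6]": 640.0}
      print("G type:", call_genotype(g_spots), "/ P type:", call_genotype(p_spots))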

  17. Microarrays for genotyping human group a rotavirus by multiplex capture and type-specific primer extension.

    PubMed

    Lovmar, Lovisa; Fock, Caroline; Espinoza, Felix; Bucardo, Filemon; Syvänen, Ann-Christine; Bondeson, Kåre

    2003-11-01

    Human group A rotavirus (HRV) is the major cause of severe gastroenteritis in infants worldwide. HRV shares the feature of a high degree of genetic diversity with many other RNA viruses, and therefore, genotyping of this organism is more complicated than genotyping of more stable DNA viruses. We describe a novel microarray-based method that allows high-throughput genotyping of RNA viruses with a high degree of polymorphism by multiplex capture and type-specific extension on microarrays. Denatured reverse transcription (RT)-PCR products derived from two outer capsid genes of clinical isolates of HRV were hybridized to immobilized capture oligonucleotides representing the most commonly occurring P and G genotypes on a microarray. Specific primer extension of the type-specific capture oligonucleotides was applied to incorporate the fluorescent nucleotide analogue cyanine 5-labeled dUTP as a detectable label. Laser scanning and fluorescence detection of the microarrays was followed by visual or computer-assisted interpretation of the fluorescence patterns generated on the microarrays. Initially, the method detected HRV in all 40 samples and correctly determined both the G and the P genotypes of 35 of the 40 strains analyzed. After modification by inclusion of additional capture oligonucleotides specific for the initially unassigned genotypes, all genotypes could be correctly defined. The results of genotyping with the microarray fully agreed with the results obtained by nucleotide sequence analysis and sequence-specific multiplex RT-PCR. Owing to its robustness, simplicity, and general utility, the microarray-based method may gain wide applicability for the genotyping of microorganisms, including highly variable RNA and DNA viruses.

  18. Pyrrole-Imidazole Polyamides: Automated Solid-Phase Synthesis.

    PubMed

    Fang, Lijing; Pan, Zhengyin; Cullis, Paul M; Burley, Glenn A; Su, Wu

    2015-01-01

    In this unit, the fully automated solid-phase synthetic strategy of hairpin Py-Im polyamides is described using triphosgene (BTC) as a coupling agent. This automated methodology is compatible with all the typical building blocks, enabling the facile synthesis of polyamide libraries in 9% to 20% yield in 3 days. PMID:26623976

  19. Integrating Test-Form Formatting into Automated Test Assembly

    ERIC Educational Resources Information Center

    Diao, Qi; van der Linden, Wim J.

    2013-01-01

    Automated test assembly uses the methodology of mixed integer programming to select an optimal set of items from an item bank. Automated test-form generation uses the same methodology to optimally order the items and format the test form. From an optimization point of view, production of fully formatted test forms directly from the item pool using…

  20. Quality Visualization of Microarray Datasets Using Circos

    PubMed Central

    Koch, Martin; Wiese, Michael

    2012-01-01

    Quality control and normalization is considered the most important step in the analysis of microarray data. At present, there are various methods available for quality assessments of microarray datasets. However, there seems to be no standard visualization routine that also depicts individual microarray quality. Here we present a convenient method for visualizing the results of standard quality control tests using Circos plots. In these plots, various quality measurements are drawn in a circular fashion, thus allowing for visualization of the quality and all outliers of each distinct array within a microarray dataset. The proposed method is intended for use with the Affymetrix Human Genome platform (i.e., GPL 96, GPL570 and GPL571). Circos quality measurement plots are a convenient way for the initial quality estimate of Affymetrix datasets that are stored in publicly available databases.
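
    The abstract above describes drawing per-array quality measures on a circular layout so that outlier arrays stand out. As a rough illustration of that idea only (not the authors' Circos pipeline), the sketch below plots hypothetical per-array quality scores on a polar matplotlib axis and flags arrays whose score exceeds an arbitrary cutoff; the array names, the metric, and the 1.2 threshold are all illustrative assumptions.

```python
# A minimal sketch (not the authors' Circos pipeline): per-array quality
# metrics drawn on a circular layout so suspect arrays stand out.
import numpy as np
import matplotlib.pyplot as plt

arrays = [f"GSM_{i:02d}" for i in range(1, 13)]          # hypothetical array IDs
rng = np.random.default_rng(0)
# One illustrative quality metric per array, e.g. a NUSE-like score near 1.0.
quality = 1.0 + 0.05 * rng.standard_normal(len(arrays))
quality[7] = 1.35                                         # simulate one outlier array

theta = np.linspace(0.0, 2 * np.pi, len(arrays), endpoint=False)
fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
bars = ax.bar(theta, quality, width=2 * np.pi / len(arrays) * 0.9, bottom=0.0)
for bar, q in zip(bars, quality):
    bar.set_color("tab:red" if q > 1.2 else "tab:blue")   # flag suspect arrays
ax.set_xticks(theta)
ax.set_xticklabels(arrays, fontsize=7)
ax.set_title("Per-array quality (circular layout)")
plt.show()
```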

  1. Enhancing Interdisciplinary Mathematics and Biology Education: A Microarray Data Analysis Course Bridging These Disciplines

    PubMed Central

    Evans, Irene M.

    2010-01-01

    BIO2010 put forth the goal of improving the mathematical educational background of biology students. The analysis and interpretation of microarray high-dimensional data can be very challenging and is best done by a statistician and a biologist working and teaching in a collaborative manner. We set up such a collaboration and designed a course on microarray data analysis. We started using Genome Consortium for Active Teaching (GCAT) materials and Microarray Genome and Clustering Tool software and added R statistical software along with Bioconductor packages. In response to student feedback, one microarray data set was fully analyzed in class, starting from preprocessing to gene discovery to pathway analysis using the latter software. A class project was to conduct a similar analysis where students analyzed their own data or data from a published journal paper. This exercise showed the impact that filtering, preprocessing, and different normalization methods had on gene inclusion in the final data set. We conclude that this course achieved its goals to equip students with skills to analyze data from a microarray experiment. We offer our insight about collaborative teaching as well as how other faculty might design and implement a similar interdisciplinary course. PMID:20810954

  2. Contributions to Statistical Problems Related to Microarray Data

    ERIC Educational Resources Information Center

    Hong, Feng

    2009-01-01

    Microarray is a high-throughput technology to measure gene expression. Analysis of microarray data brings many interesting and challenging problems. This thesis consists of three studies related to microarray data. First, we propose a Bayesian model for microarray data and use Bayes Factors to identify differentially expressed genes. Second, we…

  3. The Impact of Photobleaching on Microarray Analysis.

    PubMed

    von der Haar, Marcel; Preuß, John-Alexander; von der Haar, Kathrin; Lindner, Patrick; Scheper, Thomas; Stahl, Frank

    2015-01-01

    DNA-Microarrays have become a potent technology for high-throughput analysis of genetic regulation. However, the wide dynamic range of signal intensities of fluorophore-based microarrays exceeds the dynamic range of a single array scan by far, thus limiting the key benefit of microarray technology: parallelization. The implementation of multi-scan techniques represents a promising approach to overcome these limitations. These techniques are, in turn, limited by the fluorophores' susceptibility to photobleaching when exposed to the scanner's laser light. In this paper the photobleaching characteristics of cyanine-3 and cyanine-5 as part of solid state DNA microarrays are studied. The effects of initial fluorophore intensity as well as laser scanner dependent variables such as the photomultiplier tube's voltage on bleaching and imaging are investigated. The resulting data is used to develop a model capable of simulating the expected degree of signal intensity reduction caused by photobleaching for each fluorophore individually, allowing for the removal of photobleaching-induced, systematic bias in multi-scan procedures. Single-scan applications also benefit as they rely on pre-scans to determine the optimal scanner settings. These findings constitute a step towards standardization of microarray experiments and analysis and may help to increase the lab-to-lab comparability of microarray experiment results. PMID:26378589
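
    The abstract describes modeling the expected signal loss per fluorophore so that photobleaching-induced bias can be removed from multi-scan data. The sketch below is a minimal stand-in for such a model, assuming first-order (exponential) decay per scan; the decay constants K_CY3 and K_CY5 are hypothetical placeholders, not values from the paper.

```python
# A minimal sketch, not the published model: assumes photobleaching follows a
# first-order (exponential) decay per scan, with a dye-specific rate that may
# also depend on PMT voltage. Rate constants below are illustrative only.
import numpy as np

def bleach_factor(n_scans: int, k: float) -> float:
    """Fraction of the original fluorescence remaining after n_scans scans."""
    return np.exp(-k * n_scans)

def correct_intensity(observed: np.ndarray, scan_index: int, k: float) -> np.ndarray:
    """Rescale observed intensities back to their pre-bleaching level."""
    return observed / bleach_factor(scan_index, k)

# Hypothetical per-dye decay constants (per scan); cyanine-5 is often assumed
# to bleach faster than cyanine-3, but real values must be fit from repeated scans.
K_CY3, K_CY5 = 0.01, 0.03

obs = np.array([1200.0, 450.0, 87000.0])   # intensities observed on the 4th scan
print(correct_intensity(obs, scan_index=4, k=K_CY5))
```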

  4. Unsupervised assessment of microarray data quality using a Gaussian mixture model

    PubMed Central

    Howard, Brian E; Sick, Beate; Heber, Steffen

    2009-01-01

    Background Quality assessment of microarray data is an important and often challenging aspect of gene expression analysis. This task frequently involves the examination of a variety of summary statistics and diagnostic plots. The interpretation of these diagnostics is often subjective, and generally requires careful expert scrutiny. Results We show how an unsupervised classification technique based on the Expectation-Maximization (EM) algorithm and the naïve Bayes model can be used to automate microarray quality assessment. The method is flexible and can be easily adapted to accommodate alternate quality statistics and platforms. We evaluate our approach using Affymetrix 3' gene expression and exon arrays and compare the performance of this method to a similar supervised approach. Conclusion This research illustrates the efficacy of an unsupervised classification approach for the purpose of automated microarray data quality assessment. Since our approach requires only unannotated training data, it is easy to customize and to keep up-to-date as technology evolves. In contrast to other "black box" classification systems, this method also allows for intuitive explanations. PMID:19545436
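
    As a sketch of the general idea (unsupervised, EM-fitted mixture clustering of per-array quality statistics), the example below fits a two-component Gaussian mixture with scikit-learn and flags arrays assigned to the poorer cluster. It is not the authors' naïve Bayes model, and the three quality statistics and their distributions are simulated assumptions.

```python
# A minimal sketch of the general idea (two-component mixture fit by EM), not
# the authors' exact naive Bayes model: arrays are described by a few quality
# statistics and assigned to a "good" or "poor" cluster without any labels.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# Hypothetical quality statistics per array: [RLE spread, NUSE-like score,
# 3'/5' ratio]; 40 good arrays and 5 degraded ones.
good = rng.normal([0.0, 1.0, 1.2], [0.1, 0.05, 0.2], size=(40, 3))
bad = rng.normal([0.6, 1.4, 3.0], [0.2, 0.1, 0.5], size=(5, 3))
X = np.vstack([good, bad])

gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0).fit(X)
labels = gmm.predict(X)
# The cluster with the smaller mean NUSE-like statistic is taken as "good".
good_cluster = int(np.argmin(gmm.means_[:, 1]))
print("flagged arrays:", np.where(labels != good_cluster)[0])
```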

  5. Evaluation of Solid Supports for Slide- and Well-Based Recombinant Antibody Microarrays

    PubMed Central

    Gerdtsson, Anna S.; Dexlin-Mellby, Linda; Delfani, Payam; Berglund, Erica; Borrebaeck, Carl A. K.; Wingren, Christer

    2016-01-01

    Antibody microarrays have emerged as an important tool within proteomics, enabling multiplexed protein expression profiling in both health and disease. The design and performance of antibody microarrays and how they are processed are dependent on several factors, of which the interplay between the antibodies and the solid surfaces plays a central role. In this study, we have taken on the first comprehensive view and evaluated the overall impact of solid surfaces on the recombinant antibody microarray design. The results clearly demonstrated the importance of the surface-antibody interaction and showed the effect of the solid supports on the printing process, the array format of planar arrays (slide- and well-based), the assay performance (spot features, reproducibility, specificity and sensitivity) and assay processing (degree of automation). In the end, two high-end recombinant antibody microarray technology platforms were designed, based on slide-based (black polymer) and well-based (clear polymer) arrays, paving the way for future large-scale protein expression profiling efforts. PMID:27600082

  6. Evaluation of Solid Supports for Slide- and Well-Based Recombinant Antibody Microarrays

    PubMed Central

    Gerdtsson, Anna S.; Dexlin-Mellby, Linda; Delfani, Payam; Berglund, Erica; Borrebaeck, Carl A. K.; Wingren, Christer

    2016-01-01

    Antibody microarrays have emerged as an important tool within proteomics, enabling multiplexed protein expression profiling in both health and disease. The design and performance of antibody microarrays and how they are processed are dependent on several factors, of which the interplay between the antibodies and the solid surfaces plays a central role. In this study, we have taken on the first comprehensive view and evaluated the overall impact of solid surfaces on the recombinant antibody microarray design. The results clearly demonstrated the importance of the surface-antibody interaction and showed the effect of the solid supports on the printing process, the array format of planar arrays (slide- and well-based), the assay performance (spot features, reproducibility, specificity and sensitivity) and assay processing (degree of automation). In the end, two high-end recombinant antibody microarray technology platforms were designed, based on slide-based (black polymer) and well-based (clear polymer) arrays, paving the way for future large-scale protein expression profiling efforts.

  7. Evaluation of Solid Supports for Slide- and Well-Based Recombinant Antibody Microarrays.

    PubMed

    Gerdtsson, Anna S; Dexlin-Mellby, Linda; Delfani, Payam; Berglund, Erica; Borrebaeck, Carl A K; Wingren, Christer

    2016-01-01

    Antibody microarrays have emerged as an important tool within proteomics, enabling multiplexed protein expression profiling in both health and disease. The design and performance of antibody microarrays and how they are processed are dependent on several factors, of which the interplay between the antibodies and the solid surfaces plays a central role. In this study, we have taken on the first comprehensive view and evaluated the overall impact of solid surfaces on the recombinant antibody microarray design. The results clearly demonstrated the importance of the surface-antibody interaction and showed the effect of the solid supports on the printing process, the array format of planar arrays (slide- and well-based), the assay performance (spot features, reproducibility, specificity and sensitivity) and assay processing (degree of automation). In the end, two high-end recombinant antibody microarray technology platforms were designed, based on slide-based (black polymer) and well-based (clear polymer) arrays, paving the way for future large-scale protein expression profiling efforts. PMID:27600082

  8. Library Automation: A Survey of Leading Academic and Public Libraries in the United States.

    ERIC Educational Resources Information Center

    Mann, Thomas W., Jr.; And Others

    Results of this survey of 26 public and academic libraries of national stature show that the country's major libraries are fully committed to automating their library operations. Major findings of the survey show that: (1) all libraries surveyed are involved in automation; (2) all libraries surveyed have automated their catalogs and bibliographic…

  9. Fully automatic telemetry data processor

    NASA Technical Reports Server (NTRS)

    Cox, F. B.; Keipert, F. A.; Lee, R. C.

    1968-01-01

    Satellite Telemetry Automatic Reduction System /STARS 2/, a fully automatic computer-controlled telemetry data processor, maximizes data recovery, reduces turnaround time, increases flexibility, and improves operational efficiency. The system incorporates a CDC 3200 computer as its central element.

  10. Automated Demand Response and Commissioning

    SciTech Connect

    Piette, Mary Ann; Watson, David S.; Motegi, Naoya; Bourassa, Norman

    2005-04-01

    This paper describes the results from the second season of research to develop and evaluate the performance of new Automated Demand Response (Auto-DR) hardware and software technology in large facilities. Demand Response (DR) is a set of activities to reduce or shift electricity use to improve the electric grid reliability and manage electricity costs. Fully-Automated Demand Response does not involve human intervention, but is initiated at a home, building, or facility through receipt of an external communications signal. We refer to this as Auto-DR. For evaluation, the control and communications systems must be properly configured and must pass through a set of test stages: Readiness, Approval, Price Client/Price Server Communication, Internet Gateway/Internet Relay Communication, Control of Equipment, and DR Shed Effectiveness. New commissioning tests are needed for such systems to improve the connection of demand-responsive building systems to electric grid demand response systems.

  11. Chromosomal Microarray versus Karyotyping for Prenatal Diagnosis

    PubMed Central

    Wapner, Ronald J.; Martin, Christa Lese; Levy, Brynn; Ballif, Blake C.; Eng, Christine M.; Zachary, Julia M.; Savage, Melissa; Platt, Lawrence D.; Saltzman, Daniel; Grobman, William A.; Klugman, Susan; Scholl, Thomas; Simpson, Joe Leigh; McCall, Kimberly; Aggarwal, Vimla S.; Bunke, Brian; Nahum, Odelia; Patel, Ankita; Lamb, Allen N.; Thom, Elizabeth A.; Beaudet, Arthur L.; Ledbetter, David H.; Shaffer, Lisa G.; Jackson, Laird

    2013-01-01

    Background Chromosomal microarray analysis has emerged as a primary diagnostic tool for the evaluation of developmental delay and structural malformations in children. We aimed to evaluate the accuracy, efficacy, and incremental yield of chromosomal microarray analysis as compared with karyotyping for routine prenatal diagnosis. Methods Samples from women undergoing prenatal diagnosis at 29 centers were sent to a central karyotyping laboratory. Each sample was split in two; standard karyotyping was performed on one portion and the other was sent to one of four laboratories for chromosomal microarray. Results We enrolled a total of 4406 women. Indications for prenatal diagnosis were advanced maternal age (46.6%), abnormal result on Down’s syndrome screening (18.8%), structural anomalies on ultrasonography (25.2%), and other indications (9.4%). In 4340 (98.8%) of the fetal samples, microarray analysis was successful; 87.9% of samples could be used without tissue culture. Microarray analysis of the 4282 nonmosaic samples identified all the aneuploidies and unbalanced rearrangements identified on karyotyping but did not identify balanced translocations and fetal triploidy. In samples with a normal karyotype, microarray analysis revealed clinically relevant deletions or duplications in 6.0% with a structural anomaly and in 1.7% of those whose indications were advanced maternal age or positive screening results. Conclusions In the context of prenatal diagnostic testing, chromosomal microarray analysis identified additional, clinically significant cytogenetic information as compared with karyotyping and was equally efficacious in identifying aneuploidies and unbalanced rearrangements but did not identify balanced translocations and triploidies. (Funded by the Eunice Kennedy Shriver National Institute of Child Health and Human Development and others; ClinicalTrials.gov number, NCT01279733.) PMID:23215555

  12. Image segmentation for automated dental identification

    NASA Astrophysics Data System (ADS)

    Haj Said, Eyad; Nassar, Diaa Eldin M.; Ammar, Hany H.

    2006-02-01

    Dental features are one of few biometric identifiers that qualify for postmortem identification; therefore, creation of an Automated Dental Identification System (ADIS) with goals and objectives similar to the Automated Fingerprint Identification System (AFIS) has received increased attention. As a part of ADIS, teeth segmentation from dental radiograph films is an essential step in the identification process. In this paper, we introduce a fully automated approach for teeth segmentation with the goal of extracting at least one tooth from the dental radiograph film. We evaluate our approach on both theoretical and empirical grounds, and we compare its performance with that of other approaches introduced in the literature. The results show that our approach exhibits the lowest failure rate and the highest optimality among all fully automated approaches introduced in the literature.

  13. Microbial Diagnostic Microarrays for the Detection and Typing of Food- and Water-Borne (Bacterial) Pathogens.

    PubMed

    Kostić, Tanja; Sessitsch, Angela

    2011-10-14

    Reliable and sensitive pathogen detection in clinical and environmental (including food and water) samples is of greatest importance for public health. Standard microbiological methods have several limitations and improved alternatives are needed. Most important requirements for reliable analysis include: (i) specificity; (ii) sensitivity; (iii) multiplexing potential; (iv) robustness; (v) speed; (vi) automation potential; and (vii) low cost. Microarray technology can, through its very nature, fulfill many of these requirements directly and the remaining challenges have been tackled. In this review, we attempt to compare performance characteristics of the microbial diagnostic microarrays developed for the detection and typing of food and water pathogens, and discuss limitations, points still to be addressed and issues specific for the analysis of food, water and environmental samples.

  14. Microbial Diagnostic Microarrays for the Detection and Typing of Food- and Water-Borne (Bacterial) Pathogens

    PubMed Central

    Kostić, Tanja; Sessitsch, Angela

    2011-01-01

    Reliable and sensitive pathogen detection in clinical and environmental (including food and water) samples is of greatest importance for public health. Standard microbiological methods have several limitations and improved alternatives are needed. Most important requirements for reliable analysis include: (i) specificity; (ii) sensitivity; (iii) multiplexing potential; (iv) robustness; (v) speed; (vi) automation potential; and (vii) low cost. Microarray technology can, through its very nature, fulfill many of these requirements directly and the remaining challenges have been tackled. In this review, we attempt to compare performance characteristics of the microbial diagnostic microarrays developed for the detection and typing of food and water pathogens, and discuss limitations, points still to be addressed and issues specific for the analysis of food, water and environmental samples.

  15. Segment and fit thresholding: a new method for image analysis applied to microarray and immunofluorescence data.

    PubMed

    Ensink, Elliot; Sinha, Jessica; Sinha, Arkadeep; Tang, Huiyuan; Calderone, Heather M; Hostetter, Galen; Winter, Jordan; Cherba, David; Brand, Randall E; Allen, Peter J; Sempere, Lorenzo F; Haab, Brian B

    2015-10-01

    Experiments involving the high-throughput quantification of image data require algorithms for automation. A challenge in the development of such algorithms is to properly interpret signals over a broad range of image characteristics, without the need for manual adjustment of parameters. Here we present a new approach for locating signals in image data, called Segment and Fit Thresholding (SFT). The method assesses statistical characteristics of small segments of the image and determines the best-fit trends between the statistics. Based on the relationships, SFT identifies segments belonging to background regions; analyzes the background to determine optimal thresholds; and analyzes all segments to identify signal pixels. We optimized the initial settings for locating background and signal in antibody microarray and immunofluorescence data and found that SFT performed well over multiple, diverse image characteristics without readjustment of settings. When used for the automated analysis of multicolor, tissue-microarray images, SFT correctly found the overlap of markers with known subcellular localization, and it performed better than a fixed threshold and Otsu's method for selected images. SFT promises to advance the goal of full automation in image analysis. PMID:26339978
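
    The abstract outlines the SFT steps: compute statistics on small image segments, use them to identify background segments, and derive thresholds from the background. The sketch below follows that outline only loosely, assuming a fixed tile size and a simple low-mean/low-spread rule for background tiles; it is not the published algorithm, and all parameters are illustrative.

```python
# A rough sketch of the segment-based idea, not the published SFT algorithm:
# split the image into small tiles, use each tile's mean/spread to identify
# likely background tiles, and derive a global threshold from the background.
import numpy as np

def segment_stats(img: np.ndarray, tile: int = 16) -> np.ndarray:
    h, w = img.shape
    stats = []
    for y in range(0, h - tile + 1, tile):
        for x in range(0, w - tile + 1, tile):
            block = img[y:y + tile, x:x + tile]
            stats.append((block.mean(), block.std()))
    return np.array(stats)

def sft_like_threshold(img: np.ndarray, tile: int = 16, k: float = 3.0) -> float:
    stats = segment_stats(img, tile)
    means, stds = stats[:, 0], stats[:, 1]
    # Background tiles are assumed to have both low mean and low spread.
    bg = stats[(means < np.median(means)) & (stds < np.median(stds))]
    bg_mean, bg_std = bg[:, 0].mean(), bg[:, 1].mean()
    return bg_mean + k * bg_std            # pixels above this are called signal

rng = np.random.default_rng(2)
image = rng.normal(100, 10, (256, 256))
image[100:120, 100:120] += 400              # a synthetic bright spot
thr = sft_like_threshold(image)
print("threshold:", round(thr, 1), "signal pixels:", int((image > thr).sum()))
```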

  16. Segment and fit thresholding: a new method for image analysis applied to microarray and immunofluorescence data.

    PubMed

    Ensink, Elliot; Sinha, Jessica; Sinha, Arkadeep; Tang, Huiyuan; Calderone, Heather M; Hostetter, Galen; Winter, Jordan; Cherba, David; Brand, Randall E; Allen, Peter J; Sempere, Lorenzo F; Haab, Brian B

    2015-10-01

    Experiments involving the high-throughput quantification of image data require algorithms for automation. A challenge in the development of such algorithms is to properly interpret signals over a broad range of image characteristics, without the need for manual adjustment of parameters. Here we present a new approach for locating signals in image data, called Segment and Fit Thresholding (SFT). The method assesses statistical characteristics of small segments of the image and determines the best-fit trends between the statistics. Based on the relationships, SFT identifies segments belonging to background regions; analyzes the background to determine optimal thresholds; and analyzes all segments to identify signal pixels. We optimized the initial settings for locating background and signal in antibody microarray and immunofluorescence data and found that SFT performed well over multiple, diverse image characteristics without readjustment of settings. When used for the automated analysis of multicolor, tissue-microarray images, SFT correctly found the overlap of markers with known subcellular localization, and it performed better than a fixed threshold and Otsu's method for selected images. SFT promises to advance the goal of full automation in image analysis.

  17. DNA microarray analyses in higher plants.

    PubMed

    Galbraith, David W

    2006-01-01

    DNA microarrays were originally devised and described as a convenient technology for the global analysis of plant gene expression. Over the past decade, their use has expanded enormously to cover all kingdoms of living organisms. At the same time, the scope of applications of microarrays has increased beyond expression analyses, with plant genomics playing a leadership role in the on-going development of this technology. As the field has matured, the rate-limiting step has moved from that of the technical process of data generation to that of data analysis. We currently face major problems in dealing with the accumulating datasets, not simply with respect to how to archive, access, and process the huge amounts of data that have been and are being produced, but also in determining the relative quality of the different datasets. A major recognized concern is the appropriate use of statistical design in microarray experiments, without which the datasets are rendered useless. A vigorous area of current research involves the development of novel statistical tools specifically for microarray experiments. This article describes, in a necessarily selective manner, the types of platforms currently employed in microarray research and provides an overview of recent activities using these platforms in plant biology.

  18. Oligonucleotide microarrays in constitutional genetic diagnosis.

    PubMed

    Keren, Boris; Le Caignec, Cedric

    2011-06-01

    Oligonucleotide microarrays such as comparative genomic hybridization arrays and SNP microarrays enable the identification of genomic imbalances - also termed copy-number variants - with increasing resolution. This article will focus on the most significant applications of high-throughput oligonucleotide microarrays, both in genetic diagnosis and research. In genetic diagnosis, the method is becoming a standard tool for investigating patients with unexplained developmental delay/intellectual disability, autism spectrum disorders and/or with multiple congenital anomalies. Oligonucleotide microarrays have also recently been applied to the detection of genomic imbalances in prenatal diagnosis, either to characterize a chromosomal rearrangement that has previously been identified by standard prenatal karyotyping or to detect a cryptic genomic imbalance in a fetus with ultrasound abnormalities and a normal standard prenatal karyotype. In research, oligonucleotide microarrays have been used for a wide range of applications, such as the identification of new genes responsible for monogenic disorders and the association of a copy-number variant as a predisposing factor to a common disease. Despite its widespread use, the interpretation of results is not always straightforward. We will discuss several unexpected results and ethical issues raised by these new methods.

  19. Advancing Microarray Assembly with Acoustic Dispensing Technology

    PubMed Central

    Wong, E. Y.; Diamond, S. L.

    2011-01-01

    In the assembly of microarrays and microarray-based chemical assays and enzymatic bioassays, most approaches use pins for contact spotting. Acoustic dispensing is a technology capable of nanoliter transfers by using acoustic energy to eject liquid sample from an open source well. Although typically used for well plate transfers, when applied to microarraying it avoids drawbacks of undesired physical contact with sample, difficulty in assembling multicomponent reactions on a chip by readdressing, a rigid mode of printing that lacks patterning capabilities, and time-consuming wash steps. We demonstrated the utility of acoustic dispensing by delivering human cathepsin L in a drop-on-drop fashion into individual 50-nanoliter, pre-spotted reaction volumes to activate enzyme reactions at targeted positions on a microarray. We generated variable-sized spots ranging from 200 to 750 μm (and higher), and handled the transfer of fluorescent bead suspensions with increasing source well concentrations of 0.1 to 10 × 10^8 beads/mL in a linear fashion. There are no tips that can clog and liquid dispensing CVs are generally below 5%. This platform expands the toolbox for generating analytical arrays and meets needs associated with spatially-addressed assembly of multicomponent microarrays on the nanoliter scale. PMID:19035650

  20. A Synthetic Kinome Microarray Data Generator

    PubMed Central

    Maleki, Farhad; Kusalik, Anthony

    2015-01-01

    Cellular pathways involve the phosphorylation and dephosphorylation of proteins. Peptide microarrays called kinome arrays facilitate the measurement of the phosphorylation activity of hundreds of proteins in a single experiment. Analyzing the data from kinome microarrays is a multi-step process. Typically, various techniques are possible for a particular step, and it is necessary to compare and evaluate them. Such evaluations require data for which correct analysis results are known. Unfortunately, such kinome data is not readily available in the community. Further, there are no established techniques for creating artificial kinome datasets with known results and with the same characteristics as real kinome datasets. In this paper, a methodology for generating synthetic kinome array data is proposed. The methodology relies on actual intensity measurements from kinome microarray experiments and preserves their subtle characteristics. The utility of the methodology is demonstrated by evaluating methods for eliminating heterogeneous variance in kinome microarray data. Phosphorylation intensities from kinome microarrays often exhibit such heterogeneous variance and its presence can negatively impact downstream statistical techniques that rely on homogeneity of variance. It is shown that using the output from the proposed synthetic data generator, it is possible to critically compare two variance stabilization methods. PMID:27600233
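
    In the spirit of the generator described above (though not its actual method), the sketch below simulates peptide intensities with heterogeneous, mean-dependent variance and shows how a log2 transform reduces the mean-variance dependence, which is the kind of comparison such synthetic data enables. All distributions and parameters are illustrative.

```python
# A minimal sketch in the spirit of the generator, not the published method:
# simulate peptide intensities whose spread grows with the mean (heterogeneous
# variance), then compare raw and log2-stabilized mean-vs-SD dependence.
import numpy as np

rng = np.random.default_rng(3)
n_peptides, n_replicates = 300, 9
true_signal = rng.uniform(200, 20000, size=n_peptides)
# Multiplicative noise => standard deviation roughly proportional to the mean.
raw = true_signal[:, None] * rng.lognormal(mean=0.0, sigma=0.25,
                                           size=(n_peptides, n_replicates))

def mean_vs_sd(x: np.ndarray):
    return x.mean(axis=1), x.std(axis=1)

m_raw, sd_raw = mean_vs_sd(raw)
m_log, sd_log = mean_vs_sd(np.log2(raw))
# Correlation between mean and SD should drop markedly after the transform.
print("raw  mean-vs-SD correlation:", round(np.corrcoef(m_raw, sd_raw)[0, 1], 2))
print("log2 mean-vs-SD correlation:", round(np.corrcoef(m_log, sd_log)[0, 1], 2))
```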

  1. A Universal Oligonucleotide Microarray with a Minimal Number of Probes for the Detection and Identification of Viroids at the Genus Level

    PubMed Central

    Jiang, Dongmei; Xin, Yanyan; Ding, Fang; Deng, Ziniu; Wang, Guoping; Ma, Xianfeng; Li, Fang; Li, Guifen; Li, Mingfu; Li, Shifang; Zhu, Shuifang

    2013-01-01

    A major challenge in the agricultural industry is the development of techniques that can screen plant samples for viroid infection. Microarrays are promising in this regard, as their high throughput nature can potentially allow for the detection of a range of viroids in a single test. In this paper we present a microarray that can detect a wide spectrum of all 8 reported viroid genera including 37 known plant viroid species. The array was constructed using an automated probe design protocol which generated a minimal number of probes to detect viroids at the genus level. The designed microarray showed a high specificity and sensitivity when tested with a set of standard virus samples. Finally, the microarray was applied to screen infected field samples, with Hop stunt viroid infection identified as the major disease causing pathogen for an infected citrus sample. PMID:23734201

  2. Introduction to the statistical analysis of two-color microarray data.

    PubMed

    Bremer, Martina; Himelblau, Edward; Madlung, Andreas

    2010-01-01

    Microarray experiments have become routine in the past few years in many fields of biology. Analysis of array hybridizations is often performed with the help of commercial software programs, which produce gene lists, graphs, and sometimes provide values for the statistical significance of the results. Exactly what is computed by many of the available programs is often not easy to reconstruct or may even be impossible to know for the end user. It is therefore not surprising that many biology students and some researchers using microarray data do not fully understand the nature of the underlying statistics used to arrive at the results. We have developed a module that we have used successfully in undergraduate biology and statistics education that allows students to get a better understanding of both the basic biological and statistical theory needed to comprehend primary microarray data. The module is intended for the undergraduate level but may be useful to anyone who is new to the field of microarray biology. Additional course material that was developed for classroom use can be found at http://www.polyploidy.org/. In our undergraduate classrooms we encourage students to manipulate microarray data using Microsoft Excel to reinforce some of the concepts they learn. We have included instructions for some of these manipulations throughout this chapter (see the "Do this..." boxes). However, it should be noted that while Excel can effectively analyze our small sample data set, more specialized software would typically be used to analyze full microarray data sets. Nevertheless, we believe that manipulating a small data set with Excel can provide insights into the workings of more advanced analysis software. PMID:20652509
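
    For readers new to the two-color calculations mentioned above, the short sketch below computes per-spot log ratios (M) and average log intensities (A) and applies a global median normalization, mirroring the kind of manipulation the chapter has students do in Excel. The simulated intensities and the dye-bias factor are illustrative; real studies would use dedicated packages.

```python
# A small worked example of a common two-color calculation (log ratios and the
# MA-plot quantities), intended only as a teaching sketch; real analyses would
# typically use specialized tools such as limma in R.
import numpy as np

rng = np.random.default_rng(4)
red = rng.lognormal(mean=8, sigma=1, size=1000)               # Cy5 intensities (sample)
green = red * 1.6 * rng.lognormal(mean=0, sigma=0.3, size=1000)  # Cy3 (reference) with dye bias

M = np.log2(red) - np.log2(green)           # log2 fold change per spot
A = 0.5 * (np.log2(red) + np.log2(green))   # average log intensity per spot

# Global median normalization: assume most genes are unchanged, so the median
# log ratio is forced to zero.
M_norm = M - np.median(M)
print("median M before/after:", round(np.median(M), 3), round(np.median(M_norm), 3))
```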

  3. Protein microarrays for parasite antigen discovery.

    PubMed

    Driguez, Patrick; Doolan, Denise L; Molina, Douglas M; Loukas, Alex; Trieu, Angela; Felgner, Phil L; McManus, Donald P

    2015-01-01

    The host serological profile to a parasitic infection, such as schistosomiasis, can be used to define potential vaccine and diagnostic targets. Determining the host antibody response using traditional approaches is hindered by the large number of putative antigens in any parasite proteome. Parasite protein microarrays offer the potential for a high-throughput host antibody screen to simplify this task. In order to construct the array, parasite proteins are selected from available genomic sequence and protein databases using bioinformatic tools. Selected open reading frames are PCR amplified, incorporated into a vector for cell-free protein expression, and printed robotically onto glass slides. The protein microarrays can be probed with antisera from infected/immune animals or humans and the antibody reactivity measured with fluorophore labeled antibodies on a confocal laser microarray scanner to identify potential targets for diagnosis or therapeutic or prophylactic intervention. PMID:25388117

  4. Analysis of High-Throughput ELISA Microarray Data

    SciTech Connect

    White, Amanda M.; Daly, Don S.; Zangar, Richard C.

    2011-02-23

    Our research group develops analytical methods and software for the high-throughput analysis of quantitative enzyme-linked immunosorbent assay (ELISA) microarrays. ELISA microarrays differ from DNA microarrays in several fundamental aspects and most algorithms for analysis of DNA microarray data are not applicable to ELISA microarrays. In this review, we provide an overview of the steps involved in ELISA microarray data analysis and how the statistically sound algorithms we have developed provide an integrated software suite to address the needs of each data-processing step. The algorithms discussed are available in a set of open-source software tools (http://www.pnl.gov/statistics/ProMAT).

  5. Photo-Generation of Carbohydrate Microarrays

    NASA Astrophysics Data System (ADS)

    Carroll, Gregory T.; Wang, Denong; Turro, Nicholas J.; Koberstein, Jeffrey T.

    The unparalleled structural diversity of carbohydrates among biological molecules has been recognized for decades. Recent studies have highlighted carbohydrate signaling roles in many important biological processes, such as fertilization, embryonic development, cell differentiation and cell–cell communication, blood coagulation, inflammation, chemotaxis, as well as host recognition and immune responses to microbial pathogens. In this chapter, we summarize recent progress in the establishment of carbohydrate-based microarrays and the application of these technologies in exploring the biological information content in carbohydrates. A newly established photochemical platform of carbohydrate microarrays serves as a model for a focused discussion.

  6. Protein Microarrays for the Detection of Biothreats

    NASA Astrophysics Data System (ADS)

    Herr, Amy E.

    Although protein microarrays have proven to be an important tool in proteomics research, the technology is emerging as useful for public health and defense applications. Recent progress in the measurement and characterization of biothreat agents is reviewed in this chapter. Details concerning validation of various protein microarray formats, from contact-printed sandwich assays to supported lipid bilayers, are presented. The reviewed technologies have important implications for in vitro characterization of toxin-ligand interactions, serotyping of bacteria, screening of potential biothreat inhibitors, and as core components of biosensors, among others, research and engineering applications.

  7. Pineal Function: Impact of Microarray Analysis

    PubMed Central

    Klein, David C.; Bailey, Michael J.; Carter, David A.; Kim, Jong-so; Shi, Qiong; Ho, Anthony; Chik, Constance; Gaildrat, Pascaline; Morin, Fabrice; Ganguly, Surajit; Rath, Martin F.; Møller, Morten; Sugden, David; Rangel, Zoila G.; Munson, Peter J.; Weller, Joan L.; Coon, Steven L.

    2009-01-01

    Microarray analysis has provided a new understanding of pineal function by identifying genes that are highly expressed in this tissue relative to other tissues and also by identifying over 600 genes that are expressed on a 24-hour schedule. This effort has highlighted surprising similarity to the retina and has provided reason to explore new avenues of study including intracellular signaling, signal transduction, transcriptional cascades, thyroid/retinoic acid hormone signaling, metal biology, RNA splicing, and the role the pineal gland plays in the immune/inflammation response. The new foundation that microarray analysis has provided will broadly support future research on pineal function. PMID:19622385

  8. The use of microarrays in microbial ecology

    SciTech Connect

    Andersen, G.L.; He, Z.; DeSantis, T.Z.; Brodie, E.L.; Zhou, J.

    2009-09-15

    Microarrays have proven to be a useful and high-throughput method to provide targeted DNA sequence information for up to many thousands of specific genetic regions in a single test. A microarray consists of multiple DNA oligonucleotide probes that, under high stringency conditions, hybridize only to specific complementary nucleic acid sequences (targets). A fluorescent signal indicates the presence and, in many cases, the abundance of genetic regions of interest. In this chapter we will look at how microarrays are used in microbial ecology, especially with the recent increase in microbial community DNA sequence data. Of particular interest to microbial ecologists, phylogenetic microarrays are used for the analysis of phylotypes in a community and functional gene arrays are used for the analysis of functional genes, and, by inference, phylotypes in environmental samples. A phylogenetic microarray that has been developed by the Andersen laboratory, the PhyloChip, will be discussed as an example of a microarray that targets the known diversity within the 16S rRNA gene to determine microbial community composition. Using multiple, confirmatory probes to increase the confidence of detection and a mismatch probe for every perfect match probe to minimize the effect of cross-hybridization by non-target regions, the PhyloChip is able to simultaneously identify any of thousands of taxa present in an environmental sample. The PhyloChip is shown to reveal greater diversity within a community than rRNA gene sequencing due to the placement of the entire gene product on the microarray compared with the analysis of up to thousands of individual molecules by traditional sequencing methods. A functional gene array that has been developed by the Zhou laboratory, the GeoChip, will be discussed as an example of a microarray that dynamically identifies functional activities of multiple members within a community. The recent version of GeoChip contains more than 24,000 50mer
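
    The abstract explains that the PhyloChip pairs each perfect-match probe with a mismatch probe and uses multiple confirmatory probes per taxon. The toy sketch below illustrates that general scoring idea only: a taxon is called present when most of its probe pairs show the perfect-match signal clearly above the mismatch signal. The difference, ratio, and positive-fraction thresholds are illustrative assumptions, not the PhyloChip's actual criteria.

```python
# A toy sketch of the perfect-match/mismatch idea, not the actual PhyloChip
# scoring pipeline: a taxon is called present when most of its probe pairs
# show PM clearly above MM. All thresholds are illustrative assumptions.
import numpy as np

def call_taxon(pm: np.ndarray, mm: np.ndarray,
               min_diff: float = 130.0, min_ratio: float = 1.3,
               positive_fraction: float = 0.9) -> bool:
    positive = (pm - mm > min_diff) & (pm / mm > min_ratio)
    return positive.mean() >= positive_fraction

rng = np.random.default_rng(6)
mm = rng.normal(500, 50, size=11)              # mismatch signals for one probe set
pm_present = mm * 1.8 + 200                    # PM signals for a taxon that is present
pm_absent = mm + rng.normal(0, 30, size=11)    # PM signals dominated by cross-hybridization
print(call_taxon(pm_present, mm), call_taxon(pm_absent, mm))
```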

  9. MicroRNA expression profiling using microarrays.

    PubMed

    Love, Cassandra; Dave, Sandeep

    2013-01-01

    MicroRNAs are small noncoding RNAs which are able to regulate gene expression at both the transcriptional and translational levels. There is a growing recognition of the role of microRNAs in nearly every tissue type and cellular process. Thus there is an increasing need for accurate quantitation of microRNA expression in a variety of tissues. Microarrays provide a robust method for the examination of microRNA expression. In this chapter, we describe detailed methods for the use of microarrays to measure microRNA expression and discuss methods for the analysis of microRNA expression data. PMID:23666707

  10. Fully Integrating the Design Process

    SciTech Connect

    T.A. Bjornard; R.S. Bean

    2008-03-01

    The basic approach to designing nuclear facilities in the United States does not currently reflect the routine consideration of proliferation resistance and international safeguards. The fully integrated design process is an approach for bringing consideration of international safeguards and proliferation resistance, together with state safeguards and security, fully into the design process from the very beginning, while integrating them sensibly and synergistically with the other project functions. In view of the recently established GNEP principles agreed to by the United States and at least eighteen other countries, this paper explores such an integrated approach, and its potential to help fulfill the new internationally driven design requirements with improved efficiencies and reduced costs.

  11. Microarray technology: an increasing variety of screening tools for proteomic research.

    PubMed

    Stoll, Dieter; Bachmann, Jutta; Templin, Markus F; Joos, Thomas O

    2004-12-15

    Protein microarray technology allows the simultaneous determination of a large variety of parameters from a minute amount of sample within a single experiment. Assay systems based on this technology are currently used for the identification, quantitation and functional analysis of proteins that are of interest for proteomic research in basic and applied biology and for diagnostic applications. Such novel assays are also of major interest for the pharmaceutical industry, focusing on the identification of biomarkers and the validation of potential target molecules. Sensitivity, reproducibility, robustness and automation have to be demonstrated before this technology will be suitable for high-throughput applications.

  12. Towards fully automatic object detection and segmentation

    NASA Astrophysics Data System (ADS)

    Schramm, Hauke; Ecabert, Olivier; Peters, Jochen; Philomin, Vasanth; Weese, Juergen

    2006-03-01

    An automatic procedure for detecting and segmenting anatomical objects in 3-D images is necessary for achieving a high level of automation in many medical applications. Since today's segmentation techniques typically rely on user input for initialization, they do not allow for a fully automatic workflow. In this work, the generalized Hough transform is used for detecting anatomical objects with well defined shape in 3-D medical images. This well-known technique has frequently been used for object detection in 2-D images and is known to be robust and reliable. However, its computational and memory requirements are generally huge, especially in case of considering 3-D images and various free transformation parameters. Our approach limits the complexity of the generalized Hough transform to a reasonable amount by (1) using object prior knowledge during the preprocessing in order to suppress unlikely regions in the image, (2) restricting the flexibility of the applied transformation to only scaling and translation, and (3) using a simple shape model which does not cover any inter-individual shape variability. Despite these limitations, the approach is demonstrated to allow for a coarse 3-D delineation of the femur, vertebra and heart in a number of experiments. Additionally it is shown that the quality of the object localization is in nearly all cases sufficient to initialize a successful segmentation using shape constrained deformable models.
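
    The abstract describes restricting the generalized Hough transform to translation and scaling to keep it tractable. The sketch below shows an even simpler, translation-only 2-D version of the voting idea, assuming edge points are already extracted; it is a toy example, not the 3-D clinical implementation.

```python
# A toy sketch of the translation-only case (no rotation, no scaling): edge
# points vote for candidate object centers through an R-table built from a
# reference shape. Far simpler than the 3-D setup described in the abstract.
import numpy as np

def build_r_table(edge_points: np.ndarray, reference_point: np.ndarray) -> np.ndarray:
    # Offsets from every model edge point to the reference (center) point.
    return reference_point - edge_points

def ght_translation(image_edges: np.ndarray, r_table: np.ndarray, shape) -> np.ndarray:
    acc = np.zeros(shape, dtype=int)
    for p in image_edges:
        for offset in r_table:
            c = p + offset                       # candidate center voted for by p
            if 0 <= c[0] < shape[0] and 0 <= c[1] < shape[1]:
                acc[c[0], c[1]] += 1
    return acc

# Usage: a square model, and the same square shifted inside the "image".
model = np.array([(r, c) for r in range(10, 20) for c in (10, 19)] +
                 [(r, c) for c in range(10, 20) for r in (10, 19)])
r_table = build_r_table(model, np.array([15, 15]))
image_edges = model + np.array([20, 25])         # object moved to (35, 40)
acc = ght_translation(image_edges, r_table, (64, 64))
print("detected center:", np.unravel_index(acc.argmax(), acc.shape))
```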

  13. Automation of Capacity Bidding with an Aggregator Using Open Automated Demand Response

    SciTech Connect

    Kiliccote, Sila; Piette, Mary Ann

    2008-10-01

    This report summarizes San Diego Gas & Electric Company's collaboration with the Demand Response Research Center to develop and test automation capability for the Capacity Bidding Program in 2007. The report describes the Open Automated Demand Response architecture and summarizes the history of technology development and pilot studies. It also outlines the Capacity Bidding Program and the technology being used by an aggregator that participated in this demand response program. Due to delays, the program was not fully operational for summer 2007. However, a test event on October 3, 2007, showed that the project successfully achieved the objective of developing and demonstrating how an open, Web-based, interoperable automated notification system for capacity bidding can be used by aggregators for demand response. The system was effective in initiating a fully automated demand response shed at the aggregated sites. This project also demonstrated how aggregators can integrate their demand response automation systems with San Diego Gas & Electric Company's Demand Response Automation Server and capacity bidding program.

  14. Automated External Defibrillator

    MedlinePlus

    What Is an Automated External Defibrillator? An automated external defibrillator (AED) is a portable device that ...

  15. Technical Advances of the Recombinant Antibody Microarray Technology Platform for Clinical Immunoproteomics

    PubMed Central

    Delfani, Payam; Dexlin Mellby, Linda; Nordström, Malin; Holmér, Andreas; Ohlsson, Mattias; Borrebaeck, Carl A. K.; Wingren, Christer

    2016-01-01

    In the quest for deciphering disease-associated biomarkers, high-performing tools for multiplexed protein expression profiling of crude clinical samples will be crucial. Affinity proteomics, mainly represented by antibody-based microarrays, has during recent years been established as a proteomic tool providing unique opportunities for parallelized protein expression profiling. But despite the progress, several main technical features and assay procedures remain to be (fully) resolved. Among these issues, the handling of protein microarray data, i.e. the biostatistics parts, is one of the key features to solve. In this study, we have therefore further optimized, validated, and standardized our in-house designed recombinant antibody microarray technology platform. To this end, we addressed the main remaining technical issues (e.g. antibody quality, array production, sample labelling, and selected assay conditions) and, most importantly, key biostatistics subjects (e.g. array data pre-processing and biomarker panel condensation). This represents one of the first antibody array studies in which these key biostatistics subjects have been studied in detail. Here, we thus present the next generation of the recombinant antibody microarray technology platform designed for clinical immunoproteomics. PMID:27414037

  16. Technical Advances of the Recombinant Antibody Microarray Technology Platform for Clinical Immunoproteomics.

    PubMed

    Delfani, Payam; Dexlin Mellby, Linda; Nordström, Malin; Holmér, Andreas; Ohlsson, Mattias; Borrebaeck, Carl A K; Wingren, Christer

    2016-01-01

    In the quest for deciphering disease-associated biomarkers, high-performing tools for multiplexed protein expression profiling of crude clinical samples will be crucial. Affinity proteomics, mainly represented by antibody-based microarrays, has during recent years been established as a proteomic tool providing unique opportunities for parallelized protein expression profiling. But despite the progress, several main technical features and assay procedures remain to be (fully) resolved. Among these issues, the handling of protein microarray data, i.e. the biostatistics parts, is one of the key features to solve. In this study, we have therefore further optimized, validated, and standardized our in-house designed recombinant antibody microarray technology platform. To this end, we addressed the main remaining technical issues (e.g. antibody quality, array production, sample labelling, and selected assay conditions) and, most importantly, key biostatistics subjects (e.g. array data pre-processing and biomarker panel condensation). This represents one of the first antibody array studies in which these key biostatistics subjects have been studied in detail. Here, we thus present the next generation of the recombinant antibody microarray technology platform designed for clinical immunoproteomics.

  17. MIMAS: an innovative tool for network-based high density oligonucleotide microarray data management and annotation

    PubMed Central

    Hermida, Leandro; Schaad, Olivier; Demougin, Philippe; Descombes, Patrick; Primig, Michael

    2006-01-01

    Background The high-density oligonucleotide microarray (GeneChip) is an important tool for molecular biological research aiming at large-scale detection of small nucleotide polymorphisms in DNA and genome-wide analysis of mRNA concentrations. Local array data management solutions are instrumental for efficient processing of the results and for subsequent uploading of data and annotations to a global certified data repository at the EBI (ArrayExpress) or the NCBI (GeneOmnibus). Description To facilitate and accelerate annotation of high-throughput expression profiling experiments, the Microarray Information Management and Annotation System (MIMAS) was developed. The system is fully compliant with the Minimal Information About a Microarray Experiment (MIAME) convention. MIMAS provides life scientists with a highly flexible and focused GeneChip data storage and annotation platform essential for subsequent analysis and interpretation of experimental results with clustering and mining tools. The system software can be downloaded for academic use upon request. Conclusion MIMAS implements a novel concept for nation-wide GeneChip data management whereby a network of facilities is centered on one data node directly connected to the European certified public microarray data repository located at the EBI. The solution proposed may serve as a prototype approach to array data management between research institutes organized in a consortium. PMID:16597336

  18. Automated Miniaturized Instrument for Space Biology Applications and the Monitoring of the Astronauts Health Onboard the ISS

    NASA Technical Reports Server (NTRS)

    Karouia, Fathi; Peyvan, Kia; Danley, David; Ricco, Antonio J.; Santos, Orlando; Pohorille, Andrew

    2011-01-01

    Human space travelers experience a unique environment that affects homeostasis and physiologic adaptation. The spacecraft environment subjects the traveler to noise, chemical and microbiological contaminants, increased radiation, and variable gravity forces. As humans prepare for long-duration missions to the International Space Station (ISS) and beyond, effective measures must be developed, verified and implemented to ensure mission success. Limited biomedical quantitative capabilities are currently available onboard the ISS. Therefore, the development of versatile instruments to perform space biological analysis and to monitor astronauts' health is needed. We are developing a fully automated, miniaturized system for measuring gene expression on small spacecraft in order to better understand the influence of the space environment on biological systems. This low-cost, low-power, multi-purpose instrument represents a major scientific and technological advancement by providing data on cellular metabolism and regulation. The current system will support growth of microorganisms, extract and purify the RNA, hybridize it to the array, read the expression levels of a large number of genes by microarray analysis, and transmit the measurements to Earth. The system will help discover how bacteria develop resistance to antibiotics and how pathogenic bacteria sometimes increase their virulence in space, facilitating the development of adequate countermeasures to decrease risks associated with human spaceflight. The current stand-alone technology could be used as an integrated platform onboard the ISS to perform similar genetic analyses on any biological systems from the tree of life. Additionally, with some modification the system could be implemented to perform real-time in-situ microbial monitoring of the ISS environment (air, surface and water samples) and the astronaut's microbiome using 16S rRNA microarray technology. Furthermore, the current system can be enhanced

  19. Automation: triumph or trap?

    PubMed

    Smythe, M H

    1997-01-01

    Automation, a hot topic in the laboratory world today, can be a very expensive option. Those who are considering implementing automation can save time and money by examining the issues from the standpoint of an industrial/manufacturing engineer. The engineer not only asks what problems will be solved by automation, but what problems will be created. This article discusses questions that must be asked and answered to ensure that automation efforts will yield real and substantial payoffs.

  20. A new method for gridding DNA microarrays.

    PubMed

    Charalambous, Christoforos C; Matsopoulos, George K

    2013-10-01

    In this paper, a new methodological scheme for the gridding of DNA microarrays is proposed. The scheme comprises a series of processes applied sequentially. Each DNA microarray image is pre-processed to remove any noise and the center of each spot is detected using a template matching algorithm. Then, an initial gridding is automatically placed on the DNA microarray image by 'building' rectangular pyramids around the detected spots' centers. The gridlines "move" between the pyramids, horizontally and vertically, forming this initial grid. Furthermore, a five-step refinement process is applied to correct gridding imperfections caused by the initial placement, both in cases with no spot and in cases where a grid cell encloses more than one spot. The proposed gridding scheme is applied to DNA microarray images under known transformations and to real-world DNA data. Its performance is compared against the projection pursuit method, which is often used due to its speed and simplicity, as well as against a state-of-the-art method, the Optimal Multi-level Thresholding Gridding (OMTG). According to the obtained results, the proposed gridding scheme outperforms both methods, qualitatively and quantitatively.
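
    A rough sketch of the first two stages described above, spot-center detection by template matching followed by placing a gridline midway between detected spot columns, is given below. The pyramid construction and five-step refinement are not reproduced, and the template radius, threshold, and synthetic image are illustrative assumptions.

```python
# A simplified sketch of spot-center detection by template matching and a
# gridline placed between detected spot columns; the paper's pyramid-based
# refinement is not reproduced here, and all parameters are illustrative.
import numpy as np
from scipy.signal import fftconvolve

def detect_spot_centers(img: np.ndarray, radius: int = 5, thresh: float = 0.6) -> np.ndarray:
    # Zero-mean circular template so flat background correlates to roughly 0.
    y, x = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    template = (x**2 + y**2 <= radius**2).astype(float)
    template -= template.mean()
    score = fftconvolve(img - img.mean(), template[::-1, ::-1], mode="same")
    score /= score.max()
    return np.argwhere(score > thresh)        # (row, col) points near spot centers

# Synthetic image with two spots in different grid columns.
img = np.zeros((64, 64))
img[15:20, 15:20] = 1.0
img[45:50, 45:50] = 1.0

centers = detect_spot_centers(img)
cols = np.unique(centers[:, 1])
i = np.diff(cols).argmax()
# A vertical gridline is placed midway across the largest gap between columns.
print("vertical gridline near column:", (cols[i] + cols[i + 1]) / 2)
```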

  1. Diagnostic Oligonucleotide Microarray Fingerprinting of Bacillus Isolates

    SciTech Connect

    Chandler, Darrell P.; Alferov, Oleg; Chernov, Boris; Daly, Don S.; Golova, Julia; Perov, Alexander N.; Protic, Miroslava; Robison, Richard; Shipma, Matthew; White, Amanda M.; Willse, Alan R.

    2006-01-01

    A diagnostic, genome-independent microbial fingerprinting method using DNA oligonucleotide microarrays was used for high-resolution differentiation between closely related Bacillus strains, including two strains of Bacillus anthracis that are monomorphic (indistinguishable) via amplified fragment length polymorphism fingerprinting techniques. Replicated hybridizations on 391-probe nonamer arrays were used to construct a prototype fingerprint library for quantitative comparisons. Descriptive analysis of the fingerprints, including phylogenetic reconstruction, is consistent with previous taxonomic organization of the genus. Newly developed statistical analysis methods were used to quantitatively compare and objectively confirm apparent differences in microarray fingerprints with the statistical rigor required for microbial forensics and clinical diagnostics. These data suggest that a relatively simple fingerprinting microarray and statistical analysis method can differentiate between species in the Bacillus cereus complex, and between strains of B. anthracis. A synthetic DNA standard was used to understand underlying microarray and process-level variability, leading to specific recommendations for the development of a standard operating procedure and/or continued technology enhancements for microbial forensics and diagnostics.

  2. Data Analysis Strategies for Protein Microarrays

    PubMed Central

    Díez, Paula; Dasilva, Noelia; González-González, María; Matarraz, Sergio; Casado-Vela, Juan; Orfao, Alberto; Fuentes, Manuel

    2012-01-01

    Microarrays constitute a new platform which allows the discovery and characterization of proteins. According to different features, such as content, surface or detection system, there are many types of protein microarrays which can be applied for the identification of disease biomarkers and the characterization of protein expression patterns. However, the analysis and interpretation of the amount of information generated by microarrays remain a challenge. Further data analysis strategies are essential to obtain representative and reproducible results. Therefore, the experimental design is key, since the number of samples and dyes, among other aspects, would define the appropriate analysis method to be used. In this sense, several algorithms have been proposed so far to overcome analytical difficulties derived from fluorescence overlapping and/or background noise. Each kind of microarray is developed to fulfill a specific purpose. Therefore, the selection of appropriate analytical and data analysis strategies is crucial to achieve successful biological conclusions. In the present review, we focus on current algorithms and main strategies for data interpretation.

  3. Shrinkage covariance matrix approach for microarray data

    NASA Astrophysics Data System (ADS)

    Karjanto, Suryaefiza; Aripin, Rasimah

    2013-04-01

    Microarray technology was developed for the purpose of monitoring the expression levels of thousands of genes. A microarray data set typically consists of tens of thousands of genes (variables) from just dozens of samples due to various constraints including the high cost of producing microarray chips. As a result, the widely used standard covariance estimator is not appropriate for this purpose. One such technique is Hotelling's T2 statistic, which is a multivariate test statistic for comparing means between two groups. It requires that the number of observations (n) exceed the number of genes (p) in the set, but in microarray studies it is common that n < p. This leads to a biased estimate of the covariance matrix. In this study, the Hotelling's T2 statistic with the shrinkage approach is proposed to estimate the covariance matrix for testing differential gene expression. The performance of this approach is then compared with other commonly used multivariate tests using a widely analysed diabetes data set as an illustration. The results across the methods are consistent, implying that this approach provides an alternative to existing techniques.
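
    As a sketch of the approach described above, the example below uses the Ledoit-Wolf shrinkage estimator (an assumption; the paper's shrinkage target may differ) to obtain an invertible covariance matrix when genes outnumber samples, and then computes a Hotelling-type T2 statistic comparing two groups. The simulated data and dimensions are illustrative.

```python
# A minimal sketch of a shrinkage-covariance Hotelling T2, assuming a
# Ledoit-Wolf style estimator (not necessarily the paper's exact shrinkage
# target): compare two groups when genes (p) outnumber samples (n).
import numpy as np
from sklearn.covariance import LedoitWolf

rng = np.random.default_rng(5)
p, n1, n2 = 200, 15, 15                        # p genes >> samples per group
X1 = rng.normal(0.0, 1.0, size=(n1, p))
X2 = rng.normal(0.2, 1.0, size=(n2, p))        # small mean shift in group 2

diff = X1.mean(axis=0) - X2.mean(axis=0)
pooled = np.vstack([X1 - X1.mean(axis=0), X2 - X2.mean(axis=0)])
S = LedoitWolf().fit(pooled).covariance_       # shrunken, invertible covariance
S_inv = np.linalg.inv(S)

t2 = (n1 * n2) / (n1 + n2) * diff @ S_inv @ diff
print("shrinkage Hotelling T2:", round(float(t2), 1))
```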

  4. Microarrays (DNA Chips) for the Classroom Laboratory

    ERIC Educational Resources Information Center

    Barnard, Betsy; Sussman, Michael; BonDurant, Sandra Splinter; Nienhuis, James; Krysan, Patrick

    2006-01-01

    We have developed and optimized the necessary laboratory materials to make DNA microarray technology accessible to all high school students at a fraction of both cost and data size. The primary component is a DNA chip/array that students "print" by hand and then analyze using research tools that have been adapted for classroom use. The primary…

  5. Data Analysis Strategies for Protein Microarrays

    PubMed Central

    Díez, Paula; Dasilva, Noelia; González-González, María; Matarraz, Sergio; Casado-Vela, Juan; Orfao, Alberto; Fuentes, Manuel

    2012-01-01

    Microarrays constitute a new platform which allows the discovery and characterization of proteins. According to different features, such as content, surface or detection system, there are many types of protein microarrays which can be applied for the identification of disease biomarkers and the characterization of protein expression patterns. However, the analysis and interpretation of the amount of information generated by microarrays remain a challenge. Further data analysis strategies are essential to obtain representative and reproducible results. Therefore, the experimental design is key, since the number of samples and dyes, among other aspects, defines the appropriate analysis method to be used. In this sense, several algorithms have been proposed so far to overcome analytical difficulties derived from fluorescence overlapping and/or background noise. Each kind of microarray is developed to fulfill a specific purpose. Therefore, the selection of appropriate analytical and data analysis strategies is crucial to achieve successful biological conclusions. In the present review, we focus on current algorithms and main strategies for data interpretation. PMID:27605336

  6. DISC-BASED IMMUNOASSAY MICROARRAYS. (R825433)

    EPA Science Inventory

    Microarray technology as applied to areas that include genomics, diagnostics, environmental, and drug discovery, is an interesting research topic for which different chip-based devices have been developed. As an alternative, we have explored the principle of compact disc-based...

  7. MICROARRAY DATA ANALYSIS USING MULTIPLE STATISTICAL MODELS

    EPA Science Inventory

    Microarray Data Analysis Using Multiple Statistical Models

    Wenjun Bao(1), Judith E. Schmid(1), Amber K. Goetz(1), Ming Ouyang(2), William J. Welsh(2), Andrew I. Brooks(3,4), ChiYi Chu(3), Mitsunori Ogihara(3,4), Yinhe Cheng(5), David J. Dix(1). (1)National Health and Environmental Effects Researc...

  8. Raman-based microarray readout: a review.

    PubMed

    Haisch, Christoph

    2016-07-01

    For a quarter of a century, microarrays have been part of the routine analytical toolbox. Label-based fluorescence detection is still the commonest optical readout strategy. Since the 1990s, a continuously increasing number of label-based as well as label-free experiments on Raman-based microarray readout concepts have been reported. This review summarizes the possible concepts and methods and their advantages and challenges. A common label-based strategy is based on the binding of selective receptors as well as Raman reporter molecules to plasmonic nanoparticles in a sandwich immunoassay, which results in surface-enhanced Raman scattering signals of the reporter molecule. Alternatively, capture of the analytes can be performed by receptors on a microarray surface. Addition of plasmonic nanoparticles again leads to a surface-enhanced Raman scattering signal, not of a label but directly of the analyte. This approach is mostly proposed for bacteria and cell detection. However, although many promising readout strategies have been discussed in numerous publications, rarely have any of them made the step from proof of concept to a practical application, let alone routine use. Graphical abstract: Possible realization of a SERS (Surface-Enhanced Raman Scattering) system for microarray readout. PMID:26973235

  9. PRACTICAL STRATEGIES FOR PROCESSING AND ANALYZING SPOTTED OLIGONUCLEOTIDE MICROARRAY DATA

    EPA Science Inventory

    Thoughtful data analysis is as important as experimental design, biological sample quality, and appropriate experimental procedures for making microarrays a useful supplement to traditional toxicology. In the present study, spotted oligonucleotide microarrays were used to profile...

  10. Automation of surface observations program

    NASA Technical Reports Server (NTRS)

    Short, Steve E.

    1988-01-01

    At present, surface weather observing methods are still largely manual and labor intensive. Through the nationwide implementation of Automated Surface Observing Systems (ASOS), this situation can be improved. Two ASOS capability levels are planned. The first is a basic-level system which will automatically observe the weather parameters essential for aviation operations and will operate either with or without supplemental contributions by an observer. The second is a more fully automated, stand-alone system which will observe and report the full range of weather parameters and will operate primarily in the unattended mode. Approximately 250 systems are planned by the end of the decade. When deployed, these systems will generate the standard hourly and special long-line transmitted weather observations, as well as provide continuous weather information direct to airport users. Specific ASOS configurations will vary depending upon whether the operation is unattended, minimally attended, or fully attended. The major functions of ASOS are data collection, data processing, product distribution, and system control. The program phases of development, demonstration, production system acquisition, and operational implementation are described.

  11. Workflow automation architecture standard

    SciTech Connect

    Moshofsky, R.P.; Rohen, W.T.

    1994-11-14

    This document presents an architectural standard for application of workflow automation technology. The standard includes a functional architecture, process for developing an automated workflow system for a work group, functional and collateral specifications for workflow automation, and results of a proof of concept prototype.

  12. Examining microarray slide quality for the EPA using SNL's hyperspectral microarray scanner.

    SciTech Connect

    Rohde, Rachel M.; Timlin, Jerilyn Ann

    2005-11-01

    This report summarizes research performed at Sandia National Laboratories (SNL) in collaboration with the Environmental Protection Agency (EPA) to assess microarray quality on arrays from two platforms of interest to the EPA. Custom microarrays from two novel, commercially produced array platforms were imaged with SNL's unique hyperspectral imaging technology and multivariate data analysis was performed to investigate sources of emission on the arrays. No extraneous sources of emission were evident in any of the array areas scanned. This led to the conclusions that either of these array platforms could produce high quality, reliable microarray data for the EPA toxicology programs. Hyperspectral imaging results are presented and recommendations for microarray analyses using these platforms are detailed within the report.

  13. Facilitating functional annotation of chicken microarray data

    PubMed Central

    2009-01-01

    Background: Modeling results from chicken microarray studies is challenging for researchers due to the sparse functional annotation associated with these arrays. The Affymetrix GeneChip chicken genome array, one of the biggest arrays that serve as a key research tool for the study of chicken functional genomics, is among the few arrays that link gene products to Gene Ontology (GO). However, the GO annotation data presented by Affymetrix are incomplete; for example, they do not show references linked to manually annotated functions. In addition, there is no tool that lets microarray researchers directly retrieve functional annotations for their datasets from the annotated arrays. This costs researchers considerable time searching multiple GO databases for functional information. Results: We have improved the breadth of functional annotations of the gene products associated with probesets on the Affymetrix chicken genome array by 45% and the quality of annotation by 14%. We have also identified the most significant diseases and disorders, different types of genes, and known drug targets represented on the Affymetrix chicken genome array. To facilitate functional annotation of other arrays and microarray experimental datasets, we developed an Array GO Mapper (AGOM) tool to help researchers quickly retrieve corresponding functional information for their datasets. Conclusion: Results from this study will directly facilitate annotation of other chicken arrays and microarray experimental datasets. Researchers will be able to quickly model their microarray datasets into more reliable biological functional information by using the AGOM tool. The diseases, disorders, gene types and drug targets revealed in the study will allow researchers to learn more about how genes function in complex biological systems and may lead to new drug discovery and development of therapies. The GO annotation data generated will be available for public use via the AgBase website and will be updated on a regular basis.

  14. An automated microfluidic platform for C. elegans embryo arraying, phenotyping, and long-term live imaging

    NASA Astrophysics Data System (ADS)

    Cornaglia, Matteo; Mouchiroud, Laurent; Marette, Alexis; Narasimhan, Shreya; Lehnert, Thomas; Jovaisaite, Virginija; Auwerx, Johan; Gijs, Martin A. M.

    2015-05-01

    Studies of the real-time dynamics of embryonic development require a gentle embryo handling method, the possibility of long-term live imaging during the complete embryogenesis, as well as of parallelization providing a population’s statistics, while keeping single embryo resolution. We describe an automated approach that fully accomplishes these requirements for embryos of Caenorhabditis elegans, one of the most employed model organisms in biomedical research. We developed a microfluidic platform which makes use of pure passive hydrodynamics to run on-chip worm cultures, from which we obtain synchronized embryo populations, and to immobilize these embryos in incubator microarrays for long-term high-resolution optical imaging. We successfully employ our platform to investigate morphogenesis and mitochondrial biogenesis during the full embryonic development and elucidate the role of the mitochondrial unfolded protein response (UPRmt) within C. elegans embryogenesis. Our method can be generally used for protein expression and developmental studies at the embryonic level, but can also provide clues to understand the aging process and age-related diseases in particular.

  15. An automated microfluidic platform for C. elegans embryo arraying, phenotyping, and long-term live imaging

    PubMed Central

    Cornaglia, Matteo; Mouchiroud, Laurent; Marette, Alexis; Narasimhan, Shreya; Lehnert, Thomas; Jovaisaite, Virginija; Auwerx, Johan; Gijs, Martin A. M.

    2015-01-01

    Studies of the real-time dynamics of embryonic development require a gentle embryo handling method, the possibility of long-term live imaging during the complete embryogenesis, as well as of parallelization providing a population’s statistics, while keeping single embryo resolution. We describe an automated approach that fully accomplishes these requirements for embryos of Caenorhabditis elegans, one of the most employed model organisms in biomedical research. We developed a microfluidic platform which makes use of pure passive hydrodynamics to run on-chip worm cultures, from which we obtain synchronized embryo populations, and to immobilize these embryos in incubator microarrays for long-term high-resolution optical imaging. We successfully employ our platform to investigate morphogenesis and mitochondrial biogenesis during the full embryonic development and elucidate the role of the mitochondrial unfolded protein response (UPRmt) within C. elegans embryogenesis. Our method can be generally used for protein expression and developmental studies at the embryonic level, but can also provide clues to understand the aging process and age-related diseases in particular. PMID:25950235

  16. Fully depleted back illuminated CCD

    DOEpatents

    Holland, Stephen Edward

    2001-01-01

    A backside illuminated charge coupled device (CCD) is formed of a relatively thick high resistivity photon sensitive silicon substrate, with frontside electronic circuitry, and an optically transparent backside ohmic contact for applying a backside voltage which is at least sufficient to substantially fully deplete the substrate. A greater bias voltage which overdepletes the substrate may also be applied. One way of applying the bias voltage to the substrate is by physically connecting the voltage source to the ohmic contact. An alternate way of applying the bias voltage to the substrate is to physically connect the voltage source to the frontside of the substrate, at a point outside the depletion region. Thus both frontside and backside contacts can be used for backside biasing to fully deplete the substrate. Also, high resistivity gaps around the CCD channels and electrically floating channel stop regions can be provided in the CCD array around the CCD channels. The CCD array forms an imaging sensor useful in astronomy.

  17. Fully synthetic taped insulation cables

    DOEpatents

    Forsyth, Eric B.; Muller, Albert C.

    1984-01-01

    A high voltage oil-impregnated electrical cable with fully polymer taped insulation operable to 765 kV. Biaxially oriented, specially processed, polyethylene, polybutene or polypropylene tape with an embossed pattern is wound in multiple layers over a conductive core with a permeable screen around the insulation. Conventional oil which closely matches the dielectric constant of the tape is used, and the cable can be impregnated after field installation because of its excellent impregnation characteristics.

  18. Shoe-String Automation

    SciTech Connect

    Duncan, M.L.

    2001-07-30

    Faced with a downsizing organization, serious budget reductions and retirement of key metrology personnel, maintaining capabilities to provide necessary services to our customers was becoming increasingly difficult. It appeared that the only solution was to automate some of our more personnel-intensive processes; however, it was crucial that the most personnel-intensive candidate process be automated, at the lowest price possible and with the lowest risk of failure. This discussion relates factors in the selection of the Standard Leak Calibration System for automation, the methods of automation used to provide the lowest-cost solution and the benefits realized as a result of the automation.

  19. Automation of industrial bioprocesses.

    PubMed

    Beyeler, W; DaPra, E; Schneider, K

    2000-01-01

    The dramatic development of new electronic devices within the last 25 years has had a substantial influence on the control and automation of industrial bioprocesses. Within this short period of time the method of controlling industrial bioprocesses has changed completely. In this paper, the authors will use a practical approach focusing on the industrial applications of automation systems. From the early attempts to use computers for the automation of biotechnological processes up to the modern process automation systems some milestones are highlighted. Special attention is given to the influence of Standards and Guidelines on the development of automation systems.

  20. Implementation of and experiences with new automation.

    PubMed

    Mahmud, I; Kim, D

    2000-01-01

    In an environment where cost, timeliness, and quality drive the business, it is essential to look for answers in technology where these challenges can be met. In the Novartis Pharmaceutical Quality Assurance Department, automation and robotics have become just the tools to meet these challenges. Although automation is a relatively new concept in our department, we have fully embraced it within just a few years. As our company went through a merger, there was a significant reduction in the workforce within the Quality Assurance Department through voluntary and involuntary separations. However, the workload remained constant or in some cases actually increased. So even with the reduction in laboratory personnel, we were challenged internally and from the headquarters in Basle to improve productivity while maintaining integrity in quality testing. Benchmark studies indicated the Suffern site to be the choice manufacturing site above other facilities. This is attributed to the Suffern facility employees' commitment to reduce cycle time, improve efficiency, and maintain a high level of regulatory compliance. One of the stronger contributing factors was automation technology in the laboratories, and this technology will continue to help the site's status in the future. The Automation Group was originally formed about 2 years ago to meet the demands of high quality assurance testing throughput needs and to bring our testing group up to standard with the industry. Automation began with only two people in the group and now we have three people who are the next generation automation scientists. Even with such a small staff, we have made great strides in laboratory automation as we have worked extensively with each piece of equipment brought in. The implementation process of each project was often difficult because the second generation automation group came from the laboratory without much automation experience. However, with the involvement from the users at 'get-go', we were

  1. Structured oligonucleotides for target indexing to allow single-vessel PCR amplification and solid support microarray hybridization.

    PubMed

    Girard, Laurie D; Boissinot, Karel; Peytavi, Régis; Boissinot, Maurice; Bergeron, Michel G

    2015-02-01

    The combination of molecular diagnostic technologies is increasingly used to overcome limitations on sensitivity, specificity or multiplexing capabilities, and provide efficient lab-on-chip devices. Two such techniques, PCR amplification and microarray hybridization, are used serially to take advantage of the high sensitivity and specificity of the former combined with the high multiplexing capacity of the latter. These methods are usually performed in different buffers and reaction chambers. However, these elaborate methods have high complexity and cost related to reagent requirements, liquid storage and the number of reaction chambers to integrate into automated devices. Furthermore, microarray hybridizations have a sequence-dependent efficiency that is not always predictable. In this work, we have developed the concept of a structured oligonucleotide probe which is activated by cleavage from polymerase exonuclease activity. This technology is called SCISSOHR for Structured Cleavage Induced Single-Stranded Oligonucleotide Hybridization Reaction. The SCISSOHR probes enable indexing the target sequence to a tag sequence. The SCISSOHR technology also allows the combination of nucleic acid amplification and microarray hybridization in a single vessel in the presence of the PCR buffer only. The SCISSOHR technology uses an amplification probe that is irreversibly modified in the presence of the target, releasing a single-stranded DNA tag for microarray hybridization. Each tag is composed of a 3-nucleotide sequence-dependent segment and a unique "target sequence-independent" 14-nucleotide segment allowing for optimal hybridization with minimal cross-hybridization. We evaluated the performance of five (5) PCR buffers to support microarray hybridization, compared to a conventional hybridization buffer. Finally, as a proof of concept, we developed a multiplexed assay for the amplification, detection, and identification of three (3) DNA targets. This new technology will facilitate the design
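    As a purely hypothetical illustration of the kind of screen that such tag design implies (our sketch, not the authors' procedure), one can generate candidate 14-nucleotide tags and keep only those that are mutually dissimilar, so that cross-hybridization between tags on the array is unlikely. The GC bounds and distance threshold below are assumptions.

```python
import random

def hamming(a: str, b: str) -> int:
    """Number of mismatched positions between two equal-length sequences."""
    return sum(c1 != c2 for c1, c2 in zip(a, b))

def pick_tags(n_tags=3, length=14, min_dist=7, gc_bounds=(0.4, 0.6), seed=1):
    """Greedily select mutually dissimilar tag sequences (illustrative only)."""
    rng = random.Random(seed)
    tags = []
    while len(tags) < n_tags:
        cand = "".join(rng.choice("ACGT") for _ in range(length))
        gc = (cand.count("G") + cand.count("C")) / length
        if not (gc_bounds[0] <= gc <= gc_bounds[1]):
            continue  # reject tags with extreme GC content
        if all(hamming(cand, t) >= min_dist for t in tags):
            tags.append(cand)
    return tags

print(pick_tags())  # three candidate 14-mer tag sequences
```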

  2. Experience of automation failures in training: effects on trust, automation bias, complacency and performance.

    PubMed

    Sauer, Juergen; Chavaillaz, Alain; Wastell, David

    2016-06-01

    This work examined the effects of operators' exposure to various types of automation failures in training. Forty-five participants were trained for 3.5 h on a simulated process control environment. During training, participants either experienced a fully reliable, automatic fault repair facility (i.e. faults detected and correctly diagnosed), a misdiagnosis-prone one (i.e. faults detected but not correctly diagnosed) or a miss-prone one (i.e. faults not detected). One week after training, participants were tested for 3 h, experiencing two types of automation failures (misdiagnosis, miss). The results showed that automation bias was very high when operators trained on miss-prone automation encountered a failure of the diagnostic system. Operator errors resulting from automation bias were much higher when automation misdiagnosed a fault than when it missed one. Differences in trust levels that were instilled by the different training experiences disappeared during the testing session. Practitioner Summary: The experience of automation failures during training has some consequences. A greater potential for operator errors may be expected when an automatic system failed to diagnose a fault than when it failed to detect one.

  3. GEPAS, a web-based tool for microarray data analysis and interpretation

    PubMed Central

    Tárraga, Joaquín; Medina, Ignacio; Carbonell, José; Huerta-Cepas, Jaime; Minguez, Pablo; Alloza, Eva; Al-Shahrour, Fátima; Vegas-Azcárate, Susana; Goetz, Stefan; Escobar, Pablo; Garcia-Garcia, Francisco; Conesa, Ana; Montaner, David; Dopazo, Joaquín

    2008-01-01

    Gene Expression Profile Analysis Suite (GEPAS) is one of the most complete and extensively used web-based packages for microarray data analysis. During its more than 5 years of activity it has continuously been updated to keep pace with the state of the art in the changing microarray data analysis arena. GEPAS offers diverse analysis options that include well-established as well as novel algorithms for normalization, gene selection, class prediction, clustering and functional profiling of the experiment. New options for time-course (or dose-response) experiments, microarray-based class prediction, new clustering methods and new tests for differential expression have been included. The new pipeliner module automates the execution of sequential analysis steps by means of a simple but powerful graphic interface. An extensive re-engineering of GEPAS has been carried out which includes the use of web services and Web 2.0 technology features, a new user interface with persistent sessions and a new extended database of gene identifiers. GEPAS is nowadays the most cited web tool in its field; it is extensively used by researchers in many countries, and its records indicate an average usage rate of 500 experiments per day. GEPAS is available at http://www.gepas.org. PMID:18508806

  4. Detecting and genotyping Escherichia coli O157:H7 using multiplexed PCR and nucleic acid microarrays.

    PubMed

    Call, D R; Brockman, F J; Chandler, D P

    2001-07-20

    Rapid detection and characterization of food borne pathogens such as Escherichia coli O157:H7 is crucial for epidemiological investigations and food safety surveillance. As an alternative to conventional technologies, we examined the sensitivity and specificity of nucleic acid microarrays for detecting and genotyping E. coli O157:H7. The array was composed of oligonucleotide probes (25-30 mer) complementary to four virulence loci (intimin, Shiga-like toxins I and II, and hemolysin A). Target DNA was amplified from whole cells or from purified DNA via single or multiplexed polymerase chain reaction (PCR), and PCR products were hybridized to the array without further modification or purification. The array was 32-fold more sensitive than gel electrophoresis and capable of detecting amplification products from < 1 cell equivalent of genomic DNA (1 fg). Immunomagnetic capture, PCR and a microarray were subsequently used to detect 55 CFU ml(-1) (E. coli O157:H7) from chicken rinsate without the aid of pre-enrichment. Four isolates of E. coli O157:H7 and one isolate of O91:H2, for which genotypic data were available, were unambiguously genotyped with this array. Glass-based microarrays are relatively simple to construct and provide a rapid and sensitive means to detect multiplexed PCR products; the system is amenable to automation.

  5. Detecting and Genotyping Escherichia coli O157:H7 using multiplexed PCR and nucleic acid microarrays

    SciTech Connect

    Call, Douglas R.; Brockman, Fred J. ); Chandler, Darrell P.

    2000-12-01

    Rapid detection and characterization of food borne pathogens such as Escherichia coli O157:H7 is crucial for epidemiological investigations and food safety surveillance. As an alternative to conventional technologies, we examined the sensitivity and specificity of nucleic acid microarrays for detecting and genotyping E. coli O157:H7. The array was composed of oligonucleotide probes (25-30 mer) complementary to four virulence loci (intimin, Shiga-like toxins I and II, and hemolysin A). Target DNA was amplified from whole cells or from purified DNA via single or multiplexed polymerase chain reaction (PCR), and PCR products were hybridized to the array without further modification or purification. The array was 32-fold more sensitive than gel electrophoresis and capable of detecting amplification products from < 1 cell equivalent of genomic DNA (1 fg). Immunomagnetic capture, PCR and a microarray were subsequently used to detect 55 CFU ml(-1) (E. coli O157:H7) from chicken rinsate without the aid of pre-enrichment. Four isolates of E. coli O157:H7 and one isolate of O91:H2, for which genotypic data were available, were unambiguously genotyped with this array. Glass-based microarrays are relatively simple to construct and provide a rapid and sensitive means to detect multiplexed PCR products; the system is amenable to automation.

  6. Detecting and genotyping Escherichia coli O157:H7 using multiplexed PCR and nucleic acid microarrays

    SciTech Connect

    Call, Douglas R.; Brockman, Fred J.; Chandler, Darrell P.

    2001-07-05

    Rapid detection and characterization of food borne pathogens such as Escherichia coli O157:H7 is crucial for epidemiological investigations and food safety surveillance. As an alternative to conventional technologies, we examined the sensitivity and specificity of nucleic acid microarrays for detecting and genotyping E. coli O157:H7. The array was composed of oligonucleotide probes (25-30 mer) complementary to four virulence loci (intimin, Shiga-like toxins I and II, and hemolysin A). Target DNA was amplified from whole cells or from purified DNA via single or multiplexed polymerase chain reaction (PCR), and PCR products were hybridized to the array without further modification or purification. The array was 32-fold more sensitive than gel electrophoresis and capable of detecting amplification products from < 1 cell equivalent of genomic DNA (1 fg). Immunomagnetic capture, PCR and a microarray were subsequently used to detect 55 CFU ml(-1) (E. coli O157:H7) from chicken rinsate without the aid of pre-enrichment. Four isolates of E. coli O157:H7 and one isolate of O91:H2, for which genotypic data were available, were unambiguously genotyped with this array. Glass-based microarrays are relatively simple to construct and provide a rapid and sensitive means to detect multiplexed PCR products; the system is amenable to automation.

  7. Viral diagnosis in Indian livestock using customized microarray chips.

    PubMed

    Yadav, Brijesh S; Pokhriyal, Mayank; Ratta, Barkha; Kumar, Ajay; Saxena, Meeta; Sharma, Bhaskar

    2015-01-01

    Viral diagnosis in Indian livestock using customized microarray chips has gained momentum in recent years, and it is now possible to design customized microarray chips for viruses infecting livestock in India. Customized microarray chips identified Bovine herpes virus-1 (BHV-1), Canine Adeno Virus-1 (CAV-1), and Canine Parvo Virus-2 (CPV-2) in clinical samples. Probes identified by the microarray were further confirmed using RT-PCR in all clinical and known samples. The application of microarray chips during viral disease outbreaks in Indian livestock is therefore feasible where conventional methods are unsuitable, although customized applications require a detailed cost-efficiency calculation.

  8. Advancing translational research with next-generation protein microarrays.

    PubMed

    Yu, Xiaobo; Petritis, Brianne; LaBaer, Joshua

    2016-04-01

    Protein microarrays are a high-throughput technology used increasingly in translational research, seeking to apply basic science findings to enhance human health. In addition to assessing protein levels, posttranslational modifications, and signaling pathways in patient samples, protein microarrays have aided in the identification of potential protein biomarkers of disease and infection. In this perspective, the different types of full-length protein microarrays that are used in translational research are reviewed. Specific studies employing these microarrays are presented to highlight their potential in finding solutions to real clinical problems. Finally, the criteria that should be considered when developing next-generation protein microarrays are provided. PMID:26749402

  9. Automated optimization techniques for aircraft synthesis

    NASA Technical Reports Server (NTRS)

    Vanderplaats, G. N.

    1976-01-01

    Application of numerical optimization techniques to automated conceptual aircraft design is examined. These methods are shown to be a general and efficient way to obtain quantitative information for evaluating alternative new vehicle projects. Fully automated design is compared with traditional point design methods and time and resource requirements for automated design are given. The NASA Ames Research Center aircraft synthesis program (ACSYNT) is described with special attention to calculation of the weight of a vehicle to fly a specified mission. The ACSYNT procedures for automatically obtaining sensitivity of the design (aircraft weight, performance and cost) to various vehicle, mission, and material technology parameters are presented. Examples are used to demonstrate the efficient application of these techniques.

  10. Automated MAD and MIR structure solution.

    PubMed

    Terwilliger, T C; Berendzen, J

    1999-04-01

    Obtaining an electron-density map from X-ray diffraction data can be difficult and time-consuming even after the data have been collected, largely because MIR and MAD structure determinations currently require many subjective evaluations of the qualities of trial heavy-atom partial structures before a correct heavy-atom solution is obtained. A set of criteria for evaluating the quality of heavy-atom partial solutions in macromolecular crystallography has been developed. These criteria have allowed the conversion of the crystal structure-solution process into an optimization problem and have allowed its automation. The SOLVE software has been used to solve MAD data sets with as many as 52 selenium sites in the asymmetric unit. The automated structure-solution process developed is a major step towards the fully automated structure-determination, model-building and refinement procedure which is needed for genomic-scale structure determinations. PMID:10089316

  11. Statistical Considerations for Analysis of Microarray Experiments

    PubMed Central

    Owzar, Kouros; Barry, William T.; Jung, Sin-Ho

    2014-01-01

    Microarray technologies enable the simultaneous interrogation of the expression of thousands of genes from a biospecimen sample taken from a patient. This large set of expression measurements generates a genetic profile of the patient that may be used to identify potential prognostic or predictive genes or genetic models for clinical outcomes. The aim of this article is to provide a broad overview of some of the major statistical considerations for the design and analysis of microarray experiments conducted as correlative science studies to clinical trials. An emphasis is placed on how a lack of understanding and the improper use of statistical concepts and methods lead to noise discovery and misinterpretation of experimental results. PMID:22212230
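    One concrete instance of the kind of consideration discussed above (the choice of the Benjamini-Hochberg procedure here is ours, not drawn from the article): when thousands of genes are tested at once, discoveries should be declared by controlling the false discovery rate rather than by thresholding raw p-values.

```python
import numpy as np

def benjamini_hochberg(pvals, alpha=0.05):
    """Return a boolean mask of discoveries at FDR level alpha."""
    p = np.asarray(pvals)
    order = np.argsort(p)
    m = len(p)
    thresholds = alpha * (np.arange(1, m + 1) / m)
    passed = p[order] <= thresholds
    # Largest k such that the k-th smallest p-value passes its threshold
    k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
    mask = np.zeros(m, dtype=bool)
    mask[order[:k]] = True
    return mask

# 10,000 genes: most null, a small fraction truly shifted
rng = np.random.default_rng(0)
pvals = np.concatenate([rng.uniform(size=9900), rng.uniform(0, 1e-4, size=100)])
print("discoveries:", benjamini_hochberg(pvals).sum())
```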

  12. Plasmonically amplified fluorescence bioassay with microarray format

    NASA Astrophysics Data System (ADS)

    Gogalic, S.; Hageneder, S.; Ctortecka, C.; Bauch, M.; Khan, I.; Preininger, Claudia; Sauer, U.; Dostalek, J.

    2015-05-01

    Plasmonic amplification of the fluorescence signal in bioassays with a microarray detection format is reported. A crossed relief diffraction grating was designed to couple an excitation laser beam to surface plasmons at a wavelength overlapping with the absorption and emission bands of the fluorophore Dy647, which was used as a label. The surface of the periodically corrugated sensor chip was coated with a surface plasmon-supporting gold layer and a thin SU8 polymer film carrying epoxy groups. These groups were employed for the covalent immobilization of capture antibodies at arrays of spots. The plasmonic amplification of the fluorescence signal on the developed microarray chip was tested by using an interleukin 8 sandwich immunoassay. The readout was performed ex situ after drying the chip by using a commercial scanner with a high-numerical-aperture collecting lens. The results reveal an enhancement of the fluorescence signal by a factor of 5 compared to a regular glass chip.

  13. Microarrays: how many do you need?

    PubMed

    Zien, Alexander; Fluck, Juliane; Zimmer, Ralf; Lengauer, Thomas

    2003-01-01

    We estimate the number of microarrays that is required in order to gain reliable results from a common type of study: the pairwise comparison of different classes of samples. We show that current knowledge allows for the construction of models that look realistic with respect to searches for individual differentially expressed genes and derive prototypical parameters from real data sets. Such models allow investigation of the dependence of the required number of samples on the relevant parameters: the biological variability of the samples within each class, the fold changes in expression that are desired to be detected, the detection sensitivity of the microarrays, and the acceptable error rates of the results. We supply experimentalists with general conclusions as well as a freely accessible Java applet at www.scai.fhg.de/special/bio/howmanyarrays/ for fine tuning simulations to their particular settings. PMID:12935350
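    A back-of-the-envelope version of the question posed above, using a standard two-sample normal-approximation power calculation rather than the authors' simulation model; the variability, fold change, and error rates below are placeholder values.

```python
import math
from scipy import stats

def arrays_per_group(log2_fold_change=1.0, sd=0.7, alpha=1e-4, power=0.9):
    """Approximate per-group sample size for a two-sample comparison of log2 ratios.

    alpha is set very low to reflect testing thousands of genes simultaneously.
    Uses the normal approximation n = 2 * ((z_{1-alpha/2} + z_{power}) * sd / delta)^2.
    """
    z_a = stats.norm.ppf(1 - alpha / 2)
    z_b = stats.norm.ppf(power)
    return math.ceil(2 * ((z_a + z_b) * sd / log2_fold_change) ** 2)

print(arrays_per_group())        # arrays per class to detect a 2-fold change
print(arrays_per_group(sd=1.2))  # more biological variability -> more arrays
```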

  14. A Flexible Microarray Data Simulation Model

    PubMed Central

    Dembélé, Doulaye

    2013-01-01

    Microarray technology allows monitoring of gene expression at the genome level. This is useful when searching for genes involved in a disease. The performance of the methods used to select interesting genes is most often judged after other analyses (qPCR validation, search in databases...), which are also subject to error. A good evaluation of gene selection methods is possible with data whose characteristics are known, that is to say, synthetic data. We propose a model to simulate microarray data with characteristics similar to the data commonly produced by current platforms. The parameters used in this model are described to allow the user to generate data with varying characteristics. In order to show the flexibility of the proposed model, a commented example is given and illustrated. An R package is available for immediate use.
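    A toy version of such a simulation (not the R package described above, whose parameterization is richer): generate log2 expression values for two groups with a known subset of differentially expressed genes, so that gene-selection methods can be benchmarked against ground truth. All parameter values below are illustrative.

```python
import numpy as np

def simulate_two_group(n_genes=5000, n_per_group=6, n_de=250,
                       effect=1.0, gene_sd_range=(0.2, 0.8), seed=0):
    """Simulate log2 expression for control/treatment with known DE genes."""
    rng = np.random.default_rng(seed)
    baseline = rng.normal(8.0, 2.0, size=n_genes)          # gene-specific means
    sd = rng.uniform(*gene_sd_range, size=n_genes)          # gene-specific noise
    de_idx = rng.choice(n_genes, size=n_de, replace=False)  # ground-truth DE set
    shift = np.zeros(n_genes)
    shift[de_idx] = rng.choice([-effect, effect], size=n_de)
    control = baseline[:, None] + rng.normal(0, sd[:, None], (n_genes, n_per_group))
    treated = (baseline + shift)[:, None] + rng.normal(0, sd[:, None], (n_genes, n_per_group))
    return control, treated, de_idx

control, treated, truth = simulate_two_group()
print(control.shape, treated.shape, len(truth))
```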

  15. Profiling protein function with small molecule microarrays

    PubMed Central

    Winssinger, Nicolas; Ficarro, Scott; Schultz, Peter G.; Harris, Jennifer L.

    2002-01-01

    The regulation of protein function through posttranslational modification, local environment, and protein–protein interaction is critical to cellular function. The ability to analyze on a genome-wide scale protein functional activity rather than changes in protein abundance or structure would provide important new insights into complex biological processes. Herein, we report the application of a spatially addressable small molecule microarray to an activity-based profile of proteases in crude cell lysates. The potential of this small molecule-based profiling technology is demonstrated by the detection of caspase activation upon induction of apoptosis, characterization of the activated caspase, and inhibition of the caspase-executed apoptotic phenotype using the small molecule inhibitor identified in the microarray-based profile. PMID:12167675

  16. Application of DNA Microarray to Clinical Diagnostics.

    PubMed

    Patel, Ankita; Cheung, Sau W

    2016-01-01

    Microarray-based technology to conduct array comparative genomic hybridization (aCGH) has made a significant impact on the diagnosis of human genetic diseases. Such diagnoses, previously undetectable by traditional G-banding chromosome analysis, are now achieved by identifying genomic copy number variants (CNVs) using the microarray. Not only can hundreds of well-characterized genetic syndromes be detected in a single assay, but new genomic disorders and disease-causing genes can also be discovered through the utilization of aCGH technology. Although other platforms such as single nucleotide polymorphism (SNP) arrays can be used for detecting CNVs, in this chapter we focus on describing the methods for performing aCGH using Agilent oligonucleotide arrays for both prenatal (e.g., amniotic fluid and chorionic villus sample) and postnatal samples (e.g., blood).
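    A simplified illustration of how aCGH data are typically interpreted (our sketch, not the chapter's protocol): per-probe log2 ratios of patient versus reference signal are smoothed along the chromosome, and regions that deviate from zero are flagged as candidate copy-number gains or losses. The window size and thresholds are assumptions.

```python
import numpy as np

def call_cnv(log2_ratios, window=25, gain=0.3, loss=-0.3):
    """Flag candidate gains/losses from probe-ordered log2(test/reference) ratios."""
    kernel = np.ones(window) / window
    smoothed = np.convolve(log2_ratios, kernel, mode="same")  # moving average
    calls = np.where(smoothed > gain, "gain",
                     np.where(smoothed < loss, "loss", "normal"))
    return smoothed, calls

# Toy chromosome: 1000 probes with a simulated single-copy gain (~log2(3/2))
rng = np.random.default_rng(0)
ratios = rng.normal(0, 0.2, 1000)
ratios[400:550] += 0.58
smoothed, calls = call_cnv(ratios)
print("probes called as gain:", (calls == "gain").sum())
```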

  17. Automated ILA design for synchronous sequential circuits

    NASA Technical Reports Server (NTRS)

    Liu, M. N.; Liu, K. Z.; Maki, G. K.; Whitaker, S. R.

    1991-01-01

    An iterative logic array (ILA) architecture for synchronous sequential circuits is presented. This technique utilizes linear algebra to produce the design equations. The ILA realization of synchronous sequential logic can be fully automated with a computer program. A programmable design procedure is proposed to fulfill the design task and layout generation. A software algorithm in the C language has been developed and tested to generate 1 micron CMOS layouts using the Hewlett-Packard FUNGEN module generator shell.

  18. Automated DNA Sequencing System

    SciTech Connect

    Armstrong, G.A.; Ekkebus, C.P.; Hauser, L.J.; Kress, R.L.; Mural, R.J.

    1999-04-25

    Oak Ridge National Laboratory (ORNL) is developing a core DNA sequencing facility to support biological research endeavors at ORNL and to conduct basic sequencing automation research. This facility is novel because its development is based on existing standard biology laboratory equipment; thus, the development process is of interest to the many small laboratories trying to use automation to control costs and increase throughput. Before automation, biology laboratory personnel purified DNA, completed cycle sequencing, and prepared 96-well sample plates with commercially available hardware designed specifically for each step in the process. Following purification and thermal cycling, an automated sequencing machine was used for the sequencing. A technician handled all movement of the 96-well sample plates between machines. To automate the process, ORNL is adding a CRS Robotics A-465 arm, an ABI 377 sequencing machine, an automated centrifuge, an automated refrigerator, and possibly an automated SpeedVac. The entire system will be integrated with one central controller that will direct each machine and the robot. The goal of this system is to completely automate the sequencing procedure from bacterial cell samples through ready-to-be-sequenced DNA and ultimately to completed sequence. The system will be flexible and will accommodate different chemistries than existing automated sequencing lines. The system will be expanded in the future to include colony picking and/or actual sequencing. This discrete-event DNA sequencing system will demonstrate that smaller sequencing labs can achieve cost-effective automation as the laboratory grows.

  19. Automated MAD and MIR structure solution

    SciTech Connect

    Terwilliger, Thomas C.; Berendzen, Joel

    1999-04-01

    A fully automated procedure for solving MIR and MAD structures has been developed using a scoring scheme to convert the structure-solution process into an optimization problem. Obtaining an electron-density map from X-ray diffraction data can be difficult and time-consuming even after the data have been collected, largely because MIR and MAD structure determinations currently require many subjective evaluations of the qualities of trial heavy-atom partial structures before a correct heavy-atom solution is obtained. A set of criteria for evaluating the quality of heavy-atom partial solutions in macromolecular crystallography has been developed. These criteria have allowed the conversion of the crystal structure-solution process into an optimization problem and have allowed its automation. The SOLVE software has been used to solve MAD data sets with as many as 52 selenium sites in the asymmetric unit. The automated structure-solution process developed is a major step towards the fully automated structure-determination, model-building and refinement procedure which is needed for genomic-scale structure determinations.

  20. Weighted analysis of general microarray experiments

    PubMed Central

    Sjögren, Anders; Kristiansson, Erik; Rudemo, Mats; Nerman, Olle

    2007-01-01

    Background: In DNA microarray experiments, measurements from different biological samples are often assumed to be independent and to have identical variance. For many datasets these assumptions have been shown to be invalid and typically lead to too optimistic p-values. A method called WAME has been proposed where a variance is estimated for each sample and a covariance is estimated for each pair of samples. The current version of WAME is, however, limited to experiments with paired design, e.g. two-channel microarrays. Results: The WAME procedure is extended to general microarray experiments, making it capable of handling both one- and two-channel datasets. Two public one-channel datasets are analysed and WAME detects both unequal variances and correlations. WAME is compared to other common methods: fold-change ranking, ordinary linear model with t-tests, LIMMA and weighted LIMMA. The p-value distributions are shown to differ greatly between the examined methods. In a resampling-based simulation study, the p-values generated by WAME are found to be substantially more correct than the alternatives when a relatively small proportion of the genes is regulated. WAME is also shown to have higher power than the other methods. WAME is available as an R-package. Conclusion: The WAME procedure is generalized and the limitation to paired-design microarray datasets is removed. The examined other methods produce invalid p-values in many cases, while WAME is shown to produce essentially valid p-values when a relatively small proportion of genes is regulated. WAME is also shown to have higher power than the examined alternative methods. PMID:17937807
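    A condensed sketch of the weighting idea summarized above (not the WAME R package itself): when each sample has its own variance and some samples are correlated, a per-gene group effect can be estimated by generalized least squares, using an estimated sample covariance matrix as the weight. The covariance values below are invented for illustration.

```python
import numpy as np

def gls_group_difference(y, design, sigma):
    """GLS estimate of effects for one gene.

    y: expression values for one gene across samples (length n)
    design: n x k design matrix (e.g. intercept + group indicator)
    sigma: n x n sample covariance matrix (unequal variances / correlations)
    """
    w = np.linalg.inv(sigma)
    beta = np.linalg.solve(design.T @ w @ design, design.T @ w @ y)
    return beta

# 6 samples in two groups; sample 6 is noisier and samples 1/2 are correlated
design = np.column_stack([np.ones(6), np.repeat([0, 1], 3)])
sigma = np.diag([1.0, 1.0, 1.0, 1.0, 1.0, 3.0])
sigma[0, 1] = sigma[1, 0] = 0.5
y = np.array([5.1, 5.3, 4.9, 6.0, 6.2, 7.5])
print(gls_group_difference(y, design, sigma))  # [intercept, group difference]
```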