Science.gov

Sample records for fully automated microarray

  1. Fully automated analysis of multi-resolution four-channel micro-array genotyping data

    NASA Astrophysics Data System (ADS)

    Abbaspour, Mohsen; Abugharbieh, Rafeef; Podder, Mohua; Tebbutt, Scott J.

    2006-03-01

    We present a fully automated and robust microarray image analysis system for handling multi-resolution images (down to 3 μm, with sizes up to 80 MB per channel). The system is developed to provide rapid and accurate data extraction for our recently developed microarray analysis and quality control tool (SNP Chart). Currently available commercial microarray image analysis applications are inefficient, due to the considerable user interaction typically required. Four-channel DNA microarray technology is a robust and accurate tool for determining genotypes of multiple genetic markers in individuals. It plays an important role in the trend toward personalized genetic medicine, i.e. individualized therapy based on the patient's genetic heritage, in place of traditional one-size-fits-all treatments. However, fast, robust, and precise image processing tools are required before microarray-based genetic testing can be used in clinical practice to predict disease susceptibilities and drug effects, which requires a turnaround time compatible with clinical decision-making. In this paper we present a fully automated image analysis platform for the rapid investigation of hundreds of genetic variations across multiple genes. Validation tests indicate very high accuracy levels for genotyping results. Our method reduces analysis time from several hours to just a few minutes and is completely automated, requiring no manual interaction or guidance.

  2. Reagent preparation and storage for amplification of microarray hybridization targets with a fully automated system.

    PubMed

    Zhou, Mingjie; Marlowe, Jon; Graves, Jaime; Dahl, Jason; Riley, Zackery; Tian, Lena; Duenwald, Sven; Tokiwa, George; Fare, Thomas L

    2007-08-01

    The advent of automated systems for gene expression profiling has accentuated the need for convenient and cost-effective methods of reagent preparation. We have developed a method for the preparation and storage of pre-aliquoted cocktail plates that contain all reagents required for amplification of nucleic acid by reverse transcription and in vitro transcription reactions. Plates can be stored at -80 degrees C for at least 1 month and kept in a hotel at 4 degrees C for at least 24 h prior to use. Microarray data quality is not statistically different between cRNA amplified with stored cocktails and cRNA amplified with freshly prepared cocktails. Deployment of pre-aliquoted, stored cocktail plates in a fully automated system not only increases the throughput of amplifying cRNA targets from thousands of RNA samples, but could also considerably reduce reagent costs and potentially improve process robustness.

  3. Fully Automated Complementary DNA Microarray Segmentation using a Novel Fuzzy-based Algorithm.

    PubMed

    Saberkari, Hamidreza; Bahrami, Sheyda; Shamsi, Mousa; Amoshahy, Mohammad Javad; Ghavifekr, Habib Badri; Sedaaghi, Mohammad Hossein

    2015-01-01

    DNA microarrays are a powerful approach for simultaneously studying the expression of thousands of genes in a single experiment. In a microarray experiment, the average fluorescent intensity of each spot is calculated; these intensity values closely track the expression level of the corresponding gene. However, determining the correct position of every spot in microarray images is a major challenge, and a prerequisite for accurate classification of normal and abnormal (cancer) cells. In this paper, a preprocessing step first eliminates the noise and artifacts present in microarray images using nonlinear anisotropic diffusion filtering. Then, the center coordinates of each spot are located using mathematical morphology operations. Finally, the position of each spot is determined exactly by applying a novel hybrid model based on principal component analysis and the spatial fuzzy c-means (SFCM) clustering algorithm. Using a Gaussian kernel in the SFCM algorithm improves the quality of complementary DNA microarray segmentation. The performance of the proposed algorithm has been evaluated on real microarray images available in the Stanford Microarray Database. Results show that the segmentation accuracy of the proposed algorithm reaches 100% and 98% for noiseless and noisy cells, respectively.
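
    To make the clustering step concrete, below is a minimal fuzzy c-means sketch in Python that separates spot (foreground) pixels from background by intensity. It is illustrative only: the paper's SFCM adds a spatial regularization term and a Gaussian kernel distance, both omitted here, and the function name and quantile initialization are choices made for this example.

```python
import numpy as np

def fuzzy_cmeans(x, c=2, m=2.0, n_iter=50):
    """Plain fuzzy c-means on a 1-D intensity vector.
    Illustrative sketch: the paper's SFCM adds spatial context and a
    Gaussian kernel distance, which are omitted here."""
    # initialise cluster centers at evenly spaced quantiles of the data
    centers = np.quantile(x, np.linspace(0.1, 0.9, c))
    for _ in range(n_iter):
        # distance of every pixel to every center (guard against zeros)
        d = np.abs(x[None, :] - centers[:, None]) + 1e-12
        # standard FCM membership update: u_ik proportional to d_ik^(-2/(m-1))
        inv = d ** (-2.0 / (m - 1.0))
        u = inv / inv.sum(axis=0)
        # center update: membership-weighted mean of the pixels
        um = u ** m
        centers = (um @ x) / um.sum(axis=1)
    return centers, u

# toy image data: half background (0.1) and half spot (0.9) pixels
pixels = np.concatenate([np.full(60, 0.1), np.full(60, 0.9)])
centers, u = fuzzy_cmeans(pixels)
labels = u.argmax(axis=0)   # hard segmentation from the fuzzy memberships
```

    On this well-separated toy data the two recovered centers sit at the background and spot intensities, and the hard labels split the pixels into the two groups.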

  4. Fully Automated Complementary DNA Microarray Segmentation using a Novel Fuzzy-based Algorithm

    PubMed Central

    Saberkari, Hamidreza; Bahrami, Sheyda; Shamsi, Mousa; Amoshahy, Mohammad Javad; Ghavifekr, Habib Badri; Sedaaghi, Mohammad Hossein

    2015-01-01

    DNA microarrays are a powerful approach for simultaneously studying the expression of thousands of genes in a single experiment. In a microarray experiment, the average fluorescent intensity of each spot is calculated; these intensity values closely track the expression level of the corresponding gene. However, determining the correct position of every spot in microarray images is a major challenge, and a prerequisite for accurate classification of normal and abnormal (cancer) cells. In this paper, a preprocessing step first eliminates the noise and artifacts present in microarray images using nonlinear anisotropic diffusion filtering. Then, the center coordinates of each spot are located using mathematical morphology operations. Finally, the position of each spot is determined exactly by applying a novel hybrid model based on principal component analysis and the spatial fuzzy c-means (SFCM) clustering algorithm. Using a Gaussian kernel in the SFCM algorithm improves the quality of complementary DNA microarray segmentation. The performance of the proposed algorithm has been evaluated on real microarray images available in the Stanford Microarray Database. Results show that the segmentation accuracy of the proposed algorithm reaches 100% and 98% for noiseless and noisy cells, respectively. PMID:26284175

  5. WANTED: Fully Automated Indexing.

    ERIC Educational Resources Information Center

    Purcell, Royal

    1991-01-01

    Discussion of indexing focuses on the possibilities of fully automated indexing. Topics discussed include controlled indexing languages such as subject heading lists and thesauri, free indexing languages, natural indexing languages, computer-aided indexing, expert systems, and the need for greater creativity to further advance automated indexing.

  6. Fully automated protein purification

    PubMed Central

    Camper, DeMarco V.; Viola, Ronald E.

    2009-01-01

    Obtaining highly purified proteins is essential to begin investigating their functional and structural properties. The steps that are typically involved in purifying proteins can include an initial capture, intermediate purification, and a final polishing step. Completing these steps can take several days and require frequent attention to ensure success. Our goal was to design automated protocols that will allow the purification of proteins with minimal operator intervention. Separate methods have been produced and tested that automate the sample loading, column washing, sample elution and peak collection steps for ion-exchange, metal affinity, hydrophobic interaction and gel filtration chromatography. These individual methods are designed to be coupled and run sequentially in any order to achieve a flexible and fully automated protein purification protocol. PMID:19595984

  7. Fully automated urban traffic system

    NASA Technical Reports Server (NTRS)

    Dobrotin, B. M.; Hansen, G. R.; Peng, T. K. C.; Rennels, D. A.

    1977-01-01

    The replacement of the driver with an automatic system that could perform the functions of guiding and routing a vehicle, with a human's capability of responding to changing traffic demands, is discussed. The problem was divided into four technological areas: guidance, routing, computing, and communications. It was determined that the latter three areas were being developed independently of any need for fully automated urban traffic. A guidance system that would meet system requirements was not being developed but was technically feasible.

  8. Automated Microarray Image Analysis Toolbox for MATLAB

    SciTech Connect

    White, Amanda M.; Daly, Don S.; Willse, Alan R.; Protic, Miroslava; Chandler, Darrell P.

    2005-09-01

    The Automated Microarray Image Analysis (AMIA) Toolbox for MATLAB is a flexible, open-source microarray image analysis tool that allows the user to customize the analysis of sets of microarray images. The tool provides several methods for identifying spots and quantifying spot statistics, as well as extensive diagnostic statistics and images to flag poor data quality or processing problems. The open nature of this software allows researchers to understand the algorithms used to produce intensity estimates and to modify them easily if desired.
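
    As a concrete illustration of spot quantification, the sketch below computes one common robust estimate, the median foreground intensity minus the median local background, for a single spot. This is a generic illustration in Python, not AMIA's MATLAB API; the function name and radii are invented for the example.

```python
import numpy as np

def quantify_spot(img, cy, cx, r_fg=4, r_bg=8):
    """Median foreground minus median local-background intensity for one
    spot centered at (cy, cx). A generic robust estimate of the kind a
    toolbox like AMIA reports; not AMIA's actual API."""
    yy, xx = np.ogrid[:img.shape[0], :img.shape[1]]
    dist2 = (yy - cy) ** 2 + (xx - cx) ** 2
    fg = img[dist2 <= r_fg ** 2]                          # inner disc = spot
    bg = img[(dist2 > r_fg ** 2) & (dist2 <= r_bg ** 2)]  # annulus = background
    return float(np.median(fg) - np.median(bg))

# synthetic check: uniform background at 100 with a spot of intensity 500
img = np.full((21, 21), 100.0)
yy, xx = np.ogrid[:21, :21]
img[(yy - 10) ** 2 + (xx - 10) ** 2 <= 16] = 500.0
signal = quantify_spot(img, 10, 10)   # 500 - 100 = 400.0
```

    Medians rather than means are used so that a few saturated or dust-contaminated pixels do not skew the estimate.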

  9. Progress in Fully Automated Abdominal CT Interpretation

    PubMed Central

    Summers, Ronald M.

    2016-01-01

    OBJECTIVE Automated analysis of abdominal CT has advanced markedly over just the last few years. Fully automated assessment of organs, lymph nodes, adipose tissue, muscle, bowel, spine, and tumors are examples of areas where tremendous progress has been made. Computer-aided detection of lesions has also improved dramatically. CONCLUSION This article reviews this progress and provides insights into what is in store in the near future for automated analysis of abdominal CT, ultimately leading to fully automated interpretation. PMID:27101207

  10. Automated fully-stressed design with NASTRAN

    NASA Technical Reports Server (NTRS)

    Wallerstein, D. V.; Haggenmacher, G. W.

    1976-01-01

    An automated strength sizing capability is described. The technique determines the distribution of material among the elements of a structural model. The sizing is based on either a fully stressed design or a scaled feasible fully stressed design. Results obtained from the application of the strength sizing to the structural sizing of a composite material wing box using material strength allowables are presented. These results demonstrate the rapid convergence of the structural sizes to a usable design.

  11. Fully integrated, fully automated generation of short tandem repeat profiles

    PubMed Central

    2013-01-01

    Background The generation of short tandem repeat profiles, also referred to as ‘DNA typing,’ is not currently performed outside the laboratory because the process requires highly skilled technical operators and a controlled laboratory environment and infrastructure with several specialized instruments. The goal of this work was to develop a fully integrated system for the automated generation of short tandem repeat profiles from buccal swab samples, to improve forensic laboratory process flow as well as to enable short tandem repeat profile generation to be performed in police stations and in field-forward military, intelligence, and homeland security settings. Results An integrated system was developed consisting of an injection-molded microfluidic BioChipSet cassette, a ruggedized instrument, and expert system software. For each of five buccal swabs, the system purifies DNA using guanidinium-based lysis and silica binding, amplifies 15 short tandem repeat loci and the amelogenin locus, electrophoretically separates the resulting amplicons, and generates a profile. No operator processing of the samples is required, and the time from swab insertion to profile generation is 84 minutes. All required reagents are contained within the BioChipSet cassette; these consist of a lyophilized polymerase chain reaction mix and liquids for purification and electrophoretic separation. Profiles obtained from fully automated runs demonstrate that the integrated system generates concordant short tandem repeat profiles. The system exhibits single-base resolution from 100 to greater than 500 bases, with inter-run precision (standard deviation) of ±0.05-0.10 bases for most alleles. The reagents are stable for at least 6 months at 22°C, and the instrument has been designed and tested to Military Standard 810F for shock and vibration ruggedization. A nontechnical user can operate the system within or outside the laboratory. Conclusions The integrated system represents the

  12. A fully automated remote refraction system.

    PubMed

    Dyer, A M; Kirk, A H

    2000-01-01

    Traditional methods of performing refractions depend on a trained refractionist being present with the subject and conducting an interactive form of subjective testing. A fully automated refraction system was installed in 13 optical dispensaries, and after 15 months the patient and statistical information were gathered. The data from all operators were consistent and suggested a lack of operator effect on the refraction results. The mean of the SDs of the subjective sphere measurements was 0.2, slightly less than a quarter dioptre, which would be an acceptable level of accuracy for ordering corrective lenses. The present study suggests an absence of operator influence on the results of the refractions and a degree of consistency and accuracy compatible with the prescription of lenses.

  13. Fully Mechanically Controlled Automated Electron Microscopic Tomography

    PubMed Central

    Liu, Jinxin; Li, Hongchang; Zhang, Lei; Rames, Matthew; Zhang, Meng; Yu, Yadong; Peng, Bo; Celis, César Díaz; Xu, April; Zou, Qin; Yang, Xu; Chen, Xuefeng; Ren, Gang

    2016-01-01

    Knowledge of the three-dimensional (3D) structure of each individual particle of asymmetric and flexible proteins is essential to understanding those proteins' functions, but such structures are difficult to determine. Electron tomography (ET) provides a tool for imaging a single and unique biological object from a series of tilt angles, but it is challenging to image a single protein for 3D reconstruction due to the imperfect mechanical control of the specimen goniometer under both medium to high magnification (approximately 50,000–160,000×) and an optimized beam coherence condition. Here, we report a fully mechanical control method for automating ET data acquisition without using beam tilt/shift processes. This method avoids the accumulation of the beam tilt/shift that is conventionally used to compensate for mechanical control error but degrades beam coherence. Our method works by minimizing the error of the target object center during the tilting process through a closed-loop proportional-integral (PI) control algorithm. Validations by both negative staining (NS) and cryo-electron microscopy (cryo-EM) suggest that this method has a capability comparable to other ET methods in tracking target proteins while maintaining optimized beam coherence conditions for imaging. PMID:27403922
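
    The closed-loop idea can be sketched with a toy discrete PI loop in Python. The gains, the constant-drift stage model, and the function names are all invented for illustration; the authors' actual controller acts on the target-centre error measured in microscope images.

```python
class PIController:
    """Minimal discrete PI controller: correction = Kp*e + Ki*sum(e)."""
    def __init__(self, kp=0.6, ki=0.2):
        self.kp, self.ki, self.integral = kp, ki, 0.0

    def update(self, error):
        self.integral += error
        return self.kp * error + self.ki * self.integral

def simulate(drift=5.0, steps=60):
    """Toy stage model: each tilt step shifts the target by `drift`
    (arbitrary units); the PI loop drives the measured error to zero."""
    e, ctrl = 0.0, PIController()
    for _ in range(steps):
        u = ctrl.update(e)    # correction computed from the current error
        e = e + drift - u     # stage drifts again, correction is applied
    return abs(e)             # residual centring error
```

    The integral term learns the constant per-step drift, so the residual error decays to zero rather than settling at a proportional-only offset.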

  14. Fully Mechanically Controlled Automated Electron Microscopic Tomography.

    PubMed

    Liu, Jinxin; Li, Hongchang; Zhang, Lei; Rames, Matthew; Zhang, Meng; Yu, Yadong; Peng, Bo; Celis, César Díaz; Xu, April; Zou, Qin; Yang, Xu; Chen, Xuefeng; Ren, Gang

    2016-07-11

    Knowledge of the three-dimensional (3D) structure of each individual particle of asymmetric and flexible proteins is essential to understanding those proteins' functions, but such structures are difficult to determine. Electron tomography (ET) provides a tool for imaging a single and unique biological object from a series of tilt angles, but it is challenging to image a single protein for 3D reconstruction due to the imperfect mechanical control of the specimen goniometer under both medium to high magnification (approximately 50,000-160,000×) and an optimized beam coherence condition. Here, we report a fully mechanical control method for automating ET data acquisition without using beam tilt/shift processes. This method avoids the accumulation of the beam tilt/shift that is conventionally used to compensate for mechanical control error but degrades beam coherence. Our method works by minimizing the error of the target object center during the tilting process through a closed-loop proportional-integral (PI) control algorithm. Validations by both negative staining (NS) and cryo-electron microscopy (cryo-EM) suggest that this method has a capability comparable to other ET methods in tracking target proteins while maintaining optimized beam coherence conditions for imaging.

  15. Fully Mechanically Controlled Automated Electron Microscopic Tomography

    NASA Astrophysics Data System (ADS)

    Liu, Jinxin; Li, Hongchang; Zhang, Lei; Rames, Matthew; Zhang, Meng; Yu, Yadong; Peng, Bo; Celis, César Díaz; Xu, April; Zou, Qin; Yang, Xu; Chen, Xuefeng; Ren, Gang

    2016-07-01

    Knowledge of the three-dimensional (3D) structure of each individual particle of asymmetric and flexible proteins is essential to understanding those proteins' functions, but such structures are difficult to determine. Electron tomography (ET) provides a tool for imaging a single and unique biological object from a series of tilt angles, but it is challenging to image a single protein for 3D reconstruction due to the imperfect mechanical control of the specimen goniometer under both medium to high magnification (approximately 50,000–160,000×) and an optimized beam coherence condition. Here, we report a fully mechanical control method for automating ET data acquisition without using beam tilt/shift processes. This method avoids the accumulation of the beam tilt/shift that is conventionally used to compensate for mechanical control error but degrades beam coherence. Our method works by minimizing the error of the target object center during the tilting process through a closed-loop proportional-integral (PI) control algorithm. Validations by both negative staining (NS) and cryo-electron microscopy (cryo-EM) suggest that this method has a capability comparable to other ET methods in tracking target proteins while maintaining optimized beam coherence conditions for imaging.

  16. A microfluidic device for the automated electrical readout of low-density glass-slide microarrays.

    PubMed

    Díaz-González, María; Salvador, J Pablo; Bonilla, Diana; Marco, M Pilar; Fernández-Sánchez, César; Baldi, Antoni

    2015-12-15

    Microarrays are a powerful platform for rapid and multiplexed analysis in a wide range of research fields. Electrical readout systems have emerged as an alternative to conventional optical methods for microarray analysis thanks to their potential advantages, such as low cost, low power consumption, and easy miniaturization of the required instrumentation. In this work, an automated electrical readout system for low-cost glass-slide microarrays is described. The system enables the simultaneous conductimetric detection of up to 36 biorecognition events by incorporating an array of interdigitated electrode transducers. A polydimethylsiloxane microfluidic structure has been designed that creates microwells over the transducers and incorporates the microfluidic channels required for filling and draining them with readout and cleaning solutions, making the readout process fully automated. Since the capture biomolecules are not immobilized on the transducer surface, this readout system is reusable, in contrast to previously reported electrochemical microarrays. A low-density microarray based on a competitive enzymatic immunoassay for atrazine detection was used to test the performance of the readout system. The electrical assay shows a detection limit of 0.22 ± 0.03 μg L(-1), similar to that obtained with fluorescent detection, and allows the direct determination of the pesticide in polluted water samples. These results show that an electrical readout system such as the one presented here is a reliable and cost-effective alternative to fluorescence scanners for the analysis of low-density microarrays.

  17. Automated target preparation for microarray-based gene expression analysis.

    PubMed

    Raymond, Frédéric; Metairon, Sylviane; Borner, Roland; Hofmann, Markus; Kussmann, Martin

    2006-09-15

    DNA microarrays have rapidly evolved into a platform for massively parallel gene expression analysis. Despite its widespread use, the technology has been criticized as vulnerable to technical variability. Addressing this issue, recent comparative interplatform and interlaboratory studies have revealed that, given defined procedures for "wet lab" experiments and data processing, satisfactory reproducibility and little experimental variability can be achieved. In view of these advances in standardization, the requirement for uniform sample preparation becomes evident, especially if a microarray platform is used as a facility, i.e., by different users working in the laboratory. While one option to reduce technical variability is to dedicate one laboratory technician to all microarray studies, we decided instead to automate the entire RNA sample preparation by implementing a liquid handling system coupled to a thermocycler and a microtiter plate reader. Automated RNA sample preparation prior to chip analysis enables (1) the reduction of experimentally caused result variability, (2) the separation of (important) biological variability from (undesired) experimental variation, and (3) interstudy comparison of gene expression results. Our robotic platform can process up to 24 samples in parallel, using an automated sample preparation method that produces high-quality biotin-labeled cRNA ready to be hybridized on Affymetrix GeneChips. The results show that technical interexperiment variation is less pronounced than with manually prepared samples. Moreover, experiments using the same starting material showed that the automated process yields good reproducibility between samples.

  18. A home-built, fully automated observatory

    NASA Astrophysics Data System (ADS)

    Beales, M.

    2010-12-01

    This paper describes the design of an automated observatory making use of off-the-shelf components and software. I make no claims for originality in the design but it has been an interesting and rewarding exercise to get all the components to work together.

  19. Integrated Microfluidic Devices for Automated Microarray-Based Gene Expression and Genotyping Analysis

    NASA Astrophysics Data System (ADS)

    Liu, Robin H.; Lodes, Mike; Fuji, H. Sho; Danley, David; McShea, Andrew

    Microarray assays typically involve multistage sample processing and fluidic handling, which are generally labor-intensive and time-consuming. Automation of these processes would improve robustness, reduce run-to-run and operator-to-operator variation, and reduce costs. In this chapter, a fully integrated and self-contained microfluidic biochip device developed to automate the fluidic handling steps for microarray-based gene expression or genotyping analysis is presented. The device consists of a semiconductor-based CustomArray® chip with 12,000 features and a microfluidic cartridge. The CustomArray was manufactured using a semiconductor-based in situ synthesis technology. The microfluidic cartridge consists of microfluidic pumps, mixers, valves, fluid channels, and reagent storage chambers. Microarray hybridization and subsequent fluidic handling and reactions (including a number of washing and labeling steps) were performed in this fully automated and miniature device before fluorescent image scanning of the microarray chip. Electrochemical micropumps were integrated in the cartridge to provide pumping of liquid solutions. A micromixing technique based on gas bubbling generated by electrochemical micropumps was developed. Low-cost check valves were implemented in the cartridge to prevent cross-talk of the stored reagents. Gene expression study of the human leukemia cell line (K562) and genotyping detection and sequencing of influenza A subtypes have been demonstrated using this integrated biochip platform. For gene expression assays, the microfluidic CustomArray device detected sample RNAs at concentrations as low as 0.375 pM. Detection was quantitative over more than three orders of magnitude. Experiments also showed that chip-to-chip variability was low, indicating that the integrated microfluidic devices eliminate manual fluidic handling steps that can be a significant source of variability in genomic analysis. The genotyping results showed

  20. Evaluation of a novel automated allergy microarray platform compared with three other allergy test methods.

    PubMed

    Williams, P; Önell, A; Baldracchini, F; Hui, V; Jolles, S; El-Shanawany, T

    2016-04-01

    Microarray platforms, enabling simultaneous measurement of many allergens with a small serum sample, are potentially powerful tools in allergy diagnostics. We report here the first study comparing a fully automated microarray system, the Microtest allergy system, with a manual microarray platform, Immuno-Solid phase Allergen Chip (ISAC), and two well-established singleplex allergy tests, the skin prick test (SPT) and ImmunoCAP, all performed on the same patients. One hundred and three adult allergic patients attending the allergy clinic were included in the study. All patients were tested with the four methods (SPT, ImmunoCAP, Microtest and ISAC 112), and a total of 3485 pairwise test results were analysed and compared. The four methods showed comparable results, with a positive/negative agreement of 81-88% for any pair of test methods compared, which is in line with data in the literature. The most prevalent allergens (cat, dog, mite, timothy, birch and peanut) and their individual allergen components revealed agreement between methods, with correlation coefficients between 0·73 and 0·95. All four methods gave deviating individual results for a minority of patients. These results indicate that microarray platforms are efficient and useful tools to characterize the specific immunoglobulin (Ig)E profile of allergic patients using a small volume of serum. The results produced by the Microtest system were in agreement with diagnostic tests in current use. Further data collection and evaluation are needed for other populations, geographical regions and allergens.
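
    For reference, the positive/negative agreement statistic quoted above is simply the fraction of paired results where both methods give the same call. A minimal Python sketch, with invented example data:

```python
def pn_agreement(calls_a, calls_b):
    """Fraction of paired test results where both methods agree
    (positive/positive or negative/negative)."""
    if len(calls_a) != len(calls_b):
        raise ValueError("methods must be compared on the same tests")
    same = sum(a == b for a, b in zip(calls_a, calls_b))
    return same / len(calls_a)

# hypothetical calls for 5 allergens by two methods (1 = positive, 0 = negative)
method_1 = [1, 1, 0, 0, 1]
method_2 = [1, 0, 0, 0, 1]
agreement = pn_agreement(method_1, method_2)   # 4 of 5 agree -> 0.8
```

    Note that overall agreement alone does not distinguish positive from negative discordance; studies like this one usually also inspect the discordant pairs individually.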

  1. A fully automated robotic system for high throughput fermentation.

    PubMed

    Zimmermann, Hartmut F; Rieth, Jochen

    2007-03-01

    High throughput robotic systems have been used since the 1990s to carry out biochemical assays in microtiter plates. However, before such systems can be applied in industrial fermentation process development, some important specific demands must be taken into account: sufficient oxygen supply, optimal growth temperature, minimized sample evaporation, avoidance of contamination, and simple but reliable process monitoring. A fully automated solution in which all these aspects have been taken into account is presented.

  2. A fully automated digitally controlled 30-inch telescope

    NASA Technical Reports Server (NTRS)

    Colgate, S. A.; Moore, E. P.; Carlson, R.

    1975-01-01

    A fully automated 30-inch (75-cm) telescope has been successfully designed and constructed from a military surplus Nike-Ajax radar mount. Novel features include: closed-loop operation between mountain telescope and campus computer 30 km apart via microwave link, a TV-type sensor which is photon shot-noise limited, a special lightweight primary mirror, and a stepping motor drive capable of slewing and settling one degree in one second or a radian in fifteen seconds.

  3. Programmable and automated bead-based microfluidics for versatile DNA microarrays under isothermal conditions.

    PubMed

    Penchovsky, Robert

    2013-06-21

    Advances in modern genomic research depend heavily on various devices for automated high- or ultra-high-throughput arrays. Micro- and nanofluidics offer possibilities for miniaturization and integration of many different arrays onto a single device. Such devices are therefore becoming a platform of choice for developing analytical instruments for modern biotechnology. This paper presents an implementation of a bead-based microfluidic platform for fully automated and programmable DNA microarrays. The devices are designed to work under isothermal conditions: DNA immobilization and hybridization transfer are performed at a steady temperature using reversible pH alterations of the reaction solutions. This offers the possibility of integrating more selection modules onto a single chip than is possible when maintaining a temperature gradient. The technology allows integration of many modules on a single reusable chip, reducing the application cost. The method takes advantage of the demonstrated high-speed DNA hybridization and denaturation kinetics on beads under flow conditions, the high fidelity of DNA hybridization, and the small sample volumes required. The microfluidic devices are applied to single nucleotide polymorphism analysis and to DNA sequencing by synthesis without the need for a fluorescent-label removal step. Beyond these applications, the presented microfluidic platform is applicable to many areas of modern biotechnology, including biosensor devices, DNA hybridization microarrays, molecular computation, on-chip nucleic acid selection, and high-throughput screening of chemical libraries for drug discovery.

  4. Automation of cDNA microarray hybridization and washing yields improved data quality.

    PubMed

    Yauk, Carole; Berndt, Lynn; Williams, Andrew; Douglas, George R

    2005-07-29

    Microarray technology allows the analysis of whole-genome transcription within a single hybridization, and has become a standard research tool. It is extremely important to minimize variation in order to obtain high quality microarray data that can be compared among experiments and laboratories. The majority of facilities implement manual hybridization approaches for microarray studies. We developed an automated method for cDNA microarray hybridization that uses equivalent pre-hybridization, hybridization and washing conditions to the suggested manual protocol. The automated method significantly decreased variability across microarray slides compared to manual hybridization. Although normalized signal intensities for buffer-only spots across the chips were identical, significantly reduced variation and inter-quartile ranges were obtained using the automated workstation. This decreased variation led to improved correlation among technical replicates across slides in both the Cy3 and Cy5 channels.
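
    The spread statistic used in the comparison, the inter-quartile range, can be computed as below; the data values are invented for illustration.

```python
import numpy as np

def iqr(values):
    """Inter-quartile range: spread between the 25th and 75th percentiles,
    a robust measure of slide-to-slide variability."""
    q1, q3 = np.percentile(values, [25, 75])
    return q3 - q1

# hypothetical normalized log-ratios for the same probes from two workflows
manual    = [0.1, -0.4, 0.6, -0.2, 0.9, -0.7, 0.3, -0.5]
automated = [0.05, -0.1, 0.15, -0.05, 0.2, -0.15, 0.1, -0.2]
```

    A smaller IQR for the automated workflow on the same probes is the kind of evidence the study reports for reduced variability.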

  5. A fully automated Chimera methodology for multiple moving body problems

    NASA Astrophysics Data System (ADS)

    Wang, Z. J.; Parthasarathy, V.

    2000-08-01

    A fully automated Chimera methodology has been developed in this study to provide the geometric or stencil information required to facilitate inter-grid data communication. Chimera holes are cut automatically in each grid of an overset grid system based on whether the grid overlaps with non-penetrable surfaces (NPS) and/or blocked regions. The efficiency of the hole-cutting algorithm is boosted with search algorithms based on state-of-the-art alternating digital tree (ADT) data structures. The automated nature of the hole-cutting algorithm is ideally suited for handling multiple moving body problems. Several cases, both steady and unsteady, are used to demonstrate the effectiveness of the methodology.
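
    The hole-cutting step can be illustrated with a deliberately naive Python sketch that flags grid points falling inside blocked regions, modelled here as axis-aligned boxes. A production code would replace the linear scan with an alternating-digital-tree search, as the paper does; all names below are invented.

```python
def cut_holes(points, blocked_boxes):
    """Return a hole flag per grid point: True if the point lies inside
    any blocked region. Boxes are ((xmin, ymin), (xmax, ymax)).
    Naive O(N*M) scan; a real Chimera code accelerates this with an ADT."""
    def inside(p, box):
        (xmin, ymin), (xmax, ymax) = box
        return xmin <= p[0] <= xmax and ymin <= p[1] <= ymax
    return [any(inside(p, b) for b in blocked_boxes) for p in points]

grid = [(0.0, 0.0), (2.0, 2.0), (5.0, 5.0)]
holes = cut_holes(grid, [((1.0, 1.0), (3.0, 3.0))])   # only (2, 2) is cut
```

    Flagged points are excluded from the solve and their values are interpolated from the overlapping donor grid instead.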

  6. FAT: Fully Automated TiRiFiC

    NASA Astrophysics Data System (ADS)

    Kamphuis, P.; Józsa, G. I. G.; Oh, S.-H.; Spekkens, K.; Urbancic, N.; Serra, P.; Koribalski, B. S.; Dettmar, R.-J.

    2015-07-01

    FAT (Fully Automated TiRiFiC) is an automated procedure that fits tilted-ring models to HI data cubes of individual, well-resolved galaxies. The method builds on the 3D Tilted Ring Fitting Code (TiRiFiC, ascl:1208.008). FAT accurately models the kinematics and the morphologies of galaxies with an extent of eight beams across the major axis in the inclination range 20°-90° without the need for priors such as disc inclination. FAT's performance allows us to model the gas kinematics of many thousands of well-resolved galaxies, which is essential for future HI surveys with the Square Kilometre Array and its pathfinders.

  7. A revolutionary graphitisation system: Fully automated, compact and simple

    NASA Astrophysics Data System (ADS)

    Wacker, L.; Němec, M.; Bourquin, J.

    2010-04-01

    A new graphitisation system, directly coupled to an elemental analyser, has been developed for convenient, fast and efficient sample preparation for radiocarbon measurement by means of accelerator mass spectrometry. We demonstrate an alternative to the cryogenic transport of CO2 into the graphitisation reactors with liquid nitrogen, which is used by others. Instead, the CO2 coming from the elemental analyser is absorbed on a single column filled with zeolite. The CO2 can then be easily released by heating the zeolite trap and transferred to the reactor by gas expansion. The system is simple and fully automated for sample combustion and graphitisation.

  8. Fully automated production of iodine-124 using a vertical beam.

    PubMed

    Nagatsu, Kotaro; Fukada, Masami; Minegishi, Katsuyuki; Suzuki, Hisashi; Fukumura, Toshimitsu; Yamazaki, Hiromichi; Suzuki, Kazutoshi

    2011-01-01

    A fully automated system for the production of iodine-124, based on techniques of vertical-beam irradiation and dry distillation, was developed. The system, coupled with a capsulated target, was able to irradiate the ¹²⁴TeO₂ target at up to 29 μA for 1-4 h, which yielded iodine-124 with an almost constant yield of 6.9 MBq/μAh at the end of bombardment. All procedures were performed automatically and repeatedly. The newly developed system would be suitable for routine, large-scale production of iodine-124.

  9. Description and calibration of a fully automated infrared scatterometer

    NASA Astrophysics Data System (ADS)

    Mainguy, Stephane; Olivier, Michel; Josse, Michel A.; Guidon, Michel

    1991-12-01

    A fully automated scatterometer, designed for BRDF measurements in the IR at about 10 μm, is described. Basically, it works around a reflecting parabola (464 mm diameter, F/0.25) and permits measurements in and out of the plane of incidence. Optical properties of the parabolic mirror are characterized by a ray-tracing technique that determines the correct illumination of the sample and the detection conditions for scattered light. Advantages and drawbacks of such an instrument are discussed, as well as calibration procedures. In conclusion, we present experimental results to illustrate the instrument capabilities.

  10. Fully Automated Lipid Pool Detection Using Near Infrared Spectroscopy

    PubMed Central

    Wojakowski, Wojciech

    2016-01-01

    Background. Detecting and identifying vulnerable plaque, which is prone to rupture, is still a challenge for cardiologists. Such lipid core-containing plaque is not identifiable by everyday angiography, which motivated the development of NIRS-IVUS, a tool that can visualize plaque in terms of its chemical and morphologic characteristics. The new tool can lead to the development of new methods of interpreting the newly obtained data. In this study, an algorithm for fully automated lipid pool detection on NIRS images is proposed. Method. The designed algorithm is divided into four stages: preprocessing (image enhancement), segmentation of artifacts, detection of lipid areas, and calculation of the Lipid Core Burden Index (LCBI). Results. A total of 31 NIRS chemograms were analyzed by two methods. The metrics total LCBI, maximal LCBI in 4 mm blocks, and maximal LCBI in 2 mm blocks were calculated to compare the presented algorithm with a commercially available system. Both intraclass correlation (ICC) and Bland-Altman plots showed good agreement and correlation between the two methods. Conclusions. The proposed algorithm performs fully automated lipid pool detection on near-infrared spectroscopy images. It is a tool developed for offline data analysis, which could be easily extended with newer functions and projects. PMID:27610191
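
    The Lipid Core Burden Index metrics compared in the study can be sketched as pixel counting on a binary chemogram. The formulation below (lipid-positive pixels per 1000 valid pixels, with a sliding window for the block-maximal variants) is a common definition and an assumption here, not necessarily the vendor's exact algorithm:

```python
import numpy as np

def lcbi(lipid, valid):
    """Lipid Core Burden Index: lipid-positive pixels per 1000 valid pixels."""
    n_valid = valid.sum()
    return 1000.0 * (lipid & valid).sum() / n_valid if n_valid else 0.0

def max_block_lcbi(lipid, valid, mm_per_col, block_mm=4.0):
    """Maximal LCBI over a sliding window of block_mm along the pullback axis."""
    w = max(1, round(block_mm / mm_per_col))
    return max(lcbi(lipid[:, i:i + w], valid[:, i:i + w])
               for i in range(lipid.shape[1] - w + 1))

# toy chemogram: rows = circumference, cols = pullback; lipid in one 4 mm block
lipid = np.zeros((8, 40), dtype=bool)
lipid[:, 10:14] = True
valid = np.ones_like(lipid)
print(lcbi(lipid, valid))                             # → 100.0
print(max_block_lcbi(lipid, valid, mm_per_col=1.0))   # → 1000.0
```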

  11. FASTER: an unsupervised fully automated sleep staging method for mice

    PubMed Central

    Sunagawa, Genshiro A; Séi, Hiroyoshi; Shimba, Shigeki; Urade, Yoshihiro; Ueda, Hiroki R

    2013-01-01

    Identifying the stages of sleep, or sleep staging, is an unavoidable step in sleep research and typically requires visual inspection of electroencephalography (EEG) and electromyography (EMG) data. Currently, human scoring is slow, biased and prone to error, and is thus the most important bottleneck for large-scale sleep research in animals. We have developed an unsupervised, fully automated sleep staging method for mice that allows less subjective and high-throughput evaluation of sleep. Fully Automated Sleep sTaging method via EEG/EMG Recordings (FASTER) is based on nonparametric density estimation clustering of comprehensive EEG/EMG power spectra. FASTER can accurately identify sleep patterns in mice that have been perturbed by drugs or by genetic modification of a clock gene. The overall accuracy is over 90% in every group. Twenty-four-hour data are staged by a laptop computer in 10 min, which is faster than an experienced human rater. By dramatically improving the sleep staging process in both quality and throughput, FASTER will open the door to quantitative and comprehensive animal sleep research. PMID:23621645
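
    The unsupervised idea can be illustrated by clustering per-epoch EEG/EMG features. FASTER itself uses nonparametric density-estimation clustering; the plain k-means below, and the choice of EMG power and EEG delta fraction as toy features, are simplified stand-ins:

```python
import numpy as np

def kmeans(X, k, iters=50):
    """Plain k-means (FASTER uses nonparametric density clustering instead)."""
    centers = X[np.linspace(0, len(X) - 1, k).astype(int)]  # spread-out init
    for _ in range(iters):
        labels = ((X[:, None, :] - centers[None]) ** 2).sum(-1).argmin(1)
        centers = np.array([X[labels == j].mean(0) for j in range(k)])
    return labels

# toy epochs: (EMG power, EEG delta fraction) for wake / NREM / REM
rng = np.random.default_rng(1)
wake = rng.normal([0.9, 0.2], 0.05, (50, 2))
nrem = rng.normal([0.2, 0.8], 0.05, (50, 2))
rem = rng.normal([0.1, 0.3], 0.05, (50, 2))
X = np.vstack([wake, nrem, rem])
labels = kmeans(X, k=3)
print(sorted(set(labels.tolist())))  # → [0, 1, 2]
```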

  12. Fully automated quantitative cephalometry using convolutional neural networks.

    PubMed

    Arık, Sercan Ö; Ibragimov, Bulat; Xing, Lei

    2017-01-01

    Quantitative cephalometry plays an essential role in clinical diagnosis, treatment, and surgery. Development of fully automated techniques for these procedures is important to enable consistently accurate computerized analyses. We study the application of deep convolutional neural networks (CNNs) for fully automated quantitative cephalometry for the first time. The proposed framework utilizes CNNs for detection of landmarks that describe the anatomy of the depicted patient and yield quantitative estimation of pathologies in the jaws and skull base regions. We use a publicly available cephalometric x-ray image dataset to train CNNs for recognition of landmark appearance patterns. CNNs are trained to output probabilistic estimations of different landmark locations, which are combined using a shape-based model. We evaluate the overall framework on the test set and compare with other proposed techniques. We use the estimated landmark locations to assess anatomically relevant measurements and classify them into different anatomical types. Overall, our results demonstrate high anatomical landmark detection accuracy ([Formula: see text] to 2% higher success detection rate for a 2-mm range compared with the top benchmarks in the literature) and high anatomical type classification accuracy ([Formula: see text] average classification accuracy for the test set). We demonstrate that CNNs, which take only raw image patches as input, are promising for accurate quantitative cephalometry.
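
    The 2-mm success detection rate used to benchmark landmark detection is straightforward to compute; a minimal sketch with made-up landmark coordinates:

```python
import math

def success_detection_rate(pred, truth, radius_mm=2.0):
    """Percent of landmarks detected within radius_mm of ground truth."""
    hits = sum(math.dist(p, t) <= radius_mm for p, t in zip(pred, truth))
    return 100.0 * hits / len(truth)

# hypothetical (x, y) landmark positions in mm
truth = [(10.0, 20.0), (35.5, 12.0), (50.0, 50.0), (8.0, 44.0)]
pred = [(10.5, 20.8), (35.0, 12.4), (53.5, 50.0), (8.1, 43.7)]
print(success_detection_rate(pred, truth))  # → 75.0 (3 of 4 within 2 mm)
```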

  13. Fully Automated Deep Learning System for Bone Age Assessment.

    PubMed

    Lee, Hyunkwang; Tajmir, Shahein; Lee, Jenny; Zissen, Maurice; Yeshiwas, Bethel Ayele; Alkasab, Tarik K; Choy, Garry; Do, Synho

    2017-03-08

    Skeletal maturity progresses through discrete phases, a fact that is used routinely in pediatrics where bone age assessments (BAAs) are compared to chronological age in the evaluation of endocrine and metabolic disorders. While central to many disease evaluations, little has changed to improve the tedious process since its introduction in 1950. In this study, we propose a fully automated deep learning pipeline to segment a region of interest, standardize and preprocess input radiographs, and perform BAA. Our models use an ImageNet-pretrained, fine-tuned convolutional neural network (CNN) to achieve 57.32 and 61.40% accuracies for the female and male cohorts on our held-out test images. Female test radiographs were assigned a BAA within 1 year 90.39% and within 2 years 98.11% of the time. Male test radiographs were assigned a BAA within 1 year 94.18% and within 2 years 99.00% of the time. Using the input occlusion method, attention maps were created which reveal what features the trained model uses to perform BAA. These correspond to what human experts look at when manually performing BAA. Finally, the fully automated BAA system was deployed in the clinical environment as a decision-support system for more accurate and efficient BAAs at much faster interpretation time (<2 s) than the conventional method.

  14. A Fully Automated High-Throughput Training System for Rodents

    PubMed Central

    Poddar, Rajesh; Kawai, Risa; Ölveczky, Bence P.

    2013-01-01

    Addressing the neural mechanisms underlying complex learned behaviors requires training animals in well-controlled tasks, an often time-consuming and labor-intensive process that can severely limit the feasibility of such studies. To overcome this constraint, we developed a fully computer-controlled general purpose system for high-throughput training of rodents. By standardizing and automating the implementation of predefined training protocols within the animal’s home-cage, our system dramatically reduces the effort involved in animal training while also removing human errors and biases from the process. We deployed this system to train rats in a variety of sensorimotor tasks, achieving learning rates comparable to existing, but more laborious, methods. By incrementally and systematically increasing the difficulty of the task over weeks of training, rats were able to master motor tasks that, in complexity and structure, resemble ones used in primate studies of motor sequence learning. By enabling fully automated training of rodents in a home-cage setting, this low-cost and modular system increases the utility of rodents for studying the neural underpinnings of a variety of complex behaviors. PMID:24349451

  15. Fully automated localization of multiple pelvic bone structures on MRI.

    PubMed

    Onal, Sinan; Lai-Yuen, Susana; Bao, Paul; Weitzenfeld, Alfredo; Hart, Stuart

    2014-01-01

    In this paper, we present a fully automated localization method for multiple pelvic bone structures on magnetic resonance images (MRI). Pelvic bone structures are currently identified manually on MRI to identify reference points for measurement and evaluation of pelvic organ prolapse (POP). Given that this is a time-consuming and subjective procedure, there is a need to localize pelvic bone structures without any user interaction. However, bone structures are not easily differentiable from soft tissue on MRI as their pixel intensities tend to be very similar. In this research, we present a model that automatically identifies the bounding boxes of the bone structures on MRI using support vector machine (SVM)-based classification and a non-linear regression model that capture global and local information. Based on the relative locations of pelvic bones and organs, and local information such as texture features, the model identifies the location of the pelvic bone structures by establishing the association between their locations. Results show that the proposed method is able to locate the bone structures of interest accurately. The pubic bone, sacral promontory, and coccyx were correctly detected (DSI > 0.75) in 92%, 90%, and 88% of the testing images. This research aims to enable accurate, consistent and fully automated identification of pelvic bone structures on MRI to facilitate and improve the diagnosis of female pelvic organ prolapse.
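
    The DSI > 0.75 detection criterion can be evaluated directly from bounding-box overlap; a minimal sketch with hypothetical box coordinates:

```python
def dice_boxes(a, b):
    """Dice similarity index between two axis-aligned boxes (x1, y1, x2, y2)."""
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = ix * iy
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return 2.0 * inter / (area(a) + area(b))

gt = (10, 10, 30, 30)     # hypothetical ground-truth bounding box
pred = (12, 11, 31, 29)   # hypothetical detected bounding box
detected = dice_boxes(pred, gt) > 0.75  # detection criterion from the study
print(detected)  # → True
```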

  16. Fully automated 2D-3D registration and verification.

    PubMed

    Varnavas, Andreas; Carrell, Tom; Penney, Graeme

    2015-12-01

    Clinical application of 2D-3D registration technology often requires a significant amount of human interaction during initialisation and result verification. This is one of the main barriers to more widespread clinical use of this technology. We propose novel techniques for automated initial pose estimation of the 3D data and verification of the registration result, and show how these techniques can be combined to enable fully automated 2D-3D registration, particularly in the case of a vertebra-based system. The initialisation method is based on preoperative computation of 2D templates over a wide range of 3D poses. These templates are used to apply the Generalised Hough Transform to the intraoperative 2D image and the sought 3D pose is selected with the combined use of the generated accumulator arrays and a Gradient Difference Similarity Measure. On the verification side, two algorithms are proposed: one using normalised features based on the similarity value and the other based on the pose agreement between multiple vertebra-based registrations. The proposed methods are employed here for CT to fluoroscopy registration and are trained and tested with data from 31 clinical procedures with 417 low-dose (i.e., low-quality, high-noise) interventional fluoroscopy images. When similarity-value-based verification is used, the fully automated system achieves a 95.73% correct registration rate, whereas a no-registration result is produced for the remaining 4.27% of cases (i.e. the incorrect registration rate is 0%). The system also automatically detects input images outside its operating range.

  17. A fully automated robotic system for microinjection of zebrafish embryos.

    PubMed

    Wang, Wenhui; Liu, Xinyu; Gelinas, Danielle; Ciruna, Brian; Sun, Yu

    2007-09-12

    As an important embodiment of biomanipulation, injection of foreign materials (e.g., DNA, RNAi, sperm, protein, and drug compounds) into individual cells has significant implications in genetics, transgenics, assisted reproduction, and drug discovery. This paper presents a microrobotic system for fully automated zebrafish embryo injection, which overcomes the problems inherent in manual operation, such as human fatigue and large variations in success rates due to poor reproducibility. Based on computer vision and motion control, the microrobotic system performs injection at a speed of 15 zebrafish embryos (chorion unremoved) per minute, with a survival rate of 98% (n = 350 embryos), a success rate of 99% (n = 350 embryos), and a phenotypic rate of 98.5% (n = 210 embryos). The sample immobilization technique and microrobotic control method are applicable to other biological injection applications such as the injection of mouse oocytes/embryos and Drosophila embryos to enable high-throughput biological and pharmaceutical research.

  18. Fully automated adipose tissue measurement on abdominal CT

    NASA Astrophysics Data System (ADS)

    Yao, Jianhua; Sussman, Daniel L.; Summers, Ronald M.

    2011-03-01

    Obesity has become widespread in America and has been identified as a risk factor for many illnesses. Adipose tissue (AT) content, especially visceral AT (VAT), is an important indicator for risks of many disorders, including heart disease and diabetes. Measuring AT with traditional means is often unreliable and inaccurate. CT provides a means to measure AT accurately and consistently. We present a fully automated method to segment and measure abdominal AT in CT. Our method integrates image preprocessing which attempts to correct for image artifacts and inhomogeneities. We use fuzzy c-means to cluster AT regions and active contour models to separate subcutaneous and visceral AT. We tested our method on 50 abdominal CT scans and evaluated the correlations between several measurements.
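
    A minimal sketch of the fuzzy c-means step on 1-D CT intensities follows. The HU values and the two-cluster setup are illustrative assumptions; the paper clusters AT regions in 2-D images:

```python
import numpy as np

def fuzzy_cmeans(x, c=2, m=2.0, iters=300, seed=0):
    """Fuzzy c-means on 1-D intensities; returns cluster centers and memberships."""
    rng = np.random.default_rng(seed)
    u = rng.random((c, len(x)))
    u /= u.sum(axis=0)                       # memberships sum to 1 per voxel
    for _ in range(iters):
        um = u ** m
        centers = (um @ x) / um.sum(axis=1)  # fuzzy-weighted means
        d = np.abs(x[None, :] - centers[:, None]) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))        # standard FCM membership update
        u = inv / inv.sum(axis=0)
    return centers, u

# toy CT intensities: adipose tissue near -100 HU, soft tissue near 40 HU
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-100, 10, 200), rng.normal(40, 10, 200)])
centers, u = fuzzy_cmeans(x)
print(np.round(np.sort(centers)))
```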

  19. A fully automated TerraSAR-X based flood service

    NASA Astrophysics Data System (ADS)

    Martinis, Sandro; Kersten, Jens; Twele, André

    2015-06-01

    In this paper, a fully automated processing chain for near real-time flood detection using high resolution TerraSAR-X Synthetic Aperture Radar (SAR) data is presented. The processing chain including SAR data pre-processing, computation and adaption of global auxiliary data, unsupervised initialization of the classification as well as post-classification refinement by using a fuzzy logic-based approach is automatically triggered after satellite data delivery. The dissemination of flood maps resulting from this service is performed through an online service which can be activated on-demand for emergency response purposes (i.e., when a flood situation evolves). The classification methodology is based on previous work of the authors but was substantially refined and extended for robustness and transferability to guarantee high classification accuracy under different environmental conditions and sensor configurations. With respect to accuracy and computational effort, experiments performed on a data set of 175 different TerraSAR-X scenes acquired during flooding all over the world with different sensor configurations confirm the robustness and effectiveness of the proposed flood mapping service. These promising results have been further confirmed by means of an in-depth validation performed for three study sites in Germany, Thailand, and Albania/Montenegro.

  20. Fully Automated Portable Comprehensive 2-Dimensional Gas Chromatography Device.

    PubMed

    Lee, Jiwon; Zhou, Menglian; Zhu, Hongbo; Nidetz, Robert; Kurabayashi, Katsuo; Fan, Xudong

    2016-10-06

    We developed a fully automated portable 2-dimensional (2-D) gas chromatography (GC × GC) device, which measured 60 cm × 50 cm × 10 cm and weighed less than 5 kg. The device incorporated a micropreconcentrator/injector, commercial columns, micro-Deans switches, microthermal injectors, microphotoionization detectors, data acquisition cards, and power supplies, as well as computer control and a user interface. It employed multiple channels (4 channels) in the second dimension (²D) to increase the ²D separation time (up to 32 s) and hence the ²D peak capacity. In addition, a nondestructive flow-through vapor detector was installed at the end of the ¹D column to monitor the eluent from ¹D and assist in reconstructing ¹D elution peaks. With the information obtained jointly from the ¹D and ²D detectors, ¹D elution peaks could be reconstructed with significantly improved ¹D resolution. In this article, we first discuss the details of the system operating principle and the algorithm to reconstruct ¹D elution peaks, followed by the description and characterization of each component. Finally, 2-D separation of 50 analytes, including alkanes (C6-C12), alkenes, alcohols, aldehydes, ketones, cycloalkanes, and aromatic hydrocarbons, in 14 min is demonstrated, showing a peak capacity of 430-530 and a peak capacity production of 40-80/min.

  1. Fully Automated Enhanced Tumor Compartmentalization: Man vs. Machine Reloaded

    PubMed Central

    Meier, Raphael; Verma, Rajeev; Jilch, Astrid; Fichtner, Jens; Knecht, Urspeter; Radina, Christian; Schucht, Philippe; Beck, Jürgen; Raabe, Andreas; Slotboom, Johannes; Reyes, Mauricio; Wiest, Roland

    2016-01-01

    Objective Comparison of a fully-automated segmentation method that uses compartmental volume information to a semi-automatic user-guided and FDA-approved segmentation technique. Methods Nineteen patients with a recently diagnosed and histologically confirmed glioblastoma (GBM) were included and MR images were acquired with a 1.5 T MR scanner. Manual segmentation for volumetric analyses was performed using the open source software 3D Slicer version 4.2.2.3 (www.slicer.org). Semi-automatic segmentation was done by four independent neurosurgeons and neuroradiologists using the computer-assisted segmentation tool SmartBrush® (referred to as SB), a semi-automatic user-guided and FDA-approved tumor-outlining program that uses contour expansion. Fully automatic segmentations were performed with the Brain Tumor Image Analysis (BraTumIA, referred to as BT) software. We compared manual (ground truth, referred to as GT), computer-assisted (SB) and fully-automated (BT) segmentations with regard to: (1) products of two maximum diameters for 2D measurements, (2) the Dice coefficient, (3) the positive predictive value, (4) the sensitivity and (5) the volume error. Results Segmentations by the four expert raters resulted in a mean Dice coefficient between 0.72 and 0.77 using SB. BT achieved a mean Dice coefficient of 0.68. Significant differences were found for intermodal (BT vs. SB) and for intramodal (four SB expert raters) performances. The BT and SB segmentations of the contrast-enhancing volumes achieved a high correlation with the GT. Pearson correlation was 0.8 for BT; however, there were a few discrepancies between raters (BT and SB 1 only). Additional non-enhancing tumor tissue extending the SB volumes was found with BT in 16/19 cases. The clinically motivated sum of products of diameters measure (SPD) revealed neither significant intermodal nor intramodal variations. The analysis time for the four expert raters was faster (1 minute and 47 seconds to 3 minutes and 39
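
    The four voxel-wise comparison metrics used in the study (Dice coefficient, positive predictive value, sensitivity, volume error) can be sketched on binary masks as follows; the toy masks are illustrative:

```python
import numpy as np

def seg_metrics(pred, gt):
    """Dice, positive predictive value, sensitivity and relative volume error."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    tp = np.logical_and(pred, gt).sum()
    dice = 2.0 * tp / (pred.sum() + gt.sum())
    ppv = tp / pred.sum()
    sensitivity = tp / gt.sum()
    volume_error = (pred.sum() - gt.sum()) / gt.sum()
    return dice, ppv, sensitivity, volume_error

gt = np.zeros((20, 20), dtype=bool)
gt[5:15, 5:15] = True                 # 100-voxel ground-truth tumor
pred = np.zeros_like(gt)
pred[6:15, 5:15] = True               # 90 voxels, all inside the ground truth
dice, ppv, sens, verr = seg_metrics(pred, gt)
print(round(float(dice), 3), float(ppv), float(sens), float(verr))
```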

  2. Participation through Automation: Fully Automated Critical PeakPricing in Commercial Buildings

    SciTech Connect

    Piette, Mary Ann; Watson, David S.; Motegi, Naoya; Kiliccote,Sila; Linkugel, Eric

    2006-06-20

    California electric utilities have been exploring the use of dynamic critical peak prices (CPP) and other demand response programs to help reduce peaks in customer electric loads. CPP is a tariff design to promote demand response. Levels of automation in DR can be defined as follows: Manual Demand Response involves a potentially labor-intensive approach such as manually turning off or changing comfort set points at each equipment switch or controller. Semi-Automated Demand Response involves a pre-programmed demand response strategy initiated by a person via centralized control system. Fully Automated Demand Response does not involve human intervention, but is initiated at a home, building, or facility through receipt of an external communications signal. The receipt of the external signal initiates pre-programmed demand response strategies. They refer to this as Auto-DR. This paper describes the development, testing, and results from automated CPP (Auto-CPP) as part of a utility project in California. The paper presents the project description and test methodology. This is followed by a discussion of Auto-DR strategies used in the field test buildings. They present a sample Auto-CPP load shape case study, and a selection of the Auto-CPP response data from September 29, 2005. If all twelve sites reached their maximum saving simultaneously, a total of approximately 2 MW of DR is available from these twelve sites that represent about two million ft². The average DR was about half that value, at about 1 MW. These savings translate to about 0.5 to 1.0 W/ft² of demand reduction. They are continuing field demonstrations and economic evaluations to pursue increasing penetrations of automated DR that has demonstrated ability to provide a valuable DR resource for California.
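
    The quoted 0.5 to 1.0 W/ft² figure follows directly from the aggregate numbers in the abstract (1-2 MW of demand response over roughly two million square feet):

```python
def demand_intensity_w_per_ft2(dr_mw, floor_area_ft2):
    """Convert an aggregate demand reduction in MW to W per square foot."""
    return dr_mw * 1e6 / floor_area_ft2

area = 2_000_000  # the twelve sites: about two million square feet
print(demand_intensity_w_per_ft2(2.0, area))  # peak case → 1.0 W/ft2
print(demand_intensity_w_per_ft2(1.0, area))  # average case → 0.5 W/ft2
```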

  3. Fully integrated microfluidic platform enabling automated phosphoprofiling of macrophage response.

    PubMed

    Srivastava, Nimisha; Brennan, James S; Renzi, Ronald F; Wu, Meiye; Branda, Steven S; Singh, Anup K; Herr, Amy E

    2009-05-01

    The ability to monitor cell signaling events is crucial to the understanding of immune defense against invading pathogens. Conventional analytical techniques such as flow cytometry, microscopy, and Western blot are powerful tools for signaling studies. Nevertheless, each approach is currently stand-alone and limited by multiple time-consuming and labor-intensive steps. In addition, these techniques do not provide correlated signaling information on total intracellular protein abundance and subcellular protein localization. We report on a novel phosphoFlow Chip (pFC) that relies on monolithic microfluidic technology to rapidly conduct signaling studies. The pFC platform integrates cell stimulation and preparation, microscopy, and subsequent flow cytometry. pFC allows host-pathogen phosphoprofiling in 30 min with an order of magnitude reduction in the consumption of reagents. For pFC validation, we monitor the mitogen-activated protein kinases ERK1/2 and p38 in response to Escherichia coli lipopolysaccharide (LPS) stimulation of murine macrophage cells (RAW 264.7). pFC permits ERK1/2 phosphorylation monitoring starting at 5 s after LPS stimulation, with phosphorylation observed at 5 min. In addition, ERK1/2 phosphorylation is correlated with subsequent recruitment into the nucleus, as observed from fluorescence microscopy performed on cells upstream of flow cytometric analysis. The fully integrated cell handling has the added advantage of reduced cell aggregation and cell loss, with no detectable cell activation. The pFC approach is a step toward unified, automated infrastructure for high-throughput systems biology.

  4. Toward Fully Automated Multicriterial Plan Generation: A Prospective Clinical Study

    SciTech Connect

    Voet, Peter W.J.; Dirkx, Maarten L.P.; Breedveld, Sebastiaan; Fransen, Dennie; Levendag, Peter C.; Heijmen, Ben J.M.

    2013-03-01

    Purpose: To prospectively compare plans generated with iCycle, an in-house-developed algorithm for fully automated multicriterial intensity modulated radiation therapy (IMRT) beam profile and beam orientation optimization, with plans manually generated by dosimetrists using the clinical treatment planning system. Methods and Materials: For 20 randomly selected head-and-neck cancer patients with various tumor locations (of whom 13 received sequential boost treatments), we offered the treating physician the choice between an automatically generated iCycle plan and a manually optimized plan using standard clinical procedures. Although iCycle used a fixed “wish list” with hard constraints and prioritized objectives, the dosimetrists manually selected the beam configuration and fine tuned the constraints and objectives for each IMRT plan. Dosimetrists were not informed in advance whether a competing iCycle plan was made. The 2 plans were simultaneously presented to the physician, who then selected the plan to be used for treatment. For the patient group, differences in planning target volume coverage and sparing of critical tissues were quantified. Results: In 32 of 33 plan comparisons, the physician selected the iCycle plan for treatment. This highly consistent preference for the automatically generated plans was mainly caused by the improved sparing for the large majority of critical structures. With iCycle, the normal tissue complication probabilities for the parotid and submandibular glands were reduced by 2.4% ± 4.9% (maximum, 18.5%, P=.001) and 6.5% ± 8.3% (maximum, 27%, P=.005), respectively. The reduction in the mean oral cavity dose was 2.8 ± 2.8 Gy (maximum, 8.1 Gy, P=.005). For the swallowing muscles, the esophagus and larynx, the mean dose reduction was 3.3 ± 1.1 Gy (maximum, 9.2 Gy, P<.001). For 15 of the 20 patients, target coverage was also improved. Conclusions: In 97% of cases, automatically generated plans were selected for treatment because of

  5. ArrayPitope: Automated Analysis of Amino Acid Substitutions for Peptide Microarray-Based Antibody Epitope Mapping

    PubMed Central

    Hansen, Christian Skjødt; Østerbye, Thomas; Marcatili, Paolo; Lund, Ole; Buus, Søren

    2017-01-01

    Identification of epitopes targeted by antibodies (B cell epitopes) is of critical importance for the development of many diagnostic and therapeutic tools. For clinical usage, such epitopes must be extensively characterized in order to validate specificity and to document potential cross-reactivity. B cell epitopes are typically classified as either linear epitopes, i.e. short consecutive segments from the protein sequence, or conformational epitopes adapted through native protein folding. Recent advances in high-density peptide microarrays enable high-throughput, high-resolution identification and characterization of linear B cell epitopes. Using exhaustive amino acid substitution analysis of peptides originating from target antigens, these microarrays can be used to address the specificity of polyclonal antibodies raised against such antigens containing hundreds of epitopes. However, the interpretation of the data provided in such large-scale screenings is far from trivial and in most cases it requires advanced computational and statistical skills. Here, we present an online application for automated identification of linear B cell epitopes, allowing the non-expert user to analyse peptide microarray data. The application takes as input quantitative peptide data of fully or partially substituted overlapping peptides from a given antigen sequence, identifies epitope residues (residues that are significantly affected by substitutions) and visualizes the selectivity towards each residue by sequence logo plots. Demonstrating utility, the application was used to identify and address the antibody specificity of 18 linear epitope regions in Human Serum Albumin (HSA), using peptide microarray data consisting of fully substituted peptides spanning the entire sequence of HSA and incubated with polyclonal rabbit anti-HSA (and mouse anti-rabbit-Cy3). The application is made available at: www.cbs.dtu.dk/services/ArrayPitope. PMID:28095436
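
    The core idea, flagging residues whose substitution significantly reduces peptide signal, can be sketched with a simple threshold rule. This is a stand-in for ArrayPitope's actual statistical test, and all signal values below are invented:

```python
import statistics

def epitope_positions(wt_signal, sub_signals, drop=0.5):
    """Flag residue positions where substitutions cut the median peptide
    signal below drop x the wild-type signal (a simplified stand-in for
    the statistical test the ArrayPitope service applies)."""
    return sorted(pos for pos, signals in sub_signals.items()
                  if statistics.median(signals) < drop * wt_signal)

# toy data: signals of substituted peptides per position (abbreviated from
# the 19 variants a full substitution scan would give each position)
sub_signals = {
    1: [980, 1010, 995],
    2: [940, 1005, 970],
    3: [120, 200, 90],    # substitutions abolish binding
    4: [300, 150, 220],   # substitutions abolish binding
    5: [990, 960, 1015],
}
print(epitope_positions(wt_signal=1000, sub_signals=sub_signals))  # → [3, 4]
```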

  6. Microarrays

    ERIC Educational Resources Information Center

    Plomin, Robert; Schalkwyk, Leonard C.

    2007-01-01

    Microarrays are revolutionizing genetics by making it possible to genotype hundreds of thousands of DNA markers and to assess the expression (RNA transcripts) of all of the genes in the genome. Microarrays are slides the size of a postage stamp that contain millions of DNA sequences to which single-stranded DNA or RNA can hybridize. This…

  7. Multiplex RT-PCR and Automated Microarray for Detection of Eight Bovine Viruses.

    PubMed

    Lung, O; Furukawa-Stoffer, T; Burton Hughes, K; Pasick, J; King, D P; Hodko, D

    2016-11-23

    Microarrays can be a useful tool for pathogen detection as they allow simultaneous interrogation of the presence of a large number of genetic sequences in a sample. However, conventional microarrays require extensive manual handling and multiple pieces of equipment for printing probes, hybridization, washing and signal detection. In this study, a reverse transcription (RT)-PCR with an accompanying novel automated microarray for simultaneous detection of eight viruses that affect cattle [vesicular stomatitis virus (VSV), bovine viral diarrhoea virus type 1 and type 2, bovine herpesvirus 1, bluetongue virus, malignant catarrhal fever virus, rinderpest virus (RPV) and parapox viruses] is described. The assay accurately identified a panel of 37 strains of the target viruses and identified a mixed infection. No non-specific reactions were observed with a panel of 23 non-target viruses associated with livestock. Vesicular stomatitis virus was detected as early as 2 days post-inoculation in oral swabs from experimentally infected animals. The limit of detection of the microarray assay was as low as 1 TCID50/ml for RPV. The novel microarray platform automates all post-PCR steps of the assay and integrates electrophoretic-driven capture probe printing in a single user-friendly instrument that allows array layout and assay configuration to be customized on-site by the user.

  8. Fully Automated Supply Chain Management at MCRD-PI

    DTIC Science & Technology

    2004-04-15

Scan forms and smart card reader interface … interfaces with smart card and bar-code technologies; validation of the size selection accuracy of the 3D Whole Body Scanner; automation of the receiving … appropriately reported. Implement the recruit Smart Card interface in order to quickly capture data for the ARN Control Panel. Populate the

  9. ATLAS from Data Research Associates: A Fully Integrated Automation System.

    ERIC Educational Resources Information Center

    Mellinger, Michael J.

    1987-01-01

    This detailed description of a fully integrated, turnkey library system includes a complete profile of the system (functions, operational characteristics, hardware, operating system, minimum memory and pricing); history of the technologies involved; and descriptions of customer services and availability. (CLB)

  10. A Program Certification Assistant Based on Fully Automated Theorem Provers

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd

    2005-01-01

    We describe a certification assistant to support formal safety proofs for programs. It is based on a graphical user interface that hides the low-level details of first-order automated theorem provers while supporting limited interactivity: it allows users to customize and control the proof process on a high level, manages the auxiliary artifacts produced during this process, and provides traceability between the proof obligations and the relevant parts of the program. The certification assistant is part of a larger program synthesis system and is intended to support the deployment of automatically generated code in safety-critical applications.

  11. Toward fully automated genotyping: genotyping microsatellite markers by deconvolution.

    PubMed Central

    Perlin, M W; Lancia, G; Ng, S K

    1995-01-01

    Dense genetic linkage maps have been constructed for the human and mouse genomes, with average densities of 2.9 cM and 0.35 cM, respectively. These genetic maps are crucial for mapping both Mendelian and complex traits and are useful in clinical genetic diagnosis. Current maps are largely comprised of abundant, easily assayed, and highly polymorphic PCR-based microsatellite markers, primarily dinucleotide (CA)n repeats. One key limitation of these length polymorphisms is the PCR stutter (or slippage) artifact that introduces additional stutter bands. With two (or more) closely spaced alleles, the stutter bands overlap, and it is difficult to accurately determine the correct alleles; this stutter phenomenon has all but precluded full automation, since a human must visually inspect the allele data. We describe here novel deconvolution methods for accurate genotyping that mathematically remove PCR stutter artifact from microsatellite markers. These methods overcome the manual interpretation bottleneck and thereby enable full automation of genetic map construction and use. New functionalities, including the pooling of DNAs and the pooling of markers, are described that may greatly reduce the associated experimentation requirements. PMID:7485172
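The deconvolution idea can be sketched in a few lines. The sketch below is illustrative only (the abstract does not give the authors' exact algorithm): it models the observed lane signal as a convolution of the true allele amounts with a stutter pattern and recovers the alleles by back-substitution, exploiting the fact that PCR stutter only produces shorter products.

```python
# Hypothetical sketch of stutter deconvolution for one microsatellite marker.
# observed[i] is the band intensity at repeat length i; stutter[k] is the
# relative intensity of the band k repeats SHORTER than a true allele
# (stutter[0] = 1.0 for the main band itself).

def deconvolve_stutter(observed, stutter, n_alleles):
    """Recover allele amounts by back-substitution from the longest length down."""
    alleles = [0.0] * n_alleles
    residual = list(observed)
    # The longest remaining band cannot be stutter of anything longer,
    # so it must come from a true allele.
    for i in range(n_alleles - 1, -1, -1):
        amount = max(residual[i], 0.0)
        alleles[i] = amount
        # Remove this allele's main band and its stutter bands from the signal.
        for k, frac in enumerate(stutter):
            if i - k >= 0:
                residual[i - k] -= amount * frac
    return alleles
```

With two closely spaced alleles whose stutter bands overlap, the subtraction separates them cleanly, which is exactly the manual-inspection bottleneck the abstract describes.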

  12. Gene Expression Measurement Module (GEMM) - A Fully Automated, Miniaturized Instrument for Measuring Gene Expression in Space

    NASA Technical Reports Server (NTRS)

    Pohorille, Andrew; Peyvan, Kia; Karouia, Fathi; Ricco, Antonio

    2012-01-01

    The capability to measure gene expression on board spacecraft opens the door to a large number of high-value experiments on the influence of the space environment on biological systems. For example, measurements of gene expression will help us to understand adaptation of terrestrial life to conditions beyond the planet of origin, identify deleterious effects of the space environment on a wide range of organisms from microbes to humans, develop effective countermeasures against these effects, and determine the metabolic bases of microbial pathogenicity and drug resistance. These and other applications hold significant potential for discoveries in space biology, biotechnology, and medicine. Supported by funding from the NASA Astrobiology Science and Technology Instrument Development Program, we are developing a fully automated, miniaturized, integrated fluidic system for small spacecraft capable of in-situ measurement of expression of several hundreds of microbial genes from multiple samples. The instrument will be capable of (1) lysing cell walls of bacteria sampled from cultures grown in space, (2) extracting and purifying RNA released from cells, (3) hybridizing the RNA on a microarray and (4) providing readout of the microarray signal, all in a single microfluidics cartridge. The device is suitable for deployment on nanosatellite platforms developed by NASA Ames' Small Spacecraft Division. To meet space and other technical constraints imposed by these platforms, a number of technical innovations are being implemented. The integration and end-to-end technological and biological validation of the instrument are carried out using as a model the photosynthetic bacterium Synechococcus elongatus, known for its remarkable metabolic diversity and resilience to adverse conditions. 
Each step in the measurement process (lysis, nucleic acid extraction, purification, and hybridization to an array) is assessed through comparison of the results obtained using the instrument with

  13. Gene Expression Measurement Module (GEMM) - a fully automated, miniaturized instrument for measuring gene expression in space

    NASA Astrophysics Data System (ADS)

    Karouia, Fathi; Ricco, Antonio; Pohorille, Andrew; Peyvan, Kianoosh

    2012-07-01

The capability to measure gene expression on board spacecraft opens the door to a large number of experiments on the influence of the space environment on biological systems that will profoundly impact our ability to conduct safe and effective space travel, and might also shed light on terrestrial physiology, biological function, and human disease and aging processes. Measurements of gene expression will help us to understand adaptation of terrestrial life to conditions beyond the planet of origin, identify deleterious effects of the space environment on a wide range of organisms from microbes to humans, develop effective countermeasures against these effects, determine the metabolic basis of microbial pathogenicity and drug resistance, test our ability to sustain and grow in space organisms that can be used for life support and in situ resource utilization during long-duration space exploration, and monitor both the spacecraft environment and crew health. These and other applications hold significant potential for discoveries in space biology, biotechnology and medicine. Accordingly, supported by funding from the NASA Astrobiology Science and Technology Instrument Development Program, we are developing a fully automated, miniaturized, integrated fluidic system for small spacecraft capable of in-situ measurement of microbial expression of thousands of genes from multiple samples. The instrument will be capable of (1) lysing bacterial cell walls, (2) extracting and purifying RNA released from cells, (3) hybridizing it on a microarray and (4) providing electrochemical readout, all in a microfluidics cartridge. The prototype under development is suitable for deployment on nanosatellite platforms developed by the NASA Small Spacecraft Office. The first target application is to cultivate and measure gene expression of the photosynthetic bacterium Synechococcus elongatus, a cyanobacterium known to exhibit remarkable metabolic diversity and resilience to adverse conditions

  14. Datamining Approach for Automation of Diagnosis of Breast Cancer in Immunohistochemically Stained Tissue Microarray Images

    PubMed Central

    Prasad, Keerthana; Zimmermann, Bernhard; Prabhu, Gopalakrishna; Pai, Muktha

    2010-01-01

Cancer of the breast is the second most common human neoplasm, accounting for approximately one quarter of all cancers in females after cervical carcinoma. Estrogen receptor (ER), progesterone receptor (PR), and human epidermal growth factor receptor (HER-2/neu) expression plays an important role in the diagnosis and prognosis of breast carcinoma. The tissue microarray (TMA) technique is a high-throughput technique which provides a standardized set of uniformly stained images, facilitating effective automation of the evaluation of specimen images. The TMA technique is widely used to evaluate hormone expression for diagnosis of breast cancer. Of the steps in the tissue microarray process workflow, the analysis step takes the most time; hence, automated analysis will significantly reduce the overall time required to complete a study. Many tools are available for automated digital acquisition of images of the spots from the microarray slide. Each of these images needs to be evaluated by a pathologist, who assigns a score based on staining intensity to represent the hormone expression and classifies the case as negative or positive. Our work aims to develop a system for automated evaluation of sets of images generated through the tissue microarray technique, representing ER expression and HER-2/neu expression. Our study is based on the Tissue Microarray Database portal of Stanford University at http://tma.stanford.edu/cgi-bin/cx?n=her1, which has made a huge number of images available to researchers. We used 171 images corresponding to ER expression and 214 images corresponding to HER-2/neu expression of breast carcinoma. Of the 171 ER images, 104 were negative and 67 were positive; of the 214 HER-2/neu images, 112 were negative and 102 were positive. Our method has 92
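As a toy illustration of the final scoring step such a system automates, the sketch below classifies a core as positive or negative from the fraction of DAB-stained (brown) pixels. The colour rule and the 10% cut-off are assumptions made purely for illustration; they are not the paper's actual criteria.

```python
# Illustrative only: call a TMA core positive when enough pixels look
# DAB-brown (red channel high and well above blue). Thresholds are
# hypothetical, not taken from the paper.

def classify_core(pixels, positive_fraction=0.1):
    """pixels: list of (r, g, b) tuples; returns 'positive' or 'negative'."""
    stained = sum(1 for r, g, b in pixels if r > 120 and r > b + 30)
    return "positive" if stained / len(pixels) >= positive_fraction else "negative"
```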

  15. Datamining approach for automation of diagnosis of breast cancer in immunohistochemically stained tissue microarray images.

    PubMed

    Prasad, Keerthana; Zimmermann, Bernhard; Prabhu, Gopalakrishna; Pai, Muktha

    2010-05-28

Cancer of the breast is the second most common human neoplasm, accounting for approximately one quarter of all cancers in females after cervical carcinoma. Estrogen receptor (ER), progesterone receptor (PR), and human epidermal growth factor receptor (HER-2/neu) expression plays an important role in the diagnosis and prognosis of breast carcinoma. The tissue microarray (TMA) technique is a high-throughput technique which provides a standardized set of uniformly stained images, facilitating effective automation of the evaluation of specimen images. The TMA technique is widely used to evaluate hormone expression for diagnosis of breast cancer. Of the steps in the tissue microarray process workflow, the analysis step takes the most time; hence, automated analysis will significantly reduce the overall time required to complete a study. Many tools are available for automated digital acquisition of images of the spots from the microarray slide. Each of these images needs to be evaluated by a pathologist, who assigns a score based on staining intensity to represent the hormone expression and classifies the case as negative or positive. Our work aims to develop a system for automated evaluation of sets of images generated through the tissue microarray technique, representing ER expression and HER-2/neu expression. Our study is based on the Tissue Microarray Database portal of Stanford University at http://tma.stanford.edu/cgi-bin/cx?n=her1, which has made a huge number of images available to researchers. We used 171 images corresponding to ER expression and 214 images corresponding to HER-2/neu expression of breast carcinoma. Of the 171 ER images, 104 were negative and 67 were positive; of the 214 HER-2/neu images, 112 were negative and 102 were positive. Our method has 92

  16. Automated and Multiplexed Soft Lithography for the Production of Low-Density DNA Microarrays

    PubMed Central

    Fredonnet, Julie; Foncy, Julie; Cau, Jean-Christophe; Séverac, Childérick; François, Jean Marie; Trévisiol, Emmanuelle

    2016-01-01

Microarrays are established research tools for genotyping, expression profiling, or molecular diagnostics in which DNA molecules are precisely addressed to the surface of a solid support. This study assesses the fabrication of low-density oligonucleotide arrays using an automated microcontact printing device, the InnoStamp 40®. This instrument allows multiplexed deposition of oligoprobes on a functionalized surface by means of a MacroStamp™ bearing 64 individual pillars, each mounted with 50 circular micropatterns (spots) of 160 µm diameter at 320 µm pitch. Reliability and reuse of the MacroStamp™ were shown to be fast and robust by a simple washing step in 96% ethanol. The low-density microarrays printed on either epoxysilane or dendrimer-functionalized slides (DendriSlides) showed excellent hybridization response with complementary sequences at unusually low probe and target concentrations, since the actual probe density immobilized by this technology was at least 10-fold lower than with conventional mechanical spotting. In addition, we found a comparable hybridization response in terms of fluorescence intensity between spotted and printed oligoarrays with a 1 nM complementary target by using a 50-fold lower probe concentration to produce the oligoarrays by the microcontact printing method. Taken together, our results lend support to the potential development of this multiplexed microcontact printing technology employing soft lithography as an alternative, cost-competitive tool for fabrication of low-density DNA microarrays. PMID:27681742

  17. Fully Automated Operational Modal Analysis using multi-stage clustering

    NASA Astrophysics Data System (ADS)

    Neu, Eugen; Janser, Frank; Khatibi, Akbar A.; Orifici, Adrian C.

    2017-02-01

    The interest for robust automatic modal parameter extraction techniques has increased significantly over the last years, together with the rising demand for continuous health monitoring of critical infrastructure like bridges, buildings and wind turbine blades. In this study a novel, multi-stage clustering approach for Automated Operational Modal Analysis (AOMA) is introduced. In contrast to existing approaches, the procedure works without any user-provided thresholds, is applicable within large system order ranges, can be used with very small sensor numbers and does not place any limitations on the damping ratio or the complexity of the system under investigation. The approach works with any parametric system identification algorithm that uses the system order n as sole parameter. Here a data-driven Stochastic Subspace Identification (SSI) method is used. Measurements from a wind tunnel investigation with a composite cantilever equipped with Fiber Bragg Grating Sensors (FBGSs) and piezoelectric sensors are used to assess the performance of the algorithm with a highly damped structure and low signal to noise ratio conditions. The proposed method was able to identify all physical system modes in the investigated frequency range from over 1000 individual datasets using FBGSs under challenging signal to noise ratio conditions and under better signal conditions but from only two sensors.
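One ingredient of such procedures, grouping pole estimates from many model orders into candidate modes, can be caricatured as follows. Note that this toy version uses a fixed relative tolerance and minimum cluster size, whereas the paper's multi-stage procedure is explicitly threshold-free; the function is an assumption-laden simplification, not the published algorithm.

```python
# Toy version of one AOMA stage: poles identified at many model orders are
# clustered by frequency proximity; clusters supported by several model
# orders are treated as physical modes, isolated poles as spurious.

def cluster_poles(freqs, rel_tol=0.01, min_support=3):
    """freqs: pole frequencies (Hz) pooled over model orders.
    Returns the mean frequency of each well-supported cluster."""
    clusters = []
    for f in sorted(freqs):
        # Join the previous cluster if this pole is within rel_tol of it.
        if clusters and abs(f - clusters[-1][-1]) <= rel_tol * f:
            clusters[-1].append(f)
        else:
            clusters.append([f])
    return [sum(c) / len(c) for c in clusters if len(c) >= min_support]
```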

  18. Fully automated software for quantitative measurements of mitochondrial morphology.

    PubMed

    McClatchey, P Mason; Keller, Amy C; Bouchard, Ron; Knaub, Leslie A; Reusch, Jane E B

    2016-01-01

    Mitochondria undergo dynamic changes in morphology in order to adapt to changes in nutrient and oxygen availability, communicate with the nucleus, and modulate intracellular calcium dynamics. Many recent papers have been published assessing mitochondrial morphology endpoints. Although these studies have yielded valuable insights, contemporary assessment of mitochondrial morphology is typically subjective and qualitative, precluding direct comparison of outcomes between different studies and likely missing many subtle effects. In this paper, we describe a novel software technique for measuring the average length, average width, spatial density, and intracellular localization of mitochondria from a fluorescent microscope image. This method was applied to distinguish baseline characteristics of Human Umbilical Vein Endothelial Cells (HUVECs), primary Goto-Kakizaki rat aortic smooth muscle cells (GK SMCs), primary Wistar rat aortic smooth muscle cells (Wistar SMCs), and SH-SY5Ys (human neuroblastoma cell line). Consistent with direct observation, our algorithms found SH-SY5Ys to have the greatest mitochondrial density, while HUVECs were found to have the longest mitochondria. Mitochondrial morphology responses to temperature, nutrient, and oxidative stressors were characterized to test algorithm performance. Large morphology changes recorded by the software agreed with direct observation, and subtle but consistent morphology changes were found that would not otherwise have been detected. Endpoints were consistent between experimental repetitions (R=0.93 for length, R=0.93 for width, R=0.89 for spatial density, and R=0.74 for localization), and maintained reasonable agreement even when compared to images taken with compromised microscope resolution or in an alternate imaging plane. These results indicate that the automated software described herein allows quantitative and objective characterization of mitochondrial morphology from fluorescent microscope images.
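A minimal sketch of the kind of measurement described, assuming an already-binarized image given as a 0/1 grid. The published software is considerably more sophisticated; the bounding-box length proxy and area-derived width here are simplifications chosen for brevity.

```python
# Illustrative sketch (not the authors' code): average length, average width,
# and spatial density of mitochondria from a binary 2D mask.
from collections import deque

def morphology_endpoints(mask):
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    comps = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                # Flood-fill one connected component (4-connectivity).
                q, pix = deque([(r, c)]), []
                seen[r][c] = True
                while q:
                    y, x = q.popleft()
                    pix.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            q.append((ny, nx))
                ys = [p[0] for p in pix]
                xs = [p[1] for p in pix]
                # Length ~ longer bounding-box side; width ~ area / length.
                length = max(max(ys) - min(ys), max(xs) - min(xs)) + 1
                comps.append((length, len(pix) / length))
    density = sum(v for row in mask for v in row) / (rows * cols)
    avg_len = sum(c[0] for c in comps) / len(comps)
    avg_wid = sum(c[1] for c in comps) / len(comps)
    return avg_len, avg_wid, density
```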

  19. Improving reticle defect disposition via fully automated lithography simulation

    NASA Astrophysics Data System (ADS)

    Mann, Raunak; Goodman, Eliot; Lao, Keith; Ha, Steven; Vacca, Anthony; Fiekowsky, Peter; Fiekowsky, Dan

    2016-03-01

Most advanced wafer fabs have embraced complex pattern decoration, which creates numerous challenges during in-fab reticle qualification. These optical proximity correction (OPC) techniques create assist features that tend to be very close in size and shape to the main patterns as seen in Figure 1. A small defect on an assist feature will most likely have little or no impact on the fidelity of the wafer image, whereas the same defect on a main feature could significantly decrease device functionality. In order to properly disposition these defects, reticle inspection technicians need an efficient method that automatically separates main from assist features and predicts the resulting defect impact on the wafer image, such as the Automated Defect Analysis System (ADAS) defect simulation system [1]. Up until now, using ADAS simulation was limited to engineers due to the complexity of the settings that need to be manually entered in order to create an accurate result. A single error in entering one of these values can cause erroneous results; therefore, full automation is necessary. In this study, we propose a new method where all needed simulation parameters are automatically loaded into ADAS. This is accomplished in two parts. First, we have created a scanner parameter database that is automatically identified from mask product and level names. Second, we automatically determine the appropriate simulation printability threshold by using a new reference image (provided by the inspection tool) that contains a known measured value of the reticle critical dimension (CD). This new method automatically loads the correct scanner conditions, sets the appropriate simulation threshold, and automatically measures the percentage of CD change caused by the defect. This streamlines qualification and reduces the number of reticles being put on hold awaiting engineer review. We also present data showing the consistency and reliability of the new method, along with the impact on the efficiency of in

  20. A Fully Automated Classification for Mapping the Annual Cropland Extent

    NASA Astrophysics Data System (ADS)

    Waldner, F.; Defourny, P.

    2015-12-01

Mapping the global cropland extent is of paramount importance for food security. Accurate and reliable information on cropland and the location of major crop types is required for future policy, investment, and logistical decisions, as well as for production monitoring. Timely cropland information directly feeds early warning systems such as GIEWS and FEWS NET. In Africa, and particularly in the arid and semi-arid region, food security is at the center of debate (at least 10% of the population remains undernourished) and accurate cropland estimation is a challenge. Spaceborne Earth Observation provides opportunities for global cropland monitoring in a spatially explicit, economic, efficient, and objective fashion. In both agricultural monitoring and climate modelling, cropland maps serve as masks to isolate agricultural land (i) for time-series analysis for crop condition monitoring and (ii) to investigate how cropland responds to climatic evolution. A large diversity of mapping strategies, ranging from the local to the global scale and associated with various degrees of accuracy, can be found in the literature. At the global scale, despite these efforts, cropland is generally one of the classes with the poorest accuracy, which limits its use for agricultural applications. This research aims at improving cropland delineation from the local to the regional and global scales, as well as allowing near-real-time updates. To that aim, five temporal features were designed to target the key characteristics of crop spectral-temporal behavior. To ensure a high degree of automation, training data are extracted from available baseline land cover maps. The method delivers cropland maps with high accuracy over contrasted agro-systems in Ukraine, Argentina, China and Belgium. The accuracies reached are comparable to those obtained with classifiers trained with in-situ data. In addition, the cropland class was found to be associated with low uncertainty.
The temporal features
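The overall idea, temporal features plus a classifier trained on labels taken from a baseline map, can be sketched as below. The feature set and the nearest-neighbour classifier are illustrative stand-ins, not the paper's exact five features or classification method.

```python
# Hypothetical sketch: summarize each pixel's NDVI time series into a few
# temporal features, then classify against training pixels whose labels
# come from an existing baseline land-cover map.

def temporal_features(ndvi):
    """Max, min, amplitude, time-of-max, and mean of an NDVI series."""
    mx, mn = max(ndvi), min(ndvi)
    return (mx, mn, mx - mn, ndvi.index(mx), sum(ndvi) / len(ndvi))

def classify(pixel_series, training):
    """1-nearest-neighbour in feature space; training = [(series, label), ...]."""
    f = temporal_features(pixel_series)

    def dist(series):
        g = temporal_features(series)
        return sum((a - b) ** 2 for a, b in zip(f, g))

    return min(training, key=lambda t: dist(t[0]))[1]
```

Cropland shows a strong seasonal NDVI amplitude, so even this crude feature space separates it from spectrally flat covers.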

  1. An automated microfluidic system for single-stranded DNA preparation and magnetic bead-based microarray analysis

    PubMed Central

    Wang, Shuaiqin; Sun, Yujia; Liu, Yan; Xiang, Guangxin; Wang, Lei; Cheng, Jing; Liu, Peng

    2015-01-01

We present an integrated microfluidic device capable of performing single-stranded DNA (ssDNA) preparation and magnetic bead-based microarray analysis with white-light detection for detecting mutations that account for hereditary hearing loss. The entire operation process, which includes loading of streptavidin-coated magnetic beads (MBs) and biotin-labeled polymerase chain reaction products, active dispersion of the MBs with DNA for binding, alkaline denaturation of DNA, dynamic hybridization of the bead-labeled ssDNA to a tag array, and white-light detection, can all be automatically accomplished in a single chamber of the microchip, which was operated on a self-contained instrument with all the necessary components for thermal control, fluidic control, and detection. Two novel mixing valves with embedded polydimethylsiloxane membranes, which can alternately generate a 3-μl pulse flow at a peak rate of around 160 mm/s, were integrated into the chip for thoroughly dispersing magnetic beads in 2 min. The binding efficiency of biotinylated oligonucleotides to beads was measured to be 80.6% of that obtained in a tube with the conventional method. To critically test the performance of this automated microsystem, we employed a commercial microarray-based detection kit for detecting nine mutation loci that account for hereditary hearing loss. The limit of detection of the microsystem was determined as 2.5 ng of input K562 standard genomic DNA using this kit. In addition, four blood samples obtained from persons with mutations were all correctly typed by our system in less than 45 min per run. The fully automated, “amplicon-in-answer-out” operation, together with the white-light detection, makes our system an excellent platform for low-cost, rapid genotyping in clinical diagnosis. PMID:25825617

  2. Fully automated fluorescent in situ hybridization (FISH) staining and digital analysis of HER2 in breast cancer: a validation study.

    PubMed

    van der Logt, Elise M J; Kuperus, Deborah A J; van Setten, Jan W; van den Heuvel, Marius C; Boers, James E; Schuuring, Ed; Kibbelaar, Robby E

    2015-01-01

    HER2 assessment is routinely used to select patients with invasive breast cancer that might benefit from HER2-targeted therapy. The aim of this study was to validate a fully automated in situ hybridization (ISH) procedure that combines the automated Leica HER2 fluorescent ISH system for Bond with supervised automated analysis with the Visia imaging D-Sight digital imaging platform. HER2 assessment was performed on 328 formalin-fixed/paraffin-embedded invasive breast cancer tumors on tissue microarrays (TMA) and 100 (50 selected IHC 2+ and 50 random IHC scores) full-sized slides of resections/biopsies obtained for diagnostic purposes previously. For digital analysis slides were pre-screened at 20x and 100x magnification for all fluorescent signals and supervised-automated scoring was performed on at least two pictures (in total at least 20 nuclei were counted) with the D-Sight HER2 FISH analysis module by two observers independently. Results were compared to data obtained previously with the manual Abbott FISH test. The overall agreement with Abbott FISH data among TMA samples and 50 selected IHC 2+ cases was 98.8% (κ = 0.94) and 93.8% (κ = 0.88), respectively. The results of 50 additionally tested unselected IHC cases were concordant with previously obtained IHC and/or FISH data. The combination of the Leica FISH system with the D-Sight digital imaging platform is a feasible method for HER2 assessment in routine clinical practice for patients with invasive breast cancer.
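The scoring logic applied after automated signal counting can be sketched as follows. The ratio cut-off of 2.0 reflects commonly used HER2 FISH guidelines and the 20-nucleus minimum follows the protocol described in the abstract, but the D-Sight module's internals are not published, so this is purely illustrative.

```python
# Illustrative post-counting scoring step (not the D-Sight implementation):
# sum HER2 and CEP17 signals over the scored nuclei and call amplification
# from the HER2/CEP17 ratio.

def her2_fish_call(nuclei):
    """nuclei: list of (her2_count, cep17_count) per scored nucleus.
    Returns (call, ratio)."""
    if len(nuclei) < 20:
        raise ValueError("score at least 20 nuclei, as in the validation study")
    her2 = sum(n[0] for n in nuclei)
    cep17 = sum(n[1] for n in nuclei)
    ratio = her2 / cep17
    return ("amplified" if ratio >= 2.0 else "not amplified", round(ratio, 2))
```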

  3. Considerations for Using Phased Array Ultrasonics in a Fully Automated Inspection System

    NASA Astrophysics Data System (ADS)

    Kramb, V. A.; Olding, R. B.; Sebastian, J. R.; Hoppe, W. C.; Petricola, D. L.; Hoeffel, J. D.; Gasper, D. A.; Stubbs, D. A.

    2004-02-01

    The University of Dayton Research Institute (UDRI) under contract by the US Air Force has designed and constructed a fully automated ultrasonic inspection system for the detection of embedded defects in rotating gas turbine engine components. The system performs automated inspections using the "scan plan" concept developed for the Air Force sponsored "Retirement For Cause" (RFC) automated eddy current system. Execution of the scan plan results in a fully automated inspection process producing engine component accept/reject decisions based on probability of detection (POD) information. Use of the phased-array ultrasonic instrument and probes allows for optimization of both the sensitivity and resolution for each inspection through electronic beamforming, scanning, and focusing processes. However, issues such as alignment of the array probe, calibration of individual elements and overall beam response prior to the inspection have not been addressed for an automated system. This paper will discuss current progress in the development of an automated alignment and calibration procedure for various phased array apertures and specimen geometries.

  4. 21 CFR 866.1645 - Fully automated short-term incubation cycle antimicrobial susceptibility system.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Fully automated short-term incubation cycle antimicrobial susceptibility system. 866.1645 Section 866.1645 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES IMMUNOLOGY AND MICROBIOLOGY...

  5. 21 CFR 866.1645 - Fully automated short-term incubation cycle antimicrobial susceptibility system.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Fully automated short-term incubation cycle antimicrobial susceptibility system. 866.1645 Section 866.1645 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES IMMUNOLOGY AND MICROBIOLOGY...

  6. 21 CFR 866.1645 - Fully automated short-term incubation cycle antimicrobial susceptibility system.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Fully automated short-term incubation cycle antimicrobial susceptibility system. 866.1645 Section 866.1645 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES IMMUNOLOGY AND MICROBIOLOGY...

  7. 21 CFR 866.1645 - Fully automated short-term incubation cycle antimicrobial susceptibility system.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Fully automated short-term incubation cycle antimicrobial susceptibility system. 866.1645 Section 866.1645 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES IMMUNOLOGY AND MICROBIOLOGY...

  8. 21 CFR 866.1645 - Fully automated short-term incubation cycle antimicrobial susceptibility system.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Fully automated short-term incubation cycle antimicrobial susceptibility system. 866.1645 Section 866.1645 Food and Drugs FOOD AND DRUG ADMINISTRATION... choice to treat bacterial diseases. (b) Classification. Class II (special controls). The special...

  9. ProDeGe: A Computational Protocol for fully Automated Decontamination of Genomic Data

    SciTech Connect

    2015-12-01

The Single Cell Data Decontamination Pipeline is a fully-automated software tool which classifies unscreened contigs from single cell datasets through a combination of homology and feature-based methodologies using the organism's nucleotide sequences and the known NCBI taxonomy. The software is freely available to download and install, and can be run on any system.

  10. Fully automated digital holographic processing for monitoring the dynamics of a vesicle suspension under shear flow

    PubMed Central

    Minetti, Christophe; Podgorski, Thomas; Coupier, Gwennou; Dubois, Frank

    2014-01-01

We investigate the dynamics of a vesicle suspension under shear flow between plates using digital holographic microscopy (DHM) with a spatially reduced coherent source. Holograms are grabbed at a frequency of 24 frames/sec. The distribution of the vesicle suspension is obtained after numerical processing of the digital hologram sequence, resulting in a 4D distribution. Obtaining this distribution is not straightforward and requires special processing to automate the analysis. We present an original method that fully automates the analysis and provides distributions that are further analyzed to extract physical properties of the fluid. Details of the numerical implementation, as well as sample experimental results, are presented. PMID:24877015

  11. Neurodegenerative changes in Alzheimer's disease: a comparative study of manual, semi-automated, and fully automated assessment using MRI

    NASA Astrophysics Data System (ADS)

    Fritzsche, Klaus H.; Giesel, Frederik L.; Heimann, Tobias; Thomann, Philipp A.; Hahn, Horst K.; Pantel, Johannes; Schröder, Johannes; Essig, Marco; Meinzer, Hans-Peter

    2008-03-01

Objective quantification of disease-specific neurodegenerative changes can facilitate diagnosis and therapeutic monitoring in several neuropsychiatric disorders. Reproducibility and easy-to-perform assessment are essential to ensure applicability in clinical environments. The aim of this comparative study is the evaluation of a fully automated approach that assesses atrophic changes in Alzheimer's disease (AD) and Mild Cognitive Impairment (MCI). 21 healthy volunteers (mean age 66.2), 21 patients with MCI (66.6), and 10 patients with AD (65.1) were enrolled. Subjects underwent extensive neuropsychological testing and MRI was conducted on a 1.5 Tesla clinical scanner. Atrophic changes were measured automatically by a series of image processing steps including state-of-the-art brain mapping techniques. Results were compared with two reference approaches: a manual segmentation of the hippocampal formation and a semi-automated estimation of temporal horn volume, which is based upon interactive selection of two to six landmarks in the ventricular system. All approaches separated controls and AD patients significantly (10^-5 < p < 10^-4) and showed a slight but not significant increase of neurodegeneration for subjects with MCI compared to volunteers. The automated approach correlated significantly with the manual (r = -0.65, p < 10^-6) and semi-automated (r = -0.83, p < 10^-13) measurements. It achieved high accuracy while maximizing observer independence and time savings, and thus its usefulness for clinical routine.

  12. Fully automated corneal endothelial morphometry of images captured by clinical specular microscopy

    NASA Astrophysics Data System (ADS)

    Bucht, Curry; Söderberg, Per; Manneberg, Göran

    2010-02-01

    The corneal endothelium serves as the posterior barrier of the cornea. Factors such as clarity and refractive properties of the cornea are in direct relationship to the quality of the endothelium. The endothelial cell density is considered the most important morphological factor of the corneal endothelium. Pathological conditions and physical trauma may reduce the endothelial cell density to such an extent that the optical properties of the cornea, and thus clear eyesight, are threatened. Diagnosis of the corneal endothelium through morphometry is an important part of several clinical applications. Morphometry of the corneal endothelium is presently carried out by semi-automated analysis of pictures captured by a Clinical Specular Microscope (CSM). Because of the occasional need for operator involvement, this process can be tedious, having a negative impact on sampling size. This study was dedicated to the development and use of fully automated analysis of a very large range of images of the corneal endothelium, captured by CSM, using Fourier analysis. Software was developed in the mathematical programming language Matlab. Pictures of the corneal endothelium, captured by CSM, were read into the analysis software. The software automatically performed digital enhancement of the images, normalizing lights and contrasts. The digitally enhanced images of the corneal endothelium were Fourier transformed, using the fast Fourier transform (FFT) and stored as new images. Tools were developed and applied for identification and analysis of relevant characteristics of the Fourier transformed images. The data obtained from each Fourier transformed image was used to calculate the mean cell density of its corresponding corneal endothelium. The calculation was based on well-known diffraction theory. Results in the form of estimated cell density of the corneal endothelium were obtained, using fully automated analysis software on 292 images captured by CSM. 
The cell density obtained by the
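The core idea described above — reading cell density off the Fourier transform of a quasi-regular cell mosaic — can be sketched in a few lines: the dominant cell spacing shows up as a ring in the power spectrum, and density follows from the spacing under a packing assumption. This is a toy numpy sketch, not the authors' Matlab implementation; the hexagonal-packing conversion and all names are illustrative assumptions.

```python
import numpy as np

def cell_density_from_fft(img, pixel_size_mm):
    """Estimate cell density (cells/mm^2) from the spectral ring of a
    quasi-regular cell mosaic (square image assumed)."""
    img = img - img.mean()                       # suppress the DC term
    spec = np.abs(np.fft.fftshift(np.fft.fft2(img)))
    n = img.shape[0]
    cy, cx = n // 2, n // 2
    yy, xx = np.indices(spec.shape)
    r = np.hypot(yy - cy, xx - cx).astype(int)   # radius in frequency bins
    radial = np.bincount(r.ravel(), spec.ravel()) / np.bincount(r.ravel())
    r_peak = np.argmax(radial[2:n // 2]) + 2     # skip low-frequency residue
    spacing_px = n / r_peak                      # dominant cell spacing, pixels
    spacing_mm = spacing_px * pixel_size_mm
    # hexagonal packing assumption: cell area = (sqrt(3)/2) * spacing^2
    return 1.0 / ((np.sqrt(3) / 2) * spacing_mm ** 2)
```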

  13. Integration of image analysis and robotics into a fully automated colony picking and plate handling system.

    PubMed Central

    Jones, P; Watson, A; Davies, M; Stubbings, S

    1992-01-01

    We describe here the integration of image analysis and robotics to produce a fully automated colony picking/plate handling system. Biological tests were performed to verify its performance in terms of sterilisation and accuracy of picking. The machine was then used by a single operative to pick a 36,000 clone cDNA library in approximately 42 hrs over 5 days. Images PMID:1408762

  14. Vervet MRI atlas and label map for fully automated morphometric analyses.

    PubMed

    Maldjian, Joseph A; Daunais, James B; Friedman, David P; Whitlow, Christopher T

    2014-10-01

    Currently available non-human primate templates typically require input of a skull-stripped brain for structural processing. This can be a manually intensive procedure, and considerably limits their utility. The purpose of this study was to create a vervet MRI population template, associated tissue probability maps (TPM), and a label atlas to facilitate true fully automated Magnetic Resonance Imaging (MRI) structural processing for morphometric analyses. Structural MRI scans of ten vervet monkeys (Chlorocebus aethiops) scanned at three time points were used in this study. An unbiased population average template was created using a symmetric diffeomorphic registration (SyN) procedure. Skull stripping, segmentation, and label map generation were performed using the publicly available rhesus INIA19 MRI template and NeuroMap label atlas. A six-class TPM and a six-layer two-class normalization template was created from the vervet segmentation for use within the Statistical Parametric Mapping (SPM) framework. Fully automated morphologic processing of all of the vervet MRI scans was then performed using the vervet TPM and vervet normalization template including skull-stripping, segmentation and normalization. The vervet template creation procedure resulted in excellent skull stripping, segmentation, and NeuroMap atlas labeling with 720 structures successfully registered. Fully automated processing was accomplished for all vervet scans, demonstrating excellent skull-stripping, segmentation, and normalization performance. We describe creation of an unbiased vervet structural MRI population template and atlas. The template includes an associated six-class TPM and DARTEL six-layer two-class normalization template for true fully automated skull-stripping, segmentation, and normalization of vervet structural T1-weighted MRI scans. We provide the most detailed vervet label atlas currently available based on the NeuroMaps atlas with 720 labels successfully registered. We
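The "unbiased population average template" idea above — iteratively register every subject to the current average, re-average, repeat — can be demonstrated on a toy 1-D problem. The actual pipeline used SyN diffeomorphic registration (ANTs); this sketch substitutes simple circular-shift alignment by FFT cross-correlation purely to show the iterate-align-average loop, and all names are illustrative.

```python
import numpy as np

def build_template(signals, n_iter=5):
    """Toy unbiased template construction: align each subject to the
    current average (best circular shift), then re-average."""
    template = np.mean(signals, axis=0)
    for _ in range(n_iter):
        aligned = []
        for s in signals:
            # circular cross-correlation via FFT; argmax gives best shift
            xc = np.fft.ifft(np.fft.fft(template) * np.conj(np.fft.fft(s))).real
            shift = int(np.argmax(xc))
            aligned.append(np.roll(s, shift))
        template = np.mean(aligned, axis=0)
    return template
```

With noiseless shifted copies the loop collapses them onto a single sharp average, whereas the naive (unaligned) mean is blurred.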

  15. How a Fully Automated eHealth Program Simulates Three Therapeutic Processes: A Case Study

    PubMed Central

    Johansen, Ayna; Brendryen, Håvar

    2016-01-01

    Background eHealth programs may be better understood by breaking down the components of one particular program and discussing its potential for interactivity and tailoring in regard to concepts from face-to-face counseling. In the search for the efficacious elements within eHealth programs, it is important to understand how a program using lapse management may simultaneously support working alliance, internalization of motivation, and behavior maintenance. These processes have been applied to fully automated eHealth programs individually. However, given their significance in face-to-face counseling, it may be important to simulate the processes simultaneously in interactive, tailored programs. Objective We propose a theoretical model for how fully automated behavior change eHealth programs may be more effective by simulating a therapist’s support of a working alliance, internalization of motivation, and managing lapses. Methods We show how the model is derived from theory and its application to Endre, a fully automated smoking cessation program that engages the user in several “counseling sessions” about quitting. A descriptive case study based on tools from the intervention mapping protocol shows how each therapeutic process is simulated. Results The program supports the user’s working alliance through alliance factors, the nonembodied relational agent Endre and computerized motivational interviewing. Computerized motivational interviewing also supports internalized motivation to quit, whereas a lapse management component responds to lapses. The description operationalizes working alliance, internalization of motivation, and managing lapses, in terms of eHealth support of smoking cessation. Conclusions A program may simulate working alliance, internalization of motivation, and lapse management through interactivity and individual tailoring, potentially making fully automated eHealth behavior change programs more effective. PMID:27354373

  16. Fully Automated Volumetric Modulated Arc Therapy Plan Generation for Prostate Cancer Patients

    SciTech Connect

    Voet, Peter W.J.; Dirkx, Maarten L.P.; Breedveld, Sebastiaan; Al-Mamgani, Abrahim; Incrocci, Luca; Heijmen, Ben J.M.

    2014-04-01

    Purpose: To develop and evaluate fully automated volumetric modulated arc therapy (VMAT) treatment planning for prostate cancer patients, avoiding manual trial-and-error tweaking of plan parameters by dosimetrists. Methods and Materials: A system was developed for fully automated generation of VMAT plans with our commercial clinical treatment planning system (TPS), linked to the in-house developed Erasmus-iCycle multicriterial optimizer for preoptimization. For 30 randomly selected patients, automatically generated VMAT plans (VMAT_auto) were compared with VMAT plans generated manually by 1 expert dosimetrist in the absence of time pressure (VMAT_man). For all treatment plans, planning target volume (PTV) coverage and sparing of organs-at-risk were quantified. Results: All generated plans were clinically acceptable and had similar PTV coverage (V_95% > 99%). For VMAT_auto and VMAT_man plans, the organ-at-risk sparing was similar as well, although only the former plans were generated without any planning workload. Conclusions: Fully automated generation of high-quality VMAT plans for prostate cancer patients is feasible and has recently been implemented in our clinic.
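The manual tweaking being replaced here is, at heart, a prioritized multicriteria search: guarantee target coverage first, then push down organ-at-risk dose. A toy sketch of that two-step (lexicographic) selection over a discrete candidate set follows; Erasmus-iCycle solves a continuous multicriterial problem, so the field names, threshold, and discrete formulation are all illustrative assumptions.

```python
def select_plan(plans, coverage_min=99.0):
    """Toy two-step prioritized optimization: enforce PTV coverage first,
    then minimize organ-at-risk dose among the acceptable candidates."""
    acceptable = [p for p in plans if p["ptv_v95"] >= coverage_min]
    if not acceptable:
        # fall back to the best achievable coverage
        best = max(p["ptv_v95"] for p in plans)
        acceptable = [p for p in plans if p["ptv_v95"] == best]
    return min(acceptable, key=lambda p: p["oar_mean_dose"])
```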

  17. A Fully Automated Drosophila Olfactory Classical Conditioning and Testing System for Behavioral Learning and Memory Assessment

    PubMed Central

    Jiang, Hui; Hanna, Eriny; Gatto, Cheryl L.; Page, Terry L.; Bhuva, Bharat; Broadie, Kendal

    2016-01-01

    Background Aversive olfactory classical conditioning has been the standard method to assess Drosophila learning and memory behavior for decades, yet training and testing are conducted manually under exceedingly labor-intensive conditions. To overcome this severe limitation, a fully automated, inexpensive system has been developed, which allows accurate and efficient Pavlovian associative learning/memory analyses for high-throughput pharmacological and genetic studies. New Method The automated system employs a linear actuator coupled to an odorant T-maze with airflow-mediated transfer of animals between training and testing stages. Odorant, airflow and electrical shock delivery are automatically administered and monitored during training trials. Control software allows operator-input variables to define parameters of Drosophila learning, short-term memory and long-term memory assays. Results The approach allows accurate learning/memory determinations with operational fail-safes. Automated learning indices (immediately post-training) and memory indices (after 24 hours) are comparable to traditional manual experiments, while minimizing experimenter involvement. Comparison with Existing Methods The automated system provides vast improvements over labor-intensive manual approaches with no experimenter involvement required during either training or testing phases. It provides quality control tracking of airflow rates, odorant delivery and electrical shock treatments, and an expanded platform for high-throughput studies of combinatorial drug tests and genetic screens. The design uses inexpensive hardware and software for a total cost of ~$500US, making it affordable to a wide range of investigators. Conclusions This study demonstrates the design, construction and testing of a fully automated Drosophila olfactory classical association apparatus to provide low-labor, high-fidelity, quality-monitored, high-throughput and inexpensive learning and memory behavioral assays.
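The learning and memory indices mentioned above are conventionally computed from T-maze choice counts as a performance index: the fraction of flies avoiding the shock-paired odor minus the fraction approaching it. A minimal sketch of that standard formula (the abstract does not give the system's exact computation, so treat this as the conventional definition rather than the paper's code):

```python
def performance_index(n_avoid_cs_plus, n_toward_cs_plus):
    """T-maze performance index: 1 = perfect avoidance of the shock-paired
    odor (CS+), 0 = chance, negative = preference for the CS+."""
    total = n_avoid_cs_plus + n_toward_cs_plus
    if total == 0:
        raise ValueError("no flies counted")
    return (n_avoid_cs_plus - n_toward_cs_plus) / total
```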

  18. Automated Immunomagnetic Separation and Microarray Detection of E. coli O157:H7 from Poultry Carcass Rinse

    SciTech Connect

    Chandler, Darrell P.; Brown, Jeremy D.; Call, Douglas R.; Wunschel, Sharon C.; Grate, Jay W.; Holman, David A.; Olson, Lydia G.; Stottlemyer, Mark S.; Bruckner-Lea, Cindy J.

    2001-09-01

    We describe the development and application of a novel electromagnetic flow cell and fluidics system for automated immunomagnetic separation of E. coli directly from unprocessed poultry carcass rinse, and the biochemical coupling of automated sample preparation with nucleic acid microarrays without cell growth. Highly porous nickel foam was used as a magnetic flux conductor. Up to 32% recovery efficiency of 'total' E. coli was achieved within the automated system with 6 sec contact times and 15 minute protocol (from sample injection through elution), statistically similar to cell recovery efficiencies in > 1 hour 'batch' captures. The electromagnetic flow cell allowed complete recovery of 2.8 µm particles directly from unprocessed poultry carcass rinse whereas the batch system did not. O157:H7 cells were reproducibly isolated directly from unprocessed poultry rinse with 39% recovery efficiency at 10^3 cells ml^-1 inoculum. Direct plating of washed beads showed positive recovery of O157:H7 directly from carcass rinse at an inoculum of 10 cells ml^-1. Recovered beads were used for direct PCR amplification and microarray detection, with a process-level detection limit (automated cell concentration through microarray detection) of < 10^3 cells ml^-1 carcass rinse. The fluidic system and analytical approach described here are generally applicable to most microbial detection problems and applications.
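The recovery-efficiency figures above follow standard CFU arithmetic: convert colony counts to CFU/ml via dilution factor and plated volume, then express recovered cells as a percentage of the inoculum. A minimal sketch (function names and argument layout are illustrative, not from the paper):

```python
def cfu_per_ml(colonies, dilution_factor, plated_ml):
    """Concentration from a plate count: colonies scaled by dilution
    and plated volume."""
    return colonies * dilution_factor / plated_ml

def recovery_pct(recovered_cfu_ml, eluate_ml, input_cfu_ml, input_ml):
    """Percent of inoculated cells recovered after immunomagnetic capture."""
    return 100.0 * (recovered_cfu_ml * eluate_ml) / (input_cfu_ml * input_ml)
```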

  19. Regeneration of recombinant antigen microarrays for the automated monitoring of antibodies against zoonotic pathogens in swine sera.

    PubMed

    Meyer, Verena K; Kober, Catharina; Niessner, Reinhard; Seidel, Michael

    2015-01-23

    The ability to regenerate immobilized proteins like recombinant antigens (rAgs) on surfaces is an unsolved problem for flow-based immunoassays on microarray analysis systems. The regeneration on microarray chip surfaces is achieved by changing the protein structures and desorption of antibodies. Afterwards, reactivation of immobilized protein antigens is necessary for reconstitution processes. Any backfolding should be managed in a way that antibodies are able to detect the protein antigens in the next measurement cycle. The regeneration of rAg microarrays was examined for the first time on the MCR3 flow-based chemiluminescence (CL) microarray analysis platform. The aim was to reuse rAg microarray chips in order to reduce the screening effort and costs. An antibody capturing format was used to detect antibodies against zoonotic pathogens in sera of slaughtered pigs. Different denaturation and reactivation buffers were tested. Acidic glycine-SDS buffer (pH 2.5) and 8 M guanidinium hydrochloride showed the best results with respect to denaturation efficiency. The highest CL signals after regeneration were achieved with a carbonate buffer containing 10 mM DTT and 0.1% BSA for reactivation. Antibodies against Yersinia spp. and hepatitis E virus (HEV) were detected in swine sera on one immunochip over 4 days and 25 measurement cycles. Each cycle took 10 min for detection and regeneration. By using the rAg microarray chip, a fast and automated screening of antibodies against pathogens in sera of slaughtered pigs would be possible for zoonosis monitoring.

  20. Fully Automated Trimethylsilyl (TMS) Derivatisation Protocol for Metabolite Profiling by GC-MS.

    PubMed

    Zarate, Erica; Boyle, Veronica; Rupprecht, Udo; Green, Saras; Villas-Boas, Silas G; Baker, Philip; Pinu, Farhana R

    2016-12-29

    Gas Chromatography-Mass Spectrometry (GC-MS) has long been used for metabolite profiling of a wide range of biological samples. Many derivatisation protocols are already available and among these, trimethylsilyl (TMS) derivatisation is one of the most widely used in metabolomics. However, most TMS methods rely on off-line derivatisation prior to GC-MS analysis. In the case of manual off-line TMS derivatisation, the derivative created is unstable, so reduction in recoveries occurs over time. Thus, derivatisation is carried out in small batches. Here, we present a fully automated TMS derivatisation protocol using robotic autosamplers, and we also evaluate a commercial software package, Maestro, available from Gerstel GmbH. Because of automation, there was no waiting time of derivatised samples on the autosamplers, thus reducing degradation of unstable metabolites. Moreover, this method allowed us to overlap samples and improved throughput. We compared data obtained from both manual and automated TMS methods performed on three different matrices, including standard mix, wine, and plasma samples. The automated TMS method showed better reproducibility and higher peak intensity for most of the identified metabolites than the manual derivatisation method. We also validated the automated method using 114 quality control plasma samples. Additionally, we showed that this online method was highly reproducible for most of the metabolites detected and identified (RSD < 20%) and specifically achieved excellent results for sugars, sugar alcohols, and some organic acids. To the very best of our knowledge, this is the first time that the automated TMS method has been applied to analyse a large number of complex plasma samples. Furthermore, we found that this method was highly applicable for routine metabolite profiling (both targeted and untargeted) in any metabolomics laboratory.
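The reproducibility criterion quoted above (RSD < 20%) is the relative standard deviation, i.e. the sample standard deviation expressed as a percentage of the mean across repeated QC injections. A one-function sketch of that calculation:

```python
import numpy as np

def rsd_percent(values):
    """Relative standard deviation (coefficient of variation) in percent,
    using the sample standard deviation (ddof=1)."""
    v = np.asarray(values, float)
    return 100.0 * v.std(ddof=1) / v.mean()
```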

  2. Automated versus manual sample inoculations in routine clinical microbiology: a performance evaluation of the fully automated InoqulA instrument.

    PubMed

    Froment, P; Marchandin, H; Vande Perre, P; Lamy, B

    2014-03-01

    The process of plate streaking has been automated to improve the culture readings, isolation quality, and workflow of microbiology laboratories. However, instruments have not been well evaluated under routine conditions. We aimed to evaluate the performance of the fully automated InoqulA instrument (BD Kiestra B.V., The Netherlands) in the automated seeding of liquid specimens and samples collected using swabs with transport medium. We compared manual and automated methods according to the (i) within-run reproducibility using Escherichia coli-calibrated suspensions, (ii) intersample contamination using a series of alternating sterile broths and broths with >10^5 CFU/ml of either E. coli or Proteus mirabilis, (iii) isolation quality with standardized mixed bacterial suspensions of diverse complexity and a 4-category standardized scale (very poor, poor, fair to good, or excellent), and (iv) agreement of the results obtained from 244 clinical specimens. By involving 15 technicians in the latter part of the comparative study, we estimated the variability in the culture quality at the level of the laboratory team. The instrument produced satisfactory reproducibility with no sample cross-contamination, and it performed better than the manual method, with more colony types recovered and isolated (up to 11% and 17%, respectively). Finally, we showed that the instrument did not shorten the seeding time over short periods of work compared to the manual method. Altogether, the instrument improved the quality and standardization of the isolation, thereby contributing to a better overall workflow, shortened the time to results, and provided more accurate results for polymicrobial specimens.

  3. A fully automated system for ultrasonic power measurement and simulation according to IEC 61161:2006

    NASA Astrophysics Data System (ADS)

    Costa-Felix, Rodrigo P. B.; Alvarenga, André V.; Hekkenberg, Rob

    2011-02-01

    The worldwide accepted ultrasonic power measurement standard is IEC 61161, presently in its 2nd edition (2006), but under review. To fulfil its requirements, considering that a radiation force balance is to be used as ultrasonic power detector, a large amount of raw data (mass measurements) must be collected as a function of time to perform all necessary calculations and corrections. Uncertainty determination demands calculation effort on raw and processed data. Although this can be done the old-fashioned way, using spreadsheets and manual data collection, automation software is often used in metrology to provide a virtually error-free environment for data acquisition and for repetitive calculations and corrections. Considering that, a fully automated ultrasonic power measurement system was developed and comprehensively tested. A balance with 0.1 mg precision, model CP224S (Sartorius, Germany), was used as the measuring device, and a calibrated continuous wave ultrasound check source (Precision Acoustics, UK) was the device under test. A 150 ml container filled with degassed water and containing an absorbing target at the bottom was placed on the balance pan. Besides the automation features, a power measurement simulation routine was implemented, conceived as a teaching tool showing how ultrasonic power emission behaves when measured with a radiation force balance equipped with an absorbing target. The automation software proved an effective tool for speeding up ultrasonic power measurement, while allowing accurate calculation and attractive graphical partial and final results.
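For a perfectly absorbing target, the radiation force relates mass reading to power directly: F = P/c, so the apparent mass change m on the balance gives P = m·g·c. A minimal sketch of that base relation (the speed-of-sound default assumes water near room temperature; IEC 61161 additionally prescribes corrections, e.g. for buoyancy and imperfect absorption, which are omitted here):

```python
def acoustic_power_watts(mass_change_kg, c_m_per_s=1482.0, g=9.81):
    """Ultrasonic power from the radiation force on a perfectly absorbing
    target: F = P/c, hence P = m * g * c."""
    return mass_change_kg * g * c_m_per_s
```

For example, a 1 mg apparent mass change corresponds to roughly 14.5 mW, which is why sub-milligram balance resolution is needed for low-power sources.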

  4. Phase shifting and phase retrieval with a fully automated laser diode system.

    PubMed

    Rivera-Ortega, Uriel; Dirckx, Joris; Meneses-Fabian, Cruz

    2015-11-20

    A low-cost and fully automated process for phase-shifting interferometry (PSI) by continuously changing the input voltage of a laser diode (LD) under the scheme of an unbalanced Twyman-Green interferometer (TGI) setup is presented. The input signal of an LD is controlled by a data acquisition (NI-DAQ) device that allows it to change its wavelength according to its tunability features. The automation and data analysis are done using LabVIEW in combination with MATLAB. The phase map is obtained using the Carré algorithm. Measurements of visibility and phase shift to verify the PSI requirements are shown. It is demonstrated with experimental results and statistical analysis that the phase retrieval can be successfully achieved without calibration and using minimal optical devices.
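The Carré algorithm suits this setup because the voltage-induced phase step never needs calibrating: it assumes four frames with a constant but unknown step. A numpy sketch of the classic four-frame formula follows (the sign convention assumes steps of -3α/2, -α/2, +α/2, +3α/2, and sign(I2-I3) resolves the quadrant; this is the textbook form, not necessarily the authors' exact implementation):

```python
import numpy as np

def carre_phase(i1, i2, i3, i4):
    """Carré four-step phase retrieval for a constant unknown phase step."""
    d23, d14 = i2 - i3, i1 - i4
    num = np.sign(d23) * np.sqrt(np.abs((3 * d23 - d14) * (d23 + d14)))
    den = (i2 + i3) - (i1 + i4)
    return np.arctan2(num, den)
```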

  5. Fully automated cellular-resolution vertebrate screening platform with parallel animal processing

    PubMed Central

    Chang, Tsung-Yao; Pardo-Martin, Carlos; Allalou, Amin; Wählby, Carolina; Yanik, Mehmet Fatih

    2012-01-01

    The zebrafish larva is an optically-transparent vertebrate model with complex organs that is widely used to study genetics, developmental biology, and to model various human diseases. In this article, we present a set of novel technologies that significantly increase the throughput and capabilities of previously described vertebrate automated screening technology (VAST). We developed a robust multi-thread system that can simultaneously process multiple animals. System throughput is limited only by the image acquisition speed rather than by the fluidic or mechanical processes. We developed image recognition algorithms that fully automate manipulation of animals, including orienting and positioning regions of interest within the microscope’s field of view. We also identified the optimal capillary materials for high-resolution, distortion-free, low-background imaging of zebrafish larvae. PMID:22159032

  6. A fully automated linear polyacrylamide coating and regeneration method for capillary electrophoresis of proteins.

    PubMed

    Bodnar, Judit; Hajba, Laszlo; Guttman, Andras

    2016-12-01

    Surface modification of the inner capillary wall in CE of proteins is frequently required to alter EOF and to prevent protein adsorption. Manual protocols for such coating techniques are cumbersome. In this paper, an automated covalent linear polyacrylamide coating and regeneration process is described to support long-term stability of fused-silica capillaries for protein analysis. The stability of the resulting capillary coatings was evaluated by a large number of separations using a three-protein test mixture in pH 6 and 3 buffer systems. The results were compared to those obtained with the use of bare fused-silica capillaries. If necessary, the fully automated capillary coating process was easily applied to regenerate the capillary to extend its useful lifetime.

  7. Study of Automated Embryo Manipulation Using Dynamic Microarray:Trapping, Culture and Collection

    NASA Astrophysics Data System (ADS)

    Kimura, Hiroshi; Nakamura, Hiroko; Iwai, Kosuke; Yamamoto, Takatoki; Takeuchi, Shoji; Fujii, Teruo; Sakai, Yasuyuki

    Embryo handling is an extremely important fundamental technique in reproductive technology and other related life science disciplines. The handling usually requires artisanal operation of a glass capillary tube, sucking the embryo in and out by applying external pressure by mouth or pipetting, to move it from one environment to another and to redeliver it into the womb. Because of these delicate operations, it is difficult to obtain quantitative results through the experiments. An automated embryo handling system has therefore been highly desired, to obtain stable quantitative results and to reduce the stress on operators. In this paper, we propose and develop an automated embryo culture device which can make an array of embryos, culture them to the blastocyst stage, and collect the blastocysts, using the dynamic microarray format that we had studied previously. We preliminarily examined the three functions of trapping, culture, and release using mouse embryos as samples. As a result, the mouse embryos were successfully trapped and released, whereas the efficiency of the in-device embryo culture was lower than that of conventional dish culture. The culture stage still needs optimization for embryos; however, the concept of embryo manipulation was proven successfully.

  8. Fully Automated Sentinel-2 Data Registration to Various Multisensor Earth Observation Imaging Datasets

    NASA Astrophysics Data System (ADS)

    Platias, Christos; Vakalopoulou, Maria; Karantzalos, Konstantinos

    2016-08-01

    In this paper, we propose a deformable registration framework for the fully automated, accurate registration of Sentinel-2 datasets. The proposed approach is based on a robust non-rigid registration framework under a Markov Random Fields formulation. Efficient linear programming is employed to minimize the cost function. The framework has been validated for the registration of Sentinel-2 and various multisensor optical and radar data such as Sentinel-1, RapidEye, Landsat, Proba, MODIS, etc. The performed quantitative and qualitative evaluation demonstrated the high potential of the developed approach for both optical and radar data.
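The MRF formulation above discretizes displacements into labels, with a data term per control point plus a smoothness term between neighbors, and minimizes the energy with linear programming on a grid graph. On a 1-D chain the same class of energy can be minimized exactly by dynamic programming, which this toy sketch demonstrates; the windowed-SSD data term, the L1 smoothness term, and all names are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

def register_1d(fixed, moving, points, labels, lam=0.5, w=2):
    """Exact MAP labeling of a chain MRF via dynamic programming:
    unary = windowed SSD after displacing, pairwise = lam * |d_i - d_j|."""
    labels = np.asarray(labels)
    n, L = len(points), len(labels)
    unary = np.empty((n, L))
    for i, p in enumerate(points):
        for j, d in enumerate(labels):
            q = p + d
            unary[i, j] = np.sum((fixed[p - w:p + w + 1] -
                                  moving[q - w:q + w + 1]) ** 2)
    cost, back = unary[0].copy(), np.zeros((n, L), int)
    for i in range(1, n):
        new_cost = np.empty(L)
        for j in range(L):
            trans = cost + lam * np.abs(labels - labels[j])
            back[i, j] = int(np.argmin(trans))
            new_cost[j] = trans[back[i, j]] + unary[i, j]
        cost = new_cost
    path = [int(np.argmin(cost))]           # backtrack the optimal labeling
    for i in range(n - 1, 0, -1):
        path.append(int(back[i, path[-1]]))
    return labels[np.array(path[::-1])]
```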

  9. Development of fully automated and integrated (''Instamatic'') welding systems for marine applications

    SciTech Connect

    Masubuchi, K.; Gustin, H.L.; Schloerb, D.W.

    1983-05-01

    A two-year research program was conducted at M.I.T. to develop fully automated and integrated welding systems. These systems package many actions involved in welding so that certain prescribed welding jobs can be performed by a person with no welding skill. They have been nicknamed ''instamatic'' welding systems, since they are similar to the easy-to-operate cameras. Following a general discussion on the development of the concept of the ''instamatic'' welding system, discussions are given on two types of systems which have been built and tested: underwater stud welding systems, and those using arc welding processes.

  10. Fully automated segmentation and characterization of the dendritic trees of retinal horizontal neurons

    SciTech Connect

    Kerekes, Ryan A; Gleason, Shaun Scott; Martins, Rodrigo; Dyer, Michael

    2010-01-01

    We introduce a new fully automated method for segmenting and characterizing the dendritic tree of neurons in confocal image stacks. Our method is aimed at wide-field-of-view, low-resolution imagery of retinal neurons in which dendrites can be intertwined and difficult to follow. The approach is based on 3-D skeletonization and includes a method for automatically determining an appropriate global threshold as well as a soma detection algorithm. We provide the details of the algorithm and a qualitative performance comparison against a commercially available neurite tracing software package, showing that a segmentation produced by our method more closely matches the ground-truth segmentation.
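The abstract mentions automatically determining an appropriate global threshold but does not specify the method; Otsu's method, which picks the threshold maximizing between-class variance of the intensity histogram, is a common choice and is sketched below purely as an illustration (not necessarily the authors' algorithm).

```python
import numpy as np

def otsu_threshold(img, nbins=256):
    """Global threshold maximizing between-class variance (Otsu's method)."""
    hist, edges = np.histogram(img.ravel(), bins=nbins)
    p = hist.astype(float) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(p)                      # class-0 probability up to each bin
    mu = np.cumsum(p * centers)            # cumulative class-0 mass
    mu_t = mu[-1]                          # global mean
    w1 = 1.0 - w0
    valid = (w0 > 0) & (w1 > 0)
    between = np.zeros(nbins)
    between[valid] = (mu_t * w0[valid] - mu[valid]) ** 2 / (w0[valid] * w1[valid])
    return centers[int(np.argmax(between))]
```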

  11. Collecting and analyzing microstructures in three dimensions: A fully automated approach

    NASA Astrophysics Data System (ADS)

    Spowart, Jonathan E.; Mullens, Herbert E.; Puchala, Brian T.

    2003-10-01

    Robo-Met.3D is a fully automated robotic serial sectioning device that was custom-built for three-dimensional (3-D) characterization of advanced microstructures at the Air Force Research Laboratory’s Materials and Manufacturing Directorate. The machine is capable of automatically performing metallographic serial sectioning at unprecedented rates and at slice thicknesses between 0.1 µm and 10 µm. Imaging is also fully automatic, using either bright-field or polarized light microscopy, and the high-resolution digital images are combined using custom software to produce accurate 3-D datasets of the material microstructure in near-realtime. Robo-Met.3D is U.S. patent pending.

  12. Fully Automated Data Collection Using PAM and the Development of PAM/SPACE Reversible Cassettes

    NASA Astrophysics Data System (ADS)

    Hiraki, Masahiko; Watanabe, Shokei; Chavas, Leonard M. G.; Yamada, Yusuke; Matsugaki, Naohiro; Igarashi, Noriyuki; Wakatsuki, Soichi; Fujihashi, Masahiro; Miki, Kunio; Baba, Seiki; Ueno, Go; Yamamoto, Masaki; Suzuki, Mamoru; Nakagawa, Atsushi; Watanabe, Nobuhisa; Tanaka, Isao

    2010-06-01

    To remotely control and automatically collect data in high-throughput X-ray data collection experiments, the Structural Biology Research Center at the Photon Factory (PF) developed and installed sample exchange robots PAM (PF Automated Mounting system) at PF macromolecular crystallography beamlines: BL-5A, BL-17A, AR-NW12A and AR-NE3A. We developed and installed software that manages the flow of the automated X-ray experiments: sample exchanges, loop-centering and X-ray diffraction data collection. The fully automated data collection function has been available since February 2009. To identify sample cassettes, PAM employs a two-dimensional bar code reader. New beamlines, BL-1A at the Photon Factory and BL32XU at SPring-8, are currently under construction as part of the Targeted Proteins Research Program (TPRP) by the Ministry of Education, Culture, Sports, Science and Technology of Japan. However, different robots, PAM and SPACE (SPring-8 Precise Automatic Cryo-sample Exchanger), will be installed at BL-1A and BL32XU, respectively. For the convenience of the users of both facilities, pins and cassettes for PAM and SPACE are developed as part of the TPRP.

  13. A new fully automated FTIR system for total column measurements of greenhouse gases

    NASA Astrophysics Data System (ADS)

    Geibel, M. C.; Gerbig, C.; Feist, D. G.

    2010-10-01

    This article introduces a new fully automated FTIR system that is part of the Total Carbon Column Observing Network (TCCON). It will provide continuous ground-based measurements of the column-averaged volume mixing ratios of CO2, CH4 and several other greenhouse gases in the tropics. Housed in a 20-foot shipping container, it was developed as a transportable system that can be deployed almost anywhere in the world. We describe the automation concept, which relies on three autonomous subsystems and their interaction. Crucial components, such as a sturdy and reliable solar tracker dome, are described in detail. The automation software employs a new approach relying on multiple processes, database logging and web-based remote control. First results of total column measurements at Jena, Germany, show that the instrument works well and can capture parts of the diurnal as well as the seasonal cycle of CO2. Instrument line shape measurements with an HCl cell suggest that the instrument stays well aligned over several months. After a short campaign for a side-by-side intercomparison with an existing TCCON instrument in Australia, the system will be transported to its final destination, Ascension Island.

  14. Designs and Concept-Reliance of a Fully Automated High Content Screening Platform

    PubMed Central

    Radu, Constantin; Adrar, Hosna Sana; Alamir, Ab; Hatherley, Ian; Trinh, Trung; Djaballah, Hakim

    2013-01-01

    High content screening (HCS) is becoming an accepted platform in academic and industry screening labs, and it requires slightly different logistics for execution. We set out to automate our stand-alone HCS microscopes: an alpha IN Cell Analyzer 3000 (INCA3000), originally a Praelux unit hooked to a Hudson Plate Crane with a maximum capacity of 50 plates per run, and the IN Cell Analyzer 2000 (INCA2000), where up to 320 plates could be fed per run using the Thermo Fisher Scientific Orbitor. We opted for a 4-meter linear track system harboring both microscopes, a plate washer, bulk dispensers, and a high-capacity incubator, allowing us to perform both live and fixed cell-based assays while accessing both microscopes on deck. Design considerations included the integration of the alpha INCA3000, a new gripper concept to access the onboard nest, and peripheral locations on deck to ensure a self-reliant system capable of achieving higher throughput. The resulting system, referred to as Hestia, has been fully operational since the new year, has an onboard capacity of 504 plates, and harbors the only fully automated alpha INCA3000 unit in the world. PMID:22797489

  15. ProDeGe: A computational protocol for fully automated decontamination of genomes

    DOE PAGES

    Tennessen, Kristin; Andersen, Evan; Clingenpeel, Scott; ...

    2015-06-09

    Single amplified genomes and genomes assembled from metagenomes have enabled the exploration of uncultured microorganisms at an unprecedented scale. However, both these types of products are plagued by contamination. Since these genomes are now being generated in a high-throughput manner and sequences from them are propagating into public databases to drive novel scientific discoveries, rigorous quality controls and decontamination protocols are urgently needed. Here, we present ProDeGe (Protocol for fully automated Decontamination of Genomes), the first computational protocol for fully automated decontamination of draft genomes. ProDeGe classifies sequences into two classes—clean and contaminant—using a combination of homology and feature-based methodologies. On average, 84% of sequence from the non-target organism is removed from the data set (specificity) and 84% of the sequence from the target organism is retained (sensitivity). Lastly, the procedure operates successfully at a rate of ~0.30 CPU core hours per megabase of sequence and can be applied to any type of genome sequence.

  16. ProDeGe: A computational protocol for fully automated decontamination of genomes

    SciTech Connect

    Tennessen, Kristin; Andersen, Evan; Clingenpeel, Scott; Rinke, Christian; Lundberg, Derek S.; Han, James; Dangl, Jeff L.; Ivanova, Natalia; Woyke, Tanja; Kyrpides, Nikos; Pati, Amrita

    2015-06-09

    Single amplified genomes and genomes assembled from metagenomes have enabled the exploration of uncultured microorganisms at an unprecedented scale. However, both these types of products are plagued by contamination. Since these genomes are now being generated in a high-throughput manner and sequences from them are propagating into public databases to drive novel scientific discoveries, rigorous quality controls and decontamination protocols are urgently needed. Here, we present ProDeGe (Protocol for fully automated Decontamination of Genomes), the first computational protocol for fully automated decontamination of draft genomes. ProDeGe classifies sequences into two classes—clean and contaminant—using a combination of homology and feature-based methodologies. On average, 84% of sequence from the non-target organism is removed from the data set (specificity) and 84% of the sequence from the target organism is retained (sensitivity). Lastly, the procedure operates successfully at a rate of ~0.30 CPU core hours per megabase of sequence and can be applied to any type of genome sequence.
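The sensitivity and specificity figures reported above reduce to simple ratios over base-pair counts. A minimal sketch (the function and argument names are illustrative, not ProDeGe's actual code):

```python
def decontamination_metrics(target_bp_kept, target_bp_total,
                            contam_bp_removed, contam_bp_total):
    """Sensitivity: fraction of target-organism sequence retained.
    Specificity: fraction of non-target (contaminant) sequence removed.
    All arguments are base-pair counts from a decontamination run."""
    sensitivity = target_bp_kept / target_bp_total
    specificity = contam_bp_removed / contam_bp_total
    return sensitivity, specificity
```

For example, retaining 840 kb of a 1 Mb target genome while removing 84 kb of 100 kb of contaminant sequence reproduces the 84%/84% figures quoted in the abstract.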

  17. Fully automated segmentation of left ventricle using dual dynamic programming in cardiac cine MR images

    NASA Astrophysics Data System (ADS)

    Jiang, Luan; Ling, Shan; Li, Qiang

    2016-03-01

    Cardiovascular diseases are becoming a leading cause of death all over the world. Cardiac function can be evaluated by global and regional parameters of the left ventricle (LV) of the heart. The purpose of this study is to develop and evaluate a fully automated scheme for segmentation of the LV in short-axis cardiac cine MR images. Our fully automated method consists of three major steps: LV localization, LV segmentation at the end-diastolic phase, and propagation of the LV segmentation to the other phases. First, the maximum intensity projection image along the time phases of the midventricular slice, located at the center of the image, was calculated to locate the region of interest of the LV. Based on the mean intensity of the roughly segmented blood pool in the midventricular slice at each phase, the end-diastolic (ED) and end-systolic (ES) phases were determined. Second, the endocardial and epicardial boundaries of the LV in each slice at the ED phase were synchronously delineated by use of a dual dynamic programming technique. The external costs of the endocardial and epicardial boundaries were defined with the gradient values obtained from the original and enhanced images, respectively. Finally, taking advantage of the continuity of the LV boundaries across adjacent phases, we propagated the LV segmentation from the ED phase to the other phases by use of the dual dynamic programming technique. Preliminary results on 9 clinical cardiac cine MR cases show that the proposed method can obtain accurate segmentation of the LV based on subjective evaluation.
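The localization step above, a maximum intensity projection (MIP) along the time phases followed by a centrally located region of interest, can be sketched with NumPy. This is a generic illustration under assumed array shapes, not the authors' code:

```python
import numpy as np

def mip_over_phases(cine):
    """Maximum intensity projection along the time axis of one cine MR
    slice, shape (T, H, W) -> (H, W). Blood in the LV cavity stays
    bright across phases, so the cavity stands out in the projection."""
    return cine.max(axis=0)

def central_roi(image, size):
    """Crop a size x size region of interest around the image center,
    where the midventricular LV is expected to lie."""
    h, w = image.shape
    top, left = (h - size) // 2, (w - size) // 2
    return image[top:top + size, left:left + size]
```

The segmentation itself (dual dynamic programming over boundary costs) is a separate, more involved step and is not sketched here.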

  18. MAGNETIC RESONANCE IMAGING COMPATIBLE ROBOTIC SYSTEM FOR FULLY AUTOMATED BRACHYTHERAPY SEED PLACEMENT

    PubMed Central

    Muntener, Michael; Patriciu, Alexandru; Petrisor, Doru; Mazilu, Dumitru; Bagga, Herman; Kavoussi, Louis; Cleary, Kevin; Stoianovici, Dan

    2011-01-01

    Objectives To introduce the development of the first magnetic resonance imaging (MRI)-compatible robotic system capable of automated brachytherapy seed placement. Methods An MRI-compatible robotic system was conceptualized and manufactured. The entire robot was built of nonmagnetic and dielectric materials. The key technology of the system is a unique pneumatic motor that was specifically developed for this application. Various preclinical experiments were performed to test the robot for precision and imager compatibility. Results The robot was fully operational within all closed-bore MRI scanners. Compatibility tests in scanners of up to 7 Tesla field intensity showed no interference of the robot with the imager. Precision tests in tissue mockups yielded a mean seed placement error of 0.72 ± 0.36 mm. Conclusions The robotic system is fully MRI compatible. The new technology allows for automated and highly accurate operation within MRI scanners and does not deteriorate the MRI quality. We believe that this robot may become a useful instrument for image-guided prostate interventions. PMID:17169653

  19. A modifiable microarray-based universal sensor: providing sample-to-results automation.

    PubMed

    Yasmin, Rubina; Zhu, Hui; Chen, Zongyuan; Montagna, Richard A

    2016-10-01

    A microfluidic system consisting of generic single-use cartridges which interface with a workstation allows the automatic performance of all necessary sample preparation, PCR analysis and interpretation of multiplex PCR assays. The cartridges contain a DNA array with 20 different 16mer DNA "universal" probes immobilized at defined locations. PCR amplicons can be detected via hybridization of user-defined "reporter" probes that are complementary at their 3' termini to one or more of the universal probes and complementary to the target amplicons at their 5' termini. The system was able to detect single-plex and multiplex PCR amplicons from various infectious agents as well as wild-type and mutant alleles of single nucleotide polymorphisms. The system's ease of use was further demonstrated by converting a published PCR assay for the detection of Mycoplasma genitalium to the fully automated format. Excellent correlation between traditional manual methods and the automated analysis performed by the workstation suggests that the system can provide a means to easily design and implement a variety of customized PCR-based assays. The system will be useful to researchers or clinical investigators seeking to develop their own user-defined assays. As the U.S. FDA continues to pursue regulatory oversight of laboratory-developed tests (LDTs), the system would also allow labs to continue to develop compliant assays.

  20. A fully automated plasma protein precipitation sample preparation method for LC-MS/MS bioanalysis.

    PubMed

    Ma, Ji; Shi, Jianxia; Le, Hoa; Cho, Robert; Huang, Judy Chi-jou; Miao, Shichang; Wong, Bradley K

    2008-02-01

    This report describes the development and validation of a robust robotic system that fully integrates all peripheral devices needed for the automated preparation of plasma samples by protein precipitation. The liquid handling system consisted of a Tecan Freedom EVO 200 liquid handling platform equipped with an 8-channel liquid handling arm, two robotic plate-handling arms, and two plate shakers. Important additional components integrated into the platform were a robotic temperature-controlled centrifuge, a plate sealer, and a plate seal piercing station. These enabled unattended operation starting from a stock solution of the test compound, a set of test plasma samples and associated reagents. The stock solution of the test compound was used to prepare plasma calibration and quality control samples. Once calibration and quality control samples were prepared, precipitation of plasma proteins was achieved by addition of three volumes of acetonitrile. Integration of the peripheral devices allowed automated sequential completion of the centrifugation, plate sealing, piercing and supernatant transferral steps. The method produced a sealed, injection-ready 96-well plate of plasma extracts. Accuracy and precision of the automated system were satisfactory for the intended use: intra-day and the inter-day precision were excellent (C.V.<5%), while the intra-day and inter-day accuracies were acceptable (relative error<8%). The flexibility of the platform was sufficient to accommodate pharmacokinetic studies of different numbers of animals and time points. To the best of our knowledge, this represents the first complete automation of the protein precipitation method for plasma sample analysis.
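The validation criteria quoted above (precision as coefficient of variation, accuracy as relative error) are straightforward to compute from replicate measurements. A minimal sketch with the standard definitions (function names are ours):

```python
from statistics import mean, stdev

def precision_cv_percent(replicates):
    """Coefficient of variation (%) of replicate measurements --
    the precision criterion (C.V. < 5% in the study)."""
    return 100.0 * stdev(replicates) / mean(replicates)

def accuracy_re_percent(replicates, nominal):
    """Relative error (%) of the mean versus the nominal concentration --
    the accuracy criterion (relative error < 8% in the study)."""
    return 100.0 * (mean(replicates) - nominal) / nominal
```

For example, replicates of 9.8, 10.0 and 10.2 against a nominal concentration of 10.0 give a CV of 2% and a relative error of 0%, well within the stated acceptance limits.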

  1. A fully automated IIF system for the detection of antinuclear antibodies and antineutrophil cytoplasmic antibodies.

    PubMed

    Shovman, O; Agmon-Levin, N; Gilburd, B; Martins, T; Petzold, A; Matthias, T; Shoenfeld, Y

    2015-02-01

    Indirect immunofluorescence (IIF) is the main technique for the detection of antinuclear antibodies (ANA) and antineutrophil cytoplasmic antibodies (ANCA). The fully automated IIF processor HELIOS® is the first IIF processor able both to prepare slides and to perform reading automatically. The objective of the present study was to determine the diagnostic performance of this system for ANA and ANCA IIF interpretation, in comparison with visual IIF. ANA detection by visual IIF or HELIOS® was performed on 425 serum samples, including 218 consecutive samples submitted to a reference laboratory for routine ANA testing, 137 samples from healthy subjects and 70 ANA/ENA-positive samples. For ANCA determination, 170 serum samples were collected: 40 samples for routine testing, 90 samples from healthy blood donors and 40 anti-PR3/anti-MPO-positive subjects. Good correlation was found between the visual and automated ANA IIF approaches regarding positive/negative discrimination of these samples (kappa = 0.633 for ANA-positive and kappa = 0.657 for ANA-negative samples). Positive/negative IIF ANCA discrimination by HELIOS® and visual IIF showed complete agreement (100%) in sera from healthy donors and PR3/MPO-positive samples (kappa = 1.00). There was 95% agreement between automated and visual ANCA IIF on the investigation of routine samples. Based on these results, HELIOS® demonstrated a high diagnostic performance for automated ANA and ANCA IIF interpretation that was similar to visual reading in all groups of samples.
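The kappa statistics quoted above measure chance-corrected agreement between the automated and visual positive/negative calls. Cohen's kappa for a 2x2 contingency table can be computed as follows (a generic sketch with illustrative counts, not the study's data):

```python
def cohens_kappa(tp, fp, fn, tn):
    """Cohen's kappa for agreement between two binary raters
    (e.g. automated vs. visual IIF positive/negative calls),
    computed from the 2x2 contingency table counts."""
    n = tp + fp + fn + tn
    p_observed = (tp + tn) / n
    # expected chance agreement from each rater's marginal frequencies
    p_chance = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
    return (p_observed - p_chance) / (1 - p_chance)
```

With 40 concordant positives, 40 concordant negatives and 10 discordant calls in each direction out of 100 samples, observed agreement is 0.80, chance agreement is 0.50, and kappa is 0.60.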

  2. Development and evaluation of fully automated demand response in large facilities

    SciTech Connect

    Piette, Mary Ann; Sezgen, Osman; Watson, David S.; Motegi, Naoya; Shockman, Christine; ten Hope, Laurie

    2004-03-30

    This report describes the results of a research project to develop and evaluate the performance of new Automated Demand Response (Auto-DR) hardware and software technology in large facilities. Demand Response (DR) is a set of activities to reduce or shift electricity use to improve electric grid reliability, manage electricity costs, and ensure that customers receive signals that encourage load reduction during times when the electric grid is near its capacity. The two main drivers for widespread demand responsiveness are the prevention of future electricity crises and the reduction of electricity prices. Additional goals for price responsiveness include equity through cost-of-service pricing, and customer control of electricity usage and bills. The technology developed and evaluated in this report could be used to support numerous forms of DR programs and tariffs. For the purpose of this report, we have defined three levels of Demand Response automation. Manual Demand Response involves manually turning off lights or equipment; this can be a labor-intensive approach. Semi-Automated Response involves the use of building energy management control systems for load shedding, where a preprogrammed load shedding strategy is initiated by facilities staff. Fully-Automated Demand Response is initiated at a building or facility through receipt of an external communications signal: facility staff set up a pre-programmed load shedding strategy which is automatically initiated by the system without the need for human intervention. We have defined this approach to be Auto-DR. An important concept in Auto-DR is that a facility manager is able to "opt out" or "override" an individual DR event if it occurs at a time when the reduction in end-use services is not desirable. This project sought to improve the feasibility and nature of Auto-DR strategies in large facilities. The research focused on technology development, testing, characterization, and evaluation relating to Auto-DR.
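The fully automated level described above, an external signal triggering a pre-programmed shed strategy unless the facility manager has opted out, can be sketched in a few lines. This is our own minimal illustration of the concept; class and method names are invented, not from the project:

```python
from dataclasses import dataclass, field

@dataclass
class AutoDRController:
    """Minimal sketch of Auto-DR event handling: an external DR signal
    initiates a pre-programmed load-shedding strategy without human
    intervention, unless the facility manager has opted out of the event."""
    shed_strategy: list = field(default_factory=list)  # e.g. ["dim lights"]
    opted_out: bool = False
    actions_taken: list = field(default_factory=list)

    def receive_dr_signal(self):
        """Handle an incoming DR event signal."""
        if self.opted_out:
            # manager override: skip this event, shed nothing
            return []
        # fully automated path: execute the pre-programmed strategy
        self.actions_taken = list(self.shed_strategy)
        return self.actions_taken
```

The opt-out flag is the key design point: automation handles the common case, while the manager retains per-event control when a load reduction would be disruptive.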

  3. AutoDrug: fully automated macromolecular crystallography workflows for fragment-based drug discovery

    SciTech Connect

    Tsai, Yingssu; McPhillips, Scott E.; González, Ana; McPhillips, Timothy M.; Zinn, Daniel; Cohen, Aina E.; Feese, Michael D.; Bushnell, David; Tiefenbrunn, Theresa; Stout, C. David; Ludaescher, Bertram; Hedman, Britt; Hodgson, Keith O.; Soltis, S. Michael

    2013-05-01

    New software has been developed for automating the experimental and data-processing stages of fragment-based drug discovery at a macromolecular crystallography beamline. A new workflow-automation framework orchestrates beamline-control and data-analysis software while organizing results from multiple samples. AutoDrug is software based upon the scientific workflow paradigm that integrates the Stanford Synchrotron Radiation Lightsource macromolecular crystallography beamlines and third-party processing software to automate the crystallography steps of the fragment-based drug-discovery process. AutoDrug screens a cassette of fragment-soaked crystals, selects crystals for data collection based on screening results and user-specified criteria and determines optimal data-collection strategies. It then collects and processes diffraction data, performs molecular replacement using provided models and detects electron density that is likely to arise from bound fragments. All processes are fully automated, i.e. are performed without user interaction or supervision. Samples can be screened in groups corresponding to particular proteins, crystal forms and/or soaking conditions. A single AutoDrug run is only limited by the capacity of the sample-storage dewar at the beamline: currently 288 samples. AutoDrug was developed in conjunction with RestFlow, a new scientific workflow-automation framework. RestFlow simplifies the design of AutoDrug by managing the flow of data and the organization of results and by orchestrating the execution of computational pipeline steps. It also simplifies the execution and interaction of third-party programs and the beamline-control system. Modeling AutoDrug as a scientific workflow enables multiple variants that meet the requirements of different user groups to be developed and supported. A workflow tailored to mimic the crystallography stages comprising the drug-discovery pipeline of CoCrystal Discovery Inc. has been deployed and successfully demonstrated.

  4. SURVIS: a fully-automated aerial baiting system for the distribution of vaccine baits for wildlife.

    PubMed

    Müller, Thomas; Freuling, Conrad M; Gschwendner, Peter; Holzhofer, Ernst; Mürke, Heinz; Rüdiger, Heiko; Schuster, Peter; Klöss, Detlef; Staubach, Christoph; Teske, Kathrin; Vos, Adriaan

    2012-01-01

    Large-scale oral vaccination of wildlife against rabies using aerial bait distribution has been successfully used to control terrestrial wildlife rabies in Europe and North America. A technical milestone for large-scale oral rabies vaccination campaigns in Europe was the development of fully automated, computer-supported and cost-efficient technology for the aerial distribution of baits, such as the SURVIS system. Each bait released is recorded by the control unit through a sensor, together with the exact location, time and date of release, and the collected data can subsequently be evaluated, e.g. in GIS programmes. Thus, bait delivery systems like SURVIS are an important management tool for flight services and the responsible authorities for the optimization and evaluation of oral vaccination campaigns of wildlife against rabies or the control of other relevant wildlife diseases targeted by oral baits.

  5. Evaluation and comparison of two fully automated radioassay systems with distinctly different modes of analysis

    SciTech Connect

    Chen, I.W.; Maxon, H.R.; Heminger, L.A.; Ellis, K.S.; Volle, C.P.

    1980-12-01

    Two fully automated radioimmunoassay systems with batch and sequential modes of analysis were used to assay serum thyroxine, triiodothyronine, and digoxin. The results obtained were compared with those obtained by manual methods. The batch system uses antibody-coated tubes while the sequential system uses immobilized-antibody chambers for the separation of bound from free ligands. In accuracy, both systems compared favorably with the established manual methods, but the sequential system showed better precision than the batch system. There was a statistically significant carryover of thyroxine in the sequential system when there were at least six-fold differences in the concentrations of thyroxine in adjacent samples, but the carryover was not significant in the batch system. Compared with the batch system, the sequential system has a shorter turnaround time for individual samples (time from aspiration of the sample to the printout of results) but a longer interval for the final overall printout of assay results (lower throughput).

  6. A fully automated in vitro diagnostic system based on magnetic tunnel junction arrays and superparamagnetic particles

    NASA Astrophysics Data System (ADS)

    Lian, Jie; Chen, Si; Qiu, Yuqin; Zhang, Suohui; Shi, Stone; Gao, Yunhua

    2012-04-01

    A fully automated in vitro diagnostic (IVD) system for diagnosing acute myocardial infarction (AMI) was developed using a high-sensitivity MTJ array as sensors and nano-magnetic particles as tags. On the chip is an array of 12 × 10⁶ MTJ devices integrated onto a three-metal-layer CMOS circuit. The array is divided into 48 detection areas, so 48 different types of bio targets can be analyzed simultaneously if needed. The chip is assembled with a microfluidic cartridge which contains all the reagents necessary for completing the assaying process. Integrated with electrical, mechanical and microfluidic pumping devices and with the reaction protocol programmed in a microprocessor, the system requires only a simple one-step analyte application procedure to operate and yields results for the three major AMI biomarkers (cTnI, MYO, CK-MB) in 15 min.

  7. Production implementation of fully automated, closed loop cure control for advanced composite structures

    NASA Astrophysics Data System (ADS)

    Johnson, Sean A.; Roberts, Nancy K.

    Economical production of advanced composite parts requires the development and use of the most aggressive cure cycles possible without sacrificing quality. As cure cycles are shortened and heating rates increase, tolerance windows for process parameters become increasingly narrow. These factors are intensified by condensation curing systems, which generate large amounts of volatiles. Managing the situation requires fully automated, closed loop process control and a fundamental understanding of the material system used for the application. No turnkey system for this application is currently available. General Dynamics Pomona Division (GD/PD) has developed an integrated closed loop control system which is now being proofed in production. Realization of this system will enable cure time reductions of nearly 50 percent, while increasing yield and maintaining quality.

  8. A fully automated system for quantification of background parenchymal enhancement in breast DCE-MRI

    NASA Astrophysics Data System (ADS)

    Ufuk Dalmiş, Mehmet; Gubern-Mérida, Albert; Borelli, Cristina; Vreemann, Suzan; Mann, Ritse M.; Karssemeijer, Nico

    2016-03-01

    Background parenchymal enhancement (BPE) observed in breast dynamic contrast enhanced magnetic resonance imaging (DCE-MRI) has been identified as an important biomarker associated with risk for developing breast cancer. In this study, we present a fully automated framework for quantification of BPE. We initially segmented fibroglandular tissue (FGT) of the breasts using an improved version of an existing method. Subsequently, we computed BPEabs (volume of the enhancing tissue), BPErf (BPEabs divided by FGT volume) and BPErb (BPEabs divided by breast volume), using different relative enhancement threshold values between 1% and 100%. To evaluate and compare the previous and improved FGT segmentation methods, we used 20 breast DCE-MRI scans and we computed Dice similarity coefficient (DSC) values with respect to manual segmentations. For evaluation of the BPE quantification, we used a dataset of 95 breast DCE-MRI scans. Two radiologists, in individual reading sessions, visually analyzed the dataset and categorized each breast into minimal, mild, moderate and marked BPE. To measure the correlation between automated BPE values to the radiologists' assessments, we converted these values into ordinal categories and we used Spearman's rho as a measure of correlation. According to our results, the new segmentation method obtained an average DSC of 0.81 ± 0.09, which was significantly higher (p<0.001) compared to the previous method (0.76 ± 0.10). The highest correlation values between automated BPE categories and radiologists' assessments were obtained with the BPErf measurement (r=0.55, r=0.49, p<0.001 for both), while the correlation between the scores given by the two radiologists was 0.82 (p<0.001). The presented framework can be used to systematically investigate the correlation between BPE and risk in large screening cohorts.
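The three BPE measures defined above (BPEabs, BPErf and BPErb) follow directly from voxel counts once the FGT and breast masks and a relative enhancement threshold are given. A NumPy sketch under assumed inputs (the masks and enhancement map come from earlier pipeline steps; this is not the authors' code):

```python
import numpy as np

def bpe_measures(rel_enh, fgt_mask, breast_mask, threshold=0.1, voxel_volume=1.0):
    """BPE quantification from a relative enhancement map.

    rel_enh:     per-voxel relative enhancement, e.g. (post - pre) / pre
    fgt_mask:    boolean mask of segmented fibroglandular tissue
    breast_mask: boolean mask of the whole breast
    threshold:   relative enhancement cutoff (1%..100% in the study)
    """
    enhancing = (rel_enh >= threshold) & fgt_mask
    bpe_abs = float(enhancing.sum()) * voxel_volume            # enhancing volume
    bpe_rf = bpe_abs / (float(fgt_mask.sum()) * voxel_volume)  # relative to FGT
    bpe_rb = bpe_abs / (float(breast_mask.sum()) * voxel_volume)  # relative to breast
    return bpe_abs, bpe_rf, bpe_rb
```

Restricting the enhancing voxels to the FGT mask is our reading of the pipeline (FGT segmentation precedes BPE computation); sweeping `threshold` reproduces the 1%-100% analysis described above.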

  9. AutoDrug: fully automated macromolecular crystallography workflows for fragment-based drug discovery

    PubMed Central

    Tsai, Yingssu; McPhillips, Scott E.; González, Ana; McPhillips, Timothy M.; Zinn, Daniel; Cohen, Aina E.; Feese, Michael D.; Bushnell, David; Tiefenbrunn, Theresa; Stout, C. David; Ludaescher, Bertram; Hedman, Britt; Hodgson, Keith O.; Soltis, S. Michael

    2013-01-01

    AutoDrug is software based upon the scientific workflow paradigm that integrates the Stanford Synchrotron Radiation Lightsource macromolecular crystallography beamlines and third-party processing software to automate the crystallography steps of the fragment-based drug-discovery process. AutoDrug screens a cassette of fragment-soaked crystals, selects crystals for data collection based on screening results and user-specified criteria and determines optimal data-collection strategies. It then collects and processes diffraction data, performs molecular replacement using provided models and detects electron density that is likely to arise from bound fragments. All processes are fully automated, i.e. are performed without user interaction or supervision. Samples can be screened in groups corresponding to particular proteins, crystal forms and/or soaking conditions. A single AutoDrug run is only limited by the capacity of the sample-storage dewar at the beamline: currently 288 samples. AutoDrug was developed in conjunction with RestFlow, a new scientific workflow-automation framework. RestFlow simplifies the design of AutoDrug by managing the flow of data and the organization of results and by orchestrating the execution of computational pipeline steps. It also simplifies the execution and interaction of third-party programs and the beamline-control system. Modeling AutoDrug as a scientific workflow enables multiple variants that meet the requirements of different user groups to be developed and supported. A workflow tailored to mimic the crystallography stages comprising the drug-discovery pipeline of CoCrystal Discovery Inc. has been deployed and successfully demonstrated. This workflow was run once on the same 96 samples that the group had examined manually and the workflow cycled successfully through all of the samples, collected data from the same samples that were selected manually and located the same peaks of unmodeled density in the resulting difference maps.

  10. Significance of fully automated tests for the diagnosis of antiphospholipid syndrome.

    PubMed

    Oku, Kenji; Amengual, Olga; Kato, Masaru; Bohgaki, Toshiyuki; Horita, Tetsuya; Yasuda, Shinsuke; Sakamoto, Naoya; Ieko, Masahiro; Norman, Gary L; Atsumi, Tatsuya

    2016-10-01

    Antiphospholipid antibodies (aPLs) can vary both immunologically and functionally, so it is important to effectively and correctly identify their presence when diagnosing antiphospholipid syndrome (APS). Furthermore, since many immunological/functional tests are necessary to measure aPLs, complete examinations are often not performed due to the significant burden on testing departments. To address this issue, we measured the aPLs defined in the classification criteria (anticardiolipin antibody, aCL, IgG/IgM, and anti-β2 glycoprotein I antibody, aβ2GPI, IgG/IgM) as well as non-criteria antibodies (aCL IgA, aβ2GPI IgA and aβ2GPI domain I) in a cohort of 211 patients (61 APS, 140 disease controls and 10 healthy individuals). aPLs were measured using a fully automated chemiluminescent immunoassay instrument (BIO-FLASH®/ACL AcuStar®) and with conventional ELISA tests. We demonstrated that both the sensitivity and the diagnostic accuracy of aCL IgG and aβ2GPI IgG were high, in agreement with past reports. When multiple aPLs were examined, the accuracy of diagnosis increased. The proportion of APS patients positive for 2 or more types of aPLs (47/61, 77%) was higher than that of patients with systemic lupus erythematosus (SLE) (3/37, 9%), patients with non-SLE connective tissue diseases (1/53, 2%), and patients with other diseases or healthy volunteers. Based on these findings, it was concluded that the fully automated chemiluminescent immunoassay instrument, which allows the simultaneous evaluation of many types of aPLs, offers a more complete, more rapid and less labor-intensive alternative to running multiple ELISAs and could help improve diagnosis for suspected APS patients.

  11. Fully automated detection of the counting area in blood smears for computer aided hematology.

    PubMed

    Rupp, Stephan; Schlarb, Timo; Hasslmeyer, Erik; Zerfass, Thorsten

    2011-01-01

    For medical diagnosis, blood is an indispensable indicator for a wide variety of diseases, e.g. hemic, parasitic and sexually transmitted diseases. Robust detection and exact segmentation of white blood cells (leukocytes) in stained smears of peripheral blood provide the basis for a fully automated, image-based preparation of the so-called differential blood cell count in the context of medical laboratory diagnostics. In particular, localizing and segmenting the blood cells requires detecting the working area of the blood smear. In this contribution we present an approach for locating the so-called counting area on stained blood smears, that is, the region where cells are predominantly separated and do not interfere with each other. To this end, multiple images of a blood smear are taken and analyzed in order to select the image corresponding to this area. The analysis involves the computation of a unimodal function from image content that serves as an indicator for the corresponding image. This requires a prior segmentation of the cells, carried out by a binarization in the HSV color space. Finally, the indicator function is derived from the number of cells and the cells' surface area. Its unimodality guarantees a maximum value that corresponds to the image index of the counting area. This allows a fast lookup of the counting area, enabling a fully automated analysis of blood smears for medical diagnosis. For evaluation, the algorithm's performance on a number of blood smears was compared with ground-truth information defined by an expert hematologist.
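The selection step, maximizing a unimodal indicator computed from the per-image cell count and cell surface area, can be illustrated as follows. The exact indicator used by the authors is not given in the abstract; the score below is one plausible stand-in (an assumption): the cell count weighted by how closely the mean segmented-object area matches that of a single, well-separated cell, so sparse fields score low via the count and clumped fields score low because merged cells inflate the mean object area.

```python
import math

def select_counting_area(n_cells, mean_cell_area, expected_area):
    """Return the index of the image most likely showing the counting area,
    given per-image statistics from the HSV-binarized segmentation.

    n_cells:        number of segmented objects per image
    mean_cell_area: mean object area (pixels) per image
    expected_area:  typical area of a single, well-separated cell
                    (an assumed calibration constant)
    """
    scores = [n * math.exp(-((a - expected_area) / expected_area) ** 2)
              for n, a in zip(n_cells, mean_cell_area)]
    return max(range(len(scores)), key=scores.__getitem__)
```

Any indicator with this shape is unimodal along the smear, so a simple argmax suffices, which is what makes the lookup fast.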

  12. Toxicity assessment of ionic liquids with Vibrio fischeri: an alternative fully automated methodology.

    PubMed

    Costa, Susana P F; Pinto, Paula C A G; Lapa, Rui A S; Saraiva, M Lúcia M F S

    2015-03-02

    A fully automated Vibrio fischeri methodology based on sequential injection analysis (SIA) has been developed. The methodology was based on the aspiration of 75 μL of bacteria and 50 μL of inhibitor followed by measurement of the luminescence of the bacteria. The assays were conducted for contact times of 5, 15, and 30 min by means of three mixing chambers that ensured adequate mixing conditions. The optimized methodology provided precise control of the reaction conditions, which is an asset for the analysis of a large number of samples. The developed methodology was applied to the evaluation of the impact of a set of ionic liquids (ILs) on V. fischeri, and the results were compared with those provided by a conventional assay kit (Biotox®). The collected data evidenced the influence of different cation head groups and anion moieties on the toxicity of ILs. Generally, aromatic cations and fluorine-containing anions had a higher impact on V. fischeri, as evidenced by lower EC50 values. The proposed methodology was validated through statistical analysis, which demonstrated a strong positive correlation (P > 0.98) between assays. It is expected that the automated methodology can be tested on more classes of compounds and used as an alternative to microplate-based V. fischeri assay kits.

  13. Fully automated muscle quality assessment by Gabor filtering of second harmonic generation images.

    PubMed

    Paesen, Rik; Smolders, Sophie; Vega, José Manolo de Hoyos; Eijnde, Bert O; Hansen, Dominique; Ameloot, Marcel

    2016-02-01

    Although structural changes on the sarcomere level of skeletal muscle are known to occur due to various pathologies, rigorous studies of the reduced sarcomere quality remain scarce. This can possibly be explained by the lack of an objective tool for analyzing and comparing sarcomere images across biological conditions. Recent developments in second harmonic generation (SHG) microscopy and increasing insight into the interpretation of sarcomere SHG intensity profiles have made SHG microscopy a valuable tool to study microstructural properties of sarcomeres. Typically, sarcomere integrity is analyzed by fitting a set of manually selected, one-dimensional SHG intensity profiles with a supramolecular SHG model. To circumvent this tedious manual selection step, we developed a fully automated image analysis procedure to map the sarcomere disorder for the entire image at once. The algorithm relies on a single-frequency wavelet-based Gabor approach and includes a newly developed normalization procedure allowing for unambiguous data interpretation. The method was validated by showing the correlation between the sarcomere disorder, quantified by the M-band size obtained from manually selected profiles, and the normalized Gabor value ranging from 0 to 1 for decreasing disorder. Finally, to elucidate the applicability of our newly developed protocol, Gabor analysis was used to study the effect of experimental autoimmune encephalomyelitis on the sarcomere regularity. We believe that the technique developed in this work holds great promise for high-throughput, unbiased, and automated image analysis to study sarcomere integrity by SHG microscopy.
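
    As a 1-D illustration of the single-frequency analysis described above, the sketch below scores the regularity of a sarcomere intensity profile by the normalized magnitude of its Fourier component at the expected sarcomere frequency — a simplified analogue of a single-frequency Gabor response, not the paper's actual algorithm. The normalization yielding a 0-to-1 score is an assumption.

```python
import numpy as np

def gabor_regularity(profile, period_px):
    """Normalized single-frequency response of a 1-D intensity profile.

    Returns a score in [0, 1]: near 1 for a clean periodic (regular)
    profile, near 0 for structureless noise.
    """
    x = np.asarray(profile, dtype=float)
    x = x - x.mean()                      # remove DC so "no structure" maps to 0
    n = len(x)
    k = np.arange(n)
    # Complex exponential at the expected sarcomere frequency
    carrier = np.exp(-2j * np.pi * k / period_px)
    response = np.abs(np.dot(x, carrier)) * 2.0 / n
    # Amplitude of a sinusoid carrying the same total energy as the profile
    energy = np.sqrt(2.0 * np.mean(x ** 2))
    return 0.0 if energy == 0 else min(1.0, response / energy)
```

    A perfectly periodic profile with the assumed period scores 1.0, while disordered profiles score close to 0, mirroring the paper's "normalized Gabor value ranging from 0 to 1 for decreasing disorder".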

  14. DeepPicker: A deep learning approach for fully automated particle picking in cryo-EM.

    PubMed

    Wang, Feng; Gong, Huichao; Liu, Gaochao; Li, Meijing; Yan, Chuangye; Xia, Tian; Li, Xueming; Zeng, Jianyang

    2016-09-01

    Particle picking is a time-consuming step in single-particle analysis and often requires significant interventions from users, which has become a bottleneck for future automated electron cryo-microscopy (cryo-EM). Here we report a deep learning framework, called DeepPicker, to address this problem and fill the current gaps toward a fully automated cryo-EM pipeline. DeepPicker employs a novel cross-molecule training strategy to capture common features of particles from previously-analyzed micrographs, and thus does not require any human intervention during particle picking. Tests on the recently-published cryo-EM data of three complexes have demonstrated that our deep learning based scheme can successfully accomplish the human-level particle picking process and identify a sufficient number of particles that are comparable to those picked manually by human experts. These results indicate that DeepPicker can provide a practically useful tool to significantly reduce the time and manual effort spent in single-particle analysis and thus greatly facilitate high-resolution cryo-EM structure determination. DeepPicker is released as an open-source program, which can be downloaded from https://github.com/nejyeah/DeepPicker-python.

  15. Fully automated muscle quality assessment by Gabor filtering of second harmonic generation images

    NASA Astrophysics Data System (ADS)

    Paesen, Rik; Smolders, Sophie; Vega, José Manolo de Hoyos; Eijnde, Bert O.; Hansen, Dominique; Ameloot, Marcel

    2016-02-01

    Although structural changes on the sarcomere level of skeletal muscle are known to occur due to various pathologies, rigorous studies of the reduced sarcomere quality remain scarce. This can possibly be explained by the lack of an objective tool for analyzing and comparing sarcomere images across biological conditions. Recent developments in second harmonic generation (SHG) microscopy and increasing insight into the interpretation of sarcomere SHG intensity profiles have made SHG microscopy a valuable tool to study microstructural properties of sarcomeres. Typically, sarcomere integrity is analyzed by fitting a set of manually selected, one-dimensional SHG intensity profiles with a supramolecular SHG model. To circumvent this tedious manual selection step, we developed a fully automated image analysis procedure to map the sarcomere disorder for the entire image at once. The algorithm relies on a single-frequency wavelet-based Gabor approach and includes a newly developed normalization procedure allowing for unambiguous data interpretation. The method was validated by showing the correlation between the sarcomere disorder, quantified by the M-band size obtained from manually selected profiles, and the normalized Gabor value ranging from 0 to 1 for decreasing disorder. Finally, to elucidate the applicability of our newly developed protocol, Gabor analysis was used to study the effect of experimental autoimmune encephalomyelitis on the sarcomere regularity. We believe that the technique developed in this work holds great promise for high-throughput, unbiased, and automated image analysis to study sarcomere integrity by SHG microscopy.

  16. A fully automated FTIR system for remote sensing of greenhouse gases in the tropics

    NASA Astrophysics Data System (ADS)

    Geibel, M. C.; Gerbig, C.; Feist, D. G.

    2010-07-01

    This article introduces a new fully automated FTIR system that is part of the Total Carbon Column Observing Network. It will provide continuous ground-based measurements of the column-averaged volume mixing ratios of CO2, CH4 and several other greenhouse gases in the tropics. Housed in a 20-foot shipping container, it was developed as a transportable system that can be deployed almost anywhere in the world. We describe the automation concept, which relies on three autonomous subsystems and their interaction. Crucial components, such as a sturdy and reliable solar tracker dome, are described in detail. First results of total column measurements at Jena, Germany, show that the instrument works well and can resolve diurnal as well as seasonal cycles for CO2. Instrument line shape measurements with an HCl cell suggest that the instrument stays well-aligned over several months. After a short test campaign for a side-by-side intercomparison with an existing TCCON instrument in Australia, the system will be transported to its final destination, Ascension Island.

  17. Performance of automated scoring of ER, PR, HER2, CK5/6 and EGFR in breast cancer tissue microarrays in the Breast Cancer Association Consortium

    PubMed Central

    Howat, William J; Blows, Fiona M; Provenzano, Elena; Brook, Mark N; Morris, Lorna; Gazinska, Patrycja; Johnson, Nicola; McDuffus, Leigh‐Anne; Miller, Jodi; Sawyer, Elinor J; Pinder, Sarah; van Deurzen, Carolien H M; Jones, Louise; Sironen, Reijo; Visscher, Daniel; Caldas, Carlos; Daley, Frances; Coulson, Penny; Broeks, Annegien; Sanders, Joyce; Wesseling, Jelle; Nevanlinna, Heli; Fagerholm, Rainer; Blomqvist, Carl; Heikkilä, Päivi; Ali, H Raza; Dawson, Sarah‐Jane; Figueroa, Jonine; Lissowska, Jolanta; Brinton, Louise; Mannermaa, Arto; Kataja, Vesa; Kosma, Veli‐Matti; Cox, Angela; Brock, Ian W; Cross, Simon S; Reed, Malcolm W; Couch, Fergus J; Olson, Janet E; Devillee, Peter; Mesker, Wilma E; Seyaneve, Caroline M; Hollestelle, Antoinette; Benitez, Javier; Perez, Jose Ignacio Arias; Menéndez, Primitiva; Bolla, Manjeet K; Easton, Douglas F; Schmidt, Marjanka K; Pharoah, Paul D; Sherman, Mark E

    2014-01-01

    Abstract Breast cancer risk factors and clinical outcomes vary by tumour marker expression. However, individual studies often lack the power required to assess these relationships, and large‐scale analyses are limited by the need for high throughput, standardized scoring methods. To address these limitations, we assessed whether automated image analysis of immunohistochemically stained tissue microarrays can permit rapid, standardized scoring of tumour markers from multiple studies. Tissue microarray sections prepared in nine studies containing 20,263 cores from 8,267 breast cancers stained for two nuclear (oestrogen receptor, progesterone receptor), two membranous (human epidermal growth factor receptor 2 and epidermal growth factor receptor) and one cytoplasmic (cytokeratin 5/6) marker were scanned as digital images. Automated algorithms were used to score markers in tumour cells using the Ariol system. We compared automated scores against visual reads, and their associations with breast cancer survival. Approximately 65–70% of tissue microarray cores were satisfactory for scoring. Among satisfactory cores, agreement between dichotomous automated and visual scores was highest for oestrogen receptor (Kappa = 0.76), followed by human epidermal growth factor receptor 2 (Kappa = 0.69) and progesterone receptor (Kappa = 0.67). Automated quantitative scores for these markers were associated with hazard ratios for breast cancer mortality in a dose‐response manner. Considering visual scores of epidermal growth factor receptor or cytokeratin 5/6 as the reference, automated scoring achieved excellent negative predictive value (96–98%), but yielded many false positives (positive predictive value = 30–32%). For all markers, we observed substantial heterogeneity in automated scoring performance across tissue microarrays. Automated analysis is a potentially useful tool for large‐scale, quantitative scoring of immunohistochemically stained tissue microarrays.

  18. Towards fully automated structure-based function prediction in structural genomics: a case study.

    PubMed

    Watson, James D; Sanderson, Steve; Ezersky, Alexandra; Savchenko, Alexei; Edwards, Aled; Orengo, Christine; Joachimiak, Andrzej; Laskowski, Roman A; Thornton, Janet M

    2007-04-13

    As the global Structural Genomics projects have picked up pace, the number of structures annotated in the Protein Data Bank as hypothetical protein or unknown function has grown significantly. A major challenge now involves the development of computational methods to assign functions to these proteins accurately and automatically. As part of the Midwest Center for Structural Genomics (MCSG) we have developed a fully automated functional analysis server, ProFunc, which performs a battery of analyses on a submitted structure. The analyses combine a number of sequence-based and structure-based methods to identify functional clues. After the first stage of the Protein Structure Initiative (PSI), we review the success of the pipeline and the importance of structure-based function prediction. As a dataset, we have chosen all structures solved by the MCSG during the 5 years of the first PSI. Our analysis suggests that two of the structure-based methods are particularly successful and provide examples of local similarity that is difficult to identify using current sequence-based methods. No one method is successful in all cases, so, through the use of a number of complementary sequence and structural approaches, the ProFunc server increases the chances that at least one method will find a significant hit that can help elucidate function. Manual assessment of the results is a time-consuming process and subject to individual interpretation and human error. We present a method based on the Gene Ontology (GO) schema using GO-slims that can allow the automated assessment of hits with a success rate approaching that of expert manual assessment.

  19. A rapid method for manual or automated purification of fluorescently labeled nucleic acids for sequencing, genotyping, and microarrays.

    PubMed

    Springer, Amy L; Booth, Lisa R; Braid, Michael D; Houde, Christiane M; Hughes, Karin A; Kaiser, Robert J; Pedrak, Casandra; Spicer, Douglas A; Stolyar, Sergey

    2003-03-01

    Fluorescent dyes provide specific, sensitive, and multiplexed detection of nucleic acids. To maximize sensitivity, fluorescently labeled reaction products (e.g., cycle sequencing or primer extension products) must be purified away from residual dye-labeled precursors. Successful high-throughput analyses require that this purification be reliable, rapid, and amenable to automation. Common methods for purifying reaction products involve several steps and require processes that are not easily automated. Prolinx, Inc. has developed RapXtract superparamagnetic separation technology, affording rapid and easy-to-perform methods that yield high-quality products and are easily automated. The technology uses superparamagnetic particles that specifically remove unincorporated dye-labeled precursors. These particles are efficiently pelleted in the presence of a magnetic field, making them ideal for purification because of the rapid separations they allow. RapXtract-purified sequencing reactions yield data with good signal and high Phred quality scores, and they work with various sequencing dye chemistries, including BigDye and near-infrared fluorescence IRDyes. RapXtract technology can also be used to purify dye primer sequencing reactions, primer extension reactions for genotyping analysis, and nucleic acid labeling reactions for microarray hybridization. The ease of use and versatility of RapXtract technology make it a good choice for manual or automated purification of fluorescently labeled nucleic acids.

  20. High‐throughput automated scoring of Ki67 in breast cancer tissue microarrays from the Breast Cancer Association Consortium

    PubMed Central

    Howat, William J; Daley, Frances; Zabaglo, Lila; McDuffus, Leigh‐Anne; Blows, Fiona; Coulson, Penny; Raza Ali, H; Benitez, Javier; Milne, Roger; Brenner, Herman; Stegmaier, Christa; Mannermaa, Arto; Chang‐Claude, Jenny; Rudolph, Anja; Sinn, Peter; Couch, Fergus J; Tollenaar, Rob A.E.M.; Devilee, Peter; Figueroa, Jonine; Sherman, Mark E; Lissowska, Jolanta; Hewitt, Stephen; Eccles, Diana; Hooning, Maartje J; Hollestelle, Antoinette; WM Martens, John; HM van Deurzen, Carolien; Investigators, kConFab; Bolla, Manjeet K; Wang, Qin; Jones, Michael; Schoemaker, Minouk; Broeks, Annegien; van Leeuwen, Flora E; Van't Veer, Laura; Swerdlow, Anthony J; Orr, Nick; Dowsett, Mitch; Easton, Douglas; Schmidt, Marjanka K; Pharoah, Paul D; Garcia‐Closas, Montserrat

    2016-01-01

    Abstract Automated methods are needed to facilitate high‐throughput and reproducible scoring of Ki67 and other markers in breast cancer tissue microarrays (TMAs) in large‐scale studies. To address this need, we developed an automated protocol for Ki67 scoring and evaluated its performance in studies from the Breast Cancer Association Consortium. We utilized 166 TMAs containing 16,953 tumour cores representing 9,059 breast cancer cases, from 13 studies, with information on other clinical and pathological characteristics. TMAs were stained for Ki67 using standard immunohistochemical procedures, and scanned and digitized using the Ariol system. An automated algorithm was developed for the scoring of Ki67, and scores were compared to computer assisted visual (CAV) scores in a subset of 15 TMAs in a training set. We also assessed the correlation between automated Ki67 scores and other clinical and pathological characteristics. Overall, we observed good discriminatory accuracy (AUC = 85%) and good agreement (kappa = 0.64) between the automated and CAV scoring methods in the training set. The performance of the automated method varied by TMA (kappa range = 0.37–0.87) and study (kappa range = 0.39–0.69). The automated method performed better in satisfactory cores (kappa = 0.68) than suboptimal (kappa = 0.51) cores (p‐value for comparison = 0.005); and among cores with higher total nuclei counted by the machine (4,000–4,500 cells: kappa = 0.78) than those with lower counts (50–500 cells: kappa = 0.41; p‐value = 0.010). Among the 9,059 cases in this study, the correlations between automated Ki67 and clinical and pathological characteristics were found to be in the expected directions. Our findings indicate that automated scoring of Ki67 can be an efficient method to obtain good quality data across large numbers of TMAs from multicentre studies. However, robust algorithm development and rigorous pre‐ and post‐analytical quality control procedures are required.

  1. High-throughput automated scoring of Ki67 in breast cancer tissue microarrays from the Breast Cancer Association Consortium.

    PubMed

    Abubakar, Mustapha; Howat, William J; Daley, Frances; Zabaglo, Lila; McDuffus, Leigh-Anne; Blows, Fiona; Coulson, Penny; Raza Ali, H; Benitez, Javier; Milne, Roger; Brenner, Herman; Stegmaier, Christa; Mannermaa, Arto; Chang-Claude, Jenny; Rudolph, Anja; Sinn, Peter; Couch, Fergus J; Tollenaar, Rob A E M; Devilee, Peter; Figueroa, Jonine; Sherman, Mark E; Lissowska, Jolanta; Hewitt, Stephen; Eccles, Diana; Hooning, Maartje J; Hollestelle, Antoinette; Wm Martens, John; Hm van Deurzen, Carolien; Investigators, kConFab; Bolla, Manjeet K; Wang, Qin; Jones, Michael; Schoemaker, Minouk; Broeks, Annegien; van Leeuwen, Flora E; Van't Veer, Laura; Swerdlow, Anthony J; Orr, Nick; Dowsett, Mitch; Easton, Douglas; Schmidt, Marjanka K; Pharoah, Paul D; Garcia-Closas, Montserrat

    2016-07-01

    Automated methods are needed to facilitate high-throughput and reproducible scoring of Ki67 and other markers in breast cancer tissue microarrays (TMAs) in large-scale studies. To address this need, we developed an automated protocol for Ki67 scoring and evaluated its performance in studies from the Breast Cancer Association Consortium. We utilized 166 TMAs containing 16,953 tumour cores representing 9,059 breast cancer cases, from 13 studies, with information on other clinical and pathological characteristics. TMAs were stained for Ki67 using standard immunohistochemical procedures, and scanned and digitized using the Ariol system. An automated algorithm was developed for the scoring of Ki67, and scores were compared to computer assisted visual (CAV) scores in a subset of 15 TMAs in a training set. We also assessed the correlation between automated Ki67 scores and other clinical and pathological characteristics. Overall, we observed good discriminatory accuracy (AUC = 85%) and good agreement (kappa = 0.64) between the automated and CAV scoring methods in the training set. The performance of the automated method varied by TMA (kappa range = 0.37-0.87) and study (kappa range = 0.39-0.69). The automated method performed better in satisfactory cores (kappa = 0.68) than suboptimal (kappa = 0.51) cores (p-value for comparison = 0.005); and among cores with higher total nuclei counted by the machine (4,000-4,500 cells: kappa = 0.78) than those with lower counts (50-500 cells: kappa = 0.41; p-value = 0.010). Among the 9,059 cases in this study, the correlations between automated Ki67 and clinical and pathological characteristics were found to be in the expected directions. Our findings indicate that automated scoring of Ki67 can be an efficient method to obtain good quality data across large numbers of TMAs from multicentre studies. However, robust algorithm development and rigorous pre- and post-analytical quality control procedures are required.
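
    The agreement statistic reported throughout this abstract is Cohen's kappa, the chance-corrected agreement between two paired categorical ratings. A minimal reference implementation:

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa: (p_o - p_e) / (1 - p_e), where p_o is the observed
    agreement and p_e the agreement expected by chance from each rater's
    marginal label frequencies."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    labels = set(freq_a) | set(freq_b)
    p_e = sum(freq_a[lbl] * freq_b[lbl] for lbl in labels) / (n * n)
    return 1.0 if p_e == 1 else (p_o - p_e) / (1 - p_e)
```

    On dichotomous automated-versus-visual scores, a value such as the study's kappa = 0.64 indicates good agreement beyond what the raters' marginal positivity rates alone would produce.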

  2. A fully-automated analyzer for determining haloacetic acid concentrations in drinking water.

    PubMed

    Henson, Christina M; Emmert, Gary L; Simone, Paul S

    2014-12-01

    A fully-automated, on-line, real-time analyzer has been developed for the preconcentration and analysis of haloacetic acids (HAAs). Preconcentration of HAAs is achieved by sample acidification and solid phase extraction onto a hydrophobic polymeric resin using sequential injection analysis (SIA). The HAA preconcentrate is then analyzed using post-column reaction-ion chromatography (PCR-IC), which is selective for HAAs. Systematic optimization of the SIA preconcentration parameters is described, followed by detailed method detection limit (MDL), accuracy, precision, and linearity studies. MDL values for the individual HAA9 species range from 0.4 to 0.9 μg L(-1). Side-by-side comparison studies of HAA analysis in 14 real-world drinking water samples from Alabama, Arkansas, Kentucky, Minnesota, Missouri, Mississippi, New York, Pennsylvania and Tennessee are presented, comparing the optimized SIA-PCR-IC to USEPA Method 552.3. Trace levels of HAAs detected in select samples are reported, and the bias values calculated between the two methods are typically less than 5 μg L(-1) for eight of the nine individual HAAs.
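
    MDL values like those quoted above are conventionally computed by the standard USEPA procedure (40 CFR Part 136, Appendix B): the one-tailed 99% Student's t value for n−1 degrees of freedom times the standard deviation of n ≥ 7 low-level replicate spiked measurements. A minimal sketch (the function name is illustrative; the t values are the standard tabulated constants):

```python
from statistics import stdev

# One-tailed Student's t at 99% confidence, indexed by degrees of freedom
T_99 = {6: 3.143, 7: 2.998, 8: 2.896, 9: 2.821}

def method_detection_limit(replicates):
    """MDL per the USEPA procedure: t_(n-1, 0.99) times the standard
    deviation of n replicate low-level spiked measurements (7 <= n <= 10)."""
    n = len(replicates)
    if n - 1 not in T_99:
        raise ValueError("need 7-10 replicates")
    return T_99[n - 1] * stdev(replicates)
```

    For example, seven replicate spikes near 0.45 μg L(-1) with a standard deviation of about 0.03 μg L(-1) would give an MDL near 0.1 μg L(-1), in the same range as the per-species values reported above.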

  3. A fully automated tortuosity quantification system with application to corneal nerve fibres in confocal microscopy images.

    PubMed

    Annunziata, Roberto; Kheirkhah, Ahmad; Aggarwal, Shruti; Hamrah, Pedram; Trucco, Emanuele

    2016-08-01

    Recent clinical research has highlighted important links between a number of diseases and the tortuosity of curvilinear anatomical structures like corneal nerve fibres, suggesting that tortuosity changes might detect early stages of specific conditions. Currently, clinical studies are mainly based on subjective, visual assessment, with limited repeatability and inter-observer agreement. To address these problems, we propose a fully automated framework for image-level tortuosity estimation, consisting of a hybrid segmentation method and a highly adaptable, definition-free tortuosity estimation algorithm. The former combines an appearance model, based on a Scale and Curvature-Invariant Ridge Detector (SCIRD), with a context model, including multi-range learned context filters. The latter is based on a novel tortuosity estimation paradigm in which discriminative, multi-scale features can be automatically learned for specific anatomical objects and diseases. Experimental results on 140 in vivo confocal microscopy images of corneal nerve fibres from healthy and unhealthy subjects demonstrate the excellent performance of our method compared to state-of-the-art approaches and ground truth annotations from 3 expert observers.
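
    For context, the conventional hand-crafted tortuosity measure that definition-free, learned estimators such as the one above aim to move beyond is the arc-length to chord-length ratio of the traced fibre, sketched here as a baseline:

```python
import math

def arc_chord_tortuosity(points):
    """Classic arc-length / chord-length tortuosity of a traced polyline.

    Returns 1.0 for a straight segment and larger values for more
    tortuous curves. `points` is a sequence of (x, y) coordinates.
    """
    arc = sum(math.dist(p, q) for p, q in zip(points, points[1:]))
    chord = math.dist(points[0], points[-1])
    return arc / chord
```

    A single global ratio like this ignores the scale and frequency of the bends, which is precisely the limitation that multi-scale, learned tortuosity features address.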

  4. Fully automated deformable registration of breast DCE-MRI and PET/CT

    NASA Astrophysics Data System (ADS)

    Dmitriev, I. D.; Loo, C. E.; Vogel, W. V.; Pengel, K. E.; Gilhuijs, K. G. A.

    2013-02-01

    Accurate characterization of breast tumors is important for the appropriate selection of therapy and monitoring of the response. For this purpose, breast imaging and tissue biopsy are important aspects. In this study, a fully automated method for deformable registration of DCE-MRI and PET/CT of the breast is presented. The registration is performed using the CT component of the PET/CT and the pre-contrast T1-weighted non-fat-suppressed MRI. Comparable patient setup protocols were used during the MRI and PET examinations in order to avoid having to make assumptions about the biomechanical properties of the breast during and after the application of chemotherapy. The registration uses a multi-resolution approach to speed up the process and to minimize the probability of converging to local minima. The validation was performed on 140 breasts (70 patients). Of the total number of registration cases, 94.2% of the breasts were aligned to within 4.0 mm accuracy (1 PET voxel). Fused information may be beneficial for obtaining representative biopsy samples, which in turn will benefit the treatment of the patient.

  5. Methodology for fully automated segmentation and plaque characterization in intracoronary optical coherence tomography images.

    PubMed

    Athanasiou, Lambros S; Bourantas, Christos V; Rigas, George; Sakellarios, Antonis I; Exarchos, Themis P; Siogkas, Panagiotis K; Ricciardi, Andrea; Naka, Katerina K; Papafaklis, Michail I; Michalis, Lampros K; Prati, Francesco; Fotiadis, Dimitrios I

    2014-02-01

    Optical coherence tomography (OCT) is a light-based intracoronary imaging modality that provides high-resolution cross-sectional images of the luminal and plaque morphology. Currently, the segmentation of OCT images and identification of the composition of plaque are mainly performed manually by expert observers. However, this process is laborious and time consuming and its accuracy relies on the expertise of the observer. To address these limitations, we present a methodology that is able to process the OCT data in a fully automated fashion. The proposed methodology is able to detect the lumen borders in the OCT frames, identify the plaque region, and detect four tissue types: calcium (CA), lipid tissue (LT), fibrous tissue (FT), and mixed tissue (MT). The efficiency of the developed methodology was evaluated using annotations from 27 OCT pullbacks acquired from 22 patients. High Pearson's correlation coefficients were obtained between the output of the developed methodology and the manual annotations (from 0.96 to 0.99), while no significant bias with good limits of agreement was shown in the Bland-Altman analysis. The overlapping areas ratio between experts' annotations and methodology in detecting CA, LT, FT, and MT was 0.81, 0.71, 0.87, and 0.81, respectively.

  6. Fully automated objective-based method for master recession curve separation.

    PubMed

    Posavec, Kristijan; Parlov, Jelena; Nakić, Zoran

    2010-01-01

    The fully automated objective-based method for master recession curve (MRC) separation was developed using a Microsoft Excel spreadsheet and Visual Basic for Applications (VBA) code. The core of the program code constructs an MRC using the adapted matching strip method (Posavec et al. 2006). Criteria for separating the MRC into two or three segments are determined from the flow-duration curve and are represented as the probable range of percent of flow rate duration. Successive separations are performed automatically on two- and three-segment MRCs using sets of percent of flow rate duration from the selected ranges, and the separation model scenario with the highest average coefficient of determination R(2) is selected as the most appropriate one. The resulting separated master recession curves are presented graphically, whereas the statistics are presented numerically, all in separate sheets. Examples of field data obtained from two springs in Istria, Croatia, are used to illustrate its application. The freely available Excel spreadsheet and VBA program ensure ease of use and applicability to larger data sets.
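
    The scenario search can be illustrated with a simplified two-segment version: for each candidate breakpoint, fit each segment as an exponential recession (linear in log Q) and keep the breakpoint with the highest average R². This numpy sketch merely stands in for the paper's Excel/VBA implementation; the exponential segment model and function names are assumptions.

```python
import numpy as np

def r_squared(y, y_fit):
    """Coefficient of determination of a fit."""
    ss_res = np.sum((y - y_fit) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return 1.0 - ss_res / ss_tot

def best_two_segment_split(t, q, candidate_breaks):
    """Return (breakpoint index, mean R^2) for the candidate breakpoint
    whose two segments are best described by Q = Q0 * exp(-a * t),
    i.e. straight lines in log Q."""
    log_q = np.log(q)
    best = None
    for b in candidate_breaks:
        scores = []
        for seg in (slice(0, b + 1), slice(b, len(t))):
            coef = np.polyfit(t[seg], log_q[seg], 1)   # linear fit of log Q
            scores.append(r_squared(log_q[seg], np.polyval(coef, t[seg])))
        mean_r2 = float(np.mean(scores))
        if best is None or mean_r2 > best[1]:
            best = (b, mean_r2)
    return best
```

    On a synthetic recession with two exponential regimes, the search recovers the true change point, mirroring the paper's selection of the scenario with the highest average R(2).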

  7. Dynamics of G-band bright points derived using two fully automated algorithms

    NASA Astrophysics Data System (ADS)

    Bodnárová, M.; Utz, D.; Rybák, J.; Hanslmeier, A.

    Small-scale magnetic field concentrations (˜ 1 kG) in the solar photosphere can be identified in the G-band of the solar spectrum as bright points. Study of G-band bright point (GBP) dynamics can help answer several questions related to the coronal heating problem. Here, a set of 142 G-band speckle-reconstructed images obtained with the Dutch Open Telescope (DOT) on October 19, 2005 is used to compare the identification of GBPs by two different fully automated identification algorithms: one developed by Utz et al. (2009a, 2009b) and one developed following the papers of Berger et al. (1995, 1998). Temporal and spatial tracking of the GBPs identified by both algorithms was performed, resulting in distributions of GBP lifetimes, sizes and velocities. The results show that both algorithms give very similar values for GBP lifetimes and velocities, but differ significantly in the estimation of GBP sizes. This difference is caused by the fact that we applied no additional exclusion criteria to the GBPs identified by the algorithm based on the work of Berger et al. (1995, 1998). We therefore conclude that in future studies of GBP dynamics we will prefer Utz's algorithm for identifying and tracking GBPs in G-band images.

  8. Development and validation of a fully automated system for detection and diagnosis of mammographic lesions.

    PubMed

    Casti, Paola; Mencattini, Arianna; Salmeri, Marcello; Ancona, Antonietta; Mangieri, Fabio; Rangayyan, Rangaraj M

    2014-01-01

    We present a comprehensive and fully automated system for computer-aided detection and diagnosis of masses in mammograms. Novel methods for detection include: selection of suspicious focal areas based on analysis of the gradient vector field, rejection of oriented components of breast tissue using multidirectional Gabor filtering, and use of differential features for rejection of false positives (FPs) via clustering of the surrounding fibroglandular tissue. The diagnosis step is based on extraction of contour-independent features for characterization of lesions as benign or malignant from automatically detected circular and annular regions. A new unified 3D free-response receiver operating characteristic framework is introduced for global analysis of two binary categorization problems in cascade. In total, 3,080 suspicious focal areas were extracted from a set of 156 full-field digital mammograms, including 26 malignant tumors, 120 benign lesions, and 18 normal mammograms. The proposed system detected and diagnosed malignant tumors with a sensitivity of 0.96, 0.92, and 0.88 at, respectively, 1.83, 0.46, and 0.45 FPs/image, with two stages of stepwise logistic regression for selection of features, a cascade of Fisher linear discriminant analysis and an artificial neural network with radial basis functions, and leave-one-patient-out cross-validation.

  9. SNMSP II: A system to fully automate special nuclear materials accountability reporting for electric utilities

    SciTech Connect

    Pareto, V.; Venegas, R.

    1987-07-01

    The USNRC requires each licensee who is authorized to possess Special Nuclear Materials (SNM) to prepare and submit reports concerning SNM received, produced, possessed, transferred, consumed, disposed of, or lost. These SNM accountability reports, which need to be submitted twice a year, contain detailed information on the origin, quantity, and type of SNM for several locations. The amount of detail required makes these reports very time consuming and error prone when prepared manually. Yankee Atomic is developing an IBM PC-based computer code that fully automates the process of generating SNM accountability reports. The program, called SNMSP II, prints a number of summaries including facsimiles of the NRC/DOE-741, 742, 742C, and RW-859 reports in a format that can be submitted directly to the NRC/DOE. SNMSP II is menu-driven and is especially designed for people with little or no computer training. Input can be either from a mainframe-based corporate data base or manually through user-friendly screens. In addition, extensive quality assurance features are available to ensure the security and accuracy of the data. This paper discusses the major features of the code and describes its implementation at Yankee.

  10. Ex vivo encapsulation of dexamethasone sodium phosphate into human autologous erythrocytes using fully automated biomedical equipment.

    PubMed

    Mambrini, Giovanni; Mandolini, Marco; Rossi, Luigia; Pierigè, Francesca; Capogrossi, Giovanni; Salvati, Patricia; Serafini, Sonja; Benatti, Luca; Magnani, Mauro

    2017-01-30

Erythrocyte-based drug delivery systems are emerging as potential new solutions for the release of drugs into the bloodstream. The aim of the present work was to assess the performance of a fully automated process (EDS) for the ex vivo encapsulation of the pro-drug dexamethasone sodium phosphate (DSP) into autologous erythrocytes in compliance with regulatory requirements. The loading method was based on reversible hypotonic hemolysis, which opens transient pores in the cell membrane through which DSP can cross. The efficiency of encapsulation and the biochemical and physiological characteristics of the processed erythrocytes were investigated in blood samples from 34 healthy donors. It was found that the processed erythrocytes maintained their fundamental properties and the encapsulation process was reproducible. The EDS under study showed greater loading efficiency and reduced variability compared to previous EDS versions. Notably, these results were confirmed using blood samples from Ataxia Telangiectasia (AT) patients: 9.33±1.40 and 19.41±2.10 mg of DSP (mean±SD, n=134) were encapsulated using 62.5 and 125 mg DSP loading quantities, respectively. These results support the use of the new EDS version 3.2.0 to investigate the effect of erythrocyte-delivered dexamethasone in regulatory trials in patients with AT.

  11. Fully automated dialysis system based on the central dialysis fluid delivery system.

    PubMed

    Kawanishi, Hideki; Moriishi, Misaki; Sato, Takashi; Taoka, Masahiro

    2009-01-01

The fully automated dialysis system (FADS) was developed as an improvement over previous patient monitors used in hemodialysis, with the aim of standardizing treatment and saving labor. The system uses backfiltration dialysis fluid to perform priming, blood rinse-back and rapid fluid replenishment, and guides blood into the dialyzer via the drainage pump for ultrafiltration. This requires that the dialysis fluid used be purified to a high level. The central dialysis fluid delivery system (CDDS) combines the preparation and supply of dialysis water and dialysis fluid to achieve a level of purity equivalent to that of ultrapure dialysis fluid. FADS has the further advantages of greater efficiency and streamlined operation, reducing human error and the risk of infection without requiring the storage or disposal of normal saline solution. The simplification of hemodialysis allows for greater frequency of dialysis or extended dialysis, enabling treatment to be provided in line with the patient's particular situation. FADS thus markedly improves the reliability, safety and standardization of dialysis procedures while saving labor, making it of particular utility for institutions dealing with dialysis on a large scale.

  12. Improving GPR Surveys Productivity by Array Technology and Fully Automated Processing

    NASA Astrophysics Data System (ADS)

    Morello, Marco; Ercoli, Emanuele; Mazzucchelli, Paolo; Cottino, Edoardo

    2016-04-01

The realization of network infrastructures with lower environmental impact and the tendency to use digging technologies that are less invasive in terms of time and space of road occupation and restoration play a key role in the development of communication networks. However, pre-existing buried utilities must be detected and located in the subsurface to exploit the high productivity of modern digging apparatus. According to SUE quality level B+, both position and depth of subsurface utilities must be accurately estimated, demanding 3D GPR surveys. In fact, the advantages of 3D GPR acquisitions (obtained either by multiple 2D recordings or by an antenna array) over 2D acquisitions are well known. Nonetheless, the amount of data in such 3D acquisitions does not usually allow processing and interpretation to be completed directly in the field and in real time, thus limiting the overall efficiency of the GPR acquisition. As an example, the "low impact mini-trench" technique (addressed in ITU - International Telecommunication Union - L.83 recommendation) requires that non-destructive mapping of buried services enhance its productivity to match the improvements of new digging equipment. Nowadays multi-antenna and multi-pass GPR acquisitions demand new processing techniques that can obtain high-quality subsurface images, taking full advantage of 3D data: the development of a fully automated, real-time 3D GPR processing system plays a key role in overall optical network deployment profitability. Furthermore, currently available computing power suggests the feasibility of processing schemes that incorporate better focusing algorithms. A novel processing scheme, whose goal is the automated processing and detection of buried targets and which can be applied in real time to 3D GPR array systems, has been developed and fruitfully tested with two different GPR arrays (16 antennas, 900 MHz central frequency, and 34 antennas, 600 MHz central frequency).

  13. Development and implementation of industrialized, fully automated high throughput screening systems

    PubMed Central

    2003-01-01

    Automation has long been a resource for high-throughput screening at Bristol-Myers Squibb. However, with growing deck sizes and decreasing time lines, a new generation of more robust, supportable automated systems was necessary for accomplishing high-throughput screening goals. Implementation of this new generation of automated systems required numerous decisions concerning hardware, software and the value of in-house automation expertise. This project has resulted in fast, flexible, industrialized automation systems with a strong in-house support structure that we believe meets our current high-throughput screening requirements and will continue to meet them well into the future. PMID:18924614

  14. Fully automated prostate magnetic resonance imaging and transrectal ultrasound fusion via a probabilistic registration metric

    NASA Astrophysics Data System (ADS)

    Sparks, Rachel; Bloch, B. Nicholas; Feleppa, Ernest; Barratt, Dean; Madabhushi, Anant

    2013-03-01

In this work, we present a novel, automated registration method to fuse magnetic resonance imaging (MRI) and transrectal ultrasound (TRUS) images of the prostate. Our methodology consists of: (1) delineating the prostate on MRI, (2) building a probabilistic model of prostate location on TRUS, and (3) aligning the MRI prostate segmentation to the TRUS probabilistic model. TRUS-guided needle biopsy is the current gold standard for prostate cancer (CaP) diagnosis. Up to 40% of CaP lesions appear isoechoic on TRUS; hence, TRUS-guided biopsy cannot reliably target CaP lesions and is associated with a high false negative rate. MRI is better able to distinguish CaP from benign prostatic tissue, but requires special equipment and training. MRI-TRUS fusion, whereby MRI is acquired pre-operatively and aligned to TRUS during the biopsy procedure, allows for information from both modalities to be used to help guide the biopsy. The use of MRI and TRUS in combination to guide biopsy at least doubles the yield of positive biopsies. Previous work on MRI-TRUS fusion has involved aligning manually determined fiducials or prostate surfaces to achieve image registration. The accuracy of these methods is dependent on the reader's ability to determine fiducials or prostate surfaces with minimal error, which is a difficult and time-consuming task. Our novel, fully automated MRI-TRUS fusion method represents a significant advance over the current state of the art because it does not require manual intervention after TRUS acquisition. All necessary preprocessing steps (i.e. delineation of the prostate on MRI) can be performed offline prior to the biopsy procedure. We evaluated our method on seven patient studies, with B-mode TRUS and a 1.5 T surface coil MRI. Our method has a root mean square error (RMSE) for expertly selected fiducials (consisting of the urethra, calcifications, and the centroids of CaP nodules) of 3.39 ± 0.85 mm.

  15. Fully Automated Detection of Cloud and Aerosol Layers in the CALIPSO Lidar Measurements

    NASA Technical Reports Server (NTRS)

    Vaughan, Mark A.; Powell, Kathleen A.; Kuehn, Ralph E.; Young, Stuart A.; Winker, David M.; Hostetler, Chris A.; Hunt, William H.; Liu, Zhaoyan; McGill, Matthew J.; Getzewich, Brian J.

    2009-01-01

Accurate knowledge of the vertical and horizontal extent of clouds and aerosols in the earth's atmosphere is critical in assessing the planet's radiation budget and for advancing human understanding of climate change issues. To retrieve this fundamental information from the elastic backscatter lidar data acquired during the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) mission, a selective, iterated boundary location (SIBYL) algorithm has been developed and deployed. SIBYL accomplishes its goals by integrating an adaptive context-sensitive profile scanner into an iterated multiresolution spatial averaging scheme. This paper provides an in-depth overview of the architecture and performance of the SIBYL algorithm. It begins with a brief review of the theory of target detection in noise-contaminated signals, and an enumeration of the practical constraints levied on the retrieval scheme by the design of the lidar hardware, the geometry of a space-based remote sensing platform, and the spatial variability of the measurement targets. Detailed descriptions are then provided for both the adaptive threshold algorithm used to detect features of interest within individual lidar profiles and the fully automated multiresolution averaging engine within which this profile scanner functions. The resulting fusion of profile scanner and averaging engine is specifically designed to optimize the trade-offs between the widely varying signal-to-noise ratio of the measurements and the disparate spatial resolutions of the detection targets. Throughout the paper, specific algorithm performance details are illustrated using examples drawn from the existing CALIPSO dataset. Overall performance is established by comparisons to existing layer height distributions obtained by other airborne and space-based lidars.
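    The profile-scanning idea described in this abstract — flag contiguous range bins whose backscatter exceeds an adaptive threshold estimated from a presumed clear-air region — can be sketched as follows. This is a simplified illustration, not the SIBYL implementation; the function name, the k-sigma threshold, and the minimum run length are assumptions for the sketch:

```python
import numpy as np

def detect_layers(profile, clear_air, k=3.0, min_bins=3):
    """Flag contiguous altitude bins whose backscatter exceeds an
    adaptive threshold derived from a presumed clear-air region.

    profile   : 1-D array of attenuated backscatter values
    clear_air : slice of `profile` assumed to contain only noise
    k         : threshold in standard deviations above the clear-air mean
    min_bins  : minimum run length accepted as a feature
    Returns a list of (start, stop) bin index pairs (stop exclusive).
    """
    noise = profile[clear_air]
    threshold = noise.mean() + k * noise.std()
    above = profile > threshold

    layers = []
    start = None
    for i, flag in enumerate(above):
        if flag and start is None:
            start = i                      # run of supra-threshold bins begins
        elif not flag and start is not None:
            if i - start >= min_bins:      # keep only runs long enough to be features
                layers.append((start, i))
            start = None
    if start is not None and len(above) - start >= min_bins:
        layers.append((start, len(above)))
    return layers
```

    SIBYL additionally iterates this scan over several horizontal averaging resolutions to recover weak layers; the sketch shows only the single-profile adaptive-threshold step.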

  16. Fully automated whole-head segmentation with improved smoothness and continuity, with theory reviewed.

    PubMed

    Huang, Yu; Parra, Lucas C

    2015-01-01

Individualized current-flow models are needed for precise targeting of brain structures using transcranial electrical or magnetic stimulation (TES/TMS). The same is true for current-source reconstruction in electroencephalography and magnetoencephalography (EEG/MEG). The first step in generating such models is to obtain an accurate segmentation of individual head anatomy, including not only brain but also cerebrospinal fluid (CSF), skull and soft tissues, with a field of view (FOV) that covers the whole head. Currently available automated segmentation tools only provide results for brain tissues, have a limited FOV, and do not guarantee continuity and smoothness of tissues, which is crucially important for accurate current-flow estimates. Here we present a tool that addresses these needs. It is based on a rigorous Bayesian inference framework that combines an image intensity model, an anatomical prior (atlas) and morphological constraints using Markov random fields (MRF). The method is evaluated on 20 simulated and 8 real head volumes acquired with magnetic resonance imaging (MRI) at 1 mm³ resolution. We find improved surface smoothness and continuity as compared to the segmentation algorithms currently implemented in Statistical Parametric Mapping (SPM). With this tool, accurate and morphologically correct modeling of the whole-head anatomy for individual subjects may now be feasible on a routine basis. Code and data are fully integrated into the SPM software tool and are made publicly available. In addition, a review of MRI segmentation using atlases and MRFs over the last 20 years is provided, with the general mathematical framework clearly derived.

  17. Field version of the fully automated system for δ13C IRMS analysis of atmospheric methane

    NASA Astrophysics Data System (ADS)

    Röckmann, Thomas; van der Veen, Carin; Snellen, Henk; Wendeberg, Magnus; Brand, Willi

    2014-05-01

In order to measure CH4 carbon isotope ratios continuously at rural locations, we developed a robust, fully automated extraction system for field IRMS measurements. We based our system on the iSAAC design from the MPI-BGC, with its cold traps mounted on a cryocooler. Because this new extraction system makes no use of liquid nitrogen, it can be left running unattended for more than one week. Reference air from a cylinder (50 mL) and dried local air (50 mL) are measured alternately with the same pre-concentration trap and focus unit. Up to 60 measurements per day can be performed in this way. This will give a temporal resolution in CH4 isotope measurements that cannot be maintained for extended periods with flask samples. The CH4 (and other compounds) are frozen on the pre-concentration trap, while the air matrix is flushed out. Then the CH4 is transferred to the smaller focus trap, and released by controlled heating into the combustion oven. A post-combustion GC is used to separate the CO2(CH4) peak from krypton and other compounds. Under laboratory conditions we achieved well over 500 measurements without attending the system. The precision of the δ13C-CH4 measurements is better than 0.07‰, and the mole ratio is determined to within 10 ppb. The system is to be employed in a fieldwork comparison of several CH4 isotope analyzers, to be held in spring 2014 at the Cabauw tower, Netherlands, as part of the InGOS WP16: Innovation in isotope measurement techniques.

  18. Fully automated intrinsic respiratory and cardiac gating for small animal CT

    NASA Astrophysics Data System (ADS)

    Kuntz, J.; Dinkel, J.; Zwick, S.; Bäuerle, T.; Grasruck, M.; Kiessling, F.; Gupta, R.; Semmler, W.; Bartling, S. H.

    2010-04-01

    A fully automated, intrinsic gating algorithm for small animal cone-beam CT is described and evaluated. A parameter representing the organ motion, derived from the raw projection images, is used for both cardiac and respiratory gating. The proposed algorithm makes it possible to reconstruct motion-corrected still images as well as to generate four-dimensional (4D) datasets representing the cardiac and pulmonary anatomy of free-breathing animals without the use of electrocardiogram (ECG) or respiratory sensors. Variation analysis of projections from several rotations is used to place a region of interest (ROI) on the diaphragm. The ROI is cranially extended to include the heart. The centre of mass (COM) variation within this ROI, the filtered frequency response and the local maxima are used to derive a binary motion-gating parameter for phase-sensitive gated reconstruction. This algorithm was implemented on a flat-panel-based cone-beam CT scanner and evaluated using a moving phantom and animal scans (seven rats and eight mice). Volumes were determined using a semiautomatic segmentation. In all cases robust gating signals could be obtained. The maximum volume error in phantom studies was less than 6%. By utilizing extrinsic gating via externally placed cardiac and respiratory sensors, the functional parameters (e.g. cardiac ejection fraction) and image quality were equivalent to this current gold standard. This algorithm obviates the necessity of both gating hardware and user interaction. The simplicity of the proposed algorithm enables adoption in a wide range of small animal cone-beam CT scanners.
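    A minimal sketch of such intrinsic gating — deriving a motion surrogate from the centre-of-mass (COM) variation inside a projection-image ROI, detrending it, and gating on its quiet phase — might look like the following. The function names, the 15-projection smoothing window, and the 30% acceptance fraction are illustrative assumptions, not the published algorithm:

```python
import numpy as np

def gating_signal(projections, roi):
    """Derive a 1-D motion surrogate from a stack of projection images.

    projections : array of shape (n_proj, rows, cols)
    roi         : (row_slice, col_slice) covering the diaphragm/heart region

    Returns the vertical centre-of-mass of the ROI in each projection,
    detrended by subtracting a moving average so that slow trends from
    gantry rotation do not mask the periodic organ motion.
    """
    rs, cs = roi
    rows = np.arange(rs.start, rs.stop, dtype=float)
    com = np.empty(len(projections))
    for i, p in enumerate(projections):
        weights = p[rs, cs].astype(float).sum(axis=1)   # intensity per row
        com[i] = (rows * weights).sum() / weights.sum()
    kernel = np.ones(15)                                 # hypothetical window
    trend = (np.convolve(com, kernel, mode="same")
             / np.convolve(np.ones_like(com), kernel, mode="same"))
    return com - trend

def gate(signal, fraction=0.3):
    """Binary gate: accept projections whose surrogate lies in the
    central `fraction` of its distribution (e.g. a quiet plateau)."""
    lo = np.quantile(signal, 0.5 - fraction / 2)
    hi = np.quantile(signal, 0.5 + fraction / 2)
    return (signal >= lo) & (signal <= hi)
```

    Projections passing the gate would then be fed to a phase-sensitive reconstruction; the paper's algorithm additionally places the ROI automatically via variation analysis and uses the filtered frequency response to separate cardiac from respiratory motion.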

  19. Fully automated analytical procedure for propofol determination by sequential injection technique with spectrophotometric and fluorimetric detections.

    PubMed

    Šrámková, Ivana; Amorim, Célia G; Sklenářová, Hana; Montenegro, Maria C B M; Horstkotte, Burkhard; Araújo, Alberto N; Solich, Petr

    2014-01-01

In this work, an application of an enzymatic reaction for the determination of the highly hydrophobic drug propofol in emulsion dosage form is presented. Emulsions represent a complex and therefore challenging matrix for analysis. Ethanol was used to break the lipid emulsion, which enabled optical detection. A fully automated method based on Sequential Injection Analysis was developed, allowing propofol determination without the requirement of tedious sample pre-treatment. The method was based on spectrophotometric detection after enzymatic oxidation catalysed by horseradish peroxidase and subsequent coupling with 4-aminoantipyrine, leading to a coloured product with an absorbance maximum at 485 nm. This procedure was compared with a simple fluorimetric method based on the direct selective fluorescence emission of propofol in ethanol at 347 nm. Both methods provide comparable validation parameters, with linear working ranges of 0.005-0.100 mg mL⁻¹ and 0.004-0.243 mg mL⁻¹ for the spectrophotometric and fluorimetric methods, respectively. The detection and quantitation limits achieved with the spectrophotometric method were 0.0016 and 0.0053 mg mL⁻¹, respectively. The fluorimetric method provided a detection limit of 0.0013 mg mL⁻¹ and a limit of quantitation of 0.0043 mg mL⁻¹. The RSD did not exceed 5% and 2% (n=10), respectively. A sample throughput of approx. 14 h⁻¹ for the spectrophotometric and 68 h⁻¹ for the fluorimetric detection was achieved. Both methods proved to be suitable for the determination of propofol in a pharmaceutical formulation, with average recovery values of 98.1 and 98.5%.

  20. Accurate, fully-automated registration of coronary arteries for volumetric CT digital subtraction angiography

    NASA Astrophysics Data System (ADS)

    Razeto, Marco; Mohr, Brian; Arakita, Kazumasa; Schuijf, Joanne D.; Fuchs, Andreas; Kühl, J. Tobias; Chen, Marcus Y.; Kofoed, Klaus F.

    2014-03-01

Diagnosis of coronary artery disease with Coronary Computed Tomography Angiography (CCTA) is complicated by the presence of significant calcification or stents. Volumetric CT Digital Subtraction Angiography (CTDSA) has recently been shown to be effective at overcoming these limitations. Precise registration of structures is essential, as any misalignment can produce artifacts potentially inhibiting clinical interpretation of the data. The fully-automated registration method described in this paper addresses the problem by combining a dense deformation field with rigid-body transformations where calcifications/stents are present. The method contains non-rigid and rigid components. Non-rigid registration recovers the majority of motion artifacts and produces a dense deformation field valid over the entire scan domain. Discrete domains are identified in which rigid registrations very accurately align each calcification/stent. These rigid-body transformations are combined within the immediate area of the deformation field using a distance transform to minimize distortion of the surrounding tissue. A recent interim analysis of a clinical feasibility study evaluated reader confidence and diagnostic accuracy in conventional CCTA and CTDSA registered using this method. Conventional invasive coronary angiography was used as the reference. The study included 27 patients scanned with a second-generation 320-row CT detector in which 41 lesions were identified. Compared to conventional CCTA, CTDSA improved reader confidence in 13/36 (36%) of segments with severe calcification and 3/5 (60%) of segments with coronary stents. Also, the false positive rate of CTDSA was reduced compared to conventional CCTA from 18% (24/130) to 14% (19/130).

  1. Fully Automated Laser Ablation Liquid Capture Sample Analysis using NanoElectrospray Ionization Mass Spectrometry

    SciTech Connect

    Lorenz, Matthias; Ovchinnikova, Olga S; Van Berkel, Gary J

    2014-01-01

RATIONALE: Laser ablation provides for the possibility of sampling a large variety of surfaces with high spatial resolution. This type of sampling, when employed in conjunction with liquid capture followed by nanoelectrospray ionization, provides the opportunity for sensitive and prolonged interrogation of samples by mass spectrometry as well as the ability to analyze surfaces not amenable to direct liquid extraction. METHODS: A fully automated, reflection geometry, laser ablation liquid capture spot sampling system was achieved by incorporating appropriate laser fiber optics and a focusing lens into a commercially available, liquid extraction surface analysis (LESA) ready Advion TriVersa NanoMate system. RESULTS: Under optimized conditions about 10% of laser ablated material could be captured in a droplet positioned vertically over the ablation region using the NanoMate robot controlled pipette. The sampling spot size area with this laser ablation liquid capture surface analysis (LA/LCSA) mode of operation (typically about 120 µm x 160 µm) was approximately 50 times smaller than that achievable by direct liquid extraction using LESA (ca. 1 mm diameter liquid extraction spot). The set-up was successfully applied for the analysis of ink on glass and paper as well as the endogenous components in Alstroemeria Yellow King flower petals. In a second mode of operation with a comparable sampling spot size, termed laser ablation/LESA, the laser system was used to drill through, penetrate, or otherwise expose material beneath a solvent resistant surface. Once drilled, LESA was effective in sampling soluble material exposed at that location on the surface. CONCLUSIONS: Incorporating the capability for different laser ablation liquid capture spot sampling modes of operation into a LESA ready Advion TriVersa NanoMate enhanced the spot sampling spatial resolution of this device and broadened the surface types amenable to analysis to include absorbent and solvent resistant

  2. Results from the first fully automated PBS-mask process and pelliclization

    NASA Astrophysics Data System (ADS)

    Oelmann, Andreas B.; Unger, Gerd M.

    1994-02-01

Automation is widely discussed in IC and mask manufacturing and partially realized everywhere. The idea for the automation goes back to 1978, when it turned out that the operators for the then newly installed PBS process line (the first in Europe) would have to be trained to behave like robots in order to reduce particle counts and achieve lower defect densities on the masks. More than this goal has been achieved. It turned out recently that the automation, with its dedicated work routes and detailed documentation of every lot (individual mask or reticle), made it easy to obtain the CEEC certificate, which includes ISO 9001.

  3. A scalable, fully automated process for construction of sequence-ready barcoded libraries for 454.

    PubMed

    Lennon, Niall J; Lintner, Robert E; Anderson, Scott; Alvarez, Pablo; Barry, Andrew; Brockman, William; Daza, Riza; Erlich, Rachel L; Giannoukos, Georgia; Green, Lisa; Hollinger, Andrew; Hoover, Cindi A; Jaffe, David B; Juhn, Frank; McCarthy, Danielle; Perrin, Danielle; Ponchner, Karen; Powers, Taryn L; Rizzolo, Kamran; Robbins, Dana; Ryan, Elizabeth; Russ, Carsten; Sparrow, Todd; Stalker, John; Steelman, Scott; Weiand, Michael; Zimmer, Andrew; Henn, Matthew R; Nusbaum, Chad; Nicol, Robert

    2010-01-01

    We present an automated, high throughput library construction process for 454 technology. Sample handling errors and cross-contamination are minimized via end-to-end barcoding of plasticware, along with molecular DNA barcoding of constructs. Automation-friendly magnetic bead-based size selection and cleanup steps have been devised, eliminating major bottlenecks and significant sources of error. Using this methodology, one technician can create 96 sequence-ready 454 libraries in 2 days, a dramatic improvement over the standard method.

  4. A Fully Automated Method for CT-on-Rails-Guided Online Adaptive Planning for Prostate Cancer Intensity Modulated Radiation Therapy

    SciTech Connect

    Li, Xiaoqiang; Quan, Enzhuo M.; Li, Yupeng; Pan, Xiaoning; Zhou, Yin; Wang, Xiaochun; Du, Weiliang; Kudchadker, Rajat J.; Johnson, Jennifer L.; Kuban, Deborah A.; Lee, Andrew K.; Zhang, Xiaodong

    2013-08-01

Purpose: This study was designed to validate a fully automated adaptive planning (AAP) method which integrates automated recontouring and automated replanning to account for interfractional anatomical changes in prostate cancer patients receiving adaptive intensity modulated radiation therapy (IMRT) based on daily repeated computed tomography (CT)-on-rails images. Methods and Materials: Nine prostate cancer patients treated at our institution were randomly selected. For the AAP method, contours on each repeat CT image were automatically generated by mapping the contours from the simulation CT image using deformable image registration. An in-house automated planning tool incorporated into the Pinnacle treatment planning system was used to generate the original and the adapted IMRT plans. The cumulative dose–volume histograms (DVHs) of the target and critical structures were calculated based on the manual contours for all plans and compared with those of plans generated by the conventional method, that is, shifting the isocenters by aligning the images based on the center of the volume (COV) of the prostate (prostate COV-aligned). Results: The target coverage from our AAP method was acceptable for every patient, while 1 of the 9 patients showed target underdosing with the prostate COV-aligned plans. The normalized volume receiving at least 70 Gy (V70) and the mean dose were reduced by 8.9% and 6.4 Gy for the rectum and by 4.3% and 5.3 Gy for the bladder with the AAP method compared with the values obtained from prostate COV-aligned plans. Conclusions: The AAP method, which is fully automated, is effective for online replanning to compensate for target dose deficits and critical organ overdosing caused by interfractional anatomical changes in prostate cancer.

  5. Fully automated open access platform for rapid, combined serial evaporation and sample reformatting.

    PubMed

    Benali, Otman; Davies, Gary; Deal, Martyn; Farrant, Elizabeth; Guthrie, Duncan; Holden, John; Wheeler, Rob

    2008-01-01

This paper reports a novel evaporator and its integration with an automated sample handling system to create a high throughput evaporation platform. The Vaportec V-10 evaporator uses a high speed rotation motor (approximately 6000 rpm) to spin the vial containing a sample, creating a thin film of solvent which can be readily evaporated by the application of heat to the vial, while the consequent centrifugal force prevents "bumping". An intelligent algorithm controls pressure and temperature for optimum solvent removal conditions and end of run detection, critical for automation. The system allows the option of evaporation directly from a sample source vial, or alternatively, integrated liquid handling facilities provide the capability of transferring samples portionwise from a (large) source vial or bottle to a (small) daughter container, enabling efficient sample reformatting, with minimum user intervention. The open access system makes significant advances over current vacuum centrifugal evaporators in terms of evaporation rate and ease of automation. The evaporator's main features, the integration of robotics to provide automation, and examples of evaporation rates of a wide range of solvents from a variety of containers are described.

  6. A fully automated method for quantifying and localizing white matter hyperintensities on MR images.

    PubMed

    Wu, Minjie; Rosano, Caterina; Butters, Meryl; Whyte, Ellen; Nable, Megan; Crooks, Ryan; Meltzer, Carolyn C; Reynolds, Charles F; Aizenstein, Howard J

    2006-12-01

    White matter hyperintensities (WMH), commonly found on T2-weighted FLAIR brain MR images in the elderly, are associated with a number of neuropsychiatric disorders, including vascular dementia, Alzheimer's disease, and late-life depression. Previous MRI studies of WMHs have primarily relied on the subjective and global (i.e., full-brain) ratings of WMH grade. In the current study we implement and validate an automated method for quantifying and localizing WMHs. We adapt a fuzzy-connected algorithm to automate the segmentation of WMHs and use a demons-based image registration to automate the anatomic localization of the WMHs using the Johns Hopkins University White Matter Atlas. The method is validated using the brain MR images acquired from eleven elderly subjects with late-onset late-life depression (LLD) and eight elderly controls. This dataset was chosen because LLD subjects are known to have significant WMH burden. The volumes of WMH identified in our automated method are compared with the accepted gold standard (manual ratings). A significant correlation of the automated method and the manual ratings is found (P<0.0001), thus demonstrating similar WMH quantifications of both methods. As has been shown in other studies (e.g. [Taylor, W.D., MacFall, J.R., Steffens, D.C., Payne, M.E., Provenzale, J.M., Krishnan, K.R., 2003. Localization of age-associated white matter hyperintensities in late-life depression. Progress in Neuro-Psychopharmacology and Biological Psychiatry. 27 (3), 539-544.]), we found there was a significantly greater WMH burden in the LLD subjects versus the controls for both the manual and automated method. The effect size was greater for the automated method, suggesting that it is a more specific measure. Additionally, we describe the anatomic localization of the WMHs in LLD subjects as well as in the control subjects, and detect the regions of interest (ROIs) specific for the WMH burden of LLD patients. Given the emergence of large Neuro
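    The WMH segmentation step can be illustrated with a much-simplified stand-in: seeded region growing over hyperintense voxels on a 2-D slice. The published method uses a fuzzy-connectedness algorithm in 3-D; the function name and the intensity tolerance here are purely illustrative:

```python
from collections import deque

import numpy as np

def grow_region(img, seed, tol):
    """Seeded region growing on a 2-D image: a crude stand-in for
    fuzzy-connectedness segmentation of hyperintense lesions.

    Starting from `seed`, accept 4-connected neighbours whose
    intensity is within `tol` of the seed intensity.
    Returns a boolean mask of the grown region.
    """
    target = img[seed]
    mask = np.zeros(img.shape, dtype=bool)
    queue = deque([seed])
    mask[seed] = True
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < img.shape[0] and 0 <= nc < img.shape[1]
                    and not mask[nr, nc]
                    and abs(img[nr, nc] - target) <= tol):
                mask[nr, nc] = True
                queue.append((nr, nc))
    return mask
```

    In the paper's pipeline the resulting lesion mask is then warped onto the Johns Hopkins white matter atlas by demons registration to report WMH burden per anatomical region.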

  7. Fully-automated synthesis of 16β-18F-fluoro-5α-dihydrotestosterone (FDHT) on the ELIXYS radiosynthesizer

    PubMed Central

    Lazari, Mark; Lyashchenko, Serge K.; Burnazi, Eva M.; Lewis, Jason S.; van Dam, R. Michael; Murphy, Jennifer M.

    2015-01-01

Noninvasive in vivo imaging of androgen receptor (AR) levels with positron emission tomography (PET) is becoming the primary tool in prostate cancer detection and staging. Of the potential 18F-labeled PET tracers, 18F-FDHT has been shown clinically to be of the highest diagnostic value. We demonstrate the first automated synthesis of 18F-FDHT by adapting the conventional manual synthesis onto the fully-automated ELIXYS radiosynthesizer. Clinically-relevant amounts of 18F-FDHT were synthesized on ELIXYS in 90 min with a decay-corrected radiochemical yield of 29 ± 5% (n = 7). The specific activity was 4.6 Ci/µmol (170 GBq/µmol) at end of formulation with a starting activity of 1.0 Ci (37 GBq). The formulated 18F-FDHT yielded sufficient activity for multiple patient doses and passed all quality control tests required for routine clinical use. PMID:26046518

  8. A Fully Automated and Highly Versatile System for Testing Multi-cognitive Functions and Recording Neuronal Activities in Rodents

    PubMed Central

    Zheng, Weimin; Ycu, Edgar A.

    2012-01-01

    We have developed a fully automated system for operant behavior testing and neuronal activity recording by which multiple cognitive brain functions can be investigated in a single task sequence. The unique feature of this system is a custom-made, acoustically transparent chamber that eliminates many of the issues associated with auditory cue control in most commercially available chambers. The ease with which operant devices can be added or replaced makes this system quite versatile, allowing for the implementation of a variety of auditory, visual, and olfactory behavioral tasks. Automation of the system allows fine temporal (10 ms) control and precise time-stamping of each event in a predesigned behavioral sequence. When combined with a multi-channel electrophysiology recording system, multiple cognitive brain functions, such as motivation, attention, decision-making, patience, and rewards, can be examined sequentially or independently. PMID:22588124

  9. Rapid access to compound libraries through flow technology: fully automated synthesis of a 3-aminoindolizine library via orthogonal diversification.

    PubMed

    Lange, Paul P; James, Keith

    2012-10-08

    A novel methodology for the synthesis of druglike heterocycle libraries has been developed through the use of flow reactor technology. The strategy employs orthogonal modification of a heterocyclic core, which is generated in situ, and was used to construct both a 25-membered library of druglike 3-aminoindolizines, and selected examples of a 100-member virtual library. This general protocol allows a broad range of acylation, alkylation and sulfonamidation reactions to be performed in conjunction with a tandem Sonogashira coupling/cycloisomerization sequence. All three synthetic steps were conducted under full automation in the flow reactor, with no handling or isolation of intermediates, to afford the desired products in good yields. This fully automated, multistep flow approach opens the way to highly efficient generation of druglike heterocyclic systems as part of a lead discovery strategy or within a lead optimization program.

  10. Feasibility of fully automated detection of fiducial markers implanted into the prostate using electronic portal imaging: A comparison of methods

    SciTech Connect

    Harris, Emma J. . E-mail: eharris@icr.ac.uk; McNair, Helen A.; Evans, Phillip M.

    2006-11-15

    Purpose: To investigate the feasibility of fully automated detection of fiducial markers implanted into the prostate using portal images acquired with an electronic portal imaging device. Methods and Materials: We have made a direct comparison of four different methods published in the literature for the automatic detection of fiducial markers (two template-matching-based methods, a method incorporating attenuation and constellation analyses, and a cross-correlation method). The cross-correlation technique requires a priori information from the portal images; the technique is therefore not fully automated for the first treatment fraction. Images of 7 patients implanted with gold fiducial markers (8 mm in length and 1 mm in diameter) were acquired before treatment (set-up images) and during treatment (movie images) using 1 MU and 15 MU per image, respectively. Images included 75 anterior (AP) and 69 lateral (LAT) set-up images and 51 AP and 83 LAT movie images. Using the different methods described in the literature, marker positions were automatically identified. Results: The method based on cross-correlation techniques gave the highest detection success rates, 99% (AP) and 83% (LAT), for set-up (1 MU) images. The other methods gave detection success rates below 91% (AP) and 42% (LAT) for set-up images. The amount of a priori information used, and how it affects the way the techniques are implemented, is discussed. Conclusions: Fully automated marker detection in set-up images for the first treatment fraction is unachievable using these methods; cross-correlation is the best technique for automatic detection on subsequent radiotherapy treatment fractions.
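
    Cross-correlation detection of this kind slides a marker template over the image and takes the correlation peak as the marker position. A minimal 1-D sketch of normalized cross-correlation (illustrative only; the published methods operate on 2-D portal images with clinically derived templates, and all names and numbers here are invented):

```python
import math

def ncc(signal, template):
    """Normalized cross-correlation of a 1-D template against a signal.
    Returns a correlation score at every valid offset; the peak marks the
    most likely marker position."""
    n = len(template)
    t_mean = sum(template) / n
    t_dev = [t - t_mean for t in template]
    t_norm = math.sqrt(sum(d * d for d in t_dev))
    scores = []
    for off in range(len(signal) - n + 1):
        win = signal[off:off + n]
        w_mean = sum(win) / n
        w_dev = [w - w_mean for w in win]
        w_norm = math.sqrt(sum(d * d for d in w_dev))
        num = sum(a * b for a, b in zip(t_dev, w_dev))
        scores.append(num / (t_norm * w_norm) if t_norm * w_norm else 0.0)
    return scores

# A bright marker profile embedded in an intensity trace:
template = [1.0, 3.0, 1.0]
signal = [0.2, 0.1, 1.1, 3.2, 0.9, 0.3, 0.2]
scores = ncc(signal, template)
best = scores.index(max(scores))
print(best)  # 2: template best aligned at signal[2:5]
```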

  11. Development of a fully automated network system for long-term health-care monitoring at home.

    PubMed

    Motoi, K; Kubota, S; Ikarashi, A; Nogawa, M; Tanaka, S; Nemoto, T; Yamakoshi, K

    2007-01-01

    Daily monitoring of health condition at home is very important not only as an effective scheme for early diagnosis and treatment of cardiovascular and other diseases, but also for prevention and control of such diseases. From this point of view, we have developed a prototype room for fully automated monitoring of various vital signs. From the results of preliminary experiments using this room, it was confirmed that (1) ECG and respiration during bathing, (2) excretion weight and blood pressure, and (3) respiration and cardiac beat during sleep could be monitored with reasonable accuracy by the sensor system installed in bathtub, toilet and bed, respectively.

  12. A scalable, fully automated process for construction of sequence-ready human exome targeted capture libraries

    PubMed Central

    2011-01-01

    Genome targeting methods enable cost-effective capture of specific subsets of the genome for sequencing. We present here an automated, highly scalable method for carrying out the Solution Hybrid Selection capture approach that provides a dramatic increase in scale and throughput of sequence-ready libraries produced. Significant process improvements and a series of in-process quality control checkpoints are also added. These process improvements can also be used in a manual version of the protocol. PMID:21205303

  13. LDRD final report: Automated planning and programming of assembly of fully 3D mechanisms

    SciTech Connect

    Kaufman, S.G.; Wilson, R.H.; Jones, R.E.; Calton, T.L.; Ames, A.L.

    1996-11-01

    This report describes the results of assembly planning research under the LDRD. The assembly planning problem is that of finding a sequence of assembly operations, starting from individual parts, that will result in complete assembly of a device specified as a CAD model. The automated assembly programming problem is that of automatically producing a robot program that will carry out a given assembly sequence. Given solutions to both of these problems, it is possible to automatically program a robot to assemble a mechanical device given as a CAD data file. This report describes the current state of our solutions to both of these problems, and a software system called Archimedes 2 we have constructed to automate these solutions. Because Archimedes 2 can input CAD data in several standard formats, we have been able to test it on a number of industrial assembly models more complex than any before attempted by automated assembly planning systems, some having over 100 parts. A complete path from a CAD model to an automatically generated robot program for assembling the device represented by the CAD model has also been demonstrated.
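
    The sequencing half of the problem can be viewed abstractly as ordering parts under precedence constraints derived from geometric reasoning. A toy sketch of that final ordering step only (the constraint extraction from CAD geometry, which is the hard part Archimedes 2 addresses, is not shown; the part names are invented):

```python
from collections import defaultdict, deque

def assembly_order(parts, must_precede):
    """Topologically order parts so that every precedence constraint
    (a, b) -- 'a must be assembled before b' -- is respected.
    Raises ValueError if the constraints are cyclic (no feasible
    sequence exists)."""
    indeg = {p: 0 for p in parts}
    succ = defaultdict(list)
    for a, b in must_precede:
        succ[a].append(b)
        indeg[b] += 1
    ready = deque(p for p in parts if indeg[p] == 0)
    order = []
    while ready:
        p = ready.popleft()
        order.append(p)
        for q in succ[p]:
            indeg[q] -= 1
            if indeg[q] == 0:
                ready.append(q)
    if len(order) != len(parts):
        raise ValueError("cyclic precedence constraints: no feasible sequence")
    return order

# Base plate before bracket; bracket before cover and before its screw.
parts = ["base", "bracket", "cover", "screw"]
constraints = [("base", "bracket"), ("bracket", "cover"), ("bracket", "screw")]
print(assembly_order(parts, constraints))  # ['base', 'bracket', 'cover', 'screw']
```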

  14. Advanced manufacturing rules check (MRC) for fully automated assessment of complex reticle designs: Part II

    NASA Astrophysics Data System (ADS)

    Straub, J. A.; Aguilar, D.; Buck, P. D.; Dawkins, D.; Gladhill, R.; Nolke, S.; Riddick, J.

    2006-10-01

    Advanced electronic design automation (EDA) tools, with their simulation, modeling, design rule checking, and optical proximity correction capabilities, have facilitated the improvement of first pass wafer yields. While the data produced by these tools may have been processed for optimal wafer manufacturing, it is possible for the same data to be far from ideal for photomask manufacturing, particularly at lithography and inspection stages, resulting in production delays and increased costs. The same EDA tools used to produce the data can be used to detect potential problems for photomask manufacturing in the data. In the previous paper, it was shown how photomask MRC is used to uncover data related problems prior to automated defect inspection. It was demonstrated how jobs which are likely to have problems at inspection could be identified and separated from those which are not. The use of photomask MRC in production was shown to reduce time lost to aborted runs and troubleshooting due to data issues. In this paper, the effectiveness of this photomask MRC program in a high volume photomask factory over the course of a year as applied to more than ten thousand jobs will be shown. Statistics on the results of the MRC runs will be presented along with the associated impact to the automated defect inspection process. Common design problems will be shown as well as their impact to mask manufacturing throughput and productivity. Finally, solutions to the most common and most severe problems will be offered and discussed.

  15. Construction and evaluation of an automated light directed protein-detecting microarray synthesizer.

    PubMed

    Marthandan, N; Klyza, S; Li, S; Kwon, Y U; Kodadek, T; Garner, H R

    2008-03-01

    We have designed, constructed, and evaluated an automated instrument that has produced high-density arrays with more than 30 000 peptide features within a 1.5 cm(2) area of a glass slide surface. These arrays can be used for high-throughput library screening for protein-binding ligands, for potential drug candidate molecules, or for discovering biomarkers. The device consists of a novel fluidics system, a relay-control electrical system, an optics system that implements Texas Instruments' digital micromirror device (DMD), and a microwave source for accelerated synthesis of peptide arrays. The instrument implements two novel solid-phase chemical synthesis strategies for producing peptide and peptoid arrays. Biotin-streptavidin and DNP/anti-DNP (dinitrophenol) models of antibody-small molecule interactions were used to demonstrate and evaluate the instrument's capability to produce high-density protein-detecting arrays. Several screening assay and detection schemes were explored with varying levels of efficiency; assays with a sensitivity of 10 nM were also possible.

  16. Fully automated system for the gas chromatographic characterization of polar biopolymers based on thermally assisted hydrolysis and methylation.

    PubMed

    Kaal, Erwin; de Koning, Sjaak; Brudin, Stella; Janssen, Hans-Gerd

    2008-08-08

    Pyrolysis-gas chromatography (Py-GC) is a powerful tool for the detailed compositional analysis of polymers. A major problem of Py-GC is that polar (bio)polymers yield polar pyrolyzates which are not easily accessible to further GC characterization. In the present work, a newly developed fully automated procedure for thermally assisted hydrolysis and methylation (THM) of biopolymers is described. Drying of the sample, addition of the reagent, incubation and pyrolysis are performed inside the liner of a programmable temperature vaporizer injector. The new system not only allows efficient analysis of large series of samples, but also allows automated optimization of the experimental parameters based on an experimental design approach. The performance of the automated THM-procedure was evaluated by performing THM-GC of a poly(acrylic acid)-poly(maleic anhydride) copolymer (PAA/PMAH) and several polysaccharides. The optimized THM-procedure was applied for the structural characterization and differentiation of several lignins and hydroxypropylmethyl-celluloses. It was also applied to proteins. Here myoglobin and cytochrome c were used as the model compounds. Both conventional GC-mass spectrometry (MS) and comprehensive two-dimensional gas chromatography (GCxGC)-time-of-flight (TOF) MS were used for separation and identification of the species formed. The information obtained can aid in structure elucidation of polar biopolymers as well as in providing detailed compositional information which can be used to differentiate structurally similar biopolymers.

  17. Fully automated synthesis of [(18) F]fluoro-dihydrotestosterone ([(18) F]FDHT) using the FlexLab module.

    PubMed

    Ackermann, Uwe; Lewis, Jason S; Young, Kenneth; Morris, Michael J; Weickhardt, Andrew; Davis, Ian D; Scott, Andrew M

    2016-08-01

    Imaging of androgen receptor expression in prostate cancer using [(18) F]FDHT is becoming increasingly popular. With the radiolabelling precursor now commercially available, developing a fully automated synthesis of [(18) F]FDHT is important. We have fully automated the synthesis of [(18) F]FDHT using the iPhase FlexLab module with only commercially available components. Total synthesis time was 90 min, and radiochemical yields were 25-33% (n = 11). Radiochemical purity of the final formulation was > 99%, and specific activity was > 18.5 GBq/µmol for all batches. This method can be up-scaled as desired, making it possible to study multiple patients in a day. Furthermore, our procedure uses only 4 mg of precursor and is therefore cost-effective. The synthesis has now been validated at Austin Health and is currently used for [(18) F]FDHT studies in patients. We believe that this method can easily be adapted to other modules to further widen the availability of [(18) F]FDHT.

  18. Lab on valve-multisyringe flow injection system (LOV-MSFIA) for fully automated uranium determination in environmental samples.

    PubMed

    Avivar, Jessica; Ferrer, Laura; Casas, Montserrat; Cerdà, Víctor

    2011-06-15

    The hyphenation of lab-on-valve (LOV) and multisyringe flow injection analysis (MSFIA), coupled to a long path length liquid waveguide capillary cell (LWCC), allows the spectrophotometric determination of uranium in different types of environmental sample matrices, without any manual pre-treatment, and achieves high selectivity and sensitivity. On-line separation and preconcentration of uranium is carried out by means of UTEVA resin. The potential of the LOV-MSFIA makes possible the full automation of the system through the in-line regeneration of the column. After elution, uranium(VI) is spectrophotometrically detected after reaction with arsenazo-III. The determination of uranium levels in environmental samples is required in order to establish environmental controls. Thus, we propose a rapid, cheap and fully automated method to determine uranium(VI) in environmental samples. The limit of detection reached is 1.9 ng of uranium; depending on the preconcentrated volume, this corresponds to ppt levels (10.3 ng L(-1)). Different water sample matrices (seawater, well water, freshwater, tap water and mineral water) and a phosphogypsum sample (with natural uranium content) were satisfactorily analyzed.
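
    The quoted figures follow from the usual relationship between an absolute detection limit and the volume preconcentrated on the column. A hedged sketch (the 3-sigma convention is the standard IUPAC one; the 184 mL volume is chosen here only so the arithmetic reproduces the quoted concentration, and is not taken from the paper):

```python
def detection_limit(sd_blank, slope):
    """IUPAC-style limit of detection: 3 x standard deviation of the
    blank divided by the calibration slope."""
    return 3.0 * sd_blank / slope

def lod_concentration(lod_ng, preconc_volume_l):
    """An absolute LOD in ng divided by the volume preconcentrated on
    the column gives the concentration-domain LOD; large on-line
    preconcentration volumes are what push the LOD to ng/L (ppt) levels."""
    return lod_ng / preconc_volume_l

print(detection_limit(1.0, 3.0))                # 1.0
print(round(lod_concentration(1.9, 0.184), 1))  # 10.3 (ng/L)
```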

  19. A Robust and Fully-Automated Chromatographic Method for the Quantitative Purification of Ca and Sr for Isotopic Analysis

    NASA Astrophysics Data System (ADS)

    Smith, H. B.; Kim, H.; Romaniello, S. J.; Field, P.; Anbar, A. D.

    2014-12-01

    High-throughput methods for sample purification are required to effectively exploit new opportunities in the study of non-traditional stable isotopes. Many geochemical isotopic studies would benefit from larger data sets, but these are often impractical with manual drip chromatography techniques, which can be time-consuming and demand the attention of skilled laboratory staff. Here we present a new, fully-automated single-column method suitable for the purification of both Ca and Sr for stable and radiogenic isotopic analysis. The method can accommodate a wide variety of sample types, including carbonates, bones, and teeth; silicate rocks and sediments; fresh and marine waters; and biological samples such as blood and urine. Protocols for these isotopic analyses are being developed for use on the new prepFAST-MC(TM) system from Elemental Scientific (ESI). The system is highly adaptable and processes 24-60 samples per day by reusing a single chromatographic column. Efficient column cleaning between samples and an all-Teflon flow path ensure that sample carryover is maintained at the level of background laboratory blanks typical for manual drip chromatography. This method is part of a family of new fully-automated chromatographic methods being developed to address many different isotopic systems, including B, Ca, Fe, Cu, Zn, Sr, Cd, Pb, and U. These methods are designed to be rugged and transferable, and to allow the preparation of large, diverse sample sets via a highly repeatable process with minimal effort.

  20. Evaluation of a fully automated method to measure the critical removal stress of adult barnacles.

    PubMed

    Conlan, Sheelagh L; Mutton, Robert J; Aldred, Nick; Clare, Anthony S

    2008-01-01

    A computer-controlled force gauge designed to measure the adhesive strength of barnacles on test substrata is described. The instrument was evaluated with adult barnacles grown in situ on Silastic T2(R)-coated microscope slides and epoxy replicas adhered to the same substratum with synthetic adhesive. The force per unit area required to detach the barnacles (critical removal stress) using the new automated system was comparable to that obtained with ASTM D5618 (1994) (0.19 and 0.28 MPa compared with 0.18 and 0.27 MPa for two batches of barnacles). The automated method showed a faster rate of force development compared with the manual spring force gauge used for ASTM D5618 (1994). The new instrument was as accurate and precise at determining surface area as manual delineation used with ASTM D5618 (1994). The method provided significant advantages such as higher throughput speed, the ability to test smaller barnacles (which took less time to grow) and to control the force application angle and speed. The variability in measurements was lower than previously reported, suggesting an improved ability to compare the results obtained by different researchers.

  1. Fully automated high-performance liquid chromatographic assay for the analysis of free catecholamines in urine.

    PubMed

    Said, R; Robinet, D; Barbier, C; Sartre, J; Huguet, C

    1990-08-24

    A totally automated and reliable high-performance liquid chromatographic method is described for the routine determination of free catecholamines (norepinephrine, epinephrine and dopamine) in urine. The catecholamines were isolated from urine samples using small alumina columns. A standard automated method for pH adjustment of urine before the extraction step has been developed. The extraction was performed on an ASPEC (Automatic Sample Preparation with Extraction Columns, Gilson). The eluate was collected in a separate tube and then automatically injected into the chromatographic column. The catecholamines were separated by reversed-phase ion-pair liquid chromatography and quantified by fluorescence detection. No manual intervention was required during the extraction and separation procedure. One sample may be run every 15 min, ca. 96 samples in 24 h. Analytical recoveries for all three catecholamines are 63-87%, and the detection limits are 0.01, 0.01, and 0.03 microM for norepinephrine, epinephrine and dopamine, respectively, which is highly satisfactory for urine. Day-to-day coefficients of variation were less than 10%.

  2. A fully automated primary screening system for the discovery of therapeutic antibodies directly from B cells.

    PubMed

    Tickle, Simon; Howells, Louise; O'Dowd, Victoria; Starkie, Dale; Whale, Kevin; Saunders, Mark; Lee, David; Lightwood, Daniel

    2015-04-01

    For a therapeutic antibody to succeed, it must meet a range of potency, stability, and specificity criteria. Many of these characteristics are conferred by the amino acid sequence of the heavy and light chain variable regions and, for this reason, can be screened for during antibody selection. However, it is important to consider that antibodies satisfying all these criteria may be of low frequency in an immunized animal; for this reason, it is essential to have a mechanism that allows for efficient sampling of the immune repertoire. UCB's core antibody discovery platform combines high-throughput B cell culture screening and the identification and isolation of single, antigen-specific IgG-secreting B cells through a proprietary technique called the "fluorescent foci" method. Using state-of-the-art automation to facilitate primary screening, extremely efficient interrogation of the natural antibody repertoire is made possible; more than 1 billion immune B cells can now be screened to provide a useful starting point from which to identify the rare therapeutic antibody. This article will describe the design, construction, and commissioning of a bespoke automated screening platform and two examples of how it was used to screen for antibodies against two targets.

  3. Fast Image Analysis for the Micronucleus Assay in a Fully Automated High-Throughput Biodosimetry System

    PubMed Central

    Lyulko, Oleksandra V.; Garty, Guy; Randers-Pehrson, Gerhard; Turner, Helen C.; Szolc, Barbara; Brenner, David J.

    2014-01-01

    The development of, and results from, an image analysis system are presented for automated detection and scoring of micronuclei in human peripheral blood lymphocytes. The system is part of the Rapid Automated Biodosimetry Tool, which was developed at the Center for High-Throughput Minimally Invasive Radiation Biodosimetry for rapid radiation dose assessment of many individuals based on single fingerstick samples of blood. Blood lymphocytes were subjected to the cytokinesis-block micronucleus assay, and the images of cell cytoplasm and nuclei are analyzed to estimate the frequency of micronuclei in binucleated cells. We describe an algorithm that is based on dual fluorescent labeling of lymphocytes with separate analysis of images of cytoplasm and nuclei. To evaluate the performance of the system, blood samples of seven healthy donors were irradiated in vitro with doses from 0 to 10 Gy, and dose-response curves of micronuclei frequencies were generated. To establish the applicability of the system to the detection of high doses, the ratios of mononucleated cells to binucleated cells were determined for three of the donors. All of the dose-response curves generated automatically showed clear dose dependence and good correlation (R(2) from 0.914 to 0.998) with the results of manual scoring. PMID:24502354
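
    The agreement between automated and manual scoring can be summarized by the coefficient of determination. A sketch with invented example yields (the real curves come from seven donors over 0 to 10 Gy; the numbers below are hypothetical):

```python
def r_squared(reference, measured):
    """Coefficient of determination of measured values against a
    reference (here: automated micronuclei yields vs. manual scoring)."""
    mean_ref = sum(reference) / len(reference)
    ss_res = sum((r - m) ** 2 for r, m in zip(reference, measured))
    ss_tot = sum((r - mean_ref) ** 2 for r in reference)
    return 1.0 - ss_res / ss_tot

# Hypothetical micronuclei-per-binucleated-cell yields at 0, 2, 4 and 6 Gy:
manual    = [0.05, 0.35, 0.80, 1.20]
automated = [0.06, 0.33, 0.82, 1.18]
print(round(r_squared(manual, automated), 3))  # 0.998
```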

  4. A fully automated liquid–liquid extraction system utilizing interface detection

    PubMed Central

    Maslana, Eugene; Schmitt, Robert; Pan, Jeffrey

    2000-01-01

    The development of the Abbott Liquid-Liquid Extraction Station was a result of the need for an automated system to perform aqueous extraction on large sets of newly synthesized organic compounds used for drug discovery. The system utilizes a cylindrical laboratory robot to shuttle sample vials between two loading racks, two identical extraction stations, and a centrifuge. Extraction is performed by detecting the phase interface (by difference in refractive index) of the moving column of fluid drawn from the bottom of each vial containing a biphasic mixture. The integration of interface detection with fluid extraction maximizes sample throughput. Abbott-developed electronics process the detector signals. Sample mixing is performed by high-speed solvent injection. Centrifuging of the samples reduces interface emulsions. Operating software permits the user to program wash protocols with any one of six solvents per wash cycle with as many cycle repeats as necessary. Station capacity is eighty 15 ml vials. This system has proven successful with a broad spectrum of both ethyl acetate- and methylene chloride-based chemistries. The development and characterization of this automated extraction system will be presented. PMID:18924693

  5. Early detection of glaucoma using fully automated disparity analysis of the optic nerve head (ONH) from stereo fundus images

    NASA Astrophysics Data System (ADS)

    Sharma, Archie; Corona, Enrique; Mitra, Sunanda; Nutter, Brian S.

    2006-03-01

    Early detection of structural damage to the optic nerve head (ONH) is critical in diagnosis of glaucoma, because such glaucomatous damage precedes clinically identifiable visual loss. Early detection of glaucoma can prevent progression of the disease and consequent loss of vision. Traditional early detection techniques involve observing changes in the ONH through an ophthalmoscope. Stereo fundus photography is also routinely used to detect subtle changes in the ONH. However, clinical evaluation of stereo fundus photographs suffers from inter- and intra-subject variability. Even the Heidelberg Retina Tomograph (HRT) has not been found to be sufficiently sensitive for early detection. A semi-automated algorithm for quantitative representation of the optic disc and cup contours, computing accumulated disparities in the disc and cup regions from stereo fundus image pairs, has already been developed using advanced digital image analysis methodologies. A 3-D visualization of the disc and cup is achieved assuming camera geometry. High correlation between computer-generated and manually segmented cup-to-disc ratios in a longitudinal study involving 159 stereo fundus image pairs has already been demonstrated. However, the clinical usefulness of the proposed technique can only be tested by a fully automated algorithm. In this paper, we present a fully automated algorithm for segmentation of optic cup and disc contours from corresponding stereo disparity information. Because this technique does not involve human intervention, it eliminates subjective variability encountered in currently used clinical methods and provides ophthalmologists with a cost-effective and quantitative method for detection of ONH structural damage for early detection of glaucoma.
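
    Once cup and disc contours are segmented, the cup-to-disc ratio itself is a simple computation. An illustrative sketch (the area counts are hypothetical; the paper derives the contours from stereo disparity, which is not reproduced here):

```python
import math

def cdr_area(cup_area_px, disc_area_px):
    """Area cup-to-disc ratio from segmented contours (pixel counts)."""
    return cup_area_px / disc_area_px

def cdr_linear(cup_area_px, disc_area_px):
    """Linear (diameter) cup-to-disc ratio under a roughly-circular
    approximation: the square root of the area ratio."""
    return math.sqrt(cup_area_px / disc_area_px)

# A cup covering a quarter of the disc area has a linear CDR of 0.5;
# progressive enlargement of this ratio is a glaucoma indicator.
print(cdr_linear(500, 2000))  # 0.5
```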

  6. Fully automated hybrid diode laser assembly using high precision active alignment

    NASA Astrophysics Data System (ADS)

    Böttger, Gunnar; Weber, Daniel; Scholz, Friedemann; Schröder, Henning; Schneider-Ramelow, Martin; Lang, Klaus-Dieter

    2016-03-01

    Fraunhofer IZM, Technische Universität Berlin and eagleyard Photonics present various implementations of current micro-optical assemblies for high quality free space laser beam forming and efficient fiber coupling. The laser modules shown are optimized for fast and automated assembly in small form factor packages via state-of-the-art active alignment machinery, using alignment and joining processes that have been developed and established in various industrial research projects. Operational wavelengths and optical powers ranging from 600 to 1600 nm and from 1 mW to several W respectively are addressed, for application in high-resolution laser spectroscopy, telecom and optical sensors, up to the optical powers needed in industrial and medical laser treatment.

  7. A Fully Automated Pipeline for Classification Tasks with an Application to Remote Sensing

    NASA Astrophysics Data System (ADS)

    Suzuki, K.; Claesen, M.; Takeda, H.; De Moor, B.

    2016-06-01

    Deep learning is currently in the spotlight owing to its victories at major competitions, which has pushed "shallow" machine learning methods, the relatively simple and convenient algorithms commonly used by industrial engineers, into the background despite their advantages, such as the small amounts of training time and data they require. Taking a practical point of view, we used shallow learning algorithms to construct a learning pipeline that operators can use without specialist knowledge, an expensive computation environment, or a large amount of labelled data. The proposed pipeline automates the whole classification process: feature selection, feature weighting, and the selection of the most suitable classifier with optimized hyperparameters. The configuration uses particle swarm optimization, a well-known metaheuristic algorithm that is generally fast and fine-grained, which enables us not only to optimize (hyper)parameters but also to determine the appropriate features and classifier for the problem, choices that have conventionally been made a priori from domain knowledge or handled with naive methods such as grid search. In experiments with the MNIST and CIFAR-10 datasets, common computer-vision benchmarks for character recognition and object recognition respectively, our automated learning approach provides high performance considering its simple, non-specialized setting, small amount of training data, and practical learning time. Moreover, compared with deep learning, the performance remains robust almost without modification even on a remote sensing object recognition problem, which in turn indicates that our approach may contribute to general classification problems.
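
    The metaheuristic at the heart of the pipeline can be sketched compactly. A minimal particle swarm optimizer in pure Python, here minimizing a toy sphere function rather than a cross-validated classification error (the parameter values are conventional defaults, not those of the paper):

```python
import random

def pso(objective, dim, bounds, n_particles=20, iters=60,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer: particles move under inertia (w)
    plus attraction to their personal best (c1) and the swarm's global
    best (c2). In the pipeline setting, `objective` would be the
    cross-validated error as a function of hyperparameters."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Minimize the 2-D sphere function; the swarm converges toward the origin.
best, best_val = pso(lambda p: sum(x * x for x in p), dim=2, bounds=(-5.0, 5.0))
print(round(best_val, 6))
```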

  8. A quality assurance framework for the fully automated and objective evaluation of image quality in cone-beam computed tomography

    SciTech Connect

    Steiding, Christian; Kolditz, Daniel; Kalender, Willi A.

    2014-03-15

    Purpose: Thousands of cone-beam computed tomography (CBCT) scanners for vascular, maxillofacial, neurological, and body imaging are in clinical use today, but there is no consensus on uniform acceptance and constancy testing for image quality (IQ) and dose yet. The authors developed a quality assurance (QA) framework for fully automated and time-efficient performance evaluation of these systems. In addition, the dependence of objective Fourier-based IQ metrics on direction and position in 3D volumes was investigated for CBCT. Methods: The authors designed a dedicated QA phantom 10 cm in length consisting of five compartments, each with a diameter of 10 cm, and an optional extension ring 16 cm in diameter. A homogeneous section of water-equivalent material allows measuring CT value accuracy, image noise and uniformity, and multidimensional global and local noise power spectra (NPS). For the quantitative determination of 3D high-contrast spatial resolution, the modulation transfer function (MTF) of centrally and peripherally positioned aluminum spheres was computed from edge profiles. Additional in-plane and axial resolution patterns were used to assess resolution qualitatively. The characterization of low-contrast detectability as well as CT value linearity and artifact behavior was tested by utilizing sections with soft-tissue-equivalent and metallic inserts. For an automated QA procedure, a phantom detection algorithm was implemented. All tests used in the dedicated QA program were initially verified in simulation studies and experimentally confirmed on a clinical dental CBCT system. Results: The automated IQ evaluation of volume data sets of the dental CBCT system was achieved with the proposed phantom requiring only one scan for the determination of all desired parameters. Typically, less than 5 min were needed for phantom set-up, scanning, and data analysis. 
Quantitative evaluation of system performance over time by comparison to previous examinations was also demonstrated.
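
    One of the Fourier-based metrics mentioned, the noise power spectrum, reduces in one dimension to the squared DFT magnitude of a detrended noise trace. A simplified sketch (clinical NPS estimation averages many 2-D/3-D ROIs; the normalization shown is one common convention, verified here against Parseval's theorem):

```python
import cmath
import math
import random

def noise_power_spectrum(samples, pixel_size=1.0):
    """1-D noise power spectrum: squared DFT magnitude of the detrended
    signal, divided by the number of samples and scaled by pixel size."""
    n = len(samples)
    mean = sum(samples) / n
    detrended = [s - mean for s in samples]
    nps = []
    for k in range(n):
        coeff = sum(x * cmath.exp(-2j * math.pi * k * i / n)
                    for i, x in enumerate(detrended))
        nps.append(abs(coeff) ** 2 * pixel_size / n)
    return nps

# Sanity check via Parseval's theorem: for pixel_size = 1, the mean of
# the NPS bins equals the variance of the noise.
rng = random.Random(1)
noise = [rng.gauss(0.0, 5.0) for _ in range(64)]
mean = sum(noise) / len(noise)
var = sum((x - mean) ** 2 for x in noise) / len(noise)
nps = noise_power_spectrum(noise)
print(abs(sum(nps) / len(nps) - var) < 1e-6)  # True
```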

  9. Fully automated detection of corticospinal tract damage in chronic stroke patients.

    PubMed

    Yang, Ming; Yang, Ya-ru; Li, Hui-jun; Lu, Xue-song; Shi, Yong-mei; Liu, Bin; Chen, Hua-jun; Teng, Gao-jun

    2014-01-01

    Structural integrity of the corticospinal tract (CST) after stroke is closely linked to the degree of motor impairment. However, current methods for measurement of fractional anisotropy (FA) of the CST based on regions of interest (ROI) are time-consuming and open to bias. Here, we used tract-based spatial statistics (TBSS) together with a CST template derived from healthy volunteers to quantify the structural integrity of the CST automatically. Two groups of patients after ischemic stroke were enrolled: group 1 (10 patients, 7 men, Fugl-Meyer assessment (FMA) scores ⩽ 50) and group 2 (12 patients, 12 men, FMA scores = 100). FA(ipsi), FA(contra), and FA(ratio) of the CST were compared between the two groups. Relative to group 2, FA in group 1 was decreased in the ipsilesional CST (P < 0.01), as was the FA(ratio) (P < 0.01). There was no significant difference between the two groups in the contralesional CST (P = 0.23). Compared with the contralesional CST, FA of the ipsilesional CST was decreased in group 1 (P < 0.01). These results suggest that the automated method used in our study could provide a surrogate biomarker to quantify the CST after stroke, which would facilitate implementation in clinical practice.
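
    The ratio metric is straightforward once mean FA values are extracted from the skeletonized tract masks. An illustrative sketch with invented FA values (real FA maps come from diffusion tensor fitting, not shown):

```python
def mean_fa(fa_values):
    """Mean FA over the voxels of a skeletonized CST mask."""
    return sum(fa_values) / len(fa_values)

def fa_ratio(fa_ipsi, fa_contra):
    """Ratio of ipsilesional to contralesional mean FA; values well
    below 1 indicate tract damage on the lesioned side."""
    return fa_ipsi / fa_contra

ipsi = [0.30, 0.35, 0.28, 0.31]    # hypothetical ipsilesional FA voxels
contra = [0.52, 0.55, 0.50, 0.51]  # hypothetical contralesional FA voxels
print(round(fa_ratio(mean_fa(ipsi), mean_fa(contra)), 2))  # 0.6
```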

  10. Fully Automated Field-Deployable Bioaerosol Monitoring System Using Carbon Nanotube-Based Biosensors.

    PubMed

    Kim, Junhyup; Jin, Joon-Hyung; Kim, Hyun Soo; Song, Wonbin; Shin, Su-Kyoung; Yi, Hana; Jang, Dae-Ho; Shin, Sehyun; Lee, Byung Yang

    2016-05-17

    Much progress has been made in the field of automated monitoring systems for airborne pathogens. However, they still lack the robustness and stability necessary for field deployment. Here, we demonstrate a bioaerosol automonitoring instrument (BAMI) specifically designed for the in situ capture and continuous monitoring of airborne fungal particles. This was possible by developing highly sensitive and selective fungi sensors based on two-channel carbon nanotube field-effect transistors (CNT-FETs), followed by integration with a bioaerosol sampler, a Peltier cooler for receptor lifetime enhancement, and a pumping assembly for fluidic control. These four main components work together to enable the real-time monitoring of fungi. The two-channel CNT-FETs can detect two different fungal species simultaneously. The Peltier cooler effectively lowers the working temperature of the sensor device, resulting in extended sensor lifetime and receptor stability. The system performance was verified both under laboratory conditions and in real residential areas. The system response was in accordance with the reported distribution of fungal species in the environment. Our system is versatile enough that it can be easily modified for the monitoring of other airborne pathogens. We expect that our system will expedite the development of hand-held and portable systems for bioaerosol monitoring.

  11. Fully automated software solution for protein quantitation by global metabolic labeling with stable isotopes.

    PubMed

    Bindschedler, L V; Cramer, R

    2011-06-15

    Metabolic stable isotope labeling is increasingly employed for accurate protein (and metabolite) quantitation using mass spectrometry (MS). It provides sample-specific isotopologues that can be used to facilitate comparative analysis of two or more samples. Stable Isotope Labeling by Amino acids in Cell culture (SILAC) has been used for almost a decade in proteomic research and analytical software solutions have been established that provide an easy and integrated workflow for elucidating sample abundance ratios for most MS data formats. While SILAC is a discrete labeling method using specific amino acids, global metabolic stable isotope labeling using isotopes such as (15)N labels the entire element content of the sample, i.e. for (15)N the entire peptide backbone in addition to all nitrogen-containing side chains. Although global metabolic labeling can deliver advantages with regard to isotope incorporation and costs, the requirements for data analysis are more demanding because, for instance for polypeptides, the mass difference introduced by the label depends on the amino acid composition. Consequently, there has been less progress on the automation of the data processing and mining steps for this type of protein quantitation. Here, we present a new integrated software solution for the quantitative analysis of protein expression in differential samples and show the benefits of high-resolution MS data in quantitative proteomic analyses.
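The composition dependence noted above (the (15)N mass shift of a polypeptide varies with its amino-acid makeup) can be made concrete with a short sketch. Function names are illustrative, not from the software described: each residue contributes one backbone nitrogen plus any side-chain nitrogens, and each (14)N→(15)N substitution adds ≈0.997 Da.

```python
# Sketch: mass shift of a fully 15N-labeled peptide, derived from its
# amino-acid composition (one backbone N per residue + side-chain N).

DELTA_15N = 0.997035  # Da, mass difference between 15N and 14N

# side-chain nitrogen atoms per residue (backbone N counted separately)
SIDE_CHAIN_N = {"R": 3, "K": 1, "H": 2, "N": 1, "Q": 1, "W": 1}

def nitrogen_count(peptide: str) -> int:
    """Total nitrogen atoms: one backbone N per residue plus side-chain N."""
    return sum(1 + SIDE_CHAIN_N.get(aa, 0) for aa in peptide)

def label_mass_shift(peptide: str) -> float:
    """Mass difference (Da) between fully 15N-labeled and unlabeled forms."""
    return nitrogen_count(peptide) * DELTA_15N

print(nitrogen_count("PEPTIDER"))            # 8 backbone N + 3 from Arg = 11
print(round(label_mass_shift("PEPTIDER"), 3))
```

This is why, unlike SILAC's fixed per-label mass offset, (15)N quantitation software must recompute the expected isotopologue spacing for every candidate peptide sequence.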

  12. Grid-Competitive Residential and Commercial Fully Automated PV Systems Technology: Final Technical Report, August 2011

    SciTech Connect

    Brown, Katie E.; Cousins, Peter; Culligan, Matt; Jonathan Botkin; DeGraaff, David; Bunea, Gabriella; Rose, Douglas; Bourne, Ben; Koehler, Oliver

    2011-08-26

    Under DOE's Technology Pathway Partnership program, SunPower Corporation developed turn-key, high-efficiency residential and commercial systems that are cost effective. Key program objectives include a reduction in LCOE values to 9-12 cents/kWh and 13-18 cents/kWh respectively for the commercial and residential markets. Target LCOE values for the commercial ground, commercial roof, and residential markets are 10, 11, and 13 cents/kWh. For this effort, SunPower collaborated with a variety of suppliers and partners to complete the tasks below. Subcontractors included: Solaicx, SiGen, Ribbon Technology, Dow Corning, Xantrex, Tigo Energy, and Solar Bridge. SunPower's TPP addressed nearly the complete PV value chain: from ingot growth through system deployment. Throughout the award period of performance, SunPower has made progress toward achieving these reduced costs through the development of 20%+ efficient modules, increased cell efficiency through the understanding of loss mechanisms and improved manufacturing technologies, novel module development, automated design tools and techniques, and reduced system development and installation time. Based on an LCOE assessment using NREL's Solar Advisor Model, SunPower achieved the 2010 target range, as well as progress toward 2015 targets.

  13. A Fully Automated and Robust Method to Incorporate Stamping Data in Crash, NVH and Durability Analysis

    NASA Astrophysics Data System (ADS)

    Palaniswamy, Hariharasudhan; Kanthadai, Narayan; Roy, Subir; Beauchesne, Erwan

    2011-08-01

    Crash, NVH (Noise, Vibration, Harshness), and durability analysis are commonly deployed in structural CAE analysis for mechanical design of components especially in the automotive industry. Components manufactured by stamping constitute a major portion of the automotive structure. In CAE analysis they are modeled at a nominal state with uniform thickness and no residual stresses and strains. However, in reality the stamped components have non-uniformly distributed thickness and residual stresses and strains resulting from stamping. It is essential to consider the stamping information in CAE analysis to accurately model the behavior of the sheet metal structures under different loading conditions. Especially with the current emphasis on weight reduction by replacing conventional steels with aluminum and advanced high strength steels it is imperative to avoid over design. Considering this growing need in industry, a highly automated and robust method has been integrated within Altair Hyperworks® to initialize sheet metal components in CAE models with stamping data. This paper demonstrates this new feature and the influence of stamping data for a full car frontal crash analysis.

  14. Development of a Platform to Enable Fully Automated Cross-Titration Experiments.

    PubMed

    Cassaday, Jason; Finley, Michael; Squadroni, Brian; Jezequel-Sur, Sylvie; Rauch, Albert; Gajera, Bharti; Uebele, Victor; Hermes, Jeffrey; Zuck, Paul

    2017-04-01

    In the triage of hits from a high-throughput screening campaign or during the optimization of a lead compound, it is relatively routine to test compounds at multiple concentrations to determine potency and maximal effect. Additional follow-up experiments, such as agonist shift, can be quite valuable in ascertaining compound mechanism of action (MOA). However, these experiments require cross-titration of a test compound with the activating ligand of the receptor requiring 100-200 data points, severely limiting the number tested in MOA assays in a screening triage. We describe a process to enhance the throughput of such cross-titration experiments through the integration of Hewlett Packard's D300 digital dispenser onto one of our robotics platforms to enable on-the-fly cross-titration of compounds in a 1536-well plate format. The process handles all the compound management and data tracking, as well as the biological assay. The process relies heavily on in-house-built software and hardware, and uses our proprietary control software for the platform. Using this system, we were able to automate the cross-titration of compounds for both positive and negative allosteric modulators of two different G protein-coupled receptors (GPCRs) using two distinct assay detection formats, IP1 and Ca(2+) detection, on nearly 100 compounds for each target.
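A cross-titration of this kind pairs every serial dilution of the test compound with every serial dilution of the activating ligand. The minimal sketch below lays out such a matrix; dilution factors and point counts are illustrative, chosen only to land in the 100-200 data-point range cited, and all names are hypothetical rather than from the platform described.

```python
# Sketch of a cross-titration layout: every serial dilution of the test
# compound is paired with every serial dilution of the activating ligand.

def serial_dilution(top, factor, n):
    """n concentrations starting at `top`, each step diluted by `factor`."""
    return [top / factor**i for i in range(n)]

def cross_titration(compound_top, ligand_top, n_compound=10, n_ligand=12,
                    factor=3.0):
    """Return (compound_conc, ligand_conc) pairs covering the full matrix."""
    compound = serial_dilution(compound_top, factor, n_compound)
    ligand = serial_dilution(ligand_top, factor, n_ligand)
    return [(c, l) for c in compound for l in ligand]

matrix = cross_titration(10.0, 1.0)  # e.g. 10 uM compound vs 1 uM ligand tops
print(len(matrix))  # 120 data points, within the 100-200 range cited
```

Dispensing each (compound, ligand) pair on the fly is exactly the step a digital dispenser such as the D300 automates in a 1536-well format.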

  15. Fully-automated roller bottle handling system for large scale culture of mammalian cells.

    PubMed

    Kunitake, R; Suzuki, A; Ichihashi, H; Matsuda, S; Hirai, O; Morimoto, K

    1997-01-20

    A fully automatic, continuous cell culture system based on roller bottles is described. The system includes a culture rack storage station for storing a large number of roller bottles filled with culture medium and inoculated with mammalian cells, and a mass-handling facility for harvesting completed cultures from the roller bottles and replacing the culture medium. The various component units of the system were controlled either by a general-purpose programmable logic controller or by a dedicated controller. The system provided sequential operation modes including cell inoculation, medium change, and harvesting, and the operator could easily select and change the appropriate mode from outside the aseptic area. The system made possible the large-scale production of mammalian cells, and the manufacture and stabilization of high-quality products such as erythropoietin, under total aseptic control, opening the door to the industrial production of physiologically active substances as pharmaceutical drugs by mammalian cell culture.

  16. An alternative method for monitoring carbonyls, and the development of a 24-port fully automated carbonyl sampler for PAMS program

    SciTech Connect

    Parmar, S.S.; Ugarova, L.; Fernandes, C.; Guyton, J.; Lee, C.P.

    1994-12-31

    The authors have investigated the possibility of collecting different aldehydes and ketones on sorbents such as silica gel, molecular sieve, and charcoal, followed by solvent extraction, DNPH derivatization, and HPLC/UV analysis. Carbonyl collection efficiencies for these sorbents were calculated relative to a DNPH-coated C18 Sep-Pak cartridge. From a limited number of laboratory experiments at various concentrations, it appears that silica gel tubes can be used for sampling aldehydes (collection efficiencies ~1), whereas charcoal tubes are suitable for collecting ketones. Molecular sieve was found to be unsuitable for collecting most of the carbonyls studied. The authors also report the development of a fully automated 24-port carbonyl sampler specially designed for the EPA's PAMS program.

  17. A randomised control study of a fully automated internet based smoking cessation programme

    PubMed Central

    Swartz, L H G; Noell, J W; Schroeder, S W; Ary, D V

    2006-01-01

    Objective: The objective of this project was to test the short term (90 days) efficacy of an automated behavioural intervention for smoking cessation, the "1-2-3 Smokefree" programme, delivered via an internet website. Design: Randomised control trial. Subjects surveyed at baseline, immediately post-intervention, and 90 days later. Settings: The study and the intervention occurred entirely via the internet site. Subjects were recruited primarily via worksites, which referred potential subjects to the website. Subjects: The 351 qualifying subjects were notified of the study via their worksite and required to have internet access. Additionally, subjects were required to be over 18 years of age, smoke cigarettes, and be interested in quitting smoking in the next 30 days. Eligible subjects were randomly assigned individually to treatment or control condition by computer algorithm. Intervention: The intervention consisted of a video based internet site that presented current strategies for smoking cessation and motivational materials tailored to the user's race/ethnicity, sex, and age. Control subjects received nothing for 90 days and were then allowed access to the programme. Main outcome measures: The primary outcome measure was abstinence from smoking at 90 day follow up. Results: At follow up, the cessation rate at 90 days was 24.1% (n = 21) for the treatment group and 8.2% (n = 9) for the control group (p = 0.002). Using an intent-to-treat model, 12.3% (n = 21) of the treatment group were abstinent, compared to 5.0% (n = 9) in the control group (p = 0.015). Conclusions: These evaluation results suggest that a smoking cessation programme, with at least short term efficacy, can be successfully delivered via the internet. PMID:16436397
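As a quick consistency check on the intent-to-treat figures, the group sizes can be back-calculated from the reported rates. This is a sketch: the 171/180 split is inferred from the percentages (21/171 ≈ 12.3%, 9/180 = 5.0%, and 171 + 180 = 351), not stated explicitly in the abstract.

```python
# Sketch: reproducing the reported intent-to-treat quit rates from counts.
# Denominators 171 (treatment) and 180 (control) are inferred, not stated.

def quit_rate(abstinent, n):
    """Percentage of the group abstinent at follow-up."""
    return 100.0 * abstinent / n

itt_treatment = quit_rate(21, 171)
itt_control = quit_rate(9, 180)
print(round(itt_treatment, 1))  # 12.3
print(round(itt_control, 1))    # 5.0
```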

  18. “Smart” RCTs: Development of a Smartphone App for Fully Automated Nutrition-Labeling Intervention Trials

    PubMed Central

    Li, Nicole; Dunford, Elizabeth; Eyles, Helen; Crino, Michelle; Michie, Jo; Ni Mhurchu, Cliona

    2016-01-01

    Background There is substantial interest in the effects of nutrition labels on consumer food-purchasing behavior. However, conducting randomized controlled trials on the impact of nutrition labels in the real world presents a significant challenge. Objective The Food Label Trial (FLT) smartphone app was developed to enable conducting fully automated trials, delivering intervention remotely, and collecting individual-level data on food purchases for two nutrition-labeling randomized controlled trials (RCTs) in New Zealand and Australia. Methods Two versions of the smartphone app were developed: one for a 5-arm trial (Australian) and the other for a 3-arm trial (New Zealand). The RCT protocols guided requirements for app functionality, that is, obtaining informed consent, two-stage eligibility check, questionnaire administration, randomization, intervention delivery, and outcome assessment. Intervention delivery (nutrition labels) and outcome data collection (individual shopping data) used the smartphone camera technology, where a barcode scanner was used to identify a packaged food and link it with its corresponding match in a food composition database. Scanned products were either recorded in an electronic list (data collection mode) or allocated a nutrition label on screen if matched successfully with an existing product in the database (intervention delivery mode). All recorded data were transmitted to the RCT database hosted on a server. Results In total approximately 4000 users have downloaded the FLT app to date; 606 (Australia) and 1470 (New Zealand) users met the eligibility criteria and were randomized. Individual shopping data collected by participants currently comprise more than 96,000 (Australia) and 229,000 (New Zealand) packaged food and beverage products. Conclusions The FLT app is one of the first smartphone apps to enable conducting fully automated RCTs. Preliminary app usage statistics demonstrate large potential of such technology, both for

  19. Fully automated measuring equipment for aqueous boron and its application to online monitoring of industrial process effluents.

    PubMed

    Ohyama, Seiichi; Abe, Keiko; Ohsumi, Hitoshi; Kobayashi, Hirokazu; Miyazaki, Naotsugu; Miyadera, Koji; Akasaka, Kin-ichi

    2009-06-01

    Fully automated measuring equipment for aqueous boron (referred to as the online boron monitor) was developed on the basis of a rapid potentiometric determination method using a commercial BF4(-) ion-selective electrode (ISE). The equipment can measure boron compounds with concentration ranging from a few to several hundred mg/L, and the measurement is completed in less than 20 min without any pretreatment of the sample. In the monitor, a series of operations for the measurement, i.e., sampling and dispensing of the sample, addition of the chemicals, acquisition and processing of potentiometric data, rinsing of the measurement cell, and calibration of the BF4(-) ISE, is automated. To demonstrate the performance, we installed the monitor in full-scale coal-fired power plants and measured the effluent from a flue gas desulfurization unit. The boron concentration in the wastewater varied significantly depending on the type of coal and the load of power generation. An excellent correlation (R2 = 0.987) was obtained in the measurements between the online boron monitor and inductively coupled plasma atomic emission spectrometry, which proved that the developed monitor can serve as a useful tool for managing boron emission in industrial process effluent.
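The potentiometric step rests on the Nernst relation between electrode potential and the logarithm of ion activity. The sketch below shows a two-point calibration and its inversion to recover concentration; all values are illustrative, and the monitor's actual calibration routine is not described at this level of detail in the abstract.

```python
import math

# Sketch of the ISE workflow: calibrate the electrode response
# E = E0 + S*log10(c) from two standards, then invert it to get
# concentration from a measured potential.

def calibrate(c1, e1, c2, e2):
    """Return (E0, slope S in mV/decade) from two standard solutions."""
    s = (e2 - e1) / (math.log10(c2) - math.log10(c1))
    e0 = e1 - s * math.log10(c1)
    return e0, s

def concentration(e, e0, s):
    """Invert the calibration line to recover concentration (mg/L)."""
    return 10 ** ((e - e0) / s)

# two hypothetical boron standards, 10 and 100 mg/L, measured as BF4-
e0, s = calibrate(10.0, -150.0, 100.0, -209.0)  # slope ~ -59 mV/decade
print(round(concentration(-179.5, e0, s), 1))   # ~31.6 mg/L
```

The near-Nernstian slope of roughly -59 mV per decade for a monovalent anion is what lets a single calibration cover the few-to-several-hundred mg/L range quoted in the abstract.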

  1. Screening for anabolic steroids in urine of forensic cases using fully automated solid phase extraction and LC-MS-MS.

    PubMed

    Andersen, David W; Linnet, Kristian

    2014-01-01

    A screening method for 18 frequently measured exogenous anabolic steroids and the testosterone/epitestosterone (T/E) ratio in forensic cases has been developed and validated. The method involves a fully automated sample preparation including enzyme treatment, addition of internal standards and solid phase extraction followed by analysis by liquid chromatography-tandem mass spectrometry (LC-MS-MS) using electrospray ionization with adduct formation for two compounds. Urine samples from 580 forensic cases were analyzed to determine the T/E ratio and occurrence of exogenous anabolic steroids. Extraction recoveries ranged from 77 to 95%, matrix effects from 48 to 78%, overall process efficiencies from 40 to 54% and the lower limit of identification ranged from 2 to 40 ng/mL. In the 580 urine samples analyzed from routine forensic cases, 17 (2.9%) were found positive for one or more anabolic steroids. Only seven different steroids including testosterone were found in the material, suggesting that only a small number of common steroids are likely to occur in a forensic context. The steroids were often in high concentrations (>100 ng/mL), and a combination of steroids and/or other drugs of abuse were seen in the majority of cases. The method presented serves as a fast and automated screening procedure, proving the suitability of LC-MS-MS for analyzing anabolic steroids.
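The three reported validation figures are related by the usual convention that overall process efficiency is the product of extraction recovery and matrix effect (PE = RE × ME / 100). A one-line sketch, with example numbers chosen to fall inside the reported ranges rather than taken from the paper:

```python
# Sketch: overall process efficiency (PE) as the product of extraction
# recovery (RE) and matrix effect (ME), all expressed in percent.

def process_efficiency(recovery_pct, matrix_effect_pct):
    return recovery_pct * matrix_effect_pct / 100.0

# e.g. 90% recovery with a 55% matrix effect gives ~50% overall
# process efficiency, consistent with the reported 40-54% range
print(process_efficiency(90, 55))  # 49.5
```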

  2. A fully automated dual-online multifunctional ultrahigh pressure liquid chromatography system for high-throughput proteomics analysis.

    PubMed

    Lee, Hangyeore; Lee, Jung Hwa; Kim, Hokeun; Kim, Su-Jin; Bae, Jingi; Kim, Hark Kyun; Lee, Sang-Won

    2014-02-14

    A fully automated dual-online multifunctional ultrahigh pressure liquid chromatography (DO-MULTI-UPLC) system has been developed for high-throughput proteome analyses of complex peptide mixtures. The system employs two online solid phase extraction (SPE) columns (150 μm inner diameter × 3 cm), two capillary reverse phase (RP) columns (75 μm × 100 cm), and a strong cation exchange (SCX) column (150 μm × 15 cm) on a single system utilizing one binary pump and one isocratic pump. With the automated operation of six switching valves, the system selects between single-dimensional RPLC and online two-dimensional SCX/RPLC experiments without manual intervention, while the two RPLC columns are used independently and alternately. By essentially removing the dead time for column equilibration between experiments, in either 1D or 2D mode, the system was demonstrated to roughly double experimental throughput while keeping the inter-column reproducibility of peptide elution time to less than 1% of the gradient time. The advantageous features of the proposed system were demonstrated by its application to proteome samples of varying complexities.

  3. Fully automated precision predictions for heavy neutrino production mechanisms at hadron colliders

    NASA Astrophysics Data System (ADS)

    Degrande, Céline; Mattelaer, Olivier; Ruiz, Richard; Turner, Jessica

    2016-09-01

    Motivated by TeV-scale neutrino mass models, we propose a systematic treatment of heavy neutrino (N) production at hadron colliders. Our simple and efficient modeling of the vector boson fusion (VBF) W±γ → Nℓ± and Nℓ± + nj signal definitions resolves collinear and soft divergences that have plagued past studies, and is applicable to other color-singlet processes, e.g., associated Higgs (W±h), sparticle (ℓ̃±ν̃ℓ), and charged Higgs (h±±h∓) production. We present, for the first time, a comparison of all leading N production modes, including both gluon fusion (GF) gg → Z*/h* → Nνℓ(ν̄ℓ) and VBF. We obtain fully differential results up to next-to-leading order (NLO) in QCD accuracy using a Monte Carlo tool chain linking FeynRules, NLOCT, and MadGraph5_aMC@NLO. Associated model files are publicly available. At the 14 TeV LHC, the leading-order GF rate is small and comparable to the NLO Nℓ± + 1j rate; at a future 100 TeV Very Large Hadron Collider, GF dominates for mN = 300-1500 GeV, beyond which VBF takes the lead.

  4. Fully automated screening of immunocytochemically stained specimens for early cancer detection

    NASA Astrophysics Data System (ADS)

    Bell, André A.; Schneider, Timna E.; Müller-Frank, Dirk A. C.; Meyer-Ebrecht, Dietrich; Böcking, Alfred; Aach, Til

    2007-03-01

    Cytopathological cancer diagnoses can be obtained less invasively than histopathological investigations. Cell-containing specimens can be obtained without pain or discomfort, bloody biopsies are avoided, and the diagnosis can, in some cases, even be made earlier. Since no tissue biopsies are necessary, these methods can also be used in screening applications, e.g., for cervical cancer. Among the cytopathological methods, diagnosis based on the analysis of the amount of DNA in individual cells achieves high sensitivity and specificity. Yet this analysis is time consuming, which is prohibitive for a screening application. Hence, it is advantageous to retain, in a preceding selection step, only a subset of suspicious specimens. This can be achieved using highly sensitive immunocytochemical markers like p16INK4a for preselection of suspicious cells and specimens. We present a method to fully automatically acquire images at distinct positions on cytological specimens using a conventional computer-controlled microscope and an autofocus algorithm. From the images thus obtained, we automatically detect p16INK4a-positive objects. This detection is based on an analysis of the color distribution of the p16INK4a marker in the Lab color space. A Gaussian mixture model is used to describe this distribution, and the method described in this paper achieves a sensitivity of up to 90%.
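The color-model idea can be sketched as follows, assuming a set of labeled marker-pixel colors in Lab space is available. The paper fits a full Gaussian mixture; this minimal version uses a single Gaussian component and a Mahalanobis-distance threshold, and all names and values are illustrative.

```python
import numpy as np

# Hedged sketch of the color model: fit a Gaussian to the a*/b*
# chromaticity of known marker-positive pixels, then flag new pixels
# by Mahalanobis distance to that model.

def fit_color_model(pixels):
    """pixels: (N, 2) array of (a*, b*) values from labeled marker pixels."""
    mu = pixels.mean(axis=0)
    cov = np.cov(pixels, rowvar=False)
    return mu, np.linalg.inv(cov)

def is_marker_positive(pixel, mu, cov_inv, threshold=3.0):
    """True if the pixel lies within `threshold` Mahalanobis units."""
    d = pixel - mu
    return float(d @ cov_inv @ d) ** 0.5 < threshold

# synthetic "brown marker" training pixels around an illustrative Lab chroma
rng = np.random.default_rng(0)
brown = rng.normal([15.0, 25.0], 2.0, size=(500, 2))
mu, cov_inv = fit_color_model(brown)
print(is_marker_positive(np.array([15.0, 25.0]), mu, cov_inv))  # True
print(is_marker_positive(np.array([-40.0, 0.0]), mu, cov_inv))  # False
```

Extending this to a mixture simply replaces the single (mu, cov) pair with several components and takes the best-scoring one per pixel.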

  5. Fully automated segmentation of carotid and vertebral arteries from contrast enhanced CTA

    NASA Astrophysics Data System (ADS)

    Cuisenaire, Olivier; Virmani, Sunny; Olszewski, Mark E.; Ardon, Roberto

    2008-03-01

    We propose a method for segmenting and labeling the main head and neck vessels (common, internal, external carotid, vertebral) from a contrast enhanced computed tomography angiography (CTA) volume. First, an initial centerline of each vessel is extracted. Next, the vessels are segmented using 3D active objects initialized using the first step. Finally, the true centerline is identified by smoothly deforming it away from the segmented mask edges using a spline-snake. We focus particularly on the novel initial centerline extraction technique. It uses a locally adaptive front propagation algorithm that attempts to find the optimal path connecting the ends of the vessel, typically from the lowest image of the scan to the Circle of Willis in the brain. It uses a patient adapted anatomical model of the different vessels both to initialize and constrain this fast marching, thus eliminating the need for manual selection of seed points. The method is evaluated using data from multiple regions (USA, India, China, Israel) including a variety of scanners (10, 16, 40, 64-slice; Brilliance CT, Philips Healthcare, Cleveland, OH, USA), contrast agent dose, and image resolution. It is fully successful in over 90% of patients and only misses a single vessel in most remaining cases. We also demonstrate its robustness to metal and dental artifacts and anatomical variability. Total processing time is approximately two minutes with no user interaction, which dramatically improves the workflow over existing clinical software. It also reduces patient dose exposure by obviating the need to acquire an unenhanced scan for bone suppression as this can be done by applying the segmentation masks.

  6. A preliminary study for fully automated quantification of psoriasis severity using image mapping

    NASA Astrophysics Data System (ADS)

    Mukai, Kazuhiro; Iyatomi, Hitoshi

    2014-03-01

    Psoriasis is a common chronic skin disease that seriously detracts from patients' quality of life. Since there is no known permanent cure, controlling the disease condition is necessary, and quantification of its severity is therefore important. In clinical practice, the psoriasis area and severity index (PASI) is commonly used for this purpose; however, it is often subjective and troublesome. A fully automatic computer-assisted area and severity index (CASI) was previously proposed to provide an objective quantification of skin disease. It investigates the size and density of erythema based on digital image analysis, but it does not account for the inadequate effects caused by differing geometrical conditions during clinical follow-up (i.e., variability in direction and distance between camera and patient). In this study, we proposed an image alignment method for clinical images and investigated quantifying the severity of psoriasis under clinical follow-up combined with the idea of CASI. The proposed method finds geometrically corresponding points on the patient's body (ROI) between images using the Scale Invariant Feature Transform (SIFT) and performs an affine transform to map pixel values from one image onto the other. Clinical images from 7 patients with psoriasis lesions on the trunk under clinical follow-up were used; in each series, our algorithm aligned images to the geometry of the first image. The proposed method aligned images appropriately on visual assessment, and we confirmed that psoriasis areas were properly extracted using the CASI approach. Although PASI and CASI cannot be compared directly due to their different definitions of ROI, we confirmed a large correlation between the scores obtained with our image quantification method.
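The mapping step described above can be sketched as a least-squares fit of a 2×3 affine matrix to matched keypoint pairs. This is pure NumPy; keypoint detection and matching (SIFT plus a ratio test) are assumed to have been done upstream, and all function names are illustrative.

```python
import numpy as np

# Sketch: estimate the affine transform mapping follow-up image
# coordinates onto the baseline image from matched keypoint pairs.

def fit_affine(src, dst):
    """src, dst: (N, 2) matched points. Returns the 2x3 affine matrix A
    such that dst ~= [x, y, 1] @ A.T, solved by least squares."""
    ones = np.ones((len(src), 1))
    X = np.hstack([src, ones])                 # (N, 3) homogeneous coords
    sol, *_ = np.linalg.lstsq(X, dst, rcond=None)
    return sol.T                               # (2, 3)

def apply_affine(A, pts):
    """Apply a 2x3 affine matrix to (N, 2) points."""
    ones = np.ones((len(pts), 1))
    return np.hstack([pts, ones]) @ A.T

# synthetic check: recover a known rotation + translation exactly
theta = 0.1
true_A = np.array([[np.cos(theta), -np.sin(theta), 5.0],
                   [np.sin(theta),  np.cos(theta), -2.0]])
src = np.random.default_rng(1).uniform(0, 100, size=(20, 2))
dst = apply_affine(true_A, src)
A = fit_affine(src, dst)
print(np.allclose(A, true_A))  # True
```

In practice the fit would be wrapped in a robust estimator (e.g. RANSAC) to reject SIFT mismatches before the least-squares solve.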

  7. Fully automated software for mitral annulus evaluation in chronic mitral regurgitation by 3-dimensional transesophageal echocardiography

    PubMed Central

    Aquila, Iolanda; Fernández-Golfín, Covadonga; Rincon, Luis Miguel; González, Ariana; García Martín, Ana; Hinojar, Rocio; Jimenez Nacher, Jose Julio; Indolfi, Ciro; Zamorano, Jose Luis

    2016-01-01

    Abstract Three-dimensional (3D) transesophageal echocardiography (TEE) is the gold standard for mitral valve (MV) anatomic and functional evaluation. Currently, dedicated MV analysis software has limitations for its use in clinical practice. Thus, we tested here a complete and reproducible evaluation of a new fully automatic software to characterize MV anatomy in different forms of mitral regurgitation (MR) by 3D TEE. Sixty patients were included: 45 with more than moderate MR (28 organic MR [OMR] and 17 functional MR [FMR]) and 15 controls. All patients underwent TEE. 3D MV images obtained using 3D zoom were imported into the new software for automatic analysis. Different MV parameters were obtained and compared. Anatomic and dynamic differences between FMR and OMR were detected. A significant increase in systolic (859.75 vs 801.83 vs 607.78 mm2; P = 0.002) and diastolic (1040.60 vs. 1217.83 and 859.74 mm2; P < 0.001) annular sizes was observed in both OMR and FMR compared to that in controls. FMR had a reduced mitral annular contraction compared to degenerative cases of OMR and to controls (17.14% vs 32.78% and 29.89%; P = 0.007). Good reproducibility was demonstrated along with a short analysis time (mean 4.30 minutes). Annular characteristics and dynamics are abnormal in both FMR and OMR. Full 3D software analysis automatically calculates several significant parameters that provide a correct and complete assessment of anatomy and dynamic mitral annulus geometry and displacement in the 3D space. This analysis allows a better characterization of MR pathophysiology and could be useful in designing new devices for MR repair or replacement. PMID:27930514

  8. HATSouth: A Global Network of Fully Automated Identical Wide-Field Telescopes

    NASA Astrophysics Data System (ADS)

    Bakos, G. Á.; Csubry, Z.; Penev, K.; Bayliss, D.; Jordán, A.; Afonso, C.; Hartman, J. D.; Henning, T.; Kovács, G.; Noyes, R. W.; Béky, B.; Suc, V.; Csák, B.; Rabus, M.; Lázár, J.; Papp, I.; Sári, P.; Conroy, P.; Zhou, G.; Sackett, P. D.; Schmidt, B.; Mancini, L.; Sasselov, D. D.; Ueltzhoeffer, K.

    2013-02-01

    HATSouth is the world's first network of automated and homogeneous telescopes capable of year-round, 24-hour monitoring of positions over an entire hemisphere of the sky. The primary scientific goal of the network is to discover and characterize a large number of transiting extrasolar planets, reaching out to long periods and down to small planetary radii. HATSouth achieves this by monitoring extended areas of the sky, deriving high-precision light curves for a large number of stars, searching for the signature of planetary transits, and confirming planetary candidates with larger telescopes. HATSouth employs six telescope units spread over three prime locations with large longitude separation in the southern hemisphere (Las Campanas Observatory, Chile; HESS site, Namibia; Siding Spring Observatory, Australia). Each HATSouth unit holds four 0.18 m diameter, f/2.8 focal ratio telescope tubes on a common mount, producing an 8.2° × 8.2° field of view on the sky, imaged using four 4K × 4K CCD cameras and Sloan r filters to give a pixel scale of 3.7″ pixel⁻¹. The network is capable of continuously monitoring 128 square degrees at celestial positions moderately close to the anti-solar direction. We present the technical details of the network, summarize operations, and present detailed weather statistics for the three sites. Robust operations have meant that, on average, each of the six HATSouth units has conducted observations on ∼500 nights over a two-year period, yielding a total of more than 1 million science frames at 4-minute integration time and observing ∼10.65 hr day⁻¹ on average. We describe the scheme of our data transfer and reduction from raw pixel images to trend-filtered light curves and transiting planet candidates. Photometric precision reaches ∼6 mmag at 4-minute cadence for the brightest non-saturated stars at r ≈ 10.5. We present detailed transit recovery simulations to determine the expected yield of

  9. Fully automated determination of cannabinoids in hair samples using headspace solid-phase microextraction and gas chromatography-mass spectrometry.

    PubMed

    Musshoff, Frank; Junker, Heike P; Lachenmeier, Dirk W; Kroener, Lars; Madea, Burkhard

    2002-01-01

    This paper describes a fully automated procedure using alkaline hydrolysis and headspace solid-phase microextraction (HS-SPME) followed by on-fiber derivatization and gas chromatographic-mass spectrometric (GC-MS) detection of cannabinoids in human hair samples. Ten milligrams of hair was washed with deionized water, petroleum ether, and dichloromethane. After the addition of deuterated internal standards the sample was hydrolyzed with sodium hydroxide and directly submitted to HS-SPME. After absorption of analytes for an on-fiber derivatization procedure the fiber was directly placed into the headspace of a second vial containing N-methyl-N-trimethylsilyltrifluoroacetamide (MSTFA) before GC-MS analysis. The limit of detection was 0.05 ng/mg for delta9-tetrahydrocannabinol (THC), 0.08 ng/mg for cannabidiol (CBD), and 0.14 ng/mg for cannabinol (CBN). Absolute recoveries were in the range between 0.3 and 7.5%. Linearity was proved over a range from 0.1 to 20 ng/mg with coefficients of correlation from 0.998 to 0.999. Validation of the whole procedure revealed excellent results. In comparison with conventional methods of hair analysis this automated HS-SPME-GC-MS procedure is substantially faster. It is easy to perform without use of solvents and with minimal sample quantities, but with the same degree of sensitivity and reproducibility. The applicability was demonstrated by the analysis of 25 hair samples from several forensic cases. The following concentration ranges were determined: THC 0.29-2.20 (mean 1.7) ng/mg, CBN 0.55-4.54 (mean 1.2) ng/mg, and CBD 0.53-18.36 (mean 1.3) ng/mg. 11-nor-Delta9-tetrahydrocannabinol-9-carboxylic acid could not be detected with this method.

  10. A fully automated multi-modal computer aided diagnosis approach to coronary calcium scoring of MSCT images

    NASA Astrophysics Data System (ADS)

    Wu, Jing; Ferns, Gordon; Giles, John; Lewis, Emma

    2012-03-01

    Inter- and intra-observer variability is a common problem when an expert is tasked with assessing the severity of a disease. The issue is keenly felt in coronary calcium scoring of patients with atherosclerosis, where in clinical practice the observer must first identify the presence, and then the location, of candidate calcified plaques within the coronary arteries that may obstruct oxygenated blood flow to the heart muscle. It can be difficult for a human observer to differentiate calcified plaques located in the coronary arteries from those in surrounding anatomy such as the mitral valve or pericardium. Beyond the benefits to scoring accuracy, fast, low-dose multi-slice CT imaging can acquire the entire heart within a single breath hold, exposing the patient to a lower radiation dose; for a progressive disease such as atherosclerosis, where multiple scans may be required, this is beneficial to the patient's health. Presented here is a fully automated method for calcium scoring using both the traditional Agatston method and the volume scoring method. Unwanted regions of the cardiac image slices such as lungs, ribs, and vertebrae are eliminated using adaptive heart isolation; such regions cannot contain calcified plaques but can be of similar intensity, so their removal aids detection. Both the ascending and descending aortas, which contain clinically insignificant plaques, are removed before the final calcium scores are calculated and compared against ground-truth scores averaged over three expert observers. The results are intended to show the feasibility of, and need for, an automated scoring method to reduce the subjectivity and reproducibility error inherent in manual clinical calcium scoring.
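The traditional Agatston score mentioned above is, in essence, a weighted sum of per-slice lesion areas; a minimal sketch using the standard 130 HU threshold and density weights, with hypothetical lesion data (the slice thickness and lesion values are illustrative, not from the paper):

```python
def density_weight(peak_hu: float) -> int:
    """Standard Agatston density factor from a lesion's peak attenuation (HU)."""
    if peak_hu < 130:
        return 0  # below the calcium threshold
    if peak_hu < 200:
        return 1
    if peak_hu < 300:
        return 2
    if peak_hu < 400:
        return 3
    return 4

def agatston_score(lesions):
    """Sum over per-slice lesions of area (mm^2) times density weight."""
    return sum(area * density_weight(peak) for area, peak in lesions)

def volume_score(lesions, slice_mm=3.0):
    """Volume score: total calcified volume (mm^3), assuming 3 mm slices."""
    return sum(area * slice_mm for area, _ in lesions)

# Hypothetical lesions as (area in mm^2, peak HU) pairs:
lesions = [(10.0, 150), (4.0, 420)]
print(agatston_score(lesions))  # 10*1 + 4*4 = 26.0
print(volume_score(lesions))    # (10 + 4) * 3 = 42.0
```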

  11. Validation of Fully Automated VMAT Plan Generation for Library-Based Plan-of-the-Day Cervical Cancer Radiotherapy

    PubMed Central

    Breedveld, Sebastiaan; Voet, Peter W. J.; Heijkoop, Sabrina T.; Mens, Jan-Willem M.; Hoogeman, Mischa S.; Heijmen, Ben J. M.

    2016-01-01

    Purpose To develop and validate fully automated generation of VMAT plan libraries for plan-of-the-day adaptive radiotherapy in locally advanced cervical cancer. Material and Methods Our framework for fully automated treatment plan generation (Erasmus-iCycle) was adapted to create dual-arc VMAT treatment plan libraries for cervical cancer patients. For each of 34 patients, automatically generated VMAT plans (autoVMAT) were compared to manually generated, clinically delivered 9-beam IMRT plans (CLINICAL) and to dual-arc VMAT plans generated manually by an expert planner (manVMAT). Furthermore, all plans were benchmarked against 20-beam equi-angular IMRT plans (autoIMRT). For all plans, coverage of 99.5% of the PTV by at least 95% of the prescribed dose (46 Gy) had the highest planning priority, followed by minimization of V45Gy for the small bowel (SB). Other OARs considered were the bladder, rectum, and sigmoid. Results All plans had highly similar PTV coverage, within the clinical constraints stated above. After plan normalization for exactly equal median PTV doses in corresponding plans, all evaluated OAR parameters in autoVMAT plans were on average lower than in the CLINICAL plans, with an average reduction in SB V45Gy of 34.6% (p<0.001). For 41/44 autoVMAT plans, SB V45Gy was lower than for manVMAT (p<0.001, average reduction 30.3%), while SB V15Gy increased by 2.3% (p = 0.011). AutoIMRT reduced SB V45Gy by another 2.7% compared to autoVMAT and also yielded a 9.0% reduction in SB V15Gy (p<0.001), but with a prolonged delivery time. Differences between manVMAT and autoVMAT in bladder, rectal, and sigmoid doses were ≤1%. Improvements in SB dose with autoVMAT instead of manVMAT were larger for empty-bladder PTVs than for full-bladder PTVs, due to differences in the concavity of the PTVs. Conclusions The quality of automatically generated VMAT plans was superior to that of manually generated plans. Automatic VMAT plan generation for cervical cancer has been implemented in
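The dose-volume metrics compared above, such as small-bowel V45Gy and V15Gy, are simply the percentage of a structure's voxels at or above a dose threshold; a minimal sketch with invented voxel doses:

```python
def v_dose(structure_doses_gy, threshold_gy):
    """VxGy: percentage of the structure's voxels receiving >= threshold_gy."""
    hits = sum(d >= threshold_gy for d in structure_doses_gy)
    return 100.0 * hits / len(structure_doses_gy)

# Hypothetical small-bowel voxel doses (Gy), not the paper's data:
sb = [10.0, 20.0, 44.9, 45.0, 46.0, 50.0]
print(v_dose(sb, 45.0))  # 3 of 6 voxels -> 50.0
print(v_dose(sb, 15.0))  # 5 of 6 voxels -> ~83.3
```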

  12. Determination of 21 drugs in oral fluid using fully automated supported liquid extraction and UHPLC-MS/MS.

    PubMed

    Valen, Anja; Leere Øiestad, Åse Marit; Strand, Dag Helge; Skari, Ragnhild; Berg, Thomas

    2016-07-28

    Collection of oral fluid (OF) is easy and non-invasive compared to the collection of urine and blood, and interest in OF for drug screening and diagnostic purposes is increasing. A high-throughput ultra-high-performance liquid chromatography-tandem mass spectrometry method for determination of 21 drugs in OF, using fully automated 96-well plate supported liquid extraction for sample preparation, is presented. The method covers a selection of classic drugs of abuse, including amphetamines, cocaine, cannabis, opioids, and benzodiazepines. The method was fully validated for 200 μL OF/buffer mix using an Intercept OF sampling kit; validation included linearity, sensitivity, precision, accuracy, extraction recovery, matrix effects, stability, and carry-over. Inter-assay precision (RSD) and accuracy (relative error) were <15% and −13 to 5%, respectively, for all compounds at concentrations equal to or higher than the lower limit of quantification. Extraction recoveries were between 58 and 76% (RSD <8%), except for tetrahydrocannabinol and three 7-amino benzodiazepine metabolites, with recoveries between 23 and 33% (RSD 51-52% and 11-25%, respectively). Ion enhancement or ion suppression effects were observed for a few compounds; however, these were largely compensated for by the internal standards used. Deuterium-labelled and ¹³C-labelled internal standards were used for 8 and 11 of the compounds, respectively. In a comparison between Intercept and Quantisal OF kits, better recoveries and fewer matrix effects were observed for some compounds using Quantisal. The method is sensitive and robust for its purposes and has been used successfully since February 2015 for analysis of Intercept OF samples from 2600 cases over a 12-month period. Copyright © 2016 John Wiley & Sons, Ltd.

  13. Development and laboratory-scale testing of a fully automated online flow cytometer for drinking water analysis.

    PubMed

    Hammes, Frederik; Broger, Tobias; Weilenmann, Hans-Ulrich; Vital, Marius; Helbing, Jakob; Bosshart, Ulrich; Huber, Pascal; Odermatt, Res Peter; Sonnleitner, Bernhard

    2012-06-01

    Accurate and sensitive online detection tools would benefit both fundamental research and practical applications in aquatic microbiology. Here, we describe the development and testing of an online flow cytometer (FCM), with a specific use foreseen in the field of drinking water microbiology. The system incorporated fully automated sampling and fluorescent labeling of bacterial nucleic acids, with analysis at 5-min intervals for periods in excess of 24 h. Laboratory-scale testing showed sensitive detection (<5% error) of bacteria over a broad concentration range (1 × 10³ to 1 × 10⁶ cells mL⁻¹) and, in particular, the ability to track both gradual changes and dramatic events in water samples. The system was tested with bacterial pure cultures as well as indigenous microbial communities from natural water samples. Moreover, we demonstrated the possibility of using either a single fluorescent dye (e.g., SYBR Green I) or a combination of two dyes (SYBR Green I and propidium iodide), broadening the application possibilities of the system. The online FCM approach described herein has considerable potential for routine and continuous monitoring of drinking water, optimization of specific drinking water processes such as biofiltration or disinfection, and aquatic microbiology research in general.
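The quoted <5% counting error can be related to simple Poisson counting statistics (our interpretation, not a model stated in the abstract): the relative error of a count of N events scales as 1/√N, so roughly 400 or more cells must be counted per measurement.

```python
import math

def poisson_relative_error(n_events):
    """Relative counting error (as a fraction) for n Poisson-distributed events."""
    return 1.0 / math.sqrt(n_events)

# To stay below 5% error, at least ~400 events must be counted per sample:
print(poisson_relative_error(400))  # 0.05
# At the low end of the range (1e3 cells/mL), this implies analyzing
# on the order of 0.4 mL equivalent sample volume per measurement.
```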

  14. Fully automated dual-frequency three-pulse-echo 2DIR spectrometer accessing spectral range from 800 to 4000 wavenumbers.

    PubMed

    Leger, Joel D; Nyby, Clara M; Varner, Clyde; Tang, Jianan; Rubtsova, Natalia I; Yue, Yuankai; Kireev, Victor V; Burtsev, Viacheslav D; Qasim, Layla N; Rubtsov, Grigory I; Rubtsov, Igor V

    2014-08-01

    A novel dual-frequency two-dimensional infrared instrument is designed and built that permits three-pulse heterodyned echo measurements of any cross-peak within a spectral range from 800 to 4000 cm⁻¹ to be performed in a fully automated fashion. The superior sensitivity of the instrument is achieved by a combination of spectral interferometry, phase cycling, and closed-loop phase stabilization accurate to ~70 as. An anharmonicity of smaller than 10⁻⁴ cm⁻¹ was recorded for strong carbonyl stretching modes using 800 laser shot accumulations. The novel design of the phase stabilization scheme permits tuning the polarizations of the mid-infrared (m-IR) pulses, thus supporting measurements of the angles between vibrational transition dipoles. Automatic frequency tuning is achieved by implementing beam direction stabilization schemes for each m-IR beam, providing better than 50 μrad beam stability, and a novel scheme for setting the phase-matching geometry of the m-IR beams at the sample. The errors in the cross-peak amplitudes associated with imperfect phase-matching conditions and alignment are at the level of 20%. The instrument can be used by non-specialists in ultrafast spectroscopy.

  15. Fully automated dual-frequency three-pulse-echo 2DIR spectrometer accessing spectral range from 800 to 4000 wavenumbers

    SciTech Connect

    Leger, Joel D.; Nyby, Clara M.; Varner, Clyde; Tang, Jianan; Rubtsova, Natalia I.; Yue, Yuankai; Kireev, Victor V.; Burtsev, Viacheslav D.; Qasim, Layla N.; Rubtsov, Igor V.; Rubtsov, Grigory I.

    2014-08-15

    A novel dual-frequency two-dimensional infrared instrument is designed and built that permits three-pulse heterodyned echo measurements of any cross-peak within a spectral range from 800 to 4000 cm⁻¹ to be performed in a fully automated fashion. The superior sensitivity of the instrument is achieved by a combination of spectral interferometry, phase cycling, and closed-loop phase stabilization accurate to ∼70 as. An anharmonicity of smaller than 10⁻⁴ cm⁻¹ was recorded for strong carbonyl stretching modes using 800 laser shot accumulations. The novel design of the phase stabilization scheme permits tuning the polarizations of the mid-infrared (m-IR) pulses, thus supporting measurements of the angles between vibrational transition dipoles. Automatic frequency tuning is achieved by implementing beam direction stabilization schemes for each m-IR beam, providing better than 50 μrad beam stability, and a novel scheme for setting the phase-matching geometry of the m-IR beams at the sample. The errors in the cross-peak amplitudes associated with imperfect phase-matching conditions and alignment are at the level of 20%. The instrument can be used by non-specialists in ultrafast spectroscopy.

  16. Fully-automated radiosynthesis and in vitro uptake investigation of [N-methyl-¹¹C]methylene blue.

    PubMed

    Schweiger, Lutz F; Smith, Tim A D

    2013-10-01

    Malignant melanoma is a type of skin cancer that can spread rapidly if not detected and treated early. Positron Emission Tomography (PET) is a powerful imaging technique for detecting cancer, but with only a limited number of radiotracers available, the development of novel PET probes for the detection and prevention of cancer is imperative. In the present study we report the fully-automated radiosynthesis of [N-methyl-¹¹C]methylene blue and an in vitro uptake study in metastatic melanoma cell lines. Using the GE TRACERlab FXc Pro module, [N-methyl-¹¹C]methylene blue was isolated via solid-phase extraction in an average time of 36 min after end of bombardment and formulated with a radiochemical purity greater than 95%. The in vitro uptake study of [N-methyl-¹¹C]methylene blue in the SK-MEL28 melanin-expressing melanoma cell line demonstrated site-specific binding of 51%, supporting its promise as a melanoma PET imaging agent.

  17. Development of a Fully Automated, GPS Based Monitoring System for Disaster Prevention and Emergency Preparedness: PPMS+RT

    PubMed Central

    Bond, Jason; Kim, Don; Chrzanowski, Adam; Szostak-Chrzanowski, Anna

    2007-01-01

    The increasing number of structural collapses, slope failures and other natural disasters has led to a demand for new sensors, sensor integration techniques and data processing strategies for deformation monitoring systems. In order to meet the extraordinary accuracy requirements for displacement detection in recent deformation monitoring projects, research has been devoted to integrating the Global Positioning System (GPS) as a monitoring sensor. Although GPS has been used for monitoring purposes worldwide, certain environments pose challenges where conventional processing techniques cannot provide the required accuracy with sufficient update frequency. Described here is the development of a fully automated, continuous, real-time monitoring system that employs GPS sensors and pseudolite technology to meet these requirements in such environments. Ethernet and/or serial port communication techniques are used to transfer data between GPS receivers at target points and a central processing computer. The data can be processed locally or remotely based upon client needs. A test demonstrated that a 10 mm displacement at a target point could be remotely detected using the designed system. This information could then be used to signal an alarm if conditions are deemed unsafe.
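The displacement-and-alarm logic described, e.g. the remotely detected 10 mm test displacement, reduces to comparing the Euclidean distance between baseline and current coordinates against a threshold; the coordinates and threshold below are illustrative assumptions, not values from the paper:

```python
import math

def displacement_mm(baseline, current):
    """3D Euclidean displacement between two (x, y, z) positions in mm."""
    return math.dist(baseline, current)

def check_alarm(baseline, current, threshold_mm=10.0):
    """True if the monitored target point has moved beyond the alarm threshold."""
    return displacement_mm(baseline, current) >= threshold_mm

base = (0.0, 0.0, 0.0)
now = (6.0, 8.0, 0.0)          # a 10 mm horizontal displacement
print(check_alarm(base, now))  # True
```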

  18. Mutation Profile of B-Raf Gene Analyzed by fully Automated System and Clinical Features in Japanese Melanoma Patients.

    PubMed

    Ide, Masaru; Koba, Shinichi; Sueoka-Aragane, Naoko; Sato, Akemi; Nagano, Yuri; Inoue, Takuya; Misago, Noriyuki; Narisawa, Yutaka; Kimura, Shinya; Sueoka, Eisaburo

    2017-01-01

    BRAF gene mutations have been observed in 30-50% of malignant melanoma patients. The recent development of therapeutic intervention using BRAF inhibitors requires an accurate and rapid detection system for BRAF mutations. In addition, the clinical characteristics of melanoma associated with BRAF mutations in Japanese patients have not been investigated in a large-scale evaluation. We recently established a quenching probe (QP) system for detection of the activating BRAF mutation V600E, and evaluated 113 melanoma samples diagnosed at Saga University Hospital from 1982 to 2011. The QP system provides fully automated genotyping based on analysis of the melting curve of a fluorescent guanine-quenched probe DNA, which binds the target mutated site. BRAF mutations were detected in 54 of 115 (47%) Japanese melanoma cases, comprising 51 V600E and 3 V600K. Among clinical subtypes of melanoma, nodular melanoma showed a high frequency of mutation (12 of 15; 80%), followed by superficial spreading melanoma (13 of 26; 50%). The QP system is a simple and sensitive method to determine the BRAF V600E mutation, and will be a useful tool for patient-oriented therapy with BRAF inhibitors.

  19. KRAS detection on archival cytological smears by the novel fully automated polymerase chain reaction-based Idylla mutation test

    PubMed Central

    De Luca, Caterina; Vigliar, Elena; d’Anna, Melania; Pisapia, Pasquale; Bellevicine, Claudio; Malapelle, Umberto; Troncone, Giancarlo

    2017-01-01

    Background: Molecular techniques are relevant to modern cytopathology, but their implementation is difficult without molecular expertise and infrastructure. The assessment of KRAS mutational status on cytological preparations may be useful either to refine uncertain diagnoses on pancreatic aspirates or to yield predictive information to plan targeted treatment of metastatic colorectal cancer (mCRC). The novel test Idylla™ enables fully automated KRAS genotyping in approximately 2 h, even in less experienced hands. Materials and Methods: This study aims to validate this methodology to detect KRAS mutations on archival cytological preparations of pancreatic cancer (n = 9) and mCRC (n = 9) by comparing the Idylla™ performance to that of standard real-time polymerase chain reaction. Results: The same 11 mutations (n = 4: p.G12D; n = 2: p.G12V; n = 2: p.A59E/G/T; n = 1: p.G12R; n = 1: p.G13D; n = 1: p.Q61H) were detected by both techniques. Conclusion: Even in less experienced laboratories, a cytopathologist may easily integrate morphological diagnostic report with accurate KRAS mutation detection, which is relevant for diagnostic and treatment decisions. PMID:28331530

  20. Validation of a fully automated robotic setup for preparation of whole blood samples for LC-MS toxicology analysis.

    PubMed

    Andersen, David; Rasmussen, Brian; Linnet, Kristian

    2012-05-01

    A fully automated setup was developed for preparing whole blood samples using a Tecan Evo workstation. By integrating several add-ons to the robotic platform, the flexible setup was able to prepare samples from sample tubes to a 96-well sample plate ready for injection on liquid chromatography-mass spectrometry using several preparation techniques, including protein precipitation, solid-phase extraction and centrifugation, without any manual intervention. Pipetting of a known aliquot of whole blood was achieved by integrating a balance and performing gravimetric measurements. The system was able to handle 1,073 of 1,092 (98.3%) samples of whole blood from forensic material, including postmortem samples, without any need to repeat sample preparation. Only three samples required special treatment such as dilution. The addition of internal and calibration standards was validated by pipetting a solution of Orange G and measuring the weight and absorbance. Internal standard (20 µL) was added in a multi-pipetting sequence with an accuracy of 99.9% and an imprecision (coefficient of variation) of 1.6%. Calibration standards were added with high accuracy at volumes as low as 6.00 µL (±0.21 µL). The general setup of the offline sample preparation and key validation parameters of a quantitative analysis of Δ⁹-tetrahydrocannabinol are presented.
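The accuracy and imprecision figures quoted for the standard additions (99.9% accuracy, 1.6% CV at 20 µL) are standard mean-versus-target and coefficient-of-variation statistics; a sketch with invented replicate volumes (not the paper's data):

```python
import statistics

def accuracy_pct(measured, target):
    """Mean measured value as a percentage of the target volume."""
    return 100.0 * statistics.mean(measured) / target

def cv_pct(measured):
    """Imprecision expressed as coefficient of variation (%)."""
    return 100.0 * statistics.stdev(measured) / statistics.mean(measured)

# Hypothetical replicate gravimetric volumes (uL) for a nominal 20 uL dispense:
reps = [19.9, 20.1, 20.0, 19.8, 20.2]
print(round(accuracy_pct(reps, 20.0), 1))  # 100.0
print(round(cv_pct(reps), 1))              # sub-percent CV for this spread
```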

  1. Fully automated measurement of field-dependent AMS using MFK1-FA Kappabridge equipped with 3D rotator

    NASA Astrophysics Data System (ADS)

    Chadima, Martin; Studynka, Jan

    2013-04-01

    Low-field magnetic susceptibility is, by definition, field-independent in paramagnetic and diamagnetic minerals, and it is also field-independent in pure magnetite. On the other hand, in pyrrhotite, hematite and high-Ti titanomagnetite it may be clearly field-dependent. Consequently, field-dependent AMS enables the magnetic fabric of the latter group of minerals to be separated from the whole-rock AMS. Methods for determining field-dependent AMS consist of separate measurements of each specimen in several fields within the Rayleigh Law range and subsequent processing in which the field-independent and field-dependent AMS components are calculated. The disadvantage of this technique is that each specimen must be measured several times, which is relatively laborious and time-consuming. Recently, a new 3D rotator was developed for the MFK1-FA Kappabridge, which rotates the specimen simultaneously about two axes with different velocities. The measurement is fully automated in such a way that, once the specimen is inserted into the rotator, no additional manipulation is required to measure the full AMS tensor. Consequently, the 3D rotator enables the AMS tensors to be measured at preset field intensities without any operator interference. The whole procedure is controlled by the newly developed Safyr5 software; once the measurements are finished, the acquired data are immediately processed and can be visualized in a standard way.

  2. A device for fully automated on-site process monitoring and control of trihalomethane concentrations in drinking water.

    PubMed

    Brown, Aaron W; Simone, Paul S; York, J C; Emmert, Gary L

    2015-01-01

    An instrument designed for fully automated on-line monitoring of trihalomethane concentrations in chlorinated drinking water is presented. The patented capillary membrane sampling device automatically samples directly from a water tap, followed by injection of the sample into a gas chromatograph equipped with a nickel-63 electron capture detector. Detailed studies using individual trihalomethane species exhibited method detection limits ranging from 0.01 to 0.04 μg L⁻¹. Mean percent recoveries ranged from 77.1 to 86.5%, with percent relative standard deviation values ranging from 1.2 to 4.6%. Of more than 5200 samples analyzed, 95% had detectable and 86.5% had quantifiable concentrations. The failure rate was less than 2%. Using data from the instrument, two different treatment processes were optimized so that total trihalomethane concentrations were maintained at acceptable levels while reducing treatment costs significantly. This ongoing trihalomethane monitoring program has been operating for more than ten months and has produced the longest continuous and most finely time-resolved data on trihalomethane concentrations reported in the literature.

  3. Fully-automated in-syringe dispersive liquid-liquid microextraction for the determination of caffeine in coffee beverages.

    PubMed

    Frizzarin, Rejane M; Maya, Fernando; Estela, José M; Cerdà, Víctor

    2016-12-01

    A novel fully-automated magnetic stirring-assisted lab-in-syringe analytical procedure has been developed for the fast and efficient dispersive liquid-liquid microextraction (DLLME) of caffeine in coffee beverages. The procedure is based on the microextraction of caffeine with a minute amount of dichloromethane, isolating caffeine from the sample matrix with no further sample pretreatment. The relevant extraction parameters, such as the dispersive solvent, the proportion of aqueous/organic phase, pH and flow rates, were carefully evaluated. Caffeine quantification was linear from 2 to 75 mg L⁻¹, with detection and quantification limits of 0.46 mg L⁻¹ and 1.54 mg L⁻¹, respectively. A coefficient of variation (n=8; 5 mg L⁻¹) of 2.1% and a sampling rate of 16 h⁻¹ were obtained. The procedure was satisfactorily applied to the determination of caffeine in brewed, instant and decaf coffee samples, and the results were validated against high-performance liquid chromatography.
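Detection and quantification limits such as those above are commonly estimated from a least-squares calibration line as 3.3·s/slope and 10·s/slope, where s is the residual standard error; whether the authors used this exact convention is our assumption, and the calibration data below are fabricated for illustration:

```python
def ols(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

def residual_sd(x, y, slope, intercept):
    """Residual standard error of the calibration fit (n - 2 dof)."""
    resid = [yi - (slope * xi + intercept) for xi, yi in zip(x, y)]
    return (sum(r * r for r in resid) / (len(x) - 2)) ** 0.5

# Fabricated caffeine calibration: concentration (mg/L) vs detector signal
x = [2, 5, 10, 25, 50, 75]
y = [4.1, 10.2, 19.8, 50.3, 99.6, 150.4]

slope, intercept = ols(x, y)
s = residual_sd(x, y, slope, intercept)
lod = 3.3 * s / slope   # limit of detection, mg/L
loq = 10.0 * s / slope  # limit of quantification, mg/L
print(f"LOD ~ {lod:.2f} mg/L, LOQ ~ {loq:.2f} mg/L")
```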

  4. Fully automated Liquid Extraction-Based Surface Sampling and Ionization Using a Chip-Based Robotic Nanoelectrospray Platform

    SciTech Connect

    Kertesz, Vilmos; Van Berkel, Gary J

    2010-01-01

    A fully automated liquid extraction-based surface sampling device utilizing an Advion NanoMate chip-based infusion nanoelectrospray ionization system is reported. Analyses were enabled for discrete spot sampling by using the Advanced User Interface of the current commercial control software. This software interface provided the parameter control necessary for the NanoMate robotic pipettor to both form and withdraw a liquid microjunction for sampling from a surface. The system was tested with three types of analytically important sample surface types, viz., spotted sample arrays on a MALDI plate, dried blood spots on paper, and whole-body thin tissue sections from drug dosed mice. The qualitative and quantitative data were consistent with previous studies employing other liquid extraction-based surface sampling techniques. The successful analyses performed here utilized the hardware and software elements already present in the NanoMate system developed to handle and analyze liquid samples. Implementation of an appropriate sample (surface) holder, a solvent reservoir, faster movement of the robotic arm, finer control over solvent flow rate when dispensing and retrieving the solution at the surface, and the ability to select any location on a surface to sample from would improve the analytical performance and utility of the platform.

  5. Validation of TrichoScan technology as a fully-automated tool for evaluation of hair growth parameters.

    PubMed

    Gassmueller, Johannes; Rowold, Elisabeth; Frase, Thomas; Hughes-Formella, Betsy

    2009-01-01

    There is a need for a simple and reliable tool to evaluate hair loss and treatment effects in patients suffering from alopecia. In 2001, TrichoScan was introduced as a fully-automated method for the measurement of biological parameters of hair growth such as density, diameter and growth rate. However, the conventional phototrichogram method, with manual marking of hairs on images, is still performed and, although no true independent side-by-side comparison is available, the manual method is sometimes regarded as the most precise method of measurement. The aim of this study was to validate the TrichoScan method by comparing TrichoScan analysis with manual marking of hairs. Digital images were taken from 10 patients with androgenetic alopecia (AGA), and the validity and reliability of both methods were assessed. The study showed an excellent correlation between TrichoScan and manual marking of hairs. Considerable variability was noted in the results from manually evaluated images (range 2.71%-12.95%), compared to none in TrichoScan-analyzed images. Results with TrichoScan were obtained more quickly and were more reproducible, with a smaller margin of operator error. The consistency of the TrichoScan data allows statistically significant results to be obtained with a smaller sample size.

  6. A Closed-Loop Proportional-Integral (PI) Control Software for Fully Mechanically Controlled Automated Electron Microscopic Tomography

    SciTech Connect

    REN, GANG; LIU, JINXIN; LI, HONGCHANG; CHEN, XUEFENG

    2016-06-23

    A closed-loop proportional-integral (PI) control software is provided for fully mechanically controlled automated electron microscopic tomography. The software is developed based on Gatan DigitalMicrograph and is compatible with the Zeiss LIBRA 120 transmission electron microscope; it can be extended to other TEM instruments with modification. The software consists of a graphical user interface, a digital PI controller, an image analyzing unit, and other drive units (i.e., an image acquisition unit and a goniometer drive unit). During a tomography data collection process, the image analyzing unit analyzes both the accumulated shift and the defocus value of the latest acquired image, and provides the results to the digital PI controller. The digital PI controller compares the results with the preset values and determines the optimum adjustments of the goniometer. The goniometer drive unit adjusts the spatial position of the specimen according to the instructions given by the digital PI controller for the next tilt angle and image acquisition, achieving high-precision positioning by using a backlash elimination method. The major benefits of the software are: 1) the goniometer drive unit keeps pre-aligned/optimized beam conditions unchanged and achieves position tracking solely through mechanical control; 2) the image analyzing unit relies only on historical data and therefore does not require additional images/exposures; 3) the PI controller enables the system to dynamically track the imaging target with extremely low system error.
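The digital PI controller at the heart of this software can be illustrated with a minimal discrete-time sketch; this is our illustration (the gains, units and shift signal are placeholder assumptions), not the actual DigitalMicrograph script:

```python
class DigitalPI:
    """Minimal discrete-time proportional-integral controller (illustrative sketch)."""

    def __init__(self, kp: float, ki: float, dt: float):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def update(self, setpoint: float, measured: float) -> float:
        """Return the control output for one sampling step."""
        error = setpoint - measured
        self.integral += error * self.dt          # accumulate the integral term
        return self.kp * error + self.ki * self.integral

# Drive a measured image shift (placeholder units: nm) back toward zero:
pi = DigitalPI(kp=0.5, ki=0.1, dt=1.0)
correction = pi.update(setpoint=0.0, measured=-40.0)
print(correction)  # 0.5*40 + 0.1*40 = 24.0
```

In the scheme described above, this output would become the goniometer adjustment applied before the next tilt angle and exposure.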

  7. Fully-Automated μMRI Morphometric Phenotyping of the Tc1 Mouse Model of Down Syndrome.

    PubMed

    Powell, Nick M; Modat, Marc; Cardoso, M Jorge; Ma, Da; Holmes, Holly E; Yu, Yichao; O'Callaghan, James; Cleary, Jon O; Sinclair, Ben; Wiseman, Frances K; Tybulewicz, Victor L J; Fisher, Elizabeth M C; Lythgoe, Mark F; Ourselin, Sébastien

    We describe a fully automated pipeline for the morphometric phenotyping of mouse brains from μMRI data, and show its application to the Tc1 mouse model of Down syndrome, to identify new morphological phenotypes in the brain of this first transchromosomic animal carrying human chromosome 21. We incorporate an accessible approach for simultaneously scanning multiple ex vivo brains, requiring only a 3D-printed brain holder, and novel image processing steps for their separation and orientation. We employ clinically established multi-atlas techniques (superior to single-atlas methods) together with publicly-available atlas databases for automatic skull-stripping and tissue segmentation, providing high-quality, subject-specific tissue maps. We follow these steps with group-wise registration, structural parcellation and both Voxel- and Tensor-Based Morphometry, advantageous for their ability to highlight morphological differences without the laborious delineation of regions of interest. We show the application of freely available open-source software developed for clinical MRI analysis to mouse brain data: NiftySeg for segmentation and NiftyReg for registration, and discuss atlases and parameters suitable for the preclinical paradigm. We used this pipeline to compare 29 Tc1 brains with 26 wild-type littermate controls, imaged ex vivo at 9.4T. We show an unexpected increase in Tc1 total intracranial volume and, controlling for this, local volume and grey matter density reductions in the Tc1 brain compared to the wild-types, most prominently in the cerebellum, in agreement with human DS and previous histological findings.

  8. Real-time direct cell concentration and viability determination using a fully automated microfluidic platform for standalone process monitoring.

    PubMed

    Nunes, P S; Kjaerulff, S; Dufva, M; Mogensen, K B

    2015-06-21

    The industrial production of cells has a large unmet need for greater process monitoring, beyond the standard determination of temperature, pH and oxygen concentration. Monitoring cell health with the vast range of fluorescence cell-based assays can greatly improve feedback control and thereby ensure optimal cell production, by prolonging the fermentation cycle and increasing the bioreactor output. In this work, we report on the development of a fully automated microfluidic system capable of extracting samples directly from a bioreactor, diluting the sample, staining the cells, and determining the total and dead cell concentrations, within a time frame of 10.3 min. The platform consists of custom-made stepper-motor-actuated peristaltic pumps and valves, fluidic interconnections, sample-to-waste liquid management and image cytometry-based detection. The total concentration of cells is determined by brightfield microscopy, while fluorescence detection is used to detect propidium iodide-stained non-viable cells. This method can be incorporated into facilities with bioreactors to monitor cell concentration and viability during the cultivation process. Here, we demonstrate the performance of the microfluidic system by monitoring in real time the concentration and viability of yeast extracted directly from an in-house built bioreactor. This is the first demonstration of using the Dean drag force, generated by a curved microchannel geometry in conjunction with high flow rates, to promote passive mixing of cell samples and thus homogenization of the diluted cell plug. The autonomous operation of the fluidics furthermore allows the implementation of intelligent protocols for handling air bubbles arriving from the bioreactor, so that they are guided away from the imaging region, significantly improving both the robustness of the system and the quality of the data.
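The two readouts described, total count from brightfield imaging and dead count from propidium iodide fluorescence, combine into concentration and viability as sketched below; the counts, imaged volume and dilution factor are hypothetical values for illustration:

```python
def concentration_per_ml(count, imaged_volume_ul, dilution_factor):
    """Cell concentration in the original sample (cells/mL), correcting for dilution."""
    return count / (imaged_volume_ul * 1e-3) * dilution_factor

def viability_pct(total_count, pi_positive_count):
    """Viable fraction: cells not stained by propidium iodide, as a percentage."""
    return 100.0 * (total_count - pi_positive_count) / total_count

# Hypothetical counts from one measurement cycle:
total, dead = 2000, 150  # brightfield vs PI-positive events
conc = concentration_per_ml(total, imaged_volume_ul=0.5, dilution_factor=100)
print(f"{conc:.2e} cells/mL")      # 4.00e+08
print(viability_pct(total, dead))  # 92.5
```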

  9. Fully-Automated μMRI Morphometric Phenotyping of the Tc1 Mouse Model of Down Syndrome

    PubMed Central

    Modat, Marc; Cardoso, M. Jorge; Ma, Da; Holmes, Holly E.; Yu, Yichao; O’Callaghan, James; Cleary, Jon O.; Sinclair, Ben; Wiseman, Frances K.; Tybulewicz, Victor L. J.; Fisher, Elizabeth M. C.; Lythgoe, Mark F.; Ourselin, Sébastien

    2016-01-01

We describe a fully automated pipeline for the morphometric phenotyping of mouse brains from μMRI data, and show its application to the Tc1 mouse model of Down syndrome, to identify new morphological phenotypes in the brain of this first transchromosomic animal carrying human chromosome 21. We incorporate an accessible approach for simultaneously scanning multiple ex vivo brains, requiring only a 3D-printed brain holder, and novel image processing steps for their separation and orientation. We employ clinically established multi-atlas techniques (superior to single-atlas methods) together with publicly-available atlas databases for automatic skull-stripping and tissue segmentation, providing high-quality, subject-specific tissue maps. We follow these steps with group-wise registration, structural parcellation and both Voxel- and Tensor-Based Morphometry, advantageous for their ability to highlight morphological differences without the laborious delineation of regions of interest. We show the application of freely available open-source software developed for clinical MRI analysis to mouse brain data: NiftySeg for segmentation and NiftyReg for registration, and discuss atlases and parameters suitable for the preclinical paradigm. We used this pipeline to compare 29 Tc1 brains with 26 wild-type littermate controls, imaged ex vivo at 9.4T. We show an unexpected increase in Tc1 total intracranial volume and, controlling for this, local volume and grey matter density reductions in the Tc1 brain compared to the wild-types, most prominently in the cerebellum, in agreement with human DS and previous histological findings. PMID:27658297
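
The multi-atlas idea at the core of the pipeline can be illustrated with the simplest fusion rule: voxel-wise majority voting over co-registered atlas label maps. This is a hedged sketch only; the actual pipeline uses NiftyReg/NiftySeg with more sophisticated registration and fusion, and the toy 2×2 label maps below are invented for illustration.

```python
# Majority-vote label fusion over co-registered atlas label maps (toy data).
from collections import Counter

def majority_vote(label_maps):
    """Fuse label maps (nested lists of equal shape) by voxel-wise voting."""
    rows, cols = len(label_maps[0]), len(label_maps[0][0])
    fused = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            votes = Counter(m[r][c] for m in label_maps)
            fused[r][c] = votes.most_common(1)[0][0]  # most frequent label
    return fused

atlas_a = [[0, 1], [1, 2]]
atlas_b = [[0, 1], [2, 2]]
atlas_c = [[0, 2], [1, 2]]
print(majority_vote([atlas_a, atlas_b, atlas_c]))  # [[0, 1], [1, 2]]
```

Real multi-atlas methods weight each atlas by local image similarity or use probabilistic fusion (e.g. STAPLE-style estimators), which is why they outperform any single atlas.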

  10. Evaluation of the Q analyzer, a new cap-piercing fully automated coagulometer with clotting, chromogenic, and immunoturbidometric capability.

    PubMed

    Kitchen, Steve; Woolley, Anita

    2013-01-01

The Q analyzer is a recently launched fully automated photo-optical analyzer equipped with primary tube cap-piercing and capable of clotting, chromogenic, and immunoturbidometric tests. The purpose of the present study was to evaluate the performance characteristics of the Q analyzer with reagents from the instrument manufacturer. We assessed precision and throughput when performing coagulation screening tests: prothrombin time (PT)/international normalized ratio (INR), activated partial thromboplastin time (APTT), and fibrinogen by the Clauss assay. We compared results with established reagent-instrument combinations in widespread use. Precision of PT/INR and APTT was acceptable as indicated by total precision of around 3%. The time to first result was 3 min for an INR and 5 min for PT/APTT. The system produced 115 completed samples per hour when processing only INRs and 60 samples (120 results) per hour for PT/APTT combined. The sensitivity of the DG-APTT Synth/Q method to mild deficiency of factor VIII (FVIII), IX, and XI was excellent (as indicated by APTTs being prolonged above the upper limit of the reference range). The Q analyzer was associated with high precision, acceptable throughput, and good reliability. When used in combination with DG-PT reagent and the manufacturer's instrument-specific international sensitivity index, the INRs obtained were accurate. The Q analyzer with DG-APTT Synth reagent demonstrated good sensitivity to isolated mild deficiency of FVIII, IX, and XI and had the advantage of relative insensitivity to mild FXII deficiency. Taken together, our data indicate that the Q hemostasis analyzer was suitable for routine use in combination with the reagents evaluated.
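
For context, the INR mentioned above is derived from the measured PT with the standard WHO formula, using the reagent/instrument-specific international sensitivity index (ISI). The clotting times in this sketch are made up.

```python
# Standard WHO INR formula: INR = (patient PT / mean normal PT) ** ISI.
# The mean normal PT is the geometric mean PT of healthy donors for the
# same reagent/instrument combination. Example times are illustrative.

def inr(pt_patient_s, mean_normal_pt_s, isi):
    """International normalized ratio from a prothrombin time in seconds."""
    return (pt_patient_s / mean_normal_pt_s) ** isi

print(round(inr(24.0, 12.0, 1.0), 2))  # 2.0
print(round(inr(24.0, 12.0, 1.2), 2))  # 2.3 -- same PT, higher-ISI reagent
```

This is why an instrument-specific ISI matters: the same raw PT ratio maps to different INRs depending on reagent sensitivity.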

  11. Multicenter evaluation of fully automated BACTEC Mycobacteria Growth Indicator Tube 960 system for susceptibility testing of Mycobacterium tuberculosis.

    PubMed

    Bemer, Pascale; Palicova, Frantiska; Rüsch-Gerdes, Sabine; Drugeon, Henri B; Pfyffer, Gaby E

    2002-01-01

    The reliability of the BACTEC Mycobacteria Growth Indicator Tube (MGIT) 960 system for testing of Mycobacterium tuberculosis susceptibility to the three front-line drugs (isoniazid [INH], rifampin [RIF], and ethambutol [EMB]) plus streptomycin (STR) was compared to that of the BACTEC 460 TB system. The proportion method was used to resolve discrepant results by an independent arbiter. One hundred and ten strains were tested with an overall agreement of 93.5%. Discrepant results were obtained for seven strains (6.4%) with INH (resistant by BACTEC MGIT 960; susceptible by BACTEC 460 TB), for one strain (0.9%) with RIF (resistant by BACTEC MGIT 960; susceptible by BACTEC 460 TB), for seven strains (6.4%) with EMB (six resistant by BACTEC MGIT 960 and susceptible by BACTEC 460 TB; one susceptible by BACTEC MGIT 960 and resistant by BACTEC 460 TB), and for 19 strains (17.3%) with STR (resistant by BACTEC MGIT 960 and susceptible by BACTEC 460 TB). After resolution of discrepant results, the sensitivity of the BACTEC MGIT 960 system was 100% for all four drugs and specificity ranged from 89.8% for STR to 100% for RIF. Turnaround times were 4.6 to 11.7 days (median, 6.5 days) for BACTEC MGIT 960 and 4.0 to 10.0 days (median, 7.0 days) for BACTEC 460 TB. These data demonstrate that the fully automated and nonradiometric BACTEC MGIT 960 system is an accurate method for rapid susceptibility testing of M. tuberculosis.
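
The sensitivity and specificity figures above follow the usual definitions once the arbiter (proportion method) fixes the true resistance status of each strain. A hedged sketch of that arithmetic; the counts below are illustrative, not the study's actual per-drug tallies.

```python
# Sensitivity/specificity of a test method against an arbiter-resolved
# reference. Counts are hypothetical examples, not data from the study.

def sensitivity(true_pos, false_neg):
    """% of truly resistant strains called resistant by the test method."""
    return 100.0 * true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    """% of truly susceptible strains called susceptible by the test method."""
    return 100.0 * true_neg / (true_neg + false_pos)

# Example: all 20 resistant strains detected; 88 of 98 susceptible strains
# correctly called susceptible (hypothetical numbers).
print(sensitivity(20, 0), round(specificity(88, 10), 1))  # 100.0 89.8
```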

  12. Multicenter Evaluation of Fully Automated BACTEC Mycobacteria Growth Indicator Tube 960 System for Susceptibility Testing of Mycobacterium tuberculosis

    PubMed Central

    Bemer, Pascale; Palicova, Frantiska; Rüsch-Gerdes, Sabine; Drugeon, Henri B.; Pfyffer, Gaby E.

    2002-01-01

    The reliability of the BACTEC Mycobacteria Growth Indicator Tube (MGIT) 960 system for testing of Mycobacterium tuberculosis susceptibility to the three front-line drugs (isoniazid [INH], rifampin [RIF], and ethambutol [EMB]) plus streptomycin (STR) was compared to that of the BACTEC 460 TB system. The proportion method was used to resolve discrepant results by an independent arbiter. One hundred and ten strains were tested with an overall agreement of 93.5%. Discrepant results were obtained for seven strains (6.4%) with INH (resistant by BACTEC MGIT 960; susceptible by BACTEC 460 TB), for one strain (0.9%) with RIF (resistant by BACTEC MGIT 960; susceptible by BACTEC 460 TB), for seven strains (6.4%) with EMB (six resistant by BACTEC MGIT 960 and susceptible by BACTEC 460 TB; one susceptible by BACTEC MGIT 960 and resistant by BACTEC 460 TB), and for 19 strains (17.3%) with STR (resistant by BACTEC MGIT 960 and susceptible by BACTEC 460 TB). After resolution of discrepant results, the sensitivity of the BACTEC MGIT 960 system was 100% for all four drugs and specificity ranged from 89.8% for STR to 100% for RIF. Turnaround times were 4.6 to 11.7 days (median, 6.5 days) for BACTEC MGIT 960 and 4.0 to 10.0 days (median, 7.0 days) for BACTEC 460 TB. These data demonstrate that the fully automated and nonradiometric BACTEC MGIT 960 system is an accurate method for rapid susceptibility testing of M. tuberculosis. PMID:11773109

  13. Separation of field-independent and field-dependent susceptibility tensors using a sequence of fully automated AMS measurements

    NASA Astrophysics Data System (ADS)

    Studynka, J.; Chadima, M.; Hrouda, F.; Suza, P.

    2013-12-01

Low-field magnetic susceptibility of diamagnetic and paramagnetic minerals, as well as that of pure magnetite and all single-domain ferromagnetic (s.l.) minerals, is field-independent. In contrast, the magnetic susceptibility of multi-domain pyrrhotite, hematite and titanomagnetite may depend significantly on field intensity. Hence, AMS data acquired in various fields have great potential to separate the magnetic fabric carried by the latter group of minerals from the whole-rock fabric. The determination of the field variation of AMS consists of separate measurements of each sample in several fields within the Rayleigh Law range and subsequent processing in which the field-independent and field-dependent susceptibility tensors are calculated. The disadvantage of this technique is that each sample must be measured several times in various positions, which is relatively laborious and time-consuming. Recently, a new 3D rotator was developed for the MFK1 Kappabridges, which rotates the sample simultaneously about two axes with different velocities. The measurement is fully automated in such a way that, once the sample is mounted into the rotator, it requires no additional positioning to measure the full AMS tensor. An important advantage of the 3D rotator is that it enables AMS to be measured in a sequence of pre-set field intensities without any operator manipulation. The whole procedure is computer-controlled and, once a sequence of measurements is finished, the acquired data are immediately processed and visualized. Examples of natural rocks demonstrating various types of field dependence of AMS are given.
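
Within the Rayleigh Law range, susceptibility is commonly modeled as linear in the applied field, k(H) = k0 + kh·H. Fitting this line component-wise to tensors measured at several field intensities then yields the field-independent tensor (the intercepts) and the field-dependent tensor (the slopes). A sketch under that linear-model assumption, with synthetic data; it is not the authors' actual processing code.

```python
# Separate field-independent (intercept) and field-dependent (slope) tensors
# by a component-wise least-squares line fit k_ij(H) = k0_ij + kh_ij * H.

def fit_line(xs, ys):
    """Least-squares y = a + b*x; returns (intercept a, slope b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def separate_tensors(fields, tensors):
    """tensors: one 6-component susceptibility tensor per field intensity.
    Returns (field-independent tensor k0, field-dependent tensor kh)."""
    fits = [fit_line(fields, [t[j] for t in tensors]) for j in range(6)]
    return [a for a, _ in fits], [b for _, b in fits]

fields = [5.0, 10.0, 20.0, 40.0]  # A/m, within the Rayleigh range (synthetic)
# Synthetic tensors [k11, k22, k33, k12, k23, k13]: only k11 depends on field.
tensors = [[100 + 0.5 * h, 90, 80, 1, 2, 3] for h in fields]
k0, kh = separate_tensors(fields, tensors)
print(k0)  # intercepts -> field-independent part, ~[100, 90, 80, 1, 2, 3]
print(kh)  # slopes -> field-dependent part, ~[0.5, 0, 0, 0, 0, 0]
```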

  14. Therapeutic Alliance With a Fully Automated Mobile Phone and Web-Based Intervention: Secondary Analysis of a Randomized Controlled Trial

    PubMed Central

    Parker, Gordon; Manicavasagar, Vijaya; Hadzi-Pavlovic, Dusan; Fogarty, Andrea

    2016-01-01

Background Studies of Internet-delivered psychotherapies suggest that clients report development of a therapeutic alliance in the Internet environment. Because a majority of the interventions studied to date have been therapist-assisted to some degree, it remains unclear whether a therapeutic alliance can develop within the context of an Internet-delivered self-guided intervention with no therapist support, and whether this has consequences for program outcomes. Objective This study reports findings of a secondary analysis of data from 90 participants with mild-to-moderate depression, anxiety, and/or stress who used a fully automated mobile phone and Web-based cognitive behavior therapy (CBT) intervention called “myCompass” in a recent randomized controlled trial (RCT). Methods Symptoms, functioning, and positive well-being were assessed at baseline and post-intervention using the Depression, Anxiety and Stress Scale (DASS), the Work and Social Adjustment Scale (WSAS), and the Mental Health Continuum-Short Form (MHC-SF). Therapeutic alliance was measured at post-intervention using the Agnew Relationship Measure (ARM), and this was supplemented with qualitative data obtained from 16 participant interviews. Extent of participant engagement with the program was also assessed. Results Mean ratings on the ARM subscales were above the neutral midpoints, and the interviewees provided rich detail of a meaningful and collaborative therapeutic relationship with the myCompass program. Whereas scores on the ARM subscales did not correlate with treatment outcomes, participants’ ratings of the quality of their emotional connection with the program correlated significantly and positively with program logins, frequency of self-monitoring, and number of treatment modules completed (r values between .32 and .38, P≤.002). The alliance (ARM) subscales measuring perceived empowerment (r=.26, P=.02) and perceived freedom to self-disclose (r=.25, P=.04) also correlated significantly

  15. Fully Automated Pulmonary Lobar Segmentation: Influence of Different Prototype Software Programs onto Quantitative Evaluation of Chronic Obstructive Lung Disease

    PubMed Central

    Lim, Hyun-ju; Weinheimer, Oliver; Wielpütz, Mark O.; Dinkel, Julien; Hielscher, Thomas; Gompelmann, Daniela; Kauczor, Hans-Ulrich; Heussel, Claus Peter

    2016-01-01

Objectives Surgical or bronchoscopic lung volume reduction (BLVR) techniques can be beneficial for heterogeneous emphysema. Post-processing software tools for lobar emphysema quantification are useful for patient and target lobe selection, treatment planning and post-interventional follow-up. We aimed to evaluate the inter-software variability of emphysema quantification using fully automated lobar segmentation prototypes. Material and Methods 66 patients with moderate to severe COPD who underwent CT for planning of BLVR were included. Emphysema quantification was performed using 2 modified versions of in-house software (without and with prototype advanced lung vessel segmentation; programs 1 [YACTA v.2.3.0.2] and 2 [YACTA v.2.4.3.1]), as well as 1 commercial program 3 [Pulmo3D VA30A_HF2] and 1 pre-commercial prototype 4 [CT COPD ISP ver7.0]). The following parameters were computed for each segmented anatomical lung lobe and the whole lung: lobar volume (LV), mean lobar density (MLD), 15th percentile of lobar density (15th), emphysema volume (EV) and emphysema index (EI). Bland-Altman analysis (limits of agreement, LoA) and linear random effects models were used for comparison between the software. Results Segmentation using programs 1, 3 and 4 was unsuccessful in 1 (1%), 7 (10%) and 5 (7%) patients, respectively. Program 2 could analyze all datasets. The 53 patients with successful segmentation by all 4 programs were included for further analysis. For LV, program 1 and 4 showed the largest mean difference of 72 ml and the widest LoA of [-356, 499 ml] (p<0.05). Program 3 and 4 showed the largest mean difference of 4% and the widest LoA of [-7, 14%] for EI (p<0.001). Conclusions Only a single software program was able to successfully analyze all scheduled datasets. Although mean bias of LV and EV were relatively low in lobar quantification, ranges of disagreement were substantial in both of them. For longitudinal emphysema monitoring, not only scanning protocol but
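
The densitometric parameters compared in the study (LV, MLD, 15th percentile, EV, EI) follow standard definitions over the HU values inside a segmented lobe. A sketch of those definitions, assuming the conventional -950 HU emphysema threshold (the abstract does not state each program's exact threshold), with synthetic voxel data.

```python
# Standard lobar densitometry from the HU values of one segmented lobe.
# The -950 HU threshold is the conventional emphysema cutoff (assumption);
# the voxel data below are synthetic.

def lobe_metrics(hu_values, voxel_volume_ml, threshold_hu=-950.0):
    """Return (LV ml, MLD HU, 15th percentile HU, EV ml, EI %)."""
    hu = sorted(hu_values)
    n = len(hu)
    lv = n * voxel_volume_ml                            # lobar volume
    mld = sum(hu) / n                                   # mean lobar density
    p15 = hu[int(0.15 * n)]                             # crude 15th percentile
    ev = sum(1 for v in hu if v < threshold_hu) * voxel_volume_ml
    ei = 100.0 * ev / lv                                # emphysema index
    return lv, mld, p15, ev, ei

hu = [-980.0] * 300 + [-850.0] * 700                    # 1000 synthetic voxels
lv, mld, p15, ev, ei = lobe_metrics(hu, voxel_volume_ml=0.001)
print(lv, mld, p15, ei)  # 1.0 -889.0 -980.0 30.0
```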

  16. Analytical Protein Microarrays: Advancements Towards Clinical Applications

    PubMed Central

    Sauer, Ursula

    2017-01-01

Protein microarrays represent a powerful technology with the potential to serve as tools for the detection of a broad range of analytes in numerous applications such as diagnostics, drug development, food safety, and environmental monitoring. Key features of analytical protein microarrays include high throughput and relatively low costs due to minimal reagent consumption, multiplexing, fast kinetics and hence measurements, and the possibility of functional integration. So far, especially fundamental studies in molecular and cell biology have been conducted using protein microarrays, while the potential for clinical, notably point-of-care applications is not yet fully utilized. The question arises as to what features have to be implemented and what improvements have to be made in order to fully exploit the technology. In the past we have identified various obstacles that have to be overcome in order to promote protein microarray technology in the diagnostic field. Issues that need significant improvement to make the technology more attractive for the diagnostic market include insufficient sensitivity and reproducibility, inadequate analysis time, lack of high-quality antibodies and validated reagents, lack of automation and portable instruments, and the cost of instruments necessary for chip production and read-out. The scope of the paper at hand is to review approaches to solve these problems. PMID:28146048

  17. Analytical Protein Microarrays: Advancements Towards Clinical Applications.

    PubMed

    Sauer, Ursula

    2017-01-29

Protein microarrays represent a powerful technology with the potential to serve as tools for the detection of a broad range of analytes in numerous applications such as diagnostics, drug development, food safety, and environmental monitoring. Key features of analytical protein microarrays include high throughput and relatively low costs due to minimal reagent consumption, multiplexing, fast kinetics and hence measurements, and the possibility of functional integration. So far, especially fundamental studies in molecular and cell biology have been conducted using protein microarrays, while the potential for clinical, notably point-of-care applications is not yet fully utilized. The question arises as to what features have to be implemented and what improvements have to be made in order to fully exploit the technology. In the past we have identified various obstacles that have to be overcome in order to promote protein microarray technology in the diagnostic field. Issues that need significant improvement to make the technology more attractive for the diagnostic market include insufficient sensitivity and reproducibility, inadequate analysis time, lack of high-quality antibodies and validated reagents, lack of automation and portable instruments, and the cost of instruments necessary for chip production and read-out. The scope of the paper at hand is to review approaches to solve these problems.

  18. Fully Automated RNAscope In Situ Hybridization Assays for Formalin‐Fixed Paraffin‐Embedded Cells and Tissues

    PubMed Central

    Anderson, Courtney M.; Zhang, Bingqing; Miller, Melanie; Butko, Emerald; Wu, Xingyong; Laver, Thomas; Kernag, Casey; Kim, Jeffrey; Luo, Yuling; Lamparski, Henry; Park, Emily; Su, Nan

    2016-01-01

    ABSTRACT Biomarkers such as DNA, RNA, and protein are powerful tools in clinical diagnostics and therapeutic development for many diseases. Identifying RNA expression at the single cell level within the morphological context by RNA in situ hybridization provides a great deal of information on gene expression changes over conventional techniques that analyze bulk tissue, yet widespread use of this technique in the clinical setting has been hampered by the dearth of automated RNA ISH assays. Here we present an automated version of the RNA ISH technology RNAscope that is adaptable to multiple automation platforms. The automated RNAscope assay yields a high signal‐to‐noise ratio with little to no background staining and results comparable to the manual assay. In addition, the automated duplex RNAscope assay was able to detect two biomarkers simultaneously. Lastly, assay consistency and reproducibility were confirmed by quantification of TATA‐box binding protein (TBP) mRNA signals across multiple lots and multiple experiments. Taken together, the data presented in this study demonstrate that the automated RNAscope technology is a high performance RNA ISH assay with broad applicability in biomarker research and diagnostic assay development. J. Cell. Biochem. 117: 2201–2208, 2016. © 2016 The Authors. Journal of Cellular Biochemistry Published by Wiley Periodicals, Inc. PMID:27191821

  19. Applicability of a System for fully automated nucleic acid extraction from formalin-fixed paraffin-embedded sections for routine KRAS mutation testing.

    PubMed

    Lehmann, Annika; Schewe, Christiane; Hennig, Guido; Denkert, Carsten; Weichert, Wilko; Budczies, Jan; Dietel, Manfred

    2012-06-01

Due to the approval of various new targeted therapies for the treatment of cancer, molecular pathology laboratories with a diagnostic focus have to meet new challenges: simultaneous handling of a large number of samples, small amounts of input material, and fragmentation of nucleic acids because of formalin fixation. As a consequence, fully automated systems for a fast and standardized extraction of high-quality DNA from formalin-fixed paraffin-embedded (FFPE) tissues are urgently needed. In this study, we tested the performance of a fully automated, high-throughput method for the extraction of nucleic acids from FFPE tissues. We investigated the extraction performance in sections of 5 different tissue types often analyzed in routine pathology laboratories (cervix, colon, liver, lymph node, and lung; n=340). Furthermore, we compared the quality, labor input, and applicability of the method for diagnostic purposes with those of a laboratory-validated manual method in a clinical setting by screening a set of 45 colorectal adenocarcinomas for KRAS mutations. Automated extraction of both DNA and RNA was successful in 339 of 340 FFPE samples representing 5 different tissue types. In comparison with a conventional manual extraction protocol, the method showed an overall agreement of 97.7% (95% confidence interval, 88.2%-99.9%) for the subsequent mutational analysis of the KRAS gene in colorectal cancer samples. The fully automated system is a promising tool for simple, robust, and rapid extraction of DNA and RNA from formalin-fixed tissue. It ensures standardization of sample processing and can be applied to clinical FFPE samples in routine pathology.
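
The reported agreement of 97.7% with a 95% CI of 88.2%-99.9% is consistent with an exact (Clopper-Pearson) binomial interval for roughly 44 of 45 concordant calls, though the abstract does not spell out the exact counts or CI method. A stdlib-only sketch of that interval, found by bisection on the binomial CDF:

```python
# Exact (Clopper-Pearson) binomial confidence interval via bisection.
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def clopper_pearson(x, n, alpha=0.05):
    """Exact two-sided CI for x successes out of n trials."""
    def solve(cond):
        lo, hi = 0.0, 1.0
        for _ in range(60):                   # bisect to ample precision
            mid = (lo + hi) / 2
            if cond(mid):
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2
    # Lower limit: p where P(X >= x | p) equals alpha/2.
    lower = 0.0 if x == 0 else solve(lambda p: 1 - binom_cdf(x - 1, n, p) <= alpha / 2)
    # Upper limit: p where P(X <= x | p) equals alpha/2.
    upper = 1.0 if x == n else solve(lambda p: binom_cdf(x, n, p) > alpha / 2)
    return lower, upper

lo, hi = clopper_pearson(44, 45)              # 44/45 concordant (assumed)
print(round(lo, 3), round(hi, 3))  # 0.882 0.999
```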

  20. Usage and Effectiveness of a Fully Automated, Open-Access, Spanish Web-Based Smoking Cessation Program: Randomized Controlled Trial

    PubMed Central

    2014-01-01

Background The Internet is an optimal setting to provide massive access to tobacco treatments. To evaluate open-access Web-based smoking cessation programs in a real-world setting, adherence and retention data should be taken into account as much as the abstinence rate. Objective The objective was to analyze the usage and effectiveness of a fully automated, open-access, Web-based smoking cessation program by comparing interactive versus noninteractive versions. Methods Participants were randomly assigned either to the interactive or noninteractive version of the program, both with identical content divided into 4 interdependent modules. At baseline, we collected demographic, psychological, and smoking characteristics of the smokers self-enrolled in the Web-based program of Universidad Nacional de Educación a Distancia (National Distance Education University; UNED) in Madrid, Spain. The following questionnaires were administered: the anxiety and depression subscales from the Symptom Checklist-90-Revised, the 4-item Perceived Stress Scale, and the Heaviness of Smoking Index. At 3 months, we analyzed dropout rates, module completion, user satisfaction, follow-up response rate, and self-assessed smoking abstinence. Results A total of 23,213 smokers were registered, 50.06% (11,620/23,213) women and 49.94% (11,593/23,213) men, with a mean age of 39.5 years (SD 10.3). Of these, 46.10% (10,701/23,213) were married and 34.43% (7992/23,213) were single, 46.03% (10,686/23,213) had university education, and 78.73% (18,275/23,213) were employed. Participants smoked an average of 19.4 cigarettes per day (SD 10.3). Of the 11,861 smokers randomly assigned to the interactive version, 2720 (22.93%) completed the first module, 1052 (8.87%) the second, 624 (5.26%) the third, and 355 (2.99%) the fourth. Completion data were not available for the noninteractive version (there was no way to record it automatically). The 3-month follow-up questionnaire was completed by 1085 of 23,213 enrolled smokers

  1. Technical Note: A fully automated purge and trap GC-MS system for quantification of volatile organic compound (VOC) fluxes between the ocean and atmosphere

    NASA Astrophysics Data System (ADS)

    Andrews, S. J.; Hackenberg, S. C.; Carpenter, L. J.

    2015-04-01

The oceans are a key source of a number of atmospherically important volatile gases. The accurate and robust determination of trace gases in seawater is a significant analytical challenge, requiring reproducible and ideally automated sample handling, a high efficiency of seawater-air transfer, removal of water vapour from the sample stream, and high sensitivity and selectivity of the analysis. Here we describe a system that was developed for the fully automated analysis of dissolved very short-lived halogenated species (VSLS) sampled from an under-way seawater supply. The system can also be used for semi-automated batch sampling from Niskin bottles filled during CTD (conductivity, temperature, depth) profiles. The essential components comprise a bespoke, automated purge and trap (AutoP&T) unit coupled to a commercial thermal desorption and gas chromatograph mass spectrometer (TD-GC-MS). The AutoP&T system has completed five research cruises, from the tropics to the poles, and collected over 2500 oceanic samples to date. It is able to quantify >25 species over a boiling point range of 34-180 °C with Henry's law coefficients of 0.018 and greater (CH2I2; kHcc, dimensionless gas/aqueous) and has been used to measure organic sulfurs, hydrocarbons, halocarbons and terpenes. In the eastern tropical Pacific, the high sensitivity and sampling frequency provided new information regarding the distribution of VSLS, including novel measurements of a photolytically driven diurnal cycle of CH2I2 within the surface ocean water.
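
The dimensionless gas/aqueous Henry coefficient kHcc quoted above (the 0.018 purgeability floor) relates to the solubility-form constant Hcp by kHcc = Cg/Caq = 1/(Hcp·R·T). A sketch of that conversion; the Hcp value below is illustrative, not a measured constant for any of the paper's analytes.

```python
# Convert a solubility-form Henry constant Hcp (mol m^-3 Pa^-1) to the
# dimensionless gas/aqueous ratio kHcc = Cg/Caq = 1 / (Hcp * R * T).
# The example Hcp is made up for illustration.

R = 8.314  # J mol^-1 K^-1 (= Pa m^3 mol^-1 K^-1)

def khcc_gas_over_aqueous(hcp_mol_m3_pa, temp_k=298.15):
    """Dimensionless gas/aqueous Henry coefficient from the solubility form."""
    return 1.0 / (hcp_mol_m3_pa * R * temp_k)

k = khcc_gas_over_aqueous(0.02)           # hypothetical Hcp
print(round(k, 4), k >= 0.018)  # 0.0202 True -> purgeable by the criterion
```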

  2. Toward fully automated genotyping: allele assignment, pedigree construction, phase determination, and recombination detection in Duchenne muscular dystrophy.

    PubMed Central

    Perlin, M. W.; Burks, M. B.; Hoop, R. C.; Hoffman, E. P.

    1994-01-01

    Human genetic maps have made quantum leaps in the past few years, because of the characterization of > 2,000 CA dinucleotide repeat loci: these PCR-based markers offer extraordinarily high PIC, and within the next year their density is expected to reach intervals of a few centimorgans per marker. These new genetic maps open new avenues for disease gene research, including large-scale genotyping for both simple and complex disease loci. However, the allele patterns of many dinucleotide repeat loci can be complex and difficult to interpret, with genotyping errors a recognized problem. Furthermore, the possibility of genotyping individuals at hundreds or thousands of polymorphic loci requires improvements in data handling and analysis. The automation of genotyping and analysis of computer-derived haplotypes would remove many of the barriers preventing optimal use of dense and informative dinucleotide genetic maps. Toward this end, we have automated the allele identification, genotyping, phase determinations, and inheritance consistency checks generated by four CA repeats within the 2.5-Mbp, 10-cM X-linked dystrophin gene, using fluorescein-labeled multiplexed PCR products analyzed on automated sequencers. The described algorithms can deconvolute and resolve closely spaced alleles, despite interfering stutter noise; set phase in females; propagate the phase through the family; and identify recombination events. We show the implementation of these algorithms for the completely automated interpretation of allele data and risk assessment for five Duchenne/Becker muscular dystrophy families. The described approach can be scaled up to perform genome-based analyses with hundreds or thousands of CA-repeat loci, using multiple fluorophors on automated sequencers. PMID:7942856
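
One simple form of the stutter deconvolution described above can be sketched as follows: a CA-repeat allele produces a PCR stutter peak one repeat unit shorter, at roughly a fixed fraction of the parent peak, so walking from the longest allele downward and subtracting the expected stutter recovers the true allele contributions. The paper's algorithm is more elaborate; the stutter ratio and peak heights here are invented for illustration.

```python
# Naive stutter correction for a dinucleotide-repeat peak profile:
# subtract stutter_ratio * (corrected peak) from the position one repeat
# shorter, processing from the longest allele down. Toy data throughout.

def deconvolute(observed, stutter_ratio=0.10):
    """observed: peak heights indexed by increasing repeat length."""
    true = observed[:]
    for i in range(len(true) - 1, 0, -1):        # longest allele first
        true[i - 1] = max(0.0, true[i - 1] - stutter_ratio * true[i])
    return true

# Two real alleles (height 1000 each) plus ~10% stutter one repeat below each.
obs = [0.0, 100.0, 1000.0, 100.0, 1000.0]
print(deconvolute(obs))  # [0.0, 0.0, 1000.0, 0.0, 1000.0]
```

Heterozygotes one repeat apart are the hard case this kind of correction must handle, since the shorter allele's peak then overlaps the taller allele's stutter.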

  3. Toward fully automated genotyping: Allele assignment, pedigree construction, phase determination, and recombination detection in Duchenne muscular dystrophy

    SciTech Connect

    Perlin, M.W.; Burks, M.B.; Hoop, R.C.; Hoffman, E.P.

    1994-10-01

    Human genetic maps have made quantum leaps in the past few years, because of the characterization of >2,000 CA dinucleotide repeat loci: these PCR-based markers offer extraordinarily high PIC, and within the next year their density is expected to reach intervals of a few centimorgans per marker. These new genetic maps open new avenues for disease gene research, including large-scale genotyping for both simple and complex disease loci. However, the allele patterns of many dinucleotide repeat loci can be complex and difficult to interpret, with genotyping errors a recognized problem. Furthermore, the possibility of genotyping individuals at hundreds or thousands of polymorphic loci requires improvements in data handling and analysis. The automation of genotyping and analysis of computer-derived haplotypes would remove many of the barriers preventing optimal use of dense and informative dinucleotide genetic maps. Toward this end, we have automated the allele identification, genotyping, phase determinations, and inheritance consistency checks generated by four CA repeats within the 2.5-Mbp, 10-cM X-linked dystrophin gene, using fluorescein-labeled multiplexed PCR products analyzed on automated sequencers. The described algorithms can deconvolute and resolve closely spaced alleles, despite interfering stutter noise; set phase in females; propagate the phase through the family; and identify recombination events. We show the implementation of these algorithms for the completely automated interpretation of allele data and risk assessment for five Duchenne/Becker muscular dystrophy families. The described approach can be scaled up to perform genome-based analyses with hundreds or thousands of CA-repeat loci, using multiple fluorophors on automated sequencers. 16 refs., 5 figs., 1 tab.

  4. Establishment of a fully automated microtiter plate-based system for suspension cell culture and its application for enhanced process optimization.

    PubMed

    Markert, Sven; Joeris, Klaus

    2017-01-01

We developed an automated microtiter plate (MTP)-based system for suspension cell culture to meet the increased demands for miniaturized high-throughput applications in biopharmaceutical process development. The generic system is based on off-the-shelf commercial laboratory automation equipment and is able to utilize MTPs of different configurations (6-24 wells per plate) in orbital shaken mode. The shaking conditions were optimized by Computational Fluid Dynamics simulations. The fully automated system handles plate transport, seeding and feeding of cells, daily sampling, and preparation of analytical assays. The integration of all required analytical instrumentation into the system enables hands-off operation, which prevents bottlenecks in sample processing. The modular set-up makes the system flexible and adaptable for a continuous extension of analytical parameters and add-on components. The system proved suitable as a screening tool for process development by verifying the comparability of results for the MTP-based system and bioreactors regarding profiles of viable cell density, lactate, and product concentration of CHO cell lines. These studies confirmed that 6-well MTPs as well as 24-deepwell MTPs were predictive for a scale-up to a 1000 L stirred tank reactor (scale factor 1:200,000). Applying the established cell culture system for automated media blend screening in late-stage development, a 22% increase in product yield was achieved in comparison to the reference process. The predicted product increase was subsequently confirmed in 2 L bioreactors. Thus, we demonstrated the feasibility of the automated MTP-based cell culture system for enhanced screening and optimization applications in process development and identified further application areas such as process robustness. The system offers a great potential to accelerate time-to-market for new biopharmaceuticals. Biotechnol. Bioeng. 2017;114: 113-121. © 2016 Wiley Periodicals, Inc.

  5. Microarray in parasitic infections

    PubMed Central

    Sehgal, Rakesh; Misra, Shubham; Anand, Namrata; Sharma, Monika

    2012-01-01

Modern biology and genomic sciences are rooted in parasitic disease research. Genome sequencing efforts have provided a wealth of new biological information that promises to have a major impact on our understanding of parasites. Microarrays provide one of the major high-throughput platforms by which this information can be exploited in the laboratory. Many excellent reviews and technique articles have recently been published on applying microarrays to organisms for which fully annotated genomes are at hand. However, many parasitologists work on organisms whose genomes have been only partially sequenced. This review is mainly focused on how to use microarrays in these situations. PMID:23508469

  6. ELIXYS - a fully automated, three-reactor high-pressure radiosynthesizer for development and routine production of diverse PET tracers

    PubMed Central

    2013-01-01

    Background Automated radiosynthesizers are vital for routine production of positron-emission tomography tracers to minimize radiation exposure to operators and to ensure reproducible synthesis yields. The recent trend in the synthesizer industry towards the use of disposable kits aims to simplify setup and operation for the user, but often introduces several limitations related to temperature and chemical compatibility, thus requiring reoptimization of protocols developed on non-cassette-based systems. Radiochemists would benefit from a single hybrid system that provides tremendous flexibility for development and optimization of reaction conditions while also providing a pathway to simple, cassette-based production of diverse tracers. Methods We have designed, built, and tested an automated three-reactor radiosynthesizer (ELIXYS) to provide a flexible radiosynthesis platform suitable for both tracer development and routine production. The synthesizer is capable of performing high-pressure and high-temperature reactions by eliminating permanent tubing and valve connections to the reaction vessel. Each of the three movable reactors can seal against different locations on disposable cassettes to carry out different functions such as sealed reactions, evaporations, and reagent addition. A reagent and gas handling robot moves sealed reagent vials from storage locations in the cassette to addition positions and also dynamically provides vacuum and inert gas to ports on the cassette. The software integrates these automated features into chemistry unit operations (e.g., React, Evaporate, Add) to intuitively create synthesis protocols. 2-Deoxy-2-[18F]fluoro-5-methyl-β-l-arabinofuranosyluracil (l-[18F]FMAU) and 2-deoxy-2-[18F]fluoro-β-d-arabinofuranosylcytosine (d-[18F]FAC) were synthesized to validate the system. Results l-[18F]FMAU and d-[18F]FAC were successfully synthesized in 165 and 170 min, respectively, with decay-corrected radiochemical yields of 46% ± 1% (n = 6

  7. Centrifugal LabTube platform for fully automated DNA purification and LAMP amplification based on an integrated, low-cost heating system.

    PubMed

    Hoehl, Melanie M; Weißert, Michael; Dannenberg, Arne; Nesch, Thomas; Paust, Nils; von Stetten, Felix; Zengerle, Roland; Slocum, Alexander H; Steigert, Juergen

    2014-06-01

    This paper introduces a disposable battery-driven heating system for loop-mediated isothermal DNA amplification (LAMP) inside a centrifugally-driven DNA purification platform (LabTube). We demonstrate LabTube-based fully automated DNA purification of as low as 100 cell-equivalents of verotoxin-producing Escherichia coli (VTEC) in water, milk and apple juice in a laboratory centrifuge, followed by integrated and automated LAMP amplification with a reduction of hands-on time from 45 min to 1 min. The heating system consists of two parallel SMD thick-film resistors and an NTC as heating and temperature-sensing elements. They are driven by a 3 V battery and controlled by a microcontroller. The LAMP reagents are stored in the elution chamber and the amplification starts immediately after the eluate is purged into the chamber. The LabTube, including a microcontroller-based heating system, demonstrates contamination-free and automated sample-to-answer nucleic acid testing within a laboratory centrifuge. The heating system can be easily parallelized within one LabTube and it is deployable for a variety of heating and electrical applications.
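
The NTC-based temperature control described above implies a resistance-to-temperature conversion in the microcontroller firmware. A minimal sketch using the standard beta-parameter thermistor equation (the R0 and beta values here are typical datasheet figures assumed for illustration, not those of the LabTube hardware):

```python
import math

def ntc_temperature_c(resistance_ohm: float,
                      r0: float = 10_000.0,   # resistance at reference temperature (assumed)
                      t0_c: float = 25.0,     # reference temperature in degrees C
                      beta: float = 3950.0):  # beta constant from a typical datasheet (assumed)
    """Convert NTC thermistor resistance to temperature (beta-parameter model).

    Uses 1/T = 1/T0 + (1/B) * ln(R/R0), with temperatures in kelvin.
    """
    t0_k = t0_c + 273.15
    inv_t = 1.0 / t0_k + math.log(resistance_ohm / r0) / beta
    return 1.0 / inv_t - 273.15
```

In a closed-loop controller, the microcontroller would read the NTC through a voltage divider, convert the measured resistance with a routine like this, and switch the heating resistors to hold the LAMP incubation temperature.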

  8. Development of a fully automated open-column chemical-separation system—COLUMNSPIDER—and its application to Sr-Nd-Pb isotope analyses of igneous rock samples

    NASA Astrophysics Data System (ADS)

    Miyazaki, Takashi; Vaglarov, Bogdan Stefanov; Takei, Masakazu; Suzuki, Masahiro; Suzuki, Hiroaki; Ohsawa, Kouzou; Chang, Qing; Takahashi, Toshiro; Hirahara, Yuka; Hanyu, Takeshi; Kimura, Jun-Ichi; Tatsumi, Yoshiyuki

    A fully automated open-column resin-bed chemical-separation system, named COLUMNSPIDER, has been developed. The system consists of a programmable micropipetting robot that dispenses chemical reagents and sample solutions into an open-column resin bed for elemental separation. After the initial setup of resin columns, chemical reagents, and beakers for the separated chemical components, all separation procedures are automated. As many as ten samples can be eluted in parallel in a single automated run. Many separation procedures, such as radiogenic isotope ratio analyses for Sr and Nd, involve the use of multiple column separations with different resin columns, chemical reagents, and beakers of various volumes. COLUMNSPIDER completes these separations using multiple runs. Programmable functions, including the positioning of the micropipetter, reagent volume, and elution time, enable flexible operation. Optimized movements for solution take-up and high-efficiency column flushing allow the system to perform as precisely as when operated manually by a skilled operator. Procedural blanks, examined for COLUMNSPIDER separations of Sr, Nd, and Pb, are low and negligible. The measured Sr, Nd, and Pb isotope ratios for JB-2 and Nd isotope ratios for JB-3 and BCR-2 rock standards all fall within the ranges reported previously in high-accuracy analyses. COLUMNSPIDER is a versatile tool for the efficient elemental separation of igneous rock samples, a process that is both labor intensive and time consuming.

  9. A user-friendly robotic sample preparation program for fully automated biological sample pipetting and dilution to benefit the regulated bioanalysis.

    PubMed

    Jiang, Hao; Ouyang, Zheng; Zeng, Jianing; Yuan, Long; Zheng, Naiyu; Jemal, Mohammed; Arnold, Mark E

    2012-06-01

    Biological sample dilution is a rate-limiting step in bioanalytical sample preparation when the concentrations of samples are beyond standard curve ranges, especially when multiple dilution factors are needed in an analytical run. We have developed and validated a Microsoft Excel-based robotic sample preparation program (RSPP) that automatically transforms Watson worklist sample information (identification, sequence and dilution factor) to comma-separated value (CSV) files. The Freedom EVO liquid handler software imports and transforms the CSV files to executable worklists (.gwl files), allowing the robot to perform sample dilutions at variable dilution factors. The dynamic dilution range is 1- to 1000-fold and divided into three dilution steps: 1- to 10-, 11- to 100-, and 101- to 1000-fold. The whole process, including pipetting samples, diluting samples, and adding internal standard(s), is accomplished within 1 h for two racks of samples (96 samples/rack). This platform also supports online sample extraction (liquid-liquid extraction, solid-phase extraction, protein precipitation, etc.) using 96 multichannel arms. This fully automated and validated sample dilution and preparation process has been applied to several drug development programs. The results demonstrate that application of the RSPP for fully automated sample processing is efficient and rugged. The RSPP not only saved more than 50% of the time in sample pipetting and dilution but also reduced human errors. The generated bioanalytical data are accurate and precise; therefore, this application can be used in regulated bioanalysis.
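
The three-tier dilution scheme (1- to 10-, 11- to 100-, and 101- to 1000-fold) amounts to decomposing a requested factor into serial steps of at most 10-fold each. A minimal sketch of that decomposition (illustrative only; the published RSPP is an Excel/worklist tool, and this function is not its actual code):

```python
def dilution_steps(factor: float) -> list:
    """Decompose a dilution factor into serial steps of at most 10-fold.

    Mirrors the three ranges described for the RSPP: a 1- to 10-fold
    dilution needs one step, 11- to 100-fold two steps, and 101- to
    1000-fold three steps. The product of the returned steps always
    equals the requested factor.
    """
    if not 1 <= factor <= 1000:
        raise ValueError("supported dynamic dilution range is 1- to 1000-fold")
    steps = []
    remaining = factor
    while remaining > 10:      # peel off full 10-fold dilutions
        steps.append(10)
        remaining /= 10
    steps.append(remaining)    # final residual step (at most 10-fold)
    return steps
```

Each returned step fits within a single pipetting operation of the liquid handler, which is why a 1000-fold dilution maps to a three-step worklist entry.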

  10. A fully automated system with online sample loading, isotope dimethyl labeling and multidimensional separation for high-throughput quantitative proteome analysis.

    PubMed

    Wang, Fangjun; Chen, Rui; Zhu, Jun; Sun, Deguang; Song, Chunxia; Wu, Yifeng; Ye, Mingliang; Wang, Liming; Zou, Hanfa

    2010-04-01

    Multidimensional separation is often applied for large-scale qualitative and quantitative proteome analysis. A fully automated system integrating a reversed phase-strong cation exchange (RP-SCX) biphasic trap column into a vented sample injection system was developed to realize online sample loading, isotope dimethyl labeling and online multidimensional separation of proteome samples. Compared with conventional manual isotope labeling and off-line fractionation technologies, this system is fully automated and time-saving, which benefits quantification reproducibility and accuracy. As a phosphate SCX monolith was integrated into the biphasic trap column, a high sample injection flow rate and high-resolution stepwise fractionation could be easily achieved. Approximately 1000 proteins could be quantified in approximately 30 h of proteome analysis, and the proteome coverage of quantitative analysis can be further improved by prolonging the multidimensional separation time. This system was applied to analyze the differential protein expression levels of HCC and normal human liver tissues. After triplicate analyses, 94 up-regulated and 249 down-regulated (HCC/normal) proteins were obtained. These significantly regulated proteins have been widely validated by previous gene and protein expression studies; for example, enzymes involved in the urea cycle, methylation cycle, and fatty acid catabolism in liver were all observed to be down-regulated.

  11. Eco-HAB as a fully automated and ecologically relevant assessment of social impairments in mouse models of autism

    PubMed Central

    Puścian, Alicja; Łęski, Szymon; Kasprowicz, Grzegorz; Winiarski, Maciej; Borowska, Joanna; Nikolaev, Tomasz; Boguszewski, Paweł M; Lipp, Hans-Peter; Knapska, Ewelina

    2016-01-01

    Eco-HAB is an open source, RFID-based system for automated measurement and analysis of social preference and in-cohort sociability in mice. The system closely follows murine ethology. It requires no contact between a human experimenter and tested animals, overcoming the confounding factors that lead to irreproducible assessment of murine social behavior between laboratories. In Eco-HAB, group-housed animals live in a spacious, four-compartment apparatus with shadowed areas and narrow tunnels, resembling natural burrows. Eco-HAB allows for assessment of the tendency of mice to voluntarily spend time together in ethologically relevant mouse group sizes. Custom-made software for automated tracking, data extraction, and analysis enables quick evaluation of social impairments. The developed protocols and standardized behavioral measures demonstrate high replicability. Unlike classic three-chambered sociability tests, Eco-HAB provides measurements of spontaneous, ecologically relevant social behaviors in group-housed animals. Results are obtained faster, with less manpower, and without confounding factors. DOI: http://dx.doi.org/10.7554/eLife.19532.001 PMID:27731798

  12. Fully Automated Evaluation of Total Glomerular Number and Capillary Tuft Size in Nephritic Kidneys Using Lightsheet Microscopy.

    PubMed

    Klingberg, Anika; Hasenberg, Anja; Ludwig-Portugall, Isis; Medyukhina, Anna; Männ, Linda; Brenzel, Alexandra; Engel, Daniel R; Figge, Marc Thilo; Kurts, Christian; Gunzer, Matthias

    2017-02-01

    The total number of glomeruli is a fundamental parameter of kidney function but very difficult to determine using standard methodology. Here, we counted all individual glomeruli in murine kidneys and sized the capillary tufts by combining in vivo fluorescence labeling of endothelial cells, a novel tissue-clearing technique, lightsheet microscopy, and automated registration by image analysis. Total hands-on time per organ was <1 hour, and automated counting/sizing was finished in <3 hours. We also investigated the novel use of ethyl-3-phenylprop-2-enoate (ethyl cinnamate) as a nontoxic solvent-based clearing reagent that can be handled without specific safety measures. Ethyl cinnamate rapidly cleared all tested organs, including calcified bone, but the fluorescence of proteins and immunohistochemical labels was maintained over weeks. Using ethyl cinnamate-cleared kidneys, we also quantified the average creatinine clearance rate per glomerulus. This parameter decreased in the first week of experimental nephrotoxic nephritis, whereas reduction in glomerular numbers occurred much later. Our approach delivers fundamental parameters of renal function, and because of its ease of use and speed, it is suitable for high-throughput analysis and could greatly facilitate studies of the effect of kidney diseases on whole-organ physiology.

  13. Fully automated prostate segmentation in 3D MR based on normalized gradient fields cross-correlation initialization and LOGISMOS refinement

    NASA Astrophysics Data System (ADS)

    Yin, Yin; Fotin, Sergei V.; Periaswamy, Senthil; Kunz, Justin; Haldankar, Hrishikesh; Muradyan, Naira; Cornud, François; Turkbey, Baris; Choyke, Peter

    2012-02-01

    Manual delineation of the prostate is a challenging task for a clinician due to its complex and irregular shape. Furthermore, the need for precisely targeting the prostate boundary continues to grow. Planning for radiation therapy, MR-ultrasound fusion for image-guided biopsy, multi-parametric MRI tissue characterization, and context-based organ retrieval are examples where accurate prostate delineation can play a critical role in a successful patient outcome. Therefore, a robust automated full prostate segmentation system is desired. In this paper, we present an automated prostate segmentation system for 3D MR images. In this system, the prostate is segmented in two steps: the prostate displacement and size are first detected, and then the boundary is refined by a shape model. The detection approach is based on normalized gradient fields cross-correlation. This approach is fast, robust to intensity variation, and provides good accuracy to initialize a prostate mean shape model. The refinement model is based on a graph-search framework, which incorporates both shape and topology information during deformation. We generated the graph cost using trained classifiers and used coarse-to-fine search and region-specific classifier training. The proposed algorithm was developed using 261 training images and tested on another 290 cases. The segmentation performance, with mean DSC ranging from 0.89 to 0.91 depending on the evaluation subset, demonstrates state-of-the-art performance. Running time for the system is about 20 to 40 seconds depending on image size and resolution.
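
The DSC quoted above is the Dice similarity coefficient, the standard overlap metric for comparing an automated segmentation mask with a reference delineation. For reference (a generic implementation, not the authors' code):

```python
import numpy as np

def dice(mask_a, mask_b) -> float:
    """Dice similarity coefficient between two binary masks:
    DSC = 2|A intersect B| / (|A| + |B|), ranging from 0 (no overlap) to 1.
    """
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # two empty masks agree perfectly by convention
    return 2.0 * np.logical_and(a, b).sum() / denom
```

A DSC of 0.89-0.91 thus means roughly 90% volumetric overlap (by this harmonic measure) between automated and manual prostate contours.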

  14. Eco-HAB as a fully automated and ecologically relevant assessment of social impairments in mouse models of autism.

    PubMed

    Puścian, Alicja; Łęski, Szymon; Kasprowicz, Grzegorz; Winiarski, Maciej; Borowska, Joanna; Nikolaev, Tomasz; Boguszewski, Paweł M; Lipp, Hans-Peter; Knapska, Ewelina

    2016-10-12

    Eco-HAB is an open source, RFID-based system for automated measurement and analysis of social preference and in-cohort sociability in mice. The system closely follows murine ethology. It requires no contact between a human experimenter and tested animals, overcoming the confounding factors that lead to irreproducible assessment of murine social behavior between laboratories. In Eco-HAB, group-housed animals live in a spacious, four-compartment apparatus with shadowed areas and narrow tunnels, resembling natural burrows. Eco-HAB allows for assessment of the tendency of mice to voluntarily spend time together in ethologically relevant mouse group sizes. Custom-made software for automated tracking, data extraction, and analysis enables quick evaluation of social impairments. The developed protocols and standardized behavioral measures demonstrate high replicability. Unlike classic three-chambered sociability tests, Eco-HAB provides measurements of spontaneous, ecologically relevant social behaviors in group-housed animals. Results are obtained faster, with less manpower, and without confounding factors.

  15. Evaluation of a fully automated treponemal test and comparison with conventional VDRL and FTA-ABS tests.

    PubMed

    Park, Yongjung; Park, Younhee; Joo, Shin Young; Park, Myoung Hee; Kim, Hyon-Suk

    2011-11-01

    We evaluated analytic performances of an automated treponemal test and compared this test with the Venereal Disease Research Laboratory test (VDRL) and fluorescent treponemal antibody absorption test (FTA-ABS). Precision performance of the Architect Syphilis TP assay (TP; Abbott Japan, Tokyo, Japan) was assessed, and 150 serum samples were assayed with the TP before and after heat inactivation to estimate the effect of heat inactivation. A total of 616 specimens were tested with the FTA-ABS and TP, and 400 were examined with the VDRL. The TP showed good precision performance with total imprecision of less than a 10% coefficient of variation. An excellent linear relationship between results before and after heat inactivation was observed (R(2) = 0.9961). The FTA-ABS and TP agreed well with a κ coefficient of 0.981. The concordance rate between the FTA-ABS and TP was the highest (99.0%), followed by the rates between FTA-ABS and VDRL (85.0%) and between TP and VDRL (83.8%). The automated TP assay may be adequate for screening for syphilis in a large volume of samples and can be an alternative to FTA-ABS.
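
The κ coefficient reported for the FTA-ABS/TP comparison is Cohen's kappa, a chance-corrected measure of agreement between two qualitative tests. It can be computed from a 2x2 agreement table as follows (a generic sketch, not the study's statistics code):

```python
def cohen_kappa(table) -> float:
    """Cohen's kappa from a 2x2 agreement table [[a, b], [c, d]]:
    rows are results of test 1, columns are results of test 2."""
    (a, b), (c, d) = table
    n = a + b + c + d
    p_obs = (a + d) / n                                       # observed agreement
    p_exp = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2  # agreement expected by chance
    return (p_obs - p_exp) / (1 - p_exp)
```

A kappa of 0.981, as reported above, indicates near-perfect agreement beyond what chance alone would produce.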

  16. Anxiety Online—A Virtual Clinic: Preliminary Outcomes Following Completion of Five Fully Automated Treatment Programs for Anxiety Disorders and Symptoms

    PubMed Central

    Meyer, Denny; Austin, David William; Kyrios, Michael

    2011-01-01

    Background The development of e-mental health interventions to treat or prevent mental illness and to enhance wellbeing has risen rapidly over the past decade. This development assists the public in sidestepping some of the obstacles that are often encountered when trying to access traditional face-to-face mental health care services. Objective The objective of our study was to investigate the posttreatment effectiveness of five fully automated self-help cognitive behavior e-therapy programs for generalized anxiety disorder (GAD), panic disorder with or without agoraphobia (PD/A), obsessive–compulsive disorder (OCD), posttraumatic stress disorder (PTSD), and social anxiety disorder (SAD) offered to the international public via Anxiety Online, an open-access full-service virtual psychology clinic for anxiety disorders. Methods We used a naturalistic participant choice, quasi-experimental design to evaluate each of the five Anxiety Online fully automated self-help e-therapy programs. Participants were required to have at least subclinical levels of one of the anxiety disorders to be offered the associated disorder-specific fully automated self-help e-therapy program. These programs are offered free of charge via Anxiety Online. Results A total of 225 people self-selected one of the five e-therapy programs (GAD, n = 88; SAD, n = 50; PD/A, n = 40; PTSD, n = 30; OCD, n = 17) and completed their 12-week posttreatment assessment. Significant improvements were found on 21/25 measures across the five fully automated self-help programs. At postassessment we observed significant reductions on all five anxiety disorder clinical disorder severity ratings (Cohen d range 0.72–1.22), increased confidence in managing one’s own mental health care (Cohen d range 0.70–1.17), and decreases in the total number of clinical diagnoses (except for the PD/A program, where a positive trend was found) (Cohen d range 0.45–1.08). In addition, we found significant improvements in

  17. Technical Note: A fully automated purge and trap-GC-MS system for quantification of volatile organic compound (VOC) fluxes between the ocean and atmosphere

    NASA Astrophysics Data System (ADS)

    Andrews, S. J.; Hackenberg, S. C.; Carpenter, L. J.

    2014-12-01

    The oceans are a key source of a number of atmospherically important volatile gases. The accurate and robust determination of trace gases in seawater is a significant analytical challenge, requiring reproducible and ideally automated sample handling, a high efficiency of seawater-air transfer, removal of water vapour from the sample stream, and high sensitivity and selectivity of the analysis. Here we describe a system that was developed for the fully automated analysis of dissolved very short-lived halogenated species (VSLS) sampled from an under-way seawater supply. The system can also be used for semi-automated batch sampling from Niskin bottles filled during CTD (Conductivity, Temperature, Depth) profiles. The essential components comprise a bespoke, automated purge-and-trap (AutoP&T) unit coupled to a commercial thermal desorption and gas chromatograph-mass spectrometer (TD-GC-MS). The AutoP&T system has completed five research cruises, from the tropics to the poles, and collected over 2500 oceanic samples to date. It is able to quantify >25 species over a boiling point range of 34-180 °C with Henry's Law coefficients of 0.018 and greater (CH2I2, kHcc dimensionless gas/aqueous) and has been used to measure organic sulfur compounds, hydrocarbons, halocarbons and terpenes. In the east tropical Pacific, the high sensitivity and sampling frequency provided new information regarding the distribution of VSLS, including novel measurements of a photolytically driven diurnal cycle of CH2I2 within the surface ocean water.

  18. Development of a Fully Automated Guided Wave System for In-Process Cure Monitoring of CFRP Composite Laminates

    NASA Technical Reports Server (NTRS)

    Hudson, Tyler B.; Hou, Tan-Hung; Grimsley, Brian W.; Yaun, Fuh-Gwo

    2016-01-01

    A guided wave-based in-process cure monitoring technique for carbon fiber reinforced polymer (CFRP) composites was investigated at NASA Langley Research Center. A key cure transition point (vitrification) was identified and the degree of cure was monitored using metrics such as amplitude and time of arrival (TOA) of guided waves. Using an automated system preliminarily developed in this work, high-temperature piezoelectric transducers were utilized to interrogate a twenty-four ply unidirectional composite panel fabricated from Hexcel (Registered Trademark) IM7/8552 prepreg during cure. It was shown that the amplitude of the guided wave increased sharply around vitrification and the TOA curve possessed an inverse relationship with degree of cure. The work is a first step in demonstrating the feasibility of transitioning the technique to perform in-process cure monitoring in an autoclave, defect detection during cure, and ultimately a closed-loop process control to maximize composite part quality and consistency.

  19. Fully automated, high speed, tomographic phase object reconstruction using the transport of intensity equation in transmission and reflection configurations.

    PubMed

    Nguyen, Thanh; Nehmetallah, George; Tran, Dat; Darudi, Ahmad; Soltani, Peyman

    2015-12-10

    While traditional transport of intensity equation (TIE)-based phase retrieval of a phase object is performed through axial translation of the CCD, in this work a tunable-lens TIE is employed in both transmission and reflection configurations. These configurations are extended to a 360° tomographic 3D reconstruction through multiple illuminations from different angles by a custom-fabricated rotating assembly for the phase object. Synchronization circuitry is developed to control the CCD camera and the Arduino board, which in turn controls the tunable lens and the stepper motor to automate the tomographic reconstruction process. Finally, a user-friendly, MATLAB-based graphical user interface is developed to control the whole system and perform tomographic reconstruction using both multiplicative and inverse Radon-based techniques.

  20. Fully Automated Electro Membrane Extraction Autosampler for LC-MS Systems Allowing Soft Extractions for High-Throughput Applications.

    PubMed

    Fuchs, David; Pedersen-Bjergaard, Stig; Jensen, Henrik; Rand, Kasper D; Honoré Hansen, Steen; Petersen, Nickolaj Jacob

    2016-07-05

    The current work describes the implementation of electro membrane extraction (EME) into an autosampler for high-throughput analysis of samples by EME-LC-MS. The extraction probe was built into a luer lock adapter connected to a HTC PAL autosampler syringe. As the autosampler drew sample solution, analytes were extracted into the lumen of the extraction probe and transferred to a LC-MS system for further analysis. Various parameters affecting extraction efficacy were investigated, including syringe fill strokes, syringe pull-up volume, pull-up delay, and volume in the sample vial. The system was optimized for soft extraction of analytes and high sample throughput. Further, it was demonstrated that by flushing the EME-syringe with acidic wash buffer and reversing the applied electric potential, carry-over between samples can be reduced to below 1%. Performance of the system was characterized (RSD, <10%; R(2), 0.994) and finally, the EME-autosampler was used to analyze the in vitro conversion of methadone into its main metabolite by rat liver microsomes and to demonstrate the potential of known CYP3A4 inhibitors to prevent metabolism of methadone. By making use of the high extraction speed of EME, a complete analytical workflow of purification, separation, and analysis of a sample could be achieved within only 5.5 min. With the developed system, large sequences of samples could be analyzed in a completely automated manner. This high degree of automation makes the developed EME-autosampler a powerful tool for a wide range of applications where high-throughput extractions are required before sample analysis.

  1. Comparison of Cobas 6500 and Iris IQ200 fully-automated urine analyzers to manual urine microscopy

    PubMed Central

    Bakan, Ebubekir; Ozturk, Nurinnisa; Baygutalp, Nurcan Kilic; Polat, Elif; Akpinar, Kadriye; Dorman, Emrullah; Polat, Harun; Bakan, Nuri

    2016-01-01

    Introduction Urine screening is achieved by either automated or manual microscopic analysis. The aim of the study was to compare the Cobas 6500 and Iris IQ200 urine analyzers with manual urine microscopic analysis. Materials and methods A total of 540 urine samples sent to the laboratory for chemical and sediment analysis were analyzed on the Cobas 6500 and Iris IQ200 within 1 hour of sampling. One hundred and fifty-three samples were found to have pathological sediment results and were subjected to manual microscopic analysis performed by laboratory staff blinded to the study. Spearman's and Gamma statistics were used for correlation analyses, and the McNemar test for the comparison of the two automated analyzers. Results The comparison of the Cobas u701 to the manual method yielded the following regression equations: y = -0.12 (95% CI: -1.09 to 0.67) + 0.78 (95% CI: 0.65 to 0.95) x for WBC and y = 0.06 (95% CI: -0.09 to 0.25) + 0.66 (95% CI: 0.57 to 0.73) x for RBC. The comparison of the IQ200 Elite to the manual method yielded the following equations: y = 0.03 (95% CI: -1.00 to 1.00) + 0.88 (95% CI: 0.66 to 1.00) x for WBC and y = -0.22 (95% CI: -0.80 to 0.20) + 0.40 (95% CI: 0.32 to 0.50) x for RBC. The IQ200 Elite compared to the Cobas u701 yielded the following equations: y = -0.95 (95% CI: -2.13 to 0.11) + 1.25 (95% CI: 1.08 to 1.44) x for WBC and y = -1.20 (95% CI: -1.80 to -0.30) + 0.80 (95% CI: 0.55 to 1.00) x for RBC. Conclusions The two analyzers showed similar performance and good compatibility with manual microscopy. However, they are still inadequate for the determination of WBC, RBC, and EC in highly pathological samples. Thus, confirmation by manual microscopic analysis may be useful. PMID:27812305

  2. EST2uni: an open, parallel tool for automated EST analysis and database creation, with a data mining web interface and microarray expression data integration

    PubMed Central

    Forment, Javier; Gilabert, Francisco; Robles, Antonio; Conejero, Vicente; Nuez, Fernando; Blanca, Jose M

    2008-01-01

    Background Expressed sequence tag (EST) collections are composed of a high number of single-pass, redundant, partial sequences, which need to be processed, clustered, and annotated to remove low-quality and vector regions, eliminate redundancy and sequencing errors, and provide biologically relevant information. In order to provide a suitable way of performing the different steps in the analysis of the ESTs, flexible computation pipelines adapted to the local needs of specific EST projects have to be developed. Furthermore, EST collections must be stored in highly structured relational databases available to researchers through user-friendly interfaces which allow efficient and complex data mining, thus offering maximum capabilities for their full exploitation. Results We have created EST2uni, an integrated, highly-configurable EST analysis pipeline and data mining software package that automates the pre-processing, clustering, annotation, database creation, and data mining of EST collections. The pipeline uses standard EST analysis tools and the software has a modular design to facilitate the addition of new analytical methods and their configuration. Currently implemented analyses include functional and structural annotation, SNP and microsatellite discovery, integration of previously known genetic marker data and gene expression results, and assistance in cDNA microarray design. It can be run in parallel in a PC cluster in order to reduce the time necessary for the analysis. It also creates a web site linked to the database, showing collection statistics, with complex query capabilities and tools for data mining and retrieval. Conclusion The software package presented here provides an efficient and complete bioinformatics tool for the management of EST collections which is very easy to adapt to the local needs of different EST projects. The code is freely available under the GPL license and can be obtained at . This site also provides detailed instructions for

  3. [Condition setting for the measurement of blood coagulation factor XIII activity using a fully automated blood coagulation analyzer, COAGTRON-350].

    PubMed

    Kanno, Nobuko; Kaneko, Makoto; Tanabe, Kumiko; Jyona, Masahiro; Yokota, Hiromitsu; Yatomi, Yutaka

    2012-12-01

    The automated laboratory analyzer COAGTRON-350 (Trinity Biotech) is used for routine and specific coagulation testing, with detection of fibrin formation utilizing either mechanical principles (ball method) or photo-optical principles, chromogenic kinetic enzyme analysis, and immuno-turbidimetric detection systems in one benchtop unit. In this study, we established a parameter setting for the measurement of factor XIII (FXIII) activity using the Berichrom FXIII reagent on the COAGTRON-350 analyzer. The usual protocol for this reagent, based on the manual handling method, was slightly modified for this device. The analysis showed that the fundamental performance of the FXIII activity measurement under our condition setting was favorable in terms of reproducibility, linearity, and correlation with other assays. Since FXIII is the key enzyme that plays important roles in hemostasis by stabilizing fibrin formation, the measurement of FXIII is essential for the diagnosis of bleeding disorders. Therefore, FXIII activity assessment as well as routine coagulation testing can be conducted simultaneously with one instrument, which is useful in coagulopathy assessment.

  4. Fully automated macular pathology detection in retina optical coherence tomography images using sparse coding and dictionary learning

    NASA Astrophysics Data System (ADS)

    Sun, Yankui; Li, Shan; Sun, Zhongyang

    2017-01-01

    We propose a framework for automated detection of dry age-related macular degeneration (AMD) and diabetic macular edema (DME) from retina optical coherence tomography (OCT) images, based on sparse coding and dictionary learning. The study aims to improve the classification performance of state-of-the-art methods. First, our method presents a general approach to automatically align and crop retina regions; then it obtains global representations of images by using sparse coding and a spatial pyramid; finally, a multiclass linear support vector machine classifier is employed for classification. We apply two datasets for validating our algorithm: the Duke spectral domain OCT (SD-OCT) dataset, consisting of volumetric scans acquired from 45 subjects (15 normal subjects, 15 AMD patients, and 15 DME patients); and a clinical SD-OCT dataset, consisting of 678 OCT retina scans acquired from clinics in Beijing (168, 297, and 213 OCT images for AMD, DME, and normal retinas, respectively). For the former dataset, our classifier correctly identifies 100%, 100%, and 93.33% of the volumes with DME, AMD, and normal subjects, respectively, and thus performs much better than the conventional method; for the latter dataset, our classifier leads to a correct classification rate of 99.67%, 99.67%, and 100.00% for DME, AMD, and normal images, respectively.

  5. Fully Automated Gis-Based Individual Tree Crown Delineation Based on Curvature Values from a LIDAR Derived Canopy Height Model in a Coniferous Plantation

    NASA Astrophysics Data System (ADS)

    Argamosa, R. J. L.; Paringit, E. C.; Quinton, K. R.; Tandoc, F. A. M.; Faelga, R. A. G.; Ibañez, C. A. G.; Posilero, M. A. V.; Zaragosa, G. P.

    2016-06-01

    The generation of a high-resolution canopy height model (CHM) from LiDAR makes it possible to delineate individual tree crowns by means of a fully automated method using the CHM's curvature through its slope. The local maxima are obtained by taking the maximum raster value in a 3 m x 3 m cell. These values are assumed to be tree tops and are therefore considered individual trees. Based on these assumptions, Thiessen polygons were generated to serve as buffers for the canopy extent. The negative profile curvature is then measured from the slope of the CHM. The results show that the aggregated points from a negative profile curvature raster provide the most realistic crown shape. In the absence of field data on tree crown dimensions, accurate visual assessment was required after the delineated tree crown polygons were superimposed on the hill-shaded CHM.
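
The local-maxima rule above (a cell is a tree top if it holds the maximum value of its moving window) can be sketched in a few lines of pure Python; the toy CHM values and window size are illustrative only:

```python
def local_maxima(chm, win=1):
    """Return (row, col) of cells that are the strict maximum of their
    (2*win+1) x (2*win+1) neighbourhood -- candidate tree tops in a CHM."""
    rows, cols = len(chm), len(chm[0])
    tops = []
    for r in range(rows):
        for c in range(cols):
            v = chm[r][c]
            neigh = [chm[i][j]
                     for i in range(max(0, r - win), min(rows, r + win + 1))
                     for j in range(max(0, c - win), min(cols, c + win + 1))
                     if (i, j) != (r, c)]
            if all(v > n for n in neigh):
                tops.append((r, c))
    return tops

# toy canopy height raster (metres)
chm = [[2, 2, 1, 1],
       [2, 9, 1, 1],
       [1, 1, 1, 8],
       [1, 1, 1, 1]]
print(local_maxima(chm))  # [(1, 1), (2, 3)]
```

Each detected top could then seed a Thiessen (Voronoi) polygon to bound the crown extent, as in the workflow above.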

  6. On transcending the impasse of respiratory motion correction applications in routine clinical imaging - a consideration of a fully automated data driven motion control framework.

    PubMed

    Kesner, Adam L; Schleyer, Paul J; Büther, Florian; Walter, Martin A; Schäfers, Klaus P; Koo, Phillip J

    2014-12-01

    Positron emission tomography (PET) is increasingly used for the detection, characterization, and follow-up of tumors located in the thorax. However, patient respiratory motion presents a unique limitation that hinders the application of high-resolution PET technology for this type of imaging. Efforts to transcend this limitation have been underway for more than a decade, yet PET remains for practical considerations a modality vulnerable to motion-induced image degradation. Respiratory motion control is not employed in routine clinical operations. In this article, we take an opportunity to highlight some of the recent advancements in data-driven motion control strategies and how they may form an underpinning for what we are presenting as a fully automated data-driven motion control framework. This framework represents an alternative direction for future endeavors in motion control and can conceptually connect individual focused studies with a strategy for addressing big picture challenges and goals.

  7. A novel fully automated on-line coupled liquid chromatography-gas chromatography technique used for the determination of organochlorine pesticide residues in tobacco and tobacco products.

    PubMed

    Qi, Dawei; Fei, Ting; Sha, Yunfei; Wang, Leijun; Li, Gang; Wu, Da; Liu, Baizhan

    2014-12-29

    In this study, a novel fully automated on-line coupled liquid chromatography-gas chromatography (LC-GC) technique is reported and applied to the determination of organochlorine pesticide residues (OCPs) in tobacco and tobacco products. By using a switching valve to isolate the capillary pre-column and the analytical column during the solvent evaporation period, the LC solvent can be completely removed and prevented from reaching the GC column and the detector. The established method was used to determine the OCPs in tobacco samples. A Florisil SPE column and the GPC technique were used to remove polar impurities and large-molecule impurities. A dynamic range of 1-100 ng/mL was achieved, with detection limits from 1.5 to 3.3 μg/kg. The method exhibited good repeatability and recoveries. This technology may provide an alternative approach for trace analysis of complex samples.

  8. Instrumentation of LOTIS: Livermore Optical Transient Imaging System; a fully automated wide field of view telescope system searching for simultaneous optical counterparts of gamma ray bursts

    SciTech Connect

    Park, H.S.; Ables, E.; Barthelmy, S.D.; Bionta, R.M.; Ott, L.L.; Parker, E.L.; Williams, G.G.

    1998-03-06

    LOTIS is a rapidly slewing wide-field-of-view telescope which was designed and constructed to search for simultaneous gamma-ray burst (GRB) optical counterparts. This experiment requires a rapidly slewing (< 10 sec), wide-field-of-view (> 15°), automatic and dedicated telescope. LOTIS utilizes commercial telephoto lenses and custom 2048 x 2048 CCD cameras to view a 17.6° x 17.6° field of view. It can point to any part of the sky within 5 sec and is fully automated. It is connected via an Internet socket to the GRB coordinate distribution network, which analyzes telemetry from the satellite and delivers GRB coordinate information in real time. LOTIS started routine operation in Oct. 1996. In the idle time between GRB triggers, LOTIS systematically surveys the entire available sky every night for new optical transients. This paper describes the system design and performance.

  9. Improved synthesis of [(18)F]FLETT via a fully automated vacuum distillation method for [(18)F]2-fluoroethyl azide purification.

    PubMed

    Ackermann, Uwe; Plougastel, Lucie; Goh, Yit Wooi; Yeoh, Shinn Dee; Scott, Andrew M

    2014-12-01

    The synthesis of [(18)F]2-fluoroethyl azide and its subsequent click reaction with 5-ethynyl-2'-deoxyuridine (EDU) to form [(18)F]FLETT was performed using an iPhase FlexLab module. The implementation of a vacuum distillation method afforded [(18)F]2-fluoroethyl azide in 87±5.3% radiochemical yield. The use of Cu(CH3CN)4PF6 and TBTA as the catalyst system enabled us to fully automate the [(18)F]FLETT synthesis without the need for the operator to enter the radiation field. [(18)F]FLETT was produced in higher overall yield (41.3±6.5%) and with a shorter synthesis time (67 min) than with our previously reported manual method (32.5±2.5% in 130 min).

  10. The Enigma ML FluAB-RSV assay: a fully automated molecular test for the rapid detection of influenza A, B and respiratory syncytial viruses in respiratory specimens.

    PubMed

    Goldenberg, Simon D; Edgeworth, Jonathan D

    2015-01-01

    The Enigma(®) ML FluAB-RSV assay (Enigma Diagnostics, Porton Down, Salisbury, UK) is a CE-IVD marked multiplex molecular panel for the detection of influenza A, B and respiratory syncytial viruses in nasopharyngeal swabs. The assay runs on the fully automated Enigma ML platform without further specimen manipulation and provides a sample-to-answer result within 95 min. The reported sensitivity and specificity for influenza A are 100% (95% CI: 98.2-100) and 98.3% (95% CI: 95.5-99.4), respectively, for influenza B are 100% (95% CI: 98.2-100) and 98.7% (95% CI: 96-99.6), respectively, and for respiratory syncytial virus are 100% (95% CI: 98.2-100) and 99.4% (95% CI: 97.2-99.9), respectively.
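
The paired sensitivity/specificity figures above each carry a 95% confidence interval. One standard way such intervals are computed for proportions is the Wilson score method; a sketch (the method choice is my assumption, not stated in the abstract):

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score 95% CI for a proportion -- a common interval type for
    reported assay sensitivity and specificity."""
    p = successes / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return centre - half, centre + half

# e.g. 205/205 positives detected (illustrative counts)
lo, hi = wilson_ci(205, 205)
print(f"sensitivity 100% (95% CI: {lo*100:.1f}-{hi*100:.1f})")
```

Note that for 205/205 the Wilson interval reproduces the "100% (95% CI: 98.2-100)" pattern seen in the abstract; unlike the naive normal-approximation interval, it never collapses to a zero-width interval at p = 1.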

  11. SU-D-BRD-06: Creating a Safety Net for a Fully Automated, Script Driven Electronic Medical Record

    SciTech Connect

    Sheu, R; Ghafar, R; Powers, A; Green, S; Lo, Y

    2015-06-15

    Purpose: Demonstrate the effectiveness of in-house software in ensuring EMR workflow efficiency and safety. Methods: A web-based dashboard system (WBDS) was developed to monitor clinical workflow in real time using web technology (WAMP) through ODBC (Open Database Connectivity). Within Mosaiq (Elekta Inc), operational workflow is driven and indicated by Quality Check Lists (QCLs), which are triggered by the automation software IQ Scripts (Elekta Inc); QCLs rely on user completion to propagate. The WBDS retrieves data directly from the Mosaiq SQL database and tracks clinical events in real time. For example, the necessity of a physics initial chart check can be determined by screening all patients on treatment who have received their first fraction but have not yet had their first chart check. Monitoring such "real" events with our in-house software creates a safety net, as its propagation does not rely on individual users' input. Results: The WBDS monitors the following: patient care workflow (initial consult to end of treatment), daily treatment consistency (scheduling, technique, charges), physics chart checks (initial, EOT, weekly), new starts, missing treatments (>3 warning/>5 fractions, action required), and machine overrides. The WBDS can be launched from any web browser, which gives the end user complete transparency and timely information. Since the creation of the dashboards, workflow interruptions due to accidental deletion or completion of QCLs have been eliminated. Additionally, all physics chart checks were completed on time. Prompt notification of treatment record inconsistencies and machine overrides has decreased the time between occurrence and execution of corrective action. Conclusion: Our clinical workflow relies primarily on QCLs and IQ Scripts; however, this functionality alone is not a panacea for safety and efficiency. The WBDS creates a more thorough system of checks, providing a safer and nearly error-free working environment.
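
The chart-check example above amounts to a screening query against the treatment database. A minimal sketch using an in-memory SQLite database (the schema, table, and column names are entirely hypothetical; the real WBDS queries the Mosaiq SQL database via ODBC):

```python
import sqlite3

# Hypothetical, simplified schema for illustration only.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE treatments (patient_id INTEGER, fractions_delivered INTEGER);
CREATE TABLE chart_checks (patient_id INTEGER, check_type TEXT);
INSERT INTO treatments VALUES (1, 3), (2, 0), (3, 2);
INSERT INTO chart_checks VALUES (1, 'initial');
""")

# Patients who have received their first fraction but have no
# initial physics chart check on record -- the WBDS safety-net rule.
rows = db.execute("""
    SELECT t.patient_id
    FROM treatments t
    WHERE t.fractions_delivered >= 1
      AND t.patient_id NOT IN (
          SELECT patient_id FROM chart_checks WHERE check_type = 'initial')
""").fetchall()
print(rows)  # [(3,)]
```

Because the query screens actual treatment records rather than QCL completion flags, an accidentally deleted or prematurely completed QCL cannot hide a missing check.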

  12. Platform-Independent Cirrus and Spectralis Thickness Measurements in Eyes with Diabetic Macular Edema Using Fully Automated Software

    PubMed Central

    Willoughby, Alex S.; Chiu, Stephanie J.; Silverman, Rachel K.; Farsiu, Sina; Bailey, Clare; Wiley, Henry E.; Ferris, Frederick L.; Jaffe, Glenn J.

    2017-01-01

    Purpose We determine whether the automated segmentation software, Duke Optical Coherence Tomography Retinal Analysis Program (DOCTRAP), can measure, in a platform-independent manner, retinal thickness on Cirrus and Spectralis spectral domain optical coherence tomography (SD-OCT) images in eyes with diabetic macular edema (DME) under treatment in a clinical trial. Methods Automatic segmentation software was used to segment the internal limiting membrane (ILM), inner retinal pigment epithelium (RPE), and Bruch's membrane (BM) in SD-OCT images acquired by Cirrus and Spectralis commercial systems, from the same eye, on the same day during a clinical interventional DME trial. Mean retinal thickness differences were compared across commercial and DOCTRAP platforms using intraclass correlation (ICC) and Bland-Altman plots. Results The mean 1 mm central subfield thickness difference (standard error [SE]) comparing segmentation of Spectralis images with DOCTRAP versus HEYEX was 0.7 (0.3) μm (0.2 pixels). The corresponding values comparing segmentation of Cirrus images with DOCTRAP versus Cirrus software was 2.2 (0.7) μm. The mean 1 mm central subfield thickness difference (SE) comparing segmentation of Cirrus and Spectralis scan pairs with DOCTRAP using BM as the outer retinal boundary was −2.3 (0.9) μm compared to 2.8 (0.9) μm with inner RPE as the outer boundary. Conclusions DOCTRAP segmentation of Cirrus and Spectralis images produces validated thickness measurements that are very similar to each other, and very similar to the values generated by the corresponding commercial software in eyes with treated DME. Translational Relevance This software enables automatic total retinal thickness measurements across two OCT platforms, a process that is impractical to perform manually. PMID:28180033
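
The cross-platform comparison above reports mean thickness differences and Bland-Altman agreement. The core of a Bland-Altman analysis (bias and 95% limits of agreement) is a short computation; the thickness values below are toy numbers, not study data:

```python
import statistics

def bland_altman(a, b):
    """Mean difference (bias) and 95% limits of agreement between paired
    measurements from two platforms."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)  # sample standard deviation
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# toy central-subfield thicknesses (um) for the same eyes on two platforms
cirrus     = [310.0, 402.0, 288.0, 350.0]
spectralis = [308.0, 399.0, 290.0, 347.0]
bias, (loa_lo, loa_hi) = bland_altman(cirrus, spectralis)
print(round(bias, 2))  # 1.5
```

A small bias with narrow limits of agreement, as reported for DOCTRAP, indicates the two platforms' measurements are interchangeable within clinical tolerance.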

  13. Fully Automated Simultaneous Integrated Boosted-Intensity Modulated Radiation Therapy Treatment Planning Is Feasible for Head-and-Neck Cancer: A Prospective Clinical Study

    SciTech Connect

    Wu Binbin; McNutt, Todd; Zahurak, Marianna; Simari, Patricio; Pang, Dalong; Taylor, Russell; Sanguineti, Giuseppe

    2012-12-01

    Purpose: To prospectively determine whether overlap volume histogram (OVH)-driven, automated simultaneous integrated boosted (SIB)-intensity-modulated radiation therapy (IMRT) treatment planning for head-and-neck cancer can be implemented in clinics. Methods and Materials: A prospective study was designed to compare fully automated plans (APs) created by an OVH-driven, automated planning application with clinical plans (CPs) created by dosimetrists in a 3-dose-level (70 Gy, 63 Gy, and 58.1 Gy), head-and-neck SIB-IMRT planning. Because primary organ sparing (cord, brain, brainstem, mandible, and optic nerve/chiasm) always received the highest priority in clinical planning, the study aimed to show the noninferiority of APs with respect to PTV coverage and secondary organ sparing (parotid, brachial plexus, esophagus, larynx, inner ear, and oral mucosa). The sample size was determined a priori by a superiority hypothesis test that had 85% power to detect a 4% dose decrease in secondary organ sparing with a 2-sided alpha level of 0.05. A generalized estimating equation (GEE) regression model was used for statistical comparison. Results: Forty consecutive patients were accrued from July to December 2010. GEE analysis indicated that in APs, overall average dose to the secondary organs was reduced by 1.16 (95% CI = 0.09-2.33) with P=.04, overall average PTV coverage was increased by 0.26% (95% CI = 0.06-0.47) with P=.02 and overall average dose to the primary organs was reduced by 1.14 Gy (95% CI = 0.45-1.8) with P=.004. A physician determined that all APs could be delivered to patients, and APs were clinically superior in 27 of 40 cases. Conclusions: The application can be implemented in clinics as a fast, reliable, and consistent way of generating plans that need only minor adjustments to meet specific clinical needs.

  14. Quantitative determination of opioids in whole blood using fully automated dried blood spot desorption coupled to on-line SPE-LC-MS/MS.

    PubMed

    Verplaetse, Ruth; Henion, Jack

    2016-01-01

    Opioids are well-known, widely used painkillers. Increased stability of opioids in the dried blood spot (DBS) matrix compared to blood/plasma has been described. Other benefits provided by DBS techniques include point-of-care collection, less invasive micro sampling, more economical shipment, and convenient storage. Current methodology for the analysis of micro whole blood samples for opioids is limited to the classical DBS workflow, including tedious manual punching of the DBS cards followed by extraction and liquid chromatography-tandem mass spectrometry (LC-MS/MS) bioanalysis. The goal of this study was to develop and validate a fully automated on-line sample preparation procedure for the analysis of DBS micro samples relevant to the detection of opioids in finger-prick blood. To this end, automated flow-through elution of DBS cards was followed by on-line solid-phase extraction (SPE) and analysis by LC-MS/MS. Selective, sensitive, accurate, and reproducible quantitation of five representative opioids in human blood at sub-therapeutic, therapeutic, and toxic levels was achieved. The range of reliable response (R(2) ≥0.997) was 1 to 500 ng/mL whole blood for morphine, codeine, oxycodone, and hydrocodone, and 0.1 to 50 ng/mL for fentanyl. Inter-day, intra-day, and matrix inter-lot accuracy and precision were within 15% (even at the lower limit of quantitation (LLOQ)). The method was successfully used to measure hydrocodone and its major metabolite norhydrocodone in incurred human samples. Our data support the enormous potential of DBS sampling and automated analysis for monitoring opioids as well as other pharmaceuticals in both anti-doping and pain management regimens.
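
A "range of reliable response" claim like the one above rests on fitting a calibration line and checking its coefficient of determination. A minimal sketch of that linearity check (the concentration/response pairs are hypothetical, not the study's calibrators):

```python
def linear_fit_r2(x, y):
    """Ordinary least-squares slope, intercept, and R^2 -- the linearity
    check behind a calibration-range claim."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (slope * xi + intercept)) ** 2
                 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return slope, intercept, 1 - ss_res / ss_tot

# hypothetical calibrators: concentration (ng/mL) vs. peak-area ratio
conc = [1, 5, 10, 50, 100, 500]
resp = [0.011, 0.052, 0.098, 0.510, 0.990, 5.020]
slope, intercept, r2 = linear_fit_r2(conc, resp)
print(r2 >= 0.997)  # True -- meets the study's acceptance criterion
```

In validated bioanalytical methods, weighted regression (e.g. 1/x²) is often preferred over this unweighted fit when the range spans several orders of magnitude.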

  15. Fully automated sample preparation microsystem for genetic testing of hereditary hearing loss using two-color multiplex allele-specific PCR.

    PubMed

    Zhuang, Bin; Gan, Wupeng; Wang, Shuaiqin; Han, Junping; Xiang, Guangxin; Li, Cai-Xia; Sun, Jing; Liu, Peng

    2015-01-20

    A fully automated microsystem consisting of a disposable DNA extraction and PCR microchip, together with a compact control instrument, has been successfully developed for genetic testing of hereditary hearing loss from human whole blood. DNA extraction and PCR were integrated into a single 15-μL reaction chamber, where a piece of filter paper was embedded for capturing genomic DNA, followed by in situ PCR amplification without elution. Diaphragm microvalves actuated by external solenoids, together with a "one-way" fluidic control strategy operated by a modular valve positioner and a syringe pump, were employed to control the fluids and to seal the chamber during thermal cycling. Fully automated DNA extraction from as little as 0.3 μL of human whole blood, followed by amplification of 59-bp β-actin fragments, can be completed on the microsystem in about 100 min. Negative control tests performed between blood sample analyses confirmed the elimination of any contamination or carryover in the system. To test the microsystem more critically, a two-color multiplex allele-specific PCR (ASPCR) assay was constructed for detecting the c.176_191del16, c.235delC, and c.299_300delAT mutations in the GJB2 gene that account for hereditary hearing loss. Two allele-specific primers, one labeled with TAMRA for the wild type and the other with FAM for the mutation, were designed for each locus. DNA extraction from blood and ASPCR were performed on the microsystem, followed by electrophoretic analysis on a portable microchip capillary electrophoresis system. Blood samples from a healthy donor and five persons with genetic mutations were all accurately analyzed in only two steps in less than 2 h.

  16. Fast and Efficient Fragment-Based Lead Generation by Fully Automated Processing and Analysis of Ligand-Observed NMR Binding Data.

    PubMed

    Peng, Chen; Frommlet, Alexandra; Perez, Manuel; Cobas, Carlos; Blechschmidt, Anke; Dominguez, Santiago; Lingel, Andreas

    2016-04-14

    NMR binding assays are routinely applied in hit finding and validation during early stages of drug discovery, particularly for fragment-based lead generation. To this end, compound libraries are screened by ligand-observed NMR experiments such as STD, T1ρ, and CPMG to identify molecules interacting with a target. The analysis of a high number of complex spectra is performed largely manually and therefore represents a limiting step in hit generation campaigns. Here we report a novel integrated computational procedure that processes and analyzes ligand-observed proton and fluorine NMR binding data in a fully automated fashion. A performance evaluation comparing automated and manual analysis results on (19)F- and (1)H-detected data sets shows that the program delivers robust, high-confidence hit lists in a fraction of the time needed for manual analysis and greatly facilitates visual inspection of the associated NMR spectra. These features enable considerably higher throughput, the assessment of larger libraries, and shorter turn-around times.

  17. EyeCatch: Data-mining over Half a Million EEG Independent Components to Construct a Fully-Automated Eye-Component Detector*

    PubMed Central

    Bigdely-Shamlo, Nima; Kreutz-Delgado, Ken; Kothe, Christian; Makeig, Scott

    2014-01-01

    Independent component analysis (ICA) can find distinct sources of electroencephalographic (EEG) activity, both brain-based and artifactual, and has become a common preprocessing step in the analysis of EEG data. Distinguishing between brain and non-brain independent components (ICs) accounting for, e.g., eye or muscle activities is an important step in the analysis. Here we present a fully automated method to identify eye-movement-related EEG components by analyzing the spatial distribution of their scalp projections (scalp maps). The EyeCatch method compares each input scalp map to a database of eye-related IC scalp maps obtained by data-mining over half a million IC scalp maps from 80,006 EEG datasets associated with a diverse set of EEG studies and paradigms. To our knowledge this is the largest sample of IC scalp maps that has ever been analyzed. Our results show performance comparable to a previous state-of-the-art semi-automated method, CORRMAP, while eliminating the need for human intervention. PMID:24111068

  18. Histograms of Oriented 3D Gradients for Fully Automated Fetal Brain Localization and Robust Motion Correction in 3 T Magnetic Resonance Images

    PubMed Central

    Macnaught, Gillian; Denison, Fiona C.; Reynolds, Rebecca M.; Semple, Scott I.; Boardman, James P.

    2017-01-01

    Fetal brain magnetic resonance imaging (MRI) is a rapidly emerging diagnostic imaging tool. However, automated fetal brain localization is one of the biggest obstacles in expediting and fully automating large-scale fetal MRI processing. We propose a method for automatic localization of the fetal brain in 3 T MRI when the images are acquired as a stack of 2D slices that are misaligned due to fetal motion. First, the Histogram of Oriented Gradients (HOG) feature descriptor is extended from 2D to 3D images. Then, a sliding window is used to assign a score to all possible windows in an image, depending on the likelihood of its containing a brain, and the window with the highest score is selected. In our evaluation experiments using a leave-one-out cross-validation strategy, we achieved complete brain localization in 96% of cases using a database of 104 MRI scans at gestational ages between 34 and 38 weeks. We carried out comparisons against template matching and random-forest-based regression methods, and the proposed method showed superior performance. We also showed the application of the proposed method in the optimization of fetal motion correction and how it is essential for the reconstruction process. The method is robust and does not rely on any prior knowledge of fetal brain development. PMID:28251155
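
The selection rule above (score every candidate window, keep the argmax) is independent of the descriptor used. A minimal sketch of the exhaustive 3D sliding-window search, with a stand-in scoring function in place of the paper's HOG-based classifier:

```python
def best_window(score_fn, volume_shape, window):
    """Slide a fixed-size window over a 3D volume and return the position
    with the highest detector score. `score_fn` stands in for the
    3D-HOG-based classifier described above."""
    D, H, W = volume_shape
    d, h, w = window
    best, best_pos = float("-inf"), None
    for z in range(D - d + 1):
        for y in range(H - h + 1):
            for x in range(W - w + 1):
                s = score_fn(z, y, x)
                if s > best:
                    best, best_pos = s, (z, y, x)
    return best_pos, best

# toy score function peaking at window origin (1, 2, 0)
peak = (1, 2, 0)
score = lambda z, y, x: -((z - peak[0]) ** 2 +
                          (y - peak[1]) ** 2 +
                          (x - peak[2]) ** 2)
pos, s = best_window(score, (3, 4, 3), (1, 1, 1))
print(pos)  # (1, 2, 0)
```

In practice the search is usually strided or coarse-to-fine rather than exhaustive at full resolution, but the argmax-over-windows logic is the same.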

  19. A fully automated meltwater monitoring and collection system for spatially distributed isotope analysis in snowmelt-dominated catchments

    NASA Astrophysics Data System (ADS)

    Rücker, Andrea; Boss, Stefan; Von Freyberg, Jana; Zappa, Massimiliano; Kirchner, James

    2016-04-01

    In many mountainous catchments the seasonal snowpack stores a significant volume of water, which is released as streamflow during the melting period. The predicted change in future climate will bring new challenges in water resource management in snow-dominated headwater catchments and their receiving lowlands. To improve predictions of hydrologic extreme events, particularly summer droughts, it is important to characterize the relationship between winter snowpack and summer (low) flows in such areas (e.g., Godsey et al., 2014). In this context, stable water isotopes (18O, 2H) are a powerful tool for fingerprinting the sources of streamflow and tracing water flow pathways. For this reason, we have established an isotope sampling network in the Alptal catchment (46.4 km2) in central Switzerland as part of the SREP-Drought project (Snow Resources and the Early Prediction of hydrological DROUGHT in mountainous streams). Samples of precipitation (daily), snow cores (weekly), and runoff (daily) are analyzed for their isotopic signature in a regular cycle. Precipitation is also sampled along a horizontal transect at the valley bottom and along an elevational transect. Additionally, the analysis of snow meltwater is of importance. As the collection of snow meltwater samples in mountainous terrain is often impractical, we have developed a fully automatic snow lysimeter system, which measures meltwater volume and collects samples for isotope analysis at daily intervals. The system consists of three lysimeters built from Decagon ECRN-100 High Resolution Rain Gauges as the standard component that allows monitoring of meltwater flow. Each lysimeter leads the meltwater into a 10-liter container that is automatically sampled and then emptied daily. These water samples are replaced regularly and analyzed afterwards for their isotopic composition in the lab. Snowmelt events as well as system status can be monitored in real time. In our presentation we describe the automatic snow lysimeter

  20. Fully automated diagnosis of papilledema through robust extraction of vascular patterns and ocular pathology from fundus photographs

    PubMed Central

    Fatima, Khush Naseeb; Hassan, Taimur; Akram, M. Usman; Akhtar, Mahmood; Butt, Wasi Haider

    2017-01-01

    Rapid development in the field of ophthalmology has increased the demand for computer-aided diagnosis of various eye diseases. Papilledema is an eye disease in which the optic disc of the eye is swollen due to an increase in intracranial pressure. This increased pressure can cause severe encephalic complications like abscess, tumors, meningitis or encephalitis, which may lead to a patient’s death. Although there have been several papilledema case studies reported from a medical point of view, only a few researchers have presented automated algorithms for this problem. This paper presents a novel computer-aided system which aims to automatically detect papilledema from fundus images. Firstly, the fundus images are preprocessed by optic disc detection and vessel segmentation. After preprocessing, a total of 26 different features are extracted to capture possible changes in the optic disc due to papilledema. These features are further divided into four categories based upon their color, textural, vascular and disc margin obscuration properties. The best features are then selected and combined to form a feature matrix that is used to distinguish between normal images and images with papilledema using a supervised support vector machine (SVM) classifier. The proposed method is tested on 160 fundus images obtained from two different data sets: structured analysis of the retina (STARE), which is a publicly available data set, and our local data set acquired from the Armed Forces Institute of Ophthalmology (AFIO). The STARE data set contributed 90 fundus images and our local data set 70. Ground truth annotations were performed with the help of two ophthalmologists. We report detection accuracies of 95.6% for STARE, 87.4% for the local data set, and 85.9% for the combined STARE and local data sets. The proposed system is fast and robust in detecting papilledema from fundus images with promising results. This will aid

  1. Fully automated production of diverse 18F-labeled PET tracers on the ELIXYS multi-reactor radiosynthesizer without hardware modification

    PubMed Central

    Lazari, Mark; Collins, Jeffrey; Shen, Bin; Farhoud, Mohammed; Yeh, Daniel; Maraglia, Brandon; Chin, Frederick T.; Nathanson, David A.; Moore, Melissa; van Dam, R. Michael

    2015-01-01

    Fully-automated radiosynthesizers are continuing to be developed to meet the growing need for the reliable production of positron emission tomography (PET) tracers made under current good manufacturing practice (cGMP) guidelines. There is a current trend towards supporting “kit-like” disposable cassettes that come preconfigured for particular tracers, thus eliminating the need for cleaning protocols between syntheses and enabling quick transitions to synthesizing other tracers. Though ideal for production, these systems are often limited for the development of novel tracers due to pressure, temperature, and chemical compatibility considerations. This study demonstrates the versatile use of the ELIXYS fully-automated radiosynthesizer to adapt and produce eight different 18F-labeled PET tracers of varying complexity. Methods Three reactor syntheses of D-[18F]FAC, L-[18F]FMAU, and D-[18F]FEAU along with the one reactor syntheses of D-[18F]FEAU, [18F]FDG, [18F]FLT, [18F]Fallypride, [18F]FHBG, and [18F]SFB were all produced using ELIXYS without the need for any hardware modifications or reconfiguration. Synthesis protocols were adapted, and slightly modified from literature, but not fully optimized. Furthermore, [18F]FLT, [18F]FDG, and [18F]Fallypride were produced sequentially on the same day and used for preclinical imaging of A431 tumor-bearing SCID mice and wild-type BALB/c mice, respectively. To assess future translation to the clinical setting, several batches of tracers were subjected to a full set of quality control tests. Results All tracers were produced with radiochemical yields comparable to those in literature. [18F]FLT, [18F]FDG, and [18F]Fallypride were successfully used to image the mice with results consistent with literature. All tracers subjected to clinical quality control tests passed. Conclusion The ELIXYS radiosynthesizer facilitates rapid tracer development and is capable of producing multiple 18F-labeled PET tracers suitable for clinical

  2. Fully automated segmentation and tracking of the intima media thickness in ultrasound video sequences of the common carotid artery.

    PubMed

    Ilea, Dana E; Duffy, Caoimhe; Kavanagh, Liam; Stanton, Alice; Whelan, Paul F

    2013-01-01

    The robust identification and measurement of the intima media thickness (IMT) has a high clinical relevance because it represents one of the most precise predictors used in the assessment of potential future cardiovascular events. To facilitate the analysis of arterial wall thickening in serial clinical investigations, in this paper we have developed a novel fully automatic algorithm for the segmentation, measurement, and tracking of the intima media complex (IMC) in B-mode ultrasound video sequences. The proposed algorithm entails a two-stage image analysis process that initially addresses the segmentation of the IMC in the first frame of the ultrasound video sequence using a model-based approach; in the second step, a novel customized tracking procedure is applied to robustly detect the IMC in the subsequent frames. For the video tracking procedure, we introduce a spatially coherent algorithm called adaptive normalized correlation that prevents the tracking process from converging to wrong arterial interfaces. This represents the main contribution of this paper and was developed to deal with inconsistencies in the appearance of the IMC over the cardiac cycle. The quantitative evaluation has been carried out on 40 ultrasound video sequences of the common carotid artery (CCA) by comparing the results returned by the developed algorithm with respect to ground truth data that has been manually annotated by clinical experts. The measured IMT(mean) ± standard deviation recorded by the proposed algorithm is 0.60 mm ± 0.10, with a mean coefficient of variation (CV) of 2.05%, whereas the corresponding result obtained for the manually annotated ground truth data is 0.60 mm ± 0.11 with a mean CV equal to 5.60%. The numerical results reported in this paper indicate that the proposed algorithm is able to correctly segment and track the IMC in ultrasound CCA video sequences, and we were encouraged by the stability of our technique when applied to data captured under
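
The tracking stage above is built on normalized correlation between a template and candidate patches. A minimal sketch of the underlying similarity score (the paper's "adaptive normalized correlation" adds spatial-coherence constraints on top of this; the pixel values are toy data):

```python
import math

def normalized_correlation(template, patch):
    """Zero-mean normalized cross-correlation between a template and a
    candidate patch (flattened pixel lists). Invariant to additive
    brightness offsets and contrast scaling -- the property that makes it
    suitable for tracking interfaces across ultrasound frames."""
    mt = sum(template) / len(template)
    mp = sum(patch) / len(patch)
    num = sum((t - mt) * (p - mp) for t, p in zip(template, patch))
    den = math.sqrt(sum((t - mt) ** 2 for t in template) *
                    sum((p - mp) ** 2 for p in patch))
    return num / den if den else 0.0

template = [10, 20, 30, 20, 10]
shifted  = [12, 22, 32, 22, 12]   # same profile, brightness offset
noise    = [30, 10, 20, 10, 30]
print(normalized_correlation(template, shifted))  # 1.0 (offset-invariant)
print(normalized_correlation(template, noise) < 0.5)  # True
```

In a tracker, this score is evaluated over a search region in the next frame and the best-matching patch updates the interface position; the adaptive variant rejects matches that are spatially inconsistent with neighboring interface points.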

  3. PNA microarrays for hybridisation of unlabelled DNA samples

    PubMed Central

    Brandt, Ole; Feldner, Julia; Stephan, Achim; Schröder, Markus; Schnölzer, Martina; Arlinghaus, Heinrich F.; Hoheisel, Jörg D.; Jacob, Anette

    2003-01-01

    Several strategies have been developed for the production of peptide nucleic acid (PNA) microarrays by parallel probe synthesis and selective coupling of full-length molecules. Such microarrays were used for direct detection of the hybridisation of unlabelled DNA by time-of-flight secondary ion mass spectrometry. PNAs were synthesised by an automated process on filter-bottom microtitre plates. The resulting molecules were released from the solid support and attached without any purification to microarray surfaces via the terminal amino group itself or via modifications, which had been chemically introduced during synthesis. Thus, only full-length PNA oligomers were attached whereas truncated molecules, produced during synthesis because of incomplete condensation reactions, did not bind. Different surface chemistries and fitting modifications of the PNA terminus were tested. For an examination of coupling selectivity, bound PNAs were cleaved off microarray surfaces and analysed by MALDI-TOF mass spectrometry. Additionally, hybridisation experiments were performed to compare the attachment chemistries, with fully acetylated PNAs spotted as controls. Upon hybridisation of unlabelled DNA to such microarrays, binding events could be detected by visualisation of phosphates, which are an integral part of nucleic acids but missing entirely in PNA probes. Overall best results in terms of selectivity and sensitivity were obtained with thiol-modified PNAs on maleimide surfaces. PMID:14500847

  4. Comparison of Two Theory-Based, Fully Automated Telephone Interventions Designed to Maintain Dietary Change in Healthy Adults: Study Protocol of a Three-Arm Randomized Controlled Trial

    PubMed Central

    Quintiliani, Lisa M; Turner-McGrievy, Gabrielle M; Migneault, Jeffrey P; Heeren, Timothy; Friedman, Robert H

    2014-01-01

    Background Health behavior change interventions have focused on obtaining short-term intervention effects; few studies have evaluated mid-term and long-term outcomes, and even fewer have evaluated interventions that are designed to maintain and enhance initial intervention effects. Moreover, behavior theory has not been developed for maintenance or applied to maintenance intervention design to the degree that it has for behavior change initiation. Objective The objective of this paper is to describe a study that compared two theory-based interventions (social cognitive theory [SCT] vs goal systems theory [GST]) designed to maintain previously achieved improvements in fruit and vegetable (F&V) consumption. Methods The interventions used tailored, interactive conversations delivered by a fully automated telephony system (Telephone-Linked Care [TLC]) over a 6-month period. TLC maintenance intervention based on SCT used a skills-based approach to build self-efficacy. It assessed confidence in and barriers to eating F&V, provided feedback on how to overcome barriers, plan ahead, and set goals. The TLC maintenance intervention based on GST used a cognitive-based approach. Conversations trained participants in goal management to help them integrate their newly acquired dietary behavior into their hierarchical system of goals. Content included goal facilitation, conflict, shielding, and redundancy, and reflection on personal goals and priorities. To evaluate and compare the two approaches, a sample of adults whose F&V consumption was below public health goal levels were recruited from a large urban area to participate in a fully automated telephony intervention (TLC-EAT) for 3-6 months. Participants who increase their daily intake of F&V by ≥1 serving/day will be eligible for the three-arm randomized controlled trial. A sample of 405 participants will be randomized to one of three arms: (1) an assessment-only control, (2) TLC-SCT, and (3) TLC-GST. The maintenance

  5. Development of a Real-Time PCR Protocol Requiring Minimal Handling for Detection of Vancomycin-Resistant Enterococci with the Fully Automated BD Max System.

    PubMed

    Dalpke, Alexander H; Hofko, Marjeta; Zimmermann, Stefan

    2016-09-01

    Vancomycin-resistant enterococci (VRE) are an important cause of health care-associated infections, resulting in significant mortality and a substantial economic burden in hospitals. Active surveillance for at-risk populations contributes to the prevention of infections with VRE. The availability of a combination of automation and molecular detection procedures for rapid screening would be beneficial. Here, we report on the development of a laboratory-developed PCR for detection of VRE which runs on the fully automated Becton Dickinson (BD) Max platform, which combines DNA extraction, PCR setup, and real-time PCR amplification. We evaluated two protocols: one using a liquid master mix and the other employing commercially ordered dry-down reagents. The BD Max VRE PCR was evaluated in two rounds with 86 and 61 rectal elution swab (eSwab) samples, and the results were compared to the culture results. The sensitivities of the different PCR formats were 84 to 100% for vanA and 83.7 to 100% for vanB; specificities were 96.8 to 100% for vanA and 81.8 to 97% for vanB. With the dry-down reagents and the ExK DNA-2 kit for extraction, fewer samples showed inhibition (3.3%) than with the liquid master mix (14.8%). Adoption of a cutoff threshold cycle of 35 for discrimination of vanB-positive samples allowed the specificity to be increased to 87.9%. The performance of the BD Max VRE assay equaled that of the BD GeneOhm VanR assay, which was run in parallel. The use of dry-down reagents simplifies the assay and omits any need to handle liquid PCR reagents.
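    The threshold-cycle cutoff adopted for vanB amounts to a one-line decision rule; a hypothetical sketch (the function name and the convention of treating absent amplification as negative are assumptions, not the authors' software):

```python
def call_vanB(ct, cutoff=35.0):
    """Classify a vanB real-time PCR result using a Ct cutoff: amplification
    at or below the cutoff is reported positive, later or absent is negative."""
    if ct is None:            # no amplification curve observed
        return "negative"
    return "positive" if ct <= cutoff else "negative"
```

    Raising specificity this way trades away late-amplifying true positives, which is why the abstract reports the cutoff's effect on specificity explicitly.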

  6. Fully automated quantification of cytomegalovirus (CMV) in whole blood with the new sensitive Abbott RealTime CMV assay in the era of the CMV international standard.

    PubMed

    Schnepf, Nathalie; Scieux, Catherine; Resche-Riggon, Matthieu; Feghoul, Linda; Xhaard, Alienor; Gallien, Sébastien; Molina, Jean-Michel; Socié, Gérard; Viglietti, Denis; Simon, François; Mazeron, Marie-Christine; Legoff, Jérôme

    2013-07-01

    Fully standardized reproducible and sensitive quantification assays for cytomegalovirus (CMV) are needed to better define thresholds for antiviral therapy initiation and interruption. We evaluated the newly released Abbott RealTime CMV assay for CMV quantification in whole blood (WB) that includes automated extraction and amplification (m2000 RealTime system). Sensitivity, accuracy, linearity, and intra- and interassay variability were validated in a WB matrix using Quality Control for Molecular Diagnostics (QCMD) panels and the WHO international standard (IS). The intra- and interassay coefficients of variation were 1.37% and 2.09% at 5 log10 copies/ml and 2.41% and 3.80% at 3 log10 copies/ml, respectively. According to expected values for the QCMD and Abbott RealTime CMV methods, the lower limits of quantification were 104 and <50 copies/ml, respectively. The conversion factor between international units and copies (2.18), determined from serial dilutions of the WHO IS in WB, was significantly different from the factor provided by the manufacturer (1.56) (P = 0.001). Results from 302 clinical samples were compared with those from the Qiagen artus CMV assay on the same m2000 RealTime system. The two assays provided highly concordant results (concordance correlation coefficient, 0.92), but the Abbott RealTime CMV assay detected and quantified, respectively, 20.6% and 47.8% more samples than the Qiagen/artus CMV assay. The sensitivity and reproducibility of the results, along with the automation, fulfilled the quality requirements for implementation of the Abbott RealTime CMV assay in clinical settings. Our results highlight the need for careful validation of conversion factors provided by the manufacturers for the WHO IS in WB to allow future comparison of results obtained with different assays.
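    The unit conversion at issue reduces to a multiplicative factor (copies = factor × IU). One way such a factor could be estimated from paired serial-dilution measurements is as a least-squares slope through the origin; this estimation choice is an assumption for illustration, not necessarily the authors' procedure:

```python
import numpy as np

def conversion_factor(iu_per_ml, copies_per_ml):
    """Estimate a copies-per-IU conversion factor from paired serial-dilution
    measurements, as the least-squares slope of a line through the origin."""
    x = np.asarray(iu_per_ml, dtype=float)
    y = np.asarray(copies_per_ml, dtype=float)
    return float((x @ y) / (x @ x))

def iu_to_copies(iu, factor=2.18):
    """Convert IU/ml to copies/ml with a given factor (2.18 reported above)."""
    return iu * factor
```

    With the abstract's factor of 2.18, a 1000 IU/ml sample corresponds to 2180 copies/ml, whereas the manufacturer's factor of 1.56 would give 1560 copies/ml, which illustrates why validating the factor matters for comparing results across assays.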

  7. Fully Automated Quantification of Cytomegalovirus (CMV) in Whole Blood with the New Sensitive Abbott RealTime CMV Assay in the Era of the CMV International Standard

    PubMed Central

    Schnepf, Nathalie; Scieux, Catherine; Resche-Riggon, Matthieu; Feghoul, Linda; Xhaard, Alienor; Gallien, Sébastien; Molina, Jean-Michel; Socié, Gérard; Viglietti, Denis; Simon, François; Mazeron, Marie-Christine

    2013-01-01

    Fully standardized reproducible and sensitive quantification assays for cytomegalovirus (CMV) are needed to better define thresholds for antiviral therapy initiation and interruption. We evaluated the newly released Abbott RealTime CMV assay for CMV quantification in whole blood (WB) that includes automated extraction and amplification (m2000 RealTime system). Sensitivity, accuracy, linearity, and intra- and interassay variability were validated in a WB matrix using Quality Control for Molecular Diagnostics (QCMD) panels and the WHO international standard (IS). The intra- and interassay coefficients of variation were 1.37% and 2.09% at 5 log10 copies/ml and 2.41% and 3.80% at 3 log10 copies/ml, respectively. According to expected values for the QCMD and Abbott RealTime CMV methods, the lower limits of quantification were 104 and <50 copies/ml, respectively. The conversion factor between international units and copies (2.18), determined from serial dilutions of the WHO IS in WB, was significantly different from the factor provided by the manufacturer (1.56) (P = 0.001). Results from 302 clinical samples were compared with those from the Qiagen artus CMV assay on the same m2000 RealTime system. The two assays provided highly concordant results (concordance correlation coefficient, 0.92), but the Abbott RealTime CMV assay detected and quantified, respectively, 20.6% and 47.8% more samples than the Qiagen/artus CMV assay. The sensitivity and reproducibility of the results, along with the automation, fulfilled the quality requirements for implementation of the Abbott RealTime CMV assay in clinical settings. Our results highlight the need for careful validation of conversion factors provided by the manufacturers for the WHO IS in WB to allow future comparison of results obtained with different assays. PMID:23616450

  8. A fully-automated approach to land cover mapping with airborne LiDAR and high resolution multispectral imagery in a forested suburban landscape

    NASA Astrophysics Data System (ADS)

    Parent, Jason R.; Volin, John C.; Civco, Daniel L.

    2015-06-01

    Information on land cover is essential for guiding land management decisions and supporting landscape-level ecological research. In recent years, airborne light detection and ranging (LiDAR) and high resolution aerial imagery have become more readily available in many areas. These data have great potential to enable the generation of land cover at a fine scale and across large areas by leveraging 3-dimensional structure and multispectral information. LiDAR and other high resolution datasets must be processed in relatively small subsets due to their large volumes; however, conventional classification techniques cannot be fully automated and thus are unlikely to be feasible options when processing large high-resolution datasets. In this paper, we propose a fully automated rule-based algorithm to develop a 1 m resolution land cover classification from LiDAR data and multispectral imagery. The algorithm we propose uses a series of pixel- and object-based rules to identify eight vegetated and non-vegetated land cover features (deciduous and coniferous tall vegetation, medium vegetation, low vegetation, water, riparian wetlands, buildings, low impervious cover). The rules leverage both structural and spectral properties including height, LiDAR return characteristics, brightness in visible and near-infrared wavelengths, and normalized difference vegetation index (NDVI). Pixel-based properties were used initially to classify each land cover class while minimizing omission error; a series of object-based tests were then used to remove errors of commission. These tests used conservative thresholds, based on diverse test areas, to help avoid over-fitting the algorithm to the test areas. The accuracy assessment of the classification results included a stratified random sample of 3198 validation points distributed across 30 1 × 1 km tiles in eastern Connecticut, USA. The sample tiles were selected in a stratified random manner from locations representing the full range of
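    As an illustration of the pixel-based rules the abstract describes, the following toy classifier combines NDVI from the multispectral bands with LiDAR-derived height (all thresholds and class codes here are invented for illustration; the paper's calibrated rules are more elaborate and include object-based tests):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index, safe where nir + red == 0."""
    nir = nir.astype(float)
    red = red.astype(float)
    denom = nir + red
    out = np.zeros_like(denom)
    np.divide(nir - red, denom, out=out, where=denom != 0)
    return out

def classify(height, nir, red, veg_thresh=0.2, tall=5.0, low=0.5):
    """Toy pixel rules: 0 = non-vegetated, 1 = low, 2 = medium, 3 = tall
    vegetation. Thresholds are illustrative, not the paper's values."""
    v = ndvi(nir, red)
    cls = np.zeros(height.shape, dtype=int)           # default: non-vegetated
    veg = v >= veg_thresh
    cls[veg & (height < low)] = 1                     # low vegetation
    cls[veg & (height >= low) & (height < tall)] = 2  # medium vegetation
    cls[veg & (height >= tall)] = 3                   # tall vegetation
    return cls
```

    Running such rules per pixel first, then cleaning commission errors with object-based tests, mirrors the two-pass structure the abstract outlines.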

  9. The microfluidic bioagent autonomous networked detector (M-BAND): an update. Fully integrated, automated, and networked field identification of airborne pathogens

    NASA Astrophysics Data System (ADS)

    Sanchez, M.; Probst, L.; Blazevic, E.; Nakao, B.; Northrup, M. A.

    2011-11-01

    We describe a fully automated and autonomous air-borne biothreat detection system for biosurveillance applications. The system, including the nucleic-acid-based detection assay, was designed, built and shipped by Microfluidic Systems Inc (MFSI), a new subsidiary of PositiveID Corporation (PSID). Our findings demonstrate that the system and assay unequivocally identify pathogenic strains of Bacillus anthracis, Yersinia pestis, Francisella tularensis, Burkholderia mallei, and Burkholderia pseudomallei. In order to assess the assay's ability to detect unknown samples, our team also challenged it against a series of blind samples provided by the Department of Homeland Security (DHS). These samples included natural occurring isolated strains, near-neighbor isolates, and environmental samples. Our results indicate that the multiplex assay was specific and produced no false positives when challenged with in house gDNA collections and DHS provided panels. Here we present another analytical tool for the rapid identification of nine Centers for Disease Control and Prevention category A and B biothreat organisms.

  10. Fully automated method for simultaneous determination of total cysteine, cysteinylglycine, glutathione and homocysteine in plasma by HPLC with UV absorbance detection.

    PubMed

    Głowacki, Rafał; Bald, Edward

    2009-10-15

    A fully automated HPLC method for the simultaneous determination of total thiols in plasma samples has been developed. The method involves reductive conversion of disulfides to their reduced counterparts with the use of tris(2-carboxyethyl)phosphine. After reduction, the newly formed sulfhydryl groups are reacted with 2-chloro-1-methylquinolinium tetrafluoroborate to form 2-S-quinolinium derivatives, followed by deproteinization by dialysis. The reaction products are separated by reversed-phase HPLC, then detected and quantified by UV absorbance detection at 355 nm. The recommended HPLC procedure enables measurement of the four main plasma aminothiols, cysteine, cysteinylglycine, glutathione, and homocysteine, with low imprecision (mean relative standard deviations within the calibration range of 3.47%, 5.34%, 4.25% and 3.26%, respectively) and good sensitivity. Accuracy, expressed as the mean measured amount as a percentage of the added amount, was within 97.5-103.0%, 98.3-102.5%, 96.3-99.5% and 97.1-99.1%, respectively. The lower limit of quantification for all thiols was 0.5 µM. The whole unattended instrument acquisition time amounts to 13 min.
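    The precision and accuracy figures quoted (relative standard deviation, and the measured amount as a percentage of the added amount) follow the standard definitions, which can be computed as:

```python
import statistics

def rsd_percent(values):
    """Relative standard deviation (coefficient of variation) in percent."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

def recovery_percent(measured, added):
    """Accuracy as the measured amount expressed as a percentage of the added amount."""
    return 100.0 * measured / added
```

    For example, replicate measurements of 9, 10 and 11 µM give an RSD of 10%, and a measured 0.975 µM against 1.0 µM added gives a 97.5% recovery.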

  11. Multiresidue trace analysis of pharmaceuticals, their human metabolites and transformation products by fully automated on-line solid-phase extraction-liquid chromatography-tandem mass spectrometry.

    PubMed

    García-Galán, María Jesús; Petrovic, Mira; Rodríguez-Mozaz, Sara; Barceló, Damià

    2016-09-01

    A novel, fully automated analytical methodology based on dual-column liquid chromatography coupled to tandem mass spectrometry (LC-LC-MS(2)) has been developed and validated for the analysis of 12 pharmaceuticals and 20 metabolites and transformation products (TPs) in different types of water (influent and effluent wastewaters and surface water). Two LC columns were used - one for pre-concentration of the sample and the second for separation and analysis - so that water samples were injected directly into the chromatographic system. Besides the many advantages of the methodology, such as minimization of the required sample volume and of its manipulation, compounds ionized in both positive and negative modes could be analyzed simultaneously without compromising the sensitivity. A comparative study of different mobile phases, gradients and LC pre-concentration columns was carried out to obtain the best analytical performance. Method limits of detection (MLODs) achieved were in the low ng L(-1) range for all the compounds. The method was successfully applied to study the presence of the target analytes in different wastewater and surface water samples collected near the city of Girona (Catalonia, Spain). Data on the environmental presence and fate of pharmaceutical metabolites and TPs are still scarce, highlighting the relevance of the developed methodology.

  12. Black tea volatiles fingerprinting by comprehensive two-dimensional gas chromatography - Mass spectrometry combined with high concentration capacity sample preparation techniques: Toward a fully automated sensomic assessment.

    PubMed

    Magagna, Federico; Cordero, Chiara; Cagliero, Cecilia; Liberto, Erica; Rubiolo, Patrizia; Sgorbini, Barbara; Bicchi, Carlo

    2017-06-15

    Tea, prepared by infusion of the dried leaves of Camellia sinensis (L.) Kuntze, is the world's second most popular beverage, after water. Its consumption is closely tied to its chemical composition, which shapes its sensory and nutritional quality, consumer preferences, and potential health benefits. This study aims to obtain an informative chemical signature of the volatile fraction of black tea samples from Ceylon by applying the principles of sensomics. In particular, several high concentration capacity (HCC) sample preparation techniques were tested in combination with GC×GC-MS to investigate chemical signatures of black tea volatiles. This platform, using headspace solid-phase microextraction (HS-SPME) with a multicomponent fiber as the sampling technique, recovers 95% of the key odorants in a fully automated work-flow. A group of 123 components, including key odorants and technological and botanical tracers, were mapped. The resulting 2D fingerprints were interpreted by pattern recognition tools (i.e. template matching fingerprinting and scripting), providing highly informative chemical signatures for quality assessment.

  13. Antioxidant effects of carnitine supplementation on 14-3-3 protein isoforms in the aged rat hippocampus detected using fully automated two-dimensional chip gel electrophoresis.

    PubMed

    Iwamoto, M; Miura, Y; Tsumoto, H; Tanaka, Y; Morisawa, H; Endo, T; Toda, T

    2014-12-01

    We describe here the antioxidant effects of carnitine supplementation on 14-3-3 protein isoforms in the aged rat hippocampus, detected using the fully automated two-dimensional chip gel electrophoresis system (Auto2D). This system was easy and convenient to use, and its resolution was more sensitive and higher than that of conventional two-dimensional polyacrylamide gel electrophoresis (2-D PAGE). We separated and identified five isoforms of the 14-3-3 protein (beta/alpha, gamma, epsilon, zeta/delta, and eta) using the Auto2D system. We then examined the antioxidant effects of carnitine supplementation on the protein profiles of the cytosolic fraction of the aged rat hippocampus, demonstrating that carnitine supplementation suppressed the oxidation of methionine residues in these isoforms. Since methionine residues are easily oxidized to methionine sulfoxide, this convenient, high-resolution 2-D PAGE system can be used to analyze methionine oxidation while avoiding artifactual oxidation. We showed here that the Auto2D system is a very useful tool for studying antioxidant effects through proteomic analysis of protein oxidation.

  14. Web-Based Fully Automated Self-Help With Different Levels of Therapist Support for Individuals With Eating Disorder Symptoms: A Randomized Controlled Trial

    PubMed Central

    Dingemans, Alexandra E; Spinhoven, Philip; van Ginkel, Joost R; de Rooij, Mark; van Furth, Eric F

    2016-01-01

    Background Despite the disabling nature of eating disorders (EDs), many individuals with ED symptoms do not receive appropriate mental health care. Internet-based interventions have potential to reduce the unmet needs by providing easily accessible health care services. Objective This study aimed to investigate the effectiveness of an Internet-based intervention for individuals with ED symptoms, called “Featback.” In addition, the added value of different intensities of therapist support was investigated. Methods Participants (N=354) were aged 16 years or older with self-reported ED symptoms, including symptoms of anorexia nervosa, bulimia nervosa, and binge eating disorder. Participants were recruited via the website of Featback and the website of a Dutch pro-recovery–focused e-community for young women with ED problems. Participants were randomized to: (1) Featback, consisting of psychoeducation and a fully automated self-monitoring and feedback system, (2) Featback supplemented with low-intensity (weekly) digital therapist support, (3) Featback supplemented with high-intensity (3 times a week) digital therapist support, and (4) a waiting list control condition. Internet-administered self-report questionnaires were completed at baseline, post-intervention (ie, 8 weeks after baseline), and at 3- and 6-month follow-up. The primary outcome measure was ED psychopathology. Secondary outcome measures were symptoms of depression and anxiety, perseverative thinking, and ED-related quality of life. Statistical analyses were conducted according to an intent-to-treat approach using linear mixed models. Results The 3 Featback conditions were superior to a waiting list in reducing bulimic psychopathology (d=−0.16, 95% confidence interval (CI)=−0.31 to −0.01), symptoms of depression and anxiety (d=−0.28, 95% CI=−0.45 to −0.11), and perseverative thinking (d=−0.28, 95% CI=−0.45 to −0.11). No added value of therapist support was found in terms of symptom

  15. Gene ARMADA: an integrated multi-analysis platform for microarray data implemented in MATLAB

    PubMed Central

    Chatziioannou, Aristotelis; Moulos, Panagiotis; Kolisis, Fragiskos N

    2009-01-01

    Background The microarray data analysis realm is ever growing through the development of various tools, open source and commercial. However there is absence of predefined rational algorithmic analysis workflows or batch standardized processing to incorporate all steps, from raw data import up to the derivation of significantly differentially expressed gene lists. This absence obfuscates the analytical procedure and obstructs the massive comparative processing of genomic microarray datasets. Moreover, the solutions provided, heavily depend on the programming skills of the user, whereas in the case of GUI embedded solutions, they do not provide direct support of various raw image analysis formats or a versatile and simultaneously flexible combination of signal processing methods. Results We describe here Gene ARMADA (Automated Robust MicroArray Data Analysis), a MATLAB implemented platform with a Graphical User Interface. This suite integrates all steps of microarray data analysis including automated data import, noise correction and filtering, normalization, statistical selection of differentially expressed genes, clustering, classification and annotation. In its current version, Gene ARMADA fully supports 2 coloured cDNA and Affymetrix oligonucleotide arrays, plus custom arrays for which experimental details are given in tabular form (Excel spreadsheet, comma separated values, tab-delimited text formats). It also supports the analysis of already processed results through its versatile import editor. Besides being fully automated, Gene ARMADA incorporates numerous functionalities of the Statistics and Bioinformatics Toolboxes of MATLAB. In addition, it provides numerous visualization and exploration tools plus customizable export data formats for seamless integration by other analysis tools or MATLAB, for further processing. Gene ARMADA requires MATLAB 7.4 (R2007a) or higher and is also distributed as a stand-alone application with MATLAB Component Runtime

  16. Effectiveness of a Web-Based Screening and Fully Automated Brief Motivational Intervention for Adolescent Substance Use: A Randomized Controlled Trial

    PubMed Central

    Elgán, Tobias H; De Paepe, Nina; Tønnesen, Hanne; Csémy, Ladislav; Thomasius, Rainer

    2016-01-01

    Background Mid-to-late adolescence is a critical period for initiation of alcohol and drug problems, which can be reduced by targeted brief motivational interventions. Web-based brief interventions have advantages in terms of acceptability and accessibility and have shown significant reductions of substance use among college students. However, the evidence is sparse among adolescents with at-risk use of alcohol and other drugs. Objective This study evaluated the effectiveness of a targeted and fully automated Web-based brief motivational intervention with no face-to-face components on substance use among adolescents screened for at-risk substance use in four European countries. Methods In an open-access, purely Web-based randomized controlled trial, a convenience sample of adolescents aged 16-18 years from Sweden, Germany, Belgium, and the Czech Republic was recruited using online and offline methods and screened online for at-risk substance use using the CRAFFT (Car, Relax, Alone, Forget, Friends, Trouble) screening instrument. Participants were randomized to a single session brief motivational intervention group or an assessment-only control group but not blinded. Primary outcome was differences in past month drinking measured by a self-reported AUDIT-C-based index score for drinking frequency, quantity, and frequency of binge drinking with measures collected online at baseline and after 3 months. Secondary outcomes were the AUDIT-C-based separate drinking indicators, illegal drug use, and polydrug use. All outcome analyses were conducted with and without Expectation Maximization (EM) imputation of missing follow-up data. Results In total, 2673 adolescents were screened and 1449 (54.2%) participants were randomized to the intervention or control group. After 3 months, 211 adolescents (14.5%) provided follow-up data. 
Compared to the control group, results from linear mixed models revealed significant reductions in self-reported past-month drinking in favor of the

  17. Fully automated determination of macrocyclic musk fragrances in wastewater by microextraction by packed sorbents and large volume injection gas chromatography-mass spectrometry.

    PubMed

    Vallecillos, Laura; Pocurull, Eva; Borrull, Francesc

    2012-11-16

    A fully automated method has been developed for the determination of eight macrocyclic musk fragrances in urban wastewater. The procedure includes the enrichment of the analytes by microextraction by packed sorbent (MEPS) followed by large volume injection-gas chromatography-mass spectrometry (LVI-GC-MS). The main factors in the MEPS technique were optimized. For all of the analytes, the highest enrichment factors were achieved when 4 mL samples were extracted using a C18 MEPS sorbent and 50 μL of ethyl acetate were used for desorption. The eluate was directly analysed by GC-MS. Detection limits were found to be between 5 ng L(-1) and 10 ng L(-1), depending on the target analyte. In addition, under optimized conditions, the method gave good levels of intra-day and inter-day repeatability in wastewater samples, with relative standard deviations (RSD) (n=3, 1,000 ng L(-1)) of less than 5% and 9%, respectively. The applicability of the method was tested with influent and effluent samples from two urban wastewater treatment plants (WWTPs). The analysis of influent urban wastewater revealed the presence of most of the macrocyclic musks at concentrations higher than the method quantification limits (MQLs), with ambrettolide the most abundant analyte at 9.29 μg L(-1). The analyses of effluent urban wastewater showed a decrease in concentrations, with macrocyclic musks ranging from not detected (n.d.) to 2.26 μg L(-1).

  18. Evaluation of fully automated assays for the detection of Rubella IgM and IgG antibodies by the Elecsys(®) immunoassay system.

    PubMed

    van Helden, Josef; Grangeot-Keros, Liliane; Vauloup-Fellous, Christelle; Vleminckx, Renaud; Masset, Frédéric; Revello, Maria-Grazia

    2014-04-01

    Screening for acute rubella infection in pregnancy is an important element of antenatal care. This study compared the sensitivity, specificity and reproducibility of two new, fully automated Elecsys(®) Rubella IgM and IgG immunoassays, designed for the Elecsys 2010, Modular Analytics E170, COBAS e-411, e-601 and e-602 analytical platforms, with current assays, using serum from patients with primary rubella infections, vaccinated patients and patients with potentially cross-reacting infections, as well as routine samples from clinical laboratories in France, Germany and Italy. Both assays showed good within-run and within-laboratory precision. A sensitivity of 79.8-96.0% was demonstrated for Elecsys IgM in primary, early acute infection, consistent with existing assays. In samples obtained from routine antenatal screening, the Elecsys Rubella IgM assay revealed high specificity (98.7-99.0%). A significantly (p<0.0001) lower reactivity was demonstrated in samples from previously infected patients in whom acute rubella infection was excluded, and the incidence of false positives in patients with potentially cross-reacting infections was lower with Elecsys Rubella IgM than with other assays. The Elecsys Rubella IgG assay exhibited a relative sensitivity of 99.9-100.0% and specificity of 97.4-100.0% in samples from routine antenatal screening. The Elecsys Rubella IgM and IgG assays allow convenient, rapid and reliable determination of anti-rubella antibodies. Sensitivity, specificity and reproducibility were comparable with existing assay systems. Assay results were available in approximately half the time required by currently employed methods, and the assays are compatible with widely used analytical platforms.

  19. Development and evaluation of a real-time PCR assay for detection of Pneumocystis jirovecii on the fully automated BD MAX platform.

    PubMed

    Dalpke, Alexander H; Hofko, Marjeta; Zimmermann, Stefan

    2013-07-01

    Pneumocystis jirovecii is an opportunistic pathogen in immunocompromised and AIDS patients. Detection by quantitative PCR is faster and more sensitive than microscopic diagnosis yet requires specific infrastructure. We adapted a real-time PCR amplifying the major surface glycoprotein (MSG) target from Pneumocystis jirovecii for use on the new BD MAX platform. The assay allowed fully automated DNA extraction and multiplex real-time PCR. The BD MAX assay was evaluated against manual DNA extraction and conventional real-time PCR. The BD MAX was used in the research mode running a multiplex PCR (MSG, internal control, and sample process control). The assay had a detection limit of 10 copies of an MSG-encoding plasmid per PCR that equated to 500 copies/ml in respiratory specimens. We observed accurate quantification of MSG targets over a 7- to 8-log range. Prealiquoting and sealing of the complete PCR reagents in conical tubes allowed easy and convenient handling of the BD MAX PCR. In a retrospective analysis of 54 positive samples, the BD MAX assay showed good quantitative correlation with the reference PCR method (R(2) = 0.82). Cross-contamination was not observed. Prospectively, 278 respiratory samples were analyzed by both molecular assays. The positivity rate overall was 18.3%. The BD MAX assay identified 46 positive samples, compared to 40 by the reference PCR. The BD MAX assay required liquefaction of highly viscous samples with dithiothreitol as the only manual step, thus offering advantages for timely availability of molecular-based detection assays.

  20. Predicting survival in heart failure case and control subjects by use of fully automated methods for deriving nonlinear and conventional indices of heart rate dynamics

    NASA Technical Reports Server (NTRS)

    Ho, K. K.; Moody, G. B.; Peng, C. K.; Mietus, J. E.; Larson, M. G.; Levy, D.; Goldberger, A. L.

    1997-01-01

    BACKGROUND: Despite much recent interest in quantification of heart rate variability (HRV), the prognostic value of conventional measures of HRV and of newer indices based on nonlinear dynamics is not universally accepted. METHODS AND RESULTS: We have designed algorithms for analyzing ambulatory ECG recordings and measuring HRV without human intervention, using robust methods for obtaining time-domain measures (mean and SD of heart rate), frequency-domain measures (power in the bands of 0.001 to 0.01 Hz [VLF], 0.01 to 0.15 Hz [LF], and 0.15 to 0.5 Hz [HF] and total spectral power [TP] over all three of these bands), and measures based on nonlinear dynamics (approximate entropy [ApEn], a measure of complexity, and detrended fluctuation analysis [DFA], a measure of long-term correlations). The study population consisted of chronic congestive heart failure (CHF) case patients and sex- and age-matched control subjects in the Framingham Heart Study. After exclusion of technically inadequate studies and those with atrial fibrillation, we used these algorithms to study HRV in 2-hour ambulatory ECG recordings of 69 participants (mean age, 71.7+/-8.1 years). By use of separate Cox proportional-hazards models, the conventional measures SD (P<.01), LF (P<.01), VLF (P<.05), and TP (P<.01) and the nonlinear measure DFA (P<.05) were predictors of survival over a mean follow-up period of 1.9 years; other measures, including ApEn (P>.3), were not. In multivariable models, DFA was of borderline predictive significance (P=.06) after adjustment for the diagnosis of CHF and SD. CONCLUSIONS: These results demonstrate that HRV analysis of ambulatory ECG recordings based on fully automated methods can have prognostic value in a population-based study and that nonlinear HRV indices may contribute prognostic value to complement traditional HRV measures.
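    Of the indices above, detrended fluctuation analysis is the least standard; a minimal first-order DFA sketch follows (the window sizes and non-overlapping segmentation are illustrative choices, not the study's exact configuration). The scaling exponent α is the log-log slope of the fluctuation function F(n) against window size n, with α ≈ 0.5 for white noise and α ≈ 1.5 for a random walk:

```python
import numpy as np

def dfa_alpha(x, scales=(4, 8, 16, 32, 64)):
    """Detrended fluctuation analysis: return the scaling exponent alpha,
    the log-log slope of fluctuation F(n) versus window size n."""
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())              # integrated (profile) series
    fluct = []
    for n in scales:
        m = len(y) // n
        f2 = []
        for i in range(m):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            coef = np.polyfit(t, seg, 1)     # local linear trend
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        fluct.append(np.sqrt(np.mean(f2)))   # RMS fluctuation at scale n
    slope, _ = np.polyfit(np.log(scales), np.log(fluct), 1)
    return float(slope)
```

    Applied to an interbeat-interval series, a reduced long-range correlation exponent is the kind of nonlinear index the study found to carry prognostic value beyond conventional HRV measures.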

  1. Chromosome Microarray.

    PubMed

    Anderson, Sharon

    2016-01-01

    Over the last half century, knowledge about genetics, genetic testing, and its complexity has flourished. Completion of the Human Genome Project provided a foundation upon which the accuracy of genetics, genomics, and integration of bioinformatics knowledge and testing has grown exponentially. What is lagging, however, are efforts to reach and engage nurses about this rapidly changing field. The purpose of this article is to familiarize nurses with several frequently ordered genetic tests, including chromosome analysis and fluorescence in situ hybridization, followed by a comprehensive review of chromosome microarray. It conveys the complexity of microarray testing, including how testing is performed and how results are analyzed. A case report demonstrates how this technology is applied in clinical practice and reveals benefits and limitations of this scientific and bioinformatics genetic technology. Clinical implications for maternal-child nurses across practice levels are discussed.

  2. A comparison of fully automated methods of data analysis and computer assisted heuristic methods in an electrode kinetic study of the pathologically variable [Fe(CN)6](3-/4-) process by AC voltammetry.

    PubMed

    Morris, Graham P; Simonov, Alexandr N; Mashkina, Elena A; Bordas, Rafel; Gillow, Kathryn; Baker, Ruth E; Gavaghan, David J; Bond, Alan M

    2013-12-17

    Fully automated and computer assisted heuristic data analysis approaches have been applied to a series of AC voltammetric experiments undertaken on the [Fe(CN)6](3-/4-) process at a glassy carbon electrode in 3 M KCl aqueous electrolyte. The recovered parameters in all forms of data analysis encompass E(0) (reversible potential), k(0) (heterogeneous charge transfer rate constant at E(0)), α (charge transfer coefficient), Ru (uncompensated resistance), and Cdl (double layer capacitance). The automated method of analysis employed time domain optimization and Bayesian statistics. This and all other methods assumed the Butler-Volmer model applies for electron transfer kinetics, planar diffusion for mass transport, Ohm's Law for Ru, and a potential-independent Cdl model. Heuristic approaches utilize combinations of Fourier Transform filtering, sensitivity analysis, and simplex-based forms of optimization applied to resolved AC harmonics and rely on experimenter experience to assist in experiment-theory comparisons. Remarkable consistency of parameter evaluation was achieved, although the fully automated time domain method provided consistently higher α values than those based on frequency domain data analysis. The origin of this difference is that the implemented fully automated method requires a perfect model for the double layer capacitance. In contrast, the importance of imperfections in the double layer model is minimized when analysis is performed in the frequency domain. Substantial variation in k(0) values was found by analysis of the 10 data sets for this highly surface-sensitive pathologically variable [Fe(CN)6](3-/4-) process, but remarkably, all fit the quasi-reversible model satisfactorily.
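
    All of the analysis approaches above assume Butler-Volmer electron-transfer kinetics. As a reference point, the forward and backward heterogeneous rate constants that the fitted parameters E(0), k(0) and α parameterize can be written as follows; this is a generic textbook sketch with illustrative parameter values, not the authors' code:

```python
import numpy as np

F_CONST, R_GAS, TEMP = 96485.0, 8.314, 298.15   # C/mol, J/(mol K), K
f = F_CONST / (R_GAS * TEMP)                    # ~38.9 V^-1 at 25 C

def bv_rate_constants(E, E0=0.0, k0=1e-2, alpha=0.5):
    """Butler-Volmer forward (reduction) and backward (oxidation)
    heterogeneous rate constants as a function of potential E.
    k0 is the standard rate constant at E = E0; alpha is the
    charge transfer coefficient."""
    kf = k0 * np.exp(-alpha * f * (E - E0))
    kb = k0 * np.exp((1.0 - alpha) * f * (E - E0))
    return kf, kb

# at the reversible potential the two rate constants coincide at k0
kf, kb = bv_rate_constants(0.0)
```

    Fitting k(0) and α against AC voltammetric harmonics amounts to adjusting this pair of exponentials (together with Ru and Cdl) until simulated and measured currents agree.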

  3. Microarray Technologies in Fungal Diagnostics.

    PubMed

    Rupp, Steffen

    2017-01-01

    Microarray technologies have been a major research tool in the last decades. In addition they have been introduced into several fields of diagnostics including diagnostics of infectious diseases. Microarrays are highly parallelized assay systems that initially were developed for multiparametric nucleic acid detection. From there on they rapidly developed towards a tool for the detection of all kinds of biological compounds (DNA, RNA, proteins, cells, nucleic acids, carbohydrates, etc.) or their modifications (methylation, phosphorylation, etc.). The combination of closed-tube systems and lab on chip devices with microarrays further enabled a higher automation degree with a reduced contamination risk. Microarray-based diagnostic applications currently complement and may in the future replace classical methods in clinical microbiology like blood cultures, resistance determination, microscopic and metabolic analyses as well as biochemical or immunohistochemical assays. In addition, novel diagnostic markers appear, like noncoding RNAs and miRNAs, providing additional room for novel nucleic acid based biomarkers. Here I focus on microarray technologies in diagnostics and as research tools, with emphasis on nucleic acid-based arrays.

  4. Low-complexity PDE-based approach for automatic microarray image processing.

    PubMed

    Belean, Bogdan; Terebes, Romulus; Bot, Adrian

    2015-02-01

    Microarray image processing is known as a valuable tool for gene expression estimation, a crucial step in understanding biological processes within living organisms. Automation and reliability are open subjects in microarray image processing, where grid alignment and spot segmentation are essential processes that can influence the quality of gene expression information. The paper proposes a novel partial differential equation (PDE)-based approach for fully automatic grid alignment in case of microarray images. Our approach can handle image distortions and performs grid alignment using the vertical and horizontal luminance function profiles. These profiles are evolved using a hyperbolic shock filter PDE and then refined using the autocorrelation function. The results are compared with the ones delivered by state-of-the-art approaches for grid alignment in terms of accuracy and computational complexity. Using the same PDE formalism and curve fitting, automatic spot segmentation is achieved and visual results are presented. Considering microarray images with different spots layouts, reliable results in terms of accuracy and reduced computational complexity are achieved, compared with existing software platforms and state-of-the-art methods for microarray image processing.
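
    As a simplified illustration of the profile-plus-autocorrelation idea (omitting the shock-filter PDE stage the paper applies first): project the image onto each axis and take the dominant autocorrelation peak of the luminance profile as the grid spacing. The function and the synthetic spot image are assumptions for demonstration:

```python
import numpy as np

def grid_period(profile, min_lag=2):
    """Estimate spot-grid spacing from a 1-D luminance profile as the
    lag of the strongest autocorrelation peak (assumes the true
    spacing exceeds min_lag)."""
    p = profile - profile.mean()
    ac = np.correlate(p, p, mode='full')[len(p) - 1:]   # lags 0..N-1
    half = len(p) // 2
    return min_lag + int(np.argmax(ac[min_lag:half]))

# synthetic microarray: bright spots every 16 pixels on each axis
img = np.zeros((128, 128))
img[8::16, 8::16] = 1.0
col_profile = img.sum(axis=0)   # vertical projection (luminance profile)
period = grid_period(col_profile)
```

    On real images the profiles are first regularized (here is where the hyperbolic shock filter would sharpen the row/column structure) before the periodicity is extracted.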

  5. ExpressYourself: a modular platform for processing and visualizing microarray data

    PubMed Central

    Luscombe, Nicholas M.; Royce, Thomas E.; Bertone, Paul; Echols, Nathaniel; Horak, Christine E.; Chang, Joseph T.; Snyder, Michael; Gerstein, Mark

    2003-01-01

    DNA microarrays are widely used in biological research; by analyzing differential hybridization on a single microarray slide, one can detect changes in mRNA expression levels, increases in DNA copy numbers and the location of transcription factor binding sites on a genomic scale. Having performed the experiments, the major challenge is to process large, noisy datasets in order to identify the specific array elements that are significantly differentially hybridized. This normally requires aggregating different, often incompatible programs into a multi-step pipeline. Here we present ExpressYourself, a fully integrated platform for processing microarray data. In completely automated fashion, it will correct the background array signal, normalize the Cy5 and Cy3 signals, score levels of differential hybridization, combine the results of replicate experiments, filter problematic regions of the array and assess the quality of individual and replicate experiments. ExpressYourself is designed with a highly modular architecture so various types of microarray analysis algorithms can readily be incorporated as they are developed; for example, the system currently implements several normalization methods, including those that simultaneously consider signal intensity and slide location. The processed data are presented using a web-based graphical interface to facilitate comparison with the original images of the array slides. In particular, ExpressYourself is able to regenerate images of the original microarray after applying various steps of processing, which greatly facilitates identification of position-specific artifacts. The program is freely available for use at http://bioinfo.mbb.yale.edu/expressyourself. PMID:12824348
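
    A minimal sketch of the kind of intensity-dependent normalization step such platforms implement: center the log-ratio M = log2(Cy5/Cy3) against the mean log-intensity A. A binned median stands in for the loess-style fit; the function name and the simulated dye bias are illustrative assumptions:

```python
import numpy as np

def intensity_normalize(cy5, cy3, n_bins=20):
    """Center the log-ratio M = log2(Cy5/Cy3) within bins of the mean
    log-intensity A; a crude stand-in for a loess fit of M on A."""
    M = np.log2(cy5) - np.log2(cy3)
    A = 0.5 * (np.log2(cy5) + np.log2(cy3))
    edges = np.linspace(A.min(), A.max(), n_bins + 1)
    idx = np.clip(np.digitize(A, edges) - 1, 0, n_bins - 1)
    M_norm = M.copy()
    for b in range(n_bins):
        mask = idx == b
        if mask.any():
            M_norm[mask] -= np.median(M[mask])   # remove local dye bias
    return M_norm, A

# simulate a two-channel array with a systematic +0.5 log2 dye bias
rng = np.random.default_rng(1)
cy3 = rng.lognormal(mean=8.0, sigma=1.0, size=500)
cy5 = cy3 * 2.0 ** rng.normal(0.5, 0.1, size=500)
M_norm, A = intensity_normalize(cy5, cy3)
```

    After normalization the median log-ratio sits near zero in every intensity bin, so a spot's residual M can be read as evidence of differential hybridization rather than dye or scanner bias.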

  6. A robotics platform for automated batch fabrication of high density, microfluidics-based DNA microarrays, with applications to single cell, multiplex assays of secreted proteins

    NASA Astrophysics Data System (ADS)

    Ahmad, Habib; Sutherland, Alex; Shin, Young Shik; Hwang, Kiwook; Qin, Lidong; Krom, Russell-John; Heath, James R.

    2011-09-01

    Microfluidics flow-patterning has been utilized for the construction of chip-scale miniaturized DNA and protein barcode arrays. Such arrays have been used for specific clinical and fundamental investigations in which many proteins are assayed from single cells or other small sample sizes. However, flow-patterned arrays are hand-prepared, and so are impractical for broad applications. We describe an integrated robotics/microfluidics platform for the automated preparation of such arrays, and we apply it to the batch fabrication of up to eighteen chips of flow-patterned DNA barcodes. The resulting substrates are comparable in quality with hand-made arrays and exhibit excellent substrate-to-substrate consistency. We demonstrate the utility and reproducibility of robotics-patterned barcodes by utilizing two flow-patterned chips for highly parallel assays of a panel of secreted proteins from single macrophage cells.

  7. Comparison of automated microarray detection with real-time PCR assays for detection of respiratory viruses in specimens obtained from children.

    PubMed

    Raymond, Frédéric; Carbonneau, Julie; Boucher, Nancy; Robitaille, Lynda; Boisvert, Sébastien; Wu, Whei-Kuo; De Serres, Gaston; Boivin, Guy; Corbeil, Jacques

    2009-03-01

    Respiratory virus infections are a major health concern and represent the primary cause of testing, consultation, and hospitalization for young children. We developed and compared two assays that allow the detection of up to 23 different respiratory viruses that frequently infect children. The first method consisted of single TaqMan quantitative real-time PCR assays in a 96-well-plate format. The second consisted of a multiplex PCR followed by primer extension and microarray hybridization in an integrated molecular diagnostic device, the Infiniti analyzer. Both of our assays can detect adenoviruses of groups A, B, C, and E; coronaviruses HKU1, 229E, NL63, and OC43; enteroviruses A, B, C, and D; rhinoviruses of genotypes A and B; influenza viruses A and B; human metapneumoviruses (HMPV) A and B; human respiratory syncytial viruses (HRSV) A and B; and parainfluenza viruses of types 1, 2, and 3. These tests were used to identify viruses in 221 nasopharyngeal aspirates obtained from children hospitalized for respiratory tract infections. Respiratory viruses were detected with at least one of the two methods in 81.4% of the 221 specimens: 10.0% were positive for HRSV A, 38.0% for HRSV B, 13.1% for influenzavirus A, 8.6% for any coronaviruses, 13.1% for rhinoviruses or enteroviruses, 7.2% for adenoviruses, 4.1% for HMPV, and 1.5% for parainfluenzaviruses. Multiple viral infections were found in 13.1% of the specimens. The two methods yielded concordant results for 94.1% of specimens. These tests allowed a thorough etiological assessment of respiratory viruses infecting children in hospital settings and would assist public health interventions.

  8. Washing of platelets can be fully automated using a closed-system cell processor and BRS-A platelet additive solution.

    PubMed

    Oikawa, S; Minegishi, M; Endo, K; Kawashima, W; Suzuki, K; Shimizu, H

    2016-11-01

    This study evaluated the in vitro properties of platelets (PLTs) washed with BRS-A additive solution in the Haemonetics ACP215 automated processing system. Two washing modes, 'manually/automatically adding ACD-A to BRS before/during the washing process', represented the control and test groups, respectively. Outcomes were compared over 7 days of storage (n = 7, for both). PLT recovery following washing processing (26-27 min) was 86.2 ± 1.7% and 86.0 ± 2.2% and plasma protein removal was 98.8 ± 0.3% and 99.0 ± 0.2% in the control and test groups, respectively (not significant). Both groups exhibited comparable in vitro properties.

  9. The Zymark BenchMate™. A compact, fully-automated solution-phase reaction work-up facility for multiple parallel synthesis

    PubMed Central

    Hamlin, Gordon A.

    2000-01-01

    The rapid growth of multiple parallel synthesis in our laboratories has created a demand for a robust, easily accessed automated system for solution-phase reaction work-up, since the manual work-up of large numbers of small-scale reactions is both time-consuming and tedious, and is a rate-limiting step in the generation of large numbers of compounds for testing. Work-up in chemical organic synthesis consists of a series of post-reaction operations that use differential chemical properties to remove excess reagent or starting material, reagent products and, where possible, reaction by-products. Careful consideration of post-reaction operations as a clean-up step can obviate the requirement for purification. Generally, work-up can be resolved into four operations: filtration, solvent addition (dilution, trituration), washing, and separation (partition); it is the selection and ordering of these four basic operations that constitutes a chemical work-up. Following the proven success of centralized Zymate robotic systems in the compilation, execution and work-up of complex reaction sequences, a centralized chemical work-up service has been in operation for over 12 months. It now seemed prudent that the needs of multiple parallel synthesis would be better served by the development of a compact, automated system capable of operating in a standard chemistry laboratory fume-hood. A Zymark BenchMate platform has been configured to perform the four basic operations of chemical solution work-up. A custom-built filtration station, incorporating an integrated tipping facility for the sample tube, has also been developed. Compilation of each work-up is through a set of Visual Basic procedure screens, each dedicated to a particular work-up scenario. Methods are compiled at the chemist's own PC and transferred to the BenchMate via a diskette. PMID:18924692

  10. SaDA: From Sampling to Data Analysis—An Extensible Open Source Infrastructure for Rapid, Robust and Automated Management and Analysis of Modern Ecological High-Throughput Microarray Data

    PubMed Central

    Singh, Kumar Saurabh; Thual, Dominique; Spurio, Roberto; Cannata, Nicola

    2015-01-01

    One of the most crucial characteristics of day-to-day laboratory information management is the collection, storage and retrieval of information about research subjects and environmental or biomedical samples. An efficient link between sample data and experimental results is critically important for the successful outcome of a collaborative project. Currently available software solutions are largely limited to large-scale, expensive commercial Laboratory Information Management Systems (LIMS). Acquiring such a LIMS can indeed bring laboratory information management to a higher level, but most of the time this requires a substantial investment of money, time and technical effort. There is a clear need for a lightweight open-source system that can easily be managed on local servers and handled by individual researchers. Here we present a software named SaDA for storing, retrieving and analyzing data originating from microorganism monitoring experiments. SaDA is fully integrated in the management of environmental samples, oligonucleotide sequences, microarray data and the subsequent downstream analysis procedures. It is simple and generic software, and can be extended and customized for various environmental and biomedical studies. PMID:26047146

  11. SaDA: From Sampling to Data Analysis-An Extensible Open Source Infrastructure for Rapid, Robust and Automated Management and Analysis of Modern Ecological High-Throughput Microarray Data.

    PubMed

    Singh, Kumar Saurabh; Thual, Dominique; Spurio, Roberto; Cannata, Nicola

    2015-06-03

    One of the most crucial characteristics of day-to-day laboratory information management is the collection, storage and retrieval of information about research subjects and environmental or biomedical samples. An efficient link between sample data and experimental results is critically important for the successful outcome of a collaborative project. Currently available software solutions are largely limited to large-scale, expensive commercial Laboratory Information Management Systems (LIMS). Acquiring such a LIMS can indeed bring laboratory information management to a higher level, but most of the time this requires a substantial investment of money, time and technical effort. There is a clear need for a lightweight open-source system that can easily be managed on local servers and handled by individual researchers. Here we present a software named SaDA for storing, retrieving and analyzing data originating from microorganism monitoring experiments. SaDA is fully integrated in the management of environmental samples, oligonucleotide sequences, microarray data and the subsequent downstream analysis procedures. It is simple and generic software, and can be extended and customized for various environmental and biomedical studies.

  12. Fully-automated approach to hippocampus segmentation using a graph-cuts algorithm combined with atlas-based segmentation and morphological opening.

    PubMed

    Kwak, Kichang; Yoon, Uicheul; Lee, Dong-Kyun; Kim, Geon Ha; Seo, Sang Won; Na, Duk L; Shim, Hack-Joon; Lee, Jong-Min

    2013-09-01

    The hippocampus has been known to be an important structure as a biomarker for Alzheimer's disease (AD) and other neurological and psychiatric diseases. However, its use requires accurate, robust and reproducible delineation of hippocampal structures. In this study, an automated hippocampal segmentation method based on a graph-cuts algorithm combined with atlas-based segmentation and morphological opening was proposed. First of all, the atlas-based segmentation was applied to define the initial hippocampal region as a priori information for graph-cuts. The definition of initial seeds was further elaborated by incorporating estimation of partial volume probabilities at each voxel. Finally, morphological opening was applied to reduce false positives in the result processed by graph-cuts. In experiments with twenty-seven healthy normal subjects, the proposed method showed more reliable results (similarity index=0.81±0.03) than the conventional atlas-based segmentation method (0.72±0.04). Regarding segmentation accuracy, measured through precision and recall (which reflect the false-positive and false-negative ratios), the proposed method (precision=0.76±0.04, recall=0.86±0.05) also outperformed the conventional method (0.73±0.05, 0.72±0.06), demonstrating its suitability for accurate, robust and reliable segmentation of the hippocampus.
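
    The similarity index, precision and recall reported above are all derived from voxel overlap counts between the automated segmentation and the reference labeling. A minimal sketch (the toy 2-D masks are illustrative, not hippocampal data):

```python
import numpy as np

def overlap_metrics(seg, ref):
    """Dice similarity index, precision and recall between a binary
    segmentation and a reference labeling."""
    seg, ref = seg.astype(bool), ref.astype(bool)
    tp = np.logical_and(seg, ref).sum()          # true-positive voxels
    dice = 2.0 * tp / (seg.sum() + ref.sum())
    return dice, tp / seg.sum(), tp / ref.sum()

# toy example: two 5x5 squares overlapping in a 4x4 region
seg = np.zeros((10, 10), dtype=int); seg[2:7, 2:7] = 1
ref = np.zeros((10, 10), dtype=int); ref[3:8, 3:8] = 1
dice, precision, recall = overlap_metrics(seg, ref)
```

    Precision penalizes false positives and recall penalizes false negatives, which is why the pair gives a fuller picture of segmentation quality than the Dice index alone.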

  13. Therapeutic drug monitoring of haloperidol, perphenazine, and zuclopenthixol in serum by a fully automated sequential solid phase extraction followed by high-performance liquid chromatography.

    PubMed

    Angelo, H R; Petersen, A

    2001-04-01

    In Denmark, haloperidol, perphenazine, and zuclopenthixol are among the most frequently requested antipsychotics for therapeutic drug monitoring. With the number of requests made at the authors' laboratory, the only rational analysis is one that can measure all three drugs simultaneously. The authors therefore decided to develop an automated high-performance liquid chromatography (HPLC) method. Two milliliters serum, 2.0 mL 10 mmol/L sodium phosphate buffer (pH 5.5), and 150 microL internal standard (trifluoperazine) solution were pipetted into HPLC vials and extracted on an ASPEC XL equipped with 1 mL (50 mg) Isolute C2 (EC) extraction columns and acetonitrile-methanol-ammonium acetate buffer (60:34:6) as extracting solution. Three hundred fifty microliters was analyzed by HPLC; a 150 x 4.6-mm S5CN Spherisorb column with a mobile phase of 10 mmol/L ammonium acetate buffer-methanol (1:9), a flow rate of 0.6-1.7 mL/min, and ultraviolet detection at 256 and 245 nm were used. Reproducibility was 5-12% and the lower limit of quantitation was 10, 1, and 5 nmol/L (4, 0.4, and 2 ng/mL) for haloperidol, perphenazine, and zuclopenthixol, respectively. The method was found to be sufficiently selective and robust for routine analysis.

  14. A fully automated effervescence assisted dispersive liquid-liquid microextraction based on a stepwise injection system. Determination of antipyrine in saliva samples.

    PubMed

    Medinskaia, Kseniia; Vakh, Christina; Aseeva, Darina; Andruch, Vasil; Moskvin, Leonid; Bulatov, Andrey

    2016-01-01

    A first attempt to automate effervescence-assisted dispersive liquid-liquid microextraction (EA-DLLME) is reported. The method is based on the aspiration of a sample and all required aqueous reagents into the stepwise injection analysis (SWIA) manifold, followed by simultaneous counterflow injection of the extraction solvent (dichloromethane) and the mixture of the effervescence agent (0.5 mol L⁻¹ Na₂CO₃) and the proton donor solution (1 mol L⁻¹ CH₃COOH). Formation of carbon dioxide microbubbles generated in situ leads to the dispersion of the extraction solvent in the whole aqueous sample and extraction of the analyte into the organic phase. Unlike conventional DLLME, EA-DLLME avoids both the addition of a dispersive solvent and the time-consuming centrifugation step needed to disrupt the cloudy state. Phase separation was achieved by gentle bubbling of a nitrogen stream (2 mL min⁻¹ for 2 min). The performance of the suggested approach is demonstrated by the determination of antipyrine in saliva samples. The procedure is based on the derivatization of antipyrine by nitrite ion followed by EA-DLLME of 4-nitrosoantipyrine and subsequent UV-Vis detection using the SWIA manifold. The absorbance of the yellow-colored extract at a wavelength of 345 nm obeys Beer's law in the range of 1.5-100 µmol L⁻¹ of antipyrine in saliva. The LOD, calculated from a blank test based on 3σ, was 0.5 µmol L⁻¹.
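
    The calibration logic behind the reported figures of merit: fit a Beer's-law line to a standard series and take LOD = 3σ(blank)/slope. The numbers below are invented for illustration and are not the authors' data:

```python
import numpy as np

# hypothetical calibration: absorbance at 345 nm versus antipyrine
# concentration (umol/L); Beer's law predicts a straight line
conc = np.array([0.0, 5.0, 10.0, 25.0, 50.0, 100.0])
absorbance = np.array([0.002, 0.051, 0.103, 0.249, 0.502, 1.001])

# least-squares slope and intercept of the calibration line
slope, intercept = np.polyfit(conc, absorbance, 1)

# limit of detection from the 3-sigma criterion on replicate blanks
sigma_blank = 0.0017      # assumed SD of blank absorbance readings
lod = 3.0 * sigma_blank / slope
```

    With these assumed values the 3σ criterion lands near 0.5 µmol L⁻¹, the same order as the LOD quoted in the abstract; the point of the sketch is only the arithmetic, not the data.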

  15. PMD: A Resource for Archiving and Analyzing Protein Microarray data

    PubMed Central

    Xu, Zhaowei; Huang, Likun; Zhang, Hainan; Li, Yang; Guo, Shujuan; Wang, Nan; Wang, Shi-hua; Chen, Ziqing; Wang, Jingfang; Tao, Sheng-ce

    2016-01-01

    Protein microarray is a powerful technology for both basic research and clinical study. However, because there is no database specifically tailored for protein microarray, the majority of the valuable original protein microarray data is still not publicly accessible. To address this issue, we constructed Protein Microarray Database (PMD), which is specifically designed for archiving and analyzing protein microarray data. In PMD, users can easily browse and search the entire database by experimental name, protein microarray type, and sample information. Additionally, PMD integrates several data analysis tools and provides an automated data analysis pipeline for users. With just one click, users can obtain a comprehensive analysis report for their protein microarray data. The report includes preliminary data analysis, such as data normalization, candidate identification, and an in-depth bioinformatics analysis of the candidates, which include functional annotation, pathway analysis, and protein-protein interaction network analysis. PMD is now freely available at www.proteinmicroarray.cn. PMID:26813635

  16. PMD: A Resource for Archiving and Analyzing Protein Microarray data.

    PubMed

    Xu, Zhaowei; Huang, Likun; Zhang, Hainan; Li, Yang; Guo, Shujuan; Wang, Nan; Wang, Shi-Hua; Chen, Ziqing; Wang, Jingfang; Tao, Sheng-Ce

    2016-01-27

    Protein microarray is a powerful technology for both basic research and clinical study. However, because there is no database specifically tailored for protein microarray, the majority of the valuable original protein microarray data is still not publicly accessible. To address this issue, we constructed Protein Microarray Database (PMD), which is specifically designed for archiving and analyzing protein microarray data. In PMD, users can easily browse and search the entire database by experimental name, protein microarray type, and sample information. Additionally, PMD integrates several data analysis tools and provides an automated data analysis pipeline for users. With just one click, users can obtain a comprehensive analysis report for their protein microarray data. The report includes preliminary data analysis, such as data normalization, candidate identification, and an in-depth bioinformatics analysis of the candidates, which include functional annotation, pathway analysis, and protein-protein interaction network analysis. PMD is now freely available at www.proteinmicroarray.cn.

  17. Detection of 11 common viral and bacterial pathogens causing community-acquired pneumonia or sepsis in asymptomatic patients by using a multiplex reverse transcription-PCR assay with manual (enzyme hybridization) or automated (electronic microarray) detection.

    PubMed

    Kumar, Swati; Wang, Lihua; Fan, Jiang; Kraft, Andrea; Bose, Michael E; Tiwari, Sagarika; Van Dyke, Meredith; Haigis, Robert; Luo, Tingquo; Ghosh, Madhushree; Tang, Huong; Haghnia, Marjan; Mather, Elizabeth L; Weisburg, William G; Henrickson, Kelly J

    2008-09-01

    Community-acquired pneumonia (CAP) and sepsis are important causes of morbidity and mortality. We describe the development of two molecular assays for the detection of 11 common viral and bacterial agents of CAP and sepsis: influenza virus A, influenza virus B, respiratory syncytial virus A (RSV A), RSV B, Mycoplasma pneumoniae, Chlamydophila pneumoniae, Legionella pneumophila, Legionella micdadei, Bordetella pertussis, Staphylococcus aureus, and Streptococcus pneumoniae. Further, we report the prevalence of carriage of these pathogens in respiratory, skin, and serum specimens from 243 asymptomatic children and adults. The detection of pathogens was done using both a manual enzyme hybridization assay and an automated electronic microarray following reverse transcription and PCR amplification. The analytical sensitivities ranged between 0.01 and 100 50% tissue culture infective doses, cells, or CFU per ml for both detection methods. Analytical specificity testing demonstrated no significant cross-reactivity among 19 other common respiratory organisms. One hundred spiked "surrogate" clinical specimens were all correctly identified with 100% specificity (95% confidence interval, 100%). Overall, 28 (21.7%) of 129 nasopharyngeal specimens, 11 of 100 skin specimens, and 2 of 100 serum specimens from asymptomatic subjects tested positive for one or more pathogens, with S. pneumoniae and S. aureus giving 89% of the positive results. Our data suggest that asymptomatic carriage makes the use of molecular assays problematic for the detection of S. pneumoniae or S. aureus in upper respiratory tract secretions; however, the specimens tested showed virtually no carriage of the other nine viral and bacterial pathogens, and the detection of these pathogens should not be a significant diagnostic problem. In addition, slightly less sensitive molecular assays may have better correlation with clinical disease in the case of CAP.

  18. An automated and fully validated LC-MS/MS procedure for the simultaneous determination of 11 opioids used in palliative care, with 5 of their metabolites.

    PubMed

    Musshoff, F; Trafkowski, J; Kuepper, U; Madea, B

    2006-05-01

    A fully validated liquid chromatographic procedure coupled with electrospray ionization-tandem mass spectrometry (LC-ESI-MS/MS) is presented for quantitative determination of the opioids buprenorphine, codeine, fentanyl, hydromorphone, methadone, morphine, oxycodone, oxymorphone, piritramide, tilidine, and tramadol together with their metabolites bisnortilidine, morphine-glucuronides, norfentanyl, and nortilidine in blood plasma after an automatically performed solid-phase extraction (SPE). Separation was achieved in 35 min on a Phenomenex C12 MAX-RP column (4 microm, 150 x 2 mm) using a gradient of ammonium formate buffer (pH 3.5) and acetonitrile. The validation data were within the required limits. The assay was successfully applied to authentic plasma samples, allowing confirmation of the diagnosis of overdose situations as well as monitoring of patients' compliance, especially in patients under palliative care.

  19. Protein Microarrays: Novel Developments and Applications

    PubMed Central

    Berrade, Luis; Garcia, Angie E.

    2011-01-01

    Protein microarray technology possesses some of the greatest potential for providing direct information on protein function and potential drug targets. For example, functional protein microarrays are ideal tools suited for the mapping of biological pathways. They can be used to study most major types of interactions and enzymatic activities that take place in biochemical pathways and have been used for the analysis of simultaneous multiple biomolecular interactions involving protein-protein, protein-lipid, protein-DNA and protein-small molecule interactions. Because of this unique ability to analyze many kinds of molecular interactions en masse, the requirement of very small sample amount and the potential to be miniaturized and automated, protein microarrays are extremely well suited for protein profiling, drug discovery, drug target identification and clinical prognosis and diagnosis. The aim of this review is to summarize the most recent developments in the production, applications and analysis of protein microarrays. PMID:21116694

  20. Instrumentation of LOTIS--Livermore Optical Transient Imaging System: a fully automated wide-field-of-view telescope system searching for simultaneous optical counterparts of gamma-ray bursts

    NASA Astrophysics Data System (ADS)

    Park, Hye-Sook; Ables, Elden; Barthelmy, Scott D.; Bionta, Richard M.; Ott, Linda L.; Parker, Eric L.; Williams, George G.

    1998-07-01

    LOTIS is a rapidly slewing wide-field-of-view telescope which was designed and constructed to search for simultaneous gamma-ray burst (GRB) optical counterparts. This experiment requires a rapidly slewing (less than 10 sec), wide-field-of-view (greater than 15 degrees), automatic and dedicated telescope. LOTIS utilizes commercial telephoto lenses and custom 2048 X 2048 CCD cameras to view a 17.6 X 17.6 degree field of view. It can point to any part of the sky within 5 sec and is fully automated. It is connected via an Internet socket to the GRB coordinate distribution network, which analyzes telemetry from the satellite and delivers GRB coordinate information in real time. LOTIS started routine operation in Oct. 1996. In the idle time between GRB triggers, LOTIS systematically surveys the entire available sky every night for new optical transients. This paper describes the system design and performance.

  1. Preliminary evaluation of a fully automated quantitative framework for characterizing general breast tissue histology via color histogram and color texture analysis

    NASA Astrophysics Data System (ADS)

    Keller, Brad M.; Gastounioti, Aimilia; Batiste, Rebecca C.; Kontos, Despina; Feldman, Michael D.

    2016-03-01

    Visual characterization of histologic specimens is known to suffer from intra- and inter-observer variability. To help address this, we developed an automated framework for characterizing digitized histology specimens based on a novel application of color histogram and color texture analysis. We perform a preliminary evaluation of this framework using a set of 73 trichrome-stained, digitized slides of normal breast tissue which were visually assessed by an expert pathologist in terms of the percentage of collagenous stroma, stromal collagen density, duct-lobular unit density and the presence of elastosis. For each slide, our algorithm automatically segments the tissue region based on the lightness channel in CIELAB colorspace. Within each tissue region, a color histogram feature vector is extracted using a common color palette for trichrome images generated with a previously described method. Then, using a whole-slide, lattice-based methodology, color texture maps are generated using a set of color co-occurrence matrix statistics: contrast, correlation, energy and homogeneity. The extracted feature sets are compared to the visually assessed tissue characteristics. Overall, the extracted texture features show high correlations with both the percentage of collagenous stroma (r=0.95, p<0.001) and duct-lobular unit density (r=0.71, p<0.001) seen in the tissue samples, and several individual features were associated with either collagen density and/or the presence of elastosis (p<=0.05). This suggests that the proposed framework has promise as a means to quantitatively extract descriptors reflecting tissue-level characteristics and thus could be useful in detecting and characterizing histological processes in digitized histology specimens.
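
    The co-occurrence statistics named above generalize the classic gray-level co-occurrence matrix (GLCM). A minimal single-channel sketch (correlation omitted for brevity; the quantization depth and pixel offset are arbitrary choices, not the authors' settings):

```python
import numpy as np

def glcm_features(img, levels=8, dx=1, dy=0):
    """Normalized gray-level co-occurrence matrix for one pixel
    offset, plus the contrast, energy and homogeneity statistics."""
    q = np.minimum((img * levels).astype(int), levels - 1)  # quantize [0,1]
    glcm = np.zeros((levels, levels))
    h, w = q.shape
    for y in range(h - dy):
        for x in range(w - dx):
            glcm[q[y, x], q[y + dy, x + dx]] += 1   # count level pairs
    glcm /= glcm.sum()
    i, j = np.indices((levels, levels))
    contrast = float(np.sum(glcm * (i - j) ** 2))
    energy = float(np.sum(glcm ** 2))
    homogeneity = float(np.sum(glcm / (1.0 + np.abs(i - j))))
    return contrast, energy, homogeneity

# a perfectly uniform patch has zero contrast and maximal energy
flat = np.full((32, 32), 0.5)
contrast, energy, homogeneity = glcm_features(flat)
```

    Computing these statistics over a lattice of small windows, one color channel pair at a time, yields exactly the kind of whole-slide texture map the framework compares against the pathologist's assessments.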

  2. A fully automated procedure for the high-throughput detection of avian influenza virus by real-time reverse transcription-polymerase chain reaction.

    PubMed

    Agüero, Montserrat; San Miguel, Elena; Sánchez, Azucena; Gómez-Tejedor, Concepción; Jiménez-Clavero, Miguel Angel

    2007-03-01

    The recent spread of highly pathogenic H5N1 avian influenza (AI) has made it important to develop highly sensitive diagnostic systems for the rapid detection of AI genome and the differentiation of H5N1 variants in a high number of samples. In the present paper, we describe a high-throughput procedure that combines automated extraction, amplification, and detection of AI RNA, by an already described TaqMan real-time reverse transcription-polymerase chain reaction (RRT-PCR) assay targeted at the matrix (M) protein gene of AI virus (AIV). The method was tested in cloacal and tracheal swabs, the most common type of samples used in AI surveillance, as well as in tissue and fecal samples. A robotic system (QIAGEN Biosprint 96) extracted RNA and set up reactions for RRT-PCR in a 96-well format. The recovery of the extracted RNA was as efficient as that of a manual RNA extraction kit, and the sensitivity of the detection system was as high as with previously described nonautomated methods. A system with a basic configuration (one extraction robot plus two real-time 96-well thermocyclers) operated by two persons could process about 360 samples in 5 hr. Further characterization of AI RNA-positive samples with a TaqMan RRT-PCR specific for H5 (also described here) and/or N1 was possible within 2 hr more. As this work shows, the system can analyze up to 1400 samples per working day by using two nucleic acid extraction robots and a 384-well-format thermocycler.

  3. Association between 25-Hydroxy Vitamin D and volumetric breast density via a fully automated software Volpara™ in the reproductive age group

    PubMed Central

    Wasim, Bushra; Khan, Khalid; Samad, Mohd Abdul

    2016-01-01

    Objective: To determine the association between serum 25-hydroxyvitamin D levels and percent breast density among asymptomatic premenopausal women. Methods: One hundred asymptomatic, pre-menopausal women who visited the General Surgery Breast Clinic, Patel Hospital, Karachi, Pakistan between 3rd March and 10th November, 2015 were included in this study. Serum 25(OH)D and calcium levels were measured and mammographic density (MD) was assessed using automated volumetric breast density software, Volpara Research (algorithm version 1.5.1, Volpara Solutions Ltd, Wellington, NZ) on the same day. The volumetric breast density (VBD) was categorized as: VG1: 0%-4.5%; VG2: 4.6%-7.5%; VG3: 7.6%-15.5%; and VG4: >15.5%. Mean serum 25(OH)D and calcium levels were compared across the four volumetric breast density categories. The percent volumetric density was also correlated with anthropometric measurements and other related variables. Results: No significant difference was found in mean serum 25(OH)D level across the four groups (15.87 vs. 12.40 vs. 8.99 vs. 9.68; p-value = 0.106). The percent VBD was significantly negatively correlated with age (r = -0.365; p-value = 0.001), weight (r = -0.575; p-value = 0.001), height (r = -0.197; p-value = 0.049), and BMI (r = -0.519; p-value = 0.001). Serum vitamin D and calcium levels were not significantly correlated with percent VBD (p-value > 0.05). Conclusion: No significant association exists between serum 25(OH)D level and breast density. PMID:27882030
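    The reported r and p values are Pearson correlations of percent VBD against each covariate. A minimal sketch of that computation with `scipy.stats.pearsonr`; the paired measurements below are hypothetical, not the study's data:

    ```python
    from scipy import stats

    # Hypothetical subject measurements: age (years) vs. percent
    # volumetric breast density, to illustrate the negative trend.
    age = [25, 30, 34, 38, 42, 45]
    vbd = [16.0, 14.5, 12.0, 11.0, 9.5, 8.0]

    r, p = stats.pearsonr(age, vbd)  # r < 0: density declines with age
    ```

    The same call, applied per covariate (weight, height, BMI), yields the correlation table summarized in the Results.
    
    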

  4. Fully automated determination of nicotine and its major metabolites in whole blood by means of a DBS online-SPE LC-HR-MS/MS approach for sports drug testing.

    PubMed

    Tretzel, Laura; Thomas, Andreas; Piper, Thomas; Hedeland, Mikael; Geyer, Hans; Schänzer, Wilhelm; Thevis, Mario

    2016-05-10

    Dried blood spots (DBS) represent a sample matrix collected under minimally invasive, straightforward and robust conditions. DBS specimens have been shown to provide appropriate test material for different analytical disciplines, e.g., preclinical drug development, therapeutic drug monitoring, forensic toxicology and diagnostic analysis of metabolic disorders in newborns. However, the sample preparation has occasionally been reported as laborious and time consuming. In order to minimize the manual workload and to substantiate the suitability of DBS for high sample-throughput, the automation of sample preparation processes is of paramount interest. In the current study, the development and validation of a fully automated DBS extraction method coupled to online solid-phase extraction using the example of nicotine, its major metabolites nornicotine, cotinine and trans-3'-hydroxycotinine and the tobacco alkaloids anabasine and anatabine is presented, based on the rationale that the use of nicotine-containing products for performance-enhancing purposes has been monitored by the World Anti-Doping Agency (WADA) for several years. Automation-derived DBS sample extracts were directed online to liquid chromatography high resolution/high mass accuracy tandem mass spectrometry, and target analytes were determined with support of four deuterated internal standards. Validation of the method yielded precise (CV <7.5% for intraday and <12.3% for interday measurements) and linear (r(2)>0.998) results. The limit of detection was established at 5 ng mL(-1) for all studied compounds, the extraction recovery ranged from 25 to 44%, and no matrix effects were observed. To exemplify the applicability of the DBS online-SPE LC-MS/MS approach for sports drug testing purposes, the method was applied to authentic DBS samples obtained from smokers, snus users, and e-cigarette users. Statistical evaluation of the obtained results indicated differences in metabolic behavior depending on the route

  5. The Genopolis Microarray Database

    PubMed Central

    Splendiani, Andrea; Brandizi, Marco; Even, Gael; Beretta, Ottavio; Pavelka, Norman; Pelizzola, Mattia; Mayhaus, Manuel; Foti, Maria; Mauri, Giancarlo; Ricciardi-Castagnoli, Paola

    2007-01-01

    Background Gene expression databases are key resources for microarray data management and analysis and the importance of a proper annotation of their content is well understood. Public repositories as well as microarray database systems that can be implemented by single laboratories exist. However, there is not yet a tool that can easily support a collaborative environment where different users with different rights of access to data can interact to define a common highly coherent content. The scope of the Genopolis database is to provide a resource that allows different groups performing microarray experiments related to a common subject to create a common coherent knowledge base and to analyse it. The Genopolis database has been implemented as a dedicated system for the scientific community studying dendritic and macrophage cells functions and host-parasite interactions. Results The Genopolis Database system allows the community to build an object based MIAME compliant annotation of their experiments and to store images, raw and processed data from the Affymetrix GeneChip® platform. It supports dynamical definition of controlled vocabularies and provides automated and supervised steps to control the coherence of data and annotations. It allows a precise control of the visibility of the database content to different sub groups in the community and facilitates exports of its content to public repositories. It provides an interactive user interface for data analysis: this allows users to visualize data matrices based on functional lists and sample characterization, and to navigate to other data matrices defined by similarity of expression values as well as functional characterizations of genes involved. A collaborative environment is also provided for the definition and sharing of functional annotation by users. Conclusion The Genopolis Database supports a community in building a common coherent knowledge base and analysing it. This fills a gap between a local

  6. Picogram per liter level determination of estrogens in natural waters and waterworks by a fully automated on-line solid-phase extraction-liquid chromatography-electrospray tandem mass spectrometry method.

    PubMed

    Rodriguez-Mozaz, Sara; Lopez de Alda, Maria J; Barceló, Damià

    2004-12-01

    The present work describes a novel, fully automated method, based on on-line solid-phase extraction-liquid chromatography-electrospray tandem mass spectrometry (SPE-LC-ESI-MS-MS), which allows the unequivocal identification and quantification of the most environmentally relevant estrogens (estradiol, estrone, estriol, estradiol-17-glucuronide, estradiol-17-acetate, estrone-3-sulfate, ethynyl estradiol, diethylstilbestrol) in natural and treated waters at levels well below those of concern (limits of quantification between 0.02 and 1.02 ng/L). The method is highly precise, with relative standard deviations varying between 1.43 and 3.89%, and accurate (recovery percentages >74%). This method was used to track the presence and fate of the target compounds in a waterworks and to evaluate the removal efficiency of the treatment processes applied. Only estrone and estrone-3-sulfate were detected in the river water used as source (at 0.68 and 0.33 ng/L, respectively). After progressive removal through the various treatment steps, none of them were detected in the finished drinking water. In addition to selectivity, sensitivity, repeatability, and automation (up to 15 samples plus 6 calibration solutions and 1 blank can be analyzed unattended), this technique offers fairly high throughput (analysis time per sample is 60 min), low time and solvent consumption, and ease of use.

  7. A fully automated method for simultaneous determination of aflatoxins and ochratoxin A in dried fruits by pressurized liquid extraction and online solid-phase extraction cleanup coupled to ultra-high-pressure liquid chromatography-tandem mass spectrometry.

    PubMed

    Campone, Luca; Piccinelli, Anna Lisa; Celano, Rita; Russo, Mariateresa; Valdés, Alberto; Ibáñez, Clara; Rastrelli, Luca

    2015-04-01

    According to current demands and future perspectives in food safety, this study reports a fast and fully automated analytical method for the simultaneous analysis of the highly toxic and widespread mycotoxins aflatoxins (AFs) and ochratoxin A (OTA) in dried fruits, a high-risk foodstuff. The method is based on pressurized liquid extraction (PLE), with aqueous methanol (30%) at 110 °C, of the slurried dried fruit and online solid-phase extraction (online SPE) cleanup of the PLE extracts with a C18 cartridge. The purified sample was directly analysed by ultra-high-pressure liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) for sensitive and selective determination of AFs and OTA. The proposed analytical procedure was validated for different dried fruits (vine fruit, fig and apricot), providing method detection and quantification limits much lower than the AFs and OTA maximum levels imposed by EU regulation in dried fruit for direct human consumption. Also, recoveries (83-103%) and repeatability (RSD < 8, n = 3) meet the performance criteria required by EU regulation for the determination of the levels of mycotoxins in foodstuffs. The main advantage of the proposed method is full automation of the whole analytical procedure, which reduces the time and cost of the analysis, sample manipulation and solvent consumption, enabling high-throughput analysis and highly accurate and precise results.

  8. Automation or De-automation

    NASA Astrophysics Data System (ADS)

    Gorlach, Igor; Wessel, Oliver

    2008-09-01

    In the global automotive industry, for decades, vehicle manufacturers have continually increased the level of automation of production systems in order to be competitive. However, there is a new trend to decrease the level of automation, especially in final car assembly, for reasons of economy and flexibility. In this research, the final car assembly lines at three production sites of Volkswagen are analysed in order to determine the best level of automation for each, in terms of manufacturing costs, productivity, quality and flexibility. The case study is based on the methodology proposed by the Fraunhofer Institute. The results of the analysis indicate that fully automated assembly systems are not necessarily the best option in terms of cost, productivity and quality combined, which is attributed to the high complexity of final car assembly systems; some de-automation is therefore recommended. On the other hand, the analysis shows that low automation can result in poor product quality due to reasons related to plant location, such as inadequate workers' skills, motivation, etc. Hence, the automation strategy should be formulated on the basis of analysis of all relevant aspects of the manufacturing process, such as costs, quality, productivity and flexibility in relation to the local context. A more balanced combination of automated and manual assembly operations provides better utilisation of equipment, reduces production costs and improves throughput.

  9. Diode laser bonding of planar microfluidic devices, MOEMS, bioMEMS, diagnostic chips, and microarrays

    NASA Astrophysics Data System (ADS)

    Chen, Jie-Wei; Zybko, Jerry M.

    2005-01-01

    The assembly of plastic microfluidic devices, MOEMS and microarrays, which require high positioning and welding accuracy in the micrometer range, has been successfully achieved using a new technology based on laser transmission welding combined with a photolithographic mask technique. This paper reviews a laser assembly platform for the joining of microfluidic plastic parts with its main related process characteristics and its potential for low-cost and high volume manufacturing. The system consists of a diode laser with a mask and an automated alignment function to generate micro welding seams with freely definable geometries. A fully automated mask alignment system with a resolution of < 2 µm and a precise, noncontact energy input allows fast welding of micro-structured plastic parts with high reproducibility and excellent welding quality.

  10. Fully automated trace level determination of parent and alkylated PAHs in environmental waters by online SPE-LC-APPI-MS/MS.

    PubMed

    Ramirez, Cesar E; Wang, Chengtao; Gardinali, Piero R

    2014-01-01

    Polycyclic aromatic hydrocarbons (PAHs) are ubiquitous compounds that enter the environment from natural and anthropogenic sources and are often used as markers to determine the extent, fate, and potential effects on natural resources after a crude oil accidental release. Gas chromatography-mass spectrometry (GC-MS) after liquid-liquid extraction (LLE+GC-MS) has been extensively used to isolate and quantify both parent and alkylated PAHs. However, it requires labor-intensive extraction and cleanup steps and generates large amounts of toxic solvent waste. Therefore, there is a clear need for greener, faster techniques with enough reproducibility and sensitivity to quantify many PAHs in large numbers of water samples in a short period of time. This study combines online solid-phase extraction followed by liquid chromatography (LC) separation with dopant-assisted atmospheric pressure photoionization (APPI) and tandem MS detection, to provide a one-step protocol that detects PAHs at low nanograms per liter with almost no sample preparation and with a significantly lower consumption of toxic halogenated solvents. Water samples were amended with methanol, fortified with isotopically labeled PAHs, and loaded onto an online SPE column, using a large-volume sample loop with an auxiliary LC pump for sample preconcentration and salt removal. The loaded SPE column was connected to an UPLC pump and analytes were backflushed to a Thermo Hypersil Green PAH analytical column where a 20-min gradient separation was performed at a variable flow rate. Detection was performed by a triple-quadrupole MS equipped with a gas-phase dopant delivery system, using 1.50 mL of chlorobenzene dopant per run. In contrast, LLE+GC-MS typically uses 150 mL of organic solvents per sample, and methylene chloride is preferred because of its low boiling point. However, this solvent has a higher environmental persistence than chlorobenzene and is considered a carcinogen. The automated system is capable of

  11. PATMA: parser of archival tissue microarray.

    PubMed

    Roszkowiak, Lukasz; Lopez, Carlos

    2016-01-01

    Tissue microarrays are commonly used in modern pathology for cancer tissue evaluation, as it is a very potent technique. Tissue microarray slides are often scanned to perform computer-aided histopathological analysis of the tissue cores. For processing the image, splitting the whole virtual slide into images of individual cores is required. The only way to distinguish cores corresponding to specimens in the tissue microarray is through their arrangement. Unfortunately, distinguishing the correct order of cores is not a trivial task as they are not labelled directly on the slide. The main aim of this study was to create a procedure capable of automatically finding and extracting cores from archival images of the tissue microarrays. This software supports the work of scientists who want to perform further image processing on single cores. The proposed method is an efficient and fast procedure, working in fully automatic or semi-automatic mode. A total of 89% of punches were correctly extracted with automatic selection. With the addition of manual correction, it is possible to fully prepare the whole slide image for extraction in 2 min per tissue microarray. The proposed technique requires minimum skill and time to parse a big array of cores from a tissue microarray whole-slide image into individual core images.
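    Since cores can only be identified by their grid arrangement, the essential step is locating each core and ordering the centroids row-by-row. The sketch below illustrates that idea with `scipy.ndimage` on a toy binary mask; the helper name, tolerance parameter and mask are assumptions for demonstration, not the PATMA implementation:

    ```python
    import numpy as np
    from scipy import ndimage

    def extract_cores(mask, row_tol=20):
        """Locate tissue cores in a binary slide mask and order them
        row-by-row, left-to-right, as in a tissue microarray layout."""
        labels, n = ndimage.label(mask)
        centroids = ndimage.center_of_mass(mask, labels, range(1, n + 1))
        # Group centroids into rows: y-coordinates closer than row_tol
        # are treated as the same row of the array.
        cores = sorted(centroids, key=lambda c: c[0])
        rows, current = [], [cores[0]]
        for c in cores[1:]:
            if c[0] - current[-1][0] < row_tol:
                current.append(c)
            else:
                rows.append(current)
                current = [c]
        rows.append(current)
        # Within each row, order cores left-to-right by x-coordinate.
        return [c for row in rows for c in sorted(row, key=lambda c: c[1])]

    # Toy mask with four square "cores" in a 2x2 arrangement
    mask = np.zeros((100, 100), dtype=int)
    for y, x in [(20, 20), (20, 70), (70, 20), (70, 70)]:
        mask[y - 5:y + 5, x - 5:x + 5] = 1
    order = extract_cores(mask)
    ```

    Each ordered centroid would then index a crop of the full-resolution slide, producing the individual core images described above.
    
    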

  12. PATMA: parser of archival tissue microarray

    PubMed Central

    2016-01-01

    Tissue microarrays are commonly used in modern pathology for cancer tissue evaluation, as it is a very potent technique. Tissue microarray slides are often scanned to perform computer-aided histopathological analysis of the tissue cores. For processing the image, splitting the whole virtual slide into images of individual cores is required. The only way to distinguish cores corresponding to specimens in the tissue microarray is through their arrangement. Unfortunately, distinguishing the correct order of cores is not a trivial task as they are not labelled directly on the slide. The main aim of this study was to create a procedure capable of automatically finding and extracting cores from archival images of the tissue microarrays. This software supports the work of scientists who want to perform further image processing on single cores. The proposed method is an efficient and fast procedure, working in fully automatic or semi-automatic mode. A total of 89% of punches were correctly extracted with automatic selection. With the addition of manual correction, it is possible to fully prepare the whole slide image for extraction in 2 min per tissue microarray. The proposed technique requires minimum skill and time to parse a big array of cores from a tissue microarray whole-slide image into individual core images. PMID:27920955

  13. MARS: Microarray analysis, retrieval, and storage system

    PubMed Central

    Maurer, Michael; Molidor, Robert; Sturn, Alexander; Hartler, Juergen; Hackl, Hubert; Stocker, Gernot; Prokesch, Andreas; Scheideler, Marcel; Trajanoski, Zlatko

    2005-01-01

    Background Microarray analysis has become a widely used technique for the study of gene-expression patterns on a genomic scale. As more and more laboratories are adopting microarray technology, there is a need for powerful and easy to use microarray databases facilitating array fabrication, labeling, hybridization, and data analysis. The wealth of data generated by this high throughput approach renders adequate database and analysis tools crucial for the pursuit of insights into the transcriptomic behavior of cells. Results MARS (Microarray Analysis and Retrieval System) provides a comprehensive MIAME supportive suite for storing, retrieving, and analyzing multi color microarray data. The system comprises a laboratory information management system (LIMS), a quality control management, as well as a sophisticated user management system. MARS is fully integrated into an analytical pipeline of microarray image analysis, normalization, gene expression clustering, and mapping of gene expression data onto biological pathways. The incorporation of ontologies and the use of MAGE-ML enables an export of studies stored in MARS to public repositories and other databases accepting these documents. Conclusion We have developed an integrated system tailored to serve the specific needs of microarray based research projects using a unique fusion of Web based and standalone applications connected to the latest J2EE application server technology. The presented system is freely available for academic and non-profit institutions. More information can be found at . PMID:15836795

  14. Fully automated analysis of four tobacco-specific N-nitrosamines in mainstream cigarette smoke using two-dimensional online solid phase extraction combined with liquid chromatography-tandem mass spectrometry.

    PubMed

    Zhang, Jie; Bai, Ruoshi; Yi, Xiaoli; Yang, Zhendong; Liu, Xingyu; Zhou, Jun; Liang, Wei

    2016-01-01

    A fully automated method for the detection of four tobacco-specific nitrosamines (TSNAs) in mainstream cigarette smoke (MSS) has been developed. The newly developed method is based on two-dimensional online solid-phase extraction-liquid chromatography-tandem mass spectrometry (SPE/LC-MS/MS). The two-dimensional SPE utilizes two cartridges with different extraction mechanisms to clean up interferences of different polarity and minimize sample matrix effects on each analyte. Chromatographic separation was achieved using a UPLC C18 reversed phase analytical column. Under the optimum online SPE/LC-MS/MS conditions, N'-nitrosonornicotine (NNN), N'-nitrosoanatabine (NAT), N'-nitrosoanabasine (NAB), and 4-(methylnitrosamino)-1-(3-pyridyl)-1-butanone (NNK) were baseline separated with good peak shapes. This method appears to be the most sensitive method yet reported for determination of TSNAs in mainstream cigarette smoke. The limits of quantification for NNN, NNK, NAT and NAB reached the levels of 6.0, 1.0, 3.0 and 0.6 pg/cig, respectively, which were well below the lowest levels of TSNAs in MSS of current commercial cigarettes. The accuracy of the measurement of four TSNAs was from 92.8 to 107.3%. The relative standard deviations of intra- and inter-day analysis were less than 5.4% and 7.5%, respectively. The main advantages of the method developed are fairly high sensitivity, selectivity and accuracy of results, minimum sample pre-treatment, full automation, and high throughput. As a part of the validation procedure, the developed method was applied to evaluate TSNAs yields for 27 top-selling commercial cigarettes in China.

  15. A fully automated system with on-line micro solid-phase extraction combined with capillary liquid chromatography-tandem mass spectrometry for high throughput analysis of microcystins and nodularin-R in tap water and lake water.

    PubMed

    Shan, Yuanhong; Shi, Xianzhe; Dou, Abo; Zou, Cunjie; He, Hongbing; Yang, Qin; Zhao, Sumin; Lu, Xin; Xu, Guowang

    2011-04-01

    Microcystins and nodularins are cyclic peptide hepatotoxins and tumour promoters from cyanobacteria. The present study describes the development, validation and practical application of a fully automated analytical method based on on-line micro solid-phase extraction-capillary liquid chromatography-tandem mass spectrometry for the simultaneous determination of seven microcystins and nodularin-R in tap water and lake water. Aliquots of just 100 μL of water samples are sufficient for the detection and quantification of all eight toxins. Selected reaction monitoring was used to obtain the highest sensitivity. Good linear calibrations were obtained for microcystins (50-2000 ng/L) and nodularin-R (25-1000 ng/L) in spiked tap water and lake water samples. Excellent interday and intraday repeatability were achieved for the eight toxins, with relative standard deviation less than 15.7% at three different concentrations. Acceptable recoveries were achieved at the three concentrations with both tap water matrix and lake water matrix, and no significant matrix effect was found in tap water and lake water except for microcystin-RR. The limits of detection (signal-to-noise ratio = 3) of the toxins were lower than 56.6 ng/L, which is far below the 1 μg/L defined by the World Health Organization provisional guideline for microcystin-LR. Finally, this method was successfully applied to lake water samples from Tai lake and proved to be useful for water quality monitoring.
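    The limits of detection here use the common signal-to-noise = 3 criterion, i.e. LOD = 3 x (baseline noise SD) / (calibration slope). A minimal sketch of that calculation; the calibration points and noise figure are hypothetical, not the study's values:

    ```python
    import numpy as np

    # Hypothetical calibration line: peak area vs. concentration (ng/L)
    conc = np.array([50.0, 250.0, 500.0, 1000.0, 2000.0])
    signal = np.array([1075.0, 5375.0, 10750.0, 21500.0, 43000.0])

    slope, intercept = np.polyfit(conc, signal, 1)
    noise_sd = 400.0                    # baseline noise SD (hypothetical)
    lod = 3 * noise_sd / slope          # concentration giving S/N = 3
    ```

    With a measured noise level in place of the hypothetical `noise_sd`, the same formula yields a per-toxin LOD in concentration units.
    
    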

  16. Quantitative analysis of cortisol and 6β-hydroxycortisol in urine by fully automated SPE and ultra-performance LC coupled with electrospray and atmospheric pressure chemical ionization (ESCi)-TOF-MS.

    PubMed

    Lang, Lotte M; Dalsgaard, Petur W; Linnet, Kristian

    2013-01-01

    An ultra-performance LC TOF MS method for quantitative analysis of cortisol and 6β-hydroxycortisol in urine was developed. The method was used for determination of the ratio between 6β-hydroxycortisol and cortisol in urine received from autopsy cases and living persons as a measure of cytochrome P450 3A enzyme activity. Urine samples (0.25 mL) were extracted with an in-house developed fully automated 96-well SPE system. The compounds were quantified using a Waters ACQUITY UPLC system coupled to a Waters SYNAPT G2. The MS sensitivity was optimized by using negative ionization in sensitivity mode (resolution >10 000 full-width at half-maximum), and further optimized by using the enhanced duty cycle around the 410 m/z. ESCi (simultaneous electrospray and atmospheric pressure chemical ionization) mode was used to compensate for the matrix effects of postmortem urine. Finally, the SYNAPT G2 was tested as a quantitative instrument. The developed method has measurement ranges of 2.5-300 ng/mL for cortisol and 10-1200 ng/mL for 6β-hydroxycortisol. Mean overall process efficiencies were 29.4 and 23.0% for cortisol and 6β-hydroxycortisol, respectively. In 20 forensic reference cases, the range of the 6β-hydroxycortisol/cortisol ratio was 0.29-14.2 with a median of 3.04.

  17. Screening for illicit and medicinal drugs in whole blood using fully automated SPE and ultra-high-performance liquid chromatography with TOF-MS with data-independent acquisition.

    PubMed

    Pedersen, Anders Just; Dalsgaard, Petur Weihe; Rode, Andrej Jaroslav; Rasmussen, Brian Schou; Müller, Irene Breum; Johansen, Sys Stybe; Linnet, Kristian

    2013-07-01

    A broad forensic screening method for 256 analytes in whole blood based on a fully automated SPE robotic extraction and ultra-high-performance liquid chromatography (UHPLC) with TOF-MS with data-independent acquisition has been developed. The limit of identification was evaluated for all 256 compounds and 95 of these compounds were validated with regard to matrix effects, extraction recovery, and process efficiency. The limit of identification ranged from 0.001 to 0.1 mg/kg, and the process efficiency exceeded 50% for 73 of the 95 analytes. As an example of application, 1335 forensic traffic cases were analyzed with the presented screening method. Of these, 992 cases (74%) were positive for one or more traffic-relevant drugs above the Danish legal limits. Commonly abused drugs such as amphetamine, cocaine, and frequent types of benzodiazepines were the major findings. Nineteen less frequently encountered drugs were detected, e.g. buprenorphine, butylone, cathine, fentanyl, lysergic acid diethylamide, m-chlorophenylpiperazine, 3,4-methylenedioxypyrovalerone, mephedrone, 4-methylamphetamine, p-fluoroamphetamine, and p-methoxy-N-methylamphetamine. In conclusion, using UHPLC-TOF-MS screening with data-independent acquisition resulted in the detection of common drugs of abuse as well as new designer drugs and more rarely occurring drugs. Thus, TOF-MS screening of blood samples constitutes a practical way for screening traffic cases, with the exception of Δ9-tetrahydrocannabinol, which should be handled in a separate method.

  18. Validation of high-throughput measurement system with microwave-assisted extraction, fully automated sample preparation device, and gas chromatography-electron capture detector for determination of polychlorinated biphenyls in whale blubber.

    PubMed

    Fujita, Hiroyuki; Honda, Katsuhisa; Hamada, Noriaki; Yasunaga, Genta; Fujise, Yoshihiro

    2009-02-01

    Validation of a high-throughput measurement system with microwave-assisted extraction (MAE), fully automated sample preparation device (SPD), and gas chromatography-electron capture detector (GC-ECD) for the determination of polychlorinated biphenyls (PCBs) in minke whale blubber was performed. PCB congeners accounting for > 95% of the total PCBs burden in blubber were efficiently extracted with a small volume (20 mL) of n-hexane using MAE due to simultaneous saponification and extraction. Further, the crude extract obtained by MAE was rapidly purified and automatically transferred to a small volume (1 mL) of toluene using SPD without using concentrators. Furthermore, the concentration of PCBs in the purified and concentrated solution was accurately determined by GC-ECD. Moreover, the result of an accuracy test using a certified reference material (SRM 1588b; cod liver oil) showed good agreement with the NIST certified concentration values. In addition, the method quantification limit of total PCBs in whale blubber was 41 ng g(-1). This new measurement system for PCBs takes only four hours. Consequently, this method is well suited to the monitoring and screening of PCBs in the conservation of the marine ecosystem and safe distribution of foods.

  19. Fully Automated Anesthesia, Analgesia and Fluid Management

    ClinicalTrials.gov

    2017-01-03

    General Anesthetic Drug Overdose; Adverse Effect of Intravenous Anesthetics, Sequela; Complication of Anesthesia; Drug Delivery System Malfunction; Hemodynamic Instability; Underdosing of Other General Anesthetics

  20. Overview of Protein Microarrays

    PubMed Central

    Reymond Sutandy, FX; Qian, Jiang; Chen, Chien-Sheng; Zhu, Heng

    2013-01-01

    Protein microarray is an emerging technology that provides a versatile platform for characterization of hundreds of thousands of proteins in a highly parallel and high-throughput way. Two major classes of protein microarrays are defined to describe their applications: analytical and functional protein microarrays. In addition, tissue or cell lysates can also be fractionated and spotted on a slide to form a reverse-phase protein microarray. While the fabrication technology is maturing, applications of protein microarrays, especially functional protein microarrays, have flourished during the past decade. Here, we will first review recent advances in the protein microarray technologies, and then present a series of examples to illustrate the applications of analytical and functional protein microarrays in both basic and clinical research. The research areas will include detection of various binding properties of proteins, study of protein posttranslational modifications, analysis of host-microbe interactions, profiling antibody specificity, and identification of biomarkers in autoimmune diseases. As a powerful technology platform, it would not be surprising if protein microarrays become one of the leading technologies in proteomic and diagnostic fields in the next decade. PMID:23546620

  1. Ultrahigh density microarrays of solid samples.

    PubMed

    LeBaron, Matthew J; Crismon, Heidi R; Utama, Fransiscus E; Neilson, Lynn M; Sultan, Ahmed S; Johnson, Kevin J; Andersson, Eva C; Rui, Hallgeir

    2005-07-01

    We present a sectioning and bonding technology to make ultrahigh density microarrays of solid samples, cutting edge matrix assembly (CEMA). Maximized array density is achieved by a scaffold-free, self-supporting construction with rectangular array features that are incrementally scalable. This platform technology facilitates arrays of >10,000 tissue features on a standard glass slide, inclusion of unique sample identifiers for improved manual or automated tracking, and oriented arraying of stratified or polarized samples.

  2. Automation in Clinical Microbiology

    PubMed Central

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  3. Development of a fully automated sequential injection solid-phase extraction procedure coupled to liquid chromatography to determine free 2-hydroxy-4-methoxybenzophenone and 2-hydroxy-4-methoxybenzophenone-5-sulphonic acid in human urine.

    PubMed

    León, Zacarías; Chisvert, Alberto; Balaguer, Angel; Salvador, Amparo

    2010-04-07

    2-Hydroxy-4-methoxybenzophenone and 2-hydroxy-4-methoxybenzophenone-5-sulphonic acid, commonly known as benzophenone-3 (BZ3) and benzophenone-4 (BZ4), respectively, are substances widely used as UV filters in cosmetic products in order to absorb UV radiation and protect human skin from direct exposure to the deleterious wavelengths of sunlight. As with other UV filters, there is evidence of their percutaneous absorption. This work describes an analytical method developed to determine trace levels of free BZ3 and BZ4 in human urine. The methodology is based on a solid-phase extraction (SPE) procedure for clean-up and pre-concentration, followed by monitoring of the UV filters by liquid chromatography with ultraviolet detection (LC-UV). To improve not only the sensitivity and selectivity but also the precision of the method, the principle of sequential injection analysis was used to automate the SPE process and to transfer the eluates from the SPE to the LC system. The use of a six-channel valve as an interface for the switching arrangements successfully allowed the on-line connection of SPE sample processing with LC analysis. The SPE of BZ3 and BZ4 was performed using octadecyl (C18) and diethylaminopropyl (DEA) modified silica microcolumns, respectively, in which the analytes were retained and eluted selectively. Because of matrix effects, the determination was based on standard addition quantification and was fully validated. The relative standard deviations of the results were 13% and 6% for BZ3 and BZ4, respectively, whereas the limits of detection were 60 and 30 ng mL(-1), respectively. The method was satisfactorily applied to determine BZ3 and BZ4 in urine from volunteers who had applied a sunscreen cosmetic containing both UV filters.
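    The standard-addition quantification used above can be sketched in a few lines: the signal is regressed on the added-standard concentration, and the analyte concentration is read off as the magnitude of the x-axis intercept. The function name and the example values below are illustrative, not the paper's actual BZ3/BZ4 calibration data.

```python
def standard_addition_conc(added, signal):
    """Standard-addition quantification (sketch): fit signal vs. added
    standard concentration by least squares, then extrapolate to the
    x-intercept; its magnitude is the analyte concentration."""
    n = len(added)
    mx, my = sum(added) / n, sum(signal) / n
    sxx = sum((x - mx) ** 2 for x in added)
    # Slope and intercept of the ordinary least-squares line.
    b = sum((x - mx) * (y - my) for x, y in zip(added, signal)) / sxx
    a = my - b * mx
    # x-intercept is at -a/b; the concentration is its magnitude, a/b.
    return a / b
```

For example, if the true concentration is 2 units and the detector response is 3 counts per unit, spikes of 0, 1, 2, and 3 units give signals 6, 9, 12, and 15, and the function recovers 2.0.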

  4. Xenon International Automated Control

    SciTech Connect

    2016-08-05

    The Xenon International Automated Control software monitors, displays status, and allows for manual operator control as well as fully automatic control of multiple commercial and PNNL designed hardware components to generate and transmit atmospheric radioxenon concentration measurements every six hours.

  5. Hybridization and Selective Release of DNA Microarrays

    SciTech Connect

    Beer, N R; Baker, B; Piggott, T; Maberry, S; Hara, C M; DeOtte, J; Benett, W; Mukerjee, E; Dzenitis, J; Wheeler, E K

    2011-11-29

    DNA microarrays contain sequence-specific probes arrayed in distinct spots numbering from 10,000 to over 1,000,000, depending on the platform. This tremendous degree of multiplexing gives microarrays great potential for environmental background sampling, broad-spectrum clinical monitoring, and continuous biological threat detection. In practice, their use in these applications is not common due to limited information content, long processing times, and high cost. This work focused on characterizing the phenomena of microarray hybridization and selective release so that these limitations can be addressed, which would revolutionize the ways microarrays can be used for LLNL's Global Security missions. The goals of this project were two-fold: faster, automated hybridizations and selective release of hybridized features. The first study area involves hybridization kinetics and mass-transfer effects. The standard hybridization protocol uses an overnight incubation to achieve the best possible signal for any sample type, as well as for convenience in manual processing. There is potential to significantly shorten this time based on better understanding and control of the rate-limiting processes and knowledge of the progress of the hybridization. In the hybridization work, a custom microarray flow cell was used to manipulate the chemical and thermal environment of the array and autonomously image the changes over time during hybridization. The second study area is selective release. Microarrays easily generate hybridization patterns and signatures, but there is still an unmet need for methodologies enabling rapid and selective analysis of these patterns and signatures. Detailed analysis of individual spots by subsequent sequencing could potentially yield significant information for rapidly mutating and emerging (or deliberately engineered) pathogens. In the selective release work, optical energy deposition with coherent light quickly provides the thermal energy to

  6. E-Predict: a computational strategy for species identification based on observed DNA microarray hybridization patterns.

    PubMed

    Urisman, Anatoly; Fischer, Kael F; Chiu, Charles Y; Kistler, Amy L; Beck, Shoshannah; Wang, David; DeRisi, Joseph L

    2005-01-01

    DNA microarrays may be used to identify microbial species present in environmental and clinical samples. However, automated tools for reliable species identification based on observed microarray hybridization patterns are lacking. We present an algorithm, E-Predict, for microarray-based species identification. E-Predict compares observed hybridization patterns with theoretical energy profiles representing different species. We demonstrate the application of the algorithm to viral detection in a set of clinical samples and discuss its relevance to other metagenomic applications.
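    The core idea behind E-Predict, scoring each candidate species by how well its theoretical energy profile matches the observed hybridization pattern and reporting the best-scoring species, can be sketched as follows. This is an illustrative stand-in using Pearson correlation as the similarity measure, not the algorithm's actual metric, and `e_predict_rank` is a hypothetical name.

```python
import math

def e_predict_rank(observed, species_profiles):
    """Rank candidate species by the similarity between the observed
    hybridization intensities and each species' theoretical profile.
    Returns (species, score) pairs, best match first."""
    def pearson(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy) if sx and sy else 0.0
    scores = {sp: pearson(observed, prof)
              for sp, prof in species_profiles.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```

Given an observed pattern and a dictionary of per-species profiles over the same probe set, the top-ranked entry is the predicted species.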

  7. Evaluation of the Fully Automated BACTEC MGIT 960 System for Testing Susceptibility of Mycobacterium tuberculosis to Pyrazinamide, Streptomycin, Isoniazid, Rifampin, and Ethambutol and Comparison with the Radiometric BACTEC 460TB Method

    PubMed Central

    Scarparo, Claudio; Ricordi, Paolo; Ruggiero, Giuliana; Piccoli, Paola

    2004-01-01

    The performance of the fully automated BACTEC MGIT 960 (M960) system for the testing of Mycobacterium tuberculosis susceptibility to streptomycin (SM), isoniazid (INH), rifampin (RMP), ethambutol (EMB), and pyrazinamide (PZA) was evaluated with 100 clinical isolates and compared to that of the radiometric BACTEC 460TB (B460) system. The agar proportion method and the B460 system were used as reference methods to resolve the discordant results for SM, INH, RMP, and EMB (a combination known as SIRE) and PZA, respectively. The overall agreements were 96.3% for SIRE and 92% for PZA. For SIRE, a total of 26 discrepancies were found and were resolved in favor of the M960 system in 8 cases and in favor of the B460 system in 18 cases. The M960 system produced 8 very major errors (VME) and 10 major errors (ME), while the B460 system showed 4 VME and 4 ME. No statistically significant differences were found. Both systems exhibited excellent performance, but a higher number of VME was observed with the M960 system at the critical concentrations of EMB and SM. For PZA, a total of eight discrepancies were observed and were resolved in favor of the M960 system in one case and in favor of the B460 system in seven cases; no statistically significant differences were found. The M960 system showed four VME and three ME. The mean times to report overall PZA results and resistant results were 8.2 and 9.8 days, respectively, for the M960 system and 7.4 and 8.1 days, respectively, for the B460 system. Statistically significant differences were found. The mean times to report SIRE results were 8.3 days for the M960 system and 8.2 days for the B460 system. No statistically significant differences were found. Twelve strains tested for SIRE susceptibility and seven strains tested for PZA susceptibility had been reprocessed because of contamination. In conclusion, the M960 system can represent a valid alternative to the B460 for M. tuberculosis susceptibility testing; however, the frequent

  8. Image quantification of high-throughput tissue microarray

    NASA Astrophysics Data System (ADS)

    Wu, Jiahua; Dong, Junyu; Zhou, Huiyu

    2006-03-01

    Tissue microarray (TMA) technology allows rapid visualization of molecular targets in thousands of tissue specimens at a time and provides valuable information on the expression of proteins within tissues at a cellular and sub-cellular level. TMA technology overcomes the bottleneck of traditional tissue analysis and allows it to catch up with the rapid advances in lead discovery. Studies using TMA with immunohistochemistry (IHC) can produce a large number of images for interpretation within a very short time. Manual interpretation does not allow accurate quantitative analysis of staining to be undertaken. Automated image capture and analysis have been shown to be superior to manual interpretation. The aim of this work is to develop a truly high-throughput and fully automated image capture and analysis system. We develop a robust colour segmentation algorithm using the hue-saturation-intensity (HSI) colour space to provide quantification of signal intensity and partitioning of staining on high-throughput TMA. Initial segmentation results and quantification data have been achieved on 16,000 TMA colour images over 23 different tissue types.
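    The HSI-based segmentation described above rests on converting each RGB pixel to hue, saturation, and intensity and then thresholding. A minimal per-pixel sketch follows; the hue band and saturation cutoff are illustrative assumptions, since the paper's actual thresholds are not given in the abstract.

```python
import math

def rgb_to_hsi(r, g, b):
    """Convert 8-bit RGB to (hue in degrees, saturation, intensity),
    with saturation and intensity in [0, 1]."""
    r, g, b = r / 255.0, g / 255.0, b / 255.0
    i = (r + g + b) / 3.0
    mn = min(r, g, b)
    s = 0.0 if i == 0 else 1.0 - mn / i
    num = 0.5 * ((r - g) + (r - b))
    den = math.sqrt((r - g) ** 2 + (r - b) * (g - b))
    h = 0.0 if den == 0 else math.degrees(
        math.acos(max(-1.0, min(1.0, num / den))))
    if b > g:               # hue lives in [0, 360); reflect lower half-plane
        h = 360.0 - h
    return h, s, i

def positive_stain(pixel, hue_lo=0.0, hue_hi=60.0, min_sat=0.2):
    """Call a pixel 'stained' when its hue falls in a target band and it
    is sufficiently saturated (thresholds here are hypothetical)."""
    h, s, _ = rgb_to_hsi(*pixel)
    return hue_lo <= h <= hue_hi and s >= min_sat
```

Partitioning staining then amounts to applying `positive_stain` over all pixels of a TMA spot and summing intensities of the positives.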

  9. Multievidence microarray mining.

    PubMed

    Seifert, Martin; Scherf, Matthias; Epple, Anton; Werner, Thomas

    2005-10-01

    Microarray mining is a challenging task because of the superposition of several processes in the data. We believe that the combination of microarray data-based analyses (statistical significance analysis of gene expression) with array-independent analyses (literature-mining and promoter analysis) enables some of the problems of traditional array analysis to be overcome. As a proof-of-principle, we revisited publicly available microarray data derived from an experiment with platelet-derived growth factor (PDGF)-stimulated fibroblasts. Our strategy revealed results beyond the detection of the major metabolic pathway known to be linked to the PDGF response: we were able to identify the crosstalking regulatory networks underlying the metabolic pathway without using a priori knowledge about the experiment.

  10. DNA microarray technology. Introduction.

    PubMed

    Pollack, Jonathan R

    2009-01-01

    DNA microarray technology has revolutionized biological research by enabling genome-scale explorations. This chapter provides an overview of DNA microarray technology and its application to characterizing the physical genome, with a focus on cancer genomes. Specific areas discussed include investigations of DNA copy number alteration (and loss of heterozygosity), DNA methylation, DNA-protein (i.e., chromatin and transcription factor) interactions, DNA replication, and the integration of diverse genome-scale data types. Also provided is a perspective on recent advances and future directions in characterizing the physical genome.

  11. Identifying Fishes through DNA Barcodes and Microarrays

    PubMed Central

    Kochzius, Marc; Seidel, Christian; Antoniou, Aglaia; Botla, Sandeep Kumar; Campo, Daniel; Cariani, Alessia; Vazquez, Eva Garcia; Hauschild, Janet; Hervet, Caroline; Hjörleifsdottir, Sigridur; Hreggvidsson, Gudmundur; Kappel, Kristina; Landi, Monica; Magoulas, Antonios; Marteinsson, Viggo; Nölte, Manfred; Planes, Serge; Tinti, Fausto; Turan, Cemal; Venugopal, Moleyur N.; Weber, Hannes; Blohm, Dietmar

    2010-01-01

    Background: International fish trade reached an import value of 62.8 billion Euro in 2006, 44.6% of which was covered by the European Union. Species identification is a key problem throughout the life cycle of fishes, from eggs and larvae to adults in fisheries research and control, as well as for processed fish products in consumer protection. Methodology/Principal Findings: This study aims to evaluate the applicability of three mitochondrial genes, 16S rRNA (16S), cytochrome b (cyt b), and cytochrome oxidase subunit I (COI), for the identification of 50 European marine fish species by combining techniques of “DNA barcoding” and microarrays. In a DNA barcoding approach, neighbour-joining (NJ) phylogenetic trees of 369 16S, 212 cyt b, and 447 COI sequences indicated that cyt b and COI are suitable for unambiguous identification, whereas 16S failed to discriminate closely related flatfish and gurnard species. In the course of probe design for DNA microarray development, each of the markers yielded a high number of potentially species-specific probes in silico, although many of them were rejected based on microarray hybridisation experiments. None of the markers provided probes to discriminate the sibling flatfish and gurnard species. However, since 16S probes were less negatively influenced by the “position of label” effect and showed the lowest rejection rate and the highest mean signal intensity, 16S is more suitable for DNA microarray probe design than cyt b and COI. The large portion of COI probes rejected after hybridisation experiments (>90%) renders this DNA barcoding marker rather unsuitable for this high-throughput technology. Conclusions/Significance: Based on these data, a DNA microarray containing 64 functional oligonucleotide probes for the identification of 30 of the 50 fish species investigated was developed. It represents the next step towards an automated and easy-to-handle method to identify fish, ichthyoplankton, and fish products. PMID

  12. Protein Microarray Technology

    PubMed Central

    Hall, David A.; Ptacek, Jason

    2007-01-01

    Protein chips have emerged as a promising approach for a wide variety of applications, including the identification of protein-protein interactions, protein-phospholipid interactions, small-molecule targets, and substrates of protein kinases. They can also be used for clinical diagnostics and monitoring disease states. This article reviews current methods in the generation and applications of protein microarrays. PMID:17126887

  13. Microarrays for Undergraduate Classes

    ERIC Educational Resources Information Center

    Hancock, Dale; Nguyen, Lisa L.; Denyer, Gareth S.; Johnston, Jill M.

    2006-01-01

    A microarray experiment is presented that, in six laboratory sessions, takes undergraduate students from the tissue sample right through to data analysis. The model chosen, the murine erythroleukemia cell line, can be easily cultured in sufficient quantities for class use. Large changes in gene expression can be induced in these cells by…

  14. FPGA based system for automatic cDNA microarray image processing.

    PubMed

    Belean, Bogdan; Borda, Monica; Le Gal, Bertrand; Terebes, Romulus

    2012-07-01

    Automation is an open subject in DNA microarray image processing, the aim being reliable gene expression estimation. The paper presents a novel shock-filter-based approach for automatic microarray grid alignment. The proposed method has significantly reduced computational complexity compared to state-of-the-art approaches, while achieving similar accuracy. Based on this approach, we also propose an FPGA-based system for microarray image analysis that eliminates the shortcomings of existing software platforms: user intervention, long computation times, and high cost. Our system includes application-specific architectures that parallelize the algorithms, aiming at fast, automated cDNA microarray image processing. The proposed automated image processing chain is implemented both on a general-purpose processor and using the developed hardware architectures as co-processors in an FPGA-based system. The comparative results included in the last section show that the hardware-based implementations provide an important gain in computation time.

  15. Cellular neural networks, the Navier-Stokes equation, and microarray image reconstruction.

    PubMed

    Zineddin, Bachar; Wang, Zidong; Liu, Xiaohui

    2011-11-01

    Although the last decade has witnessed a great deal of improvement in microarray technology, major developments are still needed in all of its main stages, including image processing. Some hardware implementations of microarray image processing have been proposed in the literature and proved to be promising alternatives to the currently available software systems. Their main drawback, however, is that they do not quantify the gene spot realistically, i.e., without making assumptions about the image surface. Our aim in this paper is to present a new image-reconstruction algorithm using a cellular neural network that solves the Navier-Stokes equation. This algorithm offers a robust method for estimating the background signal within the gene-spot region. The MATCNN toolbox for Matlab is used to test the proposed method. Quantitative comparisons, in terms of objective criteria, are carried out between our approach and other available methods. It is shown that the proposed algorithm gives highly accurate and realistic measurements in a fully automated manner and in remarkably efficient time.

  16. Photonic wire biosensor microarray chip and instrumentation with application to serotyping of Escherichia coli isolates.

    PubMed

    Janz, S; Xu, D-X; Vachon, M; Sabourin, N; Cheben, P; McIntosh, H; Ding, H; Wang, S; Schmid, J H; Delâge, A; Lapointe, J; Densmore, A; Ma, R; Sinclair, W; Logan, S M; Mackenzie, R; Liu, Q Y; Zhang, D; Lopinski, G; Mozenson, O; Gilmour, M; Tabor, H

    2013-02-25

    A complete photonic wire molecular biosensor microarray chip architecture and supporting instrumentation is described. Chip layouts with 16 and 128 independent sensors have been fabricated and tested, where each sensor can provide an independent molecular binding curve. Each sensor is 50 μm in diameter, and consists of a millimeter long silicon photonic wire waveguide folded into a spiral ring resonator. An array of 128 sensors occupies a 2 × 2 mm2 area on a 6 × 9 mm2 chip. Microfluidic sample delivery channels are fabricated monolithically on the chip. The size and layout of the sensor array is fully compatible with commercial spotting tools designed to independently functionalize fluorescence based biochips. The sensor chips are interrogated using an instrument that delivers sample fluid to the chip and is capable of acquiring up to 128 optical sensor outputs simultaneously and in real time. Coupling light from the sensor chip is accomplished through arrays of sub-wavelength surface grating couplers, and the signals are collected by a fixed two-dimensional detector array. The chip and instrument are designed so that connection of the fluid delivery system and optical alignment are automated, and can be completed in a few seconds with no active user input. This microarray system is used to demonstrate a multiplexed assay for serotyping E. coli bacteria using serospecific polyclonal antibody probe molecules.

  17. Identification of Upper Respiratory Tract Pathogens Using Electrochemical Detection on an Oligonucleotide Microarray

    PubMed Central

    Lodes, Michael J.; Suciu, Dominic; Wilmoth, Jodi L.; Ross, Marty; Munro, Sandra; Dix, Kim; Bernards, Karen; Stöver, Axel G.; Quintana, Miguel; Iihoshi, Naomi; Lyon, Wanda J.; Danley, David L.; McShea, Andrew

    2007-01-01

    Bacterial and viral upper respiratory infections (URI) produce highly variable clinical symptoms that cannot be used to identify the etiologic agent. Proper treatment, however, depends on correct identification of the pathogen involved, as antibiotics provide little or no benefit with viral infections. Here we describe a rapid and sensitive genotyping assay and microarray for URI identification using standard amplification and hybridization techniques, with electrochemical detection (ECD) on a semiconductor-based oligonucleotide microarray. The assay was developed to detect four bacterial pathogens (Bordetella pertussis, Streptococcus pyogenes, Chlamydia pneumoniae, and Mycoplasma pneumoniae) and nine viral pathogens (adenovirus 4; coronavirus OC43, 229E, and HK; influenza A and B; parainfluenza types 1, 2, and 3; and respiratory syncytial virus). This new platform forms the basis for a fully automated diagnostics system that is very flexible and can be customized to suit different or additional pathogens. Multiple probes on a flexible platform allow one to test probes empirically and then select highly reactive probes for further iterative evaluation. Because ECD uses an enzymatic reaction to create electrical signals that can be read directly from the array, there is no need for image analysis or for expensive and delicate optical scanning equipment. We show assay sensitivity and specificity that are excellent for a multiplexed format. PMID:17895966

  18. Detecting outlier samples in microarray data.

    PubMed

    Shieh, Albert D; Hung, Yeung Sam

    2009-01-01

    In this paper, we address the problem of detecting outlier samples with highly different expression patterns in microarray data. Although outliers are not common, they appear even in widely used benchmark data sets and can negatively affect microarray data analysis. It is important to identify outliers in order to explore underlying experimental or biological problems and remove erroneous data. We propose a fully automatic outlier detection method based on principal component analysis (PCA) and robust estimation of Mahalanobis distances. We demonstrate that our outlier detection method identifies biologically significant outliers with high accuracy and that outlier removal improves the prediction accuracy of classifiers. Our outlier detection method is closely related to existing robust PCA methods, so we compare it to a prominent robust PCA method.
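    The pipeline described above, PCA followed by robust Mahalanobis-type distances on the projected samples, can be sketched as follows. This sketch uses a simple median/MAD scale estimate in place of the authors' robust estimator, and all names and thresholds are illustrative.

```python
import numpy as np

def pca_outliers(X, n_components=2, threshold=3.0):
    """Flag outlier samples (rows of X): project onto the top principal
    components, then compute Mahalanobis-style distances using per-component
    median/MAD location and scale for robustness. Returns a boolean mask."""
    Xc = X - X.mean(axis=0)
    # Principal axes from the SVD of the centred data matrix.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:n_components].T
    med = np.median(scores, axis=0)
    # 1.4826 makes the MAD consistent with the Gaussian standard deviation.
    mad = np.median(np.abs(scores - med), axis=0) * 1.4826 + 1e-12
    d = np.sqrt((((scores - med) / mad) ** 2).sum(axis=1))
    return d > threshold
```

On a matrix of 20 similar samples plus one sample with a wildly different profile, only the aberrant sample exceeds the distance threshold.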

  19. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Automated cell counter. 864.5200 Section 864.5200....5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of...

  20. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Automated cell counter. 864.5200 Section 864.5200....5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of...

  1. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Automated cell counter. 864.5200 Section 864.5200....5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of...

  2. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated cell counter. 864.5200 Section 864.5200....5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of...

  3. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Automated cell counter. 864.5200 Section 864.5200....5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of...

  4. Microwave-Assisted Sample Treatment in a Fully Automated Flow-Based Instrument: Oxidation of Reduced Technetium Species in the Analysis of Total Technetium-99 in Caustic Aged Nuclear Waste Samples

    SciTech Connect

    Egorov, Oleg B.; O'Hara, Matthew J.; Grate, Jay W.

    2004-07-15

    An automated flow-based instrument for microwave-assisted treatment of liquid samples has been developed and characterized. The instrument utilizes a flow-through reaction vessel design that facilitates the addition of multiple reagents during sample treatment, removal of the gaseous reaction products, and enables quantitative removal of liquids from the reaction vessel for carryover-free operations. Matrix modification and speciation control chemistries that are required for the radiochemical determination of total 99Tc in caustic aged nuclear waste samples have been investigated. A rapid and quantitative oxidation procedure using peroxydisulfate in acidic solution was developed to convert reduced technetium species to pertechnetate in samples with high content of reducing organics. The effectiveness of the automated sample treatment procedures has been validated in the radiochemical analysis of total 99Tc in caustic aged nuclear waste matrixes from the Hanford site.

  5. Design of a covalently bonded glycosphingolipid microarray.

    PubMed

    Arigi, Emma; Blixt, Ola; Buschard, Karsten; Clausen, Henrik; Levery, Steven B

    2012-01-01

    Glycosphingolipids (GSLs) are well known ubiquitous constituents of all eukaryotic cell membranes, yet their normal biological functions are not fully understood. As with other glycoconjugates and saccharides, solid-phase display on microarrays potentially provides an effective platform for in vitro study of their functional interactions. However, with few exceptions, the most widely used microarray platforms display only the glycan moiety of GSLs, which not only ignores potential modulating effects of the lipid aglycone, but inherently limits the scope of application, excluding, for example, the major classes of plant and fungal GSLs. In this work, a prototype "universal" GSL-based covalent microarray has been designed, and its potential utility in assaying protein-GSL binding interactions preliminarily evaluated. An essential step in development involved the enzymatic release of the fatty acyl moiety of the ceramide aglycone of selected mammalian GSLs with sphingolipid ceramide N-deacylase (SCDase). Derivatization of the free amino group of a typical lyso-GSL, lyso-G(M1), with a prototype linker assembled from succinimidyl-[(N-maleimidopropionamido)-diethyleneglycol] ester and 2-mercaptoethylamine, was also tested. Underivatized or linker-derivatized lyso-GSLs were then immobilized on N-hydroxysuccinimide- or epoxide-activated glass microarray slides and probed with carbohydrate-binding proteins of known or partially known specificities (i.e., cholera toxin B-chain; peanut agglutinin; a monoclonal antibody to sulfatide, Sulph 1; and a polyclonal antiserum reactive to asialo-G(M2)). Preliminary evaluation of the method indicated successful immobilization of the GSLs and selective binding of the test probes. The potential utility of this methodology for designing covalent microarrays that incorporate GSLs for serodiagnosis is discussed.

  6. Automation in Immunohematology

    PubMed Central

    Bajpai, Meenu; Kaur, Ravneet; Gupta, Ekta

    2012-01-01

    There have been rapid technological advances in blood banking in the South Asian region over the past decade, with an increasing emphasis on the quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as the column agglutination technique, solid-phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation, and major manufacturers in this field have come up with semi- and fully automated equipment for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents, and processes, and archiving of results, are other major advantages of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service seeking to provide quality patient care with a shorter turnaround time for an ever-increasing workload. This article discusses the various issues involved in the process. PMID:22988378

  7. Automated, Miniaturized Instrument for Measuring Gene Expression in Space

    NASA Technical Reports Server (NTRS)

    Pohorille, A.; Peyvan, K.; Danley, D.; Ricco, A. J.

    2010-01-01

    To facilitate astrobiological studies on the survival and adaptation of microorganisms and mixed microbial cultures to the space environment, we have been developing a fully automated, miniaturized system for measuring their gene expression on small spacecraft. This low-cost, multi-purpose instrument represents a major scientific and technological advancement in our ability to study the impact of the space environment on biological systems by providing data on cellular metabolism and regulation orders of magnitude richer than what is currently available. The system supports growth of the organisms, lyses them to release the expressed RNA, labels the RNA, reads the expression levels of a large number of genes by microarray analysis of the labeled RNA, and transmits the measurements to Earth. To measure gene expression we use microarray technology developed by CombiMatrix, which is based on electrochemical reactions on arrays of electrodes on a semiconductor substrate. Since the electrical integrity of the microarray remains intact after probe synthesis, the circuitry can be employed to sense nucleic acid binding at each electrode. CombiMatrix arrays can be sectored to allow multiple samples per chip. In addition, a single array can be used for several assays. The array has been integrated into an automated microfluidic cartridge that uses flexible reagent blisters and pinch pumping to move liquid reagents between chambers. The proposed instrument will help to understand adaptation of terrestrial life to conditions beyond the planet of origin, identify deleterious effects of the space environment, develop effective countermeasures against these effects, and test our ability to sustain and grow in space organisms that can be used for life support and in situ resource utilization during long-duration space exploration. The instrument is suitable for small satellite platforms, which provide frequent, low-cost access to space. It can also be used on any other platform in space.

  8. Analyzing Microarray Data.

    PubMed

    Hung, Jui-Hung; Weng, Zhiping

    2017-03-01

    Because there is no widely used software for analyzing RNA-seq data that has a graphical user interface, this protocol provides an example of analyzing microarray data using Babelomics. This analysis entails performing quantile normalization and then detecting differentially expressed genes associated with the transgenesis of a human oncogene c-Myc in mice. Finally, hierarchical clustering is performed on the differentially expressed genes using the Cluster program, and the results are visualized using TreeView.
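    The quantile normalization step described in this protocol can be illustrated with a minimal NumPy sketch (this is not the Babelomics implementation; the toy expression matrix below is hypothetical). Each sample's values are replaced by the mean of all samples' values at the same rank, so every sample ends up with an identical distribution.

```python
import numpy as np

def quantile_normalize(matrix):
    """Quantile-normalize the columns (samples) of a genes-x-samples matrix."""
    # Rank of each value within its own column (ties broken by position)
    ranks = np.argsort(np.argsort(matrix, axis=0), axis=0)
    # Mean expression across samples at each rank
    rank_means = np.sort(matrix, axis=0).mean(axis=1)
    # Substitute each value with the mean for its rank
    return rank_means[ranks]

# Toy 4-gene x 3-sample expression matrix (hypothetical values)
expr = np.array([[5.0, 4.0, 3.0],
                 [2.0, 1.0, 4.0],
                 [3.0, 4.0, 6.0],
                 [4.0, 2.0, 8.0]])
normed = quantile_normalize(expr)
```

    After normalization, every column has the same set of values (the rank means), which removes sample-wide distributional differences before differential-expression testing.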

  9. Membrane-based microarrays

    NASA Astrophysics Data System (ADS)

    Dawson, Elliott P.; Hudson, James; Steward, John; Donnell, Philip A.; Chan, Wing W.; Taylor, Richard F.

    1999-11-01

    Microarrays represent a new approach to the rapid detection and identification of analytes. Studies to date have shown that the immobilization of receptor molecules (such as DNA, oligonucleotides, antibodies, enzymes and binding proteins) onto silicon and polymeric substrates can result in arrays able to detect hundreds of analytes in a single step. The formation of the receptor/analyte complex can, itself, lead to detection, or the complex can be interrogated through the use of fluorescent, chemiluminescent or radioactive probes and ligands.

  10. Evaluating concentration estimation errors in ELISA microarray experiments

    SciTech Connect

    Daly, Don S.; White, Amanda M.; Varnum, Susan M.; Anderson, Kevin K.; Zangar, Richard C.

    2005-01-26

    Enzyme-linked immunosorbent assay (ELISA) is a standard immunoassay to predict a protein concentration in a sample. Deploying ELISA in a microarray format permits simultaneous prediction of the concentrations of numerous proteins in a small sample. These predictions, however, are uncertain due to processing error and biological variability. Evaluating prediction error is critical to interpreting biological significance and improving the ELISA microarray process. Evaluating prediction error must be automated to realize a reliable high-throughput ELISA microarray system. In this paper, we present a statistical method based on propagation of error to evaluate prediction errors in the ELISA microarray process. Although propagation of error is central to this method, it is effective only when comparable data are available. Therefore, we briefly discuss the roles of experimental design, data screening, normalization and statistical diagnostics when evaluating ELISA microarray prediction errors. We use an ELISA microarray investigation of breast cancer biomarkers to illustrate the evaluation of prediction errors. The illustration begins with a description of the design and resulting data, followed by a brief discussion of data screening and normalization. In our illustration, we fit a standard curve to the screened and normalized data, review the modeling diagnostics, and apply propagation of error.
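    The propagation-of-error idea can be sketched for the simplest possible standard curve, a linear calibration y = a + b*x inverted to predict concentration x. This is a first-order (delta-method) sketch only; the paper fits more elaborate standard-curve models, and the fitted parameters and covariances below are hypothetical.

```python
import numpy as np

def predict_concentration(y, a, b):
    """Invert a linear standard curve y = a + b*x to predict concentration x."""
    return (y - a) / b

def prediction_variance(y, a, b, var_y, cov_ab):
    """First-order propagation of error for x = (y - a)/b.

    var_y is the variance of the measured signal; cov_ab is the 2x2
    covariance matrix of the fitted intercept and slope (a, b).
    """
    # Partial derivatives of x with respect to y, a, and b
    dx_dy = 1.0 / b
    dx_da = -1.0 / b
    dx_db = -(y - a) / b**2
    g = np.array([dx_da, dx_db])
    # Measurement-noise term plus curve-parameter uncertainty term
    return dx_dy**2 * var_y + g @ cov_ab @ g

# Hypothetical fitted curve: intercept 0.1, slope 2.0, small uncertainties
a, b = 0.1, 2.0
cov_ab = np.array([[1e-4, 0.0], [0.0, 1e-4]])
x_hat = predict_concentration(1.1, a, b)
var_x = prediction_variance(1.1, a, b, var_y=1e-3, cov_ab=cov_ab)
```

    The resulting variance combines signal noise with standard-curve uncertainty, which is why comparable, well-normalized data across the array are a prerequisite for the method.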

  11. Microarray oligonucleotide probe designer (MOPeD): A web service.

    PubMed

    Patel, Viren C; Mondal, Kajari; Shetty, Amol Carl; Horner, Vanessa L; Bedoyan, Jirair K; Martin, Donna; Caspary, Tamara; Cutler, David J; Zwick, Michael E

    2010-11-01

    Methods of genomic selection that combine high-density oligonucleotide microarrays with next-generation DNA sequencing allow investigators to characterize genomic variation in selected portions of complex eukaryotic genomes. Yet choosing which specific oligonucleotides to use can pose a major technical challenge. To address this issue, we have developed a software package called MOPeD (Microarray Oligonucleotide Probe Designer), which automates the process of designing genomic selection microarrays. This web-based software allows individual investigators to design custom genomic selection microarrays optimized for synthesis with Roche NimbleGen's maskless photolithography. Design parameters include uniqueness of the probe sequences, melting temperature, hairpin formation, and the presence of single nucleotide polymorphisms. We generated probe databases for the human, mouse, and rhesus macaque genomes and conducted experimental validation of MOPeD-designed microarrays in human samples by sequencing the human X chromosome exome, where relevant sequence metrics indicated superior performance relative to a microarray designed by the Roche NimbleGen proprietary algorithm. We also performed validation in the mouse to identify known mutations contained within a 487-kb region from mouse chromosome 16, the mouse chromosome 16 exome (1.7 Mb), and the mouse chromosome 12 exome (3.3 Mb). Our results suggest that the open source MOPeD software package and website (http://moped.genetics.emory.edu/) will be a valuable resource for investigators in their sequence-based studies of complex eukaryotic genomes.

  12. Surface chemistries for antibody microarrays

    SciTech Connect

    Seurynck-Servoss, Shannon L.; Baird, Cheryl L.; Rodland, Karin D.; Zangar, Richard C.

    2007-05-01

    Enzyme-linked immunosorbent assay (ELISA) microarrays promise to be a powerful tool for the detection of disease biomarkers. The original technology for printing ELISA microarray chips and capturing antibodies on slides was derived from the DNA microarray field. However, due to the need to maintain antibody structure and function when immobilized, surface chemistries used for DNA microarrays are not always appropriate for ELISA microarrays. In order to identify better surface chemistries for antibody capture, a number of commercial companies and academic research groups have developed new slide types that could improve antibody function in microarray applications. In this review we compare and contrast the commercially available slide chemistries, as well as highlight some promising recent advances in the field.

  13. Validating Automated Speaking Tests

    ERIC Educational Resources Information Center

    Bernstein, Jared; Van Moere, Alistair; Cheng, Jian

    2010-01-01

    This paper presents evidence that supports the valid use of scores from fully automatic tests of spoken language ability to indicate a person's effectiveness in spoken communication. The paper reviews the constructs, scoring, and the concurrent validity evidence of "facility-in-L2" tests, a family of automated spoken language tests in Spanish,…

  14. The Longhorn Array Database (LAD): An Open-Source, MIAME compliant implementation of the Stanford Microarray Database (SMD)

    PubMed Central

    Killion, Patrick J; Sherlock, Gavin; Iyer, Vishwanath R

    2003-01-01

    Background: The power of microarray analysis can be realized only if data is systematically archived and linked to biological annotations as well as analysis algorithms. Description: The Longhorn Array Database (LAD) is a MIAME compliant microarray database that operates on PostgreSQL and Linux. It is a fully open source version of the Stanford Microarray Database (SMD), one of the largest microarray databases. LAD is available online. Conclusions: Our development of LAD provides a simple, free, open, reliable and proven solution for storage and analysis of two-color microarray data. PMID:12930545

  15. Microarrays in cancer research.

    PubMed

    Grant, Geraldine M; Fortney, Amanda; Gorreta, Francesco; Estep, Michael; Del Giacco, Luca; Van Meter, Amy; Christensen, Alan; Appalla, Lakshmi; Naouar, Chahla; Jamison, Curtis; Al-Timimi, Ali; Donovan, Jean; Cooper, James; Garrett, Carleton; Chandhoke, Vikas

    2004-01-01

    Microarray technology has presented the scientific community with a compelling approach that allows for simultaneous evaluation of all cellular processes. Cancer, being one of the most challenging diseases due to its polygenic nature, presents itself as a perfect candidate for evaluation by this approach. Several recent articles have provided significant insight into the strengths and limitations of microarrays. Nevertheless, there are strong indications that this approach will provide new molecular markers that could be used in diagnosis and prognosis of cancers. To achieve these goals it is essential that there is a seamless integration of clinical and molecular biological data that allows us to elucidate genes and pathways involved in various cancers. To this effect we are currently evaluating gene expression profiles in human brain, ovarian, breast, hematopoietic, lung, colorectal, head and neck, and biliary tract cancers. To address these issues we have a joint team of scientists, doctors and computer scientists from two Virginia universities and a major healthcare provider. The study has been divided into several focus groups that include: Tissue Bank Clinical & Pathology Laboratory Data, Chip Fabrication, QA/QC, Tissue Devitalization, and Database Design and Data Analysis, using multiple microarray platforms. Currently over 300 consenting patients have been enrolled in the study, with the largest number being that of breast cancer patients. Clinical data on each patient is being compiled into a secure and interactive relational database, and integration of these data elements will be accomplished by a common programming interface. This clinical database contains several key parameters on each patient including demographic (risk factors, nutrition, co-morbidity, familial history), histopathology (non genetic predictors), tumor, treatment and follow-up information. Gene expression data derived from the tissue samples will be linked to this database, which

  16. DNA Microarray-Based Diagnostics.

    PubMed

    Marzancola, Mahsa Gharibi; Sedighi, Abootaleb; Li, Paul C H

    2016-01-01

    The DNA microarray technology is currently a useful biomedical tool which has been developed for a variety of diagnostic applications. However, the development pathway has not been smooth and the technology has faced some challenges. The reliability of the microarray data and also the clinical utility of the results in the early days were criticized. These criticisms, together with severe competition from other techniques such as next-generation sequencing (NGS), impacted the growth of microarray-based tests in the molecular diagnostic market. Thanks to the advances in the underlying technologies as well as the tremendous effort offered by the research community and commercial vendors, these challenges have mostly been addressed. Nowadays, the microarray platform has achieved sufficient standardization and method validation as well as efficient probe printing, liquid handling and signal visualization. Integration of various steps of the microarray assay into a harmonized and miniaturized handheld lab-on-a-chip (LOC) device has been a goal for the microarray community. In this respect, notable progress has been achieved in coupling the DNA microarray with the liquid manipulation microsystem as well as the supporting subsystem that will generate the stand-alone LOC device. In this chapter, we discuss the major challenges that microarray technology has faced in its almost two decades of development and also describe the solutions to overcome the challenges. In addition, we review the advancements of the technology, especially the progress toward developing the LOC devices for DNA diagnostic applications.

  17. Systematic review automation technologies.

    PubMed

    Tsafnat, Guy; Glasziou, Paul; Choong, Miew Keen; Dunn, Adam; Galgani, Filippo; Coiera, Enrico

    2014-07-09

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors for the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of the tasks of the systematic review. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed for realizing automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time.

  18. Systematic review automation technologies

    PubMed Central

    2014-01-01

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors for the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of the tasks of the systematic review. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed for realizing automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128

  19. Fully Regressive Melanoma

    PubMed Central

    Ehrsam, Eric; Kallini, Joseph R.; Lebas, Damien; Modiano, Philippe; Cotten, Hervé

    2016-01-01

    Fully regressive melanoma is a phenomenon in which the primary cutaneous melanoma becomes completely replaced by fibrotic components as a result of host immune response. Although 10 to 35 percent of cases of cutaneous melanomas may partially regress, fully regressive melanoma is very rare; only 47 cases have been reported in the literature to date. All of the cases of fully regressive melanoma reported in the literature were diagnosed in conjunction with metastasis on a patient. The authors describe a case of fully regressive melanoma without any metastases at the time of its diagnosis. Characteristic findings on dermoscopy, as well as the absence of melanoma on final biopsy, confirmed the diagnosis. PMID:27672418

  20. Living-cell microarrays.

    PubMed

    Yarmush, Martin L; King, Kevin R

    2009-01-01

    Living cells are remarkably complex. To unravel this complexity, living-cell assays have been developed that allow delivery of experimental stimuli and measurement of the resulting cellular responses. High-throughput adaptations of these assays, known as living-cell microarrays, which are based on microtiter plates, high-density spotting, microfabrication, and microfluidics technologies, are being developed for two general applications: (a) to screen large-scale chemical and genomic libraries and (b) to systematically investigate the local cellular microenvironment. These emerging experimental platforms offer exciting opportunities to rapidly identify genetic determinants of disease, to discover modulators of cellular function, and to probe the complex and dynamic relationships between cells and their local environment.

  1. A rapid automatic processing platform for bead label-assisted microarray analysis: application for genetic hearing-loss mutation detection.

    PubMed

    Zhu, Jiang; Song, Xiumei; Xiang, Guangxin; Feng, Zhengde; Guo, Hongju; Mei, Danyang; Zhang, Guohao; Wang, Dong; Mitchelson, Keith; Xing, Wanli; Cheng, Jing

    2014-04-01

    Molecular diagnostics using microarrays are increasingly being used in clinical diagnosis because of their high throughput, sensitivity, and accuracy. However, standard microarray processing takes several hours and involves manual steps during hybridization, slide clean up, and imaging. Here we describe the development of an integrated platform that automates these individual steps as well as significantly shortens the processing time and improves reproducibility. The platform integrates such key elements as a microfluidic chip, flow control system, temperature control system, imaging system, and automated analysis of clinical results. Bead labeling of microarray signals required a simple imaging system and allowed continuous monitoring of the microarray processing. To demonstrate utility, the automated platform was used to genotype hereditary hearing-loss gene mutations. Compared with conventional microarray processing procedures, the platform increases the efficiency and reproducibility of hybridization, speeding microarray processing through to result analysis. The platform also continuously monitors the microarray signals, which can be used to facilitate optimization of microarray processing conditions. In addition, the modular design of the platform lends itself to development of simultaneous processing of multiple microfluidic chips. We believe the novel features of the platform will benefit its use in clinical settings in which fast, low-complexity molecular genetic testing is required.

  2. Manual segmentation of the fornix, fimbria, and alveus on high-resolution 3T MRI: Application via fully-automated mapping of the human memory circuit white and grey matter in healthy and pathological aging.

    PubMed

    Amaral, Robert S C; Park, Min Tae M; Devenyi, Gabriel A; Lynn, Vivian; Pipitone, Jon; Winterburn, Julie; Chavez, Sofia; Schira, Mark; Lobaugh, Nancy J; Voineskos, Aristotle N; Pruessner, Jens C; Chakravarty, M Mallar

    2016-10-18

    Recently, much attention has been focused on the definition and structure of the hippocampus and its subfields, while the projections from the hippocampus have been relatively understudied. Here, we derive a reliable protocol for manual segmentation of hippocampal white matter regions (alveus, fimbria, and fornix) using high-resolution magnetic resonance images that are complementary to our previous definitions of the hippocampal subfields, both of which are freely available at https://github.com/cobralab/atlases. Our segmentation methods demonstrated high inter- and intra-rater reliability, were validated as inputs in automated segmentation, and were used to analyze the trajectory of these regions in both healthy aging (OASIS), and Alzheimer's disease (AD) and mild cognitive impairment (MCI; using ADNI). We observed significant bilateral decreases in the fornix in healthy aging while the alveus and cornu ammonis (CA) 1 were well preserved (all p's<0.006). MCI and AD demonstrated significant decreases in fimbriae and fornices. Many hippocampal subfields exhibited decreased volume in both MCI and AD, yet no significant differences were found between MCI and AD cohorts themselves. Our results suggest a neuroprotective or compensatory role for the alveus and CA1 in healthy aging and suggest that an improved understanding of the volumetric trajectories of these structures is required.

  3. Short time-series microarray analysis: Methods and challenges

    PubMed Central

    Wang, Xuewei; Wu, Ming; Li, Zheng; Chan, Christina

    2008-01-01

    The detection and analysis of steady-state gene expression has become routine. Time-series microarrays are of growing interest to systems biologists for deciphering the dynamic nature and complex regulation of biosystems. Most temporal microarray data only contain a limited number of time points, giving rise to short-time-series data, which imposes challenges for traditional methods of extracting meaningful information. To obtain useful information from the wealth of short-time series data requires addressing the problems that arise due to limited sampling. Current efforts have shown promise in improving the analysis of short time-series microarray data, although challenges remain. This commentary addresses recent advances in methods for short-time series analysis including simplification-based approaches and the integration of multi-source information. Nevertheless, further studies and development of computational methods are needed to provide practical solutions to fully exploit the potential of this data. PMID:18605994

  4. Microarray simulator as educational tool.

    PubMed

    Ruusuvuori, Pekka; Nykter, Matti; Mäkiraatikka, Eeva; Lehmussola, Antti; Korpelainen, Tomi; Erkkilä, Timo; Yli-Harja, Olli

    2007-01-01

    Like many real-world applications, microarray measurements are impractical for large-scale teaching purposes due to their laborious preparation process and expense. Fortunately, many phases of the array preparation process can be efficiently demonstrated by using a software simulator tool. Here we propose the use of a microarray simulator as an aiding tool in the teaching of computational biology. Three case studies on educational use of the simulator are presented, which demonstrate the effect of gene knock-out, synthetic time series, and the effect of noise sources. We conclude that the simulator, used for teaching the principles of microarray measurement technology, proved to be a useful tool in education.

  5. Chemistry of Natural Glycan Microarray

    PubMed Central

    Song, Xuezheng; Heimburg-Molinaro, Jamie; Cummings, Richard D.; Smith, David F.

    2014-01-01

    Glycan microarrays have become indispensable tools for studying protein-glycan interactions. Along with chemo-enzymatic synthesis, glycans isolated from natural sources have played important roles in array development and will continue to be a major source of glycans. N- and O-glycans from glycoproteins, and glycans from glycosphingolipids, can be released from the corresponding glycoconjugates with relatively mature methods, although isolation of large numbers and quantities of glycans is still very challenging. Glycosylphosphatidylinositol (GPI)-anchors and glycosaminoglycans (GAGs) are less represented on current glycan microarrays. Glycan microarray development has been greatly facilitated by bifunctional fluorescent linkers, which can be applied in a “Shotgun Glycomics” approach to incorporate isolated natural glycans. Glycan presentation on microarrays may affect glycan binding by glycan-binding proteins (GBPs), often through multivalent recognition by the GBP. PMID:24487062

  6. Employing image processing techniques for cancer detection using microarray images.

    PubMed

    Dehghan Khalilabad, Nastaran; Hassanpour, Hamid

    2017-02-01

    Microarray technology is a powerful genomic tool for simultaneously studying and analyzing the behavior of thousands of genes. The analysis of images obtained from this technology plays a critical role in the detection and treatment of diseases. The aim of the current study is to develop an automated system for analyzing data from microarray images in order to detect cancerous cases. The proposed system consists of three main phases, namely image processing, data mining, and detection of the disease. The image processing phase performs operations such as refining image rotation, gridding (locating genes), and extracting raw data from the images; the data mining phase includes normalizing the extracted data and selecting the more effective genes. Finally, cancerous cells are recognized from the extracted data. To evaluate the performance of the proposed system, a microarray database is employed which includes breast cancer, myeloid leukemia, and lymphoma data from the Stanford Microarray Database. The results indicate that the proposed system is able to identify the type of cancer from the data set with an accuracy of 95.45%, 94.11%, and 100%, respectively.
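    The gridding and raw-data-extraction step of such a pipeline can be sketched in a few lines of NumPy (this is an illustrative toy, not the published system: it assumes a perfectly aligned grid, whereas a real pipeline would first correct rotation and subtract local background):

```python
import numpy as np

def extract_spot_intensities(image, n_rows, n_cols):
    """Split a gridded microarray image into equal cells and take each
    cell's mean intensity as the raw value for that spot (gene)."""
    h, w = image.shape
    cell_h, cell_w = h // n_rows, w // n_cols
    values = np.empty((n_rows, n_cols))
    for i in range(n_rows):
        for j in range(n_cols):
            cell = image[i * cell_h:(i + 1) * cell_h,
                         j * cell_w:(j + 1) * cell_w]
            values[i, j] = cell.mean()
    return values

# Synthetic 2x2 grid: four 10x10 blocks of constant intensity
img = np.block([[np.full((10, 10), 10.0), np.full((10, 10), 20.0)],
                [np.full((10, 10), 30.0), np.full((10, 10), 40.0)]])
raw = extract_spot_intensities(img, 2, 2)
```

    The resulting spot-by-spot matrix is what the subsequent data mining phase (normalization and gene selection) would consume.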

  7. Giant Magnetoresistive Sensors for DNA Microarray

    PubMed Central

    Xu, Liang; Yu, Heng; Han, Shu-Jen; Osterfeld, Sebastian; White, Robert L.; Pourmand, Nader; Wang, Shan X.

    2009-01-01

    Giant magnetoresistive (GMR) sensors are developed for a DNA microarray. Compared with conventional fluorescent sensors, GMR sensors are cheaper, more sensitive, can generate fully electronic signals, and can be easily integrated with electronics and microfluidics. The GMR sensor used in this work has a bottom spin valve structure with an MR ratio of 12%. The single-strand target DNA detected has a length of 20 bases. Assays with DNA concentrations down to 10 pM were performed, with a dynamic range of 3 logs. A double modulation technique was used in signal detection to reduce the 1/f noise in the sensor while circumventing electromagnetic interference. The logarithmic relationship between the magnetic signal and the target DNA concentration can be described by the Temkin isotherm. Furthermore, GMR sensors integrated with microfluidics have great potential of improving the sensitivity to 1 pM or below, and the total assay time can be reduced to less than 1 hour. PMID:20824116
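    The Temkin-type logarithmic relationship mentioned above means the signal is linear in the logarithm of concentration, so a calibration can be fit by ordinary least squares on log-transformed concentration. The sketch below uses hypothetical signal values over a 3-log range (10 pM to 10 nM), not data from the paper:

```python
import numpy as np

def fit_temkin(concentrations, signals):
    """Fit the logarithmic (Temkin-type) relation signal = a + b*ln(C)
    by least squares on log-transformed concentration."""
    b, a = np.polyfit(np.log(concentrations), signals, 1)
    return a, b

# Hypothetical magnetic signals over a 3-log concentration range
conc = np.array([1e-11, 1e-10, 1e-9, 1e-8])   # molar (10 pM to 10 nM)
sig = np.array([2.0, 4.3, 6.6, 8.9])          # arbitrary units, log-linear
a, b = fit_temkin(conc, sig)
predicted = a + b * np.log(conc)
```

    Inverting the fitted line (C = exp((signal - a)/b)) then turns a measured magnetic signal into a concentration estimate within the calibrated range.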

  8. Comparing Bacterial DNA Microarray Fingerprints

    SciTech Connect

    Willse, Alan R.; Chandler, Darrell P.; White, Amanda M.; Protic, Miroslava; Daly, Don S.; Wunschel, Sharon C.

    2005-08-15

    Detecting subtle genetic differences between microorganisms is an important problem in molecular epidemiology and microbial forensics. In a typical investigation, gel electrophoresis is used to compare randomly amplified DNA fragments between microbial strains, where the patterns of DNA fragment sizes are proxies for a microbe's genotype. The limited genomic sample captured on a gel is often insufficient to discriminate nearly identical strains. This paper examines the application of microarray technology to DNA fingerprinting as a high-resolution alternative to gel-based methods. The so-called universal microarray, which uses short oligonucleotide probes that do not target specific genes or species, is intended to be applicable to all microorganisms because it does not require prior knowledge of genomic sequence. In principle, closely related strains can be distinguished if the number of probes on the microarray is sufficiently large, i.e., if the genome is sufficiently sampled. In practice, we confront noisy data, imperfectly matched hybridizations, and a high-dimensional inference problem. We describe the statistical problems of microarray fingerprinting, outline similarities with and differences from more conventional microarray applications, and illustrate the statistical fingerprinting problem for 10 closely related strains from three Bacillus species, and 3 strains from non-Bacillus species.

  9. Nanodroplet chemical microarrays and label-free assays.

    PubMed

    Gosalia, Dhaval; Diamond, Scott L

    2010-01-01

    The microarraying of chemicals or biomolecules on a glass surface allows for dense storage and miniaturized screening experiments and can be deployed in chemical-biology research or drug discovery. Microarraying allows the production of scores of replicate slides. Small molecule libraries are typically stored as 10 mM DMSO stock solutions, whereas libraries of biomolecules are typically stored in high percentages of glycerol. Thus, a method is required to print such libraries on microarrays, and then assay them against biological targets. By printing either small molecule libraries or biomolecule libraries in an aqueous solvent containing glycerol, each adherent nanodroplet remains fixed at a position on the microarray by surface tension without the use of wells, without evaporating, and without the need for chemically linking the compound to the surface. Importantly, glycerol is a high boiling point solvent that is fully miscible with DMSO and water and has the additional property of stabilizing various enzymes. The nanoliter volume of the droplet forms the reaction compartment once additional reagents are metered onto the microarray, either by aerosol spray deposition or by addressable acoustic dispensing. Incubation of the nanodroplet microarray in a high humidity environment controls the final water content of the reaction. This platform has been validated for fluorescent HTS assays of proteases and kinases as well as for fluorogenic substrate profiling of proteases. Label-free HTS is also possible by running nanoliter HTS reactions on a MALDI target for mass spectrometry (MS) analysis without the need for desalting of the samples. A method is described for running nanoliter-scale multicomponent homogeneous reactions followed by label-free MALDI MS spectrometry analysis of the reactions.

  10. Automated, Miniaturized Instrument for Measuring Gene Expression in Space

    NASA Astrophysics Data System (ADS)

    Pohorille, Andrew; Danley, David; Payvan, Kia; Ricco, Antonio

    To facilitate astrobiological studies on the survival and adaptation of microorganisms and mixed microbial cultures to the space environment, we have been developing a fully automated, miniaturized system for measuring their gene expression on small spacecraft. This low-cost, multi-purpose instrument represents a major scientific and technological advancement in our ability to study the impact of the space environment on biological systems by providing data on cellular metabolism and regulation orders of magnitude richer than what is currently available. The system supports growth of the organism, lyses it to release the expressed RNA, labels the RNA, reads the expression levels of a large number of genes by microarray analysis of the labeled RNA, and transmits the measurements to Earth. To measure gene expression we use microarray technology developed by CombiMatrix, which is based on electrochemical reactions on arrays of electrodes on a semiconductor substrate. Since the electrical integrity of the microarray remains intact after probe synthesis, the circuitry can be employed to sense nucleic acid binding at each electrode. CombiMatrix arrays can be sectored to allow multiple samples per chip. In addition, a single array can be used for several assays. The array has been integrated into an automated microfluidic cartridge that uses flexible reagent blisters and pinch pumping to move liquid reagents between chambers. The proposed instrument will help to understand adaptation of terrestrial life to conditions beyond the planet of origin, identify deleterious effects of the space environment, develop effective countermeasures against these effects, and test our ability to sustain and grow in space organisms that can be used for life support and in situ resource utilization during long-duration space exploration. The instrument is suitable for small satellite platforms, which provide frequent, low-cost access to space. It can also be used on any other platform in space.

  11. Dynamics of photospheric bright points in G-band derived from two fully automated algorithms. (Slovak Title: Dynamika fotosférických jasných bodov v G-páse odvodená použitím dvoch plne automatických algoritmov)

    NASA Astrophysics Data System (ADS)

    Bodnárová, M.; Rybák, J.; Hanslmeier, A.; Utz, D.

    2010-12-01

    Concentrations of small-scale magnetic field in the solar photosphere can be identified in the G-band of the solar spectrum as bright points. Studying the dynamics of bright points in the G-band (BPGBs) can also help in addressing many issues related to the problem of solar coronal heating. In this work, we used a set of 142 speckled images in the G-band taken by the Dutch Open Telescope (DOT) on 19 October 2005 to compare two fully automated algorithms for identifying BPGBs: an algorithm developed by Utz et al. (2009, 2010), and an algorithm developed following the work of Berger et al. (1995, 1998). We then tracked the motion in time and space of the BPGBs identified by both algorithms and constructed the distributions of their lifetimes, sizes and speeds. The results show that both algorithms give very similar results for the BPGB lifetimes and speeds, but their results differ significantly for the sizes of the identified BPGBs. This difference is due to the fact that, in the case of the Berger et al. identification algorithm, no additional criteria were applied to constrain the allowed BPGB sizes. As a result, in further studies of BPGB dynamics we will prefer the Utz algorithm to identify and track BPGBs.
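    Bright-point identification of the Berger et al. type rests on intensity thresholding followed by connected-component labeling. The published algorithms are considerably more elaborate, but the core step can be sketched in pure Python; `find_bright_points`, the toy frame, and the threshold value are assumptions of this illustration, not the authors' code:

    ```python
    from collections import deque

    def find_bright_points(image, threshold):
        """Label connected regions of pixels brighter than `threshold`
        (4-connectivity); return each region as a list of (row, col) pixels."""
        rows, cols = len(image), len(image[0])
        seen = [[False] * cols for _ in range(rows)]
        regions = []
        for r in range(rows):
            for c in range(cols):
                if image[r][c] > threshold and not seen[r][c]:
                    # Breadth-first flood fill collects one bright point.
                    region, queue = [], deque([(r, c)])
                    seen[r][c] = True
                    while queue:
                        y, x = queue.popleft()
                        region.append((y, x))
                        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                            ny, nx = y + dy, x + dx
                            if (0 <= ny < rows and 0 <= nx < cols
                                    and image[ny][nx] > threshold
                                    and not seen[ny][nx]):
                                seen[ny][nx] = True
                                queue.append((ny, nx))
                    regions.append(region)
        return regions

    # Toy 5x5 "G-band frame" with two bright features.
    frame = [
        [1, 1, 9, 1, 1],
        [1, 9, 9, 1, 1],
        [1, 1, 1, 1, 8],
        [1, 1, 1, 8, 8],
        [1, 1, 1, 1, 1],
    ]
    points = find_bright_points(frame, threshold=5)
    print(len(points))                      # number of bright points found
    print(sorted(len(p) for p in points))   # their sizes in pixels
    ```

    The size discrepancy discussed above corresponds to whether a filter such as `[p for p in points if min_size <= len(p) <= max_size]` is applied afterwards, as in the Utz et al. approach.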

  12. Simple Fully Automated Group Classification on Brain fMRI

    SciTech Connect

    Honorio, J.; Goldstein, R.; Honorio, J.; Samaras, D.; Tomasi, D.; Goldstein, R.Z.

    2010-04-14

    We propose a simple, well-grounded classification technique suited for group classification on brain fMRI data sets that have high dimensionality, a small number of subjects, high noise levels, high subject variability, imperfect registration, and subtle cognitive effects. We propose threshold-split region as a new feature selection method and majority vote as the classification technique. Our method does not require a predefined set of regions of interest. We use averaging across sessions, only one feature per experimental condition, a feature independence assumption, and simple classifiers. The seemingly counter-intuitive approach of using a simple design is supported by signal processing and statistical theory. Experimental results in two block-design data sets that capture brain function under distinct monetary rewards for cocaine-addicted and control subjects show that our method exhibits increased generalization accuracy compared to commonly used feature selection and classification techniques.
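    The combination of per-feature thresholding and majority voting described in the abstract can be illustrated in miniature. This is a hypothetical sketch of the general idea, not the authors' threshold-split-region implementation; all names and the toy data are illustrative:

    ```python
    def train_threshold(xs, ys):
        """Pick the midpoint threshold and polarity that best separate the
        two classes for one feature; return (threshold, polarity)."""
        best = (0.0, 1, -1.0)  # (threshold, polarity, accuracy)
        vals = sorted(set(xs))
        cuts = [(a + b) / 2 for a, b in zip(vals, vals[1:])]
        for t in cuts:
            for pol in (1, -1):
                preds = [1 if pol * (x - t) > 0 else 0 for x in xs]
                acc = sum(p == y for p, y in zip(preds, ys)) / len(ys)
                if acc > best[2]:
                    best = (t, pol, acc)
        return best[0], best[1]

    def majority_vote(sample, stumps):
        """Each per-feature threshold casts one vote; the majority wins."""
        votes = [1 if pol * (x - t) > 0 else 0
                 for x, (t, pol) in zip(sample, stumps)]
        return 1 if sum(votes) * 2 > len(votes) else 0

    # Toy data: 3 features (e.g., one per experimental condition) per subject.
    X = [[0.2, 1.1, 0.9], [0.3, 1.0, 1.2], [0.8, 0.2, 0.1], [0.9, 0.3, 0.2]]
    y = [0, 0, 1, 1]
    stumps = [train_threshold([row[j] for row in X], y) for j in range(3)]
    labels = [majority_vote(row, stumps) for row in X]
    print(labels)
    ```

    In practice the thresholds would be trained on held-out subjects, and generalization accuracy estimated by cross-validation rather than on the training data as here.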

  13. A Fully Automated Stage for Optical Waveguide Measurements

    DTIC Science & Technology

    1993-09-01

    [OCR residue of the report documentation page and table of contents. Recoverable details: report date SEP 93; final report covering 09/09/92 to 09/09/93; sections include 2.1 The Out-of-Plane Technique, 2.2 The Three-Prism Technique, and 3.0 System Requirements; figures include an out-of-plane scattering technique schematic and a three-prism measurement technique schematic.]

  14. Wire-Guide Manipulator For Automated Welding

    NASA Technical Reports Server (NTRS)

    Morris, Tim; White, Kevin; Gordon, Steve; Emerich, Dave; Richardson, Dave; Faulkner, Mike; Stafford, Dave; Mccutcheon, Kim; Neal, Ken; Milly, Pete

    1994-01-01

    Compact motor drive positions guide for welding filler wire. Drive part of automated wire feeder in partly or fully automated welding system. Drive unit contains three parallel subunits. Rotations of lead screws in three subunits coordinated to obtain desired motions in three degrees of freedom. Suitable for both variable-polarity plasma arc welding and gas/tungsten arc welding.

  15. Implementation of GenePattern within the Stanford Microarray Database.

    PubMed

    Hubble, Jeremy; Demeter, Janos; Jin, Heng; Mao, Maria; Nitzberg, Michael; Reddy, T B K; Wymore, Farrell; Zachariah, Zachariah K; Sherlock, Gavin; Ball, Catherine A

    2009-01-01

    Hundreds of researchers across the world use the Stanford Microarray Database (SMD; http://smd.stanford.edu/) to store, annotate, view, analyze and share microarray data. In addition to providing registered users at Stanford access to their own data, SMD also provides access to public data, and tools with which to analyze those data, to any public user anywhere in the world. Previously, the addition of new microarray data analysis tools to SMD has been limited by available engineering resources, and in addition, the existing suite of tools did not provide a simple way to design, execute and share analysis pipelines, or to document such pipelines for the purposes of publication. To address this, we have incorporated the GenePattern software package directly into SMD, providing access to many new analysis tools, as well as a plug-in architecture that allows users to directly integrate and share additional tools through SMD. In this article, we describe our implementation of the GenePattern microarray analysis software package into the SMD code base. This extension is available with the SMD source code that is fully and freely available to others under an Open Source license, enabling other groups to create a local installation of SMD with an enriched data analysis capability.

  16. Viral Discovery and Sequence Recovery Using DNA Microarrays

    PubMed Central

    Wang, David; Urisman, Anatoly; Liu, Yu-Tsueng; Springer, Michael; Ksiazek, Thomas G; Erdman, Dean D; Mardis, Elaine R; Hickenbotham, Matthew; Magrini, Vincent; Eldred, James; Latreille, J. Phillipe; Wilson, Richard K; Ganem, Don

    2003-01-01

    Because of the constant threat posed by emerging infectious diseases and the limitations of existing approaches used to identify new pathogens, there is a great demand for new technological methods for viral discovery. We describe herein a DNA microarray-based platform for novel virus identification and characterization. Central to this approach was a DNA microarray designed to detect a wide range of known viruses as well as novel members of existing viral families; this microarray contained the most highly conserved 70mer sequences from every fully sequenced reference viral genome in GenBank. During an outbreak of severe acute respiratory syndrome (SARS) in March 2003, hybridization to this microarray revealed the presence of a previously uncharacterized coronavirus in a viral isolate cultivated from a SARS patient. To further characterize this new virus, approximately 1 kb of the unknown virus genome was cloned by physically recovering viral sequences hybridized to individual array elements. Sequencing of these fragments confirmed that the virus was indeed a new member of the coronavirus family. This combination of array hybridization followed by direct viral sequence recovery should prove to be a general strategy for the rapid identification and characterization of novel viruses and emerging infectious disease. PMID:14624234

  17. Using Kepler for Tool Integration in Microarray Analysis Workflows.

    PubMed

    Gan, Zhuohui; Stowe, Jennifer C; Altintas, Ilkay; McCulloch, Andrew D; Zambon, Alexander C

    Increasing numbers of genomic technologies are producing massive amounts of genomic data, all of which require complex analysis. More and more bioinformatics analysis tools are being developed by scientists to simplify these analyses. However, different pipelines have been developed using different software environments, which makes integration of these diverse bioinformatics tools difficult. Kepler provides an open-source environment in which to integrate such disparate packages. Using Kepler, we integrated several external tools, including Bioconductor packages, AltAnalyze (a Python-based open-source tool), and an R-based comparison tool, to build an automated workflow that meta-analyzes both online and local microarray data. The automated workflow connects the integrated tools seamlessly, delivers data flow between the tools smoothly, and hence improves the efficiency and accuracy of complex data analyses. Our workflow exemplifies the usage of Kepler as a scientific workflow platform for bioinformatics pipelines.

  18. Microfluidic microarray systems and methods thereof

    SciTech Connect

    West, Jay A. A.; Hukari, Kyle W.; Hux, Gary A.

    2009-04-28

    Disclosed are systems that include a manifold in fluid communication with a microfluidic chip having a microarray, an illuminator, and a detector in optical communication with the microarray. Methods for using these systems for biological detection are also disclosed.

  19. The Microarray Revolution: Perspectives from Educators

    ERIC Educational Resources Information Center

    Brewster, Jay L.; Beason, K. Beth; Eckdahl, Todd T.; Evans, Irene M.

    2004-01-01

    In recent years, microarray analysis has become a key experimental tool, enabling the analysis of genome-wide patterns of gene expression. This review approaches the microarray revolution with a focus upon four topics: 1) the early development of this technology and its application to cancer diagnostics; 2) a primer of microarray research,…

  20. 21 CFR 864.5620 - Automated hemoglobin system.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Automated hemoglobin system. 864.5620 Section 864....5620 Automated hemoglobin system. (a) Identification. An automated hemoglobin system is a fully... hemoglobin content of human blood. (b) Classification. Class II (performance standards)....

  1. 21 CFR 864.5620 - Automated hemoglobin system.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated hemoglobin system. 864.5620 Section 864....5620 Automated hemoglobin system. (a) Identification. An automated hemoglobin system is a fully... hemoglobin content of human blood. (b) Classification. Class II (performance standards)....

  2. 21 CFR 864.5620 - Automated hemoglobin system.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Automated hemoglobin system. 864.5620 Section 864....5620 Automated hemoglobin system. (a) Identification. An automated hemoglobin system is a fully... hemoglobin content of human blood. (b) Classification. Class II (performance standards)....

  3. 21 CFR 864.5620 - Automated hemoglobin system.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Automated hemoglobin system. 864.5620 Section 864....5620 Automated hemoglobin system. (a) Identification. An automated hemoglobin system is a fully... hemoglobin content of human blood. (b) Classification. Class II (performance standards)....

  4. 21 CFR 864.5620 - Automated hemoglobin system.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Automated hemoglobin system. 864.5620 Section 864....5620 Automated hemoglobin system. (a) Identification. An automated hemoglobin system is a fully... hemoglobin content of human blood. (b) Classification. Class II (performance standards)....

  5. Investigating Factors Affecting the Uptake of Automated Assessment Technology

    ERIC Educational Resources Information Center

    Dreher, Carl; Reiners, Torsten; Dreher, Heinz

    2011-01-01

    Automated assessment is an emerging innovation in educational praxis, however its pedagogical potential is not fully utilised in Australia, particularly regarding automated essay grading. The rationale for this research is that the usage of automated assessment currently lags behind the capacity that the technology provides, thus restricting the…

  6. Automation pilot

    NASA Technical Reports Server (NTRS)

    1983-01-01

    An important concept of the Action Information Management System (AIMS) approach is to evaluate office automation technology in the context of hands-on use by technical program managers in the conduct of their work, and to assess the human acceptance difficulties which may accompany the transition to a significantly changing work environment. The improved productivity and communications which result from the application of office automation technology are already well established for general office environments, but benefits unique to NASA are anticipated and these will be explored in detail.

  7. Sex determination of bovine preimplantation embryos by oligonucleotide microarray.

    PubMed

    Yang, Hua; Zhong, Fagang; Yang, Yonglin; Wang, Xinhua; Liu, Shouren; Zhu, Bin

    2013-06-01

    The aim was to set up a rapid and accurate microarray assay using a sandwich mode for sex determination of bovine preimplantation embryos. Twelve sequence-specific oligonucleotide capture probes used to discriminate 12 samples were spotted onto aldehyde-modified glass slides by an arrayer. The 2 recognition probes used to identify coding regions of the sex-determining region of the Y chromosome gene (SRY) and the β-casein (CSN2) reference gene were coupled with biotin. The assay was optimized using genomic DNA extracted from blood samples of individuals of known sex. Polymerase chain reaction (PCR) was used to amplify fragments in the HMG box region of the SRY gene and the CSN2 gene with sequence-specific primers. The sex of samples was identified by detecting both the SRY and CSN2 genes simultaneously in 2 reaction cells of the microarrays, with males showing both SRY and CSN2 signals and females only CSN2. The sex of 20 bovine preimplantation embryos was determined by oligonucleotide microarray. The protocol was run with a blind test that showed 100% (82/82) specificity and accuracy in sexing of leukocytes. The bovine embryos were transferred into 20 bovine recipients, with a pregnancy rate of 40% (8/20). Three calves were born at term, and 5 fetuses were miscarried. Their sexes fully accorded with the embryonic sex predicted by the oligonucleotide microarray. This suggests that the oligonucleotide microarray method of SRY gene analysis can be used for early sex prediction of bovine embryos in breeding programs.
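    The two-channel call logic described above (male if both SRY and CSN2 signals are present, female if only CSN2) can be written down directly. `call_sex` is an illustrative name, not part of the study's software; note that a missing reference signal invalidates the assay rather than implying a sex:

    ```python
    def call_sex(sry_detected, csn2_detected):
        """Interpret a two-gene microarray result: CSN2 is the reference
        (assay-validity) channel, SRY marks the Y chromosome."""
        if not csn2_detected:
            return "assay failed"  # no reference signal: uninterpretable
        return "male" if sry_detected else "female"

    print(call_sex(True, True))    # both signals present
    print(call_sex(False, True))   # reference only
    print(call_sex(True, False))   # reference missing
    ```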

  8. The Current Status of DNA Microarrays

    NASA Astrophysics Data System (ADS)

    Shi, Leming; Perkins, Roger G.; Tong, Weida

    DNA microarray technology that allows simultaneous assay of thousands of genes in a single experiment has steadily advanced to become a mainstream method used in research, and has reached a stage that envisions its use in medical applications and personalized medicine. Many different strategies have been developed for manufacturing DNA microarrays. In this chapter, we discuss the manufacturing characteristics of seven microarray platforms that were used in a recently completed large study by the MicroArray Quality Control (MAQC) consortium, which evaluated the concordance of results across these platforms. The platforms can be grouped into three categories: (1) in situ synthesis of oligonucleotide probes on microarrays (Affymetrix GeneChip® arrays based on photolithography synthesis and Agilent's arrays based on inkjet synthesis); (2) spotting of presynthesized oligonucleotide probes on microarrays (GE Healthcare's CodeLink system, Applied Biosystems' Genome Survey Microarrays, and the custom microarrays printed with Operon's oligonucleotide set); and (3) deposition of presynthesized oligonucleotide probes on bead-based microarrays (Illumina's BeadChip microarrays). We conclude this chapter with our views on the challenges and opportunities toward acceptance of DNA microarray data in clinical and regulatory settings.

  10. Microarray analysis in pulmonary hypertension

    PubMed Central

    Hoffmann, Julia; Wilhelm, Jochen; Olschewski, Andrea

    2016-01-01

    Microarrays are a powerful and effective tool that allows the detection of genome-wide gene expression differences between controls and disease conditions. They have been broadly applied to investigate the pathobiology of diverse forms of pulmonary hypertension, namely group 1, including patients with idiopathic pulmonary arterial hypertension, and group 3, including pulmonary hypertension associated with chronic lung diseases such as chronic obstructive pulmonary disease and idiopathic pulmonary fibrosis. To date, numerous human microarray studies have been conducted to analyse global (lung homogenate samples), compartment-specific (laser capture microdissection), cell type-specific (isolated primary cells) and circulating cell (peripheral blood) expression profiles. Combined, they provide important information on development, progression and the end-stage disease. In the future, system biology approaches, expression of noncoding RNAs that regulate coding RNAs, and direct comparison between animal models and human disease might be of importance. PMID:27076594

  11. DNA microarray technology in dermatology.

    PubMed

    Kunz, Manfred

    2008-03-01

    In recent years, DNA microarray technology has been used for the analysis of gene expression patterns in a variety of skin diseases, including malignant melanoma, psoriasis, lupus erythematosus, and systemic sclerosis. Many of the studies described herein confirmed earlier results on individual genes or functional groups of genes. However, a plethora of new candidate genes, gene patterns, and regulatory pathways have been identified. Major progresses were reached by the identification of a prognostic gene pattern in malignant melanoma, an immune signaling cluster in psoriasis, and a so-called interferon signature in systemic lupus erythematosus. In future, interference with genes or regulatory pathways with the use of different RNA interference technologies or targeted therapy may not only underscore the functional significance of microarray data but also may open interesting therapeutic perspectives. Large-scale gene expression analyses may also help to design more individualized treatment approaches of cutaneous diseases.

  12. Microarray analysis in pulmonary hypertension.

    PubMed

    Hoffmann, Julia; Wilhelm, Jochen; Olschewski, Andrea; Kwapiszewska, Grazyna

    2016-07-01

    Microarrays are a powerful and effective tool that allows the detection of genome-wide gene expression differences between controls and disease conditions. They have been broadly applied to investigate the pathobiology of diverse forms of pulmonary hypertension, namely group 1, including patients with idiopathic pulmonary arterial hypertension, and group 3, including pulmonary hypertension associated with chronic lung diseases such as chronic obstructive pulmonary disease and idiopathic pulmonary fibrosis. To date, numerous human microarray studies have been conducted to analyse global (lung homogenate samples), compartment-specific (laser capture microdissection), cell type-specific (isolated primary cells) and circulating cell (peripheral blood) expression profiles. Combined, they provide important information on development, progression and the end-stage disease. In the future, system biology approaches, expression of noncoding RNAs that regulate coding RNAs, and direct comparison between animal models and human disease might be of importance.

  13. Microarrays, antiobesity and the liver

    PubMed Central

    Castro-Chávez, Fernando

    2013-01-01

    In this review, microarray technology and especially oligonucleotide arrays are exemplified with a practical example taken from the perilipin−/− mice, using the dChip software, which is available for non-commercial purposes. It was found that the liver of perilipin−/− mice was healthy and normal, even under a high-fat diet, when compared with the results published for the scd1−/− mice, which under high-fat diets had a darker liver, suggestive of hepatic steatosis. Scd1 is required for the biosynthesis of monounsaturated fatty acids and plays a key role in the hepatic synthesis of triglycerides and of very-low-density lipoproteins. Both models of obesity resistance share many similar phenotypic antiobesity features; however, the perilipin−/− mice had a significant downregulation of the stearoyl-CoA desaturases scd1 and scd2 in white adipose tissue, but a normal level of both genes in the liver, even under a high-fat diet. Here, different microarray methodologies are discussed, along with some of the most recent discoveries and perspectives regarding the use of microarrays, with an emphasis on obesity gene expression, and a personal remark on my findings of increased expression of hemoglobin transcripts and other hemoglobin-related (hemo-like) genes, and of leukocyte-like (leuko-like) genes, in the white adipose tissue of the perilipin−/− mice. In conclusion, microarrays have much to offer in comparative studies such as those in antiobesity research, and they are methodologies well suited to new molecular discoveries. PMID:15657555

  14. Automated High Throughput Drug Target Crystallography

    SciTech Connect

    Rupp, B

    2005-02-18

    The molecular structures of drug target proteins and receptors form the basis for 'rational' or structure-guided drug design. The majority of target structures are experimentally determined by protein X-ray crystallography, which has evolved into a highly automated, high-throughput drug discovery and screening tool. Process automation has accelerated tasks from parallel protein expression, fully automated crystallization, and rapid data collection to highly efficient structure determination methods. A thoroughly designed automation technology platform supported by a powerful informatics infrastructure forms the basis for optimal workflow implementation and for the data mining and analysis tools needed to generate new leads from experimental protein drug target structures.

  15. Office automation.

    PubMed

    Arenson, R L

    1986-03-01

    By now, the term "office automation" should have more meaning for those readers who are not intimately familiar with the subject. Not all of the preceding material pertains to every department or practice, but certainly, word processing and simple telephone management are key items. The size and complexity of the organization will dictate the usefulness of electronic mail and calendar management, and the individual radiologist's personal needs and habits will determine the usefulness of the home computer. Perhaps the most important ingredient for success in the office automation arena relates to the ability to integrate information from various systems in a simple and flexible manner. Unfortunately, this is perhaps the one area that most office automation systems have ignored or handled poorly. In the personal computer world, there has been much emphasis recently on integration of packages such as spreadsheet, database management, word processing, graphics, time management, and communications. This same philosophy of integration has been applied to a few office automation systems, but these are generally vendor-specific and do not allow for a mixture of foreign subsystems. During the next few years, it is likely that a few vendors will emerge as dominant in this integrated office automation field and will stress simplicity and flexibility as major components.

  16. Habitat automation

    NASA Technical Reports Server (NTRS)

    Swab, Rodney E.

    1992-01-01

    A habitat, on either the surface of the Moon or Mars, will be designed and built with the proven technologies of that day. These technologies will be mature and readily available to the habitat designer. We believe an acceleration of the normal pace of automation would allow a habitat to be safer and more easily maintained than would be the case otherwise. This document examines the operation of a habitat and describes elements of that operation which may benefit from an increased use of automation. Research topics within the automation realm are then defined and discussed with respect to the role they can have in the design of the habitat. Problems associated with the integration of advanced technologies into real-world projects at NASA are also addressed.

  17. Testing fully depleted CCD

    NASA Astrophysics Data System (ADS)

    Casas, Ricard; Cardiel-Sas, Laia; Castander, Francisco J.; Jiménez, Jorge; de Vicente, Juan

    2014-08-01

    The focal plane of the PAU camera is composed of eighteen 2K x 4K CCDs. These devices, plus four spares, were provided by the Japanese company Hamamatsu Photonics K.K. with type no. S10892-04(X). These detectors are 200 μm thick, fully depleted and back illuminated, with an n-type silicon base. They have been built with a specific coating to be sensitive in the range from 300 to 1,100 nm. Their square pixel size is 15 μm. The read-out system consists of a Monsoon controller (NOAO) and the panVIEW software package. The default CCD read-out speed is 133 kpixel/s; this is the value used in the calibration process. Before installing these devices in the camera focal plane, they were characterized using the facilities of the ICE (CSIC-IEEC) and IFAE on the UAB Campus in Bellaterra (Barcelona, Catalonia, Spain). The basic tests performed for all CCDs were to obtain the photon transfer curve (PTC), the charge transfer efficiency (CTE) using X-rays and the EPER method, linearity, read-out noise, dark current, persistence, cosmetics and quantum efficiency. The X-ray images were also used for the analysis of the charge diffusion for different substrate voltages (VSUB). Regarding the cosmetics, and in addition to white and dark pixels, some patterns were also found. The first one, which appears in all devices, is the presence of half circles at the external edges. The origin of this pattern can be related to the assembly process. A second one appears in the dark images, and shows bright arcs connecting corners along the vertical axis of the CCD. This feature appears in all CCDs in exactly the same position, so our guess is that the pattern is due to electrical fields. Finally, and in just two devices, there is a spot with wavelength dependence whose origin could be the result of a defective coating process.

  18. Surface characterization of carbohydrate microarrays.

    PubMed

    Scurr, David J; Horlacher, Tim; Oberli, Matthias A; Werz, Daniel B; Kroeck, Lenz; Bufali, Simone; Seeberger, Peter H; Shard, Alexander G; Alexander, Morgan R

    2010-11-16

    Carbohydrate microarrays are essential tools to determine the biological function of glycans. Here, we analyze a glycan array by time-of-flight secondary ion mass spectrometry (ToF-SIMS) to gain a better understanding of the physicochemical properties of the individual spots and to improve carbohydrate microarray quality. The carbohydrate microarray is prepared by piezo printing of thiol-terminated sugars onto a maleimide functionalized glass slide. The hyperspectral ToF-SIMS imaging data are analyzed by multivariate curve resolution (MCR) to discern secondary ions from regions of the array containing saccharide, linker, salts from the printing buffer, and the background linker chemistry. Analysis of secondary ions from the linker common to all of the sugar molecules employed reveals a relatively uniform distribution of the sugars within the spots formed from solutions with saccharide concentration of 0.4 mM and less, whereas a doughnut shape is often formed at higher-concentration solutions. A detailed analysis of individual spots reveals that in the larger spots the phosphate buffered saline (PBS) salts are heterogeneously distributed, apparently resulting in saccharide concentrated at the rim of the spots. A model of spot formation from the evaporating sessile drop is proposed to explain these observations. Saccharide spot diameters increase with saccharide concentration due to a reduction in surface tension of the saccharide solution compared to PBS. The multivariate analytical partial least squares (PLS) technique identifies ions from the sugars that in the complex ToF-SIMS spectra correlate with the binding of galectin proteins.

  19. Array2BIO: A Comprehensive Suite of Utilities for the Analysis of Microarray Data

    SciTech Connect

    Loots, G G; Chain, P G; Mabery, S; Rasley, A; Garcia, E; Ovcharenko, I

    2006-02-13

    We have developed an integrative and automated toolkit for the analysis of Affymetrix microarray data, named Array2BIO. It identifies groups of coexpressed genes using two complementary approaches--comparative analysis of signal versus control microarrays and clustering analysis of gene expression across different conditions. The identified genes are assigned to functional categories based on the Gene Ontology classification, and corresponding KEGG protein interaction pathways are detected. Array2BIO reliably handles low-expressor genes and provides a set of statistical methods to quantify the odds of observations, including the Benjamini-Hochberg and Bonferroni multiple testing corrections. An automated interface with the ECR Browser provides evolutionary conservation analysis of identified gene loci, while the interconnection with Creme allows high-throughput analysis of human promoter regions and prediction of the gene regulatory elements that underlie the observed expression patterns. Array2BIO is publicly available at http://array2bio.dcode.org.
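    Of the multiple-testing corrections mentioned, the Benjamini-Hochberg step-up procedure is the standard way to control the false discovery rate when testing thousands of genes at once. A minimal sketch of the procedure (not Array2BIO's implementation) is:

    ```python
    def benjamini_hochberg(pvalues, alpha=0.05):
        """Benjamini-Hochberg step-up: find the largest k such that
        p_(k) <= (k/m)*alpha, and reject the k smallest p-values.
        Returns a boolean rejection flag per input hypothesis."""
        m = len(pvalues)
        order = sorted(range(m), key=lambda i: pvalues[i])
        k = 0
        for rank, idx in enumerate(order, start=1):
            if pvalues[idx] <= rank / m * alpha:
                k = rank
        reject = [False] * m
        for rank, idx in enumerate(order, start=1):
            if rank <= k:
                reject[idx] = True
        return reject

    pvals = [0.001, 0.008, 0.039, 0.041, 0.20, 0.74]
    print(benjamini_hochberg(pvals, alpha=0.05))
    # -> [True, True, False, False, False, False]
    ```

    Note that BH is less conservative than Bonferroni (which would reject only p-values below alpha/m), which is why both options are commonly offered side by side.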

  20. Diagnostic challenges for multiplexed protein microarrays.

    PubMed

    Master, Stephen R; Bierl, Charlene; Kricka, Larry J

    2006-11-01

    Multiplexed protein analysis using planar microarrays or microbeads is growing in popularity for simultaneous assays of antibodies, cytokines, allergens, drugs and hormones. However, this new assay format presents several new operational issues for the clinical laboratory, such as the quality control of protein-microarray-based assays, the release of unrequested test data and the use of diagnostic algorithms to transform microarray data into diagnostic results.

  1. THE ABRF MARG MICROARRAY SURVEY 2005: TAKING THE PULSE ON THE MICROARRAY FIELD

    EPA Science Inventory

    Over the past several years microarray technology has evolved into a critical component of any discovery based program. Since 1999, the Association of Biomolecular Resource Facilities (ABRF) Microarray Research Group (MARG) has conducted biennial surveys designed to generate a pr...

  2. Living Cell Microarrays: An Overview of Concepts

    PubMed Central

    Jonczyk, Rebecca; Kurth, Tracy; Lavrentieva, Antonina; Walter, Johanna-Gabriela; Scheper, Thomas; Stahl, Frank

    2016-01-01

    Living cell microarrays are a highly efficient cellular screening system. Due to the low number of cells required per spot, cell microarrays enable the use of primary and stem cells and provide resolution close to the single-cell level. Apart from a variety of conventional static designs, microfluidic microarray systems have also been established. An alternative format is a microarray consisting of three-dimensional cell constructs ranging from cell spheroids to cells encapsulated in hydrogel. These systems provide an in vivo-like microenvironment and are preferably used for the investigation of cellular physiology, cytotoxicity, and drug screening. Thus, many different high-tech microarray platforms are currently available. Disadvantages of many systems include their high cost, the requirement of specialized equipment for their manufacture, and the poor comparability of results between different platforms. In this article, we provide an overview of static, microfluidic, and 3D cell microarrays. In addition, we describe a simple method for the printing of living cell microarrays on modified microscope glass slides using standard DNA microarray equipment available in most laboratories. Applications in research and diagnostics are discussed, e.g., the selective and sensitive detection of biomarkers. Finally, we highlight current limitations and the future prospects of living cell microarrays. PMID:27600077

  3. Clustering Short Time-Series Microarray

    NASA Astrophysics Data System (ADS)

    Ping, Loh Wei; Hasan, Yahya Abu

    2008-01-01

    Most microarray analyses are carried out on static gene expression data. However, the dynamical study of microarrays has lately gained more attention. Most research on time-series microarrays emphasizes the bioscience and medical aspects; few studies approach the problem from the numerical aspect. This study attempts to analyze short time-series microarray data mathematically using the STEM clustering tool, which formally preprocesses the data before clustering. We next introduce the Circular Mould Distance (CMD) algorithm, which combines both preprocessing and clustering analysis. The two methods are subsequently compared in terms of efficiency.
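    The pipeline described here is preprocessing followed by clustering. A hedged sketch in that spirit (not the STEM or CMD algorithms themselves): transform each short series to log2 ratios against its first time point, then assign it to the best-correlated of a set of model profiles. The profiles and expression values below are invented for illustration:

```python
import math

def to_log_ratios(series):
    """Express a short expression time series relative to its first time point."""
    base = series[0]
    return [math.log2(v / base) for v in series]

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def assign_to_profile(gene, profiles):
    """Assign a transformed gene series to the best-correlated model profile."""
    return max(range(len(profiles)), key=lambda i: pearson(gene, profiles[i]))

# Hypothetical model profiles over 4 time points: late rise vs. steady decline.
profiles = [[0, 0, 1, 2], [0, -1, -2, -3]]
gene = to_log_ratios([100, 105, 210, 400])  # roughly doubling at later points
print(assign_to_profile(gene, profiles))    # prints 0 (the rising profile)
```

Clustering short series against a fixed library of model profiles, rather than clustering the series directly, is one common way to cope with the very small number of time points.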

  4. Automating Finance

    ERIC Educational Resources Information Center

    Moore, John

    2007-01-01

    In past years, higher education's financial management side has been riddled with manual processes and aging mainframe applications. This article discusses schools that have taken advantage of an array of technologies that automate billing, payment processing, and refund processing in the case of overpayment. The investments are well worth it:…

  5. MAPPI-DAT: data management and analysis for protein-protein interaction data from the high-throughput MAPPIT cell microarray platform.

    PubMed

    Gupta, Surya; De Puysseleyr, Veronic; Van der Heyden, José; Maddelein, Davy; Lemmens, Irma; Lievens, Sam; Degroeve, Sven; Tavernier, Jan; Martens, Lennart

    2017-01-18

    Protein-protein interaction (PPI) studies have dramatically expanded our knowledge about cellular behaviour and development in different conditions. A multitude of high-throughput PPI techniques have been developed to achieve proteome-scale coverage for PPI studies, including the microarray based Mammalian Protein-Protein Interaction Trap (MAPPIT) system. Because such high-throughput techniques typically report thousands of interactions, managing and analysing the large amounts of acquired data is a challenge. We have therefore built the MAPPIT cell microArray Protein Protein Interaction- Data management & Analysis Tool (MAPPI-DAT) as an automated data management and analysis tool for MAPPIT cell microarray experiments. MAPPI-DAT stores the experimental data and metadata in a systematic and structured way, automates data analysis and interpretation, and enables the meta-analysis of MAPPIT cell microarray data across all stored experiments.

  6. THE ABRF-MARG MICROARRAY SURVEY 2004: TAKING THE PULSE OF THE MICROARRAY FIELD

    EPA Science Inventory

    Over the past several years, the field of microarrays has grown and evolved drastically. In its continued efforts to track this evolution, the ABRF-MARG has once again conducted a survey of international microarray facilities and individual microarray users. The goal of the surve...

  7. 2008 Microarray Research Group (MARG Survey): Sensing the State of Microarray Technology

    EPA Science Inventory

    Over the past several years, the field of microarrays has grown and evolved drastically. In its continued efforts to track this evolution and transformation, the ABRF-MARG has once again conducted a survey of international microarray facilities and individual microarray users. Th...

  8. Systematic review of accuracy of prenatal diagnosis for abnormal chromosome diseases by microarray technology.

    PubMed

    Xu, H B; Yang, H; Liu, G; Chen, H

    2014-10-31

    The accuracy of prenatal diagnosis for abnormal chromosome diseases by chromosome microarray technology was compared with that of karyotyping. A literature search was carried out in the MEDLINE database with the keywords "chromosome" and "karyotype" and "genetic testing" and "prenatal diagnosis" and "oligonucleotide array sequence". The studies obtained were filtered by using the QUADAS tool, and studies conforming to the quality standard were fully analyzed. One paper conformed to the QUADAS standards, covering 4406 gravidas with indications for prenatal diagnosis, including elderly parturient women, abnormal structures on type-B ultrasound, and other abnormalities. Microarray technology yielded successful diagnoses in 4340 cases (98.8%), and there was no need for tissue culture in 87.9% of the samples. All aneuploids and non-parallel translocations in 4282 cases of non-chimera identified by karyotyping could be detected using microarray analysis technology, whereas parallel translocations and fetal triploids could not be detected by microarray analysis technology. Among the samples with normal karyotyping results, microarray technology detected chromosomal deficiencies or chromosome duplications in 6% of those with abnormal type-B ultrasound findings, and the same abnormal chromosomes were detected in 1.7% of elderly parturient women and samples with positive serology screening results. In the prenatal diagnosis test, compared with karyotyping, microarray technology could identify the extra cell genetic information with clinical significance, aneuploids, and non-parallel translocations; however, its disadvantage is that it could not identify parallel translocations and triploids.

  9. Microarrays Made Simple: "DNA Chips" Paper Activity

    ERIC Educational Resources Information Center

    Barnard, Betsy

    2006-01-01

    DNA microarray technology is revolutionizing biological science. DNA microarrays (also called DNA chips) allow simultaneous screening of many genes for changes in expression between different cells. Now researchers can obtain information about genes in days or weeks that used to take months or years. The paper activity described in this article…

  10. Automated Methods for Multiplexed Pathogen Detection

    SciTech Connect

    Straub, Tim M.; Dockendorff, Brian P.; Quinonez-Diaz, Maria D.; Valdez, Catherine O.; Shutthanandan, Janani I.; Tarasevich, Barbara J.; Grate, Jay W.; Bruckner-Lea, Cindy J.

    2005-09-01

    Detection of pathogenic microorganisms in environmental samples is a difficult process. Concentration of the organisms of interest also co-concentrates inhibitors of many end-point detection methods, notably, nucleic acid methods. In addition, sensitive, highly multiplexed pathogen detection continues to be problematic. The primary function of the BEADS (Biodetection Enabling Analyte Delivery System) platform is the automated concentration and purification of target analytes from interfering substances, often present in these samples, via a renewable surface column. In one version of BEADS, automated immunomagnetic separation (IMS) is used to separate cells from their samples. Captured cells are transferred to a flow-through thermal cycler where PCR, using labeled primers, is performed. PCR products are then detected by hybridization to a DNA suspension array. In another version of BEADS, cell lysis is performed, and community RNA is purified and directly labeled. Multiplexed detection is accomplished by direct hybridization of the RNA to a planar microarray. The integrated IMS/PCR version of BEADS can successfully purify and amplify 10 E. coli O157:H7 cells from river water samples. Multiplexed PCR assays for the simultaneous detection of E. coli O157:H7, Salmonella, and Shigella on bead suspension arrays were demonstrated for the detection of as few as 100 cells of each organism. The RNA version of BEADS is also showing promising results. Automation yields highly purified RNA, suitable for multiplexed detection on microarrays, with microarray detection specificity equivalent to PCR. Both versions of the BEADS platform show great promise for automated pathogen detection from environmental samples. Highly multiplexed pathogen detection using PCR continues to be problematic, but may be required for trace detection in large volume samples. The RNA approach solves the issues of highly multiplexed PCR and provides ''live vs. dead'' capabilities. However

  11. Automated methods for multiplexed pathogen detection.

    PubMed

    Straub, Timothy M; Dockendorff, Brian P; Quiñonez-Díaz, Maria D; Valdez, Catherine O; Shutthanandan, Janani I; Tarasevich, Barbara J; Grate, Jay W; Bruckner-Lea, Cynthia J

    2005-09-01

    Detection of pathogenic microorganisms in environmental samples is a difficult process. Concentration of the organisms of interest also co-concentrates inhibitors of many end-point detection methods, notably, nucleic acid methods. In addition, sensitive, highly multiplexed pathogen detection continues to be problematic. The primary function of the BEADS (Biodetection Enabling Analyte Delivery System) platform is the automated concentration and purification of target analytes from interfering substances, often present in these samples, via a renewable surface column. In one version of BEADS, automated immunomagnetic separation (IMS) is used to separate cells from their samples. Captured cells are transferred to a flow-through thermal cycler where PCR, using labeled primers, is performed. PCR products are then detected by hybridization to a DNA suspension array. In another version of BEADS, cell lysis is performed, and community RNA is purified and directly labeled. Multiplexed detection is accomplished by direct hybridization of the RNA to a planar microarray. The integrated IMS/PCR version of BEADS can successfully purify and amplify 10 E. coli O157:H7 cells from river water samples. Multiplexed PCR assays for the simultaneous detection of E. coli O157:H7, Salmonella, and Shigella on bead suspension arrays were demonstrated for the detection of as few as 100 cells of each organism. The RNA version of BEADS is also showing promising results. Automation yields highly purified RNA, suitable for multiplexed detection on microarrays, with microarray detection specificity equivalent to PCR. Both versions of the BEADS platform show great promise for automated pathogen detection from environmental samples. Highly multiplexed pathogen detection using PCR continues to be problematic, but may be required for trace detection in large volume samples. The RNA approach solves the issues of highly multiplexed PCR and provides "live vs. dead" capabilities. However

  12. Automated Test Requirement Document Generation

    DTIC Science & Technology

    1987-11-01

    "DIAGNOSTICS BASED ON THE PRINCIPLES OF ARTIFICIAL INTELLIGENCE", 1984 International Test Conference, 01Oct84. GLOSSARY OF ACRONYMS: AFSATCOM, Air Force Satellite Communication; AI, Artificial Intelligence; ASIC, Application Specific...In-Test Equipment (BITE) and AI (Artificial Intelligence) expert systems need to be fully applied before a completely automated process can be

  13. Tissue Microarrays in Clinical Oncology

    PubMed Central

    Voduc, David; Kenney, Challayne; Nielsen, Torsten O.

    2008-01-01

    The tissue microarray is a recently-implemented, high-throughput technology for the analysis of molecular markers in oncology. This research tool permits the rapid assessment of a biomarker in thousands of tumor samples, using commonly available laboratory assays such as immunohistochemistry and in-situ hybridization. Although introduced less than a decade ago, the TMA has proven to be invaluable in the study of tumor biology, the development of diagnostic tests, and the investigation of oncological biomarkers. This review describes the impact of TMA-based research in clinical oncology and its potential future applications. Technical aspects of TMA construction, and the advantages and disadvantages inherent to this technology are also discussed. PMID:18314063

  14. Analysis of DNA microarray expression data.

    PubMed

    Simon, Richard

    2009-06-01

    DNA microarrays are powerful tools for studying biological mechanisms and for developing prognostic and predictive classifiers for identifying the patients who require treatment and are best candidates for specific treatments. Because microarrays produce so much data from each specimen, they offer great opportunities for discovery and great dangers of producing misleading claims. Microarray-based studies require clear objectives for selecting cases and appropriate analysis methods. Effective analysis of microarray data, where the number of measured variables is orders of magnitude greater than the number of cases, requires specialized statistical methods which have recently been developed. Recent literature reviews indicate that serious problems of analysis exist in a substantial proportion of publications. This manuscript attempts to provide a non-technical summary of the key principles of statistical design and analysis for studies that utilize microarray expression profiling.
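    One concrete reason specialized methods are needed when measured variables vastly outnumber cases: naive per-gene testing at a fixed threshold guarantees false positives. A small worked example (the threshold and gene count are illustrative):

```python
# Per-gene significance threshold and number of genes tested (illustrative).
alpha, m = 0.05, 10_000

# Expected number of false positives if every gene is truly null:
expected_false_positives = alpha * m  # ~500 "discoveries" by chance alone

# Probability of at least one false positive across all m tests,
# assuming independent tests (an idealization):
fwer = 1 - (1 - alpha) ** m  # effectively 1.0

print(expected_false_positives, fwer)
```

This is the arithmetic behind the abstract's warning: without multiplicity-aware methods, a screen of thousands of genes will report hundreds of spurious hits.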

  15. Microarray Applications in Microbial Ecology Research.

    SciTech Connect

    Gentry, T.; Schadt, C.; Zhou, J.

    2006-04-06

    Microarray technology has the unparalleled potential to simultaneously determine the dynamics and/or activities of most, if not all, of the microbial populations in complex environments such as soils and sediments. Researchers have developed several types of arrays that characterize the microbial populations in these samples based on their phylogenetic relatedness or functional genomic content. Several recent studies have used these microarrays to investigate ecological issues; however, most have only analyzed a limited number of samples, with relatively few experiments utilizing the full high-throughput potential of microarray analysis. This is due in part to the unique analytical challenges that these samples present with regard to sensitivity, specificity, quantitation, and data analysis. This review discusses specific applications of microarrays to microbial ecology research along with some of the latest studies addressing the difficulties encountered during analysis of complex microbial communities within environmental samples. With continued development, microarray technology may ultimately achieve its potential for comprehensive, high-throughput characterization of microbial populations in near real-time.

  16. In control: systematic assessment of microarray performance.

    PubMed

    van Bakel, Harm; Holstege, Frank C P

    2004-10-01

    Expression profiling using DNA microarrays is a powerful technique that is widely used in the life sciences. How reliable are microarray-derived measurements? The assessment of performance is challenging because of the complicated nature of microarray experiments and the many different technology platforms. There is a mounting call for standards to be introduced, and this review addresses some of the issues that are involved. Two important characteristics of performance are accuracy and precision. The assessment of these factors can be either for the purpose of technology optimization or for the evaluation of individual microarray hybridizations. Microarray performance has been evaluated by at least four approaches in the past. Here, we argue that external RNA controls offer the most versatile system for determining performance and describe how such standards could be implemented. Other uses of external controls are discussed, along with the importance of probe sequence availability and the quantification of labelled material.
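    Accuracy and precision, the two performance characteristics this review distinguishes, can be made concrete with external spike-in controls: accuracy as bias against the nominal spiked level, precision as the coefficient of variation across replicate hybridizations. A minimal sketch with hypothetical intensities (not a method from the review itself):

```python
import statistics

def accuracy_and_precision(measured, nominal):
    """Accuracy as mean relative bias against the nominal spike-in level;
    precision as the coefficient of variation across replicate hybridizations."""
    mean = statistics.mean(measured)
    bias = (mean - nominal) / nominal       # accuracy: closeness to the truth
    cv = statistics.stdev(measured) / mean  # precision: repeatability
    return bias, cv

# Hypothetical intensities for one external RNA control spiked at a
# nominal level of 1000, measured on five replicate hybridizations:
measured = [940, 1010, 980, 1050, 970]
bias, cv = accuracy_and_precision(measured, 1000)
print(f"bias = {bias:+.1%}, CV = {cv:.1%}")
```

In practice the nominal level of an external control is known from the spike-in design, which is exactly what makes such controls useful for separating systematic error (bias) from random error (CV).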

  17. Space science experimentation automation and support

    NASA Technical Reports Server (NTRS)

    Frainier, Richard J.; Groleau, Nicolas; Shapiro, Jeff C.

    1994-01-01

    This paper outlines recent work done at the NASA Ames Artificial Intelligence Research Laboratory on automation and support of science experiments on the US Space Shuttle in low earth orbit. Three approaches to increasing the science return of these experiments using emerging automation technologies are described: remote control (telescience), science advisors for astronaut operators, and fully autonomous experiments. The capabilities and limitations of these approaches are reviewed.

  18. Chaotic mixer improves microarray hybridization.

    PubMed

    McQuain, Mark K; Seale, Kevin; Peek, Joel; Fisher, Timothy S; Levy, Shawn; Stremler, Mark A; Haselton, Frederick R

    2004-02-15

    Hybridization is an important aspect of microarray experimental design which influences array signal levels and the repeatability of data within an array and across different arrays. Current methods typically require 24h and use target inefficiently. In these studies, we compare hybridization signals obtained in conventional static hybridization, which depends on diffusional target delivery, with signals obtained in a dynamic hybridization chamber, which employs a fluid mixer based on chaotic advection theory to deliver targets across a conventional glass slide array. Microarrays were printed with a pattern of 102 identical probe spots containing a 65-mer oligonucleotide capture probe. Hybridization of a 725-bp fluorescently labeled target was used to measure average target hybridization levels, local signal-to-noise ratios, and array hybridization uniformity. Dynamic hybridization for 1h with 1 or 10ng of target DNA increased hybridization signal intensities approximately threefold over a 24-h static hybridization. Similarly, a 10- or 60-min dynamic hybridization of 10ng of target DNA increased hybridization signal intensities fourfold over a 24h static hybridization. In time course studies, static hybridization reached a maximum within 8 to 12h using either 1 or 10ng of target. In time course studies using the dynamic hybridization chamber, hybridization using 1ng of target increased to a maximum at 4h and that using 10ng of target did not vary over the time points tested. In comparison to static hybridization, dynamic hybridization increased the signal-to-noise ratios threefold and reduced spot-to-spot variation twofold. Therefore, we conclude that dynamic hybridization based on a chaotic mixer design improves both the speed of hybridization and the maximum level of hybridization while increasing signal-to-noise ratios and reducing spot-to-spot variation.

  19. Automated S/TEM metrology on advanced semiconductor gate structures

    NASA Astrophysics Data System (ADS)

    Strauss, M.; Arjavac, J.; Horspool, D. N.; Nakahara, K.; Deeb, C.; Hobbs, C.

    2012-03-01

    Alternate techniques for obtaining metrology data from advanced semiconductor device structures may be required. Automated STEM-based dimensional metrology (CD-STEM) was developed for complex 3D geometries in read/write head metrology in the hard disk drive industry. It has been widely adopted, and is the process of record for metrology. Fully automated S/TEM metrology on advanced semiconductor gate structures is viable, with good repeatability and robustness. Consistent automated throughput of 10 samples per hour was achieved. Automated sample preparation was developed with sufficient throughput and quality to support the automated CD-STEM.

  20. Microarray-integrated optoelectrofluidic immunoassay system.

    PubMed

    Han, Dongsik; Park, Je-Kyun

    2016-05-01

    A microarray-based analytical platform has been utilized as a powerful tool in biological assay fields. However, an analyte depletion problem due to the slow mass transport based on molecular diffusion causes low reaction efficiency, resulting in a limitation for practical applications. This paper presents a novel method to improve the efficiency of microarray-based immunoassay via an optically induced electrokinetic phenomenon by integrating an optoelectrofluidic device with a conventional glass slide-based microarray format. A sample droplet was loaded between the microarray slide and the optoelectrofluidic device on which a photoconductive layer was deposited. Under the application of an AC voltage, optically induced AC electroosmotic flows caused by a microarray-patterned light actively enhanced the mass transport of target molecules at the multiple assay spots of the microarray simultaneously, which reduced tedious reaction time from more than 30 min to 10 min. Based on this enhancing effect, a heterogeneous immunoassay with a tiny volume of sample (5 μl) was successfully performed in the microarray-integrated optoelectrofluidic system using immunoglobulin G (IgG) and anti-IgG, resulting in improved efficiency compared to the static environment. Furthermore, the application of multiplex assays was also demonstrated by multiple protein detection.

  1. MAAMD: a workflow to standardize meta-analyses and comparison of affymetrix microarray data

    PubMed Central

    2014-01-01

    Background Mandatory deposit of raw microarray data files for public access, prior to study publication, provides significant opportunities to conduct new bioinformatics analyses within and across multiple datasets. Analysis of raw microarray data files (e.g. Affymetrix CEL files) can be time consuming, complex, and requires fundamental computational and bioinformatics skills. The development of analytical workflows to automate these tasks simplifies the processing of, improves the efficiency of, and serves to standardize multiple and sequential analyses. Once installed, workflows facilitate the tedious steps required to run rapid intra- and inter-dataset comparisons. Results We developed a workflow to facilitate and standardize Meta-Analysis of Affymetrix Microarray Data (MAAMD) in Kepler. Two freely available stand-alone software tools, R and AltAnalyze, were embedded in MAAMD. The inputs of MAAMD are user-editable csv files, which contain sample information and parameters describing the locations of input files and required tools. MAAMD was tested by analyzing 4 different GEO datasets from mice and drosophila. MAAMD automates data downloading, data organization, data quality control assessment, differential gene expression analysis, clustering analysis, pathway visualization, gene-set enrichment analysis, and cross-species orthologous-gene comparisons. MAAMD was utilized to identify gene orthologues responding to hypoxia or hyperoxia in both mice and drosophila. The entire set of analyses for 4 datasets (34 total microarrays) finished in ~1 hour. Conclusions MAAMD saves time, minimizes the required computer skills, and offers a standardized procedure for users to analyze microarray datasets and make new intra- and inter-dataset comparisons. PMID:24621103
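    MAAMD is driven by user-editable CSV files describing samples and parameters. A toy sketch of that CSV-driven pattern (the column names and steps below are hypothetical, and the real workflow runs in Kepler with R and AltAnalyze, not Python):

```python
import csv
import io

# Hypothetical sample sheet in the spirit of MAAMD's user-editable csv inputs.
SAMPLE_SHEET = """\
sample,species,condition,cel_file
m1,mouse,hypoxia,GSM001.CEL
m2,mouse,normoxia,GSM002.CEL
"""

def load_samples(text):
    """Parse the sample sheet into a list of dicts, one per microarray."""
    return list(csv.DictReader(io.StringIO(text)))

def run_workflow(samples, steps):
    """Apply each named step to every sample in order, collecting a log."""
    log = []
    for step_name, step in steps:
        for s in samples:
            log.append(f"{step_name}: {s['sample']} ({s['cel_file']})")
            step(s)
    return log

# Placeholder steps standing in for QC, normalization, DE analysis, etc.
steps = [("qc", lambda s: None), ("normalize", lambda s: None)]
for line in run_workflow(load_samples(SAMPLE_SHEET), steps):
    print(line)
```

The point of the pattern is the one MAAMD makes: once sample metadata lives in a plain CSV, the same sequence of steps can be re-run unchanged on any new dataset.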

  2. DNA Microarrays in Herbal Drug Research

    PubMed Central

    Chavan, Preeti; Joshi, Kalpana; Patwardhan, Bhushan

    2006-01-01

    Natural products are gaining increased applications in drug discovery and development. Being chemically diverse they are able to modulate several targets simultaneously in a complex system. Analysis of gene expression becomes necessary for better understanding of molecular mechanisms. Conventional strategies for expression profiling are optimized for single gene analysis. DNA microarrays serve as suitable high throughput tool for simultaneous analysis of multiple genes. Major practical applicability of DNA microarrays remains in DNA mutation and polymorphism analysis. This review highlights applications of DNA microarrays in pharmacodynamics, pharmacogenomics, toxicogenomics and quality control of herbal drugs and extracts. PMID:17173108

  3. Progress in the application of DNA microarrays.

    PubMed Central

    Lobenhofer, E K; Bushel, P R; Afshari, C A; Hamadeh, H K

    2001-01-01

    Microarray technology has been applied to a variety of different fields to address fundamental research questions. The use of microarrays, or DNA chips, to study the gene expression profiles of biologic samples began in 1995. Since that time, the fundamental concepts behind the chip, the technology required for making and using these chips, and the multitude of statistical tools for analyzing the data have been extensively reviewed. For this reason, the focus of this review will be not on the technology itself but on the application of microarrays as a research tool and the future challenges of the field. PMID:11673116

  4. Automated office blood pressure.

    PubMed

    Myers, Martin G; Godwin, Marshall

    2012-05-01

    Manual blood pressure (BP) is gradually disappearing from clinical practice with the mercury sphygmomanometer now considered to be an environmental hazard. Manual BP is also subject to measurement error on the part of the physician/nurse and to patient-related anxiety, which can result in poor-quality BP measurements and office-induced (white coat) hypertension. Automated office (AO) BP with devices such as the BpTRU (BpTRU Medical Devices, Coquitlam, BC) has already replaced conventional manual BP in many primary care practices in Canada and has also attracted interest in other countries where research studies using AOBP have been undertaken. The basic principles of AOBP include multiple readings taken with a fully automated recorder with the patient resting alone in a quiet room. When these principles are followed, office-induced hypertension is eliminated and AOBP exhibits a much stronger correlation with the awake ambulatory BP as compared with routine manual BP measurements. Unlike routine manual BP, AOBP correlates as well with left ventricular mass as does the awake ambulatory BP. AOBP also simplifies the definition of hypertension in that the cut point for a normal AOBP (< 135/85 mm Hg) is the same as for the awake ambulatory BP and home BP. This article summarizes the currently available evidence supporting the use of AOBP in routine clinical practice and proposes an algorithm in which AOBP replaces manual BP for the diagnosis and management of hypertension.
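    The AOBP protocol summarized above reduces to averaging multiple automated readings and applying the 135/85 mm Hg cut point the article cites. A minimal sketch (device-specific conventions, such as discarding the first reading, are deliberately omitted):

```python
def aobp_mean(readings):
    """Average a series of automated (systolic, diastolic) readings in mm Hg."""
    n = len(readings)
    sys = sum(r[0] for r in readings) / n
    dia = sum(r[1] for r in readings) / n
    return sys, dia

def is_elevated(readings, cutoff=(135, 85)):
    """Normal AOBP is < 135/85 mm Hg per the article; at or above either
    component of the cut point is flagged as not normal."""
    sys, dia = aobp_mean(readings)
    return sys >= cutoff[0] or dia >= cutoff[1]

readings = [(138, 84), (132, 82), (130, 80)]  # illustrative values
print(aobp_mean(readings))    # mean of the three readings
print(is_elevated(readings))  # False: the averages fall below 135/85
```

Using the same cut point as awake ambulatory and home BP is exactly the simplification the abstract highlights.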

  5. Agile automated vision

    NASA Astrophysics Data System (ADS)

    Fandrich, Juergen; Schmitt, Lorenz A.

    1994-11-01

    The microelectronic industry is a protagonist in driving automated vision to new paradigms. Today semiconductor manufacturers use vision systems quite frequently in their fabs in the front-end process. In fact, the process depends on reliable image processing systems. In the back-end process, where ICs are assembled and packaged, vision systems are today only partly used. But in the next years automated vision will become compulsory for the back-end process as well. Vision will be fully integrated into every IC package production machine to increase yields and reduce costs. Modern high-speed material processing requires dedicated and efficient concepts in image processing. But the integration of various equipment in a production plant calls for unified handling of data flow and interfaces. Only agile vision systems can meet these competing demands: fast, reliable, adaptable, scalable and comprehensive. A powerful hardware platform is an indispensable requirement for the use of advanced and reliable, but unfortunately computing-intensive, image processing algorithms. The massively parallel SIMD hardware product LANTERN/VME supplies a powerful platform for existing and new functionality. LANTERN/VME is used with a new optical sensor for IC package lead inspection. This is done in 3D, including horizontal and coplanarity inspection. The appropriate software is designed for lead inspection, alignment and control tasks in IC package production and handling equipment, like Trim&Form, Tape&Reel and Pick&Place machines.

  6. Mutational analysis using oligonucleotide microarrays

    PubMed Central

    Hacia, J.; Collins, F.

    1999-01-01

    The development of inexpensive high throughput methods to identify individual DNA sequence differences is important to the future growth of medical genetics. This has become increasingly apparent as epidemiologists, pathologists, and clinical geneticists focus more attention on the molecular basis of complex multifactorial diseases. Such undertakings will rely upon genetic maps based upon newly discovered, common, single nucleotide polymorphisms. Furthermore, candidate gene approaches used in identifying disease associated genes necessitate screening large sequence blocks for changes tracking with the disease state. Even after such genes are isolated, large scale mutational analyses will often be needed for risk assessment studies to define the likely medical consequences of carrying a mutated gene.
This review concentrates on the use of oligonucleotide arrays for hybridisation based comparative sequence analysis. Technological advances within the past decade have made it possible to apply this technology to many different aspects of medical genetics. These applications range from the detection and scoring of single nucleotide polymorphisms to mutational analysis of large genes. Although we discuss published scientific reports, unpublished work from the private sector12 could also significantly affect the future of this technology.


Keywords: mutational analysis; oligonucleotide microarrays; DNA chips PMID:10528850

  7. Integrating Microarray Data and GRNs.

    PubMed

    Koumakis, L; Potamias, G; Tsiknakis, M; Zervakis, M; Moustakis, V

    2016-01-01

    With the completion of the Human Genome Project and the emergence of high-throughput technologies, a vast amount of molecular and biological data are being produced. Two of the most important and significant data sources come from microarray gene-expression experiments and respective databanks (e.g., Gene Expression Omnibus-GEO (http://www.ncbi.nlm.nih.gov/geo)), and from molecular pathways and Gene Regulatory Networks (GRNs) stored and curated in public (e.g., Kyoto Encyclopedia of Genes and Genomes-KEGG (http://www.genome.jp/kegg/pathway.html), Reactome (http://www.reactome.org/ReactomeGWT/entrypoint.html)) as well as in commercial repositories (e.g., Ingenuity IPA (http://www.ingenuity.com/products/ipa)). The association of these two sources aims to give new insight in disease understanding and reveal new molecular targets in the treatment of specific phenotypes. Three major research lines and respective efforts that try to utilize and combine data from both of these sources could be identified, namely: (1) de novo reconstruction of GRNs, (2) identification of Gene-signatures, and (3) identification of differentially expressed GRN functional paths (i.e., sub-GRN paths that distinguish between different phenotypes). In this chapter, we give an overview of the existing methods that support the different types of gene-expression and GRN integration with a focus on methodologies that aim to identify phenotype-discriminant GRNs or subnetworks, and we also present our methodology.

  8. Mapping the affinity landscape of Thrombin-binding aptamers on 2'F-ANA/DNA chimeric G-Quadruplex microarrays.

    PubMed

    Lietard, Jory; Abou Assi, Hala; Gómez-Pinto, Irene; González, Carlos; Somoza, Mark M; Damha, Masad J

    2017-01-18

    In situ fabricated nucleic acids microarrays are versatile and very high-throughput platforms for aptamer optimization and discovery, but the chemical space that can be probed against a given target has largely been confined to DNA, while RNA and non-natural nucleic acid microarrays are still an essentially uncharted territory. 2'-Fluoroarabinonucleic acid (2'F-ANA) is a prime candidate for such use in microarrays. Indeed, 2'F-ANA chemistry is readily amenable to photolithographic microarray synthesis and its potential in high affinity aptamers has been recently discovered. We thus synthesized the first microarrays containing 2'F-ANA and 2'F-ANA/DNA chimeric sequences to fully map the binding affinity landscape of the TBA1 thrombin-binding G-quadruplex aptamer containing all 32 768 possible DNA-to-2'F-ANA mutations. The resulting microarray was screened against thrombin to identify a series of promising 2'F-ANA-modified aptamer candidates with Kds significantly lower than that of the unmodified control and which were found to adopt highly stable, antiparallel-folded G-quadruplex structures. The solution structure of the TBA1 aptamer modified with 2'F-ANA at position T3 shows that fluorine substitution preorganizes the dinucleotide loop into the proper conformation for interaction with thrombin. Overall, our work strengthens the potential of 2'F-ANA in aptamer research and further expands non-genomic applications of nucleic acids microarrays.

  9. Genome-scale cluster analysis of replicated microarrays using shrinkage correlation coefficient

    PubMed Central

    Yao, Jianchao; Chang, Chunqi; Salmi, Mari L; Hung, Yeung Sam; Loraine, Ann; Roux, Stanley J

    2008-01-01

    Background Currently, clustering with some form of correlation coefficient as the gene similarity metric has become a popular method for profiling genomic data. The Pearson correlation coefficient and the standard deviation (SD)-weighted correlation coefficient are the two most widely-used correlations as the similarity metrics in clustering microarray data. However, these two correlations are not optimal for analyzing replicated microarray data generated by most laboratories. An effective correlation coefficient is needed to provide statistically sufficient analysis of replicated microarray data. Results In this study, we describe a novel correlation coefficient, shrinkage correlation coefficient (SCC), that fully exploits the similarity between the replicated microarray experimental samples. The methodology considers both the number of replicates and the variance within each experimental group in clustering expression data, and provides a robust statistical estimation of the error of replicated microarray data. The value of SCC is revealed by its comparison with two other correlation coefficients that are currently the most widely-used (Pearson correlation coefficient and SD-weighted correlation coefficient) using statistical measures on both synthetic expression data as well as real gene expression data from Saccharomyces cerevisiae. Two leading clustering methods, hierarchical and k-means clustering were applied for the comparison. The comparison indicated that using SCC achieves better clustering performance. Applying SCC-based hierarchical clustering to the replicated microarray data obtained from germinating spores of the fern Ceratopteris richardii, we discovered two clusters of genes with shared expression patterns during spore germination. Functional analysis suggested that some of the genetic mechanisms that control germination in such diverse plant lineages as mosses and angiosperms are also conserved among ferns. 
Conclusion This study shows that SCC is
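The general idea behind a replicate-aware correlation can be sketched as follows. This is a simplified, variance-weighted illustration (conditions with noisy replicates are down-weighted), not the published SCC formula; the function name and weighting scheme are assumptions made for illustration.

```python
import numpy as np

def replicate_weighted_corr(x_reps, y_reps):
    """Correlation between two genes measured with replicates.

    x_reps, y_reps: 2-D arrays of shape (conditions, replicates).
    Conditions whose replicates disagree (high within-group variance)
    get a lower weight -- the general idea behind shrinkage/weighted
    correlations, NOT the exact SCC estimator from the paper.
    """
    x_mean = x_reps.mean(axis=1)
    y_mean = y_reps.mean(axis=1)
    # standard error of each condition mean; small floor avoids /0
    se = np.sqrt(x_reps.var(axis=1, ddof=1) / x_reps.shape[1]
                 + y_reps.var(axis=1, ddof=1) / y_reps.shape[1]) + 1e-9
    w = 1.0 / se
    w = w / w.sum()
    mx = np.sum(w * x_mean)
    my = np.sum(w * y_mean)
    cov = np.sum(w * (x_mean - mx) * (y_mean - my))
    sx = np.sqrt(np.sum(w * (x_mean - mx) ** 2))
    sy = np.sqrt(np.sum(w * (y_mean - my) ** 2))
    return cov / (sx * sy)
```

Such a weighted correlation can then be plugged into hierarchical or k-means clustering as the similarity metric, as the paper does with SCC.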

  10. Integrating Test-Form Formatting into Automated Test Assembly

    ERIC Educational Resources Information Center

    Diao, Qi; van der Linden, Wim J.

    2013-01-01

    Automated test assembly uses the methodology of mixed integer programming to select an optimal set of items from an item bank. Automated test-form generation uses the same methodology to optimally order the items and format the test form. From an optimization point of view, production of fully formatted test forms directly from the item pool using…

  11. 21 CFR 870.5310 - Automated external defibrillator.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... external defibrillator. (a) Identification. An automated external defibrillator (AED) is a low-energy... atria or ventricles of the heart. An AED analyzes the patient's electrocardiogram, interprets the cardiac rhythm, and automatically delivers an electrical shock (fully automated AED), or advises the...

  12. 21 CFR 870.5310 - Automated external defibrillator.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... shock of a maximum of 360 joules of energy used for defibrillating (restoring normal heart rhythm) the... cardiac rhythm, and automatically delivers an electrical shock (fully automated AED), or advises the user to deliver the shock (semi-automated or shock advisory AED) to treat ventricular fibrillation...

  13. Enhancing interdisciplinary mathematics and biology education: a microarray data analysis course bridging these disciplines.

    PubMed

    Tra, Yolande V; Evans, Irene M

    2010-01-01

    BIO2010 put forth the goal of improving the mathematical educational background of biology students. The analysis and interpretation of microarray high-dimensional data can be very challenging and is best done by a statistician and a biologist working and teaching in a collaborative manner. We set up such a collaboration and designed a course on microarray data analysis. We started using Genome Consortium for Active Teaching (GCAT) materials and Microarray Genome and Clustering Tool software and added R statistical software along with Bioconductor packages. In response to student feedback, one microarray data set was fully analyzed in class, starting from preprocessing to gene discovery to pathway analysis using the latter software. A class project was to conduct a similar analysis where students analyzed their own data or data from a published journal paper. This exercise showed the impact that filtering, preprocessing, and different normalization methods had on gene inclusion in the final data set. We conclude that this course achieved its goals to equip students with skills to analyze data from a microarray experiment. We offer our insight about collaborative teaching as well as how other faculty might design and implement a similar interdisciplinary course.

  14. Enhancing Interdisciplinary Mathematics and Biology Education: A Microarray Data Analysis Course Bridging These Disciplines

    PubMed Central

    Evans, Irene M.

    2010-01-01

    BIO2010 put forth the goal of improving the mathematical educational background of biology students. The analysis and interpretation of microarray high-dimensional data can be very challenging and is best done by a statistician and a biologist working and teaching in a collaborative manner. We set up such a collaboration and designed a course on microarray data analysis. We started using Genome Consortium for Active Teaching (GCAT) materials and Microarray Genome and Clustering Tool software and added R statistical software along with Bioconductor packages. In response to student feedback, one microarray data set was fully analyzed in class, starting from preprocessing to gene discovery to pathway analysis using the latter software. A class project was to conduct a similar analysis where students analyzed their own data or data from a published journal paper. This exercise showed the impact that filtering, preprocessing, and different normalization methods had on gene inclusion in the final data set. We conclude that this course achieved its goals to equip students with skills to analyze data from a microarray experiment. We offer our insight about collaborative teaching as well as how other faculty might design and implement a similar interdisciplinary course. PMID:20810954

  15. Contributions to Statistical Problems Related to Microarray Data

    ERIC Educational Resources Information Center

    Hong, Feng

    2009-01-01

    Microarrays are a high-throughput technology for measuring gene expression. Analysis of microarray data brings many interesting and challenging problems. This thesis consists of three studies related to microarray data. First, we propose a Bayesian model for microarray data and use Bayes factors to identify differentially expressed genes. Second, we…

  16. The Impact of Photobleaching on Microarray Analysis

    PubMed Central

    von der Haar, Marcel; Preuß, John-Alexander; von der Haar, Kathrin; Lindner, Patrick; Scheper, Thomas; Stahl, Frank

    2015-01-01

    DNA-Microarrays have become a potent technology for high-throughput analysis of genetic regulation. However, the wide dynamic range of signal intensities of fluorophore-based microarrays exceeds the dynamic range of a single array scan by far, thus limiting the key benefit of microarray technology: parallelization. The implementation of multi-scan techniques represents a promising approach to overcome these limitations. These techniques are, in turn, limited by the fluorophores’ susceptibility to photobleaching when exposed to the scanner’s laser light. In this paper the photobleaching characteristics of cyanine-3 and cyanine-5 as part of solid state DNA microarrays are studied. The effects of initial fluorophore intensity as well as laser scanner dependent variables such as the photomultiplier tube’s voltage on bleaching and imaging are investigated. The resulting data is used to develop a model capable of simulating the expected degree of signal intensity reduction caused by photobleaching for each fluorophore individually, allowing for the removal of photobleaching-induced, systematic bias in multi-scan procedures. Single-scan applications also benefit as they rely on pre-scans to determine the optimal scanner settings. These findings constitute a step towards standardization of microarray experiments and analysis and may help to increase the lab-to-lab comparability of microarray experiment results. PMID:26378589
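A first-order photobleaching model of the kind described above can be sketched in a few lines. The exponential form and the decay constant `k` are assumptions for illustration; in practice `k` would be fitted per fluorophore (e.g. separately for Cy3 and Cy5) from repeated scans, as the paper's model does with scanner-dependent variables included.

```python
import numpy as np

def bleaching_factor(scan_index, k):
    """Fraction of initial fluorescence remaining after `scan_index`
    prior laser exposures, assuming simple first-order bleaching:
    I_n = I_0 * exp(-k * n)."""
    return np.exp(-k * scan_index)

def correct_multiscan(intensities, k):
    """Remove the systematic bleaching bias from a series of scans of
    the same array (one 1-D spot-intensity vector per scan) by
    dividing out the modelled decay."""
    intensities = np.asarray(intensities, dtype=float)
    n = np.arange(intensities.shape[0])
    return intensities / bleaching_factor(n, k)[:, None]
```

After correction, repeated scans of the same spot should agree, which is what makes multi-scan dynamic-range extension usable.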

  17. Unsupervised assessment of microarray data quality using a Gaussian mixture model

    PubMed Central

    Howard, Brian E; Sick, Beate; Heber, Steffen

    2009-01-01

    Background Quality assessment of microarray data is an important and often challenging aspect of gene expression analysis. This task frequently involves the examination of a variety of summary statistics and diagnostic plots. The interpretation of these diagnostics is often subjective, and generally requires careful expert scrutiny. Results We show how an unsupervised classification technique based on the Expectation-Maximization (EM) algorithm and the naïve Bayes model can be used to automate microarray quality assessment. The method is flexible and can be easily adapted to accommodate alternate quality statistics and platforms. We evaluate our approach using Affymetrix 3' gene expression and exon arrays and compare the performance of this method to a similar supervised approach. Conclusion This research illustrates the efficacy of an unsupervised classification approach for the purpose of automated microarray data quality assessment. Since our approach requires only unannotated training data, it is easy to customize and to keep up-to-date as technology evolves. In contrast to other "black box" classification systems, this method also allows for intuitive explanations. PMID:19545436
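The unsupervised idea, fitting a mixture to unannotated quality statistics and flagging arrays assigned to the worse component, can be illustrated with a toy one-dimensional EM fit. This is a minimal sketch, not the paper's multi-feature naïve Bayes model; the function names and the "higher mean is worse" convention are assumptions.

```python
import numpy as np

def fit_gmm_1d(x, n_iter=200):
    """Two-component 1-D Gaussian mixture fitted with EM."""
    x = np.asarray(x, dtype=float)
    mu = np.array([x.min(), x.max()])            # crude initialisation
    sigma = np.array([x.std(), x.std()]) + 1e-6
    pi = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: responsibility of each component for each point
        dens = pi * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / sigma
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: update weights, means, and variances
        nk = resp.sum(axis=0)
        pi = nk / nk.sum()
        mu = (resp * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk) + 1e-6
    return mu, sigma, pi

def flag_outlier_arrays(qc_stat):
    """Assign each array to the component with higher responsibility;
    the component with the larger mean of the QC statistic (assumed
    here to measure e.g. residual noise) is treated as low quality."""
    qc = np.asarray(qc_stat, dtype=float)
    mu, sigma, pi = fit_gmm_1d(qc)
    dens = pi * np.exp(-0.5 * ((qc[:, None] - mu) / sigma) ** 2) / sigma
    labels = dens.argmax(axis=1)
    return labels == mu.argmax()
```

Because the training data need no annotation, the classifier can be refitted whenever a new platform or quality statistic is introduced, which is the practical advantage the authors emphasize.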

  18. Evaluation of Solid Supports for Slide- and Well-Based Recombinant Antibody Microarrays

    PubMed Central

    Gerdtsson, Anna S.; Dexlin-Mellby, Linda; Delfani, Payam; Berglund, Erica; Borrebaeck, Carl A. K.; Wingren, Christer

    2016-01-01

    Antibody microarrays have emerged as an important tool within proteomics, enabling multiplexed protein expression profiling in both health and disease. The design and performance of antibody microarrays and how they are processed are dependent on several factors, of which the interplay between the antibodies and the solid surfaces plays a central role. In this study, we have taken on the first comprehensive view and evaluated the overall impact of solid surfaces on the recombinant antibody microarray design. The results clearly demonstrated the importance of the surface-antibody interaction and showed the effect of the solid supports on the printing process, the array format of planar arrays (slide- and well-based), the assay performance (spot features, reproducibility, specificity and sensitivity) and assay processing (degree of automation). In the end, two high-end recombinant antibody microarray technology platforms were designed, based on slide-based (black polymer) and well-based (clear polymer) arrays, paving the way for future large-scale protein expression profiling efforts. PMID:27600082

  19. A hill-climbing approach for automatic gridding of cDNA microarray images.

    PubMed

    Rueda, Luis; Vidyadharan, Vidya

    2006-01-01

    Image analysis and statistical analysis are two important stages of cDNA microarray processing. Of these, gridding is necessary to accurately identify the location of each spot while extracting spot intensities from the microarray images and automating this procedure permits high-throughput analysis. Due to the deficiencies of the equipment used to print the arrays, rotations, misalignments, high contamination with noise and artifacts, and the enormous amount of data generated, solving the gridding problem by means of an automatic system is not trivial. Existing techniques to solve the automatic grid segmentation problem cover only limited aspects of this challenging problem and require the user to specify the size of the spots, the number of rows and columns in the grid, and boundary conditions. In this paper, a hill-climbing automatic gridding and spot quantification technique is proposed which takes a microarray image (or a subgrid) as input and makes no assumptions about the size of the spots, rows, and columns in the grid. The proposed method is based on a hill-climbing approach that utilizes different objective functions. The method has been found to effectively detect the grids on microarray images drawn from databases from GEO and the Stanford genomic laboratories.
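The flavor of such a hill-climbing search can be sketched on a 1-D intensity profile (e.g. the column sums of a subgrid image). This toy version searches only over grid offset and spacing with a fixed number of lines and a single objective; the paper's method uses richer objectives and also estimates the grid dimensions, so the names and parameters below are illustrative assumptions.

```python
import numpy as np

def grid_score(profile, offset, spacing, n_lines):
    """Objective: total profile intensity sampled at the candidate
    grid positions offset + i*spacing (higher means the candidate
    lines sit on spot centres)."""
    pos = (offset + spacing * np.arange(n_lines)).astype(int)
    pos = pos[(pos >= 0) & (pos < len(profile))]
    return profile[pos].sum()

def hill_climb_grid(profile, n_lines, offset0, spacing0, steps=500):
    """Greedy hill climbing over (offset, spacing): repeatedly take
    any unit move that improves the objective; stop at a local
    optimum."""
    best = (offset0, spacing0)
    best_score = grid_score(profile, *best, n_lines)
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    for _ in range(steps):
        improved = False
        for do, ds in moves:
            cand = (best[0] + do, best[1] + ds)
            if cand[1] <= 0:
                continue
            s = grid_score(profile, *cand, n_lines)
            if s > best_score:
                best, best_score, improved = cand, s, True
        if not improved:
            break
    return best
```

Running the same search independently on row and column profiles yields the full grid without the user supplying spot size or boundary conditions.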

  20. Evaluation of Solid Supports for Slide- and Well-Based Recombinant Antibody Microarrays.

    PubMed

    Gerdtsson, Anna S; Dexlin-Mellby, Linda; Delfani, Payam; Berglund, Erica; Borrebaeck, Carl A K; Wingren, Christer

    2016-06-08

    Antibody microarrays have emerged as an important tool within proteomics, enabling multiplexed protein expression profiling in both health and disease. The design and performance of antibody microarrays and how they are processed are dependent on several factors, of which the interplay between the antibodies and the solid surfaces plays a central role. In this study, we have taken on the first comprehensive view and evaluated the overall impact of solid surfaces on the recombinant antibody microarray design. The results clearly demonstrated the importance of the surface-antibody interaction and showed the effect of the solid supports on the printing process, the array format of planar arrays (slide- and well-based), the assay performance (spot features, reproducibility, specificity and sensitivity) and assay processing (degree of automation). In the end, two high-end recombinant antibody microarray technology platforms were designed, based on slide-based (black polymer) and well-based (clear polymer) arrays, paving the way for future large-scale protein expression profiling efforts.

  1. Validation Procedure for Multiplex Antibiotic Immunoassays Using Flow-Based Chemiluminescence Microarrays.

    PubMed

    Meyer, Verena Katharina; Meloni, Daniela; Olivo, Fabio; Märtlbauer, Erwin; Dietrich, Richard; Niessner, Reinhard; Seidel, Michael

    2017-01-01

    Small molecules like antibiotics or other pharmaceuticals can be detected and quantified, among others, with indirect competitive immunoassays. With regard to multiplex quantification, these tests can be performed as chemiluminescence microarray immunoassays, in which, in principle, the analyte in the sample and the same substance immobilized on the chip surface compete for a limited number of specific antibody binding sites. The amount of the specific primary antibody that has been bound to the surface is visualized by means of a chemiluminescence reaction. Validated quantitative confirmatory methods for the detection of contaminants, for example drug residues, in food samples usually comprise chromatographic analysis and spectrometric detection, e.g., HPLC-MS, GC-MS, or GC with electron capture detection. Here, we describe a validation procedure (according to the Commission Decision of the European Communities 2002/657/EC) for multiplex immunoassays performed as flow-through chemiluminescence microarrays, using the example of a small molecule microarray for the simultaneous detection of 13 antibiotics in milk. By this means, we suggest accepting multianalyte immunoassays as confirmatory methods as well, to benefit from the advantages of a fast automated method that does not need any pretreatment of the sample. The presented microarray chip is regenerable, so an internal calibration is implemented. Therefore, the analytical results are highly precise, combined with low costs (the aim for commercialization is less than 1 € per analyte per sample, significantly less than HPLC-MS).

  2. Automated tetraploid genotype calling by hierarchical clustering

    Technology Transfer Automated Retrieval System (TEKTRAN)

    SNP arrays are transforming breeding and genetics research for autotetraploids. To fully utilize these arrays, however, the relationship between signal intensity and allele dosage must be inferred independently for each marker. We developed an improved computational method to automate this process, ...
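The core step, clustering per-marker signal ratios into allele-dosage groups, can be sketched with standard hierarchical clustering. This is a minimal illustration of the general idea only: the published method adds model-based checks (e.g. that cluster means match expected dosage ratios), and the function name and the simple mean-ordering rule are assumptions.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def call_tetraploid_genotypes(theta, max_dosage=4):
    """Cluster per-sample allele-signal ratios (theta in [0, 1]) for
    one marker into up to max_dosage + 1 groups, then map the groups
    to dosages 0..max_dosage by ascending cluster mean.

    For an autotetraploid there are five possible dosage classes
    (AAAA, AAAB, AABB, ABBB, BBBB), hence the default of 4.
    """
    theta = np.asarray(theta, dtype=float)
    z = linkage(theta[:, None], method="average")
    labels = fcluster(z, t=max_dosage + 1, criterion="maxclust")
    groups = np.unique(labels)
    # order the clusters by mean theta -> allele dosage 0..k
    order = np.argsort([theta[labels == g].mean() for g in groups])
    dosage_of = {g: d for d, g in enumerate(groups[order])}
    return np.array([dosage_of[g] for g in labels])
```

Because the mapping is inferred per marker, the same code applies even when signal intensity scales differently from one probe to the next, which is exactly why dosage must be calibrated marker by marker.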

  3. Fully Integrating the Design Process

    SciTech Connect

    T.A. Bjornard; R.S. Bean

    2008-03-01

    The basic approach to designing nuclear facilities in the United States does not currently reflect the routine consideration of proliferation resistance and international safeguards. The fully integrated design process is an approach for bringing consideration of international safeguards and proliferation resistance, together with state safeguards and security, fully into the design process from the very beginning, while integrating them sensibly and synergistically with the other project functions. In view of the recently established GNEP principles agreed to by the United States and at least eighteen other countries, this paper explores such an integrated approach, and its potential to help fulfill the new internationally driven design requirements with improved efficiencies and reduced costs.

  4. Library Automation: A Survey of Leading Academic and Public Libraries in the United States.

    ERIC Educational Resources Information Center

    Mann, Thomas W., Jr.; And Others

    Results of this survey of 26 public and academic libraries of national stature show that the country's major libraries are fully committed to automating their library operations. Major findings of the survey show that: (1) all libraries surveyed are involved in automation; (2) all libraries surveyed have automated their catalogs and bibliographic…

  5. Genome-wide transcription analyses in rice using tiling microarrays.

    PubMed

    Li, Lei; Wang, Xiangfeng; Stolc, Viktor; Li, Xueyong; Zhang, Dongfen; Su, Ning; Tongprasit, Waraporn; Li, Songgang; Cheng, Zhukuan; Wang, Jun; Deng, Xing Wang

    2006-01-01

    Sequencing and computational annotation revealed several features, including high gene numbers, unusual composition of the predicted genes and a large number of genes lacking homology to known genes, that distinguish the rice (Oryza sativa) genome from that of other fully sequenced model species. We report here a full-genome transcription analysis of the indica rice subspecies using high-density oligonucleotide tiling microarrays. Our results provided expression data support for the existence of 35,970 (81.9%) annotated gene models and identified 5,464 unique transcribed intergenic regions that share similar compositional properties with the annotated exons and have significant homology to other plant proteins. Elucidating and mapping of all transcribed regions revealed an association between global transcription and cytological chromosome features, and an overall similarity of transcriptional activity between duplicated segments of the genome. Collectively, our results provide the first whole-genome transcription map useful for further understanding the rice genome.

  6. Chromosomal Microarray versus Karyotyping for Prenatal Diagnosis

    PubMed Central

    Wapner, Ronald J.; Martin, Christa Lese; Levy, Brynn; Ballif, Blake C.; Eng, Christine M.; Zachary, Julia M.; Savage, Melissa; Platt, Lawrence D.; Saltzman, Daniel; Grobman, William A.; Klugman, Susan; Scholl, Thomas; Simpson, Joe Leigh; McCall, Kimberly; Aggarwal, Vimla S.; Bunke, Brian; Nahum, Odelia; Patel, Ankita; Lamb, Allen N.; Thom, Elizabeth A.; Beaudet, Arthur L.; Ledbetter, David H.; Shaffer, Lisa G.; Jackson, Laird

    2013-01-01

    Background Chromosomal microarray analysis has emerged as a primary diagnostic tool for the evaluation of developmental delay and structural malformations in children. We aimed to evaluate the accuracy, efficacy, and incremental yield of chromosomal microarray analysis as compared with karyotyping for routine prenatal diagnosis. Methods Samples from women undergoing prenatal diagnosis at 29 centers were sent to a central karyotyping laboratory. Each sample was split in two; standard karyotyping was performed on one portion and the other was sent to one of four laboratories for chromosomal microarray. Results We enrolled a total of 4406 women. Indications for prenatal diagnosis were advanced maternal age (46.6%), abnormal result on Down’s syndrome screening (18.8%), structural anomalies on ultrasonography (25.2%), and other indications (9.4%). In 4340 (98.8%) of the fetal samples, microarray analysis was successful; 87.9% of samples could be used without tissue culture. Microarray analysis of the 4282 nonmosaic samples identified all the aneuploidies and unbalanced rearrangements identified on karyotyping but did not identify balanced translocations and fetal triploidy. In samples with a normal karyotype, microarray analysis revealed clinically relevant deletions or duplications in 6.0% with a structural anomaly and in 1.7% of those whose indications were advanced maternal age or positive screening results. Conclusions In the context of prenatal diagnostic testing, chromosomal microarray analysis identified additional, clinically significant cytogenetic information as compared with karyotyping and was equally efficacious in identifying aneuploidies and unbalanced rearrangements but did not identify balanced translocations and triploidies. (Funded by the Eunice Kennedy Shriver National Institute of Child Health and Human Development and others; ClinicalTrials.gov number, NCT01279733.) PMID:23215555

  7. Ensuring Fully Soldered Through Holes

    NASA Technical Reports Server (NTRS)

    Blow, Raymond K.

    1987-01-01

    Simple differential-pressure soldering method provides visual evidence that hidden joints are fully soldered. Intended for soldering connector pins in plated through holes in circuit boards. Molten solder flows into plated through holes, drawn by vacuum in manifold over circuit board. Differential-pressure process ensures solder wets entire through hole around connector pin.

  8. Segment and fit thresholding: a new method for image analysis applied to microarray and immunofluorescence data.

    PubMed

    Ensink, Elliot; Sinha, Jessica; Sinha, Arkadeep; Tang, Huiyuan; Calderone, Heather M; Hostetter, Galen; Winter, Jordan; Cherba, David; Brand, Randall E; Allen, Peter J; Sempere, Lorenzo F; Haab, Brian B

    2015-10-06

    Experiments involving the high-throughput quantification of image data require algorithms for automation. A challenge in the development of such algorithms is to properly interpret signals over a broad range of image characteristics, without the need for manual adjustment of parameters. Here we present a new approach for locating signals in image data, called Segment and Fit Thresholding (SFT). The method assesses statistical characteristics of small segments of the image and determines the best-fit trends between the statistics. Based on the relationships, SFT identifies segments belonging to background regions; analyzes the background to determine optimal thresholds; and analyzes all segments to identify signal pixels. We optimized the initial settings for locating background and signal in antibody microarray and immunofluorescence data and found that SFT performed well over multiple, diverse image characteristics without readjustment of settings. When used for the automated analysis of multicolor, tissue-microarray images, SFT correctly found the overlap of markers with known subcellular localization, and it performed better than a fixed threshold and Otsu's method for selected images. SFT promises to advance the goal of full automation in image analysis.
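A drastically simplified version of the segment-statistics idea can be sketched as follows: tile the image, summarize each tile, pick background tiles, and derive the threshold from their statistics. The published SFT fits trends between the segment statistics rather than using the crude median split below, so treat this as an assumption-laden sketch, not the authors' algorithm.

```python
import numpy as np

def sft_threshold(image, seg=8):
    """Tile `image` into seg x seg blocks, treat blocks whose mean
    intensity falls in the lower half of the block-mean distribution
    as background, and set the signal threshold from the background
    statistics (mean + 3 SD).  Returns (signal mask, threshold)."""
    h, w = image.shape
    means, stds = [], []
    for i in range(0, h - seg + 1, seg):
        for j in range(0, w - seg + 1, seg):
            block = image[i:i + seg, j:j + seg]
            means.append(block.mean())
            stds.append(block.std())
    means = np.array(means)
    stds = np.array(stds)
    bg = means <= np.median(means)           # crude background pick
    thr = means[bg].mean() + 3.0 * stds[bg].mean()
    return image > thr, thr
```

Because the threshold is derived from the image's own background segments, no per-image manual tuning is needed, which is the property the paper optimizes for across diverse image types.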

  9. Automated Demand Response and Commissioning

    SciTech Connect

    Piette, Mary Ann; Watson, David S.; Motegi, Naoya; Bourassa, Norman

    2005-04-01

    This paper describes the results from the second season of research to develop and evaluate the performance of new Automated Demand Response (Auto-DR) hardware and software technology in large facilities. Demand Response (DR) is a set of activities to reduce or shift electricity use to improve the electric grid reliability and manage electricity costs. Fully-Automated Demand Response does not involve human intervention, but is initiated at a home, building, or facility through receipt of an external communications signal. We refer to this as Auto-DR. The control and communications systems must be properly configured and must pass through a set of test stages: Readiness, Approval, Price Client/Price Server Communication, Internet Gateway/Internet Relay Communication, Control of Equipment, and DR Shed Effectiveness. New commissioning tests are needed for such systems to improve connecting demand responsive building systems to the electric grid demand response systems.

  10. Sensors and Automated Analyzers for Radionuclides

    SciTech Connect

    Grate, Jay W.; Egorov, Oleg B.

    2003-03-27

    The production of nuclear weapons materials has generated large quantities of nuclear waste and significant environmental contamination. We have developed new, rapid, automated methods for determination of radionuclides using sequential injection methodologies to automate extraction chromatographic separations, with on-line flow-through scintillation counting for real time detection. This work has progressed in two main areas: radionuclide sensors for water monitoring and automated radiochemical analyzers for monitoring nuclear waste processing operations. Radionuclide sensors have been developed that collect and concentrate radionuclides in preconcentrating minicolumns with dual functionality: chemical selectivity for radionuclide capture and scintillation for signal output. These sensors can detect pertechnetate to below regulatory levels and have been engineered into a prototype for field testing. A fully automated process monitor has been developed for total technetium in nuclear waste streams. This instrument performs sample acidification, speciation adjustment, separation and detection in fifteen minutes or less.

  11. Image segmentation for automated dental identification

    NASA Astrophysics Data System (ADS)

    Haj Said, Eyad; Nassar, Diaa Eldin M.; Ammar, Hany H.

    2006-02-01

    Dental features are one of few biometric identifiers that qualify for postmortem identification; therefore, creation of an Automated Dental Identification System (ADIS) with goals and objectives similar to the Automated Fingerprint Identification System (AFIS) has received increased attention. As a part of ADIS, teeth segmentation from dental radiographs films is an essential step in the identification process. In this paper, we introduce a fully automated approach for teeth segmentation with goal to extract at least one tooth from the dental radiograph film. We evaluate our approach based on theoretical and empirical basis, and we compare its performance with the performance of other approaches introduced in the literature. The results show that our approach exhibits the lowest failure rate and the highest optimality among all full automated approaches introduced in the literature.

  12. Validation of affinity reagents using antigen microarrays.

    PubMed

    Sjöberg, Ronald; Sundberg, Mårten; Gundberg, Anna; Sivertsson, Asa; Schwenk, Jochen M; Uhlén, Mathias; Nilsson, Peter

    2012-06-15

    There is a need for standardised validation of affinity reagents to determine their binding selectivity and specificity. This is of particular importance for systematic efforts that aim to cover the human proteome with different types of binding reagents. One such international program is the SH2-consortium, which was formed to generate a complete set of renewable affinity reagents to the SH2-domain containing human proteins. Here, we describe a microarray strategy to validate various affinity reagents, such as recombinant single-chain antibodies, mouse monoclonal antibodies and antigen-purified polyclonal antibodies using a highly multiplexed approach. An SH2-specific antigen microarray was designed and generated, containing more than 6000 spots displayed by 14 identical subarrays each with 406 antigens, where 105 of them represented SH2-domain containing proteins. Approximately 400 different affinity reagents of various types were analysed on these antigen microarrays carrying antigens of different types. The microarrays revealed not only very detailed specificity profiles for all the binders, but also showed that overlapping target sequences of spotted antigens were detected by off-target interactions. The presented study illustrates the feasibility of using antigen microarrays for integrative, high-throughput validation of various types of binders and antigens.

  13. Fully depleted back illuminated CCD

    DOEpatents

    Holland, Stephen Edward

    2001-01-01

    A backside illuminated charge coupled device (CCD) is formed of a relatively thick high resistivity photon sensitive silicon substrate, with frontside electronic circuitry, and an optically transparent backside ohmic contact for applying a backside voltage which is at least sufficient to substantially fully deplete the substrate. A greater bias voltage which overdepletes the substrate may also be applied. One way of applying the bias voltage to the substrate is by physically connecting the voltage source to the ohmic contact. An alternate way of applying the bias voltage to the substrate is to physically connect the voltage source to the frontside of the substrate, at a point outside the depletion region. Thus both frontside and backside contacts can be used for backside biasing to fully deplete the substrate. Also, high resistivity gaps around the CCD channels and electrically floating channel stop regions can be provided in the CCD array around the CCD channels. The CCD array forms an imaging sensor useful in astronomy.

  14. Posttranslational Modification Assays on Functional Protein Microarrays.

    PubMed

    Neiswinger, Johnathan; Uzoma, Ijeoma; Cox, Eric; Rho, HeeSool; Jeong, Jun Seop; Zhu, Heng

    2016-10-03

    Protein microarray technology provides a straightforward yet powerful strategy for identifying substrates of posttranslational modifications (PTMs) and studying the specificity of the enzymes that catalyze these reactions. Protein microarray assays can be designed for individual enzymes or a mixture to establish connections between enzymes and substrates. Assays for four well-known PTMs-phosphorylation, acetylation, ubiquitylation, and SUMOylation-have been developed and are described here for use on functional protein microarrays. Phosphorylation and acetylation require a single enzyme and are easily adapted for use on an array. The ubiquitylation and SUMOylation cascades are very similar, and the combination of the E1, E2, and E3 enzymes plus ubiquitin or SUMO protein and ATP is sufficient for in vitro modification of many substrates.

  15. Automated Confocal Microscope Bias Correction

    NASA Astrophysics Data System (ADS)

    Dorval, Thierry; Genovesio, Auguste

    2006-10-01

Illumination artifacts systematically occur in 2D cross-section confocal microscopy imaging. This bias can strongly corrupt higher-level image processing such as segmentation, fluorescence evaluation, or pattern extraction/recognition. This paper presents a new fully automated bias-correction methodology based on preprocessing of a large image database. The method is well suited to High Content Screening (HCS), a method dedicated to drug discovery. Our method assumes that the number of available images is large enough to allow a reliable statistical computation of an average bias image. A relevant segmentation evaluation protocol and experimental results validate our correction algorithm, with object extraction on corrected images outperforming that on uncorrected ones.
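The averaging-based bias correction described above can be sketched as follows. This is an illustrative sketch only: the paper's exact estimator and normalization are not specified in the abstract, and the function names are hypothetical.

```python
import numpy as np

def estimate_bias(images):
    """Estimate the illumination bias as the pixel-wise mean over a large image stack."""
    return np.mean(np.stack(images), axis=0)

def correct(image, bias, eps=1e-6):
    """Divide out the estimated bias, rescaling so the mean intensity is preserved."""
    flat = image / (bias + eps)
    return flat * (image.mean() / flat.mean())
```

With enough images, per-image content averages out and only the systematic illumination pattern survives in the mean, which is why the approach needs a large database.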

  16. Fully synthetic taped insulation cables

    DOEpatents

    Forsyth, Eric B.; Muller, Albert C.

    1984-01-01

    A high voltage oil-impregnated electrical cable with fully polymer taped insulation operable to 765 kV. Biaxially oriented, specially processed, polyethylene, polybutene or polypropylene tape with an embossed pattern is wound in multiple layers over a conductive core with a permeable screen around the insulation. Conventional oil which closely matches the dielectric constant of the tape is used, and the cable can be impregnated after field installation because of its excellent impregnation characteristics.

  17. Overview of DNA microarrays: types, applications, and their future.

    PubMed

    Bumgarner, Roger

    2013-01-01

    This unit provides an overview of DNA microarrays. Microarrays are a technology in which thousands of nucleic acids are bound to a surface and are used to measure the relative concentration of nucleic acid sequences in a mixture via hybridization and subsequent detection of the hybridization events. This overview first discusses the history of microarrays and the antecedent technologies that led to their development. This is followed by discussion of the methods of manufacture of microarrays and the most common biological applications. The unit ends with a brief description of the limitations of microarrays and discusses how microarrays are being rapidly replaced by DNA sequencing technologies.

  18. Analysis of High-Throughput ELISA Microarray Data

    SciTech Connect

    White, Amanda M.; Daly, Don S.; Zangar, Richard C.

    2011-02-23

    Our research group develops analytical methods and software for the high-throughput analysis of quantitative enzyme-linked immunosorbent assay (ELISA) microarrays. ELISA microarrays differ from DNA microarrays in several fundamental aspects and most algorithms for analysis of DNA microarray data are not applicable to ELISA microarrays. In this review, we provide an overview of the steps involved in ELISA microarray data analysis and how the statistically sound algorithms we have developed provide an integrated software suite to address the needs of each data-processing step. The algorithms discussed are available in a set of open-source software tools (http://www.pnl.gov/statistics/ProMAT).

  19. The use of microarrays in microbial ecology

    SciTech Connect

    Andersen, G.L.; He, Z.; DeSantis, T.Z.; Brodie, E.L.; Zhou, J.

    2009-09-15

    Microarrays have proven to be a useful and high-throughput method to provide targeted DNA sequence information for up to many thousands of specific genetic regions in a single test. A microarray consists of multiple DNA oligonucleotide probes that, under high stringency conditions, hybridize only to specific complementary nucleic acid sequences (targets). A fluorescent signal indicates the presence and, in many cases, the abundance of genetic regions of interest. In this chapter we will look at how microarrays are used in microbial ecology, especially with the recent increase in microbial community DNA sequence data. Of particular interest to microbial ecologists, phylogenetic microarrays are used for the analysis of phylotypes in a community and functional gene arrays are used for the analysis of functional genes, and, by inference, phylotypes in environmental samples. A phylogenetic microarray that has been developed by the Andersen laboratory, the PhyloChip, will be discussed as an example of a microarray that targets the known diversity within the 16S rRNA gene to determine microbial community composition. Using multiple, confirmatory probes to increase the confidence of detection and a mismatch probe for every perfect match probe to minimize the effect of cross-hybridization by non-target regions, the PhyloChip is able to simultaneously identify any of thousands of taxa present in an environmental sample. The PhyloChip is shown to reveal greater diversity within a community than rRNA gene sequencing due to the placement of the entire gene product on the microarray compared with the analysis of up to thousands of individual molecules by traditional sequencing methods. A functional gene array that has been developed by the Zhou laboratory, the GeoChip, will be discussed as an example of a microarray that dynamically identifies functional activities of multiple members within a community. The recent version of GeoChip contains more than 24,000 50mer
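The perfect-match/mismatch probe scheme described above, with multiple confirmatory probes per taxon, can be illustrated with a minimal scoring sketch. The thresholds and function names here are hypothetical; the actual PhyloChip analysis pipeline is considerably more elaborate.

```python
def probe_positive(pm, mm, min_ratio=1.3, min_diff=50.0):
    """Call a probe pair positive when the perfect-match intensity clearly
    exceeds its mismatch control, limiting cross-hybridization artifacts."""
    return pm / mm >= min_ratio and (pm - mm) >= min_diff

def taxon_detected(pairs, min_fraction=0.9):
    """Call a taxon present when most of its confirmatory probe pairs are positive."""
    positives = sum(probe_positive(pm, mm) for pm, mm in pairs)
    return positives / len(pairs) >= min_fraction
```

Requiring a high fraction of positive pairs is what gives the confirmatory-probe design its robustness: a single cross-hybridizing probe cannot, by itself, declare a taxon present.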

  20. IgE and IgG4 Epitope Mapping of Food Allergens with a Peptide Microarray Immunoassay.

    PubMed

    Martínez-Botas, Javier; de la Hoz, Belén

    2016-01-01

    Peptide microarrays are a powerful tool to identify linear epitopes of food allergens in a high-throughput manner. The main advantages of the microarray-based immunoassay are the possibility to assay thousands of targets simultaneously, the requirement of a low volume of serum, the more robust statistical analysis, and the possibility to test simultaneously several immunoglobulin subclasses. Among them, the last one has a special interest in the field of food allergy, because the development of tolerance to food allergens has been associated with a decrease in IgE and an increase in IgG4 levels against linear epitopes. However, the main limitation to the clinical use of microarray is the automated analysis of the data. Recent studies mapping the linear epitopes of food allergens with peptide microarray immunoassays have identified peptide biomarkers that can be used for early diagnosis of food allergies and to predict their severity or the self-development of tolerance. Using this approach, we have worked on epitope mapping of the two most important food allergens in the Spanish population, cow's milk and chicken eggs. The final aim of these studies is to define subsets of peptides that could be used as biomarkers to improve the diagnosis and prognosis of food allergies. This chapter describes the protocol to produce microarrays using a library of overlapping peptides corresponding to the primary sequences of food allergens and data acquisition and analysis of IgE- and IgG4-binding epitopes.

  1. Fully differential NLO predictions for the rare muon decay

    NASA Astrophysics Data System (ADS)

    Pruna, G. M.; Signer, A.; Ulrich, Y.

    2017-02-01

Using the automation program GoSam, fully differential NLO corrections were obtained for the rare decay of the muon, μ → eνν̄ee. This process is an important Standard Model background to searches by the Mu3e Collaboration for lepton-flavor violation, as it becomes indistinguishable from the signal μ → 3e if the neutrinos carry little energy. With our NLO program we are able to compute the branching ratio as well as custom-tailored observables for the experiment. With minor modifications, related decays of the tau can also be computed.

  2. Technical Advances of the Recombinant Antibody Microarray Technology Platform for Clinical Immunoproteomics

    PubMed Central

    Delfani, Payam; Dexlin Mellby, Linda; Nordström, Malin; Holmér, Andreas; Ohlsson, Mattias; Borrebaeck, Carl A. K.; Wingren, Christer

    2016-01-01

In the quest for deciphering disease-associated biomarkers, high-performing tools for multiplexed protein expression profiling of crude clinical samples will be crucial. Affinity proteomics, mainly represented by antibody-based microarrays, has in recent years been established as a proteomic tool providing unique opportunities for parallelized protein expression profiling. Despite this progress, several main technical features and assay procedures remain to be fully resolved. Among these issues, the handling of protein microarray data, i.e. the biostatistics part, is one of the key features to address. In this study, we have therefore further optimized, validated, and standardized our in-house designed recombinant antibody microarray technology platform. To this end, we addressed the main remaining technical issues (e.g. antibody quality, array production, sample labelling, and selected assay conditions) and, most importantly, key biostatistics subjects (e.g. array data pre-processing and biomarker panel condensation). This represents one of the first antibody array studies in which these key biostatistics subjects have been studied in detail. Here, we thus present the next generation of the recombinant antibody microarray technology platform designed for clinical immunoproteomics. PMID:27414037

  3. MIMAS: an innovative tool for network-based high density oligonucleotide microarray data management and annotation

    PubMed Central

    Hermida, Leandro; Schaad, Olivier; Demougin, Philippe; Descombes, Patrick; Primig, Michael

    2006-01-01

Background The high-density oligonucleotide microarray (GeneChip) is an important tool for molecular biological research aiming at large-scale detection of single nucleotide polymorphisms in DNA and genome-wide analysis of mRNA concentrations. Local array data management solutions are instrumental for efficient processing of the results and for subsequent uploading of data and annotations to a global certified data repository at the EBI (ArrayExpress) or the NCBI (Gene Expression Omnibus). Description To facilitate and accelerate annotation of high-throughput expression profiling experiments, the Microarray Information Management and Annotation System (MIMAS) was developed. The system is fully compliant with the Minimal Information About a Microarray Experiment (MIAME) convention. MIMAS provides life scientists with a highly flexible and focused GeneChip data storage and annotation platform essential for subsequent analysis and interpretation of experimental results with clustering and mining tools. The system software can be downloaded for academic use upon request. Conclusion MIMAS implements a novel concept for nation-wide GeneChip data management whereby a network of facilities is centered on one data node directly connected to the European certified public microarray data repository located at the EBI. The solution proposed may serve as a prototype approach to array data management between research institutes organized in a consortium. PMID:16597336

  4. Automation of Capacity Bidding with an Aggregator Using Open Automated Demand Response

    SciTech Connect

    Kiliccote, Sila; Piette, Mary Ann

    2008-10-01

This report summarizes San Diego Gas & Electric Company's collaboration with the Demand Response Research Center to develop and test automation capability for the Capacity Bidding Program in 2007. The report describes the Open Automated Demand Response architecture and summarizes the history of technology development and pilot studies. It also outlines the Capacity Bidding Program and the technology being used by an aggregator that participated in this demand response program. Due to delays, the program was not fully operational for summer 2007. However, a test event on October 3, 2007, showed that the project successfully achieved the objective to develop and demonstrate how an open, Web-based interoperable automated notification system for capacity bidding can be used by aggregators for demand response. The system was effective in initiating a fully automated demand response shed at the aggregated sites. This project also demonstrated how aggregators can integrate their demand response automation systems with San Diego Gas & Electric Company's Demand Response Automation Server and capacity bidding program.

  5. DISC-BASED IMMUNOASSAY MICROARRAYS. (R825433)

    EPA Science Inventory

    Microarray technology as applied to areas that include genomics, diagnostics, environmental, and drug discovery, is an interesting research topic for which different chip-based devices have been developed. As an alternative, we have explored the principle of compact disc-based...

  6. Raman-based microarray readout: a review.

    PubMed

    Haisch, Christoph

    2016-07-01

    For a quarter of a century, microarrays have been part of the routine analytical toolbox. Label-based fluorescence detection is still the commonest optical readout strategy. Since the 1990s, a continuously increasing number of label-based as well as label-free experiments on Raman-based microarray readout concepts have been reported. This review summarizes the possible concepts and methods and their advantages and challenges. A common label-based strategy is based on the binding of selective receptors as well as Raman reporter molecules to plasmonic nanoparticles in a sandwich immunoassay, which results in surface-enhanced Raman scattering signals of the reporter molecule. Alternatively, capture of the analytes can be performed by receptors on a microarray surface. Addition of plasmonic nanoparticles again leads to a surface-enhanced Raman scattering signal, not of a label but directly of the analyte. This approach is mostly proposed for bacteria and cell detection. However, although many promising readout strategies have been discussed in numerous publications, rarely have any of them made the step from proof of concept to a practical application, let alone routine use. Graphical Abstract Possible realization of a SERS (Surface-Enhanced Raman Scattering) system for microarray readout.

  7. Annotating nonspecific SAGE tags with microarray data.

    PubMed

    Ge, Xijin; Jung, Yong-Chul; Wu, Qingfa; Kibbe, Warren A; Wang, San Ming

    2006-01-01

SAGE (serial analysis of gene expression) detects transcripts by extracting short tags from the transcripts. Because of the limited length, many SAGE tags are shared by transcripts from different genes. Relying on sequence information in the general gene expression database has limited power to solve this problem due to the highly heterogeneous nature of the deposited sequences. Considering that the complexity of gene expression at a single tissue level should be much simpler than that in the general expression database, we reasoned that by restricting gene expression to the tissue level, the accuracy of gene annotation for the nonspecific SAGE tags should be significantly improved. To test the idea, we developed a tissue-specific SAGE annotation database based on microarray data. This database contains microarray expression information represented as UniGene clusters for 73 normal human tissues and 18 cancer tissues and cell lines. The nonspecific SAGE tag is first matched to the database by the same tissue type used by both SAGE and microarray analysis; then the multiple UniGene clusters assigned to the nonspecific SAGE tag are searched in the database under the matched tissue type. The UniGene cluster present solely or at higher expression levels in the database is annotated as representing the specific gene for the nonspecific SAGE tag. The accuracy of gene annotation by this database was largely confirmed by experimental data. Our study shows that microarray data provide a useful source for annotating the nonspecific SAGE tags.
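The matching rule described above, restricting candidate UniGene clusters to the matched tissue and picking the one expressed solely or most highly, can be sketched in a few lines. All data structures and names here are hypothetical simplifications of the database lookup.

```python
def annotate_tag(candidate_clusters, tissue_expression):
    """Annotate a nonspecific SAGE tag: among its candidate UniGene clusters,
    return the one with the highest microarray expression in the matched
    tissue, or None if no candidate is expressed there."""
    expressed = {c: tissue_expression.get(c, 0.0) for c in candidate_clusters}
    best = max(expressed, key=expressed.get)
    return best if expressed[best] > 0 else None
```

For example, a tag shared by two genes resolves to whichever cluster the tissue-matched microarray data shows to be expressed.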

  8. Microarrays (DNA Chips) for the Classroom Laboratory

    ERIC Educational Resources Information Center

    Barnard, Betsy; Sussman, Michael; BonDurant, Sandra Splinter; Nienhuis, James; Krysan, Patrick

    2006-01-01

    We have developed and optimized the necessary laboratory materials to make DNA microarray technology accessible to all high school students at a fraction of both cost and data size. The primary component is a DNA chip/array that students "print" by hand and then analyze using research tools that have been adapted for classroom use. The…

  9. Diagnostic Oligonucleotide Microarray Fingerprinting of Bacillus Isolates

    SciTech Connect

    Chandler, Darrell P.; Alferov, Oleg; Chernov, Boris; Daly, Don S.; Golova, Julia; Perov, Alexander N.; Protic, Miroslava; Robison, Richard; Shipma, Matthew; White, Amanda M.; Willse, Alan R.

    2006-01-01

    A diagnostic, genome-independent microbial fingerprinting method using DNA oligonucleotide microarrays was used for high-resolution differentiation between closely related Bacillus strains, including two strains of Bacillus anthracis that are monomorphic (indistinguishable) via amplified fragment length polymorphism fingerprinting techniques. Replicated hybridizations on 391-probe nonamer arrays were used to construct a prototype fingerprint library for quantitative comparisons. Descriptive analysis of the fingerprints, including phylogenetic reconstruction, is consistent with previous taxonomic organization of the genus. Newly developed statistical analysis methods were used to quantitatively compare and objectively confirm apparent differences in microarray fingerprints with the statistical rigor required for microbial forensics and clinical diagnostics. These data suggest that a relatively simple fingerprinting microarray and statistical analysis method can differentiate between species in the Bacillus cereus complex, and between strains of B. anthracis. A synthetic DNA standard was used to understand underlying microarray and process-level variability, leading to specific recommendations for the development of a standard operating procedure and/or continued technology enhancements for microbial forensics and diagnostics.

  10. MICROARRAY DATA ANALYSIS USING MULTIPLE STATISTICAL MODELS

    EPA Science Inventory

    Microarray Data Analysis Using Multiple Statistical Models

    Wenjun Bao1, Judith E. Schmid1, Amber K. Goetz1, Ming Ouyang2, William J. Welsh2,Andrew I. Brooks3,4, ChiYi Chu3,Mitsunori Ogihara3,4, Yinhe Cheng5, David J. Dix1. 1National Health and Environmental Effects Researc...

  11. Shrinkage covariance matrix approach for microarray data

    NASA Astrophysics Data System (ADS)

    Karjanto, Suryaefiza; Aripin, Rasimah

    2013-04-01

Microarray technology was developed for the purpose of monitoring the expression levels of thousands of genes. A microarray data set typically consists of tens of thousands of genes (variables) from just dozens of samples, due to various constraints including the high cost of producing microarray chips. As a result, the widely used standard covariance estimator is not appropriate for this setting. One affected technique is Hotelling's T2 statistic, a multivariate test statistic for comparing means between two groups. It requires that the number of observations (n) exceeds the number of genes (p), but in microarray studies it is common that n < p, which leads to a biased estimate of the covariance matrix. In this study, Hotelling's T2 statistic with a shrinkage approach is proposed to estimate the covariance matrix for testing differential gene expression. The performance of this approach is then compared with other commonly used multivariate tests using a widely analysed diabetes data set as illustration. The results across the methods are consistent, implying that this approach provides an alternative to existing techniques.
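A shrinkage covariance estimate of the kind described pulls the singular sample covariance toward a well-conditioned target, making the T2 quadratic form computable even when n < p. The sketch below uses a fixed shrinkage intensity toward a scaled identity (a Ledoit–Wolf-style target); the authors' exact estimator and intensity choice are not specified in the abstract, so all of this is illustrative.

```python
import numpy as np

def shrunk_cov(pooled, lam=0.2):
    """Shrink the sample covariance toward a scaled identity target."""
    s = np.cov(pooled, rowvar=False)
    mu = np.trace(s) / s.shape[0]          # average variance sets the target scale
    return (1 - lam) * s + lam * mu * np.eye(s.shape[0])

def shrinkage_t2(x, y, lam=0.2):
    """Hotelling-style T2 for two groups (rows = samples, columns = genes),
    using a shrunk pooled covariance that is invertible even when n < p."""
    nx, ny = len(x), len(y)
    diff = x.mean(axis=0) - y.mean(axis=0)
    pooled = np.vstack([x - x.mean(axis=0), y - y.mean(axis=0)])
    cov = shrunk_cov(pooled, lam)
    return (nx * ny / (nx + ny)) * diff @ np.linalg.solve(cov, diff)
```

Because the shrunk matrix is positive definite for any lam > 0, the statistic remains well defined for p far larger than n, which is exactly the microarray regime.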

  12. Automated Miniaturized Instrument for Space Biology Applications and the Monitoring of the Astronauts Health Onboard the ISS

    NASA Technical Reports Server (NTRS)

    Karouia, Fathi; Peyvan, Kia; Danley, David; Ricco, Antonio J.; Santos, Orlando; Pohorille, Andrew

    2011-01-01

Human space travelers experience a unique environment that affects homeostasis and physiologic adaptation. The spacecraft environment subjects the traveler to noise, chemical and microbiological contaminants, increased radiation, and variable gravity forces. As humans prepare for long-duration missions to the International Space Station (ISS) and beyond, effective measures must be developed, verified and implemented to ensure mission success. Limited biomedical quantitative capabilities are currently available onboard the ISS. Therefore, the development of versatile instruments to perform space biological analysis and to monitor astronauts' health is needed. We are developing a fully automated, miniaturized system for measuring gene expression on small spacecraft in order to better understand the influence of the space environment on biological systems. This low-cost, low-power, multi-purpose instrument represents a major scientific and technological advancement by providing data on cellular metabolism and regulation. The current system will support growth of microorganisms, extract and purify the RNA, hybridize it to the array, read the expression levels of a large number of genes by microarray analysis, and transmit the measurements to Earth. The system will help discover how bacteria develop resistance to antibiotics and how pathogenic bacteria sometimes increase their virulence in space, facilitating the development of adequate countermeasures to decrease risks associated with human spaceflight. The current stand-alone technology could be used as an integrated platform onboard the ISS to perform similar genetic analyses on any biological systems from the tree of life. Additionally, with some modification the system could be implemented to perform real-time in-situ microbial monitoring of the ISS environment (air, surface and water samples) and the astronaut's microbiome using 16S rRNA microarray technology. Furthermore, the current system can be enhanced

  13. A Method of Microarray Data Storage Using Array Data Type

    PubMed Central

    Tsoi, Lam C.; Zheng, W. Jim

    2009-01-01

A well-designed microarray database can provide valuable information on gene expression levels. However, designing an efficient microarray database with minimum space usage is not an easy task, since designers need to integrate the microarray data with information on genes, probe annotation, and the descriptions of each microarray experiment. Developing better methods to store microarray data can greatly improve the efficiency and usefulness of such data. A new schema is proposed to store microarray data by using the array data type in an object-relational database management system, PostgreSQL. The implemented database stores all the microarray data from the same chip in a variable-length array data structure. The implementation of our schema can help to increase data-retrieval and space efficiency. PMID:17392028

  14. PRACTICAL STRATEGIES FOR PROCESSING AND ANALYZING SPOTTED OLIGONUCLEOTIDE MICROARRAY DATA

    EPA Science Inventory

    Thoughtful data analysis is as important as experimental design, biological sample quality, and appropriate experimental procedures for making microarrays a useful supplement to traditional toxicology. In the present study, spotted oligonucleotide microarrays were used to profile...

  15. Examining microarray slide quality for the EPA using SNL's hyperspectral microarray scanner.

    SciTech Connect

    Rohde, Rachel M.; Timlin, Jerilyn Ann

    2005-11-01

    This report summarizes research performed at Sandia National Laboratories (SNL) in collaboration with the Environmental Protection Agency (EPA) to assess microarray quality on arrays from two platforms of interest to the EPA. Custom microarrays from two novel, commercially produced array platforms were imaged with SNL's unique hyperspectral imaging technology and multivariate data analysis was performed to investigate sources of emission on the arrays. No extraneous sources of emission were evident in any of the array areas scanned. This led to the conclusions that either of these array platforms could produce high quality, reliable microarray data for the EPA toxicology programs. Hyperspectral imaging results are presented and recommendations for microarray analyses using these platforms are detailed within the report.

  16. Autonomy and Automation

    NASA Technical Reports Server (NTRS)

    Shively, Jay

    2017-01-01

    A significant level of debate and confusion has surrounded the meaning of the terms autonomy and automation. Automation is a multi-dimensional concept, and we propose that Remotely Piloted Aircraft Systems (RPAS) automation should be described with reference to the specific system and task that has been automated, the context in which the automation functions, and other relevant dimensions. In this paper, we present definitions of automation, pilot in the loop, pilot on the loop and pilot out of the loop. We further propose that in future, the International Civil Aviation Organization (ICAO) RPAS Panel avoids the use of the terms autonomy and autonomous when referring to automated systems on board RPA. Work Group 7 proposes to develop, in consultation with other workgroups, a taxonomy of Levels of Automation for RPAS.

  17. An automated microfluidic platform for C. elegans embryo arraying, phenotyping, and long-term live imaging

    PubMed Central

    Cornaglia, Matteo; Mouchiroud, Laurent; Marette, Alexis; Narasimhan, Shreya; Lehnert, Thomas; Jovaisaite, Virginija; Auwerx, Johan; Gijs, Martin A. M.

    2015-01-01

    Studies of the real-time dynamics of embryonic development require a gentle embryo handling method, the possibility of long-term live imaging during the complete embryogenesis, as well as of parallelization providing a population’s statistics, while keeping single embryo resolution. We describe an automated approach that fully accomplishes these requirements for embryos of Caenorhabditis elegans, one of the most employed model organisms in biomedical research. We developed a microfluidic platform which makes use of pure passive hydrodynamics to run on-chip worm cultures, from which we obtain synchronized embryo populations, and to immobilize these embryos in incubator microarrays for long-term high-resolution optical imaging. We successfully employ our platform to investigate morphogenesis and mitochondrial biogenesis during the full embryonic development and elucidate the role of the mitochondrial unfolded protein response (UPRmt) within C. elegans embryogenesis. Our method can be generally used for protein expression and developmental studies at the embryonic level, but can also provide clues to understand the aging process and age-related diseases in particular. PMID:25950235

  18. An automated microfluidic platform for C. elegans embryo arraying, phenotyping, and long-term live imaging

    NASA Astrophysics Data System (ADS)

    Cornaglia, Matteo; Mouchiroud, Laurent; Marette, Alexis; Narasimhan, Shreya; Lehnert, Thomas; Jovaisaite, Virginija; Auwerx, Johan; Gijs, Martin A. M.

    2015-05-01

    Studies of the real-time dynamics of embryonic development require a gentle embryo handling method, the possibility of long-term live imaging during the complete embryogenesis, as well as of parallelization providing a population’s statistics, while keeping single embryo resolution. We describe an automated approach that fully accomplishes these requirements for embryos of Caenorhabditis elegans, one of the most employed model organisms in biomedical research. We developed a microfluidic platform which makes use of pure passive hydrodynamics to run on-chip worm cultures, from which we obtain synchronized embryo populations, and to immobilize these embryos in incubator microarrays for long-term high-resolution optical imaging. We successfully employ our platform to investigate morphogenesis and mitochondrial biogenesis during the full embryonic development and elucidate the role of the mitochondrial unfolded protein response (UPRmt) within C. elegans embryogenesis. Our method can be generally used for protein expression and developmental studies at the embryonic level, but can also provide clues to understand the aging process and age-related diseases in particular.

  19. Workflow automation architecture standard

    SciTech Connect

    Moshofsky, R.P.; Rohen, W.T.

    1994-11-14

    This document presents an architectural standard for application of workflow automation technology. The standard includes a functional architecture, process for developing an automated workflow system for a work group, functional and collateral specifications for workflow automation, and results of a proof of concept prototype.

  20. Fully analogue photonic reservoir computer.

    PubMed

    Duport, François; Smerieri, Anteo; Akrout, Akram; Haelterman, Marc; Massar, Serge

    2016-03-03

Introduced a decade ago, reservoir computing is an efficient approach for signal processing. State-of-the-art capabilities have already been demonstrated with both computer simulations and physical implementations. While photonic reservoir computing appears to be a promising solution for ultrafast nontrivial computing, all implementations presented up to now require digital pre- or post-processing, which prevents them from exploiting their full potential, in particular in terms of processing speed. We address here the possibility of getting rid of both digital pre- and post-processing simultaneously. The standalone fully analogue reservoir computer resulting from our endeavour is compared to previous experiments and exhibits only rather limited performance degradation. Our experiment constitutes a proof of concept for standalone physical reservoir computers.

  1. Fully analogue photonic reservoir computer

    PubMed Central

    Duport, François; Smerieri, Anteo; Akrout, Akram; Haelterman, Marc; Massar, Serge

    2016-01-01

Introduced a decade ago, reservoir computing is an efficient approach for signal processing. State-of-the-art capabilities have already been demonstrated with both computer simulations and physical implementations. While photonic reservoir computing appears to be a promising solution for ultrafast nontrivial computing, all implementations presented up to now require digital pre- or post-processing, which prevents them from exploiting their full potential, in particular in terms of processing speed. We address here the possibility of getting rid of both digital pre- and post-processing simultaneously. The standalone