2015-01-01
Biological assays formatted as microarrays have become a critical tool for the generation of the comprehensive data sets required for systems-level understanding of biological processes. Manual annotation of data extracted from images of microarrays, however, remains a significant bottleneck, particularly for protein microarrays, due to the sensitivity of this technology to weak artifact signals. To automate the extraction and curation of data from protein microarrays, we describe an algorithm called Crossword that logically combines information from multiple approaches to fully automate microarray segmentation. Automated artifact removal is also accomplished by segregating structured pixels from the background noise using iterative clustering and pixel connectivity. Correlation of the location of structured pixels across image channels is used to identify and remove artifact pixels from the image prior to data extraction. This component improves the accuracy of data sets while reducing the requirement for time-consuming visual inspection of the data. Crossword enables a fully automated protocol that is robust to significant spatial and intensity aberrations. Overall, the average amount of user intervention is reduced by an order of magnitude and the data quality is increased through artifact removal and reduced user variability. The increase in throughput should aid the further implementation of microarray technologies in clinical studies. PMID:24417579
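The artifact-removal idea in the abstract above (structured pixels found by connectivity, then compared across channels) can be sketched as follows. This is a minimal illustration, not the Crossword implementation: the threshold, the minimum blob size, and the choice that channel-exclusive structure marks an artifact are all assumptions made here purely for demonstration; in a real assay, structure shared across channels might instead be the artifact signature.

```python
import numpy as np

def structured_mask(channel, thresh, min_size=4):
    """Pixels above `thresh` belonging to 4-connected blobs of at least
    min_size pixels; smaller blobs are treated as background noise."""
    binary = channel > thresh
    seen = np.zeros_like(binary)
    out = np.zeros_like(binary)
    h, w = binary.shape
    for i in range(h):
        for j in range(w):
            if binary[i, j] and not seen[i, j]:
                stack, blob = [(i, j)], []
                seen[i, j] = True
                while stack:  # flood fill one connected component
                    y, x = stack.pop()
                    blob.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and binary[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            stack.append((ny, nx))
                if len(blob) >= min_size:
                    for y, x in blob:
                        out[y, x] = True
    return out

def artifact_mask(ch_a, ch_b, thresh):
    """Illustrative cross-channel criterion: flag pixels that are
    structured in exactly one of the two channels."""
    return structured_mask(ch_a, thresh) ^ structured_mask(ch_b, thresh)
```

With a synthetic two-channel image, a spot present in both channels survives while a blob present in only one is flagged.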
Fully automated analysis of multi-resolution four-channel micro-array genotyping data
NASA Astrophysics Data System (ADS)
Abbaspour, Mohsen; Abugharbieh, Rafeef; Podder, Mohua; Tebbutt, Scott J.
2006-03-01
We present a fully automated and robust microarray image analysis system for handling multi-resolution images (down to 3-micron resolution, with sizes up to 80 MB per channel). The system is developed to provide rapid and accurate data extraction for our recently developed microarray analysis and quality control tool (SNP Chart). Currently available commercial microarray image analysis applications are inefficient, due to the considerable user interaction typically required. Four-channel DNA microarray technology is a robust and accurate tool for determining genotypes of multiple genetic markers in individuals. It plays an important role in the emerging shift from traditional medical treatment toward personalized genetic medicine, i.e. individualized therapy based on the patient's genetic heritage. However, fast, robust, and precise image processing tools are required for the prospective practical use of microarray-based genetic testing for predicting disease susceptibilities and drug effects in clinical practice, which requires a turn-around time compatible with clinical decision-making. In this paper we present a fully automated image analysis platform for the rapid investigation of hundreds of genetic variations across multiple genes. Validation tests indicate very high accuracy levels for genotyping results. Our method achieves a significant reduction in analysis time, from several hours to just a few minutes, and is completely automated, requiring no manual interaction or guidance.
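The genotyping step this abstract leads up to can be illustrated with a toy caller that turns two allele-specific intensities into a genotype. This is a sketch only: the ratio and minimum-signal cut-offs are invented for the example and are not taken from the paper or from SNP Chart.

```python
def call_genotype(sig_a, sig_b, ratio_cut=3.0, min_total=200.0):
    """Toy genotype caller from two allele-specific spot intensities.
    Both thresholds are illustrative assumptions, not published values."""
    if sig_a + sig_b < min_total:
        return "NC"  # no-call: total signal too weak to trust
    r = (sig_a + 1.0) / (sig_b + 1.0)  # +1 guards against zero signal
    if r >= ratio_cut:
        return "AA"
    if r <= 1.0 / ratio_cut:
        return "BB"
    return "AB"
```

A strong A-channel signal calls homozygous AA, balanced signals call heterozygous AB, and weak spots are left uncalled rather than guessed.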
Bingle, Lynne; Fonseca, Felipe P; Farthing, Paula M
2017-01-01
Tissue microarrays were first constructed in the 1980s but were used by only a limited number of researchers for a considerable period of time. In the last 10 years there has been a dramatic increase in the number of publications describing the successful use of tissue microarrays in studies aimed at discovering and validating biomarkers. This, along with the increased availability of both manual and automated microarray builders on the market, has encouraged even greater use of this novel and powerful tool. This chapter describes the basic techniques required to build a tissue microarray using a manual method in order that the theory behind the practical steps can be fully explained. Guidance is given to ensure potential disadvantages of the technique are fully considered.
NASA Astrophysics Data System (ADS)
Liu, Robin H.; Lodes, Mike; Fuji, H. Sho; Danley, David; McShea, Andrew
Microarray assays typically involve multistage sample processing and fluidic handling, which are generally labor-intensive and time-consuming. Automation of these processes would improve robustness, reduce run-to-run and operator-to-operator variation, and reduce costs. In this chapter, a fully integrated and self-contained microfluidic biochip device that has been developed to automate the fluidic handling steps for microarray-based gene expression or genotyping analysis is presented. The device consists of a semiconductor-based CustomArray® chip with 12,000 features and a microfluidic cartridge. The CustomArray was manufactured using a semiconductor-based in situ synthesis technology. The microfluidic cartridge consists of microfluidic pumps, mixers, valves, fluid channels, and reagent storage chambers. Microarray hybridization and subsequent fluidic handling and reactions (including a number of washing and labeling steps) were performed in this fully automated and miniature device before fluorescent image scanning of the microarray chip. Electrochemical micropumps were integrated in the cartridge to provide pumping of liquid solutions. A micromixing technique based on gas bubbling generated by electrochemical micropumps was developed. Low-cost check valves were implemented in the cartridge to prevent cross-talk of the stored reagents. Gene expression study of the human leukemia cell line (K562) and genotyping detection and sequencing of influenza A subtypes have been demonstrated using this integrated biochip platform. For gene expression assays, the microfluidic CustomArray device detected sample RNAs with a concentration as low as 0.375 pM. Detection was quantitative over more than three orders of magnitude. Experiments also showed that chip-to-chip variability was low, indicating that the integrated microfluidic devices eliminate manual fluidic handling steps that can be a significant source of variability in genomic analysis.
The genotyping results showed that the device identified influenza A hemagglutinin and neuraminidase subtypes and sequenced portions of both genes, demonstrating the potential of integrated microfluidic and microarray technology for multiple virus detection. The device provides a cost-effective solution to eliminate labor-intensive and time-consuming fluidic handling steps and allows microarray-based DNA analysis in a rapid and automated fashion.
Gene ARMADA: an integrated multi-analysis platform for microarray data implemented in MATLAB.
Chatziioannou, Aristotelis; Moulos, Panagiotis; Kolisis, Fragiskos N
2009-10-27
The microarray data analysis realm is ever growing through the development of various tools, open source and commercial. However, there is an absence of predefined, rational algorithmic analysis workflows or standardized batch processing that incorporates all steps, from raw data import up to the derivation of significantly differentially expressed gene lists. This absence obfuscates the analytical procedure and obstructs large-scale comparative processing of genomic microarray datasets. Moreover, the solutions provided depend heavily on the programming skills of the user, whereas GUI-embedded solutions do not provide direct support for various raw image analysis formats or a versatile yet flexible combination of signal processing methods. We describe here Gene ARMADA (Automated Robust MicroArray Data Analysis), a MATLAB-implemented platform with a Graphical User Interface. This suite integrates all steps of microarray data analysis, including automated data import, noise correction and filtering, normalization, statistical selection of differentially expressed genes, clustering, classification and annotation. In its current version, Gene ARMADA fully supports two-colour cDNA and Affymetrix oligonucleotide arrays, plus custom arrays for which experimental details are given in tabular form (Excel spreadsheet, comma separated values, or tab-delimited text formats). It also supports the analysis of already processed results through its versatile import editor. Besides being fully automated, Gene ARMADA incorporates numerous functionalities of the Statistics and Bioinformatics Toolboxes of MATLAB. In addition, it provides numerous visualization and exploration tools, plus customizable export data formats for seamless integration with other analysis tools or MATLAB for further processing. Gene ARMADA requires MATLAB 7.4 (R2007a) or higher and is also distributed as a stand-alone application with MATLAB Component Runtime.
Gene ARMADA provides a highly adaptable, integrative, yet flexible tool which can be used for automated quality control, analysis, annotation and visualization of microarray data, constituting a starting point for further data interpretation and integration with numerous other tools.
Hansen, Christian Skjødt; Østerbye, Thomas; Marcatili, Paolo; Lund, Ole; Buus, Søren; Nielsen, Morten
2017-01-01
Identification of epitopes targeted by antibodies (B cell epitopes) is of critical importance for the development of many diagnostic and therapeutic tools. For clinical usage, such epitopes must be extensively characterized in order to validate specificity and to document potential cross-reactivity. B cell epitopes are typically classified as either linear epitopes, i.e. short consecutive segments of the protein sequence, or conformational epitopes formed through native protein folding. Recent advances in high-density peptide microarrays enable high-throughput, high-resolution identification and characterization of linear B cell epitopes. Using exhaustive amino acid substitution analysis of peptides originating from target antigens, these microarrays can be used to address the specificity of polyclonal antibodies raised against such antigens, which may contain hundreds of epitopes. However, the interpretation of the data provided by such large-scale screenings is far from trivial and in most cases requires advanced computational and statistical skills. Here, we present an online application for automated identification of linear B cell epitopes, allowing the non-expert user to analyse peptide microarray data. The application takes as input quantitative peptide data of fully or partially substituted overlapping peptides from a given antigen sequence, identifies epitope residues (residues that are significantly affected by substitutions) and visualizes the selectivity towards each residue by sequence logo plots. Demonstrating utility, the application was used to identify and address the antibody specificity of 18 linear epitope regions in Human Serum Albumin (HSA), using peptide microarray data consisting of fully substituted peptides spanning the entire sequence of HSA, incubated with polyclonal rabbit anti-HSA (and mouse anti-rabbit-Cy3). The application is made available at: www.cbs.dtu.dk/services/ArrayPitope. PMID:28095436
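The core idea of calling epitope residues from substitution data can be sketched in a few lines: a position is a candidate epitope residue when substituting it substantially reduces binding signal. This is a simplified stand-in for the tool's statistics; the median-based rule and the 50% drop cut-off are assumptions for illustration, not the ArrayPitope criterion.

```python
import statistics

def epitope_positions(wt_signal, sub_signals, drop=0.5):
    """sub_signals maps position -> list of signals for peptides substituted
    at that position. A position is called when the median substituted
    signal falls below drop * wild-type signal (illustrative cut-off)."""
    hits = []
    for pos, sigs in sorted(sub_signals.items()):
        if statistics.median(sigs) < drop * wt_signal:
            hits.append(pos)
    return hits
```

For example, a position whose substitutions collapse the signal is reported, while positions tolerating substitution are not.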
Analytical Protein Microarrays: Advancements Towards Clinical Applications
Sauer, Ursula
2017-01-01
Protein microarrays represent a powerful technology with the potential to serve as tools for the detection of a broad range of analytes in numerous applications such as diagnostics, drug development, food safety, and environmental monitoring. Key features of analytical protein microarrays include high throughput and relatively low costs due to minimal reagent consumption, multiplexing, fast kinetics and hence rapid measurements, and the possibility of functional integration. So far, mainly fundamental studies in molecular and cell biology have been conducted using protein microarrays, while the potential for clinical, notably point-of-care, applications is not yet fully utilized. The question arises as to which features have to be implemented and which improvements have to be made in order to fully exploit the technology. In the past we have identified various obstacles that have to be overcome in order to promote protein microarray technology in the diagnostic field. Issues that need significant improvement to make the technology more attractive for the diagnostic market include insufficient sensitivity and reproducibility, inadequate analysis times, a lack of high-quality antibodies and validated reagents, a lack of automation and portable instruments, and the cost of the instruments necessary for chip production and read-out. The scope of the paper at hand is to review approaches to solving these problems. PMID:28146048
Fully Automated Complementary DNA Microarray Segmentation using a Novel Fuzzy-based Algorithm.
Saberkari, Hamidreza; Bahrami, Sheyda; Shamsi, Mousa; Amoshahy, Mohammad Javad; Ghavifekr, Habib Badri; Sedaaghi, Mohammad Hossein
2015-01-01
DNA microarrays are a powerful approach for studying the expression of thousands of genes simultaneously in a single experiment. In a microarray experiment, the average fluorescent intensity of each spot is calculated; these intensity values closely track the expression level of the corresponding gene. However, determining the appropriate position of every spot in microarray images is a main challenge, one which underpins the accurate classification of normal and abnormal (cancer) cells. In this paper, a preprocessing step is first performed to eliminate the noise and artifacts present in microarray images using the nonlinear anisotropic diffusion filtering method. Then, the coordinate center of each spot is located using mathematical morphology operations. Finally, the position of each spot is exactly determined by applying a novel hybrid model based on principal component analysis and the spatial fuzzy c-means clustering (SFCM) algorithm. Using a Gaussian kernel in the SFCM algorithm improves the quality of complementary DNA microarray segmentation. The performance of the proposed algorithm has been evaluated on real microarray images available in the Stanford Microarray Database. Results illustrate that the accuracy of microarray cell segmentation with the proposed algorithm reaches 100% and 98% for noiseless and noisy cells, respectively.
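The clustering core of the method above can be illustrated with plain fuzzy c-means on a 1-D intensity vector. This is only the base FCM update; the paper's spatial, Gaussian-kernelised variant adds neighbourhood terms not modelled here, and the parameters below are generic defaults.

```python
import numpy as np

def fuzzy_cmeans(x, c=2, m=2.0, iters=50, seed=0):
    """Base fuzzy c-means on a 1-D intensity vector.
    Returns cluster centers and the (c x n) membership matrix."""
    rng = np.random.default_rng(seed)
    u = rng.random((c, x.size))
    u /= u.sum(axis=0)                      # memberships sum to 1 per point
    for _ in range(iters):
        um = u ** m
        centers = (um @ x) / um.sum(axis=1)  # weighted means
        d = np.abs(x[None, :] - centers[:, None]) + 1e-12
        u = 1.0 / (d ** (2.0 / (m - 1.0)))   # standard FCM membership update
        u /= u.sum(axis=0)
    return centers, u
```

On well-separated foreground/background intensities, the two centers land near the two intensity modes.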
Howat, William J; Blows, Fiona M; Provenzano, Elena; Brook, Mark N; Morris, Lorna; Gazinska, Patrycja; Johnson, Nicola; McDuffus, Leigh‐Anne; Miller, Jodi; Sawyer, Elinor J; Pinder, Sarah; van Deurzen, Carolien H M; Jones, Louise; Sironen, Reijo; Visscher, Daniel; Caldas, Carlos; Daley, Frances; Coulson, Penny; Broeks, Annegien; Sanders, Joyce; Wesseling, Jelle; Nevanlinna, Heli; Fagerholm, Rainer; Blomqvist, Carl; Heikkilä, Päivi; Ali, H Raza; Dawson, Sarah‐Jane; Figueroa, Jonine; Lissowska, Jolanta; Brinton, Louise; Mannermaa, Arto; Kataja, Vesa; Kosma, Veli‐Matti; Cox, Angela; Brock, Ian W; Cross, Simon S; Reed, Malcolm W; Couch, Fergus J; Olson, Janet E; Devillee, Peter; Mesker, Wilma E; Seyaneve, Caroline M; Hollestelle, Antoinette; Benitez, Javier; Perez, Jose Ignacio Arias; Menéndez, Primitiva; Bolla, Manjeet K; Easton, Douglas F; Schmidt, Marjanka K; Pharoah, Paul D; Sherman, Mark E
2014-01-01
Abstract Breast cancer risk factors and clinical outcomes vary by tumour marker expression. However, individual studies often lack the power required to assess these relationships, and large‐scale analyses are limited by the need for high throughput, standardized scoring methods. To address these limitations, we assessed whether automated image analysis of immunohistochemically stained tissue microarrays can permit rapid, standardized scoring of tumour markers from multiple studies. Tissue microarray sections prepared in nine studies containing 20 263 cores from 8267 breast cancers stained for two nuclear (oestrogen receptor, progesterone receptor), two membranous (human epidermal growth factor receptor 2 and epidermal growth factor receptor) and one cytoplasmic (cytokeratin 5/6) marker were scanned as digital images. Automated algorithms were used to score markers in tumour cells using the Ariol system. We compared automated scores against visual reads, and their associations with breast cancer survival. Approximately 65–70% of tissue microarray cores were satisfactory for scoring. Among satisfactory cores, agreement between dichotomous automated and visual scores was highest for oestrogen receptor (Kappa = 0.76), followed by human epidermal growth factor receptor 2 (Kappa = 0.69) and progesterone receptor (Kappa = 0.67). Automated quantitative scores for these markers were associated with hazard ratios for breast cancer mortality in a dose‐response manner. Considering visual scores of epidermal growth factor receptor or cytokeratin 5/6 as the reference, automated scoring achieved excellent negative predictive value (96–98%), but yielded many false positives (positive predictive value = 30–32%). For all markers, we observed substantial heterogeneity in automated scoring performance across tissue microarrays. 
Automated analysis is a potentially useful tool for large‐scale, quantitative scoring of immunohistochemically stained tissue microarrays available in consortia. However, continued optimization, rigorous marker‐specific quality control measures and standardization of tissue microarray designs, staining and scoring protocols are needed to enhance results. PMID:27499890
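The agreement statistic reported above (Cohen's kappa between dichotomous automated and visual scores) is straightforward to compute; a minimal sketch for 0/1 score vectors:

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two dichotomous (0/1) score vectors of equal length:
    observed agreement corrected for agreement expected by chance."""
    assert len(a) == len(b) and len(a) > 0
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n        # observed agreement
    pa1, pb1 = sum(a) / n, sum(b) / n                 # positive rates
    pe = pa1 * pb1 + (1 - pa1) * (1 - pb1)            # chance agreement
    return (po - pe) / (1 - pe)
```

Perfect agreement gives kappa = 1, and chance-level agreement gives kappa near 0.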
Automated detection and quantitation of bacterial RNA by using electrical microarrays.
Elsholz, B; Wörl, R; Blohm, L; Albers, J; Feucht, H; Grunwald, T; Jürgen, B; Schweder, T; Hintsche, Rainer
2006-07-15
Low-density electrical 16S rRNA-specific oligonucleotide microarrays and an automated analysis system have been developed for the identification and quantitation of pathogens. The pathogens are Escherichia coli, Pseudomonas aeruginosa, Enterococcus faecalis, Staphylococcus aureus, and Staphylococcus epidermidis, which are typically involved in urinary tract infections. Interdigitated gold array electrodes (IDA electrodes), which have structures in the nanometer range, have been used for very sensitive analysis. Thiol-modified oligonucleotides are immobilized on the gold IDA as capture probes. They mediate the specific recognition of the target 16S rRNA by hybridization. Additionally, three unlabeled oligonucleotides are hybridized in close proximity to the capturing site; these supporting molecules improve RNA hybridization at the capturing site. A biotin-labeled detector oligonucleotide is also allowed to hybridize to the captured RNA sequence. The biotin labels enable the binding of avidin-alkaline phosphatase conjugates. The phosphatase liberates the electrochemical mediator p-aminophenol from its electrically inactive phosphate derivative. The electrical signals were generated by amperometric redox cycling and detected by a unique multipotentiostat. The read-out signals of the microarray are position-specific currents that change over time in proportion to the analyte concentration. If two additional biotins are introduced into the affinity binding complex via the supporting oligonucleotides, the sensitivity of the assay increases by more than 60%. The limit of detection of Escherichia coli total RNA has been determined to be 0.5 ng/µL. The control of fluidics for variable assay formats, as well as the multichannel electrical read-out and data handling, have all been fully automated. The fast and easy procedure does not require any amplification of the targeted nucleic acids by PCR.
Autonomous system for Web-based microarray image analysis.
Bozinov, Daniel
2003-12-01
Software-based feature extraction from DNA microarray images still requires human intervention on various levels. Manual adjustment of grid and metagrid parameters, precise alignment of superimposed grid templates and gene spots, or simply identification of large-scale artifacts have to be performed beforehand to reliably analyze DNA signals and correctly quantify their expression values. Ideally, a Web-based system with input solely confined to a single microarray image, and a data table as output containing measurements for all gene spots, would directly transform raw image data into abstracted gene expression tables. Sophisticated algorithms with iterative correction procedures can overcome the inherent challenges in image processing. Herein is introduced an integrated software system with a Java-based interface on the client side that allows for decentralized access and furthermore enables the scientist to instantly employ the most up-to-date software version at any given time. This software tool extends PixClust, as used in Extractiff, and incorporates Java Web Start deployment technology. Ultimately, this setup is destined for high-throughput pipelines in genome-wide medical diagnostics labs or microarray core facilities aimed at providing fully automated service to their users.
Cellular neural networks, the Navier-Stokes equation, and microarray image reconstruction.
Zineddin, Bachar; Wang, Zidong; Liu, Xiaohui
2011-11-01
Although the last decade has witnessed a great deal of improvement in microarray technology, major developments are still needed in all the main stages of this technology, including image processing. Some hardware implementations of microarray image processing have been proposed in the literature and proved to be promising alternatives to the currently available software systems. However, the main drawback of those approaches is that they do not address quantification of the gene spot realistically, i.e. without assumptions about the image surface. Our aim in this paper is to present a new image-reconstruction algorithm using a cellular neural network that solves the Navier-Stokes equation. This algorithm offers a robust method for estimating the background signal within the gene-spot region. The MATCNN toolbox for MATLAB is used to test the proposed method. Quantitative comparisons, in terms of objective criteria, are carried out between our approach and other available methods. It is shown that the proposed algorithm gives highly accurate and realistic measurements in a fully automated manner within a remarkably efficient time.
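The background-estimation task described above (reconstructing the signal surface under a spot from its surroundings) can be illustrated with harmonic inpainting, a much simpler relative of Navier-Stokes-based image reconstruction: pixels inside the spot mask are repeatedly replaced by the average of their four neighbours while the surrounding background stays fixed. This is a stand-in for illustration only, not the paper's cellular-neural-network solver.

```python
import numpy as np

def estimate_background(img, spot_mask, iters=500):
    """Fill the masked spot region by Jacobi iteration of the Laplace
    equation (4-neighbour averaging), keeping unmasked pixels fixed.
    A simplified stand-in for Navier-Stokes-style reconstruction."""
    bg = img.astype(float).copy()
    for _ in range(iters):
        avg = 0.25 * (np.roll(bg, 1, 0) + np.roll(bg, -1, 0)
                      + np.roll(bg, 1, 1) + np.roll(bg, -1, 1))
        bg[spot_mask] = avg[spot_mask]  # only masked pixels are updated
    return bg
```

On a flat background with a bright spot, the reconstruction converges back to the background level inside the mask.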
Nucleosome positioning from tiling microarray data.
Yassour, Moran; Kaplan, Tommy; Jaimovich, Ariel; Friedman, Nir
2008-07-01
The packaging of DNA around nucleosomes in eukaryotic cells plays a crucial role in the regulation of gene expression and other DNA-related processes. To better understand the regulatory role of nucleosomes, it is important to pinpoint their positions at high (5-10 bp) resolution. Toward this end, several recent works used dense tiling arrays to map nucleosomes in a high-throughput manner. These data were then parsed and hand-curated, and the positions of nucleosomes were assessed. In this manuscript, we present a fully automated algorithm to analyze such data and predict the exact location of nucleosomes. We introduce a method, based on a probabilistic graphical model, to increase the resolution of our predictions even beyond that of the microarray used. We show how to build such a model and how to compile it into a simple hidden Markov model, allowing for fast and accurate inference of nucleosome positions. We applied our model to nucleosomal data from mid-log yeast cells reported by Yuan et al. and compared our predictions to those of the original paper; to a more recent method that uses five-times-denser tiling arrays, as described by Lee et al.; and to a curated set of literature-based nucleosome positions. Our results suggest that by applying our algorithm to the same data used by Yuan et al., our fully automated model traced 13% more nucleosomes and increased the overall accuracy by about 20%. We believe that such an improvement opens the way for a better understanding of the regulatory mechanisms controlling gene expression, and of how they are encoded in the DNA.
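The compile-to-HMM step can be illustrated with a minimal two-state model (linker vs. nucleosome) decoded by Viterbi over probe intensities. All parameters below (Gaussian emissions, the transition matrix, the state means) are invented for the example; the paper's model is richer and its parameters come from the data.

```python
import math

def viterbi(obs, trans, means, sd=1.0):
    """Two-state Viterbi (0 = linker, 1 = nucleosome) over probe
    log-intensities with Gaussian emissions. Illustrative parameters."""
    def ll(x, mu):  # Gaussian log-likelihood
        return -0.5 * ((x - mu) / sd) ** 2 - math.log(sd * math.sqrt(2 * math.pi))
    states = (0, 1)
    v = [math.log(0.5) + ll(obs[0], means[s]) for s in states]
    back = []
    for x in obs[1:]:
        # best predecessor for each state, then update scores
        ptr = [max(states, key=lambda p, s=s: v[p] + math.log(trans[p][s]))
               for s in states]
        v = [v[ptr[s]] + math.log(trans[ptr[s]][s]) + ll(x, means[s])
             for s in states]
        back.append(ptr)
    path = [max(states, key=lambda s: v[s])]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return path[::-1]
```

A run of high-intensity probes flanked by low ones decodes as a nucleosome call in the middle.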
van der Logt, Elise M. J.; Kuperus, Deborah A. J.; van Setten, Jan W.; van den Heuvel, Marius C.; Boers, James. E.; Schuuring, Ed; Kibbelaar, Robby E.
2015-01-01
HER2 assessment is routinely used to select patients with invasive breast cancer who might benefit from HER2-targeted therapy. The aim of this study was to validate a fully automated in situ hybridization (ISH) procedure that combines the automated Leica HER2 fluorescent ISH system for Bond with supervised automated analysis on the Visia Imaging D-Sight digital imaging platform. HER2 assessment was performed on 328 formalin-fixed/paraffin-embedded invasive breast cancer tumors on tissue microarrays (TMA) and 100 full-sized slides (50 selected IHC 2+ and 50 with random IHC scores) of resections/biopsies previously obtained for diagnostic purposes. For digital analysis, slides were pre-screened at 20x and 100x magnification for all fluorescent signals, and supervised automated scoring was performed on at least two pictures (in total, at least 20 nuclei were counted) with the D-Sight HER2 FISH analysis module by two observers independently. Results were compared to data obtained previously with the manual Abbott FISH test. The overall agreement with Abbott FISH data among TMA samples and the 50 selected IHC 2+ cases was 98.8% (κ = 0.94) and 93.8% (κ = 0.88), respectively. The results of the 50 additionally tested unselected IHC cases were concordant with previously obtained IHC and/or FISH data. The combination of the Leica FISH system with the D-Sight digital imaging platform is a feasible method for HER2 assessment in routine clinical practice for patients with invasive breast cancer. PMID:25844540
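The scoring step behind such HER2 FISH assessment is commonly based on the HER2/CEP17 signal ratio across counted nuclei; a minimal sketch of that criterion follows. The cut-off of 2.0 is the widely used ratio threshold, but the full ASCO/CAP guideline has additional clauses (e.g. mean HER2 copy number) not modelled here, and this is not the D-Sight module's implementation.

```python
def her2_status(her2_counts, cep17_counts, ratio_cut=2.0):
    """Classify HER2 amplification from per-nucleus signal counts using
    the HER2/CEP17 ratio criterion (simplified; guideline has more rules)."""
    ratio = sum(her2_counts) / sum(cep17_counts)
    status = "amplified" if ratio >= ratio_cut else "not amplified"
    return status, ratio
```

For example, many HER2 signals per nucleus against a diploid CEP17 count yields an amplified call.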
Automated, Miniaturized Instrument for Measuring Gene Expression in Space
NASA Technical Reports Server (NTRS)
Pohorille, A.; Peyvan, K.; Danley, D.; Ricco, A. J.
2010-01-01
To facilitate astrobiological studies of the survival and adaptation of microorganisms and mixed microbial cultures to the space environment, we have been developing a fully automated, miniaturized system for measuring their gene expression on small spacecraft. This low-cost, multi-purpose instrument represents a major scientific and technological advancement in our ability to study the impact of the space environment on biological systems by providing data on cellular metabolism and regulation orders of magnitude richer than what is currently available. The system supports growth of the organism, lyses it to release the expressed RNA, labels the RNA, reads the expression levels of a large number of genes by microarray analysis of the labeled RNA, and transmits the measurements to Earth. To measure gene expression we use microarray technology developed by CombiMatrix, which is based on electrochemical reactions on arrays of electrodes on a semiconductor substrate. Since the electrical integrity of the microarray remains intact after probe synthesis, the circuitry can be employed to sense nucleic acid binding at each electrode. CombiMatrix arrays can be sectored to allow multiple samples per chip. In addition, a single array can be used for several assays. The array has been integrated into an automated microfluidic cartridge that uses flexible reagent blisters and pinch pumping to move liquid reagents between chambers. The proposed instrument will help to understand adaptation of terrestrial life to conditions beyond the planet of origin, identify deleterious effects of the space environment, develop effective countermeasures against these effects, and test our ability to sustain and grow in space organisms that can be used for life support and in situ resource utilization during long-duration space exploration. The instrument is suitable for small satellite platforms, which provide frequent, low-cost access to space. 
It can be also used on any other platform in space, including the ISS. It can be replicated and used with only small modifications in multiple biological experiments with a broad range of goals in mind.
Automated, Miniaturized Instrument for Measuring Gene Expression in Space
NASA Astrophysics Data System (ADS)
Pohorille, Andrew; Danley, David; Payvan, Kia; Ricco, Antonio
To facilitate astrobiological studies on the survival and adaptation of microorganisms and mixed microbial cultures to space environment, we have been developing a fully automated, minia-turized system for measuring their gene expression on small spacecraft. This low-cost, multi-purpose instrument represents a major scientific and technological advancement in our ability to study the impact of the space environment on biological systems by providing data on cel-lular metabolism and regulation orders of magnitude richer than what is currently available. The system supports growth of the organism, lyse it to release the expressed RNA, label the RNA, read the expression levels of a large number of genes by microarray analysis of labeled RNA and transmit the measurements to Earth. To measure gene expression we use microarray technology developed by CombiMatrix, which is based on electrochemical reactions on arrays of electrodes on a semiconductor substrate. Since the electrical integrity of the microarray re-mains intact after probe synthesis, the circuitry can be employed to sense nucleic acid binding at each electrode. CombiMatrix arrays can be sectored to allow multiple samples per chip. In addition, a single array can be used for several assays. The array has been integrated into an automated microfluidic cartridge that uses flexible reagent blisters and pinch pumping to move liquid reagents between chambers. The proposed instrument will help to understand adaptation of terrestrial life to conditions be-yond the planet of origin, identify deleterious effects of the space environment, develop effective countermeasures against these effects, and test our ability to sustain and grow in space organ-isms that can be used for life support and in situ resource utilization during long-duration space exploration. The instrument is suitable for small satellite platforms, which provide frequent, low cost access to space. 
It can also be used on any other platform in space, including the ISS. It can be replicated and used with only small modifications in multiple biological experiments with a broad range of goals in mind.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chandler, Darrell P.; Brown, Jeremy D.; Call, Douglas R.
2001-09-01
We describe the development and application of a novel electromagnetic flow cell and fluidics system for automated immunomagnetic separation of E. coli directly from unprocessed poultry carcass rinse, and the biochemical coupling of automated sample preparation with nucleic acid microarrays without cell growth. Highly porous nickel foam was used as a magnetic flux conductor. Up to 32% recovery efficiency of 'total' E. coli was achieved within the automated system with 6 s contact times and a 15-minute protocol (from sample injection through elution), statistically similar to cell recovery efficiencies in > 1 hour 'batch' captures. The electromagnet flow cell allowed complete recovery of 2.8 μm particles directly from unprocessed poultry carcass rinse, whereas the batch system did not. O157:H7 cells were reproducibly isolated directly from unprocessed poultry rinse with 39% recovery efficiency at a 10³ cells ml⁻¹ inoculum. Direct plating of washed beads showed positive recovery of O157:H7 directly from carcass rinse at an inoculum of 10 cells ml⁻¹. Recovered beads were used for direct PCR amplification and microarray detection, with a process-level detection limit (automated cell concentration through microarray detection) of < 10³ cells ml⁻¹ carcass rinse. The fluidic system and analytical approach described here are generally applicable to most microbial detection problems and applications.
Erickson, A; Fisher, M; Furukawa-Stoffer, T; Ambagala, A; Hodko, D; Pasick, J; King, D P; Nfon, C; Ortega Polo, R; Lung, O
2018-04-01
Microarray technology can be useful for pathogen detection as it allows simultaneous interrogation of the presence or absence of a large number of genetic signatures. However, most microarray assays are labour-intensive and time-consuming to perform. This study describes the development and initial evaluation of a multiplex reverse transcription (RT)-PCR and novel accompanying automated electronic microarray assay for simultaneous detection and differentiation of seven important viruses that affect swine (foot-and-mouth disease virus [FMDV], swine vesicular disease virus [SVDV], vesicular exanthema of swine virus [VESV], African swine fever virus [ASFV], classical swine fever virus [CSFV], porcine respiratory and reproductive syndrome virus [PRRSV] and porcine circovirus type 2 [PCV2]). The novel electronic microarray assay utilizes a single, user-friendly instrument that integrates and automates capture probe printing, hybridization, washing and reporting on a disposable electronic microarray cartridge with 400 features. This assay accurately detected and identified a total of 68 isolates of the seven targeted virus species, including 23 samples of FMDV, representing all seven serotypes, and 10 CSFV strains, representing all three genotypes. The assay successfully detected viruses in clinical samples from the field, experimentally infected animals (as early as 1 day post-infection (dpi) for FMDV and SVDV, 4 dpi for ASFV, 5 dpi for CSFV), as well as in biological material that was spiked with target viruses. The limit of detection was 10 copies/μl for ASFV, PCV2 and PRRSV, 100 copies/μl for SVDV, CSFV and VESV, and 1,000 copies/μl for FMDV. The electronic microarray component had reduced analytical sensitivity for several of the target viruses when compared with the multiplex RT-PCR.
The integration of capture probe printing allows custom onsite array printing as needed, while electrophoretically driven hybridization generates results faster than conventional microarrays that rely on passive hybridization. With further refinement, this novel, rapid, highly automated microarray technology has potential applications in multipathogen surveillance of livestock diseases. © 2017 Her Majesty the Queen in Right of Canada • Transboundary and Emerging Diseases.
Dallabernardina, Pietro; Ruprecht, Colin; Smith, Peter J; Hahn, Michael G; Urbanowicz, Breeanna R; Pfrengle, Fabian
2017-12-06
We report the automated glycan assembly of oligosaccharides related to the plant cell wall hemicellulosic polysaccharide xyloglucan. The synthesis of galactosylated xyloglucan oligosaccharides was enabled by introducing p-methoxybenzyl (PMB) as a temporary protecting group for automated glycan assembly. The generated oligosaccharides were printed as microarrays, and the binding of a collection of xyloglucan-directed monoclonal antibodies (mAbs) to the oligosaccharides was assessed. We also demonstrated that the printed glycans can be further enzymatically modified while appended to the microarray surface by Arabidopsis thaliana xyloglucan xylosyltransferase 2 (AtXXT2).
Goldman, Mindy; Núria, Núria; Castilho, Lilian M
2015-01-01
Automated testing platforms facilitate the introduction of red cell genotyping of patients and blood donors. Fluidic microarray systems, such as Luminex XMAP (Austin, TX), are used in many clinical applications, including HLA and HPA typing. The Progenika ID CORE XT (Progenika Biopharma-Grifols, Bizkaia, Spain) uses this platform to analyze 29 polymorphisms determining 37 antigens in 10 blood group systems. Once DNA has been extracted, processing time is approximately 4 hours. The system is highly automated and includes integrated analysis software that produces a file and a report with genotype and predicted phenotype results.
Zhu, Yuerong; Zhu, Yuelin; Xu, Wei
2008-01-01
Background Though microarray experiments are very popular in life science research, managing and analyzing microarray data are still challenging tasks for many biologists. Most microarray programs require users to have sophisticated knowledge of mathematics and statistics, as well as computer skills. With accumulating microarray data deposited in public databases, easy-to-use programs to re-analyze previously published microarray data are in high demand. Results EzArray is a web-based Affymetrix expression array data management and analysis system for researchers who need to organize microarray data efficiently and get data analyzed instantly. EzArray organizes microarray data into projects that can be analyzed online with predefined or custom procedures. EzArray performs data preprocessing and detection of differentially expressed genes with statistical methods. All analysis procedures are optimized and highly automated so that even novice users with limited prior knowledge of microarray data analysis can complete initial analysis quickly. Since all input files, analysis parameters, and executed scripts can be downloaded, EzArray provides maximum reproducibility for each analysis. In addition, EzArray integrates with Gene Expression Omnibus (GEO) and allows instantaneous re-analysis of published array data. Conclusion EzArray is a novel Affymetrix expression array data analysis and sharing system. EzArray provides easy-to-use tools for re-analyzing published microarray data and will help both novice and experienced users perform initial analysis of their microarray data from the location of data storage. We believe EzArray will be a useful system for facilities with microarray services and laboratories with multiple members involved in microarray data analysis. EzArray is freely available from . PMID:18218103
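The differential-expression step that EzArray automates can be illustrated with a minimal sketch. This is an assumed, generic implementation (Welch's t statistic plus a log2 fold-change cutoff over a gene-by-sample table of log2 intensities), not EzArray's actual code; the cutoff values are illustrative.

```python
import math

def welch_t(a, b):
    """Welch's t statistic for two samples (lists of log2 intensities)."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

def differential_genes(expr, group_a, group_b, t_cut=3.0, fc_cut=1.0):
    """Return genes whose |t| and |mean log2 fold change| exceed the cutoffs.

    expr: {gene: {sample: log2 intensity}}; group_a/group_b: sample names.
    """
    hits = []
    for gene, vals in expr.items():
        a = [vals[s] for s in group_a]
        b = [vals[s] for s in group_b]
        fc = sum(a) / len(a) - sum(b) / len(b)  # mean log2 fold change
        if abs(fc) >= fc_cut and abs(welch_t(a, b)) >= t_cut:
            hits.append(gene)
    return hits
```

In a real pipeline this filter would be preceded by normalization and followed by multiple-testing correction, both of which are omitted here for brevity.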
Huang, Shu-Hong; Chang, Yu-Shin; Juang, Jyh-Ming Jimmy; Chang, Kai-Wei; Tsai, Mong-Hsun; Lu, Tzu-Pin; Lai, Liang-Chuan; Chuang, Eric Y; Huang, Nien-Tsu
2018-03-12
In this study, we developed an automated microfluidic DNA microarray (AMDM) platform for point mutation detection of genetic variants in inherited arrhythmic diseases. The platform allows for automated and programmable reagent sequencing under precise conditions of hybridization flow and temperature control. It is composed of a commercial microfluidic control system, a microfluidic microarray device, and a temperature control unit. The automated and rapid hybridization process can be performed in the AMDM platform using Cy3-labeled oligonucleotide exons of SCN5A genomic DNA, which encodes the sodium channel protein abundant in heart (cardiac) muscle cells. We then introduce a graphene oxide (GO)-assisted DNA microarray hybridization protocol to enable point mutation detection. In this protocol, a GO solution is added after the staining step to quench dyes bound to single-stranded DNA or non-perfectly matched DNA, which can improve point mutation specificity. As proof of concept, we extracted the wild-type and mutant exon 12 and exon 17 of SCN5A genomic DNA from patients with long QT syndrome or Brugada syndrome by touchdown PCR and performed successful point mutation discrimination in the AMDM platform. Overall, the AMDM platform can greatly reduce laborious and time-consuming hybridization steps and prevent potential contamination. Furthermore, by introducing reciprocating flow into the microchannel during the hybridization process, the total assay time can be reduced to 3 hours, which is 6 times faster than the conventional DNA microarray. Given the automatic assay operation, shorter assay time, and high point mutation discrimination, we believe that the AMDM platform has potential for low-cost, rapid and sensitive genetic testing in a simple and user-friendly manner, which may benefit gene screening in medical practice.
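The point-mutation discrimination described above ultimately reduces to comparing the post-quench fluorescence of the wild-type and mutant probes for a locus. The toy genotype caller below illustrates that comparison under an assumed signal-ratio rule; the cutoff value is hypothetical and not taken from the study.

```python
def call_genotype(wt_signal, mut_signal, ratio_cut=3.0):
    """Toy genotype call from GO-quenched probe fluorescence.

    After graphene-oxide quenching, only perfectly matched duplexes should
    retain strong Cy3 signal, so a large wild-type/mutant probe ratio (or its
    inverse) indicates a homozygote, while comparable signals suggest a
    heterozygote. The ratio cutoff of 3.0 is illustrative.
    """
    if wt_signal == 0 and mut_signal == 0:
        return "no call"
    hi, lo = max(wt_signal, mut_signal), min(wt_signal, mut_signal)
    if lo == 0 or hi / lo >= ratio_cut:
        return ("homozygous wild-type" if wt_signal > mut_signal
                else "homozygous mutant")
    return "heterozygous"
```

A production caller would also apply background subtraction and per-spot quality filters before ratioing.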
21 CFR 866.1645 - Fully automated short-term incubation cycle antimicrobial susceptibility system.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Fully automated short-term incubation cycle... Diagnostic Devices § 866.1645 Fully automated short-term incubation cycle antimicrobial susceptibility system. (a) Identification. A fully automated short-term incubation cycle antimicrobial susceptibility system...
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Amanda M.; Daly, Don S.; Willse, Alan R.
The Automated Microarray Image Analysis (AMIA) Toolbox for MATLAB is a flexible, open-source microarray image analysis tool that allows the user to customize analysis of sets of microarray images. This tool provides several methods of identifying spots and quantifying spot statistics, as well as extensive diagnostic statistics and images to identify poor data quality or processing. The open nature of this software allows researchers to understand the algorithms used to provide intensity estimates and to modify them easily if desired.
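A spot-quantification step of the kind such toolboxes perform can be sketched as a median foreground-minus-background estimator. This is an illustrative stand-in, not AMIA's actual algorithm; the window sizes are assumptions.

```python
import statistics

def spot_intensity(image, row, col, fg=1, bg=3):
    """Background-corrected spot intensity at a gridded spot centre.

    image: 2D list of pixel intensities; (row, col): spot centre.
    Foreground = median of the (2*fg+1)^2 window around the centre;
    background = median of the surrounding ring out to radius bg.
    Medians are used because they are robust to dust specks and
    saturated pixels.
    """
    fg_px, bg_px = [], []
    for r in range(row - bg, row + bg + 1):
        for c in range(col - bg, col + bg + 1):
            if 0 <= r < len(image) and 0 <= c < len(image[0]):
                if abs(r - row) <= fg and abs(c - col) <= fg:
                    fg_px.append(image[r][c])
                else:
                    bg_px.append(image[r][c])
    return statistics.median(fg_px) - statistics.median(bg_px)
```

Per-spot diagnostics (e.g. foreground variance, background uniformity) would be computed from the same pixel sets.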
Singh, Kumar Saurabh; Thual, Dominique; Spurio, Roberto; Cannata, Nicola
2015-01-01
One of the most crucial characteristics of day-to-day laboratory information management is the collection, storage and retrieval of information about research subjects and environmental or biomedical samples. An efficient link between sample data and experimental results is absolutely important for the successful outcome of a collaborative project. Currently available software solutions are largely limited to large scale, expensive commercial Laboratory Information Management Systems (LIMS). Acquiring such LIMS indeed can bring laboratory information management to a higher level, but most of the times this requires a sufficient investment of money, time and technical efforts. There is a clear need for a light weighted open source system which can easily be managed on local servers and handled by individual researchers. Here we present a software named SaDA for storing, retrieving and analyzing data originated from microorganism monitoring experiments. SaDA is fully integrated in the management of environmental samples, oligonucleotide sequences, microarray data and the subsequent downstream analysis procedures. It is simple and generic software, and can be extended and customized for various environmental and biomedical studies. PMID:26047146
NASA Astrophysics Data System (ADS)
Cornaglia, Matteo; Mouchiroud, Laurent; Marette, Alexis; Narasimhan, Shreya; Lehnert, Thomas; Jovaisaite, Virginija; Auwerx, Johan; Gijs, Martin A. M.
2015-05-01
Studies of the real-time dynamics of embryonic development require a gentle embryo-handling method, the possibility of long-term live imaging throughout the complete embryogenesis, and parallelization that provides population-level statistics while keeping single-embryo resolution. We describe an automated approach that fully accomplishes these requirements for embryos of Caenorhabditis elegans, one of the most widely employed model organisms in biomedical research. We developed a microfluidic platform which makes use of pure passive hydrodynamics to run on-chip worm cultures, from which we obtain synchronized embryo populations, and to immobilize these embryos in incubator microarrays for long-term high-resolution optical imaging. We successfully employ our platform to investigate morphogenesis and mitochondrial biogenesis during the full embryonic development and elucidate the role of the mitochondrial unfolded protein response (UPRmt) within C. elegans embryogenesis. Our method can be generally used for protein expression and developmental studies at the embryonic level, and can also provide clues to understanding the aging process and age-related diseases in particular.
Killion, Patrick J; Sherlock, Gavin; Iyer, Vishwanath R
2003-01-01
Background The power of microarray analysis can be realized only if data is systematically archived and linked to biological annotations as well as analysis algorithms. Description The Longhorn Array Database (LAD) is a MIAME-compliant microarray database that operates on PostgreSQL and Linux. It is a fully open source version of the Stanford Microarray Database (SMD), one of the largest microarray databases. LAD is available at . Conclusions Our development of LAD provides a simple, free, open, reliable and proven solution for storage and analysis of two-color microarray data. PMID:12930545
Ensink, Elliot; Sinha, Jessica; Sinha, Arkadeep; Tang, Huiyuan; Calderone, Heather M; Hostetter, Galen; Winter, Jordan; Cherba, David; Brand, Randall E; Allen, Peter J; Sempere, Lorenzo F; Haab, Brian B
2015-10-06
Experiments involving the high-throughput quantification of image data require algorithms for automation. A challenge in the development of such algorithms is to properly interpret signals over a broad range of image characteristics, without the need for manual adjustment of parameters. Here we present a new approach for locating signals in image data, called Segment and Fit Thresholding (SFT). The method assesses statistical characteristics of small segments of the image and determines the best-fit trends between the statistics. Based on the relationships, SFT identifies segments belonging to background regions; analyzes the background to determine optimal thresholds; and analyzes all segments to identify signal pixels. We optimized the initial settings for locating background and signal in antibody microarray and immunofluorescence data and found that SFT performed well over multiple, diverse image characteristics without readjustment of settings. When used for the automated analysis of multicolor, tissue-microarray images, SFT correctly found the overlap of markers with known subcellular localization, and it performed better than a fixed threshold and Otsu's method for selected images. SFT promises to advance the goal of full automation in image analysis.
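The SFT idea described above, tiling the image, computing per-segment statistics, and letting the background-like segments set the signal threshold, can be sketched as follows. This is a deliberately simplified stand-in, not the published SFT implementation: the "darker half of segments is background" rule and the multiplier k are assumptions, and the published method additionally fits trends between segment statistics.

```python
import statistics

def segment_stats(image, tile=4):
    """(mean, stdev) for each tile of the image (2D list of intensities)."""
    stats = []
    for r in range(0, len(image), tile):
        for c in range(0, len(image[0]), tile):
            px = [image[i][j]
                  for i in range(r, min(r + tile, len(image)))
                  for j in range(c, min(c + tile, len(image[0])))]
            stats.append((statistics.mean(px), statistics.pstdev(px)))
    return stats

def sft_threshold(image, tile=4, k=4.0):
    """Signal threshold from pooled statistics of background-like tiles."""
    stats = segment_stats(image, tile)
    med_mean = statistics.median(m for m, _ in stats)
    # Assumption: tiles at or below the median tile mean are background.
    bg = [(m, s) for m, s in stats if m <= med_mean]
    bg_mean = statistics.mean(m for m, _ in bg)
    bg_sd = statistics.mean(s for _, s in bg)
    return bg_mean + k * bg_sd  # pixels above this count as signal
```

Applying the threshold per image, rather than globally, is what lets this family of methods cope with diverse image characteristics without manual retuning.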
NASA Technical Reports Server (NTRS)
Pohorille, Andrew; Peyvan, Kia; Karouia, Fathi; Ricco, Antonio
2012-01-01
The capability to measure gene expression on board spacecraft opens the door to a large number of high-value experiments on the influence of the space environment on biological systems. For example, measurements of gene expression will help us to understand adaptation of terrestrial life to conditions beyond the planet of origin, identify deleterious effects of the space environment on a wide range of organisms from microbes to humans, develop effective countermeasures against these effects, and determine the metabolic bases of microbial pathogenicity and drug resistance. These and other applications hold significant potential for discoveries in space biology, biotechnology, and medicine. Supported by funding from the NASA Astrobiology Science and Technology Instrument Development Program, we are developing a fully automated, miniaturized, integrated fluidic system for small spacecraft capable of in-situ measurement of expression of several hundreds of microbial genes from multiple samples. The instrument will be capable of (1) lysing cell walls of bacteria sampled from cultures grown in space, (2) extracting and purifying RNA released from cells, (3) hybridizing the RNA on a microarray and (4) providing readout of the microarray signal, all in a single microfluidics cartridge. The device is suitable for deployment on nanosatellite platforms developed by NASA Ames' Small Spacecraft Division. To meet space and other technical constraints imposed by these platforms, a number of technical innovations are being implemented. The integration and end-to-end technological and biological validation of the instrument are carried out using as a model the photosynthetic bacterium Synechococcus elongatus, known for its remarkable metabolic diversity and resilience to adverse conditions. 
Each step in the measurement process-lysis, nucleic acid extraction, purification, and hybridization to an array-is assessed through comparison of the results obtained using the instrument with those from standard laboratory protocols. Once developed, the system can be used with minor modifications for multiple experiments on different platforms in space, including extension to higher organisms and microbial monitoring. A proposed version of GEMM that is capable of handling both microbial and tissue samples on the International Space Station will be briefly summarized.
An automated multiplex specific IgE assay system using a photoimmobilized microarray.
Ito, Yoshihiro; Moritsugu, Nozomi; Matsue, Takahisa; Mitsukoshi, Kiyomi; Ayame, Hirohito; Okochi, Norihiko; Hattori, Hideshi; Tashiro, Hideo; Sato, Sakura; Ebisawa, Motohiro
2012-11-15
An automated microarray diagnostic system for specific IgE using photoimmobilized allergen has been developed. Photoimmobilization is useful for preparing microarrays, where various types of biological components are covalently immobilized on a plate. Because the immobilization is based on a photo-induced radical cross-linking reaction, it does not require specific functional groups on the immobilized components. Here, an aqueous solution of a photoreactive poly(ethylene glycol)-based polymer was spin-coated on a plate, and an aqueous solution of each allergen was microspotted on the coated plate and allowed to dry in air. Finally, the plate was irradiated with an ultraviolet lamp for covalent immobilization. An automated machine using these plates was developed for the assay of antigen-specific IgE. Initially, the patient serum was added to the microarray plate, and after reaction of the microspotted allergen with IgE, the adsorbed IgE was detected by a peroxidase-conjugated anti-IgE antibody. The chemical luminescence intensity of the substrate decomposed by the peroxidase was automatically detected using a sensitive charge-coupled device camera. All the allergens were immobilized stably using this method, which was used to screen for allergen-specific IgE. The results were comparable with those obtained using conventional specific IgE assays. Using this system, IgE antibodies specific for six different allergens were assayed using 10 μL of serum within a period of 20 min. Copyright © 2012 Elsevier B.V. All rights reserved.
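Downstream of the CCD readout, converting a spot's luminescence to a reportable specific-IgE result typically involves a calibration curve and class binning. The sketch below is hypothetical: the calibration points are invented, and the class boundaries follow common specific-IgE reporting practice rather than anything stated in this abstract.

```python
import bisect

# Conventional specific-IgE class boundaries in kUA/L (classes 0-6).
CLASS_BOUNDS = [0.35, 0.7, 3.5, 17.5, 50.0, 100.0]

def calibrate(luminescence, curve):
    """Piecewise-linear interpolation on sorted (luminescence, kUA/L) points."""
    pts = sorted(curve)
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= luminescence <= x1:
            return y0 + (y1 - y0) * (luminescence - x0) / (x1 - x0)
    raise ValueError("luminescence outside calibration range")

def ige_class(kua_per_l):
    """Bin a concentration into IgE class 0-6."""
    return bisect.bisect_right(CLASS_BOUNDS, kua_per_l)
```

Real systems fit a nonlinear (e.g. four-parameter logistic) curve to calibrator spots; linear interpolation is used here only to keep the sketch short.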
Tips on hybridizing, washing, and scanning affymetrix microarrays.
Ares, Manuel
2014-02-01
Starting in the late 1990s, Affymetrix, Inc. produced a commercial system for hybridizing, washing, and scanning microarrays that was designed to be easy to operate and reproducible. The system used arrays packaged in a plastic cassette or chamber in which the prefabricated array was mounted and could be filled with fluid through resealable membrane ports either by hand or by an automated "fluidics station" specially designed to handle the arrays. A special rotating hybridization oven and a specially designed scanner were also required. Primarily because of automation and standardization the Affymetrix system was and still remains popular. Here, we provide a skeleton protocol with the potential pitfalls identified. It is designed to augment the protocols provided by Affymetrix.
Matsudaira, Takahiro; Tsuzuki, Saki; Wada, Akira; Suwa, Akira; Kohsaka, Hitoshi; Tomida, Maiko; Ito, Yoshihiro
2008-01-01
Autoimmune diseases such as rheumatoid arthritis, multiple sclerosis, and autoimmune diabetes are characterized by the production of autoantibodies that serve as useful diagnostic markers, surrogate markers, and prognostic factors. We devised an in vitro system to detect these clinically pivotal autoantibodies using a photoimmobilized autoantigen microarray. Photoimmobilization was useful for preparing the autoantigen microarray, where autoantigens are covalently immobilized on a plate, because it does not require specific functional groups of the autoantigens and any organic material can be immobilized by a radical reaction induced by photoirradiation. Here, we prepared the microarray using a very convenient method. Aqueous solutions of each autoantigen were mixed with a polymer of poly(ethylene glycol) methacrylate and a photoreactive crosslinker, and the mixtures were microspotted on a plate and dried in air. Finally, the plate was irradiated with an ultraviolet lamp to obtain immobilization. In the assay, patient serum was added to the microarray plate. Antigen-specific IgG adsorbed on the microspotted autoantigen was detected by peroxidase-conjugated anti-IgG antibody. The chemical luminescence intensities of the substrate decomposed by the peroxidase were detected with a sensitive CCD camera. All autoantigens were immobilized stably by this method and used to screen antigen-specific IgG. In addition, the plate was covered with a polydimethylsiloxane sheet containing microchannels and automated measurement was carried out.
Gupta, Surya; De Puysseleyr, Veronic; Van der Heyden, José; Maddelein, Davy; Lemmens, Irma; Lievens, Sam; Degroeve, Sven; Tavernier, Jan; Martens, Lennart
2017-05-01
Protein-protein interaction (PPI) studies have dramatically expanded our knowledge about cellular behaviour and development in different conditions. A multitude of high-throughput PPI techniques have been developed to achieve proteome-scale coverage for PPI studies, including the microarray based Mammalian Protein-Protein Interaction Trap (MAPPIT) system. Because such high-throughput techniques typically report thousands of interactions, managing and analysing the large amounts of acquired data is a challenge. We have therefore built the MAPPIT cell microArray Protein Protein Interaction-Data management & Analysis Tool (MAPPI-DAT) as an automated data management and analysis tool for MAPPIT cell microarray experiments. MAPPI-DAT stores the experimental data and metadata in a systematic and structured way, automates data analysis and interpretation, and enables the meta-analysis of MAPPIT cell microarray data across all stored experiments. MAPPI-DAT is developed in Python, using R for data analysis and MySQL as the data management system. MAPPI-DAT is cross-platform and can be run on Microsoft Windows, Linux and OS X/macOS. The source code and a Microsoft Windows executable are freely available under the permissive Apache2 open source license at https://github.com/compomics/MAPPI-DAT. jan.tavernier@vib-ugent.be or lennart.martens@vib-ugent.be. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press.
Progress in Fully Automated Abdominal CT Interpretation
Summers, Ronald M.
2016-01-01
OBJECTIVE Automated analysis of abdominal CT has advanced markedly over just the last few years. Fully automated assessments of organs, lymph nodes, adipose tissue, muscle, bowel, spine, and tumors are some examples where tremendous progress has been made. Computer-aided detection of lesions has also improved dramatically. CONCLUSION This article reviews the progress and provides insights into what is in store in the near future for automated analysis of abdominal CT, ultimately leading to fully automated interpretation. PMID:27101207
Development of a high-throughput Candida albicans biofilm chip.
Srinivasan, Anand; Uppuluri, Priya; Lopez-Ribot, Jose; Ramasubramanian, Anand K
2011-04-22
We have developed a high-density microarray platform consisting of nano-biofilms of Candida albicans. A robotic microarrayer was used to print yeast cells of C. albicans encapsulated in a collagen matrix at a volume as low as 50 nL onto surface-modified microscope slides. Upon incubation, the cells grow into fully formed "nano-biofilms". The morphological and architectural complexity of these biofilms were evaluated by scanning electron and confocal scanning laser microscopy. The extent of biofilm formation was determined using a microarray scanner from changes in fluorescence intensities due to FUN 1 metabolic processing. This staining technique was also adapted for antifungal susceptibility testing, which demonstrated that, similar to regular biofilms, cells within the on-chip biofilms displayed elevated levels of resistance against antifungal agents (fluconazole and amphotericin B). Thus, results from structural analyses and antifungal susceptibility testing indicated that despite miniaturization, these biofilms display the typical phenotypic properties associated with the biofilm mode of growth. In its final format, the C. albicans biofilm chip (CaBChip) is composed of 768 equivalent and spatially distinct nano-biofilms on a single slide; multiple chips can be printed and processed simultaneously. Compared to current methods for the formation of microbial biofilms, namely the 96-well microtiter plate model, this fungal biofilm chip has advantages in terms of miniaturization and automation, which combine to cut reagent use and analysis time, minimize labor intensive steps, and dramatically reduce assay costs. Such a chip should accelerate the antifungal drug discovery process by enabling rapid, convenient and inexpensive screening of hundreds-to-thousands of compounds simultaneously.
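The on-chip susceptibility readout described above can be reduced to a simple rule over the FUN 1 fluorescence dose-response. The sketch below uses the common sessile-MIC convention (SMIC50: lowest drug concentration reducing metabolic signal to 50% of the drug-free control or below); this is an assumed analysis, not the paper's exact protocol.

```python
def smic50(control_signal, dose_response):
    """Sessile MIC at 50% from spot fluorescence readings.

    control_signal: mean fluorescence of drug-free biofilm spots.
    dose_response: list of (drug concentration, mean fluorescence),
    in any order. Returns the lowest concentration whose signal is
    <= 50% of the control, or None if that level is never reached.
    """
    for conc, signal in sorted(dose_response):
        if signal <= 0.5 * control_signal:
            return conc
    return None
```

Because each chip carries hundreds of spatially distinct nano-biofilms, a full drug dilution series (plus replicates) fits on a single slide and feeds directly into a rule like this.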
MAAMD: a workflow to standardize meta-analyses and comparison of affymetrix microarray data
2014-01-01
Background Mandatory deposit of raw microarray data files for public access, prior to study publication, provides significant opportunities to conduct new bioinformatics analyses within and across multiple datasets. Analysis of raw microarray data files (e.g. Affymetrix CEL files) can be time consuming, complex, and requires fundamental computational and bioinformatics skills. The development of analytical workflows to automate these tasks simplifies the processing of, improves the efficiency of, and serves to standardize multiple and sequential analyses. Once installed, workflows facilitate the tedious steps required to run rapid intra- and inter-dataset comparisons. Results We developed a workflow to facilitate and standardize Meta-Analysis of Affymetrix Microarray Data analysis (MAAMD) in Kepler. Two freely available stand-alone software tools, R and AltAnalyze, were embedded in MAAMD. The inputs of MAAMD are user-editable csv files, which contain sample information and parameters describing the locations of input files and required tools. MAAMD was tested by analyzing 4 different GEO datasets from mice and Drosophila. MAAMD automates data downloading, data organization, data quality control assessment, differential gene expression analysis, clustering analysis, pathway visualization, gene-set enrichment analysis, and cross-species orthologous-gene comparisons. MAAMD was utilized to identify gene orthologues responding to hypoxia or hyperoxia in both mice and Drosophila. The entire set of analyses for 4 datasets (34 total microarrays) finished in about one hour. Conclusions MAAMD saves time, minimizes the required computer skills, and offers a standardized procedure for users to analyze microarray datasets and make new intra- and inter-dataset comparisons. PMID:24621103
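The step sequence such a workflow automates (download, organize, QC, differential expression, and so on) can be caricatured as an ordered pipeline runner. The following is a minimal illustrative stand-in; MAAMD itself orchestrates R and AltAnalyze inside Kepler rather than plain Python callables.

```python
def run_workflow(steps, data):
    """Run named steps in order; stop at the first failure.

    steps: list of (name, callable); each callable maps data -> data.
    Returns a report of which steps completed, which (if any) failed,
    and the final result on success.
    """
    completed = []
    for name, step in steps:
        try:
            data = step(data)
        except Exception as exc:
            return {"completed": completed, "failed": name, "error": str(exc)}
        completed.append(name)
    return {"completed": completed, "failed": None, "result": data}

# Hypothetical usage mirroring the MAAMD stages:
report = run_workflow(
    [("download", lambda d: d + ["raw"]),
     ("qc", lambda d: d + ["qc"]),
     ("differential_expression", lambda d: d + ["de"])],
    [])
```

Recording what completed before a failure is what makes such workflows restartable, one of the practical benefits the abstract attributes to installed workflows.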
NASA Technical Reports Server (NTRS)
Karouia, Fathi; Peyvan, Kia; Danley, David; Ricco, Antonio J.; Santos, Orlando; Pohorille, Andrew
2011-01-01
Human space travelers experience a unique environment that affects homeostasis and physiologic adaptation. The spacecraft environment subjects the traveler to noise, chemical and microbiological contaminants, increased radiation, and variable gravity forces. As humans prepare for long-duration missions to the International Space Station (ISS) and beyond, effective measures must be developed, verified and implemented to ensure mission success. Limited biomedical quantitative capabilities are currently available onboard the ISS. Therefore, the development of versatile instruments to perform space biological analysis and to monitor astronauts' health is needed. We are developing a fully automated, miniaturized system for measuring gene expression on small spacecraft in order to better understand the influence of the space environment on biological systems. This low-cost, low-power, multi-purpose instrument represents a major scientific and technological advancement by providing data on cellular metabolism and regulation. The current system will support growth of microorganisms, extract and purify the RNA, hybridize it to the array, read the expression levels of a large number of genes by microarray analysis, and transmit the measurements to Earth. The system will help discover how bacteria develop resistance to antibiotics and how pathogenic bacteria sometimes increase their virulence in space, facilitating the development of adequate countermeasures to decrease risks associated with human spaceflight. The current stand-alone technology could be used as an integrated platform onboard the ISS to perform similar genetic analyses on any biological systems from the tree of life. Additionally, with some modification the system could be implemented to perform real-time in-situ microbial monitoring of the ISS environment (air, surface and water samples) and the astronaut's microbiome using 16S rRNA microarray technology.
Furthermore, the current system can be enhanced substantially by combining it with other technologies for automated, miniaturized, high-throughput biological measurements, such as fast sequencing, protein identification (proteomics) and metabolite profiling (metabolomics). Thus, the system can be integrated with other biomedical instruments in order to support and enhance telemedicine capability onboard ISS. NASA's mission includes sustained investment in critical research leading to effective countermeasures to minimize the risks associated with human spaceflight, and the use of appropriate technology to sustain space exploration at reasonable cost. Our integrated microarray technology is expected to fulfill these two critical requirements and to enable the scientific community to better understand and monitor the effects of the space environment on microorganisms and on the astronaut, in the process leveraging current capabilities and overcoming present limitations.
Automated methods for multiplexed pathogen detection.
Straub, Timothy M; Dockendorff, Brian P; Quiñonez-Díaz, Maria D; Valdez, Catherine O; Shutthanandan, Janani I; Tarasevich, Barbara J; Grate, Jay W; Bruckner-Lea, Cynthia J
2005-09-01
Detection of pathogenic microorganisms in environmental samples is a difficult process. Concentration of the organisms of interest also co-concentrates inhibitors of many end-point detection methods, notably, nucleic acid methods. In addition, sensitive, highly multiplexed pathogen detection continues to be problematic. The primary function of the BEADS (Biodetection Enabling Analyte Delivery System) platform is the automated concentration and purification of target analytes from interfering substances, often present in these samples, via a renewable surface column. In one version of BEADS, automated immunomagnetic separation (IMS) is used to separate cells from their samples. Captured cells are transferred to a flow-through thermal cycler where PCR, using labeled primers, is performed. PCR products are then detected by hybridization to a DNA suspension array. In another version of BEADS, cell lysis is performed, and community RNA is purified and directly labeled. Multiplexed detection is accomplished by direct hybridization of the RNA to a planar microarray. The integrated IMS/PCR version of BEADS can successfully purify and amplify 10 E. coli O157:H7 cells from river water samples. Multiplexed PCR assays for the simultaneous detection of E. coli O157:H7, Salmonella, and Shigella on bead suspension arrays were demonstrated for the detection of as few as 100 cells of each organism. The RNA version of BEADS is also showing promising results. Automation yields highly purified RNA, suitable for multiplexed detection on microarrays, with microarray detection specificity equivalent to PCR. Both versions of the BEADS platform show great promise for automated pathogen detection from environmental samples. Highly multiplexed pathogen detection using PCR continues to be problematic, but may be required for trace detection in large volume samples. The RNA approach solves the issues of highly multiplexed PCR and provides "live vs. dead" capabilities.
However, sensitivity of the method will need to be improved for RNA analysis to replace PCR.
ATALARS Operational Requirements: Automated Tactical Aircraft Launch and Recovery System
DOT National Transportation Integrated Search
1988-04-01
The Automated Tactical Aircraft Launch and Recovery System (ATALARS) is a fully automated air traffic management system intended for the military service but is also fully compatible with civil air traffic control systems. This report documents a fir...
Evaluating concentration estimation errors in ELISA microarray experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Daly, Don S.; White, Amanda M.; Varnum, Susan M.
Enzyme-linked immunosorbent assay (ELISA) is a standard immunoassay to predict a protein concentration in a sample. Deploying ELISA in a microarray format permits simultaneous prediction of the concentrations of numerous proteins in a small sample. These predictions, however, are uncertain due to processing error and biological variability. Evaluating prediction error is critical to interpreting biological significance and improving the ELISA microarray process. Evaluating prediction error must be automated to realize a reliable high-throughput ELISA microarray system. Methods: In this paper, we present a statistical method based on propagation of error to evaluate prediction errors in the ELISA microarray process. Although propagation of error is central to this method, it is effective only when comparable data are available. Therefore, we briefly discuss the roles of experimental design, data screening, normalization and statistical diagnostics when evaluating ELISA microarray prediction errors. We use an ELISA microarray investigation of breast cancer biomarkers to illustrate the evaluation of prediction errors. The illustration begins with a description of the design and resulting data, followed by a brief discussion of data screening and normalization. In our illustration, we fit a standard curve to the screened and normalized data, review the modeling diagnostics, and apply propagation of error.
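The propagation-of-error idea in this abstract can be sketched in a few lines: fit a standard curve, invert it to predict concentration from signal, and push the signal's uncertainty through the inverse curve to first order. This is a minimal illustration only, assuming a linear standard curve (real ELISA curves are typically four-parameter logistic); the function names are invented for the example and are not the authors' method.

```python
import math

def fit_line(x, y):
    """Ordinary least-squares fit of y = a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a, b

def predict_conc(signal, a, b):
    """Invert the standard curve: concentration = (signal - a) / b."""
    return (signal - a) / b

def conc_std_error(signal_se, b):
    """First-order propagation of error through the inverse curve:
    sd(conc) ~= sd(signal) * |d conc / d signal| = sd(signal) / |b|."""
    return signal_se / abs(b)
```

A steep curve (large |b|) thus shrinks the propagated concentration error, which is one reason curve diagnostics matter before applying propagation of error.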
Using Kepler for Tool Integration in Microarray Analysis Workflows.
Gan, Zhuohui; Stowe, Jennifer C; Altintas, Ilkay; McCulloch, Andrew D; Zambon, Alexander C
Increasing numbers of genomic technologies are producing massive amounts of genomic data, all of which require complex analysis. More and more bioinformatics analysis tools are being developed by scientists to simplify these analyses. However, different pipelines have been developed using different software environments, which makes integration of these diverse bioinformatics tools difficult. Kepler provides an open-source environment to integrate these disparate packages. Using Kepler, we integrated several external tools, including Bioconductor packages, AltAnalyze (a Python-based open-source tool), and an R-based comparison tool, to build an automated workflow to meta-analyze both online and local microarray data. The automated workflow connects the integrated tools seamlessly, delivers data flow between the tools smoothly, and hence improves the efficiency and accuracy of complex data analyses. Our workflow exemplifies the usage of Kepler as a scientific workflow platform for bioinformatics pipelines.
Fully Automated Sunspot Detection and Classification Using SDO HMI Imagery in MATLAB
2014-03-27
Fully Automated Sunspot Detection and Classification Using SDO HMI Imagery in MATLAB. Thesis by Gordon M. Spahr, BS, Second Lieutenant, USAF. AFIT-ENP-14-M-34. Presented to the Faculty, Department of Engineering Physics, Graduate School of Engineering and Management, Air Force Institute of Technology. Distribution unlimited.
DOT National Transportation Integrated Search
2014-08-01
Fully automated or autonomous vehicles (AVs) hold great promise for the future of transportation. By 2020, Google, auto manufacturers and other technology providers intend to introduce self-driving cars to the public with either limited or fully a...
Girard, Laurie D.; Boissinot, Karel; Peytavi, Régis; Boissinot, Maurice; Bergeron, Michel G.
2014-01-01
The combination of molecular diagnostic technologies is increasingly used to overcome limitations on sensitivity, specificity or multiplexing capabilities, and provide efficient lab-on-chip devices. Two such techniques, PCR amplification and microarray hybridization, are used serially to take advantage of the high sensitivity and specificity of the former combined with the high multiplexing capacity of the latter. These methods are usually performed in different buffers and reaction chambers. However, these elaborate methods have high complexity and cost related to reagent requirements, liquid storage and the number of reaction chambers to integrate into automated devices. Furthermore, microarray hybridizations have a sequence-dependent efficiency that is not always predictable. In this work, we have developed the concept of a structured oligonucleotide probe which is activated by cleavage from polymerase exonuclease activity. This technology is called SCISSOHR for Structured Cleavage Induced Single-Stranded Oligonucleotide Hybridization Reaction. The SCISSOHR probes enable indexing the target sequence to a tag sequence. The SCISSOHR technology also allows the combination of nucleic acid amplification and microarray hybridization in a single vessel in presence of the PCR buffer only. The SCISSOHR technology uses an amplification probe that is irreversibly modified in presence of the target, releasing a single-stranded DNA tag for microarray hybridization. Each tag is composed of a 3-nucleotide sequence-dependent segment and a unique “target sequence-independent” 14-nucleotide segment allowing for optimal hybridization with minimal cross-hybridization. We evaluated the performance of five (5) PCR buffers to support microarray hybridization, compared to a conventional hybridization buffer. Finally, as a proof of concept, we developed a multiplexed assay for the amplification, detection, and identification of three (3) DNA targets.
This new technology will facilitate the design of lab-on-chip microfluidic devices, while also reducing consumable costs. At term, it will allow the cost-effective automation of highly multiplexed assays for detection and identification of genetic targets. PMID:25489607
[Research progress of probe design software of oligonucleotide microarrays].
Chen, Xi; Wu, Zaoquan; Liu, Zhengchun
2014-02-01
DNA microarray has become an essential medical genetic diagnostic tool owing to its high throughput, miniaturization and automation. The design and selection of oligonucleotide probes are critical for preparing high-quality gene chips. Several probe design software packages have been developed and are now available to perform this work. Each package targets different sequences and has its own advantages and limitations. In this article, the research and development of these packages are reviewed against three main criteria: specificity, sensitivity and melting temperature (Tm). In addition, based on experimental results from the literature, the packages are classified according to their applications. This review will help users choose an appropriate probe design software package. It will also reduce the costs of microarrays, improve the application efficiency of microarrays, and promote both the research and development (R&D) and commercialization of high-performance probe design software.
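Of the three review criteria above, melting temperature is the most readily computable. As a minimal sketch (not taken from any of the reviewed packages), two classic textbook approximations for oligonucleotide Tm can be written as:

```python
def tm_wallace(seq):
    """Wallace rule, commonly used for short oligos (< ~14 nt):
    Tm (deg C) = 2*(A+T) + 4*(G+C)."""
    s = seq.upper()
    at = s.count("A") + s.count("T")
    gc = s.count("G") + s.count("C")
    return 2 * at + 4 * gc

def tm_gc(seq):
    """GC-content approximation for longer probes:
    Tm (deg C) = 64.9 + 41*(GC - 16.4)/N, where N is probe length."""
    s = seq.upper()
    gc = s.count("G") + s.count("C")
    return 64.9 + 41 * (gc - 16.4) / len(s)
```

Production probe-design software typically uses nearest-neighbor thermodynamic models with salt corrections rather than these rules of thumb, but the sketch shows why Tm is a sequence-level selection criterion.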
Fully automated processing of fMRI data in SPM: from MRI scanner to PACS.
Maldjian, Joseph A; Baer, Aaron H; Kraft, Robert A; Laurienti, Paul J; Burdette, Jonathan H
2009-01-01
Here we describe the Wake Forest University Pipeline, a fully automated method for the processing of fMRI data using SPM. The method includes fully automated data transfer and archiving from the point of acquisition, real-time batch script generation, distributed grid processing, interface to SPM in MATLAB, error recovery and data provenance, DICOM conversion and PACS insertion. It has been used for automated processing of fMRI experiments, as well as for the clinical implementation of fMRI and spin-tag perfusion imaging. The pipeline requires no manual intervention, and can be extended to any studies requiring offline processing.
Spectral Analysis of Breast Cancer on Tissue Microarrays: Seeing Beyond Morphology
2005-04-01
GenePublisher: Automated analysis of DNA microarray data.
Knudsen, Steen; Workman, Christopher; Sicheritz-Ponten, Thomas; Friis, Carsten
2003-07-01
GenePublisher, a system for automatic analysis of data from DNA microarray experiments, has been implemented with a web interface at http://www.cbs.dtu.dk/services/GenePublisher. Raw data are uploaded to the server together with a specification of the data. The server performs normalization, statistical analysis and visualization of the data. The results are run against databases of signal transduction pathways, metabolic pathways and promoter sequences in order to extract more information. The results of the entire analysis are summarized in report form and returned to the user.
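The normalization and statistical-analysis steps that such a server performs can be illustrated with a minimal sketch: median-center each array's log2 intensities so arrays are comparable, then score each gene with a two-sample t statistic. This is a generic illustration under those assumptions, not GenePublisher's actual pipeline.

```python
import math

def median(values):
    s = sorted(values)
    n = len(s)
    return s[n // 2] if n % 2 else (s[n // 2 - 1] + s[n // 2]) / 2

def normalize(array):
    """Median-center one array's log2 intensities so arrays are comparable."""
    logged = [math.log2(x) for x in array]
    m = median(logged)
    return [v - m for v in logged]

def t_statistic(group1, group2):
    """Pooled-variance two-sample t statistic for one gene's
    normalized expression values in two conditions."""
    n1, n2 = len(group1), len(group2)
    m1, m2 = sum(group1) / n1, sum(group2) / n2
    v1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    sp = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / (sp * math.sqrt(1 / n1 + 1 / n2))
```

Genes with large |t| would then be carried forward to the pathway and promoter lookups the abstract describes.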
Employing image processing techniques for cancer detection using microarray images.
Dehghan Khalilabad, Nastaran; Hassanpour, Hamid
2017-02-01
Microarray technology is a powerful genomic tool for simultaneously studying and analyzing the behavior of thousands of genes. The analysis of images obtained from this technology plays a critical role in the detection and treatment of diseases. The aim of the current study is to develop an automated system for analyzing data from microarray images in order to detect cancerous cases. The proposed system consists of three main phases, namely image processing, data mining, and detection of the disease. The image processing phase performs operations such as refining image rotation, gridding (locating genes) and extracting raw data from images; the data mining phase includes normalizing the extracted data and selecting the more effective genes. Finally, using the extracted data, cancerous cells are recognized. To evaluate the performance of the proposed system, microarray data sets are employed, including breast cancer, myeloid leukemia and lymphoma data from the Stanford Microarray Database. The results indicate that the proposed system is able to identify the type of cancer from the data sets with accuracies of 95.45%, 94.11%, and 100%, respectively. Copyright © 2017 Elsevier Ltd. All rights reserved.
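The gridding step mentioned above (locating spot rows and columns in a microarray image) is often done with projection profiles: summing pixel intensities along each axis and finding the bright bands. A minimal sketch of that idea, assuming the image is a plain 2-D list of intensities (not the authors' specific implementation):

```python
def projection_profile(image, axis):
    """Sum pixel intensities along rows (axis=0) or columns (axis=1)."""
    if axis == 0:
        return [sum(row) for row in image]
    return [sum(col) for col in zip(*image)]

def find_grid_lines(profile, threshold):
    """Return the center index of each contiguous run of the profile
    above threshold; each run is assumed to be one row/column of spots."""
    centers, start = [], None
    for i, v in enumerate(profile):
        if v > threshold and start is None:
            start = i
        elif v <= threshold and start is not None:
            centers.append((start + i - 1) // 2)
            start = None
    if start is not None:
        centers.append((start + len(profile) - 1) // 2)
    return centers
```

Crossing the row centers with the column centers yields candidate spot locations, from which raw intensities can then be extracted.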
Huerta, Mario; Munyi, Marc; Expósito, David; Querol, Enric; Cedano, Juan
2014-06-15
The number of microarrays performed by scientific teams grows exponentially. These microarray data could be useful for researchers around the world, but unfortunately they are underused. To fully exploit these data, it is necessary (i) to extract them from a repository of high-throughput gene expression data such as Gene Expression Omnibus (GEO) and (ii) to make the data from different microarrays comparable with tools that are easy for scientists to use. We have developed these two solutions in our server, implementing a database of microarray marker genes (Marker Genes Data Base). This database contains the marker genes of all GEO microarray datasets and is updated monthly with the new microarrays from GEO. Thus, researchers can see whether the marker genes of their microarray are marker genes in other microarrays in the database, expanding the analysis of their microarray to the rest of the public microarrays. This solution helps not only to corroborate the conclusions regarding a researcher's microarray but also to identify the phenotype of different subsets of individuals under investigation, to frame the results with microarray experiments from other species, pathologies or tissues, to search for drugs that promote the transition between the studied phenotypes, to detect undesirable side effects of the treatment applied, etc. Thus, the researcher can quickly add relevant information to his/her studies from all of the previous analyses performed in other studies, as long as they have been deposited in public repositories. Marker-gene database tool: http://ibb.uab.es/mgdb © The Author 2014. Published by Oxford University Press.
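The core query such a marker-gene database answers, ranking public datasets by how many of a researcher's marker genes they share, can be sketched in a few lines. The dataset identifiers and gene symbols below are purely illustrative; this is not the Marker Genes Data Base's actual schema or API.

```python
def shared_markers(query_markers, database):
    """Rank datasets by overlap with a query marker-gene set.

    query_markers: set of gene symbols from the researcher's microarray.
    database: dict mapping dataset id -> set of that dataset's marker genes.
    Returns (dataset_id, overlap_count, shared_genes) tuples, best first.
    """
    hits = []
    for dataset_id, markers in database.items():
        overlap = query_markers & markers
        if overlap:
            hits.append((dataset_id, len(overlap), sorted(overlap)))
    return sorted(hits, key=lambda h: -h[1])
```

Datasets returned near the top would be the ones worth inspecting for shared phenotypes, species, or treatments.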
NASA Astrophysics Data System (ADS)
Karouia, Fathi; Ricco, Antonio; Pohorille, Andrew; Peyvan, Kianoosh
2012-07-01
The capability to measure gene expression on board spacecraft opens the door to a large number of experiments on the influence of the space environment on biological systems that will profoundly impact our ability to conduct safe and effective space travel, and might also shed light on terrestrial physiology or biological function and human disease and aging processes. Measurements of gene expression will help us to understand adaptation of terrestrial life to conditions beyond the planet of origin, identify deleterious effects of the space environment on a wide range of organisms from microbes to humans, develop effective countermeasures against these effects, determine the metabolic basis of microbial pathogenicity and drug resistance, test our ability to sustain and grow in space organisms that can be used for life support and in situ resource utilization during long-duration space exploration, and monitor both the spacecraft environment and crew health. These and other applications hold significant potential for discoveries in space biology, biotechnology and medicine. Accordingly, supported by funding from the NASA Astrobiology Science and Technology Instrument Development Program, we are developing a fully automated, miniaturized, integrated fluidic system for small spacecraft capable of in-situ measurement of microbial expression of thousands of genes from multiple samples. The instrument will be capable of (1) lysing bacterial cell walls, (2) extracting and purifying RNA released from cells, (3) hybridizing it on a microarray and (4) providing electrochemical readout, all in a microfluidics cartridge. The prototype under development is suitable for deployment on nanosatellite platforms developed by the NASA Small Spacecraft Office. The first target application is to cultivate and measure gene expression of the photosynthetic bacterium Synechococcus elongatus, i.e. 
a cyanobacterium known to exhibit remarkable metabolic diversity and resilience to adverse conditions, under light and dark cycles while exposed to polar orbit for a period of 6 months. The integration and end-to-end technology validation of this instrument will be discussed. In particular, preliminary results demonstrate that the instrument properly carries out cellular lysis; nucleic acid extraction and purification are being assessed by reverse transcription polymerase chain reaction (PCR) and real-time PCR, in addition to microarray analysis of selected genes. Once developed, the system can be used with minor modifications for multiple experiments on different platforms in space, including extensions to higher organisms and microbial monitoring. A proposed version of GEMM that is capable of handling both microbial and tissue samples on the International Space Station will be briefly reviewed.
ONLINE WATER MONITORING UTILIZING AN AUTOMATED MICROARRAY BIOSENSOR INSTRUMENT - PHASE I
Constellation Technology Corporation (Constellation) proposes the use of an integrated recovery and detection system for online water supply monitoring. The integrated system is designed to efficiently capture and recover pathogens such as bacteria, viruses, parasites, an...
Howat, William J; Daley, Frances; Zabaglo, Lila; McDuffus, Leigh‐Anne; Blows, Fiona; Coulson, Penny; Raza Ali, H; Benitez, Javier; Milne, Roger; Brenner, Herman; Stegmaier, Christa; Mannermaa, Arto; Chang‐Claude, Jenny; Rudolph, Anja; Sinn, Peter; Couch, Fergus J; Tollenaar, Rob A.E.M.; Devilee, Peter; Figueroa, Jonine; Sherman, Mark E; Lissowska, Jolanta; Hewitt, Stephen; Eccles, Diana; Hooning, Maartje J; Hollestelle, Antoinette; WM Martens, John; HM van Deurzen, Carolien; Investigators, kConFab; Bolla, Manjeet K; Wang, Qin; Jones, Michael; Schoemaker, Minouk; Broeks, Annegien; van Leeuwen, Flora E; Van't Veer, Laura; Swerdlow, Anthony J; Orr, Nick; Dowsett, Mitch; Easton, Douglas; Schmidt, Marjanka K; Pharoah, Paul D; Garcia‐Closas, Montserrat
2016-01-01
Abstract Automated methods are needed to facilitate high-throughput and reproducible scoring of Ki67 and other markers in breast cancer tissue microarrays (TMAs) in large-scale studies. To address this need, we developed an automated protocol for Ki67 scoring and evaluated its performance in studies from the Breast Cancer Association Consortium. We utilized 166 TMAs containing 16,953 tumour cores representing 9,059 breast cancer cases from 13 studies, with information on other clinical and pathological characteristics. TMAs were stained for Ki67 using standard immunohistochemical procedures, and scanned and digitized using the Ariol system. An automated algorithm was developed for the scoring of Ki67, and scores were compared to computer-assisted visual (CAV) scores in a subset of 15 TMAs in a training set. We also assessed the correlation between automated Ki67 scores and other clinical and pathological characteristics. Overall, we observed good discriminatory accuracy (AUC = 85%) and good agreement (kappa = 0.64) between the automated and CAV scoring methods in the training set. The performance of the automated method varied by TMA (kappa range = 0.37–0.87) and study (kappa range = 0.39–0.69). The automated method performed better in satisfactory cores (kappa = 0.68) than suboptimal (kappa = 0.51) cores (p-value for comparison = 0.005), and among cores with higher total nuclei counted by the machine (4,000–4,500 cells: kappa = 0.78) than those with lower counts (50–500 cells: kappa = 0.41; p-value = 0.010). Among the 9,059 cases in this study, the correlations between automated Ki67 and clinical and pathological characteristics were found to be in the expected directions. Our findings indicate that automated scoring of Ki67 can be an efficient method to obtain good quality data across large numbers of TMAs from multicentre studies. 
However, robust algorithm development and rigorous pre‐ and post‐analytical quality control procedures are necessary in order to ensure satisfactory performance. PMID:27499923
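The agreement statistic reported throughout this abstract, Cohen's kappa, compares observed agreement between two scoring methods against the agreement expected by chance. A minimal self-contained sketch of the unweighted statistic (illustrative only; the study's own analysis pipeline is not reproduced here):

```python
def cohens_kappa(scores_a, scores_b):
    """Unweighted Cohen's kappa for two equal-length lists of
    categorical scores (e.g. automated vs visual Ki67 categories)."""
    assert len(scores_a) == len(scores_b)
    n = len(scores_a)
    categories = set(scores_a) | set(scores_b)
    # Observed agreement: fraction of cores scored identically.
    p_observed = sum(a == b for a, b in zip(scores_a, scores_b)) / n
    # Chance agreement: product of each rater's marginal category rates.
    p_chance = sum((scores_a.count(c) / n) * (scores_b.count(c) / n)
                   for c in categories)
    return (p_observed - p_chance) / (1 - p_chance)
```

Kappa of 1 means perfect agreement and 0 means chance-level agreement, which is why values such as 0.64 overall (and 0.78 on well-populated cores) indicate good but imperfect concordance.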
Kostić, Tanja; Sessitsch, Angela
2011-01-01
Reliable and sensitive pathogen detection in clinical and environmental (including food and water) samples is of greatest importance for public health. Standard microbiological methods have several limitations and improved alternatives are needed. Most important requirements for reliable analysis include: (i) specificity; (ii) sensitivity; (iii) multiplexing potential; (iv) robustness; (v) speed; (vi) automation potential; and (vii) low cost. Microarray technology can, through its very nature, fulfill many of these requirements directly and the remaining challenges have been tackled. In this review, we attempt to compare performance characteristics of the microbial diagnostic microarrays developed for the detection and typing of food and water pathogens, and discuss limitations, points still to be addressed and issues specific for the analysis of food, water and environmental samples. PMID:27605332
Panuccio, Giuseppe; Torsello, Giovanni Federico; Pfister, Markus; Bisdas, Theodosios; Bosiers, Michel J; Torsello, Giovanni; Austermann, Martin
2016-12-01
To assess the usability of a fully automated fusion imaging engine prototype, matching preinterventional computed tomography with intraoperative fluoroscopic angiography during endovascular aortic repair. From June 2014 to February 2015, all patients treated electively for abdominal and thoracoabdominal aneurysms were enrolled prospectively. Before each procedure, preoperative planning was performed with a fully automated fusion engine prototype based on computed tomography angiography, creating a mesh model of the aorta. In a second step, this three-dimensional dataset was registered with the two-dimensional intraoperative fluoroscopy. The main outcome measure was the applicability of the fully automated fusion engine. Secondary outcomes were freedom from failure of automatic segmentation or of the automatic registration, as well as accuracy of the mesh model, measuring deviations from intraoperative angiography in millimeters, if applicable. Twenty-five patients were enrolled in this study. The fusion imaging engine could be used successfully in 92% of the cases (n = 23). Freedom from failure of automatic segmentation was 44% (n = 11). Freedom from failure of the automatic registration was 76% (n = 19); the median error of the automatic registration process was 0 mm (interquartile range, 0-5 mm). The fully automated fusion imaging engine was found to be applicable in most cases, albeit in several cases fully automated data processing was not possible, requiring manual intervention. The accuracy of the automatic registration yielded excellent results and promises a useful and simple-to-use technology. Copyright © 2016 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.
Geue, Lutz; Stieber, Bettina; Monecke, Stefan; Engelmann, Ines; Gunzer, Florian; Slickers, Peter; Braun, Sascha D; Ehricht, Ralf
2014-08-01
In this study, we developed a new rapid, economic, and automated microarray-based genotyping test for the standardized subtyping of Shiga toxins 1 and 2 of Escherichia coli. The microarrays from Alere Technologies can be used in two different formats, the ArrayTube and the ArrayStrip (which enables high-throughput testing in a 96-well format). One microarray chip harbors all the gene sequences necessary to distinguish between all Stx subtypes, facilitating the identification of single and multiple subtypes within a single isolate in one experiment. Specific software was developed to automatically analyze all data obtained from the microarray. The assay was validated with 21 Shiga toxin-producing E. coli (STEC) reference strains that were previously tested by the complete set of conventional subtyping PCRs. The microarray results showed 100% concordance with the PCR results. Essentially identical results were detected when the standard DNA extraction method was replaced by a time-saving heat lysis protocol. For further validation of the microarray, we identified the Stx subtypes or combinations of the subtypes in 446 STEC field isolates of human and animal origin. In summary, this oligonucleotide array represents an excellent diagnostic tool that provides some advantages over standard PCR-based subtyping. The number of the spotted probes on the microarrays can be increased by additional probes, such as for novel alleles, species markers, or resistance genes, should the need arise. Copyright © 2014, American Society for Microbiology. All Rights Reserved.
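The automated analysis step described above, mapping a microarray's hybridization pattern to one or more Stx subtypes, amounts to matching the set of positive probes against per-subtype probe signatures. A minimal sketch of that lookup, with entirely hypothetical probe and subtype names (the actual probe set and the Alere analysis software are not reproduced here):

```python
# Hypothetical signatures: each subtype is called when its full set of
# expected probes hybridizes. Probe names "p1".."p5" are invented.
SUBTYPE_SIGNATURES = {
    "stx1a": {"p1", "p2"},
    "stx1c": {"p1", "p3"},
    "stx2a": {"p4", "p5"},
}

def call_subtypes(positive_probes):
    """Return every subtype whose complete probe signature is present,
    so multiple subtypes within a single isolate are reported together."""
    return sorted(name for name, signature in SUBTYPE_SIGNATURES.items()
                  if signature <= positive_probes)
```

Because the lookup is set-based, extending the array with probes for novel alleles, species markers, or resistance genes only requires adding entries to the signature table.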
Parallel solution-phase synthesis of a 2-aminothiazole library including fully automated work-up.
Buchstaller, Hans-Peter; Anlauf, Uwe
2011-02-01
A straightforward and effective procedure for the solution-phase preparation of a 2-aminothiazole combinatorial library is described. Reaction, work-up and isolation of the title compounds as free bases were accomplished in a fully automated fashion using the Chemspeed ASW 2000 automated synthesizer. The compounds were obtained in good yields and excellent purities without any further purification procedure.
Does bacteriology laboratory automation reduce time to results and increase quality management?
Dauwalder, O; Landrieve, L; Laurent, F; de Montclos, M; Vandenesch, F; Lina, G
2016-03-01
Due to reductions in financial and human resources, many microbiological laboratories have merged to build very large clinical microbiology laboratories, which allow the use of fully automated laboratory instruments. For clinical chemistry and haematology, automation has reduced the time to results and improved the management of laboratory quality. The aim of this review was to examine whether fully automated laboratory instruments for microbiology can reduce time to results and impact quality management. This study focused on solutions that are currently available, including the BD Kiestra™ Work Cell Automation and Total Lab Automation, and the Copan WASPLab®. Copyright © 2015 European Society of Clinical Microbiology and Infectious Diseases. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Greef, Charles; Petropavlovskikh, Viatcheslav; Nilsen, Oyvind; Khattatov, Boris; Plam, Mikhail; Gardner, Patrick; Hall, John
2008-04-01
Small non-coding RNA sequences have recently been discovered as unique identifiers of certain bacterial species, raising the possibility that they can be used as highly specific Biowarfare Agent detection markers in automated, field-deployable integrated detection systems. Because they are present in high abundance, they could allow genomic-based bacterial species identification without the need for pre-assay amplification. Further, a direct detection method would obviate the need for chemical labeling, enabling a rapid, efficient, high-sensitivity mechanism for bacterial detection. Surface Plasmon Resonance enhanced Common Path Interferometry (SPR-CPI) is a potentially market-disruptive, high-sensitivity dual technology that allows real-time direct multiplex measurement of biomolecule interactions, including small molecules, nucleic acids, proteins, and microbes. SPR-CPI measures differences in phase shift of reflected S- and P-polarized light under Total Internal Reflection (TIR) conditions at a surface, caused by changes in refractive index induced by biomolecular interactions within the evanescent field at the TIR interface. The measurement is performed on a microarray of discrete 2-dimensional areas functionalized with biomolecule capture reagents, allowing simultaneous measurement of up to 100 separate analytes. The optical beam encompasses the entire microarray, allowing a solid-state detector system with no scanning requirement. Output consists of simultaneous voltage measurements proportional to the phase differences resulting from the refractive index changes at each microarray feature, and is automatically processed and displayed graphically or delivered to a decision-making algorithm, enabling a fully automatic detection system capable of rapid detection and quantification of small nucleic acids at extremely sensitive levels.
Proof-of-concept experiments on model systems and cell culture samples have demonstrated utility of the system, and efforts are in progress for full development and deployment of the device. The technology has broad applicability as a universal detection platform for BWA detection, medical diagnostics, and drug discovery research, and represents a new class of instrumentation as a rapid, high sensitivity, label-free methodology.
Mining microarray data at NCBI's Gene Expression Omnibus (GEO)*.
Barrett, Tanya; Edgar, Ron
2006-01-01
The Gene Expression Omnibus (GEO) at the National Center for Biotechnology Information (NCBI) has emerged as the leading fully public repository for gene expression data. This chapter describes how to use Web-based interfaces, applications, and graphics to effectively explore, visualize, and interpret the hundreds of microarray studies and millions of gene expression patterns stored in GEO. Data can be examined from both experiment-centric and gene-centric perspectives using user-friendly tools that do not require specialized expertise in microarray analysis or time-consuming download of massive data sets. The GEO database is publicly accessible through the World Wide Web at http://www.ncbi.nlm.nih.gov/geo.
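Beyond the Web interfaces described above, GEO can also be queried programmatically through NCBI's E-utilities, whose esearch endpoint covers the GEO DataSets (`gds`) database. A minimal sketch (the endpoint and `db=gds` follow standard E-utilities usage; the query term is illustrative only), building a search URL without actually sending a request:

```python
from urllib.parse import urlencode

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils"

def gds_search_url(term, retmax=20):
    """Build an E-utilities esearch URL against the GEO DataSets (gds) database."""
    params = {"db": "gds", "term": term, "retmax": retmax, "retmode": "json"}
    return f"{EUTILS}/esearch.fcgi?{urlencode(params)}"

# Hypothetical query: expression series (GSE records) mentioning a disease term
url = gds_search_url("breast cancer AND gse[ETYP]")
print(url)
```

Fetching the URL returns a JSON list of record identifiers that can then be passed to the esummary or efetch endpoints.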
Fully automated segmentation of callus by micro-CT compared to biomechanics.
Bissinger, Oliver; Götz, Carolin; Wolff, Klaus-Dietrich; Hapfelmeier, Alexander; Prodinger, Peter Michael; Tischer, Thomas
2017-07-11
A high percentage of closed femur fractures have slight comminution. Using micro-CT (μCT), multiple-fragment segmentation is much more difficult than segmentation of unfractured or osteotomied bone. Manual or semi-automated segmentation has been performed to date. However, such segmentation is extremely laborious, time-consuming and error-prone. Our aim was therefore to apply a fully automated segmentation algorithm to determine μCT parameters and examine their association with biomechanics. The femora of 64 rats, randomised to medication with an inhibitory or neutral effect on fracture healing or to a control group, were closed-fractured after a Kirschner wire had been inserted. After 21 days, μCT and biomechanical parameters were determined by a fully automated method and correlated (Pearson's correlation). The fully automated segmentation algorithm automatically detected bone and simultaneously separated cortical bone from callus without requiring ROI selection for each single bony structure. We found an association of structural callus parameters obtained by μCT with the biomechanical properties. However, the results were only explicable by additionally considering the callus location. A large number of slightly comminuted fractures in combination with therapies that influence the callus qualitatively and/or quantitatively considerably affects the association between μCT and biomechanics. In the future, contrast-enhanced μCT imaging of the callus cartilage might provide more information to improve the non-destructive and non-invasive prediction of callus mechanical properties. As studies evaluating such important drugs increase, fully automated segmentation appears to be clinically important.
Singh, Tulika; Sharma, Madhurima; Singla, Veenu; Khandelwal, Niranjan
2016-01-01
The objective of our study was to calculate mammographic breast density with a fully automated volumetric breast density measurement method and to compare it to Breast Imaging Reporting and Data System (BI-RADS) breast density categories assigned by two radiologists. A total of 476 full-field digital mammography examinations with standard mediolateral oblique and craniocaudal views were evaluated by two blinded radiologists and BI-RADS density categories were assigned. Using fully automated software, mean fibroglandular tissue volume, mean breast volume, and mean volumetric breast density were calculated. Based on percentage volumetric breast density, a volumetric density grade was assigned from 1 to 4. The weighted overall kappa was 0.895 (almost perfect agreement) for the two radiologists' BI-RADS density estimates. A statistically significant difference was seen in mean volumetric breast density among the BI-RADS density categories. With increased BI-RADS density category, an increase in mean volumetric breast density was also seen (P < 0.001). A significant positive correlation was found between BI-RADS categories and volumetric density grading by the fully automated software (ρ = 0.728, P < 0.001 for the first radiologist and ρ = 0.725, P < 0.001 for the second radiologist). Pairwise estimates of the weighted kappa between Volpara density grade and BI-RADS density category by the two observers showed fair agreement (κ = 0.398 and 0.388, respectively). In our study, a good correlation was seen between density grading using the fully automated volumetric method and density grading using BI-RADS density categories assigned by the two radiologists. Thus, the fully automated volumetric method may be used to quantify breast density on routine mammography. Copyright © 2016 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
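The inter-rater agreement statistic reported above (weighted kappa) can be computed directly from the two raters' category assignments. A minimal sketch, assuming quadratic weights and purely illustrative grade data (not the study's):

```python
import numpy as np

def weighted_kappa(r1, r2, k):
    """Cohen's quadratically weighted kappa for two raters over categories 0..k-1."""
    O = np.zeros((k, k))
    for i, j in zip(r1, r2):
        O[i, j] += 1
    O /= O.sum()                                  # observed proportions
    E = np.outer(O.sum(axis=1), O.sum(axis=0))    # chance-expected proportions
    idx = np.arange(k)
    W = (idx[:, None] - idx[None, :]) ** 2        # quadratic disagreement weights
    return 1.0 - (W * O).sum() / (W * E).sum()

# Illustrative density grades (0-3) from two hypothetical raters
r1 = [0, 1, 2, 3, 1, 2, 2, 3]
r2 = [0, 1, 2, 3, 2, 2, 1, 3]
print(round(weighted_kappa(r1, r2, 4), 3))  # → 0.867
```

Quadratic weights penalise disagreements by the squared category distance, so a grade-1 vs grade-4 disagreement costs far more than grade-2 vs grade-3.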
Automatic Identification and Quantification of Extra-Well Fluorescence in Microarray Images.
Rivera, Robert; Wang, Jie; Yu, Xiaobo; Demirkan, Gokhan; Hopper, Marika; Bian, Xiaofang; Tahsin, Tasnia; Magee, D Mitchell; Qiu, Ji; LaBaer, Joshua; Wallstrom, Garrick
2017-11-03
In recent studies involving NAPPA microarrays, extra-well fluorescence is used as a key measure for identifying disease biomarkers because there is evidence to support that it is better correlated with strong antibody responses than statistical analysis involving intraspot intensity. Because this feature is not well quantified by traditional image analysis software, identification and quantification of extra-well fluorescence is performed manually, which is both time-consuming and highly susceptible to variation between raters. A system that could automate this task efficiently and effectively would greatly improve the process of data acquisition in microarray studies, thereby accelerating the discovery of disease biomarkers. In this study, we experimented with different machine learning methods, as well as novel heuristics, for identifying spots exhibiting extra-well fluorescence (rings) in microarray images and assigning each ring a grade of 1-5 based on its intensity and morphology. The sensitivity of our final system for identifying rings was found to be 72% at 99% specificity and 98% at 92% specificity. Our system performs this task significantly faster than a human, while maintaining high performance, and therefore represents a valuable tool for microarray image analysis.
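The paper's actual system combines machine learning with heuristics; as an illustrative stand-in only, a simple annulus-based heuristic conveys the core idea of comparing intensity just outside the spot boundary against the local background:

```python
import numpy as np

def has_ring(img, cx, cy, spot_r, ring_w=4, thresh=1.5):
    """Flag extra-well fluorescence: is the annulus just outside the spot
    markedly brighter than the local background further out?"""
    yy, xx = np.indices(img.shape)
    d = np.hypot(xx - cx, yy - cy)
    ring = img[(d >= spot_r) & (d < spot_r + ring_w)]
    background = img[(d >= spot_r + 2 * ring_w) & (d < spot_r + 3 * ring_w)]
    return float(np.median(ring)) > thresh * float(np.median(background))

# Synthetic 64x64 spot image with a bright ring just outside the spot boundary
img = np.full((64, 64), 100.0)
yy, xx = np.indices(img.shape)
d = np.hypot(xx - 32, yy - 32)
img[(d >= 8) & (d < 12)] = 400.0
print(has_ring(img, 32, 32, spot_r=8))  # → True
```

A graded 1-5 score, as in the study, could then be derived from the ring/background ratio and the annulus morphology rather than from a binary threshold.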
Gokce, Sertan Kutal; Guo, Samuel X.; Ghorashian, Navid; Everett, W. Neil; Jarrell, Travis; Kottek, Aubri; Bovik, Alan C.; Ben-Yakar, Adela
2014-01-01
Femtosecond laser nanosurgery has been widely accepted as an axonal injury model, enabling nerve regeneration studies in the small model organism, Caenorhabditis elegans. To overcome the time limitations of manual worm handling techniques, automation and new immobilization technologies must be adopted to improve throughput in these studies. While new microfluidic immobilization techniques have been developed that promise to reduce the time required for axotomies, there is a need for automated procedures to minimize the required amount of human intervention and accelerate the axotomy processes crucial for high-throughput. Here, we report a fully automated microfluidic platform for performing laser axotomies of fluorescently tagged neurons in living Caenorhabditis elegans. The presented automation process reduces the time required to perform axotomies within individual worms to ∼17 s/worm, at least one order of magnitude faster than manual approaches. The full automation is achieved with a unique chip design and an operation sequence that is fully computer controlled and synchronized with efficient and accurate image processing algorithms. The microfluidic device includes a T-shaped architecture and three-dimensional microfluidic interconnects to serially transport, position, and immobilize worms. The image processing algorithms can identify and precisely position axons targeted for ablation. There were no statistically significant differences observed in reconnection probabilities between axotomies carried out with the automated system and those performed manually with anesthetics. The overall success rate of automated axotomies was 67.4±3.2% of the cases (236/350) at an average processing rate of 17.0±2.4 s. This fully automated platform establishes a promising methodology for prospective genome-wide screening of nerve regeneration in C. elegans in a truly high-throughput manner. PMID:25470130
Arlt, Sönke; Buchert, Ralph; Spies, Lothar; Eichenlaub, Martin; Lehmbeck, Jan T; Jahn, Holger
2013-06-01
Fully automated magnetic resonance imaging (MRI)-based volumetry may serve as biomarker for the diagnosis in patients with mild cognitive impairment (MCI) or dementia. We aimed at investigating the relation between fully automated MRI-based volumetric measures and neuropsychological test performance in amnestic MCI and patients with mild dementia due to Alzheimer's disease (AD) in a cross-sectional and longitudinal study. In order to assess a possible prognostic value of fully automated MRI-based volumetry for future cognitive performance, the rate of change of neuropsychological test performance over time was also tested for its correlation with fully automated MRI-based volumetry at baseline. In 50 subjects, 18 with amnestic MCI, 21 with mild AD, and 11 controls, neuropsychological testing and T1-weighted MRI were performed at baseline and at a mean follow-up interval of 2.1 ± 0.5 years (n = 19). Fully automated MRI volumetry of the grey matter volume (GMV) was performed using a combined stereotactic normalisation and segmentation approach as provided by SPM8 and a set of pre-defined binary lobe masks. Left and right hippocampus masks were derived from probabilistic cytoarchitectonic maps. Volumes of the inner and outer liquor space were also determined automatically from the MRI. Pearson's test was used for the correlation analyses. Left hippocampal GMV was significantly correlated with performance in memory tasks, and left temporal GMV was related to performance in language tasks. Bilateral frontal, parietal and occipital GMVs were correlated to performance in neuropsychological tests comprising multiple domains. Rate of GMV change in the left hippocampus was correlated with decline of performance in the Boston Naming Test (BNT), Mini-Mental Status Examination, and trail making test B (TMT-B). The decrease of BNT and TMT-A performance over time correlated with the loss of grey matter in multiple brain regions. 
We conclude that fully automated MRI-based volumetry allows detection of regional grey matter volume loss that correlates with neuropsychological performance in patients with amnestic MCI or mild AD. Because of the high level of automation, MRI-based volumetry may easily be integrated into clinical routine to complement the current diagnostic procedure.
Huang, Jianyan; Maram, Jyotsna; Tepelus, Tudor C; Modak, Cristina; Marion, Ken; Sadda, SriniVas R; Chopra, Vikas; Lee, Olivia L
2017-08-07
To determine the reliability of corneal endothelial cell density (ECD) obtained by automated specular microscopy versus that of validated manual methods, and the factors that predict such reliability. Sharp central images from 94 control and 106 glaucomatous eyes were captured with a Konan specular microscope NSP-9900. All images were analyzed by trained graders using Konan CellChek Software, employing the fully- and semi-automated methods as well as the Center Method. Images with low cell count (input cell number <100) and/or guttata were compared with the Center and Flex-Center Methods. ECDs were compared and absolute error was used to assess variation. The effect on ECD of age, cell count, cell size, and cell size variation was evaluated. No significant difference was observed between the Center and Flex-Center Methods in corneas with guttata (p=0.48) or low ECD (p=0.11). No difference (p=0.32) was observed in ECD of normal controls <40 yrs old between the fully-automated method and the manual Center Method. However, in older controls and glaucomatous eyes, ECD was overestimated by the fully-automated method (p=0.034) and semi-automated method (p=0.025) as compared to the manual method. Our findings show that automated analysis significantly overestimates ECD in eyes with high polymegathism and/or large cell size, compared to the manual method. Therefore, we discourage reliance upon the fully-automated method alone to perform specular microscopy analysis, particularly if an accurate ECD value is imperative. Copyright © 2017. Published by Elsevier España, S.L.U.
DOT National Transportation Integrated Search
2009-02-01
The Office of Special Investigations at Iowa Department of Transportation (DOT) collects FWD data on regular basis to evaluate pavement structural conditions. The primary objective of this study was to develop a fully-automated software system for ra...
Ercan, Ertuğrul; Kırılmaz, Bahadır; Kahraman, İsmail; Bayram, Vildan; Doğan, Hüseyin
2012-11-01
Flow-mediated dilation (FMD) is used to evaluate endothelial function. Computer-assisted analysis utilizing edge detection permits continuous measurements along the vessel wall. We have developed a new fully automated software program to allow accurate and reproducible measurement. FMD was measured and analyzed in 18 coronary artery disease (CAD) patients and 17 controls, both manually and with the newly developed (computer-supported) software. The agreement between methods was assessed by Bland-Altman analysis. The mean age, body mass index and cardiovascular risk factor burden were higher in the CAD group. Automated FMD% was 18.3±8.5 for the control subjects and 6.8±6.5 for the CAD group (p=0.0001). The intraobserver and interobserver correlations for automated measurement were high (r=0.974, r=0.981, r=0.937, r=0.918, respectively). Manual FMD% at the 60th second was correlated with automated FMD% (r=0.471, p=0.004). The new fully automated software can be used for precise measurement of FMD, with lower intra- and interobserver variability than manual assessment.
Oligo Design: a computer program for development of probes for oligonucleotide microarrays.
Herold, Keith E; Rasooly, Avraham
2003-12-01
Oligonucleotide microarrays have demonstrated potential for the analysis of gene expression, genotyping, and mutational analysis. Our work focuses primarily on the detection and identification of bacteria based on known short sequences of DNA. Oligo Design, the software described here, automates several design aspects that enable the improved selection of oligonucleotides for use with microarrays for these applications. Two major features of the program are: (i) a tiling algorithm for the design of short overlapping temperature-matched oligonucleotides of variable length, which are useful for the analysis of single nucleotide polymorphisms and (ii) a set of tools for the analysis of multiple alignments of gene families and related short DNA sequences, which allow for the identification of conserved DNA sequences for PCR primer selection and variable DNA sequences for the selection of unique probes for identification. Note that the program does not address the full genome perspective but, instead, is focused on the genetic analysis of short segments of DNA. The program is Internet-enabled and includes a built-in browser and the automated ability to download sequences from GenBank by specifying the GI number. The program also includes several utilities, including audio recital of a DNA sequence (useful for verifying sequences against a written document), a random sequence generator that provides insight into the relationship between melting temperature and GC content, and a PCR calculator.
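The tiling idea above, growing each overlapping oligonucleotide until it reaches a target melting temperature, can be sketched as follows. This uses the simple Wallace rule as a stand-in Tm estimate; the actual program's thermodynamic model is not specified in the abstract, and the sequence is illustrative:

```python
def wallace_tm(oligo):
    """Wallace rule: Tm ≈ 2*(A+T) + 4*(G+C), a rough estimate for short oligos."""
    s = oligo.upper()
    return 2 * (s.count("A") + s.count("T")) + 4 * (s.count("G") + s.count("C"))

def tile(seq, target_tm=60, step=5, min_len=12):
    """Overlapping tiling: from each start position, grow the oligo until it
    reaches the target Tm (or the end of the sequence)."""
    oligos = []
    pos = 0
    while pos + min_len <= len(seq):
        end = pos + min_len
        while end < len(seq) and wallace_tm(seq[pos:end]) < target_tm:
            end += 1
        oligos.append(seq[pos:end])
        pos += step
    return oligos

seq = "ATGCGTACGTTAGCGCATATCGGCTAAGCTT"  # illustrative target sequence
for o in tile(seq):
    print(o, wallace_tm(o))
```

Because GC-rich stretches reach the target Tm sooner, the resulting oligos vary in length, which is exactly what temperature-matching requires.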
21 CFR 864.5200 - Automated cell counter.
Code of Federal Regulations, 2010 CFR
2010-04-01
864.5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of the...
21 CFR 864.5240 - Automated blood cell diluting apparatus.
Code of Federal Regulations, 2010 CFR
2010-04-01
864.5240 Automated blood cell diluting apparatus. (a) Identification. An automated blood cell diluting apparatus is a fully automated or semi-automated device used to make appropriate dilutions of a blood sample...
21 CFR 864.5240 - Automated blood cell diluting apparatus.
Code of Federal Regulations, 2011 CFR
2011-04-01
864.5240 Automated blood cell diluting apparatus. (a) Identification. An automated blood cell diluting apparatus is a fully automated or semi-automated device used to make appropriate dilutions of a blood sample...
21 CFR 866.1645 - Fully automated short-term incubation cycle antimicrobial susceptibility system.
Code of Federal Regulations, 2011 CFR
2011-04-01
866.1645 Fully automated short-term incubation cycle antimicrobial susceptibility system. FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED), MEDICAL DEVICES, IMMUNOLOGY AND MICROBIOLOGY DEVICES...
21 CFR 866.1645 - Fully automated short-term incubation cycle antimicrobial susceptibility system.
Code of Federal Regulations, 2013 CFR
2013-04-01
866.1645 Fully automated short-term incubation cycle antimicrobial susceptibility system. FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED), MEDICAL DEVICES, IMMUNOLOGY AND MICROBIOLOGY DEVICES...
21 CFR 866.1645 - Fully automated short-term incubation cycle antimicrobial susceptibility system.
Code of Federal Regulations, 2014 CFR
2014-04-01
866.1645 Fully automated short-term incubation cycle antimicrobial susceptibility system. FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED), MEDICAL DEVICES, IMMUNOLOGY AND MICROBIOLOGY DEVICES...
21 CFR 866.1645 - Fully automated short-term incubation cycle antimicrobial susceptibility system.
Code of Federal Regulations, 2012 CFR
2012-04-01
866.1645 Fully automated short-term incubation cycle antimicrobial susceptibility system. FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED), MEDICAL DEVICES, IMMUNOLOGY AND MICROBIOLOGY DEVICES...
Anesthesiology, automation, and artificial intelligence.
Alexander, John C; Joshi, Girish P
2018-01-01
There have been many attempts to incorporate automation into the practice of anesthesiology, though none have been successful. Fundamentally, these failures are due to the underlying complexity of anesthesia practice and the inability of rule-based feedback loops to fully master it. Recent innovations in artificial intelligence, especially machine learning, may usher in a new era of automation across many industries, including anesthesiology. It would be wise to consider the implications of such potential changes before they have been fully realized.
21 CFR 864.5200 - Automated cell counter.
Code of Federal Regulations, 2014 CFR
2014-04-01
864.5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of the patient's peripheral blood (blood circulating in one of the body's extremities, such as the arm). These...
21 CFR 864.5200 - Automated cell counter.
Code of Federal Regulations, 2011 CFR
2011-04-01
864.5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of the patient's peripheral blood (blood circulating in one of the body's extremities, such as the arm). These...
21 CFR 864.5200 - Automated cell counter.
Code of Federal Regulations, 2012 CFR
2012-04-01
864.5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of the patient's peripheral blood (blood circulating in one of the body's extremities, such as the arm). These...
21 CFR 864.5200 - Automated cell counter.
Code of Federal Regulations, 2013 CFR
2013-04-01
864.5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of the patient's peripheral blood (blood circulating in one of the body's extremities, such as the arm). These...
Dynamic variable selection in SNP genotype autocalling from APEX microarray data.
Podder, Mohua; Welch, William J; Zamar, Ruben H; Tebbutt, Scott J
2006-11-30
Single nucleotide polymorphisms (SNPs) are DNA sequence variations, occurring when a single nucleotide--adenine (A), thymine (T), cytosine (C) or guanine (G)--is altered. Arguably, SNPs account for more than 90% of human genetic variation. Our laboratory has developed a highly redundant SNP genotyping assay consisting of multiple probes with signals from multiple channels for a single SNP, based on arrayed primer extension (APEX). This mini-sequencing method is a powerful combination of a highly parallel microarray with distinctive Sanger-based dideoxy terminator sequencing chemistry. Using this microarray platform, our current genotype calling system (known as SNP Chart) is capable of calling single SNP genotypes by manual inspection of the APEX data, which is time-consuming and subject to user bias. Using a set of 32 Coriell DNA samples plus three negative PCR controls as a training data set, we have developed a fully automated genotyping algorithm based on simple linear discriminant analysis (LDA) using dynamic variable selection. The algorithm combines separate analyses based on the multiple probe sets to give a final posterior probability for each candidate genotype. We have tested our algorithm on a completely independent data set of 270 DNA samples, with validated genotypes, from patients admitted to the intensive care unit (ICU) of St. Paul's Hospital (plus one negative PCR control sample). Our method achieves a concordance rate of 98.9% with a 99.6% call rate for a set of 96 SNPs. By adjusting the threshold value for the final posterior probability of the called genotype, the call rate reduces to 94.9% with a higher concordance rate of 99.6%. We also reversed the two independent data sets in their training and testing roles, achieving a concordance rate up to 99.8%. The strength of this APEX chemistry-based platform is its unique redundancy of having multiple probes for a single SNP.
Our model-based genotype calling algorithm captures the redundancy in the system considering all the underlying probe features of a particular SNP, automatically down-weighting any 'bad data' corresponding to image artifacts on the microarray slide or failure of a specific chemistry. In this regard, our method is able to automatically select the probes which work well and reduce the effect of other so-called bad performing probes in a sample-specific manner, for any number of SNPs.
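The core of such a redundancy-aware caller, combining per-probe evidence into a posterior over genotypes and withholding a call when the winning posterior falls below a threshold, can be sketched as follows. This is a simplified illustration with a flat prior and made-up likelihoods, not the authors' LDA implementation:

```python
import numpy as np

GENOTYPES = ("AA", "AB", "BB")

def call_genotype(probe_loglikes, threshold=0.99):
    """Combine independent per-probe log-likelihoods (rows) into a posterior
    over genotypes under a flat prior; return 'NC' (no-call) if the winning
    posterior falls below the threshold."""
    total = np.sum(probe_loglikes, axis=0)   # log of the product of likelihoods
    post = np.exp(total - total.max())       # numerically stabilised softmax
    post /= post.sum()
    best = int(np.argmax(post))
    call = GENOTYPES[best] if post[best] >= threshold else "NC"
    return call, post

# Two probes agree strongly on 'AB'; a third, noisier probe barely moves the posterior
ll = np.log(np.array([[0.01, 0.97, 0.02],
                      [0.05, 0.90, 0.05],
                      [0.30, 0.40, 0.30]]))
call, post = call_genotype(ll)
print(call, post.round(3))
```

Raising the threshold trades call rate for concordance, mirroring the 99.6%/94.9% trade-off reported in the abstract; a near-uniform probe contributes little to the product, which is how 'bad data' is naturally down-weighted.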
Isolation of Microarray-Grade Total RNA, MicroRNA, and DNA from a Single PAXgene Blood RNA Tube
Kruhøffer, Mogens; Dyrskjøt, Lars; Voss, Thorsten; Lindberg, Raija L.P.; Wyrich, Ralf; Thykjaer, Thomas; Orntoft, Torben F.
2007-01-01
We have developed a procedure for isolation of microRNA and genomic DNA in addition to total RNA from whole blood stabilized in PAXgene Blood RNA tubes. The procedure is based on automatic extraction on a BioRobot MDx and includes isolation of DNA from a fraction of the stabilized blood and recovery of small RNA species that are otherwise lost. The procedure presented here is suitable for large-scale experiments and is amenable to further automation. Procured total RNA and DNA was tested using Affymetrix Expression and single-nucleotide polymorphism GeneChips, respectively, and isolated microRNA was tested using spotted locked nucleic acid-based microarrays. We conclude that the yield and quality of total RNA, microRNA, and DNA from a single PAXgene blood RNA tube is sufficient for downstream microarray analysis. PMID:17690207
Scheible, Max B; Pardatscher, Günther; Kuzyk, Anton; Simmel, Friedrich C
2014-03-12
The combination of molecular self-assembly based on the DNA origami technique with lithographic patterning enables the creation of hierarchically ordered nanosystems, in which single molecules are positioned at precise locations on multiple length scales. Based on a hybrid assembly protocol utilizing DNA self-assembly and electron-beam lithography on transparent glass substrates, we here demonstrate a DNA origami microarray, which is compatible with the requirements of single molecule fluorescence and super-resolution microscopy. The spatial arrangement allows for a simple and reliable identification of single molecule events and facilitates automated read-out and data analysis. As a specific application, we utilize the microarray to characterize the performance of DNA strand displacement reactions localized on the DNA origami structures. We find considerable variability within the array, which results both from structural variations and stochastic reaction dynamics prevalent at the single molecule level.
Tra, Yolande V; Evans, Irene M
2010-01-01
BIO2010 put forth the goal of improving the mathematical educational background of biology students. The analysis and interpretation of microarray high-dimensional data can be very challenging and is best done by a statistician and a biologist working and teaching in a collaborative manner. We set up such a collaboration and designed a course on microarray data analysis. We started using Genome Consortium for Active Teaching (GCAT) materials and Microarray Genome and Clustering Tool software and added R statistical software along with Bioconductor packages. In response to student feedback, one microarray data set was fully analyzed in class, starting from preprocessing to gene discovery to pathway analysis using the latter software. A class project was to conduct a similar analysis where students analyzed their own data or data from a published journal paper. This exercise showed the impact that filtering, preprocessing, and different normalization methods had on gene inclusion in the final data set. We conclude that this course achieved its goals to equip students with skills to analyze data from a microarray experiment. We offer our insight about collaborative teaching as well as how other faculty might design and implement a similar interdisciplinary course.
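One preprocessing step whose impact students can observe directly is normalization. A minimal sketch of quantile normalization, a method commonly applied to microarray data (illustrative toy data; not necessarily the course's exact pipeline, which used R/Bioconductor):

```python
import numpy as np

def quantile_normalize(X):
    """Give every array (column) the same intensity distribution: rank values
    within each column, then substitute the mean of each rank across columns."""
    ranks = np.argsort(np.argsort(X, axis=0), axis=0)
    mean_by_rank = np.sort(X, axis=0).mean(axis=1)
    return mean_by_rank[ranks]

# Toy expression matrix: 4 genes (rows) x 3 arrays (columns)
X = np.array([[5.0, 4.0, 3.0],
              [2.0, 1.0, 4.0],
              [3.0, 4.0, 6.0],
              [4.0, 2.0, 8.0]])
Xn = quantile_normalize(X)
print(Xn)
```

After normalization every column shares an identical set of values, so genes that pass or fail a downstream filter can change, illustrating the class's observation that preprocessing choices affect which genes end up in the final data set.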
An ODE-Based Wall Model for Turbulent Flow Simulations
NASA Technical Reports Server (NTRS)
Berger, Marsha J.; Aftosmis, Michael J.
2017-01-01
Fully automated meshing for Reynolds-Averaged Navier-Stokes (RANS) simulations: mesh generation for complex geometry continues to be the biggest bottleneck in the RANS simulation process. Fully automated Cartesian methods are routinely used for inviscid simulations about arbitrarily complex geometry, but these methods lack an obvious and robust way to achieve near-wall anisotropy. Goal: extend these methods for RANS simulation without sacrificing automation, at an affordable cost. Note: nothing here is limited to Cartesian methods, and much becomes simpler in a body-fitted setting.
Anesthesiology, automation, and artificial intelligence
Alexander, John C.; Joshi, Girish P.
2018-01-01
ABSTRACT There have been many attempts to incorporate automation into the practice of anesthesiology, though none have been successful. Fundamentally, these failures are due to the underlying complexity of anesthesia practice and the inability of rule-based feedback loops to fully master it. Recent innovations in artificial intelligence, especially machine learning, may usher in a new era of automation across many industries, including anesthesiology. It would be wise to consider the implications of such potential changes before they have been fully realized. PMID:29686578
Kovacevic, Sanja; Rafii, Michael S.; Brewer, James B.
2008-01-01
Medial temporal lobe (MTL) atrophy is associated with increased risk for conversion to Alzheimer's disease (AD), but manual tracing techniques and even semi-automated techniques for volumetric assessment are not practical in the clinical setting. In addition, most studies that examined MTL atrophy in AD have focused only on the hippocampus. The extent to which volumes of the amygdala and the temporal horn of the lateral ventricle predict subsequent clinical decline is unknown. This study examined whether measures of hippocampus, amygdala, and temporal horn volume predict clinical decline over the following 6-month period in patients with mild cognitive impairment (MCI). Fully-automated volume measurements were performed in 269 MCI patients. Baseline volumes of the hippocampus, amygdala, and temporal horn were evaluated as predictors of change in Mini-Mental State Exam (MMSE) and Clinical Dementia Rating Sum of Boxes (CDR SB) over a 6-month interval. Fully-automated measurements of baseline hippocampus and amygdala volumes correlated with baseline delayed recall scores. Patients with smaller baseline volumes of the hippocampus and amygdala or larger baseline volumes of the temporal horn had more rapid subsequent clinical decline on MMSE and CDR SB. Fully-automated and rapid measurement of segmental MTL volumes may help clinicians predict clinical decline in MCI patients. PMID:19474571
Cardiac imaging: working towards fully-automated machine analysis & interpretation.
Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido
2017-03-01
Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered: This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary: Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation.
Twelve automated thresholding methods for segmentation of PET images: a phantom study.
Prieto, Elena; Lecumberri, Pablo; Pagola, Miguel; Gómez, Marisol; Bilbao, Izaskun; Ecay, Margarita; Peñuelas, Iván; Martí-Climent, Josep M
2012-06-21
Tumor volume delineation over positron emission tomography (PET) images is of great interest for proper diagnosis and therapy planning. However, standard segmentation techniques (manual or semi-automated) are operator dependent and time consuming while fully automated procedures are cumbersome or require complex mathematical development. The aim of this study was to segment PET images in a fully automated way by implementing a set of 12 automated thresholding algorithms, classical in the fields of optical character recognition, tissue engineering or non-destructive testing images in high-tech structures. Automated thresholding algorithms select a specific threshold for each image without any a priori spatial information of the segmented object or any special calibration of the tomograph, as opposed to usual thresholding methods for PET. Spherical (18)F-filled objects of different volumes were acquired on clinical PET/CT and on a small animal PET scanner, with three different signal-to-background ratios. Images were segmented with 12 automatic thresholding algorithms and results were compared with the standard segmentation reference, a threshold at 42% of the maximum uptake. Ridler and Ramesh thresholding algorithms based on clustering and histogram-shape information, respectively, provided better results than the classical 42%-based threshold (p < 0.05). We have herein demonstrated that fully automated thresholding algorithms can provide better results than classical PET segmentation tools.
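The Ridler method named in the abstract is the classic iterative clustering (isodata) threshold: split the histogram at a trial threshold, take the midpoint of the two class means as the new threshold, and repeat until it stabilizes. A minimal sketch (the synthetic "hot sphere" image and tolerance are illustrative, not the study's phantom data):

```python
import numpy as np

def ridler_isodata_threshold(image, tol=0.5, max_iter=100):
    """Iterative clustering (isodata) threshold: split pixels into two
    classes and move the threshold to the midpoint of the class means
    until it converges."""
    pixels = np.asarray(image, dtype=float).ravel()
    t = pixels.mean()  # start at the global mean
    for _ in range(max_iter):
        fg = pixels[pixels > t]
        bg = pixels[pixels <= t]
        if fg.size == 0 or bg.size == 0:
            break
        t_new = 0.5 * (fg.mean() + bg.mean())
        if abs(t_new - t) < tol:
            t = t_new
            break
        t = t_new
    return t

# Synthetic "hot region on warm background" example (not phantom data)
rng = np.random.default_rng(0)
img = rng.normal(100, 10, size=(64, 64))                 # background
img[24:40, 24:40] = rng.normal(400, 20, size=(16, 16))   # uptake region
thr = ridler_isodata_threshold(img)
mask = img > thr
```

Because the threshold is derived from each image's own histogram, no scanner calibration or prior object size is needed, which is the property the abstract emphasizes.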
Twelve automated thresholding methods for segmentation of PET images: a phantom study
NASA Astrophysics Data System (ADS)
Prieto, Elena; Lecumberri, Pablo; Pagola, Miguel; Gómez, Marisol; Bilbao, Izaskun; Ecay, Margarita; Peñuelas, Iván; Martí-Climent, Josep M.
2012-06-01
Tumor volume delineation over positron emission tomography (PET) images is of great interest for proper diagnosis and therapy planning. However, standard segmentation techniques (manual or semi-automated) are operator dependent and time consuming while fully automated procedures are cumbersome or require complex mathematical development. The aim of this study was to segment PET images in a fully automated way by implementing a set of 12 automated thresholding algorithms, classical in the fields of optical character recognition, tissue engineering or non-destructive testing images in high-tech structures. Automated thresholding algorithms select a specific threshold for each image without any a priori spatial information of the segmented object or any special calibration of the tomograph, as opposed to usual thresholding methods for PET. Spherical 18F-filled objects of different volumes were acquired on clinical PET/CT and on a small animal PET scanner, with three different signal-to-background ratios. Images were segmented with 12 automatic thresholding algorithms and results were compared with the standard segmentation reference, a threshold at 42% of the maximum uptake. Ridler and Ramesh thresholding algorithms based on clustering and histogram-shape information, respectively, provided better results than the classical 42%-based threshold (p < 0.05). We have herein demonstrated that fully automated thresholding algorithms can provide better results than classical PET segmentation tools.
Hu, Guohong; Wang, Hui-Yun; Greenawalt, Danielle M.; Azaro, Marco A.; Luo, Minjie; Tereshchenko, Irina V.; Cui, Xiangfeng; Yang, Qifeng; Gao, Richeng; Shen, Li; Li, Honghua
2006-01-01
Microarray-based analysis of single nucleotide polymorphisms (SNPs) has many applications in large-scale genetic studies. To minimize the influence of experimental variation, microarray data usually need to be processed in different aspects including background subtraction, normalization and low-signal filtering before genotype determination. Although many sophisticated algorithms exist for these purposes, biases are still present. In the present paper, new algorithms for SNP microarray data analysis and the software, AccuTyping, developed based on these algorithms are described. The algorithms take advantage of a large number of SNPs included in each assay, and the fact that the top and bottom 20% of SNPs can be safely treated as homozygous after sorting based on the ratios of their signal intensities. These SNPs are then used as controls for color channel normalization and background subtraction. Genotype calls are made based on the logarithms of signal intensity ratios using two cutoff values, which were determined after training the program with a dataset of ∼160 000 genotypes and validated by non-microarray methods. AccuTyping was used to determine >300 000 genotypes of DNA and sperm samples. The accuracy was shown to be >99%. AccuTyping can be downloaded from . PMID:16982644
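The calling scheme described above can be sketched as follows. This is an illustrative toy, not AccuTyping itself: the symmetric cutoff of 1.0 and the use of the homozygous-control medians to centre the log-ratio scale are assumptions standing in for the paper's trained cutoffs and channel normalization.

```python
import numpy as np

def call_genotypes(sig_a, sig_b, cutoff=1.0):
    """Toy two-color genotype caller: sort SNPs by intensity ratio,
    treat the top and bottom 20% as homozygous controls, centre the
    log-ratio scale on them, then call with symmetric cutoffs."""
    r = np.log2(np.asarray(sig_a, float) / np.asarray(sig_b, float))
    order = np.argsort(r)
    k = max(1, len(r) // 5)  # 20% tails assumed homozygous
    # midpoint of the two control medians removes channel bias
    centre = 0.5 * (np.median(r[order[:k]]) + np.median(r[order[-k:]]))
    r = r - centre
    return np.where(r > cutoff, "AA", np.where(r < -cutoff, "BB", "AB"))

sig_a = [1000, 950, 30, 40, 500, 480]   # allele-A channel intensities
sig_b = [35, 40, 1100, 900, 520, 470]   # allele-B channel intensities
calls = call_genotypes(sig_a, sig_b)
```

Here the first two SNPs call AA, the next two BB, and the balanced pair AB, matching the intuition that heterozygotes sit near a log ratio of zero.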
21 CFR 864.5620 - Automated hemoglobin system.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated hemoglobin system. 864.5620 Section 864....5620 Automated hemoglobin system. (a) Identification. An automated hemoglobin system is a fully... hemoglobin content of human blood. (b) Classification. Class II (performance standards). [45 FR 60601, Sept...
21 CFR 864.5620 - Automated hemoglobin system.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Automated hemoglobin system. 864.5620 Section 864....5620 Automated hemoglobin system. (a) Identification. An automated hemoglobin system is a fully... hemoglobin content of human blood. (b) Classification. Class II (performance standards). [45 FR 60601, Sept...
Abou Assi, Hala; Gómez-Pinto, Irene; González, Carlos
2017-01-01
Abstract In situ fabricated nucleic acids microarrays are versatile and very high-throughput platforms for aptamer optimization and discovery, but the chemical space that can be probed against a given target has largely been confined to DNA, while RNA and non-natural nucleic acid microarrays are still an essentially uncharted territory. 2΄-Fluoroarabinonucleic acid (2΄F-ANA) is a prime candidate for such use in microarrays. Indeed, 2΄F-ANA chemistry is readily amenable to photolithographic microarray synthesis and its potential in high affinity aptamers has been recently discovered. We thus synthesized the first microarrays containing 2΄F-ANA and 2΄F-ANA/DNA chimeric sequences to fully map the binding affinity landscape of the TBA1 thrombin-binding G-quadruplex aptamer containing all 32 768 possible DNA-to-2΄F-ANA mutations. The resulting microarray was screened against thrombin to identify a series of promising 2΄F-ANA-modified aptamer candidates with Kds significantly lower than that of the unmodified control and which were found to adopt highly stable, antiparallel-folded G-quadruplex structures. The solution structure of the TBA1 aptamer modified with 2΄F-ANA at position T3 shows that fluorine substitution preorganizes the dinucleotide loop into the proper conformation for interaction with thrombin. Overall, our work strengthens the potential of 2΄F-ANA in aptamer research and further expands non-genomic applications of nucleic acids microarrays. PMID:28100695
Towards Automated Screening of Two-dimensional Crystals
Cheng, Anchi; Leung, Albert; Fellmann, Denis; Quispe, Joel; Suloway, Christian; Pulokas, James; Carragher, Bridget; Potter, Clinton S.
2007-01-01
Screening trials to determine the presence of two-dimensional (2D) protein crystals suitable for three-dimensional structure determination using electron crystallography is a very labor-intensive process. Methods compatible with fully automated screening have been developed for the process of crystal production by dialysis and for producing negatively stained grids of the resulting trials. Further automation via robotic handling of the EM grids, and semi-automated transmission electron microscopic imaging and evaluation of the trial grids is also possible. We, and others, have developed working prototypes for several of these tools and tested and evaluated them in a simple screen of 24 crystallization conditions. While further development of these tools is certainly required for a turn-key system, the goal of fully automated screening appears to be within reach. PMID:17977016
Investigating Factors Affecting the Uptake of Automated Assessment Technology
ERIC Educational Resources Information Center
Dreher, Carl; Reiners, Torsten; Dreher, Heinz
2011-01-01
Automated assessment is an emerging innovation in educational praxis, however its pedagogical potential is not fully utilised in Australia, particularly regarding automated essay grading. The rationale for this research is that the usage of automated assessment currently lags behind the capacity that the technology provides, thus restricting the…
NASA Technical Reports Server (NTRS)
Steinberg, R.
1978-01-01
A low-cost communications system to provide meteorological data from commercial aircraft, in near real-time, on a fully automated basis has been developed. The complete system including the low profile antenna and all installation hardware weighs 34 kg. The prototype system was installed on a B-747 aircraft and provided meteorological data (wind angle and velocity, temperature, altitude and position as a function of time) on a fully automated basis. The results were exceptional. This concept is expected to have important implications for operational meteorology and airline route forecasting.
Cardiac imaging: working towards fully-automated machine analysis & interpretation
Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido
2017-01-01
Introduction Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation. PMID:28277804
Automated frame selection process for high-resolution microendoscopy
NASA Astrophysics Data System (ADS)
Ishijima, Ayumu; Schwarz, Richard A.; Shin, Dongsuk; Mondrik, Sharon; Vigneswaran, Nadarajah; Gillenwater, Ann M.; Anandasabapathy, Sharmila; Richards-Kortum, Rebecca
2015-04-01
We developed an automated frame selection algorithm for high-resolution microendoscopy video sequences. The algorithm rapidly selects a representative frame with minimal motion artifact from a short video sequence, enabling fully automated image analysis at the point-of-care. The algorithm was evaluated by quantitative comparison of diagnostically relevant image features and diagnostic classification results obtained using automated frame selection versus manual frame selection. A data set consisting of video sequences collected in vivo from 100 oral sites and 167 esophageal sites was used in the analysis. The area under the receiver operating characteristic curve was 0.78 (automated selection) versus 0.82 (manual selection) for oral sites, and 0.93 (automated selection) versus 0.92 (manual selection) for esophageal sites. The implementation of fully automated high-resolution microendoscopy at the point-of-care has the potential to reduce the number of biopsies needed for accurate diagnosis of precancer and cancer in low-resource settings where there may be limited infrastructure and personnel for standard histologic analysis.
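One simple way to realize the frame selection described above is to score each interior frame by its mean absolute difference to its neighbours and keep the most stable one. This is an illustrative stand-in; the paper's actual motion-artifact criterion is not reproduced here.

```python
import numpy as np

def select_frame(video):
    """Pick the interior frame with the least apparent motion, scored
    as the mean absolute difference to its two neighbouring frames
    (an assumed criterion, for illustration only)."""
    video = np.asarray(video, float)
    motion = [
        0.5 * (np.abs(video[i] - video[i - 1]).mean()
               + np.abs(video[i] - video[i + 1]).mean())
        for i in range(1, len(video) - 1)
    ]
    return 1 + int(np.argmin(motion))  # index of the selected frame

# Synthetic 5-frame sequence: motion stops after frame 2
rng = np.random.default_rng(2)
base = rng.random((32, 32))
rolls = [0, 3, 6, 6, 6]  # pixel shifts simulating probe motion
video = np.stack([np.roll(base, r, axis=1) for r in rolls])
idx = select_frame(video)
```

Frame 3 is identical to both of its neighbours, so its motion score is zero and it is selected.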
Fully automated corneal endothelial morphometry of images captured by clinical specular microscopy
NASA Astrophysics Data System (ADS)
Bucht, Curry; Söderberg, Per; Manneberg, Göran
2009-02-01
The corneal endothelium serves as the posterior barrier of the cornea. Factors such as clarity and refractive properties of the cornea are in direct relationship to the quality of the endothelium. The endothelial cell density is considered the most important morphological factor. Morphometry of the corneal endothelium is presently done by semi-automated analysis of pictures captured by a Clinical Specular Microscope (CSM). Because of the occasional need of operator involvement, this process can be tedious, having a negative impact on sampling size. This study was dedicated to the development of fully automated analysis of images of the corneal endothelium, captured by CSM, using Fourier analysis. Software was developed in the mathematical programming language Matlab. Pictures of the corneal endothelium, captured by CSM, were read into the analysis software. The software automatically performed digital enhancement of the images. The digitally enhanced images of the corneal endothelium were transformed, using the fast Fourier transform (FFT). Tools were developed and applied for identification and analysis of relevant characteristics of the Fourier transformed images. The data obtained from each Fourier transformed image was used to calculate the mean cell density of its corresponding corneal endothelium. The calculation was based on well-known diffraction theory. Results in the form of estimated cell density of the corneal endothelium were obtained, using fully automated analysis software on images captured by CSM. The cell density obtained by the fully automated analysis was compared to the cell density obtained from classical, semi-automated analysis, and a high correlation was found.
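The core idea is that a regular cell mosaic produces a ring in the power spectrum whose radius is the dominant spatial frequency, and density scales with the square of that frequency. A hedged numpy sketch (the paper's software is Matlab and uses a calibrated diffraction model; the square-packing density formula below is an illustrative simplification):

```python
import numpy as np

def endothelial_density_estimate(img, pixel_size_um):
    """Estimate cell density from the radius of the spectral ring:
    radially average the 2-D power spectrum, find the dominant
    frequency, and convert it to cells/mm^2 (square-packing assumed)."""
    img = np.asarray(img, float)
    img = img - img.mean()  # suppress the DC component
    spec = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
    n = img.shape[0]
    yy, xx = np.mgrid[:n, :n]
    rad = np.hypot(yy - n // 2, xx - n // 2).astype(int)
    counts = np.bincount(rad.ravel())
    profile = (np.bincount(rad.ravel(), weights=spec.ravel())
               / np.maximum(counts, 1))     # radial average
    r_peak = int(np.argmax(profile[2:])) + 2  # skip residual DC bins
    freq = r_peak / (n * pixel_size_um)       # cycles per micron
    return (freq * 1000.0) ** 2               # cells per mm^2

# Synthetic mosaic with an 8-micron cell pitch, 1-micron pixels
yy2, xx2 = np.mgrid[:64, :64]
mosaic = np.cos(2 * np.pi * xx2 / 8) + np.cos(2 * np.pi * yy2 / 8)
density = endothelial_density_estimate(mosaic, pixel_size_um=1.0)
```

An 8-micron pitch corresponds to 125 cells per mm in each direction, i.e. 15,625 cells/mm² under the square-packing assumption.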
Vrooman, Henri A; Cocosco, Chris A; van der Lijn, Fedde; Stokking, Rik; Ikram, M Arfan; Vernooij, Meike W; Breteler, Monique M B; Niessen, Wiro J
2007-08-01
Conventional k-Nearest-Neighbor (kNN) classification, which has been successfully applied to classify brain tissue in MR data, requires training on manually labeled subjects. This manual labeling is a laborious and time-consuming procedure. In this work, a new fully automated brain tissue classification procedure is presented, in which kNN training is automated. This is achieved by non-rigidly registering the MR data with a tissue probability atlas to automatically select training samples, followed by a post-processing step to keep the most reliable samples. The accuracy of the new method was compared to rigid registration-based training and to conventional kNN-based segmentation using training on manually labeled subjects for segmenting gray matter (GM), white matter (WM) and cerebrospinal fluid (CSF) in 12 data sets. Furthermore, for all classification methods, the performance was assessed when varying the free parameters. Finally, the robustness of the fully automated procedure was evaluated on 59 subjects. The automated training method using non-rigid registration with a tissue probability atlas was significantly more accurate than rigid registration. For both automated training using non-rigid registration and for the manually trained kNN classifier, the difference with the manual labeling by observers was not significantly larger than inter-observer variability for all tissue types. From the robustness study, it was clear that, given an appropriate brain atlas and optimal parameters, our new fully automated, non-rigid registration-based method gives accurate and robust segmentation results. A similarity index was used for comparison with manually trained kNN. The similarity indices were 0.93, 0.92 and 0.92, for CSF, GM and WM, respectively. It can be concluded that our fully automated method using non-rigid registration may replace manual segmentation, and thus that automated brain tissue segmentation without laborious manual training is feasible.
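The automated-training idea above (select only voxels the registered atlas labels with high confidence, then train kNN on them) can be sketched with a toy one-feature intensity model. Everything here is assumed for illustration: the class means, the 0.9 confidence threshold, and the single intensity feature standing in for multi-channel MR data.

```python
import numpy as np

def knn_classify(train_x, train_y, test_x, k=5):
    """Plain kNN on 1-D intensity features with majority voting."""
    d = np.abs(np.asarray(test_x, float)[:, None] - np.asarray(train_x, float)[None, :])
    nn = np.argsort(d, axis=1)[:, :k]          # k nearest training samples
    return np.array([np.bincount(v).argmax() for v in train_y[nn]])

rng = np.random.default_rng(1)
means = np.array([20.0, 80.0, 140.0])          # assumed CSF/GM/WM means
atlas_prob = rng.random((300, 3))              # synthetic atlas probabilities
# automated training: keep only high-confidence atlas voxels
conf_mask = atlas_prob.max(axis=1) > 0.9
labels = atlas_prob.argmax(axis=1)[conf_mask]
train = rng.normal(means[labels], 5.0)         # their observed intensities
pred = knn_classify(train, labels, [18.0, 85.0, 150.0])
```

The point of the paper is that this training set is produced by registration rather than by manual labeling, which is what removes the laborious manual step.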
Coutinho, Rita; Clear, Andrew J.; Mazzola, Emanuele; Owen, Andrew; Greaves, Paul; Wilson, Andrew; Matthews, Janet; Lee, Abigail; Alvarez, Rute; da Silva, Maria Gomes; Cabeçadas, José; Neuberg, Donna; Calaminici, Maria; Gribben, John G.
2015-01-01
Gene expression studies have identified the microenvironment as a prognostic player in diffuse large B-cell lymphoma. However, there is a lack of simple immune biomarkers that can be applied in the clinical setting and could be helpful in stratifying patients. Immunohistochemistry has been used for this purpose but the results are inconsistent. We decided to reinvestigate the immune microenvironment and its impact using immunohistochemistry, with two systems of image analysis, in a large set of patients with diffuse large B-cell lymphoma. Diagnostic tissue from 309 patients was arrayed onto tissue microarrays. Results from 161 chemoimmunotherapy-treated patients were used for outcome prediction. Positive cells, percentage stained area and numbers of pixels/area were quantified and results were compared with the purpose of inferring consistency between the two semi-automated systems. Measurement cutpoints were assessed using a recursive partitioning algorithm classifying results according to survival. Kaplan-Meier estimators and Fisher exact tests were evaluated to check for significant differences between measurement classes, and for dependence between pairs of measurements, respectively. Results were validated by multivariate analysis incorporating the International Prognostic Index. The concordance between the two systems of image analysis was surprisingly high, supporting their applicability for immunohistochemistry studies. Patients with a high density of CD3 and FoxP3 by both methods had a better outcome. Automated analysis should be the preferred method for immunohistochemistry studies. Following the use of two methods of semi-automated analysis we suggest that CD3 and FoxP3 play a role in predicting response to chemoimmunotherapy in diffuse large B-cell lymphoma. PMID:25425693
Coutinho, Rita; Clear, Andrew J; Mazzola, Emanuele; Owen, Andrew; Greaves, Paul; Wilson, Andrew; Matthews, Janet; Lee, Abigail; Alvarez, Rute; da Silva, Maria Gomes; Cabeçadas, José; Neuberg, Donna; Calaminici, Maria; Gribben, John G
2015-03-01
Gene expression studies have identified the microenvironment as a prognostic player in diffuse large B-cell lymphoma. However, there is a lack of simple immune biomarkers that can be applied in the clinical setting and could be helpful in stratifying patients. Immunohistochemistry has been used for this purpose but the results are inconsistent. We decided to reinvestigate the immune microenvironment and its impact using immunohistochemistry, with two systems of image analysis, in a large set of patients with diffuse large B-cell lymphoma. Diagnostic tissue from 309 patients was arrayed onto tissue microarrays. Results from 161 chemoimmunotherapy-treated patients were used for outcome prediction. Positive cells, percentage stained area and numbers of pixels/area were quantified and results were compared with the purpose of inferring consistency between the two semi-automated systems. Measurement cutpoints were assessed using a recursive partitioning algorithm classifying results according to survival. Kaplan-Meier estimators and Fisher exact tests were evaluated to check for significant differences between measurement classes, and for dependence between pairs of measurements, respectively. Results were validated by multivariate analysis incorporating the International Prognostic Index. The concordance between the two systems of image analysis was surprisingly high, supporting their applicability for immunohistochemistry studies. Patients with a high density of CD3 and FoxP3 by both methods had a better outcome. Automated analysis should be the preferred method for immunohistochemistry studies. Following the use of two methods of semi-automated analysis we suggest that CD3 and FoxP3 play a role in predicting response to chemoimmunotherapy in diffuse large B-cell lymphoma. Copyright© Ferrata Storti Foundation.
A Fully Automated High-Throughput Zebrafish Behavioral Ototoxicity Assay.
Todd, Douglas W; Philip, Rohit C; Niihori, Maki; Ringle, Ryan A; Coyle, Kelsey R; Zehri, Sobia F; Zabala, Leanne; Mudery, Jordan A; Francis, Ross H; Rodriguez, Jeffrey J; Jacob, Abraham
2017-08-01
Zebrafish animal models lend themselves to behavioral assays that can facilitate rapid screening of ototoxic, otoprotective, and otoregenerative drugs. Structurally similar to human inner ear hair cells, the mechanosensory hair cells on their lateral line allow the zebrafish to sense water flow and orient head-to-current in a behavior called rheotaxis. This rheotaxis behavior deteriorates in a dose-dependent manner with increased exposure to the ototoxin cisplatin, thereby establishing itself as an excellent biomarker for anatomic damage to lateral line hair cells. Building on work by our group and others, we have built a new, fully automated high-throughput behavioral assay system that uses automated image analysis techniques to quantify rheotaxis behavior. This novel system consists of a custom-designed swimming apparatus and imaging system consisting of network-controlled Raspberry Pi microcomputers capturing infrared video. Automated analysis techniques detect individual zebrafish, compute their orientation, and quantify the rheotaxis behavior of a zebrafish test population, producing a powerful, high-throughput behavioral assay. Using our fully automated biological assay to test a standardized ototoxic dose of cisplatin against varying doses of compounds that protect or regenerate hair cells may facilitate rapid translation of candidate drugs into preclinical mammalian models of hearing loss.
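Quantifying rheotaxis reduces to an angular computation: for each detected fish, compare its heading with the current direction and count it as rheotactic if the difference is within a tolerance. A minimal sketch; the angle convention and the 45-degree tolerance are illustrative assumptions, not the assay's exact criteria.

```python
import numpy as np

def rheotaxis_index(headings_deg, current_from_deg=0.0, tol_deg=45.0):
    """Fraction of fish oriented head-to-current within a tolerance.
    The wrap-around arithmetic keeps angular differences in [0, 180]."""
    h = np.asarray(headings_deg, float)
    diff = np.abs((h - current_from_deg + 180.0) % 360.0 - 180.0)
    return float(np.mean(diff <= tol_deg))

# Eight fish; current flowing from 0 degrees
headings = [5, -10, 350, 44, 90, 180, 200, 30]
score = rheotaxis_index(headings)
```

Five of the eight headings fall within 45 degrees of upstream, so the index is 0.625; in the assay this score would be tracked against cisplatin dose.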
Validation of automated white matter hyperintensity segmentation.
Smart, Sean D; Firbank, Michael J; O'Brien, John T
2011-01-01
Introduction. White matter hyperintensities (WMHs) are a common finding on MRI scans of older people and are associated with vascular disease. We compared 3 methods for automatically segmenting WMHs from MRI scans. Method. An operator manually segmented WMHs on MRI images from a 3T scanner. The scans were also segmented in a fully automated fashion by three different programmes. The voxel overlap between manual and automated segmentation was compared. Results. The between-observer overlap ratio was 63%. Using our previously described in-house software, we achieved an overlap of 62.2%. We investigated the use of a modified version of SPM segmentation; however, this was not successful, with only 14% overlap. Discussion. Using our previously reported software, we demonstrated good segmentation of WMHs in a fully automated fashion.
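The voxel-overlap comparison used above can be computed with a Dice-style ratio. The abstract does not state its exact overlap formula, so the 2|A∩B|/(|A|+|B|) definition below is an assumed, commonly used choice.

```python
import numpy as np

def overlap_ratio(seg_a, seg_b):
    """Dice-style voxel overlap between two binary segmentations:
    2*|A intersect B| / (|A| + |B|), 1.0 if both masks are empty."""
    a = np.asarray(seg_a, bool)
    b = np.asarray(seg_b, bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Two 16-voxel lesion masks offset by one row: 12 voxels overlap
manual = np.zeros((10, 10), bool); manual[2:6, 2:6] = True
auto   = np.zeros((10, 10), bool); auto[3:7, 2:6] = True
ratio = overlap_ratio(manual, auto)
```

Here the ratio is 2*12/32 = 0.75; the reported 62-63% overlaps sit in the range typical for small, scattered lesions where one-voxel boundary disagreements cost proportionally more.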
Is this the real time for genomics?
Guarnaccia, Maria; Gentile, Giulia; Alessi, Enrico; Schneider, Claudio; Petralia, Salvatore; Cavallaro, Sebastiano
2014-01-01
In the last decades, molecular biology has moved from gene-by-gene analysis to more complex studies using a genome-wide scale. Thanks to high-throughput genomic technologies, such as microarrays and next-generation sequencing, a huge amount of information has been generated, expanding our knowledge on the genetic basis of various diseases. Although some of this information could be transferred to clinical diagnostics, the technologies available are not suitable for this purpose. In this review, we will discuss the drawbacks associated with the use of traditional DNA microarrays in diagnostics, pointing out emerging platforms that could overcome these obstacles and offer a more reproducible, qualitative and quantitative multigenic analysis. New miniaturized and automated devices, called Lab-on-Chip, begin to integrate PCR and microarray on the same platform, offering integrated sample-to-result systems. The introduction of this kind of innovative devices may facilitate the transition of genome-based tests into clinical routine. Copyright © 2014. Published by Elsevier Inc.
DNA microarray technology in nutraceutical and food safety.
Liu-Stratton, Yiwen; Roy, Sashwati; Sen, Chandan K
2004-04-15
The quality and quantity of diet is a key determinant of health and disease. Molecular diagnostics may play a key role in food safety related to genetically modified foods, food-borne pathogens and novel nutraceuticals. Functional outcomes in biology are determined, for the most part, by net balance between sets of genes related to the specific outcome in question. The DNA microarray technology offers a new dimension of strength in molecular diagnostics by permitting the simultaneous analysis of large sets of genes. Automation of assay and novel bioinformatics tools make DNA microarrays a robust technology for diagnostics. Since its development a few years ago, this technology has been used for the applications of toxicogenomics, pharmacogenomics, cell biology, and clinical investigations addressing the prevention and intervention of diseases. Optimization of this technology to specifically address food safety is a vast resource that remains to be mined. Efforts to develop diagnostic custom arrays and simplified bioinformatics tools for field use are warranted.
English, Sangeeta B.; Shih, Shou-Ching; Ramoni, Marco F.; Smith, Lois E.; Butte, Atul J.
2014-01-01
Though genome-wide technologies, such as microarrays, are widely used, data from these methods are considered noisy; there is still varied success in downstream biological validation. We report a method that increases the likelihood of successfully validating microarray findings using real time RT-PCR, including genes at low expression levels and with small differences. We use a Bayesian network to identify the most relevant sources of noise based on the successes and failures in validation for an initial set of selected genes, and then improve our subsequent selection of genes for validation based on eliminating these sources of noise. The network displays the significant sources of noise in an experiment, and scores the likelihood of validation for every gene. We show how the method can significantly increase validation success rates. In conclusion, in this study, we have successfully added a new automated step to determine the contributory sources of noise that determine successful or unsuccessful downstream biological validation. PMID:18790084
Liu, Fang; Zhou, Zhaoye; Jang, Hyungseok; Samsonov, Alexey; Zhao, Gengyan; Kijowski, Richard
2018-04-01
To describe and evaluate a new fully automated musculoskeletal tissue segmentation method using a deep convolutional neural network (CNN) and three-dimensional (3D) simplex deformable modeling to improve the accuracy and efficiency of cartilage and bone segmentation within the knee joint. A fully automated segmentation pipeline was built by combining a semantic segmentation CNN and 3D simplex deformable modeling. A CNN technique called SegNet was applied as the core of the segmentation method to perform high-resolution pixel-wise multi-class tissue classification. The 3D simplex deformable modeling refined the output from SegNet to preserve the overall shape and maintain a desirably smooth surface for each musculoskeletal structure. The fully automated segmentation method was tested using a publicly available knee image data set for comparison with currently used state-of-the-art segmentation methods. The fully automated method was also evaluated on two different data sets, which include morphological and quantitative MR images with different tissue contrasts. The proposed fully automated segmentation method provided good segmentation performance, with accuracy superior to most state-of-the-art methods on the publicly available knee image data set. The method also demonstrated versatile segmentation performance on both morphological and quantitative musculoskeletal MR images with different tissue contrasts and spatial resolutions. The study demonstrates that the combined CNN and 3D deformable modeling approach is useful for performing rapid and accurate cartilage and bone segmentation within the knee joint. The CNN has promising potential applications in musculoskeletal imaging. Magn Reson Med 79:2379-2391, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
How a Fully Automated eHealth Program Simulates Three Therapeutic Processes: A Case Study.
Holter, Marianne T S; Johansen, Ayna; Brendryen, Håvar
2016-06-28
eHealth programs may be better understood by breaking down the components of one particular program and discussing its potential for interactivity and tailoring in regard to concepts from face-to-face counseling. In the search for the efficacious elements within eHealth programs, it is important to understand how a program using lapse management may simultaneously support working alliance, internalization of motivation, and behavior maintenance. These processes have been applied to fully automated eHealth programs individually. However, given their significance in face-to-face counseling, it may be important to simulate the processes simultaneously in interactive, tailored programs. We propose a theoretical model for how fully automated behavior change eHealth programs may be more effective by simulating a therapist's support of a working alliance, internalization of motivation, and managing lapses. We show how the model is derived from theory and its application to Endre, a fully automated smoking cessation program that engages the user in several "counseling sessions" about quitting. A descriptive case study based on tools from the intervention mapping protocol shows how each therapeutic process is simulated. The program supports the user's working alliance through alliance factors, the nonembodied relational agent Endre and computerized motivational interviewing. Computerized motivational interviewing also supports internalized motivation to quit, whereas a lapse management component responds to lapses. The description operationalizes working alliance, internalization of motivation, and managing lapses, in terms of eHealth support of smoking cessation. A program may simulate working alliance, internalization of motivation, and lapse management through interactivity and individual tailoring, potentially making fully automated eHealth behavior change programs more effective.
Automated MRI segmentation for individualized modeling of current flow in the human head.
Huang, Yu; Dmochowski, Jacek P; Su, Yuzhuo; Datta, Abhishek; Rorden, Christopher; Parra, Lucas C
2013-12-01
High-definition transcranial direct current stimulation (HD-tDCS) and high-density electroencephalography require accurate models of current flow for precise targeting and current source reconstruction. At a minimum, such modeling must capture the idiosyncratic anatomy of the brain, cerebrospinal fluid (CSF) and skull for each individual subject. Currently, the process to build such high-resolution individualized models from structural magnetic resonance images requires labor-intensive manual segmentation, even when utilizing available automated segmentation tools. Also, accurate placement of many high-density electrodes on an individual scalp is a tedious procedure. The goal was to develop fully automated techniques to reduce the manual effort in such a modeling process. A fully automated segmentation technique based on Statistical Parametric Mapping 8, including an improved tissue probability map and an automated correction routine for segmentation errors, was developed, along with an automated electrode placement tool for high-density arrays. The performance of these automated routines was evaluated against results from manual segmentation on four healthy subjects and seven stroke patients. The criteria included segmentation accuracy, the difference in current flow distributions in the resulting HD-tDCS models, and the optimized current flow intensities on cortical targets. The segmentation tool can segment out not just the brain but also provide accurate results for CSF, skull and other soft tissues with a field of view extending to the neck. Compared to manual results, automated segmentation deviates by only 7% and 18% for normal and stroke subjects, respectively. The predicted electric fields in the brain deviate by 12% and 29% respectively, which is well within the variability observed for various modeling choices. Finally, optimized current flow intensities on cortical targets do not differ significantly. 
Fully automated individualized modeling may now be feasible for large-sample EEG research studies and tDCS clinical trials.
Rexhepaj, Elton; Brennan, Donal J; Holloway, Peter; Kay, Elaine W; McCann, Amanda H; Landberg, Goran; Duffy, Michael J; Jirstrom, Karin; Gallagher, William M
2008-01-01
Manual interpretation of immunohistochemistry (IHC) is a subjective, time-consuming and variable process, with an inherent intra-observer and inter-observer variability. Automated image analysis approaches offer the possibility of developing rapid, uniform indicators of IHC staining. In the present article we describe the development of a novel approach for automatically quantifying oestrogen receptor (ER) and progesterone receptor (PR) protein expression assessed by IHC in primary breast cancer. Two cohorts of breast cancer patients (n = 743) were used in the study. Digital images of breast cancer tissue microarrays were captured using the Aperio ScanScope XT slide scanner (Aperio Technologies, Vista, CA, USA). Image analysis algorithms were developed using MatLab 7 (MathWorks, Natick, MA, USA). A fully automated nuclear algorithm was developed to discriminate tumour from normal tissue and to quantify ER and PR expression in both cohorts. Random forest clustering was employed to identify optimum thresholds for survival analysis. The accuracy of the nuclear algorithm was initially confirmed by a histopathologist, who validated the output in 18 representative images. In these 18 samples, an excellent correlation was evident between the results obtained by manual and automated analysis (Spearman's rho = 0.9, P < 0.001). Optimum thresholds for survival analysis were identified using random forest clustering. This revealed 7% positive tumour cells as the optimum threshold for the ER and 5% positive tumour cells for the PR. Moreover, a 7% cutoff level for the ER predicted a better response to tamoxifen than the currently used 10% threshold. Finally, linear regression was employed to demonstrate a more homogeneous pattern of expression for the ER (R = 0.860) than for the PR (R = 0.681). In summary, we present data on the automated quantification of the ER and the PR in 743 primary breast tumours using a novel unsupervised image analysis algorithm. 
This novel approach provides a useful tool for the quantification of biomarkers on tissue specimens, as well as for objective identification of appropriate cutoff thresholds for biomarker positivity. It also offers the potential to identify proteins with a homogeneous pattern of expression.
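The percent-positive cutoffs reported above (7% for ER, 5% for PR) amount to a simple dichotomization of each tumour's fraction of positive cells. A minimal sketch of applying such thresholds (not the authors' code; the function name and example values are invented):

```python
def classify_cores(positive_fractions, threshold):
    """Label each tumour core positive if its fraction of positive
    tumour cells meets or exceeds the threshold (e.g. 0.07 for ER,
    0.05 for PR, per the cutoffs reported in the abstract)."""
    return ["positive" if frac >= threshold else "negative"
            for frac in positive_fractions]

# Invented example fractions of positive tumour cells per core
er_fractions = [0.02, 0.07, 0.45, 0.05]
print(classify_cores(er_fractions, 0.07))  # ER cutoff from the abstract
```

The same helper applied with `threshold=0.05` would give the PR classification.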
Medical ADP Systems: Automated Medical Records Hold Promise to Improve Patient Care
1991-01-01
automated medical records. The report discusses the potential benefits that automation could make to the quality of patient care and the factors that impede...information systems, but no organization has fully automated one of the most critical types of information, patient medical records. The patient medical record...its review of automated medical records. GAO’s objectives in this study were to identify the (1) benefits of automating patient records and (2) factors
1987-06-01
commercial products. ... Typical cutout at a plumbline location where an automated monitoring system has been installed. The sensor used with the... This report provides a description of commercially available sensors, instruments, and ADP equipment that may be selected to fully automate... automated. The automated plumbline monitoring system includes up to twelve sensors, repeaters, a system controller, and a printer. The system may
Qi, Xin; Xing, Fuyong; Foran, David J.; Yang, Lin
2013-01-01
Automated image analysis of histopathology specimens could potentially provide support for early detection and improved characterization of breast cancer. Automated segmentation of the cells comprising imaged tissue microarrays (TMA) is a prerequisite for any subsequent quantitative analysis. Unfortunately, crowding and overlapping of cells present significant challenges for most traditional segmentation algorithms. In this paper, we propose a novel algorithm which can reliably separate touching cells in hematoxylin stained breast TMA specimens which have been acquired using a standard RGB camera. The algorithm is composed of two steps. It begins with a fast, reliable object center localization approach which utilizes single-path voting followed by mean-shift clustering. Next, the contour of each cell is obtained using a level set algorithm based on an interactive model. We compared the experimental results with those reported in the most current literature. Finally, performance was evaluated by comparing the pixel-wise accuracy provided by human experts with that produced by the new automated segmentation algorithm. The method was systematically tested on 234 image patches exhibiting dense overlap and containing more than 2200 cells. It was also tested on whole slide images including blood smears and tissue microarrays containing thousands of cells. Since the voting step of the seed detection algorithm is well suited for parallelization, a parallel version of the algorithm was implemented using graphic processing units (GPU) which resulted in significant speed-up over the C/C++ implementation. PMID:22167559
2009-10-01
molecular breast imaging, with the ability to dynamically contour any sized breast, will improve detection and potentially in vivo characterization of...Having flexible 3D positioning about the breast yielded minimal RMSD differences, which is important for high resolution molecular emission imaging. This...TITLE: Automation and Preclinical Evaluation of a Dedicated Emission Mammotomography System for Fully 3-D Molecular Breast Imaging PRINCIPAL
Neef, Maren; Ecke, Margret; Hampp, Rüdiger
2015-01-01
The Simbox mission was the first joint space project between Germany and China in November 2011. Eleven-day-old Arabidopsis thaliana wild type semisolid callus cultures were integrated into fully automated plant cultivation containers and exposed to spaceflight conditions within the Simbox hardware on board of the spacecraft Shenzhou 8. The related ground experiment was conducted under similar conditions. The use of an in-flight centrifuge provided a 1 g gravitational field in space. The cells were metabolically quenched after 5 days via RNAlater injection. The impact on the Arabidopsis transcriptome was investigated by means of whole-genome gene expression analysis. The results show a major impact of nonmicrogravity related spaceflight conditions. Genes that were significantly altered in transcript abundance are mainly involved in protein phosphorylation and MAPK cascade-related signaling processes, as well as in the cellular defense and stress responses. In contrast to short-term effects of microgravity (seconds, minutes), this mission identified only minor changes after 5 days of microgravity. These concerned genes coding for proteins involved in the plastid-associated translation machinery, mitochondrial electron transport, and energy production. PMID:25654111
Diagnostic classification of cancer using DNA microarrays and artificial intelligence.
Greer, Braden T; Khan, Javed
2004-05-01
The application of artificial intelligence (AI) to microarray data has been receiving much attention in recent years because of the possibility of automated diagnosis in the near future. Studies have been published predicting tumor type, estrogen receptor status, and prognosis using a variety of AI algorithms. The performance of intelligent computing decisions based on gene expression signatures is in some cases comparable to or better than the current clinical decision schemas. The goal of these tools is not to make clinicians obsolete, but rather to give clinicians one more tool in their armamentarium to accurately diagnose and hence better treat cancer patients. Several such applications are summarized in this chapter, and some of the common pitfalls are noted.
USDA-ARS?s Scientific Manuscript database
In addition to microarray technology, which provides a robust method to study protein function in a rapid, economical, and proteome-wide fashion, plasmid-based functional proteomics is an important technology for rapidly obtaining large quantities of protein and determining protein function across a...
DOT National Transportation Integrated Search
2014-09-01
The concept of Automated Transit Networks (ATN) - in which fully automated vehicles on exclusive, grade-separated guideways provide on-demand, primarily non-stop, origin-to-destination service over an area network - has been around since the 1950...
Validation of Automated White Matter Hyperintensity Segmentation
Smart, Sean D.; Firbank, Michael J.; O'Brien, John T.
2011-01-01
Introduction. White matter hyperintensities (WMHs) are a common finding on MRI scans of older people and are associated with vascular disease. We compared three methods for automatically segmenting WMHs from MRI scans. Method. An operator manually segmented WMHs on MRI images from a 3T scanner. The scans were also segmented in a fully automated fashion by three different programmes. The voxel overlap between manual and automated segmentation was compared. Results. The between-observer overlap ratio was 63%. Using our previously described in-house software, we obtained an overlap of 62.2%. We also investigated the use of a modified version of SPM segmentation; however, this was not successful, with only 14% overlap. Discussion. Using our previously reported software, we demonstrated good segmentation of WMHs in a fully automated fashion. PMID:21904678
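The voxel-overlap comparison above can be illustrated with a short sketch. The abstract does not state its exact overlap formula, so the Dice coefficient below is one common choice, not necessarily the one the authors used:

```python
import numpy as np

def overlap_ratio(mask_a, mask_b):
    """Dice-style voxel overlap between two binary segmentations:
    2*|A ∩ B| / (|A| + |B|). Returns 1.0 for identical masks."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

# Invented toy masks: half the labeled voxels agree
print(overlap_ratio([1, 1, 0, 0], [1, 0, 1, 0]))  # 0.5
```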
Schulze, H Georg; Turner, Robin F B
2014-01-01
Charge-coupled device detectors are vulnerable to cosmic rays that can contaminate Raman spectra with positive-going spikes. Because spikes can adversely affect spectral processing and data analyses, they must be removed. Although both hardware-based and software-based spike removal methods exist, they typically require parameter and threshold specification dependent on well-considered user input. Here, we present a fully automated spike removal algorithm that proceeds without requiring user input. It is minimally dependent on sample attributes, and those that are required (e.g., standard deviation of spectral noise) can be determined with other fully automated procedures. At the core of the method is the identification and location of spikes with coincident second derivatives along both the spectral and spatiotemporal dimensions of two-dimensional datasets. The method can be applied to spectra that are relatively inhomogeneous because it provides fairly effective and selective targeting of spikes, resulting in minimal distortion of spectra. Relatively effective spike removal obtained with full automation could provide substantial benefits to users where large numbers of spectra must be processed.
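The coincident-second-derivative idea can be sketched as follows, assuming a 2D array with spectra along rows and the spatiotemporal dimension along columns. The threshold factor `k` and the median replacement are illustrative choices, not the published algorithm's exact criteria:

```python
import numpy as np

def remove_spikes(data, noise_sd, k=8.0):
    """Flag points whose second difference is large along BOTH the
    spectral axis (axis=1) and the spatiotemporal axis (axis=0),
    then replace flagged points with a local median.
    `k` is an invented threshold factor for illustration."""
    d2_spec = np.abs(np.diff(data, n=2, axis=1))   # shape (T, N-2)
    d2_time = np.abs(np.diff(data, n=2, axis=0))   # shape (T-2, N)
    # Align both second-difference arrays to the interior of `data`
    spec_hit = d2_spec[1:-1, :] > k * noise_sd
    time_hit = d2_time[:, 1:-1] > k * noise_sd
    spikes = spec_hit & time_hit                   # coincidence in both dims
    cleaned = data.copy()
    rows, cols = np.nonzero(spikes)
    for r, c in zip(rows + 1, cols + 1):           # back to full-array indices
        patch = data[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2]
        cleaned[r, c] = np.median(patch)           # spike -> local median
    return cleaned
```

Requiring the coincidence in both dimensions is what distinguishes a cosmic-ray spike (sharp in both directions) from a genuine narrow Raman band (sharp spectrally but reproduced across adjacent spectra).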
NASA Technical Reports Server (NTRS)
Steinberg, R.
1978-01-01
The National Aeronautics and Space Administration has developed a low-cost communications system to provide meteorological data from commercial aircraft, in near real-time, on a fully automated basis. The complete system including the low profile antenna and all installation hardware weighs 34 kg. The prototype system has been installed on a Pan American B-747 aircraft and has been providing meteorological data (wind angle and velocity, temperature, altitude and position as a function of time) on a fully automated basis for the past several months. The results have been exceptional. This concept is expected to have important implications for operational meteorology and airline route forecasting.
A Fully Automated Method to Detect and Segment a Manufactured Object in an Underwater Color Image
NASA Astrophysics Data System (ADS)
Barat, Christian; Phlypo, Ronald
2010-12-01
We propose a fully automated active contours-based method for the detection and the segmentation of a moored manufactured object in an underwater image. Detection of objects in underwater images is difficult due to the variable lighting conditions and shadows on the object. The proposed technique is based on the information contained in the color maps and uses the visual attention method, combined with a statistical approach for the detection and an active contour for the segmentation of the object to overcome the above problems. In the classical active contour method the region descriptor is fixed and the convergence of the method depends on the initialization. With our approach, this dependence is overcome with an initialization using the visual attention results and a criterion to select the best region descriptor. This approach improves the convergence and the processing time while providing the advantages of a fully automated method.
Fananapazir, Ghaneh; Bashir, Mustafa R; Marin, Daniele; Boll, Daniel T
2015-06-01
To evaluate the performance of a prototype, fully-automated post-processing solution for whole-liver and lobar segmentation based on MDCT datasets. A polymer liver phantom was used to assess accuracy of post-processing applications by comparing phantom volumes determined via Archimedes' principle with MDCT segmented datasets. For the IRB-approved, HIPAA-compliant study, 25 patients were enrolled. Volumetry performance compared the manual approach with the automated prototype, assessing intraobserver variability and interclass correlation for whole-organ and lobar segmentation using ANOVA comparison. Fidelity of segmentation was evaluated qualitatively. Phantom volume was 1581.0 ± 44.7 mL; manually segmented datasets estimated 1628.0 ± 47.8 mL, representing a mean overestimation of 3.0%, while automatically segmented datasets estimated 1601.9 ± 0 mL, representing a mean overestimation of 1.3%. Whole-liver and segmental volumetry demonstrated no significant intraobserver variability for either manual or automated measurements. For whole-liver volumetry, repeated automated measurements resulted in identical values; reproducible whole-organ volumetry was also achieved with manual segmentation, p(ANOVA) 0.98. For lobar volumetry, automated segmentation improved reproducibility over the manual approach, without significant measurement differences for either methodology, p(ANOVA) 0.95-0.99. Whole-organ and lobar segmentation results from manual and automated segmentation showed no significant differences, p(ANOVA) 0.96-1.00. Assessment of segmentation fidelity found that segments I-IV/VI showed greater segmentation inaccuracies compared to the remaining right hepatic lobe segments. 
Automated whole-liver segmentation showed non-inferiority of fully-automated whole-liver segmentation compared to manual approaches with improved reproducibility and post-processing duration; automated dual-seed lobar segmentation showed slight tendencies for underestimating the right hepatic lobe volume and greater variability in edge detection for the left hepatic lobe compared to manual segmentation.
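The overestimation percentages reported above follow directly from the stated phantom and segmented volumes; a quick arithmetic check:

```python
phantom = 1581.0      # reference volume (mL), Archimedes' principle
manual = 1628.0       # mean manually segmented volume (mL)
automated = 1601.9    # mean automatically segmented volume (mL)

manual_err = 100.0 * (manual - phantom) / phantom
auto_err = 100.0 * (automated - phantom) / phantom
print(round(manual_err, 1), round(auto_err, 1))  # 3.0 1.3
```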
Advantages and challenges in automated apatite fission track counting
NASA Astrophysics Data System (ADS)
Enkelmann, E.; Ehlers, T. A.
2012-04-01
Fission track thermochronometer data are often a core element of modern tectonic and denudation studies. Soon after the development of the fission track method, interest emerged in developing an automated counting procedure to replace the time-consuming labor of counting fission tracks under the microscope. Automated track counting became feasible in recent years with increasing improvements in computer software and hardware. One such example, used in this study, is the commercial automated fission track counting procedure from Autoscan Systems Pty, which has been highlighted through several venues. We conducted experiments designed to reliably and consistently test the ability of this fully automated counting system to recognize fission tracks in apatite and a muscovite external detector. Fission tracks were analyzed in samples with a step-wise increase in sample complexity. The first set of experiments used a large (mm-size) slice of Durango apatite cut parallel to the prism plane. Second, samples with 80-200 μm apatite grains of Fish Canyon Tuff were analyzed. This second sample set is characterized by complexities often found in apatites from different rock types. In addition to the automated counting procedure, the same samples were also analyzed using conventional counting procedures. For all samples, we found that the fully automated fission track counting procedure using the Autoscan System yields a larger scatter in the measured fission track densities than conventional (manual) track counting. This scatter typically resulted from the false identification of tracks due to surface and mineralogical defects, regardless of the image filtering procedure used. Large differences in track densities measured with automated counting persisted between different grains analyzed in one sample as well as between different samples. 
As a result of these differences a manual correction of the fully automated fission track counts is necessary for each individual surface area and grain counted. This manual correction procedure significantly increases (up to four times) the time required to analyze a sample with the automated counting procedure compared to the conventional approach.
NASA Astrophysics Data System (ADS)
Hopp, T.; Zapf, M.; Ruiter, N. V.
2014-03-01
An essential processing step for comparison of Ultrasound Computer Tomography images to other modalities, as well as for use in further image processing, is to segment the breast from the background. In this work we present a (semi-)automated 3D segmentation method which is based on the detection of the breast boundary in coronal slice images and a subsequent surface fitting. The method was evaluated using a software phantom and in-vivo data. The fully automatically processed phantom results showed that a segmentation of approx. 10% of the slices of a dataset is sufficient to recover the overall breast shape. Application to 16 in-vivo datasets was performed successfully using semi-automated processing, i.e. using a graphical user interface for manual corrections of the automated breast boundary detection. The processing time for the segmentation of an in-vivo dataset could be significantly reduced, by a factor of four, compared to a fully manual segmentation. Comparison to manually segmented images identified a smoother surface for the semi-automated segmentation, with an average of 11% differing voxels and an average surface deviation of 2 mm. Limitations of the edge detection may be overcome by future updates of the KIT USCT system, allowing fully automated use of our segmentation approach.
NASA Astrophysics Data System (ADS)
Liu, Hongna; Li, Song; Wang, Zhifei; Li, Zhiyang; Deng, Yan; Wang, Hua; Shi, Zhiyang; He, Nongyue
2008-11-01
Single nucleotide polymorphisms (SNPs) comprise the most abundant source of genetic variation in the human genome. Large-scale, genome-wide identification of codominant SNPs, especially those associated with complex diseases, has therefore created the need for a completely high-throughput and automated SNP genotyping method. Herein, we present an automated SNP detection system based on two kinds of functional magnetic nanoparticles (MNPs) and dual-color hybridization. The amino-modified MNPs (NH2-MNPs), prepared with APTES, were used to extract DNA directly from whole blood by electrostatic interaction, and PCR was then successfully performed. Furthermore, biotinylated PCR products were captured on the streptavidin-coated MNPs (SA-MNPs) and interrogated by hybridization with a pair of dual-color probes to determine the SNP; the genotype of each sample can then be simultaneously identified by scanning the microarray printed with the denatured fluorescent probes. This system provides a rapid, sensitive and highly versatile automated procedure that will greatly facilitate the analysis of different known SNPs in the human genome.
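In a dual-color scheme like the one described, a genotype call typically reduces to comparing the two probe-channel intensities. The following is a hedged sketch of such a call, not the authors' code; the thresholds, channel names, and genotype labels are invented for illustration:

```python
def call_genotype(ch1, ch2, ratio_cut=3.0, min_signal=200.0):
    """Call 'AA', 'BB', 'AB', or 'no call' from the two fluorescence
    channel intensities of a dual-color hybridization.
    ratio_cut and min_signal are illustrative thresholds."""
    if max(ch1, ch2) < min_signal:
        return "no call"             # neither probe hybridized strongly
    if ch1 >= ratio_cut * ch2:
        return "AA"                  # allele-1 probe dominates
    if ch2 >= ratio_cut * ch1:
        return "BB"                  # allele-2 probe dominates
    return "AB"                      # comparable signals: heterozygote

print(call_genotype(1500, 120))  # AA
```

In practice such thresholds would be calibrated against samples of known genotype rather than fixed a priori.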
Li, Wei; Abram, François; Pelletier, Jean-Pierre; Raynauld, Jean-Pierre; Dorais, Marc; d'Anjou, Marc-André; Martel-Pelletier, Johanne
2010-01-01
Joint effusion is frequently associated with osteoarthritis (OA) flare-up and is an important marker of therapeutic response. This study aimed at developing and validating a fully automated system based on magnetic resonance imaging (MRI) for the quantification of joint effusion volume in knee OA patients. MRI examinations consisted of two axial sequences: a T2-weighted true fast imaging with steady-state precession and a T1-weighted gradient echo. An automated joint effusion volume quantification system using MRI was developed and validated (a) with calibrated phantoms (cylinder and sphere) and effusion from knee OA patients; (b) with assessment by manual quantification; and (c) by direct aspiration. Twenty-five knee OA patients with joint effusion were included in the study. The automated joint effusion volume quantification was developed as a four stage sequencing process: bone segmentation, filtering of unrelated structures, segmentation of joint effusion, and subvoxel volume calculation. Validation experiments revealed excellent coefficients of variation with the calibrated cylinder (1.4%) and sphere (0.8%) phantoms. Comparison of the OA knee joint effusion volume assessed by the developed automated system and by manual quantification was also excellent (r = 0.98; P < 0.0001), as was the comparison with direct aspiration (r = 0.88; P = 0.0008). The newly developed fully automated MRI-based system provided precise quantification of OA knee joint effusion volume with excellent correlation with data from phantoms, a manual system, and joint aspiration. Such an automated system will be instrumental in improving the reproducibility/reliability of the evaluation of this marker in clinical application.
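The abstract names subvoxel volume calculation as the final stage of the pipeline. One plausible implementation is to let each voxel contribute its estimated effusion fraction times the voxel volume; the function below is a sketch under that assumption, since the published system's partial-volume estimation is not described in detail:

```python
import numpy as np

def effusion_volume(fractions, voxel_dims_mm):
    """Subvoxel volume estimate: each voxel contributes its estimated
    effusion fraction (0..1) times the physical voxel volume.
    `fractions` is an array of per-voxel fractions; `voxel_dims_mm`
    gives the voxel edge lengths in mm. Returns volume in mL."""
    voxel_mm3 = float(np.prod(voxel_dims_mm))
    return fractions.sum() * voxel_mm3 / 1000.0  # mm^3 -> mL

# 1000 fully-filled 1 mm isotropic voxels -> 1 mL
print(effusion_volume(np.ones((10, 10, 10)), (1.0, 1.0, 1.0)))
```

Fractional boundary voxels are what allow sub-millimetre precision despite the finite voxel grid, which is consistent with the excellent phantom correlations reported above.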
Bălăcescu, Loredana; Bălăcescu, O; Crişan, N; Fetica, B; Petruţ, B; Bungărdean, Cătălina; Rus, Meda; Tudoran, Oana; Meurice, G; Irimie, Al; Dragoş, N; Berindan-Neagoe, Ioana
2011-01-01
Prostate cancer represents the leading cancer among the Western male population, with clinical behavior ranging from indolent to metastatic disease. Although many molecules and deregulated pathways are known, the molecular mechanisms involved in the development of prostate cancer are not fully understood. The aim of this study was to explore the molecular variation underlying prostate cancer, based on microarray analysis and bioinformatics approaches. Normal and prostate cancer tissues were collected by macrodissection from prostatectomy pieces. All prostate cancer specimens used in our study were Gleason score 7. Gene expression microarrays (Agilent Technologies) were used for whole human genome evaluation. The bioinformatics and functional analyses were based on Limma and Ingenuity software. The microarray analysis identified 1119 genes differentially expressed between prostate cancer and normal prostate that were up- or down-regulated at least 2-fold. P-values were adjusted for multiple testing using the Benjamini-Hochberg method with a false discovery rate of 0.01. These genes were analyzed with Ingenuity Pathway Analysis software, and 23 genetic networks were established. Our microarray results provide new information regarding the molecular networks in prostate cancer stratified as Gleason 7. These data highlight gene expression profiles for a better understanding of prostate cancer progression.
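The Benjamini-Hochberg adjustment named above is a standard step-up procedure; a compact implementation (a generic sketch, not the authors' code) looks like this:

```python
import numpy as np

def benjamini_hochberg(pvals, fdr=0.01):
    """Return a boolean mask of p-values significant at the given
    false discovery rate, using the Benjamini-Hochberg step-up rule:
    find the largest k with p_(k) <= (k/m)*fdr and accept the k
    smallest p-values."""
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)
    ranked = p[order]
    below = ranked <= (np.arange(1, m + 1) / m) * fdr
    significant = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])
        significant[order[:k + 1]] = True
    return significant
```

Applied genome-wide with `fdr=0.01`, as in the study, this controls the expected fraction of false positives among the genes called differentially expressed.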
Implementation of GenePattern within the Stanford Microarray Database.
Hubble, Jeremy; Demeter, Janos; Jin, Heng; Mao, Maria; Nitzberg, Michael; Reddy, T B K; Wymore, Farrell; Zachariah, Zachariah K; Sherlock, Gavin; Ball, Catherine A
2009-01-01
Hundreds of researchers across the world use the Stanford Microarray Database (SMD; http://smd.stanford.edu/) to store, annotate, view, analyze and share microarray data. In addition to providing registered users at Stanford access to their own data, SMD also provides access to public data, and tools with which to analyze those data, to any public user anywhere in the world. Previously, the addition of new microarray data analysis tools to SMD has been limited by available engineering resources, and in addition, the existing suite of tools did not provide a simple way to design, execute and share analysis pipelines, or to document such pipelines for the purposes of publication. To address this, we have incorporated the GenePattern software package directly into SMD, providing access to many new analysis tools, as well as a plug-in architecture that allows users to directly integrate and share additional tools through SMD. In this article, we describe our implementation of the GenePattern microarray analysis software package into the SMD code base. This extension is available with the SMD source code that is fully and freely available to others under an Open Source license, enabling other groups to create a local installation of SMD with an enriched data analysis capability.
Cruella: developing a scalable tissue microarray data management system.
Cowan, James D; Rimm, David L; Tuck, David P
2006-06-01
Compared with DNA microarray technology, relatively little information is available concerning the special requirements, design influences, and implementation strategies of data systems for tissue microarray technology. These issues include the requirement to accommodate new and different data elements for each new project as well as the need to interact with pre-existing models for clinical, biological, and specimen-related data. The aim was to design and implement a flexible, scalable tissue microarray data storage and management system that could accommodate information regarding different disease types, different clinical investigators, and different clinical investigation questions, all of which could potentially contribute unforeseen data types that require dynamic integration with existing data. The unpredictability of the data elements combined with the novelty of automated analysis algorithms and controlled vocabulary standards in this area require flexible designs and practical decisions. Our design includes a custom Java-based persistence layer to mediate and facilitate interaction with an object-relational database model and a novel database schema. User interaction is provided through a Java Servlet-based Web interface. Cruella has become an indispensable resource and is used by dozens of researchers every day. The system stores millions of experimental values covering more than 300 biological markers and more than 30 disease types. The experimental data are merged with clinical data that have been aggregated from multiple sources and are available to the researchers for management, analysis, and export. Cruella addresses many of the special considerations for managing tissue microarray experimental data and the associated clinical information. A metadata-driven approach provides a practical solution to many of the unique issues inherent in tissue microarray research, and allows relatively straightforward interoperability with and accommodation of new data models.
ROBOCAL: An automated NDA (nondestructive analysis) calorimetry and gamma isotopic system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hurd, J.R.; Powell, W.D.; Ostenak, C.A.
1989-11-01
ROBOCAL, which is presently being developed and tested at Los Alamos National Laboratory, is a full-scale, prototype robotic system for remote calorimetric and gamma-ray analysis of special nuclear materials. It integrates a fully automated, multidrawer, vertical stacker-retriever system for staging unmeasured nuclear materials, and a fully automated gantry robot for computer-based selection and transfer of nuclear materials to calorimetric and gamma-ray measurement stations. Since ROBOCAL is designed for minimal operator intervention, a completely programmed user interface is provided to interact with the automated mechanical and assay systems. The assay system is designed to completely integrate calorimetric and gamma-ray data acquisition and to perform state-of-the-art analyses on both homogeneous and heterogeneous distributions of nuclear materials in a wide variety of matrices.
Fully automated urban traffic system
NASA Technical Reports Server (NTRS)
Dobrotin, B. M.; Hansen, G. R.; Peng, T. K. C.; Rennels, D. A.
1977-01-01
The replacement of the driver with an automatic system that could perform the functions of guiding and routing a vehicle with a human's capability of responding to changing traffic demands was discussed. The problem was divided into four technological areas: guidance, routing, computing, and communications. It was determined that the latter three areas were being developed independently of any need for fully automated urban traffic. A guidance system that would meet system requirements was not being developed but was technically feasible.
A fully automated digitally controlled 30-inch telescope
NASA Technical Reports Server (NTRS)
Colgate, S. A.; Moore, E. P.; Carlson, R.
1975-01-01
A fully automated 30-inch (75-cm) telescope has been successfully designed and constructed from a military surplus Nike-Ajax radar mount. Novel features include: closed-loop operation between mountain telescope and campus computer 30 km apart via microwave link, a TV-type sensor which is photon shot-noise limited, a special lightweight primary mirror, and a stepping motor drive capable of slewing and settling one degree in one second or a radian in fifteen seconds.
Kim, Jungkyu; Jensen, Erik C; Stockton, Amanda M; Mathies, Richard A
2013-08-20
A fully integrated multilayer microfluidic chemical analyzer for automated sample processing and labeling, as well as analysis using capillary zone electrophoresis is developed and characterized. Using lifting gate microfluidic control valve technology, a microfluidic automaton consisting of a two-dimensional microvalve cellular array is fabricated with soft lithography in a format that enables facile integration with a microfluidic capillary electrophoresis device. The programmable sample processor performs precise mixing, metering, and routing operations that can be combined to achieve automation of complex and diverse assay protocols. Sample labeling protocols for amino acid, aldehyde/ketone and carboxylic acid analysis are performed automatically followed by automated transfer and analysis by the integrated microfluidic capillary electrophoresis chip. Equivalent performance to off-chip sample processing is demonstrated for each compound class; the automated analysis resulted in a limit of detection of ~16 nM for amino acids. Our microfluidic automaton provides a fully automated, portable microfluidic analysis system capable of autonomous analysis of diverse compound classes in challenging environments.
Automated MRI Segmentation for Individualized Modeling of Current Flow in the Human Head
Huang, Yu; Dmochowski, Jacek P.; Su, Yuzhuo; Datta, Abhishek; Rorden, Christopher; Parra, Lucas C.
2013-01-01
Objective: High-definition transcranial direct current stimulation (HD-tDCS) and high-density electroencephalography (HD-EEG) require accurate models of current flow for precise targeting and current source reconstruction. At a minimum, such modeling must capture the idiosyncratic anatomy of brain, cerebrospinal fluid (CSF) and skull for each individual subject. Currently, the process to build such high-resolution individualized models from structural magnetic resonance images (MRI) requires labor-intensive manual segmentation, even when leveraging available automated segmentation tools. Also, accurate placement of many high-density electrodes on an individual scalp is a tedious procedure. The goal was to develop fully automated techniques to reduce the manual effort in such a modeling process. Approach: A fully automated segmentation technique based on Statistical Parametric Mapping 8 (SPM8), including an improved tissue probability map (TPM) and an automated correction routine for segmentation errors, was developed, along with an automated electrode placement tool for high-density arrays. The performance of these automated routines was evaluated against results from manual segmentation on 4 healthy subjects and 7 stroke patients. The criteria include segmentation accuracy, the difference of current flow distributions in resulting HD-tDCS models and the optimized current flow intensities on cortical targets. Main results: The segmentation tool can segment out not just the brain but also provide accurate results for CSF, skull and other soft tissues with a field of view (FOV) extending to the neck. Compared to manual results, automated segmentation deviates by only 7% and 18% for normal and stroke subjects, respectively. The predicted electric fields in the brain deviate by 12% and 29% respectively, which is well within the variability observed for various modeling choices. Finally, optimized current flow intensities on cortical targets do not differ significantly.
Significance: Fully automated individualized modeling may now be feasible for large-sample EEG research studies and tDCS clinical trials. PMID:24099977
Automated MRI segmentation for individualized modeling of current flow in the human head
NASA Astrophysics Data System (ADS)
Huang, Yu; Dmochowski, Jacek P.; Su, Yuzhuo; Datta, Abhishek; Rorden, Christopher; Parra, Lucas C.
2013-12-01
Objective. High-definition transcranial direct current stimulation (HD-tDCS) and high-density electroencephalography require accurate models of current flow for precise targeting and current source reconstruction. At a minimum, such modeling must capture the idiosyncratic anatomy of the brain, cerebrospinal fluid (CSF) and skull for each individual subject. Currently, the process to build such high-resolution individualized models from structural magnetic resonance images requires labor-intensive manual segmentation, even when utilizing available automated segmentation tools. Also, accurate placement of many high-density electrodes on an individual scalp is a tedious procedure. The goal was to develop fully automated techniques to reduce the manual effort in such a modeling process. Approach. A fully automated segmentation technique based on Statistical Parametric Mapping 8, including an improved tissue probability map and an automated correction routine for segmentation errors, was developed, along with an automated electrode placement tool for high-density arrays. The performance of these automated routines was evaluated against results from manual segmentation on four healthy subjects and seven stroke patients. The criteria include segmentation accuracy, the difference of current flow distributions in resulting HD-tDCS models and the optimized current flow intensities on cortical targets. Main results. The segmentation tool can segment out not just the brain but also provide accurate results for CSF, skull and other soft tissues with a field of view extending to the neck. Compared to manual results, automated segmentation deviates by only 7% and 18% for normal and stroke subjects, respectively. The predicted electric fields in the brain deviate by 12% and 29% respectively, which is well within the variability observed for various modeling choices. Finally, optimized current flow intensities on cortical targets do not differ significantly. Significance.
Fully automated individualized modeling may now be feasible for large-sample EEG research studies and tDCS clinical trials.
Yoon, Nara; Do, In-Gu; Cho, Eun Yoon
2014-09-01
Easy and accurate HER2 testing is essential when considering the prognostic and predictive significance of HER2 in breast cancer. The use of a fully automated, quantitative FISH assay would be helpful to detect HER2 amplification in breast cancer tissue specimens with reduced inter-laboratory variability. We compared the concordance of HER2 status as assessed by an automated FISH staining system to manual FISH testing. Using 60 formalin-fixed paraffin-embedded breast carcinoma specimens, we assessed HER2 immunoexpression with two antibodies (DAKO HercepTest and CB11). In addition, HER2 status was evaluated with automated FISH using the Leica FISH System for BOND and manual FISH using the Abbott PathVysion DNA Probe Kit. All but one specimen were successfully stained using both FISH methods. When the data were divided into two groups according to HER2/CEP17 ratio, positive and negative, the results from both the automated and manual FISH techniques were identical for all 59 evaluable specimens. The HER2 and CEP17 copy numbers and HER2/CEP17 ratio showed great agreement between both FISH methods. The automated FISH technique was interpretable, with signal intensity similar to that of the manual FISH technique. In contrast to manual FISH, the automated FISH technique showed well-preserved architecture due to low membrane digestion. HER2 immunohistochemistry and FISH results showed perfect agreement (κ = 1.0, p < 0.001). HER2 status can be reliably determined using a fully automated HER2 FISH system with high concordance to the well-established manual FISH method. Because of stable signal intensity and high staining quality, the automated FISH technique may be more appropriate than manual FISH for routine applications. © 2013 APMIS. Published by John Wiley & Sons Ltd.
Schmidt, Deborah; Schuhmacher, Frank; Geissner, Andreas; Seeberger, Peter H; Pfrengle, Fabian
2015-04-07
Monoclonal antibodies that recognize plant cell wall glycans are used for high-resolution imaging, providing important information about the structure and function of cell wall polysaccharides. To characterize the binding epitopes of these powerful molecular probes, a library of eleven plant arabinoxylan oligosaccharides was produced by automated solid-phase synthesis. Modular assembly of oligoarabinoxylans from few building blocks was enabled by adding (2-naphthyl)methyl (Nap) to the toolbox of orthogonal protecting groups for solid-phase synthesis. Conjugation-ready oligosaccharides were obtained and the binding specificities of xylan-directed antibodies were determined on microarrays. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Maximizing Your Investment in Building Automation System Technology.
ERIC Educational Resources Information Center
Darnell, Charles
2001-01-01
Discusses how organizational issues and system standardization can be important factors that determine an institution's ability to fully exploit contemporary building automation systems (BAS). Further presented is management strategy for maximizing BAS investments. (GR)
NASA Astrophysics Data System (ADS)
Yu, Long; Druckenbrod, Markus; Greve, Martin; Wang, Ke-qi; Abdel-Maksoud, Moustafa
2015-10-01
A fully automated optimization process is provided for the design of ducted propellers under open-water conditions, including 3D geometry modeling, meshing, an optimization algorithm, and CFD analysis techniques. The developed process allows the direct integration of a RANSE solver in the design stage. A practical ducted propeller design case study is carried out for validation. Numerical simulations and open-water tests were performed and confirmed that the optimized ducted propeller improves hydrodynamic performance as predicted.
Xu, Weiyi; Wan, Feng; Lou, Yufeng; Jin, Jiali; Mao, Weilin
2014-01-01
A number of automated devices for pretransfusion testing have recently become available. This study evaluated the Immucor Galileo System, a fully automated device based on the microplate hemagglutination technique, for ABO/Rh (D) determinations. Routine ABO/Rh typing tests were performed on 13,045 samples using the Immucor automated instruments. The manual tube method was used to resolve ABO forward and reverse grouping discrepancies. D-negative test results were investigated and confirmed manually by the indirect antiglobulin test (IAT). The system rejected 70 tests for sample inadequacy. 87 samples were read as "No-type-determined" due to forward and reverse grouping discrepancies. 25 tests gave these results because of sample hemolysis. After further tests, we found 34 tests were caused by weakened RBC antibodies, 5 tests were attributable to weak A and/or B antigens, 4 tests were due to mixed-field reactions, and 8 tests had high-titer cold agglutinins, which react only at temperatures below 34 degrees C. In the remaining 11 cases, irregular RBC antibodies were identified in 9 samples (seven anti-M and two anti-P) and two subgroups were identified in 2 samples (one A1 and one A2) by a reference laboratory. As for D typing, 2 weak D+ samples missed by the automated system gave negative results, but weak-positive reactions were observed in the IAT. The Immucor Galileo System is reliable and well suited for ABO and D blood grouping, although several factors may cause discrepancies in ABO/D typing with a fully automated system. It is suggested that standardization of sample collection may improve the performance of the fully automated system.
2010-01-01
Introduction: Joint effusion is frequently associated with osteoarthritis (OA) flare-up and is an important marker of therapeutic response. This study aimed at developing and validating a fully automated system based on magnetic resonance imaging (MRI) for the quantification of joint effusion volume in knee OA patients. Methods: MRI examinations consisted of two axial sequences: a T2-weighted true fast imaging with steady-state precession and a T1-weighted gradient echo. An automated joint effusion volume quantification system using MRI was developed and validated (a) with calibrated phantoms (cylinder and sphere) and effusion from knee OA patients; (b) with assessment by manual quantification; and (c) by direct aspiration. Twenty-five knee OA patients with joint effusion were included in the study. Results: The automated joint effusion volume quantification was developed as a four stage sequencing process: bone segmentation, filtering of unrelated structures, segmentation of joint effusion, and subvoxel volume calculation. Validation experiments revealed excellent coefficients of variation with the calibrated cylinder (1.4%) and sphere (0.8%) phantoms. Comparison of the OA knee joint effusion volume assessed by the developed automated system and by manual quantification was also excellent (r = 0.98; P < 0.0001), as was the comparison with direct aspiration (r = 0.88; P = 0.0008). Conclusions: The newly developed fully automated MRI-based system provided precise quantification of OA knee joint effusion volume with excellent correlation with data from phantoms, a manual system, and joint aspiration. Such an automated system will be instrumental in improving the reproducibility/reliability of the evaluation of this marker in clinical application. PMID:20846392
Integrating Test-Form Formatting into Automated Test Assembly
ERIC Educational Resources Information Center
Diao, Qi; van der Linden, Wim J.
2013-01-01
Automated test assembly uses the methodology of mixed integer programming to select an optimal set of items from an item bank. Automated test-form generation uses the same methodology to optimally order the items and format the test form. From an optimization point of view, production of fully formatted test forms directly from the item pool using…
Wire-Guide Manipulator For Automated Welding
NASA Technical Reports Server (NTRS)
Morris, Tim; White, Kevin; Gordon, Steve; Emerich, Dave; Richardson, Dave; Faulkner, Mike; Stafford, Dave; Mccutcheon, Kim; Neal, Ken; Milly, Pete
1994-01-01
Compact motor drive positions guide for welding filler wire. Drive part of automated wire feeder in partly or fully automated welding system. Drive unit contains three parallel subunits. Rotations of lead screws in three subunits coordinated to obtain desired motions in three degrees of freedom. Suitable for both variable-polarity plasma arc welding and gas/tungsten arc welding.
Fully automated corneal endothelial morphometry of images captured by clinical specular microscopy
NASA Astrophysics Data System (ADS)
Bucht, Curry; Söderberg, Per; Manneberg, Göran
2010-02-01
The corneal endothelium serves as the posterior barrier of the cornea. Factors such as clarity and refractive properties of the cornea are in direct relationship to the quality of the endothelium. The endothelial cell density is considered the most important morphological factor of the corneal endothelium. Pathological conditions and physical trauma may threaten the endothelial cell density to such an extent that the optical property of the cornea and thus clear eyesight is threatened. Diagnosis of the corneal endothelium through morphometry is an important part of several clinical applications. Morphometry of the corneal endothelium is presently carried out by semi-automated analysis of pictures captured by a Clinical Specular Microscope (CSM). Because of the occasional need for operator involvement, this process can be tedious, having a negative impact on sampling size. This study was dedicated to the development and use of fully automated analysis of a very large range of images of the corneal endothelium, captured by CSM, using Fourier analysis. Software was developed in the mathematical programming language Matlab. Pictures of the corneal endothelium, captured by CSM, were read into the analysis software. The software automatically performed digital enhancement of the images, normalizing lights and contrasts. The digitally enhanced images of the corneal endothelium were Fourier transformed, using the fast Fourier transform (FFT) and stored as new images. Tools were developed and applied for identification and analysis of relevant characteristics of the Fourier transformed images. The data obtained from each Fourier transformed image was used to calculate the mean cell density of its corresponding corneal endothelium. The calculation was based on well known diffraction theory. Results in the form of estimated cell density of the corneal endothelium were obtained, using fully automated analysis software on 292 images captured by CSM.
The cell density obtained by the fully automated analysis was compared to the cell density obtained from classical, semi-automated analysis, and a strong correlation was found.
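The Fourier approach described above can be sketched roughly: a regular cell mosaic produces a ring in the power spectrum whose radius gives the dominant spatial frequency, and hence the cell pitch and density. This is a simplified illustration of the general idea, not the authors' implementation; the synthetic test pattern and all parameters below are assumptions:

```python
import numpy as np

def cell_density_from_fft(image, pixel_size_um):
    """Estimate cell density (cells/mm^2) from the dominant ring radius
    in the image's 2D power spectrum. A rough sketch of the idea only."""
    n = image.shape[0]
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image - image.mean())))
    # radial distance (in frequency bins) of each pixel from the spectrum centre
    yy, xx = np.indices(spectrum.shape)
    r = np.hypot(yy - n // 2, xx - n // 2).astype(int)
    # radially averaged power, ignoring the DC neighbourhood
    counts = np.bincount(r.ravel())[: n // 2]
    sums = np.bincount(r.ravel(), spectrum.ravel())[: n // 2]
    radial = sums / counts
    peak = int(np.argmax(radial[2:])) + 2        # dominant spatial frequency (cycles/image)
    cell_pitch_um = n * pixel_size_um / peak     # dominant period in microns
    return (1000.0 / cell_pitch_um) ** 2         # cells per square millimetre

# Synthetic striped test pattern with a 20-micron period (1 um/pixel)
n = 256
y, x = np.mgrid[0:n, 0:n]
img = np.sin(2 * np.pi * x / 20.0)
density = cell_density_from_fft(img, 1.0)
print(round(density))
```

For a true 20-micron pitch the expected density is (1000/20)² = 2500 cells/mm²; the discrete frequency grid makes the estimate land near that value rather than exactly on it.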
Pertuz, Said; McDonald, Elizabeth S; Weinstein, Susan P; Conant, Emily F; Kontos, Despina
2016-04-01
To assess a fully automated method for volumetric breast density (VBD) estimation in digital breast tomosynthesis (DBT) and to compare the findings with those of full-field digital mammography (FFDM) and magnetic resonance (MR) imaging. Bilateral DBT images, FFDM images, and sagittal breast MR images were retrospectively collected from 68 women who underwent breast cancer screening from October 2011 to September 2012 with institutional review board-approved, HIPAA-compliant protocols. A fully automated computer algorithm was developed for quantitative estimation of VBD from DBT images. FFDM images were processed with U.S. Food and Drug Administration-cleared software, and the MR images were processed with a previously validated automated algorithm to obtain corresponding VBD estimates. Pearson correlation and analysis of variance with Tukey-Kramer post hoc correction were used to compare the multimodality VBD estimates. Estimates of VBD from DBT were significantly correlated with FFDM-based and MR imaging-based estimates with r = 0.83 (95% confidence interval [CI]: 0.74, 0.90) and r = 0.88 (95% CI: 0.82, 0.93), respectively (P < .001). The corresponding correlation between FFDM and MR imaging was r = 0.84 (95% CI: 0.76, 0.90). However, statistically significant differences after post hoc correction (α = 0.05) were found among VBD estimates from FFDM (mean ± standard deviation, 11.1% ± 7.0) relative to MR imaging (16.6% ± 11.2) and DBT (19.8% ± 16.2). Differences between VDB estimates from DBT and MR imaging were not significant (P = .26). Fully automated VBD estimates from DBT, FFDM, and MR imaging are strongly correlated but show statistically significant differences. Therefore, absolute differences in VBD between FFDM, DBT, and MR imaging should be considered in breast cancer risk assessment.
Kline, Timothy L; Korfiatis, Panagiotis; Edwards, Marie E; Blais, Jaime D; Czerwiec, Frank S; Harris, Peter C; King, Bernard F; Torres, Vicente E; Erickson, Bradley J
2017-08-01
Deep learning techniques are being rapidly applied to medical imaging tasks, from organ and lesion segmentation to tissue and tumor classification. These techniques are becoming the leading algorithmic approaches to solve inherently difficult image processing tasks. Currently, the most critical requirement for successful implementation lies in the need for relatively large datasets that can be used for training the deep learning networks. Based on our initial studies of MR imaging examinations of the kidneys of patients affected by polycystic kidney disease (PKD), we have generated a unique database of imaging data and corresponding reference standard segmentations of polycystic kidneys. In the study of PKD, segmentation of the kidneys is needed in order to measure total kidney volume (TKV). Automated methods to segment the kidneys and measure TKV are needed to increase measurement throughput and alleviate the inherent variability of human-derived measurements. We hypothesize that deep learning techniques can be leveraged to perform fast, accurate, reproducible, and fully automated segmentation of polycystic kidneys. Here, we describe a fully automated approach for segmenting PKD kidneys within MR images that simulates a multi-observer approach in order to create an accurate and robust method for the task of segmentation and computation of TKV for PKD patients. A total of 2000 cases were used for training and validation, and 400 cases were used for testing. The multi-observer ensemble method had mean ± SD percent volume difference of 0.68 ± 2.2% compared with the reference standard segmentations. The complete framework performs fully automated segmentation at a level comparable with interobserver variability and could be considered as a replacement for the task of segmentation of PKD kidneys by a human.
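The agreement figure quoted above (percent volume difference of 0.68 ± 2.2% against reference segmentations) can be computed per case as in this sketch; the binary masks below are toy data, not study segmentations:

```python
import numpy as np

def percent_volume_difference(auto_mask, ref_mask, voxel_volume_ml=1.0):
    """Signed percent difference between automated and reference
    segmentation volumes, relative to the reference volume."""
    v_auto = auto_mask.sum() * voxel_volume_ml
    v_ref = ref_mask.sum() * voxel_volume_ml
    return 100.0 * (v_auto - v_ref) / v_ref

# Toy binary masks: the automated result slightly over-segments the reference
ref = np.zeros((10, 10, 10), dtype=bool)
ref[2:8, 2:8, 2:8] = True            # 216 reference voxels
auto = ref.copy()
auto[8, 2:8, 2:8] = True             # 36 extra voxels in the automated mask
print(f"{percent_volume_difference(auto, ref):.2f}%")
```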
Xenon International Automated Control
DOE Office of Scientific and Technical Information (OSTI.GOV)
2016-08-05
The Xenon International Automated Control software monitors and displays the status of multiple commercial and PNNL-designed hardware components, and provides both manual operator control and fully automatic control, in order to generate and transmit atmospheric radioxenon concentration measurements every six hours.
Development of Fully Automated Low-Cost Immunoassay System for Research Applications.
Wang, Guochun; Das, Champak; Ledden, Bradley; Sun, Qian; Nguyen, Chien
2017-10-01
Enzyme-linked immunosorbent assay (ELISA) automation for routine operation in a small research environment would be very attractive. A portable, fully automated, low-cost immunoassay system was designed, developed, and evaluated with several protein analytes. It features disposable capillary columns as the reaction sites and uses real-time calibration for improved accuracy. It reduces the overall assay time to less than 75 min and can be easily adapted to new testing targets. The running cost is extremely low due to the automation as well as reduced material requirements. Details about system configuration, component selection, disposable fabrication, system assembly, and operation are reported. The performance of the system was initially established with a rabbit immunoglobulin G (IgG) assay, and an example of assay adaptation with an interleukin 6 (IL6) assay is shown. This system is ideal for research use, but could work for broader testing applications with further optimization.
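The abstract mentions real-time calibration but does not specify the curve model; a common choice for immunoassay calibration is the four-parameter logistic (4PL), sketched here with invented calibrator concentrations and noise-free readings (an assumption for illustration, not the paper's method):

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, d, c, b):
    """Four-parameter logistic: a = upper asymptote, d = lower asymptote,
    c = mid-point concentration (EC50), b = slope factor."""
    return d + (a - d) / (1.0 + (x / c) ** b)

# Invented calibrator concentrations (ng/mL) and noise-free readings
conc = np.array([0.1, 0.5, 2.0, 8.0, 32.0, 128.0])
signal = four_pl(conc, 2.0, 0.05, 5.0, 1.2)
params, _ = curve_fit(four_pl, conc, signal, p0=[2.0, 0.0, 4.0, 1.0])
a, d, c, b = params
# invert the fitted curve to read back an unknown sample's concentration
unknown_signal = 1.0
estimated_conc = c * ((a - d) / (unknown_signal - d) - 1.0) ** (1.0 / b)
print(f"estimated concentration: {estimated_conc:.2f} ng/mL")
```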
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hurd, J.R.; Bonner, C.A.; Ostenak, C.A.
1989-01-01
ROBOCAL, which is presently being developed and tested at Los Alamos National Laboratory, is a full-scale, prototypical robotic system for remote calorimetric and gamma-ray analysis of special nuclear materials. It integrates a fully automated, multi-drawer, vertical stacker-retriever system for staging unmeasured nuclear materials, and a fully automated gantry robot for computer-based selection and transfer of nuclear materials to calorimetric and gamma-ray measurement stations. Since ROBOCAL is designed for minimal operator intervention, a completely programmed user interface and data-base system are provided to interact with the automated mechanical and assay systems. The assay system is designed to completely integrate calorimetric and gamma-ray data acquisition and to perform state-of-the-art analyses on both homogeneous and heterogeneous distributions of nuclear materials in a wide variety of matrices. 10 refs., 10 figs., 4 tabs.
Kaur, Harsimar B; Guedes, Liana B; Lu, Jiayun; Maldonado, Laneisha; Reitz, Logan; Barber, John R; De Marzo, Angelo M; Tosoian, Jeffrey J; Tomlins, Scott A; Schaeffer, Edward M; Joshu, Corinne E; Sfanos, Karen S; Lotan, Tamara L
2018-05-30
The inflammatory microenvironment plays an important role in the pathogenesis and progression of tumors and may be associated with somatic genomic alterations. We examined the association of tumor-infiltrating T-cell density with clinical-pathologic variables, tumor molecular subtype, and oncologic outcomes in surgically treated primary prostate cancer occurring in patients of European-American or African-American ancestry. We evaluated 312 primary prostate tumors, enriched for patients with African-American ancestry and high grade disease. Tissue microarrays were immunostained for CD3, CD8, and FOXP3 and were previously immunostained for ERG and PTEN using genetically validated protocols. Image analysis for quantification of T-cell density in tissue microarray tumor spots was performed. Automated quantification of T-cell densities in tumor-containing regions of tissue microarray spots and standard histologic sections were correlated (r = 0.73, p < 0.00001) and there was good agreement between visual and automated T-cell density counts on tissue microarray spots (r = 0.93, p < 0.00001). There was a significant correlation between CD3+, CD8+, and FOXP3+ T-cell densities (p < 0.00001), but these were not associated with most clinical or pathologic variables. Increased T-cell density was significantly associated with ERG positivity (median 309 vs. 188 CD3+ T cells/mm²; p = 0.0004) and also with PTEN loss (median 317 vs. 192 CD3+ T cells/mm²; p = 0.001) in the combined cohort of matched European-American and African-American ancestry patients. The same association or a similar trend was present in patients of both ancestries when analyzed separately. When the African-American patients from the matched race set were combined with a separate high grade set of African-American cases, there was a weak association of increased FOXP3+ T-cell densities with increased risk of metastasis in multivariable analysis.
Though high T-cell density is associated with specific molecular subclasses of prostate cancer, we did not find an association of T-cell density with racial ancestry.
A prototype for unsupervised analysis of tissue microarrays for cancer research and diagnostics.
Chen, Wenjin; Reiss, Michael; Foran, David J
2004-06-01
The tissue microarray (TMA) technique enables researchers to extract small cylinders of tissue from histological sections and arrange them in a matrix configuration on a recipient paraffin block such that hundreds can be analyzed simultaneously. TMA offers several advantages over traditional specimen preparation by maximizing limited tissue resources and providing a highly efficient means for visualizing molecular targets. By enabling researchers to reliably determine the protein expression profile for specific types of cancer, it may be possible to elucidate the mechanism by which healthy tissues are transformed into malignancies. Currently, the primary methods used to evaluate arrays involve the interactive review of TMA samples while they are viewed under a microscope, subjectively evaluated, and scored by a technician. This process is extremely slow, tedious, and prone to error. In order to facilitate large-scale, multi-institutional studies, a more automated and reliable means for analyzing TMAs is needed. We report here a web-based prototype which features automated imaging, registration, and distributed archiving of TMAs in multiuser network environments. The system utilizes a principal color decomposition approach to identify and characterize the predominant staining signatures of specimens in color space. This strategy was shown to be reliable for detecting and quantifying the immunohistochemical expression levels for TMAs.
ProDeGe: A computational protocol for fully automated decontamination of genomes
Tennessen, Kristin; Andersen, Evan; Clingenpeel, Scott; ...
2015-06-09
Single amplified genomes and genomes assembled from metagenomes have enabled the exploration of uncultured microorganisms at an unprecedented scale. However, both these types of products are plagued by contamination. Since these genomes are now being generated in a high-throughput manner and sequences from them are propagating into public databases to drive novel scientific discoveries, rigorous quality controls and decontamination protocols are urgently needed. Here, we present ProDeGe (Protocol for fully automated Decontamination of Genomes), the first computational protocol for fully automated decontamination of draft genomes. ProDeGe classifies sequences into two classes—clean and contaminant—using a combination of homology and feature-based methodologies. On average, 84% of sequence from the non-target organism is removed from the data set (specificity) and 84% of the sequence from the target organism is retained (sensitivity). Lastly, the procedure operates successfully at a rate of ~0.30 CPU core hours per megabase of sequence and can be applied to any type of genome sequence.
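The specificity and sensitivity figures reported for ProDeGe follow directly from its two-class (clean vs. contaminant) output. A minimal sketch of that bookkeeping, using a hypothetical per-contig helper (not the actual ProDeGe code, which weights by sequence length):

```python
def decontamination_metrics(labels, predictions):
    """labels/predictions: per-contig values, each 'target' or 'contaminant'.

    Returns (specificity, sensitivity) as defined in the abstract:
      specificity - fraction of contaminant material removed
      sensitivity - fraction of target material retained
    """
    removed_contaminant = sum(
        1 for l, p in zip(labels, predictions)
        if l == "contaminant" and p == "contaminant")
    kept_target = sum(
        1 for l, p in zip(labels, predictions)
        if l == "target" and p == "target")
    n_contaminant = sum(1 for l in labels if l == "contaminant")
    n_target = sum(1 for l in labels if l == "target")
    return removed_contaminant / n_contaminant, kept_target / n_target
```

For example, a draft genome of 4 target and 5 contaminant contigs where the classifier removes 4 of the 5 contaminants and keeps 3 of the 4 target contigs scores specificity 0.8 and sensitivity 0.75.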
Astronomical algorithms for automated analysis of tissue protein expression in breast cancer
Ali, H R; Irwin, M; Morris, L; Dawson, S-J; Blows, F M; Provenzano, E; Mahler-Araujo, B; Pharoah, P D; Walton, N A; Brenton, J D; Caldas, C
2013-01-01
Background: High-throughput evaluation of tissue biomarkers in oncology has been greatly accelerated by the widespread use of tissue microarrays (TMAs) and immunohistochemistry. Although TMAs have the potential to facilitate protein expression profiling on a scale to rival experiments of tumour transcriptomes, the bottleneck and imprecision of manually scoring TMAs has impeded progress. Methods: We report image analysis algorithms adapted from astronomy for the precise automated analysis of IHC in all subcellular compartments. The power of this technique is demonstrated using over 2000 breast tumours and comparing quantitative automated scores against manual assessment by pathologists. Results: All continuous automated scores showed good correlation with their corresponding ordinal manual scores. For oestrogen receptor (ER), the correlation was 0.82, P<0.0001, for BCL2 0.72, P<0.0001 and for HER2 0.62, P<0.0001. Automated scores showed excellent concordance with manual scores for the unsupervised assignment of cases to ‘positive' or ‘negative' categories with agreement rates of up to 96%. Conclusion: The adaptation of astronomical algorithms coupled with their application to large annotated study cohorts, constitutes a powerful tool for the realisation of the enormous potential of digital pathology. PMID:23329232
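The validation above reduces to correlating a continuous automated score with an ordinal manual score for each marker. A minimal sketch with a hand-rolled Pearson coefficient on invented data (the paper does not state which estimator was used, so treat this as illustrative only):

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented example: automated continuous scores vs. ordinal manual scores
automated = [0.1, 0.4, 0.5, 0.9, 1.3, 1.8]
manual = [0, 1, 1, 2, 2, 3]          # e.g. pathologist 0-3 scale
r = pearson(automated, manual)
```

In practice an ordinal-vs-continuous comparison is often reported with Spearman's rank correlation instead; the code above is just the simplest concrete instance.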
Automation in Clinical Microbiology
Ledeboer, Nathan A.
2013-01-01
Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547
Interim Assessment of the VAL Automated Guideway Transit System.
DOT National Transportation Integrated Search
1981-11-01
This report describes an interim assessment of the VAL (Vehicules Automatiques Legers or Light Automated Vehicle) AGT system which is currently under construction in Lille, France, and which is to become fully operational in December 1983. This repor...
Sun, Yung-Shin; Zhu, Xiangdong
2016-10-01
Microarrays provide a platform for high-throughput characterization of biomolecular interactions. To increase the sensitivity and specificity of microarrays, surface blocking is required to minimize the nonspecific interactions between analytes and unprinted yet functionalized surfaces. To block amine- or epoxy-functionalized substrates, bovine serum albumin (BSA) is one of the most commonly used blocking reagents because it is cheap and easy to use. Based on standard protocols from microarray manufacturers, a BSA concentration of 1% (10 mg/mL or 200 μM) and a reaction time of at least 30 min are required to efficiently block epoxy-coated slides. In this paper, we used both fluorescent and label-free methods to characterize the BSA blocking efficiency on epoxy-functionalized substrates. The blocking efficiency of BSA was characterized using a fluorescent scanner and a label-free oblique-incidence reflectivity difference (OI-RD) microscope. We found that (1) a BSA concentration of 0.05% (0.5 mg/mL or 10 μM) could give a blocking efficiency of 98%, and (2) the BSA blocking step took only about 5 min to complete. Also, from real-time and in situ measurements, we were able to calculate the conformational properties (thickness, mass density, and number density) of BSA molecules deposited on the epoxy surface. © 2015 Society for Laboratory Automation and Screening.
Detecting and Genotyping Escherichia coli O157:H7 using multiplexed PCR and nucleic acid microarrays
DOE Office of Scientific and Technical Information (OSTI.GOV)
Call, Douglas R.; Brockman, Fred J.; Chandler, Darrell P.
2000-12-01
Rapid detection and characterization of foodborne pathogens such as Escherichia coli O157:H7 is crucial for epidemiological investigations and food safety surveillance. As an alternative to conventional technologies, we examined the sensitivity and specificity of nucleic acid microarrays for detecting and genotyping E. coli O157:H7. The array was composed of oligonucleotide probes (25-30 mer) complementary to four virulence loci (intimin, Shiga-like toxins I and II, and hemolysin A). Target DNA was amplified from whole cells or from purified DNA via single or multiplexed polymerase chain reaction (PCR), and PCR products were hybridized to the array without further modification or purification. The array was 32-fold more sensitive than gel electrophoresis and capable of detecting amplification products from < 1 cell equivalent of genomic DNA (1 fg). Immunomagnetic capture, PCR and a microarray were subsequently used to detect 55 CFU ml⁻¹ (E. coli O157:H7) from chicken rinsate without the aid of pre-enrichment. Four isolates of E. coli O157:H7 and one isolate of O91:H2, for which genotypic data were available, were unambiguously genotyped with this array. Glass-based microarrays are relatively simple to construct and provide a rapid and sensitive means to detect multiplexed PCR products, and the system is amenable to automation.
Beijbom, Oscar; Edmunds, Peter J.; Roelfsema, Chris; Smith, Jennifer; Kline, David I.; Neal, Benjamin P.; Dunlap, Matthew J.; Moriarty, Vincent; Fan, Tung-Yung; Tan, Chih-Jui; Chan, Stephen; Treibitz, Tali; Gamst, Anthony; Mitchell, B. Greg; Kriegman, David
2015-01-01
Global climate change and other anthropogenic stressors have heightened the need to rapidly characterize ecological changes in marine benthic communities across large scales. Digital photography enables rapid collection of survey images to meet this need, but the subsequent image annotation is typically a time-consuming, manual task. We investigated the feasibility of using automated point-annotation to expedite cover estimation of the 17 dominant benthic categories from survey images captured at four Pacific coral reefs. Inter- and intra-annotator variability among six human experts was quantified and compared to semi- and fully-automated annotation methods, which are made available at coralnet.ucsd.edu. Our results indicate high expert agreement for identification of coral genera, but lower agreement for algal functional groups, in particular between turf algae and crustose coralline algae. This indicates the need for unequivocal definitions of algal groups, careful training of multiple annotators, and enhanced imaging technology. Semi-automated annotation, where 50% of the annotation decisions were performed automatically, yielded cover estimate errors comparable to those of the human experts. Furthermore, fully-automated annotation yielded rapid, unbiased cover estimates but with increased variance. These results show that automated annotation can increase spatial coverage and decrease time and financial outlay for image-based reef surveys. PMID:26154157
A hybrid approach to device integration on a genetic analysis platform
NASA Astrophysics Data System (ADS)
Brennan, Des; Jary, Dorothee; Kurg, Ants; Berik, Evgeny; Justice, John; Aherne, Margaret; Macek, Milan; Galvin, Paul
2012-10-01
Point-of-care (POC) systems require significant component integration to implement biochemical protocols associated with molecular diagnostic assays. Hybrid platforms where discrete components are combined in a single platform are a suitable approach to integration, where combining multiple device fabrication steps on a single substrate is not possible due to incompatible or costly fabrication steps. We integrate three devices each with a specific system functionality: (i) a silicon electro-wetting-on-dielectric (EWOD) device to move and mix sample and reagent droplets in an oil phase, (ii) a polymer microfluidic chip containing channels and reservoirs and (iii) an aqueous phase glass microarray for fluorescence microarray hybridization detection. The EWOD device offers the possibility of fully integrating on-chip sample preparation using nanolitre sample and reagent volumes. A key challenge is sample transfer from the oil phase EWOD device to the aqueous phase microarray for hybridization detection. The EWOD device, waveguide performance and functionality are maintained during the integration process. An on-chip biochemical protocol for arrayed primer extension (APEX) was implemented for single nucleotide polymorphism (SNiP) analysis. The prepared sample is aspirated from the EWOD oil phase to the aqueous phase microarray for hybridization. A bench-top instrumentation system was also developed around the integrated platform to drive the EWOD electrodes, implement APEX sample heating and image the microarray after hybridization.
ERIC Educational Resources Information Center
Gbadamosi, Belau Olatunde
2011-01-01
The paper examines the level of library automation and virtual library development in four academic libraries. A validated questionnaire was used to capture the responses from academic librarians of the libraries under study. The paper discovers that none of the four academic libraries is fully automated. The libraries make use of librarians with…
Joslin, John; Gilligan, James; Anderson, Paul; Garcia, Catherine; Sharif, Orzala; Hampton, Janice; Cohen, Steven; King, Miranda; Zhou, Bin; Jiang, Shumei; Trussell, Christopher; Dunn, Robert; Fathman, John W; Snead, Jennifer L; Boitano, Anthony E; Nguyen, Tommy; Conner, Michael; Cooke, Mike; Harris, Jennifer; Ainscow, Ed; Zhou, Yingyao; Shaw, Chris; Sipes, Dan; Mainquist, James; Lesley, Scott
2018-05-01
The goal of high-throughput screening is to enable screening of compound libraries in an automated manner to identify quality starting points for optimization. This often involves screening a large diversity of compounds in an assay that preserves a connection to the disease pathology. Phenotypic screening is a powerful tool for drug identification, in that assays can be run without prior understanding of the target and with primary cells that closely mimic the therapeutic setting. Advanced automation and high-content imaging have enabled many complex assays, but these are still relatively slow and low throughput. To address this limitation, we have developed an automated workflow that is dedicated to processing complex phenotypic assays for flow cytometry. The system can achieve a throughput of 50,000 wells per day, resulting in a fully automated platform that enables robust phenotypic drug discovery. Over the past 5 years, this screening system has been used for a variety of drug discovery programs, across many disease areas, with many molecules advancing quickly into preclinical development and into the clinic. This report will highlight a diversity of approaches that automated flow cytometry has enabled for phenotypic drug discovery.
Genome-scale cluster analysis of replicated microarrays using shrinkage correlation coefficient.
Yao, Jianchao; Chang, Chunqi; Salmi, Mari L; Hung, Yeung Sam; Loraine, Ann; Roux, Stanley J
2008-06-18
Currently, clustering with some form of correlation coefficient as the gene similarity metric has become a popular method for profiling genomic data. The Pearson correlation coefficient and the standard deviation (SD)-weighted correlation coefficient are the two most widely-used correlations as the similarity metrics in clustering microarray data. However, these two correlations are not optimal for analyzing replicated microarray data generated by most laboratories. An effective correlation coefficient is needed to provide statistically sufficient analysis of replicated microarray data. In this study, we describe a novel correlation coefficient, shrinkage correlation coefficient (SCC), that fully exploits the similarity between the replicated microarray experimental samples. The methodology considers both the number of replicates and the variance within each experimental group in clustering expression data, and provides a robust statistical estimation of the error of replicated microarray data. The value of SCC is revealed by its comparison with two other correlation coefficients that are currently the most widely-used (Pearson correlation coefficient and SD-weighted correlation coefficient) using statistical measures on both synthetic expression data as well as real gene expression data from Saccharomyces cerevisiae. Two leading clustering methods, hierarchical and k-means clustering were applied for the comparison. The comparison indicated that using SCC achieves better clustering performance. Applying SCC-based hierarchical clustering to the replicated microarray data obtained from germinating spores of the fern Ceratopteris richardii, we discovered two clusters of genes with shared expression patterns during spore germination. Functional analysis suggested that some of the genetic mechanisms that control germination in such diverse plant lineages as mosses and angiosperms are also conserved among ferns. 
This study shows that SCC is an alternative to the Pearson correlation coefficient and the SD-weighted correlation coefficient, and is particularly useful for clustering replicated microarray data. This computational approach should be generally useful for proteomic data or other high-throughput analysis methodology.
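The SCC itself is defined in the paper; the generic idea it refines, using a correlation coefficient as the gene-similarity metric via the common "1 - r" distance, can be sketched as follows (synthetic expression profiles, plain Pearson via NumPy standing in for SCC):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 3, 12)                 # 12 synthetic expression conditions
# Two groups of five co-expressed genes, each following a shared pattern
# plus small per-gene noise (invented data for illustration).
profiles = np.vstack(
    [np.sin(t) + rng.normal(0, 0.1, t.size) for _ in range(5)]
    + [np.cos(t) + rng.normal(0, 0.1, t.size) for _ in range(5)])

r = np.corrcoef(profiles)                 # pairwise Pearson r between genes
dist = 1.0 - r                            # correlation distance
within = dist[:5, :5].mean()              # co-expressed pairs: small distance
between = dist[:5, 5:].mean()            # cross-pattern pairs: large distance
```

Any distance matrix built this way can be handed to a hierarchical or k-means clustering routine, which is where swapping Pearson for a replicate-aware estimator like SCC changes the result.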
Automated tetraploid genotype calling by hierarchical clustering
USDA-ARS?s Scientific Manuscript database
SNP arrays are transforming breeding and genetics research for autotetraploids. To fully utilize these arrays, however, the relationship between signal intensity and allele dosage must be inferred independently for each marker. We developed an improved computational method to automate this process, ...
Buchan, Blake W; Peterson, Jess F; Cogbill, Christopher H; Anderson, Dennis K; Ledford, Joellen S; White, Mary N; Quigley, Neil B; Jannetto, Paul J; Ledeboer, Nathan A
2011-10-01
Numerous drugs such as clopidogrel have been developed to reduce coagulation or inhibit platelet function. The hepatic cytochrome P450 (CYP) pathway is involved in the conversion of clopidogrel to its active metabolite. A recent black-box warning was included in the clopidogrel package insert indicating a significant clinical link between specific CYP2C19 genetic variants and poor metabolism of clopidogrel. Of these variants, *2 and *3 are the most common and are associated with complete loss of enzyme activity. In patients who are carriers of a CYP2C19 *2 or *3 allele, the conversion of clopidogrel to its active metabolite may be reduced, which can lead to ischemic events and negative consequence for the patient. We examined the ability of the Verigene CLO assay (Nanosphere, Northbrook, IL) to identify CYP2C19 *2 and *3 polymorphisms in 1,286 unique whole blood samples. The Verigene CLO assay accurately identified homozygous and heterozygous *2 and *3 phenotypes with a specificity of 100% and a final call rate of 99.7%. The assay is fully automated and can produce a result in approximately 3.5 hours.
Integrated Microfluidic Lectin Barcode Platform for High-Performance Focused Glycomic Profiling
NASA Astrophysics Data System (ADS)
Shang, Yuqin; Zeng, Yun; Zeng, Yong
2016-02-01
Protein glycosylation is one of the key processes that play essential roles in biological functions and dysfunctions. However, progress in glycomics has considerably lagged behind genomics and proteomics, due in part to the enormous challenges in analysis of glycans. Here we present a new integrated and automated microfluidic lectin barcode platform to substantially improve the performance of lectin array for focused glycomic profiling. The chip design and flow control were optimized to promote the lectin-glycan binding kinetics and speed of lectin microarray. Moreover, we established an on-chip lectin assay which employs a very simple blocking method to effectively suppress the undesired background due to lectin binding of antibodies. Using this technology, we demonstrated focused differential profiling of tissue-specific glycosylation changes of a biomarker, CA125 protein purified from ovarian cancer cell line and different tissues from ovarian cancer patients in a fast, reproducible, and high-throughput fashion. Highly sensitive CA125 detection was also demonstrated with a detection limit much lower than the clinical cutoff value for cancer diagnosis. This microfluidic platform holds the potential to integrate with sample preparation functions to construct a fully integrated “sample-to-answer” microsystem for focused differential glycomic analysis. Thus, our technology should present a powerful tool in support of rapid advance in glycobiology and glyco-biomarker development.
Development of a Novel and Rapid Fully Automated Genetic Testing System.
Uehara, Masayuki
2016-01-01
We have developed a rapid genetic testing system integrating nucleic acid extraction, purification, amplification, and detection in a single cartridge. The system performs real-time polymerase chain reaction (PCR) after nucleic acid purification in a fully automated manner. RNase P, a housekeeping gene, was purified from human nasal epithelial cells using silica-coated magnetic beads and subjected to real-time PCR using a novel droplet-real-time-PCR machine. The process was completed within 13 min. This system will be widely applicable for research and diagnostic uses.
Lou, Junyang; Obuchowski, Nancy A; Krishnaswamy, Amar; Popovic, Zoran; Flamm, Scott D; Kapadia, Samir R; Svensson, Lars G; Bolen, Michael A; Desai, Milind Y; Halliburton, Sandra S; Tuzcu, E Murat; Schoenhagen, Paul
2015-01-01
Preprocedural 3-dimensional CT imaging of the aortic annular plane plays a critical role for transcatheter aortic valve replacement (TAVR) planning; however, manual reconstructions are complex. Automated analysis software may improve reproducibility and agreement between readers but is incompletely validated. In 110 TAVR patients (mean age, 81 years; 37% female) undergoing preprocedural multidetector CT, automated reconstruction of the aortic annular plane and planimetry of the annulus was performed with a prototype of now commercially available software (syngo.CT Cardiac Function-Valve Pilot; Siemens Healthcare, Erlangen, Germany). Fully automated, semiautomated, and manual annulus measurements were compared. Intrareader and inter-reader agreement, intermodality agreement, and interchangeability were analyzed. Finally, the impact of these measurements on recommended valve size was evaluated. Semiautomated analysis required major correction in 5 patients (4.5%). In the remaining 95.5%, only minor correction was performed. Mean manual annulus area was significantly smaller than fully automated results (P < .001 for both readers) but similar to semiautomated measurements (5.0 vs 5.4 vs 4.9 cm(2), respectively). The frequency of concordant recommendations for valve size increased if manual analysis was replaced with the semiautomated method (60% agreement was improved to 82.4%; 95% confidence interval for the difference [69.1%-83.4%]). Semiautomated aortic annulus analysis, with minor correction by the user, provides reliable results in the context of TAVR annulus evaluation. Copyright © 2015 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.
Summers, Ronald M; Baecher, Nicolai; Yao, Jianhua; Liu, Jiamin; Pickhardt, Perry J; Choi, J Richard; Hill, Suvimol
2011-01-01
To show the feasibility of calculating the bone mineral density (BMD) from computed tomographic colonography (CTC) scans using fully automated software. Automated BMD measurement software was developed that measures the BMD of the first and second lumbar vertebrae on computed tomography and calculates the mean of the 2 values to provide a per patient BMD estimate. The software was validated in a reference population of 17 consecutive women who underwent quantitative computed tomography and in a population of 475 women from a consecutive series of asymptomatic patients enrolled in a CTC screening trial conducted at 3 medical centers. The mean (SD) BMD was 133.6 (34.6) mg/mL (95% confidence interval, 130.5-136.7; n = 475). In women aged 42 to 60 years (n = 316) and 61 to 79 years (n = 159), the mean (SD) BMDs were 143.1 (33.5) and 114.7 (28.3) mg/mL, respectively (P < 0.0001). Fully automated BMD measurements were reproducible for a given patient with 95% limits of agreement of -9.79 to 8.46 mg/mL for the mean difference between paired assessments on supine and prone CTC. Osteoporosis screening can be performed simultaneously with screening for colorectal polyps.
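The per-patient estimate described above is simply the mean of the L1 and L2 vertebral measurements. A trivial sketch (function and parameter names are illustrative, not from the software):

```python
def patient_bmd(l1_bmd_mg_ml, l2_bmd_mg_ml):
    """Per-patient BMD estimate in mg/mL: mean of the first and second
    lumbar vertebrae measurements, as described in the abstract."""
    return (l1_bmd_mg_ml + l2_bmd_mg_ml) / 2.0
```

For example, measurements of 143.0 and 117.0 mg/mL yield a per-patient estimate of 130.0 mg/mL.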
Gene Expression Omnibus (GEO): Microarray data storage, submission, retrieval, and analysis
Barrett, Tanya
2006-01-01
The Gene Expression Omnibus (GEO) repository at the National Center for Biotechnology Information (NCBI) archives and freely distributes high-throughput molecular abundance data, predominantly gene expression data generated by DNA microarray technology. The database has a flexible design that can handle diverse styles of both unprocessed and processed data in a MIAME- (Minimum Information About a Microarray Experiment) supportive infrastructure that promotes fully annotated submissions. GEO currently stores about a billion individual gene expression measurements, derived from over 100 organisms, submitted by over 1,500 laboratories, addressing a wide range of biological phenomena. To maximize the utility of these data, several user-friendly Web-based interfaces and applications have been implemented that enable effective exploration, query, and visualization of these data, at the level of individual genes or entire studies. This chapter describes how the data are stored, submission procedures, and mechanisms for data retrieval and query. GEO is publicly accessible at http://www.ncbi.nlm.nih.gov/projects/geo/. PMID:16939800
Space science experimentation automation and support
NASA Technical Reports Server (NTRS)
Frainier, Richard J.; Groleau, Nicolas; Shapiro, Jeff C.
1994-01-01
This paper outlines recent work done at the NASA Ames Artificial Intelligence Research Laboratory on automation and support of science experiments on the US Space Shuttle in low earth orbit. Three approaches to increasing the science return of these experiments using emerging automation technologies are described: remote control (telescience), science advisors for astronaut operators, and fully autonomous experiments. The capabilities and limitations of these approaches are reviewed.
Kim, Youngwoo; Ge, Yinghui; Tao, Cheng; Zhu, Jianbing; Chapman, Arlene B.; Torres, Vicente E.; Yu, Alan S.L.; Mrug, Michal; Bennett, William M.; Flessner, Michael F.; Landsittel, Doug P.; Bae, Kyongtae T.
2016-01-01
Background and objectives Our study developed a fully automated method for segmentation and volumetric measurements of kidneys from magnetic resonance images in patients with autosomal dominant polycystic kidney disease and assessed the performance of the automated method with the reference manual segmentation method. Design, setting, participants, & measurements Study patients were selected from the Consortium for Radiologic Imaging Studies of Polycystic Kidney Disease. At the enrollment of the Consortium for Radiologic Imaging Studies of Polycystic Kidney Disease Study in 2000, patients with autosomal dominant polycystic kidney disease were between 15 and 46 years of age with relatively preserved GFRs. Our fully automated segmentation method was on the basis of a spatial prior probability map of the location of kidneys in abdominal magnetic resonance images and regional mapping with total variation regularization and propagated shape constraints that were formulated into a level set framework. T2–weighted magnetic resonance image sets of 120 kidneys were selected from 60 patients with autosomal dominant polycystic kidney disease and divided into the training and test datasets. The performance of the automated method in reference to the manual method was assessed by means of two metrics: Dice similarity coefficient and intraclass correlation coefficient of segmented kidney volume. The training and test sets were swapped for crossvalidation and reanalyzed. Results Successful segmentation of kidneys was performed with the automated method in all test patients. The segmented kidney volumes ranged from 177.2 to 2634 ml (mean, 885.4±569.7 ml). The mean Dice similarity coefficient ±SD between the automated and manual methods was 0.88±0.08. The mean correlation coefficient between the two segmentation methods for the segmented volume measurements was 0.97 (P<0.001 for each crossvalidation set). The results from the crossvalidation sets were highly comparable. 
Conclusions We have developed a fully automated method for segmentation of kidneys from abdominal magnetic resonance images in patients with autosomal dominant polycystic kidney disease with varying kidney volumes. The performance of the automated method was in good agreement with that of manual method. PMID:26797708
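The Dice similarity coefficient used to compare the automated and manual kidney masks has the standard definition 2|A∩B|/(|A|+|B|); a minimal sketch with NumPy boolean masks:

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice similarity coefficient between two binary segmentation masks:
    2 * |A intersect B| / (|A| + |B|). Returns a value in [0, 1]."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    intersection = np.logical_and(a, b).sum()
    return 2.0 * intersection / (a.sum() + b.sum())
```

Identical masks score 1.0, disjoint masks 0.0; the study's mean of 0.88 between automated and manual segmentations sits near the high end of that range.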
Kim, Youngwoo; Ge, Yinghui; Tao, Cheng; Zhu, Jianbing; Chapman, Arlene B; Torres, Vicente E; Yu, Alan S L; Mrug, Michal; Bennett, William M; Flessner, Michael F; Landsittel, Doug P; Bae, Kyongtae T
2016-04-07
Our study developed a fully automated method for segmentation and volumetric measurements of kidneys from magnetic resonance images in patients with autosomal dominant polycystic kidney disease and assessed the performance of the automated method with the reference manual segmentation method. Study patients were selected from the Consortium for Radiologic Imaging Studies of Polycystic Kidney Disease. At the enrollment of the Consortium for Radiologic Imaging Studies of Polycystic Kidney Disease Study in 2000, patients with autosomal dominant polycystic kidney disease were between 15 and 46 years of age with relatively preserved GFRs. Our fully automated segmentation method was on the basis of a spatial prior probability map of the location of kidneys in abdominal magnetic resonance images and regional mapping with total variation regularization and propagated shape constraints that were formulated into a level set framework. T2-weighted magnetic resonance image sets of 120 kidneys were selected from 60 patients with autosomal dominant polycystic kidney disease and divided into the training and test datasets. The performance of the automated method in reference to the manual method was assessed by means of two metrics: Dice similarity coefficient and intraclass correlation coefficient of segmented kidney volume. The training and test sets were swapped for crossvalidation and reanalyzed. Successful segmentation of kidneys was performed with the automated method in all test patients. The segmented kidney volumes ranged from 177.2 to 2634 ml (mean, 885.4±569.7 ml). The mean Dice similarity coefficient ±SD between the automated and manual methods was 0.88±0.08. The mean correlation coefficient between the two segmentation methods for the segmented volume measurements was 0.97 (P<0.001 for each crossvalidation set). The results from the crossvalidation sets were highly comparable. 
We have developed a fully automated method for segmentation of kidneys from abdominal magnetic resonance images in patients with autosomal dominant polycystic kidney disease with varying kidney volumes. The performance of the automated method was in good agreement with that of the manual method. Copyright © 2016 by the American Society of Nephrology.
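The Dice similarity coefficient reported above measures voxel-wise overlap between the automated and manual segmentations. A minimal pure-Python sketch of the computation on flattened binary masks (the mask values below are made up for illustration, not study data):

```python
def dice_coefficient(mask_a, mask_b):
    """Dice similarity coefficient: 2|A∩B| / (|A| + |B|) for binary masks."""
    a = set(i for i, v in enumerate(mask_a) if v)
    b = set(i for i, v in enumerate(mask_b) if v)
    if not a and not b:
        return 1.0  # both masks empty: perfect agreement by convention
    return 2.0 * len(a & b) / (len(a) + len(b))

# Hypothetical flattened binary masks (1 = kidney voxel).
automated = [0, 1, 1, 1, 0, 1, 0, 0]
manual    = [0, 1, 1, 0, 0, 1, 1, 0]
print(dice_coefficient(automated, manual))  # 0.75
```

A value of 1.0 would indicate identical segmentations; the study's mean of 0.88 indicates strong but imperfect overlap.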
Automated acquisition system for routine, noninvasive monitoring of physiological data.
Ogawa, M; Tamura, T; Togawa, T
1998-01-01
A fully automated, noninvasive data-acquisition system was developed to permit long-term measurement of physiological functions at home, without disturbing subjects' normal routines. The system consists of unconstrained monitors built into furnishings and structures in a home environment. An electrocardiographic (ECG) monitor in the bathtub measures heart function during bathing, a temperature monitor in the bed measures body temperature, and a weight monitor built into the toilet serves as a scale to record weight. All three monitors are connected to a single computer and run shared data-acquisition programs under a common data format. The unconstrained physiological parameter monitors and fully automated measurement procedures collect data noninvasively without the subject's awareness. The system was tested for 1 week by a healthy male subject, aged 28, in laboratory-based facilities.
Jiang, Hui; Hanna, Eriny; Gatto, Cheryl L.; Page, Terry L.; Bhuva, Bharat; Broadie, Kendal
2016-01-01
Background: Aversive olfactory classical conditioning has been the standard method to assess Drosophila learning and memory behavior for decades, yet training and testing are conducted manually under exceedingly labor-intensive conditions. To overcome this severe limitation, a fully automated, inexpensive system has been developed, which allows accurate and efficient Pavlovian associative learning/memory analyses for high-throughput pharmacological and genetic studies. New Method: The automated system employs a linear actuator coupled to an odorant T-maze with airflow-mediated transfer of animals between training and testing stages. Odorant, airflow and electrical shock delivery are automatically administered and monitored during training trials. Control software allows operator-input variables to define parameters of Drosophila learning, short-term memory and long-term memory assays. Results: The approach allows accurate learning/memory determinations with operational fail-safes. Automated learning indices (immediately post-training) and memory indices (after 24 hours) are comparable to traditional manual experiments, while minimizing experimenter involvement. Comparison with Existing Methods: The automated system provides vast improvements over labor-intensive manual approaches with no experimenter involvement required during either training or testing phases. It provides quality control tracking of airflow rates, odorant delivery and electrical shock treatments, and an expanded platform for high-throughput studies of combinational drug tests and genetic screens. The design uses inexpensive hardware and software for a total cost of ~$500US, making it affordable to a wide range of investigators. Conclusions: This study demonstrates the design, construction and testing of a fully automated Drosophila olfactory classical association apparatus to provide low-labor, high-fidelity, quality-monitored, high-throughput and inexpensive learning and memory behavioral assays.
PMID:26703418
Jiang, Hui; Hanna, Eriny; Gatto, Cheryl L; Page, Terry L; Bhuva, Bharat; Broadie, Kendal
2016-03-01
Aversive olfactory classical conditioning has been the standard method to assess Drosophila learning and memory behavior for decades, yet training and testing are conducted manually under exceedingly labor-intensive conditions. To overcome this severe limitation, a fully automated, inexpensive system has been developed, which allows accurate and efficient Pavlovian associative learning/memory analyses for high-throughput pharmacological and genetic studies. The automated system employs a linear actuator coupled to an odorant T-maze with airflow-mediated transfer of animals between training and testing stages. Odorant, airflow and electrical shock delivery are automatically administered and monitored during training trials. Control software allows operator-input variables to define parameters of Drosophila learning, short-term memory and long-term memory assays. The approach allows accurate learning/memory determinations with operational fail-safes. Automated learning indices (immediately post-training) and memory indices (after 24h) are comparable to traditional manual experiments, while minimizing experimenter involvement. The automated system provides vast improvements over labor-intensive manual approaches with no experimenter involvement required during either training or testing phases. It provides quality control tracking of airflow rates, odorant delivery and electrical shock treatments, and an expanded platform for high-throughput studies of combinational drug tests and genetic screens. The design uses inexpensive hardware and software for a total cost of ∼$500US, making it affordable to a wide range of investigators. This study demonstrates the design, construction and testing of a fully automated Drosophila olfactory classical association apparatus to provide low-labor, high-fidelity, quality-monitored, high-throughput and inexpensive learning and memory behavioral assays. Copyright © 2015 Elsevier B.V. All rights reserved.
Looney, Pádraig; Stevenson, Gordon N; Nicolaides, Kypros H; Plasencia, Walter; Molloholli, Malid; Natsis, Stavros; Collins, Sally L
2018-06-07
We present a new technique to fully automate the segmentation of an organ from 3D ultrasound (3D-US) volumes, using the placenta as the target organ. Image analysis tools to estimate organ volume do exist but are too time consuming and operator dependent. Fully automating the segmentation process would potentially allow the use of placental volume to screen for increased risk of pregnancy complications. The placenta was segmented from 2,393 first trimester 3D-US volumes using a semiautomated technique. This was quality controlled by three operators to produce the "ground-truth" data set. A fully convolutional neural network (OxNNet) was trained using this ground-truth data set to automatically segment the placenta. OxNNet delivered state-of-the-art automatic segmentation. The effect of training set size on the performance of OxNNet demonstrated the need for large data sets. The clinical utility of placental volume was tested by looking at predictions of small-for-gestational-age (SGA) babies at term. The receiver-operating characteristic curves demonstrated almost identical results between OxNNet and the ground truth. Our results demonstrated good similarity to the ground truth and almost identical clinical results for the prediction of SGA.
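The receiver-operating characteristic comparison above reduces to computing an area under the curve (AUC) from continuous risk scores and binary outcomes. A minimal sketch using the rank-sum (Mann-Whitney) identity; the scores and labels below are invented for illustration, not OxNNet outputs:

```python
def roc_auc(scores, labels):
    """AUC via the Mann-Whitney interpretation: the probability that a
    randomly chosen positive case scores higher than a negative one."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5  # ties count as half a win
    return wins / (len(pos) * len(neg))

# Hypothetical predicted risk scores and true SGA outcomes (1 = SGA).
scores = [0.9, 0.8, 0.35, 0.7, 0.2, 0.5]
labels = [1,   1,   0,    1,   0,   0]
print(roc_auc(scores, labels))  # 1.0: every positive outranks every negative
```

Nearly identical ROC curves for two methods, as reported above, imply nearly identical rankings of cases by risk.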
Suppa, Per; Anker, Ulrich; Spies, Lothar; Bopp, Irene; Rüegger-Frey, Brigitte; Klaghofer, Richard; Gocke, Carola; Hampel, Harald; Beck, Sacha; Buchert, Ralph
2015-01-01
Hippocampal volume is a promising biomarker to enhance the accuracy of the diagnosis of dementia due to Alzheimer's disease (AD). However, whereas hippocampal volume is well studied in patient samples from clinical trials, its value in routine clinical patient care is still rather unclear. The aim of the present study, therefore, was to evaluate fully automated atlas-based hippocampal volumetry for detection of AD in the setting of a secondary care expert memory clinic for outpatients. One hundred consecutive patients with memory complaints were clinically evaluated and categorized into three diagnostic groups: AD, intermediate AD, and non-AD. A software tool based on open source software (Statistical Parametric Mapping, SPM8) was employed for fully automated tissue segmentation and stereotactical normalization of high-resolution three-dimensional T1-weighted magnetic resonance images. Predefined standard masks were used for computation of grey matter volume of the left and right hippocampus, which was then scaled to the patient's total grey matter volume. The right hippocampal volume provided an area under the receiver operating characteristic curve of 84% for detection of AD patients in the whole sample. This indicates that fully automated MR-based hippocampal volumetry fulfills the requirements for a clinically feasible biomarker for detection of AD in everyday patient care in a secondary care memory clinic for outpatients. The software used in the present study has been made freely available as an SPM8 toolbox. It is robust and fast, so it is easily integrated into routine workflow.
A multilevel Lab on chip platform for DNA analysis.
Marasso, Simone Luigi; Giuri, Eros; Canavese, Giancarlo; Castagna, Riccardo; Quaglio, Marzia; Ferrante, Ivan; Perrone, Denis; Cocuzza, Matteo
2011-02-01
Lab-on-chips (LOCs) are critical systems that have been introduced to speed up and reduce the cost of traditional, laborious and extensive analyses in the biological and biomedical fields. These ambitious and challenging goals call for multi-disciplinary competences that range from engineering to biology. Starting from the aim of integrating microarray technology and microfluidic devices, a complex multilevel analysis platform has been designed, fabricated and tested (all rights reserved; IT Patent number TO2009A000915). This LOC successfully interfaces microfluidic channels with standard DNA microarray glass slides in order to implement a complete biological protocol. Typical Micro Electro Mechanical Systems (MEMS) materials and process technologies were employed. A silicon/glass microfluidic chip and a polydimethylsiloxane (PDMS) reaction chamber were fabricated and interfaced with a standard microarray glass slide. To keep the system disposable, all micro-elements were passive, with an external apparatus providing fluidic driving and thermal control. The major microfluidic and handling problems were investigated and innovative solutions were found. Finally, an entirely automated DNA hybridization protocol was successfully tested, with a significant reduction in analysis time and reagent consumption with respect to a conventional protocol.
Bennet, Jaison; Ganaprakasam, Chilambuchelvan Arul; Arputharaj, Kannan
2014-01-01
In the past, cancer classification by doctors and radiologists was based on morphological and clinical features and had limited diagnostic ability. The recent arrival of DNA microarray technology has enabled the concurrent monitoring of thousands of gene expressions on a single chip, stimulating progress in cancer classification. In this paper, we propose a hybrid approach for microarray data classification based on k-nearest neighbor (KNN), naive Bayes, and support vector machine (SVM) classifiers. Feature selection prior to classification plays a vital role, and a feature selection technique combining the discrete wavelet transform (DWT) and the moving window technique (MWT) is used. The performance of the proposed method is compared with that of the conventional classifiers: support vector machine, nearest neighbor, and naive Bayes. Experiments were conducted on both real and benchmark datasets, and the results indicate that the ensemble approach produces higher classification accuracy than the conventional classifiers. This work serves as an automated system for the classification of cancer that doctors can apply to real cases, and it further reduces the misclassification of cancers, which is unacceptable in cancer detection.
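The hybrid classifier above combines three base learners. One common way to combine them is per-sample majority voting; a minimal sketch with invented per-classifier predictions standing in for the KNN, naive Bayes, and SVM outputs (the paper does not specify its exact combination rule, so this is an illustrative assumption):

```python
from collections import Counter

def majority_vote(*prediction_lists):
    """Combine per-sample class predictions from several classifiers
    by majority vote (ties broken by the first-seen label)."""
    combined = []
    for votes in zip(*prediction_lists):
        combined.append(Counter(votes).most_common(1)[0][0])
    return combined

# Hypothetical per-sample outputs of three base classifiers on five samples.
knn = ["ALL", "AML", "ALL", "AML", "ALL"]
nb  = ["ALL", "ALL", "ALL", "AML", "AML"]
svm = ["AML", "AML", "ALL", "AML", "ALL"]
print(majority_vote(knn, nb, svm))  # ['ALL', 'AML', 'ALL', 'AML', 'ALL']
```

With an odd number of binary classifiers, every sample gets a strict majority, which is one reason three base learners is a convenient choice.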
Svetnik, Vladimir; Ma, Junshui; Soper, Keith A.; Doran, Scott; Renger, John J.; Deacon, Steve; Koblan, Ken S.
2007-01-01
Objective: To evaluate the performance of 2 automated systems, Morpheus and Somnolyzer24X7, with various levels of human review/editing, in scoring polysomnographic (PSG) recordings from a clinical trial using zolpidem in a model of transient insomnia. Methods: 164 all-night PSG recordings from 82 subjects, collected during 2 nights of sleep, one under placebo and one under zolpidem (10 mg) treatment, were used. For each recording, 6 different methods were used to provide sleep stage scores based on Rechtschaffen & Kales criteria: 1) full manual scoring, 2) automated scoring by Morpheus, 3) automated scoring by Somnolyzer24X7, 4) automated scoring by Morpheus with full manual review, 5) automated scoring by Morpheus with partial manual review, 6) automated scoring by Somnolyzer24X7 with partial manual review. Ten traditional clinical efficacy measures of sleep initiation, maintenance, and architecture were calculated. Results: Pair-wise epoch-by-epoch agreements between fully automated and manual scores were in the range of intersite manual scoring agreements reported in the literature (70%-72%). Pair-wise epoch-by-epoch agreements between manually reviewed automated scores and fully manual scores were higher (73%-76%). The direction and statistical significance of treatment effect sizes using traditional efficacy endpoints were essentially the same whichever method was used. As the degree of manual review increased, the magnitude of the effect size approached that estimated with fully manual scoring. Conclusion: Automated or semi-automated sleep PSG scoring offers a valuable alternative to manual scoring, which is costly, time consuming, and variable within and between sites, especially in large multicenter clinical trials. Reduction in scoring variability may also reduce the required sample size of a clinical trial. Citation: Svetnik V; Ma J; Soper KA; Doran S; Renger JJ; Deacon S; Koblan KS.
Evaluation of automated and semi-automated scoring of polysomnographic recordings from a clinical trial using zolpidem in the treatment of insomnia. SLEEP 2007;30(11):1562-1574. PMID:18041489
Identifying Fishes through DNA Barcodes and Microarrays.
Kochzius, Marc; Seidel, Christian; Antoniou, Aglaia; Botla, Sandeep Kumar; Campo, Daniel; Cariani, Alessia; Vazquez, Eva Garcia; Hauschild, Janet; Hervet, Caroline; Hjörleifsdottir, Sigridur; Hreggvidsson, Gudmundur; Kappel, Kristina; Landi, Monica; Magoulas, Antonios; Marteinsson, Viggo; Nölte, Manfred; Planes, Serge; Tinti, Fausto; Turan, Cemal; Venugopal, Moleyur N; Weber, Hannes; Blohm, Dietmar
2010-09-07
International fish trade reached an import value of 62.8 billion Euro in 2006, of which 44.6% is covered by the European Union. Species identification is a key problem throughout the life cycle of fishes: from eggs and larvae to adults in fisheries research and control, as well as processed fish products in consumer protection. This study aims to evaluate the applicability of the three mitochondrial genes 16S rRNA (16S), cytochrome b (cyt b), and cytochrome oxidase subunit I (COI) for the identification of 50 European marine fish species by combining techniques of "DNA barcoding" and microarrays. In a DNA barcoding approach, Neighbour-Joining (NJ) phylogenetic trees of 369 16S, 212 cyt b, and 447 COI sequences indicated that cyt b and COI are suitable for unambiguous identification, whereas 16S failed to discriminate closely related flatfish and gurnard species. In the course of probe design for DNA microarray development, each of the markers yielded a high number of potentially species-specific probes in silico, although many of them were rejected based on microarray hybridisation experiments. None of the markers provided probes to discriminate the sibling flatfish and gurnard species. However, since 16S probes were less negatively influenced by the "position of label" effect and showed the lowest rejection rate and the highest mean signal intensity, 16S is more suitable for DNA microarray probe design than cyt b and COI. The large portion of COI probes rejected after hybridisation experiments (>90%) renders this DNA barcoding marker rather unsuitable for this high-throughput technology. Based on these data, a DNA microarray containing 64 functional oligonucleotide probes for the identification of 30 out of the 50 fish species investigated was developed. It represents the next step towards an automated and easy-to-handle method to identify fish, ichthyoplankton, and fish products.
NASA Astrophysics Data System (ADS)
Bychkov, Dmitrii; Turkki, Riku; Haglund, Caj; Linder, Nina; Lundin, Johan
2016-03-01
Recent advances in computer vision enable increasingly accurate automated pattern classification. In the current study we evaluate whether a convolutional neural network (CNN) can be trained to predict disease outcome in patients with colorectal cancer based on images of tumor tissue microarray samples. We compare the prognostic accuracy of CNN features extracted from the whole, unsegmented tissue microarray spot image, with that of CNN features extracted from the epithelial and non-epithelial compartments, respectively. The prognostic accuracy of visually assessed histologic grade is used as a reference. The image data set consists of digitized hematoxylin-eosin (H&E) stained tissue microarray samples obtained from 180 patients with colorectal cancer. The patient samples represent a variety of histological grades, have data available on a series of clinicopathological variables including long-term outcome and ground truth annotations performed by experts. The CNN features extracted from images of the epithelial tissue compartment significantly predicted outcome (hazard ratio (HR) 2.08; CI95% 1.04-4.16; area under the curve (AUC) 0.66) in a test set of 60 patients, as compared to the CNN features extracted from unsegmented images (HR 1.67; CI95% 0.84-3.31, AUC 0.57) and visually assessed histologic grade (HR 1.96; CI95% 0.99-3.88, AUC 0.61). As a conclusion, a deep-learning classifier can be trained to predict outcome of colorectal cancer based on images of H&E stained tissue microarray samples, and the CNN features extracted from the epithelial compartment only resulted in a prognostic discrimination comparable to that of visually determined histologic grade.
Microarray profiling of human white adipose tissue after exogenous leptin injection.
Taleb, S; Van Haaften, R; Henegar, C; Hukshorn, C; Cancello, R; Pelloux, V; Hanczar, B; Viguerie, N; Langin, D; Evelo, C; Zucker, J; Clément, K; Saris, W H M
2006-03-01
Leptin is a secreted adipocyte hormone that plays a key role in the regulation of body weight homeostasis. The leptin effect on human white adipose tissue (WAT) is still debated. The aim of this study was to assess whether the administration of polyethylene glycol-leptin (PEG-OB) in a single supraphysiological dose has transcriptional effects on genes of WAT and to identify its target genes and functional pathways in WAT. Blood samples and WAT biopsies were obtained from 10 healthy nonobese men before treatment and 72 h after the PEG-OB injection, which led to an approximately 809-fold increase in circulating leptin. The WAT gene expression profiles before and after the PEG-OB injection were compared using pangenomic microarrays. Functional gene annotations based on the gene ontology of the PEG-OB-regulated genes were performed using both an in-house automated procedure and GenMAPP (Gene Microarray Pathway Profiler), designed for viewing and analyzing gene expression data in the context of biological pathways. Statistical analysis of the microarray data revealed that PEG-OB had a major down-regulating effect on WAT gene expression, yielding 1,822 down-regulated and 100 up-regulated genes. Microarray data were validated using reverse transcription quantitative PCR. Functional annotation of the PEG-OB-regulated genes revealed that the functional class related to immunity and inflammation was among the pathways most strongly mobilized by PEG-OB in WAT. These genes are mainly expressed in cells of the stromal vascular fraction rather than in adipocytes. Our observations support the hypothesis that leptin could act on WAT, particularly on genes related to inflammation and immunity, which may suggest a novel leptin target pathway in human WAT.
Ackermann, Uwe; Lewis, Jason S; Young, Kenneth; Morris, Michael J; Weickhardt, Andrew; Davis, Ian D; Scott, Andrew M
2016-08-01
Imaging of androgen receptor expression in prostate cancer using [18F]FDHT is becoming increasingly popular. With the radiolabelling precursor now commercially available, developing a fully automated synthesis of [18F]FDHT is important. We have fully automated the synthesis of [18F]FDHT using the iPhase FlexLab module with only commercially available components. Total synthesis time was 90 min, and radiochemical yields were 25-33% (n = 11). Radiochemical purity of the final formulation was > 99%, and specific activity was > 18.5 GBq/µmol for all batches. This method can be up-scaled as desired, making it possible to study multiple patients in a day. Furthermore, our procedure uses only 4 mg of precursor and is therefore cost-effective. The synthesis has now been validated at Austin Health and is currently used for [18F]FDHT studies in patients. We believe that this method can easily be adapted to other synthesis modules to further widen the availability of [18F]FDHT. Copyright © 2016 John Wiley & Sons, Ltd.
Nakanishi, Rine; Sankaran, Sethuraman; Grady, Leo; Malpeso, Jenifer; Yousfi, Razik; Osawa, Kazuhiro; Ceponiene, Indre; Nazarat, Negin; Rahmani, Sina; Kissel, Kendall; Jayawardena, Eranthi; Dailing, Christopher; Zarins, Christopher; Koo, Bon-Kwon; Min, James K; Taylor, Charles A; Budoff, Matthew J
2018-03-23
Our goal was to evaluate the efficacy of a fully automated method for assessing the image quality (IQ) of coronary computed tomography angiography (CCTA). The machine learning method was trained using 75 CCTA studies by mapping features (noise, contrast, misregistration scores, and un-interpretability index) to an IQ score based on manual ground truth data. The automated method was validated on a set of 50 CCTA studies and subsequently tested on a new set of 172 CCTA studies against visual IQ scores on a 5-point Likert scale. The area under the curve in the validation set was 0.96. In the 172 CCTA studies, our method yielded a Cohen's kappa statistic of 0.67 (p < 0.01) for the agreement between automated and visual IQ assessment. Of the studies graded visually as good to excellent (n = 163), fair (n = 6), and poor (n = 3), 155, 5, and 2, respectively, received an automated IQ score > 50%. Fully automated assessment of the IQ of CCTA data sets by machine learning was reproducible and provided similar results compared with visual analysis within the limits of inter-operator variability. • The proposed method enables automated and reproducible image quality assessment. • Machine learning and visual assessments yielded comparable estimates of image quality. • Automated assessment potentially allows for more standardised image quality. • Image quality assessment enables standardization of clinical trial results across different datasets.
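Cohen's kappa, used above to compare automated and visual quality grades, corrects raw agreement for the agreement expected by chance from the raters' marginal frequencies. A minimal pure-Python sketch with hypothetical paired gradings (1 = adequate IQ, 0 = poor; not study data):

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: (p_o - p_e) / (1 - p_e), where p_o is observed
    agreement and p_e is chance agreement from marginal frequencies."""
    n = len(rater_a)
    labels = set(rater_a) | set(rater_b)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    p_e = sum((rater_a.count(c) / n) * (rater_b.count(c) / n) for c in labels)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical paired quality grades from visual and automated assessment.
visual    = [1, 1, 1, 0, 1, 0, 1, 1]
automated = [1, 1, 0, 0, 1, 0, 1, 1]
print(round(cohens_kappa(visual, automated), 2))  # 0.71
```

Here raw agreement is 7/8 = 0.875, but chance agreement is already 0.5625 given how often each rater says "adequate", so kappa lands at 0.71 rather than 0.88.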
Otani, Kyoko; Nakazono, Akemi; Salgo, Ivan S; Lang, Roberto M; Takeuchi, Masaaki
2016-10-01
Echocardiographic determination of left heart chamber volumetric parameters by using manual tracings during multiple beats is tedious in atrial fibrillation (AF). The aim of this study was to determine the usefulness of fully automated left chamber quantification software with single-beat three-dimensional transthoracic echocardiographic data sets in patients with AF. Single-beat full-volume three-dimensional transthoracic echocardiographic data sets were prospectively acquired during consecutive multiple cardiac beats (≥10 beats) in 88 patients with AF. In protocol 1, left ventricular volumes, left ventricular ejection fraction, and maximal left atrial volume were validated using automated quantification against the manual tracing method in identical beats in 10 patients. In protocol 2, automated quantification-derived averaged values from multiple beats were compared with the corresponding values obtained from the indexed beat in all patients. Excellent correlations of left chamber parameters between automated quantification and the manual method were observed (r = 0.88-0.98) in protocol 1. The time required for the analysis with the automated quantification method (5 min) was significantly less compared with the manual method (27 min) (P < .0001). In protocol 2, there were excellent linear correlations between the averaged left chamber parameters and the corresponding values obtained from the indexed beat (r = 0.94-0.99), and test-retest variability of left chamber parameters was low (3.5%-4.8%). Three-dimensional transthoracic echocardiography with fully automated quantification software is a rapid and reliable way to measure averaged values of left heart chamber parameters during multiple consecutive beats. Thus, it is a potential new approach for left chamber quantification in patients with AF in daily routine practice. Copyright © 2016 American Society of Echocardiography. Published by Elsevier Inc. All rights reserved.
Grimmer, Timo; Wutz, Carolin; Alexopoulos, Panagiotis; Drzezga, Alexander; Förster, Stefan; Förstl, Hans; Goldhardt, Oliver; Ortner, Marion; Sorg, Christian; Kurz, Alexander
2016-02-01
Biomarkers of Alzheimer disease (AD) can be imaged in vivo and can be used for diagnostic and prognostic purposes in people with cognitive decline and dementia. Indicators of amyloid deposition such as (11)C-Pittsburgh compound B ((11)C-PiB) PET are primarily used to identify or rule out brain diseases that are associated with amyloid pathology but have also been deployed to forecast the clinical course. Indicators of neuronal metabolism including (18)F-FDG PET demonstrate the localization and severity of neuronal dysfunction and are valuable for differential diagnosis and for predicting the progression from mild cognitive impairment (MCI) to dementia. It is a matter of debate whether to analyze these images visually or using automated techniques. Therefore, we compared the usefulness of both imaging methods and both analyzing strategies to predict dementia due to AD. In MCI participants, a baseline examination, including clinical and imaging assessments, and a clinical follow-up examination after a planned interval of 24 mo were performed. Of 28 MCI patients, 9 developed dementia due to AD, 2 developed frontotemporal dementia, and 1 developed moderate dementia of unknown etiology. The positive and negative predictive values and the accuracy of visual and fully automated analyses of (11)C-PiB for the prediction of progression to dementia due to AD were 0.50, 1.00, and 0.68, respectively, for the visual and 0.53, 1.00, and 0.71, respectively, for the automated analyses. Positive predictive value, negative predictive value, and accuracy of fully automated analyses of (18)F-FDG PET were 0.37, 0.78, and 0.50, respectively. Results of visual analyses were highly variable between raters but were superior to automated analyses. Both (18)F-FDG and (11)C-PiB imaging appear to be of limited use for predicting the progression from MCI to dementia due to AD in short-term follow-up, irrespective of the strategy of analysis. 
On the other hand, amyloid PET is extremely useful to rule out underlying AD. The findings of the present study favor a fully automated method of analysis for (11)C-PiB assessments and a visual analysis by experts for (18)F-FDG assessments. © 2016 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
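The predictive values quoted above come from cross-tabulating scan-based predictions against observed progression. A minimal pure-Python sketch of the three reported metrics; the prediction/outcome pairs below are invented for illustration, not the study's patients:

```python
def predictive_metrics(predicted, actual):
    """Positive predictive value, negative predictive value, and accuracy
    from paired binary values (1 = predicted/observed progression)."""
    tp = sum(p == 1 and a == 1 for p, a in zip(predicted, actual))
    tn = sum(p == 0 and a == 0 for p, a in zip(predicted, actual))
    fp = sum(p == 1 and a == 0 for p, a in zip(predicted, actual))
    fn = sum(p == 0 and a == 1 for p, a in zip(predicted, actual))
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    accuracy = (tp + tn) / len(predicted)
    return ppv, npv, accuracy

# Hypothetical scan-positive predictions vs. observed progression to dementia.
predicted = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
actual    = [1, 1, 0, 0, 0, 0, 0, 0, 1, 0]
ppv, npv, acc = predictive_metrics(predicted, actual)
print(ppv, round(npv, 2), acc)  # 0.5 0.83 0.7
```

A pattern like the study's amyloid result (modest PPV, NPV of 1.00) means a negative scan reliably rules out progression to AD dementia even when a positive scan is only weakly predictive.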
Fully automated MR liver volumetry using watershed segmentation coupled with active contouring.
Huynh, Hieu Trung; Le-Trong, Ngoc; Bao, Pham The; Oto, Aytek; Suzuki, Kenji
2017-02-01
Our purpose is to develop a fully automated scheme for liver volume measurement in abdominal MR images, without requiring any user input or interaction. The proposed scheme is fully automatic for liver volumetry from 3D abdominal MR images and consists of three main stages: preprocessing, rough liver shape generation, and liver extraction. The preprocessing stage reduced noise and enhanced the liver boundaries in 3D abdominal MR images. The rough liver shape was revealed fully automatically by using watershed segmentation, a thresholding transform, morphological operations, and statistical properties of the liver. An active contour model was applied to refine the rough liver shape and precisely obtain the liver boundaries. The liver volumes calculated by the proposed scheme were compared to "gold standard" references estimated by an expert abdominal radiologist. The liver volumes computed with our scheme agreed excellently (intraclass correlation coefficient, 0.94) with the "gold standard" manual volumes in an evaluation with 27 cases from multiple medical centers. The running time was 8.4 min per case on average. We developed a fully automated liver volumetry scheme for MR that does not require any user interaction, and evaluated it with cases from multiple medical centers. Its volumetry performance was comparable to that of gold standard manual volumetry, saving the 24.7 min per case that radiologists spend on manual liver volumetry.
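The rough-shape stage above combines thresholding with connectivity-based cleanup. A minimal pure-Python sketch of one such step, thresholding a toy 2D slice and keeping the largest 4-connected component (this is an illustrative stand-in, not the authors' watershed implementation, and the intensity values are made up):

```python
from collections import deque

def largest_component(image, threshold):
    """Threshold a 2D image and return a mask of the largest
    4-connected foreground component (BFS flood fill)."""
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    best = []
    for r in range(rows):
        for c in range(cols):
            if image[r][c] >= threshold and not seen[r][c]:
                # Flood-fill one connected component.
                component, queue = [], deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    component.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and image[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                if len(component) > len(best):
                    best = component
    mask = [[0] * cols for _ in range(rows)]
    for y, x in best:
        mask[y][x] = 1
    return mask

# Toy intensity slice: one large bright region (liver-like) and two specks.
slice_ = [
    [10, 80, 85, 10],
    [10, 82, 88, 10],
    [70, 10, 10, 75],
]
print(largest_component(slice_, 60))  # [[0, 1, 1, 0], [0, 1, 1, 0], [0, 0, 0, 0]]
```

Keeping the largest component discards bright specks that survive thresholding, which is the same role the morphological operations play in the pipeline above.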
Peak picking multidimensional NMR spectra with the contour geometry based algorithm CYPICK.
Würz, Julia M; Güntert, Peter
2017-01-01
The automated identification of signals in multidimensional NMR spectra is a challenging task, complicated by signal overlap, noise, and spectral artifacts, for which no universally accepted method is available. Here, we present a new peak picking algorithm, CYPICK, that follows, as far as possible, the manual approach taken by a spectroscopist who analyzes peak patterns in contour plots of the spectrum, but is fully automated. Human visual inspection is replaced by the evaluation of geometric criteria applied to contour lines, such as local extremality, approximate circularity (after appropriate scaling of the spectrum axes), and convexity. The performance of CYPICK was evaluated for a variety of spectra from different proteins by systematic comparison with peak lists obtained by other, manual or automated, peak picking methods, as well as by analyzing the results of automated chemical shift assignment and structure calculation based on input peak lists from CYPICK. The results show that CYPICK yielded peak lists that compare in most cases favorably to those obtained by other automated peak pickers with respect to the criteria of finding a maximal number of real signals, a minimal number of artifact peaks, and maximal correctness of the chemical shift assignments and the three-dimensional structure obtained by fully automated assignment and structure calculation.
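The local-extremality criterion described above can be illustrated by flagging grid points that exceed all eight neighbors and a noise floor. A toy pure-Python sketch (this shows only the extremality test; CYPICK additionally evaluates circularity and convexity of contour lines, which are omitted here, and the spectrum values are invented):

```python
def pick_peaks(spectrum, noise_floor):
    """Return (row, col) of interior points that are strict local maxima
    over their 8 neighbors and lie above a noise threshold."""
    peaks = []
    for r in range(1, len(spectrum) - 1):
        for c in range(1, len(spectrum[0]) - 1):
            v = spectrum[r][c]
            if v <= noise_floor:
                continue  # below the noise floor: never a signal
            neighbors = [spectrum[r + dr][c + dc]
                         for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                         if (dr, dc) != (0, 0)]
            if all(v > n for n in neighbors):
                peaks.append((r, c))
    return peaks

# Toy 2D "spectrum" with two signals sitting in low-level noise.
toy = [
    [1, 1, 1, 1, 1],
    [1, 9, 2, 1, 1],
    [1, 2, 1, 7, 1],
    [1, 1, 1, 1, 1],
]
print(pick_peaks(toy, noise_floor=3))  # [(1, 1), (2, 3)]
```

Real peak pickers must go further than this, exactly because overlap and artifacts produce local maxima that are not genuine signals; that is what the contour-geometry criteria address.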
Foda, Abd Al-Rahman Mohammad
2013-05-01
Manual tissue microarray (TMA) construction was introduced to avoid the high cost of automated and semiautomated techniques. The cheapest and simplest technique for constructing a manual TMA is that of using mechanical pencil tips. This study was carried out to modify this method, aiming to raise its quality to that of the expensive alternatives. Some modifications were introduced to Shebl's technique. Two conventional mechanical pencil tips of different diameters were used to construct the recipient blocks. A source of mild heat was used, and blocks were incubated at 38°C overnight. With our modifications, 3 high-density TMA blocks were constructed. We successfully performed immunostaining without substantial tissue loss. Our modifications increased the number of cores per block and improved the stability of the cores within the paraffin block. This new, modified technique is a good alternative to expensive machines in many laboratories.
Jain, Ruchi; Dey, Bappaditya; Tyagi, Anil K
2012-10-02
The guinea pig (Cavia porcellus) is one of the most extensively used animal models to study infectious diseases. However, despite its tremendous contribution towards understanding the establishment, progression and control of a number of diseases in general and tuberculosis in particular, the lack of a fully annotated guinea pig genome sequence and of appropriate molecular reagents has severely hampered detailed genetic and immunological analysis in this animal model. By employing the cross-species hybridization technique, we have developed an oligonucleotide microarray with 44,000 features assembled from different mammalian species, which to the best of our knowledge is the first attempt to employ microarrays to study the global gene expression profile in guinea pigs. To validate and demonstrate the merit of this microarray, we have studied, as an example, the expression profile of guinea pig lungs during the advanced phase of M. tuberculosis infection. A significant upregulation of 1,344 genes and a marked downregulation of 1,856 genes in the lungs identified a disease signature of pulmonary tuberculosis infection. We report the development of the first comprehensive microarray for studying the global gene expression profile in guinea pigs and validate its usefulness with tuberculosis as a case study. An important gap in the area of infectious diseases has been addressed, and a valuable molecular tool is provided to optimally harness the potential of the guinea pig model to develop better vaccines and therapies against human diseases.
Pertuz, Said; McDonald, Elizabeth S.; Weinstein, Susan P.; Conant, Emily F.
2016-01-01
Purpose To assess a fully automated method for volumetric breast density (VBD) estimation in digital breast tomosynthesis (DBT) and to compare the findings with those of full-field digital mammography (FFDM) and magnetic resonance (MR) imaging. Materials and Methods Bilateral DBT images, FFDM images, and sagittal breast MR images were retrospectively collected from 68 women who underwent breast cancer screening from October 2011 to September 2012 with institutional review board–approved, HIPAA-compliant protocols. A fully automated computer algorithm was developed for quantitative estimation of VBD from DBT images. FFDM images were processed with U.S. Food and Drug Administration–cleared software, and the MR images were processed with a previously validated automated algorithm to obtain corresponding VBD estimates. Pearson correlation and analysis of variance with Tukey-Kramer post hoc correction were used to compare the multimodality VBD estimates. Results Estimates of VBD from DBT were significantly correlated with FFDM-based and MR imaging–based estimates with r = 0.83 (95% confidence interval [CI]: 0.74, 0.90) and r = 0.88 (95% CI: 0.82, 0.93), respectively (P < .001). The corresponding correlation between FFDM and MR imaging was r = 0.84 (95% CI: 0.76, 0.90). However, statistically significant differences after post hoc correction (α = 0.05) were found among VBD estimates from FFDM (mean ± standard deviation, 11.1% ± 7.0) relative to MR imaging (16.6% ± 11.2) and DBT (19.8% ± 16.2). Differences between VBD estimates from DBT and MR imaging were not significant (P = .26). Conclusion Fully automated VBD estimates from DBT, FFDM, and MR imaging are strongly correlated but show statistically significant differences. Therefore, absolute differences in VBD between FFDM, DBT, and MR imaging should be considered in breast cancer risk assessment. © RSNA, 2015 Online supplemental material is available for this article. PMID:26491909
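The Bland-Altman agreement analysis used above to compare density estimates across modalities reduces to a few lines of arithmetic: the bias is the mean of the paired differences, and the 95% limits of agreement are the bias ± 1.96 standard deviations of those differences. A minimal NumPy sketch with invented paired density values (the data and variable names are illustrative, not from the study):

```python
import numpy as np

def bland_altman(a, b):
    """Bland-Altman statistics for paired measurements from two methods:
    the bias (mean difference) and the 95% limits of agreement,
    bias +/- 1.96 * SD of the differences."""
    diff = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    bias = diff.mean()
    sd = diff.std(ddof=1)            # sample SD of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired volumetric breast density estimates (%) per woman
dbt = [19.0, 25.5, 12.3, 30.1, 22.4]
mri = [16.5, 24.0, 13.1, 27.8, 20.9]
bias, (lo, hi) = bland_altman(dbt, mri)   # bias = 1.4
```

The same differences feed a Tukey-Kramer-style comparison in the study; the sketch only covers the agreement step.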
Maruoka, Sachiko; Nakakura, Shunsuke; Matsuo, Naoko; Yoshitomi, Kayo; Katakami, Chikako; Tabuchi, Hitoshi; Chikama, Taiichiro; Kiuchi, Yoshiaki
2017-10-30
To evaluate two specular microscopy analysis methods across different endothelial cell densities (ECDs). Endothelial images of one eye from each of 45 patients were taken by using three different specular microscopes (three replicates each). To determine the consistency of the center-dot method, we compared SP-6000 and SP-2000P images. CME-530 and SP-6000 images were compared to assess the consistency of the fully automated method. The SP-6000 images from the two methods were compared. Intraclass correlation coefficients (ICCs) for the three measurements were calculated, and parametric multiple comparisons tests and Bland-Altman analysis were performed. The ECD mean value was 2425 ± 883 (range 516–3707) cells/mm². ICC values were > 0.9 for all three microscopes for ECD, but the coefficients of variation (CVs) were 0.3–0.6. For ECD measurements, Bland-Altman analysis revealed that the mean difference was 42 cells/mm² between the SP-2000P and SP-6000 for the center-dot method; 57 cells/mm² between the SP-6000 measurements from both methods; and −5 cells/mm² between the SP-6000 and CME-530 for the fully automated method (95% limits of agreement: −201 to 284 cells/mm², −410 to 522 cells/mm², and −327 to 318 cells/mm², respectively). For CV measurements, the mean differences were −3, −12, and 13% (95% limits of agreement: −18 to 11, −26 to 2, and −5 to 32%, respectively). Despite using three replicate measurements, the precision of the center-dot method with the SP-2000P and SP-6000 software was only ±10% for ECD data and was even worse for the fully automated method. Japan Clinical Trials Register ( http://www.umin.ac.jp/ctr/index/htm9 ) number UMIN 000015236.
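The intraclass correlation coefficients reported above summarize how repeatable the three replicate measurements are. As a hedged illustration (the abstract does not state which ICC form was used), here is the one-way random-effects, single-measurement ICC(1,1) from one-way ANOVA mean squares; the ECD readings below are invented:

```python
import numpy as np

def icc_1_1(ratings):
    """One-way random-effects, single-measurement ICC(1,1) for an
    (n_subjects x k_replicates) matrix: (MSB - MSW) / (MSB + (k-1)*MSW)."""
    Y = np.asarray(ratings, dtype=float)
    n, k = Y.shape
    grand = Y.mean()
    # between-subject and within-subject mean squares
    msb = k * ((Y.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    msw = ((Y - Y.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Invented ECD readings (cells/mm^2): 3 eyes x 3 replicates, highly repeatable
ecd = [[2400, 2410, 2395],
       [1800, 1790, 1810],
       [3000, 3010, 2990]]
icc = icc_1_1(ecd)   # close to 1, as reported for all three microscopes
```

High between-eye variance relative to within-eye replicate scatter is exactly what drives ICC toward 1 even when the absolute precision (the CVs above) is modest.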
Reuse of imputed data in microarray analysis increases imputation efficiency
Kim, Ki-Yeol; Kim, Byoung-Jin; Yi, Gwan-Su
2004-01-01
Background The imputation of missing values is necessary for the efficient use of DNA microarray data, because many clustering algorithms and some statistical analyses require a complete data set. A few imputation methods for DNA microarray data have been introduced, but their efficiency was low and the validity of the imputed values had not been fully checked. Results We developed a new cluster-based imputation method called the sequential K-nearest neighbor (SKNN) method. It imputes missing values sequentially, starting from the gene with the fewest missing values, and reuses the imputed values for later imputations. Although it reuses imputed values, the new method substantially improves on the conventional KNN-based method and on methods based on maximum likelihood estimation in both accuracy and computational cost. The performance of SKNN was particularly strong for data with high missing rates and large numbers of experiments. Applying Expectation Maximization (EM) to the SKNN method improved the accuracy, but increased computational time in proportion to the number of iterations. The Multiple Imputation (MI) method, which is well known but had not previously been applied to microarray data, showed accuracy similar to that of SKNN, with a slightly higher dependency on the type of data set. Conclusions Sequential reuse of imputed data in KNN-based imputation greatly increases imputation efficiency. The SKNN method should be practically useful for salvaging microarray experiments with high proportions of missing entries, and it generates reliable imputed values that can be used for further cluster-based analysis of microarray data. PMID:15504240
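The sequential-reuse idea behind SKNN can be sketched in a few lines. The following is a simplified illustration, not the authors' implementation: plain Euclidean distance over the observed columns, mean of the k nearest rows, with genes as rows and arrays as columns.

```python
import numpy as np

def sknn_impute(X, k=2):
    """Sequential KNN imputation sketch: rows (genes) are processed in
    order of increasing missingness, and each imputed row joins the pool
    of usable neighbours for later rows -- the "sequential" reuse step."""
    X = np.array(X, dtype=float)
    order = np.argsort(np.isnan(X).sum(axis=1))
    pool = [i for i in order if not np.isnan(X[i]).any()]  # complete rows
    for i in order:
        miss = np.isnan(X[i])
        if not miss.any():
            continue
        obs = ~miss
        # distance to pooled rows over the columns observed in row i
        dists = sorted((np.linalg.norm(X[i, obs] - X[j, obs]), j) for j in pool)
        nbrs = [j for _, j in dists[:k]]
        X[i, miss] = X[nbrs][:, miss].mean(axis=0)   # average the k neighbours
        pool.append(i)                               # reuse the imputed row
    return X

expr = np.array([[1.0, 2.0, 3.0],
                 [1.0, 2.0, 4.0],
                 [1.0, 2.0, np.nan]])
filled = sknn_impute(expr, k=2)   # missing entry -> (3 + 4) / 2 = 3.5
```

The `pool.append(i)` line is what distinguishes SKNN from plain KNN imputation: later, sparser genes may borrow from earlier, already-imputed ones.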
TreeRipper web application: towards a fully automated optical tree recognition software.
Hughes, Joseph
2011-05-20
Relationships between species, genes and genomes have been printed as trees for over a century. Whilst this may have been the best format for exchanging and sharing phylogenetic hypotheses during the 20th century, the worldwide web now provides faster and automated ways of transferring and sharing phylogenetic knowledge. However, novel software is needed to defrost these published phylogenies for the 21st century. TreeRipper is a simple website for the fully automated recognition of multifurcating phylogenetic trees (http://linnaeus.zoology.gla.ac.uk/~jhughes/treeripper/). The program accepts a range of input image formats (PNG, JPG/JPEG or GIF). The underlying command-line C++ program follows a number of cleaning steps to detect lines, remove node labels, patch up broken lines and corners, and detect line edges. The edge contour is then traced to determine the branch lengths, tip label positions and the topology of the tree. Optical character recognition (OCR) is used to convert the tip labels into text with the freely available tesseract-ocr software. 32% of images meeting the prerequisites for TreeRipper were successfully recognised; the largest tree had 115 leaves. Despite the diversity of ways phylogenies have been illustrated, which makes the design of fully automated tree recognition software difficult, TreeRipper is a step towards automating the digitization of past phylogenies. We also provide a dataset of 100 tree images and associated tree files for training and/or benchmarking future software. TreeRipper is an open source project licensed under the GNU General Public Licence v3.
Fully automated contour detection of the ascending aorta in cardiac 2D phase-contrast MRI.
Codari, Marina; Scarabello, Marco; Secchi, Francesco; Sforza, Chiarella; Baselli, Giuseppe; Sardanelli, Francesco
2018-04-01
In this study we propose a fully automated method for localizing and segmenting the ascending aortic lumen in phase-contrast magnetic resonance imaging (PC-MRI). Twenty-five phase-contrast series were randomly selected from a large population dataset of patients whose cardiac MRI examinations, performed from September 2008 to October 2013, were unremarkable. The local Ethical Committee approved this retrospective study. The ascending aorta was automatically identified on each phase of the cardiac cycle using a priori knowledge of aortic geometry. The frame that maximized the area, eccentricity, and solidity parameters was chosen for unsupervised initialization. Aortic segmentation was performed on each frame using the active contour without edges technique. The entire algorithm was developed in MATLAB R2016b. To validate the proposed method, manual segmentation performed by a highly experienced operator was used as the reference. The Dice similarity coefficient, Bland-Altman analysis, and Pearson's correlation coefficient were used as performance metrics. Comparing automated and manual segmentation of the aortic lumen on 714 images, Bland-Altman analysis showed a bias of −6.68 mm², a coefficient of repeatability of 91.22 mm², a mean area measurement of 581.40 mm², and a reproducibility of 85%. Automated and manual segmentations were highly correlated (R = 0.98). The Dice similarity coefficient versus the manual reference standard was 94.6 ± 2.1% (mean ± standard deviation). A fully automated and robust method for identification and segmentation of the ascending aorta on PC-MRI was developed. Its application to patients with a variety of pathologic conditions is advisable. Copyright © 2017 Elsevier Inc. All rights reserved.
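The unsupervised initialization step above (pick the cardiac phase whose candidate region best matches the expected aortic cross-section) can be illustrated with a toy scorer. This sketch uses area times bounding-box fill ratio as a crude stand-in for the paper's area/eccentricity/solidity criterion; the masks, function names, and scoring rule are all invented for illustration:

```python
import numpy as np

def frame_score(mask):
    """Toy score for a candidate aortic mask: area times extent
    (area / bounding-box area), a crude stand-in for the full
    area/eccentricity/solidity criterion -- compact filled regions win."""
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return 0.0
    area = float(ys.size)
    bbox = (ys.max() - ys.min() + 1) * (xs.max() - xs.min() + 1)
    return area * (area / bbox)

def best_frame(masks):
    """Pick the cardiac phase whose candidate mask maximizes the score."""
    return max(range(len(masks)), key=lambda i: frame_score(masks[i]))

# Two hypothetical 5x5 candidate masks: a thin line vs a compact blob
line = np.zeros((5, 5), dtype=int); line[2, :] = 1
blob = np.zeros((5, 5), dtype=int); blob[1:4, 1:4] = 1
idx = best_frame([line, blob])   # the compact, disc-like blob wins
```

The chosen frame would then seed the active contour; the contour evolution itself is not sketched here.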
ATLAS from Data Research Associates: A Fully Integrated Automation System.
ERIC Educational Resources Information Center
Mellinger, Michael J.
1987-01-01
This detailed description of a fully integrated, turnkey library system includes a complete profile of the system (functions, operational characteristics, hardware, operating system, minimum memory and pricing); history of the technologies involved; and descriptions of customer services and availability. (CLB)
ROBOCAL: Gamma-ray isotopic hardware/software interface
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hurd, J.R.; Bonner, C.A.; Ostenak, C.A.
1989-01-01
ROBOCAL, presently being developed at the Los Alamos National Laboratory, is a full-scale prototypical robotic system for remotely performing calorimetric and gamma-ray isotopic measurements of nuclear materials. It features a fully automated vertical stacker-retriever for storing and retrieving packaged nuclear materials from a multi-drawer system, and a fully automated, uniquely integrated gantry robot for programmable selection and transfer of nuclear materials to calorimetric and gamma-ray isotopic measurement stations. Since ROBOCAL is designed to require almost no operator intervention, a mechanical control system is required in addition to a totally automated assay system. The assay system must be a completely integrated data acquisition and isotopic analysis package fully capable of performing state-of-the-art homogeneous and heterogeneous analyses on many varied matrices. The TRIFID assay system, discussed at this conference by J. G. Fleissner of the Rocky Flats Plant, has been adopted because of its many automated features. These include: MCA/ADC setup and acquisition; spectral storage and analysis utilizing an expert system formalism; report generation with internal measurement control printout; and user-friendly screens and menus. The mechanical control portion consists primarily of two detector platforms and a sample platform, each with independent movement. Some minor modifications and additions are needed for TRIFID to interface the assay and mechanical portions with the CimRoc 4000 software controlling the robot. 6 refs., 5 figs., 3 tabs.
Khan, Ali R; Wang, Lei; Beg, Mirza Faisal
2008-07-01
Fully automated brain segmentation methods have not been widely adopted for clinical use because of issues related to reliability, accuracy, and limitations of delineation protocol. By combining the probabilistic FreeSurfer (FS) method with the Large Deformation Diffeomorphic Metric Mapping (LDDMM)-based label-propagation method, we are able to increase reliability and accuracy, and allow for flexibility in template choice. Our method uses the automated FreeSurfer subcortical labeling to provide a coarse-to-fine introduction of information into the LDDMM template-based segmentation, resulting in a fully automated subcortical brain segmentation method (FS+LDDMM). One major advantage of the FS+LDDMM-based approach is that the automatically generated segmentations are inherently smooth, so subsequent steps in shape analysis can follow directly without manual post-processing or loss of detail. We have evaluated our new FS+LDDMM method on several databases containing a total of 50 subjects with different pathologies, scan sequences and manual delineation protocols for labeling the basal ganglia, thalamus, and hippocampus. In healthy controls we report Dice overlap measures of 0.81, 0.83, 0.74, 0.86 and 0.75 for the right caudate nucleus, putamen, pallidum, thalamus and hippocampus respectively. We also find statistically significant improvement of accuracy in FS+LDDMM over FreeSurfer for the caudate nucleus and putamen of Huntington's disease and Tourette's syndrome subjects, and the right hippocampus of Schizophrenia subjects.
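The Dice overlap measures quoted above follow from a one-line definition: twice the intersection of the two label masks divided by the sum of their sizes. A minimal sketch with toy masks (not study data):

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient between two binary segmentation masks:
    2|A & B| / (|A| + |B|); defined as 1.0 when both masks are empty."""
    a = np.asarray(a, dtype=bool)
    b = np.asarray(b, dtype=bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Hypothetical flattened masks standing in for automated vs manual labels
auto   = np.array([0, 1, 1, 1, 0, 0])
manual = np.array([0, 1, 1, 0, 0, 1])
score = dice(auto, manual)   # 2 * 2 / (3 + 3) = 0.666...
```

A Dice of 0.86 for the thalamus, for example, means the automated and manual labels share roughly 86% of their combined voxel mass.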
Utility of an automated thermal-based approach for monitoring evapotranspiration
USDA-ARS?s Scientific Manuscript database
A very simple remote sensing-based model for water use monitoring is presented. The model acronym DATTUTDUT (Deriving Atmosphere Turbulent Transport Useful To Dummies Using Temperature) is a Dutch word which loosely translates as “It’s unbelievable that it works”. DATTUTDUT is fully automated and o...
ArrayWiki: an enabling technology for sharing public microarray data repositories and meta-analyses
Stokes, Todd H; Torrance, JT; Li, Henry; Wang, May D
2008-01-01
Background A survey of microarray databases reveals that most repository contents and data models are heterogeneous (i.e., data obtained from different chip manufacturers), and that the repositories provide only basic biological keywords linking to PubMed. As a result, it is difficult to find datasets using research context or analysis parameter information beyond a few keywords. For example, to reduce the "curse-of-dimension" problem in microarray analysis, the number of samples is often increased by merging array data from different datasets. Knowing chip data parameters such as pre-processing steps (e.g., normalization, artefact removal, etc.), and knowing any previous biological validation of the dataset, is essential due to the heterogeneity of the data. However, most microarray repositories lack this meta-data in the first place, and have no mechanism to add or insert it. Thus, there is a critical need to create "intelligent" microarray repositories that (1) enable update of meta-data with the raw array data, and (2) provide standardized archiving protocols to minimize bias from the raw data sources. Results To address these problems, we have developed a community-maintained system called ArrayWiki that unites disparate meta-data of microarray meta-experiments from multiple primary sources, with four key features. First, ArrayWiki provides a user-friendly knowledge management interface in addition to a programmable interface using standards developed by Wikipedia. Second, ArrayWiki includes automated quality control processes (caCORRECT) and novel visualization methods (BioPNG, Gel Plots), which provide extra information about data quality unavailable in other microarray repositories. Third, it provides a user-curation capability through the familiar Wiki interface.
Fourth, ArrayWiki provides users with simple text-based searches across all experiment meta-data, and exposes data to search engine crawlers (Semantic Agents) such as Google to further enhance data discovery. Conclusions Microarray data and meta-information in ArrayWiki are distributed and visualized using a novel and compact data storage format, BioPNG. They are also open to the research community for curation, modification, and contribution. By making a small investment of time to learn the syntax and structure common to all sites running MediaWiki software, domain scientists and practitioners can all contribute to making better use of microarray technologies in research and medical practice. ArrayWiki is available at . PMID:18541053
Garrido, Terhilda; Kumar, Sudheen; Lekas, John; Lindberg, Mark; Kadiyala, Dhanyaja; Whippy, Alan; Crawford, Barbara; Weissberg, Jed
2014-01-01
Using electronic health records (EHRs) to automate publicly reported quality measures is receiving increasing attention and is one of the promises of EHR implementation. Kaiser Permanente has fully or partly automated six of the 13 Joint Commission measure sets. We describe our experience with automation and the resulting time savings: a reduction of approximately 50% in the abstractor time required for one measure set alone (the Surgical Care Improvement Project). However, our experience illustrates the gap between the current and desired states of automated public quality reporting, which has important implications for measure developers, accrediting entities, EHR vendors, public/private payers, and government. PMID:23831833
Salimi, Nima; Loh, Kar Hoe; Kaur Dhillon, Sarinder; Chong, Ving Ching
2016-01-01
Background. Fish species may be identified based on their unique otolith shape or contour. Several pattern recognition methods have been proposed to classify fish species through morphological features of the otolith contours. However, there has been no fully automated species identification model with an accuracy higher than 80%. The purpose of the current study is to develop a fully automated model, based on otolith contours, that identifies fish species with high classification accuracy. Methods. Images of the right sagittal otoliths of 14 fish species from three families, namely Sciaenidae, Ariidae, and Engraulidae, were used to develop the proposed identification model. The short-time Fourier transform (STFT) was used, for the first time in the area of otolith shape analysis, to extract important features of the otolith contours. Discriminant analysis (DA), as a classification technique, was used to train and test the model based on the extracted features. Results. Performance of the model was demonstrated using species from the three families separately, as well as all species combined. Overall classification accuracy of the model was greater than 90% in all cases. In addition, the effects of STFT variables on the performance of the identification model were explored in this study. Conclusions. The short-time Fourier transform could determine important features of the otolith outlines. The fully automated model proposed in this study (STFT-DA) could predict the species of an unknown specimen with acceptable identification accuracy. The model code can be accessed at http://mybiodiversityontologies.um.edu.my/Otolith/ and https://peerj.com/preprints/1517/. The current model has the flexibility to be used for more species and families in future studies.
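The pipeline above — represent the contour as a signal, take windowed Fourier features, classify — can be sketched compactly. This is an illustrative toy, not the STFT-DA model: the contour is a radius-versus-angle signal, a nearest-centroid rule stands in for discriminant analysis, and the window sizes, species signals, and all names are invented:

```python
import numpy as np

def stft_features(radii, win=16, hop=8, keep=4):
    """Short-time Fourier features of a contour signal: slide a window
    along the radius-vs-angle signal, take Hann-windowed FFT magnitudes,
    and concatenate the lowest `keep` coefficients of each window."""
    feats = []
    for start in range(0, len(radii) - win + 1, hop):
        seg = radii[start:start + win] * np.hanning(win)
        feats.extend(np.abs(np.fft.rfft(seg))[:keep])
    return np.asarray(feats)

def nearest_centroid(train, labels, x):
    """Tiny stand-in for the paper's discriminant analysis: assign x to
    the class with the closest mean feature vector."""
    classes = sorted(set(labels))
    cents = {c: np.mean([f for f, l in zip(train, labels) if l == c], axis=0)
             for c in classes}
    return min(classes, key=lambda c: np.linalg.norm(x - cents[c]))

# Invented contours: "species A" has 2 lobes, "species B" has 5 lobes
theta = np.linspace(0, 2 * np.pi, 64, endpoint=False)
make = lambda lobes, phase: 1.0 + 0.2 * np.sin(lobes * theta + phase)
train = [stft_features(make(2, p)) for p in (0.0, 0.3)] + \
        [stft_features(make(5, p)) for p in (0.0, 0.3)]
labels = ["A", "A", "B", "B"]
pred = nearest_centroid(train, labels, stft_features(make(5, 0.6)))
```

The frequency content of the windows separates the two invented outline shapes even when the test contour's phase differs from the training contours'.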
Workflows for microarray data processing in the Kepler environment.
Stropp, Thomas; McPhillips, Timothy; Ludäscher, Bertram; Bieda, Mark
2012-05-17
Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. 
These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to traditional shell scripting or R/BioConductor scripting approaches to pipeline design. Finally, we suggest that microarray data processing task workflows may provide a basis for future example-based comparison of different workflow systems. We provide a set of tools and complete workflows for microarray data analysis in the Kepler environment, which has the advantages of offering graphical, clear display of conceptual steps and parameters and the ability to easily integrate other resources such as remote data and web services.
Reproducibility of myelin content-based human habenula segmentation at 3 Tesla.
Kim, Joo-Won; Naidich, Thomas P; Joseph, Joshmi; Nair, Divya; Glasser, Matthew F; O'halloran, Rafael; Doucet, Gaelle E; Lee, Won Hee; Krinsky, Hannah; Paulino, Alejandro; Glahn, David C; Anticevic, Alan; Frangou, Sophia; Xu, Junqian
2018-03-26
In vivo morphological study of the human habenula, a pair of small epithalamic nuclei adjacent to the dorsomedial thalamus, has recently gained significant interest for its role in reward and aversion processing. However, segmenting the habenula from in vivo magnetic resonance imaging (MRI) is challenging due to the habenula's small size and low anatomical contrast. Although manual and semi-automated habenula segmentation methods have been reported, the test-retest reproducibility of the segmented habenula volume and the consistency of the boundaries of habenula segmentation have not been investigated. In this study, we evaluated the intra- and inter-site reproducibility of in vivo human habenula segmentation from 3T MRI (0.7-0.8 mm isotropic resolution) using our previously proposed semi-automated myelin contrast-based method and its fully-automated version, as well as a previously published manual geometry-based method. The habenula segmentation using our semi-automated method showed consistent boundary definition (high Dice coefficient, low mean distance, and moderate Hausdorff distance) and reproducible volume measurement (low coefficient of variation). Furthermore, the habenula boundary in our semi-automated segmentation from 3T MRI agreed well with that in the manual segmentation from 7T MRI (0.5 mm isotropic resolution) of the same subjects. Overall, our proposed semi-automated habenula segmentation showed reliable and reproducible habenula localization, while its fully-automated version offers an efficient way for large sample analysis. © 2018 Wiley Periodicals, Inc.
Morris, Olivia; McMahon, Adam; Boutin, Herve; Grigg, Julian; Prenant, Christian
2016-06-15
[18F]Fluoroacetaldehyde is a biocompatible prosthetic group that has been implemented pre-clinically using a semi-automated remotely controlled system. Automation of radiosyntheses permits use of higher levels of [18F]fluoride whilst minimising radiochemist exposure and enhancing reproducibility. In order to achieve full automation of [18F]fluoroacetaldehyde peptide radiolabelling, a customised GE Tracerlab FX-FN with fully programmed automated synthesis was developed. The automated synthesis of [18F]fluoroacetaldehyde is carried out using a commercially available precursor, with reproducible yields of 26% ± 3 (decay-corrected, n = 10) within 45 min. Fully automated radiolabelling of a protein, recombinant human interleukin-1 receptor antagonist (rhIL-1RA), with [18F]fluoroacetaldehyde was achieved within 2 h. Radiolabelling efficiency of rhIL-1RA with [18F]fluoroacetaldehyde was confirmed using HPLC and reached 20% ± 10 (n = 5). Overall RCY of [18F]rhIL-1RA was 5% ± 2 (decay-corrected, n = 5) within 2 h starting from 35 to 40 GBq of [18F]fluoride. Specific activity measurements of 8.11–13.5 GBq/µmol were attained (n = 5), a near three-fold improvement over those achieved using the semi-automated approach. The strategy can be applied to radiolabelling a range of peptides and proteins with [18F]fluoroacetaldehyde analogous to other aldehyde-bearing prosthetic groups, yet automation of the method provides reproducibility, thereby aiding translation to Good Manufacturing Practice manufacture and the transformation from pre-clinical to clinical production. Copyright © 2016 The Authors. Journal of Labelled Compounds and Radiopharmaceuticals published by John Wiley & Sons, Ltd.
Application of Artificial Intelligence to Improve Aircraft Survivability.
1985-12-01
may be as smooth and effective as possible. 3. Fully Automatic Digital Engine Control (FADEC) Under development at the Naval Weapons Center, a major...goal of the FADEC program is to significantly reduce engine vulnerability by fully automating the regulation of engine controls. Given a thrust
NASA Astrophysics Data System (ADS)
Fritzsche, Klaus H.; Giesel, Frederik L.; Heimann, Tobias; Thomann, Philipp A.; Hahn, Horst K.; Pantel, Johannes; Schröder, Johannes; Essig, Marco; Meinzer, Hans-Peter
2008-03-01
Objective quantification of disease-specific neurodegenerative changes can facilitate diagnosis and therapeutic monitoring in several neuropsychiatric disorders. Reproducibility and easy-to-perform assessment are essential to ensure applicability in clinical environments. The aim of this comparative study is the evaluation of a fully automated approach that assesses atrophic changes in Alzheimer's disease (AD) and Mild Cognitive Impairment (MCI). 21 healthy volunteers (mean age 66.2), 21 patients with MCI (66.6), and 10 patients with AD (65.1) were enrolled. Subjects underwent extensive neuropsychological testing and MRI was conducted on a 1.5 Tesla clinical scanner. Atrophic changes were measured automatically by a series of image processing steps including state-of-the-art brain mapping techniques. Results were compared with two reference approaches: a manual segmentation of the hippocampal formation and a semi-automated estimation of temporal horn volume, which is based upon interactive selection of two to six landmarks in the ventricular system. All approaches separated controls and AD patients significantly (10⁻⁵ < p < 10⁻⁴) and showed a slight but not significant increase of neurodegeneration for subjects with MCI compared to volunteers. The automated approach correlated significantly with the manual (r = −0.65, p < 10⁻⁶) and semi-automated (r = −0.83, p < 10⁻¹³) measurements. It proved highly accurate while maximizing observer independence and reducing analysis time, and is thus well suited for clinical routine.
Geraghty, Adam W A; Torres, Leandro D; Leykin, Yan; Pérez-Stable, Eliseo J; Muñoz, Ricardo F
2013-09-01
Worldwide automated Internet health interventions have the potential to greatly reduce health disparities. High attrition from automated Internet interventions is ubiquitous, and presents a challenge in the evaluation of their effectiveness. Our objective was to evaluate variables hypothesized to be related to attrition, by modeling predictors of attrition in a secondary data analysis of two cohorts of an international, dual language (English and Spanish) Internet smoking cessation intervention. The two cohorts were identical except for the approach to follow-up (FU): one cohort employed only fully automated FU (n = 16 430), while the other cohort also used 'live' contact conditional upon initial non-response (n = 1000). Attrition rates were 48.1 and 10.8% for the automated FU and live FU cohorts, respectively. Significant attrition predictors in the automated FU cohort included higher levels of nicotine dependency, lower education, lower quitting confidence and receiving more contact emails. Participants' younger age was the sole predictor of attrition in the live FU cohort. While research on large-scale deployment of Internet interventions is at an early stage, this study demonstrates that differences in attrition from trials on this scale are (i) systematic and predictable and (ii) can largely be eliminated by live FU efforts. In fully automated trials, targeting the predictors we identify may reduce attrition, a necessary precursor to effective behavioral Internet interventions that can be accessed globally.
Man-Robot Symbiosis: A Framework For Cooperative Intelligence And Control
NASA Astrophysics Data System (ADS)
Parker, Lynne E.; Pin, Francois G.
1988-10-01
The man-robot symbiosis concept has the fundamental objective of bridging the gap between fully human-controlled and fully autonomous systems to achieve true man-robot cooperative control and intelligence. Such a system would allow improved speed, accuracy, and efficiency of task execution, while retaining the man in the loop for innovative reasoning and decision-making. The symbiont would have capabilities for supervised and unsupervised learning, allowing an increase of expertise in a wide task domain. This paper describes a robotic system architecture facilitating the symbiotic integration of teleoperative and automated modes of task execution. The architecture reflects a unique blend of many disciplines of artificial intelligence into a working system, including job or mission planning, dynamic task allocation, man-robot communication, automated monitoring, and machine learning. These disciplines are embodied in five major components of the symbiotic framework: the Job Planner, the Dynamic Task Allocator, the Presenter/Interpreter, the Automated Monitor, and the Learning System.
Lange, Paul P; James, Keith
2012-10-08
A novel methodology for the synthesis of druglike heterocycle libraries has been developed through the use of flow reactor technology. The strategy employs orthogonal modification of a heterocyclic core, which is generated in situ, and was used to construct both a 25-membered library of druglike 3-aminoindolizines, and selected examples of a 100-member virtual library. This general protocol allows a broad range of acylation, alkylation and sulfonamidation reactions to be performed in conjunction with a tandem Sonogashira coupling/cycloisomerization sequence. All three synthetic steps were conducted under full automation in the flow reactor, with no handling or isolation of intermediates, to afford the desired products in good yields. This fully automated, multistep flow approach opens the way to highly efficient generation of druglike heterocyclic systems as part of a lead discovery strategy or within a lead optimization program.
Cost-Effectiveness Analysis of the Automation of a Circulation System.
ERIC Educational Resources Information Center
Mosley, Isobel
A general methodology for cost effectiveness analysis was developed and applied to the Colorado State University library loan desk. The cost effectiveness of the existing semi-automated circulation system was compared with that of a fully manual one, based on the existing manual subsystem. Faculty users' time and computer operating costs were…
Automated Semantic Indices Related to Cognitive Function and Rate of Cognitive Decline
ERIC Educational Resources Information Center
Pakhomov, Serguei V. S.; Hemmy, Laura S.; Lim, Kelvin O.
2012-01-01
The objective of our study is to introduce a fully automated, computational linguistic technique to quantify semantic relations between words generated on a standard semantic verbal fluency test and to determine its cognitive and clinical correlates. Cognitive differences between patients with Alzheimer's disease and mild cognitive impairment are…
ERIC Educational Resources Information Center
Hartt, Richard W.
This report discusses the characteristics, operations, and automation requirements of technical libraries providing services to organizations involved in aerospace and defense scientific and technical work, and describes the Local Automation Model project. This on-going project is designed to demonstrate the concept of a fully integrated library…
Robotic implementation of assays: tissue-nonspecific alkaline phosphatase (TNAP) case study.
Chung, Thomas D Y
2013-01-01
Laboratory automation and robotics have "industrialized" the execution of large-scale, high-capacity, high-throughput (100 K-1 MM/day) screening (HTS) campaigns, enabling screens of large "libraries" of compounds (>200 K-2 MM) to complete in a few days or weeks. Critical to the success of these HTS campaigns is the ability of a competent assay development team to convert a validated research-grade laboratory "benchtop" assay, suitable for manual or semi-automated operations on a few hundred compounds, into a robust, miniaturized (384- or 1,536-well format), well-engineered, scalable, industrialized assay that can be seamlessly implemented on a fully automated, fully integrated robotic screening platform for cost-effective screening of hundreds of thousands of compounds. Here, we provide a review of the theoretical guiding principles and practical considerations necessary to reduce often complex research biology to a "lean manufacturing" engineering endeavor comprising the adaptation, automation, and implementation of HTS. Furthermore, we provide a detailed example, specifically for a cell-free in vitro biochemical, enzymatic phosphatase assay for tissue-nonspecific alkaline phosphatase, that illustrates these principles and considerations.
Hoehl, Melanie M; Weißert, Michael; Dannenberg, Arne; Nesch, Thomas; Paust, Nils; von Stetten, Felix; Zengerle, Roland; Slocum, Alexander H; Steigert, Juergen
2014-06-01
This paper introduces a disposable battery-driven heating system for loop-mediated isothermal DNA amplification (LAMP) inside a centrifugally-driven DNA purification platform (LabTube). We demonstrate LabTube-based fully automated DNA purification of as low as 100 cell-equivalents of verotoxin-producing Escherichia coli (VTEC) in water, milk and apple juice in a laboratory centrifuge, followed by integrated and automated LAMP amplification with a reduction of hands-on time from 45 to 1 min. The heating system consists of two parallel SMD thick film resistors and a NTC as heating and temperature sensing elements. They are driven by a 3 V battery and controlled by a microcontroller. The LAMP reagents are stored in the elution chamber and the amplification starts immediately after the eluate is purged into the chamber. The LabTube, including a microcontroller-based heating system, demonstrates contamination-free and automated sample-to-answer nucleic acid testing within a laboratory centrifuge. The heating system can be easily parallelized within one LabTube and it is deployable for a variety of heating and electrical applications.
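The abstract does not give the firmware's temperature-conversion details; a common way to turn an NTC resistance reading into a temperature is the Beta-parameter model. A hedged sketch (the R0, T0, and B values below are hypothetical, not the LabTube's):

```python
import math

def ntc_temperature_c(r_ohm, r0=10_000.0, t0_c=25.0, beta=3950.0):
    """Beta model for an NTC thermistor: 1/T = 1/T0 + ln(R/R0)/B, T in kelvin."""
    t0_k = t0_c + 273.15
    inv_t = 1.0 / t0_k + math.log(r_ohm / r0) / beta
    return 1.0 / inv_t - 273.15

print(round(ntc_temperature_c(10_000.0), 1))  # -> 25.0 (R = R0 by construction)
print(round(ntc_temperature_c(3_000.0), 1))   # lower resistance -> hotter
```

A microcontroller loop would compare this reading against the LAMP setpoint and switch the heating resistors accordingly.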
Hatz, F; Hardmeier, M; Bousleiman, H; Rüegg, S; Schindler, C; Fuhr, P
2015-02-01
To compare the reliability of a newly developed Matlab® toolbox for the fully automated pre- and post-processing of resting-state EEG (automated analysis, AA) with the reliability of analysis involving visually controlled pre- and post-processing (VA). 34 healthy volunteers (age: median 38.2 (20-49), 82% female) had three consecutive 256-channel resting-state EEGs at one-year intervals. Results of frequency analysis with AA and VA were compared with Pearson correlation coefficients, and reliability over time was assessed with intraclass correlation coefficients (ICC). The mean correlation coefficient between AA and VA was 0.94±0.07; the mean ICC was 0.83±0.05 for AA and 0.84±0.07 for VA. AA and VA yield very similar results for spectral EEG analysis and are equally reliable. AA is less time-consuming, completely standardized, and independent of raters and their training. Automated processing of EEG facilitates the workflow of quantitative EEG analysis. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
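The comparison above rests on two statistics: Pearson's r between AA and VA outputs, and the ICC across the three annual sessions. A minimal sketch with invented band-power values (the ICC variant used by the study is not specified; a one-way random-effects ICC(1) is shown as an illustration):

```python
import statistics

def pearson_r(x, y):
    """Plain Pearson product-moment correlation."""
    mx, my = statistics.mean(x), statistics.mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def icc_oneway(rows):
    """One-way random-effects ICC(1): rows = subjects, columns = sessions."""
    n, k = len(rows), len(rows[0])
    grand = statistics.mean(v for row in rows for v in row)
    ms_between = k * sum((statistics.mean(r) - grand) ** 2 for r in rows) / (n - 1)
    ms_within = sum((v - statistics.mean(r)) ** 2 for r in rows for v in r) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Invented relative band power for 4 subjects over 3 annual sessions
sessions = [[10.0, 10.5, 9.8], [12.1, 12.0, 12.3], [8.9, 9.2, 9.0], [11.0, 10.8, 11.2]]
aa = [10.0, 12.1, 8.9, 11.0]   # automated-analysis values (toy)
va = [10.1, 12.0, 9.2, 10.8]   # visually controlled values (toy)
print(round(pearson_r(aa, va), 3), round(icc_oneway(sessions), 2))
```

High values of both statistics, as in the study, mean the automated pipeline tracks the visually controlled one and is stable across sessions.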
Development of a fully automated network system for long-term health-care monitoring at home.
Motoi, K; Kubota, S; Ikarashi, A; Nogawa, M; Tanaka, S; Nemoto, T; Yamakoshi, K
2007-01-01
Daily monitoring of health condition at home is very important not only as an effective scheme for early diagnosis and treatment of cardiovascular and other diseases, but also for prevention and control of such diseases. From this point of view, we have developed a prototype room for fully automated monitoring of various vital signs. From the results of preliminary experiments using this room, it was confirmed that (1) ECG and respiration during bathing, (2) excretion weight and blood pressure, and (3) respiration and cardiac beat during sleep could be monitored with reasonable accuracy by the sensor system installed in bathtub, toilet and bed, respectively.
Fully automated, deep learning segmentation of oxygen-induced retinopathy images
Xiao, Sa; Bucher, Felicitas; Wu, Yue; Rokem, Ariel; Lee, Cecilia S.; Marra, Kyle V.; Fallon, Regis; Diaz-Aguilar, Sophia; Aguilar, Edith; Friedlander, Martin; Lee, Aaron Y.
2017-01-01
Oxygen-induced retinopathy (OIR) is a widely used model to study ischemia-driven neovascularization (NV) in the retina and to serve in proof-of-concept studies evaluating antiangiogenic drugs for ocular, as well as nonocular, diseases. The primary parameters analyzed in this mouse model are the percentages of the retina occupied by vaso-obliteration (VO) and by NV areas. However, quantification of these two key variables is challenging because it requires human experts to read the images, and human readers are costly, time-consuming, and subject to bias. Using recent advances in machine learning and computer vision, we trained deep learning neural networks using over a thousand segmentations to fully automate segmentation of OIR images. In determining the percentage area of VO, our algorithm achieved a range of correlation coefficients similar to the inter-expert correlation coefficients. For quantification of the percentage area of neovascular tufts, our algorithm achieved a higher range of correlation coefficients than the inter-expert correlations. In summary, we have created an open-source, fully automated pipeline for the quantification of key values in OIR images using deep learning neural networks. PMID:29263301
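The networks themselves are not reproduced here; once a segmentation mask exists, the reported percentage-area readout is simple arithmetic. A sketch with toy masks (lists of 0/1 pixels, not the study's data):

```python
def percent_area(region_mask, retina_mask):
    """Percent of segmented retina pixels covered by a region mask (VO or NV)."""
    retina_px = sum(v for row in retina_mask for v in row)
    region_px = sum(r * t for rr, rt in zip(region_mask, retina_mask)
                    for r, t in zip(rr, rt))
    return 100.0 * region_px / retina_px

retina = [[1, 1, 1, 1], [1, 1, 1, 1], [0, 0, 1, 1]]  # 10 retina pixels (toy)
vo = [[1, 1, 0, 0], [0, 1, 0, 0], [0, 0, 0, 0]]      # 3 vaso-obliterated pixels
print(percent_area(vo, retina))  # -> 30.0
```

The deep-learning contribution is producing the masks reliably; the downstream statistic is deterministic.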
Designs and concept reliance of a fully automated high-content screening platform.
Radu, Constantin; Adrar, Hosna Sana; Alamir, Ab; Hatherley, Ian; Trinh, Trung; Djaballah, Hakim
2012-10-01
High-content screening (HCS) is becoming an accepted platform in academic and industry screening labs and does require slightly different logistics for execution. To automate our stand-alone HCS microscopes, namely, an alpha IN Cell Analyzer 3000 (INCA3000), originally a Praelux unit hooked to a Hudson Plate Crane with a maximum capacity of 50 plates per run, and the IN Cell Analyzer 2000 (INCA2000), in which up to 320 plates could be fed per run using the Thermo Fisher Scientific Orbitor, we opted for a 4 m linear track system harboring both microscopes, plate washer, bulk dispensers, and a high-capacity incubator allowing us to perform both live and fixed cell-based assays while accessing both microscopes on deck. Considerations in design were given to the integration of the alpha INCA3000, a new gripper concept to access the onboard nest, and peripheral locations on deck to ensure a self-reliant system capable of achieving higher throughput. The resulting system, referred to as Hestia, has been fully operational since the new year, has an onboard capacity of 504 plates, and harbors the only fully automated alpha INCA3000 unit in the world.
Merlos Rodrigo, Miguel Angel; Krejcova, Ludmila; Kudr, Jiri; Cernei, Natalia; Kopel, Pavel; Richtera, Lukas; Moulick, Amitava; Hynek, David; Adam, Vojtech; Stiborova, Marie; Eckschlager, Tomas; Heger, Zbynek; Zitka, Ondrej
2016-12-15
Metallothioneins (MTs) are involved in heavy metal detoxification in a wide range of living organisms. It is now well established that MTs play a substantial role in many pathophysiological processes, including carcinogenesis, and that they can serve as diagnostic biomarkers. In order to increase the applicability of MT in cancer diagnostics, an easy-to-use and rapid method for its detection is required. Hence, the aim of this study was to develop a fully automated, high-throughput assay for the estimation of MT levels. Here, we report the optimal conditions for the isolation of MTs from rabbit liver and their characterization using MALDI-TOF MS. In addition, we describe a two-step assay that starts with isolation of the protein using functionalized paramagnetic particles and finishes with electrochemical analysis. The designed easy-to-use, cost-effective, error-free and fully automated procedure for the isolation of MT, coupled with a simple analytical detection method, can provide a prototype for the construction of a diagnostic instrument appropriate for monitoring carcinogenesis or MT-related chemoresistance of tumors. Copyright © 2016 Elsevier B.V. All rights reserved.
Nielsen, Patricia Switten; Riber-Hansen, Rikke; Schmidt, Henrik; Steiniche, Torben
2016-04-09
Staging of melanoma includes quantification of a proliferation index, i.e., presumed melanocytic mitoses on H&E stains are counted manually in hot spots. Yet its reproducibility and prognostic impact increase with immunohistochemical dual staining for phosphohistone H3 (PHH3) and MART1, which may also enable fully automated quantification by image analysis. To ensure manageable workloads and repeatable measurements in modern pathology, this study aimed to present an automated quantification of proliferation with automated hot-spot selection in PHH3/MART1-stained melanomas. Formalin-fixed, paraffin-embedded tissue from 153 consecutive stage I/II melanoma patients was immunohistochemically dual-stained for PHH3 and MART1. Whole-slide images were captured, and the number of PHH3/MART1-positive cells was manually and automatically counted in the global tumor area and in a manually and an automatically selected hot spot, i.e., a fixed 1-mm² square. Bland-Altman plots and hypothesis tests compared manual and automated procedures, and the Cox proportional hazards model established their prognostic impact. The mean difference between manual and automated global counts was 2.9 cells/mm² (P = 0.0071) and, between automated counts in manually and automatically selected hot spots, 0.23 cells per hot spot (P = 0.96). In 77% of cases, manual and automated hot spots overlapped. Fully manual hot-spot counts yielded the highest prognostic performance, with an adjusted hazard ratio of 5.5 (95% CI, 1.3-24; P = 0.024), as opposed to 1.3 (95% CI, 0.61-2.9; P = 0.47) for automated counts with automated hot spots. The automated index and automated hot-spot selection were highly correlated with their manual counterparts, but altogether their prognostic impact was noticeably reduced. Because correct recognition of even a single PHH3/MART1-positive cell seems important, extremely high sensitivity and specificity of the algorithm are required for prognostic purposes.
Thus, automated analysis may still aid and improve the pathologists' detection of mitoses in melanoma and possibly other malignancies.
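The automated hot-spot selection described above can be sketched as an exhaustive search for the fixed-size window with the most positive cells. A toy illustration (the grid of per-tile counts and the 2×2 window are invented, not the study's 1-mm² implementation):

```python
def best_hot_spot(counts, w):
    """Return (row, col, total) of the w-by-w window with the most positive cells."""
    best = (0, 0, -1)
    for i in range(len(counts) - w + 1):
        for j in range(len(counts[0]) - w + 1):
            total = sum(counts[i + di][j + dj] for di in range(w) for dj in range(w))
            if total > best[2]:
                best = (i, j, total)
    return best

# Toy grid of PHH3/MART1-positive cell counts per tile
counts = [
    [0, 1, 0, 0],
    [1, 5, 4, 0],
    [0, 3, 6, 1],
    [0, 0, 1, 0],
]
print(best_hot_spot(counts, 2))  # -> (1, 1, 18)
```

A real implementation would work on cell coordinates within the slide and could use summed-area tables to make the scan fast.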
BATS: a Bayesian user-friendly software for analyzing time series microarray experiments.
Angelini, Claudia; Cutillo, Luisa; De Canditiis, Daniela; Mutarelli, Margherita; Pensky, Marianna
2008-10-06
Gene expression levels in a given cell can be influenced by different factors, such as pharmacological or medical treatments. The response to a given stimulus is usually different for different genes and may depend on time. One of the goals of modern molecular biology is the high-throughput identification of genes associated with a particular treatment or a biological process of interest. From a methodological and computational point of view, analyzing high-dimensional time-course microarray data requires a very specific set of tools, which are usually not included in standard software packages. Recently, the authors of this paper developed a fully Bayesian approach that allows one to identify differentially expressed genes in a 'one-sample' time-course microarray experiment, to rank them, and to estimate their expression profiles. The method is based on explicit expressions for calculations and is hence very computationally efficient. The software package BATS (Bayesian Analysis of Time Series) presented here implements the methodology described above. It allows a user to automatically identify and rank differentially expressed genes and to estimate their expression profiles when at least 5-6 time points are available. The package has a user-friendly interface. BATS successfully manages various technical difficulties that arise in time-course microarray experiments, such as a small number of observations, non-uniform sampling intervals, and replicated or missing data. BATS is a free, user-friendly software package for the analysis of both simulated and real microarray time-course experiments. The software, the user manual and a brief illustrative example are freely available online at the BATS website: http://www.na.iac.cnr.it/bats.
Automated System for Early Breast Cancer Detection in Mammograms
NASA Technical Reports Server (NTRS)
Bankman, Isaac N.; Kim, Dong W.; Christens-Barry, William A.; Weinberg, Irving N.; Gatewood, Olga B.; Brody, William R.
1993-01-01
The increasing demand on mammographic screening for early breast cancer detection, and the subtlety of early breast cancer signs on mammograms, suggest an automated image processing system that can serve as a diagnostic aid in radiology clinics. We present a fully automated algorithm for detecting clusters of microcalcifications that are the most common signs of early, potentially curable breast cancer. By using the contour map of the mammogram, the algorithm circumvents some of the difficulties encountered with standard image processing methods. The clinical implementation of an automated instrument based on this algorithm is also discussed.
Results from the first fully automated PBS-mask process and pelliclization
NASA Astrophysics Data System (ADS)
Oelmann, Andreas B.; Unger, Gerd M.
1994-02-01
Automation is widely discussed in IC and mask manufacturing and partially realized everywhere. The idea for this automation goes back to 1978, when it turned out that the operators of the then newly installed PBS process line (the first in Europe) would have to be trained to behave like robots to reduce particle counts and achieve lower defect densities on the masks. More than this goal has been achieved. It recently turned out that the automation, with its dedicated work routes and detailed documentation of every lot (individual mask or reticle), made it easy to obtain the CEEC certificate, which includes ISO 9001.
Microengineering methods for cell-based microarrays and high-throughput drug-screening applications.
Xu, Feng; Wu, JinHui; Wang, ShuQi; Durmus, Naside Gozde; Gurkan, Umut Atakan; Demirci, Utkan
2011-09-01
Screening for effective therapeutic agents from millions of drug candidates is costly, time-consuming, and often raises ethical concerns due to the extensive use of animals. To improve cost-effectiveness, and to minimize animal testing in pharmaceutical research, in vitro monolayer cell microarrays with multiwell plate assays have been developed. Integration of cell microarrays with microfluidic systems has facilitated automated and controlled component loading, significantly reducing the consumption of the candidate compounds and the target cells. Even though these methods significantly increase throughput compared to conventional in vitro testing systems and in vivo animal models, the cost associated with these platforms remains prohibitively high. Besides, there is a need for three-dimensional (3D) cell-based drug-screening models that can mimic the in vivo microenvironment and the functionality of native tissues. Here, we present the state-of-the-art microengineering approaches that can be used to develop 3D cell-based drug-screening assays. We highlight 3D in vitro cell culture systems with live cell-based arrays, microfluidic cell culture systems, and their application to high-throughput drug screening. We conclude that among the emerging microengineering approaches, bioprinting holds great potential to provide repeatable 3D cell-based constructs with high temporal and spatial control and versatility.
Meyer, Denny; Austin, David William; Kyrios, Michael
2011-01-01
Background The development of e-mental health interventions to treat or prevent mental illness and to enhance wellbeing has risen rapidly over the past decade. This development assists the public in sidestepping some of the obstacles that are often encountered when trying to access traditional face-to-face mental health care services. Objective The objective of our study was to investigate the posttreatment effectiveness of five fully automated self-help cognitive behavior e-therapy programs for generalized anxiety disorder (GAD), panic disorder with or without agoraphobia (PD/A), obsessive–compulsive disorder (OCD), posttraumatic stress disorder (PTSD), and social anxiety disorder (SAD) offered to the international public via Anxiety Online, an open-access full-service virtual psychology clinic for anxiety disorders. Methods We used a naturalistic participant choice, quasi-experimental design to evaluate each of the five Anxiety Online fully automated self-help e-therapy programs. Participants were required to have at least subclinical levels of one of the anxiety disorders to be offered the associated disorder-specific fully automated self-help e-therapy program. These programs are offered free of charge via Anxiety Online. Results A total of 225 people self-selected one of the five e-therapy programs (GAD, n = 88; SAD, n = 50; PD/A, n = 40; PTSD, n = 30; OCD, n = 17) and completed their 12-week posttreatment assessment. Significant improvements were found on 21/25 measures across the five fully automated self-help programs. At postassessment we observed significant reductions on all five anxiety disorder clinical disorder severity ratings (Cohen d range 0.72–1.22), increased confidence in managing one’s own mental health care (Cohen d range 0.70–1.17), and decreases in the total number of clinical diagnoses (except for the PD/A program, where a positive trend was found) (Cohen d range 0.45–1.08). 
In addition, we found significant improvements in quality of life for the GAD, OCD, PTSD, and SAD e-therapy programs (Cohen d range 0.11–0.96) and significant reductions relating to general psychological distress levels for the GAD, PD/A, and PTSD e-therapy programs (Cohen d range 0.23–1.16). Overall, treatment satisfaction was good across all five e-therapy programs, and posttreatment assessment completers reported using their e-therapy program an average of 395.60 (SD 272.2) minutes over the 12-week treatment period. Conclusions Overall, all five fully automated self-help e-therapy programs appear to be delivering promising high-quality outcomes; however, the results require replication. Trial Registration Australian and New Zealand Clinical Trials Registry ACTRN121611000704998; http://www.anzctr.org.au/trial_view.aspx?ID=336143 (Archived by WebCite at http://www.webcitation.org/618r3wvOG) PMID:22057287
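The effect sizes quoted above are Cohen d values from the standard pooled-standard-deviation formula. A minimal sketch with invented severity scores (not the trial's data):

```python
import statistics

def cohens_d(pre, post):
    """Cohen's d with pooled standard deviation across two score samples."""
    n1, n2 = len(pre), len(post)
    s1, s2 = statistics.variance(pre), statistics.variance(post)
    pooled = (((n1 - 1) * s1 + (n2 - 1) * s2) / (n1 + n2 - 2)) ** 0.5
    return (statistics.mean(pre) - statistics.mean(post)) / pooled

pre = [22, 25, 19, 24, 21]    # hypothetical severity scores before treatment
post = [15, 18, 14, 17, 16]   # after the 12-week program
print(round(cohens_d(pre, post), 2))
```

By convention, d around 0.2 is a small effect, 0.5 medium, and 0.8 or above large, which is how ranges such as 0.72-1.22 above are read.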
Fully automated gynecomastia quantification from low-dose chest CT
NASA Astrophysics Data System (ADS)
Liu, Shuang; Sonnenblick, Emily B.; Azour, Lea; Yankelevitz, David F.; Henschke, Claudia I.; Reeves, Anthony P.
2018-02-01
Gynecomastia is characterized by enlargement of the male breast, a common and sometimes distressing condition found in over half of adult men over the age of 44. Although the majority of gynecomastia is physiologic or idiopathic, its occurrence may also be associated with an extensive variety of underlying systemic diseases or drug toxicities. With the recent large-scale implementation of annual lung cancer screening using low-dose chest CT (LDCT), gynecomastia is believed to be a frequent incidental finding on LDCT. A fully automated system for gynecomastia quantification from LDCT is presented in this paper. The whole breast region is first segmented using an anatomy-oriented approach based on the propagation of pectoral muscle fronts in the vertical direction. The subareolar region is then localized, and the fibroglandular tissue within it is measured for the assessment of gynecomastia. The presented system was validated using 454 breast regions from non-contrast LDCT scans of 227 adult men. The ground truth was established by an experienced radiologist who classified each breast into one of five categorical scores. The automated measurements achieve promising performance for gynecomastia diagnosis, with an AUC of 0.86 for the ROC curve and a statistically significant Spearman correlation of r = 0.70 (p < 0.001) with the reference categorical grades. These encouraging results demonstrate the feasibility of fully automated gynecomastia quantification from LDCT, which may aid the early detection as well as the treatment of both gynecomastia and any underlying medical problems that cause it.
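Both validation statistics reported above can be computed from first principles. A hedged sketch with invented measurements (not the study's data; the Spearman version below does not average tied ranks):

```python
def auc(pos_scores, neg_scores):
    """ROC AUC via its Mann-Whitney interpretation: the probability that a
    randomly drawn positive case outscores a negative one (ties count half)."""
    wins = sum((p > n) + 0.5 * (p == n) for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

def spearman(x, y):
    """Spearman rank correlation: Pearson's r on the ranks (sketch, no tie averaging)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        for rank, i in enumerate(order):
            r[i] = rank + 1.0
        return r
    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Invented fibroglandular measurements for breasts with / without gynecomastia
pos = [0.9, 0.8, 0.7, 0.6]
neg = [0.5, 0.65, 0.3, 0.4]
measure = [1.2, 3.4, 2.0, 5.1, 4.0]   # automated measurement (toy)
grade = [0, 2, 1, 4, 3]               # radiologist's categorical score (toy)
print(auc(pos, neg), round(spearman(measure, grade), 2))  # -> 0.9375 1.0
```

Rank-based statistics are the natural fit here because the reference standard is an ordinal five-level score rather than a continuous measurement.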
Fully Automated Driving: Impact of Trust and Practice on Manual Control Recovery.
Payre, William; Cestac, Julien; Delhomme, Patricia
2016-03-01
An experiment was performed in a driving simulator to investigate the impacts of practice, trust, and interaction on manual control recovery (MCR) when employing fully automated driving (FAD). To promote efficient use of partially or highly automated driving and to improve safety, some studies have addressed trust in driving automation and training, but few have focused on FAD. FAD is an autonomous system that has full control of a vehicle without any need for intervention by the driver. A total of 69 drivers with a valid license practiced with FAD and were distributed evenly across two conditions: simple practice and elaborate practice. When examining emergency MCR, a correlation was found between trust and reaction time in the simple practice group (i.e., higher trust meant a longer reaction time), but not in the elaborate practice group. This result indicates that more elaborate practice may be needed to mitigate the negative impact of overtrust on reaction time. Drivers should be trained in how the automated device works so as to improve MCR performance in an emergency. The practice format used in this study could be used for the first interaction with an FAD car when acquiring such a vehicle. © 2015, Human Factors and Ergonomics Society.
Kandaswamy, Umasankar; Rotman, Ziv; Watt, Dana; Schillebeeckx, Ian; Cavalli, Valeria; Klyachko, Vitaly
2013-01-01
High-resolution live-cell imaging studies of neuronal structure and function are characterized by large variability in image acquisition conditions, due to background and sample variations as well as low signal-to-noise ratios. The lack of automated image analysis tools that generalize across varying image acquisition conditions represents one of the main challenges in the field of biomedical image analysis. Specifically, segmentation of axonal/dendritic arborizations in brightfield or fluorescence imaging studies is extremely labor-intensive and still performed mostly manually. Here we describe a fully automated machine-learning approach based on textural analysis algorithms for segmenting neuronal arborizations in high-resolution brightfield images of live cultured neurons. We compare the performance of our algorithm to manual segmentation and show that it achieves 90% accuracy with similarly high levels of specificity and sensitivity. Moreover, the algorithm maintains high performance under a wide range of image acquisition conditions, indicating that it is largely condition-invariant. We further describe an application of this algorithm to fully automated synapse localization and classification in fluorescence imaging studies based on synaptic activity. A textural analysis-based machine-learning approach thus offers a high-performance, condition-invariant tool for automated neurite segmentation. PMID:23261652
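The accuracy, sensitivity, and specificity quoted above reduce to confusion-matrix arithmetic over pixel labels. A sketch on toy labels (invented, not the study's masks):

```python
def confusion_metrics(truth, pred):
    """Pixel-wise accuracy, sensitivity (recall on positives) and specificity."""
    tp = sum(t == 1 and p == 1 for t, p in zip(truth, pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(truth, pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(truth, pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(truth, pred))
    return {
        "accuracy": (tp + tn) / len(truth),
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
    }

truth = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]   # manual segmentation labels (toy)
pred = [1, 1, 1, 0, 0, 0, 0, 0, 0, 1]    # automated labels (toy)
print(confusion_metrics(truth, pred))
```

Reporting all three together matters for segmentation, where neurite pixels are a small minority and accuracy alone can mislead.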
Building Flexible User Interfaces for Solving PDEs
NASA Astrophysics Data System (ADS)
Logg, Anders; Wells, Garth N.
2010-09-01
FEniCS is a collection of software tools for the automated solution of differential equations by finite element methods. In this note, we describe how FEniCS can be used to solve a simple nonlinear model problem with varying levels of automation. At one extreme, FEniCS provides tools for the fully automated and adaptive solution of nonlinear partial differential equations. At the other extreme, FEniCS provides a range of tools that allow the computational scientist to experiment with novel solution algorithms.
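To make the "varying levels of automation" concrete, here is the fully manual end of the spectrum: a hand-rolled Newton iteration for a simple nonlinear model problem. This is illustrative only and is not FEniCS code (it uses a finite-difference Laplacian rather than finite elements); FEniCS automates exactly the assembly and linearization written out below.

```python
# Solve  -u'' + u**3 = 1  on (0, 1),  u(0) = u(1) = 0, by Newton's method.
import numpy as np

n = 50                      # number of grid intervals
h = 1.0 / n
m = n - 1                   # interior unknowns
A = (2 * np.eye(m) - np.eye(m, k=1) - np.eye(m, k=-1)) / h**2
f = np.ones(m)
u = np.zeros(m)

for _ in range(20):
    residual = A @ u + u**3 - f
    J = A + np.diag(3 * u**2)          # Jacobian of the nonlinear system
    du = np.linalg.solve(J, -residual)
    u += du
    if np.linalg.norm(du) < 1e-12:
        break

print(f"residual norm = {np.linalg.norm(A @ u + u**3 - f):.2e}")
```

At the automated end, FEniCS lets one state the variational form symbolically and derives this Jacobian and assembly loop itself.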
Direct Detection of Drug-Resistant Hepatitis B Virus in Serum Using a Dendron-Modified Microarray
Kim, Doo Hyun; Kang, Hong Seok; Hur, Seong-Suk; Sim, Seobo; Ahn, Sung Hyun; Park, Yong Kwang; Park, Eun-Sook; Lee, Ah Ram; Park, Soree; Kwon, So Young; Lee, Jeong-Hoon
2018-01-01
Background/Aims Direct sequencing is the gold standard for the detection of drug-resistance mutations in hepatitis B virus (HBV); however, this procedure is time-consuming, labor-intensive, and difficult to adapt to high-throughput screening. In this study, we aimed to develop a dendron-modified DNA microarray for the detection of genotypic resistance mutations and evaluate its efficiency. Methods The specificity, sensitivity, and selectivity of dendron-modified slides for the detection of representative drug-resistance mutations were evaluated and compared to those of conventional slides. The diagnostic accuracy was validated using sera obtained from 13 patients who developed viral breakthrough during lamivudine, adefovir, or entecavir therapy and compared with the accuracy of restriction fragment mass polymorphism and direct sequencing data. Results The dendron-modified slides significantly outperformed the conventional microarray slides and were able to detect HBV DNA at a very low level (1 copy/μL). Notably, HBV mutants could be detected in the chronic hepatitis B patient sera without virus purification. The validation of our data revealed that this technique is fully compatible with sequencing data of drug-resistant HBV. Conclusions We developed a novel diagnostic technique for the simultaneous detection of several drug-resistance mutations using a dendron-modified DNA microarray. This technique can be directly applied to sera from chronic hepatitis B patients who show resistance to several nucleos(t)ide analogues. PMID:29271185
Schadt, Eric E; Edwards, Stephen W; GuhaThakurta, Debraj; Holder, Dan; Ying, Lisa; Svetnik, Vladimir; Leonardson, Amy; Hart, Kyle W; Russell, Archie; Li, Guoya; Cavet, Guy; Castle, John; McDonagh, Paul; Kan, Zhengyan; Chen, Ronghua; Kasarskis, Andrew; Margarint, Mihai; Caceres, Ramon M; Johnson, Jason M; Armour, Christopher D; Garrett-Engele, Philip W; Tsinoremas, Nicholas F; Shoemaker, Daniel D
2004-01-01
Background Computational and microarray-based experimental approaches were used to generate a comprehensive transcript index for the human genome. Oligonucleotide probes designed from approximately 50,000 known and predicted transcript sequences from the human genome were used to survey transcription from a diverse set of 60 tissues and cell lines using ink-jet microarrays. Further, expression activity over at least six conditions was more generally assessed using genomic tiling arrays consisting of probes tiled through a repeat-masked version of the genomic sequence making up chromosomes 20 and 22. Results The combination of microarray data with extensive genome annotations resulted in a set of 28,456 experimentally supported transcripts. This set of high-confidence transcripts represents the first experimentally driven annotation of the human genome. In addition, the results from genomic tiling suggest that a large amount of transcription exists outside of annotated regions of the genome and serves as an example of how this activity could be measured on a genome-wide scale. Conclusions These data represent one of the most comprehensive assessments of transcriptional activity in the human genome and provide an atlas of human gene expression over a unique set of gene predictions. Before the annotation of the human genome is considered complete, however, the previously unannotated transcriptional activity throughout the genome must be fully characterized. PMID:15461792
Effects of Automation Types on Air Traffic Controller Situation Awareness and Performance
NASA Technical Reports Server (NTRS)
Sethumadhavan, A.
2009-01-01
The Joint Planning and Development Office has proposed the introduction of automated systems to help air traffic controllers handle the increasing volume of air traffic in the next two decades (JPDO, 2007). Because fully automated systems leave operators out of the decision-making loop (e.g., Billings, 1991), it is important to determine the right level and type of automation that will keep air traffic controllers in the loop. This study examined the differences in the situation awareness (SA) and collision detection performance of individuals when they worked with information acquisition, information analysis, decision and action selection, and action implementation automation to control air traffic (Parasuraman, Sheridan, & Wickens, 2000). When the automation was unreliable, the time taken to detect an upcoming collision was significantly longer for all the automation types compared with the information acquisition automation. This poor performance following automation failure was mediated by SA, with lower SA yielding poorer performance. Thus, the costs associated with automation failure are greater when automation is applied to higher order stages of information processing. Results have practical implications for automation design and development of SA training programs.
JPLEX: Java Simplex Implementation with Branch-and-Bound Search for Automated Test Assembly
ERIC Educational Resources Information Center
Park, Ryoungsun; Kim, Jiseon; Dodd, Barbara G.; Chung, Hyewon
2011-01-01
JPLEX, short for Java simPLEX, is an automated test assembly (ATA) program. It is a mixed integer linear programming (MILP) solver written in Java. It reads in a configuration file, solves the minimization problem, and produces an output file for postprocessing. It implements the simplex algorithm to create a fully relaxed solution and…
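JPLEX couples a simplex-based LP relaxation with branch-and-bound search to solve the MILP. The branch-and-bound layer alone can be illustrated with a toy 0/1 test-assembly model that maximizes item information under a testing-time budget, using a fractional relaxation as the bound. This is a hedged sketch: the function names and the knapsack-style model are illustrative assumptions, not JPLEX's actual API or formulation.

```python
# Toy automated test assembly: choose items maximizing total information
# subject to a testing-time budget, via branch-and-bound pruned with a
# fractional (relaxation-style) upper bound. Illustrative only.

def fractional_bound(items, capacity):
    """Upper bound: allow fractional items, best information/time ratio first."""
    bound, remaining = 0.0, capacity
    for info, time in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
        if time <= remaining:
            bound += info
            remaining -= time
        else:
            bound += info * remaining / time  # take a fraction of this item
            break
    return bound

def assemble(items, capacity):
    """items: list of (information, time) pairs. Returns (best info, indices)."""
    best = [0.0, []]  # incumbent: best information found, chosen indices

    def branch(i, chosen, info, time_left):
        if info > best[0]:
            best[0], best[1] = info, list(chosen)
        if i == len(items):
            return
        # Prune: even the fractional relaxation cannot beat the incumbent.
        if info + fractional_bound(items[i:], time_left) <= best[0]:
            return
        inf_i, t_i = items[i]
        if t_i <= time_left:              # branch 1: include item i
            chosen.append(i)
            branch(i + 1, chosen, info + inf_i, time_left - t_i)
            chosen.pop()
        branch(i + 1, chosen, info, time_left)  # branch 0: exclude item i

    branch(0, [], 0.0, capacity)
    return best[0], best[1]
```

For example, `assemble([(10, 5), (6, 4), (5, 3)], 9)` selects the first two items for a total information of 16 within the 9-unit time budget.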
ECMS--Educational Contest Management System for Selecting Elite Students
ERIC Educational Resources Information Center
Schneider, Thorsten
2004-01-01
Selecting elite students out of a huge collective is a difficult task. The main problem is to provide automated processes to reduce human work. ECMS (Educational Contest Management System) is an online tool approach to help--fully or partly automated--with the task of selecting such elite students out of a mass of candidates. International tests…
An Automated Distillation Column for the Unit Operations Laboratory
ERIC Educational Resources Information Center
Perkins, Douglas M.; Bruce, David A.; Gooding, Charles H.; Butler, Justin T.
2005-01-01
A batch distillation apparatus has been designed and built for use in the undergraduate unit operations laboratory course. The column is fully automated and is accompanied by data acquisition and control software. A mixture of 1-propanol and 2-propanol is separated in the column, using either a constant distillate rate or constant composition…
An automated system for global atmospheric sampling using B-747 airliners
NASA Technical Reports Server (NTRS)
Lew, K. Q.; Gustafsson, U. R. C.; Johnson, R. E.
1981-01-01
The global air sampling program utilizes commercial aircraft in scheduled service to measure atmospheric constituents. A fully automated system designed for the 747 aircraft is described. Airline operational constraints and data and control subsystems are treated. The overall program management, system monitoring, and data retrieval from four aircraft in global service are described.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Daly, Don S.; Anderson, Kevin K.; White, Amanda M.
Background: A microarray of enzyme-linked immunosorbent assays, or ELISA microarray, simultaneously predicts the concentrations of numerous proteins in a small sample. These predictions, however, are uncertain due to processing error and biological variability. Making sound biological inferences, as well as improving the ELISA microarray process, requires both concentration predictions and credible estimates of their errors. Methods: We present a statistical method based on monotonic spline statistical models, penalized constrained least squares fitting (PCLS) and Monte Carlo simulation (MC) to predict concentrations and estimate prediction errors in ELISA microarrays. PCLS restrains the flexible spline to a fit of assay intensity that is a monotone function of protein concentration. With MC, both modeling and measurement errors are combined to estimate prediction error. The spline/PCLS/MC method is compared to a common method using simulated and real ELISA microarray data sets. Results: In contrast to the rigid logistic model, the flexible spline model gave credible fits in almost all test cases, including troublesome cases with left and/or right censoring, or other asymmetries. For the real data sets, 61% of the spline predictions were more accurate than their comparable logistic predictions, especially the spline predictions at the extremes of the prediction curve. The relative errors of 50% of comparable spline and logistic predictions differed by less than 20%. Monte Carlo simulation rendered acceptable asymmetric prediction intervals for both spline and logistic models, while propagation of error produced symmetric intervals that diverged unrealistically as the standard curves approached horizontal asymptotes. Conclusions: The spline/PCLS/MC method is a flexible, robust alternative to a logistic/NLS/propagation-of-error method for reliably predicting protein concentrations and estimating their errors.
The spline method simplifies model selection and fitting, and reliably estimates believable prediction errors. For the 50% of the real data sets fit well by both methods, spline and logistic predictions are practically indistinguishable, varying in accuracy by less than 15%. The spline method may be useful when automated prediction across simultaneous assays of numerous proteins must be applied routinely with minimal user intervention.
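The two ingredients of the method above, a monotone standard-curve fit and Monte Carlo propagation of measurement error through the inverted curve, can be sketched as follows. This sketch uses pool-adjacent-violators isotonic regression as a simple stand-in for the penalized constrained spline; the function names, interpolation scheme, and Gaussian noise model are assumptions for illustration, not the authors' implementation.

```python
import bisect
import random

def pava(y):
    """Pool-adjacent-violators: least-squares nondecreasing fit to y.
    A simple stand-in for a penalized, monotonicity-constrained spline."""
    level, weight = [], []
    for v in y:
        level.append(float(v)); weight.append(1.0)
        while len(level) > 1 and level[-2] > level[-1]:
            w = weight[-2] + weight[-1]
            m = (level[-2] * weight[-2] + level[-1] * weight[-1]) / w
            level[-2:] = [m]; weight[-2:] = [w]
    fit = []
    for m, w in zip(level, weight):
        fit.extend([m] * int(w))
    return fit

def predict(conc, fit, intensity):
    """Invert the monotone standard curve by linear interpolation."""
    i = bisect.bisect_left(fit, intensity)
    if i == 0:
        return conc[0]
    if i == len(fit):
        return conc[-1]
    f0, f1 = fit[i - 1], fit[i]
    t = 0.0 if f1 == f0 else (intensity - f0) / (f1 - f0)
    return conc[i - 1] + t * (conc[i] - conc[i - 1])

def mc_interval(conc, fit, intensity, sd, n=2000, seed=0):
    """Monte Carlo 95% prediction interval: perturb the observed
    intensity, re-invert the curve, and take empirical percentiles."""
    rng = random.Random(seed)
    draws = sorted(predict(conc, fit, intensity + rng.gauss(0.0, sd))
                   for _ in range(n))
    return draws[int(0.025 * n)], draws[int(0.975 * n)]
```

Because the interval is read off the inverted curve rather than symmetrized around a point estimate, it naturally becomes asymmetric where the curve flattens, which mirrors the behavior the abstract contrasts with propagation of error.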
Automation of fluorescent differential display with digital readout.
Meade, Jonathan D; Cho, Yong-Jig; Fisher, Jeffrey S; Walden, Jamie C; Guo, Zhen; Liang, Peng
2006-01-01
Since its invention in 1992, differential display (DD) has become the most commonly used technique for identifying differentially expressed genes because of its many advantages over competing technologies such as DNA microarray, serial analysis of gene expression (SAGE), and subtractive hybridization. Despite the great impact of the method on biomedical research, there has been a lack of automation of DD technology to increase its throughput and accuracy for systematic gene expression analysis. Most previous DD work has taken a "shot-gun" approach of identifying one gene at a time, with a limited number of polymerase chain reaction (PCR) reactions set up manually, giving DD a low-tech and low-throughput image. We have optimized the DD process with a new platform that incorporates fluorescent digital readout, automated liquid handling, and large-format gels capable of running entire 96-well plates. The resulting streamlined fluorescent DD (FDD) technology offers an unprecedented accuracy, sensitivity, and throughput in comprehensive and quantitative analysis of gene expression. These major improvements will allow researchers to find differentially expressed genes of interest, both known and novel, quickly and easily.
Effects of imperfect automation on decision making in a simulated command and control task.
Rovira, Ericka; McGarry, Kathleen; Parasuraman, Raja
2007-02-01
Effects of four types of automation support and two levels of automation reliability were examined. The objective was to examine the differential impact of information and decision automation and to investigate the costs of automation unreliability. Research has shown that imperfect automation can lead to differential effects of stages and levels of automation on human performance. Eighteen participants performed a "sensor to shooter" targeting simulation of command and control. Dependent variables included accuracy and response time of target engagement decisions, secondary task performance, and subjective ratings of mental workload, trust, and self-confidence. Compared with manual performance, reliable automation significantly reduced decision times. Unreliable automation led to greater cost in decision-making accuracy under the higher automation reliability condition for three different forms of decision automation relative to information automation. At low automation reliability, however, there was a cost in performance for both information and decision automation. The results are consistent with a model of human-automation interaction that requires evaluation of the different stages of information processing to which automation support can be applied. If fully reliable decision automation cannot be guaranteed, designers should provide users with information automation support or other tools that allow for inspection and analysis of raw data.
Laboratory Testing Protocols for Heparin-Induced Thrombocytopenia (HIT) Testing.
Lau, Kun Kan Edwin; Mohammed, Soma; Pasalic, Leonardo; Favaloro, Emmanuel J
2017-01-01
Heparin-induced thrombocytopenia (HIT) represents a significant, high-morbidity complication of heparin therapy. The clinicopathological diagnosis of HIT remains challenging for many reasons; thus, laboratory testing represents an important component of an accurate diagnosis. Although there are many assays available to assess HIT, these essentially fall into two categories: (a) immunological assays and (b) functional assays. The current chapter presents protocols for the HIT assays that are most commonly performed in laboratory practice and have the widest geographic distribution. These comprise a manual lateral flow-based system (STiC), a fully automated latex immunoturbidimetric assay, a fully automated chemiluminescent assay (CLIA), light transmission aggregation (LTA), and whole blood aggregation (Multiplate).
An Open-Source Automated Peptide Synthesizer Based on Arduino and Python.
Gali, Hariprasad
2017-10-01
The development of the first open-source automated peptide synthesizer, PepSy, using Arduino UNO and readily available components is reported. PepSy was primarily designed to synthesize small peptides at a relatively small scale (<100 µmol). Scripts to operate PepSy in fully automatic or manual mode were written in Python. The fully automatic script includes functions to carry out resin swelling, resin washing, single coupling, double coupling, Fmoc deprotection, ivDde deprotection, on-resin oxidation, end capping, and amino acid/reagent line cleaning. Several small peptides and peptide conjugates were successfully synthesized on PepSy with reasonably good yields and purity, depending on the complexity of the peptide.
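The fully automatic workflow listed above can be sketched as a simple plan expander that turns a peptide sequence into an ordered list of synthesizer operations. The step names, their ordering, and the double-coupling rule here are illustrative assumptions, not PepSy's actual Python scripts.

```python
# Sketch of a PepSy-style synthesis plan: expand a peptide sequence into
# an ordered list of operations. Solid-phase peptide synthesis (SPPS)
# proceeds C-terminus to N-terminus, so the sequence is walked in reverse.
# Operation names are hypothetical.

def synthesis_plan(sequence, double_couple=()):
    """sequence: one-letter residues, N- to C-terminus.
    double_couple: residues that get a second coupling cycle."""
    plan = [("swell_resin", None)]
    for residue in reversed(sequence):        # build C-terminus first
        plan.append(("fmoc_deprotect", None))
        plan.append(("wash", None))
        plan.append(("couple", residue))
        if residue in double_couple:
            plan.append(("couple", residue))  # repeat for hard couplings
        plan.append(("wash", None))
    plan.append(("final_wash", None))
    return plan
```

A driver script would then dispatch each step to the valve and pump controllers over the Arduino's serial link; here the plan is just data, which keeps the chemistry schedule testable separately from the hardware layer.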
Building biochips: a protein production pipeline
NASA Astrophysics Data System (ADS)
de Carvalho-Kavanagh, Marianne G. S.; Albala, Joanna S.
2004-06-01
Protein arrays are emerging as a practical format in which to study proteins in high-throughput using many of the same techniques as that of the DNA microarray. The key advantage to array-based methods for protein study is the potential for parallel analysis of thousands of samples in an automated, high-throughput fashion. Building protein arrays capable of this analysis capacity requires a robust expression and purification system capable of generating hundreds to thousands of purified recombinant proteins. We have developed a method to utilize LLNL-I.M.A.G.E. cDNAs to generate recombinant protein libraries using a baculovirus-insect cell expression system. We have used this strategy to produce proteins for analysis of protein/DNA and protein/protein interactions using protein microarrays in order to understand the complex interactions of proteins involved in homologous recombination and DNA repair. Using protein array techniques, a novel interaction between the DNA repair protein, Rad51B, and histones has been identified.
CNV-WebStore: online CNV analysis, storage and interpretation.
Vandeweyer, Geert; Reyniers, Edwin; Wuyts, Wim; Rooms, Liesbeth; Kooy, R Frank
2011-01-05
Microarray technology allows the analysis of genomic aberrations at an ever increasing resolution, making functional interpretation of these vast amounts of data the main bottleneck in routine implementation of high resolution array platforms, and emphasising the need for a centralised and easy to use CNV data management and interpretation system. We present CNV-WebStore, an online platform to streamline the processing and downstream interpretation of microarray data in a clinical context, tailored towards but not limited to the Illumina BeadArray platform. Provided analysis tools include CNV analysis, parent-of-origin determination and uniparental disomy detection. Interpretation tools include data visualisation, gene prioritisation, automated PubMed searching, linking data to several genome browsers and annotation of CNVs based on several public databases. Finally a module is provided for uniform reporting of results. CNV-WebStore is able to present copy number data in an intuitive way to both lab technicians and clinicians, making it a useful tool in daily clinical practice.
Direct on-chip DNA synthesis using electrochemically modified gold electrodes as solid support
NASA Astrophysics Data System (ADS)
Levrie, Karen; Jans, Karolien; Schepers, Guy; Vos, Rita; Van Dorpe, Pol; Lagae, Liesbet; Van Hoof, Chris; Van Aerschot, Arthur; Stakenborg, Tim
2018-04-01
DNA microarrays have propelled important advancements in the field of genomic research by enabling the monitoring of thousands of genes in parallel. The throughput can be increased even further by scaling down the microarray feature size. In this respect, microelectronics-based DNA arrays are promising as they can leverage semiconductor processing techniques with lithographic resolutions. We propose a method that enables the use of metal electrodes for de novo DNA synthesis without the need for an insulating support. By electrochemically functionalizing gold electrodes, these electrodes can act as solid support for phosphoramidite-based synthesis. The proposed method relies on the electrochemical reduction of diazonium salts, enabling site-specific incorporation of hydroxyl groups onto the metal electrodes. An automated DNA synthesizer was used to couple phosphoramidite moieties directly onto the OH-modified electrodes to obtain the desired oligonucleotide sequence. Characterization was done via cyclic voltammetry and fluorescence microscopy. Our results present a valuable proof-of-concept for the integration of solid-phase DNA synthesis with microelectronics.
Automation study for space station subsystems and mission ground support
NASA Technical Reports Server (NTRS)
1985-01-01
An automation concept for the autonomous operation of space station subsystems, i.e., electric power, thermal control, and communications and tracking are discussed. To assure that functions essential for autonomous operations are not neglected, an operations function (systems monitoring and control) is included in the discussion. It is recommended that automated speech recognition and synthesis be considered a basic mode of man/machine interaction for space station command and control, and that the data management system (DMS) and other systems on the space station be designed to accommodate fully automated fault detection, isolation, and recovery within the system monitoring function of the DMS.
NASA Technical Reports Server (NTRS)
Bayless, E. O.; Lawless, K. G.; Kurgan, C.; Nunes, A. C.; Graham, B. F.; Hoffman, D.; Jones, C. S.; Shepard, R.
1993-01-01
Fully automated variable-polarity plasma arc (VPPA) welding system developed at Marshall Space Flight Center. System eliminates defects caused by human error. Integrates many sensors with mathematical model of the weld and computer-controlled welding equipment. Sensors provide real-time information on geometry of weld bead, location of weld joint, and wire-feed entry. Mathematical model relates geometry of weld to critical parameters of welding process.
De Marni, Marzia L; Monegal, Ana; Venturini, Samuele; Vinati, Simone; Carbone, Roberta; de Marco, Ario
2012-02-01
The preparation of effective conventional antibody microarrays depends on the availability of high quality material and on the correct accessibility of the antibody active moieties following their immobilization on the support slide. We show that spotting bacteria that expose recombinant antibodies on their external surface directly on nanostructured-TiO(2) or epoxy slides (purification-independent microarray, PIM) is a simple and reliable alternative for preparing sensitive and specific microarrays for antigen detection. Variable domains of single heavy-chain antibodies (VHHs) against fibroblast growth factor receptor 1 (FGFR1) were used to capture the antigen diluted in serum or BSA solution. The FGFR1 detection was performed by either direct antigen labeling or using a sandwich system in which FGFR1 was first bound to its antibody and successively identified using a labeled FGF. In both cases the signal distribution within each spot was uniform and the spot morphology regular. The signal-to-noise ratio was very high and the specificity of the system was demonstrated statistically. The limit of detection (LOD) for the antigen was calculated to be 0.4 ng/mL, with a dynamic range between 0.4 ng/mL and 10 μg/mL. The microarrays prepared with bacteria exposing antibodies remain fully functional for at least 31 days after spotting. Finally, we demonstrated that the method is suitable for other antigen-antibody pairs and expect that it could be easily adapted to further applications such as the display of scFv and IgG antibodies or autoantibody detection using protein PIMs. Copyright © 2011. Published by Elsevier Inc.
Fully Automated Deep Learning System for Bone Age Assessment.
Lee, Hyunkwang; Tajmir, Shahein; Lee, Jenny; Zissen, Maurice; Yeshiwas, Bethel Ayele; Alkasab, Tarik K; Choy, Garry; Do, Synho
2017-08-01
Skeletal maturity progresses through discrete phases, a fact that is used routinely in pediatrics where bone age assessments (BAAs) are compared to chronological age in the evaluation of endocrine and metabolic disorders. While central to many disease evaluations, little has changed to improve the tedious process since its introduction in 1950. In this study, we propose a fully automated deep learning pipeline to segment a region of interest, standardize and preprocess input radiographs, and perform BAA. Our models use an ImageNet pretrained, fine-tuned convolutional neural network (CNN) to achieve 57.32 and 61.40% accuracies for the female and male cohorts on our held-out test images. Female test radiographs were assigned a BAA within 1 year 90.39% and within 2 years 98.11% of the time. Male test radiographs were assigned 94.18% within 1 year and 99.00% within 2 years. Using the input occlusion method, attention maps were created which reveal what features the trained model uses to perform BAA. These correspond to what human experts look at when manually performing BAA. Finally, the fully automated BAA system was deployed in the clinical environment as a decision supporting system for more accurate and efficient BAAs at much faster interpretation time (<2 s) than the conventional method.
A new fully automated FTIR system for total column measurements of greenhouse gases
NASA Astrophysics Data System (ADS)
Geibel, M. C.; Gerbig, C.; Feist, D. G.
2010-10-01
This article introduces a new fully automated FTIR system that is part of the Total Carbon Column Observing Network (TCCON). It will provide continuous ground-based measurements of column-averaged volume mixing ratio for CO2, CH4 and several other greenhouse gases in the tropics. Housed in a 20-foot shipping container, it was developed as a transportable system that could be deployed almost anywhere in the world. We describe the automation concept, which relies on three autonomous subsystems and their interaction. Crucial components like a sturdy and reliable solar tracker dome are described in detail. The automation software employs a new approach relying on multiple processes, database logging and web-based remote control. First results of total column measurements at Jena, Germany show that the instrument works well and can provide parts of the diurnal as well as seasonal cycle for CO2. Instrument line shape measurements with an HCl cell suggest that the instrument stays well-aligned over several months. After a short test campaign for side-by-side intercomparison with an existing TCCON instrument in Australia, the system will be transported to its final destination, Ascension Island.
Fully Automated Data Collection Using PAM and the Development of PAM/SPACE Reversible Cassettes
NASA Astrophysics Data System (ADS)
Hiraki, Masahiko; Watanabe, Shokei; Chavas, Leonard M. G.; Yamada, Yusuke; Matsugaki, Naohiro; Igarashi, Noriyuki; Wakatsuki, Soichi; Fujihashi, Masahiro; Miki, Kunio; Baba, Seiki; Ueno, Go; Yamamoto, Masaki; Suzuki, Mamoru; Nakagawa, Atsushi; Watanabe, Nobuhisa; Tanaka, Isao
2010-06-01
To remotely control and automatically collect data in high-throughput X-ray data collection experiments, the Structural Biology Research Center at the Photon Factory (PF) developed and installed sample exchange robots PAM (PF Automated Mounting system) at PF macromolecular crystallography beamlines; BL-5A, BL-17A, AR-NW12A and AR-NE3A. We developed and installed software that manages the flow of the automated X-ray experiments; sample exchanges, loop-centering and X-ray diffraction data collection. The fully automated data collection function has been available since February 2009. To identify sample cassettes, PAM employs a two-dimensional bar code reader. New beamlines, BL-1A at the Photon Factory and BL32XU at SPring-8, are currently under construction as part of Targeted Proteins Research Program (TPRP) by the Ministry of Education, Culture, Sports, Science and Technology of Japan. However, different robots, PAM and SPACE (SPring-8 Precise Automatic Cryo-sample Exchanger), will be installed at BL-1A and BL32XU, respectively. For the convenience of the users of both facilities, pins and cassettes for PAM and SPACE are developed as part of the TPRP.
ATLAS, an integrated structural analysis and design system. Volume 6: Design module theory
NASA Technical Reports Server (NTRS)
Backman, B. F.
1979-01-01
The automated design theory underlying the operation of the ATLAS Design Module is described. The methods, applications and limitations associated with the fully stressed design, the thermal fully stressed design and a regional optimization algorithm are presented. A discussion of the convergence characteristics of the fully stressed design is also included. Derivations and concepts specific to the ATLAS design theory are shown, while conventional terminology and established methods are identified by references.
NASA Astrophysics Data System (ADS)
McIntosh, Chris; Welch, Mattea; McNiven, Andrea; Jaffray, David A.; Purdie, Thomas G.
2017-08-01
Recent works in automated radiotherapy treatment planning have used machine learning based on historical treatment plans to infer the spatial dose distribution for a novel patient directly from the planning image. We present a probabilistic, atlas-based approach which predicts the dose for novel patients using a set of automatically selected most similar patients (atlases). The output is a spatial dose objective, which specifies the desired dose-per-voxel, and therefore replaces the need to specify and tune dose-volume objectives. Voxel-based dose mimicking optimization then converts the predicted dose distribution to a complete treatment plan with dose calculation using a collapsed cone convolution dose engine. In this study, we investigated automated planning for right-sided oropharynx head and neck patients treated with IMRT and VMAT. We compare four versions of our dose prediction pipeline using a database of 54 training and 12 independent testing patients by evaluating 14 clinical dose evaluation criteria. Our preliminary results are promising and demonstrate that automated methods can generate dose distributions comparable to clinical plans. Overall, automated plans achieved an average of 0.6% higher dose for target coverage evaluation criteria, and 2.4% lower dose at the organ-at-risk criteria levels evaluated, compared with clinical plans. There was no statistically significant difference detected in high-dose conformity between automated and clinical plans as measured by the conformation number. Automated plans achieved nine more unique criteria than clinical plans across the 12 patients tested, and automated plans scored a significantly higher dose at the evaluation limit for two high-risk target coverage criteria and a significantly lower dose in one critical organ maximum dose. The novel dose prediction method with dose mimicking can generate complete treatment plans in 12-13 min without user interaction.
It is a promising approach for fully automated treatment planning and can be readily applied to different treatment sites and modalities.
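The atlas-selection step described above can be sketched as nearest-neighbour retrieval over image features followed by voxel-wise averaging of the retrieved dose maps to form the dose-per-voxel objective. The feature vectors, Euclidean distance, and k are illustrative assumptions; the actual pipeline adds learned regression and dose-mimicking optimization on top of this idea.

```python
# Toy atlas-based dose prediction: retrieve the k most similar prior
# patients by image-feature distance, then average their dose maps
# voxel-wise to produce a spatial dose objective. Illustrative only.
import math

def predict_dose(new_features, atlases, k=2):
    """atlases: list of (feature_vector, dose_map) pairs, where each
    dose_map is a flat list of dose-per-voxel values."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    # Select the k atlases whose features are closest to the new patient.
    nearest = sorted(atlases, key=lambda at: dist(at[0], new_features))[:k]
    n_vox = len(nearest[0][1])
    # Voxel-wise mean over the selected atlases -> dose objective.
    return [sum(at[1][v] for at in nearest) / k for v in range(n_vox)]
```

In the full pipeline this predicted distribution would then be handed to the dose-mimicking optimizer, which searches for machine-deliverable plan parameters reproducing it.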
Bonekamp, S; Ghosh, P; Crawford, S; Solga, S F; Horska, A; Brancati, F L; Diehl, A M; Smith, S; Clark, J M
2008-01-01
To examine five available software packages for the assessment of abdominal adipose tissue with magnetic resonance imaging, compare their features and assess the reliability of measurement results. Feature evaluation and test-retest reliability of software packages (NIHImage, SliceOmatic, Analyze, HippoFat and EasyVision) used in manual, semi-automated or automated segmentation of abdominal adipose tissue. A random sample of 15 obese adults with type 2 diabetes. Axial T1-weighted spin echo images centered at vertebral bodies of L2-L3 were acquired at 1.5 T. Five software packages were evaluated (NIHImage, SliceOmatic, Analyze, HippoFat and EasyVision), comparing manual, semi-automated and automated segmentation approaches. Images were segmented into cross-sectional area (CSA), and the areas of visceral (VAT) and subcutaneous adipose tissue (SAT). Ease of learning and use and the design of the graphical user interface (GUI) were rated. Intra-observer accuracy and agreement between the software packages were calculated using intra-class correlation. Intra-class correlation coefficient was used to obtain test-retest reliability. Three of the five evaluated programs offered a semi-automated technique to segment the images based on histogram values or a user-defined threshold. One software package allowed manual delineation only. One fully automated program demonstrated the drawbacks of uncritical automated processing. The semi-automated approaches reduced variability and measurement error, and improved reproducibility. There was no significant difference in the intra-observer agreement in SAT and CSA. The VAT measurements showed significantly lower test-retest reliability. There were some differences between the software packages in qualitative aspects, such as user friendliness. Four out of five packages provided essentially the same results with respect to the inter- and intra-rater reproducibility.
Our results using SliceOmatic, Analyze or NIHImage were comparable and could be used interchangeably. Newly developed fully automated approaches should be compared to one of the examined software packages.
Bonekamp, S; Ghosh, P; Crawford, S; Solga, SF; Horska, A; Brancati, FL; Diehl, AM; Smith, S; Clark, JM
2009-01-01
Objective To examine five available software packages for the assessment of abdominal adipose tissue with magnetic resonance imaging, compare their features and assess the reliability of measurement results. Design Feature evaluation and test–retest reliability of software packages (NIHImage, SliceOmatic, Analyze, HippoFat and EasyVision) used in manual, semi-automated or automated segmentation of abdominal adipose tissue. Subjects A random sample of 15 obese adults with type 2 diabetes. Measurements Axial T1-weighted spin echo images centered at vertebral bodies of L2–L3 were acquired at 1.5 T. Five software packages were evaluated (NIHImage, SliceOmatic, Analyze, HippoFat and EasyVision), comparing manual, semi-automated and automated segmentation approaches. Images were segmented into cross-sectional area (CSA), and the areas of visceral (VAT) and subcutaneous adipose tissue (SAT). Ease of learning and use and the design of the graphical user interface (GUI) were rated. Intra-observer accuracy and agreement between the software packages were calculated using intra-class correlation. Intra-class correlation coefficient was used to obtain test–retest reliability. Results Three of the five evaluated programs offered a semi-automated technique to segment the images based on histogram values or a user-defined threshold. One software package allowed manual delineation only. One fully automated program demonstrated the drawbacks of uncritical automated processing. The semi-automated approaches reduced variability and measurement error, and improved reproducibility. There was no significant difference in the intra-observer agreement in SAT and CSA. The VAT measurements showed significantly lower test–retest reliability. There were some differences between the software packages in qualitative aspects, such as user friendliness. Conclusion Four out of five packages provided essentially the same results with respect to the inter- and intra-rater reproducibility.
Our results using SliceOmatic, Analyze or NIHImage were comparable and could be used interchangeably. Newly developed fully automated approaches should be compared to one of the examined software packages. PMID:17700582
McIntosh, Chris; Welch, Mattea; McNiven, Andrea; Jaffray, David A; Purdie, Thomas G
2017-07-06
Recent works in automated radiotherapy treatment planning have used machine learning based on historical treatment plans to infer the spatial dose distribution for a novel patient directly from the planning image. We present a probabilistic, atlas-based approach which predicts the dose for novel patients using a set of automatically selected most similar patients (atlases). The output is a spatial dose objective, which specifies the desired dose-per-voxel, and therefore replaces the need to specify and tune dose-volume objectives. Voxel-based dose mimicking optimization then converts the predicted dose distribution to a complete treatment plan with dose calculation using a collapsed cone convolution dose engine. In this study, we investigated automated planning for right-sided oropharynx head and neck patients treated with IMRT and VMAT. We compare four versions of our dose prediction pipeline using a database of 54 training and 12 independent testing patients by evaluating 14 clinical dose evaluation criteria. Our preliminary results are promising and demonstrate that automated methods can generate dose distributions comparable to clinical plans. Overall, automated plans achieved an average of 0.6% higher dose for target coverage evaluation criteria, and 2.4% lower dose at the organ-at-risk criteria levels evaluated, compared with clinical plans. There was no statistically significant difference detected in high-dose conformity between automated and clinical plans as measured by the conformation number. Automated plans achieved nine more unique criteria than clinical plans across the 12 patients tested, and automated plans scored a significantly higher dose at the evaluation limit for two high-risk target coverage criteria and a significantly lower dose in one critical organ maximum dose. The novel dose prediction method with dose mimicking can generate complete treatment plans in 12-13 min without user interaction.
It is a promising approach for fully automated treatment planning and can be readily applied to different treatment sites and modalities.
NASA Astrophysics Data System (ADS)
Sharma, Archie; Corona, Enrique; Mitra, Sunanda; Nutter, Brian S.
2006-03-01
Early detection of structural damage to the optic nerve head (ONH) is critical in diagnosis of glaucoma, because such glaucomatous damage precedes clinically identifiable visual loss. Early detection of glaucoma can prevent progression of the disease and consequent loss of vision. Traditional early detection techniques involve observing changes in the ONH through an ophthalmoscope. Stereo fundus photography is also routinely used to detect subtle changes in the ONH. However, clinical evaluation of stereo fundus photographs suffers from inter- and intra-subject variability. Even the Heidelberg Retina Tomograph (HRT) has not been found to be sufficiently sensitive for early detection. A semi-automated algorithm for quantitative representation of the optic disc and cup contours by computing accumulated disparities in the disc and cup regions from stereo fundus image pairs has already been developed using advanced digital image analysis methodologies. A 3-D visualization of the disc and cup is achieved assuming camera geometry. High correlation among computer-generated and manually segmented cup to disc ratios in a longitudinal study involving 159 stereo fundus image pairs has already been demonstrated. However, clinical usefulness of the proposed technique can only be tested by a fully automated algorithm. In this paper, we present a fully automated algorithm for segmentation of optic cup and disc contours from corresponding stereo disparity information. Because this technique does not involve human intervention, it eliminates subjective variability encountered in currently used clinical methods and provides ophthalmologists with a cost-effective and quantitative method for detection of ONH structural damage for early detection of glaucoma.
NASA Astrophysics Data System (ADS)
Brown, James M.; Campbell, J. Peter; Beers, Andrew; Chang, Ken; Donohue, Kyra; Ostmo, Susan; Chan, R. V. Paul; Dy, Jennifer; Erdogmus, Deniz; Ioannidis, Stratis; Chiang, Michael F.; Kalpathy-Cramer, Jayashree
2018-03-01
Retinopathy of prematurity (ROP) is a disease that affects premature infants, where abnormal growth of the retinal blood vessels can lead to blindness unless treated accordingly. Infants considered at risk of severe ROP are monitored for symptoms of plus disease, characterized by arterial tortuosity and venous dilation at the posterior pole, with a standard photographic definition. Disagreement among ROP experts in diagnosing plus disease has driven the development of computer-based methods that classify images based on hand-crafted features extracted from the vasculature. However, most of these approaches are semi-automated, making them time-consuming and subject to variability. In contrast, deep learning is a fully automated approach that has shown great promise in a wide variety of domains, including medical genetics, informatics and imaging. Convolutional neural networks (CNNs) are deep networks which learn rich representations of disease features that are highly robust to variations in acquisition and image quality. In this study, we utilized a U-Net architecture to perform vessel segmentation and then a GoogLeNet to perform disease classification. The classifier was trained on 3,000 retinal images and validated on an independent test set of patients with different observed progressions and treatments. We show that our fully automated algorithm can be used to monitor the progression of plus disease over multiple patient visits with results that are consistent with the experts' consensus diagnosis. Future work will aim to further validate the method on larger cohorts of patients to assess its applicability within the clinic as a treatment monitoring tool.
NASA Astrophysics Data System (ADS)
Xie, Dengling; Xie, Yanjun; Liu, Peng; Tong, Lieshu; Chu, Kaiqin; Smith, Zachary J.
2017-02-01
Current flow-based blood counting devices require expensive and centralized medical infrastructure and are not appropriate for field use. In this paper we report a method to count red blood cells, white blood cells, and platelets through a low-cost and fully-automated blood counting system. The approach consists of using a compact, custom-built microscope with a large field-of-view to record bright-field and fluorescence images of samples that are diluted with a single, stable reagent mixture and counted using automatic algorithms. Sample collection is performed manually using a spring-loaded lancet and volume-metering capillary tubes. The capillaries are then dropped into a tube of pre-measured reagents and gently shaken for 10-30 seconds. The sample is loaded into a measurement chamber and placed on a custom 3D-printed platform. Sample translation and focusing are fully automated, and a user has only to press a button for the measurement and analysis to commence. Cost of the system is minimized through the use of custom-designed motorized components. We performed a series of comparative experiments by trained and untrained users on blood from adults and children. We compare the performance of our system, as operated by trained and untrained users, to the clinical gold standard using a Bland-Altman analysis, demonstrating good agreement of our system to the clinical standard. The system's low cost, complete automation, and good field performance indicate that it can be successfully translated for use in low-resource settings where central hematology laboratories are not accessible.
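The Bland-Altman agreement analysis used above can be sketched in a few lines of numpy: the bias is the mean of the paired differences, and the 95% limits of agreement are the bias plus or minus 1.96 standard deviations. The paired counts below are hypothetical values for illustration, not data from the study.

```python
import numpy as np

def bland_altman(device, reference):
    """Bland-Altman agreement statistics between two measurement methods.

    Returns the bias (mean of the paired differences) and the 95% limits
    of agreement (bias +/- 1.96 standard deviations of the differences)."""
    device = np.asarray(device, dtype=float)
    reference = np.asarray(reference, dtype=float)
    diff = device - reference
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired WBC counts (10^9 cells/L): new device vs. clinical analyzer
device = [5.1, 6.8, 4.9, 7.2, 5.5]
reference = [5.0, 6.9, 5.1, 7.0, 5.6]
bias, (lo, hi) = bland_altman(device, reference)
```

Good agreement corresponds to a bias near zero and limits of agreement narrow enough to be clinically acceptable.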
Automated structure determination of proteins with the SAIL-FLYA NMR method.
Takeda, Mitsuhiro; Ikeya, Teppei; Güntert, Peter; Kainosho, Masatsune
2007-01-01
The labeling of proteins with stable isotopes enhances the NMR method for the determination of 3D protein structures in solution. Stereo-array isotope labeling (SAIL) provides an optimal stereospecific and regiospecific pattern of stable isotopes that yields sharpened lines, spectral simplification without loss of information, and the ability to rapidly collect and fully automatically evaluate the structural restraints required to solve a high-quality solution structure for proteins up to twice as large as those that can be analyzed using conventional methods. Here, we describe a protocol for the preparation of SAIL proteins by cell-free methods, including the preparation of S30 extract, and their automated structure analysis using the FLYA algorithm and the program CYANA. Once efficient cell-free expression of the unlabeled or uniformly labeled target protein has been achieved, the NMR sample preparation of a SAIL protein can be accomplished in 3 d. A fully automated FLYA structure calculation can be completed in 1 d on a powerful computer system.
Sochor, Jiri; Ryvolova, Marketa; Krystofova, Olga; Salas, Petr; Hubalek, Jaromir; Adam, Vojtech; Trnkova, Libuse; Havel, Ladislav; Beklova, Miroslava; Zehnalek, Josef; Provaznik, Ivo; Kizek, Rene
2010-11-29
The aim of this study was to describe the behaviour, kinetics, time courses and limitations of six different fully automated spectrometric methods: DPPH, TEAC, FRAP, DMPD, Free Radicals and Blue CrO5. Absorption curves were measured and absorbance maxima were found. All methods were calibrated using the standard compounds Trolox® and/or gallic acid. Calibration curves were determined (relative standard deviation was within the range from 1.5 to 2.5%). The obtained characteristics were compared and discussed. Moreover, the data obtained were used to optimize and automate all of the mentioned protocols. The automatic analyzer allowed us to analyse a larger set of samples simultaneously, to decrease the measurement time, to eliminate errors and to provide data of higher quality than manual analysis. The total time of analysis for one sample was decreased to 10 min for all six methods, whereas manual spectrometric determination took approximately 120 min per sample. The obtained data showed good correlations between the studied methods (R=0.97-0.99).
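The calibration and precision figures reported above (calibration curves against a Trolox standard, replicate RSD of 1.5-2.5%, inter-method R of 0.97-0.99) boil down to a linear fit, a correlation coefficient, and a relative standard deviation. A minimal numpy sketch with illustrative numbers (not data from the study):

```python
import numpy as np

# Hypothetical calibration: absorbance response of a DPPH-type assay at
# five Trolox concentrations (umol/L); values are illustrative only.
conc = np.array([50.0, 100.0, 150.0, 200.0, 250.0])
absorbance = np.array([0.12, 0.23, 0.35, 0.46, 0.58])

# Least-squares calibration line and its correlation coefficient
slope, intercept = np.polyfit(conc, absorbance, 1)
r = np.corrcoef(conc, absorbance)[0, 1]

# Relative standard deviation (%) of hypothetical replicate readings,
# the precision figure quoted as 1.5-2.5% in the abstract
replicates = np.array([0.230, 0.234, 0.227])
rsd = 100.0 * replicates.std(ddof=1) / replicates.mean()
```

An unknown sample's antioxidant capacity is then read off the fitted line as `(absorbance - intercept) / slope`, expressed in Trolox equivalents.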
A fully automated FTIR system for remote sensing of greenhouse gases in the tropics
NASA Astrophysics Data System (ADS)
Geibel, M. C.; Gerbig, C.; Feist, D. G.
2010-07-01
This article introduces a new fully automated FTIR system that is part of the Total Carbon Column Observing Network (TCCON). It will provide continuous ground-based measurements of the column-averaged volume mixing ratios of CO2, CH4 and several other greenhouse gases in the tropics. Housed in a 20-foot shipping container, it was developed as a transportable system that can be deployed almost anywhere in the world. We describe the automation concept, which relies on three autonomous subsystems and their interaction. Crucial components, such as a sturdy and reliable solar tracker dome, are described in detail. First results of total column measurements at Jena, Germany show that the instrument works well and can capture the diurnal as well as the seasonal cycle of CO2. Instrument line shape measurements with an HCl cell suggest that the instrument stays well-aligned over several months. After a short test campaign for side-by-side intercomparison with an existing TCCON instrument in Australia, the system will be transported to its final destination, Ascension Island.
Peng, Chen; Frommlet, Alexandra; Perez, Manuel; Cobas, Carlos; Blechschmidt, Anke; Dominguez, Santiago; Lingel, Andreas
2016-04-14
NMR binding assays are routinely applied in hit finding and validation during early stages of drug discovery, particularly for fragment-based lead generation. To this end, compound libraries are screened by ligand-observed NMR experiments such as STD, T1ρ, and CPMG to identify molecules interacting with a target. The analysis of a high number of complex spectra is performed largely manually and therefore represents a limiting step in hit generation campaigns. Here we report a novel integrated computational procedure that processes and analyzes ligand-observed proton and fluorine NMR binding data in a fully automated fashion. A performance evaluation comparing automated and manual analysis results on (19)F- and (1)H-detected data sets shows that the program delivers robust, high-confidence hit lists in a fraction of the time needed for manual analysis and greatly facilitates visual inspection of the associated NMR spectra. These features enable considerably higher throughput, the assessment of larger libraries, and shorter turn-around times.
Automation of surface observations program
NASA Technical Reports Server (NTRS)
Short, Steve E.
1988-01-01
At present, surface weather observing methods are still largely manual and labor intensive. Through the nationwide implementation of Automated Surface Observing Systems (ASOS), this situation can be improved. Two ASOS capability levels are planned. The first is a basic-level system which will automatically observe the weather parameters essential for aviation operations and will operate either with or without supplemental contributions by an observer. The second is a more fully automated, stand-alone system which will observe and report the full range of weather parameters and will operate primarily in the unattended mode. Approximately 250 systems are planned by the end of the decade. When deployed, these systems will generate the standard hourly and special long-line transmitted weather observations, as well as provide continuous weather information direct to airport users. Specific ASOS configurations will vary depending upon whether the operation is unattended, minimally attended, or fully attended. The major functions of ASOS are data collection, data processing, product distribution, and system control. The program phases of development, demonstration, production system acquisition, and operational implementation are described.
NASA Astrophysics Data System (ADS)
Ahmad, Habib; Sutherland, Alex; Shin, Young Shik; Hwang, Kiwook; Qin, Lidong; Krom, Russell-John; Heath, James R.
2011-09-01
Microfluidics flow-patterning has been utilized for the construction of chip-scale miniaturized DNA and protein barcode arrays. Such arrays have been used for specific clinical and fundamental investigations in which many proteins are assayed from single cells or other small sample sizes. However, flow-patterned arrays are hand-prepared, and so are impractical for broad applications. We describe an integrated robotics/microfluidics platform for the automated preparation of such arrays, and we apply it to the batch fabrication of up to eighteen chips of flow-patterned DNA barcodes. The resulting substrates are comparable in quality with hand-made arrays and exhibit excellent substrate-to-substrate consistency. We demonstrate the utility and reproducibility of robotics-patterned barcodes by utilizing two flow-patterned chips for highly parallel assays of a panel of secreted proteins from single macrophage cells.
Lemaire, C; Libert, L; Franci, X; Genon, J-L; Kuci, S; Giacomelli, F; Luxen, A
2015-06-15
An efficient, fully automated, enantioselective multi-step synthesis of no-carrier-added (nca) 6-[(18)F]fluoro-L-dopa ([(18)F]FDOPA) and 2-[(18)F]fluoro-L-tyrosine ([(18)F]FTYR) on a GE FASTlab synthesizer, in conjunction with an additional high-performance liquid chromatography (HPLC) purification, has been developed. A phase-transfer catalyst (PTC) strategy was used to synthesize these two important radiopharmaceuticals. Building on recent chemistry improvements, the whole process was automated on a commercially available GE FASTlab module, with slight hardware modification, using single-use cassettes and a stand-alone HPLC. [(18)F]FDOPA and [(18)F]FTYR were produced in 36.3 ± 3.0% (n = 8) and 50.5 ± 2.7% (n = 10) FASTlab radiochemical yield (decay corrected). The automated radiosynthesis on the FASTlab module requires about 52 min; the total synthesis time including HPLC purification and formulation was about 62 min. Enantiomeric excesses for these two aromatic amino acids were always >95%, and the specific activity was >740 GBq/µmol. This automated synthesis provides high amounts of [(18)F]FDOPA and [(18)F]FTYR (>37 GBq at end of synthesis (EOS)). The process, fully adaptable for reliable production across multiple PET sites, could be readily implemented in a clinical good manufacturing practice (GMP) environment. Copyright © 2015 John Wiley & Sons, Ltd.
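The "decay corrected" qualifier on the yields above means the product activity is referred back to the start of synthesis using the fluorine-18 half-life (109.77 min, a physical constant). A small sketch; the starting and final activities below are illustrative, not figures from the paper:

```python
import math

F18_HALF_LIFE_MIN = 109.77  # fluorine-18 half-life in minutes

def decay_corrected_yield(start_activity, end_activity, synthesis_min):
    """Radiochemical yield corrected for F-18 decay during synthesis.

    Divides the crude yield by the decay factor exp(-ln2 * t / T_half),
    i.e. refers the product activity back to the start of synthesis."""
    decay_factor = math.exp(-math.log(2) * synthesis_min / F18_HALF_LIFE_MIN)
    return end_activity / (start_activity * decay_factor)

# Hypothetical run: 150 GBq of [18F]fluoride at start of synthesis,
# 37 GBq of product after a 62 min synthesis (illustrative numbers)
rcy = decay_corrected_yield(150.0, 37.0, 62.0)  # fraction, ~0.36
```

Without the correction the same run would report only 37/150 ≈ 25% crude yield, which is why decay-corrected figures are the standard basis for comparing syntheses of different durations.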
Taminau, Jonatan; Meganck, Stijn; Lazar, Cosmin; Steenhoff, David; Coletta, Alain; Molter, Colin; Duque, Robin; de Schaetzen, Virginie; Weiss Solís, David Y; Bersini, Hugues; Nowé, Ann
2012-12-24
With an abundant amount of microarray gene expression data sets available through public repositories, new possibilities lie in combining multiple existing data sets. In this new context, analysis itself is no longer the problem, but retrieving and consistently integrating all this data before delivering it to the wide variety of existing analysis tools becomes the new bottleneck. We present the newly released inSilicoMerging R/Bioconductor package which, together with the earlier released inSilicoDb R/Bioconductor package, allows consistent retrieval, integration and analysis of publicly available microarray gene expression data sets. Inside the inSilicoMerging package a set of five visual and six quantitative validation measures are available as well. By providing (i) access to uniformly curated and preprocessed data, (ii) a collection of techniques to remove the batch effects between data sets from different sources, and (iii) several validation tools enabling the inspection of the integration process, these packages enable researchers to fully explore the potential of combining gene expression data for downstream analysis. The power of using both packages is demonstrated by programmatically retrieving and integrating gene expression studies from the InSilico DB repository [https://insilicodb.org/app/].
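The batch-effect removal step mentioned in (ii) can be illustrated with the simplest member of that family, per-batch mean-centering of each gene: the inSilicoMerging package is an R/Bioconductor tool, so this standalone numpy sketch only mirrors the idea, and the toy matrix is invented for illustration.

```python
import numpy as np

def mean_center_batches(expr, batch_labels):
    """Remove additive batch effects by centering each gene (row) on its
    per-batch mean -- batch mean-centering, the simplest of the correction
    techniques of the kind bundled in inSilicoMerging."""
    expr = np.array(expr, dtype=float)
    batch_labels = np.asarray(batch_labels)
    for b in np.unique(batch_labels):
        cols = batch_labels == b
        expr[:, cols] -= expr[:, cols].mean(axis=1, keepdims=True)
    return expr

# Toy matrix: one gene, four samples; batch "B" carries a +2 additive offset
merged = [[1.0, 2.0, 3.0, 4.0]]
corrected = mean_center_batches(merged, ["A", "A", "B", "B"])
```

After centering, both batches share a zero mean per gene, so downstream analysis sees the within-batch variation rather than the batch offset.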
Fully Automated Anesthesia, Analgesia and Fluid Management
2017-01-03
General Anesthetic Drug Overdose; Adverse Effect of Intravenous Anesthetics, Sequela; Complication of Anesthesia; Drug Delivery System Malfunction; Hemodynamic Instability; Underdosing of Other General Anesthetics
NASA Technical Reports Server (NTRS)
Tenney, Yvette J.; Rogers, William H.; Pew, Richard W.
1995-01-01
There has been much concern in recent years about the rapid increase in automation on commercial flight decks. To address these issues, a survey of pilots was conducted, composed of three major sections. The first section asked pilots to rate different automation components that exist on the latest commercial aircraft regarding their obtrusiveness and the attention and effort required in using them. The second section addressed general 'automation philosophy' issues. The third section focused on issues related to levels and amount of automation. The results indicate that pilots of advanced aircraft like their automation, use it, and would welcome more automation. However, they also believe that automation has many disadvantages, especially fully autonomous automation. They want their automation to be simple and reliable and to produce predictable results. The biggest needs for higher levels of automation were in pre-flight, communication, systems management, and task management functions, in planning as well as response tasks, and in high workload situations. There is an irony and a challenge in the implications of these findings. On the one hand pilots would like new automation to be simple and reliable, but they need it to support the most complex part of the job--managing and planning tasks in high workload situations.
NASA Astrophysics Data System (ADS)
Gilat-Schmidt, Taly; Wang, Adam; Coradi, Thomas; Haas, Benjamin; Star-Lack, Josh
2016-03-01
The overall goal of this work is to develop a rapid, accurate and fully automated software tool to estimate patient-specific organ doses from computed tomography (CT) scans using a deterministic Boltzmann Transport Equation solver and automated CT segmentation algorithms. This work quantified the accuracy of organ dose estimates obtained by an automated segmentation algorithm. The investigated algorithm uses a combination of feature-based and atlas-based methods. A multiatlas approach was also investigated. We hypothesize that the auto-segmentation algorithm is sufficiently accurate to provide organ dose estimates since random errors at the organ boundaries will average out when computing the total organ dose. To test this hypothesis, twenty head-neck CT scans were expertly segmented into nine regions. A leave-one-out validation study was performed, where every case was automatically segmented with each of the remaining cases used as the expert atlas, resulting in nineteen automated segmentations for each of the twenty datasets. The segmented regions were applied to gold-standard Monte Carlo dose maps to estimate mean and peak organ doses. The results demonstrated that the fully automated segmentation algorithm estimated the mean organ dose to within 10% of the expert segmentation for regions other than the spinal canal, with median error for each organ region below 2%. In the spinal canal region, the median error was 7% across all data sets and atlases, with a maximum error of 20%. The error in peak organ dose was below 10% for all regions, with a median error below 4% for all organ regions. The multiple-case atlas reduced the variation in the dose estimates and additional improvements may be possible with more robust multi-atlas approaches. Overall, the results support potential feasibility of an automated segmentation algorithm to provide accurate organ dose estimates.
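The final dose-extraction step described above, applying a segmented organ region to a voxel-wise dose map to obtain mean and peak organ dose, is straightforward to sketch. This is an illustration of that step only, with a toy dose map, not the authors' Boltzmann-solver pipeline:

```python
import numpy as np

def organ_dose(dose_map, organ_mask):
    """Mean and peak dose within a segmented organ region.

    dose_map  : 3D array of absorbed dose per voxel (e.g. from Monte Carlo)
    organ_mask: boolean array of the same shape marking the organ's voxels"""
    voxels = dose_map[organ_mask]
    return voxels.mean(), voxels.max()

# Toy example: a 4x4x4 dose map with a 2x2x2 "organ" in one corner
dose_map = np.arange(64, dtype=float).reshape(4, 4, 4)
mask = np.zeros((4, 4, 4), dtype=bool)
mask[:2, :2, :2] = True
mean_dose, peak_dose = organ_dose(dose_map, mask)
```

The paper's hypothesis follows from the mean: voxels misassigned at the organ boundary shift the average only slightly, whereas the peak dose is more sensitive to segmentation error, consistent with the larger errors reported for peak values.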
Raterink, Robert-Jan; Witkam, Yoeri; Vreeken, Rob J; Ramautar, Rawi; Hankemeier, Thomas
2014-10-21
In the field of bioanalysis, there is an increasing demand for miniaturized, automated, robust sample pretreatment procedures that can be easily connected to direct-infusion mass spectrometry (DI-MS) in order to allow the high-throughput screening of drugs and/or their metabolites in complex body fluids like plasma. Liquid-liquid extraction (LLE) is a common sample pretreatment technique often used for complex aqueous samples in bioanalysis. Despite significant developments in automated and miniaturized LLE procedures, fully automated LLE techniques allowing high-throughput bioanalytical studies on small-volume samples using direct-infusion mass spectrometry have not yet matured. Here, we introduce a new fully automated micro-LLE technique based on gas-pressure assisted mixing followed by passive phase separation, coupled online to nanoelectrospray-DI-MS. Our method was characterized by varying the gas flow and its duration through the solvent mixture. For evaluation of the analytical performance, four drugs were spiked into human plasma, resulting in highly acceptable precision (RSD down to 9%) and linearity (R(2) ranging from 0.990 to 0.998). We demonstrate that our new method not only allows the reliable extraction of analytes from small sample volumes of a few microliters in an automated and high-throughput manner, but also performs comparably to or better than conventional offline LLE, in which the handling of small volumes remains challenging. Finally, we demonstrate the applicability of our method for drug screening on dried blood spots, showing excellent linearity (R(2) of 0.998) and precision (RSD of 9%). In conclusion, we present a proof of principle of a new high-throughput screening platform for bioanalysis based on a new automated micro-LLE method, coupled online to a commercially available nano-ESI-DI-MS.
NASA Astrophysics Data System (ADS)
Polan, Daniel F.; Brady, Samuel L.; Kaufman, Robert A.
2016-09-01
There is a need for robust, fully automated whole body organ segmentation for diagnostic CT. This study investigates and optimizes a Random Forest algorithm for automated organ segmentation; explores the limitations of a Random Forest algorithm applied to the CT environment; and demonstrates segmentation accuracy in a feasibility study of pediatric and adult patients. To the best of our knowledge, this is the first study to investigate a trainable Weka segmentation (TWS) implementation using Random Forest machine-learning as a means to develop a fully automated tissue segmentation tool developed specifically for pediatric and adult examinations in a diagnostic CT environment. Current innovation in computed tomography (CT) is focused on radiomics, patient-specific radiation dose calculation, and image quality improvement using iterative reconstruction, all of which require specific knowledge of tissue and organ systems within a CT image. The purpose of this study was to develop a fully automated Random Forest classifier algorithm for segmentation of neck-chest-abdomen-pelvis CT examinations based on pediatric and adult CT protocols. Seven materials were classified: background, lung/internal air or gas, fat, muscle, solid organ parenchyma, blood/contrast enhanced fluid, and bone tissue using Matlab and the TWS plugin of FIJI. The following classifier feature filters of TWS were investigated: minimum, maximum, mean, and variance evaluated over a voxel radius of 2^n (n from 0 to 4), along with noise reduction and edge preserving filters: Gaussian, bilateral, Kuwahara, and anisotropic diffusion. The Random Forest algorithm used 200 trees with 2 features randomly selected per node. The optimized auto-segmentation algorithm resulted in 16 image features including features derived from maximum, mean, variance Gaussian and Kuwahara filters.
Dice similarity coefficient (DSC) calculations between manually segmented and Random Forest algorithm segmented images from 21 patient image sections were analyzed. The automated algorithm produced segmentation of seven material classes with a median DSC of 0.86 ± 0.03 for pediatric patient protocols, and 0.85 ± 0.04 for adult patient protocols. Additionally, 100 randomly selected patient examinations were segmented and analyzed, and a mean sensitivity of 0.91 (range: 0.82-0.98), specificity of 0.89 (range: 0.70-0.98), and accuracy of 0.90 (range: 0.76-0.98) were demonstrated. In this study, we demonstrate that this fully automated segmentation tool was able to produce fast and accurate segmentation of the neck and trunk of the body over a wide range of patient habitus and scan parameters.
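The evaluation metrics used above (Dice similarity coefficient, sensitivity, specificity) are standard for binary segmentation masks and can be sketched in a few lines; the tiny masks below are invented for illustration.

```python
import numpy as np

def dice(pred, truth):
    """Dice similarity coefficient between two binary masks:
    DSC = 2|A intersect B| / (|A| + |B|)."""
    pred, truth = np.asarray(pred, bool), np.asarray(truth, bool)
    inter = np.logical_and(pred, truth).sum()
    return 2.0 * inter / (pred.sum() + truth.sum())

def sensitivity_specificity(pred, truth):
    """Per-class sensitivity (TP rate) and specificity (TN rate)."""
    pred, truth = np.asarray(pred, bool), np.asarray(truth, bool)
    tp = np.logical_and(pred, truth).sum()
    tn = np.logical_and(~pred, ~truth).sum()
    return tp / truth.sum(), tn / (~truth).sum()

# Toy 8-voxel masks: manual ground truth vs. automated prediction
truth = np.array([0, 1, 1, 1, 0, 0, 1, 0], bool)
pred  = np.array([0, 1, 1, 0, 0, 1, 1, 0], bool)
d = dice(pred, truth)                               # 2*3 / (4+4) = 0.75
sens, spec = sensitivity_specificity(pred, truth)   # 0.75, 0.75
```

For a multi-class result like the seven-material segmentation above, these are computed one-vs-rest per class and then summarized (e.g. as the median DSC reported).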
Long-term maintenance of human induced pluripotent stem cells by automated cell culture system.
Konagaya, Shuhei; Ando, Takeshi; Yamauchi, Toshiaki; Suemori, Hirofumi; Iwata, Hiroo
2015-11-17
Pluripotent stem cells, such as embryonic stem cells and induced pluripotent stem (iPS) cells, are regarded as new sources for cell replacement therapy. These cells can expand without limit under undifferentiated conditions and be differentiated into multiple cell types. Automated culture systems enable the large-scale production of cells. In addition to reducing the time and effort of researchers, an automated culture system improves the reproducibility of cell cultures. In the present study, we designed a new fully automated cell culture system for human iPS cell maintenance. Using the automated culture system, human iPS cells maintained their undifferentiated state for 60 days. Automatically prepared human iPS cells retained the potential to differentiate into cells of all three germ layers, including dopaminergic neurons and pancreatic cells.
Lee, Hyunkwang; Troschel, Fabian M; Tajmir, Shahein; Fuchs, Georg; Mario, Julia; Fintelmann, Florian J; Do, Synho
2017-08-01
Pretreatment risk stratification is key for personalized medicine. While many physicians rely on an "eyeball test" to assess whether patients will tolerate major surgery or chemotherapy, "eyeballing" is inherently subjective and difficult to quantify. The concept of morphometric age derived from cross-sectional imaging has been found to correlate well with outcomes such as length of stay, morbidity, and mortality. However, the determination of the morphometric age is time intensive and requires highly trained experts. In this study, we propose a fully automated deep learning system for the segmentation of skeletal muscle cross-sectional area (CSA) on an axial computed tomography image taken at the third lumbar vertebra. We utilized a fully automated deep segmentation model derived from an extended implementation of a fully convolutional network with weight initialization of an ImageNet pre-trained model, followed by post processing to eliminate intramuscular fat for a more accurate analysis. This experiment was conducted by varying window level (WL), window width (WW), and bit resolutions in order to better understand the effects of the parameters on the model performance. Our best model, fine-tuned on 250 training images and ground truth labels, achieves 0.93 ± 0.02 Dice similarity coefficient (DSC) and 3.68 ± 2.29% difference between predicted and ground truth muscle CSA on 150 held-out test cases. Ultimately, the fully automated segmentation system can be embedded into the clinical environment to accelerate the quantification of muscle and expanded to volume analysis of 3D datasets.
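Once the deep segmentation model has produced a binary muscle mask, the cross-sectional area itself is just the pixel count scaled by the in-plane pixel spacing. This sketch shows only that final measurement step, with a toy mask and a hypothetical spacing, not the network described above:

```python
import numpy as np

def cross_sectional_area(mask, pixel_spacing_mm):
    """Cross-sectional area (cm^2) of a binary segmentation mask on one
    axial CT slice, given in-plane pixel spacing (row, col) in mm."""
    area_mm2 = mask.sum() * pixel_spacing_mm[0] * pixel_spacing_mm[1]
    return area_mm2 / 100.0  # mm^2 -> cm^2

# Toy mask: 2000 segmented pixels at a hypothetical 0.8 mm x 0.8 mm spacing
mask = np.zeros((512, 512), dtype=bool)
mask[100:140, 100:150] = True            # 40 * 50 = 2000 pixels
csa = cross_sectional_area(mask, (0.8, 0.8))
```

In practice the spacing comes from the CT image metadata (the DICOM PixelSpacing attribute), so the same mask on differently sampled scans yields a comparable physical area.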
Fully automated adipose tissue measurement on abdominal CT
NASA Astrophysics Data System (ADS)
Yao, Jianhua; Sussman, Daniel L.; Summers, Ronald M.
2011-03-01
Obesity has become widespread in America and has been identified as a risk factor for many illnesses. Adipose tissue (AT) content, especially visceral AT (VAT), is an important indicator of risk for many disorders, including heart disease and diabetes. Measuring AT with traditional means is often unreliable and inaccurate. CT provides a means to measure AT accurately and consistently. We present a fully automated method to segment and measure abdominal AT in CT. Our method integrates image preprocessing which attempts to correct for image artifacts and inhomogeneities. We use fuzzy c-means to cluster AT regions and active contour models to separate subcutaneous and visceral AT. We tested our method on 50 abdominal CT scans and evaluated the correlations between several measurements.
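Fuzzy c-means, the clustering step named above, alternates between updating cluster centers as membership-weighted means and updating memberships from inverse distances. A minimal 1-D sketch of the standard algorithm on invented Hounsfield-like intensities (not the paper's implementation, which also includes preprocessing and active contours):

```python
import numpy as np

def fuzzy_cmeans(x, n_clusters=2, m=2.0, n_iter=50, seed=0):
    """Minimal fuzzy c-means for 1-D data.

    Alternates the standard FCM updates: centers are membership^m-weighted
    means, memberships are normalized d^(-2/(m-1)) over distances d to each
    center. Returns (centers, membership matrix U of shape n_samples x c)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, float).reshape(-1, 1)
    u = rng.random((len(x), n_clusters))
    u /= u.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        um = u ** m
        centers = (um.T @ x) / um.sum(axis=0)[:, None]
        d = np.abs(x - centers.T) + 1e-12        # distances to each center
        u = 1.0 / (d ** (2.0 / (m - 1.0)))
        u /= u.sum(axis=1, keepdims=True)
    return centers.ravel(), u

# Toy intensities: fat-like (~ -100 HU) and muscle-like (~ 50 HU) voxels
vals = np.array([-105.0, -100.0, -95.0, 45.0, 50.0, 55.0])
centers, u = fuzzy_cmeans(vals, n_clusters=2)
```

Unlike hard k-means, every voxel keeps a graded membership in each cluster, which is useful at tissue boundaries where partial-volume voxels are genuinely mixed.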
Srinivasan, Pratul P.; Kim, Leo A.; Mettu, Priyatham S.; Cousins, Scott W.; Comer, Grant M.; Izatt, Joseph A.; Farsiu, Sina
2014-01-01
We present a novel fully automated algorithm for the detection of retinal diseases via optical coherence tomography (OCT) imaging. Our algorithm utilizes multiscale histograms of oriented gradient descriptors as feature vectors of a support vector machine based classifier. The spectral domain OCT data sets used for cross-validation consisted of volumetric scans acquired from 45 subjects: 15 normal subjects, 15 patients with dry age-related macular degeneration (AMD), and 15 patients with diabetic macular edema (DME). Our classifier correctly identified 100% of cases with AMD, 100% cases with DME, and 86.67% cases of normal subjects. This algorithm is a potentially impactful tool for the remote diagnosis of ophthalmic diseases. PMID:25360373
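The feature vectors above are multiscale histograms of oriented gradients (HOG). A simplified, single-cell version of that descriptor can be sketched directly in numpy; full HOG additionally computes such histograms per cell over a grid and block-normalizes them, and the study feeds the multiscale versions to an SVM. The patch below is a toy vertical edge, not OCT data.

```python
import numpy as np

def orientation_histogram(image, n_bins=9):
    """Gradient-orientation histogram over one image patch (a simplified,
    single-cell stand-in for the HOG descriptor): each pixel votes its
    gradient magnitude into one of n_bins unsigned-orientation bins."""
    gy, gx = np.gradient(image.astype(float))
    magnitude = np.hypot(gx, gy)
    angle = np.degrees(np.arctan2(gy, gx)) % 180.0   # unsigned, [0, 180)
    bins = np.minimum((angle / (180.0 / n_bins)).astype(int), n_bins - 1)
    hist = np.zeros(n_bins)
    np.add.at(hist, bins.ravel(), magnitude.ravel())
    return hist / (np.linalg.norm(hist) + 1e-12)     # L2 normalization

# A vertical-edge patch: all gradient energy lands in one orientation bin
patch = np.tile(np.array([0.0, 0.0, 1.0, 1.0]), (4, 1))
h = orientation_histogram(patch)
```

Because retinal pathologies such as drusen and edema alter local edge structure, histograms like these differ systematically between normal, AMD, and DME volumes, which is what the SVM classifier exploits.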
NASA Technical Reports Server (NTRS)
Miller, R. H.; Minsky, M. L.; Smith, D. B. S.
1982-01-01
Applications of automation, robotics, and machine intelligence systems (ARAMIS) to space activities and their related ground support functions are studied, so that informed decisions can be made on which aspects of ARAMIS to develop. The specific tasks which will be required by future space project tasks are identified and the relative merits of these options are evaluated. The ARAMIS options defined and researched span the range from fully human to fully machine, including a number of intermediate options (e.g., humans assisted by computers, and various levels of teleoperation). By including this spectrum, the study searches for the optimum mix of humans and machines for space project tasks.
Automated Medical Supply Chain Management: A Remedy for Logistical Shortcomings
2016-08-01
Regional case study in which the hospital compared its utilization of automated inventory management technologies (Pyxis) to previous SCM practice in the... management practices within the 96 Medical Group (MDG), Eglin Hospital. It was known that the Defense Medical Logistics Standard was used at Eglin... Hospital but was not fully integrated down to the unit level. Casual manual inventory management practices were used, explicitly resulting in
Greg C. Liknes; Dacia M. Meneguzzo; Todd A. Kellerman
2017-01-01
Windbreaks are an important ecological resource across the large expanse of agricultural land in the central United States and are often planted in straight-line or L-shaped configurations to serve specific functions. As high-resolution (i.e., <5 m) land cover datasets become more available for these areas, semi-or fully-automated methods for distinguishing...
Lab-on-a-Chip Proteomic Assays for Psychiatric Disorders.
Peter, Harald; Wienke, Julia; Guest, Paul C; Bistolas, Nikitas; Bier, Frank F
2017-01-01
Lab-on-a-chip assays allow rapid identification of multiple parameters on an automated user-friendly platform. Here we describe a fully automated multiplex immunoassay and readout in less than 15 min using the Fraunhofer in vitro diagnostics (ivD) platform to enable inexpensive point-of-care profiling of sera or a single drop of blood from patients with various diseases such as psychiatric disorders.
Automated optimization techniques for aircraft synthesis
NASA Technical Reports Server (NTRS)
Vanderplaats, G. N.
1976-01-01
Application of numerical optimization techniques to automated conceptual aircraft design is examined. These methods are shown to be a general and efficient way to obtain quantitative information for evaluating alternative new vehicle projects. Fully automated design is compared with traditional point design methods and time and resource requirements for automated design are given. The NASA Ames Research Center aircraft synthesis program (ACSYNT) is described with special attention to calculation of the weight of a vehicle to fly a specified mission. The ACSYNT procedures for automatically obtaining sensitivity of the design (aircraft weight, performance and cost) to various vehicle, mission, and material technology parameters are presented. Examples are used to demonstrate the efficient application of these techniques.
The automation of an inlet mass flow control system
NASA Technical Reports Server (NTRS)
Supplee, Frank; Tcheng, Ping; Weisenborn, Michael
1989-01-01
The automation of a closed-loop computer-controlled system for the inlet mass flow system (IMFS) developed for a wind tunnel facility at Langley Research Center is presented. This new PC-based control system is intended to replace the manual control system presently in use, in order to fully automate the plug positioning of the IMFS during wind tunnel testing. Provision is also made for communication between the PC and a host computer in order to allow total automation of the plug positioning and data acquisition during the complete sequence of predetermined plug locations. As extensive running time is programmed for the IMFS, this new automated system will save both manpower and tunnel running time.
NMR-based automated protein structure determination.
Würz, Julia M; Kazemi, Sina; Schmidt, Elena; Bagaria, Anurag; Güntert, Peter
2017-08-15
NMR spectra analysis for protein structure determination can now in many cases be performed by automated computational methods. This overview of the computational methods for NMR protein structure analysis presents recent automated methods for signal identification in multidimensional NMR spectra, sequence-specific resonance assignment, collection of conformational restraints, and structure calculation, as implemented in the CYANA software package. These algorithms are sufficiently reliable and integrated into one software package to enable the fully automated structure determination of proteins starting from NMR spectra without manual interventions or corrections at intermediate steps, with an accuracy of 1-2 Å backbone RMSD in comparison with manually solved reference structures. Copyright © 2017 Elsevier Inc. All rights reserved.
Kume, Teruyoshi; Kim, Byeong-Keuk; Waseda, Katsuhisa; Sathyanarayana, Shashidhar; Li, Wenguang; Teo, Tat-Jin; Yock, Paul G; Fitzgerald, Peter J; Honda, Yasuhiro
2013-02-01
The aim of this study was to evaluate a new fully automated lumen border tracing system based on a novel multifrequency processing algorithm. We developed the multifrequency processing method to enhance arterial lumen detection by exploiting the differential scattering characteristics of blood and arterial tissue. The implementation of the method can be integrated into current intravascular ultrasound (IVUS) hardware. This study was performed in vivo with conventional 40-MHz IVUS catheters (Atlantis SR Pro™, Boston Scientific Corp, Natick, MA) in 43 clinical patients with coronary artery disease. A total of 522 frames were randomly selected, and lumen areas were measured after automatically tracing lumen borders with the new tracing system and a commercially available tracing system (TraceAssist™) referred to as the "conventional tracing system." The data assessed by the two automated systems were compared with the results of manual tracings by experienced IVUS analysts. New automated lumen measurements showed better agreement with manual lumen area tracings compared with those of the conventional tracing system (correlation coefficient: 0.819 vs. 0.509). When compared against manual tracings, the new algorithm also demonstrated improved systematic error (mean difference: 0.13 vs. -1.02 mm²) and random variability (standard deviation of difference: 2.21 vs. 4.02 mm²) compared with the conventional tracing system. This preliminary study showed that the novel fully automated tracing system based on the multifrequency processing algorithm can provide more accurate lumen border detection than current automated tracing systems and thus offer a more reliable quantitative evaluation of lumen geometry. Copyright © 2011 Wiley Periodicals, Inc.
Yousef Kalafi, Elham; Tan, Wooi Boon; Town, Christopher; Dhillon, Sarinder Kaur
2016-12-22
Monogeneans are flatworms (Platyhelminthes) that are primarily found on the gills and skin of fishes. Monogenean parasites have attachment appendages at their haptoral regions that help them move about the body surface and feed on skin and gill debris. Haptoral attachment organs consist of sclerotized hard parts such as hooks, anchors and marginal hooks. Monogenean species are differentiated based on the morphological characters of their haptoral bars, anchors, marginal hooks and reproductive parts (male and female copulatory organs), as well as soft anatomical parts. The complex structure of these diagnostic organs, and their overlap in microscopic digital images, are impediments to developing a fully automated identification system for monogeneans (LNCS 7666:256-263, 2012), (ISDA: 457-462, 2011), (J Zoolog Syst Evol Res 52(2):95-99, 2013). In this study, images of hard parts of the haptoral organs, such as bars and anchors, were used to develop a fully automated identification technique for monogenean species by implementing image processing techniques and machine learning methods. Images of four monogenean species, namely Sinodiplectanotrema malayanus, Trianchoratus pahangensis, Metahaliotrema mizellei and Metahaliotrema sp. (undescribed), were used to develop the automated technique. K-nearest neighbour (KNN) was applied to classify the monogenean specimens based on the extracted features. 50% of the dataset was used for training and the other 50% for testing in system evaluation. Our approach demonstrated an overall classification accuracy of 90%. Leave One Out (LOO) cross-validation was also used to validate the system, giving an accuracy of 91.25%. The methods presented in this study facilitate fast and accurate fully automated classification of monogeneans at the species level.
In future studies more classes will be included in the model, the time to capture the monogenean images will be reduced and improvements in extraction and selection of features will be implemented.
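The classification and validation scheme described above (k-NN plus leave-one-out cross-validation) can be sketched as follows. The 2-D toy features standing in for the haptoral shape descriptors, and the choice of k, are hypothetical, not the study's actual feature set.

```python
import numpy as np

def knn_predict(train_X, train_y, x, k=3):
    """Classify one feature vector by majority vote of its k nearest
    training samples (Euclidean distance)."""
    d = np.linalg.norm(train_X - x, axis=1)
    nearest = train_y[np.argsort(d)[:k]]
    return np.bincount(nearest).argmax()

def loo_accuracy(X, y, k=3):
    """Leave-one-out cross-validation: hold out each sample in turn,
    train on the rest, and count correct predictions."""
    hits = sum(
        knn_predict(np.delete(X, i, 0), np.delete(y, i), X[i], k) == y[i]
        for i in range(len(y))
    )
    return hits / len(y)

# Toy 2-D features for three well-separated "species"
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(c, 0.3, (20, 2)) for c in (0.0, 2.0, 4.0)])
y = np.repeat([0, 1, 2], 20)
acc = loo_accuracy(X, y)
```

LOO is attractive for small taxonomic datasets like this one because every specimen is used for both training and testing without leakage within a fold.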
NCBI GEO: mining tens of millions of expression profiles--database and tools update.
Barrett, Tanya; Troup, Dennis B; Wilhite, Stephen E; Ledoux, Pierre; Rudnev, Dmitry; Evangelista, Carlos; Kim, Irene F; Soboleva, Alexandra; Tomashevsky, Maxim; Edgar, Ron
2007-01-01
The Gene Expression Omnibus (GEO) repository at the National Center for Biotechnology Information (NCBI) archives and freely disseminates microarray and other forms of high-throughput data generated by the scientific community. The database has a minimum information about a microarray experiment (MIAME)-compliant infrastructure that captures fully annotated raw and processed data. Several data deposit options and formats are supported, including web forms, spreadsheets, XML and Simple Omnibus Format in Text (SOFT). In addition to data storage, a collection of user-friendly web-based interfaces and applications are available to help users effectively explore, visualize and download the thousands of experiments and tens of millions of gene expression patterns stored in GEO. This paper provides a summary of the GEO database structure and user facilities, and describes recent enhancements to database design, performance, submission format options, data query and retrieval utilities. GEO is accessible at http://www.ncbi.nlm.nih.gov/geo/
Tissue microarrays and digital image analysis.
Ryan, Denise; Mulrane, Laoighse; Rexhepaj, Elton; Gallagher, William M
2011-01-01
Tissue microarrays (TMAs) have recently emerged as very valuable tools for high-throughput pathological assessment, especially in the cancer research arena. This important technology, however, has yet to fully penetrate into the area of toxicology. Here, we describe the creation of TMAs representative of samples produced from conventional toxicology studies within a large-scale, multi-institutional pan-European project, PredTox. PredTox, short for Predictive Toxicology, formed part of an EU FP6 Integrated Project, Innovative Medicines for Europe (InnoMed), and aimed to study pre-clinically 16 compounds of known liver and/or kidney toxicity. In more detail, TMAs were constructed from materials corresponding to the full face sections of liver and kidney from rats treated with different drug candidates by members of the consortium. We also describe the process of digital slide scanning of kidney and liver sections, in the context of creating an online resource of histopathological data.
Polonchuk, Liudmila
2014-01-01
Patch-clamping is a powerful technique for investigating ion channel function and regulation. However, its low throughput has hampered profiling of large compound series in early drug development. Fortunately, automation has revolutionized the area of experimental electrophysiology over the past decade. Whereas the first automated patch-clamp instruments using planar patch-clamp technology demonstrated rather moderate throughput, several second-generation automated platforms recently launched by various companies have significantly increased the ability to form a high number of high-resistance seals. Among them is the SyncroPatch® 96 (Nanion Technologies GmbH, Munich, Germany), a fully automated giga-seal patch-clamp system with the highest throughput on the market. By recording from up to 96 cells simultaneously, the SyncroPatch® 96 allows throughput to be substantially increased without compromising data quality. This chapter describes the features of this innovative automated electrophysiology system and the protocols used for a successful transfer of the established hERG assay to this high-throughput automated platform.
Ultramap v3 - a Revolution in Aerial Photogrammetry
NASA Astrophysics Data System (ADS)
Reitinger, B.; Sormann, M.; Zebedin, L.; Schachinger, B.; Hoefler, M.; Tomasi, R.; Lamperter, M.; Gruber, B.; Schiester, G.; Kobald, M.; Unger, M.; Klaus, A.; Bernoegger, S.; Karner, K.; Wiechert, A.; Ponticelli, M.; Gruber, M.
2012-07-01
In recent years, Microsoft has driven innovation in the aerial photogrammetry community. Besides the market-leading camera technology, UltraMap has grown into an outstanding photogrammetric workflow system which enables users to work effectively with large digital aerial image blocks in a highly automated way. The best example is the project-based color balancing approach, which automatically balances images to a homogeneous block. UltraMap V3 continues this innovation and offers a revolution in terms of ortho processing. A fully automated dense matching module produces high-precision digital surface models (DSMs), which are calculated either on CPUs or on GPUs using a distributed processing framework. By applying constrained filtering algorithms, a digital terrain model can be derived which in turn can be used for fully automated traditional ortho texturing. With knowledge of the underlying geometry, seamlines can be generated automatically by applying cost functions in order to minimize visually disturbing artifacts. By exploiting the generated DSM information, a DSMOrtho is created using the balanced input images. Again, seamlines are detected automatically, resulting in an automatically balanced ortho mosaic. Interactive block-based radiometric adjustments lead to a high-quality ortho product based on UltraCam imagery. UltraMap v3 is the first fully integrated and interactive solution for fully supporting UltraCam images in order to deliver DSM and ortho imagery.
Wang, Xinggang; Yang, Wei; Weinreb, Jeffrey; Han, Juan; Li, Qiubai; Kong, Xiangchuang; Yan, Yongluan; Ke, Zan; Luo, Bo; Liu, Tao; Wang, Liang
2017-11-13
Prostate cancer (PCa) is a major cause of death, documented since ancient times in imaging of Egyptian Ptolemaic mummies. PCa detection is critical to personalized medicine, and its appearance varies considerably on MRI scans. 172 patients with 2,602 morphologic images (axial 2D T2-weighted imaging) of the prostate were studied. A deep learning method using a deep convolutional neural network (DCNN) and a non-deep-learning method using SIFT image features with a bag-of-words (BoW) model, a representative method for image recognition and analysis, were used to distinguish pathologically confirmed PCa patients from patients with prostate benign conditions (BCs), such as prostatitis or benign prostatic hyperplasia (BPH). In fully automated detection of PCa patients, deep learning had a statistically higher area under the receiver operating characteristic curve (AUC) than non-deep learning (P = 0.0007). The AUCs were 0.84 (95% CI 0.78-0.89) for the deep learning method and 0.70 (95% CI 0.63-0.77) for the non-deep-learning method, respectively. Our results suggest that deep learning with a DCNN is superior to non-deep learning with SIFT image features and a BoW model for fully automated differentiation of PCa patients from prostate BCs patients. Our deep learning method is extensible to image modalities such as MR imaging, CT and PET of other organs.
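The AUC metric used to compare the two classifiers can be computed directly from the scores via its Mann-Whitney interpretation. The sketch below is generic; the example score lists are made up, not the study's data.

```python
import numpy as np

def auc(scores_pos, scores_neg):
    """AUC as the probability that a randomly chosen positive case scores
    higher than a randomly chosen negative case; ties count half."""
    pos = np.asarray(scores_pos, float)[:, None]
    neg = np.asarray(scores_neg, float)[None, :]
    wins = (pos > neg).sum() + 0.5 * (pos == neg).sum()
    return wins / (pos.size * neg.size)

# Hypothetical classifier scores for cancer (positive) vs benign (negative) cases
a = auc([0.9, 0.8, 0.7, 0.4], [0.5, 0.3, 0.2])
```

An AUC of 0.5 means the scores are uninformative and 1.0 means perfect ranking, which is why the paper reports 0.84 vs 0.70 as a meaningful gap.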
Suppa, Per; Hampel, Harald; Spies, Lothar; Fiebach, Jochen B; Dubois, Bruno; Buchert, Ralph
2015-01-01
Hippocampus volumetry based on magnetic resonance imaging (MRI) has not yet been translated into everyday clinical diagnostic patient care, at least in part due to limited availability of appropriate software tools. In the present study, we evaluate a fully-automated and computationally efficient processing pipeline for atlas-based hippocampal volumetry using freely available Statistical Parametric Mapping (SPM) software in 198 amnestic mild cognitive impairment (MCI) subjects from the Alzheimer's Disease Neuroimaging Initiative (ADNI1). Subjects were grouped into MCI stable and MCI to probable Alzheimer's disease (AD) converters according to follow-up diagnoses at 12, 24, and 36 months. Hippocampal grey matter volume (HGMV) was obtained from baseline T1-weighted MRI and then corrected for total intracranial volume and age. Average processing time per subject was less than 4 minutes on a standard PC. The area under the receiver operating characteristic curve of the corrected HGMV for identification of MCI to probable AD converters within 12, 24, and 36 months was 0.78, 0.72, and 0.71, respectively. Thus, hippocampal volume computed with the fully-automated processing pipeline provides similar power for prediction of MCI to probable AD conversion as computationally more expensive methods. The whole processing pipeline has been made freely available as an SPM8 toolbox. It is easily set up and integrated into everyday clinical patient care.
Understanding reliance on automation: effects of error type, error distribution, age and experience
Sanchez, Julian; Rogers, Wendy A.; Fisk, Arthur D.; Rovira, Ericka
2015-01-01
An obstacle detection task supported by “imperfect” automation was used with the goal of understanding the effects of automation error types and age on automation reliance. Sixty younger and sixty older adults interacted with a multi-task simulation of an agricultural vehicle (i.e. a virtual harvesting combine). The simulator included an obstacle detection task and a fully manual tracking task. A micro-level analysis provided insight into the way reliance patterns change over time. The results indicated that there are distinct patterns of reliance that develop as a function of error type. A prevalence of automation false alarms led participants to under-rely on the automation during alarm states while over relying on it during non-alarms states. Conversely, a prevalence of automation misses led participants to over-rely on automated alarms and under-rely on the automation during non-alarm states. Older adults adjusted their behavior according to the characteristics of the automation similarly to younger adults, although it took them longer to do so. The results of this study suggest the relationship between automation reliability and reliance depends on the prevalence of specific errors and on the state of the system. Understanding the effects of automation detection criterion settings on human-automation interaction can help designers of automated systems make predictions about human behavior and system performance as a function of the characteristics of the automation. PMID:25642142
Jiang, Jiyang; Liu, Tao; Zhu, Wanlin; Koncz, Rebecca; Liu, Hao; Lee, Teresa; Sachdev, Perminder S; Wen, Wei
2018-07-01
We present 'UBO Detector', a cluster-based, fully automated pipeline for extracting and calculating variables for regions of white matter hyperintensities (WMH) (available for download at https://cheba.unsw.edu.au/group/neuroimaging-pipeline). It takes T1-weighted and fluid-attenuated inversion recovery (FLAIR) scans as input, and SPM12 and FSL functions are utilised for pre-processing. The candidate clusters are then generated by FMRIB's Automated Segmentation Tool (FAST). A supervised machine learning algorithm, k-nearest neighbor (k-NN), is applied to determine whether the candidate clusters are WMH or non-WMH. UBO Detector generates both image and text (volumes and the number of WMH clusters) outputs for whole-brain, periventricular, deep, and lobar WMH, as well as WMH in arterial territories. The computation time for each brain is approximately 15 min. We validated the performance of UBO Detector by showing a) high segmentation (similarity index (SI) = 0.848) and volumetric (intraclass correlation coefficient (ICC) = 0.985) agreement between the UBO Detector-derived and manually traced WMH; b) highly correlated (r² > 0.9) WMH volumes with a steady increase over time; and c) significant associations of periventricular (t = 22.591, p < 0.001) and deep (t = 14.523, p < 0.001) WMH volumes generated by UBO Detector with Fazekas rating scores. With parallel computing enabled in UBO Detector, the processing can take advantage of the multi-core CPUs that are commonly available on workstations. In conclusion, UBO Detector is a reliable, efficient and fully automated WMH segmentation pipeline. Copyright © 2018 Elsevier Inc. All rights reserved.
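The similarity index (SI) reported above is the Dice coefficient between the automated and manually traced masks. A minimal sketch on tiny made-up masks (the 10×10 arrays below are illustrative only):

```python
import numpy as np

def similarity_index(auto_mask, manual_mask):
    """Dice similarity index between automated and manually traced masks:
    2 * |A ∩ M| / (|A| + |M|), in [0, 1]."""
    a = np.asarray(auto_mask, bool)
    m = np.asarray(manual_mask, bool)
    return 2.0 * np.logical_and(a, m).sum() / (a.sum() + m.sum())

# Two overlapping 4x4 square "lesions", offset by one voxel in each direction
auto = np.zeros((10, 10), bool)
auto[2:6, 2:6] = True        # 16 voxels
manual = np.zeros((10, 10), bool)
manual[3:7, 3:7] = True      # 16 voxels, shifted
si = similarity_index(auto, manual)
```

Dice weights the overlap against the average mask size, so an SI of 0.848 indicates that the automated and manual WMH outlines agree over the large majority of lesion voxels.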
NASA Technical Reports Server (NTRS)
Sheffner, E. J.; Hlavka, C. A.; Bauer, E. M.
1984-01-01
Two techniques have been developed for the mapping and area estimation of small grains in California from Landsat digital data. The two techniques are Band Ratio Thresholding, a semi-automated version of a manual procedure, and LCLS, a layered classification technique which can be fully automated and is based on established clustering and classification technology. Preliminary evaluation results indicate that the two techniques have potential for providing map products which can be incorporated into existing inventory procedures, and for providing automated alternatives both to traditional inventory techniques and to those which currently employ Landsat imagery.
Note: Automated electrochemical etching and polishing of silver scanning tunneling microscope tips.
Sasaki, Stephen S; Perdue, Shawn M; Rodriguez Perez, Alejandro; Tallarida, Nicholas; Majors, Julia H; Apkarian, V Ara; Lee, Joonhee
2013-09-01
Fabrication of sharp and smooth Ag tips is crucial in optical scanning probe microscope experiments. To ensure reproducible tip profiles, the polishing process is fully automated using a closed-loop laminar flow system to deliver the electrolytic solution to moving electrodes mounted on a motorized translational stage. The repetitive translational motion is controlled precisely on the μm scale with a stepper motor and screw-thread mechanism. The automated setup allows reproducible control over the tip profile and improves smoothness and sharpness of tips (radius 27 ± 18 nm), as measured by ultrafast field emission.
Exploring the Use of a Test Automation Framework
NASA Technical Reports Server (NTRS)
Cervantes, Alex
2009-01-01
It is known that software testers, more often than not, lack the time needed to fully test the delivered software product within the time period allotted to them. When problems occur in the implementation phase of a development project, they normally cause the software delivery date to slip. As a result, testers either need to work longer hours, or supplementary resources need to be added to the test team in order to meet aggressive test deadlines. One solution to this problem is to provide testers with a test automation framework to facilitate the development of automated test solutions.
NASA Astrophysics Data System (ADS)
Harms, Justin D.; Bachmann, Charles M.; Ambeau, Brittany L.; Faulring, Jason W.; Ruiz Torres, Andres J.; Badura, Gregory; Myers, Emily
2017-10-01
Field-portable goniometers are created for a wide variety of applications. Many of these applications require specific types of instruments and measurement schemes and must operate in challenging environments. Therefore, designs are based on the requirements that are specific to the application. We present a field-portable goniometer that was designed for measuring the hemispherical-conical reflectance factor (HCRF) of various soils and low-growing vegetation in austere coastal and desert environments and biconical reflectance factors in laboratory settings. Unlike some goniometers, this system features a requirement for "target-plane tracking" to ensure that measurements can be collected on sloped surfaces, without compromising angular accuracy. The system also features a second upward-looking spectrometer to measure the spatially dependent incoming illumination, an integrated software package to provide full automation, an automated leveling system to ensure a standard frame of reference, a design that minimizes the obscuration due to self-shading to measure the opposition effect, and the ability to record a digital elevation model of the target region. This fully automated and highly mobile system obtains accurate and precise measurements of HCRF in a wide variety of terrain and in less time than most other systems while not sacrificing consistency or repeatability in laboratory environments.
Automated Detection and Analysis of Interplanetary Shocks with Real-Time Application
NASA Astrophysics Data System (ADS)
Vorotnikov, V.; Smith, C. W.; Hu, Q.; Szabo, A.; Skoug, R. M.; Cohen, C. M.
2006-12-01
The ACE real-time data stream provides web-based nowcasting capabilities for solar wind conditions upstream of Earth. Our goal is to provide an automated code that finds and analyzes interplanetary shocks as they occur, for possible real-time application to space weather nowcasting. Shock analysis algorithms based on the Rankine-Hugoniot jump conditions exist and are in widespread use today for the interactive analysis of interplanetary shocks, yielding parameters such as shock speed, propagation direction, and shock strength in the form of compression ratios. Although these codes can be automated in a reasonable manner to yield solutions not far from those obtained by user-directed interactive analysis, event detection presents an added obstacle and is the first step in a fully automated analysis. We present a fully automated Rankine-Hugoniot analysis code that can scan the ACE science data, find shock candidates, analyze the events, obtain solutions in good agreement with those derived from interactive applications, and dismiss false-positive shock candidates on the basis of the conservation equations. The intent is to make this code available to NOAA for use in real-time space weather applications. The code has the added advantage of being able to scan spacecraft data sets to provide shock solutions for use outside real-time applications and can easily be applied to science-quality data sets from other missions. Use of the code for this purpose will also be explored.
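The event-detection step that precedes the full Rankine-Hugoniot fit can be caricatured as a search for simultaneous jumps in plasma parameters. The sketch below is a crude stand-in, not the actual analysis code: the window length, ratio threshold, and synthetic series are all assumptions.

```python
import numpy as np

def shock_candidates(density, speed, window=5, min_ratio=1.2):
    """Flag indices where the downstream/upstream window averages of both
    density and flow speed jump by at least min_ratio -- a rough proxy for
    a fast forward shock, to be confirmed by conservation-law fitting."""
    hits = []
    for i in range(window, len(density) - window):
        r_n = density[i:i + window].mean() / density[i - window:i].mean()
        r_v = speed[i:i + window].mean() / speed[i - window:i].mean()
        if r_n >= min_ratio and r_v >= min_ratio:
            hits.append(i)
    return hits

# Synthetic solar-wind series with a shock-like step at index 50
n = np.r_[np.full(50, 5.0), np.full(50, 10.0)]      # proton density, cm^-3
v = np.r_[np.full(50, 400.0), np.full(50, 520.0)]   # flow speed, km/s
cands = shock_candidates(n, v)
```

In a real pipeline, each candidate index would then be passed to the Rankine-Hugoniot solver, which either returns shock parameters or rejects the event as a false positive when the conservation equations cannot be satisfied.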
Dera, Dimah; Bouaynaya, Nidhal; Fathallah-Shaykh, Hassan M
2016-07-01
We address the problem of fully automated region discovery and robust image segmentation by devising a new deformable model based on the level set method (LSM) and the probabilistic nonnegative matrix factorization (NMF). We describe the use of NMF to calculate the number of distinct regions in the image and to derive the local distribution of the regions, which is incorporated into the energy functional of the LSM. The results demonstrate that our NMF-LSM method is superior to other approaches when applied to synthetic binary and gray-scale images and to clinical magnetic resonance images (MRI) of the human brain with and without a malignant brain tumor, glioblastoma multiforme. In particular, the NMF-LSM method is fully automated, highly accurate, less sensitive to the initial selection of the contour(s) or initial conditions, more robust to noise and model parameters, and able to detect distinct regions as small as desired. These advantages stem from the fact that the proposed method relies on histogram information instead of intensity values and does not introduce nuisance model parameters. These properties provide a general approach for automated robust region discovery and segmentation in heterogeneous images. Compared with the retrospective radiological diagnoses of two patients with non-enhancing grade 2 and 3 oligodendroglioma, the NMF-LSM detects earlier progression times and appears suitable for monitoring tumor response. The NMF-LSM method fills an important need of automated segmentation of clinical MRI.
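The factorization at the heart of the method can be illustrated with the classic multiplicative-update NMF baseline. The paper uses a probabilistic NMF on histogram data; the version below is only the standard Lee-Seung scheme on a toy matrix, with made-up sizes and iteration counts.

```python
import numpy as np

def nmf(V, r, n_iter=500, seed=0):
    """Multiplicative-update NMF: approximate V >= 0 as W @ H with
    W, H >= 0, by alternating the standard Lee-Seung update rules."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + 0.1
    H = rng.random((r, n)) + 0.1
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-12)   # update H, keeping it >= 0
        W *= (V @ H.T) / (W @ H @ H.T + 1e-12)   # update W, keeping it >= 0
    return W, H

# Toy non-negative "histogram" matrix built from 2 latent sources -> rank-2 fit
rng = np.random.default_rng(1)
V = rng.random((6, 2)) @ rng.random((2, 8))
W, H = nmf(V, 2)
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

In the segmentation setting, the number of factors that yields a good low-error fit plays the role of the number of distinct regions, and the columns of the factors describe the per-region distributions fed into the level-set energy.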
Race, Caitlin M.; Kwon, Lydia E.; Foreman, Myles T.; ...
2017-11-24
Here, we report on the implementation of an automated platform for detecting the presence of an antibody biomarker for human papillomavirus-associated oropharyngeal cancer from a single droplet of serum, in which a nanostructured photonic crystal surface is used to amplify the output of a fluorescence-linked immunosorbent assay. The platform comprises a microfluidic cartridge with integrated photonic crystal chips that interfaces with an assay instrument that automates the introduction of reagents, wash steps, and surface drying. Upon assay completion, the cartridge interfaces with a custom laser-scanning instrument that couples light into the photonic crystal at the optimal resonance condition for fluorescence enhancement. The instrument is used to measure the fluorescence intensity values of microarray spots corresponding to the biomarkers of interest, in addition to several experimental controls that verify correct functioning of the assay protocol. In this work, we report both dose-response characterization of the system using anti-E7 antibody introduced at known concentrations into serum and characterization of a set of clinical samples from which results were compared with a conventional enzyme-linked immunosorbent assay (ELISA) performed in microplate format. Finally, the demonstrated capability represents a simple, rapid, automated, and high-sensitivity method for multiplexed detection of protein biomarkers from a low-volume test sample.
Ten years of R&D and full automation in molecular diagnosis.
Greub, Gilbert; Sahli, Roland; Brouillet, René; Jaton, Katia
2016-01-01
We describe 10 years of experience with our automated molecular diagnostic platform, which carries out 91 different real-time PCR assays. Progress and future perspectives in molecular diagnostic microbiology are reviewed: why automation is important; how our platform was implemented; how homemade PCRs were developed; and the advantages and disadvantages of homemade PCRs, including the critical aspects of troubleshooting and the need to further reduce the turnaround time for specific samples, at least in defined clinical settings such as emergencies. The future of molecular diagnosis depends on automation, and it is now time to fully acknowledge the true contribution of molecular diagnostics and to reconsider the indications for PCR, by also using these tests as first-line assays.
Automated homogeneous liposome immunoassay systems for anticonvulsant drugs.
Kubotsu, K; Goto, S; Fujita, M; Tuchiya, H; Kida, M; Takano, S; Matsuura, S; Sakurabayashi, I
1992-06-01
We developed automated homogeneous immunoassays, based on immunolysis of liposomes, for measuring phenytoin, phenobarbital, and carbamazepine in serum. Liposome lysis was detected spectrophotometrically from entrapped glucose-6-phosphate dehydrogenase activity. The procedure was fully automated on a routine automated clinical analyzer. Within-run, between-run, dilution, and recovery tests showed good accuracy and reproducibility. Bilirubin, hemoglobin, triglycerides, and Intrafat did not affect assay results. The results obtained by liposome immunoassays for phenytoin, phenobarbital, and carbamazepine correlated well with those obtained by enzyme-multiplied immunoassay (Syva EMIT) kits (r = 0.995, 0.986, and 0.988, respectively) and fluorescence polarization immunoassay (Abbott TDx) kits (r = 0.990, 0.991, and 0.975, respectively). The proposed method should be useful for monitoring anticonvulsant drug concentrations in blood.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Jung Hwa; Hyung, Seok-Won; Mun, Dong-Gi
2012-08-03
A multi-functional liquid chromatography system that performs 1-dimensional and 2-dimensional (strong cation exchange/reversed-phase liquid chromatography, or SCX/RPLC) separations and online phosphopeptide enrichment using a single binary nano-flow pump has been developed. With a simple operation of a function selection valve, which is equipped with an SCX column and a TiO2 (titanium dioxide) column, fully automated selection of three different experiment modes was achieved. Because the current system uses essentially the same solvent flow paths, the same trap column, and the same separation column for reversed-phase separation in the 1D, 2D, and online phosphopeptide enrichment experiments, the elution time information obtained from these experiments is in excellent agreement, which facilitates correlating peptide information across different experiments.
3D model assisted fully automated scanning laser Doppler vibrometer measurements
NASA Astrophysics Data System (ADS)
Sels, Seppe; Ribbens, Bart; Bogaerts, Boris; Peeters, Jeroen; Vanlanduit, Steve
2017-12-01
In this paper, a new fully automated scanning laser Doppler vibrometer (LDV) measurement technique is presented. In contrast to existing scanning LDV techniques, which use a 2D camera for the manual selection of sample points, we use a 3D Time-of-Flight camera in combination with a CAD file of the test object to automatically obtain measurements at pre-defined locations. The proposed procedure allows users to test prototypes in a shorter time because physical measurement locations are determined without user interaction. Another benefit of this methodology is that it incorporates automatic mapping between a CAD model and the vibration measurements. This mapping can be used to visualize measurements directly on a 3D CAD model. The proposed method is illustrated with vibration measurements of an unmanned aerial vehicle.
Barbesi, Donato; Vicente Vilas, Víctor; Millet, Sylvain; Sandow, Miguel; Colle, Jean-Yves; Aldave de Las Heras, Laura
2017-01-01
A LabVIEW®-based software package for the control of the fully automated multi-sequential flow injection analysis Lab-on-Valve (MSFIA-LOV) platform AutoRAD, which performs radiochemical analysis, is described. The analytical platform interfaces an Arduino®-based device triggering multiple detectors, providing a flexible and fit-for-purpose choice of detection systems. The different analytical devices are interfaced to the PC running the LabVIEW® VI software via USB and RS232 interfaces, both for sending commands and for receiving confirmation or error responses. The AutoRAD platform has been successfully applied to the chemical separation and determination of Sr, an important fission product pertinent to nuclear waste.
NASA Technical Reports Server (NTRS)
Miller, R. H.; Minsky, M. L.; Smith, D. B. S.
1982-01-01
Potential applications of automation, robotics, and machine intelligence systems (ARAMIS) to space activities, and to their related ground support functions are explored. The specific tasks which will be required by future space projects are identified. ARAMIS options which are candidates for those space project tasks and the relative merits of these options are defined and evaluated. Promising applications of ARAMIS and specific areas for further research are identified. The ARAMIS options defined and researched by the study group span the range from fully human to fully machine, including a number of intermediate options (e.g., humans assisted by computers, and various levels of teleoperation). By including this spectrum, the study searches for the optimum mix of humans and machines for space project tasks.
A Prototype System for Retrieval of Gene Functional Information
Folk, Lillian C.; Patrick, Timothy B.; Pattison, James S.; Wolfinger, Russell D.; Mitchell, Joyce A.
2003-01-01
Microarrays allow researchers to gather data about the expression patterns of thousands of genes simultaneously. Statistical analysis can reveal which genes show statistically significant results. Making biological sense of those results requires the retrieval of functional information about the genes thus identified, typically a manual gene-by-gene retrieval of information from various on-line databases. For experiments generating thousands of genes of interest, retrieval of functional information can become a significant bottleneck. To address this issue, we are currently developing a prototype system to automate the process of retrieval of functional information from multiple on-line sources. PMID:14728346
Slide Set: Reproducible image analysis and batch processing with ImageJ.
Nanes, Benjamin A
2015-11-01
Most imaging studies in the biological sciences rely on analyses that are relatively simple. However, manual repetition of analysis tasks across multiple regions in many images can complicate even the simplest analysis, making record keeping difficult, increasing the potential for error, and limiting reproducibility. While fully automated solutions are necessary for very large data sets, they are sometimes impractical for the small- and medium-sized data sets common in biology. Here we present the Slide Set plugin for ImageJ, which provides a framework for reproducible image analysis and batch processing. Slide Set organizes data into tables, associating image files with regions of interest and other relevant information. Analysis commands are automatically repeated over each image in the data set, and multiple commands can be chained together for more complex analysis tasks. All analysis parameters are saved, ensuring transparency and reproducibility. Slide Set includes a variety of built-in analysis commands and can be easily extended to automate other ImageJ plugins, reducing the manual repetition of image analysis without the set-up effort or programming expertise required for a fully automated solution.
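The batch-processing pattern the Slide Set abstract describes (a table associating images with regions of interest, analysis commands repeated over every row and chained together, with all parameters recorded) can be sketched in a few lines. This is a minimal illustration of the pattern, not the Slide Set API; the table contents, command names, and logged fields are hypothetical.

```python
# Sketch of the batch-processing pattern described above (not the actual
# Slide Set API): a data table maps images to regions of interest, and a
# chain of analysis commands is applied to every row, with all parameters
# logged for reproducibility. File names and commands are illustrative.

# Hypothetical data table: each row pairs an image with an ROI (x, y, w, h).
table = [
    {"image": "cell_01.tif", "roi": (10, 10, 50, 50)},
    {"image": "cell_02.tif", "roi": (20, 15, 40, 60)},
]

def crop(item, params):
    # Stand-in for an image command: record the crop bounds.
    item["cropped"] = item["roi"]
    return item

def measure_area(item, params):
    x, y, w, h = item["cropped"]
    item["area"] = w * h
    return item

# Commands chained in order; parameters saved alongside the results.
pipeline = [(crop, {}), (measure_area, {})]

results, log = [], []
for row in table:
    item = dict(row)
    for command, params in pipeline:
        item = command(item, params)
        log.append({"command": command.__name__, "params": params})
    results.append(item)

print(results[0]["area"])  # 2500
```

Chaining commands through a shared per-row dictionary mirrors how a batch framework can repeat an arbitrary analysis over each entry of the data set while keeping a complete parameter log.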
Campbell, J Q; Petrella, A J
2016-09-06
Population-based modeling of the lumbar spine has the potential to be a powerful clinical tool. However, developing a fully parameterized model of the lumbar spine with accurate geometry has remained a challenge. The current study used automated methods for landmark identification to create a statistical shape model of the lumbar spine. The shape model was evaluated using compactness, generalization ability, and specificity. The primary shape modes were analyzed visually, quantitatively, and biomechanically. The biomechanical analysis was performed by using the statistical shape model with an automated method for finite element model generation to create a fully parameterized finite element model of the lumbar spine. Functional finite element models of the mean shape and the extreme shapes (±3 standard deviations) of all 17 shape modes were created demonstrating the robust nature of the methods. This study represents an advancement in finite element modeling of the lumbar spine and will allow population-based modeling in the future. Copyright © 2016 Elsevier Ltd. All rights reserved.
2013-01-01
Background: Analysis of global gene expression by DNA microarrays is widely used in experimental molecular biology. However, the complexity of such high-dimensional data sets makes it difficult to fully understand the underlying biological features present in the data. The aim of this study is to introduce a method for DNA microarray analysis that provides an intuitive interpretation of data through dimension reduction and pattern recognition. We present the first “Archetypal Analysis” of global gene expression. The analysis is based on microarray data from five integrated studies of Pseudomonas aeruginosa isolated from the airways of cystic fibrosis patients. Results: Our analysis clustered samples into distinct groups with comprehensible characteristics since the archetypes representing the individual groups are closely related to samples present in the data set. Significant changes in gene expression between different groups identified adaptive changes of the bacteria residing in the cystic fibrosis lung. The analysis suggests a similar gene expression pattern between isolates with a high mutation rate (hypermutators) despite accumulation of different mutations for these isolates. This suggests positive selection in the cystic fibrosis lung environment, and changes in gene expression for these isolates are therefore most likely related to adaptation of the bacteria. Conclusions: Archetypal analysis succeeded in identifying adaptive changes of P. aeruginosa. The combination of clustering and matrix factorization made it possible to reveal minor similarities among different groups of data, which other analytical methods failed to identify. We suggest that this analysis could be used to supplement current methods used to analyze DNA microarray data. PMID:24059747
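The core idea of archetypal analysis is that each sample is represented as a sum-to-one mixture of extreme "archetypes". The sketch below shows only that mixture-recovery step under simplifying assumptions: the archetypes are fixed (the full method also learns them and enforces nonnegativity, e.g. via NNLS), the sample lies inside the archetype simplex so plain least squares suffices, and the sum-to-one constraint is imposed as a penalty row. All numbers are illustrative.

```python
# Sketch of the convex-combination idea behind archetypal analysis.
# Assumptions (not from the paper): fixed archetypes, a sample inside
# their simplex, and a soft sum-to-one constraint via a penalty row.
import numpy as np

archetypes = np.array([[0.0, 0.0],   # archetype 1
                       [1.0, 0.0],   # archetype 2
                       [0.0, 1.0]])  # archetype 3
sample = np.array([0.2, 0.3])

lam = 100.0  # weight of the soft sum-to-one constraint
A = np.vstack([archetypes.T, lam * np.ones(len(archetypes))])
b = np.concatenate([sample, [lam]])

# Solve ||A w - b||^2; the penalty row pushes sum(w) toward 1.
weights, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.round(weights, 3))  # [0.5 0.2 0.3]
```

Because the recovered weights are interpretable mixture proportions over extreme expression profiles, this factorization yields the "comprehensible characteristics" the abstract attributes to the archetype groups.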
Automated Test Requirement Document Generation
1987-11-01
The report cites "Diagnostics Based on the Principles of Artificial Intelligence" (1984 International Test Conference) and includes a glossary of acronyms (AFSATCOM: Air Force Satellite Communication; AI: Artificial Intelligence; ASIC: Application-Specific Integrated Circuit). Built-In-Test Equipment (BITE) and AI (Artificial Intelligence) expert systems need to be fully applied before a completely automated process can be achieved.
An Intelligent Automation Platform for Rapid Bioprocess Design.
Wu, Tianyi; Zhou, Yuhong
2014-08-01
Bioprocess development is very labor intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet lab experiments, (2) standalone analysis devices detect the data, and (3) specific software is used for data analysis and experiment design given the user's inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multiagent intelligent architecture has been developed incorporating agent communication to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design and also the ability to generate scripts to run the experiments automatically, allowing the elimination of human involvement. A first-generation prototype has been established and demonstrated through lysozyme precipitation process design. All procedures in the case study have been fully automated through an intelligent automation platform. The realization of automated data analysis and experiment design, and automated script programming for experimental procedures has the potential to increase lab productivity. © 2013 Society for Laboratory Automation and Screening.
Sauer, Juergen; Chavaillaz, Alain; Wastell, David
2016-06-01
This work examined the effects of operators' exposure to various types of automation failures in training. Forty-five participants were trained for 3.5 h on a simulated process control environment. During training, participants either experienced a fully reliable, automatic fault repair facility (i.e. faults detected and correctly diagnosed), a misdiagnosis-prone one (i.e. faults detected but not correctly diagnosed) or a miss-prone one (i.e. faults not detected). One week after training, participants were tested for 3 h, experiencing two types of automation failures (misdiagnosis, miss). The results showed that automation bias was very high when operators trained on miss-prone automation encountered a failure of the diagnostic system. Operator errors resulting from automation bias were much higher when automation misdiagnosed a fault than when it missed one. Differences in trust levels that were instilled by the different training experiences disappeared during the testing session. Practitioner Summary: The experience of automation failures during training has some consequences. A greater potential for operator errors may be expected when an automatic system failed to diagnose a fault than when it failed to detect one.
Fully automated joint space width measurement and digital X-ray radiogrammetry in early RA.
Platten, Michael; Kisten, Yogan; Kälvesten, Johan; Arnaud, Laurent; Forslind, Kristina; van Vollenhoven, Ronald
2017-01-01
To study fully automated digital joint space width (JSW) and bone mineral density (BMD) in relation to a conventional radiographic scoring method in early rheumatoid arthritis (eRA). Radiographs scored by the modified Sharp van der Heijde score (SHS) in patients with eRA were acquired from the SWEdish FarmacOTherapy study. Fully automated JSW measurements of bilateral metacarpals 2, 3 and 4 were compared with the joint space narrowing (JSN) score in SHS. Multilevel mixed model statistics were applied to calculate the significance of the association between ΔJSW and ΔBMD over 1 year, and the JSW differences between damaged and undamaged joints as evaluated by the JSN. Based on 576 joints of 96 patients with eRA, a significant reduction from baseline to 1 year was observed in the JSW from 1.69 (±0.19) mm to 1.66 (±0.19) mm (p<0.01), and BMD from 0.583 (±0.068) g/cm 2 to 0.566 (±0.074) g/cm 2 (p<0.01). A significant positive association was observed between ΔJSW and ΔBMD over 1 year (p<0.0001). On an individual joint level, JSWs of undamaged (JSN=0) joints were wider than damaged (JSN>0) joints: 1.68 mm (95% CI 1.67 to 1.70) vs 1.54 mm (95% CI 1.46 to 1.63). Similarly the unadjusted multilevel model showed significant differences in JSW between undamaged (1.68 mm (95% CI 1.64 to 1.72)) and damaged joints (1.63 mm (95% CI 1.58 to 1.68)) (p=0.0048). This difference remained significant in the adjusted model: 1.66 mm (95% CI 1.61 to 1.70) vs 1.62 mm (95% CI 1.56 to 1.68) (p=0.042). To measure the JSW with this fully automated digital tool may be useful as a quick and observer-independent application for evaluating cartilage damage in eRA. NCT00764725.
Ou, Hong-Yu; He, Xinyi; Harrison, Ewan M.; Kulasekara, Bridget R.; Thani, Ali Bin; Kadioglu, Aras; Lory, Stephen; Hinton, Jay C. D.; Barer, Michael R.; Rajakumar, Kumar
2007-01-01
MobilomeFINDER (http://mml.sjtu.edu.cn/MobilomeFINDER) is an interactive online tool that facilitates bacterial genomic island or ‘mobile genome’ (mobilome) discovery; it integrates the ArrayOme and tRNAcc software packages. ArrayOme utilizes a microarray-derived comparative genomic hybridization input data set to generate ‘inferred contigs’ produced by merging adjacent genes classified as ‘present’. Collectively these ‘fragments’ represent a hypothetical ‘microarray-visualized genome (MVG)’. ArrayOme permits recognition of discordances between physical genome and MVG sizes, thereby enabling identification of strains rich in microarray-elusive novel genes. Individual tRNAcc tools facilitate automated identification of genomic islands by comparative analysis of the contents and contexts of tRNA sites and other integration hotspots in closely related sequenced genomes. Accessory tools facilitate design of hotspot-flanking primers for in silico and/or wet-science-based interrogation of cognate loci in unsequenced strains and analysis of islands for features suggestive of foreign origins; island-specific and genome-contextual features are tabulated and represented in schematic and graphical forms. To date we have used MobilomeFINDER to analyse several Enterobacteriaceae, Pseudomonas aeruginosa and Streptococcus suis genomes. MobilomeFINDER enables high-throughput island identification and characterization through increased exploitation of emerging sequence data and PCR-based profiling of unsequenced test strains; subsequent targeted yeast recombination-based capture permits full-length sequencing and detailed functional studies of novel genomic islands. PMID:17537813
Flexible automated approach for quantitative liquid handling of complex biological samples.
Palandra, Joe; Weller, David; Hudson, Gary; Li, Jeff; Osgood, Sarah; Hudson, Emily; Zhong, Min; Buchholz, Lisa; Cohen, Lucinda H
2007-11-01
A fully automated protein precipitation technique for biological sample preparation has been developed for the quantitation of drugs in various biological matrixes. All liquid handling during sample preparation was automated using a Hamilton MicroLab Star Robotic workstation, which included the preparation of standards and controls from a Watson laboratory information management system generated work list, shaking of 96-well plates, and vacuum application. Processing time is less than 30 s per sample or approximately 45 min per 96-well plate, which is then immediately ready for injection onto an LC-MS/MS system. An overview of the process workflow is discussed, including the software development. Validation data are also provided, including specific liquid class data as well as comparative data of automated vs manual preparation using both quality controls and actual sample data. The efficiencies gained from this automated approach are described.
The effect of JPEG compression on automated detection of microaneurysms in retinal images
NASA Astrophysics Data System (ADS)
Cree, M. J.; Jelinek, H. F.
2008-02-01
As JPEG compression at source is ubiquitous in retinal imaging, and the block artefacts introduced are known to be of similar size to microaneurysms (an important indicator of diabetic retinopathy), it is prudent to evaluate the effect of JPEG compression on automated detection of retinal pathology. Retinal images were acquired at high quality and then compressed to various lower qualities. An automated microaneurysm detector was run on the retinal images of various qualities of JPEG compression, and the ability to predict the presence of diabetic retinopathy based on the detected presence of microaneurysms was evaluated with receiver operating characteristic (ROC) methodology. The negative effect of JPEG compression on automated detection was observed even at levels of compression sometimes used in retinal eye-screening programmes, and these may have important clinical implications for deciding on acceptable levels of compression for a fully automated eye-screening programme.
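The ROC methodology above reduces to asking how well the detector's per-image score (here, a microaneurysm count) separates retinopathy from healthy images; the area under the ROC curve equals the probability that a retinopathy image outscores a healthy one, with ties counting half. The sketch below shows this rank-based AUC on invented counts, with the low-quality counts degraded the way block artefacts would degrade them; all numbers are illustrative, not the paper's data.

```python
# Sketch of the ROC evaluation described above: AUC computed as the
# probability that a retinopathy image's microaneurysm count exceeds a
# healthy image's (ties count half). All counts are illustrative.

def auc(pos_scores, neg_scores):
    wins = sum((p > n) + 0.5 * (p == n)
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

# Detected microaneurysm counts at high JPEG quality...
retinopathy_hq = [4, 6, 3, 7, 5]
healthy_hq     = [0, 1, 0, 2, 1]

# ...and after heavy compression, where block artefacts mask true
# lesions and spawn false detections, shrinking the separation.
retinopathy_lq = [2, 3, 1, 4, 2]
healthy_lq     = [1, 2, 1, 3, 2]

print(auc(retinopathy_hq, healthy_hq))  # 1.0 (perfect separation)
print(auc(retinopathy_lq, healthy_lq))  # 0.66
```

The drop in AUC from 1.0 toward 0.5 (chance) is exactly the kind of degradation the study measured across compression levels.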
Tronser, Tina; Popova, Anna A; Jaggy, Mona; Bastmeyer, Martin; Levkin, Pavel A
2017-12-01
Over the past decades, stem cells have attracted growing interest in fundamental biological and biomedical research as well as in regenerative medicine, due to their unique ability to self-renew and differentiate into various cell types. Long-term maintenance of the self-renewal ability and inhibition of spontaneous differentiation, however, still remain challenging and are not fully understood. Uncontrolled spontaneous differentiation of stem cells makes high-throughput screening of stem cells also difficult. This further hinders investigation of the underlying mechanisms of stem cell differentiation and the factors that might affect it. In this work, a dual functionality of nanoporous superhydrophobic-hydrophilic micropatterns is demonstrated in their ability to inhibit differentiation of mouse embryonic stem cells (mESCs) and at the same time enable formation of arrays of microdroplets (droplet microarray) via the effect of discontinuous dewetting. Such combination makes high-throughput screening of undifferentiated mouse embryonic stem cells possible. The droplet microarray is used to investigate the development, differentiation, and maintenance of stemness of mESC, revealing the dependence of stem cell behavior on droplet volume in nano- and microliter scale. The inhibition of spontaneous differentiation of mESCs cultured on the droplet microarray for up to 72 h is observed. In addition, up to fourfold increased cell growth rate of mESCs cultured on our platform has been observed. The difference in the behavior of mESCs is attributed to the porosity and roughness of the polymer surface. This work demonstrates that the droplet microarray possesses the potential for the screening of mESCs under conditions of prolonged inhibition of stem cells' spontaneous differentiation. Such a platform can be useful for applications in the field of stem cell research, pharmacological testing of drug efficacy and toxicity, biomedical research as well as in the field of regenerative medicine and tissue engineering. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Technical Reports Server (NTRS)
Kenny, Caitlin; Fern, Lisa
2012-01-01
Continuing demand for the use of Unmanned Aircraft Systems (UAS) has put increasing pressure on operations in civil airspace. The need to fly UAS in the National Airspace System (NAS) in order to perform missions vital to national security and defense, emergency management, and science is increasing at a rapid pace. In order to ensure safe operations in the NAS, operators of unmanned aircraft, like those of manned aircraft, may be required to maintain separation assurance and avoid loss of separation with other aircraft while performing their mission tasks. This experiment investigated the effects of varying levels of automation on UAS operator performance and workload while responding to conflict resolution instructions provided by the Tactical Collision Avoidance System II (TCAS II) during a UAS mission in high-density airspace. The purpose of this study was not to investigate the safety of using TCAS II on UAS, but rather to examine the effect of automation on the ability of operators to respond to traffic collision alerts. Six licensed pilots were recruited to act as UAS operators for this study. Operators were instructed to follow a specified mission flight path, while maintaining radio contact with Air Traffic Control and responding to TCAS II resolution advisories. Operators flew four 45-minute experimental missions with four different levels of automation: Manual, Knobs, Management by Exception, and Fully Automated. All missions included TCAS II Resolution Advisories (RAs) that required operator attention and rerouting. Operator compliance and reaction time to RAs were measured, and post-run NASA-TLX ratings were collected to measure workload. Results showed significantly higher compliance rates, faster responses to TCAS II alerts, as well as fewer preemptive operator actions when higher levels of automation were implemented. Physical and Temporal ratings of workload were significantly higher in the Manual condition than in the Management by Exception and Fully Automated conditions.
Design and Prototype of an Automated Column-Switching HPLC System for Radiometabolite Analysis.
Vasdev, Neil; Collier, Thomas Lee
2016-08-17
Column-switching high performance liquid chromatography (HPLC) is extensively used for the critical analysis of radiolabeled ligands and their metabolites in plasma. However, the lack of streamlined apparatus and consequently varying protocols remain as a challenge among positron emission tomography laboratories. We report here the prototype apparatus and implementation of a fully automated and simplified column-switching procedure to allow for the easy and automated determination of radioligands and their metabolites in up to 5 mL of plasma. The system has been used with conventional UV and coincidence radiation detectors, as well as with a single quadrupole mass spectrometer.
Image segmentation evaluation for very-large datasets
NASA Astrophysics Data System (ADS)
Reeves, Anthony P.; Liu, Shuang; Xie, Yiting
2016-03-01
With the advent of modern machine learning methods and fully automated image analysis, there is a need for very large image datasets having documented segmentations for both computer algorithm training and evaluation. Current approaches of visual inspection and manual markings do not scale well to big data. We present a new approach that depends on fully automated algorithm outcomes for segmentation documentation, requires no manual marking, and provides quantitative evaluation for computer algorithms. The documentation of new image segmentations and new algorithm outcomes is achieved by visual inspection. The burden of visual inspection on large datasets is minimized by (a) customized visualizations for rapid review and (b) reducing the number of cases to be reviewed through analysis of quantitative segmentation evaluation. This method has been applied to a dataset of 7,440 whole-lung CT images for 6 different segmentation algorithms designed to fully automate the measurement of a number of very important quantitative image biomarkers. The results indicate that we could achieve 93% to 99% successful segmentation for these algorithms on this relatively large image database. The presented evaluation method may be scaled to much larger image databases.
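A common quantitative segmentation-evaluation score of the kind that can route cases to visual review is the Dice similarity coefficient between a new algorithm's mask and the documented reference mask. The sketch below is a generic illustration (the paper does not specify its metric); the masks and the 0.95 acceptance threshold are invented.

```python
# Sketch of a quantitative overlap check used to decide which
# segmentations need visual review: Dice similarity between a candidate
# mask and the documented reference. Masks here are toy 1-D pixel sets,
# and the 0.95 threshold is an assumed acceptance criterion.

def dice(mask_a, mask_b):
    a, b = set(mask_a), set(mask_b)
    if not a and not b:
        return 1.0
    return 2 * len(a & b) / (len(a) + len(b))

reference = range(100, 200)   # documented segmentation (100 pixels)
candidate = range(110, 205)   # new algorithm outcome (95 pixels)

score = dice(reference, candidate)
print(round(score, 3))        # 0.923
needs_review = score < 0.95   # flag low-overlap cases for inspection
```

Only the flagged cases then consume reviewer time, which is how quantitative evaluation reduces the visual-inspection burden on a 7,440-image dataset.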
Who shares? Who doesn't? Factors associated with openly archiving raw research data.
Piwowar, Heather A
2011-01-01
Many initiatives encourage investigators to share their raw datasets in hopes of increasing research efficiency and quality. Despite these investments of time and money, we do not have a firm grasp of who openly shares raw research data, who doesn't, and which initiatives are correlated with high rates of data sharing. In this analysis I use bibliometric methods to identify patterns in the frequency with which investigators openly archive their raw gene expression microarray datasets after study publication. Automated methods identified 11,603 articles published between 2000 and 2009 that describe the creation of gene expression microarray data. Associated datasets in best-practice repositories were found for 25% of these articles, increasing from less than 5% in 2001 to 30%-35% in 2007-2009. Accounting for sensitivity of the automated methods, approximately 45% of recent gene expression studies made their data publicly available. First-order factor analysis on 124 diverse bibliometric attributes of the data creation articles revealed 15 factors describing authorship, funding, institution, publication, and domain environments. In multivariate regression, authors were most likely to share data if they had prior experience sharing or reusing data, if their study was published in an open access journal or a journal with a relatively strong data sharing policy, or if the study was funded by a large number of NIH grants. Authors of studies on cancer and human subjects were least likely to make their datasets available. These results suggest research data sharing levels are still low and increasing only slowly, and data is least available in areas where it could make the biggest impact. Let's learn from those with high rates of sharing to embrace the full potential of our research output.
NASA Astrophysics Data System (ADS)
Gorlach, Igor; Wessel, Oliver
2008-09-01
In the global automotive industry, for decades, vehicle manufacturers have continually increased the level of automation of production systems in order to be competitive. However, there is a new trend to decrease the level of automation, especially in final car assembly, for reasons of economy and flexibility. In this research, the final car assembly lines at three production sites of Volkswagen are analysed in order to determine the best level of automation for each, in terms of manufacturing costs, productivity, quality and flexibility. The case study is based on the methodology proposed by the Fraunhofer Institute. The results of the analysis indicate that fully automated assembly systems are not necessarily the best option in terms of cost, productivity and quality combined, which is attributed to high complexity of final car assembly systems; some de-automation is therefore recommended. On the other hand, the analysis shows that low automation can result in poor product quality due to reasons related to plant location, such as inadequate workers' skills, motivation, etc. Hence, the automation strategy should be formulated on the basis of analysis of all relevant aspects of the manufacturing process, such as costs, quality, productivity and flexibility in relation to the local context. A more balanced combination of automated and manual assembly operations provides better utilisation of equipment, reduces production costs and improves throughput.
Automated Transition State Theory Calculations for High-Throughput Kinetics.
Bhoorasingh, Pierre L; Slakman, Belinda L; Seyedzadeh Khanshan, Fariba; Cain, Jason Y; West, Richard H
2017-09-21
A scarcity of known chemical kinetic parameters leads to the use of many reaction rate estimates, which are not always sufficiently accurate, in the construction of detailed kinetic models. To reduce the reliance on these estimates and improve the accuracy of predictive kinetic models, we have developed a high-throughput, fully automated reaction rate calculation method, AutoTST. The algorithm integrates automated saddle-point geometry search methods and a canonical transition state theory kinetics calculator. The automatically calculated reaction rates compare favorably to existing estimated rates. Comparisons against high-level theoretical calculations show that the new automated method performs better than rate estimates when the estimate is made by a poor analogy. The method will be improved by accounting for internal-rotor contributions and by better methods for determining molecular symmetry.
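The canonical transition state theory underlying such a kinetics calculator evaluates the Eyring equation, k(T) = (kB·T/h)·exp(−ΔG‡/(R·T)). The sketch below is a minimal evaluation of that formula; the 80 kJ/mol activation free energy is an illustrative value, not an AutoTST result.

```python
# Minimal canonical-TST (Eyring) rate evaluation:
#   k(T) = (kB * T / h) * exp(-dG_act / (R * T))
# The 80 kJ/mol barrier below is illustrative, not from AutoTST.
import math

KB = 1.380649e-23    # Boltzmann constant, J/K
H  = 6.62607015e-34  # Planck constant, J*s
R  = 8.314462618     # gas constant, J/(mol*K)

def eyring_rate(dg_act, temperature):
    """First-order TST rate (1/s); dg_act is the activation free
    energy in J/mol, temperature in K."""
    return (KB * temperature / H) * math.exp(-dg_act / (R * temperature))

k298 = eyring_rate(80e3, 298.15)  # ~0.06 1/s for an 80 kJ/mol barrier
print(f"{k298:.3g} 1/s")
```

The prefactor kB·T/h (about 6.2e12 1/s at room temperature) sets the attempt frequency, and the exponential barrier term is where the saddle-point geometry search feeds in.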
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karnowski, Thomas Paul; Giancardo, Luca; Li, Yaquin
2013-01-01
Automated retina image analysis has reached a high level of maturity in recent years, and thus the question of how validation is performed in these systems is beginning to grow in importance. One application of retina image analysis is in telemedicine, where an automated system could enable the automated detection of diabetic retinopathy and other eye diseases as a low-cost method for broad-based screening. In this work we discuss our experiences in developing a telemedical network for retina image analysis, including our progression from a manual diagnosis network to a more fully automated one. We pay special attention to how validations of our algorithm steps are performed, using both data from the telemedicine network and other public databases.
Analysis of postural load during tasks related to milking cows-a case study.
Groborz, Anna; Tokarski, Tomasz; Roman-Liu, Danuta
2011-01-01
The aim of this study was to analyse postural load during tasks related to milking cows for 2 farmers on 2 different farms (one with a manual milk transport system, the other with a fully automated milk transport system) as a case study. The participants were full-time farmers; both were healthy and experienced in their job. The Ovako Working Posture Analyzing System (OWAS) was used to evaluate postural load and postural risk. Postural load was medium for the farmer on the farm with a manual milk transport system and high for the farmer working on the farm with a fully automated milk transport system. Thus, it can be concluded that a higher level of farm mechanization does not always mean a lower postural load for the farmer, although the limitations of OWAS should be considered.
Fully automated laser ray tracing system to measure changes in the crystalline lens GRIN profile.
Qiu, Chen; Maceo Heilman, Bianca; Kaipio, Jari; Donaldson, Paul; Vaghefi, Ehsan
2017-11-01
Measuring the lens gradient refractive index (GRIN) accurately and reliably has proven an extremely challenging technical problem. A fully automated laser ray tracing (LRT) system was built to address this issue. The LRT system captures images of multiple laser projections before and after traversing through an ex vivo lens. These LRT images, combined with accurate measurements of the lens geometry, are used to calculate the lens GRIN profile. Mathematically, this is an ill-conditioned problem; hence, it is essential to apply biologically relevant constraints to produce a feasible solution. The lens GRIN measurements were compared with previously published data. Our GRIN retrieval algorithm produces fast and accurate measurements of the lens GRIN profile. Experiments to study the optics of physiologically perturbed lenses are the future direction of this research.
A fully-automated neural network analysis of AFM force-distance curves for cancer tissue diagnosis
NASA Astrophysics Data System (ADS)
Minelli, Eleonora; Ciasca, Gabriele; Sassun, Tanya Enny; Antonelli, Manila; Palmieri, Valentina; Papi, Massimiliano; Maulucci, Giuseppe; Santoro, Antonio; Giangaspero, Felice; Delfini, Roberto; Campi, Gaetano; De Spirito, Marco
2017-10-01
Atomic Force Microscopy (AFM) has the unique capability of probing the nanoscale mechanical properties of biological systems that affect and are affected by the occurrence of many pathologies, including cancer. This capability has triggered growing interest in the translational process of AFM from physics laboratories to clinical practice. A factor still hindering the current use of AFM in diagnostics is related to the complexity of AFM data analysis, which is time-consuming and needs highly specialized personnel with a strong physical and mathematical background. In this work, we demonstrate an operator-independent neural-network approach for the analysis of surgically removed brain cancer tissues. This approach allowed us to distinguish—in a fully automated fashion—cancer from healthy tissues with high accuracy, also highlighting the presence and the location of infiltrating tumor cells.
Ackermann, Uwe; Plougastel, Lucie; Goh, Yit Wooi; Yeoh, Shinn Dee; Scott, Andrew M
2014-12-01
The synthesis of [(18)F]2-fluoroethyl azide and its subsequent click reaction with 5-ethynyl-2'-deoxyuridine (EDU) to form [(18)F]FLETT was performed using an iPhase FlexLab module. The implementation of a vacuum distillation method afforded [(18)F]2-fluoroethyl azide in 87±5.3% radiochemical yield. The use of Cu(CH3CN)4PF6 and TBTA as catalyst enabled us to fully automate the [(18)F]FLETT synthesis without the need for the operator to enter the radiation field. [(18)F]FLETT was produced in higher overall yield (41.3±6.5%) and shorter synthesis time (67 min) than with our previously reported manual method (32.5±2.5% in 130 min). Copyright © 2014 Elsevier Ltd. All rights reserved.
Godfrey, Alexander G; Masquelin, Thierry; Hemmerle, Horst
2013-09-01
This article describes our experiences in creating a fully integrated, globally accessible, automated chemical synthesis laboratory. The goal of the project was to establish a fully integrated automated synthesis solution that was initially focused on minimizing the burden of repetitive, routine, rules-based operations that characterize more established chemistry workflows. The architecture was crafted to allow for the expansion of synthetic capabilities while also providing for a flexible interface that permits the synthesis objective to be introduced and manipulated as needed under the judicious direction of a remote user in real-time. This innovative central synthesis suite is herein described along with some case studies to illustrate the impact such a system is having in expanding drug discovery capabilities. Copyright © 2013 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Huang, Xia; Li, Chunqiang; Xiao, Chuan; Sun, Wenqing; Qian, Wei
2017-03-01
The temporal focusing two-photon microscope (TFM) was developed to perform depth-resolved wide-field fluorescence imaging by capturing frames sequentially. However, due to strong, non-negligible noise and diffraction rings surrounding the particles, further research is extremely difficult without a precise particle localization technique. In this paper, we developed a fully automated scheme to locate particle positions with high noise tolerance. Our scheme includes the following procedures: noise reduction using a hybrid Kalman filter method, particle segmentation based on a multiscale kernel graph-cuts global and local segmentation algorithm, and a kinematic-estimation-based particle tracking method. Both isolated and partially overlapped particles can be accurately identified with removal of unrelated pixels. Based on our quantitative analysis, 96.22% of isolated particles and 84.19% of partially overlapped particles were successfully detected.
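The pipeline above combines Kalman-filter denoising, graph-cuts segmentation, and kinematic tracking. As a greatly simplified stand-in for the localization step alone, the sketch below smooths, thresholds, and centroids bright spots; the smoothing scale and threshold are assumed parameters, and this is not the authors' algorithm:

```python
import numpy as np
from scipy import ndimage

def locate_particles(image, threshold, sigma=1.0):
    """Toy particle localization: Gaussian smoothing, thresholding,
    and intensity-weighted centroids of connected components."""
    smoothed = ndimage.gaussian_filter(image.astype(float), sigma=sigma)
    mask = smoothed > threshold
    labels, n = ndimage.label(mask)          # connected-component labelling
    # One (row, col) centroid per detected particle
    return ndimage.center_of_mass(smoothed, labels, range(1, n + 1))
```

Overlapping particles and diffraction rings defeat such a naive scheme, which is precisely why the paper resorts to multiscale segmentation and kinematic tracking.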
Fricke, Jens; Pohlmann, Kristof; Jonescheit, Nils A; Ellert, Andree; Joksch, Burkhard; Luttmann, Reiner
2013-06-01
The identification of optimal expression conditions for state-of-the-art production of pharmaceutical proteins is a very time-consuming and expensive process. In this report a method for rapid and reproducible optimization of protein expression in an in-house designed small-scale BIOSTAT® multi-bioreactor plant is described. A newly developed BioPAT® MFCS/win Design of Experiments (DoE) module (Sartorius Stedim Systems, Germany) connects the process control system MFCS/win with the DoE software MODDE® (Umetrics AB, Sweden) and therefore enables the implementation of fully automated optimization procedures. As a proof of concept, a commercial Pichia pastoris strain KM71H was transformed for the expression of potential malaria vaccines. This approach allowed a doubling of intact protein secretion productivity, due to the DoE optimization procedure, compared with initial cultivation results. In a next step, robustness with regard to sensitivity to process parameter variability was proven around the determined optimum. A significantly improved pharmaceutical production process was thereby established within seven 24-hour cultivation cycles. Specifically, regarding the regulatory demands pointed out in the process analytical technology (PAT) initiative of the United States Food and Drug Administration (FDA), the combination of a highly instrumented, fully automated multi-bioreactor platform with proper cultivation strategies and extended DoE software solutions opens up promising benefits and opportunities for pharmaceutical protein production. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Zhang, Zhongqi; Zhang, Aming; Xiao, Gang
2012-06-05
Protein hydrogen/deuterium exchange (HDX) followed by protease digestion and mass spectrometric (MS) analysis is accepted as a standard method for studying protein conformation and conformational dynamics. In this article, an improved HDX MS platform with fully automated data processing is described. The platform significantly reduces systematic and random errors in the measurement by introducing two types of corrections in HDX data analysis. First, a mixture of short peptides with fast HDX rates is introduced as internal standards to adjust for run-to-run variations in the extent of back exchange. Second, a designed unique peptide (PPPI) with a slow intrinsic HDX rate is employed as another internal standard to reflect possible differences in protein intrinsic HDX rates when protein conformations at different solution conditions are compared. HDX data processing is achieved with a comprehensive HDX model that simulates the deuterium labeling and back-exchange process. The HDX model is implemented in the in-house developed software MassAnalyzer and enables fully unattended analysis of the entire protein HDX MS data set, from ion detection and peptide identification to the final processed HDX output, typically within 1 day. The final output of the automated data processing is a set (or the average) of the most probable protection factors for each backbone amide hydrogen. The utility of the HDX MS platform is demonstrated by exploring the conformational transition of a monoclonal antibody induced by increasing concentrations of guanidine.
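The first correction described above amounts to rescaling each measured uptake by the label recovery of a fast-exchanging internal standard run alongside the samples. A minimal sketch of that ratio model follows; the function and numbers are illustrative assumptions, not the MassAnalyzer implementation:

```python
def correct_back_exchange(measured_uptake, standard_measured, standard_max):
    """Adjust a peptide's measured deuterium uptake for back exchange.

    measured_uptake   -- centroid uptake of the analyte peptide (Da)
    standard_measured -- uptake observed for a fully deuterated internal standard
    standard_max      -- theoretical maximum uptake of that standard
    """
    recovery = standard_measured / standard_max   # fraction of label retained
    return measured_uptake / recovery

# If the standard retains 3.5 of 5.0 exchangeable deuteriums (70% recovery),
# a measured uptake of 2.1 Da corrects to 3.0 Da.
corrected = correct_back_exchange(2.1, 3.5, 5.0)
```

Because the standard is subject to the same workup as the analyte, this normalization cancels run-to-run differences in back exchange.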
Automated sizing of large structures by mixed optimization methods
NASA Technical Reports Server (NTRS)
Sobieszczanski, J.; Loendorf, D.
1973-01-01
A procedure for automating the sizing of wing-fuselage airframes was developed and implemented in the form of an operational program. The program combines fully stressed design to determine an overall material distribution with mass-strength and mathematical programming methods to design structural details accounting for realistic design constraints. The practicality and efficiency of the procedure is demonstrated for transport aircraft configurations. The methodology is sufficiently general to be applicable to other large and complex structures.
Automated ILA design for synchronous sequential circuits
NASA Technical Reports Server (NTRS)
Liu, M. N.; Liu, K. Z.; Maki, G. K.; Whitaker, S. R.
1991-01-01
An iterative logic array (ILA) architecture for synchronous sequential circuits is presented. This technique utilizes linear algebra to produce the design equations. The ILA realization of synchronous sequential logic can be fully automated with a computer program. A programmable design procedure is proposed to fulfill the design task and layout generation. A software algorithm in the C language has been developed and tested to generate 1-micron CMOS layouts using the Hewlett-Packard FUNGEN module generator shell.
Automated position control of a surface array relative to a liquid microjunction surface sampler
Van Berkel, Gary J.; Kertesz, Vilmos; Ford, Michael James
2007-11-13
A system and method utilizes an image analysis approach for controlling the probe-to-surface distance of a liquid junction-based surface sampling system for use with mass spectrometric detection. Such an approach enables a hands-free formation of the liquid microjunction used to sample solution composition from the surface and for re-optimization, as necessary, of the microjunction thickness during a surface scan to achieve a fully automated surface sampling system.
Leb, Victoria; Stöcher, Markus; Valentine-Thon, Elizabeth; Hölzl, Gabriele; Kessler, Harald; Stekel, Herbert; Berg, Jörg
2004-02-01
We report on the development of a fully automated real-time PCR assay for the quantitative detection of hepatitis B virus (HBV) DNA in plasma with EDTA (EDTA plasma). The MagNA Pure LC instrument was used for automated DNA purification and automated preparation of PCR mixtures. Real-time PCR was performed on the LightCycler instrument. An internal amplification control was devised as a PCR competitor and was introduced into the assay at the stage of DNA purification to permit monitoring for sample adequacy. The detection limit of the assay was found to be 200 HBV DNA copies/ml, with a linear dynamic range of 8 orders of magnitude. When samples from the European Union Quality Control Concerted Action HBV Proficiency Panel 1999 were examined, the results were found to be in acceptable agreement with the HBV DNA concentrations of the panel members. In a clinical laboratory evaluation of 123 EDTA plasma samples, a significant correlation was found with the results obtained by the Roche HBV Monitor test on the Cobas Amplicor analyzer within the dynamic range of that system. In conclusion, the newly developed assay has a markedly reduced hands-on time, permits monitoring for sample adequacy, and is suitable for the quantitative detection of HBV DNA in plasma in a routine clinical laboratory.
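Quantitative real-time PCR assays of this kind report copy number by mapping the threshold cycle onto a standard curve that is linear in log10(copies). A hedged sketch of that conversion; the slope and intercept below are illustrative calibration values, not this assay's:

```python
def copies_from_ct(ct, slope=-3.32, intercept=40.0):
    """Convert a real-time PCR threshold cycle (Ct) to input copy number.

    Standard curve model: ct = slope * log10(copies) + intercept.
    A slope of -3.32 corresponds to perfect doubling each cycle.
    """
    return 10.0 ** ((ct - intercept) / slope)

# With the illustrative curve, every ~3.32-cycle decrease in Ct corresponds
# to tenfold more template, which is how a linear dynamic range spanning
# several orders of magnitude is achieved.
```

The internal amplification control mentioned in the abstract does not enter this calculation; it only flags samples whose amplification was inhibited.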
Improving Grid Resilience through Informed Decision-making (IGRID)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burnham, Laurie; Stamber, Kevin L.; Jeffers, Robert Fredric
The transformation of the distribution grid from a centralized to a decentralized architecture, with bi-directional power and data flows, is made possible by a surge in network intelligence and grid automation. While these changes are largely beneficial, the interface between grid operator and automated technologies is not well understood, nor are the benefits and risks of automation. Quantifying and understanding the latter is an important facet of grid resilience that needs to be fully investigated. The work described in this document represents the first empirical study aimed at identifying and mitigating the vulnerabilities posed by automation for a grid that for the foreseeable future will remain a human-in-the-loop critical infrastructure. Our scenario-based methodology enabled us to conduct a series of experimental studies to identify causal relationships between grid-operator performance and automated technologies and to collect measurements of human performance as a function of automation. Our findings, though preliminary, suggest there are predictive patterns in the interplay between human operators and automation, patterns that can inform the rollout of distribution automation and the hiring and training of operators, and contribute in multiple and significant ways to the field of grid resilience.
An Intelligent Automation Platform for Rapid Bioprocess Design
Wu, Tianyi
2014-01-01
Bioprocess development is very labor intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet lab experiments, (2) standalone analysis devices detect the data, and (3) specific software is used for data analysis and experiment design given the user’s inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multiagent intelligent architecture has been developed incorporating agent communication to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design and also the ability to generate scripts to run the experiments automatically, allowing the elimination of human involvement. A first-generation prototype has been established and demonstrated through lysozyme precipitation process design. All procedures in the case study have been fully automated through an intelligent automation platform. The realization of automated data analysis and experiment design, and automated script programming for experimental procedures has the potential to increase lab productivity. PMID:24088579
Advanced automation for in-space vehicle processing
NASA Technical Reports Server (NTRS)
Sklar, Michael; Wegerif, D.
1990-01-01
The primary objective of this 3-year planned study is to assure that the fully evolved Space Station Freedom (SSF) can support automated processing of exploratory mission vehicles. Current study assessments show that the extravehicular activity (EVA) and, to some extent, intravehicular activity (IVA) manpower required for processing tasks far exceeds the available manpower. Furthermore, many processing tasks are either hazardous operations or exceed EVA capability. Thus, automation is essential for SSF transportation node functionality. Here, advanced automation represents the replacement of human-performed tasks beyond the planned baseline automated tasks. Both physical tasks such as manipulation, assembly and actuation, and cognitive tasks such as visual inspection, monitoring and diagnosis, and task planning are considered. During this first year of activity, both the Phobos/Gateway Mars Expedition and Lunar Evolution missions proposed by the Office of Exploration have been evaluated. A methodology for choosing optimal tasks to be automated has been developed. Processing tasks for both missions have been ranked on the basis of automation potential. The underlying concept in evaluating and describing processing tasks has been the use of a common set of 'primitive' task descriptions. Primitive or standard tasks have been developed both for manual or crew processing and for automated machine processing.
Kesner, Adam Leon; Kuntner, Claudia
2010-10-01
Respiratory gating in PET is an approach used to minimize the negative effects of respiratory motion on spatial resolution. It is based on an initial determination of a patient's respiratory movements during a scan, typically using hardware-based systems. In recent years, several fully automated data-based algorithms have been presented for extracting a respiratory signal directly from PET data, providing a very practical strategy for implementing gating in the clinic. In this work, a new method is presented for extracting a respiratory signal from raw PET sinogram data and compared to previously presented automated techniques. The acquisition of the respiratory signal from PET data in the newly proposed method is based on rebinning the sinogram data into smaller data structures and then analyzing the time-activity behavior in the elements of these structures. From this analysis, a 1D respiratory trace is produced, analogous to a hardware-derived respiratory trace. To assess the accuracy of this fully automated method, respiratory signal was extracted from a collection of 22 clinical FDG-PET scans using this method and compared to signal derived from several other software-based methods as well as signal derived from a hardware system. The method presented required approximately 9 min of processing time for each 10 min scan (using a single 2.67 GHz processor), which in theory can be accomplished while the scan is being acquired, thereby allowing real-time respiratory signal acquisition. Using the mean correlation between the software-based and hardware-based respiratory traces, the optimal parameters were determined for the presented algorithm. The mean/median/range of correlations for the set of scans when using the optimal parameters was found to be 0.58/0.68/0.07-0.86. The speed of this method was within the range of real-time, while its accuracy surpassed the most accurate of the previously presented algorithms.
PET data inherently contains information about patient motion; information that is not currently being utilized. We have shown that a respiratory signal can be extracted from raw PET data in potentially real-time and in a fully automated manner. This signal correlates well with hardware based signal for a large percentage of scans, and avoids the efforts and complications associated with hardware. The proposed method to extract a respiratory signal can be implemented on existing scanners and, if properly integrated, can be applied without changes to routine clinical procedures.
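The rebinning-and-time-activity idea described above can be illustrated with a toy version: collapse each short sinogram frame into coarse patches, then take the first principal component of the patch time-activity curves as the 1D surrogate trace. This is a sketch of the general data-driven approach, not the authors' exact algorithm; the patch size and the PCA step are assumptions:

```python
import numpy as np

def respiratory_trace(sinogram_frames, n_bins=8):
    """Toy data-driven respiratory surrogate from dynamic sinogram data.

    sinogram_frames -- array of shape (T, R, A): counts per short time frame,
                       with R and A divisible by n_bins.
    Returns a 1D trace of length T.
    """
    # Coarse rebinning: collapse each frame into n_bins x n_bins patches
    patches = np.stack([
        f.reshape(n_bins, f.shape[0] // n_bins,
                  n_bins, f.shape[1] // n_bins).sum(axis=(1, 3)).ravel()
        for f in sinogram_frames
    ])                                    # (T, n_bins*n_bins) time-activity curves
    centred = patches - patches.mean(axis=0)
    # First principal component captures the dominant coherent motion signal
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    return centred @ vt[0]
```

The sign of a PCA-derived trace is arbitrary, so in practice it would need to be anchored, for example against a short hardware-derived reference.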
Bayesian ISOLA: new tool for automated centroid moment tensor inversion
NASA Astrophysics Data System (ADS)
Vackář, Jiří; Burjánek, Jan; Gallovič, František; Zahradník, Jiří; Clinton, John
2017-04-01
Focal mechanisms are important for understanding the seismotectonics of a region, and they serve as a basic input for seismic hazard assessment. Usually, the point source approximation and the moment tensor (MT) are used. We have developed a new, fully automated tool for centroid moment tensor (CMT) inversion in a Bayesian framework. It includes automated data retrieval, data selection in which station components with instrumental disturbances or poor signal-to-noise ratios are rejected, and full-waveform inversion in a space-time grid around a provided hypocenter. The method is innovative in the following aspects: (i) The CMT inversion is fully automated; no user interaction is required, although the details of the process can be visually inspected later in the many figures that are automatically plotted. (ii) The automated process includes detection of disturbances based on the MouseTrap code, so disturbed recordings do not affect the inversion. (iii) A data covariance matrix calculated from pre-event noise yields an automated weighting of the station recordings according to their noise levels and also serves as an automated frequency filter suppressing noisy frequencies. (iv) A Bayesian approach is used, so not only the best solution is obtained but also the posterior probability density function. (v) A space-time grid search, effectively combined with least-squares inversion of the moment tensor components, speeds up the inversion and yields more accurate results than stochastic methods. The method has been tested on synthetic and observed data, and by comparison with manually processed moment tensors of all events with M ≥ 3 in the Swiss catalogue over 16 years, using data available at the Swiss data center (http://arclink.ethz.ch). The quality of the results of the presented automated process is comparable with careful manual processing of data.
The software package programmed in Python has been designed to be as versatile as possible in order to be applicable in various networks ranging from local to regional. The method can be applied either to the everyday network data flow, or to process large previously existing earthquake catalogues and data sets.
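Aspect (v) above, a space-time grid search wrapped around a linear least-squares solve for the six MT components, can be sketched as follows. The Green's-function matrices and the simple Gaussian likelihood are illustrative assumptions; the actual tool also folds in the data covariance matrix from aspect (iii):

```python
import numpy as np

def grid_search_cmt(greens, data, noise_std=1.0):
    """Toy CMT grid search: at each candidate space-time grid point, solve the
    linear problem G m = d for the 6 moment-tensor components, then rank the
    grid points by a Gaussian likelihood of the waveform residual.

    greens -- list of (n_samples, 6) Green's-function matrices, one per point
    data   -- observed waveform samples, shape (n_samples,)
    Returns (best_index, best_moment_tensor, posterior_probabilities).
    """
    log_like = np.empty(len(greens))
    tensors = []
    for i, G in enumerate(greens):
        m, *_ = np.linalg.lstsq(G, data, rcond=None)   # least-squares MT
        resid = data - G @ m
        log_like[i] = -0.5 * np.sum((resid / noise_std) ** 2)
        tensors.append(m)
    post = np.exp(log_like - log_like.max())           # flat prior assumed
    post /= post.sum()
    best = int(np.argmax(post))
    return best, tensors[best], post
```

Because the MT sub-problem is linear, only the centroid position and time need gridding, which is what makes the combined search fast relative to fully stochastic samplers.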
Technical Assessment of the Transette Transit System
DOT National Transportation Integrated Search
1981-10-01
This report describes an assessment of the Transette system located at the Georgia Institute of Technology in Atlanta, Georgia. The Transette system is a unique, fully-automated, engineering prototype transportation test system installed on the campu...
A Next-generation Tissue Microarray (ngTMA) Protocol for Biomarker Studies
Zlobec, Inti; Suter, Guido; Perren, Aurel; Lugli, Alessandro
2014-01-01
Biomarker research relies on tissue microarrays (TMA). TMAs are produced by repeated transfer of small tissue cores from a ‘donor’ block into a ‘recipient’ block and then used for a variety of biomarker applications. The construction of conventional TMAs is labor intensive, imprecise, and time-consuming. Here, a protocol using next-generation Tissue Microarrays (ngTMA) is outlined. ngTMA is based on TMA planning and design, digital pathology, and automated tissue microarraying. The protocol is illustrated using an example of 134 metastatic colorectal cancer patients. Histological, statistical and logistical aspects are considered, such as the tissue type, specific histological regions, and cell types for inclusion in the TMA, the number of tissue spots, sample size, statistical analysis, and number of TMA copies. Histological slides for each patient are scanned and uploaded onto a web-based digital platform. There, they are viewed and annotated (marked) using a 0.6-2.0 mm diameter tool, multiple times using various colors to distinguish tissue areas. Donor blocks and 12 ‘recipient’ blocks are loaded into the instrument. Digital slides are retrieved and matched to donor block images. Repeated arraying of annotated regions is automatically performed resulting in an ngTMA. In this example, six ngTMAs are planned containing six different tissue types/histological zones. Two copies of the ngTMAs are desired. Three to four slides for each patient are scanned; 3 scan runs are necessary and performed overnight. All slides are annotated; different colors are used to represent the different tissues/zones, namely tumor center, invasion front, tumor/stroma, lymph node metastases, liver metastases, and normal tissue. 17 annotations/case are made; time for annotation is 2-3 min/case. 12 ngTMAs are produced containing 4,556 spots. Arraying time is 15-20 hr. 
Due to its precision, flexibility and speed, ngTMA is a powerful tool to further improve the quality of TMAs used in clinical and translational research. PMID:25285857
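The reported spot count is consistent with the study design stated above: 134 patients, 17 annotated regions per case, and two copies of each array. A quick check of that arithmetic:

```python
patients = 134
annotations_per_case = 17
array_copies = 2

# 134 * 17 * 2 = 4556, matching the 4,556 spots reported across the 12 ngTMAs
total_spots = patients * annotations_per_case * array_copies
```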
Method and Apparatus for Automated Isolation of Nucleic Acids from Small Cell Samples
NASA Technical Reports Server (NTRS)
Sundaram, Shivshankar; Prabhakarpandian, Balabhaskar; Pant, Kapil; Wang, Yi
2014-01-01
RNA isolation is a ubiquitous need, driven by current emphasis on microarrays and miniaturization. With commercial systems requiring 100,000 to 1,000,000 cells for successful isolation, there is a growing need for a small-footprint, easy-to-use device that can harvest nucleic acids from much smaller cell samples (1,000 to 10,000 cells). Extracting RNA from cell cultures is a complex, multi-step process that requires timed, asynchronous operations with multiple reagents/buffers. An added complexity is the fragility of RNA (subject to degradation) and its reactivity to surfaces. A novel, microfluidics-based, integrated cartridge has been developed that can fully automate the complex process of RNA isolation (lyse, capture, and elute RNA) from small cell culture samples. On-cartridge cell lysis is achieved using either reagents or high-strength electric fields made possible by the miniaturized format. Traditionally, silica-based, porous-membrane formats have been used for RNA capture, requiring slow perfusion for effective capture. In this design, high-efficiency capture and elution are achieved using a microsphere-based "microfluidized" format. Electrokinetic phenomena are harnessed to actively mix microspheres with the cell lysate and capture/elution buffer, providing important advantages in extraction efficiency, processing time, and operational flexibility. Successful RNA isolation was demonstrated using both suspension (HL-60) and adherent (BHK-21) cells. The novel features associated with this development are twofold. First, novel designs that execute the needed processes with improved speed and efficiency were developed. These primarily encompass electric-field-driven lysis of cells.
The configurations include electrode-containing constructs, or an "electrode-less" chip design, which is easy to fabricate and mitigates fouling at the electrode surface; and the "fluidized" extraction format based on electrokinetically assisted mixing and contacting of microbeads in a shape-optimized chamber. A secondary proprietary feature is the particular layout integrating these components to perform the desired operation of RNA isolation. Apart from a novel functional capability, advantages of the innovation include reduced or eliminated use of toxic reagents, and operator-independent extraction of RNA.
Ibrahim, Sarah A; Martini, Luigi
2014-08-01
Dissolution method transfer is a complicated yet common process in the pharmaceutical industry. With increased pharmaceutical product manufacturing and dissolution acceptance requirements, dissolution testing has become one of the most labor-intensive quality control testing methods. There is an increasing trend toward automation in dissolution testing, particularly at large pharmaceutical companies, to reduce variability and increase personnel efficiency. There is no official guideline for transferring a dissolution testing method from a manual or semi-automated tester to a fully automated dissolution tester. In this study, a manual multipoint dissolution testing procedure for an enteric-coated aspirin tablet was transferred effectively and reproducibly to a fully automated dissolution testing device, RoboDis II. Enteric-coated aspirin samples were used as a model formulation to assess the feasibility and accuracy of media pH change during continuous automated dissolution testing. Several RoboDis II parameters were evaluated to ensure the integrity and equivalency of the dissolution method transfer from a manual dissolution tester. This study provides a systematic outline for the transfer of a manual dissolution testing protocol to an automated dissolution tester, and further supports the finding that automated dissolution testers that are compliant with regulatory requirements and comparable to manual testers facilitate method transfer. © 2014 Society for Laboratory Automation and Screening.
Comparison of Actual Costs to Integrate Commercial Buildings with the Grid
DOE Office of Scientific and Technical Information (OSTI.GOV)
Piette, Mary Ann; Black, Doug; Yin, Rongxin
During the past decade, the technology to automate demand response (DR) in buildings and industrial facilities has advanced significantly. Automation allows rapid, repeatable, reliable operation. This study focuses on costs for DR automation in commercial buildings, with some discussion of residential buildings and industrial facilities. DR automation technology relies on numerous components, including communication systems, hardware and software gateways, standards-based messaging protocols, controls and integration platforms, and measurement and telemetry systems. This paper discusses the factors that contribute to the costs of automated DR systems, with a focus on OpenADR 1.0 and 2.0 systems. In addition, this report compares cost data from several DR automation programs and pilot projects, evaluates trends in the cost per unit of DR and kilowatts (kW) available from automated systems, and applies a standard naming convention and classification or taxonomy for system elements. In summary, median costs for the 56 installed automated DR systems studied here are about $200/kW. The deviation around this median is large, with costs in some cases being an order of magnitude greater or less than the median. Costs to automate fast DR systems for ancillary services are not fully analyzed in this report because additional research is needed to determine their total costs.
NASA Technical Reports Server (NTRS)
Cabrall, C.; Gomez, A.; Homola, J.; Hunt, S.; Martin, L.; Mercer, J.; Prevot, T.
2013-01-01
As part of an ongoing research effort on separation assurance and functional allocation in NextGen, a controller-in-the-loop study with ground-based automation was conducted at NASA Ames' Airspace Operations Laboratory in August 2012 to investigate the potential impact of introducing self-separating aircraft in progressively advanced NextGen timeframes. From this larger study, the current exploratory analysis of controller-automation interaction styles focuses on the last and most far-term time frame. Measurements were recorded that first verified the continued operational validity of this iteration of the ground-based functional allocation automation concept in forecast traffic densities up to 2x that of current-day high-altitude en-route sectors. Additionally, with greater levels of fully automated conflict detection and resolution as well as the introduction of intervention functionality, objective and subjective analyses showed a range of passive to active controller-automation interaction styles among the participants. Not only did the controllers work with the automation to meet their safety and capacity goals in the simulated future NextGen timeframe, they did so in different ways and with different attitudes of trust in and use of the automation. Taken as a whole, the results showed that the prototyped controller-automation functional allocation framework was very flexible and successful overall.
The Changing Role of the Clinical Microbiology Laboratory in Defining Resistance in Gram-negatives.
Endimiani, Andrea; Jacobs, Michael R
2016-06-01
The evolution of resistance in Gram-negatives has challenged the clinical microbiology laboratory to implement new methods for their detection. Multidrug-resistant strains present major challenges to conventional and new detection methods. More rapid pathogen identification and antimicrobial susceptibility testing have been developed for use directly on specimens, including fluorescence in situ hybridization tests, automated polymerase chain reaction systems, microarrays, mass spectroscopy, next-generation sequencing, and microfluidics. Review of these methods shows the advances that have been made in rapid detection of resistance in cultures, but limited progress in direct detection from specimens. Copyright © 2016 Elsevier Inc. All rights reserved.
NCBI GEO: archive for high-throughput functional genomic data.
Barrett, Tanya; Troup, Dennis B; Wilhite, Stephen E; Ledoux, Pierre; Rudnev, Dmitry; Evangelista, Carlos; Kim, Irene F; Soboleva, Alexandra; Tomashevsky, Maxim; Marshall, Kimberly A; Phillippy, Katherine H; Sherman, Patti M; Muertter, Rolf N; Edgar, Ron
2009-01-01
The Gene Expression Omnibus (GEO) at the National Center for Biotechnology Information (NCBI) is the largest public repository for high-throughput gene expression data. Additionally, GEO hosts other categories of high-throughput functional genomic data, including those that examine genome copy number variations, chromatin structure, methylation status and transcription factor binding. These data are generated by the research community using high-throughput technologies like microarrays and, more recently, next-generation sequencing. The database has a flexible infrastructure that can capture fully annotated raw and processed data, enabling compliance with major community-derived scientific reporting standards such as 'Minimum Information About a Microarray Experiment' (MIAME). In addition to serving as a centralized data storage hub, GEO offers many tools and features that allow users to effectively explore, analyze and download expression data from both gene-centric and experiment-centric perspectives. This article summarizes the GEO repository structure, content and operating procedures, as well as recently introduced data mining features. GEO is freely accessible at http://www.ncbi.nlm.nih.gov/geo/.
NCBI GEO: mining millions of expression profiles--database and tools.
Barrett, Tanya; Suzek, Tugba O; Troup, Dennis B; Wilhite, Stephen E; Ngau, Wing-Chi; Ledoux, Pierre; Rudnev, Dmitry; Lash, Alex E; Fujibuchi, Wataru; Edgar, Ron
2005-01-01
The Gene Expression Omnibus (GEO) at the National Center for Biotechnology Information (NCBI) is the largest fully public repository for high-throughput molecular abundance data, primarily gene expression data. The database has a flexible and open design that allows the submission, storage and retrieval of many data types. These data include microarray-based experiments measuring the abundance of mRNA, genomic DNA and protein molecules, as well as non-array-based technologies such as serial analysis of gene expression (SAGE) and mass spectrometry proteomic technology. GEO currently holds over 30,000 submissions representing approximately half a billion individual molecular abundance measurements, for over 100 organisms. Here, we describe recent database developments that facilitate effective mining and visualization of these data. Features are provided to examine data from both experiment- and gene-centric perspectives using user-friendly Web-based interfaces accessible to those without computational or microarray-related analytical expertise. The GEO database is publicly accessible through the World Wide Web at http://www.ncbi.nlm.nih.gov/geo.
Xu, Lingyang; Hou, Yali; Bickhart, Derek M; Song, Jiuzhou; Liu, George E
2013-06-25
Copy number variations (CNVs) are gains and losses of genomic sequence between two individuals of a species when compared to a reference genome. The data from single nucleotide polymorphism (SNP) microarrays are now routinely used for genotyping, but they also can be utilized for copy number detection. Substantial progress has been made in array design and CNV calling algorithms and at least 10 comparison studies in humans have been published to assess them. In this review, we first survey the literature on existing microarray platforms and CNV calling algorithms. We then examine a number of CNV calling tools to evaluate their impacts using bovine high-density SNP data. Large incongruities in the results from different CNV calling tools highlight the need for standardizing array data collection, quality assessment and experimental validation. Only after careful experimental design and rigorous data filtering can the impacts of CNVs on both normal phenotypic variability and disease susceptibility be fully revealed.
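The basic idea shared by many SNP-array CNV callers, that sustained shifts in the log R ratio signal indicate gains or losses, can be sketched with a deliberately minimal windowed-threshold classifier. This is an illustration of the concept only, not the algorithm of any tool surveyed in the review; the window size and the ±0.3 thresholds are illustrative assumptions (real callers typically use hidden Markov models or segmentation with per-probe noise models).

```python
import numpy as np

def call_cnv_windows(log_r, window=20, gain_thr=0.3, loss_thr=-0.3):
    """Toy CNV caller: classify fixed windows of SNP-array log R ratios.

    Real callers model probe noise and segment boundaries (e.g. via HMMs);
    here a windowed mean is simply thresholded. Window size and thresholds
    are illustrative assumptions, not values from the review.
    """
    calls = []
    for start in range(0, len(log_r) - window + 1, window):
        m = np.mean(log_r[start:start + window])
        if m > gain_thr:
            state = "gain"
        elif m < loss_thr:
            state = "loss"
        else:
            state = "neutral"
        calls.append((start, start + window, state))
    return calls

# Synthetic example: 60 probes with a heterozygous-deletion-like shift
# (mean log R ratio of about -0.6) in the middle third.
rng = np.random.default_rng(0)
lrr = np.concatenate([rng.normal(0.0, 0.1, 20),
                      rng.normal(-0.6, 0.1, 20),
                      rng.normal(0.0, 0.1, 20)])
print(call_cnv_windows(lrr))
```

The large incongruities between tools noted in the abstract arise precisely because each tool replaces this naive thresholding step with a different statistical model.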
A consensus prognostic gene expression classifier for ER positive breast cancer
Teschendorff, Andrew E; Naderi, Ali; Barbosa-Morais, Nuno L; Pinder, Sarah E; Ellis, Ian O; Aparicio, Sam; Brenton, James D; Caldas, Carlos
2006-01-01
Background A consensus prognostic gene expression classifier is still elusive in heterogeneous diseases such as breast cancer. Results Here we perform a combined analysis of three major breast cancer microarray data sets to home in on a universally valid prognostic molecular classifier in estrogen receptor (ER) positive tumors. Using a recently developed robust measure of prognostic separation, we further validate the prognostic classifier in three external independent cohorts, confirming the validity of our molecular classifier in a total of 877 ER positive samples. Furthermore, we find that molecular classifiers may not outperform classical prognostic indices but that they can be used in hybrid molecular-pathological classification schemes to improve prognostic separation. Conclusion The prognostic molecular classifier presented here is the first to be valid in over 877 ER positive breast cancer samples and across three different microarray platforms. Larger multi-institutional studies will be needed to fully determine the added prognostic value of molecular classifiers when combined with standard prognostic factors. PMID:17076897
Probabilistic segmentation and intensity estimation for microarray images.
Gottardo, Raphael; Besag, Julian; Stephens, Matthew; Murua, Alejandro
2006-01-01
We describe a probabilistic approach to simultaneous image segmentation and intensity estimation for complementary DNA microarray experiments. The approach overcomes several limitations of existing methods. In particular, it (a) uses a flexible Markov random field approach to segmentation that allows for a wider range of spot shapes than existing methods, including relatively common 'doughnut-shaped' spots; (b) models the image directly as background plus hybridization intensity, and estimates the two quantities simultaneously, avoiding the common logical error that estimates of foreground may be less than those of the corresponding background if the two are estimated separately; and (c) uses a probabilistic modeling approach to simultaneously perform segmentation and intensity estimation, and to compute spot quality measures. We describe two approaches to parameter estimation: a fast algorithm, based on the expectation-maximization and the iterated conditional modes algorithms, and a fully Bayesian framework. These approaches produce comparable results, and both appear to offer some advantages over other methods. We use an HIV experiment to compare our approach to two commercial software products: Spot and Arrayvision.
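The iterated-conditional-modes step mentioned above can be illustrated with a stripped-down two-class version: Gaussian pixel likelihoods plus an Ising smoothness prior over 4-connected neighbours. The paper's actual model is considerably richer (flexible spot shapes, simultaneous background/intensity estimation, quality measures), so treat this as a sketch of the generic ICM technique under assumed parameters, not a reimplementation of the authors' method.

```python
import numpy as np

def icm_segment(img, mu_bg, mu_fg, sigma=1.0, beta=1.5, iters=5):
    """Two-class ICM segmentation: Gaussian likelihoods + Ising prior.

    A simplified sketch of the ICM idea from the abstract. mu_bg/mu_fg are
    assumed known class means; beta weights label smoothness. The paper's
    model additionally estimates background and intensity jointly.
    """
    # Initialize by nearest class mean.
    labels = (np.abs(img - mu_fg) < np.abs(img - mu_bg)).astype(int)
    means = np.array([mu_bg, mu_fg])
    for _ in range(iters):
        for i in range(img.shape[0]):
            for j in range(img.shape[1]):
                costs = []
                for k in (0, 1):
                    data = (img[i, j] - means[k]) ** 2 / (2 * sigma ** 2)
                    # Penalize disagreement with 4-connected neighbours.
                    smooth = 0.0
                    for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < img.shape[0] and 0 <= nj < img.shape[1]:
                            smooth += beta * (labels[ni, nj] != k)
                    costs.append(data + smooth)
                labels[i, j] = int(np.argmin(costs))
    return labels

# A noisy 8x8 "image" with a bright 4x4 spot in the centre.
rng = np.random.default_rng(1)
img = rng.normal(0.0, 0.5, (8, 8))
img[2:6, 2:6] += 4.0
seg = icm_segment(img, mu_bg=0.0, mu_fg=4.0, sigma=0.5)
print(seg)
```

ICM converges quickly to a local mode, which is why the paper pairs it with expectation-maximization as the fast alternative to the fully Bayesian approach.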
Zhang, Xirui; Daaboul, George G; Spuhler, Philipp S; Dröge, Peter; Ünlü, M Selim
2016-03-14
DNA-binding proteins play crucial roles in the maintenance and functions of the genome and yet, their specific binding mechanisms are not fully understood. Recently, it was discovered that DNA-binding proteins recognize specific binding sites to carry out their functions through an indirect readout mechanism by recognizing and capturing DNA conformational flexibility and deformation. High-throughput DNA microarray-based methods that provide large-scale protein-DNA binding information have shown effective and comprehensive analysis of protein-DNA binding affinities, but do not provide information of DNA conformational changes in specific protein-DNA complexes. Building on the high-throughput capability of DNA microarrays, we demonstrate a quantitative approach that simultaneously measures the amount of protein binding to DNA and nanometer-scale DNA conformational change induced by protein binding in a microarray format. Both measurements rely on spectral interferometry on a layered substrate using a single optical instrument in two distinct modalities. In the first modality, we quantitate the amount of binding of protein to surface-immobilized DNA in each DNA spot using a label-free spectral reflectivity technique that accurately measures the surface densities of protein and DNA accumulated on the substrate. In the second modality, for each DNA spot, we simultaneously measure DNA conformational change using a fluorescence vertical sectioning technique that determines average axial height of fluorophores tagged to specific nucleotides of the surface-immobilized DNA. The approach presented in this paper, when combined with current high-throughput DNA microarray-based technologies, has the potential to serve as a rapid and simple method for quantitative and large-scale characterization of conformation-specific protein-DNA interactions.
Carmona, Santiago J.; Nielsen, Morten; Schafer-Nielsen, Claus; Mucci, Juan; Altcheh, Jaime; Balouz, Virginia; Tekiel, Valeria; Frasch, Alberto C.; Campetella, Oscar; Buscaglia, Carlos A.; Agüero, Fernán
2015-01-01
Complete characterization of antibody specificities associated to natural infections is expected to provide a rich source of serologic biomarkers with potential applications in molecular diagnosis, follow-up of chemotherapeutic treatments, and prioritization of targets for vaccine development. Here, we developed a highly-multiplexed platform based on next-generation high-density peptide microarrays to map these specificities in Chagas Disease, an exemplar of a human infectious disease caused by the protozoan Trypanosoma cruzi. We designed a high-density peptide microarray containing more than 175,000 overlapping 15mer peptides derived from T. cruzi proteins. Peptides were synthesized in situ on microarray slides, spanning the complete length of 457 parasite proteins with fully overlapped 15mers (1 residue shift). Screening of these slides with antibodies purified from infected patients and healthy donors demonstrated both a high technical reproducibility as well as epitope mapping consistency when compared with earlier low-throughput technologies. Using a conservative signal threshold to classify positive (reactive) peptides we identified 2,031 disease-specific peptides and 97 novel parasite antigens, effectively doubling the number of known antigens and providing a 10-fold increase in the number of fine mapped antigenic determinants for this disease. Finally, further analysis of the chip data showed that optimizing the amount of sequence overlap of displayed peptides can increase the protein space covered in a single chip by at least ∼threefold without sacrificing sensitivity. In conclusion, we show the power of high-density peptide chips for the discovery of pathogen-specific linear B-cell epitopes from clinical samples, thus setting the stage for high-throughput biomarker discovery screenings and proteome-wide studies of immune responses against pathogens. PMID:25922409
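The 1-residue-shift tiling design described above is straightforward to reproduce: a protein of length L yields L - 14 fully overlapped 15-mers. The sketch below uses a short hypothetical sequence, not one of the 457 parasite proteins from the study.

```python
def tile_peptides(protein, k=15, shift=1):
    """Tile a protein sequence into overlapping k-mer peptides, as in the
    1-residue-shift 15-mer array design described in the abstract."""
    return [protein[i:i + k] for i in range(0, len(protein) - k + 1, shift)]

seq = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"  # hypothetical 33-residue protein
peps = tile_peptides(seq)
print(len(peps))  # a 33-residue protein yields 33 - 14 = 19 peptides
```

Increasing the shift to 3 cuts the peptide count (and thus the chip real estate per protein) roughly threefold, consistent with the abstract's coverage observation, at the cost that only epitopes of up to 16 - shift = 13 residues are guaranteed to appear intact within at least one peptide.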
DeepPicker: A deep learning approach for fully automated particle picking in cryo-EM.
Wang, Feng; Gong, Huichao; Liu, Gaochao; Li, Meijing; Yan, Chuangye; Xia, Tian; Li, Xueming; Zeng, Jianyang
2016-09-01
Particle picking is a time-consuming step in single-particle analysis and often requires significant interventions from users, which has become a bottleneck for future automated electron cryo-microscopy (cryo-EM). Here we report a deep learning framework, called DeepPicker, to address this problem and fill the current gaps toward a fully automated cryo-EM pipeline. DeepPicker employs a novel cross-molecule training strategy to capture common features of particles from previously-analyzed micrographs, and thus does not require any human intervention during particle picking. Tests on the recently-published cryo-EM data of three complexes have demonstrated that our deep learning based scheme can successfully accomplish the human-level particle picking process and identify a sufficient number of particles that are comparable to those picked manually by human experts. These results indicate that DeepPicker can provide a practically useful tool to significantly reduce the time and manual effort spent in single-particle analysis and thus greatly facilitate high-resolution cryo-EM structure determination. DeepPicker is released as an open-source program, which can be downloaded from https://github.com/nejyeah/DeepPicker-python. Copyright © 2016 Elsevier Inc. All rights reserved.
Choe, Leila H; Lee, Kelvin H
2003-10-01
We investigate one approach to assess the quantitative variability in two-dimensional gel electrophoresis (2-DE) separations based on gel-to-gel variability, sample preparation variability, sample load differences, and the effect of automation on image analysis. We observe that 95% of spots present in three out of four replicate gels exhibit less than a 0.52 coefficient of variation (CV) in fluorescent stain intensity (% volume) for a single sample run on multiple gels. When four parallel sample preparations are performed, this value increases to 0.57. We do not observe any significant change in quantitative value for an increase or decrease in sample load of 30% when using appropriate image analysis variables. Increasing use of automation, while necessary in modern 2-DE experiments, does change the observed level of quantitative and qualitative variability among replicate gels. The number of spots that change qualitatively for a single sample run in parallel varies from a CV = 0.03 for fully manual analysis to CV = 0.20 for a fully automated analysis. We present a systematic method by which a single laboratory can measure gel-to-gel variability using only three gel runs.
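The coefficient of variation quoted throughout this study is simply the sample standard deviation of a spot's fluorescent stain intensity (% volume) across replicate gels, divided by its mean. A minimal sketch, using hypothetical % volume readings rather than data from the paper:

```python
import numpy as np

def spot_cv(volumes):
    """Coefficient of variation (sample std / mean) of one spot's %volume
    across replicate gels: the variability measure used in the abstract."""
    v = np.asarray(volumes, dtype=float)
    return v.std(ddof=1) / v.mean()

# Hypothetical %volume readings for a single spot on four replicate gels.
reps = [0.85, 1.10, 0.95, 1.02]
print(round(spot_cv(reps), 3))  # 0.108, well under the 0.52 gel-to-gel bound
```

Under the paper's findings, 95% of spots present in three of four replicate gels stay below CV = 0.52, so a spot far above that bound would warrant inspection.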
Quintero, Catherine; Kariv, Ilona
2009-06-01
To meet the needs of the increasingly rapid and parallelized lead optimization process, a fully integrated local compound storage and liquid handling system was designed and implemented to automate the generation of assay-ready plates directly from newly submitted and cherry-picked compounds. A key feature of the system is the ability to create project- or assay-specific compound-handling methods, which provide flexibility for any combination of plate types, layouts, and plate bar-codes. Project-specific workflows can be created by linking methods for processing new and cherry-picked compounds and control additions to produce a complete compound set for both biological testing and local storage in one uninterrupted workflow. A flexible cherry-pick approach allows for multiple, user-defined strategies to select the most appropriate replicate of a compound for retesting. Examples of custom selection parameters include available volume, compound batch, and number of freeze/thaw cycles. This adaptable and integrated combination of software and hardware provides a basis for reducing cycle time, fully automating compound processing, and ultimately increasing the rate at which accurate, biologically relevant results can be produced for compounds of interest in the lead optimization process.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tsunoda, Hirokazu; Sato, Osamu; Okajima, Shigeaki
2002-07-01
In order to achieve fully automated reactor operation of the RAPID-L reactor, innovative reactivity control systems LEM, LIM, and LRM are equipped with lithium-6 as a liquid poison. Because lithium-6 has not been used as a neutron-absorbing material in conventional fast reactors, measurements of the reactivity worth of lithium-6 were performed at the Fast Critical Assembly (FCA) of the Japan Atomic Energy Research Institute (JAERI). The FCA core was composed of highly enriched uranium and stainless steel samples so as to simulate the core spectrum of RAPID-L. The samples of 95% enriched lithium-6 were inserted into the core parallel to the core axis for the measurement of the reactivity worth at each position. It was found that the measured reactivity worth in the core region agreed well with the value calculated by the method used for the core designs of RAPID-L. Bias factors for the core design method were obtained by comparing experimental and calculated results. The factors were used to determine the number of LEMs and LIMs equipped in the core to achieve fully automated operation of RAPID-L. (authors)
NASA Astrophysics Data System (ADS)
Amrute, Junedh M.; Athanasiou, Lambros S.; Rikhtegar, Farhad; de la Torre Hernández, José M.; Camarero, Tamara García; Edelman, Elazer R.
2018-03-01
Polymeric endovascular implants are the next step in minimally invasive vascular interventions. As an alternative to traditional metallic drug-eluting stents, these often-erodible scaffolds present opportunities and challenges for patients and clinicians. Theoretically, as they resorb and are absorbed over time, they obviate the long-term complications of permanent implants, but in the short term visualization, and therefore positioning, is problematic. Polymeric scaffolds can only be fully imaged using optical coherence tomography (OCT), as they are relatively invisible via angiography, and segmentation of polymeric struts in OCT images is performed manually, a laborious and intractable procedure for large datasets. Traditional lumen detection methods that use implant struts as boundary limits fail in images with polymeric implants. It is therefore necessary to develop an automated method to detect polymeric struts and luminal borders in OCT images; we present such a fully automated algorithm. Accuracy was validated against expert annotations on 1140 OCT images, with a positive predictive value of 0.93 for strut detection and an R2 correlation coefficient of 0.94 between detected and expert-annotated lumen areas. The proposed algorithm allows for rapid, accurate, and automated detection of polymeric struts and the luminal border in OCT images.
IntelliCages and automated assessment of learning in group-housed mice
NASA Astrophysics Data System (ADS)
Puścian, Alicja; Knapska, Ewelina
2014-11-01
IntelliCage is a fully automated, computer-controlled system that can be used for long-term monitoring of the behavior of group-housed mice. Using standardized experimental protocols, we can assess cognitive abilities and behavioral flexibility in appetitively and aversively motivated tasks, as well as measure social influences on learning. We have also identified groups of neurons within the amygdala that are specifically activated by appetitively and aversively motivated learning, the function of which we plan to investigate optogenetically in the future.
Gravity-driven pH adjustment for site-specific protein pKa measurement by solution-state NMR
NASA Astrophysics Data System (ADS)
Li, Wei
2017-12-01
To automate pH adjustment in site-specific protein pKa measurement by solution-state NMR, I present a funnel with two caps for the standard 5 mm NMR tube. The novelty of this simple-to-build and inexpensive apparatus is that it allows automatic gravity-driven pH adjustment within the magnet, and consequently results in a fully automated NMR-monitored pH titration without any hardware modification on the NMR spectrometer.
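Once such a titration is automated, the per-site pKa is typically extracted by fitting the observed chemical shifts to a Henderson-Hasselbalch curve, delta(pH) = (d_HA + d_A * 10^(pH - pKa)) / (1 + 10^(pH - pKa)). The sketch below is a generic least-squares fit over a pKa grid with synthetic data; it is an illustration of the standard analysis, not the author's apparatus-specific workflow, and the endpoint shifts and true pKa are invented values.

```python
import numpy as np

def fit_pka(ph, shift, grid=np.arange(2.0, 12.0, 0.01)):
    """Estimate a site-specific pKa from chemical shifts measured across a
    pH titration by least-squares fitting of a Henderson-Hasselbalch curve.
    For each candidate pKa the two endpoint shifts (protonated d_HA,
    deprotonated d_A) are solved as a linear subproblem.
    """
    ph, shift = np.asarray(ph, float), np.asarray(shift, float)
    best = (np.inf, None)
    for pka in grid:
        r = 10.0 ** (ph - pka)
        f = r / (1.0 + r)  # fraction deprotonated at each pH
        A = np.column_stack([1.0 - f, f])  # delta = d_HA*(1-f) + d_A*f
        coef, *_ = np.linalg.lstsq(A, shift, rcond=None)
        sse = np.sum((A @ coef - shift) ** 2)
        if sse < best[0]:
            best = (sse, pka)
    return best[1]

# Synthetic noiseless titration: d_HA = 8.5 ppm, d_A = 7.9 ppm, pKa = 6.5.
ph = np.linspace(4.0, 9.0, 11)
delta = (8.5 + 7.9 * 10 ** (ph - 6.5)) / (1 + 10 ** (ph - 6.5))
print(fit_pka(ph, delta))
```

With real titration data the fit would additionally carry measurement noise, so the grid search is usually followed by a local refinement or a nonlinear least-squares fit.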
Automated, per pixel Cloud Detection from High-Resolution VNIR Data
NASA Technical Reports Server (NTRS)
Varlyguin, Dmitry L.
2007-01-01
CASA is a fully automated software program for the per-pixel detection of clouds and cloud shadows from medium- (e.g., Landsat, SPOT, AWiFS) and high- (e.g., IKONOS, QuickBird, OrbView) resolution imagery without the use of thermal data. CASA is an object-based feature extraction program which utilizes a complex combination of spectral, spatial, and contextual information available in the imagery and the hierarchical self-learning logic for accurate detection of clouds and their shadows.
Hierarchically Parallelized Constrained Nonlinear Solvers with Automated Substructuring
NASA Technical Reports Server (NTRS)
Padovan, Joe; Kwang, Abel
1994-01-01
This paper develops a parallelizable multilevel multiple constrained nonlinear equation solver. The substructuring process is automated to yield appropriately balanced partitioning of each succeeding level. Due to the generality of the procedure, sequential, as well as partially and fully parallel environments can be handled. This includes both single and multiprocessor assignment per individual partition. Several benchmark examples are presented. These illustrate the robustness of the procedure as well as its capability to yield significant reductions in memory utilization and calculational effort due both to updating and inversion.
Frapid: achieving full automation of FRAP for chemical probe validation
Yapp, Clarence; Rogers, Catherine; Savitsky, Pavel; Philpott, Martin; Müller, Susanne
2016-01-01
Fluorescence Recovery After Photobleaching (FRAP) is an established method for validating chemical probes against the chromatin reading bromodomains, but so far requires constant human supervision. Here, we present Frapid, an automated open source code implementation of FRAP that fully handles cell identification through fuzzy logic analysis, drug dispensing with a custom-built fluid handler, image acquisition & analysis, and reporting. We successfully tested Frapid on 3 bromodomains as well as on spindlin1 (SPIN1), a methyl lysine binder, for the first time. PMID:26977352
2011-01-01
Genome targeting methods enable cost-effective capture of specific subsets of the genome for sequencing. We present here an automated, highly scalable method for carrying out the Solution Hybrid Selection capture approach that provides a dramatic increase in scale and throughput of sequence-ready libraries produced. Significant process improvements and a series of in-process quality control checkpoints are also added. These process improvements can also be used in a manual version of the protocol. PMID:21205303
NASA Technical Reports Server (NTRS)
Heyward, Ann O.; Reilly, Charles H.; Walton, Eric K.; Mata, Fernando; Olen, Carl
1990-01-01
Creation of an Allotment Plan for the Fixed Satellite Service at the 1988 Space World Administrative Radio Conference (WARC) represented a complex satellite plan synthesis problem, involving a large number of planned and existing systems. Solutions to this problem at WARC-88 required the use of both automated and manual procedures to develop an acceptable set of system positions. Development of an Allotment Plan may also be attempted through solution of an optimization problem, known as the Satellite Location Problem (SLP). Three automated heuristic procedures, developed specifically to solve SLP, are presented. The heuristics are then applied to two specific WARC-88 scenarios. Solutions resulting from the fully automated heuristics are then compared with solutions obtained at WARC-88 through a combination of both automated and manual planning efforts.
Automated solar cell assembly team process research
NASA Astrophysics Data System (ADS)
Nowlan, M. J.; Hogan, S. J.; Darkazalli, G.; Breen, W. F.; Murach, J. M.; Sutherland, S. F.; Patterson, J. S.
1994-06-01
This report describes work done under the Photovoltaic Manufacturing Technology (PVMaT) project, Phase 3A, which addresses problems that are generic to the photovoltaic (PV) industry. Spire's objective during Phase 3A was to use its light soldering technology and experience to design and fabricate solar cell tabbing and interconnecting equipment to develop new, high-yield, high-throughput, fully automated processes for tabbing and interconnecting thin cells. Areas that were addressed include processing rates, process control, yield, throughput, material utilization efficiency, and increased use of automation. Spire teamed with Solec International, a PV module manufacturer, and the University of Massachusetts at Lowell's Center for Productivity Enhancement (CPE), automation specialists, who are lower-tier subcontractors. A number of other PV manufacturers, including Siemens Solar, Mobil Solar, Solar Web, and Texas Instruments, agreed to evaluate the processes developed under this program.
Diatomite filters--methods of automation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maloney, G.F.
1966-01-01
Following an introduction of subject material, diatomite filters are discussed in the following categories: a filter system, the manual station, the decision to automate, equipment, the automated filter, and the fail-safe methods. Diagrams and pictures of the equipment and its operation are included. Many aspects of the uses of both the automatic and manually operated diatomite filtering systems are reviewed. The fully automated station may be ideally suited to the remotely located waterflood since it requires virtually no attention or perhaps only periodic inspection. On the other hand, floods large enough to employ full-time personnel, who can maintain a constant vigil and periodically scrutinize the filtering operation, probably require nothing more than a semiautomatic operation. The reduction of human error can save money, and the introduction of consistency into any unit operation is certain to be beneficial.
Automated MAD and MIR structure solution
Terwilliger, Thomas C.; Berendzen, Joel
1999-01-01
Obtaining an electron-density map from X-ray diffraction data can be difficult and time-consuming even after the data have been collected, largely because MIR and MAD structure determinations currently require many subjective evaluations of the qualities of trial heavy-atom partial structures before a correct heavy-atom solution is obtained. A set of criteria for evaluating the quality of heavy-atom partial solutions in macromolecular crystallography have been developed. These have allowed the conversion of the crystal structure-solution process into an optimization problem and have allowed its automation. The SOLVE software has been used to solve MAD data sets with as many as 52 selenium sites in the asymmetric unit. The automated structure-solution process developed is a major step towards the fully automated structure-determination, model-building and refinement procedure which is needed for genomic scale structure determinations. PMID:10089316
Automation - Changes in cognitive demands and mental workload
NASA Technical Reports Server (NTRS)
Tsang, Pamela S.; Johnson, Walter W.
1987-01-01
The effect of partial automation on mental workloads in man/machine tasks is investigated experimentally. Subjective workload measures are obtained from six subjects after performance of a task battery comprising two manual (flight-path control, FC, and target acquisition, TA) tasks and one decisionmaking (engine failure, EF) task; the FC task was performed in both a fully manual (altitude and lateral control) mode and in a semiautomated mode (automatic altitude control). The performance results and subjective evaluations are presented in graphs and characterized in detail. The automation is shown to improve objective performance and lower subjective workload significantly in the combined FC/TA task, but not in the FC task alone or in the FC/EF task.
BOREAS TF-3 Automated Chamber CO2 Flux Data from the NSA-OBS
NASA Technical Reports Server (NTRS)
Goulden, Michael L.; Crill, Patrick M.; Hall, Forrest G. (Editor); Conrad, Sara (Editor)
2000-01-01
The BOReal Ecosystem Atmosphere Study Tower Flux (BOREAS TF-3) and Trace Gas Biogeochemistry (TGB-1) teams collected automated CO2 chamber flux data in their efforts to fully describe the CO2 flux at the Northern Study Area-Old Black Spruce (NSA-OBS) site. This data set contains fluxes of CO2 at the NSA-OBS site measured using automated chambers. In addition to reporting the CO2 flux, it reports chamber air temperature, moss temperature, and light levels during each measurement. The data set covers the period from 23-Sep-1995 through 26-Oct-1995 and from 28-May-1996 through 21-Oct-1996. The data are stored in tabular ASCII files.
Fully Integrated Microfluidic Device for Direct Sample-to-Answer Genetic Analysis
NASA Astrophysics Data System (ADS)
Liu, Robin H.; Grodzinski, Piotr
Integration of microfluidics technology with DNA microarrays enables building complete sample-to-answer systems that are useful in many applications such as clinic diagnostics. In this chapter, a fully integrated microfluidic device [1] that consists of microfluidic mixers, valves, pumps, channels, chambers, heaters, and a DNA microarray sensor to perform DNA analysis of complex biological sample solutions is presented. This device can perform on-chip sample preparation (including magnetic bead-based cell capture, cell preconcentration and purification, and cell lysis) of complex biological sample solutions (such as whole blood), polymerase chain reaction, DNA hybridization, and electrochemical detection. A few novel microfluidic techniques were developed and employed. A micromixing technique based on a cavitation microstreaming principle was implemented to enhance target cell capture from whole blood samples using immunomagnetic beads. This technique was also employed to accelerate DNA hybridization reaction. Thermally actuated paraffin-based microvalves were developed to regulate flows. Electrochemical pumps and thermopneumatic pumps were integrated on the chip to provide pumping of liquid solutions. The device is completely self-contained: no external pressure sources, fluid storage, mechanical pumps, or valves are necessary for fluid manipulation, thus eliminating possible sample contamination and simplifying device operation. Pathogenic bacteria detection from ~mL whole blood samples and single-nucleotide polymorphism analysis directly from diluted blood were demonstrated. The device provides a cost-effective solution to direct sample-to-answer genetic analysis, and thus has a potential impact in the fields of point-of-care genetic analysis, environmental testing, and biological warfare agent detection.
SARA South Observatory: A Fully Automated Boller & Chivens 0.6-m Telescope at C.T.I.O.
NASA Astrophysics Data System (ADS)
Mack, Peter; KanniahPadmanaban, S. Y.; Kaitchuck, R.; Borstad, A.; Luzier, N.
2010-05-01
The SARA South Observatory is the re-birth of the Lowell 24-inch telescope located on the south-east ridge of Cerro Tololo, Chile. Installed in 1968, this Boller & Chivens telescope fell into disuse for almost 20 years. The telescope and observatory have undergone a major restoration. A new dome with a wide slit has been fully automated with an ACE SmartDome controller featuring autonomous closure. The telescope was completely gutted, repainted, and virtually every electronic component and wire replaced. Modern infrastructure, such as USB, Ethernet and video ports, has been incorporated into the telescope tube saddle boxes. Absolute encoders have been placed on the hour angle and declination axes with a resolution of less than 0.7 arc seconds. The secondary mirror is also equipped with an absolute encoder and temperature sensor to allow for fully automated focus. New mirror coatings, automated mirror covers, a new 150mm refractor, and new instrumentation have been deployed. An integrated X-stage guider and dual filter wheel containing 18 filters is used for direct imaging. The guider camera can be easily removed and a standard 2-inch eyepiece used for occasional viewing by VIPs at C.T.I.O. A 12 megapixel all-sky camera produces color images every 30 seconds showing details in the Milky Way and Magellanic Clouds. Two low light level cameras are deployed; one on the finder and one at the top of the telescope showing a 30° field. Other auxiliary equipment, including daytime color video cameras, a weather station and remotely controllable power outlets, permits complete control and servicing of the system. The SARA Consortium (www.saraobservatory.org), a collection of ten eastern universities, also operates a 0.9-m telescope at the Kitt Peak National Observatory using an almost identical set of instruments with the same ACE control system. This project was funded by the SARA Consortium.
Soft robot design methodology for `push-button' manufacturing
NASA Astrophysics Data System (ADS)
Paik, Jamie
2018-06-01
`Push-button' or fully automated manufacturing would enable the production of robots with zero intervention from human hands. Realizing this utopia requires a fundamental shift from a sequential (design-materials-manufacturing) to a concurrent design methodology.
NASA Technical Reports Server (NTRS)
Miller, R. H.; Minsky, M. L.; Smith, D. B. S.
1982-01-01
This study examines potential applications of automation, robotics, and machine intelligence systems (ARAMIS) to space activities, and to their related ground support functions, in the years 1985-2000, so that NASA may make informed decisions on which aspects of ARAMIS to develop. The study first identifies the specific tasks which will be required by future space projects. It then defines ARAMIS options which are candidates for those space project tasks and evaluates the relative merits of these options. Finally, the study identifies promising applications of ARAMIS and recommends specific areas for further research. The ARAMIS options defined and researched by the study group span the range from fully human to fully machine, including a number of intermediate options (e.g., humans assisted by computers, and various levels of teleoperation). By including this spectrum, the study searches for the optimum mix of humans and machines for space project tasks.
Solvepol: A Reduction Pipeline for Imaging Polarimetry Data
NASA Astrophysics Data System (ADS)
Ramírez, Edgar A.; Magalhães, Antônio M.; Davidson, James W., Jr.; Pereyra, Antonio; Rubinho, Marcelo
2017-05-01
We present a new, fully automated data pipeline, Solvepol, designed to reduce and analyze polarimetric data. It has been optimized for imaging data from IAGPOL, the calcite Savart prism plate-based polarimeter of the Instituto de Astronomía, Geofísica e Ciências Atmosféricas (IAG) of the University of São Paulo (USP). Solvepol is also the basis of a reduction pipeline for the wide-field optical polarimeter that will execute SOUTH POL, a survey of the polarized southern sky. Solvepol was written in the Interactive Data Language (IDL) and is based on the Image Reduction and Analysis Facility (IRAF) task PCCDPACK, developed by our polarimetry group. We present and discuss reduced data from standard stars and other fields and compare these results with those obtained in the IRAF environment. Our analysis shows that Solvepol, in addition to being a fully automated pipeline, produces results consistent with those reduced by PCCDPACK and reported in the literature.
NASA Technical Reports Server (NTRS)
Clement, Warren F.; Gorder, Pater J.; Jewell, Wayne F.; Coppenbarger, Richard
1990-01-01
Developing a single-pilot all-weather NOE capability requires fully automatic NOE navigation and flight control. Innovative guidance and control concepts are being investigated to (1) organize the onboard computer-based storage and real-time updating of NOE terrain profiles and obstacles; (2) define a class of automatic anticipative pursuit guidance algorithms to follow the vertical, lateral, and longitudinal guidance commands; (3) automate a decision-making process for unexpected obstacle avoidance; and (4) provide several rapid response maneuvers. Acquired knowledge from the sensed environment is correlated with the recorded environment which is then used to determine an appropriate evasive maneuver if a nonconformity is observed. This research effort has been evaluated in both fixed-base and moving-base real-time piloted simulations thereby evaluating pilot acceptance of the automated concepts, supervisory override, manual operation, and reengagement of the automatic system.
Self-consistent hybrid functionals for solids: a fully-automated implementation
NASA Astrophysics Data System (ADS)
Erba, A.
2017-08-01
A fully-automated algorithm for the determination of the system-specific optimal fraction of exact exchange in self-consistent hybrid functionals of density functional theory is illustrated, as implemented in the public Crystal program. The exchange fraction of this new class of functionals is self-consistently updated in proportion to the inverse of the dielectric response of the system within an iterative procedure (Skone et al 2014 Phys. Rev. B 89 195112). Each iteration of the present scheme, in turn, implies convergence of a self-consistent-field (SCF) and a coupled-perturbed-Hartree-Fock/Kohn-Sham (CPHF/KS) procedure. The present implementation, besides improving the user-friendliness of self-consistent hybrids, exploits the unperturbed and electric-field-perturbed density matrices from previous iterations as guesses for subsequent SCF and CPHF/KS iterations, which is documented to reduce the overall computational cost of the whole process by a factor of 2.
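The self-consistent update described in this abstract (exchange fraction set to the inverse of the dielectric response, iterated to convergence) amounts to a fixed-point loop. A minimal sketch follows; the `dielectric` function here is a toy stand-in assumed for illustration only, replacing the converged SCF + CPHF/KS response calculation that the actual Crystal implementation performs at every step:

```python
# Hypothetical sketch of a self-consistent exchange-fraction loop.
# The toy dielectric model below is NOT a physical calculation; it only
# illustrates the fixed-point structure alpha_{n+1} = 1 / epsilon(alpha_n).

def dielectric(alpha):
    """Toy model: the dielectric constant shrinks as exact exchange grows."""
    return 4.0 - 2.0 * alpha

def self_consistent_alpha(alpha0=0.25, tol=1e-8, max_iter=100):
    alpha = alpha0
    for _ in range(max_iter):
        # In the real scheme, this step implies a converged SCF and a
        # CPHF/KS calculation of the dielectric response.
        new_alpha = 1.0 / dielectric(alpha)
        if abs(new_alpha - alpha) < tol:
            return new_alpha
        alpha = new_alpha
    raise RuntimeError("exchange fraction did not converge")

alpha = self_consistent_alpha()
```

Reusing the previous iteration's density matrices as guesses, as the abstract notes, speeds up each inner SCF/CPHF step but does not change this outer fixed-point structure.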
2014-01-01
Background The production of biofuels in photosynthetic microalgae and cyanobacteria is a promising alternative to the generation of fuels from fossil resources. To be economically competitive, producer strains need to be established that synthesize the targeted product at high yield and over a long time. Engineering cyanobacteria into forced fuel producers should considerably interfere with overall cell homeostasis, which in turn might counteract productivity and sustainability of the process. Therefore, in-depth characterization of the cellular response upon long-term production is of high interest for the targeted improvement of a desired strain. Results The transcriptome-wide response to continuous ethanol production was examined in Synechocystis sp. PCC6803 using high-resolution microarrays. In two independent experiments, ethanol production rates of 0.0338% (v/v) ethanol d-1 and 0.0303% (v/v) ethanol d-1 were obtained over 18 consecutive days, measuring two sets of biological triplicates in fully automated photobioreactors. Ethanol production caused a significant (~40%) delay in biomass accumulation, the development of a bleaching phenotype and a down-regulation of light-harvesting capacity. However, microarray analyses performed on days 4, 7, 11 and 18 of the experiment revealed only three mRNAs with a strongly modified accumulation level throughout the course of the experiment. In addition to the overexpressed adhA (slr1192) gene, these were an approximately 4-fold reduction in cpcB (sll1577) and a 3- to 6-fold increase in rps8 (sll1809) mRNA levels. Much weaker modifications of expression level, or modifications restricted to day 18 of the experiment, were observed for genes involved in carbon assimilation (ribulose bisphosphate carboxylase and glutamate decarboxylase). Molecular analysis of the reduced cpcB levels revealed a post-transcriptional processing of the cpcBA operon mRNA, leaving a truncated mRNA, cpcA*, likely not competent for translation.
Moreover, western blots and zinc-enhanced bilin fluorescence blots confirmed a severe reduction in the amounts of both phycocyanin subunits, explaining the cause of the bleaching phenotype. Conclusions Changes in gene expression upon induction of long-term ethanol production in Synechocystis sp. PCC6803 are highly specific. In particular, we did not observe a comprehensive stress response as might have been expected. PMID:24502290
Space station automation: the role of robotics and artificial intelligence (Invited Paper)
NASA Astrophysics Data System (ADS)
Park, W. T.; Firschein, O.
1985-12-01
Automation of the space station is necessary to make more effective use of the crew, to carry out repairs that are impractical or dangerous, and to monitor and control the many space station subsystems. Intelligent robotics and expert systems play a strong role in automation, and both disciplines are highly dependent on a common artificial intelligence (AI) technology base. The AI technology base provides the reasoning and planning capabilities needed in robotic tasks, such as perception of the environment and planning a path to a goal, and in expert systems tasks, such as control of subsystems and maintenance of equipment. This paper describes automation concepts for the space station, the specific robotic and expert systems required to attain this automation, and the research and development required. It also presents an evolutionary development plan that leads to fully automatic mobile robots for servicing satellites. Finally, we indicate the sequence of demonstrations and the research and development needed to confirm the automation capabilities. We emphasize that advanced robotics requires AI, and that to advance, AI needs the "real-world" problems provided by robotics.
Automated structure solution, density modification and model building.
Terwilliger, Thomas C
2002-11-01
The approaches that form the basis of automated structure solution in SOLVE and RESOLVE are described. The use of a scoring scheme to convert decision making in macromolecular structure solution to an optimization problem has proven very useful and in many cases a single clear heavy-atom solution can be obtained and used for phasing. Statistical density modification is well suited to an automated approach to structure solution because the method is relatively insensitive to choices of numbers of cycles and solvent content. The detection of non-crystallographic symmetry (NCS) in heavy-atom sites and checking of potential NCS operations against the electron-density map has proven to be a reliable method for identification of NCS in most cases. Automated model building beginning with an FFT-based search for helices and sheets has been successful in automated model building for maps with resolutions as low as 3 Å. The entire process can be carried out in a fully automatic fashion in many cases.
Harnessing Vehicle Automation for Public Mobility -- An Overview of Ongoing Efforts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Young, Stanley E.
This presentation takes a look at efforts to harness automated vehicle technology for public transport. The European CityMobil2 is the leading demonstration project, in which automated shuttles were, or are planned to be, demonstrated in several cities and regions. The presentation provides a brief overview of the demonstrations at Oristano, Italy (July 2014), La Rochelle, France (Dec 2014), Lausanne, Switzerland (Apr 2015), Vantaa, Finland (July 2015), and Trikala, Greece (Sept 2015). In addition to technology exposition, the objectives included generating a legal framework for operation in each location and gauging the reaction of the public to unmanned shuttles, both of which were successfully achieved. Several such demonstrations are planned throughout the world, including efforts in North America in conjunction with the GoMentum Station in California. These early demonstrations with low-speed automated shuttles provide a glimpse of what is possible with a fully automated fleet of driverless vehicles providing a public transit service.
Bayesian ISOLA: new tool for automated centroid moment tensor inversion
NASA Astrophysics Data System (ADS)
Vackář, Jiří; Burjánek, Jan; Gallovič, František; Zahradník, Jiří; Clinton, John
2017-08-01
We have developed a new, fully automated tool for centroid moment tensor (CMT) inversion in a Bayesian framework. It includes automated data retrieval, data selection in which station components with various instrumental disturbances are rejected, and full-waveform inversion in a space-time grid around a provided hypocentre. A data covariance matrix calculated from pre-event noise yields an automated weighting of the station recordings according to their noise levels and also serves as an automated frequency filter suppressing noisy frequency ranges. The method is tested on synthetic and observed data. It is applied to a data set from the Swiss seismic network and the results are compared with the existing high-quality MT catalogue. The software package, programmed in Python, is designed to be as versatile as possible in order to be applicable in various networks ranging from local to regional. The method can be applied either to the everyday network data flow or to process large pre-existing earthquake catalogues and data sets.
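The noise-based weighting described in this abstract can be illustrated with a toy weighted least-squares inversion. This is not the ISOLA code: the forward model, channel count, and noise levels below are assumptions chosen only to show how a diagonal data covariance estimated from a pre-event window down-weights noisy recordings:

```python
import numpy as np

# Toy linear inversion d = G m + noise, with per-channel noise variance
# estimated from a pre-event (noise-only) window and used as a diagonal
# data covariance for inverse-variance weighting.

rng = np.random.default_rng(0)
n_channels, n_samples = 4, 200
noise_std = np.array([0.1, 0.1, 1.0, 0.1])      # channel 2 is noisy

G = np.ones((n_channels, 1))                     # toy Green's functions
m_true = np.array([2.0])                         # "moment" to recover

# Pre-event window: noise only, used to estimate the covariance
pre_event = rng.normal(size=(n_channels, n_samples)) * noise_std[:, None]
d = G @ m_true + rng.normal(size=n_channels) * noise_std

C = np.diag(pre_event.var(axis=1))               # data covariance estimate
W = np.linalg.inv(C)                             # inverse-variance weights

# Weighted least squares: m = (G^T W G)^(-1) G^T W d
m_hat = np.linalg.solve(G.T @ W @ G, G.T @ W @ d)
```

The noisy channel receives a weight roughly 100 times smaller than the clean ones, so the estimate is dominated by the low-noise recordings, which is the behaviour the abstract describes for station weighting.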
Human-Automation Cooperation for Separation Assurance in Future NextGen Environments
NASA Technical Reports Server (NTRS)
Mercer, Joey; Homola, Jeffrey; Cabrall, Christopher; Martin, Lynne; Morey, Susan; Gomez, Ashley; Prevot, Thomas
2014-01-01
A 2012 Human-In-The-Loop air traffic control simulation investigated a gradual paradigm-shift in the allocation of functions between operators and automation. Air traffic controllers staffed five adjacent high-altitude en route sectors, and during the course of a two-week experiment, worked traffic under different function-allocation approaches aligned with four increasingly mature NextGen operational environments. These NextGen time-frames ranged from near current-day operations to nearly fully-automated control, in which the ground system's automation was responsible for detecting conflicts, issuing strategic and tactical resolutions, and alerting the controller to exceptional circumstances. Results indicate that overall performance was best in the most automated NextGen environment. Safe operations were achieved in this environment for twice today's peak airspace capacity, while being rated by the controllers as highly acceptable. However, results show that sector operations were not always safe; separation violations did in fact occur. This paper will describe in detail the simulation conducted, as well as discuss important results and their implications.
NASA Astrophysics Data System (ADS)
Hiraki, M.; Yamada, Y.; Chavas, L. M. G.; Matsugaki, N.; Igarashi, N.; Wakatsuki, S.
2013-03-01
To achieve fully-automated and/or remote data collection in high-throughput X-ray experiments, the Structural Biology Research Centre at the Photon Factory (PF) has installed sample exchange robots, the PF automated mounting system (PAM), at PF macromolecular crystallography beamlines BL-1A, BL-5A, BL-17A, AR-NW12A and AR-NE3A. We are upgrading the experimental systems, including the PAM, for stable and efficient operation. To prevent human error in automated data collection, we installed a two-dimensional barcode reader for identification of the cassettes and sample pins. Because no liquid nitrogen pipeline is installed in the PF experimental hutch, users commonly add liquid nitrogen using a small Dewar. To address this issue, an automated liquid nitrogen filling system that links a 100-liter tank to the robot Dewar has been installed on the PF macromolecular beamline. Here we describe this new implementation, as well as future prospects.
Signal amplification of FISH for automated detection using image cytometry.
Truong, K; Boenders, J; Maciorowski, Z; Vielh, P; Dutrillaux, B; Malfoy, B; Bourgeois, C A
1997-05-01
The purpose of this study was to improve the detection of FISH signals, so that spot counting by a fully automated image cytometer would be comparable to that obtained visually under the microscope. Two systems of spot scoring, visual and automated counting, were investigated in parallel on stimulated human lymphocytes with FISH using a biotinylated centromeric probe for chromosome 3. Signal characteristics were first analyzed on images recorded with a charge-coupled device (CCD) camera. Numbers of spots per nucleus were scored visually on these recorded images and automatically with a DISCOVERY image analyzer. Several fluorochromes, amplification systems and pretreatments were tested. Our results for both visual and automated scoring show that the tyramide signal amplification (TSA) system gives the best amplification of signal if pepsin treatment is applied prior to FISH. Accuracy of the automated scoring, however, remained low (58% of nuclei containing two spots) compared to visual scoring, because of the high intranuclear variation between FISH spots.
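The automated counting step this abstract compares against visual scoring can be reduced, in its simplest form, to thresholding and counting connected bright regions. The toy version below is an illustration only, not the DISCOVERY analyzer's algorithm; a real cytometer adds nucleus segmentation, shading correction and spot size/intensity filters:

```python
# Minimal spot counting: threshold the image, then flood-fill each
# 4-connected bright region and count the regions as FISH spots.

def count_spots(image, threshold):
    rows, cols = len(image), len(image[0])
    mask = [[image[r][c] > threshold for c in range(cols)] for r in range(rows)]
    seen = [[False] * cols for _ in range(rows)]
    spots = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                spots += 1
                stack = [(r, c)]            # flood-fill one connected spot
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols and mask[y][x] and not seen[y][x]:
                        seen[y][x] = True
                        stack.extend([(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)])
    return spots

# Synthetic "nucleus" with two bright spots
img = [[0, 0, 0, 0, 0, 0],
       [0, 9, 9, 0, 0, 0],
       [0, 9, 0, 0, 8, 0],
       [0, 0, 0, 0, 8, 0],
       [0, 0, 0, 0, 0, 0]]
print(count_spots(img, 5))  # -> 2
```

The abstract's point is precisely that this kind of counting degrades when spot intensities vary strongly within a nucleus, which is why signal amplification (TSA with pepsin pretreatment) mattered.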
NASA Astrophysics Data System (ADS)
Fotin, Sergei V.; Yin, Yin; Periaswamy, Senthil; Kunz, Justin; Haldankar, Hrishikesh; Muradyan, Naira; Cornud, François; Turkbey, Baris; Choyke, Peter L.
2012-02-01
Fully automated prostate segmentation helps to address several problems in prostate cancer diagnosis and treatment: it can assist in objective evaluation of multiparametric MR imagery, provides a prostate contour for MR-ultrasound (or CT) image fusion for computer-assisted image-guided biopsy or therapy planning, may facilitate reporting and enables direct prostate volume calculation. Among the challenges in automated analysis of MR images of the prostate are the variations of overall image intensities across scanners, the presence of a nonuniform multiplicative bias field within scans and differences in acquisition setup. Furthermore, images acquired with the presence of an endorectal coil suffer from localized high-intensity artifacts at the posterior part of the prostate. In this work, a three-dimensional method for fast automated prostate detection based on normalized gradient fields cross-correlation, insensitive to intensity variations and coil-induced artifacts, is presented and evaluated. The components of the method, offline template learning and the localization algorithm, are described in detail. The method was validated on a dataset of 522 T2-weighted MR images acquired at the National Cancer Institute, USA, that was split into two halves for development and testing. In addition, a second dataset of 29 MR exams from the Centre d'Imagerie Médicale Tourville, France, was used to test the algorithm. The 95% confidence intervals for the mean Euclidean distance between automatically and manually identified prostate centroids were 4.06 +/- 0.33 mm and 3.10 +/- 0.43 mm for the first and second test datasets, respectively. Moreover, the algorithm provided the centroid within the true prostate volume in 100% of images from both datasets. The obtained results demonstrate the high utility of the detection method for a fully automated prostate segmentation.
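The intensity insensitivity this abstract attributes to normalized gradient fields can be seen in a few lines: gradients are normalized to unit length (with a small eps so flat, noisy regions map to near-zero vectors), so the similarity depends on edge orientation rather than absolute intensity. This is a generic sketch of the NGF idea, not the paper's template-learning and localization pipeline:

```python
import numpy as np

def normalized_gradient_field(image, eps=1e-3):
    # eps keeps flat regions (gradient ~ 0) from being blown up by the
    # normalization; they map to near-zero vectors instead.
    gy, gx = np.gradient(image.astype(float))
    norm = np.sqrt(gx ** 2 + gy ** 2 + eps ** 2)
    return gx / norm, gy / norm

def ngf_similarity(a, b):
    ax, ay = normalized_gradient_field(a)
    bx, by = normalized_gradient_field(b)
    # squared inner product, so opposite gradient directions also match
    return np.mean((ax * bx + ay * by) ** 2)

img = np.zeros((16, 16))
img[4:12, 4:12] = 1.0                 # a bright square
bright = 5.0 * img + 10.0             # same structure, rescaled intensities
```

Because both fields are normalized, `ngf_similarity(img, bright)` is essentially equal to `ngf_similarity(img, img)` despite the 5x gain and offset, which is why the measure tolerates scanner-dependent intensity variations and coil-induced brightening.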
Fully automated chest wall line segmentation in breast MRI by using context information
NASA Astrophysics Data System (ADS)
Wu, Shandong; Weinstein, Susan P.; Conant, Emily F.; Localio, A. Russell; Schnall, Mitchell D.; Kontos, Despina
2012-03-01
Breast MRI has emerged as an effective modality for the clinical management of breast cancer. Evidence suggests that computer-aided applications can further improve the diagnostic accuracy of breast MRI. A critical and challenging first step for automated breast MRI analysis is to separate the breast as an organ from the chest wall. Manual segmentation or user-assisted interactive tools are inefficient, tedious, and error-prone, making them impractical for processing large amounts of data from clinical trials. To address this challenge, we developed a fully automated and robust computerized segmentation method that intensively utilizes context information of breast MR imaging and the breast tissue's morphological characteristics to accurately delineate the breast and chest wall boundary. A critical component is the joint application of anisotropic diffusion and bilateral image filtering to enhance the edge that corresponds to the chest wall line (CWL) and to reduce the effect of adjacent non-CWL tissues. A CWL voting algorithm is proposed based on CWL candidates yielded from multiple sequential MRI slices, in which a CWL representative is generated and used through a dynamic time warping (DTW) algorithm to filter out inferior candidates, leaving the optimal one. Our method was validated on a representative dataset of 20 3D unilateral breast MRI scans that span the full range of the American College of Radiology (ACR) Breast Imaging Reporting and Data System (BI-RADS) fibroglandular density categorization. A promising performance (average overlay percentage of 89.33%) is observed when the automated segmentation is compared to manually segmented ground truth obtained by an experienced breast imaging radiologist. The automated method runs time-efficiently at ~3 minutes for each breast MR image set (28 slices).
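The dynamic time warping step named in this abstract compares chest-wall-line candidates against a representative curve. A minimal textbook DTW distance, not the authors' exact formulation, looks like this:

```python
# Classic DTW: cost[i][j] is the best alignment cost of a[:i] with b[:j],
# allowing stretches and compressions of either sequence.

def dtw_distance(a, b):
    inf = float("inf")
    n, m = len(a), len(b)
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # step in a only
                                 cost[i][j - 1],      # step in b only
                                 cost[i - 1][j - 1])  # step in both
    return cost[n][m]

# A time-shifted copy of a curve aligns perfectly under DTW,
# while a genuinely different curve does not.
ref     = [0, 1, 2, 3, 2, 1, 0]
shifted = [0, 0, 1, 2, 3, 2, 1, 0]
other   = [3, 3, 3, 3, 3, 3, 3]
```

In the voting scheme described above, a candidate whose DTW distance to the representative CWL is large would be filtered out as inferior.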
NASA Astrophysics Data System (ADS)
Jiang, Luan; Ling, Shan; Li, Qiang
2016-03-01
Cardiovascular diseases are becoming a leading cause of death all over the world. Cardiac function can be evaluated using global and regional parameters of the left ventricle (LV) of the heart. The purpose of this study is to develop and evaluate a fully automated scheme for segmentation of the LV in short-axis cardiac cine MR images. Our fully automated method consists of three major steps, i.e., LV localization, LV segmentation at the end-diastolic phase, and propagation of the LV segmentation to the other phases. First, the maximum intensity projection image along the time phases of the midventricular slice, located at the center of the image, was calculated to locate the region of interest of the LV. Based on the mean intensity of the roughly segmented blood pool in the midventricular slice at each phase, the end-diastolic (ED) and end-systolic (ES) phases were determined. Second, the endocardial and epicardial boundaries of the LV in each slice at the ED phase were synchronously delineated by use of a dual dynamic programming technique. The external costs of the endocardial and epicardial boundaries were defined with the gradient values obtained from the original and enhanced images, respectively. Finally, taking advantage of the continuity of the LV boundaries across adjacent phases, we propagated the LV segmentation from the ED phase to the other phases by use of the dual dynamic programming technique. The preliminary results on 9 clinical cardiac cine MR cases show that the proposed method can obtain accurate segmentation of the LV based on subjective evaluation.
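The localization step above projects the midventricular slice along its time phases: the bright, moving blood pool stays bright in the projection, giving a rough region of interest. A minimal sketch on a synthetic phase stack (layout `[phase, row, column]` is an assumption for illustration):

```python
import numpy as np

# Synthetic cine stack: a bright "blood pool" square whose radius varies
# over the cardiac phases, mimicking contraction and expansion.
phases = np.zeros((8, 32, 32))
for t in range(8):
    r = 4 + (t % 4)                    # "beating" bright blood pool
    phases[t, 16 - r:16 + r, 16 - r:16 + r] = 1.0

mip = phases.max(axis=0)               # maximum intensity projection in time
ys, xs = np.nonzero(mip > 0.5)         # rough LV region of interest
center_row, center_col = ys.mean(), xs.mean()
```

The projection covers the union of all phase positions of the blood pool, so a simple threshold on `mip` already brackets the LV; the actual method then determines ED/ES phases from blood-pool intensity and refines boundaries by dual dynamic programming.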
Giulieri, Stefano G; Chapuis-Taillard, Caroline; Manuel, Oriol; Hugli, Olivier; Pinget, Christophe; Wasserfallen, Jean-Blaise; Sahli, Roland; Jaton, Katia; Marchetti, Oscar; Meylan, Pascal
2015-01-01
Enterovirus (EV) is the most frequent cause of aseptic meningitis (AM). Lack of microbiological documentation results in unnecessary antimicrobial therapy and hospitalization. To assess the impact of rapid EV detection in cerebrospinal fluid (CSF) by a fully-automated PCR (GeneXpert EV assay, GXEA) on the management of AM. Observational study in adult patients with AM. Three groups were analyzed according to EV documentation in CSF: group A = no PCR or negative PCR (n=17), group B = positive real-time PCR (n = 20), and group C = positive GXEA (n = 22). Clinical, laboratory and health-care costs data were compared. Clinical characteristics were similar in the 3 groups. Median turn-around time of EV PCR decreased from 60 h (IQR (interquartile range) 44-87) in group B to 5h (IQR 4-11) in group C (p<0.0001). Median duration of antibiotics was 1 (IQR 0-6), 1 (0-1.9), and 0.5 days (single dose) in groups A, B, and C, respectively (p < 0.001). Median length of hospitalization was 4 days (2.5-7.5), 2 (1-3.7), and 0.5 (0.3-0.7), respectively (p < 0.001). Median hospitalization costs were $5458 (2676-6274) in group A, $2796 (2062-5726) in group B, and $921 (765-1230) in group C (p < 0.0001). Rapid EV detection in CSF by a fully-automated PCR improves management of AM by significantly reducing antibiotic use, hospitalization length and costs. Copyright © 2014 Elsevier B.V. All rights reserved.
Fully automated bone mineral density assessment from low-dose chest CT
NASA Astrophysics Data System (ADS)
Liu, Shuang; Gonzalez, Jessica; Zulueta, Javier; de-Torres, Juan P.; Yankelevitz, David F.; Henschke, Claudia I.; Reeves, Anthony P.
2018-02-01
A fully automated system is presented for bone mineral density (BMD) assessment from low-dose chest CT (LDCT). BMD assessment is central in the diagnosis and follow-up therapy monitoring of osteoporosis, which is characterized by low bone density and is estimated to affect 12.3 million people in the US aged 50 years or older, creating tremendous social and economic burdens. BMD assessment from DXA scans (BMDDXA) is currently the most widely used and gold-standard technique for the diagnosis of osteoporosis and bone fracture risk estimation. With the recent large-scale implementation of annual lung cancer screening using LDCT, great potential emerges for concurrent opportunistic osteoporosis screening. In the presented BMDCT assessment system, each vertebral body is first segmented and labeled with its anatomical name. Various 3D regions of interest (ROIs) inside the vertebral body are then explored for BMDCT measurements at different vertebral levels. The system was validated using 76 pairs of DXA and LDCT scans of the same subject. The average BMDDXA of L1-L4 was used as the reference standard. A statistically significant (p-value < 0.001) strong correlation is obtained between BMDDXA and BMDCT at all vertebral levels (T1 - L2). A Pearson correlation of 0.857 was achieved between BMDDXA and the average BMDCT of T9-T11 by using a 3D ROI that takes into account both trabecular and cortical bone tissue. These encouraging results demonstrate the feasibility of fully automated quantitative BMD assessment and the potential of opportunistic osteoporosis screening concurrent with lung cancer screening using LDCT.
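The validation statistic reported above is a Pearson correlation between reference DXA values and CT-derived measurements. A sketch of that comparison follows; the numbers are synthetic stand-ins generated for illustration, not the study's 76 scan pairs:

```python
import numpy as np

# Synthetic paired measurements: a reference standard and a linearly
# related CT-derived value with additive measurement noise.
rng = np.random.default_rng(1)
bmd_dxa = rng.uniform(0.7, 1.3, size=76)              # reference (g/cm^2)
bmd_ct = 120.0 * bmd_dxa + rng.normal(0.0, 8.0, 76)   # CT-derived (HU-like)

# Pearson correlation coefficient between the paired measurements
r = np.corrcoef(bmd_dxa, bmd_ct)[0, 1]
```

Pearson's r is invariant to the linear scale difference between the two modalities (g/cm^2 versus Hounsfield-unit-based values), which is what makes it a natural agreement measure here.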
Lubbock, Alexander L. R.; Katz, Elad; Harrison, David J.; Overton, Ian M.
2013-01-01
Tissue microarrays (TMAs) allow multiplexed analysis of tissue samples and are frequently used to estimate biomarker protein expression in tumour biopsies. TMA Navigator (www.tmanavigator.org) is an open access web application for analysis of TMA data and related information, accommodating categorical, semi-continuous and continuous expression scores. Non-biological variation, or batch effects, can hinder data analysis and may be mitigated using the ComBat algorithm, which is incorporated with enhancements for automated application to TMA data. Unsupervised grouping of samples (patients) is provided according to Gaussian mixture modelling of marker scores, with cardinality selected by Bayesian information criterion regularization. Kaplan–Meier survival analysis is available, including comparison of groups identified by mixture modelling using the Mantel-Cox log-rank test. TMA Navigator also supports network inference approaches useful for TMA datasets, which often comprise comparatively few markers. Tissue and cell-type specific networks derived from TMA expression data offer insights into the molecular logic underlying pathophenotypes, towards more effective and personalized medicine. Output is interactive, and results may be exported for use with external programs. Private anonymous access is available, and user accounts may be generated for easier data management. PMID:23761446
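The unsupervised grouping described above fits Gaussian mixtures of increasing cardinality and keeps the one with the lowest Bayesian information criterion. A generic textbook 1-D version is sketched below; it is not the TMA Navigator implementation, and the two-cluster test data are synthetic:

```python
import numpy as np

def gmm_loglik_1d(x, k, n_iter=200, seed=0):
    """Fit a 1-D Gaussian mixture by EM; return the final log-likelihood."""
    rng = np.random.default_rng(seed)
    mu = rng.choice(x, size=k, replace=False)
    sigma = np.full(k, x.std())
    pi = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        # E-step: responsibilities (the 1/sqrt(2*pi) factor cancels here)
        dens = pi * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / sigma
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: update weights, means and (floored) standard deviations
        nk = resp.sum(axis=0) + 1e-12
        pi = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        sigma = np.maximum(
            np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk), 0.05)
    dens = pi * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / sigma
    return np.log(dens.sum(axis=1)).sum() - 0.5 * len(x) * np.log(2 * np.pi)

def bic(x, k):
    n_params = 3 * k - 1              # k means, k sigmas, k-1 free weights
    return n_params * np.log(len(x)) - 2.0 * gmm_loglik_1d(x, k)

# Two well-separated score clusters: BIC should prefer k = 2 over k = 1
rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(0.0, 0.5, 150), rng.normal(5.0, 0.5, 150)])
bics = [bic(x, k) for k in (1, 2, 3)]
best_k = 1 + int(np.argmin(bics))
```

The BIC penalty grows with the parameter count, so extra components are accepted only when they buy a substantial likelihood gain, which is the regularization role the abstract assigns to it.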
Markert, Sven; Joeris, Klaus
2017-01-01
We developed an automated microtiter plate (MTP)-based system for suspension cell culture to meet the increased demands for miniaturized high throughput applications in biopharmaceutical process development. The generic system is based on off-the-shelf commercial laboratory automation equipment and is able to utilize MTPs of different configurations (6-24 wells per plate) in orbital shaken mode. The shaking conditions were optimized by Computational Fluid Dynamics simulations. The fully automated system handles plate transport, seeding and feeding of cells, daily sampling, and preparation of analytical assays. The integration of all required analytical instrumentation into the system enables hands-off operation, which prevents bottlenecks in sample processing. The modular set-up makes the system flexible and adaptable for a continuous extension of analytical parameters and add-on components. The system proved suitable as a screening tool for process development by verifying the comparability of results for the MTP-based system and bioreactors regarding profiles of viable cell density, lactate, and product concentration of CHO cell lines. These studies confirmed that 6-well MTPs as well as 24-deepwell MTPs were predictive for a scale up to a 1000 L stirred tank reactor (scale factor 1:200,000). Applying the established cell culture system for automated media blend screening in late stage development, a 22% increase in product yield was achieved in comparison to the reference process. The predicted product increase was subsequently confirmed in 2 L bioreactors. Thus, we demonstrated the feasibility of the automated MTP-based cell culture system for enhanced screening and optimization applications in process development and identified further application areas such as process robustness. The system offers a great potential to accelerate time-to-market for new biopharmaceuticals. Biotechnol. Bioeng. 2017;114: 113-121. © 2016 Wiley Periodicals, Inc.
Ihlow, Alexander; Schweizer, Patrick; Seiffert, Udo
2008-01-23
To find candidate genes that potentially influence the susceptibility or resistance of crop plants to powdery mildew fungi, an assay system based on transient-induced gene silencing (TIGS) as well as transient over-expression in single epidermal cells of barley has been developed. However, this system relies on quantitative microscopic analysis of the barley/powdery mildew interaction and will only become a high-throughput tool of phenomics upon automation of the most time-consuming steps. We have developed a high-throughput screening system based on a motorized microscope which evaluates the specimens fully automatically. A large-scale double-blind verification of the system showed an excellent agreement of manual and automated analysis and proved the system to work dependably. Furthermore, in a series of bombardment experiments an RNAi construct targeting the Mlo gene was included, which is expected to phenocopy resistance mediated by recessive loss-of-function alleles such as mlo5. In most cases, the automated analysis system recorded a shift towards resistance upon RNAi of Mlo, thus providing proof of concept for its usefulness in detecting gene-target effects. Besides saving labor and enabling a screening of thousands of candidate genes, this system offers continuous operation of expensive laboratory equipment and provides a less subjective analysis as well as a complete and enduring documentation of the experimental raw data in terms of digital images. In general, it proves the concept of enabling available microscope hardware to handle challenging screening tasks fully automatically.
Automated determination of arterial input function for DCE-MRI of the prostate
NASA Astrophysics Data System (ADS)
Zhu, Yingxuan; Chang, Ming-Ching; Gupta, Sandeep
2011-03-01
Prostate cancer is one of the commonest cancers in the world. Dynamic contrast enhanced MRI (DCE-MRI) provides an opportunity for non-invasive diagnosis, staging, and treatment monitoring. Quantitative analysis of DCE-MRI relies on determination of an accurate arterial input function (AIF). Although several methods for automated AIF detection have been proposed in the literature, none are optimized for use in prostate DCE-MRI, which is particularly challenging due to large spatial signal inhomogeneity. In this paper, we propose a fully automated method for determining the AIF from prostate DCE-MRI. Our method is based on modeling pixel uptake curves as gamma variate functions (GVF). First, we analytically compute bounds on GVF parameters for more robust fitting. Next, we approximate a GVF for each pixel based on local time domain information, and eliminate the pixels with falsely estimated AIFs using the deduced upper and lower bounds. This makes the algorithm robust to signal inhomogeneity. After that, according to spatial information such as similarity and distance between pixels, we formulate the global AIF selection as an energy minimization problem and solve it using a message passing algorithm to further rule out the weak pixels and optimize the detected AIF. Our method is fully automated without training or a priori setting of parameters. Experimental results on clinical data have shown that our method obtained promising detection accuracy (all detected pixels inside major arteries), and a very good match with expert-traced manual AIFs.
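The gamma variate model at the core of this method can be sketched in a few lines. The pure-Python grid-search fitter below, with its parameter ranges and function names, is an illustrative stand-in for the authors' bounded fitting procedure, not their implementation; the amplitude A is solved in closed form for each candidate (alpha, beta) pair.

```python
import math

def gamma_variate(t, t0, A, alpha, beta):
    """Gamma variate model of contrast uptake:
    C(t) = A * (t - t0)**alpha * exp(-(t - t0) / beta) for t > t0, else 0."""
    if t <= t0:
        return 0.0
    dt = t - t0
    return A * dt**alpha * math.exp(-dt / beta)

def fit_gvf_grid(times, curve, t0=0.0,
                 alpha_range=(1.0, 5.0), beta_range=(0.5, 5.0), steps=40):
    """Coarse grid search over bounded (alpha, beta); for each candidate,
    A is the least-squares solution, so only two parameters are searched.
    Returns (sse, A, alpha, beta) for the best grid point."""
    best = None
    for i in range(steps + 1):
        alpha = alpha_range[0] + i * (alpha_range[1] - alpha_range[0]) / steps
        for j in range(steps + 1):
            beta = beta_range[0] + j * (beta_range[1] - beta_range[0]) / steps
            basis = [gamma_variate(t, t0, 1.0, alpha, beta) for t in times]
            denom = sum(b * b for b in basis)
            if denom == 0.0:
                continue
            A = sum(b * y for b, y in zip(basis, curve)) / denom
            sse = sum((A * b - y) ** 2 for b, y in zip(basis, curve))
            if best is None or sse < best[0]:
                best = (sse, A, alpha, beta)
    return best
```

In the paper the bounds are derived analytically per pixel; here they are fixed arguments purely for illustration.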
Peters, Sonja; Kaal, Erwin; Horsting, Iwan; Janssen, Hans-Gerd
2012-02-24
A new method is presented for the analysis of phenolic acids in plasma based on ion-pairing 'Micro-extraction in packed sorbent' (MEPS) coupled on-line to in-liner derivatisation-gas chromatography-mass spectrometry (GC-MS). The ion-pairing reagent served a dual purpose. It was used both to improve extraction yields of the more polar analytes and as the methyl donor in the automated in-liner derivatisation method. In this way, a fully automated procedure for the extraction, derivatisation and injection of a wide range of phenolic acids in plasma samples has been obtained. An extensive optimisation of the extraction and derivatisation procedure has been performed. The entire method showed excellent repeatabilities of under 10% and linearities of 0.99 or better for all phenolic acids. The limits of detection of the optimised method for the majority of phenolic acids were 10 ng/mL or lower, with three phenolic acids having less-favourable detection limits of around 100 ng/mL. Finally, the newly developed method has been applied in a human intervention trial in which the bioavailability of polyphenols from wine and tea was studied. Forty plasma samples could be analysed within 24 h in a fully automated method including sample extraction, derivatisation and gas chromatographic analysis. Copyright © 2011 Elsevier B.V. All rights reserved.
Hayashi, Nobuyoshi; Sugawara, Tohru; Shintani, Motoaki; Kato, Shinji
1989-01-01
A versatile automated apparatus, equipped with artificial intelligence, has been developed which may be used to prepare and isolate a wide variety of compounds. The prediction of the optimum reaction conditions and the reaction control in real time are accomplished using novel kinetic equations and substituent effects in artificial intelligence software which has already been reported [1]. This paper deals with the design and construction of the fully automated system, and its application to the synthesis of a substituted N-(carboxyalkyl)amino acid. The apparatus is composed of units for performing various tasks, e.g. reagent supply, reaction, purification and separation, each linked to a control system. All synthetic processes, including washing and drying of the apparatus after each synthetic run, were automatically performed from the mixing of the reactants to the isolation of the products as powders with purities of greater than 98%. The automated apparatus has been able to run for 24 hours per day, and the average rate of synthesis of substituted N-(carboxyalkyl)amino acids has been three compounds daily. The apparatus is extremely valuable for synthesizing many derivatives of one particular compound structure. Even if the chemical yields are low under the optimum conditions, it is still possible to obtain a sufficient amount of the desired product by repetition of the reaction. Moreover, it was possible to greatly reduce the manual involvement in the many syntheses which are a necessary part of pharmaceutical research. PMID:18924679
New York State Thruway Authority automatic vehicle classification (AVC) : research report.
DOT National Transportation Integrated Search
2008-03-31
In December 2007, the N.Y.S. Thruway Authority (Thruway) concluded a Federally funded research effort to study technology and develop a design for retrofitting devices required in implementing a fully automated vehicle classification system i...
Improving the accuracy and usability of Iowa falling weight deflectometer data.
DOT National Transportation Integrated Search
2013-05-01
This study aims to improve the accuracy and usability of Iowa Falling Weight Deflectometer (FWD) data by incorporating significant enhancements into the fully-automated software system for rapid processing of the FWD data. These enhancements includ...
Self Consistent Bathymetric Mapping From Robotic Vehicles in the Deep Ocean
2005-06-01
that have been aligned in a consistent manner. Experimental results from the fully automated processing of a multibeam survey over the TAG hydrothermal structure at the Mid-Atlantic ridge are presented to validate the proposed method.
Plant Habitat Facility in the JPM
2017-11-21
iss053e234714 (Nov. 21, 2017) --- Advanced Plant Habitat (APH) Facility in the Japanese Experiment Module (JEM) Pressurized Module (JPM). The Plant Habitat is a fully automated facility that provides a large, enclosed, environmentally-controlled chamber for plant bioscience research.
Level of Automation and Failure Frequency Effects on Simulated Lunar Lander Performance
NASA Technical Reports Server (NTRS)
Marquez, Jessica J.; Ramirez, Margarita
2014-01-01
A human-in-the-loop experiment was conducted at the NASA Ames Research Center Vertical Motion Simulator, where instrument-rated pilots completed a simulated terminal descent phase of a lunar landing. Ten pilots participated in a 2 x 2 mixed design experiment, with level of automation as the within-subjects factor and failure frequency as the between-subjects factor. The two evaluated levels of automation were high (fully automated landing) and low (manual controlled landing). During test trials, participants were exposed to either a high number of failures (75% failure frequency) or a low number of failures (25% failure frequency). In order to investigate the pilots' sensitivity to changes in levels of automation and failure frequency, the dependent measure selected for this experiment was accuracy of failure diagnosis, from which D Prime and Decision Criterion were derived. For each of the dependent measures, no significant difference was found for level of automation and no significant interaction was detected between level of automation and failure frequency. A significant effect was identified for failure frequency, suggesting failure frequency has a significant effect on pilots' sensitivity to failure detection and diagnosis. Participants were more likely to correctly identify and diagnose failures if they experienced the higher level of failures, regardless of level of automation.
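D Prime and Decision Criterion are standard signal-detection quantities derived from hit and false-alarm rates. A minimal sketch follows; the log-linear correction for extreme rates is a common convention, not necessarily the one used in the study, and the function name is illustrative.

```python
from statistics import NormalDist

def dprime_criterion(hits, misses, false_alarms, correct_rejections):
    """Compute sensitivity (d') and decision criterion (c) from
    failure-diagnosis counts: d' = z(H) - z(F), c = -(z(H) + z(F)) / 2.
    A log-linear correction (0.5 added to each cell) guards against
    hit or false-alarm rates of exactly 0 or 1."""
    z = NormalDist().inv_cdf
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))
    return d_prime, criterion
```

A symmetric outcome (as many hits as correct rejections, as many misses as false alarms) yields an unbiased criterion near zero.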
Gao, Shan; van 't Klooster, Ronald; Brandts, Anne; Roes, Stijntje D; Alizadeh Dehnavi, Reza; de Roos, Albert; Westenberg, Jos J M; van der Geest, Rob J
2017-01-01
To develop and evaluate a method that can fully automatically identify the vessel wall boundaries and quantify the wall thickness for both common carotid artery (CCA) and descending aorta (DAO) from axial magnetic resonance (MR) images. 3T MRI data acquired with a T1-weighted gradient-echo black-blood imaging sequence from carotid (39 subjects) and aorta (39 subjects) were used to develop and test the algorithm. The vessel wall segmentation was achieved by respectively fitting a 3D cylindrical B-spline surface to the boundaries of the lumen and outer wall. The tube-fitting was based on the edge detection performed on the signal intensity (SI) profile along the surface normal. To achieve a fully automated process, a Hough Transform (HT) was developed to estimate the lumen centerline and radii for the target vessel. Using the outputs of HT, a tube model for lumen segmentation was initialized and deformed to fit the image data. Finally, the lumen segmentation was dilated to initiate the adaptation procedure of the outer wall tube. The algorithm was validated by determining: 1) its performance against manual tracing; 2) its interscan reproducibility in quantifying vessel wall thickness (VWT); 3) its capability of detecting VWT differences in hypertensive patients compared with healthy controls. Statistical analysis including Bland-Altman analysis, t-test, and sample size calculation was performed for the purpose of algorithm evaluation. The mean distance between the manual and automatically detected lumen/outer wall contours was 0.00 ± 0.23/0.09 ± 0.21 mm for CCA and 0.12 ± 0.24/0.14 ± 0.35 mm for DAO. No significant difference was observed between the interscan VWT assessments using automated segmentation for both CCA (P = 0.19) and DAO (P = 0.94). Both manual and automated segmentation detected significantly higher carotid (P = 0.016 and P = 0.005) and aortic (P < 0.001 and P = 0.021) wall thickness in the hypertensive patients.
A reliable and reproducible pipeline for fully automatic vessel wall quantification was developed and validated on healthy volunteers as well as patients with increased vessel wall thickness. This method holds promise for helping in efficient image interpretation for large-scale cohort studies. Level of Evidence: 4. J. Magn. Reson. Imaging 2017;45:215-228. © 2016 International Society for Magnetic Resonance in Medicine.
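The Hough Transform step used above to estimate the lumen centerline can be illustrated with a simple 2D circle-center voting scheme: each edge point votes for every center that would place it on a circle of the given radius, and the modal accumulator cell is the center estimate. The cell size, angle count, and function name here are illustrative assumptions, not the paper's implementation.

```python
import math
from collections import Counter

def hough_circle_center(edge_points, radius, grid=1.0, n_angles=72):
    """Classic circle Hough voting for a known radius: each edge point
    casts votes along the locus of possible centers; the accumulator
    cell with the most votes is returned as the center estimate."""
    votes = Counter()
    for (x, y) in edge_points:
        for k in range(n_angles):
            theta = 2.0 * math.pi * k / n_angles
            cx = round((x - radius * math.cos(theta)) / grid)
            cy = round((y - radius * math.sin(theta)) / grid)
            votes[(cx, cy)] += 1
    (cx, cy), _ = votes.most_common(1)[0]
    return cx * grid, cy * grid
```

In the vessel-wall setting this would run per slice on lumen edge pixels, and the per-slice centers would be stacked into the 3D centerline that initializes the tube model.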
Synthesis of many different types of organic small molecules using one automated process.
Li, Junqi; Ballmer, Steven G; Gillis, Eric P; Fujii, Seiko; Schmidt, Michael J; Palazzolo, Andrea M E; Lehmann, Jonathan W; Morehouse, Greg F; Burke, Martin D
2015-03-13
Small-molecule synthesis usually relies on procedures that are highly customized for each target. A broadly applicable automated process could greatly increase the accessibility of this class of compounds to enable investigations of their practical potential. Here we report the synthesis of 14 distinct classes of small molecules using the same fully automated process. This was achieved by strategically expanding the scope of a building block-based synthesis platform to include even C(sp3)-rich polycyclic natural product frameworks and discovering a catch-and-release chromatographic purification protocol applicable to all of the corresponding intermediates. With thousands of compatible building blocks already commercially available, many small molecules are now accessible with this platform. More broadly, these findings illuminate an actionable roadmap to a more general and automated approach for small-molecule synthesis. Copyright © 2015, American Association for the Advancement of Science.
Eye Tracking Metrics for Workload Estimation in Flight Deck Operation
NASA Technical Reports Server (NTRS)
Ellis, Kyle; Schnell, Thomas
2010-01-01
Flight decks of the future are being enhanced through improved avionics that adapt to both aircraft and operator state. Eye tracking allows for non-invasive analysis of pilot eye movements, from which a set of metrics can be derived to effectively and reliably characterize workload. This research identifies eye tracking metrics that correlate to aircraft automation conditions, and identifies the correlation of pilot workload to the same automation conditions. Saccade length was used as an indirect index of pilot workload: pilots in the fully automated condition were observed to have, on average, larger saccadic movements than in the guidance and manual flight conditions. The data set itself also provides a general model of human eye movement behavior and thus, ostensibly, of visual attention distribution in the cockpit for approach-to-land tasks with various levels of automation, by means of the same metrics used for workload algorithm development.
Modeling the Energy Use of a Connected and Automated Transportation System (Poster)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gonder, J.; Brown, A.
Early research points to large potential impacts of connected and automated vehicles (CAVs) on transportation energy use - dramatic savings, increased use, or anything in between. Due to a lack of suitable data and integrated modeling tools to explore these complex future systems, analyses to date have relied on simple combinations of isolated effects. This poster proposes a framework for modeling the potential energy implications from increasing penetration of CAV technologies and for assessing technology and policy options to steer them toward favorable energy outcomes. Current CAV modeling challenges include estimating behavior change, understanding potential vehicle-to-vehicle interactions, and assessing traffic flow and vehicle use under different automation scenarios. To bridge these gaps and develop a picture of potential future automated systems, NREL is integrating existing modeling capabilities with additional tools and data inputs to create a more fully integrated CAV assessment toolkit.
Merat, Natasha; Lee, John D
2012-10-01
This special section brings together diverse research regarding driver interaction with advanced automotive technology to guide design of increasingly automated vehicles. Rapidly evolving vehicle automation will likely change cars and trucks more in the next 5 years than the preceding 50, radically redefining what it means to drive. This special section includes 10 articles from European and North American researchers reporting simulator and naturalistic driving studies. Little research has considered the consequences of fully automated driving, with most focusing on lane-keeping and speed control systems individually. The studies reveal two underlying design philosophies: automate driving versus support driving. Results of several studies, consistent with previous research in other domains, suggest that the automate philosophy can delay driver responses to incidents in which the driver has to intervene and take control from the automation. Understanding how to orchestrate the transfer or sharing of control between the system and the driver, particularly in critical incidents, emerges as a central challenge. Designers should not assume that automation can substitute seamlessly for a human driver, nor can they assume that the driver can safely accommodate the limitations of automation. Designers, policy makers, and researchers must give careful consideration to what role the person should have in highly automated vehicles and how to support the driver if the driver is to be responsible for vehicle control. As in other domains, driving safety increasingly depends on the combined performance of the human and automation, and successful designs will depend on recognizing and supporting the new roles of the driver.
Smart sign enhancement : executive summary report.
DOT National Transportation Integrated Search
2007-09-01
In the Smart Sign Ordering System (Phase I) the University of Akron developed an on-line interactive traffic-sign ordering system for ODOT. The main focus of SSOS Phase I was to provide ODOT with a fully automated and networked sign orderin...
Automated system for analyzing the activity of individual neurons
NASA Technical Reports Server (NTRS)
Bankman, Isaac N.; Johnson, Kenneth O.; Menkes, Alex M.; Diamond, Steve D.; Oshaughnessy, David M.
1993-01-01
This paper presents a signal processing system that: (1) provides an efficient and reliable instrument for investigating the activity of neuronal assemblies in the brain; and (2) demonstrates the feasibility of generating the command signals of prostheses using the activity of relevant neurons in disabled subjects. The system operates on-line, in a fully automated manner, and can recognize the transient waveforms of several neurons in extracellular neurophysiological recordings. Optimal algorithms for detection, classification, and resolution of overlapping waveforms are developed and evaluated. Full automation is made possible by an algorithm that can set appropriate decision thresholds and an algorithm that can generate templates on-line. The system is implemented with a fast IBM PC compatible processor board that allows on-line operation.
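The automatic decision-threshold step can be sketched with a robust noise estimate. The MAD-based threshold and local-peak rule below are a common approach in spike detection, not necessarily the authors' algorithm, and the constant 4.0 is an illustrative choice.

```python
def detect_spikes(signal, k=4.0):
    """Set the detection threshold automatically as k times the noise SD,
    where the SD is estimated from the median absolute amplitude (robust
    to the spikes themselves), then return indices of local peaks that
    exceed the threshold."""
    sorted_abs = sorted(abs(v) for v in signal)
    mad = sorted_abs[len(sorted_abs) // 2]
    sigma = mad / 0.6745            # MAD-to-SD factor for Gaussian noise
    thr = k * sigma
    peaks = []
    for i in range(1, len(signal) - 1):
        if signal[i] > thr and signal[i] >= signal[i - 1] and signal[i] > signal[i + 1]:
            peaks.append(i)
    return thr, peaks
```

Detected peaks would then be passed to the classification stage, where waveforms are matched against the on-line generated templates.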
Explosive Transient Camera (ETC) Program
NASA Technical Reports Server (NTRS)
Ricker, George
1991-01-01
Since the inception of the ETC program, a wide range of new technologies was developed to support this astronomical instrument. The prototype unit was installed at ETC Site 1. The first partially automated observations were made and some major renovations were later added to the ETC hardware. The ETC was outfitted with new thermoelectrically-cooled CCD cameras and a sophisticated vacuum manifold, which, together, made the ETC a much more reliable unit than the prototype. The ETC instrumentation and building were placed under full computer control, allowing the ETC to operate as an automated, autonomous instrument with virtually no human intervention necessary. The first fully-automated operation of the ETC was performed, during which the ETC monitored the error region of the repeating soft gamma-ray burster SGR 1806-21.
NASA Astrophysics Data System (ADS)
Shimchuk, G.; Shimchuk, Gr; Pakhomov, G.; Avalishvili, G.; Zavrazhnov, G.; Polonsky-Byslaev, I.; Fedotov, A.; Polozov, P.
2017-01-01
One of the prospective directions of PET development is the use of generator-produced positron-emitting nuclides [1,2]. Introduction of this technology is financially promising, since it does not require an expensive dedicated accelerator and radiochemical laboratory in the medical institution, which considerably reduces the cost of PET diagnostics and makes it available to more patients. POZITOM-PRO RPC LLC developed and produced an 82Sr-82Rb generator; an automated injection system designed for automatic and fully-controlled injections of the 82RbCl produced by this generator; and automated radiopharmaceutical synthesis units based on 68Ga from a domestically-manufactured 68Ge-68Ga generator for preparing two pharmaceuticals: Ga-68-DOTA-TATE and Vascular Ga-68.
Flip the tip: an automated, high quality, cost-effective patch clamp screen.
Lepple-Wienhues, Albrecht; Ferlinz, Klaus; Seeger, Achim; Schäfer, Arvid
2003-01-01
The race for creating an automated patch clamp has begun. Here, we present a novel technology to produce true gigaseals and whole cell preparations at a high rate. Suspended cells are flushed toward the tip of glass micropipettes. Seal, whole-cell break-in, and pipette/liquid handling are fully automated. Extremely stable seals and access resistance guarantee high recording quality. Data obtained from different cell types sealed inside pipettes show long-term stability, voltage clamp and seal quality, as well as block by compounds in the pM range. A flexible array of independent electrode positions minimizes consumables consumption at maximal throughput. Pulled micropipettes guarantee a proven gigaseal substrate with ultra clean and smooth surface at low cost.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Torres, P.; Luque de Castro, M.D.
1996-12-31
A fully automated method for the determination of organochlorine pesticides in vegetables is proposed. The overall system acts as an "analytical black box" because a robotic station performs the preliminary operations, from weighing to the capping of the leached analytes and their location in an autosampler of an automated gas chromatograph with electron capture detection. The method has been applied to the determination of lindane, heptachlor, captan, chlordane and methoxychlor in tea, marjoram, cinnamon, pennyroyal, and mint with good results in most cases. A gas chromatograph has been interfaced to a robotic station for the determination of pesticides in vegetables. 15 refs., 4 figs., 2 tabs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aurah, Mirwaise Y.; Roberts, Mark A.
Washington River Protection Solutions (WRPS), operator of High Level Radioactive Waste (HLW) Tank Farms at the Hanford Site, is taking an over 20-year leap in technology, replacing systems that were monitored with clipboards and obsolete computer systems, as well as solving major operations and maintenance hurdles in the area of process automation and information management. While WRPS is fully compliant with procedures and regulations, the current systems are not integrated and do not share data efficiently, hampering how information is obtained and managed.
The NLM Indexing Initiative's Medical Text Indexer.
Aronson, Alan R; Mork, James G; Gay, Clifford W; Humphrey, Susanne M; Rogers, Willie J
2004-01-01
The Medical Text Indexer (MTI) is a program for producing MeSH indexing recommendations. It is the major product of NLM's Indexing Initiative and has been used in both semi-automated and fully automated indexing environments at the Library since mid 2002. We report here on an experiment conducted with MEDLINE indexers to evaluate MTI's performance and to generate ideas for its improvement as a tool for user-assisted indexing. We also discuss some filtering techniques developed to improve MTI's accuracy for use primarily in automatically producing the indexing for several abstracts collections.
Controlling Wafer Contamination Using Automated On-Line Metrology during Wet Chemical Cleaning
NASA Astrophysics Data System (ADS)
Wang, Jason; Kingston, Skip; Han, Ye; Saini, Harmesh; McDonald, Robert; Mui, Rudy
2003-09-01
The capabilities of a trace contamination analyzer are discussed and demonstrated. This analytical tool utilizes an electrospray, time-of-flight mass spectrometer (ES-TOF-MS) for fully automated on-line monitoring of wafer cleaning solutions. The analyzer provides rich information on metallic, anionic, cationic, elemental, and organic species through its ability to provide harsh (elemental) and soft (molecular) ionization under both positive and negative modes. It is designed to meet semiconductor process control and yield management needs for the ever increasing complex new chemistries present in wafer fabrication.
On automating domain connectivity for overset grids
NASA Technical Reports Server (NTRS)
Chiu, Ing-Tsau
1994-01-01
An alternative method for domain connectivity among systems of overset grids is presented. Reference uniform Cartesian systems of points are used to achieve highly efficient domain connectivity, and form the basis for a future fully automated system. The Cartesian systems are used to approximate body surfaces and to map the computational space of component grids. By exploiting the characteristics of Cartesian systems, Chimera-type hole-cutting and identification of donor elements for intergrid boundary points can be carried out very efficiently. The method is tested for a range of geometrically complex multiple-body overset grid systems.
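The efficiency of uniform reference Cartesian systems comes from constant-time point-to-cell mapping: a physical point indexes directly into a cell, which narrows the search for hole-cutting tests and candidate donor elements. The minimal sketch below (names and API are illustrative, not the paper's code) shows that lookup.

```python
def cartesian_cell_index(point, origin, spacing, dims):
    """Map a physical point into a uniform Cartesian reference grid.
    The returned integer cell index gives O(1) lookup of candidate
    donor elements for intergrid boundary points; None means the point
    lies outside the reference box."""
    idx = []
    for p, o, h, n in zip(point, origin, spacing, dims):
        i = int((p - o) // h)       # integer cell coordinate along this axis
        if i < 0 or i >= n:
            return None
        idx.append(i)
    return tuple(idx)
```

In a full overset-grid tool, each cell would carry a precomputed list of component-grid elements overlapping it, so donor identification reduces to this index computation plus a short local search.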
AIRSAR Automated Web-based Data Processing and Distribution System
NASA Technical Reports Server (NTRS)
Chu, Anhua; vanZyl, Jakob; Kim, Yunjin; Lou, Yunling; Imel, David; Tung, Wayne; Chapman, Bruce; Durden, Stephen
2005-01-01
In this paper, we present an integrated, end-to-end synthetic aperture radar (SAR) processing system that accepts data processing requests, submits processing jobs, performs quality analysis, delivers and archives processed data. This fully automated SAR processing system utilizes database and internet/intranet web technologies to allow external users to browse and submit data processing requests and receive processed data. It is a cost-effective way to manage a robust SAR processing and archival system. The integration of these functions has reduced operator errors and increased processor throughput dramatically.
Automation of TL brick dating by ADAM-1
NASA Astrophysics Data System (ADS)
Čechák, T.; Gerndt, J.; Hiršl, P.; Jiroušek, P.; Kanaval, J.; Kubelík, M.; Musílek, L.
2001-06-01
A specially adapted machine, ADAM-1, for the thermoluminescence fine-grain dating of bricks was constructed in an interdisciplinary research project undertaken by a team recruited from three faculties of the Czech Technical University in Prague. This TL-reader is able to measure and evaluate numerous samples automatically. The sample holder has 60 sample positions, which allow the irradiation and evaluation of samples taken from two locations. All procedures of alpha and beta irradiation with varying doses and the TL-signal measurement, as well as the age evaluation and error assessment, are programmable and fully automated.
Automated Verification of Specifications with Typestates and Access Permissions
NASA Technical Reports Server (NTRS)
Siminiceanu, Radu I.; Catano, Nestor
2011-01-01
We propose an approach to formally verify Plural specifications based on access permissions and typestates, by model-checking automatically generated abstract state-machines. Our exhaustive approach captures all the possible behaviors of abstract concurrent programs implementing the specification. We describe the formal methodology employed by our technique and provide an example as proof of concept for the state-machine construction rules. The implementation of a fully automated algorithm to generate and verify models, currently underway, provides model checking support for the Plural tool, which currently supports only program verification via data flow analysis (DFA).
Schwaenen, Carsten; Nessling, Michelle; Wessendorf, Swen; Salvi, Tatjana; Wrobel, Gunnar; Radlwimmer, Bernhard; Kestler, Hans A.; Haslinger, Christian; Stilgenbauer, Stephan; Döhner, Hartmut; Bentz, Martin; Lichter, Peter
2004-01-01
B cell chronic lymphocytic leukemia (B-CLL) is characterized by a highly variable clinical course. Recurrent chromosomal imbalances provide significant prognostic markers. Risk-adapted therapy based on genomic alterations has become an option that is currently being tested in clinical trials. To supply a robust tool for such large scale studies, we developed a comprehensive DNA microarray dedicated to the automated analysis of recurrent genomic imbalances in B-CLL by array-based comparative genomic hybridization (matrix–CGH). Validation of this chip in a series of 106 B-CLL cases revealed a high specificity and sensitivity that fulfils the criteria for application in clinical oncology. This chip is immediately applicable within clinical B-CLL treatment trials that evaluate whether B-CLL cases with distinct chromosomal abnormalities should be treated with chemotherapy of different intensities and/or stem cell transplantation. Through the control set of DNA fragments equally distributed over the genome, recurrent genomic imbalances were discovered: trisomy of chromosome 19 and gain of the MYCN oncogene correlating with an elevation of MYCN mRNA expression. PMID:14730057
Costa, Susana P F; Pinto, Paula C A G; Lapa, Rui A S; Saraiva, M Lúcia M F S
2015-03-02
A fully automated Vibrio fischeri methodology based on sequential injection analysis (SIA) has been developed. The methodology was based on the aspiration of 75 μL of bacteria and 50 μL of inhibitor followed by measurement of the luminescence of the bacteria. The assays were conducted for contact times of 5, 15, and 30 min, by means of three mixing chambers that ensured adequate mixing conditions. The optimized methodology provided precise control of the reaction conditions, which is an asset for the analysis of a large number of samples. The developed methodology was applied to the evaluation of the impact of a set of ionic liquids (ILs) on V. fischeri and the results were compared with those provided by a conventional assay kit (Biotox®). The collected data evidenced the influence of different cation head groups and anion moieties on the toxicity of ILs. Generally, aromatic cations and fluorine-containing anions displayed a higher impact on V. fischeri, evidenced by lower EC50 values. The proposed methodology was validated through statistical analysis, which demonstrated a strong positive correlation (P>0.98) between assays. It is expected that the automated methodology can be tested for more classes of compounds and used as an alternative to microplate-based V. fischeri assay kits. Copyright © 2014 Elsevier B.V. All rights reserved.
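EC50 values such as those compared above are the concentrations producing 50% inhibition of bacterial luminescence. A minimal log-scale interpolation sketch follows; real toxicity assays usually fit a full dose-response (Hill) model, so this helper and its name are illustrative only.

```python
import math

def ec50_from_inhibition(concs, inhibition):
    """Estimate EC50 (concentration giving 50% luminescence inhibition)
    by linear interpolation on a log10-concentration scale between the
    two data points that bracket 50%. Inputs must be sorted by
    increasing concentration, with inhibition in percent."""
    pairs = list(zip(concs, inhibition))
    for (c1, i1), (c2, i2) in zip(pairs, pairs[1:]):
        if i1 <= 50.0 <= i2:
            f = (50.0 - i1) / (i2 - i1)
            return 10 ** (math.log10(c1) + f * (math.log10(c2) - math.log10(c1)))
    raise ValueError("50% inhibition not bracketed by the data")
```

Lower EC50 means a lower concentration already halves the luminescence, i.e. a more toxic compound, which is how the cation/anion effects above are ranked.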
Weber, Emanuel; Pinkse, Martijn W. H.; Bener-Aksam, Eda; Vellekoop, Michael J.; Verhaert, Peter D. E. M.
2012-01-01
We present a fully automated setup for performing in-line mass spectrometry (MS) analysis of conditioned media in cell cultures, in particular focusing on the peptides therein. The goal is to assess peptides secreted by cells in different culture conditions. The developed system is compatible with MS as analytical technique, as this is one of the most powerful analysis methods for peptide detection and identification. Proof of concept was achieved using the well-known mating-factor signaling in baker's yeast, Saccharomyces cerevisiae. Our concept system holds 1 mL of cell culture medium and allows maintaining a yeast culture for, at least, 40 hours with continuous supernatant extraction (and medium replenishing). The device's small dimensions result in reduced costs for reagents and open perspectives towards full integration on-chip. Experimental data that can be obtained are time-resolved peptide profiles in a yeast culture, including information about the appearance of mating-factor-related peptides. We emphasize that the system operates without any manual intervention or pipetting steps, which allows for an improved overall sensitivity compared to non-automated alternatives. MS data confirmed previously reported aspects of the physiology of the yeast-mating process. Moreover, mating-factor breakdown products (as well as evidence for a potentially responsible protease) were found. PMID:23091722
Faded-example as a Tool to Acquire and Automate Mathematics Knowledge
NASA Astrophysics Data System (ADS)
Retnowati, E.
2017-04-01
Knowledge acquisition and automation are accomplished by the students themselves. The teacher acts as a facilitator by creating mathematics tasks that help students build knowledge efficiently and effectively. The cognitive load imposed by the learning material presented by teachers should be considered a critical factor. While intrinsic cognitive load relates to the degree of complexity of the learning material one can handle, extraneous cognitive load is directly caused by how the material is presented. Strategies for presenting learning material in computational domains such as mathematics are, namely, the worked example (a fully-guided task) and problem-solving (a discovery task with no guidance). According to the empirical evidence, learning based on problem-solving may impose high extraneous cognitive load on students with limited prior knowledge; conversely, learning based on worked examples may impose high extraneous cognitive load on students who have already mastered the knowledge base. An alternative is the faded example, a partly-completed task. Learning from faded examples can support students who have already acquired some knowledge of the to-be-learned material but still need more practice to automate that knowledge further. This instructional strategy provides a smooth transition from fully-guided learning to independent problem solving. Designs of faded examples for learning trigonometry are discussed.
NASA Astrophysics Data System (ADS)
Wang, Tong-Hong; Chen, Tse-Ching; Teng, Xiao; Liang, Kung-Hao; Yeh, Chau-Ting
2015-08-01
Liver fibrosis assessment by biopsy and conventional staining scores is based on histopathological criteria. Variations in sample preparation and the use of semi-quantitative histopathological methods commonly result in discrepancies between medical centers. Thus, minor changes in liver fibrosis might be overlooked in multi-center clinical trials, leading to statistically non-significant data. Here, we developed a computer-assisted, fully automated, staining-free method for hepatitis B-related liver fibrosis assessment. In total, 175 liver biopsies were divided into training (n = 105) and verification (n = 70) cohorts. Collagen was observed using second harmonic generation (SHG) microscopy without prior staining, and hepatocyte morphology was recorded using two-photon excitation fluorescence (TPEF) microscopy. The training cohort was utilized to establish a quantification algorithm. Eleven of 19 computer-recognizable SHG/TPEF microscopic morphological features were significantly correlated with the ISHAK fibrosis stages (P < 0.001). A biphasic scoring method was applied, combining support vector machine and multivariate generalized linear models to assess the early and late stages of fibrosis, respectively, based on these parameters. The verification cohort was used to verify the scoring method, and the area under the receiver operating characteristic curve was >0.82 for liver cirrhosis detection. Since no subjective gradings are needed, interobserver discrepancies could be avoided using this fully automated method.
NASA Astrophysics Data System (ADS)
Zhang, Xirui; Daaboul, George G.; Spuhler, Philipp S.; Dröge, Peter; Ünlü, M. Selim
2016-03-01
DNA-binding proteins play crucial roles in the maintenance and functions of the genome and yet, their specific binding mechanisms are not fully understood. Recently, it was discovered that DNA-binding proteins recognize specific binding sites to carry out their functions through an indirect readout mechanism by recognizing and capturing DNA conformational flexibility and deformation. High-throughput DNA microarray-based methods that provide large-scale protein-DNA binding information have shown effective and comprehensive analysis of protein-DNA binding affinities, but do not provide information of DNA conformational changes in specific protein-DNA complexes. Building on the high-throughput capability of DNA microarrays, we demonstrate a quantitative approach that simultaneously measures the amount of protein binding to DNA and nanometer-scale DNA conformational change induced by protein binding in a microarray format. Both measurements rely on spectral interferometry on a layered substrate using a single optical instrument in two distinct modalities. In the first modality, we quantitate the amount of binding of protein to surface-immobilized DNA in each DNA spot using a label-free spectral reflectivity technique that accurately measures the surface densities of protein and DNA accumulated on the substrate. In the second modality, for each DNA spot, we simultaneously measure DNA conformational change using a fluorescence vertical sectioning technique that determines average axial height of fluorophores tagged to specific nucleotides of the surface-immobilized DNA. 
The approach presented in this paper, when combined with current high-throughput DNA microarray-based technologies, has the potential to serve as a rapid and simple method for quantitative and large-scale characterization of conformation-specific protein-DNA interactions. Electronic supplementary information (ESI) available: DNA sequences and nomenclature (Table 1S); SDS-PAGE assay of IHF stock solution (Fig. 1S); determination of the concentration of IHF stock solution by Bradford assay (Fig. 2S); equilibrium binding isotherm fitting results of other DNA sequences (Table 2S); calculation of dissociation constants (Fig. 3S, 4S; Table 2S); geometric model for quantitation of DNA bending angle induced by specific IHF binding (Fig. 4S); customized flow cell assembly (Fig. 5S); real-time measurement of average fluorophore height change by SSFM (Fig. 6S); summary of binding parameters obtained from additive isotherm model fitting (Table 3S); average surface densities of 10 dsDNA spots and bound IHF at equilibrium (Table 4S); effects of surface densities on the binding and bending of dsDNA (Tables 5S, 6S and Fig. 7S-10S). See DOI: 10.1039/c5nr06785e
Anderson, Courtney M.; Zhang, Bingqing; Miller, Melanie; Butko, Emerald; Wu, Xingyong; Laver, Thomas; Kernag, Casey; Kim, Jeffrey; Luo, Yuling; Lamparski, Henry; Park, Emily; Su, Nan
2016-01-01
Biomarkers such as DNA, RNA, and protein are powerful tools in clinical diagnostics and therapeutic development for many diseases. Identifying RNA expression at the single cell level within the morphological context by RNA in situ hybridization (ISH) provides a great deal of information on gene expression changes over conventional techniques that analyze bulk tissue, yet widespread use of this technique in the clinical setting has been hampered by the dearth of automated RNA ISH assays. Here we present an automated version of the RNA ISH technology RNAscope that is adaptable to multiple automation platforms. The automated RNAscope assay yields a high signal‐to‐noise ratio with little to no background staining and results comparable to the manual assay. In addition, the automated duplex RNAscope assay was able to detect two biomarkers simultaneously. Lastly, assay consistency and reproducibility were confirmed by quantification of TATA‐box binding protein (TBP) mRNA signals across multiple lots and multiple experiments. Taken together, the data presented in this study demonstrate that the automated RNAscope technology is a high performance RNA ISH assay with broad applicability in biomarker research and diagnostic assay development. J. Cell. Biochem. 117: 2201–2208, 2016. © 2016 The Authors. Journal of Cellular Biochemistry Published by Wiley Periodicals, Inc. PMID:27191821
Circular RNA expression profiles in hippocampus from mice with perinatal glyphosate exposure.
Yu, Ning; Tong, Yun; Zhang, Danni; Zhao, Shanshan; Fan, Xinli; Wu, Lihui; Ji, Hua
2018-07-02
Glyphosate is the active ingredient in numerous herbicide formulations. The roles of glyphosate in embryo-toxicity and neurotoxicity have been reported in human and animal models. Recently, several studies have reported evidence linking neurodevelopmental disorders (NDDs) with gestational glyphosate exposure. However, the role of glyphosate in neuronal development is still not fully understood. Our previous study found that perinatal glyphosate exposure resulted in differential microRNA expression in the prefrontal cortex of mouse offspring. However, the mechanism of glyphosate-induced neurotoxicity in the developing brain is still not fully understood. Considering the pivotal role of circular RNAs (circRNAs) in the regulation of gene expression, a circRNA microarray method was used in this study to investigate circRNA expression changes in the hippocampus of mice with perinatal glyphosate exposure. The circRNA microarrays revealed that 663 circRNAs were significantly altered in the perinatal glyphosate exposure group compared with the control group. Among them, 330 were significantly upregulated, and the other 333 were downregulated. Furthermore, the relative expression levels of mmu-circRNA-014015, mmu-circRNA-28128 and mmu-circRNA-29837 were verified using quantitative real-time polymerase chain reaction (qRT-PCR). Gene Ontology (GO) and Kyoto Encyclopedia of Genes and Genomes (KEGG) pathway analyses demonstrated that stress-associated steroid metabolism pathways, such as aldosterone synthesis and secretion pathways, may be involved in the neurotoxicity of glyphosate. These results showed that circRNAs are aberrantly expressed in the hippocampus of mice with perinatal glyphosate exposure and play potential roles in glyphosate-induced neurotoxicity. Copyright © 2018 Elsevier Inc. All rights reserved.
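The up/down split reported above follows the usual differential-expression pattern of thresholding fold change together with significance. A minimal numpy sketch with hypothetical values (the |log2 FC| ≥ 1 and p < 0.05 cutoffs are illustrative assumptions, not the study's criteria):

```python
import numpy as np

# Hypothetical per-circRNA log2 fold changes (exposed vs. control) and p-values.
log2_fc = np.array([2.1, -1.8, 0.3, 1.2, -0.2, -2.5])
p_vals  = np.array([0.01, 0.02, 0.40, 0.03, 0.60, 0.001])

significant = p_vals < 0.05          # significance filter (illustrative cutoff)
up   = significant & (log2_fc >= 1.0)   # upregulated candidates
down = significant & (log2_fc <= -1.0)  # downregulated candidates
print(up.sum(), down.sum())  # -> 2 2
```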
NASA Astrophysics Data System (ADS)
Wahi-Anwar, M. Wasil; Emaminejad, Nastaran; Hoffman, John; Kim, Grace H.; Brown, Matthew S.; McNitt-Gray, Michael F.
2018-02-01
Quantitative imaging in lung cancer CT seeks to characterize nodules through quantitative features, usually from a region of interest delineating the nodule. The segmentation, however, can vary depending on segmentation approach and image quality, which can affect the extracted feature values. In this study, we utilize a fully-automated nodule segmentation method - to avoid reader-influenced inconsistencies - to explore the effects of varied dose levels and reconstruction parameters on segmentation. Raw projection CT images from a low-dose screening patient cohort (N=59) were reconstructed at multiple dose levels (100%, 50%, 25%, 10%), two slice thicknesses (1.0mm, 0.6mm), and a medium kernel. Fully-automated nodule detection and segmentation was then applied, from which 12 nodules were selected. The Dice similarity coefficient (DSC) was used to assess the similarity of the segmentation ROIs of the same nodule across different reconstruction and dose conditions. Nodules at 1.0mm slice thickness and dose levels of 25% and 50% resulted in DSC values greater than 0.85 when compared to 100% dose, with lower dose leading to a lower average and wider spread of DSC values. At 0.6mm, the increased bias and wider spread of DSC values from lowering dose were more pronounced. The effects of dose reduction on DSC for CAD-segmented nodules were similar in magnitude to reducing the slice thickness from 1.0mm to 0.6mm. In conclusion, variation of dose and slice thickness can result in very different segmentations because of noise and image quality. However, there exists some stability in segmentation overlap: even at 1.0mm slice thickness, an image at 25% of the low-dose scan's dose still results in segmentations similar to those seen in a full-dose scan.
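The Dice similarity coefficient used above measures overlap between two binary masks A and B as DSC = 2|A∩B| / (|A| + |B|), ranging from 0 (disjoint) to 1 (identical). A minimal sketch on hypothetical toy masks (not the study's nodule data):

```python
import numpy as np

def dice_coefficient(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Dice similarity coefficient between two binary segmentation masks."""
    a = mask_a.astype(bool)
    b = mask_b.astype(bool)
    total = a.sum() + b.sum()
    if total == 0:
        return 1.0  # both masks empty: define as perfect agreement
    return 2.0 * np.logical_and(a, b).sum() / total

# Toy example: two overlapping "nodule" masks on a small grid
ref = np.zeros((10, 10), dtype=bool)
ref[2:6, 2:6] = True   # 16 voxels
seg = np.zeros((10, 10), dtype=bool)
seg[3:7, 3:7] = True   # 16 voxels, 9 of them overlapping ref
print(dice_coefficient(ref, seg))  # 2*9 / (16+16) = 0.5625
```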
ST-Segment Analysis Using Wireless Technology in Acute Myocardial Infarction (STAT-MI) trial.
Dhruva, Vivek N; Abdelhadi, Samir I; Anis, Ather; Gluckman, William; Hom, David; Dougan, William; Kaluski, Edo; Haider, Bunyad; Klapholz, Marc
2007-08-07
Our goal was to examine the effects of implementing a fully automated wireless network to reduce door-to-intervention times (D2I) in ST-segment elevation myocardial infarction (STEMI). Wireless technologies used to transmit prehospital electrocardiograms (ECGs) have helped to decrease D2I times but have unrealized potential. A fully automated wireless network that facilitates simultaneous 12-lead ECG transmission from emergency medical services (EMS) personnel in the field to the emergency department (ED) and offsite cardiologists via smartphones was developed. The system is composed of preconfigured Bluetooth devices, preprogrammed receiving/transmitting stations, dedicated e-mail servers, and smartphones. The network facilitates direct communication between offsite cardiologists and EMS personnel, allowing for patient triage directly to the cardiac catheterization laboratory from the field. Demographic, laboratory, and time interval data were prospectively collected and compared with calendar year 2005 data. From June to December 2006, 80 ECGs with suspected STEMI were transmitted via the network. Twenty patients with ECGs consistent with STEMI were triaged to the catheterization laboratory. Improvement was seen in mean door-to-cardiologist notification (-14.6 vs. 61.4 min, p < 0.001), door-to-arterial access (47.6 vs. 108.1 min, p < 0.001), time-to-first angiographic injection (52.8 vs. 119.2 min, p < 0.001), and D2I times (80.1 vs. 145.6 min, p < 0.001) compared with 2005 data. A fully automated wireless network that transmits ECGs simultaneously to the ED and offsite cardiologists for the early evaluation and triage of patients with suspected STEMI can decrease D2I times to <90 min and has the potential to be broadly applied in clinical practice.
Köller, Thomas; Kurze, Daniel; Lange, Mirjam; Scherdin, Martin; Podbielski, Andreas; Warnke, Philipp
2016-01-01
A fully automated multiplex real-time PCR assay—including a sample process control and a plasmid based positive control—for the detection and differentiation of herpes simplex virus 1 (HSV1), herpes simplex virus 2 (HSV2) and varicella-zoster virus (VZV) from cerebrospinal fluids (CSF) was developed on the BD Max platform. Performance was compared to an established accredited multiplex real time PCR protocol utilizing the easyMAG and the LightCycler 480/II, both very common devices in viral molecular diagnostics. For clinical validation, 123 CSF specimens and 40 reference samples from national interlaboratory comparisons were examined with both methods, resulting in 97.6% and 100% concordance for CSF and reference samples, respectively. Utilizing the BD Max platform revealed sensitivities of 173 (CI 95%, 88–258) copies/ml for HSV1, 171 (CI 95%, 148–194) copies/ml for HSV2 and 84 (CI 95%, 5–163) copies/ml for VZV. Cross reactivity could be excluded by checking 25 common viral, bacterial and fungal human pathogens. Workflow analyses displayed shorter test duration as well as remarkably fewer and easier preparation steps, with the potential to reduce error rates occurring when manually assessing patient samples. This protocol allows for a fully automated PCR assay on the BD Max platform for the simultaneous detection of herpesviridae from CSF specimens. Singular or multiple infections due to HSV1, HSV2 and VZV can reliably be differentiated with good sensitivities. Control parameters are included within the assay, rendering it suitable for current quality management requirements. PMID:27092772
Systematic review automation technologies
2014-01-01
Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors to the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of the tasks of the systematic review. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed for realizing automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage that the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128
RubisCO Gene Clusters Found in a Metagenome Microarray from Acid Mine Drainage
Guo, Xue; Yin, Huaqun; Cong, Jing; Dai, Zhimin; Liang, Yili
2013-01-01
The enzyme responsible for carbon dioxide fixation in the Calvin cycle, ribulose-1,5-bisphosphate carboxylase/oxygenase (RubisCO), is always detected as a phylogenetic marker to analyze the distribution and activity of autotrophic bacteria. However, such an approach provides no indication as to the significance of genomic content and organization. Horizontal transfers of RubisCO genes occurring in eubacteria and plastids may seriously affect the credibility of this approach. Here, we presented a new method to analyze the diversity and genomic content of RubisCO genes in acid mine drainage (AMD). A metagenome microarray containing 7,776 large-insertion fosmids was constructed to quickly screen genome fragments containing RubisCO form I large-subunit genes (cbbL). Forty-six cbbL-containing fosmids were detected, and six fosmids were fully sequenced. To evaluate the reliability of the metagenome microarray and understand the microbial community in AMD, the diversities of cbbL and the 16S rRNA gene were analyzed. Fosmid sequences revealed that the form I RubisCO gene cluster could be subdivided into form IA and IB RubisCO gene clusters in AMD, because of significant divergences in molecular phylogenetics and conservative genomic organization. Interestingly, the form I RubisCO gene cluster coexisted with the form II RubisCO gene cluster in one fosmid genomic fragment. Phylogenetic analyses revealed that horizontal transfers of RubisCO genes may occur widely in AMD, which makes the evolutionary history of RubisCO difficult to reconcile with organismal phylogeny. PMID:23335778
Dehne, T.; Lindahl, A.; Brittberg, M.; Pruss, A.; Ringe, J.; Sittinger, M.; Karlsson, C.
2012-01-01
Objective: It is well known that expression of markers for WNT signaling is dysregulated in osteoarthritic (OA) bone. However, it is still not fully known if the expression of these markers also is affected in OA cartilage. The aim of this study was therefore to examine this issue. Methods: Human cartilage biopsies from OA and control donors were subjected to genome-wide oligonucleotide microarrays. Genes involved in WNT signaling were selected using the BioRetis database, KEGG pathway analysis was searched using DAVID software tools, and cluster analysis was performed using Genesis software. Results from the microarray analysis were verified using quantitative real-time PCR and immunohistochemistry. In order to study the impact of cytokines for the dysregulated WNT signaling, OA and control chondrocytes were stimulated with interleukin-1 and analyzed with real-time PCR for their expression of WNT-related genes. Results: Several WNT markers displayed a significantly altered expression in OA compared to normal cartilage. Interestingly, inhibitors of the canonical and planar cell polarity WNT signaling pathways displayed significantly increased expression in OA cartilage, while the Ca2+/WNT signaling pathway was activated. Both real-time PCR and immunohistochemistry verified the microarray results. Real-time PCR analysis demonstrated that interleukin-1 upregulated expression of important WNT markers. Conclusions: WNT signaling is significantly affected in OA cartilage. The result suggests that both the canonical and planar cell polarity WNT signaling pathways were partly inhibited while the Ca2+/WNT pathway was activated in OA cartilage. PMID:26069618
Predicting breast cancer using an expression values weighted clinical classifier.
Thomas, Minta; De Brabanter, Kris; Suykens, Johan A K; De Moor, Bart
2014-12-31
Clinical data, such as patient history, laboratory analysis and ultrasound parameters, which are the basis of day-to-day clinical decision support, are often used to guide the clinical management of cancer in the presence of microarray data. Several data fusion techniques are available to integrate genomics or proteomics data, but only a few studies have created a single prediction model using both gene expression and clinical data. These studies often remain inconclusive regarding an obtained improvement in prediction performance. To improve clinical management, these data should be fully exploited. This requires efficient algorithms to integrate these data sets and design a final classifier. LS-SVM classifiers and generalized eigenvalue/singular value decompositions are successfully used in many bioinformatics applications for prediction tasks. Building on the benefits of these two techniques, we propose a machine learning approach, a weighted LS-SVM classifier, to integrate two data sources: microarray and clinical parameters. We compared and evaluated the proposed methods on five breast cancer case studies. Compared to the LS-SVM classifier on individual data sets, generalized eigenvalue decomposition (GEVD) and kernel GEVD, the proposed weighted LS-SVM classifier offers good prediction performance, in terms of test area under the ROC curve (AUC), on all breast cancer case studies. Thus a clinical classifier weighted with the microarray data set results in significantly improved diagnosis, prognosis and prediction of responses to therapy. The proposed model has been shown to be a promising mathematical framework for both data fusion and non-linear classification problems.
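Unlike a standard SVM, an LS-SVM is trained by solving a single linear system rather than a quadratic program, which is part of its appeal for data-fusion settings. The sketch below shows only the standard (unweighted) LS-SVM classifier with an RBF kernel on synthetic data; the kernel choice, hyperparameters and toy data are illustrative assumptions, and the paper's weighting of microarray versus clinical sources is not reproduced here.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    """Gaussian (RBF) kernel matrix between the row vectors of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    """Solve the LS-SVM dual linear system:
       [0  y^T      ] [b    ]   [0]
       [y  O + I/gam] [alpha] = [1]   with O_kl = y_k y_l K(x_k, x_l)."""
    n = len(y)
    Omega = (y[:, None] * y[None, :]) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]          # bias b, dual weights alpha

def lssvm_predict(X_new, X, y, b, alpha, sigma=1.0):
    """Decision function: sign(sum_k alpha_k y_k K(x, x_k) + b)."""
    return np.sign(rbf_kernel(X_new, X, sigma) @ (alpha * y) + b)

# Toy two-class problem with well-separated clusters
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-3, 0.3, (20, 2)), rng.normal(3, 0.3, (20, 2))])
y = np.array([-1.0] * 20 + [1.0] * 20)
b, alpha = lssvm_train(X, y)
print((lssvm_predict(X, X, y, b, alpha) == y).mean())  # training accuracy
```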
Deep machine learning provides state-of-the-art performance in image-based plant phenotyping.
Pound, Michael P; Atkinson, Jonathan A; Townsend, Alexandra J; Wilson, Michael H; Griffiths, Marcus; Jackson, Aaron S; Bulat, Adrian; Tzimiropoulos, Georgios; Wells, Darren M; Murchie, Erik H; Pridmore, Tony P; French, Andrew P
2017-10-01
In plant phenotyping, it has become important to be able to measure many features on large image sets in order to aid genetic discovery. The size of the datasets, now often captured robotically, precludes manual inspection, hence the motivation for finding a fully automated approach. Deep learning is an emerging field that promises unparalleled results on many data analysis problems. Building on artificial neural networks, deep approaches have many more hidden layers in the network, and hence have greater discriminative and predictive power. We demonstrate the use of such approaches as part of a plant phenotyping pipeline. We show the success offered by such techniques when applied to the challenging problem of image-based plant phenotyping and demonstrate state-of-the-art results (>97% accuracy) for root and shoot feature identification and localization. We use fully automated trait identification using deep learning to identify quantitative trait loci in root architecture datasets. The majority (12 out of 14) of manually identified quantitative trait loci were also discovered using our automated approach based on deep learning detection to locate plant features. We have shown deep learning-based phenotyping to have very good detection and localization accuracy in validation and testing image sets. We have shown that such features can be used to derive meaningful biological traits, which in turn can be used in quantitative trait loci discovery pipelines. This process can be completely automated. We predict a paradigm shift in image-based phenotyping brought about by such deep learning approaches, given sufficient training sets. © The Authors 2017. Published by Oxford University Press.
Development of a phantom to test fully automated breast density software - A work in progress.
Waade, G G; Hofvind, S; Thompson, J D; Highnam, R; Hogg, P
2017-02-01
Mammographic density (MD) is an independent risk factor for breast cancer and may have a future role for stratified screening. Automated software can estimate MD but the relationship between breast thickness reduction and MD is not fully understood. Our aim is to develop a deformable breast phantom to assess automated density software and the impact of breast thickness reduction on MD. Several different configurations of polyvinyl alcohol (PVAL) phantoms were created. Three methods were used to estimate their density. Raw image data of mammographic images were processed using Volpara to estimate volumetric breast density (VBD%); Hounsfield units (HU) were measured on CT images; and physical density (g/cm3) was calculated using a formula involving mass and volume. Phantom volume versus contact area and phantom volume versus phantom thickness were compared to values of real breasts. Volpara recognized all deformable phantoms as female breasts. However, reducing the phantom thickness caused a change in phantom density, and the phantoms were not able to tolerate the same level of compression and thickness reduction experienced by female breasts during mammography. Our results are promising as all phantoms resulted in valid data for automated breast density measurement. Further work should be conducted on PVAL and other materials to produce deformable phantoms that mimic female breast structure and density and that can be compressed to the same level as female breasts. We are the first group to have produced deformable phantoms that are recognized as breasts by Volpara software. Copyright © 2016 The College of Radiographers. All rights reserved.
ARTIST: A fully automated artifact rejection algorithm for single-pulse TMS-EEG data.
Wu, Wei; Keller, Corey J; Rogasch, Nigel C; Longwell, Parker; Shpigel, Emmanuel; Rolle, Camarin E; Etkin, Amit
2018-04-01
Concurrent single-pulse TMS-EEG (spTMS-EEG) is an emerging noninvasive tool for probing causal brain dynamics in humans. However, in addition to the common artifacts in standard EEG data, spTMS-EEG data suffer from enormous stimulation-induced artifacts, posing significant challenges to the extraction of neural information. Typically, neural signals are analyzed after a manual time-intensive and often subjective process of artifact rejection. Here we describe a fully automated algorithm for spTMS-EEG artifact rejection. A key step of this algorithm is to decompose the spTMS-EEG data into statistically independent components (ICs), and then train a pattern classifier to automatically identify artifact components based on knowledge of the spatio-temporal profile of both neural and artefactual activities. The autocleaned and hand-cleaned data yield qualitatively similar group evoked potential waveforms. The algorithm achieves a 95% IC classification accuracy referenced to expert artifact rejection performance, and does so across a large number of spTMS-EEG data sets (n = 90 stimulation sites), retains high accuracy across stimulation sites/subjects/populations/montages, and outperforms current automated algorithms. Moreover, the algorithm was superior to the artifact rejection performance of relatively novice individuals, who would be the likely users of spTMS-EEG as the technique becomes more broadly disseminated. In summary, our algorithm provides an automated, fast, objective, and accurate method for cleaning spTMS-EEG data, which can increase the utility of TMS-EEG in both clinical and basic neuroscience settings. © 2018 Wiley Periodicals, Inc.
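The core of such an artifact-rejection pipeline is a decompose / flag / reconstruct step: split the channels-by-time data into components, classify each component as neural or artefactual, and rebuild the signal from the neural ones. The sketch below uses SVD as a simplified stand-in for the ICA decomposition, and a hand-coded amplitude rule in place of the trained pattern classifier; both substitutions, and the toy data, are assumptions for illustration only.

```python
import numpy as np

def remove_artifact_components(eeg, is_artifact):
    """Decompose channels x time data into components (SVD here, as a
    simplified stand-in for ICA), zero the flagged ones, reconstruct."""
    U, s, Vt = np.linalg.svd(eeg, full_matrices=False)
    comps = s[:, None] * Vt          # component time courses
    keep = ~is_artifact(comps)       # "classifier" flags artifact components
    return U[:, keep] @ comps[keep]

# Toy data: 4 channels of low-amplitude noise plus one huge shared spike,
# crudely mimicking a stimulation-induced artifact
rng = np.random.default_rng(1)
eeg = 0.1 * rng.standard_normal((4, 500))
spike = np.zeros(500)
spike[250] = 50.0
eeg += np.outer([1.0, 0.8, 1.2, 0.9], spike)

# Stand-in rule: flag components whose peak dwarfs their overall variability
flag = lambda c: np.abs(c).max(axis=1) > 10 * c.std(axis=1)
cleaned = remove_artifact_components(eeg, flag)
print(np.abs(cleaned).max() < 5.0)  # large spike strongly suppressed
```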
NASA Astrophysics Data System (ADS)
Könning, Tobias; Bayer, Andreas; Plappert, Nora; Faßbender, Wilhelm; Dürsch, Sascha; Küster, Matthias; Hubrich, Ralf; Wolf, Paul; Köhler, Bernd; Biesenbach, Jens
2018-02-01
A novel 3-dimensional arrangement of mirrors is used to re-arrange beams from 1-D and 2-D high power diode laser arrays. The approach allows for a variety of stacking geometries, depending on individual requirements. While basic building blocks, including collimating optics, always remain the same, most adaptations can be realized by simple rearrangement of a few optical components. Due to fully automated alignment processes, the required changes can be realized in software by changing coordinates, rather than requiring customized mechanical components. This approach minimizes development costs due to its flexibility, while reducing overall product cost by using similar building blocks for a variety of products and utilizing a high grade of automation. The modules can be operated with industrial grade water, lowering overall system and maintenance cost. Stackable macro coolers are used as the smallest building block of the system. Each cooler can hold up to five diode laser bars. Micro optical components, collimating the beam, are mounted directly to the cooler. All optical assembly steps are fully automated. Initially, the beams from all laser bars propagate in the same direction. Key to the concept is an arrangement of deflectors, which re-arrange the beams into a 2-D array of the desired shape and high fill factor. Standard multiplexing techniques like polarization- or wavelengths-multiplexing have been implemented as well. A variety of fiber coupled modules ranging from a few hundred watts of optical output power to multiple kilowatts of power, as well as customized laser spot geometries like uniform line sources, have been realized.
Horsch, Salome; Kopczynski, Dominik; Kuthe, Elias; Baumbach, Jörg Ingo; Rahmann, Sven
2017-01-01
Motivation: Disease classification from molecular measurements typically requires an analysis pipeline from raw noisy measurements to final classification results. Multi-capillary column ion mobility spectrometry (MCC-IMS) is a promising technology for the detection of volatile organic compounds in the air of exhaled breath. From raw measurements, the peak regions representing the compounds have to be identified, quantified, and clustered across different experiments. Currently, several steps of this analysis process require manual intervention by human experts. Our goal is to identify a fully automatic pipeline that yields competitive disease classification results compared to an established but subjective and tedious semi-manual process. Method: We combine a large number of modern methods for peak detection, peak clustering, and multivariate classification into analysis pipelines for raw MCC-IMS data. We evaluate all combinations on three different real datasets in an unbiased cross-validation setting, and determine which specific algorithmic combinations lead to high AUC values in disease classification across the different medical application scenarios. Results: The best fully automated analysis process achieves even better classification results than the established manual process. The best algorithms for the three analysis steps are (i) SGLTR (Savitzky-Golay Laplace-operator filter thresholding regions) and LM (Local Maxima) for automated peak identification, (ii) EM (Expectation Maximization) clustering and DBSCAN (Density-Based Spatial Clustering of Applications with Noise) for the clustering step, and (iii) RF (Random Forest) for multivariate classification. Thus, automated methods can replace the manual steps in the analysis process to enable unbiased, high-throughput use of the technology. PMID:28910313
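The three pipeline stages (peak detection, cross-experiment peak clustering, multivariate classification) can be sketched on synthetic one-dimensional spectra. The peak positions, class structure, and thresholds below are invented for illustration; local-maxima detection plus DBSCAN and a random forest stand in for the full benchmarked method set.

```python
import numpy as np
from scipy.signal import find_peaks
from sklearn.cluster import DBSCAN
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
grid = np.linspace(0, 10, 500)

def spectrum(centers, heights):
    s = 0.05 * rng.standard_normal(grid.size)       # baseline noise
    for c, h in zip(centers, heights):
        s += h * np.exp(-((grid - c) ** 2) / 0.02)  # Gaussian peaks
    return s

# Two classes: "disease" spectra carry an extra compound peak at x = 7.
spectra, labels = [], []
for i in range(40):
    sick = i % 2
    centers = [2.0, 5.0] + ([7.0] if sick else [])
    heights = [1.0, 0.8] + ([0.9] if sick else [])
    spectra.append(spectrum(centers, heights))
    labels.append(sick)

# (i) Peak detection: local maxima above a noise threshold.
peaks = [(i, grid[p]) for i, s in enumerate(spectra)
         for p in find_peaks(s, height=0.4)[0]]

# (ii) Cluster peak positions across experiments.
pos = np.array([[x] for _, x in peaks])
cl = DBSCAN(eps=0.2, min_samples=5).fit_predict(pos)

# (iii) Feature matrix (which clusters appear in each spectrum), then
#       multivariate classification with a random forest.
n_clusters = cl.max() + 1
X = np.zeros((len(spectra), n_clusters))
for (i, _), c in zip(peaks, cl):
    if c >= 0:
        X[i, c] = 1
y = np.array(labels)
auc = cross_val_score(RandomForestClassifier(random_state=0), X, y,
                      cv=5, scoring="roc_auc").mean()
```

Because the discriminating peak is cleanly detected and clustered, the cross-validated AUC on this toy data is essentially perfect.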
Adapting the γ-H2AX assay for automated processing in human lymphocytes. 1. Technological aspects.
Turner, Helen C; Brenner, David J; Chen, Youhua; Bertucci, Antonella; Zhang, Jian; Wang, Hongliang; Lyulko, Oleksandra V; Xu, Yanping; Shuryak, Igor; Schaefer, Julia; Simaan, Nabil; Randers-Pehrson, Gerhard; Yao, Y Lawrence; Amundson, Sally A; Garty, Guy
2011-03-01
The immunofluorescence-based detection of γ-H2AX is a reliable and sensitive method for quantitatively measuring DNA double-strand breaks (DSBs) in irradiated samples. Since H2AX phosphorylation is highly linear with radiation dose, this well-established biomarker is in current use in radiation biodosimetry. At the Center for High-Throughput Minimally Invasive Radiation Biodosimetry, we have developed a fully automated high-throughput system, the RABIT (Rapid Automated Biodosimetry Tool), that can be used to measure γ-H2AX yields from fingerstick-derived samples of blood. The RABIT workstation has been designed to fully automate the γ-H2AX immunocytochemical protocol, from the isolation of human blood lymphocytes in heparin-coated PVC capillaries to the immunolabeling of γ-H2AX protein and image acquisition to determine fluorescence yield. High throughput is achieved through the use of purpose-built robotics, lymphocyte handling in 96-well filter-bottomed plates, and high-speed imaging. The goal of the present study was to optimize and validate the performance of the RABIT system for the reproducible and quantitative detection of γ-H2AX total fluorescence in lymphocytes in a multiwell format. Validation of our biodosimetry platform was achieved by the linear detection of a dose-dependent increase in γ-H2AX fluorescence in peripheral blood samples irradiated ex vivo with γ rays over the range 0 to 8 Gy. This study demonstrates for the first time the optimization and use of our robotically based biodosimetry workstation to successfully quantify γ-H2AX total fluorescence in irradiated peripheral lymphocytes.
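Because the γ-H2AX yield is approximately linear in dose, the biodosimetry step reduces to fitting a calibration line over the 0-8 Gy range and inverting it for unknown samples. A minimal sketch with made-up fluorescence units:

```python
import numpy as np

rng = np.random.default_rng(2)

# Calibration: γ-H2AX total fluorescence is ~linear in dose (0-8 Gy).
doses = np.array([0, 0.5, 1, 2, 4, 8], dtype=float)
# Hypothetical fluorescence yields (arbitrary units) with measurement noise.
fluor = 120 + 85 * doses + rng.normal(0, 10, doses.size)

# Least-squares calibration line: fluorescence = a * dose + b
a, b = np.polyfit(doses, fluor, 1)

def estimate_dose(f):
    """Invert the calibration line to estimate absorbed dose (Gy)."""
    return (f - b) / a

unknown = 120 + 85 * 3.0          # a sample irradiated at ~3 Gy
est = estimate_dose(unknown)
```

With modest measurement noise the inverted estimate lands close to the true 3 Gy exposure.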
Automated tilt series alignment and tomographic reconstruction in IMOD.
Mastronarde, David N; Held, Susannah R
2017-02-01
Automated tomographic reconstruction is now possible in the IMOD software package, including the merging of tomograms taken around two orthogonal axes. Several developments enable the production of high-quality tomograms. When using fiducial markers for alignment, the markers to be tracked through the series are chosen automatically; if there is an excess of markers available, a well-distributed subset is selected that is most likely to track well. Marker positions are refined by applying an edge-enhancing Sobel filter, which results in a 20% improvement in alignment error for plastic-embedded samples and 10% for frozen-hydrated samples. Robust fitting, in which outlying points are given less or no weight in computing the fitting error, is used to obtain an alignment solution, so that aberrant points from the automated tracking can have little effect on the alignment. When merging two dual-axis tomograms, the alignment between them is refined from correlations between local patches; a measure of structure was developed so that patches with insufficient structure to give accurate correlations can now be excluded automatically. We have also developed a script for running all steps in the reconstruction process with a flexible mechanism for setting parameters, and we have added a user interface for batch processing of tilt series to the Etomo program in IMOD. Batch processing is fully compatible with interactive processing and can increase efficiency even when the automation is not fully successful, because users can focus their effort on the steps that require manual intervention. Copyright © 2016 Elsevier Inc. All rights reserved.
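The robust-fitting idea, in which outlying tracked points receive little or no weight, can be sketched as iteratively reweighted least squares with a Tukey biweight on a toy line fit (the actual IMOD alignment solves a far more elaborate projection model):

```python
import numpy as np

rng = np.random.default_rng(3)

# Fiducial tracking yields mostly good positions plus a few aberrant
# points; fit a line robustly so the outliers get little or no weight.
x = np.linspace(0, 10, 50)
y = 2.0 * x + 1.0 + rng.normal(0, 0.1, x.size)
y[::10] += 15.0                      # aberrant tracked points

def tukey_weights(r, c=4.685):
    s = np.median(np.abs(r)) / 0.6745 + 1e-12   # robust scale (MAD)
    u = r / (c * s)
    w = (1 - u ** 2) ** 2
    w[np.abs(u) >= 1] = 0.0          # gross outliers get zero weight
    return w

# Iteratively reweighted least squares with the Tukey biweight.
w = np.ones_like(x)
for _ in range(20):
    A = np.stack([x, np.ones_like(x)], axis=1)
    sw = np.sqrt(w)
    slope, icpt = np.linalg.lstsq(A * sw[:, None], y * sw, rcond=None)[0]
    w = tukey_weights(y - (slope * x + icpt))
```

After a few iterations the aberrant points are zero-weighted and the fit recovers the underlying line despite 10% gross outliers.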
Automation of closed environments in space for human comfort and safety
NASA Technical Reports Server (NTRS)
1991-01-01
The development of Environmental Control and Life Support Systems (ECLSS) for Space Station Freedom, future colonization of the Moon, and Mars missions presents new challenges for present technologies. An ECLSS that operates during long-duration missions must be semi-autonomous to allow crew members environmental control without constant supervision. A control system for the ECLSS must address these issues while remaining reliable. The Kansas State University Advanced Design Team is in the process of researching and designing controls for the automation of the ECLSS for Space Station Freedom and beyond. The ECLSS for Freedom is composed of six subsystems. The temperature and humidity control (THC) subsystem maintains the cabin temperature and humidity at a comfortable level. The atmosphere control and supply (ACS) subsystem ensures proper cabin pressure and partial pressures of oxygen and nitrogen. To protect the space station from fire damage, the fire detection and suppression (FDS) subsystem provides fire-sensing alarms and extinguishers. The waste management (WM) subsystem compacts solid wastes for return to Earth, and collects urine for water recovery. The atmosphere revitalization (AR) subsystem removes CO2 and other dangerous contaminants from the air. The water recovery and management (WRM) subsystem collects and filters condensate from the cabin to replenish potable water supplies, and processes urine and other waste waters to replenish hygiene water supplies. These subsystems are not fully automated at this time. Furthermore, the control of these subsystems is not presently integrated; they are largely independent of one another. A fully integrated and automated ECLSS would increase astronauts' productivity and contribute to their safety and comfort.
Ball, Oliver; Robinson, Sarah; Bure, Kim; Brindley, David A; Mccall, David
2018-04-01
Phacilitate held a Special Interest Group workshop event in Edinburgh, UK, in May 2017. The event brought together leading stakeholders in the cell therapy bioprocessing field to identify present and future challenges and propose potential solutions to automation in cell therapy bioprocessing. Here, we review and summarize discussions from the event. Deep biological understanding of a product, its mechanism of action and indication pathogenesis underpin many factors relating to bioprocessing and automation. To fully exploit the opportunities of bioprocess automation, therapeutics developers must closely consider whether an automation strategy is applicable, how to design an 'automatable' bioprocess and how to implement process modifications with minimal disruption. Major decisions around bioprocess automation strategy should involve all relevant stakeholders; communication between technical and business strategy decision-makers is of particular importance. Developers should leverage automation to implement in-process testing, in turn applicable to process optimization, quality assurance (QA)/quality control (QC), batch failure control, adaptive manufacturing and regulatory demands, but a lack of precedent and technical opportunities can complicate such efforts. Sparse standardization across product characterization, hardware components and software platforms is perceived to complicate efforts to implement automation. The use of advanced algorithmic approaches such as machine learning may have application to bioprocess and supply chain optimization. Automation can substantially de-risk the wider supply chain, including tracking and traceability, cryopreservation and thawing and logistics. The regulatory implications of automation are currently unclear because few hardware options exist and novel solutions require case-by-case validation, but automation can present attractive regulatory incentives. Copyright © 2018 International Society for Cellular Therapy. Published by Elsevier Inc. All rights reserved.
Chen, Josephine; Zhao, Po; Massaro, Donald; Clerch, Linda B; Almon, Richard R; DuBois, Debra C; Jusko, William J; Hoffman, Eric P
2004-01-01
Publicly accessible DNA databases (genome browsers) are rapidly accelerating post-genomic research (see http://www.genome.ucsc.edu/), with integrated genomic DNA, gene structure, EST/splicing and cross-species ortholog data. DNA databases have relatively low dimensionality; the genome is a linear code that anchors all associated data. In contrast, RNA expression and protein databases need to be able to handle very high dimensional data, with time, tissue, cell type and genes as interrelated variables. The high dimensionality of microarray expression profile data, and the lack of a standard experimental platform, have complicated the development of web-accessible databases and analytical tools. We have designed and implemented a public resource of expression profile data containing 1024 human, mouse and rat Affymetrix GeneChip expression profiles, generated in the same laboratory, and subject to the same quality and procedural controls (Public Expression Profiling Resource; PEPR). Our Oracle-based PEPR data warehouse includes a novel time series query analysis tool (SGQT), enabling dynamic generation of graphs and spreadsheets showing the action of any transcript of interest over time. In this report, we demonstrate the utility of this tool using a 27 time point, in vivo muscle regeneration series. This data warehouse and associated analysis tools provide access to multidimensional microarray data through web-based interfaces, both for download of all types of raw data for independent analysis, and also for straightforward gene-based queries. Planned implementations of PEPR will include web-based remote entry of projects adhering to quality control and standard operating procedure (QC/SOP) criteria, and automated output of alternative probe set algorithms for each project (see http://microarray.cnmcresearch.org/pgadatatable.asp).
Williams, James A; Eddleman, Laura; Pantone, Amy; Martinez, Regina; Young, Stephen; Van Der Pol, Barbara
2014-08-01
Next-generation diagnostics for Chlamydia trachomatis and Neisseria gonorrhoeae are available on semi- or fully-automated platforms. These systems require less hands-on time than older platforms and are user friendly. Four automated systems, the ABBOTT m2000 system, Becton Dickinson Viper System with XTR Technology, Gen-Probe Tigris DTS system, and Roche cobas 4800 system, were evaluated for total run time, hands-on time, and walk-away time. All of the systems evaluated in this time-motion study were able to complete a diagnostic test run within an 8-h work shift, instrument setup and operation were straightforward and uncomplicated, and walk-away time ranged from approximately 90 to 270 min in a head-to-head comparison of each system. All of the automated systems provide technical staff with increased time to perform other tasks during the run, offer easy expansion of the diagnostic test menu, and have the ability to increase specimen throughput. © 2013 Society for Laboratory Automation and Screening.
Revisiting the Fully Automated Double-Ring Infiltrometer Using Open-Source Electronics
The double-ring infiltrometer (DRI) is commonly used for measuring soil hydraulic conductivity. However, constant-head DRI tests typically involve the use of Mariotte tubes, which can be problematic to set up and time-consuming to maintain and monitor during infiltration tests...
Low energy production processes in manufacturing of silicon solar cells
NASA Technical Reports Server (NTRS)
Kirkpatrick, A. R.
1976-01-01
Ion implantation and pulsed energy techniques are being combined for fabrication of silicon solar cells totally under vacuum and at room temperature. Simplified sequences allow very short processing times with small process energy consumption. Economic projections for fully automated production are excellent.
Information Retrieval Systems Retrieved? An Alternative to Present Dial Access Systems
ERIC Educational Resources Information Center
Hofmann, Norbert
1976-01-01
The expense of a dial access information retrieval system (DIARS) is weighed against its benefits. Problems of usage and efficacy for the student are outlined. A fully automated system is proposed instead, and its cost-saving features are pointed out. (MS)
Neff, Michael; Rauhut, Guntram
2014-02-05
Multidimensional potential energy surfaces obtained from explicitly correlated coupled-cluster calculations, with further corrections for high-order correlation contributions, scalar relativistic effects and core-correlation energy contributions, were generated in a fully automated fashion for the double-minimum benchmark systems OH3(+) and NH3. The black-box generation of the potentials is based on normal coordinates, which were used in the underlying multimode expansions of the potentials and the μ-tensor within the Watson operator. Normal coordinates are not the optimal choice for describing double-minimum potentials, and the question remains whether they can be used for accurate calculations at all. However, their unique definition is an appealing feature, which removes remaining errors in truncated potential expansions arising from different choices of curvilinear coordinate systems. Fully automated calculations are presented which demonstrate that the proposed scheme allows for the determination of energy levels and tunneling splittings as a routine application. Copyright © 2013 Elsevier B.V. All rights reserved.
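As a much-reduced illustration of tunneling splittings in a double-minimum potential, the two lowest eigenvalues of a one-dimensional symmetric double well can be computed by finite differences (reduced units with ħ = m = 1; nothing here reflects the multimode PES machinery of the paper):

```python
import numpy as np

# 1-D double-minimum potential V(x) = a (x^2 - 1)^2; the gap between the
# two lowest eigenvalues of H = -1/2 d^2/dx^2 + V is the tunneling splitting.
n, L = 800, 8.0
x = np.linspace(-L / 2, L / 2, n)
dx = x[1] - x[0]
V = 10.0 * (x ** 2 - 1.0) ** 2          # barrier height 10 at x = 0

# Tridiagonal finite-difference Hamiltonian.
main = 1.0 / dx ** 2 + V
off = -0.5 / dx ** 2 * np.ones(n - 1)
H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

evals = np.linalg.eigvalsh(H)           # ascending eigenvalues
splitting = evals[1] - evals[0]         # ground-state tunneling doublet
```

For a barrier well above the zero-point energy, the ground doublet is nearly degenerate, so the splitting is small compared with the gap to the next level.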
NASA Astrophysics Data System (ADS)
Samson, Arnaud; Thibaudeau, Christian; Bouchard, Jonathan; Gaudin, Émilie; Paulin, Caroline; Lecomte, Roger; Fontaine, Réjean
2018-05-01
A fully automated time alignment method based on a positron timing probe was developed to correct the channel-to-channel coincidence time dispersion of the LabPET II avalanche photodiode-based positron emission tomography (PET) scanners. The timing probe was designed to directly detect positrons and generate an absolute time reference. The probe-to-channel coincidences are recorded and processed using firmware embedded in the scanner hardware to compute the time differences between detector channels. The time corrections are then applied in real time to each event in every channel during PET data acquisition to align all coincidence time spectra, thus enhancing the scanner time resolution. When applied to the mouse version of the LabPET II scanner, the calibration of 6144 channels was performed in less than 15 min and showed a 47% improvement in the overall time resolution of the scanner, decreasing from 7 ns to 3.7 ns full width at half maximum (FWHM).
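The calibration logic, recording probe-to-channel coincidences, estimating each channel's fixed delay against the absolute probe reference, and subtracting it event by event, can be sketched with synthetic timing data (channel counts, jitter, and offsets below are invented):

```python
import numpy as np

rng = np.random.default_rng(4)
n_channels = 64

# Each detector channel has an unknown fixed delay (cable/electronics), ns.
true_offsets = rng.normal(0, 1.0, n_channels)

# Probe-to-channel coincidences: measured time difference = channel delay
# + timing jitter (the probe supplies the absolute time reference).
n_events = 20000
ch = rng.integers(0, n_channels, n_events)
dt = true_offsets[ch] + rng.normal(0, 1.5, n_events)

# Calibration: the per-channel mean time difference becomes the correction.
corrections = np.array([dt[ch == c].mean() for c in range(n_channels)])

# Applying corrections event by event aligns all coincidence time spectra.
aligned = dt - corrections[ch]
```

Removing the per-channel offsets narrows the overall coincidence time spectrum, which is the resolution gain the abstract reports.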
Harder, Nathalie; Mora-Bermúdez, Felipe; Godinez, William J; Wünsche, Annelie; Eils, Roland; Ellenberg, Jan; Rohr, Karl
2009-11-01
Live-cell imaging allows detailed dynamic cellular phenotyping for cell biology and, in combination with small molecule or drug libraries, for high-content screening. Fully automated analysis of live cell movies has been hampered by the lack of computational approaches that allow tracking and recognition of individual cell fates over time in a precise manner. Here, we present a fully automated approach to analyze time-lapse movies of dividing cells. Our method dynamically categorizes cells into seven phases of the cell cycle and five aberrant morphological phenotypes over time. It reliably tracks cells and their progeny and can thus measure the length of mitotic phases and detect cause and effect if mitosis goes awry. We applied our computational scheme to annotate mitotic phenotypes induced by RNAi gene knockdown of CKAP5 (also known as ch-TOG) or by treatment with the drug nocodazole. Our approach can be readily applied to comparable assays aiming at uncovering the dynamic cause of cell division phenotypes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, H.S.; Ables, E.; Barthelmy, S.D.
LOTIS is a rapidly slewing wide-field-of-view telescope which was designed and constructed to search for simultaneous gamma-ray burst (GRB) optical counterparts. This experiment requires a rapidly slewing (< 10 s), wide-field-of-view (> 15°), automatic and dedicated telescope. LOTIS utilizes commercial telephoto lenses and custom 2048 x 2048 CCD cameras to view a 17.6° x 17.6° field of view. It can point to any part of the sky within 5 s and is fully automated. It is connected via an Internet socket to the GRB coordinate distribution network, which analyzes telemetry from the satellite and delivers GRB coordinate information in real time. LOTIS started routine operation in October 1996. In the idle time between GRB triggers, LOTIS systematically surveys the entire available sky every night for new optical transients. This paper will describe the system design and performance.
NASA Astrophysics Data System (ADS)
Rückwardt, M.; Göpfert, A.; Correns, M.; Schellhorn, M.; Linß, G.
2010-07-01
Coordinate measuring machines are high-precision all-rounders for three-dimensional measurement, offering a very comprehensive range of parameters and extensive hardware expandability. Consequently, they demand considerable expert knowledge from the user and usually substantial prior information about the measuring object. In this paper, a coordinate measuring machine and a specialized measuring machine are compared using the example of measuring eyeglass frames. For this three-dimensional measuring challenge, the comparison focuses on both metrological and economic aspects. First, a fully automated method for tactile measurement of this abstract form is presented. Second, the metrological characteristics of a coordinate measuring machine and a dedicated eyeglass-frame tracer are compared; the result favors the coordinate measuring machine, which is not surprising in these respects. Finally, the machines are compared from an economic perspective.
NASA Astrophysics Data System (ADS)
Sanchez, M.; Probst, L.; Blazevic, E.; Nakao, B.; Northrup, M. A.
2011-11-01
We describe a fully automated and autonomous airborne biothreat detection system for biosurveillance applications. The system, including the nucleic-acid-based detection assay, was designed, built and shipped by Microfluidic Systems Inc. (MFSI), a subsidiary of PositiveID Corporation (PSID). Our findings demonstrate that the system and assay unequivocally identify pathogenic strains of Bacillus anthracis, Yersinia pestis, Francisella tularensis, Burkholderia mallei, and Burkholderia pseudomallei. In order to assess the assay's ability to detect unknown samples, our team also challenged it against a series of blind samples provided by the Department of Homeland Security (DHS). These samples included naturally occurring isolated strains, near-neighbor isolates, and environmental samples. Our results indicate that the multiplex assay was specific and produced no false positives when challenged with in-house gDNA collections and DHS-provided panels. Here we present another analytical tool for the rapid identification of nine Centers for Disease Control and Prevention category A and B biothreat organisms.
NASA Astrophysics Data System (ADS)
Srivastava, Vishal; Dalal, Devjyoti; Kumar, Anuj; Prakash, Surya; Dalal, Krishna
2018-06-01
Moisture content is an important feature of fruits and vegetables. Since about 80% of an apple's content is water, decreasing moisture content degrades the quality of apples (Golden Delicious). Computational and texture features of the apples were extracted from optical coherence tomography (OCT) images, and a support vector machine with a Gaussian kernel was used to perform automated classification. To evaluate the quality of wax-coated apples during storage in vivo, our proposed method opens up the possibility of fully automated quantitative analysis based on the morphological features of apples. Our results demonstrate that analysis of the computational and texture features of OCT images may be a good non-destructive method for assessing apple quality.
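A Gaussian-kernel (RBF) SVM on texture-style features can be sketched with synthetic data standing in for the OCT-derived features; the feature names, means, and spreads below are assumptions, not the study's measured values.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(5)

# Hypothetical texture features from OCT images (e.g. mean intensity,
# contrast, homogeneity): fresh apples vs. moisture-depleted ones.
n = 200
fresh = rng.normal([0.8, 0.3, 0.7], 0.08, (n, 3))
dried = rng.normal([0.5, 0.6, 0.4], 0.08, (n, 3))
X = np.vstack([fresh, dried])
y = np.array([0] * n + [1] * n)      # 1 = moisture-depleted

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=0)

# SVM with a Gaussian (RBF) kernel, as named in the abstract.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", gamma="scale"))
model.fit(Xtr, ytr)
accuracy = model.score(Xte, yte)
```

With well-separated feature distributions the held-out accuracy is close to perfect; real OCT features would of course overlap more.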
An Automated, High-Throughput System for GISAXS and GIWAXS Measurements of Thin Films
NASA Astrophysics Data System (ADS)
Schaible, Eric; Jimenez, Jessica; Church, Matthew; Lim, Eunhee; Stewart, Polite; Hexemer, Alexander
Grazing incidence small-angle X-ray scattering (GISAXS) and grazing incidence wide-angle X-ray scattering (GIWAXS) are important techniques for characterizing thin films. In order to meet rapidly increasing demand, the SAXS/WAXS beamline at the Advanced Light Source (beamline 7.3.3) has implemented a fully automated, high-throughput system to conduct SAXS, GISAXS and GIWAXS measurements. An automated robot arm transfers samples from a holding tray to a measurement stage. Intelligent software aligns each sample in turn, and measures each according to user-defined specifications. Users mail in trays of samples on individually barcoded pucks, and can download and view their data remotely. Data will be pipelined to the NERSC supercomputing facility, and will be available to users via a web portal that facilitates highly parallelized analysis.
Autonomous Operations System: Development and Application
NASA Technical Reports Server (NTRS)
Toro Medina, Jaime A.; Wilkins, Kim N.; Walker, Mark; Stahl, Gerald M.
2016-01-01
Autonomous control systems provide self-governance beyond what conventional control systems offer. As the complexity of mechanical and electrical systems increases, a natural drive develops for robust control systems to manage complicated operations. By bridging the gap between conventional automated systems and knowledge-based self-aware systems, nominal control of operations can evolve to rely on safety-critical mitigation processes to handle any off-nominal behavior. Current research and development efforts led by the Autonomous Propellant Loading (APL) group at NASA Kennedy Space Center aim to improve cryogenic propellant transfer operations by developing an automated control and health monitoring system. By integrating these systems, the center aims to produce an Autonomous Operations System (AOS) capable of combining health management operations with automated control to yield a fully autonomous system.
Serotyping of Streptococcus pneumoniae Based on Capsular Genes Polymorphisms
Raymond, Frédéric; Boucher, Nancy; Allary, Robin; Robitaille, Lynda; Lefebvre, Brigitte; Tremblay, Cécile
2013-01-01
Streptococcus pneumoniae serotype epidemiology is essential since serotype replacement is a concern when introducing new polysaccharide-conjugate vaccines. A novel PCR-based automated microarray assay was developed to assist in the tracking of the serotypes. Autolysin, pneumolysin and eight genes located in the capsular operon were amplified using multiplex PCR. This step was followed by a tagged fluorescent primer extension step targeting serotype-specific polymorphisms. The tagged primers were then hybridized to a microarray. Results were exported to an expert system to identify capsular serotypes. The assay was validated on 166 cultured S. pneumoniae samples from 63 different serotypes as determined by the Quellung method. We show that typing only 12 polymorphisms located in the capsular operon allows the identification at the serotype level of 22 serotypes and the assignation of 24 other serotypes to a subgroup of serotypes. Overall, 126 samples (75.9%) were correctly serotyped, 14 were assigned to a member of the same serogroup, 8 rare serotypes were erroneously serotyped, and 18 gave negative serotyping results. Most of the discrepancies involved rare serotypes or serotypes that are difficult to discriminate using a DNA-based approach, for example 6A and 6B. The assay was also tested on clinical specimens including 43 cerebrospinal fluid samples from patients with meningitis and 59 nasopharyngeal aspirates from bacterial pneumonia patients. Overall, 89% of specimens positive for pneumolysin were serotyped, demonstrating that this method does not require culture to serotype clinical specimens. The assay showed no cross-reactivity for 24 relevant bacterial species found in these types of samples. The limit of detection for serotyping and S. pneumoniae detection was 100 genome equivalents per reaction. This automated assay is amenable to clinical testing and does not require any culturing of the samples. The assay will be useful for the evaluation of serotype prevalence changes after new conjugate vaccine introduction. PMID:24086706
Harati, Vida; Khayati, Rasoul; Farzan, Abdolreza
2011-07-01
Uncontrollable and unlimited cell growth leads to tumorigenesis in the brain. If brain tumors are not diagnosed early and treated properly, they can cause permanent brain damage or even death. As in all treatment methods, information about tumor position and size is important for successful treatment; hence, finding an accurate and fully automated method to provide this information to physicians is necessary. A fully automatic and accurate method for tumor region detection and segmentation in brain magnetic resonance (MR) images is suggested. The presented approach is an improved scale-based fuzzy connectedness (FC) algorithm in which the seed point is selected automatically. This algorithm is independent of the tumor type in terms of its pixel intensity. Tumor segmentation evaluation results based on similarity criteria (similarity index (SI), overlap fraction (OF), and extra fraction (EF) of 92.89%, 91.75%, and 3.95%, respectively) indicate a higher performance of the proposed approach compared to conventional methods, especially in MR images with low-contrast tumor regions. Thus, the suggested method is useful for increasing the ability of automatic estimation of tumor size and position in brain tissues, which provides more accurate planning of the required surgery, chemotherapy, and radiotherapy procedures. Copyright © 2011 Elsevier Ltd. All rights reserved.
Fully printable, strain-engineered electronic wrap for customizable soft electronics.
Byun, Junghwan; Lee, Byeongmoon; Oh, Eunho; Kim, Hyunjong; Kim, Sangwoo; Lee, Seunghwan; Hong, Yongtaek
2017-03-24
Rapid growth of stretchable electronics stimulates broad uses in multidisciplinary fields as well as industrial applications. However, existing technologies are unsuitable for implementing versatile applications involving adaptable system design and functions in a cost/time-effective way because of vacuum-conditioned, lithographically-predefined processes. Here, we present a methodology for a fully printable, strain-engineered electronic wrap as a universal strategy which makes it more feasible to implement various stretchable electronic systems with customizable layouts and functions. The key aspects involve inkjet-printed rigid island (PRI)-based stretchable platform technology and corresponding printing-based automated electronic functionalization methodology, the combination of which provides fully printed, customized layouts of stretchable electronic systems with simplified process. Specifically, well-controlled contact line pinning effect of printed polymer solution enables the formation of PRIs with tunable thickness; and surface strain analysis on those PRIs leads to the optimized stability and device-to-island fill factor of strain-engineered electronic wraps. Moreover, core techniques of image-based automated pinpointing, surface-mountable device based electronic functionalizing, and one-step interconnection networking of PRIs enable customized circuit design and adaptable functionalities. To exhibit the universality of our approach, multiple types of practical applications ranging from self-computable digital logics to display and sensor system are demonstrated on skin in a customized form.
Oost, Elco; Koning, Gerhard; Sonka, Milan; Oemrawsingh, Pranobe V; Reiber, Johan H C; Lelieveldt, Boudewijn P F
2006-09-01
This paper describes a new approach to the automated segmentation of X-ray left ventricular (LV) angiograms, based on active appearance models (AAMs) and dynamic programming. A coupling of shape and texture information between the end-diastolic (ED) and end-systolic (ES) frames was achieved by constructing a multiview AAM. Over-constraining of the model was compensated for by employing dynamic programming, integrating both intensity and motion features in the cost function. Two applications are compared: a semi-automatic method with manual model initialization, and a fully automatic algorithm. The first proved to be highly robust and accurate, demonstrating high clinical relevance. Based on experiments involving 70 patient data sets, the algorithm's success rate was 100% for ED and 99% for ES, with average unsigned border positioning errors of 0.68 mm for ED and 1.45 mm for ES. Calculated volumes were accurate and unbiased. The fully automatic algorithm, with intrinsically less user interaction, was less robust but showed high potential, mostly due to a controlled gradient descent in updating the model parameters. The success rate of the fully automatic method was 91% for ED and 83% for ES, with average unsigned border positioning errors of 0.79 mm for ED and 1.55 mm for ES.
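The dynamic-programming step can be illustrated with a generic minimal-cost border search: each border point chooses a position along its search normal, successive points may shift only slightly (a smoothness constraint), and the cheapest path through the cost matrix is found by accumulation and backtracking. This is a sketch of the standard technique only; the cost matrix and step limit here are illustrative placeholders, not the paper's actual intensity/motion cost function.

```python
import numpy as np

def dp_border(cost, max_step=1):
    """Find the minimal-cost path through a cost matrix.

    cost[i, j]: cost of placing border point j at candidate position i
    along its search normal. Successive border points may shift at most
    `max_step` positions, which enforces border smoothness.
    """
    n_pos, n_pts = cost.shape
    acc = cost.copy()                      # accumulated cost per cell
    back = np.zeros((n_pos, n_pts), dtype=int)
    for j in range(1, n_pts):
        for i in range(n_pos):
            lo, hi = max(0, i - max_step), min(n_pos, i + max_step + 1)
            k = lo + int(np.argmin(acc[lo:hi, j - 1]))
            acc[i, j] = cost[i, j] + acc[k, j - 1]
            back[i, j] = k
    # backtrack from the cheapest final position
    path = [int(np.argmin(acc[:, -1]))]
    for j in range(n_pts - 1, 0, -1):
        path.append(int(back[path[-1], j]))
    return path[::-1]
```

With a cost matrix whose middle row is cheapest everywhere, the recovered path stays on that row, as expected.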
Burns, Joseph E.; Yao, Jianhua; Muñoz, Hector
2016-01-01
Purpose: To design and validate a fully automated computer system for the detection and anatomic localization of traumatic thoracic and lumbar vertebral body fractures at computed tomography (CT). Materials and Methods: This retrospective study was HIPAA compliant. Institutional review board approval was obtained, and informed consent was waived. CT examinations in 104 patients (mean age, 34.4 years; range, 14–88 years; 32 women, 72 men), consisting of 94 examinations with positive findings for fractures (59 with vertebral body fractures) and 10 control examinations (without vertebral fractures), were performed. There were 141 thoracic and lumbar vertebral body fractures in the case set. The locations of fractures were marked and classified by a radiologist according to Denis column involvement. The CT data set was divided into training and testing subsets (37 and 67 patients, respectively) for analysis by means of prototype software for fully automated spinal segmentation and fracture detection. Free-response receiver operating characteristic analysis was performed. Results: Training set sensitivity for detection and localization of fractures within each vertebra was 0.82 (28 of 34 findings; 95% confidence interval [CI]: 0.68, 0.90), with a false-positive rate of 2.5 findings per patient. The sensitivity for fracture localization to the correct vertebra was 0.88 (23 of 26 findings; 95% CI: 0.72, 0.96), with a false-positive rate of 1.3. Testing set sensitivity for the detection and localization of fractures within each vertebra was 0.81 (87 of 107 findings; 95% CI: 0.75, 0.87), with a false-positive rate of 2.7. The sensitivity for fracture localization to the correct vertebra was 0.92 (55 of 60 findings; 95% CI: 0.79, 0.94), with a false-positive rate of 1.6. The most common cause of false-positive findings was nutrient foramina (106 of 272 findings [39%]).
Conclusion: The fully automated computer system detects and anatomically localizes vertebral body fractures in the thoracic and lumbar spine on CT images with high sensitivity and a low false-positive rate. © RSNA, 2015. Online supplemental material is available for this article. PMID:26172532
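The Results report each sensitivity with a 95% confidence interval; the abstract does not state which interval method was used, but a Wilson score interval, a common choice for binomial proportions, yields intervals of roughly the same size. A minimal sketch under that assumption:

```python
from math import sqrt

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion
    (one common choice; the paper's exact method is unstated)."""
    p = successes / n
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return center - half, center + half

# training-set detection sensitivity: 28 of 34 findings
sens = 28 / 34
lo, hi = wilson_ci(28, 34)
```

For 28 of 34 this gives approximately (0.66, 0.92), in the same range as the reported interval of (0.68, 0.90).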
Kittas, Aristotelis; Delobelle, Aurélien; Schmitt, Sabrina; Breuhahn, Kai; Guziolowski, Carito; Grabe, Niels
2016-01-01
An effective means to analyze mRNA expression data is to take advantage of established knowledge from pathway databases, using methods such as pathway-enrichment analyses. However, pathway databases are not case-specific, and expression data could be used to infer gene-regulation patterns in the context of specific pathways. In addition, canonical pathways may not always describe the signaling mechanisms properly, because interactions can frequently occur between genes in different pathways. Relatively few methods have been proposed to date for generating and analyzing such networks, preserving the causality between gene interactions and reasoning over the qualitative logic of regulatory effects. We present an algorithm (MCWalk), integrated with a logic-programming approach, to discover subgraphs in large-scale signaling networks by random walks in a fully automated pipeline. As an exemplary application, we uncover the signal transduction mechanisms in a gene interaction network describing hepatocyte growth factor-stimulated cell migration and proliferation, from gene expression measured with microarrays and RT-qPCR in in-house perturbation experiments in a keratinocyte-fibroblast co-culture. The resulting subgraphs illustrate possible associations of hepatocyte growth factor receptor c-Met nodes, differentially expressed genes, and cellular states. Using perturbation experiments and Answer Set Programming, we are able to select the subgraphs most consistent with the experimental data. We discover key regulator nodes by measuring the frequency with which they are traversed by walks connecting receptors and significantly regulated genes, and we predict their expression shifts consistently with the measured data. The Java implementation of MCWalk is publicly available under the MIT license at: https://bitbucket.org/akittas/biosubg. © 2015 FEBS.
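The key-regulator scoring idea — counting how often each node is traversed by random walks that successfully connect a receptor to a regulated gene — can be sketched in a deliberately simplified form. MCWalk itself adds qualitative-logic consistency checks via Answer Set Programming, and the graph, node names, and parameters below are illustrative, not taken from the paper:

```python
import random

def walk_frequencies(graph, sources, targets, n_walks=2000, max_len=50, seed=0):
    """Count how often each node is traversed by random walks that reach
    a target (regulated gene) from a source (receptor). Frequently
    traversed nodes are candidate key regulators.

    graph: adjacency dict in which every node appears as a key.
    """
    rng = random.Random(seed)
    counts = {v: 0 for v in graph}
    targets = set(targets)
    for _ in range(n_walks):
        node, visited = rng.choice(sources), []
        for _ in range(max_len):
            visited.append(node)
            if node in targets:
                for v in visited:       # credit only successful walks
                    counts[v] += 1
                break
            nbrs = graph.get(node, [])
            if not nbrs:                # dead end: discard this walk
                break
            node = rng.choice(nbrs)
    return counts
```

On a toy chain receptor → intermediate → gene, every successful walk credits the intermediate node, so it scores as highly as the endpoints.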
Adams, Jessica M.; Otero-Corchon, Veronica; Hammond, Geoffrey L.; Veldhuis, Johannes D.; Qi, Nathan
2015-01-01
Distinct male and female patterns of pituitary GH secretion produce sexually differentiated hepatic gene expression profiles, thereby influencing steroid and xenobiotic metabolism. We used a fully automated system to obtain serial nocturnal blood samples every 15 minutes from cannulated wild-type (WT) and somatostatin knockout (Sst-KO) mice to determine the role of SST, the principal inhibitor of GH release, in the generation of sexually dimorphic GH pulsatility. WT males had lower mean and median GH values, fewer random GH secretory bursts, and longer trough periods between GH pulses than WT females. Each of these parameters was feminized in male Sst-KO mice, whereas female Sst-KO mice had higher GH levels than all other groups, but GH pulsatility was unaffected. We next performed hepatic mRNA profiling with high-density microarrays. Male Sst-KO mice exhibited a globally feminized pattern of GH-dependent mRNA levels, but female Sst-KO mice were largely unaffected. Among the differentially expressed female-predominant genes was Serpina6, which encodes corticosteroid-binding globulin (CBG). Increased CBG was associated with elevated diurnal peak plasma corticosterone in unstressed WT females and both sexes of Sst-KO mice compared with WT males. Sst-KO mice also had exaggerated ACTH and corticosterone responses to acute restraint stress. However, consistent with their lack of phenotypic signs of excess glucocorticoids, cerebrospinal fluid concentrations of free corticosterone in Sst-KO mice were not elevated. In summary, SST is necessary for the prolonged interpulse troughs that define masculinized pituitary GH secretion. SST also contributes to sexual dimorphism of the hypothalamic-pituitary-adrenal axis via GH-dependent regulation of hepatic CBG production. PMID:25551181
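The pulsatility parameters compared across genotypes (number of secretory bursts, length of inter-pulse troughs) can be illustrated with a deliberately crude threshold-based sketch on an evenly sampled series; the actual endocrine analyses behind such papers use far more elaborate deconvolution methods, and this function is only a toy stand-in:

```python
def gh_pulse_stats(series, interval_min=15):
    """Crude pulse statistics for an evenly sampled hormone series:
    a 'pulse' is a run of samples above the series median, and a
    'trough' is a run at or below it. Returns (pulse count,
    mean trough duration in minutes)."""
    med = sorted(series)[len(series) // 2]
    pulses, troughs, in_pulse, trough_len = 0, [], False, 0
    for v in series:
        if v > med:
            if not in_pulse:
                pulses += 1
                if trough_len:
                    troughs.append(trough_len * interval_min)
            in_pulse, trough_len = True, 0
        else:
            in_pulse = False
            trough_len += 1
    if trough_len:                       # trailing trough
        troughs.append(trough_len * interval_min)
    mean_trough = sum(troughs) / len(troughs) if troughs else 0.0
    return pulses, mean_trough
```

On a toy series with two elevated runs separated by baseline samples, the function reports two pulses and the average trough duration between and around them.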
Implementing ICAO Language Proficiency Requirements in the Versant Aviation English Test
ERIC Educational Resources Information Center
Van Moere, Alistair; Suzuki, Masanori; Downey, Ryan; Cheng, Jian
2009-01-01
This paper discusses the development of an assessment to satisfy the International Civil Aviation Organization (ICAO) Language Proficiency Requirements. The Versant Aviation English Test utilizes speech recognition technology and a computerized testing platform, such that test administration and scoring are fully automated. Developed in…
Exploring the Issues: Humans and Computers.
ERIC Educational Resources Information Center
Walsh, Huber M.
This presentation addresses three basic social issues generated by the computer revolution. The first section, "Money Matters," focuses on the economic effects of computer technology. These include the replacement of workers by fully automated machines, the threat to professionals posed by expanded access to specialized information, and the…
Application of a minicomputer-based system in measuring intraocular fluid dynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bronzino, J.D.; D'Amato, D.P.; O'Rourke, J.
A complete, computerized system has been developed to automate and display radionuclide clearance studies in an ophthalmology clinical laboratory. The system is based on a PDP-8E computer with 16K of core memory and includes a dual-drive DECassette system and an interactive display terminal. The software controls the acquisition of data from a NIM scaler, times the procedures, and analyzes and simultaneously displays logarithmically converted data on a fully annotated graph. Animal studies and clinical experiments are presented to illustrate the nature of these displays and the results obtained using this automated eye physiometer.
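The logarithmic conversion applied by the system points at the standard analysis of first-order washout: log-transformed counts fall on a straight line, so a least-squares fit recovers the clearance rate constant and half-life. A minimal sketch (the PDP-8E software's actual analysis is not detailed in the abstract, and the sample data below are invented):

```python
from math import log

def clearance_half_life(times_min, counts):
    """Fit log(counts) = a + b*t by least squares and return the
    clearance rate constant k = -b (1/min) and the half-life
    ln(2)/k (min), mirroring the logarithmic conversion the
    system applied before display."""
    ys = [log(c) for c in counts]
    n = len(times_min)
    tbar = sum(times_min) / n
    ybar = sum(ys) / n
    b = sum((t - tbar) * (y - ybar) for t, y in zip(times_min, ys)) \
        / sum((t - tbar) ** 2 for t in times_min)
    k = -b
    return k, log(2) / k
```

Feeding it synthetic counts decaying at 1% per minute recovers k = 0.01/min and a half-life of about 69.3 minutes.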