Sample records for cellular automation model

  1. Genetic Algorithm Calibration of Probabilistic Cellular Automata for Modeling Mining Permit Activity

    USGS Publications Warehouse

    Louis, S.J.; Raines, G.L.

    2003-01-01

    We use a genetic algorithm to calibrate a spatially and temporally resolved cellular automaton to model mining activity on public land in Idaho and western Montana. The genetic algorithm searches through a space of transition rule parameters of a two-dimensional cellular automaton model to find rule parameters that fit observed mining activity data. Previous work by one of the authors in calibrating the cellular automaton took weeks; the genetic algorithm takes a day and produces rules leading to about the same (or better) fit to observed data. These preliminary results indicate that genetic algorithms are a viable tool for calibrating cellular automata for this application. Experience gained during the calibration of this cellular automaton suggests that mineral resource information is a critical factor in the quality of the results. With automated calibration, further refinements of how the mineral-resource information is provided to the cellular automaton will probably improve our model.
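
The calibration loop described above can be sketched as follows; everything here (grid size, rule parameterization, GA settings) is an invented toy stand-in for the authors' setup:

```python
import random

# Hypothetical sketch: a genetic algorithm tunes the transition-rule
# parameters of a probabilistic CA until its output matches "observed" data.
random.seed(0)

SIZE = 12
# Toy stand-in for the observed mining-activity grid.
target = [[1 if (r + c) % 3 == 0 else 0 for c in range(SIZE)] for r in range(SIZE)]

def run_ca(params, steps=4):
    """Run the CA: a cell activates with probability p_base + p_nbr * active_neighbours."""
    p_base, p_nbr = params
    grid = [[0] * SIZE for _ in range(SIZE)]
    for _ in range(steps):
        new = [[0] * SIZE for _ in range(SIZE)]
        for r in range(SIZE):
            for c in range(SIZE):
                nbrs = sum(grid[(r + dr) % SIZE][(c + dc) % SIZE]
                           for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                           if (dr, dc) != (0, 0))
                p = min(1.0, max(0.0, p_base + p_nbr * nbrs))
                new[r][c] = 1 if random.random() < p else 0
        grid = new
    return grid

def fitness(params):
    """Negative count of cells that disagree with the observed grid (0 is perfect)."""
    grid = run_ca(params)
    return -sum(grid[r][c] != target[r][c] for r in range(SIZE) for c in range(SIZE))

# Minimal GA: truncation selection plus Gaussian mutation.
pop = [(random.random(), random.random() * 0.2) for _ in range(12)]
for _ in range(10):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:4]
    pop = parents + [(max(0.0, p + random.gauss(0, 0.05)),
                      max(0.0, q + random.gauss(0, 0.02)))
                     for p, q in random.choices(parents, k=8)]

best = max(pop, key=fitness)
print("best params:", best, "fitness:", fitness(best))
```

The fitness here is a simple cell-wise mismatch count; the actual calibration compares simulated against historical permit activity.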

  2. Automated quantitative histology reveals vascular morphodynamics during Arabidopsis hypocotyl secondary growth.

    PubMed

    Sankar, Martial; Nieminen, Kaisa; Ragni, Laura; Xenarios, Ioannis; Hardtke, Christian S

    2014-02-11

    Among various advantages, their small size makes model organisms preferred subjects of investigation. Yet, even in model systems detailed analysis of numerous developmental processes at cellular level is severely hampered by their scale. For instance, secondary growth of Arabidopsis hypocotyls creates a radial pattern of highly specialized tissues that comprises several thousand cells starting from a few dozen. This dynamic process is difficult to follow because of its scale and because it can only be investigated invasively, precluding comprehensive understanding of the cell proliferation, differentiation, and patterning events involved. To overcome such limitation, we established an automated quantitative histology approach. We acquired hypocotyl cross-sections from tiled high-resolution images and extracted their information content using custom high-throughput image processing and segmentation. Coupled with automated cell type recognition through machine learning, we could establish a cellular resolution atlas that reveals vascular morphodynamics during secondary growth, for example equidistant phloem pole formation. DOI: http://dx.doi.org/10.7554/eLife.01567.001.

  3. Automated quantitative histology reveals vascular morphodynamics during Arabidopsis hypocotyl secondary growth

    PubMed Central

    Sankar, Martial; Nieminen, Kaisa; Ragni, Laura; Xenarios, Ioannis; Hardtke, Christian S

    2014-01-01

    Among various advantages, their small size makes model organisms preferred subjects of investigation. Yet, even in model systems detailed analysis of numerous developmental processes at cellular level is severely hampered by their scale. For instance, secondary growth of Arabidopsis hypocotyls creates a radial pattern of highly specialized tissues that comprises several thousand cells starting from a few dozen. This dynamic process is difficult to follow because of its scale and because it can only be investigated invasively, precluding comprehensive understanding of the cell proliferation, differentiation, and patterning events involved. To overcome such limitation, we established an automated quantitative histology approach. We acquired hypocotyl cross-sections from tiled high-resolution images and extracted their information content using custom high-throughput image processing and segmentation. Coupled with automated cell type recognition through machine learning, we could establish a cellular resolution atlas that reveals vascular morphodynamics during secondary growth, for example equidistant phloem pole formation. DOI: http://dx.doi.org/10.7554/eLife.01567.001 PMID:24520159

  4. Cellular automaton formulation of passive scalar dynamics

    NASA Technical Reports Server (NTRS)

    Chen, Hudong; Matthaeus, William H.

    1987-01-01

    Cellular automata modeling of the advection of a passive scalar in a two-dimensional flow is examined in the context of discrete lattice kinetic theory. It is shown that if the passive scalar is represented by tagging or 'coloring' automaton particles, a passive advection-diffusion equation emerges without use of perturbation expansions. For the specific case of the hydrodynamic lattice gas model of Frisch et al. (1986), the diffusion coefficient is calculated by perturbation.
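
A minimal illustration of why tagged ("colored") particles yield diffusive behavior; this is a plain lattice random walk, not the FHP lattice-gas dynamics used in the paper:

```python
import random

# Tagged particles performing unbiased lattice random walks spread
# diffusively, the mechanism underlying the passive advection-diffusion
# limit. Toy 1-D version with all particles starting at the origin.
random.seed(1)

STEPS, N = 200, 2000
positions = [0] * N
for _ in range(STEPS):
    positions = [x + random.choice((-1, 1)) for x in positions]

msd = sum(x * x for x in positions) / N  # mean-square displacement
# For a simple random walk, msd ~ STEPS (i.e. 2*D*t with D = 1/2).
print(f"MSD after {STEPS} steps: {msd:.1f} (expected ~ {STEPS})")
```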

  5. Automated measurement of zebrafish larval movement

    PubMed Central

    Cario, Clinton L; Farrell, Thomas C; Milanese, Chiara; Burton, Edward A

    2011-01-01

    The zebrafish is a powerful vertebrate model that is readily amenable to genetic, pharmacological and environmental manipulations to elucidate the molecular and cellular basis of movement and behaviour. We report software enabling automated analysis of zebrafish movement from video recordings captured with cameras ranging from a basic camcorder to more specialized equipment. The software, which is provided as open-source MATLAB functions, can be freely modified and distributed, and is compatible with multiwell plates under a wide range of experimental conditions. Automated measurement of zebrafish movement using this technique will be useful for multiple applications in neuroscience, pharmacology and neuropsychiatry. PMID:21646414

  6. Automated measurement of zebrafish larval movement.

    PubMed

    Cario, Clinton L; Farrell, Thomas C; Milanese, Chiara; Burton, Edward A

    2011-08-01

    The zebrafish is a powerful vertebrate model that is readily amenable to genetic, pharmacological and environmental manipulations to elucidate the molecular and cellular basis of movement and behaviour. We report software enabling automated analysis of zebrafish movement from video recordings captured with cameras ranging from a basic camcorder to more specialized equipment. The software, which is provided as open-source MATLAB functions, can be freely modified and distributed, and is compatible with multiwell plates under a wide range of experimental conditions. Automated measurement of zebrafish movement using this technique will be useful for multiple applications in neuroscience, pharmacology and neuropsychiatry.
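
The core measurement can be approximated by frame differencing; the function below is an illustrative stand-in, not the authors' open-source MATLAB code:

```python
# Motion in each well is scored as the summed absolute pixel difference
# between consecutive frames. Frames are plain nested lists of greyscale
# values; names and data are invented for illustration.

def movement_score(frames):
    """Sum of absolute per-pixel differences across consecutive frames."""
    total = 0
    for prev, cur in zip(frames, frames[1:]):
        total += sum(abs(a - b) for row_p, row_c in zip(prev, cur)
                     for a, b in zip(row_p, row_c))
    return total

# Two synthetic 4x4 "wells": one with a moving larva (a bright pixel
# shifting position each frame), one static.
moving = [
    [[0, 0, 0, 0], [0, 9, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]],
    [[0, 0, 0, 0], [0, 0, 9, 0], [0, 0, 0, 0], [0, 0, 0, 0]],
    [[0, 0, 0, 0], [0, 0, 0, 0], [0, 0, 9, 0], [0, 0, 0, 0]],
]
static = [moving[0]] * 3

print("moving well:", movement_score(moving))  # → 36
print("static well:", movement_score(static))  # → 0
```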

  7. High content image analysis for human H4 neuroglioma cells exposed to CuO nanoparticles.

    PubMed

    Li, Fuhai; Zhou, Xiaobo; Zhu, Jinmin; Ma, Jinwen; Huang, Xudong; Wong, Stephen T C

    2007-10-09

    High content screening (HCS)-based image analysis is becoming an important and widely used research tool. Capitalizing on this technology, ample cellular information can be extracted from high content cellular images. In this study, an automated, reliable and quantitative cellular image analysis system developed in house was employed to quantify the toxic responses of human H4 neuroglioma cells exposed to metal oxide nanoparticles. This system has proven to be an essential tool in our study. Cellular images of H4 neuroglioma cells exposed to different concentrations of CuO nanoparticles were sampled using the IN Cell Analyzer 1000. A fully automated cellular image analysis system was developed to perform the image analysis for cell viability. A multiple adaptive thresholding method was used to classify the pixels of the nuclei image into three classes: bright nuclei, dark nuclei, and background. During the development of our image analysis methodology, we achieved the following: (1) Gaussian filtering with proper scale was applied to the cellular images to generate a local intensity maximum inside each nucleus; (2) a novel local intensity maxima detection method based on the gradient vector field was established; and (3) a statistical model based splitting method was proposed to overcome the under-segmentation problem. Computational results indicate that 95.9% of nuclei can be detected and segmented correctly by the proposed image analysis system. The proposed automated image analysis system can effectively segment images of human H4 neuroglioma cells exposed to CuO nanoparticles. The computational results confirmed our biological finding that human H4 neuroglioma cells had a dose-dependent toxic response to the insult of CuO nanoparticles.
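
The multi-class thresholding step can be sketched as follows; the fixed cutoffs here replace the adaptive thresholds derived in the paper:

```python
# Pixels are assigned to one of three classes using two intensity cutoffs:
# 0 = background, 1 = dark nucleus, 2 = bright nucleus. The image and
# thresholds are invented for illustration.

def classify_pixels(image, t_dark, t_bright):
    """Return per-pixel labels: 0 = background, 1 = dark nucleus, 2 = bright nucleus."""
    labels = []
    for row in image:
        labels.append([0 if v < t_dark else (1 if v < t_bright else 2) for v in row])
    return labels

image = [
    [ 10,  12,  80,  85],
    [ 11, 200, 210,  90],
    [  9, 205,  15,  14],
]
labels = classify_pixels(image, t_dark=50, t_bright=150)
print(labels)
```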

  8. New cellular automaton model for magnetohydrodynamics

    NASA Technical Reports Server (NTRS)

    Chen, Hudong; Matthaeus, William H.

    1987-01-01

    A new type of two-dimensional cellular automation method is introduced for computation of magnetohydrodynamic fluid systems. Particle population is described by a 36-component tensor referred to a hexagonal lattice. By appropriate choice of the coefficients that control the modified streaming algorithm and the definition of the macroscopic fields, it is possible to compute both Lorentz-force and magnetic-induction effects. The method is local in the microscopic space and therefore suited to massively parallel computations.

  9. 20180312 - Mechanistic Modeling of Developmental Defects through Computational Embryology (SOT)

    EPA Science Inventory

    Significant advances in the genome sciences, in automated high-throughput screening (HTS), and in alternative methods for testing enable rapid profiling of chemical libraries for quantitative effects on diverse cellular activities. While a surfeit of HTS data and information is n...

  10. LAND USE CHANGE DUE TO URBANIZATION FOR THE NEUSE RIVER BASIN

    EPA Science Inventory

    The Urban Growth Model (UGM) was applied to analysis of land use change in the Neuse River Basin as part of a larger project for estimating the regional and broader impact of urbanization. UGM is based on cellular automation (CA) simulation techniques developed at the University...

  11. Survey statistics of automated segmentations applied to optical imaging of mammalian cells.

    PubMed

    Bajcsy, Peter; Cardone, Antonio; Chalfoun, Joe; Halter, Michael; Juba, Derek; Kociolek, Marcin; Majurski, Michael; Peskin, Adele; Simon, Carl; Simon, Mylene; Vandecreme, Antoine; Brady, Mary

    2015-10-15

    The goal of this survey paper is to overview cellular measurements using optical microscopy imaging followed by automated image segmentation. The cellular measurements of primary interest are taken from mammalian cells and their components. They are denoted as two- or three-dimensional (2D or 3D) image objects of biological interest. In our applications, such cellular measurements are important for understanding cell phenomena, such as cell counts, cell-scaffold interactions, cell colony growth rates, or cell pluripotency stability, as well as for establishing quality metrics for stem cell therapies. In this context, this survey paper is focused on automated segmentation as a software-based measurement leading to quantitative cellular measurements. We define the scope of this survey and a classification schema first. Next, all found and manually filtered publications are classified according to the main categories: (1) objects of interests (or objects to be segmented), (2) imaging modalities, (3) digital data axes, (4) segmentation algorithms, (5) segmentation evaluations, (6) computational hardware platforms used for segmentation acceleration, and (7) object (cellular) measurements. Finally, all classified papers are converted programmatically into a set of hyperlinked web pages with occurrence and co-occurrence statistics of assigned categories. The survey paper presents to a reader: (a) the state-of-the-art overview of published papers about automated segmentation applied to optical microscopy imaging of mammalian cells, (b) a classification of segmentation aspects in the context of cell optical imaging, (c) histogram and co-occurrence summary statistics about cellular measurements, segmentations, segmented objects, segmentation evaluations, and the use of computational platforms for accelerating segmentation execution, and (d) open research problems to pursue. The novel contributions of this survey paper are: (1) a new type of classification of cellular measurements and automated segmentation, (2) statistics about the published literature, and (3) a web hyperlinked interface to classification statistics of the surveyed papers at https://isg.nist.gov/deepzoomweb/resources/survey/index.html.
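
The occurrence and co-occurrence statistics the survey reports can be computed along these lines (paper IDs and category labels below are invented):

```python
from collections import Counter
from itertools import combinations

# Each classified paper carries a set of category labels; we count how
# often individual labels and pairs of labels appear across papers.
papers = {
    "p1": {"fluorescence", "2D", "watershed"},
    "p2": {"fluorescence", "3D", "watershed"},
    "p3": {"phase-contrast", "2D", "level-set"},
}

occurrence = Counter(label for labels in papers.values() for label in labels)
co_occurrence = Counter(
    pair for labels in papers.values() for pair in combinations(sorted(labels), 2)
)

print(occurrence["fluorescence"])                    # → 2
print(co_occurrence[("fluorescence", "watershed")])  # → 2
```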

  12. A computational and cellular solids approach to the stiffness-based design of bone scaffolds.

    PubMed

    Norato, J A; Wagoner Johnson, A J

    2011-09-01

    We derive a cellular solids approach to the design of bone scaffolds for stiffness and pore size. Specifically, we focus on scaffolds made of stacked, alternating, orthogonal layers of hydroxyapatite rods, such as those obtained via micro-robotic deposition, and aim to determine the rod diameter, spacing and overlap required to obtain specified elastic moduli and pore size. To validate and calibrate the cellular solids model, we employ a finite element model and determine the effective scaffold moduli via numerical homogenization. In order to perform an efficient, automated execution of the numerical studies, we employ a geometry projection method so that analyses corresponding to different scaffold dimensions can be performed on a fixed, non-conforming mesh. Based on the developed model, we provide design charts to aid in the selection of rod diameter, spacing and overlap to be used in the robotic deposition to attain desired elastic moduli and pore size.
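
As a rough illustration of the stiffness-porosity trade-off such design charts encode, a generic cellular-solids (Gibson-Ashby) scaling estimate can be sketched; this is not the authors' homogenization model, and the hydroxyapatite modulus is an assumed representative value:

```python
# Generic open-cell foam estimate: E_eff = C * E_s * (rho/rho_s)**n,
# with C and n empirical constants (commonly C ~ 1, n ~ 2).

def effective_modulus(E_solid, rel_density, C=1.0, n=2.0):
    """Effective elastic modulus from the Gibson-Ashby scaling law."""
    return C * E_solid * rel_density ** n

E_ha = 80.0  # GPa, assumed representative modulus for dense hydroxyapatite
for rel_density in (0.2, 0.4, 0.6):
    print(f"rel. density {rel_density:.1f}: "
          f"E_eff ~ {effective_modulus(E_ha, rel_density):.1f} GPa")
```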

  13. High-Dimensional Modeling for Cytometry: Building Rock Solid Models Using GemStone™ and Verity Cen-se'™ High-Definition t-SNE Mapping.

    PubMed

    Bruce Bagwell, C

    2018-01-01

    This chapter outlines how to approach the complex tasks associated with designing models for high-dimensional cytometry data. Unlike gating approaches, modeling lends itself to automation and accounts for measurement overlap among cellular populations. Designing these models is now easier because of a new technique called high-definition t-SNE mapping. Nontrivial examples are provided that serve as a guide to create models that are consistent with data.

  14. Challenges in structural approaches to cell modeling

    PubMed Central

    Im, Wonpil; Liang, Jie; Olson, Arthur; Zhou, Huan-Xiang; Vajda, Sandor; Vakser, Ilya A.

    2016-01-01

    Computational modeling is essential for structural characterization of biomolecular mechanisms across the broad spectrum of scales. Adequate understanding of biomolecular mechanisms inherently involves our ability to model them. Structural modeling of individual biomolecules and their interactions has been rapidly progressing. However, in terms of the broader picture, the focus is shifting toward larger systems, up to the level of a cell. Such modeling involves a more dynamic and realistic representation of the interactomes in vivo, in a crowded cellular environment, as well as membranes and membrane proteins, and other cellular components. Structural modeling of a cell complements computational approaches to cellular mechanisms based on differential equations, graph models, and other techniques to model biological networks, imaging data, etc. Structural modeling along with other computational and experimental approaches will provide a fundamental understanding of life at the molecular level and lead to important applications to biology and medicine. A cross section of diverse approaches presented in this review illustrates the developing shift from the structural modeling of individual molecules to that of cell biology. Studies in several related areas are covered: biological networks; automated construction of three-dimensional cell models using experimental data; modeling of protein complexes; prediction of non-specific and transient protein interactions; thermodynamic and kinetic effects of crowding; cellular membrane modeling; and modeling of chromosomes. The review presents an expert opinion on the current state-of-the-art in these various aspects of structural modeling in cellular biology, and the prospects of future developments in this emerging field. PMID:27255863

  15. Sculplexity: Sculptures of Complexity using 3D printing

    NASA Astrophysics Data System (ADS)

    Reiss, D. S.; Price, J. J.; Evans, T. S.

    2013-11-01

    We show how to convert models of complex systems such as 2D cellular automata into a 3D printed object. Our method takes into account the limitations inherent to 3D printing processes and materials. Our approach automates the greater part of this task, bypassing the use of CAD software and the need for manual design. As a proof of concept, a physical object representing a modified forest fire model was successfully printed. Automated conversion methods similar to the ones developed here can be used to create objects for research, for demonstration and teaching, for outreach, or simply for aesthetic pleasure. As our outputs can be touched, they may be particularly useful for those with visual disabilities.
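
The central conversion, stacking successive CA generations along an extra axis to form a printable voxel model, can be sketched with a 1-D elementary CA producing a 2-D slab; the paper performs the analogous 2-D to 3-D stacking with a modified forest-fire model:

```python
# Elementary CA rule 90 (new cell = left XOR right, periodic boundary),
# with successive generations stacked as layers of a voxel slab.

WIDTH, GENERATIONS = 32, 16

def step_rule90(row):
    """One rule-90 update on a periodic 1-D lattice."""
    return [row[i - 1] ^ row[(i + 1) % len(row)] for i in range(len(row))]

row = [0] * WIDTH
row[WIDTH // 2] = 1          # single seed cell
layers = [row]
for _ in range(GENERATIONS - 1):
    row = step_rule90(row)
    layers.append(row)

# 'layers' is now a GENERATIONS x WIDTH voxel slab (1 = material, 0 = void);
# a real pipeline would extrude each voxel into a cube and emit STL/OBJ.
print(len(layers), len(layers[0]), sum(map(sum, layers)))
```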

  16. High-Throughput Method for Automated Colony and Cell Counting by Digital Image Analysis Based on Edge Detection

    PubMed Central

    Choudhry, Priya

    2016-01-01

    Counting cells and colonies is an integral part of high-throughput screens and quantitative cellular assays. Due to its subjective and time-intensive nature, manual counting has hindered the adoption of cellular assays such as tumor spheroid formation in high-throughput screens. The objective of this study was to develop an automated method for quick and reliable counting of cells and colonies from digital images. For this purpose, I developed an ImageJ macro Cell Colony Edge and a CellProfiler Pipeline Cell Colony Counting, and compared them to other open-source digital methods and manual counts. The ImageJ macro Cell Colony Edge is valuable in counting cells and colonies, and measuring their area, volume, morphology, and intensity. In this study, I demonstrate that Cell Colony Edge is superior to other open-source methods, in speed, accuracy and applicability to diverse cellular assays. It can fulfill the need to automate colony/cell counting in high-throughput screens, colony forming assays, and cellular assays. PMID:26848849
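
A minimal stand-in for automated colony counting is thresholding followed by connected-component labeling; the published ImageJ/CellProfiler pipelines add edge detection and per-colony measurements on top of this idea:

```python
from collections import deque

# Threshold a greyscale image, then count 4-connected foreground
# components with a breadth-first flood fill. Data are invented.

def count_colonies(image, threshold):
    rows, cols = len(image), len(image[0])
    mask = [[v >= threshold for v in row] for row in image]
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                count += 1
                queue = deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return count

plate = [
    [0, 9, 9, 0, 0, 0],
    [0, 9, 0, 0, 8, 0],
    [0, 0, 0, 0, 8, 8],
    [7, 0, 0, 0, 0, 0],
]
print(count_colonies(plate, threshold=5))  # → 3
```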

  17. Challenges in structural approaches to cell modeling.

    PubMed

    Im, Wonpil; Liang, Jie; Olson, Arthur; Zhou, Huan-Xiang; Vajda, Sandor; Vakser, Ilya A

    2016-07-31

    Computational modeling is essential for structural characterization of biomolecular mechanisms across the broad spectrum of scales. Adequate understanding of biomolecular mechanisms inherently involves our ability to model them. Structural modeling of individual biomolecules and their interactions has been rapidly progressing. However, in terms of the broader picture, the focus is shifting toward larger systems, up to the level of a cell. Such modeling involves a more dynamic and realistic representation of the interactomes in vivo, in a crowded cellular environment, as well as membranes and membrane proteins, and other cellular components. Structural modeling of a cell complements computational approaches to cellular mechanisms based on differential equations, graph models, and other techniques to model biological networks, imaging data, etc. Structural modeling along with other computational and experimental approaches will provide a fundamental understanding of life at the molecular level and lead to important applications to biology and medicine. A cross section of diverse approaches presented in this review illustrates the developing shift from the structural modeling of individual molecules to that of cell biology. Studies in several related areas are covered: biological networks; automated construction of three-dimensional cell models using experimental data; modeling of protein complexes; prediction of non-specific and transient protein interactions; thermodynamic and kinetic effects of crowding; cellular membrane modeling; and modeling of chromosomes. The review presents an expert opinion on the current state-of-the-art in these various aspects of structural modeling in cellular biology, and the prospects of future developments in this emerging field. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Semi-Automated Curation Allows Causal Network Model Building for the Quantification of Age-Dependent Plaque Progression in ApoE-/- Mouse.

    PubMed

    Szostak, Justyna; Martin, Florian; Talikka, Marja; Peitsch, Manuel C; Hoeng, Julia

    2016-01-01

    The cellular and molecular mechanisms behind the process of atherosclerotic plaque destabilization are complex, and molecular data from aortic plaques are difficult to interpret. Biological network models may overcome these difficulties and precisely quantify the molecular mechanisms impacted during disease progression. The atherosclerosis plaque destabilization biological network model was constructed with the semiautomated curation pipeline, BELIEF. Cellular and molecular mechanisms promoting plaque destabilization or rupture were captured in the network model. Public transcriptomic data sets were used to demonstrate the specificity of the network model and to capture the different mechanisms that were impacted in ApoE -/- mouse aorta at 6 and 32 weeks. We concluded that network models combined with the network perturbation amplitude algorithm provide a sensitive, quantitative method to follow disease progression at the molecular level. This approach can be used to investigate and quantify molecular mechanisms during plaque progression.

  19. Two-threshold model for scaling laws of noninteracting snow avalanches

    USGS Publications Warehouse

    Faillettaz, J.; Louchet, F.; Grasso, J.-R.

    2004-01-01

    A two-threshold model was proposed for scaling laws of noninteracting snow avalanches. It was found that the sizes of the largest avalanches just preceding the lattice system were power-law distributed. The proposed model reproduced the range of power-law exponents observed for land, rock or snow avalanches by tuning the maximum value of the ratio of the two failure thresholds. A two-threshold 2D cellular automaton was introduced to study the scaling for gravity-driven systems.
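
A toy avalanche automaton in the spirit of the model, with per-cell failure thresholds and load redistribution to neighbours; the rules are illustrative, not the authors' exact two-threshold formulation:

```python
import random

# Each cell carries a random failure threshold, slowly accumulates load,
# and on failure sheds its load onto its four neighbours; load leaving the
# lattice boundary is dissipated. Avalanche size = number of failures per
# driving step.
random.seed(2)

SIZE, T_MIN, T_MAX = 12, 4.0, 6.0
threshold = [[random.uniform(T_MIN, T_MAX) for _ in range(SIZE)] for _ in range(SIZE)]
load = [[0.0] * SIZE for _ in range(SIZE)]

def drive_once():
    """Add unit load to a random cell, relax fully, return avalanche size."""
    r, c = random.randrange(SIZE), random.randrange(SIZE)
    load[r][c] += 1.0
    failed = 0
    unstable = [(r, c)]
    while unstable:
        y, x = unstable.pop()
        if load[y][x] <= threshold[y][x]:
            continue
        failed += 1
        share = load[y][x] / 4.0
        load[y][x] = 0.0
        for ny, nx in ((y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)):
            if 0 <= ny < SIZE and 0 <= nx < SIZE:
                load[ny][nx] += share
                if load[ny][nx] > threshold[ny][nx]:
                    unstable.append((ny, nx))
    return failed

sizes = [drive_once() for _ in range(3000)]
avalanches = [s for s in sizes if s > 0]
print("driving steps:", len(sizes), "avalanches:", len(avalanches),
      "largest:", max(avalanches) if avalanches else 0)
```

A histogram of `avalanches` on log-log axes is where a power-law size distribution would show up.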

  20. Advances in molecular labeling, high throughput imaging and machine intelligence portend powerful functional cellular biochemistry tools.

    PubMed

    Price, Jeffrey H; Goodacre, Angela; Hahn, Klaus; Hodgson, Louis; Hunter, Edward A; Krajewski, Stanislaw; Murphy, Robert F; Rabinovich, Andrew; Reed, John C; Heynen, Susanne

    2002-01-01

    Cellular behavior is complex. Successfully understanding systems at ever-increasing complexity is fundamental to advances in modern science and unraveling the functional details of cellular behavior is no exception. We present a collection of prospectives to provide a glimpse of the techniques that will aid in collecting, managing and utilizing information on complex cellular processes via molecular imaging tools. These include: 1) visualizing intracellular protein activity with fluorescent markers, 2) high throughput (and automated) imaging of multilabeled cells in statistically significant numbers, and 3) machine intelligence to analyze subcellular image localization and pattern. Although not addressed here, the importance of combining cell-image-based information with detailed molecular structure and ligand-receptor binding models cannot be overlooked. Advanced molecular imaging techniques have the potential to impact cellular diagnostics for cancer screening, clinical correlations of tissue molecular patterns for cancer biology, and cellular molecular interactions for accelerating drug discovery. The goal of finally understanding all cellular components and behaviors will be achieved by advances in both instrumentation engineering (software and hardware) and molecular biochemistry. Copyright 2002 Wiley-Liss, Inc.

  21. Simulation of Regionally Ecological Land Based on a Cellular Automation Model: A Case Study of Beijing, China

    PubMed Central

    Xie, Hualin; Kung, Chih-Chun; Zhang, Yanting; Li, Xiubin

    2012-01-01

    Ecological land is like the “liver” of a city and is very useful to public health. Ecological land change is a spatially dynamic non-linear process under the interaction between natural and anthropogenic factors at different scales. In this study, by setting up natural development scenario, object orientation scenario and ecosystem priority scenario, a Cellular Automation (CA) model has been established to simulate the evolution pattern of ecological land in Beijing in the year 2020. Under the natural development scenario, most of ecological land will be replaced by construction land and crop land. But under the scenarios of object orientation and ecosystem priority, the ecological land area will increase, especially under the scenario of ecosystem priority. When considering the factors such as total area of ecological land, loss of key ecological land and spatial patterns of land use, the scenarios from priority to inferiority are ecosystem priority, object orientation and natural development, so future land management policies in Beijing should be focused on conversion of cropland to forest, wetland protection and prohibition of exploitation of natural protection zones, water source areas and forest parks to maintain the safety of the regional ecosystem. PMID:23066410

  22. Simulation of regionally ecological land based on a cellular automation model: a case study of Beijing, China.

    PubMed

    Xie, Hualin; Kung, Chih-Chun; Zhang, Yanting; Li, Xiubin

    2012-08-01

    Ecological land is like the "liver" of a city and is very useful to public health. Ecological land change is a spatially dynamic non-linear process under the interaction between natural and anthropogenic factors at different scales. In this study, by setting up natural development scenario, object orientation scenario and ecosystem priority scenario, a Cellular Automation (CA) model has been established to simulate the evolution pattern of ecological land in Beijing in the year 2020. Under the natural development scenario, most of ecological land will be replaced by construction land and crop land. But under the scenarios of object orientation and ecosystem priority, the ecological land area will increase, especially under the scenario of ecosystem priority. When considering the factors such as total area of ecological land, loss of key ecological land and spatial patterns of land use, the scenarios from priority to inferiority are ecosystem priority, object orientation and natural development, so future land management policies in Beijing should be focused on conversion of cropland to forest, wetland protection and prohibition of exploitation of natural protection zones, water source areas and forest parks to maintain the safety of the regional ecosystem.
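
A scenario-driven CA transition step loosely following this setup can be sketched as below; the land-use classes, scenario weights, and conversion rule are all invented for illustration:

```python
import random

# Each cell holds a land-use class; its probability of converting to
# construction land grows with neighbouring construction pressure, scaled
# by a per-scenario weight (natural development converts fastest,
# ecosystem priority slowest).
random.seed(3)

ECO, CROP, BUILT = "eco", "crop", "built"
SCENARIO_WEIGHT = {"natural_development": 1.0, "object_orientation": 0.4,
                   "ecosystem_priority": 0.1}

def step(grid, scenario):
    w = SCENARIO_WEIGHT[scenario]
    size = len(grid)
    new = [row[:] for row in grid]
    for r in range(size):
        for c in range(size):
            if grid[r][c] == BUILT:
                continue
            built_nbrs = sum(grid[(r + dr) % size][(c + dc) % size] == BUILT
                             for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                             if (dr, dc) != (0, 0))
            if random.random() < w * built_nbrs / 8.0:
                new[r][c] = BUILT
    return new

# Initial map: 2 rows built, 3 rows cropland, 5 rows ecological land.
grid = [[BUILT if r < 2 else (CROP if r < 5 else ECO) for _ in range(10)]
        for r in range(10)]

results = {}
for scenario in SCENARIO_WEIGHT:
    g = grid
    for _ in range(10):
        g = step(g, scenario)
    results[scenario] = sum(row.count(ECO) for row in g)
    print(f"{scenario}: ecological cells remaining = {results[scenario]}")
```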

  23. Discrimination of Isomers of Released N- and O-Glycans Using Diagnostic Product Ions in Negative Ion PGC-LC-ESI-MS/MS

    NASA Astrophysics Data System (ADS)

    Ashwood, Christopher; Lin, Chi-Hung; Thaysen-Andersen, Morten; Packer, Nicolle H.

    2018-03-01

    Profiling cellular protein glycosylation is challenging due to the presence of highly similar glycan structures that play diverse roles in cellular physiology. As the anomericity and the exact linkage type of a single glycosidic bond can influence glycan function, there is a demand for improved and automated methods to confirm detailed structural features and to discriminate between structurally similar isomers, overcoming a significant bottleneck in the analysis of data generated by glycomics experiments. We used porous graphitized carbon-LC-ESI-MS/MS to separate and detect released N- and O-glycan isomers from mammalian model glycoproteins using negative mode resonance activation CID-MS/MS. By interrogating similar fragment spectra from closely related glycan isomers that differ only in arm position and sialyl linkage, product fragment ions for discrimination between these features were discovered. Using the Skyline software, at least two diagnostic fragment ions of high specificity were validated for automated discrimination of sialylation and arm position in N-glycan structures, and sialylation in O-glycan structures, complementing existing structural diagnostic ions. These diagnostic ions were shown to be useful for isomer discrimination using both linear and 3D ion trap mass spectrometers when analyzing complex glycan mixtures from cell lysates. Skyline was found to serve as a useful tool for automated assessment of glycan isomer discrimination. This platform-independent workflow can potentially be extended to automate the characterization and quantitation of other challenging glycan isomers.
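
The diagnostic-ion logic can be sketched as a simple spectrum-matching check; the ion m/z values below are placeholders, not the validated diagnostic ions from the paper:

```python
# An isomer is called when all of its diagnostic product ions are found
# in the MS/MS spectrum within an m/z tolerance.

DIAGNOSTIC_IONS = {
    "alpha2,3-sialyl": [655.21, 306.11],  # placeholder m/z values
    "alpha2,6-sialyl": [655.21, 322.10],
}

def match_isomer(spectrum_mz, tol=0.02):
    """Return isomer names whose diagnostic ions are all present in the spectrum."""
    hits = []
    for isomer, ions in DIAGNOSTIC_IONS.items():
        if all(any(abs(mz - ion) <= tol for mz in spectrum_mz) for ion in ions):
            hits.append(isomer)
    return hits

spectrum = [290.09, 306.11, 655.20, 1021.35]
print(match_isomer(spectrum))  # → ['alpha2,3-sialyl']
```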

  24. Tissue and Animal Models of Sudden Cardiac Death

    PubMed Central

    Sallam, Karim; Li, Yingxin; Sager, Philip T.; Houser, Steven R.; Wu, Joseph C.

    2015-01-01

    Sudden Cardiac Death (SCD) is a common cause of death in patients with structural heart disease, genetic mutations or acquired disorders affecting cardiac ion channels. A wide range of platforms exist to model and study disorders associated with SCD. Human clinical studies are cumbersome and are thwarted by the extent of investigation that can be performed on human subjects. Animal models are limited by their degree of homology to human cardiac electrophysiology including ion channel expression. Most commonly used cellular models are cellular transfection models, which are able to mimic the expression of a single ion channel offering incomplete insight into changes of the action potential profile. Induced pluripotent stem cell derived Cardiomyocytes (iPSC-CMs) resemble, but are not identical, to adult human cardiomyocytes, and provide a new platform for studying arrhythmic disorders leading to SCD. A variety of platforms exist to phenotype cellular models including conventional and automated patch clamp, multi-electrode array, and computational modeling. iPSC-CMs have been used to study Long QT syndrome, catecholaminergic polymorphic ventricular tachycardia, hypertrophic cardiomyopathy and other hereditary cardiac disorders. Although iPSC-CMs are distinct from adult cardiomyocytes, they provide a robust platform to advance the science and clinical care of SCD. PMID:26044252

  25. Automated microscopy for high-content RNAi screening

    PubMed Central

    2010-01-01

    Fluorescence microscopy is one of the most powerful tools to investigate complex cellular processes such as cell division, cell motility, or intracellular trafficking. The availability of RNA interference (RNAi) technology and automated microscopy has opened the possibility to perform cellular imaging in functional genomics and other large-scale applications. Although imaging often dramatically increases the content of a screening assay, it poses new challenges to achieve accurate quantitative annotation and therefore needs to be carefully adjusted to the specific needs of individual screening applications. In this review, we discuss principles of assay design, large-scale RNAi, microscope automation, and computational data analysis. We highlight strategies for imaging-based RNAi screening adapted to different library and assay designs. PMID:20176920

  26. Systems microscopy: an emerging strategy for the life sciences.

    PubMed

    Lock, John G; Strömblad, Staffan

    2010-05-01

    Dynamic cellular processes occurring in time and space are fundamental to all physiology and disease. To understand complex and dynamic cellular processes therefore demands the capacity to record and integrate quantitative multiparametric data from the four spatiotemporal dimensions within which living cells self-organize, and to subsequently use these data for the mathematical modeling of cellular systems. To this end, a raft of complementary developments in automated fluorescence microscopy, cell microarray platforms, quantitative image analysis and data mining, combined with multivariate statistics and computational modeling, now coalesce to produce a new research strategy, "systems microscopy", which facilitates systems biology analyses of living cells. Systems microscopy provides the crucial capacities to simultaneously extract and interrogate multiparametric quantitative data at resolution levels ranging from the molecular to the cellular, thereby elucidating a more comprehensive and richly integrated understanding of complex and dynamic cellular systems. The unique capacities of systems microscopy suggest that it will become a vital cornerstone of systems biology, and here we describe the current status and future prospects of this emerging field, as well as outlining some of the key challenges that remain to be overcome. Copyright 2010 Elsevier Inc. All rights reserved.

  7. Automated reagent-dispensing system for microfluidic cell biology assays.

    PubMed

    Ly, Jimmy; Masterman-Smith, Michael; Ramakrishnan, Ravichandran; Sun, Jing; Kokubun, Brent; van Dam, R Michael

    2013-12-01

    Microscale systems that enable measurements of oncological phenomena at the single-cell level have a great capacity to improve therapeutic strategies and diagnostics. Such measurements can reveal unprecedented insights into cellular heterogeneity and its implications for the progression and treatment of complicated cellular disease processes such as those found in cancer. We describe a novel fluid-delivery platform to interface with low-cost microfluidic chips containing arrays of microchambers. Using multiple pairs of needles to aspirate and dispense reagents, the platform enables automated coating of chambers, loading of cells, and treatment with growth media or other agents (e.g., drugs, fixatives, membrane permeabilizers, washes, stains, etc.). The chips can be quantitatively assayed using standard fluorescence-based immunocytochemistry, microscopy, and image analysis tools, to determine, for example, drug response based on differences in protein expression and/or activation of cellular targets on an individual-cell level. In general, automation of fluid and cell handling increases repeatability, eliminates human error, and enables increased throughput, especially for sophisticated, multistep assays such as multiparameter quantitative immunocytochemistry. We report the design of the automated platform and compare several aspects of its performance to manually loaded microfluidic chips.

  8. Characterization of GABAA receptor ligands with automated patch-clamp using human neurons derived from pluripotent stem cells

    PubMed Central

    Yuan, Nina Y.; Poe, Michael M.; Witzigmann, Christopher; Cook, James M.; Stafford, Douglas; Arnold, Leggy A.

    2016-01-01

    Introduction: Automated patch clamp is a recent but widely used technology for assessing preclinical drug safety. With the availability of human neurons derived from pluripotent stem cells, this technology can be extended to determine CNS effects of drug candidates, especially those acting on the GABAA receptor. Methods: iCell Neurons (Cellular Dynamics International, A Fujifilm Company) were cultured for ten days and analyzed by patch clamp in the presence of the agonist GABA alone or in combination with positive allosteric GABAA receptor modulators. Both efficacy and affinity were determined. In addition, mRNAs of GABAA receptor subunits were quantified by qRT-PCR. Results: We have shown that iCell Neurons are compatible with the IonFlux microfluidic system of the automated patch clamp instrument. Resistances ranging from 15 to 25 MΩ were achieved for each trap channel of patch-clamped cells in a 96-well plate format. GABA induced a robust change of current with an EC50 of 0.43 μM. The positive GABAA receptor modulators diazepam, HZ166, and CW-04-020 exhibited EC50 values of 0.42 μM, 1.56 μM, and 0.23 μM, respectively. The α2/α3/α5-selective compound HZ166 induced the highest potentiation (efficacy), 810% of the current induced by 100 nM GABA. Quantification of GABAA receptor mRNA in iCell Neurons revealed high levels of the α5 and β3 subunits and low levels of α1, which is similar to the configuration in human neonatal brain. Discussion: iCell Neurons represent a new cellular model for characterizing GABAergic compounds using automated patch clamp. These cells show excellent representation of cellular GABAA receptor distribution, which enables determination of total small-molecule efficacy and affinity as measured by changes in cell membrane current. PMID:27544543
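    The EC50 values above come from fitting concentration-response data. As an illustrative sketch only (the paper's fitting procedure is not given in the abstract, and the data below are synthetic, generated to match the reported GABA EC50 of 0.43 μM), an EC50 can be estimated by a least-squares grid search over the Hill equation:

```python
def hill(c, ec50, top=100.0, n=1.0):
    """Hill equation: fractional response (%) at concentration c (uM)."""
    return top * c**n / (ec50**n + c**n)

def fit_ec50(concs, responses, grid=None):
    """Least-squares grid search for EC50, with top and slope held fixed."""
    if grid is None:
        grid = [0.01 * 1.05**k for k in range(200)]  # ~0.01 to ~170 uM
    return min(grid, key=lambda e: sum((hill(c, e) - r) ** 2
                                       for c, r in zip(concs, responses)))

# synthetic concentration-response data built with EC50 = 0.43 uM
concs = [0.05, 0.1, 0.2, 0.43, 1.0, 3.0, 10.0]
responses = [hill(c, 0.43) for c in concs]
est = fit_ec50(concs, responses)
```

In practice one would fit top, slope, and EC50 jointly with a nonlinear least-squares routine; the grid search keeps the sketch dependency-free.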

  9. A cellular automation model accounting for bicycle's group behavior

    NASA Astrophysics Data System (ADS)

    Tang, Tie-Qiao; Rui, Ying-Xu; Zhang, Jian; Shang, Hua-Yan

    2018-02-01

    Recently, the bicycle has once again become an important mode of transport in China. Owing to the bicycle's merits, group behavior is widespread in urban traffic systems. However, little effort has been made to explore the impacts of group behavior on bicycle flow. In this paper, we propose a CA (cellular automaton) model with group behavior to explore the complex traffic phenomena caused by shoulder group behavior and following group behavior on an open road. The numerical results illustrate that the proposed model can qualitatively describe the impacts of the two kinds of group behavior on bicycle flow, and that the effects are related to the mode and size of the groups. These results can help us better understand the impacts of bicycle group behavior on urban traffic systems and control such behavior effectively.
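    The abstract does not spell out the model's transition rules. A minimal single-lane cellular automaton in the Nagel-Schreckenberg spirit (deterministic, with a hypothetical maximum speed and an open boundary, and without the paper's group-behavior rules) illustrates the kind of update step such bicycle-flow CA models build on:

```python
def ca_step(road, v_max=2):
    """One deterministic update of a single-lane CA.

    road: list where -1 marks an empty cell and a value >= 0 is the
    speed of a rider occupying that cell (open road, no wraparound).
    """
    n = len(road)
    new_road = [-1] * n
    for i, v in enumerate(road):
        if v < 0:
            continue
        # gap: number of empty cells to the next rider ahead
        gap, j = 0, i + 1
        while j < n and road[j] < 0:
            gap += 1
            j += 1
        if j == n:
            gap = n  # nothing ahead before the open boundary
        v = min(v + 1, v_max, gap)  # accelerate, but never into a rider
        if i + v < n:
            new_road[i + v] = v     # riders reaching the end leave the road
    return new_road

road = [0, -1, -1, 0, -1, -1, -1, -1]
road = ca_step(road)  # both riders advance; the leader is unobstructed
```

Group behavior would enter as extra rules coupling a rider's speed choice to neighbors riding shoulder-to-shoulder or in a following chain; those rules are specific to the paper and are not reproduced here.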

  10. Microfluidic-Based Platform for Universal Sample Preparation and Biological Assays Automation for Life-Sciences Research and Remote Medical Applications

    NASA Astrophysics Data System (ADS)

    Brassard, D.; Clime, L.; Daoud, J.; Geissler, M.; Malic, L.; Charlebois, D.; Buckley, N.; Veres, T.

    2018-02-01

    We present an innovative centrifugal microfluidic universal platform for the remote automation of bio-analytical assays required in life-sciences research and medical applications, including the purification and analysis of cellular and circulating markers from body fluids.

  11. Cellular Metabolomics for Exposure and Toxicity Assessment

    EPA Science Inventory

    We have developed NMR automation and cell quench methods for cell culture-based metabolomics to study chemical exposure and toxicity. Our flow automation method is robust and free of cross contamination. The direct cell quench method is rapid and effective. Cell culture-based met...

  12. Creation of a virtual cutaneous tissue bank

    NASA Astrophysics Data System (ADS)

    LaFramboise, William A.; Shah, Sujal; Hoy, R. W.; Letbetter, D.; Petrosko, P.; Vennare, R.; Johnson, Peter C.

    2000-04-01

    Cellular and non-cellular constituents of skin contain fundamental morphometric features and structural patterns that correlate with tissue function. High-resolution digital image acquisition is performed using an automated system and proprietary software to assemble adjacent images and create a contiguous, lossless digital representation of individual microscope slide specimens. Serial extraction, evaluation, and statistical analysis of cutaneous features are performed using an automated analysis system to derive normal cutaneous parameters comprising essential structural skin components. Automated digital cutaneous analysis allows fast extraction of microanatomic data with accuracy approximating manual measurement. The process provides rapid assessment of features both within individual specimens and across sample populations. The images, component data, and statistical analyses comprise a bioinformatics database to serve as an architectural blueprint for skin tissue engineering and as a diagnostic standard of comparison for pathologic specimens.

  13. A High-Throughput Automated Microfluidic Platform for Calcium Imaging of Taste Sensing.

    PubMed

    Hsiao, Yi-Hsing; Hsu, Chia-Hsien; Chen, Chihchen

    2016-07-08

    The human enteroendocrine L cell line NCI-H716, expressing taste receptors and taste signaling elements, constitutes a unique model for the studies of cellular responses to glucose, appetite regulation, gastrointestinal motility, and insulin secretion. Targeting these gut taste receptors may provide novel treatments for diabetes and obesity. However, NCI-H716 cells are cultured in suspension and tend to form multicellular aggregates, preventing high-throughput calcium imaging due to interferences caused by laborious immobilization and stimulus delivery procedures. Here, we have developed an automated microfluidic platform that is capable of trapping more than 500 single cells into microwells with a loading efficiency of 77% within two minutes, delivering multiple chemical stimuli and performing calcium imaging with enhanced spatial and temporal resolutions when compared to bath perfusion systems. Results revealed the presence of heterogeneity in cellular responses to the type, concentration, and order of applied sweet and bitter stimuli. Sucralose and denatonium benzoate elicited robust increases in the intracellular Ca(2+) concentration. However, glucose evoked a rapid elevation of intracellular Ca(2+) followed by reduced responses to subsequent glucose stimulation. Using Gymnema sylvestre as a blocking agent for the sweet taste receptor confirmed that different taste receptors were utilized for sweet and bitter tastes. This automated microfluidic platform is cost-effective, easy to fabricate and operate, and may be generally applicable for high-throughput and high-content single-cell analysis and drug screening.

  14. Finding the rhythm of sudden cardiac death: new opportunities using induced pluripotent stem cell-derived cardiomyocytes.

    PubMed

    Sallam, Karim; Li, Yingxin; Sager, Philip T; Houser, Steven R; Wu, Joseph C

    2015-06-05

    Sudden cardiac death is a common cause of death in patients with structural heart disease, genetic mutations, or acquired disorders affecting cardiac ion channels. A wide range of platforms exist to model and study disorders associated with sudden cardiac death. Human clinical studies are cumbersome and are constrained by the extent of investigation that can be performed on human subjects. Animal models are limited by their degree of homology to human cardiac electrophysiology, including ion channel expression. The most commonly used cellular models are transfection models, which mimic the expression of a single ion channel and therefore offer incomplete insight into changes of the action potential profile. Induced pluripotent stem cell-derived cardiomyocytes resemble, but are not identical to, adult human cardiomyocytes and provide a new platform for studying arrhythmic disorders leading to sudden cardiac death. A variety of platforms exist to phenotype cellular models, including conventional and automated patch clamp, multielectrode arrays, and computational modeling. Induced pluripotent stem cell-derived cardiomyocytes have been used to study long QT syndrome, catecholaminergic polymorphic ventricular tachycardia, hypertrophic cardiomyopathy, and other hereditary cardiac disorders. Although induced pluripotent stem cell-derived cardiomyocytes are distinct from adult cardiomyocytes, they provide a robust platform to advance the science and clinical care of sudden cardiac death. © 2015 American Heart Association, Inc.

  15. A quantitative image cytometry technique for time series or population analyses of signaling networks.

    PubMed

    Ozaki, Yu-ichi; Uda, Shinsuke; Saito, Takeshi H; Chung, Jaehoon; Kubota, Hiroyuki; Kuroda, Shinya

    2010-04-01

    Modeling of cellular functions on the basis of experimental observation is increasingly common in the field of cellular signaling. However, such modeling requires a large amount of quantitative data of signaling events with high spatio-temporal resolution. A novel technique which allows us to obtain such data is needed for systems biology of cellular signaling. We developed a fully automatable assay technique, termed quantitative image cytometry (QIC), which integrates a quantitative immunostaining technique and a high precision image-processing algorithm for cell identification. With the aid of an automated sample preparation system, this device can quantify protein expression, phosphorylation and localization with subcellular resolution at one-minute intervals. The signaling activities quantified by the assay system showed good correlation with, as well as comparable reproducibility to, western blot analysis. Taking advantage of the high spatio-temporal resolution, we investigated the signaling dynamics of the ERK pathway in PC12 cells. The QIC technique appears as a highly quantitative and versatile technique, which can be a convenient replacement for the most conventional techniques including western blot, flow cytometry and live cell imaging. Thus, the QIC technique can be a powerful tool for investigating the systems biology of cellular signaling.

  16. Automation of the ELISpot assay for high-throughput detection of antigen-specific T-cell responses.

    PubMed

    Almeida, Coral-Ann M; Roberts, Steven G; Laird, Rebecca; McKinnon, Elizabeth; Ahmed, Imran; Pfafferott, Katja; Turley, Joanne; Keane, Niamh M; Lucas, Andrew; Rushton, Ben; Chopra, Abha; Mallal, Simon; John, Mina

    2009-05-15

    The enzyme-linked immunospot (ELISpot) assay is a fundamental tool in cellular immunology, providing both quantitative and qualitative information on cellular cytokine responses to defined antigens. It enables the comprehensive screening of patient-derived peripheral blood mononuclear cells to reveal the antigenic restriction of T-cell responses and is an emerging technique in the clinical laboratory investigation of certain infectious diseases. As with all cell-based assays, the final results depend on a number of technical variables that may impact precision if not highly standardised between operators. When large-scale studies or studies using multiple antigens are set up manually, these assays can be labour intensive, involve many manual handling steps, are subject to data and sample integrity failures, and may show large inter-operator variability. Here we describe the successful automated performance of the interferon (IFN)-gamma ELISpot assay, from cell counting through to electronic capture of cytokine quantitation, and present the results of a comparison between automated and manual performance of the ELISpot assay. The mean numbers of spot-forming units enumerated by both methods for limiting dilutions of CMV-, EBV- and influenza (CEF)-derived peptides in six healthy individuals were highly correlated (r>0.83, p<0.05). The precision results from the automated system compared favourably with the manual ELISpot and further ensured electronic tracking, increased throughput, and reduced turnaround time.

  17. Automated and Adaptable Quantification of Cellular Alignment from Microscopic Images for Tissue Engineering Applications

    PubMed Central

    Xu, Feng; Beyazoglu, Turker; Hefner, Evan; Gurkan, Umut Atakan

    2011-01-01

    Cellular alignment plays a critical role in functional, physical, and biological characteristics of many tissue types, such as muscle, tendon, nerve, and cornea. Current efforts toward regeneration of these tissues include replicating the cellular microenvironment by developing biomaterials that facilitate cellular alignment. To assess the functional effectiveness of the engineered microenvironments, one essential criterion is quantification of cellular alignment. Therefore, there is a need for rapid, accurate, and adaptable methodologies to quantify cellular alignment for tissue engineering applications. To address this need, we developed an automated method, binarization-based extraction of alignment score (BEAS), to determine cell orientation distribution in a wide variety of microscopic images. This method combines a sequenced application of median and band-pass filters, locally adaptive thresholding approaches, and image-processing techniques. The cellular alignment score is obtained by applying a robust scoring algorithm to the orientation distribution. We validated the BEAS method by comparing the results with the existing approaches reported in the literature (i.e., manual, fast Fourier transform radial sum, and gradient-based approaches). Validation results indicated that the BEAS method resulted in statistically comparable alignment scores with the manual method (coefficient of determination R2=0.92). Therefore, the BEAS method introduced in this study could enable accurate, convenient, and adaptable evaluation of engineered tissue constructs and biomaterials in terms of cellular alignment and organization. PMID:21370940
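    The BEAS scoring algorithm itself is not reproduced in the abstract. One standard way to turn an orientation distribution into a single alignment score, shown here purely as an illustrative stand-in, is the axial order parameter computed on doubled angles (doubling makes an orientation and its 180-degree flip equivalent, as cell axes are):

```python
import math

def alignment_score(angles_deg):
    """Order parameter for axial orientations in [0, 180):
    ~1.0 for perfectly aligned cells, ~0.0 for isotropic orientations."""
    xs = [math.cos(math.radians(2 * a)) for a in angles_deg]
    ys = [math.sin(math.radians(2 * a)) for a in angles_deg]
    return math.hypot(sum(xs) / len(xs), sum(ys) / len(ys))

aligned = alignment_score([44, 45, 46, 45])       # tightly clustered axes
random_like = alignment_score([0, 45, 90, 135])   # evenly spread axes
```

A real pipeline would first extract each cell's principal axis from the segmented image (the filtering and thresholding steps the abstract describes) before scoring the resulting angle list.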

  18. Spatiotemporal dynamics of landscape pattern and hydrologic process in watershed systems

    NASA Astrophysics Data System (ADS)

    Randhir, Timothy O.; Tsvetkova, Olga

    2011-06-01

    Land use change is influenced by spatial and temporal factors that interact with watershed resources. Modeling these changes is critical to evaluate emerging land use patterns and to predict variation in water quantity and quality. The objective of this study is to model the nature and emergence of spatial patterns in land use and water resource impacts using a spatially explicit and dynamic landscape simulation. Temporal changes are predicted using a probabilistic Markovian process and spatial interaction through cellular automata. The MCMC (Markov chain Monte Carlo) analysis with cellular automata is linked to hydrologic equations to simulate landscape patterns and processes. The spatiotemporal watershed dynamics (SWD) model is applied to a subwatershed in the Blackstone River watershed of Massachusetts to predict potential land use changes and expected runoff and sediment loading. Changes in watershed land use and water resources are evaluated over 100 years at a yearly time step. Results show high potential for rapid urbanization that could lower groundwater recharge and increase storm water peaks. The watershed faces potential decreases in agricultural and forest area that affect the open space and pervious cover of the watershed system. Water quality deteriorated due to increased runoff, which can also impact stream morphology. While overland erosion decreased, instream erosion increased owing to greater runoff from urban areas. Use of urban best management practices (BMPs) in sensitive locations, preventive strategies, and long-term conservation planning will be useful in sustaining the watershed system.
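    The Markovian part of such a land-use model can be sketched, without the spatial cellular-automata component and with a hypothetical transition matrix rather than the study's calibrated one, as repeated application of annual transition probabilities to land-use fractions:

```python
def markov_step(state, P):
    """One annual Markov transition. state maps land-use class to its
    areal fraction; P[a][b] is the probability that a cell in class a
    converts to class b during the year."""
    classes = list(state)
    return {b: sum(state[a] * P[a][b] for a in classes) for b in classes}

# hypothetical annual transition probabilities (urbanization is absorbing)
P = {
    "forest":      {"forest": 0.97, "agriculture": 0.01, "urban": 0.02},
    "agriculture": {"forest": 0.00, "agriculture": 0.95, "urban": 0.05},
    "urban":       {"forest": 0.00, "agriculture": 0.00, "urban": 1.00},
}

state = {"forest": 0.6, "agriculture": 0.3, "urban": 0.1}
for _ in range(100):  # 100 years at a yearly time step
    state = markov_step(state, P)
```

With urban land absorbing, the chain drifts toward urbanization over the century, which is the qualitative behavior the abstract reports; the study additionally conditions each cell's transition on its neighborhood via cellular automata.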

  19. Research highlights: microfluidics meets big data.

    PubMed

    Tseng, Peter; Weaver, Westbrook M; Masaeli, Mahdokht; Owsley, Keegan; Di Carlo, Dino

    2014-03-07

    In this issue we highlight a collection of recent work in which microfluidic parallelization and automation have been employed to address the increasing need for large amounts of quantitative data concerning cellular function--from correlating microRNA levels to protein expression, increasing the throughput and reducing the noise when studying protein dynamics in single-cells, and understanding how signal dynamics encodes information. The painstaking dissection of cellular pathways one protein at a time appears to be coming to an end, leading to more rapid discoveries which will inevitably translate to better cellular control--in producing useful gene products and treating disease at the individual cell level. From these studies it is also clear that development of large scale mutant or fusion libraries, automation of microscopy, image analysis, and data extraction will be key components as microfluidics contributes its strengths to aid systems biology moving forward.

  20. Nonlinear dynamics in cardiac conduction

    NASA Technical Reports Server (NTRS)

    Kaplan, D. T.; Smith, J. M.; Saxberg, B. E.; Cohen, R. J.

    1988-01-01

    Electrical conduction in the heart shows many phenomena familiar from nonlinear dynamics. Among these phenomena are multiple basins of attraction, phase locking, and perhaps period-doubling bifurcations and chaos. We describe a simple cellular-automaton model of electrical conduction which simulates normal conduction patterns in the heart as well as a wide range of disturbances of heart rhythm. In addition, we review the application of percolation theory to the analysis of the development of complex, self-sustaining conduction patterns.
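    The authors' conduction rules are not detailed in the abstract. A minimal Greenberg-Hastings-style excitable-medium automaton, given here only as a generic stand-in for this class of model, shows how rest/excited/refractory states propagate activation around a one-dimensional ring of cells:

```python
REST, EXCITED, REFRACTORY = 0, 1, 2

def step(cells):
    """One update of a 1-D excitable ring: a resting cell fires when a
    neighbor is excited; an excited cell becomes refractory; a
    refractory cell returns to rest."""
    n = len(cells)
    out = []
    for i, s in enumerate(cells):
        if s == EXCITED:
            out.append(REFRACTORY)
        elif s == REFRACTORY:
            out.append(REST)
        else:  # resting
            left, right = cells[(i - 1) % n], cells[(i + 1) % n]
            out.append(EXCITED if EXCITED in (left, right) else REST)
    return out

cells = [1, 0, 0, 0, 0, 0]   # a single stimulated cell
history = [cells]
for _ in range(3):
    cells = step(cells)
    history.append(cells)
```

A single stimulus launches two wavefronts that travel in opposite directions and annihilate on collision; richer rule sets on 2-D lattices reproduce reentrant rhythms and other conduction disturbances of the kind the paper studies.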

  1. Automated cellular sample preparation using a Centrifuge-on-a-Chip.

    PubMed

    Mach, Albert J; Kim, Jae Hyun; Arshi, Armin; Hur, Soojung Claire; Di Carlo, Dino

    2011-09-07

    The standard centrifuge is a laboratory instrument widely used by biologists and medical technicians for preparing cell samples. Efforts to automate the operations of concentration, cell separation, and solution exchange that a centrifuge performs in a simpler and smaller platform have had limited success. Here, we present a microfluidic chip that replicates the functions of a centrifuge without moving parts or external forces. The device operates using a purely fluid dynamic phenomenon in which cells selectively enter and are maintained in microscale vortices. Continuous and sequential operation allows enrichment of cancer cells from spiked blood samples at the mL min(-1) scale, followed by fluorescent labeling of intra- and extra-cellular antigens on the cells without the need for manual pipetting and washing steps. A versatile centrifuge-analogue may open opportunities in automated, low-cost and high-throughput sample preparation as an alternative to the standard benchtop centrifuge in standardized clinical diagnostics or resource poor settings.

  2. An Automated Design Framework for Multicellular Recombinase Logic.

    PubMed

    Guiziou, Sarah; Ulliana, Federico; Moreau, Violaine; Leclere, Michel; Bonnet, Jerome

    2018-05-18

    Tools to systematically reprogram cellular behavior are crucial to address pressing challenges in manufacturing, environment, or healthcare. Recombinases can very efficiently encode Boolean and history-dependent logic in many species, yet current designs are performed on a case-by-case basis, limiting their scalability and requiring time-consuming optimization. Here we present an automated workflow for designing recombinase logic devices executing Boolean functions. Our theoretical framework uses a reduced library of computational devices distributed into different cellular subpopulations, which are then composed in various manners to implement all desired logic functions at the multicellular level. Our design platform called CALIN (Composable Asynchronous Logic using Integrase Networks) is broadly accessible via a web server, taking truth tables as inputs and providing corresponding DNA designs and sequences as outputs (available at http://synbio.cbs.cnrs.fr/calin ). We anticipate that this automated design workflow will streamline the implementation of Boolean functions in many organisms and for various applications.
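    CALIN takes truth tables as input. As a toy illustration of that input format only (not the CALIN design algorithm), the ON-set rows of a Boolean function, the input combinations a design tool must make the multicellular system recognize, can be enumerated like this:

```python
from itertools import product

def on_set(fn, n_inputs):
    """Rows of the truth table for which the Boolean function is true."""
    return [bits for bits in product((0, 1), repeat=n_inputs) if fn(*bits)]

# hypothetical target function: output high when exactly one input is high
one_hot = lambda a, b, c: (a + b + c) == 1
rows = on_set(one_hot, 3)
```

CALIN itself then decomposes such a specification into recombinase-based devices distributed across cellular subpopulations and returns the corresponding DNA designs.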

  3. An automated microphysiological assay for toxicity evaluation.

    PubMed

    Eggert, S; Alexander, F A; Wiest, J

    2015-08-01

    Screening a newly developed drug, food additive or cosmetic ingredient for toxicity is a critical preliminary step before it can move forward in the development pipeline. Due to the sometimes dire consequences when a harmful agent is overlooked, toxicologists work under strict guidelines to effectively catalogue and classify new chemical agents. Conventional assays involve long experimental hours and many manual steps that increase the probability of user error; errors that can potentially manifest as inaccurate toxicology results. Automated assays can overcome many potential mistakes that arise due to human error. In the presented work, we created and validated a novel, automated platform for a microphysiological assay that can examine cellular attributes with sensors measuring changes in cellular metabolic rate, oxygen consumption, and vitality mediated by exposure to a potentially toxic agent. The system was validated with low buffer culture medium with varied conductivities that caused changes in the measured impedance on integrated impedance electrodes.

  4. The Development of Design Tools for Fault Tolerant Quantum Dot Cellular Automata Based Logic

    NASA Technical Reports Server (NTRS)

    Armstrong, Curtis D.; Humphreys, William M.

    2003-01-01

    We are developing software to explore the fault tolerance of quantum dot cellular automata gate architectures in the presence of manufacturing variations and device defects. The Topology Optimization Methodology using Applied Statistics (TOMAS) framework extends the capabilities of AQUINAS (A Quantum Interconnected Network Array Simulator) by adding front-end and back-end software and creating an environment that integrates all of these components. The front-end tools establish all simulation parameters, configure the simulation system, automate the Monte Carlo generation of simulation files, and execute the simulation of these files. The back-end tools perform automated data parsing, statistical analysis, and report generation.

  5. Realistic numerical modelling of human head tissue exposure to electromagnetic waves from cellular phones

    NASA Astrophysics Data System (ADS)

    Scarella, Gilles; Clatz, Olivier; Lanteri, Stéphane; Beaume, Grégory; Oudot, Steve; Pons, Jean-Philippe; Piperno, Serge; Joly, Patrick; Wiart, Joe

    2006-06-01

    The ever-rising diffusion of cellular phones has brought about an increased concern for the possible consequences of electromagnetic radiation on human health. Possible thermal effects have been investigated, via experimentation or simulation, by several research projects in the last decade. Concerning numerical modeling, the power absorption in a user's head is generally computed using discretized models built from clinical MRI data. The vast majority of such numerical studies have been conducted using Finite Differences Time Domain methods, although strong limitations of their accuracy are due to heterogeneity, poor definition of the detailed structures of head tissues (staircasing effects), etc. In order to propose numerical modeling using Finite Element or Discontinuous Galerkin Time Domain methods, reliable automated tools for the unstructured discretization of human heads are also needed. Results presented in this article aim at filling the gap between human head MRI images and the accurate numerical modeling of wave propagation in biological tissues and its thermal effects. To cite this article: G. Scarella et al., C. R. Physique 7 (2006).

  6. ALC: automated reduction of rule-based models

    PubMed Central

    Koschorreck, Markus; Gilles, Ernst Dieter

    2008-01-01

    Background: Combinatorial complexity is a challenging problem for the modeling of cellular signal transduction, since the association of a few proteins can give rise to an enormous number of feasible protein complexes. The layer-based approach is an approximate but accurate method for the mathematical modeling of signaling systems with inherent combinatorial complexity. The number of variables in the simulation equations is highly reduced, and the resulting dynamic models show a pronounced modularity. Layer-based modeling allows for the modeling of systems not accessible previously. Results: ALC (Automated Layer Construction) is a computer program that greatly simplifies the building of reduced modular models according to the layer-based approach. The model is defined using a simple but powerful rule-based syntax that supports the concepts of modularity and macrostates. ALC performs consistency checks on the model definition and provides the model output in different formats (C MEX, MATLAB, Mathematica and SBML) as ready-to-run simulation files. ALC also provides additional documentation files that simplify the publication or presentation of the models. The tool can be used offline or via a form on the ALC website. Conclusion: ALC allows for the simple rule-based generation of layer-based reduced models. The model files are provided in different formats as ready-to-run simulation files. PMID:18973705

  7. Applications of pathology-assisted image analysis of immunohistochemistry-based biomarkers in oncology.

    PubMed

    Shinde, V; Burke, K E; Chakravarty, A; Fleming, M; McDonald, A A; Berger, A; Ecsedy, J; Blakemore, S J; Tirrell, S M; Bowman, D

    2014-01-01

    Immunohistochemistry-based biomarkers are commonly used to understand target inhibition in key cancer pathways in preclinical models and clinical studies. Automated slide-scanning and advanced high-throughput image analysis software technologies have evolved into a routine methodology for quantitative analysis of immunohistochemistry-based biomarkers. Alongside the traditional pathology H-score based on physical slides, the pathology world is welcoming digital pathology and advanced quantitative image analysis, which have enabled tissue- and cellular-level analysis. An automated workflow was implemented that includes automated staining, slide-scanning, and image analysis methodologies to explore biomarkers involved in 2 cancer targets: Aurora A and NEDD8-activating enzyme (NAE). The 2 workflows highlight the evolution of our immunohistochemistry laboratory and the different needs and requirements of each biological assay. Skin biopsies obtained from MLN8237 (Aurora A inhibitor) phase 1 clinical trials were evaluated for mitotic and apoptotic index, while mitotic index and defects in chromosome alignment and spindles were assessed in tumor biopsies to demonstrate Aurora A inhibition. Additionally, in both preclinical xenograft models and an acute myeloid leukemia phase 1 trial of the NAE inhibitor MLN4924, development of a novel image algorithm enabled measurement of downstream pathway modulation upon NAE inhibition. In the highlighted studies, developing a biomarker strategy based on automated image analysis solutions enabled project teams to confirm target and pathway inhibition and understand downstream outcomes of target inhibition with increased throughput and quantitative accuracy. These case studies demonstrate a strategy that combines a pathologist's expertise with automated image analysis to support oncology drug discovery and development programs.

  8. An automated digital imaging system for environmental monitoring applications

    USGS Publications Warehouse

    Bogle, Rian; Velasco, Miguel; Vogel, John

    2013-01-01

    Recent improvements in the affordability and availability of high-resolution digital cameras, data loggers, embedded computers, and radio/cellular modems have advanced the development of sophisticated automated systems for remote imaging. Researchers have successfully placed and operated automated digital cameras in remote locations and in extremes of temperature and humidity, ranging from the islands of the South Pacific to the Mojave Desert and the Grand Canyon. With the integration of environmental sensors, these automated systems are able to respond to local conditions and modify their imaging regimes as needed. In this report we describe in detail the design of one type of automated imaging system developed by our group. It is easily replicated, low-cost, highly robust, and is a stand-alone automated camera designed to be placed in remote locations, without wireless connectivity.

  9. G protein-coupled receptor internalization assays in the high-content screening format.

    PubMed

    Haasen, Dorothea; Schnapp, Andreas; Valler, Martin J; Heilker, Ralf

    2006-01-01

    High-content screening (HCS), a combination of fluorescence microscopic imaging and automated image analysis, has become a frequently applied tool to study test compound effects in cellular disease-modeling systems. This chapter describes the measurement of G protein-coupled receptor (GPCR) internalization in the HCS format using a high-throughput, confocal cellular imaging device. GPCRs are the most successful group of therapeutic targets on the pharmaceutical market. Accordingly, the search for compounds that interfere with GPCR function in a specific and selective way is a major focus of the pharmaceutical industry today. This chapter describes methods for the ligand-induced internalization of GPCRs labeled previously with either a fluorophore-conjugated ligand or an antibody directed against an N-terminal tag of the GPCR. Both labeling techniques produce robust assay formats. Complementary to other functional GPCR drug discovery assays, internalization assays enable a pharmacological analysis of test compounds. We conclude that GPCR internalization assays represent a valuable medium/high-throughput screening format to determine the cellular activity of GPCR ligands.

  10. Aquatic models, genomics and chemical risk management.

    PubMed

    Cheng, Keith C; Hinton, David E; Mattingly, Carolyn J; Planchart, Antonio

    2012-01-01

    The 5th Aquatic Animal Models for Human Disease meeting follows four previous meetings (Nairn et al., 2001; Schmale, 2004; Schmale et al., 2007; Hinton et al., 2009) in which advances in aquatic animal models for human disease research were reported, and community discussion of future direction was pursued. At this meeting, discussion at a workshop entitled Bioinformatics and Computational Biology with Web-based Resources (20 September 2010) led to an important conclusion: Aquatic model research using feral and experimental fish, in combination with web-based access to annotated anatomical atlases and toxicological databases, yields data that advance our understanding of human gene function, and can be used to facilitate environmental management and drug development. We propose here that the effects of genes and environment are best appreciated within an anatomical context - the specifically affected cells and organs in the whole animal. We envision the use of automated, whole-animal imaging at cellular resolution and computational morphometry facilitated by high-performance computing and automated entry into toxicological databases, as anchors for genetic and toxicological data, and as connectors between human and model system data. These principles should be applied to both laboratory and feral fish populations, which have been virtually irreplaceable sentinels for environmental contamination that results in human morbidity and mortality. We conclude that automation, database generation, and web-based accessibility, facilitated by genomic/transcriptomic data and high-performance and cloud computing, will potentiate the unique and potentially key roles that aquatic models play in advancing systems biology, drug development, and environmental risk management. Copyright © 2011 Elsevier Inc. All rights reserved.

  11. Detecting the Extent of Cellular Decomposition after Sub-Eutectoid Annealing in Rolled UMo Foils

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kautz, Elizabeth J.; Jana, Saumyadeep; Devaraj, Arun

    2017-07-31

    This report presents an automated image processing approach to quantifying microstructure image data, specifically the extent of eutectoid (cellular) decomposition in rolled U-10Mo foils. The approach quantitatively describes microstructure image data so that microstructure can be related to processing parameters (time, temperature, deformation).
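
    As a sketch of the kind of quantification described, a simple global threshold can stand in for the report's segmentation step, with the fraction of "decomposed" pixels as the output metric. The function, threshold, and pixel values below are illustrative, not from the report:

```python
import numpy as np

def decomposed_area_fraction(image, threshold):
    """Fraction of pixels classified as eutectoid (cellular)
    decomposition product: pixels at or above `threshold` in a 2-D
    grayscale micrograph count as decomposed."""
    mask = np.asarray(image) >= threshold
    return mask.mean()

# Synthetic 4x4 micrograph: 5 of 16 pixels read as "decomposed"
img = np.array([[0, 0, 200, 210],
                [0, 0, 220, 0],
                [0, 190, 0, 0],
                [0, 0, 0, 230]])
print(decomposed_area_fraction(img, 128))  # -> 0.3125
```

    Real pipelines replace the fixed threshold with an adaptive segmentation step, but the downstream area-fraction statistic is computed the same way.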

  12. Image analysis in cytology: DNA-histogramming versus cervical smear prescreening.

    PubMed

    Bengtsson, E W; Nordin, B

    1993-01-01

    The visual inspection of cellular specimens and histological sections through a light microscope plays an important role in clinical medicine and biomedical research. The human visual system is very good at the recognition of various patterns but less efficient at quantitative assessment of these patterns. Some samples are prepared in great numbers, most notably the screening for cervical cancer, the so-called PAP-smears, which results in hundreds of millions of samples each year, creating a tedious mass inspection task. Numerous attempts have been made over the last 40 years to create systems that solve these two tasks, the quantitative supplement to the human visual system and the automation of mass screening. The most difficult task, the total automation, has received the greatest attention with many large scale projects over the decades. In spite of all these efforts, still no generally accepted automated prescreening device exists on the market. The main reason for this failure is the great pattern recognition capabilities needed to distinguish between cancer cells and all other kinds of objects found in the specimens: cellular clusters, debris, degenerate cells, etc. Improved algorithms, the ever-increasing processing power of computers and progress in biochemical specimen preparation techniques make it likely that eventually useful automated prescreening systems will become available. Meanwhile, much less effort has been put into the development of interactive cell image analysis systems. Still, some such systems have been developed and put into use at thousands of laboratories worldwide. In these the human pattern recognition capability is used to select the fields and objects that are to be analysed while the computational power of the computer is used for the quantitative analysis of cellular DNA content or other relevant markers. 
Numerous studies have shown that the quantitative information about the distribution of cellular DNA content is of prognostic significance in many types of cancer. Several laboratories are therefore putting these techniques into routine clinical use. The more advanced systems can also study many other markers and cellular features, some known to be of clinical interest, others useful in research. The advances in computer technology are making these systems more generally available through decreasing cost, increasing computational power and improved user interfaces. We have been involved in research and development of both automated and interactive cell analysis systems during the last 20 years. Here some experiences and conclusions from this work will be presented as well as some predictions about what can be expected in the near future.

  13. Advances in the simulation and automated measurement of well-sorted granular material: 1. Simulation

    USGS Publications Warehouse

    Daniel Buscombe,; Rubin, David M.

    2012-01-01

    In this, the first of a pair of papers which address the simulation and automated measurement of well-sorted natural granular material, a method is presented for simulation of two-phase (solid, void) assemblages of discrete non-cohesive particles. The purpose is to have a flexible, yet computationally and theoretically simple, suite of tools with well constrained and well known statistical properties, in order to simulate realistic granular material as a discrete element model with realistic size and shape distributions, for a variety of purposes. The stochastic modeling framework is based on three-dimensional tessellations with variable degrees of order in particle-packing arrangement. Examples of sediments with a variety of particle size distributions and spatial variability in grain size are presented. The relationship between particle shape and porosity conforms to published data. The immediate application is testing new algorithms for automated measurements of particle properties (mean and standard deviation of particle sizes, and apparent porosity) from images of natural sediment, as detailed in the second of this pair of papers. The model could also prove useful for simulating specific depositional structures found in natural sediments, the result of physical alterations to packing and grain fabric, using discrete particle flow models. While the principal focus here is on naturally occurring sediment and sedimentary rock, the methods presented might also be useful for simulations of similar granular or cellular material encountered in engineering, industrial and life sciences.

  14. Laser scanning cytometry for automation of the micronucleus assay

    PubMed Central

    Darzynkiewicz, Zbigniew; Smolewski, Piotr; Holden, Elena; Luther, Ed; Henriksen, Mel; François, Maxime; Leifert, Wayne; Fenech, Michael

    2011-01-01

    Laser scanning cytometry (LSC) provides a novel approach for automated scoring of micronuclei (MN) in different types of mammalian cells, serving as a biomarker of genotoxicity and mutagenicity. In this review, we discuss the advances to date in measuring MN in cell lines, buccal cells and erythrocytes, describe the advantages and outline potential challenges of this distinctive approach to the analysis of nuclear anomalies. The use of multiple laser wavelengths in LSC and the high dynamic range of fluorescence and absorption detection allow simultaneous measurement of multiple cellular and nuclear features such as cytoplasmic area, nuclear area, DNA content and density of nuclei and MN, protein content and density of cytoplasm as well as other features using molecular probes. This high-content analysis approach allows the cells of interest to be identified (e.g. binucleated cells in cytokinesis-blocked cultures) and MN scored specifically in them. MN assays in cell lines (e.g. the CHO cell MN assay) using LSC are increasingly used in routine toxicology screening. More high-content MN assays and the expansion of MN analysis by LSC to other models (i.e. exfoliated cells, dermal cell models, etc.) hold great promise for robust and exciting developments in MN assay automation as a high-content high-throughput analysis procedure. PMID:21164197
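
    The gating logic described, identifying the cells of interest (binucleated cells in cytokinesis-blocked cultures) and scoring MN only in them, can be sketched over a hypothetical per-cell feature table of the kind an LSC run produces. The records and field names below are invented for illustration:

```python
# Each record is one scored cell: nuclei count and micronuclei count
# (values invented for illustration, not LSC output).
cells = [
    {"nuclei": 2, "mn": 0},
    {"nuclei": 2, "mn": 1},
    {"nuclei": 1, "mn": 0},
    {"nuclei": 2, "mn": 2},
    {"nuclei": 1, "mn": 1},
]

def mn_per_binucleated(cells):
    """Score MN only in binucleated cells, as in the
    cytokinesis-block assay: gate on nuclei == 2, then count MN."""
    binucleated = [c for c in cells if c["nuclei"] == 2]
    total_mn = sum(c["mn"] for c in binucleated)
    return total_mn, len(binucleated)

mn, n_bn = mn_per_binucleated(cells)
print(f"{mn} MN in {n_bn} binucleated cells")  # 3 MN in 3 binucleated cells
```

    The same gate-then-score pattern extends to any of the other morphological features the review mentions (cytoplasmic area, DNA content, etc.).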

  15. Impedance-based cellular assays for regenerative medicine.

    PubMed

    Gamal, W; Wu, H; Underwood, I; Jia, J; Smith, S; Bagnaninchi, P O

    2018-07-05

    Therapies based on regenerative techniques have the potential to radically improve healthcare in the coming years. As a result, there is an emerging need for non-destructive and label-free technologies to assess the quality of engineered tissues and cell-based products prior to their use in the clinic. In parallel, the emerging regenerative medicine industry that aims to produce stem cells and their progeny on a large scale will benefit from moving away from existing destructive biochemical assays towards data-driven automation and control at the industrial scale. Impedance-based cellular assays (IBCA) have emerged as an alternative approach to study stem-cell properties and cumulative studies, reviewed here, have shown their potential to monitor stem-cell renewal, differentiation and maturation. They offer a novel method to non-destructively assess and quality-control stem-cell cultures. In addition, when combined with in vitro disease models they provide complementary insights as label-free phenotypic assays. IBCA provide quantitative and very sensitive results that can easily be automated and up-scaled in multi-well format. When facing the emerging challenge of real-time monitoring of three-dimensional cell culture, dielectric spectroscopy and electrical impedance tomography represent viable alternatives to two-dimensional impedance sensing. This article is part of the theme issue 'Designer human tissue: coming to a lab near you'. © 2018 The Author(s).
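
    A common way impedance time courses are reduced to a quantitative readout is a normalized, dimensionless "cell index"-style value. The normalization constant and the readings below are assumptions for illustration, not taken from the article, and real instruments each define their own normalization:

```python
def cell_index(z_with_cells, z_background, z0=15.0):
    """Normalized impedance readout of cell coverage.

    z0 is an instrument-dependent normalization constant (assumed
    here for illustration); the index is clamped at zero so an
    empty well reads 0."""
    return max((z_with_cells - z_background) / z0, 0.0)

# Simulated time course (hours, ohms): impedance rises as cells
# attach and spread over the electrodes.
readings = [(0, 100.0), (2, 130.0), (4, 190.0), (8, 250.0)]
background = 100.0
course = [(t, cell_index(z, background)) for t, z in readings]
print(course)  # [(0, 0.0), (2, 2.0), (4, 6.0), (8, 10.0)]
```

    Renewal, differentiation, and maturation then show up as characteristic trajectories of this index over days of culture.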

  16. Detection and quantification of intracellular bacterial colonies by automated, high-throughput microscopy.

    PubMed

    Ernstsen, Christina L; Login, Frédéric H; Jensen, Helene H; Nørregaard, Rikke; Møller-Jensen, Jakob; Nejsum, Lene N

    2017-08-01

    To target bacterial pathogens that invade and proliferate inside host cells, it is necessary to design intervention strategies directed against bacterial attachment, cellular invasion and intracellular proliferation. We present an automated microscopy-based, fast, high-throughput method for analyzing size and number of intracellular bacterial colonies in infected tissue culture cells. Cells are seeded in 48-well plates and infected with a GFP-expressing bacterial pathogen. Following gentamicin treatment to remove extracellular pathogens, cells are fixed and cell nuclei stained. This is followed by automated microscopy and subsequent semi-automated spot detection to determine the number of intracellular bacterial colonies, their size distribution, and the average number per host cell. Multiple 48-well plates can be processed sequentially and the procedure can be completed in one working day. As a model we quantified intracellular bacterial colonies formed by uropathogenic Escherichia coli (UPEC) during infection of human kidney cells (HKC-8). Urinary tract infections caused by UPEC are among the most common bacterial infectious diseases in humans. UPEC can colonize tissues of the urinary tract and is responsible for acute, chronic, and recurrent infections. In the bladder, UPEC can form intracellular quiescent reservoirs, thought to be responsible for recurrent infections. In the kidney, UPEC can colonize renal epithelial cells and pass to the blood stream, either via epithelial cell disruption or transcellular passage, to cause sepsis. Intracellular colonies are known to be clonal, originating from single invading UPEC. In our experimental setup, we found UPEC CFT073 intracellular bacterial colonies to be heterogeneous in size and present in nearly one third of the HKC-8 cells. 
This high-throughput experimental format substantially reduces experimental time and enables fast screening of the intracellular bacterial load and cellular distribution of multiple bacterial isolates. This will be a powerful experimental tool facilitating the study of bacterial invasion, drug resistance, and the development of new therapeutics. Copyright © 2017 Elsevier B.V. All rights reserved.
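
    The spot-detection step, thresholding the GFP channel and grouping adjacent bright pixels into colonies, can be sketched as a minimal 4-connectivity flood fill on toy data. This is a schematic of the idea, not the authors' pipeline, and the threshold and array values are invented:

```python
import numpy as np

def count_colonies(gfp, threshold, n_nuclei):
    """Threshold the GFP channel, group adjacent bright pixels into
    colonies (4-connectivity flood fill), and report colony count,
    colony sizes in pixels, and mean colonies per host cell."""
    mask = np.asarray(gfp) > threshold
    seen = np.zeros_like(mask, dtype=bool)
    sizes = []
    rows, cols = mask.shape
    for r in range(rows):
        for c in range(cols):
            if mask[r, c] and not seen[r, c]:
                stack, size = [(r, c)], 0
                seen[r, c] = True
                while stack:
                    y, x = stack.pop()
                    size += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny, nx] and not seen[ny, nx]):
                            seen[ny, nx] = True
                            stack.append((ny, nx))
                sizes.append(size)
    return len(sizes), sizes, len(sizes) / n_nuclei

# Toy image: two GFP spots, four detected nuclei in the field
gfp = np.zeros((8, 8))
gfp[1:3, 1:3] = 50   # colony of 4 pixels
gfp[5, 5] = 80       # colony of 1 pixel
n, sizes, per_cell = count_colonies(gfp, 10, n_nuclei=4)
print(n, sizes, per_cell)  # 2 [4, 1] 0.5
```

    The per-colony size list is what yields the size distribution reported in the study, and dividing by the nucleus count gives the average colonies per host cell.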

  17. Building cell models and simulations from microscope images.

    PubMed

    Murphy, Robert F

    2016-03-01

    The use of fluorescence microscopy has undergone a major revolution over the past twenty years, both with the development of dramatic new technologies and with the widespread adoption of image analysis and machine learning methods. Many open source software tools provide the ability to use these methods in a wide range of studies, and many molecular and cellular phenotypes can now be automatically distinguished. This article presents the next major challenge in microscopy automation, the creation of accurate models of cell organization directly from images, and reviews the progress that has been made towards this challenge. Copyright © 2015 Elsevier Inc. All rights reserved.

  18. High-throughput 3D whole-brain quantitative histopathology in rodents

    PubMed Central

    Vandenberghe, Michel E.; Hérard, Anne-Sophie; Souedet, Nicolas; Sadouni, Elmahdi; Santin, Mathieu D.; Briet, Dominique; Carré, Denis; Schulz, Jocelyne; Hantraye, Philippe; Chabrier, Pierre-Etienne; Rooney, Thomas; Debeir, Thomas; Blanchard, Véronique; Pradier, Laurent; Dhenain, Marc; Delzescaux, Thierry

    2016-01-01

    Histology is the gold standard to unveil microscopic brain structures and pathological alterations in humans and animal models of disease. However, due to tedious manual interventions, quantification of histopathological markers is classically performed on a few tissue sections, thus restricting measurements to limited portions of the brain. Recently developed 3D microscopic imaging techniques have allowed in-depth study of neuroanatomy. However, quantitative methods are still lacking for whole-brain analysis of cellular and pathological markers. Here, we propose a ready-to-use, automated, and scalable method to thoroughly quantify histopathological markers in 3D in rodent whole brains. It relies on block-face photography, serial histology and 3D-HAPi (Three Dimensional Histology Analysis Pipeline), an open source image analysis software. We illustrate our method in studies involving mouse models of Alzheimer’s disease and show that it can be broadly applied to characterize animal models of brain diseases, to evaluate therapeutic interventions, to anatomically correlate cellular and pathological markers throughout the entire brain and to validate in vivo imaging techniques. PMID:26876372

  19. Comparison of cell counting methods in rodent pulmonary toxicity studies: automated and manual protocols and considerations for experimental design

    PubMed Central

    Zeidler-Erdely, Patti C.; Antonini, James M.; Meighan, Terence G.; Young, Shih-Houng; Eye, Tracy J.; Hammer, Mary Ann; Erdely, Aaron

    2016-01-01

    Pulmonary toxicity studies often use bronchoalveolar lavage (BAL) to investigate potential adverse lung responses to a particulate exposure. The BAL cellular fraction is counted, using automated (i.e. Coulter Counter®), flow cytometry or manual (i.e. hemocytometer) methods, to determine inflammatory cell influx. The goal of the study was to compare the different counting methods to determine which is optimal for examining BAL cell influx after exposure by inhalation or intratracheal instillation (ITI) to different particles with varying inherent pulmonary toxicities in both rat and mouse models. General findings indicate that total BAL cell counts using the automated and manual methods tended to agree after inhalation or ITI exposure to particle samples that are relatively nontoxic or at later time points after exposure to a pneumotoxic particle when the response resolves. However, when the initial lung inflammation and cytotoxicity was high after exposure to a pneumotoxic particle, significant differences were observed when comparing cell counts from the automated, flow cytometry and manual methods. When using total BAL cell count for differential calculations from the automated method, depending on the cell diameter size range cutoff, the data suggest that the number of lung polymorphonuclear leukocytes (PMN) varies. Importantly, the automated counts, regardless of the size cutoff, still indicated a greater number of total lung PMN when compared with the manual method, which agreed more closely with flow cytometry. The results suggest that either the manual method or flow cytometry would be better suited for BAL studies where cytotoxicity is an unknown variable. PMID:27251196
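
    The dependence of automated differential counts on the chosen diameter cutoff can be illustrated with a toy size gate over invented cell diameters (the values and windows below are hypothetical, not data from the study):

```python
# Simulated BAL cell diameters in micrometers (invented values):
# lymphocytes roughly 7-9, PMN roughly 10-12, macrophages 14+.
diameters = [7.5, 8.2, 9.5, 10.5, 11.0, 11.8, 15.0, 16.5, 18.0, 10.9]

def count_in_range(diams, lo, hi):
    """Automated-counter style gating: count cells whose diameter
    falls inside the half-open [lo, hi) size window."""
    return sum(lo <= d < hi for d in diams)

# The inferred PMN count shifts with the cutoff, as the abstract notes:
print(count_in_range(diameters, 10, 13))  # -> 4
print(count_in_range(diameters, 9, 13))   # -> 5
```

    Moving the lower cutoff by one micrometer changes the apparent PMN count, which is why the study recommends manual or flow-cytometry counts when cytotoxicity (and hence cell-size distribution) is an unknown variable.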

  1. Microfluidic Sample Preparation for Diagnostic Cytopathology

    PubMed Central

    Mach, Albert J.; Adeyiga, Oladunni B.; Di Carlo, Dino

    2014-01-01

    The cellular components of body fluids are routinely analyzed to identify disease and treatment approaches. While significant focus has been placed on developing cell analysis technologies, tools to automate the preparation of cellular specimens have been more limited, especially for body fluids beyond blood. Preparation steps include separating, concentrating, and exposing cells to reagents. Sample preparation continues to be routinely performed off-chip by technicians, preventing cell-based point-of-care diagnostics, increasing the cost of tests, and reducing the consistency of the final analysis following multiple manually-performed steps. Here, we review the assortment of biofluids for which suspended cells are analyzed, along with their characteristics and diagnostic value. We present an overview of the conventional sample preparation processes for cytological diagnosis. We finally discuss the challenges and opportunities in developing microfluidic devices for the purpose of automating or miniaturizing these processes, with particular emphases on preparing large or small volume samples, working with samples of high cellularity, automating multi-step processes, and obtaining high purity subpopulations of cells. We hope to convey the importance of and help identify new research directions addressing the vast biological and clinical applications in preparing and analyzing the array of available biological fluids. Successfully addressing the challenges described in this review can lead to inexpensive systems to improve diagnostic accuracy while simultaneously reducing overall systemic healthcare costs. PMID:23380972

  2. Automated processing of label-free Raman microscope images of macrophage cells with standardized regression for high-throughput analysis.

    PubMed

    Milewski, Robert J; Kumagai, Yutaro; Fujita, Katsumasa; Standley, Daron M; Smith, Nicholas I

    2010-11-19

    Macrophages represent the front lines of our immune system; they recognize and engulf pathogens or foreign particles thus initiating the immune response. Imaging macrophages presents unique challenges, as most optical techniques require labeling or staining of the cellular compartments in order to resolve organelles, and such stains or labels have the potential to perturb the cell, particularly in cases where incomplete information exists regarding the precise cellular reaction under observation. Label-free imaging techniques such as Raman microscopy are thus valuable tools for studying the transformations that occur in immune cells upon activation, both on the molecular and organelle levels. Due to extremely low signal levels, however, Raman microscopy requires sophisticated image processing techniques for noise reduction and signal extraction. To date, efficient, automated algorithms for resolving sub-cellular features in noisy, multi-dimensional image sets have not been explored extensively. We show that hybrid z-score normalization and standard regression (Z-LSR) can highlight the spectral differences within the cell and provide image contrast dependent on spectral content. In contrast to typical Raman imaging processing methods using multivariate analysis, such as singular value decomposition (SVD), our implementation of the Z-LSR method can operate nearly in real-time. In spite of its computational simplicity, Z-LSR can automatically remove background and bias in the signal, improve the resolution of spatially distributed spectral differences and enable sub-cellular features to be resolved in Raman microscopy images of mouse macrophage cells. Significantly, the Z-LSR processed images automatically exhibited subcellular architectures whereas SVD, in general, requires human assistance in selecting the components of interest.
The computational efficiency of Z-LSR enables automated resolution of sub-cellular features in large Raman microscopy data sets without compromise in image quality or information loss in associated spectra. These results motivate further use of label-free microscopy techniques in real-time imaging of live immune cells.
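
    The general idea behind Z-LSR, z-score normalization of each pixel spectrum followed by a per-pixel least-squares regression that turns spectral content into image contrast, can be sketched as follows. This is a schematic of the approach under simplified assumptions (synthetic spectra, a single reference band), not the authors' implementation:

```python
import numpy as np

def zscore(spectra):
    """Z-score each spectrum (row) to zero mean, unit variance."""
    mu = spectra.mean(axis=1, keepdims=True)
    sd = spectra.std(axis=1, keepdims=True)
    return (spectra - mu) / sd

def regress_contrast(spectra, reference):
    """Per-pixel least-squares coefficient of the z-scored reference
    spectrum; the coefficient becomes the pixel's image contrast."""
    z = zscore(spectra)
    ref = (reference - reference.mean()) / reference.std()
    # closed-form slope of y ~ b*ref for every pixel at once
    return z @ ref / (ref @ ref)

rng = np.random.default_rng(0)
reference = np.sin(np.linspace(0, 3, 64))          # stand-in band shape
pix_with = reference * 5 + rng.normal(0, 0.1, 64)  # pixel containing it
pix_without = rng.normal(0, 0.1, 64)               # background pixel
contrast = regress_contrast(np.vstack([pix_with, pix_without]), reference)
print(contrast.round(2))  # first pixel scores near 1, background near 0
```

    Because the per-pixel fit is a single matrix product, the whole image is processed in one vectorized step, which is what makes near-real-time operation plausible compared with iterative multivariate decompositions.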

  3. A grid matrix-based Raman spectroscopic method to characterize different cell milieu in biopsied axillary sentinel lymph nodes of breast cancer patients.

    PubMed

    Som, Dipasree; Tak, Megha; Setia, Mohit; Patil, Asawari; Sengupta, Amit; Chilakapati, C Murali Krishna; Srivastava, Anurag; Parmar, Vani; Nair, Nita; Sarin, Rajiv; Badwe, R

    2016-01-01

    Raman spectroscopy, which is based upon inelastic scattering of photons, has the potential to emerge as a noninvasive bedside in vivo or ex vivo molecular diagnostic tool. There is a need to improve the sensitivity and predictability of Raman spectroscopy. We developed a grid matrix-based tissue mapping protocol to acquire cellular-specific spectra that also involved digital microscopy for localizing malignant and lymphocytic cells in sentinel lymph node biopsy samples. Biosignals acquired from specific cellular milieu were subjected to advanced supervised analytical methods, i.e., cross-correlation and peak-to-peak ratio, in addition to PCA and PC-LDA. We observed decreased spectral intensity as well as shifts in the spectral peaks of amide and lipid bands in the completely metastatic (cancer cells) lymph nodes with high cellular density. The spectral library of normal lymphocytes and metastatic cancer cells created using this cellular-specific mapping technique can be utilized to develop an automated smart diagnostic tool for bench-side screening of sampled lymph nodes, supported by ongoing global research in developing better technology and signal and big data processing algorithms.
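
    The two supervised measures mentioned, cross-correlation against a library spectrum and a peak-to-peak intensity ratio, can be sketched on synthetic spectra. Band positions, window indices, and shapes below are illustrative, not the bands analyzed in the study:

```python
import numpy as np

def peak_ratio(spectrum, band_a, band_b):
    """Peak-to-peak intensity ratio between two bands, each given as
    a (start, stop) index window; a simple scalar discriminant for
    comparing cell milieus."""
    a = spectrum[band_a[0]:band_a[1]].max()
    b = spectrum[band_b[0]:band_b[1]].max()
    return a / b

def normalized_xcorr(s1, s2):
    """Zero-lag normalized cross-correlation between two spectra,
    used to score similarity against a library entry (1.0 = identical
    shape)."""
    s1 = (s1 - s1.mean()) / s1.std()
    s2 = (s2 - s2.mean()) / s2.std()
    return float(s1 @ s2) / len(s1)

x = np.linspace(0, 10, 200)
ref = np.exp(-(x - 3) ** 2) + 0.5 * np.exp(-(x - 7) ** 2)   # library spectrum
test = np.exp(-(x - 3) ** 2) + 0.2 * np.exp(-(x - 7) ** 2)  # altered milieu
print(round(peak_ratio(ref, (40, 80), (120, 160)), 2))   # -> 2.0
print(round(normalized_xcorr(ref, ref), 2))              # -> 1.0
```

    A drop in the cross-correlation score or a shift in the peak ratio relative to the library entry is the kind of signal such a screening tool would flag.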

  4. Statewide Cellular Coverage Map

    DOT National Transportation Integrated Search

    2002-02-01

    The role of wireless communications in transportation is becoming increasingly important. Wireless communications are critical for many applications of Intelligent Transportation Systems (ITS) such as Automatic Vehicle Location (AVL) and Automated Co...

  5. Fast and accurate automated cell boundary determination for fluorescence microscopy

    NASA Astrophysics Data System (ADS)

    Arce, Stephen Hugo; Wu, Pei-Hsun; Tseng, Yiider

    2013-07-01

    Detailed measurement of cell phenotype information from digital fluorescence images has the potential to greatly advance biomedicine in various disciplines such as patient diagnostics or drug screening. Yet, the complexity of cell conformations presents a major barrier preventing effective determination of cell boundaries, and introduces measurement error that propagates throughout subsequent assessment of cellular parameters and statistical analysis. State-of-the-art image segmentation techniques that require user-interaction, prolonged computation time and specialized training cannot adequately provide the support for high content platforms, which often sacrifice resolution to foster the speedy collection of massive amounts of cellular data. This work introduces a strategy that allows us to rapidly obtain accurate cell boundaries from digital fluorescent images in an automated format. Hence, this new method has broad applicability to promote biotechnology.
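
    A minimal sketch of automated boundary determination from an already-segmented binary cell mask (not the authors' algorithm): mark as boundary every mask pixel that has at least one 4-neighbour outside the mask.

```python
import numpy as np

def boundary_pixels(mask):
    """Boundary of a binary cell mask: pixels inside the mask whose
    4-neighbourhood leaves the mask. Interior pixels are those whose
    four neighbours are all True; the boundary is the rest."""
    m = np.asarray(mask, dtype=bool)
    padded = np.pad(m, 1)  # pad with False so edges count as outside
    interior = (padded[1:-1, 2:] & padded[1:-1, :-2] &
                padded[2:, 1:-1] & padded[:-2, 1:-1])
    return m & ~interior

mask = np.zeros((7, 7), dtype=bool)
mask[1:6, 1:6] = True        # 5x5 square "cell"
edge = boundary_pixels(mask)
print(int(edge.sum()))       # -> 16 boundary pixels of a 5x5 square
```

    Because the operation is a handful of vectorized array shifts, it runs in milliseconds even on large fields of view, which is the kind of speed an automated high-content platform needs.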

  6. Human breast cancer histoid: an in vitro 3-dimensional co-culture model that mimics breast cancer tissue.

    PubMed

    Kaur, Pavinder; Ward, Brenda; Saha, Baisakhi; Young, Lillian; Groshen, Susan; Techy, Geza; Lu, Yani; Atkinson, Roscoe; Taylor, Clive R; Ingram, Marylou; Imam, S Ashraf

    2011-12-01

    Progress in our understanding of heterotypic cellular interaction in the tumor microenvironment, which is recognized to play major roles in cancer progression, has been hampered due to unavailability of an appropriate in vitro co-culture model. The aim of this study was to generate an in vitro 3-dimensional human breast cancer model, which consists of cancer cells and fibroblasts. Breast cancer cells (UACC-893) and fibroblasts at various densities were co-cultured in a rotating suspension culture system to establish co-culture parameters. Subsequently, UACC-893, BT.20, or MDA.MB.453 were co-cultured with fibroblasts for 9 days. Co-cultures resulted in the generation of breast cancer histoid (BCH) with cancer cells showing the invasion of fibroblast spheroids, which were visualized by immunohistochemical (IHC) staining of sections (4 µm thick) of BCH. A reproducible quantitative expression of C-erbB.2 was detected in UACC-893 cancer cells in BCH sections by IHC staining and the Automated Cellular Imaging System. BCH sections also consistently exhibited qualitative expression of pancytokeratins, p53, Ki-67, or E-cadherin in cancer cells and that of vimentin or GSTPi in fibroblasts, fibronectin in the basement membrane and collagen IV in the extracellular matrix. The expression of the protein analytes and cellular architecture of BCH were markedly similar to those of breast cancer tissue.

  7. Investigating the feasibility of scale up and automation of human induced pluripotent stem cells cultured in aggregates in feeder free conditions☆

    PubMed Central

    Soares, Filipa A.C.; Chandra, Amit; Thomas, Robert J.; Pedersen, Roger A.; Vallier, Ludovic; Williams, David J.

    2014-01-01

    The transfer of a laboratory process into a manufacturing facility is one of the most critical steps required for the large scale production of cell-based therapy products. This study describes the first published protocol for scalable automated expansion of human induced pluripotent stem cell lines growing in aggregates in feeder-free and chemically defined medium. Cells were successfully transferred between different sites representative of research and manufacturing settings; and passaged manually and using the CompacT SelecT automation platform. Modified protocols were developed for the automated system and the management of cell aggregates (clumps) was identified as the critical step. Cellular morphology, pluripotency gene expression and differentiation into the three germ layers have been used to compare the outcomes of manual and automated processes. PMID:24440272

  8. Automated quantitative cytological analysis using portable microfluidic microscopy.

    PubMed

    Jagannadh, Veerendra Kalyan; Murthy, Rashmi Sreeramachandra; Srinivasan, Rajesh; Gorthi, Sai Siva

    2016-06-01

    In this article, a portable microfluidic microscopy based approach for automated cytological investigations is presented. Inexpensive optical and electronic components have been used to construct a simple microfluidic microscopy system. In contrast to the conventional slide-based methods, the presented method employs microfluidics to enable automated sample handling and image acquisition. The approach involves the use of simple in-suspension staining and automated image acquisition to enable quantitative cytological analysis of samples. The applicability of the presented approach to research in cellular biology is shown by performing an automated cell viability assessment on a given population of yeast cells. Further, the relevance of the presented approach to clinical diagnosis and prognosis has been demonstrated by performing detection and differential assessment of malaria infection in a given sample. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. Automated batch fiducial-less tilt-series alignment in Appion using Protomo

    PubMed Central

    Noble, Alex J.; Stagg, Scott M.

    2015-01-01

    The field of electron tomography has benefited greatly from manual and semi-automated approaches to marker-based tilt-series alignment that have allowed for the structural determination of multitudes of in situ cellular structures as well as macromolecular structures of individual protein complexes. The emergence of complementary metal-oxide semiconductor detectors capable of detecting individual electrons has enabled the collection of low dose, high contrast images, opening the door for reliable correlation-based tilt-series alignment. Here we present a set of automated, correlation-based tilt-series alignment, contrast transfer function (CTF) correction, and reconstruction workflows for use in conjunction with the Appion/Leginon package that are primarily targeted at automating structure determination with cryogenic electron microscopy. PMID:26455557
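
    The core operation behind correlation-based tilt-series alignment, estimating the translation between neighbouring tilt images, can be sketched with FFT phase correlation. This sketch handles integer shifts on a synthetic image only; production tools add subpixel refinement, filtering, and the tilt geometry:

```python
import numpy as np

def phase_correlation_shift(a, b):
    """Estimate the integer (dy, dx) translation mapping image `b`
    onto image `a` by phase correlation: whiten the cross-power
    spectrum, inverse-transform, and locate the correlation peak."""
    F = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
    F /= np.abs(F) + 1e-12               # keep phase only
    corr = np.fft.ifft2(F).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = a.shape                       # map to signed shifts
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)

rng = np.random.default_rng(1)
img = rng.normal(size=(32, 32))
shifted = np.roll(np.roll(img, 3, axis=0), -5, axis=1)  # known shift
print(phase_correlation_shift(shifted, img))  # -> (3, -5)
```

    High-contrast direct-detector images make this correlation peak sharp enough to replace fiducial markers, which is the premise of the fiducial-less workflow described above.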

  10. An Automated High-Throughput System to Fractionate Plant Natural Products for Drug Discovery

    PubMed Central

    Tu, Ying; Jeffries, Cynthia; Ruan, Hong; Nelson, Cynthia; Smithson, David; Shelat, Anang A.; Brown, Kristin M.; Li, Xing-Cong; Hester, John P.; Smillie, Troy; Khan, Ikhlas A.; Walker, Larry; Guy, Kip; Yan, Bing

    2010-01-01

    The development of an automated, high-throughput fractionation procedure to prepare and analyze natural product libraries for drug discovery screening is described. Natural products obtained from plant materials worldwide were extracted and first prefractionated on polyamide solid-phase extraction cartridges to remove polyphenols, followed by high-throughput automated fractionation, drying, weighing, and reformatting for screening and storage. The analysis of fractions with UPLC coupled with MS, PDA and ELSD detectors provides information that facilitates characterization of compounds in active fractions. Screening of a portion of fractions yielded multiple assay-specific hits in several high-throughput cellular screening assays. This procedure modernizes the traditional natural product fractionation paradigm by seamlessly integrating automation, informatics, and multimodal analytical interrogation capabilities. PMID:20232897

  11. Automated Synthesis of 18F-Fluoropropoxytryptophan for Amino Acid Transporter System Imaging

    PubMed Central

    Shih, I-Hong; Duan, Xu-Dong; Kong, Fan-Lin; Williams, Michael D.; Zhang, Yin-Han; Yang, David J.

    2014-01-01

Objective. This study aimed to develop a cGMP grade of [18F]fluoropropoxytryptophan (18F-FTP) to assess tryptophan transporters using an automated synthesizer. Methods. Tosylpropoxytryptophan (Ts-TP) was reacted with K18F/kryptofix complex. After column purification, solvent evaporation, and hydrolysis, the identity and purity of the product were validated by radio-TLC (1M-ammonium acetate : methanol = 4 : 1) and HPLC (C-18 column, methanol : water = 7 : 3) analyses. In vitro cellular uptake of 18F-FTP and 18F-FDG was performed in human prostate cancer cells. PET imaging studies were performed with 18F-FTP and 18F-FDG in prostate and small cell lung tumor-bearing mice (3.7 MBq/mouse, iv). Results. Radio-TLC and HPLC analyses of 18F-FTP showed that the Rf and Rt values were 0.9 and 9 min, respectively. Radiochemical purity was >99%. The radiochemical yield was 37.7% (EOS 90 min, decay corrected). Cellular uptake of 18F-FTP and 18F-FDG showed enhanced uptake as a function of incubation time. PET imaging studies showed that 18F-FTP had less tumor uptake than 18F-FDG in the prostate cancer model. However, 18F-FTP had more uptake than 18F-FDG in the small cell lung cancer model. Conclusion. 18F-FTP could be synthesized with high radiochemical yield. Assessment of upregulated transporter activity by 18F-FTP may provide potential applications in differential diagnosis and prediction of early treatment response. PMID:25136592
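
    The "decay corrected" yield quoted above removes the physical decay of fluorine-18 (half-life ≈ 109.8 min, a standard physical constant) from the activity measured at end of synthesis. A small illustrative helper (the function name is ours, not from the paper):

```python
import math

F18_HALF_LIFE_MIN = 109.8  # physical half-life of fluorine-18, in minutes

def decay_correct(measured_activity, elapsed_min, half_life_min=F18_HALF_LIFE_MIN):
    """Correct a measured activity back to time zero by dividing
    out the decay factor exp(-ln2 * t / T_half)."""
    return measured_activity * math.exp(math.log(2) * elapsed_min / half_life_min)

# after exactly one half-life the corrected activity is twice the measured one;
# at 90 min (the EOS time above) the correction factor is about 1.76
factor_90 = decay_correct(1.0, 90)
```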

  12. Automated synthesis of 18F-fluoropropoxytryptophan for amino acid transporter system imaging.

    PubMed

    Shih, I-Hong; Duan, Xu-Dong; Kong, Fan-Lin; Williams, Michael D; Yang, Kevin; Zhang, Yin-Han; Yang, David J

    2014-01-01

The aim of this study was to develop a cGMP grade of [(18)F]fluoropropoxytryptophan ((18)F-FTP) to assess tryptophan transporters using an automated synthesizer. Tosylpropoxytryptophan (Ts-TP) was reacted with K(18)F/kryptofix complex. After column purification, solvent evaporation, and hydrolysis, the identity and purity of the product were validated by radio-TLC (1M-ammonium acetate : methanol = 4 : 1) and HPLC (C-18 column, methanol : water = 7 : 3) analyses. In vitro cellular uptake of (18)F-FTP and (18)F-FDG was performed in human prostate cancer cells. PET imaging studies were performed with (18)F-FTP and (18)F-FDG in prostate and small cell lung tumor-bearing mice (3.7 MBq/mouse, iv). Radio-TLC and HPLC analyses of (18)F-FTP showed that the Rf and Rt values were 0.9 and 9 min, respectively. Radiochemical purity was >99%. The radiochemical yield was 37.7% (EOS 90 min, decay corrected). Cellular uptake of (18)F-FTP and (18)F-FDG showed enhanced uptake as a function of incubation time. PET imaging studies showed that (18)F-FTP had less tumor uptake than (18)F-FDG in the prostate cancer model. However, (18)F-FTP had more uptake than (18)F-FDG in the small cell lung cancer model. (18)F-FTP could be synthesized with high radiochemical yield. Assessment of upregulated transporter activity by (18)F-FTP may provide potential applications in differential diagnosis and prediction of early treatment response.

  13. Human Breast Cancer Histoid

    PubMed Central

    Kaur, Pavinder; Ward, Brenda; Saha, Baisakhi; Young, Lillian; Groshen, Susan; Techy, Geza; Lu, Yani; Atkinson, Roscoe; Taylor, Clive R.; Ingram, Marylou

    2011-01-01

    Progress in our understanding of heterotypic cellular interaction in the tumor microenvironment, which is recognized to play major roles in cancer progression, has been hampered due to unavailability of an appropriate in vitro co-culture model. The aim of this study was to generate an in vitro 3-dimensional human breast cancer model, which consists of cancer cells and fibroblasts. Breast cancer cells (UACC-893) and fibroblasts at various densities were co-cultured in a rotating suspension culture system to establish co-culture parameters. Subsequently, UACC-893, BT.20, or MDA.MB.453 were co-cultured with fibroblasts for 9 days. Co-cultures resulted in the generation of breast cancer histoid (BCH) with cancer cells showing the invasion of fibroblast spheroids, which were visualized by immunohistochemical (IHC) staining of sections (4 µm thick) of BCH. A reproducible quantitative expression of C-erbB.2 was detected in UACC-893 cancer cells in BCH sections by IHC staining and the Automated Cellular Imaging System. BCH sections also consistently exhibited qualitative expression of pancytokeratins, p53, Ki-67, or E-cadherin in cancer cells and that of vimentin or GSTPi in fibroblasts, fibronectin in the basement membrane and collagen IV in the extracellular matrix. The expression of the protein analytes and cellular architecture of BCH were markedly similar to those of breast cancer tissue. PMID:22034518

  14. XML-based data model and architecture for a knowledge-based grid-enabled problem-solving environment for high-throughput biological imaging.

    PubMed

    Ahmed, Wamiq M; Lenz, Dominik; Liu, Jia; Paul Robinson, J; Ghafoor, Arif

    2008-03-01

    High-throughput biological imaging uses automated imaging devices to collect a large number of microscopic images for analysis of biological systems and validation of scientific hypotheses. Efficient manipulation of these datasets for knowledge discovery requires high-performance computational resources, efficient storage, and automated tools for extracting and sharing such knowledge among different research sites. Newly emerging grid technologies provide powerful means for exploiting the full potential of these imaging techniques. Efficient utilization of grid resources requires the development of knowledge-based tools and services that combine domain knowledge with analysis algorithms. In this paper, we first investigate how grid infrastructure can facilitate high-throughput biological imaging research, and present an architecture for providing knowledge-based grid services for this field. We identify two levels of knowledge-based services. The first level provides tools for extracting spatiotemporal knowledge from image sets and the second level provides high-level knowledge management and reasoning services. We then present cellular imaging markup language, an extensible markup language-based language for modeling of biological images and representation of spatiotemporal knowledge. This scheme can be used for spatiotemporal event composition, matching, and automated knowledge extraction and representation for large biological imaging datasets. We demonstrate the expressive power of this formalism by means of different examples and extensive experimental results.
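
    The abstract does not reproduce the markup language's actual schema, so the fragment below is purely hypothetical: it only illustrates how a spatiotemporal cellular event of the kind described might be encoded in XML and queried with the Python standard library. All tag and attribute names are invented.

```python
import xml.etree.ElementTree as ET

# Hypothetical fragment in the spirit of the paper's markup language;
# the element and attribute names here are illustrative, not the real schema.
doc = """
<cellularImage id="img042" timepoint="12" unit="min">
  <object type="nucleus" x="118" y="240"/>
  <event kind="division" frame="12">
    <participant ref="cell-7"/>
    <participant ref="cell-7a"/>
  </event>
</cellularImage>
"""

root = ET.fromstring(doc)
events = [e.get("kind") for e in root.iter("event")]
participants = [p.get("ref") for p in root.iter("participant")]
```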

  15. Metadata Standard and Data Exchange Specifications to Describe, Model, and Integrate Complex and Diverse High-Throughput Screening Data from the Library of Integrated Network-based Cellular Signatures (LINCS).

    PubMed

    Vempati, Uma D; Chung, Caty; Mader, Chris; Koleti, Amar; Datar, Nakul; Vidović, Dušica; Wrobel, David; Erickson, Sean; Muhlich, Jeremy L; Berriz, Gabriel; Benes, Cyril H; Subramanian, Aravind; Pillai, Ajay; Shamu, Caroline E; Schürer, Stephan C

    2014-06-01

    The National Institutes of Health Library of Integrated Network-based Cellular Signatures (LINCS) program is generating extensive multidimensional data sets, including biochemical, genome-wide transcriptional, and phenotypic cellular response signatures to a variety of small-molecule and genetic perturbations with the goal of creating a sustainable, widely applicable, and readily accessible systems biology knowledge resource. Integration and analysis of diverse LINCS data sets depend on the availability of sufficient metadata to describe the assays and screening results and on their syntactic, structural, and semantic consistency. Here we report metadata specifications for the most important molecular and cellular components and recommend them for adoption beyond the LINCS project. We focus on the minimum required information to model LINCS assays and results based on a number of use cases, and we recommend controlled terminologies and ontologies to annotate assays with syntactic consistency and semantic integrity. We also report specifications for a simple annotation format (SAF) to describe assays and screening results based on our metadata specifications with explicit controlled vocabularies. SAF specifically serves to programmatically access and exchange LINCS data as a prerequisite for a distributed information management infrastructure. We applied the metadata specifications to annotate large numbers of LINCS cell lines, proteins, and small molecules. The resources generated and presented here are freely available. © 2014 Society for Laboratory Automation and Screening.

  16. CellCognition: time-resolved phenotype annotation in high-throughput live cell imaging.

    PubMed

    Held, Michael; Schmitz, Michael H A; Fischer, Bernd; Walter, Thomas; Neumann, Beate; Olma, Michael H; Peter, Matthias; Ellenberg, Jan; Gerlich, Daniel W

    2010-09-01

    Fluorescence time-lapse imaging has become a powerful tool to investigate complex dynamic processes such as cell division or intracellular trafficking. Automated microscopes generate time-resolved imaging data at high throughput, yet tools for quantification of large-scale movie data are largely missing. Here we present CellCognition, a computational framework to annotate complex cellular dynamics. We developed a machine-learning method that combines state-of-the-art classification with hidden Markov modeling for annotation of the progression through morphologically distinct biological states. Incorporation of time information into the annotation scheme was essential to suppress classification noise at state transitions and confusion between different functional states with similar morphology. We demonstrate generic applicability in different assays and perturbation conditions, including a candidate-based RNA interference screen for regulators of mitotic exit in human cells. CellCognition is published as open source software, enabling live-cell imaging-based screening with assays that directly score cellular dynamics.
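
    The core trick described here, combining a per-frame classifier with a hidden Markov model so that implausible single-frame state flips are suppressed, can be illustrated with a tiny Viterbi decoder. The two states and all probabilities below are invented for illustration, not CellCognition's actual parameters.

```python
import math

def viterbi(obs, states, start, trans, emit):
    """Most likely hidden state sequence given noisy per-frame labels."""
    V = [{s: math.log(start[s]) + math.log(emit[s][obs[0]]) for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        V.append({})
        back.append({})
        for s in states:
            prev, score = max(((p, V[t - 1][p] + math.log(trans[p][s]))
                               for p in states), key=lambda x: x[1])
            V[t][s] = score + math.log(emit[s][obs[t]])
            back[t][s] = prev
    # trace back the best-scoring path
    last = max(states, key=lambda s: V[-1][s])
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return path[::-1]

states = ["interphase", "mitosis"]
start = {"interphase": 0.5, "mitosis": 0.5}
trans = {"interphase": {"interphase": 0.95, "mitosis": 0.05},
         "mitosis":    {"interphase": 0.05, "mitosis": 0.95}}
emit = {"interphase": {"interphase": 0.8, "mitosis": 0.2},  # classifier right ~80%
        "mitosis":    {"interphase": 0.2, "mitosis": 0.8}}

# a single spurious 'mitosis' call inside an interphase run is smoothed away
noisy = ["interphase", "interphase", "mitosis", "interphase", "interphase"]
smoothed = viterbi(noisy, states, start, trans, emit)
```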

  17. Combination of automated high throughput platforms, flow cytometry, and hierarchical clustering to detect cell state.

    PubMed

    Kitsos, Christine M; Bhamidipati, Phani; Melnikova, Irena; Cash, Ethan P; McNulty, Chris; Furman, Julia; Cima, Michael J; Levinson, Douglas

    2007-01-01

This study examined whether hierarchical clustering could be used to detect cell states induced by treatment combinations that were generated through automation and high-throughput (HT) technology. Data-mining techniques were used to analyze the large experimental data sets to determine whether nonlinear, non-obvious responses could be extracted from the data. Unary, binary, and ternary combinations of pharmacological factors (examples of stimuli) were used to induce differentiation of HL-60 cells using an HT automated approach. Cell profiles were analyzed by incorporating hierarchical clustering methods on data collected by flow cytometry. Data-mining techniques were used to explore the combinatorial space for nonlinear, unexpected events. Additional small-scale, follow-up experiments were performed on cellular profiles of interest. Multiple, distinct cellular profiles were detected using hierarchical clustering of expressed cell-surface antigens. Data-mining of this large, complex data set retrieved cases of both factor dominance and cooperativity, as well as atypical cellular profiles. Follow-up experiments found that treatment combinations producing "atypical cell types" made those cells more susceptible to apoptosis. Conclusions: Hierarchical clustering and other data-mining techniques were applied to analyze large data sets from HT flow cytometry. From each sample, the data set was filtered and used to define discrete, usable states that were then related back to their original formulations. Analysis of resultant cell populations induced by a multitude of treatments identified unexpected phenotypes and nonlinear response profiles.
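
    Hierarchical clustering of the kind used here can be sketched in a few lines: repeatedly merge the two closest clusters until the desired number remains. The single-linkage rule and the toy two-marker profiles below are our simplifications, not the study's actual data or linkage choice.

```python
def single_linkage(points, k):
    """Agglomerative clustering: merge the two clusters whose closest
    members are nearest until only k clusters remain."""
    def d2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    clusters = [[i] for i in range(len(points))]
    while len(clusters) > k:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                # single linkage: distance between closest members
                dist = min(d2(points[a], points[b])
                           for a in clusters[i] for b in clusters[j])
                if best is None or dist < best[0]:
                    best = (dist, i, j)
        _, i, j = best
        clusters[i] += clusters[j]
        del clusters[j]
    return [sorted(c) for c in clusters]

# toy two-marker intensity profiles: two well-separated cell states
profiles = [(0.1, 0.2), (0.0, 0.3), (5.0, 5.1), (5.2, 4.9)]
groups = single_linkage(profiles, k=2)  # [[0, 1], [2, 3]]
```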

  18. Neuronal Morphology goes Digital: A Research Hub for Cellular and System Neuroscience

    PubMed Central

    Parekh, Ruchi; Ascoli, Giorgio A.

    2013-01-01

Summary: The importance of neuronal morphology in brain function has been recognized for over a century. The broad applicability of “digital reconstructions” of neuron morphology across neuroscience sub-disciplines has stimulated the rapid development of numerous synergistic tools for data acquisition, anatomical analysis, three-dimensional rendering, electrophysiological simulation, growth models, and data sharing. Here we discuss the processes of histological labeling, microscopic imaging, and semi-automated tracing. Moreover, we provide an annotated compilation of currently available resources in this rich research “ecosystem” as a central reference for experimental and computational neuroscience. PMID:23522039

  19. Automated batch fiducial-less tilt-series alignment in Appion using Protomo.

    PubMed

    Noble, Alex J; Stagg, Scott M

    2015-11-01

    The field of electron tomography has benefited greatly from manual and semi-automated approaches to marker-based tilt-series alignment that have allowed for the structural determination of multitudes of in situ cellular structures as well as macromolecular structures of individual protein complexes. The emergence of complementary metal-oxide semiconductor detectors capable of detecting individual electrons has enabled the collection of low dose, high contrast images, opening the door for reliable correlation-based tilt-series alignment. Here we present a set of automated, correlation-based tilt-series alignment, contrast transfer function (CTF) correction, and reconstruction workflows for use in conjunction with the Appion/Leginon package that are primarily targeted at automating structure determination with cryogenic electron microscopy. Copyright © 2015 Elsevier Inc. All rights reserved.

  20. Tissue vascularization through 3D printing: Will technology bring us flow?

    PubMed

    Paulsen, S J; Miller, J S

    2015-05-01

    Though in vivo models provide the most physiologically relevant environment for studying tissue function, in vitro studies provide researchers with explicit control over experimental conditions and the potential to develop high throughput testing methods. In recent years, advancements in developmental biology research and imaging techniques have significantly improved our understanding of the processes involved in vascular development. However, the task of recreating the complex, multi-scale vasculature seen in in vivo systems remains elusive. 3D bioprinting offers a potential method to generate controlled vascular networks with hierarchical structure approaching that of in vivo networks. Bioprinting is an interdisciplinary field that relies on advances in 3D printing technology along with advances in imaging and computational modeling, which allow researchers to monitor cellular function and to better understand cellular environment within the printed tissue. As bioprinting technologies improve with regards to resolution, printing speed, available materials, and automation, 3D printing could be used to generate highly controlled vascularized tissues in a high throughput manner for use in regenerative medicine and the development of in vitro tissue models for research in developmental biology and vascular diseases. © 2015 Wiley Periodicals, Inc.

  1. Model-based cell number quantification using online single-oxygen sensor data for tissue engineering perfusion bioreactors.

    PubMed

    Lambrechts, T; Papantoniou, I; Sonnaert, M; Schrooten, J; Aerts, J-M

    2014-10-01

Online and non-invasive quantification of critical tissue engineering (TE) construct quality attributes in TE bioreactors is indispensable for the cost-effective up-scaling and automation of cellular construct manufacturing. However, appropriate monitoring techniques for cellular constructs in bioreactors are still lacking. This study presents a generic and robust approach to determine cell number and metabolic activity of cell-based TE constructs in perfusion bioreactors based on single oxygen sensor data in dynamic perfusion conditions. A data-based mechanistic modeling technique was used that is able to correlate the number of cells within the scaffold (R² = 0.80) and the metabolic activity of the cells (R² = 0.82) to the dynamics of the oxygen response to step changes in the perfusion rate. This generic non-destructive measurement technique is effective for a large range of cells, from as low as 1.0 × 10⁵ cells to potentially multiple millions of cells, and can open up new possibilities for effective bioprocess monitoring. © 2014 Wiley Periodicals, Inc.
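
    The measurable feature in such step experiments is essentially a first-order oxygen response, y(t) = y∞ + (y0 − y∞)·exp(−t/τ), whose time constant shifts with cell number and metabolic activity. A sketch of recovering τ from a clean synthetic trace (illustrative only; the paper's data-based mechanistic model is more sophisticated):

```python
import math

def estimate_tau(times, values, y_inf):
    """Estimate the time constant of a first-order response
    y(t) = y_inf + (y0 - y_inf)*exp(-t/tau) by fitting a line to
    log|y - y_inf| versus t (slope = -1/tau)."""
    ys = [math.log(abs(v - y_inf)) for v in values]
    n = len(times)
    mx = sum(times) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(times, ys))
             / sum((x - mx) ** 2 for x in times))
    return -1.0 / slope

# synthetic oxygen trace: relaxes from 100% toward 60% with tau = 12 min
tau_true, y0, y_inf = 12.0, 100.0, 60.0
times = [float(t) for t in range(0, 61, 5)]
trace = [y_inf + (y0 - y_inf) * math.exp(-t / tau_true) for t in times]
tau_hat = estimate_tau(times, trace, y_inf)  # ~12.0
```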

  2. SuperSegger: robust image segmentation, analysis and lineage tracking of bacterial cells.

    PubMed

    Stylianidou, Stella; Brennan, Connor; Nissen, Silas B; Kuwada, Nathan J; Wiggins, Paul A

    2016-11-01

    Many quantitative cell biology questions require fast yet reliable automated image segmentation to identify and link cells from frame-to-frame, and characterize the cell morphology and fluorescence. We present SuperSegger, an automated MATLAB-based image processing package well-suited to quantitative analysis of high-throughput live-cell fluorescence microscopy of bacterial cells. SuperSegger incorporates machine-learning algorithms to optimize cellular boundaries and automated error resolution to reliably link cells from frame-to-frame. Unlike existing packages, it can reliably segment microcolonies with many cells, facilitating the analysis of cell-cycle dynamics in bacteria as well as cell-contact mediated phenomena. This package has a range of built-in capabilities for characterizing bacterial cells, including the identification of cell division events, mother, daughter and neighbouring cells, and computing statistics on cellular fluorescence, the location and intensity of fluorescent foci. SuperSegger provides a variety of postprocessing data visualization tools for single cell and population level analysis, such as histograms, kymographs, frame mosaics, movies and consensus images. Finally, we demonstrate the power of the package by analyzing lag phase growth with single cell resolution. © 2016 John Wiley & Sons Ltd.

  3. Human cell structure-driven model construction for predicting protein subcellular location from biological images.

    PubMed

    Shao, Wei; Liu, Mingxia; Zhang, Daoqiang

    2016-01-01

The systematic study of subcellular location patterns is very important for fully characterizing the human proteome. Nowadays, with the great advances in automated microscopic imaging, accurate bioimage-based classification methods to predict protein subcellular locations are highly desired. All existing models were constructed on the independent parallel hypothesis, where the cellular component classes are positioned independently in a multi-class classification engine. As a result, the important structural information of cellular compartments is lost. To address this problem and develop more accurate models, we proposed a novel cell structure-driven classifier construction approach (SC-PSorter) by employing the prior biological structural information in the learning model. Specifically, the structural relationship among the cellular components is reflected by a new codeword matrix under the error correcting output coding framework. Then, we construct multiple SC-PSorter-based classifiers corresponding to the columns of the error correcting output coding codeword matrix using a multi-kernel support vector machine classification approach. Finally, we perform the classifier ensemble by combining those multiple SC-PSorter-based classifiers via majority voting. We evaluate our method on a collection of 1636 immunohistochemistry images from the Human Protein Atlas database. The experimental results show that our method achieves an overall accuracy of 89.0%, which is 6.4% higher than the state-of-the-art method. The dataset and code can be downloaded from https://github.com/shaoweinuaa/. Contact: dqzhang@nuaa.edu.cn. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
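
    In the error-correcting output coding (ECOC) framework the decoding step is simple: each class has a binary codeword, the battery of binary classifiers emits a bit vector, and the predicted class is the codeword at minimum Hamming distance. A toy sketch (the codewords below are invented, not the SC-PSorter matrix):

```python
def ecoc_predict(codebook, bits):
    """Pick the class whose codeword is closest in Hamming distance
    to the binary classifier outputs."""
    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))
    return min(codebook, key=lambda cls: hamming(codebook[cls], bits))

# toy codeword matrix over four binary sub-classifiers
codebook = {
    "nucleus":       (1, 0, 1, 0),
    "mitochondrion": (0, 1, 1, 1),
    "cytosol":       (0, 0, 0, 0),
}

exact = ecoc_predict(codebook, (0, 1, 1, 1))  # exact codeword match
noisy = ecoc_predict(codebook, (0, 1, 1, 0))  # one sub-classifier erred
```

    The error-correcting property is visible in the second call: a single flipped bit still decodes to the right class because the codewords are spread apart in Hamming space.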

  4. A multi-phenotypic imaging screen to identify bacterial effectors by exogenous expression in a HeLa cell line.

    PubMed

    Collins, Adam; Huett, Alan

    2018-05-15

    We present a high-content screen (HCS) for the simultaneous analysis of multiple phenotypes in HeLa cells expressing an autophagy reporter (mcherry-LC3) and one of 224 GFP-fused proteins from the Crohn's Disease (CD)-associated bacterium, Adherent Invasive E. coli (AIEC) strain LF82. Using automated confocal microscopy and image analysis (CellProfiler), we localised GFP fusions within cells, and monitored their effects upon autophagy (an important innate cellular defence mechanism), cellular and nuclear morphology, and the actin cytoskeleton. This data will provide an atlas for the localisation of 224 AIEC proteins within human cells, as well as a dataset to analyse their effects upon many aspects of host cell morphology. We also describe an open-source, automated, image-analysis workflow to identify bacterial effectors and their roles via the perturbations induced in reporter cell lines when candidate effectors are exogenously expressed.

  5. Target identification by image analysis.

    PubMed

    Fetz, V; Prochnow, H; Brönstrup, M; Sasse, F

    2016-05-04

Covering: 1997 to the end of 2015. Each biologically active compound induces phenotypic changes in target cells that are characteristic of its mode of action. These phenotypic alterations can be directly observed under the microscope or made visible by labelling structural elements or selected proteins of the cells with dyes. A comparison of the cellular phenotype induced by a compound of interest with the phenotypes of reference compounds with known cellular targets allows predicting its mode of action. While this approach has been successfully applied to the characterization of natural products based on a visual inspection of images, recent studies used automated microscopy and analysis software to increase speed and to reduce subjective interpretation. In this review, we give a general outline of the workflow for manual and automated image analysis, and we highlight natural products whose bacterial and eukaryotic targets could be identified through such approaches.

  6. An Algorithm to Automate Yeast Segmentation and Tracking

    PubMed Central

    Doncic, Andreas; Eser, Umut; Atay, Oguzhan; Skotheim, Jan M.

    2013-01-01

    Our understanding of dynamic cellular processes has been greatly enhanced by rapid advances in quantitative fluorescence microscopy. Imaging single cells has emphasized the prevalence of phenomena that can be difficult to infer from population measurements, such as all-or-none cellular decisions, cell-to-cell variability, and oscillations. Examination of these phenomena requires segmenting and tracking individual cells over long periods of time. However, accurate segmentation and tracking of cells is difficult and is often the rate-limiting step in an experimental pipeline. Here, we present an algorithm that accomplishes fully automated segmentation and tracking of budding yeast cells within growing colonies. The algorithm incorporates prior information of yeast-specific traits, such as immobility and growth rate, to segment an image using a set of threshold values rather than one specific optimized threshold. Results from the entire set of thresholds are then used to perform a robust final segmentation. PMID:23520484
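
    The multi-threshold idea can be reduced to a pixel-voting sketch: binarize at every threshold in the set and keep pixels that are foreground in most of the binarizations. The plain majority vote below is our simplification of the paper's final segmentation step, shown on a toy image.

```python
def multi_threshold_mask(image, thresholds):
    """Binarize `image` at each threshold and keep the pixels that
    are foreground in a majority of the binarizations."""
    rows, cols = len(image), len(image[0])
    votes = [[0] * cols for _ in range(rows)]
    for t in thresholds:
        for r in range(rows):
            for c in range(cols):
                if image[r][c] >= t:
                    votes[r][c] += 1
    need = len(thresholds) / 2  # strict majority of thresholds
    return [[1 if votes[r][c] > need else 0 for c in range(cols)]
            for r in range(rows)]

# toy intensity image: a bright "cell" on a dim background
image = [
    [10, 12, 11, 10],
    [11, 80, 85, 12],
    [10, 82, 90, 11],
    [12, 11, 10, 10],
]
mask = multi_threshold_mask(image, thresholds=[20, 40, 60, 75])
```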

  7. Development and automation of a test of impulse control in zebrafish

    PubMed Central

    Parker, Matthew O.; Ife, Dennis; Ma, Jun; Pancholi, Mahesh; Smeraldi, Fabrizio; Straw, Chris; Brennan, Caroline H.

    2013-01-01

    Deficits in impulse control (difficulties in inhibition of a pre-potent response) are fundamental to a number of psychiatric disorders, but the molecular and cellular basis is poorly understood. Zebrafish offer a very useful model for exploring these mechanisms, but there is currently a lack of validated procedures for measuring impulsivity in fish. In mammals, impulsivity can be measured by examining rates of anticipatory responding in the 5-choice serial reaction time task (5-CSRTT), a continuous performance task where the subject is reinforced upon accurate detection of a briefly presented light in one of five distinct spatial locations. This paper describes the development of a fully-integrated automated system for testing impulsivity in adult zebrafish. We outline the development of our image analysis software and its integration with National Instruments drivers and actuators to produce the system. We also describe an initial validation of the system through a one-generation screen of chemically mutagenized zebrafish, where the testing parameters were optimized. PMID:24133417

  8. Automation in high-content flow cytometry screening.

    PubMed

    Naumann, U; Wand, M P

    2009-09-01

    High-content flow cytometric screening (FC-HCS) is a 21st Century technology that combines robotic fluid handling, flow cytometric instrumentation, and bioinformatics software, so that relatively large numbers of flow cytometric samples can be processed and analysed in a short period of time. We revisit a recent application of FC-HCS to the problem of cellular signature definition for acute graft-versus-host-disease. Our focus is on automation of the data processing steps using recent advances in statistical methodology. We demonstrate that effective results, on par with those obtained via manual processing, can be achieved using our automatic techniques. Such automation of FC-HCS has the potential to drastically improve diagnosis and biomarker identification.

  9. Quantitation of Cellular Dynamics in Growing Arabidopsis Roots with Light Sheet Microscopy

    PubMed Central

    Birnbaum, Kenneth D.; Leibler, Stanislas

    2011-01-01

    To understand dynamic developmental processes, living tissues have to be imaged frequently and for extended periods of time. Root development is extensively studied at cellular resolution to understand basic mechanisms underlying pattern formation and maintenance in plants. Unfortunately, ensuring continuous specimen access, while preserving physiological conditions and preventing photo-damage, poses major barriers to measurements of cellular dynamics in growing organs such as plant roots. We present a system that integrates optical sectioning through light sheet fluorescence microscopy with hydroponic culture that enables us to image, at cellular resolution, a vertically growing Arabidopsis root every few minutes and for several consecutive days. We describe novel automated routines to track the root tip as it grows, to track cellular nuclei and to identify cell divisions. We demonstrate the system's capabilities by collecting data on divisions and nuclear dynamics. PMID:21731697
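
    Tracking nuclei between consecutive frames reduces, in the simplest case, to nearest-neighbour assignment with a distance gate; nuclei in the new frame that match nothing are candidate newcomers (e.g. after a division). A greedy sketch of that idea (illustrative, not the authors' routine):

```python
def link_nuclei(prev, curr, max_dist):
    """Greedily match each nucleus in the current frame to the closest
    unmatched nucleus in the previous frame within `max_dist`."""
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    links, used = {}, set()
    for j, c in enumerate(curr):
        candidates = [(dist(p, c), i) for i, p in enumerate(prev)
                      if i not in used and dist(p, c) <= max_dist]
        if candidates:
            _, i = min(candidates)
            links[j] = i
            used.add(i)
    return links  # {index in current frame: index in previous frame}

frame0 = [(10.0, 10.0), (50.0, 50.0)]
frame1 = [(12.0, 11.0), (51.0, 49.0), (90.0, 90.0)]  # third nucleus is new
links = link_nuclei(frame0, frame1, max_dist=5.0)  # {0: 0, 1: 1}
```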

  10. Strategies for reducing driver distraction from in-vehicle telematics devices : a discussion document

    DOT National Transportation Integrated Search

    2003-04-01

    "In-Vehicle Telematics" refers to devices incorporating wireless communications technologies in order to provide information services, vehicle automation and other functions. While cellular phones are currently the most common type of telematics devi...

  11. Immunohistochemical Detection of the Autophagy Markers LC3 and p62/SQSTM1 in Formalin-Fixed and Paraffin-Embedded Tissue.

    PubMed

    Berezowska, Sabina; Galván, José A

    2017-01-01

Autophagy is a highly conserved cellular mechanism of "self digestion," ensuring cellular homeostasis, and playing a role in many diseases including cancer. As a stress response mechanism, it may also be involved in cellular response to therapy. LC3 and Sequestosome 1 (p62/SQSTM1) are among the most widely used markers to monitor autophagy, and can be visualized in formalin-fixed and paraffin-embedded tissue by immunohistochemistry. Here we describe a validated staining protocol using an automated staining system available in many routine pathology laboratories, enabling high-throughput staining under standardized conditions.

  12. Universal microfluidic automaton for autonomous sample processing: application to the Mars Organic Analyzer.

    PubMed

    Kim, Jungkyu; Jensen, Erik C; Stockton, Amanda M; Mathies, Richard A

    2013-08-20

    A fully integrated multilayer microfluidic chemical analyzer for automated sample processing and labeling, as well as analysis using capillary zone electrophoresis is developed and characterized. Using lifting gate microfluidic control valve technology, a microfluidic automaton consisting of a two-dimensional microvalve cellular array is fabricated with soft lithography in a format that enables facile integration with a microfluidic capillary electrophoresis device. The programmable sample processor performs precise mixing, metering, and routing operations that can be combined to achieve automation of complex and diverse assay protocols. Sample labeling protocols for amino acid, aldehyde/ketone and carboxylic acid analysis are performed automatically followed by automated transfer and analysis by the integrated microfluidic capillary electrophoresis chip. Equivalent performance to off-chip sample processing is demonstrated for each compound class; the automated analysis resulted in a limit of detection of ~16 nM for amino acids. Our microfluidic automaton provides a fully automated, portable microfluidic analysis system capable of autonomous analysis of diverse compound classes in challenging environments.

  13. [Non-animal toxicology in the safety testing of chemicals].

    PubMed

    Heinonen, Tuula; Tähti, Hanna

    2013-01-01

There is an urgent need to develop predictive test methods better than animal experiments for assessing the safety of chemical substances to man. According to today's vision, this is achieved by using human cell-based tissue and organ models. In the new testing strategy the toxic effects are assessed by the changes in the critical parameters of the cellular biochemical routes (the adverse outcome pathway, AOP, principle) in the target tissues. In vitro tests are rapid and effective, and they are readily amenable to automation. The change in the testing paradigm is supported by all stakeholders: scientists, regulators, and people concerned about animal welfare.

  14. Genome-wide assessment of the carriers involved in the cellular uptake of drugs: a model system in yeast

    PubMed Central

    2011-01-01

    Background The uptake of drugs into cells has traditionally been considered to be predominantly via passive diffusion through the bilayer portion of the cell membrane. The recent recognition that drug uptake is mostly carrier-mediated raises the question of which drugs use which carriers. Results To answer this, we have constructed a chemical genomics platform built upon the yeast gene deletion collection, using competition experiments in batch fermenters and robotic automation of cytotoxicity screens, including protection by 'natural' substrates. Using these, we tested 26 different drugs and identified the carriers required for 18 of the drugs to gain entry into yeast cells. Conclusions As well as providing a useful platform technology, these results further substantiate the notion that the cellular uptake of pharmaceutical drugs normally occurs via carrier-mediated transport and indicate that establishing the identity and tissue distribution of such carriers should be a major consideration in the design of safe and effective drugs. PMID:22023736

  15. Data for automated, high-throughput microscopy analysis of intracellular bacterial colonies using spot detection.

    PubMed

    Ernstsen, Christina L; Login, Frédéric H; Jensen, Helene H; Nørregaard, Rikke; Møller-Jensen, Jakob; Nejsum, Lene N

    2017-10-01

    Quantification of intracellular bacterial colonies is useful in strategies directed against bacterial attachment, subsequent cellular invasion and intracellular proliferation. An automated, high-throughput microscopy method was established to quantify the number and size of intracellular bacterial colonies in infected host cells (Detection and quantification of intracellular bacterial colonies by automated, high-throughput microscopy, Ernstsen et al., 2017 [1]). The infected cells were imaged with a 10× objective, and the number of intracellular bacterial colonies, their size distribution and the number of cell nuclei were automatically quantified using a spot detection tool. The spot detection output was exported to Excel, where data analysis was performed. In this article, micrographs and spot detection data are made available to facilitate implementation of the method.
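The downstream analysis step (spot detection output exported to a spreadsheet) is a simple aggregation. A minimal sketch, with made-up field names and data rather than the actual export format:

```python
# Hypothetical sketch of the data analysis step: one row per detected
# colony plus a nuclei count per image, summarized into colonies per cell
# and a colony size statistic. Field names are illustrative only.

def summarize(spots, n_nuclei):
    """spots: list of dicts with 'image' and 'area_um2' keys."""
    n_colonies = len(spots)
    areas = sorted(s["area_um2"] for s in spots)
    mid = n_colonies // 2
    median_area = (areas[mid] if n_colonies % 2 else
                   (areas[mid - 1] + areas[mid]) / 2)
    return {
        "colonies": n_colonies,
        "colonies_per_cell": n_colonies / n_nuclei,
        "median_area_um2": median_area,
    }

rows = [{"image": "well_A1", "area_um2": a} for a in (4.0, 7.5, 5.0, 12.0)]
stats = summarize(rows, n_nuclei=20)
```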

  16. Automated Microfluidic Filtration and Immunocytochemistry Detection System for Capture and Enumeration of Circulating Tumor Cells and Other Rare Cell Populations in Blood.

    PubMed

    Pugia, Michael; Magbanua, Mark Jesus M; Park, John W

    2017-01-01

    Isolation by size using a filter membrane offers an antigen-independent method for capturing rare cells present in blood of cancer patients. Multiple cell types, including circulating tumor cells (CTCs), captured on the filter membrane can be simultaneously identified via immunocytochemistry (ICC) analysis of specific cellular biomarkers. Here, we describe an automated microfluidic filtration method combined with a liquid handling system for sequential ICC assays to detect and enumerate non-hematologic rare cells in blood.

  17. Cellular High-Energy Cavitation Trauma - Description of a Novel In Vitro Trauma Model in Three Different Cell Types.

    PubMed

    Cao, Yuli; Risling, Mårten; Malm, Elisabeth; Sondén, Anders; Bolling, Magnus Frödin; Sköld, Mattias K

    2016-01-01

    The mechanisms involved in traumatic brain injury have yet to be fully characterized. One mechanism that could be of importance, especially in high-energy trauma, is cavitation. Cavitation can be described as a process of vaporization, bubble generation, and bubble implosion resulting from a decrease and subsequent increase in pressure. Cavitation as an injury mechanism is difficult to visualize and model due to its short duration and limited spatial distribution. One strategy to analyze the cellular response to cavitation is to employ suitable in vitro models. The flyer-plate model is an in vitro high-energy trauma model that includes cavitation as a trauma mechanism. A copper fragment is accelerated by means of a laser and hits the bottom of a cell culture well, causing cavitation and shock waves inside the well and the cell medium. We have found the flyer-plate model to be efficient, reproducible, and easy to control. In this study, we have used the model to analyze the cellular response to microcavitation in SH-SY5Y neuroblastoma, Caco-2, and C6 glioma cell lines. Mitotic activity in neuroblastoma and glioma was investigated with BrdU staining, and cell numbers were calculated using automated time-lapse imaging. With these methods, we found variations between cell types and between different zones surrounding the lesion. It was also shown that the injured cell cultures released S-100B in a dose-dependent manner. Using gene expression microarrays, a number of gene families of potential interest were found to be strongly, but differently, regulated in neuroblastoma and glioma at 24 h post trauma. The data from the gene expression arrays may be used to identify new candidates for biomarkers in cavitation trauma. We conclude that our model is useful for studies of trauma in vitro and that it could be applied in future treatment studies.

  18. Modeling Molecular and Cellular Aspects of Human Disease using the Nematode Caenorhabditis elegans

    PubMed Central

    Silverman, Gary A.; Luke, Cliff J.; Bhatia, Sangeeta R.; Long, Olivia S.; Vetica, Anne C.; Perlmutter, David H.; Pak, Stephen C.

    2009-01-01

    As an experimental system, Caenorhabditis elegans offers a unique opportunity to interrogate in vivo the genetic and molecular functions of human disease-related genes. For example, C. elegans has provided crucial insights into fundamental biological processes such as cell death and cell fate determinations, as well as pathological processes such as neurodegeneration and microbial susceptibility. The C. elegans model has several distinct advantages including a completely sequenced genome that shares extensive homology with that of mammals, ease of cultivation and storage, a relatively short lifespan and techniques for generating null and transgenic animals. However, the ability to conduct unbiased forward and reverse genetic screens in C. elegans remains one of the most powerful experimental paradigms for discovering the biochemical pathways underlying human disease phenotypes. The identification of these pathways leads to a better understanding of the molecular interactions that perturb cellular physiology, and forms the foundation for designing mechanism-based therapies. To this end, the ability to process large numbers of isogenic animals through automated work stations suggests that C. elegans, manifesting different aspects of human disease phenotypes, will become the platform of choice for in vivo drug discovery and target validation using high-throughput/content screening technologies. PMID:18852689

  19. Cellient™ automated cell block versus traditional cell block preparation: a comparison of morphologic features and immunohistochemical staining.

    PubMed

    Wagner, David G; Russell, Donna K; Benson, Jenna M; Schneider, Ashley E; Hoda, Rana S; Bonfiglio, Thomas A

    2011-10-01

    Traditional cell block (TCB) sections serve as an important diagnostic adjunct to cytologic smears but are also used today as a reliable preparation for immunohistochemical (IHC) studies. There are many ways to prepare a cell block and the methods continue to be revised. In this study, we compare the TCB with the Cellient™ automated cell block system. Thirty-five cell blocks were obtained from 16 benign and 19 malignant nongynecologic cytology specimens at a large university teaching hospital and prepared according to TCB and Cellient protocols. Cell block sections from both methods were compared for possible differences in various morphologic features and immunohistochemical staining patterns. In the 16 benign cases, no significant morphologic differences were found between the TCB and Cellient cell block sections. For the 19 malignant cases, some noticeable differences in the nuclear chromatin and cellularity were identified, although statistical significance was not attained. Immunohistochemical or special stains were performed on 89% of the malignant cases (17/19). Inadequate cellularity precluded full evaluation in 23% of Cellient cell block IHC preparations (4/17). Of the malignant cases with adequate cellularity (13/17), the immunohistochemical staining patterns from the different methods were identical in 53% of cases. The traditional and Cellient cell block sections showed similar morphologic and immunohistochemical staining patterns. The only significant difference between the two methods concerned the lower overall cell block cellularity identified during immunohistochemical staining in the Cellient cell block sections. Copyright © 2010 Wiley-Liss, Inc.

  20. Immunohistochemical Expression of Matrix Metalloproteinase-7 in Human Colorectal Adenomas Using Specified Automated Cellular Image Analysis System: A Clinicopathological Study

    PubMed Central

    Qasim, Ban J.; Ali, Hussam H.; Hussein, Alaa G.

    2013-01-01

    Background/Aim: To evaluate the immunohistochemical expression of matrix metalloproteinase-7 (MMP-7) in colorectal adenomas, and to correlate this expression with different clinicopathological parameters. Patients and Methods: The study was retrospectively designed. Thirty-three paraffin blocks from patients with colorectal adenoma and 20 samples of non-tumorous colonic tissue taken as a control group were included in the study. MMP-7 expression was assessed by immunohistochemistry. The scoring of immunohistochemical staining was conducted utilizing a specified automated cellular image analysis system (Digimizer). Results: The frequency of positive immunohistochemical expression of MMP-7 was significantly higher in adenoma than in the control group (45.45% versus 10%) (P < 0.001). Strong MMP-7 staining was seen mainly in adenoma cases (30.30%) in comparison with controls (0%); the difference is significant (P < 0.001). The three digital parameters of MMP-7 immunohistochemical expression (area (A), number of objects (N), and intensity (I)) were significantly higher in adenoma than in control tissue. Mean A and I of MMP-7 showed a significant correlation with large adenoma size (≥ 1 cm) (P < 0.05), and a significant positive correlation of the three digital parameters (A, N, and I) of MMP-7 expression with villous configuration and severe dysplasia in colorectal adenoma was also identified (P < 0.05). Conclusion: MMP-7 plays an important role in the growth and malignant conversion of colorectal adenomas, as it is more likely to be expressed in advanced colorectal adenomatous polyps with large size, severe dysplasia and villous histology. The use of an automated cellular image analysis system (Digimizer) to quantify immunohistochemical staining yields more consistent assay results, converts a semi-quantitative assay to a truly quantitative one, and improves assay objectivity and reproducibility. PMID:23319034

  1. Prototypic automated continuous recreational water quality monitoring of nine Chicago beaches

    USGS Publications Warehouse

    Shively, Dawn; Nevers, Meredith; Breitenbach, Cathy; Phanikumar, Mantha S.; Przybyla-Kelly, Kasia; Spoljaric, Ashley M.; Whitman, Richard L.

    2015-01-01

    Predictive empirical modeling is used in many locations worldwide as a rapid, alternative recreational water quality management tool to eliminate the delayed notifications associated with traditional fecal indicator bacteria (FIB) culturing (referred to as the persistence model, PM) and to prevent errors in releasing swimming advisories. The goal of this study was to develop a fully automated water quality management system for multiple beaches using predictive empirical models (EMs) and state-of-the-art technology. Many recent EMs rely on samples or data collected manually, which adds to analysis time and increases the burden on the beach manager. In this study, data from water quality buoys and weather stations were transmitted through cellular telemetry to a web hosting service. An executable program simultaneously retrieved and aggregated data for regression equations and calculated EM results each morning at 9:30 AM; results were transferred through an RSS feed to a website, mapped to each beach, and received by the lifeguards to be posted at the beach. Models were initially developed for five beaches, but by the third year, 21 beaches were managed using refined and validated modeling systems. The adjusted R2 values of the regressions relating Escherichia coli to hydrometeorological variables for the EMs were greater than those for the PMs, and ranged from 0.220 to 0.390 (2011) and 0.103 to 0.381 (2012). Validation results in 2013 revealed reduced predictive capabilities; however, three of the originally modeled beaches showed improvement in 2013 compared to 2012. The EMs generally showed higher accuracy and specificity than the PMs, and sensitivity was low for both approaches. In 2012, EM accuracy was 70–97%; specificity, 71–100%; and sensitivity, 0–64%; in 2013, accuracy was 68–97%; specificity, 73–100%; and sensitivity, 0–36%. Factors that may have affected model capabilities include instrument malfunction, non-point source inputs, and sparse calibration data. The modeling system developed is the most extensive, fully automated system for recreational water quality developed to date. Key insights for refining and improving large-scale empirical models for beach management have been developed through this multi-year effort.
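The accuracy, specificity and sensitivity figures quoted above are standard confusion-matrix metrics for advisory decisions. A small self-contained sketch (illustrative data, not the study's) shows how they are computed against culture-based exceedances:

```python
# Advisory-decision metrics: predicted vs. culture-confirmed exceedances.
# Data below are invented for illustration.

def advisory_metrics(predicted_exceed, actual_exceed):
    pairs = list(zip(predicted_exceed, actual_exceed))
    tp = sum(p and a for p, a in pairs)             # advisories correctly posted
    tn = sum((not p) and (not a) for p, a in pairs) # correctly not posted
    fp = sum(p and (not a) for p, a in pairs)       # needless advisories
    fn = sum((not p) and a for p, a in pairs)       # missed exceedances
    return {
        "accuracy": (tp + tn) / len(pairs),
        "specificity": tn / (tn + fp),  # correct "no advisory" calls
        "sensitivity": tp / (tp + fn),  # exceedances actually caught
    }

# Ten beach-days: model posts an advisory (True) vs. culture result.
pred = [False, False, True, False, True, False, False, False, True, False]
obs  = [False, False, True, False, False, False, True, False, True, False]
m = advisory_metrics(pred, obs)
```

Low sensitivity with high specificity, as reported for both model types, means missed exceedances were far more common than needless advisories.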

  2. Fully Automated RNAscope In Situ Hybridization Assays for Formalin‐Fixed Paraffin‐Embedded Cells and Tissues

    PubMed Central

    Anderson, Courtney M.; Zhang, Bingqing; Miller, Melanie; Butko, Emerald; Wu, Xingyong; Laver, Thomas; Kernag, Casey; Kim, Jeffrey; Luo, Yuling; Lamparski, Henry; Park, Emily; Su, Nan

    2016-01-01

    ABSTRACT Biomarkers such as DNA, RNA, and protein are powerful tools in clinical diagnostics and therapeutic development for many diseases. Identifying RNA expression at the single cell level within the morphological context by RNA in situ hybridization provides a great deal of information on gene expression changes over conventional techniques that analyze bulk tissue, yet widespread use of this technique in the clinical setting has been hampered by the dearth of automated RNA ISH assays. Here we present an automated version of the RNA ISH technology RNAscope that is adaptable to multiple automation platforms. The automated RNAscope assay yields a high signal‐to‐noise ratio with little to no background staining and results comparable to the manual assay. In addition, the automated duplex RNAscope assay was able to detect two biomarkers simultaneously. Lastly, assay consistency and reproducibility were confirmed by quantification of TATA‐box binding protein (TBP) mRNA signals across multiple lots and multiple experiments. Taken together, the data presented in this study demonstrate that the automated RNAscope technology is a high performance RNA ISH assay with broad applicability in biomarker research and diagnostic assay development. J. Cell. Biochem. 117: 2201–2208, 2016. © 2016 The Authors. Journal of Cellular Biochemistry Published by Wiley Periodicals, Inc. PMID:27191821

  3. Semi-automated confocal imaging of fungal pathogenesis on plants: microscopic analysis of macroscopic specimens

    USDA-ARS?s Scientific Manuscript database

    Contextualizing natural genetic variation in plant disease resistance in terms of pathogenesis can provide information about the function of causal genes. Cellular mechanisms associated with pathogenesis can be elucidated with confocal microscopy, but systematic phenotyping platforms—from sample pro...

  4. Teaching Cellular Automation Concepts through Interdisciplinary Collaborative Learning.

    ERIC Educational Resources Information Center

    Biernacki, Joseph J.; Ayers, Jerry B.

    2000-01-01

    Reports on the experiences of 12 students--three senior undergraduates majoring in chemical engineering, five master-level, and four doctoral students--in a course titled "Interdisciplinary Studies in Multi-Scale Simulation of Concrete Materials". Course objectives focused on incorporating team-oriented interdisciplinary experiences into the…

  5. An end-to-end workflow for engineering of biological networks from high-level specifications.

    PubMed

    Beal, Jacob; Weiss, Ron; Densmore, Douglas; Adler, Aaron; Appleton, Evan; Babb, Jonathan; Bhatia, Swapnil; Davidsohn, Noah; Haddock, Traci; Loyall, Joseph; Schantz, Richard; Vasilev, Viktor; Yaman, Fusun

    2012-08-17

    We present a workflow for the design and production of biological networks from high-level program specifications. The workflow is based on a sequence of intermediate models that incrementally translate high-level specifications into DNA samples that implement them. We identify algorithms for translating between adjacent models and implement them as a set of software tools, organized into a four-stage toolchain: Specification, Compilation, Part Assignment, and Assembly. The specification stage begins with a Boolean logic computation specified in the Proto programming language. The compilation stage uses a library of network motifs and cellular platforms, also specified in Proto, to transform the program into an optimized Abstract Genetic Regulatory Network (AGRN) that implements the programmed behavior. The part assignment stage assigns DNA parts to the AGRN, drawing the parts from a database for the target cellular platform, to create a DNA sequence implementing the AGRN. Finally, the assembly stage computes an optimized assembly plan to create the DNA sequence from available part samples, yielding a protocol for producing a sample of engineered plasmids with robotics assistance. Our workflow is the first to automate the production of biological networks from a high-level program specification. Furthermore, the workflow's modular design allows the same program to be realized on different cellular platforms simply by swapping workflow configurations. We validated our workflow by specifying a small-molecule sensor-reporter program and verifying the resulting plasmids in both HEK 293 mammalian cells and in E. coli bacterial cells.
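The four-stage toolchain described above can be pictured as function composition, each stage consuming the previous stage's model. The sketch below is schematic only; stage names and data shapes are hypothetical placeholders, not the actual tool interfaces:

```python
# Schematic of a Specification -> Compilation -> Part Assignment -> Assembly
# toolchain as composed functions. All names and shapes are hypothetical.

def specification(program_text):
    # Parse a Boolean-logic program (e.g. written in Proto) into a spec.
    return {"stage": "spec", "program": program_text}

def compilation(spec):
    # Map the computation onto a motif library -> optimized abstract
    # genetic regulatory network (AGRN).
    return {"stage": "agrn", "from": spec["program"]}

def part_assignment(agrn, part_db="target-platform-db"):
    # Assign concrete DNA parts from the target platform's database.
    return {"stage": "sequence", "parts_from": part_db, "agrn": agrn}

def assembly(sequence):
    # Compute an ordered assembly plan for robotics-assisted production.
    return {"stage": "protocol", "input": sequence["stage"],
            "steps": ["digest", "ligate", "verify"]}

def toolchain(program_text):
    return assembly(part_assignment(compilation(specification(program_text))))

plan = toolchain("(and (sense IPTG) (sense aTc))")
```

Retargeting a program to a different cellular platform then corresponds to swapping the `part_db` argument while the upstream stages stay unchanged.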

  6. Quantitative phase-digital holographic microscopy: a new imaging modality to identify original cellular biomarkers of diseases

    NASA Astrophysics Data System (ADS)

    Marquet, P.; Rothenfusser, K.; Rappaz, B.; Depeursinge, C.; Jourdain, P.; Magistretti, P. J.

    2016-03-01

    Quantitative phase microscopy (QPM) has recently emerged as a powerful label-free technique in the field of living cell imaging, allowing cell structure and dynamics to be measured non-invasively with nanometric axial sensitivity. Since the phase retardation of a light wave transmitted through the observed cells, namely the quantitative phase signal (QPS), is sensitive to both cellular thickness and the intracellular refractive index related to the cellular content, its accurate analysis allows various cell parameters to be derived and specific cell processes to be monitored, and is thus well suited to identifying new cell biomarkers. Specifically, quantitative phase-digital holographic microscopy (QP-DHM), thanks to a numerical flexibility that facilitates parallelization and automation, represents an appealing imaging modality both to identify original cellular biomarkers of diseases and to explore the underlying pathophysiological processes.
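As a concrete illustration of why the QPS couples thickness and refractive index, the standard first-order relation for the phase shift of light crossing a cell (a textbook approximation, not a formula quoted in this abstract) is phi = (2*pi/lambda) * (n_cell - n_medium) * d:

```python
# First-order phase shift of light transmitted through a cell of
# thickness d, intracellular index n_cell, in a medium of index n_medium.
# Numbers below are typical illustrative values, not measured data.
import math

def phase_shift(thickness_nm, n_cell, n_medium, wavelength_nm):
    return 2 * math.pi * (n_cell - n_medium) * thickness_nm / wavelength_nm

# A ~5 um cell (n ~ 1.375) in culture medium (n ~ 1.334) at 532 nm:
phi = phase_shift(5000, 1.375, 1.334, 532)  # radians
```

The same phi can come from a thicker cell with lower index or a thinner cell with higher index, which is why deriving cell parameters requires decoupling the two contributions.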

  7. Refined annotation and assembly of the Tetrahymena thermophila genome sequence through EST analysis, comparative genomic hybridization, and targeted gap closure

    PubMed Central

    Coyne, Robert S; Thiagarajan, Mathangi; Jones, Kristie M; Wortman, Jennifer R; Tallon, Luke J; Haas, Brian J; Cassidy-Hanley, Donna M; Wiley, Emily A; Smith, Joshua J; Collins, Kathleen; Lee, Suzanne R; Couvillion, Mary T; Liu, Yifan; Garg, Jyoti; Pearlman, Ronald E; Hamilton, Eileen P; Orias, Eduardo; Eisen, Jonathan A; Methé, Barbara A

    2008-01-01

    Background Tetrahymena thermophila, a widely studied model for cellular and molecular biology, is a binucleated single-celled organism with a germline micronucleus (MIC) and somatic macronucleus (MAC). The recent draft MAC genome assembly revealed low sequence repetitiveness, a result of the epigenetic removal of invasive DNA elements found only in the MIC genome. Such low repetitiveness makes complete closure of the MAC genome a feasible goal, though achieving it would require standard closure methods as well as removal of minor MIC contamination from the MAC genome assembly. Highly accurate preliminary annotation of Tetrahymena's coding potential was hindered by the lack of both comparative genomic sequence information from close relatives and significant amounts of cDNA evidence, thus limiting the value of the genomic information and also leaving unanswered certain questions, such as the frequency of alternative splicing. Results We addressed the problem of MIC contamination using comparative genomic hybridization with purified MIC and MAC DNA probes against a whole genome oligonucleotide microarray, allowing the identification of 763 genome scaffolds likely to contain MIC-limited DNA sequences. We also employed standard genome closure methods to essentially finish over 60% of the MAC genome. For the improvement of annotation, we have sequenced and analyzed over 60,000 verified EST reads from a variety of cellular growth and development conditions. Using this EST evidence, a combination of automated and manual reannotation efforts led to updates that affect 16% of the current protein-coding gene models. By comparing EST abundance, many genes showing apparent differential expression between these conditions were identified. Rare instances of alternative splicing and uses of the non-standard amino acid selenocysteine were also identified. Conclusion We report here significant progress in genome closure and reannotation of Tetrahymena thermophila. Our experience to date suggests that complete closure of the MAC genome is attainable. Using the new EST evidence, automated and manual curation has resulted in substantial improvements to the over 24,000 gene models, which will be valuable to researchers studying this model organism as well as for comparative genomics purposes. PMID:19036158

  8. Bioprocessing automation in cell therapy manufacturing: Outcomes of special interest group automation workshop.

    PubMed

    Ball, Oliver; Robinson, Sarah; Bure, Kim; Brindley, David A; Mccall, David

    2018-04-01

    Phacilitate held a Special Interest Group workshop event in Edinburgh, UK, in May 2017. The event brought together leading stakeholders in the cell therapy bioprocessing field to identify present and future challenges and propose potential solutions to automation in cell therapy bioprocessing. Here, we review and summarize discussions from the event. Deep biological understanding of a product, its mechanism of action and indication pathogenesis underpin many factors relating to bioprocessing and automation. To fully exploit the opportunities of bioprocess automation, therapeutics developers must closely consider whether an automation strategy is applicable, how to design an 'automatable' bioprocess and how to implement process modifications with minimal disruption. Major decisions around bioprocess automation strategy should involve all relevant stakeholders; communication between technical and business strategy decision-makers is of particular importance. Developers should leverage automation to implement in-process testing, in turn applicable to process optimization, quality assurance (QA)/quality control (QC), batch failure control, adaptive manufacturing and regulatory demands, but a lack of precedent and technical opportunities can complicate such efforts. Sparse standardization across product characterization, hardware components and software platforms is perceived to complicate efforts to implement automation. The use of advanced algorithmic approaches such as machine learning may have application to bioprocess and supply chain optimization. Automation can substantially de-risk the wider supply chain, including tracking and traceability, cryopreservation and thawing and logistics. The regulatory implications of automation are currently unclear because few hardware options exist and novel solutions require case-by-case validation, but automation can present attractive regulatory incentives. Copyright © 2018 International Society for Cellular Therapy. Published by Elsevier Inc. All rights reserved.

  9. A semi-automated technique for labeling and counting of apoptosing retinal cells

    PubMed Central

    2014-01-01

    Background Retinal ganglion cell (RGC) loss is one of the earliest and most important cellular changes in glaucoma. The DARC (Detection of Apoptosing Retinal Cells) technology enables in vivo real-time non-invasive imaging of single apoptosing retinal cells in animal models of glaucoma and Alzheimer’s disease. To date, apoptosing RGCs imaged using DARC have been counted manually. This is time-consuming, labour-intensive, vulnerable to bias, and has considerable inter- and intra-operator variability. Results A semi-automated algorithm was developed which enabled automated identification of apoptosing RGCs labeled with fluorescent Annexin-5 on DARC images. Automated analysis included a pre-processing stage involving local-luminance and local-contrast “gain control”, a “blob analysis” step to differentiate between cells, vessels and noise, and a method to exclude non-cell structures using specific combined ‘size’ and ‘aspect’ ratio criteria. Apoptosing retinal cells were counted by 3 masked operators, generating ‘gold-standard’ mean manual cell counts, and were also counted using the newly developed automated algorithm. Comparison between automated cell counts and the mean manual cell counts on 66 DARC images showed significant correlation between the two methods (Pearson’s correlation coefficient 0.978, p < 0.001; R squared = 0.956). The intraclass correlation coefficient was 0.986 (95% CI 0.977-0.991, p < 0.001), and Cronbach’s alpha measure of consistency was 0.986, confirming excellent correlation and consistency. No significant difference (p = 0.922, 95% CI: −5.53 to 6.10) was detected between the cell counts of the two methods. Conclusions The novel automated algorithm enabled accurate quantification of apoptosing RGCs that is highly comparable to manual counting, and appears to minimise operator bias, whilst being both fast and reproducible. This may prove to be a valuable method of quantifying apoptosing retinal cells, with particular relevance to translation in the clinic, where a Phase I clinical trial of DARC in glaucoma patients is due to start shortly. PMID:24902592
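Pearson's correlation, used above to compare the two counting methods, is straightforward to compute. A minimal sketch with invented counts (not the study's data):

```python
# Pearson's r between automated and mean manual cell counts.
# The six image counts below are made up for illustration.
import math

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

manual    = [12, 30, 45, 51, 70, 88]   # mean of 3 masked operators
automated = [14, 28, 47, 50, 73, 85]
r = pearson_r(manual, automated)       # close to 1 for good agreement
```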

  10. Protocol for the validation of microbiological control of cellular products according to German regulators' recommendations--Boon and Bane for the manufacturer.

    PubMed

    Störmer, M; Radojska, S; Hos, N J; Gathof, B S

    2015-04-01

    In order to generate standardized conditions for the microbiological control of HPCs, the PEI has recommended defined validation steps; as shown in this study, these lead to extensive validation work. Here, a possible validation principle for the microbiological control of allogeneic SCPs is presented. Although it could be demonstrated that automated culture improves the microbial safety of cellular products, the requirement for extensive validation studies needs to be considered. © 2014 International Society of Blood Transfusion.

  11. The Design of Fault Tolerant Quantum Dot Cellular Automata Based Logic

    NASA Technical Reports Server (NTRS)

    Armstrong, C. Duane; Humphreys, William M.; Fijany, Amir

    2002-01-01

    As transistor geometries are reduced, quantum effects begin to dominate device performance. At some point, transistors cease to have the properties that make them useful computational components. New computing elements must be developed in order to keep pace with Moore's Law. Quantum dot cellular automata (QCA) represent an alternative paradigm to transistor-based logic. QCA architectures that are robust to manufacturing tolerances and defects must be developed. We are developing software that allows the exploration of fault-tolerant QCA gate architectures by automating the specification, simulation, analysis and documentation processes.
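The basic QCA logic primitive is the three-input majority gate, M(a, b, c) = ab + ac + bc; pinning one input to 0 or 1 yields AND or OR. One generic fault-tolerance idea, sketched below as plain Boolean code (an illustration of the concept, not the authors' software), is triple redundancy with a majority vote over the gate copies:

```python
# Majority-gate logic, the QCA primitive, plus a triple-redundancy sketch.

def majority(a, b, c):
    return (a and b) or (a and c) or (b and c)

def qca_and(a, b):
    return majority(a, b, False)   # one input pinned to 0 gives AND

def qca_or(a, b):
    return majority(a, b, True)    # one input pinned to 1 gives OR

def redundant_and(a, b, faults=(False, False, False)):
    # Three copies of the gate; a stuck-output fault flips one copy.
    outs = [qca_and(a, b) ^ f for f in faults]
    return majority(*outs)

# The voted output tolerates any single faulty copy:
assert redundant_and(True, True, faults=(True, False, False))
```

Robustness analysis of a candidate gate layout then amounts to enumerating single-cell defects and checking that the voted output is unchanged, which is the kind of search such exploration software can automate.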

  12. Machine Learning of Human Pluripotent Stem Cell-Derived Engineered Cardiac Tissue Contractility for Automated Drug Classification.

    PubMed

    Lee, Eugene K; Tran, David D; Keung, Wendy; Chan, Patrick; Wong, Gabriel; Chan, Camie W; Costa, Kevin D; Li, Ronald A; Khine, Michelle

    2017-11-14

    Accurately predicting cardioactive effects of new molecular entities for therapeutics remains a daunting challenge. Immense research effort has been focused toward creating new screening platforms that utilize human pluripotent stem cell (hPSC)-derived cardiomyocytes and three-dimensional engineered cardiac tissue constructs to better recapitulate human heart function and drug responses. As these new platforms become increasingly sophisticated and high throughput, the drug screens result in larger multidimensional datasets. Improved automated analysis methods must therefore be developed in parallel to fully comprehend the cellular response across a multidimensional parameter space. Here, we describe the use of machine learning to comprehensively analyze 17 functional parameters derived from force readouts of hPSC-derived ventricular cardiac tissue strips (hvCTS) electrically paced at a range of frequencies and exposed to a library of compounds. The generated metric effectively determines the cardioactivity of a given drug. Furthermore, we demonstrate a classification model that can automatically predict the mechanistic action of an unknown cardioactive drug. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
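As a hedged illustration of the classification idea (not the authors' actual pipeline, features, or model), one can represent each drug by a vector of contractility parameters and assign the mechanistic class of the nearest class centroid in feature space:

```python
# Nearest-centroid classification over contractility feature vectors.
# Feature meanings, class labels, and all numbers are invented examples.
import math

def centroid(vectors):
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def nearest_class(x, training):
    """training: dict mapping class label -> list of feature vectors."""
    cents = {label: centroid(vs) for label, vs in training.items()}
    return min(cents, key=lambda lbl: math.dist(x, cents[lbl]))

# Toy 3-parameter vectors (e.g. force amplitude, upstroke, relaxation):
training = {
    "inotrope":   [[1.8, 1.2, 1.0], [2.0, 1.3, 1.1]],
    "depressant": [[0.4, 0.7, 1.5], [0.5, 0.6, 1.4]],
}
label = nearest_class([1.9, 1.25, 1.05], training)
```

A real screen would use all 17 parameters per pacing frequency and a trained model rather than raw centroids, but the structure (drug -> feature vector -> predicted mechanism) is the same.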

  13. Automated method for the rapid and precise estimation of adherent cell culture characteristics from phase contrast microscopy images.

    PubMed

    Jaccard, Nicolas; Griffin, Lewis D; Keser, Ana; Macown, Rhys J; Super, Alexandre; Veraitch, Farlan S; Szita, Nicolas

    2014-03-01

    The quantitative determination of key adherent cell culture characteristics such as confluency, morphology, and cell density is necessary for the evaluation of experimental outcomes and to provide a suitable basis for the establishment of robust cell culture protocols. Automated processing of images acquired using phase contrast microscopy (PCM), an imaging modality widely used for the visual inspection of adherent cell cultures, could enable the non-invasive determination of these characteristics. We present an image-processing approach that accurately detects cellular objects in PCM images through a combination of local contrast thresholding and post hoc correction of halo artifacts. The method was thoroughly validated using a variety of cell lines, microscope models and imaging conditions, demonstrating consistently high segmentation performance in all cases and very short processing times (<1 s per 1,208 × 960 pixels image). Based on the high segmentation performance, it was possible to precisely determine culture confluency, cell density, and the morphology of cellular objects, demonstrating the wide applicability of our algorithm for typical microscopy image processing pipelines. Furthermore, PCM image segmentation was used to facilitate the interpretation and analysis of fluorescence microscopy data, enabling the determination of temporal and spatial expression patterns of a fluorescent reporter. We created a software toolbox (PHANTAST) that bundles all the algorithms and provides an easy to use graphical user interface. Source-code for MATLAB and ImageJ is freely available under a permissive open-source license. © 2013 The Authors. Biotechnology and Bioengineering Published by Wiley Periodicals, Inc.
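The core segmentation idea named above, local contrast thresholding, can be sketched in a few lines: a pixel is kept as "cellular" when the intensity spread in its neighborhood exceeds a threshold (the halo-correction step is omitted, and the parameters are illustrative, not those of the published algorithm):

```python
# Local contrast thresholding on a grayscale image (nested lists):
# flat background has low local contrast, cell texture has high contrast.

def local_contrast_mask(img, radius=1, threshold=10):
    h, w = len(img), len(img[0])
    mask = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            window = [img[j][i]
                      for j in range(max(0, y - radius), min(h, y + radius + 1))
                      for i in range(max(0, x - radius), min(w, x + radius + 1))]
            mask[y][x] = (max(window) - min(window)) > threshold
    return mask

# Flat background (low contrast) vs. a textured "cell" pixel:
img = [
    [100, 100, 100, 100],
    [100, 100, 140, 100],
    [100, 100, 100, 100],
]
mask = local_contrast_mask(img)
```

Confluency then falls out directly as the fraction of True pixels, and connected regions of the mask give per-object morphology.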

  14. Automated Method for the Rapid and Precise Estimation of Adherent Cell Culture Characteristics from Phase Contrast Microscopy Images

    PubMed Central

    Jaccard, Nicolas; Griffin, Lewis D; Keser, Ana; Macown, Rhys J; Super, Alexandre; Veraitch, Farlan S; Szita, Nicolas

    2014-01-01

    The quantitative determination of key adherent cell culture characteristics such as confluency, morphology, and cell density is necessary for the evaluation of experimental outcomes and to provide a suitable basis for the establishment of robust cell culture protocols. Automated processing of images acquired using phase contrast microscopy (PCM), an imaging modality widely used for the visual inspection of adherent cell cultures, could enable the non-invasive determination of these characteristics. We present an image-processing approach that accurately detects cellular objects in PCM images through a combination of local contrast thresholding and post hoc correction of halo artifacts. The method was thoroughly validated using a variety of cell lines, microscope models, and imaging conditions, demonstrating consistently high segmentation performance in all cases and very short processing times (<1 s per 1,208 × 960 pixel image). Based on the high segmentation performance, it was possible to precisely determine culture confluency, cell density, and the morphology of cellular objects, demonstrating the wide applicability of our algorithm for typical microscopy image processing pipelines. Furthermore, PCM image segmentation was used to facilitate the interpretation and analysis of fluorescence microscopy data, enabling the determination of temporal and spatial expression patterns of a fluorescent reporter. We created a software toolbox (PHANTAST) that bundles all the algorithms and provides an easy-to-use graphical user interface. Source code for MATLAB and ImageJ is freely available under a permissive open-source license. Biotechnol. Bioeng. 2014;111: 504–517. © 2013 Wiley Periodicals, Inc. PMID:24037521

  15. Remote automated multi-generational growth and observation of an animal in low Earth orbit

    PubMed Central

    Oczypok, Elizabeth A.; Etheridge, Timothy; Freeman, Jacob; Stodieck, Louis; Johnsen, Robert; Baillie, David; Szewczyk, Nathaniel J.

    2012-01-01

    The ultimate survival of humanity is dependent upon colonization of other planetary bodies. Key challenges to such habitation are (patho)physiologic changes induced by known, and unknown, factors associated with long-duration and long-distance space exploration. However, we currently lack biological models for detecting and studying these changes. Here, we use a remote automated culture system to successfully grow an animal in low Earth orbit for six months. Our observations, over 12 generations, demonstrate that the multi-cellular soil worm Caenorhabditis elegans develops from egg to adulthood and produces progeny with the same timing in space as on Earth. Additionally, these animals display normal rates of movement when fully fed, comparable declines in movement when starved, and appropriate growth arrest upon starvation and recovery upon re-feeding. These observations establish C. elegans as a biological model that can be used to detect changes in animal growth, development, reproduction, and behaviour in response to environmental conditions during long-duration spaceflight. This experimental system is ready to be incorporated on future, unmanned interplanetary missions and could be used to cost-effectively study the effects of such missions on these biological processes and the efficacy of new life support systems and radiation shielding technologies. PMID:22130552

  16. The iFly Tracking System for an Automated Locomotor and Behavioural Analysis of Drosophila melanogaster

    PubMed Central

    Kohlhoff, Kai J.; Jahn, Thomas R.; Lomas, David A.; Dobson, Christopher M.; Crowther, Damian C.; Vendruscolo, Michele

    2016-01-01

    The use of animal models in medical research provides insights into molecular and cellular mechanisms of human disease, and helps identify and test novel therapeutic strategies. Drosophila melanogaster – the common fruit fly – is one of the most established model organisms, as its study can be performed more readily and with far less expense than for other model animal systems, such as mice, fish, or indeed primates. In the case of fruit flies, standard assays are based on the analysis of longevity and basic locomotor functions. Here we present the iFly tracking system, which increases the amount of quantitative information that can be extracted from these studies while significantly reducing their duration and cost. The iFly system uses a single camera to simultaneously track the trajectories of up to 20 individual flies with about 100 μm spatial and 33 ms temporal resolution. The statistical analysis of fly movements recorded with such accuracy makes it possible to perform a rapid and fully automated quantitative analysis of locomotor changes in response to a range of different stimuli. We anticipate that the iFly method will considerably reduce the cost and duration of the testing of genetic and pharmacological interventions in Drosophila models, including an earlier detection of behavioural changes and a large increase in throughput compared to current longevity and locomotor assays. PMID:21698336
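
Frame-to-frame identity assignment is one core step in any multi-animal tracker of this kind. It can be illustrated with the Hungarian algorithm; this is a generic sketch, not the actual iFly algorithm, and the coordinates below are invented.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def link_detections(prev, curr):
    """Match detections between consecutive frames by minimising total
    squared displacement (Hungarian assignment). cols[i] gives the index
    in `curr` of the fly that was prev[i]."""
    cost = ((prev[:, None, :] - curr[None, :, :]) ** 2).sum(axis=2)
    _, cols = linear_sum_assignment(cost)
    return cols

prev = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
# same three flies one frame later, slightly moved and listed in a new order
curr = np.array([[10.2, 0.1], [0.1, 9.8], [0.2, -0.1]])
ident = link_detections(prev, curr)
print(ident)  # -> [2 0 1]
```

With 20 flies at 33 ms per frame, displacements between frames are small relative to inter-fly distances, which is exactly the regime in which this kind of nearest-assignment linking is reliable.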

  17. Robotic Automation of In Vivo Two-Photon Targeted Whole-Cell Patch-Clamp Electrophysiology.

    PubMed

    Annecchino, Luca A; Morris, Alexander R; Copeland, Caroline S; Agabi, Oshiorenoya E; Chadderton, Paul; Schultz, Simon R

    2017-08-30

    Whole-cell patch-clamp electrophysiological recording is a powerful technique for studying cellular function. While in vivo patch-clamp recording has recently benefited from automation, it is normally performed "blind," meaning that throughput for sampling some genetically or morphologically defined cell types is unacceptably low. One solution to this problem is to use two-photon microscopy to target fluorescently labeled neurons. Combining this with robotic automation is difficult, however, as micropipette penetration induces tissue deformation, moving target cells from their initial location. Here we describe a platform for automated two-photon targeted patch-clamp recording, which solves this problem by making use of a closed-loop visual servo algorithm. Our system keeps the target cell in focus while iteratively adjusting the pipette approach trajectory to compensate for tissue motion. We validate the platform with patch-clamp recordings from a variety of cells in the mouse neocortex and cerebellum. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
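
The closed-loop idea (re-image the cell, measure the error, correct the trajectory) can be caricatured with a proportional controller. This is a toy sketch under invented parameters (gain, drift rate, tolerance), not the published visual servo algorithm.

```python
import numpy as np

def servo_approach(cell, pipette, drift, gain=0.5, steps=40, tol=0.5):
    """Toy closed-loop approach: each iteration 're-images' the drifting
    cell and corrects the pipette trajectory toward its current position.
    Gain, drift, and tolerance values are invented for illustration."""
    cell = np.asarray(cell, float).copy()
    pip = np.asarray(pipette, float).copy()
    for _ in range(steps):
        cell = cell + drift          # tissue deformation moves the target
        error = cell - pip           # visual feedback on the new position
        pip = pip + gain * error     # proportional trajectory correction
        if np.linalg.norm(error) < tol:
            return True, pip
    return False, pip

drift = np.array([0.05, 0.0])
ok, final = servo_approach([0, 0], [0, -20], drift)           # closed loop
ok_open, _ = servo_approach([0, 0], [0, -20], drift, gain=0)  # no feedback
print(ok, ok_open)  # -> True False
```

The open-loop run (gain 0) never reaches the cell, which is the failure mode the abstract describes: a pre-planned trajectory misses a target that tissue deformation has moved.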

  18. Imaging cell picker: A morphology-based automated cell separation system on a photodegradable hydrogel culture platform.

    PubMed

    Shibuta, Mayu; Tamura, Masato; Kanie, Kei; Yanagisawa, Masumi; Matsui, Hirofumi; Satoh, Taku; Takagi, Toshiyuki; Kanamori, Toshiyuki; Sugiura, Shinji; Kato, Ryuji

    2018-06-09

    Cellular morphology on and in a scaffold composed of extracellular matrix generally reflects the cellular phenotype. Morphology-based cell separation is therefore an attractive method, as it enables cell separation without staining of surface markers, in contrast to conventional cell separation methods (e.g., fluorescence-activated cell sorting and magnetic-activated cell sorting). In a previous study, we proposed a cloning technology that uses a photodegradable gelatin hydrogel to separate individual cells on and in hydrogels. To further expand the applicability of this photodegradable hydrogel culture platform, we report here an image-based cell separation system, the imaging cell picker, for morphology-based cell separation on a photodegradable hydrogel. The platform enables an automated workflow of image acquisition, image processing and morphology analysis, and collection of target cells. We demonstrate the performance of morphology-based cell separation through the optimization of the critical parameters that determine the system's performance, namely (i) culture conditions, (ii) imaging conditions, and (iii) the image analysis scheme, to clone the cells of interest. Furthermore, we demonstrate morphology-based cloning of cancer cells from a mixture of cells by automated hydrogel degradation with light irradiation and pipetting. Copyright © 2018 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.

  19. An analytical tool that quantifies cellular morphology changes from three-dimensional fluorescence images.

    PubMed

    Haass-Koffler, Carolina L; Naeemuddin, Mohammad; Bartlett, Selena E

    2012-08-31

    The most common software analysis tools available for measuring fluorescence images are designed for two-dimensional (2D) data; they rely on manual settings for inclusion and exclusion of data points, and on computer-aided pattern recognition to support the interpretation and findings of the analysis. It has become increasingly important to be able to measure fluorescence images constructed from three-dimensional (3D) datasets in order to capture the complexity of cellular dynamics and understand the basis of cellular plasticity within biological systems. Sophisticated microscopy instruments permit the visualization of 3D fluorescence images through the acquisition of multispectral fluorescence images and powerful analytical software that reconstructs the images from confocal stacks to provide a 3D representation of the collected 2D images. Advanced design-based stereology methods have moved beyond the approximations and assumptions of the original model-based stereology, even in complex tissue sections. Despite these scientific advances in microscopy, a need remains for an automated analytic method that fully exploits the intrinsic 3D data to allow for the analysis and quantification of the complex changes in cell morphology, protein localization, and receptor trafficking. Current techniques available to quantify fluorescence images include MetaMorph (Molecular Devices, Sunnyvale, CA) and ImageJ (NIH), which provide manual analysis. Imaris (Andor Technology, Belfast, Northern Ireland) software provides the feature MeasurementPro, which allows the manual creation of measurement points that can be placed in a volume image or drawn on a series of 2D slices to create a 3D object. This method is useful for single-click point measurements to measure a line distance between two objects or to create a polygon that encloses a region of interest, but it is difficult to apply to complex cellular network structures.
Filament Tracer (Andor) allows automatic detection of 3D neuronal filament-like structures; however, this module was developed to measure defined structures such as neurons, which comprise dendrites, axons, and spines (a tree-like structure). The module has been ingeniously applied to morphological measurements of non-neuronal cells; however, the output describes an extended cellular network using software that assumes a defined cell shape rather than an amorphous cellular model. To overcome the issue of analyzing amorphous-shaped cells and to make the software more suitable for biological applications, Imaris developed Imaris Cell, a scientific project with the Eidgenössische Technische Hochschule developed to calculate the relationship between cells and organelles. While the software enables the detection of biological constraints, by forcing one nucleus per cell and using cell membranes to segment cells, it cannot be used to analyze fluorescence data that are not continuous, because it ideally builds the cell surface without void spaces. To our knowledge, no user-modifiable automated approach has yet been developed that provides morphometric information from 3D fluorescence images and captures cellular spatial information for cells of undefined shape (Figure 1). We have developed an analytical platform using the Imaris core software module and Imaris XT interfaced to MATLAB (MathWorks, Inc.). These tools allow the 3D measurement of cells without a pre-defined shape and with inconsistent fluorescence network components. Furthermore, this method allows researchers who have extensive expertise in biological systems, but limited familiarity with computer applications, to quantify morphological changes in cell dynamics.
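
As a generic illustration of shape-free 3D quantification of the kind motivated above (not the authors' Imaris/MATLAB platform), connected components in a binary 3D stack can be labeled and measured with SciPy; the toy stack and object sizes below are arbitrary assumptions.

```python
import numpy as np
from scipy import ndimage as ndi

# Toy 3D "fluorescence" stack containing two amorphous objects.
stack = np.zeros((20, 20, 20), dtype=bool)
stack[2:6, 2:6, 2:6] = True        # object A: 4 * 4 * 4 = 64 voxels
stack[10:18, 10:14, 10:13] = True  # object B: 8 * 4 * 3 = 96 voxels

# Label connected components in 3D and quantify each object's volume --
# a measurement that assumes nothing about object shape, unlike
# filament- or cell-shaped models.
labels, n = ndi.label(stack)
volumes = ndi.sum(stack, labels, index=range(1, n + 1))
print(n, volumes.tolist())  # -> 2 [64.0, 96.0]
```

Because labeling operates on voxel connectivity alone, the same measurement works for amorphous cells, discontinuous networks, or any other structure without a pre-defined geometric model.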

  20. From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data

    PubMed Central

    Tsai, Wen-Ting; Hassan, Ahmed; Sarkar, Purbasha; Correa, Joaquin; Metlagel, Zoltan; Jorgens, Danielle M.; Auer, Manfred

    2014-01-01

    Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles, in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation. The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: electron tomography of resin-embedded, stained samples, and focused ion beam and serial block-face scanning electron microscopy (FIB-SEM and SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful.
Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets. PMID:25145678

  1. End-to-end automated microfluidic platform for synthetic biology: from design to functional analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Linshiz, Gregory; Jensen, Erik; Stawski, Nina

    Synthetic biology aims to engineer biological systems for desired behaviors. The construction of these systems can be complex, often requiring genetic reprogramming, extensive de novo DNA synthesis, and functional screening. Here, we present a programmable, multipurpose microfluidic platform and associated software and apply the platform to major steps of the synthetic biology research cycle: design, construction, testing, and analysis. We show the platform’s capabilities for multiple automated DNA assembly methods, including a new method for Isothermal Hierarchical DNA Construction, and for Escherichia coli and Saccharomyces cerevisiae transformation. The platform enables the automated control of cellular growth, gene expression induction, and proteogenic and metabolic output analysis. Finally, taken together, we demonstrate the microfluidic platform’s potential to provide end-to-end solutions for synthetic biology research, from design to functional analysis.

  2. End-to-end automated microfluidic platform for synthetic biology: from design to functional analysis

    DOE PAGES

    Linshiz, Gregory; Jensen, Erik; Stawski, Nina; ...

    2016-02-02

    Synthetic biology aims to engineer biological systems for desired behaviors. The construction of these systems can be complex, often requiring genetic reprogramming, extensive de novo DNA synthesis, and functional screening. Here, we present a programmable, multipurpose microfluidic platform and associated software and apply the platform to major steps of the synthetic biology research cycle: design, construction, testing, and analysis. We show the platform’s capabilities for multiple automated DNA assembly methods, including a new method for Isothermal Hierarchical DNA Construction, and for Escherichia coli and Saccharomyces cerevisiae transformation. The platform enables the automated control of cellular growth, gene expression induction, and proteogenic and metabolic output analysis. Finally, taken together, we demonstrate the microfluidic platform’s potential to provide end-to-end solutions for synthetic biology research, from design to functional analysis.

  3. Nerve Growth Factor-Induced Angiogenesis: 1. Endothelial Cell Tube Formation Assay.

    PubMed

    Lazarovici, Philip; Lahiani, Adi; Gincberg, Galit; Haham, Dikla; Fluksman, Arnon; Benny, Ofra; Marcinkiewicz, Cezary; Lelkes, Peter I

    2018-01-01

    Nerve growth factor (NGF) is a neurotrophin promoting survival, proliferation, differentiation, and neuroprotection in the embryonal and adult nervous system. NGF also induces angiogenic effects in the cardiovascular system, which may be beneficial in engineering new blood vessels and for developing novel anti-angiogenesis therapies for cancer. Angiogenesis is a cellular process characterized by a number of events, including endothelial cell migration, invasion, and assembly into capillaries. In vitro endothelial tube formation assays are performed using primary human umbilical vein endothelial cells, human aortic endothelial cells, and other human or rodent primary endothelial cells isolated from the vasculature of both tumors and normal tissues. Immortalized endothelial cell lines are also used for these assays. When seeded onto Matrigel, these cells reorganize to create tube-like structures, which may be used as models for studying some aspects of in vitro angiogenesis. Image acquisition by light and fluorescence microscopy and/or quantification of fluorescently labeled cells can be carried out manually or digitally, using commercial software and automated image processing. Here we detail materials, procedures, assay conditions, and cell labeling for quantification of endothelial cell tube formation. This model can be applied to study cellular and molecular mechanisms by which NGF or other neurotrophins promote angiogenesis. This model may also be useful for the development of potential angiogenic and/or anti-angiogenic drugs targeting NGF receptors.

  4. Simulating the impacts of on-street vehicle parking on traffic operations on urban streets using cellular automation

    NASA Astrophysics Data System (ADS)

    Chen, Jingxu; Li, Zhibin; Jiang, Hang; Zhu, Senlai; Wang, Wei

    2017-02-01

    In recent years, many bicycle lanes on urban streets have been replaced with vehicle parking places. Spaces for bicycle riding are reduced, resulting in changes in bicycle and vehicle operational features. The objective of this study is to estimate the impacts of on-street parking on heterogeneous traffic operation on urban streets. A cellular automaton (CA) model is developed and calibrated to simulate bicycle lane-changing on streets with on-street parking. Two types of street segments with different bicycle lane widths are considered. From the simulation, two types of conflicts between bicycles and vehicles are identified: frictional conflicts and blocking conflicts. Factors affecting the frequency of conflicts are also identified. Based on the results, vehicle delay is estimated for various traffic situations considering the range of occupancy levels for on-street parking. Later, a numerical network example is analyzed to estimate the network-level impact of on-street parking on traffic assignment and operation. Findings of the study are helpful for policy and design decisions regarding on-street vehicle parking to improve the efficiency of traffic operations.
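
A deterministic single-lane Nagel-Schreckenberg update illustrates the kind of cellular automaton such traffic models build on; the paper's bicycle/vehicle lane-changing model with parking occupancy is considerably richer, so treat this as an assumption-laden toy rather than the authors' model.

```python
import numpy as np

def nasch_step(pos, vel, road_len, v_max=3):
    """One update of a deterministic Nagel-Schreckenberg cellular automaton
    on a circular single-lane road (random braking switched off so the run
    is reproducible). Accelerate, then brake to the gap, then move."""
    order = np.argsort(pos)
    pos, vel = pos[order], vel[order]
    n = len(pos)
    gaps = (pos[(np.arange(n) + 1) % n] - pos - 1) % road_len  # empty cells ahead
    vel = np.minimum(vel + 1, v_max)  # acceleration toward desired speed
    vel = np.minimum(vel, gaps)       # braking prevents collisions
    return (pos + vel) % road_len, vel

pos = np.array([0, 2, 3])  # three vehicles on a 10-cell ring road
vel = np.array([0, 0, 0])
for _ in range(5):
    pos, vel = nasch_step(pos, vel, road_len=10)
print(sorted(pos.tolist()))  # -> [0, 3, 7]
```

The initially bunched vehicles spread out over a few updates; blocking and frictional conflicts of the kind identified in the paper would enter such a model as additional rules that cap the gap or force lane changes.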

  5. Designing automation for human use: empirical studies and quantitative models.

    PubMed

    Parasuraman, R

    2000-07-01

    An emerging knowledge base of human performance research can provide guidelines for designing automation that can be used effectively by human operators of complex systems. Which functions should be automated, and to what extent, in a given system? A model for types and levels of automation that provides a framework and an objective basis for making such choices is described. The human performance consequences of particular types and levels of automation constitute primary evaluative criteria for automation design when using the model. Four human performance areas are considered: mental workload, situation awareness, complacency, and skill degradation. Secondary evaluative criteria include such factors as automation reliability, the risks of decision/action consequences, and the ease of systems integration. In addition to this qualitative approach, quantitative models can inform design. Several computational and formal models of human interaction with automation that have been proposed by various researchers are reviewed. An important future research need is the integration of qualitative and quantitative approaches. Application of these models provides an objective basis for designing automation for effective human use.

  6. Using Modeling and Simulation to Predict Operator Performance and Automation-Induced Complacency With Robotic Automation: A Case Study and Empirical Validation.

    PubMed

    Wickens, Christopher D; Sebok, Angelia; Li, Huiyang; Sarter, Nadine; Gacy, Andrew M

    2015-09-01

    The aim of this study was to develop and validate a computational model of the automation complacency effect, as operators work on a robotic arm task, supported by three different degrees of automation. Some computational models of complacency in human-automation interaction exist, but those are formed and validated within the context of fairly simplified monitoring failures. This research extends model validation to a much more complex task, so that system designers can establish, without need for human-in-the-loop (HITL) experimentation, merits and shortcomings of different automation degrees. We developed a realistic simulation of a space-based robotic arm task that could be carried out with three different levels of trajectory visualization and execution automation support. Using this simulation, we performed HITL testing. Complacency was induced via several trials of correctly performing automation and then was assessed on trials when automation failed. Following a cognitive task analysis of the robotic arm operation, we developed a multicomponent model of the robotic operator and his or her reliance on automation, based in part on visual scanning. The comparison of model predictions with empirical results revealed that the model accurately predicted routine performance and predicted the responses to these failures after complacency developed. However, the scanning models do not account for the entire attention allocation effects of complacency. Complacency modeling can provide a useful tool for predicting the effects of different types of imperfect automation. The results from this research suggest that focus should be given to supporting situation awareness in automation development. © 2015, Human Factors and Ergonomics Society.

  7. Intelligent data analysis to model and understand live cell time-lapse sequences.

    PubMed

    Paterson, Allan; Ashtari, M; Ribé, D; Stenbeck, G; Tucker, A

    2012-01-01

    One important aspect of cellular function, which is at the basis of tissue homeostasis, is the delivery of proteins to their correct destinations. Significant advances in live cell microscopy have allowed tracking of these pathways by following the dynamics of fluorescently labelled proteins in living cells. This paper explores intelligent data analysis techniques to model the dynamic behavior of proteins in living cells as well as to classify different experimental conditions. We use a combination of decision tree classification and hidden Markov models. In particular, we introduce a novel approach to "align" hidden Markov models so that hidden states from different models can be cross-compared. Our models capture the dynamics of the two experimental conditions accurately, with a stable hidden state for control data and multiple (less stable) states for the experimental data, recapitulating the behaviour of particle trajectories within live cell time-lapse data. In addition to successfully developing an automated framework for the classification of protein transport dynamics from live cell time-lapse data, our model allows us to understand the dynamics of a complex trafficking pathway in living cells in culture.
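
A simplified observable-state analogue of this modeling step can be sketched in Python. Here the states are assumed to be directly observed (e.g., "paused" vs. "moving" from thresholded step sizes), so estimating a Markov transition matrix stands in for the paper's hidden Markov models, and comparing self-transition probabilities stands in for cross-comparing aligned hidden states; the two state sequences are invented.

```python
import numpy as np

def transition_matrix(states, n_states=2):
    """Maximum-likelihood transition matrix of an observed discrete state
    sequence (assumes every state occurs at least once). Estimating and
    cross-comparing such matrices is a stand-in for the paper's HMM analysis."""
    counts = np.zeros((n_states, n_states))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)

control = [0] * 30 + [1] + [0] * 30  # one stable state with a rare excursion
treated = [0, 1] * 15 + [0]          # rapid switching between two states

P_ctrl = transition_matrix(control)
P_trt = transition_matrix(treated)
# the control condition's state 0 is far "stickier" than the treated one's
print(round(P_ctrl[0, 0], 3), round(P_trt[0, 0], 3))  # -> 0.983 0.0
```

The self-transition probability on the diagonal is exactly the "state stability" that the abstract contrasts between the control and experimental conditions.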

  8. Multiscale modelling approaches for assessing cosmetic ingredients safety.

    PubMed

    Bois, Frédéric Y; Ochoa, Juan G Diaz; Gajewska, Monika; Kovarich, Simona; Mauch, Klaus; Paini, Alicia; Péry, Alexandre; Benito, Jose Vicente Sala; Teng, Sophie; Worth, Andrew

    2017-12-01

    The European Union's ban on animal testing for cosmetic ingredients and products has generated strong momentum for the development of in silico and in vitro alternative methods. One focus of the COSMOS project was the ab initio prediction of kinetics and toxic effects through multiscale pharmacokinetic modeling and in vitro data integration. In our experience, mathematical or computer modeling and in vitro experiments are complementary. We present here a summary of the main models and results obtained within the framework of the project on these topics. A first section presents our work at the organelle and cellular level. We then move to models of cell-level effects (monitored continuously), multiscale physiologically based pharmacokinetic and effect models, and route-to-route extrapolation. We follow with a short presentation of the automated KNIME workflows developed for dissemination and easy use of the models. We end with a discussion of two challenges to the field: our limited ability to deal with massive data and complex computations. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  9. Automated Slide Scanning and Segmentation in Fluorescently-labeled Tissues Using a Widefield High-content Analysis System.

    PubMed

    Poon, Candice C; Ebacher, Vincent; Liu, Katherine; Yong, Voon Wee; Kelly, John James Patrick

    2018-05-03

    Automated slide scanning and segmentation of fluorescently-labeled tissues is the most efficient way to analyze whole slides or large tissue sections. Unfortunately, many researchers spend large amounts of time and resources developing and optimizing workflows that are only relevant to their own experiments. In this article, we describe a protocol that can be used by those with access to a widefield high-content analysis system (WHCAS) to image any slide-mounted tissue, with options for customization within pre-built modules found in the associated software. Although the WHCAS was not originally intended for slide scanning, the steps detailed in this article make it possible to acquire slide-scanning images in the WHCAS that can be imported into the associated software. In this example, the automated segmentation of brain tumor slides is demonstrated, but the automated segmentation of any fluorescently-labeled nuclear or cytoplasmic marker is possible. Furthermore, there is a variety of other quantitative software modules, including assays for protein localization/translocation, cellular proliferation/viability/apoptosis, and angiogenesis, that can be run. This technique will save researchers time and effort and create an automated protocol for slide analysis.

  10. Study of living single cells in culture: automated recognition of cell behavior.

    PubMed

    Bodin, P; Papin, S; Meyer, C; Travo, P

    1988-07-01

    An automated system capable of analyzing the behavior, in real time, of single living cells in culture, in a noninvasive and nondestructive way, has been developed. A large number of cell positions in single culture dishes were recorded using a computer-controlled, robotized microscope. During subsequent observations, binary images obtained from video image analysis of the microscope visual field allowed the identification of the recorded cells. These cells could be revisited automatically every few minutes. Long-term studies make possible the analysis of cellular locomotor and mitotic activities, as well as determination of cell shape (chosen from a defined library), for several hours or days in a fully automated way, with observations spaced up to 30 minutes apart. Short-term studies permit semiautomatic analysis of the acute effects of drugs (5 to 15 minutes) on changes in cell surface area and length.

  11. Localization-based super-resolution imaging meets high-content screening.

    PubMed

    Beghin, Anne; Kechkar, Adel; Butler, Corey; Levet, Florian; Cabillic, Marine; Rossier, Olivier; Giannone, Gregory; Galland, Rémi; Choquet, Daniel; Sibarita, Jean-Baptiste

    2017-12-01

    Single-molecule localization microscopy techniques have proven to be essential tools for quantitatively monitoring biological processes at unprecedented spatial resolution. However, these techniques are very low throughput and are not yet compatible with fully automated, multiparametric cellular assays. This shortcoming is primarily due to the huge amount of data generated during imaging and the lack of software for automation and dedicated data mining. We describe an automated quantitative single-molecule-based super-resolution methodology that operates in standard multiwell plates and uses analysis based on high-content screening and data-mining software. The workflow is compatible with fixed- and live-cell imaging and allows extraction of quantitative data like fluorophore photophysics, protein clustering or dynamic behavior of biomolecules. We demonstrate that the method is compatible with high-content screening using 3D dSTORM and DNA-PAINT based super-resolution microscopy as well as single-particle tracking.

  12. Automated deep-phenotyping of the vertebrate brain

    PubMed Central

    Allalou, Amin; Wu, Yuelong; Ghannad-Rezaie, Mostafa; Eimon, Peter M; Yanik, Mehmet Fatih

    2017-01-01

    Here, we describe an automated platform suitable for large-scale deep-phenotyping of zebrafish mutant lines, which uses optical projection tomography to rapidly image brain-specific gene expression patterns in 3D at cellular resolution. Registration algorithms and correlation analysis are then used to compare 3D expression patterns, to automatically detect all statistically significant alterations in mutants, and to map them onto a brain atlas. Automated deep-phenotyping of a mutation in the master transcriptional regulator fezf2 not only detects all known phenotypes but also uncovers important novel neural deficits that were overlooked in previous studies. In the telencephalon, we show for the first time that fezf2 mutant zebrafish have significant patterning deficits, particularly in glutamatergic populations. Our findings reveal unexpected parallels between fezf2 function in zebrafish and mice, where mutations cause deficits in glutamatergic neurons of the telencephalon-derived neocortex. DOI: http://dx.doi.org/10.7554/eLife.23379.001 PMID:28406399

  13. Multiscale image analysis reveals structural heterogeneity of the cell microenvironment in homotypic spheroids.

    PubMed

    Schmitz, Alexander; Fischer, Sabine C; Mattheyer, Christian; Pampaloni, Francesco; Stelzer, Ernst H K

    2017-03-03

    Three-dimensional multicellular aggregates such as spheroids provide reliable in vitro substitutes for tissues. Quantitative characterization of spheroids at the cellular level is fundamental. We present the first pipeline that provides three-dimensional, high-quality images of intact spheroids at cellular resolution and a comprehensive image analysis that complements traditional image segmentation with algorithms from other fields. The pipeline combines light sheet-based fluorescence microscopy of optically cleared spheroids with automated nuclei segmentation (F score: 0.88) and concepts from graph analysis and computational topology. Incorporating cell graphs and alpha shapes provided more than 30 features of individual nuclei, the cellular neighborhood, and the spheroid morphology. The application of our pipeline to a set of breast carcinoma spheroids revealed two concentric layers of different cell density for more than 30,000 cells. The thickness of the outer cell layer depends on a spheroid's size and varies between 50% and 75% of its radius. In differently sized spheroids, we detected patches of different cell densities ranging from 5 × 10⁵ to 1 × 10⁶ cells/mm³. Since cell density affects cell behavior in tissues, structural heterogeneities need to be incorporated into existing models. Our image analysis pipeline provides a multiscale approach to obtain the relevant data for a system-level understanding of tissue architecture.
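As a rough illustration of the density analysis described above, the sketch below estimates local cell density from nuclei centroids by counting neighbours within a fixed radius. This is a simplification of the record's cell-graph and alpha-shape features; the coordinates and radius are made up.

```python
import numpy as np
from scipy.spatial import cKDTree

def local_density(centroids, radius):
    """Estimate local cell density around each nucleus by counting
    neighbours (including the point itself) inside a sphere of the
    given radius and dividing by the sphere's volume."""
    tree = cKDTree(centroids)
    neighbours = tree.query_ball_point(centroids, r=radius)
    counts = np.array([len(idx) for idx in neighbours])
    volume = 4.0 / 3.0 * np.pi * radius ** 3
    return counts / volume

rng = np.random.default_rng(0)
core = rng.uniform(0, 50, size=(300, 3))      # crowded spheroid core
shell = rng.uniform(50, 200, size=(100, 3))   # sparse outer region
rho = local_density(np.vstack([core, shell]), radius=20.0)
print(rho[:300].mean() > rho[300:].mean())  # → True
```

A density map like this is what lets the pipeline detect concentric layers and density patches across a whole spheroid.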

  14. Semi-automated Neuron Boundary Detection and Nonbranching Process Segmentation in Electron Microscopy Images

    PubMed Central

    Jurrus, Elizabeth; Watanabe, Shigeki; Giuly, Richard J.; Paiva, Antonio R. C.; Ellisman, Mark H.; Jorgensen, Erik M.; Tasdizen, Tolga

    2013-01-01

    Neuroscientists are developing new imaging techniques and generating large volumes of data in an effort to understand the complex structure of the nervous system. The complexity and size of these data make human interpretation a labor-intensive task. To aid in the analysis, new segmentation techniques for identifying neurons in these feature-rich datasets are required. This paper presents a method for neuron boundary detection and nonbranching process segmentation in electron microscopy images, and for visualizing them in three dimensions. It combines automated segmentation techniques with a graphical user interface for correcting mistakes in the automated process. The automated process first uses machine learning and image processing techniques to identify neuron membranes that delineate the cells in each two-dimensional section. To segment nonbranching processes, the cell regions in each two-dimensional section are connected in 3D using correlation of regions between sections. The combination of this method with a graphical user interface specially designed for this purpose enables users to quickly segment cellular processes in large volumes. PMID:22644867
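The cross-section linking step ("connected in 3D using correlation of regions between sections") can be sketched with a simpler stand-in criterion: the overlap fraction between labelled regions in consecutive sections. The paper's actual correlation measure may differ; the toy sections below are invented.

```python
import numpy as np

def link_sections(labels_a, labels_b, min_overlap=0.5):
    """Link each labelled region in section A to the region in the next
    section B that it overlaps most, if the overlap covers at least
    `min_overlap` of the A region's area."""
    links = {}
    for a in np.unique(labels_a):
        if a == 0:  # label 0 is background
            continue
        mask = labels_a == a
        candidates, counts = np.unique(labels_b[mask], return_counts=True)
        order = np.argsort(counts)[::-1]  # try the largest overlap first
        for b, n in zip(candidates[order], counts[order]):
            if b != 0 and n / mask.sum() >= min_overlap:
                links[int(a)] = int(b)
                break
    return links

# Two toy 2D sections: region 1 drifts slightly, region 2 disappears.
a = np.array([[1, 1, 0, 2],
              [1, 1, 0, 2],
              [0, 0, 0, 0]])
b = np.array([[0, 3, 3, 0],
              [0, 3, 3, 0],
              [0, 0, 0, 0]])
print(link_sections(a, b))  # → {1: 3}
```

Chaining such links section by section is what turns a stack of 2D segmentations into 3D nonbranching processes.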

  15. Semi-Automated Neuron Boundary Detection and Nonbranching Process Segmentation in Electron Microscopy Images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jurrus, Elizabeth R.; Watanabe, Shigeki; Giuly, Richard J.

    2013-01-01

    Neuroscientists are developing new imaging techniques and generating large volumes of data in an effort to understand the complex structure of the nervous system. The complexity and size of these data make human interpretation a labor-intensive task. To aid in the analysis, new segmentation techniques for identifying neurons in these feature-rich datasets are required. This paper presents a method for neuron boundary detection and nonbranching process segmentation in electron microscopy images, and for visualizing them in three dimensions. It combines automated segmentation techniques with a graphical user interface for correcting mistakes in the automated process. The automated process first uses machine learning and image processing techniques to identify neuron membranes that delineate the cells in each two-dimensional section. To segment nonbranching processes, the cell regions in each two-dimensional section are connected in 3D using correlation of regions between sections. The combination of this method with a graphical user interface specially designed for this purpose enables users to quickly segment cellular processes in large volumes.

  16. The Co-regulation Data Harvester: Automating gene annotation starting from a transcriptome database

    NASA Astrophysics Data System (ADS)

    Tsypin, Lev M.; Turkewitz, Aaron P.

    Identifying co-regulated genes provides a useful approach for defining pathway-specific machinery in an organism. To be efficient, this approach relies on thorough genome annotation, a process much slower than genome sequencing per se. Tetrahymena thermophila, a unicellular eukaryote, has been a useful model organism and has a fully sequenced but sparsely annotated genome. One important resource for studying this organism has been an online transcriptomic database. We have developed an automated approach to gene annotation in the context of transcriptome data in T. thermophila, called the Co-regulation Data Harvester (CDH). Beginning with a gene of interest, the CDH identifies co-regulated genes by accessing the Tetrahymena transcriptome database. It then identifies their closely related genes (orthologs) in other organisms by using reciprocal BLAST searches. Finally, it collates the annotations of those orthologs' functions, which provides the user with information to help predict the cellular role of the initial query. The CDH, which is freely available, represents a powerful new tool for analyzing cell biological pathways in Tetrahymena. Moreover, to the extent that genes and pathways are conserved between organisms, the inferences obtained via the CDH should be relevant, and can be explored, in many other systems.
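The reciprocal BLAST step at the heart of the CDH reduces to the classic reciprocal-best-hit test for orthology. A minimal sketch with hypothetical gene identifiers follows; the real CDH runs BLAST searches and queries the Tetrahymena transcriptome database rather than consuming precomputed dictionaries.

```python
def reciprocal_best_hits(hits_ab, hits_ba):
    """Given the top BLAST hit in each direction (query -> best subject),
    return the pairs that are each other's best hit, the usual
    operational definition of orthology."""
    return {a: b for a, b in hits_ab.items() if hits_ba.get(b) == a}

# Hypothetical best hits between Tetrahymena genes and another genome.
tt_to_other = {"TTHERM_001": "geneX", "TTHERM_002": "geneY", "TTHERM_003": "geneZ"}
other_to_tt = {"geneX": "TTHERM_001", "geneY": "TTHERM_007"}
print(reciprocal_best_hits(tt_to_other, other_to_tt))
# → {'TTHERM_001': 'geneX'}
```

Only TTHERM_001/geneX survives: geneY's best hit points back to a different gene, and geneZ has no reverse hit at all.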

  17. Endometrial Stromal Cells and Immune Cell Populations Within Lymph Nodes in a Nonhuman Primate Model of Endometriosis

    PubMed Central

    Fazleabas, A. T.; Braundmeier, A. G.; Markham, R.; Fraser, I. S.; Berbic, M.

    2011-01-01

    Mounting evidence suggests that immunological responses may be altered in endometriosis. The baboon (Papio anubis) is generally considered the best model of endometriosis pathogenesis. The objective of the current study was to investigate for the first time immunological changes within uterine and peritoneal draining lymph nodes in a nonhuman primate baboon model of endometriosis. Paraffin-embedded femoral lymph nodes were obtained from 22 normally cycling female baboons (induced endometriosis n = 11; control n = 11). Immunohistochemical staining was performed with antibodies for endometrial stromal cells, T cells, immature and mature dendritic cells, and B cells. Lymph nodes were evaluated using an automated cellular imaging system. Endometrial stromal cells were significantly increased in lymph nodes from animals with induced endometriosis, compared to control animals (P = .033). In animals with induced endometriosis, some lymph node immune cell populations including T cells, dendritic cells and B cells were increased, suggesting an efficient early response or peritoneal drainage. PMID:21617251

  18. The use of adverse outcome pathway-based toxicity predictions: A case study evaluating the effects of imazalil on fathead minnow reproduction

    EPA Science Inventory

    Product Description: As a means to increase the efficiency of chemical safety assessment, there is an interest in using data from molecular and cellular bioassays, conducted in a highly automated fashion using modern robotics, to predict toxicity in humans and wildlife. The prese...

  19. System-wide organization of actin cytoskeleton determines organelle transport in hypocotyl plant cells

    PubMed Central

    Nowak, Jacqueline; Ivakov, Alexander; Somssich, Marc; Persson, Staffan; Nikoloski, Zoran

    2017-01-01

    The actin cytoskeleton is an essential intracellular filamentous structure that underpins cellular transport and cytoplasmic streaming in plant cells. However, the system-level properties of actin-based cellular trafficking remain tenuous, largely due to the inability to quantify key features of the actin cytoskeleton. Here, we developed an automated image-based, network-driven framework to accurately segment and quantify actin cytoskeletal structures and Golgi transport. We show that the actin cytoskeleton in both growing and elongated hypocotyl cells has structural properties facilitating efficient transport. Our findings suggest that the erratic movement of Golgi is a stable cellular phenomenon that might optimize distribution efficiency of cell material. Moreover, we demonstrate that Golgi transport in hypocotyl cells can be accurately predicted from the actin network topology alone. Thus, our framework provides quantitative evidence for system-wide coordination of cellular transport in plant cells and can be readily applied to investigate cytoskeletal organization and transport in other organisms. PMID:28655850

  20. Automated workflows for modelling chemical fate, kinetics and toxicity.

    PubMed

    Sala Benito, J V; Paini, Alicia; Richarz, Andrea-Nicole; Meinl, Thorsten; Berthold, Michael R; Cronin, Mark T D; Worth, Andrew P

    2017-12-01

    Automation is universal in today's society, from the operation of machinery in factory processes to self-parking automobile systems. While these examples show the efficiency and effectiveness of automated mechanical processes, automated procedures that support the chemical risk assessment process are still in their infancy. Future human safety assessments will rely increasingly on the use of automated models, such as physiologically based kinetic (PBK) and dynamic models and the virtual cell based assay (VCBA). These biologically-based models will be coupled with chemistry-based prediction models that also automate the generation of key input parameters such as physicochemical properties. The development of automated software tools is an important step in harmonising and expediting the chemical safety assessment process. In this study, we illustrate how the KNIME Analytics Platform can be used to provide a user-friendly graphical interface for these biokinetic models, such as PBK models and the VCBA, which simulate the fate of chemicals within the body and in in vitro test systems, respectively. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
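To illustrate the kind of biokinetic simulation such workflows automate, here is a minimal one-compartment model with first-order elimination, integrated by Euler's method. The dose, volume of distribution, and rate constant are invented, and this is far simpler than a full PBK model or the VCBA.

```python
def one_compartment(dose_mg, vd_l, ke_per_h, hours, dt=0.01):
    """Euler integration of a one-compartment kinetic model with
    first-order elimination: dC/dt = -ke * C, C(0) = dose / Vd."""
    c = dose_mg / vd_l
    t = 0.0
    series = [(t, c)]
    while t < hours:
        c += -ke_per_h * c * dt
        t += dt
        series.append((t, c))
    return series

# ke = 0.173 / h gives a half-life of ln(2)/ke ≈ 4 h.
profile = one_compartment(dose_mg=100, vd_l=40, ke_per_h=0.173, hours=8)
c0, c_end = profile[0][1], profile[-1][1]
print(round(c0, 2), c_end < c0 / 2)  # → 2.5 True
```

A real PBK model chains many such compartments (gut, liver, blood, target tissue) with transfer terms between them; the numerical machinery is the same.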

  1. Model-centric distribution automation: Capacity, reliability, and efficiency

    DOE PAGES

    Onen, Ahmet; Jung, Jaesung; Dilek, Murat; ...

    2016-02-26

    A series of analyses along with field validations that evaluate efficiency, reliability, and capacity improvements of model-centric distribution automation are presented. With model-centric distribution automation, the same model is used from design to real-time control calculations. A 14-feeder system with 7 substations is considered. The analyses involve hourly time-varying loads and annual load growth factors. Phase balancing and capacitor redesign modifications are used to better prepare the system for distribution automation, where the designs are performed considering time-varying loads. Coordinated control of load tap changing transformers, line regulators, and switched capacitor banks is considered. In evaluating distribution automation versus traditional system design and operation, quasi-steady-state power flow analysis is used. In evaluating distribution automation performance for substation transformer failures, reconfiguration for restoration analysis is performed. In evaluating distribution automation for storm conditions, Monte Carlo simulations coupled with reconfiguration for restoration calculations are used. As a result, the evaluations demonstrate that model-centric distribution automation has positive effects on system efficiency, capacity, and reliability.

  2. Model-centric distribution automation: Capacity, reliability, and efficiency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Onen, Ahmet; Jung, Jaesung; Dilek, Murat

    A series of analyses along with field validations that evaluate efficiency, reliability, and capacity improvements of model-centric distribution automation are presented. With model-centric distribution automation, the same model is used from design to real-time control calculations. A 14-feeder system with 7 substations is considered. The analyses involve hourly time-varying loads and annual load growth factors. Phase balancing and capacitor redesign modifications are used to better prepare the system for distribution automation, where the designs are performed considering time-varying loads. Coordinated control of load tap changing transformers, line regulators, and switched capacitor banks is considered. In evaluating distribution automation versus traditional system design and operation, quasi-steady-state power flow analysis is used. In evaluating distribution automation performance for substation transformer failures, reconfiguration for restoration analysis is performed. In evaluating distribution automation for storm conditions, Monte Carlo simulations coupled with reconfiguration for restoration calculations are used. As a result, the evaluations demonstrate that model-centric distribution automation has positive effects on system efficiency, capacity, and reliability.

  3. Complacency and bias in human use of automation: an attentional integration.

    PubMed

    Parasuraman, Raja; Manzey, Dietrich H

    2010-06-01

    Our aim was to review empirical studies of complacency and bias in human interaction with automated and decision support systems and provide an integrated theoretical model for their explanation. Automation-related complacency and automation bias have typically been considered separately and independently. Studies on complacency and automation bias were analyzed with respect to the cognitive processes involved. Automation complacency occurs under conditions of multiple-task load, when manual tasks compete with the automated task for the operator's attention. Automation complacency is found in both naive and expert participants and cannot be overcome with simple practice. Automation bias results in making both omission and commission errors when decision aids are imperfect. Automation bias occurs in both naive and expert participants, cannot be prevented by training or instructions, and can affect decision making in individuals as well as in teams. While automation bias has been conceived of as a special case of decision bias, our analysis suggests that it also depends on attentional processes similar to those involved in automation-related complacency. Complacency and automation bias represent different manifestations of overlapping automation-induced phenomena, with attention playing a central role. An integrated model of complacency and automation bias shows that they result from the dynamic interaction of personal, situational, and automation-related characteristics. The integrated model and attentional synthesis provide a heuristic framework for further research on complacency and automation bias and design options for mitigating such effects in automated and decision support systems.

  4. The Application of the Cumulative Logistic Regression Model to Automated Essay Scoring

    ERIC Educational Resources Information Center

    Haberman, Shelby J.; Sinharay, Sandip

    2010-01-01

    Most automated essay scoring programs use a linear regression model to predict an essay score from several essay features. This article applied a cumulative logit model instead of the linear regression model to automated essay scoring. Comparison of the performances of the linear regression model and the cumulative logit model was performed on a…
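The cumulative logit (proportional-odds) model assigns probabilities to ordered score levels as differences of successive cumulative logits. Here is a small sketch with made-up essay features, coefficients, and cutpoints; a fitted scoring model would estimate these from data.

```python
import math

def cumulative_logit_probs(x, beta, cutpoints):
    """Proportional-odds (cumulative logit) model: for ordered scores
    1..K, P(Y <= j) = sigmoid(theta_j - x . beta); the probability of
    each score level is the difference of successive cumulative terms."""
    eta = sum(xi * bi for xi, bi in zip(x, beta))
    cdf = [1.0 / (1.0 + math.exp(-(theta - eta))) for theta in cutpoints]
    cdf.append(1.0)  # P(Y <= K) = 1 for the top score
    return [cdf[0]] + [cdf[j] - cdf[j - 1] for j in range(1, len(cdf))]

# Hypothetical essay features (e.g. length, vocabulary, grammar scores).
features = [0.8, 0.5, 0.3]
beta = [1.2, 0.9, 1.5]          # assumed fitted coefficients
cutpoints = [-1.0, 0.5, 2.0]    # thresholds between the 4 score levels
p = cumulative_logit_probs(features, beta, cutpoints)
print(round(sum(p), 6), len(p))  # → 1.0 4
```

Unlike plain linear regression, this model respects the ordinal nature of essay scores and never predicts a value outside the score scale.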

  5. Localized, macromolecular transport for thin, adherent, single cells via an automated, single cell electroporation biomanipulator.

    PubMed

    Sakaki, Kelly; Esmaeilsabzali, Hadi; Massah, Shabnam; Prefontaine, Gratien G; Dechev, Nikolai; Burke, Robert D; Park, Edward J

    2013-11-01

    Single cell electroporation (SCE), via microcapillary, is an effective method for molecular, transmembrane transport used to gain insight on cell processes with minimal preparation. Although possessing great potential, SCE is difficult to execute and the technology spans broad fields within cell biology and engineering. The technical complexities, the focus and expertise demanded during manual operation, and the lack of an automated SCE platform limit the widespread use of this technique, thus the potential of SCE has not been realized. In this study, an automated biomanipulator for SCE is presented. Our system is capable of delivering molecules into the cytoplasm of extremely thin cellular features of adherent cells. The intent of the system is to abstract the technical challenges and exploit the accuracy and repeatability of automated instrumentation, leaving only the focus of the experimental design to the operator. Each sequence of SCE including cell and SCE site localization, tip-membrane contact detection, and SCE has been automated. Positions of low-contrast cells are localized and "SCE sites" for microcapillary tip placement are determined using machine vision. In addition, new milestones within automated cell manipulation have been achieved. The system described herein has the capability of automated SCE of "thin" cell features less than 10 μm in thickness. Finally, SCE events are anticipated using visual feedback, while monitoring fluorescing dye entering the cytoplasm of a cell. The execution is demonstrated by inserting a combination of a fluorescing dye and a reporter gene into NIH/3T3 fibroblast cells.

  6. Determining the distribution of probes between different subcellular locations through automated unmixing of subcellular patterns.

    PubMed

    Peng, Tao; Bonamy, Ghislain M C; Glory-Afshar, Estelle; Rines, Daniel R; Chanda, Sumit K; Murphy, Robert F

    2010-02-16

    Many proteins or other biological macromolecules are localized to more than one subcellular structure. The fraction of a protein in different cellular compartments is often measured by colocalization with organelle-specific fluorescent markers, requiring availability of fluorescent probes for each compartment and acquisition of images for each in conjunction with the macromolecule of interest. Alternatively, tailored algorithms allow finding particular regions in images and quantifying the amount of fluorescence they contain. Unfortunately, this approach requires extensive hand-tuning of algorithms and is often cell type-dependent. Here we describe a machine-learning approach for estimating the amount of fluorescent signal in different subcellular compartments without hand tuning, requiring only the acquisition of separate training images of markers for each compartment. In testing on images of cells stained with mixtures of probes for different organelles, we achieved a 93% correlation between estimated and expected amounts of probes in each compartment. We also demonstrated that the method can be used to quantify drug-dependent protein translocations. The method enables automated and unbiased determination of the distributions of protein across cellular compartments, and will significantly improve imaging-based high-throughput assays and facilitate proteome-scale localization efforts.
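One way to frame the unmixing described above is as a non-negative least-squares problem: find compartment fractions whose learned signatures best reproduce an observed feature vector. The sketch below uses invented signature numbers and SciPy's `nnls`; the paper's machine-learning approach is more elaborate than this linear stand-in.

```python
import numpy as np
from scipy.optimize import nnls

# Each column is the feature signature of one compartment, as might be
# learned from training images of that compartment's marker (made-up).
signatures = np.array([[0.9, 0.1, 0.2],
                       [0.1, 0.8, 0.1],
                       [0.0, 0.1, 0.7]])

def unmix(observed):
    """Estimate non-negative compartment fractions whose signature
    mixture best reproduces the observed feature vector."""
    coeffs, _ = nnls(signatures, observed)
    return coeffs / coeffs.sum()

# A probe split 70/30 between compartments 1 and 2, no noise.
truth = np.array([0.7, 0.3, 0.0])
estimate = unmix(signatures @ truth)
print(np.allclose(estimate, truth, atol=1e-6))  # → True
```

The non-negativity constraint matters: an unconstrained fit could report a physically meaningless negative fraction for an absent compartment.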

  7. Flexible Measurement of Bioluminescent Reporters Using an Automated Longitudinal Luciferase Imaging Gas- and Temperature-optimized Recorder (ALLIGATOR).

    PubMed

    Crosby, Priya; Hoyle, Nathaniel P; O'Neill, John S

    2017-12-13

    Luciferase-based reporters of cellular gene expression are in widespread use for both longitudinal and end-point assays of biological activity. In circadian rhythms research, for example, clock gene fusions with firefly luciferase give rise to robust rhythms in cellular bioluminescence that persist over many days. Technical limitations associated with photomultiplier tubes (PMT) or conventional microscopy-based methods for bioluminescence quantification have typically demanded that cells and tissues be maintained under quite non-physiological conditions during recording, with a trade-off between sensitivity and throughput. Here, we report a refinement of prior methods that allows long-term bioluminescence imaging with high sensitivity and throughput, supports a broad range of culture conditions, including variable gas and humidity control, and accepts many different tissue culture plates and dishes. This automated longitudinal luciferase imaging gas- and temperature-optimized recorder (ALLIGATOR) also allows the observation of spatial variations in luciferase expression across a cell monolayer or tissue, which cannot readily be observed by traditional methods. We highlight how the ALLIGATOR provides vastly increased flexibility for the detection of luciferase activity when compared with existing methods.

  8. Modeling Hydraulic Components for Automated FMEA of a Braking System

    DTIC Science & Technology

    2014-12-23

    Modeling Hydraulic Components for Automated FMEA of a Braking System. Peter Struss, Alessandro Fraracci, Tech. Univ. of Munich, 85748 Garching, Germany (struss@in.tum.de). ABSTRACT: This paper presents work on model-based automation of failure-modes-and-effects analysis (FMEA) applied to the hydraulic part of a vehicle braking system. We describe the FMEA task and the application problem and outline the foundations for automating the…

  9. Modeling Increased Complexity and the Reliance on Automation: FLightdeck Automation Problems (FLAP) Model

    NASA Technical Reports Server (NTRS)

    Ancel, Ersin; Shih, Ann T.

    2014-01-01

    This paper highlights the development of a model that is focused on the safety issue of increasing complexity and reliance on automation systems in transport category aircraft. Recent statistics show an increase in mishaps related to manual handling and automation errors due to pilot complacency and over-reliance on automation, loss of situational awareness, automation system failures and/or pilot deficiencies. Consequently, the aircraft can enter a state outside the flight envelope and/or air traffic safety margins which potentially can lead to loss-of-control (LOC), controlled-flight-into-terrain (CFIT), or runway excursion/confusion accidents, etc. The goal of this modeling effort is to provide NASA's Aviation Safety Program (AvSP) with a platform capable of assessing the impacts of AvSP technologies and products towards reducing the relative risk of automation related accidents and incidents. In order to do so, a generic framework, capable of mapping both latent and active causal factors leading to automation errors, is developed. Next, the framework is converted into a Bayesian Belief Network model and populated with data gathered from Subject Matter Experts (SMEs). With the insertion of technologies and products, the model provides individual and collective risk reduction acquired by technologies and methodologies developed within AvSP.
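The core Bayesian Belief Network computation, marginalizing a child node over its parents' states, can be sketched for a toy two-parent network. The probabilities below are invented for illustration; they are not the SME-elicited values of the FLAP model.

```python
def accident_risk(p_complacency, p_failure, cpt):
    """Marginal P(automation-related error) for a node with two Boolean
    parents, computed by enumeration over the parents' joint states."""
    risk = 0.0
    for comp in (True, False):
        for fail in (True, False):
            p_parents = (p_complacency if comp else 1 - p_complacency) * \
                        (p_failure if fail else 1 - p_failure)
            risk += p_parents * cpt[(comp, fail)]
    return risk

# Hypothetical conditional probability table:
# P(error | pilot complacency, automation failure).
cpt = {(True, True): 0.9, (True, False): 0.3,
       (False, True): 0.4, (False, False): 0.02}

baseline = accident_risk(0.2, 0.1, cpt)
mitigated = accident_risk(0.1, 0.1, cpt)  # a technology halves complacency
print(round(baseline, 4), mitigated < baseline)  # → 0.1184 True
```

Comparing the marginal risk before and after changing a parent probability is, in miniature, how inserting an AvSP technology into the model yields a relative risk reduction.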

  10. Automated Morphological and Morphometric Analysis of Mass Spectrometry Imaging Data: Application to Biomarker Discovery

    NASA Astrophysics Data System (ADS)

    Picard de Muller, Gaël; Ait-Belkacem, Rima; Bonnel, David; Longuespée, Rémi; Stauber, Jonathan

    2017-12-01

    Mass spectrometry imaging datasets are mostly analyzed in terms of average intensity in regions of interest. However, biological tissues have different morphologies, with structures of various sizes and shapes. Important biological information, contained in this highly heterogeneous cellular organization, can be hidden by analyzing only average intensities. An analytical treatment of morphology would help to recover such information, describe tissue models, and support the identification of biomarkers. This study describes an informatics approach for the extraction and identification of mass spectrometry image features and its application to sample analysis and modeling. As a proof of concept, two different tissue types (healthy kidney and CT-26 xenograft tumor tissues) were imaged and analyzed. Mouse kidney and tumor models were generated using morphometric information (number of objects and total surface). The morphometric information was used to identify m/z values with heterogeneous distributions, a worthwhile pursuit because clonal heterogeneity in a tumor is of clinical relevance. This study provides a new approach to finding biomarkers and supporting tissue classification with richer information.
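The two morphometric descriptors named above, number of objects and total surface, can be computed from a thresholded ion image with standard connected-component labelling, as in this sketch (the mask and pixel size are invented):

```python
import numpy as np
from scipy import ndimage

def morphometrics(mask, pixel_area=1.0):
    """Number of distinct objects in a binary ion image, plus their
    total surface (foreground pixel count times pixel area)."""
    labelled, n_objects = ndimage.label(mask)
    total_surface = mask.sum() * pixel_area
    return n_objects, total_surface

# Toy thresholded intensity map containing two separate structures.
mask = np.array([[1, 1, 0, 0, 0],
                 [1, 1, 0, 0, 0],
                 [0, 0, 0, 1, 1],
                 [0, 0, 0, 1, 1]], dtype=bool)
print(morphometrics(mask, pixel_area=0.25))  # → (2, 2.0)
```

Computing these descriptors per m/z channel and comparing them across the image is one way to flag ions with a heterogeneous, rather than diffuse, distribution.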

  11. A microchip electrophoresis-mass spectrometric platform with double cell lysis nano-electrodes for automated single cell analysis.

    PubMed

    Li, Xiangtang; Zhao, Shulin; Hu, Hankun; Liu, Yi-Ming

    2016-06-17

    Capillary electrophoresis-based single cell analysis has become an essential approach in research at the cellular level. However, automation of single cell analysis has been a challenge due to the difficulty of controlling the number of cells injected and the irreproducibility associated with cell aggregation. Herein we report the development of a new microfluidic platform deploying the double nano-electrode cell lysis technique for automated analysis of single cells with mass spectrometric detection. The proposed microfluidic chip features integration of a cell-sized high voltage zone for quick single cell lysis, a microfluidic channel for electrophoretic separation, and a nanoelectrospray emitter for ionization in MS detection. Built upon this platform, a microchip electrophoresis-mass spectrometric method (MCE-MS) has been developed for automated single cell analysis. In the method, cell introduction, cell lysis, and MCE-MS separation are computer controlled and integrated as a cycle into consecutive assays. Analysis of large numbers of individual PC-12 neuronal cells (both intact and exposed to 25 mM KCl) was carried out to determine intracellular levels of dopamine (DA) and glutamic acid (Glu). It was found that DA content in PC-12 cells was higher than Glu content, and both varied from cell to cell. The ratio of intracellular DA to Glu was 4.20±0.8 (n=150). Interestingly, the ratio drastically decreased to 0.38±0.20 (n=150) after the cells were exposed to 25 mM KCl for 8 min, suggesting that the cells released DA promptly and heavily while releasing Glu at a much slower pace in response to KCl-induced depolarization. These results indicate that the proposed MCE-MS analytical platform may have great potential in research at the cellular level. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. High-content screening of small compounds on human embryonic stem cells.

    PubMed

    Barbaric, Ivana; Gokhale, Paul J; Andrews, Peter W

    2010-08-01

    Human ES (embryonic stem) cells and iPS (induced pluripotent stem) cells have been heralded as a source of differentiated cells that could be used in the treatment of degenerative diseases, such as Parkinson's disease or diabetes. Despite the great potential for their use in regenerative therapy, the challenge remains to understand the basic biology of these remarkable cells, in order to differentiate them into any functional cell type. Given the scale of the task, high-throughput screening of agents and culture conditions offers one way to accelerate these studies. The screening of small-compound libraries is particularly amenable to such high-throughput methods. Coupled with high-content screening technology that enables simultaneous assessment of multiple cellular features in an automated and quantitative way, this approach is proving powerful in identifying both small molecules as tools for manipulating stem cell fates and novel mechanisms of differentiation not previously associated with stem cell biology. Such screens performed on human ES cells also demonstrate the usefulness of human ES/iPS cells as cellular models for pharmacological testing of drug efficacy and toxicity, possibly a more imminent use of these cells than in regenerative medicine.

  13. Automated texture-based identification of ovarian cancer in confocal microendoscope images

    NASA Astrophysics Data System (ADS)

    Srivastava, Saurabh; Rodriguez, Jeffrey J.; Rouse, Andrew R.; Brewer, Molly A.; Gmitro, Arthur F.

    2005-03-01

    The fluorescence confocal microendoscope provides high-resolution, in-vivo imaging of cellular pathology during optical biopsy. There are indications that the examination of human ovaries with this instrument has diagnostic implications for the early detection of ovarian cancer. The purpose of this study was to develop a computer-aided system to facilitate the identification of ovarian cancer from digital images captured with the confocal microendoscope system. To achieve this goal, we modeled the cellular-level structure present in these images as texture and extracted features based on first-order statistics, spatial gray-level dependence matrices, and spatial-frequency content. Selection of the best features for classification was performed using traditional feature selection techniques including stepwise discriminant analysis, forward sequential search, a non-parametric method, principal component analysis, and a heuristic technique that combines the results of these methods. The best set of features selected was used for classification, and performance of various machine classifiers was compared by analyzing the areas under their receiver operating characteristic curves. The results show that it is possible to automatically identify patients with ovarian cancer based on texture features extracted from confocal microendoscope images and that the machine performance is superior to that of the human observer.
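The spatial gray-level dependence (co-occurrence) matrix features mentioned above can be sketched directly: build the matrix for one neighbour offset and derive classic texture measures such as contrast and energy. This is a generic illustration, not the study's exact feature set or offsets.

```python
import numpy as np

def glcm_features(img, levels=4):
    """Gray-level co-occurrence matrix for the horizontal neighbour
    offset, normalised to joint probabilities, plus two texture
    features: contrast (local variation) and energy (uniformity)."""
    glcm = np.zeros((levels, levels))
    for i, j in zip(img[:, :-1].ravel(), img[:, 1:].ravel()):
        glcm[i, j] += 1
    glcm /= glcm.sum()
    r, c = np.indices(glcm.shape)
    contrast = ((r - c) ** 2 * glcm).sum()
    energy = (glcm ** 2).sum()
    return contrast, energy

flat = np.zeros((8, 8), dtype=int)                    # uniform texture
noisy = np.random.default_rng(1).integers(0, 4, (8, 8))  # irregular texture
print(glcm_features(flat)[0] < glcm_features(noisy)[0])  # → True
```

Feature vectors of this kind, computed per image patch, are what the classifiers in the study were trained on.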

  14. Meta-analysis of global metabolomics and proteomics data to link alterations with phenotype

    DOE PAGES

    Patti, Gary J.; Tautenhahn, Ralf; Fonslow, Bryan R.; ...

    2011-01-01

    Global metabolomics has emerged as a powerful tool to interrogate cellular biochemistry at the systems level by tracking alterations in the levels of small molecules. One approach to define cellular dynamics with respect to this dysregulation of small molecules has been to consider metabolic flux as a function of time. While flux measurements have proven effective for model organisms, acquiring multiple time points at appropriate temporal intervals for many sample types (e.g., clinical specimens) is challenging. As an alternative, meta-analysis provides another strategy for delineating metabolic cause and effect perturbations. That is, the combination of untargeted metabolomic data from multiple pairwise comparisons enables the association of specific changes in small molecules with unique phenotypic alterations. We recently developed metabolomic software called metaXCMS to automate these types of higher order comparisons. Here we discuss the potential of metaXCMS for analyzing proteomic datasets and highlight the biological value of combining meta-results from both metabolomic and proteomic analyses. The combined meta-analysis has the potential to facilitate efforts in functional genomics and the identification of metabolic disruptions related to disease pathogenesis.
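The higher-order comparison that metaXCMS automates amounts, at its core, to intersecting the significant-feature lists of several independent pairwise comparisons. A schematic sketch with invented m/z features follows; the real software also matches features across runs within m/z and retention-time tolerances rather than by exact string equality.

```python
def meta_compare(*pairwise_results):
    """Return the features called significant in every one of several
    pairwise comparisons (the shared dysregulation candidates)."""
    shared = set.intersection(*map(set, pairwise_results))
    return sorted(shared)

# Hypothetical dysregulated m/z features from three disease-vs-control runs.
run1 = ["180.06", "204.09", "132.10", "365.13"]
run2 = ["204.09", "365.13", "129.05"]
run3 = ["365.13", "204.09", "760.58"]
print(meta_compare(run1, run2, run3))  # → ['204.09', '365.13']
```

Features that recur across all comparisons are far more likely to reflect the shared phenotype than the noise of any single experiment.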

  15. MESA: An Interactive Modeling and Simulation Environment for Intelligent Systems Automation

    NASA Technical Reports Server (NTRS)

    Charest, Leonard

    1994-01-01

    This report describes MESA, a software environment for creating applications that automate NASA mission operations. MESA enables intelligent automation by utilizing model-based reasoning techniques developed in the field of Artificial Intelligence. Model-based reasoning techniques are realized in MESA through native support of causal modeling and discrete event simulation.

  16. Performance modeling of automated manufacturing systems

    NASA Astrophysics Data System (ADS)

    Viswanadham, N.; Narahari, Y.

    A unified and systematic treatment of modeling methodologies and analysis techniques for the performance evaluation of automated manufacturing systems is presented. The book is the first treatment of the mathematical modeling of manufacturing systems. Automated manufacturing systems are surveyed, and three principal analytical modeling paradigms are discussed: Markov chains, queues and queueing networks, and Petri nets.
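
    A minimal example of the first modeling paradigm mentioned (Markov chains), with invented numbers rather than any example from the book: the long-run availability of a single machine that alternates between an Up (producing) and a Down (under repair) state.

```python
import numpy as np

# Hypothetical two-state machine; transition probabilities per time
# step are illustrative only.
P = np.array([[0.9, 0.1],   # Up -> Up,   Up -> Down
              [0.5, 0.5]])  # Down -> Up, Down -> Down

# The steady-state distribution pi solves pi P = pi with pi summing to 1.
A = np.vstack([P.T - np.eye(2), np.ones(2)])
b = np.array([0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
availability = pi[0]           # long-run fraction of time producing
print(round(availability, 4))  # → 0.8333
```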

  17. Studies in fat grafting: Part I. Effects of injection technique on in vitro fat viability and in vivo volume retention.

    PubMed

    Chung, Michael T; Paik, Kevin J; Atashroo, David A; Hyun, Jeong S; McArdle, Adrian; Senarath-Yapa, Kshemendra; Zielins, Elizabeth R; Tevlin, Ruth; Duldulao, Chris; Hu, Michael S; Walmsley, Graham G; Parisi-Amon, Andreina; Momeni, Arash; Rimsa, Joe R; Commons, George W; Gurtner, Geoffrey C; Wan, Derrick C; Longaker, Michael T

    2014-07-01

    Fat grafting has become increasingly popular for the correction of soft-tissue deficits at many sites throughout the body. Long-term outcomes, however, depend on delivery of fat in the least traumatic fashion to optimize viability of the transplanted tissue. In this study, the authors compare the biological properties of fat following injection using two methods. Lipoaspiration samples were obtained from five female donors, and cellular viability, proliferation, and lipolysis were evaluated following injection using either a modified Coleman technique or an automated, low-shear device. Comparisons were made to minimally processed, uninjected fat. Volume retention was also measured over 12 weeks after injection of fat under the scalp of immunodeficient mice using either the modified Coleman technique or the Adipose Tissue Injector. Finally, fat grafts were analyzed histologically. Fat viability and cellular proliferation were both significantly greater with the Adipose Tissue Injector relative to injection with the modified Coleman technique. In contrast, significantly less lipolysis was noted using the automated device. In vivo fat volume retention was significantly greater with the Adipose Tissue Injector than with the modified Coleman technique at the 4-, 6-, 8-, and 12-week time points. This corresponded to significantly greater histologic scores for healthy fat and lower scores for injury following injection with the device. Biological properties of injected tissues reflect how disruptive and harmful techniques for placement of fat may be, and the authors' in vitro and in vivo data both support the use of the automated, low-shear devices compared with the modified Coleman technique.

  18. Studies in Fat Grafting: Part I. Effects of Injection Technique on in vitro Fat Viability and in vivo Volume Retention

    PubMed Central

    Chung, Michael T.; Paik, Kevin J.; Atashroo, David A.; Hyun, Jeong S.; McArdle, Adrian; Senarath-Yapa, Kshemendra; Zielins, Elizabeth R.; Tevlin, Ruth; Duldulao, Chris; Hu, Michael S.; Walmsley, Graham G.; Parisi-Amon, Andreina; Momeni, Arash; Rimsa, Joe R.; Commons, George W.; Gurtner, Geoffrey C.; Wan, Derrick C.; Longaker, Michael T.

    2014-01-01

    Background Fat grafting has become increasingly popular for the correction of soft tissue deficits at many sites throughout the body. Long-term outcomes, however, depend on delivery of fat in the least traumatic fashion to optimize viability of the transplanted tissue. In this study, we compare the biologic properties of fat following injection using two methods. Methods Lipoaspiration samples were obtained from five female donors, and cellular viability, proliferation, and lipolysis were evaluated following injection using either a modified Coleman technique or an automated, low-shear device. Comparisons were made to minimally processed, uninjected fat. Volume retention was also measured over twelve weeks following injection of fat under the scalp of immunodeficient mice using either the modified Coleman technique or the Adipose Tissue Injector. Finally, fat grafts were analyzed histologically. Results Fat viability and cellular proliferation were both significantly greater with the Adipose Tissue Injector relative to injection with the modified Coleman technique. In contrast, significantly less lipolysis was noted using the automated device. In vivo fat volume retention was significantly greater with the Adipose Tissue Injector than with the modified Coleman technique at the 4-, 6-, 8-, and 12-week time points. This corresponded with significantly greater histological scores for healthy fat and lower scores for injury following injection with the device. Conclusions Biological properties of injected tissues reflect how disruptive and harmful techniques for placement of fat may be, and our in vitro and in vivo data both support the use of the automated, low-shear devices compared to the modified Coleman technique. PMID:24622574

  19. Building quantitative, three-dimensional atlases of gene expression and morphology at cellular resolution.

    PubMed

    Knowles, David W; Biggin, Mark D

    2013-01-01

    Animals comprise dynamic three-dimensional arrays of cells that express gene products in intricate spatial and temporal patterns that determine cellular differentiation and morphogenesis. A rigorous understanding of these developmental processes requires automated methods that quantitatively record and analyze complex morphologies and their associated patterns of gene expression at cellular resolution. Here we summarize light microscopy-based approaches to establish permanent, quantitative datasets (atlases) that record this information. We focus on experiments that capture data for whole embryos or large areas of tissue in three dimensions, often at multiple time points. We compare and contrast the advantages and limitations of different methods and highlight some of the discoveries made. We emphasize the need for interdisciplinary collaborations and integrated experimental pipelines that link sample preparation, image acquisition, image analysis, database design, visualization, and quantitative analysis. Copyright © 2013 Wiley Periodicals, Inc.

  20. Implementing Lumberjacks and Black Swans Into Model-Based Tools to Support Human-Automation Interaction.

    PubMed

    Sebok, Angelia; Wickens, Christopher D

    2017-03-01

    The objectives were to (a) implement theoretical perspectives regarding human-automation interaction (HAI) into model-based tools to assist designers in developing systems that support effective performance and (b) conduct validations to assess the ability of the models to predict operator performance. Two key concepts in HAI, the lumberjack analogy and black swan events, have been studied extensively. The lumberjack analogy describes the effects of imperfect automation on operator performance. In routine operations, an increased degree of automation supports performance, but in failure conditions, increased automation results in more significantly impaired performance. Black swans are the rare and unexpected failures of imperfect automation. The lumberjack analogy and black swan concepts have been implemented into three model-based tools that predict operator performance in different systems. These tools include a flight management system, a remotely controlled robotic arm, and an environmental process control system. Each modeling effort included a corresponding validation. In one validation, the software tool was used to compare three flight management system designs, which were ranked in the same order as predicted by subject matter experts. The second validation compared model-predicted operator complacency with empirical performance in the same conditions. The third validation compared model-predicted and empirically determined time to detect and repair faults in four automation conditions. The three model-based tools offer useful ways to predict operator performance in complex systems. The three tools offer ways to predict the effects of different automation designs on operator performance.

  1. Automated MRI segmentation for individualized modeling of current flow in the human head.

    PubMed

    Huang, Yu; Dmochowski, Jacek P; Su, Yuzhuo; Datta, Abhishek; Rorden, Christopher; Parra, Lucas C

    2013-12-01

    High-definition transcranial direct current stimulation (HD-tDCS) and high-density electroencephalography require accurate models of current flow for precise targeting and current source reconstruction. At a minimum, such modeling must capture the idiosyncratic anatomy of the brain, cerebrospinal fluid (CSF) and skull for each individual subject. Currently, the process to build such high-resolution individualized models from structural magnetic resonance images requires labor-intensive manual segmentation, even when utilizing available automated segmentation tools. Also, accurate placement of many high-density electrodes on an individual scalp is a tedious procedure. The goal was to develop fully automated techniques to reduce the manual effort in such a modeling process. A fully automated segmentation technique based on Statistical Parametric Mapping 8, including an improved tissue probability map and an automated correction routine for segmentation errors, was developed, along with an automated electrode placement tool for high-density arrays. The performance of these automated routines was evaluated against results from manual segmentation on four healthy subjects and seven stroke patients. The criteria include segmentation accuracy, the difference of current flow distributions in resulting HD-tDCS models and the optimized current flow intensities on cortical targets. The segmentation tool can segment out not just the brain but also provide accurate results for CSF, skull and other soft tissues with a field of view extending to the neck. Compared to manual results, automated segmentation deviates by only 7% and 18% for normal and stroke subjects, respectively. The predicted electric fields in the brain deviate by 12% and 29% respectively, which is well within the variability observed for various modeling choices. Finally, optimized current flow intensities on cortical targets do not differ significantly. 
Fully automated individualized modeling may now be feasible for large-sample EEG research studies and tDCS clinical trials.

  2. A microfluidic device for automated, high-speed microinjection of Caenorhabditis elegans

    PubMed Central

    Song, Pengfei; Dong, Xianke; Liu, Xinyu

    2016-01-01

    The nematode worm Caenorhabditis elegans has been widely used as a model organism in biological studies because of its short and prolific life cycle, relatively simple body structure, significant genetic overlap with humans, and facile, inexpensive cultivation. Microinjection, as an established and versatile tool for delivering liquid substances into cellular/organismal objects, plays an important role in C. elegans research. However, the conventional manual procedure of C. elegans microinjection is labor-intensive and time-consuming and thus hinders large-scale C. elegans studies involving microinjection of a large number of C. elegans on a daily basis. In this paper, we report a novel microfluidic device that enables, for the first time, fully automated, high-speed microinjection of C. elegans. The device is automatically regulated by on-chip pneumatic valves and allows rapid loading, immobilization, injection, and downstream sorting of single C. elegans. For demonstration, we performed microinjection experiments on 200 C. elegans worms and demonstrated an average injection speed of 6.6 worms/min (average worm handling time: 9.45 s/worm) and a success rate of 77.5% (post-sorting success rate: 100%), both much higher than the performance of manual operation (speed: 1 worm/4 min; success rate: 30%). We conducted typical viability tests on the injected C. elegans and confirmed that the automated injection system does not impose significant adverse effects on the physiological condition of the injected C. elegans. We believe that the developed microfluidic device holds great potential to become a useful tool for facilitating high-throughput, large-scale worm biology research. PMID:26958099

  3. ultraLM and miniLM: Locator tools for smart tracking of fluorescent cells in correlative light and electron microscopy.

    PubMed

    Brama, Elisabeth; Peddie, Christopher J; Wilkes, Gary; Gu, Yan; Collinson, Lucy M; Jones, Martin L

    2016-12-13

    In-resin fluorescence (IRF) protocols preserve fluorescent proteins in resin-embedded cells and tissues for correlative light and electron microscopy, aiding interpretation of macromolecular function within the complex cellular landscape. Dual-contrast IRF samples can be imaged in separate fluorescence and electron microscopes, or in dual-modality integrated microscopes for high resolution correlation of fluorophore to organelle. IRF samples also offer a unique opportunity to automate correlative imaging workflows. Here we present two new locator tools for finding and following fluorescent cells in IRF blocks, enabling future automation of correlative imaging. The ultraLM is a fluorescence microscope that integrates with an ultramicrotome, which enables 'smart collection' of ultrathin sections containing fluorescent cells or tissues for subsequent transmission electron microscopy or array tomography. The miniLM is a fluorescence microscope that integrates with serial block face scanning electron microscopes, which enables 'smart tracking' of fluorescent structures during automated serial electron image acquisition from large cell and tissue volumes.

  4. Note: An automated image analysis method for high-throughput classification of surface-bound bacterial cell motions.

    PubMed

    Shen, Simon; Syal, Karan; Tao, Nongjian; Wang, Shaopeng

    2015-12-01

    We present a Single-Cell Motion Characterization System (SiCMoCS) to automatically extract bacterial cell morphological features from microscope images and use those features to automatically classify cell motion for rod-shaped motile bacterial cells. In some imaging-based studies, bacterial cells need to be attached to the surface for time-lapse observation of cellular processes such as cell membrane-protein interactions and membrane elasticity. These studies often generate large volumes of images. Extracting accurate bacterial cell morphology features from these images is critical for quantitative assessment. Using SiCMoCS, we demonstrated simultaneous and automated motion tracking and classification of hundreds of individual cells in an image sequence of several hundred frames. This is a significant improvement over traditional manual and semi-automated approaches to segmenting bacterial cells based on empirical thresholds, and a first attempt to automatically classify bacterial motion types for motile rod-shaped bacterial cells, which enables rapid and quantitative analysis of various types of bacterial motion.

  5. Image segmentation and dynamic lineage analysis in single-cell fluorescence microscopy.

    PubMed

    Wang, Quanli; Niemi, Jarad; Tan, Chee-Meng; You, Lingchong; West, Mike

    2010-01-01

    An increasingly common component of studies in synthetic and systems biology is analysis of the dynamics of gene expression at the single-cell level, a context that is heavily dependent on the use of time-lapse movies. Extracting quantitative data on the single-cell temporal dynamics from such movies remains a major challenge. Here, we describe novel methods for automating key steps in the analysis of single-cell fluorescent images (segmentation and lineage reconstruction) to recognize and track individual cells over time. The automated analysis iteratively combines a set of extended morphological methods for segmentation and uses a neighborhood-based scoring method for frame-to-frame lineage linking. Our studies with bacteria, budding yeast, and human cells demonstrate the portability and usability of these methods, whether using phase, bright field, or fluorescent images. These examples also demonstrate the utility of our integrated approach in facilitating analyses of engineered and natural cellular networks in diverse settings. The automated methods are implemented in freely available, open-source software.
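
    The frame-to-frame lineage-linking step can be sketched as nearest-centroid assignment. The paper's neighborhood-based scoring is richer than plain distance, and the coordinates and cell names below are invented.

```python
import math

# Minimal sketch of frame-to-frame linking: each cell detected in frame
# t+1 is assigned to the closest cell centroid in frame t, with Euclidean
# distance standing in for the richer neighborhood-based score.
frame_t  = {"cell1": (10.0, 12.0), "cell2": (40.0, 8.0)}
frame_t1 = {"a": (11.5, 12.5), "b": (38.0, 9.0)}

def link(prev, curr):
    links = {}
    for name, (x, y) in curr.items():
        best = min(prev, key=lambda p: math.dist((x, y), prev[p]))
        links[name] = best
    return links

print(link(frame_t, frame_t1))  # → {'a': 'cell1', 'b': 'cell2'}
```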

  6. Automated segmentation of retinal pigment epithelium cells in fluorescence adaptive optics images.

    PubMed

    Rangel-Fonseca, Piero; Gómez-Vieyra, Armando; Malacara-Hernández, Daniel; Wilson, Mario C; Williams, David R; Rossi, Ethan A

    2013-12-01

    Adaptive optics (AO) imaging methods allow the histological characteristics of retinal cell mosaics, such as photoreceptors and retinal pigment epithelium (RPE) cells, to be studied in vivo. The high-resolution images obtained with ophthalmic AO imaging devices are rich with information that is difficult and/or tedious to quantify using manual methods. Thus, robust, automated analysis tools that can provide reproducible quantitative information about the cellular mosaics under examination are required. Automated algorithms have been developed to detect the position of individual photoreceptor cells; however, most of these methods are not well suited for characterizing the RPE mosaic. We have developed an algorithm for RPE cell segmentation and show its performance here on simulated and real fluorescence AO images of the RPE mosaic. Algorithm performance was compared to manual cell identification and yielded better than 91% correspondence. This method can be used to segment RPE cells for morphometric analysis of the RPE mosaic and speed the analysis of both healthy and diseased RPE mosaics.

  7. Customization of Advia 120 thresholds for canine erythrocyte volume and hemoglobin concentration, and effects on morphology flagging results.

    PubMed

    Grimes, Carolyn N; Fry, Michael M

    2014-12-01

    This study sought to develop customized morphology flagging thresholds for canine erythrocyte volume and hemoglobin concentration [Hgb] on the ADVIA 120 hematology analyzer; compare automated morphology flagging with results of microscopic blood smear evaluation; and examine effects of customized thresholds on morphology flagging results. Customized thresholds were determined using data from 52 clinically healthy dogs. Blood smear evaluation and automated morphology flagging results were correlated with mean cell volume (MCV) and cellular hemoglobin concentration mean (CHCM) in 26 dogs. Customized thresholds were applied retroactively to complete blood (cell) count (CBC) data from 5 groups of dogs, including a reference sample group, clinical cases, and animals with experimentally induced iron deficiency anemia. Automated morphology flagging correlated more highly with MCV or CHCM than did blood smear evaluation; correlation with MCV was highest using customized thresholds. Customized morphology flagging thresholds resulted in more sensitive detection of microcytosis, macrocytosis, and hypochromasia than default thresholds.
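
    Deriving customized flagging thresholds from healthy-reference data, as this record describes, can be sketched as a mean ± 2 SD reference interval. The MCV values below are invented for illustration and are not canine reference data.

```python
import statistics

# Hypothetical healthy-animal MCV measurements (fL), NOT real reference data.
healthy_mcv = [62, 65, 66, 68, 70, 71, 72, 74, 75, 77]

mean = statistics.mean(healthy_mcv)
sd = statistics.stdev(healthy_mcv)
low, high = mean - 2 * sd, mean + 2 * sd  # customized thresholds

def flag(mcv):
    """Morphology flag for a single erythrocyte volume measurement."""
    if mcv < low:
        return "microcytosis"
    if mcv > high:
        return "macrocytosis"
    return "normal"

print(flag(55), flag(70), flag(85))  # → microcytosis normal macrocytosis
```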

  8. The study of muscle remodeling in Drosophila metamorphosis using in vivo microscopy and bioimage informatics

    PubMed Central

    2012-01-01

    Background Metamorphosis in insects transforms the larval into an adult body plan and comprises the destruction and remodeling of larval and the generation of adult tissues. The remodeling of larval into adult muscles promises to be a genetic model for human atrophy since it is associated with dramatic alteration in cell size. Furthermore, muscle development is amenable to 3D in vivo microscopy at high cellular resolution. However, multi-dimensional image acquisition leads to sizeable amounts of data that demand novel approaches in image processing and analysis. Results To handle, visualize and quantify time-lapse datasets recorded in multiple locations, we designed a workflow comprising three major modules. First, the previously introduced TLM-converter concatenates stacks of single time-points. The second module, TLM-2D-Explorer, creates maximum intensity projections for rapid inspection and allows the temporal alignment of multiple datasets. The transition between prepupal and pupal stage serves as reference point to compare datasets of different genotypes or treatments. We demonstrate how the temporal alignment can reveal novel insights into the east gene which is involved in muscle remodeling. The third module, TLM-3D-Segmenter, performs semi-automated segmentation of selected muscle fibers over multiple frames. 3D image segmentation consists of 3 stages. First, the user places a seed into a muscle of a key frame and performs surface detection based on level-set evolution. Second, the surface is propagated to subsequent frames. Third, automated segmentation detects nuclei inside the muscle fiber. The detected surfaces can be used to visualize and quantify the dynamics of cellular remodeling. To estimate the accuracy of our segmentation method, we performed a comparison with a manually created ground truth. Key and predicted frames achieved a performance of 84% and 80%, respectively. 
Conclusions We describe an analysis pipeline for the efficient handling and analysis of time-series microscopy data that enhances productivity and facilitates the phenotypic characterization of genetic perturbations. Our methodology can easily be scaled up for genome-wide genetic screens using readily available resources for RNAi based gene silencing in Drosophila and other animal models. PMID:23282138

  9. Imaging C. elegans embryos using an epifluorescent microscope and open source software.

    PubMed

    Verbrugghe, Koen J C; Chan, Raymond C

    2011-03-24

    Cellular processes, such as chromosome assembly, segregation and cytokinesis,are inherently dynamic. Time-lapse imaging of living cells, using fluorescent-labeled reporter proteins or differential interference contrast (DIC) microscopy, allows for the examination of the temporal progression of these dynamic events which is otherwise inferred from analysis of fixed samples(1,2). Moreover, the study of the developmental regulations of cellular processes necessitates conducting time-lapse experiments on an intact organism during development. The Caenorhabiditis elegans embryo is light-transparent and has a rapid, invariant developmental program with a known cell lineage(3), thus providing an ideal experiment model for studying questions in cell biology(4,5)and development(6-9). C. elegans is amendable to genetic manipulation by forward genetics (based on random mutagenesis(10,11)) and reverse genetics to target specific genes (based on RNAi-mediated interference and targeted mutagenesis(12-15)). In addition, transgenic animals can be readily created to express fluorescently tagged proteins or reporters(16,17). These traits combine to make it easy to identify the genetic pathways regulating fundamental cellular and developmental processes in vivo(18-21). In this protocol we present methods for live imaging of C. elegans embryos using DIC optics or GFP fluorescence on a compound epifluorescent microscope. We demonstrate the ease with which readily available microscopes, typically used for fixed sample imaging, can also be applied for time-lapse analysis using open-source software to automate the imaging process.

  10. Cell Motility Dynamics: A Novel Segmentation Algorithm to Quantify Multi-Cellular Bright Field Microscopy Images

    PubMed Central

    Zaritsky, Assaf; Natan, Sari; Horev, Judith; Hecht, Inbal; Wolf, Lior; Ben-Jacob, Eshel; Tsarfaty, Ilan

    2011-01-01

    Confocal microscopy analysis of fluorescence and morphology is becoming the standard tool in cell biology and molecular imaging. Accurate quantification algorithms are required to enhance the understanding of different biological phenomena. We present a novel approach based on image segmentation of multi-cellular regions in bright field images, demonstrating enhanced quantitative analyses and better understanding of cell motility. We present MultiCellSeg, a segmentation algorithm to separate multi-cellular from background regions in bright field images, which is based on classification of local patches within an image: a cascade of Support Vector Machines (SVMs) is applied using basic image features. Post-processing includes additional classification and graph-cut segmentation to reclassify erroneous regions and refine the segmentation. This approach leads to a parameter-free and robust algorithm. Comparison to an alternative algorithm on wound healing assay images demonstrates its superiority. The proposed approach was used to evaluate common cell migration models such as the wound healing and scatter assays. It was applied to quantify the acceleration effect of hepatocyte growth factor/scatter factor (HGF/SF) on healing rate in a time-lapse confocal microscopy wound healing assay and demonstrated that the healing rate is linear in both treated and untreated cells, and that HGF/SF accelerates the healing rate by approximately two-fold. A novel fully automated, accurate, zero-parameter method to classify and score scatter-assay images was developed and demonstrated that multi-cellular texture is an excellent descriptor to measure HGF/SF-induced cell scattering. We show that exploitation of textural information from differential interference contrast (DIC) images at the multi-cellular level can prove beneficial for the analyses of wound healing and scatter assays. 
The proposed approach is generic and can be used alone or alongside traditional fluorescence single-cell processing to perform objective, accurate quantitative analyses for various biological applications. PMID:22096600

  11. Cell motility dynamics: a novel segmentation algorithm to quantify multi-cellular bright field microscopy images.

    PubMed

    Zaritsky, Assaf; Natan, Sari; Horev, Judith; Hecht, Inbal; Wolf, Lior; Ben-Jacob, Eshel; Tsarfaty, Ilan

    2011-01-01

    Confocal microscopy analysis of fluorescence and morphology is becoming the standard tool in cell biology and molecular imaging. Accurate quantification algorithms are required to enhance the understanding of different biological phenomena. We present a novel approach based on image segmentation of multi-cellular regions in bright field images, demonstrating enhanced quantitative analyses and better understanding of cell motility. We present MultiCellSeg, a segmentation algorithm to separate multi-cellular from background regions in bright field images, which is based on classification of local patches within an image: a cascade of Support Vector Machines (SVMs) is applied using basic image features. Post-processing includes additional classification and graph-cut segmentation to reclassify erroneous regions and refine the segmentation. This approach leads to a parameter-free and robust algorithm. Comparison to an alternative algorithm on wound healing assay images demonstrates its superiority. The proposed approach was used to evaluate common cell migration models such as the wound healing and scatter assays. It was applied to quantify the acceleration effect of hepatocyte growth factor/scatter factor (HGF/SF) on healing rate in a time-lapse confocal microscopy wound healing assay and demonstrated that the healing rate is linear in both treated and untreated cells, and that HGF/SF accelerates the healing rate by approximately two-fold. A novel fully automated, accurate, zero-parameter method to classify and score scatter-assay images was developed and demonstrated that multi-cellular texture is an excellent descriptor to measure HGF/SF-induced cell scattering. We show that exploitation of textural information from differential interference contrast (DIC) images at the multi-cellular level can prove beneficial for the analyses of wound healing and scatter assays. 
The proposed approach is generic and can be used alone or alongside traditional fluorescence single-cell processing to perform objective, accurate quantitative analyses for various biological applications.
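
    The patch-classification idea behind MultiCellSeg can be caricatured without an SVM: split the image into patches and label a patch "cellular" when its local intensity variance exceeds a threshold. The synthetic image, patch size, and threshold below are invented stand-ins for the trained classifier cascade.

```python
import numpy as np

# Synthetic bright-field-like image: flat background with a textured
# "cellular" region in the center (seeded for reproducibility).
rng = np.random.default_rng(1)
img = rng.normal(100, 1.0, (64, 64))                # background
img[16:48, 16:48] += rng.normal(0, 8.0, (32, 32))   # high-variance region

def classify_patches(image, size=16, var_thresh=4.0):
    """Label each non-overlapping patch as cellular (True) or background."""
    h, w = image.shape
    mask = np.zeros((h // size, w // size), dtype=bool)
    for i in range(h // size):
        for j in range(w // size):
            patch = image[i * size:(i + 1) * size, j * size:(j + 1) * size]
            mask[i, j] = patch.var() > var_thresh
    return mask

mask = classify_patches(img)
print(mask.astype(int))  # central 2x2 block of patches flagged as cellular
```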

  12. Bayesian Safety Risk Modeling of Human-Flightdeck Automation Interaction

    NASA Technical Reports Server (NTRS)

    Ancel, Ersin; Shih, Ann T.

    2015-01-01

    Usage of automated systems in airliners has increased fuel efficiency, added extra capabilities, and enhanced safety and reliability, as well as improved passenger comfort, since their introduction in the late 1980s. However, the original automation benefits, including reduced flight crew workload, human error, and training requirements, were not achieved as originally expected. Instead, automation introduced new failure modes, redistributed and sometimes increased workload, brought in new cognitive and attention demands, and increased training requirements. Modern airliners have numerous flight modes, providing more flexibility (and inherently more complexity) to the flight crew. However, the price to pay for the increased flexibility is the need for increased mode awareness, as well as the need to supervise, understand, and predict automated system behavior. Also, over-reliance on automation is linked to manual flight skill degradation and complacency in commercial pilots. As a result, recent accidents involving human error are often caused by the interactions between humans and the automated systems (e.g., the breakdown in man-machine coordination), deteriorated manual flying skills, and/or loss of situational awareness due to heavy dependence on automated systems. This paper describes the development of the increased complexity and reliance on automation baseline model, named FLAP for FLightdeck Automation Problems. The model development process starts with a comprehensive literature review followed by the construction of a framework composed of high-level causal factors leading to an automation-related flight anomaly. The framework was then converted into a Bayesian Belief Network (BBN) using the Hugin software v7.8. The effects of automation on the flight crew are incorporated into the model, including flight skill degradation and increased cognitive demand and training requirements, along with their interactions. 
Besides flight crew deficiencies, automation system failures and anomalies of avionic systems are also incorporated. The resultant model helps simulate the emergence of automation-related issues in today's modern airliners from a top-down, generalized approach, which serves as a platform to evaluate NASA-developed technologies.
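
    A toy version of the BBN approach, with invented structure and probabilities (not FLAP's actual network): a three-node chain from automation reliance through skill degradation to a flight anomaly, marginalized by brute-force enumeration over the joint distribution.

```python
# Illustrative three-node Bayesian belief network; all numbers are invented.
p_reliance = 0.7                      # P(high reliance on automation)
p_degrade = {True: 0.4, False: 0.1}   # P(skill degraded | reliance)
p_anomaly = {True: 0.2, False: 0.05}  # P(flight anomaly | degraded)

# Marginal P(flight anomaly) by summing over all parent configurations.
joint = 0.0
for reliance in (True, False):
    pr = p_reliance if reliance else 1 - p_reliance
    for degraded in (True, False):
        pd = p_degrade[reliance] if degraded else 1 - p_degrade[reliance]
        joint += pr * pd * p_anomaly[degraded]

print(round(joint, 4))  # → 0.0965
```

    Real BBN tools such as Hugin perform this kind of inference efficiently on much larger networks; enumeration is shown here only to make the mechanics explicit.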

  13. Device and method for automated separation of a sample of whole blood into aliquots

    DOEpatents

    Burtis, Carl A.; Johnson, Wayne F.

    1989-01-01

    A device and a method for automated processing and separation of an unmeasured sample of whole blood into multiple aliquots of plasma. Capillaries are radially oriented on a rotor, with the rotor defining a sample chamber, transfer channels, overflow chamber, overflow channel, vent channel, cell chambers, and processing chambers. A sample of whole blood is placed in the sample chamber, and when the rotor is rotated, the blood moves outward through the transfer channels to the processing chambers where the blood is centrifugally separated into a solid cellular component and a liquid plasma component. When the rotor speed is decreased, the plasma component backfills the capillaries resulting in uniform aliquots of plasma which may be used for subsequent analytical procedures.

  14. Toward designing for trust in database automation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duez, P. P.; Jamieson, G. A.

    Appropriate reliance on system automation is imperative for safe and productive work, especially in safety-critical systems. It is unsafe to rely on automation beyond its designed use; conversely, it can be both unproductive and unsafe to manually perform tasks that are better relegated to automated tools. Operator trust in automated tools mediates reliance, and trust appears to affect how operators use technology. As automated agents become more complex, the question of trust in automation is increasingly important. In order to achieve proper use of automation, we must engender an appropriate degree of trust that is sensitive to changes in operating functions and context. In this paper, we present research concerning trust in automation in the domain of automated tools for relational databases. Lee and See have provided models of trust in automation. One model developed by Lee and See identifies three key categories of information about the automation that lie along a continuum of attributional abstraction. Purpose-, process-, and performance-related information serve, both individually and through inferences between them, to describe automation in such a way as to engender properly calibrated trust. Thus, one can look at information from different levels of attributional abstraction as a general requirements analysis for information key to appropriate trust in automation. The model of information necessary to engender appropriate trust in automation [1] is a general one. Although it describes categories of information, it does not provide insight on how to determine the specific information elements required for a given automated tool. We have applied the Abstraction Hierarchy (AH) to this problem in the domain of relational databases.
The AH serves as a formal description of the automation at several levels of abstraction, ranging from a very abstract purpose-oriented description to a more concrete description of the resources involved in the automated process. The connection between an AH for an automated tool and a list of information elements at the three levels of attributional abstraction is then direct, providing a method for satisfying information requirements for appropriate trust in automation. In this paper, we will present our method for developing specific information requirements for an automated tool, based on a formal analysis of that tool and the models presented by Lee and See. We will show an example of the application of the AH to automation, in the domain of relational database automation, and the resulting set of specific information elements for appropriate trust in the automated tool. Finally, we will comment on the applicability of this approach to the domain of nuclear plant instrumentation. (authors)

  15. Rapid automation of a cell-based assay using a modular approach: case study of a flow-based Varicella Zoster Virus infectivity assay.

    PubMed

    Joelsson, Daniel; Gates, Irina V; Pacchione, Diana; Wang, Christopher J; Bennett, Philip S; Zhang, Yuhua; McMackin, Jennifer; Frey, Tina; Brodbeck, Kristin C; Baxter, Heather; Barmat, Scott L; Benetti, Luca; Bodmer, Jean-Luc

    2010-06-01

    Vaccine manufacturing requires constant analytical monitoring to ensure reliable quality and a consistent safety profile of the final product. Concentration and bioactivity of active components of the vaccine are key attributes routinely evaluated throughout the manufacturing cycle and for product release and dosage. In the case of live attenuated virus vaccines, bioactivity is traditionally measured in vitro by infection of susceptible cells with the vaccine, followed by quantification of virus replication, cytopathology, or expression of viral markers. These assays are typically multi-day procedures that require trained technicians and constant attention. Considering the need for high volumes of testing, automation and streamlining of these assays is highly desirable. In this study, the automation and streamlining of a complex infectivity assay for Varicella Zoster Virus (VZV)-containing test articles is presented. The automation procedure was completed using existing liquid handling infrastructure in a modular fashion, limiting custom-designed elements to a minimum to facilitate transposition. In addition, cellular senescence data provided an optimal population doubling range for long-term, reliable assay operation at high throughput. The results presented in this study demonstrate a successful automation paradigm resulting in an eightfold increase in throughput while maintaining assay performance characteristics comparable to the original assay. Copyright 2010 Elsevier B.V. All rights reserved.

  16. A Multiple Agent Model of Human Performance in Automated Air Traffic Control and Flight Management Operations

    NASA Technical Reports Server (NTRS)

    Corker, Kevin; Pisanich, Gregory; Condon, Gregory W. (Technical Monitor)

    1995-01-01

    A predictive model of human operator performance (flight crew and air traffic control (ATC)) has been developed and applied in order to evaluate the impact of automation developments in flight management and air traffic control. The model is used to predict the performance of a two-person flight crew and the ATC operators generating and responding to clearances, aided by the Center TRACON Automation System (CTAS). The purpose of the modeling is to support evaluation and design of automated aids for flight management and airspace management, and to predict required procedural changes, both in the air and on the ground, in response to advancing automation in both domains. Additional information is contained in the original extended abstract.

  17. DMPy: a Python package for automated mathematical model construction of large-scale metabolic systems.

    PubMed

    Smith, Robert W; van Rosmalen, Rik P; Martins Dos Santos, Vitor A P; Fleck, Christian

    2018-06-19

    Models of metabolism are often used in biotechnology and pharmaceutical research to identify drug targets or increase the direct production of valuable compounds. Due to the complexity of large metabolic systems, a number of conclusions have been drawn using mathematical methods with simplifying assumptions. For example, constraint-based models assume that internal concentrations change much more quickly than cell physiology, so metabolite concentrations and reaction fluxes are fixed to constant values. This greatly reduces the mathematical complexity, while providing a reasonably good description of the system in steady state. However, without a large number of constraints, many different flux sets can describe the optimal model and we obtain no information on how metabolite levels dynamically change. Thus, to accurately determine what is taking place within the cell, higher-quality data and more detailed models need to be constructed. In this paper we present a computational framework, DMPy, that uses a network scheme as input to automatically search for kinetic rates and produce a mathematical model that describes temporal changes of metabolite fluxes. The parameter search utilises several online databases to find measured reaction parameters. From this, we take advantage of previous modelling efforts, such as Parameter Balancing, to produce an initial mathematical model of a metabolic pathway. We analyse the effect of parameter uncertainty on model dynamics and test how recent flux-based model reduction techniques alter system properties. To our knowledge this is the first time such analysis has been performed on large models of metabolism. Our results highlight that good estimates of at least 80% of the reaction rates are required to accurately model metabolic systems. Furthermore, reducing the size of the model by grouping reactions together based on fluxes alters the resulting system dynamics.
The presented pipeline automates the modelling process for large metabolic networks. From this, users can simulate their pathway of interest and obtain a better understanding of how altering conditions influences cellular dynamics. By testing the effects of different parameterisations we are also able to provide suggestions to help construct more accurate models of complete metabolic systems in the future.
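To illustrate the kind of model a pipeline like DMPy ultimately produces: once rate laws and parameters are assembled, metabolite levels evolve as a system of ODEs that can be integrated numerically. The two-step pathway, mass-action rate laws, and constants below are illustrative assumptions, not DMPy output.

```python
# Toy kinetic model of a linear pathway A --k1--> B --k2--> (sink),
# integrated by forward Euler. Pathway and parameters are assumptions
# chosen for illustration only.

def simulate(k1=0.5, k2=0.3, a0=1.0, b0=0.0, dt=0.01, steps=2000):
    """Return (A, B) concentrations after steps*dt time units."""
    a, b = a0, b0
    for _ in range(steps):
        va = k1 * a          # flux consuming A (mass action)
        vb = k2 * b          # flux consuming B
        a += dt * (-va)      # dA/dt = -v_a
        b += dt * (va - vb)  # dB/dt = +v_a - v_b
    return a, b

a, b = simulate()
print(round(a, 4), round(b, 4))  # both near zero after long integration
```

A real kinetic model would use measured rate laws (e.g. Michaelis-Menten) and a stiff ODE solver, but the structure, fluxes in and out of each metabolite, is the same.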

  18. Aviation Safety: Modeling and Analyzing Complex Interactions between Humans and Automated Systems

    NASA Technical Reports Server (NTRS)

    Rungta, Neha; Brat, Guillaume; Clancey, William J.; Linde, Charlotte; Raimondi, Franco; Seah, Chin; Shafto, Michael

    2013-01-01

    The on-going transformation from the current US Air Traffic System (ATS) to the Next Generation Air Traffic System (NextGen) will force the introduction of new automated systems and most likely will cause automation to migrate from ground to air. This will yield new function allocations between humans and automation and therefore change the roles and responsibilities in the ATS. Yet, safety in NextGen is required to be at least as good as in the current system. We therefore need techniques to evaluate the safety of the interactions between humans and automation. We think that current human factors studies and simulation-based techniques will fall short in the face of the ATS's complexity, and that we need to add more automated techniques to simulations, such as model checking, which offers exhaustive coverage of the non-deterministic behaviors in nominal and off-nominal scenarios. In this work, we present a verification approach based both on simulations and on model checking for evaluating the roles and responsibilities of humans and automation. Models are created using Brahms (a multi-agent framework) and we show that the traditional Brahms simulations can be integrated with automated exploration techniques based on model checking, thus offering a complete exploration of the behavioral space of the scenario. Our formal analysis supports the notion of beliefs and probabilities to reason about human behavior. We demonstrate the technique with the Überlingen accident, since it exemplifies authority problems when receiving conflicting advice from human and automated systems.

  19. Optimized Heart Sampling and Systematic Evaluation of Cardiac Therapies in Mouse Models of Ischemic Injury: Assessment of Cardiac Remodeling and Semi-Automated Quantification of Myocardial Infarct Size.

    PubMed

    Valente, Mariana; Araújo, Ana; Esteves, Tiago; Laundos, Tiago L; Freire, Ana G; Quelhas, Pedro; Pinto-do-Ó, Perpétua; Nascimento, Diana S

    2015-12-02

    Cardiac therapies are commonly tested preclinically in small-animal models of myocardial infarction. Following functional evaluation, post-mortem histological analysis is essential to assess morphological and molecular alterations underlying the effectiveness of treatment. However, non-methodical and inadequate sampling of the left ventricle often leads to misinterpretations and variability, making direct study comparisons unreliable. Protocols are provided for representative sampling of the ischemic mouse heart followed by morphometric analysis of the left ventricle. Extending the use of this sampling to other types of in situ analysis is also illustrated through the assessment of neovascularization and cellular engraftment in a cell-based therapy setting. This is of interest to the general cardiovascular research community as it details methods for standardization and simplification of histo-morphometric evaluation of emergent heart therapies. Copyright © 2015 John Wiley & Sons, Inc.

  20. First-Stage Development and Validation of a Web-Based Automated Dietary Modeling Tool: Using Constraint Optimization Techniques to Streamline Food Group and Macronutrient Focused Dietary Prescriptions for Clinical Trials.

    PubMed

    Probst, Yasmine; Morrison, Evan; Sullivan, Emma; Dam, Hoa Khanh

    2016-07-28

    Standardizing the background diet of participants during a dietary randomized controlled trial is vital to trial outcomes. For this process, dietary modeling based on food groups and their target servings is employed via a dietary prescription before an intervention, often using a manual process. Partial automation has employed the use of linear programming. Validity of the modeling approach is critical to allow trial outcomes to be translated to practice. This paper describes the first-stage development of a tool to automatically perform dietary modeling using food group and macronutrient requirements as a test case. The Dietary Modeling Tool (DMT) was then compared with existing approaches to dietary modeling (manual and partially automated), which were previously available to dietitians working within a dietary intervention trial. Constraint optimization techniques were implemented to determine whether nonlinear constraints are best suited to the development of the automated dietary modeling tool using food composition and food consumption data. Dietary models were produced and compared with a manual Microsoft Excel calculator, a partially automated Excel Solver approach, and the automated DMT that was developed. The web-based DMT was produced using nonlinear constraint optimization, incorporating estimated energy requirement calculations, nutrition guidance systems, and the flexibility to amend food group targets for individuals. Percentage differences between modeling tools revealed similar results for the macronutrients. Polyunsaturated fatty acids and monounsaturated fatty acids showed greater variation between tools (practically equating to a 2-teaspoon difference), although the difference was not considered clinically significant when the whole diet, as opposed to targeted nutrients or energy requirements, was being addressed.
Automated modeling tools can streamline the modeling process for dietary intervention trials ensuring consistency of the background diets, although appropriate constraints must be used in their development to achieve desired results. The DMT was found to be a valid automated tool producing similar results to tools with less automation. The results of this study suggest interchangeability of the modeling approaches used, although implementation should reflect the requirements of the dietary intervention trial in which it is used.
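The optimization idea behind a tool like the DMT can be sketched in miniature: choose servings of each food group so that the modeled macronutrients approach the prescription targets. The food composition values and targets below are invented for illustration, and a brute-force integer search stands in for the nonlinear constraint solver the actual tool uses.

```python
# Dietary modeling as constrained optimization: minimize squared deviation
# of modeled macronutrients from targets over integer serving counts.
# Composition data and targets are illustrative assumptions.
from itertools import product

# (protein, fat, carbohydrate) in grams per serving, hypothetical values
FOODS = {"grains": (3, 1, 20), "meat": (20, 8, 0), "dairy": (8, 5, 12)}
TARGET = {"protein": 60, "fat": 25, "carb": 120}  # daily targets (g)

def best_servings(max_serves=8):
    """Exhaustively search serving combinations; return (plan, error)."""
    names = list(FOODS)
    best, best_err = None, float("inf")
    for combo in product(range(max_serves + 1), repeat=len(names)):
        p = sum(n * FOODS[f][0] for n, f in zip(combo, names))
        fat = sum(n * FOODS[f][1] for n, f in zip(combo, names))
        c = sum(n * FOODS[f][2] for n, f in zip(combo, names))
        err = ((p - TARGET["protein"]) ** 2 + (fat - TARGET["fat"]) ** 2
               + (c - TARGET["carb"]) ** 2)
        if err < best_err:
            best, best_err = dict(zip(names, combo)), err
    return best, best_err

plan, err = best_servings()
print(plan, err)
```

A production tool would add nonlinear constraints (e.g. energy requirements, fatty-acid ratios) and a proper solver, but the objective, a diet close to all targets at once, is the same.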

  1. First-Stage Development and Validation of a Web-Based Automated Dietary Modeling Tool: Using Constraint Optimization Techniques to Streamline Food Group and Macronutrient Focused Dietary Prescriptions for Clinical Trials

    PubMed Central

    Morrison, Evan; Sullivan, Emma; Dam, Hoa Khanh

    2016-01-01

    Background Standardizing the background diet of participants during a dietary randomized controlled trial is vital to trial outcomes. For this process, dietary modeling based on food groups and their target servings is employed via a dietary prescription before an intervention, often using a manual process. Partial automation has employed the use of linear programming. Validity of the modeling approach is critical to allow trial outcomes to be translated to practice. Objective This paper describes the first-stage development of a tool to automatically perform dietary modeling using food group and macronutrient requirements as a test case. The Dietary Modeling Tool (DMT) was then compared with existing approaches to dietary modeling (manual and partially automated), which were previously available to dietitians working within a dietary intervention trial. Methods Constraint optimization techniques were implemented to determine whether nonlinear constraints are best suited to the development of the automated dietary modeling tool using food composition and food consumption data. Dietary models were produced and compared with a manual Microsoft Excel calculator, a partially automated Excel Solver approach, and the automated DMT that was developed. Results The web-based DMT was produced using nonlinear constraint optimization, incorporating estimated energy requirement calculations, nutrition guidance systems, and the flexibility to amend food group targets for individuals. Percentage differences between modeling tools revealed similar results for the macronutrients. Polyunsaturated fatty acids and monounsaturated fatty acids showed greater variation between tools (practically equating to a 2-teaspoon difference), although the difference was not considered clinically significant when the whole diet, as opposed to targeted nutrients or energy requirements, was being addressed.
Conclusions Automated modeling tools can streamline the modeling process for dietary intervention trials ensuring consistency of the background diets, although appropriate constraints must be used in their development to achieve desired results. The DMT was found to be a valid automated tool producing similar results to tools with less automation. The results of this study suggest interchangeability of the modeling approaches used, although implementation should reflect the requirements of the dietary intervention trial in which it is used. PMID:27471104

  2. Application of Deep Learning in Automated Analysis of Molecular Images in Cancer: A Survey

    PubMed Central

    Xue, Yong; Chen, Shihui; Liu, Yong

    2017-01-01

    Molecular imaging enables the visualization and quantitative analysis of the alterations of biological procedures at the molecular and/or cellular level, which is of great significance for early detection of cancer. In recent years, deep learning has been widely used in medical imaging analysis, as it overcomes the limitations of visual assessment and traditional machine learning techniques by extracting hierarchical features with powerful representation capability. Research on cancer molecular images using deep learning techniques is also increasing rapidly. Hence, in this paper, we review the applications of deep learning in molecular imaging in terms of tumor lesion segmentation, tumor classification, and survival prediction. We also outline some future directions in which researchers may develop more powerful deep learning models for better performance in the applications in cancer molecular imaging. PMID:29114182

  3. Overexpression of the Cell Cycle Inhibitor p16INK4a Promotes a Prothrombotic Phenotype Following Vascular Injury in Mice

    PubMed Central

    Cardenas, Jessica C.; Owens, A. Phillip; Krishnamurthy, Janakiraman; Sharpless, Norman E.; Whinna, Herbert C.; Church, Frank C.

    2011-01-01

    Objective Age-associated cellular senescence is thought to promote vascular dysfunction. p16INK4a is a cell cycle inhibitor that promotes senescence and is upregulated during normal aging. In this study, we examine the contribution of p16INK4a overexpression on venous thrombosis. Methods and Results Mice overexpressing p16INK4a were studied with four different vascular injury models: (1) ferric chloride (FeCl3) and (2) Rose Bengal to induce saphenous vein thrombus formation; (3) FeCl3 and vascular ligation to examine thrombus resolution; and (4) LPS administration to initiate inflammation-induced vascular dysfunction. p16INK4a transgenic mice had accelerated occlusion times (13.1 ± 0.4 min) compared to normal controls (19.7 ± 1.1 min) in the FeCl3 model, and 12.7 ± 2.0 min and 18.6 ± 1.9 min, respectively, in the Rose Bengal model. Moreover, overexpression of p16INK4a delayed thrombus resolution compared to normal controls. In response to LPS treatment, the p16INK4a transgenic mice showed enhanced thrombin generation in plasma-based calibrated automated thrombography (CAT) assays. Finally, bone marrow transplantation studies suggested increased p16INK4a expression in hematopoietic cells contributes to thrombosis, demonstrating a role for p16INK4a expression in venous thrombosis. Conclusions Venous thrombosis is augmented by overexpression of the cellular senescence gene p16INK4a. PMID:21233453

  4. Development of Computational Tools for Metabolic Model Curation, Flux Elucidation and Strain Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maranas, Costas D

    An overarching goal of the Department of Energy mission is the efficient deployment and engineering of microbial and plant systems to enable biomass conversion in pursuit of high energy density liquid biofuels. This has spurred the pace at which new organisms are sequenced and annotated. This torrent of genomic information has opened the door to understanding metabolism in not just skeletal pathways and a handful of microorganisms but for truly genome-scale reconstructions derived for hundreds of microbes and plants. Understanding and redirecting metabolism is crucial because metabolic fluxes are unique descriptors of cellular physiology that directly assess the current cellular state and quantify the effect of genetic engineering interventions. At the same time, however, trying to keep pace with the rate of genomic data generation has ushered in a number of modeling and computational challenges related to (i) the automated assembly, testing and correction of genome-scale metabolic models, (ii) metabolic flux elucidation using labeled isotopes, and (iii) comprehensive identification of engineering interventions leading to the desired metabolism redirection.

  5. MATLAB-based automated patch-clamp system for awake behaving mice

    PubMed Central

    Siegel, Jennifer J.; Taylor, William; Chitwood, Raymond A.; Johnston, Daniel

    2015-01-01

    Automation has been an important part of biomedical research for decades, and the use of automated and robotic systems is now standard for such tasks as DNA sequencing, microfluidics, and high-throughput screening. Recently, Kodandaramaiah and colleagues (Nat Methods 9: 585–587, 2012) demonstrated, using anesthetized animals, the feasibility of automating blind patch-clamp recordings in vivo. Blind patch is a good target for automation because it is a complex yet highly stereotyped process that revolves around analysis of a single signal (electrode impedance) and movement along a single axis. Here, we introduce an automated system for blind patch-clamp recordings from awake, head-fixed mice running on a wheel. In its design, we were guided by 3 requirements: easy-to-use and easy-to-modify software; seamless integration of behavioral equipment; and efficient use of time. The resulting system employs equipment that is standard for patch recording rigs, moderately priced, or simple to make. It is written entirely in MATLAB, a programming environment that has an enormous user base in the neuroscience community and many available resources for analysis and instrument control. Using this system, we obtained 19 whole cell patch recordings from neurons in the prefrontal cortex of awake mice, aged 8–9 wk. Successful recordings had series resistances that averaged 52 ± 4 MΩ and required 5.7 ± 0.6 attempts to obtain. These numbers are comparable with those of experienced electrophysiologists working manually, and this system, written in a simple and familiar language, will be useful to many cellular electrophysiologists who wish to study awake behaving mice. PMID:26084901
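The core logic this record describes, watching a single signal (electrode resistance) while moving along a single axis, and reacting to the jumps that mark each stage of a blind patch attempt, can be sketched as a small state machine. The thresholds and the simulated resistance trace below are illustrative assumptions, not values from the published system.

```python
# State-machine sketch of automated blind patch-clamping: classify progress
# from a stream of electrode resistance readings. Thresholds and the test
# trace are invented for illustration.

def patch_state_machine(resistances_mohm):
    """Return the final stage reached given resistance readings (in MΩ)."""
    state = "hunting"                          # advancing toward a cell
    for r in resistances_mohm:
        if state == "hunting" and r > 8:       # small jump: cell contact
            state = "sealing"                  # halt, apply suction
        elif state == "sealing" and r > 1000:  # gigaseal formed
            state = "whole-cell"               # ready for break-in
    return state

trace = [5, 5, 6, 9, 40, 300, 1500]  # simulated readings (MΩ)
print(patch_state_machine(trace))    # reaches the whole-cell stage
```

The real system adds pressure control, retry logic, and quality checks, but this single-signal, single-axis structure is what makes blind patching a good automation target.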

  6. Automation in organizations: Eternal conflict

    NASA Technical Reports Server (NTRS)

    Dieterly, D. L.

    1981-01-01

    Some ideas on and insights into the problems associated with automation in organizations are presented with emphasis on the concept of automation, its relationship to the individual, and its impact on system performance. An analogy is drawn, based on an American folk hero, to emphasize the extent of the problems encountered when dealing with automation within an organization. A model is proposed to focus attention on a set of appropriate dimensions. The function allocation process becomes a prominent aspect of the model. The current state of automation research is mentioned in relation to the ideas introduced. Proposed directions for an improved understanding of automation's effect on the individual's efficiency are discussed. The importance of understanding the individual's perception of the system in terms of the degree of automation is highlighted.

  7. Automated MRI Segmentation for Individualized Modeling of Current Flow in the Human Head

    PubMed Central

    Huang, Yu; Dmochowski, Jacek P.; Su, Yuzhuo; Datta, Abhishek; Rorden, Christopher; Parra, Lucas C.

    2013-01-01

    Objective High-definition transcranial direct current stimulation (HD-tDCS) and high-density electroencephalography (HD-EEG) require accurate models of current flow for precise targeting and current source reconstruction. At a minimum, such modeling must capture the idiosyncratic anatomy of brain, cerebrospinal fluid (CSF) and skull for each individual subject. Currently, the process to build such high-resolution individualized models from structural magnetic resonance images (MRI) requires labor-intensive manual segmentation, even when leveraging available automated segmentation tools. Also, accurate placement of many high-density electrodes on individual scalp is a tedious procedure. The goal was to develop fully automated techniques to reduce the manual effort in such a modeling process. Approach A fully automated segmentation technique based on Statistical Parametric Mapping 8 (SPM8), including an improved tissue probability map (TPM) and an automated correction routine for segmentation errors, was developed, along with an automated electrode placement tool for high-density arrays. The performance of these automated routines was evaluated against results from manual segmentation on 4 healthy subjects and 7 stroke patients. The criteria include segmentation accuracy, the difference of current flow distributions in resulting HD-tDCS models and the optimized current flow intensities on cortical targets. Main results The segmentation tool can segment out not just the brain but also provide accurate results for CSF, skull and other soft tissues with a field of view (FOV) extending to the neck. Compared to manual results, automated segmentation deviates by only 7% and 18% for normal and stroke subjects, respectively. The predicted electric fields in the brain deviate by 12% and 29% respectively, which is well within the variability observed for various modeling choices. Finally, optimized current flow intensities on cortical targets do not differ significantly.
Significance Fully automated individualized modeling may now be feasible for large-sample EEG research studies and tDCS clinical trials. PMID:24099977

  8. Automated MRI segmentation for individualized modeling of current flow in the human head

    NASA Astrophysics Data System (ADS)

    Huang, Yu; Dmochowski, Jacek P.; Su, Yuzhuo; Datta, Abhishek; Rorden, Christopher; Parra, Lucas C.

    2013-12-01

    Objective. High-definition transcranial direct current stimulation (HD-tDCS) and high-density electroencephalography require accurate models of current flow for precise targeting and current source reconstruction. At a minimum, such modeling must capture the idiosyncratic anatomy of the brain, cerebrospinal fluid (CSF) and skull for each individual subject. Currently, the process to build such high-resolution individualized models from structural magnetic resonance images requires labor-intensive manual segmentation, even when utilizing available automated segmentation tools. Also, accurate placement of many high-density electrodes on an individual scalp is a tedious procedure. The goal was to develop fully automated techniques to reduce the manual effort in such a modeling process. Approach. A fully automated segmentation technique based on Statistical Parametric Mapping 8, including an improved tissue probability map and an automated correction routine for segmentation errors, was developed, along with an automated electrode placement tool for high-density arrays. The performance of these automated routines was evaluated against results from manual segmentation on four healthy subjects and seven stroke patients. The criteria include segmentation accuracy, the difference of current flow distributions in resulting HD-tDCS models and the optimized current flow intensities on cortical targets. Main results. The segmentation tool can segment out not just the brain but also provide accurate results for CSF, skull and other soft tissues with a field of view extending to the neck. Compared to manual results, automated segmentation deviates by only 7% and 18% for normal and stroke subjects, respectively. The predicted electric fields in the brain deviate by 12% and 29% respectively, which is well within the variability observed for various modeling choices. Finally, optimized current flow intensities on cortical targets do not differ significantly. Significance.
Fully automated individualized modeling may now be feasible for large-sample EEG research studies and tDCS clinical trials.

  9. Model-Based Design of Air Traffic Controller-Automation Interaction

    NASA Technical Reports Server (NTRS)

    Romahn, Stephan; Callantine, Todd J.; Palmer, Everett A.; Null, Cynthia H. (Technical Monitor)

    1998-01-01

    A model of controller and automation activities was used to design the controller-automation interactions necessary to implement a new terminal area air traffic management concept. The model was then used to design a controller interface that provides the requisite information and functionality. Using data from a preliminary study, the Crew Activity Tracking System (CATS) was used to help validate the model as a computational tool for describing controller performance.

  10. Semi-Automated Processing of Trajectory Simulator Output Files for Model Evaluation

    DTIC Science & Technology

    2018-01-01

    ARL-TR-8284 ● JAN 2018 ● US Army Research Laboratory: Semi-Automated Processing of Trajectory Simulator Output Files for Model Evaluation.

  11. An automated image processing routine for segmentation of cell cytoplasms in high-resolution autofluorescence images

    NASA Astrophysics Data System (ADS)

    Walsh, Alex J.; Skala, Melissa C.

    2014-02-01

    The heterogeneity of genotypes and phenotypes within cancers is correlated with disease progression and drug-resistant cellular sub-populations. Therefore, robust techniques capable of probing majority and minority cell populations are important both for cancer diagnostics and therapy monitoring. Herein, we present a modified CellProfiler routine to isolate cytoplasmic fluorescence signal at the single-cell level from high-resolution autofluorescence microscopy images.
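The labeling step at the heart of such a single-cell segmentation routine can be illustrated with a stdlib-only connected-components pass over an already-thresholded image. CellProfiler's actual pipeline (cytoplasm/nucleus association, declumping, intensity measurement) is far richer than this sketch.

```python
# Connected-component labeling of a binary mask: each 4-connected bright
# region gets a distinct integer label, i.e. one candidate cell.

def label_components(mask):
    """Label 4-connected components of a 2D binary list; return (labels, n)."""
    h, w = len(mask), len(mask[0])
    labels = [[0] * w for _ in range(h)]
    n = 0
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not labels[y][x]:
                n += 1                         # start a new cell label
                stack = [(y, x)]
                while stack:                   # iterative flood fill
                    cy, cx = stack.pop()
                    if (0 <= cy < h and 0 <= cx < w
                            and mask[cy][cx] and not labels[cy][cx]):
                        labels[cy][cx] = n
                        stack += [(cy + 1, cx), (cy - 1, cx),
                                  (cy, cx + 1), (cy, cx - 1)]
    return labels, n

mask = [[1, 1, 0, 0],
        [0, 1, 0, 1],
        [0, 0, 0, 1]]
labels, n_cells = label_components(mask)
print(n_cells)  # two separate bright regions
```

Per-cell fluorescence would then be summed over pixels sharing a label, which is essentially how per-cell measurements are tabulated after segmentation.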

  12. Single-cell multimodal profiling reveals cellular epigenetic heterogeneity.

    PubMed

    Cheow, Lih Feng; Courtois, Elise T; Tan, Yuliana; Viswanathan, Ramya; Xing, Qiaorui; Tan, Rui Zhen; Tan, Daniel S W; Robson, Paul; Loh, Yuin-Han; Quake, Stephen R; Burkholder, William F

    2016-10-01

    Sample heterogeneity often masks DNA methylation signatures in subpopulations of cells. Here, we present a method to genotype single cells while simultaneously interrogating gene expression and DNA methylation at multiple loci. We used this targeted multimodal approach, implemented on an automated, high-throughput microfluidic platform, to assess primary lung adenocarcinomas and human fibroblasts undergoing reprogramming by profiling epigenetic variation among cell types identified through genotyping and transcriptional analysis.

  13. NeuroCa: integrated framework for systematic analysis of spatiotemporal neuronal activity patterns from large-scale optical recording data

    PubMed Central

    Jang, Min Jee; Nam, Yoonkey

    2015-01-01

    Abstract. Optical recording facilitates monitoring the activity of a large neural network at the cellular scale, but the analysis and interpretation of the collected data remain challenging. Here, we present a MATLAB-based toolbox, named NeuroCa, for the automated processing and quantitative analysis of large-scale calcium imaging data. Our tool includes several computational algorithms to extract the calcium spike trains of individual neurons from the calcium imaging data in an automatic fashion. Two algorithms were developed to decompose the imaging data into the activity of individual cells and subsequently detect calcium spikes from each neuronal signal. Applying our method to dense networks in dissociated cultures, we were able to obtain the calcium spike trains of ∼1000 neurons in a few minutes. Further analyses using these data permitted the quantification of neuronal responses to chemical stimuli as well as functional mapping of spatiotemporal patterns in neuronal firing within the spontaneous, synchronous activity of a large network. These results demonstrate that our method not only automates time-consuming, labor-intensive tasks in the analysis of neural data obtained using optical recording techniques but also provides a systematic way to visualize and quantify the collective dynamics of a network in terms of its cellular elements. PMID:26229973
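A minimal stand-in for the spike-detection step a toolbox like NeuroCa automates: flag upward crossings of a fluorescence trace above its baseline. The trace and threshold below are illustrative assumptions; NeuroCa itself is a MATLAB toolbox with more sophisticated detection algorithms.

```python
# Threshold-based calcium spike detection: report the indices where the
# trace first rises above baseline + threshold (rising edges only).
# Baseline is estimated crudely as the median sample.

def detect_spikes(trace, threshold=0.5):
    """Return indices of upward threshold crossings in a 1D trace."""
    baseline = sorted(trace)[len(trace) // 2]  # median as baseline estimate
    spikes, above = [], False
    for i, v in enumerate(trace):
        if v > baseline + threshold:
            if not above:                      # count each event once
                spikes.append(i)
            above = True
        else:
            above = False
    return spikes

trace = [0.1, 0.1, 1.2, 1.0, 0.2, 0.1, 0.9, 0.2]  # simulated ΔF/F trace
print(detect_spikes(trace))  # two transient events
```

Running such a detector over every segmented cell yields the per-neuron spike trains from which network-level statistics (synchrony, stimulus responses) are computed.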

  14. Automated imaging of cellular spheroids with selective plane illumination microscopy on a chip (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Paiè, Petra; Bassi, Andrea; Bragheri, Francesca; Osellame, Roberto

    2017-02-01

    Selective plane illumination microscopy (SPIM) is an optical sectioning technique that allows imaging of biological samples at high spatio-temporal resolution. Standard SPIM devices require dedicated set-ups, complex sample preparation and accurate system alignment, thus limiting the automation of the technique, its accessibility and throughput. We present a millimeter-scaled optofluidic device that incorporates selective plane illumination and fully automatic sample delivery and scanning. To this end an integrated cylindrical lens and a three-dimensional fluidic network were fabricated by femtosecond laser micromachining into a single glass chip. This device can upgrade any standard fluorescence microscope to a SPIM system. We used SPIM on a CHIP to automatically scan biological samples under a conventional microscope, without the need for a motorized stage: tissue spheroids expressing fluorescent proteins were flowed through the microchannel at constant speed and their sections were acquired while passing through the light sheet. We demonstrate high-throughput imaging of the entire sample volume (at a rate of 30 samples/min), segmentation and quantification in thick (100-300 μm diameter) cellular spheroids. This optofluidic device makes SPIM analyses accessible to non-expert end-users, opening the way to automatic and fast screening of a high number of samples at subcellular resolution.

  15. Towards monitoring real-time cellular response using an integrated microfluidics-MALDI/nESI-ion mobility-mass spectrometry platform

    PubMed Central

    Enders, Jeffrey R.; Marasco, Christina C.; Kole, Ayeeshik; Nguyen, Bao; Sundarapandian, Sevugarajan; Seale, Kevin T.; Wikswo, John P.; McLean, John A.

    2014-01-01

    The combination of microfluidic cell trapping devices with ion mobility-mass spectrometry offers the potential for elucidating in real time the dynamic responses of small populations of cells to paracrine signals, changes in metabolite levels, and delivery of drugs and toxins. Preliminary experiments examining peptides in methanol and recording the interactions of yeast and Jurkat cells with their superfusate have identified instrumental setup and control parameters and on-line desalting procedures. Numerous initial experiments demonstrate and validate this new instrumental platform. Future outlooks and potential applications are addressed, specifically how this instrumentation may be used for fully automated systems biology studies of the significantly interdependent, dynamic internal workings of cellular metabolic and signaling pathways. PMID:21073240

  16. Viscum album neutralizes tumor-induced immunosuppression in a human in vitro cell model

    PubMed Central

    Steinborn, Carmen; Klemd, Amy Marisa; Sauer, Barbara; Garcia-Käufer, Manuel; Urech, Konrad; Follo, Marie; Ücker, Annekathrin; Kienle, Gunver Sophia; Huber, Roman

    2017-01-01

    Tumor cells have the capacity to secrete immunosuppressive substances in order to diminish dendritic cell (DC) activity and thereby escape from immune responses. The impact of mistletoe (Viscum album) extracts (VAE), which are frequently used as an additive anti-cancer therapy to stimulate the immune response, is still unknown. Using a human cellular system, the impact of two different VAE (VAEA and VAEI) on the maturation of human dendritic cells and on T cell function has been investigated using flow cytometry, automated fluorescence microscopy and cytokine bead array assays. Furthermore, we examined whether VAEI was able to counteract tumor-induced immunosuppression within this cellular system using a renal cancer cell model. The role of mistletoe lectin (ML) was analyzed using ML-specific antibodies and ML-depleted VAEI. VAEI and VAEA augmented the maturation of dendritic cells. VAEI abrogated tumor-induced immunosuppression of dendritic cells, and both processes were partially mediated by ML, since ML-depleted VAEI and ML-specific antibodies almost neutralized the rehabilitative effects of VAEI on DC maturation. In co-culture experiments with purified CD4+ T cells under these settings, VAE had no influence on T cell proliferation and activation but did affect IFN-γ secretion. The study provides a potential mode-of-action of VAE as an additive cancer therapy based on immunomodulatory effects. However, the impact on the in vivo situation has to be evaluated in further studies. PMID:28719632

  17. An Automated Method for High-Definition Transcranial Direct Current Stimulation Modeling*

    PubMed Central

    Huang, Yu; Su, Yuzhuo; Rorden, Christopher; Dmochowski, Jacek; Datta, Abhishek; Parra, Lucas C.

    2014-01-01

    Targeted transcranial stimulation with electric currents requires accurate models of the current flow from scalp electrodes to the human brain. Idiosyncratic anatomy of individual brains and heads leads to significant variability in such current flows across subjects, thus necessitating accurate individualized head models. Here we report on an automated processing chain that computes current distributions in the head starting from a structural magnetic resonance image (MRI). The main purpose of automating this process is to reduce the substantial effort currently required for manual segmentation, electrode placement, and solving of finite element models. In doing so, several weeks of manual labor were reduced to no more than 4 hours of computation time and minimal user interaction, while current-flow results for the automated method deviated by less than 27.9% from the manual method. Key facilitating factors are the addition of three tissue types (skull, scalp and air) to a state-of-the-art automated segmentation process, morphological processing to correct small but important segmentation errors, and automated placement of small electrodes based on easily reproducible standard electrode configurations. We anticipate that such an automated process will become an indispensable tool to individualize transcranial direct current stimulation (tDCS) therapy. PMID:23367144

  18. On the engineering design for systematic integration of agent-orientation in industrial automation.

    PubMed

    Yu, Liyong; Schüller, Andreas; Epple, Ulrich

    2014-09-01

    In today's automation industry, agent-oriented development of system functionalities appears to have great potential for increasing the autonomy and flexibility of complex operations while lowering the workload of users. In this paper, we present a reference model for the harmonious and systematic integration of agent-orientation in industrial automation. Considering compatibility with existing automation systems and best practice, this model combines advantages of function block technology, service orientation and native description methods from the automation standard IEC 61131-3. This approach can be applied as a guideline for the engineering design of future agent-oriented automation systems. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.

  19. Application of DEN refinement and automated model building to a difficult case of molecular-replacement phasing: the structure of a putative succinyl-diaminopimelate desuccinylase from Corynebacterium glutamicum.

    PubMed

    Brunger, Axel T; Das, Debanu; Deacon, Ashley M; Grant, Joanna; Terwilliger, Thomas C; Read, Randy J; Adams, Paul D; Levitt, Michael; Schröder, Gunnar F

    2012-04-01

    Phasing by molecular replacement remains difficult for targets that are far from the search model or in situations where the crystal diffracts only weakly or to low resolution. Here, the process of determining and refining the structure of Cgl1109, a putative succinyl-diaminopimelate desuccinylase from Corynebacterium glutamicum, at ∼3 Å resolution is described using a combination of homology modeling with MODELLER, molecular-replacement phasing with Phaser, deformable elastic network (DEN) refinement and automated model building using AutoBuild in a semi-automated fashion, followed by final refinement cycles with phenix.refine and Coot. This difficult molecular-replacement case illustrates the power of including DEN restraints derived from a starting model to guide the movements of the model during refinement. The resulting improved model phases provide better starting points for automated model building and produce more significant difference peaks in anomalous difference Fourier maps to locate anomalous scatterers than does standard refinement. This example also illustrates a current limitation of automated procedures that require manual adjustment of local sequence misalignments between the homology model and the target sequence.

  20. Application of DEN refinement and automated model building to a difficult case of molecular-replacement phasing: the structure of a putative succinyl-diaminopimelate desuccinylase from Corynebacterium glutamicum

    PubMed Central

    Brunger, Axel T.; Das, Debanu; Deacon, Ashley M.; Grant, Joanna; Terwilliger, Thomas C.; Read, Randy J.; Adams, Paul D.; Levitt, Michael; Schröder, Gunnar F.

    2012-01-01

    Phasing by molecular replacement remains difficult for targets that are far from the search model or in situations where the crystal diffracts only weakly or to low resolution. Here, the process of determining and refining the structure of Cgl1109, a putative succinyl-diaminopimelate desuccinylase from Corynebacterium glutamicum, at ∼3 Å resolution is described using a combination of homology modeling with MODELLER, molecular-replacement phasing with Phaser, deformable elastic network (DEN) refinement and automated model building using AutoBuild in a semi-automated fashion, followed by final refinement cycles with phenix.refine and Coot. This difficult molecular-replacement case illustrates the power of including DEN restraints derived from a starting model to guide the movements of the model during refinement. The resulting improved model phases provide better starting points for automated model building and produce more significant difference peaks in anomalous difference Fourier maps to locate anomalous scatterers than does standard refinement. This example also illustrates a current limitation of automated procedures that require manual adjustment of local sequence misalignments between the homology model and the target sequence. PMID:22505259

  1. Discovering novel phenotypes with automatically inferred dynamic models: a partial melanocyte conversion in Xenopus

    NASA Astrophysics Data System (ADS)

    Lobo, Daniel; Lobikin, Maria; Levin, Michael

    2017-01-01

    Progress in regenerative medicine requires reverse-engineering cellular control networks to infer perturbations with desired systems-level outcomes. Such dynamic models allow phenotypic predictions for novel perturbations to be rapidly assessed in silico. Here, we analyzed a Xenopus model of conversion of melanocytes to a metastatic-like phenotype only previously observed in an all-or-none manner. Prior in vivo genetic and pharmacological experiments showed that individual animals either fully convert or remain normal, at some characteristic frequency after a given perturbation. We developed a Machine Learning method which inferred a model explaining this complex, stochastic all-or-none dataset. We then used this model to ask how a new phenotype could be generated: animals in which only some of the melanocytes converted. Systematically performing in silico perturbations, the model predicted that a combination of altanserin (5HTR2 inhibitor), reserpine (VMAT inhibitor), and VP16-XlCreb1 (constitutively active CREB) would break the all-or-none concordance. Remarkably, applying the predicted combination of three reagents in vivo revealed precisely the expected novel outcome, resulting in partial conversion of melanocytes within individuals. This work demonstrates the capability of automated analysis of dynamic models of signaling networks to discover novel phenotypes and predictively identify specific manipulations that can reach them.

  2. Automation Applications in an Advanced Air Traffic Management System : Volume 5B. DELTA Simulation Model - Programmer's Guide.

    DOT National Transportation Integrated Search

    1974-08-01

    Volume 5 describes the DELTA Simulation Model. It includes all documentation of the DELTA (Determine Effective Levels of Task Automation) computer simulation developed by TRW for use in the Automation Applications Study. Volume 5A includes a user's m...

  3. Physics of automated driving in framework of three-phase traffic theory.

    PubMed

    Kerner, Boris S

    2018-04-01

    We have revealed physical features of automated driving in the framework of the three-phase traffic theory for which there is no fixed time headway to the preceding vehicle. A comparison with the classical model approach to automated driving for which an automated driving vehicle tries to reach a fixed (desired or "optimal") time headway to the preceding vehicle has been made. It turns out that automated driving in the framework of the three-phase traffic theory can exhibit the following advantages in comparison with the classical model of automated driving: (i) The absence of string instability. (ii) Considerably smaller speed disturbances at road bottlenecks. (iii) Automated driving vehicles based on the three-phase theory can decrease the probability of traffic breakdown at the bottleneck in mixed traffic flow consisting of human driving and automated driving vehicles; on the contrary, even a single automated driving vehicle based on the classical approach can provoke traffic breakdown at the bottleneck in mixed traffic flow.

  4. Physics of automated driving in framework of three-phase traffic theory

    NASA Astrophysics Data System (ADS)

    Kerner, Boris S.

    2018-04-01

    We have revealed physical features of automated driving in the framework of the three-phase traffic theory for which there is no fixed time headway to the preceding vehicle. A comparison with the classical model approach to automated driving for which an automated driving vehicle tries to reach a fixed (desired or "optimal") time headway to the preceding vehicle has been made. It turns out that automated driving in the framework of the three-phase traffic theory can exhibit the following advantages in comparison with the classical model of automated driving: (i) The absence of string instability. (ii) Considerably smaller speed disturbances at road bottlenecks. (iii) Automated driving vehicles based on the three-phase theory can decrease the probability of traffic breakdown at the bottleneck in mixed traffic flow consisting of human driving and automated driving vehicles; on the contrary, even a single automated driving vehicle based on the classical approach can provoke traffic breakdown at the bottleneck in mixed traffic flow.
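
    The contrast drawn in these two records can be made concrete with a hedged sketch (the controller forms and every gain and threshold here are illustrative, not Kerner's equations): the classical controller always regulates the gap toward one fixed desired time headway, whereas a three-phase-style controller leaves the gap unregulated anywhere inside an acceptance range and only adapts its speed to the leader.

```python
def classical_acc(v, gap, v_lead, T=1.5, k1=0.3, k2=0.3):
    """Classical model: always steer the gap toward one fixed desired
    time headway T (gains illustrative)."""
    return k1 * (gap - v * T) + k2 * (v_lead - v)

def three_phase_acc(v, gap, v_lead, t_safe=1.0, t_max=3.0, k=0.3):
    """Three-phase-style control: no fixed time headway. Inside the
    acceptance range [t_safe, t_max] the vehicle only adapts its speed
    to the leader; the gap itself is left alone."""
    headway = gap / v if v > 0 else float("inf")
    if t_safe <= headway <= t_max:
        return k * (v_lead - v)                    # speed adaptation only
    target = t_safe if headway < t_safe else t_max
    return k * (v_lead - v) + k * (gap - v * target)

# same situation for both models: leader at equal speed, 2 s headway
print(classical_acc(20.0, 40.0, 20.0))    # nonzero: still chasing T = 1.5 s
print(three_phase_acc(20.0, 40.0, 20.0))  # zero: headway is acceptable as-is
```

    The zero acceleration inside the acceptance range is what suppresses the small disturbances that the fixed-headway controller keeps injecting near bottlenecks.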

  5. An attempt of modelling debris flows characterised by strong inertial effects through Cellular Automata

    NASA Astrophysics Data System (ADS)

    Iovine, G.; D'Ambrosio, D.

    2003-04-01

    Cellular Automata models are a valid method for simulating complex phenomena that can be described in "a-centric" terms, i.e. through local interactions within a discrete time-space. In particular, flow-type landslides (such as debris flows) can be viewed as a-centric dynamical systems. SCIDDICA S4b, the latest release of a family of two-dimensional hexagonal Cellular Automata models, has recently been developed for simulating debris flows characterised by strong inertial effects. It was derived by progressively enriching an initial simplified CA model, originally devised for very simple cases of slow-moving flow-type landslides. In S4b, following an empirical strategy, the inertial character of the flowing mass has been translated into CA terms. In the transition function of the model, the distribution of landslide debris among the cells is computed by considering the momentum of the debris moving among the cells of the neighbourhood, privileging the flow direction. By properly setting one of the model's global parameters (the "inertial factor"), the distribution mechanism can be tuned to emphasise inertial effects according to the energy of the flowing mass. Moreover, the high complexity of both the model and of the phenomena to be simulated (e.g. debris flows characterised by severe erosion along their path and by strong inertial effects) suggested employing an automated technique to determine the best set of global parameters. Accordingly, the model was calibrated through Genetic Algorithms on several real cases of study, selected from the landslides triggered in Campania (Southern Italy) in May 1998 and December 1999. The obtained results are satisfying: errors computed by comparing the simulations with the map of the real landslides are smaller than those previously obtained either with earlier releases of the model or without Genetic Algorithms. Nevertheless, the results are still preliminary, as the experiments were run in a sequential computing environment. A more efficient calibration would be possible in a parallel computing environment, since a great number of tests could be performed in reasonable time; moreover, the parameter optimisation could span wider ranges in greater detail.
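
    The GA-based calibration loop described above can be sketched in miniature (this is not SCIDDICA S4b: the 1-D toy automaton, the fitness measure, and all parameters below are invented for illustration): a simulated footprint is compared cell-by-cell against an "observed" map, and a small genetic algorithm searches for the parameter value that maximises the agreement.

```python
import random

random.seed(1)

def simulate(spread, steps=8):
    """Toy 1-D flow automaton: mass spreads outward from the centre cell;
    `spread` in [0, 1] is a stand-in for a global parameter such as the
    inertial factor. Returns the binary footprint of affected cells."""
    cells = [0.0] * 21
    cells[10] = 1.0
    for _ in range(steps):
        nxt = cells[:]
        for i in range(1, 20):
            flow = spread * cells[i] / 2
            nxt[i] -= 2 * flow
            nxt[i - 1] += flow
            nxt[i + 1] += flow
        cells = nxt
    return [c > 0.01 for c in cells]

TARGET = simulate(0.62)          # plays the role of the mapped real landslide

def fitness(spread):
    """Number of cells agreeing with the observed map (to be maximised)."""
    return sum(a == b for a, b in zip(simulate(spread), TARGET))

# tiny genetic algorithm: elitist selection plus Gaussian mutation
pop = [random.random() for _ in range(20)]
for _ in range(30):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:5]
    pop = elite + [min(1.0, max(0.0, p + random.gauss(0, 0.05)))
                   for p in elite for _ in range(3)]
best = max(pop, key=fitness)
print(round(best, 2), fitness(best))
```

    The real calibration works the same way in outline, but each fitness evaluation is a full two-dimensional debris-flow simulation compared against a mapped event, which is why a parallel computing environment matters so much.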

  6. Microfluidics as a new tool in radiation biology

    PubMed Central

    Lacombe, Jerome; Phillips, Shanna Leslie; Zenhausern, Frederic

    2016-01-01

    Ionizing radiations interact with molecules at the cellular and molecular levels leading to several biochemical modifications that may be responsible for biological effects on tissue or whole organisms. The study of these changes is difficult because of the complexity of the biological response(s) to radiations and the lack of reliable models able to mimic the whole molecular phenomenon and different communications between the various cell networks, from the cell activation to the macroscopic effect at the tissue or organismal level. Microfluidics, the science and technology of systems that can handle small amounts of fluids in confined and controlled environment, has been an emerging field for several years. Some microfluidic devices, even at early stages of development, may already help radiobiological research by proposing new approaches to study cellular, tissue and total-body behavior upon irradiation. These devices may also be used in clinical biodosimetry since microfluidic technology is frequently developed for integrating complex bioassay chemistries into automated user-friendly, reproducible and sensitive analyses. In this review, we discuss the use, numerous advantages, and possible future of microfluidic technology in the field of radiobiology. We will also examine the disadvantages and required improvements for microfluidics to be fully practical in radiation research and to become an enabling tool for radiobiologists and radiation oncologists. PMID:26704304

  7. Microfluidics as a new tool in radiation biology.

    PubMed

    Lacombe, Jerome; Phillips, Shanna Leslie; Zenhausern, Frederic

    2016-02-28

    Ionizing radiations interact with molecules at the cellular and molecular levels leading to several biochemical modifications that may be responsible for biological effects on tissue or whole organisms. The study of these changes is difficult because of the complexity of the biological response(s) to radiations and the lack of reliable models able to mimic the whole molecular phenomenon and different communications between the various cell networks, from the cell activation to the macroscopic effect at the tissue or organismal level. Microfluidics, the science and technology of systems that can handle small amounts of fluids in confined and controlled environment, has been an emerging field for several years. Some microfluidic devices, even at early stages of development, may already help radiobiological research by proposing new approaches to study cellular, tissue and total-body behavior upon irradiation. These devices may also be used in clinical biodosimetry since microfluidic technology is frequently developed for integrating complex bioassay chemistries into automated user-friendly, reproducible and sensitive analyses. In this review, we discuss the use, numerous advantages, and possible future of microfluidic technology in the field of radiobiology. We will also examine the disadvantages and required improvements for microfluidics to be fully practical in radiation research and to become an enabling tool for radiobiologists and radiation oncologists. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  8. Local cellular neighborhood controls proliferation in cell competition

    PubMed Central

    Bove, Anna; Gradeci, Daniel; Fujita, Yasuyuki; Banerjee, Shiladitya; Charras, Guillaume; Lowe, Alan R.

    2017-01-01

    Cell competition is a quality-control mechanism through which tissues eliminate unfit cells. Cell competition can result from short-range biochemical inductions or long-range mechanical cues. However, little is known about how cell-scale interactions give rise to population shifts in tissues, due to the lack of experimental and computational tools to efficiently characterize interactions at the single-cell level. Here, we address these challenges by combining long-term automated microscopy with deep-learning image analysis to decipher how single-cell behavior determines tissue makeup during competition. Using our high-throughput analysis pipeline, we show that competitive interactions between MDCK wild-type cells and cells depleted of the polarity protein scribble are governed by differential sensitivity to local density and the cell type of each cell’s neighbors. We find that local density has a dramatic effect on the rate of division and apoptosis under competitive conditions. Strikingly, our analysis reveals that proliferation of the winner cells is up-regulated in neighborhoods mostly populated by loser cells. These data suggest that tissue-scale population shifts are strongly affected by cellular-scale tissue organization. We present a quantitative mathematical model that demonstrates the effect of neighbor cell–type dependence of apoptosis and division in determining the fitness of competing cell lines. PMID:28931601
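
    A minimal sketch of the neighbour-dependent rates the abstract describes (all functional forms and constants below are invented for illustration; this is not the paper's quantitative model): division falls with local density for both cell types, winners divide more when surrounded by losers, and losers are far more sensitive to crowding-induced apoptosis.

```python
def division_prob(cell_type, local_density, loser_neighbour_frac):
    """Per-frame division probability: both cell types divide less when
    crowded; 'winner' cells additionally divide more when their
    neighbourhood is mostly loser cells (constants invented)."""
    base = 0.10 * max(0.0, 1.0 - local_density)
    if cell_type == "winner":
        base *= 1.0 + loser_neighbour_frac      # up-regulation near losers
    return min(base, 1.0)

def apoptosis_prob(cell_type, local_density):
    """Per-frame apoptosis probability: scribble-depleted 'loser' cells are
    far more sensitive to local density than wild-type 'winner' cells."""
    sensitivity = 0.30 if cell_type == "loser" else 0.02
    return min(sensitivity * local_density, 1.0)

# at equal density, a winner surrounded entirely by losers divides twice
# as often as one surrounded by other winners
print(division_prob("winner", 0.5, 1.0), division_prob("winner", 0.5, 0.0))
```

    Iterating rates of this shape over every cell in a tissue is what lets cell-scale neighbourhood rules produce the population-scale takeover the experiments observe.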

  9. Automated procedures for sizing aerospace vehicle structures /SAVES/

    NASA Technical Reports Server (NTRS)

    Giles, G. L.; Blackburn, C. L.; Dixon, S. C.

    1972-01-01

    Results from a continuing effort to develop automated methods for structural design are described. A system of computer programs presently under development called SAVES is intended to automate the preliminary structural design of a complete aerospace vehicle. Each step in the automated design process of the SAVES system of programs is discussed, with emphasis placed on the use of automated routines for generation of finite-element models. The versatility of these routines is demonstrated by structural models generated for a space shuttle orbiter, an advanced technology transport, and a hydrogen-fueled Mach 3 transport. Illustrative numerical results are presented for the Mach 3 transport wing.

  10. Automation based on knowledge modeling theory and its applications in engine diagnostic systems using Space Shuttle Main Engine vibrational data. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Kim, Jonnathan H.

    1995-01-01

    Humans can perform many complicated tasks without explicit rules. This inherent and advantageous capability becomes a hurdle when a task is to be automated. Modern computers and numerical calculations require explicit rules and discrete numerical values. In order to bridge the gap between human knowledge and automation tools, a knowledge model is proposed. Knowledge modeling techniques are discussed and utilized to automate a labor- and time-intensive task of detecting anomalous bearing wear patterns in the Space Shuttle Main Engine (SSME) High Pressure Oxygen Turbopump (HPOTP).

  11. Evaluating Cell Processes, Quality, and Biomarkers in Pluripotent Stem Cells Using Video Bioinformatics

    PubMed Central

    Lin, Sabrina C.; Bays, Brett C.; Omaiye, Esther; Bhanu, Bir; Talbot, Prue

    2016-01-01

    There is a foundational need for quality control tools in stem cell laboratories engaged in basic research, regenerative therapies, and toxicological studies. These tools require automated methods for evaluating cell processes and quality during in vitro passaging, expansion, maintenance, and differentiation. In this paper, an unbiased, automated high-content profiling toolkit, StemCellQC, is presented that non-invasively extracts information on cell quality and cellular processes from time-lapse phase-contrast videos. Twenty four (24) morphological and dynamic features were analyzed in healthy, unhealthy, and dying human embryonic stem cell (hESC) colonies to identify those features that were affected in each group. Multiple features differed in the healthy versus unhealthy/dying groups, and these features were linked to growth, motility, and death. Biomarkers were discovered that predicted cell processes before they were detectable by manual observation. StemCellQC distinguished healthy and unhealthy/dying hESC colonies with 96% accuracy by non-invasively measuring and tracking dynamic and morphological features over 48 hours. Changes in cellular processes can be monitored by StemCellQC and predictions can be made about the quality of pluripotent stem cell colonies. This toolkit reduced the time and resources required to track multiple pluripotent stem cell colonies and eliminated handling errors and false classifications due to human bias. StemCellQC provided both user-specified and classifier-determined analysis in cases where the affected features are not intuitive or anticipated. Video analysis algorithms allowed assessment of biological phenomena using automatic detection analysis, which can aid facilities where maintaining stem cell quality and/or monitoring changes in cellular processes are essential. In the future StemCellQC can be expanded to include other features, cell types, treatments, and differentiating cells. PMID:26848582

  12. Evaluating Cell Processes, Quality, and Biomarkers in Pluripotent Stem Cells Using Video Bioinformatics.

    PubMed

    Zahedi, Atena; On, Vincent; Lin, Sabrina C; Bays, Brett C; Omaiye, Esther; Bhanu, Bir; Talbot, Prue

    2016-01-01

    There is a foundational need for quality control tools in stem cell laboratories engaged in basic research, regenerative therapies, and toxicological studies. These tools require automated methods for evaluating cell processes and quality during in vitro passaging, expansion, maintenance, and differentiation. In this paper, an unbiased, automated high-content profiling toolkit, StemCellQC, is presented that non-invasively extracts information on cell quality and cellular processes from time-lapse phase-contrast videos. Twenty four (24) morphological and dynamic features were analyzed in healthy, unhealthy, and dying human embryonic stem cell (hESC) colonies to identify those features that were affected in each group. Multiple features differed in the healthy versus unhealthy/dying groups, and these features were linked to growth, motility, and death. Biomarkers were discovered that predicted cell processes before they were detectable by manual observation. StemCellQC distinguished healthy and unhealthy/dying hESC colonies with 96% accuracy by non-invasively measuring and tracking dynamic and morphological features over 48 hours. Changes in cellular processes can be monitored by StemCellQC and predictions can be made about the quality of pluripotent stem cell colonies. This toolkit reduced the time and resources required to track multiple pluripotent stem cell colonies and eliminated handling errors and false classifications due to human bias. StemCellQC provided both user-specified and classifier-determined analysis in cases where the affected features are not intuitive or anticipated. Video analysis algorithms allowed assessment of biological phenomena using automatic detection analysis, which can aid facilities where maintaining stem cell quality and/or monitoring changes in cellular processes are essential. In the future StemCellQC can be expanded to include other features, cell types, treatments, and differentiating cells.
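
    Two of the kinds of dynamic features such a toolkit extracts from time-lapse video can be sketched as follows (the feature definitions, thresholds, and toy decision rule below are invented for illustration and are not StemCellQC's 24-feature classifier): per-frame relative area growth and mean centroid displacement of a tracked colony, fed into a simple healthy-versus-unhealthy rule.

```python
import math

def growth_and_motility(areas, centroids):
    """Two illustrative dynamic features from one colony's time-lapse track:
    mean relative area growth per frame, and mean centroid displacement."""
    growth = (areas[-1] - areas[0]) / (areas[0] * (len(areas) - 1))
    steps = [math.dist(a, b) for a, b in zip(centroids, centroids[1:])]
    return growth, sum(steps) / len(steps)

def classify_colony(growth, motility, growth_cut=0.05, motility_cut=8.0):
    """Toy decision rule (thresholds invented): shrinking or hyper-motile
    colonies are flagged as unhealthy/dying."""
    healthy = growth >= growth_cut and motility <= motility_cut
    return "healthy" if healthy else "unhealthy/dying"

areas = [100.0, 120.0, 140.0]                     # colony area per frame (px^2)
centroids = [(0.0, 0.0), (3.0, 4.0), (6.0, 8.0)]  # colony centre per frame
g, m = growth_and_motility(areas, centroids)
print(classify_colony(g, m))
```

    A real classifier would combine many such features and learn the decision boundary from labeled colonies rather than hand-set thresholds.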

  13. Corona sign: manifestation of peripheral corneal epithelial edema as a possible marker of the progression of corneal endothelial dysfunction.

    PubMed

    Inoue, Tomoyuki; Hara, Yuko; Kobayashi, Takeshi; Zheng, Xiaodong; Suzuki, Takashi; Shiraishi, Atsushi; Ohashi, Yuichi

    2016-09-01

    We describe a characteristic form of the corona sign and its clinical relevance to the degree of corneal endothelial decompensation, and investigate the underlying mechanism using a rabbit model. These observational cases include 31 patients undergoing penetrating keratoplasty (PKP) and 15 patients undergoing Descemet stripping automated endothelial keratoplasty (DSAEK), with special attention to circumferentially developed corneal epithelial edema. We also conducted a laboratory observation of horizontal water flow in the rabbit cornea. We consistently observed the corona sign at the superior periphery during the initial stage of corneal endothelial decompensation after PKP. With progressive corneal endothelial cellular loss, the epithelial edema gradually expanded circumferentially in the periphery. The endothelial cellular density associated with the corona sign significantly (P < 0.01) decreased compared with that without the sign. The endothelial cellular density decreased significantly (P < 0.05) in cases with a circumferential corona sign compared with a superior corona sign. After DSAEK, however, the corneal epithelial edema subsided from the center but persisted peripherally as a corona sign in all cases. By 3 months postoperatively, the epithelial edema was confined to the superior periphery along with uneventful corneal endothelial healing. Rabbit experiments showed that total corneal endothelial decompensation decreased the horizontal intracorneal water migration (Inoue-Ohashi phenomenon) in the corneal periphery and induced peripheral corneal edema. The slit-lamp microscopic findings of corona-like epithelial edema in the peripheral cornea are associated with the stage of corneal endothelial function. To support this, the developmental mechanism of the corona sign was demonstrated experimentally.

  14. An Automated Microfluidic Multiplexer for Fast Delivery of C. elegans Populations from Multiwells

    PubMed Central

    Ghorashian, Navid; Gökçe, Sertan Kutal; Guo, Sam Xun; Everett, William Neil; Ben-Yakar, Adela

    2013-01-01

    Automated biosorter platforms, including recently developed microfluidic devices, enable and accelerate high-throughput and/or high-resolution bioassays on small animal models. However, time-consuming delivery of different organism populations to these systems introduces a major bottleneck to executing large-scale screens. Current population delivery strategies rely on suction from conventional well plates through tubing periodically exposed to air, leading to certain disadvantages: 1) bubble introduction to the sample, interfering with analysis in the downstream system, 2) substantial time drain from added bubble-cleaning steps, and 3) the need for complex mechanical systems to manipulate well plate position. To address these concerns, we developed a multiwell-format microfluidic platform that can deliver multiple distinct animal populations from on-chip wells using multiplexed valve control. This Population Delivery Chip could operate autonomously as part of a relatively simple setup that did not require any of the major mechanical moving parts typical of plate-handling systems to address a given well. We demonstrated automatic serial delivery of 16 distinct C. elegans worm populations to a single outlet without introducing any bubbles to the samples, causing cross-contamination, or damaging the animals. The device achieved delivery of more than 90% of the population preloaded into a given well in 4.7 seconds; an order of magnitude faster than delivery modalities in current use. This platform could potentially handle other similarly sized model organisms, such as zebrafish and drosophila larvae or cellular micro-colonies. The device’s architecture and microchannel dimensions allow simple expansion for processing larger numbers of populations. PMID:24069313

  15. Individual Differences in Response to Automation: The Five Factor Model of Personality

    ERIC Educational Resources Information Center

    Szalma, James L.; Taylor, Grant S.

    2011-01-01

    This study examined the relationship of operator personality (Five Factor Model) and characteristics of the task and of adaptive automation (reliability and adaptiveness--whether the automation was well-matched to changes in task demand) to operator performance, workload, stress, and coping. This represents the first investigation of how the Five…

  16. Automated cellular pathology in noninvasive confocal microscopy

    NASA Astrophysics Data System (ADS)

    Ting, Monica; Krueger, James; Gareau, Daniel

    2014-03-01

    A computer algorithm was developed to automatically identify and count melanocytes and keratinocytes in 3D reflectance confocal microscopy (RCM) images of the skin. Computerized pathology increases our understanding of superficial spreading melanoma (SSM) and enables its prevention. The machine-learning step measured cell size in the images through a 2-D Fourier transform and developed an appropriate mask based on the erf() function to model the cells. Implementation involved processing the images to identify cells whose image segments gave the smallest difference when subtracted from the mask. With further simplification of the algorithm, the program may be run directly on RCM images to indicate the presence of keratinocytes in seconds and to quantify keratinocyte size in the en face plane as a function of depth. Using this system, the algorithm can identify irregularities in keratinocyte maturation and differentiation, thereby signaling the possible presence of cancer.
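
    The 2-D Fourier approach to measuring cell size can be illustrated with a minimal sketch: the off-DC peak of the power spectrum gives the characteristic spacing of a cellular pattern. The function name and the synthetic test image are illustrative, not from the paper:

```python
import numpy as np

def dominant_period(img):
    """Estimate the dominant spatial period (in pixels) of a square 2D
    image from the peak of its Fourier power spectrum (DC excluded)."""
    n = img.shape[0]
    spectrum = np.fft.fftshift(np.fft.fft2(img - img.mean()))
    power = np.abs(spectrum) ** 2
    cy, cx = n // 2, n // 2
    y, x = np.indices(power.shape)
    radius = np.hypot(y - cy, x - cx)      # distance from DC, in bins
    peak = np.unravel_index(np.argmax(power), power.shape)
    return n / radius[peak]                # pixels per cycle

# synthetic "cell lattice" with a known 16-pixel spacing
n = 128
y, x = np.indices((n, n))
img = np.sin(2 * np.pi * x / 16) + np.sin(2 * np.pi * y / 16)
period = dominant_period(img)
```

    On real RCM data the spectrum peak is broad rather than a single bin, so a radial average of the power spectrum would be used instead of a raw argmax.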

  17. Combining deep learning and coherent anti-Stokes Raman scattering imaging for automated differential diagnosis of lung cancer

    NASA Astrophysics Data System (ADS)

    Weng, Sheng; Xu, Xiaoyun; Li, Jiasong; Wong, Stephen T. C.

    2017-10-01

    Lung cancer is the most prevalent type of cancer and the leading cause of cancer-related deaths worldwide. Coherent anti-Stokes Raman scattering (CARS) is capable of providing cellular-level images and resolving pathologically related features on human lung tissues. However, conventional means of analyzing CARS images requires extensive image processing, feature engineering, and human intervention. This study demonstrates the feasibility of applying a deep learning algorithm to automatically differentiate normal and cancerous lung tissue images acquired by CARS. We leverage the features learned by pretrained deep neural networks and retrain the model using CARS images as the input. We achieve 89.2% accuracy in classifying normal, small-cell carcinoma, adenocarcinoma, and squamous cell carcinoma lung images. This computational method is a step toward on-the-spot diagnosis of lung cancer and can be further strengthened by the efforts aimed at miniaturizing the CARS technique for fiber-based microendoscopic imaging.

  18. A receptor and neuron that activate a circuit limiting sucrose consumption.

    PubMed

    Joseph, Ryan M; Sun, Jennifer S; Tam, Edric; Carlson, John R

    2017-03-23

    The neural control of sugar consumption is critical for normal metabolism. In contrast to sugar-sensing taste neurons that promote consumption, we identify a taste neuron that limits sucrose consumption in Drosophila. Silencing of the neuron increases sucrose feeding; optogenetic activation decreases it. The feeding inhibition depends on the IR60b receptor, as shown by behavioral analysis and Ca2+ imaging of an IR60b mutant. The IR60b phenotype shows a high degree of chemical specificity when tested with a broad panel of tastants. An automated analysis of feeding behavior in freely moving flies shows that IR60b limits the duration of individual feeding bouts. This receptor and neuron provide the molecular and cellular underpinnings of a new element in the circuit logic of feeding regulation. We propose a dynamic model in which sucrose acts via IR60b to activate a circuit that inhibits feeding and prevents overconsumption.

  19. Simulation for Carbon Nanotube Dispersion and Microstructure Formation in CNTs/AZ91D Composite Fabricated by Ultrasonic Processing

    NASA Astrophysics Data System (ADS)

    Yang, Yuansheng; Zhao, Fuze; Feng, Xiaohui

    2017-10-01

    The dispersion of carbon nanotubes (CNTs) in AZ91D melt by ultrasonic processing and the microstructure formation of the CNTs/AZ91D composite were studied using numerical and physical simulations. The sound field and acoustic streaming were predicted using the finite element method, and the optimal immersion depth of the ultrasonic probe and a suitable ultrasonic power were obtained. A single-bubble model was used to predict ultrasonic cavitation in the AZ91D melt, and the relationship between sound pressure amplitude and ultrasonic cavitation was established. Physical simulations of acoustic streaming and ultrasonic cavitation agreed well with the numerical simulations, confirming that the dispersion of carbon nanotubes was remarkably improved by ultrasonic processing. Microstructure formation of the CNTs/AZ91D composite was numerically simulated using the cellular automaton method. In addition, grain refinement was achieved and the growth of dendrites was changed due to the uniform dispersion of CNTs.
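
    The abstract does not specify the cellular automaton scheme; a minimal grain-growth sketch in the same spirit (random seed nuclei capture neighbouring liquid cells each step; all names and parameters are illustrative) looks like this:

```python
import numpy as np

rng = np.random.default_rng(0)

def grow_grains(n=64, n_seeds=10, steps=200):
    """Minimal 2D cellular-automaton grain growth: seed nuclei are
    dropped into a liquid (value 0) grid, and at each step every
    liquid cell is captured by a solidified von Neumann neighbour."""
    grid = np.zeros((n, n), dtype=int)
    ys = rng.integers(0, n, n_seeds)
    xs = rng.integers(0, n, n_seeds)
    grid[ys, xs] = np.arange(1, n_seeds + 1)   # grain IDs 1..n_seeds
    for _ in range(steps):
        new = grid.copy()
        for shift in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            neighbour = np.roll(grid, shift, axis=(0, 1))
            capture = (new == 0) & (neighbour != 0)
            new[capture] = neighbour[capture]
        grid = new
        if grid.min() > 0:          # fully solidified
            break
    return grid

grains = grow_grains()
```

    In a real solidification CA the capture rule would depend on local undercooling and crystallographic orientation; here it is purely geometric, which already shows how seed density (improved by CNT dispersion) controls final grain size.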

  20. Analysis and numerical simulation research of the heating process in the oven

    NASA Astrophysics Data System (ADS)

    Chen, Yawei; Lei, Dingyou

    2016-10-01

    How to use an oven to bake delicious food is a central concern for both oven designers and users. To this end, this paper analyzes the heat distribution in the oven based on its basic operating principles and simulates the temperature distribution over the rack cross-section. A differential equation model of the evolving temperature distribution in the pan during operation is constructed from heat radiation and heat conduction, and, following the idea of using a cellular automaton to simulate the heat transfer process, ANSYS software is used to numerically simulate rectangular, round-cornered rectangular, elliptical, and circular pans and to obtain the instantaneous temperature distribution for each pan shape. The temperature distributions of the rectangular and circular pans show that food overcooks easily at the corners and edges of rectangular pans but not in a round pan.
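
    The corner-overcooking effect can be reproduced with a minimal explicit finite-difference sketch of heat conduction on a rectangular pan (rim held at oven-air temperature; all parameters are illustrative, not from the paper): a cell near a corner receives heat from two edges and warms fastest.

```python
import numpy as np

def bake(n=21, steps=500, alpha=0.2, t_oven=200.0, t_food=25.0):
    """Explicit finite-difference heat conduction on a square pan:
    the rim is held at oven temperature, the interior starts cool.
    alpha <= 0.25 keeps the explicit update stable."""
    T = np.full((n, n), t_food)
    T[0, :] = T[-1, :] = T[:, 0] = T[:, -1] = t_oven
    for _ in range(steps):
        lap = (T[:-2, 1:-1] + T[2:, 1:-1] + T[1:-1, :-2] + T[1:-1, 2:]
               - 4.0 * T[1:-1, 1:-1])
        T[1:-1, 1:-1] += alpha * lap   # boundary rows/cols stay fixed
    return T

T = bake()
# the cell nearest a corner heats much faster than the pan centre
corner, centre = T[1, 1], T[10, 10]
```

    A round pan has no such doubly-heated cells, which is the paper's explanation for its more even baking.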

  1. Automated sub-5 nm image registration in integrated correlative fluorescence and electron microscopy using cathodoluminescence pointers

    NASA Astrophysics Data System (ADS)

    Haring, Martijn T.; Liv, Nalan; Zonnevylle, A. Christiaan; Narvaez, Angela C.; Voortman, Lenard M.; Kruit, Pieter; Hoogenboom, Jacob P.

    2017-03-01

    In the biological sciences, data from fluorescence and electron microscopy is correlated to allow fluorescence biomolecule identification within the cellular ultrastructure and/or ultrastructural analysis following live-cell imaging. High-accuracy (sub-100 nm) image overlay requires the addition of fiducial markers, which makes overlay accuracy dependent on the number of fiducials present in the region of interest. Here, we report an automated method for light-electron image overlay at high accuracy, i.e. below 5 nm. Our method relies on direct visualization of the electron beam position in the fluorescence detection channel using cathodoluminescence pointers. We show that image overlay using cathodoluminescence pointers corrects for image distortions, is independent of user interpretation, and does not require fiducials, allowing image correlation with molecular precision anywhere on a sample.

  2. Automated sub-5 nm image registration in integrated correlative fluorescence and electron microscopy using cathodoluminescence pointers.

    PubMed

    Haring, Martijn T; Liv, Nalan; Zonnevylle, A Christiaan; Narvaez, Angela C; Voortman, Lenard M; Kruit, Pieter; Hoogenboom, Jacob P

    2017-03-02

    In the biological sciences, data from fluorescence and electron microscopy is correlated to allow fluorescence biomolecule identification within the cellular ultrastructure and/or ultrastructural analysis following live-cell imaging. High-accuracy (sub-100 nm) image overlay requires the addition of fiducial markers, which makes overlay accuracy dependent on the number of fiducials present in the region of interest. Here, we report an automated method for light-electron image overlay at high accuracy, i.e. below 5 nm. Our method relies on direct visualization of the electron beam position in the fluorescence detection channel using cathodoluminescence pointers. We show that image overlay using cathodoluminescence pointers corrects for image distortions, is independent of user interpretation, and does not require fiducials, allowing image correlation with molecular precision anywhere on a sample.

  3. Automated sub-5 nm image registration in integrated correlative fluorescence and electron microscopy using cathodoluminescence pointers

    PubMed Central

    Haring, Martijn T.; Liv, Nalan; Zonnevylle, A. Christiaan; Narvaez, Angela C.; Voortman, Lenard M.; Kruit, Pieter; Hoogenboom, Jacob P.

    2017-01-01

    In the biological sciences, data from fluorescence and electron microscopy is correlated to allow fluorescence biomolecule identification within the cellular ultrastructure and/or ultrastructural analysis following live-cell imaging. High-accuracy (sub-100 nm) image overlay requires the addition of fiducial markers, which makes overlay accuracy dependent on the number of fiducials present in the region of interest. Here, we report an automated method for light-electron image overlay at high accuracy, i.e. below 5 nm. Our method relies on direct visualization of the electron beam position in the fluorescence detection channel using cathodoluminescence pointers. We show that image overlay using cathodoluminescence pointers corrects for image distortions, is independent of user interpretation, and does not require fiducials, allowing image correlation with molecular precision anywhere on a sample. PMID:28252673

  4. Modeling strategic behavior in human-automation interaction - Why an 'aid' can (and should) go unused

    NASA Technical Reports Server (NTRS)

    Kirlik, Alex

    1993-01-01

    Task-offload aids (e.g., an autopilot, an 'intelligent' assistant) can be selectively engaged by the human operator to dynamically delegate tasks to automation. Introducing such aids eliminates some task demands but creates new ones associated with programming, engaging, and disengaging the aiding device via an interface. The burdens associated with managing automation can sometimes outweigh the potential benefits of automation to improved system performance. Aid design parameters and features of the overall multitask context combine to determine whether or not a task-offload aid will effectively support the operator. A modeling and sensitivity analysis approach is presented that identifies effective strategies for human-automation interaction as a function of three task-context parameters and three aid design parameters. The analysis and modeling approaches provide resources for predicting how a well-adapted operator will use a given task-offload aid, and for specifying aid design features that ensure that automation will provide effective operator support in a multitask environment.

  5. Automated finite element meshing of the lumbar spine: Verification and validation with 18 specimen-specific models.

    PubMed

    Campbell, J Q; Coombs, D J; Rao, M; Rullkoetter, P J; Petrella, A J

    2016-09-06

    The purpose of this study was to seek broad verification and validation of human lumbar spine finite element models created using a previously published automated algorithm. The automated algorithm takes segmented CT scans of lumbar vertebrae, automatically identifies important landmarks and contact surfaces, and creates a finite element model. Mesh convergence was evaluated by examining changes in key output variables in response to mesh density. Semi-direct validation was performed by comparing experimental results for a single specimen to the automated finite element model results for that specimen with calibrated material properties from a prior study. Indirect validation was based on a comparison of results from automated finite element models of 18 individual specimens, all using one set of generalized material properties, to a range of data from the literature. A total of 216 simulations were run and compared to 186 experimental data ranges in all six primary bending modes up to 7.8 Nm with follower loads up to 1000 N. Mesh convergence results showed less than a 5% difference in key variables when the original mesh density was doubled. The semi-direct validation results showed that the automated method produced results comparable to manual finite element modeling methods. The indirect validation results showed a wide range of outcomes due to variations in the geometry alone. The studies showed that the automated models can be used to reliably evaluate lumbar spine biomechanics, specifically within our intended context of use: in pure bending modes, under relatively low non-injurious simulated in vivo loads, to predict torque rotation response, disc pressures, and facet forces. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. A Human-Automation Interface Model to Guide Automation of System Functions: A Way to Achieve Manning Goals in New Systems

    DTIC Science & Technology

    2006-06-01

    Levels of automation were applied as per Figure 13. ... The simulation models generated for this thesis were set to run for 60 minutes. To run the simulation for the set time, the analyst provides a random number seed ... The IMPRINT workload value of 60 has been used by a consensus of workload modeling SMEs to represent the ‘high’ threshold, while the ...

  7. [The study of medical supplies automation replenishment algorithm in hospital on medical supplies supplying chain].

    PubMed

    Sheng, Xi

    2012-07-01

    This thesis studies an automated replenishment algorithm for medical supplies in the hospital segment of the medical supply chain. A mathematical model and algorithm for automated replenishment of medical supplies are designed with reference to practical hospital data, on the basis of inventory theory, a greedy algorithm, and a partition algorithm. The algorithm is shown to compute medical supply distribution amounts automatically and to optimize the distribution scheme. It is concluded that applying this inventory-theoretic model and algorithm to medical supply circulation could provide theoretical and technological support for automated replenishment of medical supplies in hospitals across the supply chain.
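
    The abstract does not specify the replenishment rule; a classic (s, S) inventory policy is a plausible minimal stand-in (item names and levels are hypothetical): order an item up to level S whenever its stock falls to the reorder point s.

```python
def replenish(stock, reorder_point, order_up_to):
    """(s, S) replenishment sketch: when an item's stock falls to its
    reorder point s, order enough to bring it up to level S."""
    orders = {}
    for item, qty in stock.items():
        if qty <= reorder_point[item]:
            orders[item] = order_up_to[item] - qty
    return orders

stock = {'gauze': 2, 'syringe': 9}
orders = replenish(stock,
                   reorder_point={'gauze': 5, 'syringe': 5},
                   order_up_to={'gauze': 20, 'syringe': 20})
```

    A production system would set s and S per item from demand forecasts and lead times, which is where the inventory theory cited in the abstract enters.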

  8. Trust in automation: integrating empirical evidence on factors that influence trust.

    PubMed

    Hoff, Kevin Anthony; Bashir, Masooda

    2015-05-01

    We systematically review recent empirical research on factors that influence trust in automation to present a three-layered trust model that synthesizes existing knowledge. Much of the existing research on factors that guide human-automation interaction is centered around trust, a variable that often determines the willingness of human operators to rely on automation. Studies have utilized a variety of different automated systems in diverse experimental paradigms to identify factors that impact operators' trust. We performed a systematic review of empirical research on trust in automation from January 2002 to June 2013. Papers were deemed eligible only if they reported the results of a human-subjects experiment in which humans interacted with an automated system in order to achieve a goal. Additionally, a relationship between trust (or a trust-related behavior) and another variable had to be measured. All together, 101 total papers, containing 127 eligible studies, were included in the review. Our analysis revealed three layers of variability in human-automation trust (dispositional trust, situational trust, and learned trust), which we organize into a model. We propose design recommendations for creating trustworthy automation and identify environmental conditions that can affect the strength of the relationship between trust and reliance. Future research directions are also discussed for each layer of trust. Our three-layered trust model provides a new lens for conceptualizing the variability of trust in automation. Its structure can be applied to help guide future research and develop training interventions and design procedures that encourage appropriate trust. © 2014, Human Factors and Ergonomics Society.

  9. Parmodel: a web server for automated comparative modeling of proteins.

    PubMed

    Uchôa, Hugo Brandão; Jorge, Guilherme Eberhart; Freitas Da Silveira, Nelson José; Camera, João Carlos; Canduri, Fernanda; De Azevedo, Walter Filgueira

    2004-12-24

    Parmodel is a web server for automated comparative modeling and evaluation of protein structures. The aim of this tool is to help inexperienced users perform modeling, assessment, visualization, and optimization of protein models, as well as to help crystallographers evaluate experimentally solved structures. It is subdivided into four modules: Parmodel Modeling, Parmodel Assessment, Parmodel Visualization, and Parmodel Optimization. The main module is Parmodel Modeling, which allows several models of the same protein to be built in a reduced time by distributing the modeling processes over a Beowulf cluster. Parmodel automates and integrates the main software used in comparative modeling, such as MODELLER, Whatcheck, Procheck, Raster3D, Molscript, and Gromacs. This web server is freely accessible at .

  10. Structuring and extracting knowledge for the support of hypothesis generation in molecular biology

    PubMed Central

    Roos, Marco; Marshall, M Scott; Gibson, Andrew P; Schuemie, Martijn; Meij, Edgar; Katrenko, Sophia; van Hage, Willem Robert; Krommydas, Konstantinos; Adriaans, Pieter W

    2009-01-01

    Background Hypothesis generation in molecular and cellular biology is an empirical process in which knowledge derived from prior experiments is distilled into a comprehensible model. The requirement of automated support is exemplified by the difficulty of considering all relevant facts that are contained in the millions of documents available from PubMed. Semantic Web provides tools for sharing prior knowledge, while information retrieval and information extraction techniques enable its extraction from literature. Their combination makes prior knowledge available for computational analysis and inference. While some tools provide complete solutions that limit the control over the modeling and extraction processes, we seek a methodology that supports control by the experimenter over these critical processes. Results We describe progress towards automated support for the generation of biomolecular hypotheses. Semantic Web technologies are used to structure and store knowledge, while a workflow extracts knowledge from text. We designed minimal proto-ontologies in OWL for capturing different aspects of a text mining experiment: the biological hypothesis, text and documents, text mining, and workflow provenance. The models fit a methodology that allows focus on the requirements of a single experiment while supporting reuse and posterior analysis of extracted knowledge from multiple experiments. Our workflow is composed of services from the 'Adaptive Information Disclosure Application' (AIDA) toolkit as well as a few others. The output is a semantic model with putative biological relations, with each relation linked to the corresponding evidence. Conclusion We demonstrated a 'do-it-yourself' approach for structuring and extracting knowledge in the context of experimental research on biomolecular mechanisms. The methodology can be used to bootstrap the construction of semantically rich biological models using the results of knowledge extraction processes. 
Models specific to particular experiments can be constructed that, in turn, link with other semantic models, creating a web of knowledge that spans experiments. Mapping mechanisms can link to other knowledge resources such as OBO ontologies or SKOS vocabularies. AIDA Web Services can be used to design personalized knowledge extraction procedures. In our example experiment, we found three proteins (NF-Kappa B, p21, and Bax) potentially playing a role in the interplay between nutrients and epigenetic gene regulation. PMID:19796406

  11. Attributed relational graphs for cell nucleus segmentation in fluorescence microscopy images.

    PubMed

    Arslan, Salim; Ersahin, Tulin; Cetin-Atalay, Rengul; Gunduz-Demir, Cigdem

    2013-06-01

    More rapid and accurate high-throughput screening in molecular cellular biology research has become possible with the development of automated microscopy imaging, for which cell nucleus segmentation commonly constitutes the core step. Although several promising methods exist for segmenting the nuclei of monolayer isolated and less-confluent cells, it still remains an open problem to segment the nuclei of more-confluent cells, which tend to grow in overlayers. To address this problem, we propose a new model-based nucleus segmentation algorithm. This algorithm models how a human locates a nucleus by identifying the nucleus boundaries and piecing them together. In this algorithm, we define four types of primitives to represent nucleus boundaries at different orientations and construct an attributed relational graph on the primitives to represent their spatial relations. Then, we reduce the nucleus identification problem to finding predefined structural patterns in the constructed graph and also use the primitives in region growing to delineate the nucleus borders. Working with fluorescence microscopy images, our experiments demonstrate that the proposed algorithm identifies nuclei better than previous nucleus segmentation algorithms.

  12. Effect of Aspiration and Mean Gain on the Emergence of Cooperation in Unidirectional Pedestrian Flow

    NASA Astrophysics Data System (ADS)

    Wang, Zi-Yang; Ma, Jian; Zhao, Hui; Qin, Yong; Zhu, Wei; Jia, Li-Min

    2013-03-01

    When more than one pedestrian wants to move to the same site, a conflict appears and the pedestrians involved play a motion game. To describe the emergence of cooperation during conflict resolution, an evolutionary cellular automaton model is established that accounts for the effects of aspiration and mean gain. In each game, a pedestrian may be a gentle cooperator or an aggressive defector. We propose a set of win-stay-lose-shift (WSLS)-like rules for updating a pedestrian's strategy: if the mean gain of the current strategy over a given number of steps exceeds the aspiration level, the strategy is kept; otherwise it is changed. The simulation results show that a high aspiration level leads to more cooperation. As the statistic length increases, pedestrians become more rational in decision making. It is also found that when the aspiration level is small enough and the statistic length is large enough, all pedestrians turn into defectors; we explain this with the prisoner's dilemma model. Finally, we discuss the effect of aspiration on the fundamental diagram.
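
    The WSLS-like update rule described in the abstract can be sketched directly (the strategy labels and the tie-handling choice are assumptions):

```python
def wsls_update(strategy, recent_gains, aspiration):
    """Win-stay-lose-shift update: keep the current strategy
    ('C' cooperator / 'D' defector) if the mean gain over the recent
    steps meets the aspiration level, otherwise switch."""
    mean_gain = sum(recent_gains) / len(recent_gains)
    if mean_gain >= aspiration:
        return strategy                      # win: stay
    return 'D' if strategy == 'C' else 'C'   # lose: shift
```

    The length of recent_gains plays the role of the abstract's "statistic length": longer histories smooth out single-game noise, which is why the paper finds longer histories make pedestrians more rational.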

  13. A linear programming approach to reconstructing subcellular structures from confocal images for automated generation of representative 3D cellular models.

    PubMed

    Wood, Scott T; Dean, Brian C; Dean, Delphine

    2013-04-01

    This paper presents a novel computer vision algorithm to analyze 3D stacks of confocal images of fluorescently stained single cells. The goal of the algorithm is to create representative in silico model structures that can be imported into finite element analysis software for mechanical characterization. Segmentation of cell and nucleus boundaries is accomplished via standard thresholding methods. Using novel linear programming methods, a representative actin stress fiber network is generated by computing a linear superposition of fibers having minimum discrepancy compared with an experimental 3D confocal image. Qualitative validation is performed through analysis of seven 3D confocal image stacks of adherent vascular smooth muscle cells (VSMCs) grown in 2D culture. The presented method is able to automatically generate 3D geometries of the cell's boundary, nucleus, and representative F-actin network based on standard cell microscopy data. These geometries can be used for direct importation and implementation in structural finite element models for analysis of the mechanics of a single cell to potentially speed discoveries in the fields of regenerative medicine, mechanobiology, and drug discovery. Copyright © 2012 Elsevier B.V. All rights reserved.

  14. Automation in photogrammetry: Recent developments and applications (1972-1976)

    USGS Publications Warehouse

    Thompson, M.M.; Mikhail, E.M.

    1976-01-01

    An overview of recent developments in the automation of photogrammetry in various countries is presented. Conclusions regarding automated photogrammetry reached at the 1972 Congress in Ottawa are reviewed first as a background for examining the developments of 1972-1976. Applications are described for each country reporting significant developments. Among fifteen conclusions listed are statements concerning: the widespread practice of equipping existing stereoplotters with simple digitizers; the growing tendency to use minicomputers on-line with stereoplotters; the optimization of production of digital terrain models by progressive sampling in stereomodels; the potential of digitization of a photogrammetric model by density correlation on epipolar lines; the capabilities and economic aspects of advanced systems which permit simultaneous production of orthophotos, contours, and digital terrain models; the economy of off-line orthophoto systems; applications of digital image processing; automation by optical techniques; applications of sensors other than photographic imagery, and the role of photogrammetric phases in a completely automated cartographic system. © 1976.

  15. An Overview of the Automated Dispatch Controller Algorithms in the System Advisor Model (SAM)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DiOrio, Nicholas A

    2017-11-22

    Three automatic dispatch modes have been added to the battery model within the System Advisor Model. These controllers have been developed to perform peak shaving in an automated fashion, providing users with a way to see the benefit of reduced demand charges without manually programming a complicated dispatch control. A flexible input option allows more advanced interaction with the automated controller. This document describes the algorithms in detail and presents brief results on their use and limitations.
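
    The peak-shaving behaviour can be illustrated with a greedy dispatch sketch (not SAM's actual controller; the threshold and battery limits are illustrative): discharge the battery whenever load exceeds a demand target, recharge when it is below.

```python
def peak_shave(load, threshold, capacity, max_power):
    """Greedy peak-shaving dispatch: discharge when load exceeds the
    demand target, recharge (up to the same target) when it is below."""
    soc = capacity            # state of charge; battery starts full
    net = []
    for demand in load:
        if demand > threshold:
            discharge = min(demand - threshold, max_power, soc)
            soc -= discharge
            net.append(demand - discharge)
        else:
            charge = min(threshold - demand, max_power, capacity - soc)
            soc += charge
            net.append(demand + charge)
    return net

load = [3, 4, 9, 10, 8, 3, 2]
shaved = peak_shave(load, threshold=6, capacity=6, max_power=4)
```

    Note that once the battery empties mid-peak, the remaining peak passes through unshaved; avoiding that requires the kind of look-ahead dispatch logic the SAM controllers automate.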

  16. Impact of Resolution on Simulation of Closed Mesoscale Cellular Convection Identified by Dynamically Guided Watershed Segmentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martini, Matus N.; Gustafson, William I.; Yang, Qing

    2014-11-18

    Organized mesoscale cellular convection (MCC) is a common feature of marine stratocumulus that forms in response to a balance between mesoscale dynamics and smaller scale processes such as cloud radiative cooling and microphysics. We use the Weather Research and Forecasting model with chemistry (WRF-Chem) and fully coupled cloud-aerosol interactions to simulate marine low clouds during the VOCALS-REx campaign over the southeast Pacific. A suite of experiments with 3- and 9-km grid spacing indicates resolution-dependent behavior. The simulations with finer grid spacing have smaller liquid water paths and cloud fractions, while cloud tops are higher. The observed diurnal cycle is reasonably well simulated. To isolate organized MCC characteristics we develop a new automated method, which uses a variation of the watershed segmentation technique that combines the detection of cloud boundaries with a test for coincident vertical velocity characteristics. This ensures that the detected cloud fields are dynamically consistent for closed MCC, the most common MCC type over the VOCALS-REx region. We demonstrate that the 3-km simulation is able to reproduce the scaling between horizontal cell size and boundary layer height seen in satellite observations. However, the 9-km simulation is unable to resolve smaller circulations corresponding to shallower boundary layers, instead producing invariant MCC horizontal scale for all simulated boundary layer depths. The results imply that climate models with grid spacing of roughly 3 km or smaller may be needed to properly simulate the MCC structure in the marine stratocumulus regions.

  17. First Steps to Automated Interior Reconstruction from Semantically Enriched Point Clouds and Imagery

    NASA Astrophysics Data System (ADS)

    Obrock, L. S.; Gülch, E.

    2018-05-01

    The automated generation of a BIM model from sensor data is a huge challenge for the modeling of existing buildings. Currently the measurements and analyses are time consuming, allow little automation, and require expensive equipment, and automated acquisition of semantic information about objects in a building is lacking. We present first results of our approach, based on imagery and derived products, aiming at more automated modeling of interiors for a BIM building model. We examine the building parts and objects visible in the collected images using deep learning methods based on convolutional neural networks. For localization and classification of building parts we apply the FCN8s model for pixel-wise semantic segmentation, so far reaching a pixel accuracy of 77.2 % and a mean intersection over union of 44.2 %. We then use the network for further reasoning on the images of the interior room: we combine the segmented images with the original images and use photogrammetric methods to produce a three-dimensional point cloud, coding the extracted object types as colours of the 3D points. We are thus able to uniquely classify the points in three-dimensional space. We also preliminarily investigate a simple extraction method for the colour and material of building parts. It is shown that the combined images are well suited to extracting further semantic information for the BIM model. With the presented methods we see a sound basis for further automation of the acquisition and modeling of semantic and geometric information of interior rooms for a BIM model.
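
    The mean intersection over union quoted above is a standard segmentation metric; for reference, a minimal implementation (averaging over classes present in either the prediction or the ground truth) is:

```python
import numpy as np

def mean_iou(pred, truth, n_classes):
    """Mean intersection over union, averaged over every class that
    appears in either the prediction or the ground truth."""
    ious = []
    for c in range(n_classes):
        p, t = pred == c, truth == c
        union = np.logical_or(p, t).sum()
        if union == 0:
            continue                 # class absent from both
        ious.append(np.logical_and(p, t).sum() / union)
    return sum(ious) / len(ious)

pred = np.array([0, 0, 1, 1])
truth = np.array([0, 1, 1, 1])
score = mean_iou(pred, truth, n_classes=2)
```

    Mean IoU penalizes confusion between classes much harder than pixel accuracy does, which is why the paper's 44.2 % mean IoU sits well below its 77.2 % pixel accuracy.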

  18. Modeling the Energy Use of a Connected and Automated Transportation System (Poster)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gonder, J.; Brown, A.

    Early research points to large potential impacts of connected and automated vehicles (CAVs) on transportation energy use - dramatic savings, increased use, or anything in between. Due to a lack of suitable data and integrated modeling tools to explore these complex future systems, analyses to date have relied on simple combinations of isolated effects. This poster proposes a framework for modeling the potential energy implications from increasing penetration of CAV technologies and for assessing technology and policy options to steer them toward favorable energy outcomes. Current CAV modeling challenges include estimating behavior change, understanding potential vehicle-to-vehicle interactions, and assessing traffic flow and vehicle use under different automation scenarios. To bridge these gaps and develop a picture of potential future automated systems, NREL is integrating existing modeling capabilities with additional tools and data inputs to create a more fully integrated CAV assessment toolkit.

  19. Automation of a DXA-based finite element tool for clinical assessment of hip fracture risk.

    PubMed

    Luo, Yunhua; Ahmed, Sharif; Leslie, William D

    2018-03-01

    Finite element analysis of medical images is a promising tool for assessing hip fracture risk. Although a number of finite element models have been developed for this purpose, none of them is routinely used in the clinic. The main reason is that the computer programs implementing these models have not been completely automated, and extensive training is required before clinicians can use them effectively. By using information embedded in clinical dual energy X-ray absorptiometry (DXA), we completely automated a DXA-based finite element (FE) model that we previously developed for predicting hip fracture risk. The automated FE tool runs as a standalone computer program with the subject's raw hip DXA image as input, and it greatly improved short-term precision compared with the semi-automated version. To validate the automated FE tool, a clinical cohort consisting of 100 prior hip fracture cases and 300 matched controls was obtained from a local community clinical center. Both the automated FE tool and femoral bone mineral density (BMD) were applied to discriminate the fracture cases from the controls; femoral BMD is the gold-standard reference recommended by the World Health Organization for screening for osteoporosis and assessing hip fracture risk. Accuracy was measured by the area under the ROC curve (AUC) and the odds ratio (OR). Compared with femoral BMD (AUC = 0.71, OR = 2.07), the automated FE tool had considerably improved accuracy (AUC = 0.78, OR = 2.61 at the trochanter). This work is a large step toward applying our DXA-based FE model as a routine clinical tool for the assessment of hip fracture risk. Furthermore, the automated computer program can be embedded into a website as an internet application. Copyright © 2017 Elsevier B.V. All rights reserved.
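    The AUC quoted above has a simple rank interpretation that can be computed directly: the probability that a randomly chosen fracture case scores higher than a randomly chosen control. The scores below are invented, purely to illustrate the metric.

```python
# Illustrative only: AUC as the probability that a randomly chosen case
# scores higher than a randomly chosen control (ties count half).
def auc(case_scores, control_scores):
    wins = 0.0
    for c in case_scores:
        for k in control_scores:
            if c > k:
                wins += 1.0
            elif c == k:
                wins += 0.5
    return wins / (len(case_scores) * len(control_scores))

cases = [0.9, 0.8, 0.6, 0.4]       # invented risk scores for fracture cases
controls = [0.7, 0.5, 0.3, 0.2]    # invented risk scores for controls
area = auc(cases, controls)        # 13 of 16 case/control pairs ordered correctly
```

For large cohorts one would use a rank-based formulation rather than this quadratic loop, but the value is the same.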

  20. Adaptive Automation Design and Implementation

    DTIC Science & Technology

    2015-09-17

    Fragmentary excerpt (abstract and front matter): a case study, Space Navigator, demonstrates the player modeling paradigm for a human-machine system, focusing on the response-generation stage; the report covers a clustering-based, real-time player modeling framework for imitating a specific person's task performance and the Adaptive Automation System built upon it.

  1. Advancing haemostasis automation--successful implementation of robotic centrifugation and sample processing in a tertiary service hospital.

    PubMed

    Sédille-Mostafaie, Nazanin; Engler, Hanna; Lutz, Susanne; Korte, Wolfgang

    2013-06-01

    Laboratories today face increasing pressure to automate operations due to increasing workloads and the need to reduce expenditure. Few studies to date have focussed on the laboratory automation of preanalytical coagulation specimen processing. In the present study, we examined whether a clinical chemistry automation protocol meets the preanalytical requirements for coagulation analyses. During the implementation of laboratory automation, we began to operate a pre- and postanalytical automation system. The preanalytical unit processes blood specimens for chemistry, immunology and coagulation by automated specimen processing. As the production of platelet-poor plasma is highly dependent on optimal centrifugation, we examined specimen handling under different centrifugation conditions in order to produce optimal platelet-poor plasma specimens. To this end, manually processed specimens centrifuged at 1500 g for 5 and 20 min were compared to an automated centrifugation model at 3000 g for 7 min. For analytical assays that are performed frequently enough to be targets for full automation, Passing-Bablok regression analysis showed close agreement between the different centrifugation methods, with a correlation coefficient between 0.98 and 0.99 and a bias between -5% and +6%. For seldom performed assays that do not mandate full automation, the Passing-Bablok regression analysis showed acceptable to poor agreement between the centrifugation methods. A full automation solution is suitable and can be recommended for frequent haemostasis testing.
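    The method comparison can be sketched in simplified form: a Passing-Bablok-style slope from the median of pairwise slopes and an intercept from the median residual. The full procedure additionally corrects for negative slopes and derives confidence intervals; the data below are invented.

```python
import statistics

# Simplified sketch of a Passing-Bablok-style comparison (no slope-sign
# correction, no confidence intervals); measurement pairs are invented.
def passing_bablok(x, y):
    slopes = []
    n = len(x)
    for i in range(n):
        for j in range(i + 1, n):
            if x[j] != x[i]:
                slopes.append((y[j] - y[i]) / (x[j] - x[i]))
    b = statistics.median(slopes)                       # slope estimate
    a = statistics.median(yi - b * xi for xi, yi in zip(x, y))  # intercept
    return b, a

manual = [10.1, 11.9, 14.2, 16.0, 18.1]      # e.g. results after 1500 g spin
automated = [10.0, 12.0, 14.0, 16.1, 18.0]   # e.g. results after 3000 g spin
slope, intercept = passing_bablok(manual, automated)
```

A slope near 1 and an intercept near 0 indicate that the two centrifugation protocols are interchangeable for that assay.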

  2. Flexible End2End Workflow Automation of Hit-Discovery Research.

    PubMed

    Holzmüller-Laue, Silke; Göde, Bernd; Thurow, Kerstin

    2014-08-01

    The article considers a new approach to more complex laboratory automation at the workflow layer. The authors propose the automation of end2end workflows: the combination of all relevant subprocesses - whether automated or manually performed, and regardless of the organizational unit in which they run - results in end2end processes that include all result dependencies. The end2end approach covers not only the classical experiments in synthesis or screening, but also auxiliary processes such as the production and storage of chemicals, cell culturing, and maintenance, as well as preparatory activities and analyses of experiments. Furthermore, connecting control flow and data flow in the same process model reduces the effort of data transfer between the involved systems, including the necessary data transformations. This end2end laboratory automation can be realized effectively with modern methods of business process management (BPM), building on the recent standardization of the process-modeling notation Business Process Model and Notation (BPMN) 2.0. In drug discovery, several scientific disciplines act together, with manifold modern methods, technologies, and a wide range of automated instruments, in the discovery and design of target-based drugs. The article discusses the novel BPM-based automation concept with an implemented example of a high-throughput screening of previously synthesized compound libraries. © 2014 Society for Laboratory Automation and Screening.

  3. The Use of AMET & Automated Scripts for Model Evaluation

    EPA Science Inventory

    Brief overview of EPA’s new CMAQ website to be launched publicly in June 2017. Details on the upcoming release of the Atmospheric Model Evaluation Tool (AMET) and the creation of automated scripts for post-processing and evaluating air quality model data.

  4. Asynchronous adaptive time step in quantitative cellular automata modeling

    PubMed Central

    Zhu, Hao; Pang, Peter YH; Sun, Yan; Dhar, Pawan

    2004-01-01

    Background The behaviors of cells in metazoans are context dependent, so large-scale multi-cellular modeling is often necessary, and cellular automata are natural candidates for it. Two related issues arise in cellular automata based multi-cellular modeling: how to introduce differential equation based quantitative computing to describe cellular activity precisely, and, building on that, how to curb the heavy time consumption of simulation. Results Based on a language-based cellular automata system that we modified and extended to allow ordinary differential equations in models, we introduce a method implementing asynchronous, adaptive time steps in simulation that considerably improves efficiency without a significant sacrifice of accuracy. An average speedup of 4–5 is achieved in the given example. Conclusions Strategies for reducing simulation time are indispensable for large-scale, quantitative multi-cellular models, because even a small 100 × 100 × 100 tissue slab contains one million cells. Distributed and adaptive time steps are a practical solution in a cellular automata environment. PMID:15222901
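    The asynchronous, per-cell adaptive time step can be illustrated with a minimal sketch. This is not the paper's system: each "cell" here integrates a single invented decay ODE with explicit Euler, and the local step is controlled by comparing one full step against two half steps.

```python
# Minimal sketch of asynchronous adaptive stepping: every cell integrates
# its own ODE dy/dt = -k*y with a locally adapted step, so a slow cell
# takes fewer, larger steps than a fast one. Rates k are illustrative.
def integrate_cell(y0, k, t_end, dt0=0.5, tol=1e-3):
    t, y, dt, steps = 0.0, y0, dt0, 0
    while t < t_end - 1e-12:
        dt = min(dt, t_end - t)
        full = y + dt * (-k * y)                 # one full Euler step
        half = y + 0.5 * dt * (-k * y)           # two half steps...
        two_half = half + 0.5 * dt * (-k * half)
        if abs(full - two_half) > tol:
            dt *= 0.5                            # too coarse: shrink locally
            continue
        y, t, steps = two_half, t + dt, steps + 1
        dt *= 1.5                                # try growing the step again
    return y, steps

slow, n_slow = integrate_cell(1.0, 0.1, 10.0)    # slowly changing cell
fast, n_fast = integrate_cell(1.0, 5.0, 10.0)    # rapidly changing cell
```

Because each cell adapts independently, the fast cell spends many small steps early on while the slow cell coasts, which is the source of the speedup described above.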

  5. Formally verifying human–automation interaction as part of a system model: limitations and tradeoffs

    PubMed Central

    Bass, Ellen J.

    2011-01-01

    Both the human factors engineering (HFE) and formal methods communities are concerned with improving the design of safety-critical systems. This work discusses a modeling effort that leveraged methods from both fields to perform formal verification of human–automation interaction with a programmable device. The effort utilizes a system architecture composed of independent models of the human mission, human task behavior, human–device interface, device automation, and operational environment. The goals of this architecture were to allow HFE practitioners to perform formal verifications of realistic systems that depend on human–automation interaction in a reasonable amount of time, using representative models, intuitive modeling constructs, and decoupled models of system components that could be easily changed to support multiple analyses. The framework was instantiated using a patient-controlled analgesia pump in a two-phase process in which the models of each phase were verified against a common set of specifications. The first phase focused on the mission, human–device interface, and device automation, and included a simple, unconstrained model of human task behavior. The second phase replaced the unconstrained task model with one representing normative pump-programming behavior. Because the models produced in the first phase were too large for the model checker to verify, a number of model revisions were undertaken that affected the goals of the effort. While the use of human task behavior models in the second phase helped mitigate model complexity, verification time increased. Additional modeling tools and technological developments are necessary for model checking to become a more usable technique for HFE. PMID:21572930
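    The core of this kind of verification, exhaustively exploring a model's reachable states and checking a specification in each, can be shown on a toy interface model. The states, events, and safety property below are invented and far simpler than the paper's pump models.

```python
from collections import deque

# Toy illustration of exhaustive state-space exploration (invented states,
# not the paper's PCA pump model): enumerate every reachable state of a
# small interface model, then check a safety property over the transitions.
TRANSITIONS = {
    ("idle", "program"): "programmed",
    ("programmed", "confirm"): "confirmed",
    ("programmed", "clear"): "idle",
    ("confirmed", "start"): "infusing",
    ("infusing", "stop"): "idle",
}

def reachable(initial="idle"):
    seen, queue = {initial}, deque([initial])
    while queue:               # breadth-first exploration of the model
        current = queue.popleft()
        for (src, _event), dst in TRANSITIONS.items():
            if src == current and dst not in seen:
                seen.add(dst)
                queue.append(dst)
    return seen

states = reachable()
# Safety property: infusion can only begin from a confirmed programme.
safe = all(src == "confirmed"
           for (src, _event), dst in TRANSITIONS.items() if dst == "infusing")
```

Real model checkers explore state spaces orders of magnitude larger than this, which is exactly why the unconstrained first-phase models in the study became intractable.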

  6. Influence of Texture and Colour in Breast TMA Classification

    PubMed Central

    Fernández-Carrobles, M. Milagro; Bueno, Gloria; Déniz, Oscar; Salido, Jesús; García-Rojo, Marcial; González-López, Lucía

    2015-01-01

    Breast cancer diagnosis is still done by observation of biopsies under the microscope, and the development of automated methods for breast TMA classification would reduce diagnostic time. This paper is a step toward solving this problem and presents a complete study of breast TMA classification based on colour models and texture descriptors. The TMA images were divided into four classes: i) benign stromal tissue with cellularity, ii) adipose tissue, iii) benign and benign anomalous structures, and iv) ductal and lobular carcinomas. A relevant set of features was obtained over eight different colour models from first- and second-order Haralick statistical descriptors computed on the intensity image, together with Fourier, wavelet, multiresolution Gabor, M-LBP and texton descriptors. Furthermore, four types of classification experiments were performed using six different classifiers: (1) classification per colour model individually, (2) classification by combination of colour models, (3) classification by combination of colour models and descriptors, and (4) classification by combination of colour models and descriptors after a prior feature-set reduction. The best result shows an average of 99.05% accuracy and 98.34% positive predictive value, obtained by means of a bagging tree classifier with a combination of six colour models and the use of 1719 non-correlated (correlation threshold of 97%) textural features based on statistical, M-LBP, Gabor and spatial-texton descriptors. PMID:26513238
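    The correlation-threshold feature reduction mentioned above can be sketched as a greedy filter: keep a feature only if its absolute correlation with every feature kept so far stays below the threshold. The 0.97 threshold mirrors the paper; the data are random and purely illustrative.

```python
import numpy as np

# Sketch of correlation-threshold feature reduction on invented data:
# feature 1 is a near-duplicate of feature 0 and should be dropped.
def drop_correlated(X, threshold=0.97):
    corr = np.abs(np.corrcoef(X, rowvar=False))  # feature-by-feature matrix
    keep = []
    for j in range(X.shape[1]):
        if all(corr[j, k] < threshold for k in keep):
            keep.append(j)
    return keep

rng = np.random.default_rng(0)
base = rng.normal(size=(200, 3))
X = np.column_stack([base[:, 0],
                     base[:, 0] + 1e-3 * rng.normal(size=200),  # near-duplicate
                     base[:, 1],
                     base[:, 2]])
kept = drop_correlated(X)
```

On the paper's scale this filter is what reduces thousands of raw texture features to the 1719 non-correlated ones actually fed to the classifier.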

  7. Automated single cell microbioreactor for monitoring intracellular dynamics and cell growth in free solution†

    PubMed Central

    Johnson-Chavarria, Eric M.; Agrawal, Utsav; Tanyeri, Melikhan; Kuhlman, Thomas E.

    2014-01-01

    We report an automated microfluidic-based platform for single cell analysis that allows for cell culture in free solution with the ability to control the cell growth environment. Using this approach, cells are confined by the sole action of gentle fluid flow, thereby enabling non-perturbative analysis of cell growth away from solid boundaries. In addition, the single cell microbioreactor allows for precise and time-dependent control over cell culture media, with the combined ability to observe the dynamics of non-adherent cells over long time scales. As a proof-of-principle demonstration, we used the platform to observe dynamic cell growth, gene expression, and intracellular diffusion of repressor proteins while precisely tuning the cell growth environment. Overall, this microfluidic approach enables the direct observation of cellular dynamics with exquisite control over environmental conditions, which will be useful for quantifying the behaviour of single cells in well-defined media. PMID:24836754

  8. aMAP is a validated pipeline for registration and segmentation of high-resolution mouse brain data

    PubMed Central

    Niedworok, Christian J.; Brown, Alexander P. Y.; Jorge Cardoso, M.; Osten, Pavel; Ourselin, Sebastien; Modat, Marc; Margrie, Troy W.

    2016-01-01

    The validation of automated image registration and segmentation is crucial for accurate and reliable mapping of brain connectivity and function in three-dimensional (3D) data sets. While validation standards are necessarily high and routinely met in the clinical arena, they have to date been lacking for high-resolution microscopy data sets obtained from the rodent brain. Here we present a tool for optimized automated mouse atlas propagation (aMAP) based on clinical registration software (NiftyReg) for anatomical segmentation of high-resolution 3D fluorescence images of the adult mouse brain. We empirically evaluate aMAP as a method for registration and subsequent segmentation by validating it against the performance of expert human raters. This study therefore establishes a benchmark standard for mapping the molecular function and cellular connectivity of the rodent brain. PMID:27384127

  9. Towards automated segmentation of cells and cell nuclei in nonlinear optical microscopy.

    PubMed

    Medyukhina, Anna; Meyer, Tobias; Schmitt, Michael; Romeike, Bernd F M; Dietzek, Benjamin; Popp, Jürgen

    2012-11-01

    Nonlinear optical (NLO) imaging techniques based, e.g., on coherent anti-Stokes Raman scattering (CARS) or two-photon excited fluorescence (TPEF) show great potential for biomedical imaging. In order to facilitate the diagnostic process based on NLO imaging, there is a need for automated calculation of quantitative values such as cell density, nucleus-to-cytoplasm ratio, and average nuclear size. Extracting these parameters aids histological assessment in general and specifically, e.g., the determination of tumor grades, and it requires accurate image segmentation and detection of the locations and boundaries of cells and nuclei. Here we present an image-processing approach for the detection of nuclei and cells in co-registered TPEF and CARS images. The algorithm developed utilizes the gray-scale information to detect nuclei locations and the gradient information to delineate the nuclear and cellular boundaries. The reported approach is capable of automated segmentation of cells and nuclei in multimodal TPEF-CARS images of human brain tumor samples. The results are important for the development of NLO microscopy into a clinically relevant diagnostic tool. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
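    The two cues the algorithm combines, grey-level thresholding for nucleus locations and gradient information for boundaries, can be sketched on a tiny synthetic image. The pixel values and threshold are illustrative only.

```python
# Didactic sketch of the two segmentation cues on a synthetic 5x6 image
# containing one bright "nucleus" block; values are invented.
IMG = [
    [0, 0, 0, 0, 0, 0],
    [0, 9, 9, 9, 0, 0],
    [0, 9, 9, 9, 0, 0],
    [0, 9, 9, 9, 0, 0],
    [0, 0, 0, 0, 0, 0],
]

def threshold_mask(img, t=5):
    """Grey-level cue: pixels brighter than t are nucleus candidates."""
    return [[1 if px > t else 0 for px in row] for row in img]

def gradient_magnitude(img):
    """Boundary cue: central-difference gradient, L1 magnitude."""
    h, w = len(img), len(img[0])
    grad = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            gx = img[y][min(x + 1, w - 1)] - img[y][max(x - 1, 0)]
            gy = img[min(y + 1, h - 1)][x] - img[max(y - 1, 0)][x]
            grad[y][x] = abs(gx) + abs(gy)
    return grad

mask = threshold_mask(IMG)
grad = gradient_magnitude(IMG)
n_nucleus = sum(map(sum, mask))                         # thresholded pixels
boundary = sum(1 for row in grad for g in row if g > 0) # edge pixels
```

In the real pipeline both cues operate on co-registered TPEF and CARS channels rather than a single toy image.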

  10. Long-term microfluidic glucose and lactate monitoring in hepatic cell culture

    PubMed Central

    Prill, Sebastian; Jaeger, Magnus S.; Duschl, Claus

    2014-01-01

    Monitoring cellular bioenergetic pathways provides the basis for a detailed understanding of the physiological state of a cell culture. Therefore, it is widely used as a tool amongst others in the field of in vitro toxicology. The resulting metabolic information allows for performing in vitro toxicology assays for assessing drug-induced toxicity. In this study, we demonstrate the value of a microsystem for the fully automated detection of drug-induced changes in cellular viability by continuous monitoring of the metabolic activity over several days. To this end, glucose consumption and lactate secretion of a hepatic tumor cell line were continuously measured using microfluidically addressed electrochemical sensors. Adapting enzyme-based electrochemical flat-plate sensors, originally designed for human whole-blood samples, to their use with cell culture medium supersedes the common manual and laborious colorimetric assays and off-line operated external measurement systems. The cells were exposed to different concentrations of the mitochondrial inhibitor rotenone and the cellular response was analyzed by detecting changes in the rates of the glucose and lactate metabolism. Thus, the system provides real-time information on drug-induced liver injury in vitro. PMID:24926387

  11. Automated Student Model Improvement

    ERIC Educational Resources Information Center

    Koedinger, Kenneth R.; McLaughlin, Elizabeth A.; Stamper, John C.

    2012-01-01

    Student modeling plays a critical role in developing and improving instruction and instructional technologies. We present a technique for automated improvement of student models that leverages the DataShop repository, crowd sourcing, and a version of the Learning Factors Analysis algorithm. We demonstrate this method on eleven educational…

  12. Demonstration of the feasibility of automated silicon solar cell fabrication

    NASA Technical Reports Server (NTRS)

    Taylor, W. E.; Schwartz, F. M.

    1975-01-01

    A study effort was undertaken to determine the process steps and design requirements of an automated silicon solar cell production facility. The key process steps were identified, and a laboratory model was conceptually designed to demonstrate the feasibility of automating the silicon solar cell fabrication process. A detailed laboratory model was designed to demonstrate those functions most critical to the feasibility of automating solar cell fabrication. The study and conceptual design established the technical feasibility of automating the solar cell manufacturing process to produce low-cost solar cells with improved performance. Estimates predict an automated process throughput of 21,973 kilograms of silicon a year on a three-shift, 49-week basis, producing 4,747,000 hexagonal cells (38 mm/side), a total of 3,373 kilowatts, at an estimated manufacturing cost of $0.866 per cell or $1.22 per watt.

  13. Applying genetic algorithms for calibrating a hexagonal cellular automata model for the simulation of debris flows characterised by strong inertial effects

    NASA Astrophysics Data System (ADS)

    Iovine, G.; D'Ambrosio, D.; Di Gregorio, S.

    2005-03-01

    In modelling complex a-centric phenomena which evolve through local interactions within a discrete time-space, cellular automata (CA) represent a valid alternative to standard solution methods based on differential equations. Flow-type phenomena (such as lava flows, pyroclastic flows, earth flows, and debris flows) can be viewed as a-centric dynamical systems, and they can therefore be properly investigated in CA terms. SCIDDICA S4a is the latest release of a two-dimensional hexagonal CA model for simulating debris flows characterised by strong inertial effects. S4a was obtained by progressively enriching an initial simplified model originally derived for simulating very simple cases of slow-moving flow-type landslides. Using an empirical strategy, S4a translates the inertial character of the flowing mass into CA terms by means of local rules. In particular, in the transition function of the model, the distribution of landslide debris among the cells is obtained through a double cycle of computation: in the first phase, the inertial character of the landslide debris is taken into account by considering indicators of momentum; in the second phase, any debris remaining in the central cell is distributed among the adjacent cells according to the principle of maximum possible equilibrium. The complexity of the model and of the phenomena to be simulated suggested the need for an automated evaluation technique to determine the best set of global parameters. Accordingly, the model was calibrated using a genetic algorithm against the May 1998 Curti-Sarno (Southern Italy) debris flow. The boundaries of the area affected by the debris flow are simulated well by the model, and errors computed by comparing the simulations with the mapped areal extent of the actual landslide are smaller than those previously obtained without genetic algorithms.
As the experiments were realised in a sequential computing environment, they could be improved by adopting a parallel environment, which would allow a great number of tests to be performed in reasonable time.
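    The calibration idea, a genetic algorithm searching the space of global parameters for the best fit to an observed event, can be sketched with a toy stand-in for the automaton. Everything below is invented: SCIDDICA's real fitness compares the simulated and mapped debris-flow areas, whereas here the "model" is a simple function of two parameters.

```python
import random

# Toy sketch of GA calibration (not SCIDDICA itself): the "model" is a
# stand-in function of two transition parameters, and the GA searches for
# a parameter pair that reproduces a synthetic observation.
random.seed(1)

TRUE = (0.3, 1.2)

def simulate(p, q):
    # stand-in for running the cellular automaton with parameters (p, q)
    return 10 * p + 3 * q

OBSERVED = simulate(*TRUE)

def fitness(ind):
    return -abs(simulate(*ind) - OBSERVED)   # closer to observation = fitter

def evolve(pop_size=30, generations=60):
    pop = [(random.uniform(0, 1), random.uniform(0, 3)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 3]        # truncation selection + elitism
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = ((a[0] + b[0]) / 2 + random.gauss(0, 0.02),  # crossover
                     (a[1] + b[1]) / 2 + random.gauss(0, 0.02))  # + mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
error = abs(simulate(*best) - OBSERVED)
```

In the real calibration each fitness evaluation is a full CA simulation, which is why the authors note that a parallel environment would help.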

  14. Automation of Endmember Pixel Selection in SEBAL/METRIC Model

    NASA Astrophysics Data System (ADS)

    Bhattarai, N.; Quackenbush, L. J.; Im, J.; Shaw, S. B.

    2015-12-01

    The commonly applied surface energy balance algorithm for land (SEBAL) and its variant, mapping evapotranspiration (ET) at high resolution with internalized calibration (METRIC), require manual selection of endmember (i.e., hot and cold) pixels to calibrate sensible heat flux. Current approaches for automating this process are based on statistical methods and do not appear to be robust under varying climate conditions and seasons. In this paper, we introduce a new approach based on simple machine learning tools and search algorithms that provides an automatic and time-efficient way of identifying endmember pixels for use in these models. The fully automated models were applied to over 100 cloud-free Landsat images, each covering several eddy covariance flux sites in Florida and Oklahoma. Observed land surface temperatures at automatically identified hot and cold pixels were within 0.5% of those from pixels manually identified by an experienced operator (coefficient of determination, R2, ≥ 0.92; Nash-Sutcliffe efficiency, NSE, ≥ 0.92; and root mean squared error, RMSE, ≤ 1.67 K). Daily ET estimates derived from the automated SEBAL and METRIC models were in good agreement with their manual counterparts (e.g., NSE ≥ 0.91 and RMSE ≤ 0.35 mm day-1), and automated and manual pixel selection resulted in similar estimates of observed ET across all sites. The proposed approach should reduce the time demands of applying SEBAL/METRIC models and allow for their more widespread and frequent use. The automation can also reduce potential bias introduced by an inexperienced operator and extend the domain of the models to new users.
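    A much-simplified version of endmember selection can illustrate the idea: a cold pixel is a cool, well-vegetated pixel and a hot pixel is a warm, sparsely vegetated one. The fixed NDVI thresholds below stand in for the paper's learning-based search, and all values are invented.

```python
# Simplified endmember-selection sketch (illustrative thresholds, not the
# paper's machine-learning procedure).
def pick_endmembers(pixels, ndvi_veg=0.7, ndvi_bare=0.2):
    """pixels: list of (land_surface_temperature_K, ndvi) tuples."""
    cold_cands = [p for p in pixels if p[1] >= ndvi_veg]   # well vegetated
    hot_cands = [p for p in pixels if p[1] <= ndvi_bare]   # sparsely vegetated
    cold = min(cold_cands, key=lambda p: p[0])             # coolest of the green
    hot = max(hot_cands, key=lambda p: p[0])               # warmest of the bare
    return cold, hot

scene = [(295.0, 0.82), (298.5, 0.75), (312.0, 0.15), (308.2, 0.18),
         (303.0, 0.45)]
cold, hot = pick_endmembers(scene)
```

The chosen pair anchors the calibration of sensible heat flux across the rest of the scene.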

  15. Molecular and Cellular Quantitative Microscopy: theoretical investigations, technological developments and applications to neurobiology

    NASA Astrophysics Data System (ADS)

    Esposito, Alessandro

    2006-05-01

    This PhD project aims at the development and evaluation of microscopy techniques for the quantitative detection of molecular interactions and cellular features. The primarily investigated techniques are Förster Resonance Energy Transfer imaging and Fluorescence Lifetime Imaging Microscopy. These techniques have the capability to quantitatively probe the biochemical environment of fluorophores. An automated microscope capable of unsupervised operation has been developed that enables the investigation of molecular and cellular properties at high throughput levels and the analysis of cellular heterogeneity. State-of-the-art Förster Resonance Energy Transfer imaging, Fluorescence Lifetime Imaging Microscopy, Confocal Laser Scanning Microscopy and the newly developed tools have been combined with cellular and molecular biology techniques for the investigation of protein-protein interactions, oligomerization and post-translational modifications of α-Synuclein and Tau, two proteins involved in Parkinson’s and Alzheimer’s disease, respectively. The high inter-disciplinarity of this project required the merging of the expertise of both the Molecular Biophysics Group at the Debye Institute - Utrecht University and the Cell Biophysics Group at the European Neuroscience Institute - Göttingen University. This project was conducted also with the support and the collaboration of the Center for the Molecular Physiology of the Brain (Göttingen), particularly with the groups associated with the Molecular Quantitative Microscopy and Parkinson’s Disease and Aggregopathies areas. This work demonstrates that molecular and cellular quantitative microscopy can be used in combination with high-throughput screening as a powerful tool for the investigation of the molecular mechanisms of complex biological phenomena like those occurring in neurodegenerative diseases.

  16. PATIKA: an integrated visual environment for collaborative construction and analysis of cellular pathways.

    PubMed

    Demir, E; Babur, O; Dogrusoz, U; Gursoy, A; Nisanci, G; Cetin-Atalay, R; Ozturk, M

    2002-07-01

    Availability of the sequences of entire genomes shifts scientific curiosity towards the large-scale identification of genome function, as in genome studies. In the near future, data about cellular processes at the molecular level will accumulate at an accelerating rate as a result of proteomics studies. It is therefore essential to develop tools for storing, integrating, accessing, and analyzing these data effectively. We define an ontology for a comprehensive representation of cellular events. The ontology presented here enables integration of fragmented or incomplete pathway information and supports manipulation and incorporation of the stored data, as well as multiple levels of abstraction. Based on this ontology, we present the architecture of an integrated environment named Patika (Pathway Analysis Tool for Integration and Knowledge Acquisition). Patika is composed of a server-side, scalable, object-oriented database and client-side editors that provide an integrated, multi-user environment for visualizing and manipulating networks of cellular events. The tool features automated pathway layout, functional computation support, advanced querying and a user-friendly graphical interface. We expect that Patika will be a valuable tool for rapid knowledge acquisition, interpretation of large-scale microarray data, disease gene identification, and drug development. A prototype of Patika is available upon request from the authors.

  17. Noninvasive metabolic imaging of engineered 3D human adipose tissue in a perfusion bioreactor.

    PubMed

    Ward, Andrew; Quinn, Kyle P; Bellas, Evangelia; Georgakoudi, Irene; Kaplan, David L

    2013-01-01

    The efficacy and economy of most in vitro human models used in research are limited by the lack of a physiologically relevant three-dimensional perfused environment and the inability to noninvasively quantify the structural and biochemical characteristics of the tissue. The goal of this project was to develop a perfusion bioreactor system compatible with two-photon imaging to noninvasively assess tissue-engineered human adipose tissue structure and function in vitro. Three-dimensional (3D) vascularized human adipose tissues were engineered in vitro before being introduced to a perfusion environment and tracked over time by automated quantification of endogenous markers of metabolism using two-photon excited fluorescence (TPEF). Depth-resolved image stacks were analyzed for redox-ratio metabolic profiling and compared to prior analyses performed on 3D engineered adipose tissue in static culture. Traditional assessments with H&E staining were used to qualitatively measure extracellular matrix generation and cell density with respect to location within the tissue. The distribution of cells within the tissue and the average cellular redox ratios differed between static and perfusion cultures, while the trends of decreased redox ratio and increased cellular proliferation with time were similar in both. These results establish a basis for noninvasive optical tracking of tissue structure and function in vitro, which can be applied to future studies of tissue development, drug toxicity screening, and disease progression.

  18. Toward Automated Inventory Modeling in Life Cycle Assessment: The Utility of Semantic Data Modeling to Predict Real-WorldChemical Production

    EPA Science Inventory

    A set of coupled semantic data models, i.e., ontologies, are presented to advance a methodology towards automated inventory modeling of chemical manufacturing in life cycle assessment. The cradle-to-gate life cycle inventory for chemical manufacturing is a detailed collection of ...

  19. The use of interactive computer vision and robot hand controllers for enhancing manufacturing safety

    NASA Technical Reports Server (NTRS)

    Marzwell, Neville I.; Jacobus, Charles J.; Peurach, Thomas M.; Mitchell, Brian T.

    1994-01-01

    Current available robotic systems provide limited support for CAD-based model-driven visualization, sensing algorithm development and integration, and automated graphical planning systems. This paper describes ongoing work which provides the functionality necessary to apply advanced robotics to automated manufacturing and assembly operations. An interface has been built which incorporates 6-DOF tactile manipulation, displays for three dimensional graphical models, and automated tracking functions which depend on automated machine vision. A set of tools for single and multiple focal plane sensor image processing and understanding has been demonstrated which utilizes object recognition models. The resulting tool will enable sensing and planning from computationally simple graphical objects. A synergistic interplay between human and operator vision is created from programmable feedback received from the controller. This approach can be used as the basis for implementing enhanced safety in automated robotics manufacturing, assembly, repair and inspection tasks in both ground and space applications. Thus, an interactive capability has been developed to match the modeled environment to the real task environment for safe and predictable task execution.

  1. A knowledge-based approach to automated planning for hepatocellular carcinoma.

    PubMed

    Zhang, Yujie; Li, Tingting; Xiao, Han; Ji, Weixing; Guo, Ming; Zeng, Zhaochong; Zhang, Jianying

    2018-01-01

    To build a knowledge-based model of liver cancer for Auto-Planning, a function in Pinnacle, which is used as an automated inverse intensity-modulated radiation therapy (IMRT) planning system. Fifty Tomotherapy patients were enrolled to extract the dose-volume histogram (DVH) information and construct the protocol for the Auto-Planning model. An additional twenty patients were chosen to test the model. Manual planning and automated planning were performed blindly for all twenty test patients with the same machine and treatment planning system. The dose distributions of the target and organs at risk (OARs), along with the working time for planning, were evaluated. Statistically significant results showed that automated plans performed better in target conformity index (CI), while the mean target dose was 0.5 Gy higher than in manual plans. The differences between the target homogeneity indexes (HI) of the two methods were not statistically significant. Additionally, the doses to the normal liver, left kidney, and small bowel were significantly reduced with automated planning. In particular, the mean dose and V15 of the normal liver were 1.4 Gy and 40.5 cc lower with automated plans, respectively, and the mean doses to the left kidney and small bowel were reduced by 1.2 Gy and 2.1 Gy, respectively. Working time was also significantly reduced with automated planning. Auto-Planning proved feasible and effective with our knowledge-based model for liver cancer. © 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
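
    The abstract reports conformity and homogeneity indices without defining them. As an illustration only, here is a minimal sketch using two common formulations (the Paddick CI and the ICRU-83 HI, which are assumptions here, not necessarily the definitions used in the study):

```python
# Hypothetical sketch of plan-quality indices from voxel data.
# CI/HI definitions vary between clinics; the Paddick CI and ICRU-83 HI
# below are common choices, assumed for illustration.
import numpy as np

def paddick_ci(target_mask, prescription_isodose_mask):
    """Paddick conformity index: TV_PIV^2 / (TV * PIV)."""
    tv_piv = np.logical_and(target_mask, prescription_isodose_mask).sum()
    tv = target_mask.sum()                      # target volume (voxels)
    piv = prescription_isodose_mask.sum()       # prescription isodose volume
    return tv_piv**2 / (tv * piv)

def homogeneity_index(target_doses):
    """ICRU-83 homogeneity index: (D2% - D98%) / D50%."""
    d2 = np.percentile(target_doses, 98)        # near-maximum dose
    d98 = np.percentile(target_doses, 2)        # near-minimum dose
    d50 = np.percentile(target_doses, 50)       # median dose
    return (d2 - d98) / d50
```

    A CI closer to 1 and an HI closer to 0 indicate a more conformal, more homogeneous plan.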

  2. Modular workcells: modern methods for laboratory automation.

    PubMed

    Felder, R A

    1998-12-01

    Laboratory automation is beginning to become an indispensable survival tool for laboratories facing difficult market competition. However, estimates suggest that only 8% of laboratories will be able to afford total laboratory automation systems. Therefore, automation vendors have developed alternative hardware configurations called 'modular automation' to fit the smaller laboratory. Modular automation consists of consolidated analyzers, integrated analyzers, modular workcells, and pre- and post-analytical automation. These terms will be defined in this paper. Using a modular automation model, the automated core laboratory will become a site where laboratory data are evaluated by trained professionals to provide diagnostic information to practising physicians. Modern software information management and process control tools will complement modular hardware. Proper standardization that allows vendor-independent modular configurations will assure the success of this revolutionary new technology.

  3. Automation effects in a multiloop manual control system

    NASA Technical Reports Server (NTRS)

    Hess, R. A.; Mcnally, B. D.

    1986-01-01

    An experimental and analytical study was undertaken to investigate human interaction with a simple multiloop manual control system in which the human's activity was systematically varied by changing the level of automation. The system simulated was the longitudinal dynamics of a hovering helicopter. The automation-systems-stabilized vehicle responses from attitude to velocity to position and also provided for display automation in the form of a flight director. The control-loop structure resulting from the task definition can be considered a simple stereotype of a hierarchical control system. The experimental study was complemented by an analytical modeling effort which utilized simple crossover models of the human operator. It was shown that such models can be extended to the description of multiloop tasks involving preview and precognitive human operator behavior. The existence of time optimal manual control behavior was established for these tasks and the role which internal models may play in establishing human-machine performance was discussed.

  4. Automating an integrated spatial data-mining model for landfill site selection

    NASA Astrophysics Data System (ADS)

    Abujayyab, Sohaib K. M.; Ahamad, Mohd Sanusi S.; Yahya, Ahmad Shukri; Ahmad, Siti Zubaidah; Aziz, Hamidi Abdul

    2017-10-01

    An integrated programming environment represents a robust approach to building a valid model for landfill site selection. One of the main challenges in the integrated model is the complicated processing and modelling imposed by the programming stages and several limitations. An automation process helps avoid these limitations and improves the interoperability between integrated programming environments. This work targets the automation of a spatial data-mining model for landfill site selection by integrating a spatial programming environment (Python-ArcGIS) with a non-spatial environment (MATLAB). The model was constructed using neural networks and is divided into nine stages distributed between MATLAB and Python-ArcGIS. A case study was taken from the northern part of Peninsular Malaysia. Twenty-two criteria were selected as input data and used to build the training and testing datasets. The outcomes show a high accuracy of 98.2% on the testing dataset using 10-fold cross-validation. The automated spatial data-mining model provides a solid platform for decision makers to perform landfill site selection and planning operations on a regional scale.
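
    As an illustration of the 10-fold cross-validation behind the reported 98.2% accuracy, here is a minimal sketch. The nearest-centroid learner is a stand-in assumption; the abstract does not specify the authors' neural-network architecture:

```python
# Illustrative sketch (not the authors' code): estimating model accuracy
# with k-fold cross-validation. A nearest-centroid classifier stands in
# for the unspecified neural network.
import numpy as np

def k_fold_indices(n_samples, k=10, seed=0):
    """Shuffle sample indices and split them into k roughly equal folds."""
    rng = np.random.default_rng(seed)
    return np.array_split(rng.permutation(n_samples), k)

def cross_validate(X, y, fit, predict, k=10):
    """Mean held-out accuracy over k folds."""
    folds = k_fold_indices(len(X), k)
    accs = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        model = fit(X[train], y[train])
        accs.append(np.mean(predict(model, X[test]) == y[test]))
    return float(np.mean(accs))

# Minimal stand-in learner: classify by nearest class centroid.
def fit_centroids(X, y):
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def predict_centroids(model, X):
    classes = list(model)
    d = np.stack([np.linalg.norm(X - model[c], axis=1) for c in classes])
    return np.array(classes)[d.argmin(axis=0)]
```

    Each sample is held out exactly once, so the mean fold accuracy is an unbiased estimate of out-of-sample performance.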

  5. Model of Emotional Expressions in Movements

    ERIC Educational Resources Information Center

    Rozaliev, Vladimir L.; Orlova, Yulia A.

    2013-01-01

    This paper presents a new approach to automated identification of human emotions based on analysis of body movements, a recognition of gestures and poses. Methodology, models and automated system for emotion identification are considered. To characterize the person emotions in the model, body movements are described with linguistic variables and a…

  6. Data for Environmental Modeling (D4EM): Background and Applications of Data Automation

    EPA Science Inventory

    The Data for Environmental Modeling (D4EM) project demonstrates the development of a comprehensive set of open source software tools that overcome obstacles to accessing data needed by automating the process of populating model input data sets with environmental data available fr...

  7. A predictive model of flight crew performance in automated air traffic control and flight management operations

    DOT National Transportation Integrated Search

    1995-01-01

    Prepared ca. 1995. This paper describes Air-MIDAS, a model of pilot performance in interaction with varied levels of automation in flight management operations. The model was used to predict the performance of a two person flight crew responding to c...

  8. Automation of Ocean Product Metrics

    DTIC Science & Technology

    2008-09-30

    Presented in: Ocean Sciences 2008 Conf., 5 Mar 2008. Shriver, J., J. D. Dykes, and J. Fabre: Automation of Operational Ocean Product Metrics. Presented in 2008 EGU General Assembly, 14 April 2008. ...processing (multiple data cuts per day) and multiple-nested models. Routines for generating automated evaluations of model forecast statistics will be developed, and pre-existing tools will be collected to create a generalized tool set, which will include user-interface tools to the metrics data

  9. System Operations Studies for Automated Guideway Transit Systems : Discrete Event Simulation Model Programmer's Manual

    DOT National Transportation Integrated Search

    1982-07-01

    In order to examine specific automated guideway transit (AGT) developments and concepts, UMTA undertook a program of studies and technology investigations called Automated Guideway Transit Technology (AGTT) Program. The objectives of one segment of t...

  10. Systems Operations Studies for Automated Guideway Transit Systems : System Availability Model Programmer's Manual

    DOT National Transportation Integrated Search

    1981-07-01

    In order to examine specific automated guideway transit (AGT) developments and concepts, UMTA undertook a program of studies and technology investigations called Automated Guideway Transit Technology (AGTT) Program. The objectives of one segment of t...

  11. Automated structure solution, density modification and model building.

    PubMed

    Terwilliger, Thomas C

    2002-11-01

    The approaches that form the basis of automated structure solution in SOLVE and RESOLVE are described. The use of a scoring scheme to convert decision making in macromolecular structure solution to an optimization problem has proven very useful and in many cases a single clear heavy-atom solution can be obtained and used for phasing. Statistical density modification is well suited to an automated approach to structure solution because the method is relatively insensitive to choices of numbers of cycles and solvent content. The detection of non-crystallographic symmetry (NCS) in heavy-atom sites and checking of potential NCS operations against the electron-density map has proven to be a reliable method for identification of NCS in most cases. Automated model building beginning with an FFT-based search for helices and sheets has been successful in automated model building for maps with resolutions as low as 3 Å. The entire process can be carried out in a fully automatic fashion in many cases.

  12. Automation in clinical microbiology: a new approach to identifying micro-organisms by automated pattern matching of proteins labelled with 35S-methionine.

    PubMed Central

    Tabaqchali, S; Silman, R; Holland, D

    1987-01-01

    A new rapid automated method for the identification and classification of microorganisms is described. It is based on the incorporation of 35S-methionine into cellular proteins and subsequent separation of the radiolabelled proteins by sodium dodecyl sulphate-polyacrylamide gel electrophoresis (SDS-PAGE). The protein patterns produced were species specific and reproducible, permitting discrimination between the species. A large number of Gram negative and Gram positive aerobic and anaerobic organisms were successfully tested. Furthermore, there were sufficient differences within species between the protein profiles to permit subdivision of the species. New typing schemes for Clostridium difficile, coagulase negative staphylococci, and Staphylococcus aureus, including the methicillin resistant strains, could thus be introduced; this has provided the basis for useful epidemiological studies. To standardise and automate the procedure an automated electrophoresis system and a two dimensional scanner were developed to scan the dried gels directly. The scanner is operated by a computer which also stores and analyses the scan data. Specific histograms are produced for each bacterial species. Pattern recognition software is used to construct databases and to compare data obtained from different gels: in this way duplicate "unknowns" can be identified. Specific small areas showing differences between various histograms can also be isolated and expanded to maximise the differences, thus providing differentiation between closely related bacterial species and the identification of differences within the species to provide new typing schemes. This system should be widely applied in clinical microbiology laboratories in the near future. PMID:3312300
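
    The histogram-matching step described above can be sketched minimally as follows. The normalized-correlation matcher and the example profiles are assumptions for illustration, not the authors' scanner software:

```python
# Hypothetical sketch: matching a densitometry histogram of an unknown
# isolate against a database of species-specific reference profiles by
# normalized correlation.
import numpy as np

def best_match(profile, database):
    """Return the database key whose profile correlates best with `profile`,
    plus the full score dictionary."""
    def norm(v):
        v = np.asarray(v, dtype=float)
        v = v - v.mean()                 # remove baseline offset
        return v / np.linalg.norm(v)     # unit length for fair comparison
    p = norm(profile)
    scores = {name: float(norm(ref) @ p) for name, ref in database.items()}
    return max(scores, key=scores.get), scores
```

    In practice the reference profiles would come from the stored gel scans, and ambiguous matches could trigger the expanded sub-region comparison the abstract describes.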

  13. Comparison of automated versus manual neutrophil counts for the detection of cellular abnormalities in dogs receiving chemotherapy: 50 cases (May to June 2008).

    PubMed

    Cora, Michelle C; Neel, Jennifer A; Grindem, Carol B; Kissling, Grace E; Hess, Paul R

    2013-06-01

    To determine the frequency of clinically relevant abnormalities missed by failure to perform a blood smear evaluation in a specific subset of dogs receiving chemotherapy and to compare automated and manual neutrophil counts in the same population. Retrospective case series. 50 dogs receiving chemotherapy with a total nucleated cell count > 4,000 nucleated cells/μL. 50 blood smears were evaluated for abnormalities that have strong potential to change the medical plan for a patient: presence of blast cells, band neutrophils, nucleated RBCs, toxic change, hemoparasites, schistocytes, and spherocytes. Automated and manual neutrophil counts were compared. Blood smears from 10 (20%) patients had ≥ 1 abnormalities. Blast cells were identified on 4 (8%) blood smears, increased nucleated RBCs were identified on 5 (10%), and very mild toxic change was identified on 2 (4%). Correlation coefficient of the neutrophil counts was 0.96. Analysis revealed a slight bias between the automated and manual neutrophil counts (mean ± SD difference, -0.43 × 10³/μL ± 1.10 × 10³/μL). In this series of patients, neutrophil count correlation was very good. Clinically relevant abnormalities were found on 20% of the blood smears. An automated CBC appears to be accurate for neutrophil counts, but a microscopic examination of the corresponding blood smear is still recommended; further studies are needed to determine whether the detection or frequency of these abnormalities would differ dependent on chemotherapy protocol, neoplastic disease, and decision thresholds used by the oncologist in the ordering of a CBC without a blood smear evaluation.
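
    The two comparison statistics reported above (a Pearson correlation and a mean ± SD bias between methods) can be sketched minimally with hypothetical counts:

```python
# Illustrative sketch with hypothetical data: comparing automated vs.
# manual neutrophil counts using the two statistics reported in the
# study, a Pearson correlation and a Bland-Altman-style mean bias.
import numpy as np

def compare_methods(auto_counts, manual_counts):
    """Return (Pearson r, mean difference, SD of differences),
    with differences taken as automated minus manual."""
    auto = np.asarray(auto_counts, dtype=float)
    manual = np.asarray(manual_counts, dtype=float)
    r = np.corrcoef(auto, manual)[0, 1]
    diff = auto - manual
    return r, diff.mean(), diff.std(ddof=1)
```

    A high r with a nonzero mean difference, as in the study, indicates the two methods track each other well but one runs systematically lower.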

  14. Comparison of two methods for measuring γ-H2AX nuclear fluorescence as a marker of DNA damage in cultured human cells: applications for microbeam radiation therapy

    NASA Astrophysics Data System (ADS)

    Anderson, D.; Andrais, B.; Mirzayans, R.; Siegbahn, E. A.; Fallone, B. G.; Warkentin, B.

    2013-06-01

    Microbeam radiation therapy (MRT) delivers single fractions of very high doses of synchrotron x-rays using arrays of microbeams. In animal experiments, MRT has achieved higher tumour control and less normal tissue toxicity compared to single-fraction broad beam irradiations of much lower dose. The mechanism behind the normal tissue sparing of MRT has yet to be fully explained. An accurate method for evaluating DNA damage, such as the γ-H2AX immunofluorescence assay, will be important for understanding the role of cellular communication in the radiobiological response of normal and cancerous cell types to MRT. We compare two methods of quantifying γ-H2AX nuclear fluorescence for uniformly irradiated cell cultures: manual counting of γ-H2AX foci by eye, and an automated, MATLAB-based fluorescence intensity measurement. We also demonstrate the automated analysis of cell cultures irradiated with an array of microbeams. In addition to offering a relatively high dynamic range of γ-H2AX signal versus irradiation dose ( > 10 Gy), our automated method provides speed, robustness, and objectivity when examining a series of images. Our in-house analysis facilitates the automated extraction of the spatial distribution of the γ-H2AX intensity with respect to the microbeam array — for example, the intensities in the peak (high dose area) and valley (area between two microbeams) regions. The automated analysis is particularly beneficial when processing a large number of samples, as is needed to systematically study the relationship between the numerous dosimetric and geometric parameters involved with MRT (e.g., microbeam width, microbeam spacing, microbeam array dimensions, peak dose, valley dose, and geometric arrangement of multiple arrays) and the resulting DNA damage.
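
    An intensity-based measurement in the spirit of the authors' MATLAB analysis might be sketched as follows. The function names, the background-subtraction scheme, and the peak/valley profile are assumptions for illustration:

```python
# Hypothetical sketch of an intensity-based gamma-H2AX measurement:
# background-subtracted mean nuclear fluorescence, plus a spatial
# profile whose peaks mark microbeam paths and whose valleys mark the
# regions between beams.
import numpy as np

def mean_nuclear_intensity(image, nucleus_mask, background_mask):
    """Mean gamma-H2AX intensity inside nuclei, minus background level."""
    img = np.asarray(image, dtype=float)
    background = img[background_mask].mean()
    return img[nucleus_mask].mean() - background

def peak_valley_profile(image, axis=0):
    """Mean intensity along one image axis; with vertical microbeams,
    maxima correspond to peak (high-dose) regions and minima to valleys."""
    return np.asarray(image, dtype=float).mean(axis=axis)
```

    Unlike focus counting, which saturates once foci overlap, an integrated-intensity measure can remain dose-responsive above 10 Gy, which is the advantage the abstract highlights.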

  15. Modelling of human-machine interaction in equipment design of manufacturing cells

    NASA Astrophysics Data System (ADS)

    Cochran, David S.; Arinez, Jorge F.; Collins, Micah T.; Bi, Zhuming

    2017-08-01

    This paper proposes a systematic approach to model human-machine interactions (HMIs) in supervisory control of machining operations; it characterises the coexistence of machines and humans for an enterprise to balance the goals of automation/productivity and flexibility/agility. In the proposed HMI model, an operator is associated with a set of behavioural roles as a supervisor for multiple, semi-automated manufacturing processes. The model is innovative in the sense that (1) it represents an HMI based on its functions for process control but provides the flexibility for ongoing improvements in the execution of manufacturing processes; (2) it provides a computational tool to define functional requirements for an operator in HMIs. The proposed model can be used to design production systems at different levels of an enterprise architecture, particularly at the machine level in a production system where operators interact with semi-automation to accomplish the goal of 'autonomation' - automation that augments the capabilities of human beings.

  16. Design and Implementation of an Intelligent Cost Estimation Model for Decision Support System Software

    DTIC Science & Technology

    1990-09-01

    The COCOMO model, which stands for COnstructive COst MOdel, was developed by Barry W. Boehm. This thesis presents a cost estimation model that uses an expert system to automate the Intermediate COnstructive Cost Estimation Model (COCOMO) developed by Boehm.

  17. Development of an automated film-reading system for ballistic ranges

    NASA Technical Reports Server (NTRS)

    Yates, Leslie A.

    1992-01-01

    Software for an automated film-reading system that uses personal computers and digitized shadowgraphs is described. The software identifies pixels associated with fiducial-line and model images, and least-squares procedures are used to calculate the positions and orientations of the images. Automated position and orientation readings for sphere and cone models are compared to those obtained using a manual film reader. When facility calibration errors are removed from these readings, the accuracy of the automated readings is better than the pixel resolution, and it is equal to, or better than, the manual readings. The effects of film-reading and facility-calibration errors on calculated aerodynamic coefficients is discussed.

  18. Contrasting models of driver behaviour in emergencies using retrospective verbalisations and network analysis.

    PubMed

    Banks, Victoria A; Stanton, Neville A

    2015-01-01

    Automated assistance in driving emergencies aims to improve the safety of our roads by avoiding or mitigating the effects of accidents. However, the behavioural implications of such systems remain unknown. This paper introduces the driver decision-making in emergencies (DDMiEs) framework to investigate how the level and type of automation may affect driver decision-making and subsequent responses to critical braking events using network analysis to interrogate retrospective verbalisations. Four DDMiE models were constructed to represent different levels of automation within the driving task and its effects on driver decision-making. Findings suggest that whilst automation does not alter the decision-making pathway (e.g. the processes between hazard detection and response remain similar), it does appear to significantly weaken the links between information-processing nodes. This reflects an unintended yet emergent property within the task network that could mean that we may not be improving safety in the way we expect. This paper contrasts models of driver decision-making in emergencies at varying levels of automation using the Southampton University Driving Simulator. Network analysis of retrospective verbalisations indicates that increasing the level of automation in driving emergencies weakens the link between information-processing nodes essential for effective decision-making.

  19. Automated analysis of cell migration and nuclear envelope rupture in confined environments.

    PubMed

    Elacqua, Joshua J; McGregor, Alexandra L; Lammerding, Jan

    2018-01-01

    Recent in vitro and in vivo studies have highlighted the importance of the cell nucleus in governing migration through confined environments. Microfluidic devices that mimic the narrow interstitial spaces of tissues have emerged as important tools to study cellular dynamics during confined migration, including the consequences of nuclear deformation and nuclear envelope rupture. However, while image acquisition can be automated on motorized microscopes, the analysis of the corresponding time-lapse sequences for nuclear transit through the pores and events such as nuclear envelope rupture currently requires manual analysis. In addition to being highly time-consuming, such manual analysis is susceptible to person-to-person variability. Studies that compare large numbers of cell types and conditions therefore require automated image analysis to achieve sufficiently high throughput. Here, we present an automated image analysis program to register microfluidic constrictions and perform image segmentation to detect individual cell nuclei. The MATLAB program tracks nuclear migration over time and records constriction-transit events, transit times, transit success rates, and nuclear envelope rupture. Such automation reduces the time required to analyze migration experiments from weeks to hours, and removes the variability that arises from different human analysts. Comparison with manual analysis confirmed that both constriction transit and nuclear envelope rupture were detected correctly and reliably, and the automated analysis results closely matched a manual analysis gold standard. 
Applying the program to specific biological examples, we demonstrate its ability to detect differences in nuclear transit time between cells with different levels of the nuclear envelope proteins lamin A/C, which govern nuclear deformability, and to detect an increase in nuclear envelope rupture duration in cells in which CHMP7, a protein involved in nuclear envelope repair, had been depleted. The program thus presents a versatile tool for the study of confined migration and its effect on the cell nucleus.
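
    The constriction-transit logic described above might be sketched as follows (a hypothetical simplification; the actual program is in MATLAB, and the entry-margin heuristic is an assumption):

```python
# Hypothetical sketch: given a nucleus's centroid x-position over time
# and the x-coordinate of a microfluidic constriction, report the
# transit time in frames, or None if the nucleus never cleared it.
import numpy as np

def transit_time(x_positions, constriction_x, entry_margin=5.0):
    """Frames between first approaching the constriction (within
    entry_margin) and first clearing it."""
    x = np.asarray(x_positions, dtype=float)
    entered = np.flatnonzero(x > constriction_x - entry_margin)
    cleared = np.flatnonzero(x > constriction_x)
    if len(entered) == 0 or len(cleared) == 0:
        return None
    return int(cleared[0] - entered[0])
```

    Per-nucleus transit times computed this way can then be aggregated into the transit success rates and condition comparisons the abstract describes.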

  20. Cockpit Adaptive Automation and Pilot Performance

    NASA Technical Reports Server (NTRS)

    Parasuraman, Raja

    2001-01-01

    The introduction of high-level automated systems in the aircraft cockpit has provided several benefits, e.g., new capabilities, enhanced operational efficiency, and reduced crew workload. At the same time, conventional 'static' automation has sometimes degraded human operator monitoring performance, increased workload, and reduced situation awareness. Adaptive automation represents an alternative to static automation. In this approach, task allocation between human operators and computer systems is flexible and context-dependent rather than static. Adaptive automation, or adaptive task allocation, is thought to provide for regulation of operator workload and performance, while preserving the benefits of static automation. In previous research we have reported beneficial effects of adaptive automation on the performance of both pilots and non-pilots of flight-related tasks. For adaptive systems to be viable, however, such benefits need to be examined jointly in the context of a single set of tasks. The studies carried out under this project evaluated a systematic method for combining different forms of adaptive automation. A model for effective combination of different forms of adaptive automation, based on matching adaptation to operator workload, was proposed and tested. The model was evaluated in studies using IFR-rated pilots flying a general-aviation simulator. Performance, subjective, and physiological (heart rate variability, eye scan-paths) measures of workload were recorded. The studies compared workload-based adaptation to non-adaptive control conditions and found evidence for systematic benefits of adaptive automation. The research provides an empirical basis for evaluating the effectiveness of adaptive automation in the cockpit. The results contribute to the development of design principles and guidelines for the implementation of adaptive automation in the cockpit, particularly in general aviation, and in other human-machine systems. 
Project goals were met or exceeded. The results of the research extended knowledge of automation-related performance decrements in pilots and demonstrated the positive effects of adaptive task allocation. In addition, several practical implications for cockpit automation design were drawn from the research conducted. A total of 12 articles deriving from the project were published.

  1. Effects of imperfect automation on decision making in a simulated command and control task.

    PubMed

    Rovira, Ericka; McGarry, Kathleen; Parasuraman, Raja

    2007-02-01

    Effects of four types of automation support and two levels of automation reliability were examined. The objective was to examine the differential impact of information and decision automation and to investigate the costs of automation unreliability. Research has shown that imperfect automation can lead to differential effects of stages and levels of automation on human performance. Eighteen participants performed a "sensor to shooter" targeting simulation of command and control. Dependent variables included accuracy and response time of target engagement decisions, secondary task performance, and subjective ratings of mental workload, trust, and self-confidence. Compared with manual performance, reliable automation significantly reduced decision times. Unreliable automation led to greater cost in decision-making accuracy under the higher automation reliability condition for three different forms of decision automation relative to information automation. At low automation reliability, however, there was a cost in performance for both information and decision automation. The results are consistent with a model of human-automation interaction that requires evaluation of the different stages of information processing to which automation support can be applied. If fully reliable decision automation cannot be guaranteed, designers should provide users with information automation support or other tools that allow for inspection and analysis of raw data.

  2. Real-Time Kinetic Modeling of Voltage-Gated Ion Channels Using Dynamic Clamp

    PubMed Central

    Milescu, Lorin S.; Yamanishi, Tadashi; Ptak, Krzysztof; Mogri, Murtaza Z.; Smith, Jeffrey C.

    2008-01-01

    We propose what to our knowledge is a new technique for modeling the kinetics of voltage-gated ion channels in a functional context, in neurons or other excitable cells. The principle is to pharmacologically block the studied channel type, and to functionally replace it with dynamic clamp, on the basis of a computational model. Then, the parameters of the model are modified in real time (manually or automatically), with the objective of matching the dynamical behavior of the cell (e.g., action potential shape and spiking frequency), but also the transient and steady-state properties of the model (e.g., those derived from voltage-clamp recordings). Through this approach, one may find a model and parameter values that explain both the observed cellular dynamics and the biophysical properties of the channel. We extensively tested the method, focusing on Nav models. Complex Markov models (10–12 states or more) could be accurately integrated in real time at >50 kHz using the transition probability matrix, but not the explicit Euler method. The practicality of the technique was tested with experiments in raphe pacemaker neurons. Through automated real-time fitting, a Hodgkin-Huxley model could be found that reproduced well the action potential shape and the spiking frequency. Adding a virtual axonal compartment with a high density of Nav channels further improved the action potential shape. The computational procedure was implemented in the free QuB software, running under Microsoft Windows and featuring a friendly graphical user interface. PMID:18375511
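
    The abstract's point that the transition probability matrix succeeds where the explicit Euler method fails can be illustrated with a minimal two-state sketch. The rate constants are invented for illustration; this is not the QuB implementation:

```python
# Illustrative sketch: propagating Markov-channel state occupancies with
# the transition probability matrix P = expm(Q*dt), which remains a
# valid stochastic matrix at time steps where explicit Euler (I + Q*dt)
# produces negative "probabilities".
import numpy as np

def expm(A, terms=30):
    """Matrix exponential by truncated Taylor series (adequate for
    small, well-scaled Q*dt as here)."""
    result = np.eye(len(A))
    term = np.eye(len(A))
    for k in range(1, terms):
        term = term @ A / k
        result = result + term
    return result

# Hypothetical two-state channel: closed <-> open, rates in 1/ms.
Q = np.array([[-5.0, 5.0],
              [ 2.0, -2.0]])
dt = 0.5  # ms; deliberately large relative to the fastest rate

P_exact = expm(Q * dt)        # rows sum to 1, all entries in [0, 1]
P_euler = np.eye(2) + Q * dt  # contains a negative entry at this dt
```

    This is why, as the abstract notes, complex Markov models could be integrated accurately in real time via the transition probability matrix but not via the explicit Euler method.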

  3. ANDSystem: an Associative Network Discovery System for automated literature mining in the field of biology

    PubMed Central

    2015-01-01

    Background Sufficient knowledge of molecular and genetic interactions, which comprise the entire basis of the functioning of living systems, is one of the necessary requirements for successfully answering almost any research question in the field of biology and medicine. To date, more than 24 million scientific papers can be found in PubMed, with many of them containing descriptions of a wide range of biological processes. The analysis of such tremendous amounts of data requires the use of automated text-mining approaches. Although a handful of tools have recently been developed to meet this need, none of them provide error-free extraction of highly detailed information. Results The ANDSystem package was developed for the reconstruction and analysis of molecular genetic networks based on an automated text-mining technique. It provides a detailed description of the various types of interactions between genes, proteins, microRNAs, metabolites, cellular components, pathways and diseases, taking into account the specificity of cell lines and organisms. Although the accuracy of ANDSystem is comparable to that of other well-known text-mining tools, such as Pathway Studio and STRING, it outperforms them in having the ability to identify an increased number of interaction types. Conclusion The use of ANDSystem, in combination with Pathway Studio and STRING, can improve the quality of the automated reconstruction of molecular and genetic networks. ANDSystem should provide a useful tool for researchers working in a number of different fields, including biology, biotechnology, pharmacology and medicine. PMID:25881313

  4. Automated recognition of cell phenotypes in histology images based on membrane- and nuclei-targeting biomarkers

    PubMed Central

    Karaçalı, Bilge; Vamvakidou, Alexandra P; Tözeren, Aydın

    2007-01-01

    Background Three-dimensional in vitro culture of cancer cells are used to predict the effects of prospective anti-cancer drugs in vivo. In this study, we present an automated image analysis protocol for detailed morphological protein marker profiling of tumoroid cross section images. Methods Histologic cross sections of breast tumoroids developed in co-culture suspensions of breast cancer cell lines, stained for E-cadherin and progesterone receptor, were digitized and pixels in these images were classified into five categories using k-means clustering. Automated segmentation was used to identify image regions composed of cells expressing a given biomarker. Synthesized images were created to check the accuracy of the image processing system. Results Accuracy of automated segmentation was over 95% in identifying regions of interest in synthesized images. Image analysis of adjacent histology slides stained, respectively, for Ecad and PR, accurately predicted regions of different cell phenotypes. Image analysis of tumoroid cross sections from different tumoroids obtained under the same co-culture conditions indicated the variation of cellular composition from one tumoroid to another. Variations in the compositions of cross sections obtained from the same tumoroid were established by parallel analysis of Ecad and PR-stained cross section images. Conclusion Proposed image analysis methods offer standardized high throughput profiling of molecular anatomy of tumoroids based on both membrane and nuclei markers that is suitable to rapid large scale investigations of anti-cancer compounds for drug development. PMID:17822559
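
    The k-means pixel classification described above can be sketched minimally as follows (Lloyd's algorithm with a simplistic deterministic initialization; an illustration, not the authors' pipeline):

```python
# Illustrative sketch: classifying pixel feature vectors (e.g. RGB
# values from a stained histology image) into k categories with Lloyd's
# k-means algorithm.
import numpy as np

def kmeans(pixels, k, iters=50):
    """Cluster pixel feature vectors into k groups; returns (labels, centers)."""
    pixels = np.asarray(pixels, dtype=float)
    # Simplistic deterministic init: spread initial centers across the data.
    centers = pixels[np.linspace(0, len(pixels) - 1, k).astype(int)].copy()
    for _ in range(iters):
        # Assign each pixel to its nearest center.
        d = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Move each center to the mean of its assigned pixels.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = pixels[labels == j].mean(axis=0)
    return labels, centers
```

    With k = 5, as in the study, each pixel ends up labeled with one of five stain-intensity categories, and contiguous regions of one label can then be segmented as biomarker-expressing areas.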

  5. An automated laboratory-scale methodology for the generation of sheared mammalian cell culture samples.

    PubMed

    Joseph, Adrian; Goldrick, Stephen; Mollet, Michael; Turner, Richard; Bender, Jean; Gruber, David; Farid, Suzanne S; Titchener-Hooker, Nigel

    2017-05-01

    Continuous disk-stack centrifugation is typically used for the removal of cells and cellular debris from mammalian cell culture broths at manufacturing scale. The use of scale-down methods to characterise disk-stack centrifugation performance enables substantial reductions in material requirements and allows a much wider design space to be tested than is currently possible at pilot-scale. The process of scaling down centrifugation has historically been challenging due to the difficulties in mimicking the Energy Dissipation Rates (EDRs) in typical machines. This paper describes an alternative and easy-to-assemble automated capillary-based methodology to generate levels of EDRs consistent with those found in a continuous disk-stack centrifuge. Variations in EDR were achieved through changes in capillary internal diameter and the flow rate of operation through the capillary. The EDRs found to match the levels of shear in the feed zone of a pilot-scale centrifuge using the experimental method developed in this paper (2.4×10⁵ W/kg) are consistent with those obtained through previously published computational fluid dynamic (CFD) studies (2.0×10⁵ W/kg). Furthermore, this methodology can be incorporated into existing scale-down methods to model the process performance of continuous disk-stack centrifuges. This was demonstrated through the characterisation of culture hold time, culture temperature and EDRs on centrate quality. © 2017 The Authors. Biotechnology Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
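
    For orientation on how diameter and flow rate set the EDR, the mean dissipation in a capillary can be estimated from the textbook Hagen-Poiseuille pressure drop. This is a laminar-flow, volume-averaged estimate and not the paper's method; peak wall values (and any turbulent contribution at the quoted 10⁵ W/kg levels) are higher:

```python
import math

def mean_edr_laminar(q_m3_per_s, d_m, mu=1.0e-3, rho=1000.0):
    """Volume-averaged energy dissipation rate (W/kg), laminar capillary flow.

    From dP = 128*mu*L*Q / (pi*d**4) and dissipated power dP*Q spread over
    fluid mass rho*pi*d**2*L/4: the length L cancels, leaving Q**2 / d**6.
    """
    return 512.0 * mu * q_m3_per_s ** 2 / (rho * math.pi ** 2 * d_m ** 6)
```

    The d⁻⁶ dependence is why small changes in capillary internal diameter sweep a very large EDR range, matching the paper's choice of diameter and flow rate as the two operating variables.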

  6. High-Throughput Identification of Combinatorial Ligands for DNA Delivery in Cell Culture

    NASA Astrophysics Data System (ADS)

    Svahn, Mathias G.; Rabe, Kersten S.; Barger, Geoffrey; EL-Andaloussi, Samir; Simonson, Oscar E.; Boturyn, Didier; Renaudet, Olivier; Dumy, Pascal; Brandén, Lars J.; Niemeyer, Christof M.; Smith, C. I. Edvard

    2008-10-01

    Finding the optimal combinations of ligands for tissue-specific delivery is tedious even if only a few well-established compounds are tested. The cargo affects the receptor-ligand interaction, especially when it is charged like DNA. The ligand should therefore be evaluated together with its cargo. Several viruses have been shown to interact with more than one receptor for efficient internalization. We here present a DNA oligonucleotide-based method for inexpensive and rapid screening of biotin-labeled ligands for combinatorial effects on cellular binding and uptake. The oligonucleotide complex was designed as a 44 bp double-stranded DNA oligonucleotide with one central streptavidin molecule and a second streptavidin at the terminus. The use of a highly advanced robotic platform ensured stringent processing and execution of the experiments. The oligonucleotides were fluorescently labeled and used for detection and analysis of cell-bound, internalized and intracellularly compartmentalized constructs by an automated line-scanning confocal microscope, the IN Cell Analyzer 3000. All possible combinations of 22 ligands were explored in sets of 2 and tested on 6 different human cell lines in triplicate. In total, 10,000 transfections were performed on the automation platform. Cell-specific combinations of ligands were identified, and their relative position on the scaffold oligonucleotide was found to be of importance. The ligand effects were found to be cargo dependent: carbohydrates were more potent for DNA delivery, whereas cell-penetrating peptides were more potent for delivery of less charged particles.

  7. Liquid-Based Medium Used to Prepare Cytological Breast Nipple Fluid Improves the Quality of Cellular Samples Automatic Collection

    PubMed Central

    Zonta, Marco Antonio; Velame, Fernanda; Gema, Samara; Filassi, Jose Roberto; Longatto-Filho, Adhemar

    2014-01-01

    Background Breast cancer is the second leading cause of death in women worldwide. Spontaneous breast nipple discharge may contain cells that can be analyzed for malignancy. The Halo® Mamo Cyto Test (HMCT) was recently developed as an automated system designed to aspirate cells from the breast ducts. The objective of this study was to standardize the methodology of sampling and sample preparation of nipple discharge obtained by the automated Halo breast test and to perform cytological evaluation on samples preserved in liquid medium (SurePath™). Methods We analyzed 564 nipple fluid samples, from women between 20 and 85 years old, without history of breast disease or neoplasia, not pregnant, and without relevant gynecologic medical history, collected by the HMCT method and preserved in two different vials with solutions for transport. Results Of the 306 nipple fluid samples from method 1, 199 (65%) were classified as unsatisfactory (class 0), 104 (34%) were classified as benign findings (class II), and three (1%) were classified as indeterminate for neoplastic cells (class III). Of the 258 samples analyzed in method 2, 127 (49%) were classified as class 0, 124 (48%) as class II, and seven (2%) as class III. Conclusion Our study suggests an improvement in the quality and quantity of cellular samples when the two methodologies, the Halo breast test and the liquid-based medium, are combined. PMID:29147397

  8. Comparison of the Cellient™ automated cell block system and agar cell block method.

    PubMed

    Kruger, A M; Stevens, M W; Kerley, K J; Carter, C D

    2014-12-01

    To compare the Cellient™ automated cell block system with the agar cell block method in terms of quantity and quality of diagnostic material and morphological, histochemical and immunocytochemical features. Cell blocks were prepared from 100 effusion samples using the agar method and the Cellient system, then routinely sectioned and stained with haematoxylin and eosin and periodic acid-Schiff with diastase (PASD). A preliminary immunocytochemical study was performed on selected cases (27/100). Sections were evaluated using a three-point grading system to compare a set of morphological parameters. Statistical analysis was performed using Fisher's exact test. Parameters assessing cellularity, presence of single cells and definition of nuclear membrane, nucleoli, chromatin and cytoplasm showed a statistically significant improvement on Cellient cell blocks compared with agar cell blocks (P < 0.05). No significant difference was seen for definition of cell groups, PASD staining or the intensity or clarity of immunocytochemical staining. A discrepant immunocytochemistry (ICC) result was seen in 21% (13/63) of immunostains. The Cellient technique is comparable with the agar method, with statistically significant results achieved for important morphological features. It demonstrates potential as an alternative cell block preparation method relevant to the rapid processing of fine needle aspiration samples, malignant effusions and low-cellularity specimens, where optimal cell morphology and architecture are essential. Further investigation is required to optimize immunocytochemical staining using the Cellient method. © 2014 John Wiley & Sons Ltd.

  9. The Impact of Sampling Approach on Population Invariance in Automated Scoring of Essays. Research Report. ETS RR-13-18

    ERIC Educational Resources Information Center

    Zhang, Mo

    2013-01-01

    Many testing programs use automated scoring to grade essays. One issue in automated essay scoring that has not been examined adequately is population invariance and its causes. The primary purpose of this study was to investigate the impact of sampling in model calibration on population invariance of automated scores. This study analyzed scores…

  10. A Compact Synchronous Cellular Model of Nonlinear Calcium Dynamics: Simulation and FPGA Synthesis Results.

    PubMed

    Soleimani, Hamid; Drakakis, Emmanuel M

    2017-06-01

    Recent studies have demonstrated that calcium is a widespread intracellular ion that controls a wide range of temporal dynamics in the mammalian body. The simulation and validation of such studies using experimental data would benefit from a fast, large-scale simulation and modelling tool. This paper presents a compact and fully reconfigurable cellular calcium model capable of mimicking the Hopf bifurcation phenomenon and various nonlinear responses of biological calcium dynamics. The proposed cellular model is synthesized on a digital platform for a single unit and a network model. Hardware synthesis, physical implementation on FPGA, and theoretical analysis confirm that the proposed cellular model can mimic biological calcium behaviors with considerably low hardware overhead. The approach has the potential to speed up large-scale simulations of slow intracellular dynamics by sharing more cellular units in real time. To this end, various networks constructed by pipelining 10 k to 40 k cellular calcium units are compared with an equivalent simulation run on a standard PC workstation. Results show that the cellular hardware model is, on average, 83 times faster than the CPU version.
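
    The kind of per-unit update that such a pipeline replicates can be illustrated with a forward-Euler step of a two-variable nonlinear oscillator. Here a FitzHugh-Nagumo system stands in for the paper's calcium model; both the equations and the parameters are illustrative, chosen only to show the state-update rule an FPGA would pipeline across thousands of units:

```python
def step(v, w, dt=0.05, i_ext=0.5, a=0.7, b=0.8, eps=0.08):
    # One forward-Euler update of a FitzHugh-Nagumo unit; a hardware
    # implementation applies this same update to many units per cycle.
    dv = v - v ** 3 / 3.0 - w + i_ext
    dw = eps * (v + a - b * w)
    return v + dt * dv, w + dt * dw

def simulate(steps, v=-1.0, w=1.0):
    # Integrate one unit and record the fast variable's trajectory.
    trace = []
    for _ in range(steps):
        v, w = step(v, w)
        trace.append(v)
    return trace
```

    With these parameters the unit settles onto a limit cycle, the discrete analogue of the sustained nonlinear oscillations (and Hopf-type transitions) the cellular calcium model is designed to reproduce.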

  11. A Binary Programming Approach to Automated Test Assembly for Cognitive Diagnosis Models

    ERIC Educational Resources Information Center

    Finkelman, Matthew D.; Kim, Wonsuk; Roussos, Louis; Verschoor, Angela

    2010-01-01

    Automated test assembly (ATA) has been an area of prolific psychometric research. Although ATA methodology is well developed for unidimensional models, its application alongside cognitive diagnosis models (CDMs) is a burgeoning topic. Two suggested procedures for combining ATA and CDMs are to maximize the cognitive diagnostic index and to use a…

  12. Automated side-chain model building and sequence assignment by template matching.

    PubMed

    Terwilliger, Thomas C

    2003-01-01

    An algorithm is described for automated building of side chains in an electron-density map once a main-chain model is built and for alignment of the protein sequence to the map. The procedure is based on a comparison of electron density at the expected side-chain positions with electron-density templates. The templates are constructed from average amino-acid side-chain densities in 574 refined protein structures. For each contiguous segment of main chain, a matrix with entries corresponding to an estimate of the probability that each of the 20 amino acids is located at each position of the main-chain model is obtained. The probability that this segment corresponds to each possible alignment with the sequence of the protein is estimated using a Bayesian approach and high-confidence matches are kept. Once side-chain identities are determined, the most probable rotamer for each side chain is built into the model. The automated procedure has been implemented in the RESOLVE software. Combined with automated main-chain model building, the procedure produces a preliminary model suitable for refinement and extension by an experienced crystallographer.
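
    The sequence-alignment step can be sketched as follows: given a per-position probability matrix for a built segment, score every offset into the protein sequence and normalize to a posterior over offsets. This is a minimal stand-in for the Bayesian matching described, assuming a uniform prior over alignments; all names are illustrative:

```python
import math

def align_segment(prob, sequence, amino_acids):
    # prob[i][a]: estimated probability that residue type a occupies
    # position i of the built main-chain segment (from template matching).
    n, m = len(prob), len(sequence)
    idx = {a: j for j, a in enumerate(amino_acids)}
    loglik = []
    for k in range(m - n + 1):                     # every alignment offset
        loglik.append(sum(math.log(prob[i][idx[sequence[k + i]]])
                          for i in range(n)))
    mx = max(loglik)                               # stabilize the softmax
    weights = [math.exp(s - mx) for s in loglik]
    z = sum(weights)
    return [w / z for w in weights]                # posterior over offsets
```

    In the procedure described, only high-confidence offsets (posterior above a threshold) would be kept before building the most probable rotamer for each identified side chain.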

  13. Automating Risk Analysis of Software Design Models

    PubMed Central

    Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P.

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased number of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688
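
    The abstract does not specify the format of AutSEC's identification and mitigation trees, so the following is a hypothetical miniature of the idea only: internal nodes test properties of a design element, and each leaf names an identified threat with its advised mitigation (all labels are invented):

```python
def find_threats(element, node):
    # Walk an identification tree for one data-flow element; a leaf
    # carries the identified threat and its mitigation advice.
    if node is None:
        return []
    if "threat" in node:
        return [(node["threat"], node["mitigation"])]
    branch = node["yes"] if node["test"](element) else node["no"]
    return find_threats(element, branch)

# Illustrative tree: unencrypted flows crossing a trust boundary are
# flagged for tampering (not taken from AutSEC itself).
TREE = {
    "test": lambda e: e["crosses_trust_boundary"],
    "yes": {
        "test": lambda e: not e["encrypted"],
        "yes": {"threat": "tampering", "mitigation": "encrypt the channel"},
        "no": None,
    },
    "no": None,
}
```

    Running every element of a data flow diagram through such trees yields a threat list with mitigations, without requiring the developer to be a security expert.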

  14. Automating risk analysis of software design models.

    PubMed

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased number of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  15. An innovative approach for modeling and simulation of an automated industrial robotic arm operated electro-pneumatically

    NASA Astrophysics Data System (ADS)

    Popa, L.; Popa, V.

    2017-08-01

    The article is focused on modeling an automated industrial robotic arm operated electro-pneumatically and on simulating the robotic arm's operation. The graphic language FBD (Function Block Diagram) is used to program the robotic arm on the Zelio Logic automation platform. The innovative modeling and simulation procedures address specific problems in the development of a new type of technical product in the field of robotics. New applications were thus identified for a Programmable Logic Controller (PLC) as a specialized computer performing control functions at a variety of high levels of complexity.

  16. Automated Discovery and Modeling of Sequential Patterns Preceding Events of Interest

    NASA Technical Reports Server (NTRS)

    Rohloff, Kurt

    2010-01-01

    The integration of emerging data manipulation technologies has enabled a paradigm shift in practitioners' abilities to understand and anticipate events of interest in complex systems. Example events of interest include outbreaks of socio-political violence in nation-states. Rather than relying on human-centric modeling efforts that are limited by the availability of subject-matter experts (SMEs), automated data processing technologies have enabled the development of innovative automated complex system modeling and predictive analysis technologies. We introduce one such emerging modeling technology - the sequential pattern methodology. We have applied the sequential pattern methodology to automatically identify patterns of observed behavior that precede outbreaks of socio-political violence such as riots, rebellions and coups in nation-states. The sequential pattern methodology is a groundbreaking approach to automated complex system model discovery because it generates easily interpretable patterns based on direct observations of sampled factor data, for a deeper understanding of societal behaviors, and is tolerant of observation noise and missing data. The discovered patterns are simple to interpret and mimic human identification of observed trends in temporal data. Discovered patterns also provide an automated forecasting ability: we discuss an example of using discovered patterns coupled with a rich data environment to forecast various types of socio-political violence in nation-states.
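
    The core matching idea can be sketched as an ordered-subsequence test over an event stream, with forecasting reduced to checking whether every precursor event of a pattern has already occurred. This is a toy reading of the methodology, and the event names are invented:

```python
def contains_in_order(pattern, events):
    # True if pattern occurs as an ordered (not necessarily contiguous)
    # subsequence of the observed event stream.
    it = iter(events)
    return all(any(e == p for e in it) for p in pattern)

def forecast(pattern, events):
    # Forecast the pattern's terminal event once all precursor events
    # (every element but the last) have been observed in order.
    return contains_in_order(pattern[:-1], events)
```

    A tolerant variant would allow a bounded number of missing precursors, matching the paper's emphasis on robustness to observation noise and missing data.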

  17. Improving automation standards via semantic modelling: Application to ISA88.

    PubMed

    Dombayci, Canan; Farreres, Javier; Rodríguez, Horacio; Espuña, Antonio; Graells, Moisès

    2017-03-01

    Standardization is essential for automation. Extensibility, scalability, and reusability are important features for automation software that rely on the efficient modelling of the addressed systems. The work presented here is part of the ongoing development of a methodology for semi-automatic ontology construction from technical documents. The main aim of this work is to systematically check the consistency of technical documents and to support the improvement of technical document consistency. The formalization of conceptual models and the subsequent writing of technical standards are simultaneously analyzed, and guidelines are proposed for application to future technical standards. Three paradigms are discussed for the development of domain ontologies from technical documents, starting from the current state of the art, continuing with the intermediate method presented and used in this paper, and ending with the suggested paradigm for the future. The ISA88 Standard is taken as a representative case study. Linguistic techniques from the semi-automatic ontology construction methodology are applied to the ISA88 Standard, and different modelling and standardization aspects that are worth sharing with the automation community are addressed. This study discusses different paradigms for developing and sharing conceptual models for the subsequent development of automation software, along with presenting the systematic consistency-checking method. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  18. Improved compliance by BPM-driven workflow automation.

    PubMed

    Holzmüller-Laue, Silke; Göde, Bernd; Fleischer, Heidi; Thurow, Kerstin

    2014-12-01

    Using methods and technologies of business process management (BPM) for laboratory automation has important benefits (i.e., the agility of high-level automation processes, rapid interdisciplinary prototyping and implementation of laboratory tasks and procedures, and efficient real-time process documentation). A principal goal of model-driven development is the improved transparency of processes and the alignment of process diagrams and technical code. First experiences of using the business process model and notation (BPMN) show that easy-to-read graphical process models can achieve and provide standardization of laboratory workflows. Model-based development allows processes to be changed quickly and eases adaptation to changing requirements. The process models are able to host work procedures and their scheduling in compliance with predefined guidelines and policies. Finally, the process-controlled documentation of complex workflow results addresses modern laboratory needs for quality assurance. As an automation language able to control every kind of activity or subprocess, BPMN 2.0 addresses complete workflows in end-to-end relationships. BPMN is applicable as a system-independent and cross-disciplinary graphical language to document all methods in laboratories (i.e., screening procedures or analytical processes). With the BPM standard, a method of communicating and sharing laboratory process knowledge is thus also available. © 2014 Society for Laboratory Automation and Screening.

  19. Automation Applications in an Advanced Air Traffic Management System : Volume 5A. DELTA Simulation Model - User's Guide

    DOT National Transportation Integrated Search

    1974-08-01

    Volume 4 describes the automation requirements. A presentation of automation requirements is made for an advanced air traffic management system in terms of controller work force, computer resources, controller productivity, system manning, failure ef...

  20. A Model of Process-Based Automation: Cost and Quality Implications in the Medication Management Process

    ERIC Educational Resources Information Center

    Spaulding, Trent Joseph

    2011-01-01

    The objective of this research is to understand how a set of systems, as defined by the business process, creates value. The three studies contained in this work develop the model of process-based automation. The model states that complementarities among systems are specified by handoffs in the business process. The model also provides theory to…

  1. From Here to Autonomy.

    PubMed

    Endsley, Mica R

    2017-02-01

    As autonomous and semiautonomous systems are developed for automotive, aviation, cyber, robotics and other applications, the ability of human operators to effectively oversee and interact with them when needed poses a significant challenge. An automation conundrum exists: the more autonomy is added to a system, and the more its reliability and robustness increase, the lower the situation awareness of human operators and the less likely they are to be able to take over manual control when needed. The human-autonomy systems oversight model integrates several decades of relevant autonomy research on operator situation awareness, out-of-the-loop performance problems, monitoring, and trust, which are all major challenges underlying the automation conundrum. Key design interventions for improving human performance in interacting with autonomous systems are integrated in the model, including human-automation interface features and central automation interaction paradigms comprising levels of automation, adaptive automation, and granularity of control approaches. Recommendations for the design of human-autonomy interfaces are presented and directions for future research discussed.

  2. Task-focused modeling in automated agriculture

    NASA Astrophysics Data System (ADS)

    Vriesenga, Mark R.; Peleg, K.; Sklansky, Jack

    1993-01-01

    Machine vision systems analyze image data to carry out automation tasks. Our interest is in machine vision systems that rely on models to achieve their designed task. When the model is interrogated from an a priori menu of questions, the model need not be complete. Instead, the machine vision system can use a partial model that contains a large amount of information in regions of interest and less information elsewhere. We propose an adaptive modeling scheme for machine vision, called task-focused modeling, which constructs a model having just sufficient detail to carry out the specified task. The model is detailed in regions of interest to the task and is less detailed elsewhere. This focusing effect saves time and reduces the computational effort expended by the machine vision system. We illustrate task-focused modeling by an example involving real-time micropropagation of plants in automated agriculture.

  3. Space biology initiative program definition review. Trade study 1: Automation costs versus crew utilization

    NASA Technical Reports Server (NTRS)

    Jackson, L. Neal; Crenshaw, John, Sr.; Hambright, R. N.; Nedungadi, A.; Mcfayden, G. M.; Tsuchida, M. S.

    1989-01-01

    A significant emphasis upon automation within the Space Biology Initiative hardware appears justified in order to conserve crew labor and crew training effort. Two generic forms of automation were identified: automation of data and information handling and decision making, and the automation of material handling, transfer, and processing. The use of automatic data acquisition, expert systems, robots, and machine vision will increase the volume of experiments and quality of results. The automation described may also influence efforts to miniaturize and modularize the large array of SBI hardware identified to date. The cost and benefit model developed appears to be a useful guideline for SBI equipment specifiers and designers. Additional refinements would enhance the validity of the model. Two NASA automation pilot programs, 'The Principal Investigator in a Box' and 'Rack Mounted Robots' were investigated and found to be quite appropriate for adaptation to the SBI program. There are other in-house NASA efforts that provide technology that may be appropriate for the SBI program. Important data is believed to exist in advanced medical labs throughout the U.S., Japan, and Europe. The information and data processing in medical analysis equipment is highly automated and future trends reveal continued progress in this area. However, automation of material handling and processing has progressed in a limited manner because the medical labs are not affected by the power and space constraints that Space Station medical equipment is faced with. Therefore, NASA's major emphasis in automation will require a lead effort in the automation of material handling to achieve optimal crew utilization.

  4. Trust in automation: designing for appropriate reliance.

    PubMed

    Lee, John D; See, Katrina A

    2004-01-01

    Automation is often problematic because people fail to rely upon it appropriately. Because people respond to technology socially, trust influences reliance on automation. In particular, trust guides reliance when complexity and unanticipated situations make a complete understanding of the automation impractical. This review considers trust from the organizational, sociological, interpersonal, psychological, and neurological perspectives. It considers how the context, automation characteristics, and cognitive processes affect the appropriateness of trust. The context in which the automation is used influences automation performance and provides a goal-oriented perspective to assess automation characteristics along a dimension of attributional abstraction. These characteristics can influence trust through analytic, analogical, and affective processes. The challenges of extrapolating the concept of trust in people to trust in automation are discussed. A conceptual model integrates research regarding trust in automation and describes the dynamics of trust, the role of context, and the influence of display characteristics. Actual or potential applications of this research include improved designs of systems that require people to manage imperfect automation.

  5. Pilots' monitoring strategies and performance on automated flight decks: an empirical study combining behavioral and eye-tracking data.

    PubMed

    Sarter, Nadine B; Mumaw, Randall J; Wickens, Christopher D

    2007-06-01

    The objective of the study was to examine pilots' automation monitoring strategies and performance on highly automated commercial flight decks. A considerable body of research and operational experience has documented breakdowns in pilot-automation coordination on modern flight decks. These breakdowns are often considered symptoms of monitoring failures even though, to date, only limited and mostly anecdotal data exist concerning pilots' monitoring strategies and performance. Twenty experienced B-747-400 airline pilots flew a 1-hr scenario involving challenging automation-related events on a full-mission simulator. Behavioral, mental model, and eye-tracking data were collected. The findings from this study confirm that pilots monitor basic flight parameters to a much greater extent than visual indications of the automation configuration. More specifically, they frequently fail to verify manual mode selections or notice automatic mode changes. In other cases, they do not process mode annunciations in sufficient depth to understand their implications for aircraft behavior. Low system observability and gaps in pilots' understanding of complex automation modes were shown to contribute to these problems. Our findings describe and explain shortcomings in pilot's automation monitoring strategies and performance based on converging behavioral, eye-tracking, and mental model data. They confirm that monitoring failures are one major contributor to breakdowns in pilot-automation interaction. The findings from this research can inform the design of improved training programs and automation interfaces that support more effective system monitoring.

  6. Breast MR segmentation and lesion detection with cellular neural networks and 3D template matching.

    PubMed

    Ertaş, Gökhan; Gülçür, H Ozcan; Osman, Onur; Uçan, Osman N; Tunaci, Mehtap; Dursun, Memduh

    2008-01-01

    A novel fully automated system is introduced to facilitate lesion detection in dynamic contrast-enhanced magnetic resonance mammography (DCE-MRM). The system extracts breast regions from pre-contrast images using a cellular neural network, generates normalized maximum intensity-time ratio (nMITR) maps and performs 3D template matching with three layers of 12×12 cells to detect lesions. A breast is considered to be properly segmented when relative overlap >0.85 and misclassification rate <0.10. Sensitivity and false-positive rate per slice and per lesion are used to assess detection performance. The system was tested with a dataset of 2064 breast MR images (344 slices × 6 acquisitions over time) from 19 women containing 39 marked lesions. Ninety-seven percent of the breasts were segmented properly and all the lesions were detected correctly (detection sensitivity = 100%); however, there were some false-positive detections (31%/lesion, 10%/slice).
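
    The abstract does not define nMITR exactly; one plausible per-pixel reading (maximum post-contrast signal increase over baseline, normalized by the frame index at which it occurs) can be sketched as follows. The formula and all names here are our assumption, not the authors' definition:

```python
def nmitr_map(series, eps=1e-6):
    # series: list of 2-D images (lists of rows) over time; series[0] is
    # assumed to be the pre-contrast baseline frame.  For each pixel, take
    # the maximum of (signal increase / baseline) divided by the frame
    # index at which it is measured -- our reading of "maximum
    # intensity-time ratio".
    base = series[0]
    rows, cols = len(base), len(base[0])
    out = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            b = base[r][c] + eps          # avoid division by zero
            out[r][c] = max((series[t][r][c] - base[r][c]) / (b * t)
                            for t in range(1, len(series)))
    return out
```

    Lesions, which enhance strongly and early, would then stand out as high-valued blobs that the 3D template-matching stage scans for.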

  7. Automated High-Throughput Quantification of Mitotic Spindle Positioning from DIC Movies of Caenorhabditis Embryos

    PubMed Central

    Cluet, David; Spichty, Martin; Delattre, Marie

    2014-01-01

    The mitotic spindle is a microtubule-based structure that elongates to accurately segregate chromosomes during anaphase. Its position within the cell also dictates the future cell cleavage plane, thereby determining daughter cell orientation within a tissue or cell fate adoption for polarized cells. The mitotic spindle therefore ensures both proper cell division and developmental precision. Consequently, spindle dynamics is the subject of intensive research. Among the different cellular models that have been explored, the one-cell stage C. elegans embryo has been an essential and powerful system for dissecting the molecular and biophysical basis of spindle elongation and positioning. Indeed, in this large and transparent cell, spindle poles (or centrosomes) can easily be detected by eye from simple DIC microscopy. To perform quantitative and high-throughput analysis of spindle motion, we developed ACT, a computer program for Automated Centrosome Tracking from DIC movies of C. elegans embryos. We therefore offer an alternative to the image acquisition and processing of transgenic lines expressing fluorescent spindle markers. Consequently, experiments on large sets of cells can be performed with a simple setup using inexpensive microscopes. Moreover, analysis of any mutant or wild-type background is accessible because laborious rounds of crosses with transgenic lines become unnecessary. Last, our program allows spindle detection in other nematode species that offer the same quality of DIC images but for which transgenesis techniques are not accessible. Thus, our program also opens the way towards a quantitative evolutionary approach to spindle dynamics. Overall, our computer program is a unique macro for the image- and movie-processing platform ImageJ. It is user-friendly and freely available under an open-source licence. ACT allows batch-wise analysis of large sets of mitosis events. Within 2 minutes, a single movie is processed, and the accuracy of the automated tracking matches the precision of the human eye. PMID:24763198

  8. Potential of Laboratory Execution Systems (LESs) to Simplify the Application of Business Process Management Systems (BPMSs) in Laboratory Automation.

    PubMed

    Neubert, Sebastian; Göde, Bernd; Gu, Xiangyu; Stoll, Norbert; Thurow, Kerstin

    2017-04-01

    Modern business process management (BPM) is increasingly interesting for laboratory automation. End-to-end workflow automation and improved top-level systems integration for information technology (IT) and automation systems are especially prominent objectives. The ISO-standardized Business Process Model and Notation (BPMN) 2.X provides a system-independent graphical process control notation that is accepted across disciplines and that supports process analysis while also being executable. Transferring BPM solutions to structured laboratory automation places novel demands, for example concerning real-time-critical process and systems integration. The article discusses the potential of laboratory execution systems (LESs) for an easier implementation of a business process management system (BPMS) in hierarchical laboratory automation. In particular, complex application scenarios, including long process chains based on, for example, several distributed automation islands and mobile laboratory robots for material transport, are difficult to handle in BPMSs. The presented approach deals with the displacement of workflow control tasks into life-science-specialized LESs, the reduction of the numerous different interfaces between BPMSs and subsystems, and the simplification of complex process models. Thus, the integration effort for complex laboratory workflows can be significantly reduced for strictly structured automation solutions. An example application, consisting of a mixture of manual and automated subprocesses, is demonstrated using the presented BPMS-LES approach.
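The interface-reduction idea above can be sketched as a facade: the BPMS calls one LES object instead of every device, and the LES sequences the real-time sub-steps itself. All class, method, and device names here are invented for illustration; a real LES would wrap vendor drivers and scheduling logic.

```python
# Hedged sketch: one LES facade between the BPMS and many lab subsystems.

class Device:
    """Stand-in for a laboratory subsystem (pipettor, transport robot, reader...)."""
    def __init__(self, name):
        self.name = name
        self.log = []
    def run(self, step):
        self.log.append(step)
        return f"{self.name}:{step}:done"

class LabExecutionSystem:
    """Single interface the BPMS calls; internally drives many subsystems."""
    def __init__(self, devices):
        self.devices = {d.name: d for d in devices}
    def execute_subprocess(self, plan):
        # plan: ordered (device, step) pairs forming one laboratory sub-workflow
        return [self.devices[dev].run(step) for dev, step in plan]

les = LabExecutionSystem([Device("pipettor"), Device("robot"), Device("reader")])
results = les.execute_subprocess([("pipettor", "dispense"),
                                  ("robot", "transport"),
                                  ("reader", "measure")])
```

The BPMS-side process model then needs only one service task ("execute sub-workflow on LES") rather than one interface per device.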

  9. AutoQSAR: an automated machine learning tool for best-practice quantitative structure-activity relationship modeling.

    PubMed

    Dixon, Steven L; Duan, Jianxin; Smith, Ethan; Von Bargen, Christopher D; Sherman, Woody; Repasky, Matthew P

    2016-10-01

    We introduce AutoQSAR, an automated machine-learning application to build, validate and deploy quantitative structure-activity relationship (QSAR) models. The process of descriptor generation, feature selection and the creation of a large number of QSAR models has been automated into a single workflow within AutoQSAR. The models are built using a variety of machine-learning methods, and each model is scored using a novel approach. Effectiveness of the method is demonstrated through comparison with literature QSAR models using identical datasets for six endpoints: protein-ligand binding affinity, solubility, blood-brain barrier permeability, carcinogenicity, mutagenicity and bioaccumulation in fish. AutoQSAR demonstrates predictive performance similar to or better than published results for four of the six endpoints while requiring minimal human time and expertise.
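The core loop described above, building many candidate models and keeping the best-scoring one, can be sketched with toy learners. This is not Schrödinger's implementation: the two candidate "models" (a mean baseline and a 1-nearest-neighbour regressor on a single descriptor) and the mean-absolute-error score are stand-ins for the real descriptor sets, learners, and scoring function.

```python
# Minimal sketch of an AutoQSAR-style build-score-select workflow (illustrative only).

def mean_model(train):
    """Baseline: always predict the training-set mean activity."""
    m = sum(y for _, y in train) / len(train)
    return lambda x: m

def knn_model(train, k):
    """k-nearest-neighbour regression on one descriptor value."""
    def predict(x):
        nearest = sorted(train, key=lambda p: abs(p[0] - x))[:k]
        return sum(y for _, y in nearest) / k
    return predict

def auto_select(data, builders):
    """Build every candidate model, score each on a held-out split, return the best name."""
    split = int(0.7 * len(data))
    train, valid = data[:split], data[split:]
    def mae(model):
        return sum(abs(model(x) - y) for x, y in valid) / len(valid)
    models = [(name, build(train)) for name, build in builders]
    return min(models, key=lambda nm: mae(nm[1]))[0]

# Toy dataset where activity tracks the descriptor, so the k-NN model should win.
data = [(float(i), float(i)) for i in range(10)]
builders = [("mean", mean_model), ("knn1", lambda train: knn_model(train, 1))]
```
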

  10. Computational Model of Secondary Palate Fusion and Disruption

    EPA Science Inventory

    Morphogenetic events are driven by cell-generated physical forces and complex cellular dynamics. To improve our capacity to predict developmental effects from cellular alterations, we built a multi-cellular agent-based model in CompuCell3D that recapitulates the cellular networks...

  11. Quantifying autonomous vehicles national fuel consumption impacts: A data-rich approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yuche; Gonder, Jeffrey; Young, Stanley

    Autonomous vehicles are drawing significant attention from governments, manufacturers and consumers. Experts predict them to be the primary means of transportation by the middle of this century. Recent literature shows that vehicle automation has the potential to alter traffic patterns, vehicle ownership, and land use, which may affect fuel consumption from the transportation sector. In this paper, we developed a data-rich analytical framework to quantify system-wide fuel impacts of automation in the United States by integrating (1) a dynamic vehicle sales, stock, and usage model, (2) a historical transportation network-level vehicle miles traveled (VMT)/vehicle activity database, and (3) estimates of automation's impacts on fuel efficiency and travel demand. The vehicle model considers dynamics in vehicle fleet turnover and fuel efficiency improvements of the conventional and advanced vehicle fleets. The network activity database contains VMT, free-flow speeds, and historical speeds of road links that can help us accurately identify fuel-savings opportunities of automation. Based on the model setup and assumptions, we found that the impacts of automation on fuel consumption are quite wide-ranging, with the potential to reduce fuel consumption by 45% in our 'Optimistic' case or increase it by 30% in our 'Pessimistic' case. Second, implementing automation on urban roads could potentially result in larger fuel savings compared with highway automation because of the driving features of urban roads. Lastly, through scenario analysis, we showed that the proposed framework can be used for refined assessments as better data on vehicle-level fuel efficiency and travel demand impacts of automation become available.

  12. Quantifying autonomous vehicles national fuel consumption impacts: A data-rich approach

    DOE PAGES

    Chen, Yuche; Gonder, Jeffrey; Young, Stanley; ...

    2017-11-06

    Autonomous vehicles are drawing significant attention from governments, manufacturers and consumers. Experts predict them to be the primary means of transportation by the middle of this century. Recent literature shows that vehicle automation has the potential to alter traffic patterns, vehicle ownership, and land use, which may affect fuel consumption from the transportation sector. In this paper, we developed a data-rich analytical framework to quantify system-wide fuel impacts of automation in the United States by integrating (1) a dynamic vehicle sales, stock, and usage model, (2) a historical transportation network-level vehicle miles traveled (VMT)/vehicle activity database, and (3) estimates of automation's impacts on fuel efficiency and travel demand. The vehicle model considers dynamics in vehicle fleet turnover and fuel efficiency improvements of the conventional and advanced vehicle fleets. The network activity database contains VMT, free-flow speeds, and historical speeds of road links that can help us accurately identify fuel-savings opportunities of automation. Based on the model setup and assumptions, we found that the impacts of automation on fuel consumption are quite wide-ranging, with the potential to reduce fuel consumption by 45% in our 'Optimistic' case or increase it by 30% in our 'Pessimistic' case. Second, implementing automation on urban roads could potentially result in larger fuel savings compared with highway automation because of the driving features of urban roads. Lastly, through scenario analysis, we showed that the proposed framework can be used for refined assessments as better data on vehicle-level fuel efficiency and travel demand impacts of automation become available.
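The scenario arithmetic underlying such an assessment can be sketched in a few lines: fuel use is adjusted VMT divided by adjusted fuel economy, summed over road types. The VMT, mpg, and multiplier values below are illustrative placeholders, not the study's calibrated inputs.

```python
# Back-of-envelope sketch of the scenario logic (all numbers are invented placeholders).

def fleet_fuel(vmt_by_road, mpg_by_road, efficiency_multiplier, demand_multiplier):
    """Annual fuel (gallons) = sum over road types of adjusted VMT / adjusted mpg."""
    total = 0.0
    for road, vmt in vmt_by_road.items():
        total += (vmt * demand_multiplier) / (mpg_by_road[road] * efficiency_multiplier[road])
    return total

vmt = {"urban": 2e12, "highway": 1e12}       # annual miles by road type
mpg = {"urban": 22.0, "highway": 30.0}       # baseline fuel economy

# No automation effects.
base = fleet_fuel(vmt, mpg, {"urban": 1.0, "highway": 1.0}, 1.0)

# Optimistic-style case: efficiency gains (larger on urban roads) outweigh
# a small reduction in travel demand.
optimistic = fleet_fuel(vmt, mpg, {"urban": 1.4, "highway": 1.2}, 0.95)
```

Varying the multipliers over a grid is one simple way to reproduce the wide Optimistic-to-Pessimistic range the abstract reports.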

  13. Learning a cost function for microscope image segmentation.

    PubMed

    Nilufar, Sharmin; Perkins, Theodore J

    2014-01-01

    Quantitative analysis of microscopy images is increasingly important in clinical researchers' efforts to unravel the cellular and molecular determinants of disease, and for pathological analysis of tissue samples. Yet, manual segmentation and measurement of cells or other features in images remains the norm in many fields. We report on a new system that aims for robust and accurate semi-automated analysis of microscope images. A user interactively outlines one or more examples of a target object in a training image. We then learn a cost function for detecting more objects of the same type, either in the same or different images. The cost function is incorporated into an active contour model, which can efficiently determine optimal boundaries by dynamic programming. We validate our approach and compare it to some standard alternatives on three different types of microscopic images: light microscopy of blood cells, light microscopy of muscle tissue sections, and electron microscopy cross-sections of axons and their myelin sheaths.
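The phrase "efficiently determine optimal boundaries by dynamic programming" can be illustrated with a minimal DP that threads a least-cost path left-to-right through a learned cost map, moving at most one row per column. This discretized-search sketch is ours; the paper's actual active contour model and learned cost function are richer.

```python
# Minimal dynamic program for a least-cost boundary through a cost map (illustrative).

def min_cost_boundary(cost):
    """cost[r][c]: cost of the boundary passing through row r at column c.
    Returns (total_cost, row_per_column) for the cheapest left-to-right path
    that moves at most one row between adjacent columns."""
    n, m = len(cost), len(cost[0])
    dp = [[0.0] * m for _ in range(n)]
    back = [[0] * m for _ in range(n)]
    for r in range(n):
        dp[r][0] = cost[r][0]
    for c in range(1, m):
        for r in range(n):
            # cheapest predecessor among the (up to) three reachable rows
            best = min(range(max(0, r - 1), min(n, r + 2)), key=lambda pr: dp[pr][c - 1])
            dp[r][c] = cost[r][c] + dp[best][c - 1]
            back[r][c] = best
    end = min(range(n), key=lambda r: dp[r][m - 1])
    path = [end]
    for c in range(m - 1, 0, -1):
        path.append(back[path[-1]][c])
    path.reverse()
    return dp[end][m - 1], path
```

Because every column is solved from the previous one, the optimum is exact in O(rows x columns) time, which is what makes DP attractive for boundary extraction.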

  14. Real-time metabolome profiling of the metabolic switch between starvation and growth.

    PubMed

    Link, Hannes; Fuhrer, Tobias; Gerosa, Luca; Zamboni, Nicola; Sauer, Uwe

    2015-11-01

    Metabolic systems are often the first networks to respond to environmental changes, and the ability to monitor metabolite dynamics is key for understanding these cellular responses. Because monitoring metabolome changes is experimentally tedious and demanding, dynamic data on time scales from seconds to hours are scarce. Here we describe real-time metabolome profiling by direct injection of living bacteria, yeast or mammalian cells into a high-resolution mass spectrometer, which enables automated monitoring of about 300 compounds in 15-30-s cycles over several hours. We observed accumulation of energetically costly biomass metabolites in Escherichia coli in carbon starvation-induced stationary phase, as well as the rapid use of these metabolites upon growth resumption. By combining real-time metabolome profiling with modeling and inhibitor experiments, we obtained evidence for switch-like feedback inhibition in amino acid biosynthesis and for control of substrate availability through the preferential use of the metabolically cheaper one-step salvaging pathway over costly ten-step de novo purine biosynthesis during growth resumption.
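The "switch-like feedback inhibition" inferred above can be caricatured with a Hill-type rate law: production is nearly constant until the end product passes a threshold, then shuts off steeply. The equation form, parameter values, and the simple Euler simulation are all illustrative, not fitted to the paper's data.

```python
# Illustrative-only model of switch-like end-product feedback inhibition.

def production_rate(product, vmax=1.0, ki=1.0, hill=8):
    """Hill-type inhibition: high cooperativity (hill=8) makes the response switch-like."""
    return vmax / (1.0 + (product / ki) ** hill)

def simulate(steps, dt=0.01, dilution=0.1):
    """Euler integration of d[P]/dt = production(P) - dilution * P from P = 0."""
    p = 0.0
    trace = []
    for _ in range(steps):
        p += dt * (production_rate(p) - dilution * p)
        trace.append(p)
    return trace
```

The product rises quickly, then the steep inhibition clamps it near the threshold, the qualitative behaviour a switch-like feedback is invoked to explain.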

  15. Evaluating a variety of text-mined features for automatic protein function prediction with GOstruct.

    PubMed

    Funk, Christopher S; Kahanda, Indika; Ben-Hur, Asa; Verspoor, Karin M

    2015-01-01

    Most computational methods that predict protein function do not take advantage of the large amount of information contained in the biomedical literature. In this work we evaluate both ontology term co-mention and bag-of-words features mined from the biomedical literature and analyze their impact in the context of a structured output support vector machine model, GOstruct. We find that even simple literature-based features are useful for predicting human protein function (F-max: Molecular Function = 0.408, Biological Process = 0.461, Cellular Component = 0.608). One advantage of using literature features is their ability to offer easy verification of automated predictions. We find through manual inspection of misclassifications that some false positive predictions could be biologically valid predictions based upon support extracted from the literature. Additionally, we present a "medium-throughput" pipeline that was used to annotate a large subset of co-mentions; we suggest that this strategy could help to speed up the rate at which proteins are curated.
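The two literature feature types evaluated above can be illustrated in miniature: bag-of-words counts over a vocabulary, and counts of sentences where a protein and a GO term are mentioned together. The tokenization (plain substring and whitespace matching) is a deliberate simplification of real text-mining pipelines.

```python
# Toy versions of the two literature feature types (illustrative simplifications).

def bag_of_words(text, vocab):
    """Count occurrences of each vocabulary word in a lowercased, whitespace-split text."""
    words = text.lower().split()
    return [words.count(v) for v in vocab]

def co_mentions(sentences, protein, go_terms):
    """Count sentences mentioning both the protein and each GO term."""
    counts = {t: 0 for t in go_terms}
    for s in sentences:
        low = s.lower()
        if protein.lower() in low:
            for t in go_terms:
                if t.lower() in low:
                    counts[t] += 1
    return counts

sentences = ["BRCA1 localizes to the nucleus.",
             "BRCA1 is involved in DNA repair.",
             "Actin is cytoplasmic."]
```

Feature vectors like these would then be fed to the structured classifier; their interpretability is what makes manual verification of predictions easy.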

  16. Combining deep learning and coherent anti-Stokes Raman scattering imaging for automated differential diagnosis of lung cancer.

    PubMed

    Weng, Sheng; Xu, Xiaoyun; Li, Jiasong; Wong, Stephen T C

    2017-10-01

    Lung cancer is the most prevalent type of cancer and the leading cause of cancer-related deaths worldwide. Coherent anti-Stokes Raman scattering (CARS) is capable of providing cellular-level images and resolving pathologically related features on human lung tissues. However, conventional means of analyzing CARS images requires extensive image processing, feature engineering, and human intervention. This study demonstrates the feasibility of applying a deep learning algorithm to automatically differentiate normal and cancerous lung tissue images acquired by CARS. We leverage the features learned by pretrained deep neural networks and retrain the model using CARS images as the input. We achieve 89.2% accuracy in classifying normal, small-cell carcinoma, adenocarcinoma, and squamous cell carcinoma lung images. This computational method is a step toward on-the-spot diagnosis of lung cancer and can be further strengthened by the efforts aimed at miniaturizing the CARS technique for fiber-based microendoscopic imaging. (2017) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
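The "leverage the features learned by pretrained deep neural networks and retrain the model" strategy can be shown conceptually: a frozen feature extractor is reused unchanged and only the final classifier is fit to the new images. Here the frozen extractor is faked with two hand-chosen image statistics and the classifier is nearest-centroid; a real pipeline would use CNN activations and the class labels below ("normal"/"tumour") are invented.

```python
# Conceptual sketch of transfer learning with a frozen feature extractor (not the paper's CNN).

def frozen_features(image):
    """Stand-in for pretrained-network features: mean intensity and a crude
    texture measure (mean absolute horizontal gradient)."""
    flat = [p for row in image for p in row]
    mean = sum(flat) / len(flat)
    grad = sum(abs(row[i + 1] - row[i]) for row in image for i in range(len(row) - 1))
    grad /= sum(len(row) - 1 for row in image)
    return (mean, grad)

def train_centroids(labeled_images):
    """'Retrain the last stage': fit per-class centroids in the frozen feature space."""
    sums = {}
    for img, label in labeled_images:
        f = frozen_features(img)
        s = sums.setdefault(label, [0.0, 0.0, 0])
        s[0] += f[0]; s[1] += f[1]; s[2] += 1
    return {lab: (s[0] / s[2], s[1] / s[2]) for lab, s in sums.items()}

def classify(img, centroids):
    f = frozen_features(img)
    return min(centroids, key=lambda lab: (f[0] - centroids[lab][0]) ** 2
                                          + (f[1] - centroids[lab][1]) ** 2)

train = [([[10, 10], [10, 10]], "normal"), ([[0, 20], [20, 0]], "tumour")]
centroids = train_centroids(train)
```

Only `train_centroids` depends on the new data, which is the efficiency argument for retraining on top of pretrained features.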

  17. Bioengineering Solutions for Manufacturing Challenges in CAR T Cells

    PubMed Central

    Piscopo, Nicole J.; Mueller, Katherine P.; Das, Amritava; Hematti, Peiman; Murphy, William L.; Palecek, Sean P.; Capitini, Christian M.

    2017-01-01

    The next generation of therapeutic products to be approved for the clinic is anticipated to be cell therapies, termed “living drugs” for their capacity to dynamically and temporally respond to changes during their production ex vivo and after their administration in vivo. Genetically engineered chimeric antigen receptor (CAR) T cells have rapidly developed into powerful tools to harness the power of immune system manipulation against cancer. Regulatory agencies are beginning to approve CAR T cell therapies due to their striking efficacy in treating some hematological malignancies. However, the engineering and manufacturing of such cells remains a challenge for widespread adoption of this technology. Bioengineering approaches including biomaterials, synthetic biology, metabolic engineering, process control and automation, and in vitro disease modeling could offer promising methods to overcome some of these challenges. Here, we describe the manufacturing process of CAR T cells, highlighting potential roles for bioengineers to partner with biologists and clinicians to advance the manufacture of these complex cellular products under rigorous regulatory and quality control. PMID:28840981

  18. A catalog of automated analysis methods for enterprise models.

    PubMed

    Florez, Hector; Sánchez, Mario; Villalobos, Jorge

    2016-01-01

    Enterprise models are created for documenting and communicating the structure and state of Business and Information Technologies elements of an enterprise. After models are completed, they are mainly used to support analysis. Model analysis is an activity typically based on human skills and due to the size and complexity of the models, this process can be complicated and omissions or miscalculations are very likely. This situation has fostered the research of automated analysis methods, for supporting analysts in enterprise analysis processes. By reviewing the literature, we found several analysis methods; nevertheless, they are based on specific situations and different metamodels; then, some analysis methods might not be applicable to all enterprise models. This paper presents the work of compilation (literature review), classification, structuring, and characterization of automated analysis methods for enterprise models, expressing them in a standardized modeling language. In addition, we have implemented the analysis methods in our modeling tool.

  19. Modeling Multiple Human-Automation Distributed Systems using Network-form Games

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume

    2012-01-01

    The paper describes at a high-level the network-form game framework (based on Bayes net and game theory), which can be used to model and analyze safety issues in large, distributed, mixed human-automation systems such as NextGen.

  20. Automated Cellient(™) cytoblocks: better, stronger, faster?

    PubMed

    Prendeville, S; Brosnan, T; Browne, T J; McCarthy, J

    2014-12-01

    Cytoblocks (CBs), or cell blocks, provide additional morphological detail and a platform for immunocytochemistry (ICC) in cytopathology. The Cellient(™) system produces CBs in 45 minutes using methanol fixation, compared with traditional CBs, which require overnight formalin fixation. This study compares Cellient and traditional CB methods in terms of cellularity, morphology and immunoreactivity, evaluates the potential to add formalin fixation to the Cellient method for ICC studies and determines the optimal sectioning depth for maximal cellularity in Cellient CBs. One hundred and sixty CBs were prepared from 40 cytology samples (32 malignant, eight benign) using four processing methods: (A) traditional; (B) Cellient (methanol fixation); (C) Cellient using additional formalin fixation for 30 minutes; (D) Cellient using additional formalin fixation for 60 minutes. Haematoxylin and eosin-stained sections were assessed for cellularity and morphology. ICC was assessed on 14 cases with a panel of antibodies. Three additional Cellient samples were serially sectioned to determine the optimal sectioning depth. Scoring was performed by two independent, blinded reviewers. For malignant cases, morphology was superior with Cellient relative to traditional CBs (P < 0.001). Cellularity was comparable across all methods. ICC was excellent in all groups and the addition of formalin at any stage during the Cellient process did not influence the staining quality. Serial sectioning through Cellient CBs showed optimum cellularity at 30-40 μm with at least 27 sections obtainable. Cellient CBs provide superior morphology to traditional CBs and, if required, formalin fixation may be added to the Cellient process for ICC. Optimal Cellient CB cellularity is achieved at 30-40 μm, which will impact on the handling of cases in daily practice. © 2014 John Wiley & Sons Ltd.

  1. Dynamic imaging of adaptive stress response pathway activation for prediction of drug induced liver injury.

    PubMed

    Wink, Steven; Hiemstra, Steven W; Huppelschoten, Suzanne; Klip, Janna E; van de Water, Bob

    2018-05-01

    Drug-induced liver injury (DILI) remains a concern during drug treatment and development. There is an urgent need for improved mechanistic understanding and prediction of DILI liabilities using in vitro approaches. We have established and characterized a panel of liver cell models containing mechanism-based fluorescent protein toxicity pathway reporters to quantitatively assess the dynamics of cellular stress response pathway activation at the single-cell level using automated live-cell imaging. We have systematically evaluated the application of four key adaptive stress pathway reporters for the prediction of DILI liability: SRXN1-GFP (oxidative stress), CHOP-GFP (ER stress/UPR response), p21 (p53-mediated DNA damage-related response) and ICAM1 (NF-κB-mediated inflammatory signaling). 118 FDA-labeled drugs at five human-exposure-relevant concentrations were evaluated for reporter activation using live-cell confocal imaging. Quantitative data analysis revealed activation of single or multiple reporters by most drugs in a concentration- and time-dependent manner. Hierarchical clustering of time-course dynamics and refined single-cell analysis allowed the identification of key events in DILI liability. Concentration-response modeling was performed to calculate benchmark concentrations (BMCs). Extracted temporal dynamic parameters and BMCs were used to assess the predictive power of sub-lethal adaptive stress pathway activation. Although cellular adaptive responses were activated by non-DILI and severe-DILI compounds alike, the dynamic behavior and lower BMCs of pathway activation were sufficiently distinct between these compound classes. The high-level detailed temporal- and concentration-dependent evaluation of the dynamics of adaptive stress pathway activation adds to the overall understanding and prediction of drug-induced liver liabilities.
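The benchmark-concentration idea can be illustrated with the simplest possible estimator: find the tested concentration interval in which the response first crosses the benchmark level and interpolate linearly. Real BMC estimation fits a full concentration-response model with confidence bounds; this is only the geometric intuition.

```python
# Illustrative linear-interpolation BMC estimator (real analyses fit a full model).

def benchmark_concentration(concs, responses, benchmark):
    """Return the concentration at which the (monotone) response first reaches
    `benchmark`, by linear interpolation between tested concentrations; None if
    the benchmark is never reached."""
    pairs = list(zip(concs, responses))
    for (c0, r0), (c1, r1) in zip(pairs, pairs[1:]):
        if r0 < benchmark <= r1:
            return c0 + (c1 - c0) * (benchmark - r0) / (r1 - r0)
    return None
```

A lower BMC then means the pathway reporter activates at lower exposure, the property the study uses to separate DILI from non-DILI compounds.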

  2. An Investigation of the "e-rater"® Automated Scoring Engine's Grammar, Usage, Mechanics, and Style Microfeatures and Their Aggregation Model. Research Report. ETS RR-17-04

    ERIC Educational Resources Information Center

    Chen, Jing; Zhang, Mo; Bejar, Isaac I.

    2017-01-01

    Automated essay scoring (AES) generally computes essay scores as a function of macrofeatures derived from a set of microfeatures extracted from the text using natural language processing (NLP). In the "e-rater"® automated scoring engine, developed at "Educational Testing Service" (ETS) for the automated scoring of essays, each…

  3. Automated eukaryotic gene structure annotation using EVidenceModeler and the Program to Assemble Spliced Alignments

    PubMed Central

    Haas, Brian J; Salzberg, Steven L; Zhu, Wei; Pertea, Mihaela; Allen, Jonathan E; Orvis, Joshua; White, Owen; Buell, C Robin; Wortman, Jennifer R

    2008-01-01

    EVidenceModeler (EVM) is presented as an automated eukaryotic gene structure annotation tool that reports eukaryotic gene structures as a weighted consensus of all available evidence. EVM, when combined with the Program to Assemble Spliced Alignments (PASA), yields a comprehensive, configurable annotation system that predicts protein-coding genes and alternatively spliced isoforms. Our experiments on both rice and human genome sequences demonstrate that EVM produces automated gene structure annotation approaching the quality of manual curation. PMID:18190707
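The "weighted consensus of all available evidence" can be sketched as interval scoring: each evidence source proposes exon intervals with a source weight, and candidate gene structures are ranked by total weighted overlap with the evidence. The scoring rule and weights here are invented for illustration; EVM's actual consensus algorithm is considerably more elaborate.

```python
# Toy weighted-evidence scoring for candidate gene structures (illustrative only).

def weighted_support(candidate_exons, evidence):
    """evidence: list of (start, end, weight) intervals from different sources.
    Score = sum of weight * overlap length over all (exon, evidence) pairs."""
    score = 0.0
    for cs, ce in candidate_exons:
        for es, ee, w in evidence:
            overlap = max(0, min(ce, ee) - max(cs, es))
            score += w * overlap
    return score

def best_structure(candidates, evidence):
    """Pick the candidate structure with the most weighted evidence support."""
    return max(candidates, key=lambda c: weighted_support(c, evidence))

# Two evidence tracks: a high-weight ab initio exon and a weaker aligned-EST exon.
evidence = [(0, 100, 2.0), (150, 200, 1.0)]
cand_a = [(0, 100)]               # single-exon candidate
cand_b = [(0, 100), (150, 200)]   # two-exon candidate
```
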

  4. Improved accuracy in ground-based facilities - Development of an automated film-reading system for ballistic ranges

    NASA Technical Reports Server (NTRS)

    Yates, Leslie A.

    1992-01-01

    Software for an automated film-reading system that uses personal computers and digitized shadowgraphs is described. The software identifies pixels associated with fiducial-line and model images, and least-squares procedures are used to calculate the positions and orientations of the images. Automated position and orientation readings for sphere and cone models are compared to those obtained using a manual film reader. When facility calibration errors are removed from these readings, the accuracy of the automated readings is better than the pixel resolution, and it is equal to, or better than, that of the manual readings. The effects of film-reading and facility-calibration errors on calculated aerodynamic coefficients are discussed.
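The least-squares step for a fiducial line can be shown directly: fit a line to the identified pixel coordinates and take the orientation from the slope. This is a generic least-squares sketch, not the system's actual code, and it assumes the line is far from vertical.

```python
# Generic least-squares line fit to digitized pixel coordinates (illustrative).
import math

def fit_line(points):
    """Ordinary least squares y = slope * x + intercept over (x, y) pixel points."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

def orientation_deg(points):
    """Orientation of the fitted line relative to the x-axis, in degrees."""
    slope, _ = fit_line(points)
    return math.degrees(math.atan(slope))
```

Averaging over many pixels along the line is what pushes the reading accuracy below single-pixel resolution, as the abstract reports.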

  5. Robot graphic simulation testbed

    NASA Technical Reports Server (NTRS)

    Cook, George E.; Sztipanovits, Janos; Biegl, Csaba; Karsai, Gabor; Springfield, James F.

    1991-01-01

    The objective of this research was twofold. First, the basic capabilities of ROBOSIM (graphical simulation system) were improved and extended by taking advantage of advanced graphic workstation technology and artificial intelligence programming techniques. Second, the scope of the graphic simulation testbed was extended to include general problems of Space Station automation. Hardware support for 3-D graphics and high processing performance make high resolution solid modeling, collision detection, and simulation of structural dynamics computationally feasible. The Space Station is a complex system with many interacting subsystems. Design and testing of automation concepts demand modeling of the affected processes, their interactions, and that of the proposed control systems. The automation testbed was designed to facilitate studies in Space Station automation concepts.

  6. Understanding human management of automation errors

    PubMed Central

    McBride, Sara E.; Rogers, Wendy A.; Fisk, Arthur D.

    2013-01-01

    Automation has the potential to aid humans with a diverse set of tasks and support overall system performance. Automated systems are not always reliable, and when automation errs, humans must engage in error management, which is the process of detecting, understanding, and correcting errors. However, this process of error management in the context of human-automation interaction is not well understood. Therefore, we conducted a systematic review of the variables that contribute to error management. We examined relevant research in human-automation interaction and human error to identify critical automation, person, task, and emergent variables. We propose a framework for management of automation errors to incorporate and build upon previous models. Further, our analysis highlights variables that may be addressed through design and training to positively influence error management. Additional efforts to understand the error management process will contribute to automation designed and implemented to support safe and effective system performance. PMID:25383042

  7. Understanding human management of automation errors.

    PubMed

    McBride, Sara E; Rogers, Wendy A; Fisk, Arthur D

    2014-01-01

    Automation has the potential to aid humans with a diverse set of tasks and support overall system performance. Automated systems are not always reliable, and when automation errs, humans must engage in error management, which is the process of detecting, understanding, and correcting errors. However, this process of error management in the context of human-automation interaction is not well understood. Therefore, we conducted a systematic review of the variables that contribute to error management. We examined relevant research in human-automation interaction and human error to identify critical automation, person, task, and emergent variables. We propose a framework for management of automation errors to incorporate and build upon previous models. Further, our analysis highlights variables that may be addressed through design and training to positively influence error management. Additional efforts to understand the error management process will contribute to automation designed and implemented to support safe and effective system performance.

  8. Pilot interaction with automated airborne decision making systems

    NASA Technical Reports Server (NTRS)

    Rouse, W. B.; Chu, Y. Y.; Greenstein, J. S.; Walden, R. S.

    1976-01-01

    An investigation was made of interaction between a human pilot and automated on-board decision making systems. Research was initiated on the topic of pilot problem solving in automated and semi-automated flight management systems and attempts were made to develop a model of human decision making in a multi-task situation. A study was made of allocation of responsibility between human and computer, and discussed were various pilot performance parameters with varying degrees of automation. Optimal allocation of responsibility between human and computer was considered and some theoretical results found in the literature were presented. The pilot as a problem solver was discussed. Finally the design of displays, controls, procedures, and computer aids for problem solving tasks in automated and semi-automated systems was considered.

  9. The Automation of Nowcast Model Assessment Processes

    DTIC Science & Technology

    2016-09-01

    that will automate real-time WRE-N model simulations, collect and quality-control check weather observations for assimilation and verification, and...domains centered near White Sands Missile Range, New Mexico, where the Meteorological Sensor Array (MSA) will be located. The MSA will provide...observations and performing quality-control checks for the pre-forecast data assimilation period. 2. Run the WRE-N model to generate model forecast data

  10. General Models for Automated Essay Scoring: Exploring an Alternative to the Status Quo

    ERIC Educational Resources Information Center

    Kelly, P. Adam

    2005-01-01

    Powers, Burstein, Chodorow, Fowles, and Kukich (2002) suggested that automated essay scoring (AES) may benefit from the use of "general" scoring models designed to score essays irrespective of the prompt for which an essay was written. They reasoned that such models may enhance score credibility by signifying that an AES system measures the same…

  11. Integrated Electronic Warfare Systems Aboard the United States Navy 21st Century Warship

    DTIC Science & Technology

    2009-12-01

    A model was developed that demonstrates a complete range of automated operation using a Human-In-the-Loop that could be integrated into existing and future combat systems.

  12. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT (AGWA): A GIS-BASED HYDROLOGIC MODELING TOOL FOR WATERSHED ASSESSMENT AND ANALYSIS

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment tool (AGWA) is a GIS interface jointly developed by the USDA Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona, and the University of Wyoming to automate the parameterization and execu...

  13. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT (AGWA): A GIS-BASED HYDROLOGIC MODELING TOOL FOR WATERSHED ASSESSMENT AND ANALYSIS

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment tool (AGWA) is a GIS interface jointly developed by the USDA Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona, and the University of Wyoming to automate the parameterization and execut...

  14. Model-based metrics of human-automation function allocation in complex work environments

    NASA Astrophysics Data System (ADS)

    Kim, So Young

    Function allocation is the design decision that assigns work functions to all agents in a team, both human and automated. Efforts to guide function allocation systematically have been made in many fields such as engineering, human factors, team and organization design, management science, and cognitive systems engineering. Each field focuses on certain aspects of function allocation, but not all; thus, an independent discussion of each does not address all necessary issues with function allocation. Four distinctive perspectives emerged from a review of these fields: technology-centered, human-centered, team-oriented, and work-oriented. Each perspective focuses on different aspects of function allocation: capabilities and characteristics of agents (automation or human), team structure and processes, and work structure and the work environment. Together, these perspectives identify the following eight issues with function allocation: 1) Workload, 2) Incoherency in function allocations, 3) Mismatches between responsibility and authority, 4) Interruptive automation, 5) Automation boundary conditions, 6) Function allocation preventing human adaptation to context, 7) Function allocation destabilizing the humans' work environment, and 8) Mission Performance. Addressing these issues systematically requires formal models and simulations that include all necessary aspects of human-automation function allocation: the work environment, the dynamics inherent to the work, agents, and relationships among them. Also, addressing these issues requires not only a (static) model, but also a (dynamic) simulation that captures temporal aspects of work such as the timing of actions and their impact on the agent's work. 
Therefore, with properly modeled work as described by the work environment, the dynamics inherent to the work, agents, and relationships among them, a modeling framework developed by this thesis, which includes static work models and dynamic simulation, can capture the issues with function allocation. Then, based on the eight issues, eight types of metrics are established. The purpose of these metrics is to assess the extent to which each issue exists with a given function allocation. Specifically, the eight types of metrics assess workload, coherency of a function allocation, mismatches between responsibility and authority, interruptive automation, automation boundary conditions, human adaptation to context, stability of the human's work environment, and mission performance. Finally, to validate the modeling framework and the metrics, a case study was conducted modeling four different function allocations between a pilot and flight deck automation during the arrival and approach phases of flight. A range of pilot cognitive control modes and maximum human taskload limits were also included in the model. The metrics were assessed for these four function allocations and analyzed to validate capability of the metrics to identify important issues in given function allocations. In addition, the design insights provided by the metrics are highlighted. This thesis concludes with a discussion of mechanisms for further validating the modeling framework and function allocation metrics developed here, and highlights where these developments can be applied in research and in the design of function allocations in complex work environments such as aviation operations.
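One of the metric families above, workload, can be illustrated with a minimal computation over a simulation trace: given each agent's timed task assignments, report the peak number of concurrent tasks. The interval representation and the peak-concurrency definition are invented for illustration; the thesis's workload metrics are richer.

```python
# Illustrative workload metric: peak concurrent taskload for one agent.

def peak_taskload(tasks):
    """tasks: list of (start, end) intervals assigned to one agent.
    Sweep the interval endpoints and track the running number of active tasks."""
    events = []
    for s, e in tasks:
        events += [(s, 1), (e, -1)]
    load = peak = 0
    for _, delta in sorted(events):   # ends sort before starts at equal times
        load += delta
        peak = max(peak, load)
    return peak
```

Comparing this value across candidate function allocations is the kind of per-issue assessment the metric framework is meant to support.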

  15. Photogrammetry-Based Automated Measurements for Tooth Shape and Occlusion Analysis

    NASA Astrophysics Data System (ADS)

    Knyaz, V. A.; Gaboutchian, A. V.

    2016-06-01

    Tooth measurements (odontometry) are performed for various scientific and practical applications, including dentistry. Present-day techniques are increasingly based on 3D models, which offer wider prospects than measurements on the real objects: teeth or their plaster copies. The main advantages come from new measurement methods that provide the required degree of non-invasiveness, precision, convenience, and detail. Tooth measurement has always been regarded as time-consuming research, even more so with new methods because of their wider opportunities. This is where automation becomes essential for the further development and application of measurement techniques. In our research, automation both of 3D model acquisition and of the measurements themselves provided essential data that were analysed to suggest recommendations for tooth preparation, one of the most demanding clinical procedures in prosthetic dentistry, within a comparatively short period of time. The original photogrammetric 3D reconstruction system can generate 3D models of dental arches, reproduce their closure (occlusion), and perform a set of standard measurements in automated mode.

  16. Detecting Mode Confusion Through Formal Modeling and Analysis

    NASA Technical Reports Server (NTRS)

    Miller, Steven P.; Potts, James N.

    1999-01-01

    Aircraft safety has improved steadily over the last few decades. While much of this improvement can be attributed to the introduction of advanced automation in the cockpit, the growing complexity of these systems also increases the potential for the pilots to become confused about what the automation is doing. This phenomenon, often referred to as mode confusion, has been implicated in several accidents involving modern aircraft. This report describes an effort by Rockwell Collins and NASA Langley to identify potential sources of mode confusion through two complementary strategies. The first is to create a clear, executable model of the automation, connect it to a simulation of the flight deck, and use this combination to review the behavior of the automation and the man-machine interface with the designers, pilots, and experts in human factors. The second strategy is to conduct mathematical analyses of the model by translating it into a formal specification suitable for analysis with automated tools. The approach is illustrated by applying it to a hypothetical, but still realistic, example of the mode logic of a Flight Guidance System.

  17. Automated, Parametric Geometry Modeling and Grid Generation for Turbomachinery Applications

    NASA Technical Reports Server (NTRS)

    Harrand, Vincent J.; Uchitel, Vadim G.; Whitmire, John B.

    2000-01-01

    The objective of this Phase I project is to develop a highly automated software system for rapid geometry modeling and grid generation for turbomachinery applications. The proposed system features a graphical user interface for interactive control, a direct interface to commercial CAD/PDM systems, support for IGES geometry output, and a scripting capability for obtaining a high level of automation and end-user customization of the tool. The developed system is fully parametric and highly automated and therefore significantly reduces the turnaround time for 3D geometry modeling, grid generation, and model setup. This facilitates design environments in which a large number of cases need to be generated, such as for parametric analysis and design optimization of turbomachinery equipment. In Phase I we successfully demonstrated the feasibility of the approach. The system has been tested on a wide variety of turbomachinery geometries, including several impellers and a multistage rotor-stator combination. In Phase II, we plan to integrate the developed system with turbomachinery design software and with commercial CAD/PDM software.

  18. Iterative model building, structure refinement and density modification with the PHENIX AutoBuild wizard.

    PubMed

    Terwilliger, Thomas C; Grosse-Kunstleve, Ralf W; Afonine, Pavel V; Moriarty, Nigel W; Zwart, Peter H; Hung, Li Wei; Read, Randy J; Adams, Paul D

    2008-01-01

    The PHENIX AutoBuild wizard is a highly automated tool for iterative model building, structure refinement and density modification using RESOLVE model building, RESOLVE statistical density modification and phenix.refine structure refinement. Recent advances in the AutoBuild wizard and phenix.refine include automated detection and application of NCS from models as they are built, extensive model-completion algorithms and automated solvent-molecule picking. Model-completion algorithms in the AutoBuild wizard include loop building, crossovers between chains in different models of a structure and side-chain optimization. The AutoBuild wizard has been applied to a set of 48 structures at resolutions ranging from 1.1 to 3.2 Å, resulting in a mean R factor of 0.24 and a mean free R factor of 0.29. The R factor of the final model is dependent on the quality of the starting electron density and is relatively independent of resolution.

  19. Automated Test Case Generation for an Autopilot Requirement Prototype

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Rungta, Neha; Feary, Michael

    2011-01-01

    Designing safety-critical automation with robust human interaction is a difficult task that is susceptible to a number of known Human-Automation Interaction (HAI) vulnerabilities. It is therefore essential to develop automated tools that provide support both in the design and rapid evaluation of such automation. The Automation Design and Evaluation Prototyping Toolset (ADEPT) enables the rapid development of an executable specification for automation behavior and user interaction. ADEPT supports a number of analysis capabilities, thus enabling the detection of HAI vulnerabilities early in the design process, when modifications are less costly. In this paper, we advocate the introduction of a new capability to model-based prototyping tools such as ADEPT. The new capability is based on symbolic execution that allows us to automatically generate quality test suites based on the system design. Symbolic execution is used to generate both user input and test oracles: user input drives the testing of the system implementation, and test oracles ensure that the system behaves as designed. We present early results in the context of a component in the Autopilot system modeled in ADEPT, and discuss the challenges of test case generation in the HAI domain.

  20. Estimating kinetic mechanisms with prior knowledge I: Linear parameter constraints.

    PubMed

    Salari, Autoosa; Navarro, Marco A; Milescu, Mirela; Milescu, Lorin S

    2018-02-05

    To understand how ion channels and other proteins function at the molecular and cellular levels, one must decrypt their kinetic mechanisms. Sophisticated algorithms have been developed that can be used to extract kinetic parameters from a variety of experimental data types. However, formulating models that not only explain new data, but are also consistent with existing knowledge, remains a challenge. Here, we present a two-part study describing a mathematical and computational formalism that can be used to enforce prior knowledge into the model using constraints. In this first part, we focus on constraints that enforce explicit linear relationships involving rate constants or other model parameters. We develop a simple, linear algebra-based transformation that can be applied to enforce many types of model properties and assumptions, such as microscopic reversibility, allosteric gating, and equality and inequality parameter relationships. This transformation converts the set of linearly interdependent model parameters into a reduced set of independent parameters, which can be passed to an automated search engine for model optimization. In the companion article, we introduce a complementary method that can be used to enforce arbitrary parameter relationships and any constraints that quantify the behavior of the model under certain conditions. The procedures described in this study can, in principle, be coupled to any of the existing methods for solving molecular kinetics for ion channels or other proteins. These concepts can be used not only to enforce existing knowledge but also to formulate and test new hypotheses. © 2018 Salari et al.
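The null-space reduction described in this abstract can be illustrated with a short numpy sketch. This is an illustrative rendering of the general idea, not the authors' code: equality constraints A·k = b on the parameter vector k are absorbed by writing k = k0 + N·u, so the optimizer only ever sees the free parameters u and every candidate automatically satisfies the constraints.

```python
import numpy as np

def reduce_parameters(A, b):
    """Given linear constraints A @ k = b on model parameters k, return
    (k0, N): a particular solution k0 and a null-space basis N, so that
    every admissible k can be written k = k0 + N @ u for free parameters u."""
    k0, *_ = np.linalg.lstsq(A, b, rcond=None)
    # Null space of A via SVD: right singular vectors beyond the rank.
    _, s, Vt = np.linalg.svd(A)
    rank = int(np.sum(s > 1e-10))
    N = Vt[rank:].T            # columns span {x : A @ x = 0}
    return k0, N

# Example: 3 rate parameters with one equality constraint k1 = 2*k2,
# i.e. [1, -2, 0] @ k = 0. Two free parameters remain for the search engine.
A = np.array([[1.0, -2.0, 0.0]])
b = np.array([0.0])
k0, N = reduce_parameters(A, b)
u = np.array([0.7, -1.3])          # free parameters chosen by the optimizer
k = k0 + N @ u                     # full parameter vector
print(np.allclose(A @ k, b))       # constraint holds by construction
```

Inequality constraints and the arbitrary relationships of the companion article require more machinery, but the equality case reduces cleanly to this change of variables.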

  1. A statistical pixel intensity model for segmentation of confocal laser scanning microscopy images.

    PubMed

    Calapez, Alexandre; Rosa, Agostinho

    2010-09-01

    Confocal laser scanning microscopy (CLSM) has been widely used in the life sciences for the characterization of cell processes because it allows the recording of the distribution of fluorescence-tagged macromolecules on a section of the living cell. It is in fact the cornerstone of many molecular transport and interaction quantification techniques, where the identification of regions of interest through image segmentation is usually a required step. In many situations, because of the complexity of the recorded cellular structures or because of the amounts of data involved, image segmentation is either too difficult or too inefficient to be done by hand, and automated segmentation procedures have to be considered. Given the nature of CLSM images, statistical segmentation methodologies appear as natural candidates. In this work we propose a model to be used for statistical unsupervised CLSM image segmentation. The model is derived from the CLSM image formation mechanics, and its performance is compared to the existing alternatives. Results show that it provides a much better description of the data on classes characterized by their mean intensity, making it suitable not only for segmentation methodologies with a known number of classes but also for use with schemes aiming at the estimation of the number of classes through the application of cluster selection criteria.

  2. High Resolution Sensing and Control of Urban Water Networks

    NASA Astrophysics Data System (ADS)

    Bartos, M. D.; Wong, B. P.; Kerkez, B.

    2016-12-01

    We present a framework to enable high-resolution sensing, modeling, and control of urban watersheds using (i) a distributed sensor network based on low-cost cellular-enabled motes, (ii) hydraulic models powered by a cloud computing infrastructure, and (iii) automated actuation valves that allow infrastructure to be controlled in real time. This platform enables two major advances. First, we achieve a high density of measurements in urban environments, with an anticipated 40+ sensors over each urban area of interest. In addition to new measurements, we also illustrate the design and evaluation of a "smart" control system for real-world hydraulic networks. This control system improves water quality and mitigates flooding by using real-time hydraulic models to adaptively control releases from retention basins. We evaluate the potential of this platform through two ongoing deployments: (i) a flood monitoring network in the Dallas-Fort Worth metropolitan area that detects and anticipates floods at the level of individual roadways, and (ii) a real-time hydraulic control system in the city of Ann Arbor, MI, soon to be one of the most densely instrumented urban watersheds in the United States. Through these applications, we demonstrate that distributed sensing and control of water infrastructure can improve flash flood predictions, emergency response, and stormwater contaminant mitigation.

  3. FDR-controlled metabolite annotation for high-resolution imaging mass spectrometry.

    PubMed

    Palmer, Andrew; Phapale, Prasad; Chernyavsky, Ilya; Lavigne, Regis; Fay, Dominik; Tarasov, Artem; Kovalev, Vitaly; Fuchser, Jens; Nikolenko, Sergey; Pineau, Charles; Becker, Michael; Alexandrov, Theodore

    2017-01-01

    High-mass-resolution imaging mass spectrometry promises to localize hundreds of metabolites in tissues, cell cultures, and agar plates with cellular resolution, but it is hampered by the lack of bioinformatics tools for automated metabolite identification. We report pySM, a framework for false discovery rate (FDR)-controlled metabolite annotation at the level of the molecular sum formula, for high-mass-resolution imaging mass spectrometry (https://github.com/alexandrovteam/pySM). We introduce a metabolite-signal match score and a target-decoy FDR estimate for spatial metabolomics.
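The target-decoy principle behind such an FDR estimate is simple to state: score implausible "decoy" sum formulas alongside the real candidates, and use the decoys passing a score threshold as a proxy for the false annotations among the targets. A minimal sketch with toy scores (not the pySM implementation):

```python
import numpy as np

def target_decoy_fdr(target_scores, decoy_scores, threshold):
    """Estimate the FDR at a score threshold: decoys passing the threshold
    approximate the number of false target annotations passing it."""
    n_t = np.sum(np.asarray(target_scores) >= threshold)
    n_d = np.sum(np.asarray(decoy_scores) >= threshold)
    return min(1.0, n_d / max(n_t, 1))

# Toy example: decoy formulas score lower on average than real metabolites,
# so raising the threshold trades annotation count against estimated FDR.
rng = np.random.default_rng(0)
targets = rng.normal(0.7, 0.15, 500)   # match scores for candidate formulas
decoys = rng.normal(0.4, 0.15, 500)    # match scores for decoy formulas
print(round(target_decoy_fdr(targets, decoys, 0.6), 3))
```

In practice the threshold is chosen to hit a desired FDR (e.g. 10%), which is what makes annotation counts comparable across datasets and labs.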

  4. A Versatile Microfluidic Device for Automating Synthetic Biology.

    PubMed

    Shih, Steve C C; Goyal, Garima; Kim, Peter W; Koutsoubelis, Nicolas; Keasling, Jay D; Adams, Paul D; Hillson, Nathan J; Singh, Anup K

    2015-10-16

    New microbes are being engineered that contain the genetic circuitry, metabolic pathways, and other cellular functions required for a wide range of applications such as producing biofuels, biobased chemicals, and pharmaceuticals. Although currently available tools are useful in improving the synthetic biology process, further improvements in physical automation would help to lower the barrier of entry into this field. We present an innovative microfluidic platform for assembling DNA fragments with 10× lower volumes (compared to those of current microfluidic platforms) and with integrated region-specific temperature control and on-chip transformation. Integration of these steps minimizes the loss of reagents and products compared with conventional methods, which require multiple pipetting steps. For assembling DNA fragments, we implemented three commonly used DNA assembly protocols on our microfluidic device: Golden Gate assembly, Gibson assembly, and yeast assembly (i.e., TAR cloning, DNA Assembler). We demonstrate the utility of these methods by assembling two combinatorial libraries of 16 plasmids each. Each DNA plasmid is transformed into Escherichia coli or Saccharomyces cerevisiae using on-chip electroporation and further sequenced to verify the assembly. We anticipate that this platform will enable new research that can integrate this automated microfluidic platform to generate large combinatorial libraries of plasmids and will help to expedite the overall synthetic biology process.

  5. Intuitive Cognition and Models of Human-Automation Interaction.

    PubMed

    Patterson, Robert Earl

    2017-02-01

    The aim of this study was to provide an analysis of the implications of the dominance of intuitive cognition in human reasoning and decision making for conceptualizing models and taxonomies of human-automation interaction, focusing on the Parasuraman et al. model and taxonomy. Knowledge about how humans reason and make decisions, which has been shown to be largely intuitive, has implications for the design of future human-machine systems. One hundred twenty articles and books cited in other works, as well as those obtained from an Internet search, were reviewed. Works were deemed eligible if they were published within the past 50 years and common to a given literature. Analysis shows that intuitive cognition dominates human reasoning and decision making in all situations examined. The implications of the dominance of intuitive cognition for the Parasuraman et al. model and taxonomy are discussed, and a taxonomy of human-automation interaction that incorporates intuitive cognition is suggested. Understanding the ways in which human reasoning and decision making are intuitive can provide insight for future models and taxonomies of human-automation interaction.

  6. Model annotation for synthetic biology: automating model to nucleotide sequence conversion

    PubMed Central

    Misirli, Goksel; Hallinan, Jennifer S.; Yu, Tommy; Lawson, James R.; Wimalaratne, Sarala M.; Cooling, Michael T.; Wipat, Anil

    2011-01-01

    Motivation: The need for the automated computational design of genetic circuits is becoming increasingly apparent with the advent of ever more complex and ambitious synthetic biology projects. Currently, most circuits are designed through the assembly of models of individual parts such as promoters, ribosome binding sites and coding sequences. These low level models are combined to produce a dynamic model of a larger device that exhibits a desired behaviour. The larger model then acts as a blueprint for physical implementation at the DNA level. However, the conversion of models of complex genetic circuits into DNA sequences is a non-trivial undertaking due to the complexity of mapping the model parts to their physical manifestation. Automating this process is further hampered by the lack of computationally tractable information in most models. Results: We describe a method for automatically generating DNA sequences from dynamic models implemented in CellML and Systems Biology Markup Language (SBML). We also identify the metadata needed to annotate models to facilitate automated conversion, and propose and demonstrate a method for the markup of these models using RDF. Our algorithm has been implemented in a software tool called MoSeC. Availability: The software is available from the authors' web site http://research.ncl.ac.uk/synthetic_biology/downloads.html. Contact: anil.wipat@ncl.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:21296753

  7. Modeling pilot interaction with automated digital avionics systems: Guidance and control algorithms for contour and nap-of-the-Earth flight

    NASA Technical Reports Server (NTRS)

    Hess, Ronald A.

    1990-01-01

    A collection of technical papers are presented that cover modeling pilot interaction with automated digital avionics systems and guidance and control algorithms for contour and nap-of-the-earth flight. The titles of the papers presented are as follows: (1) Automation effects in a multiloop manual control system; (2) A qualitative model of human interaction with complex dynamic systems; (3) Generalized predictive control of dynamic systems; (4) An application of generalized predictive control to rotorcraft terrain-following flight; (5) Self-tuning generalized predictive control applied to terrain-following flight; and (6) Precise flight path control using a predictive algorithm.

  8. New Method for Quantitation of Lipid Droplet Volume From Light Microscopic Images With an Application to Determination of PAT Protein Density on the Droplet Surface.

    PubMed

    Dejgaard, Selma Y; Presley, John F

    2018-06-01

    Determination of lipid droplet (LD) volume has depended on direct measurement of the diameter of individual LDs, which is not possible when LDs are small or closely apposed. To overcome this problem, we describe a new method in which a volume-fluorescence relationship is determined from automated analysis of calibration samples containing well-resolved LDs. This relationship is then used to estimate total cellular droplet volume in experimental samples, where the LDs need not be individually resolved, or to determine the volumes of individual LDs. We describe quantitatively the effects of various factors, including image noise, LD crowding, and variation in LD composition on the accuracy of this method. We then demonstrate this method by utilizing it to address a scientifically interesting question, to determine the density of green fluorescent protein (GFP)-tagged Perilipin-Adipocyte-Tail (PAT) proteins on the LD surface. We find that PAT proteins cover only a minority of the LD surface, consistent with models in which they primarily serve as scaffolds for binding of regulatory proteins and enzymes, but inconsistent with models in which their major function is to sterically block access to the droplet surface.
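The calibration idea, fitting a volume-fluorescence relationship on well-resolved droplets and then applying it where droplets cannot be individually resolved, can be sketched in a few lines. The sketch below assumes strict proportionality between integrated fluorescence and volume, and the measurement values are invented for illustration; the paper's method additionally characterizes noise and crowding effects.

```python
import numpy as np

def fit_volume_per_fluorescence(diameters_um, integrated_fluorescence):
    """Calibrate on well-resolved droplets: volume from measured diameter
    (V = pi * d^3 / 6) against integrated fluorescence, assuming V = k * F.
    Returns the least-squares slope k through the origin."""
    volumes = np.pi * np.asarray(diameters_um, dtype=float) ** 3 / 6.0
    f = np.asarray(integrated_fluorescence, dtype=float)
    return float(np.dot(f, volumes) / np.dot(f, f))

# Calibration sample: resolvable droplets with measurable diameters
# (hypothetical values, roughly proportional to volume).
diam = [0.8, 1.0, 1.5, 2.0]              # microns
fluo = [270.0, 520.0, 1760.0, 4190.0]    # arbitrary fluorescence units
k = fit_volume_per_fluorescence(diam, fluo)

# Experimental sample: droplets too small/crowded to resolve individually,
# but total fluorescence is still measurable.
total_fluorescence = 25000.0
print(k * total_fluorescence)   # estimated total LD volume, cubic microns
```

The key point is that the calibration transfers: once k is known from resolvable droplets, only summed fluorescence is needed in the experimental samples.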

  9. Computational Science in Armenia (Invited Talk)

    NASA Astrophysics Data System (ADS)

    Marandjian, H.; Shoukourian, Yu.

    This survey is devoted to the development of informatics and computer science in Armenia. The results in theoretical computer science (algebraic models, solutions to systems of general-form recursive equations, the methods of coding theory, pattern recognition and image processing) constitute the theoretical basis for developing problem-solving-oriented environments. Examples include a synthesizer of optimized distributed recursive programs, software tools for cluster-oriented implementations of two-dimensional cellular automata, and a grid-aware web interface with advanced service trading for linear algebra calculations. In the direction of solving scientific problems that require high-performance computing resources, completed projects include physics (parallel computing of complex quantum systems), astrophysics (Armenian virtual laboratory), biology (molecular dynamics study of the human red blood cell membrane), and meteorology (implementing and evaluating the Weather Research and Forecast Model for the territory of Armenia). The overview also notes that the Institute for Informatics and Automation Problems of the National Academy of Sciences of Armenia has established a scientific and educational infrastructure uniting the computing clusters of scientific and educational institutions of the country, and provides the scientific community with access to local and international computational resources, giving strong support to computational science in Armenia.

  10. A Pulse Coupled Neural Network Segmentation Algorithm for Reflectance Confocal Images of Epithelial Tissue

    PubMed Central

    Malik, Bilal H.; Jabbour, Joey M.; Maitland, Kristen C.

    2015-01-01

    Automatic segmentation of nuclei in reflectance confocal microscopy images is critical for visualization and rapid quantification of nuclear-to-cytoplasmic ratio, a useful indicator of epithelial precancer. Reflectance confocal microscopy can provide three-dimensional imaging of epithelial tissue in vivo with sub-cellular resolution. Changes in nuclear density or nuclear-to-cytoplasmic ratio as a function of depth obtained from confocal images can be used to determine the presence or stage of epithelial cancers. However, low nuclear to background contrast, low resolution at greater imaging depths, and significant variation in reflectance signal of nuclei complicate segmentation required for quantification of nuclear-to-cytoplasmic ratio. Here, we present an automated segmentation method to segment nuclei in reflectance confocal images using a pulse coupled neural network algorithm, specifically a spiking cortical model, and an artificial neural network classifier. The segmentation algorithm was applied to an image model of nuclei with varying nuclear to background contrast. Greater than 90% of simulated nuclei were detected for contrast of 2.0 or greater. Confocal images of porcine and human oral mucosa were used to evaluate application to epithelial tissue. Segmentation accuracy was assessed using manual segmentation of nuclei as the gold standard. PMID:25816131
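A simplified spiking cortical model iteration, the core mechanism behind this kind of segmentation, might look like the following sketch. The parameters, boundary handling, and toy image are illustrative choices of ours, not those of the paper; the idea is that each pixel is a neuron whose firing time groups it with similar, connected pixels.

```python
import numpy as np

def scm_segment(image, steps=20, f=0.8, g=0.7, h=10.0):
    """Simplified spiking cortical model (SCM): pixel-neurons accumulate
    internal activity U from their stimulus S and from firing neighbors Y,
    and fire when U exceeds a dynamic threshold E, which jumps after each
    firing. Pixels firing at the same epoch tend to form coherent regions."""
    S = image / max(image.max(), 1e-9)      # normalized stimulus
    U = np.zeros_like(S)
    E = np.ones_like(S)                     # dynamic threshold
    Y = np.zeros_like(S)
    fire_time = np.full(S.shape, -1)        # -1 = never fired
    for n in range(steps):
        # 3x3 neighborhood sum of last firings (periodic boundary via roll;
        # acceptable for a toy example).
        L = sum(np.roll(np.roll(Y, dy, 0), dx, 1)
                for dy in (-1, 0, 1) for dx in (-1, 0, 1)) - Y
        U = f * U + S * (1.0 + L)           # neighbor-modulated input
        Y = (U > E).astype(float)           # fire where activity beats threshold
        E = g * E + h * Y                   # refractory boost after firing
        fire_time[(fire_time < 0) & (Y > 0)] = n
    return fire_time                        # first-firing epoch per pixel

# Toy "image": a bright blob (nucleus-like) and a dimmer blob on a dark
# background; brighter structures fire earlier, background never fires.
img = np.zeros((8, 8))
img[2:4, 2:4] = 1.0
img[5:7, 5:7] = 0.3
t = scm_segment(img)
print(t)
```

In the paper's pipeline, the map of firing epochs is then passed to an artificial neural network classifier to decide which candidate regions are true nuclei.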

  11. Some Automated Cartography Developments at the Defense Mapping Agency.

    DTIC Science & Technology

    1981-01-01

    on a pantographic router creating a laminate step model which was moulded in plaster for carving into a terrain model. This section will trace DMA’s...offering economical automation. Precision flatbed Concord plotters were brought into DMA with sufficiently programmable control computers to perform these

  12. ClonEvol: clonal ordering and visualization in cancer sequencing.

    PubMed

    Dang, H X; White, B S; Foltz, S M; Miller, C A; Luo, J; Fields, R C; Maher, C A

    2017-12-01

    Reconstruction of clonal evolution is critical for understanding tumor progression and implementing personalized therapies. This is often done by clustering somatic variants based on their cellular prevalence estimated via bulk tumor sequencing of multiple samples. The clusters, consisting of the clonal marker variants, are then ordered based on their estimated cellular prevalence to reconstruct clonal evolution trees, a process referred to as 'clonal ordering'. However, cellular prevalence estimates are confounded by statistical variability and errors in sequencing/data analysis, which inhibits accurate reconstruction of the clonal evolution. This problem is further complicated by intra- and inter-tumor heterogeneity. Furthermore, the field lacks a comprehensive visualization tool to facilitate the interpretation of complex clonal relationships. To address these challenges we developed ClonEvol, a unified software tool for clonal ordering, visualization, and interpretation. ClonEvol uses a bootstrap resampling technique to estimate the cellular fraction of the clones and probabilistically models the clonal ordering constraints to account for statistical variability. The bootstrapping allows identification of the founding- and sub-clones of each sample, thus enabling interpretation of clonal seeding. ClonEvol automates the generation of multiple widely used visualizations for reconstructing and interpreting clonal evolution. ClonEvol outperformed three state-of-the-art tools (LICHeE, Canopy and PhyloWGS) for clonal evolution inference, showing more robust error tolerance and producing more accurate trees in a simulation. Building upon multiple recent publications that utilized ClonEvol to study metastasis and drug resistance in solid cancers, here we show that ClonEvol rediscovered relapsed subclones in two published acute myeloid leukemia patients.
Furthermore, we demonstrated that through noninvasive monitoring ClonEvol recapitulated the emerging subclones throughout metastatic progression observed in the tumors of a published breast cancer patient. ClonEvol has broad applicability for longitudinal monitoring of clonal populations in tumor biopsies, or noninvasively, to guide precision medicine. ClonEvol is written in R and is available at https://github.com/ChrisMaherLab/ClonEvol. © The Author 2017. Published by Oxford University Press on behalf of the European Society for Medical Oncology. All rights reserved. For permissions, please email: journals.permissions@oup.com.
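The bootstrap treatment of clonal ordering can be illustrated with a minimal sketch of the "sum rule": a parent clone's cellular prevalence must be at least the summed prevalence of its candidate children, tested against resampling variability rather than point estimates. The prevalence data, slack threshold, and function names below are invented for illustration; this is not ClonEvol's actual R implementation.

```python
import numpy as np

def ordering_consistent(parent_vafs, child_vafs_list,
                        n_boot=2000, alpha=0.05, seed=0):
    """Bootstrap test of the clonal-ordering sum rule: resample each clone's
    variant prevalence values, and require the lower alpha-quantile of
    (parent - sum of children) to stay above a small negative slack."""
    rng = np.random.default_rng(seed)
    parent = np.asarray(parent_vafs, dtype=float)
    children = [np.asarray(c, dtype=float) for c in child_vafs_list]
    diffs = np.empty(n_boot)
    for i in range(n_boot):
        p = rng.choice(parent, size=parent.size, replace=True).mean()
        c = sum(rng.choice(cv, size=cv.size, replace=True).mean()
                for cv in children)
        diffs[i] = p - c
    return float(np.quantile(diffs, alpha)) >= -0.02

# Founding clone at prevalence ~0.9; two subclones at ~0.5 and ~0.3.
rng = np.random.default_rng(1)
founder = rng.normal(0.90, 0.03, 40)
sub1 = rng.normal(0.50, 0.03, 40)
sub2 = rng.normal(0.30, 0.03, 40)
print(ordering_consistent(founder, [sub1, sub2]))  # founder can parent both
print(ordering_consistent(sub1, [founder]))        # reverse ordering rejected
```

Enumerating which parent-child assignments pass such a test across all samples yields the set of candidate evolution trees, which is the combinatorial core of clonal ordering.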

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Los Alamos National Laboratory, Mailstop M888, Los Alamos, NM 87545, USA; Lawrence Berkeley National Laboratory, One Cyclotron Road, Building 64R0121, Berkeley, CA 94720, USA; Department of Haematology, University of Cambridge, Cambridge CB2 0XY, England

    The PHENIX AutoBuild Wizard is a highly automated tool for iterative model-building, structure refinement and density modification using RESOLVE or TEXTAL model-building, RESOLVE statistical density modification, and phenix.refine structure refinement. Recent advances in the AutoBuild Wizard and phenix.refine include automated detection and application of NCS from models as they are built, extensive model completion algorithms, and automated solvent molecule picking. Model completion algorithms in the AutoBuild Wizard include loop-building, crossovers between chains in different models of a structure, and side-chain optimization. The AutoBuild Wizard has been applied to a set of 48 structures at resolutions ranging from 1.1 Å to 3.2 Å, resulting in a mean R-factor of 0.24 and a mean free R-factor of 0.29. The R-factor of the final model is dependent on the quality of the starting electron density, and relatively independent of resolution.

  14. Improving the driver-automation interaction: an approach using automation uncertainty.

    PubMed

    Beller, Johannes; Heesen, Matthias; Vollrath, Mark

    2013-12-01

    The aim of this study was to evaluate whether communicating automation uncertainty improves the driver-automation interaction. A false system understanding of infallibility may provoke automation misuse and can lead to severe consequences in case of automation failure. The presentation of automation uncertainty may prevent this false system understanding and, as was shown by previous studies, may have numerous benefits. Few studies, however, have clearly shown the potential of communicating uncertainty information in driving. The current study fills this gap. We conducted a driving simulator experiment, varying the presented uncertainty information between participants (no uncertainty information vs. uncertainty information) and the automation reliability (high vs. low) within participants. Participants interacted with a highly automated driving system while engaging in secondary tasks and were required to cooperate with the automation to drive safely. Quantile regressions and multilevel modeling showed that the presentation of uncertainty information increases the time to collision in the case of automation failure. Furthermore, the data indicated improved situation awareness and better knowledge of fallibility for the experimental group. Consequently, the automation with the uncertainty symbol received higher trust ratings and increased acceptance. The presentation of automation uncertainty through a symbol improves overall driver-automation cooperation. Most automated systems in driving could benefit from displaying reliability information. This display might improve the acceptance of fallible systems and further enhance driver-automation cooperation.

  15. Automated monitor and control for deep space network subsystems

    NASA Technical Reports Server (NTRS)

    Smyth, P.

    1989-01-01

    The problem of automating monitor and control loops for Deep Space Network (DSN) subsystems is considered and an overview of currently available automation techniques is given. The use of standard numerical models, knowledge-based systems, and neural networks is considered. It is argued that none of these techniques alone possess sufficient generality to deal with the demands imposed by the DSN environment. However, it is shown that schemes that integrate the better aspects of each approach and are referenced to a formal system model show considerable promise, although such an integrated technology is not yet available for implementation. Frequent reference is made to the receiver subsystem since this work was largely motivated by experience in developing an automated monitor and control loop for the advanced receiver.

  16. Automation and Job Satisfaction among Reference Librarians.

    ERIC Educational Resources Information Center

    Whitlatch, Jo Bell

    1991-01-01

    Discussion of job satisfaction and the level of job performance focuses on the effect of automation on job satisfaction among reference librarians. The influence of stress is discussed, a job strain model is explained, and examples of how to design a job to reduce the stress caused by automation are given. (12 references) (LRW)

  17. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT (AGWA): A GIS-BASED HYDROLOGIC MODELING TOOL FOR WATERSHED ASSESSMENT AND ANALYSIS

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment tool (AGWA) is a GIS interface jointly developed by the USDA Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona, and the University of Wyoming to automate the parame...

  18. Distress and worry as mediators in the relationship between psychosocial risks and upper body musculoskeletal complaints in highly automated manufacturing.

    PubMed

    Wixted, Fiona; Shevlin, Mark; O'Sullivan, Leonard W

    2018-03-15

    As a result of changes in manufacturing, including an upward trend in automation and the advent of the fourth industrial revolution, the requirement for supervisory monitoring, and consequently cognitive demand, has increased in automated manufacturing. The incidence of musculoskeletal disorders has also increased in the manufacturing sector. A model was developed based on survey data to test whether distress and worry mediate the relationship between psychosocial factors (job control, cognitive demand, social isolation and skill discretion), stress states, and symptoms of upper body musculoskeletal disorders in highly automated manufacturing companies (n = 235). These constructs facilitated the development of a statistically significant model (RMSEA 0.057, TLI 0.924, CFI 0.935). Cognitive demand was shown to be related to higher distress in employees, and distress to a higher incidence of self-reported shoulder and lower back symptoms. The mediation model incorporating stress states (distress, worry) as mediators is a novel approach in linking psychosocial risks to musculoskeletal disorders. Practitioners' summary: With little requirement for physical work in many modern automated manufacturing workplaces, there is often minimal management focus on Work-Related Musculoskeletal Disorders (WRMSDs) as important occupational health problems. Our model provides evidence that psychosocial factors are important risk factors for symptoms of WRMSD and should be managed.

  19. Design and evaluation of cellular power converter architectures

    NASA Astrophysics Data System (ADS)

    Perreault, David John

    Power electronic technology plays an important role in many energy conversion and storage applications, including machine drives, power supplies, frequency changers and UPS systems. Increases in performance and reductions in cost have been achieved through the development of higher performance power semiconductor devices and integrated control devices with increased functionality. Manufacturing techniques, however, have changed little. High power is typically achieved by paralleling multiple die in a single package, producing the physical equivalent of a single large device. Consequently, both the device package and the converter in which the device is used continue to require large, complex mechanical structures, and relatively sophisticated heat transfer systems. An alternative to this approach is the use of a cellular power converter architecture, which is based upon the parallel connection of a large number of quasi-autonomous converters, called cells, each of which is designed for a fraction of the system rating. The cell rating is chosen such that single-die devices in inexpensive packages can be used, and the cell fabricated with an automated assembly process. The use of quasi-autonomous cells means that system performance is not compromised by the failure of a cell. This thesis explores the design of cellular converter architectures with the objective of achieving improvements in performance, reliability, and cost over conventional converter designs. New approaches are developed and experimentally verified for highly distributed control of cellular converters, including methods for ripple cancellation and current-sharing control. The performance of these techniques is quantified, and their dynamics are analyzed. Cell topologies suitable to the cellular architecture are investigated, and their use for systems in the 5-500 kVA range is explored.
The design, construction, and experimental evaluation of a 6 kW cellular switched-mode rectifier is also addressed. This cellular system implements entirely distributed control, and achieves performance levels unattainable with an equivalent single converter. (Copies available exclusively from MIT Libraries, Rm. 14-0551, Cambridge, MA 02139-4307. Ph. 617-253-5668; Fax 617-253-1690.)
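The ripple cancellation referred to above can be illustrated by interleaving: N identical cells switched with phase offsets of 1/N of the switching period carry N times the current while their summed ripple stays at roughly the single-cell level. A toy numpy simulation with an idealized sawtooth ripple; the cell count, amplitude, and waveform shape are illustrative assumptions, not the thesis design:

```python
import numpy as np

N_CELLS, A, T = 4, 1.0, 1.0              # cell count, ripple peak-to-peak, period
t = np.linspace(0.0, 2 * T, 4000, endpoint=False)

def sawtooth_ripple(t, phase):
    """Idealized zero-mean sawtooth ripple (e.g. inductor current ripple)."""
    return A * (((t / T) + phase) % 1.0) - A / 2

# All cells switching in phase: ripples add directly.
in_phase = sum(sawtooth_ripple(t, 0.0) for _ in range(N_CELLS))
# Cells interleaved by 1/N of the period: ripple components cancel, leaving
# roughly single-cell ripple amplitude at N times the frequency.
interleaved = sum(sawtooth_ripple(t, k / N_CELLS) for k in range(N_CELLS))

pp = lambda x: float(x.max() - x.min())  # peak-to-peak helper
```

With four cells, the in-phase sum has about four times the single-cell peak-to-peak ripple, while the interleaved sum stays near the single-cell level despite carrying four times the current.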

  20. Effect of crumb cellular structure characterized by image analysis on cake softness.

    PubMed

    Dewaest, Marine; Villemejane, Cindy; Berland, Sophie; Neron, Stéphane; Clement, Jérôme; Verel, Aliette; Michon, Camille

    2018-06-01

    Sponge cake is a cereal product characterized by an aerated crumb and appreciated for its softness. When formulating such a product, it is useful to be able to characterize the crumb structure using image analysis and to build knowledge about the effects of the crumb cellular structure on the mechanical properties that contribute to softness. An image analysis method based on mathematical morphology was adapted from the one developed for bread crumb. In order to evaluate its ability to discriminate cellular structures, series of cakes were prepared using two rather similar emulsifiers but also using flours with different aging times before use. The mechanical properties of the crumbs of these different cakes were also characterized. This allowed a cell structure classification taking into account cell size and homogeneity, but also cell wall thickness and the number of holes in the walls. Interestingly, the cellular structure differences had a larger impact on the Young's modulus of the aerated crumb than the wall firmness did. Increasing the aging time of flour before use leads to the production of firmer crumbs due to coarser and more inhomogeneous cellular structures. Changing the composition of the emulsifier may change the cellular structure and, depending on the type of the structural changes, have an impact on the firmness of the crumb. Cellular structure rather than cell wall firmness was found to impact cake crumb firmness. The new fast and automated tool for cake crumb structure analysis allows quick detection of any change in cell size or homogeneity but also in cell wall thickness and number of holes in the walls (openness degree). To obtain a softer crumb, it seems that options are to decrease the cell size and the cell wall thickness and/or to increase the openness degree. It is then possible to easily evaluate the effects of ingredients (flour composition, emulsifier …) or changes in the process on the crumb structure and thus its softness. 
Moreover, this image analysis is a very efficient tool for quality control. © 2017 Wiley Periodicals, Inc.
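Granulometry, a core mathematical-morphology tool for crumb structure analysis of this kind, measures how much foreground survives morphological openings with structuring elements of increasing size; the drop between consecutive sizes locates the area held in cells of that size class. A sketch on a synthetic binary "crumb" (the image and radii are invented; this is not the authors' pipeline):

```python
import numpy as np
from scipy import ndimage

def disk(r):
    """Discrete disk-shaped structuring element of radius r."""
    y, x = np.ogrid[-r:r + 1, -r:r + 1]
    return (x * x + y * y) <= r * r

# Synthetic binary "crumb": one small gas cell (r=3) and one large one (r=8).
img = np.zeros((40, 80), dtype=bool)
img[17:24, 12:19] |= disk(3)
img[12:29, 47:64] |= disk(8)

# Granulometry: foreground area surviving an opening with a disk of radius r.
# Openings with disks larger than a cell erase it, so the small cell drops
# out of the spectrum around r=4 and the large one around r=9.
spectrum = [int(ndimage.binary_opening(img, structure=disk(r)).sum())
            for r in range(1, 10)]
```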

  1. Developmental biology and tissue engineering.

    PubMed

    Marga, Francoise; Neagu, Adrian; Kosztin, Ioan; Forgacs, Gabor

    2007-12-01

    Morphogenesis implies the controlled spatial organization of cells that gives rise to tissues and organs in early embryonic development. While morphogenesis is under strict genetic control, the formation of specialized biological structures of specific shape hinges on physical processes. Tissue engineering (TE) aims at reproducing morphogenesis in the laboratory, i.e., in vitro, to fabricate replacement organs for regenerative medicine. The classical approach to generate tissues/organs is by seeding and expanding cells in appropriately shaped biocompatible scaffolds, in the hope that the maturation process will result in the desired structure. To accomplish this goal more naturally and efficiently, we set up and implemented a novel TE method that is based on principles of developmental biology and employs bioprinting, the automated delivery of cellular composites into a three-dimensional (3D) biocompatible environment. The novel technology relies on the concept of tissue liquidity according to which multicellular aggregates composed of adhesive and motile cells behave in analogy with liquids: in particular, they fuse. We emphasize the major role played by tissue fusion in the embryo and explain how the parameters (surface tension, viscosity) that govern tissue fusion can be used both experimentally and theoretically to control and simulate the self-assembly of cellular spheroids into 3D living structures. The experimentally observed postprinting shape evolution of tube- and sheet-like constructs is presented. Computer simulations, based on a liquid model, support the idea that tissue liquidity may provide a mechanism for in vitro organ building. Copyright 2008 Wiley-Liss, Inc.

  2. Command-line cellular electrophysiology for conventional and real-time closed-loop experiments.

    PubMed

    Linaro, Daniele; Couto, João; Giugliano, Michele

    2014-06-15

    Current software tools for electrophysiological experiments are limited in flexibility and rarely offer adequate support for advanced techniques such as dynamic clamp and hybrid experiments, which are therefore limited to laboratories with a significant expertise in neuroinformatics. We have developed lcg, a software suite based on a command-line interface (CLI) that allows the user to perform both standard and advanced electrophysiological experiments. Stimulation protocols for classical voltage and current clamp experiments are defined by a concise and flexible meta description that allows representing complex waveforms as a piece-wise parametric decomposition of elementary sub-waveforms, abstracting the stimulation hardware. To perform complex experiments, lcg provides a set of elementary building blocks that can be interconnected to yield a large variety of experimental paradigms. We present various cellular electrophysiological experiments in which lcg has been employed, ranging from the automated application of current clamp protocols for characterizing basic electrophysiological properties of neurons, to dynamic clamp, response clamp, and hybrid experiments. We finally show how the scripting capabilities behind a CLI are suited for integrating experimental trials into complex workflows, where the actual experiment, online data analysis, and computational modeling integrate seamlessly. We compare lcg with two open source toolboxes, RTXI and RELACS. We believe that lcg will greatly contribute to the standardization and reproducibility of both simple and complex experiments. Additionally, in the long run the increased efficiency due to a CLI will prove a great benefit for the experimental community. Copyright © 2014 Elsevier B.V. All rights reserved.
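The "piece-wise parametric decomposition of elementary sub-waveforms" can be pictured as a list of (duration, type, parameters) segments rendered into one sample stream. The segment names, sampling rate, and API below are hypothetical illustrations, not lcg's actual stimulus description language:

```python
import numpy as np

FS = 10_000  # sampling rate in Hz (illustrative assumption)

def render(segments):
    """Render a stimulus from elementary sub-waveform segments."""
    out = []
    for dur, kind, p in segments:
        t = np.arange(int(dur * FS)) / FS
        if kind == "dc":
            out.append(np.full(t.size, p["amp"]))
        elif kind == "ramp":
            out.append(p["start"] + (p["stop"] - p["start"]) * t / dur)
        elif kind == "sine":
            out.append(p["amp"] * np.sin(2 * np.pi * p["freq"] * t))
        else:
            raise ValueError(kind)
    return np.concatenate(out)

# 100 ms baseline, 200 ms current ramp to 100 pA, 100 ms sinusoidal probe.
stim = render([(0.1, "dc", {"amp": 0.0}),
               (0.2, "ramp", {"start": 0.0, "stop": 100.0}),
               (0.1, "sine", {"amp": 50.0, "freq": 10.0})])
```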

  3. Evaluation of automated cell disruptor methods for oomycetous and ascomycetous model organisms

    USDA-ARS?s Scientific Manuscript database

    Two automated cell disruptor-based methods for RNA extraction; disruption of thawed cells submerged in TRIzol Reagent (method QP), and direct disruption of frozen cells on dry ice (method CP), were optimized for a model oomycete, Phytophthora capsici, and compared with grinding in a mortar and pestl...

  4. Utility of an automated thermal-based approach for monitoring evapotranspiration

    USDA-ARS?s Scientific Manuscript database

    A very simple remote sensing-based model for water use monitoring is presented. The model acronym DATTUTDUT (Deriving Atmosphere Turbulent Transport Useful To Dummies Using Temperature) is a Dutch word which loosely translates as “It’s unbelievable that it works”. DATTUTDUT is fully automated and o...

  5. Automated Tutoring in Interactive Environments: A Task-Centered Approach.

    ERIC Educational Resources Information Center

    Wolz, Ursula; And Others

    1989-01-01

    Discusses tutoring and consulting functions in interactive computer environments. Tutoring strategies are considered, the expert model and the user model are described, and GENIE (Generated Informative Explanations)--an answer generating system for the Berkeley Unix Mail system--is explained as an example of an automated consulting system. (33…

  6. Automated analysis of high-content microscopy data with deep learning.

    PubMed

    Kraus, Oren Z; Grys, Ben T; Ba, Jimmy; Chong, Yolanda; Frey, Brendan J; Boone, Charles; Andrews, Brenda J

    2017-04-18

    Existing computational pipelines for quantitative analysis of high-content microscopy data rely on traditional machine learning approaches that fail to accurately classify more than a single dataset without substantial tuning and training, requiring extensive analysis. Here, we demonstrate that the application of deep learning to biological image data can overcome the pitfalls associated with conventional machine learning classifiers. Using a deep convolutional neural network (DeepLoc) to analyze yeast cell images, we show improved performance over traditional approaches in the automated classification of protein subcellular localization. We also demonstrate the ability of DeepLoc to classify highly divergent image sets, including images of pheromone-arrested cells with abnormal cellular morphology, as well as images generated in different genetic backgrounds and in different laboratories. We offer an open-source implementation that enables updating DeepLoc on new microscopy datasets. This study highlights deep learning as an important tool for the expedited analysis of high-content microscopy data. © 2017 The Authors. Published under the terms of the CC BY 4.0 license.

  7. Automatic analysis of dividing cells in live cell movies to detect mitotic delays and correlate phenotypes in time.

    PubMed

    Harder, Nathalie; Mora-Bermúdez, Felipe; Godinez, William J; Wünsche, Annelie; Eils, Roland; Ellenberg, Jan; Rohr, Karl

    2009-11-01

    Live-cell imaging allows detailed dynamic cellular phenotyping for cell biology and, in combination with small molecule or drug libraries, for high-content screening. Fully automated analysis of live cell movies has been hampered by the lack of computational approaches that allow tracking and recognition of individual cell fates over time in a precise manner. Here, we present a fully automated approach to analyze time-lapse movies of dividing cells. Our method dynamically categorizes cells into seven phases of the cell cycle and five aberrant morphological phenotypes over time. It reliably tracks cells and their progeny and can thus measure the length of mitotic phases and detect cause and effect if mitosis goes awry. We applied our computational scheme to annotate mitotic phenotypes induced by RNAi gene knockdown of CKAP5 (also known as ch-TOG) or by treatment with the drug nocodazole. Our approach can be readily applied to comparable assays aiming at uncovering the dynamic cause of cell division phenotypes.
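The tracking step such pipelines depend on can be reduced, in its simplest form, to linking detected cell centroids between consecutive frames by gated nearest-neighbour matching. A minimal sketch of that idea (greedy matching with a maximum-displacement gate; the authors' method is considerably more sophisticated):

```python
import numpy as np

def link_frames(prev, curr, max_dist=20.0):
    """Greedy gated nearest-neighbour linking of cell centroids between two
    consecutive frames; returns (prev_index, curr_index) pairs."""
    prev, curr = np.asarray(prev, float), np.asarray(curr, float)
    # Pairwise distance matrix between all previous and current centroids.
    d = np.linalg.norm(prev[:, None, :] - curr[None, :, :], axis=2)
    links, used = [], set()
    for i in np.argsort(d.min(axis=1)):      # most confident detections first
        row = d[i].copy()
        if used:
            row[list(used)] = np.inf         # each current cell matched once
        j = int(np.argmin(row))
        if row[j] <= max_dist:               # gate out implausible jumps
            links.append((int(i), j))
            used.add(j)
    return links
```

A cell that disappears (or a new detection, e.g. after division) simply remains unmatched, which is where a real pipeline would invoke its mitosis and error handling.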

  8. Mapping of Brain Activity by Automated Volume Analysis of Immediate Early Genes.

    PubMed

    Renier, Nicolas; Adams, Eliza L; Kirst, Christoph; Wu, Zhuhao; Azevedo, Ricardo; Kohl, Johannes; Autry, Anita E; Kadiri, Lolahon; Umadevi Venkataraju, Kannan; Zhou, Yu; Wang, Victoria X; Tang, Cheuk Y; Olsen, Olav; Dulac, Catherine; Osten, Pavel; Tessier-Lavigne, Marc

    2016-06-16

    Understanding how neural information is processed in physiological and pathological states would benefit from precise detection, localization, and quantification of the activity of all neurons across the entire brain, which has not, to date, been achieved in the mammalian brain. We introduce a pipeline for high-speed acquisition of brain activity at cellular resolution through profiling immediate early gene expression using immunostaining and light-sheet fluorescence imaging, followed by automated mapping and analysis of activity by an open-source software program we term ClearMap. We validate the pipeline first by analysis of brain regions activated in response to haloperidol. Next, we report new cortical regions downstream of whisker-evoked sensory processing during active exploration. Last, we combine activity mapping with axon tracing to uncover new brain regions differentially activated during parenting behavior. This pipeline is widely applicable to different experimental paradigms, including animal species for which transgenic activity reporters are not readily available. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. Mapping social behavior-induced brain activation at cellular resolution in the mouse

    PubMed Central

    Kim, Yongsoo; Venkataraju, Kannan Umadevi; Pradhan, Kith; Mende, Carolin; Taranda, Julian; Turaga, Srinivas C.; Arganda-Carreras, Ignacio; Ng, Lydia; Hawrylycz, Michael J.; Rockland, Kathleen; Seung, H. Sebastian; Osten, Pavel

    2014-01-01

    Understanding how brain activation mediates behaviors is a central goal of systems neuroscience. Here we apply an automated method for mapping brain activation in the mouse in order to probe how sex-specific social behaviors are represented in the male brain. Our method uses the immediate early gene c-fos, a marker of neuronal activation, visualized by serial two-photon tomography: the c-fos-GFP-positive neurons are computationally detected, their distribution is registered to a reference brain and a brain atlas, and their numbers are analyzed by statistical tests. Our results reveal distinct and shared female and male interaction-evoked patterns of male brain activation representing sex discrimination and social recognition. We also identify brain regions whose degree of activity correlates to specific features of social behaviors and estimate the total numbers and the densities of activated neurons per brain areas. Our study opens the door to automated screening of behavior-evoked brain activation in the mouse. PMID:25558063
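After registration to a reference atlas, per-region statistics of the kind described reduce to counting detected cells inside each labeled region. A toy sketch of that final counting step (the atlas and coordinates are invented; this is not the authors' code):

```python
import numpy as np

# Toy labelled "atlas": region ids 0 (background), 1 and 2.
atlas = np.zeros((10, 10), dtype=int)
atlas[:, :5] = 1
atlas[:, 5:] = 2

# Detected, already-registered cell coordinates (row, col).
cells = np.array([[2, 1], [3, 2], [7, 8], [0, 6]])

# Look up the region id under each cell, then count per region.
regions = atlas[cells[:, 0], cells[:, 1]]
counts = np.bincount(regions, minlength=atlas.max() + 1)
# counts[k] is the number of detected cells in region k.
```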

  10. Mapping of brain activity by automated volume analysis of immediate early genes

    PubMed Central

    Renier, Nicolas; Adams, Eliza L.; Kirst, Christoph; Wu, Zhuhao; Azevedo, Ricardo; Kohl, Johannes; Autry, Anita E.; Kadiri, Lolahon; Venkataraju, Kannan Umadevi; Zhou, Yu; Wang, Victoria X.; Tang, Cheuk Y.; Olsen, Olav; Dulac, Catherine; Osten, Pavel; Tessier-Lavigne, Marc

    2016-01-01

    Summary Understanding how neural information is processed in physiological and pathological states would benefit from precise detection, localization and quantification of the activity of all neurons across the entire brain, which has not to date been achieved in the mammalian brain. We introduce a pipeline for high speed acquisition of brain activity at cellular resolution through profiling immediate early gene expression using immunostaining and light-sheet fluorescence imaging, followed by automated mapping and analysis of activity by an open-source software program we term ClearMap. We validate the pipeline first by analysis of brain regions activated in response to Haloperidol. Next, we report new cortical regions downstream of whisker-evoked sensory processing during active exploration. Lastly, we combine activity mapping with axon tracing to uncover new brain regions differentially activated during parenting behavior. This pipeline is widely applicable to different experimental paradigms, including animal species for which transgenic activity reporters are not readily available. PMID:27238021

  11. Electrophoretic mobility shift scanning using an automated infrared DNA sequencer.

    PubMed

    Sano, M; Ohyama, A; Takase, K; Yamamoto, M; Machida, M

    2001-11-01

    Electrophoretic mobility shift assay (EMSA) is widely used in the study of sequence-specific DNA-binding proteins, including transcription factors and mismatch binding proteins. We have established a non-radioisotope-based protocol for EMSA that features an automated DNA sequencer with an infrared fluorescent dye (IRDye) detection unit. Our modification of the electrophoresis unit, which includes cooling the gel plates with a reduced well-to-read length, has made it possible to detect shifted bands within 1 h. Further, we have developed a rapid ligation-based method for generating IRDye-labeled probes with an approximately 60% cost reduction. This method has the advantages of real-time scanning, stability of labeled probes, and better safety associated with nonradioactive methods of detection. Analysis of a promoter from an industrially important filamentous fungus, Aspergillus oryzae, in a prototype experiment revealed that the method we describe has potential for use in systematic scanning and identification of the functionally important elements to which cellular factors bind in a sequence-specific manner.

  12. Benefits of Ilizarov automated bone distraction for nerves and articular cartilage in experimental leg lengthening.

    PubMed

    Shchudlo, Nathalia; Varsegova, Tatyana; Stupina, Tatyana; Shchudlo, Michael; Saifutdinov, Marat; Yemanov, Andrey

    2017-09-18

    To determine peculiarities of tissue responses to manual and automated Ilizarov bone distraction in nerves and articular cartilage. Twenty-nine dogs were divided into two experimental groups, Group M - leg lengthening with manual distraction (1 mm/d in 4 steps) and Group A - automated distraction (1 mm/d in 60 steps), plus an intact group. Animals were euthanized at the end of distraction, on the 30th day of fixation in the apparatus, and 30 d after the fixator removal. M-responses in the gastrocnemius and tibialis anterior muscles were recorded, and numerical histology of peroneal and tibial nerves and of knee cartilage semi-thin sections, scanning electron microscopy, and X-ray electron probe microanalysis were performed. Better restoration of M-response amplitudes in leg muscles was noted in the A-group. Fibrosis of the epineurium with adipocyte loss in the peroneal nerve, and subperineurial edema and fibrosis of the endoneurium in some fascicles of both nerves, were noted only in the M-group; the proportions of nerve fibers with atrophic and degenerative changes were larger in the M-group than in the A-group. At the end of the experiment, morphometric parameters of nerve fibers in the peroneal nerve were comparable with the intact nerve only in the A-group. Quantitative parameters of articular cartilage (thickness, volumetric densities of chondrocytes, percentages of isogenic clusters and empty cellular lacunae, contents of sulfur and calcium) were markedly altered in the M-group and less altered in the A-group. Automated Ilizarov distraction is a safer method of orthopedic leg lengthening than manual distraction in terms of nerve fiber survival and arthrotic changes of the articular cartilage.

  13. Benefits of Ilizarov automated bone distraction for nerves and articular cartilage in experimental leg lengthening

    PubMed Central

    Shchudlo, Nathalia; Varsegova, Tatyana; Stupina, Tatyana; Shchudlo, Michael; Saifutdinov, Marat; Yemanov, Andrey

    2017-01-01

    AIM To determine peculiarities of tissue responses to manual and automated Ilizarov bone distraction in nerves and articular cartilage. METHODS Twenty-nine dogs were divided into two experimental groups, Group M - leg lengthening with manual distraction (1 mm/d in 4 steps) and Group A - automated distraction (1 mm/d in 60 steps), plus an intact group. Animals were euthanized at the end of distraction, on the 30th day of fixation in the apparatus, and 30 d after the fixator removal. M-responses in the gastrocnemius and tibialis anterior muscles were recorded, and numerical histology of peroneal and tibial nerves and of knee cartilage semi-thin sections, scanning electron microscopy, and X-ray electron probe microanalysis were performed. RESULTS Better restoration of M-response amplitudes in leg muscles was noted in the A-group. Fibrosis of the epineurium with adipocyte loss in the peroneal nerve, and subperineurial edema and fibrosis of the endoneurium in some fascicles of both nerves, were noted only in the M-group; the proportions of nerve fibers with atrophic and degenerative changes were larger in the M-group than in the A-group. At the end of the experiment, morphometric parameters of nerve fibers in the peroneal nerve were comparable with the intact nerve only in the A-group. Quantitative parameters of articular cartilage (thickness, volumetric densities of chondrocytes, percentages of isogenic clusters and empty cellular lacunae, contents of sulfur and calcium) were markedly altered in the M-group and less altered in the A-group. CONCLUSION Automated Ilizarov distraction is a safer method of orthopedic leg lengthening than manual distraction in terms of nerve fiber survival and arthrotic changes of the articular cartilage. PMID:28979852

  14. A comparison of long-term parallel measurements of sunshine duration obtained with a Campbell-Stokes sunshine recorder and two automated sunshine sensors

    NASA Astrophysics Data System (ADS)

    Baumgartner, D. J.; Pötzi, W.; Freislich, H.; Strutzmann, H.; Veronig, A. M.; Foelsche, U.; Rieder, H. E.

    2017-06-01

    In recent decades, automated sensors for sunshine duration (SD) measurements have been introduced in meteorological networks, thereby replacing traditional instruments, most prominently the Campbell-Stokes (CS) sunshine recorder. Parallel records of automated and traditional SD recording systems are rare. Nevertheless, such records are important to understand the differences/similarities in SD totals obtained with different instruments and how changes in monitoring device type affect the homogeneity of SD records. This study investigates the differences/similarities in parallel SD records obtained with a CS and two automated SD sensors between 2007 and 2016 at the Kanzelhöhe Observatory, Austria. Comparing individual records of daily SD totals, we find differences of both positive and negative sign, with smallest differences between the automated sensors. The larger differences between CS-derived SD totals and those from automated sensors can be attributed (largely) to the higher sensitivity threshold of the CS instrument. Correspondingly, the closest agreement among all sensors is found during summer, the time of year when sensitivity thresholds are least critical. Furthermore, we investigate the performance of various models to create the so-called sensor-type-equivalent (STE) SD records. Our analysis shows that regression models including all available data on daily (or monthly) time scale perform better than simple three- (or four-) point regression models. Despite general good performance, none of the considered regression models (of linear or quadratic form) emerges as the "optimal" model. Although STEs prove useful for relating SD records of individual sensors on daily/monthly time scales, this does not ensure that STE (or joint) records can be used for trend analysis.
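A sensor-type-equivalent record of the kind discussed is, in its linear form, simply a regression of one sensor's daily SD totals on the other's, fitted on all available daily data. A sketch on synthetic data; the coefficients, noise level, and record length are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
cs = rng.uniform(0.0, 14.0, 365)                    # CS daily SD totals (h), synthetic
auto = 0.4 + 1.05 * cs + rng.normal(0.0, 0.3, 365)  # synthetic automated-sensor totals

# Linear STE model fitted on all daily data: auto ≈ a + b * cs.
A = np.vstack([np.ones_like(cs), cs]).T
(a, b), *_ = np.linalg.lstsq(A, auto, rcond=None)
ste = a + b * cs                                    # STE record of the CS series
```

The study's point that full-data regressions outperform three- or four-point schemes corresponds here to fitting (a, b) on all 365 days rather than on a handful of reference values.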

  15. Comparison of measured and modeled radiation, heat and water vapor fluxes: FIFE pilot study

    NASA Technical Reports Server (NTRS)

    Blad, Blaine L.; Hubbard, Kenneth G.; Verma, Shashi B.; Starks, Patrick; Norman, John M.; Walter-Shea, Elizabeth

    1987-01-01

    The feasibility of using radio frequency receivers to collect routine weather data from automated weather stations, and of using these data to model fluxes of latent heat, sensible heat, and radiation, was tested, and the estimated fluxes were compared with fluxes measured over wheat. The model Cupid was used to model the fluxes. Two or more automated weather stations, interrogated by radio frequency and other means, were utilized to examine some of the climatic variability of the First ISLSCP (International Satellite Land-Surface Climatology Project) Field Experiment (FIFE) site, to measure and model reflected and emitted radiation streams from various locations at the site, and to compare modeled latent and sensible heat fluxes with measured values. Some bidirectional reflected and emitted radiation data were collected from 23 locations throughout the FIFE site. Analysis of these data, along with analysis of the measured sensible and latent heat fluxes, is just beginning.

  16. Automated Analysis of Stateflow Models

    NASA Technical Reports Server (NTRS)

    Bourbouh, Hamza; Garoche, Pierre-Loic; Garion, Christophe; Gurfinkel, Arie; Kahsaia, Temesghen; Thirioux, Xavier

    2017-01-01

    Stateflow is a widely used modeling framework for embedded and cyber-physical systems where control software interacts with physical processes. In this work, we present a fully automated safety verification framework for Stateflow models. Our approach is two-fold: (i) we faithfully compile Stateflow models into hierarchical state machines, and (ii) we use an automated logic-based verification engine to decide the validity of safety properties. The starting point of our approach is a denotational semantics of Stateflow. We propose a compilation process using continuation-passing style (CPS) denotational semantics. Our compilation technique preserves the structural and modal behavior of the system. The overall approach is implemented as an open source toolbox that can be integrated into the existing Mathworks Simulink Stateflow modeling framework. We present preliminary experimental evaluations that illustrate the effectiveness of our approach in code generation and safety verification of industrial scale Stateflow models.
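At its core, checking a safety property on a compiled state machine amounts to asking whether any "bad" state is reachable from the initial state. A much-simplified sketch on a flat, hand-written machine (real Stateflow models are hierarchical, and the paper uses a logic-based engine rather than explicit-state search):

```python
from collections import deque

# Toy state machine (hypothetical, far simpler than compiled Stateflow).
# Safety property: the state "error" is never reachable from "idle".
transitions = {
    "idle":    ["heating"],
    "heating": ["idle", "cooling"],
    "cooling": ["idle"],
    "error":   [],
}

def reachable(start, trans):
    """Breadth-first reachability: the skeleton of an explicit safety check."""
    seen, queue = {start}, deque([start])
    while queue:
        s = queue.popleft()
        for nxt in trans.get(s, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

safe = "error" not in reachable("idle", transitions)
```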

  17. HiRel: Hybrid Automated Reliability Predictor (HARP) integrated reliability tool system, (version 7.0). Volume 2: HARP tutorial

    NASA Technical Reports Server (NTRS)

    Rothmann, Elizabeth; Dugan, Joanne Bechta; Trivedi, Kishor S.; Mittal, Nitin; Bavuso, Salvatore J.

    1994-01-01

    The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. The Hybrid Automated Reliability Predictor (HARP) tutorial provides insight into HARP modeling techniques and the interactive textual prompting input language via a step-by-step explanation and demonstration of HARP's fault occurrence/repair model and the fault/error handling models. Example applications are worked in their entirety and the HARP tabular output data are presented for each. Simple models are presented at first with each succeeding example demonstrating greater modeling power and complexity. This document is not intended to present the theoretical and mathematical basis for HARP.

  18. Agent Models for Self-Motivated Home-Assistant Bots

    NASA Astrophysics Data System (ADS)

    Merrick, Kathryn; Shafi, Kamran

    2010-01-01

    Modern society increasingly relies on technology to support everyday activities. In the past, this technology has focused on automation, using computer technology embedded in physical objects. More recently, there is an expectation that this technology will not just embed reactive automation, but also embed intelligent, proactive automation in the environment. That is, there is an emerging desire for novel technologies that can monitor, assist, inform or entertain when required, and not just when requested. This paper presents three self-motivated, home-assistant bot applications using different self-motivated agent models. Self-motivated agents use a computational model of motivation to generate goals proactively. Technologies based on self-motivated agents can thus respond autonomously and proactively to stimuli from their environment. Three prototypes of different self-motivated agent models, using different computational models of motivation, are described to demonstrate these concepts.
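Computational models of motivation of the kind described often score stimuli with an interest function that peaks at moderate novelty, letting the agent generate goals proactively from the most interesting stimulus rather than waiting for a request. A toy sketch; the function shape, parameters, and event data are illustrative assumptions, not any of the paper's three prototypes:

```python
import math

def interest(novelty, peak=0.5, width=0.15):
    """Interest as a bell curve over novelty: familiar and wholly alien
    stimuli are both less motivating than moderately novel ones
    (parameters are illustrative assumptions)."""
    return math.exp(-((novelty - peak) ** 2) / (2 * width ** 2))

def choose_goal(observations):
    """Self-motivated goal generation: attend to the most interesting stimulus."""
    return max(observations, key=lambda o: interest(o["novelty"]))

events = [{"name": "door_opened", "novelty": 0.05},
          {"name": "unknown_sound", "novelty": 0.55},
          {"name": "alarm_pattern", "novelty": 0.95}]
goal = choose_goal(events)   # the moderately novel event wins
```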

  19. Framework for Human-Automation Collaboration: Conclusions from Four Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oxstrand, Johanna; Le Blanc, Katya L.; O'Hara, John

    The Human Automation Collaboration (HAC) research project is investigating how advanced technologies that are planned for Advanced Small Modular Reactors (AdvSMR) will affect the performance and the reliability of the plant from a human factors and human performance perspective. The HAC research effort investigates the consequences of allocating functions between the operators and automated systems. More specifically, the research team is addressing how to best design the collaboration between the operators and the automated systems in a manner that has the greatest positive impact on overall plant performance and reliability. Oxstrand et al. (2013 - March) describes the efforts conducted by the researchers to identify the research needs for HAC. The research team reviewed the literature on HAC, developed a model of HAC, and identified gaps in the existing knowledge of human-automation collaboration. As described in Oxstrand et al. (2013 - June), the team then prioritized the research topics identified based on the specific needs in the context of AdvSMR. The prioritization was based on two sources of input: 1) the preliminary functions and tasks, and 2) the model of HAC. As a result, three analytical studies were planned and conducted: 1) Models of Teamwork, 2) Standardized HAC Performance Measurement Battery, and 3) Initiators and Triggering Conditions for Adaptive Automation. Additionally, one field study was conducted at Idaho Falls Power.

  20. Subtle changes in myelination due to childhood experiences: label-free microscopy to infer nerve fibers morphology and myelination in brain (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Gasecka, Alicja; Tanti, Arnaud; Lutz, Pierre-Eric; Mechawar, Naguib; Cote, Daniel C.

    2017-02-01

    Adverse childhood experiences have lasting detrimental effects on mental health and are strongly associated with impaired cognition and increased risk of developing psychopathologies. Preclinical and neuroimaging studies have suggested that traumatic events during brain development can affect cerebral myelination, particularly in areas and tracts implicated in mood and emotion. Although current neuroimaging techniques are quite powerful, they lack the resolution to infer myelin integrity at the cellular level. Recently demonstrated coherent Raman microscopy has accomplished cellular-level imaging of myelin sheaths in the nervous system. However, quantitative morphometric analysis of nerve fibers still remains a challenge, particularly in the brain, where fibers exhibit small diameters and varying local orientation. In this work, we developed an automated myelin identification and analysis method that is capable of providing a complete picture of axonal myelination and morphology in brain samples. This method performs three main procedures: 1) it detects molecular anisotropy of membrane phospholipids based on polarization-resolved coherent Raman microscopy, 2) it identifies regions of different molecular organization, and 3) it calculates morphometric features of myelinated axons (e.g. myelin thickness, g-ratio). We applied this method to examine white matter areas from adult suicides who had suffered early-life adversity and depression, compared with depressed adult suicides and psychiatrically healthy controls. We demonstrate that our method allows for the rapid acquisition and automated analysis of neuronal network morphology and myelination. This is especially useful for clinical and comparative studies, and may greatly enhance the understanding of processes underlying the neurobiological and psychopathological consequences of child abuse.
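One of the morphometric features named above, the g-ratio, is conventionally the ratio of the inner (axon) diameter to the outer (axon plus myelin) fiber diameter; assuming circular cross-sections, it follows directly from segmented areas. A minimal sketch (the circularity assumption is ours, for illustration; real fiber profiles are irregular):

```python
import math

def g_ratio(axon_area, fiber_area):
    """g-ratio from segmented cross-sectional areas, assuming circular
    profiles: g = inner (axon) diameter / outer (fiber) diameter."""
    return math.sqrt(axon_area / fiber_area)

def myelin_thickness(axon_area, fiber_area):
    """Myelin sheath thickness under the same circularity assumption."""
    d_in = 2 * math.sqrt(axon_area / math.pi)
    d_out = 2 * math.sqrt(fiber_area / math.pi)
    return (d_out - d_in) / 2
```

For example, an axon of radius 1 µm inside a fiber of radius 1.5 µm gives g = 2/3 and a sheath 0.5 µm thick.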

  1. Cellular Responses to Mechanical Stress Selected Contribution: A Three-Dimensional Model for Assessment of in Vitro Toxicity in Balaena Mysticetus Renal Tissue

    NASA Technical Reports Server (NTRS)

    Goodwin, T. J.; Coate-Li, L.; Linnehan, R. M.; Hammond, T. G.

    2000-01-01

    This study established two- and three-dimensional renal proximal tubular cell cultures of the endangered species bowhead whale (Balaena mysticetus), developed SV40-transfected cultures, and cloned the 61-amino acid open reading frame for the metallothionein protein, the primary binding site for heavy metal contamination in mammals. Microgravity research, modulations in mechanical culture conditions (modeled microgravity), and shear stress have spawned innovative approaches to understanding the dynamics of cellular interactions, gene expression, and differentiation in several cellular systems. These investigations have led to the creation of ex vivo tissue models capable of serving as physiological research analogs for three-dimensional cellular interactions. These models are enabling studies in immune function, tissue modeling for basic research, and neoplasia. Three-dimensional cellular models emulate aspects of in vivo cellular architecture and physiology and may facilitate environmental toxicological studies aimed at elucidating biological functions and responses at the cellular level. Marine mammals occupy a significant ecological niche (72% of the Earth's surface is water) in terms of the potential for information on bioaccumulation and transport of terrestrial and marine environmental toxins in high-order vertebrates. Few ex vivo models of marine mammal physiology exist in vitro to accomplish the aforementioned studies. Techniques developed in this investigation, based on previous tissue modeling successes, may serve to facilitate similar research in other marine mammals.

  2. A Comprehensive Automated 3D Approach for Building Extraction, Reconstruction, and Regularization from Airborne Laser Scanning Point Clouds

    PubMed Central

    Dorninger, Peter; Pfeifer, Norbert

    2008-01-01

    Three-dimensional city models are necessary for supporting numerous management applications. For the determination of city models for visualization purposes, several standardized workflows exist, based either on photogrammetry, on LiDAR, or on a combination of both data acquisition techniques. However, the automated determination of reliable and highly accurate city models is still a challenging task, requiring a workflow comprising several processing steps. The most relevant are building detection, building outline generation, building modeling, and finally, building quality analysis. Commercial software tools for building modeling generally require a high degree of human interaction, and most automated approaches described in the literature address the steps of such a workflow individually. In this article, we propose a comprehensive approach for the automated determination of 3D city models from airborne acquired point cloud data. It is based on the assumption that individual buildings can be modeled properly by a composition of a set of planar faces. Hence, it relies on a reliable 3D segmentation algorithm that detects planar faces in a point cloud. This segmentation is of crucial importance for the outline detection and for the modeling approach. We describe the theoretical background, the segmentation algorithm, the outline detection, and the modeling approach, and we present and discuss several actual projects. PMID:27873931
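
    The planar-face detection underpinning the segmentation can be illustrated, under simplifying assumptions, by a single-plane RANSAC fit; the function below is a toy stand-in for the authors' robust 3D segmentation, which detects many planes rather than one:

```python
import numpy as np

def ransac_plane(points, n_iter=200, threshold=0.05, seed=None):
    """Find the dominant plane in an (N, 3) point cloud.
    Returns (unit normal, a point on the plane, boolean inlier mask)."""
    rng = np.random.default_rng(seed)
    best_mask, best_n, best_p = None, None, None
    for _ in range(n_iter):
        p1, p2, p3 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p2 - p1, p3 - p1)
        norm = np.linalg.norm(n)
        if norm < 1e-12:            # degenerate (collinear) sample
            continue
        n = n / norm
        mask = np.abs((points - p1) @ n) < threshold
        if best_mask is None or mask.sum() > best_mask.sum():
            best_mask, best_n, best_p = mask, n, p1
    return best_n, best_p, best_mask

# Synthetic cloud: 200 points near the plane z = 0 plus 20 scattered outliers
rng = np.random.default_rng(0)
ground = np.c_[rng.uniform(0, 10, (200, 2)), rng.normal(0, 0.01, 200)]
outliers = rng.uniform(0, 10, (20, 3))
normal, origin, inliers = ransac_plane(np.vstack([ground, outliers]), seed=0)
```

    Building pipelines iterate this idea, removing the inliers of one face and detecting the next, before outline generation.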

  3. Driver-centred vehicle automation: using network analysis for agent-based modelling of the driver in highly automated driving systems.

    PubMed

    Banks, Victoria A; Stanton, Neville A

    2016-11-01

    To the average driver, the concept of automation in driving implies that they can become completely 'hands and feet free'. This, however, is a common misconception, as shown through the application of Network Analysis to new Cruise Assist technologies that may feature on our roads by 2020. Through the adoption of a Systems Theoretic approach, this paper introduces the concept of driver-initiated automation, which reflects the role of the driver in highly automated driving systems. Using a combination of traditional task analysis and the application of quantitative network metrics, this agent-based modelling paper shows how the role of the driver remains an integral part of the driving system, implying that designers need to ensure drivers are provided with the tools necessary to remain actively in the loop despite being given increasing opportunities to delegate control to the automated subsystems. Practitioner Summary: This paper describes and analyses a driver-initiated command and control system of automation, using representations afforded by task and social networks to understand how drivers remain actively involved in the task. A network analysis of different driver commands suggests that such a strategy does maintain the driver in the control loop.
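
    As an illustration of the quantitative network metrics mentioned, the sketch below computes degree centrality over a small driver task network; the node and edge names are invented for illustration, not taken from the paper's Cruise Assist analysis:

```python
# Hypothetical driver task network: edges link agents/artifacts that interact
edges = [
    ("driver", "hmi_display"), ("driver", "cruise_assist"),
    ("driver", "steering"), ("driver", "pedals"),
    ("cruise_assist", "radar"), ("cruise_assist", "pedals"),
]

def degree_centrality(edges):
    """Fraction of the other nodes each node is directly linked to."""
    nodes = {n for e in edges for n in e}
    deg = {n: 0 for n in nodes}
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    return {n: deg[n] / (len(nodes) - 1) for n in nodes}

centrality = degree_centrality(edges)
most_central = max(centrality, key=centrality.get)   # the driver, in this toy network
```

    A driver retaining the highest centrality even with assistance engaged is the kind of evidence used to argue that the driver stays in the control loop.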

  4. Reconstruction and Validation of a Genome-Scale Metabolic Model for the Filamentous Fungus Neurospora crassa Using FARM

    PubMed Central

    Hood, Heather M.; Ocasio, Linda R.; Sachs, Matthew S.; Galagan, James E.

    2013-01-01

    The filamentous fungus Neurospora crassa played a central role in the development of twentieth-century genetics, biochemistry and molecular biology, and continues to serve as a model organism for eukaryotic biology. Here, we have reconstructed a genome-scale model of its metabolism. This model consists of 836 metabolic genes, 257 pathways, 6 cellular compartments, and is supported by extensive manual curation of 491 literature citations. To aid our reconstruction, we developed three optimization-based algorithms, which together comprise Fast Automated Reconstruction of Metabolism (FARM). These algorithms are: LInear MEtabolite Dilution Flux Balance Analysis (limed-FBA), which predicts flux while linearly accounting for metabolite dilution; One-step functional Pruning (OnePrune), which removes blocked reactions with a single compact linear program; and Consistent Reproduction Of growth/no-growth Phenotype (CROP), which reconciles differences between in silico and experimental gene essentiality faster than previous approaches. Against an independent test set of more than 300 essential/non-essential genes that were not used to train the model, the model displays 93% sensitivity and specificity. We also used the model to simulate the biochemical genetics experiments originally performed on Neurospora by comprehensively predicting nutrient rescue of essential genes and synthetic lethal interactions, and we provide detailed pathway-based mechanistic explanations of our predictions. Our model provides a reliable computational framework for the integration and interpretation of ongoing experimental efforts in Neurospora, and we anticipate that our methods will substantially reduce the manual effort required to develop high-quality genome-scale metabolic models for other organisms. PMID:23935467
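
    Plain flux balance analysis, the baseline that limed-FBA extends with metabolite-dilution terms, is a linear program: maximize a target flux subject to steady state S·v = 0 and flux bounds. A minimal sketch on an invented two-reaction network, not part of the Neurospora reconstruction:

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: uptake -> metabolite A -> biomass
# Columns (reactions): v_uptake, v_biomass; row (metabolite): A
S = np.array([[1.0, -1.0]])          # uptake produces A, biomass consumes A
bounds = [(0.0, 10.0), (0.0, None)]  # uptake capped at 10 flux units

# linprog minimizes, so negate the biomass coefficient to maximize it
res = linprog(c=[0.0, -1.0], A_eq=S, b_eq=[0.0], bounds=bounds, method="highs")
v_uptake, v_biomass = res.x          # both pushed to the uptake limit of 10
```

    Gene essentiality predictions then follow by constraining the fluxes of a gene's reactions to zero and re-solving.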

  5. The ECLSS Advanced Automation Project Evolution and Technology Assessment

    NASA Technical Reports Server (NTRS)

    Dewberry, Brandon S.; Carnes, James R.; Lukefahr, Brenda D.; Rogers, John S.; Rochowiak, Daniel M.; Mckee, James W.; Benson, Brian L.

    1990-01-01

    Viewgraphs on Environmental Control and Life Support System (ECLSS) advanced automation project evolution and technology assessment are presented. Topics covered include: the ECLSS advanced automation project; descriptions of automatic fault diagnosis of ECLSS subsystems; in-line, real-time chemical and microbial fluid analysis; and a description of object-oriented, distributed chemical and microbial modeling of regenerative environmental control systems.

  6. Automated data acquisition technology development: Automated modeling and control development

    NASA Technical Reports Server (NTRS)

    Romine, Peter L.

    1995-01-01

    This report documents the completion of, and improvements made to, the software developed for automated data acquisition and automated modeling and control development on the Texas Micro rackmounted PCs. This research was initiated because the Metal Processing Branch of NASA Marshall Space Flight Center identified a need for a mobile data acquisition and data analysis system, customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument strictly for data acquisition and data analysis. In addition to the data acquisition functions described in this report, the WMS also supports many functions associated with process control. The hardware and software requirements for an automated acquisition system for welding process parameters, welding equipment checkout, and welding process modeling were determined in 1992. From these recommendations, NASA purchased the necessary hardware and software. The new welding acquisition system is designed to collect welding parameter data and perform analysis to determine the relationship between arc voltage, current, and arc length for VPPA welding. Once the results of this analysis are obtained, they can then be used to develop a RAIL function to control welding startup and shutdown without torch crashing.
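
    The voltage analysis described can be sketched as an ordinary least-squares fit of arc voltage against current and arc length; the coefficients and sample values below are hypothetical, not WMS measurements:

```python
import numpy as np

# Hypothetical samples following V = 10 + 0.05*I + 1.2*L, noiseless for illustration
current = np.array([100.0, 120.0, 140.0, 160.0, 180.0])  # A
arc_len = np.array([3.0, 3.5, 3.0, 4.0, 3.5])            # mm
voltage = 10.0 + 0.05 * current + 1.2 * arc_len          # V

# Fit V = a + b*I + c*L by least squares
A = np.c_[np.ones_like(current), current, arc_len]
(a, b, c), *_ = np.linalg.lstsq(A, voltage, rcond=None)
```

    With noisy acquisitions the same fit yields the empirical relationship that a startup/shutdown control function could interpolate.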

  7. Automation and decision support in interactive consumer products.

    PubMed

    Sauer, J; Rüttinger, B

    2007-06-01

    This article presents two empirical studies (n = 30, n = 48) concerned with different forms of automation in interactive consumer products. The goal of the studies was to evaluate the effectiveness of two types of automation: perceptual augmentation (i.e. supporting users' information acquisition and analysis) and control integration (i.e. supporting users' action selection and implementation). Furthermore, the effectiveness of on-product information (i.e. labels attached to the product) in supporting automation design was evaluated. The findings suggested greater benefits for automation in control integration than in perceptual augmentation alone, which may be partly due to the specific requirements of consumer product usage. If employed appropriately, on-product information can be a helpful means of information conveyance. The article discusses the implications of automation design in interactive consumer products while drawing on automation models from the work environment.

  8. Automated finite element modeling of the lumbar spine: Using a statistical shape model to generate a virtual population of models.

    PubMed

    Campbell, J Q; Petrella, A J

    2016-09-06

    Population-based modeling of the lumbar spine has the potential to be a powerful clinical tool. However, developing a fully parameterized model of the lumbar spine with accurate geometry has remained a challenge. The current study used automated methods for landmark identification to create a statistical shape model of the lumbar spine. The shape model was evaluated using compactness, generalization ability, and specificity. The primary shape modes were analyzed visually, quantitatively, and biomechanically. The biomechanical analysis was performed by using the statistical shape model with an automated method for finite element model generation to create a fully parameterized finite element model of the lumbar spine. Functional finite element models of the mean shape and the extreme shapes (±3 standard deviations) of all 17 shape modes were created demonstrating the robust nature of the methods. This study represents an advancement in finite element modeling of the lumbar spine and will allow population-based modeling in the future. Copyright © 2016 Elsevier Ltd. All rights reserved.
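
    The core of such a statistical shape model, a mean shape plus orthogonal modes scaled in standard deviations, can be sketched with an SVD; the synthetic landmark data below are invented for illustration and stand in for registered lumbar geometries:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical training set: 30 shapes x 24 flattened coordinates, one dominant mode
n_shapes, n_coords = 30, 24
base = rng.normal(0.0, 1.0, n_coords)
mode_true = rng.normal(0.0, 1.0, n_coords)
shapes = base + np.outer(rng.normal(0.0, 2.0, n_shapes), mode_true)

mean_shape = shapes.mean(axis=0)
U, s, Vt = np.linalg.svd(shapes - mean_shape, full_matrices=False)
modes = Vt                           # rows: orthonormal shape modes
stddevs = s / np.sqrt(n_shapes - 1)  # per-mode standard deviations

def synthesize(k, n_sd):
    """Shape instance n_sd standard deviations along mode k (e.g. the +/-3 SD extremes)."""
    return mean_shape + n_sd * stddevs[k] * modes[k]

extreme_plus = synthesize(0, +3.0)
extreme_minus = synthesize(0, -3.0)
```

    Each synthesized instance could then seed an automated finite element mesh generation step, which is how the study builds its virtual population.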

  9. A framework for the automated data-driven constitutive characterization of composites

    Treesearch

    J.G. Michopoulos; John Hermanson; T. Furukawa; A. Iliopoulos

    2010-01-01

    We present advances on the development of a mechatronically and algorithmically automated framework for the data-driven identification of constitutive material models based on energy density considerations. These models can capture both the linear and nonlinear constitutive response of multiaxially loaded composite materials in a manner that accounts for progressive...

  10. The Function of Semantics in Automated Language Processing.

    ERIC Educational Resources Information Center

    Pacak, Milos; Pratt, Arnold W.

    This paper is a survey of some of the major semantic models that have been developed for automated semantic analysis of natural language. Current approaches to semantic analysis and logical inference are based mainly on models of human cognitive processes, such as Quillian's semantic memory, Simmons' Protosynthex III and others. All existing…

  11. County-level job automation risk and health: Evidence from the United States.

    PubMed

    Patel, Pankaj C; Devaraj, Srikant; Hicks, Michael J; Wornell, Emily J

    2018-04-01

    Previous studies have observed a positive association between automation risk and employment loss. Based on the job insecurity-health risk hypothesis, greater exposure to automation risk could also be negatively associated with health outcomes. The main objective of this paper is to investigate the county-level association between the prevalence of workers in jobs exposed to automation risk and general, physical, and mental health outcomes. As a preliminary assessment of the job insecurity-health risk hypothesis (automation risk → job insecurity → poorer health), a structural equation model was used based on individual-level data in two cross-sectional waves (2012 and 2014) of the General Social Survey (GSS). Next, using county-level data from County Health Rankings 2017, the American Community Survey (ACS) 2015, and Statistics of US Businesses 2014, Two-Stage Least Squares (2SLS) regression models were fitted to predict county-level health outcomes. Using the 2012 and 2014 waves of the GSS, employees in occupational classes at higher risk of automation reported more job insecurity, which, in turn, was associated with poorer health. The 2SLS estimates show that a 10% increase in automation risk at the county level is associated with 2.38, 0.8, and 0.6 percentage points lower general, physical, and mental health, respectively. Evidence suggests that exposure to automation risk may be negatively associated with health outcomes, plausibly through perceptions of poorer job security. More research is needed on interventions aimed at mitigating the negative influence of automation risk on health. Copyright © 2018 Elsevier Ltd. All rights reserved.
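
    The two-stage mechanics of 2SLS can be sketched with simulated data; the instrument, variables, and true slope below are hypothetical and do not reproduce the paper's specification:

```python
import numpy as np

def two_stage_least_squares(y, x, z):
    """2SLS with one endogenous regressor x and one instrument z (1-D arrays).
    Stage 1: regress x on z; Stage 2: regress y on the fitted x-hat."""
    Z = np.c_[np.ones_like(z), z]
    x_hat = Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    X = np.c_[np.ones_like(x_hat), x_hat]
    return np.linalg.lstsq(X, y, rcond=None)[0][1]   # stage-2 slope

# Simulated counties: instrument z shifts automation risk x, which lowers health y
rng = np.random.default_rng(1)
z = rng.normal(0.0, 1.0, 500)
x = 2.0 * z + rng.normal(0.0, 0.1, 500)      # automation risk
y = -0.238 * x + rng.normal(0.0, 0.1, 500)   # health index, true slope -0.238
slope = two_stage_least_squares(y, x, z)     # recovers roughly -0.238
```

    Instrumenting through z is what lets the second stage estimate the slope without bias from unobserved confounders of x.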

  12. Individual differences in response to automation: the five factor model of personality.

    PubMed

    Szalma, James L; Taylor, Grant S

    2011-06-01

    This study examined the relationship of operator personality (Five Factor Model) and characteristics of the task and of adaptive automation (reliability and adaptiveness, i.e. whether the automation was well matched to changes in task demand) to operator performance, workload, stress, and coping. This represents the first investigation of how the Five Factors relate to human response to automation. One hundred sixty-one college students experienced either 75% or 95% reliable automation provided with task loads of either two or four displays to be monitored. The task required threat detection in a simulated uninhabited ground vehicle (UGV) task. Task demand exerted the strongest influence on outcome variables. Automation characteristics did not directly impact workload or stress, but effects did emerge in the context of trait-task interactions that varied as a function of the dimension of workload and stress. The pattern of relationships of traits to dependent variables was generally moderated by at least one task factor. Neuroticism was related to poorer performance in some conditions, and all five traits were associated with at least one measure of workload and stress. Neuroticism generally predicted increased workload and stress and the other traits predicted decreased levels of these states. However, in the case of the relation of Extraversion and Agreeableness to Worry, Frustration, and avoidant coping, the direction of effects varied across task conditions. The results support incorporation of individual differences into automation design by identifying the relevant person characteristics and using the information to determine what functions to automate and the form and level of automation.

  13. Automated Transition State Theory Calculations for High-Throughput Kinetics.

    PubMed

    Bhoorasingh, Pierre L; Slakman, Belinda L; Seyedzadeh Khanshan, Fariba; Cain, Jason Y; West, Richard H

    2017-09-21

    A scarcity of known chemical kinetic parameters leads to the use of many reaction rate estimates, which are not always sufficiently accurate, in the construction of detailed kinetic models. To reduce the reliance on these estimates and improve the accuracy of predictive kinetic models, we have developed a high-throughput, fully automated reaction rate calculation method, AutoTST. The algorithm integrates automated saddle-point geometry search methods and a canonical transition state theory kinetics calculator. The automatically calculated reaction rates compare favorably to existing estimated rates. Comparison against high-level theoretical calculations shows that the new automated method performs better than rate estimates when the estimate is made by a poor analogy. The method will improve by accounting for internal rotor contributions and by improving methods to determine molecular symmetry.
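
    The canonical transition state theory rate at the core of such a calculator follows the Eyring expression k(T) = (kB·T/h)·exp(-ΔG‡/RT). A minimal sketch with a hypothetical barrier; real AutoTST results additionally account for effects (e.g. internal rotors, symmetry, tunneling) omitted here:

```python
import math

KB = 1.380649e-23    # Boltzmann constant, J/K
H = 6.62607015e-34   # Planck constant, J*s
R = 8.314462618      # gas constant, J/(mol*K)

def tst_rate(delta_g_kj_mol, temperature_k):
    """Eyring/TST rate constant (s^-1), transmission coefficient taken as 1."""
    return (KB * temperature_k / H) * math.exp(
        -delta_g_kj_mol * 1e3 / (R * temperature_k))

k_300 = tst_rate(80.0, 300.0)   # hypothetical 80 kJ/mol barrier
k_600 = tst_rate(80.0, 600.0)   # much faster at the higher temperature
```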

  14. Impact of automation: Measurement of performance, workload and behaviour in a complex control environment.

    PubMed

    Balfe, Nora; Sharples, Sarah; Wilson, John R

    2015-03-01

    This paper describes an experiment that was undertaken to compare three levels of automation in rail signalling: a high level, in which an automated agent set routes for trains using timetable information; a medium level, in which trains were routed along pre-defined paths; and a low level, in which the operator (signaller) was responsible for the movement of all trains. These levels are described in terms of a Rail Automation Model based on previous automation theory (Parasuraman et al., 2000). Performance, subjective workload, and signaller activity were measured for each level of automation running under both normal operating conditions and abnormal, or disrupted, conditions. The results indicate that perceived workload, during both normal and disrupted phases of the experiment, decreased as the level of automation increased, and performance was most consistent (i.e. showed the least variation between participants) with the highest level of automation. The results give a strong case in favour of automation, particularly in terms of demonstrating the potential for automation to reduce workload, but also suggest that much benefit can be achieved from a mid-level of automation, potentially at lower cost and complexity. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  15. Iterative model building, structure refinement and density modification with the PHENIX AutoBuild wizard

    PubMed Central

    Terwilliger, Thomas C.; Grosse-Kunstleve, Ralf W.; Afonine, Pavel V.; Moriarty, Nigel W.; Zwart, Peter H.; Hung, Li-Wei; Read, Randy J.; Adams, Paul D.

    2008-01-01

    The PHENIX AutoBuild wizard is a highly automated tool for iterative model building, structure refinement and density modification using RESOLVE model building, RESOLVE statistical density modification and phenix.refine structure refinement. Recent advances in the AutoBuild wizard and phenix.refine include automated detection and application of NCS from models as they are built, extensive model-completion algorithms and automated solvent-molecule picking. Model-completion algorithms in the AutoBuild wizard include loop building, crossovers between chains in different models of a structure and side-chain optimization. The AutoBuild wizard has been applied to a set of 48 structures at resolutions ranging from 1.1 to 3.2 Å, resulting in a mean R factor of 0.24 and a mean free R factor of 0.29. The R factor of the final model is dependent on the quality of the starting electron density and is relatively independent of resolution. PMID:18094468
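
    The R factors quoted above compare observed and model structure-factor amplitudes; a minimal sketch with hypothetical amplitudes (the free R factor applies the same formula to reflections held out from refinement):

```python
def r_factor(f_obs, f_calc):
    """Crystallographic R = sum(|Fo - Fc|) / sum(Fo) over amplitude lists."""
    if len(f_obs) != len(f_calc):
        raise ValueError("amplitude lists must have equal length")
    return sum(abs(fo - fc) for fo, fc in zip(f_obs, f_calc)) / sum(f_obs)

# Hypothetical amplitudes: total disagreement of (5+4+3+2)/280 gives R = 0.05
f_obs = [100.0, 80.0, 60.0, 40.0]
f_calc = [95.0, 84.0, 57.0, 42.0]
r = r_factor(f_obs, f_calc)
```

    A free R somewhat above the working R, as in the 0.24/0.29 means reported, is the expected signature of a model that has not been overfitted.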

  16. Cellular-based modeling of oscillatory dynamics in brain networks.

    PubMed

    Skinner, Frances K

    2012-08-01

    Oscillatory population activities have long been known to occur in our brains during different behavioral states. We know that many different cell types exist and that they contribute in distinct ways to the generation of these activities. I review recent papers that involve cellular-based models of brain networks, most of which include theta, gamma and sharp wave-ripple activities. To help organize the modeling work, I present it from a perspective of three different types of cellular-based modeling: 'Generic', 'Biophysical' and 'Linking'. Cellular-based modeling is taken to encompass the four features of experiment, model development, theory/analyses, and model usage/computation. The three modeling types are shown to include these features and interactions in different ways. Copyright © 2012 Elsevier Ltd. All rights reserved.

  17. Formal verification of automated teller machine systems using SPIN

    NASA Astrophysics Data System (ADS)

    Iqbal, Ikhwan Mohammad; Adzkiya, Dieky; Mukhlash, Imam

    2017-08-01

    Formal verification is a technique for ensuring the correctness of systems. This work focuses on verifying a model of an Automated Teller Machine (ATM) system against some specifications. We construct the model as a state transition diagram that is suitable for verification. The specifications are expressed as Linear Temporal Logic (LTL) formulas. We use the Simple Promela Interpreter (SPIN) model checker to check whether the model satisfies the formulas. This model checker accepts models written in the Process Meta Language (PROMELA), with specifications given as LTL formulas.
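
    For a plain safety invariant, the essence of explicit-state checking is a reachability search; the ATM states below are hypothetical, and SPIN itself checks full LTL via automata constructions rather than this simple sketch:

```python
from collections import deque

# Hypothetical ATM transition system; property: never dispense without PIN verification
TRANSITIONS = {
    "idle": ["card_inserted"],
    "card_inserted": ["pin_entry", "idle"],       # cancel ejects the card
    "pin_entry": ["verified", "card_inserted"],   # retry on a bad PIN
    "verified": ["dispensing", "idle"],
    "dispensing": ["idle"],
}

def check_invariant(start, transitions, bad_states):
    """Breadth-first reachability: the invariant holds iff no bad state is reachable."""
    seen, queue = {start}, deque([start])
    while queue:
        state = queue.popleft()
        if state in bad_states:
            return False
        for nxt in transitions.get(state, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return True

ok = check_invariant("idle", TRANSITIONS, {"dispense_without_pin"})       # holds
broken = {**TRANSITIONS, "card_inserted": ["dispense_without_pin"]}
violated = not check_invariant("idle", broken, {"dispense_without_pin"})  # caught
```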

  18. Automated model-based quantitative analysis of phantoms with spherical inserts in FDG PET scans.

    PubMed

    Ulrich, Ethan J; Sunderland, John J; Smith, Brian J; Mohiuddin, Imran; Parkhurst, Jessica; Plichta, Kristin A; Buatti, John M; Beichel, Reinhard R

    2018-01-01

    Quality control plays an increasingly important role in quantitative PET imaging and is typically performed using phantoms. The purpose of this work was to develop and validate a fully automated analysis method for two common PET/CT quality assurance phantoms: the NEMA NU-2 IQ and SNMMI/CTN oncology phantom. The algorithm was designed to only utilize the PET scan to enable the analysis of phantoms with thin-walled inserts. We introduce a model-based method for automated analysis of phantoms with spherical inserts. Models are first constructed for each type of phantom to be analyzed. A robust insert detection algorithm uses the model to locate all inserts inside the phantom. First, candidates for inserts are detected using a scale-space detection approach. Second, candidates are given an initial label using a score-based optimization algorithm. Third, a robust model fitting step aligns the phantom model to the initial labeling and fixes incorrect labels. Finally, the detected insert locations are refined and measurements are taken for each insert and several background regions. In addition, an approach for automated selection of NEMA and CTN phantom models is presented. The method was evaluated on a diverse set of 15 NEMA and 20 CTN phantom PET/CT scans. NEMA phantoms were filled with radioactive tracer solution at 9.7:1 activity ratio over background, and CTN phantoms were filled with 4:1 and 2:1 activity ratio over background. For quantitative evaluation, an independent reference standard was generated by two experts using PET/CT scans of the phantoms. In addition, the automated approach was compared against manual analysis, which represents the current clinical standard approach, of the PET phantom scans by four experts. The automated analysis method successfully detected and measured all inserts in all test phantom scans. 
It is a deterministic algorithm (zero variability), and the insert detection RMS error (i.e., bias) was 0.97, 1.12, and 1.48 mm for phantom activity ratios 9.7:1, 4:1, and 2:1, respectively. For all phantoms and at all contrast ratios, the average RMS error was found to be significantly lower for the proposed automated method compared to the manual analysis of the phantom scans. The uptake measurements produced by the automated method showed high correlation with the independent reference standard (R² ≥ 0.9987). In addition, the average computing time for the automated method was 30.6 s and was found to be significantly lower (P ≪ 0.001) compared to manual analysis (mean: 247.8 s). The proposed automated approach was found to have less error when measured against the independent reference than the manual approach. It can be easily adapted to other phantoms with spherical inserts. In addition, it eliminates inter- and intraoperator variability in PET phantom analysis and is significantly more time efficient, and therefore, represents a promising approach to facilitate and simplify PET standardization and harmonization efforts. © 2017 American Association of Physicists in Medicine.

  19. Modeling and deadlock avoidance of automated manufacturing systems with multiple automated guided vehicles.

    PubMed

    Wu, Naiqi; Zhou, MengChu

    2005-12-01

    An automated manufacturing system (AMS) contains a number of versatile machines (or workstations), buffers, and an automated material handling system (MHS), and is computer-controlled. An effective and flexible alternative for implementing the MHS is to use an automated guided vehicle (AGV) system. The deadlock issue in AMSs is very important to their operation and has been studied extensively. Deadlock problems have been treated separately for parts in production and in transportation, and many techniques were developed for each problem. However, such treatment does not take advantage of the flexibility offered by multiple AGVs. In general, it is intractable to obtain a maximally permissive control policy for either problem. Instead, this paper investigates these two problems in an integrated way. First, we model an AGV system and the part processing processes by resource-oriented Petri nets, respectively. Then the two models are integrated by using macro transitions. Based on the combined model, a novel control policy for deadlock avoidance is proposed. It is shown to be maximally permissive with computational complexity of O(n²), where n is the number of machines in the AMS, if the complexity of controlling part transportation by AGVs is not considered. Thus, the complexity of deadlock avoidance for the whole system is bounded by the complexity of controlling the AGV system. An illustrative example shows its application and power.
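
    The Petri net firing rule underlying such models is M' = M - Pre[t] + Post[t], with t enabled when M ≥ Pre[t]; the tiny one-machine, one-AGV net below is invented for illustration, and a real avoidance policy would veto firings leading to unsafe markings rather than merely detect dead ones:

```python
import numpy as np

# Places: [machine_free, agv_free, part_waiting, part_on_agv]
PRE = np.array([
    [0, 1, 1, 0],   # load_agv: needs a free AGV and a waiting part
    [1, 0, 0, 1],   # unload_to_machine: needs a free machine and a loaded AGV
])
POST = np.array([
    [0, 0, 0, 1],   # load_agv: the part is now on the AGV
    [0, 1, 0, 0],   # unload_to_machine: the AGV is freed, the machine is busy
])

def enabled(marking, t):
    return bool(np.all(marking >= PRE[t]))

def fire(marking, t):
    """Firing rule: M' = M - Pre[t] + Post[t]."""
    assert enabled(marking, t)
    return marking - PRE[t] + POST[t]

m0 = np.array([1, 1, 1, 0])   # machine free, AGV free, one part waiting
m1 = fire(m0, 0)              # load the part onto the AGV
m2 = fire(m1, 1)              # deliver it to the machine
# m2 is a dead marking here simply because all work is done
dead = not any(enabled(m2, t) for t in range(len(PRE)))
```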

  20. Massachusetts Institute of Technology Consortium Agreement

    DTIC Science & Technology

    1999-03-01

    This is the third progress report of the M.I.T. Home Automation and Healthcare Consortium-Phase Two. It covers the majority of the new findings, concepts...research projects of home automation and healthcare, ranging from human modeling, patient monitoring, and diagnosis to new sensors and actuators, physical...aids, human-machine interface and home automation infrastructure. This report contains several patentable concepts, algorithms, and designs.

  1. Department of the Army Cost Analysis Manual

    DTIC Science & Technology

    2001-05-01

    SECTION I - AUTOMATED COST ESTIMATING INTEGRATED TOOLS (ACEIT) ... SECTION II - AUTOMATED ... Management & Comptroller) endorsed the Automated Cost Estimating Integrated Tools (ACEIT) model and since it is widely used to prepare POEs, CCAs and ... CRB IPT (in ACEIT) will be the basis for information contained in the CAB. Any remaining unresolved issues from the IPT process will be raised at the

  2. Confocal nanoscanning, bead picking (CONA): PickoScreen microscopes for automated and quantitative screening of one-bead one-compound libraries.

    PubMed

    Hintersteiner, Martin; Buehler, Christof; Uhl, Volker; Schmied, Mario; Müller, Jürgen; Kottig, Karsten; Auer, Manfred

    2009-01-01

    Solid phase combinatorial chemistry provides fast and cost-effective access to large bead based libraries with compound numbers easily exceeding tens of thousands of compounds. Incubating one-bead one-compound library beads with fluorescently labeled target proteins and identifying and isolating the beads which contain a bound target protein, potentially represents one of the most powerful generic primary high throughput screening formats. On-bead screening (OBS) based on this detection principle can be carried out with limited automation. Often hit bead detection, i.e. recognizing beads with a fluorescently labeled protein bound to the compound on the bead, relies on eye-inspection under a wide-field microscope. Using low resolution detection techniques, the identification of hit beads and their ranking is limited by a low fluorescence signal intensity and varying levels of the library beads' autofluorescence. To exploit the full potential of an OBS process, reliable methods for both automated quantitative detection of hit beads and their subsequent isolation are needed. In a joint collaborative effort with Evotec Technologies (now Perkin-Elmer Cellular Technologies Germany GmbH), we have built two confocal bead scanner and picker platforms PS02 and a high-speed variant PS04 dedicated to automated high resolution OBS. The PS0X instruments combine fully automated confocal large area scanning of a bead monolayer at the bottom of standard MTP plates with semiautomated isolation of individual hit beads via hydraulic-driven picker capillaries. The quantification of fluorescence intensities with high spatial resolution in the equatorial plane of each bead allows for a reliable discrimination between entirely bright autofluorescent beads and real hit beads which exhibit an increased fluorescence signal at the outer few micrometers of the bead. 
The achieved screening speed of up to 200,000 beads assayed in less than 7 h and a picking time of approximately 1 bead/min allow the exploitation of one-bead one-compound libraries with high sensitivity, accuracy, and speed.

  3. Emergence of HGF/SF-Induced Coordinated Cellular Motility

    PubMed Central

    Zaritsky, Assaf; Natan, Sari; Ben-Jacob, Eshel; Tsarfaty, Ilan

    2012-01-01

    Collective cell migration plays a major role in embryonic morphogenesis, tissue remodeling, wound repair and cancer invasion. Despite many decades of extensive investigations, only a few analytical tools have been developed to enhance the biological understanding of this important phenomenon. Here we present a novel quantitative approach to analyze the long-term kinetics of bright-field time-lapse wound healing. Fully automated spatiotemporal measures and visualization of cells' motility and implicit morphology were proven to be sound, repeatable and highly informative compared to single-cell tracking analysis. We study cellular collective migration induced by tyrosine kinase-growth factor signaling (Met-Hepatocyte Growth Factor/Scatter Factor (HGF/SF)). Our quantitative approach is applied to demonstrate that collective migration of the adenocarcinoma cell lines is characterized by simple morpho-kinetics. HGF/SF induces complex, coordinated morpho-kinetic collective migration: cells at the front move faster and are more spread than those further away from the wound edge. As the wound heals, distant cells gradually accelerate and enhance their spread and elongation, resembling the epithelial-to-mesenchymal transition (EMT); the cells then become more spread and maintain higher velocity than cells located closer to the wound. Finally, upon wound closure, front cells halt, shrink and round up (resembling a mesenchymal-to-epithelial transition (MET) phenotype) while distant cells undergo the same process gradually. Met inhibition experiments further validate that Met signaling dramatically alters the morpho-kinetic dynamics of the healing wound. Machine-learning classification was applied to demonstrate the generalization of our findings, revealing even subtle changes in motility patterns induced by Met inhibition.
It is concluded that activation of Met-signaling induces an elaborated model in which cells lead a coordinated increased motility along with gradual differentiation-based collective cell motility dynamics. Our quantitative phenotypes may guide future investigation on the molecular and cellular mechanisms of tyrosine kinase-induced coordinate cell motility and morphogenesis in metastasis. PMID:22970283
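    The front-to-back velocity gradient described above can be illustrated with a minimal sketch (my illustration, not the authors' pipeline): bin cells by their distance from the wound edge and compare mean speeds between bins. The function name, the toy coordinates, and the wound position are all invented for demonstration.

    ```python
    # Hedged sketch: quantify how cell speed varies with distance from a wound
    # edge, given (x, y) positions at two consecutive frames.

    def speed_by_distance(p0, p1, wound_x, bin_width=50.0):
        """Mean cell speed (displacement per frame) binned by distance from the wound edge."""
        bins = {}
        for (x0, y0), (x1, y1) in zip(p0, p1):
            dist = abs(wound_x - x0)                       # distance from the wound edge
            speed = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
            b = int(dist // bin_width)
            bins.setdefault(b, []).append(speed)
        return {b: sum(v) / len(v) for b, v in bins.items()}

    # Toy data: front cells (near x = 100) move faster than distant cells.
    p0 = [(90, 0), (95, 10), (20, 0), (15, 10)]
    p1 = [(98, 0), (103, 10), (22, 0), (17, 10)]
    profile = speed_by_distance(p0, p1, wound_x=100.0)     # bin 0 (front) vs bin 1 (back)
    ```

    With this toy input the front bin averages a higher speed than the distant bin, the qualitative pattern the abstract reports for HGF/SF-treated wounds.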

  4. Modeling nurses' attitude toward using automated unit-based medication storage and distribution systems: an extension of the technology acceptance model.

    PubMed

    Escobar-Rodríguez, Tomás; Romero-Alonso, María Mercedes

    2013-05-01

    This article analyzes the attitude of nurses toward the use of automated unit-based medication storage and distribution systems and identifies influencing factors. Understanding these factors provides an opportunity to explore actions that might be taken to boost adoption by potential users. The theoretical grounding for this research is the Technology Acceptance Model, which specifies the causal relationships between perceived usefulness, perceived ease of use, attitude toward using, and actual usage behavior. The research model has six constructs, including the external variables perceived risk, experience level, and training; nine hypotheses were generated from the connections between these constructs. The findings indicate that all three external variables are related to the perceived ease of use and perceived usefulness of automated unit-based medication storage and distribution systems and therefore have a significant influence on attitude toward the use of these systems.

  5. Thinking Together: Modeling Clinical Decision-Support as a Sociotechnical System

    PubMed Central

    Hussain, Mustafa I.; Reynolds, Tera L.; Mousavi, Fatemeh E.; Chen, Yunan; Zheng, Kai

    2017-01-01

    Computerized clinical decision-support systems (CDSSs) are members of larger sociotechnical systems composed of human and automated actors who send, receive, and manipulate artifacts. Sociotechnical consideration is rare in the literature; this makes it difficult to comparatively evaluate the success of CDSS implementations, and it may also indicate that sociotechnical context receives inadequate consideration in practice. To facilitate sociotechnical consideration, we developed the Thinking Together model, a flexible diagrammatic means of representing CDSSs as sociotechnical systems. To develop this model, we examined the literature through the lens of Distributed Cognition (DCog) theory. We then present two case studies of vastly different CDSSs, one almost fully automated and the other with minimal automation, to illustrate the flexibility of the Thinking Together model. We show that this model, informed by DCog and the CDSS literature, is capable of supporting both research, by enabling comparative evaluation, and practice, by facilitating explicit sociotechnical planning and communication. PMID:29854164

  6. Machine learning of network metrics in ATLAS Distributed Data Management

    NASA Astrophysics Data System (ADS)

    Lassnig, Mario; Toler, Wesley; Vamosi, Ralf; Bogado, Joaquin; ATLAS Collaboration

    2017-10-01

    The increasing volume of physics data poses a critical challenge to the ATLAS experiment. In anticipation of high-luminosity physics, automation of everyday data management tasks has become necessary. Previously, many of these tasks required human decision-making and operation. Recent advances in hardware and software have made it possible to entrust more complicated duties to automated systems using models trained by machine learning algorithms. In this contribution we show results from one of our ongoing automation efforts that focuses on network metrics. First, we describe our machine learning framework built atop the ATLAS Analytics Platform. This framework can automatically extract and aggregate data, train models with various machine learning algorithms, and eventually score the resulting models and parameters. Second, we use these models to forecast metrics relevant for network-aware job scheduling and data brokering. We show the characteristics of the data and evaluate the forecasting accuracy of our models.
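    A time-series forecast of a network metric, of the kind such a framework might train, can be sketched with a first-order autoregressive model. This is purely illustrative (not the ATLAS framework or its algorithms); the function names and the toy throughput history are assumptions.

    ```python
    # Illustrative sketch: fit an AR(1) model to a throughput series and
    # forecast the next value, x_t ≈ mu + phi * (x_{t-1} - mu).

    def fit_ar1(series):
        """Least-squares AR(1) fit on the demeaned series."""
        mu = sum(series) / len(series)
        d = [x - mu for x in series]
        num = sum(d[t] * d[t - 1] for t in range(1, len(d)))
        den = sum(d[t - 1] ** 2 for t in range(1, len(d)))
        return mu, num / den

    def forecast(series, mu, phi, steps=1):
        x = series[-1]
        for _ in range(steps):
            x = mu + phi * (x - mu)   # iterate the fitted recurrence
        return x

    # Toy throughput history (MB/s); a real deployment would use monitored link data.
    history = [100, 110, 105, 112, 108, 111, 109, 112]
    mu, phi = fit_ar1(history)
    next_val = forecast(history, mu, phi)
    ```

    In practice the framework described above would score such fitted models on held-out data before using them for scheduling decisions.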

  7. A Model of How Different Biology Experts Explain Molecular and Cellular Mechanisms

    ERIC Educational Resources Information Center

    Trujillo, Caleb M.; Anderson, Trevor R.; Pelaez, Nancy J.

    2015-01-01

    Constructing explanations is an essential skill for all science learners. The goal of this project was to model the key components of expert explanation of molecular and cellular mechanisms. As such, we asked: What is an appropriate model of the components of explanation used by biology experts to explain molecular and cellular mechanisms? Do…

  8. An Evaluation of the Efficacy of a Laboratory Exercise on Cellular Respiration

    ERIC Educational Resources Information Center

    Scholer, Anne-Marie; Hatton, Mary

    2008-01-01

    This study is an analysis of the effectiveness of a faculty-designed laboratory experience about a difficult topic, cellular respiration. The activity involves a hands-on model of the cellular-respiration process, making use of wooden ball-and-stick chemistry models and small toy trucks on a tabletop model of the mitochondrion. Students…

  9. Comparison of BrainTool to other UML modeling and model transformation tools

    NASA Astrophysics Data System (ADS)

    Nikiforova, Oksana; Gusarovs, Konstantins

    2017-07-01

    Over the last 30 years, numerous model-driven software development approaches have been offered to address problems with development productivity and the resulting software quality. CASE tools developed to date are advertised as having "complete code-generation capabilities", and the Object Management Group (OMG) now makes similar claims about Unified Modeling Language (UML) models at different levels of abstraction, arguing that software development using CASE tools enables a significant level of automation. Today's CASE tools typically offer a combination of features, ranging from a model editor and a model repository in the more traditional tools to, in the most advanced ones, a code generator (possibly driven by a scripting or domain-specific language (DSL)), a transformation tool that produces new artifacts from manually created ones, and a transformation-definition editor for defining new transformations. The present paper contains the results of a comparison of CASE tools (mainly UML editors) against the level of automation they offer.

  10. Comparative Cost-Effectiveness Analysis of Three Different Automated Medication Systems Implemented in a Danish Hospital Setting.

    PubMed

    Risør, Bettina Wulff; Lisby, Marianne; Sørensen, Jan

    2018-02-01

    Automated medication systems have been found to reduce errors in the medication process, but little is known about the cost-effectiveness of such systems. The objective of this study was to perform a model-based indirect cost-effectiveness comparison of three different, real-world automated medication systems compared with current standard practice. The considered automated medication systems were a patient-specific automated medication system (psAMS), a non-patient-specific automated medication system (npsAMS), and a complex automated medication system (cAMS). The economic evaluation used original effect and cost data from prospective, controlled, before-and-after studies of medication systems implemented at a Danish hematological ward and an acute medical unit. Effectiveness was described as the proportion of clinical and procedural error opportunities that were associated with one or more errors. An error was defined as a deviation from the electronic prescription, from standard hospital policy, or from written procedures. The cost assessment was based on 6-month standardization of observed cost data. The model-based comparative cost-effectiveness analyses were conducted with system-specific assumptions of the effect size and costs in scenarios with consumption of 15,000, 30,000, and 45,000 doses per 6-month period. With 30,000 doses, the model showed a cost-effectiveness ratio, expressed as the cost per avoided clinical error, of €24 for the psAMS, €26 for the npsAMS, and €386 for the cAMS. Comparison of the cost-effectiveness of the three systems in relation to different valuations of an avoided error showed that the psAMS was the most cost-effective system regardless of error type or valuation. The model-based indirect comparison against conventional practice showed that psAMS and npsAMS were more cost-effective than the cAMS alternative, and that psAMS was more cost-effective than npsAMS.
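    The headline ratio is simple arithmetic: incremental cost divided by the number of errors avoided. The sketch below is a back-of-the-envelope illustration; the input figures are invented, chosen only so that they reproduce the €24-per-error ratio the abstract reports for psAMS at 30,000 doses.

    ```python
    # Hedged sketch of the cost-effectiveness ratio: cost per avoided error.

    def cost_per_avoided_error(extra_cost, error_rate_before, error_rate_after, n_opportunities):
        """Incremental cost divided by the number of error opportunities that no longer fail."""
        errors_avoided = (error_rate_before - error_rate_after) * n_opportunities
        return extra_cost / errors_avoided

    # Hypothetical: a system costing €36,000 extra per 6 months that cuts the
    # error proportion from 10% to 5% over 30,000 dose-related opportunities.
    cer = cost_per_avoided_error(36_000, 0.10, 0.05, 30_000)   # €24 per avoided error
    ```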

  11. Bacterial safety of cell-based therapeutic preparations, focusing on haematopoietic progenitor cells.

    PubMed

    Störmer, M; Wood, E M; Schurig, U; Karo, O; Spreitzer, I; McDonald, C P; Montag, T

    2014-05-01

    Bacterial safety of cellular preparations, especially haematopoietic progenitor cells (HPCs), as well as of advanced therapy medicinal products (ATMPs) derived from stem cells of various origins, presents a challenge for physicians, manufacturers and regulators. The article describes the background and practical issues in this area and illustrates why sterility of these products cannot currently be guaranteed. Advantages and limitations of approaches both for classical sterility testing and for microbiological control using automated culture systems are discussed. The review considers novel approaches for growth-based rapid microbiological control with high sensitivity and faster availability of results, as well as new methods for rapid bacterial detection in cellular preparations that provide meaningful information about product contamination within one to two hours. Generally, however, these direct rapid methods are less sensitive and have greater sampling error compared with the growth-based methods. Opportunities for pyrogen testing of cell therapeutics are also discussed. There is an urgent need for development of novel principles and methods applicable to bacterial safety of cellular therapeutics. We also need a major shift in approach from the traditional view of sterility evaluation (identify anything and everything) to new thinking about how to find what is clinically relevant within the time frame available for the special clinical circumstances in which these products are used. The review concludes with recommendations for optimization of microbiological control of cellular preparations, focusing on HPCs. © 2013 International Society of Blood Transfusion.

  12. Nuclear protein accumulation in cellular senescence and organismal aging revealed with a novel single-cell resolution fluorescence microscopy assay.

    PubMed

    De Cecco, Marco; Jeyapalan, Jessie; Zhao, Xiaoai; Tamamori-Adachi, Mimi; Sedivy, John M

    2011-10-01

    Replicative cellular senescence was discovered some 50 years ago. The phenotypes of senescent cells have been investigated extensively in cell culture, and found to affect essentially all aspects of cellular physiology. The relevance of cellular senescence in the context of age-associated pathologies as well as normal aging is a topic of active and ongoing interest. Considerable effort has been devoted to biomarker discovery to enable the microscopic detection of single senescent cells in tissues. One characteristic of senescent cells documented very early in cell culture studies was an increase in cell size and total protein content, but whether this occurs in vivo is not known. A limiting factor for studies of protein content and localization has been the lack of suitable fluorescence microscopy tools. We have developed an easy and flexible method, based on the merocyanine dye known as NanoOrange, to visualize and quantitatively measure total protein levels by high resolution fluorescence microscopy. NanoOrange staining can be combined with antibody-based immunofluorescence, thus providing both specific target and total protein information in the same specimen. These methods are optimally combined with automated image analysis platforms for high throughput analysis. We document here increasing protein content and density in nuclei of senescent human and mouse fibroblasts in vitro, and in liver nuclei of aged mice in vivo. Additionally, in aged liver nuclei NanoOrange revealed protein-dense foci that colocalize with centromeric heterochromatin.

  13. Nuclear protein accumulation in cellular senescence and organismal aging revealed with a novel single-cell resolution fluorescence microscopy assay

    PubMed Central

    De Cecco, Marco; Jeyapalan, Jessie; Zhao, Xiaoai; Tamamori-Adachi, Mimi; Sedivy, John M.

    2011-01-01

    Replicative cellular senescence was discovered some 50 years ago. The phenotypes of senescent cells have been investigated extensively in cell culture, and found to affect essentially all aspects of cellular physiology. The relevance of cellular senescence in the context of age-associated pathologies as well as normal aging is a topic of active and ongoing interest. Considerable effort has been devoted to biomarker discovery to enable the microscopic detection of single senescent cells in tissues. One characteristic of senescent cells documented very early in cell culture studies was an increase in cell size and total protein content, but whether this occurs in vivo is not known. A limiting factor for studies of protein content and localization has been the lack of suitable fluorescence microscopy tools. We have developed an easy and flexible method, based on the merocyanine dye known as NanoOrange, to visualize and quantitatively measure total protein levels by high resolution fluorescence microscopy. NanoOrange staining can be combined with antibody-based immunofluorescence, thus providing both specific target and total protein information in the same specimen. These methods are optimally combined with automated image analysis platforms for high throughput analysis. We document here increasing protein content and density in nuclei of senescent human and mouse fibroblasts in vitro, and in liver nuclei of aged mice in vivo. Additionally, in aged liver nuclei NanoOrange revealed protein-dense foci that colocalize with centromeric heterochromatin. PMID:22006542

  14. Cellular senescence in the Penna model of aging

    NASA Astrophysics Data System (ADS)

    Periwal, Avikar

    2013-11-01

    Cellular senescence is thought to play a major role in age-related diseases, which cause nearly 67% of all human deaths worldwide. Recent research in mice showed that exercising mice had higher levels of telomerase, an enzyme that helps maintain telomere length, than nonexercising mice. A commonly used model for biological aging was proposed by Penna. I propose a modification of the Penna model that incorporates cellular senescence and find an analytical steady-state solution following Coe, Mao, and Cates [Phys. Rev. Lett. 89, 288103 (2002)]. I find that models corresponding to delayed cellular senescence have younger populations that live longer. I fit the model to the United Kingdom's death distribution, which the original Penna model cannot do.
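    The Penna model referenced above is a bit-string model of aging; a minimal sketch of the standard (unmodified) model follows. All parameter values, the Verhulst-style population cap, and variable names are illustrative choices, not the paper's, and the paper's senescence modification and analytical solution are not reproduced here.

    ```python
    # Hedged sketch of the standard Penna bit-string model: bit a of an
    # individual's genome marks a deleterious mutation that switches on at
    # age a; an individual dies once T mutations are active.
    import random

    GENOME_BITS, T, MATURITY, MUT_RATE, NMAX = 32, 3, 8, 1, 500

    def step(population):
        """One time step: age every individual, remove the dead, reproduce, cap the population."""
        survivors = []
        for genome, age in population:
            age += 1
            # mutations in bits 0..age-1 have switched on by this age
            active = bin(genome & ((1 << min(age, GENOME_BITS)) - 1)).count("1")
            if active < T:
                survivors.append((genome, age))
        offspring = []
        for genome, age in survivors:
            if age >= MATURITY:
                child = genome
                for _ in range(MUT_RATE):            # each birth adds a deleterious mutation
                    child |= 1 << random.randrange(GENOME_BITS)
                offspring.append((child, 0))
        new_pop = survivors + offspring
        if len(new_pop) > NMAX:                      # Verhulst-style population cap
            new_pop = random.sample(new_pop, NMAX)
        return new_pop

    random.seed(0)
    pop = [(0, 0)] * 100                             # mutation-free founders
    for _ in range(50):
        pop = step(pop)
    ```

    Delaying the age at which bits take effect is one natural place to graft a senescence mechanism onto this skeleton, in the spirit of the modification the abstract describes.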

  15. Automated Test Assembly for Cognitive Diagnosis Models Using a Genetic Algorithm

    ERIC Educational Resources Information Center

    Finkelman, Matthew; Kim, Wonsuk; Roussos, Louis A.

    2009-01-01

    Much recent psychometric literature has focused on cognitive diagnosis models (CDMs), a promising class of instruments used to measure the strengths and weaknesses of examinees. This article introduces a genetic algorithm to perform automated test assembly alongside CDMs. The algorithm is flexible in that it can be applied whether the goal is to…

  16. Three Experiments Examining the Use of Electroencephalogram,Event-Related Potentials, and Heart-Rate Variability for Real-Time Human-Centered Adaptive Automation Design

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J., III; Parasuraman, Raja; Freeman, Frederick G.; Scerbo, Mark W.; Mikulka, Peter J.; Pope, Alan T.

    2003-01-01

    Adaptive automation represents an advanced form of human-centered automation design. This approach provides real-time, model-based assessments of human-automation interaction, determines whether the human has entered a hazardous state of awareness, and then modulates the task environment to keep the operator in the loop while maintaining an optimal state of task engagement and mental alertness. Because adaptive automation has not yet matured, numerous challenges remain, including what the criteria are for determining when adaptive aiding and adaptive function allocation should take place. Human factors experts in the area have suggested a number of measures, including the use of psychophysiology. This NASA Technical Paper reports on three experiments that examined the psychophysiological measures of event-related potentials, electroencephalogram, and heart-rate variability for real-time adaptive automation. The results of the experiments confirm the efficacy of these measures for use in both a developmental and an operational role in adaptive automation design. The implications of these results and future directions for psychophysiology and human-centered automation design are discussed.

  17. Generic framework for mining cellular automata models on protein-folding simulations.

    PubMed

    Diaz, N; Tischer, I

    2016-05-13

    Cellular automata model identification is an important way of building simplified simulation models. In this study, we describe a generic architectural framework that eases the development of new metaheuristic-based algorithms for cellular automata model identification in protein-folding trajectories. Our framework was developed with a methodology based on design patterns, which improves the experience of developing new algorithms. The usefulness of the proposed framework is demonstrated by the implementation of four algorithms, able to obtain extremely precise cellular automata models of the protein-folding process with a protein contact map representation. Dynamic rules obtained by the proposed approach are discussed, and future uses for the new tool are outlined.
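    The core idea of cellular automata model identification can be sketched in a few lines (my illustration, not the paper's framework): score each candidate rule on how well it reproduces the observed next configuration, and keep the best. Here the search space is the 256 elementary 1-D CA rules; the paper's metaheuristics would replace the brute-force loop for larger rule spaces such as contact-map dynamics.

    ```python
    # Hedged sketch: recover an elementary CA rule from an observed trajectory.

    def apply_rule(rule, row):
        """Apply an elementary CA rule (0-255) to a row with periodic boundaries."""
        n = len(row)
        return [(rule >> ((row[(i - 1) % n] << 2) | (row[i] << 1) | row[(i + 1) % n])) & 1
                for i in range(n)]

    def identify_rule(trajectory):
        """Return the elementary CA rule number best matching consecutive configurations."""
        def score(rule):
            return sum(a == b
                       for cur, nxt in zip(trajectory, trajectory[1:])
                       for a, b in zip(apply_rule(rule, cur), nxt))
        return max(range(256), key=score)

    # Toy "observed" trajectory generated with rule 90 (XOR of the two neighbours).
    traj = [[0, 0, 0, 1, 0, 0, 0, 0]]
    for _ in range(6):
        traj.append(apply_rule(90, traj[-1]))
    best = identify_rule(traj)     # a rule that reproduces the trajectory exactly
    ```

    Note that several rules can tie when the trajectory does not exercise every neighbourhood, which is why the sketch only guarantees that the recovered rule reproduces the observed transitions.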

  18. Using Pareto points for model identification in predictive toxicology

    PubMed Central

    2013-01-01

    Predictive toxicology is concerned with the development of models that are able to predict the toxicity of chemicals. A reliable prediction of toxic effects of chemicals in living systems is highly desirable in cosmetics, drug design or food protection to speed up the process of chemical compound discovery while reducing the need for lab tests. There is an extensive literature associated with the best practice of model generation and data integration but management and automated identification of relevant models from available collections of models is still an open problem. Currently, the decision on which model should be used for a new chemical compound is left to users. This paper intends to initiate the discussion on automated model identification. We present an algorithm, based on Pareto optimality, which mines model collections and identifies a model that offers a reliable prediction for a new chemical compound. The performance of this new approach is verified for two endpoints: IGC50 and LogP. The results show a great potential for automated model identification methods in predictive toxicology. PMID:23517649
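    Pareto-based model selection of the kind described above can be sketched simply (illustrative only, not the paper's algorithm): score each candidate model on two objectives for a new compound and keep the non-dominated set. The objective names, model names, and scores below are invented.

    ```python
    # Hedged sketch: keep the Pareto-optimal models under two objectives,
    # e.g. validation accuracy and similarity of the new compound to the
    # model's training domain (higher is better for both).

    def pareto_front(models):
        """models: list of (name, accuracy, similarity) tuples."""
        front = []
        for m in models:
            dominated = any(o[1] >= m[1] and o[2] >= m[2] and (o[1] > m[1] or o[2] > m[2])
                            for o in models)
            if not dominated:
                front.append(m)
        return front

    candidates = [("m1", 0.90, 0.20), ("m2", 0.80, 0.70),
                  ("m3", 0.70, 0.60), ("m4", 0.85, 0.75)]
    front = pareto_front(candidates)   # m2 and m3 are dominated by m4
    ```

    A final selection step would then pick one model from the front, for example the one with the highest similarity to the query compound.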

  19. Automatic specification of reliability models for fault-tolerant computers

    NASA Technical Reports Server (NTRS)

    Liceaga, Carlos A.; Siewiorek, Daniel P.

    1993-01-01

    The calculation of reliability measures using Markov models is required for life-critical processor-memory-switch structures that have standby redundancy or that are subject to transient or intermittent faults or repair. The task of specifying these models is tedious and prone to human error because of the large number of states and transitions required in any reasonable system. Therefore, model specification is a major analysis bottleneck, and model verification is a major validation problem. The general unfamiliarity of computer architects with Markov modeling techniques further increases the necessity of automating the model specification. Automation requires a general system description language (SDL). For practicality, this SDL should also provide a high level of abstraction and be easy to learn and use. The first attempt to define and implement an SDL with those characteristics is presented. A program named Automated Reliability Modeling (ARM) was constructed as a research vehicle. The ARM program uses a graphical interface as its SDL, and it outputs a Markov reliability model specification formulated for direct use by programs that generate and evaluate the model.
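    The kind of Markov reliability model ARM specifies can be illustrated with a toy duplex-processor example (my sketch, not ARM output). The failure rate `lam`, coverage `c`, and the simple forward-Euler stepping of the state probabilities are all assumptions for demonstration.

    ```python
    # Hedged sketch: a three-state Markov reliability model for a duplex pair.
    # States: "2" = both units up, "1" = one unit up, "F" = system failed.

    def reliability(lam, c, hours, dt=0.01):
        """Probability the system is still up after `hours`, given per-unit
        failure rate `lam` (per hour) and fault coverage `c`."""
        p = {"2": 1.0, "1": 0.0, "F": 0.0}           # start with both units up
        for _ in range(int(hours / dt)):
            f2 = 2 * lam * dt * p["2"]               # a failure while in the duplex state
            f1 = lam * dt * p["1"]                   # failure of the last remaining unit
            p = {"2": p["2"] - f2,
                 "1": p["1"] + c * f2 - f1,          # covered failures degrade to simplex
                 "F": p["F"] + (1 - c) * f2 + f1}    # uncovered failures are fatal
        return p["2"] + p["1"]

    r = reliability(lam=1e-4, c=0.99, hours=10.0)
    ```

    Even this tiny example shows why automated specification matters: realistic systems have many more states and transitions, and enumerating them by hand is where human error creeps in.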

  20. Extracting microtubule networks from superresolution single-molecule localization microscopy data

    PubMed Central

    Zhang, Zhen; Nishimura, Yukako; Kanchanawong, Pakorn

    2017-01-01

    Microtubule filaments form ubiquitous networks that specify spatial organization in cells. However, quantitative analysis of microtubule networks is hampered by their complex architecture, limiting insights into the interplay between their organization and cellular functions. Although superresolution microscopy has greatly facilitated high-resolution imaging of microtubule filaments, extraction of complete filament networks from such data sets is challenging. Here we describe a computational tool for automated retrieval of microtubule filaments from single-molecule-localization–based superresolution microscopy images. We present a user-friendly, graphically interfaced implementation and a quantitative analysis of microtubule network architecture phenotypes in fibroblasts. PMID:27852898

  1. Pseudopolycythemia, pseudothrombocytopenia, and pseudoleukopenia due to overfilling of blood collection vacuum tubes.

    PubMed

    Pewarchuk, W; VanderBoom, J; Blajchman, M A

    1992-01-01

    A patient blood sample with an unexpectedly high hemoglobin level, high hematocrit, low white blood cell count, and low platelet count was recognized as being spurious based on previously available data. Repeated testing of the original sample showed a gradual return of all parameters to expected levels. We provide evidence that the overfilling of blood collection vacuum tubes can lead to inadequate sample mixing and that, in combination with the settling of the cellular contents in the collection tubes, can result in spuriously abnormal hematological parameters as estimated by an automated method.

  2. Recent advances and versatility of MAGE towards industrial applications.

    PubMed

    Singh, Vijai; Braddick, Darren

    2015-12-01

    The genome engineering toolkit has expanded significantly in recent years, allowing us to study the functions of genes in cellular networks and assist in over-production of proteins, drugs, chemicals and biofuels. Multiplex automated genome engineering (MAGE) has been recently developed and gained more scientific interest towards strain engineering. MAGE is a simple, rapid and efficient tool for manipulating genes simultaneously in multiple loci, assigning genetic codes and integrating non-natural amino acids. MAGE can be further expanded towards the engineering of fast, robust and over-producing strains for chemicals, drugs and biofuels at industrial scales.

  3. A 3D Image Filter for Parameter-Free Segmentation of Macromolecular Structures from Electron Tomograms

    PubMed Central

    Ali, Rubbiya A.; Landsberg, Michael J.; Knauth, Emily; Morgan, Garry P.; Marsh, Brad J.; Hankamer, Ben

    2012-01-01

    3D image reconstruction of large cellular volumes by electron tomography (ET) at high (≤5 nm) resolution can now routinely resolve organellar and compartmental membrane structures, protein coats, cytoskeletal filaments, and macromolecules. However, current image analysis methods for identifying in situ macromolecular structures within the crowded 3D ultrastructural landscape of a cell remain labor-intensive, time-consuming, and prone to user bias and/or error. This paper demonstrates the development and application of a parameter-free, 3D implementation of the bilateral edge-detection (BLE) algorithm for the rapid and accurate segmentation of cellular tomograms. The performance of the 3D BLE filter has been tested on a range of synthetic and real biological data sets and validated against current leading filters: the pseudo-3D recursive and Canny filters. The performance of the 3D BLE filter was found to be comparable to or better than that of both the 3D recursive and Canny filters while offering the significant advantage that it requires no parameter input or optimisation. Edge widths as narrow as 2 pixels are reproducibly detected, with signal intensity and grey-scale values as low as 0.72% above the mean of the background noise. The 3D BLE filter thus provides an efficient method for the automated segmentation of complex cellular structures across multiple scales for further downstream processing, such as cellular annotation and sub-tomogram averaging. It is a valuable tool for the accurate, high-throughput identification and annotation of 3D structural complexity at the subcellular level, and for mapping the spatial and temporal rearrangement of macromolecular assemblies in situ within cellular tomograms. PMID:22479430

  4. Java PathFinder: A Translator From Java to Promela

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus

    1999-01-01

    JAVA PATHFINDER (JPF) is a prototype translator from JAVA to PROMELA, the modeling language of the SPIN model checker. JPF is a product of a major effort by the Automated Software Engineering group at NASA Ames to make model checking technology part of the software process. Experience has shown that severe bugs can be found in final code using this technique, and that automated translation from a programming language to a modeling language like PROMELA can help reduce the effort required.

  5. Real-Time Field Data Acquisition and Remote Sensor Reconfiguration Using Scientific Workflows

    NASA Astrophysics Data System (ADS)

    Silva, F.; Mehta, G.; Vahi, K.; Deelman, E.

    2010-12-01

    Despite many technological advances, field data acquisition still consists of several manual and laborious steps. Once sensors and data loggers are deployed in the field, scientists often have to periodically return to their study sites in order to collect their data. Even when field deployments have a way to communicate and transmit data back to the laboratory (e.g. by using a satellite or a cellular modem), data analysis still requires several repetitive steps. Because data often needs to be processed and inspected manually, there is usually a significant time delay between data collection and analysis. As a result, sensor failures that could be detected almost in real-time are not noted for weeks or months. Finally, sensor reconfiguration as a result of interesting events in the field is still done manually, making rapid response nearly impossible and causing important data to be missed. By working closely with scientists from different application domains, we identified several tasks that, if automated, could greatly improve the way field data is collected, processed, and distributed. Our goals are to enable real-time data collection and validation, automate sensor reconfiguration in response to events of interest in the field, and allow scientists to easily automate their data processing. We began our design by employing the Sensor Processing and Acquisition Network (SPAN) architecture. SPAN uses an embedded processor in the field to coordinate sensor data acquisition from analog and digital sensors by interfacing with different types of devices and data loggers. SPAN is also able to interact with various types of communication devices in order to provide real-time communication to and from field sites. We use the Pegasus Workflow Management System (Pegasus WMS) to coordinate data collection and control sensors and deployments in the field.
Because scientific workflows can be used to automate multi-step, repetitive tasks, scientists can create simple workflows to download sensor data, perform basic QA/QC, and identify events of interest as well as sensor and data logger failures almost in real-time. As a result of this automation, scientists can quickly be notified (e.g. via e-mail or SMS) so that important events are not missed. In addition, Pegasus WMS has the ability to abstract the execution environment of where programs run. By placing a Pegasus WMS agent inside an embedded processor in the field, we allow scientists to ship simple computational models to the field, enabling remote data processing at the field site. As an example, scientists can send an image processing algorithm to the field so that the embedded processor can analyze images, thus reducing the bandwidth necessary for communication. In addition, when real-time communication to the laboratory is not possible, scientists can create simple computational models that can be run on sensor nodes autonomously, monitoring sensor data and making adjustments without any human intervention. We believe our system lowers the bar for the adoption of reconfigurable sensor networks by field scientists. In this poster, we will show how this technology can be used to provide not only data acquisition, but also real-time data validation and sensor reconfiguration.

  6. Feasibility and patient acceptability of a novel artificial intelligence-based screening model for diabetic retinopathy at endocrinology outpatient services: a pilot study.

    PubMed

    Keel, Stuart; Lee, Pei Ying; Scheetz, Jane; Li, Zhixi; Kotowicz, Mark A; MacIsaac, Richard J; He, Mingguang

    2018-03-12

    The purpose of this study is to evaluate the feasibility and patient acceptability of a novel artificial intelligence (AI)-based diabetic retinopathy (DR) screening model within endocrinology outpatient settings. Adults with diabetes were recruited from two urban endocrinology outpatient clinics, and single-field, non-mydriatic fundus photographs were taken and graded for referable DR (≥ pre-proliferative DR). Each participant underwent (1) an automated screening model, in which a deep learning algorithm (DLA) provided real-time reporting of results; and (2) a manual model, in which retinal images were transferred to a retinal grading centre and manual grading outcomes were distributed to the patient within 2 weeks of assessment. Participants completed a questionnaire on the day of examination and 1 month after assessment to determine overall satisfaction and the preferred model of care. In total, 96 participants were screened for DR, and the mean assessment time for automated screening was 6.9 minutes. Ninety-six percent of participants reported that they were either satisfied or very satisfied with the automated screening model, and 78% reported that they preferred the automated model over the manual one. The sensitivity and specificity of the DLA for correct referral were 92.3% and 93.7%, respectively. AI-based DR screening in endocrinology outpatient settings appears to be feasible and well accepted by patients.
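    The reported referral accuracy follows the standard confusion-matrix definitions. The sketch below is not the study's code, and the counts are invented, chosen only so that they reproduce the reported 92.3% sensitivity and 93.7% specificity.

    ```python
    # Hedged sketch: sensitivity and specificity from referral outcomes,
    # comparing algorithm referrals against manual grading as ground truth.

    def sens_spec(tp, fn, tn, fp):
        """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
        return tp / (tp + fn), tn / (tn + fp)

    # Invented counts for illustration only.
    sens, spec = sens_spec(tp=12, fn=1, tn=74, fp=5)
    ```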

  7. Analytical research and development for the Whitney Programs. Automation and instrumentation. Computer automation of the Cary Model 17I spectrophotometer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haugen, G.R.; Bystroff, R.I.; Downey, R.M.

    1975-09-01

In the area of automation and instrumentation, progress in the following studies is reported: computer automation of the Cary model 17I spectrophotometer; a new concept for monitoring the concentration of water in gases; on-line gas analysis for a gas circulation experiment; and a count-rate-discriminator technique for measuring grain-boundary composition. In the area of analytical methodology and measurements, progress is reported in the following studies: separation of molecular species by radiation pressure; study of the vaporization of U(thd)4 (thd = 2,2,6,6-tetramethylheptane-3,5-dione); study of the vaporization of U(C8H8)2; determination of ethylenic unsaturation in polyimide resins; and semimicrodetermination of hydroxyl and amino groups with pyromellitic dianhydride (PMDA). (JGB)

  8. Deep Interactive Learning with Sharkzor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

Sharkzor is a web application for machine-learning-assisted image sort and summary. Deep learning algorithms are leveraged to infer, augment, and automate the user's mental model. Initially, images uploaded by the user are spread out on a canvas. The user then interacts with the images to impart their mental model to the application's algorithmic underpinnings. Methods of interaction within Sharkzor's user interface and user experience support three primary user tasks: triage, organize and automate. The user triages the large pile of overlapping images by moving images of interest into proximity. The user then organizes said images into meaningful groups. After interacting with the images and groups, deep learning helps to automate the user's interactions. The loop of interaction, automation, and response by the user allows the system to quickly make sense of large amounts of data.

  9. A Graphical User Interface for Software-assisted Tracking of Protein Concentration in Dynamic Cellular Protrusions.

    PubMed

    Saha, Tanumoy; Rathmann, Isabel; Galic, Milos

    2017-07-11

    Filopodia are dynamic, finger-like cellular protrusions associated with migration and cell-cell communication. In order to better understand the complex signaling mechanisms underlying filopodial initiation, elongation and subsequent stabilization or retraction, it is crucial to determine the spatio-temporal protein activity in these dynamic structures. To analyze protein function in filopodia, we recently developed a semi-automated tracking algorithm that adapts to filopodial shape-changes, thus allowing parallel analysis of protrusion dynamics and relative protein concentration along the whole filopodial length. Here, we present a detailed step-by-step protocol for optimized cell handling, image acquisition and software analysis. We further provide instructions for the use of optional features during image analysis and data representation, as well as troubleshooting guidelines for all critical steps along the way. Finally, we also include a comparison of the described image analysis software with other programs available for filopodia quantification. Together, the presented protocol provides a framework for accurate analysis of protein dynamics in filopodial protrusions using image analysis software.

  10. Novel microscopy-based screening method reveals regulators of contact-dependent intercellular transfer

    PubMed Central

    Michael Frei, Dominik; Hodneland, Erlend; Rios-Mondragon, Ivan; Burtey, Anne; Neumann, Beate; Bulkescher, Jutta; Schölermann, Julia; Pepperkok, Rainer; Gerdes, Hans-Hermann; Kögel, Tanja

    2015-01-01

    Contact-dependent intercellular transfer (codeIT) of cellular constituents can have functional consequences for recipient cells, such as enhanced survival and drug resistance. Pathogenic viruses, prions and bacteria can also utilize this mechanism to spread to adjacent cells and potentially evade immune detection. However, little is known about the molecular mechanism underlying this intercellular transfer process. Here, we present a novel microscopy-based screening method to identify regulators and cargo of codeIT. Single donor cells, carrying fluorescently labelled endocytic organelles or proteins, are co-cultured with excess acceptor cells. CodeIT is quantified by confocal microscopy and image analysis in 3D, preserving spatial information. An siRNA-based screening using this method revealed the involvement of several myosins and small GTPases as codeIT regulators. Our data indicates that cellular protrusions and tubular recycling endosomes are important for codeIT. We automated image acquisition and analysis to facilitate large-scale chemical and genetic screening efforts to identify key regulators of codeIT. PMID:26271723

  11. Optical toolkits for in vivo deep tissue laser scanning microscopy: a primer

    NASA Astrophysics Data System (ADS)

    Lee, Woei Ming; McMenamin, Thomas; Li, Yongxiao

    2018-06-01

Life at the microscale is animated and multifaceted. The impact of dynamic in vivo microscopy in small animals has opened up opportunities to peer into a multitude of biological processes at the cellular scale in their native microenvironments. Laser scanning microscopy (LSM) coupled with targeted fluorescent proteins has become an indispensable tool to enable dynamic imaging in vivo at high temporal and spatial resolutions. In the last few decades, the technique has been translated from imaging cells in thin samples to mapping cells in the thick biological tissue of living organisms. Here, we sought to provide a concise overview of the design considerations of an LSM that enables cellular and subcellular imaging in deep tissue. Individual components under review include: long working distance microscope objectives, laser scanning technologies, adaptive optics devices, beam shaping technologies and photon detectors, with an emphasis on more recent advances. The review will conclude with the latest innovations in automated optical microscopy, which would impact tracking and quantification of heterogeneous populations of cells in vivo.

  12. Association of plasma cell subsets in the bone marrow and free light chain concentrations in the serum of monoclonal gammopathy patients.

    PubMed

    Ayliffe, Michael John; Behrens, Judith; Stern, Simon; Sumar, Nazira

    2012-08-01

    This study investigated bone marrow plasma cell subsets and monoclonal free light chain concentrations in blood of monoclonal gammopathy patients. 54 bone marrow samples were stained by double immunofluorescence to enumerate cellular subsets making either intact monoclonal immunoglobulin or free light chains only. Blood taken at the same time was assayed for free light chains by an automated immunoassay. Patients were assigned to three cellular population categories: single intact monoclonal immunoglobulin (59%), dual monoclonal immunoglobulin and free light chain only (31%), or single free light chain only (9%). The median affected free light chain concentration of each group was 75 mg/l, 903 mg/l and 3320 mg/l, respectively, but with substantial overlap. In myeloma patients the difference in serum free light chain concentrations between patients with free light chain only marrow cells and those without was statistically significant. Serum free light chain levels >600 mg/l result mostly from marrow cells restricted to free light chain production.

  13. Automated CPX support system preliminary design phase

    NASA Technical Reports Server (NTRS)

    Bordeaux, T. A.; Carson, E. T.; Hepburn, C. D.; Shinnick, F. M.

    1984-01-01

    The development of the Distributed Command and Control System (DCCS) is discussed. The development of an automated C2 system stimulated the development of an automated command post exercise (CPX) support system to provide a more realistic stimulus to DCCS than could be achieved with the existing manual system. An automated CPX system to support corps-level exercise was designed. The effort comprised four tasks: (1) collecting and documenting user requirements; (2) developing a preliminary system design; (3) defining a program plan; and (4) evaluating the suitability of the TRASANA FOURCE computer model.

  14. Generating Systems Biology Markup Language Models from the Synthetic Biology Open Language.

    PubMed

    Roehner, Nicholas; Zhang, Zhen; Nguyen, Tramy; Myers, Chris J

    2015-08-21

    In the context of synthetic biology, model generation is the automated process of constructing biochemical models based on genetic designs. This paper discusses the use cases for model generation in genetic design automation (GDA) software tools and introduces the foundational concepts of standards and model annotation that make this process useful. Finally, this paper presents an implementation of model generation in the GDA software tool iBioSim and provides an example of generating a Systems Biology Markup Language (SBML) model from a design of a 4-input AND sensor written in the Synthetic Biology Open Language (SBOL).
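Model generation of this kind ultimately emits SBML documents. As a rough illustration of the output such a generator produces (hand-rolled XML with hypothetical element IDs; a real GDA tool such as iBioSim uses full SBML libraries rather than this sketch):

```python
import xml.etree.ElementTree as ET

SBML_NS = "http://www.sbml.org/sbml/level3/version1/core"

def make_sbml_skeleton(model_id, species_ids):
    """Build a minimal SBML Level 3 document with one compartment
    and the given species. Illustration only: no reactions, rules,
    or SBOL-derived annotations are generated here."""
    sbml = ET.Element("sbml", xmlns=SBML_NS, level="3", version="1")
    model = ET.SubElement(sbml, "model", id=model_id)
    comps = ET.SubElement(model, "listOfCompartments")
    ET.SubElement(comps, "compartment", id="cell", constant="true")
    species_list = ET.SubElement(model, "listOfSpecies")
    for sid in species_ids:
        ET.SubElement(species_list, "species", id=sid,
                      compartment="cell", constant="false",
                      hasOnlySubstanceUnits="false",
                      boundaryCondition="false")
    return ET.tostring(sbml, encoding="unicode")

# Hypothetical species names loosely echoing a multi-input AND sensor:
xml_text = make_sbml_skeleton("and_sensor", ["input_a", "input_b", "output"])
print(xml_text)
```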

  15. Geometric Modeling of Cellular Materials for Additive Manufacturing in Biomedical Field: A Review

    PubMed Central

    Rosso, Stefano; Meneghello, Roberto; Concheri, Gianmaria

    2018-01-01

    Advances in additive manufacturing technologies facilitate the fabrication of cellular materials that have tailored functional characteristics. The application of solid freeform fabrication techniques is especially exploited in designing scaffolds for tissue engineering. In this review, firstly, a classification of cellular materials from a geometric point of view is proposed; then, the main approaches on geometric modeling of cellular materials are discussed. Finally, an investigation on porous scaffolds fabricated by additive manufacturing technologies is pointed out. Perspectives in geometric modeling of scaffolds for tissue engineering are also proposed. PMID:29487626

  16. Geometric Modeling of Cellular Materials for Additive Manufacturing in Biomedical Field: A Review.

    PubMed

    Savio, Gianpaolo; Rosso, Stefano; Meneghello, Roberto; Concheri, Gianmaria

    2018-01-01

    Advances in additive manufacturing technologies facilitate the fabrication of cellular materials that have tailored functional characteristics. The application of solid freeform fabrication techniques is especially exploited in designing scaffolds for tissue engineering. In this review, firstly, a classification of cellular materials from a geometric point of view is proposed; then, the main approaches on geometric modeling of cellular materials are discussed. Finally, an investigation on porous scaffolds fabricated by additive manufacturing technologies is pointed out. Perspectives in geometric modeling of scaffolds for tissue engineering are also proposed.

  17. A Fully Automated Microfluidic Femtosecond Laser Axotomy Platform for Nerve Regeneration Studies in C. elegans

    PubMed Central

    Gokce, Sertan Kutal; Guo, Samuel X.; Ghorashian, Navid; Everett, W. Neil; Jarrell, Travis; Kottek, Aubri; Bovik, Alan C.; Ben-Yakar, Adela

    2014-01-01

Femtosecond laser nanosurgery has been widely accepted as an axonal injury model, enabling nerve regeneration studies in the small model organism Caenorhabditis elegans. To overcome the time limitations of manual worm-handling techniques, automation and new immobilization technologies must be adopted to improve throughput in these studies. While new microfluidic immobilization techniques have been developed that promise to reduce the time required for axotomies, there is a need for automated procedures to minimize the required amount of human intervention and accelerate the axotomy processes crucial for high throughput. Here, we report a fully automated microfluidic platform for performing laser axotomies of fluorescently tagged neurons in living Caenorhabditis elegans. The presented automation process reduces the time required to perform axotomies within individual worms to ∼17 s/worm, at least one order of magnitude faster than manual approaches. The full automation is achieved with a unique chip design and an operation sequence that is fully computer controlled and synchronized with efficient and accurate image processing algorithms. The microfluidic device includes a T-shaped architecture and three-dimensional microfluidic interconnects to serially transport, position, and immobilize worms. The image processing algorithms can identify and precisely position axons targeted for ablation. There were no statistically significant differences observed in reconnection probabilities between axotomies carried out with the automated system and those performed manually with anesthetics. The overall success rate of automated axotomies was 67.4±3.2% (236/350 cases), at an average processing time of 17.0±2.4 s per worm. This fully automated platform establishes a promising methodology for prospective genome-wide screening of nerve regeneration in C. elegans in a truly high-throughput manner. PMID:25470130

  18. Differential proteomic analysis of mouse macrophages exposed to adsorbate-loaded heavy fuel oil derived combustion particles using an automated sample-preparation workflow.

    PubMed

    Kanashova, Tamara; Popp, Oliver; Orasche, Jürgen; Karg, Erwin; Harndorf, Horst; Stengel, Benjamin; Sklorz, Martin; Streibel, Thorsten; Zimmermann, Ralf; Dittmar, Gunnar

    2015-08-01

Ship diesel combustion particles are known to cause broad cytotoxic effects and thereby strongly impact human health. Particles from heavy fuel oil (HFO) operated ships are considered particularly dangerous. However, little is known about the relevant components of the ship emission particles. In particular, it is interesting to know whether the particle cores, consisting of soot and metal oxides, or the adsorbate layers, consisting of semi- and low-volatile organic compounds and salts, are more relevant. We therefore sought to relate the adsorbates and the core composition of HFO combustion particles to the early cellular responses, allowing for the development of measures that counteract their detrimental effects. Hence, the semi-volatile coating of HFO-operated ship diesel engine particles was removed by stepwise thermal stripping using different temperatures. RAW 264.7 macrophages were exposed to native and thermally stripped particles in submersed culture. Proteomic changes were monitored by two different quantitative mass spectrometry approaches, stable isotope labeling by amino acids in cell culture (SILAC) and dimethyl labeling. Our data revealed that cells reacted differently to native or stripped HFO combustion particles. Cells exposed to thermally stripped particles showed a very differential reaction with respect to the composition of the individual chemical load of the particle. The cellular reactions to the HFO particles included reaction to oxidative stress, reorganization of the cytoskeleton and changes in endocytosis. Cells exposed to the 280 °C treated particles showed an induction of RNA-related processes, a number of mitochondria-associated processes as well as a DNA damage response, while the exposure to 580 °C treated HFO particles mainly induced the regulation of intracellular transport. In summary, our analysis based on a highly reproducible automated proteomic sample-preparation procedure shows a diverse cellular response, depending on the soot particle composition. In particular, it was shown that both the molecules of the adsorbate layer and the particle cores induced strong but different effects in the exposed cells.

  19. Emergence of tissue mechanics from cellular processes: shaping a fly wing

    NASA Astrophysics Data System (ADS)

    Merkel, Matthias; Etournay, Raphael; Popovic, Marko; Nandi, Amitabha; Brandl, Holger; Salbreux, Guillaume; Eaton, Suzanne; Jülicher, Frank

Nowadays, biologists are able to image biological tissues with up to 10,000 cells in vivo, where the behavior of each individual cell can be followed in detail. However, how precisely large-scale tissue deformation and stresses emerge from cellular behavior remains elusive. Here, we study this question in the developing wing of the fruit fly. To this end, we first establish a geometrical framework that exactly decomposes tissue deformation into contributions by different kinds of cellular processes. These processes comprise cell shape changes, cell neighbor exchanges, cell divisions, and cell extrusions. As the key idea, we introduce a tiling of the cellular network into triangles. This approach also reveals that tissue deformation can be created by correlated cellular motion. Based on quantifications using these concepts, we developed a novel continuum mechanical model for the fly wing. In particular, our model includes active anisotropic stresses and a delay in the response of cell rearrangements to material stresses. A different approach to studying the emergence of tissue mechanics from cellular behavior is cell-based modeling. We characterize the properties of a cell-based model for 3D tissues that is a hybrid between single-particle models and the so-called vertex models.

  20. ChainMail based neural dynamics modeling of soft tissue deformation for surgical simulation.

    PubMed

    Zhang, Jinao; Zhong, Yongmin; Smith, Julian; Gu, Chengfan

    2017-07-20

    Realistic and real-time modeling and simulation of soft tissue deformation is a fundamental research issue in the field of surgical simulation. In this paper, a novel cellular neural network approach is presented for modeling and simulation of soft tissue deformation by combining neural dynamics of cellular neural network with ChainMail mechanism. The proposed method formulates the problem of elastic deformation into cellular neural network activities to avoid the complex computation of elasticity. The local position adjustments of ChainMail are incorporated into the cellular neural network as the local connectivity of cells, through which the dynamic behaviors of soft tissue deformation are transformed into the neural dynamics of cellular neural network. Experiments demonstrate that the proposed neural network approach is capable of modeling the soft tissues' nonlinear deformation and typical mechanical behaviors. The proposed method not only improves ChainMail's linear deformation with the nonlinear characteristics of neural dynamics but also enables the cellular neural network to follow the principle of continuum mechanics to simulate soft tissue deformation.
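The ChainMail mechanism described above can be illustrated in one dimension: when one element moves, its neighbors adjust only as far as needed to keep each inter-element gap within bounds, and the disturbance stops propagating once a gap is back in range. A minimal sketch (illustrative only; it omits the paper's cellular neural network dynamics, and the gap bounds are arbitrary assumptions):

```python
def chainmail_1d(positions, idx, new_pos, min_gap=0.5, max_gap=1.5):
    """Move one node and propagate local position adjustments so every
    neighbour gap stays within [min_gap, max_gap] (1D ChainMail)."""
    p = list(positions)
    p[idx] = new_pos
    # propagate to the right
    for i in range(idx + 1, len(p)):
        gap = p[i] - p[i - 1]
        if gap < min_gap:
            p[i] = p[i - 1] + min_gap
        elif gap > max_gap:
            p[i] = p[i - 1] + max_gap
        else:
            break  # constraint satisfied; disturbance stops here
    # propagate to the left
    for i in range(idx - 1, -1, -1):
        gap = p[i + 1] - p[i]
        if gap < min_gap:
            p[i] = p[i + 1] - min_gap
        elif gap > max_gap:
            p[i] = p[i + 1] - max_gap
        else:
            break
    return p

# Dragging node 0 far to the left stretches the chain up to max_gap:
print(chainmail_1d([0.0, 1.0, 2.0, 3.0], idx=0, new_pos=-2.0))
# → [-2.0, -0.5, 1.0, 2.5]
```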

  1. An Automated Method for Landmark Identification and Finite-Element Modeling of the Lumbar Spine.

    PubMed

    Campbell, Julius Quinn; Petrella, Anthony J

    2015-11-01

The purpose of this study was to develop a method for the automated creation of finite-element models of the lumbar spine. Custom scripts were written to extract bone landmarks of lumbar vertebrae and assemble L1-L5 finite-element models. End-plate borders, ligament attachment points, and facet surfaces were identified. Landmarks were identified to maintain correspondence between meshes for later use in statistical shape modeling. Ninety lumbar vertebrae were processed, creating 18 subject-specific finite-element models. Finite-element model surfaces and ligament attachment points were reproduced to within 10⁻⁵ mm of the bone surface, including the critical contact surfaces of the facets. Element quality exceeded specifications in 97% of elements for the 18 models created. The current method is capable of producing subject-specific finite-element models of the lumbar spine with good accuracy, quality, and robustness. The automated methods developed represent an advance in the state of the art of subject-specific lumbar spine modeling, to a scale not possible with prior manual and semiautomated methods.

  2. A NASA F/A-18, participating in the Automated Aerial Refueling (AAR) project, flies over the Dryden

    NASA Technical Reports Server (NTRS)

    2002-01-01

    A NASA F/A-18 is participating in the Automated Aerial Refueling (AAR) project. The 300-gallon aerial refueling store seen on the belly of the aircraft carries fuel and a refueling drogue. This aircraft acts as a tanker in the study to develop an aerodynamic model for future automated aerial refueling, especially of unmanned vehicles.

  3. Evaluation of computer-based computer tomography stratification against outcome models in connective tissue disease-related interstitial lung disease: a patient outcome study.

    PubMed

    Jacob, Joseph; Bartholmai, Brian J; Rajagopalan, Srinivasan; Brun, Anne Laure; Egashira, Ryoko; Karwoski, Ronald; Kokosi, Maria; Wells, Athol U; Hansell, David M

    2016-11-23

To evaluate computer-based computer tomography (CT) analysis (CALIPER) against visual CT scoring and pulmonary function tests (PFTs) when predicting mortality in patients with connective tissue disease-related interstitial lung disease (CTD-ILD). To identify outcome differences between distinct CTD-ILD groups derived following automated stratification of CALIPER variables. A total of 203 consecutive patients with assorted CTD-ILDs had CT parenchymal patterns evaluated by CALIPER and visual CT scoring: honeycombing, reticular pattern, ground glass opacities, pulmonary vessel volume, emphysema, and traction bronchiectasis. CT scores were evaluated against pulmonary function tests: forced vital capacity, diffusing capacity for carbon monoxide, carbon monoxide transfer coefficient, and composite physiologic index for mortality analysis. Automated stratification of CALIPER-CT variables was evaluated in place of and alongside forced vital capacity and diffusing capacity for carbon monoxide in the ILD gender, age physiology (ILD-GAP) model using receiver operating characteristic curve analysis. Cox regression analyses identified four independent predictors of mortality: patient age (P < 0.0001), smoking history (P = 0.0003), carbon monoxide transfer coefficient (P = 0.003), and pulmonary vessel volume (P < 0.0001). Automated stratification of CALIPER variables identified three morphologically distinct groups which were stronger predictors of mortality than all CT and functional indices. The Stratified-CT model substituted automated stratified groups for functional indices in the ILD-GAP model and maintained model strength (area under curve (AUC) = 0.74, P < 0.0001), ILD-GAP (AUC = 0.72, P < 0.0001). Combining automated stratified groups with the ILD-GAP model (stratified CT-GAP model) strengthened predictions of 1- and 2-year mortality: ILD-GAP (AUC = 0.87 and 0.86, respectively); stratified CT-GAP (AUC = 0.89 and 0.88, respectively). CALIPER-derived pulmonary vessel volume is an independent predictor of mortality across all CTD-ILD patients. Furthermore, automated stratification of CALIPER CT variables represents a novel method of prognostication at least as robust as PFTs in CTD-ILD patients.

  4. A Comparative Experimental Study on the Use of Machine Learning Approaches for Automated Valve Monitoring Based on Acoustic Emission Parameters

    NASA Astrophysics Data System (ADS)

    Ali, Salah M.; Hui, K. H.; Hee, L. M.; Salman Leong, M.; Al-Obaidi, M. A.; Ali, Y. H.; Abdelrhman, Ahmed M.

    2018-03-01

Acoustic emission (AE) analysis has become a vital tool for initiating maintenance tasks in many industries. However, the analysis process and interpretation have been found to be highly dependent on experts. Therefore, an automated monitoring method is required to reduce the cost and time consumed in the interpretation of AE signals. This paper investigates the application of two of the most common machine learning approaches, namely artificial neural networks (ANN) and support vector machines (SVM), to automate the diagnosis of valve faults in reciprocating compressors based on AE signal parameters. Since accuracy is an essential factor in any automated diagnostic system, this paper also provides a comparative study based on the predictive performance of ANN and SVM. AE parameter data were acquired from a single-stage reciprocating air compressor with different operational and valve conditions. ANN and SVM diagnosis models were subsequently devised by combining AE parameters of different conditions. Results demonstrate that the ANN and SVM models achieve the same prediction accuracy. However, the SVM model is recommended for automated diagnosis of valve condition due to its ability to handle a high number of input features with small training data sets.
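Classifiers of this kind are fed AE waveform parameters rather than raw signals. A minimal sketch of such parameter extraction (the parameter set and threshold are illustrative assumptions, not the paper's exact feature list):

```python
import math

def ae_parameters(signal, threshold=0.1):
    """Compute common acoustic-emission waveform parameters:
    peak amplitude, RMS, and rising threshold-crossing count.
    Simplified illustration of classifier input features."""
    peak = max(abs(x) for x in signal)
    rms = math.sqrt(sum(x * x for x in signal) / len(signal))
    counts = sum(1 for a, b in zip(signal, signal[1:])
                 if a < threshold <= b)  # rising crossings only
    return {"peak": peak, "rms": rms, "counts": counts}

# A toy burst-like waveform (hypothetical data):
burst = [0.0, 0.05, 0.3, -0.2, 0.15, -0.05, 0.02, 0.0]
print(ae_parameters(burst))
```

Feature vectors built this way, one per recorded burst and labelled with the known valve condition, are what the ANN and SVM models would be trained on.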

  5. Assessing user acceptance towards automated and conventional sink use for hand decontamination using the technology acceptance model.

    PubMed

    Dawson, Carolyn H; Mackrill, Jamie B; Cain, Rebecca

    2017-12-01

Hand hygiene (HH) prevents harmful contaminants spreading in settings including domestic, health care and food handling. Strategies to improve HH range from behavioural techniques through to automated sinks that ensure hand surface cleaning. This study aimed to assess user experience and acceptance towards a new automated sink, compared to a normal sink. An adapted version of the technology acceptance model (TAM) assessed each mode of handwashing. A within-subjects design enabled N = 46 participants to evaluate both sinks. Perceived Ease of Use and Satisfaction of Use were significantly lower for the automated sink, compared to the conventional sink (p < 0.005). Across the remaining TAM factors, there was no significant difference. Participants suggested design features including jet strength, water temperature and device affordance may improve HH technology. We provide recommendations for future HH technology development to contribute to a positive user experience, relevant to technology developers, ergonomists and those involved in HH across all sectors. Practitioner Summary: The need to facilitate timely, effective hand hygiene to prevent illness has led to a rise in automated handwashing systems across different contexts. User acceptance is a key factor in system uptake. This paper applies the technology acceptance model as a means to explore and optimise the design of such systems.

  6. An algorithm for automated layout of process description maps drawn in SBGN.

    PubMed

    Genc, Begum; Dogrusoz, Ugur

    2016-01-01

Evolving technology has increased the focus on genomics. The combination of today's advanced techniques with decades of molecular biology research has yielded huge amounts of pathway data. A standard, named the Systems Biology Graphical Notation (SBGN), was recently introduced to allow scientists to represent biological pathways in an unambiguous, easy-to-understand and efficient manner. Although there are a number of automated layout algorithms for various types of biological networks, currently none specializes in process description (PD) maps as defined by SBGN. We propose a new automated layout algorithm for PD maps drawn in SBGN. Our algorithm is based on a force-directed automated layout algorithm called Compound Spring Embedder (CoSE). On top of the existing force scheme, additional heuristics employing new types of forces and movement rules are defined to address SBGN-specific rules. Our algorithm is the only automatic layout algorithm that properly addresses all SBGN rules for drawing PD maps, including placement of substrates and products of process nodes on opposite sides, compact tiling of members of molecular complexes, and extensive use of nested structures (compound nodes) to properly draw cellular locations and molecular complex structures. As demonstrated experimentally, the algorithm results in significant improvements over use of a generic layout algorithm such as CoSE in addressing SBGN rules on top of commonly accepted graph drawing criteria. An implementation of our algorithm in Java is available within the ChiLay library (https://github.com/iVis-at-Bilkent/chilay). ugur@cs.bilkent.edu.tr or dogrusoz@cbio.mskcc.org Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
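The CoSE base layout referenced above is a spring embedder: edges act as springs pulling their endpoints together while all node pairs repel. A toy sketch of that force scheme (pure Python with arbitrary constants; none of the SBGN-specific forces or compound-node handling from the paper is modelled):

```python
import math
import random

def spring_layout(nodes, edges, iterations=200, k=1.0, step=0.05):
    """Basic force-directed layout: pairwise repulsion plus spring
    attraction along edges, with a bounded displacement per step."""
    random.seed(0)
    pos = {v: [random.random(), random.random()] for v in nodes}
    for _ in range(iterations):
        disp = {v: [0.0, 0.0] for v in nodes}
        for u in nodes:              # pairwise repulsion
            for v in nodes:
                if u == v:
                    continue
                dx = pos[u][0] - pos[v][0]
                dy = pos[u][1] - pos[v][1]
                d = math.hypot(dx, dy) or 1e-9
                f = k * k / d
                disp[u][0] += f * dx / d
                disp[u][1] += f * dy / d
        for u, v in edges:           # spring attraction along edges
            dx = pos[u][0] - pos[v][0]
            dy = pos[u][1] - pos[v][1]
            d = math.hypot(dx, dy) or 1e-9
            f = d * d / k
            disp[u][0] -= f * dx / d
            disp[u][1] -= f * dy / d
            disp[v][0] += f * dx / d
            disp[v][1] += f * dy / d
        for v in nodes:              # apply a capped displacement
            dx, dy = disp[v]
            d = math.hypot(dx, dy) or 1e-9
            pos[v][0] += step * dx / d * min(d, 1.0)
            pos[v][1] += step * dy / d * min(d, 1.0)
    return pos

layout = spring_layout(["A", "B", "C", "D"],
                       [("A", "B"), ("B", "C"), ("C", "D")])
```

CoSE extends this basic scheme with gravity and compound-node forces; the SBGN-specific heuristics in the paper add further forces on top of that.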

  7. An algorithm for automated layout of process description maps drawn in SBGN

    PubMed Central

    Genc, Begum; Dogrusoz, Ugur

    2016-01-01

Motivation: Evolving technology has increased the focus on genomics. The combination of today's advanced techniques with decades of molecular biology research has yielded huge amounts of pathway data. A standard, named the Systems Biology Graphical Notation (SBGN), was recently introduced to allow scientists to represent biological pathways in an unambiguous, easy-to-understand and efficient manner. Although there are a number of automated layout algorithms for various types of biological networks, currently none specializes in process description (PD) maps as defined by SBGN. Results: We propose a new automated layout algorithm for PD maps drawn in SBGN. Our algorithm is based on a force-directed automated layout algorithm called Compound Spring Embedder (CoSE). On top of the existing force scheme, additional heuristics employing new types of forces and movement rules are defined to address SBGN-specific rules. Our algorithm is the only automatic layout algorithm that properly addresses all SBGN rules for drawing PD maps, including placement of substrates and products of process nodes on opposite sides, compact tiling of members of molecular complexes, and extensive use of nested structures (compound nodes) to properly draw cellular locations and molecular complex structures. As demonstrated experimentally, the algorithm results in significant improvements over use of a generic layout algorithm such as CoSE in addressing SBGN rules on top of commonly accepted graph drawing criteria. Availability and implementation: An implementation of our algorithm in Java is available within the ChiLay library (https://github.com/iVis-at-Bilkent/chilay). Contact: ugur@cs.bilkent.edu.tr or dogrusoz@cbio.mskcc.org Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26363029

  8. Comparison of two methodologies for the enrichment of mononuclear cells from thawed cord blood products: The automated Sepax system versus the manual Ficoll method.

    PubMed

    Kaur, Indreshpal; Zulovich, Jane M; Gonzalez, Marissa; McGee, Kara M; Ponweera, Nirmali; Thandi, Daljit; Alvarez, Enrique F; Annandale, Kathy; Flagge, Frank; Rezvani, Katayoun; Shpall, Elizabeth

    2017-03-01

Umbilical cord blood (CB) is being used as a source of hematopoietic stem cells (HSCs) and immune cells to treat many disorders. Because these cells are present in low numbers in CB, investigators have developed strategies to expand HSCs and other immune cells such as natural killer (NK) cells. The initial step in this process is to enrich mononuclear cells (MNCs) while depleting unwanted cells. The manual method of MNC enrichment is routinely used by many centers; however, it is an open system, time-consuming and operator dependent. For clinical manufacturing, it is important to have a closed system to avoid microbial contamination. In this study, we optimized an automated, closed system (Sepax) for enriching MNCs from cryopreserved CB units. Using Sepax, we observed higher recovery of total nucleated cells (TNC), CD34+ cells, NK cells and monocytes when compared to manual enrichment, despite similar TNC and CD34+ viability with the two methods. Even though the depletion of red blood cells, granulocytes and platelets was superior using the manual method, significantly higher CFU-GM were obtained in MNCs enriched using Sepax compared to the manual method. This is likely related to the fact that the automated Sepax significantly shortened the processing time (Sepax: 74-175 minutes versus manual method: 180-290 minutes). The use of DNAse and MgCl2 during the Sepax thaw and wash procedure prevents clumping of cells and loss of viability, resulting in improved post-thaw cell recovery. We optimized enrichment of MNCs from cryopreserved CB products in a closed system using the Sepax, an automated, walk-away processing system. Copyright © 2017 International Society for Cellular Therapy. Published by Elsevier Inc. All rights reserved.

  9. A universal algorithm for an improved finite element mesh generation: Mesh quality assessment in comparison to former automated mesh-generators and an analytic model.

    PubMed

    Kaminsky, Jan; Rodt, Thomas; Gharabaghi, Alireza; Forster, Jan; Brand, Gerd; Samii, Madjid

    2005-06-01

    The FE modeling of complex anatomical structures has not yet been solved satisfactorily. Voxel-based, as opposed to contour-based, algorithms allow automated mesh generation based on the image data; nonetheless, their geometric precision is limited. We developed an automated mesh generator that combines the advantages of voxel-based generation with improved representation of the geometry by displacement of nodes onto the object surface. Models of an artificial 3D pipe section and a skull base were generated with different mesh densities using the newly developed geometric, unsmoothed and smoothed voxel generators. Compared to the analytic calculation of the 3D pipe-section model, the normalized RMS error of the surface stress was 0.173-0.647 for the unsmoothed voxel models, 0.111-0.616 for the smoothed voxel models with small volume error, and 0.126-0.273 for the geometric models. The highest element-energy error, as a criterion for mesh quality, was 2.61x10(-2) N mm, 2.46x10(-2) N mm and 1.81x10(-2) N mm for the unsmoothed, smoothed and geometric voxel models, respectively. The geometric model of the 3D skull base resulted in the lowest element-energy error and volume error. This algorithm also allowed the best representation of anatomical details. The presented geometric mesh generator is universally applicable and allows automated and accurate modeling by combining the advantages of the voxel technique and of improved surface modeling.
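The normalized RMS error used above to compare computed surface stress against the analytic solution can be illustrated with a short sketch. The exact normalization convention is not specified in the record; the version below normalizes by the RMS of the analytic reference values, which is one common choice, and the stress values are made up for illustration.

```python
import numpy as np

def normalized_rms_error(computed, reference):
    """RMS of the pointwise error, normalized by the RMS of the
    analytic reference values (one common convention)."""
    computed = np.asarray(computed, dtype=float)
    reference = np.asarray(reference, dtype=float)
    rms_err = np.sqrt(np.mean((computed - reference) ** 2))
    rms_ref = np.sqrt(np.mean(reference ** 2))
    return rms_err / rms_ref

# Surface stress from a coarse voxel model vs. the analytic solution
analytic = np.array([10.0, 12.0, 15.0, 11.0])
voxel    = np.array([ 8.5, 13.1, 17.2,  9.4])
err = normalized_rms_error(voxel, analytic)
```

A finer or better-fitted mesh would drive this number toward zero, which is how the three generator variants are ranked in the abstract.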

  10. Toward Multiscale Models of Cyanobacterial Growth: A Modular Approach

    PubMed Central

    Westermark, Stefanie; Steuer, Ralf

    2016-01-01

    Oxygenic photosynthesis has dominated global primary productivity ever since its evolution more than three billion years ago. While many aspects of phototrophic growth are well understood, it remains a considerable challenge to elucidate the manifold dependencies and interconnections between the diverse cellular processes that together facilitate the synthesis of new cells. Phototrophic growth involves the coordinated action of several layers of cellular functioning, ranging from the photosynthetic light reactions and the electron transport chain, to carbon-concentrating mechanisms and the assimilation of inorganic carbon. It requires the synthesis of new building blocks by cellular metabolism, protection against excessive light, as well as diurnal regulation by a circadian clock and the orchestration of gene expression and cell division. Computational modeling allows us to quantitatively describe these cellular functions and processes relevant for phototrophic growth. As yet, however, computational models are mostly confined to the inner workings of individual cellular processes, rather than describing the manifold interactions between them in the context of a living cell. Using cyanobacteria as model organisms, this contribution seeks to summarize existing computational models that are relevant to describe phototrophic growth and seeks to outline their interactions and dependencies. Our ultimate aim is to understand cellular functioning and growth as the outcome of a coordinated operation of diverse yet interconnected cellular processes. PMID:28083530

  11. Point process models for localization and interdependence of punctate cellular structures.

    PubMed

    Li, Ying; Majarian, Timothy D; Naik, Armaghan W; Johnson, Gregory R; Murphy, Robert F

    2016-07-01

    Accurate representations of cellular organization for multiple eukaryotic cell types are required for creating predictive models of dynamic cellular function. To this end, we have previously developed the CellOrganizer platform, an open source system for generative modeling of cellular components from microscopy images. CellOrganizer models capture the inherent heterogeneity in the spatial distribution, size, and quantity of different components among a cell population. Furthermore, CellOrganizer can generate quantitatively realistic synthetic images that reflect the underlying cell population. A current focus of the project is to model the complex, interdependent nature of organelle localization. We built upon previous work on developing multiple non-parametric models of organelles or structures that show punctate patterns. The previous models described the relationships between the subcellular localization of puncta and the positions of cell and nuclear membranes and microtubules. We extend these models to consider the relationship to the endoplasmic reticulum (ER), and to consider the relationship between the positions of different puncta of the same type. Our results do not suggest that the punctate patterns we examined are dependent on ER position or inter- and intra-class proximity. With these results, we built classifiers to update previous assignments of proteins to one of 11 patterns in three distinct cell lines. Our generative models demonstrate the ability to construct statistically accurate representations of puncta localization from simple cellular markers in distinct cell types, capturing the complex phenomena of cellular structure interaction with little human input. This protocol represents a novel approach to vesicular protein annotation, a field that is often neglected in high-throughput microscopy. These results suggest that spatial point process models provide useful insight with respect to the spatial dependence between cellular structures. 
© 2016 International Society for Advancement of Cytometry.
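The inter-puncta proximity question examined in this record can be approximated with a standard nearest-neighbor statistic. The sketch below is a generic Clark-Evans-style test, not the CellOrganizer implementation: a ratio near 1 is consistent with the paper's finding of no dependence between puncta positions.

```python
import numpy as np

def clark_evans_ratio(points, area):
    """Ratio of observed mean nearest-neighbor distance to the value
    expected under complete spatial randomness (CSR). Values near 1
    suggest neither clustering nor dispersion (no edge correction)."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    # Pairwise distances; mask the zero self-distances on the diagonal.
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    observed = d.min(axis=1).mean()
    expected = 0.5 / np.sqrt(n / area)  # CSR expectation for density n/area
    return observed / expected

rng = np.random.default_rng(0)
pts = rng.uniform(0, 100, size=(200, 2))   # simulated puncta in a 100x100 region
ratio = clark_evans_ratio(pts, area=100 * 100)
```

A full point process model would condition on cell and nuclear membranes as the paper does; this statistic only probes the inter-punctum term.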

  12. Automated Guideway Network Traffic Modeling

    DOT National Transportation Integrated Search

    1972-02-01

    In the literature concerning automated guideway transportation systems, such as dual mode, a great deal of effort has been expended on the use of deterministic reservation schemes and the problem of merging streams of vehicles. However, little attent...

  13. Automated Guideway Ground Transportation Network Simulation

    DOT National Transportation Integrated Search

    1975-08-01

    The report discusses some automated guideway management problems relating to ground transportation systems and provides an outline of the types of models and algorithms that could be used to develop simulation tools for evaluating system performance....

  14. Development of mathematical models for automation of strength calculation during plastic deformation processing

    NASA Astrophysics Data System (ADS)

    Steposhina, S. V.; Fedonin, O. N.

    2018-03-01

    Dependencies that make it possible to automate the force calculation during surface plastic deformation (SPD) processing and, thus, to shorten the time for technological preparation of production have been developed.

  15. The development and verification of a highly accurate collision prediction model for automated noncoplanar plan delivery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Victoria Y.; Tran, Angelia; Nguyen, Dan

    2015-11-15

    Purpose: Significant dosimetric benefits had been previously demonstrated in highly noncoplanar treatment plans. In this study, the authors developed and verified an individualized collision model for the purpose of delivering highly noncoplanar radiotherapy and tested the feasibility of total delivery automation with Varian TrueBeam developer mode. Methods: A hand-held 3D scanner was used to capture the surfaces of an anthropomorphic phantom and a human subject, which were positioned with a computer-aided design model of a TrueBeam machine to create a detailed virtual geometrical collision model. The collision model included gantry, collimator, and couch motion degrees of freedom. The accuracy of the 3D scanner was validated by scanning a rigid cubical phantom with known dimensions. The collision model was then validated by generating 300 linear accelerator orientations corresponding to 300 gantry-to-couch and gantry-to-phantom distances, and comparing the corresponding distance measurements to their corresponding models. The linear accelerator orientations reflected uniformly sampled noncoplanar beam angles to the head, lung, and prostate. The distance discrepancies between measurements on the physical and virtual systems were used to estimate treatment-site-specific safety buffer distances with 0.1%, 0.01%, and 0.001% probability of collision between the gantry and couch or phantom. Plans containing 20 noncoplanar beams to the brain, lung, and prostate optimized via an in-house noncoplanar radiotherapy platform were converted into XML script for automated delivery and the entire delivery was recorded and timed to demonstrate the feasibility of automated delivery. Results: The 3D scanner measured the dimension of the 14 cm cubic phantom within 0.5 mm. The maximal absolute discrepancy between machine and model measurements for gantry-to-couch and gantry-to-phantom was 0.95 and 2.97 cm, respectively.
The reduced accuracy of gantry-to-phantom measurements was attributed to phantom setup errors due to the slightly deformable and flexible phantom extremities. The estimated site-specific safety buffer distance with 0.001% probability of collision for (gantry-to-couch, gantry-to-phantom) was (1.23 cm, 3.35 cm), (1.01 cm, 3.99 cm), and (2.19 cm, 5.73 cm) for treatment to the head, lung, and prostate, respectively. Automated delivery to all three treatment sites was completed in 15 min and collision free using a digital Linac. Conclusions: An individualized collision prediction model for the purpose of noncoplanar beam delivery was developed and verified. With the model, the study has demonstrated the feasibility of predicting deliverable beams for an individual patient and then guiding fully automated noncoplanar treatment delivery. This work motivates development of clinical workflows and quality assurance procedures to allow more extensive use and automation of noncoplanar beam geometries.
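The way such safety buffers are used downstream can be sketched as a simple feasibility filter: a candidate beam orientation is kept only if the clearance predicted by the virtual collision model exceeds the site-specific buffer. The function names and candidate clearances below are hypothetical; only the buffer values are taken from the abstract.

```python
def beam_is_deliverable(model_clearance_cm, safety_buffer_cm):
    """A candidate orientation is deliverable only if the clearance
    predicted by the virtual collision model exceeds the safety buffer
    estimated from physical-vs-virtual discrepancies."""
    return model_clearance_cm > safety_buffer_cm

# Site-specific (gantry-to-couch, gantry-to-patient) buffers at 0.001%
# collision probability, as reported in the abstract:
buffers = {"head": (1.23, 3.35), "lung": (1.01, 3.99), "prostate": (2.19, 5.73)}

def filter_beams(candidates, site):
    """Keep orientations whose modeled couch and patient clearances
    both exceed the site's safety buffers."""
    couch_buf, patient_buf = buffers[site]
    return [b for b in candidates
            if beam_is_deliverable(b["couch_cm"], couch_buf)
            and beam_is_deliverable(b["patient_cm"], patient_buf)]

candidates = [{"couch_cm": 5.0, "patient_cm": 7.0},
              {"couch_cm": 1.0, "patient_cm": 7.0}]
ok = filter_beams(candidates, "head")   # second beam fails the couch buffer
```

The surviving orientations would then feed the noncoplanar plan optimizer described in the study.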

  16. The development and verification of a highly accurate collision prediction model for automated noncoplanar plan delivery.

    PubMed

    Yu, Victoria Y; Tran, Angelia; Nguyen, Dan; Cao, Minsong; Ruan, Dan; Low, Daniel A; Sheng, Ke

    2015-11-01

    Significant dosimetric benefits had been previously demonstrated in highly noncoplanar treatment plans. In this study, the authors developed and verified an individualized collision model for the purpose of delivering highly noncoplanar radiotherapy and tested the feasibility of total delivery automation with Varian TrueBeam developer mode. A hand-held 3D scanner was used to capture the surfaces of an anthropomorphic phantom and a human subject, which were positioned with a computer-aided design model of a TrueBeam machine to create a detailed virtual geometrical collision model. The collision model included gantry, collimator, and couch motion degrees of freedom. The accuracy of the 3D scanner was validated by scanning a rigid cubical phantom with known dimensions. The collision model was then validated by generating 300 linear accelerator orientations corresponding to 300 gantry-to-couch and gantry-to-phantom distances, and comparing the corresponding distance measurements to their corresponding models. The linear accelerator orientations reflected uniformly sampled noncoplanar beam angles to the head, lung, and prostate. The distance discrepancies between measurements on the physical and virtual systems were used to estimate treatment-site-specific safety buffer distances with 0.1%, 0.01%, and 0.001% probability of collision between the gantry and couch or phantom. Plans containing 20 noncoplanar beams to the brain, lung, and prostate optimized via an in-house noncoplanar radiotherapy platform were converted into XML script for automated delivery and the entire delivery was recorded and timed to demonstrate the feasibility of automated delivery. The 3D scanner measured the dimension of the 14 cm cubic phantom within 0.5 mm. The maximal absolute discrepancy between machine and model measurements for gantry-to-couch and gantry-to-phantom was 0.95 and 2.97 cm, respectively. 
The reduced accuracy of gantry-to-phantom measurements was attributed to phantom setup errors due to the slightly deformable and flexible phantom extremities. The estimated site-specific safety buffer distance with 0.001% probability of collision for (gantry-to-couch, gantry-to-phantom) was (1.23 cm, 3.35 cm), (1.01 cm, 3.99 cm), and (2.19 cm, 5.73 cm) for treatment to the head, lung, and prostate, respectively. Automated delivery to all three treatment sites was completed in 15 min and collision free using a digital Linac. An individualized collision prediction model for the purpose of noncoplanar beam delivery was developed and verified. With the model, the study has demonstrated the feasibility of predicting deliverable beams for an individual patient and then guiding fully automated noncoplanar treatment delivery. This work motivates development of clinical workflows and quality assurance procedures to allow more extensive use and automation of noncoplanar beam geometries.

  17. The development and verification of a highly accurate collision prediction model for automated noncoplanar plan delivery

    PubMed Central

    Yu, Victoria Y.; Tran, Angelia; Nguyen, Dan; Cao, Minsong; Ruan, Dan; Low, Daniel A.; Sheng, Ke

    2015-01-01

    Purpose: Significant dosimetric benefits had been previously demonstrated in highly noncoplanar treatment plans. In this study, the authors developed and verified an individualized collision model for the purpose of delivering highly noncoplanar radiotherapy and tested the feasibility of total delivery automation with Varian TrueBeam developer mode. Methods: A hand-held 3D scanner was used to capture the surfaces of an anthropomorphic phantom and a human subject, which were positioned with a computer-aided design model of a TrueBeam machine to create a detailed virtual geometrical collision model. The collision model included gantry, collimator, and couch motion degrees of freedom. The accuracy of the 3D scanner was validated by scanning a rigid cubical phantom with known dimensions. The collision model was then validated by generating 300 linear accelerator orientations corresponding to 300 gantry-to-couch and gantry-to-phantom distances, and comparing the corresponding distance measurements to their corresponding models. The linear accelerator orientations reflected uniformly sampled noncoplanar beam angles to the head, lung, and prostate. The distance discrepancies between measurements on the physical and virtual systems were used to estimate treatment-site-specific safety buffer distances with 0.1%, 0.01%, and 0.001% probability of collision between the gantry and couch or phantom. Plans containing 20 noncoplanar beams to the brain, lung, and prostate optimized via an in-house noncoplanar radiotherapy platform were converted into XML script for automated delivery and the entire delivery was recorded and timed to demonstrate the feasibility of automated delivery. Results: The 3D scanner measured the dimension of the 14 cm cubic phantom within 0.5 mm. The maximal absolute discrepancy between machine and model measurements for gantry-to-couch and gantry-to-phantom was 0.95 and 2.97 cm, respectively. 
The reduced accuracy of gantry-to-phantom measurements was attributed to phantom setup errors due to the slightly deformable and flexible phantom extremities. The estimated site-specific safety buffer distance with 0.001% probability of collision for (gantry-to-couch, gantry-to-phantom) was (1.23 cm, 3.35 cm), (1.01 cm, 3.99 cm), and (2.19 cm, 5.73 cm) for treatment to the head, lung, and prostate, respectively. Automated delivery to all three treatment sites was completed in 15 min and collision free using a digital Linac. Conclusions: An individualized collision prediction model for the purpose of noncoplanar beam delivery was developed and verified. With the model, the study has demonstrated the feasibility of predicting deliverable beams for an individual patient and then guiding fully automated noncoplanar treatment delivery. This work motivates development of clinical workflows and quality assurance procedures to allow more extensive use and automation of noncoplanar beam geometries. PMID:26520735

  18. Automating Security Protocol Analysis

    DTIC Science & Technology

    2004-03-01

    language that allows easy representation of pattern interaction. Using CSP, Lowe tests whether a protocol achieves authentication. In the case of...only to correctly code whatever protocol they intend to evaluate. The tool, OCaml 3.04 [1], translates the protocol into Horn clauses and then...model protocol transactions. One example of automated modeling software is Maude [19]. Maude was the intended language for this research, but Java

  19. Automated Welding System

    NASA Technical Reports Server (NTRS)

    Bayless, E. O.; Lawless, K. G.; Kurgan, C.; Nunes, A. C.; Graham, B. F.; Hoffman, D.; Jones, C. S.; Shepard, R.

    1993-01-01

    Fully automated variable-polarity plasma arc VPPA welding system developed at Marshall Space Flight Center. System eliminates defects caused by human error. Integrates many sensors with mathematical model of the weld and computer-controlled welding equipment. Sensors provide real-time information on geometry of weld bead, location of weld joint, and wire-feed entry. Mathematical model relates geometry of weld to critical parameters of welding process.

  20. An Office Automation Needs Assessment Model

    DTIC Science & Technology

    1985-08-01

    TRACKING FORM . . . 74 I. CSD OFFICE SYSTEMS ANALYSIS WORKSHEETS . . . 75 J. AMO EVALUATIONS OF PROPOSED MODEL ...... 113 FOOTNOTES...as to "who should plan for office automated systems," a checklist of attributes should be evaluated , including: experience, expertise, availability of...with experience, differs with respect to breadth of knowledge in numerous areas. In evaluating in-house vs. outside resources, the Hospital Commander

  1. Drivers' communicative interactions: on-road observations and modelling for integration in future automation systems.

    PubMed

    Portouli, Evangelia; Nathanael, Dimitris; Marmaras, Nicolas

    2014-01-01

    Social interactions with other road users are an essential component of the driving activity and may prove critical in view of future automation systems; yet up to now they have received only limited attention in the scientific literature. In this paper, it is argued that drivers base their anticipations about the traffic scene to a large extent on observations of the social behaviour of other 'animate human-vehicles'. It is further argued that in cases of uncertainty, drivers seek to establish mutual situational awareness through deliberate communicative interactions. A linguistic model is proposed for modelling these communicative interactions. Empirical evidence from on-road observations and analysis of concurrent running commentary by 25 experienced drivers supports the proposed model. It is suggested that the integration of a social-interactions layer based on illocutionary acts in future driving support and automation systems will improve their performance towards matching human drivers' expectations. Practitioner Summary: Interactions between drivers on the road may play a significant role in traffic coordination. On-road observations and running commentaries are presented as empirical evidence to support a model of such interactions; incorporation of drivers' interactions in future driving support and automation systems may improve their performance towards matching drivers' expectations.

  2. An Automated, Experimenter-Free Method for the Standardised, Operant Cognitive Testing of Rats

    PubMed Central

    Rivalan, Marion; Munawar, Humaira; Fuchs, Anna; Winter, York

    2017-01-01

    Animal models of human pathology are essential for biomedical research. However, a recurring issue in the use of animal models is the poor reproducibility of behavioural and physiological findings within and between laboratories, and the most critical factor influencing this issue remains the experimenter. One solution is the use of procedures devoid of human intervention. We present a novel approach to the experimenter-free testing of cognitive abilities in rats, combining undisturbed group housing with automated, standardized and individual operant testing. This experimenter-free system consisted of an automated operant system (Bussey-Saksida rat touch screen) connected to a home cage containing group-living rats via an automated animal sorter (PhenoSys). The automated animal sorter, which is based on radio-frequency identification (RFID) technology, functioned as a mechanical replacement for the experimenter. Rats learnt to regularly and individually enter the operant chamber and remained there only for the duration of the experimental session. Self-motivated rats acquired the complex touch screen task of trial-unique non-matching to location (TUNL) in half the time reported for animals that were manually placed into the operant chamber. Rat performance was similar between the two groups within our laboratory, and comparable to previously published results obtained elsewhere. This reproducibility, both within and between laboratories, confirms the validity of this approach. In addition, automation reduced daily experimental time by 80%, eliminated animal handling, and reduced equipment cost. This automated, experimenter-free setup is a promising tool with great potential for testing a large variety of functions with full automation in future studies. PMID:28060883

  3. Archaeal “Dark Matter” and the Origin of Eukaryotes

    PubMed Central

    Williams, Tom A.; Embley, T. Martin

    2014-01-01

    Current hypotheses about the history of cellular life are mainly based on analyses of cultivated organisms, but these represent only a small fraction of extant biodiversity. The sequencing of new environmental lineages therefore provides an opportunity to test, revise, or reject existing ideas about the tree of life and the origin of eukaryotes. According to the textbook three domains hypothesis, the eukaryotes emerge as the sister group to a monophyletic Archaea. However, recent analyses incorporating better phylogenetic models and an improved sampling of the archaeal domain have generally supported the competing eocyte hypothesis, in which core genes of eukaryotic cells originated from within the Archaea, with important implications for eukaryogenesis. Given this trend, it was surprising that a recent analysis incorporating new genomes from uncultivated Archaea recovered a strongly supported three domains tree. Here, we show that this result was due in part to the use of a poorly fitting phylogenetic model and also to the inclusion by an automated pipeline of genes of putative bacterial origin rather than nucleocytosolic versions for some of the eukaryotes analyzed. When these issues were resolved, analyses including the new archaeal lineages placed core eukaryotic genes within the Archaea. These results are consistent with a number of recent studies in which improved archaeal sampling and better phylogenetic models agree in supporting the eocyte tree over the three domains hypothesis. PMID:24532674

  4. Archaeal "dark matter" and the origin of eukaryotes.

    PubMed

    Williams, Tom A; Embley, T Martin

    2014-03-01

    Current hypotheses about the history of cellular life are mainly based on analyses of cultivated organisms, but these represent only a small fraction of extant biodiversity. The sequencing of new environmental lineages therefore provides an opportunity to test, revise, or reject existing ideas about the tree of life and the origin of eukaryotes. According to the textbook three domains hypothesis, the eukaryotes emerge as the sister group to a monophyletic Archaea. However, recent analyses incorporating better phylogenetic models and an improved sampling of the archaeal domain have generally supported the competing eocyte hypothesis, in which core genes of eukaryotic cells originated from within the Archaea, with important implications for eukaryogenesis. Given this trend, it was surprising that a recent analysis incorporating new genomes from uncultivated Archaea recovered a strongly supported three domains tree. Here, we show that this result was due in part to the use of a poorly fitting phylogenetic model and also to the inclusion by an automated pipeline of genes of putative bacterial origin rather than nucleocytosolic versions for some of the eukaryotes analyzed. When these issues were resolved, analyses including the new archaeal lineages placed core eukaryotic genes within the Archaea. These results are consistent with a number of recent studies in which improved archaeal sampling and better phylogenetic models agree in supporting the eocyte tree over the three domains hypothesis.

  5. Universal Verification Methodology Based Register Test Automation Flow.

    PubMed

    Woo, Jae Hun; Cho, Yong Kwan; Park, Sun Kyu

    2016-05-01

    In today's SoC design, the number of registers has increased along with the complexity of hardware blocks. Register validation is a time-consuming and error-prone task; therefore, we need an efficient way to perform verification with less effort in a shorter time. In this work, we suggest a register test automation flow based on UVM (Universal Verification Methodology). UVM provides a standard methodology, called a register model, to facilitate stimulus generation and functional checking of registers. However, it is not easy for designers to create register models for their functional blocks or to integrate models into the test-bench environment, because doing so requires knowledge of SystemVerilog and the UVM libraries. For the creation of register models, many commercial tools support register model generation from a register specification described in IP-XACT, but it is time-consuming to describe register specifications in the IP-XACT format. For easy creation of register models, we propose a spreadsheet-based register template which is translated to an IP-XACT description, from which register models can be easily generated using commercial tools. We also automate all the steps involved in integrating the test-bench and generating test-cases, so that designers may use the register model without detailed knowledge of UVM or SystemVerilog. This automation flow involves generating and connecting test-bench components (e.g., driver, checker, bus adaptor, etc.) and writing a test sequence for each type of register test-case. With the proposed flow, designers can save a considerable amount of time in verifying the functionality of registers.
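The spreadsheet-to-IP-XACT translation step described above can be sketched roughly. The column names and the minimal XML shape below are illustrative assumptions, not the authors' actual template or the full IP-XACT schema (namespaces and memory-map wrappers are omitted).

```python
import csv
import io
from xml.etree.ElementTree import Element, SubElement, tostring

# Hypothetical spreadsheet export: one row per register.
SHEET = """name,offset,width,access
CTRL,0x00,32,read-write
STATUS,0x04,32,read-only
"""

def rows_to_ipxact(csv_text):
    """Translate register rows into a minimal IP-XACT-like
    <register> list under an <addressBlock> element."""
    root = Element("addressBlock")
    for row in csv.DictReader(io.StringIO(csv_text)):
        reg = SubElement(root, "register")
        SubElement(reg, "name").text = row["name"]
        SubElement(reg, "addressOffset").text = row["offset"]
        SubElement(reg, "size").text = row["width"]
        SubElement(reg, "access").text = row["access"]
    return tostring(root, encoding="unicode")

xml = rows_to_ipxact(SHEET)
```

In the flow described by the record, output like this would then be fed to a commercial tool that generates the UVM register model.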

  6. A human performance modelling approach to intelligent decision support systems

    NASA Technical Reports Server (NTRS)

    Mccoy, Michael S.; Boys, Randy M.

    1987-01-01

    Manned space operations require that the many automated subsystems of a space platform be controllable by a limited number of personnel. To minimize the interaction required of these operators, artificial intelligence techniques may be applied to embed a human performance model within the automated, or semi-automated, systems, thereby allowing the derivation of operator intent. A similar application has previously been proposed in the domain of fighter piloting, where the demand for pilot intent derivation is primarily a function of limited time and high workload rather than limited operators. The derivation and propagation of pilot intent is presented as it might be applied to some programs.

  7. A Computational Workflow for the Automated Generation of Models of Genetic Designs.

    PubMed

    Misirli, Göksel; Nguyen, Tramy; McLaughlin, James Alastair; Vaidyanathan, Prashant; Jones, Timothy S; Densmore, Douglas; Myers, Chris; Wipat, Anil

    2018-06-05

    Computational models are essential to engineer predictable biological systems and to scale up this process for complex systems. Computational modeling often requires expert knowledge and data to build models. Clearly, manual creation of models is not scalable for large designs. Despite several automated model construction approaches, computational methodologies to bridge knowledge in design repositories and the process of creating computational models have still not been established. This paper describes a workflow for the automatic generation of computational models of genetic circuits from data stored in design repositories using existing standards. This workflow leverages the software tool SBOLDesigner to build structural models that are then enriched by the Virtual Parts Repository API using Synthetic Biology Open Language (SBOL) data fetched from the SynBioHub design repository. The iBioSim software tool is then utilized to convert this SBOL description into a computational model encoded using the Systems Biology Markup Language (SBML). Finally, this SBML model can be simulated using a variety of methods. This workflow provides synthetic biologists with easy-to-use tools to create predictable biological systems, hiding away the complexity of building computational models. This approach can further be incorporated into other computational workflows for design automation.

  8. A top-level ontology of functions and its application in the Open Biomedical Ontologies.

    PubMed

    Burek, Patryk; Hoehndorf, Robert; Loebe, Frank; Visagie, Johann; Herre, Heinrich; Kelso, Janet

    2006-07-15

    A clear understanding of functions in biology is a key component in accurate modelling of molecular, cellular and organismal biology. Using the existing biomedical ontologies it has been impossible to capture the complexity of the community's knowledge about biological functions. We present here a top-level ontological framework for representing knowledge about biological functions. This framework lends greater accuracy, power and expressiveness to biomedical ontologies by providing a means to capture existing functional knowledge in a more formal manner. An initial major application of the ontology of functions is the provision of a principled way in which to curate functional knowledge and annotations in biomedical ontologies. Further potential applications include the facilitation of ontology interoperability and automated reasoning. A major advantage of the proposed implementation is that it is an extension to existing biomedical ontologies, and can be applied without substantial changes to these domain ontologies. The Ontology of Functions (OF) can be downloaded in OWL format from http://onto.eva.mpg.de/. Additionally, a UML profile and supplementary information and guides for using the OF can be accessed from the same website.

  9. Automated analysis of siRNA screens of cells infected by hepatitis C and dengue viruses based on immunofluorescence microscopy images

    NASA Astrophysics Data System (ADS)

    Matula, Petr; Kumar, Anil; Wörz, Ilka; Harder, Nathalie; Erfle, Holger; Bartenschlager, Ralf; Eils, Roland; Rohr, Karl

    2008-03-01

    We present an image analysis approach as part of a high-throughput microscopy siRNA-based screening system using cell arrays for the identification of cellular genes involved in hepatitis C and dengue virus replication. Our approach comprises: cell nucleus segmentation, quantification of virus replication level in the neighborhood of segmented cell nuclei, localization of regions with transfected cells, cell classification by infection status, and quality assessment of an experiment and single images. In particular, we propose a novel approach for the localization of regions of transfected cells within cell array images, which combines model-based circle fitting and grid fitting. By this scheme we integrate information from single cell array images and knowledge from the complete cell arrays. The approach is fully automatic and has been successfully applied to a large number of cell array images from screening experiments. The experimental results show a good agreement with the expected behaviour of positive as well as negative controls and encourage the application to screens from further high-throughput experiments.
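The model-based circle fitting used to localize transfected-cell regions can be illustrated with a standard algebraic (Kåsa) least-squares fit. This is a generic implementation of that well-known technique, not the authors' pipeline, and the sample points are synthetic.

```python
import numpy as np

def fit_circle(xs, ys):
    """Algebraic (Kasa) least-squares circle fit.
    Solves x^2 + y^2 + a*x + b*y + c = 0 in the least-squares sense,
    then recovers the center (cx, cy) and radius r."""
    xs, ys = np.asarray(xs, float), np.asarray(ys, float)
    A = np.column_stack([xs, ys, np.ones_like(xs)])
    rhs = -(xs ** 2 + ys ** 2)
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    cx, cy = -a / 2.0, -b / 2.0
    r = np.sqrt(cx ** 2 + cy ** 2 - c)
    return cx, cy, r

# Slightly noisy points on a circle of radius 5 centered at (2, 3),
# standing in for the boundary of one transfected-cell spot.
t = np.linspace(0, 2 * np.pi, 50, endpoint=False)
xs = 2 + 5 * np.cos(t) + 0.01 * np.sin(7 * t)
ys = 3 + 5 * np.sin(t) + 0.01 * np.cos(7 * t)
cx, cy, r = fit_circle(xs, ys)
```

In the paper's scheme, fitted circle centers from single images would then be reconciled against a global grid fit over the whole cell array.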

  10. Integrating the Allen Brain Institute Cell Types Database into Automated Neuroscience Workflow.

    PubMed

    Stockton, David B; Santamaria, Fidel

    2017-10-01

    We developed software tools to download, extract features, and organize the Cell Types Database from the Allen Brain Institute (ABI) in order to integrate its whole cell patch clamp characterization data into the automated modeling/data analysis cycle. To expand the potential user base we employed both Python and MATLAB. The basic set of tools downloads selected raw data and extracts cell, sweep, and spike features, using ABI's feature extraction code. To facilitate data manipulation we added a tool to build a local specialized database of raw data plus extracted features. Finally, to maximize automation, we extended our NeuroManager workflow automation suite to include these tools plus a separate investigation database. The extended suite allows the user to integrate ABI experimental and modeling data into an automated workflow deployed on heterogeneous computer infrastructures, from local servers, to high performance computing environments, to the cloud. Since our approach is focused on workflow procedures our tools can be modified to interact with the increasing number of neuroscience databases being developed to cover all scales and properties of the nervous system.
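
    The "local specialized database of raw data plus extracted features" could, in its simplest form, look like the stdlib-only sketch below. The schema, column names, and feature choices are hypothetical, not those of the actual NeuroManager tools or the ABI feature extraction code.

```python
import sqlite3

def build_feature_db(path=":memory:"):
    """Create an illustrative per-sweep feature table keyed by cell and sweep id."""
    con = sqlite3.connect(path)
    con.execute("""CREATE TABLE IF NOT EXISTS sweep_features (
        cell_id INTEGER, sweep_id INTEGER,
        raw_path TEXT, spike_count INTEGER, mean_rate_hz REAL,
        PRIMARY KEY (cell_id, sweep_id))""")
    return con

def add_sweep(con, cell_id, sweep_id, raw_path, spike_times, duration_s):
    """Store a pointer to the raw file alongside two toy extracted features."""
    con.execute("INSERT OR REPLACE INTO sweep_features VALUES (?,?,?,?,?)",
                (cell_id, sweep_id, raw_path,
                 len(spike_times), len(spike_times) / duration_s))

con = build_feature_db()
add_sweep(con, cell_id=101, sweep_id=3, raw_path="data/101_3.nwb",
          spike_times=[0.1, 0.4, 0.9], duration_s=2.0)
row = con.execute("SELECT spike_count, mean_rate_hz FROM sweep_features "
                  "WHERE cell_id=101 AND sweep_id=3").fetchone()
```

    A workflow engine can then query such a table to select sweeps for model fitting without re-reading the raw recordings.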

  11. An entirely automated method to score DSS-induced colitis in mice by digital image analysis of pathology slides

    PubMed Central

    Kozlowski, Cleopatra; Jeet, Surinder; Beyer, Joseph; Guerrero, Steve; Lesch, Justin; Wang, Xiaoting; DeVoss, Jason; Diehl, Lauri

    2013-01-01

    The DSS (dextran sulfate sodium) model of colitis is a mouse model of inflammatory bowel disease. Microscopic symptoms include loss of crypt cells from the gut lining and infiltration of inflammatory cells into the colon. An experienced pathologist requires several hours per study to score histological changes in selected regions of the mouse gut. In order to increase the efficiency of scoring, Definiens Developer software was used to devise an entirely automated method to quantify histological changes in the whole H&E slide. When the algorithm was applied to slides from historical drug-discovery studies, automated scores classified 88% of drug candidates in the same way as pathologists' scores. In addition, another automated image analysis method was developed to quantify colon-infiltrating macrophages, neutrophils, B cells and T cells in immunohistochemical stains of serial sections of the H&E slides. The timing of neutrophil and macrophage infiltration had the highest correlation to pathological changes, whereas T and B cell infiltration occurred later. Thus, automated image analysis enables quantitative comparisons between tissue morphology changes and cell-infiltration dynamics. PMID:23580198

  12. Selected contribution: a three-dimensional model for assessment of in vitro toxicity in Balaena mysticetus renal tissue

    NASA Technical Reports Server (NTRS)

    Goodwin, T. J.; Coate-Li, L.; Linnehan, R. M.; Hammond, T. G.

    2000-01-01

    This study established two- and three-dimensional renal proximal tubular cell cultures of the endangered species bowhead whale (Balaena mysticetus), developed SV40-transfected cultures, and cloned the 61-amino acid open reading frame for the metallothionein protein, the primary binding site for heavy metal contamination in mammals. Microgravity research, modulations in mechanical culture conditions (modeled microgravity), and shear stress have spawned innovative approaches to understanding the dynamics of cellular interactions, gene expression, and differentiation in several cellular systems. These investigations have led to the creation of ex vivo tissue models capable of serving as physiological research analogs for three-dimensional cellular interactions. These models are enabling studies in immune function, tissue modeling for basic research, and neoplasia. Three-dimensional cellular models emulate aspects of in vivo cellular architecture and physiology and may facilitate environmental toxicological studies aimed at elucidating biological functions and responses at the cellular level. Marine mammals occupy a significant ecological niche (72% of the Earth's surface is water) in terms of the potential for information on bioaccumulation and transport of terrestrial and marine environmental toxins in high-order vertebrates. Few ex vivo models of marine mammal physiology exist in vitro to accomplish the aforementioned studies. Techniques developed in this investigation, based on previous tissue modeling successes, may serve to facilitate similar research in other marine mammals.

  13. Geometric confinement influences cellular mechanical properties I -- adhesion area dependence.

    PubMed

    Su, Judith; Jiang, Xingyu; Welsch, Roy; Whitesides, George M; So, Peter T C

    2007-06-01

    Interactions between the cell and the extracellular matrix regulate a variety of cellular properties and functions, including cellular rheology. In the present study, cellular adhesion area was controlled by confining NIH 3T3 fibroblast cells to circular micropatterned islands of defined size. The shear moduli of cells adhering to islands of well-defined geometry, as measured by magnetic microrheometry, were found to have a significantly lower variance than those of cells allowed to spread on unpatterned surfaces. We observe that the area of cellular adhesion influences shear modulus. Rheological measurements further indicate that cellular shear modulus is a biphasic function of cellular adhesion area, with stiffness decreasing to a minimum value for intermediate areas of adhesion and then increasing for cells on larger patterns. We propose a simple hypothesis: that the area of adhesion affects cellular rheological properties by regulating the structure of the actin cytoskeleton. To test this hypothesis, we quantified the volume fraction of polymerized actin in the cytosol by staining with fluorescent phalloidin and imaging using quantitative 3D microscopy. The polymerized actin volume fraction exhibited a similar biphasic dependence on adhesion area. Within the limits of our simplifying hypothesis, our experimental results permit an evaluation of the ability of established micromechanical models to predict the cellular shear modulus based on polymerized actin volume fraction. We investigated the "tensegrity", "cellular-solids", and "biopolymer physics" models, which have, respectively, a linear, quadratic, and 5/2 dependence on polymerized actin volume fraction. All three models predict that a biphasic trend in polymerized actin volume fraction as a function of adhesion area will result in a biphasic behavior in shear modulus. Our data favor a higher-order dependence on polymerized actin volume fraction: increasingly better experimental agreement is observed for the tensegrity, cellular-solids, and biopolymer models, respectively. Alternatively, if we postulate the existence of a critical actin volume fraction below which the shear modulus vanishes, the experimental data can be equivalently described by a model with an almost linear dependence on polymerized actin volume fraction; this observation supports a tensegrity model with a critical actin volume fraction.
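
    The three scaling laws named in the abstract can be made concrete in a few lines. This sketch (illustrative numbers only) shows why any power-law model G ~ phi^n maps a biphasic actin volume fraction phi(area) to a biphasic shear modulus G(area): the exponent changes how sharply G follows phi, not the location of the minimum.

```python
def predict_modulus(phi, n, g0=1.0):
    """Shear modulus under a power-law model: G = g0 * phi**n."""
    return g0 * phi ** n

# toy biphasic profile of actin volume fraction versus adhesion area
phi_by_area = [0.30, 0.20, 0.15, 0.22, 0.35]

# exponents: 1 (tensegrity), 2 (cellular solids), 5/2 (biopolymer physics)
g_tensegrity = [predict_modulus(p, 1.0) for p in phi_by_area]
g_cellular_solids = [predict_modulus(p, 2.0) for p in phi_by_area]
g_biopolymer = [predict_modulus(p, 2.5) for p in phi_by_area]

def argmin(xs):
    """Index of the smallest element."""
    return xs.index(min(xs))
```

    All three predicted modulus curves reach their minimum at the same (intermediate) adhesion area as phi itself, which is exactly the qualitative behavior the abstract reports.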

  14. Rapid SAW Sensor Development Tools

    NASA Technical Reports Server (NTRS)

    Wilson, William C.; Atkinson, Gary M.

    2007-01-01

    The lack of integrated design tools for Surface Acoustic Wave (SAW) devices has led us to develop tools for the design, modeling, analysis, and automatic layout generation of SAW devices. These tools enable rapid development of wireless SAW sensors. The tools developed have been designed to integrate into existing Electronic Design Automation (EDA) tools to take advantage of existing 3D modeling, and Finite Element Analysis (FEA). This paper presents the SAW design, modeling, analysis, and automated layout generation tools.

  15. Efficient Parallel Levenberg-Marquardt Model Fitting towards Real-Time Automated Parametric Imaging Microscopy

    PubMed Central

    Zhu, Xiang; Zhang, Dianwen

    2013-01-01

    We present a fast, accurate and robust parallel Levenberg-Marquardt minimization optimizer, GPU-LMFit, which is implemented on graphics processing unit for high performance scalable parallel model fitting processing. GPU-LMFit can provide a dramatic speed-up in massive model fitting analyses to enable real-time automated pixel-wise parametric imaging microscopy. We demonstrate the performance of GPU-LMFit for the applications in superresolution localization microscopy and fluorescence lifetime imaging microscopy. PMID:24130785
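
    GPU-LMFit itself is not shown in the abstract; as a hedged, CPU-only sketch of the underlying algorithm, the code below runs one Levenberg-Marquardt fit of an illustrative single-exponential model (a parametric-imaging version would launch one such fit per pixel). The model, data, and damping constants are assumptions for the example, not the paper's implementation.

```python
import math

def lm_fit_exp(xs, ys, a=1.0, b=1.0, lam=1e-3, iters=60):
    """Fit y = a*exp(-b*x) by Levenberg-Marquardt with scalar damping lam."""
    def cost(a, b):
        return sum((y - a * math.exp(-b * x)) ** 2 for x, y in zip(xs, ys))
    c = cost(a, b)
    for _ in range(iters):
        # residuals r_i = y_i - f_i and Jacobian of r w.r.t. (a, b)
        r = [y - a * math.exp(-b * x) for x, y in zip(xs, ys)]
        J = [(-math.exp(-b * x), a * x * math.exp(-b * x)) for x in xs]
        # damped normal equations (J^T J + lam*I) d = -J^T r, solved 2x2
        g11 = sum(j0 * j0 for j0, _ in J) + lam
        g12 = sum(j0 * j1 for j0, j1 in J)
        g22 = sum(j1 * j1 for _, j1 in J) + lam
        r1 = -sum(j0 * ri for (j0, _), ri in zip(J, r))
        r2 = -sum(j1 * ri for (_, j1), ri in zip(J, r))
        det = g11 * g22 - g12 * g12
        da = (r1 * g22 - r2 * g12) / det
        db = (g11 * r2 - g12 * r1) / det
        c2 = cost(a + da, b + db)
        if c2 < c:           # accept the step and relax the damping
            a, b, c, lam = a + da, b + db, c2, lam * 0.5
        else:                # reject the step and increase the damping
            lam *= 10.0
    return a, b

# illustrative noise-free synthetic data: y = 2.0 * exp(-0.7 * x)
xs = [0.3 * i for i in range(10)]
ys = [2.0 * math.exp(-0.7 * x) for x in xs]
a_fit, b_fit = lm_fit_exp(xs, ys)
```

    The accept/reject damping update is the standard LM heuristic; a GPU version mainly changes where the per-fit state lives, not this inner loop.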

  16. Development of automation and robotics for space via computer graphic simulation methods

    NASA Technical Reports Server (NTRS)

    Fernandez, Ken

    1988-01-01

    A robot simulation system has been developed to perform automation and robotics system design studies. The system uses a procedure-oriented solid modeling language to produce a model of the robotic mechanism. The simulator generates the kinematics, inverse kinematics, dynamics, control, and real-time graphic simulations needed to evaluate the performance of the model. Simulation examples are presented, including simulation of the Space Station and the design of telerobotics for the Orbital Maneuvering Vehicle.
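
    As a minimal example of the kinematics/inverse-kinematics pair such a simulator must generate, consider a planar two-link arm; the link lengths and target point below are arbitrary illustrative values, not from the report.

```python
import math

def forward(l1, l2, t1, t2):
    """End-effector position of a planar two-link arm with joint angles t1, t2."""
    x = l1 * math.cos(t1) + l2 * math.cos(t1 + t2)
    y = l1 * math.sin(t1) + l2 * math.sin(t1 + t2)
    return x, y

def inverse(l1, l2, x, y):
    """Elbow-down inverse kinematics from the law of cosines."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    t2 = math.acos(max(-1.0, min(1.0, c2)))  # clamp against rounding
    t1 = math.atan2(y, x) - math.atan2(l2 * math.sin(t2), l1 + l2 * math.cos(t2))
    return t1, t2

# round trip: solve for joint angles, then confirm the pose they produce
l1 = l2 = 1.0
t1, t2 = inverse(l1, l2, 1.2, 0.5)
x, y = forward(l1, l2, t1, t2)
```

    A full simulator layers dynamics and control on top of exactly this kind of kinematic core.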

  17. RCrane: semi-automated RNA model building.

    PubMed

    Keating, Kevin S; Pyle, Anna Marie

    2012-08-01

    RNA crystals typically diffract to much lower resolutions than protein crystals. This low-resolution diffraction results in unclear density maps, which cause considerable difficulties during the model-building process. These difficulties are exacerbated by the lack of computational tools for RNA modeling. Here, RCrane, a tool for the partially automated building of RNA into electron-density maps of low or intermediate resolution, is presented. This tool works within Coot, a common program for macromolecular model building. RCrane helps crystallographers to place phosphates and bases into electron density and then automatically predicts and builds the detailed all-atom structure of the traced nucleotides. RCrane then allows the crystallographer to review the newly built structure and select alternative backbone conformations where desired. This tool can also be used to automatically correct the backbone structure of previously built nucleotides. These automated corrections can fix incorrect sugar puckers, steric clashes and other structural problems.

  18. Use of noncrystallographic symmetry for automated model building at medium to low resolution.

    PubMed

    Wiegels, Tim; Lamzin, Victor S

    2012-04-01

    A novel method is presented for the automatic detection of noncrystallographic symmetry (NCS) in macromolecular crystal structure determination which does not require the derivation of molecular masks or the segmentation of density. It was found that throughout structure determination the NCS-related parts may be differently pronounced in the electron density. This often results in the modelling of molecular fragments of variable length and accuracy, especially during automated model-building procedures. These fragments were used to identify NCS relations in order to aid automated model building and refinement. In a number of test cases higher completeness and greater accuracy of the obtained structures were achieved, specifically at a crystallographic resolution of 2.3 Å or poorer. In the best case, the method allowed the building of up to 15% more residues automatically and a tripling of the average length of the built fragments.

  19. Input-output identification of controlled discrete manufacturing systems

    NASA Astrophysics Data System (ADS)

    Estrada-Vargas, Ana Paula; López-Mellado, Ernesto; Lesage, Jean-Jacques

    2014-03-01

    The automated construction of discrete event models from observations of a system's external behaviour is addressed. This problem, often referred to as system identification, allows obtaining models of ill-known (or even unknown) systems. In this article, an identification method for discrete event systems (DESs) controlled by a programmable logic controller is presented. The method processes large quantities of observed long sequences of input/output signals generated by the controller and yields an interpreted Petri net model describing the closed-loop behaviour of the automated DES. The proposed technique allows the identification of actual complex systems because it is sufficiently efficient and well adapted to cope with both the technological characteristics of industrial controllers and data collection requirements. Based on polynomial-time algorithms, the method is implemented as an efficient software tool which constructs and draws the model automatically; an overview of this tool is given through a case study dealing with an automated manufacturing system.
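
    The interpreted Petri net synthesis itself is beyond an abstract-sized sketch, but a first step common to identification methods of this kind, folding observed I/O vector sequences into a set of states and transitions, can be illustrated with hypothetical two-bit I/O vectors (not the paper's algorithm or data).

```python
def identify(sequences):
    """Fold observed I/O vector sequences into states and observed transitions.

    Each distinct I/O vector is treated as a state; a transition is recorded
    whenever the observed vector changes between consecutive samples.
    """
    states, transitions = set(), set()
    for seq in sequences:
        prev = None
        for io in seq:
            states.add(io)
            if prev is not None and prev != io:
                transitions.add((prev, io))
            prev = io
    return states, transitions

# two observed runs of a hypothetical controller with 2-bit I/O vectors
observed = [("00", "10", "11"), ("00", "10", "01")]
states, transitions = identify(observed)
```

    A Petri-net-based method would then look for concurrency and causality in this transition structure rather than keeping it as a flat automaton.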

  20. Benefits Estimation Model for Automated Vehicle Operations: Phase 2 Final Report

    DOT National Transportation Integrated Search

    2018-01-01

    Automated vehicles have the potential to bring about transformative safety, mobility, energy, and environmental benefits to the surface transportation system. They are also being introduced into a complex transportation system, where second-order imp...

  1. Calcium Sensor, NCS-1, Promotes Tumor Aggressiveness and Predicts Patient Survival.

    PubMed

    Moore, Lauren M; England, Allison; Ehrlich, Barbara E; Rimm, David L

    2017-07-01

    Neuronal Calcium Sensor 1 (NCS-1) is a multi-functional Ca2+-binding protein that affects a range of cellular processes beyond those related to neurons. Functional characterization of NCS-1 in neuronal model systems suggests that NCS-1 may influence oncogenic processes. To this end, the biological role of NCS-1 was investigated by altering its endogenous expression in MCF-7 and MB-231 breast cancer cells. Overexpression of NCS-1 resulted in a more aggressive tumor phenotype, demonstrated by a marked increase in invasion and motility and a decrease in cell-matrix adhesion to collagen IV. Overexpression of NCS-1 was also shown to increase the efficacy of paclitaxel-induced cell death in a manner that was independent of cellular proliferation. To determine the association between NCS-1 and clinical outcome, NCS-1 expression was measured in two independent breast cancer cohorts by the Automated Quantitative Analysis method of quantitative immunofluorescence. Elevated levels of NCS-1 were significantly correlated with shorter survival rates. Furthermore, multivariate analysis demonstrated that NCS-1 status was prognostic, independent of estrogen receptor, progesterone receptor, HER2, and lymph node status. These findings indicate that NCS-1 plays a role in the aggressive behavior of a subset of breast cancers and has therapeutic or biomarker potential. Implications: NCS-1, a calcium-binding protein, is associated with clinicopathologic features of aggressiveness in breast cancer cells and worse outcome in two breast cancer patient cohorts. Mol Cancer Res; 15(7); 942-52. ©2017 American Association for Cancer Research.

  2. Using Bayesian Networks and Decision Theory to Model Physical Security

    DTIC Science & Technology

    2003-02-01

    Home automation technologies allow a person to monitor and control various activities within a home or office setting. Cameras, sensors and other... components used along with the simple rules in the home automation software provide an environment where the lights, security and other appliances can be... monitored and controlled. These home automation technologies, however, lack the power to reason under uncertain conditions and thus the system can

  3. Operator modeling in commercial aviation: Cognitive models, intelligent displays, and pilot's assistants

    NASA Technical Reports Server (NTRS)

    Govindaraj, T.; Mitchell, C. M.

    1994-01-01

    One of the goals of the National Aviation Safety/Automation program is to address the issue of human-centered automation in the cockpit. Human-centered automation is automation that, in the cockpit, enhances or assists the crew rather than replacing them. The Georgia Tech research program focused on this general theme, with emphasis on designing a computer-based pilot's assistant, intelligent (i.e., context-sensitive) displays, and an intelligent tutoring system for understanding and operating the autoflight system. In particular, the aids and displays were designed to enhance the crew's situational awareness of the current state of the automated flight systems and to assist the crew in coordinating the autoflight system resources. The activities of this grant included: (1) an OFMspert to understand pilot navigation activities in a 727-class aircraft; (2) an extension of OFMspert to understand mode control in a glass cockpit, the Georgia Tech Crew Activity Tracking System (GT-CATS); (3) the design of a training system, the VNAV Tutor, to teach pilots about the vertical navigation portion of the flight management system; and (4) a proof-of-concept display, using existing display technology, to facilitate mode awareness, particularly in situations in which controlled flight into terrain (CFIT) is a possibility.

  4. SamuROI, a Python-Based Software Tool for Visualization and Analysis of Dynamic Time Series Imaging at Multiple Spatial Scales.

    PubMed

    Rueckl, Martin; Lenzi, Stephen C; Moreno-Velasquez, Laura; Parthier, Daniel; Schmitz, Dietmar; Ruediger, Sten; Johenning, Friedrich W

    2017-01-01

    The measurement of activity in vivo and in vitro has shifted from electrical to optical methods. While the indicators for imaging activity have improved significantly over the last decade, tools for analysing optical data have not kept pace. Most available analysis tools are limited in their flexibility and applicability to datasets obtained at different spatial scales. Here, we present SamuROI (Structured analysis of multiple user-defined ROIs), an open source Python-based analysis environment for imaging data. SamuROI simplifies exploratory analysis and visualization of image series of fluorescence changes in complex structures over time and is readily applicable at different spatial scales. In this paper, we show the utility of SamuROI in Ca2+-imaging based applications at three spatial scales: the micro-scale (i.e., sub-cellular compartments including cell bodies, dendrites and spines); the meso-scale (i.e., whole cell and population imaging with single-cell resolution); and the macro-scale (i.e., imaging of changes in bulk fluorescence in large brain areas, without cellular resolution). The software described here provides a graphical user interface for intuitive data exploration and region of interest (ROI) management that can be used interactively within Jupyter Notebook: a publicly available interactive Python platform that allows simple integration of our software with existing tools for automated ROI generation and post-processing, as well as custom analysis pipelines. SamuROI software, source code and installation instructions are publicly available on GitHub and documentation is available online. SamuROI reduces the energy barrier for manual exploration and semi-automated analysis of spatially complex Ca2+ imaging datasets, particularly when these have been acquired at different spatial scales.

  6. Gait analysis in a mouse model resembling Leigh disease.

    PubMed

    de Haas, Ria; Russel, Frans G; Smeitink, Jan A

    2016-01-01

    Leigh disease (LD), also known as sub-acute necrotizing encephalomyelopathy, is one of the clinical phenotypes of mitochondrial OXPHOS disorders. The disease has an incidence of 1 in 77,000 live births. Symptoms typically begin early in life and prognosis for LD patients is poor. Currently, no clinically effective treatments are available. Suitable animal and cellular models are necessary for the understanding of the neuropathology and the development of successful new therapeutic strategies. In this study we used the Ndufs4 knockout (Ndufs4(-/-)) mouse, a model of mitochondrial complex I deficiency. Ndufs4(-/-) mice exhibit progressive neurodegeneration that closely resembles the human LD phenotype. When dissecting behavioral abnormalities in animal models, it is of great importance to apply translational tools that are clinically relevant. To distinguish gait abnormalities in patients, simple walking tests can be assessed, but in animals this is not easy. This study is the first to demonstrate automated CatWalk gait analysis in the Ndufs4(-/-) mouse model. Marked differences were noted between Ndufs4(-/-) and control mice in dynamic, static, coordination and support parameters. Variation of walking speed was significantly increased in Ndufs4(-/-) mice, suggesting hampered and uncoordinated gait. Furthermore, decreased regularity index, increased base of support and changes in support were noted in the Ndufs4(-/-) mice. Here, we report the ability of the CatWalk system to sensitively assess gait abnormalities in Ndufs4(-/-) mice. This objective gait analysis can be of great value for intervention and drug efficacy studies in animal models for mitochondrial disease. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. Consistent prediction of GO protein localization.

    PubMed

    Spetale, Flavio E; Arce, Debora; Krsticevic, Flavia; Bulacio, Pilar; Tapia, Elizabeth

    2018-05-17

    The GO-Cellular Component (GO-CC) ontology provides a controlled vocabulary for the consistent description of the subcellular compartments or macromolecular complexes where proteins may act. Current machine learning-based methods used for the automated GO-CC annotation of proteins suffer from the inconsistency of individual GO-CC term predictions. Here, we present FGGA-CC+, a class of hierarchical graph-based classifiers for the consistent GO-CC annotation of protein coding genes at the subcellular compartment or macromolecular complex levels. Aiming to boost the accuracy of GO-CC predictions, we make use of the protein localization knowledge in the GO-Biological Process (GO-BP) annotations. As a result, FGGA-CC+ classifiers are built from annotation data in both the GO-CC and GO-BP ontologies. Due to their graph-based design, FGGA-CC+ classifiers are fully interpretable and their predictions amenable to expert analysis. Promising results on protein annotation data from five model organisms were obtained. Additionally, successful validation results in the annotation of a challenging subset of tandem duplicated genes in the tomato non-model organism were accomplished. Overall, these results suggest that FGGA-CC+ classifiers can indeed be useful for satisfying the huge demand for GO-CC annotation arising from ubiquitous high-throughput sequencing and proteomic projects.

  8. An extended model of vesicle fusion at the plasma membrane to estimate protein lateral diffusion from TIRF microscopy images.

    PubMed

    Basset, Antoine; Bouthemy, Patrick; Boulanger, Jérôme; Waharte, François; Salamero, Jean; Kervrann, Charles

    2017-07-24

    Characterizing membrane dynamics is a key issue to understand cell exchanges with the extra-cellular medium. Total internal reflection fluorescence microscopy (TIRFM) is well suited to focus on the late steps of exocytosis at the plasma membrane. However, it is still a challenging task to quantify (lateral) diffusion and estimate local dynamics of proteins. A new model was introduced to represent the behavior of cargo transmembrane proteins during the vesicle fusion to the plasma membrane at the end of the exocytosis process. Two biophysical parameters, the diffusion coefficient and the release rate parameter, are automatically estimated from TIRFM image sequences, to account for both the lateral diffusion of molecules at the membrane and the continuous release of the proteins from the vesicle to the plasma membrane. Quantitative evaluation on 300 realistic computer-generated image sequences demonstrated the efficiency and accuracy of the method. The application of our method on 16 real TIRFM image sequences additionally revealed differences in the dynamic behavior of Transferrin Receptor (TfR) and Langerin proteins. An automated method has been designed to simultaneously estimate the diffusion coefficient and the release rate for each individual vesicle fusion event at the plasma membrane in TIRFM image sequences. It can be exploited for further deciphering cell membrane dynamics.
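
    The authors' estimator is not given in the abstract; a common, simplified way to extract a diffusion coefficient from such an image sequence is to track the Gaussian width of the fused spot over time and use sigma^2(t) = sigma0^2 + 2*D*t per axis, so D is half the least-squares slope. The sketch below assumes the widths have already been measured; all numbers are illustrative.

```python
def estimate_D(times, sigma2):
    """Half the least-squares slope of sigma^2 versus time (2D*t per axis)."""
    n = len(times)
    mt = sum(times) / n
    ms = sum(sigma2) / n
    slope = (sum((t - mt) * (s - ms) for t, s in zip(times, sigma2))
             / sum((t - mt) ** 2 for t in times))
    return slope / 2.0

# synthetic widths following sigma^2 = 0.5 + 2 * 0.3 * t exactly
times = [0.0, 1.0, 2.0, 3.0]
sigma2 = [0.5 + 0.6 * t for t in times]
D = estimate_D(times, sigma2)
```

    The paper's model additionally estimates a release-rate parameter jointly with D, which requires fitting the full spatiotemporal intensity model rather than the width alone.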

  9. Evaluating the Ability of Heart Rate and EEG to Control Alertness during Performance

    NASA Technical Reports Server (NTRS)

    Freeman, Fred

    2002-01-01

    The major focus of the present proposal was to examine psychophysiological indices that show promise for invoking different modes of automation in an adaptive automation system. With the increased use of automation in today's work environment, people's roles in the workplace are being redefined from that of active participant to one of passive monitor. Although the introduction of automated systems has a number of benefits, there are also a number of disadvantages regarding worker performance. Byrne and Parasuraman have argued for the use of psychophysiological measures in the development and the implementation of adaptive automation. While performance-based, model-based, and psychophysiologically based adaptive automation systems have been studied, the combined use of several psychophysiological measures has never been investigated. Such a combination provides the advantage of real-time evaluation of the state of the subject in two relevant dimensions and offers a more realistic approach to the implementation of adaptive automation compared to the use of either dimension by itself.

  10. InPRO: Automated Indoor Construction Progress Monitoring Using Unmanned Aerial Vehicles

    NASA Astrophysics Data System (ADS)

    Hamledari, Hesam

    In this research, an envisioned intelligent robotic solution for automated indoor data collection and inspection that employs a series of unmanned aerial vehicles (UAVs), entitled "InPRO", is presented. InPRO consists of four stages, namely: 1) automated path planning; 2) autonomous UAV-based indoor inspection; 3) automated computer vision-based assessment of progress; and 4) automated updating of 4D building information models (BIM). The work presented in this thesis addresses the third stage of InPRO. A series of computer vision-based methods that automate the assessment of construction progress using images captured at indoor sites are introduced. The proposed methods employ computer vision and machine learning techniques to detect the components of under-construction indoor partitions. In particular, framing (studs), insulation, electrical outlets, and different states of drywall sheets (installing, plastering, and painting) are automatically detected using digital images. High accuracy rates, real-time performance, and operation without a priori information are indicators of the methods' promising performance.

  11. Optimized and Automated design of Plasma Diagnostics for Additive Manufacture

    NASA Astrophysics Data System (ADS)

    Stuber, James; Quinley, Morgan; Melnik, Paul; Sieck, Paul; Smith, Trevor; Chun, Katherine; Woodruff, Simon

    2016-10-01

    Although diagnostic designs are mature, diagnostics are usually custom designed for each experiment. Much of the design can now be automated to reduce costs (engineering labor and capital cost). We present results from scripted physics modeling and parametric engineering design for common optical and mechanical components found in many plasma diagnostics, and outline the process for automated design optimization that employs scripts to communicate data from online forms through proprietary and open-source CAD and FE codes to provide a design that can be sent directly to a printer. As a demonstration of design automation, an optical beam dump, baffle and optical components are designed via an automated process and printed. Supported by DOE SBIR Grant DE-SC0011858.

  12. Liquid-based cytology and cell block immunocytochemistry in veterinary medicine: comparison with standard cytology for the evaluation of canine lymphoid samples.

    PubMed

    Fernandes, N C C A; Guerra, J M; Réssio, R A; Wasques, D G; Etlinger-Colonelli, D; Lorente, S; Nogueira, E; Dagli, M L Z

    2016-08-01

    Liquid-based Cytology (LBC) consists of immediate wet cell fixation with automated slide preparation. We applied LBC, cell block (CB) and immunocytochemistry to diagnose canine lymphoma and compare results with conventional cytology. Samples from enlarged lymph nodes of 18 dogs were collected and fixed in preservative solution for automated slide preparation (LBC), CB inclusion and immunophenotyping. Two CB techniques were tested: fixed sediment method (FSM) and agar method (AM). Anti-CD79a, anti-Pax5, anti-CD3 and anti-Ki67 were used in immunocytochemistry. LBC smears showed better nuclear and nucleolar definition, without cell superposition, but presented smaller cell size and worse cytoplasmic definition. FSM showed consistent cellular groups and were employed for immunocytochemistry, while AM CBs presented sparse groups of lymphocytes, with compromised analysis. Anti-Pax-5 allowed B-cell identification, both in reactive and neoplastic lymph nodes. Our preliminary report suggests that LBC and FSM together may be promising tools to improve lymphoma diagnosis through fine-needle aspiration. © 2015 John Wiley & Sons Ltd.

  13. Particle tracking in drug and gene delivery research: State-of-the-art applications and methods.

    PubMed

    Schuster, Benjamin S; Ensign, Laura M; Allan, Daniel B; Suk, Jung Soo; Hanes, Justin

    2015-08-30

    Particle tracking is a powerful microscopy technique to quantify the motion of individual particles at high spatial and temporal resolution in complex fluids and biological specimens. Particle tracking's applications and impact in drug and gene delivery research have greatly increased during the last decade. Thanks to advances in hardware and software, this technique is now more accessible than ever, and can be reliably automated to enable rapid processing of large data sets, thereby further enhancing the role that particle tracking will play in drug and gene delivery studies in the future. We begin this review by discussing particle tracking-based advances in characterizing extracellular and cellular barriers to therapeutic nanoparticles and in characterizing nanoparticle size and stability. To facilitate wider adoption of the technique, we then present a user-friendly review of state-of-the-art automated particle tracking algorithms and methods of analysis. We conclude by reviewing technological developments for next-generation particle tracking methods, and we survey future research directions in drug and gene delivery where particle tracking may be useful. Copyright © 2015 Elsevier B.V. All rights reserved.
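
    The core quantity extracted in most particle-tracking studies is the mean squared displacement (MSD) of a linked trajectory. The following is a minimal, illustrative sketch of the standard time-averaged MSD calculation, not the authors' software; dedicated packages automate the full detect-link-analyze pipeline.

```python
def msd(track, max_lag=None):
    """Time-averaged mean squared displacement of one 2-D trajectory.

    track: list of (x, y) positions sampled at uniform time intervals.
    Returns a list of MSD values for lags 1 .. max_lag.
    """
    n = len(track)
    if max_lag is None:
        max_lag = n - 1
    out = []
    for lag in range(1, max_lag + 1):
        # Average the squared displacement over every start index i.
        disp2 = [
            (track[i + lag][0] - track[i][0]) ** 2
            + (track[i + lag][1] - track[i][1]) ** 2
            for i in range(n - lag)
        ]
        out.append(sum(disp2) / len(disp2))
    return out
```

    For purely diffusive particles the MSD grows linearly with lag time, while sub-diffusive motion in a mucus or cytoplasmic barrier bends the curve downward; that distinction is what makes MSD analysis useful for characterizing barriers to nanoparticles.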

  14. An automated image analysis framework for segmentation and division plane detection of single live Staphylococcus aureus cells which can operate at millisecond sampling time scales using bespoke Slimfield microscopy

    NASA Astrophysics Data System (ADS)

    Wollman, Adam J. M.; Miller, Helen; Foster, Simon; Leake, Mark C.

    2016-10-01

    Staphylococcus aureus is an important pathogen, with antimicrobial-resistant strains such as methicillin-resistant S. aureus (MRSA). Here we report an image analysis framework for automated detection and image segmentation of cells in S. aureus cell clusters, and explicit identification of their cell division planes. We use a new combination of several existing analytical tools of image analysis to detect cellular and subcellular morphological features relevant to cell division from millisecond-time-scale sampled images of live pathogens, at single-molecule detection precision. We demonstrate this approach using a fluorescent reporter, GFP fused to the protein EzrA, which localises to the mid-cell plane during division and is involved in regulation of cell size and division. This image analysis framework presents a valuable platform from which to study candidate new antimicrobials that target the cell division machinery, but may also have more general application in detecting morphologically complex structures of fluorescently labelled proteins present in clusters of other types of cells.
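
    Two building blocks of any such framework are connected-component segmentation of a binarized image and orientation estimation from image moments (a division plane lies perpendicular to a cell's major axis). The sketch below is a deliberately simplified, dependency-free illustration of those two steps, not the authors' method, which combines several more sophisticated tools.

```python
from collections import deque
import math

def label_cells(img):
    """4-connected component labelling of a binary image (list of lists).
    Returns (label image, number of components)."""
    h, w = len(img), len(img[0])
    labels = [[0] * w for _ in range(h)]
    count = 0
    for y in range(h):
        for x in range(w):
            if img[y][x] and not labels[y][x]:
                count += 1
                q = deque([(y, x)])
                labels[y][x] = count
                while q:  # breadth-first flood fill of one cell
                    cy, cx = q.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = cy + dy, cx + dx
                        if 0 <= ny < h and 0 <= nx < w and img[ny][nx] and not labels[ny][nx]:
                            labels[ny][nx] = count
                            q.append((ny, nx))
    return labels, count

def major_axis_angle(labels, lab):
    """Orientation (radians) of a labelled cell's major axis from second
    central image moments; a candidate division plane is perpendicular to it."""
    pts = [(x, y) for y, row in enumerate(labels) for x, v in enumerate(row) if v == lab]
    n = len(pts)
    cx = sum(p[0] for p in pts) / n
    cy = sum(p[1] for p in pts) / n
    mu20 = sum((p[0] - cx) ** 2 for p in pts) / n
    mu02 = sum((p[1] - cy) ** 2 for p in pts) / n
    mu11 = sum((p[0] - cx) * (p[1] - cy) for p in pts) / n
    return 0.5 * math.atan2(2 * mu11, mu20 - mu02)
```

    Real Slimfield data would first require denoising and thresholding, and clusters of touching cocci need watershed-style splitting that this toy labelling does not attempt.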

  15. Real-Time Three-Dimensional Cell Segmentation in Large-Scale Microscopy Data of Developing Embryos.

    PubMed

    Stegmaier, Johannes; Amat, Fernando; Lemon, William C; McDole, Katie; Wan, Yinan; Teodoro, George; Mikut, Ralf; Keller, Philipp J

    2016-01-25

    We present the Real-time Accurate Cell-shape Extractor (RACE), a high-throughput image analysis framework for automated three-dimensional cell segmentation in large-scale images. RACE is 55-330 times faster and 2-5 times more accurate than state-of-the-art methods. We demonstrate the generality of RACE by extracting cell-shape information from entire Drosophila, zebrafish, and mouse embryos imaged with confocal and light-sheet microscopes. Using RACE, we automatically reconstructed cellular-resolution tissue anisotropy maps across developing Drosophila embryos and quantified differences in cell-shape dynamics in wild-type and mutant embryos. We furthermore integrated RACE with our framework for automated cell lineaging and performed joint segmentation and cell tracking in entire Drosophila embryos. RACE processed these terabyte-sized datasets on a single computer within 1.4 days. RACE is easy to use, as it requires adjustment of only three parameters, takes full advantage of state-of-the-art multi-core processors and graphics cards, and is available as open-source software for Windows, Linux, and Mac OS. Copyright © 2016 Elsevier Inc. All rights reserved.
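
    The tissue anisotropy maps mentioned above rest on a standard shape statistic: the eigenvalue ratio of the covariance matrix of a segmented cell's pixel coordinates. As an illustrative sketch only (a 2-D, closed-form version; RACE itself works on 3-D data and is far more elaborate):

```python
import math

def shape_anisotropy(points):
    """Anisotropy of a 2-D cell footprint: 1 - lambda_min/lambda_max of the
    coordinate covariance matrix (0 = perfectly round, -> 1 = elongated)."""
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    sxx = sum((p[0] - cx) ** 2 for p in points) / n
    syy = sum((p[1] - cy) ** 2 for p in points) / n
    sxy = sum((p[0] - cx) * (p[1] - cy) for p in points) / n
    # Closed-form eigenvalues of the symmetric 2x2 covariance matrix.
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    disc = math.sqrt(max(tr * tr / 4 - det, 0.0))
    lmax, lmin = tr / 2 + disc, tr / 2 - disc
    return 1.0 - lmin / lmax if lmax > 0 else 0.0
```

    Computing this per cell and plotting it over the embryo yields the kind of cellular-resolution anisotropy map used to compare wild-type and mutant cell-shape dynamics.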

  16. MOST-visualization: software for producing automated textbook-style maps of genome-scale metabolic networks.

    PubMed

    Kelley, James J; Maor, Shay; Kim, Min Kyung; Lane, Anatoliy; Lun, Desmond S

    2017-08-15

    Visualization of metabolites, reactions and pathways in genome-scale metabolic networks (GEMs) can assist in understanding cellular metabolism. Three attributes are desirable in software used for visualizing GEMs: (i) automation, since GEMs can be quite large; (ii) production of understandable maps that make pathways, reactions and metabolites easy to identify; and (iii) visualization of the entire network to show how pathways are interconnected. No existing software for visualizing GEMs satisfies all three characteristics, but MOST-Visualization, an extension of the software package MOST (Metabolic Optimization and Simulation Tool), satisfies (i), and by using a pre-drawn overview map of metabolism based on the Roche map it satisfies (ii) and comes close to satisfying (iii). MOST is distributed free of charge under the GNU General Public License. The software and full documentation are available at http://most.ccib.rutgers.edu/. dslun@rutgers.edu. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
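
    Attribute (iii), showing how pathways interconnect, reduces to finding shared metabolites between pathways while ignoring ubiquitous cofactors that would connect everything to everything. A minimal, hypothetical sketch (the pathway names, metabolite sets and cofactor list are illustrative, not drawn from MOST):

```python
def pathway_links(pathways, cofactors=frozenset({"ATP", "ADP", "NAD+", "NADH", "H2O"})):
    """Pairs of pathways sharing a non-cofactor metabolite -- the connections
    an overview map must route to show how pathways interconnect.

    pathways: dict mapping pathway name -> set of metabolite names.
    Returns dict mapping (name_a, name_b) -> set of shared metabolites.
    """
    names = sorted(pathways)
    links = {}
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            shared = (pathways[a] & pathways[b]) - cofactors
            if shared:
                links[(a, b)] = shared
    return links
```

    In a genome-scale model with thousands of reactions, automating this kind of connectivity extraction is what makes attribute (i) a prerequisite for (ii) and (iii).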

  17. Human-Automation Interaction Design for Adaptive Cruise Control Systems of Ground Vehicles.

    PubMed

    Eom, Hwisoo; Lee, Sang Hun

    2015-06-12

    A majority of recently developed advanced vehicles have been equipped with various automated driver assistance systems, such as adaptive cruise control (ACC) and lane keeping assistance systems. ACC systems have several operational modes, and drivers can be unaware of the mode in which they are operating. Because mode confusion is a significant human error factor that contributes to traffic accidents, it is necessary to develop user interfaces for ACC systems that can reduce mode confusion. To meet this requirement, this paper presents a new human-automation interaction design methodology in which the compatibility of the machine and interface models is determined using the proposed criteria, and if the models are incompatible, one or both models are modified to make them compatible. To investigate the effectiveness of our methodology, we designed two new interfaces by separately modifying the machine model and the interface model and then performed driver-in-the-loop experiments. The results showed that modifying the machine model provides a more compact, acceptable, effective, and safe interface than modifying the interface model.
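
    The underlying idea of comparing a machine model and an interface model can be sketched with finite-state machines: the two are compatible if, from every reachable pair of states, every event leaves the displayed mode in agreement with the machine's true mode. This is a generic illustration of that check, not the paper's criteria; the state names, events and depth bound are hypothetical.

```python
def compatible(machine, interface, abstraction, events, start_m, start_i, depth=6):
    """Breadth-first check that an interface FSM tracks a machine FSM.

    machine, interface: dict[state][event] -> next state (missing event = no change).
    abstraction: map from machine state to the mode the display should show.
    Returns False if any reachable event sequence produces mode confusion,
    i.e. a displayed mode that disagrees with the machine's actual mode.
    """
    seen = set()
    frontier = [(start_m, start_i)]
    while frontier and depth:
        depth -= 1
        nxt = []
        for m, i in frontier:
            if (m, i) in seen:
                continue
            seen.add((m, i))
            if abstraction[m] != i:
                return False  # mode confusion: display disagrees with machine
            for e in events:
                nxt.append((machine[m].get(e, m), interface[i].get(e, i)))
        frontier = nxt
    return True
```

    An interface that fails this check (e.g. one that never returns to an "off" display after the machine cancels cruise on braking) would be the kind of model the methodology modifies to restore compatibility.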

  18. Human-Automation Interaction Design for Adaptive Cruise Control Systems of Ground Vehicles

    PubMed Central

    Eom, Hwisoo; Lee, Sang Hun

    2015-01-01

    A majority of recently developed advanced vehicles have been equipped with various automated driver assistance systems, such as adaptive cruise control (ACC) and lane keeping assistance systems. ACC systems have several operational modes, and drivers can be unaware of the mode in which they are operating. Because mode confusion is a significant human error factor that contributes to traffic accidents, it is necessary to develop user interfaces for ACC systems that can reduce mode confusion. To meet this requirement, this paper presents a new human-automation interaction design methodology in which the compatibility of the machine and interface models is determined using the proposed criteria, and if the models are incompatible, one or both models are modified to make them compatible. To investigate the effectiveness of our methodology, we designed two new interfaces by separately modifying the machine model and the interface model and then performed driver-in-the-loop experiments. The results showed that modifying the machine model provides a more compact, acceptable, effective, and safe interface than modifying the interface model. PMID:26076406

  19. Performance of Automated Speech Scoring on Different Low- to Medium-Entropy Item Types for Low-Proficiency English Learners. Research Report. ETS RR-17-12

    ERIC Educational Resources Information Center

    Loukina, Anastassia; Zechner, Klaus; Yoon, Su-Youn; Zhang, Mo; Tao, Jidong; Wang, Xinhao; Lee, Chong Min; Mulholland, Matthew

    2017-01-01

    This report presents an overview of the SpeechRater automated scoring engine model building and evaluation process for several item types with a focus on a low-English-proficiency test-taker population. We discuss each stage of speech scoring, including automatic speech recognition, filtering models for nonscorable responses, and…

  20. Extended System Operations Studies for Automated Guideway Transit Systems

    DOT National Transportation Integrated Search

    1982-02-01

    The objectives of the System Operations Studies (SOS) of the Automated Guideway Transit Technology (AGTT) program were to develop models for the analysis of system operations, to evaluate AGT system performance and cost, and to establish guidelines fo...
