Sample records for automated parallel cultures

  1. Toward an automated parallel computing environment for geosciences

    NASA Astrophysics Data System (ADS)

    Zhang, Huai; Liu, Mian; Shi, Yaolin; Yuen, David A.; Yan, Zhenzhen; Liang, Guoping

    2007-08-01

    Software for geodynamic modeling has not kept up with the fast-growing computing hardware and network resources. In the past decade supercomputing power has become available to most researchers in the form of affordable Beowulf clusters and other parallel computer platforms. However, taking full advantage of such computing power requires developing parallel algorithms and associated software, a task that is often too daunting for geoscience modelers whose main expertise is in geosciences. We introduce here an automated parallel computing environment built on open-source algorithms and libraries. Users interact with this computing environment by specifying the partial differential equations, solvers, and model-specific properties using an English-like modeling language in the input files. The system then automatically generates finite element codes that can be run on distributed or shared memory parallel machines. This system is dynamic and flexible, allowing users to address different problems in geosciences. It is capable of providing web-based services, enabling users to generate source codes online. This unique feature will allow high-performance computing to be integrated with distributed data grids in the emerging cyber-infrastructures for geosciences. In this paper we discuss the principles of this automated modeling environment and provide examples to demonstrate its versatility.
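
    A hedged sketch of the core idea, declarative input compiled to solver code: the snippet below parses a tiny English-like spec and dispatches it to a finite-difference Poisson solver. The spec keys, the 1D problem, and the solver are illustrative assumptions, not the authors' modeling language or their generated finite element code.

```python
# Minimal sketch (not the authors' system): a declarative, English-like
# input is parsed and routed to a numerical solver. Spec format and
# solver choice here are illustrative assumptions.
import numpy as np

SPEC = """
equation: poisson        # -u'' = f on (0, 1)
source: 1.0              # constant right-hand side f
boundary: u(0)=0 u(1)=0  # homogeneous Dirichlet
nodes: 101
"""

def parse(spec):
    out = {}
    for line in spec.strip().splitlines():
        key, _, rest = line.partition(":")
        out[key.strip()] = rest.split("#")[0].strip()
    return out

def solve_poisson(f_const, n):
    # Second-order finite differences on a uniform grid.
    h = 1.0 / (n - 1)
    A = (np.diag(np.full(n - 2, 2.0)) +
         np.diag(np.full(n - 3, -1.0), 1) +
         np.diag(np.full(n - 3, -1.0), -1)) / h**2
    u = np.zeros(n)
    u[1:-1] = np.linalg.solve(A, np.full(n - 2, f_const))
    return u

cfg = parse(SPEC)
if cfg["equation"] == "poisson":
    u = solve_poisson(float(cfg["source"]), int(cfg["nodes"]))
    print(f"max of u: {u.max():.6f}")  # analytic maximum is 1/8 = 0.125
```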

  2. An automated workflow for parallel processing of large multiview SPIM recordings.

    PubMed

    Schmied, Christopher; Steinbach, Peter; Pietzsch, Tobias; Preibisch, Stephan; Tomancak, Pavel

    2016-04-01

    Selective Plane Illumination Microscopy (SPIM) makes it possible to image developing organisms in 3D at unprecedented temporal resolution over long periods of time. The resulting massive amounts of raw image data require extensive processing, currently performed interactively via dedicated graphical user interface (GUI) applications. The consecutive processing steps can be easily automated and the individual time points can be processed independently, which lends itself to trivial parallelization on a high performance computing (HPC) cluster. Here, we introduce an automated workflow for processing large multiview, multichannel, multiillumination time-lapse SPIM data on a single workstation or in parallel on a HPC cluster. The pipeline relies on snakemake to resolve dependencies among consecutive processing steps and can be easily adapted to any cluster environment for processing SPIM data in a fraction of the time required to collect it. The code is distributed free and open source under the MIT license (http://opensource.org/licenses/MIT). The source code can be downloaded from github: https://github.com/mpicbg-scicomp/snakemake-workflows. Documentation can be found here: http://fiji.sc/Automated_workflow_for_parallel_Multiview_Reconstruction. Contact: schmied@mpi-cbg.de. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
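
    The property the pipeline exploits is that each time point is independent of the others. The real workflow expresses this as snakemake rules driving Fiji plugins; the minimal sketch below shows only the parallelization pattern in plain Python, with assumed file names and a placeholder per-time-point step.

```python
# Illustrative sketch only: independent time points parallelize
# trivially across a process pool (or, in the published pipeline,
# across cluster jobs managed by snakemake).
import concurrent.futures as cf
import pathlib

def process_timepoint(raw_file: pathlib.Path) -> str:
    # Placeholder for one time point's chain of steps
    # (registration -> fusion -> deconvolution in the real workflow).
    out = raw_file.with_suffix(".processed")
    out.write_text(f"processed {raw_file.name}\n")
    return out.name

if __name__ == "__main__":
    raw = sorted(pathlib.Path("raw_spim").glob("tp*.tif"))  # assumed layout
    with cf.ProcessPoolExecutor() as pool:
        for name in pool.map(process_timepoint, raw):
            print("done:", name)
```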

  3. An automated workflow for parallel processing of large multiview SPIM recordings

    PubMed Central

    Schmied, Christopher; Steinbach, Peter; Pietzsch, Tobias; Preibisch, Stephan; Tomancak, Pavel

    2016-01-01

    Summary: Selective Plane Illumination Microscopy (SPIM) makes it possible to image developing organisms in 3D at unprecedented temporal resolution over long periods of time. The resulting massive amounts of raw image data require extensive processing, currently performed interactively via dedicated graphical user interface (GUI) applications. The consecutive processing steps can be easily automated and the individual time points can be processed independently, which lends itself to trivial parallelization on a high performance computing (HPC) cluster. Here, we introduce an automated workflow for processing large multiview, multichannel, multiillumination time-lapse SPIM data on a single workstation or in parallel on a HPC cluster. The pipeline relies on snakemake to resolve dependencies among consecutive processing steps and can be easily adapted to any cluster environment for processing SPIM data in a fraction of the time required to collect it. Availability and implementation: The code is distributed free and open source under the MIT license http://opensource.org/licenses/MIT. The source code can be downloaded from github: https://github.com/mpicbg-scicomp/snakemake-workflows. Documentation can be found here: http://fiji.sc/Automated_workflow_for_parallel_Multiview_Reconstruction. Contact: schmied@mpi-cbg.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26628585

  4. A hierarchical, automated target recognition algorithm for a parallel analog processor

    NASA Technical Reports Server (NTRS)

    Woodward, Gail; Padgett, Curtis

    1997-01-01

    A hierarchical approach is described for an automated target recognition (ATR) system, VIGILANTE, that uses a massively parallel, analog processor (3DANN). The 3DANN processor is capable of performing 64 concurrent inner products of size 1x4096 every 250 nanoseconds.
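
    Taking the stated figures at face value, the implied raw throughput is a quick back-of-envelope computation:

```python
# Back-of-envelope throughput implied by the stated 3DANN figures.
inner_products = 64
vector_length = 4096
window_s = 250e-9  # 250 nanoseconds

macs_per_window = inner_products * vector_length   # 262,144
macs_per_second = macs_per_window / window_s       # ~1.05e12
print(f"{macs_per_second:.3e} multiply-accumulates per second")
# ~1.05 tera-MACs/s, roughly 2.1 tera-ops/s counting multiply and add.
```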

  5. Hierarchically Parallelized Constrained Nonlinear Solvers with Automated Substructuring

    NASA Technical Reports Server (NTRS)

    Padovan, Joe; Kwang, Abel

    1994-01-01

    This paper develops a parallelizable multilevel multiple constrained nonlinear equation solver. The substructuring process is automated to yield appropriately balanced partitioning of each succeeding level. Due to the generality of the procedure, sequential, as well as partially and fully parallel environments can be handled. This includes both single and multiprocessor assignment per individual partition. Several benchmark examples are presented. These illustrate the robustness of the procedure as well as its capability to yield significant reductions in memory utilization and calculational effort due both to updating and inversion.

  6. At the intersection of automation and culture

    NASA Technical Reports Server (NTRS)

    Sherman, P. J.; Wiener, E. L.

    1995-01-01

    The crash of a highly automated passenger jet at Nagoya, Japan, in 1994 is used as an example of crew error in using automatic systems. Automation provides pilots with the ability to perform tasks in various ways. National culture is cited as a factor that affects how a pilot and crew interact with each other and with the equipment.

  7. Efficient Parallel Levenberg-Marquardt Model Fitting towards Real-Time Automated Parametric Imaging Microscopy

    PubMed Central

    Zhu, Xiang; Zhang, Dianwen

    2013-01-01

    We present a fast, accurate and robust parallel Levenberg-Marquardt minimization optimizer, GPU-LMFit, which is implemented on a graphics processing unit (GPU) for high-performance, scalable parallel model fitting. GPU-LMFit can provide a dramatic speed-up in massive model fitting analyses to enable real-time automated pixel-wise parametric imaging microscopy. We demonstrate the performance of GPU-LMFit for applications in superresolution localization microscopy and fluorescence lifetime imaging microscopy. PMID:24130785
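
    GPU-LMFit itself is a GPU implementation and its source is not reproduced here; the sketch below shows only the underlying pattern, independent Levenberg-Marquardt fits at every pixel run in parallel, using SciPy's LM solver on synthetic lifetime-imaging-like decays.

```python
# CPU analogue of the paper's idea (not GPU-LMFit itself): fit the same
# model independently at every pixel, parallelized across a process pool.
import numpy as np
from multiprocessing import Pool
from scipy.optimize import curve_fit

T = np.linspace(0.1, 10.0, 50)  # time gates (ns), illustrative

def model(t, a, tau):
    return a * np.exp(-t / tau)

def fit_pixel(decay):
    popt, _ = curve_fit(model, T, decay, p0=(decay[0], 2.0), method="lm")
    return popt[1]  # fitted lifetime tau

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_tau = 2.5
    pixels = model(T, 100.0, true_tau) + rng.normal(0, 1.0, (1024, T.size))
    with Pool() as pool:
        taus = pool.map(fit_pixel, list(pixels))
    print(f"mean fitted tau: {np.mean(taus):.2f} ns (true {true_tau})")
```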

  8. StrAuto: automation and parallelization of STRUCTURE analysis.

    PubMed

    Chhatre, Vikram E; Emerson, Kevin J

    2017-03-24

    Population structure inference using the software STRUCTURE has become an integral part of population genetic studies covering a broad spectrum of taxa including humans. The ever-expanding size of genetic data sets poses computational challenges for this analysis. Although at least one tool currently implements parallel computing to reduce the computational load of this analysis, it does not fully automate the use of replicate STRUCTURE analysis runs required for downstream inference of optimal K. There is a pressing need for a tool that can deploy population structure analysis on high performance computing clusters. We present an updated version of the popular Python program StrAuto, to streamline population structure analysis using parallel computing. StrAuto implements a pipeline that combines STRUCTURE analysis with the Evanno ΔK analysis and visualization of results using STRUCTURE HARVESTER. Using benchmarking tests, we demonstrate that StrAuto significantly reduces the computational time needed to perform iterative STRUCTURE analysis by distributing runs over two or more processors. StrAuto is the first tool to integrate STRUCTURE analysis with post-processing using a pipeline approach in addition to implementing parallel computation, a setup ideal for deployment on computing clusters. StrAuto is distributed under the GNU GPL (General Public License) and is available to download from http://strauto.popgen.org.
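
    The Evanno ΔK statistic that StrAuto computes after the replicate runs finish is a simple second-difference formula; in the sketch below the ln P(X|K) replicate values are made-up placeholders, not real STRUCTURE output.

```python
# Sketch of the Evanno delta-K computation automated by StrAuto.
import numpy as np

lnP = {1: [-5200.0, -5210.0, -5190.0],
       2: [-4800.0, -4795.0, -4810.0],
       3: [-4700.0, -4710.0, -4690.0],
       4: [-4690.0, -4680.0, -4700.0],
       5: [-4685.0, -4692.0, -4688.0]}

ks = sorted(lnP)
mean = np.array([np.mean(lnP[k]) for k in ks])
sd = np.array([np.std(lnP[k], ddof=1) for k in ks])

# deltaK(K) = |L(K+1) - 2 L(K) + L(K-1)| / sd(L(K)), interior K only.
for i in range(1, len(ks) - 1):
    delta_k = abs(mean[i + 1] - 2 * mean[i] + mean[i - 1]) / sd[i]
    print(f"K={ks[i]}: deltaK = {delta_k:.2f}")
```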

  9. Automated Performance Prediction of Message-Passing Parallel Programs

    NASA Technical Reports Server (NTRS)

    Block, Robert J.; Sarukkai, Sekhar; Mehra, Pankaj; Woodrow, Thomas S. (Technical Monitor)

    1995-01-01

    The increasing use of massively parallel supercomputers to solve large-scale scientific problems has generated a need for tools that can predict scalability trends of applications written for these machines. Much work has been done to create simple models that represent important characteristics of parallel programs, such as latency, network contention, and communication volume. But many of these methods still require substantial manual effort to represent an application in the model's format. The MK toolkit described in this paper is the result of an ongoing effort to automate the formation of analytic expressions of program execution time, with a minimum of programmer assistance. In this paper we demonstrate the feasibility of our approach, by extending previous work to detect and model communication patterns automatically, with and without overlapped computations. The predictions derived from these models agree, within reasonable limits, with execution times of programs measured on the Intel iPSC/860 and Paragon. Further, we demonstrate the use of MK in selecting optimal computational grain size and studying various scalability metrics.
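
    A closed-form model of the kind such a toolkit derives can be written as divided computation plus a latency/bandwidth communication term; the constants below are illustrative placeholders, not measured iPSC/860 or Paragon parameters.

```python
# One analytic execution-time model of this general form: T(n, p) =
# computation split over p processors plus an alpha/beta communication
# cost for a 2-neighbor halo exchange. All constants are assumptions.
def predicted_time(n, p, t_flop=1e-7, alpha=5e-4, beta=1e-6,
                   flops_per_point=10):
    comp = flops_per_point * t_flop * n / p         # ideal work split
    comm = 2 * (alpha + beta * (n / p) ** 0.5)      # per-iteration halo
    return comp + comm

n = 1_000_000  # grid points
for p in (1, 4, 16, 64):
    t = predicted_time(n, p)
    print(f"p={p:3d}  T={t:.4f} s  speedup={predicted_time(n, 1) / t:.1f}")
```

    Grain-size selection then amounts to choosing the partition so that the two terms balance.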

  10. Automated Long-Term Monitoring of Parallel Microfluidic Operations Applying a Machine Vision-Assisted Positioning Method

    PubMed Central

    Yip, Hon Ming; Li, John C. S.; Cui, Xin; Gao, Qiannan; Leung, Chi Chiu

    2014-01-01

    As microfluidics has been applied extensively in many cell and biochemical applications, monitoring the related processes is an important requirement. In this work, we design and fabricate a high-throughput microfluidic device which contains 32 microchambers to perform automated parallel microfluidic operations and monitoring on the automated stage of a microscope. Images are captured at multiple spots on the device during the operations for monitoring samples in microchambers in parallel; yet the device positions may vary at different time points throughout operations as the device moves back and forth on a motorized microscope stage. Here, we report an image-based positioning strategy to realign the chamber position before every microscopic image is recorded. We fabricate alignment marks at defined locations next to the chambers in the microfluidic device as reference positions. We also develop image processing algorithms to recognize the chamber positions in real-time, followed by realigning the chambers to their preset positions in the captured images. We perform experiments to validate and characterize the device functionality and the automated realignment operation. Together, this microfluidic realignment strategy can be a platform technology to achieve precise positioning of multiple chambers for general microfluidic applications requiring long-term parallel monitoring of cell and biochemical activities. PMID:25133248
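
    One common way to implement mark-based realignment (the paper's exact algorithm may differ) is normalized template matching; the sketch below uses OpenCV on a synthetic frame containing a cross-shaped fiducial and reports the stage correction.

```python
# Illustrative realignment via template matching (assumed approach).
import numpy as np
import cv2  # opencv-python

# Synthetic camera frame with a cross-shaped alignment mark.
rng = np.random.default_rng(1)
frame = rng.integers(0, 50, (480, 640), dtype=np.uint8)
mark = np.zeros((20, 20), dtype=np.uint8)
mark[8:12, :] = 255                      # horizontal bar of the cross
mark[:, 8:12] = 255                      # vertical bar of the cross
true_x, true_y = 300, 200
frame[true_y:true_y + 20, true_x:true_x + 20] = mark

# Locate the mark by normalized cross-correlation template matching.
result = cv2.matchTemplate(frame, mark, cv2.TM_CCOEFF_NORMED)
_, _, _, max_loc = cv2.minMaxLoc(result)

preset = (310, 195)                      # where the chamber should appear
dx, dy = preset[0] - max_loc[0], preset[1] - max_loc[1]
print(f"mark at {max_loc}; correct stage by ({dx}, {dy}) px")
```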

  11. Comparability of automated human induced pluripotent stem cell culture: a pilot study.

    PubMed

    Archibald, Peter R T; Chandra, Amit; Thomas, Dave; Chose, Olivier; Massouridès, Emmanuelle; Laâbi, Yacine; Williams, David J

    2016-12-01

    Consistent and robust manufacturing is essential for the translation of cell therapies, and the utilisation of automation throughout the manufacturing process may allow for improvements in quality control, scalability, reproducibility and economics of the process. The aim of this study was to measure and establish the comparability between alternative process steps for the culture of hiPSCs. Consequently, the effects of manual centrifugation and automated non-centrifugation process steps, performed using TAP Biosystems' CompacT SelecT automated cell culture platform, upon the culture of a human induced pluripotent stem cell (hiPSC) line (VAX001024c07) were compared. This study has demonstrated that comparable morphologies and cell diameters were observed in hiPSCs cultured using either manual or automated process steps. However, non-centrifugation hiPSC populations exhibited greater cell yields, greater aggregate rates, increased pluripotency marker expression, and decreased differentiation marker expression compared to centrifugation hiPSCs. A trend for decreased variability in cell yield was also observed after the utilisation of the automated process step. This study also highlights the detrimental effect of the cryopreservation and thawing processes upon the growth and characteristics of hiPSC cultures, and demonstrates that automated hiPSC manufacturing protocols can be successfully transferred between independent laboratories.

  12. Effects of ATC automation on precision approaches to closely spaced parallel runways

    NASA Technical Reports Server (NTRS)

    Slattery, R.; Lee, K.; Sanford, B.

    1995-01-01

    Improved navigational technology (such as the Microwave Landing System and the Global Positioning System) installed in modern aircraft will enable air traffic controllers to better utilize available airspace. Consequently, arrival traffic can fly approaches to parallel runways separated by smaller distances than are currently allowed. Previous simulation studies of advanced navigation approaches have found that controller workload is increased when there is a combination of aircraft that are capable of following advanced navigation routes and aircraft that are not. Research into Air Traffic Control automation at Ames Research Center has led to the development of the Center-TRACON Automation System (CTAS). The Final Approach Spacing Tool (FAST) is the component of the CTAS used in the TRACON area. The work in this paper examines, via simulation, the effects of FAST used for aircraft landing on closely spaced parallel runways. The simulation contained various combinations of aircraft, equipped and unequipped with advanced navigation systems. A set of simulations was run both manually and with an augmented set of FAST advisories to sequence aircraft, assign runways, and avoid conflicts. The results of the simulations are analyzed, measuring the airport throughput, aircraft delay, loss of separation, and controller workload.

  13. Long-term maintenance of human induced pluripotent stem cells by automated cell culture system.

    PubMed

    Konagaya, Shuhei; Ando, Takeshi; Yamauchi, Toshiaki; Suemori, Hirofumi; Iwata, Hiroo

    2015-11-17

    Pluripotent stem cells, such as embryonic stem cells and induced pluripotent stem (iPS) cells, are regarded as new sources for cell replacement therapy. These cells can expand without limit under undifferentiated conditions and be differentiated into multiple cell types. Automated culture systems enable the large-scale production of cells. In addition to reducing the time and effort of researchers, an automated culture system improves the reproducibility of cell cultures. In the present study, we newly designed a fully automated cell culture system for human iPS cell maintenance. Using the automated culture system, hiPS cells maintained their undifferentiated state for 60 days. Automatically prepared hiPS cells retained the potency to differentiate into cells of all three germ layers, including dopaminergic neurons and pancreatic cells.

  14. Rapid, automated, parallel quantitative immunoassays using highly integrated microfluidics and AlphaLISA

    PubMed Central

    Tak For Yu, Zeta; Guan, Huijiao; Ki Cheung, Mei; McHugh, Walker M.; Cornell, Timothy T.; Shanley, Thomas P.; Kurabayashi, Katsuo; Fu, Jianping

    2015-01-01

    Immunoassays represent one of the most popular analytical methods for detection and quantification of biomolecules. However, conventional immunoassays such as ELISA and flow cytometry, even though providing high sensitivity and specificity and multiplexing capability, can be labor-intensive and prone to human error, making them unsuitable for standardized clinical diagnoses. Using a commercialized no-wash, homogeneous immunoassay technology (‘AlphaLISA’) in conjunction with integrated microfluidics, herein we developed a microfluidic immunoassay chip capable of rapid, automated, parallel immunoassays of microliter quantities of samples. Operation of the microfluidic immunoassay chip entailed rapid mixing and conjugation of AlphaLISA components with target analytes before quantitative imaging for analyte detections in up to eight samples simultaneously. Aspects such as fluid handling and operation, surface passivation, imaging uniformity, and detection sensitivity of the microfluidic immunoassay chip using AlphaLISA were investigated. The microfluidic immunoassay chip could detect one target analyte simultaneously for up to eight samples in 45 min with a limit of detection down to 10 pg mL−1. The microfluidic immunoassay chip was further utilized for functional immunophenotyping to examine cytokine secretion from human immune cells stimulated ex vivo. Together, the microfluidic immunoassay chip provides a promising high-throughput, high-content platform for rapid, automated, parallel quantitative immunosensing applications. PMID:26074253

  15. Automation of 3D cell culture using chemically defined hydrogels.

    PubMed

    Rimann, Markus; Angres, Brigitte; Patocchi-Tenzer, Isabel; Braum, Susanne; Graf-Hausner, Ursula

    2014-04-01

    Drug development relies on high-throughput screening involving cell-based assays. Most of the assays are still based on cells grown in monolayer rather than in three-dimensional (3D) formats, although cells behave more in vivo-like in 3D. To exemplify the adoption of 3D techniques in drug development, this project investigated the automation of a hydrogel-based 3D cell culture system using a liquid-handling robot. The hydrogel technology used offers high flexibility of gel design due to a modular composition of a polymer network and bioactive components. The cell-inert degradation of the gel at the end of the culture period guaranteed the harmless isolation of live cells for further downstream processing. Human colon carcinoma cells HCT-116 were encapsulated and grown in these dextran-based hydrogels, thereby forming 3D multicellular spheroids. Viability and DNA content of the cells were shown to be similar in automated and manually produced hydrogels. Furthermore, cell treatment with toxic Taxol concentrations (100 nM) had the same effect on HCT-116 cell viability in manually and automatically produced hydrogel preparations. Finally, a fully automated dose-response curve with the reference compound Taxol showed the potential of this hydrogel-based 3D cell culture system in advanced drug development.
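
    The automated dose-response step maps naturally onto a four-parameter logistic fit. The sketch below is illustrative only: the viability values are synthetic, and the dose range and starting parameters are assumptions.

```python
# Hedged sketch of the final analysis step: a four-parameter logistic
# (4PL) dose-response fit on synthetic viability readouts.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, bottom, top, ec50, hill):
    return bottom + (top - bottom) / (1.0 + (x / ec50) ** hill)

dose = np.array([1, 3, 10, 30, 100, 300, 1000], float)    # nM (assumed)
viability = np.array([98, 95, 80, 55, 30, 15, 10], float)  # % of control

popt, _ = curve_fit(four_pl, dose, viability,
                    p0=(10, 100, 30, 1.0), maxfev=5000)
print(f"EC50 ~ {popt[2]:.1f} nM, Hill slope ~ {popt[3]:.2f}")
```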

  16. Parallel computing for automated model calibration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burke, John S.; Danielson, Gary R.; Schulz, Douglas A.

    2002-07-29

    Natural resources model calibration is a significant burden on computing and staff resources in modeling efforts. Most assessments must consider multiple calibration objectives (for example, magnitude and timing of stream flow peak). An automated calibration process that allows real-time updating of data/models, freeing scientists to focus their effort on improving models, is needed. We are in the process of building a fully featured multi-objective calibration tool capable of processing multiple models cheaply and efficiently using null cycle computing. Our parallel processing and calibration software routines have been written generically, but our focus has been on natural resources model calibration. So far, the natural resources models have been friendly to parallel calibration efforts in that they require no inter-process communication, need only a small amount of input data and output only a small amount of statistical information for each calibration run. A typical auto calibration run might involve running a model 10,000 times with a variety of input parameters and summary statistical output. In the past, model calibration has been done against individual models for each data set. The individual model runs are relatively fast, ranging from seconds to minutes. The process was run on a single computer using a simple iterative process. We have completed two auto calibration prototypes and are currently designing a more feature-rich tool. Our prototypes have focused on running the calibration in a distributed computing, cross-platform environment. They allow incorporation of "smart" calibration parameter generation (using artificial intelligence processing techniques). Null cycle computing similar to SETI@Home has also been a focus of our efforts. This paper details the design of the latest prototype and discusses our plans for the next revision of the software.
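
    The pattern described, many independent model runs each with small inputs and outputs, is embarrassingly parallel. A minimal sketch with a toy stream-flow model standing in for the real simulator and an assumed two-part objective (peak magnitude and timing):

```python
# Embarrassingly parallel calibration sweep: each run takes a parameter
# set, runs the model, and returns summary statistics; no inter-process
# communication is needed. The "model" and objective are placeholders.
import numpy as np
from multiprocessing import Pool

OBSERVED_PEAK, OBSERVED_PEAK_DAY = 120.0, 35

def run_model(params):
    scale, shift = params
    days = np.arange(100)
    return scale * np.exp(-0.5 * ((days - shift) / 10.0) ** 2)

def objective(params):
    flow = run_model(params)
    mag_err = abs(flow.max() - OBSERVED_PEAK)        # peak magnitude
    time_err = abs(int(flow.argmax()) - OBSERVED_PEAK_DAY)  # peak timing
    return params, mag_err + 5.0 * time_err

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    candidates = [(rng.uniform(50, 200), rng.uniform(0, 99))
                  for _ in range(10_000)]
    with Pool() as pool:
        best = min(pool.map(objective, candidates), key=lambda r: r[1])
    print("best params:", best[0], "score:", round(best[1], 2))
```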

  17. [Automated analyser of organ cultured corneal endothelial mosaic].

    PubMed

    Gain, P; Thuret, G; Chiquet, C; Gavet, Y; Turc, P H; Théillère, C; Acquart, S; Le Petit, J C; Maugery, J; Campos, L

    2002-05-01

    Until now, organ-cultured corneal endothelial mosaic has been assessed in France by cell counting using a calibrated graticule, or by drawing cells on a computerized image. The former method is unsatisfactory because it lacks an objective evaluation of cell surface and hexagonality and requires an experienced technician. The latter method is time-consuming and requires careful attention. We aimed to build an efficient, fast and easy-to-use automated digital analyzer of video images of the corneal endothelium. The hardware included a Pentium III® 800 MHz PC with 256 MB of RAM, a Data Translation 3155 acquisition card, a Sony SC 75 CE CCD camera, and a 22-inch screen. Special functions for automated cell boundary determination consisted of plug-in programs included in the ImageTool software. Calibration was performed using a calibrated micrometer. Cell densities of 40 organ-cultured corneas measured by both manual and automated counting were compared using parametric tests (Student's t test for paired variables and the Pearson correlation coefficient). All steps were made more ergonomic, i.e., endothelial image capture, image selection, thresholding of multiple areas of interest, automated cell count, automated detection of errors in cell boundary drawing, and presentation of the results in an HTML file including the number of counted cells, cell density, coefficient of variation of cell area, cell surface histogram and cell hexagonality. The device was efficient because the overall process lasted on average 7 minutes and did not require an experienced technician. The correlation between cell densities obtained with both methods was high (r=+0.84, p<0.001). The results showed an under-estimation with manual counting (2191±322 vs. 2273±457 cells/mm², p=0.046) compared with the automated method. Our automated endothelial cell analyzer is efficient and gives reliable results quickly and easily. A multicentric validation would allow us to

  18. SequenceL: Automated Parallel Algorithms Derived from CSP-NT Computational Laws

    NASA Technical Reports Server (NTRS)

    Cooke, Daniel; Rushton, Nelson

    2013-01-01

    With the introduction of new parallel architectures like the cell and multicore chips from IBM, Intel, AMD, and ARM, as well as the petascale processing available for high-end computing, a larger number of programmers will need to write parallel codes. Adding the parallel control structure to the sequence, selection, and iterative control constructs increases the complexity of code development, which often results in increased development costs and decreased reliability. SequenceL is a high-level programming language, that is, a programming language that is closer to a human's way of thinking than to a machine's. Historically, high-level languages have resulted in decreased development costs and increased reliability, at the expense of performance. In recent applications at JSC and in industry, SequenceL has demonstrated the usual advantages of high-level programming in terms of low cost and high reliability. SequenceL programs, however, have run at speeds typically comparable with, and in many cases faster than, their counterparts written in C and C++ when run on single-core processors. Moreover, SequenceL is able to generate parallel executables automatically for multicore hardware, gaining parallel speedups without any extra effort from the programmer beyond what is required to write the sequential/single-core code. A SequenceL-to-C++ translator has been developed that automatically renders readable multithreaded C++ from a combination of a SequenceL program and sample data input. The SequenceL language is based on two fundamental computational laws, Consume-Simplify-Produce (CSP) and Normalize-Transpose (NT), which enable it to automate the creation of parallel algorithms from high-level code that has no annotations of parallelism whatsoever. In our anecdotal experience, SequenceL development has been in every case less costly than development of the same algorithm in sequential (that is, single-core, single process) C or C++, and an order of magnitude less
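
    The Normalize-Transpose idea can be mimicked in ordinary Python to show what "no annotations of parallelism" means: a function written for scalars is applied automatically over arbitrarily nested sequences. This toy shows only the semantics; SequenceL's compiler additionally turns the implied elementwise loops into threads.

```python
# Toy illustration (Python, not SequenceL) of Normalize-Transpose:
# scalars are broadcast alongside sequences (normalize), then the
# function is applied element by element (transpose), recursively.
def nt(f):
    def wrapped(*args):
        if any(isinstance(a, list) for a in args):
            n = max(len(a) for a in args if isinstance(a, list))
            norm = [a if isinstance(a, list) else [a] * n for a in args]
            return [wrapped(*elems) for elems in zip(*norm)]
        return f(*args)
    return wrapped

@nt
def add(x, y):
    return x + y

print(add(1, 2))                        # 3
print(add([1, 2, 3], 10))               # [11, 12, 13]
print(add([[1, 2], [3, 4]], [10, 20]))  # [[11, 12], [23, 24]]
```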

  19. A Multiscale Parallel Computing Architecture for Automated Segmentation of the Brain Connectome

    PubMed Central

    Knobe, Kathleen; Newton, Ryan R.; Schlimbach, Frank; Blower, Melanie; Reid, R. Clay

    2015-01-01

    Several groups in neurobiology have embarked into deciphering the brain circuitry using large-scale imaging of a mouse brain and manual tracing of the connections between neurons. Creating a graph of the brain circuitry, also called a connectome, could have a huge impact on the understanding of neurodegenerative diseases such as Alzheimer’s disease. Although considerably smaller than a human brain, a mouse brain already exhibits one billion connections and manually tracing the connectome of a mouse brain can only be achieved partially. This paper proposes to scale up the tracing by using automated image segmentation and a parallel computing approach designed for domain experts. We explain the design decisions behind our parallel approach and we present our results for the segmentation of the vasculature and the cell nuclei, which have been obtained without any manual intervention. PMID:21926011

  20. Flexible automation of cell culture and tissue engineering tasks.

    PubMed

    Knoll, Alois; Scherer, Torsten; Poggendorf, Iris; Lütkemeyer, Dirk; Lehmann, Jürgen

    2004-01-01

    Until now, the predominant use cases of industrial robots have been routine handling tasks in the automotive industry. In biotechnology and tissue engineering, in contrast, only very few tasks have been automated with robots. New developments in robot platform and robot sensor technology, however, make it possible to automate plants that largely depend on human interaction with the production process, e.g., for material and cell culture fluid handling, transportation, operation of equipment, and maintenance. In this paper we present a robot system that lends itself to automating routine tasks in biotechnology but also has the potential to automate other production facilities that are similar in process structure. After motivating the design goals, we describe the system and its operation, illustrate sample runs, and give an assessment of the advantages. We conclude this paper by giving an outlook on possible further developments.

  1. Automation, parallelism, and robotics for proteomics.

    PubMed

    Alterovitz, Gil; Liu, Jonathan; Chow, Jijun; Ramoni, Marco F

    2006-07-01

    The speed of the human genome project (Lander, E. S., Linton, L. M., Birren, B., Nusbaum, C. et al., Nature 2001, 409, 860-921) was made possible, in part, by developments in automation of sequencing technologies. Before these technologies, sequencing was a laborious, expensive, and personnel-intensive task. Similarly, automation and robotics are changing the field of proteomics today. Proteomics is defined as the effort to understand and characterize proteins in the categories of structure, function and interaction (Englbrecht, C. C., Facius, A., Comb. Chem. High Throughput Screen. 2005, 8, 705-715). As such, this field nicely lends itself to automation technologies since these methods often require large economies of scale in order to achieve cost and time-saving benefits. This article describes some of the technologies and methods being applied in proteomics in order to facilitate automation within the field as well as in linking proteomics-based information with other related research areas.

  2. An Extended Case Study Methodology for Investigating Influence of Cultural, Organizational, and Automation Factors on Human-Automation Trust

    NASA Technical Reports Server (NTRS)

    Koltai, Kolina Sun; Ho, Nhut; Masequesmay, Gina; Niedober, David; Skoog, Mark; Johnson, Walter; Cacanindin, Artemio

    2014-01-01

    This paper discusses a case study that examined the influence of cultural, organizational and automation capability upon human trust in, and reliance on, automation. In particular, this paper focuses on the design and application of an extended case study methodology, and on the foundational lessons revealed by it. Experimental test pilots involved in the research and development of the US Air Force's newly developed Automatic Ground Collision Avoidance System served as the context for this examination. An eclectic, multi-pronged approach was designed to conduct this case study, and proved effective in addressing the challenges associated with the case's politically sensitive and military environment. Key results indicate that the system design was in alignment with pilot culture and organizational mission, indicating the potential for appropriate trust development in operational pilots. These include the low-vulnerability/high-risk nature of the pilot profession, automation transparency and suspicion, system reputation, and the setup of and communications among organizations involved in the system development.

  3. A Parallel Genetic Algorithm for Automated Electronic Circuit Design

    NASA Technical Reports Server (NTRS)

    Long, Jason D.; Colombano, Silvano P.; Haith, Gary L.; Stassinopoulos, Dimitris

    2000-01-01

    Parallelized versions of genetic algorithms (GAs) are popular primarily for three reasons: the GA is an inherently parallel algorithm, typical GA applications are very compute intensive, and powerful computing platforms, especially Beowulf-style computing clusters, are becoming more affordable and easier to implement. In addition, the low communication bandwidth required allows the use of inexpensive networking hardware such as standard office ethernet. In this paper we describe a parallel GA and its use in automated high-level circuit design. Genetic algorithms are a type of trial-and-error search technique that are guided by principles of Darwinian evolution. Just as the genetic material of two living organisms can intermix to produce offspring that are better adapted to their environment, GAs expose genetic material, frequently strings of 1s and 0s, to the forces of artificial evolution: selection, mutation, recombination, etc. GAs start with a pool of randomly-generated candidate solutions which are then tested and scored with respect to their utility. Solutions are then bred by probabilistically selecting high quality parents and recombining their genetic representations to produce offspring solutions. Offspring are typically subjected to a small amount of random mutation. After a pool of offspring is produced, this process iterates until a satisfactory solution is found or an iteration limit is reached. Genetic algorithms have been applied to a wide variety of problems in many fields, including chemistry, biology, and many engineering disciplines. There are many styles of parallelism used in implementing parallel GAs. One such method is called the master-slave or processor farm approach. In this technique, slave nodes are used solely to compute fitness evaluations (the most time consuming part). The master processor collects fitness scores from the nodes and performs the genetic operators (selection, reproduction, variation, etc.). Because of dependency
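
    A minimal master-slave GA in the style described, with a process pool standing in for the slave nodes and a trivial fitness function standing in for an expensive circuit simulation:

```python
# Master-slave GA sketch: workers evaluate fitness in parallel, the
# master performs selection, recombination, and mutation. The fitness
# function is a stand-in, not a circuit simulator.
import random
from multiprocessing import Pool

def fitness(genome):            # expensive evaluation (stand-in)
    return -sum(g * g for g in genome)

def mutate(genome, rate=0.1):
    return [g + random.gauss(0, 0.1) if random.random() < rate else g
            for g in genome]

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

if __name__ == "__main__":
    random.seed(0)
    pop = [[random.uniform(-5, 5) for _ in range(8)] for _ in range(60)]
    with Pool() as pool:                          # the "slave" workers
        for _ in range(40):
            scores = pool.map(fitness, pop)       # parallel evaluation
            ranked = [g for _, g in sorted(zip(scores, pop), reverse=True)]
            parents = ranked[:20]                 # master: selection
            children = [mutate(crossover(random.choice(parents),
                                         random.choice(parents)))
                        for _ in range(40)]       # master: reproduction
            pop = parents + children
        print("best fitness:", max(pool.map(fitness, pop)))
```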

  4. Can routine automated urinalysis reduce culture requests?

    PubMed

    Kayalp, Damla; Dogan, Kubra; Ceylan, Gozde; Senes, Mehmet; Yucel, Dogan

    2013-09-01

    There are a substantial number of unnecessary urine culture requests. We aimed to investigate whether urine dipstick and microscopy results could accurately rule out urinary tract infection (UTI) without urine culture. The study included a total of 32,998 patients (11,928 men and 21,070 women, mean age: 39 ± 32 years) with a preliminary diagnosis of UTI for whom both urinalysis and urinary culture were requested. All urine cultures were retrospectively reviewed; the association of culture positivity with a positive urinalysis result for leukocyte esterase (LE) and nitrite in chemical analysis and pyuria (WBC) and bacteriuria in microscopy was determined. The diagnostic performance of urinalysis parameters for detection of UTI was evaluated. In total, 758 (2.3%) patients were positive by urine culture. Of these culture-positive samples, the fractions with positive dipstick results for LE and nitrite were 71.0% (n=538) and 17.7% (n=134), respectively. The positive microscopy results for WBC and bacteria were 68.2% (n=517) and 78.8% (n=597), respectively. Negative predictive values for LE, nitrite, bacteriuria and WBC were very close to 100%. Most of the samples had no or insignificant bacterial growth. Urine dipstick and microscopy can accurately rule out UTI. Automated urinalysis is a practicable and faster screening test which may prevent unnecessary culture requests for the majority of patients. © 2013. Published by Elsevier Inc. All rights reserved.
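
    The rule-out argument rests on negative predictive value. In the quick computation below, the false-negative count for leukocyte esterase follows from the reported figures (758 - 538), while the true-negative count is a hypothetical placeholder:

```python
# Rule-out reasoning in one formula: NPV = TN / (TN + FN).
false_neg = 758 - 538      # culture-positive but LE-dipstick-negative
true_neg = 25_000          # assumed dipstick-negative, culture-negative
npv = true_neg / (true_neg + false_neg)
print(f"NPV = {npv:.4f}")  # close to 1, so a negative screen can
                           # safely avert the culture request
```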

  5. Digital microfluidics for automated hanging drop cell spheroid culture.

    PubMed

    Aijian, Andrew P; Garrell, Robin L

    2015-06-01

    Cell spheroids are multicellular aggregates, grown in vitro, that mimic the three-dimensional morphology of physiological tissues. Although there are numerous benefits to using spheroids in cell-based assays, the adoption of spheroids in routine biomedical research has been limited, in part, by the tedious workflow associated with spheroid formation and analysis. Here we describe a digital microfluidic platform that has been developed to automate liquid-handling protocols for the formation, maintenance, and analysis of multicellular spheroids in hanging drop culture. We show that droplets of liquid can be added to and extracted from through-holes, or "wells," and fabricated in the bottom plate of a digital microfluidic device, enabling the formation and assaying of hanging drops. Using this digital microfluidic platform, spheroids of mouse mesenchymal stem cells were formed and maintained in situ for 72 h, exhibiting good viability (>90%) and size uniformity (% coefficient of variation <10% intraexperiment, <20% interexperiment). A proof-of-principle drug screen was performed on human colorectal adenocarcinoma spheroids to demonstrate the ability to recapitulate physiologically relevant phenomena such as insulin-induced drug resistance. With automatable and flexible liquid handling, and a wide range of in situ sample preparation and analysis capabilities, the digital microfluidic platform provides a viable tool for automating cell spheroid culture and analysis. © 2014 Society for Laboratory Automation and Screening.

  6. Bioreactor design for successive culture of anchorage-dependent cells operated in an automated manner.

    PubMed

    Kino-Oka, Masahiro; Ogawa, Natsuki; Umegaki, Ryota; Taya, Masahito

    2005-01-01

    A novel bioreactor system was designed to perform a series of batchwise cultures of anchorage-dependent cells by means of automated operations of medium change and passage for cell transfer. Experimental data on contamination frequency confirmed the biological cleanliness of the bioreactor system, which performed the operations in a closed environment, as compared with a flask culture system with manual handling. In addition, tools for growth prediction (based on growth kinetics) and real-time growth monitoring by measurement of medium components (based on small-volume analyzing machinery) were installed in the bioreactor system, to schedule the operations of medium change and passage and to confirm that culture proceeds as scheduled, respectively. The successive culture of anchorage-dependent cells was conducted with the bioreactor running in an automated manner. The automated bioreactor gave successful culture performance in fair accordance with the preset schedule based on the information from the latest subculture, realizing 79-fold cell expansion over 169 h. In addition, the correlation factor between experimental data and scheduled values throughout the bioreactor run was 0.998. It was concluded that the proposed bioreactor, with the integration of the prediction and monitoring tools, offers a feasible system for the manufacturing process of cultured tissue products.

  7. Reproducible culture and differentiation of mouse embryonic stem cells using an automated microwell platform

    PubMed Central

    Hussain, Waqar; Moens, Nathalie; Veraitch, Farlan S.; Hernandez, Diana; Mason, Chris; Lye, Gary J.

    2013-01-01

    The use of embryonic stem cells (ESCs) and their progeny in high throughput drug discovery and regenerative medicine will require production at scale of well characterized cells at an appropriate level of purity. The adoption of automated bioprocessing techniques offers the possibility to overcome the lack of consistency and high failure rates seen with current manual protocols. To build the case for increased use of automation, this work addresses the key question: “can an automated system match the quality of a highly skilled and experienced person working manually?” To answer this we first describe an integrated automation platform designed for the ‘hands-free’ culture and differentiation of ESCs in microwell formats. Next we outline a framework for the systematic investigation and optimization of key bioprocess variables for the rapid establishment of validatable Standard Operating Procedures (SOPs). Finally the experimental comparison between manual and automated bioprocessing is exemplified by expansion of the murine Oct-4-GiP ESC line over eight sequential passages with their subsequent directed differentiation into neural precursors. Our results show that ESCs can be effectively maintained and differentiated in a highly reproducible manner by the automated system described. Statistical analysis of the results for cell growth over single and multiple passages shows up to a 3-fold improvement in the consistency of cell growth kinetics with automated passaging. The quality of the cells produced was evaluated using a panel of biological markers including cell growth rate and viability, nutrient and metabolite profiles, changes in gene expression and immunocytochemistry. Automated processing of the ESCs had no measurable negative effect on either their pluripotency or their ability to differentiate into the three embryonic germ layers. Equally important is that over a 6-month period of culture without antibiotics in the medium, we have not had any cases of contamination.

  8. Culture medium optimization for osmotolerant yeasts by use of a parallel fermenter system and rapid microbiological testing.

    PubMed

    Pfannebecker, Jens; Schiffer-Hetz, Claudia; Fröhlich, Jürgen; Becker, Barbara

    2016-11-01

    In the present study, a culture medium for qualitative detection of osmotolerant yeasts, named OM, was developed. For the development, culture media with different concentrations of glucose, fructose, potassium chloride and glycerin were analyzed in a Biolumix™ test incubator. Selectivity for osmotolerant yeasts was guaranteed by a water activity (aw) value of 0.91. The best results regarding fast growth of Zygosaccharomyces rouxii (WH 1002) were achieved in a culture medium consisting of 45% glucose, 5% fructose and 0.5% yeast extract and in a medium with 30% glucose, 10% glycerin, 5% potassium chloride and 0.5% yeast extract. Substances to stimulate yeast fermentation rates were analyzed in a RAMOS® parallel fermenter system, enabling online measurement of the carbon dioxide transfer rate (CTR) in shaking flasks. Significant increases of the CTR were achieved by adding, in particular, 0.1-0.2% ammonium salts ((NH₄)₂HPO₄, (NH₄)₂SO₄ or NH₄NO₃), 0.5% meat peptone and 1% malt extract. Detection times and the CTR of 23 food-borne yeast strains of the genera Zygosaccharomyces, Torulaspora, Schizosaccharomyces, Candida and Wickerhamomyces were analyzed in OM bouillon in comparison to the selective culture media YEG50, MYG50 and DG18 in the parallel fermenter system. The OM culture medium enabled the detection of 10² CFU/g within a time period of 2-3 days, depending on the analyzed yeast species. Compared with YEG50 and MYG50, the detection times could be reduced. As an example, W. anomalus (WH 1021) was detected after 124 h in YEG50, 95.5 h in MYG50 and 55 h in OM bouillon. Compared to YEG50, the maximum CO₂ transfer rates for Z. rouxii (WH 1001), T. delbrueckii (DSM 70526), S. pombe (DSM 70576) and W. anomalus (WH 1016) increased by a factor ≥2.6. Furthermore, enrichment cultures of inoculated high-sugar products in OM culture medium were analyzed in the Biolumix™ system. The results proved that detection times of 3 days for Z. rouxii and T. delbrueckii

  9. Influence of Cultural, Organizational, and Automation Capability on Human Automation Trust: A Case Study of Auto-GCAS Experimental Test Pilots

    NASA Technical Reports Server (NTRS)

    Koltai, Kolina; Ho, Nhut; Masequesmay, Gina; Niedober, David; Skoog, Mark; Cacanindin, Artemio; Johnson, Walter; Lyons, Joseph

    2014-01-01

    This paper discusses a case study that examined the influence of cultural, organizational and automation capability upon human trust in, and reliance on, automation. In particular, this paper focuses on the design and application of an extended case study methodology, and on the foundational lessons revealed by it. Experimental test pilots involved in the research and development of the US Air Force's newly developed Automatic Ground Collision Avoidance System served as the context for this examination. An eclectic, multi-pronged approach was designed to conduct this case study, and proved effective in addressing the challenges associated with the case's politically sensitive and military environment. Key results indicate that the system design was in alignment with pilot culture and organizational mission, indicating the potential for appropriate trust development in operational pilots. These include the low-vulnerability/high-risk nature of the pilot profession, automation transparency and suspicion, system reputation, and the setup of and communications among organizations involved in the system development.

  10. A Chip-Capillary Hybrid Device for Automated Transfer of Sample Pre-Separated by Capillary Isoelectric Focusing to Parallel Capillary Gel Electrophoresis for Two-Dimensional Protein Separation

    PubMed Central

    Lu, Joann J.; Wang, Shili; Li, Guanbin; Wang, Wei; Pu, Qiaosheng; Liu, Shaorong

    2012-01-01

    In this report, we introduce a chip-capillary hybrid device to integrate capillary isoelectric focusing (CIEF) with parallel capillary sodium dodecyl sulfate – polyacrylamide gel electrophoresis (SDS-PAGE) or capillary gel electrophoresis (CGE) toward automating two-dimensional (2D) protein separations. The hybrid device consists of three chips that are butted together. The middle chip can be moved between two positions to re-route the fluidic paths, which enables the performance of CIEF and injection of proteins partially resolved by CIEF to CGE capillaries for parallel CGE separations in a continuous and automated fashion. Capillaries are attached to the other two chips to facilitate CIEF and CGE separations and to extend the effective lengths of CGE columns. Specifically, we illustrate the working principle of the hybrid device, develop protocols for producing and preparing the hybrid device, and demonstrate the feasibility of using this hybrid device for automated injection of CIEF-separated sample to parallel CGE for 2D protein separations. Potentials and problems associated with the hybrid device are also discussed. PMID:22830584

  11. Impact of Implementation of an Automated Liquid Culture System on Diagnosis of Tuberculous Pleurisy.

    PubMed

    Lee, Byung Hee; Yoon, Seong Hoon; Yeo, Hye Ju; Kim, Dong Wan; Lee, Seung Eun; Cho, Woo Hyun; Lee, Su Jin; Kim, Yun Seong; Jeon, Doosoo

    2015-07-01

    This study was conducted to evaluate the impact of implementation of an automated liquid culture system on the diagnosis of tuberculous pleurisy in an HIV-uninfected patient population. We retrospectively compared the culture yield, time to positivity, and contamination rate of pleural effusion samples in the BACTEC Mycobacteria Growth Indicator Tube 960 (MGIT) and Ogawa media among patients with tuberculous pleurisy. Out of 104 effusion samples, 43 (41.3%) were culture positive on either the MGIT or the Ogawa media. The culture yield of MGIT was higher (40.4%, 42/104) than that of Ogawa media (18.3%, 19/104) (P<0.001). One of the samples was positive only on the Ogawa medium. The median time to positivity was shorter in the MGIT (18 days, range 8-32 days) than in the Ogawa media (37 days, range 20-59 days) (P<0.001). No contamination or growth of nontuberculous mycobacterium was observed on either of the culture media. In conclusion, the automated liquid culture system could provide approximately twice the yield and faster results in effusion culture, compared to solid media. Supplemental solid media may have a limited impact on maximizing sensitivity in effusion culture; however, further studies are required.

  12. Manufacture of a human mesenchymal stem cell population using an automated cell culture platform.

    PubMed

    Thomas, Robert James; Chandra, Amit; Liu, Yang; Hourd, Paul C; Conway, Paul P; Williams, David J

    2007-09-01

    Tissue engineering and regenerative medicine are rapidly developing fields that use cells or cell-based constructs as therapeutic products for a wide range of clinical applications. Efforts to commercialise these therapies are driving a need for capable, scalable manufacturing technologies to ensure therapies are able to meet regulatory requirements and are economically viable at industrial scale production. We report the first automated expansion of a human bone marrow derived mesenchymal stem cell population (hMSCs) using a fully automated cell culture platform. Differences in cell population growth profile, attributed to key methodological differences, were observed between the automated protocol and a benchmark manual protocol. However, qualitatively similar cell output, assessed by cell morphology and the expression of typical hMSC markers, was obtained from both systems. Furthermore, the critical importance of minor process variation, e.g. the effect of cell seeding density on characteristics such as population growth kinetics and cell phenotype, was observed irrespective of protocol type. This work highlights the importance of careful process design in therapeutic cell manufacture and demonstrates the potential of automated culture for the future optimisation and scale up studies required for the translation of regenerative medicine products from the laboratory to the clinic.

  13. Robotic platform for parallelized cultivation and monitoring of microbial growth parameters in microwell plates.

    PubMed

    Knepper, Andreas; Heiser, Michael; Glauche, Florian; Neubauer, Peter

    2014-12-01

    The enormous number of possible bioprocess variations makes it challenging for process development to fix a commercial process within cost and time constraints. Although some cultivation systems and some devices for unit operations combine the latest technology in miniaturization, parallelization, and sensing, the degree of automation in upstream and downstream bioprocess development is still limited to single steps. We aim to face this challenge with an interdisciplinary approach to significantly shorten development times and costs. As a first step, we scaled down analytical assays to the microliter scale and created automated procedures for starting the cultivation and monitoring the optical density (OD), pH, concentrations of glucose and acetate in the culture medium, and product formation in fed-batch cultures in the 96-well format. Then, the separate measurements of pH, OD, and concentrations of acetate and glucose were combined into one method. This method enables automated process monitoring at dedicated intervals (e.g., also during the night). By this approach, we managed to increase the information content of cultivations in 96-microwell plates, thus turning them into a suitable tool for high-throughput bioprocess development. Here, we present the flowcharts as well as cultivation data of our automation approach. © 2014 Society for Laboratory Automation and Screening.

  14. A comparison of long-term parallel measurements of sunshine duration obtained with a Campbell-Stokes sunshine recorder and two automated sunshine sensors

    NASA Astrophysics Data System (ADS)

    Baumgartner, D. J.; Pötzi, W.; Freislich, H.; Strutzmann, H.; Veronig, A. M.; Foelsche, U.; Rieder, H. E.

    2017-06-01

    In recent decades, automated sensors for sunshine duration (SD) measurements have been introduced in meteorological networks, thereby replacing traditional instruments, most prominently the Campbell-Stokes (CS) sunshine recorder. Parallel records of automated and traditional SD recording systems are rare. Nevertheless, such records are important to understand the differences/similarities in SD totals obtained with different instruments and how changes in monitoring device type affect the homogeneity of SD records. This study investigates the differences/similarities in parallel SD records obtained with a CS and two automated SD sensors between 2007 and 2016 at the Kanzelhöhe Observatory, Austria. Comparing individual records of daily SD totals, we find differences of both positive and negative sign, with smallest differences between the automated sensors. The larger differences between CS-derived SD totals and those from automated sensors can be attributed (largely) to the higher sensitivity threshold of the CS instrument. Correspondingly, the closest agreement among all sensors is found during summer, the time of year when sensitivity thresholds are least critical. Furthermore, we investigate the performance of various models to create the so-called sensor-type-equivalent (STE) SD records. Our analysis shows that regression models including all available data on daily (or monthly) time scale perform better than simple three- (or four-) point regression models. Despite general good performance, none of the considered regression models (of linear or quadratic form) emerges as the "optimal" model. Although STEs prove useful for relating SD records of individual sensors on daily/monthly time scales, this does not ensure that STE (or joint) records can be used for trend analysis.
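
    A sensor-type-equivalent record is, in essence, a regression of one sensor's totals onto another's. The sketch below compares the linear and quadratic forms the paper evaluates, on synthetic daily data with an assumed relationship:

```python
# Sketch of an STE model: regress Campbell-Stokes daily sunshine totals
# onto the automated sensor's totals. Data below are synthetic.
import numpy as np

rng = np.random.default_rng(3)
auto_sd = rng.uniform(0, 14, 365)                    # automated sensor (h)
cs_sd = 0.92 * auto_sd - 0.3 + rng.normal(0, 0.4, 365)
cs_sd = np.clip(cs_sd, 0, None)                      # Campbell-Stokes (h)

for deg, name in ((1, "linear"), (2, "quadratic")):
    coeff = np.polyfit(auto_sd, cs_sd, deg)
    pred = np.polyval(coeff, auto_sd)
    rmse = np.sqrt(np.mean((pred - cs_sd) ** 2))
    print(f"{name:9s} STE model: RMSE = {rmse:.3f} h")
```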

  15. Study of living single cells in culture: automated recognition of cell behavior.

    PubMed

    Bodin, P; Papin, S; Meyer, C; Travo, P

    1988-07-01

    An automated system capable of analyzing the behavior, in real time, of single living cells in culture, in a noninvasive and nondestructive way, has been developed. A large number of cell positions in single culture dishes were recorded using a computer-controlled, robotized microscope. During subsequent observations, binary images obtained from video image analysis of the microscope visual field allowed the identification of the recorded cells. These cells could be revisited automatically every few minutes. Long-term studies of the behavior of cells make possible the analysis of cellular locomotory and mitotic activities as well as determination of cell shape (chosen from a defined library) for several hours or days in a fully automated way, with observations spaced up to 30 minutes apart. Short-term studies of the behavior of cells permit the study, in a semiautomatic way, of the acute effects of drugs (5 to 15 minutes) on changes in surface area and length of cells.

  16. Swab culture monitoring of automated endoscope reprocessors after high-level disinfection

    PubMed Central

    Lu, Lung-Sheng; Wu, Keng-Liang; Chiu, Yi-Chun; Lin, Ming-Tzung; Hu, Tsung-Hui; Chiu, King-Wah

    2012-01-01

    AIM: To conduct a bacterial culture study for monitoring decontamination of automated endoscope reprocessors (AERs) after high-level disinfection (HLD). METHODS: From February 2006 to January 2011, the authors conducted randomized consecutive sampling each month for 7 AERs. The authors collected a total of 420 swab cultures, including 300 cultures from 5 gastroscope AERs and 120 cultures from 2 colonoscope AERs. Swab cultures were obtained from the residual water in the AERs after a full reprocessing cycle. Samples were cultured to test for aerobic bacteria, anaerobic bacteria, and Mycobacterium tuberculosis. RESULTS: The positive culture rate was 2.0% (6/300) for gastroscope AERs and 0.8% (1/120) for colonoscope AERs. All the positive cultures, including 6 from gastroscope and 1 from colonoscope AERs, showed monofloral colonization. Of the positive gastroscope AER samples, 50% (3/6) were colonized by aerobic bacteria and 50% (3/6) by fungi. CONCLUSION: A full reprocessing cycle of an AER with HLD is adequate for disinfection of the machine. Swab culture is a useful method for monitoring AER decontamination after each reprocessing cycle. Fungal contamination of AERs after reprocessing should also be kept in mind. PMID:22529696

  17. Anthropology and cultural neuroscience: creating productive intersections in parallel fields.

    PubMed

    Brown, R A; Seligman, R

    2009-01-01

    Partly due to the failure of anthropology to productively engage the fields of psychology and neuroscience, investigations in cultural neuroscience have occurred largely without the active involvement of anthropologists or anthropological theory. Dramatic advances in the tools and findings of social neuroscience have emerged in parallel with significant advances in anthropology that connect social and political-economic processes with fine-grained descriptions of individual experience and behavior. We describe four domains of inquiry that follow from these recent developments, and provide suggestions for intersections between anthropological tools - such as social theory, ethnography, and quantitative modeling of cultural models - and cultural neuroscience. These domains are: the sociocultural construction of emotion, status and dominance, the embodiment of social information, and the dual social and biological nature of ritual. Anthropology can help locate unique or interesting populations and phenomena for cultural neuroscience research. Anthropological tools can also help "drill down" to investigate key socialization processes accountable for cross-group differences. Furthermore, anthropological research points at meaningful underlying complexity in assumed relationships between social forces and biological outcomes. Finally, ethnographic knowledge of cultural content can aid with the development of ecologically relevant stimuli for use in experimental protocols.

  19. Automation of large scale transient protein expression in mammalian cells

    PubMed Central

    Zhao, Yuguang; Bishop, Benjamin; Clay, Jordan E.; Lu, Weixian; Jones, Margaret; Daenke, Susan; Siebold, Christian; Stuart, David I.; Yvonne Jones, E.; Radu Aricescu, A.

    2011-01-01

    Traditional mammalian expression systems rely on the time-consuming generation of stable cell lines; this is difficult to accommodate within a modern structural biology pipeline. Transient transfections are a fast, cost-effective solution, but require skilled cell culture scientists, making manpower a limiting factor in a setting where numerous samples are processed in parallel. Here we report a strategy employing a customised CompacT SelecT cell culture robot allowing the large-scale expression of multiple protein constructs in a transient format. Successful protocols have been designed for automated transient transfection of human embryonic kidney (HEK) 293T and 293S GnTI− cells in various flask formats. Protein yields obtained by this method were similar to those obtained manually, with the added benefit of reproducibility, regardless of user. Automation of cell maintenance and transient transfection allows the expression of high quality recombinant protein in a completely sterile environment with limited support from a cell culture scientist. The reduction in human input has the added benefit of enabling continuous cell maintenance and protein production, features of particular importance to structural biology laboratories, which typically use large quantities of pure recombinant proteins, and often require rapid characterisation of a series of modified constructs. This automated method for large scale transient transfection is now offered as a Europe-wide service via the P-cube initiative. PMID:21571074

  20. Human mixed lymphocyte cultures. Evaluation of microculture technique utilizing the multiple automated sample harvester (MASH)

    PubMed Central

    Thurman, G. B.; Strong, D. M.; Ahmed, A.; Green, S. S.; Sell, K. W.; Hartzman, R. J.; Bach, F. H.

    1973-01-01

    Use of lymphocyte cultures for in vitro studies such as pretransplant histocompatibility testing has established the need for standardization of this technique. A microculture technique has been developed that has facilitated the culturing of lymphocytes and increased the number of cultures that can feasibly be run, while lowering the variation between replicate samples. Cultures were prepared for determination of tritiated thymidine incorporation using a Multiple Automated Sample Harvester (MASH). Using this system, the parameters that influence the in vitro responsiveness of human lymphocytes to allogeneic lymphocytes have been investigated. PMID:4271568

  1. "Parallel Leadership in an "Unparallel" World"--Cultural Constraints on the Transferability of Western Educational Leadership Theories across Cultures

    ERIC Educational Resources Information Center

    Goh, Jonathan Wee Pin

    2009-01-01

    With the global economy becoming more integrated, the issues of cross-cultural relevance and transferability of leadership theories and practices have become increasingly urgent. Drawing upon the concept of parallel leadership in schools proposed by Crowther, Kaagan, Ferguson, and Hann as an example, the purpose of this paper is to examine the…

  2. Pursuing Darwin's curious parallel: Prospects for a science of cultural evolution.

    PubMed

    Mesoudi, Alex

    2017-07-24

    In the past few decades, scholars from several disciplines have pursued the curious parallel noted by Darwin between the genetic evolution of species and the cultural evolution of beliefs, skills, knowledge, languages, institutions, and other forms of socially transmitted information. Here, I review current progress in the pursuit of an evolutionary science of culture that is grounded in both biological and evolutionary theory, but also treats culture as more than a proximate mechanism that is directly controlled by genes. Both genetic and cultural evolution can be described as systems of inherited variation that change over time in response to processes such as selection, migration, and drift. Appropriate differences between genetic and cultural change are taken seriously, such as the possibility in the latter of nonrandomly guided variation or transformation, blending inheritance, and one-to-many transmission. The foundation of cultural evolution was laid in the late 20th century with population-genetic style models of cultural microevolution, and the use of phylogenetic methods to reconstruct cultural macroevolution. Since then, there have been major efforts to understand the sociocognitive mechanisms underlying cumulative cultural evolution, the consequences of demography on cultural evolution, the empirical validity of assumed social learning biases, the relative role of transformative and selective processes, and the use of quantitative phylogenetic and multilevel selection models to understand past and present dynamics of society-level change. I conclude by highlighting the interdisciplinary challenges of studying cultural evolution, including its relation to the traditional social sciences and humanities.

  3. Simplified Automated Image Analysis for Detection and Phenotyping of Mycobacterium tuberculosis on Porous Supports by Monitoring Growing Microcolonies

    PubMed Central

    den Hertog, Alice L.; Visser, Dennis W.; Ingham, Colin J.; Fey, Frank H. A. G.; Klatser, Paul R.; Anthony, Richard M.

    2010-01-01

    Background Even with the advent of nucleic acid (NA) amplification technologies, the culture of mycobacteria for diagnostic and other applications remains of critical importance. Notably, microscopic-observation drug susceptibility (MODS) testing, as opposed to traditional culture on solid media or automated liquid culture, has shown potential to both speed up and increase the provision of mycobacterial culture in high-burden settings. Methods Here we explore the growth of Mycobacterium tuberculosis microcolonies, imaged by automated digital microscopy and cultured on porous aluminium oxide (PAO) supports. Repeated imaging during colony growth greatly simplifies “computer vision”, and presumptive identification of microcolonies was achieved here using existing publicly available algorithms. Our system thus allows the growth of individual microcolonies to be monitored and, critically, also allows the media to be changed during the growth phase without disrupting the microcolonies. Transfer of identified microcolonies onto selective media allowed us, within 1-2 bacterial generations, to rapidly detect the drug susceptibility of individual microcolonies, eliminating the need for time-consuming subculturing or the inoculation of multiple parallel cultures. Significance Monitoring the phenotype of individual microcolonies as they grow has immense potential for research, screening, and ultimately M. tuberculosis diagnostic applications. The method described is particularly appealing with respect to speed and automation. PMID:20544033

  4. Miniaturized Mass-Spectrometry-Based Analysis System for Fully Automated Examination of Conditioned Cell Culture Media

    PubMed Central

    Weber, Emanuel; Pinkse, Martijn W. H.; Bener-Aksam, Eda; Vellekoop, Michael J.; Verhaert, Peter D. E. M.

    2012-01-01

    We present a fully automated setup for performing in-line mass spectrometry (MS) analysis of conditioned media in cell cultures, focusing in particular on the peptides therein. The goal is to assess peptides secreted by cells under different culture conditions. The developed system is compatible with MS as the analytical technique, as this is one of the most powerful analysis methods for peptide detection and identification. Proof of concept was achieved using the well-known mating-factor signaling in baker's yeast, Saccharomyces cerevisiae. Our concept system holds 1 mL of cell culture medium and allows maintaining a yeast culture for at least 40 hours with continuous supernatant extraction (and medium replenishing). The device's small dimensions result in reduced costs for reagents and open perspectives towards full integration on-chip. Experimental data that can be obtained are time-resolved peptide profiles in a yeast culture, including information about the appearance of mating-factor-related peptides. We emphasize that the system operates without any manual intervention or pipetting steps, which allows for improved overall sensitivity compared to non-automated alternatives. MS data confirmed previously reported aspects of the physiology of the yeast-mating process. Moreover, mating-factor breakdown products (as well as evidence for a potentially responsible protease) were found. PMID:23091722

  5. The automated counting of beating rates in individual cultured heart cells.

    PubMed

    Collins, G A; Dower, R; Walker, M J

    1981-12-01

    The effect of drugs on the beating rate of cultured heart cells can be monitored in a number of ways. The simultaneous automated measurement of the beating rates of a number of cells allows drug effects to be rapidly quantified. A photoresistive detector placed on a television image of a cell, when coupled to operational amplifiers, gives binary signals that can be processed by a microprocessor. On this basis, we have devised a system that is capable of simultaneously monitoring the individual beating of six single cultured heart cells. A microprocessor automatically processes the data obtained under different experimental conditions and records the results in suitable descriptive formats such as dose-response curves and double reciprocal plots.
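
    The counting step described above reduces, in essence, to detecting rising edges in a binary beating/not-beating signal per cell. A minimal sketch of that logic, assuming six channels and an arbitrary 50 Hz sample rate (both illustrative assumptions, not taken from the paper):

```python
import numpy as np

def beating_rates(signals: np.ndarray, sample_rate_hz: float) -> np.ndarray:
    """signals: (n_cells, n_samples) array of binary detector outputs.
    A beat is counted as a 0 -> 1 transition (rising edge)."""
    rising_edges = (np.diff(signals, axis=1) == 1).sum(axis=1)
    duration_min = signals.shape[1] / sample_rate_hz / 60.0
    return rising_edges / duration_min

# Example: six identical cells beating at ~1.5 Hz, observed for 30 s at 50 Hz
t = np.arange(0, 30, 1 / 50.0)
trace = (np.sin(2 * np.pi * 1.5 * t) > 0).astype(int)
print(beating_rates(np.tile(trace, (6, 1)), sample_rate_hz=50.0))  # ~90 beats/min
```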

  6. The RABiT: a rapid automated biodosimetry tool for radiological triage. II. Technological developments.

    PubMed

    Garty, Guy; Chen, Youhua; Turner, Helen C; Zhang, Jian; Lyulko, Oleksandra V; Bertucci, Antonella; Xu, Yanping; Wang, Hongliang; Simaan, Nabil; Randers-Pehrson, Gerhard; Lawrence Yao, Y; Brenner, David J

    2011-08-01

    Over the past five years the Center for Minimally Invasive Radiation Biodosimetry at Columbia University has developed the Rapid Automated Biodosimetry Tool (RABiT), a completely automated, ultra-high-throughput biodosimetry workstation. This paper describes recent upgrades and reliability testing of the RABiT. The RABiT analyses fingerstick-derived blood samples to estimate past radiation exposure or to identify individuals exposed above or below a cut-off dose. Through automated robotics, lymphocytes are extracted from fingerstick blood samples into filter-bottomed multi-well plates. Depending on the time since exposure, the RABiT scores either micronuclei or phosphorylation of the histone H2AX, in an automated robotic system, using filter-bottomed multi-well plates. Following lymphocyte culturing, fixation and staining, the filter bottoms are removed from the multi-well plates and sealed prior to automated high-speed imaging. Image analysis is performed online using dedicated image processing hardware. Both the sealed filters and the images are archived. We have developed a new robotic system for lymphocyte processing, making use of increased laser power and parallel processing of four capillaries at once. This system has allowed acceleration of lymphocyte isolation, the main bottleneck of RABiT operation, from 12 to 2 sec/sample. Reliability tests have been performed on all robotic subsystems. Parallel handling of multiple samples through the use of dedicated, purpose-built robotics and high-speed imaging allows analysis of up to 30,000 samples per day.

  7. Evaluation of negative results of BacT/Alert 3D automated blood culture system.

    PubMed

    Kocoglu, M Esra; Bayram, Aysen; Balci, Iclal

    2005-06-01

    Although automated continuous-monitoring blood culture systems are both rapid and sensitive, false-positive and false-negative results still occur. The objective of this study, then, was to evaluate negative results occurring with the BacT/Alert 3D blood culture system. A total of 1032 samples were cultured with the BacT/Alert 3D automated blood culture system, using both aerobic (FA) and anaerobic (FN) media, and 128 of these samples yielded positive results. A total of 904 negative blood samples were then subcultured on 5% sheep blood agar, eosin methylene blue agar, chocolate agar, and Sabouraud dextrose agar. Organisms growing on these subcultures were subsequently identified using both Vitek32 (bioMerieux, Durham, NC) and conventional methods. Twenty-four (2.6%) of the 904 subcultures grew on the subculture media. The majority (83.3%) of these were gram-positive microorganisms. Fourteen (58.3%) were coagulase-negative staphylococci, two (8.3%) were Bacillus spp., one (4.2%) was Staphylococcus aureus, and one (4.2%) was identified as Enterococcus faecium. Streptococcus pneumoniae and Neisseria spp. were isolated together in two (8.3%) vials. Gram-negative microorganisms comprised 12.5% of the subcultures, of which two (8.3%) were Pseudomonas aeruginosa and one (4.2%) was Pseudomonas fluorescens. The remaining isolate (4.2%) was identified as Candida albicans. We conclude that subculture of negative results is valuable in the BacT/Alert 3D system, especially in situations in which only one set of blood cultures is taken.

  8. Parallel workflow tools to facilitate human brain MRI post-processing

    PubMed Central

    Cui, Zaixu; Zhao, Chenxi; Gong, Gaolang

    2015-01-01

    Multi-modal magnetic resonance imaging (MRI) techniques are widely applied in human brain studies. To obtain specific brain measures of interest from MRI datasets, a number of complex image post-processing steps are typically required. Parallel workflow tools have recently been developed, concatenating individual processing steps and enabling fully automated processing of raw MRI data to obtain the final results. These workflow tools are also designed to make optimal use of available computational resources and to support the parallel processing of different subjects or of independent processing steps for a single subject. Automated, parallel MRI post-processing tools can greatly facilitate relevant brain investigations and are being increasingly applied. In this review, we briefly summarize these parallel workflow tools and discuss relevant issues. PMID:26029043

  9. Pursuing Darwin’s curious parallel: Prospects for a science of cultural evolution

    PubMed Central

    2017-01-01

    In the past few decades, scholars from several disciplines have pursued the curious parallel noted by Darwin between the genetic evolution of species and the cultural evolution of beliefs, skills, knowledge, languages, institutions, and other forms of socially transmitted information. Here, I review current progress in the pursuit of an evolutionary science of culture that is grounded in both biological and evolutionary theory, but also treats culture as more than a proximate mechanism that is directly controlled by genes. Both genetic and cultural evolution can be described as systems of inherited variation that change over time in response to processes such as selection, migration, and drift. Appropriate differences between genetic and cultural change are taken seriously, such as the possibility in the latter of nonrandomly guided variation or transformation, blending inheritance, and one-to-many transmission. The foundation of cultural evolution was laid in the late 20th century with population-genetic style models of cultural microevolution, and the use of phylogenetic methods to reconstruct cultural macroevolution. Since then, there have been major efforts to understand the sociocognitive mechanisms underlying cumulative cultural evolution, the consequences of demography on cultural evolution, the empirical validity of assumed social learning biases, the relative role of transformative and selective processes, and the use of quantitative phylogenetic and multilevel selection models to understand past and present dynamics of society-level change. I conclude by highlighting the interdisciplinary challenges of studying cultural evolution, including its relation to the traditional social sciences and humanities. PMID:28739929

  10. Diagnostic performance of automated liquid culture and molecular line probe assay in smear-negative pulmonary tuberculosis.

    PubMed

    Kotwal, Aarti; Biswas, Debasis; Raghuvanshi, Shailendra; Sindhwani, Girish; Kakati, Barnali; Sharma, Shweta

    2017-04-01

    The diagnosis of smear-negative pulmonary tuberculosis (PTB) is particularly challenging, and automated liquid culture and molecular line probe assays (LPA) may prove particularly useful. The objective of our study was to evaluate the diagnostic potential of automated liquid culture (ALC) technology and commercial LPA in sputum smear-negative PTB suspects. Spot sputum samples were collected from 145 chest-symptomatic smear-negative patients and subjected to ALC, direct drug susceptibility testing (DST) and LPA, as per the manufacturers' instructions. A diagnostic yield of 26.2% was observed among sputum smear-negative TB suspects, with 47.4% of the culture isolates being either INH- and/or rifampicin-resistant. Complete agreement was observed between the results of the ALC assay and LPA, except for two isolates which demonstrated sensitivity to INH and rifampicin at direct DST but were rifampicin-resistant in LPA. Two novel mutations were also detected among the multidrug-resistant isolates by LPA. In view of the challenges associated with the diagnosis of TB in sputum smear-negative patients, our study demonstrates the applicability of ALC and LPA in establishing diagnostic evidence of TB.

  11. The RABiT: A Rapid Automated Biodosimetry Tool For Radiological Triage. II. Technological Developments

    PubMed Central

    Garty, Guy; Chen, Youhua; Turner, Helen; Zhang, Jian; Lyulko, Oleksandra; Bertucci, Antonella; Xu, Yanping; Wang, Hongliang; Simaan, Nabil; Randers-Pehrson, Gerhard; Yao, Y. Lawrence; Brenner, David J.

    2011-01-01

    Purpose Over the past five years the Center for Minimally Invasive Radiation Biodosimetry at Columbia University has developed the Rapid Automated Biodosimetry Tool (RABiT), a completely automated, ultra-high-throughput biodosimetry workstation. This paper describes recent upgrades and reliability testing of the RABiT. Materials and methods The RABiT analyzes fingerstick-derived blood samples to estimate past radiation exposure or to identify individuals exposed above or below a cutoff dose. Through automated robotics, lymphocytes are extracted from fingerstick blood samples into filter-bottomed multi-well plates. Depending on the time since exposure, the RABiT scores either micronuclei or phosphorylation of the histone H2AX, in an automated robotic system, using filter-bottomed multi-well plates. Following lymphocyte culturing, fixation and staining, the filter bottoms are removed from the multi-well plates and sealed prior to automated high-speed imaging. Image analysis is performed online using dedicated image processing hardware. Both the sealed filters and the images are archived. Results We have developed a new robotic system for lymphocyte processing, making use of increased laser power and parallel processing of four capillaries at once. This system has allowed acceleration of lymphocyte isolation, the main bottleneck of RABiT operation, from 12 to 2 sec/sample. Reliability tests have been performed on all robotic subsystems. Conclusions Parallel handling of multiple samples through the use of dedicated, purpose-built robotics and high-speed imaging allows analysis of up to 30,000 samples per day. PMID:21557703

  12. Process development of human multipotent stromal cell microcarrier culture using an automated high-throughput microbioreactor.

    PubMed

    Rafiq, Qasim A; Hanga, Mariana P; Heathman, Thomas R J; Coopman, Karen; Nienow, Alvin W; Williams, David J; Hewitt, Christopher J

    2017-10-01

    Microbioreactors play a critical role in process development as they reduce reagent requirements and can facilitate high-throughput screening of process parameters and culture conditions. Here, we have demonstrated and explained in detail, for the first time, the amenability of the automated ambr15 cell culture microbioreactor system for the development of scalable adherent human mesenchymal multipotent stromal/stem cell (hMSC) microcarrier culture processes. This was achieved by first improving suspension and mixing of the microcarriers and then improving cell attachment, thereby reducing the initial growth lag phase. The latter was achieved by using only 50% of the final working volume of medium for the first 24 h and using an intermittent agitation strategy. These changes resulted in a >150% increase in viable cell density after 24 h compared to the original process (no agitation for 24 h and 100% working volume). Using the same methodology as in the ambr15, similar improvements were obtained with larger scale spinner flask studies. Finally, this improved bioprocess methodology based on a serum-based medium was applied to a serum-free process in the ambr15, resulting in a >250% increase in yield compared to the serum-based process. At both scales, the agitation used during culture was the minimum required for microcarrier suspension, NJS. The use of the ambr15, with its improved control compared to the spinner flask, reduced the coefficient of variation on viable cell density in the serum-containing medium from 7.65% to 4.08%, and the switch to serum-free further reduced these to 1.06-0.54%, respectively. The combination of both serum-free and automated processing improved the reproducibility more than 10-fold compared to the serum-based, manual spinner flask process. The findings of this study demonstrate that the ambr15 microbioreactor is an effective tool for bioprocess development of hMSC microcarrier cultures and that a combination of serum-free medium

  13. Process development of human multipotent stromal cell microcarrier culture using an automated high‐throughput microbioreactor

    PubMed Central

    Hanga, Mariana P.; Heathman, Thomas R. J.; Coopman, Karen; Nienow, Alvin W.; Williams, David J.; Hewitt, Christopher J.

    2017-01-01

    ABSTRACT Microbioreactors play a critical role in process development as they reduce reagent requirements and can facilitate high‐throughput screening of process parameters and culture conditions. Here, we have demonstrated and explained in detail, for the first time, the amenability of the automated ambr15 cell culture microbioreactor system for the development of scalable adherent human mesenchymal multipotent stromal/stem cell (hMSC) microcarrier culture processes. This was achieved by first improving suspension and mixing of the microcarriers and then improving cell attachment thereby reducing the initial growth lag phase. The latter was achieved by using only 50% of the final working volume of medium for the first 24 h and using an intermittent agitation strategy. These changes resulted in >150% increase in viable cell density after 24 h compared to the original process (no agitation for 24 h and 100% working volume). Using the same methodology as in the ambr15, similar improvements were obtained with larger scale spinner flask studies. Finally, this improved bioprocess methodology based on a serum‐based medium was applied to a serum‐free process in the ambr15, resulting in >250% increase in yield compared to the serum‐based process. At both scales, the agitation used during culture was the minimum required for microcarrier suspension, NJS. The use of the ambr15, with its improved control compared to the spinner flask, reduced the coefficient of variation on viable cell density in the serum containing medium from 7.65% to 4.08%, and the switch to serum free further reduced these to 1.06–0.54%, respectively. The combination of both serum‐free and automated processing improved the reproducibility more than 10‐fold compared to the serum‐based, manual spinner flask process. The findings of this study demonstrate that the ambr15 microbioreactor is an effective tool for bioprocess development of hMSC microcarrier cultures and that a combination

  14. An Automated HIV-1 Env-Pseudotyped Virus Production for Global HIV Vaccine Trials

    PubMed Central

    Fuss, Martina; Mazzotta, Angela S.; Sarzotti-Kelsoe, Marcella; Ozaki, Daniel A.; Montefiori, David C.; von Briesen, Hagen; Zimmermann, Heiko; Meyerhans, Andreas

    2012-01-01

    Background Infections with HIV still represent a major human health problem worldwide and a vaccine is the only long-term option to fight efficiently against this virus. Standardized assessments of HIV-specific immune responses in vaccine trials are essential for prioritizing vaccine candidates in preclinical and clinical stages of development. With respect to neutralizing antibodies, assays with HIV-1 Env-pseudotyped viruses are a high priority. To cover the increasing demand for HIV pseudoviruses, a complete cell culture and transfection automation system has been developed. Methodology/Principal Findings The automation system for HIV pseudovirus production comprises a modified Tecan-based Cellerity system. It covers an area of 5×3 meters and includes a robot platform, a cell counting machine, a CO2 incubator for cell cultivation and a media refrigerator. The processes for cell handling, transfection and pseudovirus production have been implemented according to manual standard operating procedures and are controlled and scheduled autonomously by the system. The system is housed in a biosafety level II cabinet that guarantees protection of personnel, environment and the product. HIV pseudovirus stocks at scales from 140 ml to 1000 ml have been produced on the automated system. Parallel manual production of HIV pseudoviruses and comparisons (bridging assays) confirmed that the automatically produced pseudoviruses were of quality equivalent to those produced manually. In addition, the automated method was fully validated according to Good Clinical Laboratory Practice (GCLP) guidelines, including the validation parameters accuracy, precision, robustness and specificity. Conclusions An automated HIV pseudovirus production system has been successfully established. It allows the high quality production of HIV pseudoviruses under GCLP conditions. In its present form, the installed module enables the production of 1000 ml of virus-containing cell culture supernatant per

  15. A Semi-Automated Point Cloud Processing Methodology for 3D Cultural Heritage Documentation

    NASA Astrophysics Data System (ADS)

    Kıvılcım, C. Ö.; Duran, Z.

    2016-06-01

    The preliminary phase in any architectural heritage project is to obtain metric measurements and documentation of the building and its individual elements. On the other hand, conventional measurement techniques require tremendous resources and lengthy project completion times for architectural surveys and 3D model production. Over the past two decades, the widespread use of laser scanning and digital photogrammetry has significantly altered the heritage documentation process. Furthermore, advances in these technologies have enabled robust data collection and reduced user workload for generating various levels of products, from single buildings to expansive cityscapes. More recently, the use of procedural modelling methods and BIM-relevant applications for historic building documentation has become an active area of research; however, fully automated cultural heritage documentation remains an open problem. In this paper, we present a semi-automated methodology for 3D façade modelling of cultural heritage assets, based on parametric and procedural modelling techniques and using airborne and terrestrial laser scanning data. We demonstrate the contribution of our methodology, implemented in an open-source software environment, using the example project of a 16th-century early classical era Ottoman structure, Sinan the Architect's Şehzade Mosque in Istanbul, Turkey.

  16. Wire-Guide Manipulator For Automated Welding

    NASA Technical Reports Server (NTRS)

    Morris, Tim; White, Kevin; Gordon, Steve; Emerich, Dave; Richardson, Dave; Faulkner, Mike; Stafford, Dave; Mccutcheon, Kim; Neal, Ken; Milly, Pete

    1994-01-01

    Compact motor drive positions guide for welding filler wire. Drive part of automated wire feeder in partly or fully automated welding system. Drive unit contains three parallel subunits. Rotations of lead screws in three subunits coordinated to obtain desired motions in three degrees of freedom. Suitable for both variable-polarity plasma arc welding and gas/tungsten arc welding.

  17. Automating the selection of standard parallels for conic map projections

    NASA Astrophysics Data System (ADS)

    Šavrič, Bojan; Jenny, Bernhard

    2016-05-01

    Conic map projections are appropriate for mapping regions at medium and large scales with east-west extents at intermediate latitudes. Conic projections are appropriate for these cases because they show the mapped area with less distortion than other projections. In order to minimize the distortion of the mapped area, the two standard parallels of conic projections need to be selected carefully. Rules of thumb exist for placing the standard parallels based on the width-to-height ratio of the map. These rules of thumb are simple to apply, but do not result in maps with minimum distortion. There also exist more sophisticated methods that determine standard parallels such that distortion in the mapped area is minimized. These methods are computationally expensive and cannot be used for real-time web mapping and GIS applications where the projection is adjusted automatically to the displayed area. This article presents a polynomial model that quickly provides the standard parallels for the three most common conic map projections: the Albers equal-area, the Lambert conformal, and the equidistant conic projection. The model defines the standard parallels with polynomial expressions based on the spatial extent of the mapped area. The spatial extent is defined by the length of the mapped central meridian segment, the central latitude of the displayed area, and the width-to-height ratio of the map. The polynomial model was derived from 3825 maps, each with a different spatial extent and with computationally determined standard parallels that minimize the mean scale distortion index. The resulting model is computationally simple and can be used for the automatic selection of the standard parallels of conic map projections in GIS software and web mapping applications.
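
    For comparison with the fitted polynomial model (whose coefficients are not reproduced in the abstract), here is a minimal sketch of one classic rule of thumb for placing standard parallels, which puts them one-sixth of the latitude range inside the southern and northern map limits:

```python
def one_sixth_rule(lat_south: float, lat_north: float) -> tuple[float, float]:
    """Place the two standard parallels 1/6 of the latitude range inside
    the southern and northern map limits (all values in degrees)."""
    span = lat_north - lat_south
    return lat_south + span / 6.0, lat_north - span / 6.0

# Example: conterminous United States, roughly 25N-49N
print(one_sixth_rule(25.0, 49.0))  # -> (29.0, 45.0)
```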

  18. Automated Interpretation of Blood Culture Gram Stains by Use of a Deep Convolutional Neural Network.

    PubMed

    Smith, Kenneth P; Kang, Anthony D; Kirby, James E

    2018-03-01

    Microscopic interpretation of stained smears is one of the most operator-dependent and time-intensive activities in the clinical microbiology laboratory. Here, we investigated application of an automated image acquisition and convolutional neural network (CNN)-based approach for automated Gram stain classification. Using an automated microscopy platform, uncoverslipped slides were scanned with a 40× dry objective, generating images of sufficient resolution for interpretation. We collected 25,488 images from positive blood culture Gram stains prepared during routine clinical workup. These images were used to generate 100,213 crops containing Gram-positive cocci in clusters, Gram-positive cocci in chains/pairs, Gram-negative rods, or background (no cells). These categories were targeted for proof-of-concept development as they are associated with the majority of bloodstream infections. Our CNN model achieved a classification accuracy of 94.9% on a test set of image crops. Receiver operating characteristic (ROC) curve analysis indicated a robust ability to differentiate between categories with an area under the curve of >0.98 for each. After training and validation, we applied the classification algorithm to new images collected from 189 whole slides without human intervention. Sensitivity and specificity were 98.4% and 75.0% for Gram-positive cocci in chains and pairs, 93.2% and 97.2% for Gram-positive cocci in clusters, and 96.3% and 98.1% for Gram-negative rods. Taken together, our data support a proof of concept for a fully automated classification methodology for blood-culture Gram stains. Importantly, the algorithm was highly adept at identifying image crops with organisms and could be used to present prescreened, classified crops to technologists to accelerate smear review. This concept could potentially be extended to all Gram stain interpretive activities in the clinical laboratory. Copyright © 2018 American Society for Microbiology.
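
    A minimal sketch of a CNN crop classifier in the spirit of the study, assuming 64×64 RGB crops and four output classes; the architecture and layer sizes are illustrative placeholders, not the authors' model:

```python
import torch
import torch.nn as nn

class GramStainCNN(nn.Module):
    """Tiny CNN for 4 classes: GPC clusters, GPC chains/pairs, GNR, background."""
    def __init__(self, n_classes: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(64 * 8 * 8, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

model = GramStainCNN()
logits = model(torch.randn(8, 3, 64, 64))  # a batch of eight 64x64 crops
print(logits.shape)                        # torch.Size([8, 4])
```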

  19. "First generation" automated DNA sequencing technology.

    PubMed

    Slatko, Barton E; Kieleczawa, Jan; Ju, Jingyue; Gardner, Andrew F; Hendrickson, Cynthia L; Ausubel, Frederick M

    2011-10-01

    Beginning in the 1980s, automation of DNA sequencing has greatly increased throughput, reduced costs, and enabled large projects to be completed more easily. The development of automation technology paralleled the development of other aspects of DNA sequencing: better enzymes and chemistry, separation and imaging technology, sequencing protocols, robotics, and computational advancements (including base-calling algorithms with quality scores, database developments, and sequence analysis programs). Despite the emergence of high-throughput sequencing platforms, automated Sanger sequencing technology remains useful for many applications. This unit provides background and a description of the "First-Generation" automated DNA sequencing technology. It also includes protocols for using the current Applied Biosystems (ABI) automated DNA sequencing machines. © 2011 by John Wiley & Sons, Inc.

  20. Development of a bench-top device for parallel climate-controlled recordings of neuronal cultures activity with microelectrode arrays.

    PubMed

    Regalia, Giulia; Biffi, Emilia; Achilli, Silvia; Ferrigno, Giancarlo; Menegon, Andrea; Pedrocchi, Alessandra

    2016-02-01

    Two binding requirements for in vitro studies on long-term neuronal network dynamics are (i) finely controlled environmental conditions to keep neuronal cultures viable and provide reliable data for more than a few hours and (ii) parallel operation on multiple neuronal cultures to shorten experimental time scales and enhance data reproducibility. In order to fulfill these needs with a Microelectrode Arrays (MEA)-based system, we designed a stand-alone device that permits uninterrupted monitoring of neuronal culture activity over long periods, overcoming drawbacks of existing MEA platforms. We integrated in a single device: (i) a closed chamber housing four MEAs, equipped with access for chemical manipulations, (ii) environmental control systems and embedded sensors to reproduce and remotely monitor the standard in vitro culture environment on the lab bench (i.e. in terms of temperature, air CO2 and relative humidity), and (iii) a modular MEA interface analog front-end for reliable and parallel recordings. The system has been proven to keep environmental conditions stable, physiological, and homogeneous across different cultures. Prolonged recordings (up to 10 days) of spontaneous and pharmacologically stimulated neuronal culture activity have shown no signs of rundown, thanks to the environmental stability, and have not required withdrawing the cells from the chamber for culture medium manipulations. This system represents an effective MEA-based solution to elucidate neuronal network phenomena with slow dynamics, such as long-term plasticity, effects of chronic pharmacological stimulations or late-onset pathological mechanisms. © 2015 Wiley Periodicals, Inc.

  1. First Annual Workshop on Space Operations Automation and Robotics (SOAR 87)

    NASA Technical Reports Server (NTRS)

    Griffin, Sandy (Editor)

    1987-01-01

    Several topics relative to automation and robotics technology are discussed. Automation of checkout, ground support, and logistics; automated software development; man-machine interfaces; neural networks; systems engineering and distributed/parallel processing architectures; and artificial intelligence/expert systems are among the topics covered.

  2. Computer-Aided Parallelizer and Optimizer

    NASA Technical Reports Server (NTRS)

    Jin, Haoqiang

    2011-01-01

    The Computer-Aided Parallelizer and Optimizer (CAPO) automates the insertion of compiler directives (see figure) to facilitate parallel processing on Shared Memory Parallel (SMP) machines. While CAPO currently is integrated seamlessly into CAPTools (developed at the University of Greenwich, now marketed as ParaWise), CAPO was independently developed at Ames Research Center as one of the components for the Legacy Code Modernization (LCM) project. The current version takes serial FORTRAN programs, performs interprocedural data dependence analysis, and generates OpenMP directives. Due to the widely supported OpenMP standard, the generated OpenMP codes have the potential to run on a wide range of SMP machines. CAPO relies on accurate interprocedural data dependence information currently provided by CAPTools. Compiler directives are generated through identification of parallel loops in the outermost level, construction of parallel regions around parallel loops and optimization of parallel regions, and insertion of directives with automatic identification of private, reduction, induction, and shared variables. Attempts also have been made to identify potential pipeline parallelism (implemented with point-to-point synchronization). Although directives are generated automatically, user interaction with the tool is still important for producing good parallel codes. A comprehensive graphical user interface is included for users to interact with the parallelization process.

  3. Establishment of a fully automated microtiter plate-based system for suspension cell culture and its application for enhanced process optimization.

    PubMed

    Markert, Sven; Joeris, Klaus

    2017-01-01

    We developed an automated microtiter plate (MTP)-based system for suspension cell culture to meet the increased demands for miniaturized high-throughput applications in biopharmaceutical process development. The generic system is based on off-the-shelf commercial laboratory automation equipment and is able to utilize MTPs of different configurations (6-24 wells per plate) in orbital shaken mode. The shaking conditions were optimized by computational fluid dynamics simulations. The fully automated system handles plate transport, seeding and feeding of cells, daily sampling, and preparation of analytical assays. The integration of all required analytical instrumentation into the system enables hands-off operation, which prevents bottlenecks in sample processing. The modular set-up makes the system flexible and adaptable for a continuous extension of analytical parameters and add-on components. The system proved suitable as a screening tool for process development by verifying the comparability of results for the MTP-based system and bioreactors regarding profiles of viable cell density, lactate, and product concentration of CHO cell lines. These studies confirmed that 6-well MTPs as well as 24-deepwell MTPs were predictive for scale-up to a 1000 L stirred tank reactor (scale factor 1:200,000). Applying the established cell culture system to automated media blend screening in late-stage development, a 22% increase in product yield was achieved in comparison to the reference process. The predicted product increase was subsequently confirmed in 2 L bioreactors. Thus, we demonstrated the feasibility of the automated MTP-based cell culture system for enhanced screening and optimization applications in process development and identified further application areas such as process robustness. The system offers great potential to accelerate time-to-market for new biopharmaceuticals. Biotechnol. Bioeng. 2017;114: 113-121. © 2016 Wiley Periodicals, Inc.

  4. Two-dimensional parallel array technology as a new approach to automated combinatorial solid-phase organic synthesis

    PubMed

    Brennan; Biddison; Frauendorf; Schwarcz; Keen; Ecker; Davis; Tinder; Swayze

    1998-01-01

    An automated, 96-well parallel array synthesizer for solid-phase organic synthesis has been designed and constructed. The instrument employs a unique reagent array delivery format, in which each reagent utilized has a dedicated plumbing system. An inert atmosphere is maintained during all phases of a synthesis, and temperature can be controlled via a thermal transfer plate which holds the injection-molded reaction block. The reaction plate assembly slides in the X-axis direction, while eight nozzle blocks holding the reagent lines slide in the Y-axis direction, allowing for the extremely rapid delivery of any of 64 reagents to 96 wells. In addition, there are six banks of fixed nozzle blocks, which deliver the same reagent or solvent to eight wells at once, for a total of 72 possible reagents. The instrument is controlled by software which allows the straightforward programming of the synthesis of a large number of compounds. This is accomplished by supplying a general synthetic procedure in the form of a command file, which calls upon certain reagents to be added to specific wells via lookup in a sequence file. The bottle position, flow rate, and concentration of each reagent are stored in a separate reagent table file. To demonstrate the utility of the parallel array synthesizer, a small combinatorial library of hydroxamic acids was prepared in high-throughput mode for biological screening. Approximately 1300 compounds were prepared on a 10 μmole scale (3-5 mg) in a few weeks. The resulting crude compounds were generally >80% pure and were utilized directly for high-throughput screening in antibacterial assays. Several active wells were found, and the activity was verified by solution-phase synthesis of analytically pure material, indicating that the system described herein is an efficient means for the parallel synthesis of compounds for lead discovery. Copyright 1998 John Wiley & Sons, Inc.
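
    A toy sketch of the data-driven control scheme described above, with a command file naming per-cycle steps, a sequence file mapping wells to reagent orders, and a reagent table holding physical parameters; all file contents and field names here are invented for illustration:

```python
# name -> (bottle position, flow rate mL/min, concentration M); 'wash' has none
reagent_table = {
    "A": (1, 0.5, 0.1),
    "B": (2, 0.5, 0.1),
    "wash": (9, 2.0, None),
}
sequence_file = {"A1": "AB", "A2": "BA"}   # well -> order of monomer additions
command_file = ["couple", "wash"]          # generic per-cycle procedure

for cycle in range(2):                     # two coupling cycles
    for step in command_file:
        for well, seq in sequence_file.items():
            reagent = seq[cycle] if step == "couple" else "wash"
            bottle, flow, conc = reagent_table[reagent]
            print(f"cycle {cycle}, step {step}: deliver {reagent} "
                  f"(bottle {bottle}, {flow} mL/min) to well {well}")
```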

  5. Automated recognition of cell phenotypes in histology images based on membrane- and nuclei-targeting biomarkers

    PubMed Central

    Karaçalı, Bilge; Vamvakidou, Alexandra P; Tözeren, Aydın

    2007-01-01

    Background Three-dimensional in vitro culture of cancer cells are used to predict the effects of prospective anti-cancer drugs in vivo. In this study, we present an automated image analysis protocol for detailed morphological protein marker profiling of tumoroid cross section images. Methods Histologic cross sections of breast tumoroids developed in co-culture suspensions of breast cancer cell lines, stained for E-cadherin and progesterone receptor, were digitized and pixels in these images were classified into five categories using k-means clustering. Automated segmentation was used to identify image regions composed of cells expressing a given biomarker. Synthesized images were created to check the accuracy of the image processing system. Results Accuracy of automated segmentation was over 95% in identifying regions of interest in synthesized images. Image analysis of adjacent histology slides stained, respectively, for Ecad and PR, accurately predicted regions of different cell phenotypes. Image analysis of tumoroid cross sections from different tumoroids obtained under the same co-culture conditions indicated the variation of cellular composition from one tumoroid to another. Variations in the compositions of cross sections obtained from the same tumoroid were established by parallel analysis of Ecad and PR-stained cross section images. Conclusion Proposed image analysis methods offer standardized high throughput profiling of molecular anatomy of tumoroids based on both membrane and nuclei markers that is suitable to rapid large scale investigations of anti-cancer compounds for drug development. PMID:17822559

  6. The Influence of Cultural Factors on Trust in Automation

    ERIC Educational Resources Information Center

    Chien, Shih-Yi James

    2016-01-01

    Human interaction with automation is a complex process that requires both skilled operators and complex system designs to effectively enhance overall performance. Although automation has successfully managed complex systems throughout the world for over half a century, inappropriate reliance on automation can still occur, such as the recent…

  7. Combining Phase Identification and Statistic Modeling for Automated Parallel Benchmark Generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Ye; Ma, Xiaosong; Liu, Qing Gary

    2015-01-01

    Parallel application benchmarks are indispensable for evaluating/optimizing HPC software and hardware. However, it is very challenging and costly to obtain high-fidelity benchmarks reflecting the scale and complexity of state-of-the-art parallel applications. Hand-extracted synthetic benchmarks are time- and labor-intensive to create. Real applications themselves, while offering the most accurate performance evaluation, are expensive to compile, port, reconfigure, and often plainly inaccessible due to security or ownership concerns. This work contributes APPRIME, a novel tool for trace-based automatic parallel benchmark generation. Taking as input standard communication-I/O traces of an application's execution, it couples accurate automatic phase identification with statistical regeneration of event parameters to create compact, portable, and to some degree reconfigurable parallel application benchmarks. Experiments with four NAS Parallel Benchmarks (NPB) and three real scientific simulation codes confirm the fidelity of APPRIME benchmarks. They retain the original applications' performance characteristics, in particular the relative performance across platforms.
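
    The statistical-regeneration half of this approach can be illustrated with a minimal sketch: fit a simple distribution to one traced event parameter within an identified phase, then sample it to drive a compact synthetic benchmark. The lognormal choice and all numbers are assumptions, not APPRIME internals:

```python
import numpy as np

rng = np.random.default_rng(42)
# Stand-in for traced per-event message sizes within one identified phase
traced_msg_bytes = rng.lognormal(mean=10, sigma=0.3, size=1000)

# Fit a lognormal by moment matching on the log-values
log_sizes = np.log(traced_msg_bytes)
mu, sigma = log_sizes.mean(), log_sizes.std()

# Regenerate a compact synthetic phase: 100 messages, similar statistics
synthetic = rng.lognormal(mu, sigma, size=100)
print(f"trace mean {traced_msg_bytes.mean():.0f} B, "
      f"synthetic mean {synthetic.mean():.0f} B")
```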

  8. Automated Parallel Capillary Electrophoretic System

    DOEpatents

    Li, Qingbo; Kane, Thomas E.; Liu, Changsheng; Sonnenschein, Bernard; Sharer, Michael V.; Kernan, John R.

    2000-02-22

    An automated electrophoretic system is disclosed. The system employs a capillary cartridge having a plurality of capillary tubes. The cartridge has a first array of capillary ends projecting from one side of a plate. The first array of capillary ends are spaced apart in substantially the same manner as the wells of a microtitre tray of standard size, which allows one to simultaneously perform capillary electrophoresis on samples present in each of the wells of the tray. The system includes a stacked, dual-carousel arrangement to eliminate cross-contamination resulting from reuse of the same buffer tray on consecutive executions of electrophoresis. The system also has a gel delivery module, containing either a gel syringe driven by a stepper motor or a high-pressure chamber with a pump, to quickly and uniformly deliver gel through the capillary tubes. The system further includes a multi-wavelength beam generator to produce a laser beam with a wide range of wavelengths. An off-line capillary reconditioner thoroughly cleans a capillary cartridge to enable simultaneous execution of electrophoresis with another capillary cartridge. The streamlined nature of the off-line capillary reconditioner offers the advantage of increased system throughput with a minimal increase in system cost.

  9. Performance of a Novel Algorithm Using Automated Digital Microscopy for Diagnosing Tuberculosis.

    PubMed

    Ismail, Nazir A; Omar, Shaheed V; Lewis, James J; Dowdy, David W; Dreyer, Andries W; van der Meulen, Hermina; Nconjana, George; Clark, David A; Churchyard, Gavin J

    2015-06-15

    TBDx automated microscopy is a novel technology that processes digital microscopic images to identify acid-fast bacilli (AFB). Use of TBDx as part of a diagnostic algorithm could improve the diagnosis of tuberculosis (TB), but its performance characteristics have not yet been formally tested. Our objective was to evaluate the performance of the TBDx automated microscopy system in algorithms for the diagnosis of TB. Prospective samples from patients with presumed TB were processed in parallel with conventional smear microscopy, TBDx microscopy, and liquid culture. All TBDx-positive specimens were also tested with the Xpert MTB/RIF (GXP) assay. We evaluated the sensitivity and specificity of two algorithms, (1) TBDx-GXP (TBDx with positive specimens tested by Xpert MTB/RIF) and (2) TBDx alone, against the gold standard of liquid media culture. Of 1,210 samples, 1,009 were eligible for evaluation, of which 109 were culture positive for Mycobacterium tuberculosis. The TBDx system identified 70 specimens (68 culture positive) as having 10 or more putative AFB (high positive) and 207 (19 culture positive) as having 1-9 putative AFB (low positive). An algorithm in which "low-positive" results on TBDx were confirmed by GXP had 78% sensitivity (85 of 109) and 99.8% specificity (889 of 900), requiring 21% (207 of 1,009) of specimens to be processed by GXP. As a stand-alone test, a "high-positive" result on TBDx had 62% sensitivity and 99.7% specificity. TBDx used in diagnostic algorithms with GXP provided reasonable sensitivity and high specificity for active TB while dramatically reducing the number of GXP tests performed. As a stand-alone microscopy system, its performance was equivalent to that of a highly experienced TB microscopist.
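
    The two-stage decision logic is simple enough to state directly; a minimal sketch using the thresholds from the abstract (≥10 putative AFB = high positive, 1-9 = low positive, confirmed by GXP), with a check of the reported sensitivity; the function name is ours:

```python
from typing import Optional

def tbdx_gxp_result(putative_afb: int, gxp_positive: Optional[bool]) -> bool:
    """Two-stage call: >=10 putative AFB is reported positive directly;
    1-9 ('low positive') is referred to Xpert MTB/RIF (GXP) for confirmation."""
    if putative_afb >= 10:        # high positive
        return True
    if putative_afb >= 1:         # low positive: GXP decides
        return bool(gxp_positive)
    return False                  # no putative AFB seen

print(tbdx_gxp_result(12, None))   # True  (no GXP needed)
print(tbdx_gxp_result(3, False))   # False (GXP negative overrules)
print(f"reported algorithm sensitivity: {85 / 109:.1%}")  # -> 78.0%
```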

  10. Automated Static Culture System Cell Module Mixing Protocol and Computational Fluid Dynamics Analysis

    NASA Technical Reports Server (NTRS)

    Kleis, Stanley J.; Truong, Tuan; Goodwin, Thomas J.

    2004-01-01

    This report documents a fluid dynamic analysis of the proposed Automated Static Culture System (ASCS) cell module mixing protocol. The report consists of a review of some basic fluid dynamics principles appropriate to the mixing of a patch of high-oxygen-content medium into surrounding medium that is initially depleted of oxygen, followed by a computational fluid dynamics (CFD) study of this process for the proposed protocol over a range of the governing parameters. The time histories of oxygen concentration distributions and the mechanical shear levels generated are used to characterize the mixing process for different parameter values.
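
    As a rough illustration of the underlying physics (not the report's 3D CFD), here is a one-dimensional finite-difference sketch of an oxygen-rich patch diffusing into depleted medium; the grid, patch size and diffusivity are illustrative assumptions:

```python
import numpy as np

D = 2.0e-9            # O2 diffusivity in water, m^2/s (typical literature value)
L, n = 0.01, 200      # 1 cm domain, 200 grid cells
dx = L / n
dt = 0.4 * dx**2 / D  # stable explicit time step (FTCS requires dt <= 0.5*dx^2/D)
c = np.zeros(n)
c[90:110] = 1.0       # normalized oxygen-rich patch in the middle

t_end = 60.0          # simulate one minute of pure diffusion
for _ in range(int(t_end / dt)):
    # interior update; the end cells stay at zero (Dirichlet boundaries)
    c[1:-1] += D * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])

print(f"after {t_end:.0f} s: peak {c.max():.3f}, "
      f"{np.count_nonzero(c > 0.01)} of {n} cells above 1% saturation")
```

    Diffusion alone spreads the patch only a fraction of a millimeter per minute, which is why an active mixing protocol is needed in the first place.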

  11. Applying Parallel Processing Techniques to Tether Dynamics Simulation

    NASA Technical Reports Server (NTRS)

    Wells, B. Earl

    1996-01-01

    The focus of this research has been to determine the effectiveness of applying parallel processing techniques to a sizable real-world problem, the simulation of the dynamics associated with a tether which connects two objects in low earth orbit, and to explore the degree to which the parallelization process can be automated through the creation of new software tools. The goal has been to utilize this specific application problem as a base to develop more generally applicable techniques.

  12. Automating the parallel processing of fluid and structural dynamics calculations

    NASA Technical Reports Server (NTRS)

    Arpasi, Dale J.; Cole, Gary L.

    1987-01-01

    The NASA Lewis Research Center is actively involved in the development of expert system technology to assist users in applying parallel processing to computational fluid and structural dynamic analysis. The goal of this effort is to eliminate the necessity for the physical scientist to become a computer scientist in order to effectively use the computer as a research tool. Programming and operating software utilities have previously been developed to solve systems of ordinary nonlinear differential equations on parallel scalar processors. Current efforts are aimed at extending these capabilities to systems of partial differential equations that describe the complex behavior of fluids and structures within aerospace propulsion systems. This paper presents some important considerations in the redesign, in particular the need for algorithms and software utilities that can automatically identify data flow patterns in the application program and partition and allocate calculations to the parallel processors. A library-oriented multiprocessing concept for integrating the hardware and software functions is described.

  13. Validation of shortened 2-day sterility testing of mesenchymal stem cell-based therapeutic preparation on an automated culture system.

    PubMed

    Lysák, Daniel; Holubová, Monika; Bergerová, Tamara; Vávrová, Monika; Cangemi, Giuseppina Cristina; Ciccocioppo, Rachele; Kruzliak, Peter; Jindra, Pavel

    2016-03-01

    Cell therapy products represent a new trend of treatment in the field of immunotherapy and regenerative medicine. Their biological nature and multistep preparation procedure require the application of complex release criteria and quality control. Microbial contamination of cell therapy products is a potential source of morbidity in recipients. Automated blood culture systems are widely used for the detection of microorganisms in cell therapy products. However, the standard 2-week cultivation period is too long for some cell-based treatments, and alternative methods have to be devised. We tried to verify whether a shortened cultivation of the supernatant from the mesenchymal stem cell (MSC) culture, obtained 2 days before the cell harvest, could sufficiently detect microbial growth and allow the release of MSC for clinical application. We compared the standard Ph. Eur. cultivation method and the automated blood culture system BACTEC (Becton Dickinson). The time to detection (TTD) and the detection limit were analyzed for three bacterial and two fungal strains. Staphylococcus aureus and Pseudomonas aeruginosa were recognized within 24 h with both methods (detection limit ~10 CFU). The time required for the detection of Bacillus subtilis was shorter with the automated method (TTD 10.3 vs. 60 h for 10-100 CFU). The BACTEC system reached significantly shorter times to detection of Candida albicans and Aspergillus brasiliensis growth compared to the classical method (15.5 vs. 48 and 31.5 vs. 48 h, respectively; 10-100 CFU). Positivity was demonstrated within 48 h in all bottles, regardless of the size of the inoculum. This study validated the automated cultivation system as a method able to detect all tested microorganisms within a 48-h period with a detection limit of ~10 CFU. Only in the case of B. subtilis was the lowest inoculum (~10 CFU) not recognized. The 2-day cultivation technique is then capable of confirming the microbiological safety of MSC and

  14. Evaluation of a Multi-Parameter Sensor for Automated, Continuous Cell Culture Monitoring in Bioreactors

    NASA Technical Reports Server (NTRS)

    Pappas, D.; Jeevarajan, A.; Anderson, M. M.

    2004-01-01

    Compact and automated sensors are desired for assessing the health of cell cultures in biotechnology experiments in microgravity. Measurement of cell culture medium allows for the optimization of culture conditions on orbit to maximize cell growth and minimize unnecessary exchange of medium. While several discrete sensors exist to measure culture health, a multi-parameter sensor would simplify the experimental apparatus. One such sensor, the Paratrend 7, consists of three optical fibers for measuring pH, dissolved oxygen (p02), dissolved carbon dioxide (pC02), and a thermocouple to measure temperature. The sensor bundle was designed for intra-arterial placement in clinical patients, and potentially can be used in NASA's Space Shuttle and International Space Station biotechnology program bioreactors. Methods: A Paratrend 7 sensor was placed at the outlet of a rotating-wall perfused vessel bioreactor system inoculated with BHK-21 (baby hamster kidney) cells. Cell culture medium (GTSF-2, composed of 40% minimum essential medium, 60% L-15 Leibovitz medium) was manually measured using a bench top blood gas analyzer (BGA, Ciba-Corning). Results: A Paratrend 7 sensor was used over a long-term (>120 day) cell culture experiment. The sensor was able to track changes in cell medium pH, p02, and pC02 due to the consumption of nutrients by the BHK-21. When compared to manually obtained BGA measurements, the sensor had good agreement for pH, p02, and pC02 with bias [and precision] of 0.02 [0.15], 1 mm Hg [18 mm Hg], and -4.0 mm Hg [8.0 mm Hg] respectively. The Paratrend oxygen sensor was recalibrated (offset) periodically due to drift. The bias for the raw (no offset or recalibration) oxygen measurements was 42 mm Hg [38 mm Hg]. The measured response (rise) time of the sensor was 20 +/- 4s for pH, 81 +/- 53s for pC02, 51 +/- 20s for p02. For long-term cell culture measurements, these response times are more than adequate. Based on these findings, the Paratrend sensor could
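
    Bias and precision as reported above are naturally read as the mean and standard deviation of the paired sensor-minus-reference differences, Bland-Altman style. A minimal sketch with invented paired pH readings (the study's data are not reproduced here):

```python
import numpy as np

def bias_precision(sensor: np.ndarray, reference: np.ndarray):
    """Mean (bias) and sample standard deviation (precision) of the
    paired sensor-minus-reference differences."""
    diff = sensor - reference
    return diff.mean(), diff.std(ddof=1)

rng = np.random.default_rng(1)
ref = rng.normal(7.2, 0.1, size=50)               # bench-top BGA pH values
sen = ref + 0.02 + rng.normal(0, 0.15, size=50)   # sensor with a small offset
bias, precision = bias_precision(sen, ref)
print(f"bias {bias:+.3f} pH, precision {precision:.3f} pH")
```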

  15. A framework for accelerated phototrophic bioprocess development: integration of parallelized microscale cultivation, laboratory automation and Kriging-assisted experimental design.

    PubMed

    Morschett, Holger; Freier, Lars; Rohde, Jannis; Wiechert, Wolfgang; von Lieres, Eric; Oldiges, Marco

    2017-01-01

    Even though microalgae-derived biodiesel has regained interest within the last decade, industrial production is still challenging for economic reasons. Besides reactor design, as well as value chain and strain engineering, laborious and slow early-stage parameter optimization represents a major drawback. The present study introduces a framework for the accelerated development of phototrophic bioprocesses. A state-of-the-art micro-photobioreactor supported by a liquid-handling robot for automated medium preparation and product quantification was used. To take full advantage of the technology's experimental capacity, Kriging-assisted experimental design was integrated to enable highly efficient execution of screening applications. The resulting platform was used for medium optimization of a lipid production process using Chlorella vulgaris toward maximum volumetric productivity. Within only four experimental rounds, lipid production was increased approximately threefold to 212 ± 11 mg L^-1 d^-1. Besides nitrogen availability as a key parameter, magnesium, calcium and various trace elements were shown to be of crucial importance. Here, synergistic multi-parameter interactions as revealed by the experimental design introduced significant further optimization potential. The integration of parallelized microscale cultivation, laboratory automation and Kriging-assisted experimental design proved to be a fruitful tool for the accelerated development of phototrophic bioprocesses. By means of the proposed technology, the targeted optimization task was conducted in a very timely and material-efficient manner.
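
    Kriging-assisted experimental design of this kind typically fits a Gaussian-process surrogate to the responses measured so far and picks the next medium composition by maximizing an acquisition function such as expected improvement. A minimal sketch of that loop with scikit-learn follows; the one-dimensional toy objective and all parameter values are assumptions for illustration, not the study's model:

        import numpy as np
        from scipy.stats import norm
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        def productivity(x):
            # Toy stand-in for lipid productivity vs. a normalized medium factor.
            return 200.0 * np.exp(-(x - 0.6) ** 2 / 0.05)

        X = np.array([[0.1], [0.5], [0.9]])   # compositions tested so far
        y = productivity(X).ravel()

        for round_ in range(4):               # four experimental rounds
            gp = GaussianProcessRegressor(kernel=RBF(0.2), normalize_y=True).fit(X, y)
            cand = np.linspace(0.0, 1.0, 201).reshape(-1, 1)
            mu, sd = gp.predict(cand, return_std=True)
            z = (mu - y.max()) / np.maximum(sd, 1e-9)
            ei = (mu - y.max()) * norm.cdf(z) + sd * norm.pdf(z)  # expected improvement
            x_next = cand[np.argmax(ei)]
            X = np.vstack([X, [x_next]])
            y = np.append(y, productivity(x_next))

        print(f"best productivity found: {y.max():.1f} at x = {X[y.argmax()][0]:.2f}")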

  16. A Parallel Adaboost-Backpropagation Neural Network for Massive Image Dataset Classification

    PubMed Central

    Cao, Jianfang; Chen, Lichao; Wang, Min; Shi, Hao; Tian, Yun

    2016-01-01

    Image classification uses computers to simulate human understanding and cognition of images by automatically categorizing images. This study proposes a faster image classification approach that parallelizes the traditional Adaboost-Backpropagation (BP) neural network using the MapReduce parallel programming model. First, we construct a strong classifier by assembling the outputs of 15 BP neural networks (which are individually regarded as weak classifiers) based on the Adaboost algorithm. Second, we design Map and Reduce tasks for both the parallel Adaboost-BP neural network and the feature extraction algorithm. Finally, we establish an automated classification model by building a Hadoop cluster. We use the Pascal VOC2007 and Caltech256 datasets to train and test the classification model. The results are superior to those obtained using traditional Adaboost-BP neural network or parallel BP neural network approaches. Our approach increased the average classification accuracy rate by approximately 14.5% and 26.0% compared to the traditional Adaboost-BP neural network and parallel BP neural network, respectively. Furthermore, the proposed approach requires less computation time and scales very well as evaluated by speedup, sizeup and scaleup. The proposed approach may provide a foundation for automated large-scale image classification and demonstrates practical value. PMID:27905520
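
    The boosting step at the heart of this approach can be sketched without the Hadoop layer. Below is a simplified single-machine illustration that trains 15 MLP ("BP") weak classifiers with weighted resampling (scikit-learn's MLPClassifier does not accept sample weights directly) and combines them by a weighted vote; in the paper's MapReduce formulation the per-classifier training corresponds to Map tasks and the vote to a Reduce step. The dataset and hyperparameters are placeholders:

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.neural_network import MLPClassifier

        X, y = make_classification(n_samples=600, n_features=20, random_state=0)
        y = 2 * y - 1                      # AdaBoost uses labels in {-1, +1}

        rng = np.random.default_rng(0)
        w = np.full(len(y), 1.0 / len(y))  # example weights
        models, alphas = [], []

        for t in range(15):                # 15 BP weak classifiers, as in the paper
            idx = rng.choice(len(y), size=len(y), p=w)   # weighted resampling (Map)
            clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=300,
                                random_state=t).fit(X[idx], y[idx])
            pred = clf.predict(X)
            err = np.clip(w[pred != y].sum(), 1e-10, 1 - 1e-10)
            alpha = 0.5 * np.log((1 - err) / err)        # classifier weight
            w = w * np.exp(-alpha * y * pred)            # re-weight examples
            w /= w.sum()
            models.append(clf)
            alphas.append(alpha)

        # Reduce step: the strong classifier is the sign of the weighted vote.
        votes = sum(a * m.predict(X) for a, m in zip(alphas, models))
        print("training accuracy:", (np.sign(votes) == y).mean())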

  19. Systematic review automation technologies

    PubMed Central

    2014-01-01

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors to the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of the tasks of the systematic review. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed for realizing automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128

  20. An automated parallel crystallisation search for predicted crystal structures and packing motifs of carbamazepine.

    PubMed

    Florence, Alastair J; Johnston, Andrea; Price, Sarah L; Nowell, Harriott; Kennedy, Alan R; Shankland, Norman

    2006-09-01

    An automated parallel crystallisation search for physical forms of carbamazepine, covering 66 solvents and five crystallisation protocols, identified three anhydrous polymorphs (forms I-III), one hydrate and eight organic solvates, including the single-crystal structures of three previously unreported solvates (N,N-dimethylformamide (1:1); hemi-furfural; hemi-1,4-dioxane). Correlation of physical form outcome with the crystallisation conditions demonstrated that the solvent adopts a relatively nonspecific role in determining which polymorph is obtained, and that the previously reported effect of a polymer template facilitating the formation of form IV could not be reproduced by solvent crystallisation alone. In the accompanying computational search, approximately half of the energetically feasible predicted crystal structures exhibit the C=O...H--N R2(2)(8) dimer motif that is observed in the known polymorphs, with the most stable correctly corresponding to form III. Most of the other energetically feasible structures, including the global minimum, have a C=O...H--N C(4) chain hydrogen bond motif. No such chain structures were observed in this or any other previously published work, suggesting that kinetic, rather than thermodynamic, factors determine which of the energetically feasible crystal structures are observed experimentally, with the kinetics apparently favouring nucleation of crystal structures based on the CBZ-CBZ R2(2)(8) motif. (c) 2006 Wiley-Liss, Inc. and the American Pharmacists Association.

  1. Imaging cell picker: A morphology-based automated cell separation system on a photodegradable hydrogel culture platform.

    PubMed

    Shibuta, Mayu; Tamura, Masato; Kanie, Kei; Yanagisawa, Masumi; Matsui, Hirofumi; Satoh, Taku; Takagi, Toshiyuki; Kanamori, Toshiyuki; Sugiura, Shinji; Kato, Ryuji

    2018-06-09

    Cellular morphology on and in a scaffold composed of extracellular matrix generally represents the cellular phenotype. Therefore, morphology-based cell separation should be an interesting method, applicable to cell separation without staining of surface markers, in contrast to conventional cell separation methods (e.g., fluorescence-activated cell sorting and magnetic-activated cell sorting). In our previous study, we proposed a cloning technology using a photodegradable gelatin hydrogel to separate individual cells on and in hydrogels. To further expand the applicability of this photodegradable hydrogel culture platform, we here report an image-based cell separation system, the imaging cell picker, for morphology-based cell separation on a photodegradable hydrogel. We have developed a platform that enables the automated workflow of image acquisition, image processing and morphology analysis, and collection of target cells. We have shown the performance of morphology-based cell separation through the optimization of the critical parameters that determine the system's performance, such as (i) culture conditions, (ii) imaging conditions, and (iii) the image analysis scheme, to actually clone the cells of interest. Furthermore, we demonstrated the morphology-based cloning performance for cancer cells in a mixture of cells by automated hydrogel degradation via light irradiation and pipetting. Copyright © 2018 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.

  2. Parallel adaptive wavelet collocation method for PDEs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nejadmalayeri, Alireza, E-mail: Alireza.Nejadmalayeri@gmail.com; Vezolainen, Alexei, E-mail: Alexei.Vezolainen@Colorado.edu; Brown-Dymkoski, Eric, E-mail: Eric.Browndymkoski@Colorado.edu

    2015-10-01

    A parallel adaptive wavelet collocation method for solving a large class of Partial Differential Equations is presented. The parallelization is achieved by developing an asynchronous parallel wavelet transform, which allows one to perform parallel wavelet transform and derivative calculations with only one data synchronization at the highest level of resolution. The data are stored using a tree-like structure with tree roots starting at an a priori defined level of resolution. Both static and dynamic domain partitioning approaches are developed. For the dynamic domain partitioning, trees are considered to be the minimum quanta of data to be migrated between the processes. This allows fully automated and efficient handling of non-simply connected partitioning of a computational domain. Dynamic load balancing is achieved via domain repartitioning during the grid adaptation step and reassigning trees to the appropriate processes to ensure approximately the same number of grid points on each process. The parallel efficiency of the approach is discussed based on parallel adaptive wavelet-based Coherent Vortex Simulations of homogeneous turbulence with linear forcing at effective non-adaptive resolutions up to 2048^3 using as many as 2048 CPU cores.

  3. Parallelization of ARC3D with Computer-Aided Tools

    NASA Technical Reports Server (NTRS)

    Jin, Haoqiang; Hribar, Michelle; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    A series of efforts have been devoted to investigating methods of porting and parallelizing applications quickly and efficiently for new architectures, such as the SGI Origin 2000 and Cray T3E. This report presents the parallelization of a CFD application, ARC3D, using the computer-aided tool CAPTools. Steps in parallelizing this code and requirements for achieving better performance are discussed. The generated parallel version has achieved reasonably good performance, for example a speedup of 30 on 36 Cray T3E processors. However, this performance could not be obtained without modification of the original serial code. It is suggested that in many cases improving the serial code and performing necessary code transformations are important parts of the automated parallelization process, although user intervention in many of these parts is still necessary. Nevertheless, development and improvement of useful software tools, such as CAPTools, can help trim down many tedious parallelization details and improve the processing efficiency.

  4. Parallel solution-phase synthesis of a 2-aminothiazole library including fully automated work-up.

    PubMed

    Buchstaller, Hans-Peter; Anlauf, Uwe

    2011-02-01

    A straightforward and effective procedure for the solution phase preparation of a 2-aminothiazole combinatorial library is described. Reaction, work-up and isolation of the title compounds as free bases were accomplished in a fully automated fashion using the Chemspeed ASW 2000 automated synthesizer. The compounds were obtained in good yields and excellent purities without any further purification procedure.

  5. NeuronMetrics: software for semi-automated processing of cultured neuron images.

    PubMed

    Narro, Martha L; Yang, Fan; Kraft, Robert; Wenk, Carola; Efrat, Alon; Restifo, Linda L

    2007-03-23

    Using primary cell culture to screen for changes in neuronal morphology requires specialized analysis software. We developed NeuronMetrics for semi-automated, quantitative analysis of two-dimensional (2D) images of fluorescently labeled cultured neurons. It skeletonizes the neuron image using two complementary image-processing techniques, capturing fine terminal neurites with high fidelity. An algorithm was devised to span wide gaps in the skeleton. NeuronMetrics uses a novel strategy based on geometric features called faces to extract a branch number estimate from complex arbors with numerous neurite-to-neurite contacts, without creating a precise, contact-free representation of the neurite arbor. It estimates total neurite length, branch number, primary neurite number, territory (the area of the convex polygon bounding the skeleton and cell body), and Polarity Index (a measure of neuronal polarity). These parameters provide fundamental information about the size and shape of neurite arbors, which are critical factors for neuronal function. NeuronMetrics streamlines optional manual tasks such as removing noise, isolating the largest primary neurite, and correcting length for self-fasciculating neurites. Numeric data are output in a single text file, readily imported into other applications for further analysis. Written as modules for ImageJ, NeuronMetrics provides practical analysis tools that are easy to use and support batch processing. Depending on the need for manual intervention, processing time for a batch of approximately 60 2D images is 1.0-2.5 h, from a folder of images to a table of numeric data. NeuronMetrics' output accelerates the quantitative detection of mutations and chemical compounds that alter neurite morphology in vitro, and will contribute to the use of cultured neurons for drug discovery.
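
    Two of the reported measurements, skeletonization and the convex-polygon "territory", can be illustrated compactly with scikit-image and SciPy. This sketch operates on a synthetic binary mask and is not NeuronMetrics' actual two-technique skeletonization or gap-spanning algorithm:

        import numpy as np
        from scipy.spatial import ConvexHull
        from skimage.draw import line
        from skimage.morphology import dilation, disk, skeletonize

        # Synthetic "neuron": a thick Y-shaped mask stands in for a
        # thresholded image of a fluorescently labeled cultured neuron.
        img = np.zeros((200, 200), dtype=bool)
        for r0, c0, r1, c1 in [(180, 100, 100, 100),
                               (100, 100, 40, 60), (100, 100, 40, 140)]:
            rr, cc = line(r0, c0, r1, c1)
            img[rr, cc] = True
        img = dilation(img, disk(3))             # give the neurites some width

        skel = skeletonize(img)                  # 1-pixel-wide skeleton
        pts = np.column_stack(np.nonzero(skel))  # (row, col) skeleton coordinates

        hull = ConvexHull(pts)
        territory = hull.volume                  # in 2D, ConvexHull.volume is the area
        length_px = skel.sum()                   # crude neurite-length estimate, pixels
        print(f"territory: {territory:.0f} px^2, skeleton pixels: {length_px}")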

  6. Automated Cooperative Trajectories

    NASA Technical Reports Server (NTRS)

    Hanson, Curt; Pahle, Joseph; Brown, Nelson

    2015-01-01

    This presentation is an overview of the Automated Cooperative Trajectories project. An introduction to the phenomena of wake vortices is given, along with a summary of past research into the possibility of extracting energy from the wake by flying close parallel trajectories. Challenges and barriers to adoption of civilian automatic wake surfing technology are identified. A hardware-in-the-loop simulation is described that will support future research. Finally, a roadmap for future research and technology transition is proposed.

  7. Oxygen-controlled automated neural differentiation of mouse embryonic stem cells.

    PubMed

    Mondragon-Teran, Paul; Tostoes, Rui; Mason, Chris; Lye, Gary J; Veraitch, Farlan S

    2013-03-01

    Automation and oxygen tension control are two tools that provide significant improvements to the reproducibility and efficiency of stem cell production processes. The aim of this study was to establish a novel automation platform capable of controlling oxygen tension during both the cell-culture and liquid-handling steps of neural differentiation processes. We built a bespoke automation platform, which enclosed a liquid-handling platform in a sterile, oxygen-controlled environment. An airtight connection was used to transfer cell culture plates to and from an automated oxygen-controlled incubator. Our results demonstrate that our system yielded comparable cell numbers, viabilities, metabolism profiles and differentiation efficiencies when compared with traditional manual processes. Interestingly, eliminating exposure to ambient conditions during the liquid-handling stage resulted in significant improvements in the yield of MAP2-positive neural cells, indicating that this level of control can improve differentiation processes. This article describes, for the first time, an automation platform capable of maintaining oxygen tension control during both the cell-culture and liquid-handling stages of a 2D embryonic stem cell differentiation process.

  8. A Parallel Genetic Algorithm for Automated Electronic Circuit Design

    NASA Technical Reports Server (NTRS)

    Lohn, Jason D.; Colombano, Silvano P.; Haith, Gary L.; Stassinopoulos, Dimitris; Norvig, Peter (Technical Monitor)

    2000-01-01

    We describe a parallel genetic algorithm (GA) that automatically generates circuit designs using evolutionary search. A circuit-construction programming language is introduced and we show how evolution can generate practical analog circuit designs. Our system allows circuit size (number of devices), circuit topology, and device values to be evolved. We present experimental results as applied to analog filter and amplifier design tasks.
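
    As background, the evolutionary loop such a system runs can be illustrated on a much smaller task: evolving component values (here a resistor-capacitor pair toward a target filter cutoff) with a basic genetic algorithm. This toy sketch ignores topology evolution and the circuit-construction language, which are the paper's harder contributions:

        import numpy as np

        rng = np.random.default_rng(0)
        TARGET_FC = 1000.0                           # desired cutoff frequency, Hz

        def fitness(pop):
            # pop[:, 0] = R (ohms), pop[:, 1] = C (farads); fc = 1/(2*pi*R*C).
            fc = 1.0 / (2.0 * np.pi * pop[:, 0] * pop[:, 1])
            return -np.abs(np.log(fc / TARGET_FC))   # 0 is a perfect match

        pop = rng.uniform([1e2, 1e-9], [1e5, 1e-6], size=(50, 2))
        for gen in range(100):
            f = fitness(pop)
            parents = pop[np.argsort(f)[-25:]]       # truncation selection
            children = parents[rng.integers(0, 25, 25)] * rng.lognormal(
                0.0, 0.1, size=(25, 2))              # mutate device values
            pop = np.vstack([parents, children])

        best = pop[np.argmax(fitness(pop))]
        print(f"R = {best[0]:.0f} ohm, C = {best[1]:.2e} F, "
              f"fc = {1 / (2 * np.pi * best[0] * best[1]):.1f} Hz")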

  9. Building a Morbidostat: An automated continuous-culture device for studying bacterial drug resistance under dynamically sustained drug inhibition

    PubMed Central

    Toprak, Erdal; Veres, Adrian; Yildiz, Sadik; Pedraza, Juan M.; Chait, Remy; Paulsson, Johan; Kishony, Roy

    2013-01-01

    We present a protocol for building and operating an automated fluidic system for continuous culture that we call the "morbidostat". The morbidostat is used to follow the evolution of microbial drug resistance in real time. Instead of exposing bacteria to predetermined drug environments, the morbidostat constantly measures the growth rates of evolving microbial populations and dynamically adjusts drug concentrations inside culture vials in order to maintain a constant drug-induced inhibition. The growth rate measurements are done using an optical detection system that is based on measuring the intensity of back-scattered light from bacterial cells suspended in the liquid culture. The morbidostat can additionally be used as a chemostat or a turbidostat. The whole system can be built from readily available components within two to three weeks, by biologists with some electronics experience or engineers familiar with basic microbiology. PMID:23429717
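
    The morbidostat's control rule reduces to a short feedback loop: at each cycle, measure optical density, estimate whether the culture is growing, and dilute with drug if the culture is both above a target density and growing, otherwise with plain medium. A bare-bones sketch of that decision loop follows; the set points and hardware stubs are hypothetical, not from the published protocol:

        import random

        OD_TARGET = 0.5          # hypothetical optical-density set point
        DILUTION_FACTOR = 0.9    # fraction of culture remaining after dilution

        def read_od(vial):
            """Stub for the back-scattered-light optical density measurement."""
            return random.uniform(0.3, 0.8)

        def dilute(vial, with_drug):
            """Stub for pump actuation: dilute with drug or plain medium."""
            print(f"vial {vial}: dilute with {'drug' if with_drug else 'plain'} medium")

        vials = range(4)
        last_od = {v: read_od(v) for v in vials}

        for cycle in range(3):                    # each cycle lasts minutes in practice
            for v in vials:
                od = read_od(v)
                growing = od > last_od[v]         # crude growth-rate sign estimate
                # Morbidostat rule: inhibit only cultures that are dense
                # enough and still growing; otherwise let them recover.
                dilute(v, with_drug=(od > OD_TARGET and growing))
                last_od[v] = od * DILUTION_FACTOR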

  10. Automated culture system experiments hardware: developing test results and design solutions.

    PubMed

    Freddi, M; Covini, M; Tenconi, C; Ricci, C; Caprioli, M; Cotronei, V

    2002-07-01

    The experiment proposed by Prof. Ricci (University of Milan) is funded by ASI, with Laben as industrial Prime Contractor. ACS-EH (Automated Culture System-Experiment Hardware) will support the multigenerational experiment on weightlessness with rotifers and nematodes within four Experiment Containers (ECs) located inside the European Modular Cultivation System (EMCS) facility. Currently, Phase B is in progress and a concept design solution has been defined. The most challenging aspects of the design of such hardware are, from a biological point of view, the provision of an environment which permits the animals' survival and keeps desiccated generations separated, and, from a technical point of view, the miniaturisation of the hardware itself due to the reduced volume provided by the EC (160 mm x 60 mm x 60 mm). The miniaturisation will allow a better use of the available EMCS Facility resources (e.g., volume, power, etc.) and fulfilment of the experiment requirements. ACS-EH will be ready to fly in 2005 on board the ISS.

  11. On extending parallelism to serial simulators

    NASA Technical Reports Server (NTRS)

    Nicol, David; Heidelberger, Philip

    1994-01-01

    This paper describes an approach to discrete event simulation modeling that appears to be effective for developing portable and efficient parallel execution of models of large distributed systems and communication networks. In this approach, the modeler develops submodels using an existing sequential simulation modeling tool, using the full expressive power of the tool. A set of modeling language extensions permit automatically synchronized communication between submodels; however, the automation requires that any such communication must take a nonzero amount of simulation time. Within this modeling paradigm, a variety of conservative synchronization protocols can transparently support conservative execution of submodels on potentially different processors. A specific implementation of this approach, U.P.S. (Utilitarian Parallel Simulator), is described, along with performance results on the Intel Paragon.

  12. Compact, Automated, Frequency-Agile Microspectrofluorimeter

    NASA Technical Reports Server (NTRS)

    Fernandez, Salvador M.; Guignon, Ernest F.

    1995-01-01

    Compact, reliable, rugged, automated cell-culture and frequency-agile microspectrofluorimetric apparatus developed to perform experiments involving photometric imaging observations of single live cells. In original application, apparatus operates mostly unattended aboard spacecraft; potential terrestrial applications include automated or semiautomated diagnosis of pathological tissues in clinical laboratories, biomedical instrumentation, monitoring of biological process streams, and portable instrumentation for testing biological conditions in various environments. Offers obvious advantages over present laboratory instrumentation.

  13. Use of an automated blood culture system (BD BACTEC™) for diagnosis of prosthetic joint infections: easy and fast.

    PubMed

    Minassian, Angela M; Newnham, Robert; Kalimeris, Elizabeth; Bejon, Philip; Atkins, Bridget L; Bowler, Ian C J W

    2014-05-04

    For the diagnosis of prosthetic joint infection (PJI), automated BACTEC™ blood culture bottle methods have comparable sensitivity and specificity and a shorter time to positivity than traditional cooked meat enrichment broth methods. We evaluate the culture incubation period required to maximise sensitivity and specificity of microbiological diagnosis, and the ability of BACTEC™ to detect slow-growing Propionibacteria spp. Multiple periprosthetic tissue samples taken by a standardised method from 332 patients undergoing prosthetic joint revision arthroplasty were cultured for 14 days, using a BD BACTEC™ instrumented blood culture system, in a prospective study from 1st January to 31st August 2012. The "gold standard" definition for PJI was the presence of at least one histological criterion, the presence of a sinus tract or purulence around the device. Cases where ≥2 samples yielded indistinguishable isolates were considered culture-positive. 1000 BACTEC™ bottle cultures which were negative after 14 days incubation were sub-cultured for Propionibacteria spp. 79 patients fulfilled the definition for PJI, and 66 of these were culture-positive. All but 1 of these 66 culture-positive cases of PJI were detected within 3 days of incubation. Only one additional (clinically insignificant) Propionibacterium spp. was identified on terminal subculture of 1000 bottles. Prolonged microbiological culture for 2 weeks is unnecessary when using BACTEC™ culture methods. The majority of clinically significant organisms grow within 3 days, and Propionibacteria spp. are identified without the need for terminal subculture. These findings should facilitate earlier decisions on final antimicrobial prescribing.

  14. The development of a scalable parallel 3-D CFD algorithm for turbomachinery. M.S. Thesis Final Report

    NASA Technical Reports Server (NTRS)

    Luke, Edward Allen

    1993-01-01

    Two algorithms capable of computing a transonic 3-D inviscid flow field about rotating machines are considered for parallel implementation. During the study of these algorithms, a significant new method of measuring the performance of parallel algorithms is developed. The theory that supports this new method creates an empirical definition of scalable parallel algorithms that is used to produce quantifiable evidence that a scalable parallel application was developed. The implementation of the parallel application and an automated domain decomposition tool are also discussed.

  15. NeuronMetrics: Software for Semi-Automated Processing of Cultured-Neuron Images

    PubMed Central

    Narro, Martha L.; Yang, Fan; Kraft, Robert; Wenk, Carola; Efrat, Alon; Restifo, Linda L.

    2007-01-01

    Using primary cell culture to screen for changes in neuronal morphology requires specialized analysis software. We developed NeuronMetrics™ for semi-automated, quantitative analysis of two-dimensional (2D) images of fluorescently labeled cultured neurons. It skeletonizes the neuron image using two complementary image-processing techniques, capturing fine terminal neurites with high fidelity. An algorithm was devised to span wide gaps in the skeleton. NeuronMetrics uses a novel strategy based on geometric features called faces to extract a branch-number estimate from complex arbors with numerous neurite-to-neurite contacts, without creating a precise, contact-free representation of the neurite arbor. It estimates total neurite length, branch number, primary neurite number, territory (the area of the convex polygon bounding the skeleton and cell body), and Polarity Index (a measure of neuronal polarity). These parameters provide fundamental information about the size and shape of neurite arbors, which are critical factors for neuronal function. NeuronMetrics streamlines optional manual tasks such as removing noise, isolating the largest primary neurite, and correcting length for self-fasciculating neurites. Numeric data are output in a single text file, readily imported into other applications for further analysis. Written as modules for ImageJ, NeuronMetrics provides practical analysis tools that are easy to use and support batch processing. Depending on the need for manual intervention, processing time for a batch of ~60 2D images is 1.0–2.5 hours, from a folder of images to a table of numeric data. NeuronMetrics’ output accelerates the quantitative detection of mutations and chemical compounds that alter neurite morphology in vitro, and will contribute to the use of cultured neurons for drug discovery. PMID:17270152

  16. Automated video surveillance: teaching an old dog new tricks

    NASA Astrophysics Data System (ADS)

    McLeod, Alastair

    1993-12-01

    The automated video surveillance market is booming with new players, new systems, new hardware and software, and an extended range of applications. This paper reviews available technology and describes the features required for a good automated surveillance system. Both hardware and software are discussed. An overview of typical applications is also given. A shift towards PC-based hybrid systems, the use of parallel processing and neural networks, and the exploitation of modern telecomms are introduced, highlighting the evolution of modern video surveillance systems.

  17. A Comparison of Automatic Parallelization Tools/Compilers on the SGI Origin 2000 Using the NAS Benchmarks

    NASA Technical Reports Server (NTRS)

    Saini, Subhash; Frumkin, Michael; Hribar, Michelle; Jin, Hao-Qiang; Waheed, Abdul; Yan, Jerry

    1998-01-01

    Porting applications to new high performance parallel and distributed computing platforms is a challenging task. Since writing parallel code by hand is extremely time consuming and costly, porting codes would ideally be automated by using some parallelization tools and compilers. In this paper, we compare the performance of the hand-written NAS Parallel Benchmarks against three parallel versions generated with the help of tools and compilers: 1) CAPTools: an interactive computer-aided parallelization tool that generates message-passing code, 2) the Portland Group's HPF compiler, and 3) compiler directives with the native FORTRAN77 compiler on the SGI Origin2000.

  18. Handling Big Data in Medical Imaging: Iterative Reconstruction with Large-Scale Automated Parallel Computation.

    PubMed

    Lee, Jae H; Yao, Yushu; Shrestha, Uttam; Gullberg, Grant T; Seo, Youngho

    2014-11-01

    The primary goal of this project is to implement the iterative statistical image reconstruction algorithm, in this case maximum likelihood expectation maximization (MLEM) used for dynamic cardiac single photon emission computed tomography, on Spark/GraphX. This involves porting the algorithm to run on large-scale parallel computing systems. Spark is an easy-to-program software platform that can handle large amounts of data in parallel. GraphX is a graph analytic system running on top of Spark to handle graph and sparse linear algebra operations in parallel. The main advantage of implementing the MLEM algorithm in Spark/GraphX is that it allows users to parallelize such computation without any expertise in parallel computing or prior knowledge in computer science. In this paper we demonstrate a successful implementation of MLEM in Spark/GraphX and present the performance gains, with the goal to eventually make it usable in a clinical setting.
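
    The MLEM update the authors port to Spark/GraphX has a compact form: each iteration forward-projects the current image, takes the ratio of measured to estimated counts, backprojects it, and normalizes by the detector sensitivity. A dense NumPy toy version (random system matrix, not the paper's SPECT geometry) shows the per-iteration structure that maps onto GraphX sparse matrix-vector products:

        import numpy as np

        rng = np.random.default_rng(1)
        A = rng.random((50, 30))               # toy system matrix: 50 bins x 30 voxels
        x_true = rng.random(30)
        y = rng.poisson(50.0 * (A @ x_true))   # simulated detector counts

        x = np.ones(30)                        # MLEM needs a strictly positive start
        sens = A.sum(axis=0)                   # sensitivity image: A^T 1

        for it in range(100):
            proj = A @ x                            # forward projection
            ratio = y / np.maximum(proj, 1e-12)     # measured / estimated counts
            x = x / sens * (A.T @ ratio)            # backproject and normalize

        err = np.linalg.norm(x / 50.0 - x_true) / np.linalg.norm(x_true)
        print(f"relative reconstruction error: {err:.3f}")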

  19. Handling Big Data in Medical Imaging: Iterative Reconstruction with Large-Scale Automated Parallel Computation

    PubMed Central

    Lee, Jae H.; Yao, Yushu; Shrestha, Uttam; Gullberg, Grant T.; Seo, Youngho

    2014-01-01

    The primary goal of this project is to implement the iterative statistical image reconstruction algorithm, in this case maximum likelihood expectation maximization (MLEM) used for dynamic cardiac single photon emission computed tomography, on Spark/GraphX. This involves porting the algorithm to run on large-scale parallel computing systems. Spark is an easy-to-program software platform that can handle large amounts of data in parallel. GraphX is a graph analytic system running on top of Spark to handle graph and sparse linear algebra operations in parallel. The main advantage of implementing the MLEM algorithm in Spark/GraphX is that it allows users to parallelize such computation without any expertise in parallel computing or prior knowledge in computer science. In this paper we demonstrate a successful implementation of MLEM in Spark/GraphX and present the performance gains, with the goal to eventually make it usable in a clinical setting. PMID:27081299

  20. Inventory management and reagent supply for automated chemistry.

    PubMed

    Kuzniar, E

    1999-08-01

    Developments in automated chemistry have kept pace with developments in HTS such that hundreds of thousands of new compounds can be rapidly synthesized in the belief that the greater the number and diversity of compounds that can be screened, the more successful HTS will be. The increasing use of automation for Multiple Parallel Synthesis (MPS) and the move to automated combinatorial library production is placing an overwhelming burden on the management of reagents. Although automation has improved the efficiency of the processes involved in compound synthesis, the bottleneck has shifted to ordering, collating and preparing reagents for automated chemistry resulting in loss of time, materials and momentum. Major efficiencies have already been made in the area of compound management for high throughput screening. Most of these efficiencies have been achieved with sophisticated library management systems using advanced engineering and data handling for the storage, tracking and retrieval of millions of compounds. The Automation Partnership has already provided many of the top pharmaceutical companies with modular automated storage, preparation and retrieval systems to manage compound libraries for high throughput screening. This article describes how these systems may be implemented to solve the specific problems of inventory management and reagent supply for automated chemistry.

  1. Automated live cell screening system based on a 24-well-microplate with integrated micro fluidics.

    PubMed

    Lob, V; Geisler, T; Brischwein, M; Uhl, R; Wolf, B

    2007-11-01

    In research, pharmacologic drug screening and medical diagnostics, the trend towards the utilization of functional assays using living cells is persisting. Research groups working with living cells are confronted with the problem that common endpoint measurement methods are not able to capture dynamic changes. With consideration of time as a further dimension, the dynamic and networked molecular processes of cells in culture can be monitored. These processes can be investigated by measuring several extracellular parameters. This paper describes a high-content system that provides real-time monitoring data of cell parameters (metabolic and morphological alterations), e.g., upon treatment with drug compounds. Acidification rates, oxygen consumption, and changes in adhesion forces are accessible within 24 cell cultures in parallel. Addressing the rising interest in biomedical and pharmacological high-content screening assays, a concept has been developed which integrates multi-parametric sensor readout, automated imaging and probe handling into a single embedded platform. A life-maintenance system keeps important environmental parameters (gas, humidity, sterility, temperature) constant.

  2. Run-time parallelization and scheduling of loops

    NASA Technical Reports Server (NTRS)

    Saltz, Joel H.; Mirchandaney, Ravi; Baxter, Doug

    1988-01-01

    The class of problems that can be effectively compiled by parallelizing compilers is discussed. This is accomplished with the doconsider construct, which would allow these compilers to parallelize many problems in which substantial loop-level parallelism is available but cannot be detected by standard compile-time analysis. We describe and experimentally analyze mechanisms used to parallelize the work required for these types of loops. In each of these methods, a new loop structure is produced by modifying the loop to be parallelized. We also present the rules by which these loop transformations may be automated in order that they be included in language compilers. The main application area of the research involves problems in scientific computations and engineering. The workload used in our experiments includes a mixture of real problems as well as synthetically generated inputs. From our extensive tests on the Encore Multimax/320, we have reached the conclusion that for the types of workloads we have investigated, self-execution almost always performs better than pre-scheduling. Further, the improvement in performance that accrues as a result of global topological sorting of indices, as opposed to the less expensive local sorting, is not very significant in the case of self-execution.
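
    The mechanism described, building a new loop structure at run time, is essentially the inspector-executor pattern: an inspector pass scans the loop's index arrays to find cross-iteration dependences and groups iterations into wavefronts, and an executor then runs each wavefront in parallel (self-execution and pre-scheduling differ in how that dispatch happens). A minimal sketch of the inspector for an irregular loop of the form x[w[i]] += f(x[r[i]]), with illustrative index arrays:

        # Inspector: place each iteration in the earliest wavefront that
        # follows every earlier iteration touching the same elements of x.
        # (Conservative: even read-after-read is serialized here.)
        def build_wavefronts(reads, writes):
            last_touch = {}                # element -> wavefront of its last access
            fronts = []
            for i, (r, w) in enumerate(zip(reads, writes)):
                level = max(last_touch.get(r, -1), last_touch.get(w, -1)) + 1
                if level == len(fronts):
                    fronts.append([])
                fronts[level].append(i)
                last_touch[r] = max(last_touch.get(r, -1), level)
                last_touch[w] = level
            return fronts

        # Index arrays for iterations 0..4 of "x[w[i]] += f(x[r[i]])".
        reads, writes = [0, 1, 2, 4, 4], [1, 2, 3, 5, 6]
        for level, front in enumerate(build_wavefronts(reads, writes)):
            # Executor: iterations within a wavefront are independent and
            # could be dispatched to parallel threads; wavefronts run in order.
            print(f"wavefront {level}: iterations {front}")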

  3. Parallel Monotonic Basin Hopping for Low Thrust Trajectory Optimization

    NASA Technical Reports Server (NTRS)

    McCarty, Steven L.; McGuire, Melissa L.

    2018-01-01

    Monotonic Basin Hopping has been shown to be an effective method of solving low thrust trajectory optimization problems. This paper outlines an extension to the common serial implementation by parallelizing it over any number of available compute cores. The Parallel Monotonic Basin Hopping algorithm described herein is shown to be an effective way to more quickly locate feasible solutions, and improve locally optimal solutions in an automated way without requiring a feasible initial guess. The increased speed achieved through parallelization enables the algorithm to be applied to more complex problems that would otherwise be impractical for a serial implementation. Low thrust cislunar transfers and a hybrid Mars example case demonstrate the effectiveness of the algorithm. Finally, a preliminary scaling study quantifies the expected decrease in solve time compared to a serial implementation.
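
    The parallelization strategy, independent basin-hopping chains on separate cores with the best result kept, can be sketched with SciPy and the standard library. The toy multimodal objective and hop parameters below are assumptions for illustration, not the paper's trajectory model:

        import numpy as np
        from multiprocessing import Pool
        from scipy.optimize import minimize

        def objective(x):
            # Toy multimodal stand-in for a low-thrust trajectory cost.
            return np.sum(x**2 + 10.0 * (1.0 - np.cos(2.0 * np.pi * x)))

        def mbh_chain(seed, n_hops=200, step=0.5, dim=4):
            """One serial monotonic basin hopping chain: perturb the
            incumbent, optimize locally, accept only improving moves."""
            rng = np.random.default_rng(seed)
            best = minimize(objective, rng.uniform(-5, 5, dim)).x
            best_f = objective(best)
            for _ in range(n_hops):
                res = minimize(objective, best + rng.normal(0.0, step, dim))
                if res.fun < best_f:          # monotonic acceptance rule
                    best, best_f = res.x, res.fun
            return best_f, best

        if __name__ == "__main__":
            with Pool(4) as pool:             # four independent chains in parallel
                results = pool.map(mbh_chain, range(4))
            best_f, best = min(results, key=lambda r: r[0])
            print(f"best cost {best_f:.6f} at {np.round(best, 3)}")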

  4. Advances in Parallelization for Large Scale Oct-Tree Mesh Generation

    NASA Technical Reports Server (NTRS)

    O'Connell, Matthew; Karman, Steve L.

    2015-01-01

    Despite great advancements in the parallelization of numerical simulation codes over the last 20 years, it is still common to perform grid generation in serial. Generating large scale grids in serial often requires using special "grid generation" compute machines that can have more than ten times the memory of average machines. While some parallel mesh generation techniques have been proposed, generating very large meshes for LES or aeroacoustic simulations is still a challenging problem. An automated method for the parallel generation of very large scale off-body hierarchical meshes is presented here. This work enables large scale parallel generation of off-body meshes by using a novel combination of parallel grid generation techniques and a hybrid "top down" and "bottom up" oct-tree method. Meshes are generated using hardware commonly found in parallel compute clusters. The capability to generate very large meshes is demonstrated by the generation of off-body meshes surrounding complex aerospace geometries. Results are shown including a one billion cell mesh generated around a Predator Unmanned Aerial Vehicle geometry, which was generated on 64 processors in under 45 minutes.

  5. Problems of Automation and Management Principles Information Flow in Manufacturing

    NASA Astrophysics Data System (ADS)

    Grigoryuk, E. N.; Bulkin, V. V.

    2017-07-01

    Automated process control systems are complex systems characterized by elements with a common purpose, by the systemic nature of the implemented algorithms for the exchange and processing of information, and by a large number of functional subsystems. The article gives examples of automatic control systems and automated process control systems, drawing a parallel between them by identifying their strengths and weaknesses. A non-standard process control system is also proposed.

  6. Using CLIPS in the domain of knowledge-based massively parallel programming

    NASA Technical Reports Server (NTRS)

    Dvorak, Jiri J.

    1994-01-01

    The Program Development Environment (PDE) is a tool for massively parallel programming of distributed-memory architectures. Adopting a knowledge-based approach, the PDE eliminates the complexity introduced by parallel hardware with distributed memory and offers complete transparency with respect to parallelism exploitation. The knowledge-based part of the PDE is realized in CLIPS. Its principal task is to find an efficient parallel realization of the application specified by the user in a comfortable, abstract, domain-oriented formalism. A large collection of fine-grain parallel algorithmic skeletons, represented as COOL objects in a tree hierarchy, contains the algorithmic knowledge. A hybrid knowledge base with rule modules and procedural parts, encoding expertise about the application domain, parallel programming, software engineering, and parallel hardware, enables a high degree of automation in the software development process. In this paper, important aspects of the implementation of the PDE using CLIPS and COOL are shown, including the embedding of CLIPS with the C++-based parts of the PDE. The appropriateness of the chosen approach and of the CLIPS language for knowledge-based software engineering is discussed.

  7. Planning and Resource Management in an Intelligent Automated Power Management System

    NASA Technical Reports Server (NTRS)

    Morris, Robert A.

    1991-01-01

    Power system management is a process of guiding a power system towards the objective of continuous supply of electrical power to a set of loads. Spacecraft power system management requires planning and scheduling, since electrical power is a scarce resource in space. The automation of power system management for future spacecraft has been recognized as an important R&D goal. Several automation technologies have emerged, including the use of expert systems for automating human problem-solving capabilities, such as rule-based expert systems for fault diagnosis and load scheduling. It is questionable whether current generation expert system technology is applicable for power system management in space. The objective of ADEPTS (ADvanced Electrical Power management Techniques for Space systems) is to study new techniques for power management automation. These techniques involve integrating current expert system technology with that of parallel and distributed computing, as well as a distributed, object-oriented approach to software design. The focus of the current study is the integration of new procedures for automatically planning and scheduling loads with procedures for performing fault diagnosis and control. The objective is the concurrent execution of both sets of tasks on separate transputer processors, thus adding parallelism to the overall management process.

  8. Investigation of vinegar production using a novel shaken repeated batch culture system.

    PubMed

    Schlepütz, Tino; Büchs, Jochen

    2013-01-01

    Nowadays, bioprocesses are developed or optimized at small scale. The vinegar industry, too, is motivated to reinvestigate the established repeated batch fermentation process. As yet, there is no small-scale culture system for optimizing fermentation conditions for repeated batch bioprocesses. Thus, the aim of this study is to propose a new shaken culture system for parallel repeated batch vinegar fermentation. A new operation mode - the flushing repeated batch - was developed. Parallel repeated batch vinegar production could be established in shaken overflow vessels in a completely automated operation with only one pump per vessel. This flushing repeated batch was first theoretically investigated and then empirically tested. The ethanol concentration was monitored online during repeated batch fermentation by semiconductor gas sensors. It was shown that the switch from one ethanol substrate quality to different ethanol substrate qualities resulted in prolonged lag phases and durations of the first batches. In the subsequent batches the length of the fermentations decreased considerably. This decrease in the respective lag phases indicates an adaptation of the acetic acid bacteria mixed culture to the specific ethanol substrate quality. Consequently, flushing repeated batch fermentations at small scale are valuable for screening fermentation conditions and, thereby, improving industrial-scale bioprocesses such as vinegar production in terms of process robustness, stability, and productivity. Copyright © 2013 American Institute of Chemical Engineers.

  9. Parallel Education and Defining the Fourth Sector.

    ERIC Educational Resources Information Center

    Chessell, Diana

    1996-01-01

    Parallel to the primary, secondary, postsecondary, and adult/community education sectors is education not associated with formal programs--learning in arts and cultural sites. The emergence of cultural and educational tourism is an opportunity for adult/community education to define itself by extending lifelong learning opportunities into parallel…

  10. Workload Capacity: A Response Time-Based Measure of Automation Dependence.

    PubMed

    Yamani, Yusuke; McCarley, Jason S

    2016-05-01

    An experiment used the workload capacity measure C(t) to quantify the processing efficiency of human-automation teams and identify operators' automation usage strategies in a speeded decision task. Although response accuracy rates and related measures are often used to measure the influence of an automated decision aid on human performance, aids can also influence response speed. Mean response times (RTs), however, conflate the influence of the human operator and the automated aid on team performance and may mask changes in the operator's performance strategy under aided conditions. The present study used a measure of parallel processing efficiency, or workload capacity, derived from empirical RT distributions as a novel gauge of human-automation performance and automation dependence in a speeded task. Participants performed a speeded probabilistic decision task with and without the assistance of an automated aid. RT distributions were used to calculate two variants of a workload capacity measure, COR(t) and CAND(t). Capacity measures gave evidence that a diagnosis from the automated aid speeded human participants' responses, and that participants did not moderate their own decision times in anticipation of diagnoses from the aid. Workload capacity provides a sensitive and informative measure of human-automation performance and operators' automation dependence in speeded tasks. © 2016, Human Factors and Ergonomics Society.
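
    For reference, the two capacity coefficients are conventionally defined from the integrated hazard functions of the response-time distributions; the following is the standard formulation from the capacity literature (stated here as background, not quoted from this paper's text):

        C_{\mathrm{OR}}(t) = \frac{H_{AB}(t)}{H_{A}(t) + H_{B}(t)}, \qquad
        C_{\mathrm{AND}}(t) = \frac{K_{A}(t) + K_{B}(t)}{K_{AB}(t)},

        \text{where } H(t) = -\ln\left[1 - F(t)\right], \quad K(t) = \ln F(t).

    Here F is the response-time distribution function, subscripts A and B index the single-channel (e.g., unaided) conditions and AB the combined (e.g., aided) condition; C(t) = 1 is the unlimited-capacity parallel benchmark, with values above or below 1 indicating super- or limited capacity, respectively.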

  11. Clarity: An Open Source Manager for Laboratory Automation

    PubMed Central

    Delaney, Nigel F.; Echenique, José Rojas; Marx, Christopher J.

    2013-01-01

    Software to manage automated laboratories interfaces with hardware instruments, gives users a way to specify experimental protocols, and schedules activities to avoid hardware conflicts. In addition to these basics, modern laboratories need software that can run multiple different protocols in parallel and that can be easily extended to interface with a constantly growing diversity of techniques and instruments. We present Clarity: a laboratory automation manager that is hardware agnostic, portable, extensible and open source. Clarity provides critical features including remote monitoring, robust error reporting by phone or email, and full state recovery in the event of a system crash. We discuss the basic organization of Clarity; demonstrate an example of its implementation for the automated analysis of bacterial growth; and describe how the program can be extended to manage new hardware. Clarity is mature; well documented; actively developed; written in C# for the Common Language Infrastructure; and is free and open source software. These advantages set Clarity apart from currently available laboratory automation programs. PMID:23032169

  12. Laboratory automation: trajectory, technology, and tactics.

    PubMed

    Markin, R S; Whalen, S A

    2000-05-01

    Laboratory automation is in its infancy, following a path parallel to the development of laboratory information systems in the late 1970s and early 1980s. Changes on the horizon in healthcare and clinical laboratory service that affect the delivery of laboratory results include the increasing age of the population in North America, the implementation of the Balanced Budget Act (1997), and the creation of disease management companies. Major technology drivers include outcomes optimization and phenotypically targeted drugs. Constant cost pressures in the clinical laboratory have forced diagnostic manufacturers into less than optimal profitability states. Laboratory automation can be a tool for the improvement of laboratory services and may decrease costs. The key to improvement of laboratory services is implementation of the correct automation technology. The design of this technology should be driven by required functionality. Automation design issues should be centered on the understanding of the laboratory and its relationship to healthcare delivery and the business and operational processes in the clinical laboratory. Automation design philosophy has evolved from a hardware-based approach to a software-based approach. Process control software to support repeat testing, reflex testing, and transportation management, and overall computer-integrated manufacturing approaches to laboratory automation implementation are rapidly expanding areas. It is clear that hardware and software are functionally interdependent and that the interface between the laboratory automation system and the laboratory information system is a key component. The cost-effectiveness of automation solutions suggested by vendors, however, has been difficult to evaluate because the number of automation installations are few and the precision with which operational data have been collected to determine payback is suboptimal. The trend in automation has moved from total laboratory automation to a

  13. An automated laboratory-scale methodology for the generation of sheared mammalian cell culture samples.

    PubMed

    Joseph, Adrian; Goldrick, Stephen; Mollet, Michael; Turner, Richard; Bender, Jean; Gruber, David; Farid, Suzanne S; Titchener-Hooker, Nigel

    2017-05-01

    Continuous disk-stack centrifugation is typically used for the removal of cells and cellular debris from mammalian cell culture broths at manufacturing-scale. The use of scale-down methods to characterise disk-stack centrifugation performance enables substantial reductions in material requirements and allows a much wider design space to be tested than is currently possible at pilot-scale. The process of scaling down centrifugation has historically been challenging due to the difficulties in mimicking the Energy Dissipation Rates (EDRs) in typical machines. This paper describes an alternative and easy-to-assemble automated capillary-based methodology to generate levels of EDRs consistent with those found in a continuous disk-stack centrifuge. Variations in EDR were achieved through changes in capillary internal diameter and the flow rate of operation through the capillary. The EDRs found to match the levels of shear in the feed zone of a pilot-scale centrifuge using the experimental method developed in this paper (2.4×10^5 W/kg) are consistent with those obtained through previously published computational fluid dynamic (CFD) studies (2.0×10^5 W/kg). Furthermore, this methodology can be incorporated into existing scale-down methods to model the process performance of continuous disk-stack centrifuges. This was demonstrated through the characterisation of the effects of culture hold time, culture temperature and EDR on centrate quality. © 2017 The Authors. Biotechnology Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
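
    As a rough cross-check on such numbers, the average EDR in a capillary under the simplifying assumption of fully developed laminar flow follows from the Hagen-Poiseuille pressure drop: dissipated power dP*Q divided by the mass of fluid in the capillary, with the length cancelling out. The sketch below uses illustrative operating values, not the paper's; note that at high flow rates the laminar assumption can break down, which is one reason the published EDR levels rely on CFD:

        import math

        def mean_edr(q, d, mu=1e-3, rho=1000.0):
            """Average energy dissipation rate (W/kg) for laminar flow in
            a capillary: EDR = dP*Q / (rho*V), with dP = 128*mu*L*Q/(pi*d^4)
            and V = pi*d^2*L/4; the capillary length L cancels."""
            return 512.0 * mu * q**2 / (rho * math.pi**2 * d**6)

        q = 50e-6 / 60.0      # 50 mL/min in m^3/s (illustrative)
        d = 0.25e-3           # 0.25 mm inner diameter (illustrative)
        print(f"mean EDR: {mean_edr(q, d):.2e} W/kg")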

  14. An 8-Fold Parallel Reactor System for Combinatorial Catalysis Research

    PubMed Central

    Stoll, Norbert; Allwardt, Arne; Dingerdissen, Uwe

    2006-01-01

    Increasing economic globalization and mounting time and cost pressure on the development of new raw materials for the chemical industry as well as materials and environmental engineering constantly raise the demands on technologies to be used. Parallelization, miniaturization, and automation are the main concepts involved in increasing the rate of chemical and biological experimentation. PMID:17671621

  15. Parallelism in integrated fluidic circuits

    NASA Astrophysics Data System (ADS)

    Bousse, Luc J.; Kopf-Sill, Anne R.; Parce, J. W.

    1998-04-01

    Many research groups around the world are working on integrated microfluidics. The goal of these projects is to automate and integrate the handling of liquid samples and reagents for measurement and assay procedures in chemistry and biology. Ultimately, it is hoped that this will lead to a revolution in chemical and biological procedures similar to that caused in electronics by the invention of the integrated circuit. The optimal size scale of channels for liquid flow is determined by basic constraints to be somewhere between 10 and 100 micrometers. In larger channels, mixing by diffusion takes too long; in smaller channels, the number of molecules present is so low it makes detection difficult. At Caliper, we are making fluidic systems in glass chips with channels in this size range, based on electroosmotic flow and fluorescence detection. One application of this technology is rapid assays for drug screening, such as enzyme assays and binding assays. A further challenge in this area is to perform multiple functions on a chip in parallel, without a large increase in the number of inputs and outputs. A first step in this direction is a fluidic serial-to-parallel converter. Fluidic circuits will be shown with the ability to distribute an incoming serial sample stream to multiple parallel channels.

  16. Development of design principles for automated systems in transport control.

    PubMed

    Balfe, Nora; Wilson, John R; Sharples, Sarah; Clarke, Theresa

    2012-01-01

    This article reports the results of a qualitative study investigating attitudes towards and opinions of an advanced automation system currently used in UK rail signalling. In-depth interviews were held with 10 users, key issues associated with automation were identified and the automation's impact on the signalling task investigated. The interview data highlighted the importance of the signallers' understanding of the automation and their (in)ability to predict its outputs. The interviews also covered the methods used by signallers to interact with and control the automation, and the perceived effects on their workload. The results indicate that despite a generally low level of understanding and ability to predict the actions of the automation system, signallers have developed largely successful coping mechanisms that enable them to use the technology effectively. These findings, along with parallel work identifying desirable attributes of automation from the literature in the area, were used to develop 12 principles of automation which can be used to help design new systems which better facilitate cooperative working. The work reported in this article was completed with the active involvement of operational rail staff who regularly use automated systems in rail signalling. The outcomes are currently being used to inform decisions on the extent and type of automation and user interfaces in future generations of rail control systems.

  17. An automated live imaging platform for studying merozoite egress-invasion in malaria cultures.

    PubMed

    Crick, Alex J; Tiffert, Teresa; Shah, Sheel M; Kotar, Jurij; Lew, Virgilio L; Cicuta, Pietro

    2013-03-05

    Most cases of severe and fatal malaria are caused by the intraerythrocytic asexual reproduction cycle of Plasmodium falciparum. One of the most intriguing and least understood stages in this cycle is the brief preinvasion period during which dynamic merozoite-red-cell interactions align the merozoite apex in preparation for penetration. Studies of the molecular mechanisms involved in this process face formidable technical challenges, requiring multiple observations of merozoite egress-invasion sequences in live cultures under controlled experimental conditions, using high-resolution microscopy and a variety of fluorescent imaging tools. Here we describe a first successful step in the development of a fully automated, robotic imaging platform to enable such studies. Schizont-enriched live cultures of P. falciparum were set up on an inverted stage microscope with software-controlled motorized functions. By applying a variety of imaging filters and selection criteria, we identified infected red cells that were likely to rupture imminently, and recorded their coordinates. We developed a video-image analysis method that detected and automatically recorded merozoite egress events in 100% of the 40 egress-invasion sequences recorded in this study. We observed substantial polymorphism in the dynamic condition of pre-egress infected cells, probably reflecting asynchronies in the diverse confluent processes leading to merozoite release. Copyright © 2013 Biophysical Society. Published by Elsevier Inc. All rights reserved.
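
    The abstract does not spell out the detection algorithm. One simple and commonly used way to flag candidate egress events in such recordings is to look for a spike in frame-to-frame intensity change as merozoites disperse; the sketch below illustrates only that underlying idea on synthetic frames, and the authors' actual pipeline is certainly more elaborate.

        import numpy as np

        # Flag a candidate egress event as a spike in frame-to-frame change.
        # Frames here are synthetic; an abrupt "dispersal" starts at frame 60.
        rng = np.random.default_rng(4)
        frames = rng.normal(0, 1, (100, 64, 64))
        frames[60:] += rng.normal(0, 4, (40, 64, 64))   # simulated dispersal

        diff = np.abs(np.diff(frames, axis=0)).mean(axis=(1, 2))  # mean |dframe|
        baseline = diff[:30]                                      # quiet period
        event = int(np.argmax(diff > baseline.mean() + 5 * baseline.std()))
        print("egress candidate at frame", event + 1)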

  18. Automated CFD Parameter Studies on Distributed Parallel Computers

    NASA Technical Reports Server (NTRS)

    Rogers, Stuart E.; Aftosmis, Michael; Pandya, Shishir; Tejnil, Edward; Ahmad, Jasim; Kwak, Dochan (Technical Monitor)

    2002-01-01

    The objective of the current work is to build a prototype software system that automates the process of running CFD jobs on Information Power Grid (IPG) resources. This system should remove the need for user monitoring and intervention of every single CFD job, and should enable the use of many different computers to populate a massive run matrix in the shortest time possible. Such a software system has been developed, and is known as the AeroDB script system. The approach taken for the development of AeroDB was to build several discrete modules: a database, a job-launcher module, a run-manager module to monitor each individual job, and a web-based user portal for monitoring the progress of the parameter study. The details of the AeroDB design are presented first, followed by the results of a parameter study performed using AeroDB for the analysis of a reusable launch vehicle (RLV). The paper concludes with the lessons learned in this effort and ideas for future work in this area.
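
    As a rough illustration of the module decomposition described above (database, job launcher, run manager), the following Python sketch shows one minimal way such a system could be organized. It is not AeroDB code: the run matrix is invented and "sleep 1" stands in for a hypothetical CFD solver invocation.

        import subprocess, time

        # Minimal sketch of a launcher/manager split over a tiny run matrix.
        # "sleep 1" is a stand-in for the real (hypothetical) solver command.
        jobs = {f"mach_{m:.2f}": {"cmd": ["sleep", "1"], "state": "queued"}
                for m in (0.6, 0.8, 0.9)}

        procs = {}
        for name, job in jobs.items():            # job-launcher module
            procs[name] = subprocess.Popen(job["cmd"])
            job["state"] = "running"

        while any(j["state"] == "running" for j in jobs.values()):  # run manager
            for name, p in procs.items():
                if jobs[name]["state"] == "running" and p.poll() is not None:
                    jobs[name]["state"] = "done" if p.returncode == 0 else "failed"
            time.sleep(0.2)                       # poll interval
        print(jobs)                               # a web portal would render this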

  19. Seamless Combination of Fluorescence-Activated Cell Sorting and Hanging-Drop Networks for Individual Handling and Culturing of Stem Cells and Microtissue Spheroids.

    PubMed

    Birchler, Axel; Berger, Mischa; Jäggin, Verena; Lopes, Telma; Etzrodt, Martin; Misun, Patrick Mark; Pena-Francesch, Maria; Schroeder, Timm; Hierlemann, Andreas; Frey, Olivier

    2016-01-19

    Open microfluidic cell culturing devices offer new possibilities for simplifying the loading, culturing, and harvesting of individual cells or microtissues, because liquids and cells/microtissues are directly accessible. We present a complete workflow for microfluidic handling and culturing of individual cells and microtissue spheroids, based on the hanging-drop network concept: the open microfluidic devices are seamlessly combined with fluorescence-activated cell sorting (FACS), so that individual cells, including stem cells, can be sorted directly into specified culturing compartments in a fully automated way and with high accuracy. Moreover, already-assembled microtissue spheroids can be loaded into the microfluidic structures using a conventional pipette. Cell and microtissue culturing is then performed in hanging drops under controlled perfusion. On-chip drop-size control measures were applied to stabilize the system. Cells and microtissue spheroids can be retrieved from the chip using a parallelized transfer method. The presented methodology holds great promise for combinatorial screening of stem-cell and multicellular-spheroid cultures.

  20. Comparison of automated processing of flocked swabs with manual processing of fiber swabs for detection of nasal carriage of Staphylococcus aureus.

    PubMed

    Jones, Gillian; Matthews, Roger; Cunningham, Richard; Jenks, Peter

    2011-07-01

    The sensitivity of automated culture of Staphylococcus aureus from flocked swabs versus that of manual culture of fiber swabs was prospectively compared using nasal swabs from 867 patients. Automated culture from flocked swabs significantly increased the detection rate, by 13.1% for direct culture and 10.2% for enrichment culture.

  1. Analysis of biases from parallel observations of co-located manual and automatic weather stations in Indonesia

    NASA Astrophysics Data System (ADS)

    Sopaheluwakan, Ardhasena; Fajariana, Yuaning; Satyaningsih, Ratna; Aprilina, Kharisma; Astuti Nuraini, Tri; Ummiyatul Badriyah, Imelda; Lukita Sari, Dyah; Haryoko, Urip

    2017-04-01

    Inhomogeneities are often found in long records of climate data. These can occur for various reasons, such as relocation of the observation site, changes in observation methods, and the transition to automated instruments. The shift to automated systems is inevitable and is taking place in many National Meteorological Services worldwide. However, this change of observational practice must be made cautiously, and a sufficient period of parallel observation of co-located manual and automated systems should take place, as recommended by the World Meteorological Organization. With a sufficient parallel observation period, biases between the two systems can be analyzed. In this study we analyze the biases from a yearlong parallel observation of manual and automatic weather stations at 30 locations in Indonesia. The sites span approximately 45 degrees of longitude from east to west, covering different climate characteristics and geographical settings. We studied temperature and rainfall measurements taken by both systems. We found that the biases between the two systems vary from place to place and depend more on the setting of the instruments than on climatic or geographical factors. For instance, daytime observations from the automatic weather stations were consistently higher than the manual observations, whereas night-time observations from the automatic weather stations were lower.
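
    As an illustration of the kind of day/night bias summary described, the following pandas sketch computes mean differences for one co-located station pair. All readings below are invented; a real analysis would use a year of paired observations.

        import pandas as pd

        # Invented paired readings for one co-located AWS/manual station pair.
        df = pd.DataFrame({
            "time": pd.to_datetime(["2016-01-01 06:00", "2016-01-01 13:00",
                                    "2016-01-01 19:00", "2016-01-02 01:00",
                                    "2016-01-02 13:00"]),
            "aws_temp":    [24.1, 31.0, 26.0, 23.0, 30.6],   # automatic, deg C
            "manual_temp": [24.3, 30.2, 26.4, 23.5, 29.9],   # manual, deg C
        })
        df["bias"] = df["aws_temp"] - df["manual_temp"]      # AWS minus manual
        df["daytime"] = df["time"].dt.hour.between(6, 17)    # crude day split

        print(df.groupby("daytime")["bias"].agg(["mean", "std", "count"]))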

  2. Automated three-component synthesis of a library of γ-lactams

    PubMed Central

    Fenster, Erik; Hill, David; Reiser, Oliver

    2012-01-01

    A three-component method for the synthesis of γ-lactams from commercially available maleimides, aldehydes, and amines was adapted to parallel library synthesis. Improvements to the chemistry over previous efforts include the optimization of the method to a one-pot process, the management of by-products and excess reagents, the development of an automated parallel sequence, and the adaptation of the method to permit the preparation of enantiomerically enriched products. These efforts culminated in the preparation of a library of 169 γ-lactams. PMID:23209515

  3. Multilevel decomposition of complete vehicle configuration in a parallel computing environment

    NASA Technical Reports Server (NTRS)

    Bhatt, Vinay; Ragsdell, K. M.

    1989-01-01

    This research summarizes various approaches to multilevel decomposition to solve large structural problems. A linear decomposition scheme based on the Sobieski algorithm is selected as a vehicle for automated synthesis of a complete vehicle configuration in a parallel processing environment. The research is in a developmental state. Preliminary numerical results are presented for several example problems.

  4. Parallel synthesis of a series of potentially brain penetrant aminoalkyl benzoimidazoles.

    PubMed

    Micco, Iolanda; Nencini, Arianna; Quinn, Joanna; Bothmann, Hendrick; Ghiron, Chiara; Padova, Alessandro; Papini, Silvia

    2008-03-01

    Alpha7 agonists were identified via GOLD (CCDC) docking in the putative agonist binding site of an alpha7 homology model and a series of aminoalkyl benzoimidazoles was synthesised to obtain potentially brain penetrant drugs. The array was prepared starting from the reaction of ortho-fluoronitrobenzenes with a selection of diamines, followed by reduction of the nitro group to obtain a series of monoalkylated phenylene diamines. N,N'-Carbonyldiimidazole (CDI) mediated acylation, followed by a parallel automated work-up procedure, afforded the monoacylated phenylenediamines which were cyclised under acidic conditions. Parallel work-up and purification afforded the array products in good yields and purities with a robust parallel methodology which will be useful for other libraries. Screening for alpha7 activity revealed compounds with agonist activity for the receptor.

  5. Automated target recognition and tracking using an optical pattern recognition neural network

    NASA Technical Reports Server (NTRS)

    Chao, Tien-Hsin

    1991-01-01

    The on-going development of an automatic target recognition and tracking system at the Jet Propulsion Laboratory is presented. This system is an optical pattern recognition neural network (OPRNN) that integrates an innovative optical parallel processor with a feature-extraction-based neural-net training algorithm. The parallel optical processor provides high speed and vast parallelism as well as full shift invariance. The neural network algorithm enables simultaneous discrimination of multiple noisy targets regardless of their scale, rotation, perspective, and various deformations. This fully developed OPRNN system can be effectively utilized for automated spacecraft recognition and tracking, contributing to the success of the Automated Rendezvous and Capture (AR&C) of the unmanned Cargo Transfer Vehicle (CTV). One of the most powerful optical parallel processors for automatic target recognition is the multichannel correlator. With the inherent advantages of parallel processing capability and shift invariance, multiple objects can be simultaneously recognized and tracked using this multichannel correlator. This target tracking capability can be greatly enhanced by utilizing a powerful feature-extraction-based neural network training algorithm such as the neocognitron. The OPRNN, currently under investigation at JPL, is constructed with an optical multichannel correlator whose holographic filters have been prepared using the neocognitron training algorithm. The computation speed of the neocognitron-type OPRNN is up to 10(exp 14) analog connections/sec, enabling the OPRNN to outperform its state-of-the-art electronic counterpart by at least two orders of magnitude.
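
    The shift invariance that the correlator exploits can be illustrated digitally: cross-correlation computed via FFTs peaks wherever the template sits in the scene, regardless of position. The NumPy sketch below demonstrates this on synthetic data; it is a digital analogy to, not an implementation of, the optical processor.

        import numpy as np

        # Cross-correlation via FFT is shift-invariant: the peak lands at the
        # template's location in the scene, wherever that happens to be.
        rng = np.random.default_rng(0)
        scene = rng.random((256, 256))
        template = scene[100:132, 50:82]        # plant a known 32x32 target

        corr = np.fft.ifft2(np.fft.fft2(scene) *
                            np.conj(np.fft.fft2(template, scene.shape))).real
        peak = np.unravel_index(np.argmax(corr), corr.shape)
        print("correlation peak at", peak)      # expect (100, 50)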

  6. [Establishment of Automation System for Detection of Alcohol in Blood].

    PubMed

    Tian, L L; Shen, Lei; Xue, J F; Liu, M M; Liang, L J

    2017-02-01

    To establish an automation system for the detection of alcohol content in blood. The determination was performed by an automated extraction workstation coupled with headspace gas chromatography (HS-GC). Negative-pressure blood collection, the sealing time of the headspace bottle, and the sample needle were checked and optimized during development of the automation system. Automatic sampling was compared with manual sampling. The quantitative data obtained by the automated extraction-HS-GC workstation for alcohol were stable, with relative differences between two parallel samples of less than 5%. The automated extraction was superior to manual extraction. A good linear relationship was obtained over the alcohol concentration range of 0.1-3.0 mg/mL (r ≥ 0.999), with good repeatability. The method is simple and quick, with a more standardized experimental process and more accurate data. It eliminates operator error and has good repeatability, and can be applied to the qualitative and quantitative detection of alcohol in blood. Copyright© by the Editorial Department of Journal of Forensic Medicine

  7. The Design and Evaluation of "CAPTools"--A Computer Aided Parallelization Toolkit

    NASA Technical Reports Server (NTRS)

    Yan, Jerry; Frumkin, Michael; Hribar, Michelle; Jin, Haoqiang; Waheed, Abdul; Johnson, Steve; Cross, Jark; Evans, Emyr; Ierotheou, Constantinos; Leggett, Pete; et al.

    1998-01-01

    Writing applications for high performance computers is a challenging task. Although writing code by hand still offers the best performance, it is extremely costly and often not very portable. The Computer Aided Parallelization Tools (CAPTools) are a toolkit designed to help automate the mapping of sequential FORTRAN scientific applications onto multiprocessors. CAPTools consists of the following major components: an inter-procedural dependence analysis module that incorporates user knowledge; a 'self-propagating' data partitioning module driven via user guidance; an execution control mask generation and optimization module for the user to fine tune parallel processing of individual partitions; a program transformation/restructuring facility for source code clean up and optimization; a set of browsers through which the user interacts with CAPTools at each stage of the parallelization process; and a code generator supporting multiple programming paradigms on various multiprocessors. Besides describing the rationale behind the architecture of CAPTools, the parallelization process is illustrated via case studies involving structured and unstructured meshes. The programming process and the performance of the generated parallel programs are compared against other programming alternatives based on the NAS Parallel Benchmarks, ARC3D and other scientific applications. Based on these results, a discussion on the feasibility of constructing architectural independent parallel applications is presented.

  8. Prioritizing multiple therapeutic targets in parallel using automated DNA-encoded library screening

    NASA Astrophysics Data System (ADS)

    Machutta, Carl A.; Kollmann, Christopher S.; Lind, Kenneth E.; Bai, Xiaopeng; Chan, Pan F.; Huang, Jianzhong; Ballell, Lluis; Belyanskaya, Svetlana; Besra, Gurdyal S.; Barros-Aguirre, David; Bates, Robert H.; Centrella, Paolo A.; Chang, Sandy S.; Chai, Jing; Choudhry, Anthony E.; Coffin, Aaron; Davie, Christopher P.; Deng, Hongfeng; Deng, Jianghe; Ding, Yun; Dodson, Jason W.; Fosbenner, David T.; Gao, Enoch N.; Graham, Taylor L.; Graybill, Todd L.; Ingraham, Karen; Johnson, Walter P.; King, Bryan W.; Kwiatkowski, Christopher R.; Lelièvre, Joël; Li, Yue; Liu, Xiaorong; Lu, Quinn; Lehr, Ruth; Mendoza-Losana, Alfonso; Martin, John; McCloskey, Lynn; McCormick, Patti; O'Keefe, Heather P.; O'Keeffe, Thomas; Pao, Christina; Phelps, Christopher B.; Qi, Hongwei; Rafferty, Keith; Scavello, Genaro S.; Steiginga, Matt S.; Sundersingh, Flora S.; Sweitzer, Sharon M.; Szewczuk, Lawrence M.; Taylor, Amy; Toh, May Fern; Wang, Juan; Wang, Minghui; Wilkins, Devan J.; Xia, Bing; Yao, Gang; Zhang, Jean; Zhou, Jingye; Donahue, Christine P.; Messer, Jeffrey A.; Holmes, David; Arico-Muendel, Christopher C.; Pope, Andrew J.; Gross, Jeffrey W.; Evindar, Ghotas

    2017-07-01

    The identification and prioritization of chemically tractable therapeutic targets is a significant challenge in the discovery of new medicines. We have developed a novel method that rapidly screens multiple proteins in parallel using DNA-encoded library technology (ELT). Initial efforts were focused on the efficient discovery of antibacterial leads against 119 targets from Acinetobacter baumannii and Staphylococcus aureus. The success of this effort led to the hypothesis that the relative number of ELT binders alone could be used to assess the ligandability of large sets of proteins. This concept was further explored by screening 42 targets from Mycobacterium tuberculosis. Active chemical series for six targets from our initial effort as well as three chemotypes for DHFR from M. tuberculosis are reported. The findings demonstrate that parallel ELT selections can be used to assess ligandability and highlight opportunities for successful lead and tool discovery.

  9. Comparison of Automated Processing of Flocked Swabs with Manual Processing of Fiber Swabs for Detection of Nasal Carriage of Staphylococcus aureus

    PubMed Central

    Jones, Gillian; Matthews, Roger; Cunningham, Richard; Jenks, Peter

    2011-01-01

    The sensitivity of automated culture of Staphylococcus aureus from flocked swabs versus that of manual culture of fiber swabs was prospectively compared using nasal swabs from 867 patients. Automated culture from flocked swabs significantly increased the detection rate, by 13.1% for direct culture and 10.2% for enrichment culture. PMID:21525218

  10. Role of the Controller in an Integrated Pilot-Controller Study for Parallel Approaches

    NASA Technical Reports Server (NTRS)

    Verma, Savvy; Kozon, Thomas; Ballinger, Debbi; Lozito, Sandra; Subramanian, Shobana

    2011-01-01

    Closely spaced parallel runway operations have been found to increase capacity within the National Airspace System, but poor visibility conditions reduce the use of these operations [1]. Previous research examined the concepts and procedures related to parallel runways [2][4][5]. However, there has been no investigation of the procedures associated with the strategic and tactical pairing of aircraft for these operations. This study developed and examined the pilots' and controllers' procedures and information requirements for creating aircraft pairs for closely spaced parallel runway operations. The goal was to achieve aircraft pairing with a temporal separation of 15 s (+/- 10 s error) at a coupling point 12 nmi from the runway threshold. In this paper, the role of the controller, as examined in an integrated study of controllers and pilots, is presented. The controllers utilized a pairing scheduler and new pairing interfaces to help create and maintain aircraft pairs in a high-fidelity, human-in-the-loop simulation experiment. Results show that the controllers worked as a team to achieve pairing between aircraft, and that the level of inter-controller coordination increased when the aircraft in a pair belonged to different sectors. Controller feedback did not reveal over-reliance on, or complacency with, the pairing automation or pairing procedures.

  11. Self-optimizing approach for automated laser resonator alignment

    NASA Astrophysics Data System (ADS)

    Brecher, C.; Schmitt, R.; Loosen, P.; Guerrero, V.; Pyschny, N.; Pavim, A.; Gatej, A.

    2012-02-01

    Nowadays, the assembly of laser systems is dominated by manual operations, involving elaborate alignment by means of adjustable mountings. From a competitive perspective, the most challenging problem in laser source manufacturing is price pressure, a result of cost competition exerted mainly from Asia. From an economic point of view, automated assembly of laser systems is a better approach to producing more reliable units at lower cost. However, the step from today's manual solutions towards automated assembly requires parallel developments in product design, automation equipment and assembly processes. This paper briefly introduces the idea of self-optimizing technical systems as a new approach towards highly flexible automation. Technically, the work focuses on the precision assembly of laser resonators, one of the final and most crucial assembly steps in terms of beam quality and laser power. The paper presents a new design approach for miniaturized laser systems and new automation concepts for robot-based precision assembly, as well as passive and active alignment methods based on a self-optimizing approach. Very promising results have already been achieved, considerably reducing the duration and complexity of laser resonator assembly. These results as well as future development perspectives are discussed.

  12. Microfluidic integration of parallel solid-phase liquid chromatography.

    PubMed

    Huft, Jens; Haynes, Charles A; Hansen, Carl L

    2013-03-05

    We report the development of a fully integrated microfluidic chromatography system based on a recently developed column geometry that allows for robust packing of high-performance separation columns in poly(dimethylsiloxane) microfluidic devices having integrated valves made by multilayer soft lithography (MSL). The combination of parallel high-performance separation columns and on-chip plumbing was used to achieve a fully integrated system for on-chip chromatography, including all steps of automated sample loading, programmable gradient generation, separation, fluorescent detection, and sample recovery. We demonstrate this system in the separation of fluorescently labeled DNA and parallel purification of reverse transcription polymerase chain reaction (RT-PCR) amplified variable regions of mouse immunoglobulin genes using a strong anion exchange (AEX) resin. Parallel sample recovery in an immiscible oil stream offers the advantage of low sample dilution and high recovery rates. The ability to perform nucleic acid size selection and recovery on subnanogram samples of DNA holds promise for on-chip genomics applications including sequencing library preparation, cloning, and sample fractionation for diagnostics.

  13. Automated manufacturing of chimeric antigen receptor T cells for adoptive immunotherapy using CliniMACS prodigy.

    PubMed

    Mock, Ulrike; Nickolay, Lauren; Philip, Brian; Cheung, Gordon Weng-Kit; Zhan, Hong; Johnston, Ian C D; Kaiser, Andrew D; Peggs, Karl; Pule, Martin; Thrasher, Adrian J; Qasim, Waseem

    2016-08-01

    Novel cell therapies derived from human T lymphocytes are exhibiting enormous potential in early-phase clinical trials in patients with hematologic malignancies. Ex vivo modification of T cells is currently limited to a small number of centers with the required infrastructure and expertise. The process requires isolation, activation, transduction, expansion and cryopreservation steps. To simplify procedures and widen applicability for clinical therapies, automation of these procedures is being developed. The CliniMACS Prodigy (Miltenyi Biotec) has recently been adapted for lentiviral transduction of T cells and here we analyse the feasibility of a clinically compliant T-cell engineering process for the manufacture of T cells encoding chimeric antigen receptors (CAR) for CD19 (CAR19), a widely targeted antigen in B-cell malignancies. Using a closed, single-use tubing set we processed mononuclear cells from fresh or frozen leukapheresis harvests collected from healthy volunteer donors. Cells were phenotyped and subjected to automated processing and activation using TransAct, a polymeric nanomatrix activation reagent incorporating CD3/CD28-specific antibodies. Cells were then transduced and expanded in the CentriCult-Unit of the tubing set, under stabilized culture conditions with automated feeding and media exchange. The process was continuously monitored to determine kinetics of expansion, transduction efficiency and phenotype of the engineered cells in comparison with small-scale transductions run in parallel. We found that transduction efficiencies, phenotype and function of CAR19 T cells were comparable with existing procedures and overall T-cell yields sufficient for anticipated therapeutic dosing. The automation of closed-system T-cell engineering should improve dissemination of emerging immunotherapies and greatly widen applicability. Copyright © 2016. Published by Elsevier Inc.

  14. Cultural Heritage: An example of graphical documentation with automated photogrammetric systems

    NASA Astrophysics Data System (ADS)

    Giuliano, M. G.

    2014-06-01

    In the field of Cultural Heritage, automated photogrammetric systems based on Structure from Motion (SfM) techniques are widely used, in particular for the study and documentation of ancient ruins. This work was carried out during the PhD project that produced the "Carta Archeologica del territorio intorno al monte Massico". The study presents the archaeological documentation of the mausoleum "Torre del Ballerino", located in the south-west area of Falciano del Massico, along the Via Appia. The graphic documentation was achieved using a photogrammetric system (Image-Based Modeling) and a classical survey with a Nikon Nivo C total station. Data acquisition was carried out with a Canon EOS 5D Mark II digital camera fitted with a Canon EF 17-40 mm f/4L USM lens @ 20 mm, with images captured in RAW and corrected in Adobe Lightroom. During data processing, camera calibration and orientation were carried out in the software Agisoft PhotoScan, and the final result was a scaled 3D model of the monument, imported into MeshLab for different views. Three orthophotos in JPG format were extracted from the model and then imported into AutoCAD to obtain façade surveys.

  15. Digital hydraulic drive for microfluidics and miniaturized cell culture devices based on shape memory alloy actuators

    NASA Astrophysics Data System (ADS)

    Tsai, Cheng-Han; Wu, Xuanye; Kuan, Da-Han; Zimmermann, Stefan; Zengerle, Roland; Koltay, Peter

    2018-08-01

    Microfluidic cultivation and manipulation of cells are becoming increasingly important for culturing and analyzing individual living cells. Such microfluidic systems make it possible to explore the phenotypic differences between thousands of genetically identical cells, or to run pharmacological tests in parallel, which is impossible to achieve with traditional macroscopic cell culture methods. Accordingly, many microfluidic systems and devices have been developed for cell-biological studies such as cell culture, cell sorting, and cell lysis. However, these microfluidic systems are still limited by external pressure sources, which are usually bulky and must be connected by fluidic tubing, leading to complex and delicate setups. To provide a miniaturized, more robust actuation system, a novel, compact, low-power digital hydraulic drive (DHD) has been developed, intended for use in portable and automated microfluidic systems for various applications. The DHD considered in this work consists of a shape memory alloy (SMA) actuator and a pneumatic cylinder. The switching time of the digital modes (pressure ON versus OFF) can be adjusted from 1 s to minutes. Thus, DHDs may have many applications for driving microfluidic devices. In this work, different implementations of DHDs are presented and their performance is characterized experimentally. In particular, it is shown that DHDs can be used for microfluidic large-scale integration (mLSI) valve control (256 valves in parallel) as well as, potentially, for droplet-based microfluidic systems. As a further application example, high-throughput mixing of cell cultures (96 wells in parallel) is demonstrated, employing the DHD to drive a so-called 'functional lid' (FL) that enables a miniaturized micro-bioreactor in a regular 96-well plate.

  16. Signal amplification of FISH for automated detection using image cytometry.

    PubMed

    Truong, K; Boenders, J; Maciorowski, Z; Vielh, P; Dutrillaux, B; Malfoy, B; Bourgeois, C A

    1997-05-01

    The purpose of this study was to improve the detection of FISH signals, so that spot counting by a fully automated image cytometer would be comparable to that obtained visually under the microscope. Two systems of spot scoring, visual and automated counting, were investigated in parallel on stimulated human lymphocytes with FISH using a biotinylated centromeric probe for chromosome 3. Signal characteristics were first analyzed on images recorded with a charge-coupled device (CCD) camera. Numbers of spots per nucleus were scored visually on these recorded images and automatically with a DISCOVERY image analyzer. Several fluorochromes, amplification systems and pretreatments were tested. Our results for both visual and automated scoring show that the tyramide signal amplification (TSA) system gives the best amplification of signal if pepsin treatment is applied prior to FISH. Accuracy of the automated scoring, however, remained low (58% of nuclei containing two spots) compared to visual scoring, because of the high intranuclear variation between FISH spots.

  17. Automated Coal-Mining System

    NASA Technical Reports Server (NTRS)

    Gangal, M. D.; Isenberg, L.; Lewis, E. V.

    1985-01-01

    Proposed system offers safety and large return on investment. System, operating by year 2000, employs machines and processes based on proven principles. According to concept, line of parallel machines, connected in groups of four to service modules, attacks face of coal seam. High-pressure water jets and central auger on each machine break face. Jaws scoop up coal chunks, and auger grinds them and forces fragments into slurry-transport system. Slurry pumped through pipeline to point of use. Concept for highly automated coal-mining system increases productivity, makes mining safer, and protects health of mine workers.

  18. Development of a novel automated cell isolation, expansion, and characterization platform.

    PubMed

    Franscini, Nicola; Wuertz, Karin; Patocchi-Tenzer, Isabel; Durner, Roland; Boos, Norbert; Graf-Hausner, Ursula

    2011-06-01

    Implementation of regenerative medicine in the clinical setting requires not only biological inventions but also the development of reproducible and safe methods for cell isolation and expansion. As the currently used manual techniques do not fulfill these requirements, there is a clear need to develop an adequate robotic platform for automated, large-scale production of cells or cell-based products. Here, we demonstrate an automated liquid-handling cell-culture platform that can be used to isolate, expand, and characterize human primary cells (e.g., from intervertebral disc tissue) with results that are comparable to the manual procedure. Specifically, no differences could be observed for cell yield, viability, aggregation rate, growth rate, and phenotype. Importantly, all steps, from the enzymatic isolation of cells from the biopsy to the final quality control, can be performed entirely by the automated system, thanks to novel tools that were incorporated into the platform. This automated cell-culture platform can therefore fully replace manual processes in areas that require high throughput while maintaining stability and safety, such as clinical or industrial settings. Copyright © 2011 Society for Laboratory Automation and Screening. Published by Elsevier Inc. All rights reserved.

  19. Automated culture of aquatic model organisms: shrimp larvae husbandry for the needs of research and aquaculture.

    PubMed

    Mutalipassi, M; Di Natale, M; Mazzella, V; Zupo, V

    2018-01-01

    Modern research makes frequent use of animal models, that is, organisms raised and bred experimentally to aid the understanding of biological and chemical processes affecting organisms or whole environments. The development of flexible, reprogrammable and modular systems that can support the automated production of 'not-easy-to-keep' species is important for scientific purposes and for such aquaculture needs as the production of live foods, the culture of small larvae and the testing of new culture procedures. For this reason, we designed and built a programmable experimental system adaptable to the culture of various aquatic organisms at different developmental stages. The system is based on culture cylinders contained in operational tanks connected to water-conditioning tanks. A programmable central processing unit controls the operations, that is, water changes, temperature, light irradiance, the opening and closing of valves for the discharge of uneaten food, water circulation, and filtration and disinfection systems, according to the information received from various probes. Various devices may be set to modify water circulation and water changes to fulfil the needs of given organisms, avoid damage to delicate structures, improve feeding performance and reduce the risk of movements over the water surface. The results obtained indicate that the system is effective in the production of shrimp larvae, being able to produce Hippolyte inermis post-larvae with low mortality compared with standard operating procedures followed by human operators. The patented prototype described in the present study is therefore a possible solution for automating and simplifying the rearing of small invertebrates in the laboratory and in production plants.
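
    As an illustration of the probe-driven control logic described above, the following Python sketch shows one pass of a minimal husbandry loop. The setpoint, probe names and actuators are hypothetical; the patented system is considerably more elaborate.

        import time

        SETPOINT_C = 24.0      # target water temperature (illustrative)

        def control_step(probes, actuators):
            """One pass of a probe-driven husbandry loop (devices hypothetical)."""
            if probes["temperature"] < SETPOINT_C - 0.5:
                actuators["heater"] = True       # simple deadband control
            elif probes["temperature"] > SETPOINT_C + 0.5:
                actuators["heater"] = False
            # discharge uneaten food on a fixed schedule
            actuators["discharge_valve"] = (time.localtime().tm_hour in (8, 20))
            return actuators

        print(control_step({"temperature": 23.2}, {}))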

  20. Design and validation of a clinical-scale bioreactor for long-term isolated lung culture.

    PubMed

    Charest, Jonathan M; Okamoto, Tatsuya; Kitano, Kentaro; Yasuda, Atsushi; Gilpin, Sarah E; Mathisen, Douglas J; Ott, Harald C

    2015-06-01

    The primary treatment for end-stage lung disease is lung transplantation. However, donor organ shortage remains a major barrier for many patients. In recent years, techniques for maintaining lungs ex vivo for evaluation and short-term (<12 h) resuscitation have come into more widespread use in an attempt to expand the donor pool. In parallel, progress in whole organ engineering has provided the potential perspective of patient derived grafts grown on demand. As both of these strategies advance to more complex interventions for lung repair and regeneration, the need for a long-term organ culture system becomes apparent. Herein we describe a novel clinical scale bioreactor capable of maintaining functional porcine and human lungs for at least 72 h in isolated lung culture (ILC). The fully automated, computer controlled, sterile, closed circuit system enables physiologic pulsatile perfusion and negative pressure ventilation, while gas exchange function, and metabolism can be evaluated. Creation of this stable, biomimetic long-term culture environment will enable advanced interventions in both donor lungs and engineered grafts of human scale. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Automation of Data Traffic Control on DSM Architecture

    NASA Technical Reports Server (NTRS)

    Frumkin, Michael; Jin, Hao-Qiang; Yan, Jerry

    2001-01-01

    The design of distributed shared memory (DSM) computers liberates users from the duty of distributing data across processors and allows for the incremental development of parallel programs using, for example, OpenMP or Java threads. The DSM architecture greatly simplifies the development of parallel programs that perform well on a few processors. However, achieving good program scalability on DSM computers requires that the user understand the data flow in the application and use various techniques to avoid data traffic congestion. In this paper we discuss a number of such techniques, including data blocking, data placement, data transposition and page-size control, and evaluate their efficiency on the NAS (NASA Advanced Supercomputing) Parallel Benchmarks. We also present a tool which automates the detection of constructs causing data congestion in array-oriented Fortran codes and advises the user on code transformations for improving data traffic in the application.
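
    Of the techniques listed, data blocking is the easiest to illustrate: operating on tiles small enough to stay resident near a processor keeps traffic local. The NumPy sketch below shows a blocked transpose; it illustrates the idea only and is not the paper's Fortran tooling.

        import numpy as np

        def blocked_transpose(a, b=64):
            """Transpose a square array in b-by-b tiles for better locality."""
            n = a.shape[0]
            out = np.empty_like(a.T)
            for i in range(0, n, b):
                for j in range(0, n, b):
                    out[j:j+b, i:i+b] = a[i:i+b, j:j+b].T   # one tile at a time
            return out

        a = np.arange(512 * 512, dtype=np.float64).reshape(512, 512)
        assert (blocked_transpose(a) == a.T).all()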

  2. Diagnostic accuracy of uriSed automated urine microscopic sediment analyzer and dipstick parameters in predicting urine culture test results.

    PubMed

    Huysal, Kağan; Budak, Yasemin U; Karaca, Ayse Ulusoy; Aydos, Murat; Kahvecioğlu, Serdar; Bulut, Mehtap; Polat, Murat

    2013-01-01

    Urinary tract infection (UTI) is one of the most common types of infection. Currently, diagnosis is primarily based on microbiologic culture, which is time-consuming and labor-intensive. The aim of this study was to assess the diagnostic accuracy of urinalysis results from UriSed (77 Electronica, Budapest, Hungary), an automated microscopic image-based sediment analyzer, in predicting positive urine cultures. We examined a total of 384 urine specimens from hospitalized patients and outpatients attending our hospital on the same day for urinalysis, dipstick tests and semi-quantitative urine culture. The urinalysis results were compared with those of conventional semi-quantitative urine culture. Of 384 urinary specimens, 68 were positive for bacteriuria by culture, and were thus considered true positives. Comparison of these results with those obtained from the UriSed analyzer indicated that the analyzer had a specificity of 91.1%, a sensitivity of 47.0%, a positive predictive value (PPV) of 53.3% (95% confidence interval (CI) = 40.8-65.3%), and a negative predictive value (NPV) of 88.8% (95% CI = 85.0-91.8%). The accuracy was 83.3% when the urine leukocyte parameter was used, 76.8% when bacteriuria analysis of urinary sediment was used, and 85.1% when the bacteriuria and leukocyturia parameters were combined. The presence of nitrite was the best indicator of culture positivity (99.3% specificity) but had a negative likelihood ratio of 0.7, indicating that it was not a reliable clinical test. Although the specificity of the UriSed analyzer was within acceptable limits, its sensitivity was low. Thus, UriSed urinalysis results do not accurately predict the outcome of culture.
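
    The reported predictive values follow directly from the stated counts and rates, which makes them easy to check; the small residual discrepancies reflect rounding in the published percentages. A quick Python verification:

        # Reconstruct the confusion matrix from the reported figures.
        total, positives = 384, 68
        sens, spec = 0.470, 0.911

        tp = round(sens * positives)              # 32 true positives
        tn = round(spec * (total - positives))    # 288 true negatives
        fp = (total - positives) - tn             # 28 false positives
        fn = positives - tp                       # 36 false negatives

        print("PPV:", tp / (tp + fp))             # ~0.533, as reported
        print("NPV:", tn / (tn + fn))             # ~0.889, vs reported 0.888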

  3. Cloud parallel processing of tandem mass spectrometry based proteomics data.

    PubMed

    Mohammed, Yassene; Mostovenko, Ekaterina; Henneman, Alex A; Marissen, Rob J; Deelder, André M; Palmblad, Magnus

    2012-10-05

    Data analysis in mass-spectrometry-based proteomics struggles to keep pace with advances in instrumentation and the increasing rate of data acquisition. Analyzing this data involves multiple steps requiring diverse software, using different algorithms and data formats. The speed and performance of mass spectral search engines are continuously improving, although not necessarily fast enough to meet the challenges posed by the volumes of data acquired. Improving and parallelizing the search algorithms is one possibility; data decomposition presents another, simpler strategy for introducing parallelism. We describe a general method for parallelizing the identification of tandem mass spectra using data decomposition that keeps the search engine intact and wraps the parallelization around it. We introduce two algorithms for decomposing mzXML files and recomposing the resulting pepXML files. This makes the approach applicable to different search engines, including those relying on sequence databases and those searching spectral libraries. We use cloud computing to deliver the computational power, and scientific workflow engines to interface and automate the different processing steps. We show how to leverage these technologies to achieve faster data analysis in proteomics, and present three scientific workflows for parallel database as well as spectral library searches using our data decomposition programs, X!Tandem and SpectraST.
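
    The decomposition strategy keeps the search engine untouched and wraps parallelism around it: split the input, search each chunk independently, and merge the outputs. The self-contained Python sketch below shows this general shape with a trivial stand-in for the engine; the published workflow instead splits mzXML files and shells out to X!Tandem or SpectraST.

        from concurrent.futures import ProcessPoolExecutor

        def search_chunk(spectra):
            # stand-in "search"; the real worker would invoke the engine
            return [max(s) for s in spectra]

        def chunks(xs, n):
            step = (len(xs) + n - 1) // n
            return [xs[i:i + step] for i in range(0, len(xs), step)]

        if __name__ == "__main__":
            spectra = [[i, i + 1, i + 2] for i in range(1000)]   # toy spectra
            with ProcessPoolExecutor() as pool:                  # decompose
                parts = pool.map(search_chunk, chunks(spectra, 8))
            results = [r for part in parts for r in part]        # recompose
            print(len(results))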

  4. Conventional versus automated measurement of blood pressure in primary care patients with systolic hypertension: randomised parallel design controlled trial

    PubMed Central

    Godwin, Marshall; Dawes, Martin; Kiss, Alexander; Tobe, Sheldon W; Grant, F Curry; Kaczorowski, Janusz

    2011-01-01

    Objective: To compare the quality and accuracy of manual office blood pressure and automated office blood pressure using the awake ambulatory blood pressure as a gold standard. Design: Multi-site cluster randomised controlled trial. Setting: Primary care practices in five cities in eastern Canada. Participants: 555 patients with systolic hypertension and no serious comorbidities under the care of 88 primary care physicians in 67 practices in the community. Interventions: Practices were randomly allocated to either ongoing use of manual office blood pressure (control group) or automated office blood pressure (intervention group) using the BpTRU device. The last routine manual office blood pressure (mm Hg) was obtained from each patient’s medical record before enrolment. Office blood pressure readings were compared before and after enrolment in the intervention and control groups; all readings were also compared with the awake ambulatory blood pressure. Main outcome measure: Difference in systolic blood pressure between awake ambulatory blood pressure minus automated office blood pressure and awake ambulatory blood pressure minus manual office blood pressure. Results: Cluster randomisation allocated 31 practices (252 patients) to manual office blood pressure and 36 practices (303 patients) to automated office blood pressure measurement. The most recent routine manual office blood pressure (149.5 (SD 10.8)/81.4 (8.3)) was higher than automated office blood pressure (135.6 (17.3)/77.7 (10.9)) (P<0.001). In the control group, routine manual office blood pressure before enrolment (149.9 (10.7)/81.8 (8.5)) was reduced to 141.4 (14.6)/80.2 (9.5) after enrolment (P<0.001/P=0.01), but the reduction in the intervention group from manual office to automated office blood pressure was significantly greater (P<0.001/P=0.02). On the first study visit after enrolment, the estimated mean difference for the intervention group between the awake ambulatory systolic/diastolic blood pressure

  5. Parallel Density-Based Clustering for Discovery of Ionospheric Phenomena

    NASA Astrophysics Data System (ADS)

    Pankratius, V.; Gowanlock, M.; Blair, D. M.

    2015-12-01

    Ionospheric total electron content maps derived from global networks of dual-frequency GPS receivers can reveal a plethora of ionospheric features in real-time and are key to space weather studies and natural hazard monitoring. However, growing data volumes from expanding sensor networks are making manual exploratory studies challenging. As the community is heading towards Big Data ionospheric science, automation and Computer-Aided Discovery become indispensable tools for scientists. One problem of machine learning methods is that they require domain-specific adaptations in order to be effective and useful for scientists. Addressing this problem, our Computer-Aided Discovery approach allows scientists to express various physical models as well as perturbation ranges for parameters. The search space is explored through an automated system and parallel processing of batched workloads, which finds corresponding matches and similarities in empirical data. We discuss density-based clustering as a particular method we employ in this process. Specifically, we adapt Density-Based Spatial Clustering of Applications with Noise (DBSCAN). This algorithm groups geospatial data points based on density. Clusters of points can be of arbitrary shape, and the number of clusters is not predetermined by the algorithm; only two input parameters need to be specified: (1) a distance threshold, (2) a minimum number of points within that threshold. We discuss an implementation of DBSCAN for batched workloads that is amenable to parallelization on manycore architectures such as Intel's Xeon Phi accelerator with 60+ general-purpose cores. This manycore parallelization can cluster large volumes of ionospheric total electronic content data quickly. Potential applications for cluster detection include the visualization, tracing, and examination of traveling ionospheric disturbances or other propagating phenomena. Acknowledgments. We acknowledge support from NSF ACI-1442997 (PI V. Pankratius).
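
    Because DBSCAN is fully specified by those two inputs, a minimal example is easy to give. The sketch below uses scikit-learn's implementation on synthetic geospatial points standing in for TEC detections; the parameter values are illustrative, not those used in the study.

        import numpy as np
        from sklearn.cluster import DBSCAN

        # Synthetic (lon, lat) detections: one dense cluster plus uniform noise.
        rng = np.random.default_rng(1)
        cluster = rng.normal(loc=(10.0, 45.0), scale=0.2, size=(200, 2))
        noise = rng.uniform(low=(0.0, 30.0), high=(20.0, 60.0), size=(50, 2))
        points = np.vstack([cluster, noise])

        # eps = distance threshold; min_samples = minimum points within it.
        labels = DBSCAN(eps=0.5, min_samples=10).fit_predict(points)
        print("clusters found:", labels.max() + 1,
              "| noise points:", (labels == -1).sum())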

  6. The State of Planning of Automation Projects in the Libraries of Canada.

    ERIC Educational Resources Information Center

    Clement, Hope E. A.

    Library automation in Canada is complicated by the large size, dispersed population, and cultural diversity of the country. The National Library of Canada is actively planning a Canadian library network based on national bibliographic services for which the library is now developing automated systems. Canadian libraries are involved in the…

  7. Social aspects of automation: Some critical insights

    NASA Astrophysics Data System (ADS)

    Nouzil, Ibrahim; Raza, Ali; Pervaiz, Salman

    2017-09-01

    Sustainable development has been recognized globally as one of the major driving forces behind current technological innovations. To achieve sustainable development and attain its associated goals, it is very important to properly address its concerns in the different aspects of technological innovation. Several industrial sectors have enjoyed productivity and economic gains from the advent of automation technology, so it is important to characterize the sustainability of that technology. Sustainability is a key factor that will determine the future of our neighbours in time, and it must be tightly wrapped around the double-edged sword of technology. In this study, the different impacts of automation are addressed using the 'Circles of Sustainability' approach as a framework, covering economic, political, cultural and ecological aspects and their implications. A systematic literature review of automation technology from its inception is outlined and plotted against its many outcomes, covering a broad spectrum. The study focuses mainly on the social aspects of automation technology. It also reviews the literature to analyse employment deficiency as one end of the social-impact spectrum; on the other end of the spectrum, benefits to society through technological advancements, such as the Internet of Things (IoT) coupled with automation, are presented.

  8. Quantification of Dynamic Morphological Drug Responses in 3D Organotypic Cell Cultures by Automated Image Analysis

    PubMed Central

    Härmä, Ville; Schukov, Hannu-Pekka; Happonen, Antti; Ahonen, Ilmari; Virtanen, Johannes; Siitari, Harri; Åkerfelt, Malin; Lötjönen, Jyrki; Nees, Matthias

    2014-01-01

    Glandular epithelial cells differentiate into complex multicellular or acinar structures, when embedded in three-dimensional (3D) extracellular matrix. The spectrum of different multicellular morphologies formed in 3D is a sensitive indicator for the differentiation potential of normal, non-transformed cells compared to different stages of malignant progression. In addition, single cells or cell aggregates may actively invade the matrix, utilizing epithelial, mesenchymal or mixed modes of motility. Dynamic phenotypic changes involved in 3D tumor cell invasion are sensitive to specific small-molecule inhibitors that target the actin cytoskeleton. We have used a panel of inhibitors to demonstrate the power of automated image analysis as a phenotypic or morphometric readout in cell-based assays. We introduce a streamlined stand-alone software solution that supports large-scale high-content screens, based on complex and organotypic cultures. AMIDA (Automated Morphometric Image Data Analysis) allows quantitative measurements of large numbers of images and structures, with a multitude of different spheroid shapes, sizes, and textures. AMIDA supports an automated workflow, and can be combined with quality control and statistical tools for data interpretation and visualization. We have used a representative panel of 12 prostate and breast cancer lines that display a broad spectrum of different spheroid morphologies and modes of invasion, challenged by a library of 19 direct or indirect modulators of the actin cytoskeleton which induce systematic changes in spheroid morphology and differentiation versus invasion. These results were independently validated by 2D proliferation, apoptosis and cell motility assays. We identified three drugs that primarily attenuated the invasion and formation of invasive processes in 3D, without affecting proliferation or apoptosis. Two of these compounds block Rac signalling, one affects cellular cAMP/cGMP accumulation. Our approach supports

  9. Automated analysis of food-borne pathogens using a novel microbial cell culture, sensing and classification system.

    PubMed

    Xiang, Kun; Li, Yinglei; Ford, William; Land, Walker; Schaffer, J David; Congdon, Robert; Zhang, Jing; Sadik, Omowunmi

    2016-02-21

    We hereby report the design and implementation of an Autonomous Microbial Cell Culture and Classification (AMC(3)) system for rapid detection of food pathogens. Traditional food testing methods require multistep procedures and long incubation periods, and are thus prone to human error. AMC(3) introduces a 'one click' approach to the detection and classification of pathogenic bacteria: once the cultured materials are prepared, all operations are automatic. AMC(3) is an integrated sensor-array platform in a microbial fuel cell system composed of a multi-potentiostat, an automated data collection system (a Python program with a Yocto Maxi-coupler electromechanical relay module) and a powerful classification program. The classification scheme is an oracle-based system consisting of a Probabilistic Neural Network (PNN), Support Vector Machines (SVM) and a General Regression Neural Network (GRNN). Differential Pulse Voltammetry (DPV) is performed on standard or unknown samples. Then, using preset feature extractions and quality control, accepted data are analyzed by the intelligent classification system. In a typical use, thirty-two extracted features were analyzed to correctly classify the following pathogens: Escherichia coli ATCC#25922, Escherichia coli ATCC#11775, and Staphylococcus epidermidis ATCC#12228. Accuracies in the 85.4% range were recorded for unknown samples, within a shorter time period than the industry standard of 24 hours.
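
    Of the three classifiers named, the Probabilistic Neural Network is compact enough to sketch: it is essentially a Gaussian-kernel density estimate per class, with the class of highest summed kernel response winning. The toy example below is a minimal illustration, not the AMC(3) code; the features are synthetic stand-ins for the 32 DPV-derived features, and sigma is an arbitrary smoothing choice.

        import numpy as np

        def pnn_predict(X_train, y_train, X_test, sigma=0.5):
            """Minimal PNN: per-class Gaussian-kernel density, argmax wins."""
            preds = []
            for x in X_test:
                d2 = ((X_train - x) ** 2).sum(axis=1)      # squared distances
                k = np.exp(-d2 / (2 * sigma ** 2))         # kernel responses
                scores = {c: k[y_train == c].mean() for c in np.unique(y_train)}
                preds.append(max(scores, key=scores.get))
            return np.array(preds)

        rng = np.random.default_rng(2)
        X = np.vstack([rng.normal(0, 1, (30, 4)), rng.normal(3, 1, (30, 4))])
        y = np.array([0] * 30 + [1] * 30)
        print(pnn_predict(X, y, np.array([[0, 0, 0, 0], [3, 3, 3, 3]])))  # [0 1]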

  10. Evaluation of the Paratrend Multi-Analyte Sensor for Potential Utilization in Long-Duration Automated Cell Culture Monitoring

    NASA Technical Reports Server (NTRS)

    Hwang, Emma Y.; Pappas, Dimitri; Jeevarajan, Antony S.; Anderson, Melody M.

    2004-01-01

    BACKGROUND: Compact and automated sensors are desired for assessing the health of cell cultures in biotechnology experiments. While several single-analyte sensors exist to measure culture health, a multi-analyte sensor would simplify the cell culture system. One such multi-analyte sensor, the Paratrend 7 manufactured by Diametrics Medical, consists of three optical fibers for measuring pH, dissolved carbon dioxide (pCO(2)) and dissolved oxygen (pO(2)), and a thermocouple to measure temperature. The sensor bundle was designed for intra-vascular measurements in clinical settings, and can be used in bioreactors operated both on the ground and in NASA's Space Shuttle and International Space Station (ISS) experiments. METHODS: A Paratrend 7 sensor was placed at the outlet of a bioreactor inoculated with BHK-21 (baby hamster kidney) cells. The pH, pCO(2), pO(2), and temperature data were transferred continuously to an external computer. Cell culture medium, manually extracted from the bioreactor through a sampling port, was also assayed using a bench-top blood gas analyzer (BGA). RESULTS: Two Paratrend 7 sensors were used over a single cell culture experiment (64 days). When compared to the manually obtained BGA samples, these sensors had good agreement for pH, pCO(2), and pO(2), with bias (and precision) of 0.005 (0.024), 8.0 mmHg (4.4 mmHg), and 11 mmHg (17 mmHg), respectively. A third Paratrend sensor (operated for 141 days) had similar agreement (0.02 +/- 0.15 for pH, -4 +/- 8 mmHg for pCO(2), and 24 +/- 18 mmHg for pO(2)). CONCLUSION: The resulting biases and precisions are comparable to Paratrend sensor clinical results. Although the pO(2) differences may be acceptable for clinically relevant measurement ranges, the O(2) sensor in this bundle may not be reliable enough for the pO(2) ranges in these cell culture studies without periodic calibration.
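
    Bias and precision as used here are the mean and standard deviation of the sensor-minus-reference differences (a Bland-Altman summary). The sketch below computes them for made-up pH readings; the values are illustrative only, not data from the study.

        import numpy as np

        # Invented paired pH readings: multi-analyte sensor vs bench-top BGA.
        sensor = np.array([7.38, 7.41, 7.35, 7.40, 7.37])
        bga    = np.array([7.37, 7.40, 7.36, 7.39, 7.37])

        diff = sensor - bga                      # sensor minus reference
        print(f"bias = {diff.mean():+.3f}, precision = {diff.std(ddof=1):.3f}")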

  11. Automation in the clinical microbiology laboratory.

    PubMed

    Novak, Susan M; Marlowe, Elizabeth M

    2013-09-01

    Imagine a clinical microbiology laboratory where a patient's specimens are placed on a conveyor belt and sent along an automation line for processing and plating. Technologists need only log onto a computer to view the images of a culture and send a colony to a mass spectrometer for identification. Once a pathogen is identified, the system knows to send the colony for susceptibility testing. This is the future of the clinical microbiology laboratory. This article outlines the operational and staffing challenges facing clinical microbiology laboratories and the evolution of automation that is shaping the way laboratory medicine will be practiced in the future. Copyright © 2013 Elsevier Inc. All rights reserved.

  12. Automated Purification of Recombinant Proteins: Combining High-throughput with High Yield

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Chiann Tso; Moore, Priscilla A.; Auberry, Deanna L.

    2006-05-01

    Protein crystallography, the mapping of protein interactions, and other approaches of current functional genomics require not only purifying large numbers of proteins but also obtaining sufficient yield and homogeneity for downstream high-throughput applications. There is a need for robust, automated, high-throughput protein expression and purification processes to meet these requirements. We developed and compared two alternative workflows for automated purification of recombinant proteins based on expression of bacterial genes in Escherichia coli: first, a filtration separation protocol based on expression in 800 mL E. coli cultures followed by filtration purification using Ni2+-NTA Agarose (Qiagen); second, a smaller-scale magnetic separation method based on expression in 25 mL cultures of E. coli followed by 96-well purification on MagneHis Ni2+ Agarose (Promega). Both workflows provided comparable average yields of about 8 µg of purified protein per unit of OD at 600 nm of bacterial culture. We discuss the advantages and limitations of the automated workflows, which can provide proteins more than 90% pure in the range of 100 µg to 45 mg per purification run, as well as strategies for optimization of these protocols.

  13. The Parallel System for Integrating Impact Models and Sectors (pSIMS)

    NASA Technical Reports Server (NTRS)

    Elliott, Joshua; Kelly, David; Chryssanthacopoulos, James; Glotter, Michael; Jhunjhnuwala, Kanika; Best, Neil; Wilde, Michael; Foster, Ian

    2014-01-01

    We present a framework for massively parallel climate impact simulations: the parallel System for Integrating Impact Models and Sectors (pSIMS). This framework comprises a) tools for ingesting and converting large amounts of data to a versatile datatype based on a common geospatial grid; b) tools for translating this datatype into custom formats for site-based models; c) a scalable parallel framework for performing large ensemble simulations, using any one of a number of different impacts models, on clusters, supercomputers, distributed grids, or clouds; d) tools and data standards for reformatting outputs to common datatypes for analysis and visualization; and e) methodologies for aggregating these datatypes to arbitrary spatial scales such as administrative and environmental demarcations. By automating many time-consuming and error-prone aspects of large-scale climate impacts studies, pSIMS accelerates computational research, encourages model intercomparison, and enhances reproducibility of simulation results. We present the pSIMS design and use example assessments to demonstrate its multi-model, multi-scale, and multi-sector versatility.

  14. Automation or De-automation

    NASA Astrophysics Data System (ADS)

    Gorlach, Igor; Wessel, Oliver

    2008-09-01

    In the global automotive industry, for decades, vehicle manufacturers have continually increased the level of automation of production systems in order to be competitive. However, there is a new trend to decrease the level of automation, especially in final car assembly, for reasons of economy and flexibility. In this research, the final car assembly lines at three production sites of Volkswagen are analysed in order to determine the best level of automation for each, in terms of manufacturing costs, productivity, quality and flexibility. The case study is based on the methodology proposed by the Fraunhofer Institute. The results of the analysis indicate that fully automated assembly systems are not necessarily the best option in terms of cost, productivity and quality combined, which is attributed to high complexity of final car assembly systems; some de-automation is therefore recommended. On the other hand, the analysis shows that low automation can result in poor product quality due to reasons related to plant location, such as inadequate workers' skills, motivation, etc. Hence, the automation strategy should be formulated on the basis of analysis of all relevant aspects of the manufacturing process, such as costs, quality, productivity and flexibility in relation to the local context. A more balanced combination of automated and manual assembly operations provides better utilisation of equipment, reduces production costs and improves throughput.

  15. Automated method for the rapid and precise estimation of adherent cell culture characteristics from phase contrast microscopy images.

    PubMed

    Jaccard, Nicolas; Griffin, Lewis D; Keser, Ana; Macown, Rhys J; Super, Alexandre; Veraitch, Farlan S; Szita, Nicolas

    2014-03-01

    The quantitative determination of key adherent cell culture characteristics such as confluency, morphology, and cell density is necessary for the evaluation of experimental outcomes and to provide a suitable basis for the establishment of robust cell culture protocols. Automated processing of images acquired using phase contrast microscopy (PCM), an imaging modality widely used for the visual inspection of adherent cell cultures, could enable the non-invasive determination of these characteristics. We present an image-processing approach that accurately detects cellular objects in PCM images through a combination of local contrast thresholding and post hoc correction of halo artifacts. The method was thoroughly validated using a variety of cell lines, microscope models and imaging conditions, demonstrating consistently high segmentation performance in all cases and very short processing times (<1 s per 1,208 × 960 pixel image). Based on the high segmentation performance, it was possible to precisely determine culture confluency, cell density, and the morphology of cellular objects, demonstrating the wide applicability of our algorithm for typical microscopy image-processing pipelines. Furthermore, PCM image segmentation was used to facilitate the interpretation and analysis of fluorescence microscopy data, enabling the determination of temporal and spatial expression patterns of a fluorescent reporter. We created a software toolbox (PHANTAST) that bundles all the algorithms and provides an easy-to-use graphical user interface. Source code for MATLAB and ImageJ is freely available under a permissive open-source license. © 2013 The Authors. Biotechnology and Bioengineering Published by Wiley Periodicals, Inc.
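
    To make the local contrast idea concrete, here is a minimal Python sketch of local-contrast segmentation in the spirit of the approach described above. It is not the PHANTAST implementation (which is distributed for MATLAB and ImageJ), and the window size, threshold, and the hole-filling stand-in for halo correction are assumptions.

    ```python
    # A minimal sketch of local-contrast segmentation (not PHANTAST itself):
    # cellular regions in phase contrast images are textured, so their local
    # standard deviation relative to the local mean exceeds that of the flat
    # background.
    import numpy as np
    from scipy.ndimage import uniform_filter, binary_fill_holes

    def local_contrast_mask(img, size=15, threshold=0.05):
        img = img.astype(np.float64)
        mean = uniform_filter(img, size)
        mean_sq = uniform_filter(img * img, size)
        std = np.sqrt(np.maximum(mean_sq - mean * mean, 0.0))
        contrast = std / (mean + 1e-12)      # local coefficient of variation
        mask = contrast > threshold          # cellular = high local contrast
        return binary_fill_holes(mask)       # crude stand-in for halo correction
    ```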

  16. Automated Method for the Rapid and Precise Estimation of Adherent Cell Culture Characteristics from Phase Contrast Microscopy Images

    PubMed Central

    Jaccard, Nicolas; Griffin, Lewis D; Keser, Ana; Macown, Rhys J; Super, Alexandre; Veraitch, Farlan S; Szita, Nicolas

    2014-01-01

    The quantitative determination of key adherent cell culture characteristics such as confluency, morphology, and cell density is necessary for the evaluation of experimental outcomes and to provide a suitable basis for the establishment of robust cell culture protocols. Automated processing of images acquired using phase contrast microscopy (PCM), an imaging modality widely used for the visual inspection of adherent cell cultures, could enable the non-invasive determination of these characteristics. We present an image-processing approach that accurately detects cellular objects in PCM images through a combination of local contrast thresholding and post hoc correction of halo artifacts. The method was thoroughly validated using a variety of cell lines, microscope models and imaging conditions, demonstrating consistently high segmentation performance in all cases and very short processing times (<1 s per 1,208 × 960 pixel image). Based on the high segmentation performance, it was possible to precisely determine culture confluency, cell density, and the morphology of cellular objects, demonstrating the wide applicability of our algorithm for typical microscopy image-processing pipelines. Furthermore, PCM image segmentation was used to facilitate the interpretation and analysis of fluorescence microscopy data, enabling the determination of temporal and spatial expression patterns of a fluorescent reporter. We created a software toolbox (PHANTAST) that bundles all the algorithms and provides an easy-to-use graphical user interface. Source code for MATLAB and ImageJ is freely available under a permissive open-source license. Biotechnol. Bioeng. 2014;111: 504–517. © 2013 Wiley Periodicals, Inc. PMID:24037521

  17. PEGASUS 5: An Automated Pre-Processor for Overset-Grid CFD

    NASA Technical Reports Server (NTRS)

    Suhs, Norman E.; Rogers, Stuart E.; Dietz, William E.; Kwak, Dochan (Technical Monitor)

    2002-01-01

    An all new, automated version of the PEGASUS software has been developed and tested. PEGASUS provides the hole-cutting and connectivity information between overlapping grids, and is used as the final part of the grid generation process for overset-grid computational fluid dynamics approaches. The new PEGASUS code (Version 5) has many new features: automated hole cutting; a projection scheme for fixing gaps in overset surfaces; more efficient interpolation search methods using an alternating digital tree; hole-size optimization based on adding additional layers of fringe points; and an automatic restart capability. The new code has also been parallelized using the Message Passing Interface standard. The parallelization performance provides efficient speed-up of the execution time by an order of magnitude, and up to a factor of 30 for very large problems. The results of three example cases are presented: a three-element high-lift airfoil, a generic business jet configuration, and a complete Boeing 777-200 aircraft in a high-lift landing configuration. Comparisons of the computed flow fields for the airfoil and 777 test cases between the old and new versions of the PEGASUS codes show excellent agreement with each other and with experimental results.
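
    The abstract credits an alternating digital tree (ADT) for the faster interpolation searches. The sketch below illustrates why a spatial tree accelerates donor searches, using scipy's cKDTree as a stand-in for the ADT; it is not PEGASUS code, and the point counts are arbitrary.

    ```python
    # Illustration of tree-accelerated donor search for overset grids, with
    # a k-d tree standing in for PEGASUS's alternating digital tree.
    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(0)
    donor_points = rng.random((100_000, 3))     # cell centres of a donor grid
    fringe_points = rng.random((5_000, 3))      # receptor-grid fringe points

    tree = cKDTree(donor_points)                # O(n log n) build
    dist, idx = tree.query(fringe_points, k=8)  # 8 nearest donor candidates each
    # Each fringe point's interpolation stencil is then assembled from idx;
    # a brute-force search would cost O(n) per query instead of O(log n).
    print(idx.shape)  # (5000, 8)
    ```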

  18. Parallel pivoting combined with parallel reduction

    NASA Technical Reports Server (NTRS)

    Alaghband, Gita

    1987-01-01

    Parallel algorithms for the triangularization of large, sparse, and unsymmetric matrices are presented. The method combines parallel reduction with a new parallel pivoting technique, control over the generation of fill-in, and a check for numerical stability, all done in parallel with the work distributed over the active processes. The pivoting technique uses the compatibility relation between pivots to identify parallel pivot candidates and uses the Markowitz number of pivots to minimize fill-in. This technique is not a preordering of the sparse matrix and is applied dynamically as the decomposition proceeds.
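
    The compatibility relation lends itself to a short illustration. In the sketch below, assumed from the standard formulation rather than taken from the paper, two pivot candidates (i, j) and (k, l) are treated as compatible when a[i, l] = 0 and a[k, j] = 0, so both can be eliminated in the same parallel step; candidates are ranked by their Markowitz cost (r_i - 1)(c_j - 1).

    ```python
    # Toy greedy selection of a pairwise-compatible pivot set, ranked by
    # Markowitz cost; illustrative only, not the paper's algorithm.
    import numpy as np

    def compatible_pivot_set(pattern):
        """pattern: boolean matrix of the nonzero structure.
        Returns a list of (row, col) pivots usable in one parallel step."""
        n = pattern.shape[0]
        r = pattern.sum(axis=1)                      # nonzeros per row
        c = pattern.sum(axis=0)                      # nonzeros per column
        cands = [(int((r[i] - 1) * (c[j] - 1)), i, j)  # Markowitz cost
                 for i in range(n) for j in range(n) if pattern[i, j]]
        chosen = []
        for _, i, j in sorted(cands):
            if all(i != k and j != l                 # distinct rows/columns
                   and not pattern[i, l] and not pattern[k, j]  # no interference
                   for k, l in chosen):
                chosen.append((i, j))
        return chosen

    A = np.array([[1, 0, 1, 0],
                  [0, 1, 0, 0],
                  [1, 0, 1, 1],
                  [0, 0, 1, 1]], dtype=bool)
    print(compatible_pivot_set(A))  # [(1, 1), (0, 0), (3, 3)]
    ```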

  19. 3 dimensional cell cultures: a comparison between manually and automatically produced alginate beads.

    PubMed

    Lehmann, R; Gallert, C; Roddelkopf, T; Junginger, S; Wree, A; Thurow, K

    2016-08-01

    Cancer is a common disease of the population, driven by age and increasing harmful environmental influences, so new therapeutic strategies and compound screenings are necessary. Regular 2D cultivation has to be replaced by three-dimensional (3D) cell culturing for better simulation of in vivo conditions. 3D cultivation in an alginate matrix is an appropriate method for encapsulating cells to form cancer constructs, and automated manufacturing of alginate beads might be the ultimate method for large-scale manufacturing of constructs similar to cancer tissue. The aim of this study was the integration of fully automated systems for the production, cultivation and screening of 3D cell cultures. We compared the automated methods with the regular manual processes, and we also investigated the influence of antibiotics on these 3D cell culture systems. The alginate beads were formed by automated and manual procedures; the automated steps were processed by the Biomek(®) Cell Workstation (celisca, Rostock, Germany). Proliferation and toxicity were evaluated manually and automatically on days 14 and 35 of cultivation. The results showed an accumulation and expansion of cell aggregates over the period of incubation, although proliferation and toxicity were slightly, and in part significantly, decreased on day 35 compared to day 14. The comparison of the manual and automated methods displayed similar results. We conclude that the manual production process could be replaced by automation. Using automation, 3D cell cultures can be produced at industrial scale, improving drug development and screening to treat serious illnesses such as cancer.

  20. Surveillance cultures of samples obtained from biopsy channels and automated endoscope reprocessors after high-level disinfection of gastrointestinal endoscopes.

    PubMed

    Chiu, King-Wah; Tsai, Ming-Chao; Wu, Keng-Liang; Chiu, Yi-Chun; Lin, Ming-Tzung; Hu, Tsung-Hui

    2012-09-03

    The instrument channels of gastrointestinal (GI) endoscopes may be heavily contaminated with bacteria even after high-level disinfection (HLD). The British Society of Gastroenterology guidelines emphasize the benefits of manually brushing endoscope channels and using automated endoscope reprocessors (AERs) for disinfecting endoscopes. In this study, we aimed to assess the effectiveness of decontamination using reprocessors after HLD by comparing the cultured samples obtained from biopsy channels (BCs) of GI endoscopes and the internal surfaces of AERs. We conducted a 5-year prospective study. Every month, random consecutive sampling was carried out after a complete reprocessing cycle; in total, 420 rinse samples and 420 swab samples were collected from the BCs and the internal surfaces of the AERs, respectively. Of the 420 rinse samples collected from the BCs of the GI endoscopes, 300 were obtained from the BCs of gastroscopes and 120 from the BCs of colonoscopes. Samples were collected by flushing the BCs with sterile distilled water, and by swabbing the residual water from the AERs after reprocessing. These samples were cultured to detect the presence of aerobic and anaerobic bacteria and mycobacteria. The number of culture-positive samples obtained from BCs (13.6%, 57/420) was significantly higher than that obtained from AERs (1.7%, 7/420). In addition, the numbers of culture-positive samples obtained from the BCs of gastroscopes (10.7%, 32/300) and colonoscopes (20.8%, 25/120) were significantly higher than those obtained from the AERs used to reprocess gastroscopes (2.0%, 6/300) and colonoscopes (0.8%, 1/120). Culturing rinse samples obtained from BCs provides a better indication of the effectiveness of the decontamination of GI endoscopes after HLD than culturing the swab samples obtained from the inner surfaces of AERs, as the swab samples indicate only whether the AERs themselves are free from microbial contamination.

  1. Surveillance cultures of samples obtained from biopsy channels and automated endoscope reprocessors after high-level disinfection of gastrointestinal endoscopes

    PubMed Central

    2012-01-01

    Background The instrument channels of gastrointestinal (GI) endoscopes may be heavily contaminated with bacteria even after high-level disinfection (HLD). The British Society of Gastroenterology guidelines emphasize the benefits of manually brushing endoscope channels and using automated endoscope reprocessors (AERs) for disinfecting endoscopes. In this study, we aimed to assess the effectiveness of decontamination using reprocessors after HLD by comparing the cultured samples obtained from biopsy channels (BCs) of GI endoscopes and the internal surfaces of AERs. Methods We conducted a 5-year prospective study. Every month, random consecutive sampling was carried out after a complete reprocessing cycle; in total, 420 rinse samples and 420 swab samples were collected from the BCs and the internal surfaces of the AERs, respectively. Of the 420 rinse samples collected from the BCs of the GI endoscopes, 300 were obtained from the BCs of gastroscopes and 120 from the BCs of colonoscopes. Samples were collected by flushing the BCs with sterile distilled water, and by swabbing the residual water from the AERs after reprocessing. These samples were cultured to detect the presence of aerobic and anaerobic bacteria and mycobacteria. Results The number of culture-positive samples obtained from BCs (13.6%, 57/420) was significantly higher than that obtained from AERs (1.7%, 7/420). In addition, the numbers of culture-positive samples obtained from the BCs of gastroscopes (10.7%, 32/300) and colonoscopes (20.8%, 25/120) were significantly higher than those obtained from the AERs used to reprocess gastroscopes (2.0%, 6/300) and colonoscopes (0.8%, 1/120). Conclusions Culturing rinse samples obtained from BCs provides a better indication of the effectiveness of the decontamination of GI endoscopes after HLD than culturing the swab samples obtained from the inner surfaces of AERs, as the swab samples indicate only whether the AERs themselves are free from microbial contamination. PMID:22943739

  2. Parallel peak pruning for scalable SMP contour tree computation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carr, Hamish A.; Weber, Gunther H.; Sewell, Christopher M.

    As data sets grow to exascale, automated data analysis and visualisation are increasingly important, to intermediate human understanding and to reduce demands on disk storage via in situ analysis. Trends in the architecture of high performance computing systems necessitate analysis algorithms that make effective use of combinations of massively multicore and distributed systems. One of the principal analytic tools is the contour tree, which analyses relationships between contours to identify features of more than local importance. Unfortunately, the predominant algorithms for computing the contour tree are explicitly serial, and founded on serial metaphors, which has limited the scalability of this form of analysis. While there is some work on distributed contour tree computation, and separately on hybrid GPU-CPU computation, there is no efficient algorithm with strong formal guarantees on performance allied with fast practical performance. In this paper, we report the first shared-memory (SMP) algorithm for fully parallel contour tree computation, with formal guarantees of O(lg n lg t) parallel steps and O(n lg n) work, and implementations with up to 10x parallel speed-up in OpenMP and up to 50x speed-up in NVIDIA Thrust.
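
    For contrast with the parallel algorithm reported above, the following is a minimal serial sketch of the classic union-find construction of a join (merge) tree on a graph, the kind of inherently sequential sweep the paper moves beyond; the function and the tiny example are illustrative only.

    ```python
    # Serial join-tree construction: sweep vertices in decreasing value and
    # record an edge whenever superlevel-set components merge.
    def join_tree(values, adjacency):
        """values: list of scalars; adjacency: dict vertex -> neighbours.
        Returns the join-tree edges as (component_head, new_vertex) pairs."""
        parent = {}

        def find(v):
            while parent[v] != v:
                parent[v] = parent[parent[v]]   # path halving
                v = parent[v]
            return v

        head = {}                               # component root -> latest tree node
        edges, seen = [], set()
        for v in sorted(range(len(values)), key=lambda i: -values[i]):
            parent[v], head[v] = v, v
            for u in adjacency.get(v, ()):
                if u in seen:
                    ru, rv = find(u), find(v)
                    if ru != rv:
                        edges.append((head[ru], v))
                        parent[ru] = rv
                        head[rv] = v
            seen.add(v)
        return edges

    # Tiny 1D example: peaks at indices 1 and 3 join at the saddle, index 2.
    vals = [0.0, 3.0, 1.0, 2.0, 0.5]
    adj = {i: [j for j in (i - 1, i + 1) if 0 <= j < len(vals)]
           for i in range(len(vals))}
    print(join_tree(vals, adj))  # [(1, 2), (3, 2), (2, 4), (4, 0)]
    ```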

  3. Automated Microbial Metabolism Laboratory

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Development of the automated microbial metabolism laboratory (AMML) concept is reported. The focus of effort of AMML was on the advanced labeled release experiment. Labeled substrates, inhibitors, and temperatures were investigated to establish a comparative biochemical profile. Profiles at three time intervals on soil and pure cultures of bacteria isolated from soil were prepared to establish a complete library. The development of a strategy for the return of a soil sample from Mars is also reported.

  4. Bacterial and fungal DNA extraction from blood samples: automated protocols.

    PubMed

    Lorenz, Michael G; Disqué, Claudia; Mühl, Helge

    2015-01-01

    Automation in DNA isolation is a necessity for routine practice employing molecular diagnosis of infectious agents. To this end, the development of automated systems for the molecular diagnosis of microorganisms directly in blood samples is at its beginning. Important characteristics demanded of systems for routine use include high recovery of microbial DNA, DNA-free containment for the reduction of DNA contamination from exogenous sources, DNA-free reagents and consumables, ideally a walkaway system, and economical pricing of the equipment and consumables. Full automation of DNA extraction that has been evaluated and is in use for sepsis diagnostics is not yet available. Here, we present protocols for the semiautomated isolation of microbial DNA from blood culture and low- and high-volume blood samples. The protocols include a manual pretreatment step followed by automated extraction and purification of microbial DNA.

  5. Knowledge Support and Automation for Performance Analysis with PerfExplorer 2.0

    DOE PAGES

    Huck, Kevin A.; Malony, Allen D.; Shende, Sameer; ...

    2008-01-01

    The integration of scalable performance analysis in parallel development tools is difficult. The potential size of data sets and the need to compare results from multiple experiments present a challenge to manage and process the information. Simply characterizing the performance of parallel applications running on potentially hundreds of thousands of processor cores requires new scalable analysis techniques. Furthermore, many exploratory analysis processes are repeatable and could be automated, but are now implemented as manual procedures. In this paper, we discuss the current version of PerfExplorer, a performance analysis framework which provides dimension reduction, clustering and correlation analysis of individual trials of large dimensions, and can perform relative performance analysis between multiple application executions. PerfExplorer analysis processes can be captured in the form of Python scripts, automating what would otherwise be time-consuming tasks. We give examples of large-scale analysis results, and discuss the future development of the framework, including the encoding and processing of expert performance rules, and the increasing use of performance metadata.
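
    PerfExplorer exposes such analyses through its own scripting interface; the sketch below only illustrates the same kind of pipeline, dimension reduction followed by clustering of per-thread performance metrics, using scikit-learn on synthetic data. It is not PerfExplorer's API.

    ```python
    # Dimension reduction + clustering of per-thread performance metrics,
    # illustrating the style of analysis described above (synthetic data).
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(1)
    # rows = threads, columns = metrics (e.g. time per region, cache misses)
    metrics = np.vstack([rng.normal(10, 1, (512, 40)),   # one behaviour class
                         rng.normal(14, 1, (512, 40))])  # another class

    reduced = PCA(n_components=3).fit_transform(metrics)  # dimension reduction
    labels = KMeans(n_clusters=2, n_init=10).fit_predict(reduced)
    print(np.bincount(labels))                            # roughly [512 512]
    ```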

  6. Development of an automated MODS plate reader to detect early growth of Mycobacterium tuberculosis.

    PubMed

    Comina, G; Mendoza, D; Velazco, A; Coronel, J; Sheen, P; Gilman, R H; Moore, D A J; Zimic, M

    2011-06-01

    In this work, an automated microscopic observation drug susceptibility (MODS) plate reader has been developed. The reader automatically handles MODS plates and, after autofocusing, acquires digital images of the characteristic microscopic cording structures of Mycobacterium tuberculosis, which serve as the identification marker used in the MODS technique to detect tuberculosis and multidrug-resistant tuberculosis. In conventional MODS, trained technicians manually move the MODS plate on the stage of an inverted microscope while trying to locate and focus upon the characteristic microscopic cording colonies. In centres with high tuberculosis diagnostic demand, sufficient time may not be available to adequately examine all cultures. An automated reader would reduce labour time and the handling of M. tuberculosis cultures by laboratory personnel. Two hundred MODS culture images (100 from tuberculosis-positive and 100 from tuberculosis-negative sputum samples, confirmed by a standard MODS reading using a commercial microscope) were acquired randomly using the automated MODS plate reader. A specialist analysed these digital images with the help of a personal computer and designated them as M. tuberculosis present or absent. The specialist considered four images insufficiently clear to permit a definitive reading. The readings from the 196 valid images resulted in 100% agreement with the conventional non-automated standard reading. The automated MODS plate reader combined with open-source MODS pattern recognition software provides a novel platform for high-throughput automated tuberculosis diagnosis. © 2011 The Authors Journal of Microscopy © 2011 Royal Microscopical Society.

  7. Automated quality control in a file-based broadcasting workflow

    NASA Astrophysics Data System (ADS)

    Zhang, Lina

    2014-04-01

    Benefiting from the development of information and internet technologies, television broadcasting is transforming from inefficient tape-based production and distribution to integrated file-based workflows. However, no matter how many changes have taken place, successful broadcasting still depends on the ability to deliver a consistently high-quality signal to the audience. After the transition from tape to file, traditional methods of manual quality control (QC) become inadequate, subjective, and inefficient. Based on China Central Television's fully file-based workflow at its new site, this paper introduces an automated quality control test system for accurate detection of hidden faults in media content. It discusses the system framework and workflow control once automated QC is added, puts forward a QC criterion, and presents QC software that follows this criterion. It also reports experiments on QC speed using parallel processing and distributed computing. The performance of the test system shows that adopting automated QC can make production effective and efficient, and help the station achieve a competitive advantage in the media market.
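
    As a minimal illustration of the parallel-processing strategy mentioned above (not CCTV's system), the sketch below fans placeholder QC checks for a batch of media files out over a process pool; the file names and checks are hypothetical.

    ```python
    # Fan QC checks out over a process pool; the check itself is a placeholder,
    # a real system would run decoders and signal-level analysers per the
    # station's QC criterion.
    import os
    from concurrent.futures import ProcessPoolExecutor

    def qc_check(path):
        issues = []
        if not os.path.exists(path):
            issues.append("missing file")
        elif os.path.getsize(path) == 0:
            issues.append("zero-length file")
        # ... black-frame, mute-audio, loudness checks would go here ...
        return path, issues

    if __name__ == "__main__":
        batch = [f"clip_{i:03d}.mxf" for i in range(100)]   # hypothetical names
        with ProcessPoolExecutor() as pool:
            for path, issues in pool.map(qc_check, batch):
                if issues:
                    print(path, "->", ", ".join(issues))
    ```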

  8. Automated high throughput microscale antibody purification workflows for accelerating antibody discovery

    PubMed Central

    Luan, Peng; Lee, Sophia; Paluch, Maciej; Kansopon, Joe; Viajar, Sharon; Begum, Zahira; Chiang, Nancy; Nakamura, Gerald; Hass, Philip E.; Wong, Athena W.; Lazar, Greg A.

    2018-01-01

    To rapidly find “best-in-class” antibody therapeutics, it has become essential to develop high throughput (HTP) processes that allow rapid assessment of antibodies for functional and molecular properties. Consequently, it is critical to have access to sufficient amounts of high quality antibody, to carry out accurate and quantitative characterization. We have developed automated workflows using liquid handling systems to conduct affinity-based purification either in batch or tip column mode. Here, we demonstrate the capability to purify >2000 antibodies per day from microscale (1 mL) cultures. Our optimized, automated process for human IgG1 purification using MabSelect SuRe resin achieves ∼70% recovery over a wide range of antibody loads, up to 500 µg. This HTP process works well for hybridoma-derived antibodies that can be purified by MabSelect SuRe resin. For rat IgG2a, which is often encountered in hybridoma cultures and is challenging to purify via an HTP process, we established automated purification with GammaBind Plus resin. Using these HTP purification processes, we can efficiently recover sufficient amounts of antibodies from mammalian transient or hybridoma cultures with quality comparable to conventional column purification. PMID:29494273

  9. A Droplet Microfluidic Platform for Automating Genetic Engineering.

    PubMed

    Gach, Philip C; Shih, Steve C C; Sustarich, Jess; Keasling, Jay D; Hillson, Nathan J; Adams, Paul D; Singh, Anup K

    2016-05-20

    We present a water-in-oil droplet microfluidic platform for transformation, culture and expression of recombinant proteins in multiple host organisms including bacteria, yeast and fungi. The platform consists of a hybrid digital microfluidic/channel-based droplet chip with integrated temperature control to allow complete automation and integration of plasmid addition, heat-shock transformation, addition of selection medium, culture, and protein expression. The microfluidic format permitted significant reduction in consumption (100-fold) of expensive reagents such as DNA and enzymes compared to the benchtop method. The chip contains a channel to continuously replenish oil to the culture chamber to provide a fresh supply of oxygen to the cells for long-term (∼5 days) cell culture. The flow channel also replenished oil lost to evaporation and increased the number of droplets that could be processed and cultured. The platform was validated by transforming several plasmids into Escherichia coli including plasmids containing genes for fluorescent proteins GFP, BFP and RFP; plasmids with selectable markers for ampicillin or kanamycin resistance; and a Golden Gate DNA assembly reaction. We also demonstrate the applicability of this platform for transformation in widely used eukaryotic organisms such as Saccharomyces cerevisiae and Aspergillus niger. Duration and temperatures of the microfluidic heat-shock procedures were optimized to yield transformation efficiencies comparable to those obtained by benchtop methods with a throughput up to 6 droplets/min. The proposed platform offers potential for automation of molecular biology experiments significantly reducing cost, time and variability while improving throughput.

  10. Automated inspection of hot steel slabs

    DOEpatents

    Martin, R.J.

    1985-12-24

    The disclosure relates to a real time digital image enhancement system for performing the image enhancement segmentation processing required for a real time automated system for detecting and classifying surface imperfections in hot steel slabs. The system provides for simultaneous execution of edge detection processing and intensity threshold processing in parallel on the same image data produced by a sensor device such as a scanning camera. The results of each process are utilized to validate the results of the other process and a resulting image is generated that contains only corresponding segmentation that is produced by both processes. 5 figs.
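
    The patent's cross-validation idea can be sketched compactly: run an edge-detection path and an intensity-threshold path on the same image, then keep only segmentation supported by both. The sketch below uses a Sobel gradient and fixed thresholds as illustrative stand-ins for the patented processing.

    ```python
    # Dual-path segmentation: each path validates the other, and only
    # mutually supported pixels survive. Parameters are illustrative.
    import numpy as np
    from scipy import ndimage

    def dual_path_segmentation(img, grad_thresh=30.0, intensity_thresh=150):
        img = img.astype(np.float64)
        # Path 1: edge detection (Sobel gradient magnitude)
        gx = ndimage.sobel(img, axis=1)
        gy = ndimage.sobel(img, axis=0)
        edge_map = np.hypot(gx, gy) > grad_thresh
        # Path 2: intensity thresholding (defects differ in brightness)
        intensity_map = img > intensity_thresh
        # Keep only segmentation produced by both processes.
        edge_regions = ndimage.binary_dilation(edge_map, iterations=2)
        return edge_regions & intensity_map
    ```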

  11. Automated inspection of hot steel slabs

    DOEpatents

    Martin, Ronald J.

    1985-01-01

    The disclosure relates to a real time digital image enhancement system for performing the image enhancement segmentation processing required for a real time automated system for detecting and classifying surface imperfections in hot steel slabs. The system provides for simultaneous execution of edge detection processing and intensity threshold processing in parallel on the same image data produced by a sensor device such as a scanning camera. The results of each process are utilized to validate the results of the other process and a resulting image is generated that contains only corresponding segmentation that is produced by both processes.

  12. Advantages and challenges in automated apatite fission track counting

    NASA Astrophysics Data System (ADS)

    Enkelmann, E.; Ehlers, T. A.

    2012-04-01

    Fission track thermochronometer data are often a core element of modern tectonic and denudation studies. Soon after the development of the fission track methods, interest emerged in developing an automated counting procedure to replace the time-consuming labor of counting fission tracks under the microscope. Automated track counting became feasible in recent years with increasing improvements in computer software and hardware. One such example, used in this study, is the commercial automated fission track counting procedure from Autoscan Systems Pty, which has been highlighted through several venues. We conducted experiments designed to reliably and consistently test the ability of this fully automated counting system to recognize fission tracks in apatite and a muscovite external detector. Fission tracks were analyzed in samples with a step-wise increase in sample complexity. The first set of experiments used a large (mm-size) slice of Durango apatite cut parallel to the prism plane. Second, samples with 80-200 μm apatite grains of Fish Canyon Tuff were analyzed; this second sample set is characterized by complexities often found in apatites in different rock types. In addition to the automated counting procedure, the same samples were also analyzed using conventional counting procedures. We found for all samples that the fully automated fission track counting procedure using the Autoscan System yields a larger scatter in the measured fission track densities compared to conventional (manual) track counting. This scatter typically resulted from the false identification of tracks due to surface and mineralogical defects, regardless of the image filtering procedure used. Large differences between track densities analyzed with the automated counting persisted between different grains analyzed in one sample as well as between different samples. As a result of these differences a manual correction of the fully automated fission track counts is necessary for

  13. Parallel computation of level set method for 500 Hz visual servo control

    NASA Astrophysics Data System (ADS)

    Fei, Xianfeng; Igarashi, Yasunobu; Hashimoto, Koichi

    2008-11-01

    We propose a 2D microorganism tracking system using a parallel level set method and a column parallel vision system (CPV). This system keeps a single microorganism in the middle of the visual field under a microscope by visual servoing of an automated stage. We propose a new energy function for the level set method that constrains the amount of light intensity inside the detected object contour in order to control the number of detected objects. The algorithm is implemented on the CPV system, and the computation time is approximately 2 ms per frame. A tracking experiment of about 25 s is demonstrated, and we show that a single paramecium can be kept in track even when other paramecia appear in the visual field and contact the tracked paramecium.

  14. Enabling Requirements-Based Programming for Highly-Dependable Complex Parallel and Distributed Systems

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    The manual application of formal methods in system specification has produced successes, but in the end, despite any claims and assertions by practitioners, there is no provable relationship between a manually derived system specification or formal model and the customer's original requirements. Complex parallel and distributed systems present the worst-case implications of today's dearth of viable approaches for achieving system dependability. No avenue other than formal methods constitutes a serious contender for resolving the problem, and so recognition of requirements-based programming has come at a critical juncture. We describe a new, NASA-developed automated requirements-based programming method that can be applied to certain classes of systems, including complex parallel and distributed systems, to achieve a high degree of dependability.

  15. Parallel Eclipse Project Checkout

    NASA Technical Reports Server (NTRS)

    Crockett, Thomas M.; Joswig, Joseph C.; Shams, Khawaja S.; Powell, Mark W.; Bachmann, Andrew G.

    2011-01-01

    Parallel Eclipse Project Checkout (PEPC) is a program written to leverage parallelism and to automate the checkout process of plug-ins created in Eclipse RCP (Rich Client Platform). Eclipse plug-ins can be aggregated in a feature project. This innovation digests a feature description (an XML file) and automatically checks out all of the plug-ins listed in the feature, removing the need to manually check out each plug-in required to work on the project. To minimize checkout time, the program performs the plug-in checkouts in parallel: after the feature is parsed, a checkout request is issued for each plug-in it lists, and these requests are handled by a thread pool with a configurable number of threads. By checking out the plug-ins in parallel, the checkout process is streamlined before getting started on the project. For instance, projects that took 30 minutes to check out now take less than 5 minutes. The effect is especially clear on a Mac, which has a network monitor displaying bandwidth use. When running the client from a developer's home, the checkout process now saturates the bandwidth in order to get all the plug-ins checked out as fast as possible. For comparison, a checkout process that ranged from 8-200 Kbps from a developer's home is now able to saturate a pipe of 1.3 Mbps, resulting in significantly faster checkouts. The Eclipse IDE (integrated development environment) tries to build a project as soon as it is downloaded. As part of another optimization, this innovation programmatically tells Eclipse to stop building while checkouts are happening, which dramatically reduces lock contention and enables plug-ins to continue downloading until all of them finish. Furthermore, the software re-enables automatic building, and forces Eclipse to do a clean build once it finishes checking out all of the plug-ins. This software is fully generic and does not contain any NASA-specific code. It can be applied to any
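
    A present-day sketch of the same pattern follows (the original tool drove Eclipse RCP internals rather than git): parse a feature.xml, then check out every listed plug-in concurrently through a configurable thread pool. The repository URL scheme and the use of git are assumptions for illustration.

    ```python
    # Parse an Eclipse feature.xml and check out its plug-ins in parallel.
    import subprocess
    import xml.etree.ElementTree as ET
    from concurrent.futures import ThreadPoolExecutor

    BASE = "https://example.org/repos"          # hypothetical server

    def checkout(plugin_id):
        url = f"{BASE}/{plugin_id}.git"
        subprocess.run(["git", "clone", "--depth", "1", url, plugin_id],
                       check=True)
        return plugin_id

    def checkout_feature(feature_xml, workers=8):
        tree = ET.parse(feature_xml)
        plugins = [p.get("id") for p in tree.getroot().iter("plugin")]
        with ThreadPoolExecutor(max_workers=workers) as pool:
            for done in pool.map(checkout, plugins):
                print("checked out", done)
    ```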

  16. Automated deep-phenotyping of the vertebrate brain

    PubMed Central

    Allalou, Amin; Wu, Yuelong; Ghannad-Rezaie, Mostafa; Eimon, Peter M; Yanik, Mehmet Fatih

    2017-01-01

    Here, we describe an automated platform suitable for large-scale deep-phenotyping of zebrafish mutant lines, which uses optical projection tomography to rapidly image brain-specific gene expression patterns in 3D at cellular resolution. Registration algorithms and correlation analysis are then used to compare 3D expression patterns, to automatically detect all statistically significant alterations in mutants, and to map them onto a brain atlas. Automated deep-phenotyping of a mutation in the master transcriptional regulator fezf2 not only detects all known phenotypes but also uncovers important novel neural deficits that were overlooked in previous studies. In the telencephalon, we show for the first time that fezf2 mutant zebrafish have significant patterning deficits, particularly in glutamatergic populations. Our findings reveal unexpected parallels between fezf2 function in zebrafish and mice, where mutations cause deficits in glutamatergic neurons of the telencephalon-derived neocortex. DOI: http://dx.doi.org/10.7554/eLife.23379.001 PMID:28406399

  17. Catch and Patch: A Pipette-Based Approach for Automating Patch Clamp That Enables Cell Selection and Fast Compound Application.

    PubMed

    Danker, Timm; Braun, Franziska; Silbernagl, Nikole; Guenther, Elke

    2016-03-01

    Manual patch clamp, the gold standard of electrophysiology, represents a powerful and versatile toolbox to stimulate, modulate, and record ion channel activity from membrane fragments and whole cells. The electrophysiological readout can be combined with fluorescent or optogenetic methods and allows for ultrafast solution exchanges using specialized microfluidic tools. A hallmark of manual patch clamp is the intentional selection of individual cells for recording, often an essential prerequisite to generate meaningful data. So far, available automation solutions rely on random cell usage in the closed environment of a chip and thus sacrifice much of this versatility by design. To parallelize and automate the traditional patch clamp technique while perpetuating the full versatility of the method, we developed an approach to automation, which is based on active cell handling and targeted electrode placement rather than on random processes. This is achieved through an automated pipette positioning system, which guides the tips of recording pipettes with micrometer precision to a microfluidic cell handling device. Using a patch pipette array mounted on a conventional micromanipulator, our automated patch clamp process mimics the original manual patch clamp as closely as possible, yet achieving a configuration where recordings are obtained from many patch electrodes in parallel. In addition, our implementation is extensible by design to allow the easy integration of specialized equipment such as ultrafast compound application tools. The resulting system offers fully automated patch clamp on purposely selected cells and combines high-quality gigaseal recordings with solution switching in the millisecond timescale.

  18. Automating quantum experiment control

    NASA Astrophysics Data System (ADS)

    Stevens, Kelly E.; Amini, Jason M.; Doret, S. Charles; Mohler, Greg; Volin, Curtis; Harter, Alexa W.

    2017-03-01

    The field of quantum information processing is rapidly advancing. As the control of quantum systems approaches the level needed for useful computation, the physical hardware underlying the quantum systems is becoming increasingly complex. It is already becoming impractical to manually code control for the larger hardware implementations. In this chapter, we will employ an approach to the problem of system control that parallels compiler design for a classical computer. We will start with a candidate quantum computing technology, the surface electrode ion trap, and build a system instruction language which can be generated from a simple machine-independent programming language via compilation. We incorporate compile time generation of ion routing that separates the algorithm description from the physical geometry of the hardware. Extending this approach to automatic routing at run time allows for automated initialization of qubit number and placement and additionally allows for automated recovery after catastrophic events such as qubit loss. To show that these systems can handle real hardware, we present a simple demonstration system that routes two ions around a multi-zone ion trap and handles ion loss and ion placement. While we will mainly use examples from transport-based ion trap quantum computing, many of the issues and solutions are applicable to other architectures.
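
    A toy illustration of the compilation idea: a machine-independent two-qubit program is lowered to zone-level transport and gate instructions for a linear trap. The instruction names, zone labels, and the naive routing rule are all invented for the example and are not the authors' instruction language.

    ```python
    # Lower a machine-independent gate list to transport + gate instructions,
    # co-locating qubits before two-qubit gates (toy routing rule).
    def compile_program(program, positions):
        """program: list of ('GATE', qubits...) ops; positions: qubit -> zone."""
        out = []
        for op, *qubits in program:
            if len(qubits) == 2:               # two-qubit gates need co-location
                a, b = qubits
                if positions[a] != positions[b]:
                    out.append(("TRANSPORT", b, positions[b], positions[a]))
                    positions[b] = positions[a]
            out.append((op, *qubits))
        return out

    prog = [("X", "q0"), ("CNOT", "q0", "q1"),
            ("MEASURE", "q0"), ("MEASURE", "q1")]
    for instr in compile_program(prog, {"q0": "zone2", "q1": "zone5"}):
        print(instr)
    # ('X', 'q0')
    # ('TRANSPORT', 'q1', 'zone5', 'zone2')
    # ('CNOT', 'q0', 'q1') ...
    ```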

  19. 3D-templated, fully automated microfluidic input/output multiplexer for endocrine tissue culture and secretion sampling.

    PubMed

    Li, Xiangpeng; Brooks, Jessica C; Hu, Juan; Ford, Katarena I; Easley, Christopher J

    2017-01-17

    A fully automated, 16-channel microfluidic input/output multiplexer (μMUX) has been developed for interfacing to primary cells and to improve understanding of the dynamics of endocrine tissue function. The device utilizes pressure driven push-up valves for precise manipulation of nutrient input and hormone output dynamics, allowing time resolved interrogation of the cells. The ability to alternate any of the 16 channels from input to output, and vice versa, provides for high experimental flexibility without the need to alter microchannel designs. 3D-printed interface templates were custom designed to sculpt the above-channel polydimethylsiloxane (PDMS) in microdevices, creating millimeter scale reservoirs and confinement chambers to interface primary murine islets and adipose tissue explants to the μMUX sampling channels. This μMUX device and control system was first programmed for dynamic studies of pancreatic islet function to collect ∼90 minute insulin secretion profiles from groups of ∼10 islets. The automated system was also operated in temporal stimulation and cell imaging mode. Adipose tissue explants were exposed to a temporal mimic of post-prandial insulin and glucose levels, while simultaneous switching between labeled and unlabeled free fatty acid permitted fluorescent imaging of fatty acid uptake dynamics in real time over a ∼2.5 hour period. Application with varying stimulation and sampling modes on multiple murine tissue types highlights the inherent flexibility of this novel, 3D-templated μMUX device. The tissue culture reservoirs and μMUX control components presented herein should be adaptable as individual modules in other microfluidic systems, such as organ-on-a-chip devices, and should be translatable to different tissues such as liver, heart, skeletal muscle, and others.

  20. Parallel File System I/O Performance Testing On LANL Clusters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wiens, Isaac Christian; Green, Jennifer Kathleen

    2016-08-18

    These are slides from a presentation on parallel file system I/O performance testing on LANL clusters. I/O is a known bottleneck for HPC applications. Performance optimization of I/O is often required. This summer project entailed integrating IOR under Pavilion and automating the results analysis. The slides cover the following topics: scope of the work, tools utilized, IOR-Pavilion test workflow, build script, IOR parameters, how parameters are passed to IOR, *run_ior: functionality, Python IOR-Output Parser, Splunk data format, Splunk dashboard and features, and future work.
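
    The parsing step lends itself to a short sketch. The code below pulls bandwidth figures out of IOR's textual summary and emits key=value records of the sort Splunk ingests; it assumes summary lines of the form "Max Write: 1234.56 MiB/sec", which recent IOR versions print, and the field names are illustrative.

    ```python
    # Parse IOR summary output into key=value records for Splunk ingestion.
    import re
    import time

    SUMMARY = re.compile(r"Max (Write|Read):\s+([\d.]+)\s+MiB/sec")

    def parse_ior(text, cluster, fs):
        records = []
        for op, mib_s in SUMMARY.findall(text):
            records.append(
                f"time={int(time.time())} cluster={cluster} fs={fs} "
                f"op={op.lower()} bandwidth_mib_s={mib_s}"
            )
        return records

    sample = "Max Write: 5120.33 MiB/sec\nMax Read: 7300.10 MiB/sec\n"
    print("\n".join(parse_ior(sample, "cluster-a", "lustre-scratch")))
    ```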

  1. A Concept for Airborne Precision Spacing for Dependent Parallel Approaches

    NASA Technical Reports Server (NTRS)

    Barmore, Bryan E.; Baxley, Brian T.; Abbott, Terence S.; Capron, William R.; Smith, Colin L.; Shay, Richard F.; Hubbs, Clay

    2012-01-01

    The Airborne Precision Spacing concept of operations has been previously developed to support the precise delivery of aircraft landing successively on the same runway. The high-precision and consistent delivery of inter-aircraft spacing allows for increased runway throughput and the use of energy-efficient arrivals routes such as Continuous Descent Arrivals and Optimized Profile Descents. This paper describes an extension to the Airborne Precision Spacing concept to enable dependent parallel approach operations where the spacing aircraft must manage their in-trail spacing from a leading aircraft on approach to the same runway and spacing from an aircraft on approach to a parallel runway. Functionality for supporting automation is discussed as well as procedures for pilots and controllers. An analysis is performed to identify the required information and a new ADS-B report is proposed to support these information needs. Finally, several scenarios are described in detail.

  2. Rapid automated classification of anesthetic depth levels using GPU based parallelization of neural networks.

    PubMed

    Peker, Musa; Şen, Baha; Gürüler, Hüseyin

    2015-02-01

    The effect of anesthesia on the patient is referred to as the depth of anesthesia. Rapid classification of the appropriate depth level of anesthesia is a matter of great importance in surgical operations. Similarly, accelerating classification algorithms is important for the rapid solution of problems in the field of biomedical signal processing. However, numerous time-consuming mathematical operations are required during the training and testing stages of classification algorithms, especially in neural networks. In this study, to accelerate the process, the Nvidia CUDA parallel programming and computing platform, which facilitates dramatic increases in computing performance by harnessing the power of the graphics processing unit (GPU), was utilized. The system was employed to detect the anesthetic depth level on a related electroencephalogram (EEG) data set. This dataset is rather complex and large, and achieving more anesthetic levels with rapid response is critical in anesthesia. The proposed parallelization method yielded highly accurate classification results in a faster time.

  3. Automation of Hubble Space Telescope Mission Operations

    NASA Technical Reports Server (NTRS)

    Burley, Richard; Goulet, Gregory; Slater, Mark; Huey, William; Bassford, Lynn; Dunham, Larry

    2012-01-01

    On June 13, 2011, after more than 21 years, 115 thousand orbits, and nearly 1 million exposures taken, the operation of the Hubble Space Telescope successfully transitioned from 24x7x365 staffing to 8x5 staffing. This required the automation of routine mission operations, including telemetry and forward link acquisition, data dumping and solid-state recorder management, stored command loading, and health and safety monitoring of both the observatory and the HST Ground System. These changes were driven by budget reductions, and required ground system and onboard spacecraft enhancements across the entire operations spectrum, from planning and scheduling systems to payload flight software. Changes in personnel and staffing were required in order to adapt to the new roles and responsibilities of the automated operations era. This paper provides a high-level overview of the obstacles to automating nominal HST mission operations, both technical and cultural, and how those obstacles were overcome.

  4. Representing and computing regular languages on massively parallel networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, M.I.; O'Sullivan, J.A.; Boysam, B.

    1991-01-01

    This paper proposes a general method for incorporating rule-based constraints corresponding to regular languages into stochastic inference problems, thereby allowing for a unified representation of stochastic and syntactic pattern constraints. The authors' approach first establishes the formal connection of rules to Chomsky grammars, and generalizes the original work of Shannon on the encoding of rule-based channel sequences to Markov chains of maximum entropy. This maximum entropy probabilistic view leads to Gibbs representations with potentials whose number of minima grows at precisely the exponential rate at which the language of deterministically constrained sequences grows. These representations are coupled to stochastic diffusion algorithms, which sample the language-constrained sequences by visiting the energy minima according to the underlying Gibbs probability law. The coupling to stochastic search methods yields the all-important practical result that fully parallel stochastic cellular automata may be derived to generate samples from the rule-based constraint sets. The production rules and neighborhood state structure of the language of sequences directly determine the necessary connection structures of the required parallel computing surface. Representations of this type have been mapped to the DAP-510 massively-parallel processor consisting of 1024 mesh-connected bit-serial processing elements for performing automated segmentation of electron-micrograph images.
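
    A one-dimensional toy version of sampling language-constrained sequences: binary strings containing no "11" substring form a regular language, and a Metropolis-style single-site sampler that rejects constraint-violating flips draws, asymptotically, from the maximum-entropy (uniform) distribution over that language. This is only an illustration of the idea, not the paper's cellular automaton.

    ```python
    # Metropolis sampling of a regular-language constraint set: uniform
    # single-site flip proposals, rejected whenever they would create "11".
    import random

    def sample_no_11(n, sweeps=2000, seed=0):
        rng = random.Random(seed)
        s = [0] * n                          # all-zeros satisfies the constraint
        for _ in range(sweeps * n):
            i = rng.randrange(n)
            s[i] ^= 1                        # propose a single-site flip
            left = s[i - 1] if i > 0 else 0
            right = s[i + 1] if i < n - 1 else 0
            if s[i] and (left or right):     # flip would create "11": reject
                s[i] ^= 1
        return s

    print("".join(map(str, sample_no_11(40))))
    ```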

  5. Cardiac imaging: working towards fully-automated machine analysis & interpretation.

    PubMed

    Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido

    2017-03-01

    Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered: This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary: Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation.

  6. Cardiac imaging: working towards fully-automated machine analysis & interpretation

    PubMed Central

    Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido

    2017-01-01

    Introduction Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation. PMID:28277804

  7. An Automated Sample Preparation Instrument to Accelerate Positive Blood Cultures Microbial Identification by MALDI-TOF Mass Spectrometry (Vitek®MS).

    PubMed

    Broyer, Patrick; Perrot, Nadine; Rostaing, Hervé; Blaze, Jérome; Pinston, Frederic; Gervasi, Gaspard; Charles, Marie-Hélène; Dachaud, Fabien; Dachaud, Jacques; Moulin, Frederic; Cordier, Sylvain; Dauwalder, Olivier; Meugnier, Hélène; Vandenesch, Francois

    2018-01-01

    Sepsis is the leading cause of death among patients in intensive care units (ICUs), requiring an early diagnosis to introduce efficient therapeutic intervention. Rapid identification (ID) of a causative pathogen is key to guide directed antimicrobial selection and was recently shown to reduce hospitalization length in ICUs. Direct processing of positive blood cultures by MALDI-TOF MS technology is one of several currently available tools used to generate rapid microbial ID. However, all recently published protocols are still manual and time consuming, requiring dedicated technician availability and specific strategies for batch processing. We present here a new prototype instrument for automated preparation of Vitek® MS slides directly from positive blood culture broth based on an "all-in-one" extraction strip. This benchtop instrument was evaluated on 111 and 22 organisms processed using artificially inoculated blood culture bottles in the BacT/ALERT® 3D (SA/SN blood culture bottles) or the BacT/ALERT® Virtuo™ system (FA/FN Plus bottles), respectively. Overall, this new preparation station provided reliable and accurate Vitek® MS species-level identification of 87% (Gram-negative bacteria = 85%, Gram-positive bacteria = 88%, and yeast = 100%) when used with BacT/ALERT® 3D and of 84% (Gram-negative bacteria = 86%, Gram-positive bacteria = 86%, and yeast = 75%) with Virtuo™ instruments, respectively. The prototype was then evaluated in a clinical microbiology laboratory on 102 clinical blood culture bottles and compared to routine laboratory ID procedures. Overall, the correlation of ID on monomicrobial bottles was 83% (Gram-negative bacteria = 89%, Gram-positive bacteria = 79%, and yeast = 78%), demonstrating roughly equivalent performance between manual and automated extraction methods. This prototype instrument exhibited a high level of performance regardless of bottle type or BacT/ALERT system. Furthermore, blood culture workflow could potentially

  8. Altering users' acceptance of automation through prior automation exposure.

    PubMed

    Bekier, Marek; Molesworth, Brett R C

    2017-06-01

    Air navigation service providers worldwide see increased use of automation as one solution to overcome the capacity constraints embedded in the present air traffic management (ATM) system. However, increased use of automation within any system is dependent on user acceptance. The present research sought to determine if the point at which an individual is no longer willing to accept or cooperate with automation can be manipulated. Forty participants underwent training on a computer-based air traffic control programme, followed by two ATM exercises (order counterbalanced), one with and one without the aid of automation. Results revealed that, after exposure to a task with automation assistance, user acceptance of high(er) levels of automation (the 'tipping point') decreased, suggesting it is indeed possible to alter automation acceptance. Practitioner Summary: This paper investigates whether the point at which a user of automation rejects automation (i.e. the 'tipping point') is constant or can be manipulated. The results revealed that, after exposure to a task with automation assistance, user acceptance of high(er) levels of automation decreased, suggesting it is possible to alter automation acceptance.

  9. Automated Video-Based Analysis of Contractility and Calcium Flux in Human-Induced Pluripotent Stem Cell-Derived Cardiomyocytes Cultured over Different Spatial Scales

    PubMed Central

    Huebsch, Nathaniel; Loskill, Peter; Mandegar, Mohammad A.; Marks, Natalie C.; Sheehan, Alice S.; Ma, Zhen; Mathur, Anurag; Nguyen, Trieu N.; Yoo, Jennie C.; Judge, Luke M.; Spencer, C. Ian; Chukka, Anand C.; Russell, Caitlin R.; So, Po-Lin

    2015-01-01

    Contractile motion is the simplest metric of cardiomyocyte health in vitro, but unbiased quantification is challenging. We describe a rapid automated method, requiring only standard video microscopy, to analyze the contractility of human-induced pluripotent stem cell-derived cardiomyocytes (iPS-CM). New algorithms for generating and filtering motion vectors combined with a newly developed isogenic iPSC line harboring genetically encoded calcium indicator, GCaMP6f, allow simultaneous user-independent measurement and analysis of the coupling between calcium flux and contractility. The relative performance of these algorithms, in terms of improving signal to noise, was tested. Applying these algorithms allowed analysis of contractility in iPS-CM cultured over multiple spatial scales from single cells to three-dimensional constructs. This open source software was validated with analysis of isoproterenol response in these cells, and can be applied in future studies comparing the drug responsiveness of iPS-CM cultured in different microenvironments in the context of tissue engineering. PMID:25333967
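
    As a hedged illustration of what a motion-vector field is (not the paper's algorithm or its open-source software), the sketch below estimates per-block displacements between two frames by exhaustive block matching with a sum-of-absolute-differences cost; the paper's filtering of motion vectors is omitted.

    ```python
    # Exhaustive block matching: for each block of frame A, find the
    # displacement within a search window that best matches frame B.
    import numpy as np

    def motion_vectors(a, b, block=16, search=4):
        h, w = a.shape
        vecs = np.zeros((h // block, w // block, 2), dtype=int)
        for bi, y in enumerate(range(0, h - block + 1, block)):
            for bj, x in enumerate(range(0, w - block + 1, block)):
                ref = a[y:y + block, x:x + block].astype(np.int64)
                best, best_v = None, (0, 0)
                for dy in range(-search, search + 1):
                    for dx in range(-search, search + 1):
                        yy, xx = y + dy, x + dx
                        if 0 <= yy <= h - block and 0 <= xx <= w - block:
                            cand = b[yy:yy + block, xx:xx + block].astype(np.int64)
                            cost = np.abs(ref - cand).sum()  # sum of abs diffs
                            if best is None or cost < best:
                                best, best_v = cost, (dy, dx)
                vecs[bi, bj] = best_v
        return vecs

    a = np.zeros((64, 64), dtype=np.uint8); a[10:26, 10:26] = 255
    b = np.roll(a, (2, 3), axis=(0, 1))     # same scene shifted down/right
    print(motion_vectors(a, b)[0, 0])       # -> [2 3]
    ```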

  10. Automated Video-Based Analysis of Contractility and Calcium Flux in Human-Induced Pluripotent Stem Cell-Derived Cardiomyocytes Cultured over Different Spatial Scales.

    PubMed

    Huebsch, Nathaniel; Loskill, Peter; Mandegar, Mohammad A; Marks, Natalie C; Sheehan, Alice S; Ma, Zhen; Mathur, Anurag; Nguyen, Trieu N; Yoo, Jennie C; Judge, Luke M; Spencer, C Ian; Chukka, Anand C; Russell, Caitlin R; So, Po-Lin; Conklin, Bruce R; Healy, Kevin E

    2015-05-01

    Contractile motion is the simplest metric of cardiomyocyte health in vitro, but unbiased quantification is challenging. We describe a rapid automated method, requiring only standard video microscopy, to analyze the contractility of human-induced pluripotent stem cell-derived cardiomyocytes (iPS-CM). New algorithms for generating and filtering motion vectors combined with a newly developed isogenic iPSC line harboring genetically encoded calcium indicator, GCaMP6f, allow simultaneous user-independent measurement and analysis of the coupling between calcium flux and contractility. The relative performance of these algorithms, in terms of improving signal to noise, was tested. Applying these algorithms allowed analysis of contractility in iPS-CM cultured over multiple spatial scales from single cells to three-dimensional constructs. This open source software was validated with analysis of isoproterenol response in these cells, and can be applied in future studies comparing the drug responsiveness of iPS-CM cultured in different microenvironments in the context of tissue engineering.

  11. Automated macromolecular crystal detection system and method

    DOEpatents

    Christian, Allen T [Tracy, CA; Segelke, Brent [San Ramon, CA; Rupp, Bernard [Livermore, CA; Toppani, Dominique [Fontainebleau, FR

    2007-06-05

    An automated method and system for detecting macromolecular crystals in two-dimensional images, such as light microscopy images obtained from an array of crystallization screens. Edges are detected in the images by identifying local maxima of a phase congruency-based function associated with each image. The detected edges are segmented into discrete line segments, which are subsequently geometrically evaluated with respect to each other to identify crystal-like qualities such as parallel lines, facing each other, similarity in length, and relative proximity. From this evaluation, a determination is made as to whether crystals are present in each image.
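
    The patent uses a phase congruency-based edge detector; as a stand-in, the sketch below substitutes Canny edges plus a probabilistic Hough transform, then scores segment pairs for the crystal-like qualities listed above (near-parallel orientation, similar length, proximity). All thresholds are illustrative.

    ```python
    # Score line-segment pairs for crystal-like geometry; Canny + Hough stand
    # in for the patented phase-congruency edge detection.
    import numpy as np
    from skimage.feature import canny
    from skimage.transform import probabilistic_hough_line

    def crystal_like_pairs(img, angle_tol=np.deg2rad(5), len_ratio=0.7,
                           max_gap=40.0):
        segments = probabilistic_hough_line(canny(img), threshold=10,
                                            line_length=25, line_gap=3)

        def geom(seg):
            (x0, y0), (x1, y1) = seg
            return (np.arctan2(y1 - y0, x1 - x0) % np.pi,      # orientation
                    np.hypot(x1 - x0, y1 - y0),                # length
                    np.array([(x0 + x1) / 2, (y0 + y1) / 2]))  # midpoint

        feats = [geom(s) for s in segments]
        pairs = []
        for i in range(len(feats)):
            for j in range(i + 1, len(feats)):
                (a1, l1, m1), (a2, l2, m2) = feats[i], feats[j]
                d_ang = min(abs(a1 - a2), np.pi - abs(a1 - a2))
                if (d_ang < angle_tol                      # near-parallel
                        and min(l1, l2) / max(l1, l2) > len_ratio
                        and np.linalg.norm(m1 - m2) < max_gap):
                    pairs.append((segments[i], segments[j]))
        return pairs
    ```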

  12. Centrifugal LabTube platform for fully automated DNA purification and LAMP amplification based on an integrated, low-cost heating system.

    PubMed

    Hoehl, Melanie M; Weißert, Michael; Dannenberg, Arne; Nesch, Thomas; Paust, Nils; von Stetten, Felix; Zengerle, Roland; Slocum, Alexander H; Steigert, Juergen

    2014-06-01

    This paper introduces a disposable battery-driven heating system for loop-mediated isothermal DNA amplification (LAMP) inside a centrifugally-driven DNA purification platform (LabTube). We demonstrate LabTube-based fully automated DNA purification of as low as 100 cell-equivalents of verotoxin-producing Escherichia coli (VTEC) in water, milk and apple juice in a laboratory centrifuge, followed by integrated and automated LAMP amplification with a reduction of hands-on time from 45 to 1 min. The heating system consists of two parallel SMD thick film resistors and a NTC as heating and temperature sensing elements. They are driven by a 3 V battery and controlled by a microcontroller. The LAMP reagents are stored in the elution chamber and the amplification starts immediately after the eluate is purged into the chamber. The LabTube, including a microcontroller-based heating system, demonstrates contamination-free and automated sample-to-answer nucleic acid testing within a laboratory centrifuge. The heating system can be easily parallelized within one LabTube and it is deployable for a variety of heating and electrical applications.
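
    A minimal sketch of the kind of control loop such a heating system implies, assuming a beta-equation NTC conversion and simple bang-bang regulation; the component values, setpoint, and function names are illustrative, not taken from the LabTube design.

```python
import math

# Beta-equation parameters for an NTC thermistor; the component values are
# illustrative, not those of the LabTube.
BETA, R0, T0 = 3950.0, 10_000.0, 298.15   # beta [K], 10 kOhm at 25 deg C

def ntc_temp_c(r_ntc_ohm):
    """NTC resistance [Ohm] -> temperature [deg C] via the beta equation."""
    t_kelvin = 1.0 / (1.0 / T0 + math.log(r_ntc_ohm / R0) / BETA)
    return t_kelvin - 273.15

def heater_on(temp_c, setpoint_c=65.0, band_c=0.5, prev_state=False):
    """Bang-bang decision for the heating resistors, with a small hysteresis
    band so the output does not chatter around the LAMP setpoint."""
    if temp_c < setpoint_c - band_c:
        return True
    if temp_c > setpoint_c + band_c:
        return False
    return prev_state   # inside the deadband: hold the previous state

t = ntc_temp_c(1_750.0)
print(round(t, 1), heater_on(t))   # 70.2 deg C -> heater off (False)
```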

  13. Performance of Gram staining on blood cultures flagged negative by an automated blood culture system.

    PubMed

    Peretz, A; Isakovich, N; Pastukh, N; Koifman, A; Glyatman, T; Brodsky, D

    2015-08-01

    Blood is one of the most important specimens sent to a microbiology laboratory for culture. Most blood cultures are incubated for 5-7 days, except in cases where there is a suspicion of infection caused by microorganisms that proliferate slowly, or infections expressed by a small number of bacteria in the bloodstream. Therefore, at the end of incubation, misidentification of positive cultures and false-negative results are a real possibility. The aim of this work was to use Gram staining to confirm the absence of microorganisms in blood cultures identified as negative by the BACTEC™ FX system at the end of incubation. All bottles defined as negative by the BACTEC FX system were Gram-stained using an automatic device and inoculated on solid growth media. In our work, 15 cultures that were defined as negative by the BACTEC FX system at the end of the incubation were found to contain microorganisms when Gram-stained. The main characteristic of most bacteria and fungi growing in the culture bottles that were defined as negative was slow growth. This finding raises the problematic question of whether Gram staining should be performed on all blood cultures, which could overload the routine laboratory work, especially in laboratories serving large medical centers and receiving a large number of blood cultures.

  14. Office automation: The administrative window into the integrated DBMS

    NASA Technical Reports Server (NTRS)

    Brock, G. H.

    1985-01-01

    In parallel to the evolution of Management Information Systems from simple data files to complex data bases, stand-alone computer systems have been migrating toward fully integrated systems serving the work force. The next major productivity gain may very well be to make these highly sophisticated working-level Data Base Management Systems (DBMS) serve all levels of management with reports of varying levels of detail. Most attempts by the DBMS development organization to provide useful information to management seem to bog down in the quagmire of competing working-level requirements. Most large DBMS development organizations possess three- to five-year backlogs. Perhaps Office Automation is the vehicle that brings to pass the Management Information System that really serves management. A good office automation system manned by a team of facilitators seeking opportunities to serve end-users could go a long way toward defining a DBMS that serves management. This paper briefly discusses the problems of the DBMS organization, alternative approaches to solving some of the major problems, a debate about problems that may have no solution, and finally how office automation fits into the development of the manager's Management Information System.

  15. A parallel expert system for the control of a robotic air vehicle

    NASA Technical Reports Server (NTRS)

    Shakley, Donald; Lamont, Gary B.

    1988-01-01

    Expert systems can be used to govern the intelligent control of vehicles, for example the Robotic Air Vehicle (RAV). Due to the nature of the RAV system, the associated expert system needs to perform in a demanding real-time environment. The use of a parallel processing capability to support the associated expert system's computational requirement is critical in this application. Thus, algorithms for parallel real-time expert systems must be designed, analyzed, and synthesized. The design process incorporates a consideration of the rule-set/fact-set size along with representation issues. These issues are looked at in reference to information movement and various inference mechanisms. Also examined is the process involved with transporting the RAV expert system functions from the TI Explorer, where they are implemented in the Automated Reasoning Tool (ART), to the iPSC Hypercube, where the system is synthesized using Concurrent Common LISP (CCLISP). The transformation process for the ART to CCLISP conversion is described. The performance characteristics of the parallel implementation of these expert systems on the iPSC Hypercube are compared to the TI Explorer implementation.

  16. Robust Segmentation of Overlapping Cells in Histopathology Specimens Using Parallel Seed Detection and Repulsive Level Set

    PubMed Central

    Qi, Xin; Xing, Fuyong; Foran, David J.; Yang, Lin

    2013-01-01

    Automated image analysis of histopathology specimens could potentially provide support for early detection and improved characterization of breast cancer. Automated segmentation of the cells comprising imaged tissue microarrays (TMA) is a prerequisite for any subsequent quantitative analysis. Unfortunately, crowding and overlapping of cells present significant challenges for most traditional segmentation algorithms. In this paper, we propose a novel algorithm which can reliably separate touching cells in hematoxylin stained breast TMA specimens which have been acquired using a standard RGB camera. The algorithm is composed of two steps. It begins with a fast, reliable object center localization approach which utilizes single-path voting followed by mean-shift clustering. Next, the contour of each cell is obtained using a level set algorithm based on an interactive model. We compared the experimental results with those reported in the most current literature. Finally, performance was evaluated by comparing the pixel-wise accuracy provided by human experts with that produced by the new automated segmentation algorithm. The method was systematically tested on 234 image patches exhibiting dense overlap and containing more than 2200 cells. It was also tested on whole slide images including blood smears and tissue microarrays containing thousands of cells. Since the voting step of the seed detection algorithm is well suited for parallelization, a parallel version of the algorithm was implemented using graphic processing units (GPU) which resulted in significant speed-up over the C/C++ implementation. PMID:22167559
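
    The voting step itself is beyond a short example, but the subsequent seed-detection stage can be sketched as mean-shift clustering of vote coordinates; the synthetic vote data and bandwidth below are invented for illustration.

```python
import numpy as np
from sklearn.cluster import MeanShift

# Toy stand-in for the voting output: each edge pixel votes for a presumed
# cell center; here the votes are faked around two overlapping cell centers.
rng = np.random.default_rng(0)
votes = np.vstack([
    rng.normal(loc=(30.0, 40.0), scale=2.0, size=(200, 2)),   # votes for cell 1
    rng.normal(loc=(38.0, 44.0), scale=2.0, size=(200, 2)),   # votes for cell 2
])

# Mean-shift clustering of the vote map yields one seed per cell even though
# the vote clouds overlap; the bandwidth is an illustrative choice.
seeds = MeanShift(bandwidth=4.0).fit(votes).cluster_centers_
print(np.round(seeds, 1))   # two centers, near (30, 40) and (38, 44)
```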

  17. Automation of serviceability of radio-relay station equipment

    NASA Astrophysics Data System (ADS)

    Uryev, A. G.; Mishkin, Y. I.; Itkis, G. Y.

    1985-03-01

    Automation of the serviceability inspection of radio relay station equipment must ensure centralized gathering and primary processing of reliable instrument readings with subsequent display on the control panel, timely detection and recording of failures, sufficiently early warning based on analysis of deterioration symptoms, and correct remote measurement of equipment performance parameters. Such an inspection will minimize transmission losses while reducing nonproductive time and labor spent on documentation and measurement. A multichannel automated inspection system for this purpose should operate by a parallel rather than sequential procedure. Digital data processing is more expedient in this case than analog methods and, therefore, analog-to-digital converters are required. Special normal, above-limit, and below-limit test signals provide a means of self-inspection, to which must be added adequate interference immunity, stabilization, and a standby power supply. Use of a microcomputer permits overall refinement and expansion of the inspection system while minimizing, though not completely eliminating, dependence on subjective judgment.

  18. Automated synthesis of a library of triazolated 1,2,5-thiadiazepane 1,1-dioxides via a double aza-Michael strategy.

    PubMed

    Zang, Qin; Javed, Salim; Hill, David; Ullah, Farman; Bi, Danse; Porubsky, Patrick; Neuenswander, Benjamin; Lushington, Gerald H; Santini, Conrad; Organ, Michael G; Hanson, Paul R

    2012-08-13

    The construction of a 96-member library of triazolated 1,2,5-thiadiazepane 1,1-dioxides was performed on a Chemspeed Accelerator (SLT-100) automated parallel synthesis platform, culminating in the successful preparation of 94 out of 96 possible products. The key step, a one-pot, sequential elimination, double-aza-Michael reaction, and [3 + 2] Huisgen cycloaddition pathway has been automated and utilized in the production of two sets of triazolated sultam products.

  19. Automated Synthesis of a Library of Triazolated 1,2,5-Thiadiazepane 1,1-Dioxides via a Double aza-Michael Strategy

    PubMed Central

    Zang, Qin; Javed, Salim; Hill, David; Ullah, Farman; Bi, Danse; Porubsky, Patrick; Neuenswander, Benjamin; Lushington, Gerald H.; Santini, Conrad; Organ, Michael G.; Hanson, Paul R.

    2013-01-01

    The construction of a 96-member library of triazolated 1,2,5-thiadiazepane 1,1-dioxides was performed on a Chemspeed Accelerator (SLT-100) automated parallel synthesis platform, culminating in the successful preparation of 94 out of 96 possible products. The key step, a one-pot, sequential elimination, double-aza-Michael reaction, and [3+2] Huisgen cycloaddition pathway has been automated and utilized in the production of two sets of triazolated sultam products. PMID:22853708

  20. Designing for flexible interaction between humans and automation: delegation interfaces for supervisory control.

    PubMed

    Miller, Christopher A; Parasuraman, Raja

    2007-02-01

    To develop a method enabling human-like, flexible supervisory control via delegation to automation. Real-time supervisory relationships with automation are rarely as flexible as human task delegation to other humans. Flexibility in human-adaptable automation can provide important benefits, including improved situation awareness, more accurate automation usage, more balanced mental workload, increased user acceptance, and improved overall performance. We review problems with static and adaptive (as opposed to "adaptable") automation; contrast these approaches with human-human task delegation, which can mitigate many of the problems; and revise the concept of a "level of automation" as a pattern of task-based roles and authorizations. We argue that delegation requires a shared hierarchical task model between supervisor and subordinates, used to delegate tasks at various levels, and offer instruction on performing them. A prototype implementation called Playbook is described. On the basis of these analyses, we propose methods for supporting human-machine delegation interactions that parallel human-human delegation in important respects. We develop an architecture for machine-based delegation systems based on the metaphor of a sports team's "playbook." Finally, we describe a prototype implementation of this architecture, with an accompanying user interface and usage scenario, for mission planning for uninhabited air vehicles. Delegation offers a viable method for flexible, multilevel human-automation interaction to enhance system performance while maintaining user workload at a manageable level. Most applications of adaptive automation (aviation, air traffic control, robotics, process control, etc.) are potential avenues for the adaptable, delegation approach we advocate. We present an extended example for uninhabited air vehicle mission planning.

  1. Parallel multipoint recording of aligned and cultured neurons on micro channel array toward cellular network analysis.

    PubMed

    Tonomura, Wataru; Moriguchi, Hiroyuki; Jimbo, Yasuhiko; Konishi, Satoshi

    2010-08-01

    This paper describes an advanced Micro Channel Array (MCA) for recording electrophysiological signals of neuronal networks at multiple points simultaneously. The developed MCA is designed for neuronal network analysis which has been studied by the co-authors using the Micro Electrode Arrays (MEA) system, and employs the principles of extracellular recordings. A prerequisite for extracellular recordings with good signal-to-noise ratio is a tight contact between cells and electrodes. The MCA described herein has the following advantages. The electrodes integrated around individual micro channels are electrically isolated to enable parallel multipoint recording. Reliable clamping of a targeted cell through micro channels is expected to improve the cellular selectivity and the attachment between the cell and the electrode toward steady electrophysiological recordings. We cultured hippocampal neurons on the developed MCA. As a result, the spontaneous and evoked spike potentials could be recorded by sucking and clamping the cells at multiple points. In this paper, we describe the design and fabrication of the MCA and the successful electrophysiological recordings leading to the development of an effective cellular network analysis device.

  2. Semi-automated 96-well liquid-liquid extraction for quantitation of drugs in biological fluids.

    PubMed

    Zhang, N; Hoffman, K L; Li, W; Rossi, D T

    2000-02-01

    A semi-automated liquid-liquid extraction (LLE) technique for biological fluid sample preparation was introduced for the quantitation of four drugs in rat plasma. All liquid transfers during sample preparation were automated using a Tomtec Quadra 96 Model 320 liquid-handling robot, which processed up to 96 samples in parallel. The samples were either in 96-deep-well plate or tube-rack format. One plate of samples can be prepared in approximately 1.5 h, and the 96-well plate is directly compatible with the autosampler of an LC/MS system. Selection of organic solvents and recoveries are discussed. Also, precision, relative error, linearity and quantitation of the semi-automated LLE method are estimated for four example drugs using LC/MS/MS with a multiple reaction monitoring (MRM) approach. The applicability of this method and future directions are evaluated.

  3. Complacency and Automation Bias in the Use of Imperfect Automation.

    PubMed

    Wickens, Christopher D; Clegg, Benjamin A; Vieane, Alex Z; Sebok, Angelia L

    2015-08-01

    We examine the effects of two different kinds of decision-aiding automation errors on human-automation interaction (HAI), occurring at the first failure following repeated exposure to correctly functioning automation. The two errors are incorrect advice, triggering the automation bias, and missing advice, reflecting complacency. Contrasts between analogous automation errors in alerting systems, rather than decision aiding, have revealed that alerting false alarms are more problematic to HAI than alerting misses are. Prior research in decision aiding, although contrasting the two aiding errors (incorrect vs. missing), has confounded error expectancy. Participants performed an environmental process control simulation with and without decision aiding. For those with the aid, automation dependence was created through several trials of perfect aiding performance, and an unexpected automation error was then imposed in which automation was either gone (one group) or wrong (a second group). A control group received no automation support. The correct aid supported faster and more accurate diagnosis and lower workload. The aid failure degraded all three variables, but "automation wrong" had a much greater effect on accuracy, reflecting the automation bias, than did "automation gone," reflecting the impact of complacency. Some complacency was manifested for automation gone, by a longer latency and more modest reduction in accuracy. Automation wrong, creating the automation bias, appears to be a more problematic form of automation error than automation gone, reflecting complacency. Decision-aiding automation should indicate its lower degree of confidence in uncertain environments to avoid the automation bias. © 2015, Human Factors and Ergonomics Society.

  4. Flexible End2End Workflow Automation of Hit-Discovery Research.

    PubMed

    Holzmüller-Laue, Silke; Göde, Bernd; Thurow, Kerstin

    2014-08-01

    The article considers a new approach to more complex laboratory automation at the workflow layer. The authors propose the automation of end2end workflows. The combination of all relevant subprocesses (whether automated or manually performed, and regardless of the organizational unit in which they occur) results in end2end processes that include all result dependencies. The end2end approach focuses not only on the classical experiments in synthesis or screening, but also on auxiliary processes such as the production and storage of chemicals, cell culturing, and maintenance, as well as preparatory activities and analyses of experiments. Furthermore, the connection of control flow and data flow in the same process model reduces the effort of data transfer between the involved systems, including the necessary data transformations. This end2end laboratory automation can be realized effectively with the modern methods of business process management (BPM). This approach is based on a new standardization of the process-modeling notation, Business Process Model and Notation 2.0. In drug discovery, several scientific disciplines act together with manifold modern methods, technologies, and a wide range of automated instruments for the discovery and design of target-based drugs. The article discusses the novel BPM-based automation concept with an implemented example of a high-throughput screening of previously synthesized compound libraries. © 2014 Society for Laboratory Automation and Screening.

  5. Different in vitro cellular responses to tamoxifen treatment in polydimethylsiloxane-based devices compared to normal cell culture.

    PubMed

    Wang, Lingyu; Yu, Linfen; Grist, Samantha; Cheung, Karen C; Chen, David D Y

    2017-11-15

    Cell culture systems based on polydimethylsiloxane (PDMS) microfluidic devices offer great flexibility because of their simple fabrication and adaptability. PDMS devices also make it straightforward to set up parallel experiments and can facilitate process automation, potentially speeding up the drug discovery process. However, cells grown in PDMS-based systems can develop in different ways to those grown with conventional culturing systems because of the differences in the containers' surfaces. Despite the growing number of studies on microfluidic cell culture devices, the differences in cellular behavior between PDMS-based devices and normal cell culture systems are poorly characterized. In this work, we investigated the proliferation and autophagy of MCF7 cells cultured in uncoated and Parylene-C coated PDMS wells. Using a quantitative method we developed, combining solid-phase extraction and liquid chromatography-mass spectrometry, we showed that Tamoxifen uptake into the surfaces of uncoated PDMS wells can change the drug's effective concentration in the culture medium, affecting the results of Tamoxifen-induced autophagy and cytotoxicity assays. Such changes must be carefully analyzed before transferring in vitro experiments from a traditional culture environment to a PDMS-based microfluidic system. We also found that cells cultured in Parylene-C coated PDMS wells showed similar proliferation and drug response characteristics to cells cultured in standard polystyrene (PS) plates, indicating that Parylene-C deposition offers an easy way of limiting the uptake of small molecules into porous PDMS materials and significantly improves the performance of PDMS-based devices for cell-related research. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Parallelized direct execution simulation of message-passing parallel programs

    NASA Technical Reports Server (NTRS)

    Dickens, Phillip M.; Heidelberger, Philip; Nicol, David M.

    1994-01-01

    As massively parallel computers proliferate, there is growing interest in finding ways by which the performance of massively parallel codes can be efficiently predicted. This problem arises in diverse contexts such as parallelizing compilers, parallel performance monitoring, and parallel algorithm development. In this paper we describe one solution where one directly executes the application code, but uses a discrete-event simulator to model details of the presumed parallel machine, such as operating system and communication network behavior. Because this approach is computationally expensive, we are interested in its own parallelization, specifically the parallelization of the discrete-event simulator. We describe methods suitable for parallelized direct execution simulation of message-passing parallel programs, and report on the performance of such a system, the Large Application Parallel Simulation Environment (LAPSE), which we have built on the Intel Paragon. On all codes measured to date, LAPSE predicts performance well, typically within 10 percent relative error. Depending on the nature of the application code, we have observed low slowdowns (relative to natively executing code) and high relative speedups using up to 64 processors.
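
    A toy sketch of the direct-execution idea, assuming measured compute durations and a fixed message latency (both invented): computation advances each process's clock directly, while the simulator models only message delivery.

```python
import heapq

def simulate(programs, latency=5.0):
    """Toy direct-execution-style simulator. Compute durations are taken as
    given (as if the code had been directly executed and timed); only message
    delivery is modeled, with a fixed network latency. All numbers invented."""
    clock = {p: 0.0 for p in programs}   # per-process virtual time
    inbox = {p: [] for p in programs}    # min-heaps of message delivery times
    pc = {p: 0 for p in programs}        # per-process program counters
    while any(pc[p] < len(prog) for p, prog in programs.items()):
        ran = False
        for p, prog in programs.items():
            while pc[p] < len(prog):
                op = prog[pc[p]]
                if op[0] == 'compute':
                    clock[p] += op[1]
                elif op[0] == 'send':
                    heapq.heappush(inbox[op[1]], clock[p] + latency)
                else:  # 'recv': block until a message has been sent to us
                    if not inbox[p]:
                        break
                    clock[p] = max(clock[p], heapq.heappop(inbox[p]))
                pc[p] += 1
                ran = True
        if not ran:
            raise RuntimeError('deadlock in the simulated program')
    return clock

# p0 computes 10 s then sends; p1 computes 2 s then waits for the message.
progs = {'p0': [('compute', 10.0), ('send', 'p1')],
         'p1': [('compute', 2.0), ('recv',)]}
print(simulate(progs))   # {'p0': 10.0, 'p1': 15.0}: recv ends at send + latency
```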

  7. Parallel multipoint recording of aligned and cultured neurons on corresponding Micro Channel Array toward on-chip cell analysis.

    PubMed

    Tonomura, W; Moriguchi, H; Jimbo, Y; Konishi, S

    2008-01-01

    This paper describes an advanced Micro Channel Array (MCA) for recording neuronal network activity at multiple points simultaneously. The developed MCA is designed for neuronal network analysis, which has been studied by the co-authors using the MEA (Micro Electrode Arrays) system, and employs the principle of extracellular recording. The presented MCA has the following advantages. First of all, the electrodes integrated around individual micro channels are electrically isolated for parallel multipoint recording. Sucking and clamping of cells through micro channels is expected to improve the cellular selectivity and S/N ratio. In this study, hippocampal neurons were cultured on the developed MCA. As a result, spontaneous and evoked spike potentials could be recorded by sucking and clamping the cells at multiple points. Herein, we describe the successful experimental results together with the design and fabrication of the advanced MCA toward on-chip analysis of neuronal networks.

  8. STAMPS: Software Tool for Automated MRI Post-processing on a supercomputer.

    PubMed

    Bigler, Don C; Aksu, Yaman; Miller, David J; Yang, Qing X

    2009-08-01

    This paper describes a Software Tool for Automated MRI Post-processing (STAMP) of multiple types of brain MRIs on a workstation and for parallel processing on a supercomputer (STAMPS). This software tool enables the automation of nonlinear registration for a large image set and for multiple MR image types. The tool uses standard brain MRI post-processing tools (such as SPM, FSL, and HAMMER) for multiple MR image types in a pipeline fashion. It also contains novel MRI post-processing features. The STAMP image outputs can be used to perform brain analysis using Statistical Parametric Mapping (SPM) or single-/multi-image modality brain analysis using Support Vector Machines (SVMs). Since STAMPS is PBS-based, the supercomputer may be a multi-node computer cluster or one of the latest multi-core computers.
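
    Since STAMPS is PBS-based, the fan-out presumably amounts to generating and submitting one batch job per dataset. The sketch below is hypothetical: the PBS directives, pipeline script name, and subject IDs are all invented, not taken from STAMPS.

```python
from pathlib import Path

# Hypothetical PBS job template; the directives, pipeline script name, and
# resource limits are invented for illustration.
PBS_TEMPLATE = """#!/bin/bash
#PBS -N stamp_{subject}
#PBS -l nodes=1:ppn=1,walltime=04:00:00
cd $PBS_O_WORKDIR
./stamp_pipeline.sh {subject}
"""

def write_jobs(subjects, outdir='jobs'):
    """Write one PBS script per subject scan; a production version would
    submit each with subprocess.run(['qsub', str(path)])."""
    Path(outdir).mkdir(exist_ok=True)
    for subject in subjects:
        path = Path(outdir) / f'{subject}.pbs'
        path.write_text(PBS_TEMPLATE.format(subject=subject))
        print('would submit:', path)

write_jobs(['sub01', 'sub02', 'sub03'])
```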

  9. Automating CPM-GOMS

    NASA Technical Reports Server (NTRS)

    John, Bonnie; Vera, Alonso; Matessa, Michael; Freed, Michael; Remington, Roger

    2002-01-01

    CPM-GOMS is a modeling method that combines the task decomposition of a GOMS analysis with a model of human resource usage at the level of cognitive, perceptual, and motor operations. CPM-GOMS models have made accurate predictions about skilled user behavior in routine tasks, but developing such models is tedious and error-prone. We describe a process for automatically generating CPM-GOMS models from a hierarchical task decomposition expressed in a cognitive modeling tool called Apex. Resource scheduling in Apex automates the difficult task of interleaving the cognitive, perceptual, and motor resources underlying common task operators (e.g. mouse move-and-click). Apex's UI automatically generates PERT charts, which allow modelers to visualize a model's complex parallel behavior. Because interleaving and visualization are now automated, it is feasible to construct arbitrarily long sequences of behavior. To demonstrate the process, we present a model of automated teller interactions in Apex and discuss implications for user modeling.
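
    The PERT-chart prediction reduces to a longest-path (critical-path) computation over the operator network; a minimal sketch with an invented operator set follows.

```python
from functools import lru_cache

# Toy operator network in the critical-path-method spirit that CPM-GOMS builds
# on: nodes are cognitive/perceptual/motor operators with durations (ms),
# edges are dependencies. Durations and structure are invented for illustration.
dur = {'perceive': 100, 'decide': 50, 'move-mouse': 300, 'click': 100}
deps = {'decide': ['perceive'], 'move-mouse': ['perceive'],
        'click': ['decide', 'move-mouse']}

@lru_cache(maxsize=None)
def finish(op):
    """Earliest finish time of an operator = its duration plus the latest
    finish among its prerequisites (interleaved operators overlap for free)."""
    return dur[op] + max((finish(d) for d in deps.get(op, [])), default=0)

print(finish('click'))   # 500 ms: perceive -> move-mouse -> click is critical
```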

  10. Low cost automated whole smear microscopy screening system for detection of acid fast bacilli.

    PubMed

    Law, Yan Nei; Jian, Hanbin; Lo, Norman W S; Ip, Margaret; Chan, Mia Mei Yuk; Kam, Kai Man; Wu, Xiaohua

    2018-01-01

    In countries with a high tuberculosis (TB) burden, there is an urgent need for rapid, large-scale screening to detect smear-positive patients. We developed a computer-aided whole smear screening system that focuses in real-time, captures images and provides diagnostic grading, for both bright-field and fluorescence microscopy, for detection of acid-fast bacilli (AFB) from respiratory specimens. We evaluated the performance of the dual-mode screening system in AFB diagnostic algorithms on concentrated smears with auramine O (AO) staining, as well as direct smears with AO and Ziehl-Neelsen (ZN) staining, using mycobacterial culture results as the gold standard. Adult patient sputum samples submitted for M. tuberculosis culture were divided into three batches for staining: direct AO-stained, direct ZN-stained and concentrated AO-stained smears. All slides were graded by an experienced microscopist, in parallel with the automated whole smear screening system. Sensitivity and specificity of a TB diagnostic algorithm using the screening system alone, and in combination with a microscopist, were evaluated. Of 488 direct AO-stained smears, 228 were culture positive. These yielded a sensitivity of 81.6% and specificity of 74.2%. Of 334 direct smears with ZN staining, 142 were culture positive, which gave a sensitivity of 70.4% and specificity of 76.6%. Of 505 concentrated smears with AO staining, 250 were culture positive, giving a sensitivity of 86.4% and specificity of 71.0%. To further improve performance, machine grading was confirmed by manual smear grading when the number of AFBs detected fell within an uncertainty range. These combined results gave significant improvement in specificity (AO-direct: 85.4%; ZN-direct: 85.4%; AO-concentrated: 92.5%) and slight improvement in sensitivity while requiring only limited manual workload. Our system achieved high sensitivity without substantially compromising specificity when compared to culture results. Significant improvement in
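
    The combined machine/manual triage reduces to a simple decision rule; a sketch with an invented uncertainty band follows (the paper's actual thresholds are not given here).

```python
def grade_smear(afb_count, low=3, high=9):
    """Combined machine/manual triage sketch: counts far from the decision
    boundary are auto-graded; counts inside an uncertainty band go to a
    microscopist. The band (3-9 AFB per smear) is illustrative only."""
    if afb_count < low:
        return ('negative', 'auto')
    if afb_count > high:
        return ('positive', 'auto')
    return ('uncertain', 'manual review')

for n in (0, 5, 40):
    print(n, grade_smear(n))
# 0 -> negative/auto, 5 -> manual review, 40 -> positive/auto
```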

  11. An automated microfluidic platform for C. elegans embryo arraying, phenotyping, and long-term live imaging

    NASA Astrophysics Data System (ADS)

    Cornaglia, Matteo; Mouchiroud, Laurent; Marette, Alexis; Narasimhan, Shreya; Lehnert, Thomas; Jovaisaite, Virginija; Auwerx, Johan; Gijs, Martin A. M.

    2015-05-01

    Studies of the real-time dynamics of embryonic development require a gentle embryo handling method, the possibility of long-term live imaging during the complete embryogenesis, as well as of parallelization providing a population’s statistics, while keeping single embryo resolution. We describe an automated approach that fully accomplishes these requirements for embryos of Caenorhabditis elegans, one of the most employed model organisms in biomedical research. We developed a microfluidic platform which makes use of pure passive hydrodynamics to run on-chip worm cultures, from which we obtain synchronized embryo populations, and to immobilize these embryos in incubator microarrays for long-term high-resolution optical imaging. We successfully employ our platform to investigate morphogenesis and mitochondrial biogenesis during the full embryonic development and elucidate the role of the mitochondrial unfolded protein response (UPRmt) within C. elegans embryogenesis. Our method can be generally used for protein expression and developmental studies at the embryonic level, but can also provide clues to understand the aging process and age-related diseases in particular.

  12. Automation of multi-agent control for complex dynamic systems in heterogeneous computational network

    NASA Astrophysics Data System (ADS)

    Oparin, Gennady; Feoktistov, Alexander; Bogdanova, Vera; Sidorov, Ivan

    2017-01-01

    The rapid progress of high-performance computing entails new challenges related to solving large scientific problems for various subject domains in a heterogeneous distributed computing environment (e.g., a network, Grid system, or Cloud infrastructure). Specialists in the field of parallel and distributed computing pay special attention to the scalability of applications for problem solving. Effective management of a scalable application in a heterogeneous distributed computing environment is still a non-trivial issue, and control systems that operate in networks are especially affected by it. We propose a new approach to multi-agent management of scalable applications in a heterogeneous computational network. The fundamentals of our approach are the integrated use of conceptual programming, simulation modeling, network monitoring, multi-agent management, and service-oriented programming. We developed a special framework for automation of the problem solving. Advantages of the proposed approach are demonstrated with the example of the parametric synthesis of a static linear regulator for complex dynamic systems. Benefits of the scalable application for solving this problem include automation of multi-agent control of the systems in a parallel mode, with various degrees of detailed elaboration.

  13. An Expert System for Automating Nuclear Strike Aircraft Replacement, Aircraft Beddown, and Logistics Movement for the Theater Warfare Exercise.

    DTIC Science & Technology

    1989-12-01

    that can be easily understood. (9) Parallelism. Several system components may need to execute in parallel. For example, the processing of sensor data... knowledge base are not accessible for processing by the database. Also in the likely case that the expert system poses a series of related queries, the... Knowledge base for the automation of logistics movement. The directory containing the strike aircraft replacement knowledge base

  14. Extended Field Laser Confocal Microscopy (EFLCM): Combining automated Gigapixel image capture with in silico virtual microscopy

    PubMed Central

    Flaberg, Emilie; Sabelström, Per; Strandh, Christer; Szekely, Laszlo

    2008-01-01

    Background Confocal laser scanning microscopy has revolutionized cell biology. However, the technique has major limitations in speed and sensitivity due to the fact that a single laser beam scans the sample, allowing only a few microseconds signal collection for each pixel. This limitation has been overcome by the introduction of parallel beam illumination techniques in combination with cold CCD camera based image capture. Methods Using the combination of microlens enhanced Nipkow spinning disc confocal illumination together with fully automated image capture and large scale in silico image processing we have developed a system allowing the acquisition, presentation and analysis of maximum resolution confocal panorama images of several Gigapixel size. We call the method Extended Field Laser Confocal Microscopy (EFLCM). Results We show using the EFLCM technique that it is possible to create a continuous confocal multi-colour mosaic from thousands of individually captured images. EFLCM can digitize and analyze histological slides, sections of entire rodent organs and full size embryos. It can also record hundreds of thousands of cultured cells at multiple wavelengths in single-event or time-lapse fashion on fixed slides, in live cell imaging chambers or microtiter plates. Conclusion The observer independent image capture of EFLCM allows quantitative measurements of fluorescence intensities and morphological parameters on a large number of cells. EFLCM therefore bridges the gap between the mainly illustrative fluorescence microscopy and purely quantitative flow cytometry. EFLCM can also be used as a high content analysis (HCA) instrument for automated screening processes. PMID:18627634

  15. Extended Field Laser Confocal Microscopy (EFLCM): combining automated Gigapixel image capture with in silico virtual microscopy.

    PubMed

    Flaberg, Emilie; Sabelström, Per; Strandh, Christer; Szekely, Laszlo

    2008-07-16

    Confocal laser scanning microscopy has revolutionized cell biology. However, the technique has major limitations in speed and sensitivity due to the fact that a single laser beam scans the sample, allowing only a few microseconds signal collection for each pixel. This limitation has been overcome by the introduction of parallel beam illumination techniques in combination with cold CCD camera based image capture. Using the combination of microlens enhanced Nipkow spinning disc confocal illumination together with fully automated image capture and large scale in silico image processing we have developed a system allowing the acquisition, presentation and analysis of maximum resolution confocal panorama images of several Gigapixel size. We call the method Extended Field Laser Confocal Microscopy (EFLCM). We show using the EFLCM technique that it is possible to create a continuous confocal multi-colour mosaic from thousands of individually captured images. EFLCM can digitize and analyze histological slides, sections of entire rodent organs and full size embryos. It can also record hundreds of thousands of cultured cells at multiple wavelengths in single-event or time-lapse fashion on fixed slides, in live cell imaging chambers or microtiter plates. The observer independent image capture of EFLCM allows quantitative measurements of fluorescence intensities and morphological parameters on a large number of cells. EFLCM therefore bridges the gap between the mainly illustrative fluorescence microscopy and purely quantitative flow cytometry. EFLCM can also be used as a high content analysis (HCA) instrument for automated screening processes.

  16. Verification of Electromagnetic Physics Models for Parallel Computing Architectures in the GeantV Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amadio, G.; et al.

    An intensive R&D and programming effort is required to accomplish new challenges posed by future experimental high-energy particle physics (HEP) programs. The GeantV project aims to narrow the gap between the performance of the existing HEP detector simulation software and the ideal performance achievable, exploiting latest advances in computing technology. The project has developed a particle detector simulation prototype capable of transporting in parallel particles in complex geometries exploiting instruction-level microparallelism (SIMD and SIMT), task-level parallelism (multithreading) and high-level parallelism (MPI), leveraging both the multi-core and the many-core opportunities. We present preliminary verification results concerning the electromagnetic (EM) physics models developed for parallel computing architectures within the GeantV project. In order to exploit the potential of vectorization and accelerators and to make the physics model effectively parallelizable, advanced sampling techniques have been implemented and tested. In this paper we introduce a set of automated statistical tests in order to verify the vectorized models by checking their consistency with the corresponding Geant4 models and to validate them against experimental data.
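
    One plausible form of such an automated consistency check is a two-sample test between distributions of the same observable produced by the two engines; the sketch below uses synthetic data and a Kolmogorov-Smirnov test, which may differ from the statistical tests actually used in GeantV.

```python
import numpy as np
from scipy.stats import ks_2samp

# Sample the same physics observable (say, secondary-particle energies) from a
# reference engine and its vectorized counterpart; here both are synthetic.
rng = np.random.default_rng(42)
reference = rng.exponential(scale=1.0, size=20_000)    # stands in for Geant4
vectorized = rng.exponential(scale=1.0, size=20_000)   # stands in for GeantV

# Two-sample KS test: a small p-value means the vectorized model's output
# distribution is inconsistent with the reference and needs review.
stat, p_value = ks_2samp(reference, vectorized)
print(f'KS statistic = {stat:.4f}, p = {p_value:.3f}')
print('PASS' if p_value > 0.01 else 'FAIL: flag the vectorized model')
```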

  17. Improving the driver-automation interaction: an approach using automation uncertainty.

    PubMed

    Beller, Johannes; Heesen, Matthias; Vollrath, Mark

    2013-12-01

    The aim of this study was to evaluate whether communicating automation uncertainty improves the driver-automation interaction. A false system understanding of infallibility may provoke automation misuse and can lead to severe consequences in case of automation failure. The presentation of automation uncertainty may prevent this false system understanding and, as was shown by previous studies, may have numerous benefits. Few studies, however, have clearly shown the potential of communicating uncertainty information in driving. The current study fills this gap. We conducted a driving simulator experiment, varying the presented uncertainty information between participants (no uncertainty information vs. uncertainty information) and the automation reliability (high vs. low) within participants. Participants interacted with a highly automated driving system while engaging in secondary tasks and were required to cooperate with the automation to drive safely. Quantile regressions and multilevel modeling showed that the presentation of uncertainty information increases the time to collision in the case of automation failure. Furthermore, the data indicated improved situation awareness and better knowledge of fallibility for the experimental group. Consequently, the automation with the uncertainty symbol received higher trust ratings and increased acceptance. The presentation of automation uncertainty through a symbol improves overall driver-automation cooperation. Most automated systems in driving could benefit from displaying reliability information. This display might improve the acceptance of fallible systems and further enhance driver-automation cooperation.

  18. Uniaxial strain of cultured mouse and rat cardiomyocyte strands slows conduction more when its axis is parallel to impulse propagation than when it is perpendicular.

    PubMed

    Buccarello, A; Azzarito, M; Michoud, F; Lacour, S P; Kucera, J P

    2018-05-01

    Cardiac tissue deformation can modify tissue resistance, membrane capacitance and ion currents and hence cause arrhythmogenic slow conduction. Our aim was to investigate whether uniaxial strain causes different changes in conduction velocity (θ) when the principal strain axis is parallel vs perpendicular to impulse propagation. Cardiomyocyte strands were cultured on stretchable custom microelectrode arrays, and θ was determined during steady-state pacing. Uniaxial strain (5%) with principal axis parallel (orthodromic) or perpendicular (paradromic) to propagation was applied for 1 minute and controlled by imaging a grid of markers. The results were analysed in terms of cable theory. Both types of strain induced immediate changes of θ upon application and release. In material coordinates, orthodromic strain decreased θ significantly more (P < .001) than paradromic strain (2.2 ± 0.5% vs 1.0 ± 0.2% in n = 8 mouse cardiomyocyte cultures, 2.3 ± 0.4% vs 0.9 ± 0.5% in n = 4 rat cardiomyocyte cultures, respectively). The larger effect of orthodromic strain can be explained by the increase in axial myoplasmic resistance, which is not altered by paradromic strain. Thus, changes in tissue resistance substantially contributed to the changes of θ during strain, in addition to other influences (eg stretch-activated channels). Besides these immediate effects, the application of strain also consistently initiated a slow progressive decrease in θ and a slow recovery of θ upon release. Changes in cardiac conduction velocity caused by acute stretch do not only depend on the magnitude of strain but also on its orientation relative to impulse propagation. This dependence is due to different effects on tissue resistance. © 2017 Scandinavian Physiological Society. Published by John Wiley & Sons Ltd.
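
    The resistance argument can be made explicit with the standard continuous-cable scaling (a hedged sketch, not the authors' full model):

```latex
% Standard continuous-cable scaling (sketch; notation chosen here, not the
% authors'): velocity ~ space constant / membrane time constant.
\[
  \theta \;\propto\; \frac{\lambda}{\tau_m}
         \;=\; \frac{1}{\tau_m}\sqrt{\frac{r_m}{r_a}}
         \;\propto\; r_a^{-1/2}
\]
% Orthodromic strain stretches the strand along the propagation axis (a
% longer, thinner myoplasmic path), so the axial resistance per unit length
% r_a rises and theta falls; paradromic strain leaves r_a along the
% propagation axis essentially unchanged, consistent with the smaller
% velocity change observed.
```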

  19. Transfection in perfused microfluidic cell culture devices: A case study.

    PubMed

    Raimes, William; Rubi, Mathieu; Super, Alexandre; Marques, Marco P C; Veraitch, Farlan; Szita, Nicolas

    2017-08-01

    Automated microfluidic devices are a promising route towards a point-of-care autologous cell therapy. The initial steps of induced pluripotent stem cell (iPSC) derivation involve transfection and long term cell culture. Integration of these steps would help reduce the cost and footprint of micro-scale devices with applications in cell reprogramming or gene correction. Current examples of transfection integration focus on maximising efficiency rather than viable long-term culture. Here we look for whole process compatibility by integrating automated transfection with a perfused microfluidic device designed for homogeneous culture conditions. The injection process was characterised using fluorescein to establish a LabVIEW-based routine for user-defined automation. Proof-of-concept is demonstrated by chemically transfecting a GFP plasmid into mouse embryonic stem cells (mESCs). Cells transfected in the device showed an improvement in efficiency (34%, n = 3) compared with standard protocols (17.2%, n = 3). This represents a first step towards microfluidic processing systems for cell reprogramming or gene therapy.

  20. Method and automated apparatus for detecting coliform organisms

    NASA Technical Reports Server (NTRS)

    Dill, W. P.; Taylor, R. E.; Jeffers, E. L. (Inventor)

    1980-01-01

    Method and automated apparatus are disclosed for determining the time of detection of metabolically produced hydrogen by coliform bacteria cultured in an electroanalytical cell, measured from the time the cell is inoculated with the bacteria. The detection time data provide bacteria concentration values. The apparatus is sequenced and controlled by a digital computer to discharge a spent sample, clean and sterilize the culture cell, provide a bacteria nutrient into the cell, control the temperature of the nutrient, inoculate the nutrient with a bacteria sample, measure the electrical potential difference produced by the cell, and measure the time of detection from inoculation.
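
    The concentration readout implied by the patent is a calibration from detection time to initial cell count; the linear-in-log calibration below is entirely invented to illustrate the idea.

```python
def cfu_per_ml(t_detect_h, slope_h=-0.85, intercept_h=7.5):
    """Invented linear calibration: detection time (hours) falls linearly with
    log10 of the initial coliform concentration, so inverting it recovers the
    concentration. Slope and intercept here are placeholders, not measured."""
    log10_n = (t_detect_h - intercept_h) / slope_h
    return 10.0 ** log10_n

print(f'{cfu_per_ml(4.1):.0f} CFU/mL')   # a 4.1 h detection time -> 10000 CFU/mL
```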

  1. Parallel Measurement of Circadian Clock Gene Expression and Hormone Secretion in Human Primary Cell Cultures.

    PubMed

    Petrenko, Volodymyr; Saini, Camille; Perrin, Laurent; Dibner, Charna

    2016-11-11

    Circadian clocks are functional in all light-sensitive organisms, allowing for an adaptation to the external world by anticipating daily environmental changes. Considerable progress in our understanding of the tight connection between the circadian clock and most aspects of physiology has been made in the field over the last decade. However, unraveling the molecular basis that underlies the function of the circadian oscillator in humans remains a major technical challenge. Here, we provide a detailed description of an experimental approach for long-term (2-5 days) bioluminescence recording and outflow medium collection in cultured human primary cells. For this purpose, we have transduced primary cells with a lentiviral luciferase reporter that is under control of a core clock gene promoter, which allows for the parallel assessment of hormone secretion and circadian bioluminescence. Furthermore, we describe the conditions for disrupting the circadian clock in primary human cells by transfecting siRNA targeting CLOCK. Our results on the circadian regulation of insulin secretion by human pancreatic islets, and myokine secretion by human skeletal muscle cells, are presented here to illustrate the application of this methodology. These settings can be used to study the molecular makeup of human peripheral clocks and to analyze their functional impact on primary cells under physiological or pathophysiological conditions.

  2. Automated Patch-Clamp Methods for the hERG Cardiac Potassium Channel.

    PubMed

    Houtmann, Sylvie; Schombert, Brigitte; Sanson, Camille; Partiseti, Michel; Bohme, G Andrees

    2017-01-01

    The human Ether-a-go-go Related Gene (hERG) product has been identified as a central ion channel underlying both familial forms of elongated QT interval on the electrocardiogram and drug-induced elongation of the same QT segment. Indeed, reduced function of this potassium channel, which is involved in the repolarization of the cardiac action potential, can produce a type of life-threatening cardiac ventricular arrhythmia called Torsades de Pointes (TdP). Therefore, hERG inhibitory activity of newly synthesized molecules is a relevant structure-activity metric for compound prioritization and optimization in the medicinal chemistry phases of drug discovery. Electrophysiology remains the gold standard for the functional assessment of ion channel pharmacology. Recent years have witnessed the automation and parallelization of the manual patch-clamp technique, allowing higher-throughput screening on recombinant hERG channels. However, the multi-well plate format of automated patch-clamp does not allow visual detection of potential micro-precipitation of poorly soluble compounds. In this chapter we describe bench procedures for the culture and preparation of hERG-expressing CHO cells for recording on an automated patch-clamp workstation. We also show that the sensitivity of the assay can be improved by adding a surfactant to the extracellular medium.

  3. Domain Decomposition By the Advancing-Partition Method for Parallel Unstructured Grid Generation

    NASA Technical Reports Server (NTRS)

    Pirzadeh, Shahyar Z.; Zagaris, George

    2009-01-01

    A new method of domain decomposition has been developed for generating unstructured grids in subdomains either sequentially or using multiple computers in parallel. Domain decomposition is a crucial and challenging step for parallel grid generation. Prior methods are generally based on auxiliary, complex, and computationally intensive operations for defining partition interfaces and usually produce grids of lower quality than those generated in single domains. The new technique, referred to as "Advancing Partition," is based on the Advancing-Front method, which partitions a domain as part of the volume mesh generation in a consistent and "natural" way. The benefits of this approach are: 1) the process of domain decomposition is highly automated, 2) partitioning of domain does not compromise the quality of the generated grids, and 3) the computational overhead for domain decomposition is minimal. The new method has been implemented in NASA's unstructured grid generation code VGRID.

  4. Cockpit automation

    NASA Technical Reports Server (NTRS)

    Wiener, Earl L.

    1988-01-01

    The aims and methods of aircraft cockpit automation are reviewed from a human-factors perspective. Consideration is given to the mixed pilot reception of increased automation, government concern with the safety and reliability of highly automated aircraft, the formal definition of automation, and the ground-proximity warning system and accidents involving controlled flight into terrain. The factors motivating automation include technology availability; safety; economy, reliability, and maintenance; workload reduction and two-pilot certification; more accurate maneuvering and navigation; display flexibility; economy of cockpit space; and military requirements.

  5. Parallel experimental design and multivariate analysis provides efficient screening of cell culture media supplements to improve biosimilar product quality.

    PubMed

    Brühlmann, David; Sokolov, Michael; Butté, Alessandro; Sauer, Markus; Hemberger, Jürgen; Souquet, Jonathan; Broly, Hervé; Jordan, Martin

    2017-07-01

    Rational and high-throughput optimization of mammalian cell culture media has a great potential to modulate recombinant protein product quality. We present a process design method based on parallel design-of-experiment (DoE) of CHO fed-batch cultures in 96-deepwell plates to modulate monoclonal antibody (mAb) glycosylation using medium supplements. To reduce the risk of losing valuable information in an intricate joint screening, 17 compounds were separated into five different groups, considering their mode of biological action. The concentration ranges of the medium supplements were defined according to information encountered in the literature and in-house experience. The screening experiments produced wide glycosylation pattern ranges. Multivariate analysis including principal component analysis and decision trees was used to select the best performing glycosylation modulators. Subsequent D-optimal quadratic design with four factors (three promising compounds and temperature shift) in shake tubes confirmed the outcome of the selection process and provided a solid basis for sequential process development at a larger scale. The glycosylation profile with respect to the specifications for biosimilarity was greatly improved in shake tube experiments: 75% of the conditions were equally close or closer to the specifications for biosimilarity than the best 25% in 96-deepwell plates. Biotechnol. Bioeng. 2017;114: 1448-1458. © 2017 Wiley Periodicals, Inc.
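
    A sketch of the multivariate selection step, assuming a wells-by-supplements design matrix and a single glycoform response; the data are synthetic and the PCA/decision-tree usage below is only one plausible reading of the analysis.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.tree import DecisionTreeRegressor

# Synthetic stand-in for the screen: rows are deep-well cultures, columns of X
# are supplement concentrations, y is one glycoform readout.
rng = np.random.default_rng(1)
X = rng.uniform(0.0, 10.0, size=(96, 5))               # 5 supplements, 96 wells
y = 40 + 2.5 * X[:, 0] - 1.2 * X[:, 3] + rng.normal(0.0, 1.5, 96)

# PCA summarizes how much of the design-space variation the first components
# carry; a shallow decision tree ranks supplements by influence on the readout.
pca = PCA(n_components=2).fit(X)
tree = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X, y)

print(np.round(pca.explained_variance_ratio_, 2))
print(np.round(tree.feature_importances_, 2))   # supplements 0 and 3 dominate
```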

  6. Epileptogenesis in organotypic hippocampal cultures has limited dependence on culture medium composition

    PubMed Central

    Mahoney, Mark M.; Staley, Kevin J.

    2017-01-01

    Rodent organotypic hippocampal cultures spontaneously develop epileptiform activity after approximately 2 weeks in vitro and are increasingly used as a model of chronic post-traumatic epilepsy. However, organotypic cultures are maintained in an artificial environment (culture medium), which contains electrolytes, glucose, amino acids and other components that are not present at the same concentrations in cerebrospinal fluid (CSF). Therefore, it is possible that epileptogenesis in organotypic cultures is driven by these components. We examined the influence of medium composition on epileptogenesis. Epileptogenesis was evaluated by measurements of lactate and lactate dehydrogenase (LDH) levels (biomarkers of ictal activity and cell death, respectively) in spent culture media, immunohistochemistry and automated 3-D cell counts, and extracellular recordings from CA3 regions. Changes in culture medium components moderately influenced lactate and LDH levels as well as electrographic seizure burden and cell death. However, epileptogenesis occurred in any culture medium that was capable of supporting neural survival. We conclude that medium composition is unlikely to be the cause of epileptogenesis in the organotypic hippocampal culture model of chronic post-traumatic epilepsy. PMID:28225808

  7. Epileptogenesis in organotypic hippocampal cultures has limited dependence on culture medium composition.

    PubMed

    Liu, Jing; Saponjian, Yero; Mahoney, Mark M; Staley, Kevin J; Berdichevsky, Yevgeny

    2017-01-01

    Rodent organotypic hippocampal cultures spontaneously develop epileptiform activity after approximately 2 weeks in vitro and are increasingly used as a model of chronic post-traumatic epilepsy. However, organotypic cultures are maintained in an artificial environment (culture medium), which contains electrolytes, glucose, amino acids and other components that are not present at the same concentrations in cerebrospinal fluid (CSF). Therefore, it is possible that epileptogenesis in organotypic cultures is driven by these components. We examined the influence of medium composition on epileptogenesis. Epileptogenesis was evaluated by measurements of lactate and lactate dehydrogenase (LDH) levels (biomarkers of ictal activity and cell death, respectively) in spent culture media, immunohistochemistry and automated 3-D cell counts, and extracellular recordings from CA3 regions. Changes in culture medium components moderately influenced lactate and LDH levels as well as electrographic seizure burden and cell death. However, epileptogenesis occurred in any culture medium that was capable of supporting neural survival. We conclude that medium composition is unlikely to be the cause of epileptogenesis in the organotypic hippocampal culture model of chronic post-traumatic epilepsy.

  8. Transformation From a Conventional Clinical Microbiology Laboratory to Full Automation.

    PubMed

    Moreno-Camacho, José L; Calva-Espinosa, Diana Y; Leal-Leyva, Yoseli Y; Elizalde-Olivas, Dolores C; Campos-Romero, Abraham; Alcántar-Fernández, Jonathan

    2017-12-22

    To validate the performance, reproducibility, and reliability of BD automated instruments in order to establish a fully automated clinical microbiology laboratory. We used control strains and clinical samples to assess the accuracy, reproducibility, and reliability of the BD Kiestra WCA, BD Phoenix, and BD Bruker MALDI-Biotyper instruments and compared them to previously established conventional methods. The following processes were evaluated: sample inoculation and spreading, colony counts, sorting of cultures, antibiotic susceptibility testing, and microbial identification. The BD Kiestra recovered single colonies in less time than conventional methods (e.g. E. coli, 7 h vs 10 h, respectively), and agreement between the two methodologies was excellent for colony counts (κ=0.824) and sorting of cultures (κ=0.821). Antibiotic susceptibility tests performed with the BD Phoenix and disk diffusion demonstrated 96.3% agreement between the two methods. Finally, we compared microbial identification on the BD Phoenix and Bruker MALDI-Biotyper and observed perfect agreement (κ=1) and identification at the species level for control strains. Together these instruments allow us to process clinical urine samples in 36 h (effective time). The BD automated technologies have improved performance compared with conventional methods and are suitable for implementation in very busy microbiology laboratories. © American Society for Clinical Pathology 2017. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

  9. Item Selection for the Development of Parallel Forms from an IRT-Based Seed Test Using a Sampling and Classification Approach

    ERIC Educational Resources Information Center

    Chen, Pei-Hua; Chang, Hua-Hua; Wu, Haiyan

    2012-01-01

    Two sampling-and-classification-based procedures were developed for automated test assembly: the Cell Only and the Cell and Cube methods. A simulation study based on a 540-item bank was conducted to compare the performance of the procedures with the performance of a mixed-integer programming (MIP) method for assembling multiple parallel test…

  10. Evolution of a minimal parallel programming model

    DOE PAGES

    Lusk, Ewing; Butler, Ralph; Pieper, Steven C.

    2017-04-30

    Here, we take a historical approach to our presentation of self-scheduled task parallelism, a programming model with its origins in early irregular and nondeterministic computations encountered in automated theorem proving and logic programming. We show how an extremely simple task model has evolved into a system, asynchronous dynamic load balancing (ADLB), and a scalable implementation capable of supporting sophisticated applications on today’s (and tomorrow’s) largest supercomputers; and we illustrate the use of ADLB with a Green’s function Monte Carlo application, a modern, mature nuclear physics code in production use. Our lesson is that by surrendering a certain amount of generality and thus applicability, a minimal programming model (in terms of its basic concepts and the size of its application programmer interface) can achieve extreme scalability without introducing complexity.
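
    The essence of self-scheduled task parallelism (minus ADLB's distributed, MPI-based implementation) fits in a few lines: a shared pool of independent tasks handed dynamically to whichever worker is free.

```python
from multiprocessing import Pool

def work(task):
    """One self-contained unit of work (e.g., one Monte Carlo walker);
    returns a partial result for the manager to reduce."""
    lo, hi = task
    return sum(i * i for i in range(lo, hi))

if __name__ == '__main__':
    # The task pool: independent work units handed out dynamically, so faster
    # workers simply take more tasks (load balancing falls out for free).
    tasks = [(i * 1_000, (i + 1) * 1_000) for i in range(32)]
    with Pool(processes=4) as pool:
        total = sum(pool.imap_unordered(work, tasks))
    print(total)   # same answer regardless of which worker ran which task
```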

  11. Some thoughts about parallel process and psychotherapy supervision: when is a parallel just a parallel?

    PubMed

    Watkins, C Edward

    2012-09-01

    In a way not done before, Tracey, Bludworth, and Glidden-Tracey ("Are there parallel processes in psychotherapy supervision: An empirical examination," Psychotherapy, 2011, advance online publication, doi:10.1037/a0026246) have shown us that parallel process in psychotherapy supervision can indeed be rigorously and meaningfully researched, and their groundbreaking investigation provides a nice prototype for future supervision studies to emulate. In what follows, I offer a brief complementary comment to Tracey et al., addressing one matter that seems to be a potentially important conceptual and empirical parallel process consideration: When is a parallel just a parallel? PsycINFO Database Record (c) 2012 APA, all rights reserved.

  12. Embodied and Distributed Parallel DJing.

    PubMed

    Cappelen, Birgitta; Andersson, Anders-Petter

    2016-01-01

    Everyone has a right to take part in cultural events and activities, such as music performances and music making. Within Universal Design, enforcing that right is often limited to a focus on physical access to public areas, hearing aids and the like, or on groups of persons with special needs performing in traditional ways, for example people with disabilities performing as musicians on traditional instruments or as actors in theatre. In this paper we focus on the innovative potential of including people with special needs when creating new cultural activities. In our project RHYME our goal was to create health-promoting activities for children with severe disabilities by developing new musical and multimedia technologies. Because of the users' extreme demands and rich contributions, we ended up creating both a new genre of musical instruments and a new art form. We call this new art form Embodied and Distributed Parallel DJing, and the new genre of instruments Empowering Multi-Sensorial Things.

  13. Churchill: an ultra-fast, deterministic, highly scalable and balanced parallelization strategy for the discovery of human genetic variation in clinical and population-scale genomics.

    PubMed

    Kelly, Benjamin J; Fitch, James R; Hu, Yangqiu; Corsmeier, Donald J; Zhong, Huachun; Wetzel, Amy N; Nordquist, Russell D; Newsom, David L; White, Peter

    2015-01-20

    While advances in genome sequencing technology make population-scale genomics a possibility, current approaches for analysis of these data rely upon parallelization strategies that have limited scalability and complex implementations and that lack reproducibility. Churchill, a balanced regional parallelization strategy, overcomes these challenges, fully automating the multiple steps required to go from raw sequencing reads to variant discovery. Through implementation of novel deterministic parallelization techniques, Churchill allows computationally efficient analysis of a high-depth whole-genome sample in less than two hours. The method is highly scalable, enabling full analysis of the 1000 Genomes raw sequence dataset in a week using cloud resources. http://churchill.nchri.org/.
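
    The balanced-region idea can be sketched briefly. The Python fragment below is a hedged illustration, not Churchill's implementation: it assumes hypothetical chromosome sizes and simply cuts the genome into similarly sized, non-overlapping chunks that independent workers could align and call in parallel.

```python
# A toy "balanced regional parallelization": cut the genome into
# similarly sized, non-overlapping regions that independent workers
# can process in parallel. Chromosome sizes are illustrative.
CHROM_SIZES = {"chr1": 248_956_422, "chr2": 242_193_529, "chr3": 198_295_559}

def make_regions(chrom_sizes, n_workers):
    """Split chromosomes into roughly equal-sized regions."""
    total = sum(chrom_sizes.values())
    target = max(1, total // n_workers)      # aimed-for region size
    regions = []
    for chrom, size in chrom_sizes.items():
        start = 0
        while start < size:
            end = min(start + target, size)
            regions.append((chrom, start, end))
            start = end
    return regions

for region in make_regions(CHROM_SIZES, 8):
    print(region)    # each region can be aligned and called independently
```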

  14. Autonomy and Automation

    NASA Technical Reports Server (NTRS)

    Shively, Jay

    2017-01-01

    A significant level of debate and confusion has surrounded the meaning of the terms autonomy and automation. Automation is a multi-dimensional concept, and we propose that Remotely Piloted Aircraft Systems (RPAS) automation should be described with reference to the specific system and task that has been automated, the context in which the automation functions, and other relevant dimensions. In this paper, we present definitions of automation, pilot in the loop, pilot on the loop and pilot out of the loop. We further propose that, in the future, the International Civil Aviation Organization (ICAO) RPAS Panel avoid the use of the terms autonomy and autonomous when referring to automated systems on board RPA. Work Group 7 proposes to develop, in consultation with other workgroups, a taxonomy of Levels of Automation for RPAS.

  15. Semi-automated sorting using holographic optical tweezers remotely controlled by eye/hand tracking camera

    NASA Astrophysics Data System (ADS)

    Tomori, Zoltan; Keša, Peter; Nikorovič, Matej; Kaňka, Jan; Zemánek, Pavel

    2016-12-01

    We propose improved control software for holographic optical tweezers (HOT) suitable for simple semi-automated sorting. The controller receives data from both the human-interface sensors and the HOT microscope camera and processes them. As a result, new positions of the active laser traps are calculated, packed into the network format and sent to the remote HOT. Using the photo-polymerization technique, we created a sorting container consisting of two parallel horizontal walls, where one wall contains "gates" marking the places where a trapped particle enters the container. The positions of particles and gates are obtained by image analysis, which can be exploited to achieve a higher level of automation. Sorting is demonstrated both in a computer-game simulation and in a real experiment.

  16. Passing in Command Line Arguments and Parallel Cluster/Multicore Batching in R with batch.

    PubMed

    Hoffmann, Thomas J

    2011-03-01

    It is often useful to rerun a command line R script with some slight change in the parameters used to run it - a new set of parameters for a simulation, a different dataset to process, etc. The R package batch provides a means to easily pass multiple command line options, including vectors of values in the usual R format, into R. The same script can be set up to run things in parallel via different command line arguments. The R package batch also simplifies this parallel batching by allowing one to use R and an R-like syntax for arguments to spread a script across a cluster or a local multicore/multiprocessor computer, with automated syntax for several popular cluster types. Finally, it provides a means to aggregate the results of multiple processes run on a cluster.
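
    Since this record describes an R package, the snippet below only mirrors the pattern in Python rather than the batch API itself: parameters arrive on the command line, a pool of local workers fans the runs out in parallel, and the results are aggregated at the end. The simulate function and its arguments are invented for illustration.

```python
# A Python sketch of the pattern described above (the R batch API is
# not reproduced here): parse parameters from the command line, fan a
# parameter sweep out over local cores, then aggregate the results.
import argparse
import random
from multiprocessing import Pool

def simulate(seed):
    """Stand-in for one simulation run, parameterized by its seed."""
    random.seed(seed)
    return seed, random.random()

if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("--seeds", type=int, nargs="+", default=[1, 2, 3, 4])
    parser.add_argument("--workers", type=int, default=4)
    args = parser.parse_args()

    with Pool(args.workers) as pool:
        results = pool.map(simulate, args.seeds)   # one run per parameter
    print(dict(results))                           # aggregation step
```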

  17. The role of flow injection analysis within the framework of an automated laboratory

    PubMed Central

    Stockwell, Peter B.

    1990-01-01

    Flow Injection Analysis (FIA) was invented at roughly the same time by two quite dissimilar research groups [1,2]. FIA was patented by both groups in 1974, a year also marked by the publication of the first book on automatic chemical analysis [3]. This book was a major undertaking for its authors and it is hoped that it has added to the knowledge of those analysts attempting to automate their work or to increase the level of computerization/automation and thus reduce staffing commitments. This review discusses the role of FIA in laboratory automation, the advantages and disadvantages of the FIA approach, and the part it could play in future developments. It is important to stress at the outset that the FIA approach is all too often closely paralleled with conventional continuous flow analysis (CFA). This is a mistake for many reasons, not least because of the considerable success of the CFA approach, in contrast to the present lack of penetration of FIA instrumentation in the commercial market-place. PMID:18925262

  18. Cultural evolutionary theory: How culture evolves and why it matters

    PubMed Central

    Creanza, Nicole; Kolodny, Oren; Feldman, Marcus W.

    2017-01-01

    Human cultural traits—behaviors, ideas, and technologies that can be learned from other individuals—can exhibit complex patterns of transmission and evolution, and researchers have developed theoretical models, both verbal and mathematical, to facilitate our understanding of these patterns. Many of the first quantitative models of cultural evolution were modified from existing concepts in theoretical population genetics because cultural evolution has many parallels with, as well as clear differences from, genetic evolution. Furthermore, cultural and genetic evolution can interact with one another and influence both transmission and selection. This interaction requires theoretical treatments of gene–culture coevolution and dual inheritance, in addition to purely cultural evolution. In addition, cultural evolutionary theory is a natural component of studies in demography, human ecology, and many other disciplines. Here, we review the core concepts in cultural evolutionary theory as they pertain to the extension of biology through culture, focusing on cultural evolutionary applications in population genetics, ecology, and demography. For each of these disciplines, we review the theoretical literature and highlight relevant empirical studies. We also discuss the societal implications of the study of cultural evolution and of the interactions of humans with one another and with their environment. PMID:28739941

  19. Cultural evolutionary theory: How culture evolves and why it matters.

    PubMed

    Creanza, Nicole; Kolodny, Oren; Feldman, Marcus W

    2017-07-24

    Human cultural traits-behaviors, ideas, and technologies that can be learned from other individuals-can exhibit complex patterns of transmission and evolution, and researchers have developed theoretical models, both verbal and mathematical, to facilitate our understanding of these patterns. Many of the first quantitative models of cultural evolution were modified from existing concepts in theoretical population genetics because cultural evolution has many parallels with, as well as clear differences from, genetic evolution. Furthermore, cultural and genetic evolution can interact with one another and influence both transmission and selection. This interaction requires theoretical treatments of gene-culture coevolution and dual inheritance, in addition to purely cultural evolution. In addition, cultural evolutionary theory is a natural component of studies in demography, human ecology, and many other disciplines. Here, we review the core concepts in cultural evolutionary theory as they pertain to the extension of biology through culture, focusing on cultural evolutionary applications in population genetics, ecology, and demography. For each of these disciplines, we review the theoretical literature and highlight relevant empirical studies. We also discuss the societal implications of the study of cultural evolution and of the interactions of humans with one another and with their environment.

  20. Phaser.MRage: automated molecular replacement.

    PubMed

    Bunkóczi, Gábor; Echols, Nathaniel; McCoy, Airlie J; Oeffner, Robert D; Adams, Paul D; Read, Randy J

    2013-11-01

    Phaser.MRage is a molecular-replacement automation framework that implements a full model-generation workflow and provides several layers of model exploration to the user. It is designed to handle a large number of models and can distribute calculations efficiently onto parallel hardware. In addition, phaser.MRage can identify correct solutions and use this information to accelerate the search. Firstly, it can quickly score all alternative models of a component once a correct solution has been found. Secondly, it can perform extensive analysis of identified solutions to find protein assemblies and can employ assembled models for subsequent searches. Thirdly, it is able to use a priori assembly information (derived from, for example, homologues) to speculatively place and score molecules, thereby customizing the search procedure to a certain class of protein molecule (for example, antibodies) and incorporating additional biological information into molecular replacement.

  1. An Automated, High-Throughput System for GISAXS and GIWAXS Measurements of Thin Films

    NASA Astrophysics Data System (ADS)

    Schaible, Eric; Jimenez, Jessica; Church, Matthew; Lim, Eunhee; Stewart, Polite; Hexemer, Alexander

    Grazing incidence small-angle X-ray scattering (GISAXS) and grazing incidence wide-angle X-ray scattering (GIWAXS) are important techniques for characterizing thin films. In order to meet rapidly increasing demand, the SAXS/WAXS beamline at the Advanced Light Source (beamline 7.3.3) has implemented a fully automated, high-throughput system to conduct SAXS, GISAXS and GIWAXS measurements. An automated robot arm transfers samples from a holding tray to a measurement stage. Intelligent software aligns each sample in turn, and measures each according to user-defined specifications. Users mail in trays of samples on individually barcoded pucks, and can download and view their data remotely. Data will be pipelined to the NERSC supercomputing facility, and will be available to users via a web portal that facilitates highly parallelized analysis.

  2. Searching for globally optimal functional forms for interatomic potentials using genetic programming with parallel tempering.

    PubMed

    Slepoy, A; Peters, M D; Thompson, A P

    2007-11-30

    Molecular dynamics and other molecular simulation methods rely on a potential energy function, based only on the relative coordinates of the atomic nuclei. Such a function, called a force field, approximately represents the electronic structure interactions of a condensed matter system. Developing such approximate functions and fitting their parameters remains an arduous, time-consuming process, relying on expert physical intuition. To address this problem, a functional programming methodology was developed that may enable automated discovery of entirely new force-field functional forms, while simultaneously fitting parameter values. The method uses a combination of genetic programming, Metropolis Monte Carlo importance sampling and parallel tempering to efficiently search a large space of candidate functional forms and parameters. The methodology was tested using a nontrivial problem with a well-defined globally optimal solution: a small set of atomic configurations was generated and the energy of each configuration was calculated using the Lennard-Jones pair potential. Starting with a population of random functions, our fully automated, massively parallel implementation of the method reproducibly discovered the original Lennard-Jones pair potential by searching for several hours on 100 processors, sampling only a minuscule portion of the total search space. This result indicates that, with further improvement, the method may be suitable for unsupervised development of more accurate force fields with completely new functional forms. Copyright (c) 2007 Wiley Periodicals, Inc.
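
    The parallel-tempering component can be illustrated with a minimal numeric sketch. This is not the paper's genetic-programming search: the toy objective and Gaussian moves below stand in for mutations of functional forms, but the Metropolis acceptance rule and neighbor-temperature swaps follow the standard scheme.

```python
# A minimal parallel-tempering sketch over a toy scalar objective.
# fitness() and the temperature ladder are invented for illustration.
import math
import random

def fitness(x):
    return (x - 2.0) ** 2          # toy "energy", minimum at x = 2

def parallel_tempering(temps, steps=5000):
    states = [random.uniform(-10, 10) for _ in temps]
    for _ in range(steps):
        # Metropolis update within each temperature replica.
        for i, T in enumerate(temps):
            cand = states[i] + random.gauss(0, 1)
            dE = fitness(cand) - fitness(states[i])
            if dE < 0 or random.random() < math.exp(-dE / T):
                states[i] = cand
        # Attempt to swap a random pair of neighboring replicas.
        i = random.randrange(len(temps) - 1)
        delta = (fitness(states[i]) - fitness(states[i + 1])) * \
                (1 / temps[i] - 1 / temps[i + 1])
        if random.random() < math.exp(min(0.0, delta)):
            states[i], states[i + 1] = states[i + 1], states[i]
    return states

print(parallel_tempering([0.1, 0.5, 1.0, 2.0]))   # coldest replica ends near 2
```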

  3. The Automated Instrumentation and Monitoring System (AIMS) reference manual

    NASA Technical Reports Server (NTRS)

    Yan, Jerry; Hontalas, Philip; Listgarten, Sherry

    1993-01-01

    Whether a researcher is designing the 'next parallel programming paradigm,' another 'scalable multiprocessor' or investigating resource allocation algorithms for multiprocessors, a facility that enables parallel program execution to be captured and displayed is invaluable. Careful analysis of execution traces can help computer designers and software architects to uncover system behavior and to take advantage of specific application characteristics and hardware features. A software tool kit that facilitates performance evaluation of parallel applications on multiprocessors is described. The Automated Instrumentation and Monitoring System (AIMS) has four major software components: a source code instrumentor, which automatically inserts active event recorders into the program's source code before compilation; a run-time performance-monitoring library, which collects performance data; a trace file animation and analysis tool kit, which reconstructs program execution from the trace file; and a trace post-processor, which compensates for data collection overhead. Besides being used as a prototype for developing new techniques for instrumenting, monitoring, and visualizing parallel program execution, AIMS is also being incorporated into the run-time environments of various hardware test beds to evaluate their impact on user productivity. Currently, AIMS instrumentors accept FORTRAN and C parallel programs written for Intel's NX operating system on the iPSC family of multicomputers. A run-time performance-monitoring library for the iPSC/860 is included in this release. We plan to release monitors for other platforms (such as PVM and TMC's CM-5) in the near future. Performance data collected can be graphically displayed on workstations (e.g. Sun Sparc and SGI) supporting X-Windows (in particular, X11R5, Motif 1.1.3).

  4. SeqMule: automated pipeline for analysis of human exome/genome sequencing data.

    PubMed

    Guo, Yunfei; Ding, Xiaolei; Shen, Yufeng; Lyon, Gholson J; Wang, Kai

    2015-09-18

    Next-generation sequencing (NGS) technology has greatly helped us identify disease-contributory variants for Mendelian diseases. However, users are often faced with issues such as software compatibility, complicated configuration, and lack of access to high-performance computing facilities. Discrepancies exist among aligners and variant callers. We developed a computational pipeline, SeqMule, to perform automated variant calling from NGS data on human genomes and exomes. SeqMule integrates computational-cluster-free parallelization capability built on top of the variant callers, and facilitates normalization/intersection of variant calls to generate a consensus set with high confidence. SeqMule integrates 5 alignment tools and 5 variant calling algorithms and accepts various combinations, all by a one-line command, therefore allowing highly flexible yet fully automated variant calling. On a modern machine (2 Intel Xeon X5650 CPUs, 48 GB memory), when fast turn-around is needed, SeqMule generates annotated VCF files in a day from a 30X whole-genome sequencing data set; when more accurate calling is needed, SeqMule generates a consensus call set that improves over single callers, as measured by both Mendelian error rate and consistency. SeqMule supports Sun Grid Engine for parallel processing, offers a turn-key solution for deployment on Amazon Web Services, and allows quality checks, Mendelian error checks, consistency evaluation, and HTML-based reports. SeqMule is available at http://seqmule.openbioinformatics.org.
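
    The consensus step can be sketched as a simple voting rule over normalized variant calls: keep a variant if at least k callers report it. The variant keys and caller outputs below are invented; SeqMule itself operates on normalized VCF files.

```python
# Hedged sketch of a consensus call set: keep variants reported by at
# least min_support callers. Variants are (chrom, pos, ref, alt) keys.
from collections import Counter

def consensus(callsets, min_support=3):
    """callsets: list of sets of (chrom, pos, ref, alt) tuples."""
    counts = Counter(v for calls in callsets for v in calls)
    return {v for v, n in counts.items() if n >= min_support}

caller_a = {("chr1", 12345, "A", "G"), ("chr2", 500, "C", "T")}
caller_b = {("chr1", 12345, "A", "G")}
caller_c = {("chr1", 12345, "A", "G"), ("chr3", 42, "G", "A")}
print(consensus([caller_a, caller_b, caller_c], min_support=2))
```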

  5. A standardized staining protocol for L1CAM on formalin-fixed, paraffin-embedded tissues using automated platforms.

    PubMed

    Fogel, Mina; Harari, Ayelet; Müller-Holzner, Elisabeth; Zeimet, Alain G; Moldenhauer, Gerhard; Altevogt, Peter

    2014-06-25

    The L1 cell adhesion molecule (L1CAM) is overexpressed in many human cancers and can serve as a biomarker for prognosis in most of these cancers (including type I endometrial carcinomas). Here we provide an optimized immunohistochemical staining procedure for a widely used automated platform (VENTANA™), which uses commercially available primary antibody and detection reagents. In parallel, we optimized the staining on a semi-automated BioGenex (i6000) immunostainer. These protocols yield good staining and should represent the basis for reliable and standardized immunohistochemical detection of L1CAM in a variety of malignancies in different laboratories.

  6. Quantitative high-throughput population dynamics in continuous-culture by automated microscopy.

    PubMed

    Merritt, Jason; Kuehn, Seppe

    2016-09-12

    We present a high-throughput method to measure abundance dynamics in microbial communities sustained in continuous culture. Our method uses custom epi-fluorescence microscopes to automatically image single cells drawn from a continuously cultured population while precisely controlling culture conditions. For clonal populations of Escherichia coli, our instrument reveals history-dependent resilience and growth-rate-dependent aggregation.

  7. Parallel rendering

    NASA Technical Reports Server (NTRS)

    Crockett, Thomas W.

    1995-01-01

    This article provides a broad introduction to the subject of parallel rendering, encompassing both hardware and software systems. The focus is on the underlying concepts and the issues which arise in the design of parallel rendering algorithms and systems. We examine the different types of parallelism and how they can be applied in rendering applications. Concepts from parallel computing, such as data decomposition, task granularity, scalability, and load balancing, are considered in relation to the rendering problem. We also explore concepts from computer graphics, such as coherence and projection, which have a significant impact on the structure of parallel rendering algorithms. Our survey covers a number of practical considerations as well, including the choice of architectural platform, communication and memory requirements, and the problem of image assembly and display. We illustrate the discussion with numerous examples from the parallel rendering literature, representing most of the principal rendering methods currently used in computer graphics.

  8. Tensor contraction engine: Abstraction and automated parallel implementation of configuration-interaction, coupled-cluster, and many-body perturbation theories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hirata, So

    2003-11-20

    We develop a symbolic manipulation program and program generator (Tensor Contraction Engine or TCE) that automatically derives the working equations of a well-defined model of second-quantized many-electron theories and synthesizes efficient parallel computer programs on the basis of these equations. Provided an ansatz of a many-electron theory model, TCE performs valid contractions of creation and annihilation operators according to Wick's theorem, consolidates identical terms, and reduces the expressions into the form of multiple tensor contractions acted by permutation operators. Subsequently, it determines the binary contraction order for each multiple tensor contraction with the minimal operation and memory cost, factorizes common binary contractions (defines intermediate tensors), and identifies reusable intermediates. The resulting ordered list of binary tensor contractions, additions, and index permutations is translated into an optimized program that is combined with the NWChem and UTChem computational chemistry software packages. The programs synthesized by TCE take advantage of spin symmetry, Abelian point-group symmetry, and index permutation symmetry at every stage of calculations to minimize the number of arithmetic operations and storage requirement, adjust the peak local memory usage by index range tiling, and support parallel I/O interfaces and dynamic load balancing for parallel executions. We demonstrate the utility of TCE through automatic derivation and implementation of parallel programs for various models of configuration-interaction theory (CISD, CISDT, CISDTQ), many-body perturbation theory [MBPT(2), MBPT(3), MBPT(4)], and coupled-cluster theory (LCCD, CCD, LCCSD, CCSD, QCISD, CCSDT, and CCSDTQ).
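
    The core ordering problem TCE solves for each multiple tensor contraction can be illustrated with a toy exhaustive search: given tensors as sets of index labels, find the binary contraction order with minimal floating-point cost. The index dimensions below are invented, and real TCE additionally factorizes intermediates and exploits symmetry.

```python
# Toy version of the contraction-ordering problem: exhaustively search
# binary contraction orders for the cheapest one. Tensors are frozensets
# of index labels; shared indices are summed over.
from itertools import combinations

DIM = {"i": 10, "j": 10, "a": 100, "b": 100}   # e.g. occupied vs virtual

def contract(s, t):
    """Cost of contracting two tensors and the resulting index set."""
    cost = 1
    for idx in s | t:          # every index appears once in the loop nest
        cost *= DIM[idx]
    return cost, s ^ t         # shared (summed) indices disappear

def best_order(tensors):
    """Minimal total cost of reducing the list to a single tensor."""
    if len(tensors) == 1:
        return 0
    best = None
    for x, y in combinations(range(len(tensors)), 2):
        cost, result = contract(tensors[x], tensors[y])
        rest = [t for k, t in enumerate(tensors) if k not in (x, y)]
        total = cost + best_order(rest + [result])
        best = total if best is None else min(best, total)
    return best

# e.g. an amplitude t[i,j,a,b] contracted with v[a,b] and f[i,j]
print(best_order([frozenset("ijab"), frozenset("ab"), frozenset("ij")]))
```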

  9. Automated microaneurysm detection in diabetic retinopathy using curvelet transform

    NASA Astrophysics Data System (ADS)

    Ali Shah, Syed Ayaz; Laude, Augustinus; Faye, Ibrahima; Tang, Tong Boon

    2016-10-01

    Microaneurysms (MAs) are known to be the early signs of diabetic retinopathy (DR). An automated MA detection system based on the curvelet transform is proposed for color fundus image analysis. MA candidates were extracted in two parallel steps. In step one, blood vessels were removed from the preprocessed green-band image and preliminary MA candidates were selected by a local thresholding technique. In step two, the image background was estimated based on statistical features. The results from the two steps allowed us to identify preliminary MA candidates that were also present in the image foreground. A collection of features was fed to a rule-based classifier to divide the candidates into MAs and non-MAs. The proposed system was tested with the Retinopathy Online Challenge database. The automated system detected 162 MAs out of 336, thus achieving a sensitivity of 48.21% with 65 false positives per image. Counting MAs is a means to measure the progression of DR. Hence, the proposed system may be deployed to monitor the progression of DR at an early stage in population studies.

  10. Automated microaneurysm detection in diabetic retinopathy using curvelet transform.

    PubMed

    Ali Shah, Syed Ayaz; Laude, Augustinus; Faye, Ibrahima; Tang, Tong Boon

    2016-10-01

    Microaneurysms (MAs) are known to be the early signs of diabetic retinopathy (DR). An automated MA detection system based on the curvelet transform is proposed for color fundus image analysis. MA candidates were extracted in two parallel steps. In step one, blood vessels were removed from the preprocessed green-band image and preliminary MA candidates were selected by a local thresholding technique. In step two, the image background was estimated based on statistical features. The results from the two steps allowed us to identify preliminary MA candidates that were also present in the image foreground. A collection of features was fed to a rule-based classifier to divide the candidates into MAs and non-MAs. The proposed system was tested with the Retinopathy Online Challenge database. The automated system detected 162 MAs out of 336, thus achieving a sensitivity of 48.21% with 65 false positives per image. Counting MAs is a means to measure the progression of DR. Hence, the proposed system may be deployed to monitor the progression of DR at an early stage in population studies.
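
    A rough sketch of the two-step candidate extraction, assuming a preprocessed green-band image held as a NumPy array: candidates must be dark relative to a local neighborhood (step one) and dark relative to a coarse background estimate (step two). The published system removes vessels with the curvelet transform and applies a rule-based classifier afterwards; the window sizes and thresholds here are arbitrary.

```python
# Rough two-step candidate extraction on a NumPy "green band" image.
import numpy as np
from scipy.ndimage import uniform_filter

def ma_candidates(green, win=15, k=0.04, bg_win=101):
    g = green.astype(float)
    step1 = g < uniform_filter(g, size=win) - k    # dark vs local mean
    step2 = g < uniform_filter(g, size=bg_win)     # in the dark foreground
    return step1 & step2                           # kept by both steps

rng = np.random.default_rng(0)
fake_green = rng.random((256, 256))    # stand-in for a fundus green band
print(ma_candidates(fake_green).sum(), "candidate pixels")
```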

  11. Driving out errors through tight integration between software and automation.

    PubMed

    Reifsteck, Mark; Swanson, Thomas; Dallas, Mary

    2006-01-01

    A clear case has been made for using clinical IT to improve medication safety, particularly bar-code point-of-care medication administration and computerized practitioner order entry (CPOE) with clinical decision support. The equally important role of automation has been overlooked. When the two are tightly integrated, with pharmacy information serving as a hub, the distinctions between software and automation become blurred. A true end-to-end medication management system drives out errors from the dockside to the bedside. Presbyterian Healthcare Services in Albuquerque has been building such a system since 1999, beginning by automating pharmacy operations to support bar-coded medication administration. Encouraged by those results, it then began layering on software to further support clinician workflow and improve communication, culminating with the deployment of CPOE and clinical decision support. This combination, plus a hard-wired culture of safety, has resulted in a dramatically lower mortality and harm rate that could not have been achieved with a partial solution.

  12. Mass culture of photobacteria to obtain luciferase

    NASA Technical Reports Server (NTRS)

    Chappelle, E. W.; Picciolo, G. L.; Rich, E., Jr.

    1969-01-01

    Inoculating preheated trays containing nutrient agar with photobacteria provides a means for mass culture of aerobic microorganisms in order to obtain large quantities of luciferase. To determine optimum harvest time, growth can be monitored by automated light-detection instrumentation.

  13. Automation of serum (1→3)-beta-D-glucan testing allows reliable and rapid discrimination of patients with and without candidemia.

    PubMed

    Prüller, Florian; Wagner, Jasmin; Raggam, Reinhard B; Hoenigl, Martin; Kessler, Harald H; Truschnig-Wilders, Martie; Krause, Robert

    2014-07-01

    Testing for (1→3)-beta-D-glucan (BDG) is used for detection of invasive fungal infection. However, current assays lack automation and the ability to conduct rapid single-sample testing. The Fungitell assay was adapted for automation and evaluated using clinical samples from patients with culture-proven candidemia and from culture-negative controls in duplicate. A comparison with the standard assay protocol was made in order to establish analytical specifications. With the automated protocol, the analytical measuring range was 8-2500 pg/ml of BDG, and precision testing resulted in coefficients of variation that ranged from 3.0% to 5.5%. Samples from 15 patients with culture-proven candidemia and 94 culture-negative samples were evaluated. All culture-proven samples showed BDG values >80 pg/ml (mean 1247 pg/ml; range, 116-2990 pg/ml), which were considered positive. Of the 94 culture-negative samples, 92 had BDG values <60 pg/ml (mean, 28 pg/ml), which were considered to be negative, and 2 samples were false-positive (≥80 pg/ml; up to 124 pg/ml). Results could be obtained within 45 min and showed excellent agreement with results obtained with the standard assay protocol. The automated Fungitell assay proved to be reliable and rapid for diagnosis of candidemia. It was demonstrated to be feasible and cost efficient for both single-sample and large-scale testing of serum BDG. Its 1-h time-to-result will allow better support for clinicians in the management of antifungal therapy. © The Author 2014. Published by Oxford University Press on behalf of The International Society for Human and Animal Mycology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  14. Validation of an automated colony counting system for group A Streptococcus.

    PubMed

    Frost, H R; Tsoi, S K; Baker, C A; Laho, D; Sanderson-Smith, M L; Steer, A C; Smeesters, P R

    2016-02-08

    The practice of counting bacterial colony forming units on agar plates has long been used as a method to estimate the concentration of live bacteria in culture. However, due to the laborious and potentially error prone nature of this measurement technique, an alternative method is desirable. Recent technologic advancements have facilitated the development of automated colony counting systems, which reduce errors introduced during the manual counting process and recording of information. An additional benefit is the significant reduction in time taken to analyse colony counting data. Whilst automated counting procedures have been validated for a number of microorganisms, the process has not been successful for all bacteria due to the requirement for a relatively high contrast between bacterial colonies and growth medium. The purpose of this study was to validate an automated counting system for use with group A Streptococcus (GAS). Twenty-one different GAS strains, representative of major emm-types, were selected for assessment. In order to introduce the required contrast for automated counting, 2,3,5-triphenyl-2H-tetrazolium chloride (TTC) dye was added to Todd-Hewitt broth with yeast extract (THY) agar. Growth on THY agar with TTC was compared with growth on blood agar and THY agar to ensure the dye was not detrimental to bacterial growth. Automated colony counts using a ProtoCOL 3 instrument were compared with manual counting to confirm accuracy over the stages of the growth cycle (latent, mid-log and stationary phases) and in a number of different assays. The average percentage differences between plating and counting methods were analysed using the Bland-Altman method. A percentage difference of ±10 % was determined as the cut-off for a critical difference between plating and counting methods. All strains measured had an average difference of less than 10 % when plated on THY agar with TTC. This consistency was also observed over all phases of the growth
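
    The ±10% agreement criterion can be sketched with the Bland-Altman percentage-difference calculation on paired counts. The counts below are invented; the study compared manual and ProtoCOL 3 counts across 21 strains and growth phases.

```python
# Bland-Altman percentage differences for paired colony counts; the
# ±10% cut-off mirrors the criterion described above. Counts invented.
import numpy as np

manual = np.array([120, 95, 210, 48, 300])        # CFU counts, manual
automated = np.array([118, 99, 205, 50, 310])     # CFU counts, automated

pct_diff = 100 * (automated - manual) / ((automated + manual) / 2)
mean_diff = pct_diff.mean()
print(f"mean percentage difference: {mean_diff:+.1f}%")
print("within the ±10% criterion:", abs(mean_diff) <= 10)
```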

  15. Designing a parallel evolutionary algorithm for inferring gene networks on the cloud computing environment.

    PubMed

    Lee, Wei-Po; Hsiao, Yu-Ting; Hwang, Wei-Che

    2014-01-16

    To improve on the tedious task of reconstructing gene networks by experimentally testing the possible interactions between genes, it has become a trend to adopt the automated reverse engineering procedure instead. Some evolutionary algorithms have been suggested for deriving network parameters. However, to infer large networks with an evolutionary algorithm, it is necessary to address two important issues: premature convergence and high computational cost. To tackle the former problem and to enhance the performance of traditional evolutionary algorithms, it is advisable to use parallel-model evolutionary algorithms. To overcome the latter and to speed up the computation, it is advocated to adopt the mechanism of cloud computing as a promising solution; the most popular is the MapReduce programming model, a fault-tolerant framework for implementing parallel algorithms to infer large gene networks. This work presents a practical framework to infer large gene networks by developing and parallelizing a hybrid GA-PSO optimization method. Our parallel method is extended to work with the Hadoop MapReduce programming model and is executed in different cloud computing environments. To evaluate the proposed approach, we use the well-known open-source software GeneNetWeaver to create several yeast S. cerevisiae sub-networks and use them to produce gene profiles. Experiments have been conducted and the results have been analyzed. They show that our parallel approach can be successfully used to infer networks with desired behaviors and that the computation time can be largely reduced. Parallel population-based algorithms can effectively determine network parameters and they perform better than the widely used sequential algorithms in gene network inference. These parallel algorithms can be distributed to the cloud computing environment to speed up the computation. By coupling the parallel model population-based optimization method and the parallel computational framework, high

  16. Designing a parallel evolutionary algorithm for inferring gene networks on the cloud computing environment

    PubMed Central

    2014-01-01

    Background To improve on the tedious task of reconstructing gene networks by experimentally testing the possible interactions between genes, it has become a trend to adopt the automated reverse engineering procedure instead. Some evolutionary algorithms have been suggested for deriving network parameters. However, to infer large networks with an evolutionary algorithm, it is necessary to address two important issues: premature convergence and high computational cost. To tackle the former problem and to enhance the performance of traditional evolutionary algorithms, it is advisable to use parallel-model evolutionary algorithms. To overcome the latter and to speed up the computation, it is advocated to adopt the mechanism of cloud computing as a promising solution; the most popular is the MapReduce programming model, a fault-tolerant framework for implementing parallel algorithms to infer large gene networks. Results This work presents a practical framework to infer large gene networks by developing and parallelizing a hybrid GA-PSO optimization method. Our parallel method is extended to work with the Hadoop MapReduce programming model and is executed in different cloud computing environments. To evaluate the proposed approach, we use the well-known open-source software GeneNetWeaver to create several yeast S. cerevisiae sub-networks and use them to produce gene profiles. Experiments have been conducted and the results have been analyzed. They show that our parallel approach can be successfully used to infer networks with desired behaviors and that the computation time can be largely reduced. Conclusions Parallel population-based algorithms can effectively determine network parameters and they perform better than the widely used sequential algorithms in gene network inference. These parallel algorithms can be distributed to the cloud computing environment to speed up the computation. By coupling the parallel model population-based optimization method and the parallel

  17. New culture devices in ART.

    PubMed

    Rienzi, L; Vajta, G; Ubaldi, F

    2011-09-01

    During the past decades, improvements in the culture of preimplantation embryos have contributed substantially to the success of human assisted reproductive techniques. However, most efforts were focused on optimization of media and gas components, while the established physical conditions and applied devices have remained essentially unchanged. Very recently, however, intensive research has been started to provide a more appropriate environment for the embryos and to replace the rather primitive and inappropriate devices with more sophisticated and practical instruments. Success has been reported with simple or sophisticated tools (microwells or microchannels) that allow accumulation of autocrine factors and establishment of a proper microenvironment for embryos cultured individually or in groups. The microchannel system may also offer a certain level of automation and increased standardization of culture parameters. Continuous monitoring of individual embryos by optical or biochemical methods may help to determine the optimal day of transfer and to select the embryo with the highest developmental competence for transfer. This advancement may eventually lead to adjustment of the culture environment to each individual embryo according to its actual needs. Connecting these techniques to additional radical approaches, such as automated ICSI or an ultimate assisted hatching with full removal of the zona pellucida after or even before fertilization, may result in devices with high reliability and consistency, increase the overall efficiency, decrease the work intensity, and eliminate the existing technological gap between laboratory embryology work and most other fields of biomedical sciences. Copyright © 2011 Elsevier Ltd. All rights reserved.

  18. PCLIPS: Parallel CLIPS

    NASA Technical Reports Server (NTRS)

    Hall, Lawrence O.; Bennett, Bonnie H.; Tello, Ivan

    1994-01-01

    A parallel version of CLIPS 5.1 has been developed to run on Intel Hypercubes. The user interface is the same as that for CLIPS, with some added commands to allow for parallel calls. A complete version of CLIPS runs on each node of the hypercube. The system has been instrumented to display the time spent in the match, recognize, and act cycles on each node. Only rule-level parallelism is supported. Parallel commands enable the assertion and retraction of facts to/from remote nodes' working memory. Parallel CLIPS was used to implement a knowledge-based command, control, communications, and intelligence (C(sup 3)I) system to demonstrate the fusion of high-level, disparate sources. We discuss the nature of the information fusion problem, our approach, and implementation. Parallel CLIPS has also been used to run several benchmark parallel knowledge bases, such as one to set up a cafeteria. Results from running Parallel CLIPS with parallel knowledge-base partitions indicate that significant speed increases, including superlinear in some cases, are possible.

  19. An automated perfusion bioreactor for the streamlined production of engineered osteogenic grafts.

    PubMed

    Ding, Ming; Henriksen, Susan S; Wendt, David; Overgaard, Søren

    2016-04-01

    A computer-controlled perfusion bioreactor was developed for the streamlined production of engineered osteogenic grafts. This system automated the required bioprocesses, from the initial filling of the system through the phases of cell seeding and prolonged cell/tissue culture. Flow-through chemo-optic micro-sensors allowed non-invasive monitoring of the levels of oxygen and pH in the perfused culture medium throughout the culture period. To validate its performance, freshly isolated ovine bone marrow stromal cells were directly seeded on porous scaffold granules (hydroxyapatite/β-tricalcium-phosphate/poly-lactic acid), bypassing the phase of monolayer cell expansion in flasks. Either 10 or 20 days after culture, engineered cell-granule grafts were implanted in an ectopic mouse model to quantify new bone formation. After four weeks of implantation, histomorphometry showed more bone in bioreactor-generated grafts than in cell-free granule controls, while bone formation did not differ significantly between 10 and 20 days of incubation. The implanted granules without cells showed no bone formation. This novel perfusion bioreactor demonstrated the capability of generating larger amounts of viable bone graft material, even after shorter incubation times. This study has demonstrated the feasibility of engineering osteogenic grafts in an automated bioreactor system, laying the foundation for a safe, regulatory-compliant, and cost-effective manufacturing process. © 2015 Wiley Periodicals, Inc.

  20. Automated analysis and reannotation of subcellular locations in confocal images from the Human Protein Atlas.

    PubMed

    Li, Jieyue; Newberg, Justin Y; Uhlén, Mathias; Lundberg, Emma; Murphy, Robert F

    2012-01-01

    The Human Protein Atlas contains immunofluorescence images showing subcellular locations for thousands of proteins. These are currently annotated by visual inspection. In this paper, we describe automated approaches to analyze the images and their use to improve annotation. We began by training classifiers to recognize the annotated patterns. By ranking proteins according to the confidence of the classifier, we generated a list of proteins that were strong candidates for reexamination. In parallel, we applied hierarchical clustering to group proteins and identified proteins whose annotations were inconsistent with the remainder of the proteins in their cluster. These proteins were reexamined by the original annotators, and a significant fraction had their annotations changed. The results demonstrate that automated approaches can provide an important complement to visual annotation.

  1. The Automated Instrumentation and Monitoring System (AIMS): Design and Architecture. 3.2

    NASA Technical Reports Server (NTRS)

    Yan, Jerry C.; Schmidt, Melisa; Schulbach, Cathy; Bailey, David (Technical Monitor)

    1997-01-01

    Whether a researcher is designing the 'next parallel programming paradigm', another 'scalable multiprocessor' or investigating resource allocation algorithms for multiprocessors, a facility that enables parallel program execution to be captured and displayed is invaluable. Careful analysis of such information can help computer and software architects to capture, and therefore exploit, behavioral variations among/within various parallel programs to take advantage of specific hardware characteristics. A software tool-set that facilitates performance evaluation of parallel applications on multiprocessors has been put together at NASA Ames Research Center under the sponsorship of NASA's High Performance Computing and Communications Program over the past five years. The Automated Instrumentation and Monitoring System (AIMS) has three major software components: a source code instrumentor, which automatically inserts active event recorders into program source code before compilation; a run-time performance monitoring library, which collects performance data; and a visualization tool-set, which reconstructs program execution based on the data collected. Besides being used as a prototype for developing new techniques for instrumenting, monitoring and presenting parallel program execution, AIMS is also being incorporated into the run-time environments of various hardware testbeds to evaluate their impact on user productivity. Currently, the execution of FORTRAN and C programs on the Intel Paragon and PALM workstations can be automatically instrumented and monitored. Performance data thus collected can be displayed graphically on various workstations. The process of performance tuning with AIMS will be illustrated using various NAS Parallel Benchmarks. This report includes a description of the internal architecture of AIMS and a listing of the source code.

  2. Mechanical Alteration And Contamination Issues In Automated Subsurface Sample Acquisition And Handling

    NASA Astrophysics Data System (ADS)

    Glass, B. J.; Cannon, H.; Bonaccorsi, R.; Zacny, K.

    2006-12-01

    The Drilling Automation for Mars Exploration (DAME) project's purpose is to develop and field-test drilling automation and robotics technologies for projected use in missions in the 2011-15 period. DAME includes control of the drilling hardware and state estimation of the hardware, the lithology being drilled, and the state of the hole. A sister drill was constructed for the Mars Analog Río Tinto Experiment (MARTE) project and demonstrated automated core handling and string changeout in 2005 drilling tests at Rio Tinto, Spain. DAME focused instead on the problem of drill control while actively drilling, without getting stuck. Together, the DAME and MARTE projects demonstrate a fully automated robotic drilling capability, including hands-off drilling, adjustment to different strata and downhole conditions, recovery from drilling faults (binding, choking, etc.), drill string changeouts, core acquisition and removal, and sample handling and conveyance to in-situ instruments. The 2006 top-level goal of DAME drilling in-situ tests was to verify and demonstrate a capability for hands-off automated drilling, at an Arctic Mars-analog site. There were three sets of 2006 test goals, all of which were exceeded during the July 2006 field season. The first was to demonstrate the recognition, while drilling, of at least three of the six known major fault modes for the DAME planetary-prototype drill, and to employ the correct recovery or safing procedure in response. The second set of 2006 goals was to operate for three or more hours autonomously, hands-off. And the third 2006 goal was to exceed 3m depth into the frozen breccia and permafrost with the DAME drill (it had not gone further than 2.2m previously). Five of six faults were detected and corrected, there were 43 hours of hands-off drilling (including a 4 hour sequence with no human presence nearby), and 3.2m was the total depth. And ground truth drilling used small commercial drilling equipment in parallel in

  3. An Evaluation of Different Statistical Targets for Assembling Parallel Forms in Item Response Theory

    PubMed Central

    Ali, Usama S.; van Rijn, Peter W.

    2015-01-01

    Assembly of parallel forms is an important step in the test development process. Therefore, choosing a suitable theoretical framework to generate well-defined test specifications is critical. The performance of different statistical targets of test specifications using the test characteristic curve (TCC) and the test information function (TIF) was investigated. Test length, the number of test forms, and content specifications are considered as well. The TCC target results in forms that are parallel in difficulty, but not necessarily in terms of precision. Conversely, test forms created using a TIF target are parallel in terms of precision, but not necessarily in terms of difficulty. As the focus is sometimes on either the TIF or the TCC alone, differences in either difficulty or precision can arise. Differences in difficulty can be mitigated by equating, but differences in precision cannot. In a series of simulations using a real item bank, the two-parameter logistic model, and mixed integer linear programming for automated test assembly, these differences were found to be quite substantial. When both the TIF and the TCC are combined into one target with their relative importance manipulated, these differences can be made to disappear.
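
    Both targets are easy to compute under the two-parameter logistic (2PL) model used in the study: the TCC is the sum of item response probabilities, and the TIF is the sum of item informations a_i^2 P_i(θ) Q_i(θ). The item parameters below are invented for illustration.

```python
# TCC and TIF under the 2PL model; item parameters are invented.
import numpy as np

a = np.array([1.2, 0.8, 1.5])     # discriminations
b = np.array([-0.5, 0.0, 1.0])    # difficulties
theta = np.linspace(-3, 3, 7)     # ability grid

P = 1 / (1 + np.exp(-a * (theta[:, None] - b)))  # response probabilities
tcc = P.sum(axis=1)                              # expected number-correct score
tif = (a**2 * P * (1 - P)).sum(axis=1)           # sum of item informations

for t, c, i in zip(theta, tcc, tif):
    print(f"theta={t:+.1f}  TCC={c:.2f}  TIF={i:.2f}")
```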

  4. Algorithms for the Construction of Parallel Tests by Zero-One Programming. Project Psychometric Aspects of Item Banking No. 7. Research Report 86-7.

    ERIC Educational Resources Information Center

    Boekkooi-Timminga, Ellen

    Nine methods for automated test construction are described. All are based on the concepts of information from item response theory. Two general kinds of methods for the construction of parallel tests are presented: (1) sequential test design; and (2) simultaneous test design. Sequential design implies that the tests are constructed one after the…

  5. Phaser.MRage: automated molecular replacement

    PubMed Central

    Bunkóczi, Gábor; Echols, Nathaniel; McCoy, Airlie J.; Oeffner, Robert D.; Adams, Paul D.; Read, Randy J.

    2013-01-01

    Phaser.MRage is a molecular-replacement automation framework that implements a full model-generation workflow and provides several layers of model exploration to the user. It is designed to handle a large number of models and can distribute calculations efficiently onto parallel hardware. In addition, phaser.MRage can identify correct solutions and use this information to accelerate the search. Firstly, it can quickly score all alternative models of a component once a correct solution has been found. Secondly, it can perform extensive analysis of identified solutions to find protein assemblies and can employ assembled models for subsequent searches. Thirdly, it is able to use a priori assembly information (derived from, for example, homologues) to speculatively place and score molecules, thereby customizing the search procedure to a certain class of protein molecule (for example, antibodies) and incorporating additional biological information into molecular replacement. PMID:24189240

  6. Cloud identification using genetic algorithms and massively parallel computation

    NASA Technical Reports Server (NTRS)

    Buckles, Bill P.; Petry, Frederick E.

    1996-01-01

    As a Guest Computational Investigator under the NASA-administered component of the High Performance Computing and Communication Program, we implemented a massively parallel genetic algorithm on the MasPar SIMD computer. Experiments were conducted using Earth Science data in the domains of meteorology and oceanography. Results obtained in these domains are competitive with, and in most cases better than, similar problems solved using other methods. In the meteorological domain, we chose to identify clouds using AVHRR spectral data. Four cloud speciations were used, although most researchers settle for three. Results were remarkably consistent across all tests (91% accuracy). Refinements of this method may lead to more timely and complete information for Global Circulation Models (GCMs) that are prevalent in weather forecasting and global environment studies. In the oceanographic domain, we chose to identify ocean currents from a spectrometer having similar characteristics to AVHRR. Here the results were mixed (60% to 80% accuracy). Given that one is willing to run the experiment several times (say 10), it is acceptable to claim the higher accuracy rating. This problem has never been successfully automated. Therefore, these results are encouraging even though less impressive than the cloud experiment. Successful conclusion of an automated ocean current detection system would impact coastal fishing, naval tactics, and the study of micro-climates. Finally, we contributed to the basic knowledge of GA (genetic algorithm) behavior in parallel environments. We developed better knowledge of the use of subpopulations in the context of shared breeding pools and the migration of individuals. Rigorous experiments were conducted based on quantifiable performance criteria. While much of the work confirmed current wisdom, for the first time we were able to submit conclusive evidence. The software developed under this grant was placed in the public domain. An extensive user

  7. IOPA: I/O-aware parallelism adaption for parallel programs

    PubMed Central

    Liu, Tao; Liu, Yi; Qian, Chen; Qian, Depei

    2017-01-01

    With the development of multi-/many-core processors, applications need to be written as parallel programs to improve execution efficiency. For data-intensive applications that use multiple threads to read/write files simultaneously, an I/O sub-system can easily become a bottleneck when too many of these types of threads exist; on the contrary, too few threads will cause insufficient resource utilization and hurt performance. Therefore, programmers must pay much attention to parallelism control to find the appropriate number of I/O threads for an application. This paper proposes a parallelism control mechanism named IOPA that can adjust the parallelism of applications to adapt to the I/O capability of a system and balance computing resources and I/O bandwidth. The programming interface of IOPA is also provided to programmers to simplify parallel programming. IOPA is evaluated using multiple applications with both solid state and hard disk drives. The results show that the parallel applications using IOPA can achieve higher efficiency than those with a fixed number of threads. PMID:28278236

  8. IOPA: I/O-aware parallelism adaption for parallel programs.

    PubMed

    Liu, Tao; Liu, Yi; Qian, Chen; Qian, Depei

    2017-01-01

    With the development of multi-/many-core processors, applications need to be written as parallel programs to improve execution efficiency. For data-intensive applications that use multiple threads to read/write files simultaneously, an I/O sub-system can easily become a bottleneck when too many of these types of threads exist; on the contrary, too few threads will cause insufficient resource utilization and hurt performance. Therefore, programmers must pay much attention to parallelism control to find the appropriate number of I/O threads for an application. This paper proposes a parallelism control mechanism named IOPA that can adjust the parallelism of applications to adapt to the I/O capability of a system and balance computing resources and I/O bandwidth. The programming interface of IOPA is also provided to programmers to simplify parallel programming. IOPA is evaluated using multiple applications with both solid state and hard disk drives. The results show that the parallel applications using IOPA can achieve higher efficiency than those with a fixed number of threads.
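
    The adaptation idea can be sketched as a simple hill-climb on measured throughput: keep growing the I/O thread pool while throughput improves, and stop when extra threads no longer pay. The sleep-based task is a stand-in for a blocking file read, and the stopping rule is an assumption; IOPA's actual mechanism and programming interface differ.

```python
# Hill-climbing the I/O thread count on measured throughput.
import time
from concurrent.futures import ThreadPoolExecutor

def io_task(_):
    time.sleep(0.01)               # stand-in for one blocking file read

def throughput(n_threads, n_tasks=64):
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=n_threads) as pool:
        list(pool.map(io_task, range(n_tasks)))
    return n_tasks / (time.perf_counter() - start)

def adapt(max_threads=32):
    best_n, best_tp = 1, throughput(1)
    n = 2
    while n <= max_threads:
        tp = throughput(n)
        if tp <= best_tp * 1.05:   # <5% gain: extra threads no longer pay
            break
        best_n, best_tp = n, tp
        n *= 2
    return best_n

print("chosen I/O parallelism:", adapt())
```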

  9. High-Performance Psychometrics: The Parallel-E Parallel-M Algorithm for Generalized Latent Variable Models. Research Report. ETS RR-16-34

    ERIC Educational Resources Information Center

    von Davier, Matthias

    2016-01-01

    This report presents results on a parallel implementation of the expectation-maximization (EM) algorithm for multidimensional latent variable models. The developments presented here are based on code that parallelizes both the E step and the M step of the parallel-E parallel-M algorithm. Examples presented in this report include item response…

  10. Synthesizing parallel imaging applications using the CAP (computer-aided parallelization) tool

    NASA Astrophysics Data System (ADS)

    Gennart, Benoit A.; Mazzariol, Marc; Messerli, Vincent; Hersch, Roger D.

    1997-12-01

    Imaging applications such as filtering, image transforms and compression/decompression require vast amounts of computing power when applied to large data sets. These applications would potentially benefit from the use of parallel processing. However, dedicated parallel computers are expensive and their processing power per node lags behind that of the most recent commodity components. Furthermore, developing parallel applications remains a difficult task: writing and debugging the application is difficult (deadlocks), programs may not be portable from one parallel architecture to another, and performance often comes short of expectations. In order to facilitate the development of parallel applications, we propose the CAP computer-aided parallelization tool, which enables application programmers to specify at a high level of abstraction the flow of data between pipelined-parallel operations. In addition, the CAP tool supports the programmer in developing parallel imaging and storage operations. CAP makes it possible to efficiently combine parallel storage-access routines with sequential image-processing operations. This paper shows how processing- and I/O-intensive imaging applications must be implemented to take advantage of parallelism and pipelining between data access and processing. This paper's contribution is (1) to show how such implementations can be compactly specified in CAP, and (2) to demonstrate that CAP-specified applications achieve the performance of custom parallel code. The paper analyzes theoretically the performance of CAP-specified applications and demonstrates the accuracy of the theoretical analysis through experimental measurements.

  11. An automated microphysiological assay for toxicity evaluation.

    PubMed

    Eggert, S; Alexander, F A; Wiest, J

    2015-08-01

    Screening a newly developed drug, food additive or cosmetic ingredient for toxicity is a critical preliminary step before it can move forward in the development pipeline. Due to the sometimes dire consequences when a harmful agent is overlooked, toxicologists work under strict guidelines to effectively catalogue and classify new chemical agents. Conventional assays involve long experimental hours and many manual steps that increase the probability of user error; errors that can potentially manifest as inaccurate toxicology results. Automated assays can overcome many potential mistakes that arise due to human error. In the presented work, we created and validated a novel, automated platform for a microphysiological assay that can examine cellular attributes with sensors measuring changes in cellular metabolic rate, oxygen consumption, and vitality mediated by exposure to a potentially toxic agent. The system was validated with low buffer culture medium with varied conductivities that caused changes in the measured impedance on integrated impedance electrodes.

  12. Enhanced versus automated urinalysis for screening of urinary tract infections in children in the emergency department.

    PubMed

    Shah, Ami P; Cobb, Benjamin T; Lower, Darla R; Shaikh, Nader; Rasmussen, Jayne; Hoberman, Alejandro; Wald, Ellen R; Rosendorff, Adam; Hickey, Robert W

    2014-03-01

    Urinary tract infections (UTI) are the most common serious bacterial infection in febrile infants. Urinalysis (UA) is a screening test for preliminary diagnosis of UTI. UA can be performed manually or using automated techniques. We sought to compare manual versus automated UA for urine specimens obtained via catheterization in the pediatric emergency department. In this prospective study, we processed catheterized urine samples from infants with suspected UTI by both the manual method (enhanced UA) and the automated method. We defined a positive enhanced UA as ≥ 10 white blood cells per cubic millimeter and presence of any bacteria per 10 oil immersion fields on a Gram-stained smear. We defined a positive automated UA as ≥ 2 white blood cells per high-powered field and presence of any bacteria using the IRIS iQ200 ELITE. We defined a positive urine culture as growth of ≥ 50,000 colony-forming units per milliliter of a single uropathogen. We analyzed data using SPSS software. A total of 703 specimens were analyzed. Prevalence of UTI was 7%. For pyuria, the sensitivity and positive predictive value (PPV) of the enhanced UA in predicting positive urine culture were 83.6% and 52.5%, respectively; corresponding values for the automated UA were 79.5% and 37.5%, respectively. For bacteriuria, the sensitivity and PPV of a Gram-stained smear (enhanced UA) were 83.6% and 59.4%, respectively; corresponding values for the automated UA were 73.4%, and 26.2%, respectively. Using criteria of both pyuria and bacteriuria for the enhanced UA resulted in a sensitivity of 77.5% and a PPV of 84.4%; corresponding values for the automated UA were 63.2% and 51.6%, respectively. Combining automated pyuria (≥ 2 white blood cells/high-powered microscopic field) with a Gram-stained smear resulted in a sensitivity of 75.5% and a PPV of 84%. Automated UA is comparable with manual UA for detection of pyuria in young children with suspected UTI. Bacteriuria detected by automated UA is
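
    The reported sensitivity and PPV figures follow directly from the 2x2 confusion counts of screen result versus culture result; a minimal helper, with purely hypothetical counts for illustration (not the study's raw data):

        def screening_metrics(tp, fp, fn):
            # Sensitivity and positive predictive value from 2x2 confusion counts.
            sensitivity = tp / (tp + fn)  # fraction of culture-positive cases the screen flags
            ppv = tp / (tp + fp)          # fraction of positive screens confirmed by culture
            return sensitivity, ppv

        # Hypothetical counts for illustration only.
        sens, ppv = screening_metrics(tp=41, fp=37, fn=8)
        print(f"sensitivity={sens:.1%}, PPV={ppv:.1%}")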

  13. Research project evaluates the effect of national culture on flight crew behaviour.

    PubMed

    Helmreich, R L; Merritt, A C; Sherman, P J

    1996-10-01

    The role of national culture in flight crew interactions and behavior is examined. Researchers surveyed Asian, European, and American flight crews to determine attitudes about crew coordination and cockpit management. Universal attitudes among pilots are identified. Culturally variable attitudes among pilots from 16 countries are compared. The role of culture in response to increasing cockpit automation is reviewed. Culture-based challenges to crew resource management programs and multicultural organizations are discussed.

  14. Neonatal blood cultures: effect of delayed entry into the blood culture machine and bacterial concentration on the time to positive growth in a simulated model.

    PubMed

    Jardine, Luke Anthony; Sturgess, Barbara Ruth; Inglis, Garry Donald Trevor; Davies, Mark William

    2009-04-01

    To determine whether the time from blood culture inoculation to positive growth (total time to positive) and the time from blood culture machine entry to positive growth (machine time to positive) are altered by delayed entry into the automated blood culture machine, and whether the total time to positive differs by the concentration of organisms inoculated into blood culture bottles. Staphylococcus epidermidis, Escherichia coli and group B beta-haemolytic streptococci were chosen as clinically significant representative organisms. Two concentrations (≥10 colony-forming units per millilitre and <1 colony-forming unit per millilitre) were inoculated into PEDS BacT/Alert blood culture bottles and randomly allocated to one of three delayed automated blood culture machine entry times (30 min/8.5 h/15.5 h). For all organisms at all concentrations, except Staphylococcus epidermidis, the machine time to positive was significantly decreased by delayed entry. For all organisms at all concentrations, the mean total time to positive significantly increased with increasing delayed entry into the blood culture machine. Higher concentrations of group B beta-haemolytic streptococci and Escherichia coli grew significantly faster than lower concentrations. Bacterial growth in inoculated bottles stored at room temperature continues, although at a slower rate than in blood culture bottles entered into the machine immediately. If a blood culture specimen has been stored at room temperature for more than 15.5 h, the currently allowed safety margin of 36 h (before declaring a result negative) may be insufficient.

  15. Comparison of methods for the identification of microorganisms isolated from blood cultures.

    PubMed

    Monteiro, Aydir Cecília Marinho; Fortaleza, Carlos Magno Castelo Branco; Ferreira, Adriano Martison; Cavalcante, Ricardo de Souza; Mondelli, Alessandro Lia; Bagagli, Eduardo; da Cunha, Maria de Lourdes Ribeiro de Souza

    2016-08-05

    Bloodstream infections are responsible for thousands of deaths each year. The rapid identification of the microorganisms causing these infections permits correct therapeutic management that will improve the prognosis of the patient. In an attempt to reduce the time spent on this step, microorganism identification devices have been developed, including the VITEK(®) 2 system, which is currently used in routine clinical microbiology laboratories. This study evaluated the accuracy of the VITEK(®) 2 system in the identification of 400 microorganisms isolated from blood cultures and compared the results to those obtained with conventional phenotypic and genotypic methods. In parallel to the phenotypic identification methods, the DNA of these microorganisms was extracted directly from the blood culture bottles for genotypic identification by the polymerase chain reaction (PCR) and DNA sequencing. The automated VITEK(®) 2 system correctly identified 94.7 % (379/400) of the isolates. The YST and GN cards resulted in 100 % correct identifications of yeasts (15/15) and Gram-negative bacilli (165/165), respectively. The GP card correctly identified 92.6 % (199/215) of Gram-positive cocci, while the ANC card was unable to correctly identify any Gram-positive bacilli (0/5). The performance of the VITEK(®) 2 system was considered acceptable and statistical analysis showed that the system is a suitable option for routine clinical microbiology laboratories to identify different microorganisms.

  16. The language parallel Pascal and other aspects of the massively parallel processor

    NASA Technical Reports Server (NTRS)

    Reeves, A. P.; Bruner, J. D.

    1982-01-01

    A high level language for the Massively Parallel Processor (MPP) was designed. This language, called Parallel Pascal, is described in detail. A description of the language design, a description of the intermediate language, Parallel P-Code, and details for the MPP implementation are included. Formal descriptions of Parallel Pascal and Parallel P-Code are given. A compiler was developed which converts programs in Parallel Pascal into the intermediate Parallel P-Code language. The code generator to complete the compiler for the MPP is being developed independently. A Parallel Pascal to Pascal translator was also developed. The architecture design for a VLSI version of the MPP was completed with a description of fault tolerant interconnection networks. The memory arrangement aspects of the MPP are discussed and a survey of other high level languages is given.

  17. Evaluation of the 3D BacT/ALERT automated culture system for the detection of microbial contamination of platelet concentrates.

    PubMed

    McDonald, C P; Rogers, A; Cox, M; Smith, R; Roy, A; Robbins, S; Hartley, S; Barbara, J A J; Rothenberg, S; Stutzman, L; Widders, G

    2002-10-01

    Bacterial transmission remains the major component of morbidity and mortality associated with transfusion-transmitted infections. Platelet concentrates are the most common cause of bacterial transmission. The BacT/ALERT 3D automated blood culture system has the potential to screen platelet concentrates for the presence of bacteria. Evaluation of this system was performed by spiking day 2 apheresis platelet units with individual bacterial isolates at final concentrations of 10 and 100 colony-forming units (cfu) mL-1. Fifteen organisms were used which had been cited in platelet transmission and monitoring studies. BacT/ALERT times to detection were compared with thioglycollate broth cultures, and the performance of five types of BacT/ALERT culture bottles was evaluated. Sampling was performed immediately after the inoculation of the units, and 10 replicates were performed per organism concentration for each of the five types of BacT/ALERT bottles. The mean times for the detection of these 15 organisms by BacT/ALERT, with the exception of Propionibacterium acnes, ranged from 9.1 to 48.1 h (all 10 replicates were positive). In comparison, the time range found using thioglycollate was 12.0-32.3 h (all 10 replicates were positive). P. acnes' BacT/ALERT mean detection times ranged from 89.0 to 177.6 h compared with 75.6-86.4 h for the thioglycollate broth. BacT/ALERT, with the exception of P. acnes, which has dubious clinical significance, gave equivalent or shorter detection times when compared with the thioglycollate broth system. The BacT/ALERT system detected a range of organisms at levels of 10 and 100 cfu mL-1. This study validates the BacT/ALERT microbial detection system for screening platelets. Currently, the system is the only practically viable option available for routinely screening platelet concentrates to prevent bacterial transmission.

  18. Method and apparatus for routing data in an inter-nodal communications lattice of a massively parallel computer system by routing through transporter nodes

    DOEpatents

    Archer, Charles Jens; Musselman, Roy Glenn; Peters, Amanda; Pinnow, Kurt Walter; Swartz, Brent Allen; Wallenfelt, Brian Paul

    2010-11-16

    A massively parallel computer system contains an inter-nodal communications network of node-to-node links. An automated routing strategy routes packets through one or more intermediate nodes of the network to reach a destination. Some packets are constrained to be routed through respective designated transporter nodes, the automated routing strategy determining a path from a respective source node to a respective transporter node, and from a respective transporter node to a respective destination node. Preferably, the source node chooses a routing policy from among multiple possible choices, and that policy is followed by all intermediate nodes. The use of transporter nodes allows greater flexibility in routing.

  19. Effects of a psychophysiological system for adaptive automation on performance, workload, and the event-related potential P300 component

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J 3rd; Freeman, Frederick G.; Scerbo, Mark W.; Mikulka, Peter J.; Pope, Alan T.

    2003-01-01

    The present study examined the effects of an electroencephalographic- (EEG-) based system for adaptive automation on tracking performance and workload. In addition, event-related potentials (ERPs) to a secondary task were derived to determine whether they would provide an additional degree of workload specificity. Participants were run in an adaptive automation condition, in which the system switched between manual and automatic task modes based on the value of each individual's own EEG engagement index; a yoked control condition; or another control group, in which task mode switches followed a random pattern. Adaptive automation improved performance and resulted in lower levels of workload. Further, the P300 component of the ERP paralleled the sensitivity to task demands of the performance and subjective measures across conditions. These results indicate that it is possible to improve performance with a psychophysiological adaptive automation system and that ERPs may provide an alternative means for distinguishing among levels of cognitive task demand in such systems. Actual or potential applications of this research include improved methods for assessing operator workload and performance.

  20. Parallel integer sorting with medium and fine-scale parallelism

    NASA Technical Reports Server (NTRS)

    Dagum, Leonardo

    1993-01-01

    Two new parallel integer sorting algorithms, queue-sort and barrel-sort, are presented and analyzed in detail. These algorithms do not have optimal parallel complexity, yet they show very good performance in practice. Queue-sort is designed for fine-scale parallel architectures which allow the queueing of multiple messages to the same destination. Barrel-sort is designed for medium-scale parallel architectures with a high message-passing overhead. Performance results from the implementation of queue-sort on a Connection Machine CM-2 and of barrel-sort on a 128-processor iPSC/860 are given. The two implementations are found to be comparable in performance but not as good as a fully vectorized bucket sort on the Cray YMP.
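
    The abstract does not spell out the algorithms, so the sketch below shows only the generic distribution-sort idea behind a barrel-sort (partition keys into per-processor key ranges, then sort each range locally), in serial Python, with the message-passing phase reduced to list appends; the function and its parameters are illustrative assumptions.

        def barrel_sort(keys, n_procs, key_min, key_max):
            # Partition keys into contiguous ranges ("barrels"), one per processor.
            width = (key_max - key_min + 1) / n_procs
            barrels = [[] for _ in range(n_procs)]
            for k in keys:  # on a real machine this is the message-passing phase
                p = min(int((k - key_min) / width), n_procs - 1)
                barrels[p].append(k)
            out = []
            for b in barrels:  # each processor then sorts its own barrel locally
                out.extend(sorted(b))
            return out

        assert barrel_sort([5, 1, 9, 3, 7], 2, 0, 9) == [1, 3, 5, 7, 9]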

  1. Modularized Parallel Neutron Instrument Simulation on the TeraGrid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Meili; Cobb, John W; Hagen, Mark E

    2007-01-01

    In order to build a bridge between the TeraGrid (TG), a national-scale cyberinfrastructure resource, and neutron science, the Neutron Science TeraGrid Gateway (NSTG) is focused on introducing productive HPC usage to the neutron science community, primarily the Spallation Neutron Source (SNS) at Oak Ridge National Laboratory (ORNL). Monte Carlo simulations are used as a powerful tool for instrument design and optimization at SNS. One of the successful efforts of a collaboration team composed of NSTG HPC experts and SNS instrument scientists is the development of a software facility named PSoNI, Parallelizing Simulations of Neutron Instruments. Parallelizing the traditional serial instrument simulation on TeraGrid resources, PSoNI quickly computes full instrument simulation at sufficient statistical levels in instrument design. Upon successful commissioning of SNS at the end of 2007, three out of five commissioned instruments in the SNS target station will be available for initial users. Advanced instrument study, proposal feasibility evaluation, and experiment planning are on the immediate schedule of SNS, which poses further requirements such as flexibility and high runtime efficiency on fast instrument simulation. PSoNI has been redesigned to meet the new challenges and a preliminary version is developed on TeraGrid. This paper explores the motivation and goals of the new design, and the improved software structure. Further, it describes the realized new features seen from MPI-parallelized McStas running high-resolution design simulations of the SEQUOIA and BSS instruments at SNS. A discussion regarding future work, targeted at fast simulation for automated experiment adjustment and at comparing models to data in analysis, is also presented.

  2. Bilingual parallel programming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foster, I.; Overbeek, R.

    1990-01-01

    Numerous experiments have demonstrated that computationally intensive algorithms support adequate parallelism to exploit the potential of large parallel machines. Yet successful parallel implementations of serious applications are rare. The limiting factor is clearly programming technology. None of the approaches to parallel programming that have been proposed to date -- whether parallelizing compilers, language extensions, or new concurrent languages -- seem to adequately address the central problems of portability, expressiveness, efficiency, and compatibility with existing software. In this paper, we advocate an alternative approach to parallel programming based on what we call bilingual programming. We present evidence that this approach provides an effective solution to parallel programming problems. The key idea in bilingual programming is to construct the upper levels of applications in a high-level language while coding selected low-level components in low-level languages. This approach permits the advantages of a high-level notation (expressiveness, elegance, conciseness) to be obtained without the cost in performance normally associated with high-level approaches. In addition, it provides a natural framework for reusing existing code.

  3. Advanced automation of a prototypic thermal control system for Space Station

    NASA Technical Reports Server (NTRS)

    Dominick, Jeff

    1990-01-01

    Viewgraphs on the advanced automation of a prototypic thermal control system for Space Station are presented. The Thermal Expert System (TEXSYS) was initiated in 1986 as a cooperative project between ARC and JSC as a way to leverage on-going work at both centers. JSC contributed Thermal Control System (TCS) hardware and control software, TCS operational expertise, and integration expertise. ARC contributed expert system and display expertise. The first years of the project were dedicated to parallel development of expert system tools, displays, interface software, and TCS technology and procedures by a total of four organizations.

  4. 78 FR 66039 - Modification of National Customs Automation Program Test Concerning Automated Commercial...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-04

    ... Customs Automation Program Test Concerning Automated Commercial Environment (ACE) Cargo Release (Formerly... Simplified Entry functionality in the Automated Commercial Environment (ACE). Originally, the test was known...) test concerning Automated Commercial Environment (ACE) Simplified Entry (SE test) functionality is...

  5. Scalable parallel communications

    NASA Technical Reports Server (NTRS)

    Maly, K.; Khanna, S.; Overstreet, C. M.; Mukkamala, R.; Zubair, M.; Sekhar, Y. S.; Foudriat, E. C.

    1992-01-01

    Coarse-grain parallelism in networking (that is, the use of multiple protocol processors running replicated software sending over several physical channels) can be used to provide gigabit communications for a single application. Since parallel network performance is highly dependent on real issues such as hardware properties (e.g., memory speeds and cache hit rates), operating system overhead (e.g., interrupt handling), and protocol performance (e.g., effect of timeouts), we have performed detailed simulations studies of both a bus-based multiprocessor workstation node (based on the Sun Galaxy MP multiprocessor) and a distributed-memory parallel computer node (based on the Touchstone DELTA) to evaluate the behavior of coarse-grain parallelism. Our results indicate: (1) coarse-grain parallelism can deliver multiple 100 Mbps with currently available hardware platforms and existing networking protocols (such as Transmission Control Protocol/Internet Protocol (TCP/IP) and parallel Fiber Distributed Data Interface (FDDI) rings); (2) scale-up is near linear in n, the number of protocol processors, and channels (for small n and up to a few hundred Mbps); and (3) since these results are based on existing hardware without specialized devices (except perhaps for some simple modifications of the FDDI boards), this is a low cost solution to providing multiple 100 Mbps on current machines. In addition, from both the performance analysis and the properties of these architectures, we conclude: (1) multiple processors providing identical services and the use of space division multiplexing for the physical channels can provide better reliability than monolithic approaches (it also provides graceful degradation and low-cost load balancing); (2) coarse-grain parallelism supports running several transport protocols in parallel to provide different types of service (for example, one TCP handles small messages for many users, other TCP's running in parallel provide high bandwidth

  6. Role of home automation in distribution automation and automated meter reading. Topical report, December 1994

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, K.W.

    1994-12-01

    This is one of a series of topical reports dealing with the strategic, technical, and market development of home automation. Particular emphasis is placed upon identifying those aspects of home automation that will impact the gas industry and gas products. Communication standards, market drivers, key organizations, technical implementation, product opportunities, and market growth projections will all be addressed in this or subsequent reports. These reports will also discuss how the gas industry and gas-fired equipment can use home automation technology to benefit the consumer.

  7. Parallel simulation today

    NASA Technical Reports Server (NTRS)

    Nicol, David; Fujimoto, Richard

    1992-01-01

    This paper surveys topics that presently define the state of the art in parallel simulation. Included in the tutorial are discussions on new protocols, mathematical performance analysis, time parallelism, hardware support for parallel simulation, load balancing algorithms, and dynamic memory management for optimistic synchronization.

  8. Automatic Generation of Directive-Based Parallel Programs for Shared Memory Parallel Systems

    NASA Technical Reports Server (NTRS)

    Jin, Hao-Qiang; Yan, Jerry; Frumkin, Michael

    2000-01-01

    The shared-memory programming model is a very effective way to achieve parallelism on shared-memory parallel computers. As great progress has been made in hardware and software technologies, the performance of parallel programs written with compiler directives has improved substantially. The introduction of OpenMP directives, the industry standard for shared-memory programming, has minimized the issue of portability. Due to its ease of programming and its good performance, the technique has become very popular. In this study, we have extended CAPTools, a computer-aided parallelization toolkit, to automatically generate directive-based OpenMP parallel programs. We outline techniques used in the implementation of the tool and present test results on the NAS parallel benchmarks and ARC3D, a CFD application. This work demonstrates the great potential of using computer-aided tools to quickly port parallel programs and also achieve good performance.
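
    As a toy illustration of directive insertion (far simpler than the inter-procedural dependence analysis CAPO actually performs), the hypothetical helper below wraps a Fortran loop in OpenMP directives only when a crude textual test finds no obvious loop-carried dependence; everything here is an assumption for illustration, not CAPO's logic.

        import re

        def maybe_parallelize(fortran_loop):
            # Emit an OpenMP directive only if the body never indexes a(i-1), a(i+2), etc.
            # (a crude stand-in for a real loop-carried dependence test).
            if re.search(r"a\s*\(\s*i\s*[-+]\s*\d+\s*\)", fortran_loop):
                return fortran_loop  # possible carried dependence: leave the loop serial
            return "!$omp parallel do\n" + fortran_loop + "\n!$omp end parallel do"

        loop = "do i = 1, n\n  a(i) = b(i) + c(i)\nend do"
        print(maybe_parallelize(loop))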

  9. Endpoint-based parallel data processing in a parallel active messaging interface of a parallel computer

    DOEpatents

    Archer, Charles J; Blocksome, Michael E; Ratterman, Joseph D; Smith, Brian E

    2014-02-11

    Endpoint-based parallel data processing in a parallel active messaging interface ('PAMI') of a parallel computer, the PAMI composed of data communications endpoints, each endpoint including a specification of data communications parameters for a thread of execution on a compute node, including specifications of a client, a context, and a task, the compute nodes coupled for data communications through the PAMI, including establishing a data communications geometry, the geometry specifying, for tasks representing processes of execution of the parallel application, a set of endpoints that are used in collective operations of the PAMI including a plurality of endpoints for one of the tasks; receiving in endpoints of the geometry an instruction for a collective operation; and executing the instruction for a collective operation through the endpoints in dependence upon the geometry, including dividing data communications operations among the plurality of endpoints for one of the tasks.

  10. Endpoint-based parallel data processing in a parallel active messaging interface of a parallel computer

    DOEpatents

    Archer, Charles J.; Blocksome, Michael A.; Ratterman, Joseph D.; Smith, Brian E.

    2014-08-12

    Endpoint-based parallel data processing in a parallel active messaging interface (`PAMI`) of a parallel computer, the PAMI composed of data communications endpoints, each endpoint including a specification of data communications parameters for a thread of execution on a compute node, including specifications of a client, a context, and a task, the compute nodes coupled for data communications through the PAMI, including establishing a data communications geometry, the geometry specifying, for tasks representing processes of execution of the parallel application, a set of endpoints that are used in collective operations of the PAMI including a plurality of endpoints for one of the tasks; receiving in endpoints of the geometry an instruction for a collective operation; and executing the instruction for a collective operation through the endpoints in dependence upon the geometry, including dividing data communications operations among the plurality of endpoints for one of the tasks.

  11. Performance Equivalence and Validation of the Soleris Automated System for Quantitative Microbial Content Testing Using Pure Suspension Cultures.

    PubMed

    Limberg, Brian J; Johnstone, Kevin; Filloon, Thomas; Catrenich, Carl

    2016-09-01

    Using United States Pharmacopeia-National Formulary (USP-NF) general method <1223> guidance, the Soleris(®) automated system and reagents (Nonfermenting Total Viable Count for bacteria and Direct Yeast and Mold for yeast and mold) were validated, using a performance equivalence approach, as an alternative to plate counting for total microbial content analysis using five representative microbes: Staphylococcus aureus, Bacillus subtilis, Pseudomonas aeruginosa, Candida albicans, and Aspergillus brasiliensis. Detection times (DTs) in the alternative automated system were linearly correlated to CFU/sample (R^2 = 0.94-0.97) with ≥70% accuracy per USP General Chapter <1223> guidance. The LOD and LOQ of the automated system were statistically similar to the traditional plate count method. This system was significantly more precise than plate counting (RSD 1.2-2.9% for DT, 7.8-40.6% for plate counts), was statistically comparable to plate counting with respect to variations in analyst, vial lots, and instruments, and was robust when variations in the operating detection thresholds (dTs; ±2 units) were used. The automated system produced accurate results, was more precise and less labor-intensive, and met or exceeded criteria for a valid alternative quantitative method, consistent with USP-NF general method <1223> guidance.
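
    As an illustration of the reported linear relationship between detection time and log counts, here is a fit of hypothetical calibration data with numpy; the numbers are invented for the sketch, not the study's data.

        import numpy as np

        # Hypothetical calibration points: log10(CFU/sample) vs. detection time (hours).
        log_cfu = np.array([2.0, 3.0, 4.0, 5.0, 6.0])
        dt_hours = np.array([18.1, 15.2, 12.4, 9.3, 6.5])

        slope, intercept = np.polyfit(log_cfu, dt_hours, 1)  # DT falls as inoculum grows
        r2 = np.corrcoef(log_cfu, dt_hours)[0, 1] ** 2
        est_log_cfu = (12.0 - intercept) / slope  # invert the fit for an unknown sample
        print(f"R^2 = {r2:.3f}; estimated log10(CFU) at DT = 12 h: {est_log_cfu:.2f}")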

  12. Keep Your Scanners Peeled: Gaze Behavior as a Measure of Automation Trust During Highly Automated Driving.

    PubMed

    Hergeth, Sebastian; Lorenz, Lutz; Vilimek, Roman; Krems, Josef F

    2016-05-01

    The feasibility of measuring drivers' automation trust via gaze behavior during highly automated driving was assessed with eye tracking and validated with self-reported automation trust in a driving simulator study. Earlier research from other domains indicates that drivers' automation trust might be inferred from gaze behavior, such as monitoring frequency. The gaze behavior and self-reported automation trust of 35 participants attending to a visually demanding non-driving-related task (NDRT) during highly automated driving was evaluated. The relationship between dispositional, situational, and learned automation trust with gaze behavior was compared. Overall, there was a consistent relationship between drivers' automation trust and gaze behavior. Participants reporting higher automation trust tended to monitor the automation less frequently. Further analyses revealed that higher automation trust was associated with lower monitoring frequency of the automation during NDRTs, and an increase in trust over the experimental session was connected with a decrease in monitoring frequency. We suggest that (a) the current results indicate a negative relationship between drivers' self-reported automation trust and monitoring frequency, (b) gaze behavior provides a more direct measure of automation trust than other behavioral measures, and (c) with further refinement, drivers' automation trust during highly automated driving might be inferred from gaze behavior. Potential applications of this research include the estimation of drivers' automation trust and reliance during highly automated driving. © 2016, Human Factors and Ergonomics Society.

  13. Parallel Processing Systems for Passive Ranging During Helicopter Flight

    NASA Technical Reports Server (NTRS)

    Sridhar, Bavavar; Suorsa, Raymond E.; Showman, Robert D. (Technical Monitor)

    1994-01-01

    The complexity of rotorcraft missions involving operations close to the ground results in high pilot workload. In order to allow a pilot time to perform mission-oriented tasks, sensor-aiding and automation of some of the guidance and control functions are highly desirable. Images from an electro-optical sensor provide a covert way of detecting objects in the flight path of a low-flying helicopter. Passive ranging consists of processing a sequence of images using techniques based on optical flow computation and recursive estimation. The passive ranging algorithm has to extract obstacle information from imagery at rates varying from five to thirty or more frames per second, depending on the helicopter speed. We have implemented and tested the passive ranging algorithm off-line using helicopter-collected images. However, the real-time data and computation requirements of the algorithm are beyond the capability of any off-the-shelf microprocessor or digital signal processor. This paper describes the computational requirements of the algorithm and uses parallel processing technology to meet these requirements. Various issues in the selection of a parallel processing architecture are discussed, and four different computer architectures are evaluated regarding their suitability to process the algorithm in real time. Based on this evaluation, we conclude that real-time passive ranging is a realistic goal and can be achieved within a short time.

  14. Ion channel pharmacology under flow: automation via well-plate microfluidics.

    PubMed

    Spencer, C Ian; Li, Nianzhen; Chen, Qin; Johnson, Juliette; Nevill, Tanner; Kammonen, Juha; Ionescu-Zanetti, Cristian

    2012-08-01

    Automated patch clamping addresses the need for high-throughput screening of chemical entities that alter ion channel function. As a result, there is considerable utility in the pharmaceutical screening arena for novel platforms that can produce relevant data both rapidly and consistently. Here we present results that were obtained with an innovative microfluidic automated patch clamp system utilizing a well-plate that eliminates the necessity of internal robotic liquid handling. Continuous recording from cell ensembles, rapid solution switching, and a bench-top footprint enable a number of assay formats previously inaccessible to automated systems. An electro-pneumatic interface was employed to drive the laminar flow of solutions in a microfluidic network that delivered cells in suspension to ensemble recording sites. Whole-cell voltage clamp was applied to linear arrays of 20 cells in parallel utilizing a 64-channel voltage clamp amplifier. A number of unique assays requiring sequential compound applications separated by a second or less, such as rapid determination of the agonist EC50 for a ligand-gated ion channel or the kinetics of desensitization recovery, are enabled by the system. In addition, the system was validated via electrophysiological characterizations of both voltage-gated and ligand-gated ion channel targets: hKV2.1 and human Ether-à-go-go-related gene potassium channels, hNaV1.7 and 1.8 sodium channels, and (α1) hGABA(A) and (α1) human nicotinic acetylcholine receptor receptors. Our results show that the voltage dependence, kinetics, and interactions of these channels with pharmacological agents were matched to reference data. The results from these IonFlux™ experiments demonstrate that the system provides high-throughput automated electrophysiology with enhanced reliability and consistency, in a user-friendly format.
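
    One assay the system enables is rapid agonist EC50 determination; a generic Hill-equation fit with scipy on invented concentration-response data sketches that analysis step (the data and starting parameters are hypothetical, and this is not IonFlux's software):

        import numpy as np
        from scipy.optimize import curve_fit

        def hill(conc, top, ec50, n):
            # Hill equation for an agonist concentration-response curve.
            return top * conc ** n / (ec50 ** n + conc ** n)

        conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])    # uM, hypothetical
        resp = np.array([2.0, 6.0, 18.0, 42.0, 71.0, 90.0, 97.0])  # % of max, hypothetical

        (top, ec50, n), _ = curve_fit(hill, conc, resp, p0=(100.0, 0.5, 1.0))
        print(f"EC50 = {ec50:.2f} uM (Hill slope {n:.2f})")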

  15. Investigating the feasibility of scale up and automation of human induced pluripotent stem cells cultured in aggregates in feeder free conditions☆

    PubMed Central

    Soares, Filipa A.C.; Chandra, Amit; Thomas, Robert J.; Pedersen, Roger A.; Vallier, Ludovic; Williams, David J.

    2014-01-01

    The transfer of a laboratory process into a manufacturing facility is one of the most critical steps required for the large scale production of cell-based therapy products. This study describes the first published protocol for scalable automated expansion of human induced pluripotent stem cell lines growing in aggregates in feeder-free and chemically defined medium. Cells were successfully transferred between different sites representative of research and manufacturing settings, and passaged manually and using the CompacT SelecT automation platform. Modified protocols were developed for the automated system, and the management of cell aggregates (clumps) was identified as the critical step. Cellular morphology, pluripotency gene expression and differentiation into the three germ layers have been used to compare the outcomes of manual and automated processes. PMID:24440272

  16. Sub-Second Parallel State Estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Rice, Mark J.; Glaesemann, Kurt R.

    This report describes the performance of the Pacific Northwest National Laboratory (PNNL) sub-second parallel state estimation (PSE) tool using utility data from the Bonneville Power Administration (BPA) and discusses the benefits of the fast computational speed for power system applications. The test data were provided by BPA. They are two days’ worth of hourly snapshots that include power system data and measurement sets in a commercial tool format. These data are extracted from the commercial toolbox and fed into the PSE tool. With the help of advanced solvers, the PSE tool is able to solve each BPA hourly state estimation problem within one second, which is more than 10 times faster than today’s commercial tool. This improved computational performance can help increase the reliability value of state estimation in many aspects: (1) the shorter the time required for execution of state estimation, the more time remains for operators to take appropriate actions, and/or to apply automatic or manual corrective control actions. This increases the chances of arresting or mitigating the impact of cascading failures; (2) the SE can be executed multiple times within the time allowance. Therefore, the robustness of SE can be enhanced by repeating the execution of the SE with adaptive adjustments, including removing bad data and/or adjusting different initial conditions to compute a better estimate within the same time as a traditional state estimator’s single estimate. There are other benefits with the sub-second SE, such as that the PSE results can potentially be used in local and/or wide-area automatic corrective control actions that are currently dependent on raw measurements to minimize the impact of bad measurements, and provides opportunities to enhance the power grid reliability and efficiency. PSE also can enable other advanced tools that rely on SE outputs and could be used to further improve operators’ actions and automated controls to mitigate

  17. Automated Algorithms for Quantum-Level Accuracy in Atomistic Simulations: LDRD Final Report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thompson, Aidan Patrick; Schultz, Peter Andrew; Crozier, Paul

    2014-09-01

    This report summarizes the result of LDRD project 12-0395, titled "Automated Algorithms for Quantum-level Accuracy in Atomistic Simulations." During the course of this LDRD, we have developed an interatomic potential for solids and liquids called the Spectral Neighbor Analysis Potential (SNAP). The SNAP potential has a very general form and uses machine-learning techniques to reproduce the energies, forces, and stress tensors of a large set of small configurations of atoms, which are obtained using high-accuracy quantum electronic structure (QM) calculations. The local environment of each atom is characterized by a set of bispectrum components of the local neighbor density projected onto a basis of hyperspherical harmonics in four dimensions. The SNAP coefficients are determined using weighted least-squares linear regression against the full QM training set. This allows the SNAP potential to be fit in a robust, automated manner to large QM data sets using many bispectrum components. The calculation of the bispectrum components and the SNAP potential are implemented in the LAMMPS parallel molecular dynamics code. Global optimization methods in the DAKOTA software package are used to seek out good choices of hyperparameters that define the overall structure of the SNAP potential. FitSnap.py, a Python-based software package interfacing to both LAMMPS and DAKOTA, is used to formulate the linear regression problem, solve it, and analyze the accuracy of the resultant SNAP potential. We describe a SNAP potential for tantalum that accurately reproduces a variety of solid and liquid properties. Most significantly, in contrast to existing tantalum potentials, SNAP correctly predicts the Peierls barrier for screw dislocation motion. We also present results from SNAP potentials generated for indium phosphide (InP) and silica (SiO2). We describe efficient algorithms for calculating SNAP forces and energies in molecular dynamics simulations using massively parallel
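
    The SNAP fit itself is a weighted linear least-squares problem over bispectrum descriptors; a generic numpy sketch with a synthetic descriptor matrix follows (this is not FitSnap.py's actual interface, and all names and sizes are illustrative assumptions):

        import numpy as np

        def fit_snap_coefficients(descriptors, qm_targets, weights):
            # Weighted linear least squares: scale rows by sqrt(weight), then solve.
            sw = np.sqrt(weights)[:, None]
            coeffs, *_ = np.linalg.lstsq(descriptors * sw,
                                         qm_targets * sw.ravel(), rcond=None)
            return coeffs

        rng = np.random.default_rng(1)
        A = rng.normal(size=(200, 30))       # toy bispectrum components per configuration
        truth = rng.normal(size=30)
        y = A @ truth + 0.01 * rng.normal(size=200)  # stand-in for QM energies
        w = np.ones(200)                     # per-configuration regression weights
        print(np.allclose(fit_snap_coefficients(A, y, w), truth, atol=0.01))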

  18. A New Automated Method and Sample Data Flow for Analysis of Volatile Nitrosamines in Human Urine*

    PubMed Central

    Hodgson, James A.; Seyler, Tiffany H.; McGahee, Ernest; Arnstein, Stephen; Wang, Lanqing

    2016-01-01

    Volatile nitrosamines (VNAs) are a group of compounds classified as probable (group 2A) and possible (group 2B) carcinogens in humans. Along with certain foods and contaminated drinking water, VNAs are detected at high levels in tobacco products and in both mainstream and sidestream smoke. Our laboratory monitors six urinary VNAs—N-nitrosodimethylamine (NDMA), N-nitrosomethylethylamine (NMEA), N-nitrosodiethylamine (NDEA), N-nitrosopiperidine (NPIP), N-nitrosopyrrolidine (NPYR), and N-nitrosomorpholine (NMOR)—using isotope dilution GC-MS/MS (QQQ) for large population studies such as the National Health and Nutrition Examination Survey (NHANES). In this paper, we report for the first time a new automated sample preparation method to more efficiently quantitate these VNAs. Automation is done using Hamilton STAR™ and Caliper Staccato™ workstations. This new automated method reduces sample preparation time from 4 hours to 2.5 hours while maintaining precision (inter-run CV < 10%) and accuracy (85% - 111%). More importantly, this method increases sample throughput while maintaining a low limit of detection (<10 pg/mL) for all analytes. A streamlined sample data flow was created in parallel to the automated method, in which samples can be tracked from receiving to final LIMS output with minimal human intervention, further minimizing human error in the sample preparation process. This new automated method and the sample data flow are currently applied in bio-monitoring of VNAs in the US non-institutionalized population NHANES 2013-2014 cycle. PMID:26949569

  19. Bit-parallel arithmetic in a massively-parallel associative processor

    NASA Technical Reports Server (NTRS)

    Scherson, Isaac D.; Kramer, David A.; Alleyne, Brian D.

    1992-01-01

    A simple but powerful new architecture based on a classical associative processor model is presented. Algorithms for performing the four basic arithmetic operations both for integer and floating point operands are described. For m-bit operands, the proposed architecture makes it possible to execute complex operations in O(m) cycles as opposed to O(m^2) for bit-serial machines. A word-parallel, bit-parallel, massively-parallel computing system can be constructed using this architecture with VLSI technology. The operation of this system is demonstrated for the fast Fourier transform and matrix multiplication.
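
    For context on the O(m) claim: a ripple-carry add visits each of the m bit positions once, and on a word-parallel, bit-parallel associative machine the loop body applies to all memory words simultaneously. A serial Python sketch of the bit-level recurrence (LSB-first bit lists; illustrative only, not the paper's architecture):

        def ripple_add(a_bits, b_bits):
            # Bit-serial style ripple-carry add over m-bit operands (LSB first).
            # On the associative processor this loop runs once per bit position,
            # but across all words at once, giving O(m) total cycles.
            carry = 0
            out = []
            for a, b in zip(a_bits, b_bits):
                out.append(a ^ b ^ carry)
                carry = (a & b) | (carry & (a ^ b))
            out.append(carry)
            return out

        # 6 + 3 = 9, as LSB-first bit lists.
        assert ripple_add([0, 1, 1, 0], [1, 1, 0, 0]) == [1, 0, 0, 1, 0]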

  20. Automated Synthesis of a 184-Member Library of Thiadiazepan-1, 1-dioxide-4-ones

    PubMed Central

    Fenster, Erik; Long, Toby R.; Zang, Qin; Hill, David; Neuenswander, Benjamin; Lushington, Gerald H.; Zhou, Aihua; Santini, Conrad; Hanson, Paul R.

    2011-01-01

    The construction of a 225-member (3 × 5 × 15) library of thiadiazepan-1,1-dioxide-4-ones was performed on a Chemspeed Accelerator (SLT-100) automated parallel synthesis platform, culminating in the successful preparation of 184/225 sultams. Three sultam core scaffolds were prepared based upon the utilization of an aza-Michael reaction on a multifunctional vinyl sulfonamide linchpin. The library exploits peripheral diversity in the form of a sequential, two-step [3 + 2] Huisgen cycloaddition/Pd-catalyzed Suzuki–Miyaura coupling sequence. PMID:21309582

  1. Automated synthesis of a 184-member library of thiadiazepan-1,1-dioxide-4-ones.

    PubMed

    Fenster, Erik; Long, Toby R; Zang, Qin; Hill, David; Neuenswander, Benjamin; Lushington, Gerald H; Zhou, Aihua; Santini, Conrad; Hanson, Paul R

    2011-05-09

    The construction of a 225-member (3 × 5 × 15) library of thiadiazepan-1,1-dioxide-4-ones was performed on a Chemspeed Accelerator (SLT-100) automated parallel synthesis platform, culminating in the successful preparation of 184/225 sultams. Three sultam core scaffolds were prepared based upon the utilization of an aza-Michael reaction on a multifunctional vinyl sulfonamide linchpin. The library exploits peripheral diversity in the form of a sequential, two-step [3 + 2] Huisgen cycloaddition/Pd-catalyzed Suzuki-Miyaura coupling sequence.

  2. Process development for automated solar cell and module production. Task 4: Automated array assembly

    NASA Technical Reports Server (NTRS)

    1980-01-01

    A process sequence which can be used in conjunction with automated equipment for the mass production of solar cell modules for terrestrial use was developed. The process sequence was then critically analyzed from a technical and economic standpoint to determine the technological readiness of certain process steps for implementation. The steps receiving analysis were: back contact metallization, automated cell array layup/interconnect, and module edge sealing. For automated layup/interconnect, both hard automation and programmable automation (using an industrial robot) were studied. The programmable automation system was then selected for actual hardware development.

  3. Management Planning for Workplace Automation.

    ERIC Educational Resources Information Center

    McDole, Thomas L.

    Several factors must be considered when implementing office automation. Included among these are whether or not to automate at all, the effects of automation on employees, requirements imposed by automation on the physical environment, effects of automation on the total organization, and effects on clientele. The reasons behind the success or…

  4. Automation in astronomy.

    NASA Technical Reports Server (NTRS)

    Wampler, E. J.

    1972-01-01

    Description and evaluation of the remotely operated Lick Observatory Cassegrain focus of the 120-inch telescope. The experience with this instrument has revealed that an automated system can profoundly change the observer's approach to his work. This makes it difficult to evaluate the 'advantage' of an automated telescope over a conventional instrument. Some of the problems arising with automation in astronomy are discussed.

  5. Automated Reuse of Scientific Subroutine Libraries through Deductive Synthesis

    NASA Technical Reports Server (NTRS)

    Lowry, Michael R.; Pressburger, Thomas; VanBaalen, Jeffrey; Roach, Steven

    1997-01-01

    Systematic software construction offers the potential of elevating software engineering from an art form to an engineering discipline. The desired result is more predictable software development leading to better quality and more maintainable software. However, the overhead costs associated with the formalisms, mathematics, and methods of systematic software construction have largely precluded their adoption in real-world software development. In fact, many mainstream software development organizations, such as Microsoft, still maintain a predominantly oral culture for software development projects, which is far removed from a formalism-based culture for software development. An exception is the limited domain of safety-critical software, where the high assurance inherent in systematic software construction justifies the additional cost. We believe that systematic software construction will only be adopted by mainstream software development organizations when the overhead costs have been greatly reduced. Two approaches to cost mitigation are reuse (amortizing costs over many applications) and automation. For the last four years, NASA Ames has funded the Amphion project, whose objective is to automate software reuse through techniques from systematic software construction. In particular, deductive program synthesis (i.e., program extraction from proofs) is used to derive a composition of software components (e.g., subroutines) that correctly implements a specification. The construction of reuse libraries of software components is the standard software engineering solution for improving software development productivity and quality.

  6. An Automated Parallel Image Registration Technique Based on the Correlation of Wavelet Features

    NASA Technical Reports Server (NTRS)

    LeMoigne, Jacqueline; Campbell, William J.; Cromp, Robert F.; Zukor, Dorothy (Technical Monitor)

    2001-01-01

    With the increasing importance of multiple platform/multiple remote sensing missions, fast and automatic integration of digital data from disparate sources has become critical to the success of these endeavors. Our work utilizes maxima of wavelet coefficients to form the basic features of a correlation-based automatic registration algorithm. Our wavelet-based registration algorithm is tested successfully with data from the National Oceanic and Atmospheric Administration (NOAA) Advanced Very High Resolution Radiometer (AVHRR) and the Landsat/Thematic Mapper (TM), which differ by translation and/or rotation. By the choice of high-frequency wavelet features, this method is similar to an edge-based correlation method, but by exploiting the multi-resolution nature of a wavelet decomposition, our method achieves higher computational speeds for comparable accuracies. This algorithm has been implemented on a Single Instruction Multiple Data (SIMD) massively parallel computer, the MasPar MP-2, as well as on the Cray T3D, the Cray T3E and a Beowulf cluster of Pentium workstations.

  7. Automation in Clinical Microbiology

    PubMed Central

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  8. Relevance of Piagetian cross-cultural psychology to the humanities and social sciences.

    PubMed

    Oesterdiekhoff, Georg W

    2013-01-01

    Jean Piaget held views according to which there are parallels between ontogeny and the historical development of culture, sciences, and reason. His books are full of remarks and considerations about these parallels, with reference to many logical, physical, social, and moral phenomena.This article explains that Piagetian cross-cultural psychology has delivered the decisive data needed to extend the research interests of Piaget. These data provide a basis for reconstructing not only the history of sciences but also the history of religion, politics, morals, culture, philosophy, and social change and the emergence of industrial society. Thus, it is possible to develop Piagetian theory as a historical anthropology in order to provide a basis for the humanities and social sciences.

  9. Virtual automation.

    PubMed

    Casis, E; Garrido, A; Uranga, B; Vives, A; Zufiaurre, C

    2001-01-01

    Total laboratory automation (TLA) can be substituted in mid-size laboratories by a computer-controlled sample workflow (virtual automation). Such a solution has been implemented in our laboratory using PSM, software developed in cooperation with Roche Diagnostics (Barcelona, Spain) for this purpose. This software is connected to the online analyzers and to the laboratory information system and is able to control and direct the samples, working as an intermediate station. The only difference from TLA is the replacement of transport belts by laboratory personnel. The implementation of this virtual automation system has allowed us to achieve the main advantages of TLA: a workload increase (64%) with a reduction in the cost per test (43%), a significant reduction in the number of biochemistry primary tubes (from 8 to 2), less aliquoting (from 600 to 100 samples/day), automation of functional testing, a drastic reduction of preanalytical errors (from 11.7 to 0.4% of the tubes) and better total response time for both inpatients (from up to 48 hours to up to 4 hours) and outpatients (from up to 10 days to up to 48 hours). As an additional advantage, virtual automation could be implemented without hardware investment and with a significant headcount reduction (15% in our lab).

  10. Laboratory automation: total and subtotal.

    PubMed

    Hawker, Charles D

    2007-12-01

    Worldwide, perhaps 2000 or more clinical laboratories have implemented some form of laboratory automation, either a modular automation system, such as for front-end processing, or a total laboratory automation system. This article provides descriptions and examples of these various types of automation. It also presents an outline of how a clinical laboratory that is contemplating automation should approach its decision and the steps it should follow to ensure a successful implementation. Finally, the role of standards in automation is reviewed.

  11. A novel milliliter-scale chemostat system for parallel cultivation of microorganisms in stirred-tank bioreactors.

    PubMed

    Schmideder, Andreas; Severin, Timm Steffen; Cremer, Johannes Heinrich; Weuster-Botz, Dirk

    2015-09-20

    A pH-controlled parallel stirred-tank bioreactor system was modified for parallel continuous cultivation on a 10 mL scale by connecting multichannel peristaltic pumps for feeding and medium removal with micro-pipes (250 μm inner diameter). Parallel chemostat processes with Escherichia coli as an example showed high reproducibility with regard to culture volume and flow rates, as well as dry cell weight, dissolved oxygen concentration and pH control at steady states (n=8, coefficient of variation <5%). Reliable estimation of the kinetic growth parameters of E. coli was easily achieved within one parallel experiment by preselecting ten different steady states. Scalability of milliliter-scale steady-state results was demonstrated by chemostat studies with a stirred-tank bioreactor on a liter scale. Thus, parallel and continuously operated stirred-tank bioreactors on a milliliter scale facilitate time-saving and cost-reducing steady-state studies with microorganisms. The applied continuous bioreactor system overcomes the drawbacks of existing miniaturized bioreactors, like poor mass transfer and insufficient process control. Copyright © 2015 Elsevier B.V. All rights reserved.
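
    Kinetic growth parameters can be read off chemostat steady states because, at steady state, the specific growth rate equals the dilution rate D; assuming Monod kinetics, the residual substrate is S = Ks*D/(mu_max - D). A scipy fit over hypothetical steady-state data sketches the estimation step (the numbers are invented, not the study's):

        import numpy as np
        from scipy.optimize import curve_fit

        def residual_substrate(D, mu_max, Ks):
            # At steady state, growth rate = dilution rate D, so S = Ks*D/(mu_max - D).
            return Ks * D / (mu_max - D)

        # Hypothetical steady states: dilution rate (1/h) vs. residual substrate (g/L).
        D = np.array([0.1, 0.2, 0.3, 0.4, 0.5])
        S = np.array([0.011, 0.025, 0.043, 0.069, 0.111])

        (mu_max, Ks), _ = curve_fit(residual_substrate, D, S, p0=(0.7, 0.05),
                                    bounds=([0.51, 1e-4], [5.0, 1.0]))
        print(f"mu_max = {mu_max:.2f} 1/h, Ks = {Ks:.3f} g/L")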

  12. The Red Atlantic: Transoceanic Cultural Exchanges

    ERIC Educational Resources Information Center

    Weaver, Jace

    2011-01-01

    The development of David Armitage's "white Atlantic" history parallels the Cold War origins of American studies with its mission to define and promote "American culture" or "American civilization." British scholar Paul Gilroy's "The Black Atlantic" served as a necessary corrective. Armitage's statement leads…

  13. Use Computer-Aided Tools to Parallelize Large CFD Applications

    NASA Technical Reports Server (NTRS)

    Jin, H.; Frumkin, M.; Yan, J.

    2000-01-01

    Porting applications to high-performance parallel computers is always a challenging task. It is time-consuming and costly. With rapid progress in hardware architectures and the increasing complexity of real applications in recent years, the problem has become even more severe. Today, scalability and high performance mostly involve handwritten parallel programs using message-passing libraries (e.g. MPI). However, this process is very difficult and often error-prone. The recent reemergence of shared memory parallel (SMP) architectures, such as the cache coherent Non-Uniform Memory Access (ccNUMA) architecture used in the SGI Origin 2000, shows good prospects for scaling beyond hundreds of processors. Programming on an SMP is simplified by working in a globally accessible address space. The user can supply compiler directives, such as OpenMP, to parallelize the code. As an industry standard for portable implementation of parallel programs for SMPs, OpenMP is a set of compiler directives and callable runtime library routines that extend Fortran, C and C++ to express shared memory parallelism. It promises an incremental path for parallel conversion of existing software, as well as scalability and performance for a complete rewrite or an entirely new development. Perhaps the main disadvantage of programming with directives is that inserted directives may not necessarily enhance performance. In the worst cases, they can create erroneous results. While vendors have provided tools to perform error-checking and profiling, automation in directive insertion is very limited and often fails on large programs, primarily due to the lack of a thorough enough data dependence analysis. To overcome this deficiency, we have developed a toolkit, CAPO, to automatically insert OpenMP directives in Fortran programs and apply certain degrees of optimization. CAPO is aimed at taking advantage of the detailed inter-procedural dependence analysis provided by CAPTools, developed by the University of

  14. Application of a non-hazardous vital dye for cell counting with automated cell counters.

    PubMed

    Kim, Soo In; Kim, Hyun Jeong; Lee, Ho-Jae; Lee, Kiwon; Hong, Dongpyo; Lim, Hyunchang; Cho, Keunchang; Jung, Neoncheol; Yi, Yong Weon

    2016-01-01

    Recent advances in automated cell counters enable us to count cells more easily with consistency. However, the wide use of the traditional vital dye trypan blue (TB) raises environmental and health concerns due to its potential teratogenic effects. To avoid this chemical hazard, it is of importance to introduce an alternative non-hazardous vital dye that is compatible with automated cell counters. Erythrosin B (EB) is a vital dye that is impermeable to biological membranes and is used as a food additive. Similarly to TB, EB stains only nonviable cells with disintegrated membranes. However, EB is less popular than TB and is seldom used with automated cell counters. We found that cell counting accuracy with EB was comparable to that with TB. EB was found to be an effective dye for accurate counting of cells with different viabilities across three different automated cell counters. In contrast to TB, EB was less toxic to cultured HL-60 cells during the cell counting process. These results indicate that replacing TB with EB for use with automated cell counters will significantly reduce the hazardous risk while producing comparable results. Copyright © 2015 Logos Biosystems, Inc. Published by Elsevier Inc. All rights reserved.

  15. A scalable parallel black oil simulator on distributed memory parallel computers

    NASA Astrophysics Data System (ADS)

    Wang, Kun; Liu, Hui; Chen, Zhangxin

    2015-11-01

    This paper presents our work on developing a parallel black oil simulator for distributed memory computers based on our in-house parallel platform. The parallel simulator is designed to overcome the performance issues of common simulators that are implemented for personal computers and workstations. The finite difference method is applied to discretize the black oil model. In addition, some advanced techniques are employed to strengthen the robustness and parallel scalability of the simulator, including an inexact Newton method, matrix decoupling methods, and algebraic multigrid methods. A new multi-stage preconditioner is proposed to accelerate the solution of linear systems from the Newton methods. Numerical experiments show that our simulator is scalable and efficient, and is capable of simulating extremely large-scale black oil problems with tens of millions of grid blocks using thousands of MPI processes on parallel computers.
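
    One ingredient named above, the inexact Newton method, can be sketched compactly: the linear Newton system is solved only approximately by an inner iterative solver. The Jacobi inner solver and the toy problem below are stand-ins for illustration, not the authors' simulator or preconditioner.

      import numpy as np

      def jacobi_solve(A, b, sweeps=5):
          # Crude inner solver: a few Jacobi sweeps stand in for the
          # preconditioned Krylov method a production simulator would use.
          x = np.zeros_like(b)
          D = np.diag(A)
          R = A - np.diag(D)
          for _ in range(sweeps):
              x = (b - R @ x) / D
          return x

      def inexact_newton(F, J, x0, tol=1e-8, max_iter=50):
          x = x0.astype(float)
          for _ in range(max_iter):
              r = F(x)
              if np.linalg.norm(r) < tol:
                  break
              x = x + jacobi_solve(J(x), -r)  # approximate Newton step
          return x

      # Toy problem: solve x**2 = 2 component-wise.
      F = lambda x: x**2 - 2.0
      J = lambda x: np.diag(2.0 * x)
      print(inexact_newton(F, J, np.array([1.0, 3.0])))  # ~[1.4142, 1.4142]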

  16. Mainstreaming culture in psychology.

    PubMed

    Cheung, Fanny M

    2012-11-01

    Despite the "awakening" to the importance of culture in psychology in America, international psychology has remained on the sidelines of psychological science. The author recounts her personal and professional experience in tandem with the stages of development in international/cross-cultural psychology. Based on her research in cross-cultural personality assessment, the author discusses the inadequacies of sole reliance on either the etic or the emic approach and points out the advantages of a combined emic-etic approach in bridging global and local human experiences in psychological science and practice. With the blurring of the boundaries between North American-European psychologies and psychology in the rest of the world, there is a need to mainstream culture in psychology's epistemological paradigm. Borrowing from the concept of gender mainstreaming that embraces both similarities and differences in promoting equal opportunities, the author discusses the parallel needs of acknowledging universals and specifics when mainstreaming culture in psychology. She calls for building a culturally informed universal knowledge base that should be incorporated in the psychology curriculum and textbooks. PsycINFO Database Record (c) 2012 APA, all rights reserved.

  17. Parallel algorithms for mapping pipelined and parallel computations

    NASA Technical Reports Server (NTRS)

    Nicol, David M.

    1988-01-01

    Many computational problems in image processing, signal processing, and scientific computing are naturally structured for either pipelined or parallel computation. When mapping such problems onto a parallel architecture it is often necessary to aggregate an obvious problem decomposition. Even in this context the general mapping problem is known to be computationally intractable, but recent advances have been made in identifying classes of problems and architectures for which optimal solutions can be found in polynomial time. Among these, the mapping of pipelined or parallel computations onto linear array, shared memory, and host-satellite systems figures prominently. This paper extends that work first by showing how to improve existing serial mapping algorithms. These improvements have significantly lower time and space complexities: in one case a published O(nm^3) time algorithm for mapping m modules onto n processors is reduced to an O(nm log m) time complexity, and its space requirements reduced from O(nm^2) to O(m). Run time complexity is further reduced with parallel mapping algorithms based on these improvements, which run on the architecture for which they create the mappings.
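
    The flavor of the mapping problem can be shown with a small sketch. The code below is not the paper's algorithm; it solves the simplest variant, assigning a chain of module weights to n processors as contiguous blocks so as to minimize the bottleneck (maximum per-processor load), via binary search on the bottleneck value with a greedy feasibility probe. Real pipelined mappings must additionally account for communication costs.

      def feasible(weights, n, cap):
          # Can the chain be cut into <= n contiguous blocks of load <= cap?
          procs, load = 1, 0
          for w in weights:
              if w > cap:
                  return False
              if load + w > cap:
                  procs, load = procs + 1, 0
              load += w
          return procs <= n

      def min_bottleneck(weights, n):
          lo, hi = max(weights), sum(weights)
          while lo < hi:
              mid = (lo + hi) // 2
              if feasible(weights, n, mid):
                  hi = mid
              else:
                  lo = mid + 1
          return lo

      print(min_bottleneck([4, 7, 2, 9, 3, 6], n=3))  # -> 11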

  18. Density-based parallel skin lesion border detection with webCL

    PubMed Central

    2015-01-01

    Background Dermoscopy is a highly effective and noninvasive imaging technique used in the diagnosis of melanoma and other pigmented skin lesions. Many aspects of the lesion under consideration are defined in relation to the lesion border. This makes border detection one of the most important steps in dermoscopic image analysis. In current practice, dermatologists often delineate borders through a hand-drawn representation based upon visual inspection. Due to the subjective nature of this technique, intra- and inter-observer variations are common. Because of this, the automated assessment of lesion borders in dermoscopic images has become an important area of study. Methods A fast density-based skin lesion border detection method has been implemented in parallel with a new parallel technology called WebCL. WebCL utilizes client-side computing capabilities to use available hardware resources such as multi-core CPUs and GPUs. The developed WebCL-parallel density-based skin lesion border detection method runs efficiently in Internet browsers. Results Previous research indicates that one of the highest accuracy rates can be achieved using density-based clustering techniques for skin lesion border detection. While these algorithms do have unfavorable time complexities, this effect can be mitigated when they are implemented in parallel. In this study, the density-based clustering technique for skin lesion border detection is parallelized and redesigned to run very efficiently on heterogeneous platforms (e.g. tablets, smartphones, multi-core CPUs, GPUs, and fully-integrated Accelerated Processing Units) by transforming the technique into a series of independent concurrent operations. Heterogeneous computing is adopted to support accessibility, portability and multi-device use in clinical settings. For this, we used WebCL, an emerging technology that enables an HTML5 Web browser to execute code in parallel on heterogeneous platforms. We depicted WebCL and our parallel algorithm design. In

  19. Density-based parallel skin lesion border detection with webCL.

    PubMed

    Lemon, James; Kockara, Sinan; Halic, Tansel; Mete, Mutlu

    2015-01-01

    Dermoscopy is a highly effective and noninvasive imaging technique used in the diagnosis of melanoma and other pigmented skin lesions. Many aspects of the lesion under consideration are defined in relation to the lesion border. This makes border detection one of the most important steps in dermoscopic image analysis. In current practice, dermatologists often delineate borders through a hand-drawn representation based upon visual inspection. Due to the subjective nature of this technique, intra- and inter-observer variations are common. Because of this, the automated assessment of lesion borders in dermoscopic images has become an important area of study. A fast density-based skin lesion border detection method has been implemented in parallel with a new parallel technology called WebCL. WebCL utilizes client-side computing capabilities to use available hardware resources such as multi-core CPUs and GPUs. The developed WebCL-parallel density-based skin lesion border detection method runs efficiently in Internet browsers. Previous research indicates that one of the highest accuracy rates can be achieved using density-based clustering techniques for skin lesion border detection. While these algorithms do have unfavorable time complexities, this effect can be mitigated when they are implemented in parallel. In this study, the density-based clustering technique for skin lesion border detection is parallelized and redesigned to run very efficiently on heterogeneous platforms (e.g. tablets, smartphones, multi-core CPUs, GPUs, and fully-integrated Accelerated Processing Units) by transforming the technique into a series of independent concurrent operations. Heterogeneous computing is adopted to support accessibility, portability and multi-device use in clinical settings. For this, we used WebCL, an emerging technology that enables an HTML5 Web browser to execute code in parallel on heterogeneous platforms. We depicted WebCL and our parallel algorithm design. In addition, we tested
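
    The "series of independent concurrent operations" can be pictured with a small analogue. WebCL itself is JavaScript/OpenCL in the browser; the Python sketch below only mirrors the decomposition idea with a process pool, counting each point's neighbors within a radius independently and treating low-density points as border candidates. Point cloud, radius, and threshold are all illustrative.

      import numpy as np
      from multiprocessing import Pool

      rng = np.random.default_rng(0)   # fixed seed so spawned workers that
      POINTS = rng.random((2000, 2))   # re-import this module see the same data
      EPS = 0.05

      def density(i):
          # Independent per-point work: neighbors of point i within EPS.
          d = np.linalg.norm(POINTS - POINTS[i], axis=1)
          return int((d < EPS).sum()) - 1  # exclude the point itself

      if __name__ == "__main__":
          with Pool() as pool:
              counts = pool.map(density, range(len(POINTS)))
          border = [i for i, c in enumerate(counts) if c < 5]
          print(len(border), "candidate border points")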

  20. Titanium(IV) isopropoxide mediated solution phase reductive amination on an automated platform: application in the generation of urea and amide libraries.

    PubMed

    Bhattacharyya, S; Fan, L; Vo, L; Labadie, J

    2000-04-01

    Amine libraries and their derivatives are important targets for high throughput synthesis because of their versatility as medicinal agents and agrochemicals. As a part of our efforts towards automated chemical library synthesis, a titanium(IV) isopropoxide mediated solution phase reductive amination protocol was successfully translated to automation on the Trident(TM) library synthesizer of Argonaut Technologies. An array of 24 secondary amines was prepared in high yield and purity from 4 primary amines and 6 carbonyl compounds. These secondary amines were further utilized in a split synthesis to generate libraries of ureas, amides and sulfonamides in solution phase on the Trident(TM). The automated runs included 192 reactions to synthesize 96 ureas in duplicate and 96 reactions to synthesize 48 amides and 48 sulfonamides. A number of polymer-assisted solution phase protocols were employed for parallel work-up and purification of the products in each step.

  1. The cultural side of science communication.

    PubMed

    Medin, Douglas L; Bang, Megan

    2014-09-16

    The main proposition of this paper is that science communication necessarily involves and includes cultural orientations. There is a substantial body of work showing that cultural differences in values and epistemological frameworks are paralleled with cultural differences reflected in artifacts and public representations. One dimension of cultural difference is the psychological distance between humans and the rest of nature. Another is perspective taking and attention to context and relationships. As an example of distance, most (Western) images of ecosystems do not include human beings, and European American discourse tends to position human beings as being apart from nature. Native American discourse, in contrast, tends to describe human beings as a part of nature. We trace the correspondences between cultural properties of media, focusing on children's books, and cultural differences in biological cognition. Finally, implications for both science communication and science education are outlined.

  2. Performance Evaluation of Evasion Maneuvers for Parallel Approach Collision Avoidance

    NASA Technical Reports Server (NTRS)

    Winder, Lee F.; Kuchar, James K.; Waller, Marvin (Technical Monitor)

    2000-01-01

    Current plans for independent instrument approaches to closely spaced parallel runways call for an automated pilot alerting system to ensure separation of aircraft in the case of a "blunder," or unexpected deviation from the normal approach path. Resolution advisories by this system would require the pilot of an endangered aircraft to perform a trained evasion maneuver. The potential performance of two evasion maneuvers, referred to as the "turn-climb" and "climb-only," was estimated using an experimental NASA alerting logic (AILS) and a computer simulation of relative trajectory scenarios between two aircraft. One aircraft was equipped with the NASA alerting system, and maneuvered accordingly. Observation of the rates of different types of alerting failure allowed judgement of evasion maneuver performance. System Operating Characteristic (SOC) curves were used to assess the benefit of alerting with each maneuver.

  3. Automation of Oklahoma School Library Media Centers: Automation at the Local Level.

    ERIC Educational Resources Information Center

    Oklahoma State Dept. of Education, Oklahoma City. Library and Learning Resources Section.

    This document outlines a workshop for media specialists--"School Library Automation: Solving the Puzzle"--that is designed to reduce automation anxiety and give a broad overview of the concerns confronting school library media centers planning for or involved in automation. Issues are addressed under the following headings: (1) Levels of School…

  4. Parallel computing works

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    An account of the Caltech Concurrent Computation Program (C^3P), a five year project that focused on answering the question: Can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C^3P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high performance computing facility based exclusively on parallel computers. While the initial focus of C^3P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

  5. TRIC: an automated alignment strategy for reproducible protein quantification in targeted proteomics.

    PubMed

    Röst, Hannes L; Liu, Yansheng; D'Agostino, Giuseppe; Zanella, Matteo; Navarro, Pedro; Rosenberger, George; Collins, Ben C; Gillet, Ludovic; Testa, Giuseppe; Malmström, Lars; Aebersold, Ruedi

    2016-09-01

    Next-generation mass spectrometric (MS) techniques such as SWATH-MS have substantially increased the throughput and reproducibility of proteomic analysis, but ensuring consistent quantification of thousands of peptide analytes across multiple liquid chromatography-tandem MS (LC-MS/MS) runs remains a challenging and laborious manual process. To produce highly consistent and quantitatively accurate proteomics data matrices in an automated fashion, we developed TRIC (http://proteomics.ethz.ch/tric/), a software tool that utilizes fragment-ion data to perform cross-run alignment, consistent peak-picking and quantification for high-throughput targeted proteomics. TRIC reduced the identification error compared to a state-of-the-art SWATH-MS analysis without alignment by more than threefold at constant recall while correcting for highly nonlinear chromatographic effects. On a pulsed-SILAC experiment performed on human induced pluripotent stem cells, TRIC was able to automatically align and quantify thousands of light and heavy isotopic peak groups. Thus, TRIC fills a gap in the pipeline for automated analysis of massively parallel targeted proteomics data sets.

  6. Pumps for microfluidic cell culture.

    PubMed

    Byun, Chang Kyu; Abi-Samra, Kameel; Cho, Yoon-Kyoung; Takayama, Shuichi

    2014-02-01

    In comparison to traditional in vitro cell culture in Petri dishes or well plates, cell culture in microfluidic-based devices enables better control over chemical and physical environments, higher levels of experimental automation, and a reduction in experimental materials. Over the past decade, the advantages associated with cell culturing in microfluidic-based platforms have garnered significant interest and have led to a plethora of studies for high throughput cell assays, organs-on-a-chip applications, temporal signaling studies, and cell sorting. A clear concern for performing cell culture in microfluidic-based devices is deciding on a technique to deliver and pump media to cells that are encased in a microfluidic device. In this review, we summarize recent advances in pumping techniques for microfluidic cell culture and discuss their advantages and possible drawbacks. The ultimate goal of our review is to distill the large body of information available related to pumps for microfluidic cell culture in an effort to assist current and potential users of microfluidic-based devices for advanced in vitro cellular studies. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Bioprocessing automation in cell therapy manufacturing: Outcomes of special interest group automation workshop.

    PubMed

    Ball, Oliver; Robinson, Sarah; Bure, Kim; Brindley, David A; Mccall, David

    2018-04-01

    Phacilitate held a Special Interest Group workshop event in Edinburgh, UK, in May 2017. The event brought together leading stakeholders in the cell therapy bioprocessing field to identify present and future challenges and propose potential solutions to automation in cell therapy bioprocessing. Here, we review and summarize discussions from the event. Deep biological understanding of a product, its mechanism of action and indication pathogenesis underpin many factors relating to bioprocessing and automation. To fully exploit the opportunities of bioprocess automation, therapeutics developers must closely consider whether an automation strategy is applicable, how to design an 'automatable' bioprocess and how to implement process modifications with minimal disruption. Major decisions around bioprocess automation strategy should involve all relevant stakeholders; communication between technical and business strategy decision-makers is of particular importance. Developers should leverage automation to implement in-process testing, in turn applicable to process optimization, quality assurance (QA)/quality control (QC), batch failure control, adaptive manufacturing and regulatory demands, but a lack of precedent and technical opportunities can complicate such efforts. Sparse standardization across product characterization, hardware components and software platforms is perceived to complicate efforts to implement automation. The use of advanced algorithmic approaches such as machine learning may have application to bioprocess and supply chain optimization. Automation can substantially de-risk the wider supply chain, including tracking and traceability, cryopreservation and thawing and logistics. The regulatory implications of automation are currently unclear because few hardware options exist and novel solutions require case-by-case validation, but automation can present attractive regulatory incentives. Copyright © 2018 International Society for Cellular Therapy.

  8. Automation of a Wave-Optics Simulation and Image Post-Processing Package on Riptide

    NASA Astrophysics Data System (ADS)

    Werth, M.; Lucas, J.; Thompson, D.; Abercrombie, M.; Holmes, R.; Roggemann, M.

    Detailed wave-optics simulations and image post-processing algorithms are computationally expensive and benefit from the massively parallel hardware available at supercomputing facilities. We created an automated system that interfaces with the Maui High Performance Computing Center (MHPCC) Distributed MATLAB® Portal to submit massively parallel wave-optics simulations to the IBM iDataPlex (Riptide) supercomputer. This system subsequently post-processes the output images with an improved version of physically constrained iterative deconvolution (PCID) and analyzes the results using a series of modular algorithms written in Python. With this architecture, a single person can simulate thousands of unique scenarios and produce analyzed, archived, and briefing-compatible output products with very little effort. This research was developed with funding from the Defense Advanced Research Projects Agency (DARPA). The views, opinions, and/or findings expressed are those of the author(s) and should not be interpreted as representing the official views or policies of the Department of Defense or the U.S. Government.

  9. Automating Microbial Directed Evolution For Bioengineering Applications

    NASA Astrophysics Data System (ADS)

    Lee, A.; Demachkie, I. S.; Sardesh, N.; Arismendi, D.; Ouandji, C.; Wang, J.; Blaich, J.; Gentry, D.

    2016-12-01

    From a microbiology perspective, directed evolution is a technique that uses controlled environmental pressures to select for a desired phenotype. Directed evolution has the distinct advantage over rational design of not needing extensive knowledge of the genome or pathways associated with a microorganism to induce phenotypes. However, there are currently limitations to the applicability of this technique, including being time-consuming, error-prone, and dependent on existing assays that may lack selectivity for the given phenotype. The AADEC (Autonomous Adaptive Directed Evolution Chamber) system is a proof-of-concept instrument to automate and improve the technique such that directed evolution can be used more effectively as a general bioengineering tool. A series of tests using the automated system and comparable by-hand survival assay measurements have been carried out using UV-C radiation and Escherichia coli cultures in order to demonstrate the advantages of the AADEC versus traditional implementations of directed evolution such as random mutagenesis. AADEC uses UV-C exposure as both a source of environmental stress and mutagenesis, so in order to evaluate the UV-C tolerance obtained by the cultures, a manual UV-C exposure survival assay was developed alongside the device to compare the survival fractions at a fixed dosage. This survival assay involves exposing E. coli to UV-C radiation using a custom-designed exposure hood to control the flux and dose. Surviving cells are counted, then transferred to the next iteration, and so on for several iterations, to calculate the survival fractions for each exposure iteration. This survival assay primarily serves as a baseline for the AADEC device, allowing quantification of the differences between the AADEC system and the manual approach. The primary data of comparison is survival fractions; this is obtained by optical density and plate counts in the manual assay and by optical density growth curve fits pre- and post
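
    The survival-fraction bookkeeping described for the manual assay reduces to simple arithmetic; a minimal helper with invented counts might look like this:

      def survival_fraction(cfu_after, cfu_before):
          # Fraction of plated cells that survive one exposure iteration.
          return cfu_after / cfu_before

      iterations = [(1e7, 1e7), (3.2e5, 1e7), (4.1e4, 1e7)]  # illustrative
      for i, (after, before) in enumerate(iterations):
          print(f"iteration {i}: survival = {survival_fraction(after, before):.2e}")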

  10. Automated clinical trial eligibility prescreening: increasing the efficiency of patient identification for clinical trials in the emergency department

    PubMed Central

    Ni, Yizhao; Kennebeck, Stephanie; Dexheimer, Judith W; McAneney, Constance M; Tang, Huaxiu; Lingren, Todd; Li, Qi; Zhai, Haijun; Solti, Imre

    2015-01-01

    Objectives (1) To develop an automated eligibility screening (ES) approach for clinical trials in an urban tertiary care pediatric emergency department (ED); (2) to assess the effectiveness of natural language processing (NLP), information extraction (IE), and machine learning (ML) techniques on real-world clinical data and trials. Data and methods We collected eligibility criteria for 13 randomly selected, disease-specific clinical trials actively enrolling patients between January 1, 2010 and August 31, 2012. In parallel, we retrospectively selected data fields including demographics, laboratory data, and clinical notes from the electronic health record (EHR) to represent profiles of all 202,795 patients visiting the ED during the same period. Leveraging NLP, IE, and ML technologies, the automated ES algorithms identified patients whose profiles matched the trial criteria to reduce the pool of candidates for staff screening. The performance was validated on both a physician-generated gold standard of trial–patient matches and a reference standard of historical trial–patient enrollment decisions, where workload, mean average precision (MAP), and recall were assessed. Results Compared with the case without automation, the workload with automated ES was reduced by 92% on the gold standard set, with a MAP of 62.9%. The automated ES achieved a 450% increase in trial screening efficiency. The findings on the gold standard set were confirmed by large-scale evaluation on the reference set of trial–patient matches. Discussion and conclusion By exploiting the text of trial criteria and the content of EHRs, we demonstrated that NLP-, IE-, and ML-based automated ES could successfully identify patients for clinical trials. PMID:25030032
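
    At its core, prescreening tests each patient profile against structured trial criteria to shrink the pool handed to staff; the real system extracts those facts from free-text notes with NLP/IE/ML. A deliberately simplified sketch of the matching step, with made-up fields, criteria, and values:

      patients = [
          {"id": 1, "age": 7,  "note": "fever and cough for 3 days"},
          {"id": 2, "age": 15, "note": "ankle sprain playing soccer"},
      ]
      trial = {"min_age": 2, "max_age": 12, "keywords": {"fever"}}

      def prescreen(p):
          in_age = trial["min_age"] <= p["age"] <= trial["max_age"]
          has_kw = any(k in p["note"] for k in trial["keywords"])
          return in_age and has_kw

      candidates = [p["id"] for p in patients if prescreen(p)]
      print(candidates)  # -> [1]; staff then screen only this reduced pool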

  11. Automation: triumph or trap?

    PubMed

    Smythe, M H

    1997-01-01

    Automation, a hot topic in the laboratory world today, can be a very expensive option. Those who are considering implementing automation can save time and money by examining the issues from the standpoint of an industrial/manufacturing engineer. The engineer not only asks what problems will be solved by automation, but what problems will be created. This article discusses questions that must be asked and answered to ensure that automation efforts will yield real and substantial payoffs.

  12. A modular suite of hardware enabling spaceflight cell culture research

    NASA Technical Reports Server (NTRS)

    Hoehn, Alexander; Klaus, David M.; Stodieck, Louis S.

    2004-01-01

    BioServe Space Technologies, a NASA Research Partnership Center (RPC), has developed and operated various middeck payloads launched on 23 shuttle missions since 1991 in support of commercial space biotechnology projects. Modular cell culture systems are contained within the Commercial Generic Bioprocessing Apparatus (CGBA) suite of flight-qualified hardware, compatible with Space Shuttle, SPACEHAB, Spacelab and International Space Station (ISS) EXPRESS Rack interfaces. As part of the CGBA family, the Isothermal Containment Module (ICM) incubator provides thermal control, data acquisition and experiment manipulation capabilities, including accelerometer launch detection for automated activation and thermal profiling for culture incubation and sample preservation. The ICM can accommodate up to 8 individually controlled temperature zones. Command and telemetry capabilities allow real-time downlink of data and video permitting remote payload operation and ground control synchronization. Individual cell culture experiments can be accommodated in a variety of devices ranging from 'microgravity test tubes' or standard 100 mm Petri dishes, to complex, fed-batch bioreactors with automated culture feeding, waste removal and multiple sample draws. Up to 3 levels of containment can be achieved for chemical fixative addition, and passive gas exchange can be provided through hydrophobic membranes. Many additional options exist for designing customized hardware depending on specific science requirements.

  13. Development of an integrated semi-automated system for in vitro pharmacodynamic modelling.

    PubMed

    Wang, Liangsu; Wismer, Michael K; Racine, Fred; Conway, Donald; Giacobbe, Robert A; Berejnaia, Olga; Kath, Gary S

    2008-11-01

    The aim of this study was to develop an integrated system for in vitro pharmacodynamic modelling of antimicrobials with greater flexibility, easier control and better accuracy than existing in vitro models. Custom-made bottle caps, fittings, valve controllers and a modified bench-top shaking incubator were used. A temperature-controlled automated sample collector was built. Computer software was developed to manage experiments and to control the entire system including solenoid pinch valves, peristaltic pumps and the sample collector. The system was validated by pharmacokinetic simulations of linezolid 600 mg infusion. The antibacterial effect of linezolid against multiple Staphylococcus aureus strains was also studied in this system. An integrated semi-automated bench-top system was built and validated. The temperature-controlled automated sample collector allowed unattended collection and temporary storage of samples. The system software reduced the labour necessary for many tasks and also improved the timing accuracy for performing simultaneous actions in multiple parallel experiments. The system was able to simulate human pharmacokinetics of linezolid 600 mg intravenous infusion accurately. A pharmacodynamic study of linezolid against multiple S. aureus strains with a range of MICs showed that the required 24 h free drug AUC/MIC ratio was approximately 30 in order to keep the organism counts at the same level as their initial inoculum and was approximately ≥68 in order to achieve >2 log10 cfu/mL reduction in the in vitro model. The integrated semi-automated bench-top system provided the ability to overcome many of the drawbacks of existing in vitro models. It can be used for various simple or complicated pharmacokinetic/pharmacodynamic studies efficiently and conveniently.
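
    The pharmacodynamic index reported above is straightforward to reproduce numerically: integrate the free-drug concentration-time curve over 24 h by the trapezoidal rule and divide by the MIC. The concentration profile below is invented for illustration, not study data.

      import numpy as np

      t = np.array([0, 1, 2, 4, 8, 12, 24], dtype=float)  # time, h
      c = np.array([0, 12, 9, 6, 3, 1.5, 0.4])            # free drug, mg/L
      auc24 = float(np.sum((c[1:] + c[:-1]) / 2 * np.diff(t)))  # trapezoids
      mic = 2.0                                            # mg/L
      print(f"fAUC(0-24h) = {auc24:.1f} mg*h/L, fAUC/MIC = {auc24 / mic:.1f}")
      # Ratios near 30 held counts at the inoculum level in the in vitro model.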

  14. Development and application of an automated precision solar radiometer

    NASA Astrophysics Data System (ADS)

    Qiu, Gang-gang; Li, Xin; Zhang, Quan; Zheng, Xiao-bing; Yan, Jing

    2016-10-01

    Automated field vicarious calibration is becoming a growing trend for satellite remote sensing, and it requires a solar radiometer that can automatically acquire reliable measurements over long periods, whatever the weather conditions, and transfer the data to the user's office. An automated precision solar radiometer has been developed for measuring the solar spectral irradiance received at the Earth's surface. The instrument consists of 8 parallel, separate silicon-photodiode-based channels with narrow band-pass filters spanning the visible to near-IR regions. Each channel has a 2.0° full-angle Field of View (FOV). The detectors and filters are temperature-stabilized at 30 ± 0.2 °C using a Thermal Energy Converter. The instrument is pointed toward the sun via an auto-tracking system that actively tracks the sun to within ±0.1°. It collects data automatically and communicates with the user terminal through BDS (China's BeiDou Navigation Satellite System), while recording data redundantly in internal memory, including working state and errors. The solar radiometer is automated in the sense that it requires no supervision throughout the whole working process: it calculates start and stop times every day to match sunrise and sunset, and stops working when precipitation begins. Calibrated via Langley curves and compared with simultaneous CE318 observations, the difference in Aerosol Optical Depth (AOD) is within 5%. The solar radiometer has run under all kinds of harsh weather conditions in the Gobi Desert at Dunhuang and obtained AODs nearly continuously for eight months. This paper presents the instrument design analysis, atmospheric optical depth retrievals, and the experimental results.
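
    The Langley calibration mentioned above can be sketched in a few lines: over a clear, stable period, ln(V) is linear in air mass m with slope -tau, the total optical depth (obtaining AOD additionally requires subtracting Rayleigh and gas contributions, which this toy omits). The signal values are synthetic, not instrument data.

      import numpy as np

      m = np.linspace(1.5, 5.0, 20)          # air mass samples over a morning
      tau_true, v0 = 0.18, 1.00              # synthetic "truth"
      rng = np.random.default_rng(1)
      v = v0 * np.exp(-tau_true * m) * rng.normal(1.0, 0.005, m.size)

      slope, intercept = np.polyfit(m, np.log(v), 1)   # Langley line fit
      print("optical depth estimate:", -slope, "ln(V0):", intercept)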

  15. Automated sequence analysis and editing software for HIV drug resistance testing.

    PubMed

    Struck, Daniel; Wallis, Carole L; Denisov, Gennady; Lambert, Christine; Servais, Jean-Yves; Viana, Raquel V; Letsoalo, Esrom; Bronze, Michelle; Aitken, Sue C; Schuurman, Rob; Stevens, Wendy; Schmit, Jean Claude; Rinke de Wit, Tobias; Perez Bercoff, Danielle

    2012-05-01

    Access to antiretroviral treatment in resource-limited settings is inevitably paralleled by the emergence of HIV drug resistance. Monitoring treatment efficacy and HIV drug resistance testing are therefore of increasing importance in resource-limited settings. Yet low-cost technologies and procedures suited to the particular context and constraints of such settings are still lacking. The ART-A (Affordable Resistance Testing for Africa) consortium brought together public and private partners to address this issue by developing an automated sequence analysis and editing software to support high-throughput automated sequencing. The ART-A Software was designed to automatically process and edit ABI chromatograms or FASTA files from HIV-1 isolates. It performs the basecalling, assigns quality values, aligns query sequences against a pre-set reference, infers a consensus sequence, identifies the HIV type and subtype, translates the nucleotide sequence to amino acids, and reports insertions/deletions, premature stop codons, ambiguities and mixed calls. The results can be automatically exported to Excel to identify mutations. Automated analysis was compared to manual analysis using a panel of 1624 PR-RT sequences generated in 3 different laboratories. Discrepancies between manual and automated sequence analysis were 0.69% at the nucleotide level and 0.57% at the amino acid level (668,047 AA analyzed), and discordances at major resistance mutations were recorded in 62 cases (4.83% of differences, 0.04% of all AA) for PR and 171 cases (6.18% of differences, 0.03% of all AA) for RT. The ART-A Software is a time-sparing tool for pre-analyzing HIV and viral quasispecies sequences in high-throughput laboratories and highlighting positions requiring attention. Copyright © 2012 Elsevier B.V. All rights reserved.
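
    The final reporting step, translating a query and listing amino-acid changes against a reference in the usual position notation, can be sketched compactly. The codon table is deliberately tiny and the sequences invented; the real pipeline additionally handles basecalling, quality values, alignment, subtyping, indels and mixed calls.

      from textwrap import wrap

      CODONS = {"ATG": "M", "GTG": "V", "AAA": "K", "TTT": "F", "TAA": "*"}

      def translate(nt):
          # Unknown codons map to "X" in this toy table.
          return "".join(CODONS.get(c, "X") for c in wrap(nt.upper(), 3))

      def mutations(ref_aa, query_aa):
          return [f"{r}{i + 1}{q}"
                  for i, (r, q) in enumerate(zip(ref_aa, query_aa)) if r != q]

      ref = translate("ATGAAATTT")   # MKF
      qry = translate("GTGAAATTT")   # VKF
      print(mutations(ref, qry))     # -> ['M1V']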

  16. Sunglint Detection for Unmanned and Automated Platforms

    PubMed Central

    Garaba, Shungudzemwoyo Pascal; Schulz, Jan; Wernand, Marcel Robert; Zielinski, Oliver

    2012-01-01

    We present an empirical quality control protocol for above-water radiometric sampling, focussing on identifying sunglint situations. Using hyperspectral radiometers, measurements were taken on an automated and unmanned seaborne platform in northwest European shelf seas. In parallel, a camera system was used to capture sea surface and sky images of the investigated points. The quality control consists of meteorological flags, to mask dusk, dawn, precipitation and low-light conditions, utilizing incoming solar irradiance (ES) spectra. Using 629 of a total of 3,121 spectral measurements that passed the test conditions of the meteorological flagging, a new sunglint flag was developed. To detect the sunglint visible in the simultaneously available sea surface images, a sunglint image detection algorithm was developed and implemented. Applying this algorithm, two data sets were derived: one with sunglint (detectable white pixels) and one without (few or no detectable white pixels). To identify the most effective sunglint flagging criteria we evaluated the spectral characteristics of these two data sets using water leaving radiance (LW) and remote sensing reflectance (RRS). Spectral conditions satisfying 'mean LW (700–950 nm) < 2 mW·m−2·nm−1·sr−1' or alternatively 'minimum RRS (700–950 nm) < 0.010 sr−1' mask most measurements affected by sunglint, providing an efficient empirical flagging of sunglint in automated quality control.
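
    The two flagging criteria quoted above translate directly into code. The sketch below encodes them against a synthetic flat spectrum, returning True when a spectrum should be masked as sunglint; the array layout and the test spectrum are assumptions for illustration.

      import numpy as np

      def sunglint_flagged(wavelengths, lw=None, rrs=None):
          # Pass (do not flag) if either criterion holds:
          #   mean LW(700-950 nm) < 2 mW m-2 nm-1 sr-1, or
          #   min  RRS(700-950 nm) < 0.010 sr-1.
          nir = (wavelengths >= 700) & (wavelengths <= 950)
          if lw is not None and np.mean(lw[nir]) < 2.0:
              return False
          if rrs is not None and np.min(rrs[nir]) < 0.010:
              return False
          return True

      wl = np.arange(400, 951, 10.0)
      lw = np.full(wl.shape, 1.2)          # low NIR radiance
      print(sunglint_flagged(wl, lw=lw))   # -> False (not flagged)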

  17. Template based parallel checkpointing in a massively parallel computer system

    DOEpatents

    Archer, Charles Jens [Rochester, MN; Inglett, Todd Alan [Rochester, MN

    2009-01-13

    A method and apparatus for a template based parallel checkpoint save for a massively parallel supercomputer system using a parallel variation of the rsync protocol, and network broadcast. In preferred embodiments, the checkpoint data for each node is compared to a template checkpoint file that resides in the storage and that was previously produced. Embodiments herein greatly decrease the amount of data that must be transmitted and stored for faster checkpointing and increased efficiency of the computer system. Embodiments are directed to a parallel computer system with nodes arranged in a cluster with a high speed interconnect that can perform broadcast communication. The checkpoint contains a set of actual small data blocks with their corresponding checksums from all nodes in the system. The data blocks may be compressed using conventional non-lossy data compression algorithms to further reduce the overall checkpoint size.
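
    The core idea, transmitting only the checkpoint blocks that differ from a previously produced template, can be pictured with an rsync-flavoured sketch. Block size, hash choice, and the in-memory buffers are illustrative; the patented system additionally broadcasts the template and compresses the blocks.

      import hashlib

      BLOCK = 4096

      def checksums(data):
          return [hashlib.md5(data[i:i + BLOCK]).hexdigest()
                  for i in range(0, len(data), BLOCK)]

      def delta(template, current):
          # Keep only the blocks whose checksum differs from the template's.
          tsums = checksums(template)
          out = []
          for idx, i in enumerate(range(0, len(current), BLOCK)):
              blk = current[i:i + BLOCK]
              if idx >= len(tsums) or hashlib.md5(blk).hexdigest() != tsums[idx]:
                  out.append((idx, blk))   # only these blocks leave the node
          return out

      template = bytes(16384)              # 4 blocks of zeros
      current = bytearray(template)
      current[5000] = 0xFF                 # dirty exactly one block
      print([idx for idx, _ in delta(template, bytes(current))])  # -> [1]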

  18. Automated Microbiological Detection/Identification System

    PubMed Central

    Aldridge, C.; Jones, P. W.; Gibson, S.; Lanham, J.; Meyer, M.; Vannest, R.; Charles, R.

    1977-01-01

    An automated, computerized system, the AutoMicrobic System, has been developed for the detection, enumeration, and identification of bacteria and yeasts in clinical specimens. The biological basis for the system resides in lyophilized, highly selective and specific media enclosed in wells of a disposable plastic cuvette; introduction of a suitable specimen rehydrates and inoculates the media in the wells. An automated optical system monitors, and the computer interprets, changes in the media, with enumeration and identification results automatically obtained in 13 h. Sixteen different selective media were developed and tested with a variety of seeded (simulated) and clinical specimens. The AutoMicrobic System has been extensively tested with urine specimens, using a urine test kit (Identi-Pak) that contains selective media for Escherichia coli, Proteus species, Pseudomonas aeruginosa, Klebsiella-Enterobacter species, Serratia species, Citrobacter freundii, group D enterococci, Staphylococcus aureus, and yeasts (Candida species and Torulopsis glabrata). The system has been tested with 3,370 seeded urine specimens and 1,486 clinical urines. Agreement with simultaneous conventional (manual) cultures, at levels of 70,000 colony-forming units per ml (or more), was 92% or better for seeded specimens; clinical specimens yielded results of 93% or better for all organisms except P. aeruginosa, where agreement was 86%. System expansion in progress includes antibiotic susceptibility testing and compatibility with most types of clinical specimens. PMID:334798

  19. Fuzzy Control/Space Station automation

    NASA Technical Reports Server (NTRS)

    Gersh, Mark

    1990-01-01

    Viewgraphs on fuzzy control/space station automation are presented. Topics covered include: Space Station Freedom (SSF); SSF evolution; factors pointing to automation & robotics (A&R); astronaut office inputs concerning A&R; flight system automation and ground operations applications; transition definition program; and advanced automation software tools.

  20. Parallel Wavefront Analysis for a 4D Interferometer

    NASA Technical Reports Server (NTRS)

    Rao, Shanti R.

    2011-01-01

    This software provides a programming interface for automating data collection with a PhaseCam interferometer from 4D Technology, and for distributing the image-processing algorithm across a cluster of general-purpose computers. Multiple instances of 4Sight (4D Technology's proprietary software) run on a networked cluster of computers. Each connects to a single server (the controller) and waits for instructions. The controller directs the interferometer to capture several images, then assigns each image to a different computer for processing. When the image processing is finished, the server directs one of the computers to collate and combine the processed images, saving the resulting measurement in a file on disk. The available software captures approximately 100 images and analyzes them immediately. This software separates the capture and analysis processes, so that analysis can be done at a different time, and faster, by running the algorithm in parallel across several processors. The PhaseCam family of interferometers can measure an optical system in milliseconds, but it takes many seconds to process the data so that it is usable. In characterizing an adaptive optics system, like the next generation of astronomical observatories, thousands of measurements are required, and the processing time quickly becomes excessive. A programming interface distributes data processing for a PhaseCam interferometer across a Windows computing cluster. A scriptable controller program coordinates data acquisition from the interferometer, storage on networked hard disks, and parallel processing. Idle time of the interferometer is minimized. This architecture is implemented in Python and JavaScript, and may be altered to fit a customer's needs.
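
    The capture/distribute/collate pattern described can be mimicked locally in a few lines. A process pool stands in for the networked 4Sight clients, and the per-frame "processing" is a placeholder statistic, not PhaseCam wavefront analysis.

      import numpy as np
      from multiprocessing import Pool

      def process_frame(frame):
          # Placeholder for wavefront reconstruction of one interferogram.
          return float(np.std(frame))

      def main():
          rng = np.random.default_rng(0)
          frames = [rng.normal(size=(256, 256)) for _ in range(100)]  # capture
          with Pool() as pool:                                        # distribute
              results = pool.map(process_frame, frames)
          print("collated measurement:", float(np.mean(results)))    # collate

      if __name__ == "__main__":
          main()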

  1. Research in parallel computing

    NASA Technical Reports Server (NTRS)

    Ortega, James M.; Henderson, Charles

    1994-01-01

    This report summarizes work on parallel computations for NASA Grant NAG-1-1529 for the period 1 Jan. - 30 June 1994. Short summaries on highly parallel preconditioners, target-specific parallel reductions, and simulation of delta-cache protocols are provided.

  2. Cultural Hitchhiking in the Matrilineal Whales.

    PubMed

    Whitehead, Hal; Vachon, Felicia; Frasier, Timothy R

    2017-05-01

    Five species of whale with matrilineal social systems (daughters remain with mothers) have remarkably low levels of mitochondrial DNA diversity. Non-heritable matriline-level demography could reduce genetic diversity but the required conditions are not consistent with the natural histories of the matrilineal whales. The diversity of nuclear microsatellites is little reduced in the matrilineal whales arguing against bottlenecks. Selective sweeps of the mitochondrial genome are feasible causes but it is not clear why these only occurred in the matrilineal species. Cultural hitchhiking (cultural selection reducing diversity at neutral genetic loci transmitted in parallel to the culture) is supported in sperm whales which possess suitable matrilineal socio-cultural groups (coda clans). Killer whales are delineated into ecotypes which likely originated culturally. Culture, bottlenecks and selection, as well as their interactions, operating between- or within-ecotypes, may have reduced their mitochondrial diversity. The societies, cultures and genetics of false killer and two pilot whale species are insufficiently known to assess drivers of low mitochondrial diversity.

  3. The cultural side of science communication

    PubMed Central

    Medin, Douglas L.; Bang, Megan

    2014-01-01

    The main proposition of this paper is that science communication necessarily involves and includes cultural orientations. There is a substantial body of work showing that cultural differences in values and epistemological frameworks are paralleled with cultural differences reflected in artifacts and public representations. One dimension of cultural difference is the psychological distance between humans and the rest of nature. Another is perspective taking and attention to context and relationships. As an example of distance, most (Western) images of ecosystems do not include human beings, and European American discourse tends to position human beings as being apart from nature. Native American discourse, in contrast, tends to describe human beings as a part of nature. We trace the correspondences between cultural properties of media, focusing on children’s books, and cultural differences in biological cognition. Finally, implications for both science communication and science education are outlined. PMID:25225366

  4. Asleep at the automated wheel-Sleepiness and fatigue during highly automated driving.

    PubMed

    Vogelpohl, Tobias; Kühn, Matthias; Hummel, Thomas; Vollrath, Mark

    2018-03-20

    Due to the lack of active involvement in the driving situation and due to monotonous driving environments, drivers with automation may be prone to become fatigued faster than manual drivers (e.g. Schömig et al., 2015). However, little is known about the progression of fatigue during automated driving and its effects on the ability to take back manual control after a take-over request. In this driving simulator study with N = 60 drivers we used a three-factorial 2 × 2 × 12 mixed design to analyze the progression (12 × 5 min; within subjects) of driver fatigue in drivers with automation compared to manual drivers (between subjects). Driver fatigue was induced as either mainly sleep-related or mainly task-related fatigue (between subjects). Additionally, we investigated the drivers' reactions to a take-over request in a critical driving scenario to gain insights into the ability of fatigued drivers to regain manual control and situation awareness after automated driving. Drivers in the automated driving condition exhibited facial indicators of fatigue after 15 to 35 min of driving. Manual drivers only showed similar indicators of fatigue if they suffered from a lack of sleep, and then only after a longer period of driving (approx. 40 min). Several drivers in the automated condition closed their eyes for extended periods of time. In the driving with automation condition, mean automation deactivation times after a take-over request were slower for a certain percentage (about 30%) of the drivers with a lack of sleep (M = 3.2 s; SD = 2.1 s) compared to the reaction times after a long drive (M = 2.4 s; SD = 0.9 s). Drivers with automation also took longer than manual drivers to first glance at the speed display after a take-over request and were more likely to stay behind a braking lead vehicle instead of overtaking it. Drivers are unable to stay alert during extended periods of automated driving without non-driving related tasks. Fatigued drivers could

  5. Automated Instrumentation, Monitoring and Visualization of PVM Programs Using AIMS

    NASA Technical Reports Server (NTRS)

    Mehra, Pankaj; VanVoorst, Brian; Yan, Jerry; Lum, Henry, Jr. (Technical Monitor)

    1994-01-01

    We present views and analysis of the execution of several PVM (Parallel Virtual Machine) codes for Computational Fluid Dynamics on a network of Sparcstations, including: (1) NAS Parallel Benchmarks CG and MG; (2) a multi-partitioning algorithm for NAS Parallel Benchmark SP; and (3) an overset grid flowsolver. These views and analysis were obtained using our Automated Instrumentation and Monitoring System (AIMS) version 3.0, a toolkit for debugging the performance of PVM programs. We describe the architecture, operation and application of AIMS. The AIMS toolkit contains: (1) Xinstrument, which can automatically instrument various computational and communication constructs in message-passing parallel programs; (2) Monitor, a library of runtime trace-collection routines; (3) VK (Visual Kernel), an execution-animation tool with source-code clickback; and (4) Tally, a tool for statistical analysis of execution profiles. Currently, Xinstrument can handle C and Fortran 77 programs using PVM 3.2.x; Monitor has been implemented and tested on Sun 4 systems running SunOS 4.1.2; and VK uses X11R5 and Motif 1.2. Data and views obtained using AIMS clearly illustrate several characteristic features of executing parallel programs on networked workstations: (1) the impact of long message latencies; (2) the impact of multiprogramming overheads and associated load imbalance; (3) cache and virtual-memory effects; and (4) significant skews between workstation clocks. Interestingly, AIMS can compensate for constant skew (zero drift) by calibrating the skew between a parent and its spawned children. In addition, AIMS' skew-compensation algorithm can adjust timestamps in a way that eliminates physically impossible communications (e.g., messages going backwards in time). Our current efforts are directed toward creating new views to explain the observed performance of PVM programs. Some of the features planned for the near future include: (1) ConfigView, showing the physical topology
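
    The skew compensation described is easy to sketch: estimate a constant offset between parent and child clocks, shift the child's timestamps, then clamp any remaining physically impossible orderings. The event tuples and skew value below are invented for illustration.

      def compensate(events, skew):
          # events: (send_ts_parent, recv_ts_child); remove the estimated
          # constant skew from the child's clock readings.
          return [(s, r - skew) for s, r in events]

      def fix_causality(events):
          # Enforce recv >= send so no message travels backwards in time.
          return [(s, max(r, s)) for s, r in events]

      raw = [(10.0, 9.2), (11.0, 10.4)]      # child clock ~0.9 s behind parent
      calibrated = compensate(raw, skew=-0.9)
      print(fix_causality(calibrated))       # all receives now follow sends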

  6. Elements of EAF automation processes

    NASA Astrophysics Data System (ADS)

    Ioana, A.; Constantin, N.; Dragna, E. C.

    2017-01-01

    Our article presents elements of Electric Arc Furnace (EAF) automation. We present and analyze in detail two automation schemes: the scheme of the electrical EAF automation system, and the scheme of the thermal EAF automation system. The results of applying these automation schemes consist in a significant reduction of the specific consumption of electrical energy of the Electric Arc Furnace, increased productivity of the Electric Arc Furnace, improved quality of the produced steel, and increased durability of the structural elements of the Electric Arc Furnace.

  7. The Galley Parallel File System

    NASA Technical Reports Server (NTRS)

    Nieuwejaar, Nils; Kotz, David

    1996-01-01

    As the I/O needs of parallel scientific applications increase, file systems for multiprocessors are being designed to provide applications with parallel access to multiple disks. Many parallel file systems present applications with a conventional Unix-like interface that allows the application to access multiple disks transparently. The interface conceals the parallelism within the file system, which increases the ease of programmability, but makes it difficult or impossible for sophisticated programmers and libraries to use knowledge about their I/O needs to exploit that parallelism. Furthermore, most current parallel file systems are optimized for a different workload than they are being asked to support. We introduce Galley, a new parallel file system that is intended to efficiently support realistic parallel workloads. We discuss Galley's file structure and application interface, as well as an application that has been implemented using that interface.

  8. Solution of task related to control of swiss-type automatic lathe to get planes parallel to part axis

    NASA Astrophysics Data System (ADS)

    Tabekina, N. A.; Chepchurov, M. S.; Evtushenko, E. I.; Dmitrievsky, B. S.

    2018-05-01

    This work addresses the automation of a machining process, namely turning, to produce parts with planes parallel to the axis of rotation without using special tools. The results show that equipping the lathe with a high-speed electromechanical drive to control its operative movements makes it possible to obtain planes parallel to the part axis. The method is based on a mathematical model expressing the conveying velocity of the driven element as a function of time; it describes the operative movements of the lathe over the entire tool path. Using this model of tool movement, it was found that the conveying velocity varies from its maximum to zero, which allows the drive to be reversed. A scheme of tool placement relative to the workpiece is proposed for unidirectional movement of the driven element at high conveying velocity. The control method can be used on CNC lathes to produce geometrically complex parts without special milling tools.

  9. Parallel digital forensics infrastructure.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liebrock, Lorie M.; Duggan, David Patrick

    2009-10-01

    This report documents the architecture and implementation of a Parallel Digital Forensics infrastructure. This infrastructure is necessary for supporting the design, implementation, and testing of new classes of parallel digital forensics tools. Digital forensics has become extremely difficult with data sets of one terabyte and larger. The only way to overcome the processing time of these large sets is to identify and develop new parallel algorithms for performing the analysis. To support algorithm research, a flexible base infrastructure is required. A candidate architecture for this base infrastructure was designed, instantiated, and tested by this project, in collaboration with New Mexico Tech. Previous infrastructures were not designed and built specifically for the development and testing of parallel algorithms. With the size of forensics data sets only expected to increase significantly, this type of infrastructure support is necessary for continued research in parallel digital forensics. This report documents the implementation of the parallel digital forensics (PDF) infrastructure architecture.

  10. 77 FR 48527 - National Customs Automation Program (NCAP) Test Concerning Automated Commercial Environment (ACE...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-14

    ... Program (NCAP) Test Concerning Automated Commercial Environment (ACE) Simplified Entry: Modification of... Automated Commercial Environment (ACE). The test's participant selection criteria are modified to reflect... (NCAP) test concerning Automated Commercial Environment (ACE) Simplified Entry functionality (Simplified...

  11. Parallel processing and expert systems

    NASA Technical Reports Server (NTRS)

    Yan, Jerry C.; Lau, Sonie

    1991-01-01

    Whether it be monitoring the thermal subsystem of Space Station Freedom, or controlling the navigation of the autonomous rover on Mars, NASA missions in the 90's cannot enjoy an increased level of autonomy without the efficient use of expert systems. Merely increasing the computational speed of uniprocessors may not be able to guarantee that real time demands are met for large expert systems. Speed-up via parallel processing must be pursued alongside the optimization of sequential implementations. Prototypes of parallel expert systems have been built at universities and industrial labs in the U.S. and Japan. The state-of-the-art research in progress related to parallel execution of expert systems was surveyed. The survey is divided into three major sections: (1) multiprocessors for parallel expert systems; (2) parallel languages for symbolic computations; and (3) measurements of parallelism of expert systems. Results to date indicate that the parallelism achieved for these systems is small. In order to obtain greater speed-ups, data parallelism and application parallelism must be exploited.

  12. Automation effects in a stereotypical multiloop manual control system. [for aircraft

    NASA Technical Reports Server (NTRS)

    Hess, R. A.; Mcnally, B. D.

    1984-01-01

    The increasing reliance of state-of-the-art, high performance aircraft on high authority stability and command augmentation systems, in order to obtain satisfactory performance and handling qualities, has made it critical to achieve a better understanding of human capabilities, limitations, and preferences during interactions with complex dynamic systems that involve task allocation between man and machine. An analytical and experimental study has been undertaken to investigate human interaction with a simple, multiloop dynamic system in which human activity was systematically varied by changing the levels of automation. Task definition has led to a control loop structure which parallels that of any multiloop manual control system, and may therefore be considered a stereotype.

  13. Performance of automated multiplex PCR using sonication fluid for diagnosis of periprosthetic joint infection: a prospective cohort.

    PubMed

    Renz, Nora; Feihl, Susanne; Cabric, Sabrina; Trampuz, Andrej

    2017-12-01

    Sonication of explanted prostheses improved the microbiological diagnosis of periprosthetic joint infections (PJI). We evaluated the performance of automated multiplex polymerase chain reaction (PCR) using sonication fluid for the microbiological diagnosis of PJI. In a prospective cohort using uniform definition criteria for PJI, explanted joint prostheses were investigated by sonication and the resulting sonication fluid was analyzed by culture and multiplex PCR. McNemar's Chi-squared test was used to compare the performance of diagnostic tests. Among 111 patients, PJI was diagnosed in 78 (70%) and aseptic failure in 33 (30%). For the diagnosis of PJI, the sensitivity and specificity of periprosthetic tissue culture was 51 and 100%, of sonication fluid culture 58 and 100%, and of sonication fluid PCR 51 and 94%, respectively. Among 70 microorganisms, periprosthetic tissue culture grew 52 (74%), sonication fluid culture grew 50 (71%) and sonication fluid PCR detected 37 pathogens (53%). If only organisms are considered, for which primers are included in the test panel, PCR detected 37 of 58 pathogens (64%). The sonication fluid PCR missed 19 pathogens (predominantly oral streptococci and anaerobes), whereas 7 additional microorganisms were detected only by PCR (including Cutibacterium spp. and coagulase-negative staphylococci). The performance of multiplex PCR using sonication fluid is comparable to culture of periprosthetic tissue or sonication fluid. The advantages of PCR are short processing time (< 5 h) and fully automated procedure. However, culture technique is still needed due to the low sensitivity and the need of comprehensive susceptibility testing. Modification of primers or inclusion of additional ones may improve the performance of PCR, especially of low-virulent organisms.
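
    The reported figures come from a standard 2x2 confusion table. The snippet below back-calculates cell counts consistent with the cohort (78 PJI, 33 aseptic) and the sonication fluid PCR results (51% sensitivity, 94% specificity); the exact counts are reconstructed for illustration, not quoted from the paper.

      def sens_spec(tp, fn, tn, fp):
          # Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP).
          return tp / (tp + fn), tn / (tn + fp)

      sens, spec = sens_spec(tp=40, fn=38, tn=31, fp=2)   # 78 PJI, 33 aseptic
      print(f"sensitivity {sens:.0%}, specificity {spec:.0%}")  # 51%, 94%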

  14. Automation in College Libraries.

    ERIC Educational Resources Information Center

    Werking, Richard Hume

    1991-01-01

    Reports the results of a survey of the "Bowdoin List" group of liberal arts colleges. The survey obtained information about (1) automation modules in place and when they had been installed; (2) financing of automation and its impacts on the library budgets; and (3) library director's views on library automation and the nature of the…

  15. Automation: Decision Aid or Decision Maker?

    NASA Technical Reports Server (NTRS)

    Skitka, Linda J.

    1998-01-01

    This study clarified that automation bias is something unique to automated decision-making contexts, and is not the result of a general tendency toward complacency. By comparing performance on exactly the same events on the same tasks with and without an automated decision aid, we were able to determine that at least the omission-error part of automation bias is due to the unique context created by having an automated decision aid, and is not a phenomenon that would occur even if people were not in an automated context. However, this study also revealed that having an automated decision aid did lead to modestly improved performance across all non-error events. Participants in the non-automated condition responded with 83.68% accuracy, whereas participants in the automated condition responded with 88.67% accuracy, across all events. Automated decision aids clearly led to better overall performance when they were accurate. People performed almost exactly at the reliability level of the automation (which across events was 88% reliable). However, it is also clear that the presence of less than 100% accurate automated decision aids creates a context in which new kinds of errors in decision making can occur. Participants in the non-automated condition responded with 97% accuracy on the six "error" events, whereas participants in the automated condition had only a 65% accuracy rate when confronted with those same six events. In short, the presence of an AMA can lead to vigilance decrements that can lead to errors in decision making.

  16. Data communications in a parallel active messaging interface of a parallel computer

    DOEpatents

    Archer, Charles J; Blocksome, Michael A; Ratterman, Joseph D; Smith, Brian E

    2013-11-12

    Data communications in a parallel active messaging interface (`PAMI`) of a parallel computer composed of compute nodes that execute a parallel application, each compute node including application processors that execute the parallel application and at least one management processor dedicated to gathering information regarding data communications. The PAMI is composed of data communications endpoints, each endpoint composed of a specification of data communications parameters for a thread of execution on a compute node, including specifications of a client, a context, and a task, the compute nodes and the endpoints coupled for data communications through the PAMI and through data communications resources. Embodiments function by gathering call site statistics describing data communications resulting from execution of data communications instructions and identifying, in dependence upon the call site statistics, a data communications algorithm for use in executing a data communications instruction at a call site in the parallel application.

  17. Emerging Microtechnologies and Automated Systems for Rapid Bacterial Identification and Antibiotic Susceptibility Testing

    PubMed Central

    Li, Yiyan; Yang, Xing; Zhao, Weian

    2018-01-01

    Rapid bacterial identification (ID) and antibiotic susceptibility testing (AST) are in great demand due to the rise of drug-resistant bacteria. Conventional culture-based AST methods suffer from a long turnaround time. By necessity, physicians often have to treat patients empirically with antibiotics, which has led to an inappropriate use of antibiotics, an elevated mortality rate and healthcare costs, and antibiotic resistance. Recent advances in miniaturization and automation provide promising solutions for rapid bacterial ID/AST profiling, which will potentially make a significant impact in the clinical management of infectious diseases and antibiotic stewardship in the coming years. In this review, we summarize and analyze representative emerging micro- and nanotechnologies, as well as automated systems for bacterial ID/AST, including both phenotypic (e.g., microfluidic-based bacterial culture, and digital imaging of single cells) and molecular (e.g., multiplex PCR, hybridization probes, nanoparticles, synthetic biology tools, mass spectrometry, and sequencing technologies) methods. We also discuss representative point-of-care (POC) systems that integrate sample processing, fluid handling, and detection for rapid bacterial ID/AST. Finally, we highlight major remaining challenges and discuss potential future endeavors toward improving clinical outcomes with rapid bacterial ID/AST technologies. PMID:28850804

  18. Emerging Microtechnologies and Automated Systems for Rapid Bacterial Identification and Antibiotic Susceptibility Testing.

    PubMed

    Li, Yiyan; Yang, Xing; Zhao, Weian

    2017-12-01

    Rapid bacterial identification (ID) and antibiotic susceptibility testing (AST) are in great demand due to the rise of drug-resistant bacteria. Conventional culture-based AST methods suffer from a long turnaround time. By necessity, physicians often have to treat patients empirically with antibiotics, which has led to an inappropriate use of antibiotics, an elevated mortality rate and healthcare costs, and antibiotic resistance. Recent advances in miniaturization and automation provide promising solutions for rapid bacterial ID/AST profiling, which will potentially make a significant impact in the clinical management of infectious diseases and antibiotic stewardship in the coming years. In this review, we summarize and analyze representative emerging micro- and nanotechnologies, as well as automated systems for bacterial ID/AST, including both phenotypic (e.g., microfluidic-based bacterial culture, and digital imaging of single cells) and molecular (e.g., multiplex PCR, hybridization probes, nanoparticles, synthetic biology tools, mass spectrometry, and sequencing technologies) methods. We also discuss representative point-of-care (POC) systems that integrate sample processing, fluid handling, and detection for rapid bacterial ID/AST. Finally, we highlight major remaining challenges and discuss potential future endeavors toward improving clinical outcomes with rapid bacterial ID/AST technologies.

  19. Parallel Algorithms and Patterns

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robey, Robert W.

    2016-06-16

    This is a PowerPoint presentation on parallel algorithms and patterns. A parallel algorithm is a well-defined, step-by-step computational procedure that emphasizes concurrency to solve a problem. Examples of such problems include sorting, searching, optimization, and matrix operations. A parallel pattern is a computational step in a sequence of independent, potentially concurrent operations that occurs in diverse scenarios with some frequency. Examples are reductions, prefix scans, and ghost cell updates. We only touch on parallel patterns in this presentation; the topic really deserves its own detailed discussion, which Gabe Rockefeller would like to develop.
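
    To make two of the named patterns concrete, here is a minimal sketch assuming a simple four-way decomposition and only Python's standard library (the chunking scheme is an illustrative choice, not from the presentation):

        from multiprocessing import Pool
        from itertools import accumulate

        def chunk_sum(chunk):
            """Per-worker partial reduction."""
            return sum(chunk)

        if __name__ == "__main__":
            data = list(range(1_000_000))
            chunks = [data[i::4] for i in range(4)]  # 4-way decomposition

            with Pool(processes=4) as pool:
                partials = pool.map(chunk_sum, chunks)  # parallel partial sums
            total = sum(partials)                       # final combine step

            # Prefix scan over the per-chunk partials: entry i is the sum
            # of all chunks up to and including chunk i.
            offsets = list(accumulate(partials))
            print(total, offsets)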

  20. Application Portable Parallel Library

    NASA Technical Reports Server (NTRS)

    Cole, Gary L.; Blech, Richard A.; Quealy, Angela; Townsend, Scott

    1995-01-01

    The Application Portable Parallel Library (APPL) computer program is a subroutine-based message-passing software library intended to provide a consistent interface to the variety of multiprocessor computers on the market today. It minimizes the effort needed to move an application program from one computer to another: the user develops the application program once and can then easily move it from the parallel computer on which it was created to another parallel computer ("parallel computer" here also includes a heterogeneous collection of networked computers). APPL is written in the C language, with one FORTRAN 77 subroutine for UNIX-based computers, and is callable from application programs written in C or FORTRAN 77.
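
    The coordinator/worker exchange that such a library abstracts looks roughly like the following; this sketch uses mpi4py purely as a familiar stand-in, since APPL's own subroutine names are not given in the record:

        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()

        if rank == 0:
            # Coordinator: send one work item to each worker, collect replies.
            for worker in range(1, comm.Get_size()):
                comm.send({"task": worker}, dest=worker, tag=0)
            results = [comm.recv(source=w, tag=1)
                       for w in range(1, comm.Get_size())]
            print(results)
        else:
            task = comm.recv(source=0, tag=0)
            comm.send(task["task"] ** 2, dest=0, tag=1)

    Run with, e.g., mpiexec -n 4 python appl_sketch.py (a hypothetical file name); the same source moves unchanged between machines with an MPI implementation, which is the portability property APPL aimed to provide.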

  1. Parallel Logic Programming and Parallel Systems Software and Hardware

    DTIC Science & Technology

    1989-07-29

    Tools were provided for software development using artificial intelligence techniques, and AI software for massively parallel architectures was started. We describe research conducted

  2. Laboratory Automation and Middleware.

    PubMed

    Riben, Michael

    2015-06-01

    The practice of surgical pathology is under constant pressure to deliver the highest quality of service, reduce errors, increase throughput, and decrease turnaround time, while at the same time dealing with an aging workforce, increasing financial constraints, and economic uncertainty. Although total laboratory automation has not yet been implemented, great progress continues to be made in workstation automation in all areas of the pathology laboratory. This report highlights the benefits and challenges of pathology automation, reviews middleware and its use to facilitate automation, and reviews the progress so far in the anatomic pathology laboratory. Copyright © 2015 Elsevier Inc. All rights reserved.

  3. Diagnostic performance of blood culture bottles for vitreous culture compared to conventional microbiological cultures in patients with suspected endophthalmitis.

    PubMed

    Kehrmann, Jan; Chapot, Valerie; Buer, Jan; Rating, Philipp; Bornfeld, Norbert; Steinmann, Joerg

    2018-05-01

    The purpose of this investigation was to evaluate the performance of blood culture bottles in comparison to conventional microbiological culture techniques in detecting causative microorganisms of endophthalmitis and to determine their anti-infective susceptibility profiles. All consecutive cases with clinically suspected endophthalmitis in a university-based ophthalmology department between January 2009 and December 2016 were analysed in this retrospective comparative case series. Samples from 247 patients with suspected endophthalmitis underwent microbiological diagnostic work-up. All three culture methods were performed from 140 vitreous specimens. Vitreous fluid specimens were inoculated in blood culture bottles, aerobic and anaerobic broth solutions, and on solid media. Anti-infective susceptibility profiles were evaluated by semi-automated methods and/or gradient diffusion methods. Microorganisms were grown in 82 of 140 specimens for which all methods were performed (59%). Microorganisms were more frequently grown from blood culture bottles (55%) compared to broth solution (45%, p = 0.007) and solid media (33%, p < 0.0001). Considerable differences in the performance among culture media were detected for fungal pathogens. All grown fungi were detected by blood culture bottles (11 of 11, 100%). Broth solution recovered 64% and solid media 46% of grown fungi. No Gram-positive bacterium was resistant to vancomycin and all Gram-negative pathogens except for one isolate were susceptible to third-generation cephalosporins. In suspected endophthalmitis patients, blood culture bottles have a higher overall pathogen detection rate from vitreous fluid compared to conventional microbiological media, especially for fungi. The initial intravitreal antibiotic therapy with vancomycin plus third-generation cephalosporins appears to be an appropriate treatment approach for bacterial endophthalmitis.

  4. Toward automated formation of microsphere arrangements using multiplexed optical tweezers

    NASA Astrophysics Data System (ADS)

    Rajasekaran, Keshav; Bollavaram, Manasa; Banerjee, Ashis G.

    2016-09-01

    Optical tweezers offer certain advantages such as multiplexing using a programmable spatial light modulator, flexibility in the choice of the manipulated object and the manipulation medium, precise control, easy object release, and minimal object damage. However, automated manipulation of multiple objects in parallel, which is essential for efficient and reliable formation of micro-scale assembly structures, poses a difficult challenge. There are two primary research issues in addressing this challenge. First, the presence of a stochastic Langevin force giving rise to Brownian motion requires motion control for all the manipulated objects at fast rates of several Hz. Second, the object dynamics are non-linear and even difficult to represent analytically due to the interaction of multiple optical traps that are manipulating neighboring objects. As a result, automated controllers have not been realized for tens of objects, particularly with three-dimensional motions and guaranteed collision avoidance. In this paper, we model the effect of interacting optical traps on microspheres with significant Brownian motions in stationary fluid media, and develop simplified state-space representations. These representations are used to design a model predictive controller to coordinate the motions of several spheres in real time. Preliminary experiments demonstrate the utility of the controller in automatically forming desired arrangements of varying configurations starting with randomly dispersed microspheres.
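
    For readers unfamiliar with the control scheme, a bare-bones unconstrained finite-horizon model predictive controller for a generic linear state-space model x(k+1) = A x(k) + B u(k) can be sketched as below; the matrices, horizon, and penalty are illustrative assumptions, not the authors' trap model:

        import numpy as np

        def mpc_first_input(A, B, x0, x_ref, horizon, r=0.1):
            """Minimize sum_k ||x_k - x_ref||^2 + r ||u_k||^2 by stacking
            predictions X = Phi @ x0 + Gamma @ U and solving for U; return
            only the first input (receding-horizon use)."""
            n, m = B.shape
            Phi = np.vstack([np.linalg.matrix_power(A, i + 1)
                             for i in range(horizon)])
            Gamma = np.zeros((horizon * n, horizon * m))
            for i in range(horizon):
                for j in range(i + 1):
                    Gamma[i*n:(i+1)*n, j*m:(j+1)*m] = (
                        np.linalg.matrix_power(A, i - j) @ B)
            target = np.tile(x_ref, horizon) - Phi @ x0
            U = np.linalg.solve(Gamma.T @ Gamma + r * np.eye(horizon * m),
                                Gamma.T @ target)
            return U[:m]

        # Toy 1-D double integrator standing in for one bead axis.
        dt = 0.01
        A = np.array([[1.0, dt], [0.0, 1.0]])
        B = np.array([[0.0], [dt]])
        u0 = mpc_first_input(A, B, x0=np.array([1.0, 0.0]),
                             x_ref=np.array([0.0, 0.0]), horizon=20)
        print(u0)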

  5. Self-Scheduling Parallel Methods for Multiple Serial Codes with Application to WOPWOP

    NASA Technical Reports Server (NTRS)

    Long, Lyle N.; Brentner, Kenneth S.

    2000-01-01

    This paper presents a scheme for efficiently running a large number of serial jobs on parallel computers. Two examples are given of computer programs that run relatively quickly, but often they must be run numerous times to obtain all the results needed. It is very common in science and engineering to have codes that are not massive computing challenges in themselves, but due to the number of instances that must be run, they do become large-scale computing problems. The two examples given here represent common problems in aerospace engineering: aerodynamic panel methods and aeroacoustic integral methods. The first example simply solves many systems of linear equations. This is representative of an aerodynamic panel code where someone would like to solve for numerous angles of attack. The complete code for this first example is included in the appendix so that it can be readily used by others as a template. The second example is an aeroacoustics code (WOPWOP) that solves the Ffowcs Williams Hawkings equation to predict the far-field sound due to rotating blades. In this example, one quite often needs to compute the sound at numerous observer locations, hence parallelization is utilized to automate the noise computation for a large number of observers.
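
    The self-scheduling pattern the paper describes is easy to emulate with a process pool: each serial job (here, one small linear solve per angle of attack, with made-up matrices rather than a real panel code) is handed to the next free worker:

        import numpy as np
        from concurrent.futures import ProcessPoolExecutor

        def solve_case(alpha):
            """One independent serial job for one angle of attack."""
            rng = np.random.default_rng(seed=int(alpha * 10))
            A = rng.standard_normal((200, 200)) + 200 * np.eye(200)
            b = np.full(200, alpha)
            return alpha, np.linalg.solve(A, b).sum()

        if __name__ == "__main__":
            angles = [0.5 * a for a in range(32)]  # 32 independent cases
            # Idle workers pull the next case -- no static assignment.
            with ProcessPoolExecutor(max_workers=4) as pool:
                for alpha, result in pool.map(solve_case, angles):
                    print(f"alpha={alpha:4.1f} -> {result:.4f}")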

  6. Endpoint-based parallel data processing with non-blocking collective instructions in a parallel active messaging interface of a parallel computer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Archer, Charles J; Blocksome, Michael A; Cernohous, Bob R

    Methods, apparatuses, and computer program products for endpoint-based parallel data processing with non-blocking collective instructions in a parallel active messaging interface (`PAMI`) of a parallel computer are provided. Embodiments include establishing by a parallel application a data communications geometry, the geometry specifying a set of endpoints that are used in collective operations of the PAMI, including associating with the geometry a list of collective algorithms valid for use with the endpoints of the geometry. Embodiments also include registering in each endpoint in the geometry a dispatch callback function for a collective operation and executing without blocking, through a single one of the endpoints in the geometry, an instruction for the collective operation.

  7. Managing laboratory automation

    PubMed Central

    Saboe, Thomas J.

    1995-01-01

    This paper discusses the process of managing automated systems through their life cycles within the quality-control (QC) laboratory environment. The focus is on the process of directing and managing the evolving automation of a laboratory; system examples are given. The author shows how both task and data systems have evolved, and how they interrelate. A BIG picture, or continuum view, is presented and some of the reasons for success or failure of the various examples cited are explored. Finally, some comments on future automation needs are discussed. PMID:18925018

  8. Managing laboratory automation.

    PubMed

    Saboe, T J

    1995-01-01

    This paper discusses the process of managing automated systems through their life cycles within the quality-control (QC) laboratory environment. The focus is on the process of directing and managing the evolving automation of a laboratory; system examples are given. The author shows how both task and data systems have evolved, and how they interrelate. A BIG picture, or continuum view, is presented and some of the reasons for success or failure of the various examples cited are explored. Finally, some comments on future automation needs are discussed.

  9. The Science of Home Automation

    NASA Astrophysics Data System (ADS)

    Thomas, Brian Louis

    Smart home technologies and the concept of home automation have become more popular in recent years. This popularity has been accompanied by social acceptance of passive sensors installed throughout the home. The subsequent increase in smart homes facilitates the creation of home automation strategies. We believe that home automation strategies can be generated intelligently by utilizing smart home sensors and activity learning. In this dissertation, we hypothesize that home automation can benefit from activity awareness. To test this, we develop our activity-aware smart automation system, CARL (CASAS Activity-aware Resource Learning). CARL learns the associations between activities and device usage from historical data and utilizes the activity-aware capabilities to control the devices. To help validate CARL, we deploy and test three different versions of the automation system in a real-world smart environment. To provide a foundation of activity learning, we integrate existing activity recognition and activity forecasting into CARL home automation. We also explore two alternatives to using human-labeled data to train the activity learning models. The first unsupervised method is Activity Detection, and the second is a modified DBSCAN algorithm that utilizes Dynamic Time Warping (DTW) as a distance metric. We compare the performance of activity learning with human-defined labels and with automatically-discovered activity categories. To provide evidence in support of our hypothesis, we evaluate CARL automation in a smart home testbed. Our results indicate that home automation can be boosted through activity awareness. We also find that the resulting automation has a high degree of usability and comfort for the smart home resident.
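
    The DBSCAN-with-DTW idea mentioned above can be sketched compactly; the DTW routine and the toy series below are illustrative, not the dissertation's code:

        import numpy as np
        from sklearn.cluster import DBSCAN

        def dtw_distance(a, b):
            """Classic O(len(a) * len(b)) dynamic-programming DTW."""
            n, m = len(a), len(b)
            D = np.full((n + 1, m + 1), np.inf)
            D[0, 0] = 0.0
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    cost = abs(a[i - 1] - b[j - 1])
                    D[i, j] = cost + min(D[i - 1, j], D[i, j - 1],
                                         D[i - 1, j - 1])
            return D[n, m]

        # Toy 1-D "activity traces"; real sensor sequences would go here.
        t = np.linspace(0, 6, 50)
        series = [np.sin(t), np.sin(t) + 0.05, np.cos(t)]

        # DBSCAN accepts a precomputed distance matrix, so pairwise DTW
        # distances replace the usual Euclidean metric.
        dist = np.array([[dtw_distance(s, q) for q in series] for s in series])
        labels = DBSCAN(eps=5.0, min_samples=1,
                        metric="precomputed").fit_predict(dist)
        print(labels)  # the two sine traces cluster together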

  10. By Hand or Not By-Hand: A Case Study of Alternative Approaches to Parallelize CFD Applications

    NASA Technical Reports Server (NTRS)

    Yan, Jerry C.; Bailey, David (Technical Monitor)

    1997-01-01

    While parallel processing promises to speed up applications by several orders of magnitude, the performance achieved still depends upon several factors, including the multiprocessor architecture, system software, data distribution and alignment, as well as the methods used for partitioning the application and mapping its components onto the architecture. The existence of the Gordon Bell Prize given out at Supercomputing every year suggests that while good performance can be attained for real applications on general-purpose multiprocessors, the large investment in man-power and time still has to be repeated for each application-machine combination. As applications and machine architectures become more complex, the cost and time-delays for obtaining performance by hand will become prohibitive. Computer users today can turn to three possible avenues for help: parallel libraries, parallel languages and compilers, and interactive parallelization tools. The success of these methodologies, in turn, depends on proper application of data dependency analysis, program structure recognition and transformation, and performance prediction, as well as exploitation of user-supplied knowledge. NASA has been developing multidisciplinary applications on highly parallel architectures under the High Performance Computing and Communications Program. Over the past six years, transitions of the underlying hardware and system software have forced the scientists to spend a large effort to migrate and recode their applications. Various attempts to exploit software tools to automate the parallelization process have not produced favorable results. In this paper, we report our most recent experience with CAPTOOL, a package developed at Greenwich University. We have chosen CAPTOOL for three reasons: 1. CAPTOOL accepts a FORTRAN 77 program as input. This suggests its potential applicability to a large collection of legacy codes currently in use. 2. CAPTOOL employs domain decomposition to obtain parallelism

  11. Neural Parallel Engine: A toolbox for massively parallel neural signal processing.

    PubMed

    Tam, Wing-Kin; Yang, Zhi

    2018-05-01

    Large-scale neural recordings provide detailed information on neuronal activities and can help elucidate the underlying neural mechanisms of the brain. However, the computational burden is also formidable when we try to process the huge data stream generated by such recordings. In this study, we report the development of Neural Parallel Engine (NPE), a toolbox for massively parallel neural signal processing on graphical processing units (GPUs). It offers a selection of the most commonly used routines in neural signal processing such as spike detection and spike sorting, including advanced algorithms such as exponential-component-power-component (EC-PC) spike detection and binary pursuit spike sorting. We also propose a new method for detecting peaks in parallel through a parallel compact operation. Our toolbox is able to offer a 5× to 110× speedup compared with its CPU counterparts, depending on the algorithm. A user-friendly MATLAB interface is provided to allow easy integration of the toolbox into existing workflows. Previous efforts on GPU neural signal processing only focused on a few rudimentary algorithms, were not well-optimized, and often did not provide a user-friendly programming interface to fit into existing workflows. There is a strong need for a comprehensive toolbox for massively parallel neural signal processing. A new toolbox for massively parallel neural signal processing has been created. It can offer significant speedup in processing signals from large-scale recordings up to thousands of channels. Copyright © 2018 Elsevier B.V. All rights reserved.
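
    The detect-then-compact structure of parallel peak detection can be illustrated in vectorized form; the sketch below uses NumPy with a simple amplitude threshold standing in for the toolbox's EC-PC detector (swapping NumPy for CuPy would move the same code to a GPU):

        import numpy as np

        def detect_peaks(signal, thresh):
            """A sample is a peak if it exceeds both neighbors and the
            threshold; all samples are tested simultaneously, which is
            what maps well onto GPU threads."""
            s = np.asarray(signal)
            is_peak = (s[1:-1] > s[:-2]) & (s[1:-1] > s[2:]) & (s[1:-1] > thresh)
            # Compaction: keep only the indices where the predicate holds.
            return np.flatnonzero(is_peak) + 1

        rng = np.random.default_rng(0)
        trace = rng.standard_normal(100_000)
        peaks = detect_peaks(trace, thresh=3.0)
        print(len(peaks), peaks[:10])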

  12. Automation in organizations: Eternal conflict

    NASA Technical Reports Server (NTRS)

    Dieterly, D. L.

    1981-01-01

    Some ideas on and insights into the problems associated with automation in organizations are presented with emphasis on the concept of automation, its relationship to the individual, and its impact on system performance. An analogy is drawn, based on an American folk hero, to emphasize the extent of the problems encountered when dealing with automation within an organization. A model is proposed to focus attention on a set of appropriate dimensions. The function allocation process becomes a prominent aspect of the model. The current state of automation research is mentioned in relation to the ideas introduced. Proposed directions for an improved understanding of automation's effect on the individual's efficiency are discussed. The importance of understanding the individual's perception of the system in terms of the degree of automation is highlighted.

  13. Applications of Automation Methods for Nonlinear Fracture Test Analysis

    NASA Technical Reports Server (NTRS)

    Allen, Phillip A.; Wells, Douglas N.

    2013-01-01

    Using automated and standardized computer tools to calculate the pertinent test result values has several advantages: (1) allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form; (2) eliminating errors associated with the interpretation and programming of analysis procedures from the text of test standards; (3) lessening the need for expertise in the areas of solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling to achieve sound results; and (4) providing one computer tool and/or one set of solutions for all users for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to get the best practical solution based on the latest understanding with minimum difficulty. Other existing ASTM standards that cover complicated phenomena use standard computer programs: ASTM C1340/C1340M-10, Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program; ASTM F 2815, Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program; and ASTM E2807, Standard Specification for 3D Imaging Data Exchange, Version 1.0. The verification, validation, and round-robin processes required of a computer tool closely parallel the methods that are used to ensure the solution validity for equations included in a test standard. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense to the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.

  14. Data communications in a parallel active messaging interface of a parallel computer

    DOEpatents

    Archer, Charles J; Blocksome, Michael A; Ratterman, Joseph D; Smith, Brian E

    2014-02-11

    Data communications in a parallel active messaging interface ('PAMI') of a parallel computer, the parallel computer including a plurality of compute nodes that execute a parallel application, the PAMI composed of data communications endpoints, each endpoint including a specification of data communications parameters for a thread of execution on a compute node, including specification of a client, a context, and a task, the compute nodes and the endpoints coupled for data communications through the PAMI, including receiving a data communications instruction, the instruction characterized by an instruction type, the instruction specifying a transmission of transfer data from an origin endpoint to a target endpoint, and transmitting, in accordance with the instruction type, the transfer data from the origin endpoint to the target endpoint.

  15. Stromal vascular fraction isolated from lipo-aspirates using an automated processing system: bench and bed analysis.

    PubMed

    Doi, Kentaro; Tanaka, Shinsuke; Iida, Hideo; Eto, Hitomi; Kato, Harunosuke; Aoi, Noriyuki; Kuno, Shinichiro; Hirohi, Toshitsugu; Yoshimura, Kotaro

    2013-11-01

    The heterogeneous stromal vascular fraction (SVF), containing adipose-derived stem/progenitor cells (ASCs), can be easily isolated through enzymatic digestion of aspirated adipose tissue. In clinical settings, however, strict control of technical procedures according to standard operating procedures and validation of cell-processing conditions are required. Therefore, we evaluated the efficiency and reliability of an automated system for SVF isolation from adipose tissue. SVF cells, freshly isolated using the automated procedure, showed comparable number and viability to those from manual isolation. Flow cytometric analysis confirmed an SVF cell composition profile similar to that after manual isolation. In addition, the ASC yield after 1 week in culture was also not significantly different between the two groups. Our clinical study, in which SVF cells isolated with the automated system were transplanted with aspirated fat tissue for soft tissue augmentation/reconstruction in 42 patients, showed satisfactory outcomes with no serious side-effects. Taken together, our results suggested that the automated isolation system is as reliable a method as manual isolation and may also be useful in clinical settings. Automated isolation is expected to enable cell-based clinical trials in small facilities with an aseptic room, without the necessity of a good manufacturing practice-level cell processing area. Copyright © 2012 John Wiley & Sons, Ltd.

  16. Experience of automation failures in training: effects on trust, automation bias, complacency and performance.

    PubMed

    Sauer, Juergen; Chavaillaz, Alain; Wastell, David

    2016-06-01

    This work examined the effects of operators' exposure to various types of automation failures in training. Forty-five participants were trained for 3.5 h on a simulated process control environment. During training, participants either experienced a fully reliable, automatic fault repair facility (i.e. faults detected and correctly diagnosed), a misdiagnosis-prone one (i.e. faults detected but not correctly diagnosed) or a miss-prone one (i.e. faults not detected). One week after training, participants were tested for 3 h, experiencing two types of automation failures (misdiagnosis, miss). The results showed that automation bias was very high when operators trained on miss-prone automation encountered a failure of the diagnostic system. Operator errors resulting from automation bias were much higher when automation misdiagnosed a fault than when it missed one. Differences in trust levels that were instilled by the different training experiences disappeared during the testing session. Practitioner Summary: The experience of automation failures during training has some consequences. A greater potential for operator errors may be expected when an automatic system failed to diagnose a fault than when it failed to detect one.

  17. Automated vector selection of SIVQ and parallel computing integration MATLAB™: Innovations supporting large-scale and high-throughput image analysis studies.

    PubMed

    Cheng, Jerome; Hipp, Jason; Monaco, James; Lucas, David R; Madabhushi, Anant; Balis, Ulysses J

    2011-01-01

    Spatially invariant vector quantization (SIVQ) is a texture and color-based image matching algorithm that queries the image space through the use of ring vectors. In prior studies, the selection of one or more optimal vectors for a particular feature of interest required a manual process, with the user initially stochastically selecting candidate vectors and subsequently testing them upon other regions of the image to verify the vector's sensitivity and specificity properties (typically by reviewing a resultant heat map). In carrying out the prior efforts, the SIVQ algorithm was noted to exhibit highly scalable computational properties, where each region of analysis can take place independently of others, making a compelling case for the exploration of its deployment on high-throughput computing platforms, with the hypothesis that such an exercise will result in performance gains that scale linearly with increasing processor count. An automated process was developed for the selection of optimal ring vectors to serve as the predicate matching operator in defining histopathological features of interest. Briefly, candidate vectors were generated from every possible coordinate origin within a user-defined vector selection area (VSA) and subsequently compared against user-identified positive and negative "ground truth" regions on the same image. Each vector from the VSA was assessed for its goodness-of-fit to both the positive and negative areas via the use of the receiver operating characteristic (ROC) transfer function, with each assessment resulting in an associated area-under-the-curve (AUC) figure of merit. Use of the above-mentioned automated vector selection process was demonstrated in two use cases: first, to identify malignant colonic epithelium, and second, to identify soft tissue sarcoma. For both examples, a very satisfactory optimized vector was identified, as defined by the AUC metric. Finally, as an additional effort directed towards attaining high
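
    The scoring step reduces to ranking candidate vectors by AUC against labeled positive and negative regions; a minimal stand-in with synthetic match scores (not SIVQ itself) looks like this:

        import numpy as np
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(1)

        # One label per ground-truth region: 1 = positive, 0 = negative.
        labels = np.array([1] * 50 + [0] * 50)

        # Synthetic match scores for five candidate ring vectors; better
        # candidates separate the two classes more strongly.
        candidates = {
            f"vector_{i}": rng.normal(loc=labels * (0.5 + 0.2 * i), scale=1.0)
            for i in range(5)
        }

        # Rank candidates by area under the ROC curve; keep the best one.
        aucs = {name: roc_auc_score(labels, s) for name, s in candidates.items()}
        best = max(aucs, key=aucs.get)
        print(best, round(aucs[best], 3))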

  18. Automated Chemotactic Sorting and Single-cell Cultivation of Microbes using Droplet Microfluidics

    NASA Astrophysics Data System (ADS)

    Dong, Libing; Chen, Dong-Wei; Liu, Shuang-Jiang; Du, Wenbin

    2016-04-01

    We report a microfluidic device for automated sorting and cultivation of chemotactic microbes from pure cultures or mixtures. The device consists of two parts: in the first part, a concentration gradient of the chemoeffector was built across the channel for inducing chemotaxis of motile cells; in the second part, chemotactic cells from the sample were separated and mixed with culture media to form nanoliter droplets for encapsulation, cultivation, enumeration, and recovery of single cells. Chemotactic responses were assessed by imaging and statistical analysis of droplets based on the Poisson distribution. An automated procedure was developed for rapid enumeration of droplets with cell growth, followed by scale-up cultivation on agar plates. The performance of the device was evaluated by the chemotaxis assays of Escherichia coli (E. coli) RP437 and E. coli RP1616. Moreover, enrichment and isolation of non-labelled Comamonas testosteroni CNB-1 from its 1:10 mixture with E. coli RP437 was demonstrated. The enrichment factor reached 36.7 for CNB-1, based on its distinctive chemotaxis toward 4-hydroxybenzoic acid. We believe that this device can be widely used in chemotaxis studies without necessarily relying on fluorescent labelling, and in the isolation of functional microbial species from various environments.
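
    The Poisson analysis of droplet occupancy is straightforward to reproduce: with a mean of λ cells per droplet, the fraction of droplets holding exactly k cells is P(k) = λ^k e^(-λ) / k!. A short sketch with an assumed λ:

        from scipy.stats import poisson

        lam = 0.3  # assumed mean number of cells per droplet
        for k in range(4):
            print(f"P({k} cells per droplet) = {poisson.pmf(k, lam):.3f}")

        # Fraction of occupied droplets that hold exactly one cell -- the
        # quantity that matters for single-cell cultivation.
        single_given_occupied = poisson.pmf(1, lam) / (1 - poisson.pmf(0, lam))
        print(f"P(single cell | occupied) = {single_given_occupied:.3f}")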

  19. Automated Chemotactic Sorting and Single-cell Cultivation of Microbes using Droplet Microfluidics.

    PubMed

    Dong, Libing; Chen, Dong-Wei; Liu, Shuang-Jiang; Du, Wenbin

    2016-04-14

    We report a microfluidic device for automated sorting and cultivation of chemotactic microbes from pure cultures or mixtures. The device consists of two parts: in the first part, a concentration gradient of the chemoeffector was built across the channel for inducing chemotaxis of motile cells; in the second part, chemotactic cells from the sample were separated and mixed with culture media to form nanoliter droplets for encapsulation, cultivation, enumeration, and recovery of single cells. Chemotactic responses were assessed by imaging and statistical analysis of droplets based on the Poisson distribution. An automated procedure was developed for rapid enumeration of droplets with cell growth, followed by scale-up cultivation on agar plates. The performance of the device was evaluated by the chemotaxis assays of Escherichia coli (E. coli) RP437 and E. coli RP1616. Moreover, enrichment and isolation of non-labelled Comamonas testosteroni CNB-1 from its 1:10 mixture with E. coli RP437 was demonstrated. The enrichment factor reached 36.7 for CNB-1, based on its distinctive chemotaxis toward 4-hydroxybenzoic acid. We believe that this device can be widely used in chemotaxis studies without necessarily relying on fluorescent labelling, and in the isolation of functional microbial species from various environments.

  20. GROMACS 4.5: a high-throughput and highly parallel open source molecular simulation toolkit

    PubMed Central

    Pronk, Sander; Páll, Szilárd; Schulz, Roland; Larsson, Per; Bjelkmar, Pär; Apostolov, Rossen; Shirts, Michael R.; Smith, Jeremy C.; Kasson, Peter M.; van der Spoel, David; Hess, Berk; Lindahl, Erik

    2013-01-01

    Motivation: Molecular simulation has historically been a low-throughput technique, but faster computers and increasing amounts of genomic and structural data are changing this by enabling large-scale automated simulation of, for instance, many conformers or mutants of biomolecules with or without a range of ligands. At the same time, advances in performance and scaling now make it possible to model complex biomolecular interaction and function in a manner directly testable by experiment. These applications share a need for fast and efficient software that can be deployed at massive scale in clusters, web servers, distributed computing or cloud resources. Results: Here, we present a range of new simulation algorithms and features developed during the past 4 years, leading up to the GROMACS 4.5 software package. The software now automatically handles wide classes of biomolecules, such as proteins, nucleic acids and lipids, and comes with all commonly used force fields for these molecules built-in. GROMACS supports several implicit solvent models, as well as new free-energy algorithms, and the software now uses multithreading for efficient parallelization even on low-end systems, including Windows-based workstations. Together with hand-tuned assembly kernels and state-of-the-art parallelization, this provides extremely high performance and cost efficiency for high-throughput as well as massively parallel simulations. Availability: GROMACS is an open source and free software available from http://www.gromacs.org. Contact: erik.lindahl@scilifelab.se Supplementary information: Supplementary data are available at Bioinformatics online. PMID:23407358

  1. Automated Engineering Design (AED); An approach to automated documentation

    NASA Technical Reports Server (NTRS)

    Mcclure, C. W.

    1970-01-01

    The Automated Engineering Design (AED) system is reviewed; it consists of a high-level systems programming language, a series of modular precoded subroutines, and a set of powerful software machine tools that effectively automate the production and design of new languages. AED is used primarily for the development of problem- and user-oriented languages. Software production phases are diagramed, and factors which inhibit effective documentation are evaluated.

  2. Use of immunochromatographic assay for rapid identification of Mycobacterium tuberculosis complex from liquid culture

    PubMed Central

    Považan, Anika; Vukelić, Anka; Savković, Tijana; Kurucin, Tatjana

    2012-01-01

    A new, simple immunochromatographic assay for rapid identification of Mycobacterium tuberculosis complex in liquid cultures has been developed. The principle of the assay is binding of the Mycobacterium tuberculosis complex-specific antigen to the monoclonal antibody conjugated on the test strip. The aim of this study was to evaluate the performance of the immunochromatographic assay in identification of Mycobacterium tuberculosis complex in primary positive liquid cultures of the BacT/Alert automated system. A total of 159 primary positive liquid cultures were tested using the immunochromatographic assay (BD MGIT TBc ID) and conventional subculture, followed by identification using biochemical tests. Of 159 positive liquid cultures, using the conventional method, Mycobacterium tuberculosis was identified in 119 (74.8%), nontuberculous mycobacteria were found in 4 (2.5%), 14 (8.8%) cultures were contaminated and 22 (13.8%) cultures were found to be negative. Using the immunochromatographic assay, Mycobacterium tuberculosis complex was detected in 118 (74.2%) liquid cultures, and 41 (25.8%) tests were negative. The sensitivity, specificity, and positive and negative predictive values of the test were 98.3%, 97.5%, 99.15% and 95.12%, respectively. The value of the kappa test was 0.950, and that of the McNemar test was 1.00. The immunochromatographic assay is a simple and rapid test which represents a suitable alternative to the conventional subculture method for the primary identification of Mycobacterium tuberculosis complex in liquid cultures of the BacT/Alert automated system. PMID:22364301

  3. The Automated Office.

    ERIC Educational Resources Information Center

    Naclerio, Nick

    1979-01-01

    Clerical personnel may be able to climb career ladders as a result of office automation and expanded job opportunities in the word processing area. Suggests opportunities in an automated office system and lists books and periodicals on word processing for counselors and teachers. (MF)

  4. Separation and parallel sequencing of the genomes and transcriptomes of single cells using G&T-seq.

    PubMed

    Macaulay, Iain C; Teng, Mabel J; Haerty, Wilfried; Kumar, Parveen; Ponting, Chris P; Voet, Thierry

    2016-11-01

    Parallel sequencing of a single cell's genome and transcriptome provides a powerful tool for dissecting genetic variation and its relationship with gene expression. Here we present a detailed protocol for G&T-seq, a method for separation and parallel sequencing of genomic DNA and full-length polyA(+) mRNA from single cells. We provide step-by-step instructions for the isolation and lysis of single cells; the physical separation of polyA(+) mRNA from genomic DNA using a modified oligo-dT bead capture and the respective whole-transcriptome and whole-genome amplifications; and library preparation and sequence analyses of these amplification products. The method allows the detection of thousands of transcripts in parallel with the genetic variants captured by the DNA-seq data from the same single cell. G&T-seq differs from other currently available methods for parallel DNA and RNA sequencing from single cells, as it involves physical separation of the DNA and RNA and does not require bespoke microfluidics platforms. The process can be implemented manually or through automation. When performed manually, paired genome and transcriptome sequencing libraries from eight single cells can be produced in ∼3 d by researchers experienced in molecular laboratory work. For users with experience in the programming and operation of liquid-handling robots, paired DNA and RNA libraries from 96 single cells can be produced in the same time frame. Sequence analysis and integration of single-cell G&T-seq DNA and RNA data requires a high level of bioinformatics expertise and familiarity with a wide range of informatics tools.

  5. Parallels in History.

    ERIC Educational Resources Information Center

    Mugleston, William F.

    2000-01-01

    Believes that by focusing on the recurrent situations and problems, or parallels, throughout history, students will understand the relevance of history to their own times and lives. Provides suggestions for parallels in history that may be introduced within lectures or as a means to class discussions. (CMK)

  6. Understanding human management of automation errors.

    PubMed

    McBride, Sara E; Rogers, Wendy A; Fisk, Arthur D

    2014-01-01

    Automation has the potential to aid humans with a diverse set of tasks and support overall system performance. Automated systems are not always reliable, and when automation errs, humans must engage in error management, which is the process of detecting, understanding, and correcting errors. However, this process of error management in the context of human-automation interaction is not well understood. Therefore, we conducted a systematic review of the variables that contribute to error management. We examined relevant research in human-automation interaction and human error to identify critical automation, person, task, and emergent variables. We propose a framework for management of automation errors to incorporate and build upon previous models. Further, our analysis highlights variables that may be addressed through design and training to positively influence error management. Additional efforts to understand the error management process will contribute to automation designed and implemented to support safe and effective system performance.

  7. The Automation-by-Expertise-by-Training Interaction.

    PubMed

    Strauch, Barry

    2017-03-01

    I introduce the automation-by-expertise-by-training interaction in automated systems and discuss its influence on operator performance. Transportation accidents that, across a 30-year interval, demonstrated identical automation-related operator errors suggest a need to reexamine traditional views of automation. I review accident investigation reports, regulator studies, and literature on human-computer interaction, expertise, and training, and discuss how failing to attend to the interaction of automation, expertise level, and training has enabled operators to commit identical automation-related errors. Automated systems continue to provide capabilities exceeding operators' need for effective system operation and provide interfaces that can hinder, rather than enhance, operator automation-related situation awareness. Because of limitations in time and resources, training programs do not provide operators the expertise needed to effectively operate these automated systems, requiring them to obtain the expertise ad hoc during system operations. As a result, many do not acquire necessary automation-related system expertise. Integrating automation with expected operator expertise levels, and within training programs that provide operators the necessary automation expertise, can reduce opportunities for automation-related operator errors. Research to address the automation-by-expertise-by-training interaction is needed. However, such research must meet challenges inherent to examining realistic sociotechnical system automation features with representative samples of operators, perhaps by using observational and ethnographic research. Research in this domain should improve the integration of design and training and, it is hoped, enhance operator performance.

  8. Drawing together psyche, soma and spirit: my career in cultural psychiatry.

    PubMed

    Dein, Simon

    2011-04-01

    In this article I discuss my career in cultural psychiatry. I begin by examining the influence of my personal background on my interests in cultural psychiatry and religion and health. I then discuss my research, which has focused upon two areas: the cognitive and phenomenological parallels between religious experiences and psychopathological states, and relationships between biomedicine and religious healing in diverse cultural contexts. Finally, I discuss plans for future research and teaching.

  9. Automated Microwave Dielectric Constant Measurement

    DTIC Science & Technology

    1987-03-01

    NSWC TR 86-46 (AD-A184 182). Automated Microwave Dielectric Constant Measurement System, by B. C. Glancy and A. Krall, Research and Technology Department, Silver Spring, Maryland. The measurement of dielectric constants as a function of microwave frequency has been simplified using an automated testing apparatus. This automated procedure is based on the use of a

  10. Understanding human management of automation errors

    PubMed Central

    McBride, Sara E.; Rogers, Wendy A.; Fisk, Arthur D.

    2013-01-01

    Automation has the potential to aid humans with a diverse set of tasks and support overall system performance. Automated systems are not always reliable, and when automation errs, humans must engage in error management, which is the process of detecting, understanding, and correcting errors. However, this process of error management in the context of human-automation interaction is not well understood. Therefore, we conducted a systematic review of the variables that contribute to error management. We examined relevant research in human-automation interaction and human error to identify critical automation, person, task, and emergent variables. We propose a framework for management of automation errors to incorporate and build upon previous models. Further, our analysis highlights variables that may be addressed through design and training to positively influence error management. Additional efforts to understand the error management process will contribute to automation designed and implemented to support safe and effective system performance. PMID:25383042

  11. Standardization of skin cleansing in vivo: part I. Development of an Automated Cleansing Device (ACiD).

    PubMed

    Sonsmann, F K; Strunk, M; Gediga, K; John, C; Schliemann, S; Seyfarth, F; Elsner, P; Diepgen, T L; Kutz, G; John, S M

    2014-05-01

    To date, there are no legally binding requirements concerning product testing in cosmetics. This leads to various manufacturer-specific test methods and an absence of transparent information on skin cleansing products. A standardized in vivo test procedure for assessment of cleansing efficacy and the corresponding barrier impairment caused by the cleansing process is needed, especially in the occupational context, where repeated hand washing may be performed at short intervals. For the standardization of the cleansing procedure, an Automated Cleansing Device (ACiD) was designed and evaluated. Different smooth washing surfaces for the ACiD equipment (including goat hair, felt, and felt covered with nitrile caps) were evaluated regarding their skin compatibility. ACiD allows an automated, fully standardized skin washing procedure. Felt covered with nitrile as the washing surface of the rotating washing units leads to a homogeneous cleansing result and does not cause detectable skin irritation, neither clinically nor as assessed by skin bioengineering methods (transepidermal water loss, chromametry). The Automated Cleansing Device may be useful for standardized evaluation of cleansing effectiveness and parallel assessment of the corresponding irritancy potential of industrial skin cleansers. This will allow objectifying the efficacy and safety of industrial skin cleansers, thus enabling market transparency and facilitating a rational choice of products. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  12. Cockpit Adaptive Automation and Pilot Performance

    NASA Technical Reports Server (NTRS)

    Parasuraman, Raja

    2001-01-01

    The introduction of high-level automated systems in the aircraft cockpit has provided several benefits, e.g., new capabilities, enhanced operational efficiency, and reduced crew workload. At the same time, conventional 'static' automation has sometimes degraded human operator monitoring performance, increased workload, and reduced situation awareness. Adaptive automation represents an alternative to static automation. In this approach, task allocation between human operators and computer systems is flexible and context-dependent rather than static. Adaptive automation, or adaptive task allocation, is thought to provide for regulation of operator workload and performance, while preserving the benefits of static automation. In previous research we have reported beneficial effects of adaptive automation on the performance of both pilots and non-pilots of flight-related tasks. For adaptive systems to be viable, however, such benefits need to be examined jointly in the context of a single set of tasks. The studies carried out under this project evaluated a systematic method for combining different forms of adaptive automation. A model for effective combination of different forms of adaptive automation, based on matching adaptation to operator workload, was proposed and tested. The model was evaluated in studies using IFR-rated pilots flying a general-aviation simulator. Performance, subjective, and physiological (heart rate variability, eye scan-paths) measures of workload were recorded. The studies compared workload-based adaptation to non-adaptive control conditions and found evidence for systematic benefits of adaptive automation. The research provides an empirical basis for evaluating the effectiveness of adaptive automation in the cockpit. The results contribute to the development of design principles and guidelines for the implementation of adaptive automation in the cockpit, particularly in general aviation, and in other human-machine systems. Project goals

  13. An automated metrics system to measure and improve the success of laboratory automation implementation.

    PubMed

    Benn, Neil; Turlais, Fabrice; Clark, Victoria; Jones, Mike; Clulow, Stephen

    2007-03-01

    The authors describe a system for collecting usage metrics from widely distributed automation systems. An application that records and stores usage data centrally, calculates run times, and charts the data was developed. Data were collected over 20 months from at least 28 workstations. The application was used to plot bar charts of date versus run time for individual workstations, the automation in a specific laboratory, or automation of a specified type. The authors show that revised user training, redeployment of equipment, and running complementary processes on one workstation can increase the average number of runs by up to 20-fold and run times by up to 450%. Active monitoring of usage leads to more effective use of automation. Usage data could be used to determine whether purchasing particular automation was a good investment.
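
    The aggregation described (run times per workstation, charted over time) can be sketched in a few lines; the log format, column names, and monthly grouping below are assumptions for illustration:

        import pandas as pd
        import matplotlib.pyplot as plt

        # Hypothetical log: one row per completed run, with columns
        # "workstation", "date", and "run_minutes".
        df = pd.read_csv("usage_log.csv", parse_dates=["date"])

        # Total run time per workstation per month ("ME" = month-end
        # frequency, pandas >= 2.2; use "M" on older versions).
        monthly = (df.set_index("date")
                     .groupby("workstation")["run_minutes"]
                     .resample("ME")
                     .sum()
                     .unstack(level=0))

        monthly.plot(kind="bar", figsize=(10, 4), ylabel="run time (min)")
        plt.tight_layout()
        plt.savefig("usage_by_workstation.png")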

  14. Donald Campbell's doubt: cultural difference or failure of communication?

    PubMed

    Shweder, Richard A

    2010-06-01

    The objection, rightfully noted but then dismissed by Henrich et al., that the observed variation across populations "may be due to various methodological artifacts that arise from translating experiments across contexts" is a theoretically profound and potentially constructive criticism. It parallels Donald Campbell's concern that many cultural differences reported by psychologists "come from failures of communication misreported as differences." Ironically, Campbell's doubt is a good foundation for investigations in cultural psychology.

  15. Parallel Architectures and Parallel Algorithms for Integrated Vision Systems. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Choudhary, Alok Nidhi

    1989-01-01

    Computer vision is regarded as one of the most complex and computationally intensive problems. An integrated vision system (IVS) is a system that uses vision algorithms from all levels of processing for a high-level application (e.g., object recognition). An IVS normally involves algorithms from low-level, intermediate-level, and high-level vision. Designing parallel architectures for vision systems is of tremendous interest to researchers. Several issues in parallel architectures and parallel algorithms for integrated vision systems are addressed.

  16. Integrated Task and Data Parallel Programming

    NASA Technical Reports Server (NTRS)

    Grimshaw, A. S.

    1998-01-01

    This research investigates the combination of task and data parallel language constructs within a single programming language. There are a number of applications that exhibit properties which would be well served by such an integrated language. Examples include global climate models, aircraft design problems, and multidisciplinary design optimization problems. Our approach incorporates data parallel language constructs into an existing, object-oriented, task parallel language. The language will support creation and manipulation of parallel classes and objects of both types (task parallel and data parallel). Ultimately, the language will allow data parallel and task parallel classes to be used either as building blocks or managers of parallel objects of either type, thus allowing the development of single- and multi-paradigm parallel applications.

    1995 Research Accomplishments: In February I presented a paper at Frontiers 1995 describing the design of the data parallel language subset. During the spring I wrote and defended my dissertation proposal. Since that time I have developed a runtime model for the language subset. I have begun implementing the model and hand-coding simple examples which demonstrate the language subset. I have identified an astrophysical fluid flow application which will validate the data parallel language subset.

    1996 Research Agenda: Milestones for the coming year include implementing a significant portion of the data parallel language subset over the Legion system. Using simple hand-coded methods, I plan to demonstrate (1) concurrent task and data parallel objects and (2) task parallel objects managing both task and data parallel objects. My next steps will focus on constructing a compiler and implementing the fluid flow application with the language. Concurrently, I will conduct a search for a real-world application exhibiting both task and data parallelism within the same program.

    Additional 1995 Activities: During the fall I collaborated
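
    As a generic illustration of mixing the two styles (not the Legion-based language described above), the sketch below runs task-parallel workers, each of which performs a data-parallel array computation; the grid sizes and worker counts are arbitrary:

        import numpy as np
        from concurrent.futures import ThreadPoolExecutor

        def simulate_region(seed):
            """One *task*: evolve a grid several steps. Each update is
            *data parallel* -- NumPy applies the stencil to every cell
            at once."""
            rng = np.random.default_rng(seed)
            grid = rng.standard_normal((512, 512))
            for _ in range(10):
                # Five-point averaging stencil over the whole array.
                grid = 0.2 * (grid
                              + np.roll(grid, 1, axis=0) + np.roll(grid, -1, axis=0)
                              + np.roll(grid, 1, axis=1) + np.roll(grid, -1, axis=1))
            return grid.mean()

        if __name__ == "__main__":
            # Task parallelism: independent regions run concurrently; NumPy
            # releases the GIL inside array operations, so the threads can
            # genuinely overlap.
            with ThreadPoolExecutor(max_workers=4) as pool:
                print(list(pool.map(simulate_region, range(8))))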

  17. Team-Centered Perspective for Adaptive Automation Design

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J., III

    2003-01-01

    Automation represents a very active area of human factors research. The journal Human Factors published a special issue on automation in 1985. Since then, hundreds of scientific studies have been published examining the nature of automation and its interaction with human performance. However, despite a dramatic increase in research investigating human factors issues in aviation automation, there remain areas that need further exploration. This NASA Technical Memorandum describes a new area of automation design and research, called adaptive automation. It discusses the concepts and outlines the human factors issues associated with the new method of adaptive function allocation. The primary focus is on human-centered design, and specifically on ensuring that adaptive automation is approached from a team-centered perspective. The document shows that adaptive automation has many human factors issues common to traditional automation design. Much like the introduction of other new technologies and paradigm shifts, adaptive automation presents an opportunity to remediate current problems but poses new ones for human-automation interaction in aerospace operations. The review here is intended to communicate the philosophical perspective and direction of adaptive automation research conducted under the Aerospace Operations Systems (AOS), Physiological and Psychological Stressors and Factors (PPSF) project.

  18. Toward designing for trust in database automation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duez, P. P.; Jamieson, G. A.

    Appropriate reliance on system automation is imperative for safe and productive work, especially in safety-critical systems. It is unsafe to rely on automation beyond its designed use; conversely, it can be both unproductive and unsafe to manually perform tasks that are better relegated to automated tools. Operator trust in automated tools mediates reliance, and trust appears to affect how operators use technology. As automated agents become more complex, the question of trust in automation is increasingly important. In order to achieve proper use of automation, we must engender an appropriate degree of trust that is sensitive to changes in operating functions and context. In this paper, we present research concerning trust in automation in the domain of automated tools for relational databases. Lee and See have provided models of trust in automation. One model developed by Lee and See identifies three key categories of information about the automation that lie along a continuum of attributional abstraction. Purpose-, process-, and performance-related information serve, both individually and through inferences between them, to describe automation in such a way as to engender properly calibrated trust. Thus, one can look at information from different levels of attributional abstraction as a general requirements analysis for information key to appropriate trust in automation. The model of information necessary to engender appropriate trust in automation [1] is a general one. Although it describes categories of information, it does not provide insight on how to determine the specific information elements required for a given automated tool. We have applied the Abstraction Hierarchy (AH) to this problem in the domain of relational databases. The AH serves as a formal description of the automation at several levels of abstraction, ranging from a very abstract purpose-oriented description to a more concrete description of the resources involved in the automated

  19. The 'problem' with automation - Inappropriate feedback and interaction, not 'over-automation'

    NASA Technical Reports Server (NTRS)

    Norman, D. A.

    1990-01-01

    Automation in high-risk industry is often blamed for causing harm and increasing the chance of human error when failures occur. It is proposed that the problem is not the presence of automation, but rather its inappropriate design. The problem is that the operations are performed appropriately under normal conditions, but there is inadequate feedback and interaction with the humans who must control the overall conduct of the task. The problem is that the automation is at an intermediate level of intelligence, powerful enough to take over control which used to be done by people, but not powerful enough to handle all abnormalities. Moreover, its level of intelligence is insufficient to provide the continual, appropriate feedback that occurs naturally among human operators. To solve this problem, the automation should either be made less intelligent or more so, but the current level is quite inappropriate. The overall message is that it is possible to reduce error through appropriate design considerations.

  20. Data communications in a parallel active messaging interface of a parallel computer

    DOEpatents

    Archer, Charles J; Blocksome, Michael A; Ratterman, Joseph D; Smith, Brian E

    2013-10-29

    Data communications in a parallel active messaging interface (`PAMI`) of a parallel computer, the parallel computer including a plurality of compute nodes that execute a parallel application, the PAMI composed of data communications endpoints, each endpoint including a specification of data communications parameters for a thread of execution on a compute node, including specifications of a client, a context, and a task, the compute nodes and the endpoints coupled for data communications through the PAMI and through data communications resources. The method includes receiving in an origin endpoint of the PAMI a data communications instruction, the instruction characterized by an instruction type and specifying a transmission of transfer data from the origin endpoint to a target endpoint, and transmitting, in accordance with the instruction type, the transfer data from the origin endpoint to the target endpoint.
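
    A toy model of the structure the claim describes, with hypothetical names throughout (this is not the PAMI API): each endpoint bundles a client, a context, and a task, and a send primitive moves transfer data between endpoints according to an instruction type:

        # Toy model of the endpoint/instruction structure in the patent
        # abstract; every name here is hypothetical, not the real PAMI API.
        from dataclasses import dataclass, field
        from queue import Queue

        @dataclass
        class Endpoint:
            client: str      # data communications client
            context: int     # context for the thread of execution
            task: int        # task identifier on the compute node
            inbox: Queue = field(default_factory=Queue)

        def send(origin, target, instruction_type, data):
            # Transmit transfer data in accordance with the instruction type.
            target.inbox.put((instruction_type, origin.task, data))

        a = Endpoint("appA", context=0, task=0)
        b = Endpoint("appA", context=0, task=1)
        send(a, b, "eager-send", b"transfer data")
        print(b.inbox.get())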

  1. Fully Parallel MHD Stability Analysis Tool

    NASA Astrophysics Data System (ADS)

    Svidzinski, Vladimir; Galkin, Sergei; Kim, Jin-Soo; Liu, Yueqiang

    2014-10-01

    Progress on full parallelization of the plasma stability code MARS will be reported. MARS calculates eigenmodes in 2D axisymmetric toroidal equilibria in MHD-kinetic plasma models. It is a powerful tool for studying MHD and MHD-kinetic instabilities and is widely used by the fusion community. The parallel version of MARS is intended for simulations on local parallel clusters. It will be an efficient tool for simulation of MHD instabilities with low, intermediate and high toroidal mode numbers within both the fluid and kinetic plasma models already implemented in MARS. Parallelization of the code includes parallelization of the construction of the matrix for the eigenvalue problem and parallelization of the inverse iterations algorithm, implemented in MARS for the solution of the formulated eigenvalue problem. Construction of the matrix is parallelized by distributing the load among processors assigned to different magnetic surfaces. Parallelization of the solution of the eigenvalue problem is made by repeating the steps of the present MARS algorithm using parallel libraries and procedures. Initial results of the code parallelization will be reported. Work is supported by the U.S. DOE SBIR program.
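
    For readers unfamiliar with the solver being parallelized, a serial numpy sketch of shifted inverse iteration is given below; MARS itself distributes the matrix over magnetic surfaces, which this toy version does not attempt:

        # Serial sketch of an inverse-iteration eigensolver (numpy stand-in).
        import numpy as np

        def inverse_iteration(A, shift, tol=1e-10, max_iter=200):
            """Eigenpair of A nearest to `shift` via inverse iterations."""
            n = A.shape[0]
            I = np.eye(n)
            x = np.random.default_rng(0).standard_normal(n)
            x /= np.linalg.norm(x)
            lam = shift
            for _ in range(max_iter):
                y = np.linalg.solve(A - shift * I, x)   # one inverse iteration
                x_new = y / np.linalg.norm(y)
                lam_new = x_new @ A @ x_new             # Rayleigh quotient
                if abs(lam_new - lam) < tol:
                    return lam_new, x_new
                lam, x = lam_new, x_new
            return lam, x

        A = np.diag([1.0, 3.0, 7.0]) + 0.01             # symmetric test matrix
        print(inverse_iteration(A, shift=2.9)[0])       # eigenvalue near 3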

  2. Streamlining workflow and automation to accelerate laboratory scale protein production.

    PubMed

    Konczal, Jennifer; Gray, Christopher H

    2017-05-01

    Protein production facilities are often required to produce diverse arrays of proteins for demanding methodologies including crystallography, NMR, ITC and other reagent-intensive techniques. It is common for these teams to find that they have become a bottleneck in the pipeline of ambitious projects. This pressure to deliver has resulted in the evolution of many novel methods to increase capacity and throughput at all stages in the pipeline for generation of recombinant proteins. This review aims to describe current and emerging options to accelerate the success of protein production in Escherichia coli. We emphasize technologies that have been evaluated and implemented in our laboratory, including innovative molecular biology and expression vectors, small-scale expression screening strategies and the automation of parallel and multidimensional chromatography. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  3. The Automation of Reserve Processing.

    ERIC Educational Resources Information Center

    Self, James

    1985-01-01

    Describes an automated reserve processing system developed locally at Clemons Library, University of Virginia. Discussion covers developments in the reserve operation at Clemons Library, automation of the processing and circulation functions of reserve collections, and changes in reserve operation performance and staffing needs due to automation.…

  4. Vision Marker-Based In Situ Examination of Bacterial Growth in Liquid Culture Media.

    PubMed

    Kim, Kyukwang; Choi, Duckyu; Lim, Hwijoon; Kim, Hyeongkeun; Jeon, Jessie S

    2016-12-18

    The detection of bacterial growth in liquid media is an essential process in determining antibiotic susceptibility or the level of bacterial presence for clinical or research purposes. We have developed a system which enables simplified and automated detection using a camera and a striped pattern marker. Quantification of bacterial growth is possible because growth in the culturing vessel blurs the marker image placed on the back of the vessel, and the blurring results in a decrease in the high-frequency spectrum region of the marker image. The experimental results show that the FFT (fast Fourier transform)-based growth detection method is robust to variations in the type of bacterial carrier and vessel, ranging from culture tubes to microfluidic devices. Moreover, an automated incubator and image acquisition system was developed for use as a comprehensive in situ detection system. We expect that this result can be applied to the automation of biological experiments, such as antibiotic susceptibility testing or toxicity measurement. Furthermore, the simple framework of the proposed growth measurement method may be further utilized as an effective and convenient method for building point-of-care devices for developing countries.
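
    A minimal sketch of the detection principle, assuming numpy and a synthetic striped image: bacterial turbidity blurs the marker, so the share of spectral power at high spatial frequencies falls:

        # Sketch of the FFT-based turbidity measure: blur reduces the
        # fraction of spectral power at high spatial frequencies.
        import numpy as np

        def high_freq_ratio(image, cutoff=0.25):
            """Fraction of spectral power beyond `cutoff` of the Nyquist radius."""
            spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
            h, w = image.shape
            yy, xx = np.mgrid[0:h, 0:w]
            r = np.hypot(yy - h / 2, xx - w / 2) / (min(h, w) / 2)
            return spectrum[r > cutoff].sum() / spectrum.sum()

        # A sharp striped marker vs. a blurred one (growth turbidity proxy).
        stripes = np.tile(np.kron([0.0, 1.0] * 32, np.ones(2)), (128, 1))
        kernel = np.ones(5) / 5
        blurred = np.apply_along_axis(
            lambda row: np.convolve(row, kernel, "same"), 1, stripes)
        print(high_freq_ratio(stripes), high_freq_ratio(blurred))  # ratio drops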

  5. File concepts for parallel I/O

    NASA Technical Reports Server (NTRS)

    Crockett, Thomas W.

    1989-01-01

    The subject of input/output (I/O) has often been neglected in the design of parallel computer systems, although for many problems I/O rates will limit the speedup attainable. The I/O problem is addressed by considering the role of files in parallel systems. The notion of parallel files is introduced. Parallel files provide for concurrent access by multiple processes, and utilize parallelism in the I/O system to improve performance. Parallel files can also be used conventionally by sequential programs. A set of standard parallel file organizations is proposed, and implementation techniques using multiple storage devices are suggested. Problem areas are also identified and discussed.

  6. Modeling Increased Complexity and the Reliance on Automation: FLightdeck Automation Problems (FLAP) Model

    NASA Technical Reports Server (NTRS)

    Ancel, Ersin; Shih, Ann T.

    2014-01-01

    This paper highlights the development of a model that is focused on the safety issue of increasing complexity and reliance on automation systems in transport category aircraft. Recent statistics show an increase in mishaps related to manual handling and automation errors due to pilot complacency and over-reliance on automation, loss of situational awareness, automation system failures and/or pilot deficiencies. Consequently, the aircraft can enter a state outside the flight envelope and/or air traffic safety margins which potentially can lead to loss-of-control (LOC), controlled-flight-into-terrain (CFIT), or runway excursion/confusion accidents, etc. The goal of this modeling effort is to provide NASA's Aviation Safety Program (AvSP) with a platform capable of assessing the impacts of AvSP technologies and products towards reducing the relative risk of automation related accidents and incidents. In order to do so, a generic framework, capable of mapping both latent and active causal factors leading to automation errors, is developed. Next, the framework is converted into a Bayesian Belief Network model and populated with data gathered from Subject Matter Experts (SMEs). With the insertion of technologies and products, the model provides individual and collective risk reduction acquired by technologies and methodologies developed within AvSP.
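
    A deliberately tiny numeric illustration of how risk reduction propagates through such a network (toy probabilities, not the SME-elicited values of the FLAP model): a technology that lowers the probability of complacency lowers the marginalized probability of an automation error:

        # Minimal two-node Bayesian-network calculation in the spirit of the
        # FLAP model; all probabilities below are assumed, illustrative values.
        def p_automation_error(p_complacency):
            p_err_given_c = 0.30        # assumed P(error | complacency)
            p_err_given_not_c = 0.05    # assumed P(error | no complacency)
            return (p_err_given_c * p_complacency
                    + p_err_given_not_c * (1.0 - p_complacency))

        baseline = p_automation_error(p_complacency=0.20)
        with_tech = p_automation_error(p_complacency=0.10)  # technology insertion
        print(f"relative risk reduction: {1 - with_tech / baseline:.1%}")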

  7. Non-Cartesian Parallel Imaging Reconstruction

    PubMed Central

    Wright, Katherine L.; Hamilton, Jesse I.; Griswold, Mark A.; Gulani, Vikas; Seiberlich, Nicole

    2014-01-01

    Non-Cartesian parallel imaging has played an important role in reducing data acquisition time in MRI. The use of non-Cartesian trajectories can enable more efficient coverage of k-space, which can be leveraged to reduce scan times. These trajectories can be undersampled to achieve even faster scan times, but the resulting images may contain aliasing artifacts. Just as Cartesian parallel imaging can be employed to reconstruct images from undersampled Cartesian data, non-Cartesian parallel imaging methods can mitigate aliasing artifacts by using additional spatial encoding information in the form of the non-homogeneous sensitivities of multi-coil phased arrays. This review will begin with an overview of non-Cartesian k-space trajectories and their sampling properties, followed by an in-depth discussion of several selected non-Cartesian parallel imaging algorithms. Three representative non-Cartesian parallel imaging methods will be described, including Conjugate Gradient SENSE (CG SENSE), non-Cartesian GRAPPA, and Iterative Self-Consistent Parallel Imaging Reconstruction (SPIRiT). After a discussion of these three techniques, several potential promising clinical applications of non-Cartesian parallel imaging will be covered. PMID:24408499
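
    At the core of CG SENSE is a conjugate-gradient solve of the normal equations E^H E x = E^H d; the sketch below shows a generic complex CG solver with a dense matrix standing in for the encoding operator (a real implementation applies E through coil sensitivities and gridding/NUFFT rather than a dense matrix):

        # Generic conjugate-gradient solver of the kind at the core of CG SENSE.
        import numpy as np

        def conjugate_gradient(apply_A, b, n_iter=50, tol=1e-8):
            x = np.zeros_like(b)
            r = b - apply_A(x)
            p = r.copy()
            rs = np.vdot(r, r)
            for _ in range(n_iter):
                Ap = apply_A(p)
                alpha = rs / np.vdot(p, Ap)
                x += alpha * p
                r -= alpha * Ap
                rs_new = np.vdot(r, r)
                if np.sqrt(abs(rs_new)) < tol:
                    break
                p = r + (rs_new / rs) * p
                rs = rs_new
            return x

        rng = np.random.default_rng(1)
        E = rng.standard_normal((40, 20)) + 1j * rng.standard_normal((40, 20))
        d = rng.standard_normal(40) + 1j * rng.standard_normal(40)
        x = conjugate_gradient(lambda v: E.conj().T @ (E @ v), E.conj().T @ d)
        print(np.linalg.norm(E.conj().T @ (E @ x) - E.conj().T @ d))  # ~0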

  8. Crossing the cultural divide: issues in translation, mistrust, and cocreation of meaning in cross-cultural therapeutic assessment.

    PubMed

    Rosenberg, Audrey; Almeida, Angelica; Macdonald, Heather

    2012-01-01

    This article examines cross-cultural therapeutic assessment in a community mental health clinic. The first case describes the work between a Caucasian assessor and a Mexican American family. The authors explore the metaphorical and literal translation of the findings from English to Spanish and the parallel process of translation of the self, experienced by both assessor and client. The second case describes the work between a Caucasian assessor and an African American adolescent. We describe the inherent challenge between the Eurocentric "task" orientation of the evaluation and the Afrocentric "relationship" orientation. We suggest that bridging the gap between cultures and overcoming cultural mistrust lie in the building of the assessor-client relationship. Fischer's concepts of rapport and intimacy are emphasized and expanded on as we emphasize the importance of cocreated meaning in cross-cultural assessment work.

  9. Trust in automation: designing for appropriate reliance.

    PubMed

    Lee, John D; See, Katrina A

    2004-01-01

    Automation is often problematic because people fail to rely upon it appropriately. Because people respond to technology socially, trust influences reliance on automation. In particular, trust guides reliance when complexity and unanticipated situations make a complete understanding of the automation impractical. This review considers trust from the organizational, sociological, interpersonal, psychological, and neurological perspectives. It considers how the context, automation characteristics, and cognitive processes affect the appropriateness of trust. The context in which the automation is used influences automation performance and provides a goal-oriented perspective to assess automation characteristics along a dimension of attributional abstraction. These characteristics can influence trust through analytic, analogical, and affective processes. The challenges of extrapolating the concept of trust in people to trust in automation are discussed. A conceptual model integrates research regarding trust in automation and describes the dynamics of trust, the role of context, and the influence of display characteristics. Actual or potential applications of this research include improved designs of systems that require people to manage imperfect automation.

  10. An automated workflow for enhancing microbial bioprocess optimization on a novel microbioreactor platform

    PubMed Central

    2012-01-01

    Background: High-throughput methods are widely used for strain screening, effectively resulting in binary information regarding high or low productivity. Nevertheless, achieving quantitative and scalable parameters for fast bioprocess development is much more challenging, especially for heterologous protein production. Here, the nature of the foreign protein makes it impossible to predict, e.g., the best expression construct, secretion signal peptide, inductor concentration, induction time, temperature and substrate feed rate in fed-batch operation, to name only a few. Therefore, a high number of systematic experiments are necessary to elucidate the best conditions for heterologous expression of each new protein of interest. Results: To increase the throughput in bioprocess development, we used a microtiter plate based cultivation system (Biolector) which was fully integrated into a liquid-handling platform enclosed in laminar airflow housing. This automated cultivation platform was used for optimization of the secretory production of a cutinase from Fusarium solani pisi with Corynebacterium glutamicum. The online monitoring of biomass, dissolved oxygen and pH in each of the microtiter plate wells makes it possible to trigger sampling or dosing events with the pipetting robot, used for a reliable selection of the best-performing cutinase producers. In addition, further automated methods like media optimization and induction profiling were developed and validated. All biological and bioprocess parameters were optimized exclusively at microtiter plate scale, and the results scaled well to 1 L and 20 L stirred tank bioreactor scale. Conclusions: The optimization of heterologous protein expression in microbial systems currently requires extensive testing of biological and bioprocess engineering parameters. This can be efficiently boosted by using a microtiter plate cultivation setup embedded into a liquid-handling system, providing more throughput by parallelization and

  11. Scalable Device for Automated Microbial Electroporation in a Digital Microfluidic Platform.

    PubMed

    Madison, Andrew C; Royal, Matthew W; Vigneault, Frederic; Chen, Liji; Griffin, Peter B; Horowitz, Mark; Church, George M; Fair, Richard B

    2017-09-15

    Electrowetting-on-dielectric (EWD) digital microfluidic laboratory-on-a-chip platforms demonstrate excellent performance in automating labor-intensive protocols. When coupled with an on-chip electroporation capability, these systems hold promise for streamlining cumbersome processes such as multiplex automated genome engineering (MAGE). We integrated a single Ti:Au electroporation electrode into an otherwise standard parallel-plate EWD geometry to enable high-efficiency transformation of Escherichia coli with reporter plasmid DNA in a 200 nL droplet. Test devices exhibited robust operation, with more than 10 transformation experiments performed per device without cross-contamination or failure. Despite the intrinsic electric-field nonuniformity present in the EP/EWD device, the peak on-chip transformation efficiency was measured to be 8.6 ± 1.0 × 10⁸ cfu·μg⁻¹ at an average applied electric field strength of 2.25 ± 0.50 kV·mm⁻¹. Cell survival and transformation fractions at this electroporation pulse strength were found to be 1.5 ± 0.3% and 2.3 ± 0.1%, respectively. Our work expands the EWD toolkit to include on-chip microbial electroporation and opens the possibility of scaling advanced genome engineering methods, like MAGE, into the submicroliter regime.

  12. Automation and Cataloging.

    ERIC Educational Resources Information Center

    Furuta, Kenneth; And Others

    1990-01-01

    These three articles address issues in library cataloging that are affected by automation: (1) the impact of automation and bibliographic utilities on professional catalogers; (2) the effect of the LASS microcomputer software on the cost of authority work in cataloging at the University of Arizona; and (3) online subject heading and classification…

  13. Culture expansion of adipose derived stromal cells. A closed automated Quantum Cell Expansion System compared with manual flask-based culture.

    PubMed

    Haack-Sørensen, Mandana; Follin, Bjarke; Juhl, Morten; Brorsen, Sonja K; Søndergaard, Rebekka H; Kastrup, Jens; Ekblond, Annette

    2016-11-16

    Adipose derived stromal cells (ASCs) are a rich and convenient source of cells for clinical regenerative therapeutic approaches. However, applications of ASCs often require cell expansion to reach the needed dose. In this study, cultivation of ASCs from stromal vascular fraction (SVF) over two passages in the automated and functionally closed Quantum Cell Expansion System (Quantum system) is compared with traditional manual cultivation. Stromal vascular fraction was isolated from abdominal fat, suspended in α-MEM supplemented with 10% Fetal Bovine Serum and seeded into either T75 flasks or a Quantum system that had been coated with cryoprecipitate. The cultivation of ASCs from SVF was performed in 3 ways: flask to flask; flask to Quantum system; and Quantum system to Quantum system. In all cases, quality controls were conducted for sterility, mycoplasmas, and endotoxins, in addition to the assessment of cell counts, viability, immunophenotype, and differentiation potential. The viability of ASCs passage 0 (P0) and P1 was above 96%, regardless of cultivation in flasks or Quantum system. Expression of surface markers and differentiation potential was consistent with ISCT/IFATS standards for the ASC phenotype. Sterility, mycoplasma, and endotoxin tests were consistently negative. An average of 8.0 × 10⁷ SVF cells loaded into a Quantum system yielded 8.96 × 10⁷ ASCs at P0, while 4.5 × 10⁶ SVF cells seeded per T75 flask yielded an average of 2.37 × 10⁶ ASCs, fewer than the number of SVF cells seeded. ASCs P1 expanded in the Quantum system demonstrated a population doubling (PD) around 2.2 regardless of whether P0 was previously cultured in flasks or Quantum, while ASCs P1 in flasks only reached a PD of 1.0. Manufacturing of ASCs in a Quantum system enhances ASC expansion rate and yield significantly relative to manual processing in T-flasks, while maintaining the purity and quality essential to safe and robust cell production. Notably, the use of the Quantum

  14. Parallel k-means++

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    A parallelization of the k-means++ seed selection algorithm on three distinct hardware platforms: GPU, multicore CPU, and multithreaded architecture. K-means++ was developed by David Arthur and Sergei Vassilvitskii in 2007 as an extension of the k-means data clustering technique. These algorithms allow people to cluster multidimensional data by attempting to minimize the mean distance of data points within a cluster. K-means++ improved upon traditional k-means by using a more intelligent approach to selecting the initial seeds for the clustering process. While k-means++ has become a popular alternative to traditional k-means clustering, little work has been done to parallelize this technique. We have developed original C++ code for parallelizing the algorithm on three unique hardware architectures: GPU using NVidia's CUDA/Thrust framework, multicore CPU using OpenMP, and the Cray XMT multithreaded architecture. By parallelizing the process for these platforms, we are able to perform k-means++ clustering much more quickly than it could be done before.
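
    For reference, a serial numpy version of the k-means++ seeding step (D² sampling) that the record above parallelizes; variable names are illustrative:

        # Serial reference implementation of k-means++ seeding
        # (Arthur & Vassilvitskii's D^2-weighted sampling).
        import numpy as np

        def kmeans_pp_seeds(X, k, rng=np.random.default_rng(0)):
            """Choose k initial centers with D^2 weighting."""
            centers = [X[rng.integers(len(X))]]          # first seed: uniform
            for _ in range(k - 1):
                # Squared distance of every point to its nearest chosen seed.
                d2 = np.min(((X[:, None, :] - np.array(centers)[None, :, :]) ** 2)
                            .sum(-1), axis=1)
                probs = d2 / d2.sum()                    # D^2 sampling distribution
                centers.append(X[rng.choice(len(X), p=probs)])
            return np.array(centers)

        X = np.random.default_rng(42).standard_normal((500, 2))
        print(kmeans_pp_seeds(X, k=4))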

  15. Human-centered aircraft automation: A concept and guidelines

    NASA Technical Reports Server (NTRS)

    Billings, Charles E.

    1991-01-01

    Aircraft automation and its effects on flight crews are examined. Generic guidelines are proposed for the design and use of automation in transport aircraft, in the hope of stimulating increased and more effective dialogue among designers of automated cockpits, purchasers of automated aircraft, and the pilots who must fly those aircraft in line operations. The goal is to explore the means whereby automation may be a maximally effective tool or resource for pilots without compromising human authority and with an increase in system safety. After definition of the domain of the aircraft pilot and a brief discussion of the history of aircraft automation, a concept of human-centered automation is presented and discussed. Automated devices are categorized as control automation, information automation, and management automation. The environment and context of aircraft automation are then considered, followed by thoughts on the likely future of automation in each category.

  16. TRIC: an automated alignment strategy for reproducible protein quantification in targeted proteomics

    PubMed Central

    Röst, Hannes L.; Liu, Yansheng; D’Agostino, Giuseppe; Zanella, Matteo; Navarro, Pedro; Rosenberger, George; Collins, Ben C.; Gillet, Ludovic; Testa, Giuseppe; Malmström, Lars; Aebersold, Ruedi

    2016-01-01

    Large scale, quantitative proteomic studies have become essential for the analysis of clinical cohorts, large perturbation experiments and systems biology studies. While next-generation mass spectrometric techniques such as SWATH-MS have substantially increased throughput and reproducibility, ensuring consistent quantification of thousands of peptide analytes across multiple LC-MS/MS runs remains a challenging and laborious manual process. To produce highly consistent and quantitatively accurate proteomics data matrices in an automated fashion, we have developed the TRIC software which utilizes fragment ion data to perform cross-run alignment, consistent peak-picking and quantification for high throughput targeted proteomics. TRIC uses a graph-based alignment strategy based on non-linear retention time correction to integrate peak elution information from all LC-MS/MS runs acquired in a study. When compared to state-of-the-art SWATH-MS data analysis, the algorithm was able to reduce the identification error by more than 3-fold at constant recall, while correcting for highly non-linear chromatographic effects. On a pulsed-SILAC experiment performed on human induced pluripotent stem (iPS) cells, TRIC was able to automatically align and quantify thousands of light and heavy isotopic peak groups and substantially increased the quantitative completeness and biological information in the data, providing insights into protein dynamics of iPS cells. Overall, this study demonstrates the importance of consistent quantification in highly challenging experimental setups, and proposes an algorithm to automate this task, constituting the last missing piece in a pipeline for automated analysis of massively parallel targeted proteomics datasets. PMID:27479329
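
    The non-linear retention-time correction at the heart of such alignment can be pictured as a monotone warp fitted through anchor points; a piecewise-linear stand-in (not TRIC's actual graph-based algorithm) might look like:

        # Toy non-linear retention-time warp between runs, interpolated
        # through (run RT, reference RT) anchor pairs; values are made up.
        import numpy as np

        anchors_run = np.array([12.0, 30.5, 55.0, 80.2, 110.0])
        anchors_ref = np.array([10.0, 31.0, 57.5, 84.0, 118.0])

        def to_reference_rt(rt_run):
            # Piecewise-linear warp; clamped at the edges (no extrapolation).
            return np.interp(rt_run, anchors_run, anchors_ref)

        print(to_reference_rt(np.array([20.0, 60.0, 100.0])))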

  17. Classification of Automated Search Traffic

    NASA Astrophysics Data System (ADS)

    Buehrer, Greg; Stokes, Jack W.; Chellapilla, Kumar; Platt, John C.

    As web search providers seek to improve both relevance and response times, they are challenged by the ever-increasing tax of automated search query traffic. Third party systems interact with search engines for a variety of reasons, such as monitoring a web site's rank, augmenting online games, or possibly to maliciously alter click-through rates. In this paper, we investigate automated traffic (sometimes referred to as bot traffic) in the query stream of a large search engine provider. We define automated traffic as any search query not generated by a human in real time. We first provide examples of different categories of query logs generated by automated means. We then develop many different features that distinguish between queries generated by people searching for information and those generated by automated processes. We categorize these features into two classes: either an interpretation of the physical model of human interactions, or behavioral patterns of automated interactions. Using these detection features, we next classify the query stream using multiple binary classifiers. In addition, a multiclass classifier is then developed to identify subclasses of both normal and automated traffic. An active learning algorithm is used to suggest which user sessions to label to improve the accuracy of the multiclass classifier, while also seeking to discover new classes of automated traffic. A performance analysis is then provided. Finally, the multiclass classifier is used to predict the subclass distribution for the search query stream.
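
    A hedged sketch of the binary (human vs. automated) stage using scikit-learn on made-up session features; the paper's actual feature set, data, and classifiers are far richer than this:

        # Toy human-vs-bot session classifier on synthetic features:
        # [queries per minute, mean inter-query gap (s), click-through rate].
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)
        humans = np.column_stack([rng.normal(0.5, 0.2, 200).clip(0.01),
                                  rng.normal(90, 30, 200).clip(1),
                                  rng.normal(0.6, 0.15, 200).clip(0, 1)])
        bots = np.column_stack([rng.normal(30, 10, 200).clip(0.01),
                                rng.normal(2, 1, 200).clip(0.1),
                                rng.normal(0.05, 0.05, 200).clip(0, 1)])
        X = np.vstack([humans, bots])
        y = np.array([0] * 200 + [1] * 200)      # 0 = human, 1 = automated
        clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
        print(clf.predict([[25.0, 3.0, 0.02]]))  # high rate, tiny gaps -> bot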

  18. Fully Parallel MHD Stability Analysis Tool

    NASA Astrophysics Data System (ADS)

    Svidzinski, Vladimir; Galkin, Sergei; Kim, Jin-Soo; Liu, Yueqiang

    2015-11-01

    Progress on full parallelization of the plasma stability code MARS will be reported. MARS calculates eigenmodes in 2D axisymmetric toroidal equilibria in MHD-kinetic plasma models. It is a powerful tool for studying MHD and MHD-kinetic instabilities and is widely used by the fusion community. The parallel version of MARS is intended for simulations on local parallel clusters. It will be an efficient tool for simulation of MHD instabilities with low, intermediate and high toroidal mode numbers within both the fluid and kinetic plasma models already implemented in MARS. Parallelization of the code includes parallelization of the construction of the matrix for the eigenvalue problem and parallelization of the inverse iterations algorithm, implemented in MARS for the solution of the formulated eigenvalue problem. Construction of the matrix is parallelized by distributing the load among processors assigned to different magnetic surfaces. Parallelization of the solution of the eigenvalue problem is made by repeating the steps of the present MARS algorithm using parallel libraries and procedures. Results of MARS parallelization and of the development of a new fixed-boundary equilibrium code adapted for MARS input will be reported. Work is supported by the U.S. DOE SBIR program.

  19. Automatic Camera Calibration for Cultural Heritage Applications Using Unstructured Planar Objects

    NASA Astrophysics Data System (ADS)

    Adam, K.; Kalisperakis, I.; Grammatikopoulos, L.; Karras, G.; Petsa, E.

    2013-07-01

    As a rule, image-based documentation of cultural heritage relies today on ordinary digital cameras and commercial software. As such projects often involve researchers not familiar with photogrammetry, the question of camera calibration is important. Freely available, open-source, user-friendly software tools for automatic camera calibration, often based on simple 2D chess-board patterns, are an answer to the demand for simplicity and automation. However, such tools cannot respond to all requirements met in cultural heritage conservation regarding possible imaging distances and focal lengths. Here we investigate the practical possibility of camera calibration from unknown planar objects, i.e. any planar surface with adequate texture; we have focused on the example of urban walls covered with graffiti. Images are connected pair-wise with inter-image homographies, which are estimated automatically through a RANSAC-based approach after extracting and matching interest points with the SIFT operator. All valid points are identified on all images on which they appear. Provided that the image set includes a "fronto-parallel" view, inter-image homographies with this image are regarded as emulations of image-to-world homographies and allow computing initial estimates for the interior and exterior orientation elements. Following this initialization step, the estimates are introduced into a final self-calibrating bundle adjustment. Measures are taken to discard unsuitable images and verify object planarity. Results from practical experimentation indicate that this method may produce satisfactory results. The authors intend to incorporate the described approach into their freely available user-friendly software tool, which relies on chess-boards, to assist non-experts in their projects with image-based approaches.
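
    The pair-wise step (SIFT matching followed by RANSAC homography estimation) can be expressed compactly with OpenCV; the file names below are placeholders and the parameter values are typical defaults rather than the authors' settings:

        # SIFT matching + RANSAC homography between two views of a planar wall.
        import cv2
        import numpy as np

        img1 = cv2.imread("wall_view1.jpg", cv2.IMREAD_GRAYSCALE)  # placeholder
        img2 = cv2.imread("wall_view2.jpg", cv2.IMREAD_GRAYSCALE)  # placeholder

        sift = cv2.SIFT_create()
        kp1, des1 = sift.detectAndCompute(img1, None)
        kp2, des2 = sift.detectAndCompute(img2, None)

        # Ratio-test filtering of nearest-neighbour matches.
        matcher = cv2.BFMatcher()
        good = [m for m, n in matcher.knnMatch(des1, des2, k=2)
                if m.distance < 0.75 * n.distance]

        src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
        dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
        H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
        print(H)   # inter-image homography; valid points are the RANSAC inliers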

  20. Directions in parallel programming: HPF, shared virtual memory and object parallelism in pC++

    NASA Technical Reports Server (NTRS)

    Bodin, Francois; Priol, Thierry; Mehrotra, Piyush; Gannon, Dennis

    1994-01-01

    Fortran and C++ are the dominant programming languages used in scientific computation. Consequently, extensions to these languages are the most popular for programming massively parallel computers. We discuss two such approaches to parallel Fortran and one approach to C++. The High Performance Fortran Forum has designed HPF with the intent of supporting data parallelism on Fortran 90 applications. HPF works by asking the user to help the compiler distribute and align the data structures with the distributed memory modules in the system. Fortran-S takes a different approach in which the data distribution is managed by the operating system and the user provides annotations to indicate parallel control regions. In the case of C++, we look at pC++ which is based on a concurrent aggregate parallel model.

  1. The structure of cross-cultural musical diversity.

    PubMed

    Rzeszutek, Tom; Savage, Patrick E; Brown, Steven

    2012-04-22

    Human cultural traits, such as languages, musics, rituals and material objects, vary widely across cultures. However, the majority of comparative analyses of human cultural diversity focus on between-culture variation without consideration for within-culture variation. In contrast, biological approaches to genetic diversity, such as the analysis of molecular variance (AMOVA) framework, partition genetic diversity into both within- and between-population components. We attempt here for the first time to quantify both components of cultural diversity by applying the AMOVA model to music. By employing this approach with 421 traditional songs from 16 Austronesian-speaking populations, we show that the vast majority of musical variability is due to differences within populations rather than differences between. This demonstrates a striking parallel to the structure of genetic diversity in humans. A neighbour-net analysis of pairwise population musical divergence shows a large amount of reticulation, indicating the pervasive occurrence of borrowing and/or convergent evolution of musical features across populations.
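
    The within-/between-population partition can be illustrated with a back-of-envelope numpy calculation on a toy scalar trait (a plain sums-of-squares decomposition, not the full AMOVA on musical distance matrices):

        # Toy within/between variance partition: 4 hypothetical populations
        # x 30 "songs"; population means differ little relative to the
        # within-population spread, mirroring the study's qualitative finding.
        import numpy as np

        rng = np.random.default_rng(0)
        pops = [rng.normal(loc=mu, scale=1.0, size=30)
                for mu in (0.0, 0.2, 0.4, 0.1)]

        grand = np.mean(np.concatenate(pops))
        ss_within = sum(((p - p.mean()) ** 2).sum() for p in pops)
        ss_between = sum(len(p) * (p.mean() - grand) ** 2 for p in pops)
        total = ss_within + ss_between
        print(f"within-population share:  {ss_within / total:.1%}")
        print(f"between-population share: {ss_between / total:.1%}")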

  2. The structure of cross-cultural musical diversity

    PubMed Central

    Rzeszutek, Tom; Savage, Patrick E.; Brown, Steven

    2012-01-01

    Human cultural traits, such as languages, musics, rituals and material objects, vary widely across cultures. However, the majority of comparative analyses of human cultural diversity focus on between-culture variation without consideration for within-culture variation. In contrast, biological approaches to genetic diversity, such as the analysis of molecular variance (AMOVA) framework, partition genetic diversity into both within- and between-population components. We attempt here for the first time to quantify both components of cultural diversity by applying the AMOVA model to music. By employing this approach with 421 traditional songs from 16 Austronesian-speaking populations, we show that the vast majority of musical variability is due to differences within populations rather than differences between. This demonstrates a striking parallel to the structure of genetic diversity in humans. A neighbour-net analysis of pairwise population musical divergence shows a large amount of reticulation, indicating the pervasive occurrence of borrowing and/or convergent evolution of musical features across populations. PMID:22072606

  3. Automation pilot

    NASA Technical Reports Server (NTRS)

    1983-01-01

    An important concept of the Action Information Management System (AIMS) approach is to evaluate office automation technology in the context of hands-on use by technical program managers in the conduct of their day-to-day work, and to anticipate the human acceptance difficulties which may accompany the transition to a significantly changing work environment. The improved productivity and communications which result from application of office automation technology are already well established for general office environments, but benefits unique to NASA are anticipated and these will be explored in detail.

  4. Selecting automation for the clinical chemistry laboratory.

    PubMed

    Melanson, Stacy E F; Lindeman, Neal I; Jarolim, Petr

    2007-07-01

    Laboratory automation proposes to improve the quality and efficiency of laboratory operations, and may provide a solution to the quality demands and staff shortages faced by today's clinical laboratories. Several vendors offer automation systems in the United States, with both subtle and obvious differences. Arriving at a decision to automate, and the ensuing evaluation of available products, can be time-consuming and challenging. Although considerable discussion concerning the decision to automate has been published, relatively little attention has been paid to the process of evaluating and selecting automation systems. To outline a process for evaluating and selecting automation systems as a reference for laboratories contemplating laboratory automation. Our Clinical Chemistry Laboratory staff recently evaluated all major laboratory automation systems in the United States, with their respective chemistry and immunochemistry analyzers. Our experience is described and organized according to the selection process, the important considerations in clinical chemistry automation, decisions and implementation, and we give conclusions pertaining to this experience. Including the formation of a committee, workflow analysis, submitting a request for proposal, site visits, and making a final decision, the process of selecting chemistry automation took approximately 14 months. We outline important considerations in automation design, preanalytical processing, analyzer selection, postanalytical storage, and data management. Selecting clinical chemistry laboratory automation is a complex, time-consuming process. Laboratories considering laboratory automation may benefit from the concise overview and narrative and tabular suggestions provided.

  5. A novel DTI-QA tool: Automated metric extraction exploiting the sphericity of an agar filled phantom.

    PubMed

    Chavez, Sofia; Viviano, Joseph; Zamyadi, Mojdeh; Kingsley, Peter B; Kochunov, Peter; Strother, Stephen; Voineskos, Aristotle

    2018-02-01

    To develop a quality assurance (QA) tool (acquisition guidelines and automated processing) for diffusion tensor imaging (DTI) data using a common agar-based phantom used for fMRI QA. The goal is to produce a comprehensive set of automated, sensitive and robust QA metrics. A readily available agar phantom was scanned with and without parallel imaging reconstruction. Other scanning parameters were matched to the human scans. A central slab, made up of either a thick slice or an average of a few slices, was extracted and all processing was performed on that image. The proposed QA relies on the creation of two ROIs for processing: (i) a preset central circular region of interest (ccROI) and (ii) a signal mask for all images in the dataset. The ccROI enables computation of average signal for SNR calculations as well as average FA values. The production of the signal masks enables automated measurements of eddy current and B0 inhomogeneity induced distortions by exploiting the sphericity of the phantom. Also, the signal masks allow automated background localization to assess levels of Nyquist ghosting. The proposed DTI-QA was shown to produce eleven metrics which are robust yet sensitive to image quality changes within site and differences across sites. It can be performed in a reasonable amount of scan time (~15 min) and the code for automated processing has been made publicly available. A novel DTI-QA tool has been proposed. It has been applied successfully on data from several scanners/platforms. The novelty lies in the exploitation of the sphericity of the phantom for distortion measurements. Other novel contributions are: the computation of an SNR value per gradient direction for the diffusion weighted images (DWIs) and an SNR value per non-DWI, an automated background detection for the Nyquist ghosting measurement, and an error metric reflecting the contribution of EPI instability to the eddy current induced shape changes observed for DWIs. Copyright © 2017 Elsevier

  6. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    Title 21 (Food and Drugs), Section 864.5200: Automated cell counter. (a) Identification. An automated cell counter is a fully automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of the...

  7. Aptaligner: automated software for aligning pseudorandom DNA X-aptamers from next-generation sequencing data.

    PubMed

    Lu, Emily; Elizondo-Riojas, Miguel-Angel; Chang, Jeffrey T; Volk, David E

    2014-06-10

    Next-generation sequencing results from bead-based aptamer libraries have demonstrated that traditional DNA/RNA alignment software is insufficient. This is particularly true for X-aptamers containing specialty bases (W, X, Y, Z, ...) that are identified by special encoding. Thus, we sought an automated program that uses the inherent design scheme of bead-based X-aptamers to create a hypothetical reference library and Markov modeling techniques to provide improved alignments. Aptaligner provides this feature as well as length error and noise level cutoff features, is parallelized to run on multiple central processing units (cores), and sorts sequences from a single chip into projects and subprojects.

  8. Improvement of Vivarium Biodecontamination through Data-acquisition Systems and Automation.

    PubMed

    Devan, Shakthi Rk; Vasu, Suresh; Mallikarjuna, Yogesha; Ponraj, Ramkumar; Kamath, Gireesh; Poosala, Suresh

    2018-03-01

    Biodecontamination is important for eliminating pathogens at research animal facilities, thereby preventing contamination within barrier systems. We enhanced our facility's standard biodecontamination method to replace the traditional foggers, and the new system was used effectively after bypass ducts were created in the HVAC units so that individual rooms could be isolated. The entire system was controlled by in-house-developed supervisory control and data-acquisition software that supported multiple cycles of decontamination by equipment with different decontamination capacities, operated in parallel, and using different agents, including H2O2 vapor and ClO2 gas. The process was validated according to facility mapping, and effectiveness was assessed by using biologic (Geobacillus stearothermophilus) and chemical indicator strips, which were positioned before decontamination, and by sampling contact plates after the completion of each cycle. The biologic indicators showed a 6-log reduction in microbial counts after successful decontamination cycles for both agents, and the process was found to be compatible with clean-room panels and with materials commonly used in a vivarium, such as racks, cages, trolleys, cage-changing stations, biosafety cabinets, refrigerators and other equipment in both procedure and animal rooms. In conclusion, the automated process enabled users to perform effective decontamination through multiple cycles with real-time documentation and provided additional capability to deal with potential outbreaks. Software integration of the automation improved quality-control systems in our vivarium.

  9. Hypergraph partitioning implementation for parallelizing matrix-vector multiplication using CUDA GPU-based parallel computing

    NASA Astrophysics Data System (ADS)

    Murni, Bustamam, A.; Ernastuti, Handhika, T.; Kerami, D.

    2017-07-01

    Calculation of the matrix-vector multiplication in real-world problems often involves large matrices of arbitrary size. Therefore, parallelization is needed to speed up the calculation process, which usually takes a long time. Graph partitioning techniques discussed in previous studies cannot be used for the parallelized calculation of matrix-vector multiplication with arbitrary size, because graph partitioning assumes a square, symmetric matrix. Hypergraph partitioning techniques overcome this shortcoming of graph partitioning. This paper addresses the efficient parallelization of matrix-vector multiplication through hypergraph partitioning techniques using CUDA GPU-based parallel computing. CUDA (compute unified device architecture) is a parallel computing platform and programming model that was created by NVIDIA and is implemented by the GPU (graphics processing unit).
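
    The computation being partitioned is a sparse matrix-vector product over an arbitrarily sized matrix; a plain Python stand-in with an even row partition is shown below (a real hypergraph partitioner would instead choose the blocks to minimize communication volume, and the original work runs on CUDA GPUs):

        # Row-partitioned sparse matrix-vector multiply on a COO matrix;
        # the even split below is a placeholder for a hypergraph partition.
        import numpy as np
        from concurrent.futures import ThreadPoolExecutor

        def spmv_block(rows, cols, vals, x, row_range):
            lo, hi = row_range
            y = np.zeros(hi - lo)
            for r, c, v in zip(rows, cols, vals):
                if lo <= r < hi:
                    y[r - lo] += v * x[c]
            return y

        rows = np.array([0, 0, 1, 2, 2, 3])
        cols = np.array([0, 2, 1, 0, 3, 2])
        vals = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
        x = np.array([1.0, 2.0, 3.0, 4.0])

        blocks = [(0, 2), (2, 4)]                  # even row partition
        with ThreadPoolExecutor() as pool:
            parts = pool.map(lambda b: spmv_block(rows, cols, vals, x, b), blocks)
        print(np.concatenate(list(parts)))         # full product A @ x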

  10. Automated condition-invariable neurite segmentation and synapse classification using textural analysis-based machine-learning algorithms

    PubMed Central

    Kandaswamy, Umasankar; Rotman, Ziv; Watt, Dana; Schillebeeckx, Ian; Cavalli, Valeria; Klyachko, Vitaly

    2013-01-01

    High-resolution live-cell imaging studies of neuronal structure and function are characterized by large variability in image acquisition conditions due to background and sample variations as well as low signal-to-noise ratio. The lack of automated image analysis tools that can be generalized for varying image acquisition conditions represents one of the main challenges in the field of biomedical image analysis. Specifically, segmentation of the axonal/dendritic arborizations in brightfield or fluorescence imaging studies is extremely labor-intensive and still performed mostly manually. Here we describe a fully automated machine-learning approach based on textural analysis algorithms for segmenting neuronal arborizations in high-resolution brightfield images of live cultured neurons. We compare the performance of our algorithm to manual segmentation and show that it combines 90% accuracy with similarly high levels of specificity and sensitivity. Moreover, the algorithm maintains high performance levels under a wide range of image acquisition conditions, indicating that it is largely condition-invariable. We further describe an application of this algorithm to fully automated synapse localization and classification in fluorescence imaging studies based on synaptic activity. This textural analysis-based machine-learning approach thus offers a high-performance, condition-invariable tool for automated neurite segmentation.

  11. New procedure to reduce the time and cost of broncho-pulmonary specimen management using the Previ Isola® automated inoculation system.

    PubMed

    Nebbad-Lechani, Biba; Emirian, Aurélie; Maillebuau, Fabienne; Mahjoub, Nadia; Fihman, Vincent; Legrand, Patrick; Decousser, Jean-Winoc

    2013-12-01

    The microbiological diagnosis of respiratory tract infections requires serial manual dilutions of the clinical specimen before agar plate inoculation, disrupting the workflow in bacteriology clinical laboratories. Automated plating instrument systems have been designed to increase the speed, reproducibility and safety of this inoculating step; nevertheless, data concerning respiratory specimens are lacking. We tested a specific procedure that uses the Previ Isola® (bioMérieux, Craponne, France) to inoculate with broncho-pulmonary specimens (BPS). A total of 350 BPS from a university-affiliated hospital were managed in parallel using the manual reference and the automated methods (expectoration: 75; broncho-alveolar lavage: 68; tracheal aspiration: 17; protected distal sample: 190). A specific enumeration reading grid, a pre-liquefaction step and a fluidity test, performed before the inoculation, were designed for the automated method. The qualitative (i.e., the number of specimens yielding a bacterial count greater than the clinical threshold) and quantitative (i.e., the discrepancy within a 0.5 log value) concordances were 100% and 98.2%, respectively. The slimmest subgroup of expectorations could not be managed by the automated method (8%, 6/75). The technical time and cost savings (i.e., number of consumed plates) reached 50%. Additional studies are required for specific populations, such as cystic fibrosis specimens and associated bacterial variants. An automated decapper should be implemented to increase the biosafety of the process. The PREVI Isola® adapted procedure is a time- and cost-saving method for broncho-pulmonary specimen processing. © 2013.

  12. Flight-deck automation - Promises and problems

    NASA Technical Reports Server (NTRS)

    Wiener, E. L.; Curry, R. E.

    1980-01-01

    The paper analyzes the role of human factors in flight-deck automation, identifies problem areas, and suggests design guidelines. Flight-deck automation using microprocessor technology and display systems improves performance and safety while leading to a decrease in size, cost, and power consumption. On the other hand negative factors such as failure of automatic equipment, automation-induced error compounded by crew error, crew error in equipment set-up, failure to heed automatic alarms, and loss of proficiency must also be taken into account. Among the problem areas discussed are automation of control tasks, monitoring of complex systems, psychosocial aspects of automation, and alerting and warning systems. Guidelines are suggested for designing, utilising, and improving control and monitoring systems. Investigation into flight-deck automation systems is important as the knowledge gained can be applied to other systems such as air traffic control and nuclear power generation, but the many problems encountered with automated systems need to be analyzed and overcome in future research.

  13. Conception through build of an automated liquids processing system for compound management in a low-humidity environment.

    PubMed

    Belval, Richard; Alamir, Ab; Corte, Christopher; DiValentino, Justin; Fernandes, James; Frerking, Stuart; Jenkins, Derek; Rogers, George; Sanville-Ross, Mary; Sledziona, Cindy; Taylor, Paul

    2012-12-01

    Boehringer Ingelheim's Automated Liquids Processing System (ALPS) in Ridgefield, Connecticut, was built to accommodate all compound solution-based operations following dissolution in neat DMSO. Process analysis resulted in the design of two nearly identical conveyor-based subsystems, each capable of executing 1400 × 384-well plate or punch tube replicates per batch. Two parallel-positioned subsystems are capable of independent execution or alternatively executed as a unified system for more complex or higher throughput processes. Primary ALPS functions include creation of high-throughput screening plates, concentration-response plates, and reformatted master stock plates (e.g., 384-well plates from 96-well plates). Integrated operations included centrifugation, unsealing/piercing, broadcast diluent addition, barcode print/application, compound transfer/mix via disposable pipette tips, and plate sealing. ALPS key features included instrument pooling for increased capacity or fail-over situations, programming constructs to associate one source plate to an array of replicate plates, and stacked collation of completed plates. Due to the hygroscopic nature of DMSO, ALPS was designed to operate within a 10% relativity humidity environment. The activities described are the collaborative efforts that contributed to the specification, build, delivery, and acceptance testing between Boehringer Ingelheim Pharmaceuticals, Inc. and the automation integration vendor, Thermo Scientific Laboratory Automation (Burlington, ON, Canada).

  14. Modular workcells: modern methods for laboratory automation.

    PubMed

    Felder, R A

    1998-12-01

    Laboratory automation is beginning to become an indispensable survival tool for laboratories facing difficult market competition. However, estimates suggest that only 8% of laboratories will be able to afford total laboratory automation systems. Therefore, automation vendors have developed alternative hardware configurations called 'modular automation' to fit the smaller laboratory. Modular automation consists of consolidated analyzers, integrated analyzers, modular workcells, and pre- and post-analytical automation. These terms will be defined in this paper. Using a modular automation model, the automated core laboratory will become a site where laboratory data is evaluated by trained professionals to provide diagnostic information to practising physicians. Modern software information management and process control tools will complement modular hardware. Proper standardization that will allow vendor-independent modular configurations will assure the success of this revolutionary new technology.

  15. Physiological Self-Regulation and Adaptive Automation

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J.; Pope, Alan T.; Freeman, Frederick G.

    2007-01-01

    Adaptive automation has been proposed as a solution to current problems of human-automation interaction. Past research has shown the potential of this advanced form of automation to enhance pilot engagement and lower cognitive workload. However, there have been concerns voiced regarding issues, such as automation surprises, associated with the use of adaptive automation. This study examined the use of psychophysiological self-regulation training with adaptive automation that may help pilots deal with these problems through the enhancement of cognitive resource management skills. Eighteen participants were assigned to 3 groups (self-regulation training, false feedback, and control) and performed resource management, monitoring, and tracking tasks from the Multiple Attribute Task Battery. The tracking task was cycled between 3 levels of task difficulty (automatic, adaptive aiding, manual) on the basis of the electroencephalogram-derived engagement index. The other two tasks remained in automatic mode that had a single automation failure. Those participants who had received self-regulation training performed significantly better and reported lower National Aeronautics and Space Administration Task Load Index scores than participants in the false feedback and control groups. The theoretical and practical implications of these results for adaptive automation are discussed.
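
    A sketch of an EEG-derived engagement index of the kind used to cycle the tracking task: the ratio beta/(alpha+theta) follows the formulation associated with Pope and colleagues; the signal below is synthetic and the decision rule is illustrative:

        # Engagement index beta/(alpha+theta) from FFT band powers.
        import numpy as np

        def band_power(sig, fs, lo, hi):
            freqs = np.fft.rfftfreq(len(sig), d=1.0 / fs)
            psd = np.abs(np.fft.rfft(sig)) ** 2
            return psd[(freqs >= lo) & (freqs < hi)].sum()

        def engagement_index(sig, fs):
            theta = band_power(sig, fs, 4, 8)
            alpha = band_power(sig, fs, 8, 13)
            beta = band_power(sig, fs, 13, 30)
            return beta / (alpha + theta)

        fs = 256
        t = np.arange(0, 4, 1.0 / fs)
        # Synthetic epoch: strong alpha (10 Hz) with weaker beta (20 Hz).
        eeg = np.sin(2 * np.pi * 10 * t) + 0.3 * np.sin(2 * np.pi * 20 * t)
        print(engagement_index(eeg, fs))  # low value -> invoke adaptive aiding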

  16. Protocols for Automated Protist Analysis

    DTIC Science & Technology

    2011-12-01

    Report No. CG-D-14-13, Protocols for Automated Protist Analysis, December 2011. Distribution Statement A: Approved for public release; distribution is unlimited. United States Coast Guard Research & Development Center, 1 Chelsea Street, New London, CT 06320.

  17. Parallel Computing Strategies for Irregular Algorithms

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Oliker, Leonid; Shan, Hongzhang; Biegel, Bryan (Technical Monitor)

    2002-01-01

    Parallel computing promises several orders of magnitude increase in our ability to solve realistic computationally-intensive problems, but relies on their efficient mapping and execution on large-scale multiprocessor architectures. Unfortunately, many important applications are irregular and dynamic in nature, making their effective parallel implementation a daunting task. Moreover, with the proliferation of parallel architectures and programming paradigms, the typical scientist is faced with a plethora of questions that must be answered in order to obtain an acceptable parallel implementation of the solution algorithm. In this paper, we consider three representative irregular applications: unstructured remeshing, sparse matrix computations, and N-body problems, and parallelize them using various popular programming paradigms on a wide spectrum of computer platforms ranging from state-of-the-art supercomputers to PC clusters. We present the underlying problems, the solution algorithms, and the parallel implementation strategies. Smart load-balancing, partitioning, and ordering techniques are used to enhance parallel performance. Overall results demonstrate the complexity of efficiently parallelizing irregular algorithms.

  18. A machine learning approach for automated wide-range frequency tagging analysis in embedded neuromonitoring systems.

    PubMed

    Montagna, Fabio; Buiatti, Marco; Benatti, Simone; Rossi, Davide; Farella, Elisabetta; Benini, Luca

    2017-10-01

    EEG is a standard non-invasive technique used in neural disease diagnostics and the neurosciences. Frequency-tagging is an increasingly popular experimental paradigm that efficiently tests brain function by measuring EEG responses to periodic stimulation. Recently, frequency-tagging paradigms have proven successful with low stimulation frequencies (0.5-6 Hz), but the EEG signal is intrinsically noisy in this frequency range, requiring heavy signal processing and significant human intervention for response estimation. This limits the possibility to process the EEG on resource-constrained systems and to design smart EEG-based devices for automated diagnostics. We propose an algorithm for artifact removal and automated detection of frequency-tagging responses in a wide range of stimulation frequencies, which we test on a visual stimulation protocol. The algorithm is rooted in machine learning based pattern recognition techniques and is tailored for a new-generation parallel ultra-low-power processing platform (PULP), reaching more than 90% accuracy in frequency detection even for very low stimulation frequencies (<1 Hz) with a power budget of 56 mW. Copyright © 2017 Elsevier Inc. All rights reserved.
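
    The core detection idea, estimating the response at the stimulation frequency against neighboring spectral noise, can be sketched independently of the paper's machine learning pipeline and PULP implementation. The following is a generic spectral-SNR detector on synthetic data, with all parameters assumed for illustration:

```python
import numpy as np

def tagging_snr(eeg, fs, f_stim, n_neighbors=10):
    """SNR at the stimulation frequency: power in the target FFT bin
    divided by the mean power of neighboring bins (target excluded)."""
    spectrum = np.abs(np.fft.rfft(eeg * np.hanning(len(eeg)))) ** 2
    freqs = np.fft.rfftfreq(len(eeg), 1.0 / fs)
    k = np.argmin(np.abs(freqs - f_stim))
    lo, hi = max(k - n_neighbors, 1), k + n_neighbors + 1
    noise = np.r_[spectrum[lo:k], spectrum[k + 1:hi]].mean()
    return spectrum[k] / noise

fs, f_stim = 256.0, 0.8                       # 0.8 Hz tag, i.e. below 1 Hz
t = np.arange(0, 60, 1 / fs)                  # 60 s of synthetic recording
eeg = 2.0 * np.sin(2 * np.pi * f_stim * t) + np.random.randn(t.size)
print("detected" if tagging_snr(eeg, fs, f_stim) > 3.0 else "absent")
```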

  19. Logistics Automation Master Plan (LAMP). Better Logistics Support through Automation.

    DTIC Science & Technology

    1983-06-01

    office micro-computers, positioned throughout the command chain, by providing real-time links between LCA and all users. 2. Goals: Assist HQDA staff in ... field, i.e., Airland Battle 2000. Section V: CONCEPT OF EXECUTION. Supply (Retail). A. System Description. 1. The Division Logistics Property Book ... 7. Divisional Direct Support Unit Automated Supply System (DDASS)/Direct Support Level Supply Automation (DLSA). DDASS and DLSA are system development

  20. Automation: how much is too much?

    PubMed

    Hancock, P A

    2014-01-01

    The headlong rush to automate continues apace. The dominant question still remains whether we can automate, not whether we should automate. However, it is this latter question that is featured and considered explicitly here. The suggestion offered is that unlimited automation of all technical functions will eventually prove anathema to the fundamental quality of human life. Examples of tasks, pursuits and past-times that should potentially be excused from the automation imperative are discussed. This deliberation leads us back to the question of balance in the cooperation, coordination and potential conflict between humans and the machines they create.

  1. Automated Instrumentation, Monitoring and Visualization of PVM Programs Using AIMS

    NASA Technical Reports Server (NTRS)

    Mehra, Pankaj; VanVoorst, Brian; Yan, Jerry; Tucker, Deanne (Technical Monitor)

    1994-01-01

    We present views and analysis of the execution of several PVM codes for Computational Fluid Dynamics on a network of Sparcstations, including (a) NAS Parallel benchmarks CG and MG (White, Alund and Sunderam 1993); (b) a multi-partitioning algorithm for NAS Parallel Benchmark SP (Wijngaart 1993); and (c) an overset grid flowsolver (Smith 1993). These views and analysis were obtained using our Automated Instrumentation and Monitoring System (AIMS) version 3.0, a toolkit for debugging the performance of PVM programs. We will describe the architecture, operation and application of AIMS. The AIMS toolkit contains (a) Xinstrument, which can automatically instrument various computational and communication constructs in message-passing parallel programs; (b) Monitor, a library of run-time trace-collection routines; (c) VK (Visual Kernel), an execution-animation tool with source-code clickback; and (d) Tally, a tool for statistical analysis of execution profiles. Currently, Xinstrument can handle C and Fortran77 programs using PVM 3.2.x; Monitor has been implemented and tested on Sun 4 systems running SunOS 4.1.2; and VK uses X11R5 and Motif 1.2. Data and views obtained using AIMS clearly illustrate several characteristic features of executing parallel programs on networked workstations: (a) the impact of long message latencies; (b) the impact of multiprogramming overheads and associated load imbalance; (c) cache and virtual-memory effects; and (d) significant skews between workstation clocks. Interestingly, AIMS can compensate for constant skew (zero drift) by calibrating the skew between a parent and its spawned children. In addition, AIMS' skew-compensation algorithm can adjust timestamps in a way that eliminates physically impossible communications (e.g., messages going backwards in time). Our current efforts are directed toward creating new views to explain the observed performance of PVM programs. Some of the features planned for the near future include: (a) Config
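
    The constant-skew compensation described above can be sketched as follows; this is a minimal illustration of the idea (shift one clock so that no message appears to arrive before it was sent), not AIMS' actual calibration code:

```python
def compensate_skew(events):
    """Return a constant offset for the child's clock so that no recorded
    message appears to arrive before it was sent (zero-drift assumption).

    events: (send_ts_parent_clock, recv_ts_child_clock) pairs for messages
    from the parent to one child.
    """
    # The largest (send - recv) difference bounds the skew: after adding
    # the offset, every message has non-negative apparent latency.
    return max(0.0, max(send - recv for send, recv in events))

# Child clock runs a few units behind the parent, so the raw trace would
# show messages traveling backwards in time.
msgs = [(100.0, 96.0), (200.0, 197.0), (300.0, 296.5)]
delta = compensate_skew(msgs)
adjusted = [(s, r + delta) for s, r in msgs]
print(delta, adjusted)   # all adjusted latencies are now >= 0
```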

  2. Automation and robotics

    NASA Technical Reports Server (NTRS)

    Montemerlo, Melvin

    1988-01-01

    The Autonomous Systems focus on the automation of control systems for the Space Station and mission operations. Telerobotics focuses on automation for in-space servicing, assembly, and repair. The Autonomous Systems and Telerobotics each have a planned sequence of integrated demonstrations showing the evolutionary advance of the state-of-the-art. Progress is briefly described for each area of concern.

  3. CRUNCH_PARALLEL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shumaker, Dana E.; Steefel, Carl I.

    The code CRUNCH_PARALLEL is a parallel version of the CRUNCH code. CRUNCH code version 2.0 was previously released by LLNL (UCRL-CODE-200063). Crunch is a general purpose reactive transport code developed by Carl Steefel and Yabusake (Steefel and Yabusake, 1996). The code handles non-isothermal transport and reaction in one, two, and three dimensions. The reaction algorithm is generic in form, handling an arbitrary number of aqueous and surface complexation reactions as well as mineral dissolution/precipitation. A standardized database is used containing thermodynamic and kinetic data. The code includes advective, dispersive, and diffusive transport.
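
    In its generic single-component form, the transport-reaction problem that such codes solve is an advection-dispersion-reaction equation; the statement below is the textbook form, not a transcription of CRUNCH's full multicomponent system:

```latex
\[
\frac{\partial (\phi C_i)}{\partial t}
  = \nabla \cdot \left( \phi D \, \nabla C_i \right)
  - \nabla \cdot \left( \mathbf{v}\, C_i \right)
  + R_i(C_1, \dots, C_N)
\]
% phi: porosity; C_i: concentration of component i; D: dispersion/diffusion
% tensor; v: Darcy flux; R_i: net reaction source term (aqueous complexation
% and mineral dissolution/precipitation).
```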

  4. Automated high-throughput flow-through real-time diagnostic system

    DOEpatents

    Regan, John Frederick

    2012-10-30

    An automated real-time flow-through system capable of processing multiple samples in an asynchronous, simultaneous, and parallel fashion for nucleic acid extraction and purification, followed by assay assembly, genetic amplification, multiplex detection, analysis, and decontamination. The system is able to hold and access an unlimited number of fluorescent reagents that may be used to screen samples for the presence of specific sequences. The apparatus works by associating extracted and purified sample with a series of reagent plugs that have been formed in a flow channel and delivered to a flow-through real-time amplification detector that has a multiplicity of optical windows, to which the sample-reagent plugs are placed in an operative position. The diagnostic apparatus includes sample multi-position valves, a master sample multi-position valve, a master reagent multi-position valve, reagent multi-position valves, and an optical amplification/detection system.

  5. Parallel processing and expert systems

    NASA Technical Reports Server (NTRS)

    Lau, Sonie; Yan, Jerry C.

    1991-01-01

    Whether it be monitoring the thermal subsystem of Space Station Freedom, or controlling the navigation of the autonomous rover on Mars, NASA missions in the 1990s cannot enjoy an increased level of autonomy without the efficient implementation of expert systems. Merely increasing the computational speed of uniprocessors may not be able to guarantee that real-time demands are met for larger systems. Speedup via parallel processing must be pursued alongside the optimization of sequential implementations. Prototypes of parallel expert systems have been built at universities and industrial laboratories in the U.S. and Japan. The state-of-the-art research in progress related to parallel execution of expert systems is surveyed. The survey discusses multiprocessors for expert systems, parallel languages for symbolic computations, and mapping expert systems to multiprocessors. Results to date indicate that the parallelism achieved for these systems is small. The main reasons are (1) the body of knowledge applicable in any given situation and the amount of computation executed by each rule firing are small, (2) dividing the problem solving process into relatively independent partitions is difficult, and (3) implementation decisions that enable expert systems to be incrementally refined hamper compile-time optimization. In order to obtain greater speedups, data parallelism and application parallelism must be exploited.

  6. Automated mitosis detection of stem cell populations in phase-contrast microscopy images.

    PubMed

    Huh, Seungil; Ker, Dai Fei Elmer; Bise, Ryoma; Chen, Mei; Kanade, Takeo

    2011-03-01

    Due to the enormous potential and impact that stem cells may have on regenerative medicine, there has been a rapidly growing interest for tools to analyze and characterize the behaviors of these cells in vitro in an automated and high throughput fashion. Among these behaviors, mitosis, or cell division, is important since stem cells proliferate and renew themselves through mitosis. However, current automated systems for measuring cell proliferation often require destructive or sacrificial methods of cell manipulation such as cell lysis or in vitro staining. In this paper, we propose an effective approach for automated mitosis detection using phase-contrast time-lapse microscopy, which is a nondestructive imaging modality, thereby allowing continuous monitoring of cells in culture. In our approach, we present a probabilistic model for event detection, which can simultaneously 1) identify spatio-temporal patch sequences that contain a mitotic event and 2) localize a birth event, defined as the time and location at which cell division is completed and two daughter cells are born. Our approach significantly outperforms previous approaches in terms of both detection accuracy and computational efficiency, when applied to multipotent C3H10T1/2 mesenchymal and C2C12 myoblastic stem cell populations.

  7. CSM parallel structural methods research

    NASA Technical Reports Server (NTRS)

    Storaasli, Olaf O.

    1989-01-01

    Parallel structural methods, research team activities, advanced architecture computers for parallel computational structural mechanics (CSM) research, the FLEX/32 multicomputer, a parallel structural analyses testbed, blade-stiffened aluminum panel with a circular cutout and the dynamic characteristics of a 60 meter, 54-bay, 3-longeron deployable truss beam are among the topics discussed.

  8. Reductions in self-reported stress and anticipatory heart rate with the use of a semi-automated parallel parking system.

    PubMed

    Reimer, Bryan; Mehler, Bruce; Coughlin, Joseph F

    2016-01-01

    Drivers' reactions to a semi-autonomous assisted parallel parking system were evaluated in a field experiment. A sample of 42 drivers, balanced by gender and across three age groups (20-29, 40-49, 60-69), were given a comprehensive briefing, saw the technology demonstrated, practiced parallel parking 3 times each with and without the assistive technology, and then were assessed on an additional 3 parking events each with and without the technology. Anticipatory stress, as measured by heart rate, was significantly lower when drivers approached a parking space knowing that they would be using the assistive technology as opposed to manually parking. Self-reported stress levels following assisted parks were also lower. Thus, both subjective and objective data support the position that the assistive technology reduced stress levels in drivers who were given detailed training. It was observed that drivers decreased their use of turn signals when using the semi-autonomous technology, raising a caution concerning unintended lapses in safe driving behaviors that may occur when assistive technologies are used. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  9. ASSET: Analysis of Sequences of Synchronous Events in Massively Parallel Spike Trains

    PubMed Central

    Canova, Carlos; Denker, Michael; Gerstein, George; Helias, Moritz

    2016-01-01

    With the ability to observe the activity from large numbers of neurons simultaneously using modern recording technologies, the chance to identify sub-networks involved in coordinated processing increases. Sequences of synchronous spike events (SSEs) constitute one type of such coordinated spiking that propagates activity in a temporally precise manner. The synfire chain was proposed as one potential model for such network processing. Previous work introduced a method for visualization of SSEs in massively parallel spike trains, based on an intersection matrix that contains in each entry the degree of overlap of active neurons in two corresponding time bins. Repeated SSEs are reflected in the matrix as diagonal structures of high overlap values. The method as such, however, leaves the task of identifying these diagonal structures to visual inspection rather than to a quantitative analysis. Here we present ASSET (Analysis of Sequences of Synchronous EvenTs), an improved, fully automated method which determines diagonal structures in the intersection matrix by a robust mathematical procedure. The method consists of a sequence of steps that i) assess which entries in the matrix potentially belong to a diagonal structure, ii) cluster these entries into individual diagonal structures and iii) determine the neurons composing the associated SSEs. We employ parallel point processes generated by stochastic simulations as test data to demonstrate the performance of the method under a wide range of realistic scenarios, including different types of non-stationarity of the spiking activity and different correlation structures. Finally, the ability of the method to discover SSEs is demonstrated on complex data from large network simulations with embedded synfire chains. Thus, ASSET represents an effective and efficient tool to analyze massively parallel spike data for temporal sequences of synchronous activity. PMID:27420734
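
    The intersection matrix at the heart of the method can be sketched directly from binned spike trains. The sketch below, with an invented normalization (overlap divided by the smaller of the two bin activity counts) and synthetic data, illustrates how a repeated sequence of synchronous events appears as a diagonal structure:

```python
import numpy as np

def intersection_matrix(binned):
    """binned: (n_neurons, n_bins) array of spikes per time bin.

    Entry (i, j) counts neurons active in both bin i and bin j, here
    normalized by the smaller of the two bins' activity counts. Repeated
    sequences of synchronous events show up as high-valued diagonals.
    """
    active = binned.astype(bool).astype(int)        # n_neurons x n_bins
    overlap = active.T @ active                     # co-active neuron counts
    per_bin = active.sum(axis=0)                    # neurons active per bin
    norm = np.minimum.outer(per_bin, per_bin)
    with np.errstate(divide="ignore", invalid="ignore"):
        return np.where(norm > 0, overlap / norm, 0.0)

rng = np.random.default_rng(0)
spikes = rng.random((50, 200)) < 0.05               # 50 neurons, 200 bins
for k in range(6):                                  # embed a repeated SSE
    spikes[8 * k:8 * k + 8, 20 + k] = True
    spikes[8 * k:8 * k + 8, 120 + k] = True
imat = intersection_matrix(spikes)
print(imat.shape, imat[20:26, 120:126].round(2))    # diagonal structure here
```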

  10. Incremental Parallelization of Non-Data-Parallel Programs Using the Charon Message-Passing Library

    NASA Technical Reports Server (NTRS)

    VanderWijngaart, Rob F.

    2000-01-01

    Message passing is among the most popular techniques for parallelizing scientific programs on distributed-memory architectures. The reasons for its success are wide availability (MPI), efficiency, and full tuning control provided to the programmer. A major drawback, however, is that incremental parallelization, as offered by compiler directives, is not generally possible, because all data structures have to be changed throughout the program simultaneously. Charon remedies this situation through mappings between distributed and non-distributed data. It allows breaking up the parallelization into small steps, guaranteeing correctness at every stage. Several tools are available to help convert legacy codes into high-performance message-passing programs. They usually target data-parallel applications, whose loops carrying most of the work can be distributed among all processors without much dependency analysis. Others do a full dependency analysis and then convert the code virtually automatically. Even more toolkits are available that aid construction from scratch of message passing programs. None, however, allows piecemeal translation of codes with complex data dependencies (i.e. non-data-parallel programs) into message passing codes. The Charon library (available in both C and Fortran) provides incremental parallelization capabilities by linking legacy code arrays with distributed arrays. During the conversion process, non-distributed and distributed arrays exist side by side, and simple mapping functions allow the programmer to switch between the two in any location in the program. Charon also provides wrapper functions that leave the structure of the legacy code intact, but that allow execution on truly distributed data. Finally, the library provides a rich set of communication functions that support virtually all patterns of remote data demands in realistic structured grid scientific programs, including transposition, nearest-neighbor communication, pipelining
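
    The central idea, keeping a legacy global array and its distributed counterpart side by side with mapping functions between them, can be sketched without MPI. The class and method names below are illustrative, not Charon's actual API:

```python
import numpy as np

def block_bounds(n, nprocs, rank):
    """Contiguous block decomposition of n elements over nprocs ranks."""
    base, extra = divmod(n, nprocs)
    lo = rank * base + min(rank, extra)
    return lo, lo + base + (rank < extra)

class MappedArray:
    """Keep a legacy global array and its distributed block side by side,
    so a code can be converted one loop at a time (in Charon's spirit)."""
    def __init__(self, global_arr, nprocs, rank):
        self.g = global_arr
        self.lo, self.hi = block_bounds(global_arr.size, nprocs, rank)
        self.local = global_arr[self.lo:self.hi].copy()

    def to_global(self):
        """Push this rank's block back into the legacy array."""
        self.g[self.lo:self.hi] = self.local

x = np.arange(10.0)
a = MappedArray(x, nprocs=3, rank=1)    # rank 1 owns elements 4..6
a.local *= 2.0                          # 'parallelized' loop on the block
a.to_global()
print(x)
```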

  11. Microfluidic large-scale integration: the evolution of design rules for biological automation.

    PubMed

    Melin, Jessica; Quake, Stephen R

    2007-01-01

    Microfluidic large-scale integration (mLSI) refers to the development of microfluidic chips with thousands of integrated micromechanical valves and control components. This technology is utilized in many areas of biology and chemistry and is a candidate to replace today's conventional automation paradigm, which consists of fluid-handling robots. We review the basic development of mLSI and then discuss design principles of mLSI to assess the capabilities and limitations of the current state of the art and to facilitate the application of mLSI to areas of biology. Many design and practical issues, including economies of scale, parallelization strategies, multiplexing, and multistep biochemical processing, are discussed. Several microfluidic components used as building blocks to create effective, complex, and highly integrated microfluidic networks are also highlighted.
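
    One frequently cited mLSI design rule is the binary multiplexer, under which the number of control channels needed to address n parallel flow channels grows only logarithmically; the formula below is quoted from memory of the mLSI literature and should be treated as an assumption:

```latex
\[
k_{\text{control}} = 2\log_{2} n
\qquad\text{e.g. } n = 1024 \text{ flow channels require only } k = 20 \text{ control channels.}
\]
```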

  12. Simplified Parallel Domain Traversal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erickson III, David J

    2011-01-01

    Many data-intensive scientific analysis techniques require global domain traversal, which over the years has been a bottleneck for efficient parallelization across distributed-memory architectures. Inspired by MapReduce and other simplified parallel programming approaches, we have designed DStep, a flexible system that greatly simplifies efficient parallelization of domain traversal techniques at scale. In order to deliver both simplicity to users as well as scalability on HPC platforms, we introduce a novel two-tiered communication architecture for managing and exploiting asynchronous communication loads. We also integrate our design with advanced parallel I/O techniques that operate directly on native simulation output. We demonstrate DStep by performing teleconnection analysis across ensemble runs of terascale atmospheric CO2 and climate data, and we show scalability results on up to 65,536 IBM BlueGene/P cores.

  13. Blastocyst microinjection automation.

    PubMed

    Mattos, Leonardo S; Grant, Edward; Thresher, Randy; Kluckman, Kimberly

    2009-09-01

    Blastocyst microinjections are routinely involved in the process of creating genetically modified mice for biomedical research, but their efficiency is highly dependent on the skills of the operators. As a consequence, much time and resources are required for training microinjection personnel. This situation has been aggravated by the rapid growth of genetic research, which has increased the demand for mutant animals. Therefore, increased productivity and efficiency in this area are highly desired. Here, we pursue these goals through the automation of a previously developed teleoperated blastocyst microinjection system. This included the design of a new system setup to facilitate automation, the definition of rules for automatic microinjections, the implementation of video processing algorithms to extract feedback information from microscope images, and the creation of control algorithms for process automation. Experimentation conducted with this new system and operator assistance during the cell delivery phase demonstrated a 75% microinjection success rate. In addition, implantation of the successfully injected blastocysts resulted in a 53% birth rate and a 20% yield of chimeras. These results proved that the developed system was capable of automatic blastocyst penetration and retraction, demonstrating the success of major steps toward full process automation.

  14. Evolution paths for advanced automation

    NASA Technical Reports Server (NTRS)

    Healey, Kathleen J.

    1990-01-01

    As Space Station Freedom (SSF) evolves, increased automation and autonomy will be required to meet Space Station Freedom Program (SSFP) objectives. As a precursor to the use of advanced automation within the SSFP, especially if it is to be used on SSF (e.g., to automate the operation of the flight systems), the underlying technologies will need to be elevated to a high level of readiness to ensure safe and effective operations. Ground facilities supporting the development of these flight systems -- from research and development laboratories through formal hardware and software development environments -- will be responsible for achieving these levels of technology readiness. These facilities will need to evolve to support the general evolution of the SSFP. This evolution will include support for increasing the use of advanced automation. The SSF Advanced Development Program has funded a study to define evolution paths for advanced automation within the SSFP's ground-based facilities which will enable, promote, and accelerate the appropriate use of advanced automation on-board SSF. The current capability of the test beds and facilities, such as the Software Support Environment, with regard to advanced automation, has been assessed and their desired evolutionary capabilities have been defined. Plans and guidelines for achieving this necessary capability have been constructed. The approach taken has combined in-depth interviews of test bed personnel at all SSF Work Package centers with awareness of relevant state-of-the-art technology and technology insertion methodologies. Key recommendations from the study include advocating a NASA-wide task force for advanced automation, and the creation of software prototype transition environments to facilitate the incorporation of advanced automation in the SSFP.

  15. The NAS parallel benchmarks

    NASA Technical Reports Server (NTRS)

    Bailey, David (Editor); Barton, John (Editor); Lasinski, Thomas (Editor); Simon, Horst (Editor)

    1993-01-01

    A new set of benchmarks was developed for the performance evaluation of highly parallel supercomputers. These benchmarks consist of a set of kernels, the 'Parallel Kernels,' and a simulated application benchmark. Together they mimic the computation and data movement characteristics of large scale computational fluid dynamics (CFD) applications. The principal distinguishing feature of these benchmarks is their 'pencil and paper' specification - all details of these benchmarks are specified only algorithmically. In this way many of the difficulties associated with conventional benchmarking approaches on highly parallel systems are avoided.

  16. Parallel flow diffusion battery

    DOEpatents

    Yeh, H.C.; Cheng, Y.S.

    1984-01-01

    A parallel flow diffusion battery for determining the mass distribution of an aerosol has a plurality of diffusion cells mounted in parallel to an aerosol stream, each diffusion cell including a stack of mesh wire screens of different density.

  17. Parallel flow diffusion battery

    DOEpatents

    Yeh, Hsu-Chi; Cheng, Yung-Sung

    1984-08-07

    A parallel flow diffusion battery for determining the mass distribution of an aerosol has a plurality of diffusion cells mounted in parallel to an aerosol stream, each diffusion cell including a stack of mesh wire screens of different density.

  18. Online optimal experimental re-design in robotic parallel fed-batch cultivation facilities.

    PubMed

    Cruz Bournazou, M N; Barz, T; Nickel, D B; Lopez Cárdenas, D C; Glauche, F; Knepper, A; Neubauer, P

    2017-03-01

    We present an integrated framework for online optimal experimental re-design applied to parallel nonlinear dynamic processes that aims to precisely estimate the parameter set of macro kinetic growth models with minimal experimental effort. This provides a systematic solution for rapid validation of a specific model for new strains, mutants, or products. In the biosciences this is especially important, as model identification is a long and laborious process that continues to limit the use of mathematical modeling in this field. The strength of this approach is demonstrated by fitting a macro-kinetic differential equation model for Escherichia coli fed-batch processes after 6 h of cultivation. The system includes two fully-automated liquid handling robots; one containing eight mini-bioreactors and another used for automated at-line analyses, which allows for the immediate use of the available data in the modeling environment. As a result, the experiment can be continually re-designed while the cultivations are running, using the information generated by periodic parameter estimations. The advantages of an online re-computation of the optimal experiment are proven by a 50-fold lower average coefficient of variation on the parameter estimates compared to the sequential method (4.83% instead of 235.86%). The success obtained in such a complex system is a further step towards more efficient computer-aided bioprocess development. Biotechnol. Bioeng. 2017;114: 610-619. © 2016 Wiley Periodicals, Inc.
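
    The reported comparison rests on the coefficient of variation of repeated parameter estimates. A minimal sketch of that criterion, with hypothetical estimates of two growth parameters standing in for the paper's results:

```python
import numpy as np

def coefficient_of_variation(estimates):
    """Per-parameter CV (%) across repeated fits: 100 * std / |mean|."""
    est = np.asarray(estimates)                 # shape: (n_fits, n_params)
    return 100.0 * est.std(axis=0, ddof=1) / np.abs(est.mean(axis=0))

# Hypothetical repeated estimates of (mu_max, K_s) under two strategies:
# widely scattered fits vs. tightly clustered fits after online re-design.
sequential = np.array([[0.6, 0.20], [1.4, 0.90], [0.2, 0.05], [2.1, 1.30]])
online     = np.array([[0.69, 0.31], [0.71, 0.29], [0.70, 0.30], [0.72, 0.30]])
print("sequential CV%:", coefficient_of_variation(sequential).round(1))
print("online CV%:   ", coefficient_of_variation(online).round(1))
```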

  19. What Is an Automated External Defibrillator?

    MedlinePlus

    An automated external defibrillator (AED) is a lightweight, portable device ... detect a rhythm that should be ...

  1. Vision 20/20: Automation and advanced computing in clinical radiation oncology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moore, Kevin L., E-mail: kevinmoore@ucsd.edu; Moiseenko, Vitali; Kagadis, George C.

    2014-01-15

    This Vision 20/20 paper considers what computational advances are likely to be implemented in clinical radiation oncology in the coming years and how the adoption of these changes might alter the practice of radiotherapy. Four main areas of likely advancement are explored: cloud computing, aggregate data analyses, parallel computation, and automation. As these developments promise both new opportunities and new risks to clinicians and patients alike, the potential benefits are weighed against the hazards associated with each advance, with special considerations regarding patient safety under new computational platforms and methodologies. While the concerns of patient safety are legitimate, the authors contend that progress toward next-generation clinical informatics systems will bring about extremely valuable developments in quality improvement initiatives, clinical efficiency, outcomes analyses, data sharing, and adaptive radiotherapy.

  2. Vision 20/20: Automation and advanced computing in clinical radiation oncology.

    PubMed

    Moore, Kevin L; Kagadis, George C; McNutt, Todd R; Moiseenko, Vitali; Mutic, Sasa

    2014-01-01

    This Vision 20/20 paper considers what computational advances are likely to be implemented in clinical radiation oncology in the coming years and how the adoption of these changes might alter the practice of radiotherapy. Four main areas of likely advancement are explored: cloud computing, aggregate data analyses, parallel computation, and automation. As these developments promise both new opportunities and new risks to clinicians and patients alike, the potential benefits are weighed against the hazards associated with each advance, with special considerations regarding patient safety under new computational platforms and methodologies. While the concerns of patient safety are legitimate, the authors contend that progress toward next-generation clinical informatics systems will bring about extremely valuable developments in quality improvement initiatives, clinical efficiency, outcomes analyses, data sharing, and adaptive radiotherapy.

  3. Automated DBS microsampling, microscale automation and microflow LC-MS for therapeutic protein PK.

    PubMed

    Zhang, Qian; Tomazela, Daniela; Vasicek, Lisa A; Spellman, Daniel S; Beaumont, Maribel; Shyong, BaoJen; Kenny, Jacqueline; Fauty, Scott; Fillgrove, Kerry; Harrelson, Jane; Bateman, Kevin P

    2016-04-01

    The aim was to reduce animal usage in discovery-stage PK studies for biologics programs using microsampling-based approaches and microscale LC-MS. We report the development of an automated DBS-based serial microsampling approach for studying the PK of therapeutic proteins in mice. Automated sample preparation and microflow LC-MS were used to enable assay miniaturization and improve overall assay throughput. Serial sampling of mice was possible over the full 21-day study period, with the first six time points over 24 h being collected using automated DBS sample collection. Overall, this approach demonstrated data comparable to a previous study that used liquid samples from single mice per time point, while reducing animal and compound requirements by 14-fold. The reduction in animals and drug material is enabled by the use of automated serial DBS microsampling in discovery-stage studies of protein therapeutics.

  4. A method to establish seismic noise baselines for automated station assessment

    USGS Publications Warehouse

    McNamara, D.E.; Hutt, C.R.; Gee, L.S.; Benz, H.M.; Buland, R.P.

    2009-01-01

    We present a method for quantifying station noise baselines and characterizing the spectral shape of out-of-nominal noise sources. Our intent is to automate this method in order to ensure that only the highest-quality data are used in rapid earthquake products at NEIC. In addition, the station noise baselines provide a valuable tool to support the quality control of GSN and ANSS backbone data and metadata. The procedures addressed here are currently in development at the NEIC, and work is underway to understand how quickly changes from nominal can be observed and used within the NEIC processing framework. The spectral methods and software used to compute station baselines and described herein (PQLX) can be useful to both permanent and portable seismic station operators. Applications include: general seismic station and data quality control (QC), evaluation of instrument responses, assessment of near real-time communication system performance, characterization of site cultural noise conditions, and evaluation of sensor vault design, as well as assessment of gross network capabilities (McNamara et al. 2005). Future PQLX development plans include incorporating station baselines for automated QC methods and automating station status report generation and notification based on user-defined QC parameters. The PQLX software is available through the USGS (http://earthquake.usgs.gov/research/software/pqlx.php) and IRIS (http://www.iris.edu/software/pqlx/).
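
    In the spirit of the PSD-based approach described above, a station baseline can be sketched as per-frequency percentiles of segment-wise power spectral densities; the segment length, percentiles, and sampling rate below are illustrative assumptions, not PQLX's actual processing:

```python
import numpy as np
from scipy.signal import welch

def station_baseline(trace, fs, segment_s=3600, percentiles=(10, 50, 90)):
    """Welch PSD per hour-long segment; per-frequency percentiles across
    segments form the station noise baseline."""
    seg = int(segment_s * fs)
    psds = []
    for start in range(0, len(trace) - seg + 1, seg):
        f, pxx = welch(trace[start:start + seg], fs=fs, nperseg=4096)
        psds.append(10 * np.log10(pxx + 1e-30))     # to dB
    return f, np.percentile(np.array(psds), percentiles, axis=0)

fs = 40.0                                           # Hz, assumed channel rate
day = np.random.randn(int(fs * 86400))              # synthetic day of noise
f, base = station_baseline(day, fs)
print(base.shape)                                   # (3, n_freqs): 10/50/90th
```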

  5. Automating spectral measurements

    NASA Astrophysics Data System (ADS)

    Goldstein, Fred T.

    2008-09-01

    This paper discusses the architecture of software utilized in spectroscopic measurements. As optical coatings become more sophisticated, there is mounting need to automate data acquisition (DAQ) from spectrophotometers. Such need is exacerbated when 100% inspection is required, ancillary devices are utilized, cost reduction is crucial, or security is vital. While instrument manufacturers normally provide point-and-click DAQ software, an application programming interface (API) may be missing. In such cases automation is impossible or expensive. An API is typically provided in libraries (*.dll, *.ocx) which may be embedded in user-developed applications. Users can thereby implement DAQ automation in several Windows languages. Another possibility, developed by FTG as an alternative to instrument manufacturers' software, is the ActiveX application (*.exe). ActiveX, a component of many Windows applications, provides means for programming and interoperability. This architecture permits a point-and-click program to act as automation client and server. Excel, for example, can control and be controlled by DAQ applications. Most importantly, ActiveX permits ancillary devices such as barcode readers and XY-stages to be easily and economically integrated into scanning procedures. Since an ActiveX application has its own user-interface, it can be independently tested. The ActiveX application then runs (visibly or invisibly) under DAQ software control. Automation capabilities are accessed via a built-in spectro-BASIC language with industry-standard (VBA-compatible) syntax. Supplementing ActiveX, spectro-BASIC also includes auxiliary serial port commands for interfacing programmable logic controllers (PLC). A typical application is automatic filter handling.

  6. Leveraging human oversight and intervention in large-scale parallel processing of open-source data

    NASA Astrophysics Data System (ADS)

    Casini, Enrico; Suri, Niranjan; Bradshaw, Jeffrey M.

    2015-05-01

    The popularity of cloud computing, along with the increased availability of cheap storage, has led to the need to process and transform large volumes of open-source data in parallel. One way to handle such extensive volumes of information properly is to take advantage of distributed computing frameworks like Map-Reduce. Unfortunately, an entirely automated approach that excludes human intervention is often unpredictable and error prone. Highly accurate data processing and decision-making can be achieved by supporting an automatic process through human collaboration, in a variety of environments such as warfare, cyber security and threat monitoring. Although this mutual participation seems easily exploitable, human-machine collaboration in the field of data analysis presents several challenges. First, due to the asynchronous nature of human intervention, it is necessary to verify that once a correction is made, all the necessary reprocessing is done in chain. Second, it is often necessary to minimize the amount of reprocessing in order to optimize the usage of limited resources. To address these strict requirements, this paper introduces improvements to an innovative approach for human-machine collaboration in the processing of large amounts of open-source data in parallel.
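
    The first challenge, propagating a human correction through the chain of dependent processing steps while recomputing as little as possible, reduces to reachability in the dependency graph. A minimal sketch with a hypothetical pipeline:

```python
from collections import defaultdict, deque

def downstream(dag, corrected):
    """BFS from the corrected item: everything reachable must be
    reprocessed; everything else keeps its cached result, which
    minimizes recomputation."""
    children = defaultdict(list)
    for src, dst in dag:
        children[src].append(dst)
    dirty, queue = set(), deque([corrected])
    while queue:
        node = queue.popleft()
        for nxt in children[node]:
            if nxt not in dirty:
                dirty.add(nxt)
                queue.append(nxt)
    return dirty

# Hypothetical stages: extract -> translate -> entity_tag -> report,
# with geo_tag feeding the report independently.
edges = [("extract", "translate"), ("translate", "entity_tag"),
         ("entity_tag", "report"), ("geo_tag", "report")]
print(downstream(edges, "translate"))  # {'entity_tag', 'report'}; geo_tag kept
```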

  7. Wavelet-based de-noising algorithm for images acquired with parallel magnetic resonance imaging (MRI).

    PubMed

    Delakis, Ioannis; Hammad, Omer; Kitney, Richard I

    2007-07-07

    Wavelet-based de-noising has been shown to improve image signal-to-noise ratio in magnetic resonance imaging (MRI) while maintaining spatial resolution. Wavelet-based de-noising techniques typically implemented in MRI require that noise displays uniform spatial distribution. However, images acquired with parallel MRI have spatially varying noise levels. In this work, a new algorithm for filtering images with parallel MRI is presented. The proposed algorithm extracts the edges from the original image and then generates a noise map from the wavelet coefficients at finer scales. The noise map is zeroed at locations where edges have been detected and directional analysis is also used to calculate noise in regions of low-contrast edges that may not have been detected. The new methodology was applied on phantom and brain images and compared with other applicable de-noising techniques. The performance of the proposed algorithm was shown to be comparable with other techniques in central areas of the images, where noise levels are high. In addition, finer details and edges were maintained in peripheral areas, where noise levels are low. The proposed methodology is fully automated and can be applied on final reconstructed images without requiring sensitivity profiles or noise matrices of the receiver coils, therefore making it suitable for implementation in a clinical MRI setting.
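
    A simplified sketch of the described steps (an edge map sparing edges from filtering, plus soft-thresholding of fine-scale wavelet coefficients) is given below using PyWavelets; the wavelet choice, threshold rule, and edge detector are assumptions, and this is not the authors' exact algorithm:

```python
import numpy as np
import pywt
from scipy import ndimage

def denoise_parallel_mri(img, wavelet="db4", level=2, edge_thresh=0.2):
    """Soft-threshold fine-scale wavelet coefficients, sparing edges.

    Simplified sketch: build an edge map from the image gradient,
    estimate noise from the finest detail band, threshold, and keep the
    original pixel values wherever edges were detected.
    """
    edges = ndimage.sobel(img, 0) ** 2 + ndimage.sobel(img, 1) ** 2
    edges = edges > edge_thresh * edges.max()

    coeffs = pywt.wavedec2(img, wavelet, level=level)
    # Robust noise estimate (MAD) from the finest diagonal detail band
    sigma = np.median(np.abs(coeffs[-1][2])) / 0.6745
    thresh = sigma * np.sqrt(2 * np.log(img.size))
    new_coeffs = [coeffs[0]] + [
        tuple(pywt.threshold(c, thresh, mode="soft") for c in detail)
        for detail in coeffs[1:]
    ]
    smooth = pywt.waverec2(new_coeffs, wavelet)[:img.shape[0], :img.shape[1]]
    return np.where(edges, img, smooth)

img = np.random.randn(128, 128) * 0.1
img[32:96, 32:96] += 1.0                # bright square provides strong edges
print(denoise_parallel_mri(img).shape)
```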

  8. Ask the experts: automation: part I.

    PubMed

    Allinson, John L; Blick, Kenneth E; Cohen, Lucinda; Higton, David; Li, Ming

    2013-08-01

    Bioanalysis invited a selection of leading researchers to express their views on automation in the bioanalytical laboratory. The topics discussed include the challenges that the modern bioanalyst faces when integrating automation into existing drug-development processes, the impact of automation and how they envision the modern bioanalytical laboratory changing in the near future. Their enlightening responses provide a valuable insight into the impact of automation and the future of the constantly evolving bioanalytical laboratory.

  9. An Automation Survival Guide for Media Centers.

    ERIC Educational Resources Information Center

    Whaley, Roger E.

    1989-01-01

    Reviews factors that should affect the decision to automate a school media center and offers suggestions for the automation process. Topics discussed include getting the library collection ready for automation, deciding what automated functions are needed, evaluating software vendors, selecting software, and budgeting. (CLB)

  10. Automated Shape Analysis of Teeth from the Archaeological Site of Nerqin Naver

    NASA Astrophysics Data System (ADS)

    Gaboutchian, A.; Simonyan, H.; Knyaz, V.; Petrosyan, G.; Ter-Vardanyan, L.; Leybova, N. A.; Apresyan, S. V.

    2018-05-01

    Traditional odontometry relies on a limited number of measurements of the tooth crown, typically estimating the mesio-distal and vestibular-oral diameters from a single measurement of the maximal dimension. Taking into consideration the complexity, irregularity and variability of tooth shapes, we find such measurements insufficient for interpreting tooth morphology. Thus we propose an odontotomic approach: obtaining data from a series of parallel, equally spaced sections in combination with automated detection of the landmarks used for measurements. These sections allow locating the maximal dimensions of teeth as well as collecting data from all parts of the tooth to describe it morphologically. Referring odontometric data to the whole tooth yields more precise and objective records, which have proved informative in a series of dental and anthropological studies. Growing interest in, and implementation of, digital technology in odontometric studies calls for work that supports the transition to new methods. The current research aims to undertake a comparative study of traditional and automated digital odontometry. The influence of various approaches to odontotomy (number and direction of sections) on odontometric data is also studied. The above-mentioned tooth shape analysis is applied to samples from the archaeological site of Nerqin Naver to contribute to complex odontological studies of the Early Bronze burials.
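
    The per-section measurement idea can be sketched directly: given contours from parallel, equally spaced cuts, compute each section's extent along the chosen axis and take the overall maximum. The contours and axis convention below are invented for illustration:

```python
import numpy as np

def section_diameters(sections, axis=0):
    """sections: list of (n_points, 2) contour arrays from parallel,
    equally spaced cuts. Returns each section's extent along `axis`
    (e.g. the mesio-distal direction) and the overall maximum."""
    extents = [pts[:, axis].max() - pts[:, axis].min() for pts in sections]
    return extents, max(extents)

# Three hypothetical crown sections: ellipse-like contours of shrinking size
theta = np.linspace(0, 2 * np.pi, 100)
sections = [np.c_[a * np.cos(theta), 0.8 * a * np.sin(theta)]
            for a in (5.2, 4.9, 4.1)]
per_section, md_max = section_diameters(sections)
print([round(d, 2) for d in per_section], round(md_max, 2))  # max ~ 10.4
```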

  11. Automatic pH Control and Soluble and Insoluble Substrate Input for Continuous Culture of Rumen Microorganisms

    PubMed Central

    Slyter, Leonard L.

    1975-01-01

    An artificial rumen continuous culture with pH control, automated input of water-soluble and water-insoluble substrates, controlled mixing of contents, and a collection system for gas is described. PMID:16350029

  12. Automation's influence on nuclear power plants: a look at three accidents and how automation played a role.

    PubMed

    Schmitt, Kara

    2012-01-01

    Nuclear power is one of the ways that we can design an efficient sustainable future. Automation is the primary system used to assist operators in the task of monitoring and controlling nuclear power plants (NPP). Automation performs tasks such as assessing the status of the plant's operations as well as making real-time, life-critical, situation-specific decisions. While the advantages and disadvantages of automation are well studied in a variety of domains, accidents remind us that there is still vulnerability to unknown variables. This paper will look at the effects of automation within three NPP accidents and incidents and will consider why automation failed in preventing these accidents from occurring. It will also review the accidents at the Three Mile Island, Chernobyl, and Fukushima Daiichi NPPs in order to determine where better use of automation could have resulted in a more desirable outcome.

  13. Parallel consistent labeling algorithms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Samal, A.; Henderson, T.

    Mackworth and Freuder have analyzed the time complexity of several constraint satisfaction algorithms. Mohr and Henderson have given new algorithms, AC-4 and PC-3, for arc and path consistency, respectively, and have shown that the arc consistency algorithm is optimal in time complexity and of the same order space complexity as the earlier algorithms. In this paper, they give parallel algorithms for solving node and arc consistency. They show that any parallel algorithm for enforcing arc consistency in the worst case must have O(na) sequential steps, where n is the number of nodes, and a is the number of labels per node. They give several parallel algorithms to do arc consistency. It is also shown that they all have optimal time complexity. The results of running the parallel algorithms on a BBN Butterfly multiprocessor are also presented.
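
    For reference, the sequential arc-consistency computation that the parallel algorithms distribute can be sketched as follows (an AC-3-style worklist, shown here as a baseline rather than the paper's AC-4 or parallel formulations):

```python
from collections import deque

def ac3(domains, constraints):
    """Sequential AC-3 arc consistency: prune values with no support.

    domains: {var: set(labels)}; constraints: {(x, y): predicate(a, b)}.
    The parallel algorithms discussed above process such arcs concurrently.
    """
    arcs = deque(constraints)
    while arcs:
        x, y = arcs.popleft()
        pred = constraints[(x, y)]
        pruned = {a for a in domains[x]
                  if not any(pred(a, b) for b in domains[y])}
        if pruned:
            domains[x] -= pruned
            # Re-examine arcs pointing at x, whose support may be gone
            arcs.extend(arc for arc in constraints if arc[1] == x)
    return domains

doms = {"X": {1, 2, 3}, "Y": {1, 2, 3}}
cons = {("X", "Y"): lambda a, b: a < b, ("Y", "X"): lambda a, b: a > b}
print(ac3(doms, cons))   # X loses 3, Y loses 1
```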

  14. Automation in School Library Media Centers.

    ERIC Educational Resources Information Center

    Driver, Russell W.; Driver, Mary Anne

    1982-01-01

    Surveys the historical development of automated technical processing in schools and notes the impact of this automation in a number of cases. Speculations about the future involvement of school libraries in automated processing and networking are included. Thirty references are listed. (BBM)

  15. Robo-Lector - a novel platform for automated high-throughput cultivations in microtiter plates with high information content.

    PubMed

    Huber, Robert; Ritter, Daniel; Hering, Till; Hillmer, Anne-Kathrin; Kensy, Frank; Müller, Carsten; Wang, Le; Büchs, Jochen

    2009-08-01

    In industry and academic research, there is an increasing demand for flexible automated microfermentation platforms with advanced sensing technology. However, up to now, conventional platforms have not been able to generate continuous data in high-throughput cultivations, in particular for monitoring biomass and fluorescent proteins. Furthermore, microfermentation platforms are needed that can easily combine cost-effective, disposable microbioreactors with downstream processing and analytical assays. To meet this demand, a novel automated microfermentation platform consisting of a BioLector and a liquid-handling robot (Robo-Lector) was successfully built and tested. The BioLector provides a cultivation system that is able to permanently monitor microbial growth and the fluorescence of reporter proteins under defined conditions in microtiter plates. Three exemplary methods were programmed on the Robo-Lector platform to study in detail high-throughput cultivation processes and especially recombinant protein expression. The host/vector system E. coli BL21(DE3) pRhotHi-2-EcFbFP, expressing the fluorescence protein EcFbFP, was investigated. With the method 'induction profiling' it was possible to conduct 96 different induction experiments (varying inducer concentrations from 0 to 1.5 mM IPTG at 8 different induction times) simultaneously in an automated way. The method 'biomass-specific induction' allowed cultures with different growth kinetics in a microtiter plate to be induced automatically at the same biomass concentration, which resulted in a relative standard deviation of the EcFbFP production of only +/- 7%. The third method, 'biomass-specific replication', enabled equal initial biomass concentrations to be generated in main cultures from precultures with different growth kinetics. This was realized by automatically transferring an appropriate inoculum volume from the different preculture microtiter wells to respective wells of the main culture plate, where subsequently similar
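
    The 'biomass-specific replication' step is, at its core, a dilution calculation; a minimal sketch, assuming a simple C1*V1 = C2*V2 relation and invented ODs and volumes:

```python
def inoculum_volume(od_preculture, od_target, v_main_ml):
    """Volume to transfer so every main-culture well starts at the same
    biomass: C1*V1 = C2*V2 rearranged for V1 (simple dilution, ignoring
    the inoculum's contribution to the final volume)."""
    return od_target * v_main_ml / od_preculture

# Precultures with different growth kinetics reach different ODs,
# yet each main well should start at the same OD (values invented).
for od in (2.4, 3.1, 1.7):
    v = inoculum_volume(od, od_target=0.1, v_main_ml=1.0)
    print(f"preculture OD {od}: transfer {v * 1000:.0f} uL")
```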

  16. Partitioning in parallel processing of production systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oflazer, K.

    1987-01-01

    This thesis presents research on certain issues related to parallel processing of production systems. It first presents a parallel production system interpreter that has been implemented on a four-processor multiprocessor. This parallel interpreter is based on Forgy's OPS5 interpreter and exploits production-level parallelism in production systems. Runs on the multiprocessor system indicate that it is possible to obtain speed-up of around 1.7 in the match computation for certain production systems when productions are split into three sets that are processed in parallel. The next issue addressed is that of partitioning a set of rules to processors in a parallel interpreter with production-level parallelism, and the extent of additional improvement in performance. The partitioning problem is formulated and an algorithm for approximate solutions is presented. The thesis next presents a parallel processing scheme for OPS5 production systems that allows some redundancy in the match computation. This redundancy enables the processing of a production to be divided into units of medium granularity each of which can be processed in parallel. Subsequently, a parallel processor architecture for implementing the parallel processing algorithm is presented.

  17. Introduction matters: Manipulating trust in automation and reliance in automated driving.

    PubMed

    Körber, Moritz; Baseler, Eva; Bengler, Klaus

    2018-01-01

    Trust in automation is a key determinant for the adoption of automated systems and their appropriate use. Therefore, it constitutes an essential research area for the introduction of automated vehicles to road traffic. In this study, we investigated the influence of trust promoting (Trust promoted group) and trust lowering (Trust lowered group) introductory information on reported trust, reliance behavior and take-over performance. Forty participants encountered three situations in a 17-min highway drive in a conditionally automated vehicle (SAE Level 3). Situation 1 and Situation 3 were non-critical situations where a take-over was optional. Situation 2 represented a critical situation where a take-over was necessary to avoid a collision. A non-driving-related task (NDRT) was presented between the situations to record the allocation of visual attention. Participants reporting a higher trust level spent less time looking at the road or instrument cluster and more time looking at the NDRT. The manipulation of introductory information resulted in medium differences in reported trust and influenced participants' reliance behavior. Participants of the Trust promoted group looked less at the road or instrument cluster and more at the NDRT. Participants of the Trust promoted group were 3.65 times (Situation 1) to 5 times (Situation 3) more likely to overrule the automated driving system in the non-critical situations. In Situation 2, the Trust promoted group's mean take-over time was extended by 1154 ms and the mean minimum time-to-collision was 933 ms shorter. Six participants from the Trust promoted group, compared to none from the Trust lowered group, collided with the obstacle. The results demonstrate that the individual trust level influences how much drivers monitor the environment while performing an NDRT. Introductory information influences this trust level, reliance on an automated driving system, and if a critical take-over situation can be

  18. Flight-deck automation: Promises and problems

    NASA Technical Reports Server (NTRS)

    Wiener, E. L.; Curry, R. E.

    1980-01-01

    The state of the art in human factors in flight-deck automation is presented. A number of critical problem areas are identified and broad design guidelines are offered. Automation-related aircraft accidents and incidents are discussed as examples of human factors problems in automated flight.

  19. Automation in Photogrammetry,

    DTIC Science & Technology

    1980-07-25

    matrix (DTM) and digital planimetric data, combined and integrated into so-called "data bases." I'll say more about this later. AUTOMATION OF ... projection with mechanical inversors to maintain the Scheimpflug condition. Some automation has been achieved, with computer control to determine rectifier ... matrix (DTM) form that is not necessarily collected from the same photography as that from which the orthophoto is being produced. Because they are

  20. The NAS parallel benchmarks

    NASA Technical Reports Server (NTRS)

    Bailey, D. H.; Barszcz, E.; Barton, J. T.; Carter, R. L.; Lasinski, T. A.; Browning, D. S.; Dagum, L.; Fatoohi, R. A.; Frederickson, P. O.; Schreiber, R. S.

    1991-01-01

    A new set of benchmarks has been developed for the performance evaluation of highly parallel supercomputers in the framework of the NASA Ames Numerical Aerodynamic Simulation (NAS) Program. These consist of five 'parallel kernel' benchmarks and three 'simulated application' benchmarks. Together they mimic the computation and data movement characteristics of large-scale computational fluid dynamics applications. The principal distinguishing feature of these benchmarks is their 'pencil and paper' specification - all details of these benchmarks are specified only algorithmically. In this way many of the difficulties associated with conventional benchmarking approaches on highly parallel systems are avoided.

  1. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    § 864.5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of the patient's peripheral blood (blood circulating in one of the body's extremities, such as the arm). These...

  2. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    § 864.5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of the patient's peripheral blood (blood circulating in one of the body's extremities, such as the arm). These...

  3. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    § 864.5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of the patient's peripheral blood (blood circulating in one of the body's extremities, such as the arm). These...

  4. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    § 864.5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of the patient's peripheral blood (blood circulating in one of the body's extremities, such as the arm). These...

  5. Sublattice parallel replica dynamics.

    PubMed

    Martínez, Enrique; Uberuaga, Blas P; Voter, Arthur F

    2014-06-01

    Exascale computing presents a challenge for the scientific community as new algorithms must be developed to take full advantage of the new computing paradigm. Atomistic simulation methods that offer full fidelity to the underlying potential, i.e., molecular dynamics (MD) and parallel replica dynamics, fail to use the whole machine speedup, leaving a region in time and sample size space that is unattainable with current algorithms. In this paper, we present an extension of the parallel replica dynamics algorithm [A. F. Voter, Phys. Rev. B 57, R13985 (1998)] by combining it with the synchronous sublattice approach of Shim and Amar [Phys. Rev. B 71, 125432 (2005)], thereby exploiting event locality to improve the algorithm scalability. This algorithm is based on a domain decomposition in which events happen independently in different regions in the sample. We develop an analytical expression for the speedup given by this sublattice parallel replica dynamics algorithm and compare it with parallel MD and traditional parallel replica dynamics. We demonstrate how this algorithm, which introduces a slight additional approximation of event locality, enables the study of physical systems unreachable with traditional methodologies and promises to better utilize the resources of current high performance and future exascale computers.

  6. Variability of ribosomal RNA genes in Rauwolfia species: parallelism between tissue culture-induced rearrangements and interspecies polymorphism.

    PubMed

    Andreev, I O; Spiridonova, K V; Solovyan, V T; Kunakh, V A

    2005-01-01

    An analysis of 18S-25S and 5S rRNA genes in intact plants and cultured tissues of some Rauwolfia species was performed to compare the variability of these sequences arising from species evolution in nature with that induced by tissue culture. Restriction fragment length polymorphism of 18S-25S and 5S rDNA was found both in intact plants of various Rauwolfia species and in long-term Rauwolfia serpentina tissue cultures. In addition, changes in the amount of 18S-25S rRNA genes were observed in long-term R. serpentina tissue cultures. The results demonstrate that the rDNA variability observed in intact plants as well as in long-term cultures is attributed to differences in the same regions of the ribosomal RNA genes.

  7. Fully Automated Data Collection Using PAM and the Development of PAM/SPACE Reversible Cassettes

    NASA Astrophysics Data System (ADS)

    Hiraki, Masahiko; Watanabe, Shokei; Chavas, Leonard M. G.; Yamada, Yusuke; Matsugaki, Naohiro; Igarashi, Noriyuki; Wakatsuki, Soichi; Fujihashi, Masahiro; Miki, Kunio; Baba, Seiki; Ueno, Go; Yamamoto, Masaki; Suzuki, Mamoru; Nakagawa, Atsushi; Watanabe, Nobuhisa; Tanaka, Isao

    2010-06-01

    To remotely control and automatically collect data in high-throughput X-ray data collection experiments, the Structural Biology Research Center at the Photon Factory (PF) developed and installed sample exchange robots PAM (PF Automated Mounting system) at the PF macromolecular crystallography beamlines BL-5A, BL-17A, AR-NW12A and AR-NE3A. We developed and installed software that manages the flow of the automated X-ray experiments: sample exchanges, loop-centering and X-ray diffraction data collection. The fully automated data collection function has been available since February 2009. To identify sample cassettes, PAM employs a two-dimensional bar code reader. New beamlines, BL-1A at the Photon Factory and BL32XU at SPring-8, are currently under construction as part of the Targeted Proteins Research Program (TPRP) by the Ministry of Education, Culture, Sports, Science and Technology of Japan. However, different robots, PAM and SPACE (SPring-8 Precise Automatic Cryo-sample Exchanger), will be installed at BL-1A and BL32XU, respectively. For the convenience of the users of both facilities, pins and cassettes for PAM and SPACE are developed as part of the TPRP.

  8. Design and implementation of an automated compound management system in support of lead optimization.

    PubMed

    Quintero, Catherine; Kariv, Ilona

    2009-06-01

    To meet the needs of the increasingly rapid and parallelized lead optimization process, a fully integrated local compound storage and liquid handling system was designed and implemented to automate the generation of assay-ready plates directly from newly submitted and cherry-picked compounds. A key feature of the system is the ability to create project- or assay-specific compound-handling methods, which provide flexibility for any combination of plate types, layouts, and plate bar-codes. Project-specific workflows can be created by linking methods for processing new and cherry-picked compounds and control additions to produce a complete compound set for both biological testing and local storage in one uninterrupted workflow. A flexible cherry-pick approach allows for multiple, user-defined strategies to select the most appropriate replicate of a compound for retesting. Examples of custom selection parameters include available volume, compound batch, and number of freeze/thaw cycles. This adaptable and integrated combination of software and hardware provides a basis for reducing cycle time, fully automating compound processing, and ultimately increasing the rate at which accurate, biologically relevant results can be produced for compounds of interest in the lead optimization process.
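
    The replicate-selection logic lends itself to a small illustration. A minimal sketch, with hypothetical record fields, of a user-defined cherry-pick strategy of the kind described: require sufficient volume, prefer a given batch when available, then take the replicate with the fewest freeze/thaw cycles:

      from dataclasses import dataclass

      @dataclass
      class Replicate:
          compound_id: str
          batch: str
          volume_ul: float
          freeze_thaw_cycles: int

      def pick_replicate(replicates, min_volume_ul=5.0, preferred_batch=None):
          """Select the most appropriate replicate of a compound for retesting."""
          usable = [r for r in replicates if r.volume_ul >= min_volume_ul]
          if preferred_batch is not None:
              batch_hits = [r for r in usable if r.batch == preferred_batch]
              usable = batch_hits or usable
          return min(usable, key=lambda r: r.freeze_thaw_cycles) if usable else None

      stock = [
          Replicate("CMPD-17", "B1", 12.0, 4),
          Replicate("CMPD-17", "B2", 30.0, 1),
          Replicate("CMPD-17", "B2", 3.0, 0),   # insufficient volume remaining
      ]
      print(pick_replicate(stock, min_volume_ul=5.0, preferred_batch="B2"))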

  9. Simulations of Continuous Descent Operations with Arrival-management Automation and Mixed Flight-deck Interval Management Equipage

    NASA Technical Reports Server (NTRS)

    Callantine, Todd J.; Kupfer, Michael; Martin, Lynne Hazel; Prevot, Thomas

    2013-01-01

    Air traffic management simulations conducted in the Airspace Operations Laboratory at NASA Ames Research Center have addressed the integration of trajectory-based arrival-management automation, controller tools, and Flight-Deck Interval Management avionics to enable Continuous Descent Operations (CDOs) during periods of sustained high traffic demand. The simulations are devoted to maturing the integrated system for field demonstration and refining the controller tools, clearance phraseology, and procedures specified in the associated concept of operations. The results indicate that a variety of factors affect the concept's safety and viability from a controller's perspective, including en-route preconditioning of arrival flows, usable clearance phraseology, and the characteristics of the airspace, routes, and traffic-management methods in use at a particular site. A clear understanding of automation behavior and of the required shifts in roles and responsibilities is important for controller acceptance and for realizing potential benefits. This paper discusses the simulations, drawing parallels with results from related European efforts. The most recent study found that en-route controllers can effectively precondition arrival flows, which significantly improved route conformance during CDOs. Controllers found the tools acceptable, in line with previous studies.

  10. The Galley Parallel File System

    NASA Technical Reports Server (NTRS)

    Nieuwejaar, Nils; Kotz, David

    1996-01-01

    Most current multiprocessor file systems are designed to use multiple disks in parallel, using the high aggregate bandwidth to meet the growing I/O requirements of parallel scientific applications. Many multiprocessor file systems provide applications with a conventional Unix-like interface, allowing the application to access multiple disks transparently. This interface conceals the parallelism within the file system, increasing the ease of programmability, but making it difficult or impossible for sophisticated programmers and libraries to use knowledge about their I/O needs to exploit that parallelism. In addition to providing an insufficient interface, most current multiprocessor file systems are optimized for a different workload than they are being asked to support. We introduce Galley, a new parallel file system that is intended to efficiently support realistic scientific multiprocessor workloads. We discuss Galley's file structure and application interface, as well as the performance advantages offered by that interface.
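
    The interface question the paper raises can be made concrete. A hypothetical sketch (these names are not Galley's actual API) of the difference between a Unix-like interface that hides striping and an explicit interface that lets a knowledgeable application or library place data on specific subfiles:

      class ParallelFile:
          """Toy model of a file striped across several I/O servers."""

          def __init__(self, n_subfiles: int):
              # one subfile per I/O server; in-memory stand-ins here
              self.subfiles = [bytearray() for _ in range(n_subfiles)]

          def write(self, data: bytes, stripe: int = 4) -> None:
              # transparent, Unix-like access: striping hidden from the caller
              for i in range(0, len(data), stripe):
                  self.subfiles[(i // stripe) % len(self.subfiles)] += data[i:i + stripe]

          def write_subfile(self, index: int, data: bytes) -> None:
              # explicit access: a library that knows its I/O pattern
              # chooses where the data lands
              self.subfiles[index] += data

      f = ParallelFile(n_subfiles=4)
      f.write(b"transparent striped write")
      f.write_subfile(2, b"explicit placement")
      print([bytes(s) for s in f.subfiles])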

  11. Parallel computation with the force

    NASA Technical Reports Server (NTRS)

    Jordan, H. F.

    1985-01-01

    A methodology, called the force, supports the construction of programs to be executed in parallel by a force of processes. The number of processes in the force is unspecified, but potentially very large. The force idea is embodied in a set of macros that produce multiprocessor FORTRAN code; it has been studied on two shared-memory multiprocessors of fairly different character. The method has simplified the writing of highly parallel programs within a limited class of parallel algorithms and is being extended to cover a broader class. The individual parallel constructs which comprise the force methodology are discussed. Of central concern are their semantics, their implementation on different architectures, and their performance implications.
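
    Two of the central constructs, a prescheduled parallel loop whose iterations are dealt out statically to an unspecified number of processes and a barrier synchronizing the whole force, can be sketched in Python (the original macros generated multiprocessor FORTRAN; this analogue is illustrative only):

      import threading

      N_PROC = 8                       # size of the force, fixed at run time
      barrier = threading.Barrier(N_PROC)
      a = list(range(100))
      partial = [0] * N_PROC

      def force_member(me: int) -> None:
          # prescheduled DO: iteration i belongs to process i mod N_PROC
          for i in range(me, len(a), N_PROC):
              partial[me] += a[i]
          barrier.wait()               # the whole force synchronizes here
          if me == 0:                  # one member combines the partial sums
              print("sum =", sum(partial))

      threads = [threading.Thread(target=force_member, args=(p,))
                 for p in range(N_PROC)]
      for t in threads:
          t.start()
      for t in threads:
          t.join()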

  12. Automation Applications in an Advanced Air Traffic Management System : Volume 4A. Automation Requirements.

    DOT National Transportation Integrated Search

    1974-08-01

    Volume 4 describes the automation requirements. A presentation of automation requirements is made for an advanced air traffic management system in terms of controller work force, computer resources, controller productivity, system manning, failure ef...

  13. Comparison of Susceptibility Testing of Mycobacterium tuberculosis Using the ESP Culture System II with That Using the BACTEC Method

    PubMed Central

    Ruiz, P.; Zerolo, F. J.; Casal, M. J.

    2000-01-01

    The ESP Culture System II was evaluated for its capacity to test the susceptibility of 389 cultures of Mycobacterium tuberculosis to streptomycin, rifampin, ethambutol, and isoniazid. Good agreement with the results of the BACTEC TB 460 method was found. ESP II is a reliable, rapid, and automated method for performing susceptibility testing. PMID:11101619

  14. The Pros and Cons of Army Automation

    DTIC Science & Technology

    2007-11-13

    Abstract not captured; the extracted text contains only report documentation-page boilerplate and a fragment of the report outline (I. Introduction; II. Manual skills).

  15. Design considerations for parallel graphics libraries

    NASA Technical Reports Server (NTRS)

    Crockett, Thomas W.

    1994-01-01

    Applications which run on parallel supercomputers are often characterized by massive datasets. Converting these vast collections of numbers to visual form has proven to be a powerful aid to comprehension. For a variety of reasons, it may be desirable to provide this visual feedback at runtime. One way to accomplish this is to exploit the available parallelism to perform graphics operations in place. In order to do this, we need appropriate parallel rendering algorithms and library interfaces. This paper provides a tutorial introduction to some of the issues which arise in designing parallel graphics libraries and their underlying rendering algorithms. The focus is on polygon rendering for distributed memory message-passing systems. We illustrate our discussion with examples from PGL, a parallel graphics library which has been developed on the Intel family of parallel systems.
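
    One recurring design point in such libraries is how partial images rendered in place on each node are merged into a final picture. A small sketch of depth compositing, as used in sort-last designs (illustrative only, not necessarily the approach PGL takes):

      import numpy as np

      W, H, N_NODES = 4, 3, 3
      rng = np.random.default_rng(1)

      # each node rasterizes its local polygons into color and depth buffers
      colors = rng.integers(0, 256, size=(N_NODES, H, W), dtype=np.uint8)
      depths = rng.random((N_NODES, H, W))      # smaller depth = closer surface

      # composite: at every pixel, keep the color of the closest surface
      winner = depths.argmin(axis=0)            # which node wins each pixel
      final = np.take_along_axis(colors, winner[None], axis=0)[0]
      print(final)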

  16. I trust it, but I don't know why: effects of implicit attitudes toward automation on trust in an automated system.

    PubMed

    Merritt, Stephanie M; Heimbaugh, Heather; LaChapell, Jennifer; Lee, Deborah

    2013-06-01

    This study is the first to examine the influence of implicit attitudes toward automation on users' trust in automation. Past empirical work has examined explicit (conscious) influences on user level of trust in automation but has not yet measured implicit influences. We examine concurrent effects of explicit propensity to trust machines and implicit attitudes toward automation on trust in an automated system. We examine differential impacts of each under varying automation performance conditions (clearly good, ambiguous, clearly poor). Participants completed both a self-report measure of propensity to trust and an Implicit Association Test measuring implicit attitude toward automation, then performed an X-ray screening task. Automation performance was manipulated within-subjects by varying the number and obviousness of errors. Explicit propensity to trust and implicit attitude toward automation did not significantly correlate. When the automation's performance was ambiguous, implicit attitude significantly affected automation trust, and its relationship with propensity to trust was additive: Increments in either were related to increases in trust. When errors were obvious, a significant interaction between the implicit and explicit measures was found, with those high in both having higher trust. Implicit attitudes have important implications for automation trust. Users may not be able to accurately report why they experience a given level of trust. To understand why users trust or fail to trust automation, measurements of implicit and explicit predictors may be necessary. Furthermore, implicit attitude toward automation might be used as a lever to effectively calibrate trust.

  17. Automation-induced monitoring inefficiency: role of display location.

    PubMed

    Singh, I L; Molloy, R; Parasuraman, R

    1997-01-01

    Operators can be poor monitors of automation if they are engaged concurrently in other tasks. However, in previous studies of this phenomenon the automated task was always presented in the periphery, away from the primary manual tasks that were centrally displayed. In this study we examined whether centrally locating an automated task would boost monitoring performance during a flight-simulation task consisting of system monitoring, tracking and fuel resource management sub-tasks. Twelve nonpilot subjects were required to perform the tracking and fuel management tasks manually while watching the automated system monitoring task for occasional failures. The automation reliability was constant at 87.5% for six subjects and variable (alternating between 87.5% and 56.25%) for the other six. Each subject completed four 30-min sessions over a period of 2 days. In each automation reliability condition the automation routine was disabled for the last 20 min of the fourth session in order to simulate catastrophic automation failure (0% reliability). Monitoring for automation failure was inefficient when automation reliability was constant but not when it varied over time, replicating previous results. Furthermore, there was no evidence of a resource or speed-accuracy trade-off between tasks. Thus, automation-induced failures of monitoring cannot be prevented by centrally locating the automated task.

  18. Automation-induced monitoring inefficiency: role of display location

    NASA Technical Reports Server (NTRS)

    Singh, I. L.; Molloy, R.; Parasuraman, R.

    1997-01-01

    Operators can be poor monitors of automation if they are engaged concurrently in other tasks. However, in previous studies of this phenomenon the automated task was always presented in the periphery, away from the primary manual tasks that were centrally displayed. In this study we examined whether centrally locating an automated task would boost monitoring performance during a flight-simulation task consisting of system monitoring, tracking and fuel resource management sub-tasks. Twelve nonpilot subjects were required to perform the tracking and fuel management tasks manually while watching the automated system monitoring task for occasional failures. The automation reliability was constant at 87.5% for six subjects and variable (alternating between 87.5% and 56.25%) for the other six. Each subject completed four 30-min sessions over a period of 2 days. In each automation reliability condition the automation routine was disabled for the last 20 min of the fourth session in order to simulate catastrophic automation failure (0% reliability). Monitoring for automation failure was inefficient when automation reliability was constant but not when it varied over time, replicating previous results. Furthermore, there was no evidence of a resource or speed-accuracy trade-off between tasks. Thus, automation-induced failures of monitoring cannot be prevented by centrally locating the automated task.

  19. Humans: still vital after all these years of automation.

    PubMed

    Parasuraman, Raja; Wickens, Christopher D

    2008-06-01

    The authors discuss empirical studies of human-automation interaction and their implications for automation design. Automation is prevalent in safety-critical systems and increasingly in everyday life. Many studies of human performance in automated systems have been conducted over the past 30 years. Developments in three areas are examined: levels and stages of automation, reliance on and compliance with automation, and adaptive automation. Automation applied to information analysis or decision-making functions leads to differential system performance benefits and costs that must be considered in choosing appropriate levels and stages of automation. Human user dependence on automated alerts and advisories reflects two components of operator trust, reliance and compliance, which are in turn determined by the threshold designers use to balance automation misses and false alarms. Finally, adaptive automation can provide additional benefits in balancing workload and maintaining the user's situation awareness, although more research is required to identify when adaptation should be user controlled or system driven. The past three decades of empirical research on humans and automation have provided a strong science base that can be used to guide the design of automated systems. This research can be applied to most current and future automated systems.

  20. Automated seismic waveform location using Multichannel Coherency Migration (MCM)-I. Theory

    NASA Astrophysics Data System (ADS)

    Shi, Peidong; Angus, Doug; Rost, Sebastian; Nowacki, Andy; Yuan, Sanyi

    2018-03-01

    With the proliferation of dense seismic networks sampling the full seismic wavefield, recorded seismic data volumes are growing rapidly, and automated analysis tools to locate seismic events are essential. Here, we propose a novel Multichannel Coherency Migration (MCM) method to locate earthquakes in continuous seismic data and reveal the location and origin time of seismic events directly from recorded waveforms. By continuously calculating the coherency between waveforms from different receiver pairs, MCM greatly expands the information available for event location. MCM requires neither phase picking nor phase identification, which allows fully automated waveform analysis. By migrating the coherency between waveforms, MCM improves source energy focusing. We have tested and compared MCM with other migration-based methods on noise-free and noisy synthetic data. The tests and analysis show that MCM is noise resistant and achieves more accurate results than other migration-based methods. MCM is able to suppress strong interference from other seismic sources occurring at a similar time and location. It can be used with arbitrary 3D velocity models and obtains reasonable locations even with smooth but inaccurate velocity models. MCM exhibits excellent location performance and can be easily parallelized, giving it great potential to be developed into a real-time location method for very large datasets.
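
    The core computation can be sketched compactly: for each candidate source position (at a trial origin time), traces are aligned by the predicted traveltimes, and the normalized zero-lag cross-correlation is averaged over all receiver pairs. A schematic sketch under simplifying assumptions (homogeneous velocity, zero-lag coherency as the coherency measure, correlation windows assumed to lie inside the traces; this is not the authors' code):

      import numpy as np

      def coherency_image(traces, rec_xy, grid_xy, t0, v, dt, win):
          """Stack pairwise waveform coherency over all receiver pairs for
          each candidate source position at a fixed trial origin time t0."""
          n_rec = len(rec_xy)
          image = np.zeros(len(grid_xy))
          for g, src in enumerate(grid_xy):
              # predicted traveltime from candidate source to every receiver
              tt = np.linalg.norm(rec_xy - src, axis=1) / v
              idx = ((t0 + tt) / dt).astype(int)
              # moveout-corrected waveform windows, one per receiver
              segs = np.stack([traces[r, idx[r]:idx[r] + win]
                               for r in range(n_rec)])
              segs = segs - segs.mean(axis=1, keepdims=True)
              segs = segs / (np.linalg.norm(segs, axis=1, keepdims=True) + 1e-12)
              corr = segs @ segs.T              # zero-lag coherency matrix
              image[g] = corr[np.triu_indices(n_rec, 1)].mean()
          return image                          # peaks near the true source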