Knepper, Andreas; Heiser, Michael; Glauche, Florian; Neubauer, Peter
2014-12-01
The enormous range of possible bioprocess variants challenges process development to settle on a commercial process within acceptable cost and time limits. Although some cultivation systems and some devices for unit operations combine the latest technology in miniaturization, parallelization, and sensing, the degree of automation in upstream and downstream bioprocess development is still limited to single steps. We aim to meet this challenge with an interdisciplinary approach that significantly shortens development times and costs. As a first step, we scaled down analytical assays to the microliter scale and created automated procedures for starting the cultivation and for monitoring the optical density (OD), pH, the concentrations of glucose and acetate in the culture medium, and product formation in fed-batch cultures in the 96-well format. Then, the separate measurements of pH, OD, and the concentrations of acetate and glucose were combined into a single method. This method enables automated process monitoring at dedicated intervals (e.g., also during the night). By this approach, we increased the information content of cultivations in 96-well microplates, turning them into a suitable tool for high-throughput bioprocess development. Here, we present the flowcharts as well as cultivation data of our automation approach. © 2014 Society for Laboratory Automation and Screening.
den Hertog, Alice L.; Visser, Dennis W.; Ingham, Colin J.; Fey, Frank H. A. G.; Klatser, Paul R.; Anthony, Richard M.
2010-01-01
Background: Even with the advent of nucleic acid (NA) amplification technologies, the culture of mycobacteria for diagnostic and other applications remains of critical importance. Notably, microscopic-observation drug susceptibility testing (MODS), as opposed to traditional culture on solid media or automated liquid culture, has shown potential to both speed up and increase the provision of mycobacterial culture in high-burden settings. Methods: Here we explore the growth of Mycobacterium tuberculosis microcolonies, imaged by automated digital microscopy, cultured on porous aluminium oxide (PAO) supports. Repeated imaging during colony growth greatly simplifies “computer vision”, and presumptive identification of microcolonies was achieved here using existing publicly available algorithms. Our system thus allows the growth of individual microcolonies to be monitored and, critically, also allows the media to be changed during the growth phase without disrupting the microcolonies. Transfer of identified microcolonies onto selective media allowed us, within 1-2 bacterial generations, to rapidly detect the drug susceptibility of individual microcolonies, eliminating the need for time-consuming subculturing or the inoculation of multiple parallel cultures. Significance: Monitoring the phenotype of individual microcolonies as they grow has immense potential for research, screening, and ultimately M. tuberculosis diagnostic applications. The method described is particularly appealing with respect to speed and automation. PMID:20544033
Garty, Guy; Chen, Youhua; Turner, Helen C; Zhang, Jian; Lyulko, Oleksandra V; Bertucci, Antonella; Xu, Yanping; Wang, Hongliang; Simaan, Nabil; Randers-Pehrson, Gerhard; Lawrence Yao, Y; Brenner, David J
2011-08-01
Over the past five years the Center for Minimally Invasive Radiation Biodosimetry at Columbia University has developed the Rapid Automated Biodosimetry Tool (RABiT), a completely automated, ultra-high throughput biodosimetry workstation. This paper describes recent upgrades and reliability testing of the RABiT. The RABiT analyses fingerstick-derived blood samples to estimate past radiation exposure or to identify individuals exposed above or below a cut-off dose. Through automated robotics, lymphocytes are extracted from fingerstick blood samples into filter-bottomed multi-well plates. Depending on the time since exposure, the RABiT scores either micronuclei or phosphorylation of the histone H2AX, in an automated robotic system, using filter-bottomed multi-well plates. Following lymphocyte culturing, fixation and staining, the filter bottoms are removed from the multi-well plates and sealed prior to automated high-speed imaging. Image analysis is performed online using dedicated image processing hardware. Both the sealed filters and the images are archived. We have developed a new robotic system for lymphocyte processing, making use of increased laser power and parallel processing of four capillaries at once. This system has allowed acceleration of lymphocyte isolation, the main bottleneck of the RABiT operation, from 12 to 2 sec/sample. Reliability tests have been performed on all robotic subsystems. Parallel handling of multiple samples through the use of dedicated, purpose-built robotics and high-speed imaging allows analysis of up to 30,000 samples per day.
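Taking the throughput figures quoted in this abstract at face value, the stated daily capacity is consistent with the per-sample isolation time; a quick arithmetic sketch (numbers from the abstract, interpretation ours):

```python
# Throughput arithmetic for the figures quoted in the abstract.
old_rate_s = 12.0  # seconds per sample before the upgrade
new_rate_s = 2.0   # seconds per sample with four capillaries in parallel
seconds_per_day = 24 * 3600

speedup = old_rate_s / new_rate_s        # 6x faster lymphocyte isolation
capacity = seconds_per_day / new_rate_s  # raw isolation capacity per day
# 43,200 isolations/day leaves headroom above the stated 30,000
# samples/day once the other pipeline stages are taken into account.
```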
Garty, Guy; Chen, Youhua; Turner, Helen; Zhang, Jian; Lyulko, Oleksandra; Bertucci, Antonella; Xu, Yanping; Wang, Hongliang; Simaan, Nabil; Randers-Pehrson, Gerhard; Yao, Y. Lawrence; Brenner, David J.
2011-01-01
Purpose: Over the past five years the Center for Minimally Invasive Radiation Biodosimetry at Columbia University has developed the Rapid Automated Biodosimetry Tool (RABiT), a completely automated, ultra-high throughput biodosimetry workstation. This paper describes recent upgrades and reliability testing of the RABiT. Materials and methods: The RABiT analyzes fingerstick-derived blood samples to estimate past radiation exposure or to identify individuals exposed above or below a cutoff dose. Through automated robotics, lymphocytes are extracted from fingerstick blood samples into filter-bottomed multi-well plates. Depending on the time since exposure, the RABiT scores either micronuclei or phosphorylation of the histone H2AX, in an automated robotic system, using filter-bottomed multi-well plates. Following lymphocyte culturing, fixation and staining, the filter bottoms are removed from the multi-well plates and sealed prior to automated high-speed imaging. Image analysis is performed online using dedicated image processing hardware. Both the sealed filters and the images are archived. Results: We have developed a new robotic system for lymphocyte processing, making use of increased laser power and parallel processing of four capillaries at once. This system has allowed acceleration of lymphocyte isolation, the main bottleneck of the RABiT operation, from 12 to 2 sec/sample. Reliability tests have been performed on all robotic subsystems. Conclusions: Parallel handling of multiple samples through the use of dedicated, purpose-built robotics and high-speed imaging allows analysis of up to 30,000 samples per day. PMID:21557703
Automation of large scale transient protein expression in mammalian cells
Zhao, Yuguang; Bishop, Benjamin; Clay, Jordan E.; Lu, Weixian; Jones, Margaret; Daenke, Susan; Siebold, Christian; Stuart, David I.; Jones, E. Yvonne; Aricescu, A. Radu
2011-01-01
Traditional mammalian expression systems rely on the time-consuming generation of stable cell lines; this is difficult to accommodate within a modern structural biology pipeline. Transient transfections are a fast, cost-effective solution, but require skilled cell culture scientists, making manpower a limiting factor in a setting where numerous samples are processed in parallel. Here we report a strategy employing a customised CompacT SelecT cell culture robot allowing the large-scale expression of multiple protein constructs in a transient format. Successful protocols have been designed for automated transient transfection of human embryonic kidney (HEK) 293T and 293S GnTI− cells in various flask formats. Protein yields obtained by this method were similar to those obtained manually, with the added benefit of reproducibility regardless of user. Automation of cell maintenance and transient transfection allows the expression of high quality recombinant protein in a completely sterile environment with limited support from a cell culture scientist. The reduction in human input has the added benefit of enabling continuous cell maintenance and protein production, features of particular importance to structural biology laboratories, which typically use large quantities of pure recombinant proteins and often require rapid characterisation of a series of modified constructs. This automated method for large-scale transient transfection is now offered as a Europe-wide service via the P-cube initiative. PMID:21571074
Karaçalı, Bilge; Vamvakidou, Alexandra P; Tözeren, Aydın
2007-01-01
Background: Three-dimensional in vitro cultures of cancer cells are used to predict the effects of prospective anti-cancer drugs in vivo. In this study, we present an automated image analysis protocol for detailed morphological protein marker profiling of tumoroid cross-section images. Methods: Histologic cross sections of breast tumoroids developed in co-culture suspensions of breast cancer cell lines, stained for E-cadherin (Ecad) and progesterone receptor (PR), were digitized, and pixels in these images were classified into five categories using k-means clustering. Automated segmentation was used to identify image regions composed of cells expressing a given biomarker. Synthesized images were created to check the accuracy of the image processing system. Results: Accuracy of automated segmentation was over 95% in identifying regions of interest in synthesized images. Image analysis of adjacent histology slides stained, respectively, for Ecad and PR accurately predicted regions of different cell phenotypes. Image analysis of tumoroid cross sections from different tumoroids obtained under the same co-culture conditions indicated the variation of cellular composition from one tumoroid to another. Variations in the compositions of cross sections obtained from the same tumoroid were established by parallel analysis of Ecad- and PR-stained cross-section images. Conclusion: The proposed image analysis methods offer standardized high-throughput profiling of the molecular anatomy of tumoroids based on both membrane and nuclear markers, suitable for rapid large-scale investigations of anti-cancer compounds for drug development. PMID:17822559
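The pixel-classification step described above can be sketched with a minimal k-means implementation. The study classified pixels into five categories; the toy colors, cluster count, and data below are illustrative only, since the abstract does not give the exact parameters:

```python
def kmeans_pixels(pixels, k, iters=10):
    """Minimal Lloyd's k-means over RGB pixel tuples (illustrative sketch)."""
    # Deterministic initialization: first k distinct pixel values.
    centroids = []
    for p in pixels:
        fp = tuple(float(c) for c in p)
        if fp not in centroids:
            centroids.append(fp)
        if len(centroids) == k:
            break
    labels = [0] * len(pixels)
    for _ in range(iters):
        # Assignment step: nearest centroid by squared Euclidean distance.
        for i, p in enumerate(pixels):
            labels[i] = min(
                range(k),
                key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])),
            )
        # Update step: each centroid becomes the mean of its members.
        for c in range(k):
            members = [pixels[i] for i, lab in enumerate(labels) if lab == c]
            if members:
                centroids[c] = tuple(sum(ch) / len(members) for ch in zip(*members))
    return labels

# Toy "cross-section": background plus two stain-like color populations.
pixels = ([(245, 245, 245)] * 50   # background
          + [(150, 90, 60)] * 30   # brown, membrane-stain-like (Ecad-like)
          + [(60, 70, 160)] * 20)  # blue, nuclear-stain-like (PR-like)
labels = kmeans_pixels(pixels, k=3)
```

In the study's protocol, the clustered pixel labels would then feed a segmentation step that groups contiguous regions expressing a given biomarker.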
Automated live cell screening system based on a 24-well-microplate with integrated micro fluidics.
Lob, V; Geisler, T; Brischwein, M; Uhl, R; Wolf, B
2007-11-01
In research, pharmacological drug screening, and medical diagnostics, the trend towards functional assays using living cells persists. Research groups working with living cells are confronted with the problem that common endpoint measurement methods cannot capture dynamic changes. By considering time as a further dimension, the dynamic and networked molecular processes of cells in culture can be monitored. These processes can be investigated by measuring several extracellular parameters. This paper describes a high-content system that provides real-time monitoring data of cell parameters (metabolic and morphological alterations), e.g., upon treatment with drug compounds. Acidification rates, oxygen consumption, and changes in adhesion forces are accessible within 24 cell cultures in parallel. Addressing the rising interest in biomedical and pharmacological high-content screening assays, a concept has been developed which integrates multi-parametric sensor readout, automated imaging, and probe handling into a single embedded platform. A life-maintenance system keeps important environmental parameters (gas, humidity, sterility, temperature) constant.
Performance of a Novel Algorithm Using Automated Digital Microscopy for Diagnosing Tuberculosis.
Ismail, Nazir A; Omar, Shaheed V; Lewis, James J; Dowdy, David W; Dreyer, Andries W; van der Meulen, Hermina; Nconjana, George; Clark, David A; Churchyard, Gavin J
2015-06-15
TBDx automated microscopy is a novel technology that processes digital microscopic images to identify acid-fast bacilli (AFB). Use of TBDx as part of a diagnostic algorithm could improve the diagnosis of tuberculosis (TB), but its performance characteristics have not yet been formally tested. To evaluate the performance of the TBDx automated microscopy system in algorithms for the diagnosis of TB, prospective samples from patients with presumed TB were processed in parallel with conventional smear microscopy, TBDx microscopy, and liquid culture. All TBDx-positive specimens were also tested with the Xpert MTB/RIF (GXP) assay. We evaluated the sensitivity and specificity of two algorithms, (1) TBDx-GXP (TBDx with positive specimens tested by Xpert MTB/RIF) and (2) TBDx alone, against the gold standard of liquid culture. Of 1,210 samples, 1,009 were eligible for evaluation, of which 109 were culture positive for Mycobacterium tuberculosis. The TBDx system identified 70 specimens (68 culture positive) as having 10 or more putative AFB (high positive) and 207 (19 culture positive) as having 1-9 putative AFB (low positive). An algorithm in which "low-positive" results on TBDx were confirmed by GXP had 78% sensitivity (85 of 109) and 98.8% specificity (889 of 900), requiring 21% (207 of 1,009) of specimens to be processed by GXP. As a stand-alone test, a "high-positive" result on TBDx had 62% sensitivity and 99.7% specificity. TBDx used in diagnostic algorithms with GXP provided reasonable sensitivity and high specificity for active TB while dramatically reducing the number of GXP tests performed. As a stand-alone microscopy system, its performance was equivalent to that of a highly experienced TB microscopist.
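The algorithm's headline figures follow directly from the counts reported in the abstract; a quick sketch reproducing them:

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# TBDx-GXP algorithm, from the counts in the abstract:
# 85 of 109 culture-positives detected; 889 of 900 culture-negatives
# correctly reported negative.
sens, spec = sens_spec(tp=85, fn=109 - 85, tn=889, fp=900 - 889)
# sens ≈ 0.78 (78%), spec ≈ 0.988 (98.8%)
```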
An Automated HIV-1 Env-Pseudotyped Virus Production for Global HIV Vaccine Trials
Fuss, Martina; Mazzotta, Angela S.; Sarzotti-Kelsoe, Marcella; Ozaki, Daniel A.; Montefiori, David C.; von Briesen, Hagen; Zimmermann, Heiko; Meyerhans, Andreas
2012-01-01
Background: Infections with HIV still represent a major human health problem worldwide, and a vaccine is the only long-term option to fight efficiently against this virus. Standardized assessments of HIV-specific immune responses in vaccine trials are essential for prioritizing vaccine candidates in preclinical and clinical stages of development. With respect to neutralizing antibodies, assays with HIV-1 Env-pseudotyped viruses are a high priority. To meet the increasing demand for HIV pseudoviruses, a complete cell culture and transfection automation system has been developed. Methodology/Principal Findings: The automation system for HIV pseudovirus production comprises a modified Tecan-based Cellerity system. It covers an area of 5×3 meters and includes a robot platform, a cell counting machine, a CO2 incubator for cell cultivation and a media refrigerator. The processes for cell handling, transfection and pseudovirus production have been implemented according to manual standard operating procedures and are controlled and scheduled autonomously by the system. The system is housed in a biosafety level II cabinet that guarantees protection of personnel, environment and the product. HIV pseudovirus stocks on a scale from 140 ml to 1000 ml have been produced on the automated system. Parallel manual production of HIV pseudoviruses and comparisons (bridging assays) confirmed that the automatically produced pseudoviruses were of equivalent quality to those produced manually. In addition, the automated method was fully validated according to Good Clinical Laboratory Practice (GCLP) guidelines, including the validation parameters accuracy, precision, robustness and specificity. Conclusions: An automated HIV pseudovirus production system has been successfully established. It allows the high-quality production of HIV pseudoviruses under GCLP conditions.
In its present form, the installed module enables the production of 1000 ml of virus-containing cell culture supernatant per week. Thus, this novel automation facilitates standardized large-scale productions of HIV pseudoviruses for ongoing and upcoming HIV vaccine trials. PMID:23300558
NASA Technical Reports Server (NTRS)
Hribar, Michelle R.; Frumkin, Michael; Jin, Haoqiang; Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)
1998-01-01
Over the past decade, high performance computing has evolved rapidly; systems based on commodity microprocessors have been introduced in quick succession from at least seven vendors/families. Porting codes to every new architecture is a difficult problem; in particular, here at NASA, there are many large CFD applications that are very costly to port to new machines by hand. The LCM ("Legacy Code Modernization") Project is developing an integrated parallelization environment (IPE) which performs the automated mapping of legacy CFD (Fortran) applications to state-of-the-art high performance computers. While most porting projects focus on the parallelization of the code, we consider porting to be an iterative process consisting of several steps: 1) code cleanup, 2) serial optimization, 3) parallelization, 4) performance monitoring and visualization, 5) intelligent tools for automated tuning using performance prediction, and 6) machine-specific optimization. The approach for building this parallelization environment is to build the components for each of the steps simultaneously and then integrate them together. The demonstration will exhibit our latest research in building this environment: 1. parallelizing tools and compiler evaluation; 2. code cleanup and serial optimization using automated scripts; 3. development of a code generator for performance prediction; 4. automated partitioning; 5. automated insertion of directives. These demonstrations will exhibit the effectiveness of an automated approach for all the steps involved with porting and tuning a legacy code application for a new architecture.
Investigation of vinegar production using a novel shaken repeated batch culture system.
Schlepütz, Tino; Büchs, Jochen
2013-01-01
Nowadays, bioprocesses are developed or optimized on a small scale. The vinegar industry, too, is motivated to reinvestigate its established repeated batch fermentation process. As yet, there has been no small-scale culture system for optimizing fermentation conditions for repeated batch bioprocesses. Thus, the aim of this study is to propose a new shaken culture system for parallel repeated batch vinegar fermentation. A new operation mode, the flushing repeated batch, was developed. Parallel repeated batch vinegar production could be established in shaken overflow vessels in a completely automated operation with only one pump per vessel. This flushing repeated batch was first theoretically investigated and then empirically tested. The ethanol concentration was monitored online during repeated batch fermentation by semiconductor gas sensors. It was shown that the switch from one ethanol substrate quality to different ethanol substrate qualities resulted in prolonged lag phases and durations of the first batches. In the subsequent batches, the length of the fermentations decreased considerably. This decrease in the respective lag phases indicates an adaptation of the acetic acid bacteria mixed culture to the specific ethanol substrate quality. Consequently, flushing repeated batch fermentations on a small scale are valuable for screening fermentation conditions and, thereby, improving industrial-scale bioprocesses such as vinegar production in terms of process robustness, stability, and productivity. Copyright © 2013 American Institute of Chemical Engineers.
Using AberOWL for fast and scalable reasoning over BioPortal ontologies.
Slater, Luke; Gkoutos, Georgios V; Schofield, Paul N; Hoehndorf, Robert
2016-08-08
Reasoning over biomedical ontologies using their OWL semantics has traditionally been a challenging task due to the high theoretical complexity of OWL-based automated reasoning. As a consequence, ontology repositories, as well as most other tools utilizing ontologies, either provide access to ontologies without use of automated reasoning, or limit the number of ontologies for which automated reasoning-based access is provided. We apply the AberOWL infrastructure to provide automated reasoning-based access to all accessible and consistent ontologies in BioPortal (368 ontologies). We perform an extensive performance evaluation to determine query times, both for queries of different complexity and for queries that are performed in parallel over the ontologies. We demonstrate that, with the exception of a few ontologies, even complex and parallel queries can now be answered in milliseconds, therefore allowing automated reasoning to be used on a large scale, to run in parallel, and with rapid response times.
NASA Astrophysics Data System (ADS)
Tsai, Cheng-Han; Wu, Xuanye; Kuan, Da-Han; Zimmermann, Stefan; Zengerle, Roland; Koltay, Peter
2018-08-01
In order to culture and analyze individual living cells, microfluidic cultivation and manipulation of cells have become an increasingly important topic. Such microfluidic systems allow for exploring the phenotypic differences between thousands of genetically identical cells, or for pharmacological tests in parallel, which is impossible to achieve by traditional macroscopic cell culture methods. Accordingly, many microfluidic systems and devices have been developed for cell-biological studies such as cell culture, cell sorting, and cell lysis. However, these microfluidic systems are still limited by their external pressure sources, which are usually large and must be connected by fluidic tubing, leading to complex and delicate systems. To provide a miniaturized, more robust actuation system, a novel, compact, low-power digital hydraulic drive (DHD) has been developed that is intended for use in portable and automated microfluidic systems for various applications. The DHD considered in this work consists of a shape memory alloy (SMA) actuator and a pneumatic cylinder. The switching time of the digital modes (pressure ON versus OFF) can be adjusted from 1 s to minutes. Thus, DHDs might have many applications for driving microfluidic devices. In this work, different implementations of DHDs are presented and their performance is characterized by experiments. In particular, it is shown that DHDs can be used for microfluidic large-scale integration (mLSI) valve control (256 valves in parallel) as well as, potentially, for droplet-based microfluidic systems. As a further application example, high-throughput mixing of cell cultures (96 wells in parallel) is demonstrated, employing the DHD to drive a so-called ‘functional lid’ (FL) to enable a miniaturized micro-bioreactor in a regular 96-well microwell plate.
A hierarchical, automated target recognition algorithm for a parallel analog processor
NASA Technical Reports Server (NTRS)
Woodward, Gail; Padgett, Curtis
1997-01-01
A hierarchical approach is described for an automated target recognition (ATR) system, VIGILANTE, that uses a massively parallel, analog processor (3DANN). The 3DANN processor is capable of performing 64 concurrent inner products of size 1x4096 every 250 nanoseconds.
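Taking the stated figures at face value, the 3DANN throughput quoted above works out to roughly 10^12 multiply-accumulate operations per second; a quick check:

```python
# Figures quoted in the abstract (arithmetic is illustrative only).
concurrent_products = 64
vector_length = 4096
cycle_time_s = 250e-9  # 250 nanoseconds per batch of inner products

macs_per_cycle = concurrent_products * vector_length  # multiply-accumulates
macs_per_second = macs_per_cycle / cycle_time_s       # ~1.05e12 MAC/s
```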
Birchler, Axel; Berger, Mischa; Jäggin, Verena; Lopes, Telma; Etzrodt, Martin; Misun, Patrick Mark; Pena-Francesch, Maria; Schroeder, Timm; Hierlemann, Andreas; Frey, Olivier
2016-01-19
Open microfluidic cell culturing devices offer new possibilities to simplify the loading, culturing, and harvesting of individual cells or microtissues, because liquids and cells/microtissues are directly accessible. We present a complete workflow for microfluidic handling and culturing of individual cells and microtissue spheroids, based on the hanging-drop network concept: the open microfluidic devices are seamlessly combined with fluorescence-activated cell sorting (FACS), so that individual cells, including stem cells, can be directly sorted into specified culturing compartments in a fully automated way and at high accuracy. Moreover, already assembled microtissue spheroids can be loaded into the microfluidic structures by using a conventional pipet. Cell and microtissue culturing is then performed in hanging drops under controlled perfusion. On-chip drop-size control measures were applied to stabilize the system. Cells and microtissue spheroids can be retrieved from the chip by using a parallelized transfer method. The presented methodology holds great promise for combinatorial screening of stem-cell and multicellular-spheroid cultures.
Terminal Area Procedures for Paired Runways
NASA Technical Reports Server (NTRS)
Lozito, Sandy
2011-01-01
Parallel runway operations have been found to increase capacity within the National Airspace System (NAS); however, poor visibility conditions reduce this capacity [1]. Much research has been conducted to examine the concepts and procedures related to parallel runways; however, there has been no investigation of the procedures associated with the strategic and tactical pairing of aircraft for these operations. This study developed and examined the pilot and controller procedures and information requirements for creating aircraft pairs for parallel runway operations. The goal was to achieve aircraft pairing with a temporal separation of 15 s (±10 s error) at a coupling point about 12 nmi from the runway threshold. Two variables were explored for the pilot participants: two levels of flight deck automation (current-day flight deck automation and a prototype future automation) as well as two flight deck displays that assisted in pilot conformance monitoring. The controllers were also provided with automation to help create and maintain aircraft pairs. Data showed that the operations in this study were acceptable and safe. Workload when using the pairing procedures and tools was generally low for both controllers and pilots, and situation awareness (SA) was typically moderate to high. There were some differences based upon the display and automation conditions for the pilots. Future research should consider the refinement of the concepts and tools for pilot and controller displays and automation for parallel runway concepts.
Zhu, Xiang; Zhang, Dianwen
2013-01-01
We present a fast, accurate and robust parallel Levenberg-Marquardt minimization optimizer, GPU-LMFit, which is implemented on graphics processing unit for high performance scalable parallel model fitting processing. GPU-LMFit can provide a dramatic speed-up in massive model fitting analyses to enable real-time automated pixel-wise parametric imaging microscopy. We demonstrate the performance of GPU-LMFit for the applications in superresolution localization microscopy and fluorescence lifetime imaging microscopy. PMID:24130785
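A single model fit of the kind GPU-LMFit parallelizes across pixels can be sketched in pure Python. The mono-exponential decay model (of the sort used in fluorescence lifetime imaging) and all numbers below are illustrative assumptions, not the paper's implementation:

```python
import math

def lm_fit_decay(t, y, A, tau, iters=200, lam=1e-3):
    """Levenberg-Marquardt fit of y = A * exp(-t / tau) (single pixel).

    GPU-LMFit runs many such small fits in parallel, one per pixel.
    """
    def cost(A_, tau_):
        return sum((yi - A_ * math.exp(-ti / tau_)) ** 2 for ti, yi in zip(t, y))

    for _ in range(iters):
        r = [yi - A * math.exp(-ti / tau) for ti, yi in zip(t, y)]
        # Jacobian of the residuals r_i = y_i - f(t_i) w.r.t. (A, tau).
        J = [(-math.exp(-ti / tau),
              -A * (ti / tau ** 2) * math.exp(-ti / tau)) for ti in t]
        # Damped normal equations: (J^T J + lam * diag(J^T J)) d = -J^T r.
        g00 = sum(ja * ja for ja, _ in J)
        g01 = sum(ja * jb for ja, jb in J)
        g11 = sum(jb * jb for _, jb in J)
        b0 = sum(ja * ri for (ja, _), ri in zip(J, r))
        b1 = sum(jb * ri for (_, jb), ri in zip(J, r))
        a00, a11 = g00 * (1 + lam), g11 * (1 + lam)
        det = a00 * a11 - g01 * g01
        dA = (-b0 * a11 + b1 * g01) / det
        dtau = (-b1 * a00 + b0 * g01) / det
        # Accept the step only if it lowers the cost (LM damping update).
        if tau + dtau > 0 and cost(A + dA, tau + dtau) < cost(A, tau):
            A, tau, lam = A + dA, tau + dtau, lam / 10
        else:
            lam *= 10
    return A, tau

# Noise-free synthetic decay with A = 2.0, tau = 0.5; fit from (1.0, 1.0).
t = [0.1 * i for i in range(20)]
y = [2.0 * math.exp(-ti / 0.5) for ti in t]
A_fit, tau_fit = lm_fit_decay(t, y, A=1.0, tau=1.0)
```

The GPU implementation's speed-up comes from executing thousands of these independent two-parameter fits concurrently rather than from changing the algorithm itself.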
Lu, Joann J.; Wang, Shili; Li, Guanbin; Wang, Wei; Pu, Qiaosheng; Liu, Shaorong
2012-01-01
In this report, we introduce a chip-capillary hybrid device to integrate capillary isoelectric focusing (CIEF) with parallel capillary sodium dodecyl sulfate-polyacrylamide gel electrophoresis (SDS-PAGE) or capillary gel electrophoresis (CGE) toward automating two-dimensional (2D) protein separations. The hybrid device consists of three chips that are butted together. The middle chip can be moved between two positions to re-route the fluidic paths, which enables the performance of CIEF and injection of proteins partially resolved by CIEF to CGE capillaries for parallel CGE separations in a continuous and automated fashion. Capillaries are attached to the other two chips to facilitate CIEF and CGE separations and to extend the effective lengths of CGE columns. Specifically, we illustrate the working principle of the hybrid device, develop protocols for producing and preparing the hybrid device, and demonstrate the feasibility of using this hybrid device for automated injection of CIEF-separated sample to parallel CGE for 2D protein separations. Potentials and problems associated with the hybrid device are also discussed. PMID:22830584
Terminal Area Procedures for Paired Runways
NASA Technical Reports Server (NTRS)
Lozito, Sandra; Verma, Savita Arora
2011-01-01
Parallel runway operations have been found to increase capacity within the National Airspace System, but poor visibility conditions reduce the use of these operations. The NextGen and SESAR programs have identified the capacity benefits from increased use of closely-spaced parallel runways. Previous research examined the concepts and procedures related to parallel runways; however, there has been no investigation of the procedures associated with the strategic and tactical pairing of aircraft for these operations. This simulation study developed and examined the pilot and controller procedures and information requirements for creating aircraft pairs for parallel runway operations. The goal was to achieve aircraft pairing with a temporal separation of 15 s (±10 s error) at a coupling point about 12 nmi from the runway threshold. Two variables were explored for the pilot participants: two levels of flight deck automation (current-day flight deck automation and auto speed control future automation) as well as two flight deck displays that assisted in pilot conformance monitoring. The controllers were also provided with automation to help create and maintain aircraft pairs. Results show the operations in this study were acceptable and safe. Subjective workload, when using the pairing procedures and tools, was generally low for both controllers and pilots, and situation awareness was typically moderate to high. Pilot workload was influenced by display type and automation condition. Further research on pairing and off-nominal conditions is required; however, this investigation identified promising findings about the feasibility of closely-spaced parallel runway operations.
First Annual Workshop on Space Operations Automation and Robotics (SOAR 87)
NASA Technical Reports Server (NTRS)
Griffin, Sandy (Editor)
1987-01-01
Several topics relative to automation and robotics technology are discussed. Automation of checkout, ground support, and logistics; automated software development; man-machine interfaces; neural networks; systems engineering and distributed/parallel processing architectures; and artificial intelligence/expert systems are among the topics covered.
Parallel workflow tools to facilitate human brain MRI post-processing
Cui, Zaixu; Zhao, Chenxi; Gong, Gaolang
2015-01-01
Multi-modal magnetic resonance imaging (MRI) techniques are widely applied in human brain studies. To obtain specific brain measures of interest from MRI datasets, a number of complex image post-processing steps are typically required. Parallel workflow tools have recently been developed, concatenating individual processing steps and enabling fully automated processing of raw MRI data to obtain the final results. These workflow tools are also designed to make optimal use of available computational resources and to support the parallel processing of different subjects or of independent processing steps for a single subject. Automated, parallel MRI post-processing tools can greatly facilitate relevant brain investigations and are being increasingly applied. In this review, we briefly summarize these parallel workflow tools and discuss relevant issues. PMID:26029043
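The core pattern these workflow tools implement, sequential steps per subject with independent subjects in parallel, can be sketched as follows. The step names are hypothetical stand-ins for calls to real neuroimaging tools such as FSL or FreeSurfer commands:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical per-subject steps; in a real pipeline each would invoke an
# external neuroimaging tool. Steps for one subject run in order, while
# different subjects are processed in parallel.
def skull_strip(subject):
    return subject + ":stripped"

def register(volume):
    return volume + ":registered"

def segment(volume):
    return volume + ":segmented"

def process_subject(subject):
    # Dependent steps must run sequentially for a given subject.
    return segment(register(skull_strip(subject)))

subjects = ["sub-%02d" % i for i in range(1, 5)]
with ThreadPoolExecutor(max_workers=4) as pool:
    # map() preserves input order even though subjects run concurrently.
    results = list(pool.map(process_subject, subjects))
```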
Jones, Gillian; Matthews, Roger; Cunningham, Richard; Jenks, Peter
2011-07-01
The sensitivity of automated culture of Staphylococcus aureus from flocked swabs versus that of manual culture of fiber swabs was prospectively compared using nasal swabs from 867 patients. Automated culture from flocked swabs significantly increased the detection rate, by 13.1% for direct culture and 10.2% for enrichment culture.
An automated workflow for parallel processing of large multiview SPIM recordings
Schmied, Christopher; Steinbach, Peter; Pietzsch, Tobias; Preibisch, Stephan; Tomancak, Pavel
2016-01-01
Summary: Selective Plane Illumination Microscopy (SPIM) allows imaging of developing organisms in 3D at unprecedented temporal resolution over long periods of time. The resulting massive amounts of raw image data require extensive processing, performed interactively via dedicated graphical user interface (GUI) applications. The consecutive processing steps can be easily automated, and the individual time points can be processed independently, which lends itself to trivial parallelization on a high-performance computing (HPC) cluster. Here, we introduce an automated workflow for processing large multiview, multichannel, multi-illumination time-lapse SPIM data on a single workstation or in parallel on an HPC cluster. The pipeline relies on Snakemake to resolve dependencies among consecutive processing steps and can be easily adapted to any cluster environment, processing SPIM data in a fraction of the time required to collect it. Availability and implementation: The code is distributed free and open source under the MIT license http://opensource.org/licenses/MIT. The source code can be downloaded from GitHub: https://github.com/mpicbg-scicomp/snakemake-workflows. Documentation can be found here: http://fiji.sc/Automated_workflow_for_parallel_Multiview_Reconstruction. Contact: schmied@mpi-cbg.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26628585
An automated workflow for parallel processing of large multiview SPIM recordings.
Schmied, Christopher; Steinbach, Peter; Pietzsch, Tobias; Preibisch, Stephan; Tomancak, Pavel
2016-04-01
Selective Plane Illumination Microscopy (SPIM) allows imaging of developing organisms in 3D at unprecedented temporal resolution over long periods of time. The resulting massive amounts of raw image data require extensive processing, performed interactively via dedicated graphical user interface (GUI) applications. The consecutive processing steps can be easily automated, and the individual time points can be processed independently, which lends itself to trivial parallelization on a high-performance computing (HPC) cluster. Here, we introduce an automated workflow for processing large multiview, multichannel, multi-illumination time-lapse SPIM data on a single workstation or in parallel on an HPC cluster. The pipeline relies on Snakemake to resolve dependencies among consecutive processing steps and can be easily adapted to any cluster environment, processing SPIM data in a fraction of the time required to collect it. The code is distributed free and open source under the MIT license http://opensource.org/licenses/MIT. The source code can be downloaded from GitHub: https://github.com/mpicbg-scicomp/snakemake-workflows. Documentation can be found here: http://fiji.sc/Automated_workflow_for_parallel_Multiview_Reconstruction. Contact: schmied@mpi-cbg.de. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
Long-term maintenance of human induced pluripotent stem cells by automated cell culture system.
Konagaya, Shuhei; Ando, Takeshi; Yamauchi, Toshiaki; Suemori, Hirofumi; Iwata, Hiroo
2015-11-17
Pluripotent stem cells, such as embryonic stem cells and induced pluripotent stem (iPS) cells, are regarded as new sources for cell replacement therapy. These cells can expand without limit under undifferentiated conditions and can be differentiated into multiple cell types. Automated culture systems enable the large-scale production of cells. In addition to reducing the time and effort of researchers, an automated culture system improves the reproducibility of cell cultures. In the present study, we newly designed a fully automated cell culture system for human iPS cell maintenance. Using the automated culture system, hiPS cells maintained their undifferentiated state for 60 days. Automatically prepared hiPS cells retained the potency to differentiate into cells of the three germ layers, including dopaminergic neurons and pancreatic cells.
Jones, Gillian; Matthews, Roger; Cunningham, Richard; Jenks, Peter
2011-01-01
The sensitivity of automated culture of Staphylococcus aureus from flocked swabs versus that of manual culture of fiber swabs was prospectively compared using nasal swabs from 867 patients. Automated culture from flocked swabs significantly increased the detection rate, by 13.1% for direct culture and 10.2% for enrichment culture. PMID:21525218
High-throughput cultivation and screening platform for unicellular phototrophs.
Tillich, Ulrich M; Wolter, Nick; Schulze, Katja; Kramer, Dan; Brödel, Oliver; Frohme, Marcus
2014-09-16
High-throughput cultivation and screening methods allow parallel, miniaturized and cost-efficient processing of many samples. These methods, however, have not been generally established for phototrophic organisms such as microalgae or cyanobacteria. In this work we describe and test high-throughput methods with the model organism Synechocystis sp. PCC6803. The required technical automation for these processes was achieved with a Tecan Freedom Evo 200 pipetting robot. The cultivation was performed in 2.2 ml deepwell microtiter plates within a cultivation chamber outfitted with programmable shaking conditions, variable illumination, variable temperature, and an adjustable CO2 atmosphere. Each microtiter well within the chamber functions as a separate cultivation vessel with reproducible conditions. The automated measurement of various parameters such as growth, full absorption spectrum, chlorophyll concentration and MALDI-TOF-MS, as well as a novel vitality measurement protocol, has already been established and can be monitored during cultivation. Measurements of growth parameters can be used as inputs for the system to allow periodic automatic dilutions and therefore semi-continuous cultivation of hundreds of cultures in parallel. The system also allows the automatic generation of mid- and long-term backups of cultures to repeat experiments or to retrieve strains of interest. The presented platform allows high-throughput cultivation and screening of Synechocystis sp. PCC6803. The platform should be usable for many phototrophic microorganisms as is, and be adaptable for even more. A variety of analyses are already established, and the platform is easily expandable both in quality, i.e. with further parameters to screen for additional targets, and in quantity, i.e. the size or number of processed samples.
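The periodic automatic dilutions described above reduce to a simple volume calculation. The sketch below is illustrative only, not the platform's actual control code; the function name and the assumption that optical density scales inversely with dilution are ours:

```python
def dilution_volume(od_measured, od_target, culture_volume_ml):
    """Volume of fresh medium to add so the culture returns to the
    target optical density, assuming OD scales inversely with dilution:
        od_target = od_measured * V / (V + V_add)
    Returns 0 if the culture is already at or below the target."""
    if od_measured <= od_target:
        return 0.0
    return culture_volume_ml * (od_measured / od_target - 1.0)
```

In a semi-continuous setup, such a rule would be evaluated after each automated growth measurement to keep hundreds of parallel cultures within a fixed density band.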
Yip, Hon Ming; Li, John C. S.; Cui, Xin; Gao, Qiannan; Leung, Chi Chiu
2014-01-01
As microfluidics has been applied extensively in many cell and biochemical applications, monitoring the related processes is an important requirement. In this work, we design and fabricate a high-throughput microfluidic device which contains 32 microchambers to perform automated parallel microfluidic operations and monitoring on an automated stage of a microscope. Images are captured at multiple spots on the device during the operations for monitoring samples in microchambers in parallel; yet the device positions may vary at different time points throughout operations as the device moves back and forth on a motorized microscopic stage. Here, we report an image-based positioning strategy to realign the chamber position before every recording of microscopic image. We fabricate alignment marks at defined locations next to the chambers in the microfluidic device as reference positions. We also develop image processing algorithms to recognize the chamber positions in real-time, followed by realigning the chambers to their preset positions in the captured images. We perform experiments to validate and characterize the device functionality and the automated realignment operation. Together, this microfluidic realignment strategy can be a platform technology to achieve precise positioning of multiple chambers for general microfluidic applications requiring long-term parallel monitoring of cell and biochemical activities. PMID:25133248
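A minimal sketch of the mark-based realignment idea described above, using exhaustive template matching to locate an alignment mark and derive a stage correction (the published system's image processing algorithms are not reproduced here; this is only the concept):

```python
def find_mark_offset(image, template):
    """Locate an alignment mark by exhaustive template matching
    (sum of squared differences over every placement) and return the
    best-matching (row, col) offset. Real systems use faster FFT- or
    feature-based matchers; this is a sketch of the idea."""
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    best, best_pos = None, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            ssd = sum((image[r + i][c + j] - template[i][j]) ** 2
                      for i in range(th) for j in range(tw))
            if best is None or ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos

def realign_shift(measured_pos, reference_pos):
    """Stage correction that moves a chamber from its measured mark
    position back to the preset reference position."""
    return (reference_pos[0] - measured_pos[0],
            reference_pos[1] - measured_pos[1])
```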
Toward an automated parallel computing environment for geosciences
NASA Astrophysics Data System (ADS)
Zhang, Huai; Liu, Mian; Shi, Yaolin; Yuen, David A.; Yan, Zhenzhen; Liang, Guoping
2007-08-01
Software for geodynamic modeling has not kept up with the fast growing computing hardware and network resources. In the past decade supercomputing power has become available to most researchers in the form of affordable Beowulf clusters and other parallel computer platforms. However, to take full advantage of such computing power requires developing parallel algorithms and associated software, a task that is often too daunting for geoscience modelers whose main expertise is in geosciences. We introduce here an automated parallel computing environment built on open-source algorithms and libraries. Users interact with this computing environment by specifying the partial differential equations, solvers, and model-specific properties using an English-like modeling language in the input files. The system then automatically generates the finite element codes that can be run on distributed or shared memory parallel machines. This system is dynamic and flexible, allowing users to address different problems in geosciences. It is capable of providing web-based services, enabling users to generate source codes online. This unique feature will facilitate high-performance computing to be integrated with distributed data grids in the emerging cyber-infrastructures for geosciences. In this paper we discuss the principles of this automated modeling environment and provide examples to demonstrate its versatility.
Design and validation of a clinical-scale bioreactor for long-term isolated lung culture.
Charest, Jonathan M; Okamoto, Tatsuya; Kitano, Kentaro; Yasuda, Atsushi; Gilpin, Sarah E; Mathisen, Douglas J; Ott, Harald C
2015-06-01
The primary treatment for end-stage lung disease is lung transplantation. However, donor organ shortage remains a major barrier for many patients. In recent years, techniques for maintaining lungs ex vivo for evaluation and short-term (<12 h) resuscitation have come into more widespread use in an attempt to expand the donor pool. In parallel, progress in whole organ engineering has provided the potential perspective of patient derived grafts grown on demand. As both of these strategies advance to more complex interventions for lung repair and regeneration, the need for a long-term organ culture system becomes apparent. Herein we describe a novel clinical scale bioreactor capable of maintaining functional porcine and human lungs for at least 72 h in isolated lung culture (ILC). The fully automated, computer controlled, sterile, closed circuit system enables physiologic pulsatile perfusion and negative pressure ventilation, while gas exchange function, and metabolism can be evaluated. Creation of this stable, biomimetic long-term culture environment will enable advanced interventions in both donor lungs and engineered grafts of human scale. Copyright © 2015 Elsevier Ltd. All rights reserved.
Impedance-based cellular assays for regenerative medicine.
Gamal, W; Wu, H; Underwood, I; Jia, J; Smith, S; Bagnaninchi, P O
2018-07-05
Therapies based on regenerative techniques have the potential to radically improve healthcare in the coming years. As a result, there is an emerging need for non-destructive and label-free technologies to assess the quality of engineered tissues and cell-based products prior to their use in the clinic. In parallel, the emerging regenerative medicine industry that aims to produce stem cells and their progeny on a large scale will benefit from moving away from existing destructive biochemical assays towards data-driven automation and control at the industrial scale. Impedance-based cellular assays (IBCA) have emerged as an alternative approach to study stem-cell properties, and cumulative studies, reviewed here, have shown their potential to monitor stem-cell renewal, differentiation and maturation. They offer a novel method to non-destructively assess and quality-control stem-cell cultures. In addition, when combined with in vitro disease models they provide complementary insights as label-free phenotypic assays. IBCA provide quantitative and very sensitive results that can easily be automated and up-scaled in multi-well format. When facing the emerging challenge of real-time monitoring of three-dimensional cell cultures, dielectric spectroscopy and electrical impedance tomography represent viable alternatives to two-dimensional impedance sensing. This article is part of the theme issue 'Designer human tissue: coming to a lab near you'. © 2018 The Author(s).
Comparability of automated human induced pluripotent stem cell culture: a pilot study.
Archibald, Peter R T; Chandra, Amit; Thomas, Dave; Chose, Olivier; Massouridès, Emmanuelle; Laâbi, Yacine; Williams, David J
2016-12-01
Consistent and robust manufacturing is essential for the translation of cell therapies, and the utilisation of automation throughout the manufacturing process may allow for improvements in the quality control, scalability, reproducibility and economics of the process. The aim of this study was to measure and establish the comparability between alternative process steps for the culture of hiPSCs. Consequently, the effects of manual centrifugation and automated non-centrifugation process steps, performed using TAP Biosystems' CompacT SelecT automated cell culture platform, upon the culture of a human induced pluripotent stem cell (hiPSC) line (VAX001024c07) were compared. This study has demonstrated that comparable morphologies and cell diameters were observed in hiPSCs cultured using either manual or automated process steps. However, non-centrifugation hiPSC populations exhibited greater cell yields, greater aggregate rates, increased pluripotency marker expression, and decreased differentiation marker expression compared to centrifugation hiPSCs. A trend for decreased variability in cell yield was also observed after the utilisation of the automated process step. This study also highlights the detrimental effect of the cryopreservation and thawing processes upon the growth and characteristics of hiPSC cultures, and demonstrates that automated hiPSC manufacturing protocols can be successfully transferred between independent laboratories.
Low cost automated whole smear microscopy screening system for detection of acid fast bacilli.
Law, Yan Nei; Jian, Hanbin; Lo, Norman W S; Ip, Margaret; Chan, Mia Mei Yuk; Kam, Kai Man; Wu, Xiaohua
2018-01-01
In countries with a high tuberculosis (TB) burden, there is an urgent need for rapid, large-scale screening to detect smear-positive patients. We developed a computer-aided whole-smear screening system that focuses in real time, captures images and provides diagnostic grading, for both bright-field and fluorescence microscopy, for detection of acid-fast bacilli (AFB) in respiratory specimens. We evaluated the performance of this dual-mode screening system in AFB diagnostic algorithms on concentrated smears with auramine O (AO) staining, as well as direct smears with AO and Ziehl-Neelsen (ZN) staining, using mycobacterial culture results as the gold standard. Adult patient sputum samples submitted for M. tuberculosis culture were divided into three batches for staining: direct AO-stained, direct ZN-stained and concentrated AO-stained smears. All slides were graded by an experienced microscopist, in parallel with the automated whole-smear screening system. The sensitivity and specificity of a TB diagnostic algorithm using the screening system alone, and in combination with a microscopist, were evaluated. Of 488 direct AO-stained smears, 228 were culture positive; these yielded a sensitivity of 81.6% and specificity of 74.2%. Of 334 direct ZN-stained smears, 142 were culture positive, giving a sensitivity of 70.4% and specificity of 76.6%. Of 505 concentrated AO-stained smears, 250 were culture positive, giving a sensitivity of 86.4% and specificity of 71.0%. To further improve performance, machine grading was confirmed by manual smear grading when the number of AFBs detected fell within an uncertainty range. These combined results gave a significant improvement in specificity (AO-direct: 85.4%; ZN-direct: 85.4%; AO-concentrated: 92.5%) and a slight improvement in sensitivity, while requiring only a limited manual workload. Our system achieved high sensitivity without substantially compromising specificity when compared to culture results. A significant improvement in specificity was obtained when uncertain results were confirmed by manual smear grading. This approach has the potential to substantially reduce the workload of microscopists in high-burden countries.
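The combined machine/manual grading step above can be sketched as a simple triage rule. The thresholds below are hypothetical, chosen only for illustration, and are not those of the published system:

```python
def triage_smear(afb_count, low=3, high=20):
    """Combine automated counting with manual confirmation:
    counts below `low` are reported negative, counts above `high`
    positive, and anything in the uncertainty range in between is
    referred to a microscopist. Thresholds here are illustrative."""
    if afb_count < low:
        return "negative"
    if afb_count > high:
        return "positive"
    return "manual review"
```

Routing only the uncertainty band to a human is what lets such a system raise specificity while keeping the manual workload limited.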
Hierarchically Parallelized Constrained Nonlinear Solvers with Automated Substructuring
NASA Technical Reports Server (NTRS)
Padovan, Joe; Kwang, Abel
1994-01-01
This paper develops a parallelizable multilevel, multiple-constrained nonlinear equation solver. The substructuring process is automated to yield appropriately balanced partitioning of each succeeding level. Due to the generality of the procedure, sequential as well as partially and fully parallel environments can be handled. This includes both single and multiprocessor assignment per individual partition. Several benchmark examples are presented. These illustrate the robustness of the procedure as well as its capability to yield significant reductions in memory utilization and computational effort, due both to updating and to inversion.
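A toy sketch in the spirit of the automated, balanced substructuring described above (greedy balancing of element weights into two partitions; the paper's actual multilevel scheme is considerably more elaborate):

```python
def balanced_bipartition(weights):
    """Greedily split element weights into two partitions with
    near-equal totals: heavier elements are assigned first, each to
    the currently lighter side. Returns (index lists, totals)."""
    parts = ([], [])
    totals = [0.0, 0.0]
    order = sorted(range(len(weights)), key=lambda i: -weights[i])
    for i in order:
        side = 0 if totals[0] <= totals[1] else 1
        parts[side].append(i)
        totals[side] += weights[i]
    return parts, totals
```

Applying the same split recursively to each partition yields the succeeding levels of a multilevel substructuring.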
1989-12-01
that can be easily understood. (9) Parallelism. Several system components may need to execute in parallel. For example, the processing of sensor data... knowledge base are not accessible for processing by the database. Also, in the likely case that the expert system poses a series of related queries, the... Knowledge base for the automation of logistics movement. The directory containing the strike aircraft replacement knowledge base
Pfannebecker, Jens; Schiffer-Hetz, Claudia; Fröhlich, Jürgen; Becker, Barbara
2016-11-01
In the present study, a culture medium for qualitative detection of osmotolerant yeasts, named OM, was developed. For the development, culture media with different concentrations of glucose, fructose, potassium chloride and glycerin were analyzed in a Biolumix™ test incubator. Selectivity for osmotolerant yeasts was guaranteed by a water activity (a_w) value of 0.91. The best results regarding fast growth of Zygosaccharomyces rouxii (WH 1002) were achieved in a culture medium consisting of 45% glucose, 5% fructose and 0.5% yeast extract, and in a medium with 30% glucose, 10% glycerin, 5% potassium chloride and 0.5% yeast extract. Substances to stimulate yeast fermentation rates were analyzed in a RAMOS® parallel fermenter system, enabling online measurement of the carbon dioxide transfer rate (CTR) in shaking flasks. Significant increases of the CTR were achieved by adding, in particular, 0.1-0.2% ammonium salts ((NH4)2HPO4, (NH4)2SO4 or NH4NO3), 0.5% meat peptone and 1% malt extract. Detection times and the CTR of 23 food-borne yeast strains of the genera Zygosaccharomyces, Torulaspora, Schizosaccharomyces, Candida and Wickerhamomyces were analyzed in OM bouillon in comparison to the selective culture media YEG50, MYG50 and DG18 in the parallel fermenter system. The OM culture medium enabled the detection of 10^2 CFU/g within 2-3 days, depending on the analyzed yeast species. Compared with YEG50 and MYG50, the detection times could be reduced. As an example, W. anomalus (WH 1021) was detected after 124 h in YEG50, 95.5 h in MYG50 and 55 h in OM bouillon. Compared to YEG50, the maximum CO2 transfer rates for Z. rouxii (WH 1001), T. delbrueckii (DSM 70526), S. pombe (DSM 70576) and W. anomalus (WH 1016) increased by a factor ≥2.6. Furthermore, enrichment cultures of inoculated high-sugar products in OM culture medium were analyzed in the Biolumix™ system. The results proved that detection times of 3 days for Z. rouxii and T. delbrueckii can be realized by using OM in combination with the automated test system, even if low initial counts (10^1 CFU/g) are present in the products. In conclusion, the presented data suggest that the OM culture medium is appropriate for the enrichment of osmotolerant yeasts from high-sugar food products. Copyright © 2016 Elsevier B.V. All rights reserved.
Wire-Guide Manipulator For Automated Welding
NASA Technical Reports Server (NTRS)
Morris, Tim; White, Kevin; Gordon, Steve; Emerich, Dave; Richardson, Dave; Faulkner, Mike; Stafford, Dave; Mccutcheon, Kim; Neal, Ken; Milly, Pete
1994-01-01
Compact motor drive positions guide for welding filler wire. Drive part of automated wire feeder in partly or fully automated welding system. Drive unit contains three parallel subunits. Rotations of lead screws in three subunits coordinated to obtain desired motions in three degrees of freedom. Suitable for both variable-polarity plasma arc welding and gas/tungsten arc welding.
Automated three-component synthesis of a library of γ-lactams
Fenster, Erik; Hill, David; Reiser, Oliver
2012-01-01
Summary: A three-component method for the synthesis of γ-lactams from commercially available maleimides, aldehydes, and amines was adapted to parallel library synthesis. Improvements to the chemistry over previous efforts include the optimization of the method to a one-pot process, the management of by-products and excess reagents, the development of an automated parallel sequence, and the adaptation of the method to permit the preparation of enantiomerically enriched products. These efforts culminated in the preparation of a library of 169 γ-lactams. PMID:23209515
At the intersection of automation and culture
NASA Technical Reports Server (NTRS)
Sherman, P. J.; Wiener, E. L.
1995-01-01
The crash of an automated passenger jet at Nagoya, Japan, in 1994 is used as an example of crew error in using automatic systems. Automation provides pilots with the ability to perform tasks in various ways. National culture is cited as a factor that affects how a pilot and crew interact with each other and with equipment.
NASA Technical Reports Server (NTRS)
Luke, Edward Allen
1993-01-01
Two algorithms capable of computing a transonic 3-D inviscid flow field about rotating machines are considered for parallel implementation. During the study of these algorithms, a significant new method of measuring the performance of parallel algorithms is developed. The theory that supports this new method creates an empirical definition of scalable parallel algorithms that is used to produce quantifiable evidence that a scalable parallel application was developed. The implementation of the parallel application and an automated domain decomposition tool are also discussed.
NASA Astrophysics Data System (ADS)
Cornaglia, Matteo; Mouchiroud, Laurent; Marette, Alexis; Narasimhan, Shreya; Lehnert, Thomas; Jovaisaite, Virginija; Auwerx, Johan; Gijs, Martin A. M.
2015-05-01
Studies of the real-time dynamics of embryonic development require a gentle embryo handling method, the possibility of long-term live imaging during the complete embryogenesis, as well as of parallelization providing a population’s statistics, while keeping single embryo resolution. We describe an automated approach that fully accomplishes these requirements for embryos of Caenorhabditis elegans, one of the most employed model organisms in biomedical research. We developed a microfluidic platform which makes use of pure passive hydrodynamics to run on-chip worm cultures, from which we obtain synchronized embryo populations, and to immobilize these embryos in incubator microarrays for long-term high-resolution optical imaging. We successfully employ our platform to investigate morphogenesis and mitochondrial biogenesis during the full embryonic development and elucidate the role of the mitochondrial unfolded protein response (UPRmt) within C. elegans embryogenesis. Our method can be generally used for protein expression and developmental studies at the embryonic level, but can also provide clues to understand the aging process and age-related diseases in particular.
Parallel solution-phase synthesis of a 2-aminothiazole library including fully automated work-up.
Buchstaller, Hans-Peter; Anlauf, Uwe
2011-02-01
A straightforward and effective procedure for the solution phase preparation of a 2-aminothiazole combinatorial library is described. Reaction, work-up and isolation of the title compounds as free bases was accomplished in a fully automated fashion using the Chemspeed ASW 2000 automated synthesizer. The compounds were obtained in good yields and excellent purities without any further purification procedure.
Lehmann, R; Gallert, C; Roddelkopf, T; Junginger, S; Wree, A; Thurow, K
2016-08-01
Cancer is a common disease of the population, driven by aging and increasing harmful environmental influences. Hence, new therapeutic strategies and compound screenings are necessary. Regular 2D cultivation has to be replaced by three-dimensional (3D) cell culturing for better simulation of in vivo conditions. 3D cultivation with an alginate matrix is an appropriate method to encapsulate cells to form cancer constructs. The automated manufacturing of alginate beads might be an ultimate method for large-scale manufacturing of constructs similar to cancer tissue. The aim of this study was the integration of fully automated systems for the production, cultivation and screening of 3D cell cultures. We compared the automated methods with the regular manual processes. Furthermore, we investigated the influence of antibiotics on these 3D cell culture systems. The alginate beads were formed by automated and manual procedures. The automated steps were processed by the Biomek(®) Cell Workstation (celisca, Rostock, Germany). The proliferation and toxicity were manually and automatically evaluated on days 14 and 35 of cultivation. The results visualized an accumulation and expansion of cell aggregates over the period of incubation. However, the proliferation and toxicity were slightly, and in part significantly, decreased on day 35 compared to day 14. The comparison of the manual and automated methods displayed similar results. We conclude that the manual production process could be replaced by automation. Using automation, 3D cell cultures can be produced at industrial scale and improve drug development and screening to treat serious illnesses like cancer.
NASA Astrophysics Data System (ADS)
Baumgartner, D. J.; Pötzi, W.; Freislich, H.; Strutzmann, H.; Veronig, A. M.; Foelsche, U.; Rieder, H. E.
2017-06-01
In recent decades, automated sensors for sunshine duration (SD) measurements have been introduced in meteorological networks, thereby replacing traditional instruments, most prominently the Campbell-Stokes (CS) sunshine recorder. Parallel records of automated and traditional SD recording systems are rare. Nevertheless, such records are important to understand the differences/similarities in SD totals obtained with different instruments and how changes in monitoring device type affect the homogeneity of SD records. This study investigates the differences/similarities in parallel SD records obtained with a CS and two automated SD sensors between 2007 and 2016 at the Kanzelhöhe Observatory, Austria. Comparing individual records of daily SD totals, we find differences of both positive and negative sign, with smallest differences between the automated sensors. The larger differences between CS-derived SD totals and those from automated sensors can be attributed (largely) to the higher sensitivity threshold of the CS instrument. Correspondingly, the closest agreement among all sensors is found during summer, the time of year when sensitivity thresholds are least critical. Furthermore, we investigate the performance of various models to create the so-called sensor-type-equivalent (STE) SD records. Our analysis shows that regression models including all available data on daily (or monthly) time scale perform better than simple three- (or four-) point regression models. Despite general good performance, none of the considered regression models (of linear or quadratic form) emerges as the "optimal" model. Although STEs prove useful for relating SD records of individual sensors on daily/monthly time scales, this does not ensure that STE (or joint) records can be used for trend analysis.
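A minimal sketch of the simplest sensor-type-equivalent model discussed above: an ordinary least-squares line fitted to parallel daily totals and then applied to single values (illustrative only; function and variable names are ours, and the study's preferred models use all available daily or monthly data exactly as fitted here):

```python
def fit_ste(cs_daily, auto_daily):
    """Ordinary least-squares line mapping Campbell-Stokes daily SD
    totals onto an automated sensor's scale, fitted from a parallel
    record of the two instruments. Returns (slope, intercept)."""
    n = len(cs_daily)
    mx = sum(cs_daily) / n
    my = sum(auto_daily) / n
    sxx = sum((x - mx) ** 2 for x in cs_daily)
    sxy = sum((x - mx) * (y - my) for x, y in zip(cs_daily, auto_daily))
    slope = sxy / sxx
    return slope, my - slope * mx

def ste(cs_value, slope, intercept):
    """Sensor-type-equivalent estimate for one daily SD total."""
    return slope * cs_value + intercept
```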
Predicting Flows of Rarefied Gases
NASA Technical Reports Server (NTRS)
LeBeau, Gerald J.; Wilmoth, Richard G.
2005-01-01
DSMC Analysis Code (DAC) is a flexible, highly automated, easy-to-use computer program for predicting flows of rarefied gases -- especially flows of upper-atmospheric, propulsion, and vented gases impinging on spacecraft surfaces. DAC implements the direct simulation Monte Carlo (DSMC) method, which is widely recognized as standard for simulating flows at densities so low that the continuum-based equations of computational fluid dynamics are invalid. DAC enables users to model complex surface shapes and boundary conditions quickly and easily. The discretization of a flow field into computational grids is automated, thereby relieving the user of a traditionally time-consuming task while ensuring (1) appropriate refinement of grids throughout the computational domain, (2) determination of optimal settings for temporal discretization and other simulation parameters, and (3) satisfaction of the fundamental constraints of the method. In so doing, DAC ensures an accurate and efficient simulation. In addition, DAC can utilize parallel processing to reduce computation time. The domain decomposition needed for parallel processing is completely automated, and the software employs a dynamic load-balancing mechanism to ensure optimal parallel efficiency throughout the simulation.
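A toy sketch of the kind of decomposition DAC automates: splitting a 1-D row of cells into contiguous blocks of near-equal particle count, so each worker carries a similar load (DAC's actual grid decomposition and dynamic load balancing are far more sophisticated; the function below is only an illustration of the balancing idea):

```python
def decompose(cell_particles, n_workers):
    """Split a 1-D row of cells into contiguous (start, end) blocks
    whose particle counts are as even as possible. A block is closed
    once it reaches the per-worker target, while keeping enough cells
    for the remaining workers."""
    total = sum(cell_particles)
    target = total / n_workers
    blocks, start, acc = [], 0, 0
    for i, n in enumerate(cell_particles):
        acc += n
        remaining = len(cell_particles) - i - 1
        workers_left = n_workers - len(blocks) - 1
        if acc >= target and workers_left > 0 and remaining >= workers_left:
            blocks.append((start, i + 1))
            start, acc = i + 1, 0
    blocks.append((start, len(cell_particles)))
    return blocks
```

Re-running such a decomposition as particle counts drift during the simulation is the essence of dynamic load balancing.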
Automated target recognition and tracking using an optical pattern recognition neural network
NASA Technical Reports Server (NTRS)
Chao, Tien-Hsin
1991-01-01
The ongoing development of an automatic target recognition and tracking system at the Jet Propulsion Laboratory is presented. This system is an optical pattern recognition neural network (OPRNN) that integrates an innovative optical parallel processor with a feature-extraction-based neural net training algorithm. The parallel optical processor provides high speed and vast parallelism as well as full shift invariance. The neural network algorithm enables simultaneous discrimination of multiple noisy targets in spite of their scales, rotations, perspectives, and various deformations. This fully developed OPRNN system can be effectively utilized for the automated spacecraft recognition and tracking that will lead to success in the Automated Rendezvous and Capture (AR&C) of the unmanned Cargo Transfer Vehicle (CTV). One of the most powerful optical parallel processors for automatic target recognition is the multichannel correlator. With the inherent advantages of parallel processing capability and shift invariance, multiple objects can be simultaneously recognized and tracked using this multichannel correlator. This target tracking capability can be greatly enhanced by utilizing a powerful feature-extraction-based neural network training algorithm such as the neocognitron. The OPRNN, currently under investigation at JPL, is constructed with an optical multichannel correlator in which holographic filters have been prepared using the neocognitron training algorithm. The computation speed of the neocognitron-type OPRNN is up to 10^14 analog connections/s, enabling the OPRNN to outperform its state-of-the-art electronic counterpart by at least two orders of magnitude.
Mock, Ulrike; Nickolay, Lauren; Philip, Brian; Cheung, Gordon Weng-Kit; Zhan, Hong; Johnston, Ian C D; Kaiser, Andrew D; Peggs, Karl; Pule, Martin; Thrasher, Adrian J; Qasim, Waseem
2016-08-01
Novel cell therapies derived from human T lymphocytes are exhibiting enormous potential in early-phase clinical trials in patients with hematologic malignancies. Ex vivo modification of T cells is currently limited to a small number of centers with the required infrastructure and expertise. The process requires isolation, activation, transduction, expansion and cryopreservation steps. To simplify procedures and widen applicability for clinical therapies, automation of these procedures is being developed. The CliniMACS Prodigy (Miltenyi Biotec) has recently been adapted for lentiviral transduction of T cells and here we analyse the feasibility of a clinically compliant T-cell engineering process for the manufacture of T cells encoding chimeric antigen receptors (CAR) for CD19 (CAR19), a widely targeted antigen in B-cell malignancies. Using a closed, single-use tubing set we processed mononuclear cells from fresh or frozen leukapheresis harvests collected from healthy volunteer donors. Cells were phenotyped and subjected to automated processing and activation using TransAct, a polymeric nanomatrix activation reagent incorporating CD3/CD28-specific antibodies. Cells were then transduced and expanded in the CentriCult-Unit of the tubing set, under stabilized culture conditions with automated feeding and media exchange. The process was continuously monitored to determine kinetics of expansion, transduction efficiency and phenotype of the engineered cells in comparison with small-scale transductions run in parallel. We found that transduction efficiencies, phenotype and function of CAR19 T cells were comparable with existing procedures and overall T-cell yields sufficient for anticipated therapeutic dosing. The automation of closed-system T-cell engineering should improve dissemination of emerging immunotherapies and greatly widen applicability. Copyright © 2016. Published by Elsevier Inc.
A Parallel Adaboost-Backpropagation Neural Network for Massive Image Dataset Classification
NASA Astrophysics Data System (ADS)
Cao, Jianfang; Chen, Lichao; Wang, Min; Shi, Hao; Tian, Yun
2016-12-01
Image classification uses computers to simulate human understanding and cognition of images by automatically categorizing images. This study proposes a faster image classification approach that parallelizes the traditional Adaboost-Backpropagation (BP) neural network using the MapReduce parallel programming model. First, we construct a strong classifier by assembling the outputs of 15 BP neural networks (which are individually regarded as weak classifiers) based on the Adaboost algorithm. Second, we design Map and Reduce tasks for both the parallel Adaboost-BP neural network and the feature extraction algorithm. Finally, we establish an automated classification model by building a Hadoop cluster. We use the Pascal VOC2007 and Caltech256 datasets to train and test the classification model. The results are superior to those obtained using traditional Adaboost-BP neural network or parallel BP neural network approaches. Our approach increased the average classification accuracy rate by approximately 14.5% and 26.0% compared to the traditional Adaboost-BP neural network and parallel BP neural network, respectively. Furthermore, the proposed approach requires less computation time and scales very well as evaluated by speedup, sizeup and scaleup. The proposed approach may provide a foundation for automated large-scale image classification and demonstrates practical value.
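The boosting step described above, assembling weak classifiers into a weighted-vote strong classifier, can be sketched generically. Here simple decision stumps stand in for the paper's 15 BP neural networks, and the data are invented; this is discrete AdaBoost in miniature, not the authors' MapReduce implementation.

```python
import math

def train_adaboost(xs, ys, rounds):
    """Discrete AdaBoost with threshold stumps as weak learners
    (stand-ins for the paper's BP networks). Labels are +1/-1."""
    n = len(xs)
    w = [1.0 / n] * n
    ensemble = []  # list of (alpha, threshold, polarity)
    for _ in range(rounds):
        best = None
        for t in sorted(set(xs)):
            for pol in (1, -1):
                preds = [pol if x >= t else -pol for x in xs]
                err = sum(wi for wi, p, y in zip(w, preds, ys) if p != y)
                if best is None or err < best[0]:
                    best = (err, t, pol, preds)
        err, t, pol, preds = best
        err = max(err, 1e-10)
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, t, pol))
        # Re-weight: boost the misclassified samples for the next round.
        w = [wi * math.exp(-alpha * y * p) for wi, y, p in zip(w, ys, preds)]
        s = sum(w)
        w = [wi / s for wi in w]
    return ensemble

def predict(ensemble, x):
    """Weighted vote of the weak classifiers."""
    score = sum(a * (pol if x >= t else -pol) for a, t, pol in ensemble)
    return 1 if score >= 0 else -1

xs = [1, 2, 3, 4, 5, 6, 7, 8]
ys = [-1, -1, -1, 1, 1, 1, 1, -1]  # not separable by any single stump
model = train_adaboost(xs, ys, rounds=5)
print([predict(model, x) for x in xs])
```

No single stump can fit this interval-shaped labeling, but the weighted vote of five stumps recovers it, which is the point of the boosting step.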
Plouchart, Diane; Guizard, Guillaume; Latrille, Eric
2018-01-01
Continuous cultures in chemostats have proven their value in microbiology, microbial ecology, systems biology and bioprocess engineering, among others. In these systems, microbial growth and ecosystem performance can be quantified under stable and defined environmental conditions. This is essential when linking microbial diversity to ecosystem function. Here, a new system to test this link in anaerobic, methanogenic microbial communities is introduced. Rigorously replicated experiments or a suitable experimental design typically require operating several chemostats in parallel. However, this is labor intensive, especially when measuring biogas production. Commercial solutions for multiplying reactors performing continuous anaerobic digestion exist but are expensive and use comparably large reactor volumes, requiring the preparation of substantial amounts of media. Here, a flexible Lab-scale Automated and Multiplexed Anaerobic Chemostat system (LAMACs) with a working volume of 200 mL is introduced. Sterile feeding, biomass wasting and pressure monitoring are automated. One module containing six reactors fits the typical dimensions of a lab bench. Thanks to automation, the time required for reactor operation and maintenance is reduced compared to traditional lab-scale systems. Several modules can be used together; so far, the parallel operation of 30 reactors has been demonstrated. The chemostats are autoclavable. Parameters such as reactor volume, flow rates and operating temperature can be freely set. The robustness of the system was tested in a two-month-long experiment in which three inocula in four replicates, i.e., twelve continuous digesters, were monitored. Statistically significant differences in the biogas production between inocula were observed. In anaerobic digestion, biogas production, and consequently pressure development in a closed environment, is a proxy for ecosystem performance. The precision of the pressure measurement is thus crucial. The measured maximum and minimum rates of gas production could be determined at the same precision. The LAMACs is a tool that enables us to put into practice the often-demanded replication and rigorous testing in microbial ecology as well as bioprocess engineering. PMID:29518106
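Since pressure development in the closed headspace is the proxy for biogas production, a measured pressure rise can be converted to a gas production rate with the ideal gas law. A minimal sketch; the headspace volume and operating temperature are assumed illustration values, not figures from the paper.

```python
R = 8.314            # J/(mol*K), universal gas constant
T = 310.15           # K, assumed mesophilic operating temperature (37 degC)
V_HEADSPACE = 50e-6  # m^3, assumed 50 mL headspace above the 200 mL culture

def biogas_moles(delta_p_pa):
    """Moles of gas produced for a measured headspace pressure rise (ideal gas law)."""
    return delta_p_pa * V_HEADSPACE / (R * T)

def rate_ml_per_day(p_start_pa, p_end_pa, hours):
    """Average production rate, as mL of gas at standard conditions (0 degC, 1 atm) per day."""
    n = biogas_moles(p_end_pa - p_start_pa)
    v_std_m3 = n * R * 273.15 / 101325.0
    return v_std_m3 * 1e6 / (hours / 24.0)

# Example: headspace pressure climbs from 1.00 to 1.20 bar over 12 h.
print(round(rate_ml_per_day(100000.0, 120000.0, 12.0), 1))
```

The precision of the rate estimate scales directly with the precision of the pressure sensor, which is why the abstract stresses that measurement.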
Wang, Lingyu; Yu, Linfen; Grist, Samantha; Cheung, Karen C; Chen, David D Y
2017-11-15
Cell culture systems based on polydimethylsiloxane (PDMS) microfluidic devices offer great flexibility because of their simple fabrication and adaptability. PDMS devices also make it straightforward to set up parallel experiments and can facilitate process automation, potentially speeding up the drug discovery process. However, cells grown in PDMS-based systems can develop in different ways to those grown with conventional culturing systems because of the differences in the containers' surfaces. Despite the growing number of studies on microfluidic cell culture devices, the differences in cellular behavior between PDMS-based devices and normal cell culture systems are poorly characterized. In this work, we investigated the proliferation and autophagy of MCF7 cells cultured in uncoated and Parylene-C coated PDMS wells. Using a quantitative method we developed that combines solid-phase extraction and liquid chromatography-mass spectrometry, we showed that Tamoxifen uptake into the surfaces of uncoated PDMS wells can change the drug's effective concentration in the culture medium, affecting the results of Tamoxifen-induced autophagy and cytotoxicity assays. Such changes must be carefully analyzed before transferring in vitro experiments from a traditional culture environment to a PDMS-based microfluidic system. We also found that cells cultured in Parylene-C coated PDMS wells showed similar proliferation and drug response characteristics to cells cultured in standard polystyrene (PS) plates, indicating that Parylene-C deposition offers an easy way of limiting the uptake of small molecules into porous PDMS materials and significantly improves the performance of PDMS-based devices for cell-related research. Copyright © 2017 Elsevier B.V. All rights reserved.
Planning and Resource Management in an Intelligent Automated Power Management System
NASA Technical Reports Server (NTRS)
Morris, Robert A.
1991-01-01
Power system management is a process of guiding a power system towards the objective of continuous supply of electrical power to a set of loads. Spacecraft power system management requires planning and scheduling, since electrical power is a scarce resource in space. The automation of power system management for future spacecraft has been recognized as an important R&D goal. Several automation technologies have emerged, including the use of expert systems to automate human problem-solving capabilities, such as rule-based expert systems for fault diagnosis and load scheduling. It is questionable whether current generation expert system technology is applicable for power system management in space. The objective of ADEPTS (ADvanced Electrical Power management Techniques for Space systems) is to study new techniques for power management automation. These techniques involve integrating current expert system technology with that of parallel and distributed computing, as well as a distributed, object-oriented approach to software design. The focus of the current study is the integration of new procedures for automatically planning and scheduling loads with procedures for performing fault diagnosis and control. The objective is the concurrent execution of both sets of tasks on separate transputer processors, thus adding parallelism to the overall management process.
NASA Technical Reports Server (NTRS)
Saini, Subhash; Frumkin, Michael; Hribar, Michelle; Jin, Hao-Qiang; Waheed, Abdul; Yan, Jerry
1998-01-01
Porting applications to new high performance parallel and distributed computing platforms is a challenging task. Since writing parallel code by hand is extremely time consuming and costly, porting codes would ideally be automated by using some parallelization tools and compilers. In this paper, we compare the performance of the hand-written NAS Parallel Benchmarks against three parallel versions generated with the help of tools and compilers: 1) CAPTools, an interactive computer-aided parallelization tool that generates message passing code; 2) the Portland Group's HPF compiler; and 3) compiler directives with the native FORTRAN77 compiler on the SGI Origin2000.
"First generation" automated DNA sequencing technology.
Slatko, Barton E; Kieleczawa, Jan; Ju, Jingyue; Gardner, Andrew F; Hendrickson, Cynthia L; Ausubel, Frederick M
2011-10-01
Beginning in the 1980s, automation of DNA sequencing has greatly increased throughput, reduced costs, and enabled large projects to be completed more easily. The development of automation technology paralleled the development of other aspects of DNA sequencing: better enzymes and chemistry, separation and imaging technology, sequencing protocols, robotics, and computational advancements (including base-calling algorithms with quality scores, database developments, and sequence analysis programs). Despite the emergence of high-throughput sequencing platforms, automated Sanger sequencing technology remains useful for many applications. This unit provides background and a description of the "First-Generation" automated DNA sequencing technology. It also includes protocols for using the current Applied Biosystems (ABI) automated DNA sequencing machines. © 2011 by John Wiley & Sons, Inc.
NASA Technical Reports Server (NTRS)
Ortega, J. M.
1986-01-01
Various graduate research activities in the field of computer science are reported. Among the topics discussed are: (1) failure probabilities in multi-version software; (2) Gaussian Elimination on parallel computers; (3) three dimensional Poisson solvers on parallel/vector computers; (4) automated task decomposition for multiple robot arms; (5) multi-color incomplete cholesky conjugate gradient methods on the Cyber 205; and (6) parallel implementation of iterative methods for solving linear equations.
NASA Astrophysics Data System (ADS)
Sopaheluwakan, Ardhasena; Fajariana, Yuaning; Satyaningsih, Ratna; Aprilina, Kharisma; Astuti Nuraini, Tri; Ummiyatul Badriyah, Imelda; Lukita Sari, Dyah; Haryoko, Urip
2017-04-01
Inhomogeneities are often found in long records of climate data. These can occur for various reasons, such as relocation of the observation site, changes in observation method, and the transition to automated instruments. Changes to these automated systems are inevitable and are taking place worldwide in many National Meteorological Services. However, this shift in observational practice must be done cautiously, and a sufficient period of parallel observation with co-located manual and automated systems should take place, as suggested by the World Meteorological Organization. With a sufficient parallel observation period, biases between the two systems can be analyzed. In this study we analyze the biases from a yearlong parallel observation of manual and automatic weather stations at 30 locations in Indonesia. The sites span approximately 45 degrees of longitude from east to west, covering different climate characteristics and geographical settings. We study measurements of temperature and rainfall taken by both systems. We found that the biases between the two systems vary from place to place and depend more on the setting of the instrument than on climatic and geographical factors. For instance, daytime observations of the automatic weather stations are consistently higher than the manual observations, and, vice versa, nighttime observations of the automatic weather stations are lower than the manual observations.
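The per-site bias analysis described here reduces, at its core, to differencing co-located paired observations. A minimal sketch with invented daytime temperature readings for one hypothetical site:

```python
def mean_bias(automatic, manual):
    """Mean bias (automatic minus manual) over paired co-located observations."""
    diffs = [a - m for a, m in zip(automatic, manual)]
    return sum(diffs) / len(diffs)

# Hypothetical daytime temperature pairs (degC) from one co-located site.
aws_day    = [30.2, 31.0, 29.8, 32.1, 30.6]
manual_day = [29.9, 30.6, 29.5, 31.7, 30.3]
print(round(mean_bias(aws_day, manual_day), 2))  # 0.34: AWS reads warmer by day
```

Repeating this per site and per time of day (daytime vs. nighttime) is what exposes the sign reversal the abstract reports.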
A Multiscale Parallel Computing Architecture for Automated Segmentation of the Brain Connectome
Knobe, Kathleen; Newton, Ryan R.; Schlimbach, Frank; Blower, Melanie; Reid, R. Clay
2015-01-01
Several groups in neurobiology have embarked into deciphering the brain circuitry using large-scale imaging of a mouse brain and manual tracing of the connections between neurons. Creating a graph of the brain circuitry, also called a connectome, could have a huge impact on the understanding of neurodegenerative diseases such as Alzheimer’s disease. Although considerably smaller than a human brain, a mouse brain already exhibits one billion connections and manually tracing the connectome of a mouse brain can only be achieved partially. This paper proposes to scale up the tracing by using automated image segmentation and a parallel computing approach designed for domain experts. We explain the design decisions behind our parallel approach and we present our results for the segmentation of the vasculature and the cell nuclei, which have been obtained without any manual intervention. PMID:21926011
Production of yarns composed of oriented nanofibers for ophthalmological implants
NASA Astrophysics Data System (ADS)
Shynkarenko, A.; Klapstova, A.; Krotov, A.; Moucka, M.; Lukas, D.
2017-10-01
Parallelized nanofibrous structures are commonly used in the medical sector, especially for ophthalmological implants. In this research, a self-fabricated device is tested for improved collection and twisting of parallel nanofibers. Previously, manual techniques were used to collect the nanofibers before applying twist; in our device, different parameters can be optimized to obtain parallel nanofibers and then apply further twisting. The device brings automation to the production of parallel fibrous structures for medical applications.
Effects of ATC automation on precision approaches to closely spaced parallel runways
NASA Technical Reports Server (NTRS)
Slattery, R.; Lee, K.; Sanford, B.
1995-01-01
Improved navigational technology (such as the Microwave Landing System and the Global Positioning System) installed in modern aircraft will enable air traffic controllers to better utilize available airspace. Consequently, arrival traffic can fly approaches to parallel runways separated by smaller distances than are currently allowed. Previous simulation studies of advanced navigation approaches have found that controller workload is increased when there is a combination of aircraft that are capable of following advanced navigation routes and aircraft that are not. Research into Air Traffic Control automation at Ames Research Center has led to the development of the Center-TRACON Automation System (CTAS). The Final Approach Spacing Tool (FAST) is the component of the CTAS used in the TRACON area. The work in this paper examines, via simulation, the effects of FAST used for aircraft landing on closely spaced parallel runways. The simulation contained various combinations of aircraft, equipped and unequipped with advanced navigation systems. A set of simulations was run both manually and with an augmented set of FAST advisories to sequence aircraft, assign runways, and avoid conflicts. The results of the simulations are analyzed, measuring the airport throughput, aircraft delay, loss of separation, and controller workload.
Parallel sequencing lives, or what makes large sequencing projects successful
Cuartero, Yasmina; Stadhouders, Ralph; Graf, Thomas; Marti-Renom, Marc A; Beato, Miguel
2017-01-01
Abstract T47D_rep2 and b1913e6c1_51720e9cf were 2 Hi-C samples. They were born and processed at the same time, yet their fates were very different. The life of b1913e6c1_51720e9cf was simple and fruitful, while that of T47D_rep2 was full of accidents and sorrow. At the heart of these differences lies the fact that b1913e6c1_51720e9cf was born under a lab culture of Documentation, Automation, Traceability, and Autonomy and compliance with the FAIR Principles. Their lives are a lesson for those who wish to embark on the journey of managing high-throughput sequencing data. PMID:29048533
Hoehl, Melanie M; Weißert, Michael; Dannenberg, Arne; Nesch, Thomas; Paust, Nils; von Stetten, Felix; Zengerle, Roland; Slocum, Alexander H; Steigert, Juergen
2014-06-01
This paper introduces a disposable battery-driven heating system for loop-mediated isothermal DNA amplification (LAMP) inside a centrifugally-driven DNA purification platform (LabTube). We demonstrate LabTube-based fully automated DNA purification of as low as 100 cell-equivalents of verotoxin-producing Escherichia coli (VTEC) in water, milk and apple juice in a laboratory centrifuge, followed by integrated and automated LAMP amplification with a reduction of hands-on time from 45 to 1 min. The heating system consists of two parallel SMD thick-film resistors as heating elements and an NTC thermistor as the temperature sensor. They are driven by a 3 V battery and controlled by a microcontroller. The LAMP reagents are stored in the elution chamber and the amplification starts immediately after the eluate is purged into the chamber. The LabTube, including a microcontroller-based heating system, demonstrates contamination-free and automated sample-to-answer nucleic acid testing within a laboratory centrifuge. The heating system can be easily parallelized within one LabTube and is deployable for a variety of heating and electrical applications.
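The NTC-based temperature sensing and heater switching can be sketched with the standard beta-model thermistor equation and simple on/off (bang-bang) control. The component values (10 kOhm at 25 degC, beta = 3950 K) and the 65 degC LAMP setpoint are assumptions for illustration, not the paper's specification.

```python
import math

# Assumed NTC parameters (beta model): nominal resistance R0 at T0 = 25 degC.
R0, T0, BETA = 10000.0, 298.15, 3950.0

def ntc_temperature_c(resistance_ohm):
    """Convert an NTC resistance reading to degrees Celsius (beta equation)."""
    inv_t = 1.0 / T0 + math.log(resistance_ohm / R0) / BETA
    return 1.0 / inv_t - 273.15

def heater_on(resistance_ohm, setpoint_c=65.0, hysteresis_c=0.5):
    """Bang-bang control: switch the film resistors on while below the band."""
    return ntc_temperature_c(resistance_ohm) < setpoint_c - hysteresis_c

print(round(ntc_temperature_c(10000.0), 1))  # 25.0 at the nominal point
print(heater_on(10000.0))                    # True: well below the setpoint
```

In a microcontroller loop this decision would run periodically from ADC readings of the thermistor divider; the hysteresis band keeps the resistors from chattering around the setpoint.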
Kino-Oka, Masahiro; Ogawa, Natsuki; Umegaki, Ryota; Taya, Masahito
2005-01-01
A novel bioreactor system was designed to perform a series of batchwise cultures of anchorage-dependent cells by means of automated operations of medium change and passage for cell transfer. The experimental data on contamination frequency ensured the biological cleanliness in the bioreactor system, which facilitated the operations in a closed environment, as compared with that in a flask culture system with manual handling. In addition, the tools for growth prediction (based on growth kinetics) and real-time growth monitoring by measurement of medium components (based on small-volume analyzing machinery) were installed into the bioreactor system to schedule the operations of medium change and passage and to confirm that the culture proceeds as scheduled, respectively. The successive culture of anchorage-dependent cells was conducted with the bioreactor running in an automated way. The automated bioreactor gave a successful culture performance in fair accordance with preset scheduling based on the information in the latest subculture, realizing 79-fold cell expansion over 169 h. In addition, the correlation factor between experimental data and scheduled values through the bioreactor performance was 0.998. It was concluded that the proposed bioreactor with the integration of the prediction and monitoring tools could offer a feasible system for the manufacturing process of cultured tissue products.
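The growth-kinetics scheduling described above rests on exponential-growth arithmetic; for example, the reported 79-fold expansion in 169 h implies a population doubling time of roughly 27 h. A one-function sketch:

```python
import math

def doubling_time_h(fold_expansion, duration_h):
    """Population doubling time implied by an overall fold expansion
    over a culture run, assuming exponential growth throughout."""
    return duration_h * math.log(2) / math.log(fold_expansion)

# The abstract's run: 79-fold expansion in 169 h.
print(round(doubling_time_h(79.0, 169.0), 1))  # ~26.8 h per doubling
```

Inverting the same relation predicts when the next medium change or passage is due, which is how a kinetics-based scheduler can set operation times in advance.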
Problems of Automation and Management Principles Information Flow in Manufacturing
NASA Astrophysics Data System (ADS)
Grigoryuk, E. N.; Bulkin, V. V.
2017-07-01
Automated process control systems are complex systems characterized by elements with a common overall purpose, the systemic nature of the algorithms implemented for information exchange and processing, and a large number of functional subsystems. The article gives examples of automatic control systems and automated process control systems, drawing parallels between them by identifying their strengths and weaknesses. A non-standard process control system is also proposed.
Cellular Metabolomics for Exposure and Toxicity Assessment
We have developed NMR automation and cell quench methods for cell culture-based metabolomics to study chemical exposure and toxicity. Our flow automation method is robust and free of cross contamination. The direct cell quench method is rapid and effective. Cell culture-based met...
Oxygen-controlled automated neural differentiation of mouse embryonic stem cells.
Mondragon-Teran, Paul; Tostoes, Rui; Mason, Chris; Lye, Gary J; Veraitch, Farlan S
2013-03-01
Automation and oxygen tension control are two tools that provide significant improvements to the reproducibility and efficiency of stem cell production processes. The aim of this study was to establish a novel automation platform capable of controlling oxygen tension during both the cell-culture and liquid-handling steps of neural differentiation processes. We built a bespoke automation platform, which enclosed a liquid-handling platform in a sterile, oxygen-controlled environment. An airtight connection was used to transfer cell culture plates to and from an automated oxygen-controlled incubator. Our results demonstrate that our system yielded comparable cell numbers, viabilities, metabolism profiles and differentiation efficiencies when compared with traditional manual processes. Interestingly, eliminating exposure to ambient conditions during the liquid-handling stage resulted in significant improvements in the yield of MAP2-positive neural cells, indicating that this level of control can improve differentiation processes. This article describes, for the first time, an automation platform capable of maintaining oxygen tension control during both the cell-culture and liquid-handling stages of a 2D embryonic stem cell differentiation process.
NASA Bioculture System: From Experiment Definition to Flight Payload
NASA Technical Reports Server (NTRS)
Sato, Kevin Y.; Almeida, Eduardo; Austin, Edward M.
2014-01-01
Starting in 2015, the NASA Bioculture System will be available to the science community to conduct cell biology and microbiology experiments on ISS. The Bioculture System carries ten environmentally independent Cassettes, which house the experiments. The closed-loop fluids flow path subsystem in each Cassette provides a perfusion-based method for maintaining specimen cultures in a shear-free environment by using a biochamber based on porous hollow fiber bioreactor technology. Each Cassette contains an incubator and separate insulated refrigerator compartment for storage of media, samples, nutrients and additives. The hardware is capable of fully automated or manual specimen culturing and processing, including in-flight experiment initiation, sampling and fixation, culturing of specimens up to BSL-2, and the ability to run up to 10 independent cultures in parallel for statistical analysis. The incubation and culturing of specimens in the Bioculture System is a departure from standard laboratory culturing methods. Therefore, it is critical that the PI has an understanding of the pre-flight testing required to successfully use the Bioculture System for an on-orbit experiment. Overall, the PI will conduct a series of ground tests to define flight experiment and on-orbit implementation requirements, verify biocompatibility, and determine base bioreactor conditions. The ground test processes for the utilization of the Bioculture System, from experiment selection to flight, will be reviewed. Also, pre-flight test schedules and use of COTS ground test equipment (CellMax and FiberCell systems) and the Bioculture System will be discussed.
Automation of Data Traffic Control on DSM Architecture
NASA Technical Reports Server (NTRS)
Frumkin, Michael; Jin, Hao-Qiang; Yan, Jerry
2001-01-01
The design of distributed shared memory (DSM) computers liberates users from the duty to distribute data across processors and allows for the incremental development of parallel programs using, for example, OpenMP or Java threads. DSM architecture greatly simplifies the development of parallel programs having good performance on a few processors. However, achieving good program scalability on DSM computers requires that the user understand data flow in the application and use various techniques to avoid data traffic congestion. In this paper we discuss a number of such techniques, including data blocking, data placement, data transposition and page size control, and evaluate their efficiency on the NAS (NASA Advanced Supercomputing) Parallel Benchmarks. We also present a tool that automates the detection of constructs causing data congestion in Fortran array-oriented codes and advises the user on code transformations for improving data traffic in the application.
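Of the techniques listed, data blocking is the easiest to sketch: processing a matrix in tiles keeps both source and destination accesses local, which is what reduces remote-memory traffic on a DSM machine. The sketch below is illustrative Python (the paper's tool targets Fortran array codes); the blocked transpose produces the same result as a naive one, only with a tiled access pattern.

```python
def transpose_blocked(matrix, block=32):
    """Cache-friendly transpose: walk the matrix in block x block tiles so that
    both the source rows and the destination rows are reused while resident."""
    n_rows, n_cols = len(matrix), len(matrix[0])
    out = [[0] * n_rows for _ in range(n_cols)]
    for i0 in range(0, n_rows, block):
        for j0 in range(0, n_cols, block):
            # Process one tile; all accesses stay within block-sized row spans.
            for i in range(i0, min(i0 + block, n_rows)):
                for j in range(j0, min(j0 + block, n_cols)):
                    out[j][i] = matrix[i][j]
    return out

m = [[1, 2, 3], [4, 5, 6]]
print(transpose_blocked(m, block=2))  # [[1, 4], [2, 5], [3, 6]]
```

In a compiled language the tile size would be tuned to the cache line and page size, which is exactly the page-size-control knob the abstract mentions.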
Thurman, G. B.; Strong, D. M.; Ahmed, A.; Green, S. S.; Sell, K. W.; Hartzman, R. J.; Bach, F. H.
1973-01-01
Use of lymphocyte cultures for in vitro studies such as pretransplant histocompatibility testing has established the need for standardization of this technique. A microculture technique has been developed that has facilitated the culturing of lymphocytes and increased the quantity of cultures feasible, while lowering the variation between replicate samples. Cultures were prepared for determination of tritiated thymidine incorporation using a Multiple Automated Sample Harvester (MASH). Using this system, the parameters that influence the in vitro responsiveness of human lymphocytes to allogeneic lymphocytes have been investigated. PMID:4271568
Hussain, Waqar; Moens, Nathalie; Veraitch, Farlan S.; Hernandez, Diana; Mason, Chris; Lye, Gary J.
2013-01-01
The use of embryonic stem cells (ESCs) and their progeny in high throughput drug discovery and regenerative medicine will require production at scale of well characterized cells at an appropriate level of purity. The adoption of automated bioprocessing techniques offers the possibility to overcome the lack of consistency and high failure rates seen with current manual protocols. To build the case for increased use of automation this work addresses the key question: “can an automated system match the quality of a highly skilled and experienced person working manually?” To answer this we first describe an integrated automation platform designed for the ‘hands-free’ culture and differentiation of ESCs in microwell formats. Next we outline a framework for the systematic investigation and optimization of key bioprocess variables for the rapid establishment of validatable Standard Operating Procedures (SOPs). Finally the experimental comparison between manual and automated bioprocessing is exemplified by expansion of the murine Oct-4-GiP ESC line over eight sequential passages with their subsequent directed differentiation into neural precursors. Our results show that ESCs can be effectively maintained and differentiated in a highly reproducible manner by the automated system described. Statistical analysis of the results for cell growth over single and multiple passages shows up to a 3-fold improvement in the consistency of cell growth kinetics with automated passaging. The quality of the cells produced was evaluated using a panel of biological markers including cell growth rate and viability, nutrient and metabolite profiles, changes in gene expression and immunocytochemistry. Automated processing of the ESCs had no measurable negative effect on either their pluripotency or their ability to differentiate into the three embryonic germ layers. 
Equally important is that over a 6-month period of culture without antibiotics in the medium, we have not had any cases of culture contamination. This study thus confirms the benefits of adopting automated bioprocess routes to produce cells for therapy and for use in basic discovery research. PMID:23956681
Automated Patch-Clamp Methods for the hERG Cardiac Potassium Channel.
Houtmann, Sylvie; Schombert, Brigitte; Sanson, Camille; Partiseti, Michel; Bohme, G Andrees
2017-01-01
The human Ether-a-go-go Related Gene (hERG) product has been identified as a central ion channel underlying both familial forms of elongated QT interval on the electrocardiogram and drug-induced elongation of the same QT segment. Indeed, reduced function of this potassium channel involved in the repolarization of the cardiac action potential can produce a type of life-threatening cardiac ventricular arrhythmia called Torsades de Pointes (TdP). Therefore, hERG inhibitory activity of newly synthesized molecules is a relevant structure-activity metric for compound prioritization and optimization in the medicinal chemistry phases of drug discovery. Electrophysiology remains the gold standard for the functional assessment of ion channel pharmacology. Recent years have witnessed the automation and parallelization of the manual patch-clamp technique, allowing higher-throughput screening on recombinant hERG channels. However, the multi-well plate format of automated patch-clamp does not allow visual detection of potential micro-precipitation of poorly soluble compounds. In this chapter we describe bench procedures for the culture and preparation of hERG-expressing CHO cells for recording on an automated patch-clamp workstation. We also show that the sensitivity of the assay can be improved by adding a surfactant to the extracellular medium.
NASA Technical Reports Server (NTRS)
Koltai, Kolina; Ho, Nhut; Masequesmay, Gina; Niedober, David; Skoog, Mark; Cacanindin, Artemio; Johnson, Walter; Lyons, Joseph
2014-01-01
This paper discusses a case study that examined the influence of cultural, organizational, and automation-capability factors upon human trust in, and reliance on, automation. In particular, this paper focuses on the design and application of an extended case study methodology, and on the foundational lessons revealed by it. Experimental test pilots involved in the research and development of the US Air Force's newly developed Automatic Ground Collision Avoidance System served as the context for this examination. An eclectic, multi-pronged approach was designed to conduct this case study, and proved effective in addressing the challenges associated with the case's politically sensitive and military environment. Key results indicate that the system design was in alignment with pilot culture and organizational mission, indicating the potential for appropriate trust development in operational pilots. Key factors identified include the low-vulnerability/high-risk nature of the pilot profession, automation transparency and suspicion, system reputation, and the setup of and communications among the organizations involved in the system development.
Zang, Qin; Javed, Salim; Hill, David; Ullah, Farman; Bi, Danse; Porubsky, Patrick; Neuenswander, Benjamin; Lushington, Gerald H; Santini, Conrad; Organ, Michael G; Hanson, Paul R
2012-08-13
The construction of a 96-member library of triazolated 1,2,5-thiadiazepane 1,1-dioxides was performed on a Chemspeed Accelerator (SLT-100) automated parallel synthesis platform, culminating in the successful preparation of 94 out of 96 possible products. The key step, a one-pot, sequential elimination, double-aza-Michael reaction, and [3 + 2] Huisgen cycloaddition pathway has been automated and utilized in the production of two sets of triazolated sultam products.
Automated video surveillance: teaching an old dog new tricks
NASA Astrophysics Data System (ADS)
McLeod, Alastair
1993-12-01
The automated video surveillance market is booming with new players, new systems, new hardware and software, and an extended range of applications. This paper reviews available technology and describes the features required for a good automated surveillance system. Both hardware and software are discussed. An overview of typical applications is also given. A shift towards PC-based hybrid systems, the use of parallel processing, neural networks, and the exploitation of modern telecoms are introduced, highlighting the evolution of modern video surveillance systems.
Danker, Timm; Braun, Franziska; Silbernagl, Nikole; Guenther, Elke
2016-03-01
Manual patch clamp, the gold standard of electrophysiology, represents a powerful and versatile toolbox to stimulate, modulate, and record ion channel activity from membrane fragments and whole cells. The electrophysiological readout can be combined with fluorescent or optogenetic methods and allows for ultrafast solution exchanges using specialized microfluidic tools. A hallmark of manual patch clamp is the intentional selection of individual cells for recording, often an essential prerequisite to generate meaningful data. So far, available automation solutions rely on random cell usage in the closed environment of a chip and thus sacrifice much of this versatility by design. To parallelize and automate the traditional patch clamp technique while perpetuating the full versatility of the method, we developed an approach to automation, which is based on active cell handling and targeted electrode placement rather than on random processes. This is achieved through an automated pipette positioning system, which guides the tips of recording pipettes with micrometer precision to a microfluidic cell handling device. Using a patch pipette array mounted on a conventional micromanipulator, our automated patch clamp process mimics the original manual patch clamp as closely as possible, yet achieving a configuration where recordings are obtained from many patch electrodes in parallel. In addition, our implementation is extensible by design to allow the easy integration of specialized equipment such as ultrafast compound application tools. The resulting system offers fully automated patch clamp on purposely selected cells and combines high-quality gigaseal recordings with solution switching in the millisecond timescale.
Applying Parallel Processing Techniques to Tether Dynamics Simulation
NASA Technical Reports Server (NTRS)
Wells, B. Earl
1996-01-01
The focus of this research has been to determine the effectiveness of applying parallel processing techniques to a sizable real-world problem, the simulation of the dynamics associated with a tether which connects two objects in low earth orbit, and to explore the degree to which the parallelization process can be automated through the creation of new software tools. The goal has been to utilize this specific application problem as a base to develop more generally applicable techniques.
The Influence of Cultural Factors on Trust in Automation
ERIC Educational Resources Information Center
Chien, Shih-Yi James
2016-01-01
Human interaction with automation is a complex process that requires both skilled operators and complex system designs to effectively enhance overall performance. Although automation has successfully managed complex systems throughout the world for over half a century, inappropriate reliance on automation can still occur, such as the recent…
Flaberg, Emilie; Sabelström, Per; Strandh, Christer; Szekely, Laszlo
2008-01-01
Background Confocal laser scanning microscopy has revolutionized cell biology. However, the technique has major limitations in speed and sensitivity due to the fact that a single laser beam scans the sample, allowing only a few microseconds signal collection for each pixel. This limitation has been overcome by the introduction of parallel beam illumination techniques in combination with cold CCD camera based image capture. Methods Using the combination of microlens enhanced Nipkow spinning disc confocal illumination together with fully automated image capture and large scale in silico image processing we have developed a system allowing the acquisition, presentation and analysis of maximum resolution confocal panorama images of several Gigapixel size. We call the method Extended Field Laser Confocal Microscopy (EFLCM). Results We show using the EFLCM technique that it is possible to create a continuous confocal multi-colour mosaic from thousands of individually captured images. EFLCM can digitize and analyze histological slides, sections of entire rodent organ and full size embryos. It can also record hundreds of thousands cultured cells at multiple wavelength in single event or time-lapse fashion on fixed slides, in live cell imaging chambers or microtiter plates. Conclusion The observer independent image capture of EFLCM allows quantitative measurements of fluorescence intensities and morphological parameters on a large number of cells. EFLCM therefore bridges the gap between the mainly illustrative fluorescence microscopy and purely quantitative flow cytometry. EFLCM can also be used as high content analysis (HCA) instrument for automated screening processes. PMID:18627634
Tak For Yu, Zeta; Guan, Huijiao; Ki Cheung, Mei; McHugh, Walker M.; Cornell, Timothy T.; Shanley, Thomas P.; Kurabayashi, Katsuo; Fu, Jianping
2015-01-01
Immunoassays represent one of the most popular analytical methods for detection and quantification of biomolecules. However, conventional immunoassays such as ELISA and flow cytometry, even though providing high sensitivity and specificity and multiplexing capability, can be labor-intensive and prone to human error, making them unsuitable for standardized clinical diagnoses. Using a commercialized no-wash, homogeneous immunoassay technology (‘AlphaLISA’) in conjunction with integrated microfluidics, herein we developed a microfluidic immunoassay chip capable of rapid, automated, parallel immunoassays of microliter quantities of samples. Operation of the microfluidic immunoassay chip entailed rapid mixing and conjugation of AlphaLISA components with target analytes before quantitative imaging for analyte detections in up to eight samples simultaneously. Aspects such as fluid handling and operation, surface passivation, imaging uniformity, and detection sensitivity of the microfluidic immunoassay chip using AlphaLISA were investigated. The microfluidic immunoassay chip could detect one target analyte simultaneously for up to eight samples in 45 min with a limit of detection down to 10 pg/mL. The microfluidic immunoassay chip was further utilized for functional immunophenotyping to examine cytokine secretion from human immune cells stimulated ex vivo. Together, the microfluidic immunoassay chip provides a promising high-throughput, high-content platform for rapid, automated, parallel quantitative immunosensing applications. PMID:26074253
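The reported 10 pg/mL detection limit invites a note on how such a figure is commonly derived. Below is a minimal Python sketch of the conventional blank-plus-three-standard-deviations definition; the blank readings and calibration slope are invented for illustration, and the abstract does not state which LOD convention the study used.

```python
import statistics

def limit_of_detection(blank_readings, calibration_slope):
    """LOD as the concentration whose signal exceeds the blank mean by
    three standard deviations (a common convention, assumed here)."""
    sd_blank = statistics.stdev(blank_readings)
    # Convert the 3-sigma signal margin to a concentration using the
    # linear calibration slope (signal counts per pg/mL).
    return 3 * sd_blank / calibration_slope

# Hypothetical blank replicates (arbitrary fluorescence counts) and slope.
blanks = [102.0, 98.5, 101.2, 99.8, 100.5]
lod_pg_ml = limit_of_detection(blanks, calibration_slope=0.42)
```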
Droplet Array-Based 3D Coculture System for High-Throughput Tumor Angiogenesis Assay.
Du, Xiaohui; Li, Wanming; Du, Guansheng; Cho, Hansang; Yu, Min; Fang, Qun; Lee, Luke P; Fang, Jin
2018-03-06
Angiogenesis is critical for tumor progression and metastasis, and it progresses through orchestrated multicellular interactions. Thus, there is urgent demand for high-throughput tumor angiogenesis assays for concurrent examination of multiple factors. For investigating tumor angiogenesis, we developed a microfluidic droplet array-based cell-coculture system comprising a two-layer polydimethylsiloxane chip featuring 6 × 9 paired-well arrays and an automated droplet-manipulation device. In each droplet-pair unit, tumor cells were cultured in 3D in one droplet by mixing cell suspensions with Matrigel, and in the other droplet, human umbilical vein endothelial cells (HUVECs) were cultured in 2D. Droplets were fused by a newly developed fusion method, and tumor angiogenesis was assayed by coculturing tumor cells and HUVECs in the fused droplet units. The 3D-cultured tumor cells formed aggregates harboring a hypoxic center, as observed in vivo, and secreted more vascular endothelial growth factor (VEGF) and more strongly induced HUVEC tubule formation than did 2D-cultured tumor cells. Our single array supported 54 assays in parallel. The angiogenic potentials of distinct tumor cells and their differential responses to the antiangiogenesis agent Fingolimod could be investigated without mutual interference in a single array. Our droplet-based assay is convenient for evaluating multicellular interaction in high throughput in the context of tumor sprouting angiogenesis, and we envision that the assay will be broadly implementable for studying other cell-cell interactions.
Parallel sequencing lives, or what makes large sequencing projects successful.
Quilez, Javier; Vidal, Enrique; Le Dily, François; Serra, François; Cuartero, Yasmina; Stadhouders, Ralph; Graf, Thomas; Marti-Renom, Marc A; Beato, Miguel; Filion, Guillaume
2017-11-01
T47D_rep2 and b1913e6c1_51720e9cf were 2 Hi-C samples. They were born and processed at the same time, yet their fates were very different. The life of b1913e6c1_51720e9cf was simple and fruitful, while that of T47D_rep2 was full of accidents and sorrow. At the heart of these differences lies the fact that b1913e6c1_51720e9cf was born under a lab culture of Documentation, Automation, Traceability, and Autonomy and compliance with the FAIR Principles. Their lives are a lesson for those who wish to embark on the journey of managing high-throughput sequencing data. © The Author 2017. Published by Oxford University Press.
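The opaque-looking sample name b1913e6c1_51720e9cf hints at content-addressed identifiers. A minimal Python sketch of the traceability idea, assuming ids are derived by hashing a sample's metadata (the abstract does not describe how its ids were actually minted; the field names below are invented):

```python
import hashlib
import json

def sample_id(metadata, length=9):
    """Stable identifier derived from canonicalized metadata: the same
    metadata always yields the same id, which aids traceability."""
    canonical = json.dumps(metadata, sort_keys=True)  # stable field order
    return hashlib.sha1(canonical.encode("utf-8")).hexdigest()[:length]

# Hypothetical sample record.
meta = {"cell_line": "T47D", "assay": "Hi-C", "replicate": 2}
sid = sample_id(meta)
```

Because the id is a pure function of the metadata, the naming accidents that plague manually labeled samples become impossible by construction.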
An 8-Fold Parallel Reactor System for Combinatorial Catalysis Research
Stoll, Norbert; Allwardt, Arne; Dingerdissen, Uwe
2006-01-01
Increasing economic globalization and mounting time and cost pressure on the development of new raw materials for the chemical industry as well as materials and environmental engineering constantly raise the demands on technologies to be used. Parallelization, miniaturization, and automation are the main concepts involved in increasing the rate of chemical and biological experimentation. PMID:17671621
Multilevel decomposition of complete vehicle configuration in a parallel computing environment
NASA Technical Reports Server (NTRS)
Bhatt, Vinay; Ragsdell, K. M.
1989-01-01
This research summarizes various approaches to multilevel decomposition to solve large structural problems. A linear decomposition scheme based on the Sobieski algorithm is selected as a vehicle for automated synthesis of a complete vehicle configuration in a parallel processing environment. The research is in a developmental state. Preliminary numerical results are presented for several example problems.
Automation of 3D cell culture using chemically defined hydrogels.
Rimann, Markus; Angres, Brigitte; Patocchi-Tenzer, Isabel; Braum, Susanne; Graf-Hausner, Ursula
2014-04-01
Drug development relies on high-throughput screening involving cell-based assays. Most of the assays are still based on cells grown in monolayer rather than in three-dimensional (3D) formats, although cells behave more in vivo-like in 3D. To exemplify the adoption of 3D techniques in drug development, this project investigated the automation of a hydrogel-based 3D cell culture system using a liquid-handling robot. The hydrogel technology used offers high flexibility of gel design due to a modular composition of a polymer network and bioactive components. The cell inert degradation of the gel at the end of the culture period guaranteed the harmless isolation of live cells for further downstream processing. Human colon carcinoma cells HCT-116 were encapsulated and grown in these dextran-based hydrogels, thereby forming 3D multicellular spheroids. Viability and DNA content of the cells were shown to be similar in automated and manually produced hydrogels. Furthermore, cell treatment with toxic Taxol concentrations (100 nM) had the same effect on HCT-116 cell viability in manually and automated hydrogel preparations. Finally, a fully automated dose-response curve with the reference compound Taxol showed the potential of this hydrogel-based 3D cell culture system in advanced drug development.
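A dose-response series like the automated Taxol experiment is typically summarized with a four-parameter logistic model. The following Python sketch evaluates and inverts that model; the parameter values are invented, and the abstract does not state which model the study's curve fitting used.

```python
def four_pl(conc, bottom, top, ic50, hill):
    """Four-parameter logistic (4PL) dose-response model."""
    return bottom + (top - bottom) / (1.0 + (conc / ic50) ** hill)

def inverse_four_pl(response, bottom, top, ic50, hill):
    """Concentration that produces a given response (inverse of the 4PL)."""
    return ic50 * ((top - bottom) / (response - bottom) - 1.0) ** (1.0 / hill)

# Hypothetical parameters: viability falling from 100% to 20%, IC50 30 nM.
params = dict(bottom=20.0, top=100.0, ic50=30.0, hill=1.5)
half_max = four_pl(30.0, **params)  # at conc == IC50 the response is midway
```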
[Microbiological point of care tests].
Book, Malte; Lehmann, Lutz Eric; Zhang, Xianghong; Stüber, Frank
2010-11-01
It is well known that early initiation of a specific anti-infective therapy is crucial to reducing mortality in severe infection. Culturing of pathogens is the diagnostic gold standard in such diseases; however, these methods yield results at the earliest after 24 to 48 hours. Severe infections such as sepsis must therefore be treated with an empirical antimicrobial therapy, which is ineffective in an unknown fraction of these patients. Today's microbiological point-of-care tests are pathogen-specific and therefore not appropriate for an infection with a variety of possible pathogens. Molecular nucleic acid diagnostics such as polymerase chain reaction (PCR) allow the identification of pathogens and resistances, and these methods are used routinely to speed up the analysis of positive blood cultures. The newest PCR-based system allows the identification of the 25 most frequent sepsis pathogens in parallel, without previous culture, in less than 6 hours; such systems might thereby shorten the period of possibly insufficient anti-infective therapy. However, these extensive tools are not suitable as point-of-care diagnostics. Miniaturization and automation of nucleic acid-based methods are pending, as is an increase in the number of pathogens and resistance genes detectable by them. It is assumed that molecular PCR techniques will have an increasing impact on microbiological diagnostics in the future. © Georg Thieme Verlag Stuttgart · New York.
Archer, Charles Jens; Musselman, Roy Glenn; Peters, Amanda; Pinnow, Kurt Walter; Swartz, Brent Allen; Wallenfelt, Brian Paul
2010-11-16
A massively parallel computer system contains an inter-nodal communications network of node-to-node links. An automated routing strategy routes packets through one or more intermediate nodes of the network to reach a destination. Some packets are constrained to be routed through respective designated transporter nodes, the automated routing strategy determining a path from a respective source node to a respective transporter node, and from a respective transporter node to a respective destination node. Preferably, the source node chooses a routing policy from among multiple possible choices, and that policy is followed by all intermediate nodes. The use of transporter nodes allows greater flexibility in routing.
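The two-leg routing constraint described above can be sketched with a breadth-first search over a node-link graph. This is a minimal Python illustration only: the mesh, node numbering, and function names are invented, and the patent's actual torus topology and routing policies are not modeled.

```python
from collections import deque

def shortest_path(links, src, dst):
    """Breadth-first shortest path over an undirected node-link graph."""
    prev, seen, queue = {}, {src}, deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            path = [dst]
            while path[-1] != src:
                path.append(prev[path[-1]])
            return path[::-1]
        for nxt in links[node]:
            if nxt not in seen:
                seen.add(nxt)
                prev[nxt] = node
                queue.append(nxt)
    return None

def route_via_transporter(links, src, transporter, dst):
    """Constrained route: source to transporter, then transporter to
    destination, mirroring the two-leg strategy described above."""
    first = shortest_path(links, src, transporter)
    second = shortest_path(links, transporter, dst)
    return first + second[1:]  # avoid repeating the transporter node

# Hypothetical 2x3 mesh of six nodes.
mesh = {0: [1, 3], 1: [0, 2, 4], 2: [1, 5], 3: [0, 4], 4: [1, 3, 5], 5: [2, 4]}
path = route_via_transporter(mesh, 0, 4, 2)
```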
Automated Performance Prediction of Message-Passing Parallel Programs
NASA Technical Reports Server (NTRS)
Block, Robert J.; Sarukkai, Sekhar; Mehra, Pankaj; Woodrow, Thomas S. (Technical Monitor)
1995-01-01
The increasing use of massively parallel supercomputers to solve large-scale scientific problems has generated a need for tools that can predict scalability trends of applications written for these machines. Much work has been done to create simple models that represent important characteristics of parallel programs, such as latency, network contention, and communication volume. But many of these methods still require substantial manual effort to represent an application in the model's format. The MK toolkit described in this paper is the result of an on-going effort to automate the formation of analytic expressions of program execution time, with a minimum of programmer assistance. In this paper we demonstrate the feasibility of our approach, by extending previous work to detect and model communication patterns automatically, with and without overlapped computations. The predictions derived from these models agree, within reasonable limits, with execution times of programs measured on the Intel iPSC/860 and Paragon. Further, we demonstrate the use of MK in selecting optimal computational grain size and studying various scalability metrics.
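A toolkit of this kind composes simple analytic terms for communication and computation. As an illustration only (the parameter values and function names below are hypothetical, not the toolkit's actual model), a latency/bandwidth cost model with and without overlapped computation might look like:

```python
def comm_time(message_bytes, latency_s, bandwidth_bytes_per_s):
    """Latency/bandwidth cost of one point-to-point message."""
    return latency_s + message_bytes / bandwidth_bytes_per_s

def run_time(steps, message_bytes, latency_s, bandwidth_bytes_per_s,
             compute_s, overlap=False):
    """Predicted time for `steps` compute+communicate iterations; with
    overlapped communication only the larger term counts per step."""
    comm = comm_time(message_bytes, latency_s, bandwidth_bytes_per_s)
    per_step = max(comm, compute_s) if overlap else comm + compute_s
    return steps * per_step

# Hypothetical machine: 50 us message latency, 100 MB/s links.
t_blocking = run_time(100, 8192, 50e-6, 100e6, 200e-6)
t_overlap = run_time(100, 8192, 50e-6, 100e6, 200e-6, overlap=True)
```

Overlapping hides the communication term whenever the per-step computation dominates, which is exactly the effect such models are built to predict.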
Prüller, Florian; Wagner, Jasmin; Raggam, Reinhard B; Hoenigl, Martin; Kessler, Harald H; Truschnig-Wilders, Martie; Krause, Robert
2014-07-01
Testing for (1→3)-beta-D-glucan (BDG) is used for detection of invasive fungal infection. However, current assays lack automation and the ability to conduct rapid single-sample testing. The Fungitell assay was adapted for automation and evaluated using clinical samples from patients with culture-proven candidemia and from culture-negative controls in duplicate. A comparison with the standard assay protocol was made in order to establish analytical specifications. With the automated protocol, the analytical measuring range was 8-2500 pg/ml of BDG, and precision testing resulted in coefficients of variation that ranged from 3.0% to 5.5%. Samples from 15 patients with culture-proven candidemia and 94 culture-negative samples were evaluated. All culture-proven samples showed BDG values >80 pg/ml (mean 1247 pg/ml; range, 116-2990 pg/ml), which were considered positive. Of the 94 culture-negative samples, 92 had BDG values <60 pg/ml (mean, 28 pg/ml), which were considered to be negative, and 2 samples were false-positive (≥80 pg/ml; up to 124 pg/ml). Results could be obtained within 45 min and showed excellent agreement with results obtained with the standard assay protocol. The automated Fungitell assay proved to be reliable and rapid for diagnosis of candidemia. It was demonstrated to be feasible and cost efficient for both single-sample and large-scale testing of serum BDG. Its 1-h time-to-result will allow better support for clinicians in the management of antifungal therapy. © The Author 2014. Published by Oxford University Press on behalf of The International Society for Human and Animal Mycology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
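The decision rule implied by the reported cutoffs is simple enough to state directly. A Python sketch, treating the 60-80 pg/ml band as indeterminate (an assumption; the abstract does not name that zone):

```python
def classify_bdg(value_pg_ml):
    """Apply the cutoffs used in the study: >= 80 pg/ml read as positive,
    < 60 pg/ml as negative; the band in between is treated here as
    indeterminate (an assumed convention)."""
    if value_pg_ml >= 80:
        return "positive"
    if value_pg_ml < 60:
        return "negative"
    return "indeterminate"

# Values echoing the reported ranges: negative controls near 28 pg/ml,
# candidemia samples from 116 up to 2990 pg/ml.
results = [classify_bdg(v) for v in (28, 70, 116, 2990)]
```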
Impact of Implementation of an Automated Liquid Culture System on Diagnosis of Tuberculous Pleurisy.
Lee, Byung Hee; Yoon, Seong Hoon; Yeo, Hye Ju; Kim, Dong Wan; Lee, Seung Eun; Cho, Woo Hyun; Lee, Su Jin; Kim, Yun Seong; Jeon, Doosoo
2015-07-01
This study was conducted to evaluate the impact of implementation of an automated liquid culture system on the diagnosis of tuberculous pleurisy in an HIV-uninfected patient population. We retrospectively compared the culture yield, time to positivity, and contamination rate of pleural effusion samples in the BACTEC Mycobacteria Growth Indicator Tube 960 (MGIT) and Ogawa media among patients with tuberculous pleurisy. Out of 104 effusion samples, 43 (41.3%) were culture positive on either the MGIT or the Ogawa media. The culture yield of MGIT was higher (40.4%, 42/104) than that of Ogawa media (18.3%, 19/104) (P<0.001). One of the samples was positive only on the Ogawa medium. The median time to positivity was faster in the MGIT (18 days, range 8-32 days) than in the Ogawa media (37 days, range 20-59 days) (P<0.001). No contamination or growth of nontuberculous mycobacterium was observed on either of the culture media. In conclusion, the automated liquid culture system provided approximately twice the yield and faster results in effusion culture, compared to solid media. Supplemental solid media may have a limited impact on maximizing sensitivity in effusion culture; however, further studies are required.
DataForge: Modular platform for data storage and analysis
NASA Astrophysics Data System (ADS)
Nozik, Alexander
2018-04-01
DataForge is a framework for automated data acquisition, storage, and analysis built on modern applied-programming practice. It aims to automate standard tasks such as parallel data processing, logging, output sorting, and distributed computing. The framework also makes extensive use of declarative programming principles via a meta-data concept, which allows a degree of meta-programming and improves the reproducibility of results.
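The meta-data concept means the analysis is described declaratively and interpreted by the framework. A toy Python interpreter for such a description (illustrative only; the names below are not DataForge's real API):

```python
# Declarative description of a two-step pipeline.
META = {
    "steps": [{"op": "scale", "factor": 2.0}, {"op": "offset", "value": 1.0}],
}

# Registry mapping declared operation names to implementations.
OPS = {
    "scale": lambda data, step: [x * step["factor"] for x in data],
    "offset": lambda data, step: [x + step["value"] for x in data],
}

def run_pipeline(meta, data):
    """Interpret the meta-data: apply each declared step in order, so the
    analysis is reproducible from the description alone."""
    for step in meta["steps"]:
        data = OPS[step["op"]](data, step)
    return data

out = run_pipeline(META, [1.0, 2.0, 3.0])
```

Because the pipeline is data rather than code, the same description can be logged, versioned, and re-run, which is what makes results reproducible.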
A Parallel Genetic Algorithm for Automated Electronic Circuit Design
NASA Technical Reports Server (NTRS)
Lohn, Jason D.; Colombano, Silvano P.; Haith, Gary L.; Stassinopoulos, Dimitris; Norvig, Peter (Technical Monitor)
2000-01-01
We describe a parallel genetic algorithm (GA) that automatically generates circuit designs using evolutionary search. A circuit-construction programming language is introduced and we show how evolution can generate practical analog circuit designs. Our system allows circuit size (number of devices), circuit topology, and device values to be evolved. We present experimental results as applied to analog filter and amplifier design tasks.
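A minimal, seeded sketch of the evolutionary loop, with a numeric target response standing in for circuit simulation. The fitness function, population sizes, and operators here are illustrative assumptions, not the paper's circuit-construction language.

```python
import random

random.seed(0)  # deterministic for illustration

TARGET = [0.0, 1.0, 0.5, 0.25]  # stand-in for a desired circuit response

def fitness(genome):
    """Negative squared error against the target; in the paper, fitness
    would come from simulating the evolved circuit instead."""
    return -sum((g - t) ** 2 for g, t in zip(genome, TARGET))

def evolve(pop_size=40, generations=60, sigma=0.1):
    pop = [[random.random() for _ in TARGET] for _ in range(pop_size)]
    for _ in range(generations):
        # Scoring each individual is independent, which is what makes a
        # GA parallelizable (e.g. one circuit simulation per worker).
        scored = sorted(pop, key=fitness, reverse=True)
        parents = scored[: pop_size // 2]  # elitist truncation selection
        pop = parents + [
            [g + random.gauss(0, sigma) for g in random.choice(parents)]
            for _ in range(pop_size - len(parents))
        ]
    return max(pop, key=fitness)

best = evolve()
```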
Automated Handling of Garments for Pressing
1991-09-30
Table of contents (excerpt): Parallel Algorithms for 2D Kalman Filtering (D.J. Potter and M.P. Cline); Hash Table and Sorted Array: A Case Study of Kalman Filtering on the Connection Machine (M.A. Palis and D.K. Krecker); Parallel Sorting of Large Arrays on the MasPar; Algorithms for Seam Sensing (Karel algorithms; image filtering).
Flexible automation of cell culture and tissue engineering tasks.
Knoll, Alois; Scherer, Torsten; Poggendorf, Iris; Lütkemeyer, Dirk; Lehmann, Jürgen
2004-01-01
Until now, the predominant use cases of industrial robots have been routine handling tasks in the automotive industry. In biotechnology and tissue engineering, in contrast, only very few tasks have been automated with robots. New developments in robot platform and robot sensor technology, however, make it possible to automate plants that largely depend on human interaction with the production process, e.g., for material and cell culture fluid handling, transportation, operation of equipment, and maintenance. In this paper we present a robot system that lends itself to automating routine tasks in biotechnology but also has the potential to automate other production facilities that are similar in process structure. After motivating the design goals, we describe the system and its operation, illustrate sample runs, and give an assessment of the advantages. We conclude this paper by giving an outlook on possible further developments.
Manufacture of a human mesenchymal stem cell population using an automated cell culture platform.
Thomas, Robert James; Chandra, Amit; Liu, Yang; Hourd, Paul C; Conway, Paul P; Williams, David J
2007-09-01
Tissue engineering and regenerative medicine are rapidly developing fields that use cells or cell-based constructs as therapeutic products for a wide range of clinical applications. Efforts to commercialise these therapies are driving a need for capable, scaleable, manufacturing technologies to ensure therapies are able to meet regulatory requirements and are economically viable at industrial scale production. We report the first automated expansion of a human bone marrow derived mesenchymal stem cell population (hMSCs) using a fully automated cell culture platform. Differences in cell population growth profile, attributed to key methodological differences, were observed between the automated protocol and a benchmark manual protocol. However, qualitatively similar cell output, assessed by cell morphology and the expression of typical hMSC markers, was obtained from both systems. Furthermore, the critical importance of minor process variation, e.g. the effect of cell seeding density on characteristics such as population growth kinetics and cell phenotype, was observed irrespective of protocol type. This work highlights the importance of careful process design in therapeutic cell manufacture and demonstrates the potential of automated culture for future optimisation and scale up studies required for the translation of regenerative medicine products from the laboratory to the clinic.
A droplet-to-digital (D2D) microfluidic device for single cell assays.
Shih, Steve C C; Gach, Philip C; Sustarich, Jess; Simmons, Blake A; Adams, Paul D; Singh, Seema; Singh, Anup K
2015-01-07
We have developed a new hybrid droplet-to-digital microfluidic platform (D2D) that integrates droplet-in-channel microfluidics with digital microfluidics (DMF) for performing multi-step assays. This D2D platform combines the strengths of the two formats-droplets-in-channel for facile generation of droplets containing single cells, and DMF for on-demand manipulation of droplets including control of different droplet volumes (pL-μL), creation of a dilution series of ionic liquid (IL), and parallel single cell culturing and analysis for IL toxicity screening. This D2D device also allows for automated analysis that includes a feedback-controlled system for merging and splitting of droplets to add reagents, an integrated Peltier element for parallel cell culture at optimum temperature, and an impedance sensing mechanism to control the flow rate for droplet generation and preventing droplet evaporation. Droplet-in-channel is well-suited for encapsulation of single cells as it allows the careful manipulation of flow rates of aqueous phase containing cells and oil to optimize encapsulation. Once single cell containing droplets are generated, they are transferred to a DMF chip via a capillary where they are merged with droplets containing IL and cultured at 30 °C. The DMF chip, in addition to permitting cell culture and reagent (ionic liquid/salt) addition, also allows recovery of individual droplets for off-chip analysis such as further culturing and measurement of ethanol production. The D2D chip was used to evaluate the effect of IL/salt type (four types: NaOAc, NaCl, [C2mim] [OAc], [C2mim] [Cl]) and concentration (four concentrations: 0, 37.5, 75, 150 mM) on the growth kinetics and ethanol production of yeast and as expected, increasing IL concentration led to lower biomass and ethanol production. 
Specifically, [C2mim][OAc] had inhibitory effects on yeast growth at concentrations of 75 and 150 mM and significantly reduced ethanol production compared with cells grown in the other ILs/salts. The growth curve trends obtained by D2D matched those of conventional yeast culturing in microtiter wells, validating the D2D platform. We believe that our approach represents a generic platform for multi-step biochemical assays such as drug screening, digital PCR, enzyme assays, immunoassays and cell-based assays.
ERIC Educational Resources Information Center
Chen, Pei-Hua; Chang, Hua-Hua; Wu, Haiyan
2012-01-01
Two sampling-and-classification-based procedures were developed for automated test assembly: the Cell Only and the Cell and Cube methods. A simulation study based on a 540-item bank was conducted to compare the performance of the procedures with the performance of a mixed-integer programming (MIP) method for assembling multiple parallel test…
Knowledge Support and Automation for Performance Analysis with PerfExplorer 2.0
Huck, Kevin A.; Malony, Allen D.; Shende, Sameer; ...
2008-01-01
The integration of scalable performance analysis in parallel development tools is difficult. The potential size of data sets and the need to compare results from multiple experiments present a challenge to manage and process the information. Simply to characterize the performance of parallel applications running on potentially hundreds of thousands of processor cores requires new scalable analysis techniques. Furthermore, many exploratory analysis processes are repeatable and could be automated, but are now implemented as manual procedures. In this paper, we discuss the current version of PerfExplorer, a performance analysis framework which provides dimension reduction, clustering and correlation analysis of individual trials of large dimensions, and can perform relative performance analysis between multiple application executions. PerfExplorer analysis processes can be captured in the form of Python scripts, automating what would otherwise be time-consuming tasks. We give examples of large-scale analysis results, and discuss the future development of the framework, including the encoding and processing of expert performance rules, and the increasing use of performance metadata.
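The kind of clustering PerfExplorer applies to large sets of per-thread profiles can be sketched in miniature. The k-means routine and the synthetic compute/MPI-wait profiles below are illustrative assumptions, not PerfExplorer's actual API or data:

```python
# Minimal sketch of clustering per-thread performance profiles,
# in the spirit of PerfExplorer's cluster analysis (illustrative only).
import math

def kmeans(points, k=2, iters=20):
    """Plain k-means over metric vectors (one vector per thread)."""
    centers = [list(p) for p in points[:k]]          # deterministic seed
    assign = [0] * len(points)
    for _ in range(iters):
        for i, p in enumerate(points):
            assign[i] = min(range(k), key=lambda c: math.dist(p, centers[c]))
        for c in range(k):
            members = [points[i] for i in range(len(points)) if assign[i] == c]
            if members:
                centers[c] = [sum(col) / len(members) for col in zip(*members)]
    return assign

# Synthetic profiles: (seconds in compute, seconds in MPI wait) per thread.
profiles = [(9.1, 0.8), (9.0, 0.9), (8.9, 1.1),   # compute-bound threads
            (2.0, 7.9), (2.2, 8.1)]               # threads stuck waiting
labels = kmeans(profiles, k=2)
print(labels)
```

Grouping threads this way is what lets an analyst spot, say, a load-imbalanced subset of ranks without inspecting every profile by hand.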
Parallelization of ARC3D with Computer-Aided Tools
NASA Technical Reports Server (NTRS)
Jin, Haoqiang; Hribar, Michelle; Yan, Jerry; Saini, Subhash (Technical Monitor)
1998-01-01
A series of efforts have been devoted to investigating methods of porting and parallelizing applications quickly and efficiently for new architectures, such as the SGI Origin 2000 and Cray T3E. This report presents the parallelization of a CFD application, ARC3D, using the computer-aided tools CAPTools. The steps of parallelizing this code and the requirements for achieving better performance are discussed. The generated parallel version has achieved reasonably good performance, for example, a speedup of 30 on 36 Cray T3E processors. However, this performance could not be obtained without modification of the original serial code. It is suggested that in many cases improving serial code and performing necessary code transformations are important parts of the automated parallelization process, although user intervention in many of these parts is still necessary. Nevertheless, development and improvement of useful software tools, such as CAPTools, can help trim down many tedious parallelization details and improve the processing efficiency.
Identification of Chemical-Genetic Interactions via Parallel Analysis of Barcoded Yeast Strains.
Suresh, Sundari; Schlecht, Ulrich; Xu, Weihong; Miranda, Molly; Davis, Ronald W; Nislow, Corey; Giaever, Guri; St Onge, Robert P
2016-09-01
The Yeast Knockout Collection is a complete set of gene deletion strains for the budding yeast Saccharomyces cerevisiae. In each strain, one of approximately 6000 open-reading frames is replaced with a dominant selectable marker flanked by two DNA barcodes. These barcodes, which are unique to each gene, allow the growth of thousands of strains to be individually measured from a single pooled culture. The collection, and other resources that followed, has ushered in a new era in chemical biology, enabling unbiased and systematic identification of chemical-genetic interactions (CGIs) with remarkable ease. CGIs link bioactive compounds to biological processes, and hence can reveal the mechanism of action of growth-inhibitory compounds in vivo, including those of antifungal, antibiotic, and anticancer drugs. The chemogenomic profiling method described here measures the sensitivity induced in yeast heterozygous and homozygous deletion strains in the presence of a chemical inhibitor of growth (termed haploinsufficiency profiling and homozygous profiling, respectively, or HIPHOP). The protocol is both scalable and amenable to automation. After competitive growth of Yeast Knockout Collection cultures, with and without chemical inhibitors, CGIs can be identified and quantified using either array- or sequencing-based approaches as described here. © 2016 Cold Spring Harbor Laboratory Press.
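The downstream scoring of chemical-genetic interactions from pooled barcode counts can be sketched roughly as a log-ratio of strain abundances. The strain names, counts, and pseudocount normalization below are invented for illustration and are not the published HIPHOP pipeline:

```python
# Hedged sketch of scoring strain sensitivity from pooled barcode counts
# (counts are synthetic; the real pipeline includes normalization steps
# not shown here).
import math

def sensitivity_scores(control, treated, pseudo=1):
    """log2 depletion of each strain in the drug pool vs. the control pool."""
    ctl_total = sum(control.values())
    trt_total = sum(treated.values())
    scores = {}
    for strain in control:
        c = (control[strain] + pseudo) / ctl_total
        t = (treated.get(strain, 0) + pseudo) / trt_total
        scores[strain] = math.log2(c / t)   # > 0 means depleted by the drug
    return scores

control = {"erg11/ERG11": 900, "his3/HIS3": 1000, "ade2/ADE2": 1100}
treated = {"erg11/ERG11": 90,  "his3/HIS3": 1050, "ade2/ADE2": 1150}
scores = sensitivity_scores(control, treated)
hit = max(scores, key=scores.get)
print(hit)
```

A strain strongly depleted under drug treatment, as the hypothetical erg11 heterozygote is here, points to the deleted gene's product as a candidate drug target.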
Automated Cooperative Trajectories
NASA Technical Reports Server (NTRS)
Hanson, Curt; Pahle, Joseph; Brown, Nelson
2015-01-01
This presentation is an overview of the Automated Cooperative Trajectories project. An introduction to the phenomena of wake vortices is given, along with a summary of past research into the possibility of extracting energy from the wake by flying close parallel trajectories. Challenges and barriers to adoption of civilian automatic wake surfing technology are identified. A hardware-in-the-loop simulation is described that will support future research. Finally, a roadmap for future research and technology transition is proposed.
Transfection in perfused microfluidic cell culture devices: A case study.
Raimes, William; Rubi, Mathieu; Super, Alexandre; Marques, Marco P C; Veraitch, Farlan; Szita, Nicolas
2017-08-01
Automated microfluidic devices are a promising route towards a point-of-care autologous cell therapy. The initial steps of induced pluripotent stem cell (iPSC) derivation involve transfection and long term cell culture. Integration of these steps would help reduce the cost and footprint of micro-scale devices with applications in cell reprogramming or gene correction. Current examples of transfection integration focus on maximising efficiency rather than viable long-term culture. Here we look for whole process compatibility by integrating automated transfection with a perfused microfluidic device designed for homogeneous culture conditions. The injection process was characterised using fluorescein to establish a LabVIEW-based routine for user-defined automation. Proof-of-concept is demonstrated by chemically transfecting a GFP plasmid into mouse embryonic stem cells (mESCs). Cells transfected in the device showed an improvement in efficiency (34%, n = 3) compared with standard protocols (17.2%, n = 3). This represents a first step towards microfluidic processing systems for cell reprogramming or gene therapy.
The Automated Instrumentation and Monitoring System (AIMS): Design and Architecture. 3.2
NASA Technical Reports Server (NTRS)
Yan, Jerry C.; Schmidt, Melisa; Schulbach, Cathy; Bailey, David (Technical Monitor)
1997-01-01
Whether a researcher is designing the 'next parallel programming paradigm', another 'scalable multiprocessor' or investigating resource allocation algorithms for multiprocessors, a facility that enables parallel program execution to be captured and displayed is invaluable. Careful analysis of such information can help computer and software architects to capture, and therefore exploit, behavioral variations among/within various parallel programs to take advantage of specific hardware characteristics. A software tool-set that facilitates performance evaluation of parallel applications on multiprocessors has been put together at NASA Ames Research Center under the sponsorship of NASA's High Performance Computing and Communications Program over the past five years. The Automated Instrumentation and Monitoring System (AIMS) has three major software components: a source code instrumentor which automatically inserts active event recorders into program source code before compilation; a run-time performance monitoring library which collects performance data; and a visualization tool-set which reconstructs program execution based on the data collected. Besides being used as a prototype for developing new techniques for instrumenting, monitoring and presenting parallel program execution, AIMS is also being incorporated into the run-time environments of various hardware testbeds to evaluate their impact on user productivity. Currently, the execution of FORTRAN and C programs on the Intel Paragon and PALM workstations can be automatically instrumented and monitored. Performance data thus collected can be displayed graphically on various workstations. The process of performance tuning with AIMS will be illustrated using various NAS Parallel Benchmarks. This report includes a description of the internal architecture of AIMS and a listing of the source code.
ERIC Educational Resources Information Center
Boekkooi-Timminga, Ellen
Nine methods for automated test construction are described. All are based on the concepts of information from item response theory. Two general kinds of methods for the construction of parallel tests are presented: (1) sequential test design; and (2) simultaneous test design. Sequential design implies that the tests are constructed one after the…
StrAuto: automation and parallelization of STRUCTURE analysis.
Chhatre, Vikram E; Emerson, Kevin J
2017-03-24
Population structure inference using the software STRUCTURE has become an integral part of population genetic studies covering a broad spectrum of taxa, including humans. The ever-expanding size of genetic data sets poses computational challenges for this analysis. Although at least one tool currently implements parallel computing to reduce the computational burden of this analysis, it does not fully automate the use of replicate STRUCTURE runs required for downstream inference of the optimal K. There is a pressing need for a tool that can deploy population structure analysis on high-performance computing clusters. We present an updated version of the popular Python program StrAuto to streamline population structure analysis using parallel computing. StrAuto implements a pipeline that combines STRUCTURE analysis with the Evanno ΔK analysis and visualization of results using STRUCTURE HARVESTER. Using benchmarking tests, we demonstrate that StrAuto significantly reduces the computational time needed to perform iterative STRUCTURE analysis by distributing runs over two or more processors. StrAuto is the first tool to integrate STRUCTURE analysis with post-processing using a pipeline approach, in addition to implementing parallel computation, a setup ideal for deployment on computing clusters. StrAuto is distributed under the GNU GPL (General Public License) and is available to download from http://strauto.popgen.org.
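The Evanno ΔK statistic that StrAuto automates downstream of replicate STRUCTURE runs is simple enough to sketch directly. The replicate ln P(X|K) values below are made up for illustration, and this is not StrAuto's own code:

```python
# ΔK = |mean L(K+1) - 2·mean L(K) + mean L(K-1)| / sd(L(K)),
# computed over replicate STRUCTURE runs (synthetic values shown).
from statistics import mean, stdev

def evanno_delta_k(lnp):
    """lnp: dict mapping K -> list of replicate ln P(X|K) values.
    Returns dict mapping K -> ΔK (defined only for interior K)."""
    ks = sorted(lnp)
    out = {}
    for k in ks[1:-1]:
        second_diff = abs(mean(lnp[k + 1]) - 2 * mean(lnp[k]) + mean(lnp[k - 1]))
        out[k] = second_diff / stdev(lnp[k])
    return out

replicates = {1: [-4100, -4102, -4098],
              2: [-3200, -3205, -3195],   # big jump in fit at K=2...
              3: [-3150, -3160, -3140],
              4: [-3145, -3150, -3148]}   # ...then a plateau
dk = evanno_delta_k(replicates)
best_k = max(dk, key=dk.get)
print(best_k)
```

The sharp improvement in likelihood going to K=2, followed by a plateau, is exactly the signature ΔK is designed to pick out.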
Role of the Controller in an Integrated Pilot-Controller Study for Parallel Approaches
NASA Technical Reports Server (NTRS)
Verma, Savvy; Kozon, Thomas; Ballinger, Debbi; Lozito, Sandra; Subramanian, Shobana
2011-01-01
Closely spaced parallel runway operations have been found to increase capacity within the National Airspace System, but poor visibility conditions reduce the use of these operations [1]. Previous research examined the concepts and procedures related to parallel runways [2][4][5]. However, there has been no investigation of the procedures associated with the strategic and tactical pairing of aircraft for these operations. This study developed and examined the pilot's and controller's procedures and information requirements for creating aircraft pairs for closely spaced parallel runway operations. The goal was to achieve aircraft pairing with a temporal separation of 15 s (+/- 10 s error) at a coupling point 12 nmi from the runway threshold. In this paper, the role of the controller, as examined in an integrated study of controllers and pilots, is presented. The controllers utilized a pairing scheduler and new pairing interfaces to help create and maintain aircraft pairs in a high-fidelity, human-in-the-loop simulation experiment. Results show that the controllers worked as a team to achieve pairing between aircraft, and the level of inter-controller coordination increased when the aircraft in a pair belonged to different sectors. Controller feedback revealed neither over-reliance on nor complacency with the pairing automation or pairing procedures.
SequenceL: Automated Parallel Algorithms Derived from CSP-NT Computational Laws
NASA Technical Reports Server (NTRS)
Cooke, Daniel; Rushton, Nelson
2013-01-01
With the introduction of new parallel architectures like the Cell and multicore chips from IBM, Intel, AMD, and ARM, as well as the petascale processing available for high-end computing, a larger number of programmers will need to write parallel codes. Adding the parallel control structure to the sequence, selection, and iterative control constructs increases the complexity of code development, which often results in increased development costs and decreased reliability. SequenceL is a high-level programming language, that is, a programming language closer to a human's way of thinking than to a machine's. Historically, high-level languages have resulted in decreased development costs and increased reliability, at the expense of performance. In recent applications at JSC and in industry, SequenceL has demonstrated the usual advantages of high-level programming in terms of low cost and high reliability. SequenceL programs, however, have run at speeds typically comparable with, and in many cases faster than, their counterparts written in C and C++ when run on single-core processors. Moreover, SequenceL is able to generate parallel executables automatically for multicore hardware, gaining parallel speedups without any extra effort from the programmer beyond what is required to write the sequential/single-core code. A SequenceL-to-C++ translator has been developed that automatically renders readable multithreaded C++ from a combination of a SequenceL program and sample data input. The SequenceL language is based on two fundamental computational laws, Consume-Simplify-Produce (CSP) and Normalize-Transpose (NT), which enable it to automate the creation of parallel algorithms from high-level code that has no annotations of parallelism whatsoever.
In our anecdotal experience, SequenceL development has been in every case less costly than development of the same algorithm in sequential (that is, single-core, single process) C or C++, and an order of magnitude less costly than development of comparable parallel code. Moreover, SequenceL not only automatically parallelizes the code, but since it is based on CSP-NT, it is provably race free, thus eliminating the largest quality challenge the parallelized software developer faces.
Workload Capacity: A Response Time-Based Measure of Automation Dependence.
Yamani, Yusuke; McCarley, Jason S
2016-05-01
An experiment used the workload capacity measure C(t) to quantify the processing efficiency of human-automation teams and identify operators' automation usage strategies in a speeded decision task. Although response accuracy rates and related measures are often used to measure the influence of an automated decision aid on human performance, aids can also influence response speed. Mean response times (RTs), however, conflate the influence of the human operator and the automated aid on team performance and may mask changes in the operator's performance strategy under aided conditions. The present study used a measure of parallel processing efficiency, or workload capacity, derived from empirical RT distributions as a novel gauge of human-automation performance and automation dependence in a speeded task. Participants performed a speeded probabilistic decision task with and without the assistance of an automated aid. RT distributions were used to calculate two variants of a workload capacity measure, C_OR(t) and C_AND(t). Capacity measures gave evidence that a diagnosis from the automated aid speeded human participants' responses, and that participants did not moderate their own decision times in anticipation of diagnoses from the aid. Workload capacity provides a sensitive and informative measure of human-automation performance and operators' automation dependence in speeded tasks. © 2016, Human Factors and Ergonomics Society.
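The OR variant of the capacity coefficient can be estimated from empirical RT distributions as a ratio of integrated hazards, C_OR(t) = H_AB(t) / (H_A(t) + H_B(t)) with H(t) = -ln S(t). A minimal sketch, using synthetic RT samples rather than the study's data:

```python
# Estimate C_OR(t) from empirical survivor functions of response times.
# RT samples below are invented for illustration.
import math

def integrated_hazard(rts, t):
    surv = sum(1 for rt in rts if rt > t) / len(rts)  # empirical S(t)
    return -math.log(surv)                            # H(t) = -ln S(t)

def c_or(rts_team, rts_a, rts_b, t):
    return integrated_hazard(rts_team, t) / (
        integrated_hazard(rts_a, t) + integrated_hazard(rts_b, t))

# Synthetic RTs (ms): an aided "team" condition and two single-source baselines.
team = [420, 450, 480, 510, 540, 600, 640, 700, 760, 900]
alone_a = [500, 560, 620, 680, 740, 800, 860, 920, 980, 1100]
alone_b = [520, 580, 640, 700, 760, 820, 880, 940, 1000, 1150]
print(round(c_or(team, alone_a, alone_b, 700), 2))
```

A value above 1 at a given time indicates the team responds more efficiently than an unlimited-capacity parallel race between the two sources would predict; values below 1 indicate limited capacity.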
Development of design principles for automated systems in transport control.
Balfe, Nora; Wilson, John R; Sharples, Sarah; Clarke, Theresa
2012-01-01
This article reports the results of a qualitative study investigating attitudes towards and opinions of an advanced automation system currently used in UK rail signalling. In-depth interviews were held with 10 users, key issues associated with automation were identified and the automation's impact on the signalling task investigated. The interview data highlighted the importance of the signallers' understanding of the automation and their (in)ability to predict its outputs. The interviews also covered the methods used by signallers to interact with and control the automation, and the perceived effects on their workload. The results indicate that despite a generally low level of understanding and ability to predict the actions of the automation system, signallers have developed largely successful coping mechanisms that enable them to use the technology effectively. These findings, along with parallel work identifying desirable attributes of automation from the literature in the area, were used to develop 12 principles of automation which can be used to help design new systems which better facilitate cooperative working. The work reported in this article was completed with the active involvement of operational rail staff who regularly use automated systems in rail signalling. The outcomes are currently being used to inform decisions on the extent and type of automation and user interfaces in future generations of rail control systems.
The State of Planning of Automation Projects in the Libraries of Canada.
ERIC Educational Resources Information Center
Clement, Hope E. A.
Library automation in Canada is complicated by the large size, dispersed population, and cultural diversity of the country. The National Library of Canada is actively planning a Canadian library network based on national bibliographic services for which the library is now developing automated systems. Canadian libraries are involved in the…
ERIC Educational Resources Information Center
Goh, Jonathan Wee Pin
2009-01-01
With the global economy becoming more integrated, the issues of cross-cultural relevance and transferability of leadership theories and practices have become increasingly urgent. Drawing upon the concept of parallel leadership in schools proposed by Crowther, Kaagan, Ferguson, and Hann as an example, the purpose of this paper is to examine the…
Inventory management and reagent supply for automated chemistry.
Kuzniar, E
1999-08-01
Developments in automated chemistry have kept pace with developments in HTS such that hundreds of thousands of new compounds can be rapidly synthesized in the belief that the greater the number and diversity of compounds that can be screened, the more successful HTS will be. The increasing use of automation for Multiple Parallel Synthesis (MPS) and the move to automated combinatorial library production is placing an overwhelming burden on the management of reagents. Although automation has improved the efficiency of the processes involved in compound synthesis, the bottleneck has shifted to ordering, collating and preparing reagents for automated chemistry resulting in loss of time, materials and momentum. Major efficiencies have already been made in the area of compound management for high throughput screening. Most of these efficiencies have been achieved with sophisticated library management systems using advanced engineering and data handling for the storage, tracking and retrieval of millions of compounds. The Automation Partnership has already provided many of the top pharmaceutical companies with modular automated storage, preparation and retrieval systems to manage compound libraries for high throughput screening. This article describes how these systems may be implemented to solve the specific problems of inventory management and reagent supply for automated chemistry.
Comparison of methods for the identification of microorganisms isolated from blood cultures.
Monteiro, Aydir Cecília Marinho; Fortaleza, Carlos Magno Castelo Branco; Ferreira, Adriano Martison; Cavalcante, Ricardo de Souza; Mondelli, Alessandro Lia; Bagagli, Eduardo; da Cunha, Maria de Lourdes Ribeiro de Souza
2016-08-05
Bloodstream infections are responsible for thousands of deaths each year. The rapid identification of the microorganisms causing these infections permits correct therapeutic management that will improve the prognosis of the patient. In an attempt to reduce the time spent on this step, microorganism identification devices have been developed, including the VITEK® 2 system, which is currently used in routine clinical microbiology laboratories. This study evaluated the accuracy of the VITEK® 2 system in the identification of 400 microorganisms isolated from blood cultures and compared the results to those obtained with conventional phenotypic and genotypic methods. In parallel to the phenotypic identification methods, the DNA of these microorganisms was extracted directly from the blood culture bottles for genotypic identification by the polymerase chain reaction (PCR) and DNA sequencing. The automated VITEK® 2 system correctly identified 94.7% (379/400) of the isolates. The YST and GN cards resulted in 100% correct identifications of yeasts (15/15) and Gram-negative bacilli (165/165), respectively. The GP card correctly identified 92.6% (199/215) of Gram-positive cocci, while the ANC card was unable to correctly identify any Gram-positive bacilli (0/5). The performance of the VITEK® 2 system was considered acceptable and statistical analysis showed that the system is a suitable option for routine clinical microbiology laboratories to identify different microorganisms.
NASA Technical Reports Server (NTRS)
2003-01-01
Topics covered include: Stable, Thermally Conductive Fillers for Bolted Joints; Connecting to Thermocouples with Fewer Lead Wires; Zipper Connectors for Flexible Electronic Circuits; Safety Interlock for Angularly Misdirected Power Tool; Modular, Parallel Pulse-Shaping Filter Architectures; High-Fidelity Piezoelectric Audio Device; Photovoltaic Power Station with Ultracapacitors for Storage; Time Analyzer for Time Synchronization and Monitor of the Deep Space Network; Program for Computing Albedo; Integrated Software for Analyzing Designs of Launch Vehicles; Abstract-Reasoning Software for Coordinating Multiple Agents; Software Searches for Better Spacecraft-Navigation Models; Software for Partly Automated Recognition of Targets; Antistatic Polycarbonate/Copper Oxide Composite; Better VPS Fabrication of Crucibles and Furnace Cartridges; Burn-Resistant, Strong Metal-Matrix Composites; Self-Deployable Spring-Strip Booms; Explosion Welding for Hermetic Containerization; Improved Process for Fabricating Carbon Nanotube Probes; Automated Serial Sectioning for 3D Reconstruction; and Parallel Subconvolution Filtering Architectures.
Qi, Xin; Xing, Fuyong; Foran, David J.; Yang, Lin
2013-01-01
Automated image analysis of histopathology specimens could potentially provide support for early detection and improved characterization of breast cancer. Automated segmentation of the cells comprising imaged tissue microarrays (TMA) is a prerequisite for any subsequent quantitative analysis. Unfortunately, crowding and overlapping of cells present significant challenges for most traditional segmentation algorithms. In this paper, we propose a novel algorithm which can reliably separate touching cells in hematoxylin-stained breast TMA specimens which have been acquired using a standard RGB camera. The algorithm is composed of two steps. It begins with a fast, reliable object center localization approach which utilizes single-path voting followed by mean-shift clustering. Next, the contour of each cell is obtained using a level set algorithm based on an interactive model. We compared the experimental results with those reported in the most current literature. Finally, performance was evaluated by comparing the pixel-wise accuracy provided by human experts with that produced by the new automated segmentation algorithm. The method was systematically tested on 234 image patches exhibiting dense overlap and containing more than 2200 cells. It was also tested on whole slide images including blood smears and tissue microarrays containing thousands of cells. Since the voting step of the seed detection algorithm is well suited for parallelization, a parallel version of the algorithm was implemented using graphics processing units (GPUs), which resulted in significant speed-up over the C/C++ implementation. PMID:22167559
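The mean-shift step that groups voting responses into cell-center seeds can be sketched as follows. The vote coordinates and bandwidth are invented, and the paper's actual pipeline adds single-path voting and a level-set contour stage not shown here:

```python
# Toy mean-shift clustering of candidate cell-center votes (illustrative).
import math

def mean_shift(points, bandwidth=2.0, iters=50):
    """Shift each point to the mean of its neighbours until it settles."""
    modes = [list(p) for p in points]
    for _ in range(iters):
        for i, m in enumerate(modes):
            nbrs = [p for p in points if math.dist(p, m) <= bandwidth]
            modes[i] = [sum(c) / len(nbrs) for c in zip(*nbrs)]
    # Merge modes that converged to (nearly) the same location.
    centers = []
    for m in modes:
        if not any(math.dist(m, c) < bandwidth / 2 for c in centers):
            centers.append(m)
    return centers

# Votes from two touching cells: two tight clumps of candidate centers.
votes = [(10, 10), (11, 10), (10, 11), (11, 11),
         (18, 12), (19, 12), (18, 13), (19, 13)]
print(len(mean_shift(votes)))
```

Because each point's shift is independent of the others within an iteration, this step parallelizes naturally, which is what made the GPU implementation worthwhile.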
Laser Scanner For Automatic Storage
NASA Astrophysics Data System (ADS)
Carvalho, Fernando D.; Correia, Bento A.; Rebordao, Jose M.; Rodrigues, F. Carvalho
1989-01-01
Automated magazines are being used in industry more and more. One of the problems related to the automation of a storehouse is the identification of the products involved. Already used for stock management, bar codes allow an easy way to identify a product. Applied to automated magazines, bar codes allow a great variety of items to be represented in a small code. In order to be used by the national producers of automated magazines, a dedicated laser scanner has been developed. The prototype uses a He-Ne laser whose beam scans a field angle of 75 degrees at 16 Hz. The scene reflectivity is transduced by a photodiode into an electrical signal, which is then binarized. This digital signal is the input to the decoding program. The machine is able to see bar codes and to decode the information. A parallel interface allows communication with the central unit, which is responsible for the management of the automated magazine.
Automation, parallelism, and robotics for proteomics.
Alterovitz, Gil; Liu, Jonathan; Chow, Jijun; Ramoni, Marco F
2006-07-01
The speed of the human genome project (Lander, E. S., Linton, L. M., Birren, B., Nusbaum, C. et al., Nature 2001, 409, 860-921) was made possible, in part, by developments in automation of sequencing technologies. Before these technologies, sequencing was a laborious, expensive, and personnel-intensive task. Similarly, automation and robotics are changing the field of proteomics today. Proteomics is defined as the effort to understand and characterize proteins in the categories of structure, function and interaction (Englbrecht, C. C., Facius, A., Comb. Chem. High Throughput Screen. 2005, 8, 705-715). As such, this field nicely lends itself to automation technologies since these methods often require large economies of scale in order to achieve cost and time-saving benefits. This article describes some of the technologies and methods being applied in proteomics in order to facilitate automation within the field as well as in linking proteomics-based information with other related research areas.
Integrated microfluidic devices for combinatorial cell-based assays.
Yu, Zeta Tak For; Kamei, Ken-ichiro; Takahashi, Hiroko; Shu, Chengyi Jenny; Wang, Xiaopu; He, George Wenfu; Silverman, Robert; Radu, Caius G; Witte, Owen N; Lee, Ki-Bum; Tseng, Hsian-Rong
2009-06-01
The development of miniaturized cell culture platforms for performing parallel cultures and combinatorial assays is important in cell biology from the single-cell level to the system level. In this paper we developed an integrated microfluidic cell-culture platform, Cell-microChip (Cell-μChip), for parallel analyses of the effects of microenvironmental cues (i.e., culture scaffolds) on different mammalian cells and their cellular responses to external stimuli. As a model study, we demonstrated the ability of culturing and assaying several mammalian cells, such as NIH 3T3 fibroblast, B16 melanoma and HeLa cell lines, in a parallel way. For functional assays, first we tested drug-induced apoptotic responses from different cell lines. As a second functional assay, we performed "on-chip" transfection of a reporter gene encoding an enhanced green fluorescent protein (EGFP) followed by live-cell imaging of transcriptional activation of cyclooxygenase 2 (Cox-2) expression. Collectively, our Cell-μChip approach demonstrated the capability to carry out parallel operations and the potential to further integrate advanced functions and applications in the broader space of combinatorial chemistry and biology.
Design of a MIMD neural network processor
NASA Astrophysics Data System (ADS)
Saeks, Richard E.; Priddy, Kevin L.; Pap, Robert M.; Stowell, S.
1994-03-01
The Accurate Automation Corporation (AAC) neural network processor (NNP) module is a fully programmable multiple instruction multiple data (MIMD) parallel processor optimized for the implementation of neural networks. The AAC NNP design fully exploits the intrinsic sparseness of neural network topologies. Moreover, by using a MIMD parallel processing architecture one can update multiple neurons in parallel with efficiency approaching 100 percent as the size of the network increases. Each AAC NNP module has 8 K neurons and 32 K interconnections and is capable of 140,000,000 connections per second with an eight processor array capable of over one billion connections per second.
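The sparse update that a design like the AAC NNP exploits can be illustrated with an adjacency-list sweep, where work scales with the number of connections rather than with the square of the neuron count. The weights and tanh activation below are assumptions for illustration, not the processor's actual arithmetic:

```python
# Sketch of a sparse synchronous neuron update: each neuron stores only
# its incoming connections, so cost tracks the synapse count, not N².
import math

# incoming[j] = list of (source neuron i, weight w_ij); illustrative topology.
incoming = {0: [],
            1: [(0, 0.5)],
            2: [(0, -1.0), (1, 2.0)]}

def update(state):
    """One synchronous sweep; neurons with no inputs keep their state."""
    new = state[:]
    for j, conns in incoming.items():
        if conns:
            new[j] = math.tanh(sum(state[i] * w for i, w in conns))
    return new

state = update([1.0, 0.0, 0.0])
print([round(s, 3) for s in state])
```

On a MIMD array, disjoint groups of neurons can be swept by different processors in parallel, which is why efficiency approaches 100 percent as the network grows.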
Development of an automated MODS plate reader to detect early growth of Mycobacterium tuberculosis.
Comina, G; Mendoza, D; Velazco, A; Coronel, J; Sheen, P; Gilman, R H; Moore, D A J; Zimic, M
2011-06-01
In this work, an automated microscopic observation drug susceptibility (MODS) plate reader has been developed. The reader automatically handles MODS plates and, after autofocussing, acquires digital images of the characteristic microscopic cording structures of Mycobacterium tuberculosis, which the MODS technique uses to detect tuberculosis and multidrug-resistant tuberculosis. In conventional MODS, trained technicians manually move the MODS plate on the stage of an inverted microscope while trying to locate and focus upon the characteristic microscopic cording colonies. In centres with high tuberculosis diagnostic demand, sufficient time may not be available to adequately examine all cultures. An automated reader would reduce labour time and the handling of M. tuberculosis cultures by laboratory personnel. Two hundred MODS culture images (100 from tuberculosis-positive and 100 from tuberculosis-negative sputum samples, confirmed by a standard MODS reading using a commercial microscope) were acquired randomly using the automated MODS plate reader. A specialist analysed these digital images with the help of a personal computer and designated them as M. tuberculosis present or absent. The specialist considered four images insufficiently clear to permit a definitive reading. The readings from the 196 valid images resulted in 100% agreement with the conventional nonautomated standard reading. The automated MODS plate reader combined with open-source MODS pattern recognition software provides a novel platform for high-throughput automated tuberculosis diagnosis. © 2011 The Authors. Journal of Microscopy © 2011 Royal Microscopical Society.
1979-12-01
Fragmented excerpt (outline and body text): Increased Automation; ATC Responsibility for Weather Avoidance; Additional Simulation Needed for ATARS, BCAS, and CDTI; Safety Impacts of… The text discusses the Discrete Address Beacon System (DABS)/Data Link, the Automated Traffic Advisory and Resolution Service (ATARS), and the Cockpit Display of Terminal Information (CDTI), and states that down-linking of air-derived data should not be considered either to enhance ATARS or to be the basis for more closely spaced IFR approaches to parallel…
Study of living single cells in culture: automated recognition of cell behavior.
Bodin, P; Papin, S; Meyer, C; Travo, P
1988-07-01
An automated system capable of analyzing the behavior, in real time, of single living cells in culture, in a noninvasive and nondestructive way, has been developed. A large number of cell positions in single culture dishes were recorded using a computer-controlled, robotized microscope. During subsequent observations, binary images obtained from video image analysis of the microscope visual field allowed the identification of the recorded cells. These cells could be revisited automatically every few minutes. Long-term studies allow the analysis of cellular locomotory and mitotic activities, as well as determination of cell shape (chosen from a defined library), for several hours or days in a fully automated way, with observations spaced up to 30 minutes apart. Short-term studies permit semiautomatic analysis of the acute effects of drugs (5 to 15 minutes) on changes in cell surface area and length.
Weber, Emanuel; Pinkse, Martijn W. H.; Bener-Aksam, Eda; Vellekoop, Michael J.; Verhaert, Peter D. E. M.
2012-01-01
We present a fully automated setup for performing in-line mass spectrometry (MS) analysis of conditioned media in cell cultures, in particular focusing on the peptides therein. The goal is to assess peptides secreted by cells in different culture conditions. The developed system is compatible with MS as the analytical technique, as this is one of the most powerful analysis methods for peptide detection and identification. Proof of concept was achieved using the well-known mating-factor signaling in baker's yeast, Saccharomyces cerevisiae. Our concept system holds 1 mL of cell culture medium and allows maintaining a yeast culture for at least 40 hours with continuous supernatant extraction (and medium replenishing). The device's small dimensions result in reduced costs for reagents and open perspectives towards full integration on-chip. Experimental data that can be obtained are time-resolved peptide profiles in a yeast culture, including information about the appearance of mating-factor-related peptides. We emphasize that the system operates without any manual intervention or pipetting steps, which allows for an improved overall sensitivity compared to non-automated alternatives. MS data confirmed previously reported aspects of the physiology of the yeast mating process. Moreover, mating-factor breakdown products (as well as evidence for a potentially responsible protease) were found. PMID:23091722
Lee, Jae H.; Yao, Yushu; Shrestha, Uttam; Gullberg, Grant T.; Seo, Youngho
2014-01-01
The primary goal of this project is to implement the iterative statistical image reconstruction algorithm, in this case maximum likelihood expectation maximization (MLEM) as used for dynamic cardiac single photon emission computed tomography, on Spark/GraphX. This involves porting the algorithm to run on large-scale parallel computing systems. Spark is an easy-to-program software platform that can handle large amounts of data in parallel. GraphX is a graph analytic system running on top of Spark to handle graph and sparse linear algebra operations in parallel. The main advantage of implementing the MLEM algorithm in Spark/GraphX is that it allows users to parallelize such computation without any expertise in parallel computing or prior knowledge in computer science. In this paper we demonstrate a successful implementation of MLEM in Spark/GraphX and present the performance gains, with the goal to eventually make it usable in a clinical setting. PMID:27081299
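The MLEM update at the heart of the abstract above can be stated in a few lines. The sketch below is a minimal NumPy illustration under stated assumptions: the system matrix and data are invented toy values, not the paper's SPECT model, and `mlem` is our own name, not an API from Spark or GraphX.

```python
import numpy as np

def mlem(A, y, n_iter=500, eps=1e-12):
    """Maximum-likelihood expectation-maximization for y ~ Poisson(A x).

    Each iteration rescales the current estimate by the back-projected
    ratio of measured to predicted counts, normalized by the
    sensitivity A^T 1. (Toy sketch, not the paper's implementation.)
    """
    x = np.ones(A.shape[1])            # uniform, strictly positive start
    sens = A.sum(axis=0)               # sensitivity, A^T 1
    for _ in range(n_iter):
        ratio = y / np.maximum(A @ x, eps)   # measured / predicted counts
        x = x * (A.T @ ratio) / np.maximum(sens, eps)
    return x

# Toy, noise-free example: 3 measurements of 2 unknowns (consistent data).
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
x_true = np.array([2.0, 5.0])
x_hat = mlem(A, A @ x_true)
```

In a Spark/GraphX port, the two sparse products `A @ x` and `A.T @ ratio` would map naturally to message-passing rounds over a bipartite detector-voxel graph, which is what makes a graph engine a plausible host for the algorithm.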
Parallel Education and Defining the Fourth Sector.
ERIC Educational Resources Information Center
Chessell, Diana
1996-01-01
Parallel to the primary, secondary, postsecondary, and adult/community education sectors is education not associated with formal programs--learning in arts and cultural sites. The emergence of cultural and educational tourism is an opportunity for adult/community education to define itself by extending lifelong learning opportunities into parallel…
Choe, Leila H; Lee, Kelvin H
2003-10-01
We investigate one approach to assess the quantitative variability in two-dimensional gel electrophoresis (2-DE) separations based on gel-to-gel variability, sample preparation variability, sample load differences, and the effect of automation on image analysis. We observe that 95% of spots present in three out of four replicate gels exhibit less than a 0.52 coefficient of variation (CV) in fluorescent stain intensity (% volume) for a single sample run on multiple gels. When four parallel sample preparations are performed, this value increases to 0.57. We do not observe any significant change in quantitative value for an increase or decrease in sample load of 30% when using appropriate image analysis variables. Increasing use of automation, while necessary in modern 2-DE experiments, does change the observed level of quantitative and qualitative variability among replicate gels. The number of spots that change qualitatively for a single sample run in parallel varies from a CV = 0.03 for fully manual analysis to CV = 0.20 for a fully automated analysis. We present a systematic method by which a single laboratory can measure gel-to-gel variability using only three gel runs.
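The coefficient of variation used throughout the abstract above is simply the standard deviation divided by the mean. A minimal sketch (the spot readings below are hypothetical, not data from the paper):

```python
def coefficient_of_variation(values):
    """CV = standard deviation / mean (population form), as applied to a
    spot's % volume across replicate gels."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n
    return (var ** 0.5) / mean

# Hypothetical % volume readings of one spot across four replicate gels.
spot = [0.82, 1.10, 0.95, 1.31]
cv = coefficient_of_variation(spot)
```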
Automation of multi-agent control for complex dynamic systems in heterogeneous computational network
NASA Astrophysics Data System (ADS)
Oparin, Gennady; Feoktistov, Alexander; Bogdanova, Vera; Sidorov, Ivan
2017-01-01
The rapid progress of high-performance computing entails new challenges related to solving large scientific problems for various subject domains in a heterogeneous distributed computing environment (e.g., a network, Grid system, or Cloud infrastructure). Specialists in parallel and distributed computing pay special attention to the scalability of applications for problem solving. Effective management of scalable applications in a heterogeneous distributed computing environment remains a non-trivial issue, and control systems that operate in networks are especially affected by it. We propose a new approach to multi-agent management of scalable applications in a heterogeneous computational network. The fundamentals of our approach are the integrated use of conceptual programming, simulation modeling, network monitoring, multi-agent management, and service-oriented programming. We developed a special framework for automating problem solving. Advantages of the proposed approach are demonstrated on an example: parametric synthesis of a static linear regulator for complex dynamic systems. Benefits of the scalable application for solving this problem include automated multi-agent control of the systems in parallel, at various degrees of detail.
PEGASUS 5: An Automated Pre-Processor for Overset-Grid CFD
NASA Technical Reports Server (NTRS)
Suhs, Norman E.; Rogers, Stuart E.; Dietz, William E.; Kwak, Dochan (Technical Monitor)
2002-01-01
An all new, automated version of the PEGASUS software has been developed and tested. PEGASUS provides the hole-cutting and connectivity information between overlapping grids, and is used as the final part of the grid generation process for overset-grid computational fluid dynamics approaches. The new PEGASUS code (Version 5) has many new features: automated hole cutting; a projection scheme for fixing gaps in overset surfaces; more efficient interpolation search methods using an alternating digital tree; hole-size optimization based on adding additional layers of fringe points; and an automatic restart capability. The new code has also been parallelized using the Message Passing Interface standard. The parallelization performance provides efficient speed-up of the execution time by an order of magnitude, and up to a factor of 30 for very large problems. The results of three example cases are presented: a three-element high-lift airfoil, a generic business jet configuration, and a complete Boeing 777-200 aircraft in a high-lift landing configuration. Comparisons of the computed flow fields for the airfoil and 777 test cases between the old and new versions of the PEGASUS codes show excellent agreement with each other and with experimental results.
Kelly, Benjamin J; Fitch, James R; Hu, Yangqiu; Corsmeier, Donald J; Zhong, Huachun; Wetzel, Amy N; Nordquist, Russell D; Newsom, David L; White, Peter
2015-01-20
While advances in genome sequencing technology make population-scale genomics a possibility, current approaches for analysis of these data rely upon parallelization strategies that have limited scalability, complex implementation and lack reproducibility. Churchill, a balanced regional parallelization strategy, overcomes these challenges, fully automating the multiple steps required to go from raw sequencing reads to variant discovery. Through implementation of novel deterministic parallelization techniques, Churchill allows computationally efficient analysis of a high-depth whole genome sample in less than two hours. The method is highly scalable, enabling full analysis of the 1000 Genomes raw sequence dataset in a week using cloud resources. http://churchill.nchri.org/.
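A balanced regional parallelization strategy rests on splitting the genome into contiguous regions of comparable size so that workers finish at roughly the same time. The sketch below illustrates the idea only; it is not Churchill's actual algorithm, and the mini-genome is invented.

```python
def balanced_regions(chrom_lengths, target_size):
    """Chop each chromosome into contiguous windows of at most target_size
    bases; workers can then process windows independently. (Toy version:
    ignores read depth and variant-calling boundary effects.)"""
    regions = []
    for chrom, length in chrom_lengths.items():
        start = 0
        while start < length:
            end = min(start + target_size, length)
            regions.append((chrom, start, end))
            start = end
    return regions

# Hypothetical mini-genome (base-pair lengths).
genome = {"chr1": 2_500_000, "chr2": 1_200_000}
regions = balanced_regions(genome, target_size=1_000_000)
```

Real pipelines must also handle reads and variants that straddle region boundaries, which is part of what makes a deterministic regional strategy non-trivial.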
Kotwal, Aarti; Biswas, Debasis; Raghuvanshi, Shailendra; Sindhwani, Girish; Kakati, Barnali; Sharma, Shweta
2017-04-01
The diagnosis of smear-negative pulmonary tuberculosis (PTB) is particularly challenging, and automated liquid culture and molecular line probe assays (LPA) may prove particularly useful. The objective of our study was to evaluate the diagnostic potential of automated liquid culture (ALC) technology and commercial LPA in sputum smear-negative PTB suspects. Spot sputum samples were collected from 145 chest-symptomatic smear-negative patients and subjected to ALC, direct drug susceptibility testing (DST) and LPA, as per the manufacturers' instructions. A diagnostic yield of 26.2% was observed among sputum smear-negative TB suspects, with 47.4% of the culture isolates being INH- and/or rifampicin-resistant. Complete agreement was observed between the results of the ALC assay and LPA, except for two isolates which demonstrated sensitivity to INH and rifampicin at direct DST but were rifampicin-resistant by LPA. Two novel mutations were also detected among the multidrug-resistant isolates by LPA. In view of the challenges associated with the diagnosis of TB in sputum smear-negative patients, our study demonstrates the applicability of ALC and LPA in establishing diagnostic evidence of TB.
Karger, Barry L.; Kotler, Lev; Foret, Frantisek; Minarik, Marek; Kleparnik, Karel
2003-12-09
A modular multiple lane or capillary electrophoresis (chromatography) system that permits automated parallel separation and comprehensive collection of all fractions from samples in all lanes or columns, with the option of further on-line automated sample fraction analysis, is disclosed. Preferably, fractions are collected in a multi-well fraction collection unit, or plate (40). The multi-well collection plate (40) is preferably made of a solvent permeable gel, most preferably a hydrophilic, polymeric gel such as agarose or cross-linked polyacrylamide.
Rapid and sensitive detection of hepatitis A virus in representative food matrices.
Papafragkou, Efstathia; Plante, Michelle; Mattison, Kirsten; Bidawid, Sabah; Karthikeyan, Kalavethi; Farber, Jeffrey M; Jaykus, Lee-Ann
2008-01-01
Hepatitis A virus (HAV) is an important cause of foodborne disease worldwide. The detection of this virus in naturally contaminated food products is complicated by the absence of a reliable culture method, low levels of contamination, and the presence of matrix-associated compounds which inhibit molecular detection. In this study, we report a novel method to concentrate HAV from foods prior to the application of reverse transcription-PCR (RT-PCR) for detection. Specifically, we used cationically charged magnetic particles with an automated capture system (Pathatrix) to concentrate the virus from 25 g samples of artificially contaminated lettuce, strawberries, green onions, deli-turkey, oysters, and cake with frosting. Detection limits varied according to the product but in most cases, the virus could be consistently detected at input levels corresponding to 10^2 PFU per 25 g food sample. For some products, detection was possible at levels as low as 10^-1 PFU per 25 g. The assay was applied by a second independent laboratory and was also used to confirm viral contamination of produce items associated with a recent HAV outbreak. Parallel infectivity assays demonstrated that the cationically charged particles bound approximately 50% of the input virus. This is the first application of the automated magnetic capture technology to the concentration of viruses from foods, and it offers promise for facilitating the rapid detection of HAV from naturally contaminated products.
MPI, HPF or OpenMP: A Study with the NAS Benchmarks
NASA Technical Reports Server (NTRS)
Jin, Hao-Qiang; Frumkin, Michael; Hribar, Michelle; Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)
1999-01-01
Porting applications to new high performance parallel and distributed platforms is a challenging task. Writing parallel code by hand is time consuming and costly, but the task can be simplified by high level languages and would even better be automated by parallelizing tools and compilers. The definition of the HPF (High Performance Fortran, based on the data parallel model) and OpenMP (based on the shared memory parallel model) standards has offered great opportunity in this respect. Both provide simple and clear interfaces to languages like Fortran and simplify many tedious tasks encountered in writing message passing programs. In our study we implemented the parallel versions of the NAS Benchmarks with HPF and OpenMP directives. Comparison of their performance with the MPI implementation and the pros and cons of the different approaches will be discussed, along with experience of using computer-aided tools to help parallelize these benchmarks. Based on the study, the potential of applying some of the techniques to realistic aerospace applications will be presented.
Basic research planning in mathematical pattern recognition and image analysis
NASA Technical Reports Server (NTRS)
Bryant, J.; Guseman, L. F., Jr.
1981-01-01
Fundamental problems encountered while attempting to develop automated techniques for applications of remote sensing are discussed under the following categories: (1) geometric and radiometric preprocessing; (2) spatial, spectral, temporal, syntactic, and ancillary digital image representation; (3) image partitioning, proportion estimation, and error models in object scene inference; (4) parallel processing and image data structures; and (5) continuing studies in polarization; computer architectures and parallel processing; and the applicability of "expert systems" to interactive analysis.
2017-04-13
Fragmented excerpt: the report describes several applications ported to the OmpSs model (a basic image-processing algorithm, a mini-application representative of an ocean modelling code, a parallel benchmark, and a communication-avoiding version of the QR algorithm), several improvements to the OmpSs model, a port of the dynamic load balancing library to OmpSs, and several updates to the tools infrastructure.
Paramedir: A Tool for Programmable Performance Analysis
NASA Technical Reports Server (NTRS)
Jost, Gabriele; Labarta, Jesus; Gimenez, Judit
2004-01-01
Performance analysis of parallel scientific applications is time consuming and requires great expertise in areas such as programming paradigms, system software, and computer hardware architectures. In this paper we describe a tool that facilitates the programmability of performance metric calculations thereby allowing the automation of the analysis and reducing the application development time. We demonstrate how the system can be used to capture knowledge and intuition acquired by advanced parallel programmers in order to be transferred to novice users.
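A programmable performance metric of the kind described above can be sketched as a user-supplied filter plus a user-supplied aggregation over trace records. The record schema and numbers below are invented for illustration and do not reflect Paramedir's actual trace format:

```python
# Each toy trace record: (thread_id, state, duration_us).
def metric(trace, predicate, value=lambda rec: rec[2]):
    """A 'programmable metric': keep the trace records selected by a
    user-supplied predicate and aggregate a user-supplied value over
    the survivors."""
    return sum(value(rec) for rec in trace if predicate(rec))

trace = [
    (0, "compute", 700), (0, "mpi_wait", 300),
    (1, "compute", 550), (1, "mpi_wait", 450),
]
busy = metric(trace, lambda r: r[1] == "compute")
total = metric(trace, lambda r: True)
efficiency = busy / total   # fraction of total time spent computing
```

Encapsulating the metric as data (predicate + aggregation) rather than code is what lets an expert's intuition be captured once and reused by novice users.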
Gordon, Sarah M; Elegino-Steffens, Diane U; Agee, Willie; Barnhill, Jason; Hsue, Gunther
2013-01-01
Upper respiratory tract infections (URIs) can be a serious burden to the healthcare system. The majority of URIs are viral in etiology, but definitive diagnosis can prove difficult due to the frequently overlapping clinical presentations of viral and bacterial infections, and the variable sensitivity and lengthy turn-around time of viral culture. We tested a new automated nested multiplex PCR technology, the FilmArray® system, in the TAMC Department of Clinical Investigations to determine the feasibility of replacing standard viral culture with a rapid turn-around system. We conducted a single-blinded feasibility study comparing PCR results with archived viral culture results from a convenience sample of cryopreserved nasopharyngeal swabs from acutely ill ED patients who presented with complaints of URI symptoms. A total of 61 archived samples were processed. Viral culture had previously identified 31 positive specimens among these samples. The automated nested multiplex PCR detected 38 positive samples. In total, PCR was 94.5% concordant with the previously positive viral culture results. However, PCR was only 63.4% concordant with the negative viral culture results, owing to PCR detection of 11 additional viral pathogens not recovered on viral culture. The average time to process a sample was 75 minutes. We determined that automated nested multiplex PCR is a feasible alternative to viral culture in an acute clinical setting. We were able to detect at least 94.5% as many viral pathogens as viral culture can identify, with a faster turn-around time. PMID:24052914
Development of a novel automated cell isolation, expansion, and characterization platform.
Franscini, Nicola; Wuertz, Karin; Patocchi-Tenzer, Isabel; Durner, Roland; Boos, Norbert; Graf-Hausner, Ursula
2011-06-01
Implementation of regenerative medicine in the clinical setting requires not only biological inventions, but also the development of reproducible and safe methods for cell isolation and expansion. As the currently used manual techniques do not fulfill these requirements, there is a clear need to develop an adequate robotic platform for automated, large-scale production of cells or cell-based products. Here, we demonstrate an automated liquid-handling cell-culture platform that can be used to isolate, expand, and characterize human primary cells (e.g., from intervertebral disc tissue) with results that are comparable to the manual procedure. Specifically, no differences could be observed for cell yield, viability, aggregation rate, growth rate, and phenotype. Importantly, all steps, from the enzymatic isolation of cells from the biopsy to the final quality control, can be performed completely by the automated system, thanks to novel tools that were incorporated into the platform. This automated cell-culture platform can therefore entirely replace manual processes in areas that require high throughput while maintaining stability and safety, such as clinical or industrial settings. Copyright © 2011 Society for Laboratory Automation and Screening. Published by Elsevier Inc. All rights reserved.
Computer-Aided Parallelizer and Optimizer
NASA Technical Reports Server (NTRS)
Jin, Haoqiang
2011-01-01
The Computer-Aided Parallelizer and Optimizer (CAPO) automates the insertion of compiler directives (see figure) to facilitate parallel processing on Shared Memory Parallel (SMP) machines. While CAPO currently is integrated seamlessly into CAPTools (developed at the University of Greenwich, now marketed as ParaWise), CAPO was independently developed at Ames Research Center as one of the components for the Legacy Code Modernization (LCM) project. The current version takes serial FORTRAN programs, performs interprocedural data dependence analysis, and generates OpenMP directives. Due to the widely supported OpenMP standard, the generated OpenMP codes have the potential to run on a wide range of SMP machines. CAPO relies on accurate interprocedural data dependence information currently provided by CAPTools. Compiler directives are generated through identification of parallel loops in the outermost level, construction of parallel regions around parallel loops and optimization of parallel regions, and insertion of directives with automatic identification of private, reduction, induction, and shared variables. Attempts also have been made to identify potential pipeline parallelism (implemented with point-to-point synchronization). Although directives are generated automatically, user interaction with the tool is still important for producing good parallel codes. A comprehensive graphical user interface is included for users to interact with the parallelization process.
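The variable classification step that precedes directive insertion (identifying private, reduction, and shared variables) can be caricatured as a decision over each variable's access pattern. The sketch below is a toy model only, not CAPO's analysis; real tools derive these facts from interprocedural data dependence analysis, and the rules here are drastically simplified:

```python
def classify(var_events):
    """Toy directive-generation step. var_events maps a variable name to a
    set of tags drawn from {"read", "write", "reduction"} describing its
    use inside the loop body."""
    decisions = {}
    for var, events in var_events.items():
        if events == {"reduction"}:
            # only ever updated via an associative accumulation (s += ...)
            decisions[var] = "reduction(+)"
        elif "write" in events:
            # written each iteration before any cross-iteration use:
            # safe to privatize in this simplified model
            decisions[var] = "private"
        else:
            decisions[var] = "shared"
    return decisions

# Loop body: for i: t = a[i] * 2; s += t
events = {"t": {"write", "read"}, "s": {"reduction"}, "a": {"read"}}
clauses = classify(events)
```

A tool would then emit something like `!$omp parallel do private(t) reduction(+:s)` around the loop; the hard part, which this sketch omits entirely, is proving the access pattern in the first place.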
[Automated analyser of organ cultured corneal endothelial mosaic].
Gain, P; Thuret, G; Chiquet, C; Gavet, Y; Turc, P H; Théillère, C; Acquart, S; Le Petit, J C; Maugery, J; Campos, L
2002-05-01
Until now, organ-cultured corneal endothelial mosaics have been assessed in France by cell counting using a calibrated graticule, or by drawing cells on a computerized image. The former method is unsatisfactory because it lacks objective evaluation of cell surface and hexagonality and requires an experienced technician. The latter method is time-consuming and requires careful attention. We aimed to build an efficient, fast, and easy-to-use automated digital analyzer of video images of the corneal endothelium. The hardware included a PC (Pentium III 800 MHz, 256 MB RAM), a Data Translation 3155 acquisition card, a Sony SC 75 CE CCD camera, and a 22-inch screen. Special functions for automated cell boundary determination consisted of plug-in programs for the ImageTool software. Calibration was performed using a calibrated micrometer. Cell densities of 40 organ-cultured corneas measured by both manual and automated counting were compared using parametric tests (Student's t test for paired variables and the Pearson correlation coefficient). All steps were made more ergonomic: endothelial image capture; image selection; thresholding of multiple areas of interest; automated cell counting; automated detection of errors in cell boundary drawing; and presentation of the results in an HTML file including the number of counted cells, cell density, coefficient of variation of cell area, cell surface histogram, and cell hexagonality. The device was efficient: the whole process lasted on average 7 minutes and did not require an experienced technician. The correlation between cell densities obtained with the two methods was high (r=+0.84, p<0.001). The results showed an under-estimation with manual counting compared with the automated method (2191+/-322 vs. 2273+/-457 cells/mm^2, p=0.046). Our automated endothelial cell analyzer is efficient and gives reliable results quickly and easily.
A multicentric validation would allow us to standardize cell counts among cornea banks in our country.
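The r=+0.84 agreement reported above is a Pearson correlation between paired manual and automated density measurements. A minimal sketch; the paired densities below are hypothetical, not the study's 40 corneas:

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between paired measurements,
    e.g. manual vs. automated endothelial cell densities."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical paired densities (cells/mm^2) for five corneas.
manual = [2100, 2350, 1980, 2500, 2200]
automated = [2180, 2400, 2050, 2610, 2310]
r = pearson_r(manual, automated)
```

A high r establishes that the two methods rank corneas the same way; it is the paired t test that detects the systematic offset (the under-estimation by manual counting) that correlation alone would miss.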
Fogel, Mina; Harari, Ayelet; Müller-Holzner, Elisabeth; Zeimet, Alain G; Moldenhauer, Gerhard; Altevogt, Peter
2014-06-25
The L1 cell adhesion molecule (L1CAM) is overexpressed in many human cancers and can serve as a biomarker for prognosis in most of these cancers (including type I endometrial carcinomas). Here we provide an optimized immunohistochemical staining procedure for a widely used automated platform (VENTANA™), which has recourse to commercially available primary antibody and detection reagents. In parallel, we optimized the staining on a semi-automated BioGenix (i6000) immunostainer. These protocols yield good stainings and should represent the basis for a reliable and standardized immunohistochemical detection of L1CAM in a variety of malignancies in different laboratories.
Luan, Peng; Lee, Sophia; Paluch, Maciej; Kansopon, Joe; Viajar, Sharon; Begum, Zahira; Chiang, Nancy; Nakamura, Gerald; Hass, Philip E.; Wong, Athena W.; Lazar, Greg A.
2018-01-01
To rapidly find “best-in-class” antibody therapeutics, it has become essential to develop high throughput (HTP) processes that allow rapid assessment of antibodies for functional and molecular properties. Consequently, it is critical to have access to sufficient amounts of high quality antibody, to carry out accurate and quantitative characterization. We have developed automated workflows using liquid handling systems to conduct affinity-based purification either in batch or tip column mode. Here, we demonstrate the capability to purify >2000 antibodies per day from microscale (1 mL) cultures. Our optimized, automated process for human IgG1 purification using MabSelect SuRe resin achieves ∼70% recovery over a wide range of antibody loads, up to 500 µg. This HTP process works well for hybridoma-derived antibodies that can be purified by MabSelect SuRe resin. For rat IgG2a, which is often encountered in hybridoma cultures and is challenging to purify via an HTP process, we established automated purification with GammaBind Plus resin. Using these HTP purification processes, we can efficiently recover sufficient amounts of antibodies from mammalian transient or hybridoma cultures with quality comparable to conventional column purification. PMID:29494273
Automated Purification of Recombinant Proteins: Combining High-throughput with High Yield
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Chiann Tso; Moore, Priscilla A.; Auberry, Deanna L.
2006-05-01
Protein crystallography, mapping protein interactions, and other current functional genomics approaches require not only purifying large numbers of proteins but also obtaining sufficient yield and homogeneity for downstream high-throughput applications. There is a need for robust automated high-throughput protein expression and purification processes to meet these requirements. We developed and compared two alternative workflows for automated purification of recombinant proteins based on expression of bacterial genes in Escherichia coli: first, a filtration separation protocol based on expression in 800 mL E. coli cultures followed by filtration purification using Ni2+-NTA Agarose (Qiagen); second, a smaller-scale magnetic separation method based on expression in 25 mL cultures of E. coli followed by 96-well purification on MagneHis Ni2+ Agarose (Promega). Both workflows provided comparable average yields of about 8 ug of purified protein per unit of OD at 600 nm of bacterial culture. We discuss advantages and limitations of the automated workflows, which can provide proteins more than 90% pure in the range of 100 ug - 45 mg per purification run, as well as strategies for optimization of these protocols.
Markert, Sven; Joeris, Klaus
2017-01-01
We developed an automated microtiter plate (MTP)-based system for suspension cell culture to meet the increased demands for miniaturized high throughput applications in biopharmaceutical process development. The generic system is based on off-the-shelf commercial laboratory automation equipment and is able to utilize MTPs of different configurations (6-24 wells per plate) in orbital shaken mode. The shaking conditions were optimized by Computational Fluid Dynamics simulations. The fully automated system handles plate transport, seeding and feeding of cells, daily sampling, and preparation of analytical assays. The integration of all required analytical instrumentation into the system enables a hands-off operation which prevents bottlenecks in sample processing. The modular set-up makes the system flexible and adaptable for a continuous extension of analytical parameters and add-on components. The system proved suitable as screening tool for process development by verifying the comparability of results for the MTP-based system and bioreactors regarding profiles of viable cell density, lactate, and product concentration of CHO cell lines. These studies confirmed that 6 well MTPs as well as 24 deepwell MTPs were predictive for a scale up to a 1000 L stirred tank reactor (scale factor 1:200,000). Applying the established cell culture system for automated media blend screening in late stage development, a 22% increase in product yield was achieved in comparison to the reference process. The predicted product increase was subsequently confirmed in 2 L bioreactors. Thus, we demonstrated the feasibility of the automated MTP-based cell culture system for enhanced screening and optimization applications in process development and identified further application areas such as process robustness. The system offers a great potential to accelerate time-to-market for new biopharmaceuticals. Biotechnol. Bioeng. 2017;114: 113-121. © 2016 Wiley Periodicals, Inc. 
NASA Astrophysics Data System (ADS)
Kuznetsov, P. A.; Kovalev, I. V.; Losev, V. V.; Kalinin, A. O.; Murygin, A. V.
2016-04-01
The article discusses the reliability of automated control systems and analyzes approaches to classifying system health states. This can be the traditional binary approach, which operates with the concept of "serviceability," or other ways of estimating the system state. The article presents one such option, providing a selective evaluation of each component's contribution to the reliability of the entire system. Descriptions of various automatic control systems and their elements are introduced from the point of view of health and risk, together with a mathematical method for determining the transition of an object from state to state; the transitions differ from each other in the implementation of the objective function. The interplay of elements in different states is explored, as is the aggregate state of elements connected in series or in parallel. Tables of the various logic states and the principles of their calculation for series and parallel connections are given. Through simulation, the proposed approach is illustrated by finding the probability of the system entering a given state for parallel- and serially-connected elements with different probabilities of moving from state to state. In general, the materials of the article will be useful for analyzing the reliability of automated control systems and for engineering highly reliable systems. The proposed mechanism for determining the state of the system provides more detailed information about it and allows a selective approach to the reliability of the system as a whole. Detailed results when assessing the reliability of automated control systems allow the engineer to make an informed decision when designing means of improving reliability.
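The series/parallel reliability arithmetic the article builds on can be sketched as follows — a minimal illustration using the standard textbook formulas, not code from the article, and assuming independent element reliabilities:

```python
from math import prod

def series_reliability(element_ps):
    """A series chain works only if every element works."""
    return prod(element_ps)

def parallel_reliability(element_ps):
    """A parallel group fails only if every element fails."""
    return 1.0 - prod(1.0 - p for p in element_ps)

# two elements with reliabilities 0.9 and 0.8
r_series = series_reliability([0.9, 0.8])      # 0.72
r_parallel = parallel_reliability([0.9, 0.8])  # 0.98
```

Redundancy (parallel connection) raises system reliability above that of the best single element, while a series chain is always weaker than its weakest link.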
The Automated Instrumentation and Monitoring System (AIMS) reference manual
NASA Technical Reports Server (NTRS)
Yan, Jerry; Hontalas, Philip; Listgarten, Sherry
1993-01-01
Whether a researcher is designing the 'next parallel programming paradigm,' another 'scalable multiprocessor,' or investigating resource allocation algorithms for multiprocessors, a facility that enables parallel program execution to be captured and displayed is invaluable. Careful analysis of execution traces can help computer designers and software architects to uncover system behavior and to take advantage of specific application characteristics and hardware features. A software tool kit that facilitates performance evaluation of parallel applications on multiprocessors is described. The Automated Instrumentation and Monitoring System (AIMS) has four major software components: a source code instrumentor, which automatically inserts active event recorders into the program's source code before compilation; a run-time performance-monitoring library, which collects performance data; a trace file animation and analysis tool kit, which reconstructs program execution from the trace file; and a trace post-processor, which compensates for data collection overhead. Besides being used as a prototype for developing new techniques for instrumenting, monitoring, and visualizing parallel program execution, AIMS is also being incorporated into the run-time environments of various hardware test beds to evaluate their impact on user productivity. Currently, AIMS instrumentors accept FORTRAN and C parallel programs written for Intel's NX operating system on the iPSC family of multicomputers. A run-time performance-monitoring library for the iPSC/860 is included in this release. We plan to release monitors for other platforms (such as PVM and TMC's CM-5) in the near future. Performance data collected can be graphically displayed on workstations (e.g., Sun SPARC and SGI) supporting X Windows (in particular, X11R5, Motif 1.1.3).
Third Conference on Artificial Intelligence for Space Applications, part 1
NASA Technical Reports Server (NTRS)
Denton, Judith S. (Compiler); Freeman, Michael S. (Compiler); Vereen, Mary (Compiler)
1987-01-01
The application of artificial intelligence to spacecraft and aerospace systems is discussed. Expert systems, robotics, space station automation, fault diagnostics, parallel processing, knowledge representation, scheduling, man-machine interfaces and neural nets are among the topics discussed.
[Establishment of Automation System for Detection of Alcohol in Blood].
Tian, L L; Shen, Lei; Xue, J F; Liu, M M; Liang, L J
2017-02-01
To establish an automated system for the detection of alcohol content in blood. The determination was performed with an automated extraction workstation and headspace gas chromatography (HS-GC). Negative-pressure blood collection, the sealing time of the headspace vial, and the sampling needle were checked and optimized during development of the automated system. Automatic sampling was compared with manual sampling. The quantitative data obtained by the automated extraction HS-GC workstation for alcohol were stable. The relative differences between two parallel samples were less than 5%. The automated extraction was superior to manual extraction. A good linear relationship was obtained over the alcohol concentration range of 0.1-3.0 mg/mL (r ≥ 0.999), with good repeatability. The method is simple and quick, with a more standardized experimental process and accurate experimental data. It eliminates error introduced by the experimenter and has good repeatability, and it can be applied to the qualitative and quantitative detection of alcohol in blood. Copyright© by the Editorial Department of Journal of Forensic Medicine
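The duplicate-sample acceptance criterion quoted above (relative difference of two parallel samples below 5%) can be computed as follows. This is a sketch: the abstract does not spell out the exact formula, so the difference is taken here relative to the mean of the duplicates, a common convention:

```python
def relative_difference_pct(a, b):
    """Relative difference of two parallel determinations, expressed as a
    percentage of their mean (assumed formula; not specified in the paper)."""
    return abs(a - b) / ((a + b) / 2.0) * 100.0

def duplicates_acceptable(a, b, limit_pct=5.0):
    """Accept a duplicate pair when its relative difference is within limit_pct."""
    return relative_difference_pct(a, b) <= limit_pct

# duplicate blood-alcohol results in mg/mL (illustrative values)
rd = relative_difference_pct(0.50, 0.51)  # about 1.98%
```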
Using CLIPS in the domain of knowledge-based massively parallel programming
NASA Technical Reports Server (NTRS)
Dvorak, Jiri J.
1994-01-01
The Program Development Environment (PDE) is a tool for massively parallel programming of distributed-memory architectures. Adopting a knowledge-based approach, the PDE eliminates the complexity introduced by parallel hardware with distributed memory and offers complete transparency with respect to parallelism exploitation. The knowledge-based part of the PDE is realized in CLIPS. Its principal task is to find an efficient parallel realization of the application specified by the user in a comfortable, abstract, domain-oriented formalism. A large collection of fine-grain parallel algorithmic skeletons, represented as COOL objects in a tree hierarchy, contains the algorithmic knowledge. A hybrid knowledge base with rule modules and procedural parts, encoding expertise about the application domain, parallel programming, software engineering, and parallel hardware, enables a high degree of automation in the software development process. In this paper, important aspects of the implementation of the PDE using CLIPS and COOL are shown, including the embedding of CLIPS with the C++-based parts of the PDE. The appropriateness of the chosen approach and of the CLIPS language for knowledge-based software engineering is discussed.
Compact, Automated, Frequency-Agile Microspectrofluorimeter
NASA Technical Reports Server (NTRS)
Fernandez, Salvador M.; Guignon, Ernest F.
1995-01-01
Compact, reliable, rugged, automated cell-culture and frequency-agile microspectrofluorimetric apparatus developed to perform experiments involving photometric imaging observations of single live cells. In original application, apparatus operates mostly unattended aboard spacecraft; potential terrestrial applications include automated or semiautomated diagnosis of pathological tissues in clinical laboratories, biomedical instrumentation, monitoring of biological process streams, and portable instrumentation for testing biological conditions in various environments. Offers obvious advantages over present laboratory instrumentation.
Clarity: An Open Source Manager for Laboratory Automation
Delaney, Nigel F.; Echenique, José Rojas; Marx, Christopher J.
2013-01-01
Software to manage automated laboratories interfaces with hardware instruments, gives users a way to specify experimental protocols, and schedules activities to avoid hardware conflicts. In addition to these basics, modern laboratories need software that can run multiple different protocols in parallel and that can be easily extended to interface with a constantly growing diversity of techniques and instruments. We present Clarity: a laboratory automation manager that is hardware agnostic, portable, extensible and open source. Clarity provides critical features including remote monitoring, robust error reporting by phone or email, and full state recovery in the event of a system crash. We discuss the basic organization of Clarity; demonstrate an example of its implementation for the automated analysis of bacterial growth; and describe how the program can be extended to manage new hardware. Clarity is mature; well documented; actively developed; written in C# for the Common Language Infrastructure; and is free and open source software. These advantages set Clarity apart from currently available laboratory automation programs. PMID:23032169
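The conflict-free scheduling that such a laboratory automation manager performs can be illustrated with a minimal greedy scheduler. This is a sketch under assumed data structures, not Clarity's actual API: each task names the instrument it needs, and tasks are started as early as possible without double-booking any instrument:

```python
from collections import defaultdict

def schedule(tasks):
    """Greedy conflict-free scheduler: run each task (name, instrument,
    duration) as early as possible without double-booking an instrument."""
    free_at = defaultdict(int)   # instrument -> time it becomes free
    plan = []
    for name, instrument, duration in tasks:
        start = free_at[instrument]
        plan.append((name, instrument, start, start + duration))
        free_at[instrument] = start + duration
    return plan

# two protocols contending for one incubator; the plate reader runs in parallel
tasks = [("incubate A", "incubator", 30),
         ("read A", "plate reader", 5),
         ("incubate B", "incubator", 20)]
plan = schedule(tasks)
```

A production scheduler would also handle priorities, deadlines, and error recovery, but the core invariant is the same: no two activities overlap on the same piece of hardware.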
Signal amplification of FISH for automated detection using image cytometry.
Truong, K; Boenders, J; Maciorowski, Z; Vielh, P; Dutrillaux, B; Malfoy, B; Bourgeois, C A
1997-05-01
The purpose of this study was to improve the detection of FISH signals so that spot counting by a fully automated image cytometer would be comparable to that obtained visually under the microscope. Two systems of spot scoring, visual and automated counting, were investigated in parallel on stimulated human lymphocytes with FISH using a biotinylated centromeric probe for chromosome 3. Signal characteristics were first analyzed on images recorded with a charge-coupled device (CCD) camera. The number of spots per nucleus was scored visually on these recorded images versus automatically with a DISCOVERY image analyzer. Several fluorochromes, amplification systems, and pretreatments were tested. Our results for both visual and automated scoring show that the tyramide signal amplification (TSA) system gives the best amplification of signal if pepsin treatment is applied prior to FISH. The accuracy of the automated scoring, however, remained low (58% of nuclei containing two spots) compared to visual scoring because of the high intranuclear variation between FISH spots.
An Automated High-throughput Array Microscope for Cancer Cell Mechanics
NASA Astrophysics Data System (ADS)
Cribb, Jeremy A.; Osborne, Lukas D.; Beicker, Kellie; Psioda, Matthew; Chen, Jian; O'Brien, E. Timothy; Taylor, Russell M., II; Vicci, Leandra; Hsiao, Joe Ping-Lin; Shao, Chong; Falvo, Michael; Ibrahim, Joseph G.; Wood, Kris C.; Blobe, Gerard C.; Superfine, Richard
2016-06-01
Changes in cellular mechanical properties correlate with the progression of metastatic cancer along the epithelial-to-mesenchymal transition (EMT). Few high-throughput methodologies exist that measure cell compliance, which can be used to understand the impact of genetic alterations or to screen the efficacy of chemotherapeutic agents. We have developed a novel array high-throughput microscope (AHTM) system that combines the convenience of the standard 96-well plate with the ability to image cultured cells and membrane-bound microbeads in twelve independently-focusing channels simultaneously, visiting all wells in eight steps. We use the AHTM and passive bead rheology techniques to determine the relative compliance of human pancreatic ductal epithelial (HPDE) cells, h-TERT transformed HPDE cells (HPNE), and four gain-of-function constructs related to EMT. The AHTM found HPNE, H-ras, Myr-AKT, and Bcl2 transfected cells more compliant relative to controls, consistent with parallel tests using atomic force microscopy and invasion assays, proving the AHTM capable of screening for changes in mechanical phenotype.
Parallel Monotonic Basin Hopping for Low Thrust Trajectory Optimization
NASA Technical Reports Server (NTRS)
McCarty, Steven L.; McGuire, Melissa L.
2018-01-01
Monotonic Basin Hopping has been shown to be an effective method of solving low thrust trajectory optimization problems. This paper outlines an extension to the common serial implementation by parallelizing it over any number of available compute cores. The Parallel Monotonic Basin Hopping algorithm described herein is shown to be an effective way to more quickly locate feasible solutions, and improve locally optimal solutions in an automated way without requiring a feasible initial guess. The increased speed achieved through parallelization enables the algorithm to be applied to more complex problems that would otherwise be impractical for a serial implementation. Low thrust cislunar transfers and a hybrid Mars example case demonstrate the effectiveness of the algorithm. Finally, a preliminary scaling study quantifies the expected decrease in solve time compared to a serial implementation.
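The overall structure of parallelized Monotonic Basin Hopping can be sketched as follows. This is a toy illustration, not the authors' implementation: the local optimizer is a crude numeric gradient descent, the objective is a one-dimensional double well standing in for a trajectory cost function, and threads stand in for the compute cores an HPC implementation would use:

```python
import random
from concurrent.futures import ThreadPoolExecutor

def local_minimize(f, x0, steps=300, lr=0.02):
    """Crude numeric gradient descent standing in for a real local optimizer."""
    x = x0
    for _ in range(steps):
        g = (f(x + 1e-6) - f(x - 1e-6)) / 2e-6
        x -= lr * g
    return x

def mbh(f, x0, hops=40, step=1.5, seed=0):
    """Serial Monotonic Basin Hopping: perturb the incumbent, re-optimize
    locally, and accept only strict improvements (hence 'monotonic')."""
    rng = random.Random(seed)
    best_x = local_minimize(f, x0)
    best_f = f(best_x)
    for _ in range(hops):
        cand = local_minimize(f, best_x + rng.uniform(-step, step))
        if f(cand) < best_f:
            best_x, best_f = cand, f(cand)
    return best_x, best_f

def parallel_mbh(f, n_workers=4):
    """Run independent MBH instances from random starts and keep the best;
    a real implementation would distribute these over processes or MPI ranks."""
    starts = [(random.Random(s).uniform(-4.0, 4.0), s) for s in range(n_workers)]
    with ThreadPoolExecutor(max_workers=n_workers) as ex:
        results = list(ex.map(lambda a: mbh(f, a[0], seed=a[1]), starts))
    return min(results, key=lambda r: r[1])

# double-well objective: global minima at x = +/-1 with f = 0
double_well = lambda x: (x * x - 1.0) ** 2
best_x, best_f = parallel_mbh(double_well)
```

Because each worker is independent, wall-clock time per basin explored drops roughly linearly with core count, which is the speedup the paper exploits.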
Parallel synthesis of a series of potentially brain penetrant aminoalkyl benzoimidazoles.
Micco, Iolanda; Nencini, Arianna; Quinn, Joanna; Bothmann, Hendrick; Ghiron, Chiara; Padova, Alessandro; Papini, Silvia
2008-03-01
Alpha7 agonists were identified via GOLD (CCDC) docking in the putative agonist binding site of an alpha7 homology model and a series of aminoalkyl benzoimidazoles was synthesised to obtain potentially brain penetrant drugs. The array was prepared starting from the reaction of ortho-fluoronitrobenzenes with a selection of diamines, followed by reduction of the nitro group to obtain a series of monoalkylated phenylene diamines. N,N'-Carbonyldiimidazole (CDI) mediated acylation, followed by a parallel automated work-up procedure, afforded the monoacylated phenylenediamines which were cyclised under acidic conditions. Parallel work-up and purification afforded the array products in good yields and purities with a robust parallel methodology which will be useful for other libraries. Screening for alpha7 activity revealed compounds with agonist activity for the receptor.
NASA Astrophysics Data System (ADS)
Lasseur, Christophe
Long term manned missions of our Russian colleagues have demonstrated the risks associated with microbial contamination. These risks concern both crew health, via contamination of metabolic consumables (water, air, etc.), and also hardware degradation. In parallel to these life support issues, planetary protection experts have agreed to place clear specifications on the microbial quality of future hardware landing on extraterrestrial planets, as well as to elaborate contamination requirements for manned missions on the surface. For these activities, it is necessary to have a better understanding of microbial activity, to create culture collections, and to develop on-line detection tools. In this respect, over the last 6 years, ESA has supported active scientific research on the choice of critical genes and functions, including those linked to the horizontal gene pool of bacteria and its dissemination. In parallel, ESA and European industries have been developing an automated instrument for rapid microbial detection in air and surface samples. Within this paper, we first present the life support and planetary protection requirements and the state of the art of the instrument development. Preliminary results at breadboard level, including a mock-up view of the final instrument, are also presented. Finally, the remaining steps required to reach a functional instrument for planetary hardware integration and life support flight hardware are also presented.
Combining Phase Identification and Statistic Modeling for Automated Parallel Benchmark Generation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jin, Ye; Ma, Xiaosong; Liu, Qing Gary
2015-01-01
Parallel application benchmarks are indispensable for evaluating/optimizing HPC software and hardware. However, it is very challenging and costly to obtain high-fidelity benchmarks reflecting the scale and complexity of state-of-the-art parallel applications. Hand-extracted synthetic benchmarks are time- and labor-intensive to create. Real applications themselves, while offering the most accurate performance evaluation, are expensive to compile, port, reconfigure, and often plainly inaccessible due to security or ownership concerns. This work contributes APPRIME, a novel tool for trace-based automatic parallel benchmark generation. Taking as input standard communication-I/O traces of an application's execution, it couples accurate automatic phase identification with statistical regeneration of event parameters to create compact, portable, and to some degree reconfigurable parallel application benchmarks. Experiments with four NAS Parallel Benchmarks (NPB) and three real scientific simulation codes confirm the fidelity of APPRIME benchmarks. They retain the original applications' performance characteristics, in particular the relative performance across platforms.
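Statistical regeneration of event parameters can be illustrated in miniature: fit a distribution to values observed in a trace and sample synthetic replacements. This is a toy normal-fit sketch, not APPRIME's actual method, and the function names are illustrative:

```python
import random
import statistics

def regenerate(observed, n, seed=0):
    """Fit a normal distribution to observed event parameters (e.g. message
    sizes from a trace) and sample n synthetic values, clipped at zero."""
    rng = random.Random(seed)
    mu = statistics.mean(observed)
    sigma = statistics.pstdev(observed)
    return [max(0.0, rng.gauss(mu, sigma)) for _ in range(n)]

# observed message sizes (KB) from one phase of a trace, regenerated 100x
synthetic = regenerate([10.0, 12.0, 11.0, 13.0, 9.0], n=100)
```

The synthetic stream is compact (a few fitted parameters per phase instead of a full trace) yet statistically resembles the original, which is the compression APPRIME-style benchmark generation relies on.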
Semi-automated 96-well liquid-liquid extraction for quantitation of drugs in biological fluids.
Zhang, N; Hoffman, K L; Li, W; Rossi, D T
2000-02-01
A semi-automated liquid-liquid extraction (LLE) technique for biological fluid sample preparation was introduced for the quantitation of four drugs in rat plasma. All liquid transferring during the sample preparation was automated using a Tomtec Quadra 96 Model 320 liquid handling robot, which processed up to 96 samples in parallel. The samples were either in 96-deep-well plate or tube-rack format. One plate of samples can be prepared in approximately 1.5 h, and the 96-well plate is directly compatible with the autosampler of an LC/MS system. Selection of organic solvents and recoveries are discussed. Also, precision, relative error, linearity and quantitation of the semi-automated LLE method are estimated for four example drugs using LC/MS/MS with a multiple reaction monitoring (MRM) approach. The applicability of this method and future directions are evaluated.
Bhattacharyya, S; Fan, L; Vo, L; Labadie, J
2000-04-01
Amine libraries and their derivatives are important targets for high throughput synthesis because of their versatility as medicinal agents and agrochemicals. As a part of our efforts towards automated chemical library synthesis, a titanium(IV) isopropoxide mediated solution phase reductive amination protocol was successfully translated to automation on the Trident(TM) library synthesizer of Argonaut Technologies. An array of 24 secondary amines was prepared in high yield and purity from 4 primary amines and 6 carbonyl compounds. These secondary amines were further utilized in a split synthesis to generate libraries of ureas, amides and sulfonamides in solution phase on the Trident(TM). The automated runs included 192 reactions to synthesize 96 ureas in duplicate and 96 reactions to synthesize 48 amides and 48 sulfonamides. A number of polymer-assisted solution phase protocols were employed for parallel work-up and purification of the products in each step.
The automated counting of beating rates in individual cultured heart cells.
Collins, G A; Dower, R; Walker, M J
1981-12-01
The effect of drugs on the beating rate of cultured heart cells can be monitored in a number of ways. The simultaneous automated measurement of beating rates of a number of cells allows drug effects to be rapidly quantified. A photoresistive detector placed on a television image of a cell, when coupled to operational amplifiers, gives binary signals that can be processed by a microprocessor. On this basis, we have devised a system that is capable of simultaneously monitoring the individual beating of six single cultured heart cells. A microprocessor automatically processes data obtained under different experimental conditions and records it in suitable descriptive formats such as dose-response curves and double reciprocal plots.
Swab culture monitoring of automated endoscope reprocessors after high-level disinfection
Lu, Lung-Sheng; Wu, Keng-Liang; Chiu, Yi-Chun; Lin, Ming-Tzung; Hu, Tsung-Hui; Chiu, King-Wah
2012-01-01
AIM: To conduct a bacterial culture study for monitoring decontamination of automated endoscope reprocessors (AERs) after high-level disinfection (HLD). METHODS: From February 2006 to January 2011, authors conducted randomized consecutive sampling each month for 7 AERs. Authors collected a total of 420 swab cultures, including 300 cultures from 5 gastroscope AERs, and 120 cultures from 2 colonoscope AERs. Swab cultures were obtained from the residual water from the AERs after a full reprocessing cycle. Samples were cultured to test for aerobic bacteria, anaerobic bacteria, and mycobacterium tuberculosis. RESULTS: The positive culture rate of the AERs was 2.0% (6/300) for gastroscope AERs and 0.8% (1/120) for colonoscope AERs. All the positive cultures, including 6 from gastroscope and 1 from colonoscope AERs, showed monofloral colonization. Of the gastroscope AER samples, 50% (3/6) were colonized by aerobic bacterial and 50% (3/6) by fungal contaminations. CONCLUSION: A full reprocessing cycle of an AER with HLD is adequate for disinfection of the machine. Swab culture is a useful method for monitoring AER decontamination after each reprocessing cycle. Fungal contamination of AERs after reprocessing should also be kept in mind. PMID:22529696
Collaborative Robots and Knowledge Management - A Short Review
NASA Astrophysics Data System (ADS)
Mușat, Flaviu-Constantin; Mihu, Florin-Constantin
2017-12-01
Because customer requirements related to quality, quantity, delivery times, and cost are ever higher, industry has had to develop automated solutions to meet them. Starting from the automated lines developed by Ford and Toyota, we now have automated, self-sustained working lines, made possible today by collaborative robots. By using a knowledge management system, we can improve the development of the future of this area of research. This paper shows the benefits and the smart use of robots performing manipulation activities, which improves workplace ergonomics and the human-machine interaction, in order to assist in parallel tasks and lower the physical effort required of humans.
Tension is required for fibripositor formation.
Kapacee, Zoher; Richardson, Susan H; Lu, Yinhui; Starborg, Tobias; Holmes, David F; Baar, Keith; Kadler, Karl E
2008-05-01
Embryonic tendon cells (ETCs) have actin-rich fibripositors that accompany parallel bundles of collagen fibrils in the extracellular matrix. To study fibripositor function, we have developed a three-dimensional cell culture system that promotes and maintains fibripositors. We show that ETCs cultured in fixed-length fibrin gels replace the fibrin during ~6 days in culture with parallel bundles of narrow-diameter collagen fibrils that are uniaxially aligned with fibripositors, thereby generating a tendon-like construct. Fibripositors occurred simultaneously with onset of parallel collagen fibrils. Interestingly, the constructs have a tendon-like crimp. In initial experiments to study the effects of tension, we showed that cutting the constructs resulted in loss of tension, loss of fibripositors and the appearance of immature fibrils with no preferred orientation.
In Vitro Mass Propagation of Cymbopogon citratus Stapf., a Medicinal Gramineae.
Quiala, Elisa; Barbón, Raúl; Capote, Alina; Pérez, Naivy; Jiménez, Elio
2016-01-01
Cymbopogon citratus (D.C.) Stapf. is a medicinal plant source of lemon grass oils with multiple uses in the pharmaceutical and food industry. Conventional propagation in semisolid culture medium has become a fast tool for mass propagation of lemon grass, but the production cost must be lower. A solution could be the application of in vitro propagation methods based on liquid culture advantages and automation. This chapter provides two efficient protocols for in vitro propagation via organogenesis and somatic embryogenesis of this medicinal plant. Firstly, we report the production of shoots using a temporary immersion system (TIS). Secondly, a protocol for somatic embryogenesis using semisolid culture for callus formation and multiplication, and liquid culture in a rotatory shaker and conventional bioreactors for the maintenance of embryogenic culture, is described. Well-developed plants can be achieved from both protocols. Here we provide a fast and efficient technology for mass propagation of this medicinal plant taking the advantage of liquid culture and automation.
NASA Technical Reports Server (NTRS)
Gangal, M. D.; Isenberg, L.; Lewis, E. V.
1985-01-01
Proposed system offers safety and large return on investment. System, operating by year 2000, employs machines and processes based on proven principles. According to concept, line of parallel machines, connected in groups of four to service modules, attacks face of coal seam. High-pressure water jets and central auger on each machine break face. Jaws scoop up coal chunks, and auger grinds them and forces fragments into slurry-transport system. Slurry pumped through pipeline to point of use. Concept for highly automated coal-mining system increases productivity, makes mining safer, and protects health of mine workers.
Automated Synthesis of a 184-Member Library of Thiadiazepan-1, 1-dioxide-4-ones
Fenster, Erik; Long, Toby R.; Zang, Qin; Hill, David; Neuenswander, Benjamin; Lushington, Gerald H.; Zhou, Aihua; Santini, Conrad; Hanson, Paul R.
2011-01-01
The construction of a 225-member (3 × 5 × 15) library of thiadiazepan-1,1-dioxide-4-ones was performed on a Chemspeed Accelerator (SLT-100) automated parallel synthesis platform, culminating in the successful preparation of 184/225 sultams. Three sultam core scaffolds were prepared based upon the utilization of an aza-Michael reaction on a multifunctional vinyl sulfonamide linchpin. The library exploits peripheral diversity in the form of a sequential, two-step [3 + 2] Huisgen cycloaddition/Pd-catalyzed Suzuki–Miyaura coupling sequence. PMID:21309582
Digital microfluidics for automated hanging drop cell spheroid culture.
Aijian, Andrew P; Garrell, Robin L
2015-06-01
Cell spheroids are multicellular aggregates, grown in vitro, that mimic the three-dimensional morphology of physiological tissues. Although there are numerous benefits to using spheroids in cell-based assays, the adoption of spheroids in routine biomedical research has been limited, in part, by the tedious workflow associated with spheroid formation and analysis. Here we describe a digital microfluidic platform that has been developed to automate liquid-handling protocols for the formation, maintenance, and analysis of multicellular spheroids in hanging drop culture. We show that droplets of liquid can be added to and extracted from through-holes, or "wells," and fabricated in the bottom plate of a digital microfluidic device, enabling the formation and assaying of hanging drops. Using this digital microfluidic platform, spheroids of mouse mesenchymal stem cells were formed and maintained in situ for 72 h, exhibiting good viability (>90%) and size uniformity (% coefficient of variation <10% intraexperiment, <20% interexperiment). A proof-of-principle drug screen was performed on human colorectal adenocarcinoma spheroids to demonstrate the ability to recapitulate physiologically relevant phenomena such as insulin-induced drug resistance. With automatable and flexible liquid handling, and a wide range of in situ sample preparation and analysis capabilities, the digital microfluidic platform provides a viable tool for automating cell spheroid culture and analysis. © 2014 Society for Laboratory Automation and Screening.
Automated Testability Decision Tool
1991-09-01
Vol. 16, 1968, pp. 538-558. Bertsekas, D. P., "Constrained Optimization and Lagrange Multiplier Methods," Academic Press, New York. McLeavey, D. W. and McLeavey, J. A., "Parallel Optimization Methods in Standby Reliability," University of Connecticut, School of Business Administration, Bureau of Business
TRANSMISSION NETWORK PLANNING METHOD FOR COMPARATIVE STUDIES (JOURNAL VERSION)
An automated transmission network planning method for comparative studies is presented. This method employs logical steps that may closely parallel those taken in practice by the planning engineers. Use is made of a sensitivity matrix to simulate the engineers' experience in sele...
Career Education via Data Processing
ERIC Educational Resources Information Center
Wagner, Gerald E.
1975-01-01
A data processing instructional program should provide students with career awareness, exploration, and orientation. This can be accomplished by establishing three objectives: (1) familiarization with automation terminology; (2) understanding the influence of the cultural and social impact of computers and automation; and (3) the kinds of job…
Advances in Parallelization for Large Scale Oct-Tree Mesh Generation
NASA Technical Reports Server (NTRS)
O'Connell, Matthew; Karman, Steve L.
2015-01-01
Despite great advancements in the parallelization of numerical simulation codes over the last 20 years, it is still common to perform grid generation in serial. Generating large scale grids in serial often requires using special "grid generation" compute machines that can have more than ten times the memory of average machines. While some parallel mesh generation techniques have been proposed, generating very large meshes for LES or aeroacoustic simulations is still a challenging problem. An automated method for the parallel generation of very large scale off-body hierarchical meshes is presented here. This work enables large scale parallel generation of off-body meshes by using a novel combination of parallel grid generation techniques and a hybrid "top down" and "bottom up" oct-tree method. Meshes are generated using hardware commonly found in parallel compute clusters. The capability to generate very large meshes is demonstrated by the generation of off-body meshes surrounding complex aerospace geometries. Results are shown including a one billion cell mesh generated around a Predator Unmanned Aerial Vehicle geometry, which was generated on 64 processors in under 45 minutes.
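The "top down" refinement of an oct-tree mesh can be illustrated in miniature: recursively subdivide only the cells that contain geometry until a target cell size is reached, leaving cells away from the body coarse. This is a toy sketch, not the paper's method, with a single point standing in for the body geometry:

```python
def refine(cell, body, min_size):
    """Top-down oct-tree refinement: subdivide any cell containing the body
    until cells reach min_size; cells away from the body stay coarse.
    cell = (x, y, z, size); body is a point standing in for real geometry."""
    x, y, z, size = cell
    contains = all(c <= b < c + size for c, b in zip((x, y, z), body))
    if not contains or size <= min_size:
        return [cell]
    half = size / 2.0
    leaves = []
    for dx in (0, 1):
        for dy in (0, 1):
            for dz in (0, 1):
                child = (x + dx * half, y + dy * half, z + dz * half, half)
                leaves.extend(refine(child, body, min_size))
    return leaves

# unit cube refined around a point near one corner, down to cells of size 0.25
leaves = refine((0.0, 0.0, 0.0, 1.0), body=(0.1, 0.1, 0.1), min_size=0.25)
```

Each refinement level multiplies only the occupied branch by eight, which is why oct-tree meshes can resolve a body finely while keeping the far field cheap; parallelizing this recursion over subtrees is the essence of the "top down"/"bottom up" split the paper describes.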
Parallel adaptive wavelet collocation method for PDEs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nejadmalayeri, Alireza, E-mail: Alireza.Nejadmalayeri@gmail.com; Vezolainen, Alexei, E-mail: Alexei.Vezolainen@Colorado.edu; Brown-Dymkoski, Eric, E-mail: Eric.Browndymkoski@Colorado.edu
2015-10-01
A parallel adaptive wavelet collocation method for solving a large class of Partial Differential Equations is presented. The parallelization is achieved by developing an asynchronous parallel wavelet transform, which allows one to perform parallel wavelet transform and derivative calculations with only one data synchronization at the highest level of resolution. The data are stored using a tree-like structure with tree roots starting at an a priori defined level of resolution. Both static and dynamic domain partitioning approaches are developed. For the dynamic domain partitioning, trees are considered to be the minimum quanta of data to be migrated between the processes. This allows fully automated and efficient handling of non-simply connected partitioning of a computational domain. Dynamic load balancing is achieved via domain repartitioning during the grid adaptation step and reassigning trees to the appropriate processes to ensure approximately the same number of grid points on each process. The parallel efficiency of the approach is discussed based on parallel adaptive wavelet-based Coherent Vortex Simulations of homogeneous turbulence with linear forcing at effective non-adaptive resolutions up to 2048³ using as many as 2048 CPU cores.
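The load-balancing step described above — reassigning trees so each process holds roughly the same number of grid points — resembles greedy bin packing. A minimal sketch (not the paper's code; tree names and counts are illustrative):

```python
import heapq

def balance_trees(tree_points, n_procs):
    """Greedy longest-first assignment: give each tree (a subdomain with a
    known grid-point count) to the currently least-loaded process."""
    heap = [(0, p) for p in range(n_procs)]   # (load, process) pairs
    heapq.heapify(heap)
    assignment = {}
    for tree, points in sorted(tree_points.items(), key=lambda kv: -kv[1]):
        load, proc = heapq.heappop(heap)
        assignment[tree] = proc
        heapq.heappush(heap, (load + points, proc))
    return assignment

# five trees of unequal size distributed over two processes
tree_points = {"t1": 100, "t2": 90, "t3": 50, "t4": 40, "t5": 30}
assignment = balance_trees(tree_points, n_procs=2)
loads = [sum(n for t, n in tree_points.items() if assignment[t] == p)
         for p in range(2)]
```

After grid adaptation changes the point counts, rerunning the assignment and migrating only the trees whose owner changed keeps the per-process load approximately equal, as the abstract describes.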
ARC Cell Science Validation (CS-V) Payload Overview
NASA Technical Reports Server (NTRS)
Gilkerson, Nikita
2017-01-01
Automated cell biology system for laboratory and International Space Station (ISS) National Laboratory research. Enhanced cell culture platform that provides undisturbed culture maintenance, including feedback temperature control, medical-grade gas supply, perfusion nutrient delivery and removal of waste, and automated experiment manipulations. Programmable manipulations include media feed change-out, injections, fraction collections, fixation, flow-rate adjustment, and temperature modification within a one-piece sterile-barrier flow path. The cassette provides three levels of containment and allows crew access to the bioculture chamber and flow path assembly for experiment initiation, refurbishment, or sample retrieval and preservation.
Automation of a Wave-Optics Simulation and Image Post-Processing Package on Riptide
NASA Astrophysics Data System (ADS)
Werth, M.; Lucas, J.; Thompson, D.; Abercrombie, M.; Holmes, R.; Roggemann, M.
Detailed wave-optics simulations and image post-processing algorithms are computationally expensive and benefit from the massively parallel hardware available at supercomputing facilities. We created an automated system that interfaces with the Maui High Performance Computing Center (MHPCC) Distributed MATLAB® Portal interface to submit massively parallel wave-optics simulations to the IBM iDataPlex (Riptide) supercomputer. This system subsequently post-processes the output images with an improved version of physically constrained iterative deconvolution (PCID) and analyzes the results using a series of modular algorithms written in Python. With this architecture, a single person can simulate thousands of unique scenarios and produce analyzed, archived, and briefing-compatible output products with very little effort. This research was developed with funding from the Defense Advanced Research Projects Agency (DARPA). The views, opinions, and/or findings expressed are those of the author(s) and should not be interpreted as representing the official views or policies of the Department of Defense or the U.S. Government.
Self-optimizing approach for automated laser resonator alignment
NASA Astrophysics Data System (ADS)
Brecher, C.; Schmitt, R.; Loosen, P.; Guerrero, V.; Pyschny, N.; Pavim, A.; Gatej, A.
2012-02-01
Nowadays, the assembly of laser systems is dominated by manual operations, involving elaborate alignment by means of adjustable mountings. From a competition perspective, the most challenging problem in laser source manufacturing is price pressure, a result of cost competition coming mainly from Asia. From an economic point of view, automated assembly of laser systems offers a better approach, producing more reliable units at lower cost. However, the step from today's manual solutions towards automated assembly requires parallel developments in product design, automation equipment and assembly processes. This paper briefly introduces the idea of self-optimizing technical systems as a new approach towards highly flexible automation. Technically, the work focuses on the precision assembly of laser resonators, which is one of the final and most crucial assembly steps in terms of beam quality and laser power. The paper presents a new design approach for miniaturized laser systems and new automation concepts for robot-based precision assembly, as well as passive and active alignment methods based on a self-optimizing approach. Very promising results have already been achieved, considerably reducing the duration and complexity of laser resonator assembly. These results as well as future development perspectives are discussed.
Parallel File System I/O Performance Testing On LANL Clusters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wiens, Isaac Christian; Green, Jennifer Kathleen
2016-08-18
These are slides from a presentation on parallel file system I/O performance testing on LANL clusters. I/O is a known bottleneck for HPC applications, so performance optimization of I/O is often required. This summer project entailed integrating IOR under Pavilion and automating the results analysis. The slides cover the following topics: scope of the work, tools utilized, the IOR-Pavilion test workflow, the build script, IOR parameters, how parameters are passed to IOR, run_ior functionality, the Python IOR-output parser, the Splunk data format, the Splunk dashboard and features, and future work.
Systematic review automation technologies
2014-01-01
Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors to the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends toward the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of the tasks of the systematic review. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed to realize automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128
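One of the steps the envisioned end-to-end review program would automate, the meta-analysis calculation, can be sketched with the standard inverse-variance fixed-effect method (the function name and the numbers are illustrative, not from any surveyed system):

```python
def fixed_effect_meta(effects, variances):
    """Inverse-variance-weighted fixed-effect meta-analysis: pool the
    per-trial effect estimates, weighting each by 1/variance.
    Returns the pooled effect and its variance."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_var = 1.0 / sum(weights)
    return pooled, pooled_var

# Three hypothetical trials that all report the same effect estimate
# pool to exactly that value, with a smaller variance than any one trial.
pooled, var = fixed_effect_meta([0.5, 0.5, 0.5], [0.1, 0.2, 0.4])
```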
Quantitative high-throughput population dynamics in continuous-culture by automated microscopy.
Merritt, Jason; Kuehn, Seppe
2016-09-12
We present a high-throughput method to measure abundance dynamics in microbial communities sustained in continuous culture. Our method uses custom epi-fluorescence microscopes to automatically image single cells drawn from a continuously cultured population while precisely controlling culture conditions. For clonal populations of Escherichia coli our instrument reveals history-dependent resilience and growth-rate-dependent aggregation.
NASA Technical Reports Server (NTRS)
Prinzel, Lawrence J 3rd; Freeman, Frederick G.; Scerbo, Mark W.; Mikulka, Peter J.; Pope, Alan T.
2003-01-01
The present study examined the effects of an electroencephalographic- (EEG-) based system for adaptive automation on tracking performance and workload. In addition, event-related potentials (ERPs) to a secondary task were derived to determine whether they would provide an additional degree of workload specificity. Participants were run in an adaptive automation condition, in which the system switched between manual and automatic task modes based on the value of each individual's own EEG engagement index; a yoked control condition; or another control group, in which task mode switches followed a random pattern. Adaptive automation improved performance and resulted in lower levels of workload. Further, the P300 component of the ERP paralleled the sensitivity to task demands of the performance and subjective measures across conditions. These results indicate that it is possible to improve performance with a psychophysiological adaptive automation system and that ERPs may provide an alternative means for distinguishing among levels of cognitive task demand in such systems. Actual or potential applications of this research include improved methods for assessing operator workload and performance.
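The switching logic of such a closed-loop system can be sketched as a simple negative-feedback rule on the engagement index (the thresholds and index values below are illustrative, not those used in the study):

```python
def next_mode(current_mode, engagement, low=0.4, high=0.6):
    """Sketch of EEG-based adaptive automation switching: when the
    engagement index falls below a threshold, the task is handed to the
    operator (manual mode) to restore engagement; when it rises above
    the upper threshold, the task is automated. Hypothetical thresholds."""
    if engagement < low:
        return "manual"       # under-engaged: give the operator the task
    if engagement > high:
        return "automatic"    # over-engaged: offload the task
    return current_mode       # inside the dead band: no switch

modes = []
mode = "manual"
for idx in [0.7, 0.55, 0.3, 0.5, 0.65]:   # simulated engagement readings
    mode = next_mode(mode, idx)
    modes.append(mode)
```

The dead band between the two thresholds prevents rapid oscillation between modes, a common design choice in adaptive automation controllers.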
Li, Jieyue; Newberg, Justin Y; Uhlén, Mathias; Lundberg, Emma; Murphy, Robert F
2012-01-01
The Human Protein Atlas contains immunofluorescence images showing subcellular locations for thousands of proteins. These are currently annotated by visual inspection. In this paper, we describe automated approaches to analyze the images and their use to improve annotation. We began by training classifiers to recognize the annotated patterns. By ranking proteins according to the confidence of the classifier, we generated a list of proteins that were strong candidates for reexamination. In parallel, we applied hierarchical clustering to group proteins and identified proteins whose annotations were inconsistent with the remainder of the proteins in their cluster. These proteins were reexamined by the original annotators, and a significant fraction had their annotations changed. The results demonstrate that automated approaches can provide an important complement to visual annotation.
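The clustering-based review step can be sketched as flagging, within each cluster, the proteins whose annotation disagrees with the cluster's majority annotation (protein names, locations, and clusters below are illustrative):

```python
from collections import Counter

def flag_inconsistent(annotations, clusters):
    """Sketch of annotation review via clustering: within each cluster,
    flag proteins whose annotation differs from the cluster's majority
    annotation as candidates for visual reexamination."""
    flagged = []
    for members in clusters:
        majority, _ = Counter(annotations[p] for p in members).most_common(1)[0]
        flagged.extend(p for p in members if annotations[p] != majority)
    return flagged

annotations = {"A": "nucleus", "B": "nucleus", "C": "mitochondria",
               "D": "cytosol", "E": "cytosol"}
flagged = flag_inconsistent(annotations, [["A", "B", "C"], ["D", "E"]])
```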
NASA Astrophysics Data System (ADS)
Tomori, Zoltan; Keša, Peter; Nikorovič, Matej; Kaňka, Jan; Zemánek, Pavel
2016-12-01
We propose improved control software for holographic optical tweezers (HOT), suitable for simple semi-automated sorting. The controller receives and processes data from both the human-interface sensors and the HOT microscope camera. As a result, the new positions of the active laser traps are calculated, packed into the network format and sent to the remote HOT. Using a photo-polymerization technique, we created a sorting container consisting of two parallel horizontal walls, one of which contains "gates" marking the places where a trapped particle can enter the container. The positions of particles and gates are obtained by image analysis, which can be exploited to achieve a higher level of automation. Sorting is demonstrated both in a computer-game simulation and in a real experiment.
An Automated, High-Throughput System for GISAXS and GIWAXS Measurements of Thin Films
NASA Astrophysics Data System (ADS)
Schaible, Eric; Jimenez, Jessica; Church, Matthew; Lim, Eunhee; Stewart, Polite; Hexemer, Alexander
Grazing incidence small-angle X-ray scattering (GISAXS) and grazing incidence wide-angle X-ray scattering (GIWAXS) are important techniques for characterizing thin films. In order to meet rapidly increasing demand, the SAXS/WAXS beamline at the Advanced Light Source (beamline 7.3.3) has implemented a fully automated, high-throughput system to conduct SAXS, GISAXS and GIWAXS measurements. An automated robot arm transfers samples from a holding tray to a measurement stage. Intelligent software aligns each sample in turn and measures it according to user-defined specifications. Users mail in trays of samples on individually barcoded pucks, and can download and view their data remotely. Data will be pipelined to the NERSC supercomputing facility and made available to users via a web portal that facilitates highly parallelized analysis.
STAMPS: Software Tool for Automated MRI Post-processing on a supercomputer.
Bigler, Don C; Aksu, Yaman; Miller, David J; Yang, Qing X
2009-08-01
This paper describes a Software Tool for Automated MRI Post-processing (STAMP) of multiple types of brain MRIs on a workstation and for parallel processing on a supercomputer (STAMPS). This software tool enables the automation of nonlinear registration for a large image set and for multiple MR image types. The tool uses standard brain MRI post-processing tools (such as SPM, FSL, and HAMMER) for multiple MR image types in a pipeline fashion. It also contains novel MRI post-processing features. The STAMP image outputs can be used to perform brain analysis using Statistical Parametric Mapping (SPM) or single-/multi-image modality brain analysis using Support Vector Machines (SVMs). Since STAMPS is PBS-based, the supercomputer may be a multi-node computer cluster or one of the latest multi-core computers.
Lysák, Daniel; Holubová, Monika; Bergerová, Tamara; Vávrová, Monika; Cangemi, Giuseppina Cristina; Ciccocioppo, Rachele; Kruzliak, Peter; Jindra, Pavel
2016-03-01
Cell therapy products represent a new trend of treatment in the fields of immunotherapy and regenerative medicine. Their biological nature and multistep preparation procedure require the application of complex release criteria and quality control. Microbial contamination of cell therapy products is a potential source of morbidity in recipients. Automated blood culture systems are widely used for the detection of microorganisms in cell therapy products; however, the standard 2-week cultivation period is too long for some cell-based treatments, and alternative methods have to be devised. We sought to verify whether a shortened cultivation of the supernatant from the mesenchymal stem cell (MSC) culture, obtained 2 days before the cell harvest, could sufficiently detect microbial growth and allow the release of MSC for clinical application. We compared the standard Ph. Eur. cultivation method and the automated blood culture system BACTEC (Becton Dickinson). The time to detection (TTD) and the detection limit were analyzed for three bacterial and two fungal strains. Staphylococcus aureus and Pseudomonas aeruginosa were detected within 24 h with both methods (detection limit ~10 CFU). The time required for the detection of Bacillus subtilis was shorter with the automated method (TTD 10.3 vs. 60 h for 10-100 CFU). The BACTEC system reached significantly shorter times to detection of Candida albicans and Aspergillus brasiliensis growth compared to the classical method (15.5 vs. 48 and 31.5 vs. 48 h, respectively; 10-100 CFU). Positivity was demonstrated within 48 h in all bottles, regardless of the size of the inoculum. This study validated the automated cultivation system as a method able to detect all tested microorganisms within a 48-h period with a detection limit of ~10 CFU; only in the case of B. subtilis was the lowest inoculum (~10 CFU) not recognized. The 2-day cultivation technique is thus capable of confirming the microbiological safety of MSC and allows their timely release for clinical application.
Toprak, Erdal; Veres, Adrian; Yildiz, Sadik; Pedraza, Juan M.; Chait, Remy; Paulsson, Johan; Kishony, Roy
2013-01-01
We present a protocol for building and operating an automated fluidic system for continuous culture that we call the “morbidostat”. The morbidostat is used to follow evolution of microbial drug resistance in real time. Instead of exposing bacteria to predetermined drug environments, the morbidostat constantly measures the growth rates of evolving microbial populations and dynamically adjusts drug concentrations inside culture vials in order to maintain a constant drug induced inhibition. The growth rate measurements are done using an optical detection system that is based on measuring the intensity of back-scattered light from bacterial cells suspended in the liquid culture. The morbidostat can additionally be used as a chemostat or a turbidostat. The whole system can be built from readily available components within two to three weeks, by biologists with some electronics experience or engineers familiar with basic microbiology. PMID:23429717
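The feedback rule at the heart of the morbidostat can be sketched in a few lines: if the measured growth rate exceeds the target despite the drug, raise the concentration; if the population is over-inhibited, lower it. This is a simplified stand-in (the published protocol chooses between drug and plain medium at each dilution cycle; the rates, target, and step size below are illustrative):

```python
def drug_adjustment(growth_rate, target_rate, current_conc, step=0.1):
    """Minimal sketch of morbidostat-style feedback: nudge the drug
    concentration up when growth outpaces the target inhibition and
    down when the culture is over-inhibited. Illustrative values."""
    if growth_rate > target_rate:
        return current_conc + step            # under-inhibited: more drug
    return max(0.0, current_conc - step)      # over-inhibited: back off

conc = 0.5
history = []
for rate in [0.9, 0.8, 0.4, 0.3, 0.7]:        # simulated growth rates
    conc = drug_adjustment(rate, target_rate=0.6, current_conc=conc)
    history.append(round(conc, 2))
```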
Automated Generation of Message-Passing Programs: An Evaluation Using CAPTools
NASA Technical Reports Server (NTRS)
Hribar, Michelle R.; Jin, Haoqiang; Yan, Jerry C.; Saini, Subhash (Technical Monitor)
1998-01-01
Scientists at NASA Ames Research Center have been developing computational aeroscience applications on highly parallel architectures over the past ten years. During that same time period, a steady transition of hardware and system software also occurred, forcing us to expend great effort migrating and re-coding our applications. As applications and machine architectures become increasingly complex, the cost and time required for this process will become prohibitive. In this paper, we present the first set of results in our evaluation of interactive parallelization tools. In particular, we evaluate CAPTools' ability to parallelize computational aeroscience applications. CAPTools was tested on serial versions of the NAS Parallel Benchmarks and ARC3D, a computational fluid dynamics application, on two platforms: the SGI Origin 2000 and the Cray T3E. This evaluation includes performance, amount of user interaction required, limitations and portability. Based on these results, a discussion of the feasibility of computer-aided parallelization of aerospace applications is presented along with suggestions for future work.
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2005-01-01
The manual application of formal methods in system specification has produced successes, but in the end, despite any claims and assertions by practitioners, there is no provable relationship between a manually derived system specification or formal model and the customer's original requirements. Complex parallel and distributed systems present the worst-case implications for today's dearth of viable approaches for achieving system dependability. No avenue other than formal methods constitutes a serious contender for resolving the problem, and so recognition of requirements-based programming has come at a critical juncture. We describe a new, NASA-developed automated requirements-based programming method that can be applied to certain classes of systems, including complex parallel and distributed systems, to achieve a high degree of dependability.
On extending parallelism to serial simulators
NASA Technical Reports Server (NTRS)
Nicol, David; Heidelberger, Philip
1994-01-01
This paper describes an approach to discrete event simulation modeling that appears to be effective for developing portable and efficient parallel execution of models of large distributed systems and communication networks. In this approach, the modeler develops submodels using an existing sequential simulation modeling tool, using the full expressive power of the tool. A set of modeling language extensions permit automatically synchronized communication between submodels; however, the automation requires that any such communication must take a nonzero amount of simulation time. Within this modeling paradigm, a variety of conservative synchronization protocols can transparently support conservative execution of submodels on potentially different processors. A specific implementation of this approach, U.P.S. (Utilitarian Parallel Simulator), is described, along with performance results on the Intel Paragon.
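The reason the nonzero-delay requirement matters can be sketched with the basic conservative rule it enables: a submodel may safely advance to the minimum clock of the submodels that can send to it, plus the lookahead, because no message can arrive earlier than that. A toy sketch (submodel names, links, and clocks are illustrative, not from U.P.S.):

```python
def safe_advance(clocks, links, lookahead):
    """Conservative synchronization sketch: for each submodel, compute
    the time up to which it may safely simulate, given the clocks of
    the submodels that can send to it and a nonzero lookahead."""
    bounds = {}
    for node, senders in links.items():
        if senders:
            bounds[node] = min(clocks[s] for s in senders) + lookahead
        else:
            bounds[node] = float("inf")   # no inbound links: unconstrained
    return bounds

clocks = {"net": 10.0, "cpu": 12.0, "disk": 11.0}
links = {"net": ["cpu"], "cpu": ["net", "disk"], "disk": ["cpu"]}
bounds = safe_advance(clocks, links, lookahead=0.5)
```

With zero lookahead the bound for "cpu" would equal the minimum sender clock and no submodel could make guaranteed progress, which is why the modeling extensions insist on nonzero communication delay.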
NASA Technical Reports Server (NTRS)
Sharma, Naveen
1992-01-01
In this paper we briefly describe a combined symbolic and numeric approach for solving mathematical models on parallel computers. An experimental software system, PIER, is being developed in Common Lisp to synthesize the computationally intensive and domain-formulation-dependent phases of finite element analysis (FEA) solution methods. Quantities for domain formulation, such as shape functions and element stiffness matrices, are automatically derived using symbolic mathematical computations. The problem-specific information and derived formulae are then used to generate (parallel) numerical code for the FEA solution steps. A constructive approach is taken to specifying a numerical program design. The code generator compiles application-oriented input specifications into (parallel) FORTRAN77 routines with the help of built-in knowledge of the particular problem, numerical solution methods and the target computer.
Automated inspection of hot steel slabs
Martin, R.J.
1985-12-24
The disclosure relates to a real time digital image enhancement system for performing the image enhancement segmentation processing required for a real time automated system for detecting and classifying surface imperfections in hot steel slabs. The system provides for simultaneous execution of edge detection processing and intensity threshold processing in parallel on the same image data produced by a sensor device such as a scanning camera. The results of each process are utilized to validate the results of the other process and a resulting image is generated that contains only corresponding segmentation that is produced by both processes. 5 figs.
Advanced automation of a prototypic thermal control system for Space Station
NASA Technical Reports Server (NTRS)
Dominick, Jeff
1990-01-01
Viewgraphs on advanced automation of a prototypic thermal control system for Space Station are presented. The Thermal Expert System (TEXSYS) was initiated in 1986 as a cooperative project between ARC and JSC as a way to leverage ongoing work at both centers. JSC contributed Thermal Control System (TCS) hardware and control software, TCS operational expertise, and integration expertise. ARC contributed expert system and display expertise. The first years of the project were dedicated to parallel development of expert system tools, displays, interface software, and TCS technology and procedures by a total of four organizations.
Automated inspection of hot steel slabs
Martin, Ronald J.
1985-01-01
The disclosure relates to a real time digital image enhancement system for performing the image enhancement segmentation processing required for a real time automated system for detecting and classifying surface imperfections in hot steel slabs. The system provides for simultaneous execution of edge detection processing and intensity threshold processing in parallel on the same image data produced by a sensor device such as a scanning camera. The results of each process are utilized to validate the results of the other process and a resulting image is generated that contains only corresponding segmentation that is produced by both processes.
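The cross-validation step described in the disclosure, keeping only segmentation produced by both processes, amounts to intersecting the two binary results. A minimal sketch using nested lists of 0/1 values in place of real image data (the masks are illustrative):

```python
def validated_segmentation(edge_mask, threshold_mask):
    """Sketch of the dual-process validation: edge detection and
    intensity thresholding run in parallel on the same image, and a
    pixel survives only if both processes mark it."""
    return [[e & t for e, t in zip(erow, trow)]
            for erow, trow in zip(edge_mask, threshold_mask)]

edges = [[1, 1, 0],
         [0, 1, 1]]       # pixels marked by the edge detector
thresh = [[1, 0, 0],
          [0, 1, 0]]      # pixels marked by the intensity threshold
combined = validated_segmentation(edges, thresh)
```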
Automated macromolecular crystal detection system and method
Christian, Allen T [Tracy, CA; Segelke, Brent [San Ramon, CA; Rupp, Bernard [Livermore, CA; Toppani, Dominique [Fontainebleau, FR
2007-06-05
An automated method and system for detecting macromolecular crystals in two-dimensional images, such as light microscopy images obtained from an array of crystallization screens. Edges are detected in the images by identifying local maxima of a phase-congruency-based function associated with each image. The detected edges are segmented into discrete line segments, which are then geometrically evaluated with respect to each other to identify crystal-like qualities such as parallel lines facing each other, similarity in length, and relative proximity. From this evaluation a determination is made as to whether crystals are present in each image.
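Part of the geometric evaluation can be sketched as a pairwise test on detected segments: near-parallel orientation and similar length. This is a simplified stand-in (the patent also checks facing and proximity; the tolerances and segments below are illustrative):

```python
import math

def crystal_like(seg_a, seg_b, angle_tol=5.0, length_ratio=0.8):
    """Sketch of the pairwise geometric test: two line segments are a
    crystal-like pair if nearly parallel (within angle_tol degrees)
    and similar in length. A segment is ((x1, y1), (x2, y2))."""
    def angle(seg):
        (x1, y1), (x2, y2) = seg
        return math.degrees(math.atan2(y2 - y1, x2 - x1)) % 180.0
    def length(seg):
        (x1, y1), (x2, y2) = seg
        return math.hypot(x2 - x1, y2 - y1)
    da = abs(angle(seg_a) - angle(seg_b))
    da = min(da, 180.0 - da)            # undirected angles wrap at 180
    la, lb = length(seg_a), length(seg_b)
    return da <= angle_tol and min(la, lb) / max(la, lb) >= length_ratio

pair_ok = crystal_like(((0, 0), (10, 0)), ((0, 2), (9.5, 2.1)))
pair_bad = crystal_like(((0, 0), (10, 0)), ((0, 2), (2, 8)))
```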
Renz, Nora; Feihl, Susanne; Cabric, Sabrina; Trampuz, Andrej
2017-12-01
Sonication of explanted prostheses improved the microbiological diagnosis of periprosthetic joint infections (PJI). We evaluated the performance of automated multiplex polymerase chain reaction (PCR) using sonication fluid for the microbiological diagnosis of PJI. In a prospective cohort using uniform definition criteria for PJI, explanted joint prostheses were investigated by sonication and the resulting sonication fluid was analyzed by culture and multiplex PCR. McNemar's chi-squared test was used to compare the performance of the diagnostic tests. Among 111 patients, PJI was diagnosed in 78 (70%) and aseptic failure in 33 (30%). For the diagnosis of PJI, the sensitivity and specificity of periprosthetic tissue culture were 51 and 100%, of sonication fluid culture 58 and 100%, and of sonication fluid PCR 51 and 94%, respectively. Among 70 microorganisms, periprosthetic tissue culture grew 52 (74%), sonication fluid culture grew 50 (71%) and sonication fluid PCR detected 37 pathogens (53%). If only organisms for which primers are included in the test panel are considered, PCR detected 37 of 58 pathogens (64%). The sonication fluid PCR missed 19 pathogens (predominantly oral streptococci and anaerobes), whereas 7 additional microorganisms were detected only by PCR (including Cutibacterium spp. and coagulase-negative staphylococci). The performance of multiplex PCR using sonication fluid is comparable to culture of periprosthetic tissue or sonication fluid. The advantages of PCR are short processing time (< 5 h) and a fully automated procedure. However, culture is still needed due to the low sensitivity of PCR and the need for comprehensive susceptibility testing. Modification of existing primers or inclusion of additional ones may improve the performance of PCR, especially for low-virulence organisms.
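The statistical comparison named in the abstract, McNemar's chi-squared test for paired diagnostic tests, depends only on the discordant-pair counts. A minimal sketch with the continuity correction (the counts below are made up for illustration, not taken from the study):

```python
def mcnemar_statistic(b, c):
    """McNemar's chi-squared statistic with continuity correction for
    two paired diagnostic tests: b and c are the discordant-pair counts
    (positive by test 1 only, and by test 2 only, respectively)."""
    if b + c == 0:
        return 0.0
    return (abs(b - c) - 1) ** 2 / (b + c)

# Hypothetical example: 9 samples positive only by tissue culture,
# 3 positive only by sonication fluid PCR.
chi2 = mcnemar_statistic(9, 3)
```

The statistic is compared against the chi-squared distribution with one degree of freedom; values above about 3.84 indicate a significant difference at the 5% level.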
Method and automated apparatus for detecting coliform organisms
NASA Technical Reports Server (NTRS)
Dill, W. P.; Taylor, R. E.; Jeffers, E. L. (Inventor)
1980-01-01
Method and automated apparatus are disclosed for determining the time of detection of metabolically produced hydrogen by coliform bacteria cultured in an electroanalytical cell, measured from the time the cell is inoculated with the bacteria. The detection time data provide bacteria concentration values. The apparatus is sequenced and controlled by a digital computer to discharge a spent sample, clean and sterilize the culture cell, provide a bacteria nutrient into the cell, control the temperature of the nutrient, inoculate the nutrient with a bacteria sample, measure the electrical potential difference produced by the cell, and measure the time of detection from inoculation.
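The step from detection time to concentration can be sketched with the standard exponential-growth calibration: under constant growth conditions, detection time falls linearly with the logarithm of the starting cell count. The coefficients below are hypothetical; in practice they would be fitted from known inocula:

```python
def cells_from_detection_time(t_detect, a=10.0, b=1.0):
    """Sketch of detection-time calibration: assuming exponential growth,
    detection time t satisfies t = a - b*log10(N0), so the initial cell
    count is recovered as 10**((a - t)/b). Coefficients a and b are
    hypothetical placeholders for fitted calibration constants."""
    return 10.0 ** ((a - t_detect) / b)

n_early = cells_from_detection_time(4.0)   # early detection: heavy inoculum
n_late = cells_from_detection_time(8.0)    # late detection: light inoculum
```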
Evaluation of negative results of BacT/Alert 3D automated blood culture system.
Kocoglu, M Esra; Bayram, Aysen; Balci, Iclal
2005-06-01
Although automated continuous-monitoring blood culture systems are both rapid and sensitive, false-positive and false-negative results still occur. The objective of this study, then, was to evaluate negative results occurring with the BacT/Alert 3D blood culture system. A total of 1032 samples were cultured with the BacT/Alert 3D automated blood culture system, using both aerobic (FA) and anaerobic (FN) media, and 128 of these samples yielded positive results. A total of 904 negative blood samples were then subcultured on 5% sheep blood agar, eosin methylene blue, chocolate agar, and Sabouraud dextrose agar. Organisms growing on these subcultures were subsequently identified using both Vitek32 (bioMerieux, Durham, NC) and conventional methods. Twenty-four (2.6%) of the 904 subcultures grew on the subculture media. The majority (83.3%) of these were determined to be gram-positive microorganisms. Fourteen (58.3%) were coagulase-negative staphylococci, two (8.3%) were Bacillus spp., one (4.2%) was Staphylococcus aureus, and one (4.2%) was identified as Enterococcus faecium. Streptococcus pneumoniae and Neisseria spp. were isolated together in two (8.3%) vials. Gram-negative microorganisms comprised 12.5% of the subcultures, of which two (8.3%) were found to be Pseudomonas aeruginosa, and one (4.2%) was Pseudomonas fluorescens. The other isolate (4.2%) was identified as Candida albicans. We conclude that the subculture of negative results is valuable in the BacT/Alert 3D system, especially in situations in which only one set of blood cultures is taken.
Automated Vectorization of Decision-Based Algorithms
NASA Technical Reports Server (NTRS)
James, Mark
2006-01-01
Virtually all existing vectorization algorithms are designed to only analyze the numeric properties of an algorithm and distribute those elements across multiple processors. This advances the state of the practice because it is the only known system, at the time of this reporting, that takes high-level statements and analyzes them for their decision properties and converts them to a form that allows them to automatically be executed in parallel. The software takes a high-level source program that describes a complex decision- based condition and rewrites it as a disjunctive set of component Boolean relations that can then be executed in parallel. This is important because parallel architectures are becoming more commonplace in conventional systems and they have always been present in NASA flight systems. This technology allows one to take existing condition-based code and automatically vectorize it so it naturally decomposes across parallel architectures.
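The core idea, rewriting a complex condition as a disjunction of independent Boolean clauses that can be evaluated concurrently, can be sketched as follows (the clause set, inputs, and thread-based concurrency are illustrative, not taken from the reported software):

```python
from concurrent.futures import ThreadPoolExecutor

# A decision expressed as a disjunctive set of component Boolean
# relations; each clause is independent and could run on its own
# processor. These clauses are made-up examples.
clauses = [
    lambda v: v["a"] and not v["b"],
    lambda v: v["b"] and v["c"],
    lambda v: v["a"] and v["c"],
]

def decide(values):
    """Evaluate the disjunctive clauses concurrently; the decision
    fires if any clause is true."""
    with ThreadPoolExecutor(max_workers=len(clauses)) as pool:
        results = list(pool.map(lambda clause: clause(values), clauses))
    return any(results)

fired = decide({"a": True, "b": False, "c": False})
quiet = decide({"a": False, "b": False, "c": False})
```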
Mass culture of photobacteria to obtain luciferase
NASA Technical Reports Server (NTRS)
Chappelle, E. W.; Picciolo, G. L.; Rich, E., Jr.
1969-01-01
Inoculating preheated trays containing nutrient agar with photobacteria provides a means for mass culture of aerobic microorganisms in order to obtain large quantities of luciferase. To determine optimum harvest time, growth can be monitored by automated light-detection instrumentation.
Heterocyclic compounds hold a special place in drug discovery and variety of techniques such as combinatorial synthesis, parallel synthesis, and automated library production to increase the output of these entities has been developed. Although most of these techniques are rapid a...
Automated design of spacecraft power subsystems
NASA Technical Reports Server (NTRS)
Terrile, Richard J.; Kordon, Mark; Mandutianu, Dan; Salcedo, Jose; Wood, Eric; Hashemi, Mona
2006-01-01
This paper discusses the application of evolutionary computing to a dynamic space vehicle power subsystem resource and performance simulation in a parallel processing environment. Our objective is to demonstrate the feasibility, application and advantage of using evolutionary computation techniques for the early design search and optimization of space systems.
Automatic Camera Calibration for Cultural Heritage Applications Using Unstructured Planar Objects
NASA Astrophysics Data System (ADS)
Adam, K.; Kalisperakis, I.; Grammatikopoulos, L.; Karras, G.; Petsa, E.
2013-07-01
As a rule, image-based documentation of cultural heritage relies today on ordinary digital cameras and commercial software. As such projects often involve researchers not familiar with photogrammetry, the question of camera calibration is important. Freely available, open-source, user-friendly software tools for automatic camera calibration, often based on simple 2D chessboard patterns, are an answer to the demand for simplicity and automation. However, such tools cannot respond to all requirements met in cultural heritage conservation regarding possible imaging distances and focal lengths. Here we investigate the practical possibility of camera calibration from unknown planar objects, i.e. any planar surface with adequate texture; we have focused on the example of urban walls covered with graffiti. Images are connected pair-wise with inter-image homographies, which are estimated automatically through a RANSAC-based approach after extracting and matching interest points with the SIFT operator. All valid points are identified on all images on which they appear. Provided that the image set includes a "fronto-parallel" view, inter-image homographies with this image are regarded as emulations of image-to-world homographies and allow computing initial estimates for the interior and exterior orientation elements. Following this initialization step, the estimates are introduced into a final self-calibrating bundle adjustment. Measures are taken to discard unsuitable images and verify object planarity. Results from practical experimentation indicate that this method may produce satisfactory results. The authors intend to incorporate the described approach into their freely available user-friendly software tool, which relies on chessboards, to assist non-experts in their projects with image-based approaches.
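The building block behind the inter-image homographies can be sketched with the minimal direct linear transform: four point correspondences determine a plane-to-plane homography. This toy version fixes h33 = 1 and solves the resulting 8×8 linear system in pure Python; the real pipeline uses many SIFT matches, least squares, and RANSAC outlier rejection (point values below are illustrative):

```python
def solve_linear(A, b):
    # Gaussian elimination with partial pivoting (helper for the sketch).
    n = len(A)
    M = [row[:] + [rhs] for row, rhs in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def homography_from_points(src, dst):
    """Minimal DLT: each of the 4 correspondences contributes two linear
    equations in the eight homography entries (h33 fixed to 1)."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = solve_linear(A, b) + [1.0]
    return [h[0:3], h[3:6], h[6:9]]

def apply_h(H, pt):
    # Map a point through the homography (projective division by w).
    x, y = pt
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)

src = [(0, 0), (1, 0), (0, 1), (1, 1)]
dst = [(1, 2), (3, 2), (1, 5), (3, 5)]   # the affine map u = 2x+1, v = 3y+2
H = homography_from_points(src, dst)
u, v = apply_h(H, (0.5, 0.5))
```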
Research project evaluates the effect of national culture on flight crew behaviour.
Helmreich, R L; Merritt, A C; Sherman, P J
1996-10-01
The role of national culture in flight crew interactions and behavior is examined. Researchers surveyed Asian, European, and American flight crews to determine attitudes about crew coordination and cockpit management. Universal attitudes among pilots are identified. Culturally variable attitudes among pilots from 16 countries are compared. The role of culture in response to increasing cockpit automation is reviewed. Culture-based challenges to crew resource management programs and multicultural organizations are discussed.
SeqMule: automated pipeline for analysis of human exome/genome sequencing data.
Guo, Yunfei; Ding, Xiaolei; Shen, Yufeng; Lyon, Gholson J; Wang, Kai
2015-09-18
Next-generation sequencing (NGS) technology has greatly helped us identify disease-contributory variants for Mendelian diseases. However, users are often faced with issues such as software compatibility, complicated configuration, and no access to a high-performance computing facility. Discrepancies also exist among aligners and variant callers. We developed a computational pipeline, SeqMule, to perform automated variant calling from NGS data on human genomes and exomes. SeqMule integrates computational-cluster-free parallelization capability built on top of the variant callers, and facilitates normalization/intersection of variant calls to generate a high-confidence consensus set. SeqMule integrates 5 alignment tools and 5 variant calling algorithms, and accepts various combinations of them through a single one-line command, allowing highly flexible yet fully automated variant calling. On a modern machine (2 Intel Xeon X5650 CPUs, 48 GB memory), when fast turn-around is needed, SeqMule generates annotated VCF files in a day from a 30X whole-genome sequencing data set; when more accurate calling is needed, SeqMule generates a consensus call set that improves on single callers, as measured by both Mendelian error rate and consistency. SeqMule supports Sun Grid Engine for parallel processing, offers a turn-key solution for deployment on Amazon Web Services, and provides quality checks, Mendelian error checks, consistency evaluation, and HTML-based reports. SeqMule is available at http://seqmule.openbioinformatics.org.
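The consensus idea can be illustrated in miniature: once calls from different callers are normalized to a comparable representation, the consensus set is a simple support-count filter. This toy sketch assumes a (chrom, pos, ref, alt) tuple representation and a majority threshold; it illustrates the concept only and is not SeqMule's actual code.

```python
from collections import Counter

def consensus_calls(callsets, min_support=2):
    """Keep variants reported by at least `min_support` of the callers.

    Each call set is an iterable of normalized variant tuples of the
    form (chrom, pos, ref, alt); duplicates within a caller are ignored.
    """
    support = Counter(v for calls in callsets for v in set(calls))
    return {v for v, n in support.items() if n >= min_support}
```

A variant reported by two of three callers survives a min_support=2 filter, while caller-specific singletons, which are the most likely discrepancies, are dropped.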
Morschett, Holger; Freier, Lars; Rohde, Jannis; Wiechert, Wolfgang; von Lieres, Eric; Oldiges, Marco
2017-01-01
Even though microalgae-derived biodiesel has regained interest within the last decade, industrial production is still challenging for economic reasons. Besides reactor design, value chain and strain engineering, laborious and slow early-stage parameter optimization represents a major drawback. The present study introduces a framework for the accelerated development of phototrophic bioprocesses. A state-of-the-art micro-photobioreactor supported by a liquid-handling robot for automated medium preparation and product quantification was used. To take full advantage of the technology's experimental capacity, Kriging-assisted experimental design was integrated to enable highly efficient execution of screening applications. The resulting platform was used for medium optimization of a lipid production process using Chlorella vulgaris toward maximum volumetric productivity. Within only four experimental rounds, lipid production was increased approximately threefold to 212 ± 11 mg L⁻¹ d⁻¹. Besides nitrogen availability as a key parameter, magnesium, calcium and various trace elements were shown to be of crucial importance. Here, synergistic multi-parameter interactions revealed by the experimental design introduced significant further optimization potential. The integration of parallelized microscale cultivation, laboratory automation and Kriging-assisted experimental design proved to be a fruitful tool for the accelerated development of phototrophic bioprocesses. By means of the proposed technology, the targeted optimization task was conducted in a very timely and material-efficient manner.
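Kriging-assisted experimental design amounts to fitting a Gaussian-process (Kriging) surrogate to the responses measured so far and letting an acquisition rule propose the next conditions to test. The sketch below is a deliberately simplified one-dimensional analogue: the fixed squared-exponential kernel, the candidate grid, and the upper-confidence-bound acquisition are illustrative assumptions, not the study's actual design software.

```python
import numpy as np

def rbf(a, b, ls=0.15):
    # Squared-exponential (Kriging) covariance for 1-D inputs.
    return np.exp(-((a[:, None] - b[None, :]) ** 2) / (2.0 * ls ** 2))

def gp_posterior(X, y, Xs, ls=0.15, noise=1e-6):
    # Posterior mean and variance of a zero-mean GP at the candidates Xs.
    K = rbf(X, X, ls) + noise * np.eye(len(X))
    Ks = rbf(X, Xs, ls)
    mu = Ks.T @ np.linalg.solve(K, y)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks), axis=0)
    return mu, np.maximum(var, 1e-12)

def kriging_optimize(f, rounds=10, kappa=2.0):
    cand = np.linspace(0.0, 1.0, 101)   # candidate settings (e.g. nutrient level)
    X = [0.0, 0.5, 1.0]                 # small initial design
    y = [f(x) for x in X]
    seen = set(X)
    for _ in range(rounds):
        mu, var = gp_posterior(np.array(X), np.array(y), cand)
        ucb = mu + kappa * np.sqrt(var)  # optimistic acquisition
        for i in np.argsort(-ucb):       # best not-yet-evaluated candidate
            if cand[i] not in seen:
                X.append(cand[i]); y.append(f(cand[i])); seen.add(cand[i])
                break
    best = int(np.argmax(y))
    return X[best], y[best]
```

On a smooth synthetic response peaking at 0.6, a handful of sequential rounds concentrates samples around the optimum, which is the mechanism that let the study converge within four experimental rounds.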
New technologies for advanced three-dimensional optimum shape design in aeronautics
NASA Astrophysics Data System (ADS)
Dervieux, Alain; Lanteri, Stéphane; Malé, Jean-Michel; Marco, Nathalie; Rostaing-Schmidt, Nicole; Stoufflet, Bruno
1999-05-01
The analysis of complex flows around realistic aircraft geometries is becoming more and more predictive. In order to obtain this result, the complexity of flow analysis codes has been constantly increasing, involving more refined fluid models and sophisticated numerical methods. These codes can only run on top computers, exhausting their memory and CPU capabilities. It is, therefore, difficult to introduce the best analysis codes into a shape optimization loop: most previous works in the optimum shape design field used only simplified analysis codes. Moreover, as the most popular optimization methods are gradient-based, the more complex the flow solver, the more difficult it is to compute the sensitivity code. However, emerging technologies are making such an ambitious project, namely including a state-of-the-art flow analysis code in an optimization loop, feasible. Among those technologies, there are three important issues that this paper addresses: shape parametrization, automated differentiation and parallel computing. Shape parametrization allows faster optimization by reducing the number of design variables; in this work, it relies on a hierarchical multilevel approach. The sensitivity code can be obtained using automated differentiation. The automated approach is based on software manipulation tools, which allow the differentiation to be quick and the resulting differentiated code to be rather fast and reliable. In addition, the parallel algorithms implemented in this work allow the resulting optimization software to run on increasingly larger geometries.
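The principle behind automated differentiation can be shown with a minimal forward-mode sketch using dual numbers: each value carries its derivative, and the chain rule is applied operation by operation. The tools referenced in the abstract transform Fortran source to produce a sensitivity code; this small Python class is only a conceptual analogue, not one of those tools.

```python
class Dual:
    """A number together with its derivative (forward-mode AD)."""

    def __init__(self, val, dot=0.0):
        self.val, self.dot = float(val), float(dot)

    @staticmethod
    def _lift(x):
        return x if isinstance(x, Dual) else Dual(x)

    def __add__(self, other):
        o = Dual._lift(other)
        return Dual(self.val + o.val, self.dot + o.dot)

    __radd__ = __add__

    def __mul__(self, other):
        o = Dual._lift(other)  # product rule
        return Dual(self.val * o.val, self.dot * o.val + self.val * o.dot)

    __rmul__ = __mul__

    def __pow__(self, n):      # power rule for a constant exponent
        return Dual(self.val ** n, n * self.val ** (n - 1) * self.dot)

def derivative(f, x):
    # Seed the input with dot = 1 and read off df/dx.
    return f(Dual(x, 1.0)).dot
```

For f(x) = x**3 + 2*x the derivative at x = 2 is 3*4 + 2 = 14, obtained mechanically, without symbolic manipulation or finite differences; source-transformation AD achieves the same effect by emitting new code instead of propagating dual numbers at run time.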
A Droplet Microfluidic Platform for Automating Genetic Engineering.
Gach, Philip C; Shih, Steve C C; Sustarich, Jess; Keasling, Jay D; Hillson, Nathan J; Adams, Paul D; Singh, Anup K
2016-05-20
We present a water-in-oil droplet microfluidic platform for transformation, culture and expression of recombinant proteins in multiple host organisms including bacteria, yeast and fungi. The platform consists of a hybrid digital microfluidic/channel-based droplet chip with integrated temperature control to allow complete automation and integration of plasmid addition, heat-shock transformation, addition of selection medium, culture, and protein expression. The microfluidic format permitted significant reduction in consumption (100-fold) of expensive reagents such as DNA and enzymes compared to the benchtop method. The chip contains a channel to continuously replenish oil to the culture chamber to provide a fresh supply of oxygen to the cells for long-term (∼5 days) cell culture. The flow channel also replenished oil lost to evaporation and increased the number of droplets that could be processed and cultured. The platform was validated by transforming several plasmids into Escherichia coli including plasmids containing genes for fluorescent proteins GFP, BFP and RFP; plasmids with selectable markers for ampicillin or kanamycin resistance; and a Golden Gate DNA assembly reaction. We also demonstrate the applicability of this platform for transformation in widely used eukaryotic organisms such as Saccharomyces cerevisiae and Aspergillus niger. Duration and temperatures of the microfluidic heat-shock procedures were optimized to yield transformation efficiencies comparable to those obtained by benchtop methods with a throughput up to 6 droplets/min. The proposed platform offers potential for automation of molecular biology experiments significantly reducing cost, time and variability while improving throughput.
Slyter, Leonard L.
1975-01-01
An artificial rumen continuous culture system with pH control, automated input of water-soluble and water-insoluble substrates, controlled mixing of contents, and a gas collection system is described. PMID:16350029
NASA Astrophysics Data System (ADS)
Anderson, D.; Andrais, B.; Mirzayans, R.; Siegbahn, E. A.; Fallone, B. G.; Warkentin, B.
2013-06-01
Microbeam radiation therapy (MRT) delivers single fractions of very high doses of synchrotron x-rays using arrays of microbeams. In animal experiments, MRT has achieved higher tumour control and less normal tissue toxicity compared to single-fraction broad beam irradiations of much lower dose. The mechanism behind the normal tissue sparing of MRT has yet to be fully explained. An accurate method for evaluating DNA damage, such as the γ-H2AX immunofluorescence assay, will be important for understanding the role of cellular communication in the radiobiological response of normal and cancerous cell types to MRT. We compare two methods of quantifying γ-H2AX nuclear fluorescence for uniformly irradiated cell cultures: manual counting of γ-H2AX foci by eye, and an automated, MATLAB-based fluorescence intensity measurement. We also demonstrate the automated analysis of cell cultures irradiated with an array of microbeams. In addition to offering a relatively high dynamic range of γ-H2AX signal versus irradiation dose (>10 Gy), our automated method provides speed, robustness, and objectivity when examining a series of images. Our in-house analysis facilitates the automated extraction of the spatial distribution of the γ-H2AX intensity with respect to the microbeam array: for example, the intensities in the peak (high dose area) and valley (area between two microbeams) regions. The automated analysis is particularly beneficial when processing a large number of samples, as is needed to systematically study the relationship between the numerous dosimetric and geometric parameters involved with MRT (e.g., microbeam width, microbeam spacing, microbeam array dimensions, peak dose, valley dose, and geometric arrangement of multiple arrays) and the resulting DNA damage.
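The peak/valley extraction can be illustrated with a small numpy routine that collapses an image to a column profile and averages intensity inside and between the beam paths. The beam period and width arguments, and the assumption of a vertical array starting at column 0, are hypothetical parameters for illustration; the actual analysis described above was MATLAB-based.

```python
import numpy as np

def peak_valley_intensities(img, beam_period, beam_width):
    """Mean fluorescence intensity in peak (in-beam) and valley columns.

    Assumes the microbeam array is vertical, with a beam of `beam_width`
    columns repeating every `beam_period` columns starting at column 0.
    """
    profile = img.mean(axis=0)          # collapse rows to a column profile
    phases = np.arange(img.shape[1]) % beam_period
    peak = profile[phases < beam_width].mean()
    valley = profile[phases >= beam_width].mean()
    return float(peak), float(valley)
```

Applied to a synthetic striped image, the routine cleanly separates the high-dose (peak) and between-beam (valley) regions.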
Dalpke, Alexander H; Hofko, Marjeta; Zimmermann, Stefan
2016-09-01
Vancomycin-resistant enterococci (VRE) are an important cause of health care-associated infections, resulting in significant mortality and a significant economic burden in hospitals. Active surveillance for at-risk populations contributes to the prevention of infections with VRE. The availability of a combination of automation and molecular detection procedures for rapid screening would be beneficial. Here, we report on the development of a laboratory-developed PCR for detection of VRE which runs on the fully automated Becton Dickinson (BD) Max platform, which combines DNA extraction, PCR setup, and real-time PCR amplification. We evaluated two protocols: one using a liquid master mix and the other employing commercially ordered dry-down reagents. The BD Max VRE PCR was evaluated in two rounds with 86 and 61 rectal elution swab (eSwab) samples, and the results were compared to the culture results. The sensitivities of the different PCR formats were 84 to 100% for vanA and 83.7 to 100% for vanB; specificities were 96.8 to 100% for vanA and 81.8 to 97% for vanB. The use of dry-down reagents and the ExK DNA-2 kit for extraction showed that the samples were less inhibited (3.3%) than they were by the use of the liquid master mix (14.8%). Adoption of a cutoff threshold cycle of 35 for discrimination of vanB-positive samples allowed an increase of specificity to 87.9%. The performance of the BD Max VRE assay equaled that of the BD GeneOhm VanR assay, which was run in parallel. The use of dry-down reagents simplifies the assay and omits any need to handle liquid PCR reagents. Copyright © 2016, American Society for Microbiology. All Rights Reserved.
The Design and Evaluation of "CAPTools"--A Computer Aided Parallelization Toolkit
NASA Technical Reports Server (NTRS)
Yan, Jerry; Frumkin, Michael; Hribar, Michelle; Jin, Haoqiang; Waheed, Abdul; Johnson, Steve; Cross, Jark; Evans, Emyr; Ierotheou, Constantinos; Leggett, Pete;
1998-01-01
Writing applications for high performance computers is a challenging task. Although writing code by hand still offers the best performance, it is extremely costly and often not very portable. The Computer Aided Parallelization Tools (CAPTools) are a toolkit designed to help automate the mapping of sequential FORTRAN scientific applications onto multiprocessors. CAPTools consists of the following major components: an inter-procedural dependence analysis module that incorporates user knowledge; a 'self-propagating' data partitioning module driven via user guidance; an execution control mask generation and optimization module for the user to fine tune parallel processing of individual partitions; a program transformation/restructuring facility for source code clean up and optimization; a set of browsers through which the user interacts with CAPTools at each stage of the parallelization process; and a code generator supporting multiple programming paradigms on various multiprocessors. Besides describing the rationale behind the architecture of CAPTools, the parallelization process is illustrated via case studies involving structured and unstructured meshes. The programming process and the performance of the generated parallel programs are compared against other programming alternatives based on the NAS Parallel Benchmarks, ARC3D and other scientific applications. Based on these results, a discussion on the feasibility of constructing architectural independent parallel applications is presented.
Rafiq, Qasim A; Hanga, Mariana P; Heathman, Thomas R J; Coopman, Karen; Nienow, Alvin W; Williams, David J; Hewitt, Christopher J
2017-10-01
Microbioreactors play a critical role in process development as they reduce reagent requirements and can facilitate high-throughput screening of process parameters and culture conditions. Here, we have demonstrated and explained in detail, for the first time, the amenability of the automated ambr15 cell culture microbioreactor system for the development of scalable adherent human mesenchymal multipotent stromal/stem cell (hMSC) microcarrier culture processes. This was achieved by first improving suspension and mixing of the microcarriers and then improving cell attachment thereby reducing the initial growth lag phase. The latter was achieved by using only 50% of the final working volume of medium for the first 24 h and using an intermittent agitation strategy. These changes resulted in >150% increase in viable cell density after 24 h compared to the original process (no agitation for 24 h and 100% working volume). Using the same methodology as in the ambr15, similar improvements were obtained with larger scale spinner flask studies. Finally, this improved bioprocess methodology based on a serum-based medium was applied to a serum-free process in the ambr15, resulting in >250% increase in yield compared to the serum-based process. At both scales, the agitation used during culture was the minimum required for microcarrier suspension, NJS. The use of the ambr15, with its improved control compared to the spinner flask, reduced the coefficient of variation on viable cell density in the serum-containing medium from 7.65% to 4.08%, and the switch to serum free further reduced these to 1.06% and 0.54%, respectively. The combination of both serum-free and automated processing improved the reproducibility more than 10-fold compared to the serum-based, manual spinner flask process.
The findings of this study demonstrate that the ambr15 microbioreactor is an effective tool for bioprocess development of hMSC microcarrier cultures and that a combination of serum-free medium, control, and automation improves both process yield and consistency. Biotechnol. Bioeng. 2017;114: 2253-2266. © 2017 Wiley Periodicals, Inc.
Slepoy, A; Peters, M D; Thompson, A P
2007-11-30
Molecular dynamics and other molecular simulation methods rely on a potential energy function based only on the relative coordinates of the atomic nuclei. Such a function, called a force field, approximately represents the electronic structure interactions of a condensed matter system. Developing such approximate functions and fitting their parameters remains an arduous, time-consuming process, relying on expert physical intuition. To address this problem, a functional programming methodology was developed that may enable automated discovery of entirely new force-field functional forms, while simultaneously fitting parameter values. The method uses a combination of genetic programming, Metropolis Monte Carlo importance sampling and parallel tempering to efficiently search a large space of candidate functional forms and parameters. The methodology was tested using a nontrivial problem with a well-defined globally optimal solution: a small set of atomic configurations was generated and the energy of each configuration was calculated using the Lennard-Jones pair potential. Starting with a population of random functions, our fully automated, massively parallel implementation of the method reproducibly discovered the original Lennard-Jones pair potential by searching for several hours on 100 processors, sampling only a minuscule portion of the total search space. This result indicates that, with further improvement, the method may be suitable for unsupervised development of more accurate force fields with completely new functional forms. Copyright (c) 2007 Wiley Periodicals, Inc.
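The search described above couples genetic programming over functional forms with Metropolis Monte Carlo sampling of parameters. The sketch below shows only the parameter-sampling half on a fixed functional form: a Metropolis random walk that fits the two Lennard-Jones parameters to reference energies. The step size, temperature, and starting point are illustrative choices, not values from the paper.

```python
import numpy as np

def lj(r, eps, sig):
    # Lennard-Jones pair potential: 4*eps*((sig/r)**12 - (sig/r)**6)
    sr6 = (sig / r) ** 6
    return 4.0 * eps * (sr6 ** 2 - sr6)

def fit_lj_metropolis(r, e_ref, steps=20000, temp=0.01, step=0.02, seed=0):
    rng = np.random.default_rng(seed)
    params = np.array([0.5, 0.8])                 # initial (eps, sig) guess
    cost = lambda p: float(np.mean((lj(r, *p) - e_ref) ** 2))
    c = cost(params)
    best_p, best_c = params.copy(), c
    for _ in range(steps):
        prop = params + rng.normal(0.0, step, size=2)
        if prop.min() <= 0.0:                     # keep parameters physical
            continue
        cp = cost(prop)
        # Metropolis rule: always accept downhill, occasionally uphill
        if cp < c or rng.random() < np.exp(-(cp - c) / temp):
            params, c = prop, cp
            if c < best_c:
                best_p, best_c = params.copy(), c
    return best_p, best_c
```

Fitting synthetic energies generated with eps = sig = 1 recovers the generating parameters; the full method additionally mutates the functional form itself and uses parallel tempering across temperatures.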
Neural simulations on multi-core architectures.
Eichner, Hubert; Klug, Tobias; Borst, Alexander
2009-01-01
Neuroscience is witnessing increasing knowledge about the anatomy and electrophysiological properties of neurons and their connectivity, leading to an ever increasing computational complexity of neural simulations. At the same time, a rather radical change in personal computer technology is emerging with the establishment of multi-cores: high-density, explicitly parallel processor architectures for both high-performance and standard desktop computers. This work introduces strategies for the parallelization of biophysically realistic neural simulations based on the compartmental modeling technique and presents results of such an implementation, with a strong focus on multi-core architectures and automation, i.e., user-transparent load balancing.
Miller, Christopher A; Parasuraman, Raja
2007-02-01
This work develops a method enabling human-like, flexible supervisory control via delegation to automation. Real-time supervisory relationships with automation are rarely as flexible as human task delegation to other humans. Flexibility in human-adaptable automation can provide important benefits, including improved situation awareness, more accurate automation usage, more balanced mental workload, increased user acceptance, and improved overall performance. We review problems with static and adaptive (as opposed to "adaptable") automation; contrast these approaches with human-human task delegation, which can mitigate many of the problems; and revise the concept of a "level of automation" as a pattern of task-based roles and authorizations. We argue that delegation requires a shared hierarchical task model between supervisor and subordinates, used to delegate tasks at various levels, and offer instruction on performing them. A prototype implementation called Playbook is described. On the basis of these analyses, we propose methods for supporting human-machine delegation interactions that parallel human-human delegation in important respects. We develop an architecture for machine-based delegation systems based on the metaphor of a sports team's "playbook." Finally, we describe a prototype implementation of this architecture, with an accompanying user interface and usage scenario, for mission planning for uninhabited air vehicles. Delegation offers a viable method for flexible, multilevel human-automation interaction to enhance system performance while maintaining user workload at a manageable level. Most applications of adaptive automation (aviation, air traffic control, robotics, process control, etc.) are potential avenues for the adaptable, delegation approach we advocate. We present an extended example for uninhabited air vehicle mission planning.
Schulz, Julia C; Stumpf, Patrick S; Katsen-Globa, Alisa; Sachinidis, Agapios; Hescheler, Jürgen; Zimmermann, Heiko
2012-11-01
Miniaturization and parallelization of cell culture procedures are a focus of research aimed at developing test platforms with low material consumption and increased standardization for toxicity and drug screenings. Cultivation in hanging drops (HDs) is a convenient and versatile tool for biological applications and represents an interesting model system for screening applications due to its uniform shape, advantageous gas supply, and small volume. However, its application has so far been limited to non-adherent and aggregate-forming cells. Here, we describe for the first time a proof of principle for the adherent cultivation of human embryonic stem cells in HDs. For this, microcarriers were added to the droplet as dynamic cultivation surfaces, with pluripotency and proliferation capacity maintained for 10 days. This extends the HD technique to the cultivation of adherence-dependent stem cells. Also, the possible automation of this method by implementation of liquid handling systems opens new possibilities for miniaturized screenings, the improvement of cultivation and differentiation conditions, and toxicity and drug development.
Gastens, Martin H; Goltry, Kristin; Prohaska, Wolfgang; Tschöpe, Diethelm; Stratmann, Bernd; Lammers, Dirk; Kirana, Stanley; Götting, Christian; Kleesiek, Knut
2007-01-01
Ex vivo expansion is being used to increase the number of stem and progenitor cells for autologous cell therapy. Initiation of pivotal clinical trials testing the efficacy of these cells for tissue repair has been hampered by the challenge of assuring safe and high-quality cell production. A strategy is described here for clinical-scale expansion of bone marrow (BM)-derived stem cells within a mixed cell population in a completely closed process from cell collection through postculture processing using sterile connectable devices. Human BM mononuclear cells (BMMNC) were isolated, cultured for 12 days, and washed postharvest using either standard open procedures in laminar flow hoods or using automated closed systems. Conditions for these studies were similar to long-term BM cultures in which hematopoietic and stromal components are cultured together. Expansion of marrow-derived stem and progenitor cells was then assessed. Cell yield, number of colony forming units (CFU), phenotype, stability, and multilineage differentiation capacity were compared from the single-pass perfusion bioreactor and standard flask cultures. Purification of BMMNC using a closed Ficoll gradient process led to depletion of 98% erythrocytes and 87% granulocytes, compared to 100% and 70%, respectively, for manual processing. After closed system culture, mesenchymal progenitors, measured as CD105+CD166+CD14-CD45- and fibroblastic CFU, expanded 317- and 364-fold, respectively, while CD34+ hematopoietic progenitors were depleted 10-fold compared to starting BMMNC. Cultured cells exhibited multilineage differentiation by displaying adipogenic, osteogenic, and endothelial characteristics in vitro. No significant difference was observed between manual and bioreactor cultures. Automated culture and washing of the cell product resulted in 181 × 10⁶ total cells that were viable and contained fibroblastic CFU for at least 24 h of storage.
A combination of closed, automated technologies enabled production of good manufacturing practice (GMP)-compliant cell therapeutics, ready for use within a clinical setting, with minimal risk of microbial contamination.
Parallel computation of level set method for 500 Hz visual servo control
NASA Astrophysics Data System (ADS)
Fei, Xianfeng; Igarashi, Yasunobu; Hashimoto, Koichi
2008-11-01
We propose a 2D microorganism tracking system using a parallel level set method and a column parallel vision system (CPV). This system keeps a single microorganism in the middle of the visual field under a microscope by visual servoing of an automated stage. We propose a new energy function for the level set method that constrains the amount of light intensity inside the detected object contour in order to control the number of detected objects. This algorithm is implemented in the CPV system, and the computation time per frame is approximately 2 ms. A tracking experiment of about 25 s is demonstrated. We also demonstrate that tracking of a single paramecium is maintained even if other paramecia appear in the visual field and contact the tracked paramecium.
Program Correctness, Verification and Testing for Exascale (Corvette)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sen, Koushik; Iancu, Costin; Demmel, James W
The goal of this project is to provide tools to assess the correctness of parallel programs written using hybrid parallelism. There is a dire lack of both theoretical and engineering know-how in the area of finding bugs in hybrid or large scale parallel programs, which our research aims to change. In the project we have demonstrated novel approaches in several areas: 1. Low overhead automated and precise detection of concurrency bugs at scale. 2. Using low overhead bug detection tools to guide speculative program transformations for performance. 3. Techniques to reduce the concurrency required to reproduce a bug using partial program restart/replay. 4. Techniques to provide reproducible execution of floating point programs. 5. Techniques for tuning the floating point precision used in codes.
An automated perfusion bioreactor for the streamlined production of engineered osteogenic grafts.
Ding, Ming; Henriksen, Susan S; Wendt, David; Overgaard, Søren
2016-04-01
A computer-controlled perfusion bioreactor was developed for the streamlined production of engineered osteogenic grafts. This system automated the required bioprocesses, from the initial filling of the system through the phases of cell seeding and prolonged cell/tissue culture. Flow-through chemo-optic micro-sensors allowed non-invasive monitoring of the levels of oxygen and pH in the perfused culture medium throughout the culture period. To validate its performance, freshly isolated ovine bone marrow stromal cells were directly seeded on porous scaffold granules (hydroxyapatite/β-tricalcium-phosphate/poly-lactic acid), bypassing the phase of monolayer cell expansion in flasks. After either 10 or 20 days of culture, engineered cell-granule grafts were implanted in an ectopic mouse model to quantify new bone formation. After four weeks of implantation, histomorphometry showed more bone in bioreactor-generated grafts than in cell-free granule controls, while bone formation did not differ significantly between 10 and 20 days of incubation. The implanted granules without cells showed no bone formation. This novel perfusion bioreactor thus proved capable of producing viable bone graft material, even after the shorter incubation time. This study has demonstrated the feasibility of engineering osteogenic grafts in an automated bioreactor system, laying the foundation for a safe, regulatory-compliant, and cost-effective manufacturing process. © 2015 Wiley Periodicals, Inc.
Rapid System to Quantitatively Characterize the Airborne Microbial Community
NASA Technical Reports Server (NTRS)
Macnaughton, Sarah J.
1998-01-01
Bioaerosols have been linked to a wide range of different allergies and respiratory illnesses. Currently, microorganism culture is the most commonly used method for exposure assessment. Such culture techniques, however, generally fail to detect between 90-99% of the actual viable biomass. Consequently, an unbiased technique for detecting airborne microorganisms is essential. In this Phase II proposal, a portable air sampling device has been developed for the collection of airborne microbial biomass from indoor (and outdoor) environments. Methods were evaluated for extracting and identifying lipids that provide information on indoor air microbial biomass, and automation of these procedures was investigated. Also, techniques to automate the extraction of DNA were explored.
NASA Technical Reports Server (NTRS)
Kleis, Stanley J.; Truong, Tuan; Goodwin, Thomas J.
2004-01-01
This report is a documentation of a fluid dynamic analysis of the proposed Automated Static Culture System (ASCS) cell module mixing protocol. The report consists of a review of some basic fluid dynamics principles appropriate for the mixing of a patch of high oxygen content media into the surrounding media which is initially depleted of oxygen, followed by a computational fluid dynamics (CFD) study of this process for the proposed protocol over a range of the governing parameters. The time histories of oxygen concentration distributions and mechanical shear levels generated are used to characterize the mixing process for different parameter values.
Vision Marker-Based In Situ Examination of Bacterial Growth in Liquid Culture Media.
Kim, Kyukwang; Choi, Duckyu; Lim, Hwijoon; Kim, Hyeongkeun; Jeon, Jessie S
2016-12-18
The detection of bacterial growth in liquid media is an essential process in determining antibiotic susceptibility or the level of bacterial presence for clinical or research purposes. We have developed a system that enables simplified and automated detection using a camera and a striped pattern marker. The quantification of bacterial growth is possible because bacterial growth in the culturing vessel blurs the marker image, which is placed on the back of the vessel, and the blurring results in a decrease in the high-frequency spectrum region of the marker image. The experimental results show that the FFT (fast Fourier transform)-based growth detection method is robust to variations in the type of bacterial carrier and vessel, ranging from culture tubes to microfluidic devices. Moreover, an automated incubator and image acquisition system were developed to serve as a comprehensive in situ detection system. We expect that this result can be applied in the automation of biological experiments, such as antibiotic susceptibility testing or toxicity measurement. Furthermore, the simple framework of the proposed growth measurement method may be further utilized as an effective and convenient method for building point-of-care devices for developing countries.
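The FFT-based detection principle described above can be sketched as follows: turbidity-induced blur suppresses the high-frequency content of the striped marker image, so the fraction of spectral energy beyond a radial cutoff serves as a growth signal. This is a minimal illustration; the function names, the 0.25 cutoff ratio, and the box-blur stand-in for bacterial turbidity are assumptions, not the paper's implementation.

```python
import numpy as np

def high_freq_energy(image, cutoff_ratio=0.25):
    """Fraction of 2D-FFT spectral energy above a radial cutoff.

    A sharp striped marker concentrates energy in high-frequency bands;
    blur suppresses them, so this score drops as bacteria grow.
    `cutoff_ratio` (an assumed default) sets the excluded low-frequency
    radius as a fraction of the smaller image dimension.
    """
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image)))
    h, w = image.shape
    cy, cx = h // 2, w // 2
    yy, xx = np.ogrid[:h, :w]
    radius = np.hypot(yy - cy, xx - cx)
    cutoff = cutoff_ratio * min(h, w)
    return spectrum[radius > cutoff].sum() / spectrum.sum()

def box_blur(image, k=5):
    """Simple moving-average blur standing in for bacterial turbidity."""
    out = image.astype(float)
    kernel = np.ones(k) / k
    for axis in (0, 1):
        out = np.apply_along_axis(
            lambda m: np.convolve(m, kernel, mode="same"), axis, out)
    return out
```

A sharp stripe pattern scores higher than its blurred version, mirroring the paper's observation that growth lowers the high-frequency spectrum of the marker.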
A Semi-Automated Point Cloud Processing Methodology for 3D Cultural Heritage Documentation
NASA Astrophysics Data System (ADS)
Kıvılcım, C. Ö.; Duran, Z.
2016-06-01
The preliminary phase in any architectural heritage project is to obtain metric measurements and documentation of the building and its individual elements. Conventional measurement techniques, however, require tremendous resources and lengthy project completion times for architectural surveys and 3D model production. Over the past two decades, the widespread use of laser scanning and digital photogrammetry has significantly altered the heritage documentation process. Furthermore, advances in these technologies have enabled robust data collection and reduced user workload for generating various levels of products, from single buildings to expansive cityscapes. More recently, the use of procedural modelling methods and BIM-related applications for historic building documentation has become an active area of research; however, fully automated cultural heritage documentation remains an open problem. In this paper, we present a semi-automated methodology for 3D façade modelling of cultural heritage assets, based on parametric and procedural modelling techniques and using airborne and terrestrial laser scanning data. We demonstrate the contribution of our methodology, implemented in an open-source software environment, on an example project: Sinan the Architect's Şehzade Mosque in Istanbul, Turkey, a 16th-century early classical era Ottoman structure.
Relevance of Piagetian cross-cultural psychology to the humanities and social sciences.
Oesterdiekhoff, Georg W
2013-01-01
Jean Piaget held views according to which there are parallels between ontogeny and the historical development of culture, sciences, and reason. His books are full of remarks and considerations about these parallels, with reference to many logical, physical, social, and moral phenomena. This article explains that Piagetian cross-cultural psychology has delivered the decisive data needed to extend Piaget's research interests. These data provide a basis for reconstructing not only the history of the sciences but also the history of religion, politics, morals, culture, and philosophy, as well as social change and the emergence of industrial society. Thus, it is possible to develop Piagetian theory as a historical anthropology that can serve as a basis for the humanities and social sciences.
Domotics Project Housing Block.
Morón, Carlos; Payán, Alejandro; García, Alfonso; Bosquet, Francisco
2016-05-23
This document develops the study of an implementation project for a home automation system in a house located in the town of Galapagar, Madrid. The house, which will be occupied by a four-member family, consists of 67 constructed square meters distributed among a lounge, kitchen, three bedrooms, bath, bathroom, and terrace, a common arrangement in Spain. The study therefore allows conclusions to be drawn about the suitability of home automation for a wide percentage of housing in Spain. Three home automation proposals are developed, based on the requirements of the client and the different home automation levels established by the Spanish House and Building Automation Association, along with two parallel proposals relating to safety and technical alarms. The proposed systems are described by means of product datasheets and descriptions, distribution plans, measurements, budgets, and flow charts that describe the functioning of the system in each case. An evaluation of each system is included, based on the conclusions of other studies on this matter, estimating the expected energy savings from each design, depending on the current cost of lighting, water, and gas, as well as the expected economic amortization period.
Advanced imaging techniques for the study of plant growth and development.
Sozzani, Rosangela; Busch, Wolfgang; Spalding, Edgar P; Benfey, Philip N
2014-05-01
A variety of imaging methodologies are being used to collect data for quantitative studies of plant growth and development from living plants. Multi-level data, from macroscopic to molecular and from weeks to seconds, can be acquired. Furthermore, advances in parallelized and automated image acquisition provide the throughput to capture images from large populations of plants under specific growth conditions. Image-processing capabilities allow for 3D or 4D reconstruction of image data and automated quantification of biological features. These advances facilitate the integration of imaging data with genome-wide molecular data to enable systems-level modeling. Copyright © 2013 Elsevier Ltd. All rights reserved.
Jlalia, Ibtissem; Beauvineau, Claire; Beauvière, Sophie; Onen, Esra; Aufort, Marie; Beauvineau, Aymeric; Khaba, Eihab; Herscovici, Jean; Meganem, Faouzi; Girard, Christian
2010-04-28
This article deals with the parallel synthesis of a 96-product library using a polymer-supported copper catalyst that we developed, which can easily be separated from the products by simple filtration. This gave us the opportunity to use the catalyst in an automated chemical synthesis station (Chemspeed ASW-2000). Studies and results on the preparation of the catalyst, its use in different solvent systems, its recycling capabilities, and its scope and limitations in the synthesis of this library are addressed. The synthesis of the triazole library and the very good results obtained are finally discussed.
Left ventricular pressure and volume data acquisition and analysis using LabVIEW.
Cassidy, S C; Teitel, D F
1997-03-01
To automate analysis of left ventricular pressure-volume data, we used LabVIEW to create applications that digitize and display data recorded from conductance and manometric catheters. The applications separate data into cardiac cycles, calculate parallel conductance, and calculate indices of left ventricular function, including end-systolic elastance, preload-recruitable stroke work, stroke volume, ejection fraction, stroke work, maximum and minimum derivative of ventricular pressure, heart rate, indices of relaxation, peak filling rate, and ventricular chamber stiffness. Pressure-volume loops can be graphically displayed, and the analyses are exported to a text file. These applications have simplified and automated the process of evaluating ventricular function.
Lu, Emily; Elizondo-Riojas, Miguel-Angel; Chang, Jeffrey T; Volk, David E
2014-06-10
Next-generation sequencing results from bead-based aptamer libraries have demonstrated that traditional DNA/RNA alignment software is insufficient. This is particularly true for X-aptamers containing specialty bases (W, X, Y, Z, ...) that are identified by special encoding. Thus, we sought an automated program that uses the inherent design scheme of bead-based X-aptamers to create a hypothetical reference library and Markov modeling techniques to provide improved alignments. Aptaligner provides this feature as well as length error and noise level cutoff features, is parallelized to run on multiple central processing units (cores), and sorts sequences from a single chip into projects and subprojects.
Ruiz, P.; Zerolo, F. J.; Casal, M. J.
2000-01-01
The ESP Culture System II was evaluated for its capacity to test the susceptibility of 389 cultures of Mycobacterium tuberculosis to streptomycin, rifampin, ethambutol, and isoniazid. Good agreement with results from the BACTEC TB 460 was found. ESP II is a reliable, rapid, and automated method for performing susceptibility testing. PMID:11101619
Dai, Qingkai; Jiang, Yongmei; Shi, Hua; Zhou, Wei; Zhou, Shengjie; Yang, Hui
2014-01-01
Urinary tract infection (UTI) is a widespread disease in women. Urine culture is still the "gold standard" diagnostic test for UTI, but most cultures are negative. To reduce unnecessary cultures, we evaluated the automated urine particle analyzer UF-1000i for UTI screening in nonpregnant women. Urine specimens submitted to our laboratory were cultured and tested with the Sysmex UF-1000i. Bacteria and white blood cell (WBC) counts were compared to standard urine culture results to determine the best cutoff values. In this study, 272 urine samples were included, of which 98 (36.0%) were culture positive at a bacterial cutoff value of 10 × 10⁵ CFU/mL. A combination of bacterial count (> 95/μL) and/or WBC count (> 24/μL) provided the best screening for UTI, with a sensitivity of 0.99 and a specificity of 0.82 compared with urine culture. The Sysmex UF-1000i could be used as a screening test for UTI in nonpregnant women. According to the distribution and range of the bacterial scattergram, we could also preliminarily identify and differentiate between Gram-negative and Gram-positive bacteria.
Automated Instrumentation, Monitoring and Visualization of PVM Programs Using AIMS
NASA Technical Reports Server (NTRS)
Mehra, Pankaj; VanVoorst, Brian; Yan, Jerry; Lum, Henry, Jr. (Technical Monitor)
1994-01-01
We present views and analysis of the execution of several PVM (Parallel Virtual Machine) codes for Computational Fluid Dynamics on a network of Sparcstations, including: (1) NAS Parallel Benchmarks CG and MG; (2) a multi-partitioning algorithm for NAS Parallel Benchmark SP; and (3) an overset grid flowsolver. These views and analysis were obtained using our Automated Instrumentation and Monitoring System (AIMS) version 3.0, a toolkit for debugging the performance of PVM programs. We describe the architecture, operation, and application of AIMS. The AIMS toolkit contains: (1) Xinstrument, which can automatically instrument various computational and communication constructs in message-passing parallel programs; (2) Monitor, a library of runtime trace-collection routines; (3) VK (Visual Kernel), an execution-animation tool with source-code clickback; and (4) Tally, a tool for statistical analysis of execution profiles. Currently, Xinstrument can handle C and Fortran 77 programs using PVM 3.2.x; Monitor has been implemented and tested on Sun 4 systems running SunOS 4.1.2; and VK uses X11R5 and Motif 1.2. Data and views obtained using AIMS clearly illustrate several characteristic features of executing parallel programs on networked workstations: (1) the impact of long message latencies; (2) the impact of multiprogramming overheads and associated load imbalance; (3) cache and virtual-memory effects; and (4) significant skews between workstation clocks. Interestingly, AIMS can compensate for constant skew (zero drift) by calibrating the skew between a parent and its spawned children. In addition, AIMS' skew-compensation algorithm can adjust timestamps in a way that eliminates physically impossible communications (e.g., messages going backwards in time). Our current efforts are directed toward creating new views to explain the observed performance of PVM programs. Some of the features planned for the near future include: (1) ConfigView, showing the physical topology of the virtual machine, inferred using specially formatted IP (Internet Protocol) packets; and (2) LoadView, synchronous animation of PVM-program execution and resource-utilization patterns.
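The constant-skew compensation idea can be illustrated with a small sketch: given (send, receive) timestamp pairs logged on two skewed workstation clocks, shift the receiver's clock by the smallest constant offset that eliminates messages apparently travelling backwards in time. This is a simplified stand-in for the AIMS calibration between a parent and its spawned children, not the tool's actual algorithm; all names are illustrative.

```python
def skew_offset(messages):
    """Smallest constant offset to add to the receiver's clock so that
    no message appears to arrive before it was sent.

    messages : iterable of (send_ts, recv_ts) pairs, each timestamp
    taken on its own (possibly skewed) workstation clock.
    """
    return max(0.0, max(send - recv for send, recv in messages))

def compensate(messages):
    """Return the messages with receiver timestamps shifted so every
    latency is non-negative (physically possible communication)."""
    delta = skew_offset(messages)
    return [(send, recv + delta) for send, recv in messages]
```

With a zero-drift (constant) skew this single offset suffices; drifting clocks would require a per-interval calibration instead.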
Can routine automated urinalysis reduce culture requests?
Kayalp, Damla; Dogan, Kubra; Ceylan, Gozde; Senes, Mehmet; Yucel, Dogan
2013-09-01
There are a substantial number of unnecessary urine culture requests. We aimed to investigate whether urine dipstick and microscopy results could accurately rule out urinary tract infection (UTI) without urine culture. The study included a total of 32,998 patients (11,928 men and 21,070 women; mean age 39 ± 32 years) with a preliminary diagnosis of UTI for whom both urinalysis and urine culture were requested. All urine cultures were retrospectively reviewed; the association of culture positivity with a positive urinalysis result for leukocyte esterase (LE) and nitrite in chemical analysis and for pyuria (WBC) and bacteriuria in microscopy was determined. The diagnostic performance of urinalysis parameters for detection of UTI was evaluated. In total, 758 (2.3%) patients were positive by urine culture. Of these culture-positive samples, the ratios of positive dipstick results for LE and nitrite were 71.0% (n=538) and 17.7% (n=134), respectively. The positive microscopy results for WBC and bacteria were 68.2% (n=517) and 78.8% (n=597), respectively. Negative predictive values for LE, nitrite, bacteriuria, and WBC were very close to 100%. Most of the samples had no or insignificant bacterial growth. Urine dipstick and microscopy can accurately rule out UTI. Automated urinalysis is a practicable and faster screening test which may prevent unnecessary culture requests for the majority of patients. © 2013. Published by Elsevier Inc. All rights reserved.
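The rule-out argument above rests on standard 2×2 diagnostic-test metrics; a small helper makes the definitions explicit. The counts in the usage example are illustrative, not the study's raw table.

```python
def screening_metrics(tp, fp, fn, tn):
    """Standard diagnostic-test metrics from a 2x2 table.

    tp/fp/fn/tn are counts of true/false positives/negatives of the
    screening test against the reference standard (urine culture).
    """
    return {
        "sensitivity": tp / (tp + fn),  # fraction of true UTIs flagged
        "specificity": tn / (tn + fp),  # fraction of non-UTIs cleared
        "ppv": tp / (tp + fp),          # P(UTI | positive screen)
        "npv": tn / (tn + fn),          # P(no UTI | negative screen)
    }
```

With low disease prevalence, even a modestly sensitive screen yields an NPV near 100%, which is exactly why a negative dipstick/microscopy result can safely cancel the culture request.
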
Automated Algorithms for Quantum-Level Accuracy in Atomistic Simulations: LDRD Final Report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thompson, Aidan Patrick; Schultz, Peter Andrew; Crozier, Paul
2014-09-01
This report summarizes the result of LDRD project 12-0395, titled "Automated Algorithms for Quantum-level Accuracy in Atomistic Simulations." During the course of this LDRD, we have developed an interatomic potential for solids and liquids called the Spectral Neighbor Analysis Potential (SNAP). The SNAP potential has a very general form and uses machine-learning techniques to reproduce the energies, forces, and stress tensors of a large set of small configurations of atoms, which are obtained using high-accuracy quantum electronic structure (QM) calculations. The local environment of each atom is characterized by a set of bispectrum components of the local neighbor density projected onto a basis of hyperspherical harmonics in four dimensions. The SNAP coefficients are determined using weighted least-squares linear regression against the full QM training set. This allows the SNAP potential to be fit in a robust, automated manner to large QM data sets using many bispectrum components. The calculation of the bispectrum components and the SNAP potential is implemented in the LAMMPS parallel molecular dynamics code. Global optimization methods in the DAKOTA software package are used to seek out good choices of hyperparameters that define the overall structure of the SNAP potential. FitSnap.py, a Python-based software package interfacing to both LAMMPS and DAKOTA, is used to formulate the linear regression problem, solve it, and analyze the accuracy of the resultant SNAP potential. We describe a SNAP potential for tantalum that accurately reproduces a variety of solid and liquid properties. Most significantly, in contrast to existing tantalum potentials, SNAP correctly predicts the Peierls barrier for screw dislocation motion. We also present results from SNAP potentials generated for indium phosphide (InP) and silica (SiO2). We describe efficient algorithms for calculating SNAP forces and energies in molecular dynamics simulations using massively parallel computers and advanced processor architectures. Finally, we briefly describe the MSM method for efficient calculation of electrostatic interactions on massively parallel computers.
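The weighted least-squares step at the heart of a SNAP-style fit can be sketched generically: stack per-configuration descriptors into a design matrix and regress against QM reference values. This is a schematic of the regression only; descriptor generation and the FitSnap.py/DAKOTA machinery are out of scope, and the function and parameter names are illustrative.

```python
import numpy as np

def fit_linear_potential(descriptors, targets, weights):
    """Weighted least-squares fit of linear coefficients, as in a
    SNAP-style potential where E_config ~ descriptors @ beta.

    descriptors : (n_configs, n_terms) bispectrum-like features
    targets     : (n_configs,) QM reference energies
    weights     : (n_configs,) per-configuration fitting weights
    """
    w = np.sqrt(weights)                       # fold weights into rows
    beta, *_ = np.linalg.lstsq(descriptors * w[:, None],
                               targets * w, rcond=None)
    return beta
```

In practice the design matrix would also stack force and stress rows (with their own weights), which is what makes the single linear solve reproduce several QM quantities at once.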
Parallel Density-Based Clustering for Discovery of Ionospheric Phenomena
NASA Astrophysics Data System (ADS)
Pankratius, V.; Gowanlock, M.; Blair, D. M.
2015-12-01
Ionospheric total electron content maps derived from global networks of dual-frequency GPS receivers can reveal a plethora of ionospheric features in real-time and are key to space weather studies and natural hazard monitoring. However, growing data volumes from expanding sensor networks are making manual exploratory studies challenging. As the community is heading towards Big Data ionospheric science, automation and Computer-Aided Discovery become indispensable tools for scientists. One problem of machine learning methods is that they require domain-specific adaptations in order to be effective and useful for scientists. Addressing this problem, our Computer-Aided Discovery approach allows scientists to express various physical models as well as perturbation ranges for parameters. The search space is explored through an automated system and parallel processing of batched workloads, which finds corresponding matches and similarities in empirical data. We discuss density-based clustering as a particular method we employ in this process. Specifically, we adapt Density-Based Spatial Clustering of Applications with Noise (DBSCAN). This algorithm groups geospatial data points based on density. Clusters of points can be of arbitrary shape, and the number of clusters is not predetermined by the algorithm; only two input parameters need to be specified: (1) a distance threshold, (2) a minimum number of points within that threshold. We discuss an implementation of DBSCAN for batched workloads that is amenable to parallelization on manycore architectures such as Intel's Xeon Phi accelerator with 60+ general-purpose cores. This manycore parallelization can cluster large volumes of ionospheric total electronic content data quickly. Potential applications for cluster detection include the visualization, tracing, and examination of traveling ionospheric disturbances or other propagating phenomena. Acknowledgments. We acknowledge support from NSF ACI-1442997 (PI V. Pankratius).
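For reference, the serial DBSCAN logic that the manycore implementation parallelizes can be written compactly. This is a textbook O(n²) sketch using Euclidean distance (an approximation for geospatial coordinates), not the authors' Xeon Phi code.

```python
import numpy as np

def dbscan(points, eps, min_pts):
    """Textbook DBSCAN: returns labels where labels[i] is a cluster id
    (0, 1, ...) or -1 for noise.

    points  : (n, d) array of samples
    eps     : distance threshold
    min_pts : minimum neighbors (including the point itself) for a core point
    """
    n = len(points)
    # Pairwise distances; O(n^2) memory, acceptable for a sketch.
    dists = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    neighbors = [np.flatnonzero(dists[i] <= eps) for i in range(n)]
    labels = np.full(n, -1)
    visited = np.zeros(n, dtype=bool)
    cluster = 0
    for i in range(n):
        if visited[i] or len(neighbors[i]) < min_pts:
            continue
        stack = [i]                     # grow a new cluster from core point i
        while stack:
            j = stack.pop()
            if visited[j]:
                continue
            visited[j] = True
            labels[j] = cluster
            if len(neighbors[j]) >= min_pts:   # core point: keep expanding
                stack.extend(neighbors[j])
        cluster += 1
    return labels
```

Because each point's neighborhood query is independent, the distance/neighbor computation is the part that maps naturally onto 60+ cores; the cluster-growing pass above is the harder part to parallelize.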
Cloud parallel processing of tandem mass spectrometry based proteomics data.
Mohammed, Yassene; Mostovenko, Ekaterina; Henneman, Alex A; Marissen, Rob J; Deelder, André M; Palmblad, Magnus
2012-10-05
Data analysis in mass spectrometry based proteomics struggles to keep pace with the advances in instrumentation and the increasing rate of data acquisition. Analyzing this data involves multiple steps requiring diverse software, using different algorithms and data formats. Speed and performance of the mass spectral search engines are continuously improving, although not necessarily as needed to face the challenges of acquired big data. Improving and parallelizing the search algorithms is one possibility; data decomposition presents another, simpler strategy for introducing parallelism. We describe a general method for parallelizing identification of tandem mass spectra using data decomposition that keeps the search engine intact and wraps the parallelization around it. We introduce two algorithms for decomposing mzXML files and recomposing resulting pepXML files. This makes the approach applicable to different search engines, including those relying on sequence databases and those searching spectral libraries. We use cloud computing to deliver the computational power and scientific workflow engines to interface and automate the different processing steps. We show how to leverage these technologies to achieve faster data analysis in proteomics and present three scientific workflows for parallel database as well as spectral library search using our data decomposition programs, X!Tandem and SpectraST.
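The decompose-search-recompose pattern can be sketched as follows. The `search_fn` stands in for an external engine such as X!Tandem, and threads are used only to keep the sketch self-contained; CPU-bound searches would run as separate processes or cloud jobs, and real inputs would be mzXML slices rather than a Python list.

```python
from concurrent.futures import ThreadPoolExecutor

def chunk(items, n_chunks):
    """Split a list into n_chunks near-equal contiguous slices."""
    k, m = divmod(len(items), n_chunks)
    out, start = [], 0
    for i in range(n_chunks):
        end = start + k + (1 if i < m else 0)
        out.append(items[start:end])
        start = end
    return out

def parallel_identify(spectra, search_fn, n_workers=4):
    """Decompose the spectrum list, run the (unchanged) search function
    on each part concurrently, then concatenate results in order --
    the search engine itself is treated as a black box."""
    parts = chunk(spectra, n_workers)
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        results = pool.map(lambda part: [search_fn(s) for s in part], parts)
    return [hit for part in results for hit in part]
```

Keeping the engine intact and wrapping parallelism around the data is what makes the approach engine-agnostic: the same split/merge wrapper works for database and spectral-library search alike.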
Run-time parallelization and scheduling of loops
NASA Technical Reports Server (NTRS)
Saltz, Joel H.; Mirchandaney, Ravi; Baxter, Doug
1988-01-01
The class of problems that can be effectively compiled by parallelizing compilers is discussed. This is accomplished with the doconsider construct, which would allow these compilers to parallelize many problems in which substantial loop-level parallelism is available but cannot be detected by standard compile-time analysis. We describe and experimentally analyze mechanisms used to parallelize the work required for these types of loops. In each of these methods, a new loop structure is produced by modifying the loop to be parallelized. We also present the rules by which these loop transformations may be automated so that they can be included in language compilers. The main application area of the research involves problems in scientific computation and engineering. The workload used in our experiments includes a mixture of real problems as well as synthetically generated inputs. From our extensive tests on the Encore Multimax/320, we have concluded that, for the types of workloads we have investigated, self-execution almost always performs better than pre-scheduling. Further, the improvement in performance that accrues from global topological sorting of indices, as opposed to the less expensive local sorting, is not very significant in the case of self-execution.
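Self-execution (dynamic self-scheduling) can be illustrated with a shared-counter sketch: each worker atomically claims the next loop index, so load balances itself without a pre-computed schedule. The lock below stands in for the fetch-and-add primitive of a shared-memory machine like the Multimax; this is an illustration of the scheduling idea, not the paper's implementation.

```python
import threading

def self_schedule(n_iters, body, n_workers=4):
    """Run `body(i)` for i in range(n_iters) with dynamic self-scheduling:
    each worker repeatedly grabs the next unclaimed index, so faster
    workers naturally take more iterations. Iterations must be
    independent (the loop-level parallelism the compiler exposed)."""
    next_index = [0]
    lock = threading.Lock()
    results = [None] * n_iters

    def worker():
        while True:
            with lock:                  # stand-in for atomic fetch-and-add
                i = next_index[0]
                next_index[0] += 1
            if i >= n_iters:
                return
            results[i] = body(i)

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results
```

Pre-scheduling would instead assign each worker a fixed block of indices up front, which is cheaper per iteration but cannot adapt when iteration costs vary, matching the paper's finding that self-execution usually wins.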
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amadio, G.; et al.
An intensive R&D and programming effort is required to accomplish new challenges posed by future experimental high-energy particle physics (HEP) programs. The GeantV project aims to narrow the gap between the performance of the existing HEP detector simulation software and the ideal performance achievable, exploiting latest advances in computing technology. The project has developed a particle detector simulation prototype capable of transporting particles in parallel in complex geometries, exploiting instruction-level microparallelism (SIMD and SIMT), task-level parallelism (multithreading) and high-level parallelism (MPI), leveraging both the multi-core and the many-core opportunities. We present preliminary verification results concerning the electromagnetic (EM) physics models developed for parallel computing architectures within the GeantV project. In order to exploit the potential of vectorization and accelerators and to make the physics models effectively parallelizable, advanced sampling techniques have been implemented and tested. In this paper we introduce a set of automated statistical tests in order to verify the vectorized models by checking their consistency with the corresponding Geant4 models and to validate them against experimental data.
Shah, Ami P; Cobb, Benjamin T; Lower, Darla R; Shaikh, Nader; Rasmussen, Jayne; Hoberman, Alejandro; Wald, Ellen R; Rosendorff, Adam; Hickey, Robert W
2014-03-01
Urinary tract infections (UTI) are the most common serious bacterial infection in febrile infants. Urinalysis (UA) is a screening test for preliminary diagnosis of UTI. UA can be performed manually or using automated techniques. We sought to compare manual versus automated UA for urine specimens obtained via catheterization in the pediatric emergency department. In this prospective study, we processed catheterized urine samples from infants with suspected UTI by both the manual method (enhanced UA) and the automated method. We defined a positive enhanced UA as ≥ 10 white blood cells per cubic millimeter and presence of any bacteria per 10 oil immersion fields on a Gram-stained smear. We defined a positive automated UA as ≥ 2 white blood cells per high-powered field and presence of any bacteria using the IRIS iQ200 ELITE. We defined a positive urine culture as growth of ≥ 50,000 colony-forming units per milliliter of a single uropathogen. We analyzed data using SPSS software. A total of 703 specimens were analyzed. Prevalence of UTI was 7%. For pyuria, the sensitivity and positive predictive value (PPV) of the enhanced UA in predicting positive urine culture were 83.6% and 52.5%, respectively; corresponding values for the automated UA were 79.5% and 37.5%, respectively. For bacteriuria, the sensitivity and PPV of a Gram-stained smear (enhanced UA) were 83.6% and 59.4%, respectively; corresponding values for the automated UA were 73.4%, and 26.2%, respectively. Using criteria of both pyuria and bacteriuria for the enhanced UA resulted in a sensitivity of 77.5% and a PPV of 84.4%; corresponding values for the automated UA were 63.2% and 51.6%, respectively. Combining automated pyuria (≥ 2 white blood cells/high-powered microscopic field) with a Gram-stained smear resulted in a sensitivity of 75.5% and a PPV of 84%. Automated UA is comparable with manual UA for detection of pyuria in young children with suspected UTI. 
Bacteriuria detected by automated UA is less sensitive and specific for UTI when compared with a Gram-stained smear. We recommend using either manual or automated measurement of pyuria in combination with Gram-stained smear as the preferred technique for UA of catheterized specimens obtained from children in an acute care setting.
Automation of serviceability of radio-relay station equipment
NASA Astrophysics Data System (ADS)
Uryev, A. G.; Mishkin, Y. I.; Itkis, G. Y.
1985-03-01
Automation of the serviceability of radio relay station equipment must ensure central gathering and primary processing of reliable instrument readings with subsequent display on the control panel, sufficiently prompt detection and recording of failures, sufficiently early warning based on analysis of deterioration symptoms, and correct remote measurement of equipment performance parameters. Such inspection will minimize transmission losses while reducing nonproductive time and labor spent on documentation and measurement. A multichannel automated inspection system for this purpose should operate by a parallel rather than sequential procedure. Digital data processing is more expedient in this case than analog methods, and analog-to-digital converters are therefore required. Special normal, above-limit, and below-limit test signals provide a means of self-inspection, to which must be added adequate interference immunity, stabilization, and a standby power supply. Use of a microcomputer permits overall refinement and expansion of the inspection system while minimizing, though not completely eliminating, dependence on subjective judgment.
Automated deep-phenotyping of the vertebrate brain
Allalou, Amin; Wu, Yuelong; Ghannad-Rezaie, Mostafa; Eimon, Peter M; Yanik, Mehmet Fatih
2017-01-01
Here, we describe an automated platform suitable for large-scale deep-phenotyping of zebrafish mutant lines, which uses optical projection tomography to rapidly image brain-specific gene expression patterns in 3D at cellular resolution. Registration algorithms and correlation analysis are then used to compare 3D expression patterns, to automatically detect all statistically significant alterations in mutants, and to map them onto a brain atlas. Automated deep-phenotyping of a mutation in the master transcriptional regulator fezf2 not only detects all known phenotypes but also uncovers important novel neural deficits that were overlooked in previous studies. In the telencephalon, we show for the first time that fezf2 mutant zebrafish have significant patterning deficits, particularly in glutamatergic populations. Our findings reveal unexpected parallels between fezf2 function in zebrafish and mice, where mutations cause deficits in glutamatergic neurons of the telencephalon-derived neocortex. DOI: http://dx.doi.org/10.7554/eLife.23379.001 PMID:28406399
Laboratory automation: trajectory, technology, and tactics.
Markin, R S; Whalen, S A
2000-05-01
Laboratory automation is in its infancy, following a path parallel to the development of laboratory information systems in the late 1970s and early 1980s. Changes on the horizon in healthcare and clinical laboratory service that affect the delivery of laboratory results include the increasing age of the population in North America, the implementation of the Balanced Budget Act (1997), and the creation of disease management companies. Major technology drivers include outcomes optimization and phenotypically targeted drugs. Constant cost pressures in the clinical laboratory have forced diagnostic manufacturers into less than optimal profitability states. Laboratory automation can be a tool for the improvement of laboratory services and may decrease costs. The key to improvement of laboratory services is implementation of the correct automation technology, and the design of this technology should be driven by required functionality. Automation design issues should be centered on an understanding of the laboratory and its relationship to healthcare delivery, as well as of the business and operational processes in the clinical laboratory. Automation design philosophy has evolved from a hardware-based approach to a software-based approach. Process-control software to support repeat testing, reflex testing, and transportation management, together with overall computer-integrated manufacturing approaches to laboratory automation implementation, are rapidly expanding areas. It is clear that hardware and software are functionally interdependent and that the interface between the laboratory automation system and the laboratory information system is a key component. The cost-effectiveness of automation solutions suggested by vendors, however, has been difficult to evaluate because the number of automation installations is small and the precision with which operational data have been collected to determine payback is suboptimal.
The trend in automation has moved from total laboratory automation to a modular approach, from a hardware-driven system to process control, from a one-of-a-kind novelty toward a standardized product, and from an in vitro diagnostics novelty to a marketing tool. Multiple vendors are present in the marketplace, many of whom are in vitro diagnostics manufacturers providing an automation solution coupled with their instruments, whereas others are focused automation companies. Automation technology continues to advance, acceptance continues to climb, and payback and cost justification methods are developing.
Regalia, Giulia; Biffi, Emilia; Achilli, Silvia; Ferrigno, Giancarlo; Menegon, Andrea; Pedrocchi, Alessandra
2016-02-01
Two binding requirements for in vitro studies of long-term neuronal network dynamics are (i) finely controlled environmental conditions to keep neuronal cultures viable and provide reliable data for more than a few hours and (ii) parallel operation on multiple neuronal cultures to shorten experimental time scales and enhance data reproducibility. To fulfill these needs with a Microelectrode Array (MEA)-based system, we designed a stand-alone device that permits uninterrupted monitoring of neuronal culture activity over long periods, overcoming drawbacks of existing MEA platforms. We integrated in a single device: (i) a closed chamber housing four MEAs equipped with access for chemical manipulations, (ii) environmental control systems and embedded sensors to reproduce and remotely monitor the standard in vitro culture environment on the lab bench (i.e. in terms of temperature, air CO2 and relative humidity), and (iii) a modular MEA-interface analog front-end for reliable and parallel recordings. The system has been proven to keep environmental conditions stable, physiological, and homogeneous across different cultures. Prolonged recordings (up to 10 days) of spontaneous and pharmacologically stimulated neuronal culture activity have shown no signs of rundown, thanks to the environmental stability, and have not required withdrawing the cells from the chamber for culture medium manipulations. This system represents an effective MEA-based solution for elucidating neuronal network phenomena with slow dynamics, such as long-term plasticity, effects of chronic pharmacological stimulations or late-onset pathological mechanisms. © 2015 Wiley Periodicals, Inc.
The Parallel System for Integrating Impact Models and Sectors (pSIMS)
NASA Technical Reports Server (NTRS)
Elliott, Joshua; Kelly, David; Chryssanthacopoulos, James; Glotter, Michael; Jhunjhnuwala, Kanika; Best, Neil; Wilde, Michael; Foster, Ian
2014-01-01
We present a framework for massively parallel climate impact simulations: the parallel System for Integrating Impact Models and Sectors (pSIMS). This framework comprises a) tools for ingesting and converting large amounts of data to a versatile datatype based on a common geospatial grid; b) tools for translating this datatype into custom formats for site-based models; c) a scalable parallel framework for performing large ensemble simulations, using any one of a number of different impacts models, on clusters, supercomputers, distributed grids, or clouds; d) tools and data standards for reformatting outputs to common datatypes for analysis and visualization; and e) methodologies for aggregating these datatypes to arbitrary spatial scales such as administrative and environmental demarcations. By automating many time-consuming and error-prone aspects of large-scale climate impacts studies, pSIMS accelerates computational research, encourages model intercomparison, and enhances reproducibility of simulation results. We present the pSIMS design and use example assessments to demonstrate its multi-model, multi-scale, and multi-sector versatility.
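The aggregation step (e) can be illustrated with a small sketch: per-cell gridded values are rolled up to arbitrary regions with an area-weighted mean. This is a hypothetical illustration of the idea, not pSIMS code; the cell IDs, region map, and weights are made up.

```python
# Hypothetical sketch of pSIMS-style spatial aggregation: gridded
# per-cell values are rolled up to arbitrary regions (e.g.,
# administrative units) using an area-weighted mean.

def aggregate(cell_values, cell_region, cell_area):
    """Area-weighted mean of cell values per region.

    cell_values: {cell_id: value}
    cell_region: {cell_id: region_id}
    cell_area:   {cell_id: area weight}
    """
    totals, weights = {}, {}
    for cell, value in cell_values.items():
        region = cell_region[cell]
        area = cell_area[cell]
        totals[region] = totals.get(region, 0.0) + value * area
        weights[region] = weights.get(region, 0.0) + area
    return {r: totals[r] / weights[r] for r in totals}

values = {"c1": 2.0, "c2": 4.0, "c3": 10.0}
regions = {"c1": "A", "c2": "A", "c3": "B"}
areas = {"c1": 1.0, "c2": 3.0, "c3": 2.0}
print(aggregate(values, regions, areas))  # region A: (2*1 + 4*3)/4 = 3.5
```

The same pattern extends to any spatial scale simply by swapping in a different cell-to-region mapping.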
Microfluidic integration of parallel solid-phase liquid chromatography.
Huft, Jens; Haynes, Charles A; Hansen, Carl L
2013-03-05
We report the development of a fully integrated microfluidic chromatography system based on a recently developed column geometry that allows for robust packing of high-performance separation columns in poly(dimethylsiloxane) microfluidic devices having integrated valves made by multilayer soft lithography (MSL). The combination of parallel high-performance separation columns and on-chip plumbing was used to achieve a fully integrated system for on-chip chromatography, including all steps of automated sample loading, programmable gradient generation, separation, fluorescent detection, and sample recovery. We demonstrate this system in the separation of fluorescently labeled DNA and parallel purification of reverse transcription polymerase chain reaction (RT-PCR) amplified variable regions of mouse immunoglobulin genes using a strong anion exchange (AEX) resin. Parallel sample recovery in an immiscible oil stream offers the advantage of low sample dilution and high recovery rates. The ability to perform nucleic acid size selection and recovery on subnanogram samples of DNA holds promise for on-chip genomics applications including sequencing library preparation, cloning, and sample fractionation for diagnostics.
Automating the parallel processing of fluid and structural dynamics calculations
NASA Technical Reports Server (NTRS)
Arpasi, Dale J.; Cole, Gary L.
1987-01-01
The NASA Lewis Research Center is actively involved in the development of expert system technology to assist users in applying parallel processing to computational fluid and structural dynamic analysis. The goal of this effort is to eliminate the necessity for the physical scientist to become a computer scientist in order to effectively use the computer as a research tool. Programming and operating software utilities have previously been developed to solve systems of ordinary nonlinear differential equations on parallel scalar processors. Current efforts are aimed at extending these capabilities to systems of partial differential equations, that describe the complex behavior of fluids and structures within aerospace propulsion systems. This paper presents some important considerations in the redesign, in particular, the need for algorithms and software utilities that can automatically identify data flow patterns in the application program and partition and allocate calculations to the parallel processors. A library-oriented multiprocessing concept for integrating the hardware and software functions is described.
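The core idea of automatically partitioning calculations by data flow can be sketched as a graph leveling: calculations whose inputs are already available form a level, and all calculations within a level are mutually independent, so they can be allocated to parallel processors. This is an illustrative sketch, not the NASA Lewis utilities themselves.

```python
# Illustrative sketch: group calculations into "levels" by data-flow
# dependencies so that all calculations in one level are independent
# and can be assigned to parallel processors.

def parallel_levels(deps):
    """deps: {calc: set of calcs whose outputs it consumes}."""
    remaining = {c: set(d) for c, d in deps.items()}
    levels = []
    while remaining:
        # calculations whose inputs have all been computed already
        ready = {c for c, d in remaining.items() if not d}
        if not ready:
            raise ValueError("cyclic data flow")
        levels.append(sorted(ready))
        remaining = {c: d - ready
                     for c, d in remaining.items() if c not in ready}
    return levels

# a and b are independent; c consumes a and b; d consumes c
deps = {"a": set(), "b": set(), "c": {"a", "b"}, "d": {"c"}}
print(parallel_levels(deps))  # [['a', 'b'], ['c'], ['d']]
```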
Automated quality control in a file-based broadcasting workflow
NASA Astrophysics Data System (ADS)
Zhang, Lina
2014-04-01
Benefiting from the development of information and internet technologies, television broadcasting is transforming from inefficient tape-based production and distribution to integrated file-based workflows. However, no matter how many changes have taken place, successful broadcasting still depends on the ability to deliver a consistently high-quality signal to the audience. After the transition from tape to file, traditional methods of manual quality control (QC) become inadequate, subjective, and inefficient. Based on China Central Television's fully file-based workflow at its new site, this paper introduces an automated quality control test system for accurate detection of hidden defects in media content. It discusses the system framework and workflow control when automated QC is added. It proposes a QC criterion and presents QC software that follows this criterion. It also reports experiments on QC speed using parallel processing and distributed computing. The performance of the test system shows that adopting automated QC can make production effective and efficient, and help the station achieve a competitive advantage in the media market.
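The parallel-processing speedup mentioned above can be sketched as a pool of workers each running per-clip checks. The check names, thresholds, and clip metadata below are hypothetical stand-ins, not CCTV's actual QC criterion.

```python
# Illustrative sketch of parallel automated QC: each media clip's
# checks run as an independent task in a process pool. The checks
# and thresholds here are made-up examples.
from concurrent.futures import ProcessPoolExecutor

def qc_check(clip):
    """Return (clip name, list of QC failures) for stub clip metadata."""
    failures = []
    if clip["black_frames"] > 5:
        failures.append("excessive black frames")
    if clip["audio_peak_db"] > -3.0:
        failures.append("audio level too high")
    return clip["name"], failures

clips = [
    {"name": "news_01.mxf", "black_frames": 0, "audio_peak_db": -9.0},
    {"name": "promo_02.mxf", "black_frames": 12, "audio_peak_db": -1.5},
]

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        for name, failures in pool.map(qc_check, clips):
            print(name, "PASS" if not failures else failures)
```

Because clips are independent, throughput scales roughly with worker count until I/O becomes the bottleneck.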
The role of flow injection analysis within the framework of an automated laboratory
Stockwell, Peter B.
1990-01-01
Flow Injection Analysis (FIA) was invented at roughly the same time by two quite dissimilar research groups [1,2]. FIA was patented by both groups in 1974; a year also marked by the publication of the first book on automatic chemical analysis [3]. This book was a major undertaking for its authors and it is hoped that it has added to the knowledge of those analysts attempting to automate their work or to increase the level of computerization/automation and thus reduce staffing commitments. This review discusses the role of FIA in laboratory automation, the advantages and disadvantages of the FIA approach, and the part it could play in future developments. It is important to stress at the outset that the FIA approach is all too often closely paralleled with conventional continuous flow analysis (CFA). This is a mistake for many reasons, not least the considerable success of the CFA approach in contrast to the present lack of penetration of FIA instrumentation in the commercial marketplace. PMID:18925262
Soares, Filipa A.C.; Chandra, Amit; Thomas, Robert J.; Pedersen, Roger A.; Vallier, Ludovic; Williams, David J.
2014-01-01
The transfer of a laboratory process into a manufacturing facility is one of the most critical steps required for the large-scale production of cell-based therapy products. This study describes the first published protocol for scalable automated expansion of human induced pluripotent stem cell lines growing in aggregates in feeder-free and chemically defined medium. Cells were successfully transferred between different sites representative of research and manufacturing settings, and passaged manually and using the CompacT SelecT automation platform. Modified protocols were developed for the automated system, and the management of cell aggregates (clumps) was identified as the critical step. Cellular morphology, pluripotency gene expression and differentiation into the three germ layers have been used to compare the outcomes of manual and automated processes. PMID:24440272
Bacterial and fungal DNA extraction from blood samples: automated protocols.
Lorenz, Michael G; Disqué, Claudia; Mühl, Helge
2015-01-01
Automation in DNA isolation is a necessity for routine practice employing molecular diagnosis of infectious agents. To this end, the development of automated systems for the molecular diagnosis of microorganisms directly in blood samples is at its beginning. Important characteristics of systems demanded for routine use include high recovery of microbial DNA, DNA-free containment for the reduction of DNA contamination from exogenous sources, DNA-free reagents and consumables, ideally a walkaway system, and economical pricing of the equipment and consumables. Such full automation of DNA extraction, evaluated and in use for sepsis diagnostics, is not yet available. Here, we present protocols for the semiautomated isolation of microbial DNA from blood culture and low- and high-volume blood samples. The protocols include a manual pretreatment step followed by automated extraction and purification of microbial DNA.
Introduction of Automation for the Production of Bilingual, Parallel-Aligned Text
2011-10-01
Update on Development of Mesh Generation Algorithms in MeshKit
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jain, Rajeev; Vanderzee, Evan; Mahadevan, Vijay
2015-09-30
MeshKit uses a graph-based design for coding all its meshing algorithms, which includes the Reactor Geometry (and mesh) Generation (RGG) algorithms. This report highlights the developmental updates of all the algorithms, results and future work. Parallel versions of algorithms, documentation and performance results are reported. RGG GUI design was updated to incorporate new features requested by the users; boundary layer generation and parallel RGG support were added to the GUI. Key contributions to the release, upgrade and maintenance of other SIGMA1 libraries (CGM and MOAB) were made. Several fundamental meshing algorithms for creating a robust parallel meshing pipeline in MeshKit are under development. Results and current status of automated, open-source and high quality nuclear reactor assembly mesh generation algorithms such as trimesher, quadmesher, interval matching and multi-sweeper are reported.
Minassian, Angela M; Newnham, Robert; Kalimeris, Elizabeth; Bejon, Philip; Atkins, Bridget L; Bowler, Ian C J W
2014-05-04
For the diagnosis of prosthetic joint infection (PJI) automated BACTEC™ blood culture bottle methods have comparable sensitivity, specificity and a shorter time to positivity than traditional cooked meat enrichment broth methods. We evaluate the culture incubation period required to maximise sensitivity and specificity of microbiological diagnosis, and the ability of BACTEC™ to detect slow growing Propionibacteria spp. Multiple periprosthetic tissue samples taken by a standardised method from 332 patients undergoing prosthetic joint revision arthroplasty were cultured for 14 days, using a BD BACTEC™ instrumented blood culture system, in a prospective study from 1st January to 31st August 2012. The "gold standard" definition for PJI was the presence of at least one histological criterion, the presence of a sinus tract or purulence around the device. Cases where ≥2 samples yielded indistinguishable isolates were considered culture-positive. 1000 BACTEC™ bottle cultures which were negative after 14 days incubation were sub-cultured for Propionibacteria spp. 79 patients fulfilled the definition for PJI, and 66 of these were culture-positive. All but 1 of these 66 culture-positive cases of PJI were detected within 3 days of incubation. Only one additional (clinically-insignificant) Propionibacterium spp. was identified on terminal subculture of 1000 bottles. Prolonged microbiological culture for 2 weeks is unnecessary when using BACTEC™ culture methods. The majority of clinically significant organisms grow within 3 days, and Propionibacteria spp. are identified without the need for terminal subculture. These findings should facilitate earlier decisions on final antimicrobial prescribing.
Temporary Immersion System for Date Palm Micropropagation.
Othmani, Ahmed; Bayoudh, Chokri; Sellemi, Amel; Drira, Noureddine
2017-01-01
The temporary immersion system (TIS) is being used with tremendous success for automation of micropropagation of many plant species. TIS usually consists of a culture vessel comprising two compartments, an upper one with the plant material and a lower one with the liquid culture medium, plus an automated air pump. The latter enables contact between all parts of the explants and the liquid medium by applying overpressure to the lower compartment of the container. These systems provide the most satisfactory conditions for date palm regeneration via shoot organogenesis and allow a significant increase in multiplication rate (5.5-fold in comparison with that regenerated on agar-solidified medium) and plant material quality, thereby reducing production cost.
Managing human error in aviation.
Helmreich, R L
1997-05-01
Crew resource management (CRM) programs were developed to address team and leadership aspects of piloting modern airplanes. The goal is to reduce errors through team work. Human factors research and social, cognitive, and organizational psychology are used to develop programs tailored for individual airlines. Flight crews study accident case histories, group dynamics, and human error. Simulators provide pilots with the opportunity to solve complex flight problems. CRM in the simulator is called line-oriented flight training (LOFT). In automated cockpits CRM promotes the idea of automation as a crew member. Cultural aspects of aviation include professional, business, and national culture. The aviation CRM model has been adapted for training surgeons and operating room staff in human factors.
78 FR 15800 - 30-Day Notice of Proposed Information Collection: Exchange Student Survey
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-12
... Collection. Originating Office: Educational and Cultural Affairs (ECA/ PE/C/PY). Form Number: SV2012-0007... automated collection techniques or other forms of information technology. Please note that comments... provisions of the Mutual Educational and Cultural Exchange Act, as amended, and the Exchange Visitor Program...
Archer, Charles Jens; Musselman, Roy Glenn; Peters, Amanda; Pinnow, Kurt Walter; Swartz, Brent Allen; Wallenfelt, Brian Paul
2010-04-27
A massively parallel computer system contains an inter-nodal communications network of node-to-node links. An automated routing strategy routes packets through one or more intermediate nodes of the network to reach a final destination. The default routing strategy is altered responsive to detection of overutilization of a particular path of one or more links, and at least some traffic is re-routed by distributing the traffic among multiple paths (which may include the default path). An alternative path may require a greater number of link traversals to reach the destination node.
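The adaptive routing idea can be sketched simply: packets follow the default path until a link on it is flagged as overutilized, after which traffic is distributed round-robin across multiple candidate paths (which may include the default). The node/link names, threshold, and round-robin policy below are illustrative assumptions, not the patented implementation.

```python
# Hedged sketch of adaptive multi-path routing: use the default path
# while utilization is low; on overutilization of any default-path
# link, spread traffic across several candidate paths.

class Router:
    def __init__(self, default_path, alternates, threshold=0.8):
        self.default_path = default_path
        self.alternates = alternates  # may include the default path
        self.threshold = threshold
        self.link_util = {}           # link -> observed utilization
        self._rr = 0                  # round-robin cursor

    def route(self, packet):
        overused = any(self.link_util.get(link, 0.0) > self.threshold
                       for link in self.default_path)
        if not overused:
            return self.default_path
        # distribute traffic among multiple paths (round robin)
        path = self.alternates[self._rr % len(self.alternates)]
        self._rr += 1
        return path

r = Router(default_path=["n0-n1", "n1-n3"],
           alternates=[["n0-n1", "n1-n3"], ["n0-n2", "n2-n3"]])
print(r.route("p1"))            # default path while utilization is low
r.link_util["n1-n3"] = 0.95     # overutilization detected on a link
print(r.route("p2"))            # traffic now rotates across paths
```

Note the trade-off the abstract mentions: an alternative path may traverse more links, paying latency for balanced load.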
Považan, Anika; Vukelić, Anka; Savković, Tijana; Kurucin, Tatjana
2012-01-01
A new, simple immunochromatographic assay for rapid identification of Mycobacterium tuberculosis complex in liquid cultures has been developed. The principle of the assay is binding of the Mycobacterium tuberculosis complex specific antigen to the monoclonal antibody conjugated on the test strip. The aim of this study was to evaluate the performance of the immunochromatographic assay in identification of Mycobacterium tuberculosis complex in primary positive liquid cultures of the BacT/Alert automated system. A total of 159 primary positive liquid cultures were tested using the immunochromatographic assay (BD MGIT TBc ID) and the conventional subculture, followed by identification using biochemical tests. Of 159 positive liquid cultures, using the conventional method, Mycobacterium tuberculosis was identified in 119 (74.8%), nontuberculous mycobacteria were found in 4 (2.5%), 14 (8.8%) cultures were contaminated and 22 (13.8%) cultures were found to be negative. Using the immunochromatographic assay, Mycobacterium tuberculosis complex was detected in 118 (74.2%) liquid cultures, and 41 (25.8%) tests were negative. Sensitivity, specificity, positive and negative predictive values of the test were 98.3%, 97.5%, 99.15% and 95.12%, respectively. The value of the kappa test was 0.950, and the McNemar test was 1.00. The immunochromatographic assay is a simple and rapid test which represents a suitable alternative to the conventional subculture method for the primary identification of Mycobacterium tuberculosis complex in liquid cultures of the BacT/Alert automated system. PMID:22364301
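The reported performance figures follow from the implied 2x2 table: 117 true positives, 1 false positive, 2 false negatives, and 39 true negatives (counts inferred here from the totals in the abstract). A short worked calculation:

```python
# Reproduce the abstract's diagnostic metrics from 2x2 table counts
# (tp/fp/fn/tn inferred from the reported totals).

def diagnostic_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),  # true positive rate
        "specificity": tn / (tn + fp),  # true negative rate
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

m = diagnostic_metrics(tp=117, fp=1, fn=2, tn=39)
for name, value in m.items():
    print(f"{name}: {100 * value:.2f}%")
```

The computed values (98.32%, 97.50%, 99.15%, 95.12%) match the abstract's figures to rounding.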
Jardine, Luke Anthony; Sturgess, Barbara Ruth; Inglis, Garry Donald Trevor; Davies, Mark William
2009-04-01
To determine if: time from blood culture inoculation to positive growth (total time to positive) and time from blood culture machine entry to positive growth (machine time to positive) is altered by delayed entry into the automated blood culture machine, and if the total time to positive differs by the concentration of organisms inoculated into blood culture bottles. Staphylococcus epidermidis, Escherichia coli and group B beta-haemolytic streptococci were chosen as clinically significant representative organisms. Two concentrations (> or =10 colony-forming units per millilitre and <1 colony-forming units per millilitre) were inoculated into PEDS BacT/Alert blood culture bottles and randomly allocated to one of three delayed automated blood culture machine entry times (30 min/8.5 h/15.5 h). For all organisms at all concentrations, except the Staphylococcus epidermidis, the machine time to positive was significantly decreased by delayed entry. For all organisms at all concentrations, the mean total time to positive significantly increased with increasing delayed entry into the blood culture machine. Higher concentrations of group B beta-haemolytic streptococci and Escherichia coli grew significantly faster than lower concentrations. Bacterial growth in inoculated bottles, stored at room temperature, continues although at a slower rate than in those blood culture bottles immediately entered into the machine. If a blood culture specimen has been stored at room temperature for greater than 15.5 h, the currently allowed safety margin of 36 h (before declaring a result negative) may be insufficient.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kasemir, Kay; Pearson, Matthew R
For several years, the Control System Studio (CS-Studio) Scan System has successfully automated the operation of beam lines at the Oak Ridge National Laboratory (ORNL) High Flux Isotope Reactor (HFIR) and Spallation Neutron Source (SNS). As it is applied to additional beam lines, we need to support simultaneous adjustments of temperatures or motor positions. While this can be implemented via virtual motors or similar logic inside the Experimental Physics and Industrial Control System (EPICS) Input/Output Controllers (IOCs), doing so requires a priori knowledge of experimenters' requirements. By adding support for the parallel control of multiple process variables (PVs) to the Scan System, we can better support ad hoc automation of experiments that benefit from such simultaneous PV adjustments.
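Conceptually, a parallel scan step commands all PVs at once and completes when every PV reaches its setpoint. The sketch below mocks this with plain Python objects and a thread pool; the PV names, values, and `move_to` interface are invented for illustration and are not the CS-Studio or EPICS API.

```python
# Conceptual sketch of a parallel scan step: command several mock PVs
# simultaneously and wait for all of them before proceeding. The PV
# objects and method names are hypothetical, not EPICS/CS-Studio code.
from concurrent.futures import ThreadPoolExecutor

class MockPV:
    def __init__(self, name, value=0.0):
        self.name, self.value = name, value

    def move_to(self, setpoint):
        # a real motor or temperature loop would ramp; we just settle
        self.value = setpoint
        return self.name

pvs = {"temp_a": MockPV("temp_a"), "motor_x": MockPV("motor_x")}
setpoints = {"temp_a": 150.0, "motor_x": 12.5}

with ThreadPoolExecutor() as pool:
    # all adjustments start together; map() returns as each finishes
    done = pool.map(lambda n: pvs[n].move_to(setpoints[n]), setpoints)
    for name in done:
        print(name, "reached", pvs[name].value)
```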
Combinatorial and high-throughput approaches in polymer science
NASA Astrophysics Data System (ADS)
Zhang, Huiqi; Hoogenboom, Richard; Meier, Michael A. R.; Schubert, Ulrich S.
2005-01-01
Combinatorial and high-throughput approaches have become topics of great interest in the last decade due to their potential ability to significantly increase research productivity. Recent years have witnessed a rapid extension of these approaches in many areas of the discovery of new materials including pharmaceuticals, inorganic materials, catalysts and polymers. This paper mainly highlights our progress in polymer research by using an automated parallel synthesizer, microwave synthesizer and ink-jet printer. The equipment and methodologies in our experiments, the high-throughput experimentation of different polymerizations (such as atom transfer radical polymerization, cationic ring-opening polymerization and emulsion polymerization) and the automated matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) sample preparation are described.
Accurate multiplex polony sequencing of an evolved bacterial genome.
Shendure, Jay; Porreca, Gregory J; Reppas, Nikos B; Lin, Xiaoxia; McCutcheon, John P; Rosenbaum, Abraham M; Wang, Michael D; Zhang, Kun; Mitra, Robi D; Church, George M
2005-09-09
We describe a DNA sequencing technology in which a commonly available, inexpensive epifluorescence microscope is converted to rapid nonelectrophoretic DNA sequencing automation. We apply this technology to resequence an evolved strain of Escherichia coli at less than one error per million consensus bases. A cell-free, mate-paired library provided single DNA molecules that were amplified in parallel to 1-micrometer beads by emulsion polymerase chain reaction. Millions of beads were immobilized in a polyacrylamide gel and subjected to automated cycles of sequencing by ligation and four-color imaging. Cost per base was roughly one-ninth as much as that of conventional sequencing. Our protocols were implemented with off-the-shelf instrumentation and reagents.
Automation effects in a stereotypical multiloop manual control system. [for aircraft
NASA Technical Reports Server (NTRS)
Hess, R. A.; Mcnally, B. D.
1984-01-01
The increasing reliance of state-of-the-art, high-performance aircraft on high-authority stability and command augmentation systems to obtain satisfactory performance and handling qualities makes it critical to better understand human capabilities, limitations, and preferences during interactions with complex dynamic systems that involve task allocation between man and machine. An analytical and experimental study has been undertaken to investigate human interaction with a simple, multiloop dynamic system in which human activity was systematically varied by changing the levels of automation. Task definition has led to a control loop structure which parallels that of any multiloop manual control system, and may therefore be considered a stereotype.
A modular suite of hardware enabling spaceflight cell culture research
NASA Technical Reports Server (NTRS)
Hoehn, Alexander; Klaus, David M.; Stodieck, Louis S.
2004-01-01
BioServe Space Technologies, a NASA Research Partnership Center (RPC), has developed and operated various middeck payloads launched on 23 shuttle missions since 1991 in support of commercial space biotechnology projects. Modular cell culture systems are contained within the Commercial Generic Bioprocessing Apparatus (CGBA) suite of flight-qualified hardware, compatible with Space Shuttle, SPACEHAB, Spacelab and International Space Station (ISS) EXPRESS Rack interfaces. As part of the CGBA family, the Isothermal Containment Module (ICM) incubator provides thermal control, data acquisition and experiment manipulation capabilities, including accelerometer launch detection for automated activation and thermal profiling for culture incubation and sample preservation. The ICM can accommodate up to 8 individually controlled temperature zones. Command and telemetry capabilities allow real-time downlink of data and video permitting remote payload operation and ground control synchronization. Individual cell culture experiments can be accommodated in a variety of devices ranging from 'microgravity test tubes' or standard 100 mm Petri dishes, to complex, fed-batch bioreactors with automated culture feeding, waste removal and multiple sample draws. Up to 3 levels of containment can be achieved for chemical fixative addition, and passive gas exchange can be provided through hydrophobic membranes. Many additional options exist for designing customized hardware depending on specific science requirements.
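The ICM's independently controlled temperature zones can be sketched as a simple per-zone hysteresis (bang-bang) loop. The zone names, setpoints, and deadband below are made-up illustrations, not the flight controller's actual parameters.

```python
# Hypothetical sketch of multi-zone thermal control in the spirit of
# the ICM's 8 independent zones: a hysteresis (bang-bang) loop per
# zone. Setpoints and deadband are illustrative values only.

DEADBAND = 0.5  # degrees C of hysteresis around each setpoint

def update_zone(temp, setpoint, heater_on):
    """Return the new heater state for one zone."""
    if temp < setpoint - DEADBAND:
        return True                  # too cold: heat
    if temp > setpoint + DEADBAND:
        return False                 # too warm: idle
    return heater_on                 # inside deadband: hold state

zones = {
    "culture_1": {"temp": 35.2, "setpoint": 37.0, "heater": False},
    "fix_store": {"temp": 6.1, "setpoint": 4.0, "heater": True},
}
for name, z in zones.items():
    z["heater"] = update_zone(z["temp"], z["setpoint"], z["heater"])
    print(name, "heater", "ON" if z["heater"] else "OFF")
```

The deadband prevents rapid heater cycling near the setpoint, a standard choice for incubation and sample-preservation zones.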
Automated Interpretation of Blood Culture Gram Stains by Use of a Deep Convolutional Neural Network.
Smith, Kenneth P; Kang, Anthony D; Kirby, James E
2018-03-01
Microscopic interpretation of stained smears is one of the most operator-dependent and time-intensive activities in the clinical microbiology laboratory. Here, we investigated application of an automated image acquisition and convolutional neural network (CNN)-based approach for automated Gram stain classification. Using an automated microscopy platform, uncoverslipped slides were scanned with a 40× dry objective, generating images of sufficient resolution for interpretation. We collected 25,488 images from positive blood culture Gram stains prepared during routine clinical workup. These images were used to generate 100,213 crops containing Gram-positive cocci in clusters, Gram-positive cocci in chains/pairs, Gram-negative rods, or background (no cells). These categories were targeted for proof-of-concept development as they are associated with the majority of bloodstream infections. Our CNN model achieved a classification accuracy of 94.9% on a test set of image crops. Receiver operating characteristic (ROC) curve analysis indicated a robust ability to differentiate between categories with an area under the curve of >0.98 for each. After training and validation, we applied the classification algorithm to new images collected from 189 whole slides without human intervention. Sensitivity and specificity were 98.4% and 75.0% for Gram-positive cocci in chains and pairs, 93.2% and 97.2% for Gram-positive cocci in clusters, and 96.3% and 98.1% for Gram-negative rods. Taken together, our data support a proof of concept for a fully automated classification methodology for blood-culture Gram stains. Importantly, the algorithm was highly adept at identifying image crops with organisms and could be used to present prescreened, classified crops to technologists to accelerate smear review. This concept could potentially be extended to all Gram stain interpretive activities in the clinical laboratory. Copyright © 2018 American Society for Microbiology.
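The crop-generation step that feeds the CNN can be illustrated with a minimal tiling function: a scanned field is split into fixed-size crops, each of which would then be classified into one of the four categories. The tiling parameters and toy "image" are assumptions for illustration; the class list comes from the abstract.

```python
# Illustrative sketch of the crop-generation step: tile a scanned
# field into fixed-size crops for CNN classification. The tile size
# and toy pixel grid are assumptions, not the paper's parameters.

CLASSES = ["GPC clusters", "GPC chains/pairs", "GN rods", "background"]

def make_crops(image, size):
    """Split a 2D pixel grid into non-overlapping size x size crops."""
    h, w = len(image), len(image[0])
    crops = []
    for top in range(0, h - size + 1, size):
        for left in range(0, w - size + 1, size):
            crops.append([row[left:left + size]
                          for row in image[top:top + size]])
    return crops

# toy 8x8 "field" with pixel value r*8 + c
field = [[(r * 8 + c) for c in range(8)] for r in range(8)]
crops = make_crops(field, size=4)
print(len(crops))  # 4 crops from an 8x8 field
```

In the paper's workflow, such crops are what the CNN scores; prescreened, classified crops could then be presented to technologists for accelerated review.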
Cardiac imaging: working towards fully-automated machine analysis & interpretation.
Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido
2017-03-01
Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered: This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary: Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation.
Developing Model Benchtop Systems for Microbial Experimental Evolution
NASA Astrophysics Data System (ADS)
Gentry, D.; Wang, J.; Arismendi, D.; Alvarez, J.; Ouandji, C.; Blaich, J.
2017-12-01
Understanding how microbes impact an ecosystem has improved through advances of molecular and genetic tools, but creating complex systems that emulate natural biology goes beyond current technology. In fact, many chemical, biological, and metabolic pathways of even model organisms are still poorly characterized. Even then, standard laboratory techniques for testing microbial impact on environmental change can have many drawbacks; they are time-consuming, labor intensive, and are at risk of contamination. By having an automated process, many of these problems can be reduced or even eliminated. We are developing a benchtop system that can run for long periods of time without the need for human intervention, involve multiple environmental stressors at once, perform real-time adjustments of stressor exposure based on the current state of the population, and minimize contamination risks. Our prototype device allows operators to generate an analogue of real-world micro-scale ecosystems that can be used to model the effects of disruptive environmental change on microbial ecosystems. It comprises electronics, mechatronics, and fluidics-based systems to control, measure, and evaluate the before and after state of microbial cultures from exposure to environmental stressors. Currently, it uses four parallel growth chambers to perform tests on liquid cultures. To measure the population state, optical sensors (LED/photodiode) are used. Its primary selection pressure is UV-C radiation, a well-studied stressor known for its cell- and DNA-damaging effects and as a mutagen. Future work will involve improving the current growth chambers, as well as implementing additional sensors and environmental stressors into the system. Full integration of multiple culture testing will allow inter-culture comparisons. Besides the temperature and OD sensors, other types of sensors can be integrated such as conductivity, biomass, pH, and dissolved gases such as CO2 and O2.
Additional environmental stressor systems like temperature (extreme heat or cold), metal toxicity, and other forms of radiation will increase the scale and testing range.
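The closed-loop idea in the abstract above ("real-time adjustments of stressor exposure based on current state of the population") can be sketched as a simple control rule. Everything below is hypothetical: the function and constant names, thresholds, and OD readings are invented for illustration and are not the device's actual interface.

```python
# Minimal sketch of a closed-loop stressor rule: raise the UV-C dose
# while the population keeps growing, back off when it declines.
# All names and numbers here are assumptions, not the prototype's API.

TARGET_GROWTH_RATE = 0.02   # OD units per control cycle (assumed threshold)
UV_DOSE_STEP = 0.1          # arbitrary dose units

def next_uv_dose(prev_od, curr_od, curr_dose):
    """Adjust UV-C exposure from the change in optical density."""
    growth = curr_od - prev_od
    if growth > TARGET_GROWTH_RATE:
        return curr_dose + UV_DOSE_STEP            # thriving: more pressure
    if growth < 0:
        return max(0.0, curr_dose - UV_DOSE_STEP)  # declining: relax pressure
    return curr_dose                               # near steady state: hold

# One simulated control cycle for the four parallel growth chambers.
doses = [0.5, 0.5, 0.5, 0.5]
prev = [0.30, 0.30, 0.30, 0.30]
curr = [0.35, 0.31, 0.28, 0.30]   # fabricated photodiode OD readings
doses = [next_uv_dose(p, c, d) for p, c, d in zip(prev, curr, doses)]
```

The same rule generalizes to any scalar stressor once a sensor reading stands in for population state.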
Developing Model Benchtop Systems for Microbial Experimental Evolution
NASA Technical Reports Server (NTRS)
Wang, Jonathan; Arismendi, Dillon; Alvarez, Jennifer; Ouandji, Cynthia; Blaich, Justin; Gentry, Diana
2017-01-01
Understanding of how microbes impact an ecosystem has improved through advances in molecular and genetic tools, but creating complex systems that emulate natural biology remains beyond current technology. In fact, many chemical, biological, and metabolic pathways of even model organisms are still poorly characterized. Moreover, standard laboratory techniques for testing microbial responses to environmental change have many drawbacks: they are time-consuming, labor intensive, and at risk of contamination. Automating the process can reduce or even eliminate many of these problems. We are developing a benchtop system that can run for long periods without the need for human intervention, apply multiple environmental stressors at once, perform real-time adjustments of stressor exposure based on the current state of the population, and minimize contamination risks. Our prototype device allows operators to generate an analogue of real-world micro-scale ecosystems that can be used to model the effects of disruptive environmental change on microbial ecosystems. It comprises electronics-, mechatronics-, and fluidics-based subsystems to control, measure, and evaluate the before and after states of microbial cultures exposed to environmental stressors. Currently, it uses four parallel growth chambers to perform tests on liquid cultures. Population state is measured with optical sensors (LED/photodiode). Its primary selection pressure is UV-C radiation, a well-studied stressor known for its cell- and DNA-damaging effects and as a mutagen. Future work will involve improving the current growth chambers, as well as implementing additional sensors and environmental stressors into the system. Full integration of multiple-culture testing will allow inter-culture comparisons. Besides the temperature and OD sensors, other types of sensors can be integrated, such as conductivity, biomass, pH, and dissolved gases such as CO2 and O2.
Additional environmental stressor systems like temperature (extreme heat or cold), metal toxicity, and other forms of radiation will increase the scale and testing range.
ERIC Educational Resources Information Center
Dietz, Roland; Grant, Carl
2005-01-01
Innovations from Google[TM] and Amazon[R] are clear wake-up calls that, as a profession and an industry, things need to be done differently. Automation vendors and librarians must work together to ensure that the profession is positioned to take advantage of changing culture and technology to assume a rightful place at the table where rich and…
Transformation From a Conventional Clinical Microbiology Laboratory to Full Automation.
Moreno-Camacho, José L; Calva-Espinosa, Diana Y; Leal-Leyva, Yoseli Y; Elizalde-Olivas, Dolores C; Campos-Romero, Abraham; Alcántar-Fernández, Jonathan
2017-12-22
Our aim was to validate the performance, reproducibility, and reliability of BD automated instruments in order to establish a fully automated clinical microbiology laboratory. We used control strains and clinical samples to assess the accuracy, reproducibility, and reliability of the BD Kiestra WCA, BD Phoenix, and BD Bruker MALDI-Biotyper instruments and compared them with previously established conventional methods. The following processes were evaluated: sample inoculation and spreading, colony counts, sorting of cultures, antibiotic susceptibility testing, and microbial identification. The BD Kiestra recovered single colonies in less time than conventional methods (e.g., E. coli, 7 h vs. 10 h), and agreement between the two methodologies was excellent for colony counts (κ=0.824) and sorting of cultures (κ=0.821). Antibiotic susceptibility tests performed with the BD Phoenix and disk diffusion showed 96.3% agreement between the two methods. Finally, we compared microbial identification by the BD Phoenix and Bruker MALDI-Biotyper and observed perfect agreement (κ=1) and species-level identification for control strains. Together these instruments allow us to process clinical urine samples in 36 h (effective time). The BD automated technologies offer improved performance compared with conventional methods and are suitable for implementation in very busy microbiology laboratories. © American Society for Clinical Pathology 2017. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
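The kappa values quoted above (e.g., κ=0.824 for colony counts) are Cohen's kappa agreement statistics. A minimal sketch of how such a value is computed from a 2 x 2 contingency table follows; the counts are invented for illustration, not the study's data.

```python
# Cohen's kappa: chance-corrected agreement between two rating methods.

def cohens_kappa(table):
    """table[i][j]: samples rated category i by method A and j by method B."""
    total = sum(sum(row) for row in table)
    # Observed agreement: fraction of samples on the diagonal.
    observed = sum(table[i][i] for i in range(len(table))) / total
    # Expected agreement if the two methods were independent.
    expected = sum(
        (sum(table[i]) / total) * (sum(row[i] for row in table) / total)
        for i in range(len(table))
    )
    return (observed - expected) / (1 - expected)

# Hypothetical counts: 90 samples where both methods call "growth",
# 5 where both call "no growth", 5 disagreements in total.
table = [[90, 2], [3, 5]]
kappa = cohens_kappa(table)
```

With these invented counts the raw agreement is 95%, but the chance-corrected kappa is considerably lower, which is why kappa is the preferred figure for method-comparison studies like this one.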
Proceedings of the international conference on cybernetics and society
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1985-01-01
This book presents the papers given at a conference on artificial intelligence, expert systems and knowledge bases. Topics considered at the conference included automating expert system development, modeling expert systems, causal maps, data covariances, robot vision, image processing, multiprocessors, parallel processing, VLSI structures, man-machine systems, human factors engineering, cognitive decision analysis, natural language, computerized control systems, and cybernetics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hunt, D.N.
1997-02-01
The Information Engineering thrust area develops information technology to support the programmatic needs of Lawrence Livermore National Laboratory's Engineering Directorate. Progress in five programmatic areas is described in separate reports contained herein. These are entitled Three-dimensional Object Creation, Manipulation, and Transport; Zephyr: A Secure Internet-Based Process to Streamline Engineering Procurements; Subcarrier Multiplexing: Optical Network Demonstrations; Parallel Optical Interconnect Technology Demonstration; and Intelligent Automation Architecture.
Multiobjective Multifactorial Optimization in Evolutionary Multitasking.
Gupta, Abhishek; Ong, Yew-Soon; Feng, Liang; Tan, Kay Chen
2016-05-03
In recent decades, the field of multiobjective optimization has attracted considerable interest among evolutionary computation researchers. One of the main features that makes evolutionary methods particularly appealing for multiobjective problems is the implicit parallelism offered by a population, which enables simultaneous convergence toward the entire Pareto front. While a plethora of related algorithms have been proposed to date, a common attribute among them is that they focus on efficiently solving only a single optimization problem at a time. Despite the known power of implicit parallelism, seldom has an attempt been made to multitask, i.e., to solve multiple optimization problems simultaneously. It is contended that the notion of evolutionary multitasking leads to the possibility of automated transfer of information across different optimization exercises that may share underlying similarities, thereby facilitating improved convergence characteristics. In particular, the potential for automated transfer is deemed invaluable from the standpoint of engineering design exercises where manual knowledge adaptation and reuse are routine. Accordingly, in this paper, we present a realization of the evolutionary multitasking paradigm within the domain of multiobjective optimization. The efficacy of the associated evolutionary algorithm is demonstrated on some benchmark test functions as well as on a real-world manufacturing process design problem from the composites industry.
Advantages and challenges in automated apatite fission track counting
NASA Astrophysics Data System (ADS)
Enkelmann, E.; Ehlers, T. A.
2012-04-01
Fission track thermochronometer data are often a core element of modern tectonic and denudation studies. Soon after the development of the fission track method, interest emerged in developing an automated counting procedure to replace the time-consuming labor of counting fission tracks under the microscope. Automated track counting became feasible in recent years with increasing improvements in computer software and hardware. One such example used in this study is the commercial automated fission track counting procedure from Autoscan Systems Pty, which has been highlighted through several venues. We conducted experiments designed to reliably and consistently test the ability of this fully automated counting system to recognize fission tracks in apatite and a muscovite external detector. Fission tracks were analyzed in samples with a step-wise increase in sample complexity. The first set of experiments used a large (mm-size) slice of Durango apatite cut parallel to the prism plane. Second, samples with 80-200 μm apatite grains of Fish Canyon Tuff were analyzed. This second sample set is characterized by complexities often found in apatites in different rock types. In addition to the automated counting procedure, the same samples were also analyzed using conventional counting procedures. We found for all samples that the fully automated fission track counting procedure using the Autoscan System yields a larger scatter in the measured fission track densities than conventional (manual) track counting. This scatter typically resulted from the false identification of tracks due to surface and mineralogical defects, regardless of the image filtering procedure used. Large differences between track densities obtained with automated counting persisted between different grains analyzed in one sample as well as between different samples.
As a result of these differences a manual correction of the fully automated fission track counts is necessary for each individual surface area and grain counted. This manual correction procedure significantly increases (up to four times) the time required to analyze a sample with the automated counting procedure compared to the conventional approach.
Passing in Command Line Arguments and Parallel Cluster/Multicore Batching in R with batch.
Hoffmann, Thomas J
2011-03-01
It is often useful to rerun a command line R script with some slight change in the parameters used to run it: a new set of parameters for a simulation, a different dataset to process, etc. The R package batch provides a means to pass multiple command line options, including vectors of values in the usual R format, easily into R. The same script can be set up to run things in parallel via different command line arguments. The R package batch also simplifies this parallel batching by allowing one to use R and an R-like syntax for arguments to spread a script across a cluster or a local multicore/multiprocessor computer, with automated syntax for several popular cluster types. Finally, it provides a means to aggregate together the results of multiple processes run on a cluster.
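The batch package described above is R-specific, but the pattern it automates (command-line parameters fanned out over parallel workers, results aggregated afterward) can be sketched in Python. The flags, the simulate function, and the toy computation below are invented for illustration and are not part of the R package.

```python
# Sketch of the batching pattern: parse per-run parameters from the
# command line, spread the runs over a worker pool, collect results.
# Threads are used here for portability; the R package targets
# processes, clusters, and multicore machines.
import argparse
from concurrent.futures import ThreadPoolExecutor

def simulate(seed):
    """Stand-in for the per-run work; a toy deterministic result."""
    return seed * seed % 97

def main(argv=None):
    parser = argparse.ArgumentParser(description="parameter-sweep batching")
    parser.add_argument("--seeds", type=int, nargs="+", default=[1, 2, 3],
                        help="one run per seed, executed in parallel")
    parser.add_argument("--workers", type=int, default=2)
    args = parser.parse_args(argv)
    with ThreadPoolExecutor(max_workers=args.workers) as pool:
        results = list(pool.map(simulate, args.seeds))
    return dict(zip(args.seeds, results))   # aggregate results together

results = main(["--seeds", "4", "5"])
```

Rerunning with a different `--seeds` list changes the sweep without touching the script, which is the convenience the abstract describes.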
Prioritizing multiple therapeutic targets in parallel using automated DNA-encoded library screening
NASA Astrophysics Data System (ADS)
Machutta, Carl A.; Kollmann, Christopher S.; Lind, Kenneth E.; Bai, Xiaopeng; Chan, Pan F.; Huang, Jianzhong; Ballell, Lluis; Belyanskaya, Svetlana; Besra, Gurdyal S.; Barros-Aguirre, David; Bates, Robert H.; Centrella, Paolo A.; Chang, Sandy S.; Chai, Jing; Choudhry, Anthony E.; Coffin, Aaron; Davie, Christopher P.; Deng, Hongfeng; Deng, Jianghe; Ding, Yun; Dodson, Jason W.; Fosbenner, David T.; Gao, Enoch N.; Graham, Taylor L.; Graybill, Todd L.; Ingraham, Karen; Johnson, Walter P.; King, Bryan W.; Kwiatkowski, Christopher R.; Lelièvre, Joël; Li, Yue; Liu, Xiaorong; Lu, Quinn; Lehr, Ruth; Mendoza-Losana, Alfonso; Martin, John; McCloskey, Lynn; McCormick, Patti; O'Keefe, Heather P.; O'Keeffe, Thomas; Pao, Christina; Phelps, Christopher B.; Qi, Hongwei; Rafferty, Keith; Scavello, Genaro S.; Steiginga, Matt S.; Sundersingh, Flora S.; Sweitzer, Sharon M.; Szewczuk, Lawrence M.; Taylor, Amy; Toh, May Fern; Wang, Juan; Wang, Minghui; Wilkins, Devan J.; Xia, Bing; Yao, Gang; Zhang, Jean; Zhou, Jingye; Donahue, Christine P.; Messer, Jeffrey A.; Holmes, David; Arico-Muendel, Christopher C.; Pope, Andrew J.; Gross, Jeffrey W.; Evindar, Ghotas
2017-07-01
The identification and prioritization of chemically tractable therapeutic targets is a significant challenge in the discovery of new medicines. We have developed a novel method that rapidly screens multiple proteins in parallel using DNA-encoded library technology (ELT). Initial efforts were focused on the efficient discovery of antibacterial leads against 119 targets from Acinetobacter baumannii and Staphylococcus aureus. The success of this effort led to the hypothesis that the relative number of ELT binders alone could be used to assess the ligandability of large sets of proteins. This concept was further explored by screening 42 targets from Mycobacterium tuberculosis. Active chemical series for six targets from our initial effort as well as three chemotypes for DHFR from M. tuberculosis are reported. The findings demonstrate that parallel ELT selections can be used to assess ligandability and highlight opportunities for successful lead and tool discovery.
Domain Decomposition By the Advancing-Partition Method for Parallel Unstructured Grid Generation
NASA Technical Reports Server (NTRS)
Pirzadeh, Shahyar Z.; Zagaris, George
2009-01-01
A new method of domain decomposition has been developed for generating unstructured grids in subdomains either sequentially or using multiple computers in parallel. Domain decomposition is a crucial and challenging step for parallel grid generation. Prior methods are generally based on auxiliary, complex, and computationally intensive operations for defining partition interfaces and usually produce grids of lower quality than those generated in single domains. The new technique, referred to as "Advancing Partition," is based on the Advancing-Front method, which partitions a domain as part of the volume mesh generation in a consistent and "natural" way. The benefits of this approach are: 1) the process of domain decomposition is highly automated, 2) partitioning of domain does not compromise the quality of the generated grids, and 3) the computational overhead for domain decomposition is minimal. The new method has been implemented in NASA's unstructured grid generation code VGRID.
A Concept for Airborne Precision Spacing for Dependent Parallel Approaches
NASA Technical Reports Server (NTRS)
Barmore, Bryan E.; Baxley, Brian T.; Abbott, Terence S.; Capron, William R.; Smith, Colin L.; Shay, Richard F.; Hubbs, Clay
2012-01-01
The Airborne Precision Spacing concept of operations has been previously developed to support the precise delivery of aircraft landing successively on the same runway. The high-precision and consistent delivery of inter-aircraft spacing allows for increased runway throughput and the use of energy-efficient arrivals routes such as Continuous Descent Arrivals and Optimized Profile Descents. This paper describes an extension to the Airborne Precision Spacing concept to enable dependent parallel approach operations where the spacing aircraft must manage their in-trail spacing from a leading aircraft on approach to the same runway and spacing from an aircraft on approach to a parallel runway. Functionality for supporting automation is discussed as well as procedures for pilots and controllers. An analysis is performed to identify the required information and a new ADS-B report is proposed to support these information needs. Finally, several scenarios are described in detail.
Peker, Musa; Şen, Baha; Gürüler, Hüseyin
2015-02-01
The effect of anesthesia on the patient is referred to as the depth of anesthesia. Rapid classification of the appropriate depth level of anesthesia is a matter of great importance in surgical operations. Similarly, accelerating classification algorithms is important for the rapid solution of problems in the field of biomedical signal processing. However, numerous time-consuming mathematical operations are required during the training and testing stages of classification algorithms, especially in neural networks. In this study, to accelerate the process, a parallel programming and computing platform (Nvidia CUDA), which facilitates dramatic increases in computing performance by harnessing the power of the graphics processing unit (GPU), was utilized. The system was employed to detect the anesthetic depth level on a related electroencephalogram (EEG) data set, which is rather complex and large. Moreover, achieving more anesthetic levels with rapid response is critical in anesthesia. The proposed parallelization method yielded highly accurate classification results in a shorter time.
HeNCE: A Heterogeneous Network Computing Environment
Beguelin, Adam; Dongarra, Jack J.; Geist, George Al; ...
1994-01-01
Network computing seeks to utilize the aggregate resources of many networked computers to solve a single problem. In so doing it is often possible to obtain supercomputer performance from an inexpensive local area network. The drawback is that network computing is complicated and error prone when done by hand, especially if the computers have different operating systems and data formats and are thus heterogeneous. The heterogeneous network computing environment (HeNCE) is an integrated graphical environment for creating and running parallel programs over a heterogeneous collection of computers. It is built on a lower level package called parallel virtual machine (PVM). The HeNCE philosophy of parallel programming is to have the programmer graphically specify the parallelism of a computation and to automate, as much as possible, the tasks of writing, compiling, executing, debugging, and tracing the network computation. Key to HeNCE is a graphical language based on directed graphs that describe the parallelism and data dependencies of an application. Nodes in the graphs represent conventional Fortran or C subroutines and the arcs represent data and control flow. This article describes the present state of HeNCE, its capabilities, limitations, and areas of future research.
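The graph-based model described above (nodes as routines, arcs as data and control flow) can be illustrated with a toy executor. The sketch below is a minimal dependency-ordered scheduler, not HeNCE's actual implementation, and the three-node pipeline and its names are invented.

```python
# Toy executor for a directed task graph: a node runs once all of its
# predecessors have produced output, mirroring the data-dependency
# semantics of a HeNCE-style graph. (Ready nodes could run in parallel;
# this sketch runs them sequentially for clarity.)

def run_graph(nodes, edges):
    """nodes: name -> callable(inputs dict); edges: (src, dst) pairs."""
    preds = {n: [s for s, d in edges if d == n] for n in nodes}
    done, outputs = set(), {}
    while len(done) < len(nodes):
        ready = [n for n in nodes if n not in done
                 and all(p in done for p in preds[n])]
        if not ready:
            raise ValueError("cycle in task graph")
        for n in ready:
            outputs[n] = nodes[n]({p: outputs[p] for p in preds[n]})
            done.add(n)
    return outputs

# Invented three-stage pipeline: load -> scale -> report.
nodes = {
    "load":   lambda ins: 10,
    "scale":  lambda ins: ins["load"] * 3,
    "report": lambda ins: ins["scale"] + 1,
}
edges = [("load", "scale"), ("scale", "report")]
result = run_graph(nodes, edges)
```

In HeNCE the nodes would be Fortran or C subroutines distributed over PVM hosts rather than in-process lambdas, but the scheduling question, which nodes are ready given the arcs, is the same.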
A New Automated Method and Sample Data Flow for Analysis of Volatile Nitrosamines in Human Urine*
Hodgson, James A.; Seyler, Tiffany H.; McGahee, Ernest; Arnstein, Stephen; Wang, Lanqing
2016-01-01
Volatile nitrosamines (VNAs) are a group of compounds classified as probable (group 2A) and possible (group 2B) carcinogens in humans. Along with certain foods and contaminated drinking water, VNAs are detected at high levels in tobacco products and in both mainstream and sidestream smoke. Our laboratory monitors six urinary VNAs—N-nitrosodimethylamine (NDMA), N-nitrosomethylethylamine (NMEA), N-nitrosodiethylamine (NDEA), N-nitrosopiperidine (NPIP), N-nitrosopyrrolidine (NPYR), and N-nitrosomorpholine (NMOR)—using isotope dilution GC-MS/MS (QQQ) for large population studies such as the National Health and Nutrition Examination Survey (NHANES). In this paper, we report for the first time a new automated sample preparation method to more efficiently quantitate these VNAs. Automation is done using Hamilton STAR™ and Caliper Staccato™ workstations. This new automated method reduces sample preparation time from 4 hours to 2.5 hours while maintaining precision (inter-run CV < 10%) and accuracy (85%-111%). More importantly, this method increases sample throughput while maintaining a low limit of detection (<10 pg/mL) for all analytes. A streamlined sample data flow was created in parallel to the automated method, in which samples can be tracked from receiving to final LIMS output with minimal human intervention, further minimizing human error in the sample preparation process. This new automated method and the sample data flow are currently applied in biomonitoring of VNAs in the US non-institutionalized population in the NHANES 2013-2014 cycle. PMID:26949569
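The two quality-control figures quoted above, inter-run CV (< 10%) and accuracy (85%-111%), are simple functions of replicate results. A sketch with fabricated replicate values, not the paper's data:

```python
# Inter-run coefficient of variation (CV) and accuracy against a known
# (e.g., spiked) concentration. Replicate values below are invented.
from statistics import mean, stdev

def cv_percent(values):
    """Inter-run CV: relative standard deviation, in percent."""
    return 100 * stdev(values) / mean(values)

def accuracy_percent(measured_mean, known):
    """Accuracy: measured mean as a percentage of the known value."""
    return 100 * measured_mean / known

runs = [9.8, 10.4, 10.1, 9.9]     # hypothetical replicate results, pg/mL
cv = cv_percent(runs)             # must sit below the 10% criterion
acc = accuracy_percent(mean(runs), 10.0)
```

A method validation report like the one above would compute these per analyte across all inter-run replicates.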
Ferrara, Giuseppe; Mercedes Panizol, Maria; Mazzone, Marja; Delia Pequeneze, Maria; Reviakina, Vera
2014-12-01
The aim of this study was to compare the identification of clinically relevant yeasts by the Vitek YBC and Microscan Walk Away RYID automated methods with conventional phenotypic methods. One hundred and ninety-three yeast strains isolated from clinical samples and five control strains were used. All the yeasts were identified by the automated methods previously mentioned and by conventional phenotypic methods such as carbohydrate assimilation, visualization of microscopic morphology on corn meal agar, and the use of chromogenic agar. Variables were assessed by 2 x 2 contingency tables, McNemar's chi-square, and the Kappa index; concordance values were calculated, as well as major and minor errors for the automated methods. Yeasts were divided into two groups: (1) frequently isolated and (2) rarely isolated. The Vitek YBC and Microscan Walk Away RYID systems were concordant with conventional phenotypic methods in 88.4% and 85.9% of cases, respectively. Although both automated systems can be used for yeast identification, the presence of major and minor errors indicates the possibility of misidentifications; therefore, the operator of this equipment should use phenotypic tests in parallel, such as visualization of microscopic morphology on corn meal agar and chromogenic agar, especially for infrequently isolated yeasts. Automated systems are a valuable tool; however, the expertise and judgment of the microbiologist are an important strength to ensure the quality of the results.
Schmitt, John; Beller, Justin; Russell, Brian; Quach, Anthony; Hermann, Elizabeth; Lyon, David; Breit, Jeffrey
2017-01-01
As the biopharmaceutical industry evolves to include more diverse protein formats and processes, more robust control of Critical Quality Attributes (CQAs) is needed to maintain processing flexibility without compromising quality. Active control of CQAs has been demonstrated using model predictive control techniques, which allow development of processes which are robust against disturbances associated with raw material variability and other potentially flexible operating conditions. Wide adoption of model predictive control in biopharmaceutical cell culture processes has been hampered, however, in part due to the large amount of data and expertise required to make a predictive model of controlled CQAs, a requirement for model predictive control. Here we developed a highly automated, perfusion apparatus to systematically and efficiently generate predictive models using application of system identification approaches. We successfully created a predictive model of %galactosylation using data obtained by manipulating galactose concentration in the perfusion apparatus in serialized step change experiments. We then demonstrated the use of the model in a model predictive controller in a simulated control scenario to successfully achieve a %galactosylation set point in a simulated fed‐batch culture. The automated model identification approach demonstrated here can potentially be generalized to many CQAs, and could be a more efficient, faster, and highly automated alternative to batch experiments for developing predictive models in cell culture processes, and allow the wider adoption of model predictive control in biopharmaceutical processes. © 2017 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers Biotechnol. Prog., 33:1647–1661, 2017 PMID:28786215
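The control scheme described above can be caricatured with a one-step predictive model: a model identified from step-change data predicts the CQA at the next interval, and the controller inverts that prediction to choose the next input. The first-order model and its coefficients below are invented assumptions, not the identified model from the study; the sketch only shows the mechanism.

```python
# One-step predictive control sketch for a CQA such as %galactosylation.
# Assumed identified model (hypothetical coefficients):
#   y[k+1] = A*y[k] + B*u[k], y = CQA, u = galactose feed.
A, B = 0.9, 0.5

def predict(y, u):
    """One-step-ahead model prediction of the CQA."""
    return A * y + B * u

def control_move(y, setpoint):
    """Invert the model: choose the feed u that lands on the setpoint."""
    return (setpoint - A * y) / B

y, setpoint = 10.0, 12.0
u = control_move(y, setpoint)   # feed to apply over the next interval
y_next = predict(y, u)          # model predicts we hit the target
```

A real model predictive controller optimizes over a multi-step horizon with input constraints; this single-step inversion is only the smallest illustration of using an identified model inside the control loop.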
Automation in the clinical microbiology laboratory.
Novak, Susan M; Marlowe, Elizabeth M
2013-09-01
Imagine a clinical microbiology laboratory where a patient's specimens are placed on a conveyor belt and sent on an automation line for processing and plating. Technologists need only log onto a computer to visualize the images of a culture and send to a mass spectrometer for identification. Once a pathogen is identified, the system knows to send the colony for susceptibility testing. This is the future of the clinical microbiology laboratory. This article outlines the operational and staffing challenges facing clinical microbiology laboratories and the evolution of automation that is shaping the way laboratory medicine will be practiced in the future. Copyright © 2013 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Reinicke, Melinda June
In addition to academic pressures shared with American students, students from other countries studying in the United States have the stress of living in an unfamiliar culture. Common symptoms of culture shock (irritability, loneliness, depression, rigidity) have been identified. Parallel symptoms have been described in the learned helplessness…
NASA Astrophysics Data System (ADS)
Blaber, Elizabeth; Dvorochkin, Natalya; Almeida, Eduardo; Fitzpatrick, Garret; Ellingson, Lance; Mitchell, Sarah; Yang, Anthony; Kosnik, Cristine; Rayl, Nicole; Cannon, Tom; Austin, Edward; Sato, Kevin
With the recent call by the 2011 Decadal Report and the 2010 Space Biosciences Roadmap for the International Space Station (ISS) to be used as a National Laboratory for scientific research, there is now a need for new laboratory instruments on ISS to enable such research to occur. The Bioculture System supports the extended culturing of multiple cell types and microbiological specimens. It consists of a docking station that carries ten independent incubation units or ‘Cassettes’. Each Cassette contains a cooling chamber (5 °C) for temperature-sensitive solutions and samples, or long-duration fluid and sample storage, as well as an incubation chamber (ambient up to 42 °C). Each Cassette houses an independent fluidics system comprised of a biochamber, medical-grade fluid tubing, medium warming module, oxygenation module, fluid pump, and sixteen solenoid valves for automated biochamber injection or sampling. The Bioculture System provides the user with the ability to select the incubation temperature, fluid flow rate, and automated biochamber sampling or injection events for each separate Cassette. Furthermore, the ISS crew can access the biochamber, media bag, and accessory bags on-orbit using the Microgravity Science Glovebox. The Bioculture System also permits initiation of cultures, subculturing, injection of compounds, and removal of samples for on-orbit processing using ISS facilities. The Bioculture System therefore provides a unique opportunity for the study of stem cells and other cell types in space. The first validation flight of the Bioculture System will be conducted on SpaceX-5, consisting of 8 Cassettes and lasting for 30-37 days. During this flight we plan to culture two different mammalian cell types in bioreactors: a mouse osteocytic-like cell line, and human induced pluripotent stem cell (iPS)-derived cardiomyocytes.
Specifically, the osteocytic line will enable the study of a type of cell that has been flown on the Bioculture System’s predecessor, the Cell Culture Module, whilst demonstrating the Bioculture Systems bead-based sub-culturing capabilities, automated sampling and fixation, manual sample removal/storage by ISS crew members, and whole bioreactor fixation. These activities will enable, for the first time, the long-duration culture of a proliferative cell line. Furthermore, these activities will facilitate genetic and proteomic analysis of these cells at several time points to determine cell health throughout the culture period. The long-duration culture of iPS-derived cardiomyocytes will afford us the capability to assess the maturation and formation of a cardiac-like tissue in microgravity conditions. Automated sampling of this culture immediately prior to un-berthing from the ISS will enable genetic analysis of the mature cardiomyocyte tissue, whilst still enabling the return of live cultures for analysis of cardiomyocyte morphology, contractility, and viability in response to spaceflight. This validation flight will demonstrate the new functional capabilities of the Bioculture System and the System will enable, for the first time, the study of the response of stem cells and other cell lineages to long-duration spaceflight exposure, whilst enabling normal cell culturing techniques to be automatically conducted on ISS.
Trends in Modern Drug Discovery.
Eder, Jörg; Herrling, Paul L
2016-01-01
Drugs discovered by the pharmaceutical industry over the past 100 years have dramatically changed the practice of medicine and impacted on many aspects of our culture. For many years, drug discovery was a target- and mechanism-agnostic approach that was based on ethnobotanical knowledge often fueled by serendipity. With the advent of modern molecular biology methods and based on knowledge of the human genome, drug discovery has now largely changed into a hypothesis-driven target-based approach, a development which was paralleled by significant environmental changes in the pharmaceutical industry. Laboratories became increasingly computerized and automated, and geographically dispersed research sites are now more and more clustered into large centers to capture technological and biological synergies. Today, academia, the regulatory agencies, and the pharmaceutical industry all contribute to drug discovery, and, in order to translate the basic science into new medical treatments for unmet medical needs, pharmaceutical companies have to have a critical mass of excellent scientists working in many therapeutic fields, disciplines, and technologies. The imperative for the pharmaceutical industry to discover breakthrough medicines is matched by the increasing numbers of first-in-class drugs approved in recent years and reflects the impact of modern drug discovery approaches, technologies, and genomics.
Automated Microbial Metabolism Laboratory
NASA Technical Reports Server (NTRS)
1973-01-01
Development of the automated microbial metabolism laboratory (AMML) concept is reported. The focus of effort of AMML was on the advanced labeled release experiment. Labeled substrates, inhibitors, and temperatures were investigated to establish a comparative biochemical profile. Profiles at three time intervals on soil and pure cultures of bacteria isolated from soil were prepared to establish a complete library. The development of a strategy for the return of a soil sample from Mars is also reported.
Office automation: The administrative window into the integrated DBMS
NASA Technical Reports Server (NTRS)
Brock, G. H.
1985-01-01
In parallel to the evolution of Management Information Systems from simple data files to complex data bases, the stand-alone computer systems have been migrating toward fully integrated systems serving the work force. The next major productivity gain may very well be to make these highly sophisticated working level Data Base Management Systems (DBMS) serve all levels of management with reports of varying levels of detail. Most attempts by the DBMS development organization to provide useful information to management seem to bog down in the quagmire of competing working level requirements. Most large DBMS development organizations possess three to five year backlogs. Perhaps Office Automation is the vehicle that brings to pass the Management Information System that really serves management. A good office automation system manned by a team of facilitators seeking opportunities to serve end-users could go a long way toward defining a DBMS that serves management. This paper will briefly discuss the problems of the DBMS organization, alternative approaches to solving some of the major problems, a debate about problems that may have no solution, and finally how office automation fits into the development of the Manager's Management Information System.
Röst, Hannes L; Liu, Yansheng; D'Agostino, Giuseppe; Zanella, Matteo; Navarro, Pedro; Rosenberger, George; Collins, Ben C; Gillet, Ludovic; Testa, Giuseppe; Malmström, Lars; Aebersold, Ruedi
2016-09-01
Next-generation mass spectrometric (MS) techniques such as SWATH-MS have substantially increased the throughput and reproducibility of proteomic analysis, but ensuring consistent quantification of thousands of peptide analytes across multiple liquid chromatography-tandem MS (LC-MS/MS) runs remains a challenging and laborious manual process. To produce highly consistent and quantitatively accurate proteomics data matrices in an automated fashion, we developed TRIC (http://proteomics.ethz.ch/tric/), a software tool that utilizes fragment-ion data to perform cross-run alignment, consistent peak-picking and quantification for high-throughput targeted proteomics. TRIC reduced the identification error compared to a state-of-the-art SWATH-MS analysis without alignment by more than threefold at constant recall while correcting for highly nonlinear chromatographic effects. On a pulsed-SILAC experiment performed on human induced pluripotent stem cells, TRIC was able to automatically align and quantify thousands of light and heavy isotopic peak groups. Thus, TRIC fills a gap in the pipeline for automated analysis of massively parallel targeted proteomics data sets.
On the Use of Parametric-CAD Systems and Cartesian Methods for Aerodynamic Design
NASA Technical Reports Server (NTRS)
Nemec, Marian; Aftosmis, Michael J.; Pulliam, Thomas H.
2004-01-01
Automated, high-fidelity tools for aerodynamic design face critical issues in attempting to optimize real-life geometry and in permitting radical design changes. Success in these areas promises not only significantly shorter design-cycle times, but also superior and unconventional designs. To address these issues, we investigate the use of a parametric-CAD system in conjunction with an embedded-boundary Cartesian method. Our goal is to combine the modeling capabilities of feature-based CAD with the robustness and flexibility of component-based Cartesian volume-mesh generation for complex geometry problems. We present the development of an automated optimization framework with a focus on the deployment of such a CAD-based design approach in a heterogeneous parallel computing environment.
Voorhaar, Lenny; De Meyer, Bernhard; Du Prez, Filip; Hoogenboom, Richard
2016-10-01
The preparation of physically crosslinked hydrogels from quasi ABA-triblock copolymers with a water-soluble middle block and hydrophobic end groups is reported. The hydrophilic monomer N-acryloylmorpholine is copolymerized with hydrophobic isobornyl acrylate via a one-pot sequential monomer addition through reversible addition fragmentation chain-transfer (RAFT) polymerization in an automated parallel synthesizer, allowing systematic variation of polymer chain length and hydrophobic-hydrophilic ratio. Hydrophobic interactions between the outer blocks cause them to phase-separate into larger hydrophobic domains in water, forming physical crosslinks between the polymers. The resulting hydrogels are studied using rheology and their self-healing ability after large strain damage is shown. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Martínez-Palou, Rafael; Zepeda, L Gerardo; Höpfl, Herbert; Montoya, Ascensión; Guzmán-Lucero, Diego J; Guzmán, Javier
2005-01-01
A versatile route to a 40-membered library of 2-long-alkyl-chain-substituted benzoazoles (1 and 2) and azole[4,5-b]pyridines (3 and 4) via microwave-assisted combinatorial synthesis was developed. The reactions were carried out in both monomode and multimode microwave ovens. With the latter, all reactions were performed in a high-throughput experimental setting consisting of an 8 x 5 combinatorial library designed to synthesize 40 compounds. Each step, from the addition of reagents to the recovery of final products, was automated. The microwave-assisted N-long-chain alkylation reactions of 2-alkyl-1H-benzimidazole (1) and 2-alkyl-1H-benzimidazole[4,5-b]pyridines (3) were also studied.
A parallel expert system for the control of a robotic air vehicle
NASA Technical Reports Server (NTRS)
Shakley, Donald; Lamont, Gary B.
1988-01-01
Expert systems can be used to govern the intelligent control of vehicles, for example the Robotic Air Vehicle (RAV). Due to the nature of the RAV system the associated expert system needs to perform in a demanding real-time environment. The use of a parallel processing capability to support the associated expert system's computational requirement is critical in this application. Thus, algorithms for parallel real-time expert systems must be designed, analyzed, and synthesized. The design process incorporates a consideration of the rule-set/fact-set size along with representation issues. These issues are looked at in reference to information movement and various inference mechanisms. Also examined is the process involved with transporting the RAV expert system functions from the TI Explorer, where they are implemented in the Automated Reasoning Tool (ART), to the iPSC Hypercube, where the system is synthesized using Concurrent Common LISP (CCLISP). The transformation process for the ART to CCLISP conversion is described. The performance characteristics of the parallel implementation of these expert systems on the iPSC Hypercube are compared to the TI Explorer implementation.
Shibuta, Mayu; Tamura, Masato; Kanie, Kei; Yanagisawa, Masumi; Matsui, Hirofumi; Satoh, Taku; Takagi, Toshiyuki; Kanamori, Toshiyuki; Sugiura, Shinji; Kato, Ryuji
2018-06-09
Cellular morphology on and in a scaffold composed of extracellular matrix generally represents the cellular phenotype. Morphology-based cell separation is therefore an attractive method that, unlike conventional approaches (e.g., fluorescence-activated cell sorting and magnetic-activated cell sorting), requires no staining of surface markers. In our previous study, we proposed a cloning technology using a photodegradable gelatin hydrogel to separate individual cells on and in hydrogels. To further expand the applicability of this photodegradable hydrogel culture platform, we here report an image-based cell separation system, the imaging cell picker, for morphology-based cell separation on a photodegradable hydrogel. The platform enables an automated workflow of image acquisition, image processing and morphology analysis, and collection of target cells. We characterized the morphology-based separation performance by optimizing the critical parameters that determine it, namely (i) culture conditions, (ii) imaging conditions, and (iii) the image analysis scheme, to clone the cells of interest. Furthermore, we demonstrated morphology-based cloning of cancer cells from a mixed cell population by automated hydrogel degradation with light irradiation and pipetting. Copyright © 2018 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
Urinalysis and Urinary Tract Infection: Update for Clinicians
Young, Jennifer L.
2001-01-01
Dysuria is a common presenting complaint of women, and urinalysis is a valuable tool in the initial evaluation of this presentation. Clinicians need to be aware that pyuria is the best determinant of bacteriuria requiring therapy and that values significant for infection differ depending on the method of analysis. A hemocytometer yields a value of ≥10 WBC/mm³ significant for bacteriuria, while manual microscopy studies show ≥8 WBC/high-power field reliably predicts a positive urine culture. In cases of uncomplicated symptomatic urinary tract infection, a positive value for nitrites and leukocyte esterase by urine dipstick can be treated without the need for a urine culture. Automated urinalysis, used widely in large-volume laboratories, provides more sensitive detection of leukocytes and bacteria in the urine. With automated microscopy, a value of >2 WBC/hpf is significant pyuria indicative of inflammation of the urinary tract. In complicated cases such as pregnancy, recurrent infection, or renal involvement, further evaluation is necessary, including manual microscopy and urine culture with sensitivities. PMID:11916184
Semiautomated Method for Microbiological Vitamin Assays
Berg, T. M.; Behagel, H. A.
1972-01-01
A semiautomated method for microbiological vitamin assays is described, which includes separate automated systems for the preparation of the cultures and for the measurement of turbidity. In the dilution and dosage unit based on the continuous-flow principle, vitamin samples were diluted to two different dose levels at a rate of 40 per hr, mixed with the inoculated test broth, and dispensed into culture tubes. After incubation, racks with culture tubes were placed on the sampler of an automatic turbidimeter. This unit, based on the discrete-sample system, measured the turbidity and printed the extinction values at a rate of 300 per hr. Calculations were computerized and the results, including statistical data, are presented in an easily readable form. The automated method is in routine use for the assays of thiamine, riboflavine, pyridoxine, cyanocobalamin, calcium pantothenate, nicotinic acid, pantothenol, and folic acid. Identical vitamin solutions assayed on different days gave variation coefficients for the various vitamin assays of less than 10%. PMID:4553802
Huebsch, Nathaniel; Loskill, Peter; Mandegar, Mohammad A; Marks, Natalie C; Sheehan, Alice S; Ma, Zhen; Mathur, Anurag; Nguyen, Trieu N; Yoo, Jennie C; Judge, Luke M; Spencer, C Ian; Chukka, Anand C; Russell, Caitlin R; So, Po-Lin; Conklin, Bruce R; Healy, Kevin E
2015-05-01
Contractile motion is the simplest metric of cardiomyocyte health in vitro, but unbiased quantification is challenging. We describe a rapid automated method, requiring only standard video microscopy, to analyze the contractility of human-induced pluripotent stem cell-derived cardiomyocytes (iPS-CM). New algorithms for generating and filtering motion vectors combined with a newly developed isogenic iPSC line harboring genetically encoded calcium indicator, GCaMP6f, allow simultaneous user-independent measurement and analysis of the coupling between calcium flux and contractility. The relative performance of these algorithms, in terms of improving signal to noise, was tested. Applying these algorithms allowed analysis of contractility in iPS-CM cultured over multiple spatial scales from single cells to three-dimensional constructs. This open source software was validated with analysis of isoproterenol response in these cells, and can be applied in future studies comparing the drug responsiveness of iPS-CM cultured in different microenvironments in the context of tissue engineering.
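As a loose illustration of video-based contractility analysis (not the authors' published motion-vector algorithm), a summed frame-to-frame difference can serve as a crude proxy signal for contractile motion; the frame values below are invented:

```python
import numpy as np

# Tiny frame stack standing in for video microscopy data (values invented):
# a still frame, a "contracted" frame, then no further movement
frames = [np.zeros((4, 4)), np.full((4, 4), 0.5), np.full((4, 4), 0.5)]

# Summed absolute frame-to-frame difference as a contraction-motion proxy
motion = [float(np.abs(b - a).sum()) for a, b in zip(frames, frames[1:])]
print(motion)  # [8.0, 0.0] -> motion peaks where the tissue moves, then quiets
```

A real pipeline would compute dense motion vectors per block or pixel and filter them, as the abstract describes; the scalar trace here only conveys the idea of extracting a beat signal from imagery.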
Physics Mining of Multi-Source Data Sets
NASA Technical Reports Server (NTRS)
Helly, John; Karimabadi, Homa; Sipes, Tamara
2012-01-01
Powerful new parallel data mining algorithms can produce diagnostic and prognostic numerical models and analyses from observational data. These techniques yield higher-resolution measures than ever before of environmental parameters by fusing synoptic imagery and time-series measurements. These techniques are general and relevant to observational data, including raster, vector, and scalar, and can be applied in all Earth- and environmental science domains. Because they can be highly automated and are parallel, they scale to large spatial domains and are well suited to change and gap detection. This makes it possible to analyze spatial and temporal gaps in information, and facilitates within-mission replanning to optimize the allocation of observational resources. The basis of the innovation is the extension of a recently developed set of algorithms packaged into MineTool to multi-variate time-series data. MineTool is unique in that it automates the various steps of the data mining process, thus making it amenable to autonomous analysis of large data sets. Unlike techniques such as Artificial Neural Nets, which yield a blackbox solution, MineTool's outcome is always an analytical model in parametric form that expresses the output in terms of the input variables. This has the advantage that the derived equation can then be used to gain insight into the physical relevance and relative importance of the parameters and coefficients in the model. This is referred to as physics-mining of data. The capabilities of MineTool are extended to include both supervised and unsupervised algorithms, to handle multi-type data sets, and to run in parallel.
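The contrast drawn above between black-box fits and analytical models in parametric form can be sketched with an ordinary least-squares fit (this is a generic illustration, not MineTool's actual algorithm; the data are synthetic):

```python
import numpy as np

# Synthetic observations that happen to follow a known linear law y = 2x + 1
x = np.linspace(0.0, 10.0, 50)
y = 2.0 * x + 1.0

# Fit a parametric model y = c1*x + c0; unlike a neural-net fit, the
# coefficients are directly inspectable for physical relevance
c1, c0 = np.polyfit(x, y, 1)
print(round(c1, 3), round(c0, 3))  # 2.0 1.0
```

The recovered equation, not just its predictions, is the deliverable, which is the sense in which the abstract speaks of "physics-mining" of data.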
Enterprise systems in Russia: 1992-2012
NASA Astrophysics Data System (ADS)
Kataev, Michael Yu; Bulysheva, Larisa A.; Emelyanenko, Alexander A.; Emelyanenko, Vladimir A.
2013-05-01
This paper introduces enterprise systems (ES) development and implementation in Russia over the period 1992-2012. Historic analysis shows that, in terms of time frame, the development of ACS (Automated Control Systems) in the former Soviet Union and of ERP (Enterprise Resource Planning) in the West was almost parallel. The current status and major trends of ES in Russia are also discussed.
Joint Experimentation on Scalable Parallel Processors (JESPP)
2006-04-01
Made use of local embedded relational databases, implemented using sqlite on each node of an SPP to execute queries and return results via an ad hoc ... The Joint Experimentation Directorate (J9) required expansion of its joint semi-automated forces (JSAF) code capabilities, including number of entities and behavior complexity. Approved for public release; distribution unlimited.
Social aspects of automation: Some critical insights
NASA Astrophysics Data System (ADS)
Nouzil, Ibrahim; Raza, Ali; Pervaiz, Salman
2017-09-01
Sustainable development has been recognized globally as one of the major driving forces behind current technological innovations. To achieve sustainable development and attain its associated goals, it is very important to properly address its concerns in different aspects of technological innovation. Several industrial sectors have enjoyed productivity and economic gains due to the advent of automation technology, and it is important to characterize sustainability for that technology. Sustainability is a key factor that will determine the future of our neighbours in time, and it must be tightly wrapped around the double-edged sword of technology. In this study, different impacts of automation are addressed using the ‘Circles of Sustainability’ approach as a framework, covering economic, political, cultural and ecological aspects and their implications. A systematic literature review of automation technology from its inception is outlined and plotted against its many outcomes covering a broad spectrum. The study focuses in particular on the social aspects of automation technology, reviewing the literature to analyse employment deficiency as one end of the social impact spectrum. On the other end of the spectrum, benefits to society through technological advancements, such as the Internet of Things (IoT) coupled with automation, are presented.
NASA Astrophysics Data System (ADS)
Glass, B. J.; Cannon, H.; Bonaccorsi, R.; Zacny, K.
2006-12-01
The Drilling Automation for Mars Exploration (DAME) project's purpose is to develop and field-test drilling automation and robotics technologies for projected use in missions in the 2011-15 period. DAME includes control of the drilling hardware, and state estimation of both the hardware and the lithology being drilled and the state of the hole. A sister drill was constructed for the Mars Analog Río Tinto Experiment (MARTE) project and demonstrated automated core handling and string changeout in 2005 drilling tests at Rio Tinto, Spain. DAME focused instead on the problem of controlling the drill during active drilling without getting stuck. Together, the DAME and MARTE projects demonstrate a fully automated robotic drilling capability, including hands-off drilling, adjustment to different strata and downhole conditions, recovery from drilling faults (binding, choking, etc.), drill string changeouts, core acquisition and removal, and sample handling and conveyance to in-situ instruments. The 2006 top-level goal of DAME drilling in-situ tests was to verify and demonstrate a capability for hands-off automated drilling, at an Arctic Mars-analog site. There were three sets of 2006 test goals, all of which were exceeded during the July 2006 field season. The first was to demonstrate the recognition, while drilling, of at least three of the six known major fault modes for the DAME planetary-prototype drill, and to employ the correct recovery or safing procedure in response. The second set of 2006 goals was to operate for three or more hours autonomously, hands-off. And the third 2006 goal was to exceed 3m depth into the frozen breccia and permafrost with the DAME drill (it had not gone further than 2.2m previously). Five of six faults were detected and corrected, there were 43 hours of hands-off drilling (including a 4 hour sequence with no human presence nearby), and 3.2m was the total depth.
Ground-truth drilling used small commercial drilling equipment in parallel to obtain cores and ice profiles at the drilling site. In the course of DAME drilling automation testing, the drilling-induced temperature gradients and their effects on encountered subsurface permafrost and ice layers were observed while drilling in frozen impact breccia at Haughton Crater. In repeated tests of robotic core removal processing and handling in the MARTE project, including field tests, cross-contamination issues arose between successive cores and samples, and procedures and metrics were developed for minimizing the cross-contamination. The MARTE core processing cross-contamination aspects were tested by analyzing a set of pristine samples (those stratigraphically known) vs. cuttings (loose clays) or artifacts from the robotic drilling (indurated clay layers). MARTE ground truth drilling, in parallel with the automated tests, provided control information on the discontinuity/continuity of the stratigraphic record (i.e., texture, color and structure of loose and consolidated materials).
Brennan; Biddison; Frauendorf; Schwarcz; Keen; Ecker; Davis; Tinder; Swayze
1998-01-01
An automated, 96-well parallel array synthesizer for solid-phase organic synthesis has been designed and constructed. The instrument employs a unique reagent array delivery format, in which each reagent utilized has a dedicated plumbing system. An inert atmosphere is maintained during all phases of a synthesis, and temperature can be controlled via a thermal transfer plate which holds the injection molded reaction block. The reaction plate assembly slides in the X-axis direction, while eight nozzle blocks holding the reagent lines slide in the Y-axis direction, allowing for the extremely rapid delivery of any of 64 reagents to 96 wells. In addition, there are six banks of fixed nozzle blocks, which deliver the same reagent or solvent to eight wells at once, for a total of 72 possible reagents. The instrument is controlled by software which allows the straightforward programming of the synthesis of a large number of compounds. This is accomplished by supplying a general synthetic procedure in the form of a command file, which calls upon certain reagents to be added to specific wells via lookup in a sequence file. The bottle position, flow rate, and concentration of each reagent are stored in a separate reagent table file. To demonstrate the utility of the parallel array synthesizer, a small combinatorial library of hydroxamic acids was prepared in high throughput mode for biological screening. Approximately 1300 compounds were prepared on a 10 μmole scale (3-5 mg) in a few weeks. The resulting crude compounds were generally >80% pure, and were utilized directly for high throughput screening in antibacterial assays. Several active wells were found, and the activity was verified by solution-phase synthesis of analytically pure material, indicating that the system described herein is an efficient means for the parallel synthesis of compounds for lead discovery. Copyright 1998 John Wiley & Sons, Inc.
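The command-file / sequence-file / reagent-table scheme described above can be sketched as a simple lookup pipeline. Everything here is hypothetical (the instrument's actual file formats, reagent names, and fields are not given in the abstract):

```python
# Hypothetical reagent table: reagent -> (bottle position, flow rate mL/min, conc M)
reagent_table = {
    "Fmoc-Gly": (3, 1.2, 0.50),
    "HBTU":     (7, 1.0, 0.45),
}

# Hypothetical sequence file: well -> reagents to deliver for this procedure step
sequence = {"A1": ["Fmoc-Gly", "HBTU"], "A2": ["HBTU"]}

def build_plan(sequence, reagent_table):
    """Resolve each well's reagent list against the reagent table,
    yielding one delivery record per (well, reagent) pair."""
    plan = []
    for well, reagents in sequence.items():
        for r in reagents:
            bottle, flow_rate, conc = reagent_table[r]
            plan.append({"well": well, "reagent": r, "bottle": bottle,
                         "flow_mL_min": flow_rate, "conc_M": conc})
    return plan

plan = build_plan(sequence, reagent_table)
print(len(plan))  # 3 deliveries: two to A1, one to A2
```

The point of the design, as the abstract notes, is that the general procedure (command file) stays fixed while the per-well mapping and per-reagent plumbing parameters live in separate tables.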
Cardiac imaging: working towards fully-automated machine analysis & interpretation
Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido
2017-01-01
Introduction Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation. PMID:28277804
The Red Atlantic: Transoceanic Cultural Exchanges
ERIC Educational Resources Information Center
Weaver, Jace
2011-01-01
The development of David Armitage's "white Atlantic" history parallels the Cold War origins of American studies with its mission to define and promote "American culture" or "American civilization." British scholar Paul Gilroy's "The Black Atlantic" served as a necessary corrective. Armitage's statement leads…
Ni, Yizhao; Kennebeck, Stephanie; Dexheimer, Judith W; McAneney, Constance M; Tang, Huaxiu; Lingren, Todd; Li, Qi; Zhai, Haijun; Solti, Imre
2015-01-01
Objectives (1) To develop an automated eligibility screening (ES) approach for clinical trials in an urban tertiary care pediatric emergency department (ED); (2) to assess the effectiveness of natural language processing (NLP), information extraction (IE), and machine learning (ML) techniques on real-world clinical data and trials. Data and methods We collected eligibility criteria for 13 randomly selected, disease-specific clinical trials actively enrolling patients between January 1, 2010 and August 31, 2012. In parallel, we retrospectively selected data fields including demographics, laboratory data, and clinical notes from the electronic health record (EHR) to represent profiles of all 202795 patients visiting the ED during the same period. Leveraging NLP, IE, and ML technologies, the automated ES algorithms identified patients whose profiles matched the trial criteria to reduce the pool of candidates for staff screening. The performance was validated on both a physician-generated gold standard of trial–patient matches and a reference standard of historical trial–patient enrollment decisions, where workload, mean average precision (MAP), and recall were assessed. Results Compared with the case without automation, the workload with automated ES was reduced by 92% on the gold standard set, with a MAP of 62.9%. The automated ES achieved a 450% increase in trial screening efficiency. The findings on the gold standard set were confirmed by large-scale evaluation on the reference set of trial–patient matches. Discussion and conclusion By exploiting the text of trial criteria and the content of EHRs, we demonstrated that NLP-, IE-, and ML-based automated ES could successfully identify patients for clinical trials. PMID:25030032
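The evaluation metrics named above (mean average precision and recall) are standard information-retrieval quantities; a minimal sketch with an invented ranked candidate list and relevance labels, not data from the study:

```python
def average_precision(ranked, relevant):
    """AP: mean of the precision values at each rank where a relevant
    item appears, divided by the total number of relevant items."""
    hits, precisions = 0, []
    for i, pid in enumerate(ranked, start=1):
        if pid in relevant:
            hits += 1
            precisions.append(hits / i)
    return sum(precisions) / len(relevant) if relevant else 0.0

# One hypothetical trial: patients ranked by the ES algorithm vs. true enrollees
ranked = ["p3", "p1", "p7", "p2", "p9"]
relevant = {"p1", "p2"}

ap = average_precision(ranked, relevant)              # (1/2 + 2/4) / 2 = 0.5
recall = len(relevant & set(ranked)) / len(relevant)  # 1.0
```

MAP is this AP averaged over all trials; workload reduction then measures how many candidates staff no longer need to screen manually.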
Godden, S M; Royster, E; Timmerman, J; Rapnicki, P; Green, H
2017-08-01
Study objectives were to (1) describe the diagnostic test characteristics of an automated milk leukocyte differential (MLD) test and the California Mastitis Test (CMT) to identify intramammary infection (IMI) in early- (EL) and late-lactation (LL) quarters and cows when using 3 different approaches to define IMI from milk culture, and (2) describe the repeatability of MLD test results at both the quarter and cow level. Eighty-six EL and 90 LL Holstein cows were sampled from 3 Midwest herds. Quarter milk samples were collected for a cow-side CMT test, milk culture, and MLD testing. Quarter IMI status was defined by 3 methods: culture of a single milk sample, culture of duplicate samples with parallel interpretation, and culture of duplicate samples with serial interpretation. The MLD testing was completed in duplicate within 8 h of sample collection; MLD results (positive/negative) were reported at each possible threshold setting (1-18 for EL; 1-12 for LL) and CMT results (positive/negative) were reported at each possible cut-point (trace, ≥1, ≥2, or 3). We created 2 × 2 tables to compare MLD and CMT results to milk culture, at both the quarter and cow level, when using each of 3 different definitions of IMI as the referent test. Paired MLD test results were compared to evaluate repeatability. The MLD test showed excellent repeatability. The choice of definition of IMI from milk culture had minor effects on estimates of MLD and CMT test characteristics. For EL samples, when interpreting MLD and CMT results at the quarter level, and regardless of the referent test used, both tests had low sensitivity (MLD = 11.7-39.1%; CMT = 0-52.2%) but good to very good specificity (MLD = 82.1-95.2%; CMT = 68.1-100%), depending on the cut-point used. Sensitivity improved slightly if diagnosis was interpreted at the cow level (MLD = 25.6-56.4%; CMT = 0-72.2%), though specificity generally declined (MLD = 61.8-100%; CMT = 25.0-100%) depending on the cut-point used.
For LL samples, when interpreted at the quarter level, both tests had variable sensitivity (MLD = 46.6-84.8%; CMT = 9.6-72.7%) and variable specificity (MLD = 59.2-79.8%; CMT = 52.5-97.3%), depending on the cut-point used. Test sensitivity improved if interpreted at the cow level (MLD = 59.6-86.4%; CMT = 19.1-86.4%), though specificity declined (MLD = 32.4-56.8%; CMT = 14.3-92.3%). Producers considering adopting either test for LL or EL screening programs will need to carefully consider the goals and priorities of the program (e.g., whether to prioritize test sensitivity or specificity) when deciding on the level of interpretation (quarter or cow) and when selecting the optimal cut-point for interpreting test results. Additional validation studies and large randomized field studies will be needed to evaluate the effect of adopting either test in selective dry cow therapy or fresh cow screening programs on udder health, antibiotic use, and economics. © The Authors. Published by the Federation of Animal Science Societies and Elsevier Inc. on behalf of the American Dairy Science Association®. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/3.0/).
Pursuing Darwin’s curious parallel: Prospects for a science of cultural evolution
2017-01-01
In the past few decades, scholars from several disciplines have pursued the curious parallel noted by Darwin between the genetic evolution of species and the cultural evolution of beliefs, skills, knowledge, languages, institutions, and other forms of socially transmitted information. Here, I review current progress in the pursuit of an evolutionary science of culture that is grounded in both biological and evolutionary theory, but also treats culture as more than a proximate mechanism that is directly controlled by genes. Both genetic and cultural evolution can be described as systems of inherited variation that change over time in response to processes such as selection, migration, and drift. Appropriate differences between genetic and cultural change are taken seriously, such as the possibility in the latter of nonrandomly guided variation or transformation, blending inheritance, and one-to-many transmission. The foundation of cultural evolution was laid in the late 20th century with population-genetic style models of cultural microevolution, and the use of phylogenetic methods to reconstruct cultural macroevolution. Since then, there have been major efforts to understand the sociocognitive mechanisms underlying cumulative cultural evolution, the consequences of demography on cultural evolution, the empirical validity of assumed social learning biases, the relative role of transformative and selective processes, and the use of quantitative phylogenetic and multilevel selection models to understand past and present dynamics of society-level change. I conclude by highlighting the interdisciplinary challenges of studying cultural evolution, including its relation to the traditional social sciences and humanities. PMID:28739929
Pursuing Darwin's curious parallel: Prospects for a science of cultural evolution.
Mesoudi, Alex
2017-07-24
Montone, K. T.; Brigati, D. J.; Budgeon, L. R.
1989-01-01
This paper presents the first automated system for simultaneously detecting human papillomavirus, herpes simplex virus, adenovirus, or cytomegalovirus viral antigens and gene sequences in standard formalin-fixed, paraffin-embedded tissue substrates and tissue culture. These viruses can be detected by colorimetric in situ nucleic acid hybridization, using biotinylated DNA probes, or by indirect immunoperoxidase techniques, using polyclonal or monoclonal antibodies, in a 2.0-hour assay performed at a single automated robotic workstation. PMID:2773514
21 CFR 866.2560 - Microbial growth monitor.
Code of Federal Regulations, 2010 CFR
2010-04-01
... measures the concentration of bacteria suspended in a liquid medium by measuring changes in light.... With the exception of automated blood culturing system devices that are used in testing for bacteria...
21 CFR 866.2560 - Microbial growth monitor.
Code of Federal Regulations, 2011 CFR
2011-04-01
... measures the concentration of bacteria suspended in a liquid medium by measuring changes in light.... With the exception of automated blood culturing system devices that are used in testing for bacteria...
Downey, Brandon; Schmitt, John; Beller, Justin; Russell, Brian; Quach, Anthony; Hermann, Elizabeth; Lyon, David; Breit, Jeffrey
2017-11-01
As the biopharmaceutical industry evolves to include more diverse protein formats and processes, more robust control of Critical Quality Attributes (CQAs) is needed to maintain processing flexibility without compromising quality. Active control of CQAs has been demonstrated using model predictive control techniques, which allow development of processes which are robust against disturbances associated with raw material variability and other potentially flexible operating conditions. Wide adoption of model predictive control in biopharmaceutical cell culture processes has been hampered, however, in part due to the large amount of data and expertise required to make a predictive model of controlled CQAs, a requirement for model predictive control. Here we developed a highly automated, perfusion apparatus to systematically and efficiently generate predictive models using application of system identification approaches. We successfully created a predictive model of %galactosylation using data obtained by manipulating galactose concentration in the perfusion apparatus in serialized step change experiments. We then demonstrated the use of the model in a model predictive controller in a simulated control scenario to successfully achieve a %galactosylation set point in a simulated fed-batch culture. The automated model identification approach demonstrated here can potentially be generalized to many CQAs, and could be a more efficient, faster, and highly automated alternative to batch experiments for developing predictive models in cell culture processes, and allow the wider adoption of model predictive control in biopharmaceutical processes. © 2017 The Authors. Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers. Biotechnol. Prog., 33:1647-1661, 2017.
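The step-change identification described in this record can be illustrated with a minimal sketch: fit a first-order discrete-time model y[k+1] = a·y[k] + b·u[k] relating an input (e.g., galactose concentration) to an output CQA from step-response data by least squares. The dynamics and parameter values below are invented for illustration and are not the study's model.

```python
import numpy as np

# Hypothetical "true" first-order dynamics (illustration only).
a_true, b_true = 0.9, 0.5

# Simulate a serialized step-change experiment: input steps up at k = 10.
u = np.zeros(50)
u[10:] = 1.0
y = np.zeros(51)
for k in range(50):
    y[k + 1] = a_true * y[k] + b_true * u[k]

# Least-squares identification of (a, b) from the recorded input/output data.
X = np.column_stack([y[:-1], u])          # regressors: y[k], u[k]
theta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
a_hat, b_hat = theta
print(round(a_hat, 3), round(b_hat, 3))   # recovers the assumed 0.9 and 0.5
```

With noiseless data the fit is exact; in practice, system identification adds excitation design and noise handling on top of this basic regression step.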
Global magnetosphere simulations using constrained-transport Hall-MHD with CWENO reconstruction
NASA Astrophysics Data System (ADS)
Lin, L.; Germaschewski, K.; Maynard, K. M.; Abbott, S.; Bhattacharjee, A.; Raeder, J.
2013-12-01
We present a new CWENO (Centrally-Weighted Essentially Non-Oscillatory) reconstruction based MHD solver for the OpenGGCM global magnetosphere code. The solver was built using libMRC, a library for creating efficient parallel PDE solvers on structured grids. The use of libMRC gives us access to its core functionality of providing an automated code generation framework which takes a user provided PDE right hand side in symbolic form to generate an efficient, computer architecture specific, parallel code. libMRC also supports block-structured adaptive mesh refinement and implicit-time stepping through integration with the PETSc library. We validate the new CWENO Hall-MHD solver against existing solvers both in standard test problems as well as in global magnetosphere simulations.
Robandt, P V; Klette, K L; Sibum, M
2009-10-01
An automated solid-phase extraction coupled with liquid chromatography and tandem mass spectrometry (SPE-LC-MS-MS) method for the analysis of 11-nor-Δ9-tetrahydrocannabinol-9-carboxylic acid (THC-COOH) in human urine specimens was developed. The method was linear (R² = 0.9986) to 1000 ng/mL with no carryover evidenced at 2000 ng/mL. Limits of quantification and detection were found to be 2 ng/mL. Interrun precision was evaluated at the 15 ng/mL level over nine batches spanning 15 days (n = 45); the coefficient of variation (%CV) was 5.5% over the course of the validation. Intrarun precision of a 15 ng/mL control (n = 5) ranged from 0.58% CV to 7.4% CV for the same set of analytical batches. Interference was tested using (±)-11-hydroxy-Δ9-tetrahydrocannabinol, cannabidiol, (−)-Δ8-tetrahydrocannabinol, and cannabinol. One hundred nineteen specimens found to contain THC-COOH by a previously validated gas chromatography-mass spectrometry (GC-MS) procedure were compared to the SPE-LC-MS-MS method, with excellent agreement (R² = 0.9925) in the parallel comparison study. The automated SPE procedure eliminates the human factors of specimen handling, extraction, and derivatization, thereby reducing labor costs and rework resulting from human error or technique issues. Additionally, method runtime is greatly reduced (e.g., during parallel studies the SPE-LC-MS-MS instrument was often finished with analysis by the time the technician finished the offline SPE and derivatization procedure prior to the GC-MS analysis).
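The precision figures quoted above (%CV) are simply the standard deviation expressed as a percentage of the mean across replicate measurements. A minimal sketch, using invented replicate values for a 15 ng/mL control (not the study's data):

```python
import statistics

def percent_cv(values):
    """Coefficient of variation as a percentage: 100 * sd / mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical replicate measurements of a 15 ng/mL control.
replicates = [14.2, 15.1, 14.8, 15.6, 14.9]
print(round(percent_cv(replicates), 2))   # a few percent for tight replicates
```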
Semiautomatic and Automatic Cooperative Inversion of Seismic and Magnetotelluric Data
NASA Astrophysics Data System (ADS)
Le, Cuong V. A.; Harris, Brett D.; Pethick, Andrew M.; Takam Takougang, Eric M.; Howe, Brendan
2016-09-01
Natural source electromagnetic methods have the potential to recover rock property distributions from the surface to great depths. Unfortunately, results in complex 3D geo-electrical settings can be disappointing, especially where significant near-surface conductivity variations exist. In such settings, unconstrained inversion of magnetotelluric data is inexorably non-unique. We believe that: (1) correctly introduced information from seismic reflection can substantially improve MT inversion, (2) a cooperative inversion approach can be automated, and (3) massively parallel computing can make such a process viable. Nine inversion strategies including baseline unconstrained inversion and new automated/semiautomated cooperative inversion approaches are applied to industry-scale co-located 3D seismic and magnetotelluric data sets. These data sets were acquired in one of the Carlin gold deposit districts in north-central Nevada, USA. In our approach, seismic information feeds directly into the creation of sets of prior conductivity model and covariance coefficient distributions. We demonstrate how statistical analysis of the distribution of selected seismic attributes can be used to automatically extract subvolumes that form the framework for prior model 3D conductivity distribution. Our cooperative inversion strategies result in detailed subsurface conductivity distributions that are consistent with seismic, electrical logs and geochemical analysis of cores. Such 3D conductivity distributions would be expected to provide clues to 3D velocity structures that could feed back into full seismic inversion for an iterative, practical and truly cooperative inversion process. We anticipate that, with the aid of parallel computing, cooperative inversion of seismic and magnetotelluric data can be fully automated, and we are confident that significant and practical advances in this direction have been accomplished.
Nebbad-Lechani, Biba; Emirian, Aurélie; Maillebuau, Fabienne; Mahjoub, Nadia; Fihman, Vincent; Legrand, Patrick; Decousser, Jean-Winoc
2013-12-01
The microbiological diagnosis of respiratory tract infections requires serial manual dilutions of the clinical specimen before agar plate inoculation, disrupting the workflow in bacteriology clinical laboratories. Automated plating instrument systems have been designed to increase the speed, reproducibility and safety of this inoculating step; nevertheless, data concerning respiratory specimens are lacking. We tested a specific procedure that uses the Previ Isola® (bioMérieux, Craponne, France) to inoculate with broncho-pulmonary specimens (BPS). A total of 350 BPS from a university-affiliated hospital were managed in parallel using the manual reference and the automated methods (expectoration: 75; broncho-alveolar lavage: 68; tracheal aspiration: 17; protected distal sample: 190). A specific enumeration reading grid, a pre-liquefaction step and a fluidity test, performed before the inoculation, were designed for the automated method. The qualitative (i.e., the number of specimens yielding a bacterial count greater than the clinical threshold) and quantitative (i.e., the discrepancy within a 0.5 log value) concordances were 100% and 98.2%, respectively. The slimmest subgroup of expectorations could not be managed by the automated method (8%, 6/75). The technical time and cost savings (i.e., number of consumed plates) reached 50%. Additional studies are required for specific populations, such as cystic fibrosis specimens and associated bacterial variants. An automated decapper should be implemented to increase the biosafety of the process. The PREVI Isola® adapted procedure is a time- and cost-saving method for broncho-pulmonary specimen processing. © 2013.
GSRP/David Marshall: Fully Automated Cartesian Grid CFD Application for MDO in High Speed Flows
NASA Technical Reports Server (NTRS)
2003-01-01
With the renewed interest in Cartesian gridding methodologies for the ease and speed of gridding complex geometries in addition to the simplicity of the control volumes used in the computations, it has become important to investigate ways of extending the existing Cartesian grid solver functionalities. This includes developing methods of modeling the viscous effects in order to utilize Cartesian grids solvers for accurate drag predictions and addressing the issues related to the distributed memory parallelization of Cartesian solvers. This research presents advances in two areas of interest in Cartesian grid solvers, viscous effects modeling and MPI parallelization. The development of viscous effects modeling using solely Cartesian grids has been hampered by the widely varying control volume sizes associated with the mesh refinement and the cut cells associated with the solid surface. This problem is being addressed by using physically based modeling techniques to update the state vectors of the cut cells and removing them from the finite volume integration scheme. This work is performed on a new Cartesian grid solver, NASCART-GT, with modifications to its cut cell functionality. The development of MPI parallelization addresses issues associated with utilizing Cartesian solvers on distributed memory parallel environments. This work is performed on an existing Cartesian grid solver, CART3D, with modifications to its parallelization methodology.
NASA Astrophysics Data System (ADS)
Lorenzoni, Filippo; Casarin, Filippo; Caldon, Mauro; Islami, Kleidi; Modena, Claudio
2016-01-01
In the last decades the need for an effective seismic protection and vulnerability reduction of cultural heritage buildings and sites determined a growing interest in structural health monitoring (SHM) as a knowledge-based assessment tool to quantify and reduce uncertainties regarding their structural performance. Monitoring can be successfully implemented in some cases as an alternative to interventions or to control the medium- and long-term effectiveness of already applied strengthening solutions. The research group at the University of Padua, in collaboration with public administrations, has recently installed several SHM systems on heritage structures. The paper reports the application of monitoring strategies implemented to avoid (or at least minimize) the execution of strengthening interventions/repairs and control the response as long as a clear worsening or damaging process is detected. Two emblematic case studies are presented and discussed: the Roman Amphitheatre (Arena) of Verona and the Conegliano Cathedral. Both are excellent examples of on-going monitoring activities, performed through static and dynamic approaches in combination with automated procedures to extract meaningful structural features from collected data. In parallel to the application of innovative monitoring techniques, statistical models and data processing algorithms have been developed and applied in order to reduce uncertainties and exploit monitoring results for an effective assessment and protection of historical constructions. Processing software for SHM was implemented to perform the continuous real time treatment of static data and the identification of modal parameters based on the structural response to ambient vibrations. Statistical models were also developed to filter out the environmental effects and thermal cycles from the extracted features.
Fourth NASA Langley Formal Methods Workshop
NASA Technical Reports Server (NTRS)
Holloway, C. Michael (Compiler); Hayhurst, Kelly J. (Compiler)
1997-01-01
This publication consists of papers presented at NASA Langley Research Center's fourth workshop on the application of formal methods to the design and verification of life-critical systems. Topics considered include: proving properties of accidents; modeling and validating SAFER in VDM-SL; requirements analysis of real-time control systems using PVS; a tabular language for system design; and automated deductive verification of parallel systems. Also included is a fundamental hardware design in PVS.
Parallels in Computer-Aided Design Framework and Software Development Environment Efforts.
1992-05-01
design kits, and tool and design management frameworks. Also, books about software engineering environments [Long 91] and electronic design...tool integration [Zarrella 90], and agreement upon a universal design automation framework, such as the CAD Framework Initiative (CFI) [Malasky 91...ments: identification, control, status accounting, and audit and review. The paper by Dart extracts 15 CM concepts from existing SDEs and tools
Automatic selection of dynamic data partitioning schemes for distributed memory multicomputers
NASA Technical Reports Server (NTRS)
Palermo, Daniel J.; Banerjee, Prithviraj
1995-01-01
For distributed memory multicomputers such as the Intel Paragon, the IBM SP-2, the NCUBE/2, and the Thinking Machines CM-5, the quality of the data partitioning for a given application is crucial to obtaining high performance. This task has traditionally been the user's responsibility, but in recent years much effort has been directed to automating the selection of data partitioning schemes. Several researchers have proposed systems that are able to produce data distributions that remain in effect for the entire execution of an application. For complex programs, however, such static data distributions may be insufficient to obtain acceptable performance. The selection of distributions that dynamically change over the course of a program's execution adds another dimension to the data partitioning problem. In this paper, we present a technique that can be used to automatically determine which partitionings are most beneficial over specific sections of a program while taking into account the added overhead of performing redistribution. This system is being built as part of the PARADIGM (PARAllelizing compiler for DIstributed memory General-purpose Multicomputers) project at the University of Illinois. The complete system will provide a fully automated means to parallelize programs written in a serial programming model obtaining high performance on a wide range of distributed-memory multicomputers.
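The core trade-off in dynamic partitioning selection described above can be sketched as a simple cost model: switch to a phase's best data distribution only when the saving exceeds the redistribution overhead. The phase costs and overhead below are hypothetical, not PARADIGM's actual cost estimates.

```python
# Illustrative sketch: per-phase choice between keeping the current data
# distribution and redistributing to that phase's best distribution.

def schedule(phases, redist_cost):
    """phases: list of (cost_with_current_dist, cost_with_best_dist).
    Returns total estimated cost and the chosen action per phase."""
    total, plan = 0.0, []
    for keep_cost, switch_cost in phases:
        if switch_cost + redist_cost < keep_cost:
            total += switch_cost + redist_cost
            plan.append("redistribute")
        else:
            total += keep_cost          # overhead not worth paying here
            plan.append("keep")
    return total, plan

# Three program phases; redistribution costs 10 time units (assumed).
phases = [(40.0, 25.0), (12.0, 8.0), (50.0, 20.0)]
total, plan = schedule(phases, redist_cost=10.0)
print(total, plan)   # 77.0 ['redistribute', 'keep', 'redistribute']
```

A real compiler would additionally search over sequences of distributions, since the overhead of each switch depends on the pair of distributions involved.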
Photochemical numerics for global-scale modeling: Fidelity and GCM testing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elliott, S.; Jim Kao, Chih-Yue; Zhao, X.
1995-03-01
Atmospheric photochemistry lies at the heart of global-scale pollution problems, but it is a nonlinear system embedded in nonlinear transport and so must be modeled in three dimensions. Total earth grids are massive and kinetics require dozens of interacting tracers, taxing supercomputers to their limits in global calculations. A matrix-free and noniterative family scheme is described that permits chemical step sizes an order of magnitude or more larger than time constants for molecular groupings, in the 1-h range used for transport. Families are partitioned through linearized implicit integrations that produce stabilizing species concentrations for a mass-conserving forward solver. The kinetics are also parallelized by moving geographic loops innermost and changes in the continuity equations are automated through list reading. The combination of speed, parallelization and automation renders the programs naturally modular. Accuracy lies within 1% for all species in week-long fidelity tests. A 50-species, 150-reaction stratospheric module tested in a spectral GCM benchmarks at 10 min CPU time per day and agrees with lower-dimensionality simulations. Tropospheric nonmethane hydrocarbon chemistry will soon be added, and inherently three-dimensional phenomena will be investigated both decoupled from dynamics and in a complete chemical GCM. 225 refs., 11 figs., 2 tabs.
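The stability payoff of a linearized implicit update for stiff kinetics can be shown with a one-species production/loss sketch, dy/dt = P − L·y. A backward-Euler step stays stable even when the step size far exceeds the species' time constant 1/L; the rate values below are illustrative, not from the paper.

```python
# Backward-Euler step for dy/dt = P - L*y:
#   y_new = y + dt*(P - L*y_new)  =>  y_new = (y + dt*P) / (1 + dt*L)
def implicit_step(y, P, L, dt):
    return (y + dt * P) / (1.0 + dt * L)

P, L = 2.0, 100.0        # fast loss: time constant 1/L = 0.01 (stiff)
y, dt = 0.0, 1.0         # step size 100x the time constant
for _ in range(5):
    y = implicit_step(y, P, L, dt)
print(round(y, 4))       # relaxes stably toward steady state P/L = 0.02
```

An explicit forward-Euler step with the same dt would diverge (amplification factor 1 − dt·L = −99), which is why implicit linearization is what buys the large chemical step sizes.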
Mühlebach, Anneke; Adam, Joachim; Schön, Uwe
2011-11-01
Automated medicinal chemistry (parallel chemistry) has become an integral part of the drug-discovery process in almost every large pharmaceutical company. Parallel array synthesis of individual organic compounds has been used extensively to generate diverse structural libraries to support different phases of the drug-discovery process, such as hit-to-lead, lead finding, or lead optimization. In order to guarantee effective project support, efficiency in the production of compound libraries has been maximized. As a consequence, throughput in chromatographic purification and analysis has also been adapted. As a recent trend, more laboratories are preparing smaller, yet more focused libraries with ever increasing demands on quality, i.e., optimal purity and unambiguous confirmation of identity. This paper presents an automated approach to combining effective purification and structural confirmation of a lead optimization library created by microwave-assisted organic synthesis. The results of complementary analytical techniques such as UHPLC-HRMS and NMR are not only considered but merged for fast and easy decision making, providing optimal quality of compound stock. In comparison with previous procedures, throughput times are at least four times faster, while compound consumption could be decreased more than threefold. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Drawing together psyche, soma and spirit: my career in cultural psychiatry.
Dein, Simon
2011-04-01
In this article I discuss my career in cultural psychiatry. I begin by examining the influence of my personal background on my interests in cultural psychiatry and religion and health. I then discuss my research, which has focused upon two areas: the cognitive and phenomenological parallels between religious experiences and psychopathological states, and relationships between biomedicine and religious healing in diverse cultural contexts. Finally, I discuss plans for future research and teaching.
Anthropology and cultural neuroscience: creating productive intersections in parallel fields.
Brown, R A; Seligman, R
2009-01-01
Partly due to the failure of anthropology to productively engage the fields of psychology and neuroscience, investigations in cultural neuroscience have occurred largely without the active involvement of anthropologists or anthropological theory. Dramatic advances in the tools and findings of social neuroscience have emerged in parallel with significant advances in anthropology that connect social and political-economic processes with fine-grained descriptions of individual experience and behavior. We describe four domains of inquiry that follow from these recent developments, and provide suggestions for intersections between anthropological tools - such as social theory, ethnography, and quantitative modeling of cultural models - and cultural neuroscience. These domains are: the sociocultural construction of emotion, status and dominance, the embodiment of social information, and the dual social and biological nature of ritual. Anthropology can help locate unique or interesting populations and phenomena for cultural neuroscience research. Anthropological tools can also help "drill down" to investigate key socialization processes accountable for cross-group differences. Furthermore, anthropological research points at meaningful underlying complexity in assumed relationships between social forces and biological outcomes. Finally, ethnographic knowledge of cultural content can aid with the development of ecologically relevant stimuli for use in experimental protocols.
Microfluidic large-scale integration: the evolution of design rules for biological automation.
Melin, Jessica; Quake, Stephen R
2007-01-01
Microfluidic large-scale integration (mLSI) refers to the development of microfluidic chips with thousands of integrated micromechanical valves and control components. This technology is utilized in many areas of biology and chemistry and is a candidate to replace today's conventional automation paradigm, which consists of fluid-handling robots. We review the basic development of mLSI and then discuss design principles of mLSI to assess the capabilities and limitations of the current state of the art and to facilitate the application of mLSI to areas of biology. Many design and practical issues, including economies of scale, parallelization strategies, multiplexing, and multistep biochemical processing, are discussed. Several microfluidic components used as building blocks to create effective, complex, and highly integrated microfluidic networks are also highlighted.
Streamlining workflow and automation to accelerate laboratory scale protein production.
Konczal, Jennifer; Gray, Christopher H
2017-05-01
Protein production facilities are often required to produce diverse arrays of proteins for demanding methodologies including crystallography, NMR, ITC and other reagent intensive techniques. It is common for these teams to find themselves a bottleneck in the pipeline of ambitious projects. This pressure to deliver has resulted in the evolution of many novel methods to increase capacity and throughput at all stages in the pipeline for generation of recombinant proteins. This review aims to describe current and emerging options to accelerate the success of protein production in Escherichia coli. We emphasize technologies that have been evaluated and implemented in our laboratory, including innovative molecular biology and expression vectors, small-scale expression screening strategies and the automation of parallel and multidimensional chromatography. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
Vision 20/20: Automation and advanced computing in clinical radiation oncology.
Moore, Kevin L; Kagadis, George C; McNutt, Todd R; Moiseenko, Vitali; Mutic, Sasa
2014-01-01
This Vision 20/20 paper considers what computational advances are likely to be implemented in clinical radiation oncology in the coming years and how the adoption of these changes might alter the practice of radiotherapy. Four main areas of likely advancement are explored: cloud computing, aggregate data analyses, parallel computation, and automation. As these developments promise both new opportunities and new risks to clinicians and patients alike, the potential benefits are weighed against the hazards associated with each advance, with special considerations regarding patient safety under new computational platforms and methodologies. While the concerns of patient safety are legitimate, the authors contend that progress toward next-generation clinical informatics systems will bring about extremely valuable developments in quality improvement initiatives, clinical efficiency, outcomes analyses, data sharing, and adaptive radiotherapy.
Automation of Hubble Space Telescope Mission Operations
NASA Technical Reports Server (NTRS)
Burley, Richard; Goulet, Gregory; Slater, Mark; Huey, William; Bassford, Lynn; Dunham, Larry
2012-01-01
On June 13, 2011, after more than 21 years, 115 thousand orbits, and nearly 1 million exposures taken, the operation of the Hubble Space Telescope successfully transitioned from 24x7x365 staffing to 8x5 staffing. This required the automation of routine mission operations including telemetry and forward link acquisition, data dumping and solid-state recorder management, stored command loading, and health and safety monitoring of both the observatory and the HST Ground System. These changes were driven by budget reductions, and required ground system and onboard spacecraft enhancements across the entire operations spectrum, from planning and scheduling systems to payload flight software. Changes in personnel and staffing were required in order to adapt to the new roles and responsibilities required in the new automated operations era. This paper will provide a high level overview of the obstacles to automating nominal HST mission operations, both technical and cultural, and how those obstacles were overcome.
Automatic scoring of dicentric chromosomes as a tool in large scale radiation accidents.
Romm, H; Ainsbury, E; Barnard, S; Barrios, L; Barquinero, J F; Beinke, C; Deperas, M; Gregoire, E; Koivistoinen, A; Lindholm, C; Moquet, J; Oestreicher, U; Puig, R; Rothkamm, K; Sommer, S; Thierens, H; Vandersickel, V; Vral, A; Wojcik, A
2013-08-30
Mass casualty scenarios of radiation exposure require high throughput biological dosimetry techniques for population triage in order to rapidly identify individuals who require clinical treatment. The manual dicentric assay is a highly suitable technique, but it is also very time consuming and requires well trained scorers. In the framework of the MULTIBIODOSE EU FP7 project, semi-automated dicentric scoring has been established in six European biodosimetry laboratories. Whole blood was irradiated with a Co-60 gamma source resulting in 8 different doses between 0 and 4.5Gy and then shipped to the six participating laboratories. To investigate two different scoring strategies, cell cultures were set up with short term (2-3h) or long term (24h) colcemid treatment. Three classifiers for automatic dicentric detection were applied, two of which were developed specifically for these two different culture techniques. The automation procedure included metaphase finding, capture of cells at high resolution and detection of dicentric candidates. The automatically detected dicentric candidates were then evaluated by a trained human scorer, which led to the term 'semi-automated' being applied to the analysis. The six participating laboratories established at least one semi-automated calibration curve each, using the appropriate classifier for their colcemid treatment time. There was no significant difference between the calibration curves established, regardless of the classifier used. The ratio of false positive to true positive dicentric candidates was dose dependent. The total staff effort required for analysing 150 metaphases using the semi-automated approach was 2 min as opposed to 60 min for manual scoring of 50 metaphases. Semi-automated dicentric scoring is a useful tool in a large scale radiation accident as it enables high throughput screening of samples for fast triage of potentially exposed individuals. 
Furthermore, the results from the participating laboratories were comparable which supports networking between laboratories for this assay. Copyright © 2013 Elsevier B.V. All rights reserved.
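Dicentric dose-response calibration curves of the kind established by the participating laboratories are conventionally fitted as linear-quadratic, Y = c + αD + βD². A minimal fitting sketch; the dose points mirror the irradiation range quoted above, but the yields and coefficients are invented for illustration, not MULTIBIODOSE data.

```python
import numpy as np

# Assumed linear-quadratic coefficients (illustration only).
c, alpha, beta = 0.001, 0.02, 0.06
doses = np.array([0.0, 0.5, 1.0, 2.0, 3.0, 4.5])    # Gy
yields = c + alpha * doses + beta * doses**2         # dicentrics per cell

# Least-squares fit of Y = c + alpha*D + beta*D^2.
A = np.column_stack([np.ones_like(doses), doses, doses**2])
coef, *_ = np.linalg.lstsq(A, yields, rcond=None)
print(np.round(coef, 3))   # recovers the assumed [0.001, 0.02, 0.06]
```

Real calibration fitting typically uses Poisson-weighted maximum likelihood rather than plain least squares, since dicentric counts are low and approximately Poisson distributed.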
Brumback, B G; Farthing, P G; Castellino, S N
1993-12-01
Specimens from skin lesions were examined simultaneously for herpes simplex virus (HSV) and varicella-zoster virus (VZV) by direct specimen testing and shell vial culture in single-test systems. For direct testing, cells in a single specimen well were stained with a combination direct-indirect immunofluorescence stain by using two fluorescent tags. A total of 203 fresh specimens were tested in parallel. Of these, 100 specimens contained too few cells for the direct VZV comparison and 91 contained too few cells for the HSV comparison. After these specimens were eliminated, the sensitivities and specificities, respectively, of the dual direct test were 86.1 and 97.3% for HSV compared with single culture and 92.2 and 100% for VZV compared with single direct testing. Shell vial monolayers in the combined cultures were stained for both viruses by the same method. A total of 305 fresh specimens were cultured in parallel by dual- and single-culture methods. The sensitivities and specificities, respectively, of the combined culture compared with separate cultures were 100 and 98.4% for HSV and 87.9 and 99.2% for VZV. The combined methods gave a performance comparable to those of single tests, required less specimen volume, and were less costly to perform.
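The sensitivities and specificities reported above follow directly from the paired-test confusion counts. A minimal sketch of the computation; the counts below are hypothetical, not the study's data.

```python
# Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP), as percentages.
def sens_spec(tp, fn, tn, fp):
    sensitivity = 100.0 * tp / (tp + fn)
    specificity = 100.0 * tn / (tn + fp)
    return sensitivity, specificity

# Hypothetical comparison of a dual test against single-test reference:
# 90 true positives, 10 false negatives, 95 true negatives, 5 false positives.
sens, spec = sens_spec(tp=90, fn=10, tn=95, fp=5)
print(round(sens, 1), round(spec, 1))   # 90.0 95.0
```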
Flexible End2End Workflow Automation of Hit-Discovery Research.
Holzmüller-Laue, Silke; Göde, Bernd; Thurow, Kerstin
2014-08-01
The article considers a new approach to more complex laboratory automation at the workflow layer. The authors propose the automation of end2end workflows. The combination of all relevant subprocesses, whether automated or manually performed, independently of the organizational unit in which they occur, results in end2end processes that include all result dependencies. The end2end approach focuses not only on the classical experiments in synthesis or screening, but also on auxiliary processes such as the production and storage of chemicals, cell culturing, and maintenance, as well as preparatory activities and analyses of experiments. Furthermore, the connection of control flow and data flow in the same process model reduces the effort of data transfer between the involved systems, including the necessary data transformations. This end2end laboratory automation can be realized effectively with the modern methods of business process management (BPM). This approach is based on a new standardization of the process-modeling notation, Business Process Model and Notation 2.0. In drug discovery, several scientific disciplines act together with manifold modern methods, technologies, and a wide range of automated instruments for the discovery and design of target-based drugs. The article discusses the novel BPM-based automation concept with an implemented example of a high-throughput screening of previously synthesized compound libraries. © 2014 Society for Laboratory Automation and Screening.
Plant cell technologies in space: Background, strategies and prospects
NASA Technical Reports Server (NTRS)
Kirkorian, A. D.; Scheld, H. W.
1987-01-01
An attempt is made to summarize work in plant cell technologies in space. The evolution of concepts and the general principles of plant tissue culture are discussed. The potential for production of high value secondary products by plant cells and differentiated tissue in automated, precisely controlled bioreactors is discussed. The general course of the development of the literature on plant tissue culture is highlighted.
Automated problem scheduling and reduction of synchronization delay effects
NASA Technical Reports Server (NTRS)
Saltz, Joel H.
1987-01-01
It is anticipated that in order to make effective use of many future high performance architectures, programs will have to exhibit at least a medium grained parallelism. A framework is presented for partitioning very sparse triangular systems of linear equations that is designed to produce favorable performance results in a wide variety of parallel architectures. Efficient methods for solving these systems are of interest because: (1) they provide a useful model problem for use in exploring heuristics for the aggregation, mapping and scheduling of relatively fine grained computations whose data dependencies are specified by directed acyclic graphs, and (2) such efficient methods can find direct application in the development of parallel algorithms for scientific computation. Simple expressions are derived that describe how to schedule computational work with varying degrees of granularity. The Encore Multimax was used as a hardware simulator to investigate the performance effects of using the partitioning techniques presented in shared memory architectures with varying relative synchronization costs.
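A standard way to expose the parallelism in a sparse triangular solve, in the spirit of the DAG scheduling discussed above, is level scheduling: rows are grouped into wavefronts whose members depend only on earlier levels and can be processed concurrently. A minimal sketch on an invented dependency pattern (not the paper's partitioning scheme, which also aggregates work for granularity):

```python
# Level scheduling for a sparse lower-triangular solve: row i of L depends
# on rows j < i where L[i][j] != 0 (diagonal excluded). All rows sharing a
# level can be solved in parallel.
def levels(dependencies):
    """dependencies[i]: set of rows j < i that row i's solve needs."""
    level = {}
    for i in sorted(dependencies):
        deps = dependencies[i]
        level[i] = 0 if not deps else 1 + max(level[j] for j in deps)
    return level

# Illustrative sparsity pattern for a 5x5 lower-triangular matrix.
deps = {0: set(), 1: {0}, 2: set(), 3: {1, 2}, 4: {0}}
print(levels(deps))   # {0: 0, 1: 1, 2: 0, 3: 2, 4: 1}
```

Rows 0 and 2 form the first wavefront, rows 1 and 4 the second, and row 3 the third; coarser granularity is obtained by aggregating consecutive levels, trading parallelism against synchronization cost.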
Validation of an automated colony counting system for group A Streptococcus.
Frost, H R; Tsoi, S K; Baker, C A; Laho, D; Sanderson-Smith, M L; Steer, A C; Smeesters, P R
2016-02-08
The practice of counting bacterial colony forming units on agar plates has long been used as a method to estimate the concentration of live bacteria in culture. However, due to the laborious and potentially error prone nature of this measurement technique, an alternative method is desirable. Recent technologic advancements have facilitated the development of automated colony counting systems, which reduce errors introduced during the manual counting process and recording of information. An additional benefit is the significant reduction in time taken to analyse colony counting data. Whilst automated counting procedures have been validated for a number of microorganisms, the process has not been successful for all bacteria due to the requirement for a relatively high contrast between bacterial colonies and growth medium. The purpose of this study was to validate an automated counting system for use with group A Streptococcus (GAS). Twenty-one different GAS strains, representative of major emm-types, were selected for assessment. In order to introduce the required contrast for automated counting, 2,3,5-triphenyl-2H-tetrazolium chloride (TTC) dye was added to Todd-Hewitt broth with yeast extract (THY) agar. Growth on THY agar with TTC was compared with growth on blood agar and THY agar to ensure the dye was not detrimental to bacterial growth. Automated colony counts using a ProtoCOL 3 instrument were compared with manual counting to confirm accuracy over the stages of the growth cycle (latent, mid-log and stationary phases) and in a number of different assays. The average percentage differences between plating and counting methods were analysed using the Bland-Altman method. A percentage difference of ±10 % was determined as the cut-off for a critical difference between plating and counting methods. All strains measured had an average difference of less than 10 % when plated on THY agar with TTC. 
This consistency was also observed over all phases of the growth cycle and when plated on blood agar following bactericidal assays. Agreement between these methods suggests that use of an automated colony counting technique for GAS will significantly reduce time spent counting bacteria, enabling a more efficient and accurate measurement of bacterial concentration in culture.
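The ±10 % agreement criterion used above can be sketched numerically. The function names and example counts below are hypothetical, chosen only to illustrate the Bland-Altman-style calculation:

```python
import numpy as np

def mean_percentage_difference(manual, automated):
    """Per-plate percentage difference taken relative to the mean of
    the two methods, then averaged across plates (Bland-Altman style)."""
    manual = np.asarray(manual, dtype=float)
    automated = np.asarray(automated, dtype=float)
    pair_means = (manual + automated) / 2
    pct_diff = 100 * (automated - manual) / pair_means
    return float(pct_diff.mean())

def methods_agree(manual, automated, cutoff=10.0):
    """Apply the +/-10 % cut-off for a critical difference."""
    return abs(mean_percentage_difference(manual, automated)) <= cutoff
```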
Sonsmann, F K; Strunk, M; Gediga, K; John, C; Schliemann, S; Seyfarth, F; Elsner, P; Diepgen, T L; Kutz, G; John, S M
2014-05-01
To date, there are no legally binding requirements concerning product testing in cosmetics. This leads to various manufacturer-specific test methods and an absence of transparent information on skin cleansing products. A standardized in vivo test procedure for assessing cleansing efficacy and the corresponding barrier impairment caused by the cleaning process is needed, especially in the occupational context, where repeated hand washing may be performed at short intervals. To standardize the cleansing procedure, an Automated Cleansing Device (ACiD) was designed and evaluated. Different smooth washing surfaces for ACiD (including goat hair, felt, and felt covered with nitrile caps) were evaluated regarding their skin compatibility. ACiD allows an automated, fully standardized skin washing procedure. Felt covered with nitrile as the washing surface of the rotating washing units leads to a homogeneous cleansing result and does not cause detectable skin irritation, neither clinically nor as assessed by skin bioengineering methods (transepidermal water loss, chromametry). The Automated Cleansing Device may be useful for standardized evaluation of cleansing effectiveness and parallel assessment of the corresponding irritancy potential of industrial skin cleansers. This will allow objectifying the efficacy and safety of industrial skin cleansers, thus enabling market transparency and facilitating a rational choice of products. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Belval, Richard; Alamir, Ab; Corte, Christopher; DiValentino, Justin; Fernandes, James; Frerking, Stuart; Jenkins, Derek; Rogers, George; Sanville-Ross, Mary; Sledziona, Cindy; Taylor, Paul
2012-12-01
Boehringer Ingelheim's Automated Liquids Processing System (ALPS) in Ridgefield, Connecticut, was built to accommodate all compound solution-based operations following dissolution in neat DMSO. Process analysis resulted in the design of two nearly identical conveyor-based subsystems, each capable of executing 1400 × 384-well plate or punch tube replicates per batch. Two parallel-positioned subsystems are capable of independent execution or alternatively executed as a unified system for more complex or higher throughput processes. Primary ALPS functions include creation of high-throughput screening plates, concentration-response plates, and reformatted master stock plates (e.g., 384-well plates from 96-well plates). Integrated operations included centrifugation, unsealing/piercing, broadcast diluent addition, barcode print/application, compound transfer/mix via disposable pipette tips, and plate sealing. ALPS key features included instrument pooling for increased capacity or fail-over situations, programming constructs to associate one source plate to an array of replicate plates, and stacked collation of completed plates. Due to the hygroscopic nature of DMSO, ALPS was designed to operate within a 10% relativity humidity environment. The activities described are the collaborative efforts that contributed to the specification, build, delivery, and acceptance testing between Boehringer Ingelheim Pharmaceuticals, Inc. and the automation integration vendor, Thermo Scientific Laboratory Automation (Burlington, ON, Canada).
Parallel computing for automated model calibration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burke, John S.; Danielson, Gary R.; Schulz, Douglas A.
2002-07-29
Natural resources model calibration is a significant burden on computing and staff resources in modeling efforts. Most assessments must consider multiple calibration objectives (for example, magnitude and timing of stream flow peak). An automated calibration process that allows real-time updating of data and models, freeing scientists to focus effort on improving models, is needed. We are in the process of building a fully featured multi-objective calibration tool capable of processing multiple models cheaply and efficiently using null cycle computing. Our parallel processing and calibration software routines have been written generically, but our focus has been on natural resources model calibration. So far, the natural resources models have been friendly to parallel calibration efforts in that they require no inter-process communication, need only a small amount of input data, and output only a small amount of statistical information for each calibration run. A typical auto calibration run might involve running a model 10,000 times with a variety of input parameters and summary statistical output. In the past, model calibration has been done against individual models for each data set. The individual model runs are relatively fast, ranging from seconds to minutes, and the process was run on a single computer using a simple iterative process. We have completed two auto calibration prototypes and are currently designing a more feature-rich tool. Our prototypes have focused on running the calibration in a distributed, cross-platform computing environment. They allow incorporation of "smart" calibration parameter generation (using artificial intelligence processing techniques). Null cycle computing similar to SETI@home has also been a focus of our efforts. This paper details the design of the latest prototype and discusses our plans for the next revision of the software.
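The workload described, many independent model runs with no inter-process communication, is embarrassingly parallel and can be dispatched as sketched below. The model, parameter grid, and sum-of-squares score are hypothetical stand-ins; a real deployment would distribute runs across machines rather than threads:

```python
from concurrent.futures import ThreadPoolExecutor

import numpy as np

def run_model(params):
    """Stand-in for one calibration run: score a hypothetical linear
    model against fixed observations (lower is better)."""
    a, b = params
    x = np.array([1.0, 2.0, 3.0])
    observed = np.array([2.0, 4.0, 6.0])
    predicted = a * x + b
    return float(np.sum((observed - predicted) ** 2))

def calibrate(param_sets, workers=4):
    """Evaluate independent runs in parallel and keep the best score;
    each run needs only its parameters and returns one statistic,
    matching the communication-free workload described above."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        scores = list(pool.map(run_model, param_sets))
    best = int(np.argmin(scores))
    return param_sets[best], scores[best]
```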
Automated Chemotactic Sorting and Single-cell Cultivation of Microbes using Droplet Microfluidics
NASA Astrophysics Data System (ADS)
Dong, Libing; Chen, Dong-Wei; Liu, Shuang-Jiang; Du, Wenbin
2016-04-01
We report a microfluidic device for automated sorting and cultivation of chemotactic microbes from pure cultures or mixtures. The device consists of two parts: in the first part, a concentration gradient of the chemoeffector was built across the channel for inducing chemotaxis of motile cells; in the second part, chemotactic cells from the sample were separated, and mixed with culture media to form nanoliter droplets for encapsulation, cultivation, enumeration, and recovery of single cells. Chemotactic responses were assessed by imaging and statistical analysis of droplets based on the Poisson distribution. An automated procedure was developed for rapid enumeration of droplets with cell growth, followed by scale-up cultivation on agar plates. The performance of the device was evaluated by the chemotaxis assays of Escherichia coli (E. coli) RP437 and E. coli RP1616. Moreover, enrichment and isolation of non-labelled Comamonas testosteroni CNB-1 from its 1:10 mixture with E. coli RP437 was demonstrated. The enrichment factor reached 36.7 for CNB-1, based on its distinctive chemotaxis toward 4-hydroxybenzoic acid. We believe that this device can be widely used in chemotaxis studies without necessarily relying on fluorescent labelling, and isolation of functional microbial species from various environments.
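The Poisson reasoning behind such droplet statistics can be sketched as follows; the helper names are illustrative and not part of the authors' software:

```python
import math

def poisson_pmf(k, lam):
    """Probability that a droplet encapsulates exactly k cells when
    the mean occupancy per droplet is lam."""
    return lam ** k * math.exp(-lam) / math.factorial(k)

def single_cell_fraction(lam):
    """Among occupied droplets, the fraction holding exactly one cell;
    dilute loading (small lam) pushes this toward 1, which is what
    makes single-cell cultivation in droplets feasible."""
    return poisson_pmf(1, lam) / (1 - poisson_pmf(0, lam))

def mean_occupancy_from_empty(p_empty):
    """Back out the mean cells per droplet from the observed fraction
    of empty droplets, since P(0) = exp(-lam)."""
    return -math.log(p_empty)
```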
Li, Yiyan; Yang, Xing; Zhao, Weian
2018-01-01
Rapid bacterial identification (ID) and antibiotic susceptibility testing (AST) are in great demand due to the rise of drug-resistant bacteria. Conventional culture-based AST methods suffer from a long turnaround time. By necessity, physicians often have to treat patients empirically with antibiotics, which has led to an inappropriate use of antibiotics, an elevated mortality rate and healthcare costs, and antibiotic resistance. Recent advances in miniaturization and automation provide promising solutions for rapid bacterial ID/AST profiling, which will potentially make a significant impact in the clinical management of infectious diseases and antibiotic stewardship in the coming years. In this review, we summarize and analyze representative emerging micro- and nanotechnologies, as well as automated systems for bacterial ID/AST, including both phenotypic (e.g., microfluidic-based bacterial culture, and digital imaging of single cells) and molecular (e.g., multiplex PCR, hybridization probes, nanoparticles, synthetic biology tools, mass spectrometry, and sequencing technologies) methods. We also discuss representative point-of-care (POC) systems that integrate sample processing, fluid handling, and detection for rapid bacterial ID/AST. Finally, we highlight major remaining challenges and discuss potential future endeavors toward improving clinical outcomes with rapid bacterial ID/AST technologies. PMID:28850804
Artimovich, Elena; Jackson, Russell K; Kilander, Michaela B C; Lin, Yu-Chih; Nestor, Michael W
2017-10-16
Intracellular calcium is an important ion involved in the regulation and modulation of many neuronal functions. From regulating cell cycle and proliferation to initiating signaling cascades and regulating presynaptic neurotransmitter release, the concentration and timing of calcium activity govern the function and fate of neurons. Changes in calcium transients can be used in high-throughput screening applications as a basic measure of neuronal maturity, especially in developing or immature neuronal cultures derived from stem cells. Here we describe PeakCaller, a novel MATLAB script and graphical user interface for the quantification of intracellular calcium transients in neuronal cultures. PeakCaller allows the user to set peak parameters and smoothing algorithms to best fit their data set. Using human induced pluripotent stem cell derived neurons and dissociated mouse cortical neurons combined with the calcium indicator Fluo-4, we demonstrate that PeakCaller reduces type I and type II error in automated peak calling when compared to the oft-used PeakFinder algorithm under both basal and pharmacologically induced conditions. This new analysis script will allow for automation of calcium measurements and is a powerful software tool for researchers interested in high-throughput measurements of intracellular calcium.
Application of a non-hazardous vital dye for cell counting with automated cell counters.
Kim, Soo In; Kim, Hyun Jeong; Lee, Ho-Jae; Lee, Kiwon; Hong, Dongpyo; Lim, Hyunchang; Cho, Keunchang; Jung, Neoncheol; Yi, Yong Weon
2016-01-01
Recent advances in automated cell counters enable us to count cells more easily with consistency. However, the wide use of the traditional vital dye trypan blue (TB) raises environmental and health concerns due to its potential teratogenic effects. To avoid this chemical hazard, it is of importance to introduce an alternative non-hazardous vital dye that is compatible with automated cell counters. Erythrosin B (EB) is a vital dye that is impermeable to biological membranes and is used as a food additive. Similarly to TB, EB stains only nonviable cells with disintegrated membranes. However, EB is less popular than TB and is seldom used with automated cell counters. We found that cell counting accuracy with EB was comparable to that with TB. EB was found to be an effective dye for accurate counting of cells with different viabilities across three different automated cell counters. In contrast to TB, EB was less toxic to cultured HL-60 cells during the cell counting process. These results indicate that replacing TB with EB for use with automated cell counters will significantly reduce the hazardous risk while producing comparable results. Copyright © 2015 Logos Biosystems, Inc. Published by Elsevier Inc. All rights reserved.
Fostering Intercultural Understanding through Secondary School Experiences of Cultural Immersion
ERIC Educational Resources Information Center
Walton, Jessica; Paradies, Yin; Priest, Naomi; Wertheim, Eleanor H.; Freeman, Elizabeth
2015-01-01
In parallel with many nations' education policies, national education policies in Australia seek to foster students' intercultural understanding. Due to Australia's location in the Asia-Pacific region, the Australian government has focused on students becoming "Asia literate" to support Australia's economic and cultural engagement with…
Huysal, Kağan; Budak, Yasemin U; Karaca, Ayse Ulusoy; Aydos, Murat; Kahvecioğlu, Serdar; Bulut, Mehtap; Polat, Murat
2013-01-01
Urinary tract infection (UTI) is one of the most common types of infection. Currently, diagnosis is primarily based on microbiologic culture, which is time-consuming and labor-intensive. The aim of this study was to assess the diagnostic accuracy of urinalysis results from UriSed (77 Electronica, Budapest, Hungary), an automated microscopic image-based sediment analyzer, in predicting positive urine cultures. We examined a total of 384 urine specimens from hospitalized patients and outpatients attending our hospital on the same day for urinalysis, dipstick tests and semi-quantitative urine culture. The urinalysis results were compared with those of conventional semi-quantitative urine culture. Of 384 urinary specimens, 68 were positive for bacteriuria by culture, and were thus considered true positives. Comparison of these results with those obtained from the UriSed analyzer indicated that the analyzer had a specificity of 91.1%, a sensitivity of 47.0%, a positive predictive value (PPV) of 53.3% (95% confidence interval (CI) = 40.8-65.3), and a negative predictive value (NPV) of 88.8% (95% CI = 85.0-91.8%). The accuracy was 83.3% when the urine leukocyte parameter was used, 76.8% when bacteriuria analysis of urinary sediment was used, and 85.1% when the bacteriuria and leukocyturia parameters were combined. The presence of nitrite was the best indicator of culture positivity (99.3% specificity) but had a negative likelihood ratio of 0.7, indicating that it was not a reliable clinical test. Although the specificity of the UriSed analyzer was within acceptable limits, the sensitivity value was low. Thus, UriSed urinalysis results do not accurately predict the outcome of culture.
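For reference, the reported figures follow from a standard confusion-matrix calculation. The counts below are back-calculated approximations from the stated percentages (68 culture-positive of 384 specimens) and are illustrative only:

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard screening-test metrics from a 2x2 confusion matrix."""
    return {
        "sensitivity": tp / (tp + fn),  # positives correctly flagged
        "specificity": tn / (tn + fp),  # negatives correctly cleared
        "ppv": tp / (tp + fp),          # flagged specimens truly positive
        "npv": tn / (tn + fn),          # cleared specimens truly negative
    }

# Approximate counts consistent with the abstract's percentages:
# about 32 of 68 culture positives flagged, 288 of 316 negatives cleared.
metrics = diagnostic_metrics(tp=32, fp=28, tn=288, fn=36)
```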
La Scola, Bernard; Raoult, Didier
2009-11-25
With long delays observed between sampling and availability of results, the usefulness of blood cultures in the context of emergency infectious diseases has recently been questioned. Among methods that allow quicker bacterial identification from growing colonies, matrix-assisted laser desorption ionisation time-of-flight (MALDI-TOF) mass spectrometry was demonstrated to accurately identify bacteria routinely isolated in a clinical biology laboratory. In order to speed up the identification process, in the present work we attempted bacterial identification directly from blood culture bottles flagged as positive by the automated instrument. We prospectively analysed routine MALDI-TOF identification of bacteria detected in blood culture by two different protocols involving successive centrifugations and then lysis by trifluoroacetic acid or formic acid. Of the 562 blood culture broths flagged as positive by the automated instrument and containing one bacterial species, 370 (66%) were correctly identified. Changing the protocol from trifluoroacetic acid to formic acid improved identification of Staphylococci, and overall correct identification increased from 59% to 76%. Lack of identification was observed mostly with viridans streptococci, and only one false positive was observed. In the 22 positive blood culture broths that contained two or more different species, only one of the species was identified in 18 samples, no species were identified in two samples and false species identifications were obtained in two cases. The positive predictive value of bacterial identification using this procedure was 99.2%. MALDI-TOF MS is an efficient method for direct routine identification of bacterial isolates in blood culture, with the exception of polymicrobial samples and viridans streptococci. It may replace routine identification performed on colonies, provided improvement in the specificity for blood culture broths growing viridans streptococci is obtained in the near future.
Quantifying Spiral Ganglion Neurite and Schwann Behavior on Micropatterned Polymer Substrates.
Cheng, Elise L; Leigh, Braden; Guymon, C Allan; Hansen, Marlan R
2016-01-01
The first successful in vitro experiments on the cochlea were conducted in 1928 by Honor Fell (Fell, Arch Exp Zellforsch 7(1):69-81, 1928). Since then, techniques for culture of this tissue have been refined, and dissociated primary culture of the spiral ganglion has become a widely accepted in vitro model for studying nerve damage and regeneration in the cochlea. Additionally, patterned substrates have been developed that facilitate and direct neural outgrowth. A number of automated and semi-automated methods for quantifying this neurite outgrowth have been utilized in recent years (Zhang et al., J Neurosci Methods 160(1):149-162, 2007; Tapias et al., Neurobiol Dis 54:158-168, 2013). Here, we describe a method to study the effect of topographical cues on spiral ganglion neurite and Schwann cell alignment. We discuss our microfabrication process, characterization of pattern features, cell culture techniques for both spiral ganglion neurons and spiral ganglion Schwann cells. In addition, we describe protocols for reducing fibroblast count, immunocytochemistry, and methods for quantifying neurite and Schwann cell alignment.
Spore-forming organisms in platelet concentrates: a challenge in transfusion bacterial safety.
Störmer, M; Vollmer, T; Kleesiek, K; Dreier, J
2008-12-01
Bacterial detection and pathogen reduction are widely used methods of minimizing the risk of transfusion-transmitted bacterial infection, but bacterial spores are highly resistant to chemical and physical agents. In this study, we assessed the bacterial proliferation of spore-forming organisms seeded into platelet concentrates (PCs) to demonstrate that spores can enter the vegetative state in PCs during storage. In the in vitro study, PCs were inoculated with 1-10 spores mL(-1) of Bacillus cereus (n = 1), Bacillus subtilis (n = 2) and Clostridium sporogenes (n = 2). Sampling was performed during 6-day aerobic storage at 22 °C. The presence of bacteria was assessed by plating culture, automated culture and real-time reverse transcriptase-polymerase chain reaction (RT-PCR). Spores of C. sporogenes did not enter the vegetative phase under PC storage conditions, whereas B. subtilis and B. cereus showed growth in the PCs and could be detected using RT-PCR and automated culture. Depending on the species and inoculum, bacterial spores may enter the vegetative phase during PC storage and can be detected by bacterial detection methods.
NASA Astrophysics Data System (ADS)
Wartmann, David; Rothbauer, Mario; Kuten, Olga; Barresi, Caterina; Visus, Carmen; Felzmann, Thomas; Ertl, Peter
2015-09-01
The combination of microfabrication-based technologies with cell biology has laid the foundation for the development of advanced in vitro diagnostic systems capable of evaluating cell cultures under defined, reproducible and standardizable measurement conditions. In the present review we describe recent lab-on-a-chip developments for cell analysis and how these methodologies could improve standard quality control in the field of manufacturing cell-based vaccines for clinical purposes. We highlight in particular the regulatory requirements for advanced cell therapy applications using as an example dendritic cell-based cancer vaccines to describe the tangible advantages of microfluidic devices that overcome most of the challenges associated with automation, miniaturization and integration of cell-based assays. As its main advantage lab-on-a-chip technology allows for precise regulation of culturing conditions, while simultaneously monitoring cell relevant parameters using embedded sensory systems. State-of-the-art lab-on-a-chip platforms for in vitro assessment of cell cultures and their potential future applications for cell therapies and cancer immunotherapy are discussed in the present review.
Huebsch, Nathaniel; Loskill, Peter; Mandegar, Mohammad A.; Marks, Natalie C.; Sheehan, Alice S.; Ma, Zhen; Mathur, Anurag; Nguyen, Trieu N.; Yoo, Jennie C.; Judge, Luke M.; Spencer, C. Ian; Chukka, Anand C.; Russell, Caitlin R.; So, Po-Lin
2015-01-01
Contractile motion is the simplest metric of cardiomyocyte health in vitro, but unbiased quantification is challenging. We describe a rapid automated method, requiring only standard video microscopy, to analyze the contractility of human-induced pluripotent stem cell-derived cardiomyocytes (iPS-CM). New algorithms for generating and filtering motion vectors combined with a newly developed isogenic iPSC line harboring genetically encoded calcium indicator, GCaMP6f, allow simultaneous user-independent measurement and analysis of the coupling between calcium flux and contractility. The relative performance of these algorithms, in terms of improving signal to noise, was tested. Applying these algorithms allowed analysis of contractility in iPS-CM cultured over multiple spatial scales from single cells to three-dimensional constructs. This open source software was validated with analysis of isoproterenol response in these cells, and can be applied in future studies comparing the drug responsiveness of iPS-CM cultured in different microenvironments in the context of tissue engineering. PMID:25333967
Parallel peak pruning for scalable SMP contour tree computation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carr, Hamish A.; Weber, Gunther H.; Sewell, Christopher M.
As data sets grow to exascale, automated data analysis and visualisation are increasingly important, to intermediate human understanding and to reduce demands on disk storage via in situ analysis. Trends in the architecture of high performance computing systems necessitate analysis algorithms that make effective use of combinations of massively multicore and distributed systems. One of the principal analytic tools is the contour tree, which analyses relationships between contours to identify features of more than local importance. Unfortunately, the predominant algorithms for computing the contour tree are explicitly serial, and founded on serial metaphors, which has limited the scalability of this form of analysis. While there is some work on distributed contour tree computation, and separately on hybrid GPU-CPU computation, there is no efficient algorithm with strong formal guarantees on performance allied with fast practical performance. In this paper, we report the first shared-memory SMP algorithm for fully parallel contour tree computation, with formal guarantees of O(lg n lg t) parallel steps and O(n lg n) work, and implementations with up to 10x parallel speed up in OpenMP and up to 50x speed up in NVIDIA Thrust.
An Evaluation of Different Statistical Targets for Assembling Parallel Forms in Item Response Theory
Ali, Usama S.; van Rijn, Peter W.
2015-01-01
Assembly of parallel forms is an important step in the test development process. Therefore, choosing a suitable theoretical framework to generate well-defined test specifications is critical. The performance of different statistical targets of test specifications using the test characteristic curve (TCC) and the test information function (TIF) was investigated. Test length, the number of test forms, and content specifications are considered as well. The TCC target results in forms that are parallel in difficulty, but not necessarily in terms of precision. Vice versa, test forms created using a TIF target are parallel in terms of precision, but not necessarily in terms of difficulty. As sometimes the focus is either on TIF or TCC, differences in either difficulty or precision can arise. Differences in difficulty can be mitigated by equating, but differences in precision cannot. In a series of simulations using a real item bank, the two-parameter logistic model, and mixed integer linear programming for automated test assembly, these differences were found to be quite substantial. When both TIF and TCC are combined into one target with manipulation to relative importance, these differences can be made to disappear.
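The two assembly targets can be made concrete under the two-parameter logistic model used in the study; this is a minimal sketch with made-up item parameters, not the authors' assembly code:

```python
import numpy as np

def p_2pl(theta, a, b):
    """2PL probability of a correct response at ability theta, with
    discrimination a and difficulty b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def tcc(theta, a_params, b_params):
    """Test characteristic curve: expected number-correct score."""
    return sum(p_2pl(theta, a, b) for a, b in zip(a_params, b_params))

def tif(theta, a_params, b_params):
    """Test information function: sum of a^2 * P * (1 - P) over items;
    its reciprocal square root is the conditional standard error."""
    total = 0.0
    for a, b in zip(a_params, b_params):
        p = p_2pl(theta, a, b)
        total += a ** 2 * p * (1.0 - p)
    return total
```

Two forms matched on tcc across a grid of theta values are parallel in difficulty; two forms matched on tif are parallel in precision. Because neither match implies the other, a combined target is needed, which is the point the abstract makes.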
NASA Astrophysics Data System (ADS)
Tabekina, N. A.; Chepchurov, M. S.; Evtushenko, E. I.; Dmitrievsky, B. S.
2018-05-01
This work addresses the automation of a machining process, namely turning, to produce parts having planes parallel to the axis of rotation of the part without using special tools. According to the results, equipping a lathe with a high-speed electromechanical drive to control its operative movements makes it possible to obtain planes parallel to the part axis. The method is based on a mathematical model, presented as a functional dependency between the conveying velocity of the driven element and time, that describes the operative movements of the lathe over the whole tool path. Using the model of tool movement, it has been found that the conveying velocity varies from its maximum to zero, which allows the drive to be reversed. A scheme of tool placement relative to the workpiece has been proposed for unidirectional movement of the driven element at high conveying velocity. The control method for CNC machines can be used to produce geometrically complex parts on a lathe without special milling tools.
Automated microaneurysm detection in diabetic retinopathy using curvelet transform
NASA Astrophysics Data System (ADS)
Ali Shah, Syed Ayaz; Laude, Augustinus; Faye, Ibrahima; Tang, Tong Boon
2016-10-01
Microaneurysms (MAs) are known to be the early signs of diabetic retinopathy (DR). An automated MA detection system based on curvelet transform is proposed for color fundus image analysis. Candidates of MA were extracted in two parallel steps. In step one, blood vessels were removed from preprocessed green band image and preliminary MA candidates were selected by local thresholding technique. In step two, based on statistical features, the image background was estimated. The results from the two steps allowed us to identify preliminary MA candidates which were also present in the image foreground. A collection set of features was fed to a rule-based classifier to divide the candidates into MAs and non-MAs. The proposed system was tested with Retinopathy Online Challenge database. The automated system detected 162 MAs out of 336, thus achieved a sensitivity of 48.21% with 65 false positives per image. Counting MA is a means to measure the progression of DR. Hence, the proposed system may be deployed to monitor the progression of DR at early stage in population studies.
Sklar, A E; Sarter, N B
1999-12-01
Observed breakdowns in human-machine communication can be explained, in part, by the nature of current automation feedback, which relies heavily on focal visual attention. Such feedback is not well suited for capturing attention in case of unexpected changes and events or for supporting the parallel processing of large amounts of data in complex domains. As suggested by multiple-resource theory, one possible solution to this problem is to distribute information across various sensory modalities. A simulator study was conducted to compare the effectiveness of visual, tactile, and redundant visual and tactile cues for indicating unexpected changes in the status of an automated cockpit system. Both tactile conditions resulted in higher detection rates for, and faster response times to, uncommanded mode transitions. Tactile feedback did not interfere with, nor was its effectiveness affected by, the performance of concurrent visual tasks. The observed improvement in task-sharing performance indicates that the introduction of tactile feedback is a promising avenue toward better supporting human-machine communication in event-driven, information-rich domains.
Montagna, Fabio; Buiatti, Marco; Benatti, Simone; Rossi, Davide; Farella, Elisabetta; Benini, Luca
2017-10-01
EEG is a standard non-invasive technique used in neural disease diagnostics and the neurosciences. Frequency tagging is an increasingly popular experimental paradigm that efficiently tests brain function by measuring EEG responses to periodic stimulation. Recently, frequency-tagging paradigms have proven successful at low stimulation frequencies (0.5-6 Hz), but the EEG signal is intrinsically noisy in this frequency range, requiring heavy signal processing and significant human intervention for response estimation. This limits the possibility of processing the EEG on resource-constrained systems and of designing smart EEG-based devices for automated diagnostics. We propose an algorithm for artifact removal and automated detection of frequency-tagging responses over a wide range of stimulation frequencies, which we test on a visual stimulation protocol. The algorithm is rooted in machine-learning-based pattern recognition techniques and is tailored to a new-generation parallel ultra-low-power processing platform (PULP), reaching more than 90% accuracy in frequency detection even at very low stimulation frequencies (<1 Hz) with a power budget of 56 mW. Copyright © 2017 Elsevier Inc. All rights reserved.
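The core of frequency-tagging response detection can be sketched as a spectral peak search. The snippet below is a minimal illustration on synthetic data, not the paper's PULP-optimized algorithm; the sampling rate, duration, and noise level are assumed values.

```python
import numpy as np

def tagged_response_freq(signal, fs):
    """Frequency (Hz) of the dominant spectral peak: a minimal stand-in
    for frequency-tagging response detection."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum[0] = 0.0                    # ignore the DC component
    return freqs[np.argmax(spectrum)]

# Synthetic trace: a 0.8 Hz tagged response buried in Gaussian noise.
fs, duration = 250, 60                   # Hz, seconds (assumed values)
t = np.arange(fs * duration) / fs
rng = np.random.default_rng(0)
trace = np.sin(2 * np.pi * 0.8 * t) + 0.5 * rng.standard_normal(t.size)
peak = tagged_response_freq(trace, fs)
```

A 60 s window gives a frequency resolution of about 0.017 Hz, enough to resolve sub-1 Hz tagging frequencies like the one simulated here.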
Towards Evolving Electronic Circuits for Autonomous Space Applications
NASA Technical Reports Server (NTRS)
Lohn, Jason D.; Haith, Gary L.; Colombano, Silvano P.; Stassinopoulos, Dimitris
2000-01-01
The relatively new field of Evolvable Hardware studies how simulated evolution can reconfigure, adapt, and design hardware structures in an automated manner. Space applications, especially those requiring autonomy, are potential beneficiaries of evolvable hardware. For example, robotic drilling from a mobile platform requires high-bandwidth controller circuits that are difficult to design. In this paper, we present automated design techniques based on evolutionary search that could potentially be used in such applications. First, we present a method of automatically generating analog circuit designs using evolutionary search and a circuit construction language. Our system allows circuit size (number of devices), circuit topology, and device values to be evolved. Using a parallel genetic algorithm, we present experimental results for five design tasks. Second, we investigate the use of coevolution in automated circuit design. We examine fitness evaluation by comparing the effectiveness of four fitness schedules. The results indicate that solution quality is highest with static and co-evolving fitness schedules as compared to the other two dynamic schedules. We discuss these results and offer two possible explanations for the observed behavior: retention of useful information, and alignment of problem difficulty with circuit proficiency.
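A minimal sketch of the evolutionary-search idea follows. It evolves a small vector of real-valued "device parameters" toward a toy target; the population size, operators, and fitness function are illustrative stand-ins, not the paper's circuit-construction language or parallel GA.

```python
import random

def evolve(fitness, n_genes, pop_size=40, generations=60, seed=1):
    """Minimal generational GA: elitism, uniform crossover, and Gaussian
    mutation of one gene per child. Genes live in [0, 1]."""
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(n_genes)] for _ in range(pop_size)]
    for _ in range(generations):
        elite = sorted(pop, key=fitness)[: pop_size // 2]  # minimization
        children = []
        while len(children) < pop_size - len(elite):
            a, b = rng.sample(elite, 2)
            child = [x if rng.random() < 0.5 else y for x, y in zip(a, b)]
            i = rng.randrange(n_genes)
            child[i] = min(1.0, max(0.0, child[i] + rng.gauss(0, 0.1)))
            children.append(child)
        pop = elite + children
    return min(pop, key=fitness)

# Toy task: match a hypothetical target parameter vector.
target = [0.2, 0.7, 0.5]
best = evolve(lambda g: sum((x - t) ** 2 for x, t in zip(g, target)), 3)
```

In evolvable-hardware work the fitness would instead come from simulating the candidate circuit; only the search loop is analogous.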
NASA Technical Reports Server (NTRS)
Callantine, Todd J.; Kupfer, Michael; Martin, Lynne Hazel; Prevot, Thomas
2013-01-01
Air traffic management simulations conducted in the Airspace Operations Laboratory at NASA Ames Research Center have addressed the integration of trajectory-based arrival-management automation, controller tools, and Flight-Deck Interval Management avionics to enable Continuous Descent Operations (CDOs) during periods of sustained high traffic demand. The simulations are devoted to maturing the integrated system for field demonstration, and refining the controller tools, clearance phraseology, and procedures specified in the associated concept of operations. The results indicate a variety of factors impact the concept's safety and viability from a controller's perspective, including en-route preconditioning of arrival flows, useable clearance phraseology, and the characteristics of airspace, routes, and traffic-management methods in use at a particular site. Clear understanding of automation behavior and required shifts in roles and responsibilities is important for controller acceptance and realizing potential benefits. This paper discusses the simulations, drawing parallels with results from related European efforts. The most recent study found en-route controllers can effectively precondition arrival flows, which significantly improved route conformance during CDOs. Controllers found the tools acceptable, in line with previous studies.
Automating the selection of standard parallels for conic map projections
NASA Astrophysics Data System (ADS)
Šavrič, Bojan; Jenny, Bernhard
2016-05-01
Conic map projections are appropriate for mapping regions at medium and large scales with east-west extents at intermediate latitudes, because they show the mapped area with less distortion than other projections. In order to minimize the distortion of the mapped area, the two standard parallels of conic projections need to be selected carefully. Rules of thumb exist for placing the standard parallels based on the width-to-height ratio of the map. These rules of thumb are simple to apply, but do not result in maps with minimum distortion. More sophisticated methods also exist that determine standard parallels such that distortion in the mapped area is minimized, but these methods are computationally expensive and cannot be used for real-time web mapping and GIS applications where the projection is adjusted automatically to the displayed area. This article presents a polynomial model that quickly provides the standard parallels for the three most common conic map projections: the Albers equal-area, the Lambert conformal, and the equidistant conic projection. The model defines the standard parallels with polynomial expressions based on the spatial extent of the mapped area. The spatial extent is defined by the length of the mapped central meridian segment, the central latitude of the displayed area, and the width-to-height ratio of the map. The polynomial model was derived from 3825 maps, each with a different spatial extent and with computationally determined standard parallels that minimize the mean scale distortion index. The resulting model is computationally simple and can be used for the automatic selection of the standard parallels of conic map projections in GIS software and web mapping applications.
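The rule of thumb mentioned above has a particularly simple classical form: place the standard parallels one sixth of the latitude range inside the southern and northern map edges. A sketch of that baseline follows (the article's polynomial model refines it; the example region is illustrative):

```python
def standard_parallels_one_sixth(lat_south, lat_north):
    """Classic one-sixth rule of thumb for conic projections: put the two
    standard parallels one sixth of the latitude range inside the map's
    southern and northern edges."""
    span = lat_north - lat_south
    return lat_south + span / 6.0, lat_north - span / 6.0

# Conterminous United States, roughly 24N to 49N (illustrative extent).
sp1, sp2 = standard_parallels_one_sixth(24.0, 49.0)
```

For this extent the rule yields standard parallels near 28.2N and 44.8N, close to the familiar values used for US Albers maps.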
New cellular automaton model for magnetohydrodynamics
NASA Technical Reports Server (NTRS)
Chen, Hudong; Matthaeus, William H.
1987-01-01
A new type of two-dimensional cellular automaton method is introduced for computation of magnetohydrodynamic fluid systems. Particle population is described by a 36-component tensor referred to a hexagonal lattice. By appropriate choice of the coefficients that control the modified streaming algorithm and the definition of the macroscopic fields, it is possible to compute both Lorentz-force and magnetic-induction effects. The method is local in the microscopic space and therefore suited to massively parallel computations.
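The streaming half of such a lattice method can be sketched in a few lines: each directional particle population is shifted one site along its own lattice vector. This is a bare illustration of streaming on a six-direction lattice with periodic boundaries, not the paper's 36-component hexagonal MHD model.

```python
import numpy as np

# Six lattice directions (hexagonal lattice indexed on a square array).
OFFSETS = [(1, 0), (0, 1), (-1, 1), (-1, 0), (0, -1), (1, -1)]

def stream(populations):
    """Shift each directional population one site along its lattice
    vector, with periodic boundaries. Locality of this step is what makes
    such automata suited to massively parallel hardware."""
    return np.stack([np.roll(np.roll(p, dx, axis=0), dy, axis=1)
                     for p, (dx, dy) in zip(populations, OFFSETS)])

rng = np.random.default_rng(0)
n = rng.integers(0, 2, size=(6, 16, 16))   # random occupation numbers
after = stream(n)                           # particle number is conserved
```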
Nerandzic, Michelle M; Cadnum, Jennifer L; Pultz, Michael J; Donskey, Curtis J
2010-07-08
Environmental surfaces play an important role in transmission of healthcare-associated pathogens. There is a need for new disinfection methods that are effective against Clostridium difficile spores, but also safe, rapid, and automated. The Tru-D Rapid Room Disinfection device is a mobile, fully-automated room decontamination technology that utilizes ultraviolet-C irradiation to kill pathogens. We examined the efficacy of environmental disinfection using the Tru-D device in the laboratory and in rooms of hospitalized patients. Cultures for C. difficile, methicillin-resistant Staphylococcus aureus (MRSA), and vancomycin-resistant Enterococcus (VRE) were collected from commonly touched surfaces before and after use of Tru-D. On inoculated surfaces, application of Tru-D at a reflected dose of 22,000 µW·s/cm² for approximately 45 minutes consistently reduced recovery of C. difficile spores and MRSA by >2-3 log10 colony-forming units (CFU)/cm² and of VRE by >3-4 log10 CFU/cm². Similar killing of MRSA and VRE was achieved in approximately 20 minutes at a reflected dose of 12,000 µW·s/cm², but killing of C. difficile spores was reduced. Disinfection of hospital rooms with Tru-D reduced the frequency of positive MRSA and VRE cultures by 93% and of C. difficile cultures by 80%. After routine hospital cleaning of the rooms of MRSA carriers, 18% of sites under the edges of bedside tables (i.e., a frequently touched site not easily amenable to manual application of disinfectant) were contaminated with MRSA, versus 0% after Tru-D (P < 0.001). The system required <5 minutes to set up and did not require continuous monitoring. The Tru-D Rapid Room Disinfection device is a novel, automated, and efficient environmental disinfection technology that significantly reduces C. difficile, VRE and MRSA contamination on commonly touched hospital surfaces.
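The ">2-3 log10" reductions quoted above follow the usual definition of log reduction, sketched here with hypothetical CFU counts:

```python
import math

def log10_reduction(cfu_before, cfu_after):
    """Log10 reduction in colony-forming units after disinfection."""
    return math.log10(cfu_before / cfu_after)

# e.g. a surface dropping from 1e5 to 1e2 CFU/cm2 is a 3-log reduction
r = log10_reduction(1e5, 1e2)
```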
Repeated Stimulation of Cultured Networks of Rat Cortical Neurons Induces Parallel Memory Traces
ERIC Educational Resources Information Center
le Feber, Joost; Witteveen, Tim; van Veenendaal, Tamar M.; Dijkstra, Jelle
2015-01-01
During systems consolidation, memories are spontaneously replayed favoring information transfer from hippocampus to neocortex. However, at present no empirically supported mechanism to accomplish a transfer of memory from hippocampal to extra-hippocampal sites has been offered. We used cultured neuronal networks on multielectrode arrays and…
Huber, Robert; Ritter, Daniel; Hering, Till; Hillmer, Anne-Kathrin; Kensy, Frank; Müller, Carsten; Wang, Le; Büchs, Jochen
2009-08-01
In industry and academic research, there is an increasing demand for flexible automated microfermentation platforms with advanced sensing technology. However, up to now, conventional platforms cannot generate continuous data in high-throughput cultivations, in particular for monitoring biomass and fluorescent proteins. Furthermore, microfermentation platforms are needed that can easily combine cost-effective, disposable microbioreactors with downstream processing and analytical assays. To meet this demand, a novel automated microfermentation platform consisting of a BioLector and a liquid-handling robot (Robo-Lector) was successfully built and tested. The BioLector provides a cultivation system that is able to permanently monitor microbial growth and the fluorescence of reporter proteins under defined conditions in microtiter plates. Three exemplary methods were programmed on the Robo-Lector platform to study high-throughput cultivation processes in detail, especially recombinant protein expression. The host/vector system E. coli BL21(DE3) pRhotHi-2-EcFbFP, expressing the fluorescence protein EcFbFP, was investigated. With the method 'induction profiling' it was possible to conduct 96 different induction experiments (varying inducer concentrations from 0 to 1.5 mM IPTG at 8 different induction times) simultaneously in an automated way. The method 'biomass-specific induction' made it possible to automatically induce cultures with different growth kinetics in a microtiter plate at the same biomass concentration, which resulted in a relative standard deviation of EcFbFP production of only +/- 7%. The third method, 'biomass-specific replication', enabled the generation of equal initial biomass concentrations in main cultures from precultures with different growth kinetics. 
This was realized by automatically transferring an appropriate inoculum volume from the different preculture microtiter wells to the respective wells of the main culture plate, where subsequently similar growth kinetics could be obtained. The Robo-Lector generates extensive kinetic data in high-throughput cultivations, particularly for biomass and fluorescent protein formation. Based on the non-invasive on-line monitoring signals, actions of the liquid-handling robot can easily be triggered. This interaction between the robot and the BioLector (Robo-Lector) combines high-content data generation with systematic high-throughput experimentation in an automated fashion, offering new possibilities to study biological production systems. The presented platform uses a standard liquid-handling workstation with widespread automation possibilities. Thus, high-throughput cultivations can now be combined with small-scale downstream processing techniques and analytical assays. Ultimately, this novel versatile platform can accelerate and intensify research and development in the field of systems biology as well as modelling and bioprocess optimization.
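The 'biomass-specific replication' step reduces to a simple proportionality: transfer enough preculture that optical density times volume is equal across wells. A sketch with illustrative values (the function name and numbers are assumptions, not from the paper):

```python
def inoculum_volume(od_preculture, od_target, main_volume_ul):
    """Preculture volume (uL) to transfer so each main-culture well starts
    at the same biomass, i.e. optical density x volume held constant."""
    return od_target * main_volume_ul / od_preculture

# Denser precultures contribute proportionally less volume.
vols = [inoculum_volume(od, od_target=0.1, main_volume_ul=800)
        for od in (0.8, 1.6, 3.2)]    # -> 100, 50, 25 uL
```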
USDA-ARS?s Scientific Manuscript database
This study compared the BAX Polymerase Chain Reaction method (BAX PCR) with the Standard Culture Method (SCM) for detection of L. monocytogenes in blue crab meat and crab processing plants. The aim of this study was to address this data gap. Raw crabs, finished products and environmental sponge samp...
Lee, Wei-Po; Hsiao, Yu-Ting; Hwang, Wei-Che
2014-01-16
To improve on the tedious task of reconstructing gene networks by experimentally testing the possible interactions between genes, it has become a trend to adopt an automated reverse-engineering procedure instead. Some evolutionary algorithms have been suggested for deriving network parameters. However, to infer large networks with an evolutionary algorithm, two important issues must be addressed: premature convergence and high computational cost. To tackle the former and enhance the performance of traditional evolutionary algorithms, it is advisable to use parallel-model evolutionary algorithms. To overcome the latter and speed up the computation, cloud computing is a promising solution; most popular is the MapReduce programming model, a fault-tolerant framework for implementing parallel algorithms to infer large gene networks. This work presents a practical framework to infer large gene networks by developing and parallelizing a hybrid GA-PSO optimization method. Our parallel method is extended to work with the Hadoop MapReduce programming model and is executed in different cloud computing environments. To evaluate the proposed approach, we use the well-known open-source software GeneNetWeaver to create several yeast S. cerevisiae sub-networks and use them to produce gene profiles. Experiments have been conducted and the results analyzed. They show that our parallel approach can successfully infer networks with the desired behaviors and that the computation time can be greatly reduced. Parallel population-based algorithms can effectively determine network parameters, and they perform better than the widely used sequential algorithms in gene network inference. These parallel algorithms can be distributed to a cloud computing environment to speed up the computation. 
By coupling the parallel population-based optimization method with the parallel computational framework, high-quality solutions can be obtained within a relatively short time. This integrated approach is a promising way to infer large networks.
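The PSO half of such a hybrid optimizer can be sketched as follows; the GA half (crossover and mutation of personal bests) and the MapReduce-parallelized fitness evaluation are omitted, and all parameters are illustrative.

```python
import random

def sphere(x):
    """Toy fitness: sum of squares, minimized at the origin. A real run
    would score how well simulated network dynamics match gene profiles."""
    return sum(xi * xi for xi in x)

def pso_minimize(f, dim=2, n_particles=20, iters=200, seed=3):
    """Plain particle swarm optimization with inertia 0.7 and
    cognitive/social weights 1.5."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]          # personal bests
    gbest = min(pbest, key=f)            # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * rng.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
        gbest = min(pbest, key=f)
    return gbest

best = pso_minimize(sphere)
```

In the hybrid scheme, the per-particle fitness evaluations are independent, which is exactly what makes them easy to farm out as the map phase of a MapReduce job.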
Robust and efficient overset grid assembly for partitioned unstructured meshes
NASA Astrophysics Data System (ADS)
Roget, Beatrice; Sitaraman, Jayanarayanan
2014-03-01
This paper presents a method to perform efficient and automated Overset Grid Assembly (OGA) on a system of overlapping unstructured meshes in a parallel computing environment where all meshes are partitioned into multiple mesh-blocks and processed on multiple cores. The main task of the overset grid assembler is to identify, in parallel, among all points in the overlapping mesh system, at which points the flow solution should be computed (field points), interpolated (receptor points), or ignored (hole points). Point containment search or donor search, an algorithm to efficiently determine the cell that contains a given point, is the core procedure necessary for accomplishing this task. Donor search is particularly challenging for partitioned unstructured meshes because of the complex irregular boundaries that are often created during partitioning.
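At the heart of donor search is a point-containment test against a candidate cell. For a 2-D triangle this reduces to checking the signs of barycentric coordinates, sketched below; production OGA operates on 3-D unstructured cells across partitioned meshes, so this shows only the elementary idea.

```python
def contains(tri, p, eps=1e-12):
    """Point-in-triangle test via barycentric (signed-area) coordinates:
    p is inside (or on the boundary) iff all three coordinates are
    non-negative."""
    (ax, ay), (bx, by), (cx, cy) = tri
    px, py = p
    det = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy)
    l1 = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / det
    l2 = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / det
    l3 = 1.0 - l1 - l2
    return min(l1, l2, l3) >= -eps

tri = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]   # a unit right triangle
```

A donor search accelerates this by using spatial data structures to narrow down which cells to test, rather than testing every cell for every point.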
Automated Solar Module Assembly Line
NASA Technical Reports Server (NTRS)
Bycer, M.
1979-01-01
The information gathering that led to the machine's design approach is discussed, together with a summary of the findings in the areas of study and a description of each station of the machine. The machine is a cell-stringing and string-applique machine that is flexible in design, capable of handling a variety of cells and assembling strings of cells which can then be placed in a matrix up to 4 ft x 2 ft in series or parallel arrangement. The target machine cycle is 5 seconds per cell. The machine is primarily adapted to 100 mm round cells with one or two tabs between cells. It places finished strings of up to twelve cells in a matrix of up to six such strings arranged in series or in parallel.
High-Throughput Industrial Coatings Research at The Dow Chemical Company.
Kuo, Tzu-Chi; Malvadkar, Niranjan A; Drumright, Ray; Cesaretti, Richard; Bishop, Matthew T
2016-09-12
At The Dow Chemical Company, high-throughput research is an active area for developing new industrial coatings products. Using the principles of automation (i.e., using robotic instruments), parallel processing (i.e., preparing, processing, and evaluating samples in parallel), and miniaturization (i.e., reducing sample size), high-throughput tools for synthesizing, formulating, and applying coating compositions have been developed at Dow. In addition, high-throughput workflows for measuring various coating properties, such as cure speed, hardness development, scratch resistance, impact toughness, resin compatibility, pot-life, and surface defects, among others, have also been developed in-house. These workflows correlate well with the traditional coatings tests, but they do not necessarily mimic those tests. The use of such high-throughput workflows in combination with smart experimental designs allows accelerated discovery and commercialization.
An economic evaluation of colorectal cancer screening in primary care practice.
Meenan, Richard T; Anderson, Melissa L; Chubak, Jessica; Vernon, Sally W; Fuller, Sharon; Wang, Ching-Yun; Green, Beverly B
2015-06-01
Recent colorectal cancer screening studies focus on optimizing adherence. This study evaluated the cost effectiveness of interventions using electronic health records (EHRs); automated mailings; and stepped support increases to improve 2-year colorectal cancer screening adherence. Analyses were based on a parallel-design, randomized trial in which three stepped interventions (EHR-linked mailings ["automated"]; automated plus telephone assistance ["assisted"]; or automated and assisted plus nurse navigation to testing completion or refusal ["navigated"]) were compared to usual care. Data were from August 2008 to November 2011, with analyses performed during 2012-2013. Implementation resources were micro-costed; research and registry development costs were excluded. Incremental cost-effectiveness ratios (ICERs) were based on number of participants current for screening per guidelines over 2 years. Bootstrapping examined robustness of results. Intervention delivery cost per participant current for screening ranged from $21 (automated) to $27 (navigated). Inclusion of induced testing costs (e.g., screening colonoscopy) lowered expenditures for automated (ICER=-$159) and assisted (ICER=-$36) relative to usual care over 2 years. Savings arose from increased fecal occult blood testing, substituting for more expensive colonoscopies in usual care. Results were broadly consistent across demographic subgroups. More intensive interventions were consistently likely to be cost effective relative to less intensive interventions, with willingness to pay values of $600-$1,200 for an additional person current for screening yielding ≥80% probability of cost effectiveness. Two-year cost effectiveness of a stepped approach to colorectal cancer screening promotion based on EHR data is indicated, but longer-term cost effectiveness requires further study. Copyright © 2015 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.
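The ICERs quoted above follow the standard definition, incremental cost over incremental effect. A sketch with hypothetical numbers (not the study's data):

```python
def icer(cost_new, effect_new, cost_base, effect_base):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of
    effect (here, per additional person current for screening)."""
    return (cost_new - cost_base) / (effect_new - effect_base)

# Hypothetical arm totals: the intervention costs more but keeps more
# participants current for screening.
ratio = icer(cost_new=27_000, effect_new=650,
             cost_base=15_000, effect_base=500)   # $80 per extra person
```

A negative ICER, as reported for the automated and assisted arms, means the intervention both cost less and performed better than usual care.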
Liu, Po C; Lee, Yi T; Wang, Chun Y; Yang, Ya-Tang
2016-09-27
We describe a low cost, configurable morbidostat for characterizing the evolutionary pathway of antibiotic resistance. The morbidostat is a bacterial culture device that continuously monitors bacterial growth and dynamically adjusts the drug concentration to constantly challenge the bacteria as they evolve to acquire drug resistance. The device features a working volume of ~10 ml and is fully automated and equipped with optical density measurement and micro-pumps for medium and drug delivery. To validate the platform, we measured the stepwise acquisition of trimethoprim resistance in Escherichia coli MG 1655, and integrated the device with a multiplexed microfluidic platform to investigate cell morphology and antibiotic susceptibility. The approach can be up-scaled to laboratory studies of antibiotic drug resistance, and is extendible to adaptive evolution for strain improvements in metabolic engineering and other bacterial culture experiments.
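The morbidostat's defining feedback rule, adding drug only when the culture is both dense and still growing, can be sketched as a per-cycle decision. The threshold value and function name here are illustrative, not the authors' implementation.

```python
def morbidostat_action(od, od_prev, od_threshold=0.15):
    """Dilution decision for one cycle: challenge the culture with
    drug-containing medium only when it is dense AND still growing;
    otherwise dilute with plain medium so growth can resume."""
    growing = od > od_prev
    if od > od_threshold and growing:
        return "drug"
    return "medium"
```

Repeating this decision every cycle keeps the drug concentration tracking the population's evolving resistance, which is what lets the device map out the stepwise acquisition of resistance.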
Groves, Tamar; Figuerola, Carlos G; Quintanilla, Miguel Á
2016-08-01
This article presents our study of science coverage in the digital Spanish press over the last decade. We employed automated information retrieval procedures to create a corpus of 50,763 text units dealing with science and technology, and used automated text-analysis procedures to provide a general picture of the structure, characteristics and evolution of science news in Spain. We found science coverage of between 6% and 7%, with a markedly high proportion of biomedicine and a predominance of science over technology, although we also detected an increase in technological content during the second half of the decade. Analysing the extrinsic and intrinsic features of science culture, we found a predominance of intrinsic features that still need further analysis. Our attempt to use specialised software to examine big data was effective, and allowed us to reach these preliminary conclusions. © The Author(s) 2015.
Johnson-Chavarria, Eric M.; Agrawal, Utsav; Tanyeri, Melikhan; Kuhlman, Thomas E.
2014-01-01
We report an automated microfluidic-based platform for single cell analysis that allows for cell culture in free solution with the ability to control the cell growth environment. Using this approach, cells are confined by the sole action of gentle fluid flow, thereby enabling non-perturbative analysis of cell growth away from solid boundaries. In addition, the single cell microbioreactor allows for precise and time-dependent control over cell culture media, with the combined ability to observe the dynamics of non-adherent cells over long time scales. As a proof-of-principle demonstration, we used the platform to observe dynamic cell growth, gene expression, and intracellular diffusion of repressor proteins while precisely tuning the cell growth environment. Overall, this microfluidic approach enables the direct observation of cellular dynamics with exquisite control over environmental conditions, which will be useful for quantifying the behaviour of single cells in well-defined media. PMID:24836754
Inhibition of Prostate Cancer Skeletal Metastases by Targeting Cathepsin K
2009-05-01
Only fragmentary figure-caption text survives for this record: it describes an in vitro bone resorption assay using the BD Biosciences Osteologic bone cell culture system, which consists of sub-micron synthetic calcium phosphate thin films coated onto the culture vessels, with a parallel study using 96-well plates containing dentin slices; representative images of resorption pits on dentin slices or synthetic calcium phosphate thin films are shown.
ERIC Educational Resources Information Center
Akerson, Valarie L.; Buzzelli, Cary A.; Eastwood, Jennifer L.
2012-01-01
This study explored preservice teachers' views of their own cultural values, the cultural values they believed scientists hold, and the relationships of these views to their conceptions of nature of science (NOS). Parallel assignments in a foundations of early childhood education and a science methods course required preservice teachers to explore…
Vision 20/20: Automation and advanced computing in clinical radiation oncology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moore, Kevin L., E-mail: kevinmoore@ucsd.edu; Moiseenko, Vitali; Kagadis, George C.
2014-01-15
This Vision 20/20 paper considers what computational advances are likely to be implemented in clinical radiation oncology in the coming years and how the adoption of these changes might alter the practice of radiotherapy. Four main areas of likely advancement are explored: cloud computing, aggregate data analyses, parallel computation, and automation. As these developments promise both new opportunities and new risks to clinicians and patients alike, the potential benefits are weighed against the hazards associated with each advance, with special considerations regarding patient safety under new computational platforms and methodologies. While the concerns of patient safety are legitimate, the authors contend that progress toward next-generation clinical informatics systems will bring about extremely valuable developments in quality improvement initiatives, clinical efficiency, outcomes analyses, data sharing, and adaptive radiotherapy.
Phaser.MRage: automated molecular replacement
Bunkóczi, Gábor; Echols, Nathaniel; McCoy, Airlie J.; Oeffner, Robert D.; Adams, Paul D.; Read, Randy J.
2013-01-01
Phaser.MRage is a molecular-replacement automation framework that implements a full model-generation workflow and provides several layers of model exploration to the user. It is designed to handle a large number of models and can distribute calculations efficiently onto parallel hardware. In addition, phaser.MRage can identify correct solutions and use this information to accelerate the search. Firstly, it can quickly score all alternative models of a component once a correct solution has been found. Secondly, it can perform extensive analysis of identified solutions to find protein assemblies and can employ assembled models for subsequent searches. Thirdly, it is able to use a priori assembly information (derived from, for example, homologues) to speculatively place and score molecules, thereby customizing the search procedure to a certain class of protein molecule (for example, antibodies) and incorporating additional biological information into molecular replacement. PMID:24189240
Research highlights: microfluidics meets big data.
Tseng, Peter; Weaver, Westbrook M; Masaeli, Mahdokht; Owsley, Keegan; Di Carlo, Dino
2014-03-07
In this issue we highlight a collection of recent work in which microfluidic parallelization and automation have been employed to address the increasing need for large amounts of quantitative data concerning cellular function: from correlating microRNA levels with protein expression, to increasing the throughput and reducing the noise when studying protein dynamics in single cells, to understanding how signal dynamics encode information. The painstaking dissection of cellular pathways one protein at a time appears to be coming to an end, leading to more rapid discoveries which will inevitably translate to better cellular control, both in producing useful gene products and in treating disease at the individual cell level. From these studies it is also clear that the development of large-scale mutant or fusion libraries, and the automation of microscopy, image analysis, and data extraction, will be key components as microfluidics contributes its strengths to systems biology moving forward.
Vision 20/20: Automation and advanced computing in clinical radiation oncology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moore, Kevin L., E-mail: kevinmoore@ucsd.edu; Moiseenko, Vitali; Kagadis, George C.
2014-01-15
This Vision 20/20 paper considers what computational advances are likely to be implemented in clinical radiation oncology in the coming years and how the adoption of these changes might alter the practice of radiotherapy. Four main areas of likely advancement are explored: cloud computing, aggregate data analyses, parallel computation, and automation. As these developments promise both new opportunities and new risks to clinicians and patients alike, the potential benefits are weighed against the hazards associated with each advance, with special considerations regarding patient safety under new computational platforms and methodologies. While the concerns of patient safety are legitimate, the authors contend that progress toward next-generation clinical informatics systems will bring about extremely valuable developments in quality improvement initiatives, clinical efficiency, outcomes analyses, data sharing, and adaptive radiotherapy.
Adaptive radial basis function mesh deformation using data reduction
NASA Astrophysics Data System (ADS)
Gillebaart, T.; Blom, D. S.; van Zuijlen, A. H.; Bijl, H.
2016-09-01
Radial Basis Function (RBF) mesh deformation is one of the most robust mesh deformation methods available. Using the greedy (data reduction) method in combination with an explicit boundary correction results in an efficient method, as shown in the literature. However, to ensure the method remains robust, two issues are addressed: (1) how to ensure that the set of control points remains an accurate representation of the geometry over time, and (2) how to use/automate the explicit boundary correction while ensuring high mesh quality. In this paper, we propose an adaptive RBF mesh deformation method which ensures that the set of control points always represents the geometry/displacement up to a certain (user-specified) criterion, by keeping track of the boundary error throughout the simulation and re-selecting control points when needed. As opposed to the unit-displacement and prescribed-displacement selection methods, the adaptive method is more robust, user-independent and efficient for the cases considered. Secondly, the analysis of a single high-aspect-ratio cell is used to formulate an equation for the required correction radius, depending on the characteristics of the correction function used, the maximum aspect ratio, the minimum first cell height and the boundary error. Based on this analysis, two new radial basis correction functions are derived and proposed. The proposed automated procedure is verified while varying the correction function, the Reynolds number (and thus first cell height and aspect ratio) and the boundary error. Finally, the parallel efficiency is studied for the two adaptive methods, unit displacement and prescribed displacement, for both the CPU and the memory formulation, using a 2D oscillating and translating airfoil with an oscillating flap, a 3D flexible locally deforming tube and a deforming wind turbine blade.
Generally, the memory formulation requires less work (because evaluating RBFs dominates the cost), but its parallel efficiency is reduced by the limited bandwidth available between CPU and memory. In terms of parallel efficiency/scaling, the studied methods perform similarly, with the greedy algorithm being the bottleneck. In terms of absolute computational work, the adaptive methods are better for the cases studied due to their more efficient selection of control points. By automating most of the RBF mesh deformation, a robust, efficient and almost user-independent mesh deformation method is presented.
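The greedy data-reduction idea described above can be sketched in a few lines: interpolate boundary displacements with an RBF, and keep adding the worst-represented boundary point to the control set until a user-specified tolerance is met. This is a minimal NumPy illustration, not the authors' implementation; the Wendland C2 basis and all function names are assumptions.

```python
import numpy as np

def wendland_c2(r, radius):
    # Compactly supported Wendland C2 basis, a common choice in RBF mesh deformation
    xi = np.minimum(r / radius, 1.0)
    return (1.0 - xi) ** 4 * (4.0 * xi + 1.0)

def solve_weights(ctrl, disp, radius):
    # Interpolation system Phi w = d, with Phi_ij = phi(|x_i - x_j|)
    r = np.linalg.norm(ctrl[:, None, :] - ctrl[None, :, :], axis=-1)
    return np.linalg.solve(wendland_c2(r, radius), disp)

def deform(nodes, ctrl, w, radius):
    # Displace mesh nodes by evaluating the RBF interpolant
    r = np.linalg.norm(nodes[:, None, :] - ctrl[None, :, :], axis=-1)
    return nodes + wendland_c2(r, radius) @ w

def greedy_select(boundary, disp, tol, radius):
    # Grow the control-point set until the boundary error drops below tol
    sel = [0]
    while True:
        w = solve_weights(boundary[sel], disp[sel], radius)
        err = np.linalg.norm(
            deform(boundary, boundary[sel], w, radius) - boundary - disp, axis=1)
        worst = int(np.argmax(err))
        if err[worst] < tol or len(sel) == len(boundary):
            return sel, w
        sel.append(worst)
```

Re-selection during a simulation, as the abstract describes, would amount to re-running `greedy_select` whenever the tracked boundary error exceeds the tolerance.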
Culture without the petri-dish.
Thompson, Jeremy G
2007-01-01
Automation of oocyte maturation and embryo production techniques is a new and exciting development in the field of reproductive technologies. There are two areas where increased automation is having an impact: embryo diagnostics and the embryo production process itself. Benefits include decreased staffing and skill requirements for the production and assessment of embryos, as well as improved quality management systems by removing the "human" factor. However, the uptake of new technologies is likely to be slow, given the costs involved and the conservative nature of the Assisted Reproduction Technology industry in adopting new techniques.
Integrating PCLIPS into ULowell's Lincoln Logs: Factory of the future
NASA Technical Reports Server (NTRS)
Mcgee, Brenda J.; Miller, Mark D.; Krolak, Patrick; Barr, Stanley J.
1990-01-01
We are attempting to show how independent but cooperating expert systems, executing within a parallel production system (PCLIPS), can operate and control a completely automated, fault tolerant prototype of a factory of the future (The Lincoln Logs Factory of the Future). The factory consists of a CAD system for designing the Lincoln Log Houses, two workcells, and a materials handling system. A workcell consists of two robots, part feeders, and a frame mounted vision system.
Current status and future prospects for enabling chemistry technology in the drug discovery process.
Djuric, Stevan W; Hutchins, Charles W; Talaty, Nari N
2016-01-01
This review covers recent advances in the implementation of enabling chemistry technologies into the drug discovery process. Areas covered include parallel synthesis chemistry, high-throughput experimentation, automated synthesis and purification methods, flow chemistry methodology including photochemistry, electrochemistry, and the handling of "dangerous" reagents. Also featured are advances in the "computer-assisted drug design" area and the expanding application of novel mass spectrometry-based techniques to a wide range of drug discovery activities.
Florence, Alastair J; Johnston, Andrea; Price, Sarah L; Nowell, Harriott; Kennedy, Alan R; Shankland, Norman
2006-09-01
An automated parallel crystallisation search for physical forms of carbamazepine, covering 66 solvents and five crystallisation protocols, identified three anhydrous polymorphs (forms I-III), one hydrate and eight organic solvates, including the single-crystal structures of three previously unreported solvates (N,N-dimethylformamide (1:1); hemi-furfural; hemi-1,4-dioxane). Correlation of physical form outcome with the crystallisation conditions demonstrated that the solvent adopts a relatively nonspecific role in determining which polymorph is obtained, and that the previously reported effect of a polymer template facilitating the formation of form IV could not be reproduced by solvent crystallisation alone. In the accompanying computational search, approximately half of the energetically feasible predicted crystal structures exhibit the C=O...H--N R2(2)(8) dimer motif that is observed in the known polymorphs, with the most stable correctly corresponding to form III. Most of the other energetically feasible structures, including the global minimum, have a C=O...H--N C(4) chain hydrogen bond motif. No such chain structures were observed in this or any other previously published work, suggesting that kinetic, rather than thermodynamic, factors determine which of the energetically feasible crystal structures are observed experimentally, with the kinetics apparently favouring nucleation of crystal structures based on the CBZ-CBZ R2(2)(8) motif. (c) 2006 Wiley-Liss, Inc. and the American Pharmacists Association.
A systematic approach to embedded biomedical decision making.
Song, Zhe; Ji, Zhongkai; Ma, Jian-Guo; Sputh, Bernhard; Acharya, U Rajendra; Faust, Oliver
2012-11-01
Embedded decision making is a key feature of many biomedical systems. In most cases human life directly depends on correct decisions made by these systems; therefore they have to work reliably. This paper describes how we applied systems engineering principles to design a high-performance embedded classification system in a systematic and well-structured way. We introduce the structured design approach by discussing requirements capturing, specification refinement, implementation and testing. Thereby, we follow systems engineering principles and execute each of these processes as formally as possible. The requirements, which motivate the system design, describe an automated decision-making system for diagnostic support. These requirements are refined into the implementation of a support vector machine (SVM) algorithm which enables us to integrate automated decision making in embedded systems. With a formal model we establish functionality, stability and reliability of the system. Furthermore, we investigated different parallel processing configurations of this computationally complex algorithm. We found that, by adding SVM processes, an almost linear speedup is possible. Once we established these system properties, we translated the formal model into an implementation. The resulting implementation was tested using XMOS processors with both normal and failure cases, to build up trust in the implementation. Finally, we demonstrated that our parallel implementation achieves the speedup predicted by the formal model. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
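The near-linear speedup from adding SVM processes comes from the fact that the decision function is a sum over support vectors, which partitions naturally across workers. The sketch below illustrates that idea only; it is not the authors' XMOS implementation, and all names, the RBF kernel, and the thread-based parallelism are assumptions.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def svm_decision(x, sv, alpha_y, b=0.0, gamma=0.5):
    # Serial RBF-kernel decision function: f(x) = sum_i alpha_i*y_i*K(sv_i, x) + b
    k = np.exp(-gamma * np.sum((sv - x) ** 2, axis=1))
    return float(alpha_y @ k + b)

def svm_decision_parallel(x, sv, alpha_y, b=0.0, gamma=0.5, n_parts=4):
    # Partition the support vectors; each worker computes a partial kernel sum,
    # and the partial sums are combined at the end.
    def partial(idx):
        k = np.exp(-gamma * np.sum((sv[idx] - x) ** 2, axis=1))
        return float(alpha_y[idx] @ k)
    chunks = np.array_split(np.arange(len(sv)), n_parts)
    with ThreadPoolExecutor(max_workers=n_parts) as ex:
        parts = list(ex.map(partial, chunks))
    return sum(parts) + b
```

Because each partial sum is independent, the work divides evenly across processes, which is consistent with the almost linear speedup the abstract reports.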
Automated Instrumentation, Monitoring and Visualization of PVM Programs Using AIMS
NASA Technical Reports Server (NTRS)
Mehra, Pankaj; VanVoorst, Brian; Yan, Jerry; Tucker, Deanne (Technical Monitor)
1994-01-01
We present views and analysis of the execution of several PVM codes for Computational Fluid Dynamics on a network of Sparcstations, including (a) NAS Parallel benchmarks CG and MG (White, Alund and Sunderam 1993); (b) a multi-partitioning algorithm for NAS Parallel Benchmark SP (Wijngaart 1993); and (c) an overset grid flowsolver (Smith 1993). These views and analysis were obtained using our Automated Instrumentation and Monitoring System (AIMS) version 3.0, a toolkit for debugging the performance of PVM programs. We will describe the architecture, operation and application of AIMS. The AIMS toolkit contains (a) Xinstrument, which can automatically instrument various computational and communication constructs in message-passing parallel programs; (b) Monitor, a library of run-time trace-collection routines; (c) VK (Visual Kernel), an execution-animation tool with source-code clickback; and (d) Tally, a tool for statistical analysis of execution profiles. Currently, Xinstrument can handle C and Fortran77 programs using PVM 3.2.x; Monitor has been implemented and tested on Sun 4 systems running SunOS 4.1.2; and VK uses X11R5 and Motif 1.2. Data and views obtained using AIMS clearly illustrate several characteristic features of executing parallel programs on networked workstations: (a) the impact of long message latencies; (b) the impact of multiprogramming overheads and associated load imbalance; (c) cache and virtual-memory effects; and (d) significant skews between workstation clocks. Interestingly, AIMS can compensate for constant skew (zero drift) by calibrating the skew between a parent and its spawned children. In addition, AIMS' skew-compensation algorithm can adjust timestamps in a way that eliminates physically impossible communications (e.g., messages going backwards in time). Our current efforts are directed toward creating new views to explain the observed performance of PVM programs.
Some of the features planned for the near future include: (a) ConfigView, showing the physical topology of the virtual machine, inferred using specially formatted IP (Internet Protocol) packets; and (b) LoadView, synchronous animation of PVM-program execution and resource-utilization patterns.
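The constant-skew (zero-drift) compensation between a parent and a spawned child can be illustrated with a round-trip offset estimate. This is a minimal sketch under a symmetric-latency assumption, with hypothetical function names, not the actual AIMS algorithm.

```python
def estimate_offset(t0, t1, t2, t3):
    # Round trip between parent and child, each timestamped on its local clock:
    # t0 parent sends, t1 child receives, t2 child replies, t3 parent receives.
    # With zero drift and symmetric latency, the child's clock offset is the
    # average of the two one-way discrepancies.
    return ((t1 - t0) + (t2 - t3)) / 2.0

def rebase_child_timestamps(events, offset):
    # Shift child-side trace timestamps onto the parent clock, so that no
    # message appears to travel backwards in time.
    return [(name, ts - offset) for name, ts in events]
```

For example, if the child clock runs 5 s ahead and the one-way latency is 0.1 s, the estimate recovers the 5 s offset exactly, and a receive event rebased onto the parent clock lands after the corresponding send.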
The Impact of New Information Technology on Bureaucratic Organizational Culture
ERIC Educational Resources Information Center
Givens, Mark A.
2011-01-01
Virtual work environments (VWEs) have been used in the private sector for more than a decade, but the United States Marine Corps (USMC), as a whole, has not yet taken advantage of associated benefits. The USMC construct parallels the bureaucratic organizational culture and uses an antiquated information technology (IT) infrastructure. During an…
2016-09-26
Intelligent Automation Incorporated: Enhancements for a Dynamic Data Warehousing and Mining System for Large-Scale Human Social Cultural Behavioral (HSBC) Data (contract N00014-16-P-3014). Representative Media Gallery View. We apply Scraawl's NER algorithm to the text associated with YouTube posts, which classifies the named entities into
Tauk-Tornisielo, Sâmia M.; Arasato, Luciana S.; de Almeida, Alex F.; Govone, José S.; Malagutti, Eleni N.
2009-01-01
The fungal strains were tested in a Bioscreen automated system to select the best nutritional source. Subsequently, shaken submerged cultures were studied in media containing a sole carbon or nitrogen source. The growth of these strains improved in media containing vegetable oil, with a high concentration of lipids. The highest concentration of γ-linolenic acid was obtained with M. circinelloides in culture containing sesame oil. PMID:24031370
Doi, Kentaro; Tanaka, Shinsuke; Iida, Hideo; Eto, Hitomi; Kato, Harunosuke; Aoi, Noriyuki; Kuno, Shinichiro; Hirohi, Toshitsugu; Yoshimura, Kotaro
2013-11-01
The heterogeneous stromal vascular fraction (SVF), containing adipose-derived stem/progenitor cells (ASCs), can be easily isolated through enzymatic digestion of aspirated adipose tissue. In clinical settings, however, strict control of technical procedures according to standard operating procedures and validation of cell-processing conditions are required. Therefore, we evaluated the efficiency and reliability of an automated system for SVF isolation from adipose tissue. SVF cells, freshly isolated using the automated procedure, showed comparable number and viability to those from manual isolation. Flow cytometric analysis confirmed an SVF cell composition profile similar to that after manual isolation. In addition, the ASC yield after 1 week in culture was also not significantly different between the two groups. Our clinical study, in which SVF cells isolated with the automated system were transplanted with aspirated fat tissue for soft tissue augmentation/reconstruction in 42 patients, showed satisfactory outcomes with no serious side-effects. Taken together, our results suggested that the automated isolation system is as reliable a method as manual isolation and may also be useful in clinical settings. Automated isolation is expected to enable cell-based clinical trials in small facilities with an aseptic room, without the necessity of a good manufacturing practice-level cell processing area. Copyright © 2012 John Wiley & Sons, Ltd.
Rienzi, L; Vajta, G; Ubaldi, F
2011-09-01
During the past decades, improvements in culture of preimplantation embryos have contributed substantially to the success of human assisted reproductive techniques. However, most efforts were focused on optimization of media and gas components, while the established physical conditions and applied devices have remained essentially unchanged. Very recently, however, intensive research has been started to provide a more appropriate environment for the embryos and to replace the rather primitive and inappropriate devices with more sophisticated and practical instruments. Success has been reported with simple or sophisticated tools (microwells or microchannels) that allow accumulation of autocrine factors and establishment of a proper microenvironment for embryos cultured individually or in groups. The microchannel system may also offer a certain level of automation and increased standardization of culture parameters. Continuous monitoring of individual embryos by optical or biochemical methods may help to determine the optimal day of transfer, and selection of the embryo with the highest developmental competence for transfer. This advancement may eventually lead to adjustment of the culture environment to each individual embryo according to its actual needs. Connection of these techniques to additional radical approaches, such as automated ICSI or ultimate assisted hatching with full removal of the zona pellucida after or even before fertilization, may result in devices with high reliability and consistency, increase the overall efficiency and decrease the work-intensity, and eliminate the existing technological gap between laboratory embryology work and most other fields of biomedical sciences. Copyright © 2011 Elsevier Ltd. All rights reserved.
Driving out errors through tight integration between software and automation.
Reifsteck, Mark; Swanson, Thomas; Dallas, Mary
2006-01-01
A clear case has been made for using clinical IT to improve medication safety, particularly bar-code point-of-care medication administration and computerized practitioner order entry (CPOE) with clinical decision support. The equally important role of automation has been overlooked. When the two are tightly integrated, with pharmacy information serving as a hub, the distinctions between software and automation become blurred. A true end-to-end medication management system drives out errors from the dockside to the bedside. Presbyterian Healthcare Services in Albuquerque has been building such a system since 1999, beginning by automating pharmacy operations to support bar-coded medication administration. Encouraged by those results, it then began layering on software to further support clinician workflow and improve communication, culminating with the deployment of CPOE and clinical decision support. This combination, plus a hard-wired culture of safety, has resulted in a dramatically lower mortality and harm rate that could not have been achieved with a partial solution.
Donald Campbell's doubt: cultural difference or failure of communication?
Shweder, Richard A
2010-06-01
The objection, rightfully noted but then dismissed by Henrich et al., that the observed variation across populations "may be due to various methodological artifacts that arise from translating experiments across contexts" is a theoretically profound and potentially constructive criticism. It parallels Donald Campbell's concern that many cultural differences reported by psychologists "come from failures of communication misreported as differences." Ironically, Campbell's doubt is a good foundation for investigations in cultural psychology.
Image-Based Single Cell Profiling: High-Throughput Processing of Mother Machine Experiments
Sachs, Christian Carsten; Grünberger, Alexander; Helfrich, Stefan; Probst, Christopher; Wiechert, Wolfgang; Kohlheyer, Dietrich; Nöh, Katharina
2016-01-01
Background Microfluidic lab-on-chip technology combined with live-cell imaging has enabled the observation of single cells in their spatio-temporal context. The mother machine (MM) cultivation system is particularly attractive for the long-term investigation of rod-shaped bacteria, since it facilitates continuous cultivation and observation of individual cells over many generations in a highly parallelized manner. To date, the lack of fully automated image analysis software limits the practical applicability of the MM as a phenotypic screening tool. Results We present an image analysis pipeline for the automated processing of MM time-lapse image stacks. The pipeline supports all analysis steps, i.e., image registration, orientation correction, channel/cell detection, cell tracking, and result visualization. Tailored algorithms account for the specialized MM layout to enable a robust automated analysis. Image data generated in a two-day growth study (≈ 90 GB) is analyzed in ≈ 30 min, with negligible differences in growth rate between automated and manual evaluation. The proposed methods are implemented in the software molyso (MOther machine AnaLYsis SOftware), which provides a new profiling tool for the unbiased analysis of hitherto inaccessible large-scale MM image stacks. Conclusion We present molyso, ready-to-use open-source software (BSD-licensed) for the unsupervised analysis of MM time-lapse image stacks. molyso source code and user manual are available at https://github.com/modsim/molyso. PMID:27661996
An Economic Evaluation of Colorectal Cancer Screening in Primary Care Practice
Meenan, Richard T.; Anderson, Melissa L.; Chubak, Jessica; Vernon, Sally W.; Fuller, Sharon; Wang, Ching-Yun; Green, Beverly B.
2015-01-01
Introduction Recent colorectal cancer screening studies focus on optimizing adherence. This study evaluated the cost effectiveness of interventions using electronic health records (EHRs), automated mailings, and stepped support increases to improve 2-year colorectal cancer screening adherence. Methods Analyses were based on a parallel-design, randomized trial in which three stepped interventions (EHR-linked mailings [“automated”], automated plus telephone assistance [“assisted”], or automated and assisted plus nurse navigation to testing completion or refusal [“navigated”]) were compared to usual care. Data were from August 2008–November 2011 with analyses performed during 2012–2013. Implementation resources were micro-costed; research and registry development costs were excluded. Incremental cost-effectiveness ratios (ICERs) were based on number of participants current for screening per guidelines over 2 years. Bootstrapping examined robustness of results. Results Intervention delivery cost per participant current for screening ranged from $21 (automated) to $27 (navigated). Inclusion of induced testing costs (e.g., screening colonoscopy) lowered expenditures for automated (ICER=−$159) and assisted (ICER=−$36) relative to usual care over 2 years. Savings arose from increased fecal occult blood testing, substituting for more expensive colonoscopies in usual care. Results were broadly consistent across demographic subgroups. More intensive interventions were consistently likely to be cost effective relative to less intensive interventions, with willingness to pay values of $600–$1,200 for an additional person current for screening yielding ≥80% probability of cost effectiveness. Conclusions Two-year cost effectiveness of a stepped approach to colorectal cancer screening promotion based on EHR data is indicated, but longer-term cost effectiveness requires further study. PMID:25998922
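The ICER reported above has a simple arithmetic form: incremental cost divided by incremental effectiveness. A small sketch with hypothetical numbers (the function and its inputs are illustrative, not the study's data):

```python
def icer(cost_intervention, cost_usual, effect_intervention, effect_usual):
    # Incremental cost-effectiveness ratio: extra dollars spent per additional
    # unit of effect (here, per additional participant current for screening).
    return (cost_intervention - cost_usual) / (effect_intervention - effect_usual)

# Hypothetical example: the intervention costs $300 vs $100 for usual care and
# keeps 4 vs 2 participants current for screening.
example = icer(300.0, 100.0, 4.0, 2.0)
```

A negative ICER with greater effectiveness, as reported for the automated and assisted arms, means the intervention both saved money and improved adherence relative to usual care.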
NASA Technical Reports Server (NTRS)
Pappas, D.; Jeevarajan, A.; Anderson, M. M.
2004-01-01
Compact and automated sensors are desired for assessing the health of cell cultures in biotechnology experiments in microgravity. Measurement of cell culture medium allows for the optimization of culture conditions on orbit to maximize cell growth and minimize unnecessary exchange of medium. While several discrete sensors exist to measure culture health, a multi-parameter sensor would simplify the experimental apparatus. One such sensor, the Paratrend 7, consists of three optical fibers for measuring pH, dissolved oxygen (p02), dissolved carbon dioxide (pC02), and a thermocouple to measure temperature. The sensor bundle was designed for intra-arterial placement in clinical patients, and potentially can be used in NASA's Space Shuttle and International Space Station biotechnology program bioreactors. Methods: A Paratrend 7 sensor was placed at the outlet of a rotating-wall perfused vessel bioreactor system inoculated with BHK-21 (baby hamster kidney) cells. Cell culture medium (GTSF-2, composed of 40% minimum essential medium, 60% L-15 Leibovitz medium) was manually measured using a bench top blood gas analyzer (BGA, Ciba-Corning). Results: A Paratrend 7 sensor was used over a long-term (>120 day) cell culture experiment. The sensor was able to track changes in cell medium pH, p02, and pC02 due to the consumption of nutrients by the BHK-21 cells. When compared to manually obtained BGA measurements, the sensor had good agreement for pH, p02, and pC02 with bias [and precision] of 0.02 [0.15], 1 mm Hg [18 mm Hg], and -4.0 mm Hg [8.0 mm Hg] respectively. The Paratrend oxygen sensor was recalibrated (offset) periodically due to drift. The bias for the raw (no offset or recalibration) oxygen measurements was 42 mm Hg [38 mm Hg]. The measured response (rise) time of the sensor was 20 +/- 4s for pH, 81 +/- 53s for pC02, 51 +/- 20s for p02. For long-term cell culture measurements, these response times are more than adequate.
Based on these findings, the Paratrend sensor could offer automated, continuous monitoring of cell cultures with a temporal resolution of 1 minute, which is not attainable by sampling via handheld blood analyzer (i-STAT). Conclusion: The resulting bias and precision found in these cell culture-based studies is comparable to Paratrend sensor clinical results. Although the large error in p02 measurements (+/-18 mm Hg) may be acceptable for clinical applications, where Paratrend values are periodically adjusted to a BGA measurement, the O2 sensor in this bundle may not be reliable enough for the single-calibration requirement of sensors used in NASA's bioreactors. The pH and pC02 sensors in the bundle are reliable and stable over the measurement period, and can be used without recalibration to measure cell cultures in microgravity biotechnology experiments. Future work will test additional Paratrend sensors to provide statistical assessment of sensor performance.
Low-intensity, stocker-based channel catfish culture
USDA-ARS?s Scientific Manuscript database
Low-intensity Channel Catfish production is characterized by low stocking rates, low installed aeration capacity, and no automated dissolved oxygen monitoring. Two studies conducted in nine 0.25-acre ponds quantified production characteristics of stocker Channel Catfish stocked for low-intensity foo...
Seo, Hyun Il; Lee, Dae Sung; Yoon, Eun Mi; Kwon, Min-Jung; Park, Hyosoon; Jung, Yoon Suk; Park, Jung Ho; Sohn, Chong Il
2016-01-01
Background/Aims To prevent the transmission of pathogens by endoscopes, following established reprocessing guidelines is critical. An ideal reprocessing step is simple, fast, and inexpensive. Here, we evaluated and compared the efficacy and safety of two disinfectants, a tertiary amine compound (TAC) and ortho-phthalaldehyde (OPA). Methods A total of 100 colonoscopes were randomly reprocessed using two identical automated endoscope reprocessors, one for each disinfectant. The exposure time was 10 minutes for 0.55% OPA (Cidex® OPA, Johnson & Johnson) and 5 minutes for 4% TAC (Sencron2®, Bab Gencel Pharma & Chemical Ind. Co.). Three culture samples were obtained from each colonoscope after reprocessing. Results A total of nine samples were positive among the 300 culture samples. The positive culture rate was not statistically different between the two groups (4% for OPA and 2% for TAC, P=0.501). There were no incidents related to safety during the study period. Conclusions TAC was non-inferior in terms of reprocessing efficacy to OPA and was safe to use. Therefore, TAC seems to be a good alternative disinfectant with a relatively short exposure time and is also less expensive than OPA. PMID:27175119
Evolution of a minimal parallel programming model
Lusk, Ewing; Butler, Ralph; Pieper, Steven C.
2017-04-30
Here, we take a historical approach to our presentation of self-scheduled task parallelism, a programming model with its origins in early irregular and nondeterministic computations encountered in automated theorem proving and logic programming. We show how an extremely simple task model has evolved into a system, asynchronous dynamic load balancing (ADLB), and a scalable implementation capable of supporting sophisticated applications on today’s (and tomorrow’s) largest supercomputers; and we illustrate the use of ADLB with a Green’s function Monte Carlo application, a modern, mature nuclear physics code in production use. Our lesson is that by surrendering a certain amount of generality and thus applicability, a minimal programming model (in terms of its basic concepts and the size of its application programmer interface) can achieve extreme scalability without introducing complexity.
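The self-scheduled task model can be sketched as a shared work pool from which workers pull the next task as soon as they finish the previous one, so the load balances dynamically without a static schedule. This is a minimal thread-based illustration; ADLB itself is an MPI library, and the names here are hypothetical.

```python
import queue
import threading

def run_self_scheduled(tasks, worker_fn, n_workers=4):
    # Shared work pool: idle workers grab the next available task,
    # so irregular task costs balance out automatically.
    q = queue.Queue()
    for t in tasks:
        q.put(t)
    results = []
    lock = threading.Lock()

    def worker():
        while True:
            try:
                t = q.get_nowait()
            except queue.Empty:
                return  # pool drained: worker exits
            r = worker_fn(t)
            with lock:
                results.append(r)

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for th in threads:
        th.start()
    for th in threads:
        th.join()
    return results
```

Result order is nondeterministic, reflecting the nondeterministic scheduling the abstract describes; only the multiset of results is fixed.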
An Optimizing Compiler for Petascale I/O on Leadership Class Architectures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Choudhary, Alok; Kandemir, Mahmut
In high-performance computing systems, parallel I/O architectures usually have very complex hierarchies with multiple layers that collectively constitute an I/O stack, including high-level I/O libraries such as PnetCDF and HDF5, I/O middleware such as MPI-IO, and parallel file systems such as PVFS and Lustre. Our project explored automated instrumentation and compiler support for I/O-intensive applications. It made significant progress towards understanding the complex I/O hierarchies of high-performance storage systems (including storage caches, HDDs, and SSDs), and towards designing and implementing state-of-the-art compiler/runtime technology for I/O-intensive HPC applications on leadership-class machines. This final report summarizes the major achievements of the project and points out promising future directions.
Performance Evaluation of Evasion Maneuvers for Parallel Approach Collision Avoidance
NASA Technical Reports Server (NTRS)
Winder, Lee F.; Kuchar, James K.; Waller, Marvin (Technical Monitor)
2000-01-01
Current plans for independent instrument approaches to closely spaced parallel runways call for an automated pilot alerting system to ensure separation of aircraft in the case of a "blunder," or unexpected deviation from the normal approach path. Resolution advisories by this system would require the pilot of an endangered aircraft to perform a trained evasion maneuver. The potential performance of two evasion maneuvers, referred to as the "turn-climb" and "climb-only," was estimated using an experimental NASA alerting logic (AILS) and a computer simulation of relative trajectory scenarios between two aircraft. One aircraft was equipped with the NASA alerting system, and maneuvered accordingly. Observation of the rates of different types of alerting failure allowed judgement of evasion maneuver performance. System Operating Characteristic (SOC) curves were used to assess the benefit of alerting with each maneuver.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sweatman, W.J.; Brandon, D.R.; Cranstone, S.
The preparation of indium-111 tropolonate-radiolabeled guinea pig peripheral mixed white cells (greater than 80% neutrophils) is described. Autologous rather than homologous cells are required to provide a population of labeled, functional cells on reintroduction to the animals. Surgery has been shown to result in a profound neutropenia from which the animals must recover before removal of blood for cell preparation. The response of radiolabeled cells parallels that of the unlabeled cell population to a chemotaxin, leukotriene B4. This material causes a profound neutropenia of rapid onset accompanied by a parallel fall in blood radioactivity. The fall in circulating radiolabel is accompanied by an increase in radioactivity in the thoracic region. These changes have been monitored externally using an automated isotope monitoring system.
Parallelism in integrated fluidic circuits
NASA Astrophysics Data System (ADS)
Bousse, Luc J.; Kopf-Sill, Anne R.; Parce, J. W.
1998-04-01
Many research groups around the world are working on integrated microfluidics. The goal of these projects is to automate and integrate the handling of liquid samples and reagents for measurement and assay procedures in chemistry and biology. Ultimately, it is hoped that this will lead to a revolution in chemical and biological procedures similar to that caused in electronics by the invention of the integrated circuit. The optimal size scale of channels for liquid flow is determined by basic constraints to be somewhere between 10 and 100 micrometers. In larger channels, mixing by diffusion takes too long; in smaller channels, the number of molecules present is so low it makes detection difficult. At Caliper, we are making fluidic systems in glass chips with channels in this size range, based on electroosmotic flow, and fluorescence detection. One application of this technology is rapid assays for drug screening, such as enzyme assays and binding assays. A further challenge in this area is to perform multiple functions on a chip in parallel, without a large increase in the number of inputs and outputs. A first step in this direction is a fluidic serial-to-parallel converter. Fluidic circuits will be shown with the ability to distribute an incoming serial sample stream to multiple parallel channels.
Representing and computing regular languages on massively parallel networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, M.I.; O'Sullivan, J.A.; Boysam, B.
1991-01-01
This paper proposes a general method for incorporating rule-based constraints corresponding to regular languages into stochastic inference problems, thereby allowing for a unified representation of stochastic and syntactic pattern constraints. The authors' approach first establishes the formal connection of rules to Chomsky grammars, and generalizes the original work of Shannon on the encoding of rule-based channel sequences to Markov chains of maximum entropy. This maximum entropy probabilistic view leads to Gibbs representations with potentials whose number of minima grows at precisely the exponential rate at which the language of deterministically constrained sequences grows. These representations are coupled to stochastic diffusion algorithms, which sample the language-constrained sequences by visiting the energy minima according to the underlying Gibbs probability law. The coupling to stochastic search methods yields the all-important practical result that fully parallel stochastic cellular automata may be derived to generate samples from the rule-based constraint sets. The production rules and neighborhood state structure of the language of sequences directly determine the necessary connection structures of the required parallel computing surface. Representations of this type have been mapped to the DAP-510 massively parallel processor, consisting of 1024 mesh-connected bit-serial processing elements, for performing automated segmentation of electron-micrograph images.
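The core idea of the abstract above, encoding rule violations as an energy and letting stochastic dynamics settle into the language-constrained minima, can be sketched on a toy regular language. This is a hypothetical single-site Metropolis illustration, not the authors' cellular-automata or DAP-510 implementation; the language "no two adjacent 1s" and all parameters are assumptions chosen for brevity.

```python
import math
import random

def energy(seq):
    """Potential counting violations of the rule 'no two adjacent 1s'.

    Strings of the regular language (0|01)* have zero energy; every
    adjacent 1-1 pair adds one unit, so minima = language members."""
    return sum(1 for a, b in zip(seq, seq[1:]) if a == 1 and b == 1)

def sample(seq, temperature, steps, rng):
    """Metropolis dynamics over binary sequences; at low temperature the
    chain concentrates on the energy minima, i.e. rule-satisfying strings."""
    seq = list(seq)
    e = energy(seq)
    for _ in range(steps):
        i = rng.randrange(len(seq))
        seq[i] ^= 1                      # propose a single-site flip
        e_new = energy(seq)
        if e_new <= e or rng.random() < math.exp((e - e_new) / temperature):
            e = e_new                    # accept the move
        else:
            seq[i] ^= 1                  # reject: undo the flip
    return seq, e

rng = random.Random(0)
start = [1] * 12                         # maximally rule-violating start state
final, e_final = sample(start, temperature=0.05, steps=2000, rng=rng)
print(e_final)
```

At low temperature the dynamics are nearly greedy, so the chain quickly descends from the all-ones start toward a zero-energy (language-constrained) configuration.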
The Construction of English: Culture, Consumerism and Promotion in the ELT Global Coursebook
ERIC Educational Resources Information Center
Gray, John
2010-01-01
This book takes the view that ELT global coursebooks, in addition to being curriculum artefacts, are also highly wrought cultural artefacts which seek to make English mean in highly selective ways, and it argues that the textual construction (and imaging) of English parallels the processes of commodity promotion more generally. This book contains…
ERIC Educational Resources Information Center
Palinkas, Lawrence A.; Garcia, Antonio; Aarons, Gregory; Finno-Velasquez, Megan; Fuentes, Dahlia; Holloway, Ian; Chamberlain, Patricia
2018-01-01
The Cultural Exchange Inventory (CEI) is a 15-item instrument designed to measure the process (7 items) and outcomes (8 items) of exchanges of knowledge, attitudes and practices between members of different organisations collaborating in implementing evidence-based practice. We conducted principal axis factor analyses and parallel analyses of data…
Tai, Mitchell; Ly, Amanda; Leung, Inne; Nayar, Gautam
2015-01-01
The burgeoning pipeline for new biologic drugs has increased the need for high-throughput process characterization to efficiently use process development resources. Breakthroughs in highly automated and parallelized upstream process development have led to technologies such as the 250-mL automated mini bioreactor (ambr250™) system. Furthermore, developments in modern design of experiments (DoE) have promoted the use of definitive screening design (DSD) as an efficient method to combine factor screening and characterization. Here we utilize the 24-bioreactor ambr250™ system with 10-factor DSD to demonstrate a systematic experimental workflow to efficiently characterize an Escherichia coli (E. coli) fermentation process for recombinant protein production. The generated process model is further validated by laboratory-scale experiments and shows how the strategy is useful for quality by design (QbD) approaches to control strategies for late-stage characterization. © 2015 American Institute of Chemical Engineers.
Ghosh, Arup; Qin, Shiming; Lee, Jooyeoun; Wang, Gi-Nam
2016-01-01
Operational faults and behavioural anomalies associated with PLC control processes take place often in a manufacturing system. Real time identification of these operational faults and behavioural anomalies is necessary in the manufacturing industry. In this paper, we present an automated tool, called PLC Log-Data Analysis Tool (PLAT) that can detect them by using log-data records of the PLC signals. PLAT automatically creates a nominal model of the PLC control process and employs a novel hash table based indexing and searching scheme to satisfy those purposes. Our experiments show that PLAT is significantly fast, provides real time identification of operational faults and behavioural anomalies, and can execute within a small memory footprint. In addition, PLAT can easily handle a large manufacturing system with a reasonable computing configuration and can be installed in parallel to the data logging system to identify operational faults and behavioural anomalies effectively.
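The hash-table indexing idea described above can be illustrated with a minimal sketch: learn the set of signal-vector transitions present in nominal log-data, then check each live transition against that index in O(1) time per record. The three-bit signal vectors and the threshold-free flagging below are hypothetical simplifications, not PLAT's actual scheme.

```python
def build_nominal_model(log):
    """Index every observed (current, next) signal-vector pair.

    Tuples are hashable, so each later lookup is a single O(1)
    hash-table probe, which is what makes streaming, real-time
    checking of a large log feasible."""
    model = set()
    for prev, curr in zip(log, log[1:]):
        model.add((prev, curr))
    return model

def detect_anomalies(model, live_log):
    """Return the indices in live_log whose incoming transition was
    never seen in the nominal model."""
    return [i + 1 for i, (prev, curr) in enumerate(zip(live_log, live_log[1:]))
            if (prev, curr) not in model]

# Hypothetical 3-signal PLC vectors (e.g. motor, valve, sensor bits).
nominal = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 0, 0), (1, 0, 0), (1, 1, 0)]
model = build_nominal_model(nominal)
live = [(0, 0, 0), (1, 0, 0), (0, 1, 1), (1, 1, 0)]
print(detect_anomalies(model, live))   # → [2, 3]
```

Records 2 and 3 are flagged because the transitions into and out of the unseen vector (0, 1, 1) never occur in the nominal log.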
Automation of a N-S S and C Database Generation for the Harrier in Ground Effect
NASA Technical Reports Server (NTRS)
Murman, Scott M.; Chaderjian, Neal M.; Pandya, Shishir; Kwak, Dochan (Technical Monitor)
2001-01-01
A method of automating the generation of a time-dependent, Navier-Stokes static stability and control database for the Harrier aircraft in ground effect is outlined. Reusable, lightweight components are described which allow different facets of the computational fluid dynamic simulation process to utilize a consistent interface to a remote database. These components also allow changes and customizations to be easily incorporated into the solution process to enhance performance, without relying upon third-party support. An analysis of the multi-level parallel solver OVERFLOW-MLP is presented, and the results indicate that it is feasible to utilize large numbers of processors (≈100) even with a grid system with a relatively small number of cells (≈10^6). A more detailed discussion of the simulation process, as well as refined data for the scaling of the OVERFLOW-MLP flow solver, will be included in the full paper.
Automated solid-phase extraction workstations combined with quantitative bioanalytical LC/MS.
Huang, N H; Kagel, J R; Rossi, D T
1999-03-01
An automated solid-phase extraction workstation was used to develop, characterize and validate an LC/MS/MS method for quantifying a novel lipid-regulating drug in dog plasma. Method development was facilitated by workstation functions that allowed wash solvents of varying organic composition to be mixed and tested automatically. Precision estimates for this approach were within 9.8% relative standard deviation (RSD) across the calibration range. Accuracy for replicate determinations of quality controls was between -7.2 and +6.2% relative error (RE) over 5-1,000 ng/ml. Recoveries were evaluated for a wide variety of wash solvents, elution solvents and sorbents. Optimized recoveries were generally > 95%. A sample throughput benchmark for the method was approximately 8 min per sample. Because of parallel sample processing, 100 samples were extracted in less than 120 min. The approach has proven useful with LC/MS/MS, using a multiple reaction monitoring (MRM) approach.
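The throughput claim above is easy to sanity-check: at the benchmarked 8 min per sample, a strictly serial run of 100 samples would take 800 min, so finishing in under 120 min implies roughly a 6.7-fold effective parallelism. The figures are taken from the abstract; the snippet below is just that arithmetic.

```python
# Back-of-envelope check of the reported gain from parallel sample processing.
serial_min_per_sample = 8          # benchmarked single-sample cycle time (min)
n_samples = 100
batch_minutes = 120                # observed wall-clock time for the batch

serial_total = n_samples * serial_min_per_sample   # 800 min if run one-by-one
speedup = serial_total / batch_minutes             # effective parallelism
print(serial_total, round(speedup, 1))             # → 800 6.7
```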
SAND: an automated VLBI imaging and analysing pipeline - I. Stripping component trajectories
NASA Astrophysics Data System (ADS)
Zhang, M.; Collioud, A.; Charlot, P.
2018-02-01
We present our implementation of an automated very long baseline interferometry (VLBI) data-reduction pipeline that is dedicated to interferometric data imaging and analysis. The pipeline can handle massive VLBI data efficiently, which makes it an appropriate tool to investigate multi-epoch multiband VLBI data. Compared to traditional manual data reduction, our pipeline provides more objective results as less human intervention is involved. The source extraction is carried out in the image plane, while deconvolution and model fitting are performed in both the image plane and the uv plane for parallel comparison. The output from the pipeline includes catalogues of CLEANed images and reconstructed models, polarization maps, proper motion estimates, core light curves and multiband spectra. We have developed a regression STRIP algorithm to automatically detect linear or non-linear patterns in the jet component trajectories. This algorithm offers an objective method to match jet components at different epochs and to determine their proper motions.
The optimization of total laboratory automation by simulation of a pull-strategy.
Yang, Taho; Wang, Teng-Kuan; Li, Vincent C; Su, Chia-Lo
2015-01-01
Laboratory results are essential for physicians to diagnose medical conditions. Because of the critical role of medical laboratories, an increasing number of hospitals use total laboratory automation (TLA) to improve laboratory performance. Although the benefits of TLA are well documented, systems occasionally become congested, particularly when hospitals face peak demand. This study optimizes TLA operations. Firstly, value stream mapping (VSM) is used to identify the non-value-added time. Subsequently, batch processing control and parallel scheduling rules are devised and a pull mechanism that comprises a constant work-in-process (CONWIP) is proposed. Simulation optimization is then used to optimize the design parameters and to ensure a small inventory and a shorter average cycle time (CT). For empirical illustration, this approach is applied to a real case. The proposed methodology significantly improves the efficiency of laboratory work and leads to a reduction in patient waiting times and increased service level.
NASA Astrophysics Data System (ADS)
Ahmad, Habib; Sutherland, Alex; Shin, Young Shik; Hwang, Kiwook; Qin, Lidong; Krom, Russell-John; Heath, James R.
2011-09-01
Microfluidics flow-patterning has been utilized for the construction of chip-scale miniaturized DNA and protein barcode arrays. Such arrays have been used for specific clinical and fundamental investigations in which many proteins are assayed from single cells or other small sample sizes. However, flow-patterned arrays are hand-prepared, and so are impractical for broad applications. We describe an integrated robotics/microfluidics platform for the automated preparation of such arrays, and we apply it to the batch fabrication of up to eighteen chips of flow-patterned DNA barcodes. The resulting substrates are comparable in quality with hand-made arrays and exhibit excellent substrate-to-substrate consistency. We demonstrate the utility and reproducibility of robotics-patterned barcodes by utilizing two flow-patterned chips for highly parallel assays of a panel of secreted proteins from single macrophage cells.
NASA Astrophysics Data System (ADS)
Barnes, Brian C.; Leiter, Kenneth W.; Becker, Richard; Knap, Jaroslaw; Brennan, John K.
2017-07-01
We describe the development, accuracy, and efficiency of an automation package for molecular simulation, the large-scale atomic/molecular massively parallel simulator (LAMMPS) integrated materials engine (LIME). Heuristics and algorithms employed for equation of state (EOS) calculation using a particle-based model of a molecular crystal, hexahydro-1,3,5-trinitro-s-triazine (RDX), are described in detail. The simulation method for the particle-based model is energy-conserving dissipative particle dynamics, but the techniques used in LIME are generally applicable to molecular dynamics simulations with a variety of particle-based models. The newly created tool set is tested through use of its EOS data in plate impact and Taylor anvil impact continuum simulations of solid RDX. The coarse-grain model results from LIME provide an approach to bridge the scales from atomistic simulations to continuum simulations.
Automated high-throughput flow-through real-time diagnostic system
Regan, John Frederick
2012-10-30
An automated real-time flow-through system capable of processing multiple samples in an asynchronous, simultaneous, and parallel fashion for nucleic acid extraction and purification, followed by assay assembly, genetic amplification, multiplex detection, analysis, and decontamination. The system is able to hold and access an unlimited number of fluorescent reagents that may be used to screen samples for the presence of specific sequences. The apparatus works by associating extracted and purified sample with a series of reagent plugs that have been formed in a flow channel and delivered to a flow-through real-time amplification detector that has a multiplicity of optical windows, to which the sample-reagent plugs are placed in an operative position. The diagnostic apparatus includes sample multi-position valves, a master sample multi-position valve, a master reagent multi-position valve, reagent multi-position valves, and an optical amplification/detection system.
New Developments in the Field of Reaction Technology: The Multiparallel Reactor HPMR 50-96
Allwardt, Arne; Wendler, Christian; Thurow, Kerstin
2005-01-01
Catalytic high-pressure reactions play an important role in classic bulk chemistry. The optimization of common reactions, the search for new and more effective catalysts, and the increasing use of catalytic pressure reactions in the field of drug development call for highly parallel reaction systems. A crucial task of current developments, apart from the parameters of pressure, temperature, and number of reaction chambers, is the systems' integration into complex laboratory automation environments. PMID:18924722
A Bioluminometric Method of DNA Sequencing
NASA Technical Reports Server (NTRS)
Ronaghi, Mostafa; Pourmand, Nader; Stolc, Viktor; Arnold, Jim (Technical Monitor)
2001-01-01
Pyrosequencing is a bioluminometric single-tube DNA sequencing method that takes advantage of co-operativity between four enzymes to monitor DNA synthesis. In this sequencing-by-synthesis method, a cascade of enzymatic reactions yields detectable light, which is proportional to incorporated nucleotides. Pyrosequencing has the advantages of accuracy, flexibility and parallel processing. It can be easily automated. Furthermore, the technique dispenses with the need for labeled primers, labeled nucleotides and gel-electrophoresis. In this chapter, the use of this technique for different applications is discussed.
Current status and future prospects for enabling chemistry technology in the drug discovery process
Djuric, Stevan W.; Hutchins, Charles W.; Talaty, Nari N.
2016-01-01
This review covers recent advances in the implementation of enabling chemistry technologies into the drug discovery process. Areas covered include parallel synthesis chemistry, high-throughput experimentation, automated synthesis and purification methods, flow chemistry methodology including photochemistry, electrochemistry, and the handling of “dangerous” reagents. Also featured are advances in the “computer-assisted drug design” area and the expanding application of novel mass spectrometry-based techniques to a wide range of drug discovery activities. PMID:27781094
Boyle, M A; O'Donnell, M J; Russell, R J; Galvin, N; Swan, J; Coleman, D C
2015-10-01
Decontaminating dental chair unit (DCU) suction systems in a convenient, safe and effective manner is problematic. This study aimed to identify and quantify the extent of the problems using 25 DCUs, methodically eliminate these problems and develop an efficient approach for reliable, effective, automated disinfection. DCU suction system residual contamination by environmental and human-derived bacteria was evaluated by microbiological culture following standard aspiration disinfection with a quaternary ammonium disinfectant or alternatively, a novel flooding approach to disinfection. Disinfection of multicomponent suction handpieces, assembled and disassembled, was also studied. A prototype manual and a novel automated Suction Tube Cleaning System (STCS) were developed and tested, as were novel single component suction handpieces. Standard aspiration disinfection consistently failed to decontaminate DCU suction systems effectively. Semi-confluent bacterial growth (101-500 colony forming units (CFU) per culture plate) was recovered from up to 60% of suction filter housings and from up to 19% of high and 37% of low volume suction hoses. Manual and automated flood disinfection of DCU suction systems reduced this dramatically (ranges for filter cage and high and low volume hoses of 0-22, 0-16 and 0-14CFU/plate, respectively) (P<0.0001). Multicomponent suction handpieces could not be adequately disinfected without prior removal and disassembly. Novel single component handpieces, allowed their effective disinfection in situ using the STCS, which virtually eliminated contamination from the entire suction system. Flood disinfection of DCU suction systems and single component handpieces radically improves disinfection efficacy and considerably reduces potential cross-infection and cross-contamination risks. DCU suction systems become heavily contaminated during use. Conventional disinfection does not adequately control this. 
Furthermore, multicomponent suction handpieces cannot be adequately disinfected without disassembly, which is costly in time, staff and resources. The automated STCS DCU suction disinfection system used with single component handpieces provides an effective solution. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
Scalable Device for Automated Microbial Electroporation in a Digital Microfluidic Platform.
Madison, Andrew C; Royal, Matthew W; Vigneault, Frederic; Chen, Liji; Griffin, Peter B; Horowitz, Mark; Church, George M; Fair, Richard B
2017-09-15
Electrowetting-on-dielectric (EWD) digital microfluidic laboratory-on-a-chip platforms demonstrate excellent performance in automating labor-intensive protocols. When coupled with an on-chip electroporation capability, these systems hold promise for streamlining cumbersome processes such as multiplex automated genome engineering (MAGE). We integrated a single Ti:Au electroporation electrode into an otherwise standard parallel-plate EWD geometry to enable high-efficiency transformation of Escherichia coli with reporter plasmid DNA in a 200 nL droplet. Test devices exhibited robust operation with more than 10 transformation experiments performed per device without cross-contamination or failure. Despite intrinsic electric-field nonuniformity present in the EP/EWD device, the peak on-chip transformation efficiency was measured to be 8.6 ± 1.0 × 10⁸ cfu·μg⁻¹ for an average applied electric field strength of 2.25 ± 0.50 kV·mm⁻¹. Cell survival and transformation fractions at this electroporation pulse strength were found to be 1.5 ± 0.3 and 2.3 ± 0.1%, respectively. Our work expands the EWD toolkit to include on-chip microbial electroporation and opens the possibility of scaling advanced genome engineering methods, like MAGE, into the submicroliter regime.
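Transformation efficiency in cfu·μg⁻¹ is conventionally computed from the colony count, the dilution plated, and the mass of plasmid DNA used. The sketch below shows that standard formula, not the authors' exact protocol; the plate numbers are hypothetical, chosen only to land near the reported 8.6 × 10⁸ cfu·μg⁻¹.

```python
def transformation_efficiency(colonies, dilution_factor, plated_fraction, dna_ug):
    """Standard cfu-per-microgram estimate: scale the colony count back up
    by the dilution and the fraction of the recovery volume plated, then
    divide by the DNA mass used in the electroporation."""
    total_cfu = colonies * dilution_factor / plated_fraction
    return total_cfu / dna_ug

# Hypothetical plate: 86 colonies from a 10^5-fold dilution, the whole
# dilution plated, 0.01 ug of reporter plasmid in the droplet.
eff = transformation_efficiency(colonies=86, dilution_factor=1e5,
                                plated_fraction=1.0, dna_ug=0.01)
print(f"{eff:.1e}")   # → 8.6e+08
```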
Al Sidairi, Hilal; Binkhamis, Khalifa; Jackson, Colleen; Roberts, Catherine; Heinstein, Charles; MacDonald, Jimmy; Needle, Robert; Hatchette, Todd F; LeBlanc, Jason J
2017-11-01
Serology remains the mainstay for diagnosis of Epstein-Barr virus (EBV) infection. This study compared two automated platforms (BioPlex 2200 and Architect i2000SR) to test three EBV serological markers: viral capsid antigen (VCA) immunoglobulins of class M (IgM), VCA immunoglobulins of class G (IgG) and EBV nuclear antigen-1 (EBNA-1) IgG. Using sera from 65 patients at various stages of EBV disease, BioPlex demonstrated near-perfect agreement for all EBV markers compared to a consensus reference. The agreement for Architect was near-perfect for VCA IgG and EBNA-1 IgG, and substantial for VCA IgM despite five equivocal results. Since the majority of testing in our hospital was from adults with EBNA-1 IgG positive results, post-implementation analysis of an EBNA-based algorithm showed advantages over parallel testing of the three serologic markers. This small verification demonstrated that both automated systems for EBV serology had good performance for all EBV markers, and an EBNA-based testing algorithm is ideal for an adult hospital.
Quintero, Catherine; Kariv, Ilona
2009-06-01
To meet the needs of the increasingly rapid and parallelized lead optimization process, a fully integrated local compound storage and liquid handling system was designed and implemented to automate the generation of assay-ready plates directly from newly submitted and cherry-picked compounds. A key feature of the system is the ability to create project- or assay-specific compound-handling methods, which provide flexibility for any combination of plate types, layouts, and plate bar-codes. Project-specific workflows can be created by linking methods for processing new and cherry-picked compounds and control additions to produce a complete compound set for both biological testing and local storage in one uninterrupted workflow. A flexible cherry-pick approach allows for multiple, user-defined strategies to select the most appropriate replicate of a compound for retesting. Examples of custom selection parameters include available volume, compound batch, and number of freeze/thaw cycles. This adaptable and integrated combination of software and hardware provides a basis for reducing cycle time, fully automating compound processing, and ultimately increasing the rate at which accurate, biologically relevant results can be produced for compounds of interest in the lead optimization process.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tsunoda, Hirokazu; Sato, Osamu; Okajima, Shigeaki
2002-07-01
In order to achieve fully automated reactor operation of the RAPID-L reactor, innovative reactivity control systems LEM, LIM, and LRM are equipped with lithium-6 as a liquid poison. Because lithium-6 has not been used as a neutron-absorbing material in conventional fast reactors, measurements of the reactivity worth of lithium-6 were performed at the Fast Critical Assembly (FCA) of the Japan Atomic Energy Research Institute (JAERI). The FCA core was composed of highly enriched uranium and stainless steel samples so as to simulate the core spectrum of RAPID-L. The samples of 95% enriched lithium-6 were inserted into the core parallel to the core axis for the measurement of the reactivity worth at each position. It was found that the measured reactivity worth in the core region agreed well with the value calculated by the method used for the core design of RAPID-L. Bias factors for the core design method were obtained by comparing experimental and calculated results. The factors were used to determine the number of LEMs and LIMs equipped in the core to achieve fully automated operation of RAPID-L. (authors)
Automating quantum experiment control
NASA Astrophysics Data System (ADS)
Stevens, Kelly E.; Amini, Jason M.; Doret, S. Charles; Mohler, Greg; Volin, Curtis; Harter, Alexa W.
2017-03-01
The field of quantum information processing is rapidly advancing. As the control of quantum systems approaches the level needed for useful computation, the physical hardware underlying the quantum systems is becoming increasingly complex. It is already becoming impractical to manually code control for the larger hardware implementations. In this chapter, we will employ an approach to the problem of system control that parallels compiler design for a classical computer. We will start with a candidate quantum computing technology, the surface electrode ion trap, and build a system instruction language which can be generated from a simple machine-independent programming language via compilation. We incorporate compile time generation of ion routing that separates the algorithm description from the physical geometry of the hardware. Extending this approach to automatic routing at run time allows for automated initialization of qubit number and placement and additionally allows for automated recovery after catastrophic events such as qubit loss. To show that these systems can handle real hardware, we present a simple demonstration system that routes two ions around a multi-zone ion trap and handles ion loss and ion placement. While we will mainly use examples from transport-based ion trap quantum computing, many of the issues and solutions are applicable to other architectures.
Park, Min Cheol; Kim, Moojong; Lim, Gun Taek; Kang, Sung Min; An, Seong Soo A; Kim, Tae Song; Kang, Ji Yoon
2016-06-21
Multiwell plates are regularly used in analytical research and clinical diagnosis but often require laborious washing steps and large sample or reagent volumes (typically, 100 μL per well). To overcome such drawbacks of the conventional multiwell plate, we present a novel microchannel-connected multiwell plate (μCHAMP) that can be used for automated disease biomarker detection in a small sample volume by performing droplet-based magnetic bead immunoassay inside the plate. In this μCHAMP-based immunoassay platform, small volumes (30-50 μL) of aqueous-phase working droplets are stably confined within each well by the simple microchannel structure (200-300 μm in height and 0.5-1 mm in width), and magnetic beads are exclusively transported into an adjacent droplet through the oil-filled microchannels, assisted by a magnet array aligned beneath and controlled by an XY-motorized stage. Using this μCHAMP-based platform, we were able to perform parallel detection of synthetic amyloid beta (Aβ) oligomers as a model analyte for the early diagnosis of Alzheimer's disease (AD). This platform greatly simplified the laborious and consumptive immunoassay procedure by achieving automated parallel immunoassay (32 assays per operation in a 3-well-connected 96-well plate) within 1 hour and at low sample consumption (less than 10 μL per assay) with no cumbersome manual washing step. Moreover, it could detect synthetic Aβ oligomers even at concentrations below 10 pg mL⁻¹, with a calculated detection limit of ∼3 pg mL⁻¹. Therefore, the μCHAMP and droplet-based magnetic bead immunoassay, with the combination of an XY-motorized magnet array, would be a useful platform in the diagnosis of human diseases, including AD, which require low consumption of the patient's body fluid sample and automation of the entire immunoassay procedure for high processing capacity.
A Parallel Genetic Algorithm for Automated Electronic Circuit Design
NASA Technical Reports Server (NTRS)
Long, Jason D.; Colombano, Silvano P.; Haith, Gary L.; Stassinopoulos, Dimitris
2000-01-01
Parallelized versions of genetic algorithms (GAs) are popular primarily for three reasons: the GA is an inherently parallel algorithm, typical GA applications are very compute intensive, and powerful computing platforms, especially Beowulf-style computing clusters, are becoming more affordable and easier to implement. In addition, the low communication bandwidth required allows the use of inexpensive networking hardware such as standard office ethernet. In this paper we describe a parallel GA and its use in automated high-level circuit design. Genetic algorithms are a type of trial-and-error search technique that are guided by principles of Darwinian evolution. Just as the genetic material of two living organisms can intermix to produce offspring that are better adapted to their environment, GAs expose genetic material, frequently strings of 1s and 0s, to the forces of artificial evolution: selection, mutation, recombination, etc. GAs start with a pool of randomly-generated candidate solutions which are then tested and scored with respect to their utility. Solutions are then bred by probabilistically selecting high-quality parents and recombining their genetic representations to produce offspring solutions. Offspring are typically subjected to a small amount of random mutation. After a pool of offspring is produced, this process iterates until a satisfactory solution is found or an iteration limit is reached. Genetic algorithms have been applied to a wide variety of problems in many fields, including chemistry, biology, and many engineering disciplines. There are many styles of parallelism used in implementing parallel GAs. One such method is called the master-slave or processor farm approach. In this technique, slave nodes are used solely to compute fitness evaluations (the most time consuming part). The master processor collects fitness scores from the nodes and performs the genetic operators (selection, reproduction, variation, etc.).
Because of dependency issues in the GA, it is possible to have idle processors. However, as long as the load at each processing node is similar, the processors are kept busy nearly all of the time. In applying GAs to circuit design, a suitable genetic representation is that of a circuit-construction program. We discuss one such circuit-construction programming language and show how evolution can generate useful analog circuit designs. This language has the desirable property that virtually all sets of combinations of primitives result in valid circuit graphs. Our system allows circuit size (number of devices), circuit topology, and device values to be evolved. Using a parallel genetic algorithm and circuit simulation software, we present experimental results as applied to three analog filter and two amplifier design tasks. For example, a figure shows an 85 dB amplifier design evolved by our system, and another figure shows the performance of that circuit (gain and frequency response). In all tasks, our system is able to generate circuits that achieve the target specifications.
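The master-slave (processor-farm) scheme described above can be sketched in a few lines. This is a toy illustration, not the authors' code: the fitness function is a trivial bit count standing in for an expensive circuit simulation, and a thread pool stands in for the slave nodes of a Beowulf cluster.

```python
import random
from concurrent.futures import ThreadPoolExecutor

def fitness(bits):
    # Toy stand-in for the costly step (e.g., a circuit simulation):
    # here, simply count the 1s ("OneMax").
    return sum(bits)

def evolve(pop_size=20, genome_len=16, generations=30, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(genome_len)] for _ in range(pop_size)]
    best = 0
    with ThreadPoolExecutor(max_workers=4) as farm:   # stand-in "slave" nodes
        for _ in range(generations):
            scores = list(farm.map(fitness, pop))     # fitness farmed out
            best = max(best, max(scores))
            def pick():                               # tournament selection
                a, b = rng.randrange(pop_size), rng.randrange(pop_size)
                return pop[a] if scores[a] >= scores[b] else pop[b]
            nxt = []
            while len(nxt) < pop_size:                # master does the genetics
                p1, p2 = pick(), pick()
                cut = rng.randrange(1, genome_len)    # one-point crossover
                child = p1[:cut] + p2[cut:]
                if rng.random() < 0.1:                # occasional point mutation
                    i = rng.randrange(genome_len)
                    child[i] ^= 1
                nxt.append(child)
            pop = nxt
    return best
```

Because the evaluations dominate the run time, keeping the per-node load balanced keeps the processors busy, as the abstract notes; the master only performs the comparatively cheap genetic operators.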
Huber, Robert; Ritter, Daniel; Hering, Till; Hillmer, Anne-Kathrin; Kensy, Frank; Müller, Carsten; Wang, Le; Büchs, Jochen
2009-01-01
Background In industry and academic research, there is an increasing demand for flexible automated microfermentation platforms with advanced sensing technology. However, up to now, conventional platforms cannot generate continuous data in high-throughput cultivations, in particular for monitoring biomass and fluorescent proteins. Furthermore, microfermentation platforms are needed that can easily combine cost-effective, disposable microbioreactors with downstream processing and analytical assays. Results To meet this demand, a novel automated microfermentation platform consisting of a BioLector and a liquid-handling robot (Robo-Lector) was successfully built and tested. The BioLector provides a cultivation system that is able to permanently monitor microbial growth and the fluorescence of reporter proteins under defined conditions in microtiter plates. Three exemplary methods were programmed on the Robo-Lector platform to study high-throughput cultivation processes, and especially recombinant protein expression, in detail. The host/vector system E. coli BL21(DE3) pRhotHi-2-EcFbFP, expressing the fluorescence protein EcFbFP, was investigated. With the method 'induction profiling' it was possible to conduct 96 different induction experiments (varying inducer concentrations from 0 to 1.5 mM IPTG at 8 different induction times) simultaneously in an automated way. The method 'biomass-specific induction' made it possible to automatically induce cultures with different growth kinetics in a microtiter plate at the same biomass concentration, which resulted in a relative standard deviation of the EcFbFP production of only ± 7%. The third method, 'biomass-specific replication', made it possible to generate equal initial biomass concentrations in main cultures from precultures with different growth kinetics.
This was realized by automatically transferring an appropriate inoculum volume from the different preculture microtiter wells to respective wells of the main culture plate, where subsequently similar growth kinetics could be obtained. Conclusion The Robo-Lector generates extensive kinetic data in high-throughput cultivations, particularly for biomass and fluorescence protein formation. Based on the non-invasive on-line monitoring signals, actions of the liquid-handling robot can easily be triggered. This interaction between the robot and the BioLector (Robo-Lector) combines high-content data generation with systematic high-throughput experimentation in an automated fashion, offering new possibilities to study biological production systems. The presented platform uses a standard liquid-handling workstation with widespread automation possibilities. Thus, high-throughput cultivations can now be combined with small-scale downstream processing techniques and analytical assays. Ultimately, this novel versatile platform can accelerate and intensify research and development in the field of systems biology as well as modelling and bioprocess optimization. PMID:19646274
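The 'biomass-specific induction' idea, triggering a robot action per well once that well's biomass signal crosses a setpoint, can be illustrated with a small sketch. The function name and data layout are invented for illustration and are not the Robo-Lector API.

```python
def induction_times(readings, setpoint):
    """For each well, return the first sampling index at which the biomass
    signal reaches the setpoint (None if it never does) -- the event that
    would trigger the liquid-handling robot to add inducer to that well.

    readings: dict mapping well ID -> list of biomass readings over time.
    """
    times = {}
    for well, series in readings.items():
        times[well] = next(
            (i for i, biomass in enumerate(series) if biomass >= setpoint),
            None,
        )
    return times
```

Because each well is checked against its own time series, cultures with different growth kinetics are induced at the same biomass concentration rather than at the same clock time, which is the point of the method.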
Broyer, Patrick; Perrot, Nadine; Rostaing, Hervé; Blaze, Jérome; Pinston, Frederic; Gervasi, Gaspard; Charles, Marie-Hélène; Dachaud, Fabien; Dachaud, Jacques; Moulin, Frederic; Cordier, Sylvain; Dauwalder, Olivier; Meugnier, Hélène; Vandenesch, Francois
2018-01-01
Sepsis is the leading cause of death among patients in intensive care units (ICUs), requiring an early diagnosis to introduce efficient therapeutic intervention. Rapid identification (ID) of a causative pathogen is key to guide directed antimicrobial selection and was recently shown to reduce hospitalization length in ICUs. Direct processing of positive blood cultures by MALDI-TOF MS technology is one of the several currently available tools used to generate rapid microbial ID. However, all recently published protocols are still manual and time-consuming, requiring dedicated technician availability and specific strategies for batch processing. We present here a new prototype instrument for automated preparation of Vitek® MS slides directly from positive blood culture broth based on an "all-in-one" extraction strip. This benchtop instrument was evaluated on 111 and 22 organisms processed using artificially inoculated blood culture bottles in the BacT/ALERT® 3D (SA/SN blood culture bottles) or the BacT/ALERT® Virtuo™ system (FA/FN Plus bottles), respectively. Overall, this new preparation station provided reliable and accurate Vitek MS species-level identification of 87% (Gram-negative bacteria = 85%, Gram-positive bacteria = 88%, and yeast = 100%) when used with BacT/ALERT® 3D and of 84% (Gram-negative bacteria = 86%, Gram-positive bacteria = 86%, and yeast = 75%) with Virtuo® instruments, respectively. The prototype was then evaluated in a clinical microbiology laboratory on 102 clinical blood culture bottles and compared to routine laboratory ID procedures. Overall, the correlation of ID on monomicrobial bottles was 83% (Gram-negative bacteria = 89%, Gram-positive bacteria = 79%, and yeast = 78%), demonstrating roughly equivalent performance between manual and automated extraction methods. This prototype instrument exhibited a high level of performance regardless of bottle type or BacT/ALERT system.
Furthermore, blood culture workflow could potentially be improved by converting direct ID of positive blood cultures from a batch-based to real-time and "on-demand" process.
Frazee, Bradley W; Enriquez, Kayla; Ng, Valerie; Alter, Harrison
2015-06-01
Voided urinalysis to test for urinary tract infection (UTI) is prone to false-positive results for a number of reasons. Specimens are often collected at triage from women with any abdominal complaint, creating a low UTI prevalence population. Improper collection technique by the patient may affect the result. At least four indices, if positive, can indicate UTI. We examine the impact of voided specimen collection technique on urinalysis indicators of UTI and on urine culture contamination in disease-free women. In this crossover design, 40 menstrual-age female emergency department staff without UTI symptoms collected urine two ways: directly in a cup ("non-clean") and midstream clean catch ("ideal"). Samples underwent standard automated urinalysis and culture. Urinalysis indices and culture contamination were compared. The proportion of abnormal results from samples collected by "non-clean" vs. "ideal" technique, respectively, were: leukocyte esterase (>trace) 50%, 35% (95% confidence interval for difference -6% to 36%); nitrites (any) 2.5%, 2.5% (difference -2.5 to 2.5%); white blood cells (>5/high-powered field [HPF]) 50%, 27.5% (difference 4 to 41%); bacteria (any/HPF) 77.5%, 62.5%, (difference -7 to 37%); epithelial cells (>few) 65%, 30% (difference 13 to 56%); culture contamination (>1000 colony-forming units of commensal or >2 species) 77%, 63% (difference -5 to 35%). No urinalysis index was positively correlated with culture contamination. Contemporary automated urinalysis indices were often abnormal in a disease-free population of women, even using ideal collection technique. In clinical practice, such false-positive results could lead to false-positive UTI diagnosis. Only urine nitrite showed a high specificity. Culture contamination was common regardless of collection technique and was not predicted by urinalysis results. Copyright © 2015 Elsevier Inc. All rights reserved.
Enhanced clinical-scale manufacturing of TCR transduced T-cells using closed culture system modules.
Jin, Jianjian; Gkitsas, Nikolaos; Fellowes, Vicki S; Ren, Jiaqiang; Feldman, Steven A; Hinrichs, Christian S; Stroncek, David F; Highfill, Steven L
2018-01-24
Genetic engineering of T-cells to express specific T cell receptors (TCR) has emerged as a novel strategy to treat various malignancies. More widespread utilization of these types of therapies has been somewhat constrained by the lack of closed culture processes capable of expanding sufficient numbers of T-cells for clinical application. Here, we evaluate a process for robust clinical grade manufacturing of TCR gene engineered T-cells. TCRs that target human papillomavirus E6 and E7 were independently tested. A 21 day process was divided into a transduction phase (7 days) and a rapid expansion phase (14 days). This process was evaluated using two healthy donor samples and four samples obtained from patients with epithelial cancers. The process resulted in ~ 2000-fold increase in viable nucleated cells and high transduction efficiencies (64-92%). At the end of culture, functional assays demonstrated that these cells were potent and specific in their ability to kill tumor cells bearing the target and to secrete large quantities of interferon and tumor necrosis factor. Both phases of culture were contained within closed or semi-closed modules, which include automated density gradient separation and cell culture bags for the first phase and closed GREX culture devices and wash/concentrate systems for the second phase. Large-scale manufacturing using modular systems and semi-automated devices resulted in highly functional clinical-grade TCR transduced T-cells. This process is now in use in actively accruing clinical trials at the NIH Clinical Center and can be utilized at other cell therapy manufacturing sites that wish to scale-up and optimize their processing using closed systems.
Renz, Nora; Cabric, Sabrina; Morgenstern, Christian; Schuetz, Michael A; Trampuz, Andrej
2018-04-01
Bone healing disturbance following fracture fixation represents a continuing challenge. We evaluated a novel fully automated polymerase chain reaction (PCR) assay using sonication fluid from retrieved orthopedic hardware to diagnose infection. In this prospective diagnostic cohort study, explanted orthopedic hardware materials from consecutive patients were investigated by sonication and the resulting sonication fluid was analyzed by culture (standard procedure) and multiplex PCR (investigational procedure). Hardware-associated infection was defined as visible purulence, presence of a sinus tract, implant on view, inflammation in peri-implant tissue or positive culture. McNemar's chi-squared test was used to compare the performance of diagnostic tests. For the clinical performance all pathogens were considered, whereas for analytical performance only microorganisms were considered for which primers are included in the PCR assay. Among 51 patients, hardware-associated infection was diagnosed in 38 cases (75%) and non-infectious causes in 13 patients (25%). The sensitivity for diagnosing infection was 66% for peri-implant tissue culture, 84% for sonication fluid culture, 71% (clinical performance) and 77% (analytical performance) for sonication fluid PCR, the specificity of all tests was >90%. The analytical sensitivity of PCR was higher for gram-negative bacilli (100%), coagulase-negative staphylococci (89%) and Staphylococcus aureus (75%) than for Cutibacterium (formerly Propionibacterium) acnes (57%), enterococci (50%) and Candida spp. (25%). The performance of sonication fluid PCR for diagnosis of orthopedic hardware-associated infection was comparable to culture tests. The additional advantage of PCR was short processing time (<5 h) and fully automated procedure. With further improvement of the performance, PCR has the potential to complement conventional cultures. Copyright © 2018 Elsevier Ltd. All rights reserved.
Hanson, Marta; Pomata, Gianna
2017-03-01
This essay deals with the medical recipe as an epistemic genre that played an important role in the cross-cultural transmission of knowledge. The article first compares the development of the recipe as a textual form in Chinese and European premodern medical cultures. It then focuses on the use of recipes in the transmission of Chinese pharmacology to Europe in the second half of the seventeenth century. The main sources examined are the Chinese medicinal formulas translated—presumably—by the Jesuit Michael Boym and published in Specimen Medicinae Sinicae (1682), a text that introduced Chinese pulse medicine to Europe. The article examines how the translator rendered the Chinese formulas into Latin for a European audience. Arguably, the translation was facilitated by the fact that the recipe as a distinct epistemic genre had developed, with strong parallels, in both Europe and China. Building on these parallels, the translator used the recipe as a shared textual format that would allow the transfer of knowledge between the two medical cultures.
NASA Astrophysics Data System (ADS)
Wong, N.; Grace, J. M.; Liang, J.; Owyang, S.; Storrs, A.; Zhou, J.; Rothschild, L. J.; Gentry, D.
2014-12-01
Life acclimated to harsh conditions is frequently difficult to study using normal lab techniques and conventional equipment. Simplified studies using in-lab 'simulated' extreme environments, such as UV bulbs or cold blocks, are manually intensive, error-prone, and lose many complexities of the microbe/environment interaction. We have built a prototype instrument to address this dilemma by allowing automated iterations of microbial cultures to be subject to combinations of modular environmental pressures such as heat, radiation, and chemical exposure. The presence of multiple sensors allows the state of the culture and internal environment to be continuously monitored and adjusted in response. Our first prototype showed successful iterations of microbial growth and thermal exposure. Our second prototype, presented here, performs a demonstration of repeated exposure of Escherichia coli to ultraviolet radiation, a well-established procedure. As the E. coli becomes more resistant to ultraviolet radiation, the device detects its increased survival and growth and increases the dosage accordingly. Calibration data for the instrument were generated by performing the same proof-of-concept exposure experiment, at a smaller scale, by hand. Current performance data indicate that our finalized instrument will have the ability to run hundreds of iterations with multiple selection pressures. The automated sensing and adaptive exposure that the device provides will help address the challenges of managing and culturing life tailored to uncommon environmental stresses. We have designed this device to be flexible, extensible, low-cost, and easy to reproduce. We hope that it will enter wide use as a tool for conducting scalable studies of the interaction between extremophiles and multiple environmental stresses, and potentially for generating artificial extremophiles as analogues for life we might find in extreme environments here on Earth or elsewhere.
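The adaptive dosing loop can be illustrated with a deliberately simple control rule. The abstract does not give the instrument's actual rule, so the multiplicative step below is an assumption made for illustration.

```python
def next_dose(dose, survival, target=0.5, step=1.25):
    """One assumed control rule for adaptive exposure: raise the UV dose
    when the measured survival fraction exceeds the target kill level,
    and back it off otherwise.

    dose: current UV dose (arbitrary units).
    survival: fraction of the culture surviving the last exposure (0..1).
    target: desired survival fraction the controller tracks.
    step: multiplicative adjustment factor (hypothetical value).
    """
    return dose * step if survival > target else dose / step
```

As the culture adapts and survival creeps above the target, successive calls ratchet the dose upward, mirroring the abstract's description of the device increasing the dosage as resistance develops.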
Fast Acceleration of 2D Wave Propagation Simulations Using Modern Computational Accelerators
Wang, Wei; Xu, Lifan; Cavazos, John; Huang, Howie H.; Kay, Matthew
2014-01-01
Recent developments in modern computational accelerators like Graphics Processing Units (GPUs) and coprocessors provide great opportunities for making scientific applications run faster than ever before. However, efficient parallelization of scientific code using new programming tools like CUDA requires a high level of expertise that is not available to many scientists. This, plus the fact that parallelized code is usually not portable to different architectures, creates major challenges for exploiting the full capabilities of modern computational accelerators. In this work, we sought to overcome these challenges by studying how to achieve both automated parallelization using OpenACC and enhanced portability using OpenCL. We applied our parallelization schemes using GPUs as well as an Intel Many Integrated Core (MIC) coprocessor to reduce the run time of wave propagation simulations. We used a well-established 2D cardiac action potential model as a specific case-study. To the best of our knowledge, we are the first to study auto-parallelization of 2D cardiac wave propagation simulations using OpenACC. Our results identify several approaches that provide substantial speedups. The OpenACC-generated GPU code achieved a substantial speedup over the sequential implementation and required the addition of only a few OpenACC pragmas to the code. An OpenCL implementation provided speedups on GPUs over both the sequential implementation and a parallelized OpenMP implementation. An implementation of OpenMP on the Intel MIC coprocessor provided speedups with only a few code changes to the sequential implementation. We highlight that OpenACC provides an automatic, efficient, and portable approach to achieve parallelization of 2D cardiac wave simulations on GPUs.
Our approach of using OpenACC, OpenCL, and OpenMP to parallelize this particular model on modern computational accelerators should be applicable to other computational models of wave propagation in multi-dimensional media. PMID:24497950
Shelley, Brandon C; Gowing, Geneviève; Svendsen, Clive N
2014-06-15
A cell expansion technique to amass large numbers of cells from a single specimen for research experiments and clinical trials would greatly benefit the stem cell community. Many current expansion methods are laborious and costly, and those involving complete dissociation may cause several stem and progenitor cell types to undergo differentiation or early senescence. To overcome these problems, we have developed an automated mechanical passaging method referred to as "chopping" that is simple and inexpensive. This technique avoids chemical or enzymatic dissociation into single cells and instead allows for the large-scale expansion of suspended, spheroid cultures that maintain constant cell/cell contact. The chopping method has primarily been used for fetal brain-derived neural progenitor cells or neurospheres, and has recently been published for use with neural stem cells derived from embryonic and induced pluripotent stem cells. The procedure involves seeding neurospheres onto a tissue culture Petri dish and subsequently passing a sharp, sterile blade through the cells, effectively automating the tedious process of mechanically dissociating each sphere by hand. Suspending cells in culture provides a favorable surface area-to-volume ratio, as over 500,000 cells can be grown within a single neurosphere of less than 0.5 mm in diameter. In one T175 flask, over 50 million cells can grow in suspension cultures compared to only 15 million in adherent cultures. Importantly, the chopping procedure has been used under current good manufacturing practice (cGMP), permitting mass quantity production of clinical-grade cell products.
Jaccard, Nicolas; Griffin, Lewis D; Keser, Ana; Macown, Rhys J; Super, Alexandre; Veraitch, Farlan S; Szita, Nicolas
2014-03-01
The quantitative determination of key adherent cell culture characteristics such as confluency, morphology, and cell density is necessary for the evaluation of experimental outcomes and to provide a suitable basis for the establishment of robust cell culture protocols. Automated processing of images acquired using phase contrast microscopy (PCM), an imaging modality widely used for the visual inspection of adherent cell cultures, could enable the non-invasive determination of these characteristics. We present an image-processing approach that accurately detects cellular objects in PCM images through a combination of local contrast thresholding and post hoc correction of halo artifacts. The method was thoroughly validated using a variety of cell lines, microscope models and imaging conditions, demonstrating consistently high segmentation performance in all cases and very short processing times (<1 s per 1,208 × 960 pixels image). Based on the high segmentation performance, it was possible to precisely determine culture confluency, cell density, and the morphology of cellular objects, demonstrating the wide applicability of our algorithm for typical microscopy image processing pipelines. Furthermore, PCM image segmentation was used to facilitate the interpretation and analysis of fluorescence microscopy data, enabling the determination of temporal and spatial expression patterns of a fluorescent reporter. We created a software toolbox (PHANTAST) that bundles all the algorithms and provides an easy to use graphical user interface. Source-code for MATLAB and ImageJ is freely available under a permissive open-source license. © 2013 The Authors. Biotechnology and Bioengineering Published by Wiley Periodicals, Inc.
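The core of the segmentation step, thresholding a local contrast measure, can be sketched as follows. This is a simplified illustration of the general technique, not the PHANTAST code: the window size and threshold are arbitrary, and the paper's post hoc halo correction is omitted.

```python
import numpy as np

def local_contrast_mask(img, win=5, thresh=0.05):
    """Flag pixels whose local standard deviation (a contrast measure)
    exceeds a threshold. Cells disturb the relatively flat background of a
    phase contrast image, so high local contrast is a crude proxy for
    'cellular object'."""
    img = img.astype(float)
    pad = win // 2
    p = np.pad(img, pad, mode="reflect")

    def winsum(a):
        # Integral-image trick: windowed sums via 2D cumulative sums.
        s = np.cumsum(np.cumsum(a, axis=0), axis=1)
        s = np.pad(s, ((1, 0), (1, 0)))
        return (s[win:, win:] - s[:-win, win:]
                - s[win:, :-win] + s[:-win, :-win])

    n = win * win
    mean = winsum(p) / n
    var = winsum(p ** 2) / n - mean ** 2          # E[x^2] - E[x]^2
    return np.sqrt(np.clip(var, 0, None)) > thresh
```

On a flat (background) region the local variance is zero, so the mask stays off; textured regions exceed the threshold. Fractions of on-pixels then give a confluency estimate, and connected components give object counts and morphology.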
Production of Pharmaceuticals from Papaver Cultivars In Vitro
USDA-ARS?s Scientific Manuscript database
A methodology to clonally proliferate Iranian poppy (Papaver bracteatum Lindl.) and opium poppy (P. somniferum L.) shoots is presented employing an in vitro hydroponics system (i.e., automated plant culture system (APCS)). Temperature had a profound effect on growth and alkaloid production after 8-...
Rossi-Rodrigues, Bianca Caroline; Brochetto-Braga, Márcia Regina; Tauk-Tornisielo, Sâmia Maria; Carmona, Eleonora Cano; Arruda, Valeska Marques; Chaud Netto, José
2009-01-01
Trichoderma is one of the fungal genera that produce important metabolites for industry. The growth of these organisms is a consequence of the nutritional sources used as well as of the physical conditions employed to cultivate them. In this work, the automated Bioscreen C system was used to evaluate the influence of different nutritional sources on the growth of Trichoderma strains (T. hamatum, T. harzianum, T. viride, and T. longibrachiatum) isolated from the soil in the Juréia-Itatins Ecological Station (JIES), São Paulo State, Brazil. The cultures were grown in liquid culture media containing different carbon (2%; w/v) and nitrogen (1%; w/v) sources at 28 °C, pH 6.5, agitated at 150 rpm for 72 h. The results showed, as expected, that glucose is superior to sucrose as a growth-stimulating carbon source for the Trichoderma strains studied, while yeast extract and tryptone were good growth-stimulating nitrogen sources in the cultivation of T. hamatum and T. harzianum. PMID:24031380
Automated culture system experiments hardware: developing test results and design solutions.
Freddi, M; Covini, M; Tenconi, C; Ricci, C; Caprioli, M; Cotronei, V
2002-07-01
The experiment proposed by Prof. Ricci (University of Milan) is funded by ASI, with Laben as industrial prime contractor. ACS-EH (Automated Culture System-Experiment Hardware) will support the multigenerational experiment on weightlessness with rotifers and nematodes within four Experiment Containers (ECs) located inside the European Modular Cultivation System (EMCS) facility. Currently, Phase B is in progress and a concept design solution has been defined. The most challenging aspects of the design of such hardware are, from the biological point of view, the provision of an environment that permits the animals' survival and keeps desiccated generations separated, and, from the technical point of view, the miniaturisation of the hardware itself due to the reduced volume provided by the EC (160 mm x 60 mm x 60 mm). The miniaturisation will allow better use of the available EMCS facility resources (e.g., volume, power, etc.) and fulfilment of the experiment requirements. ACS-EH will be ready to fly in 2005 on board the ISS.
Automatic aeroponic irrigation system based on Arduino’s platform
NASA Astrophysics Data System (ADS)
Montoya, A. P.; Obando, F. A.; Morales, J. G.; Vargas, G.
2017-06-01
Recirculating hydroponic culture techniques such as aeroponics have several advantages over traditional agriculture, aimed at improving the efficiency and environmental impact of agriculture. These techniques require continuous monitoring and automation for proper operation. In this work, an automatically monitored aeroponic irrigation system based on the Arduino free software platform was developed. Analog and digital sensors for measuring the temperature, flow, and level of a nutrient solution in a real greenhouse were implemented. In addition, the pH and electrical conductivity of nutritive solutions are monitored using the Arduino's differential configuration. The sensor network and the acquisition and automation system are managed by two Arduino modules in a master-slave configuration, which communicate with each other wirelessly via Wi-Fi. Furthermore, data are stored on micro SD memories and the information is loaded onto a web page in real time. The developed device provides important agronomic information when tested with an arugula culture (Eruca sativa Mill). The system could also be employed as an early-warning system to prevent irrigation malfunctions.
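The sensor-conversion and early-warning logic such a system needs can be sketched in a few lines (shown here in Python rather than Arduino C++ for readability). The calibration constants and threshold below are hypothetical, not taken from the paper.

```python
def adc_to_ph(raw, vref=5.0, bits=10, slope=-5.70, offset=21.34):
    """Map a raw (differential) ADC reading to pH via a linear calibration.
    slope and offset are hypothetical two-point calibration constants;
    a real deployment would derive them from buffer solutions."""
    volts = raw * vref / (2 ** bits - 1)   # 10-bit ADC against vref
    return slope * volts + offset

def flow_alarm(flow_lpm, minimum=0.5):
    """Early-warning check: True when the measured nutrient-solution flow
    (litres per minute) drops below a minimum, e.g. a clogged nozzle."""
    return flow_lpm < minimum
```

In the master-slave arrangement described above, the slave module would perform readings like these and report them over Wi-Fi, while the master logs to SD and publishes to the web page.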
2012-04-30
…individualistic, Western view of human behavior. That is, it assumes that people adhere to the individualistic principles of Western culture. What happens to people with networks rich in structural holes who live/work in environments that adhere to other principles, such as those of a collectivistic culture? (http://orgtheory.wordpress.com/2007/06/19/structural-holes-in-context/) Preferential Attachment (PA) [Barabási & Albert, 1999]: the most…
Feasibility of Standardizing Automated Laboratory Analyzers On-Board U.S. Naval Ships
1999-12-01
[Snippet: the report excerpt lists shipboard laboratory tests (e.g., electrolytes Na, K, Cl, CO2; fecal leukocytes; feces for ova, cysts, and parasites; glucose; hematocrit; occult blood; wound and throat cultures; fibrinogen and FSP; gonorrhea; urine drug screen) together with a flattened table of test-utilization counts.]
ERIC Educational Resources Information Center
Hannon, Eric E.
2009-01-01
Recent evidence suggests that the musical rhythm of a particular culture may parallel the speech rhythm of that culture's language (Patel, A. D., & Daniele, J. R. (2003). "An empirical comparison of rhythm in language and music." "Cognition, 87," B35-B45). The present experiments aimed to determine whether listeners actually perceive such rhythmic…
The Cultural-Rhetorical Role of Free Jazz: Forging an Identity in the Sixties.
ERIC Educational Resources Information Center
Francesconi, Robert
The free jazz movement of the 1960s provided a rhetorical parallel in music to the verbal messages of black power and black nationalism. The use of Third World musical patterns represented an attempt to reinforce the revolutions in perceptions that black Americans held of themselves, their cultural heritage, and relationships to the rest of the…
Cloud identification using genetic algorithms and massively parallel computation
NASA Technical Reports Server (NTRS)
Buckles, Bill P.; Petry, Frederick E.
1996-01-01
As a Guest Computational Investigator under the NASA administered component of the High Performance Computing and Communication Program, we implemented a massively parallel genetic algorithm on the MasPar SIMD computer. Experiments were conducted using Earth Science data in the domains of meteorology and oceanography. Results obtained in these domains are competitive with, and in most cases better than, similar problems solved using other methods. In the meteorological domain, we chose to identify clouds using AVHRR spectral data. Four cloud speciations were used although most researchers settle for three. Results were remarkably consistent across all tests (91% accuracy). Refinements of this method may lead to more timely and complete information for Global Circulation Models (GCMs) that are prevalent in weather forecasting and global environment studies. In the oceanographic domain, we chose to identify ocean currents from a spectrometer having similar characteristics to AVHRR. Here the results were mixed (60% to 80% accuracy). Given that one is willing to run the experiment several times (say 10), then it is acceptable to claim the higher accuracy rating. This problem has never been successfully automated. Therefore, these results are encouraging even though less impressive than the cloud experiment. Successful conclusion of an automated ocean current detection system would impact coastal fishing, naval tactics, and the study of micro-climates. Finally, we contributed to the basic knowledge of GA (genetic algorithm) behavior in parallel environments. We developed better knowledge of the use of subpopulations in the context of shared breeding pools and the migration of individuals. Rigorous experiments were conducted based on quantifiable performance criteria. While much of the work confirmed current wisdom, for the first time we were able to submit conclusive evidence. The software developed under this grant was placed in the public domain.
An extensive user's manual was written and distributed nationwide to scientists whose work might benefit from its availability. Several papers, including two journal articles, were produced.
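One common way to organize subpopulations with migration of individuals is an island model with ring migration. The report above does not specify its migration policy, so the sketch below is a generic illustration, again using bit-count as a toy fitness.

```python
def migrate(islands, k=1):
    """Ring migration for an island-model GA: each island sends copies of
    its k fittest individuals to the next island in the ring, where they
    replace that island's k least-fit individuals.

    islands: list of subpopulations, each a list of bit-list genomes;
    fitness is taken to be sum(genome) for this toy example.
    """
    fitted = [sorted(isl, key=sum) for isl in islands]   # ascending fitness
    out = []
    for i, isl in enumerate(fitted):
        donors = fitted[(i - 1) % len(fitted)][-k:]      # best of previous island
        out.append(donors + isl[k:])                     # drop own k worst
    return out
```

Between migrations each island breeds independently (a shared breeding pool per island), which preserves diversity across the population while still letting good genetic material spread, the trade-off the grant's subpopulation experiments examined.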
ASDF: An Adaptable Seismic Data Format with Full Provenance
NASA Astrophysics Data System (ADS)
Smith, J. A.; Krischer, L.; Tromp, J.; Lefebvre, M. P.
2015-12-01
In order for seismologists to maximize their knowledge of how the Earth works, they must extract the maximum amount of useful information from all recorded seismic data available for their research. This requires assimilating large sets of waveform data, keeping track of vast amounts of metadata, using validated standards for quality control, and automating the workflow in a careful and efficient manner. In addition, there is a growing gap between CPU/GPU speeds and disk access speeds that leads to an I/O bottleneck in seismic workflows. This is made even worse by existing seismic data formats that were not designed for performance and are limited to a few fixed headers for storing metadata. The Adaptable Seismic Data Format (ASDF) is a new data format for seismology that solves the problems with existing seismic data formats and integrates full provenance into the definition. ASDF is a self-describing format that features parallel I/O using the parallel HDF5 library. This makes it a great choice for use on HPC clusters. The format integrates the standards QuakeML for seismic sources and StationXML for receivers. ASDF is suitable for storing earthquake data sets, where all waveforms for a single earthquake are stored in one file, as well as ambient noise cross-correlations and adjoint sources. The format comes with a user-friendly Python reader and writer that gives seismologists access to a full set of Python tools for seismology. There is also a faster C/Fortran library for integrating ASDF into performance-focused numerical wave solvers, such as SPECFEM3D_GLOBE. Finally, a GUI tool for visually exploring the format provides a flexible interface for both research and educational applications. ASDF is a new seismic data format that offers seismologists high-performance parallel processing, organized and validated contents, and full provenance tracking for automated seismological workflows.
Mischnik, Alexander; Mieth, Markus; Busch, Cornelius J; Hofer, Stefan; Zimmermann, Stefan
2012-08-01
Automation of plate streaking is ongoing in clinical microbiological laboratories, but evaluation for routine use remains largely outstanding. In the present study, the recovery of microorganisms from polyurethane (PU) swab samples plated by the Previ Isola system is compared to that from manually plated control viscose swab samples from wounds according to the CLSI procedure M40-A (quality control of microbiological transport systems). One hundred twelve paired samples (224 swabs) were analyzed. In 80/112 samples (71%), concordant culture results were obtained with the two methods. In 32/112 samples (29%), CFU recovery of microorganisms from the two methods was discordant. In 24 (75%) of the 32 paired samples with a discordant result, Previ Isola plated PU swabs were superior. In 8 (25%) of the 32 paired samples with a discordant result, control viscose swabs were superior. The quality of colony growth on culture media for further investigations was superior with Previ Isola inoculated plates compared to manual plating techniques. Gram stain results were concordant between the two methods in 62/112 samples (55%). In 50/112 samples (45%), the results of Gram staining were discordant between the two methods. In 34 (68%) of the 50 paired samples with discordant results, Gram staining of PU swabs was superior to that of control viscose swabs. In 16 (32%) of the 50 paired samples, Gram staining of control viscose swabs was superior to that of PU swabs. We report the first clinical evaluation of Previ Isola automated specimen inoculation for wound swab samples. This study suggests that use of an automated specimen inoculation system has good results with regard to CFU recovery, quality of Gram staining, and accuracy of diagnosis.
Automated sequence analysis and editing software for HIV drug resistance testing.
Struck, Daniel; Wallis, Carole L; Denisov, Gennady; Lambert, Christine; Servais, Jean-Yves; Viana, Raquel V; Letsoalo, Esrom; Bronze, Michelle; Aitken, Sue C; Schuurman, Rob; Stevens, Wendy; Schmit, Jean Claude; Rinke de Wit, Tobias; Perez Bercoff, Danielle
2012-05-01
Access to antiretroviral treatment in resource-limited settings is inevitably paralleled by the emergence of HIV drug resistance. Monitoring treatment efficacy and HIV drug resistance testing are therefore of increasing importance in resource-limited settings. Yet low-cost technologies and procedures suited to the particular context and constraints of such settings are still lacking. The ART-A (Affordable Resistance Testing for Africa) consortium brought together public and private partners to address this issue. To develop automated sequence analysis and editing software to support high-throughput automated sequencing. The ART-A Software was designed to automatically process and edit ABI chromatograms or FASTA files from HIV-1 isolates. The ART-A Software performs the basecalling, assigns quality values, aligns query sequences against a set of reference sequences, infers a consensus sequence, identifies the HIV type and subtype, translates the nucleotide sequence to amino acids, and reports insertions/deletions, premature stop codons, ambiguities, and mixed calls. The results can be automatically exported to Excel to identify mutations. Automated analysis was compared to manual analysis using a panel of 1624 PR-RT sequences generated in 3 different laboratories. Discrepancies between manual and automated sequence analysis were 0.69% at the nucleotide level and 0.57% at the amino acid level (668,047 AA analyzed), and discordances at major resistance mutations were recorded in 62 cases (4.83% of differences, 0.04% of all AA) for PR and in 171 cases (6.18% of differences, 0.03% of all AA) for RT. The ART-A Software is a time-saving tool for pre-analyzing HIV and viral quasispecies sequences in high-throughput laboratories and highlighting positions requiring attention. Copyright © 2012 Elsevier B.V. All rights reserved.
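One step the abstract describes, inferring a consensus sequence while flagging mixed calls, can be sketched in a few lines. This is an illustrative toy, not the ART-A implementation: the `mix_threshold`, the reduced IUPAC table, and the `consensus` function are all assumptions for demonstration.

```python
# Toy consensus caller: per alignment column, emit the majority base, or an
# IUPAC ambiguity code when a minor base exceeds a coverage threshold.
from collections import Counter

# Two-base IUPAC ambiguity codes (subset, for illustration only)
IUPAC = {frozenset("AG"): "R", frozenset("CT"): "Y", frozenset("AC"): "M",
         frozenset("GT"): "K", frozenset("AT"): "W", frozenset("CG"): "S"}

def consensus(aligned_reads, mix_threshold=0.25):
    """Call one consensus base per column of equal-length aligned reads."""
    out = []
    for col in zip(*aligned_reads):
        counts = Counter(b for b in col if b in "ACGT")
        if not counts:
            out.append("N")          # no usable base in this column
            continue
        ranked = counts.most_common()
        if len(ranked) > 1 and ranked[1][1] / sum(counts.values()) >= mix_threshold:
            # Mixed call: report the ambiguity code for the top two bases
            out.append(IUPAC.get(frozenset((ranked[0][0], ranked[1][0])), "N"))
        else:
            out.append(ranked[0][0])
    return "".join(out)

reads = ["ACGTAC", "ACGTAC", "ACATAC", "ACATAC"]
print(consensus(reads))  # mixed G/A at position 3 -> "ACRTAC"
```

A real pipeline would additionally weight calls by chromatogram quality values rather than raw counts.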
Ion channel pharmacology under flow: automation via well-plate microfluidics.
Spencer, C Ian; Li, Nianzhen; Chen, Qin; Johnson, Juliette; Nevill, Tanner; Kammonen, Juha; Ionescu-Zanetti, Cristian
2012-08-01
Automated patch clamping addresses the need for high-throughput screening of chemical entities that alter ion channel function. As a result, there is considerable utility in the pharmaceutical screening arena for novel platforms that can produce relevant data both rapidly and consistently. Here we present results that were obtained with an innovative microfluidic automated patch clamp system utilizing a well-plate that eliminates the necessity of internal robotic liquid handling. Continuous recording from cell ensembles, rapid solution switching, and a bench-top footprint enable a number of assay formats previously inaccessible to automated systems. An electro-pneumatic interface was employed to drive the laminar flow of solutions in a microfluidic network that delivered cells in suspension to ensemble recording sites. Whole-cell voltage clamp was applied to linear arrays of 20 cells in parallel utilizing a 64-channel voltage clamp amplifier. A number of unique assays requiring sequential compound applications separated by a second or less, such as rapid determination of the agonist EC50 for a ligand-gated ion channel or the kinetics of desensitization recovery, are enabled by the system. In addition, the system was validated via electrophysiological characterizations of both voltage-gated and ligand-gated ion channel targets: hKV2.1 and human Ether-à-go-go-related gene potassium channels, hNaV1.7 and 1.8 sodium channels, and (α1) hGABAA receptors and (α1) human nicotinic acetylcholine receptors. Our results show that the voltage dependence, kinetics, and interactions of these channels with pharmacological agents were matched to reference data. The results from these IonFlux™ experiments demonstrate that the system provides high-throughput automated electrophysiology with enhanced reliability and consistency, in a user-friendly format.
Development of an integrated semi-automated system for in vitro pharmacodynamic modelling.
Wang, Liangsu; Wismer, Michael K; Racine, Fred; Conway, Donald; Giacobbe, Robert A; Berejnaia, Olga; Kath, Gary S
2008-11-01
The aim of this study was to develop an integrated system for in vitro pharmacodynamic modelling of antimicrobials with greater flexibility, easier control and better accuracy than existing in vitro models. Custom-made bottle caps, fittings, valve controllers and a modified bench-top shaking incubator were used. A temperature-controlled automated sample collector was built. Computer software was developed to manage experiments and to control the entire system including solenoid pinch valves, peristaltic pumps and the sample collector. The system was validated by pharmacokinetic simulations of linezolid 600 mg infusion. The antibacterial effect of linezolid against multiple Staphylococcus aureus strains was also studied in this system. An integrated semi-automated bench-top system was built and validated. The temperature-controlled automated sample collector allowed unattended collection and temporary storage of samples. The system software reduced the labour necessary for many tasks and also improved the timing accuracy for performing simultaneous actions in multiple parallel experiments. The system was able to simulate human pharmacokinetics of linezolid 600 mg intravenous infusion accurately. A pharmacodynamic study of linezolid against multiple S. aureus strains with a range of MICs showed that the required 24 h free drug AUC/MIC ratio was approximately 30 in order to keep the organism counts at the same level as their initial inoculum and was ≥68 in order to achieve a >2 log10 cfu/mL reduction in the in vitro model. The integrated semi-automated bench-top system provided the ability to overcome many of the drawbacks of existing in vitro models. It can be used for various simple or complicated pharmacokinetic/pharmacodynamic studies efficiently and conveniently.
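The concentration-time profile such a system is asked to reproduce can be sketched with a one-compartment IV infusion model. This is a minimal sketch, not the study's simulation: V, CL, the free fraction, and the dosing schedule below are assumed illustrative values.

```python
import math

# One-compartment model of a single 600 mg IV infusion.
# V, CL, fu are assumptions for illustration, not the paper's parameters.
V, CL, fu = 45.0, 7.0, 0.69       # volume (L), clearance (L/h), free fraction
dose, t_inf = 600.0, 0.5          # dose (mg), infusion duration (h)
k = CL / V                        # elimination rate constant (1/h)

def conc(t):
    """Total drug concentration (mg/L) at time t after the infusion starts."""
    R0 = dose / t_inf             # infusion rate (mg/h)
    if t <= t_inf:
        return R0 / CL * (1 - math.exp(-k * t))
    return conc(t_inf) * math.exp(-k * (t - t_inf))

# 24 h free-drug AUC for twice-daily dosing: single-dose AUC(0-inf) = dose/CL
fauc24 = fu * 2 * dose / CL
for mic in (1, 2, 4):
    print(f"MIC {mic} mg/L: fAUC24/MIC = {fauc24 / mic:.1f}")
```

With these assumed parameters the fAUC24/MIC crosses the abstract's stasis threshold (~30) only at the higher MICs, which is exactly the kind of exposure-response question the model system probes.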
Devine, Emily Beth; Van Eaton, Erik; Zadworny, Megan E; Symons, Rebecca; Devlin, Allison; Yanez, David; Yetisgen, Meliha; Keyloun, Katelyn R; Capurro, Daniel; Alfonso-Cristancho, Rafael; Flum, David R; Tarczy-Hornoch, Peter
2018-05-22
The availability of high fidelity electronic health record (EHR) data is a hallmark of the learning health care system. Washington State's Surgical Care Outcomes and Assessment Program (SCOAP) is a network of hospitals participating in quality improvement (QI) registries wherein data are manually abstracted from EHRs. To create the Comparative Effectiveness Research and Translation Network (CERTAIN), we semi-automated SCOAP data abstraction using a centralized federated data model, created a central data repository (CDR), and assessed whether these data could be used as real world evidence for QI and research. Describe the validation processes and complexities involved and lessons learned. Investigators installed a commercial CDR to retrieve and store data from disparate EHRs. Manual and automated abstraction systems were conducted in parallel (10/2012-7/2013) and validated in three phases using the EHR as the gold standard: 1) ingestion, 2) standardization, and 3) concordance of automated versus manually abstracted cases. Information retrieval statistics were calculated. Four unaffiliated health systems provided data. Between 6 and 15 percent of data elements were abstracted: 51 to 86 percent from structured data; the remainder using natural language processing (NLP). In phase 1, data ingestion from 12 out of 20 feeds reached 95 percent accuracy. In phase 2, 55 percent of structured data elements performed with 96 to 100 percent accuracy; NLP with 89 to 91 percent accuracy. In phase 3, concordance ranged from 69 to 89 percent. Information retrieval statistics were consistently above 90 percent. Semi-automated data abstraction may be useful, although raw data collected as a byproduct of health care delivery is not immediately available for use as real world evidence. New approaches to gathering and analyzing extant data are required.
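The "information retrieval statistics" used to validate automated against manual abstraction are precision, recall, and F1. The helper and the counts below are invented for illustration; the study's actual tallies are not reproduced here.

```python
# Precision/recall/F1 against a gold standard (here, the EHR), as used to
# validate automated data abstraction. Counts are hypothetical.

def ir_stats(tp, fp, fn):
    """Return (precision, recall, F1) from true/false positive and false negative counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Hypothetical tally for one data element across abstracted cases
p, r, f1 = ir_stats(tp=940, fp=35, fn=42)
print(f"precision={p:.3f} recall={r:.3f} F1={f1:.3f}")
```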
Clinical utility of serum HER-2/neu testing on the Bayer Immuno 1 automated system in breast cancer.
Cook, G B; Neaman, I E; Goldblatt, J L; Cambetas, D R; Hussain, M; Lüftner, D; Yeung, K K; Chan, D W; Schwartz, M K; Allard, W J
2001-01-01
The clinical utility of automated serum HER-2/neu measurements in breast cancer run on the Bayer random-access analyzer Immuno 1 was analyzed in several steps: [a] The reference interval was determined for 242 normal healthy pre- and postmenopausal females. [b] The clinical specificity of serum HER-2/neu to separate healthy controls from 210 patients with non-malignant breast and non-breast diseases was calculated. [c] The clinical sensitivity of cross-sectional serum HER-2/neu values for 204 patients (pts) with stage I-IV breast cancer was established. [d] Specimens from 103 stage IV breast cancer pts were evaluated for parallelism between serial serum HER-2/neu results and the disease course. [a] The value of 13.03 ng/ml exceeded 95% of the results from the healthy female population. Based on the mean + 2 standard deviations value of 14.7 ng/ml, the upper limit of normal was established at 15 ng/ml. [b] The specificity for benign breast diseases and other benign non-breast diseases was 98.0% and 94.6%, respectively. [c] The correlation of increased serum HER-2/neu levels and stage of breast cancer revealed the best sensitivity of 40% for stage IV disease. [d] Thirty-eight (36.9%) of 103 stage IV patients had initial HER-2/neu values > 15 ng/ml, 33 of whom showed longitudinal HER-2/neu concentrations which paralleled the clinical course of the disease giving a sensitivity of 86.8%.
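The mean + 2 SD construction behind an upper reference limit like the one reported is simple to reproduce. The sample values below are synthetic stand-ins, not the study's 242 measurements.

```python
# Deriving an upper limit of normal as mean + 2 standard deviations.
# The measurements are synthetic, for illustration only.
from statistics import mean, stdev

her2_ng_ml = [9.8, 11.2, 10.5, 12.1, 9.1, 10.9, 11.7, 10.2, 12.6, 9.6]
upper_limit = mean(her2_ng_ml) + 2 * stdev(her2_ng_ml)
print(f"upper limit of normal ~ {upper_limit:.1f} ng/ml")
```

In practice a laboratory would also check the distribution for normality (or use a nonparametric 97.5th percentile) before fixing the cutoff, as the abstract's parallel use of the empirical 95th percentile (13.03 ng/ml) hints.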
Online optimal experimental re-design in robotic parallel fed-batch cultivation facilities.
Cruz Bournazou, M N; Barz, T; Nickel, D B; Lopez Cárdenas, D C; Glauche, F; Knepper, A; Neubauer, P
2017-03-01
We present an integrated framework for the online optimal experimental re-design applied to parallel nonlinear dynamic processes that aims to precisely estimate the parameter set of macro kinetic growth models with minimal experimental effort. This provides a systematic solution for rapid validation of a specific model to new strains, mutants, or products. In biosciences, this is especially important as model identification is a long and laborious process which is continuing to limit the use of mathematical modeling in this field. The strength of this approach is demonstrated by fitting a macro-kinetic differential equation model for Escherichia coli fed-batch processes after 6 h of cultivation. The system includes two fully-automated liquid handling robots; one containing eight mini-bioreactors and another used for automated at-line analyses, which allows for the immediate use of the available data in the modeling environment. As a result, the experiment can be continually re-designed while the cultivations are running using the information generated by periodical parameter estimations. The advantages of an online re-computation of the optimal experiment are proven by a 50-fold lower average coefficient of variation on the parameter estimates compared to the sequential method (4.83% instead of 235.86%). The success obtained in such a complex system is a further step towards a more efficient computer aided bioprocess development. Biotechnol. Bioeng. 2017;114: 610-619. © 2016 Wiley Periodicals, Inc.
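The kind of macro-kinetic fed-batch model being fitted can be sketched as Monod growth with a constant feed, integrated by forward Euler. All parameter values below are assumptions for illustration, not the paper's estimates.

```python
# Minimal fed-batch sketch: Monod growth, constant feed, Euler integration.
# mu_max, Ks, Yxs, and the feed profile are illustrative assumptions.

mu_max, Ks, Yxs = 0.6, 0.1, 0.5    # 1/h, g/L, g biomass per g substrate
X, S, V = 0.2, 5.0, 0.01            # biomass (g/L), substrate (g/L), volume (L)
F, Sf, dt = 1e-4, 200.0, 0.01       # feed rate (L/h), feed conc (g/L), step (h)

for _ in range(int(6 / dt)):        # simulate 6 h, the horizon in the abstract
    mu = mu_max * S / (Ks + S)      # Monod specific growth rate
    D = F / V                       # dilution rate from feeding
    dX = (mu - D) * X
    dS = -mu * X / Yxs + D * (Sf - S)
    X += dt * dX
    S = max(S + dt * dS, 0.0)       # clamp against Euler overshoot
    V += dt * F

print(f"after 6 h: X = {X:.2f} g/L, S = {S:.2f} g/L, V = {V * 1000:.1f} mL")
```

The experimental re-design problem is then to choose feed profiles that make the resulting trajectories maximally informative about (mu_max, Ks, Yxs), which this sketch does not attempt.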
High-Precision Hysteresis Sensing of the Quartz Crystal Inductance-to-Frequency Converter
Matko, Vojko; Milanović, Miro
2016-01-01
A new method for the automated measurement of the hysteresis of the temperature-compensated inductance-to-frequency converter with a single quartz crystal is proposed. The new idea behind this method is a converter with two programmable analog switches enabling the automated measurement of the converter hysteresis, as well as the temperature compensation of the quartz crystal and any other circuit element. Also used is the programmable timing control device that allows the selection of different oscillating frequencies. In the proposed programmable method two different inductances connected in series to the quartz crystal are switched in a short time sequence, compensating the crystal’s natural temperature characteristics (in the temperature range between 0 and 50 °C). The procedure allows for the measurement of the converter hysteresis at various values of capacitance connected in parallel with the quartz crystal for the converter sensitivity setting at selected inductance. It, furthermore, enables the measurement of hysteresis at various values of inductance at selected parallel capacitance (sensitivity) connected to the quartz crystal. The article shows that the proposed hysteresis measurement of the converter, which converts the inductance in the range between 95 and 100 μH to a frequency in the range between 1 and 200 kHz, has only 7 × 10⁻¹³ frequency instability (during the temperature change between 0 and 50 °C) with a maximum 1 × 10⁻¹¹ hysteresis frequency difference. PMID:27367688
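To put the reported fractional figures in absolute terms, one can multiply them by a nominal crystal frequency. The 4 MHz nominal frequency below is an assumption for illustration; the abstract does not state the crystal's frequency.

```python
# Converting fractional frequency figures into absolute offsets.
# f0 = 4 MHz is an assumed nominal quartz frequency, not from the paper.

f0 = 4.0e6            # Hz (assumption)
instability = 7e-13   # fractional instability over 0-50 degC (from abstract)
hysteresis = 1e-11    # maximum fractional hysteresis difference (from abstract)

print(f"instability: {f0 * instability * 1e6:.1f} uHz")
print(f"hysteresis:  {f0 * hysteresis * 1e6:.1f} uHz")
```

Even at these assumptions the absolute offsets are in the microhertz range, which is why such converters can resolve minute inductance changes.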
SISYPHUS: A high performance seismic inversion factory
NASA Astrophysics Data System (ADS)
Gokhberg, Alexey; Simutė, Saulė; Boehm, Christian; Fichtner, Andreas
2016-04-01
In the recent years the massively parallel high performance computers became the standard instruments for solving the forward and inverse problems in seismology. The respective software packages dedicated to forward and inverse waveform modelling specially designed for such computers (SPECFEM3D, SES3D) became mature and widely available. These packages achieve significant computational performance and provide researchers with an opportunity to solve problems of bigger size at higher resolution within a shorter time. However, a typical seismic inversion process contains various activities that are beyond the common solver functionality. They include management of information on seismic events and stations, 3D models, observed and synthetic seismograms, pre-processing of the observed signals, computation of misfits and adjoint sources, minimization of misfits, and process workflow management. These activities are time consuming, seldom sufficiently automated, and therefore represent a bottleneck that can substantially offset performance benefits provided by even the most powerful modern supercomputers. Furthermore, a typical system architecture of modern supercomputing platforms is oriented towards the maximum computational performance and provides limited standard facilities for automation of the supporting activities. We present a prototype solution that automates all aspects of the seismic inversion process and is tuned for the modern massively parallel high performance computing systems. We address several major aspects of the solution architecture, which include (1) design of an inversion state database for tracing all relevant aspects of the entire solution process, (2) design of an extensible workflow management framework, (3) integration with wave propagation solvers, (4) integration with optimization packages, (5) computation of misfits and adjoint sources, and (6) process monitoring. 
The inversion state database represents a hierarchical structure with branches for the static process setup, inversion iterations, and solver runs, each branch specifying information at the event, station and channel levels. The workflow management framework is based on an embedded scripting engine that allows definition of various workflow scenarios using a high-level scripting language and provides access to all available inversion components represented as standard library functions. At present the SES3D wave propagation solver is integrated in the solution; the work is in progress for interfacing with SPECFEM3D. A separate framework is designed for interoperability with an optimization module; the workflow manager and optimization process run in parallel and cooperate by exchanging messages according to a specially designed protocol. A library of high-performance modules implementing signal pre-processing, misfit and adjoint computations according to established good practices is included. Monitoring is based on information stored in the inversion state database and at present implements a command line interface; design of a graphical user interface is in progress. The software design fits well into the common massively parallel system architecture featuring a large number of computational nodes running distributed applications under control of batch-oriented resource managers. The solution prototype has been implemented on the "Piz Daint" supercomputer provided by the Swiss Supercomputing Centre (CSCS).
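Two of the per-channel quantities such a pipeline computes, the L2 waveform misfit and its adjoint source, can be sketched on toy traces. The traces below are synthetic arrays; a real workflow would read windowed, pre-processed seismograms, and other misfit functionals are common.

```python
# L2 waveform misfit Chi = 1/2 * integral of (syn - obs)^2 dt, with the
# corresponding adjoint source syn - obs. Traces are toy sinusoids.
import math

dt = 0.1  # sampling interval (s)
obs = [math.sin(0.5 * i * dt) for i in range(100)]              # "observed"
syn = [1.1 * math.sin(0.5 * i * dt + 0.02) for i in range(100)]  # "synthetic"

adjoint = [s - o for s, o in zip(syn, obs)]       # dChi/d(syn), per sample
misfit = 0.5 * dt * sum(a * a for a in adjoint)   # trapezoid-free Riemann sum
print(f"misfit = {misfit:.6f}")
```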
Eulberg, Dirk; Buchner, Klaus; Maasch, Christian; Klussmann, Sven
2005-01-01
We have developed an automated SELEX (Systematic Evolution of Ligands by EXponential Enrichment) process that allows the execution of in vitro selection cycles without any direct manual intervention steps. The automated selection protocol is designed to provide for high flexibility and versatility in terms of choice of buffers and reagents as well as stringency of selection conditions. Employing the automated SELEX process, we have identified RNA aptamers to the mirror-image configuration (d-peptide) of substance P. The peptide substance P belongs to the tachykinin family and exerts various biologically important functions, such as peripheral vasodilation, smooth muscle contraction and pain transmission. The aptamer that was identified most frequently was truncated to the 44mer SUP-A-004. The mirror-image configuration of SUP-A-004, the so-called Spiegelmer, has been shown to bind to naturally occurring l-substance P displaying a Kd of 40 nM and to inhibit (IC50 of 45 nM) l-substance P-mediated Ca2+ release in a cell culture assay. PMID:15745995
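What a Kd of 40 nM implies for occupancy follows from the standard 1:1 binding isotherm; this relation is textbook background, not a method from the paper.

```python
# Fraction of Spiegelmer bound as a function of free ligand concentration,
# using the 1:1 binding isotherm with the reported Kd = 40 nM.

KD_NM = 40.0  # dissociation constant (nM), from the abstract

def fraction_bound(ligand_nm):
    """Occupancy = [L] / (Kd + [L]) for simple 1:1 binding."""
    return ligand_nm / (KD_NM + ligand_nm)

for L in (10, 40, 160):
    print(f"[L] = {L} nM -> {100 * fraction_bound(L):.0f}% bound")
```

At [L] = Kd the site is half-occupied, which is the operational meaning of the 40 nM figure.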
Automated Reuse of Scientific Subroutine Libraries through Deductive Synthesis
NASA Technical Reports Server (NTRS)
Lowry, Michael R.; Pressburger, Thomas; VanBaalen, Jeffrey; Roach, Steven
1997-01-01
Systematic software construction offers the potential of elevating software engineering from an art-form to an engineering discipline. The desired result is more predictable software development leading to better quality and more maintainable software. However, the overhead costs associated with the formalisms, mathematics, and methods of systematic software construction have largely precluded their adoption in real-world software development. In fact, many mainstream software development organizations, such as Microsoft, still maintain a predominantly oral culture for software development projects, which is far removed from a formalism-based culture for software development. An exception is the limited domain of safety-critical software, where the high-assurance inherent in systematic software construction justifies the additional cost. We believe that systematic software construction will only be adopted by mainstream software development organizations when the overhead costs have been greatly reduced. Two approaches to cost mitigation are reuse (amortizing costs over many applications) and automation. For the last four years, NASA Ames has funded the Amphion project, whose objective is to automate software reuse through techniques from systematic software construction. In particular, deductive program synthesis (i.e., program extraction from proofs) is used to derive a composition of software components (e.g., subroutines) that correctly implements a specification. The construction of reuse libraries of software components is the standard software engineering solution for improving software development productivity and quality.
An Optimizing Compiler for Petascale I/O on Leadership-Class Architectures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kandemir, Mahmut Taylan; Choudary, Alok; Thakur, Rajeev
In high-performance computing (HPC), parallel I/O architectures usually have very complex hierarchies with multiple layers that collectively constitute an I/O stack, including high-level I/O libraries such as PnetCDF and HDF5, I/O middleware such as MPI-IO, and parallel file systems such as PVFS and Lustre. Our DOE project explored automated instrumentation and compiler support for I/O intensive applications. Our project made significant progress towards understanding the complex I/O hierarchies of high-performance storage systems (including storage caches, HDDs, and SSDs), and designing and implementing state-of-the-art compiler/runtime system technology targeting I/O-intensive HPC applications on leadership-class machines. This final report summarizes the major achievements of the project and also points out promising future directions. Two new sections in this report compared to the previous report are IOGenie and SSD/NVM-specific optimizations.
Takeuchi, Masaki; Tsunoda, Hiromichi; Tanaka, Hideji; Shiramizu, Yoshimi
2011-01-01
This paper describes the performance of our automated acidic (CH3COOH, HCOOH, HCl, HNO2, SO2, and HNO3) gases monitor utilizing a parallel-plate wet denuder (PPWD). The PPWD quantitatively collects gaseous contaminants at a high sample flow rate (∼8 dm³ min⁻¹) compared to the conventional methods used in a clean room. Rapid response to any variability in the sample concentration enables near-real-time monitoring. In the developed monitor, the analyte collected with the PPWD is pumped into one of two preconcentration columns for 15 min, and determined by means of ion chromatography. While one preconcentration column is used for chromatographic separation, the other is used for loading the sample solution. The system allows continuous monitoring of the common acidic gases in an advanced semiconductor manufacturing clean room. © 2011 The Japan Society for Analytical Chemistry
Domain Decomposition By the Advancing-Partition Method
NASA Technical Reports Server (NTRS)
Pirzadeh, Shahyar Z.
2008-01-01
A new method of domain decomposition has been developed for generating unstructured grids in subdomains either sequentially or using multiple computers in parallel. Domain decomposition is a crucial and challenging step for parallel grid generation. Prior methods are generally based on auxiliary, complex, and computationally intensive operations for defining partition interfaces and usually produce grids of lower quality than those generated in single domains. The new technique, referred to as "Advancing Partition," is based on the Advancing-Front method, which partitions a domain as part of the volume mesh generation in a consistent and "natural" way. The benefits of this approach are: 1) the process of domain decomposition is highly automated, 2) partitioning of domain does not compromise the quality of the generated grids, and 3) the computational overhead for domain decomposition is minimal. The new method has been implemented in NASA's unstructured grid generation code VGRID.
Library of the Future: Croydon's New Central Library Complex.
ERIC Educational Resources Information Center
Batt, Chris
1993-01-01
A new library and cultural center in Croyden (England) is described. Function-based areas include library, administration, technical services, museum and galleries, museum offices and store, cinema, tourist information center, and local government offices. Information technology systems include the library management system, office automation, and…
Using Apex To Construct CPM-GOMS Models
NASA Technical Reports Server (NTRS)
John, Bonnie; Vera, Alonso; Matessa, Michael; Freed, Michael; Remington, Roger
2006-01-01
A process for automatically generating computational models of human/computer interactions as well as graphical and textual representations of the models has been built on the conceptual foundation of a method known in the art as CPM-GOMS. This method is so named because it combines (1) the task decomposition of analysis according to an underlying method known in the art as the goals, operators, methods, and selection (GOMS) method with (2) a model of human resource usage at the level of cognitive, perceptual, and motor (CPM) operations. CPM-GOMS models have made accurate predictions about behaviors of skilled computer users in routine tasks, but heretofore, such models have been generated in a tedious, error-prone manual process. In the present process, CPM-GOMS models are generated automatically from a hierarchical task decomposition expressed by use of a computer program, known as Apex, designed previously to be used to model human behavior in complex, dynamic tasks. An inherent capability of Apex for scheduling of resources automates the difficult task of interleaving the cognitive, perceptual, and motor resources that underlie common task operators (e.g., move and click mouse). The user interface of Apex automatically generates Program Evaluation Review Technique (PERT) charts, which enable modelers to visualize the complex parallel behavior represented by a model. Because interleaving and the generation of displays to aid visualization are automated, it is now feasible to construct arbitrarily long sequences of behaviors. The process was tested by using Apex to create a CPM-GOMS model of a relatively simple human/computer-interaction task and comparing the time predictions of the model and measurements of the times taken by human users in performing the various steps of the task. The task was to withdraw $80 in cash from an automated teller machine (ATM).
For the test, a Visual Basic mockup of an ATM was created, with a provision for input from (and measurement of the performance of) the user via a mouse. The times predicted by the automatically generated model turned out to approximate the measured times fairly well (see figure). While these results are promising, there is need for further development of the process. Moreover, it will also be necessary to test other, more complex models: The actions required of the user in the ATM task are too sequential to involve substantial parallelism and interleaving and, hence, do not serve as an adequate test of the unique strength of CPM-GOMS models to accommodate parallelism and interleaving.
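The scheduling that Apex automates amounts to a critical-path (PERT) computation over interleaved operators. The tasks, durations, and dependencies below are invented for illustration; a real CPM-GOMS model has far richer resource constraints.

```python
# Toy PERT/critical-path computation over interleaved operators.
# Task names, durations, and dependencies are hypothetical.

# task -> (duration in ms, list of predecessor tasks)
tasks = {
    "perceive-prompt": (100, []),
    "decide-amount":   (50,  ["perceive-prompt"]),
    "move-mouse":      (300, ["perceive-prompt"]),   # runs parallel to cognition
    "click":           (150, ["decide-amount", "move-mouse"]),
}

finish = {}
for name in tasks:  # insertion order is already topological here
    dur, preds = tasks[name]
    finish[name] = dur + max((finish[p] for p in preds), default=0)

print(f"predicted total time: {max(finish.values())} ms")
```

Because "move-mouse" overlaps "decide-amount", the predicted total (550 ms here) is shorter than the 600 ms a purely serial sum would give, which is exactly the parallelism CPM-GOMS models are built to capture.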
An automated microphysiological assay for toxicity evaluation.
Eggert, S; Alexander, F A; Wiest, J
2015-08-01
Screening a newly developed drug, food additive or cosmetic ingredient for toxicity is a critical preliminary step before it can move forward in the development pipeline. Due to the sometimes dire consequences when a harmful agent is overlooked, toxicologists work under strict guidelines to effectively catalogue and classify new chemical agents. Conventional assays involve long experimental hours and many manual steps that increase the probability of user error; errors that can potentially manifest as inaccurate toxicology results. Automated assays can overcome many potential mistakes that arise due to human error. In the presented work, we created and validated a novel, automated platform for a microphysiological assay that can examine cellular attributes with sensors measuring changes in cellular metabolic rate, oxygen consumption, and vitality mediated by exposure to a potentially toxic agent. The system was validated with low buffer culture medium with varied conductivities that caused changes in the measured impedance on integrated impedance electrodes.
Limberg, Brian J; Johnstone, Kevin; Filloon, Thomas; Catrenich, Carl
2016-09-01
Using United States Pharmacopeia-National Formulary (USP-NF) general method <1223> guidance, the Soleris® automated system and reagents (Nonfermenting Total Viable Count for bacteria and Direct Yeast and Mold for yeast and mold) were validated, using a performance equivalence approach, as an alternative to plate counting for total microbial content analysis using five representative microbes: Staphylococcus aureus, Bacillus subtilis, Pseudomonas aeruginosa, Candida albicans, and Aspergillus brasiliensis. Detection times (DTs) in the alternative automated system were linearly correlated to CFU/sample (R(2) = 0.94-0.97) with ≥70% accuracy per USP General Chapter <1223> guidance. The LOD and LOQ of the automated system were statistically similar to the traditional plate count method. This system was significantly more precise than plate counting (RSD 1.2-2.9% for DT, 7.8-40.6% for plate counts), was statistically comparable to plate counting with respect to variations in analyst, vial lots, and instruments, and was robust when variations in the operating detection thresholds (dTs; ±2 units) were used. The automated system produced accurate results, was more precise and less labor-intensive, and met or exceeded criteria for a valid alternative quantitative method, consistent with USP-NF general method <1223> guidance.
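The calibration described, detection time linearly correlated with log CFU per sample, can be sketched as an ordinary least-squares fit. The data points below are invented for illustration; only the linear DT-versus-log(CFU) form and the R² criterion come from the abstract.

```python
# Hedged sketch: fit detection time (DT, hours) against log10(CFU/sample)
# and report the slope and R^2. All data points are invented.

def linfit(xs, ys):
    """Ordinary least squares: return (slope, intercept, R^2)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return slope, intercept, 1 - ss_res / ss_tot

log_cfu = [1, 2, 3, 4, 5]                   # log10 CFU per sample (illustrative)
dt_hours = [18.1, 15.9, 14.2, 12.0, 10.1]   # detection times (illustrative)
slope, intercept, r2 = linfit(log_cfu, dt_hours)
print(round(slope, 2), round(r2, 3))
```

Higher inocula are detected sooner, so the slope is negative; an R² in the 0.94-0.97 range, as reported, would indicate a usable calibration curve.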
Kandaswamy, Umasankar; Rotman, Ziv; Watt, Dana; Schillebeeckx, Ian; Cavalli, Valeria; Klyachko, Vitaly
2013-01-01
High-resolution live-cell imaging studies of neuronal structure and function are characterized by large variability in image acquisition conditions, due to background and sample variations as well as low signal-to-noise ratio. The lack of automated image analysis tools that generalize across varying image acquisition conditions represents one of the main challenges in the field of biomedical image analysis. Specifically, segmentation of axonal/dendritic arborizations in brightfield or fluorescence imaging studies is extremely labor-intensive and still performed mostly manually. Here we describe a fully automated machine-learning approach based on textural analysis algorithms for segmenting neuronal arborizations in high-resolution brightfield images of live cultured neurons. We compare the performance of our algorithm to manual segmentation and show that it achieves 90% accuracy with similarly high levels of specificity and sensitivity. Moreover, the algorithm maintains high performance under a wide range of image acquisition conditions, indicating that it is largely condition-invariant. We further describe an application of this algorithm to fully automated synapse localization and classification in fluorescence imaging studies based on synaptic activity. A textural analysis-based machine-learning approach thus offers a high-performance, condition-invariant tool for automated neurite segmentation. PMID:23261652
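The intuition behind texture-based segmentation, that neurite regions differ from background in local intensity variation rather than in mean intensity, can be sketched with a simple local-variance threshold. This toy sketch is not the authors' algorithm (which trains a machine-learning classifier on textural features); the image values and threshold are invented.

```python
# Illustrative sketch only: segment 'textured' pixels by thresholding a
# local-variance map computed over a (2r+1)x(2r+1) neighborhood.

def local_variance(img, r=1):
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[yy][xx]
                    for yy in range(max(0, y - r), min(h, y + r + 1))
                    for xx in range(max(0, x - r), min(w, x + r + 1))]
            m = sum(vals) / len(vals)
            out[y][x] = sum((v - m) ** 2 for v in vals) / len(vals)
    return out

def segment(img, thresh):
    return [[1 if v > thresh else 0 for v in row] for row in local_variance(img)]

# Toy image: flat background (value 10) with a textured 'neurite' column.
img = [[10, 10, 30, 10, 10],
       [10, 10,  5, 10, 10],
       [10, 10, 30, 10, 10],
       [10, 10,  5, 10, 10]]
mask = segment(img, thresh=20.0)
print(mask)
```

The textured column is marked foreground while the flat background is not, even though both share the same mean intensity scale, which is why texture features tolerate background shifts better than plain intensity thresholds.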
Comparison of six different methods to calculate cell densities.
Camacho-Fernández, Carolina; Hervás, David; Rivas-Sendra, Alba; Marín, Mª Pilar; Seguí-Simarro, Jose M
2018-01-01
For in vitro culture of plant and animal cells, one of the critical steps is adjusting the initial cell density. A typical example is isolated microspore culture, where specific cell densities have been determined for different species. Outside these ranges, microspore growth is not induced, or is severely reduced. A similar situation occurs in many other plant and animal cell culture systems. Traditionally, researchers have used counting chambers (hemacytometers) to calculate cell densities, but little is known about their technical advantages, and even less information is available about alternative methods. In this work, using isolated eggplant microspore cultures and fluorescent beads (fluorospheres) as experimental systems, we performed a comprehensive comparison of six methods to calculate cell densities: (1) a Neubauer improved hemacytometer, (2) an automated cell counter, (3) a manual-counting method, and three flow cytometry methods based on (4) autofluorescence, (5) propidium iodide staining, and (6) side-scattered light (SSC). Our results show that, from a technical perspective, hemacytometers are the most reasonable option for cell counting, which may explain their widespread use. Automated cell counters represent a good compromise between precision and affordability, although with limited accuracy. Finally, the methods based on flow cytometry were by far the best in terms of reproducibility and agreement between them, but they showed deficient accuracy and precision. Together, our results provide a thorough technical evaluation of each counting method, offer unambiguous arguments for deciding which method is the most convenient for each laboratory, and, in general, shed light on the best way to determine cell densities for in vitro cell cultures. These findings are relevant not only to microspore culture, but also to any other plant cell culture procedure, and to any process involving particle counting.
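The hemacytometer calculation against which the other methods are compared can be sketched as follows, assuming the standard Neubauer improved geometry in which one large square covers 1 mm × 1 mm at 0.1 mm chamber depth (10⁻⁴ mL); the counts and dilution factor are illustrative.

```python
# Hedged sketch of the standard hemacytometer calculation. The 1e-4 mL volume
# per large square assumes Neubauer improved geometry; counts are invented.

def density_cells_per_ml(counts_per_square, dilution_factor=1.0):
    """Counts from several large squares -> cells/mL of the original suspension."""
    mean_count = sum(counts_per_square) / len(counts_per_square)
    return mean_count * dilution_factor * 1e4  # one large square holds 1e-4 mL

# Four large squares counted on a 1:2 diluted sample (illustrative).
print(density_cells_per_ml([42, 38, 45, 40], dilution_factor=2))  # → 825000.0
```

Averaging several squares before scaling is what gives the hemacytometer its precision; miscounting a single square shifts the estimate by dilution × 10⁴ cells/mL per cell.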
Chiu, King-Wah; Tsai, Ming-Chao; Wu, Keng-Liang; Chiu, Yi-Chun; Lin, Ming-Tzung; Hu, Tsung-Hui
2012-09-03
The instrument channels of gastrointestinal (GI) endoscopes may be heavily contaminated with bacteria even after high-level disinfection (HLD). The British Society of Gastroenterology guidelines emphasize the benefits of manually brushing endoscope channels and using automated endoscope reprocessors (AERs) for disinfecting endoscopes. In this study, we aimed to assess the effectiveness of decontamination using reprocessors after HLD by comparing cultured samples obtained from the biopsy channels (BCs) of GI endoscopes and the internal surfaces of AERs. We conducted a 5-year prospective study. Every month, random consecutive sampling was carried out after a complete reprocessing cycle; 420 rinse samples and 420 swab samples were collected from the BCs and the internal surfaces of the AERs, respectively. Of the 420 rinse samples collected from the BCs of the GI endoscopes, 300 were obtained from the BCs of gastroscopes and 120 from the BCs of colonoscopes. Samples were collected by flushing the BCs with sterile distilled water, and by swabbing the residual water from the AERs after reprocessing. These samples were cultured to detect the presence of aerobic and anaerobic bacteria and mycobacteria. The number of culture-positive samples obtained from BCs (13.6%, 57/420) was significantly higher than that obtained from AERs (1.7%, 7/420). In addition, the numbers of culture-positive samples obtained from the BCs of gastroscopes (10.7%, 32/300) and colonoscopes (20.8%, 25/120) were significantly higher than those obtained from the AERs used to reprocess gastroscopes (2.0%, 6/300) and colonoscopes (0.8%, 1/120). Culturing rinse samples obtained from BCs provides a better indication of the effectiveness of decontamination of GI endoscopes after HLD than culturing swab samples obtained from the inner surfaces of AERs, as the swab samples only indicate whether the AERs themselves are free from microbial contamination.
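The significance of 57/420 versus 7/420 culture-positive samples can be checked with a standard two-proportion z-test; this is an assumed choice of test for illustration, not necessarily the statistic used in the study.

```python
import math

# Hedged sketch (assumed two-proportion z-test under a pooled null) comparing
# culture-positive rates: BCs 57/420 vs AERs 7/420, as reported in the abstract.

def two_proportion_z(x1, n1, x2, n2):
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

z = two_proportion_z(57, 420, 7, 420)
print(round(z, 2))  # well above the 1.96 threshold for p < 0.05
```

The z-statistic of roughly 6.5 is consistent with the abstract's claim that the BC positivity rate is significantly higher than the AER rate.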
Liu, Jing; Saponjian, Yero; Mahoney, Mark M; Staley, Kevin J; Berdichevsky, Yevgeny
2017-01-01
Rodent organotypic hippocampal cultures spontaneously develop epileptiform activity after approximately 2 weeks in vitro and are increasingly used as a model of chronic post-traumatic epilepsy. However, organotypic cultures are maintained in an artificial environment (culture medium), which contains electrolytes, glucose, amino acids and other components that are not present at the same concentrations in cerebrospinal fluid (CSF). Therefore, it is possible that epileptogenesis in organotypic cultures is driven by these components. We examined the influence of medium composition on epileptogenesis. Epileptogenesis was evaluated by measurements of lactate and lactate dehydrogenase (LDH) levels (biomarkers of ictal activity and cell death, respectively) in spent culture media, immunohistochemistry and automated 3-D cell counts, and extracellular recordings from CA3 regions. Changes in culture medium components moderately influenced lactate and LDH levels as well as electrographic seizure burden and cell death. However, epileptogenesis occurred in any culture medium that was capable of supporting neural survival. We conclude that medium composition is unlikely to be the cause of epileptogenesis in the organotypic hippocampal culture model of chronic post-traumatic epilepsy.
Chavez, Sofia; Viviano, Joseph; Zamyadi, Mojdeh; Kingsley, Peter B; Kochunov, Peter; Strother, Stephen; Voineskos, Aristotle
2018-02-01
To develop a quality assurance (QA) tool (acquisition guidelines and automated processing) for diffusion tensor imaging (DTI) data using a common agar-based phantom used for fMRI QA. The goal is to produce a comprehensive set of automated, sensitive, and robust QA metrics. A readily available agar phantom was scanned with and without parallel imaging reconstruction. Other scanning parameters were matched to the human scans. A central slab, made up of either a thick slice or an average of a few slices, was extracted, and all processing was performed on that image. The proposed QA relies on the creation of two ROIs for processing: (i) a preset central circular region of interest (ccROI) and (ii) a signal mask for all images in the dataset. The ccROI enables computation of average signal for SNR calculations as well as average FA values. The signal masks enable automated measurements of eddy-current- and B0-inhomogeneity-induced distortions by exploiting the sphericity of the phantom. The signal masks also allow automated background localization to assess levels of Nyquist ghosting. The proposed DTI-QA was shown to produce eleven metrics that are robust yet sensitive to image quality changes within a site and to differences across sites. It can be performed in a reasonable amount of scan time (~15 min), and the code for automated processing has been made publicly available. A novel DTI-QA tool has been proposed and applied successfully to data from several scanners/platforms. The novelty lies in the exploitation of the sphericity of the phantom for distortion measurements. Other novel contributions are: the computation of an SNR value per gradient direction for the diffusion-weighted images (DWIs) and an SNR value per non-DWI, an automated background detection for the Nyquist ghosting measurement, and an error metric reflecting the contribution of EPI instability to the eddy-current-induced shape changes observed for DWIs. 
Copyright © 2017 Elsevier Inc. All rights reserved.
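The per-image SNR metric described above can be sketched as the mean signal inside the ccROI divided by the standard deviation of a background (air) region; the tool's exact definition may differ, and the intensity values below are illustrative.

```python
import statistics

# Hedged sketch of a common phantom SNR definition (assumed, not necessarily
# the tool's exact formula): mean ccROI signal / stdev of a background region,
# computed once per gradient direction. All intensities are invented.

def roi_snr(signal_roi, background_roi):
    return statistics.mean(signal_roi) / statistics.stdev(background_roi)

signal = [812, 805, 820, 798, 815]      # illustrative ccROI intensities (one DWI)
background = [3, 5, 4, 6, 2, 4, 5, 3]   # illustrative air-region intensities
print(round(roi_snr(signal, background), 1))
```

Repeating this per diffusion-weighted volume yields an SNR value per gradient direction, which is one of the metrics the QA tool reports.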
Potential Research and Development Synergies between Life Support and Planetary Protection
NASA Astrophysics Data System (ADS)
Lasseur, Ch.; Kminek, G.; Mergeay, M.
Long-term manned missions by our Russian colleagues have demonstrated the risks associated with microbial contamination. These risks concern both crew health, via contamination of the metabolic consumables (water, air), and also hardware degradation. Over the last six years, ESA and IBMP have developed a collaboration to elaborate and document these microbial contamination issues. The collaboration involved mutual exchanges of knowledge as well as microbial samples, and led up to the microbial survey of the Russian module of the ISS. Based on these results, and in addition to an external expert report commissioned by ESA, the agency initiated the development of a rapid and automated microbial detection and identification tool for use in future space missions. In parallel to these developments, and via several international meetings, planetary protection experts have agreed to place clear specifications on the microbial quality of future hardware landing on virgin planets, as well as to elaborate preliminary contamination requirements for manned missions on planetary surfaces. For these activities it is necessary to have a better understanding of microbial activity, to create culture collections, and to develop on-line detection tools. In this paper, we present in more depth the life support activities related to microbial issues, identify some potential synergies with planetary protection developments, and propose some pathways for collaboration between these two communities.
Xiang, Kun; Li, Yinglei; Ford, William; Land, Walker; Schaffer, J David; Congdon, Robert; Zhang, Jing; Sadik, Omowunmi
2016-02-21
We hereby report the design and implementation of an Autonomous Microbial Cell Culture and Classification (AMC(3)) system for rapid detection of food pathogens. Traditional food testing methods require multistep procedures and long incubation periods, and are thus prone to human error. AMC(3) introduces a "one-click" approach to the detection and classification of pathogenic bacteria. Once the cultured materials are prepared, all operations are automatic. AMC(3) is an integrated sensor array platform in a microbial fuel cell system composed of a multi-potentiostat, an automated data collection system (a Python program and a Yocto Maxi-coupler electromechanical relay module), and a powerful classification program. The classification scheme consists of Probabilistic Neural Network (PNN), Support Vector Machine (SVM), and General Regression Neural Network (GRNN) oracle-based systems. Differential Pulse Voltammetry (DPV) is performed on standard or unknown samples. Then, using preset feature extraction and quality control, accepted data are analyzed by the intelligent classification system. In a typical use, thirty-two extracted features were analyzed to correctly classify the following pathogens: Escherichia coli ATCC#25922, Escherichia coli ATCC#11775, and Staphylococcus epidermidis ATCC#12228. An accuracy of 85.4% was recorded for unknown samples, within a shorter time period than the industry standard of 24 hours.
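One simple way an oracle can combine several base classifiers is majority voting; the actual PNN/SVM/GRNN oracle is more sophisticated, so the sketch below, with invented predictions, only illustrates the ensemble idea.

```python
from collections import Counter

# Illustrative sketch only: a majority-vote stand-in for the PNN/SVM/GRNN
# oracle. Sample labels and per-classifier predictions are invented.

def oracle_vote(predictions):
    """predictions: one label per base classifier; return the majority label."""
    return Counter(predictions).most_common(1)[0][0]

# Each inner list holds the (PNN, SVM, GRNN) labels for one unknown sample.
samples = [
    ["E. coli 25922", "E. coli 25922", "S. epidermidis"],
    ["S. epidermidis", "S. epidermidis", "S. epidermidis"],
]
print([oracle_vote(s) for s in samples])  # → ['E. coli 25922', 'S. epidermidis']
```

A voting ensemble like this tends to outperform its weakest member when the base classifiers make uncorrelated errors, which is the usual motivation for combining several classifier families.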
Åkerfelt, Malin; Bayramoglu, Neslihan; Robinson, Sean; Toriseva, Mervi; Schukov, Hannu-Pekka; Härmä, Ville; Virtanen, Johannes; Sormunen, Raija; Kaakinen, Mika; Kannala, Juho; Eklund, Lauri; Heikkilä, Janne; Nees, Matthias
2015-01-01
Cancer-associated fibroblasts (CAFs) constitute an important part of the tumor microenvironment and promote invasion via paracrine functions and physical impact on the tumor. Although the importance of including CAFs into three-dimensional (3D) cell cultures has been acknowledged, computational support for quantitative live-cell measurements of complex cell cultures has been lacking. Here, we have developed a novel automated pipeline to model tumor-stroma interplay, track motility and quantify morphological changes of 3D co-cultures, in real-time live-cell settings. The platform consists of microtissues from prostate cancer cells, combined with CAFs in extracellular matrix that allows biochemical perturbation. Tracking of fibroblast dynamics revealed that CAFs guided the way for tumor cells to invade and increased the growth and invasiveness of tumor organoids. We utilized the platform to determine the efficacy of inhibitors in prostate cancer and the associated tumor microenvironment as a functional unit. Interestingly, certain inhibitors selectively disrupted tumor-CAF interactions, e.g. focal adhesion kinase (FAK) inhibitors specifically blocked tumor growth and invasion concurrently with fibroblast spreading and motility. This complex phenotype was not detected in other standard in vitro models. These results highlight the advantage of our approach, which recapitulates tumor histology and can significantly improve cancer target validation in vitro. PMID:26375443
Costa, Pedro F; Hutmacher, Dietmar W; Theodoropoulos, Christina; Gomes, Manuela E; Reis, Rui L; Vaquette, Cédryck
2015-04-22
The ability to test large arrays of cell and biomaterial combinations in 3D environments is still rather limited in the context of tissue engineering and regenerative medicine. This limitation can be generally addressed by employing highly automated and reproducible methodologies. This study reports on the development of a highly versatile and upscalable method based on additive manufacturing for the fabrication of arrays of scaffolds, which are enclosed into individualized perfusion chambers. Devices containing eight scaffolds and their corresponding bioreactor chambers are simultaneously fabricated utilizing a dual extrusion additive manufacturing system. To demonstrate the versatility of the concept, the scaffolds, while enclosed into the device, are subsequently surface-coated with a biomimetic calcium phosphate layer by perfusion with simulated body fluid solution. 96 scaffolds are simultaneously seeded and cultured with human osteoblasts under highly controlled bidirectional perfusion dynamic conditions over 4 weeks. Both coated and noncoated resulting scaffolds show homogeneous cell distribution and high cell viability throughout the 4 weeks culture period and CaP-coated scaffolds result in a significantly increased cell number. The methodology developed in this work exemplifies the applicability of additive manufacturing as a tool for further automation of studies in the field of tissue engineering and regenerative medicine. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
González, Jorge Ernesto; Radl, Analía; Romero, Ivonne; Barquinero, Joan Francesc; García, Omar; Di Giorgio, Marina
2016-12-01
Mitotic index (MI) estimation, expressed as the percentage of cells in mitosis, plays an important role as a quality control endpoint. To this end, MI is applied to check the lots of media and reagents to be used throughout the assay, and to check cellular viability after blood sample shipping, indicating satisfactory/unsatisfactory conditions for the progression of cell culture. The objective of this paper was to apply the open-source CellProfiler software to the automatic detection of mitotic figures and nuclei in digitized images of cultured human lymphocytes for MI assessment, and to compare its performance with semi-automatic and visual detection. Lymphocytes were irradiated and cultured for mitosis detection. Sets of images from cultures were analyzed visually, and the findings were compared with those obtained using CellProfiler. The CellProfiler pipeline detects nuclei and mitoses with 80% sensitivity and more than 99% specificity. We conclude that CellProfiler is a reliable tool for counting mitoses and nuclei in cytogenetic images; it saves considerable time compared to manual operation and reduces the variability derived from the scoring criteria of different scorers. The CellProfiler automated pipeline achieves good agreement with the visual counting workflow, i.e., it allows fully automated mitotic and nuclei scoring in cytogenetic images, yielding reliable information with minimal user intervention. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
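The two endpoints mentioned, the mitotic index and the pipeline's detection sensitivity, are both simple ratios; a minimal sketch with illustrative counts (not data from the study):

```python
# Minimal sketch of the endpoints described: mitotic index (mitoses as a
# percentage of all scored cells) and detection sensitivity. Counts invented.

def mitotic_index(n_mitoses, n_nuclei):
    """MI as a percentage of all scored cells (mitoses + interphase nuclei)."""
    return 100.0 * n_mitoses / (n_mitoses + n_nuclei)

def sensitivity(true_positives, false_negatives):
    """Fraction of real mitoses the automated pipeline actually detects."""
    return true_positives / (true_positives + false_negatives)

print(round(mitotic_index(45, 955), 1))  # → 4.5
print(sensitivity(80, 20))               # → 0.8  (the 80% figure reported)
```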
Joseph, Adrian; Goldrick, Stephen; Mollet, Michael; Turner, Richard; Bender, Jean; Gruber, David; Farid, Suzanne S; Titchener-Hooker, Nigel
2017-05-01
Continuous disk-stack centrifugation is typically used for the removal of cells and cellular debris from mammalian cell culture broths at manufacturing scale. The use of scale-down methods to characterise disk-stack centrifugation performance enables substantial reductions in material requirements and allows a much wider design space to be tested than is currently possible at pilot scale. Scaling down centrifugation has historically been challenging due to the difficulties in mimicking the energy dissipation rates (EDRs) of typical machines. This paper describes an alternative, easy-to-assemble, automated capillary-based methodology to generate levels of EDR consistent with those found in a continuous disk-stack centrifuge. Variations in EDR were achieved through changes in capillary internal diameter and the flow rate of operation through the capillary. The EDRs found to match the levels of shear in the feed zone of a pilot-scale centrifuge using the experimental method developed in this paper (2.4×10⁵ W/kg) are consistent with those obtained through previously published computational fluid dynamics (CFD) studies (2.0×10⁵ W/kg). Furthermore, this methodology can be incorporated into existing scale-down methods to model the process performance of continuous disk-stack centrifuges. This was demonstrated through the characterisation of the effects of culture hold time, culture temperature, and EDR on centrate quality. © 2017 The Authors. Biotechnology Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
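The dependence of capillary EDR on internal diameter and flow rate can be sketched from the generic definition ε = ΔP·Q/(ρV); under a fully developed laminar-flow assumption (Hagen-Poiseuille), this reduces to ε = 512 μ Q²/(π² ρ d⁶). The laminar assumption and the numeric operating point below are illustrative only; matching the ~2.4×10⁵ W/kg centrifuge feed-zone level likely involves other flow regimes.

```python
import math

# Hedged sketch: mean energy dissipation rate (EDR) per unit mass for fully
# developed laminar capillary flow, from eps = dP*Q/(rho*V) with the
# Hagen-Poiseuille pressure drop dP = 128*mu*L*Q/(pi*d^4):
#   eps = 512*mu*Q^2 / (pi^2 * rho * d^6)   (independent of capillary length L)
# Laminar assumption and operating point are illustrative, not the paper's.

def capillary_edr(mu, rho, q, d):
    """mu: Pa.s, rho: kg/m^3, q: m^3/s, d: capillary internal diameter (m)."""
    return 512.0 * mu * q ** 2 / (math.pi ** 2 * rho * d ** 6)

# Water-like fluid through a 0.5 mm capillary at 6 mL/min (illustrative).
eps = capillary_edr(mu=1e-3, rho=1000.0, q=1e-7, d=5e-4)
print(round(eps, 1))  # W/kg
```

The d⁻⁶ and Q² dependence shows why small changes in capillary internal diameter and flow rate give the wide range of EDRs the method exploits.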
Rameez, Shahid; Mostafa, Sigma S; Miller, Christopher; Shukla, Abhinav A
2014-01-01
Decreasing the timeframe for cell culture process development has been a key goal toward accelerating biopharmaceutical development. The Advanced Microscale Bioreactor (ambr™) is an automated micro-bioreactor system comprising miniature single-use bioreactors with a 10-15 mL working volume, controlled by an automated workstation. This system was compared to conventional bioreactor systems in terms of its performance for the production of a monoclonal antibody in a recombinant Chinese Hamster Ovary cell line. The miniaturized bioreactor system was found to produce cell culture profiles that matched those of 3 L, 15 L, and 200 L stirred-tank bioreactors. The processes used in this article involve complex feed formulations, perturbations, and strict process control within the design space, in line with processes used for commercial-scale manufacturing of biopharmaceuticals. Changes to important process parameters in ambr™ resulted in predictable changes in cell growth, viability, and titer, which were in good agreement with data from the conventional larger-scale bioreactors. ambr™ was found to successfully reproduce variations in temperature, dissolved oxygen (DO), and pH conditions similar to the larger bioreactor systems. Additionally, the miniature bioreactors were found to respond well to perturbations in pH and DO through adjustments to the proportional and integral control loop. The data presented here demonstrate the utility of the ambr™ system as a high-throughput system for cell culture process development. © 2014 American Institute of Chemical Engineers.
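The proportional-integral (PI) control loop mentioned for DO and pH can be sketched with a minimal discrete PI controller driving a toy first-order process back to its setpoint after a perturbation. The gains, process model, and values are illustrative assumptions, not the ambr™ implementation.

```python
# Illustrative discrete PI controller sketch (assumed gains and a toy
# first-order process response; not the ambr workstation's actual loop).

def pi_step(setpoint, measured, integral, kp=0.8, ki=0.2, dt=1.0):
    """One PI update: return (controller output, updated integral term)."""
    error = setpoint - measured
    integral += error * dt
    return kp * error + ki * integral, integral

do, integral = 20.0, 0.0       # DO perturbed well below a 40% setpoint
for _ in range(200):
    u, integral = pi_step(40.0, do, integral)
    do += 0.1 * u              # toy first-order process response to the output
print(round(do, 1))            # settles back at the setpoint
```

The integral term is what removes the steady-state offset a purely proportional controller would leave, which is why perturbation recovery is a standard check of PI tuning.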