Fojtu, Michaela; Gumulec, Jaromir; Balvan, Jan; Raudenska, Martina; Sztalmachova, Marketa; Polanska, Hana; Smerkova, Kristyna; Adam, Vojtech; Kizek, Rene; Masarik, Michal
2014-02-01
The determination of serum mRNA has gained considerable attention in recent years, particularly from the perspective of disease markers. Isolation with streptavidin-modified paramagnetic particles (SMPs) appears to be a promising technique, mainly owing to its high efficiency and potential for automation. The aim of this study was to optimize a serum isolation protocol to reduce the consumption of chemicals and sample volume. The following factors were optimized: the amounts of (i) paramagnetic particles, (ii) oligo(dT)20 probe, and (iii) serum, and (iv) the binding sequence (SMPs, oligo(dT)20, serum vs. oligo(dT)20, serum, and SMPs). RNA content was measured, and the expression of metallothionein-2A as a possible prostate cancer marker was analyzed to demonstrate measurable RNA content amenable to RT-PCR detection. Isolation is possible over a serum volume range of 10-200 μL without altering efficiency or purity. The amount of SMPs can be reduced to as little as 5 μL, with optimal results within 10-30 μL of SMPs. The volume of oligo(dT)20 does not affect efficiency when used within 0.1-0.4 μL. The optimized protocol was also modified to fit the needs of automated one-step single-tube analysis, with efficiency identical to the conventional setup. The one-step analysis protocol is considered a promising simplification, making RNA isolation suitable for automated processing. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Combined process automation for large-scale EEG analysis.
Sfondouris, John L; Quebedeaux, Tabitha M; Holdgraf, Chris; Musto, Alberto E
2012-01-01
Epileptogenesis is a dynamic process producing increased seizure susceptibility. Electroencephalography (EEG) data provides information critical in understanding the evolution of epileptiform changes throughout epileptic foci. We designed an algorithm to facilitate efficient large-scale EEG analysis via linked automation of multiple data processing steps. Using EEG recordings obtained from electrical stimulation studies, the following steps of EEG analysis were automated: (1) alignment and isolation of pre- and post-stimulation intervals, (2) generation of user-defined band frequency waveforms, (3) spike-sorting, (4) quantification of spike and burst data and (5) power spectral density analysis. This algorithm allows for quicker, more efficient EEG analysis. Copyright © 2011 Elsevier Ltd. All rights reserved.
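The abstract lists the automated steps but not their implementation. As a rough, hypothetical illustration of how steps (2), (3)/(4) and (5) of such a pipeline can be chained in code, the sketch below band-filters a signal, counts threshold-crossing spikes and estimates the power spectral density with SciPy; the sampling rate, band edges and spike threshold are placeholder values, not the authors' settings.

```python
import numpy as np
from scipy import signal

def analyze_epoch(eeg, fs=1000.0, band=(4.0, 12.0), spike_thresh_sd=4.0):
    """Illustrative EEG epoch analysis: band-filter, detect spikes, estimate PSD.

    eeg: 1-D array of samples for one pre- or post-stimulation interval.
    fs:  sampling rate in Hz (placeholder value).
    """
    # (2) user-defined band frequency waveform via a zero-phase Butterworth filter
    b, a = signal.butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    band_wave = signal.filtfilt(b, a, eeg)

    # (3)/(4) crude spike detection: peaks exceeding k standard deviations
    thresh = spike_thresh_sd * np.std(band_wave)
    spike_idx, _ = signal.find_peaks(np.abs(band_wave), height=thresh)

    # (5) power spectral density via Welch's method
    freqs, psd = signal.welch(eeg, fs=fs, nperseg=int(2 * fs))

    return {"spike_count": len(spike_idx),
            "spike_times_s": spike_idx / fs,
            "psd": (freqs, psd)}

if __name__ == "__main__":
    fs = 1000.0
    t = np.arange(0, 10, 1 / fs)
    demo = np.sin(2 * np.pi * 8 * t) + 0.3 * np.random.randn(t.size)
    print(analyze_epoch(demo, fs)["spike_count"])
```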
Automated matching software for clinical trials eligibility: measuring efficiency and flexibility.
Penberthy, Lynne; Brown, Richard; Puma, Federico; Dahman, Bassam
2010-05-01
Clinical trials (CT) serve as the medium that translates clinical research into standards of care. Low or slow recruitment leads to delays in the delivery of new therapies to the public. Determination of eligibility in all patients is one of the most important factors in assuring unbiased results from the clinical trials process and represents the first step in addressing the issues of underrepresentation and equal access to clinical trials. This pilot project evaluated the efficiency, flexibility, and generalizability of an automated clinical trials eligibility screening tool across 5 different clinical trials and clinical trial scenarios. There was a substantial total saving in research staff time spent evaluating patients for eligibility during the study period, ranging from 165 h to 1,329 h. There was a marked enhancement in efficiency with the automated system for all but one study in the pilot. The ratio of mean staff time required per eligible patient identified ranged from 0.8 to 19.4 for the manual versus the automated process. The results of this study demonstrate that automation offers an opportunity to reduce the burden of the manual processes required for CT eligibility screening and to ensure that all patients have an opportunity to be evaluated for participation in clinical trials as appropriate. The automated process greatly reduces the time spent on eligibility screening compared with the traditional manual process by effectively transferring the load of the eligibility assessment process to the computer. Copyright (c) 2010 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Nelson, Andrew
2010-11-01
The efficient use of complex neutron scattering instruments is often hindered by the complexity of their operating software. This complexity exists at each experimental step: data acquisition, reduction and analysis, with each step being as important as the previous. For example, whilst command line interfaces are powerful for automated acquisition, they often reduce accessibility for novice users and sometimes reduce efficiency for advanced users. One solution to this is the development of a graphical user interface which allows the user to operate the instrument by a simple and intuitive "push button" approach. This approach was taken by the Motofit software package for analysis of multiple contrast reflectometry data. Here we describe the extension of this package to cover the data acquisition and reduction steps for the Platypus time-of-flight neutron reflectometer. Consequently, the complete operation of the instrument is integrated into a single, easy-to-use program, leading to efficient instrument usage.
A Task Analytic Process to Define Future Concepts in Aviation
NASA Technical Reports Server (NTRS)
Gore, Brian Francis; Wolter, Cynthia A.
2014-01-01
A necessary step when developing next generation systems is to understand the tasks that operators will perform. One NextGen concept under evaluation, termed Single Pilot Operations (SPO), is designed to improve the efficiency of airline operations. One SPO concept includes a Pilot on Board (PoB), a Ground Station Operator (GSO), and automation. A number of procedural changes are likely to result when such changes in roles and responsibilities are undertaken. Automation is expected to relieve the PoB and GSO of some tasks (e.g., radio frequency changes, loading expected arrival information). A major difference in the SPO environment is the shift to communication-cued crosschecks (verbal/automated) rather than the movement-cued crosschecks that occur in a shared cockpit. The current article highlights a task analytic process of the roles and responsibilities between a PoB, an approach-phase GSO, and automation.
Transfection in perfused microfluidic cell culture devices: A case study.
Raimes, William; Rubi, Mathieu; Super, Alexandre; Marques, Marco P C; Veraitch, Farlan; Szita, Nicolas
2017-08-01
Automated microfluidic devices are a promising route towards a point-of-care autologous cell therapy. The initial steps of induced pluripotent stem cell (iPSC) derivation involve transfection and long term cell culture. Integration of these steps would help reduce the cost and footprint of micro-scale devices with applications in cell reprogramming or gene correction. Current examples of transfection integration focus on maximising efficiency rather than viable long-term culture. Here we look for whole process compatibility by integrating automated transfection with a perfused microfluidic device designed for homogeneous culture conditions. The injection process was characterised using fluorescein to establish a LabVIEW-based routine for user-defined automation. Proof-of-concept is demonstrated by chemically transfecting a GFP plasmid into mouse embryonic stem cells (mESCs). Cells transfected in the device showed an improvement in efficiency (34%, n = 3) compared with standard protocols (17.2%, n = 3). This represents a first step towards microfluidic processing systems for cell reprogramming or gene therapy.
A Recommendation Algorithm for Automating Corollary Order Generation
Klann, Jeffrey; Schadow, Gunther; McCoy, JM
2009-01-01
Manual development and maintenance of decision support content is time-consuming and expensive. We explore recommendation algorithms, e-commerce data-mining tools that use collective order history to suggest purchases, to assist with this. In particular, previous work shows corollary order suggestions are amenable to automated data-mining techniques. Here, an item-based collaborative filtering algorithm augmented with association rule interestingness measures mined suggestions from 866,445 orders made in an inpatient hospital in 2007, generating 584 potential corollary orders. Our expert physician panel evaluated the top 92 and agreed 75.3% were clinically meaningful. Also, at least one felt 47.9% would be directly relevant in guideline development. This automated generation of a rough-cut of corollary orders confirms prior indications about automated tools in building decision support content. It is an important step toward computerized augmentation to decision support development, which could increase development efficiency and content quality while automatically capturing local standards. PMID:20351875
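The abstract names the technique (item-based collaborative filtering augmented with association-rule interestingness measures) without showing it. The following minimal sketch illustrates that general idea on a toy order history, ranking candidate corollary orders by confidence and lift; it is not the authors' algorithm, and the example items are invented.

```python
from collections import Counter
from itertools import combinations

def corollary_candidates(orders, min_support=2):
    """Rank candidate corollary orders from co-occurrence in order history.

    orders: list of sets, each the set of items ordered for one encounter.
    Returns (antecedent, consequent, confidence, lift) tuples, best first.
    """
    n = len(orders)
    item_count = Counter()
    pair_count = Counter()
    for basket in orders:
        item_count.update(basket)
        for a, b in combinations(sorted(basket), 2):
            pair_count[(a, b)] += 1
            pair_count[(b, a)] += 1

    scored = []
    for (a, b), co in pair_count.items():
        if co < min_support:
            continue
        confidence = co / item_count[a]          # P(b | a)
        lift = confidence / (item_count[b] / n)  # P(b | a) / P(b)
        scored.append((a, b, confidence, lift))
    return sorted(scored, key=lambda r: (r[3], r[2]), reverse=True)

if __name__ == "__main__":
    history = [{"warfarin", "INR"}, {"warfarin", "INR", "CBC"}, {"CBC"}]
    for row in corollary_candidates(history, min_support=1):
        print(row)
```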
A recommendation algorithm for automating corollary order generation.
Klann, Jeffrey; Schadow, Gunther; McCoy, J M
2009-11-14
Manual development and maintenance of decision support content is time-consuming and expensive. We explore recommendation algorithms, e-commerce data-mining tools that use collective order history to suggest purchases, to assist with this. In particular, previous work shows corollary order suggestions are amenable to automated data-mining techniques. Here, an item-based collaborative filtering algorithm augmented with association rule interestingness measures mined suggestions from 866,445 orders made in an inpatient hospital in 2007, generating 584 potential corollary orders. Our expert physician panel evaluated the top 92 and agreed 75.3% were clinically meaningful. Also, at least one felt 47.9% would be directly relevant in guideline development. This automated generation of a rough-cut of corollary orders confirms prior indications about automated tools in building decision support content. It is an important step toward computerized augmentation to decision support development, which could increase development efficiency and content quality while automatically capturing local standards.
UltraMap: The All in One Photogrammetric Solution
NASA Astrophysics Data System (ADS)
Wiechert, A.; Gruber, M.; Karner, K.
2012-07-01
This paper describes in detail the dense matcher developed over several years by Vexcel Imaging in Graz for Microsoft's Bing Maps project. This dense matcher was developed exclusively for, and used by, Microsoft for the production of the 3D city models of Virtual Earth. It will now be made available to the public with the UltraMap software release in mid-2012, which represents a revolutionary step in digital photogrammetry. The dense matcher generates digital surface models (DSM) and digital terrain models (DTM) automatically from a set of overlapping UltraCam images. The models have an outstanding point density of several hundred points per square meter and sub-pixel accuracy. The dense matcher consists of two steps. The first step rectifies overlapping image areas to speed up the dense image matching process; this rectification ensures very efficient processing and detects occluded areas by applying a back-matching step. In the dense image matching process, a cost function consisting of a matching score as well as a smoothness term is minimized. In the second step, the resulting range image patches are fused into a DSM by optimizing a global cost function. The whole process is optimized for multi-core CPUs and optionally uses GPUs if available. UltraMap 3.0 also features an additional step which is presented in this paper: a completely automated true-ortho and ortho workflow. For this, the UltraCam images are combined with the DSM or DTM in an automated rectification step, resulting in high-quality true-ortho or ortho images from a highly automated workflow. The paper presents the new workflow and first results.
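The abstract describes the matching objective only at a high level (a matching score plus a smoothness term), and the UltraMap implementation itself is proprietary. The sketch below merely illustrates what such an energy can look like when evaluated for a candidate disparity map, using an absolute-difference data term and a total-variation smoothness term; it is a generic stereo example, not the product's matcher.

```python
import numpy as np

def matching_energy(left, right, disparity, smooth_lambda=0.1):
    """Generic stereo energy: data term (matching cost) + smoothness term.

    left, right: 2-D grayscale images (float arrays of equal shape).
    disparity:   integer disparity map, same shape, giving the horizontal shift
                 that maps a left-image pixel onto the right image.
    """
    h, w = left.shape
    cols = np.arange(w)

    # Data term: absolute intensity difference between corresponding pixels.
    data = 0.0
    for y in range(h):
        x_r = np.clip(cols - disparity[y], 0, w - 1)
        data += np.abs(left[y] - right[y, x_r]).sum()

    # Smoothness term: penalize disparity jumps between horizontal and
    # vertical neighbours (total-variation style regularization).
    smooth = (np.abs(np.diff(disparity, axis=0)).sum()
              + np.abs(np.diff(disparity, axis=1)).sum())

    return data + smooth_lambda * smooth

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    left = rng.random((8, 16))
    disp = np.full((8, 16), 2, dtype=int)
    right = np.roll(left, -2, axis=1)  # right image is the left image shifted by 2 px
    print(matching_energy(left, right, disp))
```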
How smart is your BEOL? productivity improvement through intelligent automation
NASA Astrophysics Data System (ADS)
Schulz, Kristian; Egodage, Kokila; Tabbone, Gilles; Garetto, Anthony
2017-07-01
The back end of line (BEOL) workflow in the mask shop still has crucial issues throughout all standard steps, which are inspection, disposition, photomask repair and verification of repair success. All involved tools are typically run by highly trained operators or engineers who set up jobs and recipes, execute tasks, analyze data and make decisions based on the results. No matter how experienced the operators are and how well the systems perform, there is one aspect that always limits the productivity and effectiveness of the operation: the human aspect. Human errors can range from seemingly harmless slip-ups to mistakes with serious and direct economic impact, including mask rejects, customer returns and line stops in the wafer fab. Even with the introduction of quality control mechanisms that help to reduce these critical but unavoidable faults, they can never be completely eliminated. Therefore the mask shop BEOL cannot run in the most efficient manner, as unnecessary time and money are spent on processes that remain labor intensive. The best way to address this issue is to automate critical segments of the workflow that are prone to human errors. In fact, manufacturing errors can occur at each BEOL step where operators intervene. These processes comprise image evaluation, setting up tool recipes, data handling and all other tedious but required steps. With the help of smart solutions, operators can work more efficiently and dedicate their time to less mundane tasks. Smart solutions connect tools, taking over the data handling and analysis typically performed by operators and engineers. These solutions not only eliminate the human error factor in the manufacturing process but can also provide benefits in terms of shorter cycle times, reduced bottlenecks and prediction of an optimized workflow. In addition, such software solutions consist of building blocks that seamlessly integrate applications and allow customers to use tailored solutions. To accommodate the variability and complexity in mask shops today, individual workflows can be supported according to the needs of any particular manufacturing line with respect to necessary measurement and production steps. At the same time, the efficiency of assets is increased by avoiding unneeded cycle time and waste of resources on process steps that are not crucial for a given technology. In this paper we present details of which areas of the BEOL can benefit most from intelligent automation, what solutions exist, and the quantification of benefits to a mask shop with full automation through the use of a back end of line model.
Efficiency of an automated reception and turnaround time management system for the phlebotomy room.
Yun, Soon Gyu; Shin, Jeong Won; Park, Eun Su; Bang, Hae In; Kang, Jung Gu
2016-01-01
Recent advances in laboratory information systems have largely been focused on automation. However, the phlebotomy services have not been completely automated. To address this issue, we introduced an automated reception and turnaround time (TAT) management system, for the first time in Korea, whereby the patient's information is transmitted directly to the actual phlebotomy site and the TAT for each phlebotomy step can be monitored at a glance. The GNT5 system (Energium Co., Ltd., Korea) was installed in June 2013. The automated reception and TAT management system has been in operation since February 2014. Integration of the automated reception machine with the GNT5 allowed for direct transmission of laboratory order information to the GNT5 without involving any manual reception step. We used the mean TAT from reception to actual phlebotomy as the parameter for evaluating the efficiency of our system. Mean TAT decreased from 5:45 min to 2:42 min after operationalization of the system. The mean number of patients in queue decreased from 2.9 to 1.0. Further, the number of cases taking more than five minutes from reception to phlebotomy, defined as the defect rate, decreased from 20.1% to 9.7%. The use of automated reception and TAT management system was associated with a decrease of overall TAT and an improved workflow at the phlebotomy room.
Efficiency of an Automated Reception and Turnaround Time Management System for the Phlebotomy Room
Yun, Soon Gyu; Park, Eun Su; Bang, Hae In; Kang, Jung Gu
2016-01-01
Background Recent advances in laboratory information systems have largely been focused on automation. However, the phlebotomy services have not been completely automated. To address this issue, we introduced an automated reception and turnaround time (TAT) management system, for the first time in Korea, whereby the patient's information is transmitted directly to the actual phlebotomy site and the TAT for each phlebotomy step can be monitored at a glance. Methods The GNT5 system (Energium Co., Ltd., Korea) was installed in June 2013. The automated reception and TAT management system has been in operation since February 2014. Integration of the automated reception machine with the GNT5 allowed for direct transmission of laboratory order information to the GNT5 without involving any manual reception step. We used the mean TAT from reception to actual phlebotomy as the parameter for evaluating the efficiency of our system. Results Mean TAT decreased from 5:45 min to 2:42 min after operationalization of the system. The mean number of patients in queue decreased from 2.9 to 1.0. Further, the number of cases taking more than five minutes from reception to phlebotomy, defined as the defect rate, decreased from 20.1% to 9.7%. Conclusions The use of automated reception and TAT management system was associated with a decrease of overall TAT and an improved workflow at the phlebotomy room. PMID:26522759
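For concreteness, the two reported metrics (mean reception-to-phlebotomy TAT and the defect rate, i.e. the share of cases exceeding five minutes) can be computed from timestamp pairs as in the small sketch below; the data and layout are illustrative and not taken from the GNT5 system.

```python
from datetime import datetime, timedelta

def tat_metrics(cases, defect_limit=timedelta(minutes=5)):
    """cases: list of (reception_time, phlebotomy_time) datetime pairs."""
    tats = [phleb - recv for recv, phleb in cases]
    mean_tat = sum(tats, timedelta()) / len(tats)
    defect_rate = sum(t > defect_limit for t in tats) / len(tats)
    return mean_tat, defect_rate

if __name__ == "__main__":
    fmt = "%H:%M:%S"
    demo = [(datetime.strptime(a, fmt), datetime.strptime(b, fmt))
            for a, b in [("08:00:00", "08:02:30"),
                         ("08:01:00", "08:07:10"),
                         ("08:03:00", "08:05:00")]]
    mean_tat, defects = tat_metrics(demo)
    print(mean_tat, f"{defects:.0%}")
```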
Does the use of automated fetal biometry improve clinical work flow efficiency?
Espinoza, Jimmy; Good, Sara; Russell, Evie; Lee, Wesley
2013-05-01
This study was designed to compare the work flow efficiency of manual measurements of 5 fetal parameters with a novel technique that automatically measures these parameters from 2-dimensional sonograms. This prospective study included 200 singleton pregnancies between 15 and 40 weeks' gestation. Patients were randomly allocated to either manual (n = 100) or automatic (n = 100) fetal biometry. The automatic measurement was performed using a commercially available software application. A digital video recorder captured all on-screen activity associated with the sonographic examination. The examination time and number of steps required to obtain fetal measurements were compared between manual and automatic methods. The mean time required to obtain the biometric measurements was significantly shorter using the automated technique than the manual approach (P < .001 for all comparisons). Similarly, the mean number of steps required to perform these measurements was significantly fewer with automatic measurements compared to the manual technique (P < .001). In summary, automated biometry reduced the examination time required for standard fetal measurements. This approach may improve work flow efficiency in busy obstetric sonography practices.
Seaberg, R C; Statland, B E; Stallone, R O
1999-06-01
Lab automation and consolidation can be a daunting, risky, major reengineering project. Done right, it can mean decreased labor costs and space requirements, increased test volume, and more efficient use of personnel. See how this health system got the job done using a carefully defined, seven-step plan.
Automation for Accommodating Fuel-Efficient Descents in Constrained Airspace
NASA Technical Reports Server (NTRS)
Coopenbarger, Richard A.
2010-01-01
Continuous descents at low engine power are desired to reduce fuel consumption, emissions and noise during arrival operations. The challenge is to allow airplanes to fly these types of efficient descents without interruption during busy traffic conditions. During busy conditions today, airplanes are commonly forced to fly inefficient, step-down descents as air traffic controllers work to ensure separation and maximize throughput. NASA, in collaboration with government and industry partners, is developing new automation to help controllers accommodate continuous descents in the presence of complex traffic and airspace constraints. This automation relies on accurate trajectory predictions to compute strategic maneuver advisories. The talk will describe the concept behind this new automation and provide an overview of the simulations and flight testing used to develop and refine its underlying technology.
Automated assay for screening the enzymatic release of reducing sugars from micronized biomass.
Navarro, David; Couturier, Marie; da Silva, Gabriela Ghizzi Damasceno; Berrin, Jean-Guy; Rouau, Xavier; Asther, Marcel; Bignon, Christophe
2010-07-16
To reduce the production cost of bioethanol obtained from fermentation of the sugars provided by degradation of lignocellulosic biomass (i.e., second generation bioethanol), it is necessary to screen for new enzymes endowed with more efficient biomass degrading properties. This demands the set-up of high-throughput screening methods. Several methods have been devised all using microplates in the industrial SBS format. Although this size reduction and standardization has greatly improved the screening process, the published methods comprise one or more manual steps that seriously decrease throughput. Therefore, we worked to devise a screening method devoid of any manual steps. We describe a fully automated assay for measuring the amount of reducing sugars released by biomass-degrading enzymes from wheat-straw and spruce. The method comprises two independent and automated steps. The first step is the making of "substrate plates". It consists of filling 96-well microplates with slurry suspensions of micronized substrate which are then stored frozen until use. The second step is an enzymatic activity assay. After thawing, the substrate plates are supplemented by the robot with cell-wall degrading enzymes where necessary, and the whole process from addition of enzymes to quantification of released sugars is autonomously performed by the robot. We describe how critical parameters (amount of substrate, amount of enzyme, incubation duration and temperature) were selected to fit with our specific use. The ability of this automated small-scale assay to discriminate among different enzymatic activities was validated using a set of commercial enzymes. Using an automatic microplate sealer solved three main problems generally encountered during the set-up of methods for measuring the sugar-releasing activity of plant cell wall-degrading enzymes: throughput, automation, and evaporation losses. In its present set-up, the robot can autonomously process 120 triplicate wheat-straw samples per day. This throughput can be doubled if the incubation time is reduced from 24 h to 4 h (for initial rates measurements, for instance). This method can potentially be used with any insoluble substrate that is micronizable. A video illustrating the method can be seen at the following URL: http://www.youtube.com/watch?v=NFg6TxjuMWU.
Energy conservation and management system using efficient building automation
NASA Astrophysics Data System (ADS)
Ahmed, S. Faiz; Hazry, D.; Tanveer, M. Hassan; Joyo, M. Kamran; Warsi, Faizan A.; Kamarudin, H.; Wan, Khairunizam; Razlan, Zuradzman M.; Shahriman A., B.; Hussain, A. T.
2015-05-01
In countries where the gap between electricity demand and supply is large and people are forced to endure increasing hours of load shedding, unnecessary consumption of electricity makes matters even worse, so the importance of and need for electricity conservation increase substantially. This paper outlines a step towards the conservation of energy in general, and electricity in particular, by employing an efficient building automation technique. With careful design and implementation of a building automation system, energy consumption can be reduced by up to 30-40%, which makes a substantial difference in energy savings. In this study the concept is verified by performing experiments on a prototype experimental room and by implementing the efficient building automation technique. A programmable logic controller (PLC) is employed as the main controller, monitoring various system parameters and controlling appliances as required. The hardware test run and experimental findings further clarify and prove the concept. An added advantage of this approach is that it can be implemented in both small and medium-sized domestic homes, thus greatly reducing unnecessary load on the utility provider.
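The paper implements the control logic on a PLC; purely as a hypothetical illustration of the monitor-then-actuate pattern it describes (read sensors, switch appliances only when needed), the following Python sketch mimics one control scan. Sensor names and thresholds are invented and are not the authors' configuration.

```python
from dataclasses import dataclass

@dataclass
class RoomState:
    occupied: bool
    lux: float          # ambient light level
    temperature_c: float

def scan_cycle(state, lux_on_below=300.0, temp_on_above=26.0):
    """One control scan: decide appliance outputs from current sensor inputs.

    Mirrors the monitor-then-actuate pattern of a PLC scan cycle; thresholds
    are illustrative only.
    """
    lights_on = state.occupied and state.lux < lux_on_below
    cooling_on = state.occupied and state.temperature_c > temp_on_above
    return {"lights": lights_on, "cooling": cooling_on}

if __name__ == "__main__":
    print(scan_cycle(RoomState(occupied=True, lux=120.0, temperature_c=27.5)))
    print(scan_cycle(RoomState(occupied=False, lux=120.0, temperature_c=27.5)))
```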
Walsh, Kristin E; Chui, Michelle Anne; Kieser, Mara A; Williams, Staci M; Sutter, Susan L; Sutter, John G
2011-01-01
To explore community pharmacy technician workflow change after implementation of an automated robotic prescription-filling device. At an independent community pharmacy in rural Mayville, WI, pharmacy technicians were observed before and 3 months after installation of an automated robotic prescription-filling device. The main outcome measures were sequences and timing of technician workflow steps, workflow interruptions, automation surprises, and workarounds. Of the 77 and 80 observations made before and 3 months after robot installation, respectively, 17 different workflow sequences were observed before installation and 38 after installation. Average prescription filling time was reduced by 40 seconds per prescription with use of the robot. Workflow interruptions per observation increased from 1.49 to 1.79 (P = 0.11), and workarounds increased from 10% to 36% after robot use. Although automated prescription-filling devices can increase efficiency, workflow interruptions and workarounds may negate that efficiency. Assessing changes in workflow and sequencing of tasks that may result from the use of automation can help uncover opportunities for workflow policy and procedure redesign.
Zhao, Shanrong; Xi, Li; Quan, Jie; Xi, Hualin; Zhang, Ying; von Schack, David; Vincent, Michael; Zhang, Baohong
2016-01-08
RNA sequencing (RNA-seq), a next-generation sequencing technique for transcriptome profiling, is being increasingly used, in part driven by the decreasing cost of sequencing. Nevertheless, the analysis of the massive amounts of data generated by large-scale RNA-seq remains a challenge. Multiple algorithms pertinent to basic analyses have been developed, and there is an increasing need to automate the use of these tools so as to obtain results in an efficient and user-friendly manner. Increased automation and improved visualization of the results will help make the results and findings of the analyses readily available to experimental scientists. By combining the best open source tools developed for RNA-seq data analyses and the most advanced web 2.0 technologies, we have implemented QuickRNASeq, a pipeline for large-scale RNA-seq data analyses and visualization. The QuickRNASeq workflow consists of three main steps. In Step #1, each individual sample is processed, including mapping RNA-seq reads to a reference genome, counting the numbers of mapped reads, quality control of the aligned reads, and SNP (single nucleotide polymorphism) calling. Step #1 is computationally intensive, and can be processed in parallel. In Step #2, the results from individual samples are merged, and an integrated and interactive project report is generated. All analysis results in the report are accessible via a single HTML entry webpage. Step #3 is the data interpretation and presentation step. The rich visualization features implemented here allow end users to interactively explore the results of RNA-seq data analyses, and to gain more insights into RNA-seq datasets. In addition, we used a real-world dataset to demonstrate the simplicity and efficiency of QuickRNASeq in RNA-seq data analyses and interactive visualizations. The seamless integration of automated capabilities with interactive visualizations in QuickRNASeq is not available in other published RNA-seq pipelines. The high degree of automation and interactivity in QuickRNASeq leads to a substantial reduction in the time and effort required prior to further downstream analyses and interpretation of the analysis findings. QuickRNASeq advances primary RNA-seq data analyses to the next level of automation, and is mature for public release and adoption.
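QuickRNASeq wraps existing alignment, counting, QC and SNP-calling tools, so the sketch below only illustrates the shape of the workflow: Step #1 processed per sample in parallel, Step #2 merging the per-sample results into one matrix. The per-sample function is a stub that stands in for the real tool invocations; nothing here is the pipeline's actual code.

```python
from concurrent.futures import ProcessPoolExecutor
from pathlib import Path
import random

def process_sample(sample):
    """Step #1 stand-in: per-sample mapping, counting, QC and SNP calling.

    The real pipeline shells out to an aligner and a read counter here; this
    stub just writes a tiny gene-count file so the workflow can be run as-is.
    """
    random.seed(sample)
    counts = {g: random.randint(0, 500) for g in ("GENE1", "GENE2", "GENE3")}
    out = Path(f"{sample}.counts")
    out.write_text("\n".join(f"{g}\t{c}" for g, c in counts.items()))
    return out

def merge_counts(count_files, out="merged_counts.tsv"):
    """Step #2 (simplified): join per-sample counts into one gene x sample matrix."""
    samples = [f.stem for f in count_files]
    table = {}
    for f in count_files:
        for line in f.read_text().splitlines():
            gene, count = line.split("\t")
            table.setdefault(gene, {})[f.stem] = count
    with open(out, "w") as fh:
        fh.write("gene\t" + "\t".join(samples) + "\n")
        for gene, row in sorted(table.items()):
            fh.write(gene + "\t" + "\t".join(row.get(s, "0") for s in samples) + "\n")

if __name__ == "__main__":
    names = ["sampleA", "sampleB", "sampleC"]
    with ProcessPoolExecutor() as pool:      # Step #1 runs per sample in parallel
        files = list(pool.map(process_sample, names))
    merge_counts(files)                      # Step #2 merges the per-sample results
    print(Path("merged_counts.tsv").read_text())
```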
NASA Astrophysics Data System (ADS)
Koziel, Slawomir; Bekasiewicz, Adrian
2018-02-01
In this article, a simple yet efficient and reliable technique for fully automated multi-objective design optimization of antenna structures using sequential domain patching (SDP) is discussed. The optimization procedure according to SDP is a two-step process: (i) obtaining the initial set of Pareto-optimal designs representing the best possible trade-offs between considered conflicting objectives, and (ii) Pareto set refinement for yielding the optimal designs at the high-fidelity electromagnetic (EM) simulation model level. For the sake of computational efficiency, the first step is realized at the level of a low-fidelity (coarse-discretization) EM model by sequential construction and relocation of small design space segments (patches) in order to create a path connecting the extreme Pareto front designs obtained beforehand. The second stage involves response correction techniques and local response surface approximation models constructed by reusing EM simulation data acquired in the first step. A major contribution of this work is an automated procedure for determining the patch dimensions. It allows for appropriate selection of the number of patches for each geometry variable so as to ensure reliability of the optimization process while maintaining its low cost. The importance of this procedure is demonstrated by comparing it with uniform patch dimensions.
Maximizing coupling-efficiency of high-power diode lasers utilizing hybrid assembly technology
NASA Astrophysics Data System (ADS)
Zontar, D.; Dogan, M.; Fulghum, S.; Müller, T.; Haag, S.; Brecher, C.
2015-03-01
In this paper, we present hybrid assembly technology to maximize coupling efficiency for spatially combined laser systems. High quality components, such as center-turned focusing units, as well as suitable assembly strategies are necessary to obtain the highest possible output ratios. Alignment strategies are challenging tasks due to their complexity and sensitivity. Especially in low-volume production, fully automated systems are economically at a disadvantage, as operator experience is often expensive. However, the reproducibility and quality of automatically assembled systems can be superior. Therefore automated and manual assembly techniques are combined to obtain high coupling efficiency while preserving maximum flexibility. The paper will describe the necessary equipment and software to enable hybrid assembly processes. Micromanipulator technology with high step-resolution and six degrees of freedom provides a large number of possible evaluation points. Automated algorithms are necessary to speed up data gathering and alignment to efficiently utilize the available granularity for manual assembly processes. Furthermore, an engineering environment is presented to enable rapid prototyping of automation tasks with simultaneous data evaluation. Integration with simulation environments, e.g. Zemax, allows the verification of assembly strategies in advance. Data-driven decision making ensures constant high quality, documents the assembly process and is a basis for further improvement. The hybrid assembly technology has been applied in several applications with efficiencies above 80% and will be discussed in this paper. High coupling efficiency has been achieved with minimized assembly as a result of semi-automated alignment. This paper will focus on hybrid automation for optimizing and attaching turning mirrors and collimation lenses.
Ultrasound: a subexploited tool for sample preparation in metabolomics.
Luque de Castro, M D; Delgado-Povedano, M M
2014-01-02
Metabolomics, one of the most recently emerged "omics", has taken advantage of ultrasound (US) to improve sample preparation (SP) steps. The combination of metabolomics and US-assisted SP has developed unevenly, depending on the area (plant or animal) and the SP step. Plant metabolomics and US-assisted leaching have received the greatest attention (encompassing subdisciplines such as metallomics, xenometabolomics and, mainly, lipidomics), but liquid-liquid extraction and (bio)chemical reactions in metabolomics have also taken advantage of US energy. Clinical and animal samples have likewise benefited from US-assisted SP in metabolomics studies, but to a lesser extent. The main effects of US have been shortening of the time required for the given step and/or an increase in its efficiency or its suitability for automation; nevertheless, attention paid to potential degradation caused by US has been scant or nil. Achievements and weak points of combining metabolomics with US-assisted SP are discussed, and possible solutions to the present shortcomings are proposed. Copyright © 2013 Elsevier B.V. All rights reserved.
Phase 1 of the automated array assembly task of the low cost silicon solar array project
NASA Technical Reports Server (NTRS)
Coleman, M. G.; Pryor, R. A.; Grenon, L. A.; Lesk, I. A.
1977-01-01
The state of technology readiness for the automated production of solar cells and modules is reviewed. Individual process steps and process sequences for making solar cells and modules were evaluated both technically and economically. High efficiency with a suggested cell goal of 15% was stressed. It is concluded that the technology exists to manufacture solar cells which will meet program goals.
Oxygen-controlled automated neural differentiation of mouse embryonic stem cells.
Mondragon-Teran, Paul; Tostoes, Rui; Mason, Chris; Lye, Gary J; Veraitch, Farlan S
2013-03-01
Automation and oxygen tension control are two tools that provide significant improvements to the reproducibility and efficiency of stem cell production processes. The aim of this study was to establish a novel automation platform capable of controlling oxygen tension during both the cell-culture and liquid-handling steps of neural differentiation processes. We built a bespoke automation platform, which enclosed a liquid-handling platform in a sterile, oxygen-controlled environment. An airtight connection was used to transfer cell culture plates to and from an automated oxygen-controlled incubator. Our results demonstrate that our system yielded comparable cell numbers, viabilities, metabolism profiles and differentiation efficiencies when compared with traditional manual processes. Interestingly, eliminating exposure to ambient conditions during the liquid-handling stage resulted in significant improvements in the yield of MAP2-positive neural cells, indicating that this level of control can improve differentiation processes. This article describes, for the first time, an automation platform capable of maintaining oxygen tension control during both the cell-culture and liquid-handling stages of a 2D embryonic stem cell differentiation process.
Walsh, Kristin E.; Chui, Michelle Anne; Kieser, Mara A.; Williams, Staci M.; Sutter, Susan L.; Sutter, John G.
2012-01-01
Objective To explore community pharmacy technician workflow change after implementation of an automated robotic prescription-filling device. Methods At an independent community pharmacy in rural Mayville, WI, pharmacy technicians were observed before and 3 months after installation of an automated robotic prescription-filling device. The main outcome measures were sequences and timing of technician workflow steps, workflow interruptions, automation surprises, and workarounds. Results Of the 77 and 80 observations made before and 3 months after robot installation, respectively, 17 different workflow sequences were observed before installation and 38 after installation. Average prescription filling time was reduced by 40 seconds per prescription with use of the robot. Workflow interruptions per observation increased from 1.49 to 1.79 (P = 0.11), and workarounds increased from 10% to 36% after robot use. Conclusion Although automated prescription-filling devices can increase efficiency, workflow interruptions and workarounds may negate that efficiency. Assessing changes in workflow and sequencing of tasks that may result from the use of automation can help uncover opportunities for workflow policy and procedure redesign. PMID:21896459
Automated assay for screening the enzymatic release of reducing sugars from micronized biomass
2010-01-01
Background To reduce the production cost of bioethanol obtained from fermentation of the sugars provided by degradation of lignocellulosic biomass (i.e., second generation bioethanol), it is necessary to screen for new enzymes endowed with more efficient biomass degrading properties. This demands the set-up of high-throughput screening methods. Several methods have been devised all using microplates in the industrial SBS format. Although this size reduction and standardization has greatly improved the screening process, the published methods comprise one or more manual steps that seriously decrease throughput. Therefore, we worked to devise a screening method devoid of any manual steps. Results We describe a fully automated assay for measuring the amount of reducing sugars released by biomass-degrading enzymes from wheat-straw and spruce. The method comprises two independent and automated steps. The first step is the making of "substrate plates". It consists of filling 96-well microplates with slurry suspensions of micronized substrate which are then stored frozen until use. The second step is an enzymatic activity assay. After thawing, the substrate plates are supplemented by the robot with cell-wall degrading enzymes where necessary, and the whole process from addition of enzymes to quantification of released sugars is autonomously performed by the robot. We describe how critical parameters (amount of substrate, amount of enzyme, incubation duration and temperature) were selected to fit with our specific use. The ability of this automated small-scale assay to discriminate among different enzymatic activities was validated using a set of commercial enzymes. Conclusions Using an automatic microplate sealer solved three main problems generally encountered during the set-up of methods for measuring the sugar-releasing activity of plant cell wall-degrading enzymes: throughput, automation, and evaporation losses. In its present set-up, the robot can autonomously process 120 triplicate wheat-straw samples per day. This throughput can be doubled if the incubation time is reduced from 24 h to 4 h (for initial rates measurements, for instance). This method can potentially be used with any insoluble substrate that is micronizable. A video illustrating the method can be seen at the following URL: http://www.youtube.com/watch?v=NFg6TxjuMWU PMID:20637080
Automation, consolidation, and integration in autoimmune diagnostics.
Tozzoli, Renato; D'Aurizio, Federica; Villalta, Danilo; Bizzaro, Nicola
2015-08-01
Over the past two decades, we have witnessed an extraordinary change in autoimmune diagnostics, characterized by the progressive evolution of analytical technologies, the availability of new tests, and the explosive growth of molecular biology and proteomics. Aside from these huge improvements, organizational changes have also occurred which brought about a more modern vision of the autoimmune laboratory. The introduction of automation (for harmonization of testing, reduction of human error, reduction of handling steps, increase of productivity, decrease of turnaround time, improvement of safety), consolidation (combining different analytical technologies or strategies on one instrument or on one group of connected instruments) and integration (linking analytical instruments or group of instruments with pre- and post-analytical devices) opened a new era in immunodiagnostics. In this article, we review the most important changes that have occurred in autoimmune diagnostics and present some models related to the introduction of automation in the autoimmunology laboratory, such as automated indirect immunofluorescence and changes in the two-step strategy for detection of autoantibodies; automated monoplex immunoassays and reduction of turnaround time; and automated multiplex immunoassays for autoantibody profiling.
Zhao, Wenle; Pauls, Keith
2016-04-01
Centralized outcome adjudication has been used widely in multicenter clinical trials in order to prevent potential biases and to reduce variations in important safety and efficacy outcome assessments. Adjudication procedures could vary significantly among different studies. In practice, the coordination of outcome adjudication procedures in many multicenter clinical trials remains as a manual process with low efficiency and high risk of delay. Motivated by the demands from two large clinical trial networks, a generic outcome adjudication module has been developed by the network's data management center within a homegrown clinical trial management system. In this article, the system design strategy and database structure are presented. A generic database model was created to transfer different adjudication procedures into a unified set of sequential adjudication steps. Each adjudication step was defined by one activate condition, one lock condition, one to five categorical data items to capture adjudication results, and one free text field for general comments. Based on this model, a generic outcome adjudication user interface and a generic data processing program were developed within a homegrown clinical trial management system to provide automated coordination of outcome adjudication. By the end of 2014, this generic outcome adjudication module had been implemented in 10 multicenter trials. A total of 29 adjudication procedures were defined with the number of adjudication steps varying from 1 to 7. The implementation of a new adjudication procedure in this generic module took an experienced programmer 1 or 2 days. A total of 7336 outcome events had been adjudicated and 16,235 adjudication step activities had been recorded. In a multicenter trial, 1144 safety outcome event submissions went through a three-step adjudication procedure and reported a median of 3.95 days from safety event case report form submission to adjudication completion. In another trial, 277 clinical outcome events were adjudicated by a six-step procedure and took a median of 23.84 days from outcome event case report form submission to adjudication procedure completion. A generic outcome adjudication module integrated in the clinical trial management system made the automated coordination of efficacy and safety outcome adjudication a reality. © The Author(s) 2015.
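The article specifies the generic step model (one activate condition, one lock condition, one to five categorical result items, and a free-text comment) but not its schema. The sketch below is one hypothetical way to express that model as a data structure; field names, conditions and values are invented for illustration and do not reflect the actual database design.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class AdjudicationStep:
    """One step of a study-specific adjudication procedure (illustrative model)."""
    name: str
    activate_condition: Callable[[dict], bool]   # when the step becomes available
    lock_condition: Callable[[dict], bool]       # when the step can no longer change
    categorical_items: Dict[str, List[str]]      # 1-5 items, each with allowed values
    results: Dict[str, str] = field(default_factory=dict)
    comment: str = ""

    def record(self, event: dict, **answers: str) -> None:
        if not self.activate_condition(event):
            raise RuntimeError(f"step '{self.name}' is not yet active")
        if self.lock_condition(event):
            raise RuntimeError(f"step '{self.name}' is locked")
        for item, value in answers.items():
            if value not in self.categorical_items[item]:
                raise ValueError(f"'{value}' not allowed for item '{item}'")
            self.results[item] = value

if __name__ == "__main__":
    step1 = AdjudicationStep(
        name="site review",
        activate_condition=lambda ev: ev.get("crf_submitted", False),
        lock_condition=lambda ev: ev.get("procedure_complete", False),
        categorical_items={"event_confirmed": ["yes", "no", "uncertain"]},
    )
    event = {"crf_submitted": True, "procedure_complete": False}
    step1.record(event, event_confirmed="yes")
    print(step1.results)
```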
Thermal Model Development for Ares I-X
NASA Technical Reports Server (NTRS)
Amundsen, Ruth M.; DelCorso, Joe
2008-01-01
Thermal analysis for the Ares I-X vehicle has involved extensive thermal model integration, since thermal models of vehicle elements came from several different NASA and industry organizations. Many valuable lessons were learned in terms of model integration and validation. Modeling practices such as submodel, analysis group and symbol naming were standardized to facilitate the later model integration. Upfront coordination of coordinate systems, timelines, units, symbols and case scenarios was very helpful in minimizing integration rework. A process for model integration was developed that included pre-integration runs and basic checks of both models, and a step-by-step process to efficiently integrate one model into another. Extensive use of model logic was used to create scenarios and timelines for avionics and air flow activation. Efficient methods of model restart between case scenarios were developed. Standardization of software version and even compiler version between organizations was found to be essential. An automated method for applying aeroheating to the full integrated vehicle model, including submodels developed by other organizations, was developed.
Schulze, H Georg; Turner, Robin F B
2015-06-01
High-throughput information extraction from large numbers of Raman spectra is becoming an increasingly taxing problem due to the proliferation of new applications enabled using advances in instrumentation. Fortunately, in many of these applications, the entire process can be automated, yielding reproducibly good results with significant time and cost savings. Information extraction consists of two stages, preprocessing and analysis. We focus here on the preprocessing stage, which typically involves several steps, such as calibration, background subtraction, baseline flattening, artifact removal, smoothing, and so on, before the resulting spectra can be further analyzed. Because the results of some of these steps can affect the performance of subsequent ones, attention must be given to the sequencing of steps, the compatibility of these sequences, and the propensity of each step to generate spectral distortions. We outline here important considerations to effect full automation of Raman spectral preprocessing: what is considered full automation; putative general principles to effect full automation; the proper sequencing of processing and analysis steps; conflicts and circularities arising from sequencing; and the need for, and approaches to, preprocessing quality control. These considerations are discussed and illustrated with biological and biomedical examples reflecting both successful and faulty preprocessing.
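As a toy illustration of the kind of fixed-order, fully automated preprocessing sequence discussed (despiking, baseline flattening, smoothing), the sketch below chains generic versions of three steps. The particular methods, parameters and their order are examples only, not the authors' recommendation.

```python
import numpy as np
from scipy.signal import medfilt, savgol_filter

def remove_spikes(spectrum, kernel=5):
    """Crude cosmic-ray removal: replace outliers with a running median."""
    med = medfilt(spectrum, kernel_size=kernel)
    resid = spectrum - med
    bad = np.abs(resid) > 5 * np.std(resid)
    out = spectrum.copy()
    out[bad] = med[bad]
    return out

def flatten_baseline(spectrum, order=3):
    """Subtract a low-order polynomial fit as a simple baseline estimate."""
    x = np.arange(spectrum.size)
    coeffs = np.polyfit(x, spectrum, order)
    return spectrum - np.polyval(coeffs, x)

def smooth(spectrum, window=11, polyorder=3):
    """Savitzky-Golay smoothing."""
    return savgol_filter(spectrum, window_length=window, polyorder=polyorder)

# The sequence matters: despiking before baseline fitting keeps outliers from
# biasing the fit, and smoothing last avoids spreading spikes into neighbours.
PIPELINE = (remove_spikes, flatten_baseline, smooth)

def preprocess(spectrum):
    for step in PIPELINE:
        spectrum = step(spectrum)
    return spectrum

if __name__ == "__main__":
    x = np.linspace(0, 1, 500)
    raw = np.exp(-((x - 0.5) ** 2) / 0.002) + 0.5 * x + 0.02 * np.random.randn(x.size)
    raw[100] += 5.0  # simulated cosmic-ray spike
    print(preprocess(raw).shape)
```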
NASA Technical Reports Server (NTRS)
Billman, Dorrit Owen; Schreckenghost, Debra; Miri, Pardis
2014-01-01
Astronauts will be responsible for executing a much larger body of procedures as human exploration moves further from Earth and Mission Control. Efficient, reliable methods for executing these procedures, including manual, automated, and mixed execution will be important. Our interface integrates step-by-step instruction with the means for execution. The research reported here compared manual execution using the new system to a system analogous to the manual-only system currently in use on the International Space Station, to assess whether user performance in manual operations would be as good or better with the new than with the legacy system. The system used also allows flexible automated execution. The system and our data lay the foundation for integrating automated execution into the flow of procedures designed for humans. In our formative study, we found speed and accuracy of manual procedure execution was better using the new, integrated interface over the legacy design.
Blastocyst microinjection automation.
Mattos, Leonardo S; Grant, Edward; Thresher, Randy; Kluckman, Kimberly
2009-09-01
Blastocyst microinjections are routinely involved in the process of creating genetically modified mice for biomedical research, but their efficiency is highly dependent on the skills of the operators. As a consequence, much time and resources are required for training microinjection personnel. This situation has been aggravated by the rapid growth of genetic research, which has increased the demand for mutant animals. Therefore, increased productivity and efficiency in this area are highly desired. Here, we pursue these goals through the automation of a previously developed teleoperated blastocyst microinjection system. This included the design of a new system setup to facilitate automation, the definition of rules for automatic microinjections, the implementation of video processing algorithms to extract feedback information from microscope images, and the creation of control algorithms for process automation. Experimentation conducted with this new system and operator assistance during the cells delivery phase demonstrated a 75% microinjection success rate. In addition, implantation of the successfully injected blastocysts resulted in a 53% birth rate and a 20% yield of chimeras. These results proved that the developed system was capable of automatic blastocyst penetration and retraction, demonstrating the success of major steps toward full process automation.
Zhang, Jie; Wei, Shimin; Ayres, David W; Smith, Harold T; Tse, Francis L S
2011-09-01
Although it is well known that automation can provide significant improvement in the efficiency of biological sample preparation in quantitative LC-MS/MS analysis, it has not been widely implemented in bioanalytical laboratories throughout the industry. This can be attributed to the lack of a sound strategy and practical procedures in working with robotic liquid-handling systems. Several comprehensive automation assisted procedures for biological sample preparation and method validation were developed and qualified using two types of Hamilton Microlab liquid-handling robots. The procedures developed were generic, user-friendly and covered the majority of steps involved in routine sample preparation and method validation. Generic automation procedures were established as a practical approach to widely implement automation into the routine bioanalysis of samples in support of drug-development programs.
Automated Classification of Asteroids into Families at Work
NASA Astrophysics Data System (ADS)
Knežević, Zoran; Milani, Andrea; Cellino, Alberto; Novaković, Bojan; Spoto, Federica; Paolicchi, Paolo
2014-07-01
We have recently proposed a new approach to asteroid family classification by combining the classical HCM method with an automated procedure to add newly discovered members to existing families. This approach is specifically intended to cope with ever-increasing asteroid data sets, and consists of several steps to segment the problem and handle the very large amount of data in an efficient and accurate manner. We briefly present all these steps and show the results from three subsequent updates making use of only the automated step of attributing the newly numbered asteroids to the known families. We describe the changes in the membership of individual families, as well as the evolution of the classification due to newly added intersections between families, resolved candidate family mergers, and the emergence of new candidates for mergers. We thus demonstrate how, with the new approach, the asteroid family classification becomes stable in general terms (converging towards a permanent list of confirmed families) while at the same time evolving in detail (to account for the newly discovered asteroids) at each update.
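The automated attribution step is described only in outline; below is a minimal sketch of the underlying idea, assuming the standard velocity metric commonly used for hierarchical clustering in proper-element space. The family data, cutoff values and the attach-to-first-match rule are placeholders for illustration, not the procedure actually used.

```python
import numpy as np

AU_PER_YEAR_IN_MS = 1.496e11 / 3.156e7   # ~4740 m/s per AU/yr

def proper_element_distance(p1, p2):
    """Standard-style metric in proper-element space (a [AU], e, sin i), in m/s."""
    a1, e1, sini1 = p1
    a2, e2, sini2 = p2
    a_mean = 0.5 * (a1 + a2)
    na = 2 * np.pi / np.sqrt(a_mean) * AU_PER_YEAR_IN_MS  # heliocentric velocity n*a
    return na * np.sqrt(1.25 * ((a1 - a2) / a_mean) ** 2
                        + 2.0 * (e1 - e2) ** 2
                        + 2.0 * (sini1 - sini2) ** 2)

def attribute(new_asteroid, families, cutoffs):
    """Attach a new asteroid to the first family with a member closer than its cutoff.

    families: {name: list of member proper-element tuples}
    cutoffs:  {name: cutoff distance in m/s} (placeholder values below)
    """
    for name, members in families.items():
        dmin = min(proper_element_distance(new_asteroid, m) for m in members)
        if dmin < cutoffs[name]:
            return name, dmin
    return None, None

if __name__ == "__main__":
    families = {"FamilyX": [(2.576, 0.145, 0.232), (2.581, 0.148, 0.230)]}
    cutoffs = {"FamilyX": 90.0}  # m/s, illustrative only
    print(attribute((2.579, 0.146, 0.231), families, cutoffs))
```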
Zhao, Y; Czilwik, G; Klein, V; Mitsakakis, K; Zengerle, R; Paust, N
2017-05-02
We present a fully automated centrifugal microfluidic method for particle based protein immunoassays. Stick-pack technology is employed for pre-storage and release of liquid reagents. Quantitative layout of centrifugo-pneumatic particle handling, including timed valving, switching and pumping is assisted by network simulations. The automation is exclusively controlled by the spinning frequency and does not require any additional means. New centrifugal microfluidic process chains are developed in order to sequentially supply wash buffer based on frequency dependent stick-pack opening and pneumatic pumping to perform two washing steps from one stored wash buffer; pre-store and re-suspend functionalized microparticles on a disk; and switch between the path of the waste fluid and the path of the substrate reaction product with 100% efficiency. The automated immunoassay concept is composed of on demand ligand binding, two washing steps, the substrate reaction, timed separation of the reaction products, and termination of the substrate reaction. We demonstrated separation of particles from three different liquids with particle loss below 4% and residual liquid remaining within particles below 3%. The automated immunoassay concept was demonstrated by means of detecting C-reactive protein (CRP) in the range of 1-81 ng ml -1 and interleukin 6 (IL-6) in the range of 64-13 500 pg ml -1 . The limit of detection and quantification were 1.0 ng ml -1 and 2.1 ng ml -1 for CRP and 64 pg ml -1 and 205 pg ml -1 for IL-6, respectively.
Soares, Filipa A.C.; Chandra, Amit; Thomas, Robert J.; Pedersen, Roger A.; Vallier, Ludovic; Williams, David J.
2014-01-01
The transfer of a laboratory process into a manufacturing facility is one of the most critical steps required for the large scale production of cell-based therapy products. This study describes the first published protocol for scalable automated expansion of human induced pluripotent stem cell lines growing in aggregates in feeder-free and chemically defined medium. Cells were successfully transferred between different sites representative of research and manufacturing settings, and passaged manually and using the CompacT SelecT automation platform. Modified protocols were developed for the automated system, and the management of cell aggregates (clumps) was identified as the critical step. Cellular morphology, pluripotency gene expression and differentiation into the three germ layers have been used to compare the outcomes of manual and automated processes. PMID:24440272
NASA Technical Reports Server (NTRS)
Wolf, M.
1982-01-01
The historical progression of efficiency improvements, cost reductions, and performance improvements in modules and photovoltaic systems are described. The potential for future improvements in photovoltaic device efficiencies and cost reductions continues as device concepts, designs, processes, and automated production capabilities mature. Additional step-function improvements can be made as today's simpler devices are replaced by more sophisticated devices.
Tan, Swee Jin; Phan, Huan; Gerry, Benjamin Michael; Kuhn, Alexandre; Hong, Lewis Zuocheng; Min Ong, Yao; Poon, Polly Suk Yean; Unger, Marc Alexander; Jones, Robert C; Quake, Stephen R; Burkholder, William F
2013-01-01
Library preparation for next-generation DNA sequencing (NGS) remains a key bottleneck in the sequencing process which can be relieved through improved automation and miniaturization. We describe a microfluidic device for automating laboratory protocols that require one or more column chromatography steps and demonstrate its utility for preparing Next Generation sequencing libraries for the Illumina and Ion Torrent platforms. Sixteen different libraries can be generated simultaneously with significantly reduced reagent cost and hands-on time compared to manual library preparation. Using an appropriate column matrix and buffers, size selection can be performed on-chip following end-repair, dA tailing, and linker ligation, so that the libraries eluted from the chip are ready for sequencing. The core architecture of the device ensures uniform, reproducible column packing without user supervision and accommodates multiple routine protocol steps in any sequence, such as reagent mixing and incubation; column packing, loading, washing, elution, and regeneration; capture of eluted material for use as a substrate in a later step of the protocol; and removal of one column matrix so that two or more column matrices with different functional properties can be used in the same protocol. The microfluidic device is mounted on a plastic carrier so that reagents and products can be aliquoted and recovered using standard pipettors and liquid handling robots. The carrier-mounted device is operated using a benchtop controller that seals and operates the device with programmable temperature control, eliminating any requirement for the user to manually attach tubing or connectors. In addition to NGS library preparation, the device and controller are suitable for automating other time-consuming and error-prone laboratory protocols requiring column chromatography steps, such as chromatin immunoprecipitation.
Tan, Swee Jin; Phan, Huan; Gerry, Benjamin Michael; Kuhn, Alexandre; Hong, Lewis Zuocheng; Min Ong, Yao; Poon, Polly Suk Yean; Unger, Marc Alexander; Jones, Robert C.; Quake, Stephen R.; Burkholder, William F.
2013-01-01
Library preparation for next-generation DNA sequencing (NGS) remains a key bottleneck in the sequencing process which can be relieved through improved automation and miniaturization. We describe a microfluidic device for automating laboratory protocols that require one or more column chromatography steps and demonstrate its utility for preparing Next Generation sequencing libraries for the Illumina and Ion Torrent platforms. Sixteen different libraries can be generated simultaneously with significantly reduced reagent cost and hands-on time compared to manual library preparation. Using an appropriate column matrix and buffers, size selection can be performed on-chip following end-repair, dA tailing, and linker ligation, so that the libraries eluted from the chip are ready for sequencing. The core architecture of the device ensures uniform, reproducible column packing without user supervision and accommodates multiple routine protocol steps in any sequence, such as reagent mixing and incubation; column packing, loading, washing, elution, and regeneration; capture of eluted material for use as a substrate in a later step of the protocol; and removal of one column matrix so that two or more column matrices with different functional properties can be used in the same protocol. The microfluidic device is mounted on a plastic carrier so that reagents and products can be aliquoted and recovered using standard pipettors and liquid handling robots. The carrier-mounted device is operated using a benchtop controller that seals and operates the device with programmable temperature control, eliminating any requirement for the user to manually attach tubing or connectors. In addition to NGS library preparation, the device and controller are suitable for automating other time-consuming and error-prone laboratory protocols requiring column chromatography steps, such as chromatin immunoprecipitation. PMID:23894273
Zhao, Wenle; Pauls, Keith
2015-01-01
Background Centralized outcome adjudication has been used widely in multi-center clinical trials in order to prevent potential biases and to reduce variations in important safety and efficacy outcome assessments. Adjudication procedures could vary significantly among different studies. In practice, the coordination of outcome adjudication procedures in many multicenter clinical trials remains a manual process with low efficiency and a high risk of delay. Motivated by the demands from two large clinical trial networks, a generic outcome adjudication module has been developed by the network’s data management center within a homegrown clinical trial management system. In this paper, the system design strategy and database structure are presented. Methods A generic database model was created to transfer different adjudication procedures into a unified set of sequential adjudication steps. Each adjudication step was defined by one activate condition, one lock condition, one to five categorical data items to capture adjudication results, and one free text field for general comments. Based on this model, a generic outcome adjudication user interface and a generic data processing program were developed within a homegrown clinical trial management system to provide automated coordination of outcome adjudication. Results By the end of 2014, this generic outcome adjudication module had been implemented in 10 multicenter trials. A total of 29 adjudication procedures were defined, with the number of adjudication steps varying from 1 to 7. The implementation of a new adjudication procedure in this generic module took an experienced programmer one or two days. A total of 7,336 outcome events had been adjudicated and 16,235 adjudication step activities had been recorded. In a multicenter trial, 1,144 safety outcome event submissions went through a three-step adjudication procedure and reported a median of 3.95 days from safety event case report form submission to adjudication completion. In another trial, 277 clinical outcome events were adjudicated by a six-step procedure and took a median of 23.84 days from outcome event case report form submission to adjudication procedure completion. Conclusions A generic outcome adjudication module integrated in the clinical trial management system made the automated coordination of efficacy and safety outcome adjudication a reality. PMID:26464429
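As a rough sketch of how such a generic model might look in code (the field names, types, and the example procedure below are assumptions for illustration, not the module's actual schema), each adjudication step can be captured as a record holding an activate condition, a lock condition, one to five categorical result items, and a free-text comment:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class AdjudicationStep:
    """One step of a generic adjudication procedure (hypothetical schema)."""
    step_number: int
    activate_condition: str        # expression that activates this step
    lock_condition: str            # expression that locks this step from editing
    categorical_items: List[str]   # one to five coded result fields
    comment: Optional[str] = None  # free-text general comment

    def __post_init__(self):
        if not 1 <= len(self.categorical_items) <= 5:
            raise ValueError("a step carries one to five categorical data items")

@dataclass
class AdjudicationProcedure:
    """An ordered sequence of steps; each trial plugs in its own steps."""
    study_id: str
    steps: List[AdjudicationStep]

# hypothetical three-step safety-outcome procedure
procedure = AdjudicationProcedure(
    study_id="TRIAL-01",
    steps=[
        AdjudicationStep(1, "event_form_submitted", "reviewer_1_signed",
                         ["event_confirmed", "severity"]),
        AdjudicationStep(2, "step_1_locked", "reviewer_2_signed", ["relatedness"]),
        AdjudicationStep(3, "reviewers_disagree", "chair_signed",
                         ["final_classification"]),
    ],
)
```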
An efficient preparation of labelling precursor of [11C]L-deprenyl-D2 and automated radiosynthesis.
Zirbesegger, Kevin; Buccino, Pablo; Kreimerman, Ingrid; Engler, Henry; Porcal, Williams; Savio, Eduardo
2017-01-01
The synthesis of [11C]L-deprenyl-D2 for imaging of astrocytosis with positron emission tomography (PET) in neurodegenerative diseases has been previously reported. [11C]L-deprenyl-D2 radiosynthesis requires a precursor, L-nordeprenyl-D2, which has previously been synthesized from L-amphetamine as starting material with low overall yields. Here, we present an efficient synthesis of the L-nordeprenyl-D2 organic precursor as a free base and the automated radiosynthesis of [11C]L-deprenyl-D2 for PET imaging of astrocytosis. The L-nordeprenyl-D2 precursor was synthesized in five steps from L-phenylalanine, an inexpensive and readily available commercial reagent. Next, N-alkylation of the L-nordeprenyl-D2 free base with [11C]MeOTf was optimized using the automated commercial platform GE TRACERlab® FX C Pro. A simple and efficient synthesis of the L-nordeprenyl-D2 precursor of [11C]L-deprenyl-D2 as a free base has been developed in five synthetic steps with an overall yield of 33%. The precursor as a free base has been stable for 9 months stored at low temperature (-20 °C). The labelled product was obtained with a 44 ± 13% (n = 12) radiochemical yield (end of synthesis, decay corrected) from [11C]MeI after a 35 min synthesis time. The radiochemical purity was over 99% in all cases and the specific activity was (170 ± 116) GBq/μmol. A high-yield synthesis of [11C]L-deprenyl-D2 has been achieved with high purity and specific activity. The L-nordeprenyl-D2 precursor as a free amine was applicable for automated production in a commercial synthesis module for preclinical and clinical application.
How to sharpen your automated tools.
DOT National Transportation Integrated Search
2014-12-01
New programs that claim to make flying more efficient have several things in common, new tasks for pilots, new flight deck displays, automated support tools, changes to ground automation, and displays for air traffic control. Training is one of the t...
Defining the drivers for accepting decision making automation in air traffic management.
Bekier, Marek; Molesworth, Brett R C; Williamson, Ann
2011-04-01
Air Traffic Management (ATM) operators are under increasing pressure to improve the efficiency of their operation to cater for forecasted increases in air traffic movements. One solution involves increasing the utilisation of automation within the ATM system. The success of this approach is contingent on Air Traffic Control Operators' (ATCOs) willingness to accept increased levels of automation. The main aim of the present research was to examine the drivers underpinning ATCOs' willingness to accept increased utilisation of automation within their role. Two fictitious scenarios involving the application of two new automated decision-making tools were created. The results of an online survey revealed traditional predictors of automation acceptance such as age, trust and job satisfaction explain between 4 and 7% of the variance. Furthermore, these predictors varied depending on the purpose in which the automation was to be employed. These results are discussed from an applied and theoretical perspective. STATEMENT OF RELEVANCE: Efficiency improvements in ATM are required to cater for forecasted increases in air traffic movements. One solution is to increase the utilisation of automation within Air Traffic Control. The present research examines the drivers underpinning air traffic controllers' willingness to accept increased levels of automation in their role.
Towards "Inverse" Character Tables? A One-Step Method for Decomposing Reducible Representations
ERIC Educational Resources Information Center
Piquemal, J.-Y.; Losno, R.; Ancian, B.
2009-01-01
In the framework of group theory, a new procedure is described for a one-step automated reduction of reducible representations. The matrix inversion tool, provided by standard spreadsheet software, is applied to the central part of the character table that contains the characters of the irreducible representation. This method is not restricted to…
Lian, Yanyun; Song, Zhijian
2014-01-01
Brain tumor segmentation from magnetic resonance imaging (MRI) is an important step toward surgical planning, treatment planning, and monitoring of therapy. However, manual tumor segmentation as commonly used in the clinic is time-consuming and challenging, and none of the existing automated methods are highly robust, reliable, and efficient in clinical application. An accurate and automated tumor segmentation method was therefore developed to provide reproducible and objective results close to those of manual segmentation. Based on the symmetry of the human brain, a sliding-window technique and the correlation coefficient were employed to locate the tumor position. First, the image to be segmented was normalized, rotated, denoised, and bisected. Subsequently, vertical and horizontal sliding windows were applied in turn: two windows, one in the left and one in the right part of the brain image, were moved simultaneously pixel by pixel while the correlation coefficient between them was calculated; the window pair with the minimal correlation coefficient was selected, the window with the larger average gray value was taken as the tumor location, and the pixel with the largest gray value as the tumor locating point. Finally, the segmentation threshold was set to the average gray value of the pixels in a square of 10 pixels side length centered at the locating point, and threshold segmentation and morphological operations were used to acquire the final tumor region. The method was evaluated on 3D FSPGR brain MR images of 10 patients. As a result, the average rate of correct location was 93.4% for 575 slices containing tumor, the average Dice similarity coefficient was 0.77 per scan, and the average time spent on one scan was 40 seconds. A fully automated, simple, and efficient segmentation method for brain tumors is proposed and is promising for future clinical use. The correlation coefficient is a new and effective feature for tumor location.
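The symmetry-based localization idea can be illustrated with a minimal NumPy sketch (the window size, step, and synthetic test image are assumptions for illustration; this is not the authors' implementation):

```python
import numpy as np

def locate_by_symmetry(img, win=16, step=8):
    """Return (min correlation, row, col, side) of the most asymmetric window
    pair between the left half and the mirrored right half of a brain slice
    (col is measured within the half-image)."""
    h, w = img.shape
    half = w // 2
    left = img[:, :half]
    right = np.fliplr(img[:, w - half:])        # mirrored right half
    best = (1.0, 0, 0, "left")
    for r in range(0, h - win + 1, step):
        for c in range(0, half - win + 1, step):
            a = left[r:r + win, c:c + win].ravel()
            b = right[r:r + win, c:c + win].ravel()
            cc = float(np.corrcoef(a, b)[0, 1])
            if cc < best[0]:
                side = "left" if a.mean() > b.mean() else "right"
                best = (cc, r + win // 2, c + win // 2, side)
    return best

# toy example: a symmetric "brain" with a bright, structureless blob on the left
rng = np.random.default_rng(0)
base = rng.normal(100.0, 20.0, (128, 64))
img = np.hstack([base, np.fliplr(base)]) + rng.normal(0.0, 2.0, (128, 128))
img[40:60, 20:40] = 180.0 + rng.normal(0.0, 1.0, (20, 20))
print(locate_by_symmetry(img))    # low correlation near row 48, col 32, side 'left'
```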
Automated protein NMR structure determination using wavelet de-noised NOESY spectra.
Dancea, Felician; Günther, Ulrich
2005-11-01
A major time-consuming step of protein NMR structure determination is the generation of reliable NOESY cross peak lists which usually requires a significant amount of manual interaction. Here we present a new algorithm for automated peak picking involving wavelet de-noised NOESY spectra in a process where the identification of peaks is coupled to automated structure determination. The core of this method is the generation of incremental peak lists by applying different wavelet de-noising procedures which yield peak lists of a different noise content. In combination with additional filters which probe the consistency of the peak lists, good convergence of the NOESY-based automated structure determination could be achieved. These algorithms were implemented in the context of the ARIA software for automated NOE assignment and structure determination and were validated for a polysulfide-sulfur transferase protein of known structure. The procedures presented here should be commonly applicable for efficient protein NMR structure determination and automated NMR peak picking.
Martella, Andrea; Matjusaitis, Mantas; Auxillos, Jamie; Pollard, Steven M; Cai, Yizhi
2017-07-21
Mammalian plasmid expression vectors are critical reagents underpinning many facets of research across biology, biomedical research, and the biotechnology industry. Traditional cloning methods often require laborious manual design and assembly of plasmids using tailored sequential cloning steps. This process can be protracted, complicated, expensive, and error-prone. New tools and strategies that facilitate the efficient design and production of bespoke vectors would help relieve a current bottleneck for researchers. To address this, we have developed an extensible mammalian modular assembly kit (EMMA). This enables rapid and efficient modular assembly of mammalian expression vectors in a one-tube, one-step golden-gate cloning reaction, using a standardized library of compatible genetic parts. The high modularity, flexibility, and extensibility of EMMA provide a simple method for the production of functionally diverse mammalian expression vectors. We demonstrate the value of this toolkit by constructing and validating a range of representative vectors, such as transient and stable expression vectors (transposon-based vectors), targeting vectors, inducible systems, polycistronic expression cassettes, fusion proteins, and fluorescent reporters. The method also supports simple assembly of combinatorial libraries and hierarchical assembly for the production of larger multigenetic cargos. In summary, EMMA is compatible with automated production, and novel genetic parts can be easily incorporated, providing new opportunities for mammalian synthetic biology.
Framework for Automated GD&T Inspection Using 3D Scanner
NASA Astrophysics Data System (ADS)
Pathak, Vimal Kumar; Singh, Amit Kumar; Sivadasan, M.; Singh, N. K.
2018-04-01
Geometric Dimensioning and Tolerancing (GD&T) is a typical dialect that helps designers, production faculty, and quality monitors convey design specifications in an effective and efficient manner. GD&T has been practiced since the start of machine component assembly, but without overtly naming it. However, in recent times industries have started increasingly emphasizing it. One prominent area where most industries struggle is quality inspection. The complete inspection process is mostly human-intensive. Also, the use of conventional gauges and templates for inspection purposes depends highly on the skill of workers and quality inspectors. In industries, the concept of 3D scanning is not new but is used only for creating 3D drawings or modelling of physical parts. However, the potential of 3D scanning as a powerful inspection tool is hardly explored. This study is centred on designing a procedure for automated inspection using a 3D scanner. Linear, geometric, and dimensional inspection of the most popular test bar, the stepped bar, was also carried out as a simple example following the new framework. The new generation of engineering industries would welcome this automated inspection procedure, being quick and reliable with reduced human intervention.
Computational efficiency for the surface renewal method
NASA Astrophysics Data System (ADS)
Kelley, Jason; Higgins, Chad
2018-04-01
Measuring surface fluxes using the surface renewal (SR) method requires programmatic algorithms for tabulation, algebraic calculation, and data quality control. A number of different methods have been published describing automated calibration of SR parameters. Because the SR method utilizes high-frequency (10 Hz+) measurements, some steps in the flux calculation are computationally expensive, especially when automating SR to perform many iterations of these calculations. Several new algorithms were written that perform the required calculations more efficiently and rapidly, and these were tested for sensitivity to the length of the flux-averaging period, the ability to measure over a large range of lag timescales, and overall computational efficiency. These algorithms utilize signal-processing techniques and algebraic simplifications, simple modifications that dramatically improve computational efficiency. The results here complement efforts by other authors to standardize a robust and accurate computational SR method. The increased computation speed grants flexibility in implementing the SR method, opening new avenues for SR to be used in research, for applied monitoring, and in novel field deployments.
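As an illustration of the kind of vectorization such algorithms rely on (not the cited algorithms themselves), lagged structure functions of the kind used in SR ramp analysis can be computed for a range of lags without explicit per-sample loops; the structure-function orders and the example data below are assumptions:

```python
import numpy as np

def structure_functions(ts, lags, orders=(2, 3, 5)):
    """Vectorized structure functions S_n(r) = mean((T[i] - T[i-r])**n) for each
    lag r and order n, computed from all lag-r differences at once instead of
    an explicit per-sample loop."""
    ts = np.asarray(ts, dtype=float)
    out = {}
    for r in lags:
        d = ts[r:] - ts[:-r]
        out[r] = {n: float(np.mean(d ** n)) for n in orders}
    return out

# example: 30 min of 10 Hz temperature-like data, lags of 0.1-1.0 s (1-10 samples)
t = np.cumsum(np.random.default_rng(1).normal(0.0, 0.05, 18000)) + 20.0
sf = structure_functions(t, lags=range(1, 11))
print(sf[5][2], sf[5][3])
```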
Automated solid-phase subcloning based on beads brought into proximity by magnetic force.
Hudson, Elton P; Nikoshkov, Andrej; Uhlen, Mathias; Rockberg, Johan
2012-01-01
In the fields of proteomics, metabolic engineering and synthetic biology there is a need for high-throughput and reliable cloning methods to facilitate construction of expression vectors and genetic pathways. Here, we describe a new approach for solid-phase cloning in which both the vector and the gene are immobilized to separate paramagnetic beads and brought into proximity by magnetic force. Ligation events were directly evaluated using fluorescent-based microscopy and flow cytometry. The highest ligation efficiencies were obtained when gene- and vector-coated beads were brought into close contact by application of a magnet during the ligation step. An automated procedure was developed using a laboratory workstation to transfer genes into various expression vectors and more than 95% correct clones were obtained in a number of various applications. The method presented here is suitable for efficient subcloning in an automated manner to rapidly generate a large number of gene constructs in various vectors intended for high throughput applications.
Automated and assisted RNA resonance assignment using NMR chemical shift statistics
Aeschbacher, Thomas; Schmidt, Elena; Blatter, Markus; Maris, Christophe; Duss, Olivier; Allain, Frédéric H.-T.; Güntert, Peter; Schubert, Mario
2013-01-01
The three-dimensional structure determination of RNAs by NMR spectroscopy relies on chemical shift assignment, which still constitutes a bottleneck. In order to develop more efficient assignment strategies, we analysed relationships between sequence and 1H and 13C chemical shifts. Statistics of resonances from regularly Watson–Crick base-paired RNA revealed highly characteristic chemical shift clusters. We developed two approaches using these statistics for chemical shift assignment of double-stranded RNA (dsRNA): a manual approach that yields starting points for resonance assignment and simplifies decision trees and an automated approach based on the recently introduced automated resonance assignment algorithm FLYA. Both strategies require only unlabeled RNAs and three 2D spectra for assigning the H2/C2, H5/C5, H6/C6, H8/C8 and H1′/C1′ chemical shifts. The manual approach proved to be efficient and robust when applied to the experimental data of RNAs with a size between 20 nt and 42 nt. The more advanced automated assignment approach was successfully applied to four stem-loop RNAs and a 42 nt siRNA, assigning 92–100% of the resonances from dsRNA regions correctly. This is the first automated approach for chemical shift assignment of non-exchangeable protons of RNA and their corresponding 13C resonances, which provides an important step toward automated structure determination of RNAs. PMID:23921634
Automated workflows for modelling chemical fate, kinetics and toxicity.
Sala Benito, J V; Paini, Alicia; Richarz, Andrea-Nicole; Meinl, Thorsten; Berthold, Michael R; Cronin, Mark T D; Worth, Andrew P
2017-12-01
Automation is universal in today's society, from operating equipment such as machinery, in factory processes, to self-parking automobile systems. While these examples show the efficiency and effectiveness of automated mechanical processes, automated procedures that support the chemical risk assessment process are still in their infancy. Future human safety assessments will rely increasingly on the use of automated models, such as physiologically based kinetic (PBK) and dynamic models and the virtual cell based assay (VCBA). These biologically-based models will be coupled with chemistry-based prediction models that also automate the generation of key input parameters such as physicochemical properties. The development of automated software tools is an important step in harmonising and expediting the chemical safety assessment process. In this study, we illustrate how the KNIME Analytics Platform can be used to provide a user-friendly graphical interface for these biokinetic models, such as PBK models and VCBA, which simulates the fate of chemicals in vivo within the body and in vitro test systems respectively. Copyright © 2017 The Authors. Published by Elsevier Ltd.. All rights reserved.
Automation in clinical bacteriology: what system to choose?
Greub, G; Prod'hom, G
2011-05-01
With increased activity and reduced financial and human resources, there is a need for automation in clinical bacteriology. Initial processing of clinical samples includes repetitive and fastidious steps. These tasks are suitable for automation, and several instruments are now available on the market, including the WASP (Copan), Previ-Isola (BioMerieux), Innova (Becton-Dickinson) and Inoqula (KIESTRA) systems. These new instruments allow efficient and accurate inoculation of samples, including four main steps: (i) selecting the appropriate Petri dish; (ii) inoculating the sample; (iii) spreading the inoculum on agar plates to obtain, upon incubation, well-separated bacterial colonies; and (iv) accurate labelling and sorting of each inoculated media. The challenge for clinical bacteriologists is to determine what is the ideal automated system for their own laboratory. Indeed, different solutions will be preferred, according to the number and variety of samples, and to the types of sample that will be processed with the automated system. The final choice is troublesome, because audits proposed by industrials risk being biased towards the solution proposed by their company, and because these automated systems may not be easily tested on site prior to the final decision, owing to the complexity of computer connections between the laboratory information system and the instrument. This article thus summarizes the main parameters that need to be taken into account for choosing the optimal system, and provides some clues to help clinical bacteriologists to make their choice. © 2011 The Authors. Clinical Microbiology and Infection © 2011 European Society of Clinical Microbiology and Infectious Diseases.
Design of a novel automated methanol feed system for pilot-scale fermentation of Pichia pastoris.
Hamaker, Kent H; Johnson, Daniel C; Bellucci, Joseph J; Apgar, Kristie R; Soslow, Sherry; Gercke, John C; Menzo, Darrin J; Ton, Christopher
2011-01-01
Large-scale fermentation of Pichia pastoris requires a large volume of methanol feed during the induction phase. However, a large volume of methanol feed is difficult to use in the processing suite because of the inconvenience of constant monitoring, manual manipulation steps, and fire and explosion hazards. To optimize and improve safety of the methanol feed process, a novel automated methanol feed system has been designed and implemented for industrial fermentation of P. pastoris. Details of the design of the methanol feed system are described. The main goals of the design were to automate the methanol feed process and to minimize the hazardous risks associated with storing and handling large quantities of methanol in the processing area. The methanol feed system is composed of two main components: a bulk feed (BF) system and up to three portable process feed (PF) systems. The BF system automatically delivers methanol from a central location to the portable PF system. The PF system provides precise flow control of linear, step, or exponential feed of methanol to the fermenter. Pilot-scale fermentations with linear and exponential methanol feeds were conducted using two Mut(+) (methanol utilization plus) strains, one expressing a recombinant therapeutic protein and the other a monoclonal antibody. Results show that the methanol feed system is accurate, safe, and efficient. The feed rates for both linear and exponential feed methods were within ± 5% of the set points, and the total amount of methanol fed was within 1% of the targeted volume. Copyright © 2011 American Institute of Chemical Engineers (AIChE).
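A minimal sketch of the three feed-profile types mentioned above (linear, step, and exponential); the function name and all parameter values are illustrative assumptions, not the system's actual set-points:

```python
import math

def methanol_setpoint(t_hours, mode="exponential",
                      f0=2.0, slope=0.5, mu=0.05, step_to=6.0, step_at=10.0):
    """Feed set-point (arbitrary units, e.g. mL/h) at induction time t for the
    three profile types: linear, step, or exponential."""
    if mode == "linear":
        return f0 + slope * t_hours
    if mode == "step":
        return f0 if t_hours < step_at else step_to
    if mode == "exponential":
        return f0 * math.exp(mu * t_hours)
    raise ValueError("mode must be 'linear', 'step' or 'exponential'")

# print the exponential set-point every 4 h over a 24 h induction
for t in range(0, 25, 4):
    print(t, round(methanol_setpoint(t), 2))
```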
Automated microaneurysm detection in diabetic retinopathy using curvelet transform
NASA Astrophysics Data System (ADS)
Ali Shah, Syed Ayaz; Laude, Augustinus; Faye, Ibrahima; Tang, Tong Boon
2016-10-01
Microaneurysms (MAs) are known to be the early signs of diabetic retinopathy (DR). An automated MA detection system based on curvelet transform is proposed for color fundus image analysis. Candidates of MA were extracted in two parallel steps. In step one, blood vessels were removed from preprocessed green band image and preliminary MA candidates were selected by local thresholding technique. In step two, based on statistical features, the image background was estimated. The results from the two steps allowed us to identify preliminary MA candidates which were also present in the image foreground. A collection set of features was fed to a rule-based classifier to divide the candidates into MAs and non-MAs. The proposed system was tested with Retinopathy Online Challenge database. The automated system detected 162 MAs out of 336, thus achieved a sensitivity of 48.21% with 65 false positives per image. Counting MA is a means to measure the progression of DR. Hence, the proposed system may be deployed to monitor the progression of DR at early stage in population studies.
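A heavily simplified sketch of the candidate-extraction step (local thresholding of the vessel-removed green channel followed by a size rule); the window size, offset, and size limits are assumptions, not the parameters of the cited system:

```python
import numpy as np
from scipy import ndimage

def ma_candidates(green, local_size=25, offset=8, min_px=3, max_px=60):
    """Extract small dark blobs that fall below a local-mean threshold and keep
    those within a plausible microaneurysm size range; returns their centroids."""
    local_mean = ndimage.uniform_filter(green.astype(float), size=local_size)
    dark = green < (local_mean - offset)
    labels, n = ndimage.label(dark)
    sizes = ndimage.sum(dark.astype(np.uint8), labels, index=range(1, n + 1))
    keep = [i + 1 for i, s in enumerate(sizes) if min_px <= s <= max_px]
    return ndimage.center_of_mass(dark, labels, keep) if keep else []

# toy image: uniform background with three small dark specks
img = np.full((200, 200), 120, dtype=np.uint8)
for r, c in [(50, 60), (120, 150), (30, 170)]:
    img[r:r + 3, c:c + 3] = 95
print(ma_candidates(img))
```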
[Automated analyser of organ cultured corneal endothelial mosaic].
Gain, P; Thuret, G; Chiquet, C; Gavet, Y; Turc, P H; Théillère, C; Acquart, S; Le Petit, J C; Maugery, J; Campos, L
2002-05-01
Until now, organ-cultured corneal endothelial mosaic has been assessed in France by cell counting using a calibrated graticule, or by drawing cells on a computerized image. The former method is unsatisfactory because it is characterized by a lack of objective evaluation of the cell surface and hexagonality and it requires an experienced technician. The latter method is time-consuming and requires careful attention. We aimed to make an efficient, fast, and easy-to-use automated digital analyzer of video images of the corneal endothelium. The hardware included a Pentium III® 800 MHz PC with 256 MB of RAM, a Data Translation 3155 acquisition card, a Sony SC 75 CE CCD camera, and a 22-inch screen. Special functions for automated cell boundary determination consisted of plug-in programs included in the ImageTool software. Calibration was performed using a calibrated micrometer. Cell densities of 40 organ-cultured corneas measured by both manual and automated counting were compared using parametric tests (Student's t test for paired variables and the Pearson correlation coefficient). All steps were considered more ergonomic, i.e., endothelial image capture, image selection, thresholding of multiple areas of interest, automated cell count, automated detection of errors in cell boundary drawing, and presentation of the results in an HTML file including the number of counted cells, cell density, coefficient of variation of cell area, cell surface histogram, and cell hexagonality. The device was efficient because the global process lasted on average 7 minutes and did not require an experienced technician. The correlation between cell densities obtained with both methods was high (r = +0.84, p < 0.001). The results showed an under-estimation using manual counting (2191 ± 322 vs. 2273 ± 457 cells/mm², p = 0.046), compared with the automated method. Our automated endothelial cell analyzer is efficient and gives reliable results quickly and easily. A multicentric validation would allow us to standardize cell counts among cornea banks in our country.
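The statistical comparison reported above (paired Student's t test and Pearson correlation between manual and automated counts) can be reproduced on any paired data set in a few lines; the readings below are invented purely for illustration, not the study's data:

```python
import numpy as np
from scipy import stats

# invented paired endothelial cell-density readings (cells/mm^2) from the same corneas
manual    = np.array([2100, 2350, 1980, 2600, 2240, 2050, 2480, 2150])
automated = np.array([2180, 2400, 2050, 2650, 2310, 2120, 2530, 2230])

r, p_r = stats.pearsonr(manual, automated)
t, p_t = stats.ttest_rel(manual, automated)
print(f"Pearson r = {r:.2f} (p = {p_r:.3g})")
print(f"paired t = {t:.2f} (p = {p_t:.3g})")
```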
A fully automated digitally controlled 30-inch telescope
NASA Technical Reports Server (NTRS)
Colgate, S. A.; Moore, E. P.; Carlson, R.
1975-01-01
A fully automated 30-inch (75-cm) telescope has been successfully designed and constructed from a military surplus Nike-Ajax radar mount. Novel features include: closed-loop operation between mountain telescope and campus computer 30 km apart via microwave link, a TV-type sensor which is photon shot-noise limited, a special lightweight primary mirror, and a stepping motor drive capable of slewing and settling one degree in one second or a radian in fifteen seconds.
Agile based "Semi-"Automated Data ingest process : ORNL DAAC example
NASA Astrophysics Data System (ADS)
Santhana Vannan, S. K.; Beaty, T.; Cook, R. B.; Devarakonda, R.; Hook, L.; Wei, Y.; Wright, D.
2015-12-01
The ORNL DAAC archives and publishes data and information relevant to biogeochemical, ecological, and environmental processes. The data archived at the ORNL DAAC must be well formatted, self-descriptive, and documented, as well as referenced in a peer-reviewed publication. The ORNL DAAC ingest team curates diverse data sets from multiple data providers simultaneously. To streamline the ingest process, the data set submission process at the ORNL DAAC has recently been updated to use an agile process, and a semi-automated workflow system has been developed to provide a consistent data provider experience and to create a uniform data product. The goals of the semi-automated agile ingest process are to: (1) provide the ability to track a data set from acceptance to publication; (2) automate steps that can be automated to improve efficiency and reduce redundancy; (3) update legacy ingest infrastructure; and (4) provide a centralized system to manage the various aspects of ingest. This talk will cover the agile methodology, workflow, and tools developed through this system.
Zang, Qin; Javed, Salim; Hill, David; Ullah, Farman; Bi, Danse; Porubsky, Patrick; Neuenswander, Benjamin; Lushington, Gerald H; Santini, Conrad; Organ, Michael G; Hanson, Paul R
2012-08-13
The construction of a 96-member library of triazolated 1,2,5-thiadiazepane 1,1-dioxides was performed on a Chemspeed Accelerator (SLT-100) automated parallel synthesis platform, culminating in the successful preparation of 94 out of 96 possible products. The key step, a one-pot, sequential elimination, double-aza-Michael reaction, and [3 + 2] Huisgen cycloaddition pathway has been automated and utilized in the production of two sets of triazolated sultam products.
Experimental and numerical investigation of the Fast-SAGD process
NASA Astrophysics Data System (ADS)
Shin, Hyundon
The SAGD process has been tested in the field, and is now in a commercial stage in Western Canadian oil sands areas. The Fast-SAGD method can partly solve the drilling difficulty and reduce costs in a SAGD operation requiring paired parallel wells one above the other. This method also enhances the thermal efficiency in the reservoir. In this research, the reservoir parameters and operating conditions for the SAGD and Fast-SAGD processes are investigated by numerical simulation in the three Alberta oil sands areas. Scaled physical model experiments, which are operated by an automated process control system, are conducted under high temperature and high pressure conditions. The results of the study indicate that the shallow Athabasca-type reservoir, which is thick with high permeability (high kxh), is a good candidate for SAGD application, whereas Cold Lake- and Peace River-type reservoirs, which are thin with low permeability, are not as good candidates for conventional SAGD implementation. The simulation results indicate improved energy efficiency and productivity in most cases for the Fast-SAGD process; in those cases, the project economics were enhanced compared to the SAGD process. Both Cold Lake- and Peace River-type reservoirs are good candidates for a Fast-SAGD application rather than a conventional SAGD application. This new process demonstrates improved efficiency and lower costs for extracting heavy oil from these important reservoirs. A new economic indicator, called simple thermal efficiency parameter (STEP), was developed and validated to evaluate the performance of a SAGD project. STEP is based on cumulative steam-oil ratio (CSOR), calendar day oil rate (CDOR) and recovery factor (RF) for the time prior to the steam-oil ratio (SOR) attaining 4. STEP can be used as a financial metric quantitatively as well as qualitatively for this type of thermal project. An automated process control system was set-up and validated, and has the capability of controlling and handling steam injection processes like the steam-assisted gravity drainage process. The results of these preliminary experiments showed the overall cumulative oil production to be larger in the Fast-SAGD case, but end-point CSOR to be lower in the SAGD case. History matching results indicated that the steam quality was as low as 0.3 in the SAGD experiments, and even lower in the Fast-SAGD experiments after starting the CSS.
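A hedged sketch of how the three inputs of STEP could be extracted from production data: the cut-off is applied here to the cumulative steam-oil ratio, the example series is invented, and the paper's actual combination of the three numbers into a single STEP value is not reproduced:

```python
import numpy as np

def step_inputs(days, cum_steam, cum_oil, ooip, sor_cutoff=4.0):
    """Return (CSOR, calendar-day oil rate, recovery factor) for the period
    before the cumulative steam-oil ratio first exceeds the cut-off."""
    days, cum_steam, cum_oil = (np.asarray(x, dtype=float) for x in (days, cum_steam, cum_oil))
    csor = np.divide(cum_steam, cum_oil, out=np.full_like(cum_steam, np.inf),
                     where=cum_oil > 0)
    ok = np.nonzero(csor <= sor_cutoff)[0]
    if ok.size == 0:
        raise ValueError("the steam-oil ratio never falls below the cut-off")
    k = ok[-1]
    return csor[k], cum_oil[k] / days[k], cum_oil[k] / ooip

# invented daily cumulative series for a three-year project
d = np.arange(1, 1096, dtype=float)
oil = 80.0 * d ** 0.9                 # cumulative oil, m3
steam = 3.0 * oil + 0.05 * d ** 2     # cumulative steam, m3 (SOR slowly worsens)
print(step_inputs(d, steam, oil, ooip=500000.0))
```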
Initial steps toward automation of a propellant processor
NASA Technical Reports Server (NTRS)
Schallhorn, Paul; Ramohalli, Kumar
1990-01-01
This paper presents the results from an experimental study aimed at ultimately automating the mixing of propellants in order to minimize unintended variations usually attributed to human error. The water heater and delivery system of a one-pint Baker-Perkins (APV) vertical mixer are automated with computer control. Various innovations are employed to introduce economy and low thermal inertia. Some of these include twin heaters/reservoirs instead of one large reservoir, a compact water mixer for achieving the desired temperature quickly, and thorough insulation of the entire water system. The completed system is tested during two propellant mixes. The temperature uniformity is proven through careful measurements employing several local thermocouples.
Davis, Mark D; Wade, Erin L; Restrepo, Paula R; Roman-Esteva, William; Bravo, Roberto; Kuklenyik, Peter; Calafat, Antonia M
2013-06-15
Organophosphate and pyrethroid insecticides and phenoxyacetic acid herbicides represent important classes of pesticides applied in commercial and residential settings. Interest in assessing the extent of human exposure to these pesticides exists because of their widespread use and their potential adverse health effects. An analytical method for measuring 12 biomarkers of several of these pesticides in urine has been developed. The target analytes were extracted from one milliliter of urine by a semi-automated solid phase extraction technique, separated from each other and from other urinary biomolecules by reversed-phase high performance liquid chromatography, and detected using tandem mass spectrometry with isotope dilution quantitation. This method can be used to measure all the target analytes in one injection, with repeatability and detection limits similar to those of previous methods that required more than one injection. Each step of the procedure was optimized to produce a robust, reproducible, accurate, precise, and efficient method. The required selectivity and sensitivity for trace-level analysis (e.g., limits of detection below 0.5 ng/mL) were achieved using a narrow diameter analytical column, higher than unit mass resolution for certain analytes, and stable isotope labeled internal standards. The method was applied to the analysis of 55 samples collected from adult anonymous donors with no known exposure to the target pesticides. This efficient and cost-effective method is adequate to handle the large number of samples required for national biomonitoring surveys. Published by Elsevier B.V.
RoboPIV: how robotics enable PIV on a large industrial scale
NASA Astrophysics Data System (ADS)
Michaux, F.; Mattern, P.; Kallweit, S.
2018-07-01
This work demonstrates how the interaction between particle image velocimetry (PIV) and robotics can massively increase measurement efficiency. The interdisciplinary approach is shown using the complex example of an automated, large scale, industrial environment: a typical automotive wind tunnel application. Both the high degree of flexibility in choosing the measurement region and the complete automation of stereo PIV measurements are presented. The setup consists of a combination of three robots, individually used as a 6D traversing unit for the laser illumination system as well as for each of the two cameras. Synchronised movements in the same reference frame are realised through a master-slave setup with a single interface to the user. By integrating the interface into the standard wind tunnel management system, a single measurement plane or a predefined sequence of several planes can be requested through a single trigger event, providing the resulting vector fields within minutes. In this paper, a brief overview of the demands of large scale industrial PIV and the existing solutions is given. Afterwards, the concept of RoboPIV is introduced as a new approach. In a first step, the usability of a selection of commercially available robot arms is analysed. The challenges of pose uncertainty and the importance of absolute accuracy are demonstrated through comparative measurements, explaining the individual pros and cons of the analysed systems. Subsequently, the advantage of integrating RoboPIV directly into the existing wind tunnel management system is shown on the basis of a typical measurement sequence. In a final step, a practical measurement procedure, including post-processing, is given using real data and results. Ultimately, the benefits of high automation are demonstrated, leading to a drastic reduction in necessary measurement time compared to non-automated systems, thus massively increasing the efficiency of PIV measurements.
Two-step chlorination: A new approach to disinfection of a primary sewage effluent.
Li, Yu; Yang, Mengting; Zhang, Xiangru; Jiang, Jingyi; Liu, Jiaqi; Yau, Cie Fu; Graham, Nigel J D; Li, Xiaoyan
2017-01-01
Sewage disinfection aims at inactivating pathogenic microorganisms and preventing the transmission of waterborne diseases. Chlorination is extensively applied for disinfecting sewage effluents. The objective of achieving a disinfection goal and reducing disinfectant consumption and operational costs remains a challenge in sewage treatment. In this study, we have demonstrated that, for the same chlorine dosage, a two-step addition of chlorine (two-step chlorination) was significantly more efficient in disinfecting a primary sewage effluent than a one-step addition of chlorine (one-step chlorination), and shown how the two-step chlorination was optimized with respect to time interval and dosage ratio. Two-step chlorination of the sewage effluent attained its highest disinfection efficiency at a time interval of 19 s and a dosage ratio of 5:1. Compared to one-step chlorination, two-step chlorination enhanced the disinfection efficiency by up to 0.81- or even 1.02-log for two different chlorine doses and contact times. An empirical relationship involving disinfection efficiency, time interval and dosage ratio was obtained by best fitting. Mechanisms (including a higher overall Ct value, an intensive synergistic effect, and a shorter recovery time) were proposed for the higher disinfection efficiency of two-step chlorination in the sewage effluent disinfection. Annual chlorine consumption costs in one-step and two-step chlorination of the primary sewage effluent were estimated. Compared to one-step chlorination, two-step chlorination reduced the cost by up to 16.7%. Copyright © 2016 Elsevier Ltd. All rights reserved.
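The "empirical relationship obtained by best fitting" can be illustrated with a generic surface fit; the functional form, parameter values, and synthetic observations below are assumptions for illustration and do not reproduce the paper's fitted relationship:

```python
import numpy as np
from scipy.optimize import curve_fit

def gain_model(X, a, b, c, t0, q0):
    """Assumed peak-shaped surface: extra log-inactivation as a function of the
    time interval t (s) and dosage ratio q, peaking at (t0, q0)."""
    t, q = X
    return a * np.exp(-((t - t0) / b) ** 2 - ((np.log(q) - np.log(q0)) / c) ** 2)

# synthetic observations generated from the assumed form plus noise
rng = np.random.default_rng(0)
t_obs = np.array([5, 10, 15, 19, 25, 40, 60, 19, 19, 19, 19], dtype=float)
q_obs = np.array([5, 5, 5, 5, 5, 5, 5, 1, 3, 7, 9], dtype=float)
gain = gain_model((t_obs, q_obs), 0.8, 25.0, 1.2, 19.0, 5.0) + rng.normal(0, 0.02, t_obs.size)

popt, _ = curve_fit(gain_model, (t_obs, q_obs), gain, p0=[0.5, 20.0, 1.0, 15.0, 4.0])
print(dict(zip(["a", "b", "c", "t0", "q0"], np.round(popt, 2))))
```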
Halper, Sean M; Cetnar, Daniel P; Salis, Howard M
2018-01-01
Engineering many-enzyme metabolic pathways suffers from the design curse of dimensionality. There are an astronomical number of synonymous DNA sequence choices, though relatively few will express an evolutionary robust, maximally productive pathway without metabolic bottlenecks. To solve this challenge, we have developed an integrated, automated computational-experimental pipeline that identifies a pathway's optimal DNA sequence without high-throughput screening or many cycles of design-build-test. The first step applies our Operon Calculator algorithm to design a host-specific evolutionary robust bacterial operon sequence with maximally tunable enzyme expression levels. The second step applies our RBS Library Calculator algorithm to systematically vary enzyme expression levels with the smallest-sized library. After characterizing a small number of constructed pathway variants, measurements are supplied to our Pathway Map Calculator algorithm, which then parameterizes a kinetic metabolic model that ultimately predicts the pathway's optimal enzyme expression levels and DNA sequences. Altogether, our algorithms provide the ability to efficiently map the pathway's sequence-expression-activity space and predict DNA sequences with desired metabolic fluxes. Here, we provide a step-by-step guide to applying the Pathway Optimization Pipeline on a desired multi-enzyme pathway in a bacterial host.
Self-optimizing approach for automated laser resonator alignment
NASA Astrophysics Data System (ADS)
Brecher, C.; Schmitt, R.; Loosen, P.; Guerrero, V.; Pyschny, N.; Pavim, A.; Gatej, A.
2012-02-01
Nowadays, the assembly of laser systems is dominated by manual operations, involving elaborate alignment by means of adjustable mountings. From a competition perspective, the most challenging problem in laser source manufacturing is price pressure, a result of cost competition exerted mainly from Asia. From an economical point of view, an automated assembly of laser systems defines a better approach to produce more reliable units at lower cost. However, the step from today's manual solutions towards an automated assembly requires parallel developments regarding product design, automation equipment and assembly processes. This paper introduces briefly the idea of self-optimizing technical systems as a new approach towards highly flexible automation. Technically, the work focuses on the precision assembly of laser resonators, which is one of the final and most crucial assembly steps in terms of beam quality and laser power. The paper presents a new design approach for miniaturized laser systems and new automation concepts for a robot-based precision assembly, as well as passive and active alignment methods, which are based on a self-optimizing approach. Very promising results have already been achieved, considerably reducing the duration and complexity of the laser resonator assembly. These results as well as future development perspectives are discussed.
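A minimal sketch of the active-alignment idea (a coordinate-wise hill climb on the measured output power); the number of axes, step sizes, and the toy power function are assumptions, not the authors' controller:

```python
import numpy as np

def align(measure_power, n_axes, step_sizes, shrink=0.5, tol=1e-3, max_iter=50):
    """Coordinate-wise hill climb: nudge each adjustable axis in the direction
    that increases the measured power, shrinking the steps when nothing helps."""
    pos = np.zeros(n_axes)
    best = measure_power(pos)
    steps = np.asarray(step_sizes, dtype=float)
    for _ in range(max_iter):
        improved = False
        for i in range(n_axes):
            for sign in (+1.0, -1.0):
                trial = pos.copy()
                trial[i] += sign * steps[i]
                p = measure_power(trial)
                if p > best:
                    pos, best, improved = trial, p, True
                    break
        if not improved:
            steps *= shrink
            if steps.max() < tol:
                break
    return pos, best

# toy "power meter": a smooth peak at a hidden tip/tilt misalignment
target = np.array([0.12, -0.34])
power = lambda x: float(np.exp(-np.sum((x - target) ** 2) / 0.05))
print(align(power, n_axes=2, step_sizes=[0.1, 0.1]))
```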
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karnowski, Thomas Paul; Giancardo, Luca; Li, Yaquin
2013-01-01
Automated retina image analysis has reached a high level of maturity in recent years, and thus the question of how validation is performed in these systems is beginning to grow in importance. One application of retina image analysis is in telemedicine, where an automated system could enable the automated detection of diabetic retinopathy and other eye diseases as a low-cost method for broad-based screening. In this work we discuss our experiences in developing a telemedical network for retina image analysis, including our progression from a manual diagnosis network to a more fully automated one. We pay special attention to how validations of our algorithm steps are performed, both using data from the telemedicine network and other public databases.
Automated array assembly, phase 2
NASA Technical Reports Server (NTRS)
Daiello, R. V.
1979-01-01
A manufacturing process suitable for the large-scale production of silicon solar array modules at a cost of less than $500/peak kW is described. Factors which control the efficiency of ion implanted silicon solar cells, screen-printed thick film metallization, spray-on antireflection coating process, and panel assembly are discussed. Conclusions regarding technological readiness or cost effectiveness of individual process steps are presented.
Non-Contact Conductivity Measurement for Automated Sample Processing Systems
NASA Technical Reports Server (NTRS)
Beegle, Luther W.; Kirby, James P.
2012-01-01
A new method has been developed for monitoring and control of automated sample processing and preparation, especially focusing on desalting of samples before analytical analysis (described in more detail in Automated Desalting Apparatus (NPO-45428), NASA Tech Briefs, Vol. 34, No. 8 (August 2010), page 44). The use of non-contact conductivity probes, one at the inlet and one at the outlet of the solid phase sample preparation media, allows monitoring of the process, and acts as a trigger for the start of the next step in the sequence (see figure). At each step of the multi-step process, the system is flushed with low-conductivity water, which sets the system back to an overall low-conductivity state. This measurement then triggers the next stage of sample processing protocols, and greatly minimizes use of consumables. In the case of amino acid sample preparation for desalting, the conductivity measurement will define three key conditions for the sample preparation process. First, when the system is neutralized (low conductivity, by washing with excess de-ionized water); second, when the system is acidified, by washing with a strong acid (high conductivity); and third, when the system is at a basic condition of high pH (high conductivity). Taken together, this non-contact conductivity measurement for monitoring sample preparation will not only facilitate automation of the sample preparation and processing, but will also act as a way to optimize the operational time and use of consumables.
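A minimal sketch of how the conductivity readings could drive the step sequence described above; the thresholds, step names, and probe interface are assumptions for illustration, not the flight or laboratory implementation:

```python
def run_desalting(read_outlet, trigger, low=5.0, high=500.0):
    """Poll the outlet conductivity probe and use transitions between the
    low-conductivity rinse state and the high-conductivity acid/base states
    as triggers for the next protocol step."""
    def wait_for(cond):
        while not cond(read_outlet()):
            pass                      # real code would sleep, time out and log

    wait_for(lambda c: c < low)       # rinsed back to the neutral baseline
    trigger("load_sample")
    wait_for(lambda c: c > high)      # acid wash detected at the outlet
    trigger("acid_wash_done")
    wait_for(lambda c: c < low)       # rinsed again
    wait_for(lambda c: c > high)      # basic (high-pH) elution detected
    trigger("collect_eluate")

# toy simulation of an outlet probe trace (arbitrary conductivity units)
readings = iter([800, 300, 20, 3, 2, 150, 900, 700, 40, 4, 3, 250, 950])
run_desalting(lambda: next(readings), print)
```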
Peng, Sean X; Cousineau, Martin; Juzwin, Stephen J; Ritchie, David M
2006-01-01
A novel 96-well screen filter plate (patent pending) has been invented to eliminate a time-consuming and labor-intensive step in preparation of in vivo study samples--to remove blood or plasma clots. These clots plug the pipet tips during a manual or automated sample-transfer step causing inaccurate pipetting or total pipetting failure. Traditionally, these blood and plasma clots are removed by picking them out manually one by one from each sample tube before any sample transfer can be made. This has significantly slowed the sample preparation process and has become a bottleneck for automated high-throughput sample preparation using robotic liquid handlers. Our novel screen filter plate was developed to solve this problem. The 96-well screen filter plate consists of 96 stainless steel wire-mesh screen tubes connected to the 96 openings of a top plate so that the screen filter plate can be readily inserted into a 96-well sample storage plate. Upon insertion, the blood and plasma clots are excluded from entering the screen tube while clear sample solutions flow freely into it. In this way, sample transfer can be easily completed by either manual or automated pipetting methods. In this report, three structurally diverse compounds were selected to evaluate and validate the use of the screen filter plate. The plasma samples of these compounds were transferred and processed in the presence and absence of the screen filter plate and then analyzed by LC-MS/MS methods. Our results showed a good agreement between the samples prepared with and without the screen filter plate, demonstrating the utility and efficiency of this novel device for preparation of blood and plasma samples. The device is simple, easy to use, and reusable. It can be employed for sample preparation of other biological fluids that contain floating particulates or aggregates.
Decision support system for the detection and grading of hard exudates from color fundus photographs
NASA Astrophysics Data System (ADS)
Jaafar, Hussain F.; Nandi, Asoke K.; Al-Nuaimy, Waleed
2011-11-01
Diabetic retinopathy is a major cause of blindness, and its earliest signs include damage to the blood vessels and the formation of lesions in the retina. Automated detection and grading of hard exudates from the color fundus image is a critical step in the automated screening system for diabetic retinopathy. We propose novel methods for the detection and grading of hard exudates and the main retinal structures. For exudate detection, a novel approach based on coarse-to-fine strategy and a new image-splitting method are proposed with overall sensitivity of 93.2% and positive predictive value of 83.7% at the pixel level. The average sensitivity of the blood vessel detection is 85%, and the success rate of fovea localization is 100%. For exudate grading, a polar fovea coordinate system is adopted in accordance with medical criteria. Because of its competitive performance and ability to deal efficiently with images of variable quality, the proposed technique offers promising and efficient performance as part of an automated screening system for diabetic retinopathy.
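A minimal sketch of grading in a polar coordinate system centred on the fovea; the zone radii (in optic-disc diameters) and the example pixels are assumptions for illustration, not the paper's grading criteria:

```python
import numpy as np

def exudate_zones(coords, fovea, disc_diameter_px):
    """Express detected exudate pixels in polar coordinates about the fovea and
    assign them to concentric zones (0 = central, then progressively outer)."""
    coords = np.asarray(coords, dtype=float)            # (N, 2) array of (row, col)
    d = coords - np.asarray(fovea, dtype=float)
    r = np.hypot(d[:, 0], d[:, 1]) / disc_diameter_px   # radius in disc diameters
    theta = np.arctan2(d[:, 0], d[:, 1])                # angle, for sector grading
    zone = np.digitize(r, [1 / 3, 1.0, 2.0])
    return r, theta, zone

pixels = [(250, 260), (300, 400), (120, 500)]
print(exudate_zones(pixels, fovea=(256, 256), disc_diameter_px=90)[2])   # [0 2 3]
```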
NASA Astrophysics Data System (ADS)
van Leunen, J. A. J.; Dreessen, J.
1984-05-01
The result of a measurement of the modulation transfer function is only useful as long as it is accompanied by a complete description of all relevant measuring conditions involved. For this reason it is necessary to file a full description of the relevant measuring conditions together with the results. In earlier times some of our results were rendered useless because some of the relevant measuring conditions were accidentally not written down and were forgotten. This was mainly due to the lack of consensus about which measuring conditions had to be filed together with the result of a measurement. One way to secure uniform and complete archiving of measuring conditions and results is to automate the data handling. An attendant advantage of automation of data handling is that it does away with the time-consuming correction of rough measuring results. The automation of the data handling was accomplished with rather cheap desktop computers, which were powerful enough, however, to allow us to automate the measurement as well. After automation of the data handling we started with automatic collection of rough measurement data. Step by step we extended the automation by letting the desktop computer control more and more of the measuring set-up. At present the desktop computer controls all the electrical and most of the mechanical measuring conditions. Further it controls and reads the MTF measuring instrument. Focussing and orientation optimization can be fully automatic, semi-automatic or completely manual. MTF measuring results can be collected automatically but they can also be typed in by hand. Due to the automation we are able to implement proper archival of measuring results together with all necessary measuring conditions. The improved measuring efficiency made it possible to increase the number of routine measurements done in the same time period by an order of magnitude. To our surprise the measuring accuracy also improved by a factor of two. This was due to the much better reproducibility of the automatic optimization, which resulted in better reproducibility of the measurement result. Another advantage of the automation is that the programs that control the data handling and the automatic measurement are "user friendly". They guide the operator through the measuring procedure using information from earlier measurements of equivalent test specimens. This makes it possible to let routine measurements be done by much less skilled assistants. It also removes much of the tedious routine labour normally involved in MTF measurements. It can be concluded that automation of MTF measurements as described in the foregoing enhances the usefulness of MTF results as well as reducing the cost of MTF measurements.
Parallel adaptive wavelet collocation method for PDEs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nejadmalayeri, Alireza, E-mail: Alireza.Nejadmalayeri@gmail.com; Vezolainen, Alexei, E-mail: Alexei.Vezolainen@Colorado.edu; Brown-Dymkoski, Eric, E-mail: Eric.Browndymkoski@Colorado.edu
2015-10-01
A parallel adaptive wavelet collocation method for solving a large class of Partial Differential Equations is presented. The parallelization is achieved by developing an asynchronous parallel wavelet transform, which allows one to perform parallel wavelet transform and derivative calculations with only one data synchronization at the highest level of resolution. The data are stored using a tree-like structure with tree roots starting at an a priori defined level of resolution. Both static and dynamic domain partitioning approaches are developed. For the dynamic domain partitioning, trees are considered to be the minimum quanta of data to be migrated between the processes. This allows fully automated and efficient handling of non-simply connected partitioning of a computational domain. Dynamic load balancing is achieved via domain repartitioning during the grid adaptation step and reassigning trees to the appropriate processes to ensure approximately the same number of grid points on each process. The parallel efficiency of the approach is discussed based on parallel adaptive wavelet-based Coherent Vortex Simulations of homogeneous turbulence with linear forcing at effective non-adaptive resolutions up to 2048³ using as many as 2048 CPU cores.
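The tree-reassignment step can be illustrated with a simple greedy balancer (a sketch of the load-balancing idea, not the solver's code); the tree identifiers and sizes below are invented:

```python
import heapq

def assign_trees(tree_points, n_procs):
    """Greedy rebalancing: hand each tree (largest first) to the currently
    least-loaded process so that grid points end up roughly evenly spread."""
    heap = [(0, p, []) for p in range(n_procs)]        # (load, process id, trees)
    heapq.heapify(heap)
    for tree_id, npts in sorted(tree_points.items(), key=lambda kv: -kv[1]):
        load, p, trees = heapq.heappop(heap)
        trees.append(tree_id)
        heapq.heappush(heap, (load + npts, p, trees))
    return {p: (load, trees) for load, p, trees in heap}

# example: ten trees of uneven size spread over three processes
trees = {t: 100 * (t + 1) for t in range(10)}
for p, (load, ids) in sorted(assign_trees(trees, 3).items()):
    print(p, load, ids)
```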
To repair or not to repair: with FAVOR there is no question
NASA Astrophysics Data System (ADS)
Garetto, Anthony; Schulz, Kristian; Tabbone, Gilles; Himmelhaus, Michael; Scheruebl, Thomas
2016-10-01
In the mask shop the challenges associated with today's advanced technology nodes, both technical and economic, are becoming increasingly difficult. The constant drive to continue shrinking features means more masks per device, smaller manufacturing tolerances and more complexity along the manufacturing line with respect to the number of manufacturing steps required. Furthermore, the extremely competitive nature of the industry makes it critical for mask shops to optimize asset utilization and processes in order to maximize their competitive advantage and, in the end, profitability. Full maximization of profitability in such a complex and technologically sophisticated environment simply cannot be achieved without the use of smart automation. Smart automation allows productivity to be maximized through better asset utilization and process optimization. Reliability is improved through the minimization of manual interactions leading to fewer human error contributions and a more efficient manufacturing line. In addition to these improvements in productivity and reliability, extra value can be added through the collection and cross-verification of data from multiple sources which provides more information about our products and processes. When it comes to handling mask defects, for instance, the process consists largely of time consuming manual interactions that are error prone and often require quick decisions from operators and engineers who are under pressure. The handling of defects itself is a multiple step process consisting of several iterations of inspection, disposition, repair, review and cleaning steps. Smaller manufacturing tolerances and features with higher complexity contribute to a higher number of defects which must be handled as well as a higher level of complexity. In this paper the recent efforts undertaken by ZEISS to provide solutions which address these challenges, particularly those associated with defectivity, will be presented. From automation of aerial image analysis to the use of data driven decision making to predict and propose the optimized back end of line process flow, productivity and reliability improvements are targeted by smart automation. Additionally the generation of the ideal aerial image from the design and several repair enhancement features offer additional capabilities to improve the efficiency and yield associated with defect handling.
Development of processes for the production of low cost silicon dendritic web for solar cells
NASA Technical Reports Server (NTRS)
Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Hopkins, R. H.; Skutch, M. E.; Driggers, J. M.; Hill, F. E.
1980-01-01
High area output rates and continuous, automated growth are two key technical requirements for the growth of low-cost silicon ribbons for solar cells. By means of computer-aided furnace design, silicon dendritic web output rates as high as 27 sq cm/min have been achieved, a value in excess of that projected to meet a $0.50 per peak watt solar array manufacturing cost. The feasibility of simultaneous web growth while the melt is replenished with pelletized silicon has also been demonstrated. This step is an important precursor to the development of an automated growth system. Solar cells made on the replenished material were just as efficient as devices fabricated on typical webs grown without replenishment. Moreover, web cells made on a less-refined, pelletized polycrystalline silicon synthesized by the Battelle process yielded efficiencies up to 13% (AM1).
Process development for automated solar cell and module production. Task 4: Automated array assembly
NASA Technical Reports Server (NTRS)
1980-01-01
A process sequence which can be used in conjunction with automated equipment for the mass production of solar cell modules for terrestrial use was developed. The process sequence was then critically analyzed from a technical and economic standpoint to determine the technological readiness of certain process steps for implementation. The steps receiving analysis were: back contact metallization, automated cell array layup/interconnect, and module edge sealing. For automated layup/interconnect, both hard automation and programmable automation (using an industrial robot) were studied. The programmable automation system was then selected for actual hardware development.
Zhou, Mo; Fukuoka, Yoshimi; Mintz, Yonatan; Goldberg, Ken; Kaminsky, Philip; Flowers, Elena
2018-01-01
Background: Growing evidence shows that fixed, nonpersonalized daily step goals can discourage individuals, resulting in unchanged or even reduced physical activity. Objective: The aim of this randomized controlled trial (RCT) was to evaluate the efficacy of an automated mobile phone–based personalized and adaptive goal-setting intervention using machine learning as compared with an active control with steady daily step goals of 10,000. Methods: In this 10-week RCT, 64 participants were recruited via email announcements and were required to attend an initial in-person session. The participants were randomized into either the intervention or active control group with a one-to-one ratio after a run-in period for data collection. A study-developed mobile phone app (which delivers daily step goals using push notifications and allows real-time physical activity monitoring) was installed on each participant’s mobile phone, and participants were asked to keep their phone in a pocket throughout the entire day. Through the app, the intervention group received fully automated adaptively personalized daily step goals, and the control group received constant step goals of 10,000 steps per day. Daily step count was objectively measured by the study-developed mobile phone app. Results: The mean (SD) age of participants was 41.1 (11.3) years, and 83% (53/64) of participants were female. The baseline demographics between the 2 groups were similar (P>.05). Participants in the intervention group (n=34) had a decrease in mean (SD) daily step count of 390 (490) steps between run-in and 10 weeks, compared with a decrease of 1350 (420) steps among control participants (n=30; P=.03). The net difference in daily steps between the groups was 960 steps (95% CI 90-1830 steps). Both groups had a decrease in daily step count between run-in and 10 weeks because interventions were also provided during run-in and no natural baseline was collected. Conclusions: The results showed the short-term efficacy of this intervention, which should be formally evaluated in a full-scale RCT with a longer follow-up period. Trial Registration: ClinicalTrials.gov NCT02886871; https://clinicaltrials.gov/ct2/show/NCT02886871 (Archived by WebCite at http://www.webcitation.org/6wM1Be1Ng). PMID:29371177
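The abstract does not spell out the goal-setting algorithm, so the following is only a hedged stand-in showing one common adaptive rule: set tomorrow's goal near a rolling percentile of recent step counts, clipped to a floor and a cap. All names and thresholds are assumptions for illustration.

```python
# Hedged illustration of one adaptive goal-setting rule (a rolling percentile
# of recent step counts); the trial's actual machine-learning policy is not
# described in the abstract, so this is only a stand-in.
def next_step_goal(recent_steps, percentile=0.6, floor=2000, cap=12000):
    """Set tomorrow's goal near the chosen percentile of the last days' steps."""
    ordered = sorted(recent_steps)
    idx = min(int(percentile * len(ordered)), len(ordered) - 1)
    goal = ordered[idx]
    return max(floor, min(cap, int(round(goal, -2))))  # round to nearest 100

print(next_step_goal([4200, 6100, 5300, 7000, 4800, 6600, 5900]))
```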
Antonelli, Giorgia; Padoan, Andrea; Artusi, Carlo; Marinova, Mariela; Zaninotto, Martina; Plebani, Mario
2016-04-01
The aim of this study was to implement in our routine practice an automated saliva preparation protocol for quantification of cortisol (F) and cortisone (E) by LC-MS/MS using a liquid handling platform, maintaining the previously defined reference intervals with the manual preparation. Addition of internal standard solution to saliva samples and calibrators and SPE on μ-elution 96-well plate were performed by the liquid handling platform. After extraction, the eluates were submitted to LC-MS/MS analysis. The manual steps within the entire process were transferring saliva samples into suitable tubes, putting on the cap mat, and transferring the collection plate to the LC autosampler. Transference of the reference intervals from the manual to the automated procedure was established by Passing-Bablok regression on 120 saliva samples analyzed simultaneously with the two procedures. Calibration curves were linear throughout the selected ranges. The imprecision ranged from 2 to 10%, with recoveries from 95 to 116%. Passing-Bablok regression demonstrated no significant bias. The liquid handling platform translates the manual steps into automated operations, allowing for saving hands-on time while maintaining assay reproducibility and ensuring reliability of results, making it implementable in our routine with the previously established reference intervals. Copyright © 2015 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
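For readers unfamiliar with the comparison step, a minimal sketch of a Passing-Bablok fit is given below, assuming paired manual and automated results; a validated implementation with confidence intervals should be used for real method-comparison work, and the example values are made up.

```python
# Minimal sketch of a Passing-Bablok fit, the regression used to check that
# automated and manual preparations agree (illustrative only).
import numpy as np

def passing_bablok(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    slopes = []
    n = len(x)
    for i in range(n - 1):
        for j in range(i + 1, n):
            dx, dy = x[j] - x[i], y[j] - y[i]
            if dx == 0:          # vertical pair: skip (infinite slope)
                continue
            s = dy / dx
            if s != -1.0:        # slopes of exactly -1 are discarded by convention
                slopes.append(s)
    slopes = np.sort(np.array(slopes))
    K = int(np.sum(slopes < -1))  # offset making the estimator symmetric about -1
    N = len(slopes)
    if N % 2:
        b = slopes[(N - 1) // 2 + K]
    else:
        b = 0.5 * (slopes[N // 2 - 1 + K] + slopes[N // 2 + K])
    a = np.median(y - b * x)
    return b, a                   # slope, intercept

manual    = [4.1, 6.3, 9.8, 12.0, 15.2, 20.1]
automated = [4.0, 6.5, 9.5, 12.4, 15.0, 20.5]
print(passing_bablok(manual, automated))
```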
Automated Synthesis of a 184-Member Library of Thiadiazepan-1, 1-dioxide-4-ones
Fenster, Erik; Long, Toby R.; Zang, Qin; Hill, David; Neuenswander, Benjamin; Lushington, Gerald H.; Zhou, Aihua; Santini, Conrad; Hanson, Paul R.
2011-01-01
The construction of a 225-member (3 × 5 × 15) library of thiadiazepan-1,1-dioxide-4-ones was performed on a Chemspeed Accelerator (SLT-100) automated parallel synthesis platform, culminating in the successful preparation of 184/225 sultams. Three sultam core scaffolds were prepared based upon the utilization of an aza-Michael reaction on a multifunctional vinyl sulfonamide linchpin. The library exploits peripheral diversity in the form of a sequential, two-step [3 + 2] Huisgen cycloaddition/Pd-catalyzed Suzuki–Miyaura coupling sequence. PMID:21309582
Automated synthesis of a 184-member library of thiadiazepan-1,1-dioxide-4-ones.
Fenster, Erik; Long, Toby R; Zang, Qin; Hill, David; Neuenswander, Benjamin; Lushington, Gerald H; Zhou, Aihua; Santini, Conrad; Hanson, Paul R
2011-05-09
The construction of a 225-member (3 × 5 × 15) library of thiadiazepan-1,1-dioxide-4-ones was performed on a Chemspeed Accelerator (SLT-100) automated parallel synthesis platform, culminating in the successful preparation of 184/225 sultams. Three sultam core scaffolds were prepared based upon the utilization of an aza-Michael reaction on a multifunctional vinyl sulfonamide linchpin. The library exploits peripheral diversity in the form of a sequential, two-step [3 + 2] Huisgen cycloaddition/Pd-catalyzed Suzuki-Miyaura coupling sequence.
Questioned document workflow for handwriting with automated tools
NASA Astrophysics Data System (ADS)
Das, Krishnanand; Srihari, Sargur N.; Srinivasan, Harish
2012-01-01
During the last few years many document recognition methods have been developed to determine whether a handwriting specimen can be attributed to a known writer. However, in practice, the workflow of the document examiner continues to be manually intensive. Before a systematic, computational approach can be developed, an articulation of the steps involved in handwriting comparison is needed. We describe the workflow of handwritten questioned document examination, as described in a standards manual, and the steps where existing automation tools can be used. A well-known ransom note case is considered as an example, where one encounters testing for multiple writers of the same document, determining whether the writing is disguised, known writing that is formal while questioned writing is informal, etc. The findings for the particular ransom note case using the tools are given. Observations are also made toward developing a more fully automated approach to handwriting examination.
NASA Astrophysics Data System (ADS)
Prabandari, R. D.; Murfi, H.
2017-07-01
An increasing amount of information on social media such as Twitter requires an efficient way to find the topics so that the information can be well managed. One automated method for topic detection is separable non-negative matrix factorization (SNMF). SNMF assumes that each topic has at least one word that does not appear in other topics. This method uses a direct approach and gives polynomial-time complexity, while the previous methods are iterative approaches and have NP-hard complexity. There are three steps in the SNMF algorithm, i.e., constructing word co-occurrences, finding anchor words, and recovering topics. In this paper, we examine two topic recovery methods, namely the original recover method, which uses algebraic manipulation, and recover KL, which uses a probabilistic approach with Kullback-Leibler divergence. Our simulations show that recover KL provides better accuracy in terms of topic recall than the original recover method.
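A hedged sketch of the "finding anchor words" step is shown below: rows of the row-normalized word co-occurrence matrix are picked greedily, each time taking the row farthest from the span of the anchors already chosen (a Gram-Schmidt-style procedure commonly used for separable NMF). The matrix here is random and only illustrates the mechanics.

```python
# Hedged sketch of greedy anchor-word selection for separable NMF:
# repeatedly pick the co-occurrence row farthest from the span of the
# anchors chosen so far, then project that direction out.
import numpy as np

def find_anchor_words(Q, k):
    """Q: (vocab x vocab) co-occurrence counts; returns k anchor word indices."""
    Qn = Q / np.maximum(Q.sum(axis=1, keepdims=True), 1e-12)  # row-normalize
    anchors = []
    residual = Qn.copy()
    for _ in range(k):
        norms = np.linalg.norm(residual, axis=1)
        a = int(np.argmax(norms))                    # farthest remaining row
        anchors.append(a)
        v = residual[a] / (norms[a] + 1e-12)
        residual = residual - np.outer(residual @ v, v)  # project out direction
    return anchors

rng = np.random.default_rng(0)
Q = rng.integers(0, 20, size=(30, 30)).astype(float)
print(find_anchor_words(Q, k=3))
```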
An automated methodology development. [software design for combat simulation
NASA Technical Reports Server (NTRS)
Hawley, L. R.
1985-01-01
The design methodology employed in testing the applicability of Ada in large-scale combat simulations is described. Ada was considered as a substitute for FORTRAN to lower life cycle costs and ease the program development efforts. An object-oriented approach was taken, which featured definitions of military targets, the capability of manipulating their condition in real-time, and one-to-one correlation between the object states and real world states. The simulation design process was automated by the problem statement language (PSL)/problem statement analyzer (PSA). The PSL/PSA system accessed the problem data base directly to enhance the code efficiency by, e.g., eliminating non-used subroutines, and provided for automated report generation, besides allowing for functional and interface descriptions. The ways in which the methodology satisfied the responsiveness, reliability, transportability, modifiability, timeliness and efficiency goals are discussed.
Purdie, Thomas G; Dinniwell, Robert E; Letourneau, Daniel; Hill, Christine; Sharpe, Michael B
2011-10-01
To present an automated technique for two-field tangential breast intensity-modulated radiotherapy (IMRT) treatment planning. A total of 158 planned patients with Stage 0, I, and II breast cancer treated using whole-breast IMRT were retrospectively replanned using automated treatment planning tools. The tools developed are integrated into the existing clinical treatment planning system (Pinnacle³) and are designed to perform the manual volume delineation, beam placement, and IMRT treatment planning steps carried out by the treatment planning radiation therapist. The automated algorithm, using only the radio-opaque markers placed at CT simulation as inputs, optimizes the tangential beam parameters to geometrically minimize the amount of lung and heart treated while covering the whole-breast volume. The IMRT parameters are optimized according to the automatically delineated whole-breast volume. The mean time to generate a complete treatment plan was 6 min, 50 s ± 1 min 12 s. For the automated plans, 157 of 158 plans (99%) were deemed clinically acceptable, and 138 of 158 plans (87%) were deemed clinically improved or equal to the corresponding clinical plan when reviewed in a randomized, double-blinded study by one experienced breast radiation oncologist. In addition, overall the automated plans were dosimetrically equivalent to the clinical plans when scored for target coverage and lung and heart doses. We have developed robust and efficient automated tools for fully inverse-planned tangential breast IMRT planning that can be readily integrated into clinical practice. The tools produce clinically acceptable plans using only the common anatomic landmarks from the CT simulation process as an input. We anticipate the tools will improve patient access to high-quality IMRT treatment by simplifying the planning process and will reduce the effort and cost of incorporating more advanced planning into clinical practice. Crown Copyright © 2011. Published by Elsevier Inc. All rights reserved.
NMR-based automated protein structure determination.
Würz, Julia M; Kazemi, Sina; Schmidt, Elena; Bagaria, Anurag; Güntert, Peter
2017-08-15
NMR spectra analysis for protein structure determination can now in many cases be performed by automated computational methods. This overview of the computational methods for NMR protein structure analysis presents recent automated methods for signal identification in multidimensional NMR spectra, sequence-specific resonance assignment, collection of conformational restraints, and structure calculation, as implemented in the CYANA software package. These algorithms are sufficiently reliable and integrated into one software package to enable the fully automated structure determination of proteins starting from NMR spectra without manual interventions or corrections at intermediate steps, with an accuracy of 1-2 Å backbone RMSD in comparison with manually solved reference structures. Copyright © 2017 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tiwari, P; Chen, Y; Hong, L
2015-06-15
Purpose: We developed an automated treatment planning system based on a hierarchical goal programming approach. To demonstrate the feasibility of our method, we report the comparison of prostate treatment plans produced from the automated treatment planning system with those produced by a commercial treatment planning system. Methods: In our approach, we prioritized the goals of the optimization and solved one goal at a time. The purpose of prioritization is to ensure that higher priority dose-volume planning goals are not sacrificed to improve lower priority goals. The algorithm has four steps. The first step optimizes dose to the target structures, while sparing key sensitive organs from radiation. In the second step, the algorithm finds the best beamlet weights to reduce toxicity risks to normal tissue while holding the objective function achieved in the first step as a constraint, with a small amount of allowed slip. Likewise, the third and fourth steps introduce lower priority normal tissue goals and beam smoothing. We compared our results with prostate treatment plans from Memorial Sloan Kettering Cancer Center developed using Eclipse, with a prescription dose of 72 Gy. A combination of linear, quadratic, and gEUD objective functions was used with a modified open source solver code (IPOPT). Results: Initial plan results on 3 different cases show that the automated planning system is capable of matching or improving on expert-driven Eclipse plans. Compared to the Eclipse planning system, the automated system produced up to 26% less mean dose to rectum and 24% less mean dose to bladder while having the same D95 (after matching) to the target. Conclusion: We have demonstrated that Pareto optimal treatment plans can be generated automatically without a trial-and-error process. The solver finds an optimal plan for the given patient, as opposed to database-driven approaches that set parameters based on geometry and population modeling.
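The prioritized "one goal at a time with allowed slip" idea can be illustrated with a toy lexicographic optimization in Python; the objectives below are arbitrary quadratics standing in for the clinical dose-volume and gEUD terms, and the slip value is an assumption.

```python
# Toy illustration of prioritized (lexicographic) goal programming: each stage
# minimizes its own objective while the previous stage's optimum, relaxed by a
# small slip, becomes a constraint. Objectives here are made up.
import numpy as np
from scipy.optimize import minimize

def lexicographic(objectives, x0, slip=0.01):
    constraints, x = [], np.asarray(x0, float)
    for f in objectives:                       # highest priority first
        res = minimize(f, x, constraints=constraints, method="SLSQP")
        x, best = res.x, res.fun
        # later stages may not degrade this goal by more than `slip`
        constraints.append({"type": "ineq",
                            "fun": lambda z, f=f, b=best: b + slip - f(z)})
    return x

goal1 = lambda z: (z[0] - 1.0) ** 2 + (z[1] - 2.0) ** 2   # e.g. target coverage
goal2 = lambda z: z[0] ** 2 + z[1] ** 2                   # e.g. normal-tissue dose
print(lexicographic([goal1, goal2], x0=[0.0, 0.0]))
```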
Learn, R; Feigenbaum, E
2016-06-01
Two algorithms that enhance the utility of the absorbing boundary layer are presented, mainly in the framework of the Fourier beam-propagation method. One is an automated boundary layer width selector that chooses a near-optimal boundary size based on the initial beam shape. The second algorithm adjusts the propagation step sizes based on the beam shape at the beginning of each step in order to reduce aliasing artifacts.
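As a loose illustration of the two heuristics named in the abstract, the sketch below sizes an absorbing boundary from the initial beam's RMS width and shrinks the propagation step as energy approaches the window edge; the specific scalings are assumptions, not the authors' algorithms.

```python
# Hedged sketch: (a) size the absorbing boundary from the initial beam's RMS
# width, and (b) shrink the propagation step when significant field reaches
# the window edge (to limit wrap-around/aliasing). Scalings are illustrative.
import numpy as np

def boundary_width(x, field, pad_factor=3.0):
    I = np.abs(field) ** 2
    x0 = np.sum(x * I) / np.sum(I)                        # beam centroid
    w = np.sqrt(np.sum((x - x0) ** 2 * I) / np.sum(I))    # RMS width
    return pad_factor * w                                 # absorber thickness

def step_size(field, dz_max, edge_fraction=1e-3):
    I = np.abs(field) ** 2
    fill = np.max(I[[0, -1]]) / np.max(I)                 # relative energy at edges
    return dz_max / (1.0 + fill / edge_fraction)          # shrink step near edges

x = np.linspace(-5e-3, 5e-3, 1024)
field = np.exp(-(x / 0.5e-3) ** 2)
print(boundary_width(x, field), step_size(field, dz_max=1e-3))
```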
DOE Office of Scientific and Technical Information (OSTI.GOV)
Learn, R.; Feigenbaum, E.
Two algorithms that enhance the utility of the absorbing boundary layer are presented, mainly in the framework of the Fourier beam-propagation method. One is an automated boundary layer width selector that chooses a near-optimal boundary size based on the initial beam shape. Furthermore, the second algorithm adjusts the propagation step sizes based on the beam shape at the beginning of each step in order to reduce aliasing artifacts.
Learn, R.; Feigenbaum, E.
2016-05-27
Two algorithms that enhance the utility of the absorbing boundary layer are presented, mainly in the framework of the Fourier beam-propagation method. One is an automated boundary layer width selector that chooses a near-optimal boundary size based on the initial beam shape. Furthermore, the second algorithm adjusts the propagation step sizes based on the beam shape at the beginning of each step in order to reduce aliasing artifacts.
NASA Technical Reports Server (NTRS)
Aksamentov, Valery
1996-01-01
Changes in the former Soviet Union have opened the door to the exchange of new technology. Interest in this work has been particularly related to Thermal Electric Cooling Devices (TED's), which have an application in the Thermal Enclosure System (TES) developed by NASA. Preliminary information received by NASA/MSFC indicates that Russian TED's have higher efficiency. Based on that assumption, NASA/MSFC awarded a contract to the University of Alabama in Huntsville (UAH) to study Russian TED technology. To fulfill this, a few steps must be taken: (1) potential specifications and configurations should be defined for the use of TED's in Protein Crystal Growing (PCG) thermal control hardware; and (2) work should proceed closely with the identified Russian source to define and identify potential Russian TED's that exceed the performance of available domestic TED's. Based on the data from Russia, it is possible to make plans for further steps, such as buying and testing high-performance TED's. To accomplish this goal, two subcontracts have been released: one to Automated Sciences Group (ASG), located in Huntsville, AL, and one to the International Center for Advanced Studies 'Cosmos', located in Moscow, Russia.
Schmidt, Thomas H; Kandt, Christian
2012-10-22
At the beginning of each molecular dynamics membrane simulation stands the generation of a suitable starting structure which includes the working steps of aligning membrane and protein and seamlessly accommodating the protein in the membrane. Here we introduce two efficient and complementary methods based on pre-equilibrated membrane patches, automating these steps. Using a voxel-based cast of the coarse-grained protein, LAMBADA computes a hydrophilicity profile-derived scoring function based on which the optimal rotation and translation operations are determined to align protein and membrane. Employing an entirely geometrical approach, LAMBADA is independent from any precalculated data and aligns even large membrane proteins within minutes on a regular workstation. LAMBADA is the first tool performing the entire alignment process automatically while providing the user with the explicit 3D coordinates of the aligned protein and membrane. The second tool is an extension of the InflateGRO method addressing the shortcomings of its predecessor in a fully automated workflow. Determining the exact number of overlapping lipids based on the area occupied by the protein and restricting expansion, compression and energy minimization steps to a subset of relevant lipids through automatically calculated and system-optimized operation parameters, InflateGRO2 yields optimal lipid packing and reduces lipid vacuum exposure to a minimum preserving as much of the equilibrated membrane structure as possible. Applicable to atomistic and coarse grain structures in MARTINI format, InflateGRO2 offers high accuracy, fast performance, and increased application flexibility permitting the easy preparation of systems exhibiting heterogeneous lipid composition as well as embedding proteins into multiple membranes. Both tools can be used separately, in combination with other methods, or in tandem permitting a fully automated workflow while retaining a maximum level of usage control and flexibility. To assess the performance of both methods, we carried out test runs using 22 membrane proteins of different size and transmembrane structure.
Yu, Sheng; Liao, Katherine P; Shaw, Stanley Y; Gainer, Vivian S; Churchill, Susanne E; Szolovits, Peter; Murphy, Shawn N; Kohane, Isaac S; Cai, Tianxi
2015-09-01
Analysis of narrative (text) data from electronic health records (EHRs) can improve population-scale phenotyping for clinical and genetic research. Currently, selection of text features for phenotyping algorithms is slow and laborious, requiring extensive and iterative involvement by domain experts. This paper introduces a method to develop phenotyping algorithms in an unbiased manner by automatically extracting and selecting informative features, which can be comparable to expert-curated ones in classification accuracy. Comprehensive medical concepts were collected from publicly available knowledge sources in an automated, unbiased fashion. Natural language processing (NLP) revealed the occurrence patterns of these concepts in EHR narrative notes, which enabled selection of informative features for phenotype classification. When combined with additional codified features, a penalized logistic regression model was trained to classify the target phenotype. The authors applied this method to develop algorithms to identify patients with rheumatoid arthritis and coronary artery disease cases among those with rheumatoid arthritis from a large multi-institutional EHR. The area under the receiver operating characteristic curves (AUC) for classifying RA and CAD using models trained with automated features were 0.951 and 0.929, respectively, compared to the AUCs of 0.938 and 0.929 by models trained with expert-curated features. Models trained with NLP text features selected through an unbiased, automated procedure achieved comparable or slightly higher accuracy than those trained with expert-curated features. The majority of the selected model features were interpretable. The proposed automated feature extraction method, generating highly accurate phenotyping algorithms with improved efficiency, is a significant step toward high-throughput phenotyping. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
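A minimal sketch of the final classification step, assuming synthetic concept-count features and a surrogate label, is shown below: an L1-penalized logistic regression keeps only the informative features. Feature names, data, and the penalty strength are placeholders.

```python
# Sketch of the penalized classification step on synthetic data: L1-penalized
# logistic regression over concept/code counts; the sparsity penalty performs
# the feature selection described in the abstract.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n, p = 500, 40
X = rng.poisson(1.0, size=(n, p)).astype(float)          # NLP concept / code counts
signal = X[:, 0] + 0.8 * X[:, 3] - 0.5 * X[:, 7]
y = (signal + rng.normal(size=n) > 1.5).astype(int)      # surrogate phenotype label

model = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
model.fit(X, y)
selected = np.flatnonzero(model.coef_[0])                 # features the penalty kept
print("features retained:", selected)
```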
Fully automated chest wall line segmentation in breast MRI by using context information
NASA Astrophysics Data System (ADS)
Wu, Shandong; Weinstein, Susan P.; Conant, Emily F.; Localio, A. Russell; Schnall, Mitchell D.; Kontos, Despina
2012-03-01
Breast MRI has emerged as an effective modality for the clinical management of breast cancer. Evidence suggests that computer-aided applications can further improve the diagnostic accuracy of breast MRI. A critical and challenging first step for automated breast MRI analysis is to separate the breast as an organ from the chest wall. Manual segmentation or user-assisted interactive tools are inefficient, tedious, and error-prone, making them prohibitively impractical for processing large amounts of data from clinical trials. To address this challenge, we developed a fully automated and robust computerized segmentation method that intensively utilizes context information of breast MR imaging and the breast tissue's morphological characteristics to accurately delineate the breast and chest wall boundary. A critical component is the joint application of anisotropic diffusion and bilateral image filtering to enhance the edge that corresponds to the chest wall line (CWL) and to reduce the effect of adjacent non-CWL tissues. A CWL voting algorithm is proposed based on CWL candidates yielded from multiple sequential MRI slices, in which a CWL representative is generated and used through a dynamic time warping (DTW) algorithm to filter out inferior candidates, leaving the optimal one. Our method is validated by a representative dataset of 20 3D unilateral breast MRI scans that span the full range of the American College of Radiology (ACR) Breast Imaging Reporting and Data System (BI-RADS) fibroglandular density categorization. A promising performance (average overlay percentage of 89.33%) is observed when the automated segmentation is compared to manually segmented ground truth obtained by an experienced breast imaging radiologist. The automated method runs time-efficiently at ~3 minutes for each breast MR image set (28 slices).
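The DTW comparison used to filter chest wall line candidates can be illustrated with the classic dynamic programming recurrence; the sketch below works on 1D toy profiles rather than the 2D boundary curves used in the paper.

```python
# Hedged sketch of the dynamic time warping (DTW) score used to compare
# chest-wall-line candidates against a representative curve (1D toy example).
import numpy as np

def dtw_distance(a, b):
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

representative = np.sin(np.linspace(0, np.pi, 50))
candidate_good = np.sin(np.linspace(0, np.pi, 60))   # similar shape, different length
candidate_bad  = np.linspace(0, 1, 55)               # dissimilar shape
print(dtw_distance(representative, candidate_good),
      dtw_distance(representative, candidate_bad))
```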
Automation of testing modules of controller ELSY-TMK
NASA Astrophysics Data System (ADS)
Dolotov, A. E.; Dolotova, R. G.; Petuhov, D. V.; Potapova, A. P.
2017-01-01
Modern means of process automation make it possible to ensure high quality standards for released products and to raise labour efficiency. This paper presents data on the automation of the test process for the ELSY-TMK controller [1]. The ELSY-TMK programmable logic controller is an effective modular platform for building automation systems for small and medium-scale industrial production. The controller's modern, functional communication standards and open environment make it a powerful tool for a wide spectrum of industrial automation applications. The algorithm allows controller modules to be tested, by operating the switching system and external devices, faster and at a higher level of quality than manual testing without such means.
Rapid, chemical-free breaking of microfluidic emulsions with a hand-held antistatic gun
Shahi, Payam; Abate, Adam R.
2017-01-01
Droplet microfluidics can form and process millions of picoliter droplets with speed and ease, allowing the execution of huge numbers of biological reactions for high-throughput studies. However, at the conclusion of most experiments, the emulsions must be broken to recover and analyze their contents. This is usually achieved with demulsifiers, like perfluorooctanol and chloroform, which can interfere with downstream reactions and harm cells. Here, we describe a simple approach to rapidly and efficiently break microfluidic emulsions, which requires no chemicals. Our method allows one-pot multi-step reactions, making it useful for large scale automated processing of reactions requiring demulsification. Using a hand-held antistatic gun, we pulse emulsions with the electric field, coalescing ∼100 μl of droplets in ∼10 s. We show that while emulsions broken with chemical demulsifiers exhibit potent PCR inhibition, the antistatic-broken emulsions amplify efficiently. The ability to break emulsions quickly without chemicals should make our approach valuable for most demulsification needs in microfluidics. PMID:28794817
Lange, Paul P; James, Keith
2012-10-08
A novel methodology for the synthesis of druglike heterocycle libraries has been developed through the use of flow reactor technology. The strategy employs orthogonal modification of a heterocyclic core, which is generated in situ, and was used to construct both a 25-membered library of druglike 3-aminoindolizines, and selected examples of a 100-member virtual library. This general protocol allows a broad range of acylation, alkylation and sulfonamidation reactions to be performed in conjunction with a tandem Sonogashira coupling/cycloisomerization sequence. All three synthetic steps were conducted under full automation in the flow reactor, with no handling or isolation of intermediates, to afford the desired products in good yields. This fully automated, multistep flow approach opens the way to highly efficient generation of druglike heterocyclic systems as part of a lead discovery strategy or within a lead optimization program.
Moenninghoff, Christoph; Umutlu, Lale; Kloeters, Christian; Ringelstein, Adrian; Ladd, Mark E; Sombetzki, Antje; Lauenstein, Thomas C; Forsting, Michael; Schlamann, Marc
2013-06-01
Workflow efficiency and workload of radiological technologists (RTs) were compared in head examinations performed with two 1.5 T magnetic resonance (MR) scanners equipped with or without an automated user interface called "day optimizing throughput" (Dot) workflow engine. Thirty-four patients with known intracranial pathology were examined with a 1.5 T MR scanner with Dot workflow engine (Siemens MAGNETOM Aera) and with a 1.5 T MR scanner with conventional user interface (Siemens MAGNETOM Avanto) using four standardized examination protocols. The elapsed time for all necessary work steps, which were performed by 11 RTs within the total examination time, was compared for each examination at both MR scanners. The RTs evaluated the user-friendliness of both scanners by a questionnaire. Normality of distribution was checked for all continuous variables by use of the Shapiro-Wilk test. Normally distributed variables were analyzed by Student's paired t-test, otherwise Wilcoxon signed-rank test was used to compare means. Total examination time of MR examinations performed with Dot engine was reduced from 24:53 to 20:01 minutes (P < .001) and the necessary RT intervention decreased by 61% (P < .001). The Dot engine's automated choice of MR protocols was significantly better assessed by the RTs than the conventional user interface (P = .001). According to this preliminary study, the Dot workflow engine is a time-saving user assistance software, which decreases the RTs' effort significantly and may help to automate neuroradiological examinations for a higher workflow efficiency. Copyright © 2013 AUR. Published by Elsevier Inc. All rights reserved.
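The statistical procedure stated in the abstract (Shapiro-Wilk normality check, then a paired t-test or Wilcoxon signed-rank test) can be written directly with scipy; the timing data below are simulated for illustration.

```python
# Sketch of the comparison described: check normality of the paired differences
# with Shapiro-Wilk, then use a paired t-test or Wilcoxon signed-rank test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
time_dot = rng.normal(20.0, 2.0, size=34)               # minutes, Dot workflow (simulated)
time_conv = time_dot + rng.normal(4.8, 1.5, size=34)    # conventional interface (simulated)

diff = time_conv - time_dot
_, p_norm = stats.shapiro(diff)
if p_norm > 0.05:                                       # differences look normal
    stat, p = stats.ttest_rel(time_conv, time_dot)
    test = "paired t-test"
else:
    stat, p = stats.wilcoxon(time_conv, time_dot)
    test = "Wilcoxon signed-rank test"
print(f"{test}: statistic={stat:.2f}, p={p:.4g}")
```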
Zhou, Long; Chang, Jingjing; Liu, Ziye; Sun, Xu; Lin, Zhenhua; Chen, Dazheng; Zhang, Chunfu; Zhang, Jincheng; Hao, Yue
2018-02-08
Perovskite/PCBM heterojunctions are efficient for fabricating perovskite solar cells with high performance and long-term stability. In this study, an efficient perovskite/PCBM heterojunction was formed via conventional sequential deposition and one-step formation processes. Compared with conventional deposition, the one-step process was more facile, and produced a perovskite thin film of substantially improved quality due to fullerene passivation. Moreover, the resulting perovskite/PCBM heterojunction exhibited more efficient carrier transfer and extraction, and reduced carrier recombination. The perovskite solar cell device based on one-step perovskite/PCBM heterojunction formation exhibited a higher maximum PCE of 17.8% compared with that from the conventional method (13.7%). The device also showed exceptional stability, retaining 83% of initial PCE after 60 days of storage under ambient conditions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Egorov, Oleg; O'Hara, Matthew J.; Grate, Jay W.
An automated fluidic instrument is described that rapidly determines the total 99Tc content of aged nuclear waste samples, where the matrix is chemically and radiologically complex and the existing speciation of the 99Tc is variable. The monitor links microwave-assisted sample preparation with an automated anion exchange column separation and detection using a flow-through solid scintillator detector. The sample preparation steps acidify the sample, decompose organics, and convert all Tc species to the pertechnetate anion. The column-based anion exchange procedure separates the pertechnetate from the complex sample matrix, so that radiometric detection can provide accurate measurement of 99Tc. We developed a preprogrammed spike addition procedure to automatically determine matrix-matched calibration. The overall measurement efficiency that is determined simultaneously provides a self-diagnostic parameter for the radiochemical separation and overall instrument function. Continuous, automated operation was demonstrated over the course of 54 h, which resulted in the analysis of 215 samples plus 54 hourly spike-addition samples, with consistent overall measurement efficiency for the operation of the monitor. A sample can be processed and measured automatically in just 12.5 min with a detection limit of 23.5 Bq/mL of 99Tc in low activity waste (0.495 mL sample volume), with better than 10% RSD precision at concentrations above the quantification limit. This rapid automated analysis method was developed to support nuclear waste processing operations planned for the Hanford nuclear site.
Weiss, Kenneth L; Pan, Hai; Storrs, Judd; Strub, William; Weiss, Jane L; Jia, Li; Eldevik, O Petter
2003-05-01
Variability in patient head positioning may yield substantial interstudy image variance in the clinical setting. We describe and test three-step technologist and computer-automated algorithms designed to image the brain in a standard reference system and reduce variance. Triple oblique axial images obtained parallel to the Talairach anterior commissure (AC)-posterior commissure (PC) plane were reviewed in a prospective analysis of 126 consecutive patients. Requisite roll, yaw, and pitch correction, as three authors determined independently and subsequently by consensus, were compared with the technologists' actual graphical prescriptions and those generated by a novel computer automated three-step (CATS) program. Automated pitch determinations generated with Statistical Parametric Mapping '99 (SPM'99) were also compared. Requisite pitch correction (15.2° ± 10.2°) far exceeded that for roll (-0.6° ± 3.7°) and yaw (-0.9° ± 4.7°) in terms of magnitude and variance (P < .001). Technologist and computer-generated prescriptions substantially reduced interpatient image variance with regard to roll (3.4° and 3.9° vs 13.5°), yaw (0.6° and 2.5° vs 22.3°), and pitch (28.6°, 18.5° with CATS, and 59.3° with SPM'99 vs 104°). CATS performed worse than the technologists in yaw prescription, and it was equivalent in roll and pitch prescriptions. Talairach prescriptions better approximated standard CT canthomeatal angulations (9° vs 24°) and provided more efficient brain coverage than that of routine axial imaging. Brain MR prescriptions corrected for direct roll, yaw, and Talairach AC-PC pitch can be readily achieved by trained technologists or automated computer algorithms. This ability will substantially reduce interpatient variance, allow better approximation of standard CT angulation, and yield more efficient brain coverage than that of routine clinical axial imaging.
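As a rough illustration of composing the three corrections, the sketch below builds a rotation matrix from roll, yaw, and pitch angles; the axis assignments and rotation order are assumptions for the example, not the scanners' internal convention.

```python
# Illustrative composition of roll, yaw, and pitch corrections as rotation
# matrices; axis assignments and ordering are assumptions for the sketch.
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def prescription_matrix(roll_deg, yaw_deg, pitch_deg):
    # assumed convention: roll about y, yaw about z, pitch about x
    roll, yaw, pitch = np.radians([roll_deg, yaw_deg, pitch_deg])
    return rot_x(pitch) @ rot_z(yaw) @ rot_y(roll)

# mean corrections reported in the abstract, applied as one example prescription
print(prescription_matrix(roll_deg=-0.6, yaw_deg=-0.9, pitch_deg=15.2).round(3))
```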
An automated dose tracking system for adaptive radiation therapy.
Liu, Chang; Kim, Jinkoo; Kumarasiri, Akila; Mayyas, Essa; Brown, Stephen L; Wen, Ning; Siddiqui, Farzan; Chetty, Indrin J
2018-02-01
The implementation of adaptive radiation therapy (ART) into routine clinical practice is technically challenging and requires significant resources to perform and validate each process step. The objective of this report is to identify the key components of ART, to illustrate how a specific automated procedure improves efficiency, and to facilitate the routine clinical application of ART. Patient image data were exported from a clinical database and converted to an intermediate format for point-wise dose tracking and accumulation. The process was automated using in-house developed software containing three modularized components: an ART engine, user interactive tools, and integration tools. The ART engine conducts computing tasks using the following modules: data importing, image pre-processing, dose mapping, dose accumulation, and reporting. In addition, custom graphical user interfaces (GUIs) were developed to allow user interaction with select processes such as deformable image registration (DIR). A commercial scripting application programming interface was used to incorporate automated dose calculation for application in routine treatment planning. Each module was considered an independent program, written in C++ or C#, running in a distributed Windows environment, scheduled and monitored by integration tools. The automated tracking system was retrospectively evaluated for 20 patients with prostate cancer and 96 patients with head and neck cancer, under institutional review board (IRB) approval. In addition, the system was evaluated prospectively using 4 patients with head and neck cancer. Altogether 780 prostate dose fractions and 2586 head and neck cancer dose fractions were processed, including DIR and dose mapping. On average, daily cumulative dose was computed in 3 h and the manual work was limited to 13 min per case, with approximately 10% of cases requiring an additional 10 min for image registration refinement. An efficient and convenient dose tracking system for ART in the clinical setting is presented. The software and automated processes were rigorously evaluated and validated using patient image datasets. Automation of the various procedures has improved efficiency significantly, allowing for the routine clinical application of ART for improving radiation therapy effectiveness. Copyright © 2017 Elsevier B.V. All rights reserved.
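The dose mapping and accumulation modules can be caricatured as follows: each fraction's dose grid is sampled through a deformation vector field back onto the reference anatomy and summed. This toy sketch uses identity deformations and random dose grids; real DIR, resampling, and uncertainty handling are far more involved.

```python
# Toy illustration of point-wise dose accumulation through deformation
# vector fields (DVFs); the deformations here are identity for simplicity.
import numpy as np
from scipy.ndimage import map_coordinates

def accumulate(reference_shape, fraction_doses, dvfs):
    """Sum fraction doses on the reference grid through per-fraction DVFs."""
    total = np.zeros(reference_shape)
    grid = np.indices(reference_shape).astype(float)   # reference voxel coordinates
    for dose, dvf in zip(fraction_doses, dvfs):
        coords = grid + dvf                 # where each reference voxel maps to
        total += map_coordinates(dose, coords, order=1, mode="nearest")
    return total

shape = (32, 32, 32)
rng = np.random.default_rng(0)
doses = [rng.random(shape) for _ in range(3)]          # toy per-fraction dose grids
dvfs = [np.zeros((3,) + shape) for _ in range(3)]      # identity deformations here
print(accumulate(shape, doses, dvfs).mean())
```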
ERIC Educational Resources Information Center
Cochrane, Andy; Barnes-Holmes, Dermot; Barnes-Holmes, Yvonne
2008-01-01
One hundred twenty female participants, with varying levels of spider fear were asked to complete an automated 8-step perceived-threat behavioral approach test (PT-BAT). The steps involved asking the participants if they were willing to put their hand into a number of opaque jars with an incrementally increasing risk of contact with a spider (none…
Puttini, Stefania; Ouvrard-Pascaud, Antoine; Palais, Gael; Beggah, Ahmed T; Gascard, Philippe; Cohen-Tannoudji, Michel; Babinet, Charles; Blot-Chabaud, Marcel; Jaisser, Frederic
2005-03-16
Functional genomic analysis is a challenging step in the so-called post-genomic field. Identification of potential targets using large-scale gene expression analysis requires functional validation to identify those that are physiologically relevant. Genetically modified cell models are often used for this purpose allowing up- or down-expression of selected targets in a well-defined and if possible highly differentiated cell type. However, the generation of such models remains time-consuming and expensive. In order to alleviate this step, we developed a strategy aimed at the rapid and efficient generation of genetically modified cell lines with conditional, inducible expression of various target genes. Efficient knock-in of various constructs, called targeted transgenesis, in a locus selected for its permissibility to the tet inducible system, was obtained through the stimulation of site-specific homologous recombination by the meganuclease I-SceI. Our results demonstrate that targeted transgenesis in a reference inducible locus greatly facilitated the functional analysis of the selected recombinant cells. The efficient screening strategy we have designed makes possible automation of the transfection and selection steps. Furthermore, this strategy could be applied to a variety of highly differentiated cells.
Using Automation to Improve the Flight Software Testing Process
NASA Technical Reports Server (NTRS)
ODonnell, James R., Jr.; Andrews, Stephen F.; Morgenstern, Wendy M.; Bartholomew, Maureen O.; McComas, David C.; Bauer, Frank H. (Technical Monitor)
2001-01-01
One of the critical phases in the development of a spacecraft attitude control system (ACS) is the testing of its flight software. The testing (and test verification) of ACS flight software requires a mix of skills involving software, attitude control, data manipulation, and analysis. The process of analyzing and verifying flight software test results often creates a bottleneck which dictates the speed at which flight software verification can be conducted. In the development of the Microwave Anisotropy Probe (MAP) spacecraft ACS subsystem, an integrated design environment was used that included a MAP high fidelity (HiFi) simulation, a central database of spacecraft parameters, a script language for numeric and string processing, and plotting capability. In this integrated environment, it was possible to automate many of the steps involved in flight software testing, making the entire process more efficient and thorough than on previous missions. In this paper, we will compare the testing process used on MAP to that used on previous missions. The software tools that were developed to automate testing and test verification will be discussed, including the ability to import and process test data, synchronize test data and automatically generate HiFi script files used for test verification, and an automated capability for generating comparison plots. A summary of the perceived benefits of applying these test methods on MAP will be given. Finally, the paper will conclude with a discussion of re-use of the tools and techniques presented, and the ongoing effort to apply them to flight software testing of the Triana spacecraft ACS subsystem.
Using Automation to Improve the Flight Software Testing Process
NASA Technical Reports Server (NTRS)
ODonnell, James R., Jr.; Morgenstern, Wendy M.; Bartholomew, Maureen O.
2001-01-01
One of the critical phases in the development of a spacecraft attitude control system (ACS) is the testing of its flight software. The testing (and test verification) of ACS flight software requires a mix of skills involving software, knowledge of attitude control and attitude control hardware, data manipulation, and analysis. The process of analyzing and verifying flight software test results often creates a bottleneck which dictates the speed at which flight software verification can be conducted. In the development of the Microwave Anisotropy Probe (MAP) spacecraft ACS subsystem, an integrated design environment was used that included a MAP high fidelity (HiFi) simulation, a central database of spacecraft parameters, a script language for numeric and string processing, and plotting capability. In this integrated environment, it was possible to automate many of the steps involved in flight software testing, making the entire process more efficient and thorough than on previous missions. In this paper, we will compare the testing process used on MAP to that used on other missions. The software tools that were developed to automate testing and test verification will be discussed, including the ability to import and process test data, synchronize test data and automatically generate HiFi script files used for test verification, and an automated capability for generating comparison plots. A summary of the benefits of applying these test methods on MAP will be given. Finally, the paper will conclude with a discussion of re-use of the tools and techniques presented, and the ongoing effort to apply them to flight software testing of the Triana spacecraft ACS subsystem.
Automated assembly of fast-axis collimation (FAC) lenses for diode laser bar modules
NASA Astrophysics Data System (ADS)
Miesner, Jörn; Timmermann, Andre; Meinschien, Jens; Neumann, Bernhard; Wright, Steve; Tekin, Tolga; Schröder, Henning; Westphalen, Thomas; Frischkorn, Felix
2009-02-01
Laser diodes and diode laser bars are key components in high power semiconductor lasers and solid state laser systems. During manufacture, the assembly of the fast axis collimation (FAC) lens is a crucial step. The goal of our activities is to design an automated assembly system for high volume production. In this paper the results of an intermediate milestone will be reported: a demonstration system was designed, realized and tested to prove the feasibility of all of the system components and process features. The demonstration system consists of a high precision handling system, metrology for process feedback, a powerful digital image processing system and tooling for glue dispensing, UV curing and laser operation. The system components as well as their interaction with each other were tested in an experimental system in order to glean design knowledge for the fully automated assembly system. The adjustment of the FAC lens is performed by a series of predefined steps monitored by two cameras concurrently imaging the far field and the near field intensity distributions. Feedback from these cameras, processed by a powerful and efficient image processing algorithm, controls a five-axis precision motion system to optimize the fast axis collimation of the laser beam. Automated cementing of the FAC to the diode bar completes the process. The presentation will show the system concept, the algorithm of the adjustment as well as experimental results. A critical discussion of the results will close the talk.
Zhou, Mo; Fukuoka, Yoshimi; Mintz, Yonatan; Goldberg, Ken; Kaminsky, Philip; Flowers, Elena; Aswani, Anil
2018-01-25
Growing evidence shows that fixed, nonpersonalized daily step goals can discourage individuals, resulting in unchanged or even reduced physical activity. The aim of this randomized controlled trial (RCT) was to evaluate the efficacy of an automated mobile phone-based personalized and adaptive goal-setting intervention using machine learning as compared with an active control with steady daily step goals of 10,000. In this 10-week RCT, 64 participants were recruited via email announcements and were required to attend an initial in-person session. The participants were randomized into either the intervention or active control group with a one-to-one ratio after a run-in period for data collection. A study-developed mobile phone app (which delivers daily step goals using push notifications and allows real-time physical activity monitoring) was installed on each participant's mobile phone, and participants were asked to keep their phone in a pocket throughout the entire day. Through the app, the intervention group received fully automated adaptively personalized daily step goals, and the control group received constant step goals of 10,000 steps per day. Daily step count was objectively measured by the study-developed mobile phone app. The mean (SD) age of participants was 41.1 (11.3) years, and 83% (53/64) of participants were female. The baseline demographics between the 2 groups were similar (P>.05). Participants in the intervention group (n=34) had a decrease in mean (SD) daily step count of 390 (490) steps between run-in and 10 weeks, compared with a decrease of 1350 (420) steps among control participants (n=30; P=.03). The net difference in daily steps between the groups was 960 steps (95% CI 90-1830 steps). Both groups had a decrease in daily step count between run-in and 10 weeks because interventions were also provided during run-in and no natural baseline was collected. The results showed the short-term efficacy of this intervention, which should be formally evaluated in a full-scale RCT with a longer follow-up period. ClinicalTrials.gov: NCT02886871; https://clinicaltrials.gov/ct2/show/NCT02886871 (Archived by WebCite at http://www.webcitation.org/6wM1Be1Ng). ©Mo Zhou, Yoshimi Fukuoka, Yonatan Mintz, Ken Goldberg, Philip Kaminsky, Elena Flowers, Anil Aswani. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 25.01.2018.
NASA Technical Reports Server (NTRS)
1984-01-01
The electroepitaxial process and the Very Large Scale Integration (VLSI) circuits (chips) facilities were chosen because each requires a very high degree of automation, and therefore involved extensive use of teleoperators, robotics, process mechanization, and artificial intelligence. Both cover a raw materials process and a sophisticated multi-step process and are therefore highly representative of the kinds of difficult operation, maintenance, and repair challenges which can be expected for any type of space manufacturing facility. Generic areas were identified which will require significant further study. The initial design will be based on terrestrial state-of-the-art hard automation. One hundred candidate missions were evaluated on the basis of automation potential and availability of meaningful knowledge. The design requirements and unconstrained design concepts developed for the two missions are presented.
Amendola, Alessandra; Coen, Sabrina; Belladonna, Stefano; Pulvirenti, F Renato; Clemens, John M; Capobianchi, M Rosaria
2011-08-01
Diagnostic laboratories need automation that facilitates efficient processing and workflow management to meet today's challenges for expanding services and reducing cost, yet maintaining the highest levels of quality. Processing efficiency of two commercially available automated systems for quantifying HIV-1 and HCV RNA, Abbott m2000 system and Roche COBAS Ampliprep/COBAS TaqMan 96 (docked) systems (CAP/CTM), was evaluated in a mid/high throughput workflow laboratory using a representative daily workload of 24 HCV and 72 HIV samples. Three test scenarios were evaluated: A) one run with four batches on the CAP/CTM system, B) two runs on the Abbott m2000 and C) one run using the Abbott m2000 maxCycle feature (maxCycle) for co-processing these assays. Cycle times for processing, throughput and hands-on time were evaluated. Overall processing cycle time was 10.3, 9.1 and 7.6 h for Scenarios A), B) and C), respectively. Total hands-on time for each scenario was, in order, 100.0 (A), 90.3 (B) and 61.4 min (C). The interface of an automated analyzer to the laboratory workflow, notably system set up for samples and reagents and clean up functions, are as important as the automation capability of the analyzer for the overall impact to processing efficiency and operator hands-on time.
Knepper, Andreas; Heiser, Michael; Glauche, Florian; Neubauer, Peter
2014-12-01
The enormous number of possible variations in bioprocesses challenges process development to fix a commercial process within cost and time constraints. Although some cultivation systems and some devices for unit operations combine the latest technology on miniaturization, parallelization, and sensing, the degree of automation in upstream and downstream bioprocess development is still limited to single steps. We aim to face this challenge by an interdisciplinary approach to significantly shorten development times and costs. As a first step, we scaled down analytical assays to the microliter scale and created automated procedures for starting the cultivation and monitoring the optical density (OD), pH, concentrations of glucose and acetate in the culture medium, and product formation in fed-batch cultures in the 96-well format. Then, the separate measurements of pH, OD, and concentrations of acetate and glucose were combined into one method. This method enables automated process monitoring at dedicated intervals (e.g., also during the night). By this approach, we managed to increase the information content of cultivations in 96-microwell plates, thus turning them into a suitable tool for high-throughput bioprocess development. Here, we present the flowcharts as well as cultivation data of our automation approach. © 2014 Society for Laboratory Automation and Screening.
Contingency Management with Human Autonomy Teaming
NASA Technical Reports Server (NTRS)
Shively, Robert J.; Lachter, Joel B.
2018-01-01
Automation is playing an increasingly important role in many operations. It is often cheaper, faster, and more precise than human operators. However, automation is not perfect. There are many situations in which a human operator must step in. We refer to these instances as contingencies and the act of stepping in as contingency management. Here we propose coupling Human Autonomy Teaming (HAT) with contingency management. We describe two aspects of HAT: bi-directional communication and working agreements (or plays). Bi-directional communication, like Crew Resource Management in traditional aviation, allows all parties to contribute to a decision. Working agreements specify roles and responsibilities. Importantly, working agreements allow for the possibility of roles and responsibilities changing depending on environmental factors (e.g., situations the automation was not designed for, workload, risk, or trust). This allows the automation to "automatically" become more autonomous as it becomes more trusted and/or it is updated to deal with a more complete set of possible situations. We present a concrete example using a prototype contingency management station one might find in a future airline operations center. Automation proposes reroutes for aircraft that encounter bad weather or are forced to divert for environmental or systems reasons. If specific conditions are met, these recommendations may be autonomously datalinked to the affected aircraft.
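A working agreement of the kind described can be thought of as an explicit rule that decides when the automation may act on its own; the sketch below encodes one such invented rule (trust, workload, risk, and fuel thresholds are all assumptions) for deciding whether a proposed reroute is datalinked automatically or routed to a dispatcher.

```python
# Hedged sketch of a "working agreement" check: the reroute proposed by the
# automation is datalinked autonomously only when agreed (invented) conditions
# on trust, dispatcher workload, risk, and added fuel are met.
from dataclasses import dataclass

@dataclass
class Proposal:
    flight: str
    reason: str
    added_fuel_kg: float
    risk_score: float          # 0 (benign) .. 1 (severe)

def may_autolink(p: Proposal, trust: float, dispatcher_workload: float) -> bool:
    return (trust >= 0.8 and
            dispatcher_workload >= 0.7 and   # humans busy: automation steps up
            p.risk_score <= 0.3 and
            p.added_fuel_kg <= 500)

p = Proposal("UA123", "convective weather", added_fuel_kg=320, risk_score=0.2)
print("datalink automatically"
      if may_autolink(p, trust=0.85, dispatcher_workload=0.75)
      else "route to dispatcher for approval")
```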
Poon, Candice C; Ebacher, Vincent; Liu, Katherine; Yong, Voon Wee; Kelly, John James Patrick
2018-05-03
Automated slide scanning and segmentation of fluorescently-labeled tissues is the most efficient way to analyze whole slides or large tissue sections. Unfortunately, many researchers spend large amounts of time and resources developing and optimizing workflows that are only relevant to their own experiments. In this article, we describe a protocol that can be used by those with access to a widefield high-content analysis system (WHCAS) to image any slide-mounted tissue, with options for customization within pre-built modules found in the associated software. Although the WHCAS was not originally intended for slide scanning, the steps detailed in this article make it possible to acquire slide-scanning images in the WHCAS, which can be imported into the associated software. In this example, the automated segmentation of brain tumor slides is demonstrated, but the automated segmentation of any fluorescently-labeled nuclear or cytoplasmic marker is possible. Furthermore, there are a variety of other quantitative software modules, including assays for protein localization/translocation, cellular proliferation/viability/apoptosis, and angiogenesis, that can be run. This technique will save researchers time and effort and create an automated protocol for slide analysis.
Chatterjee, Anirban; Mirer, Paul L; Zaldivar Santamaria, Elvira; Klapperich, Catherine; Sharon, Andre; Sauer-Budge, Alexis F
2010-06-01
The life science and healthcare communities have been redefining the importance of ribonucleic acid (RNA) through the study of small molecule RNA (in RNAi/siRNA technologies), micro RNA (in cancer research and stem cell research), and mRNA (gene expression analysis for biologic drug targets). Research in this field increasingly requires efficient and high-throughput isolation techniques for RNA. Currently, several commercial kits are available for isolating RNA from cells. Although the quality and quantity of RNA yielded from these kits is sufficiently good for many purposes, limitations exist in terms of extraction efficiency from small cell populations and the ability to automate the extraction process. Traditionally, automating a process decreases the cost and personnel time while simultaneously increasing the throughput and reproducibility. As the RNA field matures, new methods for automating its extraction, especially from low cell numbers and in high throughput, are needed to achieve these improvements. The technology presented in this article is a step toward this goal. The method is based on a solid-phase extraction technology using a porous polymer monolith (PPM). A novel cell lysis approach and a larger binding surface throughout the PPM extraction column ensure a high yield from small starting samples, increasing sensitivity and reducing indirect costs in cell culture and sample storage. The method ensures a fast and simple procedure for RNA isolation from eukaryotic cells, with a high yield both in terms of quality and quantity. The technique is amenable to automation and streamlined workflow integration, with possible miniaturization of the sample handling process making it suitable for high-throughput applications.
Personalized Physical Activity Coaching: A Machine Learning Approach
Dijkhuis, Talko B.; van Ittersum, Miriam W.; Velthuijsen, Hugo
2018-01-01
Living a sedentary lifestyle is one of the major causes of numerous health problems. To encourage employees to lead a less sedentary life, the Hanze University started a health promotion program. One of the interventions in the program was the use of an activity tracker to record participants' daily step count. The daily step count served as input for a fortnightly coaching session. In this paper, we investigate the possibility of automating part of the coaching procedure on physical activity by providing personalized feedback throughout the day on a participant’s progress in achieving a personal step goal. The gathered step count data was used to train eight different machine learning algorithms to make hourly estimations of the probability of achieving a personalized, daily steps threshold. In 80% of the individual cases, the Random Forest algorithm was the best performing algorithm (mean accuracy = 0.93, range = 0.88–0.99, and mean F1-score = 0.90, range = 0.87–0.94). To demonstrate the practical usefulness of these models, we developed a proof-of-concept Web application that provides personalized feedback about whether a participant is expected to reach his or her daily threshold. We argue that the use of machine learning could become an invaluable asset in the process of automated personalized coaching. The individualized algorithms allow for predicting physical activity during the day and provides the possibility to intervene in time. PMID:29463052
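A hedged sketch of the hourly prediction task is shown below: features are the cumulative steps observed up to a given hour, the label is whether the personal daily threshold was eventually reached, and a random forest is trained on simulated data; the feature construction is an assumption, not the study's exact pipeline.

```python
# Hedged sketch of predicting, part-way through the day, whether a personal
# daily step threshold will be reached, using a random forest on simulated data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
days, hours = 200, 14                                   # waking hours tracked per day
rates = rng.gamma(2.0, 350.0, size=(days, hours))       # simulated steps per hour
cumulative = np.cumsum(rates, axis=1)
threshold = 8000
labels = (cumulative[:, -1] >= threshold).astype(int)   # reached the goal by day's end?

hour = 9                                                # predict at a fixed hour of the day
X = np.column_stack([cumulative[:, :hour], np.full(days, hour)])
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("accuracy at hour", hour, "=", clf.score(X_te, y_te))
```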
An efficient multi-resolution GA approach to dental image alignment
NASA Astrophysics Data System (ADS)
Nassar, Diaa Eldin; Ogirala, Mythili; Adjeroh, Donald; Ammar, Hany
2006-02-01
Automating the process of postmortem identification of individuals using dental records is receiving increased attention in forensic science, especially with the large volume of victims encountered in mass disasters. Dental radiograph alignment is a key step required for automating the dental identification process. In this paper, we address the problem of dental radiograph alignment using a Multi-Resolution Genetic Algorithm (MR-GA) approach. We use location and orientation information of edge points as features, assume that affine transformations suffice to restore geometric discrepancies between two images of a tooth, efficiently search the 6D space of affine parameters using a GA progressively across multi-resolution image versions, and use a Hausdorff distance measure to compute the similarity between a reference tooth and a query tooth subject to a possible alignment transform. Testing results based on 52 teeth-pair images suggest that our algorithm converges to reasonable solutions in more than 85% of the test cases, with most of the error in the remaining cases due to excessive misalignments.
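A minimal, single-resolution sketch of the underlying idea (searching affine parameters with a genetic algorithm under a Hausdorff-distance fitness) is given below. It uses toy point sets, a mutation-only GA, and a shrinking mutation scale to mimic a coarse-to-fine schedule; it is an illustration, not the authors' MR-GA implementation.

import numpy as np
from scipy.spatial.distance import directed_hausdorff

rng = np.random.default_rng(1)

def warp(points, p):
    # Apply a 6-parameter affine transform [a, b, c, d, tx, ty] to Nx2 points.
    A = np.array([[p[0], p[1]], [p[2], p[3]]])
    return points @ A.T + p[4:6]

def fitness(ref, query, p):
    # Symmetric Hausdorff distance between reference edge points and the
    # affinely transformed query edge points (lower is better).
    moved = warp(query, p)
    return max(directed_hausdorff(ref, moved)[0], directed_hausdorff(moved, ref)[0])

def ga_align(ref, query, pop_size=60, generations=40, sigma=0.05):
    # Start near the identity transform and refine with a simple GA.
    pop = np.tile([1.0, 0, 0, 1.0, 0, 0], (pop_size, 1)) + rng.normal(0, sigma, (pop_size, 6))
    for _ in range(generations):
        scores = np.array([fitness(ref, query, p) for p in pop])
        parents = pop[np.argsort(scores)[: pop_size // 4]]           # selection
        children = parents[rng.integers(0, len(parents), pop_size)]  # cloning
        children += rng.normal(0, sigma, children.shape)             # mutation
        pop = children
        sigma *= 0.9  # shrink the search range, mimicking coarse-to-fine refinement
    scores = np.array([fitness(ref, query, p) for p in pop])
    return pop[scores.argmin()], scores.min()

ref = rng.random((80, 2))                                   # reference edge points
true_p = np.array([1.05, -0.05, 0.05, 0.97, -0.09, 0.05])   # transform used to create the query
query = warp(ref, true_p)
best, err = ga_align(ref, query)   # recovers (approximately) the inverse transform
print("best affine parameters", best, "Hausdorff error", err)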
Comparability of automated human induced pluripotent stem cell culture: a pilot study.
Archibald, Peter R T; Chandra, Amit; Thomas, Dave; Chose, Olivier; Massouridès, Emmanuelle; Laâbi, Yacine; Williams, David J
2016-12-01
Consistent and robust manufacturing is essential for the translation of cell therapies, and the utilisation of automation throughout the manufacturing process may allow for improvements in quality control, scalability, reproducibility and economics of the process. The aim of this study was to measure and establish the comparability between alternative process steps for the culture of hiPSCs. Consequently, the effects of manual centrifugation and automated non-centrifugation process steps, performed using TAP Biosystems' CompacT SelecT automated cell culture platform, upon the culture of a human induced pluripotent stem cell (hiPSC) line (VAX001024c07) were compared. This study has demonstrated that comparable morphologies and cell diameters were observed in hiPSCs cultured using either manual or automated process steps. However, non-centrifugation hiPSC populations exhibited greater cell yields, greater aggregate rates, increased pluripotency marker expression, and decreased differentiation marker expression compared to centrifugation hiPSCs. A trend for decreased variability in cell yield was also observed after the utilisation of the automated process step. This study also highlights the detrimental effect of the cryopreservation and thawing processes upon the growth and characteristics of hiPSC cultures, and demonstrates that automated hiPSC manufacturing protocols can be successfully transferred between independent laboratories.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Katipamula, Srinivas; Gowri, Krishnan; Hernandez, George
This paper describes one such reference process that can be deployed to provide continuous automated condition-based maintenance management for buildings that have BIM, a building automation system (BAS) and a computerized maintenance management software (CMMS) system. The process can be deployed using an open source transactional network platform, VOLTTRON™, designed for distributed sensing and controls, which supports both energy efficiency and grid services.
Improved Feature Matching for Mobile Devices with IMU.
Masiero, Andrea; Vettore, Antonio
2016-08-05
Thanks to the recent diffusion of low-cost high-resolution digital cameras and to the development of mostly automated procedures for image-based 3D reconstruction, the popularity of photogrammetry for environment surveys has been constantly increasing in recent years. Automatic feature matching is an important step in order to successfully complete the photogrammetric 3D reconstruction: this step is the fundamental basis for the subsequent estimation of the geometry of the scene. This paper reconsiders the feature matching problem when dealing with smart mobile devices (e.g., when using the standard camera embedded in a smartphone as imaging sensor). More specifically, this paper aims at exploiting the information on camera movements provided by the inertial navigation system (INS) in order to make the feature matching step more robust and, possibly, computationally more efficient. First, a revised version of the affine scale-invariant feature transform (ASIFT) is considered: this version reduces the computational complexity of the original ASIFT, while still ensuring an increase of correct feature matches with respect to the SIFT. Furthermore, a new two-step procedure for the estimation of the essential matrix E (and the camera pose) is proposed in order to increase its estimation robustness and computational efficiency.
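One simple way to exploit INS/IMU rotation information during matching, sketched below, is to map features through the rotation-induced homography and discard putative matches that land far from the prediction. This is an illustrative simplification valid only for (nearly) pure camera rotation or distant scenes; it is not the revised ASIFT or the two-step essential-matrix procedure proposed in the paper, and the intrinsic matrix and tolerance are assumptions.

import numpy as np

def rotation_homography(K, R):
    # Homography induced by a pure camera rotation R (the "infinite homography").
    return K @ R @ np.linalg.inv(K)

def filter_matches(pts1, pts2, K, R_imu, tol_px=30.0):
    # Keep putative matches whose second point lies near the location predicted
    # by the IMU-provided rotation; pts1 and pts2 are Nx2 pixel coordinates.
    H = rotation_homography(K, R_imu)
    ones = np.ones((len(pts1), 1))
    proj = np.hstack([pts1, ones]) @ H.T
    proj = proj[:, :2] / proj[:, 2:3]
    return np.linalg.norm(proj - pts2, axis=1) < tol_px

# Toy usage with an assumed pinhole camera and a small yaw reported by the IMU.
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
yaw = np.deg2rad(2.0)
R = np.array([[np.cos(yaw), 0, np.sin(yaw)], [0, 1, 0], [-np.sin(yaw), 0, np.cos(yaw)]])
pts1 = np.array([[100.0, 120], [300, 200], [500, 400]])
pts2 = np.hstack([pts1, np.ones((3, 1))]) @ rotation_homography(K, R).T
pts2 = pts2[:, :2] / pts2[:, 2:3] + np.array([1.0, -2.0])   # simulated matches with small noise
print(filter_matches(pts1, pts2, K, R))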
Gao, Yali; Lam, Albert W Y; Chan, Warren C W
2013-04-24
The impact of detecting multiple infectious diseases simultaneously at point-of-care with good sensitivity, specificity, and reproducibility would be enormous for containing the spread of diseases in both resource-limited and rich countries. Many barcoding technologies have been introduced for addressing this need as barcodes can be applied to detecting thousands of genetic and protein biomarkers simultaneously. However, the assay process is not automated and is tedious and requires skilled technicians. Barcoding technology is currently limited to use in resource-rich settings. Here we used magnetism and microfluidics technology to automate the multiple steps in a quantum dot barcode assay. The quantum dot-barcoded microbeads are sequentially (a) introduced into the chip, (b) magnetically moved to a stream containing target molecules, (c) moved back to the original stream containing secondary probes, (d) washed, and (e) finally aligned for detection. The assay requires 20 min, has a limit of detection of 1.2 nM, and can detect genetic targets for HIV, hepatitis B, and syphilis. This study provides a simple strategy to automate the entire barcode assay process and moves barcoding technologies one step closer to point-of-care applications.
Li, Yu; Zhang, Xiangru; Yang, Mengting; Liu, Jiaqi; Li, Wanxin; Graham, Nigel J D; Li, Xiaoyan; Yang, Bo
2017-02-01
Chlorination is extensively applied for disinfecting sewage effluents, but it unintentionally generates disinfection byproducts (DBPs). Using seawater for toilet flushing introduces a high level of bromide into domestic sewage. Chlorination of sewage effluent rich in bromide causes the formation of brominated DBPs. Achieving a disinfection goal, reducing disinfectant consumption and operational costs, and diminishing adverse effects to aquatic organisms in the receiving water body remain a challenge in sewage treatment. In this study, we have demonstrated that, with the same total chlorine dosage, a three-step chlorination (dosing chlorine by splitting it into three equal portions with a 5-min time interval for each portion) was significantly more efficient in disinfecting a primary saline sewage effluent than a one-step chlorination (dosing chlorine at one time). Compared to one-step chlorination, three-step chlorination enhanced the disinfection efficiency by up to 0.73-log reduction of Escherichia coli. The overall DBP formation resulting from one-step and three-step chlorination was quantified by total organic halogen measurement. Compared to one-step chlorination, the DBP formation in three-step chlorination was decreased by up to 23.4%. The comparative toxicity of one-step and three-step chlorination was evaluated in terms of the development of embryo-larva of a marine polychaete Platynereis dumerilii. The results revealed that the primary sewage effluent with three-step chlorination was less toxic than that with one-step chlorination, indicating that three-step chlorination could reduce the potential adverse effects of disinfected sewage effluents to aquatic organisms in the receiving marine water. Copyright © 2016 Elsevier Ltd. All rights reserved.
MoCha: Molecular Characterization of Unknown Pathways.
Lobo, Daniel; Hammelman, Jennifer; Levin, Michael
2016-04-01
Automated methods for the reverse-engineering of complex regulatory networks are paving the way for the inference of mechanistic comprehensive models directly from experimental data. These novel methods can infer not only the relations and parameters of the known molecules defined in their input datasets, but also unknown components and pathways identified as necessary by the automated algorithms. Identifying the molecular nature of these unknown components is a crucial step for making testable predictions and experimentally validating the models, yet no specific and efficient tools exist to aid in this process. To this end, we present here MoCha (Molecular Characterization), a tool optimized for the search of unknown proteins and their pathways from a given set of known interacting proteins. MoCha uses the comprehensive dataset of protein-protein interactions provided by the STRING database, which currently includes more than a billion interactions from over 2,000 organisms. MoCha is highly optimized, performing typical searches within seconds. We demonstrate the use of MoCha with the characterization of unknown components from reverse-engineered models from the literature. MoCha is useful for working on network models by hand or as a downstream step of a model inference engine workflow and represents a valuable and efficient tool for the characterization of unknown pathways using known data from thousands of organisms. MoCha and its source code are freely available online under the GPLv3 license.
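The core idea, ranking candidate proteins by how strongly they interact with a given set of known proteins, can be illustrated with the toy Python sketch below. The interaction table and scoring are placeholders; this does not reproduce MoCha's code or the STRING data format.

from collections import defaultdict

# Toy interaction table: (protein_a, protein_b, combined_score), standing in for
# the protein-protein interaction data that MoCha obtains from STRING.
interactions = [
    ("P1", "X", 0.9), ("P2", "X", 0.8), ("P3", "X", 0.7),
    ("P1", "Y", 0.6), ("P2", "Z", 0.4), ("P3", "W", 0.2),
]

def rank_candidates(known, interactions):
    # Score every outside protein by the summed interaction score it shares with
    # the known set, i.e. candidates for the "unknown" component that the
    # reverse-engineered model says must interact with those proteins.
    scores = defaultdict(float)
    for a, b, s in interactions:
        if a in known and b not in known:
            scores[b] += s
        elif b in known and a not in known:
            scores[a] += s
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

print(rank_candidates({"P1", "P2", "P3"}, interactions))   # "X" ranks first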
Liu, Zhiyi; Zhang, Luyao; Liu, Yanling; Gu, Yang; Sun, Tieliang
2017-11-01
We aimed to evaluate the efficiency and safety of a one-step procedure combining endoscopic retrograde cholangiopancreatography (ERCP) and laparoscopic cholecystectomy (LC) for the treatment of patients with cholecysto-choledocholithiasis. A prospective randomized study was performed on 63 consecutive cholecysto-choledocholithiasis patients during 2008 and 2011. The efficiency and safety of the one-step procedure were assessed by comparison with the two-step LC with ERCP + endoscopic sphincterotomy (EST). Outcomes including intraoperative features and postoperative features (length of stay and postoperative complications) were evaluated. The one- or two-step procedure of LC with ERCP + EST was successfully performed in all patients, and common bile duct stones were completely removed. Statistical analyses showed that length of stay and pulmonary infection rate were significantly lower in the test group than in the control group (P < 0.05), whereas no statistical difference in other outcomes was found between the two groups (all P > 0.05). The one-step procedure of LC with ERCP + EST is superior to the two-step procedure for the treatment of patients with cholecysto-choledocholithiasis with regard to reduced hospital stay and a lower occurrence of pulmonary infections, and may therefore be the preferable option for these patients.
Röhnisch, Hanna E; Eriksson, Jan; Müllner, Elisabeth; Agback, Peter; Sandström, Corine; Moazzami, Ali A
2018-02-06
A key limiting step for high-throughput NMR-based metabolomics is the lack of rapid and accurate tools for absolute quantification of many metabolites. We developed, implemented, and evaluated an algorithm, AQuA (Automated Quantification Algorithm), for targeted metabolite quantification from complex ¹H NMR spectra. AQuA operates based on spectral data extracted from a library consisting of one standard calibration spectrum for each metabolite. It uses one preselected NMR signal per metabolite for determining absolute concentrations and does so by effectively accounting for interferences caused by other metabolites. AQuA was implemented and evaluated using experimental NMR spectra from human plasma. The accuracy of AQuA was tested and confirmed in comparison with a manual spectral fitting approach using the ChenomX software, in which 61 out of 67 metabolites quantified in 30 human plasma spectra showed a goodness-of-fit (r²) close to or exceeding 0.9 between the two approaches. In addition, three quality indicators generated by AQuA, namely, occurrence, interference, and positional deviation, were studied. These quality indicators permit evaluation of the results each time the algorithm is operated. The efficiency was tested and confirmed by implementing AQuA for quantification of 67 metabolites in a large data set comprising 1342 experimental spectra from human plasma, in which the whole computation took less than 1 s.
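Conceptually, quantifying each metabolite from one preselected signal while accounting for interferences from other metabolites amounts to solving a small linear system built from the calibration library. The sketch below illustrates this with a synthetic three-metabolite example; it is not the AQuA implementation, and the numbers are invented.

import numpy as np

# Toy library matrix: row i = the preselected quantification position of metabolite i,
# column j = signal contributed there by metabolite j per unit concentration.
# In AQuA such information is derived from one calibration spectrum per metabolite.
A = np.array([
    [1.00, 0.05, 0.00],   # metabolite 1's position, slight interference from metabolite 2
    [0.02, 1.00, 0.10],   # metabolite 2's position
    [0.00, 0.08, 1.00],   # metabolite 3's position
])

true_conc = np.array([0.5, 2.0, 1.2])   # synthetic "sample" concentrations
observed = A @ true_conc                # intensities read at the preselected positions

# Deconvolve the interferences with a single linear solve (a more careful version
# could enforce non-negativity, e.g. with scipy.optimize.nnls).
estimated = np.linalg.solve(A, observed)
print(estimated)   # recovers [0.5, 2.0, 1.2]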
NASA Astrophysics Data System (ADS)
Goma, Sergio R.
2015-03-01
Mobile technologies are now ubiquitous and the complexity of problems is continuously increasing. In the context of the advancement of engineering, this paper explores possible reasons for a saturation in technology evolution, namely the ability to solve problems based on previous results and the ability to express solutions in a more efficient way. We conclude that 'thinking outside of the brain', as in solving engineering problems that are expressed in virtual media due to their complexity, would benefit from mobile technology augmentation. This could be the necessary evolutionary step that provides the efficiency required to solve new complex problems (addressing the 'running out of time' issue) and removes the barrier to communicating results (addressing the human 'perception/expression imbalance' issue). Some consequences are discussed; in this context, artificial intelligence becomes an automation aid instead of a necessary next evolutionary step. The paper concludes that research in modeling as a problem-solving aid and in data visualization as a perception aid, augmented with mobile technologies, could be the path to an evolutionary step in advancing engineering.
Van Heirstraeten, Liesbet; Spang, Peter; Schwind, Carmen; Drese, Klaus S; Ritzi-Lehnert, Marion; Nieto, Benjamin; Camps, Marta; Landgraf, Bryan; Guasch, Francesc; Corbera, Antoni Homs; Samitier, Josep; Goossens, Herman; Malhotra-Kumar, Surbhi; Roeser, Tina
2014-05-07
In this paper, we describe the development of an automated sample preparation procedure for etiological agents of community-acquired lower respiratory tract infections (CA-LRTI). The consecutive assay steps, including sample re-suspension, pre-treatment, lysis, nucleic acid purification, and concentration, were integrated into a microfluidic lab-on-a-chip (LOC) cassette that is operated hands-free by a demonstrator setup, providing fluidic and valve actuation. The performance of the assay was evaluated on viral and Gram-positive and Gram-negative bacterial broth cultures previously sampled using a nasopharyngeal swab. Sample preparation on the microfluidic cassette resulted in higher or similar concentrations of pure bacterial DNA or viral RNA compared to manual benchtop experiments. The miniaturization and integration of the complete sample preparation procedure, to extract purified nucleic acids from real samples of CA-LRTI pathogens at, and above, lab quality and efficiency, represent important steps towards its application in a point-of-care test (POCT) for rapid diagnosis of CA-LRTI.
One-step manufacturing of innovative flat-knitted 3D net-shape preforms for composite applications
NASA Astrophysics Data System (ADS)
Bollengier, Quentin; Wieczorek, Florian; Hellmann, Sven; Trümper, Wolfgang; Cherif, Chokri
2017-10-01
Mostly due to the cost-intensive manually performed processing operations, the production of complex-shaped fibre reinforced plastic composites (FRPC) is currently very expensive and therefore either restricted to sectors with high added value or to small-batch applications (e.g. in the aerospace or automotive industry). Previous works suggest that the successful integration of conventional textile manufacturing processes in the FRPC-process chain is the key to a cost-efficient manufacturing of complex three-dimensional (3D) FRPC-components with stress-oriented fibre arrangement. Therefore, this work focuses on the development of the multilayer weft knitting technology for the one-step manufacturing of complex 3D net-shaped preforms for high performance FRPC applications. In order to highlight the advantages of net-shaped multilayer weft knitted fabrics for the production of complex FRPC parts, seamless preforms such as 3D skin-stringer structures and tubular fabrics with load oriented fibre arrangement are realised. In this paper, the development of the textile bindings and the technical modifications performed on flat knitting machines are presented. The results show that the multilayer weft knitting technology perfectly meets the requirements for a fully automated and reproducible manufacturing of complex 3D textile preforms with stress-oriented fibre arrangement.
Schmitt, Kara
2012-01-01
Nuclear power is one of the ways that we can design an efficient sustainable future. Automation is the primary system used to assist operators in the task of monitoring and controlling nuclear power plants (NPP). Automation performs tasks such as assessing the status of the plant's operations as well as making real time life critical situational specific decisions. While the advantages and disadvantages of automation are well studied in a variety of domains, accidents remind us that there is still vulnerability to unknown variables. This paper will look at the effects of automation within three NPP accidents and incidents and will consider why automation failed in preventing these accidents from occurring. It will also review the accidents at the Three Mile Island, Chernobyl, and Fukushima Daiichi NPPs in order to determine where better use of automation could have resulted in a more desirable outcome.
Automated tilt series alignment and tomographic reconstruction in IMOD.
Mastronarde, David N; Held, Susannah R
2017-02-01
Automated tomographic reconstruction is now possible in the IMOD software package, including the merging of tomograms taken around two orthogonal axes. Several developments enable the production of high-quality tomograms. When using fiducial markers for alignment, the markers to be tracked through the series are chosen automatically; if there is an excess of markers available, a well-distributed subset is selected that is most likely to track well. Marker positions are refined by applying an edge-enhancing Sobel filter, which results in a 20% improvement in alignment error for plastic-embedded samples and 10% for frozen-hydrated samples. Robust fitting, in which outlying points are given less or no weight in computing the fitting error, is used to obtain an alignment solution, so that aberrant points from the automated tracking can have little effect on the alignment. When merging two dual-axis tomograms, the alignment between them is refined from correlations between local patches; a measure of structure was developed so that patches with insufficient structure to give accurate correlations can now be excluded automatically. We have also developed a script for running all steps in the reconstruction process with a flexible mechanism for setting parameters, and we have added a user interface for batch processing of tilt series to the Etomo program in IMOD. Batch processing is fully compatible with interactive processing and can increase efficiency even when the automation is not fully successful, because users can focus their effort on the steps that require manual intervention. Copyright © 2016 Elsevier Inc. All rights reserved.
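Robust fitting of this kind, where outlying points are progressively down-weighted or excluded, can be illustrated generically with iteratively reweighted least squares using a Tukey biweight, as in the Python sketch below. This is not IMOD's code; the model, constants, and data are illustrative only.

import numpy as np

def tukey_weights(residuals, c=4.685):
    # Tukey biweight: points with |residual| above c * (robust scale) get zero weight.
    scale = 1.4826 * np.median(np.abs(residuals)) + 1e-12   # MAD-based scale estimate
    u = residuals / (c * scale)
    w = (1 - u**2) ** 2
    w[np.abs(u) >= 1] = 0.0
    return w

def robust_fit(X, y, iters=10):
    # Iteratively reweighted least squares for y ~ X @ beta.
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(iters):
        w = tukey_weights(y - X @ beta)
        sw = np.sqrt(w)
        beta = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)[0]
    return beta

rng = np.random.default_rng(2)
x = np.linspace(0, 10, 50)
X = np.column_stack([x, np.ones_like(x)])
y = 2.0 * x + 1.0 + rng.normal(0, 0.1, x.size)
y[::10] += 15.0                  # a few aberrant tracked positions (outliers)
print(robust_fit(X, y))          # close to [2.0, 1.0] despite the outliers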
Spaceport Command and Control System Automated Verification Software Development
NASA Technical Reports Server (NTRS)
Backus, Michael W.
2017-01-01
For as long as we have walked the Earth, humans have always been explorers. We have visited our nearest celestial body and sent Voyager 1 beyond our solar system and out into interstellar space. Now it is finally time for us to step beyond our home and onto another planet. The Spaceport Command and Control System (SCCS) is being developed along with the Space Launch System (SLS) to take us on a journey further than ever attempted. Within SCCS are separate subsystems and system level software, each of which has to be tested and verified. Testing is a long and tedious process, so automating it will be much more efficient and also helps to remove the possibility of human error from mission operations. I was part of a team of interns and full-time engineers who automated tests for the requirements on SCCS, and with that was able to help verify that the software systems are performing as expected.
Yang, Jianji J; Cohen, Aaron M; McDonagh, Marian S
2008-11-06
Automatic document classification can be valuable in increasing the efficiency in updating systematic reviews (SR). In order for the machine learning process to work well, it is critical to create and maintain high-quality training datasets consisting of expert SR inclusion/exclusion decisions. This task can be laborious, especially when the number of topics is large and source data format is inconsistent. To approach this problem, we build an automated system to streamline the required steps, from initial notification of update in source annotation files to loading the data warehouse, along with a web interface to monitor the status of each topic. In our current collection of 26 SR topics, we were able to standardize almost all of the relevance judgments and recovered PMIDs for over 80% of all articles. Of those PMIDs, over 99% were correct in a manual random sample study. Our system performs an essential function in creating training and evaluation data sets for SR text mining research.
Yang, Jianji J.; Cohen, Aaron M.; McDonagh, Marian S.
2008-01-01
Automatic document classification can be valuable in increasing the efficiency in updating systematic reviews (SR). In order for the machine learning process to work well, it is critical to create and maintain high-quality training datasets consisting of expert SR inclusion/exclusion decisions. This task can be laborious, especially when the number of topics is large and source data format is inconsistent. To approach this problem, we build an automated system to streamline the required steps, from initial notification of update in source annotation files to loading the data warehouse, along with a web interface to monitor the status of each topic. In our current collection of 26 SR topics, we were able to standardize almost all of the relevance judgments and recovered PMIDs for over 80% of all articles. Of those PMIDs, over 99% were correct in a manual random sample study. Our system performs an essential function in creating training and evaluation datasets for SR text mining research. PMID:18999194
Peng, Chen; Frommlet, Alexandra; Perez, Manuel; Cobas, Carlos; Blechschmidt, Anke; Dominguez, Santiago; Lingel, Andreas
2016-04-14
NMR binding assays are routinely applied in hit finding and validation during early stages of drug discovery, particularly for fragment-based lead generation. To this end, compound libraries are screened by ligand-observed NMR experiments such as STD, T1ρ, and CPMG to identify molecules interacting with a target. The analysis of a high number of complex spectra is performed largely manually and therefore represents a limiting step in hit generation campaigns. Here we report a novel integrated computational procedure that processes and analyzes ligand-observed proton and fluorine NMR binding data in a fully automated fashion. A performance evaluation comparing automated and manual analysis results on (19)F- and (1)H-detected data sets shows that the program delivers robust, high-confidence hit lists in a fraction of the time needed for manual analysis and greatly facilitates visual inspection of the associated NMR spectra. These features enable considerably higher throughput, the assessment of larger libraries, and shorter turn-around times.
Hirai, Hiroki; Nakajima, Kiichi; Nakatsuka, Soichiro; Shiren, Kazushi; Ni, Jingping; Nomura, Shintaro; Ikuta, Toshiaki; Hatakeyama, Takuji
2015-11-09
The development of a one-step borylation of 1,3-diaryloxybenzenes, yielding novel boron-containing polycyclic aromatic compounds, is reported. The resulting boron-containing compounds possess high singlet-triplet excitation energies as a result of localized frontier molecular orbitals induced by boron and oxygen. Using these compounds as a host material, we successfully prepared phosphorescent organic light-emitting diodes exhibiting high efficiency and adequate lifetimes. Moreover, using the present one-step borylation, we succeeded in the synthesis of an efficient, thermally activated delayed fluorescence emitter and boron-fused benzo[6]helicene. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Multi-Mission Automated Task Invocation Subsystem
NASA Technical Reports Server (NTRS)
Cheng, Cecilia S.; Patel, Rajesh R.; Sayfi, Elias M.; Lee, Hyun H.
2009-01-01
Multi-Mission Automated Task Invocation Subsystem (MATIS) is software that establishes a distributed data-processing framework for automated generation of instrument data products from a spacecraft mission. Each mission may set up a set of MATIS servers for processing its data products. MATIS embodies lessons learned in experience with prior instrument- data-product-generation software. MATIS is an event-driven workflow manager that interprets project-specific, user-defined rules for managing processes. It executes programs in response to specific events under specific conditions according to the rules. Because requirements of different missions are too diverse to be satisfied by one program, MATIS accommodates plug-in programs. MATIS is flexible in that users can control such processing parameters as how many pipelines to run and on which computing machines to run them. MATIS has a fail-safe capability. At each step, MATIS captures and retains pertinent information needed to complete the step and start the next step. In the event of a restart, this information is retrieved so that processing can be resumed appropriately. At this writing, it is planned to develop a graphical user interface (GUI) for monitoring and controlling a product generation engine in MATIS. The GUI would enable users to schedule multiple processes and manage the data products produced in the processes. Although MATIS was initially designed for instrument data product generation,
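The event-driven, rule-based pattern described above (run a plug-in program when a specific event occurs under specific conditions, while journaling completed steps so processing can resume after a restart) can be sketched in Python as follows. The class and rule names are hypothetical; this is not the MATIS source.

from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class Rule:
    event: str                          # event name the rule listens for
    condition: Callable[[dict], bool]   # predicate over the event payload
    action: Callable[[dict], None]      # "plug-in program" to execute

@dataclass
class WorkflowManager:
    rules: List[Rule] = field(default_factory=list)
    journal: List[Dict] = field(default_factory=list)   # fail-safe record of steps

    def register(self, rule: Rule) -> None:
        self.rules.append(rule)

    def emit(self, event: str, payload: dict) -> None:
        self.journal.append({"event": event, **payload})  # persist before acting
        for rule in self.rules:
            if rule.event == event and rule.condition(payload):
                rule.action(payload)

wf = WorkflowManager()
wf.register(Rule("file_arrived",
                 condition=lambda p: p["instrument"] == "camera_a",
                 action=lambda p: print("launch level-1 pipeline on", p["path"])))
wf.emit("file_arrived", {"instrument": "camera_a", "path": "/data/raw/obs001.dat"})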
The laboratory of the 1990s—Planning for total automation
Brunner, Linda A.
1992-01-01
The analytical laboratory of the 1990s must be able to meet and accommodate the rapid evolution of modern-day technology. One such area is laboratory automation. Total automation may be seen as the coupling of computerized sample tracking, electronic documentation and data reduction with automated sample handling, preparation and analysis, resulting in a complete analytical procedure with minimal human involvement. Requirements may vary from one laboratory or facility to another, so the automation has to be flexible enough to cover a wide range of applications, and yet fit into specific niches depending on individual needs. Total automation must be planned for, well in advance, if the endeavour is to be a success. Space, laboratory layout, proper equipment, and the availability and access to necessary utilities must be taken into account. Adequate training and experience of the personnel working with the technology must also be ensured. In addition, responsibilities of installation, programming maintenance and operation have to be addressed. Proper time management and the efficient implementation and use of total automation are also crucial to successful operations. This paper provides insights into laboratory organization and requirements, as well as discussing the management issues that must be faced when automating laboratory procedures. PMID:18924925
FISH-in-CHIPS: A Microfluidic Platform for Molecular Typing of Cancer Cells.
Perez-Toralla, Karla; Mottet, Guillaume; Tulukcuoglu-Guneri, Ezgi; Champ, Jérôme; Bidard, François-Clément; Pierga, Jean-Yves; Klijanienko, Jerzy; Draskovic, Irena; Malaquin, Laurent; Viovy, Jean-Louis; Descroix, Stéphanie
2017-01-01
Microfluidics offer powerful tools for the control, manipulation, and analysis of cells, in particular for the assessment of cell malignancy or the study of cell subpopulations. However, implementing complex biological protocols on chip remains a challenge. Sample preparation is often performed off chip using multiple manually performed steps, and protocols usually include different dehydration and drying steps that are not always compatible with a microfluidic format. Here, we report the implementation of a Fluorescence in situ Hybridization (FISH) protocol for the molecular typing of cancer cells in a simple and low-cost device. The geometry of the chip allows integrating the sample preparation steps to efficiently assess the genomic content of individual cells using a minute amount of sample. The FISH protocol can be fully automated, thus enabling its use in routine clinical practice.
Mock, Ulrike; Nickolay, Lauren; Philip, Brian; Cheung, Gordon Weng-Kit; Zhan, Hong; Johnston, Ian C D; Kaiser, Andrew D; Peggs, Karl; Pule, Martin; Thrasher, Adrian J; Qasim, Waseem
2016-08-01
Novel cell therapies derived from human T lymphocytes are exhibiting enormous potential in early-phase clinical trials in patients with hematologic malignancies. Ex vivo modification of T cells is currently limited to a small number of centers with the required infrastructure and expertise. The process requires isolation, activation, transduction, expansion and cryopreservation steps. To simplify procedures and widen applicability for clinical therapies, automation of these procedures is being developed. The CliniMACS Prodigy (Miltenyi Biotec) has recently been adapted for lentiviral transduction of T cells and here we analyse the feasibility of a clinically compliant T-cell engineering process for the manufacture of T cells encoding chimeric antigen receptors (CAR) for CD19 (CAR19), a widely targeted antigen in B-cell malignancies. Using a closed, single-use tubing set we processed mononuclear cells from fresh or frozen leukapheresis harvests collected from healthy volunteer donors. Cells were phenotyped and subjected to automated processing and activation using TransAct, a polymeric nanomatrix activation reagent incorporating CD3/CD28-specific antibodies. Cells were then transduced and expanded in the CentriCult-Unit of the tubing set, under stabilized culture conditions with automated feeding and media exchange. The process was continuously monitored to determine kinetics of expansion, transduction efficiency and phenotype of the engineered cells in comparison with small-scale transductions run in parallel. We found that transduction efficiencies, phenotype and function of CAR19 T cells were comparable with existing procedures and overall T-cell yields sufficient for anticipated therapeutic dosing. The automation of closed-system T-cell engineering should improve dissemination of emerging immunotherapies and greatly widen applicability. Copyright © 2016. Published by Elsevier Inc.
Sun, Yung-Shin; Zhu, Xiangdong
2016-10-01
Microarrays provide a platform for high-throughput characterization of biomolecular interactions. To increase the sensitivity and specificity of microarrays, surface blocking is required to minimize the nonspecific interactions between analytes and unprinted yet functionalized surfaces. To block amine- or epoxy-functionalized substrates, bovine serum albumin (BSA) is one of the most commonly used blocking reagents because it is cheap and easy to use. Based on standard protocols from microarray manufacturers, a BSA concentration of 1% (10 mg/mL or 200 μM) and reaction time of at least 30 min are required to efficiently block epoxy-coated slides. In this paper, we used both fluorescent and label-free methods to characterize the BSA blocking efficiency on epoxy-functionalized substrates. The blocking efficiency of BSA was characterized using a fluorescent scanner and a label-free oblique-incidence reflectivity difference (OI-RD) microscope. We found that (1) a BSA concentration of 0.05% (0.5 mg/mL or 10 μM) could give a blocking efficiency of 98%, and (2) the BSA blocking step took only about 5 min to be complete. Also, from real-time and in situ measurements, we were able to calculate the conformational properties (thickness, mass density, and number density) of BSA molecules deposited on the epoxy surface. © 2015 Society for Laboratory Automation and Screening.
Design on wireless auto-measurement system for lead rail straightness measurement based on PSD
NASA Astrophysics Data System (ADS)
Yan, Xiugang; Zhang, Shuqin; Dong, Dengfeng; Cheng, Zhi; Wu, Guanghua; Wang, Jie; Zhou, Weihu
2016-10-01
Straightness detection is not only one of the key technologies for the product quality and installation accuracy of all types of lead rail, but also an important dimensional measurement technology. The straightness measuring devices now available suffer from a low level of automation, limitations imposed by the measuring environment, and low measurement efficiency. In this paper, a wireless measurement system for straightness detection based on a position sensitive detector (PSD) is proposed. The system offers a high level of automation, convenience, high measurement efficiency, and ease of porting and expansion, and can detect the straightness of a lead rail in real time.
Automation of On-Board Flightpath Management
NASA Technical Reports Server (NTRS)
Erzberger, H.
1981-01-01
The status of concepts and techniques for the design of onboard flight path management systems is reviewed. Such systems are designed to increase flight efficiency and safety by automating the optimization of flight procedures onboard aircraft. After a brief review of the origins and functions of such systems, two complementary methods are described for attacking the key design problem, namely, the synthesis of efficient trajectories. One method optimizes en route, the other optimizes terminal area flight; both methods are rooted in optimal control theory. Simulation and flight test results are reviewed to illustrate the potential of these systems for fuel and cost savings.
A multi-step system for screening and localization of hard exudates in retinal images
NASA Astrophysics Data System (ADS)
Bopardikar, Ajit S.; Bhola, Vishal; Raghavendra, B. S.; Narayanan, Rangavittal
2012-03-01
The number of people being affected by Diabetes mellitus worldwide is increasing at an alarming rate. Monitoring of the diabetic condition and its effects on the human body is therefore of great importance. Of particular interest is diabetic retinopathy (DR) which is a result of prolonged, unchecked diabetes and affects the visual system. DR is a leading cause of blindness throughout the world. At any point of time 25 - 44% of people with diabetes are afflicted by DR. Automation of the screening and monitoring process for DR is therefore essential for efficient utilization of healthcare resources and optimizing treatment of the affected individuals. Such automation would use retinal images and detect the presence of specific artifacts such as hard exudates, hemorrhages and soft exudates (that may appear in the image) to gauge the severity of DR. In this paper, we focus on the detection of hard exudates. We propose a two-step system that consists of a screening step that classifies retinal images as normal or abnormal based on the presence of hard exudates and a detection stage that localizes these artifacts in an abnormal retinal image. The proposed screening step automatically detects the presence of hard exudates with a high sensitivity and positive predictive value (PPV). The detection/localization step uses a k-means based clustering approach to localize hard exudates in the retinal image. Suitable feature vectors are chosen based on their ability to isolate hard exudates while minimizing false detections. The algorithm was tested on a benchmark dataset (DIARETDB1) and was seen to provide a superior performance compared to existing methods. The two-step process described in this paper can be embedded in a tele-ophthalmology system to aid with speedy detection and diagnosis of the severity of DR.
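A minimal sketch of the localization idea, clustering per-pixel feature vectors with k-means and keeping the brightest cluster as the hard-exudate candidate mask, is shown below on a synthetic patch. The features and cluster count are assumptions for illustration, not the paper's exact feature vectors.

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)

# Synthetic "green channel" patch: dim background plus a few bright blobs that
# stand in for hard exudates.
img = rng.normal(0.3, 0.05, (64, 64))
img[20:24, 30:34] += 0.5
img[45:48, 10:13] += 0.6
img = np.clip(img, 0, 1)

# Feature vector per pixel: intensity and a crude local-contrast measure.
padded = np.pad(img, 1, mode="edge")
local_mean = sum(padded[i:i + 64, j:j + 64] for i in range(3) for j in range(3)) / 9.0
features = np.column_stack([img.ravel(), (img - local_mean).ravel()])

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)
# Take the cluster with the highest mean intensity as the exudate candidate mask.
bright = max(range(3), key=lambda k: img.ravel()[labels == k].mean())
mask = (labels == bright).reshape(img.shape)
print("candidate exudate pixels:", int(mask.sum()))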
Automated fuel pin loading system
Christiansen, David W.; Brown, William F.; Steffen, Jim M.
1985-01-01
An automated loading system for nuclear reactor fuel elements utilizes a gravity feed conveyor which permits individual fuel pins to roll along a constrained path perpendicular to their respective lengths. The individual lengths of fuel cladding are directed onto movable transports, where they are aligned coaxially with the axes of associated handling equipment at appropriate production stations. Each fuel pin can be reciprocated axially and/or rotated about its axis as required during handling steps. The fuel pins are inserted as a batch prior to welding of end caps by one of two disclosed welding systems.
Automated fuel pin loading system
Christiansen, D.W.; Brown, W.F.; Steffen, J.M.
An automated loading system for nuclear reactor fuel elements utilizes a gravity feed conveyor which permits individual fuel pins to roll along a constrained path perpendicular to their respective lengths. The individual lengths of fuel cladding are directed onto movable transports, where they are aligned coaxially with the axes of associated handling equipment at appropriate production stations. Each fuel pin can be reciprocated axially and/or rotated about its axis as required during handling steps. The fuel pins are inserted as a batch prior to welding of end caps by one of two disclosed welding systems.
Miroshnichenko, Iu V; Umarov, S Z
2012-12-01
One way to increase the effectiveness and safety of patients' medication supply is the use of automated distribution systems, which substantially increase the efficiency and safety of medication supply, achieve significant savings of material and financial resources for medication assistance, and enable systematic improvement of its accessibility and quality.
Multi-step high-throughput conjugation platform for the development of antibody-drug conjugates.
Andris, Sebastian; Wendeler, Michaela; Wang, Xiangyang; Hubbuch, Jürgen
2018-07-20
Antibody-drug conjugates (ADCs) form a rapidly growing class of biopharmaceuticals which attracts a lot of attention throughout the industry due to its high potential for cancer therapy. They combine the specificity of a monoclonal antibody (mAb) and the cell-killing capacity of highly cytotoxic small molecule drugs. Site-specific conjugation approaches involve a multi-step process for covalent linkage of antibody and drug via a linker. Despite the range of parameters that have to be investigated, high-throughput methods are scarcely used so far in ADC development. In this work an automated high-throughput platform for a site-specific multi-step conjugation process on a liquid-handling station is presented by use of a model conjugation system. A high-throughput solid-phase buffer exchange was successfully incorporated for reagent removal by utilization of a batch cation exchange step. To ensure accurate screening of conjugation parameters, an intermediate UV/Vis-based concentration determination was established including feedback to the process. For conjugate characterization, a high-throughput compatible reversed-phase chromatography method with a runtime of 7 min and no sample preparation was developed. Two case studies illustrate the efficient use for mapping the operating space of a conjugation process. Due to the degree of automation and parallelization, the platform is capable of significantly reducing process development efforts and material demands and shorten development timelines for antibody-drug conjugates. Copyright © 2018 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Orimoto, Yuuichi, E-mail: orimoto.yuuichi.888@m.kyushu-u.ac.jp; Aoki, Yuriko; Japan Science and Technology Agency, CREST, 4-1-8 Hon-chou, Kawaguchi, Saitama 332-0012
An automated property optimization method was developed based on the ab initio O(N) elongation (ELG) method and applied to the optimization of nonlinear optical (NLO) properties in DNA as a first test. The ELG method mimics a polymerization reaction on a computer, and the reaction terminal of a starting cluster is attacked by monomers sequentially to elongate the electronic structure of the system by solving in each step a limited space including the terminal (localized molecular orbitals at the terminal) and monomer. The ELG-finite field (ELG-FF) method for calculating (hyper-)polarizabilities was used as the engine program of the optimization method, and it was found to show linear scaling efficiency while maintaining high computational accuracy for a random sequenced DNA model. Furthermore, the self-consistent field convergence was significantly improved by using the ELG-FF method compared with a conventional method, and it can lead to more feasible NLO property values in the FF treatment. The automated optimization method successfully chose an appropriate base pair from four base pairs (A, T, G, and C) for each elongation step according to an evaluation function. From test optimizations for the first order hyper-polarizability (β) in DNA, a substantial difference was observed depending on optimization conditions between "choose-maximum" (choose a base pair giving the maximum β for each step) and "choose-minimum" (choose a base pair giving the minimum β). In contrast, there was an ambiguous difference between these conditions for optimizing the second order hyper-polarizability (γ) because of the small absolute value of γ and the limitation of numerical differential calculations in the FF method. It can be concluded that the ab initio level property optimization method introduced here can be an effective step towards an advanced computer aided material design method as long as the numerical limitation of the FF method is taken into account.
Kinahan, David J; Kearney, Sinéad M; Dimov, Nikolay; Glynn, Macdara T; Ducrée, Jens
2014-07-07
The centrifugal "lab-on-a-disc" concept has proven to have great potential for process integration of bioanalytical assays, in particular where ease-of-use, ruggedness, portability, fast turn-around time and cost efficiency are of paramount importance. Yet, as all liquids residing on the disc are exposed to the same centrifugal field, an inherent challenge of these systems remains the automation of multi-step, multi-liquid sample processing and subsequent detection. In order to orchestrate the underlying bioanalytical protocols, an ample palette of rotationally and externally actuated valving schemes has been developed. While excelling with the level of flow control, externally actuated valves require interaction with peripheral instrumentation, thus compromising the conceptual simplicity of the centrifugal platform. In turn, for rotationally controlled schemes, such as common capillary burst valves, typical manufacturing tolerances tend to limit the number of consecutive laboratory unit operations (LUOs) that can be automated on a single disc. In this paper, a major advancement on recently established dissolvable film (DF) valving is presented; for the very first time, a liquid handling sequence can be controlled in response to completion of preceding liquid transfer event, i.e. completely independent of external stimulus or changes in speed of disc rotation. The basic, event-triggered valve configuration is further adapted to leverage conditional, large-scale process integration. First, we demonstrate a fluidic network on a disc encompassing 10 discrete valving steps including logical relationships such as an AND-conditional as well as serial and parallel flow control. Then we present a disc which is capable of implementing common laboratory unit operations such as metering and selective routing of flows. Finally, as a pilot study, these functions are integrated on a single disc to automate a common, multi-step lab protocol for the extraction of total RNA from mammalian cell homogenate.
Orimoto, Yuuichi; Aoki, Yuriko
2016-07-14
An automated property optimization method was developed based on the ab initio O(N) elongation (ELG) method and applied to the optimization of nonlinear optical (NLO) properties in DNA as a first test. The ELG method mimics a polymerization reaction on a computer, and the reaction terminal of a starting cluster is attacked by monomers sequentially to elongate the electronic structure of the system by solving in each step a limited space including the terminal (localized molecular orbitals at the terminal) and monomer. The ELG-finite field (ELG-FF) method for calculating (hyper-)polarizabilities was used as the engine program of the optimization method, and it was found to show linear scaling efficiency while maintaining high computational accuracy for a random sequenced DNA model. Furthermore, the self-consistent field convergence was significantly improved by using the ELG-FF method compared with a conventional method, and it can lead to more feasible NLO property values in the FF treatment. The automated optimization method successfully chose an appropriate base pair from four base pairs (A, T, G, and C) for each elongation step according to an evaluation function. From test optimizations for the first order hyper-polarizability (β) in DNA, a substantial difference was observed depending on optimization conditions between "choose-maximum" (choose a base pair giving the maximum β for each step) and "choose-minimum" (choose a base pair giving the minimum β). In contrast, there was an ambiguous difference between these conditions for optimizing the second order hyper-polarizability (γ) because of the small absolute value of γ and the limitation of numerical differential calculations in the FF method. It can be concluded that the ab initio level property optimization method introduced here can be an effective step towards an advanced computer aided material design method as long as the numerical limitation of the FF method is taken into account.
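The "choose-maximum"/"choose-minimum" strategy amounts to a greedy per-step selection of the base pair that optimizes an evaluation function. The sketch below demonstrates that loop with a synthetic stand-in for the property evaluation; in the actual method this evaluation is the ab initio ELG-FF calculation, not the toy scoring used here.

import numpy as np

rng = np.random.default_rng(7)
BASES = ["A", "T", "G", "C"]

def evaluate_property(sequence):
    # Stand-in for the ELG-FF (hyper-)polarizability calculation: a synthetic
    # additive score plus a little noise, purely for demonstration.
    score = sum({"A": 1.0, "T": 0.4, "G": 1.6, "C": 0.7}[b] for b in sequence)
    return score + rng.normal(0, 0.05)

def optimize(n_steps=10, mode="max"):
    # At each elongation step, append the base pair that maximizes (or minimizes)
    # the evaluation function, mirroring "choose-maximum" vs "choose-minimum".
    seq = []
    for _ in range(n_steps):
        scores = {b: evaluate_property(seq + [b]) for b in BASES}
        pick = max(scores, key=scores.get) if mode == "max" else min(scores, key=scores.get)
        seq.append(pick)
    return "".join(seq), evaluate_property(seq)

print(optimize(mode="max"))
print(optimize(mode="min"))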
Evaluation of the SeedCounter, A Mobile Application for Grain Phenotyping.
Komyshev, Evgenii; Genaev, Mikhail; Afonnikov, Dmitry
2016-01-01
Grain morphometry in cereals is an important step in selecting new high-yielding plants. Manual assessment of parameters such as the number of grains per ear and grain size is laborious. One solution to this problem is image-based analysis that can be performed using a desktop PC. Furthermore, the effectiveness of analysis performed in the field can be improved through the use of mobile devices. In this paper, we propose a method for the automated evaluation of phenotypic parameters of grains using mobile devices running the Android operating system. The experimental results show that this approach is efficient and sufficiently accurate for the large-scale analysis of phenotypic characteristics in wheat grains. Evaluation of our application under six different lighting conditions and three mobile devices demonstrated that the lighting of the paper has a significant influence on the accuracy of our method, whereas the type of mobile device does not.
Rabal, Obdulia; Link, Wolfgang; Serelde, Beatriz G; Bischoff, James R; Oyarzabal, Julen
2010-04-01
Here we report the development and validation of a complete solution to manage and analyze the data produced by image-based phenotypic screening campaigns of small-molecule libraries. In one step, initial crude images are analyzed for multiple cytological features, statistical analysis is performed and molecules that produce the desired phenotypic profile are identified. A naïve Bayes classifier, integrating chemical and phenotypic spaces, is built and utilized during the process to assess those images initially classified as "fuzzy", providing an automated iterative feedback tuning. Simultaneously, all this information is directly annotated in a relational database containing the chemical data. This novel fully automated method was validated by conducting a re-analysis of results from a high-content screening campaign involving 33 992 molecules used to identify inhibitors of the PI3K/Akt signaling pathway. Ninety-two percent of confirmed hits identified by the conventional multistep analysis method were identified using this integrated one-step system, as well as 40 new hits (14.9% of the total) that were originally false negatives. Ninety-six percent of true negatives were properly recognized too. A web-based access to the database, with customizable data retrieval and visualization tools, facilitates the posterior analysis of annotated cytological features which allows identification of additional phenotypic profiles; thus, further analysis of original crude images is not required.
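The re-assessment of "fuzzy" images can be pictured as a naive Bayes classifier trained on combined chemical and phenotypic descriptors and then applied to the undecided cases. The sketch below uses synthetic descriptors and scikit-learn's GaussianNB; the feature set and labels are invented, and this is not the authors' pipeline.

import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(4)

# Synthetic descriptors: a few chemical features concatenated with cytological
# features measured from the images (both spaces, as in the integrated classifier).
n = 400
chem = rng.normal(size=(n, 3))
pheno = rng.normal(size=(n, 5))
X = np.hstack([chem, pheno])
y = ((pheno[:, 0] + 0.5 * chem[:, 1]) > 0.3).astype(int)   # 1 = desired phenotypic profile

clf = GaussianNB().fit(X[:300], y[:300])

# "Fuzzy" images from the first-pass analysis are re-scored with the classifier.
fuzzy = X[300:]
posterior_hit = clf.predict_proba(fuzzy)[:, 1]
print("re-classified as hits:", int((posterior_hit > 0.5).sum()), "of", len(fuzzy))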
Walking-Beam Solar-Cell Conveyor
NASA Technical Reports Server (NTRS)
Feder, H.; Frasch, W.
1982-01-01
A microprocessor-controlled walking-beam conveyor moves cells between work stations in an automated assembly line. The conveyor has an arm at each work station. In unison, the arms pick up all solar cells and advance them one station; the beam then retracts to be in position for the next step. The microprocessor sets beam stroke, speed, and position.
Non-Uniform Sampling and J-UNIO Automation for Efficient Protein NMR Structure Determination.
Didenko, Tatiana; Proudfoot, Andrew; Dutta, Samit Kumar; Serrano, Pedro; Wüthrich, Kurt
2015-08-24
High-resolution structure determination of small proteins in solution is one of the big assets of NMR spectroscopy in structural biology. Improvements in the efficiency of NMR structure determination by advances in NMR experiments and automation of data handling therefore attract continued interest. Here, non-uniform sampling (NUS) of 3D heteronuclear-resolved [(1)H,(1)H]-NOESY data yielded two- to three-fold savings of instrument time for structure determinations of soluble proteins. With the 152-residue protein NP_372339.1 from Staphylococcus aureus and the 71-residue protein NP_346341.1 from Streptococcus pneumoniae we show that high-quality structures can be obtained with NUS NMR data, which are equally well amenable to robust automated analysis as the corresponding uniformly sampled data. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Beller, Elaine; Clark, Justin; Tsafnat, Guy; Adams, Clive; Diehl, Heinz; Lund, Hans; Ouzzani, Mourad; Thayer, Kristina; Thomas, James; Turner, Tari; Xia, Jun; Robinson, Karen; Glasziou, Paul
2018-05-19
Systematic reviews (SR) are vital to health care, but have become complicated and time-consuming, due to the rapid expansion of evidence to be synthesised. Fortunately, many tasks of systematic reviews have the potential to be automated or may be assisted by automation. Recent advances in natural language processing, text mining and machine learning have produced new algorithms that can accurately mimic human endeavour in systematic review activity, faster and more cheaply. Automation tools need to be able to work together, to exchange data and results. Therefore, we initiated the International Collaboration for the Automation of Systematic Reviews (ICASR), to successfully put all the parts of automation of systematic review production together. The first meeting was held in Vienna in October 2015. We established a set of principles to enable tools to be developed and integrated into toolkits. This paper sets out the principles devised at that meeting, which cover the need for improvement in efficiency of SR tasks, automation across the spectrum of SR tasks, continuous improvement, adherence to high quality standards, flexibility of use and combining components, the need for a collaboration and varied skills, the desire for open source, shared code and evaluation, and a requirement for replicability through rigorous and open evaluation. Automation has a great potential to improve the speed of systematic reviews. Considerable work is already being done on many of the steps involved in a review. The 'Vienna Principles' set out in this paper aim to guide a more coordinated effort which will allow the integration of work by separate teams and build on the experience, code and evaluations done by the many teams working across the globe.
NASA Astrophysics Data System (ADS)
de Oliveira, Helder C. R.; Mencattini, Arianna; Casti, Paola; Martinelli, Eugenio; di Natale, Corrado; Catani, Juliana H.; de Barros, Nestor; Melo, Carlos F. E.; Gonzaga, Adilson; Vieira, Marcelo A. C.
2018-02-01
This paper proposes a method to reduce the number of false-positives (FP) in a computer-aided detection (CAD) scheme for automated detection of architectural distortion (AD) in digital mammography. AD is a subtle contraction of breast parenchyma that may represent an early sign of breast cancer. Due to its subtlety and variability, AD is more difficult to detect compared to microcalcifications and masses, and is commonly found in retrospective evaluations of false-negative mammograms. Several computer-based systems have been proposed for automated detection of AD in breast images. The usual approach is to automatically detect possible sites of AD in a mammographic image (segmentation step) and then use a classifier to eliminate the false-positives and identify the suspicious regions (classification step). This paper focuses on the optimization of the segmentation step to reduce the number of FPs that is used as input to the classifier. The proposal is to use statistical measurements to score the segmented regions and then apply a threshold to select a small quantity of regions that should be submitted to the classification step, improving the detection performance of a CAD scheme. We evaluated 12 image features to score and select suspicious regions of 74 clinical Full-Field Digital Mammography (FFDM) images. All images in this dataset contained at least one region with AD previously marked by an expert radiologist. The results showed that the proposed method can reduce the false positives of the segmentation step of the CAD scheme from 43.4 false positives (FP) per image to 34.5 FP per image, without increasing the number of false negatives.
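In essence, the proposed false-positive reduction scores each segmented region with statistical measurements and passes only regions above a threshold to the classifier. The sketch below illustrates that selection step with placeholder features and an assumed combined score; it does not reproduce the 12 features evaluated in the paper.

import numpy as np

rng = np.random.default_rng(5)

# Each candidate region from the segmentation step is summarized by a few
# statistical measurements (placeholders for the features used in the paper).
regions = [{"mean": rng.uniform(0.2, 0.9),
            "std": rng.uniform(0.01, 0.2),
            "entropy": rng.uniform(1.0, 5.0)} for _ in range(40)]

def score(region):
    # Assumed combined score; higher = more likely to contain architectural distortion.
    return region["entropy"] * region["std"] / (region["mean"] + 1e-6)

scores = np.array([score(r) for r in regions])
threshold = np.percentile(scores, 75)        # keep only the top quarter of regions
selected = [r for r, s in zip(regions, scores) if s >= threshold]
print(f"{len(selected)} of {len(regions)} regions passed to the classification step")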
Increases in efficiency and enhancements to the Mars Observer non-stored commanding process
NASA Technical Reports Server (NTRS)
Brooks, Robert N., Jr.; Torgerson, J. Leigh
1994-01-01
The Mars Observer team was, until the untimely loss of the spacecraft on August 21, 1993, performing flight operations with greater efficiency and speed than any previous JPL mission of its size. This level of throughput was made possible by a mission operations system which was composed of skilled personnel using sophisticated sequencing and commanding tools. During cruise flight operations, however, it was realized by the project that this commanding level was not going to be sufficient to support the activities planned for mapping operations. The project had committed to providing the science instrument principal investigators with a much higher level of commanding during mapping. Thus, the project began taking steps to enhance the capabilities of the flight team. One mechanism used by project management was a tool available from total quality management (TQM). This tool is known as a process action team (PAT). The Mars Observer PAT was tasked to increase the capacity of the flight team's nonstored commanding process by fifty percent with no increase in staffing and a minimal increase in risk. The outcome of this effort was, in fact, to increase the capacity by a factor of 2.5 rather than the desired fifty percent and actually reduce risk. The majority of these improvements came from the automation of the existing command process. These results required very few changes to the existing mission operations system. Rather, the PAT was able to take advantage of automation capabilities inherent in the existing system and make changes to the existing flight team procedures.
Automated Array Assembly, Phase 2
NASA Technical Reports Server (NTRS)
Carbajal, B. G.
1979-01-01
The solar cell module process development activities in the areas of surface preparation are presented. Process step development was carried out on texture etching, including the evolution of a conceptual process model for the texturing process; plasma etching; and diffusion studies that focused on doped polymer diffusion sources. Cell processing was carried out to test process steps, and a simplified diode solar cell process was developed. Cell processing was also run to fabricate square cells to populate sample minimodules. Module fabrication featured the demonstration of a porcelainized steel glass structure that should exceed the 20 year life goal of the low cost silicon array program. High efficiency cell development focused on the tandem junction cell (TJC) and a modification of the TJC called the front surface field cell. Cell efficiencies in excess of 16 percent at AM1 have been attained with only modest fill factors. A transistor-like model was proposed that fits the cell performance and provides a guideline for future improvements in cell performance.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hawkins, C.; Dietz, M.; Kaminski, M.
2016-03-01
A technical program to support the Centers for Disease Control and Prevention is being developed to provide an analytical method for rapid extraction of Sr-90 from urine, with the intent of assessing the general population’s exposure during an emergency response to a radiological terrorist event. Results are presented on the progress in urine sample preparation and chemical separation steps that provide an accurate and quantitative detection of Sr-90 based upon an automated column separation sequence and a liquid scintillation assay. Batch extractions were used to evaluate the urine pretreatment and the column separation efficiency and loading capacity based upon commercial, extractant-loaded resins. An efficient pretreatment process for decolorizing and removing organics from urine without measurable loss of radiostrontium from the sample was demonstrated. In addition, the Diphonix® resin shows promise for the removal of high concentrations of common strontium interferents in urine as a first separation step for Sr-90 analysis.
NASA Astrophysics Data System (ADS)
Yu, H.; Barriga, S.; Agurto, C.; Zamora, G.; Bauman, W.; Soliz, P.
2012-03-01
Retinal vasculature is one of the most important anatomical structures in digital retinal photographs. Accurate segmentation of retinal blood vessels is an essential task in automated analysis of retinopathy. This paper presents a new and effective vessel segmentation algorithm that features computational simplicity and fast implementation. This method uses morphological pre-processing to decrease the disturbance of bright structures and lesions before vessel extraction. Next, a vessel probability map is generated by computing the eigenvalues of the second derivatives of the Gaussian-filtered image at multiple scales. Then, second-order local entropy thresholding is applied to segment the vessel map. Lastly, a rule-based decision step, which measures the geometric shape difference between vessels and lesions, is applied to reduce false positives. The algorithm is evaluated on the low-resolution DRIVE and STARE databases and on the publicly available high-resolution fundus image database from Friedrich-Alexander University Erlangen-Nuremberg (Germany). The proposed method achieved performance comparable to state-of-the-art unsupervised vessel segmentation methods at a competitive, faster speed on the DRIVE and STARE databases. For the high-resolution fundus image database, the proposed algorithm outperforms an existing approach in both performance and speed. The efficiency and robustness make the blood vessel segmentation method described here suitable for broad application in automated analysis of retinal images.
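A minimal sketch of the multi-scale Hessian (second-derivative-of-Gaussian) step described above, assuming a 2-D grayscale fundus image in which vessels appear dark on a brighter background; the vesselness expression is a crude proxy rather than the authors' exact response, and the entropy thresholding and rule-based steps are omitted.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def vessel_probability_map(image, scales=(1.0, 2.0, 3.0)):
    """Multi-scale vesselness from eigenvalues of the Gaussian-smoothed Hessian.

    Dark, elongated structures give one large positive eigenvalue across the
    vessel and a near-zero eigenvalue along it; the maximum response over
    scales is kept. This is a simplified stand-in for the map that the paper
    then binarises with second-order local entropy thresholding.
    """
    img = image.astype(float)
    response = np.zeros_like(img)
    for s in scales:
        # Second derivatives of the Gaussian-filtered image at scale s.
        dxx = gaussian_filter(img, s, order=(0, 2))
        dyy = gaussian_filter(img, s, order=(2, 0))
        dxy = gaussian_filter(img, s, order=(1, 1))
        # Closed-form eigenvalues of the 2x2 Hessian at every pixel.
        half_trace = (dxx + dyy) / 2.0
        root = np.sqrt(((dxx - dyy) / 2.0) ** 2 + dxy ** 2)
        lam_large = half_trace + root
        # Scale-normalised response; only positive curvature (dark vessels) counts.
        response = np.maximum(response, (s ** 2) * np.maximum(lam_large, 0.0))
    return response / (response.max() + 1e-12)
```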
DOE Office of Scientific and Technical Information (OSTI.GOV)
Purdie, Thomas G., E-mail: Tom.Purdie@rmp.uhn.on.ca; Department of Radiation Oncology, University of Toronto, Toronto, Ontario; Techna Institute, University Health Network, Toronto, Ontario
Purpose: To demonstrate the large-scale clinical implementation and performance of an automated treatment planning methodology for tangential breast intensity modulated radiation therapy (IMRT). Methods and Materials: Automated planning was used to prospectively plan tangential breast IMRT treatment for 1661 patients between June 2009 and November 2012. The automated planning method emulates the manual steps performed by the user during treatment planning, including anatomical segmentation, beam placement, optimization, dose calculation, and plan documentation. The user specifies clinical requirements of the plan to be generated through a user interface embedded in the planning system. The automated method uses heuristic algorithms to define and simplify the technical aspects of the treatment planning process. Results: Automated planning was used in 1661 of 1708 patients receiving tangential breast IMRT during the time interval studied. Therefore, automated planning was applicable in greater than 97% of cases. The time for treatment planning using the automated process is routinely 5 to 6 minutes on standard commercially available planning hardware. We have shown a consistent reduction in plan rejections from plan reviews through the standard quality control process or weekly quality review multidisciplinary breast rounds as we have automated the planning process for tangential breast IMRT. Clinical plan acceptance increased from 97.3% using our previous semiautomated inverse method to 98.9% using the fully automated method. Conclusions: Automation has become the routine standard method for treatment planning of tangential breast IMRT at our institution and is clinically feasible on a large scale. The method has wide clinical applicability and can add tremendous efficiency, standardization, and quality to the current treatment planning process. The use of automated methods can allow centers to more rapidly adopt IMRT and enhance access to the documented improvements in care for breast cancer patients, using technologies that are widely available and already in clinical use.
Purdie, Thomas G; Dinniwell, Robert E; Fyles, Anthony; Sharpe, Michael B
2014-11-01
To demonstrate the large-scale clinical implementation and performance of an automated treatment planning methodology for tangential breast intensity modulated radiation therapy (IMRT). Automated planning was used to prospectively plan tangential breast IMRT treatment for 1661 patients between June 2009 and November 2012. The automated planning method emulates the manual steps performed by the user during treatment planning, including anatomical segmentation, beam placement, optimization, dose calculation, and plan documentation. The user specifies clinical requirements of the plan to be generated through a user interface embedded in the planning system. The automated method uses heuristic algorithms to define and simplify the technical aspects of the treatment planning process. Automated planning was used in 1661 of 1708 patients receiving tangential breast IMRT during the time interval studied. Therefore, automated planning was applicable in greater than 97% of cases. The time for treatment planning using the automated process is routinely 5 to 6 minutes on standard commercially available planning hardware. We have shown a consistent reduction in plan rejections from plan reviews through the standard quality control process or weekly quality review multidisciplinary breast rounds as we have automated the planning process for tangential breast IMRT. Clinical plan acceptance increased from 97.3% using our previous semiautomated inverse method to 98.9% using the fully automated method. Automation has become the routine standard method for treatment planning of tangential breast IMRT at our institution and is clinically feasible on a large scale. The method has wide clinical applicability and can add tremendous efficiency, standardization, and quality to the current treatment planning process. The use of automated methods can allow centers to more rapidly adopt IMRT and enhance access to the documented improvements in care for breast cancer patients, using technologies that are widely available and already in clinical use. Copyright © 2014 Elsevier Inc. All rights reserved.
Software-supported USER cloning strategies for site-directed mutagenesis and DNA assembly.
Genee, Hans Jasper; Bonde, Mads Tvillinggaard; Bagger, Frederik Otzen; Jespersen, Jakob Berg; Sommer, Morten O A; Wernersson, Rasmus; Olsen, Lars Rønn
2015-03-20
USER cloning is a fast and versatile method for engineering of plasmid DNA. We have developed a user-friendly Web server tool that automates the design of optimal PCR primers for several distinct USER cloning-based applications. Our Web server, named AMUSER (Automated DNA Modifications with USER cloning), facilitates DNA assembly and introduction of virtually any type of site-directed mutagenesis by designing optimal PCR primers for the desired genetic changes. To demonstrate the utility, we designed primers for a simultaneous two-position site-directed mutagenesis of green fluorescent protein (GFP) to yellow fluorescent protein (YFP), which in a single-step reaction resulted in a 94% cloning efficiency. AMUSER also supports degenerate nucleotide primers, single insert combinatorial assembly, and flexible parameters for PCR amplification. AMUSER is freely available online at http://www.cbs.dtu.dk/services/AMUSER/.
[Innovative technology and blood safety].
Begue, S; Morel, P; Djoudi, R
2016-11-01
Although technological innovations alone are not enough to improve blood safety, their contributions to blood transfusion over several decades have been major. The improvement of blood donation (new apheresis devices, RFID), of blood components (additive solutions, pathogen reduction technology, automated processing of platelet concentrates), and of the manufacturing process of these products (automated processing of whole blood), all steps where technological innovations were implemented, has led to better traceability, more efficient processes, quality improvement of blood products, and therefore increased blood safety for blood donors and patients. While we are on the threshold of a great change with the progress of pathogen reduction technology (for whole blood and red blood cells), we also hope to see the production of genuine ex vivo red blood cells or platelets, which would open new conceptual paths for blood safety. Copyright © 2016 Elsevier Masson SAS. All rights reserved.
21 CFR 211.188 - Batch production and control records.
Code of Federal Regulations, 2012 CFR
2012-04-01
... that each significant step in the manufacture, processing, packing, or holding of the batch was... automated equipment under § 211.68, the identification of the person checking the significant step performed by the automated equipment. (12) Any investigation made according to § 211.192. (13) Results of...
21 CFR 211.188 - Batch production and control records.
Code of Federal Regulations, 2013 CFR
2013-04-01
... that each significant step in the manufacture, processing, packing, or holding of the batch was... automated equipment under § 211.68, the identification of the person checking the significant step performed by the automated equipment. (12) Any investigation made according to § 211.192. (13) Results of...
21 CFR 211.188 - Batch production and control records.
Code of Federal Regulations, 2014 CFR
2014-04-01
... that each significant step in the manufacture, processing, packing, or holding of the batch was... automated equipment under § 211.68, the identification of the person checking the significant step performed by the automated equipment. (12) Any investigation made according to § 211.192. (13) Results of...
Dornan, Mark H; Simard, José-Mathieu; Leblond, Antoine; Juneau, Daniel; Delouya, Guila; Saad, Fred; Ménard, Cynthia; DaSilva, Jean N
2018-05-02
[18F]DCFPyL is a clinical-stage PET radiotracer used to image prostate cancer. This report details the efficient production of [18F]DCFPyL using single-step direct radiofluorination, without the use of carboxylic acid-protecting groups. Radiolabeling reaction optimization studies revealed an inverse correlation between the amount of precursor used and the radiochemical yield. This simplified approach enabled automated preparation of [18F]DCFPyL within 28 minutes using HPLC purification (26% ± 6%, at EOS, n = 4), which was then scaled up for large-batch production to generate 1.46 ± 0.23 Ci of [18F]DCFPyL at EOS (n = 7) in high molar activity (37 933 ± 4158 mCi/μmol, 1403 ± 153 GBq/μmol, at EOS, n = 7). Further, this work enabled the development of [18F]DCFPyL production in 21 minutes using an easy cartridge-based purification (25% ± 9% radiochemical yield, at EOS, n = 3). Copyright © 2018 John Wiley & Sons, Ltd.
A simple and rapid one-step continuous-flow synthesis route has been developed for the preparation of chromene derivatives from the reaction of aromatic aldehydes, α-cyanomethylene compounds and naphthols. In this contribution, a one-step continuous-flow protocol in a continuous ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shwehdi, M.H.; Khan, A.Z.
Building automation technology is rapidly developing towards more reliable communication systems and devices that control electronic equipment. Controlling this equipment leads to efficient energy management and savings on the monthly electricity bill. Power line communication (PLC) has been one of the dreams of the electronics industry for decades, especially for building automation. It is the purpose of this paper to demonstrate communication methods among electronic control devices through an AC power line carrier within buildings for more efficient energy control. The paper outlines methods of communication over a powerline, namely the X-10 and CEBus. It also introduces spread spectrum technology, which increases speed to 100-150 times that of the X-10 system. The powerline carrier has tremendous applications in the field of building automation. The paper presents an attempt to realize a so-called smart house concept, in which all home electronic devices, from a coffee maker to a water heater, microwave, or robot, can be controlled by an intelligent network whenever one wishes. The designed system may be applied very profitably to help in energy management for both customer and utility.
Lutz, Barry; Liang, Tinny; Fu, Elain; Ramachandran, Sujatha; Kauffman, Peter; Yager, Paul
2013-07-21
Lateral flow tests (LFTs) are an ingenious format for rapid and easy-to-use diagnostics, but they are fundamentally limited to assay chemistries that can be reduced to a single chemical step. In contrast, most laboratory diagnostic assays rely on multiple timed steps carried out by a human or a machine. Here, we use dissolvable sugar applied to paper to create programmable flow delays and present a paper network topology that uses these time delays to program automated multi-step fluidic protocols. Solutions of sucrose at different concentrations (10-70% of saturation) were added to paper strips and dried to create fluidic time delays spanning minutes to nearly an hour. A simple folding card format employing sugar delays was shown to automate a four-step fluidic process initiated by a single user activation step (folding the card); this device was used to perform a signal-amplified sandwich immunoassay for a diagnostic biomarker for malaria. The cards are capable of automating multi-step assay protocols normally used in laboratories, but in a rapid, low-cost, and easy-to-use format.
Lutz, Barry; Liang, Tinny; Fu, Elain; Ramachandran, Sujatha; Kauffman, Peter; Yager, Paul
2013-01-01
Lateral flow tests (LFTs) are an ingenious format for rapid and easy-to-use diagnostics, but they are fundamentally limited to assay chemistries that can be reduced to a single chemical step. In contrast, most laboratory diagnostic assays rely on multiple timed steps carried out by a human or a machine. Here, we use dissolvable sugar applied to paper to create programmable flow delays and present a paper network topology that uses these time delays to program automated multi-step fluidic protocols. Solutions of sucrose at different concentrations (10-70% of saturation) were added to paper strips and dried to create fluidic time delays spanning minutes to nearly an hour. A simple folding card format employing sugar delays was shown to automate a four-step fluidic process initiated by a single user activation step (folding the card); this device was used to perform a signal-amplified sandwich immunoassay for a diagnostic biomarker for malaria. The cards are capable of automating multi-step assay protocols normally used in laboratories, but in a rapid, low-cost, and easy-to-use format. PMID:23685876
Automated basin delineation from digital terrain data
NASA Technical Reports Server (NTRS)
Marks, D.; Dozier, J.; Frew, J.
1983-01-01
While digital terrain grids are now in wide use, accurate delineation of drainage basins from these data is difficult to efficiently automate. A recursive order N solution to this problem is presented. The algorithm is fast because no point in the basin is checked more than once, and no points outside the basin are considered. Two applications for terrain analysis and one for remote sensing are given to illustrate the method, on a basin with high relief in the Sierra Nevada. This technique for automated basin delineation will enhance the utility of digital terrain analysis for hydrologic modeling and remote sensing.
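A minimal sketch of the recursive order-N idea described above, assuming the terrain has already been reduced to a grid in which each cell records the cell it drains into (a simplified stand-in for a D8 flow-direction grid); the encoding is illustrative, not the authors' data structure.

```python
import sys
import numpy as np

# Offsets of the eight neighbours of a grid cell (row, column).
NEIGHBOURS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
              (0, 1), (1, -1), (1, 0), (1, 1)]

def delineate_basin(flow_to, outlet):
    """Mark every cell that drains to `outlet`.

    `flow_to[r, c]` holds the (row, col) of the cell that (r, c) drains into.
    Starting from the outlet and recursing only into neighbours that drain
    toward it, each basin cell is visited exactly once and cells outside the
    basin are never examined, which is the order-N property of the algorithm.
    """
    rows, cols = flow_to.shape[:2]
    basin = np.zeros((rows, cols), dtype=bool)
    sys.setrecursionlimit(10 ** 6)   # basins can be deep in high-relief terrain

    def visit(r, c):
        basin[r, c] = True
        for dr, dc in NEIGHBOURS:
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and not basin[nr, nc]:
                if tuple(flow_to[nr, nc]) == (r, c):   # neighbour drains here
                    visit(nr, nc)

    visit(*outlet)
    return basin
```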
Array Automated Assembly Task Low Cost Silicon Solar Array Project, Phase 2
NASA Technical Reports Server (NTRS)
Rhee, S. S.; Jones, G. T.; Allison, K. L.
1978-01-01
Progress in the development of solar cells and module process steps for low-cost solar arrays is reported. Specific topics covered include: (1) a system to automatically measure solar cell electrical performance parameters; (2) automation of wafer surface preparation, printing, and plating; (3) laser inspection of mechanical defects of solar cells; and (4) a silicon antireflection coating system. Two solar cell process steps, laser trimming and holing automation and spray-on dopant junction formation, are described.
Wang, Jin; Patel, Vimal; Burns, Daniel; Laycock, John; Pandya, Kinnari; Tsoi, Jennifer; DeSilva, Binodh; Ma, Mark; Lee, Jean
2013-07-01
Regulated bioanalytical laboratories that run ligand-binding assays in support of biotherapeutics development face ever-increasing demand to support more projects with increased efficiency. Laboratory automation is a tool that has the potential to improve both quality and efficiency in a bioanalytical laboratory. The success of laboratory automation requires thoughtful evaluation of program needs and fit-for-purpose strategies, followed by pragmatic implementation plans and continuous user support. In this article, we present the development of fit-for-purpose automation in total walk-away and flexible modular modes. We also share our sustained experience of vendor collaboration and teamwork to educate users and to promote and track the use of automation. The implementation of laboratory automation improves assay performance, data quality, process efficiency and method transfer to CROs in a regulated bioanalytical laboratory environment.
NASA Astrophysics Data System (ADS)
Bera, Amrita Mandal; Wargulski, Dan Ralf; Unold, Thomas
2018-04-01
Hybrid organometal perovskites have emerged as promising solar cell materials and have exhibited solar cell efficiencies of more than 20%. Thin films of methylammonium lead iodide (CH3NH3PbI3) perovskite were synthesized by two different methods (one-step and two-step), and their morphological properties were studied by scanning electron microscopy and optical microscope imaging. The morphology of the perovskite layer is one of the most important parameters affecting solar cell efficiency. The film morphology revealed that the two-step method provides better surface coverage than the one-step method; however, the grain sizes were smaller for the two-step method. Films prepared by the two-step method on different substrates showed that the grain size also depends on the substrate: the grain size increased from a glass substrate, to FTO with a TiO2 blocking layer, to bare FTO, without any change in the surface coverage. The present study shows that improved film quality can be obtained with the two-step method by optimizing the synthesis process.
Detection of canine distemper virus (CDV) through one step RT-PCR combined with nested PCR.
Kim, Y H; Cho, K W; Youn, H Y; Yoo, H S; Han, H R
2001-04-01
A one-step reverse transcription PCR (RT-PCR) combined with nested PCR was set up to increase efficiency in the diagnosis of canine distemper virus (CDV) infection, following the development of a nested PCR. Two PCR primer sets were designed based on the sequence of the nucleocapsid gene of the CDV Onderstepoort strain. One-step RT-PCR with the outer primer pair was shown to detect 10(2) PFU/ml. The sensitivity was increased a hundredfold using the one-step RT-PCR combined with the nested PCR. Specificity of the PCR was also confirmed using other related canine viruses as well as peripheral blood mononuclear cells (PBMC) and body secretions of healthy dogs. Of the 51 blood samples from dogs clinically suspected of CD, 45 samples were found positive by one-step RT-PCR combined with nested PCR. However, only 15 samples were identified as positive with a single one-step RT-PCR. Therefore, an approximately 60% increase in the efficiency of the diagnosis was observed with the combined method. These results suggest that one-step RT-PCR combined with nested PCR could be a sensitive, specific, and practical method for diagnosis of CDV infection.
Biomimetic Molecular Signaling using DNA Walkers on Microparticles.
Damase, Tulsi Ram; Spencer, Adam; Samuel, Bamidele; Allen, Peter B
2017-06-22
We report the release of catalytic DNA walkers from hydrogel microparticles and the detection of those walkers by substrate-coated microparticles. This might be considered a synthetic biology analog of molecular signal release and reception. One type of particle was coated with components of a DNA one-step strand displacement (OSD) reaction to release the walker. A second type of particle was coated with substrate (or "track") for the molecular walker. We distinguish these particle types using fluorescence barcoding: we synthesized and distinguished multiple particle types with multicolor fluorescence microscopy and automated image analysis software. This represents a step toward amplified, multiplex, and microscopically localized detection based on DNA nanotechnology.
ASPECTS: an automation-assisted SPE method development system.
Li, Ming; Chou, Judy; King, Kristopher W; Yang, Liyu
2013-07-01
A conventional SPE method development (MD) process typically involves deciding the chemistry of the sorbent and eluent based on information about the analyte; experimentally preparing and trying out various combinations of adsorption chemistry and elution conditions; quantitatively evaluating the various conditions; and comparing the quantitative results from all combinations of conditions to select the best condition for method qualification. The second and fourth steps have mostly been performed manually until now. We developed an automation-assisted system that expedites the conventional SPE MD process by automating 99% of the second step, and expedites the fourth step by automatically processing the results data and presenting it to the analyst in a user-friendly format. The automation-assisted SPE MD system greatly reduces the manual labor in SPE MD work, prevents analyst errors from causing misinterpretation of quantitative results, and shortens data analysis and interpretation time.
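A minimal sketch of how the combinatorial try-out (second step) and result comparison (fourth step) can be automated, as described above; the sorbent and eluent libraries and the `measure_recovery` callable are placeholders standing in for the instrument runs and quantitation, not the ASPECTS implementation.

```python
from itertools import product

# Placeholder condition libraries; a real system would derive these from the
# analyte information gathered in the first step.
SORBENTS = ["C18", "mixed-mode cation", "mixed-mode anion"]
ELUENTS = ["MeOH", "MeOH + 2% NH4OH", "MeOH + 2% formic acid"]

def run_md_screen(measure_recovery):
    """Try every sorbent/eluent combination and rank the results by recovery.

    `measure_recovery(sorbent, eluent)` stands in for the extraction run and
    quantitative evaluation that the automated system performs.
    """
    results = []
    for sorbent, eluent in product(SORBENTS, ELUENTS):
        recovery = measure_recovery(sorbent, eluent)
        results.append({"sorbent": sorbent, "eluent": eluent, "recovery": recovery})
    results.sort(key=lambda r: r["recovery"], reverse=True)
    return results   # the top entry is the condition taken forward to qualification
```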
Merz, Michael; Eisele, Thomas; Berends, Pieter; Appel, Daniel; Rabe, Swen; Blank, Imre; Stressler, Timo; Fischer, Lutz
2015-06-17
Flavourzyme is sold as a peptidase preparation from Aspergillus oryzae. The enzyme preparation is widely and diversely used for protein hydrolysis in industrial and research applications. However, detailed information about the composition of this mixture is still missing due to the complexity. The present study identified eight key enzymes by mass spectrometry and partially by activity staining on native polyacrylamide gels or gel zymography. The eight enzymes identified were two aminopeptidases, two dipeptidyl peptidases, three endopeptidases, and one α-amylase from the A. oryzae strain ATCC 42149/RIB 40 (yellow koji mold). Various specific marker substrates for these Flavourzyme enzymes were ascertained. An automated, time-saving nine-step protocol for the purification of all eight enzymes within 7 h was designed. Finally, the purified Flavourzyme enzymes were biochemically characterized with regard to pH and temperature profiles and molecular sizes.
ISS Asset Tracking Using SAW RFID Technology
NASA Technical Reports Server (NTRS)
Schellhase, Amy; Powers, Annie
2004-01-01
A team at the NASA Johnson Space Center (JSC) is undergoing final preparations to test Surface Acoustic Wave (SAW) Radio Frequency Identification (RFID) technology to track assets aboard the International Space Station (ISS). Currently, almost 10,000 U.S. items onboard the ISS are tracked within a database maintained by both the JSC ground teams and the crew onboard the ISS. This barcode-based inventory management system has successfully tracked the location of 97% of the items onboard, but its accuracy is dependent on the crew reporting hardware movements, taking valuable time away from science and other activities. With the addition of future modules, the volume of inventory to be tracked is expected to increase significantly. The first test of RFID technology on ISS, which will be conducted by the Expedition 16 crew later this year, will evaluate the ability of RFID technology to track consumable items. These consumables, which include office supplies and clothing, are regularly supplied to ISS and can be tagged on the ground. Automation will eliminate line-of-sight auditing requirements, directly saving crew time. This first step in automating an inventory tracking system will pave the way for future uses of RFID for inventory tracking in space. Not only are there immediate benefits for ISS applications, it is a crucial step to ensure efficient logistics support for future vehicles and exploration missions where resupplies are not readily available. Following a successful initial test, the team plans to execute additional tests for new technology, expanded operations concepts, and increased automation.
Robotics in biomedical chromatography and electrophoresis.
Fouda, H G
1989-08-11
The ideal laboratory robot can be viewed as "an indefatigable assistant capable of working continuously for 24 h a day with constant efficiency". The development of a system approaching that promise requires considerable skill and time commitment, a thorough understanding of the capabilities and limitations of the robot and its specialized modules and an intimate knowledge of the functions to be automated. The robot need not emulate every manual step. Effective substitutes for difficult steps must be devised. The future of laboratory robots depends not only on technological advances in other fields, but also on the skill and creativity of chromatographers and other scientists. The robot has been applied to automate numerous biomedical chromatography and electrophoresis methods. The quality of its data can approach, and in some cases exceed, that of manual methods. Maintaining high data quality during continuous operation requires frequent maintenance and validation. Well designed robotic systems can yield substantial increase in the laboratory productivity without a corresponding increase in manpower. They can free skilled personnel from mundane tasks and can enhance the safety of the laboratory environment. The integration of robotics, chromatography systems and laboratory information management systems permits full automation and affords opportunities for unattended method development and for future incorporation of artificial intelligence techniques and the evolution of expert systems. Finally, humanoid attributes aside, robotic utilization in the laboratory should not be an end in itself. The robot is a useful tool that should be utilized only when it is prudent and cost-effective to do so.
Automated and comprehensive link engineering supporting branched, ring, and mesh network topologies
NASA Astrophysics Data System (ADS)
Farina, J.; Khomchenko, D.; Yevseyenko, D.; Meester, J.; Richter, A.
2016-02-01
Link design, while relatively easy in the past, can become quite cumbersome with complex channel plans and equipment configurations. The task of designing optical transport systems and selecting equipment is often performed by an applications or sales engineer using simple tools, such as custom Excel spreadsheets. Eventually, every individual has their own version of the spreadsheet as well as their own methodology for building the network. This approach becomes unmanageable very quickly and leads to mistakes, bending of the engineering rules and installations that do not perform as expected. We demonstrate a comprehensive planning environment, which offers an efficient approach to unify, control and expedite the design process by controlling libraries of equipment and engineering methodologies, automating the process and providing the analysis tools necessary to predict system performance throughout the system and for all channels. In addition to the placement of EDFAs and DCEs, performance analysis metrics are provided at every step of the way. Metrics that can be tracked include power, CD and OSNR, SPM, XPM, FWM and SBS. Automated routine steps assist in design aspects such as equalization, padding and gain setting for EDFAs, the placement of ROADMs and transceivers, and creating regeneration points. DWDM networks consisting of a large number of nodes and repeater huts, interconnected in linear, branched, mesh and ring network topologies, can be designed much faster when compared with conventional design methods. Using flexible templates for all major optical components, our technology-agnostic planning approach supports the constant advances in optical communications.
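As an illustration of one of the per-channel metrics listed above, the sketch below accumulates OSNR across a chain of EDFA-amplified spans using the common rule-of-thumb formula for ASE noise in a 0.1 nm reference bandwidth; the span parameters are illustrative, and this is not the planning environment's actual analysis engine.

```python
import math

def span_osnr_db(launch_power_dbm, span_loss_db, noise_figure_db):
    """OSNR contribution of one EDFA-amplified span (0.1 nm reference bandwidth).

    Uses the common rule of thumb OSNR = 58 + P_ch - L_span - NF (all in dB);
    the constant 58 folds in the photon energy and reference bandwidth near 1550 nm.
    """
    return 58.0 + launch_power_dbm - span_loss_db - noise_figure_db

def cascade_osnr_db(span_osnrs_db):
    """Combine per-span OSNRs: the inverse linear OSNRs add along the link."""
    inv_total = sum(10 ** (-osnr / 10.0) for osnr in span_osnrs_db)
    return -10.0 * math.log10(inv_total)

# Example: 10 identical spans, 0 dBm per channel, 22 dB loss, 5 dB noise figure.
per_span = span_osnr_db(0.0, 22.0, 5.0)    # 31 dB for a single span
link = cascade_osnr_db([per_span] * 10)    # about 21 dB end-to-end
```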
Automated Detection of Surgical Adverse Events from Retrospective Clinical Data
ERIC Educational Resources Information Center
Hu, Zhen
2017-01-01
The Detection of surgical adverse events has become increasingly important with the growing demand for quality improvement and public health surveillance with surgery. Event reporting is one of the key steps in determining the impact of postoperative complications from a variety of perspectives and is an integral component of improving…
Lemaire, C; Libert, L; Franci, X; Genon, J-L; Kuci, S; Giacomelli, F; Luxen, A
2015-06-15
An efficient, fully automated, enantioselective multi-step synthesis of no-carrier-added (nca) 6-[(18)F]fluoro-L-dopa ([(18)F]FDOPA) and 2-[(18)F]fluoro-L-tyrosine ([(18)F]FTYR) on a GE FASTlab synthesizer, in conjunction with an additional high-performance liquid chromatography (HPLC) purification, has been developed. A PTC (phase-transfer catalyst) strategy was used to synthesize these two important radiopharmaceuticals. Building on recent chemistry improvements, automation of the whole process was implemented on a commercially available GE FASTlab module, with slight hardware modification, using single-use cassettes and a stand-alone HPLC. [(18)F]FDOPA and [(18)F]FTYR were produced in 36.3 ± 3.0% (n = 8) and 50.5 ± 2.7% (n = 10) FASTlab radiochemical yield (decay corrected), respectively. The automated radiosynthesis on the FASTlab module requires about 52 min. Total synthesis time including HPLC purification and formulation was about 62 min. Enantiomeric excesses for these two aromatic amino acids were always >95%, and the specific activity was >740 GBq/µmol. This automated synthesis provides high amounts of [(18)F]FDOPA and [(18)F]FTYR (>37 GBq at end of synthesis (EOS)). The process, fully adaptable for reliable production across multiple PET sites, could be readily implemented into a clinical good manufacturing practice (GMP) environment. Copyright © 2015 John Wiley & Sons, Ltd.
Gädke, Johannes; Kleinfeldt, Lennart; Schubert, Chris; Rohde, Manfred; Biedendieck, Rebekka; Garnweitner, Georg; Krull, Rainer
2017-01-20
This paper discusses the use of recyclable functionalized nanoparticles for improved downstream processing of recombinant products. The Gram-positive bacterium Bacillus megaterium was used to secrete recombinant protein A fused to a histidine tag into the culture supernatant in shaker flasks. Superparamagnetic iron oxide nanoparticles functionalized with 3-glycidoxypropyl-trimethoxysilane-coupled-nitrilotriacetic-acid groups (GNTA-SPION) were synthesized and added directly to the growing culture. After a 10 min incubation time, >85% of the product was adsorbed onto the particles. The particles were magnetically separated using handheld neodymium magnets and the product was eluted. The GNTA-SPION were successfully regenerated and reused in five consecutive cycles. In the one-step purification, the purity of the product reached >99.9% with regard to protein A. A very low particle concentration of 0.5 g/L was sufficient for effective product separation. Bacterial growth was not influenced negatively by this concentration. Particle analysis showed similar properties between freshly synthesized and regenerated GNTA-SPION. The overall process efficiency was, however, influenced by partial disintegration of particle agglomerates and thus loss of particles. The demonstration of very fast in situ product removal from a growing bacterial culture, combined with a very high product purity within one step, shows possibilities for automated large-scale purification combined with recycling of biomass. Copyright © 2016 Elsevier B.V. All rights reserved.
ARES: automated response function code. Users manual. [HPGAM and LSQVM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maung, T.; Reynolds, G.M.
This ARES user's manual provides detailed instructions for a general understanding of the Automated Response Function Code and gives step-by-step instructions for using the complete code package on an HP-1000 system. This code is designed to calculate response functions of NaI gamma-ray detectors with cylindrical or rectangular geometries.
The Automated Array Assembly Task of the Low-cost Silicon Solar Array Project, Phase 2
NASA Technical Reports Server (NTRS)
Coleman, M. G.; Grenon, L.; Pastirik, E. M.; Pryor, R. A.; Sparks, T. G.
1978-01-01
An advanced process sequence for manufacturing high efficiency solar cells and modules in a cost-effective manner is discussed. Emphasis is on process simplicity and minimizing consumed materials. The process sequence incorporates texture etching, plasma processes for damage removal and patterning, ion implantation, low pressure silicon nitride deposition, and plated metal. A reliable module design is presented. Specific process step developments are given. A detailed cost analysis was performed to indicate future areas of fruitful cost reduction effort. Recommendations for advanced investigations are included.
Optical benchmarking of security document readers for automated border control
NASA Astrophysics Data System (ADS)
Valentín, Kristián; Wild, Peter; Štolc, Svorad; Daubner, Franz; Clabian, Markus
2016-10-01
Authentication and optical verification of travel documents upon crossing borders is of utmost importance for national security. Understanding the workflow and different approaches to ICAO 9303 travel document scanning in passport readers, as well as highlighting normalization issues and designing new methods to achieve better harmonization across inspection devices, are key steps for the development of more effective and efficient next-generation passport inspection. This paper presents a survey of state-of-the-art document inspection systems, showcasing results of a document reader challenge investigating 9 devices with regard to optical characteristics.
Phase editing as a signal pre-processing step for automated bearing fault detection
NASA Astrophysics Data System (ADS)
Barbini, L.; Ompusunggu, A. P.; Hillis, A. J.; du Bois, J. L.; Bartic, A.
2017-07-01
Scheduled maintenance and inspection of bearing elements in industrial machinery contributes significantly to the operating costs. Savings can be made through automatic vibration-based damage detection and prognostics, to permit condition-based maintenance. However automation of the detection process is difficult due to the complexity of vibration signals in realistic operating environments. The sensitivity of existing methods to the choice of parameters imposes a requirement for oversight from a skilled operator. This paper presents a novel approach to the removal of unwanted vibrational components from the signal: phase editing. The approach uses a computationally-efficient full-band demodulation and requires very little oversight. Its effectiveness is tested on experimental data sets from three different test-rigs, and comparisons are made with two state-of-the-art processing techniques: spectral kurtosis and cepstral pre-whitening. The results from the phase editing technique show a 10% improvement in damage detection rates compared to the state-of-the-art while simultaneously improving on the degree of automation. This outcome represents a significant contribution in the pursuit of fully automatic fault detection.
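The abstract names phase editing but does not spell out its algorithm, so as a concrete example of this family of vibration pre-processing the sketch below implements cepstral pre-whitening, one of the two comparison techniques mentioned: the magnitude spectrum is flattened while the phase is kept, suppressing dominant deterministic components so that impulsive bearing-fault content stands out.

```python
import numpy as np

def cepstral_prewhitening(signal):
    """Whiten the magnitude spectrum while keeping the original phase.

    Dividing the spectrum by its magnitude (equivalently, zeroing the real
    cepstrum away from zero quefrency) removes dominant deterministic tones,
    which is the pre-processing role these techniques play before envelope
    analysis of bearing faults.
    """
    spectrum = np.fft.rfft(signal)
    magnitude = np.abs(spectrum)
    magnitude[magnitude == 0] = 1.0          # avoid division by zero
    whitened = spectrum / magnitude          # unit magnitude, original phase
    return np.fft.irfft(whitened, n=len(signal))
```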
Automatic high throughput empty ISO container verification
NASA Astrophysics Data System (ADS)
Chalmers, Alex
2007-04-01
Encouraging results are presented for the automatic analysis of radiographic images of a continuous stream of ISO containers to confirm they are truly empty. A series of image processing algorithms are described that process real-time data acquired during the actual inspection of each container and assigns each to one of the classes "empty", "not empty" or "suspect threat". This research is one step towards achieving fully automated analysis of cargo containers.
Automated control of robotic camera tacheometers for measurements of industrial large scale objects
NASA Astrophysics Data System (ADS)
Heimonen, Teuvo; Leinonen, Jukka; Sipola, Jani
2013-04-01
Modern robotic tacheometers equipped with digital cameras (also called imaging total stations) and capable of reflectorless measurement offer new possibilities for gathering 3D data. In this paper an automated approach for the tacheometer measurements needed in the dimensional control of industrial large scale objects is proposed. There are two new contributions in the approach: the automated extraction of the vital points (i.e., the points to be measured) and the automated fine aiming of the tacheometer. The proposed approach proceeds through the following steps: First, the coordinates of the vital points are automatically extracted from the computer aided design (CAD) data. The extracted design coordinates are then used to aim the tacheometer at the designed location of the points, one after another. However, due to the deviations between the designed and the actual location of the points, the aiming needs to be adjusted. An automated dynamic image-based look-and-move type servoing architecture is proposed for this task. After successful fine aiming, the actual coordinates of the point in question can be automatically measured using the measuring functionalities of the tacheometer. The approach was validated experimentally and found to be feasible. On average, 97% of the points actually measured in four different shipbuilding measurement cases were indeed proposed as vital points by the automated extraction algorithm. The accuracy of the results obtained with the automatic control of the tacheometer was comparable to that obtained with manual control, and the reliability of the image processing step of the method was also found to be high in the laboratory experiments.
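A minimal sketch of the look-and-move fine-aiming loop described above; the `tacheometer` driver object, the `locate_target` image-processing callable, and the pixel-to-angle factor are hypothetical stand-ins, not the instrument's actual API.

```python
def aim_and_measure(tacheometer, locate_target, design_hz, design_v,
                    px_to_angle, tol_px=2, max_iters=10):
    """Iteratively refine the aim until the vital point is centred, then measure.

    Look: capture an image and locate the target's pixel offset from centre.
    Move: convert that offset to an angular correction and re-aim.
    Repeat until within tolerance, then trigger the reflectorless measurement.
    All driver calls are hypothetical.
    """
    hz, v = design_hz, design_v                  # start from the CAD design direction
    for _ in range(max_iters):
        tacheometer.aim(hz, v)
        dx_px, dy_px = locate_target(tacheometer.capture())
        if abs(dx_px) <= tol_px and abs(dy_px) <= tol_px:
            return tacheometer.measure()         # actual 3D coordinates of the point
        hz += px_to_angle * dx_px                # horizontal angle correction
        v += px_to_angle * dy_px                 # vertical angle correction
    raise RuntimeError("fine aiming did not converge")
```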
ORIGAMI Automator Primer. Automated ORIGEN Source Terms and Spent Fuel Storage Pool Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wieselquist, William A.; Thompson, Adam B.; Bowman, Stephen M.
2016-04-01
Source terms and spent nuclear fuel (SNF) storage pool decay heat load analyses for operating nuclear power plants require a large number of Oak Ridge Isotope Generation and Depletion (ORIGEN) calculations. SNF source term calculations also require a significant amount of bookkeeping to track quantities such as core and assembly operating histories, spent fuel pool (SFP) residence times, heavy metal masses, and enrichments. The ORIGEN Assembly Isotopics (ORIGAMI) module in the SCALE code system provides a simple scheme for entering these data. However, given the large scope of the analysis, extensive scripting is necessary to convert formats and process data to create thousands of ORIGAMI input files (one per assembly) and to process the results into formats readily usable by follow-on analysis tools. This primer describes a project within the SCALE Fulcrum graphical user interface (GUI) called ORIGAMI Automator that was developed to automate the scripting and bookkeeping in large-scale source term analyses. The ORIGAMI Automator enables the analyst to (1) easily create, view, and edit the reactor site and assembly information, (2) automatically create and run ORIGAMI inputs, and (3) analyze the results from ORIGAMI. ORIGAMI Automator uses the standard ORIGEN binary concentrations files produced by ORIGAMI, with concentrations available at all time points in each assembly’s life. The GUI plots results such as mass, concentration, activity, and decay heat using a powerful new ORIGEN Post-Processing Utility for SCALE (OPUS) GUI component. This document includes a description and user guide for the GUI, a step-by-step tutorial for a simplified scenario, and appendices that document the file structures used.
Goatman, Keith; Charnley, Amanda; Webster, Laura; Nussey, Stephen
2011-01-01
To assess the performance of automated disease detection in diabetic retinopathy screening using two-field mydriatic photography. Images from 8,271 sequential patient screening episodes from a South London diabetic retinopathy screening service were processed by the Medalytix iGrading™ automated grading system. For each screening episode macular-centred and disc-centred images of both eyes were acquired and independently graded according to the English national grading scheme. Where discrepancies were found between the automated result and original manual grade, internal and external arbitration was used to determine the final study grades. Two versions of the software were used: one that detected microaneurysms alone, and one that detected blot haemorrhages and exudates in addition to microaneurysms. Results for each version were calculated once using both fields and once using the macula-centred field alone. Of the 8,271 episodes, 346 (4.2%) were considered unassessable. Referable disease was detected in 587 episodes (7.1%). The sensitivity of the automated system for detecting unassessable images ranged from 97.4% to 99.1% depending on configuration. The sensitivity of the automated system for referable episodes ranged from 98.3% to 99.3%. All the episodes that included proliferative or pre-proliferative retinopathy were detected by the automated system regardless of configuration (192/192, 95% confidence interval 98.0% to 100%). If implemented as the first step in grading, the automated system would have reduced the manual grading effort by between 2,183 and 3,147 patient episodes (26.4% to 38.1%). Automated grading can safely reduce the workload of manual grading using two-field mydriatic photography in a routine screening service.
High-volume workflow management in the ITN/FBI system
NASA Astrophysics Data System (ADS)
Paulson, Thomas L.
1997-02-01
The Identification Tasking and Networking (ITN) Federal Bureau of Investigation system will manage the processing of more than 70,000 submissions per day. The workflow manager controls the routing of each submission through a combination of automated and manual processing steps whose exact sequence is dynamically determined by the results at each step. For most submissions, one or more of the steps involve the visual comparison of fingerprint images. The ITN workflow manager is implemented within a scalable client/server architecture. The paper describes the key aspects of the ITN workflow manager design which allow the high volume of daily processing to be successfully accomplished.
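A minimal sketch of result-driven routing of the kind described above, where the outcome of each processing step selects the next step; the step names, outcomes, and routing table are illustrative, not the ITN design.

```python
# Each entry maps (step, outcome) -> next step; None ends the workflow.
ROUTING = {
    ("quality_check", "pass"): "automated_match",
    ("quality_check", "fail"): "manual_review",
    ("automated_match", "hit"): "manual_verify",
    ("automated_match", "no_hit"): "archive",
    ("manual_review", "ok"): "automated_match",
    ("manual_review", "reject"): None,
    ("manual_verify", "confirmed"): "archive",
    ("manual_verify", "rejected"): "archive",
    ("archive", "done"): None,
}

def run_submission(submission, handlers, start="quality_check"):
    """Route one submission through automated and manual steps.

    `handlers[step](submission)` performs the step and returns an outcome
    string; the outcome selects the next step, so the exact sequence is
    decided dynamically by the results at each step.
    """
    step, trail = start, []
    while step is not None:
        outcome = handlers[step](submission)
        trail.append((step, outcome))
        step = ROUTING[(step, outcome)]
    return trail
```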
Demonstration of the feasibility of automated silicon solar cell fabrication
NASA Technical Reports Server (NTRS)
Taylor, W. E.; Schwartz, F. M.
1975-01-01
A study effort was undertaken to determine the process steps and design requirements of an automated silicon solar cell production facility. Identification of the key process steps was made and a laboratory model was conceptually designed to demonstrate the feasibility of automating the silicon solar cell fabrication process. A detailed laboratory model was designed to demonstrate those functions most critical to the question of solar cell fabrication process automation feasibility. The study and conceptual design have established the technical feasibility of automating the solar cell manufacturing process to produce low-cost solar cells with improved performance. Estimates predict an automated process throughput of 21,973 kilograms of silicon a year on a three-shift, 49-week basis, producing 4,747,000 hexagonal cells (38 mm/side), a total of 3,373 kilowatts at an estimated manufacturing cost of $0.866 per cell or $1.22 per watt.
A HUMAN AUTOMATION INTERACTION CONCEPT FOR A SMALL MODULAR REACTOR CONTROL ROOM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Le Blanc, Katya; Spielman, Zach; Hill, Rachael
Many advanced nuclear power plant (NPP) designs incorporate higher degrees of automation than the existing fleet of NPPs. Automation is being introduced or proposed in NPPs through a wide variety of systems and technologies, such as advanced displays, computer-based procedures, advanced alarm systems, and computerized operator support systems. Additionally, many new reactor concepts, both full scale and small modular reactors, are proposing increased automation and reduced staffing as part of their concept of operations. However, research consistently finds that there is a fundamental tradeoff between system performance with increased automation and reduced human performance. There is a need to address the question of how to achieve high performance and efficiency of high levels of automation without degrading human performance. One example of a new NPP concept that will utilize greater degrees of automation is the SMR concept from NuScale Power. The NuScale Power design requires 12 modular units to be operated in one single control room, which leads to a need for higher degrees of automation in the control room. Idaho National Laboratory (INL) researchers and NuScale Power human factors and operations staff are working on a collaborative project to address the human performance challenges of increased automation and to determine the principles that lead to optimal performance in highly automated systems. This paper will describe this concept in detail and will describe an experimental test of the concept. The benefits and challenges of the approach will be discussed.
Automated packing systems: review of industrial implementations
NASA Astrophysics Data System (ADS)
Whelan, Paul F.; Batchelor, Bruce G.
1993-08-01
A rich theoretical background to the problems that occur in the automation of material handling can be found in operations research, production engineering, systems engineering and automation, more specifically machine vision, literature. This work has contributed towards the design of intelligent handling systems. This paper will review the application of these automated material handling and packing techniques to industrial problems. The discussion will also highlight the systems integration issues involved in these applications. An outline of one such industrial application, the automated placement of shape templates on to leather hides, is also discussed. The purpose of this system is to arrange shape templates on a leather hide in an efficient manner, so as to minimize the leather waste, before they are automatically cut from the hide. These pieces are used in the furniture and car manufacturing industries for the upholstery of high quality leather chairs and car seats. Currently this type of operation is semi-automated. The paper will outline the problems involved in the full automation of such a procedure.
Hao, Nanjing; Jayawardana, Kalana W; Chen, Xuan; Yan, Mingdi
2015-01-21
In this study, amine-functionalized hollow mesoporous silica nanoparticles with an average diameter of ∼100 nm and a shell thickness of ∼20 nm were prepared by a one-step process. This new nanoparticulate system exhibited excellent killing efficiency against mycobacteria (M. smegmatis strain mc(2) 651) and cancer cells (A549).
Laboratory automation: total and subtotal.
Hawker, Charles D
2007-12-01
Worldwide, perhaps 2000 or more clinical laboratories have implemented some form of laboratory automation, either a modular automation system, such as for front-end processing, or a total laboratory automation system. This article provides descriptions and examples of these various types of automation. It also presents an outline of how a clinical laboratory that is contemplating automation should approach its decision and the steps it should follow to ensure a successful implementation. Finally, the role of standards in automation is reviewed.
Suh, Hyewon; Porter, John R; Racadio, Robert; Sung, Yi-Chen; Kientz, Julie A
2016-01-01
To help reach populations of children without consistent Internet access or medical care, we designed and implemented Baby Steps Text, an automated text message-based screening tool. We conducted preliminary user research via storyboarding and prototyping with target populations and then developed a fully functional system. In a one-month deployment study, we evaluated the feasibility of Baby Steps Text with fourteen families. During the study, 13 of 14 participants were able to learn and use the response structure (yielding a 2.88% error rate) and complete a child development screener entirely via text messages. All post-study survey respondents agreed that Baby Steps Text was understandable and easy to use, which was also confirmed through post-study interviews. Some survey respondents expressed liking Baby Steps Text because it was easy, quick, and convenient to use, and delivered helpful, timely information. Our initial deployment study shows text messaging is a feasible tool for supporting parents in tracking and monitoring their child's development.
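A minimal sketch of how a text-message screener reply might be parsed and scored; the accepted tokens, point values, and clarification behaviour are assumptions for illustration, not the Baby Steps Text design.

```python
YES_WORDS = {"yes", "y", "1"}
SOMETIMES_WORDS = {"sometimes", "s", "2"}
NO_WORDS = {"no", "n", "3"}

def parse_response(text):
    """Map a free-form SMS reply onto a screener response code, or None."""
    token = text.strip().lower()
    if token in YES_WORDS:
        return 2          # illustrative scoring: "yes" earns 2 points
    if token in SOMETIMES_WORDS:
        return 1
    if token in NO_WORDS:
        return 0
    return None           # unrecognised reply would trigger a clarification message

def score_screener(replies):
    """Total a completed screener; unparseable replies must be re-asked first."""
    parsed = [parse_response(r) for r in replies]
    if any(p is None for p in parsed):
        raise ValueError("some replies need clarification before scoring")
    return sum(parsed)
```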
NASA Astrophysics Data System (ADS)
Fern, Lisa Carolynn
This dissertation examines the challenges inherent in designing and regulating to support human-automation interaction for new technologies that will be deployed into complex systems. A key question for new technologies with increasingly capable automation is how work will be accomplished by human and machine agents. This question has traditionally been framed as how functions should be allocated between humans and machines. Such framing misses the coordination and synchronization that is needed for the different human and machine roles in the system to accomplish their goals. Coordination and synchronization demands are driven by the underlying human-automation architecture of the new technology, which is typically not specified explicitly by designers. The human-machine interface (HMI), which is intended to facilitate human-machine interaction and cooperation, typically is defined explicitly and therefore serves as a proxy for human-automation cooperation requirements with respect to technical standards for technologies. Unfortunately, mismatches between the HMI and the coordination and synchronization demands of the underlying human-automation architecture can lead to system breakdowns. A methodology is needed that both designers and regulators can utilize to evaluate the predicted performance of a new technology given potential human-automation architectures. Three experiments were conducted to inform the minimum HMI requirements for a detect and avoid (DAA) system for unmanned aircraft systems (UAS). The results of the experiments provided empirical input to specific minimum operational performance standards that UAS manufacturers will have to meet in order to operate UAS in the National Airspace System (NAS). These studies represent a success story for how to objectively and systematically evaluate prototype technologies as part of the process for developing regulatory requirements. They also provide an opportunity to reflect on the lessons learned in order to improve the methodology for defining technology requirements for regulators in the future. The biggest shortcoming of the presented research program was the absence of the explicit definition, generation and analysis of potential human-automation architectures. Failure to execute this step in the research process resulted in less efficient evaluation of the candidate prototype technologies in addition to a lack of exploration of different approaches to human-automation cooperation. Defining potential human-automation architectures a priori also allows regulators to develop scenarios that will stress the performance boundaries of the technology during the evaluation phase. The importance of adding this step of generating and evaluating candidate human-automation architectures prior to formal empirical evaluation is discussed. This document concludes with a look at both the importance of, and the challenges facing, examining human-automation coordination issues as part of the safety assurance activities for new technologies.
Global magnetosphere simulations using constrained-transport Hall-MHD with CWENO reconstruction
NASA Astrophysics Data System (ADS)
Lin, L.; Germaschewski, K.; Maynard, K. M.; Abbott, S.; Bhattacharjee, A.; Raeder, J.
2013-12-01
We present a new CWENO (Centrally-Weighted Essentially Non-Oscillatory) reconstruction-based MHD solver for the OpenGGCM global magnetosphere code. The solver was built using libMRC, a library for creating efficient parallel PDE solvers on structured grids. The use of libMRC gives us access to its core functionality of providing an automated code generation framework, which takes a user-provided PDE right-hand side in symbolic form and generates efficient, architecture-specific parallel code. libMRC also supports block-structured adaptive mesh refinement and implicit time stepping through integration with the PETSc library. We validate the new CWENO Hall-MHD solver against existing solvers both in standard test problems and in global magnetosphere simulations.
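The abstract does not show libMRC's code-generation interface; as a generic illustration of the idea (a symbolic right-hand side turned into compilable code), the sketch below uses SymPy's C-code printer on a toy 1-D advection term, which stands in for the Hall-MHD right-hand side a user would supply.

```python
import sympy as sp

# Symbolic state and a toy right-hand side (a 1-D advection term), standing in
# for the Hall-MHD right-hand side a user would provide to the framework.
rho, u, dx = sp.symbols("rho u dx")
rho_m1, rho_p1 = sp.symbols("rho_m1 rho_p1")      # neighbouring grid values
rhs = -u * (rho_p1 - rho_m1) / (2 * dx)           # central-difference d(rho)/dt

# Emit a C assignment for the kernel body; a real framework would wrap this in
# loops, apply the CWENO reconstruction, and specialise it per target architecture.
print(sp.ccode(rhs, assign_to="drho_dt"))
```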
High-concentration planar microtracking photovoltaic system exceeding 30% efficiency
NASA Astrophysics Data System (ADS)
Price, Jared S.; Grede, Alex J.; Wang, Baomin; Lipski, Michael V.; Fisher, Brent; Lee, Kyu-Tae; He, Junwen; Brulo, Gregory S.; Ma, Xiaokun; Burroughs, Scott; Rahn, Christopher D.; Nuzzo, Ralph G.; Rogers, John A.; Giebink, Noel C.
2017-08-01
Prospects for concentrating photovoltaic (CPV) power are growing as the market increasingly values high power conversion efficiency to leverage now-dominant balance of system and soft costs. This trend is particularly acute for rooftop photovoltaic power, where delivering the high efficiency of traditional CPV in the form factor of a standard rooftop photovoltaic panel could be transformative. Here, we demonstrate a fully automated planar microtracking CPV system <2 cm thick that operates at fixed tilt with a microscale triple-junction solar cell at >660× concentration ratio over a 140° full field of view. In outdoor testing over the course of two sunny days, the system operates automatically from sunrise to sunset, outperforming a 17%-efficient commercial silicon solar cell by generating >50% more energy per unit area per day in a direct head-to-head competition. These results support the technical feasibility of planar microtracking CPV to deliver a step change in the efficiency of rooftop solar panels at a commercially relevant concentration ratio.
A high-throughput semi-automated preparation for filtered synaptoneurosomes.
Murphy, Kathryn M; Balsor, Justin; Beshara, Simon; Siu, Caitlin; Pinto, Joshua G A
2014-09-30
Synaptoneurosomes have become an important tool for studying synaptic proteins. The filtered synaptoneurosomes preparation originally developed by Hollingsworth et al. (1985) is widely used and is an easy method to prepare synaptoneurosomes. The hand processing steps in that preparation, however, are labor intensive and have become a bottleneck for current proteomic studies using synaptoneurosomes. For this reason, we developed new steps for tissue homogenization and filtration that transform the preparation of synaptoneurosomes to a high-throughput, semi-automated process. We implemented a standardized protocol with easy to follow steps for homogenizing multiple samples simultaneously using a FastPrep tissue homogenizer (MP Biomedicals, LLC) and then filtering all of the samples in centrifugal filter units (EMD Millipore, Corp). The new steps dramatically reduce the time to prepare synaptoneurosomes from hours to minutes, increase sample recovery, and nearly double enrichment for synaptic proteins. These steps are also compatible with biosafety requirements for working with pathogen infected brain tissue. The new high-throughput semi-automated steps to prepare synaptoneurosomes are timely technical advances for studies of low abundance synaptic proteins in valuable tissue samples. Copyright © 2014 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Palmieri, Frank L.; Belcher, Marcus A.; Wohl, Christopher J.; Blohowiak, Kay Y.; Connell, John W.
2013-01-01
Surface preparation is widely recognized as a key step to producing robust and predictable bonds in a precise and reproducible manner. Standard surface preparation techniques, including grit blasting, manual abrasion, and peel ply, can lack precision and reproducibility, which can lead to variation in surface properties and subsequent bonding performance. The use of a laser to ablate composite surface resin can provide an efficient, precise, and reproducible means of preparing composite surfaces for adhesive bonding. Advantages include elimination of physical waste (i.e., grit media and sacrificial peel ply layers that ultimately require disposal), reduction in process variability due to increased precision (e.g., increased reproducibility), and automation of surface preparation, all of which improve reliability and process control. This paper describes a Nd:YAG laser surface preparation technique for composite substrates and the mechanical performance and failure modes of bonded laminates thus prepared. Additionally, bonded specimens were aged in a hot, wet environment for approximately one year and subsequently mechanically tested. The results of this one-year hygrothermal aging study will be presented.
Automating Media Centers and Small Libraries: A Microcomputer-Based Approach.
ERIC Educational Resources Information Center
Meghabghab, Dania Bilal
Although the general automation process can be applied to most libraries, small libraries and media centers require a customized approach. Using a systematic approach, this guide covers each step and aspect of automation in a small library setting, and combines the principles of automation with field-tested activities. After discussing needs…
A Cognitive Systems Engineering Approach to Developing HMI Requirements for New Technologies
NASA Technical Reports Server (NTRS)
Fern, Lisa Carolynn
2016-01-01
This document examines the challenges inherent in designing and regulating to support human-automation interaction for new technologies that will be deployed into complex systems. A key question for new technologies is how work will be accomplished by the human and machine agents. This question has traditionally been framed as how functions should be allocated between humans and machines. Such framing misses the coordination and synchronization that are needed for the different human and machine roles in the system to accomplish their goals. Coordination and synchronization demands are driven by the underlying human-automation architecture of the new technology, which is typically not specified explicitly by the designers. The human-machine interface (HMI), which is intended to facilitate human-machine interaction and cooperation, however, typically is defined explicitly and therefore serves as a proxy for human-automation cooperation requirements with respect to technical standards for technologies. Unfortunately, mismatches between the HMI and the coordination and synchronization demands of the underlying human-automation architecture can lead to system breakdowns. A methodology is needed that both designers and regulators can use to evaluate the expected performance of a new technology given potential human-automation architectures. Three experiments were conducted to inform the minimum HMI requirements for a detect-and-avoid system for unmanned aircraft systems (UAS). The results of the experiments provided empirical input to specific minimum operational performance standards that UAS manufacturers will have to meet in order to operate UAS in the National Airspace System (NAS). These studies represent a success story for how to objectively and systematically evaluate prototype technologies as part of the process for developing regulatory requirements. They also provide an opportunity to reflect on the lessons learned from a recent research effort in order to improve the methodology for defining technology requirements for regulators in the future. The biggest shortcoming of the presented research program was the absence of explicit definition, generation, and analysis of potential human-automation architectures. Failure to execute this step in the research process resulted in less efficient evaluation of the candidate prototype technologies, as well as the complete absence of alternative approaches to human-automation cooperation from the evaluation. For example, all of the prototype technologies that were evaluated in the research program assumed a human-automation architecture that relied on serial processing from the automation to the human. While this type of human-automation architecture is typical across many different technologies and in many different domains, it ignores architectures in which humans and automation work in parallel. Defining potential human-automation architectures a priori also allows regulators to develop scenarios that will stress the performance boundaries of the technology during the evaluation phase. The importance of adding this step of generating and evaluating candidate human-automation architectures prior to formal empirical evaluation is discussed.
Automated Conflict Resolution, Arrival Management and Weather Avoidance for ATM
NASA Technical Reports Server (NTRS)
Erzberger, H.; Lauderdale, Todd A.; Chu, Yung-Cheng
2010-01-01
The paper describes a unified solution to three types of separation assurance problems that occur in en-route airspace: separation conflicts, arrival sequencing, and weather-cell avoidance. Algorithms for solving these problems play a key role in the design of future air traffic management systems such as NextGen. Because these problems can arise simultaneously in any combination, it is necessary to develop integrated algorithms for solving them. A unified and comprehensive solution to these problems provides the foundation for a future air traffic management system that requires a high level of automation in separation assurance. The paper describes the three algorithms developed for solving each problem and then shows how they are used sequentially to solve any combination of these problems. The first algorithm resolves loss-of-separation conflicts and is an evolution of an algorithm described in an earlier paper. The new version generates multiple resolutions for each conflict and then selects the one giving the least delay. Two new algorithms, one for sequencing and merging of arrival traffic, referred to as the Arrival Manager, and the other for weather-cell avoidance, are the major focus of the paper. Because these three problems constitute a substantial fraction of the workload of en-route controllers, integrated algorithms to solve them are a basic requirement for automated separation assurance. The paper also reviews the Advanced Airspace Concept, a proposed design for a ground-based system that postulates redundant systems for separation assurance in order to achieve both high levels of safety and airspace capacity. It is proposed that automated separation assurance be introduced operationally in several steps, each step reducing controller workload further while increasing airspace capacity. A fast time simulation was used to determine performance statistics of the algorithm at up to 3 times current traffic levels.
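As a toy illustration of the "generate several conflict-free resolutions, keep the one with the least delay" selection step described above, the sketch below uses placeholder maneuvers and delay estimates; it is not the NASA algorithm itself.

```python
# Minimal sketch of least-delay resolution selection; the resolution generator
# and the delay model are assumptions made for illustration only.
from dataclasses import dataclass
from typing import List

@dataclass
class Resolution:
    maneuver: str          # e.g. "turn 15 deg right", "climb 1000 ft"
    delay_seconds: float   # estimated added delay for this maneuver

def pick_least_delay(resolutions: List[Resolution]) -> Resolution:
    """Return the conflict-free resolution with the smallest estimated delay."""
    if not resolutions:
        raise ValueError("no conflict-free resolution found")
    return min(resolutions, key=lambda r: r.delay_seconds)

candidates = [
    Resolution("turn 15 deg right", 42.0),
    Resolution("climb 1000 ft", 18.5),
    Resolution("speed reduction 20 kt", 61.0),
]
print(pick_least_delay(candidates).maneuver)  # -> "climb 1000 ft"
```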
Code of Federal Regulations, 2011 CFR
2011-01-01
... than anti-terrorism (AT). The only exception to this requirement would be the return of unwanted... be entered on the invoice and on the bill of lading, air waybill, or other export control document... THE EAR § 732.5 Steps regarding Shipper's Export Declaration or Automated Export System record...
Code of Federal Regulations, 2013 CFR
2013-01-01
... than anti-terrorism (AT). The only exception to this requirement would be the return of unwanted... be entered on the invoice and on the bill of lading, air waybill, or other export control document... THE EAR § 732.5 Steps regarding Shipper's Export Declaration or Automated Export System record...
Code of Federal Regulations, 2012 CFR
2012-01-01
... than anti-terrorism (AT). The only exception to this requirement would be the return of unwanted... be entered on the invoice and on the bill of lading, air waybill, or other export control document... THE EAR § 732.5 Steps regarding Shipper's Export Declaration or Automated Export System record...
Code of Federal Regulations, 2014 CFR
2014-01-01
... than anti-terrorism (AT). The only exception to this requirement would be the return of unwanted... be entered on the invoice and on the bill of lading, air waybill, or other export control document... THE EAR § 732.5 Steps regarding Shipper's Export Declaration or Automated Export System record...
Development of a Production Ready Automated Wire Delivery System
NASA Technical Reports Server (NTRS)
1997-01-01
The current development effort is a Phase 3 research study entitled "A Production Ready Automated Wire Delivery System", contract number NAS8-39933, awarded to Nichols Research Corporation (NRC). The goals of this research study were to production harden the existing Automated Wire Delivery System (AWDS) motion and sensor hardware and test the modified AWDS in a range of welding applications. In addition, the prototype AWDS controller would be moved to the VME bus platform by designing, fabricating and testing a single-board VME bus AWDS controller. This effort was to provide an AWDS that could transition from the laboratory environment to production operations. The project was performed in three development steps. Step 1 modified and tested an improved MWG. Step 2 developed and tested the AWDS single-board VME bus controller. Step 3 installed the Wire Pilot in a Weld Controller with the embedded VME bus controller.
Madduri, Ravi K.; Sulakhe, Dinanath; Lacinski, Lukasz; Liu, Bo; Rodriguez, Alex; Chard, Kyle; Dave, Utpal J.; Foster, Ian T.
2014-01-01
We describe Globus Genomics, a system that we have developed for rapid analysis of large quantities of next-generation sequencing (NGS) genomic data. This system achieves a high degree of end-to-end automation that encompasses every stage of data analysis including initial data retrieval from remote sequencing centers or storage (via the Globus file transfer system); specification, configuration, and reuse of multi-step processing pipelines (via the Galaxy workflow system); creation of custom Amazon Machine Images and on-demand resource acquisition via a specialized elastic provisioner (on Amazon EC2); and efficient scheduling of these pipelines over many processors (via the HTCondor scheduler). The system allows biomedical researchers to perform rapid analysis of large NGS datasets in a fully automated manner, without software installation or a need for any local computing infrastructure. We report performance and cost results for some representative workloads. PMID:25342933
Madduri, Ravi K; Sulakhe, Dinanath; Lacinski, Lukasz; Liu, Bo; Rodriguez, Alex; Chard, Kyle; Dave, Utpal J; Foster, Ian T
2014-09-10
We describe Globus Genomics, a system that we have developed for rapid analysis of large quantities of next-generation sequencing (NGS) genomic data. This system achieves a high degree of end-to-end automation that encompasses every stage of data analysis including initial data retrieval from remote sequencing centers or storage (via the Globus file transfer system); specification, configuration, and reuse of multi-step processing pipelines (via the Galaxy workflow system); creation of custom Amazon Machine Images and on-demand resource acquisition via a specialized elastic provisioner (on Amazon EC2); and efficient scheduling of these pipelines over many processors (via the HTCondor scheduler). The system allows biomedical researchers to perform rapid analysis of large NGS datasets in a fully automated manner, without software installation or a need for any local computing infrastructure. We report performance and cost results for some representative workloads.
High performance, inexpensive solar cell process capable of a high degree of automation
NASA Technical Reports Server (NTRS)
Shah, P.; Fuller, C. R.
1976-01-01
This paper proposes a process for inexpensive high performance solar cell fabrication that can be automated for further cost reduction and higher throughputs. The unique feature of the process is the use of oxides as doping sources for simultaneous n(+) junction formation and back p(+) layer formation, as a mask for metallization, and as an in situ AR coating for spectrum matching. A cost analysis is performed to show that significant cost reductions over the conventional process are possible using the proposed scheme, and the cost-intensive steps are identified, which can be further reduced to make the process compatible with the needed price goals of 50 cents/watt. The process was demonstrated by fabricating n(+)-p cells using arsenic-doped oxides. Simple n(+)-p structure cells showed corrected efficiencies of 14.5% (AMO) and 12% with doped oxide as an in situ antireflection coating.
Automated detection of microcalcification clusters in mammograms
NASA Astrophysics Data System (ADS)
Karale, Vikrant A.; Mukhopadhyay, Sudipta; Singh, Tulika; Khandelwal, Niranjan; Sadhu, Anup
2017-03-01
Mammography is the most efficient modality for detection of breast cancer at an early stage. Microcalcifications are tiny bright spots in mammograms and can often get missed by the radiologist during diagnosis. The presence of microcalcification clusters in mammograms can act as an early sign of breast cancer. This paper presents a completely automated computer-aided detection (CAD) system for detection of microcalcification clusters in mammograms. Unsharp masking is used as a preprocessing step, which enhances the contrast between microcalcifications and the background. The preprocessed image is thresholded and various shape and intensity based features are extracted. A support vector machine (SVM) classifier is used to reduce the false positives while preserving the true microcalcification clusters. The proposed technique is applied to two different databases, i.e., the DDSM database and a private database. The proposed technique shows good sensitivity with moderate false positives (FPs) per image on both databases.
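A rough sketch of the pipeline described above (unsharp masking, thresholding, shape and intensity features, SVM false-positive reduction) is given below; the parameter values and the feature set are illustrative assumptions rather than the authors' settings.

```python
# Sketch of a microcalcification-candidate pipeline; thresholds, radii and
# features are assumptions, not values from the paper.
import numpy as np
from skimage.filters import unsharp_mask
from skimage.measure import label, regionprops
from sklearn.svm import SVC

def candidate_features(mammogram: np.ndarray, threshold: float = 0.8):
    """Enhance contrast, threshold, and describe each bright candidate region."""
    enhanced = unsharp_mask(mammogram, radius=5, amount=2.0)
    binary = enhanced > threshold * enhanced.max()
    feats, centers = [], []
    for region in regionprops(label(binary), intensity_image=enhanced):
        feats.append([region.area, region.eccentricity,
                      region.mean_intensity, region.max_intensity])
        centers.append(region.centroid)
    return np.array(feats), centers

# With annotated candidates (1 = true microcalcification, 0 = false positive):
# clf = SVC(kernel="rbf").fit(X_train, y_train)
# keep = clf.predict(candidate_feats) == 1
```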
Zhang, Yan; Zhang, Ting; Feng, Yanye; Lu, Xiuxiu; Lan, Wenxian; Wang, Jufang; Wu, Houming; Cao, Chunyang; Wang, Xiaoning
2011-01-01
The production of recombinant proteins on a large scale is important for protein functional and structural studies, particularly by using Escherichia coli over-expression systems; however, approximately 70% of recombinant proteins are over-expressed as insoluble inclusion bodies. Here we presented an efficient method for generating soluble proteins from inclusion bodies by using two steps of denaturation and one step of refolding. We first demonstrated the advantages of this method over a conventional procedure with one denaturation step and one refolding step using three proteins with different folding properties. The refolded proteins were found to be active using in vitro tests and a bioassay. We then tested the general applicability of this method by analyzing 88 proteins from human and other organisms, all of which were expressed as inclusion bodies. We found that about 76% of these proteins were refolded with an average of >75% yield of soluble proteins. This "two-step-denaturing and refolding" (2DR) method is simple, highly efficient and generally applicable; it can be utilized to obtain active recombinant proteins for both basic research and industrial purposes. PMID:21829569
Piccinelli, Marina; Faber, Tracy L; Arepalli, Chesnal D; Appia, Vikram; Vinten-Johansen, Jakob; Schmarkey, Susan L; Folks, Russell D; Garcia, Ernest V; Yezzi, Anthony
2014-02-01
Accurate alignment between cardiac CT angiographic studies (CTA) and nuclear perfusion images is crucial for improved diagnosis of coronary artery disease. This study evaluated in an animal model the accuracy of a CTA fully automated biventricular segmentation algorithm, a necessary step for automatic and thus efficient PET/CT alignment. Twelve pigs with acute infarcts were imaged using Rb-82 PET and 64-slice CTA. Post-mortem myocardium mass measurements were obtained. Endocardial and epicardial myocardial boundaries were manually and automatically detected on the CTA and both segmentations used to perform PET/CT alignment. To assess the segmentation performance, image-based myocardial masses were compared to experimental data; the hand-traced profiles were used as a reference standard to assess the global and slice-by-slice robustness of the automated algorithm in extracting myocardium, LV, and RV. Mean distances between the automated and the manual 3D segmented surfaces were computed. Finally, differences in rotations and translations between the manual and automatic surfaces were estimated post-PET/CT alignment. The largest, smallest, and median distances between interactive and automatic surfaces averaged 1.2 ± 2.1, 0.2 ± 1.6, and 0.7 ± 1.9 mm. The average angular and translational differences in CT/PET alignments were 0.4°, -0.6°, and -2.3° about x, y, and z axes, and 1.8, -2.1, and 2.0 mm in x, y, and z directions. Our automatic myocardial boundary detection algorithm creates surfaces from CTA that are similar in accuracy and provide similar alignments with PET as those obtained from interactive tracing. Specific difficulties in a reliable segmentation of the apex and base regions will require further improvements in the automated technique.
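The distance figures quoted above (largest, smallest, and median distance between automatically and manually segmented surfaces) can be illustrated with a generic nearest-neighbour surface-distance computation; the sketch below assumes each surface is an (N, 3) array of points in millimetres and is not the authors' exact metric.

```python
# Generic surface-to-surface distance sketch (assumed point-cloud inputs in mm).
import numpy as np
from scipy.spatial import cKDTree

def surface_distances(auto_pts: np.ndarray, manual_pts: np.ndarray):
    """Distance from every automatic surface point to the closest manual point."""
    d, _ = cKDTree(manual_pts).query(auto_pts)
    return d.max(), d.min(), np.median(d)

# Example with synthetic point clouds standing in for segmented surfaces:
rng = np.random.default_rng(0)
auto = rng.normal(size=(500, 3))
manual = auto + rng.normal(scale=0.5, size=(500, 3))
print(surface_distances(auto, manual))
```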
Liston, Adam D; De Munck, Jan C; Hamandi, Khalid; Laufs, Helmut; Ossenblok, Pauly; Duncan, John S; Lemieux, Louis
2006-07-01
Simultaneous acquisition of EEG and fMRI data enables the investigation of the hemodynamic correlates of interictal epileptiform discharges (IEDs) during the resting state in patients with epilepsy. This paper addresses two issues: (1) the semi-automation of IED classification in statistical modelling for fMRI analysis and (2) the improvement of IED detection to increase experimental fMRI efficiency. For patients with multiple IED generators, sensitivity to IED-correlated BOLD signal changes can be improved when the fMRI analysis model distinguishes between IEDs of differing morphology and field. In an attempt to reduce the subjectivity of visual IED classification, we implemented a semi-automated system, based on the spatio-temporal clustering of EEG events. We illustrate the technique's usefulness using EEG-fMRI data from a subject with focal epilepsy in whom 202 IEDs were visually identified and then clustered semi-automatically into four clusters. Each cluster of IEDs was modelled separately for the purpose of fMRI analysis. This revealed IED-correlated BOLD activations in distinct regions corresponding to three different IED categories. In a second step, Signal Space Projection (SSP) was used to project the scalp EEG onto the dipoles corresponding to each IED cluster. This resulted in 123 previously unrecognised IEDs, the inclusion of which, in the General Linear Model (GLM), increased the experimental efficiency as reflected by significant BOLD activations. We have also shown that the detection of extra IEDs is robust in the face of fluctuations in the set of visually detected IEDs. We conclude that automated IED classification can result in more objective fMRI models of IEDs and significantly increased sensitivity.
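The abstract describes clustering 202 visually marked IEDs into four clusters by their morphology and field; as a loose stand-in for that step, the sketch below clusters per-event feature vectors (for example, the scalp voltage map at the IED peak) with k-means. The feature choice and the clustering method are assumptions, not the published procedure.

```python
# Toy spatio-temporal clustering of IED events; the feature vectors here are
# random placeholders for per-event scalp maps.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

n_events, n_channels = 202, 32
rng = np.random.default_rng(1)
peak_maps = rng.normal(size=(n_events, n_channels))   # placeholder EEG features

labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(
    StandardScaler().fit_transform(peak_maps)
)
# Each cluster would then be modelled as a separate regressor in the fMRI GLM.
print(np.bincount(labels))
```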
McDonald, Sandra A; Ryan, Benjamin J; Brink, Amy; Holtschlag, Victoria L
2012-02-01
Informatics systems, particularly those that provide capabilities for data storage, specimen tracking, retrieval, and order fulfillment, are critical to the success of biorepositories and other laboratories engaged in translational medical research. A crucial item-one easily overlooked-is an efficient way to receive and process investigator-initiated requests. A successful electronic ordering system should allow request processing in a maximally efficient manner, while also allowing streamlined tracking and mining of request data such as turnaround times and numerical categorizations (user groups, funding sources, protocols, and so on). Ideally, an electronic ordering system also facilitates the initial contact between the laboratory and customers, while still allowing for downstream communications and other steps toward scientific partnerships. We describe here the recently established Web-based ordering system for the biorepository at Washington University Medical Center, along with its benefits for workflow, tracking, and customer service. Because of the system's numerous value-added impacts, we think our experience can serve as a good model for other customer-focused biorepositories, especially those currently using manual or non-Web-based request systems. Our lessons learned also apply to the informatics developers who serve such biobanks.
Ryan, Benjamin J.; Brink, Amy; Holtschlag, Victoria L.
2012-01-01
Informatics systems, particularly those that provide capabilities for data storage, specimen tracking, retrieval, and order fulfillment, are critical to the success of biorepositories and other laboratories engaged in translational medical research. A crucial item—one easily overlooked—is an efficient way to receive and process investigator-initiated requests. A successful electronic ordering system should allow request processing in a maximally efficient manner, while also allowing streamlined tracking and mining of request data such as turnaround times and numerical categorizations (user groups, funding sources, protocols, and so on). Ideally, an electronic ordering system also facilitates the initial contact between the laboratory and customers, while still allowing for downstream communications and other steps toward scientific partnerships. We describe here the recently established Web-based ordering system for the biorepository at Washington University Medical Center, along with its benefits for workflow, tracking, and customer service. Because of the system's numerous value-added impacts, we think our experience can serve as a good model for other customer-focused biorepositories, especially those currently using manual or non-Web–based request systems. Our lessons learned also apply to the informatics developers who serve such biobanks. PMID:23386921
Qiu, Ying-Kun; Chen, Fang-Fang; Zhang, Ling-Ling; Yan, Xia; Chen, Lin; Fang, Mei-Juan; Wu, Zhen
2014-04-11
An on-line comprehensive two-dimensional preparative liquid chromatography system was developed for preparative separation of minor components from complicated natural products. Medium-pressure liquid chromatography (MPLC) was applied as the first dimension and preparative HPLC as the second, in conjunction with a trapping column and a makeup pump. The performance of the trapping column was evaluated in terms of column size, dilution ratio and diameter-height ratio, as well as system pressure from the MPLC side. Satisfactory trapping efficiency can be achieved using a commercially available 15 mm × 30 mm i.d. ODS pre-column. The instrument operation and the performance of this MPLC×preparative HPLC system were illustrated by gram-scale isolation of a crude macro-porous resin enriched water extract of Rheum hotaoense. Automated multi-step preparative separation of 25 compounds, whose structures were identified by MS, (1)H NMR and even by less-sensitive (13)C NMR, could be achieved in a short period of time using this system, exhibiting great advantages in analytical efficiency and sample treatment capacity compared with conventional methods. Copyright © 2014 Elsevier B.V. All rights reserved.
Reliability Analysis of Large Commercial Vessel Engine Room Automation Systems. Volume 1. Results
1982-11-01
analyzing the engine room automation systems on two steam vessels and one diesel vessel, conducting a criticality evaluation, preparing...of automated engine room systems, the effect of maintenance was also to be considered, as was the human interface and backup. Besides being...designed to replace the human element, the systems perform more efficiently than the human watchstander. But as with any system, there is no such thing as
Summary of the industry/NASA/FAA workshop on philosophy of automation: Promises and realities
NASA Technical Reports Server (NTRS)
Norman, Susan D.
1990-01-01
Issues of flight deck automation are multi-faceted and complex. The rapid introduction of advanced computer-based technology onto the flight deck of transport category aircraft has had considerable impact on both aircraft operations and the flight crew. As part of NASA's responsibility to facilitate an active exchange of ideas and information between members of the aviation community, an Industry/NASA/FAA workshop was conducted in August 1988. One of the most important conclusions to emerge from the workshop was that the introduction of automation has clearly benefited aviation and has substantially improved the operational safety and efficiency of our air transport system. For example, one carrier stated that they have been flying the Boeing 767 (one of the first aircraft to employ substantial automation) since 1982, and they have never had an accident or incident resulting in damage to the aircraft. Notwithstanding its benefits, many issues associated with the design, certification, and operation of automated aircraft were identified. For example, two key conceptual issues were the need for the crew to have a thorough understanding of the system and the importance of defining the pilot's role. With respect to certification, a fundamental issue is the lack of comprehensive human factors requirements in the current regulations. Operational considerations, which have been a factor in incidents involving automation, were also cited. Viewgraphs used in the presentation are given.
2011-01-01
Background Machine learning has a vast range of applications. In particular, advanced machine learning methods are routinely and increasingly used in quantitative structure activity relationship (QSAR) modeling. QSAR data sets often encompass tens of thousands of compounds and the size of proprietary, as well as public data sets, is rapidly growing. Hence, there is a demand for computationally efficient machine learning algorithms, easily available to researchers without extensive machine learning knowledge. In granting the scientific principles of transparency and reproducibility, Open Source solutions are increasingly acknowledged by regulatory authorities. Thus, an Open Source state-of-the-art high performance machine learning platform, interfacing multiple, customized machine learning algorithms for both graphical programming and scripting, to be used for large scale development of QSAR models of regulatory quality, is of great value to the QSAR community. Results This paper describes the implementation of the Open Source machine learning package AZOrange. AZOrange is specially developed to support batch generation of QSAR models in providing the full work flow of QSAR modeling, from descriptor calculation to automated model building, validation and selection. The automated work flow relies upon the customization of the machine learning algorithms and a generalized, automated model hyper-parameter selection process. Several high performance machine learning algorithms are interfaced for efficient data set specific selection of the statistical method, promoting model accuracy. Using the high performance machine learning algorithms of AZOrange does not require programming knowledge as flexible applications can be created, not only at a scripting level, but also in a graphical programming environment. Conclusions AZOrange is a step towards meeting the needs for an Open Source high performance machine learning platform, supporting the efficient development of highly accurate QSAR models fulfilling regulatory requirements. PMID:21798025
Stålring, Jonna C; Carlsson, Lars A; Almeida, Pedro; Boyer, Scott
2011-07-28
Machine learning has a vast range of applications. In particular, advanced machine learning methods are routinely and increasingly used in quantitative structure activity relationship (QSAR) modeling. QSAR data sets often encompass tens of thousands of compounds and the size of proprietary, as well as public data sets, is rapidly growing. Hence, there is a demand for computationally efficient machine learning algorithms, easily available to researchers without extensive machine learning knowledge. In granting the scientific principles of transparency and reproducibility, Open Source solutions are increasingly acknowledged by regulatory authorities. Thus, an Open Source state-of-the-art high performance machine learning platform, interfacing multiple, customized machine learning algorithms for both graphical programming and scripting, to be used for large scale development of QSAR models of regulatory quality, is of great value to the QSAR community. This paper describes the implementation of the Open Source machine learning package AZOrange. AZOrange is specially developed to support batch generation of QSAR models in providing the full work flow of QSAR modeling, from descriptor calculation to automated model building, validation and selection. The automated work flow relies upon the customization of the machine learning algorithms and a generalized, automated model hyper-parameter selection process. Several high performance machine learning algorithms are interfaced for efficient data set specific selection of the statistical method, promoting model accuracy. Using the high performance machine learning algorithms of AZOrange does not require programming knowledge as flexible applications can be created, not only at a scripting level, but also in a graphical programming environment. AZOrange is a step towards meeting the needs for an Open Source high performance machine learning platform, supporting the efficient development of highly accurate QSAR models fulfilling regulatory requirements.
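AZOrange's own API is not reproduced here; the sketch below only illustrates the general idea of automated, data-set-specific model and hyper-parameter selection that the abstract describes, written with scikit-learn for convenience.

```python
# Generic automated model/hyper-parameter selection sketch for a QSAR-style
# regression task; the candidate learners and grids are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV

def select_qsar_model(X: np.ndarray, y: np.ndarray):
    """Try several learners with small hyper-parameter grids and keep the best."""
    candidates = {
        "random_forest": (RandomForestRegressor(random_state=0),
                          {"n_estimators": [100, 300], "max_depth": [None, 10]}),
        "svr": (SVR(), {"C": [1.0, 10.0], "gamma": ["scale", 0.01]}),
    }
    best_name, best_search = None, None
    for name, (estimator, grid) in candidates.items():
        search = GridSearchCV(estimator, grid, cv=5).fit(X, y)
        if best_search is None or search.best_score_ > best_search.best_score_:
            best_name, best_search = name, search
    return best_name, best_search.best_estimator_
```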
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aykac, Deniz; Chaum, Edward; Fox, Karen
A telemedicine network with retina cameras and automated quality control, physiological feature location, and lesion/anomaly detection is a low-cost way of achieving broad-based screening for diabetic retinopathy (DR) and other eye diseases. In the process of a routine eye-screening examination, other non-image data is often available which may be useful in automated diagnosis of disease. In this work, we report on the results of combining this non-image data with image data, using the protocol and processing steps of a prototype system for automated disease diagnosis of retina examinations from a telemedicine network. The system includes quality assessments, automated physiology detection, and automated lesion detection to create an archive of known cases. Non-image data such as diabetes onset date and hemoglobin A1c (HgA1c) for each patient examination are included as well, and the system is used to create a content-based image retrieval engine capable of automated diagnosis of disease into 'normal' and 'abnormal' categories. The system achieves a sensitivity and specificity of 91.2% and 71.6% using hold-one-out validation testing.
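The evaluation described above, combining image-derived lesion features with non-image fields and scoring a normal/abnormal classifier by hold-one-out validation, can be sketched as below; the feature names and the classifier are illustrative assumptions.

```python
# Leave-one-out sensitivity/specificity for a combined image + non-image
# feature matrix; the logistic-regression classifier is an assumption.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.metrics import confusion_matrix

def sensitivity_specificity(X: np.ndarray, y: np.ndarray):
    """y: 1 = abnormal, 0 = normal; X: concatenated image + non-image features."""
    pred = cross_val_predict(LogisticRegression(max_iter=1000), X, y,
                             cv=LeaveOneOut())
    tn, fp, fn, tp = confusion_matrix(y, pred).ravel()
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical usage, with lesion_features from the image pipeline:
# X = np.hstack([lesion_features, np.column_stack([years_diabetic, hba1c])])
# sens, spec = sensitivity_specificity(X, labels)
```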
NASA Technical Reports Server (NTRS)
Campbell, B. H.
1974-01-01
A methodology that was developed for the balanced design of spacecraft subsystems and that interrelates cost, performance, safety, and schedule considerations was refined. The methodology consists of a two-step process: the first step is one of selecting all hardware designs which satisfy the given performance and safety requirements; the second step is one of estimating the cost and schedule required to design, build, and operate each spacecraft design. Using this methodology to develop a systems cost/performance model allows the user of such a model to establish specific designs and the related costs and schedule. The user is able to determine the sensitivity of design, costs, and schedules to changes in requirements. The resulting systems cost/performance model is described and implemented as a digital computer program.
Zhang, Yaohong; Wu, Guohua; Ding, Chao; Liu, Feng; Yao, Yingfang; Zhou, Yong; Wu, Congping; Nakazawa, Naoki; Huang, Qingxun; Toyoda, Taro; Wang, Ruixiang; Hayase, Shuzi; Zou, Zhigang; Shen, Qing
2018-06-18
Lead selenide (PbSe) colloidal quantum dots (CQDs) are considered to be a strong candidate for high-efficiency colloidal quantum dot solar cells (CQDSCs) due to their efficient multiple exciton generation. However, currently, even the best PbSe CQDSCs can only display an open-circuit voltage (Voc) of about 0.530 V. Here, we introduce a solution-phase ligand exchange method to prepare PbI2-capped PbSe (PbSe-PbI2) CQD inks, and for the first time, the absorber layer of PbSe CQDSCs was deposited in one step by using these PbSe-PbI2 CQD inks. The one-step-deposited PbSe CQD absorber layer exhibits a fast charge transfer rate, reduced energy funneling, and low trap-assisted recombination. The champion large-area (active area of 0.35 cm2) PbSe CQDSCs fabricated with one-step PbSe CQDs achieve a power conversion efficiency (PCE) of 6.0% and a Voc of 0.616 V, which is the highest Voc among PbSe CQDSCs reported to date.
Schneidereit, Dominik; Kraus, Larissa; Meier, Jochen C; Friedrich, Oliver; Gilbert, Daniel F
2017-06-15
High-content screening microscopy relies on automation infrastructure that is typically proprietary, non-customizable, costly and requires a high level of skill to use and maintain. The increasing availability of rapid prototyping technology makes it possible to quickly engineer alternatives to conventional automation infrastructure that are low-cost and user-friendly. Here, we describe a 3D printed, inexpensive, open source and scalable motorized positioning stage for automated high-content screening microscopy and provide detailed step-by-step instructions for re-building the device, including a comprehensive parts list, 3D design files in STEP (Standard for the Exchange of Product model data) and STL (Standard Tessellation Language) format, electronic circuits and wiring diagrams as well as software code. System assembly including 3D printing requires approx. 30 h. The fully assembled device is light-weight (1.1 kg), small (33 × 20 × 8 cm) and extremely low-cost (approx. EUR 250). We describe positioning characteristics of the stage, including spatial resolution, accuracy and repeatability, compare imaging data generated with our device to data obtained using a commercially available microplate reader, demonstrate its suitability to high-content microscopy in 96-well high-throughput screening format and validate its applicability to automated functional Cl⁻ and Ca²⁺ imaging with recombinant HEK293 cells as a model system. A time-lapse video of the stage during operation and as part of a custom assembled screening robot can be found at https://vimeo.com/158813199. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
Automated detection and analysis of particle beams in laser-plasma accelerator simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ushizima, Daniela Mayumi; Geddes, C.G.; Cormier-Michel, E.
Numerical simulations of laser-plasma wakefield (particle) accelerators model the acceleration of electrons trapped in plasma oscillations (wakes) left behind when an intense laser pulse propagates through the plasma. The goal of these simulations is to better understand the process involved in plasma wake generation and how electrons are trapped and accelerated by the wake. Understanding of such accelerators, and their development, offer high accelerating gradients, potentially reducing size and cost of new accelerators. One operating regime of interest is where a trapped subset of electrons loads the wake and forms an isolated group of accelerated particles with low spread in momentum and position, desirable characteristics for many applications. The electrons trapped in the wake may be accelerated to high energies, the plasma gradient in the wake reaching up to a gigaelectronvolt per centimeter. High-energy electron accelerators power intense X-ray to terahertz radiation sources, and are used in many applications including medical radiotherapy and imaging. To extract information from the simulation about the quality of the beam, a typical approach is to examine plots of the entire dataset, visually determining the adequate parameters necessary to select a subset of particles, which is then further analyzed. This procedure requires laborious examination of massive data sets over many time steps using several plots, a routine that is unfeasible for large data collections. Demand for automated analysis is growing along with the volume and size of simulations. Current 2D LWFA simulation datasets are typically between 1 GB and 100 GB in size, but simulations in 3D are of the order of TBs. The increase in the number of datasets and dataset sizes leads to a need for automatic routines to recognize particle patterns as particle bunches (beams of electrons) for subsequent analysis. Because of the growth in dataset size, the application of machine learning techniques for scientific data mining is increasingly considered. In plasma simulations, Bagherjeiran et al. presented a comprehensive report on applying graph-based techniques for orbit classification. They used the KAM classifier to label points and components in single and multiple orbits. Love et al. conducted an image space analysis of coherent structures in plasma simulations. They used a number of segmentation and region-growing techniques to isolate regions of interest in orbit plots. Both approaches analyzed particle accelerator data, targeting the system dynamics in terms of particle orbits. However, they did not address particle dynamics as a function of time or inspect the behavior of bunches of particles. Ruebel et al. addressed the visual analysis of massive laser wakefield acceleration (LWFA) simulation data using interactive procedures to query the data. Sophisticated visualization tools were provided to inspect the data manually. Ruebel et al. have integrated these tools into the visualization and analysis system VisIt, in addition to utilizing efficient data management based on HDF5, H5Part, and the index/query tool FastBit. Ruebel et al. further proposed automatic beam path analysis using a suite of methods to classify particles in simulation data and to analyze their temporal evolution. To enable researchers to accurately define particle beams, the method computes a set of measures based on the path of particles relative to the distance of the particles to a beam.
To achieve good performance, this framework uses an analysis pipeline designed to quickly reduce the amount of data that needs to be considered in the actual path distance computation. As part of this process, region-growing methods are utilized to detect particle bunches at single time steps. Efficient data reduction is essential to enable automated analysis of large data sets as described in the next section, where data reduction methods are steered to the particular requirements of our clustering analysis. Previously, we have described the application of a set of algorithms to automate the data analysis and classification of particle beams in the LWFA simulation data, identifying locations with a high density of high-energy particles. These algorithms detected high-density locations (nodes) in each time step, i.e., maximum points on the particle distribution for only one spatial variable. Each node was correlated to a node in previous or later time steps by linking these nodes according to a pruned minimum spanning tree (PMST). We call the PMST representation a 'lifetime diagram', which is a graphical tool to show temporal information of high-density groups of particles in the longitudinal direction for the time series. Electron bunch compactness was described by another step of the processing, designed to partition each time step, using fuzzy clustering, into a fixed number of clusters.
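A simplified stand-in for the per-time-step bunch detection and linking described above is sketched below: density maxima along the longitudinal coordinate are found at each time step and nearby maxima are greedily linked across consecutive steps to form lifetime tracks. The published method uses a pruned minimum spanning tree and fuzzy clustering; this sketch only conveys the overall idea.

```python
# Toy high-density node detection per time step plus greedy cross-step linking;
# bin counts, thresholds and the linking rule are assumptions.
import numpy as np
from scipy.signal import find_peaks

def density_nodes(x_positions: np.ndarray, bins: int = 200, min_count: int = 50):
    """Return longitudinal positions of high-density particle groups."""
    counts, edges = np.histogram(x_positions, bins=bins)
    peaks, _ = find_peaks(counts, height=min_count)
    return 0.5 * (edges[peaks] + edges[peaks + 1])

def link_nodes(nodes_per_step, max_jump: float):
    """Greedy linking of nodes between consecutive time steps into tracks."""
    tracks = [[n] for n in nodes_per_step[0]]
    for nodes in nodes_per_step[1:]:
        for track in tracks:
            if len(nodes) == 0:
                continue
            j = int(np.argmin(np.abs(nodes - track[-1])))
            if abs(nodes[j] - track[-1]) <= max_jump:
                track.append(nodes[j])
    return tracks
```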
ClinicalTrials.gov as a data source for semi-automated point-of-care trial eligibility screening.
Pfiffner, Pascal B; Oh, JiWon; Miller, Timothy A; Mandl, Kenneth D
2014-01-01
Implementing semi-automated processes to efficiently match patients to clinical trials at the point of care requires both detailed patient data and authoritative information about open studies. To evaluate the utility of the ClinicalTrials.gov registry as a data source for semi-automated trial eligibility screening, eligibility criteria and metadata for 437 trials open for recruitment in four different clinical domains were identified in ClinicalTrials.gov. Trials were evaluated for up-to-date recruitment status, and eligibility criteria were evaluated for obstacles to automated interpretation. Finally, phone or email outreach to coordinators at a subset of the trials was made to assess the accuracy of contact details and recruitment status. 24% (104 of 437) of trials declaring an open recruitment status list a study completion date in the past, indicating out-of-date records. Substantial barriers to automated eligibility interpretation in free-form text are present in 81% to 94% of all trials. We were unable to contact coordinators at 31% (45 of 146) of the trials in the subset, either by phone or by email. Only 53% (74 of 146) would confirm that they were still recruiting patients. Because ClinicalTrials.gov has entries on most US and many international trials, the registry could be repurposed as a comprehensive trial matching data source. Semi-automated point-of-care recruitment would be facilitated by matching the registry's eligibility criteria against clinical data from electronic health records, but the current entries fall short. Ultimately, improved techniques in natural language processing will facilitate semi-automated complex matching. As immediate next steps, we recommend augmenting ClinicalTrials.gov data entry forms to capture key eligibility criteria in a simple, structured format.
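As a small illustration of the kind of structuring a matching engine needs from free-text eligibility criteria, the sketch below pulls an explicit age range out of a criteria string with a regular expression; real registry criteria are far messier than this, and the pattern is an assumption for demonstration only.

```python
# Toy extraction of a structured age criterion from free-text eligibility.
import re

AGE_RANGE = re.compile(
    r"(?:ages?\s+)?(\d{1,3})\s*(?:years?)?\s*(?:to|-|through)\s*(\d{1,3})\s*years?",
    re.IGNORECASE,
)

def extract_age_range(criteria_text: str):
    """Return (min_age, max_age) if an explicit range is found, else None."""
    m = AGE_RANGE.search(criteria_text)
    return (int(m.group(1)), int(m.group(2))) if m else None

print(extract_age_range("Inclusion: adults aged 18 to 75 years with type 2 diabetes"))
# -> (18, 75)
```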
Trahearn, Nicholas; Tsang, Yee Wah; Cree, Ian A; Snead, David; Epstein, David; Rajpoot, Nasir
2017-06-01
Automation of downstream analysis may offer many potential benefits to routine histopathology. One area of interest for automation is in the scoring of multiple immunohistochemical markers to predict the patient's response to targeted therapies. Automated serial slide analysis of this kind requires robust registration to identify common tissue regions across sections. We present an automated method for co-localized scoring of Estrogen Receptor and Progesterone Receptor (ER/PR) in breast cancer core biopsies using whole slide images. Regions of tumor in a series of fifty consecutive breast core biopsies were identified by annotation on H&E whole slide images. Sequentially cut immunohistochemical stained sections were scored manually, before being digitally scanned and then exported into JPEG 2000 format. A two-stage registration process was performed to identify the annotated regions of interest in the immunohistochemistry sections, which were then scored using the Allred system. Overall correlation between manual and automated scoring for ER and PR was 0.944 and 0.883, respectively, with 90% of ER and 80% of PR scores within one point or less of agreement. This proof of principle study indicates slide registration can be used as a basis for automation of the downstream analysis for clinically relevant biomarkers in the majority of cases. The approach is likely to be improved by implementation of safeguarding analysis steps post registration. © 2016 International Society for Advancement of Cytometry.
ERIC Educational Resources Information Center
Hull, Daniel M.; Lovett, James E.
This volume of the final report for the Robotics/Automated Systems Technician (RAST) curriculum project is a curriculum planning guide intended for school administrators, faculty, and student counselors/advisors. It includes step-by-step procedures to help institutions evaluate their community's needs and their capabilities to meet these needs in…
Watanabe, Kae; Lopez-Colon, Dalia; Shuster, Jonathan J; Philip, Joseph
2017-03-01
The American Heart Association (AHA) advocates for CPR education as a requirement of secondary school curriculum. Unfortunately, many states have not adopted CPR education. Our aim was to investigate a low-cost, time-effective method to educate students on Basic Life Support (BLS), including reeducation. This is a prospective, randomized study. Retention was assessed at 4 months post-initial education. Education was performed by AHA-certified providers during a 45-minute physical education class in a middle school in Florida. This age provides opportunities for reinforcement through high school, with ability for efficient learning. The study included 41 Eighth grade students. Students were randomized into two groups; one group received repeat education 2 months after the first education, the second group did not. All students received BLS education limited to chest compressions and usage of an Automated External Defibrillator. Students had skills and knowledge tests administered pre- and post-education after initial education, and repeated 2 and 4 months later to assess retention. There was a significant increase in CPR skills and knowledge when comparing pre- and post-education results for all time-points (p < 0.001). When assessing reeducation, a significant improvement was noted in total knowledge scores but not during the actual steps of CPR. Our study indicates significant increase in CPR knowledge and skills following a one-time 45-minute session. Reeducation may be useful, but the interval needs further investigation. If schools across the United States invested one 45-60-minute period every school year, this would ensure widespread CPR knowledge with minimal cost and loss of school time.
Fleet Sizing of Automated Material Handling Using Simulation Approach
NASA Astrophysics Data System (ADS)
Wibisono, Radinal; Ai, The Jin; Ratna Yuniartha, Deny
2018-03-01
Automated material handling tends to be chosen over human power for material handling activity on the production floor in manufacturing companies. One critical issue in implementing automated material handling is the design phase, which must ensure that the material handling activity is more efficient in terms of cost. Fleet sizing is one of the topics in this design phase. In this research, a simulation approach is used to solve the fleet sizing problem in flow shop production and to ensure an optimum situation. An optimum situation in this research means minimum flow time and maximum capacity on the production floor. A simulation approach is used because the flow shop can be modelled as a queuing network and the inter-arrival times do not follow an exponential distribution. Therefore, the contribution of this research is solving the multi-objective fleet sizing problem in flow shop production using a simulation approach with the ARENA software.
On the Automation of the MarkIII Data Analysis System.
NASA Astrophysics Data System (ADS)
Schwegmann, W.; Schuh, H.
1999-03-01
A faster and semiautomatic data analysis is an important contribution to the acceleration of the VLBI procedure. A concept for the automation of one of the most widely used VLBI software packages, the MarkIII Data Analysis System, was developed. Then, the program PWXCB, which extracts weather and cable calibration data from the station log-files, was automated by supplementing the existing Fortran77 program code. The new program XLOG and its results will be presented. Most of the tasks in the VLBI data analysis are very complex and their automation requires typical knowledge-based techniques. Thus, a knowledge-based system (KBS) for support and guidance of the analyst is being developed using the AI workbench BABYLON, which is based on methods of artificial intelligence (AI). The advantages of a KBS for the MarkIII Data Analysis System and the required steps to build a KBS will be demonstrated. Examples of the current status of the project will be given, too.
Comparison of automated and manual vital sign collection at hospital wards.
Wood, Jeffrey; Finkelstein, Joseph
2013-01-01
Using a cross-over study design, vital signs were collected from 60 patients by 6 nurses. Each nurse was randomly assigned to manual vital sign collection in 5 patients and to automated data collection in another 5 patients. The mean time taken for vital signs information to become available in the EMR was significantly (p < 0.004) lower after automated data collection (158.7±67.0 s) than after manual collection (4079.8±7091.8 s). The nursing satisfaction score for collecting vital signs was significantly lower (p < 0.007) for the manual method (10.3±3.9) than for the automated method (16.5±3.4). We found that 30% of vital sign records were transmitted to the EMR with at least one error after manual data collection, whereas there was no transmission error with automated data collection. All participating nurses stated that automated vital sign collection can improve their efficiency and save their time for direct patient care.
Yousef Kalafi, Elham; Town, Christopher; Kaur Dhillon, Sarinder
2017-09-04
Identification of taxonomy at a specific level is time consuming and reliant upon expert ecologists. Hence the demand for automated species identification has increased over the last two decades. Automation of data classification is primarily focussed on images; incorporating and analysing image data has recently become easier due to developments in computational technology. Research efforts in identification of species include processing of specimen images and extraction of identifying features, followed by classifying them into the correct categories. In this paper, we discuss recent automated species identification systems, categorizing and evaluating their methods. We reviewed and compared different methods in a step-by-step scheme of automated identification and classification systems of species images. The selection of methods is influenced by many variables such as the level of classification, the number of training data and the complexity of the images. The aim of this paper is to provide researchers and scientists an extensive background study on work related to automated species identification, focusing on pattern recognition techniques in building such systems for biodiversity studies.
Advantages and challenges in automated apatite fission track counting
NASA Astrophysics Data System (ADS)
Enkelmann, E.; Ehlers, T. A.
2012-04-01
Fission track thermochronometer data are often a core element of modern tectonic and denudation studies. Soon after the development of the fission track methods, interest emerged in developing an automated counting procedure to replace the time-consuming labor of counting fission tracks under the microscope. Automated track counting became feasible in recent years with increasing improvements in computer software and hardware. One such example used in this study is the commercial automated fission track counting procedure from Autoscan Systems Pty, which has been highlighted through several venues. We conducted experiments designed to reliably and consistently test the ability of this fully automated counting system to recognize fission tracks in apatite and a muscovite external detector. Fission tracks were analyzed in samples with a step-wise increase in sample complexity. The first set of experiments used a large (mm-size) slice of Durango apatite cut parallel to the prism plane. Second, samples with 80-200 μm large apatite grains of Fish Canyon Tuff were analyzed. This second sample set is characterized by complexities often found in apatites in different rock types. In addition to the automated counting procedure, the same samples were also analyzed using conventional counting procedures. We found for all samples that the fully automated fission track counting procedure using the Autoscan System yields a larger scatter in the measured fission track densities compared to conventional (manual) track counting. This scatter typically resulted from the false identification of tracks due to surface and mineralogical defects, regardless of the image filtering procedure used. Large differences between track densities analyzed with the automated counting persisted between different grains analyzed in one sample as well as between different samples. As a result of these differences, a manual correction of the fully automated fission track counts is necessary for each individual surface area and grain counted. This manual correction procedure significantly increases (up to four times) the time required to analyze a sample with the automated counting procedure compared to the conventional approach.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lekov, Alex; Thompson, Lisa; McKane, Aimee
This report summarizes the Lawrence Berkeley National Laboratory's research to date in characterizing energy efficiency and automated demand response opportunities for wastewater treatment facilities in California. The report describes the characteristics of wastewater treatment facilities, the nature of the wastewater stream, energy use and demand, as well as details of the wastewater treatment process. It also discusses control systems and energy efficiency and automated demand response opportunities. In addition, several energy efficiency and load management case studies are provided for wastewater treatment facilities. This study shows that wastewater treatment facilities can be excellent candidates for open automated demand response and that facilities which have implemented energy efficiency measures and have centralized control systems are well-suited to shift or shed electrical loads in response to financial incentives, utility bill savings, and/or opportunities to enhance reliability of service. Control technologies installed for energy efficiency and load management purposes can often be adapted for automated demand response at little additional cost. These improved controls may prepare facilities to be more receptive to open automated demand response due to both increased confidence in the opportunities for controlling energy cost/use and access to the real-time data.
Hu, Lei; Zuo, Peng; Ye, Bang-Ce
2010-10-01
An automated multicomponent mesofluidic system (MCMS) based on biorecognitions carried out on meso-scale glass beads in polydimethylsiloxane (PDMS) channels was developed. The constructed MCMS consisted of five modules: a bead introduction module, a bioreaction module, a solution handling module, a liquid driving module, and a signal collection module. The integration of these modules enables the assay to be automated and reduces it to a one-step protocol. The MCMS has successfully been applied toward the detection of veterinary drug residues in animal-derived foods. The drug antigen-coated beads (φ250 μm) were arrayed in the PDMS channels (φ300 μm). The competitive immunoassay was then carried out on the surface of the glass beads. After washing, the Cy3-labeled secondary antibody was introduced to probe the antigen-antibody complex anchored to the beads. The fluorescence intensity of each bead was measured and used to determine the residual drug concentration. The MCMS is highly sensitive, with detection limits ranging from 0.02 μg/L (salbutamol) to 3.5 μg/L (sulfamethazine), and has a short assay time of 45 min or less. The experimental results demonstrate that the MCMS is an economic, efficient, and sensitive platform for multicomponent detection of compound residues for contamination in foods or the environment. Copyright 2010 Elsevier Inc. All rights reserved.
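In a competitive immunoassay like the one above, fluorescence falls as analyte concentration rises and concentrations are read off a fitted calibration curve. The sketch below is a generic four-parameter logistic (4PL) fit with SciPy, not the authors' calibration procedure; the standard concentrations and signal values are made up.

```python
# Generic 4PL calibration fit and inversion; all numbers are placeholder data.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, top, bottom, ec50, slope):
    return bottom + (top - bottom) / (1.0 + (x / ec50) ** slope)

conc = np.array([0.01, 0.1, 1.0, 10.0, 100.0])           # standards, ug/L
signal = np.array([980.0, 910.0, 640.0, 260.0, 120.0])   # bead fluorescence (a.u.)

params, _ = curve_fit(four_pl, conc, signal, p0=[1000.0, 100.0, 1.0, 1.0])
top, bottom, ec50, slope = params

def concentration_from_signal(y):
    """Invert the fitted 4PL curve to estimate analyte concentration."""
    return ec50 * ((top - bottom) / (y - bottom) - 1.0) ** (1.0 / slope)

print(concentration_from_signal(500.0))
```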
Towards automated processing of clinical Finnish: sublanguage analysis and a rule-based parser.
Laippala, Veronika; Ginter, Filip; Pyysalo, Sampo; Salakoski, Tapio
2009-12-01
In this paper, we present steps taken towards more efficient automated processing of clinical Finnish, focusing on daily nursing notes in a Finnish Intensive Care Unit (ICU). First, we analyze ICU Finnish as a sublanguage, identifying its specific features facilitating, for example, the development of a specialized syntactic analyser. The identified features include frequent omission of finite verbs, limitations in allowed syntactic structures, and domain-specific vocabulary. Second, we develop a formal grammar and a parser for ICU Finnish, thus providing better tools for the development of further applications in the clinical domain. The grammar is implemented in the LKB system in a typed feature structure formalism. The lexicon is automatically generated based on the output of the FinTWOL morphological analyzer adapted to the clinical domain. As an additional experiment, we study the effect of using Finnish constraint grammar to reduce the size of the lexicon. The parser construction thus makes efficient use of existing resources for Finnish. The grammar currently covers 76.6% of ICU Finnish sentences, producing highly accurate best-parse analyses with an F-score of 91.1%. We find that building a parser for the highly specialized domain sublanguage is not only feasible, but also surprisingly efficient, given an existing morphological analyzer with broad vocabulary coverage. The resulting parser enables a deeper analysis of the text than was previously possible.
Workflow Management for Complex HEP Analyses
NASA Astrophysics Data System (ADS)
Erdmann, M.; Fischer, R.; Rieger, M.; von Cube, R. F.
2017-10-01
We present the novel Analysis Workflow Management (AWM) that provides users with the tools and competences of professional large scale workflow systems, e.g. Apache’s Airavata[1]. The approach presents a paradigm shift from executing parts of the analysis to defining the analysis. Within AWM an analysis consists of steps. For example, a step defines running a certain executable for multiple files of an input data collection. Each call to the executable for one of those input files can be submitted to the desired run location, which could be the local computer or a remote batch system. An integrated software manager enables automated user installation of dependencies in the working directory at the run location. Each execution of a step item creates one report for bookkeeping purposes containing error codes and output data or file references. Required files, e.g. created by previous steps, are retrieved automatically. Since data storage and run locations are exchangeable from the steps' perspective, computing resources can be used opportunistically. A visualization of the workflow as a graph of the steps in the web browser provides a high-level view on the analysis. The workflow system is developed and tested alongside a ttbb cross section measurement where, for instance, the event selection is represented by one step and a Bayesian statistical inference is performed by another. The clear interface and dependencies between steps enable a make-like execution of the whole analysis.
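For orientation, the make-like, dependency-driven execution described above can be sketched in a few lines of Python. This is an illustrative toy only and not the AWM code base; the names Step and run_workflow are hypothetical.

```python
# Illustrative sketch of a make-like step executor: each step declares the steps it
# requires, runs only when its dependencies have run, and produces one report.
from typing import Callable, Dict, List

class Step:
    def __init__(self, name: str, requires: List[str], action: Callable[[], dict]):
        self.name = name          # unique step identifier
        self.requires = requires  # names of steps whose outputs are needed
        self.action = action      # callable doing the work, returns a report

def run_workflow(steps: Dict[str, Step]) -> Dict[str, dict]:
    """Execute steps in dependency order; completed steps are not re-run."""
    reports: Dict[str, dict] = {}

    def run(name: str) -> dict:
        if name in reports:               # already executed (make-like reuse)
            return reports[name]
        for dep in steps[name].requires:  # retrieve required inputs first
            run(dep)
        reports[name] = steps[name].action()
        return reports[name]

    for name in steps:
        run(name)
    return reports

# Example: an event-selection step feeding a statistical-inference step.
steps = {
    "select": Step("select", [], lambda: {"status": 0, "output": "selected.root"}),
    "infer": Step("infer", ["select"], lambda: {"status": 0, "output": "posterior.json"}),
}
print(run_workflow(steps))
```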
Study of the impact of automation on productivity in bus-maintenance facilities. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sumanth, D.J.; Weiss, H.J.; Adya, B.
1988-12-01
Whether or not the various types of automation and new technologies introduced in a bus-transit system really have an impact on productivity is the question addressed in the study. The report describes a new procedure of productivity measurement and evaluation for a county-transit system and provides an objective perspective on the impact of automation on productivity in bus maintenance facilities. The research objectives were: to study the impact of automation on total productivity in transit maintenance facilities; to develop and apply a methodology for measuring the total productivity of a Floridian transit maintenance facility (Bradenton-Manatee County bus maintenance facility which has been introducing automation since 1983); and to develop a practical step-by-step implementation scheme for the total productivity-based productivity measurement system that any bus manager can use. All 3 objectives were successfully accomplished.
Effects of a Longer Detection Window in VHF Time-of-Arrival Lightning Detection Systems
NASA Astrophysics Data System (ADS)
Murphy, M.; Holle, R.; Demetriades, N.
2003-12-01
Lightning detection systems that operate by measuring the times of arrival (TOA) of short bursts of radiation at VHF can produce huge volumes of data. The first automated system of this kind, the NASA Kennedy Space Center LDAR network, is capable of producing one detection every 100 usec from each of seven sensors (Lennon and Maier, 1991), where each detection consists of the time and amplitude of the highest-amplitude peak observed within the 100 usec window. More modern systems have been shown to produce very detailed information with one detection every 10 usec (Rison et al., 2001). Operating such systems in real time, however, can become expensive because of the large data communications rates required. One solution to this problem is to use a longer detection window, say 500 usec. In principle, this has little or no effect on the flash detection efficiency because each flash typically produces a very large number of these VHF bursts (known as sources). By simply taking the largest-amplitude peak from every 500-usec interval instead of every 100-usec interval, we should detect the largest 20% of the sources that would have been detected using the 100-usec window. However, questions remain about the exact effect of a longer detection window on the source detection efficiency with distance from the network, its effects on how well flashes are represented in space, and how well the reduced information represents the parent thunderstorm. The latter issue is relevant for automated location and tracking of thunderstorm cells using data from VHF TOA lightning detection networks, as well as for understanding relationships between lightning and severe weather. References Lennon, C.L. and L.M. Maier, Lightning mapping system. Proceedings, Intl. Aerospace and Ground Conf. on Lightning and Static Elec., Cocoa Beach, Fla., NASA Conf. Pub. 3106, vol. II, pp. 89-1 - 89-10, 1991. Rison, W., P. Krehbiel, R. Thomas, T. Hamlin, J. Harlin, High time resolution lightning mapping observations of a small thunderstorm during STEPS. Eos Trans. AGU, 82 (47), Fall Meet. Suppl., Abstract AE12A-83, 2001.
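The window-lengthening idea above amounts to keeping only the strongest peak per fixed time window. A minimal, purely schematic Python sketch of that selection (not the LDAR implementation; variable names are invented):

```python
# From a stream of (time_us, amplitude) peaks, keep only the highest-amplitude
# peak within each detection window.
def downsample_peaks(peaks, window_us=500):
    """Keep the largest-amplitude peak per window of length window_us."""
    best = {}
    for t_us, amp in peaks:
        w = int(t_us // window_us)               # index of the window containing this peak
        if w not in best or amp > best[w][1]:
            best[w] = (t_us, amp)
    return [best[w] for w in sorted(best)]

# With a 500-us window instead of 100 us, only the strongest of every five
# candidate peaks survives, roughly the largest ~20% of the detected sources.
peaks = [(i * 100, a) for i, a in enumerate([3, 9, 1, 4, 7, 2, 8, 5, 6, 10])]
print(downsample_peaks(peaks, window_us=500))
```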
SU-G-BRB-05: Automation of the Photon Dosimetric Quality Assurance Program of a Linear Accelerator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lebron, S; Lu, B; Yan, G
Purpose: To develop an automated method to calculate a linear accelerator (LINAC) photon radiation field size, flatness, symmetry, output and beam quality in a single delivery for flattened (FF) and flattening-filter-free (FFF) beams using an ionization chamber array. Methods: The proposed method consists of three control points that deliver 30×30, 10×10 and 5×5 cm² fields (FF or FFF) in a step-and-shoot sequence where the number of monitor units is weighted for each field size. The IC Profiler (Sun Nuclear Inc.) with 5 mm detector spacing was used for this study. The corrected counts (CCs) were calculated, and the locations of the maxima and minima of the first-order gradient were used to delimit the data of each sub-field. Then, all CCs for each field size are summed in order to obtain the final profiles. For each profile, the radiation field size, symmetry, flatness, output factor and beam quality were calculated. For field size calculation, a parameterized gradient method was used. For method validation, profiles were collected in the detector array both individually and as part of the step-and-shoot plan, with 9.9 cm buildup for FF and FFF beams at 90 cm source-to-surface distance. The same data were collected with the device (plus buildup) placed on a movable platform to achieve a 1 mm resolution. Results: The differences between the dosimetric quantities calculated from the two deliveries, individual and step-and-shoot, were within 0.31±0.20% and 0.04±0.02 mm. The differences between the calculated field sizes with 5 mm and 1 mm resolution were ±0.1 mm. Conclusion: The proposed single-delivery method proved to be simple and efficient in automating the photon dosimetric monthly and annual quality assurance.
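As a simplified illustration of how field edges can be located from the first-order gradient of a detector profile: the sketch below only picks the detector positions of the extreme gradient values, whereas the study uses a parameterized gradient method with sub-detector interpolation. Function name and data are illustrative.

```python
# Locate field edges as the positions of the steepest rise and fall of a profile.
import numpy as np

def field_size_from_profile(positions_cm, counts):
    grad = np.gradient(counts, positions_cm)    # first-order gradient of the profile
    left_edge = positions_cm[np.argmax(grad)]   # steepest rise = left field edge
    right_edge = positions_cm[np.argmin(grad)]  # steepest fall = right field edge
    return right_edge - left_edge

# Synthetic ~10 cm wide profile sampled every 5 mm:
x = np.arange(-15, 15.5, 0.5)
profile = np.where(np.abs(x) <= 5, 100.0, 2.0)
print(field_size_from_profile(x, profile))      # approximately 10 cm
```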
Licht, S
2011-12-15
STEP (solar thermal electrochemical production) theory is derived and experimentally verified for the electrosynthesis of energetic molecules at solar energy efficiency greater than any photovoltaic conversion efficiency. In STEP the efficient formation of metals, fuels, chlorine, and carbon capture is driven by solar thermal heated endothermic electrolyses of concentrated reactants occurring at a voltage below that of the room temperature energy stored in the products. One example is CO(2), which is reduced to either fuels or storable carbon at a solar efficiency of over 50% due to a synergy of efficient solar thermal absorption and electrochemical conversion at high temperature and reactant concentration. CO(2)-free production of iron by STEP, from iron ore, occurs via Fe(III) in molten carbonate. Water is efficiently split to hydrogen by molten hydroxide electrolysis, and chlorine, sodium, and magnesium from molten chlorides. A pathway is provided for the STEP decrease of atmospheric carbon dioxide levels to pre-industrial age levels in 10 years. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Efficient, Multi-Scale Designs Take Flight
NASA Technical Reports Server (NTRS)
2003-01-01
Engineers can solve aerospace design problems faster and more efficiently with a versatile software product that performs automated structural analysis and sizing optimization. Collier Research Corporation's HyperSizer Structural Sizing Software is a design, analysis, and documentation tool that increases productivity and standardization for a design team. Based on established aerospace structural methods for strength, stability, and stiffness, HyperSizer can be used all the way from conceptual design to in-service support. The software originated from NASA's efforts to automate its capability to perform aircraft strength analyses, structural sizing, and weight prediction and reduction. With a strategy to combine finite element analysis with an automated design procedure, NASA's Langley Research Center led the development of a software code known as ST-SIZE from 1988 to 1995. Collier Research employees were principal developers of the code along with Langley researchers. The code evolved into one that could analyze the strength and stability of stiffened panels constructed of any material, including lightweight, fiber-reinforced composites.
AUTOBA: automation of backbone assignment from HN(C)N suite of experiments.
Borkar, Aditi; Kumar, Dinesh; Hosur, Ramakrishna V
2011-07-01
Development of efficient strategies and automation represent important milestones of progress in rapid structure determination efforts in proteomics research. In this context, we present here an efficient algorithm named AUTOBA (Automatic Backbone Assignment) designed to automate the assignment protocol based on the HN(C)N suite of experiments. Depending upon the spectral dispersion, the user can record 2D or 3D versions of the experiments for assignment. The algorithm uses as inputs: (i) the protein primary sequence and (ii) peak lists from the user-defined HN(C)N suite of experiments. In the end, one gets H(N), (15)N, C(α) and C' assignments (in common BMRB format) for the individual residues along the polypeptide chain. The success of the algorithm has been demonstrated, not only with experimental spectra recorded on two small globular proteins: ubiquitin (76 aa) and M-crystallin (85 aa), but also with simulated spectra of 27 other proteins using assignment data from the BMRB.
A practical review of energy saving technology for ageing populations.
Walker, Guy; Taylor, Andrea; Whittet, Craig; Lynn, Craig; Docherty, Catherine; Stephen, Bruce; Owens, Edward; Galloway, Stuart
2017-07-01
Fuel poverty is a critical issue for a globally ageing population. Longer heating/cooling requirements combine with declining incomes to create a problem in need of urgent attention. One solution is to deploy technology to help elderly users feel informed about their energy use, and empowered to take steps to make it more cost effective and efficient. This study subjects a broad cross section of energy monitoring and home automation products to a formal ergonomic analysis. A high level task analysis was used to guide a product walk through, and a toolkit approach was used thereafter to drive out further insights. The findings reveal a number of serious usability issues which prevent these products from successfully accessing an important target demographic and associated energy saving and fuel poverty outcomes. Design principles and examples are distilled from the research to enable practitioners to translate the underlying research into high quality design-engineering solutions. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Budzan, Sebastian
2018-04-01
In this paper, an automatic method of grain detection and classification is presented. As input, it uses a single digital image obtained from the milling process of copper ore with a high-quality digital camera. The grinding process is extremely energy- and cost-consuming, thus the granularity evaluation should be performed with high efficiency and low time consumption. The method proposed in this paper is based on three-stage image processing. First, all grains are detected using Seeded Region Growing (SRG) segmentation with a proposed adaptive thresholding based on the calculation of the Relative Standard Deviation (RSD). In the next step, the detection results are refined using information about the shape of the detected grains derived from a distance map. Finally, each grain in the sample is classified into one of the predefined granularity classes. The quality of the proposed method was assessed using samples of nominal granularity and by comparison with other methods.
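One plausible reading of the RSD-based adaptive thresholding step is that the intensity tolerance used when growing a region from a seed is scaled by the local relative standard deviation of the gray levels. The sketch below encodes only that idea; the scaling rule and function name are assumptions, not the authors' algorithm.

```python
# Tolerance for region growing scaled by the local RSD (std / mean) of gray levels:
# noisier neighborhoods get a wider acceptance band.
import numpy as np

def adaptive_threshold(window: np.ndarray, base_tolerance: float = 0.10) -> float:
    """Return an intensity tolerance scaled by the local RSD of a gray-level window."""
    mean = float(window.mean())
    rsd = float(window.std()) / mean if mean > 0 else 0.0
    return base_tolerance * mean * (1.0 + rsd)

# Example: a fairly uniform grain region versus a noisy background patch.
grain = np.full((9, 9), 180.0) + np.random.default_rng(0).normal(0, 2, (9, 9))
background = np.random.default_rng(1).uniform(20, 120, (9, 9))
print(adaptive_threshold(grain), adaptive_threshold(background))
```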
Hot water-repellent and mechanically durable superhydrophobic mesh for oil/water separation.
Cao, Min; Luo, Xiaomin; Ren, Huijun; Feng, Jianyan
2018-02-15
The leakage of oil or organic pollutants into the ocean can cause a global catastrophe. Superhydrophobic materials have offered a new route to efficient, thorough and automated oil/water separation. However, most such materials lose superhydrophobicity when exposed to hot water (e.g. >55 °C). In this study, a hot water-repellent superhydrophobic mesh for oil/water separation was prepared by a one-step spray of modified polyurethane and hydrophobic silica nanoparticles onto a copper mesh. The as-prepared superhydrophobic mesh could be applied as an effective material for the separation of oil/water mixtures at temperatures up to 100 °C. In addition, the obtained mesh could selectively remove a wide range of organic solvents from water with high absorption capacity and good recyclability. Moreover, the as-prepared superhydrophobic mesh shows excellent mechanical durability, which makes it a promising material for practical oil/water separation. Copyright © 2017 Elsevier Inc. All rights reserved.
Streamlining Collaborative Planning in Spacecraft Mission Architectures
NASA Technical Reports Server (NTRS)
Misra, Dhariti; Bopf, Michel; Fishman, Mark; Jones, Jeremy; Kerbel, Uri; Pell, Vince
2000-01-01
During the past two decades, the planning and scheduling community has substantially increased the capability and efficiency of individual planning and scheduling systems. Relatively recently, research work to streamline collaboration between planning systems is gaining attention. Spacecraft missions stand to benefit substantially from this work as they require the coordination of multiple planning organizations and planning systems. Up to the present time this coordination has demanded a great deal of human intervention and/or extensive custom software development efforts. This problem will become acute with increased requirements for cross-mission plan coordination and multi-spacecraft mission planning. The Advanced Architectures and Automation Branch of NASA's Goddard Space Flight Center is taking innovative steps to define collaborative planning architectures, and to identify coordinated planning tools for Cross-Mission Campaigns. Prototypes are being developed to validate these architectures and assess the usefulness of the coordination tools by the planning community. This presentation will focus on one such planning coordination tool, named Visual Observation Layout Tool (VOLT), which is currently being developed to streamline the coordination between astronomical missions.
Wiemken, Timothy L; Furmanek, Stephen P; Mattingly, William A; Haas, Janet; Ramirez, Julio A; Carrico, Ruth M
2018-06-01
Hand hygiene is one of the most important interventions in the quest to eliminate healthcare-associated infections, and rates in healthcare facilities are markedly low. Since hand hygiene observation and feedback are critical to improve adherence, we created an easy-to-use, platform-independent hand hygiene data collection process and an automated, on-demand reporting engine. A 3-step approach was used for this project: 1) creation of a data collection form using Google Forms, 2) transfer of data from the form to a spreadsheet using Google Spreadsheets, and 3) creation of an automated, cloud-based analytics platform for report generation using R and RStudio Shiny software. A video tutorial of all steps in the creation and use of this free tool can be found on our YouTube channel: https://www.youtube.com/watch?v=uFatMR1rXqU&t. The on-demand reporting tool can be accessed at: https://crsp.louisville.edu/shiny/handhygiene. This data collection and automated analytics engine provides an easy-to-use environment for evaluating hand hygiene data; it also provides rapid feedback to healthcare workers. By reducing some of the data management workload required of the infection preventionist, more focused interventions may be instituted to increase global hand hygiene rates and reduce infection. Copyright © 2018 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
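The reporting engine described above is built with R and RStudio Shiny on top of Google Forms data. Purely as an illustration of the kind of on-demand aggregation such a tool performs, the following Python sketch turns raw observation records into compliance rates per unit; field names are hypothetical and this is not the authors' code.

```python
# Aggregate hand hygiene observations into a compliance rate per unit.
from collections import defaultdict

def compliance_by_unit(observations):
    """observations: iterable of dicts with a 'unit' name and a boolean 'performed' flag."""
    counts = defaultdict(lambda: [0, 0])           # unit -> [compliant, total]
    for obs in observations:
        counts[obs["unit"]][0] += int(obs["performed"])
        counts[obs["unit"]][1] += 1
    return {unit: compliant / total for unit, (compliant, total) in counts.items()}

records = [
    {"unit": "ICU", "performed": True},
    {"unit": "ICU", "performed": False},
    {"unit": "Med-Surg", "performed": True},
]
print(compliance_by_unit(records))   # {'ICU': 0.5, 'Med-Surg': 1.0}
```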
Modular magazine for suitable handling of microparts in industry
NASA Astrophysics Data System (ADS)
Grimme, Ralf; Schmutz, Wolfgang; Schlenker, Dirk; Schuenemann, Matthias; Stock, Achim; Schaefer, Wolfgang
1998-01-01
Microassembly and microadjustment techniques are key technologies in the industrial production of hybrid microelectromechanical systems. One focal point in current microproduction research and engineering is the design and development of high-precision microassembly and microadjustment equipment capable of operating within the framework of flexible automated industrial production. As well as these developments, suitable microassembly tools for industrial use also need to be equipped with interfaces for the supply and delivery of microcomponents. The microassembly process necessitates the supply of microparts in a geometrically defined manner. In order to reduce processing steps and production costs, there is a demand for magazines capable of providing free accessibility to the fixed microcomponents. Commonly used at present are feeding techniques which originate from the field of semiconductor production. However, none of these techniques fully meets the requirements of industrial microassembly technology. A novel modular magazine set, developed and tested in a joint project, is presented here. The magazines are able to hold microcomponents during cleaning, inspection and assembly without any additional handling steps. The modularity of their design allows for maximum technical flexibility. The modular magazine fits into currently practiced SEMI standards. The design and concept of the magazine enable industrial manufacturers to carry out cost-efficient and flexible precision assembly of microelectromechanical systems.
NASA Technical Reports Server (NTRS)
Hribar, Michelle R.; Frumkin, Michael; Jin, Haoqiang; Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)
1998-01-01
Over the past decade, high performance computing has evolved rapidly; systems based on commodity microprocessors have been introduced in quick succession from at least seven vendors/families. Porting codes to every new architecture is a difficult problem; in particular, here at NASA, there are many large CFD applications that are very costly to port to new machines by hand. The LCM ("Legacy Code Modernization") Project is the development of an integrated parallelization environment (IPE) which performs the automated mapping of legacy CFD (Fortran) applications to state-of-the-art high performance computers. While most projects to port codes focus on the parallelization of the code, we consider porting to be an iterative process consisting of several steps: 1) code cleanup, 2) serial optimization,3) parallelization, 4) performance monitoring and visualization, 5) intelligent tools for automated tuning using performance prediction and 6) machine specific optimization. The approach for building this parallelization environment is to build the components for each of the steps simultaneously and then integrate them together. The demonstration will exhibit our latest research in building this environment: 1. Parallelizing tools and compiler evaluation. 2. Code cleanup and serial optimization using automated scripts 3. Development of a code generator for performance prediction 4. Automated partitioning 5. Automated insertion of directives. These demonstrations will exhibit the effectiveness of an automated approach for all the steps involved with porting and tuning a legacy code application for a new architecture.
Automated systems to identify relevant documents in product risk management
2012-01-01
Background Product risk management involves critical assessment of the risks and benefits of health products circulating in the market. One of the important sources of safety information is the primary literature, especially for newer products with which regulatory authorities have relatively little experience. Although the primary literature provides vast and diverse information, only a small proportion of it is useful for product risk assessment work. Hence, the aim of this study is to explore the possibility of using text mining to automate the identification of useful articles, which will reduce the time taken for literature searches and hence improve work efficiency. In this study, term-frequency inverse document-frequency values were computed for predictors extracted from the titles and abstracts of articles related to three tumour necrosis factor-alpha blockers. A general automated system was developed using only general predictors and was tested for its generalizability using articles related to four other drug classes. Several specific automated systems were developed using both general and specific predictors and training sets of different sizes in order to determine the minimum number of articles required for developing such systems. Results The general automated system had an area under the curve value of 0.731 and was able to rank 34.6% and 46.2% of the total number of 'useful' articles among the first 10% and 20% of the articles presented to the evaluators when tested on the generalizability set. However, its use may be limited by the subjective definition of useful articles. For the specific automated system, it was found that only 20 articles were required to develop a specific automated system with a prediction performance (AUC 0.748) that was better than that of the general automated system. Conclusions Specific automated systems can be developed rapidly and avoid problems caused by subjective definition of useful articles. Thus the efficiency of product risk management can be improved with the use of specific automated systems. PMID:22380483
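A generic sketch of the underlying approach, ranking articles by predicted usefulness from TF-IDF features of titles and abstracts, is shown below. It is not the authors' system; the tiny training set and scikit-learn model choice are purely illustrative.

```python
# Rank new articles by predicted usefulness using TF-IDF features and a classifier.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

train_texts = [
    "serious infection risk reported during anti-TNF therapy",   # useful for risk assessment
    "case report of adverse hepatic event after treatment",      # useful
    "molecular structure of the receptor binding domain",        # not useful
    "in vitro binding kinetics of the compound",                  # not useful
]
train_labels = [1, 1, 0, 0]

vectorizer = TfidfVectorizer()                        # term-frequency inverse document-frequency
clf = LogisticRegression().fit(vectorizer.fit_transform(train_texts), train_labels)

new_texts = ["post-marketing safety signal in treated patients",
             "crystallography of the drug target"]
scores = clf.predict_proba(vectorizer.transform(new_texts))[:, 1]
ranked = sorted(zip(scores, new_texts), reverse=True)  # present highest-scoring articles first
print(ranked)
```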
An Aspect-Oriented Framework for Business Process Improvement
NASA Astrophysics Data System (ADS)
Pourshahid, Alireza; Mussbacher, Gunter; Amyot, Daniel; Weiss, Michael
Recently, many organizations invested in Business Process Management Systems (BPMSs) in order to automate and monitor their processes. Business Activity Monitoring is one of the essential modules of a BPMS as it provides the core monitoring capabilities. Although the natural step after process monitoring is process improvement, most of the existing systems do not provide the means to help users with the improvement step. In this paper, we address this issue by proposing an aspect-oriented framework that allows the impact of changes to business processes to be explored with what-if scenarios based on the most appropriate process redesign patterns among several possibilities. As the four cornerstones of a BPMS are process, goal, performance and validation views, these views need to be aligned automatically by any approach that intends to support automated improvement of business processes. Our framework therefore provides means to reflect process changes also in the other views of the business process. A health care case study presented as a proof of concept suggests that this novel approach is feasible.
Automated assembling of single fuel cell units for use in a fuel cell stack
NASA Astrophysics Data System (ADS)
Jalba, C. K.; Muminovic, A.; Barz, C.; Nasui, V.
2017-05-01
The manufacturing of PEMFC (polymer electrolyte membrane fuel cell) stacks is nowadays still done by hand. Hundreds of identical single components have to be placed accurately together for the construction of a fuel cell stack. Besides logistic problems, higher total costs and weight disadvantages, the high number of components produces a higher statistical failure rate because of faulty assembly, material defects and the summation of manufacturing tolerances. The cost savings are about 20-25%. Furthermore, the total weight of the fuel cells will be reduced because of a new sealing technology. Overall, a one-minute cycle time per cell has to be targeted for the manufacturing of these single components. The change of the existing sealing concept to a bonded sealing is one of the important prerequisites for an automated manufacturing of single cell units. One of the important steps for an automated gluing process is the checking of the glue application by using an image processing system. After bonding the single fuel cell, the sealing and electrical function can be checked, so that only functional, high-quality cells enter further manufacturing processes.
Vohra, Rais; Kelner, Michael; Clark, Richard F
2009-01-01
Crotaline Polyvalent Ovine Fab antivenom (CroFab, Savage Laboratories and Protherics Inc., Brentwood, TN, USA) preparation requires that the lyophilized powder be manually reconstituted before use. We compared automated methods for driving the product into solution with the standard manual method of reconstitution, and examined the effect of repeated rinsing of the product vial on the per-vial availability of antivenom. Normal saline (NS, 10 mL) was added to 12 vials of expired CroFab. Vials were assigned in pairs to each of six mixing methods, including one pair mixed manually as recommended by the product package insert. Each vial's contents were diluted to a final volume of 75 mL of normal saline. Protein concentration was measured with a colorimetric assay. The fluid left in each vial was removed and the vial was washed with 10 mL NS. Total protein yield from each step was calculated. There was no significant change in protein yield among three of five automated mixing methods when compared to manual reconstitution. Repeat rinsing of the product vial with an additional 10 mL of fluid added to the protein yield regardless of the mixing method used. We found slightly higher protein yields with all automated methods compared to manual mixing, but only two of five comparisons with the standard mixing method demonstrated statistical significance. However, for all methods tested, the addition of a second rinsing and recovery step increased the amount of protein recovered considerably, presumably by allowing solution of protein trapped in the foamy residues. Automated mixing methods and repeat rinsing of the product vial may allow higher protein yields in the preparation of CroFab antivenom.
Impact of assay design on test performance: lessons learned from 25-hydroxyvitamin D.
Farrell, Christopher-John L; Soldo, Joshua; McWhinney, Brett; Bandodkar, Sushil; Herrmann, Markus
2014-11-01
Current automated immunoassays vary significantly in many aspects of their design. This study sought to establish if the theoretical advantages and disadvantages associated with different design formats of automated 25-hydroxyvitamin D (25-OHD) assays are translated into variations in assay performance in practice. 25-OHD was measured in 1236 samples using automated assays from Abbott, DiaSorin, Roche and Siemens. A subset of 362 samples had up to three liquid chromatography-tandem mass spectrometry 25-OHD analyses performed. 25-OHD₂ recovery, dilution recovery, human anti-animal antibody (HAAA) interference, 3-epi-25-OHD₃ cross-reactivity and precision of the automated assays were evaluated. The assay that combined release of 25-OHD with analyte capture in a single step showed the most accurate 25-OHD₂ recovery and the best dilution recovery. The use of vitamin D binding protein (DBP) as the capture moiety was associated with 25-OHD₂ under-recovery, a trend consistent with 3-epi-25-OHD₃ cross-reactivity and immunity to HAAA interference. Assays using animal-derived antibodies did not show 3-epi-25-OHD₃ cross-reactivity but were variably susceptible to HAAA interference. Not combining 25-OHD release and capture in one step and use of biotin-streptavidin interaction for solid phase separation were features of the assays with inferior accuracy for diluted samples. The assays that used a backfill assay format showed the best precision at high concentrations but this design did not guarantee precision at low 25-OHD concentrations. Variations in design among automated 25-OHD assays influence their performance characteristics. Consideration of the details of assay design is therefore important when selecting and validating new assays.
The Historical Evolution of Educational Software.
ERIC Educational Resources Information Center
Troutner, Joanne
This paper establishes the roots of computers and automated teaching in the field of psychology and describes Dr. S. L. Pressey's presentation of the teaching machine; B. F. Skinner's teaching machine; Meyer's steps in composing a program for the automated teaching machine; IBM's beginning research on automated courses and the development of the…
Model-centric distribution automation: Capacity, reliability, and efficiency
Onen, Ahmet; Jung, Jaesung; Dilek, Murat; ...
2016-02-26
A series of analyses along with field validations that evaluate efficiency, reliability, and capacity improvements of model-centric distribution automation are presented. With model-centric distribution automation, the same model is used from design to real-time control calculations. A 14-feeder system with 7 substations is considered. The analyses involve hourly time-varying loads and annual load growth factors. Phase balancing and capacitor redesign modifications are used to better prepare the system for distribution automation, where the designs are performed considering time-varying loads. Coordinated control of load tap changing transformers, line regulators, and switched capacitor banks is considered. In evaluating distribution automation versus traditional system design and operation, quasi-steady-state power flow analysis is used. In evaluating distribution automation performance for substation transformer failures, reconfiguration for restoration analysis is performed. In evaluating distribution automation for storm conditions, Monte Carlo simulations coupled with reconfiguration for restoration calculations are used. As a result, the evaluations demonstrate that model-centric distribution automation has positive effects on system efficiency, capacity, and reliability.
Duval, Kristin; Aubin, Rémy A; Elliott, James; Gorn-Hondermann, Ivan; Birnboim, H Chaim; Jonker, Derek; Fourney, Ron M; Frégeau, Chantal J
2010-02-01
Archival tissue preserved in fixative constitutes an invaluable resource for histological examination, molecular diagnostic procedures and for DNA typing analysis in forensic investigations. However, available material is often limited in size and quantity. Moreover, recovery of DNA is often severely compromised by the presence of covalent DNA-protein cross-links generated by formalin, the most prevalent fixative. We describe the evaluation of buffer formulations, sample lysis regimens and DNA recovery strategies and define optimized manual and automated procedures for the extraction of high quality DNA suitable for molecular diagnostics and genotyping. Using a 3-step enzymatic digestion protocol carried out in the absence of dithiothreitol, we demonstrate that DNA can be efficiently released from cells or tissues preserved in buffered formalin or the alcohol-based fixative GenoFix. This preparatory procedure can then be integrated into traditional phenol/chloroform extraction, a modified manual DNA IQ or an automated DNA IQ/Te-Shake-based extraction in order to recover DNA for downstream applications. Quantitative recovery of high quality DNA was best achieved from specimens archived in GenoFix and extracted using magnetic bead capture.
NASA Astrophysics Data System (ADS)
Focke, Maximilian; Mark, Daniel; Stumpf, Fabian; Müller, Martina; Roth, Günter; Zengerle, Roland; von Stetten, Felix
2011-06-01
Two microfluidic cartridges intended for upgrading standard laboratory instruments with automated liquid handling capability by use of centrifugal forces are presented. The first microfluidic cartridge enables purification of DNA from human whole blood and is operated in a standard laboratory centrifuge. The second microfluidic cartridge enables genotyping of pathogens by geometrically multiplexed real-time PCR. It is operated in a slightly modified off-the-shelf thermal cycler. Both solutions aim at smart and cost-efficient ways to automate work flows in laboratories. The DNA purification cartridge automates all liquid handling steps starting from a lysed blood sample to PCR-ready DNA. The cartridge contains two manually crushable glass ampoules with liquid reagents. The DNA yield extracted from a 32 μl blood sample is 192 ± 30 ng, which corresponds to 53 ± 8% of a reference extraction. The genotyping cartridge is applied to analyse isolates of multi-resistant Staphylococcus aureus (MRSA) by real-time PCR. The wells contain pre-stored dry reagents such as primers and probes. Evaluation of the system with 44 genotyping assays showed 100% specificity and agreement with the reference assays in standard tubes. The lower limit of detection was well below 10 copies of DNA per reaction.
Bessemans, Laurent; Jully, Vanessa; de Raikem, Caroline; Albanese, Mathieu; Moniotte, Nicolas; Silversmet, Pascal; Lemoine, Dominique
2016-01-01
High-throughput screening technologies are increasingly integrated into the formulation development process of biopharmaceuticals. The performance of liquid handling systems is dependent on the ability to deliver accurate and precise volumes of specific reagents to ensure process quality. We have developed an automated gravimetric calibration procedure to adjust the accuracy and evaluate the precision of the TECAN Freedom EVO liquid handling system. Volumes from 3 to 900 µL using calibrated syringes and fixed tips were evaluated with various solutions, including aluminum hydroxide and phosphate adjuvants, β-casein, sucrose, sodium chloride, and phosphate-buffered saline. The methodology to set up liquid class pipetting parameters for each solution was to split the process in three steps: (1) screening of predefined liquid class, including different pipetting parameters; (2) adjustment of accuracy parameters based on a calibration curve; and (3) confirmation of the adjustment. The run of appropriate pipetting scripts, data acquisition, and reports until the creation of a new liquid class in EVOware was fully automated. The calibration and confirmation of the robotic system was simple, efficient, and precise and could accelerate data acquisition for a wide range of biopharmaceutical applications. PMID:26905719
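A hedged sketch of the gravimetric logic behind such a calibration: the dispensed mass is converted to volume via the solution density, a linear calibration curve of measured versus target volume is fitted, and per-volume relative error summarizes accuracy. The density value and variable names below are assumptions, not TECAN/EVOware parameters.

```python
# Gravimetric calibration sketch: mass -> volume, then a linear calibration curve.
import numpy as np

def measured_volumes(masses_mg, density_mg_per_ul=1.0):
    """Convert dispensed masses to volumes via the solution density."""
    return np.asarray(masses_mg) / density_mg_per_ul

def calibrate(targets_ul, measured_ul):
    """Fit a linear calibration curve and report the per-volume relative error (%)."""
    slope, intercept = np.polyfit(targets_ul, measured_ul, 1)
    rel_error_pct = 100.0 * (measured_ul - targets_ul) / targets_ul
    return slope, intercept, rel_error_pct

targets = np.array([10.0, 50.0, 100.0, 300.0, 600.0, 900.0])   # commanded volumes (uL)
masses = np.array([9.7, 49.1, 98.8, 297.5, 596.0, 894.0])      # weighed dispenses (mg)
slope, intercept, errors = calibrate(targets, measured_volumes(masses))
# Inverting the fitted curve gives the corrected command volume for a desired target.
print(slope, intercept, errors)
```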
Evangelopoulos, Angelos A; Dalamaga, Maria; Panoutsopoulos, Konstantinos; Dima, Kleanthi
2013-01-01
In the early 80s, the word automation was used in the clinical laboratory setting referring only to analyzers. But in the late 80s and afterwards, automation found its way into all aspects of the diagnostic process, embracing not only the analytical but also the pre- and post-analytical phases. While laboratories in the eastern world, mainly Japan, paved the way for laboratory automation, US and European laboratories soon realized the benefits and were quick to follow. Clearly, automation and robotics will be a key survival tool in a very competitive and cost-conscious healthcare market. What sets automation technology apart from so many other efficiency solutions is the dramatic savings that it brings to the clinical laboratory. Further standardization will assure the success of this revolutionary new technology. One of the main difficulties laboratory managers and personnel must deal with when studying solutions to reengineer a laboratory is familiarizing themselves with the multidisciplinary and technical terminology of this new and exciting field. The present review/glossary aims at giving an overview of the most frequently used terms within the scope of laboratory automation and to put laboratory automation on a sounder linguistic basis.
Space station automation study. Volume 1: Executive summary. Autonomous systems and assembly
NASA Technical Reports Server (NTRS)
1984-01-01
The space station automation study (SSAS) was to develop informed technical guidance for NASA personnel in the use of autonomy and autonomous systems to implement space station functions. The initial step taken by NASA in organizing the SSAS was to form and convene a panel of recognized expert technologists in automation, space sciences and aerospace engineering to produce a space station automation plan.
Stanford, Tyman E; Bagley, Christopher J; Solomon, Patty J
2016-01-01
Proteomic matrix-assisted laser desorption/ionisation (MALDI) linear time-of-flight (TOF) mass spectrometry (MS) may be used to produce protein profiles from biological samples with the aim of discovering biomarkers for disease. However, the raw protein profiles suffer from several sources of bias or systematic variation which need to be removed via pre-processing before meaningful downstream analysis of the data can be undertaken. Baseline subtraction, an early pre-processing step that removes the non-peptide signal from the spectra, is complicated by the following: (i) each spectrum has, on average, wider peaks for peptides with higher mass-to-charge ratios (m/z), and (ii) the time-consuming and error-prone trial-and-error process for optimising the baseline subtraction input arguments. With reference to the aforementioned complications, we present an automated pipeline that includes (i) a novel 'continuous' line segment algorithm that efficiently operates over data with a transformed m/z-axis to remove the relationship between peptide mass and peak width, and (ii) an input-free algorithm to estimate peak widths on the transformed m/z scale. The automated baseline subtraction method was deployed on six publicly available proteomic MS datasets using six different m/z-axis transformations. Optimality of the automated baseline subtraction pipeline was assessed quantitatively using the mean absolute scaled error (MASE) when compared to a gold-standard baseline subtracted signal. Several of the transformations investigated were able to reduce, if not entirely remove, the peak width and peak location relationship resulting in near-optimal baseline subtraction using the automated pipeline. The proposed novel 'continuous' line segment algorithm is shown to far outperform naive sliding window algorithms with regard to the computational time required. The improvement in computational time was at least four-fold on real MALDI TOF-MS data and at least an order of magnitude on many simulated datasets. The advantages of the proposed pipeline include informed and data specific input arguments for baseline subtraction methods, the avoidance of time-intensive and subjective piecewise baseline subtraction, and the ability to automate baseline subtraction completely. Moreover, individual steps can be adopted as stand-alone routines.
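For orientation, the naive sliding-window baseline that the 'continuous' line segment algorithm is benchmarked against can be written in a few lines: a local minimum within a fixed window serves as the baseline estimate at each point and is subtracted from the spectrum. This is the slow reference approach, not the paper's algorithm; the window size and synthetic data are arbitrary.

```python
# Naive sliding-window baseline estimation and subtraction for a 1D spectrum.
import numpy as np

def sliding_window_baseline(intensity, half_width=50):
    n = len(intensity)
    baseline = np.empty(n)
    for i in range(n):
        lo, hi = max(0, i - half_width), min(n, i + half_width + 1)
        baseline[i] = intensity[lo:hi].min()      # local minimum as baseline estimate
    return baseline

rng = np.random.default_rng(0)
mz = np.linspace(1000, 10000, 2000)
signal = 50 * np.exp(-mz / 4000) + rng.normal(0, 1, mz.size)   # decaying baseline + noise
signal[500:520] += 40                                          # a synthetic peptide peak
corrected = signal - sliding_window_baseline(signal)
print(corrected[:5])
```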
Exponential error reduction in pretransfusion testing with automation.
South, Susan F; Casina, Tony S; Li, Lily
2012-08-01
Protecting the safety of blood transfusion is the top priority of transfusion service laboratories. Pretransfusion testing is a critical element of the entire transfusion process to enhance vein-to-vein safety. Human error associated with manual pretransfusion testing is a cause of transfusion-related mortality and morbidity and most human errors can be eliminated by automated systems. However, the uptake of automation in transfusion services has been slow and many transfusion service laboratories around the world still use manual blood group and antibody screen (G&S) methods. The goal of this study was to compare error potentials of commonly used manual (e.g., tiles and tubes) versus automated (e.g., ID-GelStation and AutoVue Innova) G&S methods. Routine G&S processes in seven transfusion service laboratories (four with manual and three with automated G&S methods) were analyzed using failure modes and effects analysis to evaluate the corresponding error potentials of each method. Manual methods contained a higher number of process steps ranging from 22 to 39, while automated G&S methods only contained six to eight steps. Corresponding to the number of the process steps that required human interactions, the risk priority number (RPN) of the manual methods ranged from 5304 to 10,976. In contrast, the RPN of the automated methods was between 129 and 436 and also demonstrated a 90% to 98% reduction of the defect opportunities in routine G&S testing. This study provided quantitative evidence on how automation could transform pretransfusion testing processes by dramatically reducing error potentials and thus would improve the safety of blood transfusion. © 2012 American Association of Blood Banks.
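The risk priority numbers quoted above come from a failure modes and effects analysis, in which each potential failure mode is scored for severity, occurrence, and detectability and RPN = S x O x D. A minimal sketch of that accumulation over process steps, with invented example scores rather than values from the study:

```python
# Accumulate FMEA risk priority numbers over the steps of a testing process.
def total_rpn(failure_modes):
    """failure_modes: iterable of (severity, occurrence, detectability) tuples, each scored 1-10."""
    return sum(s * o * d for s, o, d in failure_modes)

manual_steps = [(9, 4, 6), (8, 3, 7), (7, 5, 5)]      # hypothetical manual G&S failure modes
automated_steps = [(9, 2, 2), (7, 1, 3)]              # fewer human touch points
print(total_rpn(manual_steps), total_rpn(automated_steps))
```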
Inventory management and reagent supply for automated chemistry.
Kuzniar, E
1999-08-01
Developments in automated chemistry have kept pace with developments in HTS such that hundreds of thousands of new compounds can be rapidly synthesized in the belief that the greater the number and diversity of compounds that can be screened, the more successful HTS will be. The increasing use of automation for Multiple Parallel Synthesis (MPS) and the move to automated combinatorial library production is placing an overwhelming burden on the management of reagents. Although automation has improved the efficiency of the processes involved in compound synthesis, the bottleneck has shifted to ordering, collating and preparing reagents for automated chemistry resulting in loss of time, materials and momentum. Major efficiencies have already been made in the area of compound management for high throughput screening. Most of these efficiencies have been achieved with sophisticated library management systems using advanced engineering and data handling for the storage, tracking and retrieval of millions of compounds. The Automation Partnership has already provided many of the top pharmaceutical companies with modular automated storage, preparation and retrieval systems to manage compound libraries for high throughput screening. This article describes how these systems may be implemented to solve the specific problems of inventory management and reagent supply for automated chemistry.
Automated telescope scheduling
NASA Technical Reports Server (NTRS)
Johnston, Mark D.
1988-01-01
With the ever increasing level of automation of astronomical telescopes the benefits and feasibility of automated planning and scheduling are becoming more apparent. Improved efficiency and increased overall telescope utilization are the most obvious goals. Automated scheduling at some level has been done for several satellite observatories, but the requirements on these systems were much less stringent than on modern ground or satellite observatories. The scheduling problem is particularly acute for Hubble Space Telescope: virtually all observations must be planned in excruciating detail weeks to months in advance. Space Telescope Science Institute has recently made significant progress on the scheduling problem by exploiting state-of-the-art artificial intelligence software technology. What is especially interesting is that this effort has already yielded software that is well suited to scheduling groundbased telescopes, including the problem of optimizing the coordinated scheduling of more than one telescope.
Efficiency of using construction machines when strengthening foundation soils
NASA Astrophysics Data System (ADS)
Turchin, Vadim; Yudina, Ludmila; Ivanova, Tatyana; Zhilkina, Tatyana; Sychugove, Stanislav; Mackevicius, Rimantas; Danutė, Slizyte
2017-10-01
The article describes the efficiency of using construction machines when strengthening foundation base soils, as one of the ways to solve the problem of reducing and optimizing costs during construction. The analysis is presented in regard to inspection results of the soil bodies in the pile foundation base of “School of general education No. 5 in the town of Malgobek” of the republic of Ingushetia. Economical efficiency through reducing the duration of construction due to the automation of production is calculated.
Xing, KeYi; Han, LiBin; Zhou, MengChu; Wang, Feng
2012-06-01
Deadlock-free control and scheduling are vital for optimizing the performance of automated manufacturing systems (AMSs) with shared resources and route flexibility. Based on the Petri net models of AMSs, this paper embeds the optimal deadlock avoidance policy into the genetic algorithm and develops a novel deadlock-free genetic scheduling algorithm for AMSs. A possible solution of the scheduling problem is coded as a chromosome representation that is a permutation with repetition of parts. By using the one-step look-ahead method in the optimal deadlock control policy, the feasibility of a chromosome is checked, and infeasible chromosomes are amended into feasible ones, which can be easily decoded into a feasible deadlock-free schedule. The chromosome representation and polynomial complexity of checking and amending procedures together support the cooperative aspect of genetic search for scheduling problems strongly.
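A schematic sketch of the encoding and repair idea follows: a chromosome is a permutation with repetition of part types (each occurrence corresponding to one processing step of that part), and an infeasible sequence is amended by always appending the first remaining operation that a safety check accepts. The safety check below is a stub standing in for the optimal one-step look-ahead deadlock avoidance policy on the Petri net model; it is not the paper's policy.

```python
# Repair a chromosome so that every prefix of the decoded schedule is admissible.
from typing import Callable, List

def amend_chromosome(chromosome: List[str],
                     is_safe: Callable[[List[str], str], bool]) -> List[str]:
    """Rearrange genes so every prefix of the schedule passes the safety check."""
    remaining = list(chromosome)
    feasible: List[str] = []
    while remaining:
        for i, gene in enumerate(remaining):
            if is_safe(feasible, gene):           # one-step look-ahead acceptance
                feasible.append(remaining.pop(i))
                break
        else:
            raise RuntimeError("no admissible operation; the real policy prevents this")
    return feasible

# Toy safety rule standing in for the deadlock avoidance policy:
# never run two consecutive operations of the same part type.
toy_safe = lambda prefix, gene: not prefix or prefix[-1] != gene
print(amend_chromosome(["A", "A", "B", "A", "B"], toy_safe))
```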
Automated Cognitive Health Assessment Using Smart Home Monitoring of Complex Tasks
Dawadi, Prafulla N.; Cook, Diane J.; Schmitter-Edgecombe, Maureen
2014-01-01
One of the many services that intelligent systems can provide is the automated assessment of resident well-being. We hypothesize that the functional health of individuals, or ability of individuals to perform activities independently without assistance, can be estimated by tracking their activities using smart home technologies. In this paper, we introduce a machine learning-based method for assessing activity quality in smart homes. To validate our approach we quantify activity quality for 179 volunteer participants who performed a complex, interweaved set of activities in our smart home apartment. We observed a statistically significant correlation (r=0.79) between automated assessment of task quality and direct observation scores. Using machine learning techniques to predict the cognitive health of the participants based on task quality is accomplished with an AUC value of 0.64. We believe that this capability is an important step in understanding everyday functional health of individuals in their home environments. PMID:25530925
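The two evaluation metrics reported above can be computed as in the short sketch below, run here on made-up numbers rather than the study's data: Pearson correlation between automated and directly observed task-quality scores, and ROC AUC for predicting a binary cognitive-health label from task quality.

```python
# Agreement (Pearson r) and discrimination (ROC AUC) for task-quality scores.
import numpy as np
from sklearn.metrics import roc_auc_score

automated = np.array([0.7, 0.4, 0.9, 0.3, 0.8, 0.5])   # automated task-quality scores (synthetic)
observed = np.array([0.65, 0.5, 0.85, 0.35, 0.75, 0.55])
r = np.corrcoef(automated, observed)[0, 1]             # direct-observation agreement

labels = np.array([1, 0, 1, 1, 0, 0])                  # 1 = cognitively healthy (synthetic)
auc = roc_auc_score(labels, automated)                 # discrimination of the quality score
print(round(r, 2), round(auc, 2))
```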
The digital code driven autonomous synthesis of ibuprofen automated in a 3D-printer-based robot.
Kitson, Philip J; Glatzel, Stefan; Cronin, Leroy
2016-01-01
An automated synthesis robot was constructed by modifying an open source 3D printing platform. The resulting automated system was used to 3D print reaction vessels (reactionware) of differing internal volumes using polypropylene feedstock via a fused deposition modeling 3D printing approach and subsequently make use of these fabricated vessels to synthesize the nonsteroidal anti-inflammatory drug ibuprofen via a consecutive one-pot three-step approach. The synthesis of ibuprofen could be achieved on different scales simply by adjusting the parameters in the robot control software. The software for controlling the synthesis robot was written in the python programming language and hard-coded for the synthesis of ibuprofen by the method described, opening possibilities for the sharing of validated synthetic 'programs' which can run on similar low cost, user-constructed robotic platforms towards an 'open-source' regime in the area of chemical synthesis.
Lerch, Oliver; Temme, Oliver; Daldrup, Thomas
2014-07-01
The analysis of opioids, cocaine, and metabolites from blood serum is a routine task in forensic laboratories. Commonly, the employed methods include many manual or partly automated steps like protein precipitation, dilution, solid phase extraction, evaporation, and derivatization preceding a gas chromatography (GC)/mass spectrometry (MS) or liquid chromatography (LC)/MS analysis. In this study, a comprehensively automated method was developed from a validated, partly automated routine method. This was possible by replicating method parameters on the automated system. Only marginal optimization of parameters was necessary. The automation relying on an x-y-z robot after manual protein precipitation includes the solid phase extraction, evaporation of the eluate, derivatization (silylation with N-methyl-N-trimethylsilyltrifluoroacetamide, MSTFA), and injection into a GC/MS. A quantitative analysis of almost 170 authentic serum samples and more than 50 authentic samples of other matrices like urine, different tissues, and heart blood on cocaine, benzoylecgonine, methadone, morphine, codeine, 6-monoacetylmorphine, dihydrocodeine, and 7-aminoflunitrazepam was conducted with both methods proving that the analytical results are equivalent even near the limits of quantification (low ng/ml range). To our best knowledge, this application is the first one reported in the literature employing this sample preparation system.
Liu, Benmei; Yu, Mandi; Graubard, Barry I; Troiano, Richard P; Schenker, Nathaniel
2016-01-01
The Physical Activity Monitor (PAM) component was introduced into the 2003-2004 National Health and Nutrition Examination Survey (NHANES) to collect objective information on physical activity including both movement intensity counts and ambulatory steps. Due to an error in the accelerometer device initialization process, the steps data were missing for all participants in several primary sampling units (PSUs), typically a single county or group of contiguous counties, who had intensity count data from their accelerometers. To avoid potential bias and loss in efficiency in estimation and inference involving the steps data, we considered methods to accurately impute the missing values for steps collected in the 2003-2004 NHANES. The objective was to come up with an efficient imputation method which minimized model-based assumptions. We adopted a multiple imputation approach based on Additive Regression, Bootstrapping and Predictive mean matching (ARBP) methods. This method fits alternative conditional expectation (ace) models, which use an automated procedure to estimate optimal transformations for both the predictor and response variables. This paper describes the approaches used in this imputation and evaluates the methods by comparing the distributions of the original and the imputed data. A simulation study using the observed data is also conducted as part of the model diagnostics. Finally some real data analyses are performed to compare the before and after imputation results. PMID:27488606
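A simplified illustration of the predictive-mean-matching idea used within such a multiple-imputation approach: each missing value is replaced by the observed value of a donor whose model prediction is closest to the prediction for the missing case. The simple linear model below stands in for the additive regression / ace models of the ARBP pipeline; all names and values are illustrative.

```python
# Predictive mean matching: impute missing steps from intensity counts.
import numpy as np

def pmm_impute(x, y, missing_mask, rng=np.random.default_rng(0), k=3):
    """Impute y[missing_mask] from predictor x by predictive mean matching."""
    obs = ~missing_mask
    beta = np.polyfit(x[obs], y[obs], 1)               # stand-in prediction model
    pred = np.polyval(beta, x)
    y_imp = y.copy()
    for i in np.flatnonzero(missing_mask):
        donors = np.argsort(np.abs(pred[obs] - pred[i]))[:k]   # k closest observed predictions
        y_imp[i] = rng.choice(y[obs][donors])                   # draw one donor's observed value
    return y_imp

counts = np.array([200.0, 340.0, 150.0, 420.0, 260.0, 310.0])  # intensity counts (observed)
steps = np.array([4200.0, 7600.0, np.nan, 9800.0, 5400.0, np.nan])
print(pmm_impute(counts, steps, np.isnan(steps)))
```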
Kimoto, Minoru; Okada, Kyoji; Sakamoto, Hitoshi; Kondou, Takanori
2017-05-01
[Purpose] Improving walking efficiency could be useful for reducing fatigue and extending the possible period of walking in children with cerebral palsy (CP). For this purpose, the current study compared conventional parameters of gross motor performance, step length, and cadence in the evaluation of walking efficiency in children with CP. [Subjects and Methods] Thirty-one children with CP (21 boys, 10 girls; mean age, 12.3 ± 2.7 years) participated. Parameters of gross motor performance, including the maximum step length (MSL), maximum side step length, step number, lateral step up number, and single leg standing time, were measured on both dominant and non-dominant sides. Spatio-temporal parameters of walking, including speed, step length, and cadence, were calculated. The total heart beat index (THBI), a parameter of walking efficiency, was also calculated from heartbeats and walking distance during 10 minutes of walking. To analyze the relationships between these parameters and the THBI, the coefficients of determination were calculated using stepwise analysis. [Results] The MSL of the dominant side best accounted for the THBI (R² = 0.759). [Conclusion] The MSL of the dominant side was the best explanatory parameter for walking efficiency in children with CP.
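The total heart beat index used above is, in essence, the number of heartbeats expended per unit distance walked, so a lower value indicates more efficient walking. A minimal sketch of that ratio, with invented example values:

```python
# THBI as heartbeats per metre of walking distance (lower = more efficient).
def total_heart_beat_index(total_heartbeats: float, distance_m: float) -> float:
    return total_heartbeats / distance_m

print(total_heart_beat_index(total_heartbeats=1150, distance_m=620))  # ~1.85 beats/m
```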
Test/score/report: Simulation techniques for automating the test process
NASA Technical Reports Server (NTRS)
Hageman, Barbara H.; Sigman, Clayton B.; Koslosky, John T.
1994-01-01
A Test/Score/Report capability is currently being developed for the Transportable Payload Operations Control Center (TPOCC) Advanced Spacecraft Simulator (TASS) system which will automate testing of the Goddard Space Flight Center (GSFC) Payload Operations Control Center (POCC) and Mission Operations Center (MOC) software in three areas: telemetry decommutation, spacecraft command processing, and spacecraft memory load and dump processing. Automated computer control of the acceptance test process is one of the primary goals of a test team. With the proper simulation tools and user interface, acceptance testing, regression testing, and repeating specific test procedures of a ground data system become simpler tasks. Ideally, the goal for complete automation would be to plug the operational deliverable into the simulator, press the start button, execute the test procedure, accumulate and analyze the data, score the results, and report the results to the test team along with a go/no-go recommendation. In practice, this may not be possible because of inadequate test tools, pressures of schedules, limited resources, etc. Most tests are accomplished using a certain degree of automation and test procedures that are labor intensive. This paper discusses some simulation techniques that can improve the automation of the test process. The TASS system tests the POCC/MOC software and provides a score based on the test results. The TASS system displays statistics on the success of the POCC/MOC system processing in each of the three areas as well as event messages pertaining to the Test/Score/Report processing. The TASS system also provides formatted reports documenting each step performed during the tests and the results of each step. A prototype of the Test/Score/Report capability is available and currently being used to test some POCC/MOC software deliveries. When this capability is fully operational it should greatly reduce the time necessary to test a POCC/MOC software delivery, as well as improve the quality of the test process.
Instructor Model Characteristics for Automated Speech Technology (IMCAST).
1979-10-01
the student leaves the school, OJT opportunities for further training may be on the decline as reported by Hooks et al. (1978) in the case of LSOs...in this case, errors are committed, or maybe a response was omitted, because of a lack of "knowledge" (structure) on the...Taking this reasoning one step further, Figure 7 shows the case wherein the range of resource allocation is extended from A to D. As can be seen, one
NASA Astrophysics Data System (ADS)
Ringenberg, Jordan; Deo, Makarand; Devabhaktuni, Vijay; Filgueiras-Rama, David; Pizarro, Gonzalo; Ibañez, Borja; Berenfeld, Omer; Boyers, Pamela; Gold, Jeffrey
2012-12-01
This paper presents an automated method to segment left ventricle (LV) tissues from functional and delayed-enhancement (DE) cardiac magnetic resonance imaging (MRI) scans using a sequential multi-step approach. First, a region of interest (ROI) is computed to create a subvolume around the LV using morphological operations and image arithmetic. From the subvolume, the myocardial contours are automatically delineated using difference of Gaussians (DoG) filters and GSV snakes. These contours are used as a mask to identify pathological tissues, such as fibrosis or scar, within the DE-MRI. The presented automated technique is able to accurately delineate the myocardium and identify the pathological tissue in patient sets. The results were validated by two expert cardiologists, and in one set the automated results are quantitatively and qualitatively compared with expert manual delineation. Furthermore, the method is patient-specific, performed on an entire patient MRI series. Thus, in addition to providing a quick analysis of individual MRI scans, the fully automated segmentation method is used for effectively tagging regions in order to reconstruct computerized patient-specific 3D cardiac models. These models can then be used in electrophysiological studies and surgical strategy planning.
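As a rough illustration of the boundary-enhancement and ROI steps described above, the sketch below applies a difference-of-Gaussians (DoG) band-pass filter and a simple morphological ROI extraction. The sigma values, threshold, and array names are assumptions for illustration, not the authors' parameters, and the GSV snake stage is not reproduced.

```python
import numpy as np
from scipy import ndimage

def dog_edge_map(slice_img, sigma_narrow=1.0, sigma_wide=3.0):
    """Difference-of-Gaussians band-pass: subtracting a strongly blurred copy
    from a lightly blurred copy emphasizes edges such as the myocardial
    boundary while suppressing slowly varying background intensity."""
    img = slice_img.astype(np.float32)
    return ndimage.gaussian_filter(img, sigma_narrow) - ndimage.gaussian_filter(img, sigma_wide)

def roi_subvolume(volume, threshold):
    """Rough ROI via morphological operations and image arithmetic:
    threshold the bright blood pool, close gaps, keep the largest component."""
    mask = volume > threshold
    mask = ndimage.binary_closing(mask, iterations=3)
    labels, n = ndimage.label(mask)
    if n == 0:
        return mask
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    return labels == (np.argmax(sizes) + 1)
```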
Superpixel-Augmented Endmember Detection for Hyperspectral Images
NASA Technical Reports Server (NTRS)
Thompson, David R.; Castano, Rebecca; Gilmore, Martha
2011-01-01
Superpixels are homogeneous image regions composed of several contiguous pixels. They are produced by shattering the image into contiguous, homogeneous regions that each cover between 20 and 100 image pixels. The segmentation aims for a many-to-one mapping from superpixels to image features; each image feature could contain several superpixels, but each superpixel occupies no more than one image feature. This conservative segmentation is relatively easy to automate in a robust fashion. Superpixel processing is related to the more general idea of improving hyperspectral analysis through spatial constraints, which can recognize subtle features at or below the level of noise by exploiting the fact that their spectral signatures are found in neighboring pixels. Recent work has explored spatial constraints for endmember extraction, showing significant advantages over techniques that ignore pixels' relative positions. Methods such as AMEE (automated morphological endmember extraction) express spatial influence using fixed isometric relationships, such as a local square window or Euclidean distance in pixel coordinates. In other words, two pixels' covariances are based on their spatial proximity, but are independent of their absolute location in the scene. These isometric spatial constraints are most appropriate when spectral variation is smooth and constant over the image. Superpixels are simple to implement, efficient to compute, and empirically effective. They can be used as a preprocessing step with any desired endmember extraction technique. Superpixels also have a solid theoretical basis in the hyperspectral linear mixing model, making them a principled approach for improving endmember extraction. Unlike existing approaches, superpixels can accommodate non-isometric covariance between image pixels (characteristic of discrete image features separated by step discontinuities). These kinds of image features are common in natural scenes. Analysts can substitute superpixels for image pixels during endmember analysis that leverages the spatial contiguity of scene features to enhance subtle spectral features. Superpixels define populations of image pixels that are independent samples from each image feature, permitting robust estimation of spectral properties, and reducing measurement noise in proportion to the area of the superpixel. This permits improved endmember extraction, and enables automated search for novel and constituent minerals in very noisy, hyperspatial images. This innovation begins with a graph-based segmentation based on the work of Felzenszwalb et al., but then expands their approach to the hyperspectral image domain with a Euclidean distance metric. Then, the mean spectrum of each segment is computed, and the resulting data cloud is used as input into sequential maximum angle convex cone (SMACC) endmember extraction.
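A minimal sketch of the preprocessing idea, replacing per-pixel spectra with per-superpixel mean spectra before endmember extraction, is given below. It segments a band-averaged proxy image with scikit-image's Felzenszwalb segmenter rather than the full hyperspectral extension described in the abstract; the scale and size parameters are assumptions.

```python
import numpy as np
from skimage.segmentation import felzenszwalb

def superpixel_mean_spectra(cube, scale=100, min_size=20):
    """cube: (rows, cols, bands) hyperspectral image.
    Returns per-superpixel mean spectra and the label image."""
    # Graph-based segmentation on a band-averaged image; the published method
    # instead applies the segmentation directly in the spectral domain.
    labels = felzenszwalb(cube.mean(axis=2), scale=scale, min_size=min_size)
    n_segments = labels.max() + 1
    flat = cube.reshape(-1, cube.shape[2])
    lab = labels.ravel()
    means = np.zeros((n_segments, cube.shape[2]))
    for k in range(n_segments):
        means[k] = flat[lab == k].mean(axis=0)  # noise shrinks with superpixel area
    return means, labels

# The 'means' array can then be fed to any endmember extractor (e.g. SMACC)
# in place of the raw per-pixel spectra.
```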
Ibrahim, Sarah A; Martini, Luigi
2014-08-01
Dissolution method transfer is a complicated yet common process in the pharmaceutical industry. With increased pharmaceutical product manufacturing and dissolution acceptance requirements, dissolution testing has become one of the most labor-intensive quality control testing methods. There is an increased trend for automation in dissolution testing, particularly for large pharmaceutical companies to reduce variability and increase personnel efficiency. There is no official guideline for dissolution testing method transfer from a manual, semi-automated, to automated dissolution tester. In this study, a manual multipoint dissolution testing procedure for an enteric-coated aspirin tablet was transferred effectively and reproducibly to a fully automated dissolution testing device, RoboDis II. Enteric-coated aspirin samples were used as a model formulation to assess the feasibility and accuracy of media pH change during continuous automated dissolution testing. Several RoboDis II parameters were evaluated to ensure the integrity and equivalency of dissolution method transfer from a manual dissolution tester. This current study provides a systematic outline for the transfer of the manual dissolution testing protocol to an automated dissolution tester. This study further supports that automated dissolution testers compliant with regulatory requirements and similar to manual dissolution testers facilitate method transfer. © 2014 Society for Laboratory Automation and Screening.
The Role of Automation in Education: Now and in the Future
ERIC Educational Resources Information Center
Scandura, Joseph M.
2010-01-01
According to Wikipedia "Automation is a step beyond mechanism." Whereas mechanization provided human operators with machinery to assist them with the muscular requirements of work, automation greatly reduces the need for human sensory and mental requirements as well. In this context, Artificial Intelligence (AI) was founded on the claim that a…
ERIC Educational Resources Information Center
Sylvia, Margaret
1993-01-01
Describes one college library's experience with a gateway for dial-in access to its CD-ROM network to increase access to automated index searching for students off-campus. Hardware and software choices are discussed in terms of access, reliability, affordability, and ease of use. Installation problems are discussed, and an appendix lists product…
75 FR 69143 - Postal Rate and Classification Changes
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-10
This document addresses a recently-filed Postal Service request for three postal rate and classification changes. One change will affect certain senders of First-Class Mail Presort and Automation Letters. Another change will affect Standard Mail and High Density mailers. The third change affects the Move Update Charge threshold. This document provides details about the anticipated changes and addresses procedural steps associated with this filing.
Development of the automated circulating tumor cell recovery system with microcavity array.
Negishi, Ryo; Hosokawa, Masahito; Nakamura, Seita; Kanbara, Hisashige; Kanetomo, Masafumi; Kikuhara, Yoshihito; Tanaka, Tsuyoshi; Matsunaga, Tadashi; Yoshino, Tomoko
2015-05-15
Circulating tumor cells (CTCs) are well recognized as a useful biomarker for cancer diagnosis and a potential target of drug discovery for metastatic cancer. Efficient and precise recovery of extremely low concentrations of CTCs from blood is required to increase detection sensitivity. Here, an automated system equipped with a microcavity array (MCA) was demonstrated for highly efficient and reproducible CTC recovery. The use of the MCA allows selective recovery of cancer cells from whole blood on the basis of differences in size between tumor and blood cells. Intra- and inter-assays revealed that the automated system achieved efficiency and reproducibility equal to the assay performed manually by a well-trained operator. Under the optimized assay workflow, the automated system allows efficient and precise cell recovery for non-small cell lung cancer cells spiked into whole blood. The automated CTC recovery system will contribute to high-throughput analysis in further clinical studies on large cohorts of cancer patients. Copyright © 2014 Elsevier B.V. All rights reserved.
Tavakoli, Mohammad Mahdi; Gu, Leilei; Gao, Yuan; Reckmeier, Claas; He, Jin; Rogach, Andrey L.; Yao, Yan; Fan, Zhiyong
2015-01-01
Organometallic trihalide perovskites are promising materials for photovoltaic applications and have demonstrated a rapid rise in photovoltaic performance in a short period of time. We report a facile one-step method to fabricate planar heterojunction perovskite solar cells by chemical vapor deposition (CVD), with a solar power conversion efficiency of up to 11.1%. We performed a systematic optimization of CVD parameters such as temperature and growth time to obtain high-quality films of CH3NH3PbI3 and CH3NH3PbI3-xClx perovskite. Scanning electron microscopy and time-resolved photoluminescence data showed that the perovskite films have a large grain size of more than 1 micrometer, and carrier lifetimes of 10 ns and 120 ns for CH3NH3PbI3 and CH3NH3PbI3-xClx, respectively. This is the first demonstration of a highly efficient perovskite solar cell using one-step CVD, and there is likely room for significant improvement of device efficiency. PMID:26392200
NASA Astrophysics Data System (ADS)
Sawall, Mathias; von Harbou, Erik; Moog, Annekathrin; Behrens, Richard; Schröder, Henning; Simoneau, Joël; Steimers, Ellen; Neymeyr, Klaus
2018-04-01
Spectral data preprocessing is an integral and sometimes inevitable part of chemometric analyses. For Nuclear Magnetic Resonance (NMR) spectra a possible first preprocessing step is a phase correction which is applied to the Fourier transformed free induction decay (FID) signal. This preprocessing step can be followed by a separate baseline correction step. Especially if series of high-resolution spectra are considered, then automated and computationally fast preprocessing routines are desirable. A new method is suggested that applies the phase and the baseline corrections simultaneously in an automated form without manual input, which distinguishes this work from other approaches. The underlying multi-objective optimization or Pareto optimization provides improved results compared to consecutively applied correction steps. The optimization process uses an objective function which applies strong penalty constraints and weaker regularization conditions. The new method includes an approach for the detection of zero baseline regions. The baseline correction uses a modified Whittaker smoother. The functionality of the new method is demonstrated for experimental NMR spectra. The results are verified against gravimetric data. The method is compared to alternative preprocessing tools. Additionally, the simultaneous correction method is compared to a consecutive application of the two correction steps.
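For readers unfamiliar with the baseline part of this correction, the sketch below implements a basic Whittaker smoother of the kind the abstract refers to (the paper uses a modified version inside a simultaneous phase/baseline Pareto optimization, which is not reproduced here); the penalty weight and input names are assumptions.

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

def whittaker_baseline(y, lam=1e7, weights=None):
    """Estimate a smooth baseline z for a 1D spectrum y by minimizing
    sum(w * (y - z)**2) + lam * sum((second differences of z)**2)."""
    n = len(y)
    # Second-difference operator as a sparse matrix.
    D = sparse.diags([1.0, -2.0, 1.0], [0, 1, 2], shape=(n - 2, n))
    w = np.ones(n) if weights is None else np.asarray(weights, dtype=float)
    W = sparse.diags(w)
    z = spsolve((W + lam * D.T @ D).tocsc(), w * y)
    return z

# Typical use: subtract the estimated baseline from the real part of the
# phased spectrum, with weights reduced in peak regions so that only
# baseline (near-zero) regions drive the fit.
```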
Magellan Project: Evolving enhanced operations efficiency to maximize science value
NASA Technical Reports Server (NTRS)
Cheuvront, Allan R.; Neuman, James C.; Mckinney, J. Franklin
1994-01-01
Magellan has been one of NASA's most successful spacecraft, returning more science data than all other planetary spacecraft combined. The Magellan Spacecraft Team (SCT) has maximized the science return with innovative operational techniques to overcome anomalies and to perform activities for which the spacecraft was not designed. Commanding the spacecraft was originally time-consuming because the standard development process was envisioned as a series of manual tasks. The Program understood that reducing mission operations costs was essential for an extended mission. Management created an environment which encouraged automation of routine tasks, allowing staff reduction while maximizing the science data returned. Data analysis and trending, command preparation, and command reviews are some of the tasks that were automated. The SCT has accommodated personnel reductions by improving operations efficiency while returning the maximum science data possible.
NASA Astrophysics Data System (ADS)
Topolsky, D. V.; Gonenko, T. V.; Khatsevskiy, V. F.
2017-10-01
The present paper discusses ways to solve the problem of enhancing operating efficiency of automated electric power supply control systems of mining companies. According to the authors, one of the ways to solve this problem is intellectualization of the electric power supply control system equipment. To enhance efficiency of electric power supply control and electricity metering, it is proposed to use specially designed digital combined instrument current and voltage transformers. This equipment conforms to IEC 61850 international standard and is adapted for integration into the digital substation structure. Tests were performed to check conformity of an experimental prototype of the digital combined instrument current and voltage transformer with IEC 61850 standard. The test results have shown that the considered equipment meets the requirements of the standard.
Maddux, Randy J.
1995-01-01
The political and economic climate that exists today is a challenging one for the pharmaceutical industry. To effectively compete in today's marketplace, companies must discover and develop truly innovative medicines. The R&D organizations within these companies are under increasing pressure to hold down costs while accomplishing this mission. In this environment of level head count and operating budgets, it is imperative that laboratory management uses resources in the most effective, efficient ways possible. Investment in laboratory automation is a proven tool for doing just that. This paper looks at the strategy and tactics behind the formation and evolution of a central automation/laboratory technology support function at the Glaxo Research Institute. Staffing of the function is explained, along with operating strategy and alignment with the scientific client base. Using the S-curve model of technological progress, both the realized and potential impact on successful R&D automation and laboratory technology development are assessed. PMID:18925012
Schmitt, John; Beller, Justin; Russell, Brian; Quach, Anthony; Hermann, Elizabeth; Lyon, David; Breit, Jeffrey
2017-01-01
As the biopharmaceutical industry evolves to include more diverse protein formats and processes, more robust control of Critical Quality Attributes (CQAs) is needed to maintain processing flexibility without compromising quality. Active control of CQAs has been demonstrated using model predictive control techniques, which allow development of processes which are robust against disturbances associated with raw material variability and other potentially flexible operating conditions. Wide adoption of model predictive control in biopharmaceutical cell culture processes has been hampered, however, in part due to the large amount of data and expertise required to make a predictive model of controlled CQAs, a requirement for model predictive control. Here we developed a highly automated perfusion apparatus to systematically and efficiently generate predictive models using system identification approaches. We successfully created a predictive model of %galactosylation using data obtained by manipulating galactose concentration in the perfusion apparatus in serialized step change experiments. We then demonstrated the use of the model in a model predictive controller in a simulated control scenario to successfully achieve a %galactosylation set point in a simulated fed‐batch culture. The automated model identification approach demonstrated here can potentially be generalized to many CQAs, and could be a more efficient, faster, and highly automated alternative to batch experiments for developing predictive models in cell culture processes, and allow the wider adoption of model predictive control in biopharmaceutical processes. © 2017 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers Biotechnol. Prog., 33:1647–1661, 2017 PMID:28786215
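The system-identification step described above can be illustrated with a toy fit of a first-order response to data from a single galactose step change. The model form, sampling times, and values below are assumptions for illustration only, not the authors' identified model.

```python
import numpy as np
from scipy.optimize import curve_fit

def first_order_step(t, gain, tau, y0):
    """Response of a first-order system to a unit step input applied at t = 0."""
    return y0 + gain * (1.0 - np.exp(-t / tau))

# t_hours: sampling times after the galactose step; gal_pct: measured %galactosylation
t_hours = np.array([0, 6, 12, 24, 36, 48, 72], dtype=float)
gal_pct = np.array([10.1, 11.8, 13.0, 14.6, 15.3, 15.6, 15.8])  # illustrative values

params, _ = curve_fit(first_order_step, t_hours, gal_pct, p0=[5.0, 20.0, 10.0])
gain, tau, y0 = params
# 'gain' and 'tau' would then parameterize the predictive model that a model
# predictive controller uses to plan galactose feed adjustments.
```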
De Tobel, J; Radesh, P; Vandermeulen, D; Thevissen, P W
2017-12-01
Automated methods to evaluate growth of hand and wrist bones on radiographs and magnetic resonance imaging have been developed. They can be applied to estimate age in children and subadults. Automated methods require the software to (1) recognise the region of interest in the image(s), (2) evaluate the degree of development and (3) correlate this to the age of the subject based on a reference population. For age estimation based on third molars, an automated method for step (1) has been presented for 3D magnetic resonance imaging and is currently being optimised (Unterpirker et al. 2015). The aim of this study was to develop an automated method for step (2), based on lower third molars on panoramic radiographs. A modified Demirjian staging technique including ten developmental stages was developed. Twenty panoramic radiographs per stage per gender were retrospectively selected for FDI element 38. Two observers decided on the stages in consensus. When necessary, a third observer acted as a referee to establish the reference stage for the considered third molar. This set of radiographs was used as training data for machine learning algorithms for automated staging. First, image contrast settings were optimised to evaluate the third molar of interest and a rectangular bounding box was placed around it in a standardised way using Adobe Photoshop CC 2017 software. This bounding box indicated the region of interest for the next step. Second, several machine learning algorithms available in MATLAB R2017a software were applied for automated stage recognition. Third, the classification performance was evaluated in a 5-fold cross-validation scenario, using different validation metrics (accuracy, Rank-N recognition rate, mean absolute difference, linear kappa coefficient). Transfer Learning as a type of Deep Learning Convolutional Neural Network approach outperformed all other tested approaches. Mean accuracy equalled 0.51, mean absolute difference was 0.6 stages and mean linearly weighted kappa was 0.82. The overall performance of the presented automated pilot technique to stage lower third molar development on panoramic radiographs was similar to staging by human observers. It will be further optimised in future research, since it represents a necessary step to achieve a fully automated dental age estimation method, which to date is not available.
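The transfer-learning idea, reusing a CNN pretrained on natural images and retraining only its final layer for the ten developmental stages, can be sketched as below. The study used MATLAB; this PyTorch version with a ResNet-18 backbone, the input size, and the optimizer settings are assumptions meant only to illustrate the general approach.

```python
import torch
import torch.nn as nn
from torchvision import models

NUM_STAGES = 10  # modified Demirjian staging with ten stages

# Start from a network pretrained on natural images and reuse its features.
backbone = models.resnet18(weights="IMAGENET1K_V1")  # older torchvision: pretrained=True
for param in backbone.parameters():
    param.requires_grad = False                 # freeze the pretrained layers
backbone.fc = nn.Linear(backbone.fc.in_features, NUM_STAGES)  # new classifier head

optimizer = torch.optim.Adam(backbone.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

def train_step(images, stage_labels):
    """images: (N, 3, 224, 224) crops of the third-molar bounding box."""
    optimizer.zero_grad()
    loss = criterion(backbone(images), stage_labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```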
ICSH guidelines for the verification and performance of automated cell counters for body fluids.
Bourner, G; De la Salle, B; George, T; Tabe, Y; Baum, H; Culp, N; Keng, T B
2014-12-01
One of the many challenges facing laboratories is the verification of their automated Complete Blood Count cell counters for the enumeration of body fluids. These analyzers offer improved accuracy, precision, and efficiency in performing the enumeration of cells compared with manual methods. A patterns of practice survey was distributed to laboratories that participate in proficiency testing in Ontario, Canada, the United States, the United Kingdom, and Japan to determine the number of laboratories that are testing body fluids on automated analyzers and the performance specifications that were performed. Based on the results of this questionnaire, an International Working Group for the Verification and Performance of Automated Cell Counters for Body Fluids was formed by the International Council for Standardization in Hematology (ICSH) to prepare a set of guidelines to help laboratories plan and execute the verification of their automated cell counters to provide accurate and reliable results for automated body fluid counts. These guidelines were discussed at the ICSH General Assemblies and reviewed by an international panel of experts to achieve further consensus. © 2014 John Wiley & Sons Ltd.
Picard-Meyer, Evelyne; Peytavin de Garam, Carine; Schereffer, Jean Luc; Marchal, Clotilde; Robardet, Emmanuelle; Cliquet, Florence
2015-01-01
This study evaluates the performance of five two-step SYBR Green RT-qPCR kits and five one-step SYBR Green qRT-PCR kits using real-time PCR assays. Two real-time thermocyclers showing different throughput capacities were used. The analysed performance evaluation criteria included the generation of standard curves, reaction efficiency, analytical sensitivity, intra- and interassay repeatability, as well as the costs and the practicability of kits, and thermocycling times. We found that the optimised one-step PCR assays had a higher detection sensitivity than the optimised two-step assays regardless of the machine used, while no difference was detected in reaction efficiency, R² values, and intra- and inter-reproducibility between the two methods. The limit of detection at the 95% confidence level varied from 15 to 981 copies/µL for the one-step kits and from 41 to 171 copies/µL for the two-step kits. Of the ten kits tested, the most efficient kit was the Quantitect SYBR Green qRT-PCR, with a limit of detection at the 95% confidence level of 20 and 22 copies/µL on the Rotor gene Q MDx and MX3005P thermocyclers, respectively. The study demonstrated the pivotal influence of the thermocycler on PCR performance for the detection of rabies RNA, as well as that of the master mixes. PMID:25785274
Detecting Solar-like Oscillations in Red Giants with Deep Learning
NASA Astrophysics Data System (ADS)
Hon, Marc; Stello, Dennis; Zinn, Joel C.
2018-05-01
Time-resolved photometry of tens of thousands of red giant stars from space missions like Kepler and K2 has created the need for automated asteroseismic analysis methods. The first and most fundamental step in such analysis is to identify which stars show oscillations. It is critical that this step be performed with no, or little, detection bias, particularly when performing subsequent ensemble analyses that aim to compare the properties of observed stellar populations with those from galactic models. However, an efficient, automated solution to this initial detection step still has not been found, meaning that expert visual inspection of data from each star is required to obtain the highest level of detections. Hence, to mimic how an expert eye analyzes the data, we use supervised deep learning to not only detect oscillations in red giants, but also to predict the location of the frequency at maximum power, νmax, by observing features in 2D images of power spectra. By training on Kepler data, we benchmark our deep-learning classifier against K2 data that are given detections by the expert eye, achieving a detection accuracy of 98% on K2 Campaign 6 stars and a detection accuracy of 99% on K2 Campaign 3 stars. We further find that the estimated uncertainty of our deep-learning-based νmax predictions is about 5%. This is comparable to human-level performance using visual inspection. When examining outliers, we find that the deep-learning results are more likely to provide robust νmax estimates than the classical model-fitting method.
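A small 2D CNN of the general kind described, taking an image of a power spectrum and outputting an oscillation-detection probability, might be sketched as follows. The architecture, image size, and training settings are assumptions, not the authors' published network.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_detector(input_shape=(128, 128, 1)):
    """Binary classifier: does this power-spectrum image show solar-like oscillations?"""
    model = models.Sequential([
        layers.Conv2D(16, 3, activation="relu", input_shape=input_shape),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(1, activation="sigmoid"),   # detection probability
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model

# A second network with a linear output unit, trained on the same images,
# could regress nu_max for stars flagged as detections.
```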
Feng, Xiaoyan; Deng, Chunhui; Gao, Mingxia; Zhang, Xiangmin
2018-01-01
Protein glycosylation is one of the most important post-translational modifications. Also, efficient enrichment and separation of glycopeptides from complex samples are crucial for the thorough analysis of glycosylation. Developing novel hydrophilic materials with facile and easily popularized synthesis is an urgent need in large-scale glycoproteomics research. Herein, for the first time, a one-step functionalization strategy based on metal-organic coordination was proposed and Fe3O4 nanoparticles were directly functionalized with zwitterionic hydrophilic L-cysteine (L-Cys), greatly simplifying the synthetic procedure. The easily synthesized Fe3O4/L-Cys possessed excellent hydrophilicity and brief composition, contributing to affinity for glycopeptides and reduction in nonspecific interaction. Thus, Fe3O4/L-Cys nanoparticles showed outstanding sensitivity (25 amol/μL), high selectivity (mixture of bovine serum albumin and horseradish peroxidase tryptic digests at a mass ratio of 100:1), good reusability (five repeated times), and stability (room temperature storage of 1 month). Encouragingly, in the glycosylation analysis of human serum, a total of 376 glycopeptides with 393 N-glycosylation sites corresponding to 118 glycoproteins were identified after enrichment with Fe3O4/L-Cys, which was superior to previously reported L-Cys modified magnetic materials. Furthermore, applying the one-step functionalization strategy, cysteamine and glutathione respectively direct-functionalized Fe3O4 nanoparticles were successfully synthesized and also achieved efficient glycopeptide enrichment in human serum. The results indicated that we have presented an efficient and easily popularized strategy in glycoproteomics as well as in the synthesis of novel materials. Graphical abstract: Fe3O4/L-Cys nanoparticles based on one-step functionalization for highly efficient enrichment of glycopeptides.
A Student Synthesis of the Housefly Sex Attractant.
ERIC Educational Resources Information Center
Cormier, Russell; And Others
1979-01-01
A novel and efficient (34 percent overall) multi-step synthesis of the housefly sex attractant, muscalure, is described. Each of the steps involves types of reactions with which the undergraduate student would be familiar after one-and-one-half semesters of organic chemistry. (BB)
CT liver volumetry using geodesic active contour segmentation with a level-set algorithm
NASA Astrophysics Data System (ADS)
Suzuki, Kenji; Epstein, Mark L.; Kohlbrenner, Ryan; Obajuluwa, Ademola; Xu, Jianwu; Hori, Masatoshi; Baron, Richard
2010-03-01
Automatic liver segmentation on CT images is challenging because the liver often abuts other organs of a similar density. Our purpose was to develop an accurate automated liver segmentation scheme for measuring liver volumes. We developed an automated volumetry scheme for the liver in CT based on a five-step scheme. First, an anisotropic smoothing filter was applied to portal-venous phase CT images to remove noise while preserving the liver structure, followed by an edge enhancer to enhance the liver boundary. By using the boundary-enhanced image as a speed function, a fast-marching algorithm generated an initial surface that roughly estimated the liver shape. A geodesic-active-contour segmentation algorithm coupled with level-set contour evolution refined the initial surface so as to more precisely fit the liver boundary. The liver volume was calculated based on the refined liver surface. Hepatic CT scans of eighteen prospective liver donors were obtained under a liver transplant protocol with a multi-detector CT system. Automated liver volumes obtained were compared with those manually traced by a radiologist, used as the "gold standard." The mean liver volume obtained with our scheme was 1,520 cc, whereas the mean manual volume was 1,486 cc, with a mean absolute difference of 104 cc (7.0%). CT liver volumetrics based on an automated scheme agreed excellently with "gold-standard" manual volumetrics (intra-class correlation coefficient was 0.95) with no statistically significant difference (p(F<=f)=0.32), and required substantially less completion time. Our automated scheme provides an efficient and accurate way of measuring liver volumes.
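The contour-refinement stage can be approximated with scikit-image's morphological geodesic active contour, as in the sketch below. Gaussian smoothing stands in for the paper's anisotropic filter, a circular seed stands in for the fast-marching initialization, and the iteration counts and parameters are assumptions.

```python
import numpy as np
from skimage.filters import gaussian
from skimage.segmentation import (inverse_gaussian_gradient,
                                  morphological_geodesic_active_contour)

def segment_liver_slice(ct_slice, seed_rc, radius=30):
    """Refine a rough liver estimate on one portal-venous CT slice."""
    smoothed = gaussian(ct_slice.astype(float), sigma=2)      # stand-in for anisotropic smoothing
    speed = inverse_gaussian_gradient(smoothed)                # small values near strong boundaries
    # Circular initial level set around a seed point inside the liver.
    rr, cc = np.ogrid[:ct_slice.shape[0], :ct_slice.shape[1]]
    init = ((rr - seed_rc[0]) ** 2 + (cc - seed_rc[1]) ** 2) < radius ** 2
    mask = morphological_geodesic_active_contour(
        speed, 200, init_level_set=init, smoothing=2, balloon=1)
    return mask

# The liver volume then follows from summing mask voxels over all slices
# and multiplying by the voxel volume.
```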
Java Tool Framework for Automation of Hardware Commissioning and Maintenance Procedures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ho, J C; Fisher, J M; Gordon, J B
2007-10-02
The National Ignition Facility (NIF) is a 192-beam laser system designed to study high energy density physics. Each beam line contains a variety of line replaceable units (LRUs) that contain optics, stepping motors, sensors and other devices to control and diagnose the laser. During commissioning and subsequent maintenance of the laser, LRUs undergo a qualification process using the Integrated Computer Control System (ICCS) to verify and calibrate the equipment. The commissioning processes are both repetitive and tedious when we use remote manual computer controls, making them ideal candidates for software automation. Maintenance and Commissioning Tool (MCT) software was developed to improve the efficiency of the qualification process. The tools are implemented in Java, leveraging ICCS services and CORBA to communicate with the control devices. The framework provides easy-to-use mechanisms for handling configuration data, task execution, task progress reporting, and generation of commissioning test reports. The tool framework design and application examples will be discussed.
Securing Color Fidelity in 3D Architectural Heritage Scenarios.
Gaiani, Marco; Apollonio, Fabrizio Ivan; Ballabeni, Andrea; Remondino, Fabio
2017-10-25
Ensuring color fidelity in image-based 3D modeling of heritage scenarios is nowadays still an open research matter. Image colors are important during the data processing as they affect algorithm outcomes, therefore their correct treatment, reduction and enhancement is fundamental. In this contribution, we present an automated solution developed to improve the radiometric quality of image datasets and the performance of two main steps of the photogrammetric pipeline (camera orientation and dense image matching). The suggested solution aims to achieve a robust automatic color balance and exposure equalization, stability of the RGB-to-gray image conversion and faithful color appearance of a digitized artifact. The innovative aspects of the article are: complete automation, better color target detection, a MATLAB implementation of the ACR scripts created by Fraser and the use of a specific weighted polynomial regression. A series of tests are presented to demonstrate the efficiency of the developed methodology and to evaluate color accuracy ('color characterization').
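The weighted polynomial regression idea can be illustrated per channel as below: fit a polynomial from measured color-target patch values to reference values, with weights emphasizing selected patches. The degree, weighting scheme, and patch data are assumptions, not the authors' exact regression.

```python
import numpy as np

def fit_channel_polynomial(measured, reference, weights, degree=2):
    """Weighted least-squares polynomial mapping measured -> reference
    intensities for one channel, e.g. from color-checker patches."""
    return np.polyfit(measured, reference, degree, w=weights)

def apply_correction(image_channel, coeffs):
    """Apply the fitted polynomial to every pixel of one channel."""
    return np.polyval(coeffs, image_channel)

# Illustrative 24-patch target data with extra weight on mid-tone patches.
measured = np.linspace(0.05, 0.95, 24)
reference = measured ** 1.1                      # placeholder reference values
weights = np.where((reference > 0.2) & (reference < 0.8), 2.0, 1.0)

coeffs = fit_channel_polynomial(measured, reference, weights)
corrected = apply_correction(measured, coeffs)
```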
Schmidtgall, Boris; Höbartner, Claudia; Ducho, Christian
2015-01-01
Modifications of the nucleic acid backbone are essential for the development of oligonucleotide-derived bioactive agents. The NAA-modification represents a novel artificial internucleotide linkage which enables the site-specific introduction of positive charges into the otherwise polyanionic backbone of DNA oligonucleotides. Following initial studies with the introduction of the NAA-linkage at T-T sites, it is now envisioned to prepare NAA-modified oligonucleotides bearing the modification at X-T motifs (X = A, C, G). We have therefore developed the efficient and stereoselective synthesis of NAA-linked 'dimeric' A-T phosphoramidite building blocks for automated DNA synthesis. Both the (S)- and the (R)-configured NAA-motifs were constructed with high diastereoselectivities to furnish two different phosphoramidite reagents, which were employed for the solid phase-supported automated synthesis of two NAA-modified DNA oligonucleotides. This represents a significant step to further establish the NAA-linkage as a useful addition to the existing 'toolbox' of backbone modifications for the design of bioactive oligonucleotide analogues.
Thielmann, Yvonne; Koepke, Juergen; Michel, Hartmut
2012-06-01
Structure determination of membrane proteins and membrane protein complexes is still a very challenging field. To facilitate the work on membrane proteins the Core Centre follows a strategy that comprises four labs of protein analytics and crystal handling, covering mass spectrometry, calorimetry, crystallization and X-ray diffraction. This general workflow is presented and a capacity of 20% of the operating time of all systems is provided to the European structural biology community within the ESFRI Instruct program. A description of the crystallization service offered at the Core Centre is given with detailed information on screening strategy, screens used and changes to adapt high throughput for membrane proteins. Our aim is to constantly develop the Core Centre towards the usage of more efficient methods. This strategy might also include the ability to automate all steps from crystallization trials to crystal screening; here we look ahead how this aim might be realized at the Core Centre.
Oosterwijk, J C; Knepflé, C F; Mesker, W E; Vrolijk, H; Sloos, W C; Pattenier, H; Ravkin, I; van Ommen, G J; Kanhai, H H; Tanke, H J
1998-01-01
This article explores the feasibility of the use of automated microscopy and image analysis to detect the presence of rare fetal nucleated red blood cells (NRBCs) circulating in maternal blood. The rationales for enrichment and for automated image analysis for "rare-event" detection are reviewed. We also describe the application of automated image analysis to 42 maternal blood samples, using a protocol consisting of one-step enrichment followed by immunocytochemical staining for fetal hemoglobin (HbF) and FISH for X- and Y-chromosomal sequences. Automated image analysis consisted of multimode microscopy and subsequent visual evaluation of image memories containing the selected objects. The FISH results were compared with the results of conventional karyotyping of the chorionic villi. By use of manual screening, 43% of the slides were found to be positive (>=1 NRBC), with a mean number of 11 NRBCs (range 1-40). By automated microscopy, 52% were positive, with on average 17 NRBCs (range 1-111). There was a good correlation between both manual and automated screening, but the NRBC yield from automated image analysis was found to be superior to that from manual screening (P=.0443), particularly when the NRBC count was >15. Seven (64%) of 11 XY fetuses were correctly diagnosed by FISH analysis of automatically detected cells, and all discrepancies were restricted to the lower cell-count range. We believe that automated microscopy and image analysis reduce the screening workload, are more sensitive than manual evaluation, and can be used to detect rare HbF-containing NRBCs in maternal blood. PMID:9837832
ERIC Educational Resources Information Center
Colom, Roberto; Stein, Jason L.; Rajagopalan, Priya; Martinez, Kenia; Hermel, David; Wang, Yalin; Alvarez-Linera, Juan; Burgaleta, Miguel; Quiroga, Ma. Angeles; Shih, Pei Chun; Thompson, Paul M.
2013-01-01
Here we apply a method for automated segmentation of the hippocampus in 3D high-resolution structural brain MRI scans. One hundred and four healthy young adults completed twenty one tasks measuring abstract, verbal, and spatial intelligence, along with working memory, executive control, attention, and processing speed. After permutation tests…
The Effect of Computer Automation on Institutional Review Board (IRB) Office Efficiency
ERIC Educational Resources Information Center
Oder, Karl; Pittman, Stephanie
2015-01-01
Companies purchase computer systems to make their processes more efficient through automation. Some academic medical centers (AMC) have purchased computer systems for their institutional review boards (IRB) to increase efficiency and compliance with regulations. IRB computer systems are expensive to purchase, deploy, and maintain. An AMC should…
Niaksu, Olegas; Zaptorius, Jonas
2014-01-01
This paper presents a methodology for creating a performance-related remuneration system in the healthcare sector that would meet requirements for efficiency and sustainable quality of healthcare services. A methodology for performance indicator selection, ranking, and a posteriori evaluation is proposed and discussed. The Priority Distribution Method is applied for unbiased weighting of performance criteria, and data mining methods are proposed to monitor and evaluate the results of the motivation system. We developed a method for healthcare-specific criteria selection consisting of 8 steps and demonstrated the application of the Priority Distribution Method for weighting the selected criteria. Moreover, a set of data mining methods for evaluating the outcomes of the motivational system was proposed. The described methodology for calculating performance-related payment still needs practical approbation. We plan to develop semi-automated tools for monitoring institutional and personal performance indicators; the final step would be approbation of the methodology in a healthcare facility.
Vessel extraction in retinal images using automatic thresholding and Gabor Wavelet.
Ali, Aziah; Hussain, Aini; Wan Zaki, Wan Mimi Diyana
2017-07-01
Retinal image analysis has been widely used for early detection and diagnosis of multiple systemic diseases. Accurate vessel extraction in retinal images is a crucial step towards a fully automated diagnosis system. This work presents an efficient unsupervised method for extracting blood vessels from retinal images by combining the existing Gabor Wavelet (GW) method with automatic thresholding. The green channel image is extracted from the color retinal image and used to produce a Gabor feature image using GW. Both the green channel image and the Gabor feature image undergo a vessel-enhancement step to highlight blood vessels. Next, the two vessel-enhanced images are converted to binary images using automatic thresholding before being combined to produce the final vessel output. Combining the images results in a significant improvement in blood vessel extraction performance compared with using either image alone. The effectiveness of the proposed method was demonstrated via comparative analysis with existing methods, validated on the publicly available DRIVE database.
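In the same spirit, the sketch below combines a multi-orientation Gabor response with Otsu automatic thresholding of both the Gabor feature image and the (inverted) green channel. The frequency, number of orientations, and the channel inversion are assumptions, not the authors' parameters.

```python
import numpy as np
from skimage.filters import gabor, threshold_otsu

def extract_vessels(green_channel):
    """green_channel: 2D float array taken from the color fundus image."""
    inverted = 1.0 - green_channel            # vessels appear bright after inversion
    # Maximum Gabor magnitude over several orientations highlights line-like vessels.
    responses = []
    for theta in np.linspace(0, np.pi, 8, endpoint=False):
        real, imag = gabor(inverted, frequency=0.15, theta=theta)
        responses.append(np.hypot(real, imag))
    gabor_feature = np.max(responses, axis=0)

    # Automatic (Otsu) thresholding of both images, then combine.
    mask_gabor = gabor_feature > threshold_otsu(gabor_feature)
    mask_green = inverted > threshold_otsu(inverted)
    return mask_gabor | mask_green
```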
Automation on the generation of genome-scale metabolic models.
Reyes, R; Gamermann, D; Montagud, A; Fuente, D; Triana, J; Urchueguía, J F; de Córdoba, P Fernández
2012-12-01
Nowadays, the reconstruction of genome-scale metabolic models is a non-automated, interactive process based on decision making. This lengthy process usually requires a full year of one person's work to satisfactorily collect, analyze, and validate the list of all metabolic reactions present in a specific organism. To compile this list, one has to go through a huge amount of genomic, metabolomic, and physiological information manually. Currently, there is no optimal algorithm that automatically goes through all this information and generates the models while taking into account the probabilistic criteria of uniqueness and completeness that a biologist would consider. This work presents the automation of a methodology for the reconstruction of genome-scale metabolic models for any organism. The methodology is the automated version of the steps previously carried out manually for the reconstruction of the genome-scale metabolic model of a photosynthetic organism, Synechocystis sp. PCC6803. The reconstruction steps are implemented in a computational platform (COPABI) that generates the models from the probabilistic algorithms that have been developed. To validate the robustness of the developed algorithm, the metabolic models of several organisms generated by the platform were studied together with published models that had been manually curated. Network properties of the models, such as connectivity and average shortest path length, were compared and analyzed.
Automated extraction and validation of children's gait parameters with the Kinect.
Motiian, Saeid; Pergami, Paola; Guffey, Keegan; Mancinelli, Corrie A; Doretto, Gianfranco
2015-12-02
Gait analysis for therapy regimen prescription and monitoring requires patients to physically access clinics with specialized equipment. The timely availability of such infrastructure at the right frequency is especially important for small children. Besides being very costly, this is a challenge for many children living in rural areas. This work therefore develops a low-cost, portable, and automated approach for in-home gait analysis, based on the Microsoft Kinect. A robust and efficient method for extracting gait parameters is introduced, which copes with the high variability of noisy Kinect skeleton tracking data experienced across the population of young children. This is achieved by temporally segmenting the data with an approach based on coupling a probabilistic matching of stride template models, learned offline, with the estimation of their global and local temporal scaling. A preliminary study of healthy children between 2 and 4 years of age was performed to analyze the accuracy, precision, repeatability, and concurrent validity of the proposed method against the GAITRite when measuring several spatial and temporal children's gait parameters. The method has excellent accuracy and good precision in segmenting temporal sequences of body joint locations into stride and step cycles. Also, the spatial and temporal gait parameters, estimated automatically, exhibit good concurrent validity with those provided by the GAITRite, as well as very good repeatability. In particular, on a range of nine gait parameters, the relative and absolute agreements were found to be good and excellent, and the overall agreements were found to be good and moderate. This work enables and validates the automated use of the Kinect for children's gait analysis in healthy subjects. In particular, the approach makes a step forward towards developing a low-cost, portable, parent-operated in-home tool for clinicians assisting young children.
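A pared-down stand-in for the template matching with temporal scaling described above is sketched below: a learned stride template is resampled at several temporal scales and slid over a joint-trajectory signal, keeping the best normalized correlation. The scales, window handling, and signal choice are assumptions, and the probabilistic machinery of the paper is not reproduced.

```python
import numpy as np

def best_match(signal, template, scales=(0.8, 0.9, 1.0, 1.1, 1.2)):
    """Return (start_index, scale, score) of the best-matching stride.
    'signal' is e.g. an ankle-joint coordinate over time from the Kinect."""
    best = (0, 1.0, -np.inf)
    for s in scales:
        length = max(4, int(round(len(template) * s)))
        # Resample the template to the current temporal scale.
        t = np.interp(np.linspace(0, len(template) - 1, length),
                      np.arange(len(template)), template)
        t = (t - t.mean()) / (t.std() + 1e-9)
        for start in range(len(signal) - length):
            w = signal[start:start + length]
            w = (w - w.mean()) / (w.std() + 1e-9)
            score = float(np.dot(w, t) / length)   # normalized correlation
            if score > best[2]:
                best = (start, s, score)
    return best
```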
Software for Automated Reading of STEP Files by I-DEAS(trademark)
NASA Technical Reports Server (NTRS)
Pinedo, John
2003-01-01
A program called "readstep" enables the I-DEAS(tm) computer-aided-design (CAD) software to automatically read Standard for the Exchange of Product Model Data (STEP) files. (The STEP format is one of several used to transfer data between dissimilar CAD programs.) Prior to the development of "readstep," it was necessary to read STEP files into I-DEAS(tm) one at a time in a slow process that required repeated intervention by the user. In operation, "readstep" prompts the user for the location of the desired STEP files and the names of the I-DEAS(tm) project and model file, then generates an I-DEAS(tm) program file called "readstep.prg" and two Unix shell programs called "runner" and "controller." The program "runner" runs I-DEAS(tm) sessions that execute readstep.prg, while "controller" controls the execution of "runner" and edits readstep.prg if necessary. The user sets "runner" and "controller" into execution simultaneously, and then no further intervention by the user is required. When "runner" has finished, the user should see only parts from successfully read STEP files present in the model file. STEP files that could not be read successfully (e.g., because of format errors) should be regenerated before attempting to read them again.
Islam, Asef; Oldham, Michael J; Wexler, Anthony S
2017-11-01
Mammalian lungs are comprised of large numbers of tracheobronchial airways that transition from the trachea to alveoli. Studies as wide ranging as pollutant deposition and lung development rely on accurate characterization of these airways. Advancements in CT imaging and the value of computational approaches in eliminating the burden of manual measurement are providing increased efficiency in obtaining this geometric data. In this study, we compare an automated method to a manual one for the first six generations of three Balb/c mouse lungs. We find good agreement between manual and automated methods and that much of the disagreement can be attributed to method precision. Using the automated method, we then provide anatomical data for the entire tracheobronchial airway tree from three Balb/c mice. Anat Rec, 300:2046-2057, 2017. © 2017 Wiley Periodicals, Inc.
Information Handling in Selected Academic Libraries of the Caribbean.
ERIC Educational Resources Information Center
Rodriguez, Ketty
1988-01-01
Describes a survey that examined the extent of library technical processes automation within academic libraries at 10 Caribbean universities. Existing conditions, steps in progress, and plans for future automation are discussed. (8 references) (CLB)
Assessing emissions impacts of automated vehicles
DOT National Transportation Integrated Search
2016-06-20
With their potential for transforming surface transportation, understanding the impacts and benefits of automated vehicles (AVs) with regards to safety, mobility, energy and the environment is a necessary first step for informing policy to aid the su...
NASA Astrophysics Data System (ADS)
Santhana Vannan, S. K.; Ramachandran, R.; Deb, D.; Beaty, T.; Wright, D.
2017-12-01
This paper summarizes the workflow challenges of curating and publishing data produced from disparate data sources and provides a generalized workflow solution to efficiently archive data generated by researchers. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) for biogeochemical dynamics and the Global Hydrology Resource Center (GHRC) DAAC have been collaborating on the development of a generalized workflow solution to efficiently manage the data publication process. The generalized workflow presented here is built on lessons learned from implementations of the workflow system. Data publication consists of the following steps: accepting the data package from the data providers and ensuring the full integrity of the data files; identifying and addressing data quality issues; assembling standardized, detailed metadata and documentation, including file-level details, processing methodology, and characteristics of the data files; setting up data access mechanisms; setting up the data in data tools and services for improved data dissemination and user experience; registering the dataset in online search and discovery catalogues; and preserving the data location through Digital Object Identifiers (DOIs). We will describe the steps taken to automate the above process and realize efficiencies. The goals of the workflow system are to reduce the time taken to publish a dataset, to increase the quality of documentation and metadata, and to track individual datasets through the data curation process. Utilities developed to achieve these goals will be described. We will also share the metrics-driven value of the workflow system and discuss future steps towards the creation of a common software framework.
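One of the most readily automated steps in such a workflow is file-integrity verification against provider-supplied checksums; a minimal sketch is given below. The manifest name, its line format, and the directory layout are assumptions, not the ORNL DAAC/GHRC implementation.

```python
import hashlib
import pathlib

def verify_package(package_dir, manifest="manifest.sha256"):
    """Compare SHA-256 checksums of delivered files with a provider manifest.
    Manifest lines are assumed to look like '<hexdigest>  <relative path>'."""
    root = pathlib.Path(package_dir)
    failures = []
    for line in (root / manifest).read_text().splitlines():
        expected, name = line.split(maxsplit=1)
        digest = hashlib.sha256((root / name).read_bytes()).hexdigest()
        if digest != expected:
            failures.append(name)
    return failures   # an empty list means the package passed the integrity step
```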
Vieira, J; Cunha, M C
2011-01-01
This article describes a two-step method for solving large nonlinear problems. The two-step solution approach takes advantage of handling smaller and simpler models and of having better starting points to improve solution efficiency. The set of nonlinear constraints (termed complicating constraints) that makes the solution of the model complex and time-consuming is left out of step one; the complicating constraints are added only in the second step, so that a solution of the complete model is then found. The solution method is applied to a large-scale problem of conjunctive use of surface water and groundwater resources. The results obtained are compared with solutions determined by solving the complete model directly in a single step. In all examples the two-step solution approach allowed a significant reduction of the computation time. This gain in efficiency can be extremely important for work in progress, and it can be particularly useful in cases where computation time is a critical factor for obtaining an optimized solution in due time.
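The two-step idea can be demonstrated on a toy problem: solve the model without the complicating nonlinear constraint, then re-solve the complete model warm-started from that solution. The objective and constraints below are illustrative only, not the water-resources model of the article.

```python
import numpy as np
from scipy.optimize import minimize

# Toy objective standing in for the full planning model.
def cost(x):
    return (x[0] - 3) ** 2 + (x[1] - 2) ** 2

simple_constraints = [{"type": "ineq", "fun": lambda x: 4 - (x[0] + x[1])}]
# "Complicating" nonlinear constraint, left out of step one.
complicating = [{"type": "ineq", "fun": lambda x: 2.5 - x[0] * x[1]}]

# Step one: smaller, simpler model.
step1 = minimize(cost, x0=np.zeros(2), constraints=simple_constraints, method="SLSQP")

# Step two: complete model, warm-started from the step-one solution.
step2 = minimize(cost, x0=step1.x,
                 constraints=simple_constraints + complicating, method="SLSQP")
print(step1.x, step2.x)
```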
Single nucleotide polymorphisms and haplotypes associated with feed efficiency in beef cattle
2013-01-01
Background: General, breed- and diet-dependent associations between feed efficiency in beef cattle and single nucleotide polymorphisms (SNPs) or haplotypes were identified in a population of 1321 steers using a 50 K SNP panel. Genomic associations with traditional two-step indicators of feed efficiency – residual feed intake (RFI), residual average daily gain (RADG), and residual intake gain (RIG) – were compared to associations with two complementary one-step indicators of feed efficiency: efficiency of intake (EI) and efficiency of gain (EG). Associations uncovered in a training data set were evaluated on an independent validation data set. A multi-SNP model was developed to predict feed efficiency. Functional analysis of genes harboring SNPs significantly associated with feed efficiency and network visualization aided in the interpretation of the results. Results: For the five feed efficiency indicators, the numbers of general, breed-dependent, and diet-dependent associations with SNPs (P-value < 0.0001) were 31, 40, and 25, and with haplotypes were six, ten, and nine, respectively. Of these, 20 SNP and six haplotype associations overlapped between RFI and EI, and five SNP and one haplotype associations overlapped between RADG and EG. This result confirms the complementary value of the one- and two-step indicators. The multi-SNP models included 89 SNPs and offered a precise prediction of the five feed efficiency indicators. The associations of 17 SNPs and 7 haplotypes with feed efficiency were confirmed on the validation data set. Nine clusters of Gene Ontology and KEGG pathway categories (mean P-value < 0.001), including nucleotide binding, ion transport, phosphorous metabolic process, and the MAPK signaling pathway, were overrepresented among the genes harboring the SNPs associated with feed efficiency. Conclusions: The general SNP associations suggest that a single panel of genomic variants can be used regardless of breed and diet. The breed- and diet-dependent associations between SNPs and feed efficiency suggest that further refinement of variant panels requires consideration of the breed and management practices. The unique genomic variants associated with the one- and two-step indicators suggest that both types of indicators offer complementary descriptions of feed efficiency that can be exploited for genome-enabled selection purposes. PMID:24066663
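The abstract does not state which model class underlies the multi-SNP prediction, so the sketch below uses penalized (elastic net) regression purely as an illustration of fitting many SNP effects jointly; the genotype coding (0/1/2), the synthetic data, and the RFI phenotype are assumptions.

```python
import numpy as np
from sklearn.linear_model import ElasticNetCV
from sklearn.model_selection import train_test_split

# genotypes: (n_animals, n_snps) matrix coded 0/1/2; rfi: residual feed intake
rng = np.random.default_rng(0)
genotypes = rng.integers(0, 3, size=(1321, 89)).astype(float)      # illustrative only
rfi = genotypes[:, :5] @ rng.normal(size=5) + rng.normal(size=1321)

X_train, X_test, y_train, y_test = train_test_split(
    genotypes, rfi, test_size=0.25, random_state=0)
model = ElasticNetCV(cv=5).fit(X_train, y_train)     # joint shrinkage of SNP effects
print("validation R^2:", model.score(X_test, y_test))
```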
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lekov, Alex; Thompson, Lisa; McKane, Aimee
2009-05-11
This report summarizes the Lawrence Berkeley National Laboratory's research to date in characterizing energy efficiency and open automated demand response opportunities for industrial refrigerated warehouses in California. The report describes refrigerated warehouses characteristics, energy use and demand, and control systems. It also discusses energy efficiency and open automated demand response opportunities and provides analysis results from three demand response studies. In addition, several energy efficiency, load management, and demand response case studies are provided for refrigerated warehouses. This study shows that refrigerated warehouses can be excellent candidates for open automated demand response and that facilities which have implemented energy efficiency measures and have centralized control systems are well-suited to shift or shed electrical loads in response to financial incentives, utility bill savings, and/or opportunities to enhance reliability of service. Control technologies installed for energy efficiency and load management purposes can often be adapted for open automated demand response (OpenADR) at little additional cost. These improved controls may prepare facilities to be more receptive to OpenADR due to both increased confidence in the opportunities for controlling energy cost/use and access to the real-time data.
Gemi: PCR Primers Prediction from Multiple Alignments
Sobhy, Haitham; Colson, Philippe
2012-01-01
Designing primers and probes for polymerase chain reaction (PCR) is a preliminary and critical step that requires the identification of highly conserved regions in a given set of sequences. This task can be challenging if the targeted sequences display a high level of diversity, as frequently encountered in microbiologic studies. We developed Gemi, an automated, fast, and easy-to-use bioinformatics tool with a user-friendly interface to design primers and probes based on multiple aligned sequences. This tool can be used for the purpose of real-time and conventional PCR and can deal efficiently with large sets of sequences of a large size. PMID:23316117
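The core idea behind such tools, scoring windows of a multiple alignment by conservation to shortlist primer candidates, can be sketched as below. The window length, the identity cutoff, and the scoring scheme are assumptions, and this is not Gemi's algorithm.

```python
from collections import Counter

def conserved_windows(aligned_seqs, window=20, min_identity=0.95):
    """aligned_seqs: equal-length, gapped sequences from a multiple alignment.
    Returns (start, consensus) pairs whose mean column identity >= min_identity."""
    length = len(aligned_seqs[0])
    n = len(aligned_seqs)
    hits = []
    for start in range(length - window + 1):
        consensus, score = [], 0.0
        for col in range(start, start + window):
            counts = Counter(seq[col] for seq in aligned_seqs)
            base, freq = counts.most_common(1)[0]
            consensus.append(base)
            score += freq / n if base != "-" else 0.0   # gaps score zero
        if score / window >= min_identity:
            hits.append((start, "".join(consensus)))
    return hits

# Windows returned here would still need the usual primer checks
# (melting temperature, GC content, self-complementarity) before use.
```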
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shaw, L. A.; Chizari, S.; Panas, R. M.
The aim of this research is to demonstrate a holographically driven photopolymerization process for joining colloidal particles to create planar microstructures fixed to a substrate, which can be monitored with real-time measurement. Holographic optical tweezers (HOT) have been used to arrange arrays of microparticles prior to this work; here we introduce a new photopolymerization process for rapidly joining simultaneously handled microspheres in a plane. Additionally, we demonstrate a new process control technique for efficiently identifying when particles have been successfully joined by measuring a sufficient reduction in the particles' Brownian motion. Furthermore, this technique and our demonstrated joining approach enable HOT technology to take critical steps toward automated additive fabrication of microstructures.
Functional-to-form mapping for assembly design automation
NASA Astrophysics Data System (ADS)
Xu, Z. G.; Liu, W. M.; Shen, W. D.; Yang, D. Y.; Liu, T. T.
2017-11-01
Assembly-level function-to-form mapping is the most effective procedure towards design automation. The research work mainly includes the assembly-level function definitions, the product network model, and the two-step mapping mechanisms. The function-to-form mapping is divided into two steps: the first-step mapping from function to behavior, and the second-step mapping from behavior to structure. After the first-step mapping, the three-dimensional transmission chain (or 3D sketch) is studied, and feasible design computing tools are developed. The mapping procedure is relatively easy to implement interactively but quite difficult to complete automatically, so manual, semi-automatic, automatic, and interactive modification of the mapping model are studied. A mechanical-hand function-to-form (F-F) mapping process is illustrated to verify the design methodologies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gardner, S; Gulam, M; Song, K
2014-06-01
Purpose: The Varian EDGE machine is a new stereotactic platform, combining Calypso and VisionRT localization systems with a stereotactic linac. The system includes TrueBeam DeveloperMode, making possible the use of XML-scripting for automation of linac-related tasks. This study details the use of DeveloperMode to automate commissioning tasks for Varian EDGE, thereby improving efficiency and measurement consistency. Methods: XML-scripting was used for various commissioning tasks, including couch model verification, beam-scanning, and isocenter verification. For couch measurements, point measurements were acquired for several field sizes (2×2, 4×4, 10×10 cm²) at 42 gantry angles for two couch-models. Measurements were acquired with variations in couch position (rails in/out, couch shifted in each of the motion axes) compared to treatment planning system (TPS)-calculated values, which were logged automatically through advanced planning interface (API) scripting functionality. For beam scanning, XML-scripts were used to create custom MLC-apertures. For isocenter verification, XML-scripts were used to automate various Winston-Lutz-type tests. Results: For couch measurements, the time required for each set of angles was approximately 9 minutes. Without scripting, each set required approximately 12 minutes. Automated measurements required only one physicist, while manual measurements required at least two physicists to handle linac positions/beams and data recording. MLC apertures were generated outside of the TPS, and with the .xml file format, double-checking without use of TPS/operator console was possible. Similar time efficiency gains were found for isocenter verification measurements. Conclusion: The use of XML scripting in TrueBeam DeveloperMode allows for efficient and accurate data acquisition during commissioning. The efficiency improvement is most pronounced for iterative measurements, exemplified by the time savings for couch modeling measurements (approximately 10 hours). The scripting also allowed for creation of the files in advance without requiring access to TPS. The API scripting functionality enabled efficient creation/mining of TPS data. Finally, automation reduces the potential for human error in entering linac values at the machine console, and the script provides a log of measurements acquired for each session. This research was supported in part by a grant from Varian Medical Systems, Palo Alto, CA.
Computer-controlled attenuator.
Mitov, D; Grozev, Z
1991-01-01
Various possibilities for applying electronic computer-controlled attenuators to the automation of physiological experiments are considered. A detailed description is given of the design of a 4-channel computer-controlled attenuator in which the output signal changes by a linear step in two of the channels and by a logarithmic step in the other two. This, together with additional programmable timers, makes it possible to automate a wide range of studies in different spheres of physiology and psychophysics, including vision and hearing.
Lesion Border Detection in Dermoscopy Images
Celebi, M. Emre; Schaefer, Gerald; Iyatomi, Hitoshi; Stoecker, William V.
2009-01-01
Background Dermoscopy is one of the major imaging modalities used in the diagnosis of melanoma and other pigmented skin lesions. Due to the difficulty and subjectivity of human interpretation, computerized analysis of dermoscopy images has become an important research area. One of the most important steps in dermoscopy image analysis is the automated detection of lesion borders. Methods In this article, we present a systematic overview of the recent border detection methods in the literature paying particular attention to computational issues and evaluation aspects. Conclusion Common problems with the existing approaches include the acquisition, size, and diagnostic distribution of the test image set, the evaluation of the results, and the inadequate description of the employed methods. Border determination by dermatologists appears to depend upon higher-level knowledge, therefore it is likely that the incorporation of domain knowledge in automated methods will enable them to perform better, especially in sets of images with a variety of diagnoses. PMID:19121917
Revealing biological information using data structuring and automated learning.
Mohorianu, Irina; Moulton, Vincent
2010-11-01
The intermediary steps between a biological hypothesis, concretized in the input data, and meaningful results, validated using biological experiments, commonly employ bioinformatics tools. Starting with storage of the data and ending with a statistical analysis of the significance of the results, every step in a bioinformatics analysis has been intensively studied and the resulting methods and models patented. This review summarizes the bioinformatics patents that have been developed mainly for the study of genes, and points out the universal applicability of bioinformatics methods to other related studies such as RNA interference. More specifically, we overview the steps undertaken in the majority of bioinformatics analyses, highlighting, for each, various approaches that have been developed to reveal details from different perspectives. First we consider data warehousing, the first task that has to be performed efficiently, optimizing the structure of the database, in order to facilitate both the subsequent steps and the retrieval of information. Next, we review data mining, which occupies the central part of most bioinformatics analyses, presenting patents concerning differential expression, unsupervised and supervised learning. Last, we discuss how networks of interactions of genes or other players in the cell may be created, which help draw biological conclusions and have been described in several patents.
One-Class Classification-Based Real-Time Activity Error Detection in Smart Homes.
Das, Barnan; Cook, Diane J; Krishnan, Narayanan C; Schmitter-Edgecombe, Maureen
2016-08-01
Caring for individuals with dementia is frequently associated with extreme physical and emotional stress, which often leads to depression. Smart home technology and advances in machine learning techniques can provide innovative solutions to reduce caregiver burden. One key service that caregivers provide is prompting individuals with memory limitations to initiate and complete daily activities. We hypothesize that sensor technologies combined with machine learning techniques can automate the process of providing reminder-based interventions. The first step towards automated interventions is to detect when an individual faces difficulty with activities. We propose machine learning approaches based on one-class classification that learn normal activity patterns. When we apply these classifiers to activity patterns that were not seen before, the classifiers are able to detect activity errors, which represent potential prompt situations. We validate our approaches on smart home sensor data obtained from older adult participants, some of whom faced difficulties performing routine activities and thus committed errors.
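As a rough illustration of the one-class approach described above (not the authors' code), the following Python sketch trains a one-class SVM on features extracted from error-free activities and flags unseen activity instances that deviate from the learned normal patterns; the feature set and parameter values are illustrative assumptions.

```python
# Minimal sketch (not the authors' code): one-class classification of activity
# sensor features with scikit-learn.  Features and thresholds are illustrative.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.svm import OneClassSVM

# Hypothetical per-activity feature vectors (e.g. duration in seconds, number of
# sensor events, rooms visited) extracted from error-free training activities.
rng = np.random.default_rng(0)
normal_activities = rng.normal(loc=[300.0, 40.0, 3.0],
                               scale=[30.0, 5.0, 0.5], size=(200, 3))

scaler = StandardScaler().fit(normal_activities)
model = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale")
model.fit(scaler.transform(normal_activities))

# New, previously unseen activity instances; +1 = consistent with normal
# patterns, -1 = potential activity error (a prompt situation).
new_activities = np.array([[310.0, 42.0, 3.0],    # looks normal
                           [900.0, 12.0, 1.0]])   # unusually long, few events
print(model.predict(scaler.transform(new_activities)))
```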
Camporese, Alessandro
2004-06-01
The diagnosis of infectious diseases and the role of the microbiology laboratory are currently undergoing a process of change. The need for overall efficiency in providing results is now given the same importance as accuracy. This means that laboratories must be able to produce quality results in less time with the capacity to interpret the results clinically. To improve the clinical impact of microbiology results, the new challenge facing the microbiologist has become one of process management instead of pure analysis. A proper project management process designed to improve workflow, reduce analytical time, and provide the same high quality results without losing valuable time treating the patient, has become essential. Our objective was to study the impact of introducing automation and computerization into the microbiology laboratory, and the reorganization of the laboratory workflow, i.e. scheduling personnel to work shifts covering both the entire day and the entire week. In our laboratory, the introduction of automation and computerization, as well as the reorganization of personnel, thus the workflow itself, has resulted in an improvement in response time and greater efficiency in diagnostic procedures.
Achieving and Sustaining Automated Health Data Linkages for Learning Systems: Barriers and Solutions
Van Eaton, Erik G.; Devlin, Allison B.; Devine, Emily Beth; Flum, David R.; Tarczy-Hornoch, Peter
2014-01-01
Introduction: Delivering more appropriate, safer, and highly effective health care is the goal of a learning health care system. The Agency for Healthcare Research and Quality (AHRQ) funded enhanced registry projects: (1) to create and analyze valid data for comparative effectiveness research (CER); and (2) to enhance the ability to monitor and advance clinical quality improvement (QI). This case report describes barriers and solutions from one state-wide enhanced registry project. Methods: The Comparative Effectiveness Research and Translation Network (CERTAIN) deployed the commercially available Amalga Unified Intelligence System™ (Amalga) as a central data repository to enhance an existing QI registry (the Automation Project). An eight-step implementation process included hospital recruitment, technical electronic health record (EHR) review, hospital-specific interface planning, data ingestion, and validation. Data ownership and security protocols were established, along with formal methods to separate data management for QI purposes and research purposes. Sustainability would come from lowered chart review costs and the hospital’s desire to invest in the infrastructure after trying it. Findings: CERTAIN approached 19 hospitals in Washington State operating within 12 unaffiliated health care systems for the Automation Project. Five of the 19 completed all implementation steps. Four hospitals did not participate due to lack of perceived institutional value. Ten hospitals did not participate because their information technology (IT) departments were oversubscribed (e.g., too busy with Meaningful Use upgrades). One organization representing 22 additional hospitals expressed interest, but was unable to overcome data governance barriers in time. Questions about data use for QI versus research were resolved in a widely adopted project framework. Hospitals restricted data delivery to a subset of patients, introducing substantial technical challenges. Overcoming challenges of idiosyncratic EHR implementations required each hospital to devote more IT resources than were predicted. Cost savings did not meet projections because of the increased IT resource requirements and a different source of lowered chart review costs. Discussion: CERTAIN succeeded in recruiting unaffiliated hospitals into the Automation Project to create an enhanced registry to achieve AHRQ goals. This case report describes several distinct barriers to central data aggregation for QI and CER across unaffiliated hospitals: (1) competition for limited on-site IT expertise, (2) concerns about data use for QI versus research, (3) restrictions on data automation to a defined subset of patients, and (4) unpredictable resource needs because of idiosyncrasies among unaffiliated hospitals in how EHR data are coded, stored, and made available for transmission—even between hospitals using the same vendor’s EHR. Therefore, even a fully optimized automation infrastructure would still not achieve complete automation. The Automation Project was unable to align sufficiently with internal hospital objectives, so it could not show a compelling case for sustainability. PMID:25848606
PyGOLD: a python based API for docking based virtual screening workflow generation.
Patel, Hitesh; Brinkjost, Tobias; Koch, Oliver
2017-08-15
Molecular docking is one of the successful approaches in structure-based discovery and development of bioactive molecules in chemical biology and medicinal chemistry. Due to the huge amount of computational time that is still required, docking is often the last step in a virtual screening approach. Such screenings are set up as workflows spanning many steps, each aimed at a different filtering task. These workflows can be largely automated using Python-based toolkits, except for docking with the docking software GOLD. However, within an automated virtual screening workflow it is not feasible to use the GUI between every step to change the GOLD configuration file. Thus, a Python module called PyGOLD was developed to parse, edit and write the GOLD configuration file and to automate docking-based virtual screening workflows. The latest version of PyGOLD, its documentation and example scripts are available at: http://www.ccb.tu-dortmund.de/koch or http://www.agkoch.de. PyGOLD is implemented in Python and can be imported as a standard Python module without any further dependencies. oliver.koch@agkoch.de, oliver.koch@tu-dortmund.de. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
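The sketch below is not the PyGOLD API (which is documented at the links above); it only illustrates, under assumed file and key names, the kind of scripted configuration-file editing that such a module automates between workflow steps.

```python
# Illustrative sketch only -- this is NOT the PyGOLD API.  It shows the kind of
# scripted configuration-file editing the paper automates: read a GOLD-style
# "key = value" text file, change a setting per screening step, and write it
# back.  The file name, keys, and command name below are hypothetical.
from pathlib import Path

def edit_config(path, updates):
    """Rewrite 'key = value' lines whose key appears in *updates*."""
    lines = Path(path).read_text().splitlines()
    out = []
    for line in lines:
        key = line.split("=", 1)[0].strip() if "=" in line else None
        out.append(f"{key} = {updates[key]}" if key in updates else line)
    Path(path).write_text("\n".join(out) + "\n")

# Hypothetical usage inside a screening loop: dock each ligand batch into its
# own output directory without opening a GUI between steps (gold.conf assumed
# to exist).
for i, batch in enumerate(["batch_01.mol2", "batch_02.mol2"]):
    edit_config("gold.conf", {"ligand_data_file": batch, "directory": f"run_{i:02d}"})
    # subprocess.run(["gold_auto", "gold.conf"])  # launching GOLD here is assumed
```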
Automatic Feature Selection and Improved Classification in SICADA Counterfeit Electronics Detection
2017-03-20
The SICADA methodology was developed to detect such counterfeit microelectronics by collecting power side channel data and applying machine learning to identify counterfeits. This methodology has been extended to include a two-step automated feature selection process and now uses a one-class SVM classifier. We describe this methodology and show results for empirical data collected from several types of Microchip dsPIC33F microcontrollers.
Market-Based and System-Wide Fuel Cycle Optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, Paul Philip Hood; Scopatz, Anthony; Gidden, Matthew
This work introduces automated optimization into fuel cycle simulations in the Cyclus platform. This includes system-level optimizations, seeking a deployment plan that optimizes the performance over the entire transition, and market-level optimization, seeking an optimal set of material trades at each time step. These concepts were introduced in a way that preserves the flexibility of the Cyclus fuel cycle framework, one of its most important design principles.
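As a generic, hedged illustration of the market-level step (this is not Cyclus code), the material trades at a single time step can be posed as a small transportation-style linear program; the facility capacities, requests, and trade costs below are invented.

```python
# Generic illustration (not Cyclus code): trades at one time step as a
# transportation-style linear program -- minimize total trade "cost"
# (e.g. preference-weighted) subject to supplier capacities and consumer
# requests.  All numbers are made up.
import numpy as np
from scipy.optimize import linprog

cost = np.array([[1.0, 2.0],    # cost of sending from supplier i to consumer j
                 [2.5, 1.0]])
supply = np.array([10.0, 8.0])  # kg each supplier can offer this step
demand = np.array([7.0, 9.0])   # kg each consumer requests this step

n_s, n_c = cost.shape
# Row sums <= supply, column sums == demand (demand assumed satisfiable).
A_ub = np.zeros((n_s, n_s * n_c))
for i in range(n_s):
    A_ub[i, i * n_c:(i + 1) * n_c] = 1.0
A_eq = np.zeros((n_c, n_s * n_c))
for j in range(n_c):
    A_eq[j, j::n_c] = 1.0

res = linprog(cost.ravel(), A_ub=A_ub, b_ub=supply, A_eq=A_eq, b_eq=demand,
              bounds=(0, None), method="highs")
print(res.x.reshape(n_s, n_c))  # optimal kg traded on each supplier->consumer arc
```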
Pilot and Controller Evaluations of Separation Function Allocation in Air Traffic Management
NASA Technical Reports Server (NTRS)
Wing, David; Prevot, Thomas; Morey, Susan; Lewis, Timothy; Martin, Lynne; Johnson, Sally; Cabrall, Christopher; Como, Sean; Homola, Jeffrey; Sheth-Chandra, Manasi;
2013-01-01
Two human-in-the-loop simulation experiments were conducted in coordinated fashion to investigate the allocation of separation assurance functions between ground and air and between humans and automation. The experiments modeled a mixed-operations concept in which aircraft receiving ground-based separation services shared the airspace with aircraft providing their own separation service (i.e., self-separation). Ground-based separation was provided by air traffic controllers without automation tools, with tools, or by ground-based automation with controllers in a managing role. Airborne self-separation was provided by airline pilots using self-separation automation enabled by airborne surveillance technology. The two experiments, one pilot-focused and the other controller-focused, addressed selected key issues of mixed operations, assuming the starting point of current-day operations and modeling an emergence of NextGen technologies and procedures. In the controller-focused experiment, the impact of mixed operations on controller performance was assessed at four stages of NextGen implementation. In the pilot-focused experiment, the limits to which pilots with automation tools could take full responsibility for separation from ground-controlled aircraft were tested. Results indicate that the presence of self-separating aircraft had little impact on the controllers' ability to provide separation services for ground-controlled aircraft. Overall performance was best in the most automated environment in which all aircraft were data communications equipped, ground-based separation was highly automated, and self-separating aircraft had access to trajectory intent information for all aircraft. In this environment, safe, efficient, and highly acceptable operations could be achieved for twice today's peak airspace throughput. In less automated environments, reduced trajectory intent exchange and manual air traffic control limited the safely achievable airspace throughput and negatively impacted the maneuver efficiency of self-separating aircraft through high-density airspace. In a test of scripted conflicts with ground-managed aircraft, flight crews of self-separating aircraft prevented separation loss in all conflicts with detection time greater than one minute. In debrief, pilots indicated a preference for at least five minutes' alerting notice and trajectory intent information on all aircraft. When intent information on ground-managed aircraft was available, self-separating aircraft benefited from fewer conflict alerts and fewer required deviations from trajectory-based operations.
Salloum, Alison; Robst, John; Scheeringa, Michael S; Cohen, Judith A; Wang, Wei; Murphy, Tanya K; Tolin, David F; Storch, Eric A
2014-02-01
This pilot study explored the preliminary efficacy, parent acceptability and economic cost of delivering Step One within Stepped Care Trauma-Focused Cognitive Behavioral Therapy (SC-TF-CBT). Nine young children ages 3-6 years and their parents participated in SC-TF-CBT. Eighty-three percent (5/6) of the children who completed Step One treatment and 55.6 % (5/9) of the intent-to-treat sample responded to Step One. One case relapsed at post-assessment. Treatment gains were maintained at 3-month follow-up. Generally, parents found Step One to be acceptable and were satisfied with treatment. At 3-month follow-up, the cost per unit improvement for posttraumatic stress symptoms and severity ranged from $27.65 to $131.33 for the responders and from $36.12 to $208.11 for the intent-to-treat sample. Further research on stepped care for young children is warranted to examine if this approach is more efficient, accessible and cost-effective than traditional therapy.
Melvin, Neal R; Poda, Daniel; Sutherland, Robert J
2007-10-01
When properly applied, stereology is a very robust and efficient method to quantify a variety of parameters from biological material. A common sampling strategy in stereology is systematic random sampling, which involves choosing a random sampling start point outside the structure of interest, and sampling relevant objects at sites that are placed at pre-determined, equidistant intervals. This has proven to be a very efficient sampling strategy, and is used widely in stereological designs. At the microscopic level, this is most often achieved through the use of a motorized stage that facilitates the systematic random stepping across the structure of interest. Here, we report a simple, precise and cost-effective software-based alternative to accomplishing systematic random sampling under the microscope. We believe that this approach will facilitate the use of stereological designs that employ systematic random sampling in laboratories that lack the resources to acquire costly, fully automated systems.
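A minimal Python sketch of the sampling scheme itself (not the authors' software) is shown below: a random start point within one sampling interval followed by equidistant steps across the extent of the structure of interest; the region size and step sizes are placeholders.

```python
# Minimal sketch of systematic random sampling as described above (not the
# authors' software): a random start inside one sampling interval, then
# equidistant steps across the extent of the structure of interest.
import numpy as np

def systematic_random_sites(extent_xy, step_xy, rng=None):
    """Return (x, y) stage positions covering a rectangular region of interest."""
    rng = rng or np.random.default_rng()
    (width, height), (dx, dy) = extent_xy, step_xy
    x0, y0 = rng.uniform(0, dx), rng.uniform(0, dy)   # random start point
    xs = np.arange(x0, width, dx)
    ys = np.arange(y0, height, dy)
    return [(x, y) for y in ys for x in xs]            # visit sites row by row

# e.g. a 5000 x 3000 um section sampled every 400 x 400 um
for site in systematic_random_sites((5000.0, 3000.0), (400.0, 400.0)):
    pass  # move the (manual or motorized) stage to `site` and count objects there
```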
Kamboj, Atul; Hallwirth, Claus V; Alexander, Ian E; McCowage, Geoffrey B; Kramer, Belinda
2017-06-17
The analysis of viral vector genomic integration sites is an important component in assessing the safety and efficiency of patient treatment using gene therapy. Alongside this clinical application, integration site identification is a key step in the genetic mapping of viral elements in mutagenesis screens that aim to elucidate gene function. We have developed a UNIX-based vector integration site analysis pipeline (Ub-ISAP) that utilises a UNIX-based workflow for automated integration site identification and annotation of both single and paired-end sequencing reads. Reads that contain viral sequences of interest are selected and aligned to the host genome, and unique integration sites are then classified as transcription start site-proximal, intragenic or intergenic. Ub-ISAP provides a reliable and efficient pipeline to generate large datasets for assessing the safety and efficiency of integrating vectors in clinical settings, with broader applications in cancer research. Ub-ISAP is available as an open source software package at https://sourceforge.net/projects/ub-isap/ .
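The following hedged sketch covers only the final classification step, not the Ub-ISAP pipeline itself: given a unique integration site and a gene annotation, it labels the site as transcription start site (TSS)-proximal, intragenic, or intergenic; the TSS window and the toy annotation are assumptions.

```python
# Hedged sketch of the classification step only (not Ub-ISAP): label a unique
# integration site relative to a gene annotation.  The window size and the tiny
# annotation list are illustrative assumptions; real annotations come from a GTF.
TSS_WINDOW = 5_000  # bp around a transcription start site counted as "proximal"

# (chrom, gene_start, gene_end, strand) -- toy annotation
genes = [("chr1", 10_000, 20_000, "+"),
         ("chr1", 50_000, 70_000, "-")]

def classify_site(chrom, pos):
    for g_chrom, start, end, strand in genes:
        if g_chrom != chrom:
            continue
        tss = start if strand == "+" else end
        if abs(pos - tss) <= TSS_WINDOW:
            return "TSS-proximal"
        if start <= pos <= end:
            return "intragenic"
    return "intergenic"

print(classify_site("chr1", 12_500))   # TSS-proximal
print(classify_site("chr1", 60_000))   # intragenic
print(classify_site("chr1", 90_000))   # intergenic
```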
Divertor target shape optimization in realistic edge plasma geometry
NASA Astrophysics Data System (ADS)
Dekeyser, W.; Reiter, D.; Baelmans, M.
2014-07-01
Tokamak divertor design for next-step fusion reactors heavily relies on numerical simulations of the plasma edge. Currently, the design process is mainly done in a forward approach, where the designer is strongly guided by his experience and physical intuition in proposing divertor shapes, which are then thoroughly assessed by numerical computations. On the other hand, automated design methods based on optimization have proven very successful in the related field of aerodynamic design. By recasting design objectives and constraints into the framework of a mathematical optimization problem, efficient forward-adjoint based algorithms can be used to automatically compute the divertor shape which performs the best with respect to the selected edge plasma model and design criteria. In the past years, we have extended these methods to automated divertor target shape design, using somewhat simplified edge plasma models and geometries. In this paper, we build on and extend previous work to apply these shape optimization methods for the first time in more realistic, single null edge plasma and divertor geometry, as commonly used in current divertor design studies. In a case study with JET-like parameters, we show that the so-called one-shot method is very effective in solving divertor target design problems. Furthermore, by detailed shape sensitivity analysis we demonstrate that the development of the method already at the present state provides physically plausible trends, allowing a divertor design with an almost perfectly uniform power load to be achieved for our particular choice of edge plasma model and design criteria.
Microfluidic Capillaric Circuit for Rapid and Facile Bacteria Detection.
Olanrewaju, Ayokunle Oluwafemi; Ng, Andy; DeCorwin-Martin, Philippe; Robillard, Alessandra; Juncker, David
2017-06-20
Urinary tract infections (UTI) are one of the most common bacterial infections and would greatly benefit from a rapid point-of-care diagnostic test. Although significant progress has been made in developing microfluidic systems for nucleic acid and whole bacteria immunoassay tests, their practical application is limited by complex protocols, bulky peripherals, and slow operation. Here we present a microfluidic capillaric circuit (CC) optimized for rapid and automated detection of bacteria in urine. Molds for CCs were constructed using previously established design rules, then 3D-printed and replicated into poly(dimethylsiloxane). CCs autonomously and sequentially performed all liquid delivery steps required for the assay. For efficient bacteria capture, on-the-spot packing of antibody-functionalized microbeads was completed in <20 s followed by autonomous sequential delivery of 100 μL of bacteria sample, biotinylated detection antibodies, fluorescent streptavidin conjugate, and wash buffer for a total volume ≈115 μL. The assay was completed in <7 min. Fluorescence images of the microbead column revealed captured bacteria as bright spots that were easily counted manually or using an automated script for user-independent assay readout. The limit of detection of E. coli in synthetic urine was 1.2 × 10² colony-forming units per mL (CFU/mL), which is well below the clinical diagnostic criterion (>10⁵ CFU/mL) for UTI. The self-powered, peripheral-free CC presented here has potential for use in rapid point-of-care UTI screening.
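A possible form of such an automated readout script (not the authors' implementation) is sketched below: spots are counted by background subtraction, thresholding, and connected-component labelling; the threshold rule and minimum spot size are assumptions.

```python
# Illustrative sketch (not the authors' script): count bright fluorescent spots
# on the microbead column by thresholding and connected-component labelling.
# The threshold rule and minimum spot size are assumptions.
import numpy as np
from scipy import ndimage

def count_spots(image, min_pixels=4):
    """Return the number of bright spots in a 2-D fluorescence image."""
    background = ndimage.median_filter(image, size=15)        # smooth background
    signal = image.astype(float) - background
    mask = signal > (signal.mean() + 5 * signal.std())        # keep bright outliers
    labels, n = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
    return int(np.sum(np.asarray(sizes) >= min_pixels))       # drop tiny specks

# toy image: dark field with three bright spots
img = np.zeros((200, 200))
for r, c in [(40, 40), (100, 150), (160, 60)]:
    img[r - 2:r + 2, c - 2:c + 2] = 255.0
print(count_spots(img))  # -> 3
```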
Online optimal experimental re-design in robotic parallel fed-batch cultivation facilities.
Cruz Bournazou, M N; Barz, T; Nickel, D B; Lopez Cárdenas, D C; Glauche, F; Knepper, A; Neubauer, P
2017-03-01
We present an integrated framework for the online optimal experimental re-design applied to parallel nonlinear dynamic processes that aims to precisely estimate the parameter set of macro kinetic growth models with minimal experimental effort. This provides a systematic solution for rapid validation of a specific model to new strains, mutants, or products. In biosciences, this is especially important as model identification is a long and laborious process which is continuing to limit the use of mathematical modeling in this field. The strength of this approach is demonstrated by fitting a macro-kinetic differential equation model for Escherichia coli fed-batch processes after 6 h of cultivation. The system includes two fully-automated liquid handling robots; one containing eight mini-bioreactors and another used for automated at-line analyses, which allows for the immediate use of the available data in the modeling environment. As a result, the experiment can be continually re-designed while the cultivations are running using the information generated by periodical parameter estimations. The advantages of an online re-computation of the optimal experiment are proven by a 50-fold lower average coefficient of variation on the parameter estimates compared to the sequential method (4.83% instead of 235.86%). The success obtained in such a complex system is a further step towards a more efficient computer aided bioprocess development. Biotechnol. Bioeng. 2017;114: 610-619. © 2016 Wiley Periodicals, Inc.
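To illustrate the parameter-estimation step that would be repeated as new at-line data arrive (this is not the authors' model or framework), the sketch below fits a simple Monod growth model to synthetic biomass measurements from the first hours of a cultivation.

```python
# Hedged sketch of the parameter-estimation step only (not the authors' model or
# framework): fit a simple Monod growth model to the biomass data available so
# far, so the fit can be repeated whenever new at-line measurements arrive.
# Parameter values and data are synthetic.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

def monod(t, y, mu_max, Ks, Yxs):
    X, S = y                                   # biomass, substrate
    mu = mu_max * S / (Ks + S)
    return [mu * X, -mu * X / Yxs]

def simulate(params, t_eval, y0=(0.1, 10.0)):
    mu_max, Ks, Yxs = params
    sol = solve_ivp(monod, (0.0, t_eval[-1]), y0, t_eval=t_eval,
                    args=(mu_max, Ks, Yxs))
    return sol.y[0]                            # biomass trajectory

t_meas = np.linspace(0.0, 6.0, 13)             # first 6 h of cultivation
true = (0.6, 0.4, 0.5)
x_meas = simulate(true, t_meas) + np.random.default_rng(1).normal(0, 0.05, t_meas.size)

fit = least_squares(lambda p: simulate(p, t_meas) - x_meas,
                    x0=[0.4, 1.0, 0.4],
                    bounds=([0.01, 0.01, 0.1], [2.0, 5.0, 1.0]))
print(fit.x)  # re-estimated mu_max, Ks, Yxs; repeat as new data arrive
```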
Azorin-Lopez, Jorge; Fuster-Guillo, Andres; Saval-Calvo, Marcelo; Mora-Mora, Higinio; Garcia-Chamizo, Juan Manuel
2017-01-01
The use of visual information is a very well known input from different kinds of sensors. However, most of the perception problems are individually modeled and tackled. It is necessary to provide a general imaging model that allows us to parametrize different input systems as well as their problems and possible solutions. In this paper, we present an active vision model considering the imaging system as a whole (including camera, lighting system, object to be perceived) in order to propose solutions for automated visual systems that present perception problems. As a concrete case study, we instantiate the model in a real application and still challenging problem: automated visual inspection. It is one of the most used quality control systems to detect defects on manufactured objects. However, it presents problems for specular products. We model these perception problems taking into account environmental conditions and camera parameters that allow a system to properly perceive the specific object characteristics to determine defects on surfaces. The validation of the model has been carried out using simulations, providing an efficient way to perform a large set of tests (different environment conditions and camera parameters) as a previous step of experimentation in real manufacturing environments, which are more complex in terms of instrumentation and more expensive. Results prove the success of the model application adjusting scale, viewpoint and lighting conditions to detect structural and color defects on specular surfaces. PMID:28640211
NASA Astrophysics Data System (ADS)
Kelly, Jamie S.; Bowman, Hiroshi C.; Rao, Vittal S.; Pottinger, Hardy J.
1997-06-01
Implementation issues represent an unfamiliar challenge to most control engineers, and many techniques for controller design ignore these issues outright. Consequently, the design of controllers for smart structural systems usually proceeds without regard for their eventual implementation, thus resulting either in serious performance degradation or in hardware requirements that squander power, complicate integration, and drive up cost. The level of integration assumed by the Smart Patch further exacerbates these difficulties, and any design inefficiency may render the realization of a single-package sensor-controller-actuator system infeasible. The goal of this research is to automate the controller implementation process and to relieve the design engineer of implementation concerns like quantization, computational efficiency, and device selection. We specifically target Field Programmable Gate Arrays (FPGAs) as our hardware platform because these devices are highly flexible, power efficient, and reprogrammable. The current study develops an automated implementation sequence that minimizes hardware requirements while maintaining controller performance. Beginning with a state space representation of the controller, the sequence automatically generates a configuration bitstream for a suitable FPGA implementation. MATLAB functions optimize and simulate the control algorithm before translating it into the VHSIC hardware description language. These functions improve power efficiency and simplify integration in the final implementation by performing a linear transformation that renders the controller computationally friendly. The transformation favors sparse matrices in order to reduce multiply operations and the hardware necessary to support them; simultaneously, the remaining matrix elements take on values that minimize limit cycles and parameter sensitivity. The proposed controller design methodology is implemented on a simple cantilever beam test structure using FPGA hardware. The experimental closed loop response is compared with that of an automated FPGA controller implementation. Finally, we explore the integration of FPGA based controllers into a multi-chip module, which we believe represents the next step towards the realization of the Smart Patch.
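A minimal sketch of the kind of linear transformation described above (not the authors' MATLAB/VHDL toolchain, and shown here in Python) is given below: a similarity transform that diagonalizes the controller's A matrix so that each state update needs fewer multiplies; the example controller is invented and has real, distinct poles, whereas complex poles would require 2x2 real blocks instead of a pure diagonal form.

```python
# Illustrative sketch (not the authors' toolchain): apply a similarity transform
# T to a state-space controller so that A becomes (nearly) diagonal, which cuts
# the number of multiplies needed per sample in a fixed-point/FPGA realization.
# The example controller is made up and has real, distinct poles.
import numpy as np

A = np.array([[-0.5, 1.0],
              [0.0, -2.0]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])

eigvals, T = np.linalg.eig(A)       # columns of T are eigenvectors of A
Tinv = np.linalg.inv(T)

A_m = Tinv @ A @ T                  # diagonal (sparse) dynamics matrix
B_m = Tinv @ B
C_m = C @ T

print(np.round(A_m, 6))             # off-diagonal terms ~0 -> fewer multiplies
# (A_m, B_m, C_m) realizes the same transfer function as the original (A, B, C).
```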
Total Radiosynthesis: Thinking outside “the box”
Liang, Steven H.; Vasdev, Neil
2016-01-01
The logic of total synthesis transformed a stagnant state of medicinal and synthetic organic chemistry when there was a paucity of methods and reagents to synthesize drug molecules and/or natural products. Molecular imaging by positron emission tomography (PET) is now experiencing a renaissance in the way radiopharmaceuticals for molecular imaging are synthesized, however, a paradigm shift is desperately needed in the discovery pipeline to accelerate in vivo imaging studies. A significant challenge in radiochemistry is the limited choice of labeled reagents (or building blocks) available for the synthesis of novel radiopharmaceuticals with the most commonly used short-lived radionuclides carbon-11 (11C; half-life ~20 minutes) and fluorine-18 (18F; half-life ~2 hours). In fact, most drugs cannot be labeled with 11C or 18F due to a lack of efficient and diverse radiosynthetic methods. In general, routine radiopharmaceutical production relies on the incorporation of the isotope at the last or penultimate step of synthesis, ideally within one half-life of the radionuclide, to maximize radiochemical yields and specific activities thereby reducing losses due to radioactive decay. Reliance on radiochemistry conducted within the constraints of an automated synthesis unit (“box”) has stifled the exploration of multi-step reactions with short-lived radionuclides. Radiopharmaceutical synthesis can be transformed by considering logic of total synthesis to develop novel approaches for 11C- and 18F-radiolabeling complex molecules via retrosynthetic analysis and multi-step reactions. As a result of such exploration, new methods, reagents and radiopharmaceuticals for in vivo imaging studies are discovered. A new avenue to develop radiotracers that were previously unattainable due to the lack of efficient radiosynthetic methods is necessary to work towards our ultimate, albeit impossible goal – the concept we term total radiosynthesis - to radiolabel virtually any molecule. As with the vast majority of drugs, most radiotracers also fail, therefore expeditious evaluation of tracers in preclinical models prior to optimization or derivatization of the lead molecules/drugs is necessary. Furthermore the exact position of the 11C and 18F radionuclide in tracers is often critical for metabolic considerations, and flexible methodologies to introduce the radiolabel are needed. Using the principles of total synthesis our laboratory and others have shown that multi-step radiochemical reactions are indeed suitable for preclinical and even clinical use. As the goal of total synthesis is to be concise, we have also simplified the syntheses of radiopharmaceuticals. We are presently developing new strategies via [11C]CO2 fixation which has enabled library radiosynthesis as well as labeling non-activated arenes using [18F]fluoride via iodonium ylides. Both of which have proven to be suitable for human PET imaging. We concurrently utilize state-of-the-art automation technologies including microfluidic flow chemistry and rapid purification strategies for radiopharmaceutical production. In this account we highlight how total radiosynthesis has impacted our radiochemistry program, with prominent examples from others, focusing on its impact towards preclinical and clinical research studies. PMID:27512156
Selecting and Planning for an Automated Library System: Guidelines for Libraries.
ERIC Educational Resources Information Center
Piccininni, James
Guidelines are given for automating a library. Issues arising in automation are illustrated through the experience of the Doherty Library of the University of St. Thomas, Houston (Texas). The first step is to decide what type of system is right for the needs of the library and its patrons. In considering vendors of systems, it is important to…
Film/Adhesive Processing Module for Fiber-Placement Processing of Composites
NASA Technical Reports Server (NTRS)
Hulcher, A. Bruce
2007-01-01
An automated apparatus has been designed and constructed that enables the automated lay-up of composite structures incorporating films, foils, and adhesives during the automated fiber-placement process. This apparatus, denoted a film module, could be used to deposit materials in film or thin sheet form either simultaneously when laying down the fiber composite article or in an independent step.
Space power subsystem automation technology
NASA Technical Reports Server (NTRS)
Graves, J. R. (Compiler)
1982-01-01
The technology issues involved in power subsystem automation and the reasonable objectives to be sought in such a program were discussed. The complexities, uncertainties, and alternatives of power subsystem automation, along with the advantages from both an economic and a technological perspective were considered. Whereas most spacecraft power subsystems now use certain automated functions, the idea of complete autonomy for long periods of time is almost inconceivable. Thus, it seems prudent that the technology program for power subsystem automation be based upon a growth scenario which should provide a structured framework of deliberate steps to enable the evolution of space power subsystems from the current practice of limited autonomy to a greater use of automation with each step being justified on a cost/benefit basis. Each accomplishment should move toward the objectives of decreased requirement for ground control, increased system reliability through onboard management, and ultimately lower energy cost through longer life systems that require fewer resources to operate and maintain. This approach seems well-suited to the evolution of more sophisticated algorithms and eventually perhaps even the use of some sort of artificial intelligence. Multi-hundred kilowatt systems of the future will probably require an advanced level of autonomy if they are to be affordable and manageable.
Approaches to automated protein crystal harvesting
Deller, Marc C.; Rupp, Bernhard
2014-01-01
The harvesting of protein crystals is almost always a necessary step in the determination of a protein structure using X-ray crystallographic techniques. However, protein crystals are usually fragile and susceptible to damage during the harvesting process. For this reason, protein crystal harvesting is the single step that remains entirely dependent on skilled human intervention. Automation has been implemented in the majority of other stages of the structure-determination pipeline, including cloning, expression, purification, crystallization and data collection. The gap in automation between crystallization and data collection results in a bottleneck in throughput and presents unfortunate opportunities for crystal damage. Several automated protein crystal harvesting systems have been developed, including systems utilizing microcapillaries, microtools, microgrippers, acoustic droplet ejection and optical traps. However, these systems have yet to be commonly deployed in the majority of crystallography laboratories owing to a variety of technical and cost-related issues. Automation of protein crystal harvesting remains essential for harnessing the full benefits of fourth-generation synchrotrons, free-electron lasers and microfocus beamlines. Furthermore, automation of protein crystal harvesting offers several benefits when compared with traditional manual approaches, including the ability to harvest microcrystals, improved flash-cooling procedures and increased throughput. PMID:24637746
ERIC Educational Resources Information Center
Zhao, Weiyi
2011-01-01
Wireless mesh networks (WMNs) have recently emerged to be a cost-effective solution to support large-scale wireless Internet access. They have numerous applications, such as broadband Internet access, building automation, and intelligent transportation systems. One research challenge for Internet-based WMNs is to design efficient mobility…
USDA-ARS?s Scientific Manuscript database
Automated sensing of macronutrients in hydroponic solution would allow more efficient management of nutrients for crop growth in closed hydroponic systems. Ion-selective microelectrode technology requires an ion-selective membrane or a solid metal material that responds selectively to one analyte in...
Automatic intrinsic cardiac and respiratory gating from cone-beam CT scans of the thorax region
NASA Astrophysics Data System (ADS)
Hahn, Andreas; Sauppe, Sebastian; Lell, Michael; Kachelrieß, Marc
2016-03-01
We present a new algorithm that allows for raw data-based automated cardiac and respiratory intrinsic gating in cone-beam CT scans. It can be summarized in three steps: First, a median filter is applied to an initially reconstructed volume. The forward projection of this volume contains less motion information and is subtracted from the original projections. This results in new raw data that contain only moving anatomy and not static anatomy such as bones, which would otherwise impede the cardiac or respiratory signal acquisition. All further steps are applied to these modified raw data. Second, the raw data are cropped to a region of interest (ROI). The ROI in the raw data is determined by the forward projection of a binary volume of interest (VOI) that includes the diaphragm for respiratory gating and most of the edge of the heart for cardiac gating. Third, the mean gray value in this ROI is calculated for every projection and the respiratory/cardiac signal is acquired using a bandpass filter. Steps two and three are carried out simultaneously for 64 or 1440 overlapping VOI inside the body for the respiratory or cardiac signal respectively. The signals acquired from each ROI are compared and the most consistent one is chosen as the desired cardiac or respiratory motion signal. Consistency is assessed by the standard deviation of the time between two maxima. The robustness and efficiency of the method are evaluated using simulated and measured patient data by computing the standard deviation of the mean signal difference between the ground truth and the intrinsic signal.
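A hedged sketch of the third step only (not the authors' implementation) is given below: the mean gray value inside the projected ROI is computed for every projection and band-pass filtered around typical respiratory rates; the projection rate and cut-off frequencies are assumptions.

```python
# Hedged sketch of step three only (not the authors' implementation): average
# the modified raw data inside the projected ROI for every projection, then
# band-pass the resulting curve around typical respiratory rates.
import numpy as np
from scipy.signal import butter, filtfilt

def respiratory_signal(projections, roi_mask, fps=10.0, band=(0.1, 0.5)):
    """projections: (n_views, rows, cols); roi_mask: boolean (rows, cols)."""
    mean_curve = np.array([p[roi_mask].mean() for p in projections])
    b, a = butter(2, [band[0] / (fps / 2), band[1] / (fps / 2)], btype="bandpass")
    return filtfilt(b, a, mean_curve - mean_curve.mean())

# toy data: 0.25 Hz breathing signal buried in noise, 10 projections per second
t = np.arange(600) / 10.0
proj = (np.sin(2 * np.pi * 0.25 * t)[:, None, None]
        + np.random.default_rng(2).normal(0, 0.5, (600, 8, 8)))
signal = respiratory_signal(proj, np.ones((8, 8), dtype=bool))
```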
NASA Astrophysics Data System (ADS)
Shim, Hackjoon; Kwoh, C. Kent; Yun, Il Dong; Lee, Sang Uk; Bae, Kyongtae
2009-02-01
Osteoarthritis (OA) is associated with degradation of cartilage and related changes in the underlying bone. Quantitative measurement of those changes from MR images is an important biomarker to study the progression of OA and it requires a reliable segmentation of knee bone and cartilage. As the most popular method, manual segmentation of knee joint structures by boundary delineation is highly laborious and subject to user-variation. To overcome these difficulties, we have developed a semi-automated method for segmentation of knee bones, which consisted of two steps: placement of seeds and computation of segmentation. In the first step, seeds were placed by the user on a number of slices and then were propagated automatically to neighboring images. The seed placement could be performed on any of sagittal, coronal, and axial planes. The second step, computation of segmentation, was based on a graph-cuts algorithm where the optimal segmentation is the one that minimizes a cost function, which integrated the seeds specified by the user and both the regional and boundary properties of the regions to be segmented. The algorithm also allows simultaneous segmentation of three compartments of the knee bone (femur, tibia, patella). Our method was tested on the knee MR images of six subjects from the osteoarthritis initiative (OAI). The segmentation processing time (mean+/-SD) was (22+/-4)min, which is much shorter than that by the manual boundary delineation method (typically several hours). With this improved efficiency, our segmentation method will facilitate the quantitative morphologic analysis of changes in knee bones associated with osteoarthritis.
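As a simplified, hedged illustration of the graph-cut computation (binary foreground/background only, using the third-party PyMaxflow package, assumed installed), the sketch below combines a regional term from voxel intensities with a pairwise smoothness term and hard constraints at user seeds; the cost functions are placeholders, whereas the actual method segments the three bones simultaneously.

```python
# Simplified, hedged sketch of the graph-cut idea (binary case only; PyMaxflow
# package assumed installed).  Regional costs come from intensity likelihoods,
# the pairwise weight penalizes label changes between neighbouring pixels, and
# user seeds are enforced as hard constraints.
import numpy as np
import maxflow

def segment_slice(image, fg_seeds, bg_seeds, smoothness=2.0):
    """image: 2-D float array scaled to [0, 1]; seeds: boolean stroke masks."""
    g = maxflow.Graph[float]()
    nodes = g.add_grid_nodes(image.shape)
    g.add_grid_edges(nodes, smoothness)                  # boundary (pairwise) term
    source_cap = -np.log(np.clip(1.0 - image, 1e-6, 1))  # regional term: bright = bone
    sink_cap = -np.log(np.clip(image, 1e-6, 1))
    source_cap[fg_seeds], sink_cap[fg_seeds] = 1e9, 0.0  # hard seed constraints
    source_cap[bg_seeds], sink_cap[bg_seeds] = 0.0, 1e9
    g.add_grid_tedges(nodes, source_cap, sink_cap)
    g.maxflow()                                          # solve the min-cut
    return ~g.get_grid_segments(nodes)                   # True where labelled "bone"
```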
Geraghty, Adam W A; Torres, Leandro D; Leykin, Yan; Pérez-Stable, Eliseo J; Muñoz, Ricardo F
2013-09-01
Worldwide automated Internet health interventions have the potential to greatly reduce health disparities. High attrition from automated Internet interventions is ubiquitous, and presents a challenge in the evaluation of their effectiveness. Our objective was to evaluate variables hypothesized to be related to attrition, by modeling predictors of attrition in a secondary data analysis of two cohorts of an international, dual language (English and Spanish) Internet smoking cessation intervention. The two cohorts were identical except for the approach to follow-up (FU): one cohort employed only fully automated FU (n = 16 430), while the other cohort also used 'live' contact conditional upon initial non-response (n = 1000). Attrition rates were 48.1 and 10.8% for the automated FU and live FU cohorts, respectively. Significant attrition predictors in the automated FU cohort included higher levels of nicotine dependency, lower education, lower quitting confidence and receiving more contact emails. Participants' younger age was the sole predictor of attrition in the live FU cohort. While research on large-scale deployment of Internet interventions is at an early stage, this study demonstrates that differences in attrition from trials on this scale are (i) systematic and predictable and (ii) can largely be eliminated by live FU efforts. In fully automated trials, targeting the predictors we identify may reduce attrition, a necessary precursor to effective behavioral Internet interventions that can be accessed globally.
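A hedged sketch of this kind of attrition modeling (not the study's analysis code) is shown below: attrition is treated as a binary outcome and regressed on the predictors discussed above; the column names, coefficients, and synthetic data are illustrative only.

```python
# Hedged sketch (not the study's analysis code): model attrition as a binary
# outcome with logistic regression on illustrative predictors.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 1000
df = pd.DataFrame({
    "nicotine_dependence": rng.integers(0, 11, n),   # e.g. 0-10 scale
    "education_years": rng.integers(6, 21, n),
    "quitting_confidence": rng.integers(1, 8, n),
    "contact_emails": rng.integers(1, 10, n),
    "age": rng.integers(18, 71, n),
})
# synthetic outcome: 1 = lost to follow-up
logit = (0.15 * df.nicotine_dependence - 0.08 * df.education_years
         - 0.10 * df.quitting_confidence)
dropped = rng.random(n) < 1 / (1 + np.exp(-logit))

model = LogisticRegression(max_iter=1000).fit(df, dropped)
for name, coef in zip(df.columns, model.coef_[0]):
    print(f"{name:>22s}: {coef:+.3f}")   # sign/size of each attrition predictor
```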
Chu, Chia-Hui; Kuo, Ming-Chuan; Weng, Shu-Hui; Lee, Ting-Ting
2016-01-01
A user friendly interface can enhance the efficiency of data entry, which is crucial for building a complete database. In this study, two user interfaces (traditional pull-down menu vs. check boxes) are proposed and evaluated based on medical records with fever medication orders by measuring the time for data entry, the steps for each data entry record, and the completion rate of each medical record. The results revealed that the time for data entry was reduced from 22.8 sec/record to 3.2 sec/record. The data entry procedure was also reduced from 9 steps in the traditional interface to 3 steps in the new one. In addition, the completeness of medical records increased from 20.2% to 98%. All these results indicate that the new user interface provides a more user friendly and efficient approach for data entry than the traditional interface.
Cui, Jiayue; Chai, David I.; Miller, Christopher; Hao, Jason; Thomas, Christopher; Wang, JingQi; Scheidt, Karl A.; Kozmin, Sergey A.
2013-01-01
We describe a unified synthetic strategy for efficient assembly of four new heterocyclic libraries. The synthesis began by creating a range of structurally diverse pyrrolidinones or piperidinones. Such compounds were obtained in a simple one-flask operation starting with readily available amines, ketoesters, and unsaturated anhydrides. The use of tetrahydropyran-containing ketoesters, which were rapidly assembled by our Prins cyclization protocol, enabled efficient fusion of pyran and piperidinone cores. A newly developed Au(I)-catalyzed cycloisomerization of alkyne-containing enamides further expanded heterocyclic diversity by providing rapid entry into a wide range of bicyclic and tricyclic dienamides. The final stage of the process entailed diversification of each of the initially produced carboxylic acids using a fully automated platform for amide synthesis, which delivered 1872 compounds in high diastereomeric and chemical purity. PMID:22860634
Li, Yang; Hong, Jiali; Wei, Renjian; Zhang, Yingying; Tong, Zaizai; Zhang, Xinghong; Du, Binyang; Xu, Junting; Fan, Zhiqiang
2015-02-01
It is a long-standing challenge to combine mixed monomers into multiblock copolymer (MBC) in a one-pot/one-step polymerization manner. We report the first example of MBC with biodegradable polycarbonate and polyester blocks that were synthesized from highly efficient one-pot/one-step polymerization of cyclohexene oxide (CHO), CO2 and ε-caprolactone (ε-CL) in the presence of zinc-cobalt double metal cyanide complex and stannous octoate. In this protocol, two cross-chain exchange reactions (CCER) occurred at dual catalysts respectively and connected two independent chain propagation procedures (i.e., polycarbonate formation and polyester formation) simultaneously in a block-by-block manner, affording MBC without tapering structure. The multiblock structure of MBC was determined by the rate ratio of CCER to the two chain propagations and could be simply tuned by various kinetic factors. This protocol is also of significance due to partial utilization of renewable CO2 and improved mechanical properties of the resultant MBC.
EOS Terra: EOS DAM Automation Constellation MOWG
NASA Technical Reports Server (NTRS)
Mantziaras, Dimitrios C.
2017-01-01
Brief summary of the decision factors considered and process improvement steps made to evolve the ESMO debris avoidance maneuver process into a more automated process. The presentation is in response to an action item/question received at a prior MOWG meeting.
Computer automation for feedback system design
NASA Technical Reports Server (NTRS)
1975-01-01
Mathematical techniques and explanations of various steps used by an automated computer program to design feedback systems are summarized. Special attention was given to refining the automatic evaluation of suboptimal loop transmission and the translation of time-domain specifications to frequency-domain specifications.
Segmentation of the whole breast from low-dose chest CT images
NASA Astrophysics Data System (ADS)
Liu, Shuang; Salvatore, Mary; Yankelevitz, David F.; Henschke, Claudia I.; Reeves, Anthony P.
2015-03-01
The segmentation of the whole breast serves as the first step towards automated breast lesion detection. It is also necessary for automatically assessing the breast density, which is considered to be an important risk factor for breast cancer. In this paper we present a fully automated algorithm to segment the whole breast in low-dose chest CT images (LDCT), which has been recommended as an annual lung cancer screening test. The automated whole breast segmentation and potential breast density readings as well as lesion detection in LDCT will provide useful information for women who have received LDCT screening, especially the ones who have not undergone mammographic screening, by providing them additional risk indicators for breast cancer with no additional radiation exposure. The two main challenges to be addressed are the significant variation in the shape and location of the breast in LDCT and the separation of the pectoral muscles from the glandular tissues. The presented algorithm achieves robust whole breast segmentation using an anatomy-directed rule-based method. The evaluation is performed on 20 LDCT scans by comparing the segmentation with ground truth manually annotated by a radiologist on one axial slice and two sagittal slices for each scan. The resulting average Dice coefficient is 0.880 with a standard deviation of 0.058, demonstrating that the automated segmentation algorithm achieves results consistent with the manual annotations of a radiologist.
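The abstract reports agreement with manual annotation via the Dice coefficient. For reference, a minimal sketch of how such an overlap score can be computed from two binary masks (function and array names are illustrative, not taken from the paper):

```python
import numpy as np

def dice_coefficient(seg: np.ndarray, truth: np.ndarray) -> float:
    """Dice overlap between a binary segmentation and a binary ground-truth mask."""
    seg = seg.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(seg, truth).sum()
    denom = seg.sum() + truth.sum()
    return 2.0 * intersection / denom if denom else 1.0

# Toy example on an 8x8 slice; real use would compare the algorithm's mask
# against a radiologist's annotation on the same slice.
auto_mask = np.zeros((8, 8), dtype=bool); auto_mask[2:6, 2:6] = True
manual_mask = np.zeros((8, 8), dtype=bool); manual_mask[3:7, 2:6] = True
print(round(dice_coefficient(auto_mask, manual_mask), 3))   # -> 0.75
```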
The application of automated operations at the Institutional Processing Center
NASA Technical Reports Server (NTRS)
Barr, Thomas H.
1993-01-01
The JPL Institutional and Mission Computing Division, Communications, Computing and Network Services Section, with its mission contractor, OAO Corporation, have for some time been applying automation to the operation of JPL's Information Processing Center (IPC). Automation does not come in one easy-to-use package. Automation for a data processing center is made up of many different software and hardware products supported by trained personnel. The IPC automation effort formally began with console automation, and has since spiraled out to include production scheduling, data entry, report distribution, online reporting, failure reporting and resolution, documentation, library storage, and operator and user education, while requiring the interaction of multi-vendor and locally developed software. To begin the process, automation goals are determined. Then a team including operations personnel is formed to research and evaluate available options. By acquiring knowledge of current products and those in development, taking an active role in industry organizations, and learning from other data centers' experiences, a forecast can be developed as to what direction technology is moving. With IPC management's approval, an implementation plan is developed and resources identified to test or implement new systems. As an example, IPC's new automated data entry system was researched by Data Entry, Production Control, and Advance Planning personnel. A proposal was then submitted to management for review. A determination to implement the new system was made, and the elements/personnel involved with the initial planning performed the implementation. The final steps of the implementation were educating data entry personnel in the areas affected and making the procedural changes necessary for the successful operation of the new system.
Automated one-step DNA sequencing based on nanoliter reaction volumes and capillary electrophoresis.
Pang, H M; Yeung, E S
2000-08-01
An integrated system with a nano-reactor for cycle-sequencing reaction coupled to on-line purification and capillary gel electrophoresis has been demonstrated. Fifty nanoliters of reagent solution, which includes dye-labeled terminators, polymerase, BSA and template, was aspirated and mixed with the template inside the nano-reactor followed by the cycle-sequencing reaction. The reaction products were then purified by a size-exclusion chromatographic column operated at 50 degrees C followed by room temperature on-line injection of the DNA fragments into a capillary for gel electrophoresis. Over 450 bases of DNA can be separated and identified. As little as 25 nl of reagent solution can be used for the cycle-sequencing reaction with a slightly shorter read length. Significant savings in reagent cost are achieved because the remaining stock solution can be reused without contamination. The steps of cycle sequencing, on-line purification, injection, DNA separation, capillary regeneration, gel-filling and fluidic manipulation were performed with complete automation. This system can be readily multiplexed for high-throughput DNA sequencing or PCR analysis directly from templates or even biological materials.
Adaptive Finite Element Methods for Continuum Damage Modeling
NASA Technical Reports Server (NTRS)
Min, J. B.; Tworzydlo, W. W.; Xiques, K. E.
1995-01-01
The paper presents an application of adaptive finite element methods to the modeling of low-cycle continuum damage and life prediction of high-temperature components. The major objective is to provide automated and accurate modeling of damaged zones through adaptive mesh refinement and adaptive time-stepping methods. The damage modeling methodology is implemented in the usual way by embedding damage evolution in the transient nonlinear solution of elasto-viscoplastic deformation problems. This nonlinear boundary-value problem is discretized by adaptive finite element methods. The automated h-adaptive mesh refinements are driven by error indicators, based on selected principal variables in the problem (stresses, non-elastic strains, damage, etc.). In the time domain, adaptive time-stepping is used, combined with a predictor-corrector time marching algorithm. The time-step selection is controlled by the required time accuracy. In order to take into account the strong temperature dependency of material parameters, the nonlinear structural solution is coupled with thermal analyses (one-way coupling). Several test examples illustrate the importance and benefits of adaptive mesh refinements in accurate prediction of damage levels and failure time.
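The abstract couples a predictor-corrector time-marching scheme with time-step selection controlled by the required accuracy. Below is a generic sketch of such a control loop (an explicit Euler predictor with a trapezoidal corrector on a toy ODE); it is not the paper's elasto-viscoplastic solver, and the tolerance, shrink, and growth factors are illustrative assumptions:

```python
import numpy as np

def adaptive_march(f, y0, t0, t_end, dt0, tol):
    """Predictor-corrector marching with simple step control: the step is halved
    when the predictor-corrector gap exceeds `tol` and grown when the estimate
    is comfortably below it."""
    t, y, dt = t0, np.asarray(y0, dtype=float), dt0
    history = [(t, y.copy())]
    while t < t_end:
        dt = min(dt, t_end - t)
        y_pred = y + dt * f(t, y)                               # explicit Euler predictor
        y_corr = y + 0.5 * dt * (f(t, y) + f(t + dt, y_pred))   # trapezoidal corrector
        err = np.linalg.norm(y_corr - y_pred)                   # local error indicator
        if err > tol and dt > 1e-12:
            dt *= 0.5                                           # reject: refine the step
            continue
        t, y = t + dt, y_corr                                   # accept the step
        history.append((t, y.copy()))
        if err < 0.1 * tol:
            dt *= 1.5                                           # coarsen when very accurate
    return history

# Example: simple relaxation toward equilibrium as a stand-in evolution law.
steps = adaptive_march(lambda t, y: -2.0 * y, [1.0], 0.0, 2.0, 0.2, 1e-3)
print(len(steps), steps[-1])
```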
Batch manufacturing: Six strategic needs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ash, R.H.; Chappell, D.A.
1995-08-01
Since the advent of industrial digital control systems in the mid-1970s, industry has had the promise of integrated, configurable digital batch control systems to replace the morass of electromechanical devices like relays and stepping switches, recorders, and indicators which comprised the components of previous generations of batch control systems - the "monolithic monsters" of the 1960s and earlier. To help fulfill that promise, there have been many wide-ranging proprietary automation solutions for batch control since 1975, many of them technically excellent. However, even the best examples suffered from the lack of a common language and unifying concept permitting separate systems to be interconnected and work together. Today, some 20 years after the digital revolution began, industry has microprocessors, memory chips, data highways, and other marvelous technology to help automate the control of discontinuous processes. They also are on the way to having an accepted standard for batch automation, ISA S88. Batching systems are at once conceptually simple but executionally complex. The notion of adding ingredients one at a time to a vat, mixing, and then processing into final form is as old as the stone age. Every homemaker on earth, male or female, is familiar with how to follow a recipe to create some sumptuous item of culinary delight. Food recipes, so familiar and ubiquitous, are really just microcosms of the S88 recipe standard. They contain the same components: (1) Header (name and description of item being prepared, sometimes serving size); (2) Formula (list and amount of ingredients); (3) Equipment requirements (pans, mixing and cooking equipment); (4) Procedure (description of order of ingredient addition, mixing and other processing steps, baking/cooling time, and other processing steps); and (5) Other information (safety, cautions, and other miscellaneous instructions).
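As a concrete illustration of the recipe components enumerated above, here is a toy data-structure sketch; the class and field names are simplified assumptions for illustration, and the actual S88 standard defines a much richer recipe model:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Ingredient:
    name: str
    amount: float
    unit: str

@dataclass
class Recipe:
    """Toy container mirroring the S88-style recipe components listed above."""
    header: str                                               # name/description of the item
    formula: List[Ingredient] = field(default_factory=list)   # ingredients and amounts
    equipment: List[str] = field(default_factory=list)        # required equipment
    procedure: List[str] = field(default_factory=list)        # ordered processing steps
    other: List[str] = field(default_factory=list)            # safety notes, cautions, etc.

batch = Recipe(
    header="Neutralization batch, 500 L",
    formula=[Ingredient("base solution", 200, "L"), Ingredient("acid feed", 150, "L")],
    equipment=["reactor R-101", "agitator"],
    procedure=["charge base", "add acid slowly", "mix 20 min", "sample and release"],
    other=["wear acid-resistant PPE"],
)
print(batch.header, "-", len(batch.procedure), "procedure steps")
```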
Shen, Heping; Wu, Yiliang; Peng, Jun; Duong, The; Fu, Xiao; Barugkin, Chog; White, Thomas P; Weber, Klaus; Catchpole, Kylie R
2017-02-22
With rapid progress in recent years, organohalide perovskite solar cells (PSC) are promising candidates for a new generation of highly efficient thin-film photovoltaic technologies, for which up-scaling is an essential step toward commercialization. In this work, we propose a modified two-step method to deposit the CH3NH3PbI3 (MAPbI3) perovskite film that improves the uniformity, photovoltaic performance, and repeatability of large-area perovskite solar cells. This method is based on the commonly used two-step method, with one additional process involving treating the perovskite film with concentrated methylammonium iodide (MAI) solution. This additional treatment is proved to be helpful for tailoring the residual PbI2 level to an optimal range that is favorable for both optical absorption and inhibition of recombination. Scanning electron microscopy and photoluminescence image analysis further reveal that, compared to the standard two-step and one-step methods, this method is very robust for achieving uniform and pinhole-free large-area films. This is validated by the photovoltaic performance of the prototype devices with an active area of 1 cm2, where we achieved the champion efficiency of ∼14.5% and an average efficiency of ∼13.5%, with excellent reproducibility.
Universal Verification Methodology Based Register Test Automation Flow.
Woo, Jae Hun; Cho, Yong Kwan; Park, Sun Kyu
2016-05-01
In today's SoC design, the number of registers has increased along with the complexity of hardware blocks. Register validation is a time-consuming and error-prone task. Therefore, we need an efficient way to perform verification with less effort in a shorter time. In this work, we suggest a register test automation flow based on UVM (Universal Verification Methodology). UVM provides a standard methodology, called a register model, to facilitate stimulus generation and functional checking of registers. However, it is not easy for designers to create register models for their functional blocks or integrate models in a test-bench environment because it requires knowledge of SystemVerilog and UVM libraries. For the creation of register models, many commercial tools support register model generation from a register specification described in IP-XACT, but it is time-consuming to describe a register specification in IP-XACT format. For easy creation of register models, we propose a spreadsheet-based register template which is translated to an IP-XACT description, from which register models can be easily generated using commercial tools. On the other hand, we also automate all the steps involved in integrating the test-bench and generating test-cases, so that designers may use the register model without detailed knowledge of UVM or SystemVerilog. This automation flow involves generating and connecting test-bench components (e.g., driver, checker, bus adaptor, etc.) and writing a test sequence for each type of register test-case. With the proposed flow, designers can save a considerable amount of time when verifying the functionality of registers.
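To make the spreadsheet-to-IP-XACT idea concrete, a hedged sketch of translating a flat register spreadsheet into IP-XACT-flavoured XML is shown below. The column and element names are simplified assumptions; a real flow would have to emit the full IEEE 1685 schema with its namespaces rather than this skeleton:

```python
import csv, io
import xml.etree.ElementTree as ET

# Hypothetical spreadsheet columns: name, offset, width, access, description.
SPREADSHEET = """name,offset,width,access,description
CTRL,0x00,32,read-write,Control register
STATUS,0x04,32,read-only,Status flags
"""

def registers_to_xml(csv_text: str) -> str:
    """Translate a flat register spreadsheet into a minimal IP-XACT-like
    XML fragment (element names simplified for illustration only)."""
    root = ET.Element("addressBlock")
    for row in csv.DictReader(io.StringIO(csv_text)):
        reg = ET.SubElement(root, "register")
        ET.SubElement(reg, "name").text = row["name"]
        ET.SubElement(reg, "addressOffset").text = row["offset"]
        ET.SubElement(reg, "size").text = row["width"]
        ET.SubElement(reg, "access").text = row["access"]
        ET.SubElement(reg, "description").text = row["description"]
    return ET.tostring(root, encoding="unicode")

print(registers_to_xml(SPREADSHEET))
```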
Li, Siwei; Ding, Wentao; Zhang, Xueli; Jiang, Huifeng; Bi, Changhao
2016-01-01
Saccharomyces cerevisiae has already been used for heterologous production of fuel chemicals and valuable natural products. The establishment of complicated heterologous biosynthetic pathways in S. cerevisiae became the research focus of Synthetic Biology and Metabolic Engineering. Thus, simple and efficient genomic integration techniques of large number of transcription units are demanded urgently. An efficient DNA assembly and chromosomal integration method was created by combining homologous recombination (HR) in S. cerevisiae and Golden Gate DNA assembly method, designated as modularized two-step (M2S) technique. Two major assembly steps are performed consecutively to integrate multiple transcription units simultaneously. In Step 1, Modularized scaffold containing a head-to-head promoter module and a pair of terminators was assembled with two genes. Thus, two transcription units were assembled with Golden Gate method into one scaffold in one reaction. In Step 2, the two transcription units were mixed with modules of selective markers and integration sites and transformed into S. cerevisiae for assembly and integration. In both steps, universal primers were designed for identification of correct clones. Establishment of a functional β-carotene biosynthetic pathway in S. cerevisiae within 5 days demonstrated high efficiency of this method, and a 10-transcriptional-unit pathway integration illustrated the capacity of this method. Modular design of transcription units and integration elements simplified assembly and integration procedure, and eliminated frequent designing and synthesis of DNA fragments in previous methods. Also, by assembling most parts in Step 1 in vitro, the number of DNA cassettes for homologous integration in Step 2 was significantly reduced. Thus, high assembly efficiency, high integration capacity, and low error rate were achieved.
NASA Astrophysics Data System (ADS)
Wang, Xingwei; Zheng, Bin; Li, Shibo; Mulvihill, John J.; Chen, Xiaodong; Liu, Hong
2010-07-01
Karyotyping is an important process to classify chromosomes into standard classes and the results are routinely used by the clinicians to diagnose cancers and genetic diseases. However, visual karyotyping using microscopic images is time-consuming and tedious, which reduces the diagnostic efficiency and accuracy. Although many efforts have been made to develop computerized schemes for automated karyotyping, no scheme can operate without substantial human intervention. Instead of developing a method to classify all chromosome classes, we develop an automatic scheme to detect abnormal metaphase cells by identifying a specific class of chromosomes (class 22) and prescreening for suspected chronic myeloid leukemia (CML). The scheme includes three steps: (1) iteratively segment randomly distributed individual chromosomes, (2) process segmented chromosomes and compute image features to identify the candidates, and (3) apply an adaptive matching template to identify chromosomes of class 22. An image data set of 451 metaphase cells extracted from bone marrow specimens of 30 positive and 30 negative cases for CML is selected to test the scheme's performance. The overall case-based classification accuracy is 93.3% (100% sensitivity and 86.7% specificity). The results demonstrate the feasibility of applying an automated scheme to detect or prescreen suspicious cancer cases.
Automated Technical Library System Users Manual.
1979-12-01
[Partially legible excerpt from the search-syntax section of the manual.] OR retrieves records containing one or the other or both terms; AND NOT retrieves records containing one term and not the other. To combine sets, enclose all Boolean statements, including embedded statements, in parentheses. Example FIND statements: 17/ FG-9001 and 18/ CS-ENERGY DEPT* AND FG=9001. Sometimes you are only interested in seeing the final results of a FIND statement without the intermediate step sets.
More steps towards process automation for optical fabrication
NASA Astrophysics Data System (ADS)
Walker, David; Yu, Guoyu; Beaucamp, Anthony; Bibby, Matt; Li, Hongyu; McCluskey, Lee; Petrovic, Sanja; Reynolds, Christina
2017-06-01
In the context of Industrie 4.0, we have previously described the roles of robots in optical processing, and their complementarity with classical CNC machines, providing both processing and automation functions. After having demonstrated robotic moving of parts between a CNC polisher and metrology station, and auto-fringe-acquisition, we have moved on to automate the wash-down operation. This is part of a wider strategy we describe in this paper, leading towards automating the decision-making operations required before and throughout an optical manufacturing cycle.
An automated qualification framework for the MeerKAT CAM (Control-And-Monitoring)
NASA Astrophysics Data System (ADS)
van den Heever, Lize; Marais, Neilen; Slabber, Martin
2016-08-01
This paper introduces and discusses the design of an Automated Qualification Framework (AQF) that was developed to automate as much as possible of the formal Qualification Testing of the Control And Monitoring (CAM) subsystem of the 64-dish MeerKAT radio telescope currently under construction in the Karoo region of South Africa. The AQF allows each Integrated CAM Test to reference the MeerKAT CAM requirement and associated verification requirement it covers and automatically produces the Qualification Test Procedure and Qualification Test Report from the test steps and evaluation steps annotated in the Integrated CAM Tests. The MeerKAT System Engineers are extremely happy with the AQF results, but mostly with the approach and process it enforces.
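A minimal sketch of the underlying idea, in which annotated test steps referencing requirement identifiers are collected into a qualification procedure/report skeleton; the records, requirement IDs, and field names below are invented for illustration and are not the MeerKAT CAM artifacts:

```python
from collections import defaultdict

# Hypothetical annotated integrated tests: each step records the requirement it
# verifies, the action taken, and the evaluation performed.
annotated_tests = [
    {"test": "test_capture_start", "requirement": "CAM-R-0101",
     "step": "Issue capture-start command", "evaluation": "Capture state becomes ACTIVE"},
    {"test": "test_capture_start", "requirement": "CAM-R-0101",
     "step": "Stop capture", "evaluation": "Capture state returns to IDLE"},
    {"test": "test_alarm_escalation", "requirement": "CAM-R-0230",
     "step": "Raise a sensor alarm", "evaluation": "Alarm is escalated to operator display"},
]

def build_report(tests):
    """Group annotated steps by requirement to emit a procedure/report skeleton."""
    by_req = defaultdict(list)
    for entry in tests:
        by_req[entry["requirement"]].append(entry)
    lines = []
    for req, entries in sorted(by_req.items()):
        lines.append(f"Requirement {req}")
        for i, e in enumerate(entries, 1):
            lines.append(f"  Step {i}: {e['step']}  |  Verify: {e['evaluation']}")
    return "\n".join(lines)

print(build_report(annotated_tests))
```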
Orbegoso, Elder Mendoza; Saavedra, Rafael; Marcelo, Daniel; La Madrid, Raúl
2017-12-01
In the northern coastal and jungle areas of Peru, cocoa beans are dried using artisan methods, such as direct exposure to sunlight. This traditional process is time-intensive, leading to a reduction in productivity and, therefore, delays in delivery times. The present study was intended to numerically characterise the thermal behaviour of three configurations of solar air heating collectors in order to determine which demonstrated the best thermal performance under several controlled operating conditions. For this purpose, a computational fluid dynamics model was developed to describe the simultaneous convective and radiative heat transfer phenomena under several operation conditions. The constructed computational fluid dynamics model was firstly validated through comparison with the data measurements of a one-step solar air heating collector. We then simulated two further three-step solar air heating collectors in order to identify which demonstrated the best thermal performance in terms of outlet air temperature and thermal efficiency. The numerical results show that under the same solar irradiation, exposed area, and operating conditions, the three-step solar air heating collector with the collector plate mounted between the second and third channels was 67% more thermally efficient than the one-step solar air heating collector. This is because the air's exposure to the collector plate surface in the three-step device was twice that in the one-step solar air heating collector. Copyright © 2017 Elsevier Ltd. All rights reserved.
An automated system for reduction of the firm's employees under maximal overall efficiency
NASA Astrophysics Data System (ADS)
Yonchev, Yoncho; Nikolov, Simeon; Baeva, Silvia
2012-11-01
Achieving maximal overall efficiency is a priority in all companies. This problem is formulated as a knapsack problem and afterwards as a linear assignment problem. An automated system is created for solving this problem.
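The abstract formulates the reduction problem as a knapsack problem. As a reference point, a standard 0/1 knapsack dynamic program is sketched below with invented numbers; the paper's actual formulation (and its subsequent recasting as a linear assignment problem) is not reproduced here:

```python
def knapsack(values, costs, budget):
    """0/1 knapsack by dynamic programming: maximise total value (e.g. retained
    overall efficiency) subject to a cost budget. Returns best value and chosen items."""
    n = len(values)
    best = [[0] * (budget + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for b in range(budget + 1):
            best[i][b] = best[i - 1][b]
            if costs[i - 1] <= b:
                best[i][b] = max(best[i][b], best[i - 1][b - costs[i - 1]] + values[i - 1])
    # Backtrack through the table to recover the selected set.
    chosen, b = [], budget
    for i in range(n, 0, -1):
        if best[i][b] != best[i - 1][b]:
            chosen.append(i - 1)
            b -= costs[i - 1]
    return best[n][budget], sorted(chosen)

# Illustrative numbers only: value = efficiency contribution, cost = salary units.
print(knapsack(values=[8, 5, 6, 4], costs=[3, 2, 2, 1], budget=5))   # -> (15, [1, 2, 3])
```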
Automatic DNA Diagnosis for 1D Gel Electrophoresis Images using Bio-image Processing Technique.
Intarapanich, Apichart; Kaewkamnerd, Saowaluck; Shaw, Philip J; Ukosakit, Kittipat; Tragoonrung, Somvong; Tongsima, Sissades
2015-01-01
DNA gel electrophoresis is a molecular biology technique for separating different sizes of DNA fragments. Applications of DNA gel electrophoresis include DNA fingerprinting (genetic diagnosis), size estimation of DNA, and DNA separation for Southern blotting. Accurate interpretation of DNA banding patterns from electrophoretic images can be laborious and error prone when a large number of bands are interrogated manually. Although many bio-imaging techniques have been proposed, none of them can fully automate the typing of DNA owing to the complexities of migration patterns typically obtained. We developed an image-processing tool that automatically calls genotypes from DNA gel electrophoresis images. The image processing workflow comprises three main steps: 1) lane segmentation, 2) extraction of DNA bands and 3) band genotyping classification. The tool was originally intended to facilitate large-scale genotyping analysis of sugarcane cultivars. We tested the proposed tool on 10 gel images (433 cultivars) obtained from polyacrylamide gel electrophoresis (PAGE) of PCR amplicons for detecting intron length polymorphisms (ILP) on one locus of the sugarcanes. These gel images demonstrated many challenges in automated lane/band segmentation in image processing including lane distortion, band deformity, high degree of noise in the background, and bands that are very close together (doublets). Using the proposed bio-imaging workflow, lanes and DNA bands contained within are properly segmented, even for adjacent bands with aberrant migration that cannot be separated by conventional techniques. The software, called GELect, automatically performs genotype calling on each lane by comparing with an all-banding reference, which was created by clustering the existing bands into the non-redundant set of reference bands. The automated genotype calling results were verified by independent manual typing by molecular biologists. This work presents an automated genotyping tool from DNA gel electrophoresis images, called GELect, which was written in Java and made available through the imageJ framework. With a novel automated image processing workflow, the tool can accurately segment lanes from a gel matrix, intelligently extract distorted and even doublet bands that are difficult to identify by existing image processing tools. Consequently, genotyping from DNA gel electrophoresis can be performed automatically allowing users to efficiently conduct large scale DNA fingerprinting via DNA gel electrophoresis. The software is freely available from http://www.biotec.or.th/gi/tools/gelect.
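To illustrate the first step of the workflow (lane segmentation), a deliberately simplified sketch based on column-intensity projection is shown below on a synthetic gel. GELect itself uses a more robust pipeline to cope with lane distortion, band deformity, and noise, so this is only a conceptual illustration:

```python
import numpy as np

def segment_lanes(gel: np.ndarray, min_width: int = 3):
    """Very simple lane segmentation for a gel image (bright bands on a dark
    background): project intensity onto the horizontal axis and treat runs of
    columns above the mean projection as lanes."""
    profile = gel.sum(axis=0).astype(float)
    active = profile > profile.mean()
    lanes, start = [], None
    for x, on in enumerate(active):
        if on and start is None:
            start = x
        elif not on and start is not None:
            if x - start >= min_width:
                lanes.append((start, x))
            start = None
    if start is not None:
        lanes.append((start, len(active)))
    return lanes

# Synthetic gel: two bright vertical lanes in a 40x60 image.
gel = np.zeros((40, 60))
gel[:, 10:18] = 1.0
gel[:, 35:43] = 1.0
print(segment_lanes(gel))   # -> [(10, 18), (35, 43)]
```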
Souvignet, Julien; Declerck, Gunnar; Asfari, Hadyl; Jaulent, Marie-Christine; Bousquet, Cédric
2016-10-01
Efficient searching and coding in databases that use terminological resources requires that they support efficient data retrieval. The Medical Dictionary for Regulatory Activities (MedDRA) is a reference terminology for several countries and organizations to code adverse drug reactions (ADRs) for pharmacovigilance. Ontologies that are available in the medical domain provide several advantages such as reasoning to improve data retrieval. The field of pharmacovigilance does not yet benefit from a fully operational ontology to formally represent the MedDRA terms. Our objective was to build a semantic resource based on formal description logic to improve MedDRA term retrieval and aid the generation of on-demand custom groupings by appropriately and efficiently selecting terms: OntoADR. The method consists of the following steps: (1) mapping between MedDRA terms and SNOMED-CT, (2) generation of semantic definitions using semi-automatic methods, (3) storage of the resource and (4) manual curation by pharmacovigilance experts. We built a semantic resource for ADRs enabling a new type of semantics-based term search. OntoADR adds new search capabilities relative to previous approaches, overcoming the usual limitations of computation using lightweight description logic, such as the intractability of unions or negation queries, bringing it closer to user needs. Our automated approach for defining MedDRA terms enabled the association of at least one defining relationship with 67% of preferred terms. The curation work performed on our sample showed an error level of 14% for this automated approach. We tested OntoADR in practice, which allowed us to build custom groupings for several medical topics of interest. The methods we describe in this article could be adapted and extended to other terminologies which do not benefit from a formal semantic representation, thus enabling better data retrieval performance. Our custom groupings of MedDRA terms were used while performing signal detection, which suggests that the graphical user interface we are currently implementing to process OntoADR could be usefully integrated into specialized pharmacovigilance software that rely on MedDRA. Copyright © 2016 Elsevier Inc. All rights reserved.
Method and Apparatus for Automated Isolation of Nucleic Acids from Small Cell Samples
NASA Technical Reports Server (NTRS)
Sundaram, Shivshankar; Prabhakarpandian, Balabhaskar; Pant, Kapil; Wang, Yi
2014-01-01
RNA isolation is a ubiquitous need, driven by current emphasis on microarrays and miniaturization. With commercial systems requiring 100,000 to 1,000,000 cells for successful isolation, there is a growing need for a small-footprint, easy-to-use device that can harvest nucleic acids from much smaller cell samples (1,000 to 10,000 cells). The process of extraction of RNA from cell cultures is a complex, multi-step one, and requires timed, asynchronous operations with multiple reagents/buffers. An added complexity is the fragility of RNA (subject to degradation) and its reactivity to surface. A novel, microfluidics-based, integrated cartridge has been developed that can fully automate the complex process of RNA isolation (lyse, capture, and elute RNA) from small cell culture samples. On-cartridge cell lysis is achieved using either reagents or high-strength electric fields made possible by the miniaturized format. Traditionally, silica-based, porous-membrane formats have been used for RNA capture, requiring slow perfusion for effective capture. In this design, high efficiency capture/elution are achieved using a microsphere-based "microfluidized" format. Electrokinetic phenomena are harnessed to actively mix microspheres with the cell lysate and capture/elution buffer, providing important advantages in extraction efficiency, processing time, and operational flexibility. Successful RNA isolation was demonstrated using both suspension (HL-60) and adherent (BHK-21) cells. Novel features associated with this development are twofold. First, novel designs that execute needed processes with improved speed and efficiency were developed. These primarily encompass electric-field-driven lysis of cells. The configurations include electrode-containing constructs, or an "electrode-less" chip design, which is easy to fabricate and mitigates fouling at the electrode surface; and the "fluidized" extraction format based on electrokinetically assisted mixing and contacting of microbeads in a shape-optimized chamber. A secondary proprietary feature is in the particular layout integrating these components to perform the desired operation of RNA isolation. Apart from a novel functional capability, advantages of the innovation include reduced or eliminated use of toxic reagents, and operator-independent extraction of RNA.
Lomnitz, Jason G.; Savageau, Michael A.
2016-01-01
Mathematical models of biochemical systems provide a means to elucidate the link between the genotype, environment, and phenotype. A subclass of mathematical models, known as mechanistic models, quantitatively describe the complex non-linear mechanisms that capture the intricate interactions between biochemical components. However, the study of mechanistic models is challenging because most are analytically intractable and involve large numbers of system parameters. Conventional methods to analyze them rely on local analyses about a nominal parameter set and they do not reveal the vast majority of potential phenotypes possible for a given system design. We have recently developed a new modeling approach that does not require estimated values for the parameters initially and inverts the typical steps of the conventional modeling strategy. Instead, this approach relies on architectural features of the model to identify the phenotypic repertoire and then predict values for the parameters that yield specific instances of the system that realize desired phenotypic characteristics. Here, we present a collection of software tools, the Design Space Toolbox V2 based on the System Design Space method, that automates (1) enumeration of the repertoire of model phenotypes, (2) prediction of values for the parameters for any model phenotype, and (3) analysis of model phenotypes through analytical and numerical methods. The result is an enabling technology that facilitates this radically new, phenotype-centric, modeling approach. We illustrate the power of these new tools by applying them to a synthetic gene circuit that can exhibit multi-stability. We then predict values for the system parameters such that the design exhibits 2, 3, and 4 stable steady states. In one example, inspection of the basins of attraction reveals that the circuit can count between three stable states by transient stimulation through one of two input channels: a positive channel that increases the count, and a negative channel that decreases the count. This example shows the power of these new automated methods to rapidly identify behaviors of interest and efficiently predict parameter values for their realization. These tools may be applied to understand complex natural circuitry and to aid in the rational design of synthetic circuits. PMID:27462346
Automated Quantitative Rare Earth Elements Mineralogy by Scanning Electron Microscopy
NASA Astrophysics Data System (ADS)
Sindern, Sven; Meyer, F. Michael
2016-09-01
Increasing industrial demand of rare earth elements (REEs) stems from the central role they play for advanced technologies and the accelerating move away from carbon-based fuels. However, REE production is often hampered by the chemical, mineralogical as well as textural complexity of the ores with a need for better understanding of their salient properties. This is not only essential for in-depth genetic interpretations but also for a robust assessment of ore quality and economic viability. The design of energy and cost-efficient processing of REE ores depends heavily on information about REE element deportment that can be made available employing automated quantitative process mineralogy. Quantitative mineralogy assigns numeric values to compositional and textural properties of mineral matter. Scanning electron microscopy (SEM) combined with a suitable software package for acquisition of backscatter electron and X-ray signals, phase assignment and image analysis is one of the most efficient tools for quantitative mineralogy. The four different SEM-based automated quantitative mineralogy systems, i.e. FEI QEMSCAN and MLA, Tescan TIMA and Zeiss Mineralogic Mining, which are commercially available, are briefly characterized. Using examples of quantitative REE mineralogy, this chapter illustrates capabilities and limitations of automated SEM-based systems. Chemical variability of REE minerals and analytical uncertainty can reduce performance of phase assignment. This is shown for the REE phases parisite and synchysite. In another example from a monazite REE deposit, the quantitative mineralogical parameters surface roughness and mineral association derived from image analysis are applied for automated discrimination of apatite formed in a breakdown reaction of monazite and apatite formed by metamorphism prior to monazite breakdown. SEM-based automated mineralogy fulfils all requirements for characterization of complex unconventional REE ores that will become increasingly important for supply of REEs in the future.
Fu, Yongqian; Sun, Xiaolong; Zhu, Huayue; Jiang, Ru; Luo, Xi; Yin, Longfei
2018-05-21
In previous work, we proposed a novel modified one-step fermentation fed-batch strategy to efficiently generate L-lactic acid (L-LA) using Rhizopus oryzae. In this study, to further enhance efficiency of L-LA production through one-step fermentation in fed-batch cultures, we systematically investigated the initial peptone- and glucose-feeding approaches, including different initial peptone and glucose concentrations and maintained residual glucose levels. Based on the results of this study, culturing R. oryzae with initial peptone and glucose concentrations of 3.0 and 50.0 g/l, respectively, using a fed-batch strategy is an effective approach of producing L-LA through one-step fermentation. Changing the residual glucose had no obvious effect on the generation of L-LA. We determined the maximum LA production and productivity to be 162 g/l and 6.23 g/(l·h), respectively, during the acid production stage. Compared to our previous work, there was almost no change in L-LA production or yield; however, the productivity of L-LA increased by 14.3%.
Automated procedures for sizing aerospace vehicle structures /SAVES/
NASA Technical Reports Server (NTRS)
Giles, G. L.; Blackburn, C. L.; Dixon, S. C.
1972-01-01
Results from a continuing effort to develop automated methods for structural design are described. A system of computer programs presently under development called SAVES is intended to automate the preliminary structural design of a complete aerospace vehicle. Each step in the automated design process of the SAVES system of programs is discussed, with emphasis placed on use of automated routines for generation of finite-element models. The versatility of these routines is demonstrated by structural models generated for a space shuttle orbiter, an advanced technology transport, and a hydrogen-fueled Mach 3 transport. Illustrative numerical results are presented for the Mach 3 transport wing.
Feasibility and Utility of Lexical Analysis for Occupational Health Text.
Harber, Philip; Leroy, Gondy
2017-06-01
Assess feasibility and potential utility of natural language processing (NLP) for storing and analyzing occupational health data. Basic NLP lexical analysis methods were applied to 89,000 Mine Safety and Health Administration (MSHA) free text records. Steps included tokenization, term and co-occurrence counts, term annotation, and identifying exposure-health effect relationships. Presence of terms in the Unified Medical Language System (UMLS) was assessed. The methods efficiently demonstrated common exposures, health effects, and exposure-injury relationships. Many workplace terms are not present in UMLS or map inaccurately. Use of free text rather than narrowly defined numerically coded fields is feasible, flexible, and efficient. It has potential to encourage workers and clinicians to provide more data and to support automated knowledge creation. The lexical method used is easily generalizable to other areas. The UMLS vocabularies should be enhanced to be relevant to occupational health.
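A minimal sketch of the basic lexical steps described (tokenization, term counts, and within-record co-occurrence counts), using invented free-text records rather than MSHA data; mapping of terms to UMLS concepts is only noted as a comment:

```python
import re
from collections import Counter
from itertools import combinations

# Invented free-text records standing in for occupational health narratives.
records = [
    "miner struck by falling rock, laceration to hand",
    "exposure to silica dust, shortness of breath",
    "falling rock injured shoulder during roof bolting",
]

def tokenize(text):
    """Lowercase word tokenizer; a fuller pipeline would also annotate terms
    against UMLS vocabularies."""
    return re.findall(r"[a-z]+", text.lower())

term_counts = Counter()
cooccurrence = Counter()
for rec in records:
    tokens = set(tokenize(rec))
    term_counts.update(tokens)
    # Count unordered term pairs occurring within the same record.
    cooccurrence.update(combinations(sorted(tokens), 2))

print(term_counts.most_common(3))
print([pair for pair, n in cooccurrence.items() if n > 1])   # e.g. ('falling', 'rock')
```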
Parallelization of ARC3D with Computer-Aided Tools
NASA Technical Reports Server (NTRS)
Jin, Haoqiang; Hribar, Michelle; Yan, Jerry; Saini, Subhash (Technical Monitor)
1998-01-01
A series of efforts have been devoted to investigating methods of porting and parallelizing applications quickly and efficiently for new architectures, such as the SGI Origin 2000 and Cray T3E. This report presents the parallelization of a CFD application, ARC3D, using the computer-aided parallelization tool CAPTools. Steps of parallelizing this code and requirements of achieving better performance are discussed. The generated parallel version has achieved reasonably good performance, for example a speedup of 30 on 36 Cray T3E processors. However, this performance could not be obtained without modification of the original serial code. It is suggested that in many cases improving serial code and performing necessary code transformations are important parts of the automated parallelization process, although user intervention in many of these parts is still necessary. Nevertheless, development and improvement of useful software tools, such as CAPTools, can help trim down many tedious parallelization details and improve the processing efficiency.
Strategies Toward Automation of Overset Structured Surface Grid Generation
NASA Technical Reports Server (NTRS)
Chan, William M.
2017-01-01
An outline of a strategy for automation of overset structured surface grid generation on complex geometries is described. The starting point of the process consists of an unstructured surface triangulation representation of the geometry derived from a native CAD, STEP, or IGES definition, and a set of discretized surface curves that captures all geometric features of interest. The procedure for surface grid generation is decomposed into an algebraic meshing step, a hyperbolic meshing step, and a gap-filling step. This paper will focus primarily on the high-level plan with details on the algebraic step. The algorithmic procedure for the algebraic step involves analyzing the topology of the network of surface curves, distributing grid points appropriately on these curves, identifying domains bounded by four curves that can be meshed algebraically, concatenating the resulting grids into fewer patches, and extending appropriate boundaries of the concatenated grids to provide proper overlap. Results are presented for grids created on various aerospace vehicle components.
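For the algebraic step, domains bounded by four curves can be meshed by transfinite (Coons) interpolation. A small sketch under that assumption follows; it is a generic formulation, not the specific implementation described in the paper:

```python
import numpy as np

def coons_patch(bottom, top, left, right, nu=5, nv=4):
    """Algebraic (transfinite / Coons) interpolation of a structured surface grid
    over a domain bounded by four parameterised curves on [0, 1]. The curves are
    assumed to share corner points; the output has shape (nu, nv, 2)."""
    u = np.linspace(0.0, 1.0, nu)[:, None]       # (nu, 1)
    v = np.linspace(0.0, 1.0, nv)[None, :]       # (1, nv)
    U, V = u[..., None], v[..., None]            # add a coordinate axis for blending
    cb, ct = bottom(u), top(u)                   # (nu, 1, 2)
    cl, cr = left(v), right(v)                   # (1, nv, 2)
    p00, p10 = bottom(np.array(0.0)), bottom(np.array(1.0))
    p01, p11 = top(np.array(0.0)), top(np.array(1.0))
    return ((1 - V) * cb + V * ct + (1 - U) * cl + U * cr
            - ((1 - U) * (1 - V) * p00 + U * (1 - V) * p10
               + (1 - U) * V * p01 + U * V * p11))

# Toy four-curve domain: a unit square with a bulged top edge.
bottom = lambda t: np.stack([t, 0 * t], axis=-1)
top    = lambda t: np.stack([t, 1 + 0.2 * np.sin(np.pi * t)], axis=-1)
left   = lambda t: np.stack([0 * t, t], axis=-1)
right  = lambda t: np.stack([1 + 0 * t, t], axis=-1)
print(coons_patch(bottom, top, left, right).shape)   # -> (5, 4, 2)
```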
NASA Astrophysics Data System (ADS)
Abate, A.; Pressello, M. C.; Benassi, M.; Strigari, L.
2009-12-01
The aim of this study was to evaluate the effectiveness and efficiency in inverse IMRT planning of one-step optimization with the step-and-shoot (SS) technique as compared to traditional two-step optimization using the sliding windows (SW) technique. The Pinnacle IMRT TPS allows both one-step and two-step approaches. The same beam setup for five head-and-neck tumor patients and dose-volume constraints were applied for all optimization methods. Two-step plans were produced converting the ideal fluence with or without a smoothing filter into the SW sequence. One-step plans, based on direct machine parameter optimization (DMPO), had the maximum number of segments per beam set at 8, 10, 12, producing a directly deliverable sequence. Moreover, the plans were generated whether a split-beam was used or not. Total monitor units (MUs), overall treatment time, cost function and dose-volume histograms (DVHs) were estimated for each plan. PTV conformality and homogeneity indexes and normal tissue complication probability (NTCP) that are the basis for improving therapeutic gain, as well as non-tumor integral dose (NTID), were evaluated. A two-sided t-test was used to compare quantitative variables. All plans showed similar target coverage. Compared to two-step SW optimization, the DMPO-SS plans resulted in lower MUs (20%), NTID (4%) as well as NTCP values. Differences of about 15-20% in the treatment delivery time were registered. DMPO generates less complex plans with identical PTV coverage, providing lower NTCP and NTID, which is expected to reduce the risk of secondary cancer. It is an effective and efficient method and, if available, it should be favored over the two-step IMRT planning.
Development of a semi-automated combined PET and CT lung lesion segmentation framework
NASA Astrophysics Data System (ADS)
Rossi, Farli; Mokri, Siti Salasiah; Rahni, Ashrani Aizzuddin Abd.
2017-03-01
Segmentation is one of the most important steps in automated medical diagnosis applications, which affects the accuracy of the overall system. In this paper, we propose a semi-automated segmentation method for extracting lung lesions from thoracic PET/CT images by combining low level processing and active contour techniques. The lesions are first segmented in PET images which are first converted to standardised uptake values (SUVs). The segmented PET images then serve as an initial contour for subsequent active contour segmentation of corresponding CT images. To evaluate its accuracy, the Jaccard Index (JI) was used as a measure of the accuracy of the segmented lesion compared to alternative segmentations from the QIN lung CT segmentation challenge, which is possible by registering the whole body PET/CT images to the corresponding thoracic CT images. The results show that our proposed technique has acceptable accuracy in lung lesion segmentation with JI values of around 0.8, especially when considering the variability of the alternative segmentations.
Agrawal, Rupesh; Keane, Pearse A; Singh, Jasmin; Saihan, Zubin; Kontos, Andreas; Pavesio, Carlos E
2016-01-01
To assess correlation for anterior chamber flare grading between clinicians with different levels of experience and with semi-automated flare reading in a cohort of patients with heterogeneous uveitic entities. Fifty-nine observations from 36 patients were recorded and analyzed for statistical association. In each patient, flare was assessed objectively using the Kowa FM-700 laser flare photometer, and subjective masked grading by two clinicians was performed. The study demonstrated disparity in flare readings between clinical graders, with one-step disagreement in clinical grading in 26 (44.06%) eyes (p < 0.001), and concordance between the flare readings of the experienced grader and flare photometry. After review of semi-automated flare readings, management was changed in 11% of the patients. Laser flare photometry can be a valuable tool to remove the observer bias in grading flare for a selected cohort of uveitis patients. It can be further applied to titrate therapy in intraocular inflammation.
Downey, Brandon; Schmitt, John; Beller, Justin; Russell, Brian; Quach, Anthony; Hermann, Elizabeth; Lyon, David; Breit, Jeffrey
2017-11-01
As the biopharmaceutical industry evolves to include more diverse protein formats and processes, more robust control of Critical Quality Attributes (CQAs) is needed to maintain processing flexibility without compromising quality. Active control of CQAs has been demonstrated using model predictive control techniques, which allow development of processes which are robust against disturbances associated with raw material variability and other potentially flexible operating conditions. Wide adoption of model predictive control in biopharmaceutical cell culture processes has been hampered, however, in part due to the large amount of data and expertise required to make a predictive model of controlled CQAs, a requirement for model predictive control. Here we developed a highly automated, perfusion apparatus to systematically and efficiently generate predictive models using application of system identification approaches. We successfully created a predictive model of %galactosylation using data obtained by manipulating galactose concentration in the perfusion apparatus in serialized step change experiments. We then demonstrated the use of the model in a model predictive controller in a simulated control scenario to successfully achieve a %galactosylation set point in a simulated fed-batch culture. The automated model identification approach demonstrated here can potentially be generalized to many CQAs, and could be a more efficient, faster, and highly automated alternative to batch experiments for developing predictive models in cell culture processes, and allow the wider adoption of model predictive control in biopharmaceutical processes. © 2017 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers Biotechnol. Prog., 33:1647-1661, 2017. © 2017 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers.
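As a toy illustration of building a predictive model from serialized step-change experiments, the sketch below fits a first-order step response to synthetic data. The model form, numbers, and variable names are assumptions for illustration and do not represent the authors' identified model of %galactosylation:

```python
import numpy as np
from scipy.optimize import curve_fit

def first_order_step(t, K, tau, y0):
    """Response of a first-order process to a unit step input applied at t = 0."""
    return y0 + K * (1.0 - np.exp(-t / tau))

# Synthetic step-change data: a quality attribute responding to a feed step.
rng = np.random.default_rng(0)
t = np.linspace(0, 48, 25)                       # hours since the step
y = first_order_step(t, 4.0, 10.0, 60.0) + rng.normal(0, 0.05, t.size)

(K, tau, y0), _ = curve_fit(first_order_step, t, y, p0=[1.0, 5.0, y[0]])
print(f"gain={K:.2f}, time constant={tau:.1f} h, baseline={y0:.1f}")
```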
NASA Astrophysics Data System (ADS)
Wang, Minhuan; Feng, Yulin; Bian, Jiming; Liu, Hongzhu; Shi, Yantao
2018-01-01
The mesoscopic perovskite solar cells (M-PSCs) were synthesized with MAPbI3 perovskite layers as light harvesters, which were grown with one-step and two-step solution processes, respectively. A comparative study was performed through the quantitative correlation of resulting device performance and the crystalline quality of perovskite layers. Compared with the one-step counterpart, a pronounced improvement of 56.86% in the steady-state power conversion efficiency (PCE) was achieved with the two-step process, which mainly resulted from the significant enhancement in fill factor (FF) from 48% to 77% without sacrificing the open circuit voltage (Voc) and short circuit current (Jsc). The enhanced FF was attributed to the reduced non-radiative recombination channels due to the better crystalline quality and larger grain size with the two-step processed perovskite layer. Moreover, the superiority of the two-step over the one-step process was demonstrated with rather good reproducibility.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ton, H.; Yeung, E.S.
1997-02-15
An integrated on-line prototype for coupling a microreactor to capillary electrophoresis for DNA sequencing has been demonstrated. A dye-labeled terminator cycle-sequencing reaction is performed in a fused-silica capillary. Subsequently, the sequencing ladder is directly injected into a size-exclusion chromatographic column operated at nearly 95 °C for purification. On-line injection to a capillary for electrophoresis is accomplished at a junction set at nearly 70 °C. High temperature at the purification column and injection junction prevents the renaturation of DNA fragments during on-line transfer without affecting the separation. The high solubility of DNA in, and the relatively low ionic strength of, 1x TE buffer permit both effective purification and electrokinetic injection of the DNA sample. The system is compatible with highly efficient separations by a replaceable poly(ethylene oxide) polymer solution in uncoated capillary tubes. Future automation and adaptation to a multiple-capillary array system should allow high-speed, high-throughput DNA sequencing from templates to called bases in one step. 32 refs., 5 figs.
Mueller, Dirk; Klette, Ingo; Baum, Richard P; Gottschaldt, M; Schultz, Michael K; Breeman, Wouter A P
2012-08-15
A simple sodium chloride (NaCl) based (68)Ga eluate concentration and labeling method that enables rapid, high-efficiency labeling of DOTA conjugated peptides in high radiochemical purity is described. The method utilizes relatively few reagents and comprises minimal procedural steps. It is particularly well-suited for routine automated synthesis of clinical radiopharmaceuticals. For the (68)Ga generator eluate concentration step, commercially available cation-exchange cartridges and (68)Ga generators were used. The (68)Ga generator eluate was collected by use of a strong cation exchange cartridge. 98% of the total activity of (68)Ga was then eluted from the cation exchange cartridge with 0.5 mL of 5 M NaCl solution containing a small amount of 5.5 M HCl. After buffering with ammonium acetate, the eluate was used directly for radiolabeling of DOTATOC and DOTATATE. The (68)Ga-labeled peptides were obtained in higher radiochemical purity compared to other commonly used procedures, with radiochemical yields greater than 80%. The presence of (68)Ge could not be detected in the final product. The new method obviates the need for organic solvents, which eliminates the required quality control of the final product by gas chromatography, thereby reducing postsynthesis analytical effort significantly. The (68)Ga-labeled products were used directly, with no subsequent purification steps, such as solid-phase extraction. The NaCl method was further evaluated using an automated fluid handling system and it routinely facilitates radiochemical yields in excess of 65% in less than 15 min, with radiochemical purity consistently greater than 99% for the preparation of (68)Ga-DOTATOC.
Human Factors Design Of Automated Highway Systems: Scenario Definition
DOT National Transportation Integrated Search
1995-09-01
Attention to driver acceptance and performance issues during system design will be key to the success of the Automated Highway System (AHS). A first step in the process of defining driver roles and driver-system interface requirements of AHS is the d...
ARES - A New Airborne Reflective Emissive Spectrometer
2005-10-01
Information and Management System (DIMS), an automated processing environment with robot archive interface as established for the handling of satellite data...consisting of geocoded ground reflectance data. All described processing steps will be integrated in the automated processing environment DIMS to assure a
Proposed Conceptual Requirements for the CTBT Knowledge Base,
1995-08-14
knowledge available to automated processing routines and human analysts are significant, and solving these problems is an essential step in ensuring...knowledge storage in a CTBT system. In addition to providing regional knowledge to automated processing routines, the knowledge base will also address
Final Technical Report for Automated Manufacturing of Innovative CPV/PV Modules
DOE Office of Scientific and Technical Information (OSTI.GOV)
Okawa, David
Cogenra’s Dense Cell Interconnect system was designed to use traditional front-contact cells and string them together into high efficiency and high reliability “supercells”. This novel stringer allows one to take advantage of the ~100 GW/year of existing cell production capacity and create a solar product for the customer that will produce more power and last longer than traditional PV products. The goal for this program was for Cogenra Solar to design and develop a first-of-kind automated solar manufacturing line that produces strings of overlapping cells or “supercells” based on Cogenra’s Dense Cell Interconnect (DCI) technology for their Low Concentration Photovoltaic (LCPV) systems. This will enable the commercialization of DCI technology to improve the efficiency, reliability and economics for their Low Concentration Photovoltaic systems. In this program, Cogenra Solar very successfully designed, developed, built, installed, and started up the ground-breaking manufacturing tools required to assemble supercells. Cogenra then successfully demonstrated operation of the integrated line at high yield and throughput far exceeding expectations. The development of a supercell production line represents a critical step toward a high volume and low cost Low Concentration Photovoltaic Module with Dense Cell Interconnect technology and has enabled the evaluation of the technology for reliability and yield. Unfortunately, performance and cost headwinds on Low Concentration Photovoltaics systems including lack of diffuse capture (10-15% hit) and more expensive tracker requirements resulted in a move away from LCPV technology. Fortunately, the versatility of Dense Cell Interconnect technology allows for application to flat plate module technology as well and Cogenra has worked with the DOE to utilize the learning from this grant to commercialize DCI technology for the solar market through the on-going grant: Catalyzing PV Manufacturing in the US With Cogenra Solar’s Next-Generation Dense Cell Interconnect PV Module Manufacturing Technology. This program is now very successfully building off of this work and commercializing the technology to enable increased solar adoption.
NASA Astrophysics Data System (ADS)
Cherri, Abdallah K.; Alam, Mohammed S.
1998-07-01
Highly-efficient two-step recoded and one-step nonrecoded trinary signed-digit (TSD) carry-free adders-subtracters are presented on the basis of redundant-bit representation for the operands' digits. It has been shown that only 24 (30) minterms are needed to implement the two-step recoded (the one-step nonrecoded) TSD addition for any operand length. Optical implementation of the proposed arithmetic can be carried out by use of correlation- or matrix-multiplication-based schemes, saving 50% of the system memory. Furthermore, we present four different multiplication designs based on our proposed recoded and nonrecoded TSD adders. Our multiplication designs require a small number of reduced minterms to generate the multiplication partial products. Finally, a recently proposed pipelined iterative-tree algorithm can be used in the TSD adders-multipliers; consequently, efficient use of all available adders can be made.
Cherri, A K; Alam, M S
1998-07-10
Highly-efficient two-step recoded and one-step nonrecoded trinary signed-digit (TSD) carry-free adders-subtracters are presented on the basis of redundant-bit representation for the operands' digits. It has been shown that only 24 (30) minterms are needed to implement the two-step recoded (the one-step nonrecoded) TSD addition for any operand length. Optical implementation of the proposed arithmetic can be carried out by use of correlation- or matrix-multiplication-based schemes, saving 50% of the system memory. Furthermore, we present four different multiplication designs based on our proposed recoded and nonrecoded TSD adders. Our multiplication designs require a small number of reduced minterms to generate the multiplication partial products. Finally, a recently proposed pipelined iterative-tree algorithm can be used in the TSD adders-multipliers; consequently, efficient use of all available adders can be made.
Data-Acquisition System With Remotely Adjustable Amplifiers
NASA Technical Reports Server (NTRS)
Nurge, Mark A.; Larson, William E.; Hallberg, Carl G.; Thayer, Steven W.; Ake, Jeffrey C.; Gleman, Stuart M.; Thompson, David L.; Medelius, Pedro J.; Crawford, Wayne A.; Vangilder, Richard M.;
1994-01-01
An improved data-acquisition system with both centralized and decentralized characteristics has been developed. Provides infrastructure for automation and standardization of operation, maintenance, calibration, and adjustment of many transducers. Increases efficiency by reducing need for diminishing work force of highly trained technicians to perform routine tasks. Large industrial and academic laboratory facilities benefit from systems like this one.
Jung, Yen-Sook; Hwang, Kyeongil; Heo, Youn-Jung; Kim, Jueng-Eun; Lee, Donmin; Lee, Cheol-Ho; Joh, Han-Ik; Yeo, Jun-Seok; Kim, Dong-Yu
2017-08-23
Despite the potential of roll-to-roll processing for the fabrication of perovskite films, the realization of highly efficient and reproducible perovskite solar cells (PeSCs) through continuous coating techniques and low-temperature processing is still challenging. Here, we demonstrate that efficient and reliable CH3NH3PbI3 (MAPbI3) films fabricated by a printing process can be achieved through synergetic effects of binary processing additives, N-cyclohexyl-2-pyrrolidone (CHP) and dimethyl sulfoxide (DMSO). Notably, these perovskite films are deposited from premixed perovskite solutions for facile one-step processing under a room-temperature and ambient atmosphere. The CHP molecules result in the uniform and homogeneous perovskite films even in the one-step slot-die system, which originate from the high boiling point and low vapor pressure of CHP. Meanwhile, the DMSO molecules facilitate the growth of perovskite grains by forming intermediate states with the perovskite precursor molecules. Consequently, fully printed PeSC based on the binary additive system exhibits a high PCE of 12.56% with a high reproducibility.
An economic evaluation of colorectal cancer screening in primary care practice.
Meenan, Richard T; Anderson, Melissa L; Chubak, Jessica; Vernon, Sally W; Fuller, Sharon; Wang, Ching-Yun; Green, Beverly B
2015-06-01
Recent colorectal cancer screening studies focus on optimizing adherence. This study evaluated the cost effectiveness of interventions using electronic health records (EHRs); automated mailings; and stepped support increases to improve 2-year colorectal cancer screening adherence. Analyses were based on a parallel-design, randomized trial in which three stepped interventions (EHR-linked mailings ["automated"]; automated plus telephone assistance ["assisted"]; or automated and assisted plus nurse navigation to testing completion or refusal ["navigated"]) were compared to usual care. Data were from August 2008 to November 2011, with analyses performed during 2012-2013. Implementation resources were micro-costed; research and registry development costs were excluded. Incremental cost-effectiveness ratios (ICERs) were based on number of participants current for screening per guidelines over 2 years. Bootstrapping examined robustness of results. Intervention delivery cost per participant current for screening ranged from $21 (automated) to $27 (navigated). Inclusion of induced testing costs (e.g., screening colonoscopy) lowered expenditures for automated (ICER=-$159) and assisted (ICER=-$36) relative to usual care over 2 years. Savings arose from increased fecal occult blood testing, substituting for more expensive colonoscopies in usual care. Results were broadly consistent across demographic subgroups. More intensive interventions were consistently likely to be cost effective relative to less intensive interventions, with willingness-to-pay values of $600-$1,200 for an additional person current for screening yielding ≥80% probability of cost effectiveness. Two-year cost effectiveness of a stepped approach to colorectal cancer screening promotion based on EHR data is indicated, but longer-term cost effectiveness requires further study. Copyright © 2015 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.
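For reference, the incremental cost-effectiveness ratio is simply the incremental cost divided by the incremental effect. The tiny sketch below uses invented numbers (not the trial's data) to show how an intervention that lowers cost while increasing screening yields a negative ICER like those reported:

```python
def icer(cost_new, cost_ref, effect_new, effect_ref):
    """Incremental cost-effectiveness ratio: extra cost per additional
    participant current for screening."""
    return (cost_new - cost_ref) / (effect_new - effect_ref)

# Illustrative per-participant totals (including induced testing costs) and
# proportions current for screening; these are not the study's figures.
print(icer(cost_new=180.0, cost_ref=200.0, effect_new=0.65, effect_ref=0.50))
# -> about -133: the intervention both costs less and screens more ("dominant"),
#    which is how a negative ICER such as the abstract's -$159 arises.
```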
Solution of elliptic PDEs by fast Poisson solvers using a local relaxation factor
NASA Technical Reports Server (NTRS)
Chang, Sin-Chung
1986-01-01
A large class of two- and three-dimensional, nonseparable elliptic partial differential equations (PDEs) is presently solved by means of novel one-step (D'Yakanov-Gunn) and two-step (accelerated one-step) iterative procedures, using a local, discrete Fourier analysis. In addition to being easily implemented and applicable to a variety of boundary conditions, these procedures are found to be computationally efficient on the basis of numerical comparison with other established methods, which lack the present procedures' (1) insensitivity to grid cell size and aspect ratio and (2) ease of convergence-rate estimation from the coefficients of the PDE being solved. The two-step procedure is numerically demonstrated to outperform the one-step procedure in the case of PDEs with variable coefficients.
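To make the structure of such one-step iterations concrete, here is a minimal sketch of the general idea only: an outer iteration whose inner step is a fast constant-coefficient Poisson solve, with a pointwise relaxation factor derived from the local PDE coefficient. It is applied to the model problem div(a grad u) = f with zero Dirichlet data and is not Chang's algorithm; the coefficient, grid, and relaxation choice are assumptions for illustration.

```python
import numpy as np
from scipy.fft import dstn, idstn

def laplacian(u, h):
    """5-point Laplacian of u on an interior grid with zero Dirichlet boundaries."""
    p = np.pad(u, 1)
    return (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:] - 4.0 * u) / h ** 2

def div_a_grad(u, a, h):
    """Variable-coefficient operator div(a grad u), zero Dirichlet boundaries."""
    p = np.pad(u, 1)
    A = np.pad(a, 1, mode="edge")
    a_e = 0.5 * (A[1:-1, 1:-1] + A[1:-1, 2:])
    a_w = 0.5 * (A[1:-1, 1:-1] + A[1:-1, :-2])
    a_n = 0.5 * (A[1:-1, 1:-1] + A[2:, 1:-1])
    a_s = 0.5 * (A[1:-1, 1:-1] + A[:-2, 1:-1])
    return (a_e * (p[1:-1, 2:] - u) - a_w * (u - p[1:-1, :-2])
            + a_n * (p[2:, 1:-1] - u) - a_s * (u - p[:-2, 1:-1])) / h ** 2

def fast_poisson_solve(rhs, h):
    """Solve Laplacian(v) = rhs (zero Dirichlet) via a type-I discrete sine transform."""
    n = rhs.shape[0]
    lam = (2.0 * np.cos(np.pi * np.arange(1, n + 1) / (n + 1)) - 2.0) / h ** 2
    return idstn(dstn(rhs, type=1) / (lam[:, None] + lam[None, :]), type=1)

def one_step_iteration(a, f, h, omega=1.0, tol=1e-8, max_iter=500):
    """Outer sweep: Laplacian(u_new) = Laplacian(u) + (omega / a) * (f - div(a grad u)).
    Each sweep costs one fast Poisson solve; omega / a acts as a local relaxation factor."""
    u = np.zeros_like(f)
    for _ in range(max_iter):
        res = f - div_a_grad(u, a, h)
        if np.max(np.abs(res)) < tol:
            break
        u = fast_poisson_solve(laplacian(u, h) + (omega / a) * res, h)
    return u

n = 63
h = 1.0 / (n + 1)
x = np.arange(1, n + 1)[:, None] * h
y = np.arange(1, n + 1)[None, :] * h
a = 1.0 + 0.5 * x * y                      # smoothly varying, nonseparable coefficient (assumed)
f = np.ones((n, n))
u = one_step_iteration(a, f, h)
print("max residual:", np.max(np.abs(f - div_a_grad(u, a, h))))
```

For constant a the sweep converges in a single fast Poisson solve, which is the sanity check the hedged sketch is built around.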
Madry, Milena M; Kraemer, Thomas; Baumgartner, Markus R
2018-01-01
Hair analysis has been established as a prevalent tool for retrospective drug monitoring. In this study, different extraction solvents for the determination of drugs of abuse and pharmaceuticals in hair were evaluated for their efficiency. A pool of authentic hair from drug users was used for extraction experiments. Hair was pulverized and extracted in triplicate with seven different solvents in a one- or two-step extraction. Three one-step (methanol, acetonitrile, and acetonitrile/water) and four two-step extractions (methanol two-fold, methanol and methanol/acetonitrile/formate buffer, methanol and methanol/formate buffer, and methanol and methanol/hydrochloric acid) were tested under identical experimental conditions. The extracts were directly analyzed by liquid chromatography-tandem mass spectrometry for opiates/opioids, stimulants, ketamine, selected benzodiazepines, antidepressants, antipsychotics, and antihistamines using deuterated internal standards. For most analytes, a two-step extraction with methanol did not significantly improve the yield compared to a one-step extraction with methanol. Extraction with acetonitrile alone was least efficient for most analytes. Extraction yields of acetonitrile/water, methanol and methanol/acetonitrile/formate buffer, and methanol and methanol/formate buffer were significantly higher compared to methanol. Highest efficiencies were obtained by a two-step extraction with methanol and methanol/hydrochloric acid, particularly for morphine, 6-monoacetylmorphine, codeine, 6-acetylcodeine, MDMA, zopiclone, zolpidem, amitriptyline, nortriptyline, citalopram, and doxylamine. For some analytes (e.g., tramadol, fluoxetine, sertraline), all extraction solvents, except for acetonitrile, were comparably efficient. There was no significant correlation between extraction efficiency with an acidic solvent and the pKa or log P of the analyte. However, there was a significant trend relating the extraction efficiency with acetonitrile to the log P of the analyte. The study demonstrates that the choice of extraction solvent has a strong impact on hair analysis outcomes. Therefore, validation protocols should include the evaluation of extraction efficiency of drugs by using authentic rather than spiked hair. Different extraction procedures may contribute to the scatter of quantitative results in inter-laboratory comparisons. Harmonization of extraction protocols is recommended when interpretation is based on the same cut-off levels. Copyright © 2017 Elsevier B.V. All rights reserved.
Janiszewski, J; Schneider, P; Hoffmaster, K; Swyden, M; Wells, D; Fouda, H
1997-01-01
The development and application of membrane solid phase extraction (SPE) in 96-well microtiter plate format is described for the automated analysis of drugs in biological fluids. The small bed volume of the membrane allows elution of the analyte in a very small solvent volume, permitting direct HPLC injection and negating the need for the time consuming solvent evaporation step. A programmable liquid handling station (Quadra 96) was modified to automate all SPE steps. To avoid drying of the SPE bed and to enhance the analytical precision a novel protocol for performing the condition, load and wash steps in rapid succession was utilized. A block of 96 samples can now be extracted in 10 min., about 30 times faster than manual solvent extraction or single cartridge SPE methods. This processing speed complements the high-throughput speed of contemporary high performance liquid chromatography mass spectrometry (HPLC/MS) analysis. The quantitative analysis of a test analyte (Ziprasidone) in plasma demonstrates the utility and throughput of membrane SPE in combination with HPLC/MS. The results obtained with the current automated procedure compare favorably with those obtained using solvent and traditional solid phase extraction methods. The method has been used for the analysis of numerous drug prototypes in biological fluids to support drug discovery efforts.
Step-by-step magic state encoding for efficient fault-tolerant quantum computation
Goto, Hayato
2014-01-01
Quantum error correction allows one to make quantum computers fault-tolerant against unavoidable errors due to decoherence and imperfect physical gate operations. However, the fault-tolerant quantum computation requires impractically large computational resources for useful applications. This is a current major obstacle to the realization of a quantum computer. In particular, magic state distillation, which is a standard approach to universality, consumes the most resources in fault-tolerant quantum computation. For the resource problem, here we propose step-by-step magic state encoding for concatenated quantum codes, where magic states are encoded step by step from the physical level to the logical one. To manage errors during the encoding, we carefully use error detection. Since the sizes of intermediate codes are small, it is expected that the resource overheads will become lower than previous approaches based on the distillation at the logical level. Our simulation results suggest that the resource requirements for a logical magic state will become comparable to those for a single logical controlled-NOT gate. Thus, the present method opens a new possibility for efficient fault-tolerant quantum computation. PMID:25511387
Holographic optical assembly and photopolymerized joining of planar microspheres
Shaw, L. A.; Chizari, S.; Panas, R. M.; ...
2016-07-27
The aim of this research is to demonstrate a holographically driven photopolymerization process for joining colloidal particles to create planar microstructures fixed to a substrate, which can be monitored with real-time measurement. Holographic optical tweezers (HOT) have been used to arrange arrays of microparticles prior to this work; here we introduce a new photopolymerization process for rapidly joining simultaneously handled microspheres in a plane. Additionally, we demonstrate a new process control technique for efficiently identifying when particles have been successfully joined by measuring a sufficient reduction in the particles' Brownian motion. Furthermore, this technique and our demonstrated joining approach enable HOT technology to take critical steps toward automated additive fabrication of microstructures.
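The joining check described above amounts to watching a tracked particle's positional variance collapse once it is fixed in place. A minimal sketch of such a test is shown below; the window length, threshold, and synthetic tracks are illustrative assumptions, not the authors' parameters.

```python
import numpy as np

def is_joined(x, y, window=100, var_threshold=1e-3):
    """Return True if the particle's recent positional variance (um^2) has dropped
    below the threshold, i.e. it is no longer diffusing freely."""
    if len(x) < window:
        return False
    recent_var = np.var(x[-window:]) + np.var(y[-window:])
    return recent_var < var_threshold

# Synthetic tracks: free Brownian motion, then jitter around a fixed (joined) position.
rng = np.random.default_rng(0)
free = np.cumsum(rng.normal(0, 0.05, size=(2, 500)), axis=1)   # random walk (um)
fixed = free[:, -1:] + rng.normal(0, 0.005, size=(2, 500))     # small jitter once joined
track_x = np.concatenate([free[0], fixed[0]])
track_y = np.concatenate([free[1], fixed[1]])
print(is_joined(track_x[:500], track_y[:500]), is_joined(track_x, track_y))
```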
Jipp, Meike
2012-12-01
The extent to which individual differences in fine motor abilities affect indoor safety and efficiency of human-wheelchair systems was examined. To reduce the currently large number of indoor wheelchair accidents, assistance systems with a high level of automation were developed. It was proposed to adapt the wheelchair's level of automation to the user's ability to steer the device to avoid drawbacks of highly automated wheelchairs. The state of the art, however, lacks an empirical identification of those abilities. A study with 23 participants is described. The participants drove through various sections of a course with a powered wheelchair. Repeatedly measured criteria were safety (numbers of collisions) and efficiency (times required for reaching goals). As covariates, the participants' fine motor abilities were assessed. A random coefficient modeling approach was used to analyze the data, which were available on two levels, as course sections were nested within participants. The participants' aiming, precision, and arm-hand speed contributed significantly to both criteria: participants with lower fine motor abilities had more collisions and required more time for reaching goals. Adapting the wheelchair's level of automation to these fine motor abilities can improve indoor safety and efficiency. In addition, the results highlight the need to further examine the impact of individual differences on the design of automation features for powered wheelchairs as well as other applications of automation. The results facilitate the improvement of current wheelchair technology.
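A random coefficient (mixed-effects) analysis of this nested design can be sketched as follows; the data, variable names, and model formula are made up for illustration and do not reproduce the study's actual specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
rows = []
for pid in range(23):                                  # 23 participants, as in the study design
    aiming = rng.normal(0.0, 1.0)                      # participant-level fine motor score
    baseline = rng.normal(0.0, 0.2)                    # participant-specific baseline
    for section in range(8):                           # course sections nested within participants
        # Illustrative data-generating assumption: lower ability -> more collisions.
        collisions = rng.poisson(np.exp(0.3 + baseline - 0.5 * aiming))
        rows.append({"participant": pid, "section": section,
                     "aiming": aiming, "collisions": collisions})
df = pd.DataFrame(rows)

# Two-level model: repeated course sections (level 1) nested within participants (level 2),
# fitted here with a random intercept per participant.
model = smf.mixedlm("collisions ~ aiming", df, groups=df["participant"])
print(model.fit().summary())
```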
Liu, Dong; Wu, Lili; Li, Chunxiu; Ren, Shengqiang; Zhang, Jingquan; Li, Wei; Feng, Lianghuan
2015-08-05
The methylammonium lead halide perovskite solar cells have become very attractive because they can be prepared with low-cost solution-processable technology and their power conversion efficiency has increased from 3.9% to 20% in recent years. However, the high performance of perovskite photovoltaic devices depends on a complicated process to prepare compact perovskite films with large grain size. Herein, a new method is developed to achieve excellent CH3NH3PbI3-xClx films with fine morphology and crystallization based on one-step deposition and a two-step annealing process. This method includes spin-coating deposition of the perovskite films from a precursor solution of PbI2, PbCl2, and CH3NH3I at a molar ratio of 1:1:4 in dimethylformamide (DMF) and a post two-step annealing (TSA). The first annealing is achieved by a solvent-induced process in DMF to promote migration and interdiffusion of the solvent-assisted precursor ions and molecules and realize large grain growth. The second annealing is conducted by a thermal-induced process to further improve the morphology and crystallization of the films. The compact perovskite films are successfully prepared with grain size up to 1.1 μm according to SEM observation. The PL decay lifetime and the optical energy gap for the film with two-step annealing are 460 ns and 1.575 eV, respectively, while they are 307 and 327 ns and 1.577 and 1.582 eV for the films annealed by the one-step thermal and one-step solvent processes. On the basis of the TSA process, the photovoltaic devices exhibit a best efficiency of 14% under AM 1.5G irradiation (100 mW·cm(-2)).
Tractable policy management framework for IoT
NASA Astrophysics Data System (ADS)
Goynugur, Emre; de Mel, Geeth; Sensoy, Murat; Calo, Seraphin
2017-05-01
Due to advances in technology, connected devices (henceforth referred to as IoT) that support automating the functionality of many domains, be it intelligent manufacturing or smart homes, have become a reality. However, with the proliferation of such connected and interconnected devices, efficiently and effectively managing networks manually becomes an impractical, if not impossible, task. This is because devices have their own obligations and prohibitions in context, and humans are not equipped to maintain a bird's-eye view of the state. Traditionally, policies are used to address the issue, but in the IoT arena one requires a policy framework in which the language provides sufficient expressiveness along with efficient reasoning procedures to automate the management. In this work we present our initial work on creating a scalable knowledge-based policy framework for IoT and demonstrate its applicability through a smart home application.
The LabTube - a novel microfluidic platform for assay automation in laboratory centrifuges.
Kloke, A; Fiebach, A R; Zhang, S; Drechsel, L; Niekrawietz, S; Hoehl, M M; Kneusel, R; Panthel, K; Steigert, J; von Stetten, F; Zengerle, R; Paust, N
2014-05-07
Assay automation is the key for successful transformation of modern biotechnology into routine workflows. Yet, it requires considerable investment in processing devices and auxiliary infrastructure, which is not cost-efficient for laboratories with low or medium sample throughput or point-of-care testing. To close this gap, we present the LabTube platform, which is based on assay specific disposable cartridges for processing in laboratory centrifuges. LabTube cartridges comprise interfaces for sample loading and downstream applications and fluidic unit operations for release of prestored reagents, mixing, and solid phase extraction. Process control is achieved by a centrifugally-actuated ballpen mechanism. To demonstrate the workflow and functionality of the LabTube platform, we show two LabTube automated sample preparation assays from laboratory routines: DNA extractions from whole blood and purification of His-tagged proteins. Equal DNA and protein yields were observed compared to manual reference runs, while LabTube automation could significantly reduce the hands-on-time to one minute per extraction.
Automation of Cassini Support Imaging Uplink Command Development
NASA Technical Reports Server (NTRS)
Ly-Hollins, Lisa; Breneman, Herbert H.; Brooks, Robert
2010-01-01
"Support imaging" is imagery requested by other Cassini science teams to aid in the interpretation of their data. The generation of the spacecraft command sequences for these images is performed by the Cassini Instrument Operations Team. The process initially established for doing this was very labor-intensive, tedious and prone to human error. Team management recognized this process as one that could easily benefit from automation. Team members were tasked to document the existing manual process, develop a plan and strategy to automate the process, implement the plan and strategy, test and validate the new automated process, and deliver the new software tools and documentation to Flight Operations for use during the Cassini extended mission. In addition to the goals of higher efficiency and lower risk in the processing of support imaging requests, an effort was made to maximize adaptability of the process to accommodate uplink procedure changes and the potential addition of new capabilities outside the scope of the initial effort.
Enhanced mobility for aging populations using automated vehicles : [summary].
DOT National Transportation Integrated Search
2016-01-01
Studies show that aging adults have travel needs that can be inadequately addressed by today's transportation system. Automated vehicles (AVs), ranging from assistive technologies to full automation, may offer a safe and efficient transportatio...
Coutts/Sweetgrass automated border crossing : phase I
DOT National Transportation Integrated Search
1999-03-01
The Coutts/Sweetgrass Automated Border Crossing Project was intended to improve operational efficiency of this rural border crossing facility using ITS applications. Phase I of the Coutts/Sweetgrass Automated Border Crossing Project was intended to r...
Fully Automated Sample Preparation for Ultrafast N-Glycosylation Analysis of Antibody Therapeutics.
Szigeti, Marton; Lew, Clarence; Roby, Keith; Guttman, Andras
2016-04-01
There is a growing demand in the biopharmaceutical industry for high-throughput, large-scale N-glycosylation profiling of therapeutic antibodies in all phases of product development, but especially during clone selection when hundreds of samples should be analyzed in a short period of time to assure their glycosylation-based biological activity. Our group has recently developed a magnetic bead-based protocol for N-glycosylation analysis of glycoproteins to alleviate the hard-to-automate centrifugation and vacuum-centrifugation steps of the currently used protocols. Glycan release, fluorophore labeling, and cleanup were all optimized, resulting in a <4 h magnetic bead-based process with excellent yield and good repeatability. This article demonstrates the next level of this work by automating all steps of the optimized magnetic bead-based protocol from endoglycosidase digestion, through fluorophore labeling and cleanup with high-throughput sample processing in 96-well plate format, using an automated laboratory workstation. Capillary electrophoresis analysis of the fluorophore-labeled glycans was also optimized for rapid (<3 min) separation to accommodate the high-throughput processing of the automated sample preparation workflow. Ultrafast N-glycosylation analyses of several commercially relevant antibody therapeutics are also shown and compared to their biosimilar counterparts, addressing the biological significance of the differences. © 2015 Society for Laboratory Automation and Screening.
NASA Astrophysics Data System (ADS)
Moreno, R.; Bazán, A. M.
2017-10-01
The main purpose of this work is to study improvements to the learning method of technical drawing and descriptive geometry through exercises with traditional techniques, usually solved manually, by applying automated processes assisted by high-level CAD templates (HLCts). Given that an exercise can be solved step by step with traditional procedures, as detailed in technical drawing and descriptive geometry manuals, CAD applications allow us to do the same and later generalize it by incorporating references. Traditional teaching methods have become obsolete and have been relegated in current curricula; however, they can be applied in certain automation processes. The use of geometric references (using variables in script languages) and their incorporation into HLCts allows the automation of drawing processes. Instead of repeatedly creating similar exercises or modifying data in the same exercises, users can employ HLCts to generate future modifications of these exercises. This paper introduces the automation process for generating exercises based on CAD script files, aided by parametric geometry calculation tools. The proposed method allows us to design new exercises without user intervention. The integration of CAD, mathematics, and descriptive geometry facilitates their joint learning. Automation in the generation of exercises not only saves time but also increases the quality of the statements and reduces the possibility of human error.
SpcAudace: Spectroscopic processing and analysis package of Audela software
NASA Astrophysics Data System (ADS)
Mauclaire, Benjamin
2017-11-01
SpcAudace processes long-slit spectra with automated pipelines and performs astrophysical analysis of the resulting data. These powerful pipelines do all the required steps in one pass: standard preprocessing, masking of bad pixels, geometric corrections, registration, optimized spectrum extraction, wavelength calibration, and instrumental response computation and correction. Both high- and low-resolution long-slit spectra are managed for stellar and non-stellar targets. Many types of publication-quality figures can be easily produced: pdf and png plots or annotated time series plots. Astrophysical quantities can be derived from individual spectra or from large numbers of spectra with advanced functions: from line profile characteristics to equivalent width and periodogram. More than 300 documented functions are available and can be used in TCL scripts for automation. SpcAudace is based on the Audela open source software.
Sequeira, Ana Filipa; Brás, Joana L A; Guerreiro, Catarina I P D; Vincentelli, Renaud; Fontes, Carlos M G A
2016-12-01
Gene synthesis is becoming an important tool in many fields of recombinant DNA technology, including recombinant protein production. De novo gene synthesis is quickly replacing the classical cloning and mutagenesis procedures and allows generating nucleic acids for which no template is available. In addition, when coupled with efficient gene design algorithms that optimize codon usage, it leads to high levels of recombinant protein expression. Here, we describe the development of an optimized gene synthesis platform that was applied to the large-scale production of small genes encoding venom peptides. This improved gene synthesis method uses a PCR-based protocol to assemble synthetic DNA from pools of overlapping oligonucleotides and was developed to synthesise multiple genes simultaneously. This technology incorporates an accurate, automated and cost-effective ligation-independent cloning step to directly integrate the synthetic genes into an effective Escherichia coli expression vector. The robustness of this technology to generate large libraries of dozens to thousands of synthetic nucleic acids was demonstrated through the parallel and simultaneous synthesis of 96 genes encoding animal toxins. An automated platform was developed for the large-scale synthesis of small genes encoding eukaryotic toxins. Large-scale recombinant expression of synthetic genes encoding eukaryotic toxins will allow exploring the extraordinary potency and pharmacological diversity of animal venoms, an increasingly valuable but unexplored source of lead molecules for drug discovery.
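The oligo-design step that feeds such a PCR-based assembly can be sketched in a few lines; the oligo length, overlap, and example sequence below are illustrative assumptions, not the platform's actual parameters.

```python
def revcomp(seq):
    """Reverse complement of an upper-case DNA sequence."""
    return seq.translate(str.maketrans("ACGT", "TGCA"))[::-1]

def design_assembly_oligos(gene, oligo_len=60, overlap=20):
    """Tile the target gene with oligos of oligo_len that overlap by `overlap` bases,
    alternating sense and antisense strands for PCR-based assembly."""
    step = oligo_len - overlap
    oligos = []
    for i, start in enumerate(range(0, len(gene) - overlap, step)):
        piece = gene[start:start + oligo_len]
        oligos.append(piece if i % 2 == 0 else revcomp(piece))
    return oligos

gene = ("ATGAAAGCGTTAACGGCCAGGCAACTTGGGCTAGCTAGCTAGGTACCTAGGATCCAAAT"
        "GGCCATTGCAATCGATCGATCGTACGTAGCTAGCATCGATCGATCGATTGCATGCTGA")
for oligo in design_assembly_oligos(gene):
    print(len(oligo), oligo)
```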
de Brouwer, Hans; Stegeman, Gerrit
2011-02-01
To maximize utilization of expensive laboratory instruments and to make most effective use of skilled human resources, the entire chain of data processing, calculation, and reporting that is needed to transform raw NMR data into meaningful results was automated. The LEAN process improvement tools were used to identify non-value-added steps in the existing process. These steps were eliminated using an in-house developed software package, which allowed us to meet the key requirement of improving quality and reliability compared with the existing process while freeing up valuable human resources and increasing productivity. Reliability and quality were improved by the consistent data treatment performed by the software and the uniform administration of results. Automating a single NMR spectrometer led to a reduction in operator time of 35%, doubling of the annual sample throughput from 1400 to 2800, and reduction of the turnaround time from 6 days to less than 2. Copyright © 2011 Society for Laboratory Automation and Screening. Published by Elsevier Inc. All rights reserved.
NASA Systems Autonomy Demonstration Program - A step toward Space Station automation
NASA Technical Reports Server (NTRS)
Starks, S. A.; Rundus, D.; Erickson, W. K.; Healey, K. J.
1987-01-01
This paper addresses a multiyear NASA program, the Systems Autonomy Demonstration Program (SADP), whose main objectives include the development, integration, and demonstration of automation technology in Space Station flight and ground support systems. The role of automation in the Space Station is reviewed, and the main players in SADP and their roles are described. The core research and technology being promoted by SADP are discussed, and a planned 1988 milestone demonstration of the automated monitoring, operation, and control of a complete mission operations subsystem is addressed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ren, X; Gao, H; Sharp, G
Purpose: Accurate image segmentation is a crucial step during image guided radiation therapy. This work proposes a multi-atlas machine learning (MAML) algorithm for automated segmentation of head-and-neck CT images. Methods: As the first step, the algorithm utilizes normalized mutual information as similarity metric, affine registration combined with multiresolution B-Spline registration, and then fuses together using the label fusion strategy via Plastimatch. As the second step, the following feature selection strategy is proposed to extract five feature components from reference or atlas images: intensity (I), distance map (D), box (B), center of gravity (C) and stable point (S). The box feature B is novel. It describes a relative position from each point to the minimum inscribed rectangle of the ROI. The center-of-gravity feature C is the 3D Euclidean distance from a sample point to the ROI center of gravity, and S is the distance of the sample point to the landmarks. Then, we adopt random forest (RF) in Scikit-learn, a Python module integrating a wide range of state-of-the-art machine learning algorithms, as classifier. Different feature and atlas strategies are used for different ROIs for improved performance, such as a multi-atlas strategy with reference box for brainstem, and a single-atlas strategy with reference landmark for optic chiasm. Results: The algorithm was validated on a set of 33 CT images with manual contours using a leave-one-out cross-validation strategy. Dice similarity coefficients between manual contours and automated contours were calculated: the proposed MAML method had an improvement from 0.79 to 0.83 for brainstem and 0.11 to 0.52 for optic chiasm with respect to the multi-atlas segmentation method (MA). Conclusion: A MAML method has been proposed for automated segmentation of head-and-neck CT images with improved performance. It provides a comparable result in brainstem and an improved result in optic chiasm compared with MA. Xuhua Ren and Hao Gao were partially supported by the NSFC (#11405105), the 973 Program (#2015CB856000), and the Shanghai Pujiang Talent Program (#14PJ1404500).
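The second step (per-voxel features fed to a scikit-learn random forest) can be sketched as below. The features are simplified stand-ins for the I/D/C features named above (the box and stable-point features are omitted), the images are synthetic 2-D toys, and the Dice computation mirrors the validation metric; none of this reproduces the authors' implementation.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt, center_of_mass
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)

def make_case(n=48):
    """Synthetic 2-D 'CT slice' with a bright circular ROI as ground truth."""
    yy, xx = np.mgrid[:n, :n]
    cx, cy, r = rng.uniform(15, 33, 3) * [1, 1, 0.3]
    label = (((xx - cx) ** 2 + (yy - cy) ** 2) < (5 + r) ** 2).astype(int)
    image = label * 1.0 + rng.normal(0, 0.3, (n, n))
    return image, label

def features(image, atlas_label):
    """Per-pixel features: intensity (I), distance map to the atlas ROI (D),
    and distance to the atlas ROI centre of gravity (C)."""
    d_map = distance_transform_edt(~atlas_label.astype(bool))
    cog = center_of_mass(atlas_label)
    yy, xx = np.indices(image.shape)
    d_cog = np.hypot(yy - cog[0], xx - cog[1])
    return np.stack([image, d_map, d_cog], axis=-1).reshape(-1, 3)

atlas_img, atlas_lab = make_case()          # one "atlas" providing spatial priors
train_img, train_lab = make_case()          # training case with known contour
test_img, test_lab = make_case()            # held-out case

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(features(train_img, atlas_lab), train_lab.ravel())
pred = clf.predict(features(test_img, atlas_lab)).reshape(test_lab.shape)
dice = 2 * (pred & test_lab).sum() / (pred.sum() + test_lab.sum())
print(f"Dice similarity coefficient: {dice:.2f}")
```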
Rashed-Ul Islam, S M; Jahan, Munira; Tabassum, Shahina
2015-01-01
Virological monitoring is the best predictor for the management of chronic hepatitis B virus (HBV) infections. Consequently, it is important to use the most efficient, rapid and cost-effective testing systems for HBV DNA quantification. The present study compared the performance characteristics of a one-step HBV polymerase chain reaction (PCR) vs the two-step HBV PCR method for quantification of HBV DNA from clinical samples. A total of 100 samples consisting of 85 randomly selected samples from patients with chronic hepatitis B (CHB) and 15 samples from apparently healthy individuals were enrolled in this study. Of the 85 CHB clinical samples tested, HBV DNA was detected in 81% of samples by the one-step PCR method with a median HBV DNA viral load (VL) of 7.50 × 10³ IU/ml. In contrast, 72% of samples were detected by the two-step PCR system with a median HBV DNA of 3.71 × 10³ IU/ml. The one-step method showed strong linear correlation with the two-step PCR method (r = 0.89; p < 0.0001). Both methods showed good agreement in a Bland-Altman plot, with a mean difference of 0.61 log10 IU/ml and limits of agreement of -1.82 to 3.03 log10 IU/ml. The intra-assay and inter-assay coefficients of variation (CV%) of plasma samples (4-7 log10 IU/ml) for the one-step PCR method ranged from 0.33 to 0.59 and from 0.28 to 0.48, respectively, thus demonstrating a high level of concordance between the two methods. Moreover, elimination of the DNA extraction step in the one-step PCR kit allowed time-efficient and significant labor and cost savings for the quantification of HBV DNA in a resource-limited setting. Rashed-Ul Islam SM, Jahan M, Tabassum S. Evaluation of a Rapid One-step Real-time PCR Method as a High-throughput Screening for Quantification of Hepatitis B Virus DNA in a Resource-limited Setting. Euroasian J Hepato-Gastroenterol 2015;5(1):11-15.
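For context, the agreement statistics quoted above (mean difference and limits of agreement on log10 viral loads) come from a short Bland-Altman computation; the sketch below uses synthetic paired measurements, not the study's data.

```python
import numpy as np

def bland_altman(log_vl_a, log_vl_b):
    """Mean difference and 95% limits of agreement between two log10 viral-load series."""
    diff = np.asarray(log_vl_a) - np.asarray(log_vl_b)
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return bias, bias - half_width, bias + half_width

rng = np.random.default_rng(3)
truth = rng.uniform(3, 7, 60)                        # log10 IU/ml
one_step = truth + rng.normal(0.3, 0.6, 60)          # synthetic one-step PCR measurements
two_step = truth + rng.normal(-0.3, 0.6, 60)         # synthetic two-step PCR measurements
bias, lower, upper = bland_altman(one_step, two_step)
print(f"mean difference {bias:.2f} log10 IU/ml, limits of agreement {lower:.2f} to {upper:.2f}")
```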
Software support in automation of medicinal product evaluations.
Juric, Radmila; Shojanoori, Reza; Slevin, Lindi; Williams, Stephen
2005-01-01
Medicinal product evaluation is one of the most important tasks undertaken by government health departments and their regulatory authorities, in every country in the world. The automation and adequate software support are critical tasks that can improve the efficiency and interoperation of regulatory systems across the world. In this paper we propose a software solution that supports the automation of the (i) submission of licensing applications, and (ii) evaluations of submitted licensing applications, according to regulatory authorities' procedures. The novelty of our solution is in allowing licensing applications to be submitted in any country in the world and evaluated according to any evaluation procedure (which can be chosen by either regulatory authorities or pharmaceutical companies). Consequently, submission and evaluation procedures become interoperable and the associated data repositories/databases can be shared between various countries and regulatory authorities.
Automating Mapping Production for the Enterprise: from Contract to Delivery
NASA Astrophysics Data System (ADS)
Uebbing, R.; Xie, C.; Beshah, B.; Welter, J.
2012-07-01
The ever increasing volume and quality of geospatial data has created new challenges for mapping companies. Due to increased image resolution, fusion of different data sources and more frequent data update requirements, mapping production is forced to streamline the work flow to meet client deadlines. But the data volume alone is not the only barrier for an efficient production work flow. Processing geospatial information traditionally uses domain and vendor specific applications that do not interface with each other, often leading to data duplication and therefore creating sources for error. Also, it creates isolation between different departments within a mapping company resulting in additional communication barriers. North West Geomatics has designed and implemented a data centric enterprise solution for the flight acquisition and production work flow to combat the above challenges. A central data repository containing not only geospatial data in the strictest sense such as images, vector layers and 3D point clouds, but also other information such as product specifications, client requirements, flight acquisition data, production resource usage and much more has been deployed at the company. As there is only one instance of the database shared throughout the whole organization it allows all employees, given they have been granted the appropriate permission, to view the current status of any project with a graphical and table based interface through its life cycle from sales, through flight acquisition, production and product delivery. Not only can users track progress and status of various work flow steps, but the system also allows users and applications to actively schedule or start specific production steps such as data ingestion and triangulation with many other steps (orthorectification, mosaicing, accounting, etc.) in the planning stages. While the complete system is exposed to the users through a web interface and therefore allowing outside customers to also view their data, much of the design and development was focused on work flow automation, scalability and security. Ideally, users will interact with the system to retrieve a specific project status and summaries while the work flow processes are triggered automatically by modeling their dependencies. The enterprise system is built using open source technologies (PostGIS, Hibernate, OpenLayers, GWT and others) and adheres to OGC web services for data delivery (WMS/WFS/WCS) to third party applications.
Improving the Operations of the Earth Observing One Mission via Automated Mission Planning
NASA Technical Reports Server (NTRS)
Chien, Steve A.; Tran, Daniel; Rabideau, Gregg; Schaffer, Steve; Mandl, Daniel; Frye, Stuart
2010-01-01
We describe the modeling and reasoning about operations constraints in an automated mission planning system for an earth observing satellite - EO-1. We first discuss the large number of elements that can be naturally represented in an expressive planning and scheduling framework. We then describe a number of constraints that challenge the current state of the art in automated planning systems and discuss how we modeled these constraints as well as discuss tradeoffs in representation versus efficiency. Finally we describe the challenges in efficiently generating operations plans for this mission. These discussions involve lessons learned from an operations model that has been in use since Fall 2004 (called R4) as well as a newer more accurate operations model operational since June 2009 (called R5). We present analysis of the R5 software documenting a significant (greater than 50%) increase in the number of weekly observations scheduled by the EO-1 mission. We also show that the R5 mission planning system produces schedules within 15% of an upper bound on optimal schedules. This operational enhancement has created value of millions of dollars US over the projected remaining lifetime of the EO-1 mission.
Workload Capacity: A Response Time-Based Measure of Automation Dependence.
Yamani, Yusuke; McCarley, Jason S
2016-05-01
An experiment used the workload capacity measure C(t) to quantify the processing efficiency of human-automation teams and identify operators' automation usage strategies in a speeded decision task. Although response accuracy rates and related measures are often used to measure the influence of an automated decision aid on human performance, aids can also influence response speed. Mean response times (RTs), however, conflate the influence of the human operator and the automated aid on team performance and may mask changes in the operator's performance strategy under aided conditions. The present study used a measure of parallel processing efficiency, or workload capacity, derived from empirical RT distributions as a novel gauge of human-automation performance and automation dependence in a speeded task. Participants performed a speeded probabilistic decision task with and without the assistance of an automated aid. RT distributions were used to calculate two variants of a workload capacity measure, C_OR(t) and C_AND(t). Capacity measures gave evidence that a diagnosis from the automated aid speeded human participants' responses, and that participants did not moderate their own decision times in anticipation of diagnoses from the aid. Workload capacity provides a sensitive and informative measure of human-automation performance and operators' automation dependence in speeded tasks. © 2016, Human Factors and Ergonomics Society.
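The capacity coefficient is computed from integrated hazard functions of the RT distributions. Below is a minimal sketch of the standard OR form, C_OR(t) = H_team(t) / (H_A(t) + H_B(t)), using empirical cumulative hazards on synthetic RTs (here generated from an independent parallel race, for which C_OR(t) should sit near 1); the channel labels and distributions are assumptions, not the experiment's data.

```python
import numpy as np

def cumulative_hazard(rts, t):
    """Empirical integrated hazard H(t) = -log(1 - F(t)) from a sample of response times."""
    rts = np.sort(np.asarray(rts))
    F = np.searchsorted(rts, t, side="right") / (len(rts) + 1)   # +1 keeps F strictly below 1
    return -np.log(1.0 - F)

def capacity_or(rt_team, rt_a, rt_b, t):
    """Workload capacity C_OR(t); values > 1 indicate super-capacity, < 1 limited capacity."""
    return cumulative_hazard(rt_team, t) / (cumulative_hazard(rt_a, t) + cumulative_hazard(rt_b, t))

rng = np.random.default_rng(4)
rt_a = rng.gamma(4, 120, 2000)        # e.g. unaided human decisions (ms)
rt_b = rng.gamma(4, 150, 2000)        # e.g. automated aid alone (ms)
rt_team = np.minimum(rng.gamma(4, 120, 2000), rng.gamma(4, 150, 2000))   # independent race
for t in (300, 500, 700):
    print(t, round(capacity_or(rt_team, rt_a, rt_b, t), 2))
```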
Automation: the competitive edge for HMOs and other alternative delivery systems.
Prussin, J A
1987-12-01
Until recently, many, if not most, Health Maintenance Organizations (HMOs) were not automated. Moreover, HMOs that were automated tended to be automated only on a limited basis. Recently, however, the highly competitive marketplace within which HMOs and other Alternative Delivery Systems (ADS) exist has required that they operate at maximum effectiveness and efficiency. Given the complex nature of ADSs, the volume of transactions in ADSs, the large number of members served by ADSs, and the numerous providers who are paid at different rates and on different bases by ADSs, it is impossible for an ADS to operate effectively or efficiently, let alone show optimal performance, without a sophisticated, comprehensive automated system. Reliable automated systems designed specifically to address ADS functions such as enrollment and premium billing, finance and accounting, medical information and patient management, and marketing have recently become available at a reasonable cost.
Pistón, Mariela; Mollo, Alicia; Knochen, Moisés
2011-01-01
A fast and efficient automated method using a sequential injection analysis (SIA) system, based on the Griess reaction, was developed for the determination of nitrate and nitrite in infant formulas and milk powder. The system mixes a measured amount of sample (previously reconstituted in liquid form and deproteinized) with the chromogenic reagent to produce a colored substance whose absorbance is recorded. For nitrate determination, an on-line prereduction step was added by passing the sample through a Cd minicolumn. The system was controlled from a PC by means of a user-friendly program. Figures of merit include linearity (r2 > 0.999 for both analytes), limits of detection (0.32 mg kg−1 NO3-N and 0.05 mg kg−1 NO2-N), and precision (sr%) of 0.8–3.0. Results were statistically in good agreement with those obtained with the reference ISO-IDF method. The sampling frequency was 30 hour−1 (nitrate) and 80 hour−1 (nitrite) when performed separately. PMID:21960750
NASA Astrophysics Data System (ADS)
Kuetemeyer, Kai; Lucas-Hahn, Andrea; Petersen, Bjoern; Lemme, Erika; Hassel, Petra; Niemann, Heiner; Heisterkamp, Alexander
2010-07-01
Since the birth of "Dolly" as the first mammal cloned from a differentiated cell, somatic cell cloning has been successful in several mammalian species, albeit at low success rates. The highly invasive mechanical enucleation step of a cloning protocol requires sophisticated, expensive equipment and considerable micromanipulation skill. We present a novel noninvasive method for combined oocyte imaging and automated functional enucleation using femtosecond (fs) laser pulses. After three-dimensional imaging of Hoechst-labeled porcine oocytes by multiphoton microscopy, our self-developed software automatically identified the metaphase plate. Subsequent irradiation of the metaphase chromosomes with the very same laser at higher pulse energies in the low-density-plasma regime was used for metaphase plate ablation (functional enucleation). We show that fs laser-based functional enucleation of porcine oocytes completely inhibited the parthenogenetic development without affecting the oocyte morphology. In contrast, nonirradiated oocytes were able to develop parthenogenetically to the blastocyst stage without significant differences to controls. Our results indicate that fs laser systems have great potential for oocyte imaging and functional enucleation and may improve the efficiency of somatic cell cloning.
Smart Grid Interoperability Maturity Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Widergren, Steven E.; Levinson, Alex; Mater, J.
2010-04-28
The integration of automation associated with electricity resources (including transmission and distribution automation and demand-side resources operated by end-users) is key to supporting greater efficiencies and incorporating variable renewable resources and electric vehicles into the power system. The integration problems faced by this community are analogous to those faced in the health industry, emergency services, and other complex communities with many stakeholders. To highlight this issue and encourage communication and the development of a smart grid interoperability community, the GridWise Architecture Council (GWAC) created an Interoperability Context-Setting Framework. This "conceptual model" has been helpful to explain the importance of organizational alignment in addition to technical and informational interface specifications for "smart grid" devices and systems. As a next step to building a community sensitive to interoperability, the GWAC is investigating an interoperability maturity model (IMM) based on work done by others to address similar circumstances. The objective is to create a tool or set of tools that encourages a culture of interoperability in this emerging community. The tools would measure status and progress, analyze gaps, and prioritize efforts to improve the situation.
Ladner, Yoann; Mas, Silvia; Coussot, Gaelle; Bartley, Killian; Montels, Jérôme; Morel, Jacques; Perrin, Catherine
2017-12-15
The main purpose of the present work is to provide a fully integrated miniaturized electrophoretic methodology in order to facilitate the quality control of monoclonal antibodies (mAbs). This methodology, called D-PES (Diffusion-mediated Proteolysis combined with an Electrophoretic Separation), makes it possible to perform mAb tryptic digestion followed by electrophoretic separation of the proteolysis products in an automated manner. Tryptic digestion conditions were optimized with regard to the influence of enzyme concentration and incubation time in order to achieve enzymatic digestion efficiency similar to that obtained with the classical (off-line) methodology. Then, the electrophoretic separation conditions were optimized with respect to the nature of the background electrolyte (BGE), ionic strength, and pH. Successful and repeatable electrophoretic profiles of three mAb digests (Trastuzumab, Infliximab and Tocilizumab), comparable to the off-line digestion profiles, were obtained, demonstrating the feasibility and robustness of the proposed methodology. In summary, the proposed and optimized in-line approach opens a new, fast and easy way for the quality control of mAbs. Copyright © 2017 Elsevier B.V. All rights reserved.
Parallel workflow tools to facilitate human brain MRI post-processing
Cui, Zaixu; Zhao, Chenxi; Gong, Gaolang
2015-01-01
Multi-modal magnetic resonance imaging (MRI) techniques are widely applied in human brain studies. To obtain specific brain measures of interest from MRI datasets, a number of complex image post-processing steps are typically required. Parallel workflow tools have recently been developed, concatenating individual processing steps and enabling fully automated processing of raw MRI data to obtain the final results. These workflow tools are also designed to make optimal use of available computational resources and to support the parallel processing of different subjects or of independent processing steps for a single subject. Automated, parallel MRI post-processing tools can greatly facilitate relevant brain investigations and are being increasingly applied. In this review, we briefly summarize these parallel workflow tools and discuss relevant issues. PMID:26029043
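The core idea (chaining individual processing steps per subject and running independent subjects in parallel) can be sketched in a few lines; the step functions below are placeholders standing in for real MRI tools, not any particular toolkit's API.

```python
from concurrent.futures import ProcessPoolExecutor

def skull_strip(subject):            # placeholder processing steps
    return f"{subject}: skull-stripped"

def register_to_template(result):
    return result + " -> registered"

def extract_measures(result):
    return result + " -> measures extracted"

def pipeline(subject):
    """Concatenate the individual processing steps for one subject."""
    out = skull_strip(subject)
    out = register_to_template(out)
    return extract_measures(out)

if __name__ == "__main__":
    subjects = [f"sub-{i:02d}" for i in range(1, 9)]
    # Independent subjects are processed in parallel across worker processes.
    with ProcessPoolExecutor(max_workers=4) as pool:
        for line in pool.map(pipeline, subjects):
            print(line)
```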
Some Automated Cartography Developments at the Defense Mapping Agency.
1981-01-01
on a pantographic router creating a laminate step model which was moulded in plaster for carving into a terrain model. This section will trace DMA's ... offering economical automation. Precision flatbed Concord plotters were brought into DMA with sufficiently programmable control computers to perform these
Design of Inhouse Automated Library Systems.
ERIC Educational Resources Information Center
Cortez, Edwin M.
1984-01-01
Examines six steps inherent to development of in-house automated library system: (1) problem definition, (2) requirement specifications, (3) analysis of alternatives and solutions, (4, 5) design and implementation of hardware and software, and (6) evaluation. Practical method for comparing and weighting options is illustrated and explained. A…
ERIC Educational Resources Information Center
Bradley, Lucy K.; Cook, Jonneen; Cook, Chris
2011-01-01
North Carolina State University has incorporated many aspects of volunteer program administration and reporting into an on-line solution that integrates impact reporting into daily program management. The Extension Master Gardener Intranet automates many of the administrative tasks associated with volunteer management, increasing efficiency, and…
Integrated Communications and Work Efficiency: Impacts on Organizational Structure and Power.
ERIC Educational Resources Information Center
Wigand, Rolf T.
This paper reviews the work environment surrounding integrated office systems, synthesizes the known effects of automated office technologies, and discusses their impact on work efficiency in office environments. Particular attention is given to the effect of automated technologies on networks, workflow/processes, and organizational structure and…
Automated dental implantation using image-guided robotics: registration results.
Sun, Xiaoyan; McKenzie, Frederic D; Bawab, Sebastian; Li, Jiang; Yoon, Yongki; Huang, Jen-K
2011-09-01
One of the most important factors affecting the outcome of dental implantation is the accurate insertion of the implant into the patient's jaw bone, which requires a high degree of anatomical accuracy. With the accuracy and stability of robots, image-guided robotics is expected to provide more reliable and successful outcomes for dental implantation. Here, we proposed the use of a robot for drilling the implant site in preparation for the insertion of the implant. An image-guided robotic system for automated dental implantation is described in this paper. Patient-specific 3D models are reconstructed from preoperative Cone-beam CT images, and implantation planning is performed with these virtual models. A two-step registration procedure is applied to transform the preoperative plan of the implant insertion into intra-operative operations of the robot with the help of a Coordinate Measurement Machine (CMM). Experiments are carried out with a phantom that is generated from the patient-specific 3D model. Fiducial Registration Error (FRE) and Target Registration Error (TRE) values are calculated to evaluate the accuracy of the registration procedure. FRE values are less than 0.30 mm. Final TRE values after the two-step registration are 1.42 ± 0.70 mm (N = 5). The registration results of an automated dental implantation system using image-guided robotics are reported in this paper. Phantom experiments show that the use of a robot for dental implantation is feasible and that the system accuracy is comparable to other similar systems for dental implantation.
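A standard point-based rigid registration and the FRE/TRE bookkeeping mentioned above can be sketched as follows (Kabsch/SVD fit of fiducial points). This is a generic illustration under assumed fiducial positions, noise levels, and target points, not the system's actual two-step procedure.

```python
import numpy as np

def rigid_fit(src, dst):
    """Least-squares rotation R and translation t mapping src points onto dst (Kabsch/SVD)."""
    src_c, dst_c = src.mean(0), dst.mean(0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])   # guard against reflections
    R = Vt.T @ D @ U.T
    return R, dst_c - R @ src_c

rng = np.random.default_rng(5)
fiducials_plan = rng.uniform(0, 50, (4, 3))            # fiducials in the planning CT (mm, assumed)
target_plan = np.array([[25.0, 25.0, 10.0]])           # planned implant entry point (assumed)

# Simulated intra-operative (CMM) measurements: a true rigid motion plus localisation noise.
theta = np.deg2rad(20)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([5.0, -3.0, 12.0])
fiducials_meas = fiducials_plan @ R_true.T + t_true + rng.normal(0, 0.2, fiducials_plan.shape)

R, t = rigid_fit(fiducials_plan, fiducials_meas)
fre = np.linalg.norm(fiducials_plan @ R.T + t - fiducials_meas, axis=1).mean()
tre = np.linalg.norm(target_plan @ R.T + t - (target_plan @ R_true.T + t_true), axis=1).mean()
print(f"FRE = {fre:.2f} mm, TRE = {tre:.2f} mm")
```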
Improving automated 3D reconstruction methods via vision metrology
NASA Astrophysics Data System (ADS)
Toschi, Isabella; Nocerino, Erica; Hess, Mona; Menna, Fabio; Sargeant, Ben; MacDonald, Lindsay; Remondino, Fabio; Robson, Stuart
2015-05-01
This paper aims to provide a procedure for improving automated 3D reconstruction methods via vision metrology. The 3D reconstruction problem is generally addressed using two different approaches. On the one hand, vision metrology (VM) systems try to accurately derive 3D coordinates of few sparse object points for industrial measurement and inspection applications; on the other, recent dense image matching (DIM) algorithms are designed to produce dense point clouds for surface representations and analyses. This paper strives to demonstrate a step towards narrowing the gap between traditional VM and DIM approaches. Efforts are therefore intended to (i) test the metric performance of the automated photogrammetric 3D reconstruction procedure, (ii) enhance the accuracy of the final results and (iii) obtain statistical indicators of the quality achieved in the orientation step. VM tools are exploited to integrate their main functionalities (centroid measurement, photogrammetric network adjustment, precision assessment, etc.) into the pipeline of 3D dense reconstruction. Finally, geometric analyses and accuracy evaluations are performed on the raw output of the matching (i.e. the point clouds) by adopting a metrological approach. The latter is based on the use of known geometric shapes and quality parameters derived from VDI/VDE guidelines. Tests are carried out by imaging the calibrated Portable Metric Test Object, designed and built at University College London (UCL), UK. It allows assessment of the performance of the image orientation and matching procedures within a typical industrial scenario, characterised by poor texture and known 3D/2D shapes.
Recent progress in preparation and application of microfluidic chip electrophoresis
NASA Astrophysics Data System (ADS)
Cong, Hailin; Xu, Xiaodan; Yu, Bing; Yuan, Hua; Peng, Qiaohong; Tian, Chao
2015-05-01
Since its introduction in 1990, microfluidic chip electrophoresis (MCE) has enabled applications that are small, fast, low-cost, highly integrated and automated, easy to carry, and efficient to commercialize. MCE has been widely used in the areas of environmental protection, biochemistry, medicine and health, clinical testing, judicial expertise, food sanitation, pharmaceutical checking, drug testing, agrochemistry, biomedical engineering and life science. As one of the foremost fields in capillary electrophoresis research, MCE is the frontier for developing the miniaturized, integrated, automated all-in-one instruments needed in modern analytical chemistry. By adopting the advanced technologies of micro-machining, lasers and microelectronics, and the latest research achievements in analytical chemistry and biochemistry, the sampling, separation and detection systems of commonly used capillary electrophoresis are integrated at high density onto glass, quartz, silicon or polymer wafers to form the MCE, which can complete multi-step operations such as injection, enrichment, reaction, derivatization, separation, and collection of samples in a portable, efficient and very high-speed manner. With reference to the different technological achievements in this area, the latest developments in MCE are reviewed in this article. The preparation mechanisms, surface modifications, and properties of different materials in MCE are compared, and the different sampling, separation and detection systems in MCE are summarized. The performance of MCE in analysis of fluorescent substances, metallic ions, sugars, medicines, nucleic acids, DNA, amino acids, polypeptides and proteins is discussed, and the future direction of development is forecast.
From field notes to data portal - An operational QA/QC framework for tower networks
NASA Astrophysics Data System (ADS)
Sturtevant, C.; Hackley, S.; Meehan, T.; Roberti, J. A.; Holling, G.; Bonarrigo, S.
2016-12-01
Quality assurance and control (QA/QC) is one of the most important yet challenging aspects of producing research-quality data. This is especially so for environmental sensor networks collecting numerous high-frequency measurement streams at distributed sites. Here, the quality issues are multi-faceted, including sensor malfunctions, unmet theoretical assumptions, and measurement interference from the natural environment. To complicate matters, there are often multiple personnel managing different sites or different steps in the data flow. For large, centrally managed sensor networks such as NEON, the separation of field and processing duties is in the extreme. Tower networks such as Ameriflux, ICOS, and NEON continue to grow in size and sophistication, yet tools for robust, efficient, scalable QA/QC have lagged. Quality control remains a largely manual process relying on visual inspection of the data. In addition, notes of observed measurement interference or visible problems are often recorded on paper without an explicit pathway to data flagging during processing. As such, an increase in network size requires a near-proportional increase in personnel devoted to QA/QC, quickly stressing the human resources available. There is a need for a scalable, operational QA/QC framework that combines the efficiency and standardization of automated tests with the power and flexibility of visual checks, and includes an efficient communication pathway from field personnel to data processors to end users. Here we propose such a framework and an accompanying set of tools in development, including a mobile application template for recording tower maintenance and an R/shiny application for efficiently monitoring and synthesizing data quality issues. This framework seeks to incorporate lessons learned from the Ameriflux community and provide tools to aid continued network advancements.
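The automated-test half of such a framework typically combines simple per-stream checks (range, spike, persistence) that flag data before any visual review. A minimal sketch follows; the thresholds, window lengths, and synthetic temperature series are placeholders to be tuned per variable and are not NEON's or Ameriflux's actual tests.

```python
import numpy as np

def qc_flags(x, valid_range=(-40.0, 50.0), spike_sigma=5.0, persist_n=20):
    """Return a boolean flag per sample: True means the value fails an automated check."""
    x = np.asarray(x, dtype=float)
    flags = (x < valid_range[0]) | (x > valid_range[1])          # range test
    diffs = np.abs(np.diff(x, prepend=x[0]))
    flags |= diffs > spike_sigma * np.nanstd(np.diff(x))         # step/spike test
    for i in range(persist_n, len(x)):                           # persistence (stuck sensor) test
        if np.ptp(x[i - persist_n:i]) == 0:
            flags[i] = True
    return flags

temperature = np.concatenate([20 + np.random.default_rng(6).normal(0, 0.3, 200),
                              [85.0],                            # implausible spike
                              np.full(30, 21.4)])                # stuck value
flags = qc_flags(temperature)
print(f"{flags.sum()} of {len(flags)} samples flagged")
```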
An, Gao; Hong, Li; Zhou, Xiao-Bing; Yang, Qiong; Li, Mei-Qing; Tang, Xiang-Yang
2017-03-01
We investigated and compared the functionality of two 3D visualization software packages, provided by a CT vendor and a third-party vendor, respectively. Using surgical anatomical measurement as the baseline, we evaluated the accuracy of 3D visualization and verified their utility in computer-aided anatomical analysis. The study cohort consisted of 50 adult cadavers fixed with the classical formaldehyde method. The computer-aided anatomical analysis was based on CT images (in DICOM format) acquired by helical scan with contrast enhancement, using a CT-vendor-provided 3D visualization workstation (Syngo) and a third-party 3D visualization software package (Mimics) installed on a PC. Automated and semi-automated segmentations were utilized in the 3D visualization workstation and software, respectively. The functionality and efficiency of the automated and semi-automated segmentation methods were compared. Using surgical anatomical measurement as a baseline, the accuracy of 3D visualization based on automated and semi-automated segmentations was quantitatively compared. In semi-automated segmentation, the Mimics 3D visualization software outperformed the Syngo 3D visualization workstation. No significant difference was observed in anatomical data measurement between the Syngo 3D visualization workstation and the Mimics 3D visualization software (P>0.05). Both the Syngo 3D visualization workstation provided by a CT vendor and the Mimics 3D visualization software from a third-party vendor possessed the functionality, efficiency and accuracy needed for computer-aided anatomical analysis. Copyright © 2016 Elsevier GmbH. All rights reserved.
Horsch, Salome; Kopczynski, Dominik; Kuthe, Elias; Baumbach, Jörg Ingo; Rahmann, Sven
2017-01-01
Motivation Disease classification from molecular measurements typically requires an analysis pipeline from raw noisy measurements to final classification results. Multi capillary column—ion mobility spectrometry (MCC-IMS) is a promising technology for the detection of volatile organic compounds in the air of exhaled breath. From raw measurements, the peak regions representing the compounds have to be identified, quantified, and clustered across different experiments. Currently, several steps of this analysis process require manual intervention of human experts. Our goal is to identify a fully automatic pipeline that yields competitive disease classification results compared to an established but subjective and tedious semi-manual process. Method We combine a large number of modern methods for peak detection, peak clustering, and multivariate classification into analysis pipelines for raw MCC-IMS data. We evaluate all combinations on three different real datasets in an unbiased cross-validation setting. We determine which specific algorithmic combinations lead to high AUC values in disease classifications across the different medical application scenarios. Results The best fully automated analysis process achieves even better classification results than the established manual process. The best algorithms for the three analysis steps are (i) SGLTR (Savitzky-Golay Laplace-operator filter thresholding regions) and LM (Local Maxima) for automated peak identification, (ii) EM clustering (Expectation Maximization) and DBSCAN (Density-Based Spatial Clustering of Applications with Noise) for the clustering step and (iii) RF (Random Forest) for multivariate classification. Thus, automated methods can replace the manual steps in the analysis process to enable an unbiased high throughput use of the technology. PMID:28910313
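A three-stage pipeline of this kind can be sketched on synthetic 1-D signals: smoothing plus local-maxima peak picking, EM (Gaussian mixture) clustering of peak positions across measurements, and random-forest classification with cross-validated AUC. It illustrates the combination of methods named above, not the authors' exact algorithms; the signal model, thresholds, and class labels are assumptions.

```python
import numpy as np
from scipy.signal import savgol_filter, find_peaks
from sklearn.mixture import GaussianMixture
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
true_centres = [120, 340, 700]

def synthetic_spectrum(diseased):
    """Noisy baseline plus Gaussian peaks; the third peak is stronger in 'disease' samples."""
    x = rng.normal(0, 0.05, 1000)
    for c, amp in zip(true_centres, (1.0, 0.6, 1.2 if diseased else 0.2)):
        x += amp * np.exp(-0.5 * ((np.arange(1000) - c) / 8.0) ** 2)
    return x

def detect_peaks(spectrum):                      # step (i): smoothing + local maxima
    smooth = savgol_filter(spectrum, window_length=21, polyorder=3)
    idx, _ = find_peaks(smooth, height=0.3)
    return idx, smooth[idx]

labels = np.array([i % 2 for i in range(60)])    # 0 = control, 1 = disease
peak_lists = [detect_peaks(synthetic_spectrum(bool(y))) for y in labels]

# Step (ii): cluster peak positions across measurements so peaks can be matched.
all_pos = np.concatenate([p for p, _ in peak_lists]).reshape(-1, 1)
gmm = GaussianMixture(n_components=3, random_state=0).fit(all_pos)

# Step (iii): one feature per cluster (peak height, 0 if absent), then RF classification.
X = np.zeros((len(labels), 3))
for row, (pos, height) in enumerate(peak_lists):
    for p, h in zip(pos, height):
        c = gmm.predict([[p]])[0]
        X[row, c] = max(X[row, c], h)
auc = cross_val_score(RandomForestClassifier(random_state=0), X, labels,
                      cv=5, scoring="roc_auc").mean()
print(f"cross-validated AUC: {auc:.2f}")
```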
NASA Astrophysics Data System (ADS)
Kerekes, Ryan A.; Gleason, Shaun S.; Trivedi, Niraj; Solecki, David J.
2010-03-01
Segmentation, tracking, and tracing of neurons in video imagery are important steps in many neuronal migration studies and can be inaccurate and time-consuming when performed manually. In this paper, we present an automated method for tracing the leading and trailing processes of migrating neurons in time-lapse image stacks acquired with a confocal fluorescence microscope. In our approach, we first locate and track the soma of the cell of interest by smoothing each frame and tracking the local maxima through the sequence. We then trace the leading process in each frame by starting at the center of the soma and stepping repeatedly in the most likely direction of the leading process. This direction is found at each step by examining second derivatives of fluorescent intensity along curves of constant radius around the current point. Tracing terminates after a fixed number of steps or when fluorescent intensity drops below a fixed threshold. We evolve the resulting trace to form an improved trace that more closely follows the approximate centerline of the leading process. We apply a similar algorithm to the trailing process of the cell by starting the trace in the opposite direction. We demonstrate our algorithm on two time-lapse confocal video sequences of migrating cerebellar granule neurons (CGNs). We show that the automated traces closely approximate ground truth traces to within 1 or 2 pixels on average. Additionally, we compute line intensity profiles of fluorescence along the automated traces and quantitatively demonstrate their similarity to manually generated profiles in terms of fluorescence peak locations.
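For readers unfamiliar with this style of tracing, the simplified sketch below steps from the soma toward the brightest point on a circle of constant radius around the current position; the published method instead examines second derivatives of fluorescent intensity and later evolves the trace, so the radius, step limit, threshold and synthetic frame here are all assumptions.

```python
# Hypothetical, simplified sketch of the repeated-stepping idea.
import numpy as np

def trace_process(img, start, radius=5.0, max_steps=50, min_intensity=20):
    h, w = img.shape
    y, x = map(float, start)
    path = [(y, x)]
    angles = np.linspace(0, 2 * np.pi, 72, endpoint=False)
    for _ in range(max_steps):
        ys = np.clip((y + radius * np.sin(angles)).astype(int), 0, h - 1)
        xs = np.clip((x + radius * np.cos(angles)).astype(int), 0, w - 1)
        vals = img[ys, xs]
        if vals.max() < min_intensity:        # fluorescence too faint: stop
            break
        best = angles[int(np.argmax(vals))]   # most likely direction
        y += radius * np.sin(best)
        x += radius * np.cos(best)
        path.append((y, x))
    return np.array(path)

# Usage on a synthetic frame with a bright horizontal "process".
frame = np.zeros((200, 200))
frame[48:53, 20:180] = 100.0
print(trace_process(frame, start=(50, 20))[:5])
```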
Automation literature: A brief review and analysis
NASA Technical Reports Server (NTRS)
Smith, D.; Dieterly, D. L.
1980-01-01
Current thought and research positions that may allow for an improved capability to understand the impact of introducing automation into an existing system are established. The orientation was toward the type of studies that may provide some general insight into automation; specifically, the impact of automation on human performance and the resulting system performance. While an extensive number of articles were reviewed, only those that addressed the issue of automation and human performance were selected for discussion. The literature is organized along two dimensions: time (Pre-1970, Post-1970) and type of approach (Engineering or Behavioral Science). The conclusions reached are not definitive, but they do provide initial stepping stones toward a systematic treatment of the concept of automation.
Towards enhanced automated elution systems for waterborne protozoa using megasonic energy.
Horton, B; Katzer, F; Desmulliez, M P Y; Bridle, H L
2018-02-01
Continuous and reliable monitoring of water sources for human consumption is imperative for public health. For protozoa, which cannot be multiplied efficiently in laboratory settings, concentration and recovery steps are key to a successful detection procedure. Recently, the use of megasonic energy was demonstrated to recover Cryptosporidium from commonly used water industry filtration procedures, thereby forming a basis for a simplified and cost-effective method of elution of pathogens. In this article, we report the benefits of incorporating megasonic sonication into the current methodologies of Giardia duodenalis elution from an internationally approved filtration and elution system used within the water industry, the Filta-Max®. Megasonic energy-assisted elution has many benefits over current methods, since a smaller final volume of eluent allows removal of time-consuming centrifugation steps and reduces manual involvement, resulting in a potentially more consistent and more cost-effective method. We also show that megasonic sonication of G. duodenalis cysts provides the option of a less damaging elution method compared to the standard Filta-Max® operation, although elution from filter matrices is not yet fully optimised. A notable decrease in the recovery of damaged cysts was observed in megasonic-processed samples, potentially improving the prospects for further genetic identification upon isolation of the parasite from a filter sample. This work paves the way for the development of a fully automated and more cost-effective elution method for Giardia from water samples. Copyright © 2017 Elsevier B.V. All rights reserved.
Linear feature detection algorithm for astronomical surveys - I. Algorithm description
NASA Astrophysics Data System (ADS)
Bektešević, Dino; Vinković, Dejan
2017-11-01
Computer vision algorithms are powerful tools in astronomical image analyses, especially when automation of object detection and extraction is required. Modern object detection algorithms in astronomy are oriented towards detection of stars and galaxies, ignoring completely the detection of existing linear features. With the emergence of wide-field sky surveys, linear features attract scientific interest as possible trails of fast flybys of near-Earth asteroids and meteors. In this work, we describe a new linear feature detection algorithm designed specifically for implementation in big data astronomy. The algorithm combines a series of algorithmic steps that first remove other objects (stars and galaxies) from the image and then enhance the line to enable more efficient line detection with the Hough algorithm. The rate of false positives is greatly reduced thanks to a step that replaces possible line segments with rectangles and then compares lines fitted to the rectangles with the lines obtained directly from the image. The speed of the algorithm and its applicability in astronomical surveys are also discussed.
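To make the final detection stage concrete, the sketch below runs a probabilistic Hough transform on a synthetic frame standing in for an image from which stars and galaxies were already removed; the thresholds and the synthetic trail are assumptions, and the paper's additional rectangle-comparison step for rejecting false positives is not reproduced.

```python
# Hypothetical sketch of line detection with the Hough transform (OpenCV).
import cv2
import numpy as np

# Synthetic stand-in for a cleaned survey frame containing one bright trail.
img = np.zeros((512, 512), dtype=np.uint8)
cv2.line(img, (50, 60), (450, 400), color=255, thickness=3)

edges = cv2.Canny(img, 50, 150)
lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                        minLineLength=100, maxLineGap=10)
if lines is not None:
    for x1, y1, x2, y2 in lines[:, 0]:
        print(f"candidate trail from ({x1},{y1}) to ({x2},{y2})")
```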
Knowing what to expect, forecasting monthly emergency department visits: A time-series analysis.
Bergs, Jochen; Heerinckx, Philipe; Verelst, Sandra
2014-04-01
To evaluate an automatic forecasting algorithm in order to predict the number of monthly emergency department (ED) visits one year ahead. We collected retrospective data of the number of monthly visiting patients for a 6-year period (2005-2011) from 4 Belgian Hospitals. We used an automated exponential smoothing approach to predict monthly visits during the year 2011 based on the first 5 years of the dataset. Several in- and post-sample forecasting accuracy measures were calculated. The automatic forecasting algorithm was able to predict monthly visits with a mean absolute percentage error ranging from 2.64% to 4.8%, indicating an accurate prediction. The mean absolute scaled error ranged from 0.53 to 0.68 indicating that, on average, the forecast was better compared with in-sample one-step forecast from the naïve method. The applied automated exponential smoothing approach provided useful predictions of the number of monthly visits a year in advance. Copyright © 2013 Elsevier Ltd. All rights reserved.
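As a rough sketch of the forecasting idea (not the study's automated model-selection procedure), the example below fits an exponential smoothing model on five years of synthetic monthly counts, forecasts the sixth year, and reports the mean absolute percentage error; the series and model specification are assumptions.

```python
# Hypothetical sketch: exponential smoothing forecast of monthly ED visits.
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(1)
idx = pd.date_range("2005-01-01", periods=72, freq="MS")
visits = pd.Series(3000 + 200 * np.sin(np.arange(72) * 2 * np.pi / 12)
                   + rng.normal(0, 60, 72), index=idx)

train, test = visits[:60], visits[60:]
model = ExponentialSmoothing(train, trend="add", seasonal="add",
                             seasonal_periods=12).fit()
forecast = model.forecast(12)

mape = np.mean(np.abs((test.values - forecast.values) / test.values)) * 100
print(f"MAPE for the held-out year: {mape:.2f}%")
```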
An Intelligent Automation Platform for Rapid Bioprocess Design.
Wu, Tianyi; Zhou, Yuhong
2014-08-01
Bioprocess development is very labor intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet lab experiments, (2) standalone analysis devices detect the data, and (3) specific software is used for data analysis and experiment design given the user's inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multiagent intelligent architecture has been developed incorporating agent communication to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design and also the ability to generate scripts to run the experiments automatically, allowing the elimination of human involvement. A first-generation prototype has been established and demonstrated through lysozyme precipitation process design. All procedures in the case study have been fully automated through an intelligent automation platform. The realization of automated data analysis and experiment design, and automated script programming for experimental procedures has the potential to increase lab productivity. © 2013 Society for Laboratory Automation and Screening.
Ialongo, Cristiano; Pieri, Massimo; Bernardini, Sergio
2017-02-01
Saving resources is a paramount issue for the modern laboratory, and new trainable as well as smart technologies can be used to allow automated instrumentation to manage samples more efficiently in order to achieve streamlined processes. In this regard, serum free light chain (sFLC) testing represents an interesting challenge, as it usually requires a number of assays before an acceptable result within the analytical range is achieved. An artificial neural network based on the multi-layer perceptron (MLP-ANN) was used to infer the starting dilution status of sFLC samples based on the information available through the laboratory information system (LIS). After the learning phase, the MLP-ANN simulation was applied to the nephelometric testing routinely performed in our laboratory on a BN ProSpec® System analyzer (Siemens Healthcare) using the N Latex FLC kit. The MLP-ANN reduced the wasted serum kappa free light chain (κ-FLC) and serum lambda free light chain (λ-FLC) tests by 69.4% and 70.8% with respect to the naïve stepwise dilution scheme used by the automated analyzer, and by 64.9% and 66.9% compared to a "rational" dilution scheme based on a 4-step dilution. Although it was restricted to follow-up samples, the MLP-ANN showed good predictive performance, which, alongside the possibility of implementing it in any automated system, makes it a suitable solution for achieving streamlined laboratory processes and saving resources.
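A loose sketch of this approach is shown below: a multi-layer perceptron is trained to predict a starting-dilution class from features a LIS could plausibly provide. The feature names, dilution classes, network size and synthetic data are all assumptions, not the authors' model.

```python
# Hypothetical sketch: MLP predicting the starting dilution of follow-up
# sFLC samples from LIS-style features (all data and names invented).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 500
previous_result = rng.lognormal(mean=3.0, sigma=1.0, size=n)   # mg/L, synthetic
days_since_last = rng.integers(7, 180, size=n)
X = np.column_stack([previous_result, days_since_last])
# Placeholder target: index of the starting dilution class.
y = np.digitize(previous_result, bins=[20, 100, 500])

clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000,
                                  random_state=0))
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf.fit(X_tr, y_tr)
print("accuracy on held-out samples:", clf.score(X_te, y_te))
```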
Two Different Approaches to Automated Mark Up of Emotions in Text
NASA Astrophysics Data System (ADS)
Francisco, Virginia; Hervás, Raquel; Gervás, Pablo
This paper presents two different approaches to automated marking up of texts with emotional labels. For the first approach, a corpus of example texts previously annotated by human evaluators is mined for an initial assignment of emotional features to words. This results in a List of Emotional Words (LEW), which becomes a useful resource for later automated mark up. The mark up algorithm in this first approach mirrors closely the steps taken during feature extraction, employing for the actual assignment of emotional features a combination of the LEW resource and WordNet for knowledge-based expansion of words not occurring in LEW. The algorithm for automated mark up is tested against new text samples to test its coverage. The second approach marks up texts during their generation. We have a knowledge base which contains the necessary information for marking up the text. This information is related to actions and characters. The algorithm in this case employs the information in the knowledge base and decides the correct emotion for every sentence. The algorithm for automated mark up is tested against four different texts. The results of the two approaches are compared and discussed with respect to three main issues: relative adequacy of each one of the representations used, correctness and coverage of the proposed algorithms, and additional techniques and solutions that may be employed to improve the results.
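A toy sketch of the first approach is given below: words are looked up in a small emotional-word list and, when absent, expanded through WordNet synonyms. The tiny lexicon, the sentence-level voting rule and the emotion labels are invented for illustration; the published method uses richer emotional features.

```python
# Hypothetical sketch of lexicon lookup with WordNet-based expansion.
# Requires the NLTK WordNet corpus (nltk.download('wordnet')).
from collections import Counter
from nltk.corpus import wordnet as wn

LEW = {"happy": "joy", "delighted": "joy", "sad": "sadness",
       "terrified": "fear", "angry": "anger"}          # invented mini-lexicon

def emotion_of(word):
    if word in LEW:
        return LEW[word]
    # Knowledge-based expansion: check WordNet synonyms of the unknown word.
    for syn in wn.synsets(word):
        for lemma in syn.lemma_names():
            if lemma in LEW:
                return LEW[lemma]
    return None

def mark_up(sentence):
    votes = Counter(e for w in sentence.lower().split()
                    if (e := emotion_of(w.strip(".,!?"))) is not None)
    return votes.most_common(1)[0][0] if votes else "neutral"

print(mark_up("She was glad and delighted to see them."))
```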
MAAMD: a workflow to standardize meta-analyses and comparison of affymetrix microarray data
2014-01-01
Background: Mandatory deposit of raw microarray data files for public access, prior to study publication, provides significant opportunities to conduct new bioinformatics analyses within and across multiple datasets. Analysis of raw microarray data files (e.g. Affymetrix CEL files) can be time consuming, complex, and requires fundamental computational and bioinformatics skills. The development of analytical workflows to automate these tasks simplifies the processing of, improves the efficiency of, and serves to standardize multiple and sequential analyses. Once installed, workflows facilitate the tedious steps required to run rapid intra- and inter-dataset comparisons. Results: We developed a workflow to facilitate and standardize Meta-Analysis of Affymetrix Microarray Data analysis (MAAMD) in Kepler. Two freely available stand-alone software tools, R and AltAnalyze, were embedded in MAAMD. The inputs of MAAMD are user-editable csv files, which contain sample information and parameters describing the locations of input files and required tools. MAAMD was tested by analyzing 4 different GEO datasets from mice and Drosophila. MAAMD automates data downloading, data organization, data quality control assessment, differential gene expression analysis, clustering analysis, pathway visualization, gene-set enrichment analysis, and cross-species orthologous-gene comparisons. MAAMD was utilized to identify gene orthologues responding to hypoxia or hyperoxia in both mice and Drosophila. The entire set of analyses for 4 datasets (34 total microarrays) finished in about one hour. Conclusions: MAAMD saves time, minimizes the required computer skills, and offers a standardized procedure for users to analyze microarray datasets and make new intra- and inter-dataset comparisons. PMID:24621103
NASA Astrophysics Data System (ADS)
Vermeesch, Pieter; Rittner, Martin; Petrou, Ethan; Omma, Jenny; Mattinson, Chris; Garzanti, Eduardo
2017-11-01
The first step in most geochronological studies is to extract dateable minerals from the host rock, which is time consuming, removes textural context, and increases the chance for sample cross contamination. We here present a new method to rapidly perform in situ analyses by coupling a fast scanning electron microscope (SEM) with Energy Dispersive X-ray Spectrometer (EDS) to a Laser Ablation Inductively Coupled Plasma Mass Spectrometer (LAICPMS) instrument. Given a polished hand specimen, a petrographic thin section, or a grain mount, Automated Phase Mapping (APM) by SEM/EDS produces chemical and mineralogical maps from which the X-Y coordinates of the datable minerals are extracted. These coordinates are subsequently passed on to the laser ablation system for isotopic analysis. We apply the APM + LAICPMS method to three igneous, metamorphic, and sedimentary case studies. In the first case study, a polished slab of granite from Guernsey was scanned for zircon, producing a 609 ± 8 Ma weighted mean age. The second case study investigates a paragneiss from an ultra high pressure terrane in the north Qaidam terrane (Qinghai, China). One hundred seven small (25 µm) metamorphic zircons were analyzed by LAICPMS to confirm a 419 ± 4 Ma age of peak metamorphism. The third and final case study uses APM + LAICPMS to generate a large provenance data set and trace the provenance of 25 modern sediments from Angola, documenting longshore drift of Orange River sediments over a distance of 1,500 km. These examples demonstrate that APM + LAICPMS is an efficient and cost effective way to improve the quantity and quality of geochronological data.
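For readers unfamiliar with how summary ages such as "609 ± 8 Ma" are typically derived, the short sketch below computes an inverse-variance weighted mean and its uncertainty from individual spot ages; the values are invented and the study's own data reduction may differ.

```python
# Hypothetical sketch of an inverse-variance weighted mean age.
import numpy as np

ages = np.array([605.0, 612.0, 608.0, 615.0, 603.0])   # Ma, illustrative
sigmas = np.array([9.0, 11.0, 8.0, 12.0, 10.0])        # 1-sigma, illustrative

weights = 1.0 / sigmas**2
wmean = np.sum(weights * ages) / np.sum(weights)
wmean_err = np.sqrt(1.0 / np.sum(weights))
print(f"weighted mean age: {wmean:.1f} ± {wmean_err:.1f} Ma")
```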
Biradar, Ankush V; Patil, Vijayshinha S; Chandra, Prakash; Doke, Dhananjay S; Asefa, Tewodros
2015-05-18
We report the synthesis of a trifunctional catalyst containing amine, sulphonic acid and Pd nanoparticle catalytic groups anchored on the pore walls of SBA-15. The catalyst efficiently catalyzes one-pot three-step cascade reactions comprising deacetylation, Henry reaction and hydrogenation, giving up to ∼100% conversion and 92% selectivity to the final product.
Lee, Young Han
2012-01-01
The objectives are (1) to introduce an easy open-source macro program as connection software and (2) to illustrate its practical usage in the radiologic reading environment by simulating the radiologic reading process. The simulation is a set of radiologic reading processes for performing practical tasks in the radiologic reading room. The principal processes are: (1) to view radiologic images on the Picture Archiving and Communicating System (PACS), (2) to connect to the HIS/EMR (Hospital Information System/Electronic Medical Record) system, (3) to make an automatic radiologic reporting system, and (4) to record and recall information on interesting cases. This simulation environment was designed by using an open-source macro program as connection software. The simulation performed well on the Windows-based PACS workstation. Radiologists practiced the steps of the simulation comfortably by utilizing the macro-powered radiologic environment. This macro program could automate several cumbersome manual steps in the radiologic reading process. The program successfully acts as connection software for the PACS software, EMR/HIS, spreadsheet, and other various input devices in the radiologic reading environment. A user-friendly, efficient radiologic reading environment can be established by utilizing an open-source macro program as connection software. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
USDA-ARS?s Scientific Manuscript database
Increasing labor costs and reduced labor pools for hop production have resulted in the necessity to develop strategies to improve efficiency and automate hop production and harvest. One solution for reducing labor inputs is the use and production of “low-trellis” hop varieties optimized for mechani...
Code of Federal Regulations, 2010 CFR
2010-01-01
... 15 Commerce and Foreign Trade 2 2010-01-01 2010-01-01 false Steps regarding Shipper's Export... Section 732.5 Commerce and Foreign Trade Regulations Relating to Commerce and Foreign Trade (Continued) BUREAU OF INDUSTRY AND SECURITY, DEPARTMENT OF COMMERCE EXPORT ADMINISTRATION REGULATIONS STEPS FOR USING...
NASA Astrophysics Data System (ADS)
Belabbassi, L.; Garzio, L. M.; Smith, M. J.; Knuth, F.; Vardaro, M.; Kerfoot, J.
2016-02-01
The Ocean Observatories Initiative (OOI), funded by the National Science Foundation, provides users with access to long-term datasets from a variety of deployed oceanographic sensors. The Pioneer Array in the Atlantic Ocean off the Coast of New England hosts 10 moorings and 6 gliders. Each mooring is outfitted with 6 to 19 different instruments telemetering more than 1000 data streams. These data are available to science users to collaborate on common scientific goals such as water quality monitoring and scale variability measures of continental shelf processes and coastal open ocean exchanges. To serve this purpose, the acquired datasets undergo an iterative multi-step quality assurance and quality control procedure automated to work with all types of data. Data processing involves several stages, including a fundamental pre-processing step when the data are prepared for processing. This takes a considerable amount of processing time and is often not given enough thought in development initiatives. The volume and complexity of OOI data necessitates the development of a systematic diagnostic tool to enable the management of a comprehensive data information system for the OOI arrays. We present two examples to demonstrate the current OOI pre-processing diagnostic tool. First, Data Filtering is used to identify incomplete, incorrect, or irrelevant parts of the data and then replaces, modifies or deletes the coarse data. This provides data consistency with similar datasets in the system. Second, Data Normalization occurs when the database is organized in fields and tables to minimize redundancy and dependency. At the end of this step, the data are stored in one place to reduce the risk of data inconsistency and promote easy and efficient mapping to the database.
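The two pre-processing steps described above can be illustrated with a small pandas sketch: filter rows outside plausible ranges, then split the cleaned records into normalized tables before loading them into a database. The column names, valid ranges and sample values are assumptions, not OOI data.

```python
# Hypothetical sketch of data filtering and data normalization with pandas.
import pandas as pd

raw = pd.DataFrame({
    "instrument": ["CTD-01", "CTD-01", "CTD-02", "CTD-02"],
    "time": pd.to_datetime(["2015-06-01", "2015-06-01",
                            "2015-06-02", "2015-06-02"]),
    "temperature_C": [12.3, -999.0, 11.8, 12.1],     # -999 = bad telemetry
    "salinity_psu": [35.1, 35.0, None, 34.9],
})

# Step 1: data filtering - drop values outside plausible ranges or missing.
filtered = raw[(raw["temperature_C"].between(-2, 40)) &
               (raw["salinity_psu"].notna())]

# Step 2: data normalization - split into tables keyed on instrument to
# minimize redundancy before loading into the database.
instruments = filtered[["instrument"]].drop_duplicates().reset_index(drop=True)
observations = filtered[["instrument", "time", "temperature_C", "salinity_psu"]]
print(instruments, observations, sep="\n\n")
```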
An Economic Evaluation of Colorectal Cancer Screening in Primary Care Practice
Meenan, Richard T.; Anderson, Melissa L.; Chubak, Jessica; Vernon, Sally W.; Fuller, Sharon; Wang, Ching-Yun; Green, Beverly B.
2015-01-01
Introduction: Recent colorectal cancer screening studies focus on optimizing adherence. This study evaluated the cost effectiveness of interventions using electronic health records (EHRs), automated mailings, and stepped support increases to improve 2-year colorectal cancer screening adherence. Methods: Analyses were based on a parallel-design, randomized trial in which three stepped interventions (EHR-linked mailings [“automated”], automated plus telephone assistance [“assisted”], or automated and assisted plus nurse navigation to testing completion or refusal [“navigated”]) were compared to usual care. Data were from August 2008–November 2011, with analyses performed during 2012–2013. Implementation resources were micro-costed; research and registry development costs were excluded. Incremental cost-effectiveness ratios (ICERs) were based on the number of participants current for screening per guidelines over 2 years. Bootstrapping examined robustness of results. Results: Intervention delivery cost per participant current for screening ranged from $21 (automated) to $27 (navigated). Inclusion of induced testing costs (e.g., screening colonoscopy) lowered expenditures for automated (ICER=−$159) and assisted (ICER=−$36) relative to usual care over 2 years. Savings arose from increased fecal occult blood testing, substituting for more expensive colonoscopies in usual care. Results were broadly consistent across demographic subgroups. More intensive interventions were consistently likely to be cost effective relative to less intensive interventions, with willingness-to-pay values of $600–$1,200 for an additional person current for screening yielding ≥80% probability of cost effectiveness. Conclusions: Two-year cost effectiveness of a stepped approach to colorectal cancer screening promotion based on EHR data is indicated, but longer-term cost effectiveness requires further study. PMID:25998922
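To make the ICER and bootstrap ideas concrete, the sketch below computes an incremental cost-effectiveness ratio on invented per-participant data and bootstraps it to examine robustness; the figures are placeholders, not the trial's results.

```python
# Hypothetical sketch: ICER = (cost_int - cost_uc) / (effect_int - effect_uc),
# with a bootstrap interval. All numbers are invented.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
cost_uc = rng.normal(120, 40, n)        # usual-care cost per participant ($)
cost_auto = rng.normal(110, 35, n)      # intervention cost per participant ($)
curr_uc = rng.binomial(1, 0.45, n)      # current for screening (usual care)
curr_auto = rng.binomial(1, 0.60, n)    # current for screening (intervention)

def icer(c1, e1, c0, e0):
    return (c1.mean() - c0.mean()) / (e1.mean() - e0.mean())

print("point estimate ICER:",
      round(icer(cost_auto, curr_auto, cost_uc, curr_uc), 1))

# Bootstrap the ICER to examine robustness of the result.
boots = []
for _ in range(2000):
    i = rng.integers(0, n, n)
    j = rng.integers(0, n, n)
    boots.append(icer(cost_auto[i], curr_auto[i], cost_uc[j], curr_uc[j]))
print("2.5th-97.5th percentile:", np.percentile(boots, [2.5, 97.5]).round(1))
```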
Automated method for study of drug metabolism
NASA Technical Reports Server (NTRS)
Furner, R. L.; Feller, D. D.
1973-01-01
Commercially available equipment can be modified to provide automated system for assaying drug metabolism by continuous flow-through. System includes steps and devices for mixing drug with enzyme and cofactor in the presence of pure oxygen, dialyzing resulting metabolite against buffer, and determining amount of metabolite by colorimetric method.
NASA Astrophysics Data System (ADS)
Itoh, Hayato; Mori, Yuichi; Misawa, Masashi; Oda, Masahiro; Kudo, Shin-ei; Mori, Kensaku
2018-02-01
This paper presents a new classification method for endocytoscopic images. Endocytoscopy is a new endoscope that enables us to perform both conventional endoscopic observation and ultramagnified observation at the cell level. These ultramagnified views (endocytoscopic images) make it possible to perform pathological diagnosis solely from endoscopic views of polyps during colonoscopy. However, endocytoscopic image diagnosis requires considerable experience from physicians. An automated pathological diagnosis system is required to prevent the overlooking of neoplastic lesions in endocytoscopy. For this purpose, we propose a new automated endocytoscopic image classification method that classifies neoplastic and non-neoplastic endocytoscopic images. This method consists of two classification steps. In the first step, we classify an input image with a support vector machine. We forward the image to the second step if the confidence of the first classification is low. In the second step, we classify the forwarded image with a convolutional neural network. We reject the input image if the confidence of the second classification is also low. We experimentally evaluate the classification performance of the proposed method. In this experiment, we use about 16,000 and 4,000 colorectal endocytoscopic images as training and test data, respectively. The results show that the proposed method achieves a high sensitivity of 93.4% with a small rejection rate of 9.3%, even for difficult test data.
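A toy sketch of the two-step decision logic with rejection is shown below. The feature vectors, confidence thresholds and the random-forest stand-in for the second-stage network are assumptions; the published second stage is a convolutional neural network operating on images.

```python
# Hypothetical sketch of two-step classification with rejection.
import numpy as np
from sklearn.ensemble import RandomForestClassifier   # stand-in for the CNN
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 16))
y_train = rng.integers(0, 2, 200)        # 0 = non-neoplastic, 1 = neoplastic

svm = SVC(probability=True).fit(X_train, y_train)
second_stage = RandomForestClassifier(random_state=0).fit(X_train, y_train)

def classify(x, t1=0.8, t2=0.8):
    p1 = svm.predict_proba([x])[0]
    if p1.max() >= t1:                        # step 1: confident SVM decision
        return int(p1.argmax())
    p2 = second_stage.predict_proba([x])[0]   # step 2: forwarded image
    if p2.max() >= t2:
        return int(p2.argmax())
    return "rejected"                         # confidence low at both steps

print([classify(x) for x in rng.normal(size=(5, 16))])
```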
Asahi, Shigeo; Kusaki, Kazuki; Harada, Yukihiro; Kita, Takashi
2018-01-17
Development of high-efficiency solar cells is one of the attractive challenges in renewable energy technologies. Photon up-conversion can reduce the transmission loss and is one of the promising concepts which improve conversion efficiency. Here we present an analysis of the conversion efficiency, which can be increased by up-conversion in a single-junction solar cell with a hetero-interface that boosts the output voltage. We confirm that an increase in the quasi-Fermi gap and substantial photocurrent generation result in a high conversion efficiency.
Automated Recognition of 3D Features in GPIR Images
NASA Technical Reports Server (NTRS)
Park, Han; Stough, Timothy; Fijany, Amir
2007-01-01
A method of automated recognition of three-dimensional (3D) features in images generated by ground-penetrating imaging radar (GPIR) is undergoing development. GPIR 3D images can be analyzed to detect and identify such subsurface features as pipes and other utility conduits. Until now, much of the analysis of GPIR images has been performed manually by expert operators who must visually identify and track each feature. The present method is intended to satisfy a need for more efficient and accurate analysis by means of algorithms that can automatically identify and track subsurface features, with minimal supervision by human operators. In this method, data from multiple sources (for example, data on different features extracted by different algorithms) are fused together for identifying subsurface objects. The algorithms of this method can be classified in several different ways. In one classification, the algorithms fall into three classes: (1) image-processing algorithms, (2) feature- extraction algorithms, and (3) a multiaxis data-fusion/pattern-recognition algorithm that includes a combination of machine-learning, pattern-recognition, and object-linking algorithms. The image-processing class includes preprocessing algorithms for reducing noise and enhancing target features for pattern recognition. The feature-extraction algorithms operate on preprocessed data to extract such specific features in images as two-dimensional (2D) slices of a pipe. Then the multiaxis data-fusion/ pattern-recognition algorithm identifies, classifies, and reconstructs 3D objects from the extracted features. In this process, multiple 2D features extracted by use of different algorithms and representing views along different directions are used to identify and reconstruct 3D objects. In object linking, which is an essential part of this process, features identified in successive 2D slices and located within a threshold radius of identical features in adjacent slices are linked in a directed-graph data structure. Relative to past approaches, this multiaxis approach offers the advantages of more reliable detections, better discrimination of objects, and provision of redundant information, which can be helpful in filling gaps in feature recognition by one of the component algorithms. The image-processing class also includes postprocessing algorithms that enhance identified features to prepare them for further scrutiny by human analysts (see figure). Enhancement of images as a postprocessing step is a significant departure from traditional practice, in which enhancement of images is a preprocessing step.
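The object-linking step described above can be illustrated with a small graph sketch: features in adjacent slices are connected when they fall within a threshold radius, and connected components approximate reconstructed 3D objects. The coordinates and radius below are invented, and this simplification omits the fusion of features from multiple algorithms.

```python
# Hypothetical sketch of linking 2D features across slices into a graph.
import math
import networkx as nx

# slice index -> list of (x, y) feature centers detected in that slice
features = {
    0: [(10.0, 12.0), (40.0, 45.0)],
    1: [(10.5, 12.3), (40.2, 44.7)],
    2: [(11.0, 12.8)],                 # second object ends before this slice
}

RADIUS = 2.0
g = nx.DiGraph()
for z, pts in features.items():
    for i, p in enumerate(pts):
        g.add_node((z, i), xy=p)

for z in sorted(features)[:-1]:
    for i, (x0, y0) in enumerate(features[z]):
        for j, (x1, y1) in enumerate(features[z + 1]):
            if math.hypot(x1 - x0, y1 - y0) <= RADIUS:
                g.add_edge((z, i), (z + 1, j))

# Each weakly connected component approximates one reconstructed 3D object.
objects = list(nx.weakly_connected_components(g))
print(f"{len(objects)} candidate 3D objects:", objects)
```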
NASA Astrophysics Data System (ADS)
Fei, Jiangfeng
2013-03-01
In 2006, JDRF launched the Artificial Pancreas Project (APP) to accelerate the development of a commercially-viable artificial pancreas system to closely mimic the biological function of the pancreas individuals with insulin-dependent diabetes, particularly type 1 diabetes. By automating detection of blood sugar levels and delivery of insulin in response to those levels, an artificial pancreas has the potential to transform the lives of people with type 1 diabetes. The 6-step APP development pathway serves as JDRF's APP strategic funding plan and defines the priorities of product research and development. Each step in the plan represents incremental advances in automation beginning with devices that shut off insulin delivery to prevent episodes of low blood sugar and progressing ultimately to a fully automated ``closed loop'' system that maintains blood glucose at a target level without the need to bolus for meals or adjust for exercise.
NASA Technical Reports Server (NTRS)
Rothmann, Elizabeth; Dugan, Joanne Bechta; Trivedi, Kishor S.; Mittal, Nitin; Bavuso, Salvatore J.
1994-01-01
The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. The Hybrid Automated Reliability Predictor (HARP) tutorial provides insight into HARP modeling techniques and the interactive textual prompting input language via a step-by-step explanation and demonstration of HARP's fault occurrence/repair model and the fault/error handling models. Example applications are worked in their entirety and the HARP tabular output data are presented for each. Simple models are presented at first with each succeeding example demonstrating greater modeling power and complexity. This document is not intended to present the theoretical and mathematical basis for HARP.
A Geometry Based Infra-structure for Computational Analysis and Design
NASA Technical Reports Server (NTRS)
Haimes, Robert
1997-01-01
The computational steps traditionally taken for most engineering analysis (CFD, structural analysis, and etc.) are: Surface Generation - usually by employing a CAD system; Grid Generation - preparing the volume for the simulation; Flow Solver - producing the results at the specified operational point; and Post-processing Visualization - interactively attempting to understand the results For structural analysis, integrated systems can be obtained from a number of commercial vendors. For CFD, these steps have worked well in the past for simple steady-state simulations at the expense of much user interaction. The data was transmitted between phases via files. Specifically the problems with this procedure are: (1) File based. Information flows from one step to the next via data files with formats specified for that procedure. (2) 'Good' Geometry. A bottleneck in getting results from a solver is the construction of proper geometry to be fed to the grid generator. With 'good' geometry a grid can be constructed in tens of minutes (even with a complex configuration) using unstructured techniques. (3) One-Way communication. All information travels on from one phase to the next. Until this process can be automated, more complex problems such as multi-disciplinary analysis or using the above procedure for design becomes prohibitive.
Automated Car Park Management System
NASA Astrophysics Data System (ADS)
Fabros, J. P.; Tabañag, D.; Espra, A.; Gerasta, O. J.
2015-06-01
This study aims to develop a prototype for an Automated Car Park Management System that will increase the quality of service of parking lots through the integration of a smart system that assists motorist in finding vacant parking lot. The research was based on implementing an operating system and a monitoring system for parking system without the use of manpower. This will include Parking Guidance and Information System concept which will efficiently assist motorists and ensures the safety of the vehicles and the valuables inside the vehicle. For monitoring, Optical Character Recognition was employed to monitor and put into list all the cars entering the parking area. All parking events in this system are visible via MATLAB GUI which contain time-in, time-out, time consumed information and also the lot number where the car parks. To put into reality, this system has a payment method, and it comes via a coin slot operation to control the exit gate. The Automated Car Park Management System was successfully built by utilizing microcontrollers specifically one PIC18f4550 and two PIC16F84s and one PIC16F628A.
Recent development in software and automation tools for high-throughput discovery bioanalysis.
Shou, Wilson Z; Zhang, Jun
2012-05-01
Bioanalysis with LC-MS/MS has been established as the method of choice for quantitative determination of drug candidates in biological matrices in drug discovery and development. The LC-MS/MS bioanalytical support for drug discovery, especially for early discovery, often requires high-throughput (HT) analysis of large numbers of samples (hundreds to thousands per day) generated from many structurally diverse compounds (tens to hundreds per day) with a very quick turnaround time, in order to provide important activity and liability data to move discovery projects forward. Another important consideration for discovery bioanalysis is its fit-for-purpose quality requirement depending on the particular experiments being conducted at this stage, and it is usually not as stringent as those required in bioanalysis supporting drug development. These aforementioned attributes of HT discovery bioanalysis made it an ideal candidate for using software and automation tools to eliminate manual steps, remove bottlenecks, improve efficiency and reduce turnaround time while maintaining adequate quality. In this article we will review various recent developments that facilitate automation of individual bioanalytical procedures, such as sample preparation, MS/MS method development, sample analysis and data review, as well as fully integrated software tools that manage the entire bioanalytical workflow in HT discovery bioanalysis. In addition, software tools supporting the emerging high-resolution accurate MS bioanalytical approach are also discussed.
Demystifying the Search Button
McKeever, Liam; Nguyen, Van; Peterson, Sarah J.; Gomez-Perez, Sandra
2015-01-01
A thorough review of the literature is the basis of all research and evidence-based practice. A gold-standard efficient and exhaustive search strategy is needed to ensure all relevant citations have been captured and that the search performed is reproducible. The PubMed database comprises both the MEDLINE and non-MEDLINE databases. MEDLINE-based search strategies are robust but capture only 89% of the total available citations in PubMed. The remaining 11% include the most recent and possibly relevant citations but are only searchable through less efficient techniques. An effective search strategy must employ both the MEDLINE and the non-MEDLINE portion of PubMed to ensure all studies have been identified. The robust MEDLINE search strategies are used for the MEDLINE portion of the search. Usage of the less robust strategies is then efficiently confined to search only the remaining 11% of PubMed citations that have not been indexed for MEDLINE. The current article offers step-by-step instructions for building such a search exploring methods for the discovery of medical subject heading (MeSH) terms to search MEDLINE, text-based methods for exploring the non-MEDLINE database, information on the limitations of convenience algorithms such as the “related citations feature,” the strengths and pitfalls associated with commonly used filters, the proper usage of Boolean operators to organize a master search strategy, and instructions for automating that search through “MyNCBI” to receive search query updates by email as new citations become available. PMID:26129895
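As a small illustration of combining the two portions of PubMed programmatically, the sketch below submits a master strategy through the NCBI E-utilities esearch endpoint, pairing a MeSH-based component with a text-word component restricted to citations outside the MEDLINE subset (NOT medline[sb]). The query terms are examples only, not the article's recommended strategy.

```python
# Hypothetical sketch of running a combined MEDLINE + non-MEDLINE search
# via the NCBI E-utilities esearch endpoint.
import requests

medline_part = '"Parenteral Nutrition"[Mesh] AND "Critical Illness"[Mesh]'
non_medline_part = 'parenteral nutrition[tiab] NOT medline[sb]'
master = f"({medline_part}) OR ({non_medline_part})"

resp = requests.get(
    "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi",
    params={"db": "pubmed", "term": master, "retmax": 20, "retmode": "json"},
    timeout=30,
)
result = resp.json()["esearchresult"]
print("hits:", result["count"])
print("first PMIDs:", result["idlist"])
```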
Fahlgren, Noah; Feldman, Maximilian; Gehan, Malia A; Wilson, Melinda S; Shyu, Christine; Bryant, Douglas W; Hill, Steven T; McEntee, Colton J; Warnasooriya, Sankalpi N; Kumar, Indrajit; Ficor, Tracy; Turnipseed, Stephanie; Gilbert, Kerrigan B; Brutnell, Thomas P; Carrington, James C; Mockler, Todd C; Baxter, Ivan
2015-10-05
Phenotyping has become the rate-limiting step in using large-scale genomic data to understand and improve agricultural crops. Here, the Bellwether Phenotyping Platform for controlled-environment plant growth and automated multimodal phenotyping is described. The system has capacity for 1140 plants, which pass daily through stations to record fluorescence, near-infrared, and visible images. Plant Computer Vision (PlantCV) was developed as open-source, hardware platform-independent software for quantitative image analysis. In a 4-week experiment, wild Setaria viridis and domesticated Setaria italica had fundamentally different temporal responses to water availability. While both lines produced similar levels of biomass under limited water conditions, Setaria viridis maintained the same water-use efficiency under water replete conditions, while Setaria italica shifted to less efficient growth. Overall, the Bellwether Phenotyping Platform and PlantCV software detected significant effects of genotype and environment on height, biomass, water-use efficiency, color, plant architecture, and tissue water status traits. All ∼ 79,000 images acquired during the course of the experiment are publicly available. Copyright © 2015 The Author. Published by Elsevier Inc. All rights reserved.
Protein purification and analysis: next generation Western blotting techniques.
Mishra, Manish; Tiwari, Shuchita; Gomes, Aldrin V
2017-11-01
Western blotting is one of the most commonly used techniques in molecular biology and proteomics. Since western blotting is a multistep protocol, variations and errors can occur at any step reducing the reliability and reproducibility of this technique. Recent reports suggest that a few key steps, such as the sample preparation method, the amount and source of primary antibody used, as well as the normalization method utilized, are critical for reproducible western blot results. Areas covered: In this review, improvements in different areas of western blotting, including protein transfer and antibody validation, are summarized. The review discusses the most advanced western blotting techniques available and highlights the relationship between next generation western blotting techniques and its clinical relevance. Expert commentary: Over the last decade significant improvements have been made in creating more sensitive, automated, and advanced techniques by optimizing various aspects of the western blot protocol. New methods such as single cell-resolution western blot, capillary electrophoresis, DigiWest, automated microfluid western blotting and microchip electrophoresis have all been developed to reduce potential problems associated with the western blotting technique. Innovative developments in instrumentation and increased sensitivity for western blots offer novel possibilities for increasing the clinical implications of western blot.
Learning to detect and combine the features of an object
Suchow, Jordan W.; Pelli, Denis G.
2013-01-01
To recognize an object, it is widely supposed that we first detect and then combine its features. Familiar objects are recognized effortlessly, but unfamiliar objects—like new faces or foreign-language letters—are hard to distinguish and must be learned through practice. Here, we describe a method that separates detection and combination and reveals how each improves as the observer learns. We dissociate the steps by two independent manipulations: For each step, we do or do not provide a bionic crutch that performs it optimally. Thus, the two steps may be performed solely by the human, solely by the crutches, or cooperatively, when the human takes one step and a crutch takes the other. The crutches reveal a double dissociation between detecting and combining. Relative to the two-step ideal, the human observer’s overall efficiency for unconstrained identification equals the product of the efficiencies with which the human performs the steps separately. The two-step strategy is inefficient: Constraining the ideal to take two steps roughly halves its identification efficiency. In contrast, we find that humans constrained to take two steps perform just as well as when unconstrained, which suggests that they normally take two steps. Measuring threshold contrast (the faintness of a barely identifiable letter) as it improves with practice, we find that detection is inefficient and learned slowly. Combining is learned at a rate that is 4× higher and, after 1,000 trials, 7× more efficient. This difference explains much of the diversity of rates reported in perceptual learning studies, including effects of complexity and familiarity. PMID:23267067
Albert, Océane; Reintsch, Wolfgang E; Chan, Peter; Robaire, Bernard
2016-05-01
Can we make the comet assay (single-cell gel electrophoresis) for human sperm a more accurate and informative high throughput assay? We developed a standardized automated high throughput comet (HT-COMET) assay for human sperm that improves its accuracy and efficiency, and could be of prognostic value to patients in the fertility clinic. The comet assay involves the collection of data on sperm DNA damage at the level of the single cell, allowing the use of samples from severe oligozoospermic patients. However, this makes comet scoring a low throughput procedure that renders large cohort analyses tedious. Furthermore, the comet assay comes with an inherent vulnerability to variability. Our objective is to develop an automated high throughput comet assay for human sperm that will increase both its accuracy and efficiency. The study comprised two distinct components: a HT-COMET technical optimization section based on control versus DNAse treatment analyses (n = 3-5), and a cross-sectional study on 123 men presenting to a reproductive center with sperm concentrations categorized as severe oligozoospermia, oligozoospermia or normozoospermia. Sperm chromatin quality was measured using the comet assay: on classic 2-well slides for software comparison; on 96-well slides for HT-COMET optimization; after exposure to various concentrations of a damage-inducing agent, DNAse, using HT-COMET; on 123 subjects with different sperm concentrations using HT-COMET. Data from the 123 subjects were correlated to classic semen quality parameters and plotted as single-cell data in individual DNA damage profiles. We have developed a standard automated HT-COMET procedure for human sperm. It includes automated scoring of comets by a fully integrated high content screening setup that compares well with the most commonly used semi-manual analysis software. Using this method, a cross-sectional study on 123 men showed no significant correlation between sperm concentration and sperm DNA damage, confirming the existence of hidden chromatin damage in men with apparently normal semen characteristics, and a significant correlation between percentage DNA in the tail and percentage of progressively motile spermatozoa. Finally, the use of DNA damage profiles helped to distinguish subjects between and within sperm concentration categories, and allowed a determination of the proportion of highly damaged cells. The main limitations of the HT-COMET are the high, yet indispensable, investment in an automated liquid handling system and heating block to ensure accuracy, and the availability of an automated plate reading microscope and analysis software. This standardized HT-COMET assay offers many advantages, including higher accuracy and evenness due to automation of sensitive steps, a 14.4-fold increase in sample analysis capacity, and an imaging and scoring time of 1 min/well. Overall, HT-COMET offers a decrease in total experimental time of more than 90%. Hence, this assay constitutes a more efficient option to assess sperm chromatin quality, paves the way to using this assay to screen large cohorts, and holds prognostic value for infertile patients. Funded by the CIHR Institute of Human Development, Child and Youth Health (IHDCYH; RHF 100625). O.A. is a fellow supported by the Fonds de la Recherche du Québec - Santé (FRQS) and the CIHR Training Program in Reproduction, Early Development, and the Impact on Health (REDIH). B.R. is a James McGill Professor. The authors declare no conflicts of interest. © The Author 2016. Published by Oxford University Press on behalf of the European Society of Human Reproduction and Embryology. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Automated optimization techniques for aircraft synthesis
NASA Technical Reports Server (NTRS)
Vanderplaats, G. N.
1976-01-01
Application of numerical optimization techniques to automated conceptual aircraft design is examined. These methods are shown to be a general and efficient way to obtain quantitative information for evaluating alternative new vehicle projects. Fully automated design is compared with traditional point design methods and time and resource requirements for automated design are given. The NASA Ames Research Center aircraft synthesis program (ACSYNT) is described with special attention to calculation of the weight of a vehicle to fly a specified mission. The ACSYNT procedures for automatically obtaining sensitivity of the design (aircraft weight, performance and cost) to various vehicle, mission, and material technology parameters are presented. Examples are used to demonstrate the efficient application of these techniques.
Automation of Cataloging: Effects on Use of Staff, Efficiency, and Service to Patrons.
ERIC Educational Resources Information Center
Bednar, Marie
1988-01-01
Describes the effects of the automation of cataloging processes at Pennsylvania State University. The discussion covers the reorganization of professional and paraprofessional personnel and job responsibilities, staff reactions to the changes, the impact on cataloging quality and efficiency, and patron satisfaction with the services offered. (15…
Xie, Wei-Qi; Gong, Yi-Xian; Yu, Kong-Xian
2017-06-01
We demonstrate a reaction headspace gas chromatographic method for quantifying anhydride groups in anhydride-based epoxy hardeners. In this method, the conversion of anhydride groups is carried out in two steps. In the first step, the anhydride groups in anhydride-based epoxy hardeners react completely with water to form carboxyl groups. In the second step, the carboxyl groups react with sodium bicarbonate solution in a closed sample vial. After the complete reaction between the carboxyl groups and sodium bicarbonate, the CO2 formed in this reaction is measured by headspace gas chromatography. The data showed that the reaction in the closed headspace vial can be completed in 15 min at 55°C, the relative standard deviation of the reaction headspace gas chromatography method in the precision test was less than 3.94%, and the relative differences between the new method and a reference method were no more than 9.38%. The present reaction method is automated, efficient and can be a reliable tool for quantifying the anhydride groups in anhydride-based epoxy hardeners and in related research. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Korshunov, G. I.; Petrushevskaya, A. A.; Lipatnikov, V. A.; Smirnova, M. S.
2018-03-01
The strategy of quality assurance for electronics is considered the most important. To provide quality, the sequence of processes is considered and modeled as a Markov chain. The improvement is distinguished by simple database-supported design-for-manufacturing means that allow future step-by-step development. Phased automation of the design and digital manufacturing of electronics is proposed. The MATLAB modeling results showed an increase in effectiveness. New tools and software should be more effective. A primary digital model is proposed to represent the product throughout the sequence of processes, from individual processes up to the whole life cycle.
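To illustrate the Markov-chain view of a process sequence mentioned above, the sketch below defines a small transition matrix over invented production states and computes its steady-state distribution; the states and probabilities are assumptions, and the paper's MATLAB model is not reproduced.

```python
# Hypothetical sketch: a process sequence modeled as a Markov chain.
import numpy as np

states = ["design", "manufacturing", "test", "rework"]
# P[i, j] = probability of moving from state i to state j (rows sum to 1)
P = np.array([
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 0.9, 0.1],
    [0.2, 0.0, 0.6, 0.2],
    [0.0, 0.7, 0.0, 0.3],
])

# Steady-state distribution: left eigenvector of P for eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
pi = pi / pi.sum()
for s, p in zip(states, pi):
    print(f"{s:14s} {p:.3f}")
```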
Process Development for Automated Solar Cell and Module Production. Task 4: Automated Array Assembly
NASA Technical Reports Server (NTRS)
1979-01-01
A baseline sequence for the manufacture of solar cell modules was specified. Starting with silicon wafers, the process goes through damage etching, texture etching, junction formation, plasma edge etch, aluminum back surface field formation, and screen printed metallization to produce finished solar cells. The cells were then series connected on a ribbon and bonded into a finished glass tedlar module. A number of steps required additional developmental effort to verify technical and economic feasibility. These steps include texture etching, plasma edge etch, aluminum back surface field formation, array layup and interconnect, and module edge sealing and framing.
Schaefer, Peter
2011-07-01
The purpose of bioanalysis in the pharmaceutical industry is to provide 'raw' data about the concentration of a drug candidate and its metabolites as input for studies of drug properties such as pharmacokinetic (PK), toxicokinetic, bioavailability/bioequivalence and other studies. Building a seamless workflow from the laboratory to final reports is an ongoing challenge for IT groups and users alike. In such a workflow, PK automation can provide companies with the means to vastly increase the productivity of their scientific staff while improving the quality and consistency of their reports on PK analyses. This report presents the concept and benefits of PK automation and discusses which features of an automated reporting workflow should be translated into software requirements that pharmaceutical companies can use to select or build an efficient and effective PK automation solution that best meets their needs.
Isoda, Yuta; Sasaki, Norihiko; Kitamura, Kei; Takahashi, Shuji; Manmode, Sujit; Takeda-Okuda, Naoko; Tamura, Jun-Ichi; Nokami, Toshiki; Itoh, Toshiyuki
2017-01-01
The total synthesis of TMG-chitotriomycin using an automated electrochemical synthesizer for the assembly of carbohydrate building blocks is demonstrated. We have successfully prepared a precursor of TMG-chitotriomycin, which is a structurally pure tetrasaccharide with typical protecting groups, through the methodology of automated electrochemical solution-phase synthesis developed by us. The synthesis of structurally well-defined TMG-chitotriomycin has been accomplished in 10 steps from a disaccharide building block.
A modular computational framework for automated peak extraction from ion mobility spectra.
D'Addario, Marianna; Kopczynski, Dominik; Baumbach, Jörg Ingo; Rahmann, Sven
2014-01-22
An ion mobility (IM) spectrometer coupled with a multi-capillary column (MCC) measures volatile organic compounds (VOCs) in the air or in exhaled breath. This technique is utilized in several biotechnological and medical applications. Each peak in an MCC/IM measurement represents a certain compound, which may be known or unknown. For clustering and classification of measurements, the raw data matrix must be reduced to a set of peaks. Each peak is described by its coordinates (retention time in the MCC and reduced inverse ion mobility) and shape (signal intensity, further shape parameters). This fundamental step is referred to as peak extraction. It is the basis for identifying discriminating peaks, and hence putative biomarkers, between two classes of measurements, such as a healthy control group and a group of patients with a confirmed disease. Current state-of-the-art peak extraction methods require human interaction, such as hand-picking approximate peak locations, assisted by a visualization of the data matrix. In a high-throughput context, however, it is preferable to have robust methods for fully automated peak extraction. We introduce PEAX, a modular framework for automated peak extraction. The framework consists of several steps in a pipeline architecture. Each step performs a specific sub-task and can be instantiated by different methods implemented as modules. We provide open-source software for the framework and several modules for each step. Additionally, an interface that allows easy extension by a new module is provided. Combining the modules in all reasonable ways leads to a large number of peak extraction methods. We evaluate all combinations using intrinsic error measures and by comparing the resulting peak sets with an expert-picked one. Our software PEAX is able to automatically extract peaks from MCC/IM measurements within a few seconds. The automatically obtained results keep up with the results provided by current state-of-the-art peak extraction methods. This opens a high-throughput context for the MCC/IM application field. Our software is available at http://www.rahmannlab.de/research/ims.
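As a loose illustration of the pipeline architecture described above (not PEAX itself), the sketch below wires trivial placeholder modules together behind a common interface so that different implementations of each step can be swapped in; the module bodies and parameters are invented.

```python
# Hypothetical sketch of a modular peak-extraction pipeline.
import numpy as np
from scipy.ndimage import median_filter

def denoise_median(matrix, size=3):
    # Preprocessing module: simple median smoothing of the data matrix.
    return median_filter(matrix, size=size)

def detect_local_maxima(matrix, threshold=5.0):
    # Peak-detection module: (row, col, intensity) of local maxima above a threshold.
    peaks = []
    for r in range(1, matrix.shape[0] - 1):
        for c in range(1, matrix.shape[1] - 1):
            v = matrix[r, c]
            if v > threshold and v == matrix[r - 1:r + 2, c - 1:c + 2].max():
                peaks.append((r, c, float(v)))
    return peaks

def run_pipeline(matrix, steps):
    """Apply preprocessing modules in order, then the final detection module."""
    for step in steps[:-1]:
        matrix = step(matrix)
    return steps[-1](matrix)

rng = np.random.default_rng(0)
data = rng.normal(0, 1, (50, 50))
data[19:22, 29:32] = 25.0                 # one synthetic peak region
pipeline = [denoise_median, detect_local_maxima]
print(run_pipeline(data, pipeline))
```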
Obtaining and processing Daymet data using Python and ArcGIS
Bohms, Stefanie
2013-01-01
This set of scripts was developed to automate the process of downloading and mosaicking daily Daymet data to a user-defined extent using ArcGIS and the Python programming language. The three steps are downloading the needed Daymet tiles for the study area extent, converting the netCDF files to tif raster format, and mosaicking those rasters into one file. The set of scripts is intended for all levels of experience with the Python programming language and requires no scripting by the user.
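A rough sketch of the download-and-mosaic idea is shown below using requests and rasterio rather than the original ArcGIS/arcpy tooling; the tile URL pattern, tile identifiers and file names are placeholders (not the actual Daymet server layout), and the netCDF-to-raster conversion step is omitted by assuming GeoTIFF tiles.

```python
# Hypothetical sketch: download raster tiles and mosaic them into one file.
import requests
import rasterio
from rasterio.merge import merge

TILE_URL = "https://example.org/daymet/{year}/{tile}_{year}_prcp.tif"  # placeholder
tiles = ["11738", "11739"]                                             # placeholder IDs
year = 2012

# Step 1: download the tiles covering the study area extent.
paths = []
for t in tiles:
    path = f"prcp_{t}_{year}.tif"
    with open(path, "wb") as f:
        f.write(requests.get(TILE_URL.format(year=year, tile=t),
                             timeout=60).content)
    paths.append(path)

# Step 2-3: open the rasters and mosaic them into a single output file.
sources = [rasterio.open(p) for p in paths]
mosaic, transform = merge(sources)
profile = sources[0].profile
profile.update(height=mosaic.shape[1], width=mosaic.shape[2], transform=transform)
with rasterio.open(f"prcp_{year}_mosaic.tif", "w", **profile) as dst:
    dst.write(mosaic)
```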
Tang, Peng; Wu, Jie; Liu, Hou; Liu, Youcai; Zhou, Xingding
2018-01-01
One of the newly developed methods for assimilable organic carbon (AOC) determination relies on cell enumeration by flow cytometry (FC), which could provide a rapid and automated solution for AOC measurement. However, staining cell samples with a fluorescent dye is indispensable for reducing background and machine noise, and this step adds cost and time to the method. In this study, a green fluorescent protein (GFP)-tagged strain derived from the AOC testing strain Pseudomonas fluorescens P-17 (GFP-P17) was generated using Tn5 transposon mutagenesis. Continuous culture of the mutant GFP-P17 showed stable expression of the eGFP signal, detected by flow cytometry without a staining step. In addition, the GFP-P17 strain displayed a faster growth rate and a wider range of carbon substrate utilization patterns compared with the P17 wild type. With this strain, the capability of a new FC method with no dye staining was explored in standard acetate solutions, showing a linear correlation of cell counts with acetate carbon concentration. Furthermore, this FC method with the GFP-P17 strain is applicable to monitoring GAC/BAC efficiency and condition, as similar trends in AOC levels across the water treatment process were measured by both the FC method and the conventional spread plate count method. Therefore, this fast and easily applicable GFP-P17-based FC method could serve as a tool for routine microbiological drinking water monitoring.
NASA Astrophysics Data System (ADS)
Wang, Yi-Min; Li, Cheng-Zu
2010-01-01
We propose theoretical schemes to generate highly entangled cluster states with superconducting qubits in a circuit QED architecture. Charge qubits are located inside a superconducting transmission line, which serves as a quantum data bus. We show that large cluster states can be efficiently generated in just one step with long-range Ising-like unitary operators. The quantum operations, realized by either of two coupling mechanisms (voltage coupling or current coupling), depend only on global geometric features and are insensitive both to the thermal state of the transmission line and to certain random operation errors. Thus high-fidelity one-way quantum computation can be achieved.
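For orientation, the textbook form of an Ising-type (controlled-phase) entangling operation that produces a cluster/graph state in a single step is shown below; this is the generic construction, not necessarily the specific Hamiltonian engineered in the circuit QED scheme.

```latex
% Generic one-step cluster-state generation by an Ising-type unitary
% (textbook form; not taken from this particular paper).
U(\varphi) = \exp\!\Big[-\,i\,\varphi \sum_{(j,k)\in E}
      \tfrac{1-\sigma_z^{(j)}}{2}\;\tfrac{1-\sigma_z^{(k)}}{2}\Big],
\qquad
\lvert C \rangle = U(\pi)\,\lvert + \rangle^{\otimes N}
```

Here E is the set of qubit pairs coupled through the bus; at φ = π each factor reduces to a controlled-phase gate, so the entire entangling layer is applied in one step.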
Vorstius, Christian; Radach, Ralph; Lang, Alan R
2012-02-01
Reflexive and voluntary levels of processing have been studied extensively with respect to possible impairments due to alcohol intoxication. This study examined alcohol effects at the 'automated' level of processing essential to many complex visual processing tasks (e.g., reading, visual search) that involve ongoing modifications or reprogramming of well-practiced routines. Data from 30 participants (16 male) were collected in two counterbalanced sessions (alcohol vs. no-alcohol control; mean breath alcohol concentration = 68 mg/dL vs. 0 mg/dL). Eye movements were recorded during a double-step task where 75% of trials involved two target stimuli in rapid succession (inter-stimulus interval [ISI]=40, 70, or 100 ms) so that they could elicit two distinct saccades or eye movements (double steps). On 25% of trials a single target appeared. Results indicated that saccade latencies were longer under alcohol. In addition, the proportion of single-step responses and the mean saccade amplitude (length) of primary saccades decreased significantly with increasing ISI. The key novel finding, however, was that the reprogramming time needed to cancel the first saccade and adjust saccade amplitude was extended significantly by alcohol. The additional time made available by prolonged latencies due to alcohol was not utilized by the saccade programming system to decrease the number of two-step responses. These results represent the first demonstration of specific alcohol-induced programming deficits at the automated level of oculomotor processing.
Expected Improvements in Work Truck Efficiency Through Connectivity and Automation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walkowicz, Kevin A
This presentation focuses on the potential impact of connected and automated technologies on commercial vehicle operations. It includes topics such as the U.S. Department of Energy's Energy Efficient Mobility Systems (EEMS) program and the Systems and Modeling for Accelerated Research in Transportation (SMART) Mobility Initiative. It also describes National Renewable Energy Laboratory (NREL) research findings pertaining to the potential energy impacts of connectivity and automation and stresses the need for integration and optimization to take advantage of the benefits offered by these transformative technologies while mitigating the potential negative consequences.
Classification of product inspection items using nonlinear features
NASA Astrophysics Data System (ADS)
Talukder, Ashit; Casasent, David P.; Lee, H.-W.
1998-03-01
Automated processing and classification of real-time x-ray images of randomly oriented touching pistachio nuts is discussed. The ultimate objective is the development of a system for automated non-invasive detection of defective product items on a conveyor belt. This approach involves two main steps: preprocessing and classification. Preprocessing locates individual items and segments ones that touch using a modified watershed algorithm. The second stage involves extraction of features that allow discrimination between damaged and clean items (pistachio nuts). This feature extraction and classification stage is the new aspect of this paper. We use a new nonlinear feature extraction scheme called the maximum representation and discriminating feature (MRDF) extraction method to compute nonlinear features that are used as inputs to a classifier. The MRDF is shown to provide better classification and a better ROC (receiver operating characteristic) curve than other methods.
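A rough sketch of the two-stage structure (watershed segmentation of touching items, followed by per-item features feeding a classifier) is given below, using scikit-image and scikit-learn as stand-ins; the simple intensity/shape features and the linear discriminant are placeholders for the paper's MRDF nonlinear features, not a reimplementation of them.

```python
# Illustrative two-stage pipeline: marker-based watershed segmentation of
# touching items, then per-item feature extraction and classification.
# Features and classifier are generic placeholders, not the MRDF method.
import numpy as np
from scipy import ndimage as ndi
from skimage.feature import peak_local_max
from skimage.measure import regionprops
from skimage.segmentation import watershed
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def segment_items(binary_mask: np.ndarray) -> np.ndarray:
    """Split touching items with a watershed on the distance transform."""
    distance = ndi.distance_transform_edt(binary_mask)
    coords = peak_local_max(distance, min_distance=10,        # arbitrary spacing
                            labels=ndi.label(binary_mask)[0])
    markers = np.zeros(binary_mask.shape, dtype=int)
    markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)
    return watershed(-distance, markers, mask=binary_mask)

def item_features(xray: np.ndarray, labels: np.ndarray) -> np.ndarray:
    """Simple per-item intensity/shape features (stand-ins for MRDF inputs)."""
    feats = [(r.mean_intensity, r.area, r.eccentricity)
             for r in regionprops(labels, intensity_image=xray)]
    return np.asarray(feats)

# Training on labelled clean/damaged items would then look like:
# clf = LinearDiscriminantAnalysis().fit(train_features, train_labels)
```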
Refurbishment and Automation of the Thermal/Vacuum Facilities at the Goddard Space Flight Center
NASA Technical Reports Server (NTRS)
Donohue, John T.; Johnson, Chris; Ogden, Rick; Sushon, Janet
1998-01-01
The thermal/vacuum facilities located at the Goddard Space Flight Center (GSFC) have supported both manned and unmanned space flight since the 1960s. Of the 11 facilities, 10 are currently scheduled for refurbishment and/or replacement as part of a 5-year implementation plan. Expected returns on investment include reduced test schedules, improved safety of facility operations, reduced test complexity, and reduced personnel support required for a test. Additionally, GSFC will become a global resource renowned for expertise in thermal engineering, mechanical engineering, and the automation of thermal/vacuum facilities and thermal/vacuum tests. Automation of the thermal/vacuum facilities includes the use of Programmable Logic Controllers (PLCs) and Supervisory Control and Data Acquisition (SCADA) systems. These components allow computer control and automation of mechanical components such as valves and pumps. In some cases the chamber and chamber shroud require complete replacement, while others require only retrofit or replacement of mechanical components. The refurbishment and automation project began in 1996 and has resulted in computer control of one facility (Facility #225) and the integration of electronically controlled devices and PLCs within several other facilities. Facility 225 has been successfully controlled by PLC and SCADA for over one year. Only minor anomalies have occurred, and they were resolved with minimal impact to testing and operations. The remaining work will be performed over the next four to five years. Fiscal year 1998 includes the complete refurbishment of one facility, computer control of the thermal systems in two facilities, implementation of SCADA and PLC systems to support multiple facilities, and the implementation of a database server to allow efficient test management and data analysis.
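As a purely illustrative sketch of the supervisory-control pattern described here (SCADA software polling PLC-controlled points and commanding devices such as valves and pumps), consider the following; the tag names, setpoint, and stubbed PLC driver are hypothetical and unrelated to the actual GSFC systems.

```python
# Illustrative SCADA-style supervisory loop: poll PLC-controlled points and
# command a valve toward a setpoint. The PLC driver is a stub returning dummy
# values; tag names and the setpoint are hypothetical.
import time

class PlcStub:
    """Stand-in for a driver that reads/writes PLC registers."""
    def read(self, tag: str) -> float:
        return {"shroud_temp_c": -150.0, "chamber_pressure_torr": 1e-6}.get(tag, 0.0)
    def write(self, tag: str, value: float) -> None:
        print(f"write {tag} = {value}")

def supervise(plc: PlcStub, shroud_setpoint_c: float = -180.0, cycles: int = 3) -> None:
    for _ in range(cycles):
        temp = plc.read("shroud_temp_c")
        pressure = plc.read("chamber_pressure_torr")
        # Simple supervisory action: open the LN2 valve until the shroud reaches setpoint.
        plc.write("ln2_valve_open", 1.0 if temp > shroud_setpoint_c else 0.0)
        print(f"T={temp:.1f} C  P={pressure:.2e} torr")
        time.sleep(1.0)

supervise(PlcStub())
```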
NASA Astrophysics Data System (ADS)
Ivanova, A.; Tokmakov, A.; Lebedeva, K.; Roze, M.; Kaulachs, I.
2017-08-01
Organometal halide perovskites are promising materials for low-cost, high-efficiency solar cells. The method of perovskite layer deposition and the interfacial layers play an important role in determining the efficiency of perovskite solar cells (PSCs). In this paper, we demonstrate inverted planar perovskite solar cells in which the perovskite layers are deposited by a two-step modified interdiffusion method and by a one-step method. We also demonstrate how the PSC parameters change with doping of the charge transport layers (CTLs). We used dimethyl sulfoxide (DMSO) as a dopant for the hole transport layer (PEDOT:PSS), whereas for the electron transport layer ([6,6]-phenyl-C61-butyric acid methyl ester, PCBM) we used N,N-dimethyl-N-octadecyl(3-aminopropyl)trimethoxysilyl chloride (DMOAP). The highest main PSC parameters (PCE, EQE, VOC) were obtained for cells prepared by the one-step method with fast crystallization and doped CTLs, whereas higher fill factor (FF) and shunt resistance (Rsh) values were obtained for cells prepared by the two-step method with undoped CTLs.
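For reference, the standard definitions of two of the figures of merit mentioned above, the fill factor and the power conversion efficiency, are given below under the usual conventions; these are general textbook relations, not values or notation taken from this particular study.

```latex
% Standard photovoltaic figures of merit (general definitions).
FF  = \frac{V_{\mathrm{mp}}\, J_{\mathrm{mp}}}{V_{\mathrm{OC}}\, J_{\mathrm{SC}}},
\qquad
PCE = \frac{V_{\mathrm{OC}}\, J_{\mathrm{SC}}\, FF}{P_{\mathrm{in}}}
```

Here V_mp and J_mp are the voltage and current density at the maximum power point, and P_in is the incident light power density.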
Steps Towards the Integration of Conflict Resolution with Metering and Scheduling
NASA Technical Reports Server (NTRS)
McNally, B. David; Edwards, Thomas (Technical Monitor)
1998-01-01
NASA Ames Research Center is developing decision support tool technology for air traffic controllers to improve the efficiency and capacity of the National Airspace System. The goal is to provide technology, tools, and procedures that accommodate user-preferred trajectories to the greatest extent possible while ensuring safe and efficient traffic management when necessary. The work is being conducted under the NASA Advanced Air Transportation Technology Program in cooperation with the FAA through the Inter-Agency Integrated Product Team. The objective is to develop technology and procedures that lead towards a seamless integration of conflict resolution with metering and scheduling for arrival aircraft and for en route aircraft that are under metering restrictions. A requirement is that the integration incorporate user-preferred trajectories. The ultimate goal is the implementation and validation of the Descent Advisor (DA) concept, which provides clearance advisories to a sector controller that simultaneously meet metering constraints, are conflict free, incorporate a user-preferred (e.g., minimum fuel) descent profile, and generally require no further corrective clearance as the aircraft transitions from en route cruise into the TRACON. The DA concept may also be applied to en route aircraft under metering constraints, e.g., miles-in-trail. To achieve the DA concept, a stepwise development and field evaluation is anticipated. This paper addresses the initial steps towards implementation of the DA. The Traffic Management Advisor (TMA) computes arrival time sequence and required delay information for display to the sector controller during periods when arrivals must be metered due to landing rate restrictions at the airport. The Initial Conflict Probe (ICP) compares trajectory predictions for all aircraft and alerts the controller when any two aircraft are predicted to violate separation standards (5 mi. and 2000 ft. in en route airspace). ICP also includes a trial planning function allowing the controller to develop and check a separate "what if" trajectory for conflict resolution. TMA and ICP currently operate independently of one another and have separate controller displays. The TMA meter list is on the radar controller's plan view display. The ICP is still under development, but the current concept calls for a list of predicted conflicts at the data controller position. The research described herein addresses two steps towards the implementation of DA. The first is to develop a concept for integrated display of conflict and metering information on a controller's display. The objective is to provide the controller with situational awareness of one problem while he develops a solution to the other. The next step is to expand the concept for display of automated clearance advisories for one problem (e.g., metering) that take into account the other problems (e.g., conflicts). Information to be communicated between TMA and ICP to facilitate manual or automated advisories is being identified as the concept matures. In order to study the ICP/TMA integration concept, the CTAS conflict probe capability has been adapted to Ft. Worth Center. The system is being validated in the laboratory with all-track data and for non-interference with TMA; TMA is already running as a daily-use prototype at Ft. Worth Center. A laboratory prototype system has been developed under the CTAS baseline that combines conflict and metering information on a common user interface.
Elements of the user interface are shown in Figure 1. In this simple illustration the user sees the simultaneous effect of a trial plan on meter fix delay and conflict status. Delay information is shown in the aircraft flight data block, the meter list, and the TMA timeline. Experience during Descent Advisor development and observations at Ft. Worth Center high and low altitude arrival sectors during metering suggest that manual trial planning will be unworkable during rush periods due to high controller workload.
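A minimal sketch of the pairwise separation check at the heart of a conflict probe, using the en route standards cited above (5 mi horizontally, 2000 ft vertically), is shown below; the trajectory representation and helper names are hypothetical and are not the CTAS/ICP implementation.

```python
# Minimal sketch of a pairwise conflict check over predicted trajectories.
# Each trajectory is a list of (time_s, x_nmi, y_nmi, altitude_ft) samples on
# a common time grid; format and names are illustrative only.
import math
from itertools import combinations

H_SEP_NMI = 5.0     # horizontal separation standard cited in the abstract
V_SEP_FT = 2000.0   # vertical separation standard cited in the abstract

def in_conflict(p, q) -> bool:
    """True if two synchronous trajectory points violate both standards."""
    horiz = math.hypot(p[1] - q[1], p[2] - q[2])
    vert = abs(p[3] - q[3])
    return horiz < H_SEP_NMI and vert < V_SEP_FT

def probe(trajectories: dict) -> list:
    """Return (aircraft_a, aircraft_b, time) triples predicted to lose separation."""
    alerts = []
    for (a, ta), (b, tb) in combinations(trajectories.items(), 2):
        for p, q in zip(ta, tb):          # samples share the same time grid
            if in_conflict(p, q):
                alerts.append((a, b, p[0]))
                break                     # report first predicted loss of separation
    return alerts
```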
Laboratory automation in clinical bacteriology: what system to choose?
Croxatto, A; Prod'hom, G; Faverjon, F; Rochais, Y; Greub, G
2016-03-01
Automation was introduced many years ago in several diagnostic disciplines such as chemistry, haematology and molecular biology. The first laboratory automation system for clinical bacteriology was released in 2006, and it rapidly proved its value by increasing productivity, allowing a continuous increase in sample volumes despite limited budgets and personnel shortages. Today, two major manufacturers, BD Kiestra and Copan, are commercializing partial or complete laboratory automation systems for bacteriology. The laboratory automation systems are rapidly evolving to provide improved hardware and software solutions to optimize laboratory efficiency. However, the complex parameters of the laboratory and automation systems must be considered to determine the best system for each given laboratory. We address several topics on laboratory automation that may help clinical bacteriologists to understand the particularities and operative modalities of the different systems. We present (a) a comparison of the engineering and technical features of the various elements composing the two different automated systems currently available, (b) the system workflows of partial and complete laboratory automation, which define the basis for laboratory reorganization required to optimize system efficiency, (c) the concept of digital imaging and telebacteriology, (d) the connectivity of laboratory automation to the laboratory information system, (e) the general advantages and disadvantages as well as the expected impacts provided by laboratory automation and (f) the laboratory data required to conduct a workflow assessment to determine the best configuration of an automated system for the laboratory activities and specificities. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.